Unbiased Recursive Partitioning: A Conditional Inference Framework
Authors: Torsten Hothorn, Kurt Hornik, Achim Zeileis
Published: 2006 (Journal Paper)
Source: Journal of Computational and Graphical Statistics
Algorithm: ctree
DOI: 10.1198/106186006X133933
Summary
Introduces a conditional inference framework (ctree) for decision trees that uses permutation tests to select split variables, eliminating the selection bias toward high-cardinality or many-split predictors that plagues standard CART. The unbiased variable selection also removes the need for separate pruning, yielding statistically well-calibrated trees.
Abstract
A unified framework for recursive partitioning is proposed which embeds tree-structured regression models into a well defined theory of conditional inference procedures. The approach results in unbiased variable selection and does not require additional pruning. The stopping criterion is based on multiple test procedures. The framework is applicable to all kinds of regression problems including nominal, ordinal, numeric, censored as well as multivariate response variables and arbitrary measurement scales of the covariates.
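The split-variable selection described above can be sketched in code. The following is a minimal illustrative sketch, not the authors' implementation (the reference implementation is the R function `ctree` in the party/partykit packages): for each covariate, an association statistic with the response is computed, its null distribution is approximated by permuting the response, and a Bonferroni adjustment across covariates supplies the multiple-testing stopping rule. The choice of absolute correlation as the statistic, and all function names, are simplifying assumptions for this sketch; the paper's framework uses general conditional-inference (permutation) test statistics.

```python
import random

def abs_corr(x, y):
    """Absolute Pearson correlation: a simple association statistic
    standing in for the paper's general permutation-test statistics."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    if sxx == 0 or syy == 0:
        return 0.0
    return abs(sxy / (sxx * syy) ** 0.5)

def permutation_p_value(x, y, n_perm=999, rng=None):
    """Approximate the permutation p-value: the fraction of statistics
    under permuted responses at least as large as the observed one."""
    rng = rng or random.Random(0)
    observed = abs_corr(x, y)
    yy = list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(yy)  # break the x-y association under the null
        if abs_corr(x, yy) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

def select_split_variable(X, y, alpha=0.05, n_perm=999):
    """Return the index of the most significant covariate, or None to
    stop splitting. X is a list of covariate columns. The Bonferroni
    adjustment over the m covariates is what makes the selection
    unbiased and doubles as the stopping criterion (no pruning)."""
    m = len(X)
    p_values = [permutation_p_value(col, y, n_perm) for col in X]
    adjusted = [min(1.0, p * m) for p in p_values]
    best = min(range(m), key=lambda j: adjusted[j])
    return best if adjusted[best] <= alpha else None
```

Because every candidate variable is judged by a p-value rather than a raw impurity reduction, a covariate with many possible split points gains no mechanical advantage, which is the source of the unbiasedness; and when no adjusted p-value clears `alpha`, recursion simply stops, which is why no separate pruning step is needed.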
Tags
- Decision trees
- Conditional inference
- Unbiased variable selection
- Statistical testing
- Recursive partitioning
- Machine learning