Chapter 14. Conditional inference trees and random forests
This chapter discusses conditional inference trees and random forests. These are non-parametric tree-structured regression and classification models that can serve as an alternative to multiple regression. They are especially useful in the presence of many high-order interactions, and when the sample size is small but the number of predictors is large. You will learn how to fit such models, interpret their results and evaluate their quality. The case study that illustrates the techniques deals with three English causative constructions, <i>make + V</i>, <i>cause + to V</i> and <i>have + V</i>, and identifies the set of independent semantic variables that are important for distinguishing between the constructions.