Chapter 14. Conditional inference trees and random forests


This chapter discusses conditional inference trees and random forests. These are non-parametric tree-structured models for regression and classification that can serve as an alternative to multiple regression. They are especially useful in the presence of many high-order interactions and in situations where the sample size is small but the number of predictors is large. You will learn how to fit such models, interpret their results, and evaluate their quality. The case study that illustrates the techniques deals with three English causative constructions, <i>make + V</i>, <i>cause + to V</i> and <i>have + V</i>, and identifies the set of independent semantic variables that are important for distinguishing between the constructions.
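The workflow described above can be sketched in outline. A minimal illustration, assuming scikit-learn: its `RandomForestClassifier` grows CART-style trees rather than conditional inference trees (the chapter's own examples use R's `party` package), but it shows the same general idea of non-parametric classification with many predictors and variable-importance scores. The data here are randomly generated stand-ins, not the chapter's causative dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Toy stand-in for the case study: 300 observations of a three-way
# constructional choice, described by 5 binary semantic predictors.
# (Hypothetical data; the chapter's actual variables differ.)
X = rng.integers(0, 2, size=(300, 5))
y = rng.integers(0, 3, size=300)  # 0/1/2 ~ three constructions

forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X, y)

# Variable importances indicate which predictors contribute most to
# distinguishing between the classes (analogous in spirit to the
# conditional variable importance discussed in the chapter).
print(forest.feature_importances_)
```

The importance scores are normalized to sum to one; ranking them identifies the predictors that matter most, which is how the case study isolates the semantic variables that separate the three constructions.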

