Gilad Katz, Asaf Shabtai, Lior Rokach, Nir Ofek
2012 IEEE 12th International Conference on Data Mining, 339-348, 2012
Decision trees have three main disadvantages: reduced performance when the training set is small, rigid decision criteria, and the fact that a single “uncharacteristic” attribute might “derail” the classification process. In this paper we present ConfDTree – a post-processing method that enables decision trees to better classify outlier instances. This method, which can be applied to any decision tree algorithm, uses confidence intervals to identify these hard-to-classify instances and proposes alternative routes. The experimental study indicates that the proposed post-processing method consistently and significantly improves the predictive performance of decision trees, particularly for small, imbalanced, or multi-class datasets, for which an average improvement of 5%-9% in AUC is reported.
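The abstract does not give the algorithmic details of ConfDTree, but the core idea it states — use confidence intervals at decision nodes to flag outlier instances and then consider alternative routes — can be sketched roughly as follows. This is an illustrative approximation under assumptions, not the authors' method: the `Node` structure, the normal-approximation 95% interval, and the choice to average both branches' predictions for out-of-interval instances are all hypothetical simplifications.

```python
import statistics

# Hypothetical node structure (assumption, not from the paper):
# internal nodes test one numeric attribute; leaves hold P(class = 1).
class Node:
    def __init__(self, attr=None, threshold=None, left=None, right=None,
                 train_values=None, proba=None):
        self.attr = attr                  # attribute index tested here
        self.threshold = threshold        # split threshold
        self.left = left                  # subtree for value <= threshold
        self.right = right                # subtree for value > threshold
        self.train_values = train_values  # attribute values of training
                                          # instances that reached this node
        self.proba = proba                # leaf only: P(class = 1)

def in_confidence_interval(values, x, z=1.96):
    """Check whether x lies inside an approximate 95% normal confidence
    interval of the training values observed at a node."""
    mu = statistics.mean(values)
    sd = statistics.stdev(values)
    return mu - z * sd <= x <= mu + z * sd

def predict_proba(node, instance, z=1.96):
    """Route an instance down the tree; when its tested value falls
    outside the node's confidence interval, hedge by averaging the
    predictions of both branches instead of trusting the hard split."""
    if node.proba is not None:            # reached a leaf
        return node.proba
    x = instance[node.attr]
    if in_confidence_interval(node.train_values, x, z):
        branch = node.left if x <= node.threshold else node.right
        return predict_proba(branch, instance, z)
    # Outlier w.r.t. this node's training distribution: take both routes.
    return 0.5 * (predict_proba(node.left, instance, z)
                  + predict_proba(node.right, instance, z))
```

A typical instance follows its normal branch unchanged, so the post-processing only alters predictions for instances whose attribute values look uncharacteristic at some node — matching the abstract's claim that the method targets hard-to-classify outliers.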