Boosting the performance of decision trees
Author: Lugger, Klaus
Title: Boosting the performance of decision trees
Imprint: 1995; VIII, 71 leaves : graphical illustrations; 4°; Graz, Techn. Univ., Institut für Informationsverarbeitung und Computergestützte neue Medien and Institut für Elektro- und Biomedizinische Technik, diploma thesis, 1995
Location: Main Library - Stacks, Technikerstrasse 4
Call number: II 124.654
Inventory number: 5908P95
Abstract
It has recently been proved that, given enough training examples, the accuracy of any simple classification method can be increased by building three such classifiers. These classifiers are trained on increasingly difficult examples, filtered out by the previously built classifiers. This method is called boosting; it has already been tested with various classification algorithms, such as neural networks, with positive results. In this thesis, boosting is applied to decision trees. Two versions of boosting have been implemented: the first uses OC1 to build the classifiers and cannot recurse, so only three classifiers are built; the second builds on single-rule classifiers and can recurse, i.e., triples of classifiers are built recursively. In extensive experiments comparing the two implemented boosting versions with the well-known decision tree programs C4.5 and OC1 on numerical, discrete, and mixed data sets, OC1 was beaten only once by one of the implemented algorithms, and C4.5 likewise only once. A comparison of the complexity of the various methods, however, shows that the single-rule classifiers in particular produce accurate and simple classifiers.
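The three-classifier scheme the abstract describes can be sketched in a few lines. The following is a minimal illustration, not the thesis's actual implementation: `train_stump` stands in for a single-rule (one-attribute threshold) learner, and `boost3` follows the pattern described above — a second classifier trained on a mix of examples the first gets wrong and right, a third trained on examples where the first two disagree, and a majority vote at prediction time. All names and the toy data are my own, chosen for illustration.

```python
import random

def train_stump(X, y):
    """Single-rule classifier: the feature/threshold/sign split
    that minimizes training error. Labels are assumed to be +/-1."""
    best = None
    for f in range(len(X[0])):
        for t in sorted(set(x[f] for x in X)):
            for sign in (1, -1):
                pred = [sign if x[f] > t else -sign for x in X]
                err = sum(p != yi for p, yi in zip(pred, y))
                if best is None or err < best[0]:
                    best = (err, f, t, sign)
    _, f, t, sign = best
    return lambda x: sign if x[f] > t else -sign

def boost3(X, y, rng=random.Random(0)):
    """Three-classifier boosting in the style the abstract describes
    (a sketch; not the thesis's OC1- or single-rule-based programs)."""
    h1 = train_stump(X, y)
    # Second classifier: trained on the "difficult" examples h1 misclassifies,
    # mixed with a sample of examples it gets right.
    wrong = [(x, yi) for x, yi in zip(X, y) if h1(x) != yi]
    right = [(x, yi) for x, yi in zip(X, y) if h1(x) == yi]
    mixed = wrong + rng.sample(right, min(len(right), max(len(wrong), 1)))
    h2 = train_stump([x for x, _ in mixed], [yi for _, yi in mixed])
    # Third classifier: trained on examples where h1 and h2 disagree.
    disagree = [(x, yi) for x, yi in zip(X, y) if h1(x) != h2(x)]
    h3 = train_stump([x for x, _ in disagree],
                     [yi for _, yi in disagree]) if disagree else h1
    # Final prediction: majority vote of the three classifiers.
    return lambda x: 1 if h1(x) + h2(x) + h3(x) > 0 else -1
```

The recursive variant mentioned in the abstract would apply the same tripling step again, treating each `boost3` output as a component classifier in a larger triple.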
Supervisors
Kubat M. / Flotzinger D.
Institut für Informationsverarbeitung und Computergestützte neue Medien