Number of the records: 1
1. Optimally Trained Regression Trees and Occam's Razor
SYSNO ASEP: 0404694
Document Type: C - Proceedings Paper (int. conf.)
R&D Document Type: Conference Paper
Title: Optimally Trained Regression Trees and Occam's Razor
Author(s): Savický, Petr (UIVT-O) SAI, RID, ORCID; Klaschka, Jan (UIVT-O) RID, SAI, ORCID
Source Title: COMPSTAT 2002. Proceedings in Computational Statistics / Härdle W.; Rönz B. - Heidelberg: Physica-Verlag, 2002 - ISBN 3-7908-1517-9
Pages: s. 479-484
Number of pages: 6 s.
Action: COMPSTAT 2002
Event date: 24.08.2002-28.08.2002
Event location: Berlin
Event type: WRD
Language: eng - English
Country: DE - Germany
Keywords: regression trees; recursive partitioning; optimization; dynamic programming; bottom-up algorithms; generalization; Occam's razor
Subject RIV: BA - General Mathematics
R&D Projects: GA201/00/1482 GA ČR - Czech Science Foundation (CSF)
CEZ: 1030915
UT WOS: 000179942900073
Annotation: Two bottom-up algorithms for growing regression trees with the minimum mean squared error on the training data, given the number of leaves, are described. As demonstrated by experiments with simulated data, the trees produced by the optimization algorithms may have not only better but also worse generalization properties than trees grown by traditional methods. This phenomenon is discussed from the point of view of the Occam's razor principle.
Workplace: Institute of Computer Science
Contact: Tereza Šírová, sirova@cs.cas.cz, Tel.: 266 053 800
Year of Publishing: 2003
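The annotation mentions bottom-up dynamic programming for regression trees that minimize training mean squared error given a leaf budget. The paper's own algorithms are not reproduced here; the following is only a minimal, hypothetical sketch of the general technique (optimal pruning of a fixed binary tree by a bottom-up dynamic program), with invented `Node`/`best_sse` names and illustrative SSE values:

```python
class Node:
    """Binary tree node. `sse` is the training sum of squared errors
    incurred if the subtree rooted here is collapsed into a single leaf."""
    def __init__(self, sse, left=None, right=None):
        self.sse = sse
        self.left = left
        self.right = right

def best_sse(node, k):
    """Bottom-up DP: return {m: minimal SSE} over subtrees of `node`
    pruned to exactly m leaves, for all m <= k."""
    # Collapsing this node to a leaf uses exactly one leaf.
    table = {1: node.sse}
    if node.left is not None and node.right is not None:
        # Each child can use at most k-1 leaves (the sibling needs >= 1).
        lt = best_sse(node.left, k - 1)
        rt = best_sse(node.right, k - 1)
        for i, li in lt.items():
            for j, rj in rt.items():
                m = i + j
                if m <= k:
                    cand = li + rj
                    if m not in table or cand < table[m]:
                        table[m] = cand
    return table

# Illustrative tree: splitting reduces SSE in the children.
tree = Node(10.0,
            left=Node(3.0, Node(1.0), Node(1.0)),
            right=Node(4.0))
print(best_sse(tree, 3))  # -> {1: 10.0, 2: 7.0, 3: 6.0}
```

Reading the result off the table shows the trade-off the annotation alludes to: training error decreases monotonically with the leaf budget, yet (as the paper's experiments indicate) the smallest-training-error tree need not generalize best.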