Number of the records: 1  

Optimally Trained Regression Trees and Occam's Razor

  1.
    SYSNO ASEP: 0404694
    Document Type: C - Proceedings Paper (int. conf.)
    R&D Document Type: Conference Paper
    Title: Optimally Trained Regression Trees and Occam's Razor
    Author(s): Savický, Petr (UIVT-O) SAI, RID, ORCID
    Klaschka, Jan (UIVT-O) RID, SAI, ORCID
    Source Title: COMPSTAT 2002. Proceedings in Computational Statistics / Härdle W. ; Rönz B. - Heidelberg : Physica-Verlag, 2002 - ISBN 3-7908-1517-9
    Pages: 479-484
    Number of pages: 6
    Action: COMPSTAT 2002
    Event date: 24.08.2002-28.08.2002
    Event location: Berlin
    Country: DE - Germany
    Event type: WRD
    Language: eng - English
    Country: DE - Germany
    Keywords: regression trees ; recursive partitioning ; optimization ; dynamic programming ; bottom-up algorithms ; generalization ; Occam's razor
    Subject RIV: BA - General Mathematics
    R&D Projects: GA201/00/1482 GA ČR - Czech Science Foundation (CSF)
    CEZ: 1030915
    UT WOS: 000179942900073
    Annotation: Two bottom-up algorithms growing regression trees with the minimum mean squared error on the training data, given the number of leaves, are described. As demonstrated by the results of experiments with simulated data, the trees resulting from the optimization algorithms may have not only better but also worse generalization properties than the trees grown by traditional methods. This phenomenon is discussed from the point of view of the Occam's razor principle.
    Workplace: Institute of Computer Science
    Contact: Tereza Šírová, sirova@cs.cas.cz, Tel.: 266 053 800
    Year of Publishing: 2003
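The annotation describes bottom-up dynamic programming for regression trees that minimize training mean squared error subject to a fixed number of leaves. The paper's own algorithms are not reproduced in this record; as an illustrative sketch only, the one-dimensional special case (partitioning a sorted feature into k contiguous leaf segments, each predicting its segment mean) can be solved exactly by a similar dynamic program. The function name `optimal_1d_tree` and all details below are this sketch's own, not the authors':

```python
import numpy as np

def optimal_1d_tree(x, y, k):
    """Minimum training SSE of a k-leaf piecewise-constant fit on one feature.

    Sketch of the size-constrained optimization idea: dynamic programming
    over all ways to split the sorted sample into k contiguous segments
    (leaves), each predicting its segment mean. Returns the optimal SSE.
    """
    order = np.argsort(x)
    y = np.asarray(y, dtype=float)[order]
    n = len(y)
    # Prefix sums give each segment's sum of squared errors in O(1):
    # SSE(i, j) = sum(y^2) - (sum(y))^2 / m over y[i:j], m = j - i.
    s = np.concatenate(([0.0], np.cumsum(y)))
    s2 = np.concatenate(([0.0], np.cumsum(y * y)))

    def sse(i, j):
        m = j - i
        return s2[j] - s2[i] - (s[j] - s[i]) ** 2 / m

    INF = float("inf")
    # cost[l][j] = minimum SSE of covering the first j points with l leaves.
    cost = [[INF] * (n + 1) for _ in range(k + 1)]
    cost[0][0] = 0.0
    for l in range(1, k + 1):
        for j in range(l, n + 1):
            # Last leaf covers y[i:j]; best split position i is tried exhaustively.
            for i in range(l - 1, j):
                c = cost[l - 1][i] + sse(i, j)
                if c < cost[l][j]:
                    cost[l][j] = c
    return cost[k][n]
```

With two well-separated response levels, two leaves already fit the training data perfectly, while one leaf must use the global mean; as the annotation notes, such a perfect training-set fit need not translate into better generalization.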
