Optimally Trained Regression Trees and Occam's Razor

  1.
    0404694 - UIVT-O 20020058 RIV DE eng C - Conference Paper (international conference)
    Savický, Petr - Klaschka, Jan
    Optimally Trained Regression Trees and Occam's Razor.
    COMPSTAT 2002. Proceedings in Computational Statistics. Heidelberg: Physica-Verlag, 2002 - (Härdle, W.; Rönz, B.), pp. 479-484. ISBN 3-7908-1517-9.
    [COMPSTAT 2002. Berlin (DE), 24.08.2002-28.08.2002]
    R&D Projects: GA ČR GA201/00/1482
    Institutional research plan: AV0Z1030915
    Keywords : regression trees * recursive partitioning * optimization * dynamic programming * bottom-up algorithms * generalization * Occam's razor
    Subject RIV: BA - General Mathematics

    Two bottom-up algorithms are described that grow regression trees with the minimum mean squared error on the training data for a given number of leaves. As demonstrated by the results of experiments with simulated data, the trees resulting from the optimization algorithms may have not only better but also worse generalization properties than trees grown by traditional methods. This phenomenon is discussed from the point of view of Occam's razor. (A sketch of a bottom-up optimization of this general kind is given after this record.)
    Permanent Link: http://hdl.handle.net/11104/0124933
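The abstract refers to bottom-up algorithms that minimize training mean squared error for a given number of leaves. As a purely illustrative sketch, the Python code below shows a related, standard bottom-up dynamic program, not the algorithm of the paper itself: given an already grown binary regression tree, it computes, for every leaf count k, the minimum training sum of squared errors achievable by a pruned subtree with exactly k leaves. The names Node, sse and best_prunings are hypothetical.

# Illustrative sketch only: a standard bottom-up dynamic program for
# optimal pruning of a binary regression tree under a leaf-count budget.
# This is not the algorithm described in the paper; names are hypothetical.
from dataclasses import dataclass
from typing import Dict, Optional


@dataclass
class Node:
    """Node of a binary regression tree.

    sse is the training sum of squared errors of the cases falling into
    this node when the node is used as a leaf (prediction = their mean).
    """
    sse: float
    left: Optional["Node"] = None
    right: Optional["Node"] = None


def best_prunings(node: Node) -> Dict[int, float]:
    """Return a table mapping each leaf count k to the minimum training SSE
    of a pruning of this subtree that has exactly k leaves."""
    if node.left is None or node.right is None:
        # A leaf has exactly one pruning: itself.
        return {1: node.sse}

    left = best_prunings(node.left)
    right = best_prunings(node.right)

    # Option 1: collapse the whole subtree into a single leaf.
    best: Dict[int, float] = {1: node.sse}

    # Option 2: keep the split and combine the children's tables.
    for k_left, sse_left in left.items():
        for k_right, sse_right in right.items():
            k = k_left + k_right
            sse = sse_left + sse_right
            if k not in best or sse < best[k]:
                best[k] = sse
    return best


if __name__ == "__main__":
    # Tiny hand-built tree: one split whose children are leaves.
    tree = Node(sse=10.0, left=Node(sse=2.0), right=Node(sse=3.0))
    print(best_prunings(tree))  # {1: 10.0, 2: 5.0}

Reading the table returned for the root gives, for any leaf budget, the best attainable training error; recording which option was chosen at each node would additionally recover the optimal pruned subtree itself.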


