Number of the records: 1  

Classification trees with soft splits optimized for ranking

  1. 0501997 - ÚI 2020 RIV DE eng J - Journal Article
    Dvořák, Jakub
    Classification trees with soft splits optimized for ranking.
    Computational Statistics. Roč. 34, č. 2 (2019), s. 763-786. ISSN 0943-4062. E-ISSN 1613-9658
    Institutional support: RVO:67985807
    Keywords: Supervised learning * Decision trees * Scoring classifier
    OECD category: Pure mathematics
    Impact factor: 0.744, year: 2019
    Method of publishing: Limited access
    http://dx.doi.org/10.1007/s00180-019-00867-1

    We consider softening of splits in classification trees generated from multivariate numerical data. This methodology improves the quality of the ranking of the test cases as measured by the AUC. Several ways to determine the softening parameters are introduced and compared, including the softening algorithm present in the standard methods C4.5 and C5.0. In the first part of the paper, a few softening settings determined only from the ranges of the training data in the tree branches are explored. The trees softened with these settings are used to study the effect of using the Laplace correction together with soft splits. In the later part, we introduce methods that maximize the classifier's performance on the training set over the domain of the softening parameters. The Nelder–Mead non-linear optimization algorithm is used, and various target functions are considered. The target function that evaluates the AUC on the training set is compared with functions that sum, over the training cases, a transformation of the score error. Several data sets from the UCI repository are used in the experiments.
    Permanent Link: http://hdl.handle.net/11104/0293959
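
The abstract describes three ingredients: soft splits that blend the scores of a node's two branches for cases near the split threshold, Laplace-corrected branch scores, and Nelder–Mead optimization of the softening parameters against a training-set target such as the AUC. The Python sketch below illustrates this general idea on a single split. It is a simplified illustration under assumed details, not the paper's algorithm; the names (soft_split_score, leaf_scores, width) and the logistic blending weight are hypothetical choices made for the example.

    # Illustrative sketch: one soft split whose softening width is chosen by
    # Nelder-Mead to maximize training-set AUC. Not the paper's exact method.
    import numpy as np
    from scipy.optimize import minimize
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 1))                       # one numeric feature
    y = (X[:, 0] + rng.normal(scale=0.7, size=200) > 0).astype(int)

    threshold = 0.0                                     # crisp split threshold

    def leaf_scores(x, threshold, laplace=True):
        """Class-1 frequencies in the two crisp branches, Laplace-corrected."""
        left, right = y[x <= threshold], y[x > threshold]
        if laplace:
            return ((left.sum() + 1) / (len(left) + 2),
                    (right.sum() + 1) / (len(right) + 2))
        return left.mean(), right.mean()

    def soft_split_score(x, threshold, width):
        """Blend the two branch scores with a logistic weight around the threshold.

        Cases far to the left of the threshold get (almost) the left-branch
        score, cases far to the right get the right-branch score, and cases
        near the threshold get a mixture -- one simple way to soften a split.
        """
        s_left, s_right = leaf_scores(x, threshold)
        w_right = 1.0 / (1.0 + np.exp(-(x - threshold) / max(width, 1e-6)))
        return (1.0 - w_right) * s_left + w_right * s_right

    def neg_train_auc(params):
        """Target function: negative AUC of the softened scores on the training set."""
        width = abs(params[0])
        return -roc_auc_score(y, soft_split_score(X[:, 0], threshold, width))

    # Nelder-Mead search over the (single) softening parameter.
    res = minimize(neg_train_auc, x0=[0.5], method="Nelder-Mead")
    print("chosen width:", abs(res.x[0]), "training AUC:", -res.fun)

In the paper, the optimization runs over the softening parameters of all splits of a fitted tree and compares several target functions; the sketch keeps a single parameter and the AUC target only to show how the pieces fit together.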
