
Effective Automatic Method Selection for Nonlinear Regression Modeling

  1.
    0541777 - ÚI 2022 RIV SG eng J - Journal Article
    Kalina, Jan - Neoral, Aleš - Vidnerová, Petra
    Effective Automatic Method Selection for Nonlinear Regression Modeling.
    International Journal of Neural Systems. Vol. 31, No. 10 (2021), article No. 2150020. ISSN 0129-0657. E-ISSN 1793-6462
    R&D Projects: GA ČR(CZ) GA19-05704S; GA ČR(CZ) GA18-23827S
    Institutional support: RVO:67985807
    Keywords : metalearning * nonlinear regression * robust statistical estimation * feature selection * AutoML
    OECD category: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
    Impact factor: 6.325, year: 2021
    Method of publishing: Limited access
    http://dx.doi.org/10.1142/S0129065721500209

    Metalearning, an important part of artificial intelligence, represents a promising approach to the automatic selection of appropriate methods or algorithms. This paper addresses the recommendation of a suitable estimator for nonlinear regression modeling, namely choosing between the standard nonlinear least squares estimator and one of the available alternative estimators that are highly robust to outliers in the data. The authors argue that theoretical considerations alone cannot yield such recommendations in the nonlinear regression context; instead, metalearning is explored here as an original approach suited to this task. Four different approaches to automatic method selection for nonlinear regression are proposed, and computations are performed over a training database of 643 real, publicly available datasets. In particular, while the metalearning results may be harmed by the imbalanced sizes of the groups, an effective approach yields much improved results through a novel combination of supervised feature selection by random forest and oversampling by the synthetic minority oversampling technique (SMOTE). As a by-product, the computations provide arguments in favor of the very recent nonlinear least weighted squares estimator, which turns out to outperform other (and much more renowned) estimators on a sizable percentage of datasets.
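    The abstract describes a pipeline combining random-forest-based feature selection with SMOTE oversampling before training the metalearning classifier. The following is a minimal sketch of that kind of combination (not the authors' code), assuming scikit-learn and imbalanced-learn; the meta-feature matrix X and the best-estimator labels y are hypothetical placeholders.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel
from imblearn.over_sampling import SMOTE

rng = np.random.default_rng(0)
X = rng.normal(size=(643, 20))      # meta-features of the 643 training datasets (placeholder)
y = rng.integers(0, 2, size=643)    # label: which estimator performed best (placeholder)

# Supervised feature selection driven by random-forest importances.
selector = SelectFromModel(RandomForestClassifier(n_estimators=200, random_state=0))
X_sel = selector.fit_transform(X, y)

# Oversample the minority group with SMOTE so the metalearner is not harmed by class imbalance.
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_sel, y)

# Train the final metalearning classifier on the balanced, reduced meta-data.
meta_clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_bal, y_bal)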
    Permanent Link: http://hdl.handle.net/11104/0319314

     
    File                Size      Version                  Access
    0541777-afin.pdf    62.8 MB   Publisher’s postprint    required
     
