Number of the records: 1
How to down-weight observations in robust regression: A metalearning study
SYSNO ASEP: 0493805
Document Type: C - Proceedings Paper (int. conf.)
R&D Document Type: Conference Paper
Title: How to down-weight observations in robust regression: A metalearning study
Author(s): Kalina, Jan (UIVT-O) RID, SAI, ORCID; Pitra, Zbyněk (UIVT-O) RID, ORCID, SAI
Source Title: Mathematical Methods in Economics 2018. Conference Proceedings. - Prague : MatfyzPress, 2018 / Váchová L. ; Kratochvíl V. - ISBN 978-80-7378-371-6
Pages: 204-209
Number of pages: 6
Publication form: Print - P
Action: MME 2018. International Conference Mathematical Methods in Economics /36./
Event date: 12.09.2018 - 14.09.2018
Event location: Jindřichův Hradec
Country: CZ - Czech Republic
Event type: WRD
Language: eng - English
Country: CZ - Czech Republic
Keywords: metalearning ; robust statistics ; linear regression ; outliers
Subject RIV: BB - Applied Statistics, Operational Research
OECD category: Statistics and probability
R&D Projects: GA17-07384S GA ČR - Czech Science Foundation (CSF); GA17-01251S GA ČR - Czech Science Foundation (CSF)
Institutional support: UIVT-O - RVO:67985807
UT WOS: 000507455300036
Annotation: Metalearning is becoming an increasingly important methodology for transferring knowledge from a database of available training data sets to a new (independent) data set. The concept is gaining popularity in statistical learning, and the number of metalearning applications in the analysis of economic data sets is also growing. Still, little attention has been paid to its limitations and disadvantages. To examine these, we apply various linear regression estimators (including highly robust ones) to a set of 30 data sets with an economic background and perform a metalearning study over them, as well as over the same data sets after artificial contamination. We focus on comparing the prediction performance of the least weighted squares estimator under various weighting schemes. A broader spectrum of classification methods is applied, and a support vector machine turns out to yield the best results. Because the results of leave-one-out cross-validation differ markedly from those of autovalidation, we conclude that metalearning is highly unstable and its results should be interpreted with care. We also discuss the limitations of the metalearning methodology in general.
Workplace: Institute of Computer Science
Contact: Tereza Šírová, sirova@cs.cas.cz, Tel.: 266 053 800
Year of Publishing: 2019
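The least weighted squares estimator discussed in the annotation down-weights observations according to the rank of their squared residual, so gross outliers receive small weights. A minimal illustrative sketch using NumPy (the linearly decreasing rank weights and the function name `lws_fit` are our assumptions for illustration; the paper compares several weighting schemes that are not reproduced here):

```python
import numpy as np

def lws_fit(X, y, n_iter=20):
    """Approximate least weighted squares via iterative reweighting.

    Weights decrease linearly with the rank of the squared residual:
    the observation with the smallest residual gets weight ~1, the
    largest gets weight ~1/n. Illustrative sketch, not the paper's code.
    """
    n = len(y)
    Xd = np.column_stack([np.ones(n), X])          # design matrix with intercept
    w = np.ones(n)                                  # start from ordinary least squares
    for _ in range(n_iter):
        W = np.diag(w)
        # weighted least squares step: solve (X'WX) beta = X'Wy
        beta = np.linalg.solve(Xd.T @ W @ Xd, Xd.T @ W @ y)
        r2 = (y - Xd @ beta) ** 2
        ranks = np.argsort(np.argsort(r2))          # 0 = smallest squared residual
        w = 1.0 - ranks / n                         # linearly decreasing rank weights
    return beta

# toy data: y = 2 + 3x with small noise and one grossly contaminated point
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 40)
y = 2.0 + 3.0 * x + rng.normal(0, 0.1, 40)
y[0] += 50                                          # artificial contamination
beta = lws_fit(x, y)                                # recovers roughly (2, 3)
```

Because the contaminated point ends up with the largest squared residual, its weight shrinks to about 1/40 after the first iteration, and the fit is driven by the clean observations.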