Number of the records: 1
Boosting in probabilistic neural networks
SYSNO ASEP 0410888
Document Type C - Proceedings Paper (int. conf.)
R&D Document Type Conference Paper
Title Boosting in probabilistic neural networks
Author(s) Grim, Jiří (UTIA-B) RID, ORCID
Pudil, Pavel (UTIA-B) RID
Somol, Petr (UTIA-B) RID
Issue data Los Alamitos: IEEE Computer Society, 2002
ISBN 0-7695-1699-8
Source Title Proceedings of the 16th International Conference on Pattern Recognition / Kasturi R. ; Laurendeau D. ; Suen C.
Pages 136-139
Number of pages 4
Action International Conference on Pattern Recognition /16./
Event date 11.08.2002-15.08.2002
Event location Québec City
Country CA - Canada
Event type WRD
Language eng - English
Country US - United States
Keywords neural networks ; finite mixtures ; boosting
Subject RIV BB - Applied Statistics, Operational Research
R&D Projects GA402/01/0981 GA ČR - Czech Science Foundation (CSF)
KSK1019101 GA AV ČR - Academy of Sciences of the Czech Republic (AV ČR)
CEZ AV0Z1075907 - UTIA-B
Annotation It has been verified in practical experiments that classification performance can be improved by increasing the weights of misclassified training samples. We prove that, in the case of maximum-likelihood estimation, weighting discrete data vectors is asymptotically equivalent to multiplying the estimated distributions by a positive function. Consequently, under certain conditions, Bayesian decision-making can be made asymptotically invariant with respect to an arbitrary weighting of the data.
Workplace Institute of Information Theory and Automation
Contact Markéta Votavová, votavova@utia.cas.cz, Tel.: 266 052 201
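The annotation's claim can be illustrated with a small numerical sketch. All distributions, sample sizes, and the weight function below are hypothetical (not taken from the paper): for discrete data, the weighted ML estimate of a class-conditional distribution equals the unweighted estimate multiplied by the weight function and renormalized; and one sufficient condition for Bayes-decision invariance, as I read the abstract, is that the class priors are re-estimated from the same weighted data, so the common positive factor w(x) cancels in the argmax.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 4  # size of the discrete alphabet (hypothetical)

# Hypothetical class-conditional distributions and training samples.
p0_true = np.array([0.50, 0.30, 0.15, 0.05])
p1_true = np.array([0.10, 0.20, 0.30, 0.40])
x0 = rng.choice(K, size=6000, p=p0_true)
x1 = rng.choice(K, size=4000, p=p1_true)

def ml_estimate(samples, weights):
    """(Weighted) ML estimate of a discrete distribution:
    weighted relative frequencies, normalized to sum to 1."""
    counts = np.bincount(samples, weights=weights, minlength=K)
    return counts / counts.sum()

# Unweighted estimates and empirical priors.
p0 = ml_estimate(x0, np.ones(len(x0)))
p1 = ml_estimate(x1, np.ones(len(x1)))
priors = np.array([len(x0), len(x1)], dtype=float)
priors /= priors.sum()

# A positive weight function on the data space (values illustrative),
# e.g. up-weighting hard-to-classify data vectors as in boosting.
w = np.array([2.0, 0.5, 1.5, 3.0])

# Weighting samples is equivalent to multiplying the estimated
# distribution by w and renormalizing.
p0_w = ml_estimate(x0, w[x0])
p1_w = ml_estimate(x1, w[x1])
assert np.allclose(p0_w, p0 * w / (p0 * w).sum())

# Re-estimate the priors from the same weighted data.
priors_w = np.array([w[x0].sum(), w[x1].sum()])
priors_w /= priors_w.sum()

# The factor w(x) then cancels in the Bayes rule: decisions are unchanged.
for x in range(K):
    plain = np.argmax([priors[0] * p0[x], priors[1] * p1[x]])
    weighted = np.argmax([priors_w[0] * p0_w[x], priors_w[1] * p1_w[x]])
    assert plain == weighted
```

If the priors were kept fixed while only the class-conditional estimates were reweighted, the per-class normalizing constants would differ and the decisions could change, which is consistent with the abstract's qualifier "under certain conditions".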