Boosting in probabilistic neural networks

SYSNO: ASEP0410888
Document Type: C - Proceedings Paper (int. conf.)
R&D Document Type: Conference Paper
Title: Boosting in probabilistic neural networks
Author(s): Grim, Jiří (UTIA-B) RID, ORCID
    Pudil, Pavel (UTIA-B) RID
    Somol, Petr (UTIA-B) RID
Issue data: Los Alamitos: IEEE Computer Society, 2002
ISBN: 0-7695-1699-8
Source Title: Proceedings of the 16th International Conference on Pattern Recognition / Kasturi R. ; Laurendeau D. ; Suen C.
Pages: 136-139
Number of pages: 4
Action: International Conference on Pattern Recognition /16./
Event date: 11.08.2002-15.08.2002
Event location: Québec City
Country: CA - Canada
Event type: WRD
Language: eng - English
Country: US - United States
Keywords: neural networks ; finite mixtures ; boosting
Subject RIV: BB - Applied Statistics, Operational Research
R&D Projects: GA402/01/0981 GA ČR - Czech Science Foundation (CSF)
    KSK1019101 GA AV ČR - Academy of Sciences of the Czech Republic (AV ČR)
CEZ: AV0Z1075907 - UTIA-B
Annotation: It has been verified in practical experiments that classification performance can be improved by increasing the weights of misclassified training samples. We prove that, in the case of maximum-likelihood estimation, the weighting of discrete data vectors is asymptotically equivalent to multiplication of the estimated distributions by a positive function. Consequently, under certain conditions, Bayesian decision-making can be made asymptotically invariant with respect to arbitrary weighting of the data. (A brief sketch of this argument follows the record.)
Workplace: Institute of Information Theory and Automation
Contact: Markéta Votavová, votavova@utia.cas.cz, Tel.: 266 052 201.
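
A brief sketch of the asymptotic argument summarized in the annotation above (an illustrative reconstruction under simplifying assumptions, not the paper's exact derivation). For a discrete sample $x_1,\dots,x_N$ drawn from a distribution $P$ and assigned positive weights $w(x_n)$, the weighted maximum-likelihood estimate is the weighted relative frequency, which converges to $P$ multiplied by $w$ and renormalized ($\delta$ denotes the Kronecker delta):

\[
  \hat{P}_w(x)
  \;=\;
  \frac{\sum_{n=1}^{N} w(x_n)\,\delta(x,x_n)}{\sum_{n=1}^{N} w(x_n)}
  \;\xrightarrow{\;N\to\infty\;}\;
  \frac{w(x)\,P(x)}{\sum_{y} w(y)\,P(y)} .
\]

One sufficient condition for the invariance of the Bayes decision is that every class-conditional estimate $P(x \mid c)$ is weighted by the same positive function $w$ while each class prior $p(c)$ is rescaled by the corresponding normalizing constant $Z_c = \sum_{y} w(y)\,P(y \mid c)$; the factor $w(x)$ is then common to all classes and cancels:

\[
  \tilde{p}(c)\,\tilde{P}(x \mid c)
  \;\propto\;
  p(c)\,Z_c \cdot \frac{w(x)\,P(x \mid c)}{Z_c}
  \;=\;
  w(x)\,p(c)\,P(x \mid c),
  \qquad
  \arg\max_{c}\,\tilde{p}(c)\,\tilde{P}(x \mid c)
  \;=\;
  \arg\max_{c}\,p(c)\,P(x \mid c).
\]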
