Number of the records: 1  

Some Comparisons of the Worst-Case Errors in Linear and Neural Network Approximation

  1.
    0403860 - UIVT-O 20000039 RIV FR eng C - Conference Paper (international conference)
    Kůrková, Věra - Sanguineti, M.
    Some Comparisons of the Worst-Case Errors in Linear and Neural Network Approximation.
    Proceedings CD of the Fourteenth International Symposium on Mathematical Theory of Networks and Systems. Perpignan: Université de Perpignan, 2000, unpaginated.
    [MTNS'2000. International Symposium on Mathematical Theory of Networks and Systems /14./. Perpignan (FR), 19.06.2000-23.06.2000]
    R&D Projects: GA ČR GA201/99/0092
    Grant - others:MURST(IT) 96.02472.CT07; MURST(IT) 97.00048.PF42
    Institutional research plan: AV0Z1030915
    Keywords: linear and neural approximation * Kolmogorov n-width * dimension-independent approximation * one-hidden-layer perceptron networks * curse of dimensionality
    Subject RIV: BA - General Mathematics

    Worst-case errors in linear and neural-network approximation are investigated in the more general framework of fixed versus variable-basis approximation. Such errors are compared for balls in certain norms tailored to variable-basis approximation. The estimates are applied to sets of functions that are either computable by perceptrons with periodic or sigmoidal activation functions, or approximable with dimension-independent rates by one-hidden-layer networks built from such perceptrons.
    Permanent Link: http://hdl.handle.net/11104/0124148
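    For context, a minimal sketch (in generic notation, not taken from the record) of the quantities the abstract compares: the worst-case error of approximating a set K by a set M in a normed linear space, the Kolmogorov n-width d_n(K) attained by fixed n-dimensional linear subspaces X_n, and the variable-basis sets span_n G of n-term linear combinations of units from a dictionary G (e.g., perceptron units), whose worst-case error is delta(K, span_n G):

    \[
      \delta(K, M) = \sup_{f \in K} \inf_{g \in M} \| f - g \|, \qquad
      d_n(K) = \inf_{\dim X_n = n} \delta(K, X_n), \qquad
      \operatorname{span}_n G = \Bigl\{ \sum_{i=1}^{n} c_i g_i : c_i \in \mathbb{R},\ g_i \in G \Bigr\}.
    \]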
