Number of the records: 1  

Bounds on the complexity of neural-network models and comparison with linear methods

  1.
    0505027 - ÚI 2020 GB eng J - Journal Article
    Hlaváčková-Schindler, Kateřina - Sanguineti, M.
    Bounds on the complexity of neural-network models and comparison with linear methods.
    International Journal of Adaptive Control and Signal Processing. Vol. 17, No. 2 (2003), pp. 179-194. ISSN 0890-6327. E-ISSN 1099-1115
    Keywords: adaptive control * system identification * approximation rates * non-linear models * polynomially bounded complexity * curse of dimensionality * neural networks
    Impact factor: 0.602, year: 2003

    A class of non-linear models with the structure of combinations of simple, parametrized basis functions is investigated; this class includes widely used neural networks, in which the basis functions correspond to the networks' computational units. Bounds on the complexity of such models are derived in terms of the number of adjustable parameters required for a given modelling accuracy. These bounds guarantee a more advantageous tradeoff between modelling accuracy and model complexity than linear methods: the number of parameters may grow much more slowly, in some cases only polynomially, with the dimensionality of the input space. Polynomial bounds on complexity make it possible to cope with the so-called 'curse of dimensionality', which often renders linear methods either inaccurate or computationally infeasible. The presented results give deeper theoretical insight into the effectiveness of neural-network architectures observed in complex modelling applications.
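    The tradeoff described in the abstract can be roughly illustrated numerically. The sketch below is not the paper's exact bounds; it assumes, for illustration only, a classical worst-case linear-approximation rate of n^(-1/d) (so roughly (1/eps)^d parameters for accuracy eps in dimension d) against a Barron-type dimension-independent rate of n^(-1/2) (roughly (1/eps)^2 parameters):

    ```python
    import math

    def params_linear(eps: float, d: int) -> int:
        # Assumed fixed-basis (linear) worst-case rate n^(-1/d):
        # accuracy eps needs about (1/eps)^d parameters -- exponential in d.
        return math.ceil((1.0 / eps) ** d)

    def params_neural(eps: float) -> int:
        # Assumed Barron-type rate n^(-1/2), independent of dimension:
        # accuracy eps needs about (1/eps)^2 parameters.
        return math.ceil((1.0 / eps) ** 2)

    eps = 0.1
    for d in (2, 5, 10):
        print(f"d={d}: linear ~{params_linear(eps, d)}, neural ~{params_neural(eps)}")
    ```

    Even at moderate dimension the hypothetical linear-method count explodes while the neural-network count stays fixed, which is the qualitative content of the 'curse of dimensionality' comparison above.
    
    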
    Permanent Link: http://hdl.handle.net/11104/0296550
