Number of records: 1

Can Dictionary-based Computational Models Outperform the Best Linear Ones?

  1.
    0360287 - UIVT-O 2012 RIV GB eng J - Journal article
    Gnecco, G. - Kůrková, Věra - Sanguineti, M.
    Can Dictionary-based Computational Models Outperform the Best Linear Ones?
    Neural Networks. Vol. 24, No. 8 (2011), pp. 881-887. ISSN 0893-6080
    CEP grant: GA MŠk OC10047
    Other grants: CNR - AV ČR project 2010-2012(XE) Complexity of Neural-Network and Kernel Computational Models
    Institutional research plan: CEZ:AV0Z10300504
    Keywords: dictionary-based approximation * linear approximation * rates of approximation * worst-case error * Kolmogorov width * perceptron networks
    RIV field code: IN - Informatics
    Impact factor: 2.182, year: 2011

    Approximation capabilities of two types of computational models are explored: dictionary-based models (i.e., linear combinations of n-tuples of basis functions computable by units belonging to a set called a "dictionary") and linear ones (i.e., linear combinations of n fixed basis functions). The two models are compared in terms of approximation rates, i.e., the speeds at which approximation errors decrease as the number n of basis functions grows. Proofs of upper bounds on approximation rates by dictionary-based models are inspected to show that, for individual functions, they do not yield estimates for dictionary-based models that fail to hold for some linear models as well. In contrast, the possibility of faster approximation rates by dictionary-based models is demonstrated for worst-case errors in the approximation of suitable sets of functions; for such sets, even geometric upper bounds hold.
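    The comparison in the abstract can be illustrated numerically. The sketch below (all choices here are illustrative assumptions, not taken from the paper: the target function, the dictionary of shifted ReLU "perceptron-type" units, the monomial basis for the linear model, and the grid discretization) contrasts a linear model built from n fixed basis functions with an n-term model whose units are greedily selected from a dictionary and refitted by least squares:

    ```python
    import numpy as np

    # Illustrative setup (assumptions, not from the paper):
    # grid on [0, 1], a target with a kink, and a dictionary of
    # shifted ReLU units x -> max(0, x - t), one per threshold t.
    x = np.linspace(0.0, 1.0, 400)
    target = np.abs(x - 0.37)
    thresholds = np.linspace(0.0, 1.0, 200)
    dictionary = np.maximum(0.0, x[:, None] - thresholds[None, :])

    def linear_error(n):
        """L2 error of the best combination of n fixed monomials 1, x, ..., x^(n-1)."""
        A = np.vander(x, n, increasing=True)
        coef, *_ = np.linalg.lstsq(A, target, rcond=None)
        return np.linalg.norm(A @ coef - target) / np.sqrt(len(x))

    def greedy_error(n):
        """L2 error after greedily selecting n dictionary units (orthogonal greedy algorithm)."""
        chosen = []
        residual = target.copy()
        for _ in range(n):
            # pick the unit most correlated with the current residual
            # (unnormalized correlation, kept simple for the sketch)
            scores = np.abs(dictionary.T @ residual)
            chosen.append(int(np.argmax(scores)))
            # refit all chosen units by least squares, update the residual
            A = dictionary[:, chosen]
            coef, *_ = np.linalg.lstsq(A, target, rcond=None)
            residual = target - A @ coef
        return np.linalg.norm(residual) / np.sqrt(len(x))

    for n in (2, 4, 8):
        print(n, linear_error(n), greedy_error(n))
    ```

    The printed errors show how each model's approximation error decreases as n grows; the dictionary-based model can adapt its units (here, ReLU thresholds) to the target, whereas the linear model's basis is fixed in advance.
    
    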
    Permanent link: http://hdl.handle.net/11104/0197874