Number of records: 1

Limitations of Shallow Networks Representing Finite Mappings

  1.
    0485613 - ÚI 2020 RIV US eng J - Journal article
    Kůrková, Věra
    Limitations of Shallow Networks Representing Finite Mappings.
    Neural Computing & Applications. Vol. 31, No. 6 (2019), pp. 1783-1792. ISSN 0941-0643
    CEP grants: GA ČR GA15-18108S; GA ČR (CZ) GA18-23827S
    Institutional support: RVO:67985807
    Keywords: shallow and deep networks * sparsity * variational norms * functions on large finite domains * finite dictionaries of computational units * pseudo-noise sequences * perceptron networks
    RIV field code: IN - Informatics
    OECD field: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
    Impact factor: 4.664, year: 2018

    Limitations on the capability of shallow networks to efficiently compute real-valued functions on finite domains are investigated. Efficiency is studied in terms of network sparsity and its approximate measures. It is shown that when a dictionary of computational units is not sufficiently large, the computation of almost any uniformly randomly chosen function is either a well-conditioned task performed by a large network or an ill-conditioned task performed by a network of moderate size. The probabilistic results are complemented by a concrete example of a class of functions that cannot be efficiently computed by shallow perceptron networks. The class is constructed using pseudo-noise sequences, which have many features of random sequences but can be generated using special polynomials. Connections to the No Free Lunch Theorem and the central paradox of coding theory are discussed.
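    The "pseudo-noise sequences ... generated using special polynomials" mentioned in the abstract are the classical maximal-length (m-)sequences produced by a linear-feedback shift register whose feedback taps come from a primitive polynomial over GF(2). A minimal sketch for illustration (the function name and tap convention are mine, not from the paper):

    ```python
    def lfsr_pn_sequence(taps, state):
        """Generate one period of a pseudo-noise (m-)sequence with a
        Fibonacci LFSR.  `taps` lists the exponents of the nonzero terms
        of the feedback polynomial (constant term implied); `state` is a
        nonzero initial register given as a list of bits."""
        n = len(state)
        period = 2 ** n - 1          # maximal period for a primitive polynomial
        reg = list(state)
        out = []
        for _ in range(period):
            out.append(reg[-1])      # output the last register bit
            fb = 0
            for t in taps:           # XOR the tapped bits (1-indexed)
                fb ^= reg[t - 1]
            reg = [fb] + reg[:-1]    # shift right, insert feedback bit
        return out

    # Example: the primitive polynomial x^4 + x + 1 gives period 2**4 - 1 = 15.
    seq = lfsr_pn_sequence([4, 1], [1, 0, 0, 0])
    print(len(seq), sum(seq))  # prints: 15 8  (balance: 8 ones, 7 zeros)
    ```

    Despite their noise-like statistics (balance, run distribution, two-valued autocorrelation), such sequences are deterministic, which is what makes them usable as concrete hard instances for shallow perceptron networks.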
    Permanent link: http://hdl.handle.net/11104/0280569
    File: 0485613.pdf (author's preprint, available on request)