Number of records: 1
Bounds on Sparsity of One-Hidden-Layer Perceptron Networks
- 1.0478625 - ÚI 2018 RIV DE eng C - Conference paper (international conference)
Kůrková, Věra
Bounds on Sparsity of One-Hidden-Layer Perceptron Networks.
Proceedings ITAT 2017: Information Technologies - Applications and Theory. Aachen & Charleston: Technical University & CreateSpace Independent Publishing Platform, 2017 - (Ed.: Hlaváčová, J.), pp. 100-105. CEUR Workshop Proceedings, V-1885. ISBN 978-1974274741. ISSN 1613-0073.
[ITAT 2017. Conference on Theory and Practice of Information Technologies - Applications and Theory /17./. Martinské hole (SK), 22.09.2017-26.09.2017]
CEP grant: GA ČR GA15-18108S
Institutional support: RVO:67985807
Keywords: shallow perceptron networks * sparse networks * pseudo-noise sequences * variational norm
OECD field: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
http://ceur-ws.org/Vol-1885/100.pdf
Limitations of one-hidden-layer (shallow) perceptron networks in sparsely representing multivariable functions are investigated. A concrete class of functions is described whose computation by shallow perceptron networks either requires a large number of units or is unstable due to large output weights. The class is constructed using pseudo-noise sequences, which share many features of random sequences but can be generated using special polynomials. Connections with the central paradox of coding theory are discussed.
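The pseudo-noise sequences mentioned in the abstract can be generated by a linear-feedback shift register whose feedback taps come from a primitive polynomial over GF(2). As a minimal illustration (not the paper's construction; the tap positions and seed below are assumptions for the example), the primitive polynomial x^4 + x + 1 yields a maximum-length sequence of period 2^4 - 1 = 15:

```python
def pn_sequence(taps, state, length):
    """Generate a pseudo-noise (maximum-length) bit sequence with a
    Fibonacci linear-feedback shift register.

    taps   -- register positions fed back (taps of a primitive
              polynomial over GF(2))
    state  -- nonzero initial register contents (list of bits)
    length -- number of output bits to produce
    """
    out = []
    for _ in range(length):
        out.append(state[-1])          # output the last register bit
        fb = 0
        for t in taps:                 # XOR together the tapped bits
            fb ^= state[t]
        state = [fb] + state[:-1]      # shift right, insert feedback bit
    return out

# Primitive polynomial x^4 + x + 1 -> taps at positions 0 and 3;
# a 4-bit register then cycles through all 15 nonzero states,
# so one period contains 2^3 = 8 ones and 7 zeros (balance property).
seq = pn_sequence(taps=[0, 3], state=[1, 0, 0, 0], length=15)
```

The near-equal count of ones and zeros in each period is one of the "features of random sequences" the abstract alludes to.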
Permanent link: http://hdl.handle.net/11104/0274766
File: a0478625.pdf (309.1 KB, 2 downloads) - publisher's postprint, available on request