Number of the records: 1
Limitations of One-Hidden-Layer Perceptron Networks
- 1.0447921 - ÚI 2016 RIV DE eng C - Conference Paper (international conference)
Kůrková, Věra
Limitations of One-Hidden-Layer Perceptron Networks.
Proceedings ITAT 2015: Information Technologies - Applications and Theory. Aachen & Charleston: Technical University & CreateSpace Independent Publishing Platform, 2015 - (Yaghob, J.), pp. 167-171. CEUR Workshop Proceedings, V-1422. ISBN 978-1-5151-2065-0. ISSN 1613-0073.
[ITAT 2015. Conference on Theory and Practice of Information Technologies /15./. Slovenský Raj (SK), 17.09.2015-21.09.2015]
R&D Projects: GA MŠMT(CZ) LD13002
Institutional support: RVO:67985807
Keywords : perceptron networks * model complexity * representations of finite mappings by neural networks
Subject RIV: IN - Informatics, Computer Science
Limitations of one-hidden-layer perceptron networks in efficiently representing finite mappings are investigated. It is shown that almost any uniformly randomly chosen mapping on a sufficiently large finite domain cannot be tractably represented by a one-hidden-layer perceptron network. This existential probabilistic result is complemented by a concrete example of a class of functions constructed using quasi-random sequences. Analogies with the central paradox of coding theory and the no-free-lunch theorem are discussed.
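The flavor of the abstract's probabilistic claim can be conveyed by a simple counting sketch (my own illustration, not the paper's actual argument): with quantized weights, a one-hidden-layer network with n hidden units can only encode finitely many distinct parameter settings, while the number of Boolean functions on {0,1}^k is 2^(2^k). The bit budget needed merely to distinguish all such functions forces n to grow exponentially with the input dimension k. The parameter count per unit (k weights, a bias, and an output weight) and the bit width are assumptions of this sketch.

```python
import math

def min_hidden_units(k: int, bits: int = 16) -> int:
    """Smallest n such that bits * n * (k + 2) >= 2**k, i.e. the
    parameter bit budget could in principle distinguish all 2**(2**k)
    Boolean functions on {0,1}^k. Illustrative lower bound only."""
    m = 2 ** k                  # number of points in the finite domain {0,1}^k
    params_per_unit = k + 2     # k input weights + bias + output weight (assumed)
    return math.ceil(m / (bits * params_per_unit))

# The bound grows exponentially with the input dimension k:
for k in (4, 8, 16, 24):
    print(k, min_hidden_units(k))
```

For k = 16 the sketch already requires hundreds of hidden units, and for k = 24 tens of thousands, echoing the paper's point that almost all mappings on large finite domains are intractable for such networks.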
Permanent Link: http://hdl.handle.net/11104/0249675
File: a0447921.pdf (606.5 KB), Publisher's postprint, restricted access