Number of records: 1
Lower Bounds on Complexity of Shallow Perceptron Networks
- 1.0460704 - ÚI 2017 RIV CH eng C - Conference paper (international conference)
Kůrková, Věra
Lower Bounds on Complexity of Shallow Perceptron Networks.
Engineering Applications of Neural Networks. Cham: Springer, 2016 - (Jayne, C.; Iliadis, L.), pp. 283-294. Communications in Computer and Information Science, 629. ISBN 978-3-319-44187-0. ISSN 1865-0929.
[EANN 2016. International Conference /17./. Aberdeen (GB), 02.09.2016-05.09.2016]
CEP grant: GA ČR GA15-18108S
Institutional support: RVO:67985807
Keywords: shallow feedforward networks * signum perceptrons * finite mappings * model complexity * Hadamard matrices
OECD field: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Model complexity of shallow (one-hidden-layer) perceptron networks computing multivariable functions on finite domains is investigated. Lower bounds are derived on the growth of the number of network units or the sizes of output weights in terms of variations of the functions to be computed. A concrete construction is presented of a class of functions which cannot be computed by perceptron networks with considerably smaller numbers of units and output weights than the sizes of the functions' domains. In particular, functions on Boolean d-dimensional cubes are constructed which cannot be computed by shallow perceptron networks with numbers of hidden units and sizes of output weights depending on d polynomially. A subclass of these functions is described whose elements can be computed by two-hidden-layer networks with the number of units depending on d linearly.
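The abstract refers to functions on Boolean d-dimensional cubes built from Hadamard matrices. The following is a minimal sketch of one standard way such functions are obtained, assuming Sylvester-type Hadamard matrices and the convention f(x, y) = H[index(x)][index(y)] for a point split into two halves of k bits each (d = 2k); the paper's exact construction may differ in details.

```python
def sylvester_hadamard(k):
    """Return the 2^k x 2^k Sylvester Hadamard matrix as nested lists of +/-1.

    Built by the standard doubling recursion H_{2n} = [[H_n, H_n], [H_n, -H_n]].
    """
    H = [[1]]
    for _ in range(k):
        H = [row + row for row in H] + [row + [-v for v in row] for row in H]
    return H

def bits_to_index(bits):
    """Interpret a sequence of 0/1 bits as a binary number (most significant first)."""
    idx = 0
    for b in bits:
        idx = 2 * idx + b
    return idx

def hadamard_function(k):
    """A +/-1-valued function on the Boolean cube {0,1}^(2k) whose values are
    the entries of the 2^k x 2^k Hadamard matrix, indexed by the two halves
    of the input point. (Illustrative convention, not taken from the paper.)"""
    H = sylvester_hadamard(k)
    def f(point):
        x, y = point[:k], point[k:]
        return H[bits_to_index(x)][bits_to_index(y)]
    return f
```

The rows of a Hadamard matrix are pairwise orthogonal, which is the property that drives the lower bounds on variation, and hence on the number of units or output weights, stated in the abstract.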
Permanent link: http://hdl.handle.net/11104/0260719
File: a0460704.pdf | Downloads: 3 | Size: 196.9 KB | Version: publisher's postprint | Access: on request