Number of the records: 1
Sparsity and Complexity of Networks Computing Highly-Varying Functions
- 1.0493825 - ÚI 2019 RIV CH eng C - Conference Paper (international conference)
Kůrková, Věra
Sparsity and Complexity of Networks Computing Highly-Varying Functions.
Artificial Neural Networks and Machine Learning – ICANN 2018. Proceedings, Part III. Cham: Springer, 2018 - (Kůrková, V.; Manolopoulos, Y.; Hammer, B.; Iliadis, L.; Maglogiannis, I.), pp. 534-543. Lecture Notes in Computer Science, 11141. ISBN 978-3-030-01423-0. ISSN 0302-9743.
[ICANN 2018. International Conference on Artificial Neural Networks /27./. Rhodes (GR), 04.10.2018-07.10.2018]
R&D Projects: GA ČR(CZ) GA18-23827S
Institutional support: RVO:67985807
Keywords : Shallow and deep networks * Model complexity * Sparsity * Highly-varying functions * Covering numbers * Dictionaries of computational units * Perceptrons
OECD category: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
https://www.springer.com/us/book/9783030014230
Approximative measures of network sparsity in terms of norms tailored to dictionaries of computational units are investigated. Lower bounds on these norms of real-valued functions on finite domains are derived. The bounds are proven by combining the concentration-of-measure property of high-dimensional spaces with a characterization of dictionaries of computational units in terms of their capacities and coherence, measured by their covering numbers. The results are applied to dictionaries used in neurocomputing, which have power-type covering numbers. The probabilistic results are illustrated by a concrete construction of a class of functions whose computation by perceptron networks requires a large number of units or is unstable due to large output weights.
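The "norms tailored to dictionaries of computational units" mentioned in the abstract refer, in this line of work, to variational norms. A minimal sketch of the standard definition (the notation is assumed, not taken from this record): for a dictionary G of computational units in a normed linear space, the G-variation of a function f is the Minkowski functional of the closed convex symmetric hull of G,

```latex
\|f\|_{G} \;=\; \inf\Bigl\{\, c > 0 \;:\; \tfrac{f}{c} \in \operatorname{cl}\,\operatorname{conv}\bigl(G \cup (-G)\bigr) \Bigr\}.
```

A large value of this norm indicates that f cannot be computed or closely approximated by a network combining few units from G with small output weights, which is the sense of sparsity addressed by the lower bounds in the paper.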
Permanent Link: http://hdl.handle.net/11104/0287121
File: a0493825.pdf (377.7 KB), Author's postprint, access restricted.