Number of records: 1
Approximation of Classifiers by Deep Perceptron Networks
SYSNO ASEP: 0572576
Document Type: J - Journal Article
R&D Document Type: Journal Article
Subsidiary J: Article in WOS
Title: Approximation of Classifiers by Deep Perceptron Networks
Author(s): Kůrková, Věra (UIVT-O) RID, SAI, ORCID; Sanguineti, M. (IT)
Source Title: Neural Networks. - : Elsevier - ISSN 0893-6080
Volume: Vol. 165, August 2023 (2023), pp. 654-661
Number of pages: 8
Publication form: Print - P
Language: eng - English
Country: GB - United Kingdom
Keywords: Approximation by deep networks; Probabilistic bounds on approximation errors; Random classifiers; Concentration of measure; Method of bounded differences; Growth functions
OECD category: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
R&D Projects: GA22-02067S GA ČR - Czech Science Foundation (CSF)
Method of publishing: Limited access
Institutional support: UIVT-O - RVO:67985807
UT WOS: 001058145100001
EID SCOPUS: 85163371420
DOI: 10.1016/j.neunet.2023.06.004
Annotation: We employ properties of high-dimensional geometry to obtain some insights into the capabilities of deep perceptron networks to classify large data sets. We derive conditions on network depths, types of activation functions, and numbers of parameters that imply that approximation errors behave almost deterministically. We illustrate the general results by concrete cases of popular activation functions: Heaviside, ramp sigmoid, rectified linear, and rectified power. Our probabilistic bounds on approximation errors are derived using concentration of measure type inequalities (method of bounded differences) and concepts from statistical learning theory.
Workplace: Institute of Computer Science
Contact: Tereza Šírová, sirova@cs.cas.cz, Tel.: 266 053 800
Year of Publishing: 2024
Electronic address: https://dx.doi.org/10.1016/j.neunet.2023.06.004
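The annotation names four concrete activation functions. As a minimal NumPy sketch, their standard textbook definitions are shown below; the paper's exact parameterizations (e.g. the exponent of the rectified power unit) are not given in this record, so the forms here are assumptions for illustration only.

```python
import numpy as np

def heaviside(x):
    # Heaviside step: 0 for x < 0, 1 for x >= 0
    return np.where(x >= 0.0, 1.0, 0.0)

def ramp_sigmoid(x):
    # Ramp (piecewise-linear) sigmoid: identity clipped to [0, 1]
    return np.clip(x, 0.0, 1.0)

def relu(x):
    # Rectified linear unit: max(0, x)
    return np.maximum(x, 0.0)

def rectified_power(x, p=2):
    # Rectified power unit: max(0, x)**p; p=2 is an assumed example exponent
    return np.maximum(x, 0.0) ** p
```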