Number of records: 1

Bounds on Sparsity of One-Hidden-Layer Perceptron Networks

  1. 0478625 - ÚI 2018 RIV DE eng C - Conference Paper (international conference)
    Kůrková, Věra
    Bounds on Sparsity of One-Hidden-Layer Perceptron Networks.
    Proceedings ITAT 2017: Information Technologies - Applications and Theory. Aachen & Charleston: Technical University & CreateSpace Independent Publishing Platform, 2017 - (Hlaváčová, J.), pp. 100-105. CEUR Workshop Proceedings, Vol-1885. ISBN 978-1974274741. ISSN 1613-0073.
    [ITAT 2017. Conference on Theory and Practice of Information Technologies - Applications and Theory /17./. Martinské hole (SK), 22.09.2017-26.09.2017]
    R&D Projects: GA ČR GA15-18108S
    Institutional support: RVO:67985807
    Keywords: shallow perceptron networks * sparse networks * pseudo-noise sequences * variational norm
    OECD category: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
    http://ceur-ws.org/Vol-1885/100.pdf

    Limitations of one-hidden-layer (shallow) perceptron networks in sparsely representing multivariable functions are investigated. A concrete class of functions is described whose computation by shallow perceptron networks either requires a large number of units or is unstable due to large output weights. The class is constructed using pseudo-noise sequences, which have many features of random sequences but can be generated using special polynomials. Connections with the central paradox of coding theory are discussed.
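    For illustration only (this sketch is not taken from the paper): pseudo-noise sequences of the kind mentioned in the abstract can be generated by a linear-feedback shift register whose recurrence comes from a primitive polynomial over GF(2). A minimal Python sketch, assuming the primitive polynomial x^4 + x + 1 as an example choice:

        def pn_sequence(length, state=(1, 0, 0, 0)):
            """Bits of the maximum-length (pseudo-noise) sequence with
            recurrence a[n+4] = a[n+1] XOR a[n], i.e. the primitive
            polynomial x^4 + x + 1 over GF(2). `state` must be nonzero."""
            reg = list(state)
            bits = []
            for _ in range(length):
                bits.append(reg[0])
                fb = reg[1] ^ reg[0]   # a[n+4] = a[n+1] + a[n] (mod 2)
                reg = reg[1:] + [fb]
            return bits

        # One full period has 2**4 - 1 = 15 bits (here: 8 ones, 7 zeros),
        # exhibiting the balance property of pseudo-random sequences.
        print(pn_sequence(15))

    Mapping each bit b to (-1)**b yields a sign vector that mimics a random ±1 sequence, which is the pseudo-random behavior the abstract alludes to.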
    Permanent Link: http://hdl.handle.net/11104/0274766

     
    File: a0478625.pdf
    Download: 2
    Size: 309.1 KB
    Version: Publisher’s postprint
    Access: required
