Classification by Sparse Neural Networks

  1.
    0485611 - ÚI 2020 RIV US eng J - Journal Article
    Kůrková, Věra - Sanguineti, M.
    Classification by Sparse Neural Networks.
    IEEE Transactions on Neural Networks and Learning Systems. Vol. 30, No. 9 (2019), pp. 2746-2754. ISSN 2162-237X. E-ISSN 2162-2388
    R&D Projects: GA ČR GA15-18108S; GA ČR(CZ) GA18-23827S
    Institutional support: RVO:67985807
    Keywords : Binary classification * Chernoff–Hoeffding bound * dictionaries of computational units * feedforward networks * measures of sparsity
    OECD category: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
    Impact factor: 8.793, year: 2019
    Method of publishing: Limited access
    http://dx.doi.org/10.1109/TNNLS.2018.2888517

    The choice of dictionaries of computational units suitable for efficient computation of binary classification tasks is investigated. To deal with sets of tasks that grow exponentially with the size of their domains, a probabilistic model is introduced. The relevance of tasks for a given application area is modeled by a product probability distribution on the set of all binary-valued functions. Approximate measures of network sparsity are studied in terms of variational norms tailored to dictionaries of computational units. Bounds on these norms are proven using the Chernoff–Hoeffding bound on sums of independent random variables that need not be identically distributed. Consequences of the probabilistic results for the choice of dictionaries of computational units are derived. It is shown that when a priori knowledge of the type of classification tasks is limited, sparsity can be achieved only at the expense of large dictionary sizes.
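    For reference, a minimal sketch of the two tools named in the abstract, in their standard textbook forms (the article's exact formulations may differ): the variational norm of a function f with respect to a bounded dictionary G (G-variation), and Hoeffding's form of the Chernoff–Hoeffding bound for independent, not necessarily identically distributed, bounded random variables $a_i \le X_i \le b_i$:

    $$\|f\|_{G} \;=\; \inf\bigl\{\, c > 0 \;:\; f/c \in \operatorname{cl\,conv}\,(G \cup -G) \,\bigr\}$$

    $$\Pr\!\Bigl(\textstyle\sum_{i=1}^{n} X_i - \mathbb{E}\bigl[\sum_{i=1}^{n} X_i\bigr] \ge t\Bigr) \;\le\; \exp\!\Bigl(-\frac{2t^{2}}{\sum_{i=1}^{n}(b_i - a_i)^{2}}\Bigr)$$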
    Permanent Link: http://hdl.handle.net/11104/0280566

     
    File: 0485611-a.pdf
    Download: 18
    Size: 458.9 KB
    Version: Publisher's postprint
    Access: require
     