Constructive Lower Bounds on Model Complexity of Shallow Perceptron Networks

  1.
    0474092 - ÚI 2019 RIV US eng J - Journal Article
    Kůrková, Věra
    Constructive Lower Bounds on Model Complexity of Shallow Perceptron Networks.
    Neural Computing & Applications. Vol. 29, No. 7 (2018), pp. 305-315. ISSN 0941-0643. E-ISSN 1433-3058
    R&D Projects: GA ČR GA15-18108S
    Institutional support: RVO:67985807
    Keywords : shallow and deep networks * model complexity and sparsity * signum perceptron networks * finite mappings * variational norms * Hadamard matrices
    OECD category: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
    Impact factor: 4.664, year: 2018

    Limitations of shallow (one-hidden-layer) perceptron networks are investigated with respect to computing multivariable functions on finite domains. Lower bounds on the growth of the number of network units or on the sizes of output weights are derived in terms of variations of the functions to be computed. A concrete construction is presented of a class of functions that cannot be computed by signum or Heaviside perceptron networks with considerably fewer units and smaller output weights than the sizes of the functions' domains. A subclass of these functions is described whose elements can be computed by two-hidden-layer perceptron networks with a number of units depending linearly on the logarithm of the size of the domain.
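    The Hadamard-matrix construction mentioned in the abstract exploits the orthogonality of the rows of a Hadamard matrix: a matrix M with entries in {-1, 1} induces a function f_M on a finite grid of Boolean vectors, and the orthogonality of distinct rows is what drives the variational-norm lower bound. The following Python sketch is an illustration only, not code from the article; the names sylvester and f_M and the indexing convention are hypothetical choices for this example, and the precise bound is stated in the paper itself.

        import numpy as np

        def sylvester(k):
            """Return the 2^k x 2^k Sylvester-type Hadamard matrix via Kronecker products."""
            H = np.array([[1]])
            block = np.array([[1, 1], [1, -1]])
            for _ in range(k):
                H = np.kron(H, block)
            return H

        def f_M(H, x, y):
            """Function induced by a Hadamard matrix H on {0,1}^k x {0,1}^k:
            f_M(x, y) = H[i(x), i(y)], reading each Boolean vector as a binary
            row/column index (an illustrative convention, not the paper's notation)."""
            i = int("".join(map(str, x)), 2)
            j = int("".join(map(str, y)), 2)
            return int(H[i, j])

        k = 3
        H = sylvester(k)            # 8 x 8 matrix with entries in {-1, 1}
        n = 2 ** k
        # Orthogonality of distinct rows, H @ H.T == n * I, is the property
        # the lower-bound argument builds on.
        assert np.array_equal(H @ H.T, n * np.eye(n))
        print(f_M(H, np.array([1, 0, 1]), np.array([0, 1, 1])))  # value in {-1, 1}

    For such induced functions the article shows that shallow signum or Heaviside perceptron networks need either many units or large output weights, whereas a two-hidden-layer architecture can be far more economical.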
    Permanent Link: http://hdl.handle.net/11104/0271209

     
    File: a0474092.pdf (8495.8 KB), Publisher's postprint, access: required
     