Number of the records: 1  

Analog Neuron Hierarchy

  1.
    SYSNO: ASEP0507515
    Document Type: J - Journal Article
    R&D Document Type: Journal Article
    Subsidiary J: Article in WOS
    Title: Analog Neuron Hierarchy
    Author(s): Šíma, Jiří (UIVT-O) RID, SAI, ORCID
    Source Title: Neural Networks. - : Elsevier - ISSN 0893-6080
    Vol. 128, August 2020 (2020), pp. 199-215
    Number of pages: 17
    Language: eng - English
    Country: GB - United Kingdom
    Keywords: recurrent neural network ; analog neuron hierarchy ; deterministic context-free language ; Turing machine ; Chomsky hierarchy
    Subject RIV: IN - Informatics, Computer Science
    OECD category: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
    R&D Projects: GA19-05704S GA ČR - Czech Science Foundation (CSF)
    Method of publishing: Limited access
    Institutional support: UIVT-O - RVO:67985807
    UT WOS: 000567812200017
    EID SCOPUS: 85084938909
    DOI: 10.1016/j.neunet.2020.05.006
    Annotation: In order to refine the analysis of the computational power of discrete-time recurrent neural networks (NNs) between the binary-state NNs, which are equivalent to finite automata (level 3 in the Chomsky hierarchy), and the analog-state NNs with rational weights, which are Turing complete (Chomsky level 0), we study an intermediate model αANN of a binary-state NN extended with α ≥ 0 extra analog-state neurons. For rational weights, we establish an analog neuron hierarchy 0ANNs ⊂ 1ANNs ⊂ 2ANNs ⊆ 3ANNs and separate its first two levels. In particular, 0ANNs coincide with the binary-state NNs (Chomsky level 3), being a proper subset of 1ANNs, which accept at most context-sensitive languages (Chomsky level 1), including some non-context-free ones (above Chomsky level 2). We prove that the deterministic (context-free) language L_# = { 0^n 1^n | n ≥ 1 } cannot be recognized by any 1ANN, even with real weights. In contrast, we show that deterministic pushdown automata accepting deterministic languages can be simulated by 2ANNs with rational weights, which thus constitute a proper superset of 1ANNs. Finally, we prove that the analog neuron hierarchy collapses to 3ANNs by showing that any Turing machine can be simulated by a 3ANN having rational weights, with linear-time overhead.
    Workplace: Institute of Computer Science
    Contact: Tereza Šírová, sirova@cs.cas.cz, Tel.: 266 053 800
    Year of Publishing: 2021
    Electronic address: http://dx.doi.org/10.1016/j.neunet.2020.05.006
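For illustration only (not part of the catalog record): the annotation's separating language L_# = { 0^n 1^n | n ≥ 1 } is deterministic context-free, so a deterministic pushdown automaton with a single counter recognizes it, which is the capability the paper shows 2ANNs have and 1ANNs lack. A minimal sketch of such a recognizer:

```python
def accepts_L_hash(word: str) -> bool:
    """Return True iff word is in L_# = { 0^n 1^n | n >= 1 }."""
    counter = 0        # models the stack depth of a DPDA pushing one symbol per '0'
    seen_one = False   # once a '1' appears, no '0' may follow
    for ch in word:
        if ch == '0':
            if seen_one:
                return False   # '0' after '1' breaks the 0^n 1^n shape
            counter += 1
        elif ch == '1':
            seen_one = True
            counter -= 1       # pop one symbol per '1'
            if counter < 0:
                return False   # more '1's than '0's
        else:
            return False       # symbol outside the alphabet {0, 1}
    # accept only if at least one '1' was read (n >= 1) and counts match
    return seen_one and counter == 0
```

For example, `accepts_L_hash("0011")` returns True, while `accepts_L_hash("0101")` and the empty string are rejected.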