Number of the records: 1
Analog Neuron Hierarchy
SYSNO ASEP: 0507515
Document Type: J - Journal Article
R&D Document Type: Journal Article
Subsidiary: J - Article in WOS
Title: Analog Neuron Hierarchy
Author(s): Šíma, Jiří (UIVT-O) RID, SAI, ORCID
Source Title: Neural Networks. - : Elsevier - ISSN 0893-6080
Volume/Pages: Vol. 128, August 2020 (2020), pp. 199-215
Number of Pages: 17
Language: eng - English
Country: GB - United Kingdom
Keywords: recurrent neural network ; analog neuron hierarchy ; deterministic context-free language ; Turing machine ; Chomsky hierarchy
Subject RIV: IN - Informatics, Computer Science
OECD Category: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
R&D Projects: GA19-05704S GA ČR - Czech Science Foundation (CSF)
Method of Publishing: Limited access
Institutional Support: UIVT-O - RVO:67985807
UT WOS: 000567812200017
EID SCOPUS: 85084938909
DOI: https://doi.org/10.1016/j.neunet.2020.05.006
Annotation: In order to refine the analysis of the computational power of discrete-time recurrent neural networks (NNs) between the binary-state NNs, which are equivalent to finite automata (level 3 in the Chomsky hierarchy), and the analog-state NNs with rational weights, which are Turing complete (Chomsky level 0), we study an intermediate model, the αANN: a binary-state NN extended with α ≥ 0 extra analog-state neurons. For rational weights, we establish an analog neuron hierarchy 0ANNs ⊊ 1ANNs ⊊ 2ANNs ⊆ 3ANNs and separate its first two levels. In particular, 0ANNs coincide with the binary-state NNs (Chomsky level 3), forming a proper subset of 1ANNs, which accept at most context-sensitive languages (Chomsky level 1), including some non-context-free ones (above Chomsky level 2). We prove that the deterministic (context-free) language L_# = { 0^n 1^n | n ≥ 1 } cannot be recognized by any 1ANN, even with real weights. In contrast, we show that deterministic pushdown automata accepting deterministic context-free languages can be simulated by 2ANNs with rational weights, which thus constitute a proper superset of 1ANNs. Finally, we prove that the analog neuron hierarchy collapses to 3ANNs by showing that any Turing machine can be simulated by a 3ANN with rational weights, with linear-time overhead.
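The separating language L_# = { 0^n 1^n | n ≥ 1 } from the annotation is deterministic context-free, so it is recognized by a deterministic pushdown automaton. The following minimal Python sketch simulates such a DPDA directly; it only illustrates the language itself, and is not the paper's 2ANN construction.

```python
def accepts_L_hash(word: str) -> bool:
    """Deterministic pushdown recognizer for L_# = { 0^n 1^n | n >= 1 }.

    Illustrative sketch: a plain DPDA simulation using a Python list
    as the pushdown store, not the 2ANN simulation from the article.
    """
    stack = []          # pushdown store counting unmatched 0s
    state = "push"      # "push": reading 0s, "pop": matching 1s
    for symbol in word:
        if state == "push":
            if symbol == "0":
                stack.append("0")   # record each 0 on the stack
            elif symbol == "1" and stack:
                stack.pop()         # first 1: start matching against 0s
                state = "pop"
            else:
                return False        # a 1 before any 0, or a bad symbol
        else:  # state == "pop"
            if symbol == "1" and stack:
                stack.pop()         # match a further 1 against a stored 0
            else:
                return False        # too many 1s, or a 0 after a 1
    # accept iff at least one 0 was matched and none remain unmatched
    return state == "pop" and not stack
```

Because the machine never branches (each input symbol and stack top determine the next move), the simulation is deterministic, matching the DPDA model the 2ANN result refers to.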
Workplace: Institute of Computer Science
Contact: Tereza Šírová, sirova@cs.cas.cz, Tel.: 266 053 800
Year of Publishing: 2021
Electronic Address: http://dx.doi.org/10.1016/j.neunet.2020.05.006