Number of the records: 1  

Energy-Time Tradeoff in Recurrent Neural Nets

  1.
    SYSNO ASEP: 0472477
    Document Type: C - Proceedings Paper (int. conf.)
    R&D Document Type: Conference Paper
    Title: Energy-Time Tradeoff in Recurrent Neural Nets
    Author(s): Šíma, Jiří (UIVT-O) RID, SAI, ORCID
    Source Title: Artificial Neural Networks. Methods and Applications in Bio-/Neuroinformatics. - Cham : Springer, 2015 / Koprinkova-Hristova P. ; Mladenov V. ; Kasabov N.K. - ISSN 2193-9349 - ISBN 978-3-319-09902-6
    Pages: 51-62
    Number of pages: 12
    Publication form: Print - P
    Action: ICANN 2013. International Conference on Artificial Neural Networks /23./
    Event date: 10.09.2013-13.09.2013
    Event location: Sofia
    Country: BG - Bulgaria
    Event type: WRD
    Language: eng - English
    Country: CH - Switzerland
    Keywords: energy complexity ; recurrent neural network ; finite automaton ; energy-time tradeoff
    Subject RIV: IN - Informatics, Computer Science
    R&D Projects: GBP202/12/G061 GA ČR - Czech Science Foundation (CSF)
    Institutional support: UIVT-O - RVO:67985807
    UT WOS: 000380528700003
    EID SCOPUS: 85008397780
    DOI: 10.1007/978-3-319-09903-3_3
    Annotation: In this chapter, we deal with the energy complexity of perceptron networks, which has been inspired by the fact that the activity of neurons in the brain is quite sparse (with only about 1% of neurons firing). This complexity measure has recently been introduced for feedforward architectures (i.e., threshold circuits). We briefly survey the tradeoff results that relate the energy to other complexity measures such as the size and depth of threshold circuits. We generalize the energy complexity to recurrent architectures, where it counts the number of simultaneously active neurons at any time instant of a computation. We present our energy-time tradeoff result for recurrent neural nets, which are known to be computationally as powerful as finite automata. In particular, we show the main ideas of simulating any deterministic finite automaton by a low-energy optimal-size neural network. In addition, we present a lower bound on the energy of such a simulation (within a certain range of time overhead), which implies that the energy demands in a fixed-size network increase exponentially with the frequency of presenting the input bits.
    Workplace: Institute of Computer Science
    Contact: Tereza Šírová, sirova@cs.cas.cz, Tel.: 266 053 800
    Year of Publishing: 2017
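The energy measure described in the annotation - the number of simultaneously active neurons at any time instant of a recurrent computation - can be illustrated with a minimal sketch. The toy network, its weights, and the function name below are illustrative assumptions, not taken from the paper; the sketch only demonstrates counting active binary threshold units per parallel update step.

```python
import numpy as np

def run_threshold_network(W, b, x0, steps):
    """Iterate a recurrent network of binary threshold units and record
    the energy (count of simultaneously active neurons) at each instant."""
    x = np.array(x0, dtype=int)
    energies = [int(x.sum())]          # energy of the initial state
    for _ in range(steps):
        # Parallel update: neuron i fires iff its weighted input meets threshold
        x = (W @ x + b >= 0).astype(int)
        energies.append(int(x.sum()))
    return x, energies

# Hypothetical 3-neuron cycle: a single active token circulates,
# so the energy stays at 1 throughout the computation.
W = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])
b = np.array([-1, -1, -1])
state, energy = run_threshold_network(W, b, [1, 0, 0], steps=3)
```

In this toy run the maximum energy is 1 even though the network has 3 neurons, which mirrors the idea behind low-energy automaton simulations: spread the work over more time steps so that few neurons fire at once.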
