Number of records: 1

Shared input and recurrency in neural networks for metabolically efficient information transmission

  1.
    0584159 - FGÚ 2025 RIV US eng J - Journal Article
    Bárta, Tomáš - Košťál, Lubomír
    Shared input and recurrency in neural networks for metabolically efficient information transmission.
    PLoS Computational Biology. Roč. 20, č. 2 (2024), č. článku e1011896. ISSN 1553-734X. E-ISSN 1553-7358
    Grant - others:AV ČR(CZ) StrategieAV21/26
    Program: StrategieAV
    Institutional support: RVO:67985823
    Keywords : neural networks * probability distribution * neurons
    OECD category: Statistics and probability
    Impact factor: 4.3, year: 2022
    Method of publishing: Open access
    https://doi.org/10.1371/journal.pcbi.1011896

    Shared input to a population of neurons induces noise correlations, which can decrease the information carried by the population activity. Inhibitory feedback in recurrent neural networks can reduce these noise correlations and thus increase the information carried by the population activity. However, the activity of inhibitory neurons is costly, and the inhibitory feedback decreases the gain of the population, so depolarizing its neurons requires stronger excitatory synaptic input, which is associated with higher ATP consumption. Given that the goal of neural populations is to transmit as much information as possible at minimal metabolic cost, it is unclear whether the increased reliability of information transmission provided by inhibitory feedback compensates for these additional costs. We analyze this problem in a network of leaky integrate-and-fire neurons receiving correlated input. By maximizing mutual information under metabolic cost constraints, we show that there is an optimal strength of the recurrent connections that maximizes the mutual information per cost. For higher values of input correlation, the mutual information per cost is higher for recurrent networks with inhibitory feedback than for feedforward networks without any inhibitory neurons. Our results therefore show that the optimal synaptic strength of a recurrent network can be inferred from metabolically efficient coding arguments and that decorrelation of the input by inhibitory feedback compensates for the associated increase in metabolic cost. (An illustrative simulation sketch is given below the record.)
    Permanent Link: https://hdl.handle.net/11104/0352146
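
The abstract above outlines the modeling setup in words only. Purely as an illustration of one ingredient of that setup, the minimal sketch below simulates a small population of leaky integrate-and-fire neurons driven by partially shared noise, with an optional global inhibitory feedback term, and reports the mean firing rate (a crude proxy for metabolic cost) together with the mean pairwise spike-count correlation. All parameter values, variable names (e.g. g_inh, tau_fb), and the specific form of the feedback are assumptions chosen for readability; the sketch does not reproduce the paper's mutual-information-per-cost optimization.

    import numpy as np

    rng = np.random.default_rng(0)

    # --- assumed, purely illustrative parameters (not taken from the paper) ---
    N = 50              # number of LIF neurons
    T = 20.0            # simulated time [s]
    dt = 1e-3           # integration step [s]
    tau_m = 0.02        # membrane time constant [s]
    tau_fb = 0.01       # inhibitory feedback time constant [s]
    v_th, v_reset = 1.0, 0.0
    mu = 1.2            # mean excitatory drive (suprathreshold)
    sigma = 0.5         # total noise amplitude
    c = 0.3             # input correlation: fraction of shared noise variance


    def simulate(g_inh):
        """Simulate the population; g_inh scales a global inhibitory feedback
        (g_inh = 0 corresponds to a feedforward network without inhibition)."""
        steps = int(T / dt)
        v = np.zeros(N)
        spikes = np.zeros((steps, N), dtype=bool)
        inh = 0.0                               # low-pass filtered population activity
        for t in range(steps):
            shared = rng.normal()               # noise sample shared by all neurons
            private = rng.normal(size=N)        # independent noise per neuron
            noise = sigma * (np.sqrt(c) * shared + np.sqrt(1.0 - c) * private)
            drive = mu - g_inh * inh            # inhibitory feedback reduces the drive
            v += dt / tau_m * (drive - v) + np.sqrt(dt / tau_m) * noise
            spiked = v >= v_th
            v[spiked] = v_reset
            spikes[t] = spiked
            # feedback decays with tau_fb; every spike in the population adds 1/N
            inh += -dt / tau_fb * inh + spiked.sum() / N
        return spikes


    def mean_pairwise_corr(spikes, bin_steps=50):
        """Average pairwise correlation of binned spike counts (noise-correlation proxy)."""
        n_bins = spikes.shape[0] // bin_steps
        binned = spikes[: n_bins * bin_steps].reshape(n_bins, bin_steps, N).sum(axis=1)
        corr = np.corrcoef(binned.T)
        return np.nanmean(corr[~np.eye(N, dtype=bool)])


    for g in (0.0, 0.5, 1.0):
        spikes = simulate(g)
        rate = spikes.sum() / (N * T)           # spikes/s per neuron, crude cost proxy
        print(f"g_inh={g:.1f}  rate={rate:5.1f} Hz  "
              f"mean pairwise corr={mean_pairwise_corr(spikes):.3f}")

In this toy setting, stronger feedback typically reduces both the firing rate and the correlations induced by the shared input; the paper's actual analysis quantifies when this decorrelation is worth the metabolic overhead by maximizing mutual information per cost over the recurrent synaptic strength.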
