Abstract
It is known that the probability of decoding error undergoes a phase transition at an information rate equal to the channel capacity. The corresponding thermodynamic limit requires an infinite coding dimension, making actual decoding practically impossible. In this Rapid Communication we analyze the finite-size effects that arise in limited neural populations. We find that the achievable rate approaches its asymptote in a markedly nonlinear manner as the population size grows. Qualitatively, our findings do not appear to depend on the details of the model.
- Received 31 March 2019
- Revised 10 July 2019
DOI: https://doi.org/10.1103/PhysRevE.100.050401
©2019 American Physical Society