Quantifying Information Conveyed by Large Neuronal Populations.
Neural Comput; 31(6): 1015-1047, 2019 Jun.
Article in En | MEDLINE | ID: mdl-30979352
ABSTRACT
Quantifying mutual information between inputs and outputs of a large neural circuit is an important open problem in both machine learning and neuroscience. However, evaluation of the mutual information is known to be generally intractable for large systems due to the exponential growth in the number of terms that need to be evaluated. Here we show how information contained in the responses of large neural populations can be effectively computed provided the input-output functions of individual neurons can be measured and approximated by a logistic function applied to a potentially nonlinear function of the stimulus. Neural responses in this model can remain sensitive to multiple stimulus components. We show that the mutual information in this model can be effectively approximated as a sum of lower-dimensional conditional mutual information terms. The approximations become exact in the limit of large neural populations and for certain conditions on the distribution of receptive fields across the neural population. We empirically find that these approximations continue to work well even when the conditions on the receptive field distributions are not fulfilled. The computing cost for the proposed methods grows linearly in the dimension of the input and compares favorably with other approximations.
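The decomposition described above builds on the chain rule of mutual information, I(S;R) = Σᵢ I(S; Rᵢ | R₁..Rᵢ₋₁). The sketch below (a toy illustration, not the paper's method: all tuning parameters, the binary stimulus, and the conditional-independence assumption are hypothetical simplifications) verifies this identity by exhaustive enumeration for a three-neuron population with logistic response probabilities; the paper's contribution is making such conditional terms low-dimensional and tractable for large populations and richer stimuli.

```python
import itertools
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical tuning parameters for a toy 3-neuron population.
# The stimulus s is binary here purely for tractability.
W = [2.0, -1.0, 0.5]    # per-neuron weights on the stimulus
B = [-1.0, 0.5, 0.0]    # per-neuron biases
P_S = {0: 0.5, 1: 0.5}  # stimulus prior
N = len(W)

def p_r_given_s(r, s):
    """Probability of binary response pattern r given stimulus s,
    assuming neurons respond independently given the stimulus."""
    p = 1.0
    for ri, w, b in zip(r, W, B):
        q = sigmoid(w * s + b)  # logistic firing probability
        p *= q if ri == 1 else 1.0 - q
    return p

def joint(r, s):
    return P_S[s] * p_r_given_s(r, s)

RESPONSES = list(itertools.product([0, 1], repeat=N))

def mi_prefix(k):
    """I(S; R_1..R_k) in bits, by marginalizing out the remaining neurons."""
    total = 0.0
    for pre in itertools.product([0, 1], repeat=k):
        p_sr = {s: sum(joint(r, s) for r in RESPONSES if r[:k] == pre)
                for s in P_S}
        p_r = sum(p_sr.values())
        for s in P_S:
            if p_sr[s] > 0:
                total += p_sr[s] * math.log2(p_sr[s] / (P_S[s] * p_r))
    return total

# Chain rule: each conditional term I(S; R_i | R_1..R_{i-1}) is a
# difference of prefix informations, so the sum telescopes.
terms = [mi_prefix(k) - mi_prefix(k - 1) for k in range(1, N + 1)]
exact = mi_prefix(N)
print(f"exact I(S;R)   = {exact:.4f} bits")
print(f"chain-rule sum = {sum(terms):.4f} bits")
```

For this tiny population the chain rule is exact by construction; the approximation studied in the article replaces the full conditioning history with lower-dimensional conditioning, which is what keeps the cost linear in the input dimension.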
Full text: 1
Databases: MEDLINE
Main subject: Neural Networks, Computer / Machine Learning / Models, Neurological / Neurons
Limits: Humans
Language: En
Journal: Neural Comput
Journal subject: MEDICAL INFORMATICS
Publication year: 2019
Document type: Article
Country of affiliation: United States