Neural Circuit Inference from Function to Structure.
Real, Esteban; Asari, Hiroki; Gollisch, Tim; Meister, Markus.
Affiliation
  • Real E; Harvard University, Cambridge, MA 02139, USA.
  • Asari H; Harvard University, Cambridge, MA 02139, USA. Electronic address: asari@embl.it.
  • Gollisch T; Department of Ophthalmology, University Medical Center Göttingen, Göttingen 37073, Germany.
  • Meister M; Harvard University, Cambridge, MA 02139, USA. Electronic address: meister@caltech.edu.
Curr Biol ; 27(2): 189-198, 2017 Jan 23.
Article in En | MEDLINE | ID: mdl-28065610
ABSTRACT
Advances in technology are opening new windows on the structural connectivity and functional dynamics of brain circuits. Quantitative frameworks are needed that integrate these data from anatomy and physiology. Here, we present a modeling approach that creates such a link. The goal is to infer the structure of a neural circuit from sparse neural recordings, using partial knowledge of its anatomy as a regularizing constraint. We recorded visual responses from the output neurons of the retina, the ganglion cells. We then generated a systematic sequence of circuit models that represents retinal neurons and connections and fitted them to the experimental data. The optimal models faithfully recapitulated the ganglion cell outputs. More importantly, they made predictions about dynamics and connectivity among unobserved neurons internal to the circuit, and these were subsequently confirmed by experiment. This circuit inference framework promises to facilitate the integration and understanding of big data in neuroscience.
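The abstract describes fitting circuit models to recorded ganglion cell outputs and inferring the connectivity of unobserved internal neurons, with partial anatomical knowledge acting as a regularizing constraint. As an illustrative (not paper-faithful) sketch of this idea, the toy example below simulates a ganglion-like cell driven by two hidden rectified subunits, then recovers the hidden-to-output connection weights by ridge regression, where the L2 penalty stands in for an anatomical prior; all filter shapes, weights, and noise levels are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stimulus: temporal white-noise flicker, T time bins
T = 5000
stim = rng.normal(size=T)

# Hypothetical "ground-truth" circuit: two hidden (bipolar-cell-like)
# units with different temporal filters converge on one output
# (ganglion-cell-like) neuron through rectifying synapses.
t = np.arange(20)
f1 = np.exp(-t / 3.0) * np.sin(t / 2.0)   # biphasic temporal filter
f2 = np.exp(-t / 6.0)                      # monophasic temporal filter
u1 = np.convolve(stim, f1, mode="full")[:T]
u2 = np.convolve(stim, f2, mode="full")[:T]
w_true = np.array([1.0, -0.5])             # hidden-to-output weights
resp = w_true[0] * np.maximum(u1, 0) + w_true[1] * np.maximum(u2, 0)
resp += 0.1 * rng.normal(size=T)           # recording noise

# Inference step: given the hidden units' (assumed known) rectified
# signals, recover the connection weights by ridge regression; the
# L2 penalty plays the role of a regularizing anatomical constraint.
X = np.column_stack([np.maximum(u1, 0), np.maximum(u2, 0)])
lam = 1.0
w_hat = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ resp)
print(w_hat)  # estimates close to w_true
```

In the actual study the internal units' filters and nonlinearities are themselves unknown and are fitted jointly across a systematic sequence of candidate circuit architectures; this sketch only isolates the final, linear inference step.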
Full text: 1 Database: MEDLINE Main subject: Retinal Ganglion Cells / Urodela / Models, Neurological / Neurons Language: En Publication year: 2017 Document type: Article