Neural Encoding of Active Multi-Sensing Enhances Perceptual Decision-Making via a Synergistic Cross-Modal Interaction.
Delis, Ioannis; Ince, Robin A A; Sajda, Paul; Wang, Qi.
  • Delis I; School of Biomedical Sciences, University of Leeds, Leeds, LS2 9JT, United Kingdom i.delis@leeds.ac.uk.
  • Ince RAA; School of Psychology and Neuroscience, University of Glasgow, Glasgow, G12 8QQ, United Kingdom.
  • Sajda P; Department of Biomedical Engineering, Columbia University, New York, New York 10027.
  • Wang Q; Data Science Institute, Columbia University, New York, New York 10027.
J Neurosci; 42(11): 2344-2355, 2022 Mar 16.
Article in English | MEDLINE | ID: mdl-35091504
ABSTRACT
Most perceptual decisions rely on the active acquisition of evidence from the environment involving stimulation from multiple senses. However, our understanding of the neural mechanisms underlying this process is limited. Crucially, it remains elusive how different sensory representations interact in the formation of perceptual decisions. To answer these questions, we used an active sensing paradigm coupled with neuroimaging, multivariate analysis, and computational modeling to probe how the human brain processes multisensory information to make perceptual judgments. Participants of both sexes actively sensed to discriminate two texture stimuli using visual (V) or haptic (H) information or the two sensory cues together (VH). Crucially, information acquisition was under the participants' control, who could choose where to sample information from and for how long on each trial. To understand the neural underpinnings of this process, we first characterized where and when active sensory experience (movement patterns) is encoded in human brain activity (EEG) in the three sensory conditions. Then, to offer a neurocomputational account of active multisensory decision formation, we used these neural representations of active sensing to inform a drift diffusion model of decision-making behavior. This revealed a multisensory enhancement of the neural representation of active sensing, which led to faster and more accurate multisensory decisions. We then dissected the interactions between the V, H, and VH representations using a novel information-theoretic methodology. 
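The drift diffusion model used to link the neural representations of active sensing to decision behavior can be illustrated with a minimal simulation. This is not the authors' actual fitting pipeline; it is a generic sketch in which evidence accumulates noisily toward one of two bounds, and a higher drift rate (as reported for the multisensory condition) yields faster, more accurate decisions. All function names and parameter values here are illustrative assumptions.

```python
import numpy as np

def simulate_ddm(drift, threshold=1.0, noise=1.0, dt=0.001, max_t=5.0, rng=None):
    """Simulate one trial of a two-boundary drift diffusion model.

    Returns (choice, reaction_time): choice is +1 if the upper bound
    is reached, -1 for the lower bound; reaction_time is in seconds.
    """
    rng = np.random.default_rng() if rng is None else rng
    x, t = 0.0, 0.0
    # Euler-Maruyama integration of dx = drift*dt + noise*dW
    while abs(x) < threshold and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return (1 if x >= threshold else -1), t

# Illustrative comparison: a higher drift rate (stronger multisensory
# evidence) should produce more correct (+1) choices on average.
rng = np.random.default_rng(0)
trials_vh = [simulate_ddm(drift=2.0, rng=rng) for _ in range(500)]
trials_v = [simulate_ddm(drift=1.0, rng=rng) for _ in range(500)]
acc_vh = np.mean([c == 1 for c, _ in trials_vh])
acc_v = np.mean([c == 1 for c, _ in trials_v])
```

In this toy comparison, the condition with the larger drift rate reaches the correct bound more often and sooner, which is the qualitative pattern the abstract reports for multisensory (VH) versus unisensory decisions.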
Ultimately, we identified a synergistic neural interaction between the two unisensory (V, H) representations over contralateral somatosensory and motor locations that predicted multisensory (VH) decision-making performance.

SIGNIFICANCE STATEMENT
In real-world settings, perceptual decisions are made during active behaviors, such as crossing the road on a rainy night, and include information from different senses (e.g., car lights, slippery ground). Critically, it remains largely unknown how sensory evidence is combined and translated into perceptual decisions in such active scenarios. Here we address this knowledge gap. First, we show that the simultaneous exploration of information across senses (multi-sensing) enhances the neural encoding of active sensing movements. Second, the neural representation of active sensing modulates the evidence available for decision; and importantly, multi-sensing yields faster evidence accumulation. Finally, we identify a cross-modal interaction in the human brain that correlates with multisensory performance, constituting a putative neural mechanism for forging active multisensory perception.
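The "synergistic interaction" the abstract refers to is an information-theoretic notion: two variables are synergistic about a target when their joint mutual information exceeds the sum of their individual contributions. The sketch below is not the paper's methodology (which uses a more refined decomposition on EEG data); it is a minimal plug-in estimator of interaction information on discrete toy variables, with XOR as the textbook synergistic case. All names here are assumptions for illustration.

```python
import numpy as np
from collections import Counter

def entropy(samples):
    """Plug-in Shannon entropy (bits) of a list of hashable samples."""
    counts = Counter(samples)
    n = len(samples)
    p = np.array([c / n for c in counts.values()])
    return float(-np.sum(p * np.log2(p)))

def mutual_info(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from samples."""
    return entropy(list(x)) + entropy(list(y)) - entropy(list(zip(x, y)))

def interaction_info(x, y, z):
    """I(X,Y;Z) - I(X;Z) - I(Y;Z).

    Positive values indicate net synergy (X and Y together tell us
    more about Z than the sum of their separate contributions);
    negative values indicate net redundancy.
    """
    joint = list(zip(x, y))
    return mutual_info(joint, z) - mutual_info(x, z) - mutual_info(y, z)

# XOR example: each input alone carries ~0 bits about the output,
# but jointly they determine it completely, giving ~1 bit of synergy.
rng = np.random.default_rng(1)
x = list(rng.integers(0, 2, 2000))
y = list(rng.integers(0, 2, 2000))
z = [a ^ b for a, b in zip(x, y)]
syn = interaction_info(x, y, z)
```

In the study's setting, X and Y would play the role of the unisensory (V, H) neural representations and Z the multisensory behavior, with a positive interaction indicating that the two representations jointly predict performance beyond their individual contributions.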

Full text: 1 Database: MEDLINE Main subject: Decision Making / Electroencephalography Study type: Prognostic_studies Limits: Female / Humans / Male Language: English Year: 2022 Document type: Article