Distinct Neural Components of Visually Guided Grasping during Planning and Execution.
Klein, Lina K; Maiello, Guido; Stubbs, Kevin; Proklova, Daria; Chen, Juan; Paulun, Vivian C; Culham, Jody C; Fleming, Roland W.
Affiliations
  • Klein LK; Department of Experimental Psychology, Justus Liebig University Giessen, 35390 Giessen, Germany.
  • Maiello G; School of Psychology, University of Southampton, Southampton SO17 1PS, United Kingdom guido_maiello@yahoo.it.
  • Stubbs K; Department of Psychology, University of Western Ontario, London, Ontario N6A 5C2, Canada.
  • Proklova D; Department of Psychology, University of Western Ontario, London, Ontario N6A 5C2, Canada.
  • Chen J; Center for the Study of Applied Psychology, Guangdong Key Laboratory of Mental Health and Cognitive Science, and the School of Psychology, South China Normal University, Guangzhou, 510631, China.
  • Paulun VC; Key Laboratory of Brain, Cognition and Education Sciences, South China Normal University, Guangzhou 510631, China.
  • Culham JC; McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139.
  • Fleming RW; Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139.
J Neurosci; 43(49): 8504-8514, 2023 Dec 06.
Article in En | MEDLINE | ID: mdl-37848285
Selecting suitable grasps on three-dimensional objects is a challenging visuomotor computation, which involves combining information about an object (e.g., its shape, size, and mass) with information about the actor's body (e.g., the optimal grasp aperture and hand posture for comfortable manipulation). Here, we used functional magnetic resonance imaging to investigate brain networks associated with these distinct aspects during grasp planning and execution. Human participants of either sex viewed and then executed preselected grasps on L-shaped objects made of wood and/or brass. By leveraging a computational approach that accurately predicts human grasp locations, we selected grasp points that disentangled the roles of multiple grasp-relevant factors: grasp axis, grasp size, and object mass. Representational Similarity Analysis revealed that grasp axis was encoded along dorsal-stream regions during grasp planning. Grasp size was first encoded in ventral-stream areas during grasp planning, then in premotor regions during grasp execution. Object mass was encoded in ventral-stream and (pre)motor regions only during grasp execution. Premotor regions further encoded visual predictions of grasp comfort, whereas the ventral stream encoded grasp comfort during execution, suggesting its involvement in haptic evaluation. These shifts in neural representations thus capture the sensorimotor transformations that allow humans to grasp objects.

SIGNIFICANCE STATEMENT: Grasping requires integrating object properties with constraints on hand and arm postures. Using a computational approach that accurately predicts human grasp locations by combining such constraints, we selected grasps on objects that disentangled the relative contributions of object mass, grasp size, and grasp axis during grasp planning and execution in a neuroimaging study. Our findings reveal a greater role of dorsal-stream visuomotor areas during grasp planning and, surprisingly, increasing ventral-stream engagement during execution. We propose that during planning, visuomotor representations initially encode grasp axis and size. Perceptual representations of object material properties instead become more relevant as the hand approaches the object and motor programs are refined with estimates of the grip forces required to successfully lift the object.
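The core analysis named in the abstract, Representational Similarity Analysis, compares the pattern of pairwise dissimilarities between conditions in neural data against the dissimilarities predicted by a model (e.g., a grasp-axis model). Below is a minimal, hedged sketch of that logic with toy random data; the condition counts, voxel counts, and model RDM are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal RSA sketch with toy data (NOT the authors' pipeline):
# build representational dissimilarity matrices (RDMs) from
# condition-by-voxel patterns, then rank-correlate their upper triangles.
import numpy as np

def rdm(patterns):
    """RDM: 1 - Pearson correlation between each pair of condition
    patterns (rows = conditions, columns = voxels)."""
    return 1.0 - np.corrcoef(patterns)

def upper(m):
    """Vectorize the upper triangle of an RDM, excluding the diagonal."""
    i, j = np.triu_indices_from(m, k=1)
    return m[i, j]

def spearman(a, b):
    """Spearman rank correlation between two dissimilarity vectors."""
    ra = np.argsort(np.argsort(a)).astype(float)
    rb = np.argsort(np.argsort(b)).astype(float)
    return np.corrcoef(ra, rb)[0, 1]

rng = np.random.default_rng(0)
neural = rng.normal(size=(6, 100))        # 6 conditions x 100 voxels (toy)
model = rdm(rng.normal(size=(6, 100)))    # stand-in for, e.g., a grasp-axis model RDM
fit = spearman(upper(rdm(neural)), upper(model))
print(round(fit, 3))
```

In practice, the neural RDM would be computed per region of interest and per task phase (planning vs. execution), which is how region- and phase-specific encoding of grasp axis, grasp size, and object mass can be distinguished.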

Full text: 1 | Collections: 01-international | Database: MEDLINE | Main subject: Psychomotor Performance / Brain | Limits: Humans | Language: En | Publication year: 2023 | Document type: Article