ABSTRACT
Understanding how the body is represented in motor cortex is key to understanding how the brain controls movement. The precentral gyrus (PCG) has long been thought to contain largely distinct regions for the arm, leg and face (represented by the "motor homunculus"). However, mounting evidence has begun to reveal a more intermixed, interrelated and broadly tuned motor map. Here, we revisit the motor homunculus using recordings from 20 microelectrode arrays that broadly sample the PCG across 8 individuals, creating a comprehensive map of human motor cortex at single-neuron resolution. We found whole-body representations at all sampled points of the PCG, contradicting traditional leg/arm/face boundaries. We also found two speech-preferential areas, previously unaccounted for by the homunculus, with a broadly tuned, orofacial-dominant area between them. Throughout the PCG, movement representations of the four limbs were interlinked, with homologous movements of different limbs (e.g., toe curl and hand close) having correlated representations. Our findings indicate that, while the classic homunculus aligns with each area's preferred body region at a coarse level, at a finer scale the PCG may be better described as a mosaic of functional zones, each with its own whole-body representation.
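As a concrete illustration of the cross-limb comparison summarized above, the sketch below correlates trial-averaged population firing-rate vectors for two homologous movements (e.g., hand close and toe curl). It is a minimal example with synthetic data; the array sizes, variable names, and the use of Pearson correlation are illustrative assumptions, not the study's actual analysis pipeline.

```python
# Hypothetical sketch: compare population-level representations of homologous
# movements by correlating trial-averaged firing-rate vectors across neurons.
# Data are synthetic; shapes and names are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n_neurons = 192  # e.g., units recorded across two 96-channel arrays (assumed)

# Trial-averaged firing rates (Hz) per neuron for each attempted movement.
rates = {
    "hand_close": rng.gamma(2.0, 5.0, n_neurons),
    "toe_curl":   rng.gamma(2.0, 5.0, n_neurons),
    "wrist_flex": rng.gamma(2.0, 5.0, n_neurons),
}

def representation_similarity(a, b):
    """Pearson correlation between two population rate vectors."""
    return np.corrcoef(a, b)[0, 1]

r = representation_similarity(rates["hand_close"], rates["toe_curl"])
print(f"hand close vs. toe curl: r = {r:.2f}")
```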
ABSTRACT
People with paralysis express unmet needs for peer support, leisure activities, and sporting activities. Many within the general population rely on social media and massively multiplayer video games to address these needs. We developed a high-performance finger brain-computer interface (BCI) system allowing continuous control of 3 independent finger groups, including 2D thumb movement (4 DOF in total). The system was tested in a human research participant over sequential trials requiring fingers to reach and hold targets, with an average acquisition rate of 76 targets/minute and a completion time of 1.58 ± 0.06 seconds. Performance compared favorably to previous animal studies, despite a 2-fold increase in the decoded degrees of freedom (DOF). Finger positions were then used for 4-DOF velocity control of a virtual quadcopter, demonstrating functionality over both fixed and random obstacle courses. This approach shows promise for controlling multi-DOF end effectors, such as robotic fingers or digital interfaces for work, entertainment, and socialization.
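To make the final control stage concrete, the sketch below maps four decoded finger-position values (two 1-D finger groups plus a 2-D thumb) to 4-DOF quadcopter velocity commands. The axis assignment, gains, and deadzone are hypothetical choices for illustration, not the parameters or decoder used in the study.

```python
# Minimal sketch, assuming decoded finger positions are normalized to [-1, 1]:
# turn 4 finger DOF into [vx, vy, vz, yaw_rate] velocity commands for a
# virtual quadcopter. Gains and deadzone are assumed values, not the study's.
import numpy as np

GAIN = np.array([1.0, 1.0, 1.0, 1.0])  # per-axis velocity gain (assumed)
DEADZONE = 0.1                          # ignore small drift around rest (assumed)

def fingers_to_quadcopter_velocity(finger_pos):
    """Map 4 decoded finger DOF in [-1, 1] to [vx, vy, vz, yaw_rate]."""
    cmd = np.asarray(finger_pos, dtype=float)
    cmd = np.where(np.abs(cmd) < DEADZONE, 0.0, cmd)  # deadzone around neutral
    return GAIN * np.clip(cmd, -1.0, 1.0)

# Example: thumb pushed forward/right, one finger group half-flexed, rest idle.
print(fingers_to_quadcopter_velocity([0.6, 0.4, -0.5, 0.05]))
```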