Results 1 - 4 of 4
1.
Brain ; 2024 Mar 19.
Article in English | MEDLINE | ID: mdl-38501612

ABSTRACT

The paralysis of the muscles controlling the hand dramatically limits the quality of life of individuals living with spinal cord injury (SCI). Here, with a non-invasive neural interface, we demonstrate that eight motor-complete SCI individuals (C5-C6) are still able to task-modulate, in real time, the activity of populations of spinal motor neurons with residual neural pathways. In all SCI participants tested, we identified groups of motor units under voluntary control that encoded various hand movements. The motor unit discharges were mapped into more than 10 degrees of freedom, ranging from grasping to individual hand-digit flexion and extension. We then mapped the neural dynamics onto a virtual hand controlled in real time. The SCI participants were able to match the cued hand posture by proportionally controlling four degrees of freedom (opening and closing the hand and index flexion/extension). These results demonstrate that wearable muscle sensors provide access to spared motor neurons that are fully under voluntary control in complete cervical SCI individuals. This non-invasive neural interface allows the investigation of motor neuron changes after the injury and has the potential to promote movement restoration when integrated with assistive devices.
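As an illustration of the proportional-control idea described in this abstract, below is a minimal sketch of how decoded motor-unit spike trains could be pooled into a single degree-of-freedom command for a virtual hand. The function name, sampling rate, smoothing window, and normalization scheme are assumptions for illustration; the abstract does not specify the decoding pipeline.

```python
import numpy as np

def mu_rates_to_dof(spike_trains, fs=2048, win_s=0.2):
    """Convert binary motor-unit spike trains (units x samples) into
    smoothed firing rates, then pool them into one proportional
    degree-of-freedom command in [0, 1]."""
    win = int(fs * win_s)
    kernel = np.hanning(win)
    kernel /= kernel.sum()
    # Smoothed instantaneous discharge rate per motor unit (spikes/s).
    rates = np.array([np.convolve(st, kernel, mode="same") * fs
                      for st in spike_trains])
    pooled = rates.mean(axis=0)  # population-average discharge rate
    # Normalize against an assumed per-session calibration maximum.
    return np.clip(pooled / pooled.max(), 0.0, 1.0)

# Example: 3 decoded motor units firing at random over a 2 s window.
rng = np.random.default_rng(0)
spikes = (rng.random((3, 2 * 2048)) < 0.01).astype(float)
command = mu_rates_to_dof(spikes)  # drives one virtual-hand DoF
```

A Hann window is one common choice for rate smoothing offline; a true real-time controller would use a causal filter instead so the command depends only on past spikes.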

2.
Annu Int Conf IEEE Eng Med Biol Soc ; 2022: 4115-4118, 2022 07.
Article in English | MEDLINE | ID: mdl-36085754

ABSTRACT

The human hand possesses a large number of degrees of freedom. Hand dexterity is encoded by the discharge times of spinal motor units (MUs). Most of our knowledge on the neural control of movement is based on the discharge times of MUs during isometric contractions. Here we designed a noninvasive framework to study spinal motor neurons during dynamic hand movements, with the aim of understanding the neural control of MUs during sinusoidal hand-digit flexion and extension at different rates of force development. The framework included 320 high-density surface EMG electrodes placed on the forearm muscles, markerless 3D hand kinematics extracted with deep learning, and a realistic virtual hand that displayed the motor tasks. The movements included flexion and extension of individual hand digits at two different speeds (0.5 Hz and 1.5 Hz) for 40 seconds. We found on average 4.7±1.7 MUs across participants and tasks. Most MUs showed a biphasic pattern closely mirroring the flexion and extension kinematics. Indeed, a factor-analysis method (non-negative matrix factorization) was able to learn the two components (flexion/extension) with high accuracy at the individual-MU level (R = 0.87±0.12). Although most MUs were highly correlated with either flexion or extension movements, a smaller proportion of MUs (7.1% of all MUs) was not task-modulated and appeared to be controlled by a different neural module. This work presents a noninvasive, visually guided framework to study the motor neurons controlling hand movement in human participants during dynamic hand-digit movements.
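The non-negative matrix factorization step named in this abstract can be sketched as follows. The two-component setting and the 0.5 Hz task come from the abstract; the synthetic discharge rates standing in for real MU data are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import NMF

# Synthetic stand-in: smoothed discharge rates (MUs x time) during
# 0.5 Hz sinusoidal flexion/extension over a 40 s trial.
t = np.linspace(0, 40, 4000)
flex = np.clip(np.sin(2 * np.pi * 0.5 * t), 0, None)   # flexion phase
ext = np.clip(-np.sin(2 * np.pi * 0.5 * t), 0, None)   # extension phase
rates = np.vstack([3 * flex + 0.1, 5 * flex + 0.1,
                   4 * ext + 0.1, 2 * ext + 0.1])       # 4 example MUs

# Two non-negative components: one per movement phase.
model = NMF(n_components=2, init="nndsvda", max_iter=500)
W = model.fit_transform(rates)   # MU loadings on each component
H = model.components_            # component time courses (flex/ext)

# Per-MU reconstruction correlation, analogous to the reported R.
recon = W @ H
r = [np.corrcoef(rates[i], recon[i])[0, 1] for i in range(len(rates))]
```

Non-negativity suits discharge rates naturally, which is one reason NMF is a common choice over PCA for this kind of decomposition.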


Subject(s)
Hand, Upper Extremity, Fingers, Humans, Motor Neurons, Movement
3.
Annu Int Conf IEEE Eng Med Biol Soc ; 2022: 702-706, 2022 07.
Article in English | MEDLINE | ID: mdl-36086496

ABSTRACT

Natural control of assistive devices requires continuous positional encoding and decoding of the user's volition. Human movement is encoded by the recruitment and rate coding of spinal motor units. Surface electromyography provides some information on the neural code of movement and is usually decoded into finger joint angles. However, current approaches to mapping the electrical signal into joint angles are unsatisfactory: no method allows precise estimation of joint angles during natural hand movements across the hand's large number of degrees of freedom. We propose a framework to train a neural network from digital cameras and high-density surface electromyography of the extrinsic (forearm and wrist) hand muscles. Furthermore, we show that our 3D convolutional neural network accurately predicted 14 functional flexion/extension joints of the hand. In our experiments (4 subjects; mean age 26±2.12 years), the model predicted individual sinusoidal finger movements at different speeds (0.5 and 1.5 Hz), as well as two- and three-finger pinching and hand opening and closing, covering 14 degrees of freedom of the hand. Our deep learning method achieved a mean absolute error of 2.78±0.28 degrees, with a mean correlation coefficient between predicted and expected joint angles of 0.94, 95% confidence interval (CI) [0.81, 0.98], and simulated real-time inference times below 30 milliseconds. These results demonstrate that our approach can predict the user's volition comparably to digital cameras through a non-invasive wearable neural interface. Clinical relevance: This method establishes a viable interface for both immersive virtual reality medical simulation environments and assistive devices such as exoskeletons and prostheses.
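A minimal sketch of the kind of 3D convolutional network this abstract describes is shown below, assuming the high-density electrodes form a spatial grid (here a hypothetical 16x20 layout) and regressing a short EMG window onto 14 joint angles. The layer sizes, window length, and grid arrangement are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class EMGJointNet(nn.Module):
    """Sketch of a 3D CNN: a window of high-density surface EMG,
    shaped (time x grid-rows x grid-cols), regressed onto 14
    flexion/extension joint angles. Layer sizes are illustrative."""
    def __init__(self, n_joints=14):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=(5, 3, 3), padding=(2, 1, 1)),
            nn.ReLU(),
            nn.MaxPool3d((2, 2, 2)),
            nn.Conv3d(16, 32, kernel_size=(5, 3, 3), padding=(2, 1, 1)),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),   # global pooling -> 32 features
        )
        self.head = nn.Linear(32, n_joints)

    def forward(self, x):              # x: (batch, 1, T, rows, cols)
        return self.head(self.features(x).flatten(1))

# Example: batch of 8 EMG windows, 64 samples over a 16x20 grid.
model = EMGJointNet()
angles = model(torch.randn(8, 1, 64, 16, 20))    # -> (8, 14) angles
loss = nn.L1Loss()(angles, torch.zeros(8, 14))   # MAE, as reported
```

Treating time as the third convolutional axis lets the same kernels capture both the spatial spread of muscle activity across the grid and its temporal evolution, which is the usual motivation for 3D over 2D convolutions on HD-sEMG.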


Subject(s)
Deep Learning, Adult, Electromyography/methods, Fingers/physiology, Hand/physiology, Humans, Movement/physiology, Young Adult
4.
IISE Trans Occup Ergon Hum Factors ; 9(3-4): 186-198, 2021.
Article in English | MEDLINE | ID: mdl-34121625

ABSTRACT

OCCUPATIONAL APPLICATIONS

This contribution provides a framework for modeling user-product interactions in CAD for in-depth ergonomic analysis of product designs using digital human models. The framework aims to be applicable to a wide range of products while remaining suitable for designers - especially those without specialized ergonomic expertise or training in human behavior - by providing an intuitive, standardized, and time-efficient modeling procedure. The framework contains 31 elementary affordances, which describe mechanical dependencies between product geometries and human end effectors and serve as a tool for interaction modeling. Additionally, the paper provides a taxonomy of elementary affordances, which can be used to formalize and abstract the nature of user-product interactions and to describe them as elementary affordances. Furthermore, an implementation of the interaction-modeling framework in a CAD environment is presented, providing an example of how the framework can be used as a computer-aided ergonomics tool.


TECHNICAL ABSTRACT

Background: Digital human models (DHM) have not yet reached their full potential for proactive virtual assessment of ergonomics in engineering and industrial design. Modeling the interaction between user and product is often time-consuming, cumbersome, unstandardized, or insufficiently embedded in the computer-aided engineering environment. Existing interaction-modeling frameworks either address the simulation of occupational processes, are limited to specific use cases, or offer insufficient usability.

Purpose: We present a framework for interaction modeling, its methodological background, and its implementation. The framework aims to enable ergonomic analyses of product designs while remaining suitable for designers who have no specific ergonomic knowledge or training in human behavior.

Methods: To resolve these partly contradictory demands, we use affordances as a tool for interaction modeling. We hypothesize that many interaction concepts in human-technology interaction can be reduced to a relatively small set of elementary affordances. We developed a taxonomy of elementary affordances to deduce elementary affordances from empirical interaction data.

Results: We present the resulting taxonomy and the resulting 31 elementary affordances, which describe mechanical dependencies between product geometries and human end effectors. The identified elementary affordances are implemented as affordance features in a CAD environment (Siemens NX), resulting in an interaction-modeling framework. A brief application example of the framework's functionality is presented.

Conclusions: The introduced framework demonstrates how interaction modeling can be integrated into the computer-aided engineering environment in a comprehensible and straightforward way. The resulting simplicity and accessibility may be one key factor in exploiting the potential of DHM simulation as a computer-aided ergonomics tool in engineering and industrial design.
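To make the notion of an elementary affordance concrete, here is a hypothetical sketch of how one such mechanical dependency between a product geometry and a human end effector might be encoded as a data structure. The paper does not publish its data model, so every field, name, and value below is an illustrative assumption rather than the authors' implementation.

```python
from dataclasses import dataclass
from enum import Enum

class EndEffector(Enum):
    """Human end effectors that can engage product geometry."""
    HAND = "hand"
    FINGER = "finger"
    FOOT = "foot"

@dataclass
class ElementaryAffordance:
    """One mechanical dependency between a product geometry and a
    human end effector, attachable to a CAD feature."""
    name: str                  # e.g. "grasp-cylinder" (hypothetical)
    end_effector: EndEffector  # which body part engages the geometry
    geometry_ref: str          # ID of the CAD geometry it constrains
    dof_constrained: int       # degrees of freedom fixed by contact

# Hypothetical usage: tag a cylindrical handle face as graspable.
handle = ElementaryAffordance("grasp-cylinder", EndEffector.HAND,
                              geometry_ref="FACE_042",
                              dof_constrained=5)
```

Encoding affordances as typed records attached to geometry IDs is one plausible way such features could be exposed inside a CAD system like Siemens NX, letting a DHM posture solver query which contacts a design offers.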


Subject(s)
Computers, Ergonomics, Ergonomics/methods, Humans