Biomimetic learning of hand gestures in a humanoid robot.
Olikkal, Parthan; Pei, Dingyi; Karri, Bharat Kashyap; Satyanarayana, Ashwin; Kakoty, Nayan M; Vinjamuri, Ramana.
Affiliation
  • Olikkal P; Department of Computer Science and Electrical Engineering, Sensorimotor Control Lab, University of Maryland, Baltimore, MD, United States.
  • Pei D; Department of Computer Science and Electrical Engineering, Sensorimotor Control Lab, University of Maryland, Baltimore, MD, United States.
  • Karri BK; Department of Computer Science and Electrical Engineering, Sensorimotor Control Lab, University of Maryland, Baltimore, MD, United States.
  • Satyanarayana A; Department of Computer Systems Technology, City Tech at City University of New York, New York, NY, United States.
  • Kakoty NM; Department of Electronics and Communication Engineering, Tezpur University, Assam, India.
  • Vinjamuri R; Department of Computer Science and Electrical Engineering, Sensorimotor Control Lab, University of Maryland, Baltimore, MD, United States.
Front Hum Neurosci; 18: 1391531, 2024.
Article in English | MEDLINE | ID: mdl-39099602
ABSTRACT
Hand gestures are a natural and intuitive form of communication, and integrating this mode of communication into robotic systems holds significant potential to improve human-robot collaboration. Recent advances in motor neuroscience have focused on replicating human hand movements from synergies, also known as movement primitives. Synergies, the fundamental building blocks of movement, are a strategy the central nervous system may adopt to generate and control movements. Identifying how synergies contribute to movement can aid the dexterous control of robots, exoskeletons, and prosthetics, and extend their applications to rehabilitation. In this paper, 33 static hand gestures were recorded through a single RGB camera and identified in real time through the MediaPipe framework as participants made various postures with their dominant hand. Assuming an open palm as the initial posture, uniform joint angular velocities were obtained for all of these gestures. Kinematic synergies were then extracted from these joint angular velocities by applying a dimensionality reduction method. The kinematic synergies explaining 98% of the variance of the movements were used to reconstruct new hand gestures via convex optimization. The reconstructed hand gestures and selected kinematic synergies were translated onto a humanoid robot, Mitra, in real time as the participants demonstrated various hand gestures. The results showed that only a few kinematic synergies are needed to generate various hand gestures, with 95.7% accuracy. Furthermore, utilizing low-dimensional synergies to control high-dimensional end effectors holds promise for enabling near-natural human-robot collaboration.
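
The pipeline outlined in the abstract (MediaPipe landmark detection, joint angular velocities, dimensionality reduction into kinematic synergies explaining 98% of variance, and gesture reconstruction) can be illustrated with a short sketch. The Python code below is not the authors' implementation: scikit-learn's PCA stands in for the paper's dimensionality reduction, ordinary least squares stands in for its convex optimization, and the joint-angle dimensionality, data shapes, and function names are hypothetical.

# Illustrative sketch only: MediaPipe landmark extraction, PCA-based
# kinematic synergies, and a simple least-squares gesture reconstruction.
# Data shapes and the reconstruction step are simplifying assumptions.
import numpy as np
import mediapipe as mp
from sklearn.decomposition import PCA

mp_hands = mp.solutions.hands

def hand_landmarks(rgb_frame):
    """Return a (21, 3) array of hand landmark coordinates, or None."""
    with mp_hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
        result = hands.process(rgb_frame)
    if not result.multi_hand_landmarks:
        return None
    lm = result.multi_hand_landmarks[0].landmark
    return np.array([[p.x, p.y, p.z] for p in lm])

def extract_synergies(joint_velocities, variance=0.98):
    """Fit PCA on joint angular velocities (n_gestures x n_joints).

    The principal components ("kinematic synergies") retained are the
    fewest that explain the requested fraction of movement variance.
    """
    pca = PCA(n_components=variance)
    pca.fit(joint_velocities)
    return pca

def reconstruct_gesture(pca, target_velocity):
    """Reconstruct one gesture's joint velocities from few synergies.

    Ordinary least squares stands in here for the convex optimization
    described in the paper.
    """
    S = pca.components_.T                      # n_joints x n_synergies
    centered = target_velocity - pca.mean_
    weights, *_ = np.linalg.lstsq(S, centered, rcond=None)
    return pca.mean_ + S @ weights, weights

if __name__ == "__main__":
    # Hypothetical data: joint angular velocities for 33 static gestures,
    # 20 joint angles each (the actual dimensionality may differ).
    rng = np.random.default_rng(0)
    velocities = rng.standard_normal((33, 20))
    pca = extract_synergies(velocities)
    recon, w = reconstruct_gesture(pca, velocities[0])
    print(f"{pca.n_components_} synergies, reconstruction error "
          f"{np.linalg.norm(recon - velocities[0]):.3f}")

In a live setting, hand_landmarks would be called on each camera frame, the landmark trajectories converted to joint angular velocities, and the low-dimensional synergy weights streamed to the robot's hand controller.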
Full text: 1 Collection: 01-international Database: MEDLINE Language: En Journal: Front Hum Neurosci Year: 2024 Document type: Article Affiliation country: United States Country of publication: Switzerland