American Sign Language Translation Using Wearable Inertial and Electromyography Sensors for Tracking Hand Movements and Facial Expressions.
Gu, Yutong; Zheng, Chao; Todoh, Masahiro; Zha, Fusheng.
Affiliation
  • Gu Y; Graduate School of Engineering, Hokkaido University, Sapporo, Japan.
  • Zheng C; Wuhan Second Ship Design and Research Institute, China State Shipbuilding Corporation Limited, Wuhan, China.
  • Todoh M; Faculty of Engineering, Hokkaido University, Sapporo, Japan.
  • Zha F; State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin, China.
Front Neurosci; 16: 962141, 2022.
Article in En | MEDLINE | ID: mdl-35937881
ABSTRACT
A sign language translation system can break the communication barrier between hearing-impaired people and others. In this paper, a novel American Sign Language (ASL) translation method based on wearable sensors is proposed. We leveraged inertial sensors to capture signs and surface electromyography (EMG) sensors to detect facial expressions. A convolutional neural network (CNN) was applied to extract features from the input signals. Then, long short-term memory (LSTM) and transformer models were used to achieve end-to-end translation from input signals to text sentences. We evaluated the two models on 40 ASL sentences that strictly follow grammatical rules. Word error rate (WER) and sentence error rate (SER) were used as the evaluation metrics. The LSTM model translated sentences in the test set with a 7.74% WER and a 9.17% SER. The transformer model performed markedly better, achieving a 4.22% WER and a 4.72% SER. These encouraging results indicate that both models are suitable for high-accuracy sign language translation. With more complete motion-capture sensors and facial-expression recognition methods, the sign language translation system has the potential to recognize more sentences.
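To make the described pipeline concrete, the sketch below shows one plausible arrangement of the components named in the abstract: a 1-D CNN extracting features from fused IMU and sEMG channels, followed by an LSTM that models the feature sequence and a linear head that scores output tokens per time step. This is a minimal illustration, not the authors' implementation; the channel count, window length, vocabulary size, and layer sizes are assumptions.

```python
# Hypothetical sketch of a CNN + LSTM sign-translation model (PyTorch).
# Not the paper's code; channel counts and dimensions are illustrative.
import torch
import torch.nn as nn

class SignTranslator(nn.Module):
    def __init__(self, in_channels=14, hidden=256, vocab_size=120):
        super().__init__()
        # CNN extracts local temporal features from raw IMU + sEMG signals.
        self.cnn = nn.Sequential(
            nn.Conv1d(in_channels, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(64, 128, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # Bidirectional LSTM models the sequence of CNN features.
        self.encoder = nn.LSTM(128, hidden, batch_first=True, bidirectional=True)
        # Linear head scores each vocabulary token at every time step.
        self.head = nn.Linear(2 * hidden, vocab_size)

    def forward(self, x):
        # x: (batch, channels, time) windows of raw sensor data
        feats = self.cnn(x)            # (batch, 128, time // 4)
        feats = feats.transpose(1, 2)  # (batch, time // 4, 128)
        out, _ = self.encoder(feats)   # (batch, time // 4, 2 * hidden)
        return self.head(out)          # (batch, time // 4, vocab_size)

# Example: a batch of 2-second windows at an assumed 100 Hz from 14 channels.
model = SignTranslator()
logits = model(torch.randn(8, 14, 200))
print(logits.shape)  # torch.Size([8, 50, 120])
```

The evaluation metrics in the abstract also have standard definitions that are easy to state in code: WER is the word-level edit distance between hypothesis and reference divided by the reference length, and SER is the fraction of sentences containing any error. The functions below follow those standard definitions; they are not taken from the paper.

```python
# Standard WER/SER definitions (not code from the paper).
def wer(ref: str, hyp: str) -> float:
    r, h = ref.split(), hyp.split()
    # Word-level Levenshtein distance via dynamic programming.
    d = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        d[i][0] = i
    for j in range(len(h) + 1):
        d[0][j] = j
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            cost = 0 if r[i - 1] == h[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost)
    return d[len(r)][len(h)] / len(r)

def ser(refs: list[str], hyps: list[str]) -> float:
    # A sentence counts as an error if it differs from its reference at all.
    return sum(r != h for r, h in zip(refs, hyps)) / len(refs)
```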
Keywords

Full text: 1 Collections: 01-international Database: MEDLINE Language: En Journal: Front Neurosci Publication year: 2022 Document type: Article Affiliation country: Japan
