Analyzing text recognition from tactually evoked EEG.
Khasnobish, A; Datta, S; Bose, R; Tibarewala, D N; Konar, A.
Affiliation
  • Khasnobish A; TCS Innovation Labs, New Town, Kolkata, 700156 India.
  • Datta S; TCS Innovation Labs, New Town, Kolkata, 700156 India.
  • Bose R; Electrical Engineering Department, Calcutta Institute of Engineering and Management, 24/1A, Chandi Ghosh Road, Kolkata, 700040 India.
  • Tibarewala DN; School of Bioscience and Engineering, Jadavpur University, Kolkata, 700032 India.
  • Konar A; Department of Electronics and Telecommunication Engineering, Jadavpur University, Kolkata, 700032 India.
Cogn Neurodyn ; 11(6): 501-513, 2017 Dec.
Article in English | MEDLINE | ID: mdl-29147143
ABSTRACT
Tactual exploration of objects produces specific patterns in the human brain, so objects can be recognized by analyzing brain signals recorded during tactile exploration. The present work analyzes EEG signals online to recognize embossed texts through tactual exploration. EEG signals are acquired from the parietal region over the somatosensory cortex of blindfolded healthy subjects while they tactually explore embossed texts, including symbols, numbers, and letters. Classifiers based on the principle of supervised learning are trained on the extracted EEG feature space, comprising three features, namely adaptive autoregressive parameters, Hurst exponents, and power spectral density, to recognize the respective texts. The pre-trained classifiers then classify the EEG data to identify the texts online, and the recognized text is displayed on the computer screen for communication. Online classification of two, four, and six classes of embossed texts is achieved with overall average recognition rates of 76.62, 72.31, and 67.62% respectively, with computational time under 2 s in each case. The maximum information transfer rate and utility of the system over all experiments are 0.7187 and 2.0529 bits/s respectively. This work demonstrates the possibility of classifying 3D letters using tactually evoked EEG. In the future, it may help the BCI community design stimuli for better tactile augmentation, and it opens new directions of research on 3D letters for visually impaired persons. Further, 3D maps can be generated to aid tactual BCI in teleoperation.
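The information transfer rates quoted in the abstract are presumably computed with the standard Wolpaw ITR formula used throughout the BCI literature. As a minimal illustrative sketch (not the authors' code), the bits-per-trial for each reported class count and average accuracy can be computed as follows; the trial duration used below is a hypothetical value based only on the "less than 2 s" computation time mentioned in the abstract:

```python
import math

def itr_bits_per_trial(n_classes: int, accuracy: float) -> float:
    """Wolpaw information transfer rate in bits per selection.

    B = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1))
    """
    n, p = n_classes, accuracy
    if p >= 1.0:
        return math.log2(n)
    return (math.log2(n)
            + p * math.log2(p)
            + (1 - p) * math.log2((1 - p) / (n - 1)))

def itr_bits_per_second(n_classes: int, accuracy: float,
                        trial_seconds: float) -> float:
    """Bits per second given the time taken for one selection."""
    return itr_bits_per_trial(n_classes, accuracy) / trial_seconds

if __name__ == "__main__":
    # Average online accuracies reported in the abstract.
    # trial_seconds = 2.0 is an assumed upper bound, not a reported value.
    for n, acc in [(2, 0.7662), (4, 0.7231), (6, 0.6762)]:
        print(f"{n} classes: {itr_bits_per_trial(n, acc):.4f} bits/trial, "
              f">= {itr_bits_per_second(n, acc, 2.0):.4f} bits/s")
```

Note that chance-level accuracy yields zero bits per trial, and for a fixed accuracy the per-trial ITR grows with the number of classes, which is why the six-class condition can remain informative despite its lower recognition rate.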
Full text: 1 Database: MEDLINE Language: English Journal: Cogn Neurodyn Year: 2017 Document type: Article