Results 1 - 3 of 3

2.
Front Neurosci; 8: 373, 2014.
Article in English | MEDLINE | ID: mdl-25477777

ABSTRACT

For multimodal Human-Computer Interaction (HCI), it is very useful to identify the modalities in which the user is currently processing information. This would enable a system to select complementary output modalities and reduce the user's workload. In this paper, we develop a hybrid Brain-Computer Interface (BCI) that uses Electroencephalography (EEG) and functional Near Infrared Spectroscopy (fNIRS) to discriminate and detect visual and auditory stimulus processing. We describe the experimental setup used to collect our data corpus from 12 subjects. On these data, we performed a cross-validation evaluation and report accuracy for different classification conditions. The results show that the subject-dependent systems achieved a classification accuracy of 97.8% for discriminating visual and auditory perception processes from each other and of up to 94.8% for detecting modality-specific processes independently of other cognitive activity. The same classification conditions could also be discriminated in a subject-independent fashion with accuracies of up to 94.6% and 86.7%, respectively. We also examine the contributions of the two signal types and show that fusing classifiers that use different features significantly increases accuracy.
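
As a rough, illustrative sketch of the kind of pipeline this abstract describes (not the authors' implementation), the Python snippet below runs a subject-dependent cross-validation in which separate EEG and fNIRS classifiers are fused by averaging their class probabilities. The feature matrices are synthetic placeholders, and the linear SVMs, feature dimensions, and equal fusion weights are assumptions made for the example.

# Illustrative sketch only: late fusion of separate EEG and fNIRS classifiers
# under cross-validation, using synthetic feature matrices in place of the
# real preprocessed signals.
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials = 120
X_eeg = rng.normal(size=(n_trials, 64))      # placeholder EEG feature vectors
X_nirs = rng.normal(size=(n_trials, 16))     # placeholder fNIRS feature vectors
y = np.repeat([0, 1], n_trials // 2)         # 0 = visual trial, 1 = auditory trial

skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
accs = []
for train, test in skf.split(X_eeg, y):
    clf_eeg = make_pipeline(StandardScaler(), SVC(kernel="linear", probability=True))
    clf_nirs = make_pipeline(StandardScaler(), SVC(kernel="linear", probability=True))
    clf_eeg.fit(X_eeg[train], y[train])
    clf_nirs.fit(X_nirs[train], y[train])
    # Late fusion: average the class probabilities of the two modality-specific classifiers.
    p = 0.5 * clf_eeg.predict_proba(X_eeg[test]) + 0.5 * clf_nirs.predict_proba(X_nirs[test])
    accs.append(np.mean(p.argmax(axis=1) == y[test]))

print(f"Mean fused cross-validation accuracy: {np.mean(accs):.3f}")

In practice, the placeholder arrays would be replaced by the preprocessed EEG and fNIRS features for one subject, and the fusion weights could be tuned on held-out training data rather than fixed at 0.5.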

3.
Article in Chinese | WPRIM | ID: wpr-597968

ABSTRACT

Objective: To evaluate the effect of visual, auditory, and combined audio-visual stimuli on the electroencephalography (EEG) recognition rate. Methods: In a virtual traffic environment, stimuli conveying traffic information were designed in visual, auditory, and audio-visual fusion forms. When one of the three stimuli appeared, the subjects started or braked the vehicle by imagining right- and left-hand movements. The EEG signals were recorded, and the motor-imagery-related features from channels C3 and C4 were extracted and classified. Results: The best recognition rates for visual, auditory, and fused perception were 100%, 100%, and 83%, respectively, while the average rates were 68.8%, 82.2%, and 76.9%, respectively. Conclusion: The recognition rates are affected by the visual, auditory, and fused stimuli, and marked individual differences exist.
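
For illustration only (this is not the code from the study), the sketch below shows one common way to turn C3/C4 motor-imagery signals like those described above into features and a recognition rate: mu/beta band power per channel, classified with linear discriminant analysis under cross-validation. The sampling rate, trial length, frequency band, and synthetic data are all assumptions.

# Illustrative sketch only: band-power features from channels C3 and C4 and a
# simple LDA classifier for left- vs right-hand motor imagery.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

fs = 250                                   # assumed sampling rate in Hz
n_trials, n_samples = 80, fs * 3           # assumed 3-second trials
rng = np.random.default_rng(1)
eeg = rng.normal(size=(n_trials, 2, n_samples))   # placeholder trials x (C3, C4) x samples
labels = np.repeat([0, 1], n_trials // 2)         # 0 = left hand, 1 = right hand

def band_power(trial, lo=8.0, hi=30.0):
    """Mean spectral power in the mu/beta band (8-30 Hz) for each channel."""
    freqs, psd = welch(trial, fs=fs, nperseg=fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[:, mask].mean(axis=1)

X = np.array([band_power(t) for t in eeg])        # features: [C3 power, C4 power]
scores = cross_val_score(LinearDiscriminantAnalysis(), X, labels, cv=5)
print(f"Mean cross-validation recognition rate: {scores.mean():.2f}")

With real recordings, the placeholder array would be replaced by the epoched C3/C4 signals for each stimulus condition, so that recognition rates under visual, auditory, and fused stimuli could be compared.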
