Results 1 - 4 of 4
1.
J Neural Eng; 20(5), 2023 Sep 28.
Article in English | MEDLINE | ID: mdl-37683664

ABSTRACT

Objective. Motor imagery (MI) is widely used in brain-computer interfaces (BCIs). However, decoding MI-EEG with convolutional neural networks (CNNs) remains challenging due to individual variability. Approach. We propose a fully end-to-end CNN called SincMSNet to address this issue. SincMSNet employs Sinc filters to extract subject-specific frequency-band information and mixed-depth convolution to extract multi-scale temporal information for each band. It then applies a spatial convolutional block to extract spatial features and a temporal log-variance block to obtain classification features. SincMSNet is trained under the joint supervision of cross-entropy and center loss to achieve inter-class separable and intra-class compact representations of EEG signals. Main results. We evaluated SincMSNet on the BCIC-IV-2a (four-class) and OpenBMI (two-class) datasets, where it surpassed the benchmark methods. In the four-class and two-class inter-session analyses it achieves average accuracies of 80.70% and 71.50%, respectively; in the four-class and two-class single-session analyses it achieves 84.69% and 76.99%, respectively. Additionally, visualizations of the band-pass bands learned by the Sinc filters demonstrate the network's ability to extract subject-specific frequency-band information from EEG. Significance. This study highlights the potential of SincMSNet for improving MI-EEG decoding performance and for designing more robust MI-BCIs. The source code for SincMSNet is available at https://github.com/Want2Vanish/SincMSNet.
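The band-pass learning described above follows the Sinc-filter idea from SincNet: each temporal kernel is parameterized only by a learnable low and high cutoff frequency, so the band a filter passes can be read off directly after training, which is what the abstract's visualizations rely on. Below is a minimal PyTorch sketch of that band-pass construction, not the authors' implementation; the sampling rate, kernel length, and normalization are assumptions.

import torch

def sinc_bandpass_kernel(f_low, f_high, kernel_size=63, fs=250.0):
    """Ideal band-pass FIR kernel built as the difference of two sinc low-pass responses.

    f_low, f_high -- cutoff frequencies in Hz (0 < f_low < f_high < fs/2)
    kernel_size   -- odd filter length in samples
    fs            -- sampling rate in Hz (250 Hz is typical for BCIC-IV-2a)
    """
    t = torch.arange(-(kernel_size // 2), kernel_size // 2 + 1, dtype=torch.float32) / fs
    low = 2 * f_low * torch.sinc(2 * f_low * t)      # torch.sinc(x) = sin(pi x) / (pi x)
    high = 2 * f_high * torch.sinc(2 * f_high * t)
    band = (high - low) * torch.hamming_window(kernel_size)  # window to limit spectral leakage
    return band / band.abs().max()                   # simple amplitude normalisation

# example: an 8-30 Hz filter roughly covering the mu and beta bands used in MI decoding
kernel = sinc_bandpass_kernel(torch.tensor(8.0), torch.tensor(30.0))
print(kernel.shape)  # torch.Size([63])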


Subject(s)
Algorithms; Brain-Computer Interfaces; Imagination; Electroencephalography/methods; Neural Networks, Computer; Imagery, Psychotherapy
2.
IEEE Trans Biomed Eng; 70(2): 436-445, 2023 Feb.
Article in English | MEDLINE | ID: mdl-35867371

ABSTRACT

OBJECTIVE: Motor imagery (MI) is a mental process widely utilized as the experimental paradigm for brain-computer interfaces (BCIs) across a broad range of basic science and clinical studies. However, decoding intentions from MI remains challenging due to the inherent complexity of brain patterns relative to the small sample size available for machine learning. APPROACH: This paper proposes an end-to-end Filter-Bank Multiscale Convolutional Neural Network (FBMSNet) for MI classification. A filter bank is first employed to derive a multiview spectral representation of the EEG data. Mixed depthwise convolution is then applied to extract temporal features at multiple scales, followed by spatial filtering to mitigate volume conduction. Finally, with the joint supervision of cross-entropy and center loss, FBMSNet obtains features that maximize interclass dispersion and intraclass compactness. MAIN RESULTS: We compare FBMSNet with several state-of-the-art EEG decoding methods on two MI datasets: the BCI Competition IV 2a dataset and the OpenBMI dataset. FBMSNet significantly outperforms the benchmark methods by achieving 79.17% and 70.05% for four-class and two-class hold-out classification accuracy, respectively. SIGNIFICANCE: These results demonstrate the efficacy of FBMSNet in improving EEG decoding performance toward more robust BCI applications. The FBMSNet source code is available at https://github.com/Want2Vanish/FBMSNet.
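A recurring ingredient here (shared with SincMSNet above) is the joint supervision of cross-entropy and center loss: cross-entropy drives inter-class separation, while the center term pulls each sample's feature toward a learnable centre of its own class. The PyTorch sketch below is a hedged illustration of that joint objective; the feature dimension, batch size, and loss weight are illustrative assumptions, not the paper's values.

import torch
import torch.nn as nn

class CenterLoss(nn.Module):
    """Penalizes the squared distance of each feature to its own class centre."""
    def __init__(self, num_classes, feat_dim):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))  # one centre per class

    def forward(self, features, labels):
        diff = features - self.centers[labels]        # distance to the centre of each sample's class
        return 0.5 * (diff ** 2).sum(dim=1).mean()

# toy usage: 4 MI classes, 64-dimensional penultimate-layer features
ce = nn.CrossEntropyLoss()
center = CenterLoss(num_classes=4, feat_dim=64)
features = torch.randn(16, 64, requires_grad=True)    # stand-in for the network's features
logits = torch.randn(16, 4, requires_grad=True)       # stand-in for the classifier outputs
labels = torch.randint(0, 4, (16,))

lam = 1e-3                                             # centre-loss weight (illustrative)
loss = ce(logits, labels) + lam * center(features, labels)
loss.backward()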


Asunto(s)
Interfaces Cerebro-Computador , Imaginación , Redes Neurales de la Computación , Aprendizaje Automático , Encéfalo , Electroencefalografía/métodos , Algoritmos
3.
IEEE Trans Neural Syst Rehabil Eng; 27(3): 507-513, 2019 Mar.
Article in English | MEDLINE | ID: mdl-30714927

ABSTRACT

The Coma Recovery Scale-Revised (CRS-R) is a behavioral scale commonly used for the clinical evaluation of patients with disorders of consciousness (DOC). However, because DOC patients generally cannot provide stable and reliable behavioral responses to external stimulation, evaluation results based on behavioral scales are not sufficiently accurate. In this paper, we proposed a novel brain-computer interface (BCI) based on 3D stereo audiovisual stimuli to supplement the object recognition evaluation of the CRS-R. During the experiment, subjects were instructed to focus on the target object on the screen while EEG data were recorded and analyzed in real time to determine the object of focus, and the detection result was presented as feedback. Thirteen DOC patients participated in object recognition assessments using both the 3D audiovisual BCI and the CRS-R. None of the patients showed object recognition function in the CRS-R assessment before the BCI experiment. However, six of these patients achieved accuracies significantly higher than the chance level in the BCI-based assessment, indicating successful detection of object recognition function in these six patients using our 3D audiovisual BCI system. These results suggest that the BCI method may provide a more sensitive evaluation of object recognition than the CRS-R and may be used to assist clinical CRS-R assessments of DOC patients.
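The abstract reports accuracies "significantly higher than the chance level". One standard way to test an online selection accuracy against chance is a one-sided binomial test over the assessment trials; the sketch below is only illustrative, since the abstract does not state the trial counts, the chance level, or the statistical procedure actually used.

from scipy.stats import binomtest

n_trials = 40      # assumed number of assessment trials for one patient
n_correct = 28     # assumed number of correctly detected target selections
chance = 0.5       # assumed chance level for a two-alternative selection

result = binomtest(n_correct, n_trials, chance, alternative="greater")
print(f"accuracy = {n_correct / n_trials:.1%}, one-sided p = {result.pvalue:.4f}")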


Subject(s)
Brain-Computer Interfaces; Consciousness Disorders/diagnosis; Imaging, Three-Dimensional; Recognition, Psychology; Acoustic Stimulation; Adolescent; Adult; Aged; Coma/diagnosis; Computer Simulation; Consciousness Disorders/psychology; Electroencephalography; Feedback; Female; Healthy Volunteers; Humans; Male; Middle Aged; Photic Stimulation; Recovery of Function; Young Adult
4.
PLoS One; 6(6): e20801, 2011.
Article in English | MEDLINE | ID: mdl-21750692

ABSTRACT

One of the central questions in cognitive neuroscience is the precise neural representation, or brain pattern, associated with a semantic category. In this study, we explored the influence of audiovisual stimuli on the brain patterns of concepts or semantic categories in a functional magnetic resonance imaging (fMRI) experiment. We used a pattern search method to extract brain patterns corresponding to two semantic categories, "old people" and "young people", elicited by semantically congruent audiovisual, semantically incongruent audiovisual, unimodal visual, and unimodal auditory stimuli belonging to the two categories. We calculated the reproducibility index, which measures the similarity of patterns within the same category, and also decoded the semantic categories from these brain patterns; the decoding accuracy reflects the discriminability of the brain patterns between the two categories. Both the reproducibility index and the decoding accuracy were significantly higher for semantically congruent audiovisual stimuli than for unimodal visual and unimodal auditory stimuli, whereas the semantically incongruent stimuli did not elicit brain patterns with a significantly higher reproducibility index or decoding accuracy. Thus, semantically congruent audiovisual stimuli enhanced the within-class reproducibility and between-class discriminability of brain patterns and facilitated the neural representation of semantic categories or concepts. Furthermore, we analyzed the brain activity in the superior temporal sulcus and middle temporal gyrus (STS/MTG): both the strength of the fMRI signal and the reproducibility index were enhanced by the semantically congruent audiovisual stimuli. Our results support the use of the reproducibility index as a potential tool to supplement fMRI signal amplitude when evaluating multimodal integration.
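As a minimal NumPy sketch of a within-category reproducibility measure, under the assumption that it can be approximated as the mean pairwise Pearson correlation between repeated patterns of the same category (the paper's exact definition may differ):

import numpy as np

def reproducibility_index(patterns):
    """Mean pairwise Pearson correlation across repetitions of one category.

    patterns -- array of shape (n_repetitions, n_voxels)
    """
    corr = np.corrcoef(patterns)                     # repetition-by-repetition correlation matrix
    upper = corr[np.triu_indices_from(corr, k=1)]    # each pair of repetitions counted once
    return upper.mean()

# toy example: 10 repetitions of a 500-voxel pattern sharing a common component
rng = np.random.default_rng(0)
shared = rng.standard_normal(500)
patterns = shared + rng.standard_normal((10, 500))
print(f"reproducibility index ~ {reproducibility_index(patterns):.3f}")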


Subject(s)
Acoustic Stimulation; Brain/physiology; Photic Stimulation; Semantics; Adult; Aging/physiology; Humans; Magnetic Resonance Imaging; Male; Pattern Recognition, Physiological; Reproducibility of Results; Temporal Lobe/physiology