Results 1 - 8 of 8
1.
Biomed Phys Eng Express; 9(5), 2023 Aug 25.
Article in English | MEDLINE | ID: mdl-37591224

ABSTRACT

Objective. In this paper, an around-ear EEG system is investigated as an alternative to conventional scalp-EEG-based systems for classifying human affective states in the arousal-valence domain evoked in response to auditory stimuli.
Approach. EEG recorded from around the ears is compared with EEG collected according to the international 10-20 system in terms of efficacy in an affective state classification task. A wearable device with eight dry EEG channels was designed for ear-EEG acquisition in this study. Twenty-one subjects participated in an experiment consisting of six sessions over three days using both ear- and scalp-EEG acquisition methods. The experimental task consisted of listening to an auditory stimulus and self-reporting the emotion it elicited. Various features were used in tandem with asymmetry methods to evaluate binary classification performance for arousal and valence states using ear-EEG signals in comparison to scalp-EEG.
Main results. A multi-layer extreme learning machine trained on ear-EEG signals in a subject-dependent setting achieves an average accuracy of 67.09% ± 6.14 for arousal and 66.61% ± 6.14 for valence, compared with 68.59% ± 6.26 for arousal and 67.10% ± 4.99 for valence for the scalp-EEG approach. In a subject-independent setting, the ear-EEG approach achieves 63.74% ± 3.84 for arousal and 64.32% ± 6.38 for valence, while the scalp-EEG approach achieves 64.67% ± 6.91 for arousal and 64.86% ± 5.95 for valence. The best results show no significant differences between ear-EEG and scalp-EEG signals for classification of affective states.
Significance. To the best of our knowledge, this paper is the first work to explore the use of around-ear EEG signals for emotion monitoring. Our results demonstrate the potential of around-ear EEG systems for emotion-monitoring setups that are better suited to daily affective life-log applications than conventional scalp-EEG setups.
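The abstract reports results from a multi-layer extreme learning machine, but the exact architecture and feature set are not given here. The following is a minimal single-hidden-layer ELM sketch in NumPy, assuming a precomputed feature matrix X (e.g. band-power or asymmetry features per trial) and binary high/low arousal labels y in {-1, +1}; all names, shapes, and hyperparameters are illustrative, not the authors' implementation.

    import numpy as np

    def train_elm(X, y, n_hidden=128, seed=0):
        """Train a single-hidden-layer extreme learning machine for labels in {-1, +1}."""
        rng = np.random.default_rng(seed)
        # Random, fixed input-to-hidden weights and biases (the defining trait of an ELM).
        W = rng.normal(size=(X.shape[1], n_hidden))
        b = rng.normal(size=n_hidden)
        H = np.tanh(X @ W + b)                    # hidden-layer activations
        beta = np.linalg.pinv(H) @ y              # least-squares output weights
        return W, b, beta

    def predict_elm(X, W, b, beta):
        return np.sign(np.tanh(X @ W + b) @ beta) # +1 / -1 class decisions

    # Illustrative use with random data standing in for extracted EEG features:
    X = np.random.randn(200, 32)                  # 200 trials x 32 hypothetical features
    y = np.sign(np.random.randn(200))             # hypothetical high/low arousal labels
    W, b, beta = train_elm(X, y)
    accuracy = np.mean(predict_elm(X, W, b, beta) == y)

A subject-dependent evaluation of this kind would fit one such model per participant and report mean accuracy over a cross-validation split, which is how per-subject figures like those above are typically obtained.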


Subject(s)
Arousal, Wearable Electronic Devices, Humans, Electroencephalography, Emotions
2.
J Neural Eng; 20(5), 2023 Oct 06.
Article in English | MEDLINE | ID: mdl-37748474

ABSTRACT

Objective. This review paper provides a comprehensive overview of ear-electroencephalogram (EEG) technology, which involves recording EEG signals from electrodes placed in or around the ear, and of its applications in the field of neural engineering.
Approach. We conducted a thorough literature search across multiple databases to identify studies related to ear-EEG technology and its applications. We selected 123 publications and synthesized the information to highlight the main findings and trends in this field.
Main results. Our review highlights the potential of ear-EEG as the future of wearable EEG technology. We discuss the advantages and limitations of ear-EEG compared with traditional scalp-based EEG and methods to overcome those limitations, and we find that ear-EEG is a promising method that produces results comparable to conventional scalp-based methods. We review the development of ear-EEG sensing devices, including their design, sensor types, and materials, as well as the current state of ear-EEG research in application areas such as brain-computer interfaces and clinical monitoring.
Significance. This is the first review paper to focus solely on ear-EEG research articles. As such, it serves as a valuable resource for researchers, clinicians, and engineers working in neural engineering. Our review sheds light on the future prospects of ear-EEG and its potential to advance neural engineering research as a wearable EEG technology.

3.
Article in English | MEDLINE | ID: mdl-37028309

ABSTRACT

Recent advancements in immersive virtual reality head-mounted displays have allowed users to engage more deeply with simulated graphical environments. By stabilizing the screen egocentrically so that users may freely rotate their heads to observe their virtual surroundings, head-mounted displays present virtual scenarios with rich immersion. With this enhanced degree of freedom, immersive virtual reality displays have also been integrated with electroencephalography, which makes it possible to study and utilize brain signals non-invasively. In this review, we introduce recent work that combines immersive head-mounted displays with electroencephalography across various fields, focusing on the purposes and experimental designs of the studies. The paper also highlights the effects of immersive virtual reality revealed through electroencephalogram analysis and discusses existing limitations, current trends, and future research opportunities, which we hope will serve as a useful source of information for further improvement of electroencephalogram-based immersive virtual reality applications.


Subject(s)
Virtual Reality, Humans, Electroencephalography
4.
IEEE Trans Cybern; 52(12): 13212-13224, 2022 Dec.
Article in English | MEDLINE | ID: mdl-34495859

ABSTRACT

Human emotions and behaviors are reciprocal components that shape each other in everyday life. While past research on each element has made use of various physiological sensors, their interactive relationship in the context of daily life has not yet been explored. In this work, we present a wearable affective life-log system (ALIS) that is both robust and easy to use in daily life, accurately detecting emotional changes and determining cause-and-effect relationships between emotions and the situations that elicit them. The proposed system records how a user feels in certain situations during long-term activities using physiological sensors. Based on this long-term monitoring, the system analyzes how the contexts of the user's life affect his or her emotional changes and builds causal structures between emotions and observable behaviors in daily situations. Furthermore, we demonstrate that the proposed system enables us to build causal structures that identify individual sources of mental relief suited to negative situations in school life.


Subject(s)
Wearable Electronic Devices, Humans, Female, Male, Emotions/physiology, Learning
5.
Sci Rep; 10(1): 15916, 2020 Sep 28.
Article in English | MEDLINE | ID: mdl-32985534

ABSTRACT

This paper presents a computational framework, called A-Situ, for providing affective labels to real-life situations. We first define an affective situation as a specific arrangement of affective entities relevant to emotion elicitation in a situation. The affective situation is then represented as a set of labels in the valence-arousal emotion space. Based on psychological behaviors in response to a situation, the proposed framework quantifies the expected emotion evoked by interaction with a stimulus event. The accumulated result in a spatiotemporal situation is represented as a polynomial curve, called the affective curve, which bridges the semantic gap between cognitive and affective perception in real-world situations. We show the efficacy of the curve for reliable emotion labeling in real-world experiments concerning (1) a comparison between the results from our system and existing explicit assessments for measuring emotion, (2) physiological distinctiveness across emotional states, and (3) physiological characteristics correlated with the continuous labels. The ability of affective curves to discriminate emotional states is evaluated through subject-dependent classification performance using bicoherence features to represent discrete affective states in the valence-arousal space. Furthermore, electroencephalography-based statistical analysis reveals the physiological correlates of the affective curves.
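The abstract cites bicoherence features without defining them. For orientation, the conventional bicoherence of an EEG signal's spectral estimates over trial segments is shown below; this is the standard definition of the normalized bispectrum, not necessarily the exact normalization used in the paper.

\[
b(f_1, f_2) = \frac{\left| \sum_{k} X_k(f_1)\, X_k(f_2)\, X_k^{*}(f_1 + f_2) \right|}
{\sqrt{\sum_{k} \left| X_k(f_1)\, X_k(f_2) \right|^{2} \, \sum_{k} \left| X_k(f_1 + f_2) \right|^{2}}}
\]

Here X_k(f) is the Fourier transform of the k-th segment and * denotes complex conjugation; b(f_1, f_2) lies between 0 and 1 and measures quadratic phase coupling between frequencies f_1 and f_2, which is why it can serve as a feature that distinguishes affective states beyond what ordinary power spectra capture.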


Subject(s)
Arousal/physiology, Brain/physiology, Emotions/physiology, Psychological Models, Electroencephalography, Humans
6.
IEEE Trans Neural Syst Rehabil Eng; 28(7): 1614-1622, 2020 Jul.
Article in English | MEDLINE | ID: mdl-32634098

ABSTRACT

Visual information plays an essential role in enhancing neural activity during mental practice. Previous research has shown that different visual scenarios used during mental practice involving imagined movement of a specific body part can lead to differences in performance. Many of these scenarios exploit embodiment, one's perception of another entity as part of oneself, to improve the quality of imagined body movement practice. We therefore hypothesized that combining motor imagery practice with action observation through immersive virtual reality headsets, which provide rich immersion and illusion by presenting egocentrically simulated virtual scenarios, would yield significant improvement. To explore the possible synergy between immersive systems and motor imagery, we analyzed participants' electroencephalogram signals as they were presented the same virtual hand movement scenario through two different mediums: an immersive virtual reality headset and a monitor display. Our experimental results provide evidence that the immersive virtual reality headset induced stronger rhythmic patterns with more discriminative spatial features in the brain signals compared to the monitor display. These findings suggest that immersive virtual reality headsets, with the illusion and embodiment they provide, can effectively improve motor imagery training.
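The abstract describes changes in rhythmic patterns without stating the measure used. A common quantification in motor-imagery EEG studies, offered here only as context and not attributed to this paper, is event-related desynchronization/synchronization (ERD/ERS), the relative band-power change during imagery with respect to a reference interval:

\[
\mathrm{ERD}\%(f) = \frac{A(f) - R(f)}{R(f)} \times 100
\]

where A(f) is the mean power in a frequency band (typically mu, 8-13 Hz, or beta, 13-30 Hz) during the imagery period and R(f) is the mean power in a pre-stimulus reference period; negative values indicate desynchronization, the pattern expected over sensorimotor areas during imagined movement.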


Subject(s)
Motor Disorders, Virtual Reality, Brain, Electroencephalography, Humans, Movement
7.
IEEE Trans Neural Syst Rehabil Eng; 23(2): 159-168, 2015 Mar.
Article in English | MEDLINE | ID: mdl-25376041

ABSTRACT

This paper describes a low-cost noninvasive brain-computer interface (BCI) hybridized with eye tracking and discusses its feasibility through a Fitts' law-based quantitative evaluation. Noninvasive BCI has recently received considerable attention, but bringing BCI applications into real life requires user-friendly, easily portable devices. In this work, as an approach toward a real-world BCI, an electroencephalography (EEG)-based BCI combined with eye tracking is investigated; the two interfaces can complement each other to attain improved performance. To emphasize public availability, low-cost devices are intentionally used: a low-cost commercial EEG recording device is integrated with an inexpensive custom-built eye tracker. The developed hybrid interface is evaluated through target pointing and selection experiments, in which eye movement is interpreted as cursor movement and the noninvasive BCI confirms cursor-point selection with two confirmation schemes. Using Fitts' law, the proposed interface is compared with other interface schemes such as a mouse, eye tracking with dwell time, and eye tracking with a keyboard. The proposed hybrid BCI system is also discussed as a practical interface scheme. Although further advancement is required, it has the potential to be practically useful in a natural and intuitive manner.
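Fitts' law is named in the abstract but not spelled out. The widely used Shannon formulation for a pointing task with target distance D and target width W is shown below; the paper's exact variant is not given here, so treat this as the conventional form rather than the authors' definition.

\[
ID = \log_2\!\left(\frac{D}{W} + 1\right), \qquad MT = a + b \cdot ID, \qquad TP = \frac{ID}{MT}
\]

ID is the index of difficulty in bits, MT is the measured pointing-and-selection time, a and b are constants fitted per interface, and throughput TP (bits/s) puts the mouse, dwell-time, keyboard, and BCI confirmation schemes on a common scale for comparison.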


Subject(s)
Brain-Computer Interfaces, Computer Peripherals, Electrocardiography/instrumentation, Eye Movements/physiology, Photography/instrumentation, Pupil/physiology, Equipment Design, Equipment Failure Analysis, Female, Humans, Male, Reproducibility of Results, Sensitivity and Specificity, Systems Integration, Young Adult
8.
Comput Biol Med; 51: 82-92, 2014 Aug.
Article in English | MEDLINE | ID: mdl-24880998

ABSTRACT

We propose a wearable hybrid interface in which eye movements and mental concentration directly control a quadcopter in three-dimensional space. This noninvasive and low-cost interface addresses limitations of previous work by enabling users to complete complex tasks in a constrained environment in which only visual feedback is provided. Combining the two inputs augments the number of control commands, enabling the flying robot to travel in eight different directions within the physical environment. Five human subjects participated in experiments to test the feasibility of the hybrid interface. A front-view camera on the hull of the quadcopter provided the only visual feedback to each remote subject on a laptop display. Based on this visual feedback, the subjects used the interface to navigate along pre-set target locations in the air. Flight performance was evaluated by comparison with a keyboard-based interface. We demonstrate the applicability of the hybrid interface for exploring and interacting with a three-dimensional physical space through a flying robot.
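The abstract states that combining the two inputs yields eight directional commands but does not give the mapping. One plausible scheme, shown purely as an illustration and not as the paper's method, pairs four decoded gaze directions with a binary concentration state to select among eight motion commands; the upstream gaze and concentration decoding is assumed to exist.

    from enum import Enum

    class Gaze(Enum):
        LEFT = 1
        RIGHT = 2
        UP = 3
        DOWN = 4

    # Hypothetical mapping: 4 gaze directions x 2 concentration states -> 8 flight commands.
    COMMANDS = {
        (Gaze.LEFT,  False): "move_left",
        (Gaze.RIGHT, False): "move_right",
        (Gaze.UP,    False): "move_forward",
        (Gaze.DOWN,  False): "move_backward",
        (Gaze.LEFT,  True):  "turn_left",
        (Gaze.RIGHT, True):  "turn_right",
        (Gaze.UP,    True):  "ascend",
        (Gaze.DOWN,  True):  "descend",
    }

    def select_command(gaze: Gaze, concentrating: bool) -> str:
        """Map the current gaze direction and concentration state to a flight command."""
        return COMMANDS[(gaze, concentrating)]

    # Example: looking up while concentrating commands the quadcopter to ascend.
    print(select_command(Gaze.UP, True))  # -> "ascend"

Any concrete system would replace the boolean concentration flag with a thresholded EEG-derived index and smooth the gaze estimates over time before issuing commands.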


Subject(s)
Aircraft, Brain-Computer Interfaces, Electroencephalography, Eye Movements/physiology, Eye Movement Measurements, Humans