Results 1 - 6 of 6
1.
Sci Rep; 9(1): 9415, 2019 Jul 01.
Article in English | MEDLINE | ID: mdl-31263113

ABSTRACT

The ability of music to evoke activity changes in the core brain structures that underlie the experience of emotion suggests that it has the potential to be used in therapies for emotion disorders. A large volume of research has identified a network of sub-cortical brain regions underlying music-induced emotions. Additionally, separate evidence from electroencephalography (EEG) studies suggests that prefrontal asymmetry in the EEG reflects the approach-withdrawal response to music-induced emotion. However, fMRI and EEG measure quite different brain processes, and we do not have a detailed understanding of the functional relationships between them in relation to music-induced emotion. We employ a joint EEG-fMRI paradigm to explore how EEG-based neural correlates of the approach-withdrawal response to music reflect activity changes in the sub-cortical emotional response network. The neural correlates examined are asymmetry in the prefrontal EEG, and the degree of disorder in that asymmetry over time, as measured by entropy. Participants' EEG and fMRI were recorded simultaneously while the participants listened to music that had been specifically generated to target the elicitation of a wide range of affective states. While listening to this music, participants also continuously reported their felt affective states. Here we report on co-variations in the dynamics of these self-reports, the EEG, and the sub-cortical brain activity. We find that a set of sub-cortical brain regions in the emotional response network exhibits activity that significantly relates to prefrontal EEG asymmetry. Specifically, EEG in the prefrontal cortex reflects not only cortical activity, but also changes in activity in the amygdala, posterior temporal cortex, and cerebellum. We also find that, while the magnitude of the asymmetry reflects activity in parts of the limbic and paralimbic systems, the entropy of that asymmetry reflects activity in parts of the autonomic response network such as the auditory cortex. This suggests that asymmetry magnitude reflects affective responses to music, while asymmetry entropy reflects autonomic responses to music. Thus, we demonstrate that it is possible to infer activity in the limbic and paralimbic systems from prefrontal EEG asymmetry. These results show how EEG can be used to measure and monitor changes in the limbic and paralimbic systems. Specifically, they suggest that EEG asymmetry acts as an indicator of sub-cortical changes in activity induced by music. This shows that EEG may be used as a measure of the effectiveness of music therapy to evoke changes in activity in the sub-cortical emotion response network. This is also the first time that the activity of sub-cortical regions, normally considered "invisible" to EEG, has been shown to be characterisable directly from EEG dynamics measured during music listening.
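The two EEG-derived measures named in this abstract, prefrontal asymmetry and the entropy of that asymmetry over time, can be illustrated with a short sketch. This is a minimal sketch, not the authors' pipeline: the F3/F4 channel pair, the 8-13 Hz alpha band, the 2 s windows, and the use of Shannon entropy over a histogram of windowed asymmetry values are all illustrative assumptions, and the signals are synthetic.

```python
import numpy as np
from scipy.signal import welch

def band_power(x, fs, band=(8.0, 13.0)):
    """Average power of signal x within a frequency band, from a Welch PSD."""
    freqs, psd = welch(x, fs=fs, nperseg=min(len(x), 2 * int(fs)))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def prefrontal_asymmetry(left, right, fs, band=(8.0, 13.0)):
    """Asymmetry index: log alpha power (right) minus log alpha power (left)."""
    return np.log(band_power(right, fs, band)) - np.log(band_power(left, fs, band))

def asymmetry_entropy(asym_series, n_bins=16):
    """Shannon entropy (bits) of the distribution of windowed asymmetry values."""
    hist, _ = np.histogram(asym_series, bins=n_bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Toy example: 60 s of synthetic two-channel "EEG" at 256 Hz (F3 left, F4 right).
fs = 256
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
f3 = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
f4 = 1.2 * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

# Windowed asymmetry over time (2 s non-overlapping windows), then its entropy.
win = 2 * fs
asym = [prefrontal_asymmetry(f3[i:i + win], f4[i:i + win], fs)
        for i in range(0, len(t) - win + 1, win)]
print("mean asymmetry:", np.mean(asym))
print("asymmetry entropy (bits):", asymmetry_entropy(np.asarray(asym)))
```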


Subject(s)
Brain/physiology, Music, Acoustic Stimulation, Adult, Brain/diagnostic imaging, Brain Mapping, Electroencephalography, Female, Humans, Magnetic Resonance Imaging, Young Adult
2.
J Neural Eng; 13(4): 046022, 2016 Aug.
Article in English | MEDLINE | ID: mdl-27396478

ABSTRACT

OBJECTIVE: We aim to develop and evaluate an affective brain-computer music interface (aBCMI) for modulating the affective states of its users. APPROACH: An aBCMI is constructed to detect a user's current affective state and attempt to modulate it in order to achieve specific objectives (for example, making the user calmer or happier) by playing music which is generated according to a specific affective target by an algorithmic music composition system and a case-based reasoning system. The system is trained and tested in a longitudinal study on a population of eight healthy participants, with each participant returning for multiple sessions. MAIN RESULTS: The final online aBCMI is able to detect its user's current affective states with classification accuracies of up to 65% (3 class, [Formula: see text]) and modulate its user's affective states significantly above chance level [Formula: see text]. SIGNIFICANCE: Our system represents one of the first demonstrations of an online aBCMI that is able to accurately detect and respond to a user's affective states. Possible applications include use in music therapy and entertainment.
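The closed loop this abstract describes (detect the affective state, consult a case base, generate music toward an affective target) can be sketched schematically. This is a hypothetical sketch, not the authors' implementation: the valence/arousal representation, the nearest-case lookup, and the names AffectiveState, CASE_BASE, classify_affect, nearest_case, and abcmi_step are all placeholders introduced for illustration.

```python
from dataclasses import dataclass

@dataclass
class AffectiveState:
    valence: float  # -1 (negative) .. +1 (positive)
    arousal: float  # -1 (calm) .. +1 (excited)

# Hypothetical case base: observed state -> music parameters that previously
# helped move a user toward the therapeutic target.
CASE_BASE = [
    (AffectiveState(-0.6, 0.7), {"tempo_bpm": 60, "mode": "major"}),
    (AffectiveState(0.2, -0.4), {"tempo_bpm": 90, "mode": "major"}),
    (AffectiveState(-0.3, -0.5), {"tempo_bpm": 110, "mode": "major"}),
]

def classify_affect(eeg_features) -> AffectiveState:
    """Placeholder for the EEG-based affective-state classifier."""
    valence, arousal = eeg_features  # assume features already map to the affect plane
    return AffectiveState(valence, arousal)

def nearest_case(state: AffectiveState) -> dict:
    """Case-based reasoning step: reuse parameters from the most similar past case."""
    best = min(CASE_BASE,
               key=lambda c: (c[0].valence - state.valence) ** 2
                             + (c[0].arousal - state.arousal) ** 2)
    return dict(best[1])  # copy so the stored case is not mutated

def abcmi_step(eeg_features, target: AffectiveState) -> dict:
    """One iteration of the closed loop: sense, decide, act."""
    state = classify_affect(eeg_features)
    params = nearest_case(state)
    # In a real system these parameters would drive the algorithmic composer;
    # here we simply return them together with the target.
    params["target"] = (target.valence, target.arousal)
    return params

print(abcmi_step((-0.5, 0.6), target=AffectiveState(0.5, -0.5)))
```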


Subject(s)
Affect/physiology, Brain-Computer Interfaces/psychology, Music/psychology, Acoustic Stimulation, Adult, Algorithms, Artifacts, Artificial Intelligence, Electroencephalography, Female, Humans, Male, Young Adult
3.
Brain Cogn; 101: 1-11, 2015 Dec.
Article in English | MEDLINE | ID: mdl-26544602

ABSTRACT

It is widely acknowledged that music can communicate and induce a wide range of emotions in the listener. However, music is a highly complex audio signal composed of a wide range of complex time- and frequency-varying components. Additionally, music-induced emotions are known to differ greatly between listeners. Therefore, it is not immediately clear what emotions will be induced in a given individual by a piece of music. We attempt to predict the music-induced emotional response in a listener by measuring the activity in the listener's electroencephalogram (EEG). We combine these measures with acoustic descriptors of the music, an approach that allows us to consider music as a complex set of time-varying acoustic features, independently of any specific music theory. Regression models are found which allow us to predict the music-induced emotions of our participants with a correlation between the actual and predicted responses of up to r = 0.234, p < 0.001. This regression fit suggests that over 20% of the variance of the participants' music-induced emotions can be predicted by their neural activity and the properties of the music. Given the large amount of noise, non-stationarity, and non-linearity in both EEG and music, this is an encouraging result. Additionally, the combination of measures of brain activity and acoustic features describing the music played to our participants allows us to predict music-induced emotions with significantly higher accuracies than either feature type alone (p < 0.01).
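The modelling approach here, regressing continuously reported emotion on a combination of EEG features and acoustic descriptors of the music and evaluating the fit as a correlation between actual and predicted responses, can be sketched as below. This is a minimal sketch with random stand-in data; the ridge regressor, the feature dimensions, and the train/test split are illustrative assumptions, not the reported models.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_windows = 400

# Stand-in feature matrices: EEG band powers and acoustic descriptors per time window.
eeg_features = rng.standard_normal((n_windows, 32))       # e.g. band power per channel
acoustic_features = rng.standard_normal((n_windows, 10))  # e.g. tempo, spectral centroid, RMS
reported_emotion = (0.3 * eeg_features[:, 0] + 0.2 * acoustic_features[:, 0]
                    + rng.standard_normal(n_windows))     # continuous self-report (e.g. valence)

# Combine both feature sets, as in the abstract, and fit a regularised linear model.
X = np.hstack([eeg_features, acoustic_features])
X_tr, X_te, y_tr, y_te = train_test_split(X, reported_emotion, test_size=0.3, random_state=0)
model = Ridge(alpha=1.0).fit(X_tr, y_tr)

# Evaluate as a correlation between actual and predicted responses.
r, p = pearsonr(y_te, model.predict(X_te))
print(f"r = {r:.3f}, p = {p:.3g}")
```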


Subject(s)
Auditory Perception/physiology, Brain/physiology, Emotions/physiology, Music/psychology, Acoustic Stimulation, Adolescent, Adult, Aged, Brain Mapping, Electroencephalography, Female, Humans, Male, Middle Aged, Young Adult
4.
J Neurosci Methods; 242: 65-71, 2015 Mar 15.
Article in English | MEDLINE | ID: mdl-25546485

ABSTRACT

BACKGROUND: The electroencephalogram (EEG) may be described by a large number of different feature types, and automated feature selection methods are needed in order to reliably identify features which correlate with continuous independent variables. NEW METHOD: A method is presented for the automated identification of features that differentiate two or more groups in neurological datasets based upon a spectral decomposition of the feature set. Furthermore, the method is able to identify features that relate to continuous independent variables. RESULTS: The proposed method is first evaluated on synthetic EEG datasets and observed to reliably identify the correct features. The method is then applied to EEG recorded during a music listening task and is observed to automatically identify neural correlates of music tempo changes similar to neural correlates identified in a previous study. Finally, the method is applied to identify neural correlates of music-induced affective states. The identified neural correlates reside primarily over the frontal cortex and are consistent with widely reported neural correlates of emotions. COMPARISON WITH EXISTING METHODS: The proposed method is compared to the state-of-the-art methods of canonical correlation analysis and common spatial patterns in order to identify features differentiating synthetic event-related potentials of different amplitudes, and is observed to exhibit greater performance as the number of unique groups in the dataset increases. CONCLUSIONS: The proposed method is able to identify neural correlates of continuous variables in EEG datasets and is shown to outperform canonical correlation analysis and common spatial patterns.
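The abstract's method rests on a spectral decomposition of the feature set; as a much simpler stand-in for intuition, the sketch below ranks features by their correlation with a continuous independent variable and keeps those exceeding a permutation-based threshold. This is an illustrative baseline on synthetic data, explicitly not the proposed method, and all names and parameters here are assumptions.

```python
import numpy as np
from scipy.stats import pearsonr

def rank_features_by_correlation(features, target, n_perm=200, alpha=0.05, seed=0):
    """Rank features by |Pearson r| with a continuous variable and keep those
    exceeding a permutation-derived significance threshold."""
    rng = np.random.default_rng(seed)
    r = np.array([pearsonr(features[:, j], target)[0] for j in range(features.shape[1])])
    # Null distribution of the maximum |r| under shuffled targets (accounts for
    # selecting over many candidate features).
    null_max = np.array([
        np.max(np.abs([pearsonr(features[:, j], rng.permutation(target))[0]
                       for j in range(features.shape[1])]))
        for _ in range(n_perm)
    ])
    threshold = np.quantile(null_max, 1 - alpha)
    selected = np.flatnonzero(np.abs(r) > threshold)
    return selected[np.argsort(-np.abs(r[selected]))], r

# Toy data: 30 candidate EEG features over 200 trials, two of which track the variable.
rng = np.random.default_rng(2)
tempo = rng.uniform(60, 180, size=200)       # continuous independent variable (e.g. tempo)
features = rng.standard_normal((200, 30))
features[:, 3] += 0.05 * tempo               # inject true correlates
features[:, 17] -= 0.03 * tempo
selected, r = rank_features_by_correlation(features, tempo)
print("selected features:", selected, "with r:", np.round(r[selected], 2))
```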


Subject(s)
Electroencephalography/methods, Pattern Recognition, Automated/methods, Acoustic Stimulation, Auditory Perception/physiology, Brain/physiology, Brain Mapping/methods, Computer Simulation, Emotions/physiology, Evoked Potentials, Humans, Models, Neurological, Music
5.
Neurosci Lett; 573: 52-7, 2014 Jun 24.
Article in English | MEDLINE | ID: mdl-24820541

ABSTRACT

This paper presents an EEG study into the neural correlates of music-induced emotions. We presented participants with a large dataset containing musical pieces in different styles, and asked them to report on their induced emotional responses. We found neural correlates of music-induced emotion in a number of frequencies over the pre-frontal cortex. Additionally, we found a set of patterns of functional connectivity, defined by inter-channel coherence measures, to be significantly different between groups of music-induced emotional responses.
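The functional connectivity measure named here, inter-channel coherence, can be illustrated with a short sketch using scipy.signal.coherence. This is a minimal sketch on synthetic signals; the channel pair, the alpha band, and the segment length are illustrative assumptions rather than the study's analysis settings.

```python
import numpy as np
from scipy.signal import coherence

fs = 256
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(3)

# Two synthetic "channels" sharing a 10 Hz component, to mimic functional coupling.
shared = np.sin(2 * np.pi * 10 * t)
ch_a = shared + 0.8 * rng.standard_normal(t.size)
ch_b = 0.7 * shared + 0.8 * rng.standard_normal(t.size)

# Magnitude-squared coherence between the channels, averaged within 8-13 Hz.
freqs, coh = coherence(ch_a, ch_b, fs=fs, nperseg=2 * fs)
alpha = (freqs >= 8) & (freqs <= 13)
print("mean alpha-band coherence:", coh[alpha].mean())
```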


Subject(s)
Brain/physiology, Emotions, Music, Acoustic Stimulation, Adolescent, Adult, Aged, Brain Mapping, Electroencephalography, Female, Humans, Male, Middle Aged, Young Adult
6.
Article in English | MEDLINE | ID: mdl-25571015

ABSTRACT

The neural mechanisms of music listening and appreciation are not yet completely understood. Based on the apparent relationship between the beats per minute (tempo) of music and the desire to move (for example, foot tapping) induced while listening to that music, it is hypothesised that musical tempo may evoke movement-related activity in the brain. Participants are instructed to listen, without moving, to a large range of musical pieces spanning a range of styles and tempos during an electroencephalogram (EEG) experiment. Event-related desynchronisation (ERD) in the EEG is observed to correlate significantly with the variance of the tempo of the musical stimuli. This suggests that the dynamics of the beat of the music may induce movement-related brain activity in the motor cortex. Furthermore, significant correlations are observed between EEG activity in the alpha band over the motor cortex and the bandpower of the music in the same frequency band over time. This relationship is observed to correlate with the strength of the ERD, suggesting that entrainment of motor cortical activity relates to increased ERD strength.
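Event-related desynchronisation is commonly quantified as the percentage change in band power relative to a baseline period, with negative values indicating a power decrease. The sketch below illustrates that convention on synthetic signals; the alpha band, the baseline definition, and the single-channel setup are assumptions for illustration, not the study's exact procedure.

```python
import numpy as np
from scipy.signal import welch

def band_power(x, fs, band=(8.0, 13.0)):
    """Average alpha-band power from a Welch PSD."""
    freqs, psd = welch(x, fs=fs, nperseg=min(len(x), 2 * int(fs)))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def erd_percent(baseline, listening, fs, band=(8.0, 13.0)):
    """ERD/ERS as percentage power change relative to baseline:
    negative values indicate desynchronisation (ERD), positive values ERS."""
    p_base = band_power(baseline, fs, band)
    p_listen = band_power(listening, fs, band)
    return 100.0 * (p_listen - p_base) / p_base

# Toy example: alpha power over the motor cortex drops during music listening.
fs = 256
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(4)
baseline = 1.0 * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
listening = 0.6 * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
print(f"ERD: {erd_percent(baseline, listening, fs):.1f}%")
```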


Subject(s)
Motor Cortex/physiology, Music, Acoustic Stimulation, Adolescent, Adult, Aged, Alpha Rhythm, Auditory Perception, Female, Humans, Male, Middle Aged, Motor Activity, Time Factors, Young Adult