On the estimate of music appraisal from surface EEG: a dynamic-network approach based on cross-sensor PAC measurements.
J Neural Eng; 18(4). 2021 Jun 02.
Article in English | MEDLINE | ID: mdl-33975291
Objective. The aesthetic evaluation of music is strongly dependent on the listener and reflects manifold brain processes that go well beyond the perception of the incident sound. Being a high-level cognitive reaction, it is difficult to predict merely from the acoustic features of the audio signal, and this poses serious challenges to contemporary music recommendation systems. We attempted to decode music appraisal from brain activity, recorded via wearable EEG, during music listening.

Approach. To comply with the dynamic nature of music stimuli, cross-frequency coupling measurements were employed in a time-evolving manner to capture the evolving interactions between distinct brain rhythms during music listening. The brain response to music was first represented as a continuous flow of functional couplings, referring to both regional and inter-regional brain dynamics, and then modelled as an ensemble of time-varying (sub)networks. Dynamic graph centrality measures were then derived as the final feature-engineering step and, lastly, a support-vector machine was trained to decode the subjective music appraisal. A carefully designed experimental paradigm provided the labeled brain signals.

Main results. Using data from 20 subjects, dynamic programming to tailor the decoder to each subject individually, and cross-validation, we demonstrated highly satisfactory performance (MAE = 0.948, R² = 0.63) that can be attributed mostly to interactions of the left frontal gamma rhythm. In addition, our music-appraisal decoder was also employed on a part of the DEAP dataset with similar success. Finally, even a generic version of the decoder (common to all subjects) was found to perform sufficiently well.

Significance. A novel brain-signal decoding scheme was introduced and validated empirically on suitable experimental data. It requires simple operations and leaves room for real-time implementation. Both the code and the experimental data are publicly available.
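As a concrete illustration of the pipeline outlined in the abstract, the sketch below walks through the same stages on synthetic data: band-pass filtering, cross-sensor phase-amplitude coupling (PAC) estimated over sliding windows, a time-varying network summarized by a nodal-strength centrality, and a support-vector regressor scored with cross-validated MAE. This is a minimal sketch under stated assumptions, not the authors' implementation (which they report to be publicly available): the mean-vector-length PAC estimator, the theta-phase/gamma-amplitude band limits, the 4 s non-overlapping windows, the strength centrality, the RBF-kernel SVR, and the names FS, WIN, pac_mvl and trial_features are all illustrative choices, and random numbers stand in for the labeled EEG trials and appraisal ratings.

# Illustrative sketch only; bands, window length, PAC estimator and regressor
# are assumptions, and random data replaces the real EEG trials and ratings.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

FS = 128          # sampling rate in Hz (assumed)
WIN = 4 * FS      # 4-second non-overlapping analysis window (assumed)

def bandpass(x, lo, hi, fs=FS, order=4):
    """Zero-phase band-pass filter for one EEG channel."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def pac_mvl(phase, amp):
    """Mean-vector-length phase-amplitude coupling estimate."""
    return np.abs(np.mean(amp * np.exp(1j * phase))) / np.mean(amp)

def trial_features(eeg):
    """eeg: (n_channels, n_samples). Returns the mean nodal strength of the
    time-varying cross-sensor PAC network (one feature per channel)."""
    n_ch, n_s = eeg.shape
    # Phase of the low-frequency rhythm and amplitude envelope of the high one.
    theta_phase = np.angle(hilbert(np.array([bandpass(c, 4, 8) for c in eeg]), axis=1))
    gamma_amp = np.abs(hilbert(np.array([bandpass(c, 30, 45) for c in eeg]), axis=1))
    strengths = []
    for s in range(0, n_s - WIN + 1, WIN):            # slide over non-overlapping windows
        W = np.zeros((n_ch, n_ch))                    # PAC adjacency matrix for this window
        for i in range(n_ch):                         # phase-giving sensor
            for j in range(n_ch):                     # amplitude-giving sensor
                if i != j:
                    W[i, j] = pac_mvl(theta_phase[i, s:s + WIN], gamma_amp[j, s:s + WIN])
        strengths.append(W.sum(axis=1))               # nodal strength per window
    return np.mean(strengths, axis=0)                 # collapse the dynamics by averaging

# Toy usage on random data standing in for labeled trials.
rng = np.random.default_rng(0)
n_trials, n_ch, dur = 40, 8, 20 * FS
X = np.array([trial_features(rng.standard_normal((n_ch, dur))) for _ in range(n_trials)])
y = rng.uniform(1, 9, n_trials)                       # placeholder appraisal ratings
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1.0))
scores = cross_val_score(model, X, y, scoring="neg_mean_absolute_error", cv=5)
print("cross-validated MAE:", -scores.mean())

Averaging the per-window nodal strengths is simply the easiest way to collapse the dynamic networks into a fixed-length feature vector; any other summary of the per-window centrality time series (variance, trend, or stacking the windows) could be substituted without changing the rest of the pipeline.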
Database: MEDLINE
Main subject: Music
Study type: Prognostic studies
Limit: Humans
Language: English
Publication year: 2021
Document type: Article