Probing the neural dynamics of musicians' and non-musicians' consonant/dissonant perception: Joint analyses of electrical encephalogram (EEG) and functional magnetic resonance imaging (fMRI).
Jo, Han Shin; Hsieh, Tsung-Hao; Chien, Wei-Che; Shaw, Fu-Zen; Liang, Sheng-Fu; Kung, Chun-Chia.
  • Jo HS; Institute of Medical Informatics, National Cheng Kung University (NCKU), Tainan, 70101, Taiwan.
  • Hsieh TH; Department of Computer Science and Information Engineering, NCKU, Tainan, 70101, Taiwan; Department of Computer Science, Tunghai University, Taichung, 407224, Taiwan.
  • Chien WC; Department of Computer Science and Information Engineering, NCKU, Tainan, 70101, Taiwan.
  • Shaw FZ; Department of Psychology, NCKU, Tainan, 70101, Taiwan; Mind Research and Imaging Center, NCKU, Tainan, 70101, Taiwan.
  • Liang SF; Institute of Medical Informatics, National Cheng Kung University (NCKU), Tainan, 70101, Taiwan; Department of Computer Science and Information Engineering, NCKU, Tainan, 70101, Taiwan.
  • Kung CC; Department of Psychology, NCKU, Tainan, 70101, Taiwan; Mind Research and Imaging Center, NCKU, Tainan, 70101, Taiwan. Electronic address: cckung@kunglab-nckupsy.org.
Neuroimage; 298: 120784, 2024 Aug 13.
Article in English | MEDLINE | ID: mdl-39147290
ABSTRACT
The perception of two (or more) simultaneous musical notes, depending on their pitch interval(s), can be broadly categorized as consonant or dissonant. Previous literature has suggested that musicians and non-musicians adopt different strategies when discerning musical intervals: musicians rely on the frequency ratios between the two fundamental frequencies, such as the "perfect fifth" (3:2) as a consonant and the "tritone" (45:32) as a dissonant interval, whereas non-musicians may rely on the presence of 'roughness' or 'beats', generated by the difference between the fundamental frequencies, as the key element of 'dissonance'. The separate Event-Related Potential (ERP) differences in N1 and P2 along the midline electrodes provided evidence congruent with such 'separate reliances'. To replicate and extend these findings, in this study we reran the previous experiment and separately collected fMRI data under the same protocol (with sparse-sampling modifications). The behavioral and EEG results largely corresponded to our previous findings. The fMRI results, jointly analyzed with univariate, psychophysiological interaction, and representational similarity analysis (RSA) approaches, further reinforce the involvement of central midline-related brain regions, such as the ventromedial prefrontal cortex and the dorsal anterior cingulate cortex, in consonance/dissonance judgments. The final spatiotemporal searchlight RSA provided convincing evidence that the medial prefrontal cortex, along with the bilateral superior temporal cortex, is the joint locus of the midline N1 effect, and the dorsal anterior cingulate cortex that of the P2 effect (for musicians). Together, these analyses reaffirm that musicians rely more on experience-driven knowledge for consonance/dissonance perception, and also demonstrate the advantages of multiple analyses in mutually constraining the findings from both EEG and fMRI.
Full text: 1 Database: MEDLINE Language: English Year: 2024 Document type: Article