Results 1 - 7 of 7
1.
Philos Trans R Soc Lond B Biol Sci; 379(1895): 20220420, 2024 Jan 29.
Article in English | MEDLINE | ID: mdl-38104601

ABSTRACT

Expectation is crucial for our enjoyment of music, yet the underlying generative mechanisms remain unclear. While sensory models derive predictions based on local acoustic information in the auditory signal, cognitive models assume abstract knowledge of music structure acquired over the long term. To evaluate these two contrasting mechanisms, we compared simulations from four computational models of musical expectancy against subjective expectancy and pleasantness ratings of over 1000 chords sampled from 739 US Billboard pop songs. Bayesian model comparison revealed that listeners' expectancy and pleasantness ratings were predicted by the independent, non-overlapping contributions of cognitive and sensory expectations. Furthermore, cognitive expectations explained more than twice as much variance in listeners' perceived surprise as sensory expectations, suggesting that long-term representations of music structure outweigh short-term sensory-acoustic information in musical expectancy. Our results thus emphasize the distinct, albeit complementary, roles of cognitive and sensory expectations in shaping musical pleasure, and suggest that this expectancy-driven mechanism depends on musical information represented at different levels of abstraction along the neural hierarchy. This article is part of the theme issue 'Art, aesthetics and predictive processing: theoretical and empirical perspectives'.
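
As a rough illustration of the Bayesian model comparison described above, the sketch below (Python; simulated data, assumed column names such as "rating", "sensory_surprise" and "cognitive_surprise"; not the authors' code) compares regression models of chord-wise ratings built from sensory versus cognitive surprise estimates, using BIC differences as an approximation to log Bayes factors.

# Illustrative sketch, not the authors' code: compare regression models of
# chord-wise ratings built from sensory vs. cognitive surprise estimates.
# Column names and the simulated data are assumptions for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000  # roughly the number of rated chords in the study
df = pd.DataFrame({
    "sensory_surprise": rng.normal(size=n),
    "cognitive_surprise": rng.normal(size=n),
})
# Toy ratings in which the cognitive predictor carries more weight
df["rating"] = (0.3 * df["sensory_surprise"]
                + 0.6 * df["cognitive_surprise"]
                + rng.normal(scale=0.5, size=n))

models = {
    "sensory only":   smf.ols("rating ~ sensory_surprise", df).fit(),
    "cognitive only": smf.ols("rating ~ cognitive_surprise", df).fit(),
    "both":           smf.ols("rating ~ sensory_surprise + cognitive_surprise", df).fit(),
}
for name, m in models.items():
    # Lower BIC = stronger relative evidence; BIC differences approximate log Bayes factors.
    print(f"{name:14s} BIC = {m.bic:8.1f}   R^2 = {m.rsquared:.3f}")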


Subject(s)
Music , Pleasure , Auditory Perception , Music/psychology , Motivation , Bayes Theorem , Cognition , Acoustic Stimulation/methods
2.
Brain Struct Funct; 228(1): 273-291, 2023 Jan.
Article in English | MEDLINE | ID: mdl-35476027

ABSTRACT

Semantic knowledge is central to human cognition. The angular gyrus (AG) is widely considered a key brain region for semantic cognition. However, the role of the AG in semantic processing is controversial. Key controversies concern response polarity (activation vs. deactivation) and its relation to task difficulty, lateralization (left vs. right AG), and functional-anatomical subdivision (PGa vs. PGp subregions). Here, we combined the fMRI data of five studies on semantic processing (n = 172) and analyzed the response profiles from the same anatomical regions-of-interest for left and right PGa and PGp. We found that the AG was consistently deactivated during non-semantic conditions, whereas response polarity during semantic conditions was inconsistent. However, the AG consistently showed relative response differences between semantic and non-semantic conditions, and between different semantic conditions. A combined analysis across all studies revealed that AG responses could be best explained by separable effects of task difficulty and semantic processing demand. Task difficulty effects were stronger in PGa than PGp, regardless of hemisphere. Semantic effects were stronger in left than right AG, regardless of subregion. These results suggest that the AG is engaged in both domain-general task-difficulty-related processes and domain-specific semantic processes. In semantic processing, we propose that left AG acts as a "multimodal convergence zone" that binds different semantic features associated with the same concept, enabling efficient access to task-relevant features.
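
A minimal sketch of how such a region-of-interest analysis could be set up, assuming a long-format table of per-subject beta estimates with factors for hemisphere, subregion, task difficulty and semantic demand; the variable names, effect sizes and model formula below are illustrative assumptions, not the published pipeline.

# Illustrative sketch, not the published pipeline: model angular-gyrus ROI betas
# as a function of task difficulty and semantic demand, with subject as a random
# effect. Data layout and effect sizes are assumptions for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for subj in range(172):                      # pooled sample size reported in the abstract
    for hemi in ("left", "right"):
        for roi in ("PGa", "PGp"):
            for difficulty in (0, 1):        # easy vs. hard
                for semantic in (0, 1):      # non-semantic vs. semantic condition
                    beta = (-0.4 * difficulty * (roi == "PGa")   # toy difficulty effect, stronger in PGa
                            + 0.3 * semantic * (hemi == "left")  # toy semantic effect, stronger on the left
                            + rng.normal(scale=0.5))
                    rows.append(dict(subject=subj, hemi=hemi, roi=roi,
                                     difficulty=difficulty, semantic=semantic, beta=beta))
df = pd.DataFrame(rows)

model = smf.mixedlm("beta ~ difficulty * roi + semantic * hemi",
                    df, groups=df["subject"]).fit()
print(model.summary())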


Subject(s)
Brain Mapping , Parietal Lobe , Humans , Brain Mapping/methods , Parietal Lobe/diagnostic imaging , Parietal Lobe/physiology , Cognition/physiology , Semantics , Functional Neuroimaging , Magnetic Resonance Imaging/methods
4.
Sci Rep; 11(1): 10119, 2021 May 12.
Article in English | MEDLINE | ID: mdl-33980876

ABSTRACT

Neurobiological models of emotion traditionally focus on limbic/paralimbic regions as neural substrates of emotion generation, and on insular cortex (in conjunction with isocortical anterior cingulate cortex, ACC) as the neural substrate of feelings. An emerging view, however, highlights the importance of isocortical regions beyond insula and ACC for the subjective feeling of emotions. We used music to evoke feelings of joy and fear, and multivariate pattern analysis (MVPA) to decode representations of feeling states in functional magnetic resonance imaging (fMRI) data from n = 24 participants. Most of the brain regions providing information about feeling representations were neocortical regions. These included, in addition to granular insula and cingulate cortex, primary and secondary somatosensory cortex, premotor cortex, frontal operculum, and auditory cortex. The multivoxel activity patterns corresponding to feeling representations emerged within a few seconds, gained in strength with increasing stimulus duration, and replicated results of a hypothesis-generating decoding analysis from an independent experiment. Our results indicate that several neocortical regions (including insula, cingulate, somatosensory and premotor cortices) are important for the generation and modulation of feeling states. We propose that secondary somatosensory cortex, which covers the parietal operculum and encroaches on the posterior insula, is of particular importance for the encoding of emotion percepts, i.e., preverbal representations of subjective feeling.
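
A minimal MVPA sketch in the spirit of this analysis, assuming a trial-by-voxel pattern matrix, binary joy/fear labels and run labels for cross-validation; the toy data and the linear-SVM classifier are illustrative choices, not the authors' pipeline.

# Minimal MVPA sketch, not the authors' pipeline: leave-one-run-out
# cross-validated linear SVM decoding of joy vs. fear from trial-wise voxel
# patterns. X, y and the run labels are toy data standing in for real fMRI.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(2)
n_trials, n_voxels = 120, 500
X = rng.normal(size=(n_trials, n_voxels))       # trial x voxel activity patterns
y = rng.integers(0, 2, size=n_trials)           # 0 = fear, 1 = joy
runs = np.repeat(np.arange(6), n_trials // 6)   # scanner run of each trial

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, groups=runs, cv=LeaveOneGroupOut())
print(f"decoding accuracy: {scores.mean():.2f} (chance = 0.50)")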


Subject(s)
Emotions/physiology , Evoked Potentials, Auditory , Gyrus Cinguli/physiology , Music , Neocortex/physiology , Somatosensory Cortex/physiology , Acoustic Stimulation , Adult , Brain Mapping , Female , Humans , Magnetic Resonance Imaging , Male , Young Adult
5.
Neuroimage; 219: 117041, 2020 Oct 1.
Article in English | MEDLINE | ID: mdl-32534127

ABSTRACT

Conceptual knowledge is central to human cognition. The left posterior inferior parietal lobe (pIPL) is implicated by neuroimaging studies as a multimodal hub representing conceptual knowledge related to various perceptual-motor modalities. However, the causal role of left pIPL in conceptual processing remains unclear. Here, we transiently disrupted left pIPL function with transcranial magnetic stimulation (TMS) to probe its causal relevance for the retrieval of action and sound knowledge. We compared effective TMS over left pIPL with sham TMS, while healthy participants performed three different tasks (lexical decision, action judgment, and sound judgment) on words with a high or low association to actions and sounds. We found that pIPL-TMS selectively impaired action judgments on words with low sound and low action associations. For the first time, we directly related computational simulations of the TMS-induced electric field to behavioral performance, which revealed that stronger stimulation of left pIPL is associated with worse performance for action but not sound judgments. These results indicate that left pIPL causally supports conceptual processing when action knowledge is task-relevant and cannot be compensated by sound knowledge. Our findings suggest that left pIPL is specialized for the retrieval of action knowledge, challenging the view of left pIPL as a multimodal conceptual hub.
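
The reported field-behaviour relation could, in principle, be probed with a per-participant correlation like the sketch below; the variable names, sample size and toy values are assumptions, and this is not the published analysis.

# Hedged sketch with hypothetical variable names, not the published analysis:
# correlate per-participant simulated electric-field strength in left pIPL with
# task accuracy, separately for action and sound judgments.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 24                                              # hypothetical sample size
efield = rng.uniform(60, 120, size=n)               # simulated field strength (toy values, V/m)
acc_action = 0.95 - 0.002 * efield + rng.normal(scale=0.02, size=n)  # toy: worse with stronger field
acc_sound = 0.93 + rng.normal(scale=0.02, size=n)                    # toy: unrelated to field strength

for task, acc in (("action judgment", acc_action), ("sound judgment", acc_sound)):
    r, p = stats.spearmanr(efield, acc)
    print(f"{task}: Spearman r = {r:+.2f}, p = {p:.3f}")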


Subject(s)
Cognition/physiology , Judgment/physiology , Mental Recall/physiology , Parietal Lobe/physiology , Adult , Brain Mapping , Computer Simulation , Female , Humans , Knowledge , Language , Magnetic Resonance Imaging , Male , Neuropsychological Tests , Parietal Lobe/diagnostic imaging , Psychomotor Performance/physiology , Reaction Time/physiology , Transcranial Magnetic Stimulation , Young Adult
6.
Curr Biol; 29(23): 4084-4092.e4, 2019 Dec 2.
Article in English | MEDLINE | ID: mdl-31708393

ABSTRACT

Listening to music often evokes intense emotions [1, 2]. Recent research suggests that musical pleasure comes from positive reward prediction errors, which arise when what is heard proves to be better than expected [3]. Central to this view is the engagement of the nucleus accumbens (a brain region that processes reward expectations) by pleasurable music and surprising musical events [4-8]. However, expectancy violations along multiple musical dimensions (e.g., harmony and melody) have failed to implicate the nucleus accumbens [9-11], and it is unknown how music reward value is assigned [12]. Whether changes in musical expectancy elicit pleasure has thus remained elusive [11]. Here, we demonstrate that pleasure varies nonlinearly as a function of the listener's uncertainty when anticipating a musical event, and the surprise evoked when the event deviates from expectations. Taking Western tonal harmony as a model of musical syntax, we used a machine-learning model [13] to mathematically quantify the uncertainty and surprise of 80,000 chords in US Billboard pop songs. Behaviorally, we found that chords elicited high pleasure ratings when they deviated substantially from what the listener had expected (low uncertainty, high surprise) or, conversely, when they conformed to expectations in an uninformative context (high uncertainty, low surprise). Neurally, fMRI showed that activity in the amygdala, hippocampus, and auditory cortex reflected this interaction, while the nucleus accumbens reflected only uncertainty. These findings challenge current neurocognitive models of music-evoked pleasure and highlight the synergistic interplay between prospective and retrospective states of expectation in the musical experience. VIDEO ABSTRACT.
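
The two key quantities map onto standard information-theoretic measures: surprise is the negative log probability of the chord that actually occurs, and uncertainty is the entropy of the model's predictive distribution before it occurs. The sketch below illustrates both with a toy first-order Markov model of chord transitions, a deliberate simplification of the machine-learning model used in the study.

# Toy sketch of surprise and uncertainty under a first-order Markov model of
# chord transitions (a simplification of the model used in the study).
import numpy as np
from collections import defaultdict

def train_bigram(sequences):
    """Count chord-to-chord transitions and normalise them to probabilities."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for prev, nxt in zip(seq, seq[1:]):
            counts[prev][nxt] += 1
    return {prev: {c: n / sum(d.values()) for c, n in d.items()}
            for prev, d in counts.items()}

def surprise_and_uncertainty(model, context, actual):
    dist = model[context]
    probs = np.array(list(dist.values()))
    uncertainty = float(-np.sum(probs * np.log2(probs)))   # entropy of the prediction
    surprise = float(-np.log2(dist.get(actual, 1e-6)))     # information content of the outcome
    return surprise, uncertainty

# Toy corpus; Roman-numeral chord labels are placeholders
corpus = [["I", "IV", "V", "I"], ["I", "V", "vi", "IV"], ["I", "IV", "I", "V"]]
model = train_bigram(corpus)
print(surprise_and_uncertainty(model, "I", "IV"))   # (surprise, uncertainty) in bits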


Subject(s)
Auditory Perception/physiology , Music , Pleasure , Uncertainty , Adult , Amygdala/physiology , Auditory Cortex/physiology , Female , Hippocampus/physiology , Humans , Male , Nucleus Accumbens/physiology , Young Adult
7.
Sci Rep; 8(1): 3822, 2018 Feb 28.
Article in English | MEDLINE | ID: mdl-29491454

ABSTRACT

Complex auditory sequences known as music have often been described as hierarchically structured. This permits the existence of non-local dependencies, which relate elements of a sequence beyond their immediate temporal order. Previous studies in music have reported differential activity in the inferior frontal gyrus (IFG) when comparing regular and irregular chord transitions based on theories of Western tonal harmony. However, it is unclear whether the observed activity reflects the interpretation of hierarchical structure, as the effects are confounded by local irregularity. Using functional magnetic resonance imaging (fMRI), we found that violations of non-local dependencies in nested sequences of three-tone musical motifs elicited increased activity in the right IFG in musicians. This contrasts with similar studies in language, which typically implicate the left IFG in processing grammatical syntax. Effects of increasing auditory working memory demands were moreover reflected in distributed activity in frontal and parietal regions. Our study therefore demonstrates the role of the right IFG in processing non-local dependencies in music, and suggests that hierarchical processing in different cognitive domains relies on similar mechanisms subserved by domain-selective neuronal subpopulations.
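
A toy illustration of the nested, non-local dependency structure described above: in a well-formed sequence the closing motifs mirror the opening motifs in reverse order, and a violation breaks one of those long-distance pairings. The motif labels and the pairing rule are simplified assumptions for illustration.

# Sketch of a centre-embedded (nested) motif structure with non-local dependencies.
# Labels and the pairing rule are simplified assumptions, not the actual stimuli.
import random

MOTIF_PAIRS = {"a1": "b1", "a2": "b2", "a3": "b3"}  # each opening motif has a matching closer

def make_sequence(depth, violate=False):
    openers = random.choices(list(MOTIF_PAIRS), k=depth)
    closers = [MOTIF_PAIRS[m] for m in reversed(openers)]   # nested, non-local dependency
    if violate:
        # Replace the final closer so one long-distance pairing is broken
        closers[-1] = random.choice([c for c in MOTIF_PAIRS.values() if c != closers[-1]])
    return openers + closers

random.seed(0)
print("well-formed:", make_sequence(2))
print("violation:  ", make_sequence(2, violate=True))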


Subject(s)
Music , Prefrontal Cortex/physiology , Adolescent , Auditory Perception , Female , Humans , Magnetic Resonance Imaging , Male , Memory, Short-Term , Prefrontal Cortex/diagnostic imaging