Results 1 - 7 of 7
1.
Sci Rep ; 12(1): 21447, 2022 12 12.
Article in English | MEDLINE | ID: mdl-36509791

ABSTRACT

Evidence shows that participants performing a continuous visual categorization task respond slower following the presentation of a task-irrelevant sound deviating from an otherwise repetitive or predictable auditory context (deviant sound among standard sounds). Here, for the first time, we explored the role of the environmental context (instrumentalized as a task-irrelevant background picture) in this effect. In two experiments, participants categorized left/right arrows while ignoring irrelevant sounds and background pictures of forest and city scenes. While equiprobable across the task, sounds A and B were presented with probabilities of .882 and .118 in the forest context, respectively, and with the reversed probabilities in the city context. Hence, neither sound constituted a deviant sound at task-level, but each did within a specific context. In Experiment 1, where each environmental context (forest and city scene) consisted of a single picture, participants were significantly slower in the visual task following the presentation of the sound that was unexpected within the current context (context-dependent distraction). Further analysis showed that the cognitive system reset its sensory predictions even on the first trial of a change in environmental context. In Experiment 2, the two contexts (forest and city) were implemented using sets of 32 pictures each, with the background picture changing on every trial. Here too, context-dependent deviance distraction was observed. However, participants took a trial to fully reset their sensory predictions upon a change in context. We conclude that irrelevant sounds are incidentally processed in association with the environmental context (even though these stimuli belong to different sensory modalities) and that sensory predictions are context-dependent.


Subject(s)
Attention , Auditory Perception , Humans , Acoustic Stimulation , Reaction Time , Sound
2.
Psychophysiology ; : e13704, 2020 Oct 14.
Article in English | MEDLINE | ID: mdl-33090526

ABSTRACT

The aim of the present study was to examine the relationship between object categorization in natural scenes and the engagement of cortico-limbic appetitive and defensive systems (emotional engagement) by manipulating both the bottom-up information and the top-down context. Concerning the bottom-up information, we manipulated the computational load by scrambling the phase of the spatial frequency spectrum, and asked participants to classify natural scenes as containing an animal or a person. The role of the top-down context was assessed by comparing an incremental condition, in which pictures were progressively revealed, to a condition in which no probabilistic relationship existed between each stimulus and the following one. In two experiments, the categorization and response to emotional and neutral scenes were similarly modulated by the computational load. The Late Positive Potential (LPP) was affected by the emotional content of the scenes, and by categorization accuracy. When the phase of the spatial frequency spectrum was scrambled by a large amount (>58%), chance categorization resulted, and affective LPP modulation was eliminated. With less degraded scenes, categorization accuracy was higher (.82 in Experiment 1, .86 in Experiment 2) and affective modulation of the LPP was observed at a late window (>800 ms), indicating that it is possible to delay the time of engagement of the motivational systems which are responsible for the LPP affective modulation. The present data strongly support the view that semantic analysis of visual scenes, operationalized here as object categorization, is a necessary condition for emotional engagement at the electrocortical level (LPP).

3.
J Cogn Neurosci ; 32(4): 621-633, 2020 04.
Article in English | MEDLINE | ID: mdl-31765599

ABSTRACT

Emotional stimuli engage corticolimbic circuits and capture attention even when they are task-irrelevant distractors. Whether top-down or contextual factors can modulate the filtering of emotional distractors is a matter of debate. Recent studies have indicated that behavioral interference by emotional distractors habituates rapidly when the same stimuli are repeated across trials. However, little is known as to whether we can attenuate the impact of novel (never repeated) emotional distractors when they occur frequently. In two experiments, we investigated the effects of distractor frequency on the processing of task-irrelevant novel pictures, as reflected in both behavioral interference and neural activity, while participants were engaged in an orientation discrimination task. Experiment 1 showed that, compared with a rare distractor condition (20%), frequent distractors (80%) reduced the interference of emotional stimuli. Moreover, Experiment 2 provided evidence that emotional interference was reduced by distractor frequency even when rare, and unexpected, emotional distractors appeared among frequent neutral distractors. On the other hand, in both experiments, the late positive potential amplitude was enhanced for emotional, compared with neutral, pictures, and this emotional modulation was not reduced when distractors were frequently presented. Altogether, these findings suggest that the high occurrence of task-irrelevant stimuli does not proactively prevent the processing of emotional distractors. Even when attention allocation to novel emotional stimuli is reduced, evaluative processes and the engagement of motivational systems are needed to support the monitoring of the environment for significant events.


Subject(s)
Attention/physiology , Brain/physiology , Emotions/physiology , Motivation/physiology , Pattern Recognition, Visual/physiology , Adult , Discrimination, Psychological , Electroencephalography , Female , Humans , Male , Reaction Time , Young Adult
4.
Anim Cogn ; 22(2): 169-186, 2019 Mar.
Article in English | MEDLINE | ID: mdl-30603932

ABSTRACT

Money is a cultural artefact with a central role in human society. Here, we investigated whether some features of money may be traced back to the exchange habits of nonhuman animals, capitalizing on their ability to flexibly use tokens in different domains. In Experiment 1, we evaluated whether capuchins can recognize token validity. Six subjects were required to exchange with the experimenter valid/familiar tokens, valid/unfamiliar tokens, invalid tokens, and no-value items. They first exchanged a similar number of valid/familiar and valid/unfamiliar tokens, followed by exchanges of invalid tokens and no-value items. Thus, like humans, capuchins readily recognized token validity, regardless of familiarity. In Experiment 2, we further evaluated the flexibility of the token-food association by assessing whether capuchins could engage in reverse food-token exchanges. Subjects spontaneously performed chains of exchanges, in which a food item was exchanged for a token, and then the token was exchanged for another food. However, performance was better as the advantage gained from the exchange increased. Overall, capuchins recognized token validity and successfully engaged in chains of reverse and direct exchanges. This suggests that-although nonhuman animals are far from having fully-fledged monetary systems-for capuchins tokens share at least some features with human money.


Subject(s)
Biological Evolution , Cebus , Cognition , Animals , Food , Humans , Male
5.
J Cogn Neurosci ; 31(1): 109-125, 2019 01.
Article in English | MEDLINE | ID: mdl-30188778

ABSTRACT

Understanding natural scenes involves the contribution of bottom-up analysis and top-down modulatory processes. However, the interaction of these processes during the categorization of natural scenes is not well understood. In the current study, we approached this issue using ERPs and behavioral and computational data. We presented pictures of natural scenes and asked participants to categorize them in response to different questions (Is it an animal/vehicle? Is it indoors/outdoors? Are there one/two foreground elements?). ERPs for target scenes requiring a "yes" response began to differ from those of nontarget scenes at 250 msec from picture onset, and this ERP difference was unmodulated by the categorization questions. Earlier ERPs showed category-specific differences (e.g., between animals and vehicles), which were associated with the processing of scene statistics. From 180 msec after scene onset, these category-specific ERP differences were modulated by the categorization question that was asked. Categorization goals do not modulate only later stages associated with target/nontarget decision but also earlier perceptual stages, which are involved in the processing of scene statistics.


Subject(s)
Brain/physiology , Decision Making/physiology , Goals , Pattern Recognition, Visual/physiology , Adult , Attention/physiology , Electroencephalography , Evoked Potentials , Female , Humans , Machine Learning , Male , Young Adult
6.
Front Psychol ; 6: 1193, 2015.
Article in English | MEDLINE | ID: mdl-26322001

ABSTRACT

Self-control failure has enormous personal and societal consequences. One of the most debated models explaining why self-control breaks down is the Strength Model, according to which self-control depends on a limited resource. Either previous acts of self-control or taking part in highly demanding cognitive tasks have been shown to reduce self-control, possibly due to a reduction in blood glucose levels. However, several studies yielded negative findings, and recent meta-analyses questioned the robustness of the depletion effect in humans. We investigated, for the first time, whether the Strength Model applies to a non-human primate species, the tufted capuchin monkey. We tested five capuchins in a self-control task (the Accumulation task) in which food items were accumulated within the individual's reach for as long as the subject refrained from taking them. We evaluated whether capuchins' performance decreases: (i) when tested before receiving their daily meal rather than after consuming it (Energy Depletion Experiment), and (ii) after being tested in two tasks with different levels of cognitive complexity (Cognitive Depletion Experiment). We also tested, in both experiments, how implementing self-control in each trial of the Accumulation task affected this capacity within each session and/or across consecutive sessions. Repeated acts of self-control in each trial of the Accumulation task progressively reduced this capacity within each session, as predicted by the Strength Model. However, neither experiencing a reduction in energy level nor taking part in a highly demanding cognitive task decreased performance in the subsequent Accumulation task. Thus, whereas capuchins seem to be vulnerable to within-session depletion effects, in other respects our findings are in line with the growing body of studies that failed to find a depletion effect in humans. Methodological issues potentially affecting the lack of depletion effects in capuchins are discussed.

7.
Anim Cogn ; 18(5): 1019-29, 2015 Sep.
Article in English | MEDLINE | ID: mdl-25894673

ABSTRACT

When faced with choices between smaller sooner options and larger later options (i.e. intertemporal choices), both humans and non-human animals discount future rewards. Apparently, only humans consistently show the magnitude effect, according to which larger options are discounted over time at a lower rate than smaller options. Most of the studies carried out in non-human animals led instead to negative results. Here, we tested ten tufted capuchin monkeys (Sapajus spp.) in a delay choice task to evaluate whether they show a magnitude effect when choosing between different quantities of the same food or when the options are represented by high- and low-preferred foods in different conditions. Whereas food quality did not play a role, we provided the first evidence of an effect of the reward amount on temporal preferences in a non-human primate species, a result with potential implications for the validity of comparative studies on the evolution of delay tolerance. In contrast with human results, but as shown in other animal species, capuchins' choice of the larger later option decreased as the amount of the smaller sooner option increased. Capuchins based their temporal preferences on the quantity of the smaller sooner option, rather than on that of the larger later option, probably because in the wild they virtually never have to choose between the above two options at the same time, but they more often encounter them consecutively. Thus, paying attention to the sooner option and deciding on the basis of its features may be an adaptive strategy rather than an irrational response.


Subject(s)
Cebus/psychology , Choice Behavior , Food Quality , Food , Time Factors , Animals , Female , Male , Reward