1.
Clin Neurophysiol ; 123(5): 892-901, 2012 May.
Article in English | MEDLINE | ID: mdl-22055841

ABSTRACT

OBJECTIVE: Binge drinking is a major health concern, but its cerebral correlates remain largely unexplored. We aimed to explore (1) the cognitive stage at which binge-drinking-related deficits appear and (2) the respective influence of global alcohol intake and of the specific binge-drinking consumption pattern on these deficits. METHODS: On the basis of a screening phase (593 students), 80 participants were selected and distributed into four groups (control non-drinkers, daily drinkers, low and high binge drinkers). Event-related potentials (ERPs) were recorded while they performed a simple visual oddball task. RESULTS: Binge drinking was associated with massive ERP impairments, starting at the perceptual level (P100/N100 and N170/P2) and spreading through the attentional (N2b/P3a) and decisional (P3b) stages. Moreover, these deficits were linked with global alcohol intake as well as with the specific binge-drinking consumption pattern. CONCLUSIONS: Binge drinkers presented early and global ERP deficits, affecting both basic and high-level cognitive stages. Moreover, we showed that binge drinking is deleterious for the brain not only because of alcohol consumption per se, but also because of its specific consumption pattern. SIGNIFICANCE: The present results show that binge-drinking habits lead to striking brain consequences, particularly because of the repeated alternation between intense intoxication and withdrawal episodes.


Subject(s)
Alcohol Drinking/pathology, Brain Waves/drug effects, Central Nervous System Depressants/pharmacology, Cerebral Cortex/drug effects, Ethanol/pharmacology, Adult, Alcohol Drinking/physiopathology, Electroencephalography, Evoked Potentials/drug effects, Female, Humans, Male, Psychiatric Status Rating Scales, Reaction Time/drug effects, Young Adult
2.
Alcohol Alcohol ; 42(6): 552-9, 2007.
Article in English | MEDLINE | ID: mdl-17878215

ABSTRACT

AIMS: Chronic alcoholism is classically associated with major deficits in the visual and auditory processing of emotions. However, the crossmodal (auditory-visual) processing of emotional stimuli, which occurs most frequently in everyday life, has not yet been explored. The aim of this study was to explore crossmodal processing in alcoholism, and specifically the auditory-visual facilitation effect. METHODS: Twenty patients suffering from alcoholism and 20 matched healthy controls had to detect the emotion (anger or happiness) displayed by auditory, visual, or auditory-visual stimuli. The stimuli were designed to elicit a facilitation effect (namely, faster reaction times (RTs) for the crossmodal condition than for the unimodal ones). RTs and performance were recorded. RESULTS: While the control subjects showed a significant facilitation effect, the alcoholic individuals did not: their RTs did not differ significantly across modalities. This lack of a facilitation effect is a marker of impaired auditory-visual processing. CONCLUSIONS: Crossmodal processing of complex social stimuli (such as faces and voices) is crucial for interpersonal relations. This first evidence of a crossmodal deficit in alcoholism contributes to explaining the contrast observed between experimental results describing, up to now, mild impairments of emotional facial expression (EFE) recognition in alcoholic subjects (e.g., Oscar-Berman et al., 1990) and the many clinical observations suggesting massive problems.


Subject(s)
Alcoholism/physiopathology, Alcoholism/psychology, Anger/physiology, Happiness, Social Facilitation, Acoustic Stimulation/methods, Adult, Facial Expression, Female, Humans, Male, Middle Aged, Photic Stimulation/methods, Pilot Projects, Reaction Time/physiology
3.
Neurosci Lett ; 369(2): 132-7, 2004 Oct 14.
Article in English | MEDLINE | ID: mdl-15450682

ABSTRACT

Ten healthy volunteers took part in this event-related potential (ERP) study aimed at examining the electrophysiological correlates of crossmodal audio-visual interactions in an identification task. Participants were confronted either with the simultaneous presentation of previously learned faces and voices (audio-visual condition; AV) or with the separate presentation of faces (visual, V) or voices (auditory, A). As expected, an interference effect of audition on vision was observed at the behavioral level, as the bimodal condition was performed more slowly than the visual condition. At the electrophysiological level, the subtraction (AV - (A + V)) gave prominence to three distinct cerebral activities: (1) a central positive/posterior negative wave around 110 ms, (2) a central negative/posterior positive wave around 170 ms, and (3) a central positive wave around 270 ms. These data suggest that crossmodal cerebral interactions could be independent of behavioral facilitation or interference effects. Moreover, the implication of unimodal and multisensory convergence regions in these results, as suggested by a source localization analysis, is discussed.


Subject(s)
Auditory Perception/physiology, Evoked Potentials/physiology, Face, Visual Perception/physiology, Acoustic Stimulation/methods, Adult, Analysis of Variance, Brain Mapping, Electroencephalography, Female, Functional Laterality, Humans, Male, Photic Stimulation/methods, Reaction Time/physiology, Time Factors
4.
Neurosci Lett ; 367(1): 14-8, 2004 Aug 26.
Article in English | MEDLINE | ID: mdl-15308288

ABSTRACT

Pictures from the Ekman and Friesen series were used in an event-related potentials study to define the timing of gender differences in the processing of positive (happy) and negative (fearful) facial expressions. Ten male and 10 female volunteers were confronted with a visual oddball design, in which they had to detect, as quickly as possible, deviant happy or fearful faces amongst a train of standard stimuli (neutral faces). Behavioral results suggest that both men and women detected fearful faces more quickly than happy ones. The main result is that the N2b component, functionally considered an attentional orienting mechanism, was delayed in men for happy stimuli as compared with fearful ones. Gender differences observed in the processing of emotional stimuli may thus originate at the attentional level of the information processing system.


Subject(s)
Emotions/physiology, Evoked Potentials, Visual/physiology, Facial Expression, Sex Characteristics, Adult, Analysis of Variance, Brain Mapping, Case-Control Studies, Electroencephalography/methods, Female, Humans, Male, Photic Stimulation, Reaction Time/physiology
5.
Psychophysiology ; 41(4): 625-35, 2004 Jul.
Article in English | MEDLINE | ID: mdl-15189485

ABSTRACT

An ERP study of 9 healthy participants was carried out to temporally constrain the neural network proposed by Campanella et al. (2001) in a PET study investigating the cerebral areas involved in the retrieval of face-name associations. Three learning sessions served to familiarize the participants with 24 face-name associations grouped into 12 male/female couples. During EEG recording, participants were confronted with four experimental conditions, requiring the retrieval of previously learned couples on the basis of the presentation of name-name (NN), face-face (FF), name-face (NF), or face-name (FN) pairs of stimuli. The main analysis of this experiment consisted of subtracting the nonmixed conditions (NN and FF) from the mixed conditions (NF and FN). It revealed two main ERP components: a negative wave peaking at left parieto-occipital sites around 285 ms and its positive counterpart recorded at left centro-frontal electrodes around 300 ms. Moreover, dipole modeling using three dipoles whose localizations corresponded to the three cerebral areas observed in the PET study (left inferior frontal gyrus, left medial frontal gyrus, left inferior parietal lobe) explained more than 90% of the variance of the results. The complementarity between the anatomical and neurophysiological techniques allowed us to discuss the temporal course of these cerebral activities and to propose an interactive and original anatomo-temporal model of the retrieval of face-name associations.


Subject(s)
Electroencephalography, Memory/physiology, Adult, Evoked Potentials, Face, Female, Humans, Male, Photic Stimulation, Social Perception
6.
Neuroimage ; 14(4): 873-82, 2001 Oct.
Article in English | MEDLINE | ID: mdl-11554806

ABSTRACT

A PET study of seven normal individuals was carried out to investigate the neural populations involved in the retrieval of the visual representation of a face when presented with an associated name, and conversely. Face-name associations were studied by means of four experimental matching conditions, including the retrieval of previously learned (1) name-name (NN), (2) face-face (FF), (3) name-face (NF), and (4) face-name (FN) associations, as well as a resting scan with eyes closed. Before PET image acquisition, subjects were presented with 24 unknown face-name associations, grouped into 12 male/female couples, to encode. During PET scanning, their task was to decide whether the presented pair was a previously learned association. The right fusiform gyrus was strongly activated in the FF condition as compared to the NN and Rest conditions. However, no specific activations were found for the NN condition relative to the FF condition. A network of three areas distributed in the left hemisphere, active in both the (NF-FF) and (FN-NN) comparisons, was interpreted as the locus of the integration of visual face and name representations. These three regions were localized in the inferior frontal gyrus (BA 45), the medial frontal gyrus (BA 6), and the supramarginal gyrus of the inferior parietal lobe (BA 40). An interactive model accounting for these results, with BA 40 seen as an amodal binding region, is proposed.


Subject(s)
Association Learning/physiology, Cerebral Cortex/physiology, Mental Recall/physiology, Pattern Recognition, Visual/physiology, Semantics, Tomography, Emission-Computed, Verbal Learning/physiology, Adult, Brain Mapping, Cerebral Cortex/diagnostic imaging, Dominance, Cerebral/physiology, Face, Humans, Image Processing, Computer-Assisted, Magnetic Resonance Imaging, Male, Regional Blood Flow/physiology