Results 1 - 8 of 8

1.
Biol Psychol ; 190: 108820, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38815896

ABSTRACT

The perception of biological motion is an important social cognitive ability. Models of biological motion perception recognize two processes that contribute to the perception of biological motion: a bottom-up process that binds optic-flow patterns into a coherent percept of biological motion and a top-down process that binds sequences of body-posture 'snapshots' over time into a fluent percept of biological motion. The vast majority of studies on autism and biological motion perception have used point-light figure stimuli, which elicit biological motion perception predominantly via bottom-up processes. Here, we investigated whether autism is associated with deviances in the top-down processing of biological motion. For this, we tested a sample of adults scoring low vs. high on autism traits with a recently validated EEG paradigm in which apparent biological motion is combined with frequency tagging (Cracco et al., 2022) to dissociate between two percepts: 1) the representation of individual body postures, and 2) their temporal integration into movements. In contrast to our hypothesis, we found no evidence for diminished temporal body-posture integration in the high-scoring group. We did, however, find a group difference suggesting that adults scoring high on autism traits have a visual processing style that focuses more on a single percept (i.e., either body postures or movements, contingent on saliency), whereas adults scoring low on autism traits seemed to represent the two percepts included in the paradigm in a more balanced manner. Although unexpected, this finding aligns well with the autism literature on perceptual stability.
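
As an illustration of the frequency-tagging logic used in this kind of paradigm, the sketch below reads out the amplitude of an EEG spectrum at two tagged frequencies, one per percept. The sampling rate, epoch length, tagging frequencies, and simulated signal are placeholder assumptions for illustration, not the parameters of Cracco et al. (2022).

```python
import numpy as np

# Hypothetical parameters (not the values of the original paradigm).
fs = 250.0          # sampling rate in Hz
duration = 60.0     # epoch length in seconds
f_posture = 6.0     # hypothetical frequency tagging individual body postures
f_movement = 1.5    # hypothetical frequency tagging their integration into movement

t = np.arange(0, duration, 1 / fs)
# Simulated signal: two tagged components buried in noise.
eeg = (0.8 * np.sin(2 * np.pi * f_posture * t)
       + 0.3 * np.sin(2 * np.pi * f_movement * t)
       + np.random.randn(t.size))

# Amplitude spectrum; with a 60-s epoch the frequency resolution is 1/60 Hz,
# so both tagged frequencies fall exactly on FFT bins.
spectrum = np.abs(np.fft.rfft(eeg)) * 2 / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

for label, f in [("posture", f_posture), ("movement", f_movement)]:
    bin_idx = int(np.argmin(np.abs(freqs - f)))
    print(f"{label} response at {freqs[bin_idx]:.2f} Hz: {spectrum[bin_idx]:.3f}")
```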


Subject(s)
Autistic Disorder , Electroencephalography , Motion Perception , Humans , Motion Perception/physiology , Male , Female , Adult , Autistic Disorder/physiopathology , Autistic Disorder/psychology , Young Adult , Photic Stimulation/methods , Posture/physiology
2.
Eur J Neurosci ; 60(1): 3557-3571, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38706370

ABSTRACT

Extensive research has shown that observers are able to efficiently extract summary information from groups of people. However, little is known about the cues that determine whether multiple people are represented as a social group or as independent individuals. Initial research on this topic has primarily focused on the role of static cues. Here, we instead investigate the role of dynamic cues. In two experiments with male and female human participants, we use EEG frequency tagging to investigate the influence of two fundamental Gestalt principles - synchrony and common fate - on the grouping of biological movements. In Experiment 1, we find that brain responses coupled to four point-light figures walking together are enhanced when they move in sync vs. out of sync, but only when they are presented upright. In contrast, we find no effect of movement direction (i.e., common fate). In Experiment 2, we rule out the possibility that synchrony took precedence over common fate in Experiment 1 by replicating the null effect of movement direction while keeping synchrony constant. These results suggest that synchrony plays an important role in the processing of biological group movements. In contrast, the role of common fate is less clear and will require further research.


Subject(s)
Electroencephalography , Motion Perception , Humans , Male , Female , Adult , Electroencephalography/methods , Motion Perception/physiology , Young Adult , Cues , Movement/physiology , Brain/physiology , Photic Stimulation/methods
3.
Cortex ; 172: 185-203, 2024 03.
Article in English | MEDLINE | ID: mdl-38354469

ABSTRACT

The specialization of left ventral occipitotemporal brain regions to automatically process word forms develops with reading acquisition and is diminished in children with poor reading skills (PR). Using a fast periodic visual oddball stimulation (FPVS) design during electroencephalography (EEG), we examined the level of sensitivity and familiarity in word form processing in ninety-two children in 2nd and 3rd grade with varying reading skills (n = 35 for PR, n = 40 for typical reading skills; TR). To test children's level of "sensitivity", false font (FF) and consonant string (CS) oddballs were embedded in base presentations of word (W) stimuli. "Familiarity" was examined by presenting letter string oddballs with increasing familiarity (CS, pseudoword - PW, W) in FF base stimuli. Overall, our results revealed stronger left-hemispheric coarse sensitivity effects ("FF in W" > "CS in W") in TR than in PR in both topographic and oddball frequency analyses. Further, children distinguished between orthographically legal and illegal ("W/PW in FF" > "CS in FF") but not yet between lexical and non-lexical ("W in FF" vs "PW in FF") word forms. Although both TR and PR exhibit visual sensitivity and can distinguish between orthographically legal and illegal letter strings, they still struggle with nuanced lexical distinctions. Moreover, the strength of sensitivity is linked to reading proficiency. Our work adds to established knowledge in the field by characterizing the relationship between print tuning and reading skills and suggests differences in the developmental progression toward automatic word form processing.
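
A common way to quantify oddball responses in FPVS designs like this one is the signal-to-noise ratio at the oddball frequency and its harmonics, relative to neighbouring frequency bins. The sketch below illustrates that computation; the oddball frequency, number of harmonics, and bin windows are illustrative assumptions, not the exact parameters of this study.

```python
import numpy as np

def snr_at_harmonics(spectrum, freqs, f_oddball, n_harmonics=4,
                     n_neighbours=10, n_exclude=1):
    """SNR at each oddball harmonic: amplitude at the harmonic divided by the
    mean amplitude of surrounding bins (excluding immediate neighbours).

    The number of harmonics, neighbouring bins, and excluded bins are
    illustrative choices, not those of the study summarized above."""
    snrs = []
    for h in range(1, n_harmonics + 1):
        idx = int(np.argmin(np.abs(freqs - h * f_oddball)))
        lo = np.arange(idx - n_exclude - n_neighbours, idx - n_exclude)
        hi = np.arange(idx + n_exclude + 1, idx + n_exclude + 1 + n_neighbours)
        noise = spectrum[np.concatenate([lo, hi])].mean()
        snrs.append(spectrum[idx] / noise)
    return snrs

# Demo with a simulated spectrum: flat noise plus peaks at a hypothetical
# 1.2 Hz oddball frequency and its harmonics (1/60 Hz resolution, as for a
# 60-s recording).
freqs = np.arange(0, 30, 1 / 60)
spectrum = np.random.rand(freqs.size) * 0.1
f_odd = 1.2
for h in range(1, 5):
    spectrum[int(np.argmin(np.abs(freqs - h * f_odd)))] += 1.0
print(snr_at_harmonics(spectrum, freqs, f_odd))
```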


Subject(s)
Electroencephalography , Reading , Child , Humans , Photic Stimulation , Brain , Brain Mapping , Evoked Potentials/physiology , Pattern Recognition, Visual/physiology
4.
Cortex ; 165: 129-140, 2023 Aug.
Article in English | MEDLINE | ID: mdl-37279640

ABSTRACT

People are often seen among other people, relating to and interacting with one another. Recent studies suggest that socially relevant spatial relations between bodies, such as the face-to-face positioning, or facingness, change the visual representation of those bodies, relative to when the same items appear unrelated (e.g., back-to-back) or in isolation. The current study addresses the hypothesis that face-to-face bodies give rise to a new whole, an integrated representation of individual bodies in a new perceptual unit. Using frequency-tagging EEG, we targeted, as a measure of integration, an EEG correlate of the non-linear combination of the neural responses to each of two individual bodies presented either face-to-face as if interacting, or back-to-back. During EEG recording, participants (N = 32) viewed two bodies, either face-to-face or back-to-back, flickering at two different frequencies (F1 and F2), yielding two distinctive responses in the EEG signal. Spectral analysis examined the responses at the intermodulation frequencies (nF1±mF2), signaling integration of individual responses. An anterior intermodulation response was observed for face-to-face bodies, but not for back-to-back bodies, nor for face-to-face chairs and machines. These results show that interacting bodies are integrated into a representation that is more than the sum of its parts. This effect, specific to body dyads, may mark an early step in the transformation towards an integrated representation of a social event, from the visual representation of individual participants in that event.
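
The intermodulation responses described above arise at combination frequencies of the form nF1 ± mF2. The sketch below simply enumerates those combinations for a pair of tagging frequencies; since the abstract does not report the exact F1 and F2, the values used here are placeholders.

```python
def intermodulation_freqs(f1, f2, max_order=3):
    """Enumerate nF1 ± mF2 components up to a total order n + m <= max_order.

    Responses at these combination frequencies index a nonlinear
    (integrative) interaction between the two tagged inputs."""
    ims = set()
    for n in range(1, max_order + 1):
        for m in range(1, max_order + 1 - n):
            for f in (n * f1 + m * f2, abs(n * f1 - m * f2)):
                if f > 0:
                    ims.add(round(f, 4))
    return sorted(ims)

# Placeholder frequencies for illustration (the abstract does not report F1/F2).
print(intermodulation_freqs(f1=2.5, f2=3.5))
# -> [1.0, 1.5, 4.5, 6.0, 8.5, 9.5]
```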


Subject(s)
Electroencephalography , Human Body , Humans , Electroencephalography/methods , Pattern Recognition, Visual/physiology , Photic Stimulation
5.
Neuroimage ; 255: 119181, 2022 07 15.
Article in English | MEDLINE | ID: mdl-35413443

ABSTRACT

Visual categorization is the brain's ability to rapidly and automatically respond to a certain category of inputs. Whether category-selective neural responses are purely visual or can be influenced by other sensory modalities remains unclear. Here, we test whether odors modulate visual categorization, expecting that odors facilitate the neural categorization of congruent visual objects, especially when the visual category is ambiguous. Scalp electroencephalogram (EEG) was recorded while natural images depicting various objects were displayed in rapid 12-Hz streams (i.e., 12 images / second) and variable exemplars of a target category (either human faces, cars, or facelike objects in dedicated sequences) were interleaved every 9th stimulus to tag category-selective responses at 12/9 = 1.33 Hz in the EEG frequency spectrum. During visual stimulation, participants (N = 26) were implicitly exposed to odor contexts (either body, gasoline or baseline odors) and performed an orthogonal cross-detection task. We identify clear category-selective responses to every category over the occipito-temporal cortex, with the largest response for human faces and the lowest for facelike objects. Critically, body odor boosts the response to the ambiguous facelike objects (i.e., either perceived as nonface objects or faces) over the right hemisphere, especially for participants reporting their presence post-stimulation. By contrast, odors do not significantly modulate other category-selective responses, nor the general visual response recorded at 12 Hz, revealing a specific influence on the categorization of congruent ambiguous stimuli. Overall, these findings support the view that the brain actively uses cues from the different senses to readily categorize visual inputs, and that olfaction, which has long been considered poorly functional in humans, is well placed to disambiguate visual information.
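
The oddball tagging frequency follows directly from the design: with a 12 Hz base rate and a target exemplar every 9th image, the category-selective response is expected at 12/9 ≈ 1.33 Hz. Below is a minimal sketch of that arithmetic, including the usual convention of reading out the response at the oddball harmonics that do not coincide with the base rate (an assumption about the analysis, not a detail taken from this abstract).

```python
# Oddball frequency implied by the design described above.
base_rate_hz = 12.0      # images per second
oddball_every = 9        # a target exemplar appears every 9th image
f_oddball = base_rate_hz / oddball_every
print(f"{f_oddball:.2f} Hz")   # 1.33 Hz, as stated in the abstract

# Harmonics of the oddball frequency, skipping those that coincide with the
# 12 Hz base rate (every 9th harmonic); summing responses over such harmonics
# is a common convention in this literature, not necessarily the exact
# analysis used here.
harmonics = [round(h * f_oddball, 2) for h in range(1, 19)
             if h % oddball_every != 0]
print(harmonics)
```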


Subject(s)
Brain Mapping , Smell , Brain/physiology , Electroencephalography , Humans , Odorants , Photic Stimulation/methods
6.
Cognition ; 222: 105016, 2022 05.
Article in English | MEDLINE | ID: mdl-35030358

ABSTRACT

The human brain rapidly and automatically categorizes faces vs. other visual objects. However, whether face-selective neural activity predicts the subjective experience of a face - perceptual awareness - is debated. To clarify this issue, here we use face pareidolia, i.e., the illusory perception of a face, as a proxy to relate the neural categorization of a variety of facelike objects to conscious face perception. In Experiment 1, scalp electroencephalogram (EEG) is recorded while pictures of human faces or facelike objects - in different stimulation sequences - are interleaved every second (i.e., at 1 Hz) in a rapid 6-Hz train of natural images of nonface objects. Participants do not perform any explicit face categorization task during stimulation, and report whether they perceived illusory faces post-stimulation. A robust categorization response to facelike objects is identified at 1 Hz and harmonics in the EEG frequency spectrum with a facelike occipito-temporal topography. Across all individuals, the facelike categorization response is about 20% of the response to human faces, but more strongly right-lateralized. Critically, its amplitude is much larger in participants who report having perceived illusory faces. In Experiment 2, facelike or matched nonface objects from the same categories appear at 1 Hz in sequences of nonface objects presented at variable stimulation rates (60 Hz to 12 Hz) and participants explicitly report after each sequence whether they perceived illusory faces. The facelike categorization response already emerges at the shortest stimulus duration (i.e., 17 ms at 60 Hz) and predicts the behavioral report of conscious perception. Strikingly, neural facelike-selectivity emerges exclusively when participants report illusory faces. Collectively, these experiments characterize a neural signature of face pareidolia in the context of rapid categorization, supporting the view that face-selective brain activity reliably predicts the subjective experience of a face from a single glance at a variety of stimuli.
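
The 17 ms figure follows from the presentation rate: each image is shown for 1/rate seconds. A quick worked check is below; only the 60 Hz and 12 Hz endpoints are quoted in the abstract, so the intermediate rates here are illustrative placeholders.

```python
# Image duration implied by each presentation rate (duration = 1 / rate).
# Only the 60 Hz and 12 Hz endpoints are quoted in the abstract; the
# intermediate rates are illustrative placeholders.
for rate_hz in (60, 30, 24, 20, 15, 12):
    print(f"{rate_hz:>2d} Hz -> {1000 / rate_hz:.1f} ms per image")
# 60 Hz -> 16.7 ms, i.e. the ~17 ms shortest duration mentioned above.
```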


Subject(s)
Facial Recognition , Illusions , Brain/physiology , Brain Mapping , Electroencephalography , Facial Recognition/physiology , Humans , Photic Stimulation/methods
7.
Neuropsychologia ; 160: 107967, 2021 09 17.
Article in English | MEDLINE | ID: mdl-34303717

ABSTRACT

Human faces and bodies are environmental stimuli of special importance that the brain processes with selective attention and a highly specialized visual system. It has been shown recently that the human brain also has dedicated networks for the perception of pluralities of human bodies in synchronous motion or in face-to-face interaction. Here we show that a plurality of human bodies that are merely in close spatial proximity is automatically integrated into a coherent perceptual unit. We used an EEG frequency tagging technique allowing the dissociation of the brain activity related to the component parts of an image from the activity related to the global image configuration. We presented participants with images of two silhouettes flickering at different frequencies (5.88 vs. 7.14 Hz). Clear responses at these stimulation frequencies reflected the response to each part of the dyad. An emerging intermodulation component (7.14 + 5.88 = 13.02 Hz), a nonlinear response regarded as an objective signature of holistic representation, was significantly enhanced in the (typical) upright relative to an (altered) inverted position. Moreover, the inversion effect was significant for the intermodulation component but not for the stimulation frequencies, suggesting a trade-off between the processing of the global dyad configuration and that of the structural properties of the dyad elements. Our results show that, when presented with two humans merely in close proximity, the visual system binds them into a single perceptual unit. Hence, the perception of the human form might be of a fundamentally different nature when it is part of a plurality.
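
One common way to implement this kind of tagging is to modulate the contrast of each stimulus sinusoidally at its tagging frequency, frame by frame at the display refresh rate. The sketch below uses the two frequencies reported in the abstract; the refresh rate and the modulation scheme are assumptions for illustration, not details taken from the study.

```python
import numpy as np

# Sinusoidal contrast modulation of two stimuli at their tagging frequencies,
# sampled at the display refresh rate. The 100 Hz refresh and the sinusoidal
# scheme are assumptions for illustration.
refresh_hz = 100
duration_s = 2.0
f1, f2 = 5.88, 7.14                     # tagging frequencies from the abstract

t = np.arange(int(refresh_hz * duration_s)) / refresh_hz
contrast_body1 = 0.5 * (1 + np.sin(2 * np.pi * f1 * t))   # contrast in [0, 1]
contrast_body2 = 0.5 * (1 + np.sin(2 * np.pi * f2 * t))

# The intermodulation component indexing integration of the two bodies
# appears at combinations of the two frequencies, e.g. their sum:
print(f1 + f2)   # 13.02 Hz, as reported in the abstract
```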


Subject(s)
Brain , Electroencephalography , Attention , Brain/diagnostic imaging , Electrophysiological Phenomena , Humans , Pattern Recognition, Visual , Perception , Photic Stimulation , Visual Perception
8.
Proc Natl Acad Sci U S A ; 118(21)2021 05 25.
Article in English | MEDLINE | ID: mdl-34001601

ABSTRACT

Understanding how the young infant brain starts to categorize the flurry of ambiguous sensory inputs coming in from its complex environment is of primary scientific interest. Here, we test the hypothesis that senses other than vision play a key role in initiating complex visual categorizations in twenty 4-month-old infants exposed either to a baseline odor or to their mother's odor while their electroencephalogram (EEG) is recorded. Various natural images of objects are presented at a 6-Hz rate (six images/second), with face-like object configurations of the same object categories (i.e., eliciting face pareidolia in adults) interleaved every sixth stimulus (i.e., 1 Hz). In the baseline odor context, a weak neural categorization response to face-like stimuli appears at 1 Hz in the EEG frequency spectrum over bilateral occipitotemporal regions. Critically, this face-like-selective response is magnified and becomes right-lateralized in the presence of maternal body odor. This reveals that nonvisual cues systematically associated with human faces in the infant's experience shape the interpretation of face-like configurations as faces in the right hemisphere, which is dominant for face categorization. At the individual level, this intersensory influence is particularly effective when there is no trace of face-like categorization in the baseline odor context. These observations provide evidence for the early tuning of face-(like)-selective activity from multisensory inputs in the developing brain, suggesting that perceptual development integrates information across the senses for efficient category acquisition, with early maturing systems such as olfaction driving the acquisition of categories in later-developing systems such as vision.
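
A minimal sketch of how such a periodic oddball sequence can be assembled: base object images at 6 Hz with a face-like configuration as every sixth image, which tags the face-like category at 6 / 6 = 1 Hz. The file names, sequence length, and random sampling are placeholder assumptions for illustration.

```python
import random

# Base images at 6 Hz with a face-like configuration as every 6th image,
# so the face-like category is tagged at 6 / 6 = 1 Hz. File names and
# sequence length are placeholders for illustration.
base_images = [f"object_{i:02d}.png" for i in range(48)]
facelike_images = [f"facelike_{i:02d}.png" for i in range(12)]

presentation_rate_hz = 6
oddball_period = 6                     # every 6th stimulus is face-like

sequence = []
for position in range(120):            # 120 images = 20 s at 6 Hz
    if (position + 1) % oddball_period == 0:
        sequence.append(random.choice(facelike_images))
    else:
        sequence.append(random.choice(base_images))

print(f"oddball frequency: {presentation_rate_hz / oddball_period:.1f} Hz")
print(sequence[:12])
```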


Subject(s)
Brain/physiology , Facial Recognition/physiology , Odorants , Vision, Ocular/physiology , Brain/diagnostic imaging , Brain Mapping , Electroencephalography , Female , Humans , Infant , Male , Photic Stimulation