Results 1 - 18 of 18
1.
Top Cogn Sci ; 2023 Apr 28.
Article in English | MEDLINE | ID: mdl-37115518

ABSTRACT

Research over the past four decades has built a convincing case that co-speech hand gestures play a powerful role in human cognition. However, this recent focus on the cognitive function of gesture has, to a large extent, overlooked its emotional role, a role that was once central to research on bodily expression. In the present review, we first give a brief summary of the wealth of research demonstrating the cognitive function of co-speech gestures in language acquisition, learning, and thinking. Building on this foundation, we revisit the emotional function of gesture across a wide range of communicative contexts, from clinical to artistic to educational, and spanning diverse fields, from cognitive neuroscience to linguistics to affective science. Bridging the cognitive and emotional functions of gesture highlights promising avenues of research that have varied practical and theoretical implications for human-machine interactions, therapeutic interventions, language evolution, embodied cognition, and more.

2.
Front Psychol ; 11: 574418, 2020.
Article in English | MEDLINE | ID: mdl-33071912

ABSTRACT

Traditionally, much of the attention on the communicative effects of non-native accent has focused on the accent itself rather than how it functions within a more natural context. The present study explores how the bodily context of co-speech emblematic gestures affects perceptual and social evaluation of non-native accent. In two experiments in two different languages, Mandarin and Japanese, we filmed learners performing a short utterance in three different within-subjects conditions: speech alone, culturally familiar gesture, and culturally unfamiliar gesture. Native Mandarin participants watched videos of foreign-accented Mandarin speakers (Experiment 1), and native Japanese participants watched videos of foreign-accented Japanese speakers (Experiment 2). Following each video, native-language participants were asked a set of questions targeting speech perception and social impressions of the learners. Results from both experiments demonstrate that familiar (and occasionally unfamiliar) emblems facilitated speech perception and enhanced social evaluations compared to the speech-alone baseline. The variability in our findings suggests that gesture may serve varied functions in the perception and evaluation of non-native accent.

3.
J Speech Lang Hear Res ; 61(9): 2179-2195, 2018 09 19.
Article in English | MEDLINE | ID: mdl-30193334

ABSTRACT

Purpose: This study investigated the impact of metaphoric actions (head nods and hand gestures) on producing Mandarin tones for first language (L1) and second language (L2) speakers. Method: In 2 experiments, participants imitated videos of Mandarin tones produced under 3 conditions: (a) speech alone, (b) speech + head nods, and (c) speech + hand gestures. Fundamental frequency was recorded for both L1 (Experiment 1) and L2 (Experiment 2a) speakers, and the output of the L2 speakers was rated for tonal accuracy by 7 native Mandarin judges (Experiment 2b). Results: Experiment 1 showed that 12 L1 speakers' fundamental frequency spectral data did not differ among the 3 conditions. In Experiment 2a, the conditions did not affect the production of 24 English speakers for the most part, but there was some evidence that hand gestures helped Tone 4. In Experiment 2b, native Mandarin judges found limited conditional differences in L2 productions, with Tone 3 showing a slight head-nod benefit in a subset of "correct" L2 tokens. Conclusion: Results suggest that metaphoric bodily actions do not influence the lowest levels of L1 speech production in a tonal language and may play a very modest role during preliminary L2 learning.
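As an aside for readers interested in the measurement itself (this is not the authors' analysis pipeline), a fundamental frequency contour can be extracted from a recorded token with an off-the-shelf pitch tracker; the file name and pitch bounds below are placeholders.

```python
# Sketch: extracting an F0 contour from one recorded Mandarin token, roughly
# the kind of fundamental frequency data compared across imitation conditions.
import numpy as np
import librosa

y, sr = librosa.load("tone4_token.wav", sr=None)      # hypothetical recording

# pYIN returns a per-frame F0 estimate plus a voicing decision.
f0, voiced_flag, voiced_prob = librosa.pyin(
    y,
    fmin=librosa.note_to_hz("C2"),   # ~65 Hz lower bound (illustrative)
    fmax=librosa.note_to_hz("C6"),   # ~1047 Hz upper bound (illustrative)
    sr=sr,
)

f0_voiced = f0[voiced_flag]                            # keep only voiced frames
print(f"mean F0: {np.nanmean(f0_voiced):.1f} Hz, "
      f"range: {np.nanmin(f0_voiced):.1f}-{np.nanmax(f0_voiced):.1f} Hz")
```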


Subject(s)
Gestures; Imitative Behavior/physiology; Learning/physiology; Multilingualism; Speech/physiology; Adolescent; Asian People/psychology; Female; Hand; Head; Humans; Language; Young Adult
4.
J Soc Psychol ; 157(4): 474-484, 2017.
Article in English | MEDLINE | ID: mdl-27684932

ABSTRACT

Judging others' power facilitates successful social interaction. Both gender and body posture have been shown to influence judgments of another's power. However, little is known about how these two cues interact when they conflict or how they influence early processing. The present study investigated this question during very early processing of power-related words using event-related potentials (ERPs). Participants viewed images of women and men in dominant and submissive postures that were quickly followed by dominant or submissive words. Gender and posture both modulated neural responses in the N2 latency range to dominant words, but for submissive words they had little impact. Thus, in the context of dual-processing theories of person perception, information extracted both from behavior (i.e., posture) and from category membership (i.e., gender) is recruited side by side to impact word processing.


Subject(s)
Evoked Potentials/physiology; Language; Posture; Power, Psychological; Social Dominance; Social Perception; Adolescent; Adult; Female; Humans; Male; Young Adult
5.
Soc Cogn Affect Neurosci ; 10(9): 1236-43, 2015 Sep.
Article in English | MEDLINE | ID: mdl-25688095

ABSTRACT

In face-to-face communication, speech is typically enriched by gestures. Clearly, not all people gesture in the same way, and the present study explores whether such individual differences in gesture style are taken into account during the perception of gestures that accompany speech. Participants were presented with one speaker who gestured in a straightforward way and another who also produced self-touch movements. Adding trials with such grooming movements makes the gesture information a much weaker cue compared with the gestures of the non-grooming speaker. The electroencephalogram (EEG) was recorded as participants watched videos of the individual speakers. Event-related potentials elicited by the speech signal revealed that adding grooming movements attenuated the impact of gesture for this particular speaker. Thus, these data suggest that there is sensitivity to the personal communication style of a speaker and that this sensitivity affects the extent to which gesture and speech are integrated during language comprehension.


Subject(s)
Brain/physiology; Comprehension/physiology; Evoked Potentials/physiology; Gestures; Speech/physiology; Adult; Electroencephalography; Female; Humans; Language; Male; Speech Perception/physiology; Young Adult
6.
Soc Cogn Affect Neurosci ; 10(2): 255-61, 2015 Feb.
Article in English | MEDLINE | ID: mdl-24652857

ABSTRACT

Recipients process information from speech and co-speech gestures, but it is currently unknown how this processing is influenced by the presence of other important social cues, especially gaze direction, a marker of communicative intent. Such cues may modulate neural activity in regions associated either with the processing of ostensive cues, such as eye gaze, or with the processing of semantic information, provided by speech and gesture. Participants were scanned (fMRI) while taking part in triadic communication involving two recipients and a speaker. The speaker uttered sentences that were and were not accompanied by complementary iconic gestures. Crucially, the speaker alternated her gaze direction, thus creating two recipient roles: addressed (direct gaze) vs unaddressed (averted gaze) recipient. The comprehension of Speech&Gesture relative to SpeechOnly utterances recruited middle occipital, middle temporal and inferior frontal gyri, bilaterally. The calcarine sulcus and posterior cingulate cortex were sensitive to differences between direct and averted gaze. Most importantly, Speech&Gesture utterances, but not SpeechOnly utterances, produced additional activity in the right middle temporal gyrus when participants were addressed. Marking communicative intent with gaze direction modulates the processing of speech-gesture utterances in cerebral areas typically associated with the semantic processing of multi-modal communicative acts.
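For illustration only, a first-level contrast of the kind reported above (Speech&Gesture vs. SpeechOnly within addressed recipients) might be set up with nilearn as sketched below. The TR, onsets, condition labels, and file names are assumptions, not details of this study.

```python
# Sketch: a hypothetical first-level GLM contrast (Speech&Gesture minus
# SpeechOnly for addressed recipients) using nilearn. All values are placeholders.
import pandas as pd
from nilearn.glm.first_level import FirstLevelModel

events = pd.DataFrame({
    "onset":      [0.0, 12.0, 24.0, 36.0],           # seconds (illustrative)
    "duration":   [6.0, 6.0, 6.0, 6.0],
    "trial_type": ["speech_gesture_addressed", "speech_only_addressed",
                   "speech_gesture_unaddressed", "speech_only_unaddressed"],
})

model = FirstLevelModel(t_r=2.0, hrf_model="spm")     # assumed TR of 2 s
model = model.fit("sub01_bold.nii.gz", events=events) # hypothetical 4D image

zmap = model.compute_contrast(
    "speech_gesture_addressed - speech_only_addressed", output_type="z_score"
)
zmap.to_filename("sg_vs_so_addressed_zmap.nii.gz")
```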


Subject(s)
Fixation, Ocular/physiology; Gestures; Gyrus Cinguli/physiology; Brain Mapping; Communication; Comprehension; Cues; Female; Humans; Magnetic Resonance Imaging; Speech; Young Adult
7.
J Speech Lang Hear Res ; 57(6): 2090-101, 2014 Dec.
Article in English | MEDLINE | ID: mdl-25088127

ABSTRACT

PURPOSE: Research has shown that hand gestures affect comprehension and production of speech at semantic, syntactic, and pragmatic levels for both native language and second language (L2). This study investigated a relatively less explored question: Do hand gestures influence auditory learning of an L2 at the segmental phonology level? METHOD: To examine auditory learning of phonemic vowel length contrasts in Japanese, 88 native English-speaking participants took an auditory test before and after one of the following 4 types of training in which they (a) observed an instructor in a video speaking Japanese words while she made a syllabic-rhythm hand gesture, (b) produced this gesture with the instructor, (c) observed the instructor speaking those words while she made a moraic-rhythm hand gesture, or (d) produced the moraic-rhythm gesture with the instructor. RESULTS: All of the training types yielded similar auditory improvement in identifying vowel length contrasts. However, observing the syllabic-rhythm hand gesture yielded the most balanced improvement between word-initial and word-final vowels and between slow and fast speaking rates. CONCLUSIONS: The overall effect of hand gesture on learning of segmental phonology is limited. Implications for theories of hand gesture are discussed in terms of the role it plays at different linguistic levels.


Subject(s)
Gestures; Multilingualism; Phonetics; Semantics; Speech Perception; Adolescent; Comprehension; Female; Hand; Humans; Language; Language Development; Learning; Male; Psycholinguistics; Young Adult
8.
Front Psychol ; 5: 673, 2014.
Article in English | MEDLINE | ID: mdl-25071646

ABSTRACT

Co-speech hand gestures are a type of multimodal input that has received relatively little attention in the context of second language learning. The present study explored the role that observing and producing different types of gestures plays in learning novel speech sounds and word meanings in an L2. Naïve English speakers were taught two components of Japanese (novel phonemic vowel length contrasts and vocabulary items comprising those contrasts) in one of four different gesture conditions: Syllable Observe, Syllable Produce, Mora Observe, and Mora Produce. Half of the gestures conveyed intuitive information about syllable structure, and the other half conveyed unintuitive information about Japanese mora structure. Within each Syllable and Mora condition, half of the participants only observed the gestures that accompanied speech during training, and the other half also produced the gestures that they observed along with the speech. The main finding was that participants across all four conditions had similar outcomes in two different types of auditory identification tasks and a vocabulary test. The results suggest that hand gestures may not be well suited for learning novel phonetic distinctions at the syllable level within a word, and thus, gesture-speech integration may break down at the lowest levels of language processing and learning.

9.
PLoS One ; 7(8): e42620, 2012.
Article in English | MEDLINE | ID: mdl-22912715

ABSTRACT

Co-speech hand gestures influence language comprehension. The present experiment explored what part of the visual processing system is optimized for processing these gestures. Participants viewed short video clips of speech and gestures (e.g., a person saying "chop" or "twist" while making a chopping gesture) and had to determine whether the two modalities were congruent or incongruent. Gesture videos were designed to stimulate the parvocellular or magnocellular visual pathways by filtering out low or high spatial frequencies (HSF versus LSF) at two levels of degradation severity (moderate and severe). Participants were less accurate and slower at processing gesture and speech at severe versus moderate levels of degradation. In addition, they were slower for LSF versus HSF stimuli, and this difference was most pronounced in the severely degraded condition. However, exploratory item analyses showed that the HSF advantage was modulated by the range of motion and amount of motion energy in each video. The results suggest that hand gestures exploit a wide range of spatial frequencies, and depending on what frequencies carry the most motion energy, parvocellular or magnocellular visual pathways are maximized to quickly and optimally extract meaning.
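A minimal sketch of the kind of spatial frequency manipulation described above, assuming a Gaussian low-pass filter for the LSF version and its residual for the HSF version; the input frame, cutoff (sigma), and output file names are placeholders rather than the study's actual stimulus parameters.

```python
# Sketch: low- and high-spatial-frequency versions of a single video frame,
# analogous to the LSF/HSF stimulus degradation described above.
import numpy as np
from scipy.ndimage import gaussian_filter
import imageio.v3 as iio

frame = iio.imread("gesture_frame.png").astype(float)  # hypothetical frame
if frame.ndim == 3:                        # collapse RGB to grayscale
    frame = frame[..., :3].mean(axis=-1)

sigma = 6.0                                # larger sigma = stronger low-pass blur
lsf = gaussian_filter(frame, sigma=sigma)  # keeps low spatial frequencies
hsf = frame - lsf                          # residual keeps high spatial frequencies

def to_uint8(img):
    """Rescale to 0-255 for saving as an 8-bit image."""
    img = img - img.min()
    return np.uint8(255 * img / (img.max() + 1e-12))

iio.imwrite("frame_lsf.png", to_uint8(lsf))
iio.imwrite("frame_hsf.png", to_uint8(hsf))
```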


Subject(s)
Gestures; Speech; Visual Perception; Female; Humans; Male; Movement; Photic Stimulation; Range of Motion, Articular; Reaction Time; Semantics; Videotape Recording
10.
Psychol Sci ; 21(2): 260-7, 2010 Feb.
Article in English | MEDLINE | ID: mdl-20424055

ABSTRACT

Gesture and speech are assumed to form an integrated system during language production. Based on this view, we propose the integrated-systems hypothesis, which explains two ways in which gesture and speech are integrated, through mutual and obligatory interactions, in language comprehension. Experiment 1 presented participants with action primes (e.g., someone chopping vegetables) and bimodal speech and gesture targets. Participants related primes to targets more quickly and accurately when they contained congruent information (speech: "chop"; gesture: chop) than when they contained incongruent information (speech: "chop"; gesture: twist). Moreover, the strength of the incongruence affected processing, with fewer errors for weak incongruities (speech: "chop"; gesture: cut) than for strong incongruities (speech: "chop"; gesture: twist). Crucial for the integrated-systems hypothesis, this influence was bidirectional. Experiment 2 demonstrated that gesture's influence on speech was obligatory. The results confirm the integrated-systems hypothesis and demonstrate that gesture and speech form an integrated system in language comprehension.


Subject(s)
Comprehension; Gestures; Speech; Attention; Female; Humans; Male; Semantics
11.
J Speech Lang Hear Res ; 53(2): 298-310, 2010 Apr.
Article in English | MEDLINE | ID: mdl-20220023

ABSTRACT

PURPOSE: Previous research has found that auditory training helps native English speakers to perceive phonemic vowel length contrasts in Japanese, but their performance did not reach native levels after training. Given that multimodal information, such as lip movement and hand gesture, influences many aspects of native language processing, the authors examined whether multimodal input helps to improve native English speakers' ability to perceive Japanese vowel length contrasts. METHOD: Sixty native English speakers participated in 1 of 4 types of training: (a) audio-only; (b) audio-mouth; (c) audio-hands; and (d) audio-mouth-hands. Before and after training, participants were given phoneme perception tests that measured their ability to identify short and long vowels in Japanese (e.g., /kato/ vs. /kato:/). RESULTS: Although all 4 groups improved from pre- to posttest (replicating previous research), the participants in the audio-mouth condition improved more than those in the audio-only condition, whereas the 2 conditions involving hand gestures did not. CONCLUSIONS: Seeing lip movements during training significantly helps learners to perceive difficult second-language phonemic contrasts, but seeing hand gestures does not. The authors discuss possible benefits and limitations of using multimodal information in second-language phoneme learning.


Subject(s)
Auditory Perception; Hand; Learning; Lip; Multilingualism; Phonetics; Visual Perception; Analysis of Variance; Gestures; Humans; Language; Language Tests; Lipreading; Motor Activity; Psycholinguistics; Speech; Young Adult
12.
J Cogn Neurosci ; 22(4): 683-94, 2010 Apr.
Article in English | MEDLINE | ID: mdl-19413483

ABSTRACT

Previous research has demonstrated a link between language and action in the brain. The present study investigates the strength of this neural relationship by focusing on a potential interface between the two systems: cospeech iconic gesture. Participants performed a Stroop-like task in which they watched videos of a man and a woman speaking and gesturing about common actions. The videos differed as to whether the gender of the speaker and gesturer was the same or different and whether the content of the speech and gesture was congruent or incongruent. The task was to identify whether a man or a woman produced the spoken portion of the videos while accuracy rates, RTs, and ERPs were recorded to the words. Although not relevant to the task, participants paid attention to the semantic relationship between the speech and the gesture, producing a larger N400 to words accompanied by incongruent versus congruent gestures. In addition, RTs were slower to incongruent versus congruent gesture-speech stimuli, but this effect was greater when the gender of the gesturer and speaker was the same versus different. These results suggest that the integration of gesture and speech during language comprehension is automatic but also under some degree of neurocognitive control.


Subject(s)
Brain/physiology; Gestures; Speech Perception/physiology; Speech/physiology; Analysis of Variance; Brain Mapping; Electroencephalography/methods; Electronic Data Processing/methods; Evoked Potentials, Auditory/physiology; Female; Functional Laterality; Humans; Male; Photic Stimulation/methods; Psycholinguistics/methods; Reaction Time/physiology; Sex Factors; Young Adult
13.
Soc Neurosci ; 3(3-4): 434-42, 2008.
Article in English | MEDLINE | ID: mdl-18633830

ABSTRACT

The present study investigated whether emotional states influence the neural processing of language. Event-related potentials (ERPs) were recorded in response to positively and negatively valenced words (e.g., love vs. death) while participants were directly induced into positive and negative moods. Electrodes over frontal scalp regions distinguished positive and negative words around 400 ms poststimulus. The amplitude of this negative waveform showed a larger negativity for positive words compared to negative words in the frontal electrode region when participants were in a positive, but not negative, mood. These findings build on previous research by demonstrating that people process affective language differently when in positive and negative moods, and lend support to recent views that emotion and cognition interact during language comprehension.


Subject(s)
Brain Mapping; Emotions/physiology; Evoked Potentials, Visual/physiology; Language; Analysis of Variance; Electroencephalography/methods; Female; Functional Laterality; Humans; Male; Photic Stimulation/methods; Reaction Time/physiology; Young Adult
14.
Brain Lang ; 101(3): 181-4, 2007 Jun.
Article in English | MEDLINE | ID: mdl-17433429
15.
Brain Lang ; 101(3): 222-33, 2007 Jun.
Article in English | MEDLINE | ID: mdl-16997367

ABSTRACT

The present study investigates whether knowledge about the intentional relationship between gesture and speech influences controlled processes when integrating the two modalities during comprehension. Thirty-five adults watched short videos of gesture and speech that conveyed semantically congruous and incongruous information. In half of the videos, participants were told that the two modalities were intentionally coupled (i.e., produced by the same communicator), and in the other half, they were told that the two modalities were not intentionally coupled (i.e., produced by different communicators). When participants knew that the same communicator produced the speech and gesture, there was a larger bilateral frontal and central N400 effect to words that were semantically incongruous versus congruous with gesture. However, when participants knew that different communicators produced the speech and gesture (that is, when gesture and speech were not intentionally meant to go together), the N400 effect was present only in right-hemisphere frontal regions. The results demonstrate that pragmatic knowledge about the intentional relationship between gesture and speech modulates controlled neural processes during the integration of the two modalities.


Subject(s)
Cerebral Cortex/physiology; Gestures; Intention; Speech Perception/physiology; Adult; Analysis of Variance; Brain Mapping; Comprehension/physiology; Dominance, Cerebral; Evoked Potentials; Female; Humans; Male; Psycholinguistics; Reaction Time
16.
Brain Lang ; 89(1): 253-60, 2004 Apr.
Article in English | MEDLINE | ID: mdl-15010257

ABSTRACT

The present study examined the neural correlates of speech and hand gesture comprehension in a naturalistic context. Fifteen participants watched audiovisual segments of speech and gesture while event-related potentials (ERPs) were recorded to the speech. Gesture influenced the ERPs to the speech. Specifically, there was a right-lateralized N400 effect, reflecting semantic integration, when gestures mismatched versus matched the speech. In addition, early sensory components in bilateral occipital and frontal sites differentiated speech accompanied by matching versus non-matching gestures. These results suggest that hand gestures may be integrated with speech at early and late stages of language processing.
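The mismatch-versus-match comparison described above can be illustrated with a short MNE-Python sketch on hypothetical data; the trigger codes, channel picks, and file name are assumptions, not the study's actual recording setup.

```python
# Sketch: an N400-style difference wave (mismatch minus match) from
# hypothetical epoched EEG data, using MNE-Python.
import mne

raw = mne.io.read_raw_fif("gesture_speech_raw.fif", preload=True)  # placeholder file
events = mne.find_events(raw)                   # assumes a stim/trigger channel
event_id = {"match": 1, "mismatch": 2}          # assumed trigger codes

epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.2, tmax=1.0, baseline=(None, 0), preload=True)

match = epochs["match"].average()
mismatch = epochs["mismatch"].average()

# Difference wave: mismatch minus match, inspected over right-hemisphere sites.
n400_effect = mne.combine_evoked([mismatch, match], weights=[1, -1])
n400_effect.plot(picks=["F4", "C4", "P4"])      # hypothetical right-lateralized channels
```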


Subject(s)
Attention/physiology; Comprehension/physiology; Electroencephalography; Gestures; Speech Perception/physiology; Visual Perception/physiology; Adult; Cerebral Cortex/physiology; Evoked Potentials/physiology; Female; Humans; Male; Psycholinguistics; Semantics
17.
Dev Neuropsychol ; 24(2-3): 541-58, 2003.
Article in English | MEDLINE | ID: mdl-14561561

ABSTRACT

Event-related potentials (ERPs) from 134 children were obtained at 3 and 8 years of age in response to a series of consonant-vowel speech syllables and their nonspeech analogues. The HOME inventory was administered to these same children at 3 and 8 years of age, and the sample was divided into 2 groups (low vs. high) based on their HOME scores. Discriminant function analyses using ERP responses to the speech syllables and their nonspeech analogues successfully classified HOME scores obtained at 3 and 8 years of age and discriminated between children who received low vs. high levels of stimulation for language and reading.
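As an illustration of the discriminant function approach mentioned above (using simulated placeholder data rather than the study's measurements), a linear discriminant classifier can be cross-validated on ERP-derived features to separate low- and high-HOME groups.

```python
# Sketch: discriminant function analysis classifying children into low vs. high
# HOME groups from ERP features. Data are simulated placeholders.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_children, n_erp_features = 134, 12                 # e.g., peak amplitudes/latencies
X = rng.normal(size=(n_children, n_erp_features))    # simulated ERP features
y = rng.integers(0, 2, size=n_children)              # 0 = low HOME, 1 = high HOME

lda = LinearDiscriminantAnalysis()
accuracy = cross_val_score(lda, X, y, cv=5).mean()   # 5-fold cross-validation
print(f"cross-validated classification accuracy: {accuracy:.2f}")
```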


Subject(s)
Auditory Perception/physiology; Discrimination, Psychological/physiology; Environment; Phonetics; Acoustic Stimulation; Aging/physiology; Cerebral Cortex/physiology; Child; Child, Preschool; Evoked Potentials/physiology; Female; Functional Laterality; Humans; Intelligence/physiology; Longitudinal Studies; Male
18.
Dev Neuropsychol ; 22(1): 323-49, 2002.
Article in English | MEDLINE | ID: mdl-12405508

ABSTRACT

This article investigates the role that nonverbal actions play in language processing over 3 different time frames. First, we speculate that nonverbal actions played a role in how formal language systems emerged from our primate ancestors over evolutionary time. Next, we hypothesize that if nonverbal behaviors played a foundational role in the emergence of language over evolution, these actions should influence how children learn language in the present. Finally, we argue that nonverbal actions continue to play a role for adults in the moment-to-moment processing of language. Throughout, we take an embodied view of language and argue that the neural, cognitive, and social components of language processing are firmly grounded in bodily action.


Subject(s)
Brain/physiology; Gestures; Language Development; Language; Nonverbal Communication/psychology; Speech; Animals; Biological Evolution; Comprehension; Frontal Lobe/physiology; Humans; Learning/physiology; Neocortex/physiology; Nonverbal Communication/physiology; Speech/physiology; Temporal Lobe/physiology