Results 1 - 20 of 22
1.
Top Cogn Sci ; 2023 Apr 28.
Article in English | MEDLINE | ID: mdl-37115518

ABSTRACT

Research over the past four decades has built a convincing case that co-speech hand gestures play a powerful role in human cognition. However, this recent focus on the cognitive function of gesture has, to a large extent, overlooked its emotional role-a role that was once central to research on bodily expression. In the present review, we first give a brief summary of the wealth of research demonstrating the cognitive function of co-speech gestures in language acquisition, learning, and thinking. Building on this foundation, we revisit the emotional function of gesture across a wide range of communicative contexts, from clinical to artistic to educational, and spanning diverse fields, from cognitive neuroscience to linguistics to affective science. Bridging the cognitive and emotional functions of gesture highlights promising avenues of research that have varied practical and theoretical implications for human-machine interactions, therapeutic interventions, language evolution, embodied cognition, and more.

2.
Mem Cognit ; 49(5): 884-894, 2021 07.
Article in English | MEDLINE | ID: mdl-33415717

ABSTRACT

Beyond conveying objective content about objects and actions, what can co-speech iconic gestures reveal about a speaker's subjective relationship to that content? The present study explores this question by investigating how gesture viewpoints can inform a listener's construal of a speaker's agency. Forty native English speakers watched videos of an actor uttering sentences with different viewpoints-that of low agency or high agency-conveyed through both speech and gesture. Participants were asked to (1) rate the speaker's responsibility for the action described in each video (encoding task) and (2) complete a surprise memory test of the spoken sentences (recall task). For the encoding task, participants rated responsibility near ceiling when agency in speech was high, with a slight dip when accompanied by gestures of low agency. When agency in speech was low, responsibility ratings were raised markedly when accompanied by gestures of high agency. In the recall task, participants produced more incorrect recall of spoken agency when the viewpoints expressed through speech and gesture were inconsistent with one another. Our findings suggest that, beyond conveying objective content, co-speech iconic gestures can also guide listeners in gauging a speaker's agentic relationship to actions and events.


Subjects
Gestures, Speech Perception, Comprehension, Humans, Language, Speech
3.
Front Psychol ; 11: 574418, 2020.
Article in English | MEDLINE | ID: mdl-33071912

ABSTRACT

Traditionally, much of the attention on the communicative effects of non-native accent has focused on the accent itself rather than how it functions within a more natural context. The present study explores how the bodily context of co-speech emblematic gestures affects perceptual and social evaluation of non-native accent. In two experiments in two different languages, Mandarin and Japanese, we filmed learners performing a short utterance in three different within-subjects conditions: speech alone, culturally familiar gesture, and culturally unfamiliar gesture. Native Mandarin participants watched videos of foreign-accented Mandarin speakers (Experiment 1), and native Japanese participants watched videos of foreign-accented Japanese speakers (Experiment 2). Following each video, native language participants were asked a set of questions targeting speech perception and social impressions of the learners. Results from both experiments demonstrate that familiar-and occasionally unfamiliar-emblems facilitated speech perception and enhanced social evaluations compared to the speech alone baseline. The variability in our findings suggests that gesture may serve varied functions in the perception and evaluation of non-native accent.

4.
J Speech Lang Hear Res ; 61(9): 2179-2195, 2018 09 19.
Article in English | MEDLINE | ID: mdl-30193334

ABSTRACT

Purpose: This study investigated the impact of metaphoric actions-head nods and hand gestures-in producing Mandarin tones for first language (L1) and second language (L2) speakers. Method: In 2 experiments, participants imitated videos of Mandarin tones produced under 3 conditions: (a) speech alone, (b) speech + head nods, and (c) speech + hand gestures. Fundamental frequency was recorded for both L1 (Experiment 1) and L2 (Experiment 2a) speakers, and the output of the L2 speakers was rated for tonal accuracy by 7 native Mandarin judges (Experiment 2b). Results: Experiment 1 showed that 12 L1 speakers' fundamental frequency spectral data did not differ among the 3 conditions. In Experiment 2a, the conditions did not affect the production of 24 English speakers for the most part, but there was some evidence that hand gestures helped Tone 4. In Experiment 2b, native Mandarin judges found limited conditional differences in L2 productions, with Tone 3 showing a slight head nods benefit in a subset of "correct" L2 tokens. Conclusion: Results suggest that metaphoric bodily actions do not influence the lowest levels of L1 speech production in a tonal language and may play a very modest role during preliminary L2 learning.


Subjects
Gestures, Imitative Behavior/physiology, Learning/physiology, Multilingualism, Speech/physiology, Adolescent, Asian People/psychology, Female, Hand, Head, Humans, Language, Young Adult
5.
J Soc Psychol ; 157(4): 474-484, 2017.
Article in English | MEDLINE | ID: mdl-27684932

ABSTRACT

Judging others' power facilitates successful social interaction. Both gender and body posture have been shown to influence judgments of another's power. However, little is known about how these two cues interact when they conflict or how they influence early processing. The present study investigated this question during very early processing of power-related words using event-related potentials (ERPs). Participants viewed images of women and men in dominant and submissive postures that were quickly followed by dominant or submissive words. Gender and posture both modulated neural responses in the N2 latency range to dominant words, but for submissive words they had little impact. Thus, in the context of dual-processing theories of person perception, information extracted from both behavior (i.e., posture) and from category membership (i.e., gender) are recruited side-by-side to impact word processing.


Subjects
Evoked Potentials/physiology, Language, Posture, Power, Psychological, Social Dominance, Social Perception, Adolescent, Adult, Female, Humans, Male, Young Adult
6.
Soc Cogn Affect Neurosci ; 10(9): 1236-43, 2015 Sep.
Article in English | MEDLINE | ID: mdl-25688095

ABSTRACT

In face-to-face communication, speech is typically enriched by gestures. Clearly, not all people gesture in the same way, and the present study explores whether such individual differences in gesture style are taken into account during the perception of gestures that accompany speech. Participants were presented with one speaker who gestured in a straightforward way and another who also produced self-touch movements. Adding trials with such grooming movements makes the gesture information a much weaker cue compared with the gestures of the non-grooming speaker. The electroencephalogram was recorded as participants watched videos of the individual speakers. Event-related potentials elicited by the speech signal revealed that adding grooming movements attenuated the impact of gesture for this particular speaker. Thus, these data suggest that there is sensitivity to the personal communication style of a speaker and that this sensitivity affects the extent to which gesture and speech are integrated during language comprehension.


Subjects
Brain/physiology, Comprehension/physiology, Evoked Potentials/physiology, Gestures, Speech/physiology, Adult, Electroencephalography, Female, Humans, Language, Male, Speech Perception/physiology, Young Adult
7.
Psychon Bull Rev ; 22(2): 517-23, 2015 Apr.
Article in English | MEDLINE | ID: mdl-25002252

ABSTRACT

Hand gestures and speech form a single integrated system of meaning during language comprehension, but is gesture processed with speech in a unique fashion? We had subjects watch multimodal videos that presented auditory (words) and visual (gestures and actions on objects) information. Half of the subjects related the audio information to a written prime presented before the video, and the other half related the visual information to the written prime. For half of the multimodal video stimuli, the audio and visual information contents were congruent, and for the other half, they were incongruent. For all subjects, stimuli in which the gestures and actions were incongruent with the speech produced more errors and longer response times than did stimuli that were congruent, but this effect was less prominent for speech-action stimuli than for speech-gesture stimuli. However, subjects focusing on visual targets were more accurate when processing actions than gestures. These results suggest that although actions may be easier to process than gestures, gestures may be more tightly tied to the processing of accompanying speech.


Subjects
Comprehension, Gestures, Psychomotor Performance, Adolescent, Adult, Association, Female, Humans, Male, Nonverbal Communication, Reaction Time, Speech Perception, Young Adult
8.
Soc Cogn Affect Neurosci ; 10(2): 255-61, 2015 Feb.
Article in English | MEDLINE | ID: mdl-24652857

ABSTRACT

Recipients process information from speech and co-speech gestures, but it is currently unknown how this processing is influenced by the presence of other important social cues, especially gaze direction, a marker of communicative intent. Such cues may modulate neural activity in regions associated either with the processing of ostensive cues, such as eye gaze, or with the processing of semantic information, provided by speech and gesture. Participants were scanned (fMRI) while taking part in triadic communication involving two recipients and a speaker. The speaker uttered sentences that were and were not accompanied by complementary iconic gestures. Crucially, the speaker alternated her gaze direction, thus creating two recipient roles: addressed (direct gaze) vs unaddressed (averted gaze) recipient. The comprehension of Speech&Gesture relative to SpeechOnly utterances recruited middle occipital, middle temporal and inferior frontal gyri, bilaterally. The calcarine sulcus and posterior cingulate cortex were sensitive to differences between direct and averted gaze. Most importantly, Speech&Gesture utterances, but not SpeechOnly utterances, produced additional activity in the right middle temporal gyrus when participants were addressed. Marking communicative intent with gaze direction modulates the processing of speech-gesture utterances in cerebral areas typically associated with the semantic processing of multi-modal communicative acts.


Subjects
Fixation, Ocular/physiology, Gestures, Gyrus Cinguli/physiology, Brain Mapping, Communication, Comprehension, Cues, Female, Humans, Magnetic Resonance Imaging, Speech, Young Adult
9.
Cognition ; 133(3): 692-7, 2014 Dec.
Article in English | MEDLINE | ID: mdl-25255036

ABSTRACT

In human face-to-face communication, language comprehension is a multi-modal, situated activity. However, little is known about how we combine information from different modalities during comprehension, and how perceived communicative intentions, often signaled through visual signals, influence this process. We explored this question by simulating a multi-party communication context in which a speaker alternated her gaze between two recipients. Participants viewed speech-only or speech+gesture object-related messages when being addressed (direct gaze) or unaddressed (gaze averted to other participant). They were then asked to choose which of two object images matched the speaker's preceding message. Unaddressed recipients responded significantly more slowly than addressees for speech-only utterances. However, perceiving the same speech accompanied by gestures sped unaddressed recipients up to a level identical to that of addressees. That is, when unaddressed recipients' speech processing suffers, gestures can enhance the comprehension of a speaker's message. We discuss our findings with respect to two hypotheses attempting to account for how social eye gaze may modulate multi-modal language comprehension.


Subjects
Communication, Eye Movements/physiology, Gestures, Speech Perception/physiology, Speech/physiology, Comprehension, Female, Humans, Male, Reaction Time/physiology, Young Adult
10.
J Speech Lang Hear Res ; 57(6): 2090-101, 2014 Dec.
Article in English | MEDLINE | ID: mdl-25088127

ABSTRACT

PURPOSE: Research has shown that hand gestures affect comprehension and production of speech at semantic, syntactic, and pragmatic levels for both native language and second language (L2). This study investigated a relatively less explored question: Do hand gestures influence auditory learning of an L2 at the segmental phonology level? METHOD: To examine auditory learning of phonemic vowel length contrasts in Japanese, 88 native English-speaking participants took an auditory test before and after one of the following 4 types of training in which they (a) observed an instructor in a video speaking Japanese words while she made a syllabic-rhythm hand gesture, (b) produced this gesture with the instructor, (c) observed the instructor speaking those words and her moraic-rhythm hand gesture, or (d) produced the moraic-rhythm gesture with the instructor. RESULTS: All of the training types yielded similar auditory improvement in identifying vowel length contrast. However, observing the syllabic-rhythm hand gesture yielded the most balanced improvement between word-initial and word-final vowels and between slow and fast speaking rates. CONCLUSIONS: The overall effect of hand gesture on learning of segmental phonology is limited. Implications for theories of hand gesture are discussed in terms of the role it plays at different linguistic levels.


Subjects
Gestures, Multilingualism, Phonetics, Semantics, Speech Perception, Adolescent, Comprehension, Female, Hand, Humans, Language, Language Development, Learning, Male, Psycholinguistics, Young Adult
11.
Front Psychol ; 5: 673, 2014.
Article in English | MEDLINE | ID: mdl-25071646

ABSTRACT

Co-speech hand gestures are a type of multimodal input that has received relatively little attention in the context of second language learning. The present study explored the role that observing and producing different types of gestures plays in learning novel speech sounds and word meanings in an L2. Naïve English speakers were taught two components of Japanese-novel phonemic vowel length contrasts and vocabulary items composed of those contrasts-in one of four different gesture conditions: Syllable Observe, Syllable Produce, Mora Observe, and Mora Produce. Half of the gestures conveyed intuitive information about syllable structure, and the other half, unintuitive information about Japanese mora structure. Within each Syllable and Mora condition, half of the participants only observed the gestures that accompanied speech during training, and the other half also produced the gestures that they observed along with the speech. The main finding was that participants across all four conditions had similar outcomes in two different types of auditory identification tasks and a vocabulary test. The results suggest that hand gestures may not be well suited for learning novel phonetic distinctions at the syllable level within a word, and thus, gesture-speech integration may break down at the lowest levels of language processing and learning.

12.
PLoS One ; 7(8): e42620, 2012.
Article in English | MEDLINE | ID: mdl-22912715

ABSTRACT

Co-speech hand gestures influence language comprehension. The present experiment explored what part of the visual processing system is optimized for processing these gestures. Participants viewed short video clips of speech and gestures (e.g., a person saying "chop" or "twist" while making a chopping gesture) and had to determine whether the two modalities were congruent or incongruent. Gesture videos were designed to stimulate the parvocellular or magnocellular visual pathways by filtering out low or high spatial frequencies (HSF versus LSF) at two levels of degradation severity (moderate and severe). Participants were less accurate and slower at processing gesture and speech at severe versus moderate levels of degradation. In addition, they were slower for LSF versus HSF stimuli, and this difference was most pronounced in the severely degraded condition. However, exploratory item analyses showed that the HSF advantage was modulated by the range of motion and amount of motion energy in each video. The results suggest that hand gestures exploit a wide range of spatial frequencies, and depending on what frequencies carry the most motion energy, parvocellular or magnocellular visual pathways are maximized to quickly and optimally extract meaning.


Subjects
Gestures, Speech, Visual Perception, Female, Humans, Male, Movement, Photic Stimulation, Range of Motion, Articular, Reaction Time, Semantics, Videotape Recording
13.
Psychol Sci ; 21(2): 260-7, 2010 Feb.
Article in English | MEDLINE | ID: mdl-20424055

ABSTRACT

Gesture and speech are assumed to form an integrated system during language production. Based on this view, we propose the integrated-systems hypothesis, which explains two ways in which gesture and speech are integrated--through mutual and obligatory interactions--in language comprehension. Experiment 1 presented participants with action primes (e.g., someone chopping vegetables) and bimodal speech and gesture targets. Participants related primes to targets more quickly and accurately when they contained congruent information (speech: "chop"; gesture: chop) than when they contained incongruent information (speech: "chop"; gesture: twist). Moreover, the strength of the incongruence affected processing, with fewer errors for weak incongruities (speech: "chop"; gesture: cut) than for strong incongruities (speech: "chop"; gesture: twist). Crucial for the integrated-systems hypothesis, this influence was bidirectional. Experiment 2 demonstrated that gesture's influence on speech was obligatory. The results confirm the integrated-systems hypothesis and demonstrate that gesture and speech form an integrated system in language comprehension.


Subjects
Comprehension, Gestures, Speech, Attention, Female, Humans, Male, Semantics
14.
J Speech Lang Hear Res ; 53(2): 298-310, 2010 Apr.
Article in English | MEDLINE | ID: mdl-20220023

ABSTRACT

PURPOSE: Previous research has found that auditory training helps native English speakers to perceive phonemic vowel length contrasts in Japanese, but their performance did not reach native levels after training. Given that multimodal information, such as lip movement and hand gesture, influences many aspects of native language processing, the authors examined whether multimodal input helps to improve native English speakers' ability to perceive Japanese vowel length contrasts. METHOD: Sixty native English speakers participated in 1 of 4 types of training: (a) audio-only; (b) audio-mouth; (c) audio-hands; and (d) audio-mouth-hands. Before and after training, participants were given phoneme perception tests that measured their ability to identify short and long vowels in Japanese (e.g., /kato/ vs. /kato/). RESULTS: Although all 4 groups improved from pre- to posttest (replicating previous research), the participants in the audio-mouth condition improved more than those in the audio-only condition, whereas the 2 conditions involving hand gestures did not. CONCLUSIONS: Seeing lip movements during training significantly helps learners to perceive difficult second-language phonemic contrasts, but seeing hand gestures does not. The authors discuss possible benefits and limitations of using multimodal information in second-language phoneme learning.


Subjects
Auditory Perception, Hand, Learning, Lip, Multilingualism, Phonetics, Visual Perception, Analysis of Variance, Gestures, Humans, Language, Language Tests, Lipreading, Motor Activity, Psycholinguistics, Speech, Young Adult
15.
J Cogn Neurosci ; 22(4): 683-94, 2010 Apr.
Article in English | MEDLINE | ID: mdl-19413483

ABSTRACT

Previous research has demonstrated a link between language and action in the brain. The present study investigates the strength of this neural relationship by focusing on a potential interface between the two systems: co-speech iconic gesture. Participants performed a Stroop-like task in which they watched videos of a man and a woman speaking and gesturing about common actions. The videos differed as to whether the gender of the speaker and gesturer was the same or different and whether the content of the speech and gesture was congruent or incongruent. The task was to identify whether a man or a woman produced the spoken portion of the videos while accuracy rates, RTs, and ERPs were recorded to the words. Although not relevant to the task, participants paid attention to the semantic relationship between the speech and the gesture, producing a larger N400 to words accompanied by incongruent versus congruent gestures. In addition, RTs were slower to incongruent versus congruent gesture-speech stimuli, but this effect was greater when the gender of the gesturer and speaker was the same versus different. These results suggest that the integration of gesture and speech during language comprehension is automatic but also under some degree of neurocognitive control.


Subjects
Brain/physiology, Gestures, Speech Perception/physiology, Speech/physiology, Analysis of Variance, Brain Mapping, Electroencephalography/methods, Electronic Data Processing/methods, Evoked Potentials, Auditory/physiology, Female, Functional Laterality, Humans, Male, Photic Stimulation/methods, Psycholinguistics/methods, Reaction Time/physiology, Sex Factors, Young Adult
16.
Soc Neurosci ; 3(3-4): 434-42, 2008.
Article in English | MEDLINE | ID: mdl-18633830

ABSTRACT

The present study investigated whether emotional states influence the neural processing of language. Event-related potentials (ERPs) were recorded in response to positively and negatively valenced words (e.g., love vs. death) while participants were directly induced into positive and negative moods. ERPs at frontal scalp electrodes distinguished positive and negative words around 400 ms poststimulus. The amplitude of this negative waveform showed a larger negativity for positive words compared to negative words in the frontal electrode region when participants were in a positive, but not negative, mood. These findings build on previous research by demonstrating that people process affective language differently when in positive and negative moods, and lend support to recent views that emotion and cognition interact during language comprehension.


Subjects
Brain Mapping, Emotions/physiology, Evoked Potentials, Visual/physiology, Language, Analysis of Variance, Electroencephalography/methods, Female, Functional Laterality, Humans, Male, Photic Stimulation/methods, Reaction Time/physiology, Young Adult
17.
Brain Lang ; 101(3): 181-4, 2007 Jun.
Article in English | MEDLINE | ID: mdl-17433429
18.
Brain Lang ; 101(3): 222-33, 2007 Jun.
Article in English | MEDLINE | ID: mdl-16997367

ABSTRACT

The present study investigates whether knowledge about the intentional relationship between gesture and speech influences controlled processes when integrating the two modalities at comprehension. Thirty-five adults watched short videos of gesture and speech that conveyed semantically congruous and incongruous information. In half of the videos, participants were told that the two modalities were intentionally coupled (i.e., produced by the same communicator), and in the other half, they were told that the two modalities were not intentionally coupled (i.e., produced by different communicators). When participants knew that the same communicator produced the speech and gesture, there was a larger bi-lateral frontal and central N400 effect to words that were semantically incongruous versus congruous with gesture. However, when participants knew that different communicators produced the speech and gesture--that is, when gesture and speech were not intentionally meant to go together--the N400 effect was present only in right-hemisphere frontal regions. The results demonstrate that pragmatic knowledge about the intentional relationship between gesture and speech modulates controlled neural processes during the integration of the two modalities.


Subjects
Cerebral Cortex/physiology, Gestures, Intention, Speech Perception/physiology, Adult, Analysis of Variance, Brain Mapping, Comprehension/physiology, Dominance, Cerebral, Evoked Potentials, Female, Humans, Male, Psycholinguistics, Reaction Time
19.
J Learn Disabil ; 39(4): 352-63, 2006.
Article in English | MEDLINE | ID: mdl-16895159

ABSTRACT

Event-related potentials (ERPs) were recorded from 27 children (14 girls, 13 boys) who varied in their reading skill levels. Both behavior performance measures recorded during the ERP word classification task and the ERP responses themselves discriminated between children with above-average, average, and below-average reading skills. ERP amplitudes and peak latencies decreased as reading skills increased. Furthermore, hemisphere differences increased with higher reading skill levels. Sex differences were also related to ERP amplitude variations across the scalp. However, ERPs recorded from boys and girls did not differ as a function of differences in the children's reading levels.


Subjects
Aptitude, Brain/anatomy & histology, Reading, Brain/physiology, Child, Evoked Potentials/physiology, Female, Humans, Male
20.
Brain Lang ; 89(1): 253-60, 2004 Apr.
Article in English | MEDLINE | ID: mdl-15010257

ABSTRACT

The present study examined the neural correlates of speech and hand gesture comprehension in a naturalistic context. Fifteen participants watched audiovisual segments of speech and gesture while event-related potentials (ERPs) were recorded to the speech. Gesture influenced the ERPs to the speech. Specifically, there was a right-lateralized N400 effect-reflecting semantic integration-when gestures mismatched versus matched the speech. In addition, early sensory components in bilateral occipital and frontal sites differentiated speech accompanied by matching versus non-matching gestures. These results suggest that hand gestures may be integrated with speech at early and late stages of language processing.


Subjects
Attention/physiology, Comprehension/physiology, Electroencephalography, Gestures, Speech Perception/physiology, Visual Perception/physiology, Adult, Cerebral Cortex/physiology, Evoked Potentials/physiology, Female, Humans, Male, Psycholinguistics, Semantics