Results 1 - 19 of 19
1.
Ann N Y Acad Sci ; 2020 Dec 18.
Article in English | MEDLINE | ID: mdl-33336809

ABSTRACT

It is commonly understood that hand gesture and speech coordination in humans is culturally and cognitively acquired, rather than having a biological basis. Recently, however, the biomechanical physical coupling of arm movements to speech vocalization has been studied in steady-state vocalization and monosyllabic utterances, where forces produced during gesturing are transferred onto the tensioned body, leading to changes in respiratory-related activity and thereby affecting vocalization F0 and intensity. In the current experiment (n = 37), we extend this previous line of work to show that gesture-speech physics also impacts fluent speech. Compared with nonmovement, participants who are producing fluent self-formulated speech while rhythmically moving their limbs demonstrate heightened F0 and amplitude envelope, and such effects are more pronounced for higher-impulse arm versus lower-impulse wrist movement. We replicate that acoustic peaks arise especially during moments of peak impulse (i.e., the beat) of the movement, namely around deceleration phases of the movement. Finally, higher deceleration rates of higher-mass arm movements were related to higher peaks in acoustics. These results confirm a role for physical impulses of gesture affecting the speech system. We discuss the implications of gesture-speech physics for understanding of the emergence of communicative gesture, both ontogenetically and phylogenetically.
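The two acoustic measures tracked in this line of work, F0 and the amplitude envelope, can be estimated with standard signal-processing steps. A minimal sketch follows (not the authors' pipeline; the 200 Hz synthetic "voice", the modulation rate, and the window sizes are illustrative assumptions):

```python
import numpy as np

def amplitude_envelope(signal, sr, win_ms=25):
    """Rectify-and-smooth amplitude envelope (win_ms is an assumed window)."""
    win = max(1, int(sr * win_ms / 1000))
    kernel = np.ones(win) / win
    return np.convolve(np.abs(signal), kernel, mode="same")

def estimate_f0(signal, sr, fmin=75, fmax=400):
    """Autocorrelation-based F0 estimate for a single voiced stretch."""
    sig = signal - signal.mean()
    ac = np.correlate(sig, sig, mode="full")[len(sig) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)  # search plausible pitch lags
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sr / lag

# Synthetic "voice": a 200 Hz carrier with a slow amplitude modulation.
sr = 16000
t = np.arange(sr) / sr                       # 1 s of audio
audio = np.sin(2 * np.pi * 200 * t) * (0.5 * (1 + np.sin(2 * np.pi * 2 * t)))

print(estimate_f0(audio, sr))                # ≈ 200 Hz
```

On real speech, F0 would be tracked frame by frame over voiced segments only; the sketch just shows what the two measures quantify.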

2.
Perception ; 49(9): 905-925, 2020 Sep.
Article in English | MEDLINE | ID: mdl-33002391

ABSTRACT

Most objects have well-defined affordances. Investigating perception of affordances of objects that were not created for a specific purpose would provide insight into how affordances are perceived. In addition, comparison of perception of affordances for such objects across different exploratory modalities (visual vs. haptic) would offer a strong test of the lawfulness of information about affordances (i.e., the invariance of such information over transformation). Along these lines, "feelies" (objects created by Gibson with no obvious function and unlike any common object) could shed light on the processes underlying affordance perception. This study showed that when observers reported potential uses for feelies, modality significantly influenced what kind of affordances were perceived. Specifically, visual exploration resulted in more noun labels (e.g., "toy") than haptic exploration, which resulted in more verb labels (e.g., "throw"). These results suggested that overlapping but distinct classes of action possibilities are perceivable using vision and haptics. Semantic network analyses revealed that visual exploration resulted in object-oriented responses focused on object identification, whereas haptic exploration resulted in action-oriented responses. Cluster analyses confirmed these results. Affordance labels produced in the visual condition were more consistent, used fewer descriptors, and were less diverse but more novel than those in the haptic condition.

3.
J Acoust Soc Am ; 148(3): 1231, 2020 Sep.
Article in English | MEDLINE | ID: mdl-33003900

ABSTRACT

Expressive moments in communicative hand gestures often align with emphatic stress in speech. It has recently been found that acoustic markers of emphatic stress arise naturally during steady-state phonation when upper-limb movements impart physical impulses on the body, most likely affecting acoustics via respiratory activity. In this confirmatory study, participants (N = 29) repeatedly uttered consonant-vowel (/pa/) mono-syllables while moving in particular phase relations with speech, or not moving the upper limbs. This study shows that respiration-related activity is affected by (especially high-impulse) gesturing when vocalizations occur near peaks in physical impulse. This study further shows that gesture-induced moments of bodily impulses increase the amplitude envelope of speech, while not similarly affecting the Fundamental Frequency (F0). Finally, tight relations between respiration-related activity and vocalization were observed, even in the absence of movement, but even more so when upper-limb movement is present. The current findings expand a developing line of research showing that speech is modulated by functional biomechanical linkages between hand gestures and the respiratory system. This identification of gesture-speech biomechanics promises to provide an alternative phylogenetic, ontogenetic, and mechanistic explanatory route of why communicative upper limb movements co-occur with speech in humans.

4.
Cogn Sci ; 44(9): e12889, 2020 Sep.
Article in English | MEDLINE | ID: mdl-32893407

ABSTRACT

Speakers often use gesture to demonstrate how to perform actions: for example, they might show how to open the top of a jar by making a twisting motion above the jar. Yet it is unclear whether listeners learn as much from seeing such gestures as they learn from seeing actions that physically change the position of objects (i.e., actually opening the jar). Here, we examined participants' implicit and explicit understanding about a series of movements that demonstrated how to move a set of objects. The movements were either shown with actions that physically relocated each object or with gestures that represented the relocation without touching the objects. Further, the end location that was indicated for each object covaried with whether the object was grasped with one or two hands. We found that memory for the end location of each object was better after seeing the physical relocation of the objects, that is, after seeing action, than after seeing gesture, regardless of whether speech was absent (Experiment 1) or present (Experiment 2). However, gesture and action built similar implicit understanding of how a particular handgrasp corresponded with a particular end location. Although gestures miss the benefit of showing the end state of objects that have been acted upon, the data show that gestures are as good as action in building knowledge of how to perform an action.

6.
Proc Natl Acad Sci U S A ; 117(21): 11364-11367, 2020 05 26.
Article in English | MEDLINE | ID: mdl-32393618

ABSTRACT

We show that the human voice has complex acoustic qualities that are directly coupled to peripheral musculoskeletal tensioning of the body, such as subtle wrist movements. In this study, human vocalizers produced a steady-state vocalization while rhythmically moving the wrist or the arm at different tempos. Although listeners could only hear and not see the vocalizer, they were able to completely synchronize their own rhythmic wrist or arm movement with the movement of the vocalizer which they perceived in the voice acoustics. This study corroborates recent evidence suggesting that the human voice is constrained by bodily tensioning affecting the respiratory-vocal system. The current results show that the human voice contains a bodily imprint that is directly informative for the interpersonal perception of another's dynamic physical states.


Subjects
Upper Extremity/physiology, Voice/physiology, Adult, Auditory Perception, Female, Hearing/physiology, Humans, Male, Motor Activity/physiology, Nontherapeutic Human Experimentation, Wrist/physiology
7.
J Exp Psychol Gen ; 149(2): 391-404, 2020 Feb.
Article in English | MEDLINE | ID: mdl-31368760

ABSTRACT

The phenomenon of gesture-speech synchrony involves tight coupling of prosodic contrasts in gesture movement (e.g., peak velocity) and speech (e.g., peaks in fundamental frequency; F0). Gesture-speech synchrony has been understood as completely governed by sophisticated neural-cognitive mechanisms. However, gesture-speech synchrony may have its original basis in the resonating forces that travel through the body. In the current preregistered study, movements with high physical impact affected phonation in line with gesture-speech synchrony as observed in natural contexts. Rhythmic beating of the arms entrained phonation acoustics (F0 and the amplitude envelope). Such effects were absent for a condition with low-impetus movements (wrist movements) and a condition without movement. Further, movement-phonation synchrony was more pronounced when participants were standing as opposed to sitting, indicating a mediating role for postural stability. We conclude that gesture-speech synchrony has a biomechanical basis, which will have implications for our cognitive, ontogenetic, and phylogenetic understanding of multimodal language. (PsycINFO Database Record (c) 2020 APA, all rights reserved).


Subjects
Gestures, Speech/physiology, Adult, Biomechanical Phenomena/physiology, Female, Humans, Male, Physics, Young Adult
8.
Psychol Res ; 84(4): 966-980, 2020 Jun.
Article in English | MEDLINE | ID: mdl-30552506

ABSTRACT

Co-speech gestures have been proposed to strengthen sensorimotor knowledge related to objects' weight and manipulability. This pre-registered study (https://www.osf.io/9uh6q/) was designed to explore how gestures affect memory for sensorimotor information through the application of the visual-haptic size-weight illusion (i.e., objects weigh the same, but are experienced as different in weight). With this paradigm, a discrepancy can be induced between participants' conscious illusory perception of objects' weight and their implicit sensorimotor knowledge (i.e., veridical motor coordination). Depending on whether gestures reflect and strengthen either of these types of knowledge, gestures may respectively decrease or increase the magnitude of the size-weight illusion. Participants (N = 159) practiced a problem-solving task with small and large objects that were designed to induce a size-weight illusion, and then explained the task with or without co-speech gesture or completed a control task. Afterwards, participants judged the heaviness of objects from memory and then while holding them. Confirmatory analyses revealed an inverted size-weight illusion based on heaviness judgments from memory and we found gesturing did not affect judgments. However, exploratory analyses showed reliable correlations between participants' heaviness judgments from memory and (a) the number of gestures produced that simulated actions, and (b) the kinematics of the lifting phases of those gestures. These findings suggest that gestures emerge as sensorimotor imaginings that are governed by the agent's conscious renderings about the actions they describe, rather than implicit motor routines.


Subjects
Gestures, Illusions/psychology, Weight Perception, Adolescent, Adult, Female, Humans, Judgment, Male, Memory, Problem Solving, Size Perception, Young Adult
9.
Psychol Res ; 84(2): 502-513, 2020 Mar.
Article in English | MEDLINE | ID: mdl-30066133

ABSTRACT

During silent problem solving, hand gestures arise that have no communicative intent. The role of such co-thought gestures in cognition has been understudied in cognitive research as compared to co-speech gestures. We investigated whether gesticulation during silent problem solving supported subsequent performance in a Tower of Hanoi problem-solving task, in relation to visual working-memory capacity and task complexity. Seventy-six participants were assigned to either an instructed gesture condition or a condition that allowed them to gesture, but without explicit instructions to do so. This resulted in three gesture groups: (1) non-gesturing; (2) spontaneous gesturing; (3) instructed gesturing. In line with the embedded/extended cognition perspective on gesture, gesturing benefited complex problem-solving performance for participants with a lower visual working-memory capacity, but not for participants with a lower spatial working-memory capacity.


Subjects
Gestures, Memory, Short-Term, Problem Solving, Spatial Memory, Adolescent, Adult, Cognition, Female, Humans, Male, Young Adult
10.
Behav Res Methods ; 52(2): 723-740, 2020 04.
Article in English | MEDLINE | ID: mdl-31659689

ABSTRACT

There is increasing evidence that hand gestures and speech synchronize their activity on multiple dimensions and timescales. For example, gesture's kinematic peaks (e.g., maximum speed) are coupled with prosodic markers in speech. Such coupling operates on very short timescales at the level of syllables (200 ms), and therefore requires high-resolution measurement of gesture kinematics and speech acoustics. High-resolution speech analysis is common for gesture studies, given that field's classic ties with (psycho)linguistics. However, the field has lagged behind in the objective study of gesture kinematics (e.g., as compared to research on instrumental action). Often kinematic peaks in gesture are measured by eye, where a "moment of maximum effort" is determined by several raters. In the present article, we provide a tutorial on more efficient methods to quantify the temporal properties of gesture kinematics, in which we focus on common challenges and possible solutions that come with the complexities of studying multimodal language. We further introduce and compare, using an actual gesture dataset (392 gesture events), the performance of two video-based motion-tracking methods (deep learning vs. pixel change) against a high-performance wired motion-tracking system (Polhemus Liberty). We show that the videography methods perform well in the temporal estimation of kinematic peaks, and thus provide a cheap alternative to expensive motion-tracking systems. We hope that the present article incites gesture researchers to embark on the widespread objective study of gesture kinematics and their relation to speech.
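The pixel-change method compared in this tutorial reduces to frame-to-frame differencing: summed absolute grayscale change yields a one-dimensional movement series whose peaks can serve as kinematic landmarks. A minimal sketch on synthetic frames (the frame data and sizes are made up for illustration; the paper's actual implementation may differ):

```python
import numpy as np

def pixel_change_series(frames):
    """Sum of absolute grayscale differences between successive frames,
    a cheap proxy for quantity of movement over time."""
    frames = np.asarray(frames, dtype=float)
    diffs = np.abs(np.diff(frames, axis=0))
    return diffs.reshape(len(diffs), -1).sum(axis=1)

# Synthetic video: a bright square that jumps between frames 5 and 6.
frames = np.zeros((10, 48, 48))
for i in range(10):
    x = 5 if i < 6 else 20               # the square moves exactly once
    frames[i, x:x + 8, x:x + 8] = 1.0

motion = pixel_change_series(frames)
peak = int(np.argmax(motion))            # index of the kinematic "peak"
print(peak)                              # 5: the only frame-to-frame change
```

On real video, the series would typically be smoothed before peak-picking, and region-of-interest masking would isolate the gesturing hand.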

11.
Acta Psychol (Amst) ; 197: 131-142, 2019 Jun.
Article in English | MEDLINE | ID: mdl-31146090

ABSTRACT

In two experiments, we examined the role of gesture in reinterpreting a mental image. In Experiment 1, we found that participants gestured more about a figure they had learned through manual exploration than about a figure they had learned through vision. This supports claims that gestures emerge from the activation of perception-relevant actions during mental imagery. In Experiment 2, we investigated whether such gestures have a causal role in affecting the quality of mental imagery. Participants were randomly assigned to gesture, not gesture, or engage in a manual interference task as they attempted to reinterpret a figure they had learned through manual exploration. We found that manual interference significantly impaired participants' success on the task. Taken together, these results suggest that gestures reflect mental imaginings of interactions with a mental image and that these imaginings are critically important for mental manipulation and reinterpretation of that image. However, our results suggest that enacting the imagined movements in gesture is not critically important on this particular task.


Subjects
Gestures, Imagination/physiology, Movement/physiology, Touch/physiology, Adolescent, Adult, Female, Humans, Learning/physiology, Male, Random Allocation, Young Adult
12.
J Exp Psychol Gen ; 148(11): 2058-2075, 2019 Nov.
Article in English | MEDLINE | ID: mdl-30973250

ABSTRACT

The split-attention effect entails that learning from spatially separated, but mutually referring information sources (e.g., text and picture), is less effective than learning from the equivalent spatially integrated sources. According to cognitive load theory, impaired learning is caused by the working memory load imposed by the need to distribute attention between the information sources and mentally integrate them. In this study, we directly tested whether the split-attention effect is caused by spatial separation per se. Spatial distance was varied in basic cognitive tasks involving pictures (Experiment 1) and text-picture combinations (Experiment 2; preregistered study), and in more ecologically valid learning materials (Experiment 3). Experiment 1 showed that having to integrate two pictorial stimuli at greater distances diminished performance on a secondary visual working memory task, but did not lead to slower integration. When participants had to integrate a picture and written text in Experiment 2, a greater distance led to slower integration of the stimuli, but not to diminished performance on the secondary task. Experiment 3 showed that presenting spatially separated (compared with integrated) textual and pictorial information yielded fewer integrative eye movements, but this was not further exacerbated when increasing spatial distance even further. This effect on learning processes did not lead to differences in learning outcomes between conditions. In conclusion, we provide evidence that larger distances between spatially separated information sources influence learning processes, but that spatial separation on its own is not likely to be the only, nor a sufficient, condition for impacting learning outcomes. (PsycINFO Database Record (c) 2019 APA, all rights reserved).


Subjects
Attention/physiology, Cognition/physiology, Memory, Short-Term/physiology, Adult, Eye Movements/physiology, Female, Humans, Male, Young Adult
13.
Cogn Sci ; 43(3): e12721, 2019 03.
Article in English | MEDLINE | ID: mdl-30900288

ABSTRACT

Gesture-speech synchrony re-stabilizes when hand movement or speech is disrupted by a delayed feedback manipulation, suggesting strong bidirectional coupling between gesture and speech. Yet it has also been argued from case studies in perceptual-motor pathology that hand gestures are a special kind of action that does not require closed-loop re-afferent feedback to maintain synchrony with speech. In the current pre-registered within-subject study, we used motion tracking to conceptually replicate McNeill's classic study on gesture-speech synchrony under normal and 150 ms delayed auditory feedback of speech conditions (NO DAF vs. DAF). Consistent with, and extending, McNeill's original results, we obtain evidence that (a) gesture-speech synchrony is more stable under DAF versus NO DAF (i.e., increased coupling effect), (b) that gesture and speech variably entrain to the external auditory delay as indicated by a consistent shift in gesture-speech synchrony offsets (i.e., entrainment effect), and (c) that the coupling effect and the entrainment effect are co-dependent. We suggest, therefore, that gesture-speech synchrony provides a way for the cognitive system to stabilize rhythmic activity under interfering conditions.
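The synchrony offsets at issue here are essentially lag estimates between two time series. One standard way to obtain such an offset (a sketch, not necessarily the authors' method; the 100 Hz sampling rate and Gaussian peaks are invented for illustration) is the peak of the cross-correlation between a gesture kinematics series and a speech prosody series:

```python
import numpy as np

def estimate_lag(speech, gesture, sr):
    """Offset in seconds at which gesture best aligns with speech;
    positive values mean the gesture peak precedes the speech peak."""
    s = speech - speech.mean()
    g = gesture - gesture.mean()
    xc = np.correlate(s, g, mode="full")
    return (int(np.argmax(xc)) - (len(g) - 1)) / sr

# Hypothetical 100 Hz series: a gesture velocity peak at 1.00 s
# and a speech F0 peak 150 ms later.
sr = 100
t = np.arange(300) / sr
gesture = np.exp(-((t - 1.00) ** 2) / 0.01)
speech = np.exp(-((t - 1.15) ** 2) / 0.01)

print(estimate_lag(speech, gesture, sr))   # 0.15: gesture leads by 150 ms
```

A shift in this estimate between the NO DAF and DAF conditions would correspond to the entrainment effect the abstract describes.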


Subjects
Auditory Perception/physiology, Feedback, Sensory/physiology, Gestures, Speech, Adolescent, Biomechanical Phenomena/physiology, Female, Humans, Male, Time Factors, Young Adult
14.
Psychol Res ; 83(6): 1237-1250, 2019 Sep.
Article in English | MEDLINE | ID: mdl-29242975

ABSTRACT

Is visual reinterpretation of bistable figures (e.g., the duck/rabbit figure) in visual imagery possible? Current consensus suggests that it is in principle possible because of converging evidence of quasi-pictorial functioning of visual imagery. Yet, studies that have directly tested and found evidence for reinterpretation in visual imagery allow for the possibility that reinterpretation was already achieved during memorization of the figure(s). One study resolved this issue, providing evidence for reinterpretation in visual imagery (Mast and Kosslyn, Cognition 86:57-70, 2002). However, participants in that study performed reinterpretations with the aid of visual cues. Hence, reinterpretation was not performed with mental imagery alone. Therefore, in this study we assessed the possibility of reinterpretation without visual support. We further explored the possible role of haptic cues to assess the multimodal nature of mental imagery. Fifty-three participants were consecutively presented three to-be-remembered bistable 2-D figures (reinterpretable when rotated 180°), two of which were visually inspected and one was explored haptically. After memorization of the figures, a visually bistable exemplar figure was presented to ensure understanding of the concept of visual bistability. During recall, 11 participants (out of 36; 30.6%) who did not spot bistability during memorization successfully performed reinterpretations when instructed to mentally rotate their visual image, but additional haptic cues during mental imagery did not inflate reinterpretation ability. This study validates previous findings that reinterpretation in visual imagery is possible.


Subjects
Cues, Imagination/physiology, Mental Recall/physiology, Visual Perception/physiology, Adolescent, Adult, Animals, Female, Humans, Male, Rabbits, Young Adult
15.
Behav Brain Sci ; 40: e68, 2017 01.
Article in English | MEDLINE | ID: mdl-29342524

ABSTRACT

We observe a tension in the target article as it stresses an integrated gesture-speech system that can nevertheless consist of contradictory representational states, which are reflected by mismatches in gesture and speech or sign. Beyond problems of coherence, this prevents furthering our understanding of gesture-related learning. As a possible antidote, we invite a more dynamically embodied perspective to the stage.


Subjects
Gestures, Sign Language, Comprehension, Humans, Language, Speech
16.
Front Psychol ; 7: 860, 2016.
Article in English | MEDLINE | ID: mdl-27375538

ABSTRACT

We investigated whether augmenting instructional animations with a body analogy (BA) would improve 10- to 13-year-old children's learning about class-1 levers. Children with a lower level of general math skill who learned with an instructional animation that provided a BA of the physical system, showed higher accuracy on a lever problem-solving reaction time task than children studying the instructional animation without this BA. Additionally, learning with a BA led to a higher speed-accuracy trade-off during the transfer task for children with a lower math skill, which provided additional evidence that especially this group is likely to be affected by learning with a BA. However, overall accuracy and solving speed on the transfer task was not affected by learning with or without this BA. These results suggest that providing children with a BA during animation study provides a stepping-stone for understanding mechanical principles of a physical system, which may prove useful for instructional designers. Yet, because the BA does not seem effective for all children, nor for all tasks, the degree of effectiveness of body analogies should be studied further. Future research, we conclude, should be more sensitive to the necessary degree of analogous mapping between the body and physical systems, and whether this mapping is effective for reasoning about more complex instantiations of such physical systems.

17.
Cogn Process ; 17(3): 269-77, 2016 Aug.
Article in English | MEDLINE | ID: mdl-26993293

ABSTRACT

Non-communicative hand gestures have been found to benefit problem-solving performance. These gestures seem to compensate for limited internal cognitive capacities, such as visual working memory capacity. Yet, it is not clear how gestures might perform this cognitive function. One hypothesis is that gesturing is a means to spatially index mental simulations, thereby reducing the need for visually projecting the mental simulation onto the visual presentation of the task. If that hypothesis is correct, fewer eye movements should be made when participants gesture during problem solving than when they do not gesture. We therefore used mobile eye tracking to investigate the effect of co-thought gesturing and visual working memory capacity on eye movements during mental solving of the Tower of Hanoi problem. Results revealed that gesturing indeed reduced the number of eye movements (lower saccade counts), especially for participants with a relatively lower visual working memory capacity. Subsequent problem-solving performance was not affected by having (not) gestured during the mental solving phase. The current findings suggest that our understanding of gestures in problem solving could be improved by taking into account eye movements during gesturing.


Subjects
Eye Movements/physiology, Gestures, Memory, Short-Term/physiology, Pattern Recognition, Visual/physiology, Problem Solving/physiology, Thinking/physiology, Adult, Cues, Female, Humans, Male, Middle Aged, Neuropsychological Tests, Photic Stimulation, Young Adult
18.
Front Psychol ; 5: 359, 2014.
Article in English | MEDLINE | ID: mdl-24795687

ABSTRACT

Gestures are often considered to be demonstrative of the embodied nature of the mind (Hostetter and Alibali, 2008). In this article, we review current theories and research targeted at the intra-cognitive role of gestures. We ask: how can gestures support the internal cognitive processes of the gesturer? We suggest that extant theories are in a sense disembodied, because they focus solely on embodiment in terms of the sensorimotor neural precursors of gestures. As a result, current theories on the intra-cognitive role of gestures are lacking in explanatory scope to address how gestures-as-bodily-acts fulfill a cognitive function. On the basis of recent theoretical appeals that focus on the possibly embedded/extended cognitive role of gestures (Clark, 2013), we suggest that gestures are external physical tools of the cognitive system that replace and support otherwise solely internal cognitive processes. That is, gestures provide the cognitive system with a stable external physical and visual presence that can provide means to think with. We show that there is considerable overlap between the way the human cognitive system has been found to use its environment and how gestures are used during cognitive processes. Lastly, we provide several suggestions of how to investigate the embedded/extended perspective of the cognitive function of gestures.

19.
Acta Psychol (Amst) ; 140(3): 283-8, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22717422

ABSTRACT

Being ostracized or excluded, even briefly and by strangers, is painful and threatens fundamental needs. Recent work by Zhong and Leonardelli (2008) found that excluded individuals perceive the room as cooler and that they desire warmer drinks. A perspective that many rely on in embodiment is the theoretical idea that people use metaphorical associations to understand social exclusion (see Landau, Meier, & Keefer, 2010). We suggest that people feel colder because they are colder. The results strongly support the idea that more complex metaphorical understandings of social relations are scaffolded onto literal changes in bodily temperature: Being excluded in an online ball tossing game leads to lower finger temperatures (Study 1), while the negative affect typically experienced after such social exclusion is alleviated after holding a cup of warm tea (Study 2). The authors discuss further implications for the interaction between body and social relations specifically, and for basic and cognitive systems in general.


Subjects
Cold Temperature, Interpersonal Relations, Loneliness, Metaphor, Skin Temperature, Female, Humans, Male, Young Adult