Eye Movements During Visual Speech Perception in Deaf and Hearing Children.
Worster, Elizabeth; Pimperton, Hannah; Ralph-Lewis, Amelia; Monroy, Laura; Hulme, Charles; MacSweeney, Mairéad.
Affiliation
  • Worster E; Institute of Cognitive Neuroscience University College London.
  • Pimperton H; Institute of Cognitive Neuroscience University College London.
  • Ralph-Lewis A; Deafness, Cognition, and Language Research Centre University College London.
  • Monroy L; Institute of Cognitive Neuroscience University College London.
  • Hulme C; University of Oxford.
  • MacSweeney M; Institute of Cognitive Neuroscience University College London.
Lang Learn ; 68(Suppl Suppl 1): 159-179, 2018 Jun.
Article in English | MEDLINE | ID: mdl-29937576
ABSTRACT
For children who are born deaf, lipreading (speechreading) is an important source of access to spoken language. We used eye tracking to investigate the strategies used by deaf (n = 33) and hearing 5-8-year-olds (n = 59) during a sentence speechreading task. The proportion of time spent looking at the mouth during speech correlated positively with speechreading accuracy. In addition, all children showed a tendency to watch the mouth during speech and watch the eyes when the model was not speaking. The extent to which the children used this communicative pattern, which we refer to as social-tuning, positively predicted their speechreading performance, with the deaf children showing a stronger relationship than the hearing children. These data suggest that better speechreading skills are seen in those children, both deaf and hearing, who are able to guide their visual attention to the appropriate part of the image and in those who have a good understanding of conversational turn-taking.
Full text: 1 | Database: MEDLINE | Study type: Prognostic studies | Language: English | Year of publication: 2018 | Document type: Article