More Than Words: the Relative Roles of Prosody and Semantics in the Perception of Emotions in Spoken Language by Postlingual Cochlear Implant Users.
Taitelbaum-Swead, Riki; Icht, Michal; Ben-David, Boaz M.
Affiliations
  • Taitelbaum-Swead R; Department of Communication Disorders, Ariel University, Israel.
  • Icht M; Meuhedet Health Services, Tel Aviv, Israel.
  • Ben-David BM; Department of Communication Disorders, Ariel University, Israel.
Ear Hear ; 43(4): 1378-1389, 2022.
Article in En | MEDLINE | ID: mdl-35030551
ABSTRACT

OBJECTIVES:

The processing of emotional speech calls for the perception and integration of semantic and prosodic cues. Although cochlear implants allow for significant auditory improvements, they are limited in the transmission of spectro-temporal fine-structure information that may not support the processing of voice pitch cues. The goal of the current study is to compare the performance of postlingual cochlear implant (CI) users and a matched control group on perception, selective attention, and integration of emotional semantics and prosody.

DESIGN:

Fifteen CI users and 15 normal-hearing (NH) peers (age range, 18-65 years) listened to spoken sentences composed of different combinations of four discrete emotions (anger, happiness, sadness, and neutrality) presented in the prosodic and semantic channels (T-RES: Test for Rating Emotions in Speech). In three separate tasks, listeners were asked either to attend to the sentence as a whole, thus integrating both speech channels (integration), or to focus on one channel only (rating of target emotion) and ignore the other (selective attention). Their task was to rate how much they agreed that the sentence conveyed each of the predefined emotions. In addition, all participants performed standard tests of speech perception.

RESULTS:

When asked to focus on one channel, semantics or prosody, both groups rated emotions similarly, with comparable levels of selective attention. When the task called for channel integration, group differences were found: CI users appeared to rely on semantic emotional information more than their NH peers did. CI users also assigned higher ratings than their NH peers to sentences that did not present the target emotion, indicating some degree of confusion. In addition, for CI users, individual differences in speech comprehension over the phone and in identification of intonation were significantly related to emotional semantic and prosodic ratings, respectively.

CONCLUSIONS:

CI users and NH controls did not differ in the perception of prosodic and semantic emotions or in auditory selective attention. However, when the task called for integration of prosody and semantics, CI users overused the semantic information (as compared with NH listeners). We suggest that as CI users adopt diverse cue-weighting strategies with device experience, their weighting of prosody and semantics comes to differ from that of NH listeners. Finally, CI users may benefit from rehabilitation strategies that strengthen the perception of prosodic information, to better understand emotional speech.
Subjects

Full text: 1 Database: MEDLINE Main subject: Speech Perception / Cochlear Implants / Cochlear Implantation Limits: Adolescent / Adult / Aged / Humans / Middle aged Language: En Publication year: 2022 Document type: Article