ABSTRACT
It is well established that linguistic processing is primarily a left-hemisphere activity, whereas emotional prosody processing is lateralized to the right hemisphere. Does attention directed at different regions of a talker's face reflect this pattern of lateralization? We investigated visuospatial attention across a talker's face with a dual-task paradigm combining dot detection and language comprehension measures. A static image of a talker was shown while participants listened to speeches delivered in two prosodic formats, emotional or neutral. On half of the trials, a single dot was superimposed on the talker's face in one of four facial regions. Dot-detection effects depended on the emotion condition: in the neutral condition, discriminability was greater for the right side of the face image than for the left, and at the mouth region compared with the eye region; the opposite pattern occurred in the emotional prosody condition. The results support a model in which the visuospatial attention deployed during language comprehension is directed by the left hemisphere when prosody is emotionally neutral, and by the right hemisphere when prosodic cues are emotional and, as here, primarily negative.