Cerebral response to emotional working memory based on vocal cues: an fNIRS study.
Ohshima, Saori; Koeda, Michihiko; Kawai, Wakana; Saito, Hikaru; Niioka, Kiyomitsu; Okuno, Koki; Naganawa, Sho; Hama, Tomoko; Kyutoku, Yasushi; Dan, Ippeita.
Affiliation
  • Ohshima S; Applied Cognitive Neuroscience Laboratory, Faculty of Science and Engineering, Chuo University, Bunkyo, Japan.
  • Koeda M; Department of Neuropsychiatry, Graduate School of Medicine, Nippon Medical School, Bunkyo, Japan.
  • Kawai W; Department of Mental Health, Nippon Medical School Tama Nagayama Hospital, Tama, Japan.
  • Saito H; Applied Cognitive Neuroscience Laboratory, Faculty of Science and Engineering, Chuo University, Bunkyo, Japan.
  • Niioka K; Applied Cognitive Neuroscience Laboratory, Faculty of Science and Engineering, Chuo University, Bunkyo, Japan.
  • Okuno K; Applied Cognitive Neuroscience Laboratory, Faculty of Science and Engineering, Chuo University, Bunkyo, Japan.
  • Naganawa S; Applied Cognitive Neuroscience Laboratory, Faculty of Science and Engineering, Chuo University, Bunkyo, Japan.
  • Hama T; Applied Cognitive Neuroscience Laboratory, Faculty of Science and Engineering, Chuo University, Bunkyo, Japan.
  • Kyutoku Y; Department of Medical Technology, Ehime Prefectural University of Health Sciences, Iyo-gun, Japan.
  • Dan I; Department of Clinical Laboratory Medicine, Faculty of Health Science Technology, Bunkyo Gakuin University, Tokyo, Japan.
Front Hum Neurosci; 17: 1160392, 2023.
Article in En | MEDLINE | ID: mdl-38222093
ABSTRACT

Introduction:

Humans mainly utilize visual and auditory information as cues to infer others' emotions. Previous neuroimaging studies have shown the neural basis of memory processing based on facial expressions, but few studies have examined memory processing based on vocal cues. Thus, we aimed to investigate the brain regions associated with emotional judgment based on vocal cues using an N-back task paradigm.

Methods:

Thirty participants performed N-back tasks requiring them to judge emotion or gender from voices that contained both emotion and gender information. During these tasks, cerebral hemodynamic response was measured using functional near-infrared spectroscopy (fNIRS).
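The core rule of an N-back task is that the participant judges whether the current stimulus matches the one presented N trials earlier on the attended dimension (here, emotion or gender). A minimal sketch of that matching rule, using hypothetical string labels in place of the actual voice recordings:

```python
# Minimal sketch of the 2-back matching rule used in the task design.
# The labels below are hypothetical placeholders; the study used voice
# stimuli carrying both emotion and gender information.

def nback_targets(labels, n=2):
    """Return True for each trial whose label matches the label n trials back."""
    return [i >= n and labels[i] == labels[i - n] for i in range(len(labels))]

emotion_sequence = ["happy", "angry", "happy", "angry", "angry"]
print(nback_targets(emotion_sequence))  # [False, False, True, True, False]
```

Note that the same stimulus sequence yields different target trials depending on which dimension (emotion vs. gender) the participant is instructed to attend to, which is what lets the contrast isolate emotion-specific processing.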

Results:

The results revealed that during the Emotion 2-back task there was significant activation in the frontal area, including the right precentral and inferior frontal gyri, possibly reflecting the function of an attentional network with auditory top-down processing. In addition, there was significant activation in the ventrolateral prefrontal cortex, which is known to be a major part of the working memory center.

Discussion:

These results suggest that, compared to judging the gender of voice stimuli, judging their emotional content engages attention more deeply and places greater demands on higher-order cognition, including working memory. We have revealed for the first time the specific neural basis for emotional judgments based on vocal cues, as contrasted with gender judgments based on the same cues.
Keywords

Full text: 1 Database: MEDLINE Language: En Publication year: 2023 Document type: Article