ReCANVo: A database of real-world communicative and affective nonverbal vocalizations.
Johnson, Kristina T; Narain, Jaya; Quatieri, Thomas; Maes, Pattie; Picard, Rosalind W.
Affiliation
  • Johnson KT; Massachusetts Institute of Technology, MIT Media Lab, Cambridge, MA, USA. ktj@mit.edu.
  • Narain J; Massachusetts Institute of Technology, MIT Media Lab, Cambridge, MA, USA. jnarain@alum.mit.edu.
  • Quatieri T; Massachusetts Institute of Technology, Lincoln Laboratory, Lexington, MA, USA.
  • Maes P; Massachusetts Institute of Technology, MIT Media Lab, Cambridge, MA, USA.
  • Picard RW; Massachusetts Institute of Technology, MIT Media Lab, Cambridge, MA, USA.
Sci Data; 10(1): 523, 2023 Aug 05.
Article in English | MEDLINE | ID: mdl-37543663
ABSTRACT
Nonverbal vocalizations, such as sighs, grunts, and yells, are informative expressions within typical verbal speech. Likewise, individuals who produce 0-10 spoken words or word approximations ("minimally speaking" individuals) convey rich affective and communicative information through nonverbal vocalizations, even without verbal speech. Yet, despite their rich content, little to no data exist on the vocal expressions of this population. Here, we present ReCANVo (Real-World Communicative and Affective Nonverbal Vocalizations), a novel dataset of non-speech vocalizations from minimally speaking individuals, labeled by function. The ReCANVo database contains over 7,000 vocalizations spanning communicative and affective functions from eight minimally speaking individuals, along with communication profiles for each participant. Vocalizations were recorded in real-world settings and labeled in real time by a close family member who knew the communicator well and had access to contextual information while labeling. ReCANVo is a novel database of nonverbal vocalizations from minimally speaking individuals, the largest available dataset of nonverbal vocalizations, and one of the only affective speech datasets collected amidst daily life across contexts.
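The abstract does not specify ReCANVo's on-disk layout, so the following is a minimal, hypothetical Python sketch of how one might load a dataset of this shape: short audio clips paired with per-clip function labels and participant IDs. The file path, directory layout, and column names (labels.csv, participant_id, label) are illustrative assumptions for this sketch, not the published format.

# Hypothetical loader for a ReCANVo-style dataset: one row per
# vocalization clip, with the participant ID and the caregiver-assigned
# function label. All file and column names here are assumptions.
import csv
from collections import Counter
from pathlib import Path

def load_labels(csv_path: Path) -> list[dict]:
    """Read the label table: one dict per recorded vocalization."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

def summarize(rows: list[dict]) -> None:
    """Print per-participant counts and the overall label distribution."""
    by_participant = Counter(r["participant_id"] for r in rows)
    by_label = Counter(r["label"] for r in rows)
    print(f"{len(rows)} vocalizations from {len(by_participant)} participants")
    for label, n in by_label.most_common():
        print(f"  {label}: {n}")

if __name__ == "__main__":
    rows = load_labels(Path("recanvo/labels.csv"))  # assumed path
    summarize(rows)

A loader of this kind would make it straightforward to check class balance per participant before training, which matters for a dataset where label distributions differ across the eight communicators.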

Full text: 1 Database: MEDLINE Language: English Journal: Sci Data Year: 2023 Document type: Article
