Collecting Food and Drink Intake Data With Voice Input: Development, Usability, and Acceptability Study.
Millard, Louise A C; Johnson, Laura; Neaves, Samuel R; Flach, Peter A; Tilling, Kate; Lawlor, Deborah A.
Affiliation
  • Millard LAC; Medical Research Council (MRC) Integrative Epidemiology Unit, University of Bristol, Bristol, United Kingdom.
  • Johnson L; Department of Population Health Sciences, Bristol Medical School, University of Bristol, Bristol, United Kingdom.
  • Neaves SR; Medical Research Council (MRC) Integrative Epidemiology Unit, University of Bristol, Bristol, United Kingdom.
  • Flach PA; Department of Population Health Sciences, Bristol Medical School, University of Bristol, Bristol, United Kingdom.
  • Tilling K; Centre for Exercise, Nutrition and Health Sciences, School for Policy Studies, University of Bristol, Bristol, United Kingdom.
  • Lawlor DA; Centre for Health, National Centre for Social Research (NatCen), London, United Kingdom.
JMIR Mhealth Uhealth; 11: e41117, 2023 Mar 31.
Article in En | MEDLINE | ID: mdl-37000476
ABSTRACT

BACKGROUND:

Voice-based systems such as Amazon Alexa may be useful for collecting self-reported information in real time from participants of epidemiological studies via verbal input. In such studies, self-reported data tend to be collected using short, infrequent questionnaires in which participants select from predefined options, which may introduce errors into the information collected and limit its coverage. Voice-based systems offer the potential to collect self-reported information "continuously" over several days or weeks. To the best of our knowledge, voice-based systems have not yet been used or evaluated for collecting epidemiological data.

OBJECTIVE:

We aimed to demonstrate the technical feasibility of using Alexa to collect information from participants, investigate participant acceptability, and provide an initial evaluation of the validity of the collected data. We used food and drink information as an exemplar.

METHODS:

We recruited 45 staff members and students at the University of Bristol (United Kingdom). Participants were asked to tell Alexa what they ate or drank for 7 days and also to submit the same information using a web-based form. Questionnaires collected basic demographic information and asked about participants' experience during the study and the acceptability of using Alexa.
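The article does not reproduce the Alexa skill's implementation. The following is a minimal illustrative sketch only, assuming a custom skill built with the ask-sdk-core Python library, a hypothetical "LogFoodIntent" with a free-text "food" slot for the spoken intake description, and a simple local log in place of the study's actual storage back end.

# Illustrative sketch only: the study's actual skill code is not published.
# Assumes a hypothetical custom intent "LogFoodIntent" with a "food" slot.
import json
import time

from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name, is_request_type


class LaunchHandler(AbstractRequestHandler):
    def can_handle(self, handler_input):
        return is_request_type("LaunchRequest")(handler_input)

    def handle(self, handler_input):
        speech = "What did you eat or drink?"
        return handler_input.response_builder.speak(speech).ask(speech).response


class LogFoodIntentHandler(AbstractRequestHandler):
    def can_handle(self, handler_input):
        return is_intent_name("LogFoodIntent")(handler_input)

    def handle(self, handler_input):
        # Read the free-text slot value containing the reported food or drink.
        slots = handler_input.request_envelope.request.intent.slots
        food = slots["food"].value if slots and "food" in slots else None

        # Append the entry with a timestamp; a real study would write to a
        # database keyed by an anonymous participant ID, not a local file.
        entry = {"food": food, "timestamp": time.time()}
        with open("/tmp/intake_log.jsonl", "a") as f:
            f.write(json.dumps(entry) + "\n")

        speech = f"Okay, I recorded {food}. Anything else?"
        return handler_input.response_builder.speak(speech).ask(speech).response


sb = SkillBuilder()
sb.add_request_handler(LaunchHandler())
sb.add_request_handler(LogFoodIntentHandler())

# AWS Lambda entry point for the skill.
lambda_handler = sb.lambda_handler()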

RESULTS:

Of the 37 participants with valid data, most (n=30, 81%) were aged 20 to 39 years and 23 (62%) were female. Across 29 participants with Alexa and web entries corresponding to the same intake event, 60.1% (357/588) of Alexa entries contained the same food and drink information as the corresponding web entry. Most participants reported that Alexa interjected, and this was worse when entering the food and drink information (17/35, 49% of participants said this happened often; 1/35, 3% said this happened always) than when entering the event date and time (6/35, 17% of participants said this happened often; 1/35, 3% said this happened always). Most (28/35, 80%) said they would be happy to use a voice-controlled system for future research.

CONCLUSIONS:

Although there were some issues interacting with the Alexa skill, largely because of its conversational nature and because Alexa interjected if there was a pause in speech, participants were mostly willing to participate in future research studies using Alexa. More studies are needed, especially to trial less conversational interfaces.

Full text: 1 | Collections: 01-international | Database: MEDLINE | Main subject: Food | Limits: Female / Humans / Male | Country/Region as subject: Europe | Language: En | Journal: JMIR Mhealth Uhealth | Publication year: 2023 | Document type: Article | Country of affiliation: United Kingdom