Consumer Perspectives on the Use of Artificial Intelligence Technology and Automation in Crisis Support Services: Mixed Methods Study.
JMIR Hum Factors; 9(3): e34514, 2022 Aug 05.
Article in En | MEDLINE | ID: mdl-35930334
BACKGROUND: Emerging technologies, such as artificial intelligence (AI), have the potential to enhance service responsiveness and quality, improve reach to underserved groups, and help address the lack of workforce capacity in health and mental health care. However, little research has been conducted on the acceptability of AI, particularly in mental health and crisis support, and on how such research may inform the development of responsible and responsive innovation in the area.

OBJECTIVE: This study aims to explore the level of support for the use of technology and automation, such as AI, in Lifeline's crisis support services in Australia; the likelihood of service use if technology and automation were implemented; the impact of demographic characteristics on the level of support and the likelihood of service use; and the reasons for not using Lifeline's crisis support services if technology and automation were implemented in the future.

METHODS: A mixed methods study involving a computer-assisted telephone interview and a web-based survey was undertaken from 2019 to 2020 to explore expectations and anticipated outcomes of Lifeline's crisis support services in a nationally representative community sample (n=1300) and a Lifeline help-seeker sample (n=553). Participants were aged between 18 and 93 years. Quantitative descriptive analysis, binary logistic regression models, and qualitative thematic analysis were conducted to address the research objectives.

RESULTS: One-third of the community and help-seeker participants did not support the collection of information about service users through technology and automation (ie, via AI), and approximately half of the participants reported that they would be less likely to use the service if automation were introduced. Significant demographic differences were observed between the community and help-seeker samples. Of the demographic characteristics, only older age predicted being less likely to endorse technology and automation to tailor Lifeline's crisis support service and less likely to use such services (odds ratio 1.48-1.66, 99% CI 1.03-2.38; P<.001 to P=.005). The most common reason for reluctance, reported by both samples, was that respondents wanted to speak to a real person, assuming that human counselors would be replaced by automated robots or machine services.

CONCLUSIONS: Although Lifeline plans to always have a real person providing crisis support, help-seekers may automatically fear that this will not be the case if new technology and automation, such as AI, are introduced. Consequently, incorporating innovative uses of technology to improve help-seeker outcomes in such services will require careful messaging and assurance that the human connection will continue.
Full text: 1
Collection: 01-internacional
Database: MEDLINE
Health context: 2_ODS3
Health problem: 2_cobertura_universal
Study type: Prognostic_studies / Qualitative_research
Language: En
Journal: JMIR Hum Factors
Year: 2022
Document type: Article
Country of affiliation: Australia