AI enabled suicide prediction tools: a qualitative narrative review.
D'Hotman, Daniel; Loh, Erwin.
Affiliation
  • D'Hotman D; Oxford Uehiro Centre for Practical Ethics, University of Oxford, Oxford, United Kingdom daniel.dhotman@philosophy.ox.ac.uk.
  • Loh E; Monash Centre for Health Research and Implementation, Monash University, Clayton, Victoria, Australia.
BMJ Health Care Inform ; 27(3)2020 Oct.
Article in En | MEDLINE | ID: mdl-33037037
ABSTRACT

Background:

Suicide poses a significant health burden worldwide. In many cases, people at risk of suicide do not engage with their doctor or community due to concerns about stigmatisation and forced medical treatment; worse still, people with mental illness (who form a majority of people who die from suicide) may have poor insight into their mental state, and not self-identify as being at risk. These issues are exacerbated by the fact that doctors have difficulty in identifying those at risk of suicide when they do present to medical services. Advances in artificial intelligence (AI) present opportunities for the development of novel tools for predicting suicide.

Method:

We searched Google Scholar and PubMed for articles relating to suicide prediction using artificial intelligence from 2017 onwards.

Conclusions:

This paper presents a qualitative narrative review of research focusing on two categories of suicide prediction tools: medical suicide prediction and social suicide prediction. Initial evidence is promising: AI-driven suicide prediction could improve our capacity to identify those at risk of suicide and, potentially, save lives. Medical suicide prediction may be relatively uncontroversial when it respects ethical and legal principles; however, further research is required to determine the validity of these tools in different contexts. Social suicide prediction offers an exciting opportunity to help identify suicide risk among those who do not engage with traditional health services. Yet efforts by private companies such as Facebook to use online data for suicide prediction should be the subject of independent review and oversight to confirm safety, effectiveness and ethical permissibility.

Full text: 1 Collection: 01-internacional Database: MEDLINE Main subject: Artificial Intelligence / Global Health / Protective Factors / Suicide Prevention Type of study: Etiology_studies / Prognostic_studies / Qualitative_research / Risk_factors_studies Aspects: Ethics Limits: Humans Language: En Journal: BMJ Health Care Inform Year: 2020 Document type: Article Affiliation country: United Kingdom