Results 1 - 20 of 41
1.
Cuad Bioet ; 35(114): 143-155, 2024.
Article in Spanish | MEDLINE | ID: mdl-39135283

ABSTRACT

The digitization of mental health enables significant shifts in clinical practice by harnessing vast amounts of data derived from the use of apps and wearables to enhance medical research, patient care, and health system efficiency. However, this process also raises significant ethical and legal risks. Ethically, concerns primarily revolve around safeguarding the privacy and confidentiality of sensitive data, alongside the transformation of the doctor-patient relationship through technological interaction. In the regulatory realm, issues include the classification of these tools as medical devices, the normative assurance of effective protection of mental health data, and the potential legal risks within this domain. This article aims to provide an overarching view of this landscape, serving as a catalyst for the technological, ethical, and legal discourse that digital mental health necessitates.


Subject(s)
Confidentiality , Mental Health , Mobile Applications , Humans , Confidentiality/legislation & jurisprudence , Confidentiality/ethics , Mobile Applications/ethics , Mobile Applications/legislation & jurisprudence , Physician-Patient Relations/ethics , Telemedicine/ethics , Telemedicine/legislation & jurisprudence , Wearable Electronic Devices/ethics , Computer Security/legislation & jurisprudence
2.
Aten Primaria ; 56(7): 102901, 2024 Jul.
Article in Spanish | MEDLINE | ID: mdl-38452658

ABSTRACT

The history of medicine underscores the significance of ethics at each advance, with bioethics playing a pivotal role in addressing emerging ethical challenges in digital health (DH). This article examines the ethical dilemmas raised by innovations in DH, focusing on the healthcare system, professionals, and patients. Artificial intelligence (AI) raises concerns such as confidentiality and algorithmic bias. Mobile applications (apps) empower patients but pose challenges of access and digital literacy. Telemedicine (TM) democratizes access and reduces healthcare costs, but requires addressing the digital divide and interconsultation dilemmas; it demands high-quality standards, protection of patient information, and attention to equity of access. Wearables and the Internet of Things (IoT) are transforming healthcare but face ethical challenges such as privacy and equity. Twenty-first-century bioethics must be adaptable, as DH tools demand constant review and consensus, and health science faculties must prepare for the forthcoming changes.


Subject(s)
Artificial Intelligence , Telemedicine , Telemedicine/ethics , Humans , Artificial Intelligence/ethics , Bioethical Issues , Bioethics , Confidentiality/ethics , Mobile Applications/ethics , Digital Technology/ethics , Internet of Things/ethics , Digital Health
3.
Br J Dermatol ; 190(6): 789-797, 2024 May 17.
Article in English | MEDLINE | ID: mdl-38330217

ABSTRACT

The field of dermatology is experiencing the rapid deployment of artificial intelligence (AI), from mobile applications (apps) for skin cancer detection to large language models like ChatGPT that can answer generalist or specialist questions about skin diagnoses. With these new applications, ethical concerns have emerged. In this scoping review, we aimed to identify the applications of AI to the field of dermatology and to understand their ethical implications. We used a multifaceted search approach, searching PubMed, MEDLINE, Cochrane Library and Google Scholar for primary literature, following the PRISMA Extension for Scoping Reviews guidance. Our advanced query included terms related to dermatology, AI and ethical considerations. Our search yielded 202 papers. After initial screening, 68 studies were included. Thirty-two were related to clinical image analysis and raised ethical concerns for misdiagnosis, data security, privacy violations and replacement of dermatologist jobs. Seventeen discussed limited skin of colour representation in datasets leading to potential misdiagnosis in the general population. Nine articles about teledermatology raised ethical concerns, including the exacerbation of health disparities, lack of standardized regulations, informed consent for AI use and privacy challenges. Seven addressed inaccuracies in the responses of large language models. Seven examined attitudes toward and trust in AI, with most patients requesting supplemental assessment by a physician to ensure reliability and accountability. Benefits of AI integration into clinical practice include increased patient access, improved clinical decision-making, efficiency and many others. However, safeguards must be put in place to ensure the ethical application of AI.


The use of artificial intelligence (AI) in dermatology is rapidly increasing, with applications in dermatopathology, medical dermatology, cutaneous surgery, microscopy/spectroscopy and the identification of prognostic biomarkers (characteristics that provide information on likely patient health outcomes). However, with the rise of AI in dermatology, ethical concerns have emerged. We reviewed the existing literature to identify applications of AI in the field of dermatology and understand the ethical implications. Our search initially identified 202 papers, and after we went through them (screening), 68 were included in our review. We found that ethical concerns are related to the use of AI in the areas of clinical image analysis, teledermatology, natural language processing models, privacy, skin of colour representation, and patient and provider attitudes toward AI. We identified nine ethical principles to facilitate the safe use of AI in dermatology. These ethical principles include fairness, inclusivity, transparency, accountability, security, privacy, reliability, informed consent and conflict of interest. Although there are many benefits of integrating AI into clinical practice, our findings highlight how safeguards must be put in place to reduce rising ethical concerns.


Subject(s)
Artificial Intelligence , Dermatology , Humans , Artificial Intelligence/ethics , Dermatology/ethics , Dermatology/methods , Telemedicine/ethics , Informed Consent/ethics , Confidentiality/ethics , Diagnostic Errors/ethics , Diagnostic Errors/prevention & control , Computer Security/ethics , Skin Diseases/diagnosis , Skin Diseases/therapy , Mobile Applications/ethics
4.
PLoS One ; 16(7): e0254786, 2021.
Article in English | MEDLINE | ID: mdl-34310618

ABSTRACT

OBJECTIVES: The objective of this paper is to study under which circumstances wearable and health app users would accept a compensation payment, namely a digital dividend, to share their self-tracked health data. METHODS: We conducted an alternative to a discrete choice experiment, a separated adaptive dual response design. We chose this approach to reduce extreme response behavior, given the emotionally charged topic of health data sales, and to measure willingness to accept. Previous experiments in lab settings had led to demands for high monetary compensation. After a first online survey and two pre-studies, we validated four attributes for the final online study: monthly bonus payment, stakeholder handling the data (e.g., health insurer, pharmaceutical or medical device companies, universities), type of data, and data sales to third parties. We used a random utility framework to evaluate individual choice preferences. To test the expected prices of the main study for robustness, we randomly assigned respondents to one of two identical questionnaires with varying price ranges. RESULTS: Over a period of three weeks, 842 respondents participated in the main survey, and 272 respondents participated in the second survey. Participants considered transparency about data processing, no further data sales to third parties, and adequate monetary compensation as very important to the decision to share data with different stakeholders. Price expectations resulting from the experiment were high; pharmaceutical and medical device companies would have to pay an average digital dividend of 237.30 €/month for patient-generated health data of all types. We also observed an anchor effect: people formed price expectations during the process, not ex ante. We found a bimodal distribution between relatively low and relatively high price expectations, which shows that personal data selling is a divisive societal issue. However, the results indicate that a digital dividend could be an accepted economic incentive to gather large-scale, self-tracked data for research and development purposes. After the COVID-19 crisis, price expectations might change due to public sensitization to the need for big-data research on patient-generated health data. CONCLUSION: A continuing success of existing data donation models is highly unlikely. The healthcare sector needs to develop transparency and trust in data processing. An adequate digital dividend could be an effective long-term measure to convince a diverse and large group of people to share high-quality, continuous data for research purposes.
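The random utility framework mentioned in the abstract can be illustrated with a minimal binary logit sketch. All coefficients, the opt-out utility, and the helper names below are hypothetical illustrations, not values estimated in the study.

```python
import math

# Hypothetical coefficients for illustration only -- not estimates from the
# study. Utility of accepting a data-sharing offer is linear in the
# experiment's attributes: monthly bonus (EUR), whether the stakeholder is
# trusted (0/1), and whether third-party sales are excluded (0/1).
def utility(bonus_eur, trusted_stakeholder, no_third_party_sales,
            b_price=0.01, b_trust=0.8, b_no_sales=1.2):
    return (b_price * bonus_eur
            + b_trust * trusted_stakeholder
            + b_no_sales * no_third_party_sales)

def accept_probability(bonus_eur, trusted_stakeholder, no_third_party_sales,
                       u_opt_out=2.0):
    # Binary logit: P(accept) = 1 / (1 + exp(u_opt_out - u_accept)).
    u_accept = utility(bonus_eur, trusted_stakeholder, no_third_party_sales)
    return 1.0 / (1.0 + math.exp(u_opt_out - u_accept))

# A higher monthly bonus raises the predicted acceptance probability,
# all else equal.
p_low = accept_probability(50, trusted_stakeholder=1, no_third_party_sales=1)
p_high = accept_probability(250, trusted_stakeholder=1, no_third_party_sales=1)
```

In an actual discrete choice analysis, the coefficients would be estimated from observed choices (e.g., by maximum likelihood), and the willingness-to-accept for an attribute then follows as the ratio of its coefficient to the price coefficient.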


Subject(s)
Health Records, Personal/ethics , Information Dissemination/ethics , Models, Econometric , Wearable Electronic Devices/ethics , COVID-19/economics , COVID-19/psychology , Health Records, Personal/economics , Health Records, Personal/psychology , Humans , Mobile Applications/ethics , Surveys and Questionnaires , Wearable Electronic Devices/economics , Wearable Electronic Devices/psychology
5.
Bioethics ; 35(4): 366-371, 2021 05.
Article in English | MEDLINE | ID: mdl-33594709

ABSTRACT

The COVID-19 pandemic has infected millions around the world. Governments initially responded by requiring businesses to close and citizens to self-isolate, as well as funding vaccine research and implementing a range of technologies to monitor and limit the spread of the disease. This article considers the use of smartphone metadata and Bluetooth applications for public health surveillance purposes in relation to COVID-19. It undertakes ethical analysis of these measures, particularly in relation to collective moral responsibility, considering whether citizens ought, or should be compelled, to comply with government measures.


Subject(s)
COVID-19/prevention & control , Communicable Disease Control/methods , Public Health Surveillance/methods , Public Health/ethics , Humans , Metadata/ethics , Mobile Applications/ethics , Moral Obligations , Privacy , SARS-CoV-2 , Smartphone/ethics , Social Responsibility
6.
J Am Med Inform Assoc ; 28(1): 193-195, 2021 01 15.
Article in English | MEDLINE | ID: mdl-32584990

ABSTRACT

Recently, there have been many efforts to use mobile apps as an aid in contact tracing to control the spread of the COVID-19 (coronavirus disease 2019) pandemic, caused by SARS-CoV-2 (severe acute respiratory syndrome coronavirus 2). However, although many apps aim to protect individual privacy, the very nature of contact tracing must reveal some otherwise protected personal information. Digital contact tracing has endemic privacy risks that cannot be removed by technological means and may require legal or economic solutions. In this brief communication, we discuss a few of these inherent privacy limitations of any decentralized automatic contact-tracing system.


Subject(s)
COVID-19 , Contact Tracing/legislation & jurisprudence , Mobile Applications/legislation & jurisprudence , Privacy , COVID-19/epidemiology , Canada , Contact Tracing/ethics , Contact Tracing/methods , Humans , Mobile Applications/ethics , United States
7.
Camb Q Healthc Ethics ; 30(2): 262-271, 2021 Apr.
Article in English | MEDLINE | ID: mdl-32993842

ABSTRACT

Several digital contact-tracing smartphone applications that warn users of potential exposure to infectious patients have been developed worldwide in the effort to combat COVID-19; they generate big data that helps in the early identification of hotspots, complementing manual tracing operations. In most democracies, concerns over breaches of data privacy have resulted in severe opposition to their mandatory adoption. This paper examines India as a notable exception, where installation of a government-backed application, "Aarogya Setu", has been made mandatory in certain situations. We argue that the mandatory app requirement constitutes a legitimate public health intervention during a public health emergency.


Subject(s)
Contact Tracing/ethics , Mobile Applications/ethics , Privacy , Bioethical Issues , Cell Phone , Ethical Analysis , Humans , India
9.
J Glob Health ; 10(2): 020103, 2020 Dec.
Article in English | MEDLINE | ID: mdl-33110502

ABSTRACT

The COVID-19 pandemic has put health systems, economies and societies under unprecedented strain, calling for innovative approaches. Scotland's government, like those elsewhere, is facing difficult decisions about how to deploy digital technologies and data to help contain, control and manage the disease, while also respecting citizens' rights. This paper explores the ethical challenges presented by these methods, with particular emphasis on mobile apps associated with contact tracing. Drawing on UK and international experiences, it examines issues such as public trust, data privacy and technology design; how changing disease threats and contextual factors can affect the balance between public benefits and risks; and the importance of transparency, accountability and stakeholder participation for the trustworthiness and good-governance of digital systems and strategies. Analysis of recent technology debates, controversial programmes and emerging outcomes in comparable countries implementing contact tracing apps, reveals sociotechnical complexities and unexpected paradoxes that warrant further study and underlines the need for holistic, inclusive and adaptive strategies. The paper also considers the potential role of these apps as Scotland transitions to the 'new normal', outlines challenges and opportunities for public engagement, and poses a set of ethical questions to inform decision-making at multiple levels, from software design to institutional governance.


Subject(s)
Contact Tracing/ethics , Disease Transmission, Infectious/ethics , Human Rights/ethics , Mobile Applications/ethics , Pandemics/ethics , Betacoronavirus , COVID-19 , Contact Tracing/methods , Coronavirus Infections/prevention & control , Disease Transmission, Infectious/prevention & control , Government , Humans , Pandemics/prevention & control , Pneumonia, Viral/prevention & control , SARS-CoV-2 , Scotland/epidemiology , Stakeholder Participation , Technology/ethics
10.
J Bioeth Inq ; 17(4): 835-839, 2020 Dec.
Article in English | MEDLINE | ID: mdl-32840842

ABSTRACT

Mobile applications are increasingly regarded as important tools for an integrated strategy of infection containment in post-lockdown societies around the globe. This paper discusses a number of questions that should be addressed when assessing the ethical challenges of mobile applications for digital contact-tracing of COVID-19: Which safeguards should be designed into the technology? Who should access data? What is a legitimate role for "Big Tech" companies in the development and implementation of these systems? How should cultural and behavioural issues be accounted for in the design of these apps? Should use of these apps be compulsory? What do transparency and ethical oversight mean in this context? We demonstrate that responses to these questions are complex and contingent, and argue that if digital contact-tracing is used, then it should be clear that this is on a trial basis and its use should be subject to independent monitoring and evaluation.


Subject(s)
COVID-19 , Contact Tracing/ethics , Mobile Applications/ethics , Access to Information , Humans , Privacy , Public Health , SARS-CoV-2
11.
Yearb Med Inform ; 29(1): 93-98, 2020 Aug.
Article in English | MEDLINE | ID: mdl-32823302

ABSTRACT

OBJECTIVES: To provide an overview of recent work at the intersection of Biomedical Informatics, Human-Computer Interaction, and Ethics. METHODS: Search terms for Human-Computer Interaction, Biomedical Informatics, and Ethics were used to identify relevant papers published between 2017 and 2019. Relevant papers were identified through multiple methods, including database searches, manual reviews of citations, recent publications, and special collections, as well as through peer recommendations. Identified articles were reviewed and organized into broad themes. RESULTS: We identified relevant papers at the intersection of Biomedical Informatics, Human-Computer Interaction, and Ethics in over a dozen journals. The content of these papers was organized into three broad themes: ethical issues associated with systems in use, systems design, and responsible conduct of research. CONCLUSIONS: The results of this overview demonstrate an active interest in exploring the ethical implications of Human-Computer Interaction concerns in Biomedical Informatics. Papers emphasizing ethical concerns associated with patient-facing tools, mobile devices, social media, privacy, inclusivity, and e-consent reflect the growing prominence of these topics in biomedical informatics research. New questions in these areas will likely continue to arise with the growth of precision medicine and citizen science.


Subject(s)
Bioethical Issues , Medical Informatics/ethics , User-Computer Interface , Biomedical Research/ethics , Computers/ethics , Health Records, Personal/ethics , Humans , Mobile Applications/ethics
13.
S Afr Med J ; 110(5): 364-368, 2020 Apr 29.
Article in English | MEDLINE | ID: mdl-32657718

ABSTRACT

In everyday clinical practice, healthcare professionals (HCPs) are exposed to large quantities of confidential patient information, and many use WhatsApp groups to share this information. WhatsApp groups provide efficient mechanisms for clinical management advice, decision-making support and peer review. However, most HCPs do not fully understand the legal and ethical implications of sharing content in a WhatsApp group setting, which is often thought to be hosted on a secure platform and therefore removed from public scrutiny. In our paper, we unpack the legal and ethical issues that arise when information is shared in WhatsApp groups. We demonstrate that sharing content in this forum is tantamount to the publication of content; in other words, those who share content are subject to the same legal ramifications as a journalist would be. We also examine the role of the WhatsApp group administrator, who bears an additional legal burden by default, often unknowingly so. We consider the recommendations made by the Health Professions Council of South Africa in their guidelines for the use of social media, and highlight some areas where we feel the guidelines may not adequately protect HCPs from the legal repercussions of sharing content in a WhatsApp group. Finally, we provide a set of guidelines for WhatsApp group users that should be regularly posted onto the group by the relevant group administrator to mitigate some of the legal liabilities that may arise. We also provide guidelines for group administrators.


Subject(s)
Liability, Legal , Mobile Applications/ethics , Mobile Applications/legislation & jurisprudence , Clinical Decision-Making , Communication , Confidentiality/legislation & jurisprudence , Humans , Peer Review , Social Media/legislation & jurisprudence , South Africa
15.
Evid Based Ment Health ; 23(3): 107-111, 2020 Aug.
Article in English | MEDLINE | ID: mdl-32312794

ABSTRACT

BACKGROUND: While there are numerous mental health apps on the market today, less is known about their safety and quality. This study aims to offer a longitudinal perspective on the nature of high-visibility apps for common mental health and physical health conditions. METHODS: In July 2019, we selected the 10 top search-returned apps in the Apple App Store and Android Google Play Store using six keyword terms: depression, anxiety, schizophrenia, addiction, high blood pressure and diabetes. Each app was downloaded by two authors and reviewed by a clinician, and the app was coded for features, functionality, claims, app store properties, and other properties. RESULTS: Compared with 1 year prior, there were few statistically significant changes in app privacy policies, evidence and features. However, there was a high rate of turnover, with only 34 (57%) of the apps from the Apple App Store and 28 (47%) from the Google Play Store remaining in the 2019 top 10 search compared with the 2018 search. DISCUSSION: Although there was a high turnover of top search-returned apps between 2018 and 2019, we found few significant changes in features, privacy, medical claims and other properties. This suggests that, although the highly visible and available apps are changing, there have been no significant improvements in app quality or safety.


Subject(s)
Confidentiality , Diabetes Mellitus , Hypertension , Mental Disorders , Mobile Applications , Telemedicine , Confidentiality/ethics , Confidentiality/standards , Confidentiality/trends , Humans , Longitudinal Studies , Mobile Applications/ethics , Mobile Applications/standards , Mobile Applications/statistics & numerical data , Mobile Applications/trends , Smartphone , Telemedicine/ethics , Telemedicine/standards , Telemedicine/statistics & numerical data , Telemedicine/trends
16.
Science ; 368(6491)2020 05 08.
Article in English | MEDLINE | ID: mdl-32234805

ABSTRACT

The newly emergent human virus SARS-CoV-2 (severe acute respiratory syndrome-coronavirus 2) is resulting in high fatality rates and incapacitated health systems. Preventing further transmission is a priority. We analyzed key parameters of epidemic spread to estimate the contribution of different transmission routes and determine requirements for case isolation and contact tracing needed to stop the epidemic. Although SARS-CoV-2 is spreading too fast to be contained by manual contact tracing, it could be controlled if this process were faster, more efficient, and happened at scale. A contact-tracing app that builds a memory of proximity contacts and immediately notifies contacts of positive cases can achieve epidemic control if used by enough people. By targeting recommendations to only those at risk, epidemics could be contained without resorting to mass quarantines ("lockdowns") that are harmful to society. We discuss the ethical requirements for an intervention of this kind.
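The abstract's claim that the app can "achieve epidemic control if used by enough people" can be illustrated with a toy calculation (this is not the authors' model, and all parameter values are hypothetical): since both the infector and the exposed contact must run the app for a notification to occur, the fraction of onward transmission the app can block scales roughly with the square of uptake.

```python
# Toy illustration of why app uptake matters quadratically for epidemic
# control (hypothetical parameters; not the model from the paper).
def effective_r(r0, uptake, quarantine_efficacy=1.0):
    # A transmission pair is detectable only if both people run the app
    # (probability uptake**2); notified contacts quarantine, blocking that
    # fraction of onward transmission.
    blocked = uptake * uptake * quarantine_efficacy
    return r0 * (1.0 - blocked)

r_no_app = effective_r(2.0, 0.0)   # 2.0: uncontrolled spread
r_half = effective_r(2.0, 0.5)     # 1.5: still above the control threshold
r_high = effective_r(2.0, 0.8)     # ~0.72: below 1, epidemic declines
```

With a baseline reproduction number of 2, 50% uptake leaves the effective reproduction number above 1, while 80% uptake (with perfect quarantine, an optimistic assumption) pushes it below 1 — the qualitative threshold behaviour behind the abstract's "if used by enough people".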


Subject(s)
Betacoronavirus , Cell Phone , Contact Tracing/methods , Coronavirus Infections/prevention & control , Coronavirus Infections/transmission , Mobile Applications , Pandemics/prevention & control , Pneumonia, Viral/prevention & control , Pneumonia, Viral/transmission , Algorithms , Asymptomatic Diseases , Basic Reproduction Number , COVID-19 , China/epidemiology , Contact Tracing/ethics , Coronavirus Infections/epidemiology , Epidemics/prevention & control , Humans , Infection Control , Mobile Applications/ethics , Models, Theoretical , Pneumonia, Viral/epidemiology , Probability , Quarantine , SARS-CoV-2 , Time Factors
17.
Psychother Psychosom Med Psychol ; 70(11): 467-474, 2020 Nov.
Article in German | MEDLINE | ID: mdl-32069513

ABSTRACT

OBJECTIVE: The use of internet- and mobile-based interventions (IMIs) is often seen as empowering patients and improving the accessibility of mental health services. Risks for specific patient groups are seldom discussed. The aim of this study is to identify patient groups that do not benefit from IMIs, given the tension between autonomy and patient well-being. METHODS: The ethical analysis is based on available empirical evidence (randomized controlled trials - RCTs, reviews) as well as ethical papers. The methodological background is the tension between patient autonomy and patient well-being, which is crucial to the therapeutic alliance. On this foundation, patient groups are identified that do not benefit from IMIs in terms of empowerment or accessibility. RESULTS: The evidence-based ethical analysis shows that patients with certain disorders or high symptom severity, patients with a low level of education or a lack of technical skills, and patients with a migrant background often do not benefit from IMIs. Risks of IMIs include insufficient individualization of interventions to individual treatment needs, symptom deterioration, higher dropout rates, and insufficient identification of emergency situations. DISCUSSION: Overemphasizing autonomy may compromise patient well-being in certain patient groups. This may lead to a situation in which the very patient groups whose inclusion in mental health services IMIs are meant to facilitate are not reached. These access barriers should be considered when designing IMIs, so that multimorbid, marginalized groups are not forgotten in the necessary digitalization of the health market. CONCLUSION: The applicability of IMIs depends on the individual resources of the patient. Should IMIs be further implemented within the German mental healthcare system, it is imperative that the well-being of those patient groups that do not benefit from IMIs is safeguarded. In addition, an early focus on marginalized groups and the implementation of low-threshold access to counselling and treatment may provide opportunities for these groups.


Subject(s)
Internet-Based Intervention , Mental Health , Mobile Applications/ethics , Personal Autonomy , Ethical Analysis , Humans
19.
J Am Acad Psychiatry Law ; 47(4): 457-466, 2019 Dec.
Article in English | MEDLINE | ID: mdl-31533994

ABSTRACT

Current approaches to monitoring patients' mental status rely heavily on self-reported symptomatology, clinician observation, and self-rated symptom scales. The limitations inherent in these methodologies have implications for the accuracy of diagnosis, treatment planning, and prognosis. Certain populations are particularly affected by these limitations because of their unique situations, including criminal forensic patients, who have a history of both criminal behavior and mental disorder, and experience increased stigma and restrictions in their access to mental health care. This population may benefit particularly from recent developments in technology and the growing use of mobile devices and sensors to collect behavioral information via passive monitoring. These technologies offer objective parameters that correlate with mental health status and create an opportunity to use Big Data and machine learning to refine diagnosis and predict behavior in a way that represents a marked shift from current practices. This article reviews the approaches to and limitations of psychiatric assessment and contrasts this with the promise of these new technologies. It then discusses the ethics concerns associated with these technologies and explores their potential relevance to criminal forensic psychiatry and the broader implications they carry for health and criminal justice policy.


Subject(s)
Criminals/psychology , Forensic Psychiatry/trends , Health Status , Mental Health , Mobile Applications/ethics , Mobile Applications/trends , Big Data , Humans , Machine Learning/ethics , Machine Learning/trends , Remote Sensing Technology/ethics , Remote Sensing Technology/trends , Risk Assessment , Self Report , Smartphone