Results 1 - 10 of 10
1.
Nature ; 578(7793): 34-36, 2020 Feb.
Article in English | MEDLINE | ID: mdl-32020122
2.
Sci Eng Ethics ; 25(3): 869-898, 2019 Jun.
Article in English | MEDLINE | ID: mdl-29318451

ABSTRACT

Academia-intelligence agency collaborations are on the rise for a variety of reasons. These can take many forms, one of which is in the classroom, using students to stand in for intelligence analysts. Classrooms, however, are ethically complex spaces, with students considered vulnerable populations, and they become even more complex when multiple goals, activities, tools, and stakeholders are layered over those traditionally present. This does not necessarily mean one must shy away from academia-intelligence agency partnerships in classrooms, but that these must be conducted carefully and reflexively. This paper aims to contribute to this conversation by describing one purposeful classroom encounter that occurred between a professor, students, and intelligence practitioners in the fall of 2015 at North Carolina State University: an experiment conducted as part of a graduate-level political science class in which students worked with a prototype analytic technology, a type of participatory sensing/self-tracking device, developed by the National Security Agency. This experiment opened up the following questions, which this paper will explore: What social, ethical, and pedagogical considerations arise with the deployment of a prototype intelligence technology in the college classroom, and how can they be addressed? How can academia-intelligence agency collaboration in the classroom be conducted in ways that provide benefits to all parties, while minimizing disruptions and negative consequences? This paper discusses the experimental findings in the context of ethical perspectives on values in design and participatory/self-tracking data practices, and draws lessons for the ethics of future academia-intelligence agency partnerships in the classroom.


Subject(s)
Data Science/ethics , Data Science/methods , Education, Graduate/ethics , Education, Graduate/methods , Privacy , Software , Curriculum , Humans , North Carolina , Students , United States , United States Government Agencies , Universities , Workflow
5.
Annu Rev Biomed Data Sci ; 7(1): 1-14, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38598860

ABSTRACT

Advances in biomedical data science and artificial intelligence (AI) are profoundly changing the landscape of healthcare. This article reviews the ethical issues that arise with the development of AI technologies, including threats to privacy, data security, consent, and justice, as they relate to donors of tissue and data. It also considers broader societal obligations, including the importance of assessing the unintended consequences of AI research in biomedicine. In addition, this article highlights the challenge of rapid AI development against the backdrop of disparate regulatory frameworks, calling for a global approach to address concerns around data misuse, unintended surveillance, and the equitable distribution of AI's benefits and burdens. Finally, a number of potential solutions to these ethical quandaries are offered, chief among them a collaborative, informed, and flexible regulatory approach that balances innovation with individual rights and public welfare and fosters a trustworthy AI-driven healthcare ecosystem.


Subject(s)
Artificial Intelligence , Data Science , Artificial Intelligence/ethics , Humans , Data Science/ethics , Data Science/methods , Computer Security/ethics , Computer Security/legislation & jurisprudence , Biomedical Research/ethics , Confidentiality/ethics , Privacy
6.
J Am Med Inform Assoc ; 28(3): 650-652, 2021 Mar 1.
Article in English | MEDLINE | ID: mdl-33404593

ABSTRACT

There is little debate about the importance of ethics in health care, and clearly defined rules, regulations, and oaths help ensure patients' trust in the care they receive. However, standards are not as well established for the data professions within health care, even though the responsibility to treat patients in an ethical way extends to the data collected about them. Increasingly, data scientists, analysts, and engineers are becoming fiduciarily responsible for patient safety, treatment, and outcomes, and they will require training and tools to meet this responsibility. We developed a data ethics checklist that enables users to consider the possible ethical issues that arise from the development and use of data products. The combination of ethics training for data professionals, a data ethics checklist as part of project management, and a data ethics committee provides a framework for initiating dialogues about data ethics and can serve as an ethical touchstone for rapid use within typical analytic workflows. We recommend the use of this or equivalent tools when deploying new data products in hospitals.
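
As a rough illustration (not the checklist described in the article), the sketch below shows how such a checklist might be embedded in an analytic workflow as a gate that must be cleared before a data product is escalated for committee review; the item wording, project name, and Python API are hypothetical.

# Illustrative sketch only: item wording, structure, and API are assumptions,
# not the checklist published in the article.
from dataclasses import dataclass, field

@dataclass
class ChecklistItem:
    question: str             # prompt the project team must answer
    satisfied: bool = False   # has the team addressed this item?
    notes: str = ""           # rationale or mitigation recorded by the team

@dataclass
class DataEthicsChecklist:
    project: str
    items: list = field(default_factory=list)

    def outstanding(self):
        # Items that still block sign-off.
        return [item for item in self.items if not item.satisfied]

    def ready_for_review(self):
        # True when every item has been addressed and the checklist can be
        # escalated to a data ethics committee for sign-off.
        return not self.outstanding()

if __name__ == "__main__":
    checklist = DataEthicsChecklist(
        project="example-risk-dashboard",  # hypothetical data product
        items=[
            ChecklistItem("Is the minimum necessary patient data used?"),
            ChecklistItem("Has performance been checked across patient subgroups?"),
            ChecklistItem("Is there a plan for monitoring and retiring the product?"),
        ],
    )
    checklist.items[0].satisfied = True
    checklist.items[0].notes = "Only vitals and lab values; no free-text notes."
    for item in checklist.outstanding():
        print("Outstanding:", item.question)
    print("Ready for committee review:", checklist.ready_for_review())

In practice, the checklist items, sign-off rules, and committee escalation path would be defined by the hospital's own data ethics process rather than hard-coded as above.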


Subject(s)
Codes of Ethics , Data Science/ethics , Hospitals, Pediatric/ethics , Checklist , Ethics, Clinical , Ethics, Professional , Hospital Information Systems/ethics , Washington
7.
EMBO Mol Med ; 12(3): e12053, 2020 Mar 6.
Article in English | MEDLINE | ID: mdl-32064790

ABSTRACT

On November 14, 2019, the British newspaper The Guardian published an account from an anonymous whistleblower at Google, accusing the company of misconduct in its handling of sensitive health data. The whistleblower works on Project Nightingale, an attempt by Google to enter the lucrative US healthcare market by storing and processing the personal medical data of up to 50 million customers of Ascension, one of America's largest healthcare providers. As the Wall Street Journal had already reported three days earlier, and as the whistleblower confirmed, the data was not anonymized when transmitted from Ascension, and neither patients nor their doctors were notified, let alone asked for consent to share their data with Google (Copeland, 2019; Pilkington, 2019). As a result, Google employees had full access to non-anonymized patient health data. Google Health chief David Feinberg commented that all Google employees involved had gone through medical ethics training and had been approved by Ascension (Feinberg, 2019).


Subject(s)
Confidentiality , Data Science , Search Engine , Data Science/ethics , Humans , Whistleblowing
8.
PLoS One ; 15(11): e0241865, 2020.
Article in English | MEDLINE | ID: mdl-33152039

ABSTRACT

Research ethics has traditionally been guided by well-established documents such as the Belmont Report and the Declaration of Helsinki. At the same time, the introduction of Big Data methods, which are having a great impact on behavioral research, is raising complex ethical issues that make the protection of research participants an increasingly difficult challenge. Through 39 semi-structured interviews with academic scholars in both Switzerland and the United States, our research explores the codes of ethics and research practices of academic scholars involved in Big Data studies in the fields of psychology and sociology, to understand whether the principles set by the Belmont Report are still considered relevant in Big Data research. Our study shows that scholars generally find traditional principles to be a suitable guide for ethical data research but, at the same time, they recognize and elaborate on the challenges embedded in their practical application. In addition, given the growing involvement of new actors in scholarly research, such as data holders and owners, interviewees questioned whether the responsibility to protect research participants should fall solely on investigators. To appropriately address ethical issues in Big Data research projects, education in ethics and exchange and dialogue between research teams and scholars from different disciplines should be enhanced. In addition, models of consultancy and shared responsibility between investigators, data owners, and review boards should be implemented to ensure better protection of research participants.


Subject(s)
Behavioral Sciences/ethics , Data Mining/ethics , Data Science/ethics , Adult , Big Data , Ethics, Research , Female , Humans , Informed Consent , Male , Middle Aged , Research Personnel , Stakeholder Participation/psychology , Switzerland , United States
9.
Big Data ; 6(3): 176-190, 2018 Sep 1.
Article in English | MEDLINE | ID: mdl-30283727

ABSTRACT

Ready data availability, cheap storage capacity, and powerful tools for extracting information from data have the potential to significantly enhance the human condition. However, as with all advanced technologies, this comes with the potential for misuse. Ethical oversight and constraints are needed to ensure that an appropriate balance is reached. Ethical issues involving data may be more challenging than those of some other advanced technologies, partly because data and data science are ubiquitous, with the potential to affect all aspects of life, and partly because of their intrinsic complexity. We explore the nature of data, personal data, data ownership, consent and purpose of use, the trustworthiness of data as well as of algorithms and of those using the data, and matters of privacy and confidentiality. A checklist of topics that need to be considered is given.


Subject(s)
Data Collection/ethics , Data Science/ethics , Confidentiality , Ethics , Humans , Informed Consent , Internet , Ownership , Privacy , Trust