Results 1 - 20 of 8,738
1.
BMC Med Inform Decis Mak ; 24(1): 167, 2024 Jun 14.
Article in English | MEDLINE | ID: mdl-38877563

ABSTRACT

BACKGROUND: Consider a setting where multiple parties holding sensitive data aim to collaboratively learn population-level statistics, but pooling the sensitive data sets is not possible due to privacy concerns and the parties are unable to engage in centrally coordinated joint computation. We study the feasibility of combining privacy-preserving synthetic data sets in place of the original data for collaborative learning on real-world health data from the UK Biobank. METHODS: We perform an empirical evaluation based on an existing prospective cohort study from the literature. Multiple parties were simulated by splitting the UK Biobank cohort along assessment centers, for which we generate synthetic data using differentially private generative modelling techniques. We then apply the original study's Poisson regression analysis on the combined synthetic data sets and evaluate the effects of 1) the size of the local data sets, 2) the number of participating parties, and 3) local shifts in distributions on the obtained likelihood scores. RESULTS: We discover that parties engaging in the collaborative learning via shared synthetic data obtain more accurate estimates of the regression parameters compared to using only their local data. This finding extends to the difficult case of small heterogeneous data sets. Furthermore, the more parties participate, the larger and more consistent the improvements become, up to a certain limit. Finally, we find that data sharing can especially help parties whose data contain underrepresented groups to perform better-adjusted analyses for said groups. CONCLUSIONS: Based on our results, we conclude that sharing of synthetic data is a viable method for enabling learning from sensitive data without violating privacy constraints, even if individual data sets are small or do not represent the overall population well. Lack of access to distributed sensitive data is often a bottleneck in biomedical research, which our study shows can be alleviated with privacy-preserving collaborative learning methods.
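A minimal sketch of the pooled-synthetic-data workflow described in this abstract: each simulated party releases a privacy-preserving synthetic copy of its local data, the copies are pooled, and a Poisson regression is refit on the pooled set. The toy DP generator below (a Laplace-noised joint histogram over binned data) is only an illustrative stand-in for the study's differentially private generative models, and every variable and parameter is an assumption for illustration.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

def make_party_data(n):
    """Simulate one party's local cohort: a binary exposure and a Poisson outcome."""
    x = rng.integers(0, 2, size=n)              # e.g. an exposure indicator
    y = rng.poisson(np.exp(-0.5 + 0.8 * x))     # true log-rate: -0.5 + 0.8 * x
    return x, y

def dp_synthetic(x, y, epsilon=1.0, y_max=10):
    """Toy DP synthesizer: sample from a Laplace-noised joint histogram of (x, clipped y)."""
    counts = np.zeros((2, y_max + 1))
    np.add.at(counts, (x, np.clip(y, 0, y_max)), 1)
    noisy = np.maximum(counts + rng.laplace(scale=1.0 / epsilon, size=counts.shape), 1e-9)
    cells = rng.choice(counts.size, size=len(x), p=(noisy / noisy.sum()).ravel())
    xs, ys = np.unravel_index(cells, counts.shape)
    return xs, ys

# Each simulated party releases a synthetic copy of its data; the copies are pooled.
parties = [make_party_data(n) for n in (300, 500, 800)]
synthetic = [dp_synthetic(x, y) for x, y in parties]
X = sm.add_constant(np.concatenate([s[0] for s in synthetic]).astype(float))
Y = np.concatenate([s[1] for s in synthetic]).astype(float)
print(sm.GLM(Y, X, family=sm.families.Poisson()).fit().params)  # intercept, exposure effect
```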


Subject(s)
Information Dissemination , Humans , United Kingdom , Cooperative Behavior , Confidentiality/standards , Privacy , Biological Specimen Banks , Prospective Studies
2.
BMC Med Inform Decis Mak ; 24(1): 162, 2024 Jun 12.
Article in English | MEDLINE | ID: mdl-38915012

ABSTRACT

Many state-of-the-art results in natural language processing (NLP) rely on large pre-trained language models (PLMs). These models contain very large numbers of parameters that are tuned using vast amounts of training data. These factors cause the models to memorize parts of their training data, making them vulnerable to various privacy attacks. This is cause for concern, especially when these models are applied in the clinical domain, where data are very sensitive. Training data pseudonymization is a privacy-preserving technique that aims to mitigate these problems. This technique automatically identifies and replaces sensitive entities with realistic but non-sensitive surrogates. Pseudonymization has yielded promising results in previous studies. However, no previous study has applied pseudonymization to both the pre-training data of PLMs and the fine-tuning data used to solve clinical NLP tasks. This study evaluates the effects of end-to-end pseudonymization on the predictive performance of Swedish clinical BERT models fine-tuned for five clinical NLP tasks. A large number of statistical tests are performed, revealing minimal harm to performance when using pseudonymized fine-tuning data. The results also show no deterioration from end-to-end pseudonymization of pre-training and fine-tuning data. These results demonstrate that pseudonymizing training data to reduce privacy risks can be done without harming data utility for training PLMs.
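The core pseudonymization step, detecting sensitive entities and replacing them with realistic but non-sensitive surrogates before the text is used for pre-training or fine-tuning, can be sketched as follows. Production pipelines rely on clinical NER models; the hand-written detector, surrogate lists, and example note below are purely hypothetical.

```python
import random
import re

SURROGATES = {
    "PERSON": ["Alex Berg", "Kim Larsson", "Sam Nilsson"],
    "DATE": ["2001-01-01", "1999-12-31"],
}

def detect_entities(text):
    """Hypothetical detector output: (start, end, label) spans for one note."""
    spans = [(m.start(), m.end(), "PERSON") for m in re.finditer(r"\bAnna Svensson\b", text)]
    spans += [(m.start(), m.end(), "DATE") for m in re.finditer(r"\b\d{4}-\d{2}-\d{2}\b", text)]
    return spans

def pseudonymize(text):
    """Replace each detected span with a randomly chosen surrogate of the same type."""
    out, last = [], 0
    for start, end, label in sorted(detect_entities(text)):
        out.append(text[last:start])
        out.append(random.choice(SURROGATES[label]))
        last = end
    out.append(text[last:])
    return "".join(out)

note = "Anna Svensson was admitted on 2023-05-17 with chest pain."
print(pseudonymize(note))
```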


Subject(s)
Natural Language Processing , Humans , Privacy , Sweden , Anonyms and Pseudonyms , Computer Security/standards , Confidentiality/standards , Electronic Health Records/standards
4.
Comput Biol Med ; 177: 108646, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38824788

ABSTRACT

Improved data sharing between healthcare providers can lead to a higher probability of accurate diagnosis, more effective treatments, and enhanced capabilities of healthcare organizations. One critical area of focus is brain tumor segmentation, a complex task due to the heterogeneous appearance, irregular shape, and variable location of tumors. Accurate segmentation is essential for proper diagnosis and effective treatment planning, yet current techniques often fall short due to these complexities. At the same time, the sensitive nature of health data often prohibits its sharing. Moreover, the healthcare industry faces significant challenges in preserving model privacy and instilling trust in the model. This paper proposes a framework to address these privacy and trust issues by introducing a mechanism for training the global model using federated learning and sharing the encrypted learned parameters via a permissioned blockchain. The blockchain-federated learning algorithm we designed aggregates gradients in the permissioned blockchain to decentralize the global model, while the introduced masking approach retains the privacy of the model parameters. Unlike traditional raw data sharing, this approach enables hospitals or medical research centers to contribute to a globally learned model, thereby enhancing the performance of the central model for all participating medical entities. As a result, the global model can learn about several specific diseases and benefit each contributor with new disease diagnosis tasks, leading to improved treatment options. The proposed algorithm ensures the quality of the model when aggregating the local models, using an asynchronous federated learning procedure to evaluate the shared model's quality. The experimental results demonstrate the efficacy of the proposed scheme for the critical and challenging task of brain tumor segmentation. Specifically, our method achieved a 1.99% improvement in Dice similarity coefficient for enhancing tumors and a 19.08% reduction in Hausdorff distance for whole tumors compared to the baseline methods, highlighting the significant advancement in segmentation performance and reliability.
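A minimal sketch of masked gradient aggregation in the spirit of the scheme described above: each client adds pairwise random masks that cancel when the aggregator sums the updates, so individual gradients stay hidden while the average is recovered exactly. The paper's actual masking construction, encryption, and blockchain consensus layers are not reproduced; all names and sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
n_clients, dim = 4, 6
true_grads = [rng.normal(size=dim) for _ in range(n_clients)]   # local gradients

# Pairwise masks: for i < j, client i adds +m_ij and client j adds -m_ij.
pair_masks = {(i, j): rng.normal(size=dim)
              for i in range(n_clients) for j in range(i + 1, n_clients)}

def masked_update(i, grad):
    """What client i uploads: its gradient plus masks that cancel in the sum."""
    masked = grad.copy()
    for (a, b), m in pair_masks.items():
        if a == i:
            masked += m
        elif b == i:
            masked -= m
    return masked

uploads = [masked_update(i, g) for i, g in enumerate(true_grads)]  # shared values
aggregate = np.mean(uploads, axis=0)                               # masks cancel here
assert np.allclose(aggregate, np.mean(true_grads, axis=0))
print(np.round(aggregate, 3))
```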


Subject(s)
Algorithms , Brain Neoplasms , Humans , Brain Neoplasms/diagnostic imaging , Blockchain , Machine Learning , Privacy , Magnetic Resonance Imaging/methods
5.
Sci Rep ; 14(1): 13243, 2024 Jun 09.
Article in English | MEDLINE | ID: mdl-38853152

ABSTRACT

Although the number of older adults requiring care is rapidly increasing, nursing homes have long faced issues such as the absence of a home-like environment. This exploratory mixed-method study investigated how residents (n = 15) in a long-term care unit in South Korea perceive home-like features and privacy in their living spaces. The results indicated that most participants were satisfied with the homeliness and privacy of their environment, but some were unhappy with the level of privacy. Most participants had low scores on the Geriatric Depression Scale and the Pittsburgh Sleep Quality Index, indicating low levels of depression and sleep disorders. Sleep quality was affected by factors such as sensory environment, staff visits, and room temperature. Although participants appreciated social support and private rooms, they expressed a desire for larger rooms. Overall, this study provides preliminary insights into older adults' views on their living spaces in long-term care with implications for improving their quality of life.


Subject(s)
Long-Term Care , Nursing Homes , Quality of Life , Humans , Female , Male , Aged , Aged, 80 and over , Republic of Korea , Privacy , Sleep Quality , Home Environment , Depression , Surveys and Questionnaires
6.
Hastings Cent Rep ; 54(3): 2, 2024 May.
Article in English | MEDLINE | ID: mdl-38842868

ABSTRACT

The privacy of the dead is an interesting area of concern for bioethicists. There is a legal doctrine that the dead can't have privacy rights, but also a body of contrary law ascribing privacy rights to the deceased and to kin in relation to the deceased. As women's abortion privacy is under assault by American courts and legislatures, the implications of ascribing privacy rights to embryos and fetuses are more important than ever. Caution is called for in this domain.


Subject(s)
Abortion, Induced , Privacy , Humans , Female , United States , Abortion, Induced/legislation & jurisprudence , Abortion, Induced/ethics , Privacy/legislation & jurisprudence , Pregnancy , Abortion, Legal/legislation & jurisprudence , Abortion, Legal/ethics
7.
J Law Health ; 37(2): 105-126, 2024.
Article in English | MEDLINE | ID: mdl-38833598

ABSTRACT

Concern about individual rights and the desire to protect them has been part of our nation since its founding, and continues to be so today. The Ninth Amendment was created to assuage the Framers' concerns that enumerating some rights in the Bill of Rights would leave unenumerated rights unrecognized and unprotected, affirming that those rights are not disparaged or denied by their lack of textual support. The Ninth Amendment has appeared infrequently in our jurisprudence, and Courts initially construed it rather narrowly. But starting in the 1960s, the Ninth Amendment emerged as a powerful tool not just for recognizing unanticipated rights, but for protecting or expanding even enumerated rights. The right to privacy--encompassing the right to contraception and abortion--the right to preserve the integrity of your family, the right to vote, the right to own a firearm as an individual--all these rights have been asserted under and found to be supported by the Ninth Amendment. In its Dobbs v. Jackson Women's Health decision overturning Roe, the Supreme Court found that there is no right to abortion because it is not in the Constitution. But the potential of the Ninth Amendment is such that reproductive choice need not be mentioned in the Constitution to be protected. Reproductive choice may rightfully be considered as part of a right to privacy, an unenumerated right that nevertheless has abundant precedent behind it. The Ninth Amendment, and its counterparts found in many state constitutions, has the power to protect not just reproductive choice, but all of our fundamental rights.


Subject(s)
Reproductive Rights , Humans , United States , Female , Reproductive Rights/legislation & jurisprudence , Privacy/legislation & jurisprudence , Supreme Court Decisions , Abortion, Induced/legislation & jurisprudence , Contraception , Women's Rights/legislation & jurisprudence , Pregnancy , Abortion, Legal/legislation & jurisprudence
8.
J Law Health ; 37(2): 187-213, 2024.
Article in English | MEDLINE | ID: mdl-38833601

ABSTRACT

Since the overturning of prior abortion precedents in Dobbs v. Jackson Women's Health Organization, there has been a question on the minds of many women in this country: how will this decision affect me and my rights? As we have seen in the aftermath of Dobbs, many states have pushed for stringent anti-abortion measures seeking to undermine the foundation on which women's reproductive freedom had been grounded for decades. This includes right here in Ohio, where Republican lawmakers have advocated on numerous occasions for implementing laws seeking to limit abortion rights, including a 6-week abortion ban passed by the Ohio Republican legislature and signed into law by Ohio Governor Mike DeWine. Despite this particular ban being successfully challenged and stayed, significant problems persist regarding due process rights for women in Ohio, particularly in the aftermath of Justice Thomas's concurrence in Dobbs advising the Court to revisit prior precedents, such as Griswold v. Connecticut, which provides for the right to contraception. If the Court were to revisit and strike down Griswold, it would further undermine privacy and due process rights that have been granted to women across this country, including here in Ohio, for decades. Justice Thomas's concurrence, while merely dicta, encapsulates a Court that has become increasingly hostile to treasured fundamental rights for women, a hostility mirrored in numerous Republican legislatures, including right here in Ohio.


Subject(s)
Women's Rights , Humans , Ohio , Female , Women's Rights/legislation & jurisprudence , Pregnancy , Privacy/legislation & jurisprudence , Abortion, Induced/legislation & jurisprudence
9.
Sci Eng Ethics ; 30(3): 19, 2024 May 15.
Article in English | MEDLINE | ID: mdl-38748085

ABSTRACT

This study investigated people's ethical concerns about surveillance technology. Adopting the spectrum of technological utopian and dystopian narratives, it explored how people perceive a society constructed through the compulsory use of surveillance technology. This study empirically examined the anonymous online expression of attitudes toward the society-wide, compulsory adoption of a contact tracing app that affected almost every aspect of all people's everyday lives at a societal level. Applying the structural topic modeling approach to comments on four Hong Kong anonymous discussion forums, the study identified topics reflecting technological utopian, dystopian, and pragmatic views of the surveillance app. The findings showed that people with a technological utopian view of this app believed that the implementation of compulsory app use can facilitate social good and maintain social order. In contrast, individuals who had a technological dystopian view expressed privacy concerns and distrust of this surveillance technology. Techno-pragmatists took a balanced approach and evaluated its implementation practically.
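As a rough illustration of the topic-discovery step, the sketch below applies a plain LDA from scikit-learn to a few invented comments. The study itself uses structural topic modeling, which additionally models document-level covariates; LDA is a deliberately simplified stand-in, and the comments are fabricated.

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

# Invented stand-ins for anonymous forum comments about a compulsory tracing app.
comments = [
    "the app keeps everyone safe and helps trace contacts quickly",
    "this is mass surveillance and i do not trust where my data goes",
    "mandatory check ins are fine if they actually reduce infections",
    "privacy is gone and the government can follow every visit i make",
]

vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(comments)                     # document-term matrix
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(dtm)

terms = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top_terms = [terms[i] for i in topic.argsort()[-5:][::-1]]
    print(f"topic {k}: {', '.join(top_terms)}")
```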


Subject(s)
Attitude , Mobile Applications , Privacy , Humans , Hong Kong , Contact Tracing/ethics , Contact Tracing/methods , Trust , Confidentiality , Technology/ethics , Internet , Female , Male , Adult , Narration
10.
J Med Internet Res ; 26: e50715, 2024 May 31.
Article in English | MEDLINE | ID: mdl-38820572

ABSTRACT

BACKGROUND: Mobile health (mHealth) apps have the potential to enhance health care service delivery. However, concerns regarding patients' confidentiality, privacy, and security consistently affect the adoption of mHealth apps. Despite this, no review has comprehensively summarized the findings of studies on this subject matter. OBJECTIVE: This systematic review aims to investigate patients' perspectives and awareness of the confidentiality, privacy, and security of the data collected through mHealth apps. METHODS: Using the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines, a comprehensive literature search was conducted in 3 electronic databases: PubMed, Ovid, and ScienceDirect. All the retrieved articles were screened according to specific inclusion criteria to select relevant articles published between 2014 and 2022. RESULTS: A total of 33 articles exploring mHealth patients' perspectives and awareness of data privacy, security, and confidentiality issues and the associated factors were included in this systematic review. Thematic analyses of the retrieved data led to the synthesis of 4 themes: concerns about data privacy, confidentiality, and security; awareness; facilitators and enablers; and associated factors. Patients showed discordant and concordant perspectives regarding data privacy, security, and confidentiality, and suggested approaches (facilitators) to improve the use of mHealth apps, such as protection of personal data, ensuring that health status or medical conditions are not mentioned, brief training or education on data security, and assuring data confidentiality and privacy. Similarly, awareness of the subject matter differed across the studies, suggesting the need to improve patients' awareness of data security and privacy. Older patients, those with a history of experiencing data breaches, and those belonging to the higher-income class were more likely to raise concerns about the data security and privacy of mHealth apps. These concerns were not frequent among patients with higher satisfaction levels and those who perceived the data type to be less sensitive. CONCLUSIONS: Patients expressed diverse views on mHealth apps' privacy, security, and confidentiality, with some of the issues raised affecting technology use. These findings may assist mHealth app developers and other stakeholders in improving patients' awareness and adjusting current privacy and security features in mHealth apps to enhance their adoption and use. TRIAL REGISTRATION: PROSPERO CRD42023456658; https://tinyurl.com/ytnjtmca.


Subject(s)
Computer Security , Confidentiality , Mobile Applications , Telemedicine , Humans , Privacy
11.
Milbank Q ; 102(2): 463-502, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38739543

ABSTRACT

Policy Points: This study examines the impact of several world-changing events in 2020, such as the pandemic and widespread racism protests, on the US population's comfort with the use of identifiable data for public health. Before the 2020 election, there was no significant difference between Democrats and Republicans. However, African Americans exhibited a decrease in comfort that was different from other subgroups. Our findings suggest that the public remained supportive of public health data activities through the pandemic and the turmoil of the 2020 election cycle relative to other data uses. However, support among African Americans for public health data use experienced a unique decline compared to other demographic groups. CONTEXT: Recent legislative privacy efforts have not included special provisions for public health data use. Although past studies documented support for public health data use, several global events in 2020 have raised awareness and concern about privacy and data use. This study aims to understand whether the events of 2020 affected US privacy preferences on secondary uses of identifiable data, focusing on public health and research uses. METHODS: We deployed two online surveys, in February and November 2020, on data privacy attitudes and preferences using a choice-based conjoint analysis. Participants received different data-use scenario pairs, varied by the type of data, user, and purpose, and selected scenarios based on their comfort. A hierarchical Bayes regression model simulated population preferences. FINDINGS: There were 1,373 responses. There was no statistically significant difference in the population's data preferences between February and November, each showing the highest comfort with population health and research data activities and the lowest with profit-driven activities. Most subgroups' data preferences were comparable with the population's preferences, except African Americans, who showed significant decreases in comfort with population health and research. CONCLUSIONS: Despite world-changing events, including a pandemic, we found bipartisan public support for using identifiable data for public health and research. The decreasing support among African Americans could relate to the increased awareness of systemic racism, its harms, and persistent disparities. The US population's preferences support including legal provisions that permit public health and research data use in US laws, which are currently lacking specific public health use permissions.
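A toy sketch of the choice-based conjoint setup: each row pairs two data-use scenarios coded by attribute levels, and a logistic model on the attribute differences recovers part-worth utilities from simulated choices. The study's hierarchical Bayes model and its real attribute coding are not reproduced; every number below is invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_pairs, n_attrs = 2000, 6                       # 6 dummy-coded attribute levels
true_partworths = np.array([1.2, 0.8, -0.5, -1.5, 0.3, -0.2])   # invented utilities

A = rng.integers(0, 2, size=(n_pairs, n_attrs))  # attributes of scenario A
B = rng.integers(0, 2, size=(n_pairs, n_attrs))  # attributes of scenario B
diff = A - B                                     # utility difference drives the choice
p_choose_A = 1.0 / (1.0 + np.exp(-(diff @ true_partworths)))
choice = rng.binomial(1, p_choose_A)             # 1 = respondent picked scenario A

model = LogisticRegression(fit_intercept=False).fit(diff, choice)
print(np.round(model.coef_.ravel(), 2))          # estimates approximate the part-worths
```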


Subject(s)
Pandemics , Politics , Public Health , Humans , United States , Male , Female , Adult , Surveys and Questionnaires , Middle Aged , COVID-19/epidemiology , Black or African American , Public Opinion , Privacy
12.
Sci Total Environ ; 940: 173315, 2024 Aug 25.
Article in English | MEDLINE | ID: mdl-38761955

ABSTRACT

The rapidly expanding use of wastewater for public health surveillance requires new strategies to protect privacy rights, while data are collected at increasingly discrete geospatial scales, i.e., city, neighborhood, campus, and building-level. Data collected at high geospatial resolution can inform on labile, short-lived biomarkers, thereby making wastewater-derived data both more actionable and more likely to cause privacy concerns and stigmatization of subpopulations. Additionally, data sharing restrictions among neighboring cities and communities can complicate efforts to balance public health protections with citizens' privacy. Here, we have created an encrypted framework that facilitates the sharing of sensitive population health data among entities that lack trust for one another (e.g., between adjacent municipalities with different governance of health monitoring and data sharing). We demonstrate the utility of this approach with two real-world cases. Our results show the feasibility of sharing encrypted data between two municipalities and a laboratory, while performing secure private computations for wastewater-based epidemiology (WBE) with high precision, fast speeds, and low data costs. This framework is amenable to other computations used by WBE researchers including population normalized mass loads, fecal indicator normalizations, and quality control measures. The Centers for Disease Control and Prevention's National Wastewater Surveillance System shows ~8% of the records attributed to collection upstream of the wastewater treatment plant, illustrating an opportunity to further expand currently limited community-level sampling and public health surveillance through security and responsible data-sharing as outlined here.
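The kind of privacy-preserving joint computation described above can be illustrated with a toy additive secret-sharing sketch: two municipalities split their population-normalized loads (concentration * flow / population) into random shares, so a combined figure can be computed without either city's raw value being revealed to the other parties. This is only a stand-in for the paper's encrypted framework, and all figures are invented.

```python
import secrets

MOD = 2**61 - 1          # share arithmetic is done modulo a large prime
SCALE = 10**6            # fixed-point scaling so fractional loads become integers

def normalized_load(conc_copies_per_l, flow_l_per_day, population):
    """Population-normalized mass load: concentration * flow / population."""
    return conc_copies_per_l * flow_l_per_day / population

def share(value):
    """Split a scaled value into two additive shares modulo MOD."""
    fixed = round(value * SCALE) % MOD
    r = secrets.randbelow(MOD)
    return r, (fixed - r) % MOD

city_a = normalized_load(1.8e4, 5.0e7, 120_000)   # copies/day per capita (invented)
city_b = normalized_load(2.3e4, 3.2e7, 80_000)

a1, a2 = share(city_a)            # city A sends one share to each computing party
b1, b2 = share(city_b)            # city B does the same; no party sees a raw value
partial_1 = (a1 + b1) % MOD       # each party sums only the shares it holds
partial_2 = (a2 + b2) % MOD

combined = ((partial_1 + partial_2) % MOD) / SCALE
print(round(combined, 1), round(city_a + city_b, 1))   # the two totals match
```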


Subject(s)
Information Dissemination , Wastewater , Privacy , Humans , Computer Security , Environmental Monitoring/methods , Wastewater-Based Epidemiological Monitoring
13.
Water Res ; 258: 121756, 2024 Jul 01.
Article in English | MEDLINE | ID: mdl-38781624

ABSTRACT

As the threat of COVID-19 recedes, wastewater surveillance, unlike other pandemic-era public health surveillance methods, seems here to stay. Concerns have been raised, however, about the potential risks that wastewater surveillance might pose towards group privacy. Existing scholarship has focused upon using ethics- or human rights-based frameworks as a means of balancing the public health objectives of wastewater surveillance and the potential risks it might pose to group privacy. However, such frameworks largely lack enforceability. To build on the strong foundation laid by such frameworks, while addressing their lack of enforceability, this paper proposes the idea of the 'obligation' as an alternative way to regulate wastewater surveillance systems. The legal codification of said obligations provides a method of ensuring that wastewater surveillance systems can be deployed effectively and equitably. Our paper proposes that legal obligations for wastewater surveillance can be created and enforced through transparent and purposeful legislation (which would include limits on power and grant institutions substantial oversight) as well as through non-legislative legal means of enforcement, such as courts or contracts. Introducing legal obligations for wastewater surveillance could therefore be highly useful to researchers, policymakers, corporate technologists, and government agencies working in this field.


Subject(s)
Privacy , Public Health , Wastewater , Humans , COVID-19 , Pandemics , SARS-CoV-2
14.
JMIR Res Protoc ; 13: e54933, 2024 May 22.
Article in English | MEDLINE | ID: mdl-38776540

ABSTRACT

BACKGROUND: There is a paucity of data regarding users' awareness of privacy concerns and the resulting impact on the acceptance of mobile health (mHealth) apps, especially in the Saudi context. Such information is pertinent in addressing users' needs in the Kingdom of Saudi Arabia (KSA). OBJECTIVE: This article presents a study protocol for a mixed-methods study to assess the perspectives of patients and stakeholders regarding the privacy, security, and confidentiality of data collected via mHealth apps in the KSA and the factors affecting the adoption of mHealth apps. METHODS: A mixed-methods study design will be used. In the quantitative phase, patients and end users of mHealth apps will be randomly recruited from various provinces in Saudi Arabia with a high population of mHealth users. The research instrument will be developed based on the emerging themes and findings from the interviews conducted with stakeholders, app developers, health care professionals, and users of mHealth apps (n=25). The survey will focus on (1) how to improve patients' awareness of data security, privacy, and confidentiality; (2) feedback on the current mHealth apps in terms of data security, privacy, and confidentiality; and (3) the features that might improve data security, privacy, and confidentiality of mHealth apps. Meanwhile, specific sections of the questionnaire will focus on patients' awareness, privacy concerns, confidentiality concerns, security concerns, perceived usefulness, perceived ease of use, and behavioral intention. Qualitative data will be analyzed thematically using NVivo version 12. Descriptive statistics, regression analysis, and structural equation modeling will be performed using SPSS and partial least squares structural equation modeling. RESULTS: The ethical approval for this research has been obtained from the Biomedical and Scientific Research Ethics Committee, University of Warwick, and the Medical Research and Ethics Committee of the Ministry of Health in the KSA. The qualitative phase is ongoing, and 15 participants have been interviewed. The interviews for the remaining 10 participants will be completed by November 25, 2023. Preliminary thematic analysis is still ongoing. Meanwhile, the quantitative phase will commence by December 10, 2023, with 150 participants providing signed and informed consent to participate in the study. CONCLUSIONS: The mixed-methods study will elucidate the antecedents of patients' awareness and concerns regarding the privacy, security, and confidentiality of data collected via mHealth apps in the KSA. Furthermore, pertinent findings on the perspectives of stakeholders and health care professionals toward the aforementioned issues will be gleaned. The results will assist policy makers in developing strategies to improve Saudi users'/patients' adoption of mHealth apps and in addressing the concerns raised, so that users can benefit significantly from these advanced health care modalities. INTERNATIONAL REGISTERED REPORT IDENTIFIER (IRRID): DERR1-10.2196/54933.


Subject(s)
Computer Security , Confidentiality , Mobile Applications , Telemedicine , Humans , Saudi Arabia , Surveys and Questionnaires , Male , Female , Privacy , Adult , Qualitative Research , Stakeholder Participation
15.
Sci Adv ; 10(18): eadl2524, 2024 May 03.
Article in English | MEDLINE | ID: mdl-38691613

ABSTRACT

The U.S. Census Bureau faces a difficult trade-off between the accuracy of Census statistics and the protection of individual information. We conduct an independent evaluation of bias and noise induced by the Bureau's two main disclosure avoidance systems: the TopDown algorithm used for the 2020 Census and the swapping algorithm implemented for the three previous Censuses. Our evaluation leverages the Noisy Measurement File (NMF) as well as two independent runs of the TopDown algorithm applied to the 2010 decennial Census. We find that the NMF contains too much noise to be directly useful without measurement error modeling, especially for Hispanic and multiracial populations. TopDown's postprocessing reduces the NMF noise and produces data whose accuracy is similar to that of swapping. While the estimated errors for both TopDown and swapping algorithms are generally no greater than other sources of Census error, they can be relatively substantial for geographies with small total populations.
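The noise-then-postprocess idea behind the systems evaluated here can be illustrated with a toy sketch: counts receive additive noise from a privacy mechanism (a rough analogue of the noisy measurements), and a simple clip-and-rescale step restores non-negativity and a fixed total. Laplace noise stands in for the Bureau's discrete Gaussian mechanism, this is not the TopDown algorithm, and the counts are invented.

```python
import numpy as np

rng = np.random.default_rng(7)
true_counts = np.array([812, 45, 3, 0, 129])      # invented block-level counts
noisy = true_counts + rng.laplace(scale=5.0, size=true_counts.size)   # "noisy measurements"

# Post-processing: clip negatives, then rescale so the total matches the
# held-invariant true total, mimicking a consistency constraint.
clipped = np.clip(noisy, 0, None)
postprocessed = clipped * true_counts.sum() / clipped.sum()

print("noisy measurements:", np.round(noisy, 1))
print("post-processed    :", np.round(postprocessed, 1))
```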


Subject(s)
Algorithms , Bias , Censuses , United States , Humans , Privacy
16.
Cien Saude Colet ; 29(5): e15552022, 2024 May.
Article in English | MEDLINE | ID: mdl-38747777

ABSTRACT

The conceptions, values, and experiences of students from public and private high schools in two Brazilian state capitals, Vitória-ES and Campo Grande-MS, were analyzed regarding digital control and monitoring between intimate partners and the unauthorized exposure of intimate material on the Internet. Data from eight focus groups with 77 adolescents were submitted to thematic analysis, complemented by a questionnaire answered by a sample of 530 students. Most students affirmed that they do not tolerate the control/monitoring and unauthorized exposure of intimate materials but recognized that such activities are routine. They point out jealousy, insecurity, and "curiosity" as the main reasons for these practices. They detail the various dynamics of unauthorized exposure of intimate material and see it as a severe invasion of privacy and a breach of trust between partners. Their accounts suggest that such practices are a form of gender-based violence. They also reveal that each platform has its own form of cultural appropriation and that platforms used by the family, such as Facebook, cause more significant damage to the victim's reputation.


Subject(s)
Focus Groups , Sexual Partners , Students , Humans , Brazil , Adolescent , Female , Male , Surveys and Questionnaires , Students/psychology , Sexual Partners/psychology , Internet , Intimate Partner Violence/statistics & numerical data , Privacy , Gender-Based Violence , Interpersonal Relations , Jealousy , Schools , Young Adult
17.
Pediatr Crit Care Med ; 25(5): e258-e262, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38695704

ABSTRACT

Caring for children and their families at the end-of-life is an essential but challenging aspect of care in the PICU. During and following a child's death, families often report a simultaneous need for protected privacy and ongoing supportive presence from staff. Balancing these seemingly paradoxical needs can be difficult for PICU staff and can often lead to the family feeling intruded upon or abandoned during their end-of-life experience. In this "Pediatric Critical Care Medicine Perspectives" piece, we reframe provision of privacy at the end-of-life in the PICU and describe an essential principle that aims to help the interprofessional PICU team simultaneously meet these two opposing family needs: "Supported Privacy." In addition, we offer concrete recommendations to actualize "Supported Privacy" in the PICU, focusing on environmental considerations, practical needs, and emotional responses. By incorporating the principles of "Supported Privacy" into end-of-life care practices, clinicians can support the delivery of high-quality care that meets the needs of children and families navigating the challenges and supports of end-of-life in the PICU.


Subject(s)
Intensive Care Units, Pediatric , Privacy , Terminal Care , Humans , Terminal Care/ethics , Terminal Care/psychology , Intensive Care Units, Pediatric/organization & administration , Child , Professional-Family Relations , Family/psychology
18.
PLoS One ; 19(5): e0302924, 2024.
Article in English | MEDLINE | ID: mdl-38758778

ABSTRACT

Online research methods have grown in popularity due in part to the globalised and far-reaching nature of the internet, but also to the Covid-19 pandemic, during which restrictions on travel and face-to-face contact necessitated a shift in methods of research recruitment and data collection. Ethical guidance exists to support researchers in conducting online research; however, such guidance is lacking within health fields. This scoping review aims to synthesise formal ethical guidance for applying online methods within health research as well as provide examples of where guidance has been used. A systematic search of literature was conducted, restricted to English language records between 2013 and 2022. Eligibility focused on whether the records provided ethical guidance or recommendations, were situated in or relevant to health disciplines, and involved the use or discussion of online research methods. Following exclusion of ineligible records and duplicate removal, three organisational ethical guidance documents and 24 research papers were charted and thematically analysed. Four key themes were identified within the guidance documents: 1) consent, 2) confidentiality and privacy, 3) protecting participants from harm, and 4) protecting researchers from harm, with the research papers describing additional context and understanding around these issues. The review identified that there are currently no specific guidelines aimed at health researchers, with the most cited guidance coming from broader methodological perspectives and disciplines or auxiliary fields. All guidance discussed each of the four key themes within the wider context of sensitive topics and vulnerable populations, areas and issues which are often prominent within health research, thus highlighting the need for unifying guidance specific to health researchers. Further research should aim to better understand how online health studies apply ethical principles, to help address gaps across both research and guidance.


Subject(s)
Internet , Humans , COVID-19/epidemiology , Confidentiality/ethics , Informed Consent/ethics , Privacy , SARS-CoV-2 , Biomedical Research/ethics , Pandemics , Guidelines as Topic , Ethics, Research
19.
Sensors (Basel) ; 24(10), 2024 May 18.
Article in English | MEDLINE | ID: mdl-38794067

ABSTRACT

In response to a burgeoning pediatric mental health epidemic, recent guidelines have instructed pediatricians to regularly screen their patients for mental health disorders with consistency and standardization. Yet, gold-standard screening surveys to evaluate mental health problems in children typically rely solely on reports given by caregivers, who tend to unintentionally under-report, and in some cases over-report, child symptomology. Digital phenotype screening tools (DPSTs), currently being developed in research settings, may help overcome reporting bias by providing objective measures of physiology and behavior to supplement child mental health screening. Prior to their implementation in pediatric practice, however, the ethical dimensions of DPSTs should be explored. Herein, we consider some promises and challenges of DPSTs under three broad categories: accuracy and bias, privacy, and accessibility and implementation. We find that DPSTs have demonstrated accuracy, may eliminate concerns regarding under- and over-reporting, and may be more accessible than gold-standard surveys. However, we also find that if DPSTs are not responsibly developed and deployed, they may be biased, raise privacy concerns, and be cost-prohibitive. To counteract these potential shortcomings, we identify ways to support the responsible and ethical development of DPSTs for clinical practice to improve mental health screening in children.


Subject(s)
Mental Disorders , Mental Health , Wearable Electronic Devices , Humans , Wearable Electronic Devices/ethics , Child , Mental Disorders/diagnosis , Mass Screening/ethics , Mass Screening/instrumentation , Privacy
20.
Sensors (Basel) ; 24(10), 2024 May 11.
Article in English | MEDLINE | ID: mdl-38793906

ABSTRACT

Smartwatch health sensor data are increasingly utilized in smart health applications and patient monitoring, including stress detection. However, such medical data often comprise sensitive personal information and are resource-intensive to acquire for research purposes. In response to this challenge, we introduce the privacy-aware synthetization of multi-sensor smartwatch health readings related to moments of stress, employing Generative Adversarial Networks (GANs) and Differential Privacy (DP) safeguards. Our method not only protects patient information but also enhances data availability for research. To ensure its usefulness, we test synthetic data from multiple GANs and employ different data enhancement strategies on an actual stress detection task. Our GAN-based augmentation methods demonstrate significant improvements in model performance, with private DP training scenarios showing an 11.90-15.48% increase in F1-score and non-private training scenarios still seeing a 0.45% boost. These results underline the potential of differentially private synthetic data in optimizing utility-privacy trade-offs, especially with the limited availability of real training samples. Through rigorous quality assessments, we confirm the integrity and plausibility of our synthetic data, which, however, are significantly impacted as privacy requirements increase.
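A toy sketch of the augmentation experiment described above: a stress classifier is trained on real sensor windows alone and then on real plus synthetic samples, comparing F1-scores on a held-out set. Class-conditional Gaussian sampling stands in for the GAN- and DP-generated data, which are not reproduced here, and all sensor values are invented.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

def windows(n, stressed):
    """Invented sensor windows: mean heart rate, EDA, skin temperature."""
    base = np.array([75.0, 0.3, 36.6]) + stressed * np.array([20.0, 0.5, 0.4])
    return base + rng.normal(scale=[8.0, 0.15, 0.2], size=(n, 3))

X = np.vstack([windows(120, 0), windows(40, 1)])
y = np.array([0] * 120 + [1] * 40)                 # 1 = stressed
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

def synth_like(X_real, y_real, n_per_class=200):
    """Class-conditional Gaussian sampler standing in for GAN/DP synthetic data."""
    Xs, ys = [], []
    for c in (0, 1):
        mu, sd = X_real[y_real == c].mean(0), X_real[y_real == c].std(0)
        Xs.append(rng.normal(mu, sd, size=(n_per_class, X_real.shape[1])))
        ys.append(np.full(n_per_class, c))
    return np.vstack(Xs), np.concatenate(ys)

Xs, ys = synth_like(X_tr, y_tr)
for name, (Xf, yf) in [("real only", (X_tr, y_tr)),
                       ("real + synthetic", (np.vstack([X_tr, Xs]),
                                             np.concatenate([y_tr, ys])))]:
    clf = RandomForestClassifier(random_state=0).fit(Xf, yf)
    print(name, round(f1_score(y_te, clf.predict(X_te)), 3))
```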


Subject(s)
Privacy , Wearable Electronic Devices , Humans , Monitoring, Physiologic/methods , Monitoring, Physiologic/instrumentation , Algorithms