Results 1 - 2 of 2
1.
J Clin Transl Sci ; 8(1): e94, 2024.
Article in English | MEDLINE | ID: mdl-39220818

ABSTRACT

Introduction: Patients with cystic fibrosis (CF) experience frequent episodes of acute decline in lung function called pulmonary exacerbations (PEx). An existing clinical and place-based precision medicine algorithm that accurately predicts PEx could include racial and ethnic biases in clinical and geospatial training data, leading to unintentional exacerbation of health inequities.

Methods: We estimated receiver operating characteristic curves based on predictions from a nonstationary Gaussian stochastic process model for PEx within 3, 6, and 12 months among 26,392 individuals aged 6 years and above (2003-2017) from the US CF Foundation Patient Registry. We screened predictors to identify reasons for discriminatory model performance.

Results: The precision medicine algorithm performed worse at predicting a PEx among Black patients than among White patients or patients of another race for all three prediction horizons. There was little to no difference in prediction accuracies between Hispanic and non-Hispanic patients for the same prediction horizons. Differences in F508del, smoking households, secondhand smoke exposure, primary and secondary road densities, distance and drive time to the CF center, and average number of clinical evaluations were key factors associated with race.

Conclusions: Racial differences in prediction accuracies from our PEx precision medicine algorithm exist. Misclassification of future PEx was attributable to several underlying factors that correspond to race: CF mutation, location where the patient lives, and clinical awareness. Associations of our proxies with race for CF-related health outcomes can lead to systemic racism in data collection and in the prediction accuracies of precision medicine algorithms constructed from those data.
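The abstract's core fairness check, estimating ROC discrimination separately by racial group, can be sketched as follows. All risk scores, outcomes, and group labels below are hypothetical illustrations, not the registry data, and the AUC is computed with the standard rank-sum (Mann-Whitney) formulation rather than the paper's Gaussian-process pipeline.

```python
def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) formulation:
    the probability that a randomly chosen positive outscores a random negative,
    counting ties as half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    if not pos or not neg:
        raise ValueError("need at least one positive and one negative case")
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical predicted PEx risks and observed outcomes, stratified by group.
cohort = {
    "group_a": ([0.9, 0.8, 0.4, 0.3, 0.2], [1, 1, 0, 1, 0]),
    "group_b": ([0.7, 0.6, 0.5, 0.4, 0.1], [1, 0, 1, 0, 0]),
}

for group, (scores, labels) in cohort.items():
    print(f"{group}: AUC = {auc(scores, labels):.3f}")
```

A gap between the group-wise AUCs (rather than a low overall AUC) is the kind of disparity the study reports: the same model can rank risks well on average while discriminating worse within one subgroup.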

2.
Article in English | MEDLINE | ID: mdl-38918321

ABSTRACT

BACKGROUND: While precision medicine algorithms can be used to improve health outcomes, concerns have been raised about racial equity and unintentional harm from encoded biases. In this study, we evaluated the fairness of using common individual- and community-level proxies of pediatric socioeconomic status (SES), such as insurance status and community deprivation index, that are often utilized in precision medicine algorithms.

METHODS: Using 2012-2021 vital records obtained from the Ohio Department of Health, we geocoded and matched each residential birth address to a census tract to obtain a community deprivation index. We then conducted sensitivity and specificity analyses to determine the degree of match between deprivation index, insurance status, and birthing parent education level for all, Black, and White children, to assess whether there were differences based on race.

RESULTS: We found that community deprivation index and insurance status fail to accurately represent individual SES, either alone or in combination. Deprivation index had a sensitivity of 61.2% and specificity of 74.1%, while insurance status had a higher sensitivity of 91.6% but lower specificity of 60.1%. Furthermore, these inaccuracies differed by race across all proxies evaluated, with greater sensitivities for Black children but greater specificities for White children.

CONCLUSION: These race-based differences in proxy accuracy may explain some of the racial disparities present in precision medicine algorithms that utilize SES proxies. Future studies should examine how to mitigate the biases introduced by using SES proxies, potentially by incorporating additional data on housing conditions.
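The sensitivity/specificity analysis described above treats each SES proxy as a binary classifier of a reference low-SES indicator (here, the education-level criterion). A minimal sketch of that comparison, using entirely hypothetical child-level flags rather than the Ohio vital-records data:

```python
def sens_spec(proxy, truth):
    """Sensitivity and specificity of a binary proxy (e.g., public insurance,
    or living in a high-deprivation tract) against a reference SES indicator.
    Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    tp = sum(1 for p, t in zip(proxy, truth) if p and t)
    tn = sum(1 for p, t in zip(proxy, truth) if not p and not t)
    fp = sum(1 for p, t in zip(proxy, truth) if p and not t)
    fn = sum(1 for p, t in zip(proxy, truth) if not p and t)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical per-child flags: 1 = low SES by the reference indicator,
# 1 = proxy classifies the child as low SES.
truth = [1, 1, 1, 0, 0, 0, 1, 0]
proxy = [1, 1, 0, 0, 1, 0, 1, 0]

se, sp = sens_spec(proxy, truth)
print(f"sensitivity = {se:.2f}, specificity = {sp:.2f}")
```

Running the same computation separately on each racial subgroup, as the study does, reveals the asymmetry it reports: a proxy can be more sensitive in one group while being more specific in another, so the misclassification burden is not shared evenly.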
