ABSTRACT
Recent work has shown that predictive models can be applied to structured electronic health record (EHR) data to stratify autism likelihood from an early age (<1 year). Integrating clinical narratives (or notes) with structured data has been shown to improve prediction performance in other clinical applications, but the added predictive value of this information in early autism prediction has not yet been explored. In this study, we aimed to enhance the performance of early autism prediction by using both structured EHR data and clinical narratives. We built models based on structured data and clinical narratives separately, and then an ensemble model that integrated both sources of data. We used EHR data from Duke University Health System collected over a 14-year span to evaluate ensemble models predicting later autism diagnosis (by age 4 years) from data collected from ages 30 to 360 days. Our sample included 11,750 children above age 3 years (385 meeting autism diagnostic criteria). The ensemble model for autism prediction showed superior performance: at age 30 days it achieved 46.8% sensitivity (95% confidence interval, CI: 22.0%, 52.9%), 28.0% positive predictive value (PPV) at high (90%) specificity (CI: 2.0%, 33.1%), and an AUC4 (with at least 4-year follow-up for controls) of 0.769 (CI: 0.715, 0.811). Prediction by 360 days achieved 44.5% sensitivity (CI: 23.6%, 62.9%), 13.7% PPV at high (90%) specificity (CI: 9.6%, 18.9%), and an AUC4 of 0.797 (CI: 0.746, 0.840). Results show that incorporating clinical narratives in early autism prediction achieved promising accuracy by age 30 days, outperforming models based on structured data only. Furthermore, findings suggest that additional features learned from clinician narratives might be hypothesis generating for understanding early development in autism.
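As an illustration of the fixed-specificity operating point reported above, the minimal sketch below computes sensitivity and PPV at 90% specificity from out-of-sample risk scores. This is not the authors' code; `y_true` and `y_score` are hypothetical stand-ins for held-out labels and ensemble scores.

```python
# Minimal sketch: sensitivity and PPV at a fixed 90% specificity operating point.
import numpy as np

def metrics_at_specificity(y_true, y_score, target_spec=0.90):
    y_true = np.asarray(y_true, dtype=bool)
    y_score = np.asarray(y_score, dtype=float)
    # Threshold = the quantile of control scores that leaves `target_spec` of controls below it.
    threshold = np.quantile(y_score[~y_true], target_spec)
    pred_pos = y_score > threshold
    tp = np.sum(pred_pos & y_true)
    fp = np.sum(pred_pos & ~y_true)
    fn = np.sum(~pred_pos & y_true)
    sensitivity = tp / (tp + fn)
    ppv = tp / (tp + fp) if (tp + fp) > 0 else float("nan")
    return threshold, sensitivity, ppv

rng = np.random.default_rng(0)
y_true = rng.random(5000) < 0.015                           # ~1.5% prevalence (toy data)
y_score = rng.normal(loc=y_true.astype(float), scale=1.5)   # toy risk scores
print(metrics_at_specificity(y_true, y_score))
```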
Subjects
Autistic Disorder, Electronic Health Records, Child, Humans, Infant, Preschool Child, Autistic Disorder/diagnosis, Predictive Value of Tests, Narration, Electronics
ABSTRACT
Importance: Stroke is the fifth-highest cause of death in the US and a leading cause of serious long-term disability, with particularly high risk in Black individuals. Quality risk prediction algorithms, free of bias, are key for comprehensive prevention strategies. Objective: To compare the performance of stroke-specific algorithms with pooled cohort equations developed for atherosclerotic cardiovascular disease for the prediction of new-onset stroke across different subgroups (race, sex, and age) and to determine the added value of novel machine learning techniques. Design, Setting, and Participants: Retrospective cohort study on combined and harmonized data from Black and White participants of the Framingham Offspring, Atherosclerosis Risk in Communities (ARIC), Multi-Ethnic Study of Atherosclerosis (MESA), and Reasons for Geographic and Racial Differences in Stroke (REGARDS) studies (1983-2019) conducted in the US. The 62,482 participants included at baseline were at least 45 years of age and free of stroke or transient ischemic attack. Exposures: Published stroke-specific algorithms from Framingham and REGARDS (based on self-reported risk factors) as well as pooled cohort equations for atherosclerotic cardiovascular disease plus 2 newly developed machine learning algorithms. Main Outcomes and Measures: Models were designed to estimate the 10-year risk of new-onset stroke (ischemic or hemorrhagic). Discrimination concordance index (C index) and calibration ratios of expected vs observed event rates were assessed at 10 years. Analyses were conducted by race, sex, and age groups. Results: The combined study sample included 62,482 participants (median age, 61 years; 54% women; and 29% Black individuals). Discrimination C indexes were not significantly different for the 2 stroke-specific models (Framingham stroke, 0.72; 95% CI, 0.72-0.73; REGARDS self-report, 0.73; 95% CI, 0.72-0.74) vs the pooled cohort equations (0.72; 95% CI, 0.71-0.73): differences of 0.01 or less (P values >.05) in the combined sample. Significant differences in discrimination were observed by race: the C indexes were 0.76 for all 3 models in White women vs 0.69 in Black women (all P values <.001) and between 0.71 and 0.72 in White men vs between 0.64 and 0.66 in Black men (all P values ≤.001). When stratified by age, model discrimination was better for younger (<60 years) vs older (≥60 years) adults for both Black and White individuals. The ratios of observed to expected 10-year stroke rates were closest to 1 for the REGARDS self-report model (1.05; 95% CI, 1.00-1.09) and indicated risk overestimation for the Framingham stroke model (0.86; 95% CI, 0.82-0.89) and pooled cohort equations (0.74; 95% CI, 0.71-0.77). Performance did not significantly improve when novel machine learning algorithms were applied. Conclusions and Relevance: In this analysis of Black and White individuals without stroke or transient ischemic attack among 4 US cohorts, existing stroke-specific risk prediction models and novel machine learning techniques did not significantly improve discriminative accuracy for new-onset stroke compared with the pooled cohort equations, and the REGARDS self-report model had the best calibration. All algorithms exhibited worse discrimination in Black individuals than in White individuals, indicating the need to expand the pool of risk factors and improve modeling techniques to address observed racial disparities and improve model performance.
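A hedged sketch of the two headline metrics, Harrell's C index for discrimination and the observed-to-expected ratio for 10-year calibration, is shown below. The input arrays are hypothetical stand-ins, and censoring is handled only crudely here, unlike in the study.

```python
# Hedged sketch: discrimination (Harrell's C) and 10-year calibration (O:E ratio) on toy data.
import numpy as np
from lifelines.utils import concordance_index

rng = np.random.default_rng(0)
n = 5000
risk10 = np.clip(rng.beta(2, 18, n), 0.001, 0.999)            # predicted 10-year stroke risk (toy)
time_yrs = rng.exponential(60 * (1 - risk10))                  # toy follow-up times
event = ((time_yrs <= 10) & (rng.random(n) < 0.8)).astype(int)

def o_to_e_ratio(event, time_yrs, risk10, horizon=10.0):
    observed = np.mean((event == 1) & (time_yrs <= horizon))   # crude proportion; ignores censoring
    expected = np.mean(risk10)
    return observed / expected

def harrell_c(event, time_yrs, risk10):
    # concordance_index treats higher scores as predicting longer survival,
    # so pass the negative of the predicted risk.
    return concordance_index(time_yrs, -risk10, event_observed=event)

print("O:E ratio:", o_to_e_ratio(event, time_yrs, risk10))
print("C index  :", harrell_c(event, time_yrs, risk10))
```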
Subjects
Black Population, Healthcare Disparities, Prejudice, Risk Assessment, Stroke, White Population, Female, Humans, Male, Middle Aged, Atherosclerosis/epidemiology, Cardiovascular Diseases/epidemiology, Transient Ischemic Attack/epidemiology, Retrospective Studies, Stroke/diagnosis, Stroke/epidemiology, Stroke/ethnology, Risk Assessment/standards, Reproducibility of Results, Sex Factors, Age Factors, Race Factors/statistics & numerical data, Black Population/statistics & numerical data, White Population/statistics & numerical data, United States/epidemiology, Machine Learning/standards, Bias, Prejudice/prevention & control, Healthcare Disparities/ethnology, Healthcare Disparities/standards, Healthcare Disparities/statistics & numerical data, Computer Simulation/standards, Computer Simulation/statistics & numerical data
ABSTRACT
OBJECTIVE: Develop and pilot test a mobile health (mHealth) cognitive behavioral coping skills training and activity coaching protocol (HCT Symptoms and Steps) for hematopoietic stem cell transplant (HCT) patients. DESIGN: Two-phase, mixed-methods study. SAMPLE: HCT patients and healthcare providers. METHODS: Phase I consisted of patient (n = 5) and provider (n = 1) focus groups and user testing (N = 5) to develop the HCT Symptoms and Steps protocol. Phase II was a pilot randomized trial (N = 40) to evaluate feasibility, acceptability, and pre-to-post outcomes (e.g., physical disability, pain, fatigue, distress, physical activity, symptom self-efficacy) compared to an education control. FINDINGS: Qualitative feedback on symptoms, recruitment strategies, coping skills, and mHealth components (e.g., Fitbit, mobile app) was integrated into the protocol. HCT Symptoms and Steps was feasible and acceptable. Pre-to-post changes suggest that physical disability and physical activity improved while symptoms (e.g., fatigue, distress) decreased. CONCLUSIONS: HCT Symptoms and Steps has strong feasibility and acceptability and shows promise for benefit. Larger, fully powered randomized trials are needed to examine intervention efficacy. IMPLICATIONS: HCT Symptoms and Steps may reduce physical disability and improve health outcomes post-transplant. CLINICAL TRIAL REGISTRATION NUMBER: NCT03859765.
Subjects
Hematopoietic Stem Cell Transplantation, Mentoring, Humans, Pilot Projects, Hematopoietic Stem Cell Transplantation/psychology, Depression/psychology, Fatigue/therapy, Cognition
ABSTRACT
PURPOSE: To investigate the effect of systemic arterial blood pressure (BP) on rates of progressive structural damage over time in glaucoma. DESIGN: Retrospective cohort study. PARTICIPANTS: A total of 7501 eyes of 3976 subjects with glaucoma or suspected of glaucoma followed over time from the Duke Glaucoma Registry. METHODS: Linear mixed models were used to investigate the effects of BP on the rates of retinal nerve fiber layer (RNFL) loss from spectral-domain OCT (SD-OCT) over time. Models were adjusted for intraocular pressure (IOP), gender, race, diagnosis, central corneal thickness (CCT), follow-up time, and baseline disease severity. MAIN OUTCOME MEASURE: Effect of mean arterial pressure (MAP), systolic arterial pressure (SAP), and diastolic arterial pressure (DAP) on rates of RNFL loss over time. RESULTS: A total of 157 291 BP visits, 45 408 IOP visits, and 30 238 SD-OCT visits were included. Mean rate of RNFL change was -0.70 µm/year (95% confidence interval, -0.72 to -0.67 µm/year). In univariable models, MAP, SAP, and DAP during follow-up were not significantly associated with rates of RNFL loss. However, when adjusted for mean IOP during follow-up, each 10 mmHg reduction in mean MAP (-0.06 µm/year; P = 0.007) and mean DAP (-0.08 µm/year; P < 0.001) but not SAP (-0.01 µm/year; P = 0.355) was associated with significantly faster rates of RNFL thickness change over time. The effect of the arterial pressure metrics remained significant after additional adjustment for baseline age, diagnosis, sex, race, follow-up time, disease severity, and corneal thickness. CONCLUSIONS: When adjusted for IOP, lower MAP and DAP during follow-up were significantly associated with faster rates of RNFL loss, suggesting that levels of systemic BP may be a significant factor in glaucoma progression.
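The following is a minimal sketch, under simplifying assumptions, of the kind of linear mixed model described: RNFL thickness regressed on follow-up time with random intercepts and slopes per eye, where the interaction of time with mean arterial pressure (scaled per 10 mmHg) and with IOP captures their effects on the rate of RNFL loss. The simulated data and column names are hypothetical, and nesting of eyes within patients and several covariates are omitted.

```python
# Sketch: linear mixed model where time-by-pressure interactions estimate effects on RNFL slope.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for eye in range(200):
    map_mmHg = rng.normal(95, 10)
    iop = rng.normal(16, 3)
    slope = -0.7 - 0.006 * (map_mmHg - 95) + rng.normal(0, 0.3)   # toy eye-specific RNFL slope
    for t in np.arange(0, 5, 0.5):
        rows.append({"eye_id": eye, "years": t, "map_per10": map_mmHg / 10,
                     "iop": iop, "rnfl": 95 + slope * t + rng.normal(0, 1.5)})
df = pd.DataFrame(rows)

model = smf.mixedlm(
    "rnfl ~ years * (map_per10 + iop)",        # time interactions = effect on rate of change
    data=df, groups=df["eye_id"], re_formula="~years",
)
fit = model.fit(reml=True)
print(fit.params.filter(like="years"))          # 'years:map_per10' ~ slope change per 10 mmHg MAP
```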
Subjects
Blood Pressure/physiology, Glaucoma/diagnosis, Glaucoma/physiopathology, Adolescent, Adult, Aged, Aged 80 and Over, Arterial Pressure/physiology, Disease Progression, Female, Follow-Up Studies, Humans, Intraocular Pressure/physiology, Male, Middle Aged, Nerve Fibers/pathology, Ocular Hypertension/physiopathology, Registries, Retinal Ganglion Cells/pathology, Retrospective Studies, Optical Coherence Tomography, Ocular Tonometry
ABSTRACT
PURPOSE: To investigate the impact of intraocular pressure (IOP) control on rates of change of spectral-domain OCT (SD-OCT) retinal nerve fiber layer (RNFL) thickness in a large clinical population. DESIGN: Retrospective cohort study. PARTICIPANTS: A total of 85 835 IOP measurements and 60 223 SD-OCT tests from 14 790 eyes of 7844 patients. METHODS: Data were extracted from the Duke Glaucoma Registry, a large database of electronic medical records of patients with glaucoma and suspected disease followed over time at the Duke Eye Center and satellite clinics. All records from patients with a minimum of 6 months of follow-up and at least 2 good-quality SD-OCT scans and 2 clinical visits with Goldmann applanation tonometry were included. Eyes were categorized according to the frequency of visits with IOP below cutoffs of 21 mmHg, 18 mmHg, and 15 mmHg over time. Rates of change for global RNFL thickness were obtained using linear mixed models and classified as slow if change was slower than -1.0 µm/year; moderate if between -1.0 and -2.0 µm/year; and fast if faster than -2.0 µm/year. Multivariable models were adjusted for age, gender, race, diagnosis, central corneal thickness, follow-up time, and baseline disease severity. MAIN OUTCOME MEASURES: Rates of change in SD-OCT RNFL thickness according to levels of IOP control. RESULTS: Eyes had a mean follow-up of 3.5±1.9 years. Average rate of change in RNFL thickness was -0.68±0.59 µm/year. Each 1 mmHg higher mean IOP was associated with 0.05 µm/year faster RNFL loss (P < 0.001) after adjustment for potentially confounding variables. For eyes that had fast progression, 41% of them had IOP <21 mmHg in all visits during follow-up, whereas 20% of them had all visits with IOP <18 mmHg, but only 9% of them had all visits with IOP <15 mmHg. CONCLUSIONS: Intraocular pressure was significantly associated with rates of progressive RNFL loss in a large clinical population. Eyes with stricter IOP control over follow-up visits had a smaller chance of exhibiting fast deterioration. Our findings may assist clinicians in establishing target pressures in clinical practice.
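Below is a small sketch, not the study's code, of the eye-level classification described: each eye's RNFL slope is labeled slow, moderate, or fast using the stated cutoffs and cross-tabulated against whether every follow-up IOP stayed below a cutoff. The slope and IOP values are toy data.

```python
# Sketch: classify eyes by RNFL slope and cross-tabulate against IOP control (toy data).
import pandas as pd

def classify_slope(slope_um_per_year):
    if slope_um_per_year > -1.0:
        return "slow"
    if slope_um_per_year >= -2.0:
        return "moderate"
    return "fast"

def all_iop_below(iops, cutoff_mmHg):
    return all(v < cutoff_mmHg for v in iops)

slopes = {"eye_A": -0.4, "eye_B": -1.5, "eye_C": -2.6}                      # µm/year (toy values)
iop_visits = {"eye_A": [14, 16, 15], "eye_B": [17, 19, 16], "eye_C": [13, 14, 12]}

eyes = pd.DataFrame({
    "eye_id": list(slopes),
    "progressor": [classify_slope(slopes[e]) for e in slopes],
    "all_iop_lt_18": [all_iop_below(iop_visits[e], 18) for e in slopes],
})
print(pd.crosstab(eyes["progressor"], eyes["all_iop_lt_18"], normalize="index"))
```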
Subjects
Glaucoma/diagnosis, Intraocular Pressure/physiology, Population Surveillance/methods, Retinal Ganglion Cells/pathology, Optical Coherence Tomography/methods, Ocular Tonometry/methods, Visual Fields, Aged, Female, Follow-Up Studies, Glaucoma/physiopathology, Humans, Male, Middle Aged, Retrospective Studies
ABSTRACT
BACKGROUND: The prevalence of child and adolescent obesity and severe obesity continues to increase despite decades of policy and research aimed at prevention. Obesity strongly predicts cardiovascular and metabolic disease risk; both begin in childhood. Children who receive intensive behavioral interventions can reduce body mass index (BMI) and reverse disease risk. However, delivering these interventions with fidelity at scale remains a challenge. Clinic-community partnerships offer a promising strategy to provide high-quality clinical care and deliver behavioral treatment in local park and recreation settings. The Hearts & Parks study has three broad objectives: (1) evaluate the effectiveness of the clinic-community model for the treatment of child obesity, (2) define microbiome and metabolomic signatures of obesity and response to lifestyle change, and (3) inform the implementation of similar models in clinical systems. METHODS: We designed a pragmatic randomized, controlled clinical trial (n = 270) to test the effectiveness of an integrated clinic-community child obesity intervention as compared with usual care. The study is powered to detect a difference in BMI between groups at 6 months, with follow-up to 12 months. Secondary outcomes include changes in biomarkers for cardiovascular disease, psychosocial risk, and quality of life. Through collection of biospecimens (serum and stool), additional exploratory outcomes include microbiome and metabolomic biomarkers of response to lifestyle modification. DISCUSSION: We present the study design, enrollment strategy, and intervention details for a randomized clinical trial to measure the effectiveness of a clinic-community child obesity treatment intervention. This study will inform a critical area in child obesity and cardiovascular risk research: defining outcomes, assessing implementation feasibility, and identifying potential molecular mechanisms of treatment response. CLINICAL TRIAL REGISTRATION: NCT03339440.
Subjects
Pediatric Obesity, Adolescent, Body Mass Index, Child, Family, Humans, Life Style, Pediatric Obesity/therapy, Quality of Life, Randomized Controlled Trials as Topic
ABSTRACT
PURPOSE: To investigate the incidence and risk factors for glaucomatous visual field progression in eyes with well-controlled intraocular pressure (IOP). DESIGN: Prospective cohort study. PARTICIPANTS: A total of 460 eyes of 334 patients with glaucoma under treatment. METHODS: Study subjects had a mean follow-up of 4.3±0.8 years. Patients were classified as well controlled if all IOP measurements were less than 18 mmHg. Rates of visual field progression were calculated using ordinary least-squares linear regression of standard automated perimetry (SAP) mean deviation (MD) values over time. Progression was defined as a significantly negative MD slope (alpha = 0.05). MAIN OUTCOME MEASURES: Rates of SAP MD change; mean and peak IOP, and IOP fluctuation; and corneal biomechanics: corneal hysteresis (CH), central corneal thickness (CCT), and corneal index. RESULTS: Of the 179 eyes with well-controlled IOP, 42 (23.5%) demonstrated visual field progression. There was no significant difference between progressing and stable patients in baseline MD (-6.4±7.1 decibels [dB] vs. -6.0±6.2 dB; P = 0.346), mean IOP (11.7±2.0 mmHg vs. 12.1±2.3 mmHg; P = 0.405), IOP fluctuation (1.6±0.6 mmHg vs. 1.6±0.5 mmHg; P = 0.402), or peak IOP (14.3±1.9 mmHg vs. 14.6±2.1 mmHg; P = 0.926). Progressing eyes had significantly lower CH (8.6±1.3 mmHg vs. 9.4±1.6 mmHg; P = 0.014) and thinner CCT (515.1±33.1 µm vs. 531.1±42.4 µm; P = 0.018) compared with stable eyes. In the multivariate analysis, a 1 standard deviation lower corneal index, a summation of normalized versions of CH and CCT, resulted in a 68% higher risk of progression (odds ratio, 1.68; 95% confidence interval, 1.08-2.62; P = 0.021). CONCLUSIONS: Approximately one-quarter of eyes with well-controlled IOP may show visual field progression over time. A thinner cornea and lower CH were the main risk factors.
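The progression criterion, an ordinary least-squares MD slope that is significantly negative at alpha = 0.05, can be sketched as follows. The MD series is a toy example, and the one-sided test is an assumption consistent with "significantly negative".

```python
# Sketch: declare progression when the OLS slope of SAP MD over time is significantly negative.
import numpy as np
from scipy import stats

def is_progressing(years, md_db, alpha=0.05):
    res = stats.linregress(years, md_db)
    # Two-sided p-value halved for a one-sided test of slope < 0.
    p_one_sided = res.pvalue / 2 if res.slope < 0 else 1 - res.pvalue / 2
    return res.slope, p_one_sided < alpha

years = np.array([0.0, 0.6, 1.2, 1.9, 2.5, 3.1, 3.8, 4.3])
md_db = np.array([-6.1, -6.3, -6.2, -6.8, -7.0, -7.1, -7.4, -7.6])
print(is_progressing(years, md_db))
```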
Subjects
Cornea/physiopathology, Elasticity/physiology, Open-Angle Glaucoma/physiopathology, Intraocular Pressure/physiology, Optic Nerve Diseases/physiopathology, Vision Disorders/diagnosis, Visual Fields/physiology, Aged, Biomechanical Phenomena, Disease Progression, Female, Follow-Up Studies, Humans, Incidence, Male, Middle Aged, Prospective Studies, Risk Factors, Vision Disorders/physiopathology
ABSTRACT
AIMS: To develop a predictive model for assessing the risk of developing neonatal respiratory morbidity using lamellar body counts (LBCs) and gestational age (GA) to provide a more patient-specific assessment. METHODS: Retrospective cohort study of patients at ≥32 weeks' gestation who received amniocentesis with LBC analysis over a 9-year period. Respiratory morbidity was defined as respiratory distress syndrome, transient tachypnea of the newborn, or oxygen requirement for >24 h. Logistic regression analyses were used to predict the absolute risk and odds of respiratory morbidity as a function of GA and LBC. RESULTS: Two hundred sixty-seven mother-infant pairs were included in the analysis, with 32 cases (12.0%) of respiratory morbidity. Compared with those without respiratory morbidity, neonates with respiratory morbidity had amniocentesis performed at an earlier median GA, lower mean birthweight, and lower median LBC (P<0.01). GA-specific absolute risks and odds ratios for the presence of respiratory morbidity were calculated. The predicted absolute risk of neonatal respiratory morbidity ranged from 38% at 32 weeks to 6% at 40 weeks when the LBC was 35,000/µL. CONCLUSION: GA-specific predicted risk of neonatal respiratory morbidity using LBC provides a statistical model that can aid clinicians in individually counseling patients regarding the absolute risk of their neonate developing respiratory morbidity.
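A minimal sketch of the described risk model, logistic regression of respiratory morbidity on GA and LBC with conversion to patient-specific absolute risk, is shown below. The simulated data, column names, and the untransformed LBC term are assumptions rather than the study's exact specification.

```python
# Sketch: logistic regression of respiratory morbidity on GA and LBC, then absolute risk prediction.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "ga_weeks": rng.uniform(32, 40, n),
    "lbc_k_per_uL": rng.uniform(10, 120, n),        # lamellar body count, thousands/µL (toy)
})
lin = 14 - 0.4 * df["ga_weeks"] - 0.02 * df["lbc_k_per_uL"]           # toy generating model
df["resp_morbidity"] = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

fit = smf.logit("resp_morbidity ~ ga_weeks + lbc_k_per_uL", data=df).fit(disp=False)

new = pd.DataFrame({"ga_weeks": [32, 36, 40], "lbc_k_per_uL": [35, 35, 35]})
print(fit.predict(new))      # predicted absolute risks at an LBC of 35,000/µL
```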
Subjects
Amniotic Fluid/metabolism, Decision Support Techniques, Gestational Age, Phospholipids/metabolism, Newborn Respiratory Distress Syndrome/diagnosis, Tachypnea/diagnosis, Amniocentesis, Biomarkers/metabolism, Humans, Newborn Infant, Premature Infant, Logistic Models, Odds Ratio, Newborn Respiratory Distress Syndrome/metabolism, Retrospective Studies, Risk Assessment, Tachypnea/metabolism
ABSTRACT
Purpose: To compare how linear mixed models (LMMs) using Gaussian, Student t, and log-gamma (LG) random effect distributions estimate rates of structural loss in a glaucomatous population using OCT and to compare model performance to ordinary least squares (OLS) regression. Design: Retrospective cohort study. Subjects: Patients in the Bascom Palmer Glaucoma Repository (BPGR). Methods: Eyes with ≥ 5 reliable peripapillary retinal nerve fiber layer (RNFL) OCT tests over ≥ 2 years were identified from the BPGR. Retinal nerve fiber layer thickness values from each reliable test (signal strength ≥ 7/10) and associated time points were collected. Data were modeled using OLS regression as well as LMMs using different random effect distributions. Predictive modeling involved constructing LMMs with (n - 1) tests to predict the RNFL thickness of subsequent tests. A total of 1200 simulated eyes of different baseline RNFL thickness values and progression rates were developed to evaluate the likelihood of declared progression and predicted rates. Main Outcome Measures: Model fit assessed by Watanabe-Akaike information criterion (WAIC) and mean absolute error (MAE) when predicting future RNFL thickness values; log-rank test and median time to progression with simulated eyes. Results: A total of 35 862 OCT scans from 5766 eyes of 3491 subjects were included. The mean follow-up period was 7.0 ± 2.3 years, with an average of 6.2 ± 1.4 tests per eye. The Student t model produced the lowest WAIC. In predictive models, all LMMs demonstrated a significant reduction in MAE when estimating future RNFL thickness values compared with OLS (P < 0.001). Gaussian and Student t models were similar and significantly better than the LG model in estimating future RNFL thickness values (P < 0.001). Simulated eyes confirmed LMM performance in declaring progression sooner than OLS regression among moderate and fast progressors (P < 0.01). Conclusions: LMMs outperformed conventional approaches for estimating rates of OCT RNFL thickness loss in a glaucomatous population. The Student t model provides the best model fit for estimating rates of change in RNFL thickness, although the use of the Gaussian or Student t distribution in models led to similar improvements in accurately estimating RNFL loss. Financial Disclosures: Proprietary or commercial disclosure may be found in the Footnotes and Disclosures at the end of this article.
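For readers interested in the modeling choice, here is an exploratory sketch (not the authors' implementation) of a Bayesian linear mixed model with Student t random intercepts and slopes per eye, with fit summarized by WAIC via ArviZ. The simulated data and priors are assumptions; swapping the StudentT lines for Normal (or log-gamma-based) effects gives the comparator models.

```python
# Exploratory sketch: Student t random-effects LMM for RNFL trajectories, compared by WAIC.
import numpy as np
import pymc as pm
import arviz as az

rng = np.random.default_rng(0)
n_eyes, scans_per_eye = 50, 6
eye_idx = np.repeat(np.arange(n_eyes), scans_per_eye)
years = np.tile(np.linspace(0, 5, scans_per_eye), n_eyes)
true_slopes = rng.normal(-0.8, 0.6, n_eyes)
rnfl = 95 + true_slopes[eye_idx] * years + rng.normal(0, 2, years.size)

with pm.Model() as student_t_model:
    mu_a = pm.Normal("mu_a", 95, 20)
    mu_b = pm.Normal("mu_b", 0, 5)
    sd_a = pm.HalfNormal("sd_a", 10)
    sd_b = pm.HalfNormal("sd_b", 2)
    nu = pm.Gamma("nu", 2, 0.1)
    a = pm.StudentT("a", nu=nu, mu=mu_a, sigma=sd_a, shape=n_eyes)   # random intercepts
    b = pm.StudentT("b", nu=nu, mu=mu_b, sigma=sd_b, shape=n_eyes)   # random slopes
    sigma = pm.HalfNormal("sigma", 5)
    pm.Normal("obs", mu=a[eye_idx] + b[eye_idx] * years, sigma=sigma, observed=rnfl)
    idata = pm.sample(1000, tune=1000, idata_kwargs={"log_likelihood": True})

print(az.waic(idata))   # ArviZ reports elpd_waic (higher is better on its default log scale)
```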
ABSTRACT
Robotic pancreaticoduodenectomy (RPD) has a learning curve of approximately 30-250 cases to reach proficiency. The learning curve for laparoscopic pancreaticoduodenectomy (LPD) at Duke University was previously defined as 50 cases. This study describes the RPD learning curve for a single surgeon following experience with LPD. LPD and RPD cases were retrospectively analyzed. Continuous pathologic and perioperative metrics were compared, and learning curves were defined with respect to operative time using CUSUM analysis. Seventeen LPD and 69 RPD cases were analyzed. LPD had an inverted learning curve, possibly reflecting proficiency attained during the surgeon's fellowship and the acquisition of new skills coinciding with more complex patient selection. The learning curve for RPD had three phases: accelerated early experience (cases 1-10), skill consolidation (cases 11-40), and improvement (cases 41-69), marked by reductions in operative time. Compared with LPD, RPD had shorter operative time (379 vs 479 min, p < 0.005), less estimated blood loss (250 vs 500, p < 0.02), and similar R0 resection rates. RPD also had shorter length of stay (7 vs 10 days, p < 0.007) and lower rates of surgical site infection (10% vs 47%, p < 0.002), delayed gastric emptying (19% vs 47%, p < 0.03), and readmission (13% vs 41%, p < 0.02). Experience with LPD may shorten the learning curve for RPD. The gap in surgical quality and perioperative outcomes between LPD and RPD will likely widen as exposure to robotics in General Surgery, Hepatopancreaticobiliary, and Surgical Oncology training programs increases.
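The CUSUM learning-curve analysis mentioned above can be illustrated with a short sketch: the cumulative sum of each case's operative time minus the overall mean is plotted against case number, and changes in the curve's slope mark learning-curve phases. The operative times below are toy values, not study data.

```python
# Sketch: CUSUM learning curve of operative time (toy data).
import numpy as np
import matplotlib.pyplot as plt

op_minutes = np.array([520, 500, 470, 455, 440, 430, 415, 420, 400, 390,
                       380, 385, 370, 365, 360, 355, 350, 345, 350, 340])
cusum = np.cumsum(op_minutes - op_minutes.mean())   # cumulative deviation from the series mean

plt.plot(np.arange(1, len(cusum) + 1), cusum, marker="o")
plt.xlabel("Case number")
plt.ylabel("CUSUM of operative time (min)")
plt.title("CUSUM learning curve (toy data)")
plt.show()
```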
Subjects
Laparoscopy, Pancreatic Neoplasms, Robotic Surgical Procedures, Surgeons, Humans, Pancreaticoduodenectomy/methods, Robotic Surgical Procedures/methods, Learning Curve, Retrospective Studies, Laparoscopy/methods, Pancreatic Neoplasms/surgery, Postoperative Complications/surgery
ABSTRACT
PURPOSE: To evaluate the effect of intraocular pressure (IOP) on the rates of macular thickness (ganglion cell layer [GCL] and ganglion cell-inner plexiform layer [GCIPL]) change over time measured by spectral-domain (SD) OCT. DESIGN: Retrospective cohort study. PARTICIPANTS: Overall, 451 eyes of 256 patients with primary open-angle glaucoma. METHODS: Data were extracted from the Duke Ophthalmic Registry, a database of electronic medical records of patients observed under routine clinical care at the Duke Eye Center and satellite clinics. All records from patients with a minimum of 6 months of follow-up and at least 2 good-quality Spectralis SD-OCT macula scans were included. Linear mixed models were used to investigate the relationship between average IOP during follow-up and rates of GCL and GCIPL thickness change over time. MAIN OUTCOME MEASURES: The effect of IOP on the rates of GCL and GCIPL thickness loss measured by SD-OCT. RESULTS: Eyes had a mean follow-up of 1.8 ± 1.3 years, ranging from 0.5 to 10.2 years. The average rate of change for GCL thickness was -0.220 µm/year (95% confidence interval [CI], -0.268 to -0.172 µm/year) and for GCIPL thickness was -0.231 µm/year (95% CI, -0.302 to -0.160 µm/year). Each 1-mmHg higher mean IOP during follow-up was associated with an additional 0.021 µm/year of GCL thickness loss (P = 0.001) and 0.032 µm/year of GCIPL thickness loss (P = 0.001) after adjusting for potentially confounding factors, such as baseline age, disease severity, sex, race, central corneal thickness, and follow-up time. CONCLUSIONS: Higher IOP was significantly associated with faster rates of GCL and GCIPL loss over time measured by SD-OCT, even during relatively short follow-up times. These findings support the use of SD-OCT GCL and GCIPL thickness measurements as structural biomarkers for the evaluation of the efficacy of IOP-lowering therapies in slowing down the progression of glaucoma. FINANCIAL DISCLOSURE(S): Proprietary or commercial disclosure may be found in the Footnotes and Disclosures at the end of this article.
Subjects
Open-Angle Glaucoma, Glaucoma, Humans, Intraocular Pressure, Open-Angle Glaucoma/diagnosis, Retrospective Studies, Visual Fields, Retinal Ganglion Cells, Disease Progression, Nerve Fibers, Optical Coherence Tomography
ABSTRACT
PURPOSE: Glaucoma is the leading cause of irreversible blindness, a crippling disability resulting in higher risks of chronic health conditions. To better understand disparities in blindness risk, we identified risk factors of blindness on first presentation to a glaucoma clinic using a large clinical database. DESIGN: Retrospective cross-sectional study. METHODS: We used electronic health records of glaucoma patients from the Duke Ophthalmic Registry. International Classification of Diseases codes were used to identify glaucoma and exclude concurrent diseases. Blindness classification was based on the definition of legal blindness. Risk factors included gender, race, marital status, age, intraocular pressure, diabetes history, income level, and education. Odds ratios (ORs) and 95% CIs were calculated for risk factors using univariable and multivariable logistic regression. RESULTS: Our cohort consisted of 3753 patients, with 192 (5%) blind on first presentation. In univariable models, African American/Black race (OR 2.48, 95% CI 1.83-3.36), single marital status (OR 1.74, 95% CI 1.25-2.44), prior diabetes diagnosis (OR 2.23, 95% CI 1.52-3.27), and higher intraocular pressure (OR 1.29 per 1 SD higher, 95% CI 1.13-1.46) were associated with increased risk of presenting blind, whereas higher annual income (OR 0.75, 95% CI 0.65-0.86) and education (OR 0.77, 95% CI 0.69-0.85) were associated with lower risk. These associations remained significant and in the same direction in a multivariable model, apart from income, which became nonsignificant. CONCLUSIONS: Using a large real-world clinical database, we identified risk factors associated with presentation with blindness among glaucoma patients. Our results highlight disparities in health care outcomes and indicate the importance of targeted education to reduce disparities in blindness.
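A hedged sketch of the univariable and multivariable logistic-regression workflow is given below; odds ratios and 95% CIs are the exponentiated coefficients and confidence limits. The variable names mirror the abstract, but the data frame is simulated and the coding of each risk factor is an assumption.

```python
# Sketch: univariable and multivariable logistic regression with exponentiated ORs and CIs.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 3753
df = pd.DataFrame({
    "black_race": rng.integers(0, 2, n),
    "single": rng.integers(0, 2, n),
    "diabetes": rng.integers(0, 2, n),
    "iop_z": rng.normal(size=n),          # intraocular pressure, standardized (toy)
    "income": rng.normal(size=n),
    "education": rng.normal(size=n),
})
lin = -3.5 + 0.9 * df["black_race"] + 0.6 * df["diabetes"] - 0.3 * df["education"]
df["blind"] = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

def odds_ratios(fit):
    ci = fit.conf_int()
    return pd.DataFrame({"OR": np.exp(fit.params),
                         "2.5%": np.exp(ci[0]), "97.5%": np.exp(ci[1])})

uni = smf.logit("blind ~ black_race", data=df).fit(disp=False)              # univariable
multi = smf.logit("blind ~ black_race + single + diabetes + iop_z + income + education",
                  data=df).fit(disp=False)                                   # multivariable
print(odds_ratios(uni))
print(odds_ratios(multi))
```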
Subjects
Glaucoma, Humans, Retrospective Studies, Cross-Sectional Studies, Glaucoma/complications, Glaucoma/diagnosis, Glaucoma/epidemiology, Blindness/diagnosis, Blindness/epidemiology, Blindness/etiology, Intraocular Pressure, Risk Factors
ABSTRACT
Importance: Autism detection early in childhood is critical to ensure that autistic children and their families have access to early behavioral support. Early correlates of autism documented in electronic health records (EHRs) during routine care could allow passive, predictive model-based monitoring to improve the accuracy of early detection. Objective: To quantify the predictive value of early autism detection models based on EHR data collected before age 1 year. Design, Setting, and Participants: This retrospective diagnostic study used EHR data from children seen within the Duke University Health System before age 30 days between January 2006 and December 2020. These data were used to train and evaluate L2-regularized Cox proportional hazards models predicting later autism diagnosis based on data collected from birth up to the time of prediction (ages 30-360 days). Statistical analyses were performed between August 1, 2020, and April 1, 2022. Main Outcomes and Measures: Prediction performance was quantified in terms of sensitivity, specificity, and positive predictive value (PPV) at clinically relevant model operating thresholds. Results: Data from 45,080 children, including 924 (1.5%) meeting autism criteria, were included in this study. Model-based autism detection at age 30 days achieved 45.5% sensitivity and 23.0% PPV at 90.0% specificity. Detection by age 360 days achieved 59.8% sensitivity and 17.6% PPV at 81.5% specificity and 38.8% sensitivity and 31.0% PPV at 94.3% specificity. Conclusions and Relevance: In this diagnostic study of an autism screening test, EHR-based autism detection achieved clinically meaningful accuracy by age 30 days, improving by age 1 year. This automated approach could be integrated with caregiver surveys to improve the accuracy of early autism screening.
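Below is a sketch, under stated assumptions, of the modeling approach named in the abstract: an L2-regularized Cox proportional hazards model of time to autism diagnosis. lifelines' CoxPHFitter with l1_ratio=0 provides a ridge penalty; the toy features stand in for the EHR-derived predictors available by the prediction age.

```python
# Sketch: ridge-penalized (L2) Cox model for time to autism diagnosis, fit on toy data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 3000
df = pd.DataFrame({
    "male": rng.integers(0, 2, n),
    "n_well_visits": rng.poisson(3, n),
    "any_feeding_code": rng.integers(0, 2, n),    # hypothetical EHR-derived feature
})
risk = 0.7 * df["male"] + 0.4 * df["any_feeding_code"]
df["time_yrs"] = rng.exponential(20 * np.exp(-risk))                 # time to diagnosis or censoring
df["autism_dx"] = ((df["time_yrs"] < 4) & (rng.random(n) < 0.6)).astype(int)

cph = CoxPHFitter(penalizer=0.1, l1_ratio=0.0)                       # pure L2 penalty
cph.fit(df, duration_col="time_yrs", event_col="autism_dx")
df["risk_score"] = cph.predict_partial_hazard(df)                    # higher = higher predicted hazard
print(df["risk_score"].describe())
```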
Subjects
Autistic Disorder, Child, Humans, Adult, Infant, Autistic Disorder/diagnosis, Autistic Disorder/epidemiology, Electronic Health Records, Retrospective Studies, Predictive Value of Tests, Surveys and Questionnaires
ABSTRACT
Background: Colorectal cancer (CRC) patients in early to mid-adulthood (≤50 years) are challenged by high symptom burden (i.e., pain, fatigue, distress) and age-related stressors (e.g., managing family, work). Cognitive behavioral theory (CBT)-based coping skills training interventions reduce symptoms and improve quality of life in cancer patients. However, traditional CBT-based interventions are not accessible to these patients (e.g., in-person sessions, during the work day), nor designed to address symptoms within the context of this stage of life. We developed a mobile health (mHealth) coping skills training program for pain, fatigue, and distress (mCOPE) for CRC patients in early to mid-adulthood. We use a randomized controlled trial to test the extent to which mCOPE reduces pain, fatigue, and distress (multiple primary outcomes) and improves quality of life and symptom self-efficacy (secondary outcomes). Methods/Design: Patients (N = 160) ≤50 years with CRC endorsing pain, fatigue, and/or distress are randomized 1:1 to mCOPE or standard care. mCOPE is a five-session CBT-based coping skills training program (e.g., relaxation, activity pacing, cognitive restructuring) that was adapted for CRC patients in early to mid-adulthood. mCOPE utilizes mHealth technology (e.g., videoconference, mobile app) to deliver coping skills training, capture symptom and skills use data, and provide personalized support and feedback. Self-report assessments are completed at baseline, post-treatment (5-8 weeks post-baseline; primary endpoint), and 3 and 6 months later. Conclusions: mCOPE is innovative and potentially impactful for CRC patients in early to mid-adulthood. Hypothesis confirmation would demonstrate initial efficacy of an mHealth cognitive behavioral intervention to reduce symptom burden in younger CRC patients.
ABSTRACT
Purpose: In patients with ophthalmic disorders, psychosocial risk factors play an important role in morbidity and mortality. Proper and early psychiatric screening can result in prompt intervention and mitigate the impact of distress. Because screening is resource-intensive, we developed a framework for automating screening using an electronic health record (EHR)-derived artificial intelligence (AI) algorithm. Methods: Subjects came from the Duke Ophthalmic Registry, a retrospective EHR database for the Duke Eye Center. Inclusion criteria included at least two encounters and a minimum of 1 year of follow-up. Presence of distress was defined at the encounter level using a computable phenotype. Risk factors included available EHR history. At each encounter, risk factors were used to discriminate psychiatric status. Model performance was evaluated using area under the receiver operating characteristic (ROC) curve and area under the precision-recall curve (PR AUC). Variable importance was presented using odds ratios (ORs). Results: Our cohort included 358,135 encounters from 40,326 patients with an average of nine encounters per patient over 4 years. The ROC AUC and PR AUC were 0.91 and 0.55, respectively. Of the top 25 predictors, the majority were related to existing distress, but some indicated stressful conditions, including chemotherapy (OR = 1.36), esophageal disorders (OR = 1.31), central pain syndrome (OR = 1.25), and headaches (OR = 1.24). Conclusions: Psychiatric distress in ophthalmology patients can be monitored passively using an AI algorithm trained on existing EHR data. Translational Relevance: When paired with an effective referral and treatment program, such algorithms may improve health outcomes in ophthalmology.
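The two evaluation metrics reported above can be computed as in the minimal sketch below; `y_true` (the distress phenotype at each encounter) and `y_prob` (the model's predicted probability) are hypothetical arrays, and scikit-learn's average precision is used as the PR AUC summary.

```python
# Sketch: ROC AUC and PR AUC for an encounter-level distress classifier (toy data).
import numpy as np
from sklearn.metrics import roc_auc_score, average_precision_score

rng = np.random.default_rng(0)
y_true = (rng.random(10000) < 0.10).astype(int)                         # ~10% prevalence (toy)
y_prob = np.clip(0.10 + 0.5 * y_true + rng.normal(0, 0.25, y_true.size), 0, 1)

print("ROC AUC:", roc_auc_score(y_true, y_prob))
print("PR AUC :", average_precision_score(y_true, y_prob))              # average precision ~ PR AUC
```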
Subjects
Artificial Intelligence, Ophthalmology, Algorithms, Electronic Health Records, Retrospective Studies
ABSTRACT
PURPOSE: To compare the ability of linear mixed models with different random effect distributions to estimate rates of visual field loss in glaucoma patients. METHODS: Eyes with five or more reliable standard automated perimetry (SAP) tests were identified from the Duke Glaucoma Registry. Mean deviation (MD) values from each visual field and associated timepoints were collected. These data were modeled using ordinary least squares (OLS) regression and linear mixed models using the Gaussian, Student's t, or log-gamma (LG) distributions as the prior distribution for random effects. Model fit was compared using the Watanabe-Akaike information criterion (WAIC). Simulated eyes of varying initial disease severity and rates of progression were created to assess the accuracy of each model in predicting the rate of change and likelihood of declaring progression. RESULTS: A total of 52,900 visual fields from 6558 eyes of 3981 subjects were included. Mean follow-up period was 8.7 ± 4.0 years, with an average of 8.1 ± 3.7 visual fields per eye. The LG model produced the lowest WAIC, demonstrating optimal model fit. In simulations, the LG model declared progression earlier than OLS (P < 0.001) and had the greatest accuracy in predicted slopes (P < 0.001). The Gaussian model significantly underestimated rates of progression among fast and catastrophic progressors. CONCLUSIONS: Linear mixed models using the LG distribution outperformed conventional approaches for estimating rates of SAP MD loss in a population with glaucoma. TRANSLATIONAL RELEVANCE: Use of the LG distribution in models estimating rates of change among glaucoma patients may improve their accuracy in rapidly identifying progressors at high risk for vision loss.
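A simplified sketch of the simulation idea, generating eyes with a known MD slope plus noise and recording the first visit at which an OLS fit yields a significantly negative slope, is shown below. The noise level, visit spacing, and minimum of five tests are assumptions, not the study's exact parameters.

```python
# Sketch: simulate an eye's MD series and find the time until OLS declares progression.
import numpy as np
from scipy import stats

def time_to_declare(true_slope_db_yr, noise_sd=1.0, visits_per_year=2, max_years=10, seed=0):
    rng = np.random.default_rng(seed)
    times = np.arange(0, max_years + 1e-9, 1 / visits_per_year)
    md = true_slope_db_yr * times + rng.normal(0, noise_sd, times.size)
    for k in range(5, times.size + 1):                       # require at least 5 tests
        res = stats.linregress(times[:k], md[:k])
        if res.slope < 0 and res.pvalue / 2 < 0.05:           # one-sided test for slope < 0
            return times[k - 1]
    return np.inf

print([time_to_declare(s, seed=i) for i, s in enumerate([-0.25, -0.5, -1.0, -2.0])])
```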
Subjects
Glaucoma, Intraocular Pressure, Follow-Up Studies, Glaucoma/diagnosis, Humans, Vision Disorders/diagnosis, Vision Disorders/epidemiology, Visual Field Tests, Visual Fields
ABSTRACT
BACKGROUND: The etiology of atrial fibrillation (AF) is multifactorial and incompletely understood. OBJECTIVE: The purpose of this study was to evaluate the association between coronary artery disease (CAD) affecting atrial tissue and AF. METHODS: Patients from a single center with obstructive CAD during cardiac catheterization (January 1, 2007, through December 1, 2013) were included in a matched case-control analysis on the basis of the presence or absence of new-onset AF within 12 months of catheterization. Quantitative measurements of stenosis severity were performed for the sinoatrial nodal artery, atrioventricular (AV) nodal artery, and right intermediate atrial artery (RIAA), as well as the right coronary, left circumflex, and left anterior descending arteries proximal to the takeoff of each atrial-level artery. A multivariable logistic regression model identified factors associated with AF. RESULTS: Of 1794 patients, 115 (6%) developed AF within 1 year of catheterization. The matched cohort included 110 patients with and 110 patients without AF within 12 months of catheterization. Higher odds of AF at 1 year were associated with increasing lesion stenosis severity in the RIAA (odds ratio [OR] 1.41 per 10% increase in lesion severity above 50%; 95% confidence interval [CI] 1.01-1.97; P = .047) and AV nodal artery (OR 1.58 per 10% increase in lesion severity above 50%; 95% CI 1.00-2.49; P = .050). Odds of AF diagnosis during the year after catheterization increased with the number of atrial arteries with >50% lesions (OR 1.53 for each additional artery; 95% CI 1.08-2.15; P = .015). CONCLUSION: In patients with obstructive CAD, disease of the AV nodal artery and RIAA, as well as a higher burden of CAD within all arteries supplying blood flow to the atrial myocardium, was associated with higher odds of new-onset AF at 1 year.
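One hedged way to parameterize an odds ratio "per 10% increase in lesion severity above 50%" is sketched below: stenosis is recoded as max(0, (stenosis − 50)/10) and entered into a logistic model for 1-year AF. A plain logistic regression on toy data is shown for simplicity; the matched design in the study would ordinarily call for conditional logistic regression.

```python
# Sketch: OR per 10% of stenosis above 50%, via a recoded predictor in logistic regression.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 220
df = pd.DataFrame({"riaa_stenosis_pct": rng.uniform(0, 95, n)})
df["riaa_per10_above50"] = np.maximum(0, (df["riaa_stenosis_pct"] - 50) / 10)
lin = -1.0 + 0.35 * df["riaa_per10_above50"]                  # toy generating model
df["af_1yr"] = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

fit = smf.logit("af_1yr ~ riaa_per10_above50", data=df).fit(disp=False)
print(np.exp(fit.params["riaa_per10_above50"]))               # OR per 10% of stenosis above 50%
```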
Subjects
Atrial Fibrillation, Coronary Artery Disease, Coronary Stenosis, Atrial Fibrillation/complications, Atrial Fibrillation/etiology, Pathologic Constriction/complications, Coronary Angiography/adverse effects, Coronary Artery Disease/complications, Coronary Artery Disease/diagnosis, Coronary Stenosis/complications, Coronary Stenosis/diagnosis, Humans, Risk Factors
ABSTRACT
BACKGROUND/AIMS: To investigate racial differences in the variability of longitudinal visual field testing in a 'real-world' clinical population, evaluate how these differences are influenced by socioeconomic status, and estimate the impact of differences in variability on the time to detect visual field progression. METHODS: This retrospective observational cohort study used data from 1103 eyes from 751 White individuals and 428 eyes from 317 Black individuals. Linear regression was performed on the standard automated perimetry mean deviation values for each eye over time. The SD of the residuals from the trend lines was calculated and used as a measure of variability for each eye. The association of race with the SD of the residuals was evaluated using a multivariable generalised estimating equation model with an interaction between race and zip code income. Computer simulations were used to estimate the time to detect visual field progression in the two racial groups. RESULTS: Black patients had larger visual field variability over time compared with White patients, even when adjusting for zip code level socioeconomic variables (SD of residuals for Black patients = 1.53 dB (95% CI 1.43 to 1.64); for White patients = 1.26 dB (95% CI 1.14 to 1.22); mean difference: 0.28 (95% CI 0.15 to 0.41); p<0.001). The difference in visual field variability between Black and White patients was greater at lower levels of income and led to a delay in detection of glaucoma progression. CONCLUSION: Black patients had larger visual field variability compared with White patients. This relationship was strongly influenced by socioeconomic status and may partially explain racial disparities in glaucoma outcomes.
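The variability metric used above can be sketched in a few lines: fit a linear trend of MD on time for each eye and take the standard deviation of the residuals. The series below is a toy example; the group comparison in the study additionally used generalised estimating equations with a race-by-income interaction.

```python
# Sketch: per-eye visual field variability as the SD of residuals around a linear MD trend.
import numpy as np

def residual_sd(years, md_db):
    years = np.asarray(years, dtype=float)
    md_db = np.asarray(md_db, dtype=float)
    slope, intercept = np.polyfit(years, md_db, 1)
    residuals = md_db - (slope * years + intercept)
    return residuals.std(ddof=2)          # ddof=2 since two parameters were estimated

years = [0, 0.5, 1.1, 1.7, 2.3, 3.0, 3.6]
md_db = [-4.0, -4.6, -3.9, -5.0, -4.4, -5.2, -4.8]
print(residual_sd(years, md_db))
```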
Subjects
Glaucoma, Visual Fields, Disease Progression, Follow-Up Studies, Glaucoma/diagnosis, Humans, Intraocular Pressure, Retrospective Studies, Vision Disorders/diagnosis, Visual Field Tests
ABSTRACT
AIMS: To assess the impact of anxiety and depression on the risk of converting to glaucoma in a cohort of glaucoma suspects followed over time. METHODS: The study included a retrospective cohort of subjects with a diagnosis of glaucoma suspect at baseline, extracted from the Duke Glaucoma Registry. The presence of anxiety and depression was defined based on electronic health record billing codes, medical history, and problem lists. Univariable and multivariable Cox proportional hazards models were used to obtain HRs for the risk of converting to glaucoma over time. Multivariable models were adjusted for age, gender, race, intraocular pressure measurements over time, and disease severity at baseline. RESULTS: A total of 3259 glaucoma suspects followed for an average (SD) of 3.60 (2.05) years were included in our cohort, of whom 911 (28%) were diagnosed with glaucoma during follow-up. The prevalence of anxiety and depression was 32% and 33%, respectively. Diagnoses of anxiety, or of concomitant anxiety and depression, were significantly associated with the risk of converting to glaucoma over time, with adjusted HRs (95% CI) of 1.16 (1.01, 1.33) and 1.27 (1.07, 1.50), respectively. CONCLUSION: A history of anxiety, or of both anxiety and depression, in glaucoma suspects was associated with developing glaucoma during follow-up.
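A sketch of the survival analysis described, a Cox proportional hazards model for time from the baseline suspect diagnosis to glaucoma conversion with anxiety as the exposure, appears below. The data and covariate coding are hypothetical, and IOP over time is simplified to a single baseline value rather than the study's longitudinal adjustment.

```python
# Sketch: adjusted hazard ratio for anxiety from a Cox proportional hazards model (toy data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 3000
df = pd.DataFrame({
    "anxiety": rng.integers(0, 2, n),
    "age": rng.normal(62, 10, n),
    "mean_iop": rng.normal(16, 3, n),
})
hazard = 0.15 * df["anxiety"] + 0.02 * (df["age"] - 62)
df["years_to_event"] = rng.exponential(8 * np.exp(-hazard))
df["converted"] = (df["years_to_event"] < 5).astype(int)
df.loc[df["converted"] == 0, "years_to_event"] = 5.0          # administrative censoring at 5 years

cph = CoxPHFitter().fit(df, duration_col="years_to_event", event_col="converted")
print(np.exp(cph.params_["anxiety"]))                          # adjusted HR for anxiety
```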
Subjects
Anxiety/complications, Depression/complications, Glaucoma/diagnosis, Intraocular Pressure/physiology, Retinal Ganglion Cells/pathology, Optical Coherence Tomography/methods, Visual Fields/physiology, Disease Progression, Female, Follow-Up Studies, Glaucoma/etiology, Glaucoma/physiopathology, Humans, Male, Middle Aged, Retrospective Studies, Time Factors
ABSTRACT
PURPOSE: To investigate the relationship between the rate of retinal nerve fiber layer (RNFL) loss during initial follow-up and the magnitude of associated visual field loss during an extended follow-up period. DESIGN: Retrospective cohort study. METHODS: A total of 1,150 eyes of 839 glaucoma patients were included from the Duke Glaucoma Registry. Rates of RNFL loss were obtained from global RNFL thickness values of the first 5 optical coherence tomography (OCT) scans. Rates of visual field loss were assessed using standard automated perimetry mean deviation (SAP MD) during the entire follow-up period. Joint longitudinal mixed effects models were used to estimate rates of change. Eyes were categorized as fast, moderate, or slow progressors based on rates of RNFL loss, with cutoffs of ≤-2 µm/year, -2 to -1 µm/year, and ≥-1 µm/year, respectively. Univariable and multivariable regressions were completed to identify significant predictors of SAP MD loss. RESULTS: The rate of RNFL change was -0.76±0.85 µm/year during initial follow-up, which occurred over 3.7±1.5 years. 765 (66%) eyes were slow, 328 (29%) moderate, and 57 (5%) fast progressors, with rates of RNFL thinning of -0.36±0.54 µm/year, -1.34±0.25 µm/year, and -2.87±1.39 µm/year, respectively. The rates of SAP MD loss among slow, moderate, and fast OCT progressors were -0.16±0.35 dB/year, -0.32±0.43 dB/year, and -0.71±0.65 dB/year, respectively, over the extended follow-up period of 6.1±1.9 years (P<0.001). Age, OCT progressor group, and concurrent SAP rate were all significantly associated with the overall rate of SAP MD loss in a multivariable model (all P<0.001). CONCLUSION: Rapid RNFL thinning during an initial follow-up period was predictive of concurrent and subsequent rates of visual field decline over an extended period.