Results 1 - 9 of 9
1.
PLoS Comput Biol ; 20(5): e1011200, 2024 May.
Article in English | MEDLINE | ID: mdl-38709852

ABSTRACT

During the COVID-19 pandemic, forecasting COVID-19 trends to support planning and response was a priority for scientists and decision makers alike. In the United States, COVID-19 forecasting was coordinated by a large group of universities, companies, and government entities led by the Centers for Disease Control and Prevention and the US COVID-19 Forecast Hub (https://covid19forecasthub.org). We evaluated approximately 9.7 million forecasts of weekly state-level COVID-19 cases for predictions 1-4 weeks into the future submitted by 24 teams from August 2020 to December 2021. We assessed coverage of central prediction intervals and weighted interval scores (WIS), adjusting for missing forecasts relative to a baseline forecast, and used a Gaussian generalized estimating equation (GEE) model to evaluate differences in skill across epidemic phases that were defined by the effective reproduction number. Overall, we found high variation in skill across individual models, with ensemble-based forecasts outperforming other approaches. Forecast skill relative to the baseline was generally higher for larger jurisdictions (e.g., states compared to counties). Over time, forecasts generally performed worst in periods of rapid changes in reported cases (either in increasing or decreasing epidemic phases), with 95% prediction interval coverage dropping below 50% during the growth phases of the winter 2020, Delta, and Omicron waves. Ideally, case forecasts could serve as a leading indicator of changes in transmission dynamics. However, while most COVID-19 case forecasts outperformed a naïve baseline model, even the most accurate case forecasts were unreliable in key phases. Further research could improve forecasts of leading indicators, like COVID-19 cases, by leveraging additional real-time data, addressing performance across phases, improving the characterization of forecast confidence, and ensuring that forecasts are coherent across spatial scales. In the meantime, it is critical for forecast users to appreciate current limitations and use a broad set of indicators to inform pandemic-related decision making.
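The evaluation above hinges on the weighted interval score (WIS), which penalizes both wide prediction intervals and observations that fall outside them. The Python sketch below computes the WIS for a single forecast under the standard Forecast Hub definition; the interval levels and numeric values are illustrative assumptions, not data from the study.

```python
# Minimal sketch of the weighted interval score (WIS) for one forecast.
# Interval levels and numbers below are illustrative assumptions only.

def interval_score(lower, upper, alpha, y):
    """Interval score for a central (1 - alpha) prediction interval."""
    score = upper - lower                      # sharpness term
    if y < lower:
        score += (2.0 / alpha) * (lower - y)   # penalty for undershooting
    elif y > upper:
        score += (2.0 / alpha) * (y - upper)   # penalty for overshooting
    return score

def weighted_interval_score(median, intervals, y):
    """intervals: dict mapping alpha -> (lower, upper) for each central PI."""
    K = len(intervals)
    total = 0.5 * abs(y - median)              # median term, weight 1/2
    for alpha, (lo, hi) in intervals.items():
        total += (alpha / 2.0) * interval_score(lo, hi, alpha, y)
    return total / (K + 0.5)

if __name__ == "__main__":
    # Hypothetical 1-week-ahead case forecast for one state.
    median = 1200.0
    intervals = {0.5: (900.0, 1500.0),   # 50% PI
                 0.1: (600.0, 2100.0)}   # 90% PI
    observed = 1750.0
    print(f"WIS = {weighted_interval_score(median, intervals, observed):.1f}")
```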


Subject(s)
COVID-19, Forecasting, Pandemics, SARS-CoV-2, COVID-19/epidemiology, COVID-19/transmission, Humans, Forecasting/methods, United States/epidemiology, Pandemics/statistics & numerical data, Computational Biology, Statistical Models
2.
Clin Gastroenterol Hepatol ; 20(2): 409-418.e5, 2022 02.
Article in English | MEDLINE | ID: mdl-33279780

ABSTRACT

BACKGROUND & AIMS: Early liver transplantation (LT) for alcoholic hepatitis (AH) is lifesaving, but concerns regarding return to harmful alcohol use remain. We sought to identify distinct patterns of alcohol use post-LT to inform pre-LT candidate selection and post-LT addiction care. METHODS: Detailed post-LT alcohol use data were gathered retrospectively from consecutive patients with severe AH at 11 ACCELERATE-AH sites from 2006 to 2018. Latent class analysis identified longitudinal patterns of alcohol use post-LT. Logistic and Cox regression evaluated associations of alcohol use patterns with pre-LT variables and with post-LT survival. A microsimulation model estimated the effect of selection criteria on overall outcomes. RESULTS: Among 153 LT recipients, 1-, 3-, and 5-year survival rates were 95%, 88%, and 82%, respectively. Of 146 LT recipients surviving to home discharge, 4 distinct longitudinal patterns of post-LT alcohol use were identified: pattern 1 [abstinent] (n = 103; 71%), pattern 2 [late/non-heavy] (n = 9; 6.2%), pattern 3 [early/non-heavy] (n = 22; 15%), and pattern 4 [early/heavy] (n = 12; 8.2%). One-year survival was similar among the 4 patterns (100%), but patients with early post-LT alcohol use had lower 5-year survival (62% and 53%) compared with the abstinent and late/non-heavy patterns (95% and 100%). Early alcohol use patterns were associated with younger age, multiple prior rehabilitation attempts, and overt encephalopathy. In simulation models, the pattern of post-LT alcohol use changed the average life expectancy after early LT for AH. CONCLUSIONS: A significant majority of LT recipients for AH maintain longer-term abstinence, but there are distinct patterns of alcohol use associated with higher risk of 3- and 5-year mortality. Pre-LT characteristics are associated with post-LT alcohol use patterns and may inform candidate selection and post-LT addiction care.
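As a rough illustration of the survival analysis described above, the sketch below fits a Cox proportional-hazards model (via the lifelines package) relating a post-LT alcohol-use pattern indicator to survival. The data are synthetic and the covariates are hypothetical stand-ins; nothing here reproduces the ACCELERATE-AH cohort or the study's model specification.

```python
# A minimal Cox proportional-hazards sketch on synthetic, right-censored data.
# The pattern indicator, covariate effects, and rates are illustrative assumptions.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 150
early_heavy = rng.binomial(1, 0.1, size=n)            # hypothetical pattern indicator
age = rng.normal(45, 10, size=n)                      # hypothetical covariate
baseline_hazard = 0.03                                # assumed yearly baseline hazard
hazard = baseline_hazard * np.exp(1.2 * early_heavy - 0.01 * (age - 45))
time_to_death = rng.exponential(1.0 / hazard)
follow_up = rng.uniform(1, 6, size=n)                 # administrative censoring (years)

df = pd.DataFrame({
    "duration": np.minimum(time_to_death, follow_up),
    "event": (time_to_death <= follow_up).astype(int),
    "early_heavy_use": early_heavy,
    "age": age,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="event")
cph.print_summary()  # hazard ratio for early/heavy use should exceed 1 by construction
```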


Subject(s)
Alcoholic Hepatitis, Liver Transplantation, Alcohol Abstinence, Alcohol Drinking/adverse effects, Alcohol Drinking/epidemiology, Alcoholic Hepatitis/surgery, Humans, Liver Transplantation/adverse effects, Recurrence, Retrospective Studies
3.
J Biomed Inform ; 123: 103895, 2021 11.
Article in English | MEDLINE | ID: mdl-34450286

ABSTRACT

BACKGROUND: The progression of many degenerative diseases is tracked periodically using scales evaluating functionality in daily activities. Although estimating the timing of critical events (i.e., disease tollgates) during degenerative disease progression is desirable, the necessary data may not be readily available in scale records. Further, analysis of disease progression poses data challenges, such as censoring and misclassification errors, which need to be addressed to provide meaningful research findings and inform patients. METHODS: We developed a novel binary classification approach to map scale scores into disease tollgates and describe disease progression, leveraging standard and modified Kaplan-Meier analyses. The approach is demonstrated by estimating progression pathways in amyotrophic lateral sclerosis (ALS). The Tollgate-based ALS Staging System (TASS) specifies the critical events (i.e., tollgates) in ALS progression. We first developed a binary classification model predicting whether each TASS tollgate was passed given the itemized ALSFRS-R scores, using data from 514 ALS patients at Mayo Clinic-Rochester. Then, we used the binary classification model to map the ALSFRS-R data of 3,264 patients from the PRO-ACT database into TASS. We derived the time trajectories of ALS progression through tollgates from the augmented PRO-ACT data using Kaplan-Meier analyses. The effects of misclassification errors, condition-dependent dropouts, and censored data on trajectory estimates were evaluated with interval-censored Kaplan-Meier analysis and a multistate model for panel data. RESULTS: The approach using Mayo Clinic data accurately estimated tollgate-passed states of patients given their itemized ALSFRS-R scores (AUCs > 0.90). The tollgate time trajectories derived from the augmented PRO-ACT dataset provide valuable insights; we predicted that the majority of ALS patients would have modified arm function (67%) and require assistive devices for walking (53%) by the second year after ALS onset. By the third year, most (74%) ALS patients would occasionally use a wheelchair, while 48% of ALS patients would be wheelchair-dependent by the fourth year. Assistive speech devices and feeding tubes were needed in 49% and 30% of the patients by the third year after ALS onset, respectively. The body region of onset shifts some tollgate passage time estimates by 1-2 years. CONCLUSIONS: The estimated tollgate-based time trajectories inform patients and clinicians about prospective assistive device needs and life changes. More research is needed to personalize these estimates according to prognostic factors. Further, the approach can be applied to the progression of other diseases.
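A minimal sketch of the core step, mapping itemized ALSFRS-R scores to a binary "tollgate passed" label and checking discrimination with AUC, is shown below. The feature matrix and labels are simulated stand-ins (the item weights are invented for the example), not Mayo Clinic or PRO-ACT data, and the study's actual classifier may differ.

```python
# A toy binary classifier for "tollgate passed" from itemized scale scores.
# Features, labels, and item weights are synthetic, illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n_patients = 500
X = rng.integers(0, 5, size=(n_patients, 12)).astype(float)  # 12 ALSFRS-R items, scored 0-4

# Hypothetical ground truth: lower scores on two "arm" items (columns 4-5) make
# the arm-function tollgate more likely to have been passed.
logit = 3.0 - 0.8 * X[:, 4] - 0.8 * X[:, 5] + rng.normal(0, 0.5, n_patients)
y = (logit > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
print(f"Tollgate classifier AUC: {auc:.2f}")
```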


Subject(s)
Amyotrophic Lateral Sclerosis, Disease Progression, Humans, Prospective Studies, Speech, Walking
4.
Gastroenterology ; 157(2): 472-480.e5, 2019 08.
Article in English | MEDLINE | ID: mdl-30998988

ABSTRACT

BACKGROUND & AIMS: Early liver transplantation (without requiring a minimum period of sobriety) for severe alcohol-associated hepatitis (AH) is controversial: many centers delay eligibility until a specific period of sobriety (such as 6 months) has been achieved. To inform ongoing debate and policy, we modeled long-term outcomes of early vs delayed liver transplantation for patients with AH. METHODS: We developed a mathematical model to simulate early vs delayed liver transplantation for patients with severe AH and different amounts of alcohol use after transplantation: abstinence, slip (alcohol use followed by sobriety), or sustained use. Mortality of patients before transplantation was determined by a joint-effect model (based on Model for End-Stage Liver Disease [MELD] and Lille scores). We estimated life expectancies of patients receiving early vs delayed transplantation (a 6-month wait before placement on the waitlist) and life years lost attributable to alcohol use after receiving the liver transplant. RESULTS: Patients offered early liver transplantation were estimated to have an average life expectancy of 6.55 life years, compared with an average life expectancy of 1.46 life years for patients offered delayed liver transplantation (a 4.49-fold increase). The net increase in life expectancy from offering early transplantation was highest for patients with Lille scores of 0.50-0.82 and MELD scores of 32 or more. Patients who were offered early transplantation and had no alcohol use afterward were predicted to survive 10.85 years compared with 3.62 years for patients with sustained alcohol use after transplantation (7.23 life years lost). Compared with delayed transplantation, early liver transplantation increased survival times in all simulated scenarios and combinations of Lille and MELD scores. CONCLUSIONS: In a modeling study that assumed carefully selected patients with AH, early liver transplantation increased survival times compared with delayed liver transplantation (which requires 6 months of abstinence from alcohol before transplantation), regardless of the estimated risk of sustained alcohol use after transplantation. These findings support early liver transplantation for patients with severe AH. The net increase in life expectancy was maintained in all simulated extreme scenarios but should be confirmed in prospective studies. Sustained alcohol use after transplantation significantly reduced but did not eliminate the benefits of early transplantation. Strategies are needed to prevent and treat posttransplantation use of alcohol.
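The comparison above rests on simulating the same candidates down an early versus a delayed pathway. The sketch below is a deliberately crude Monte Carlo version of that idea: delayed candidates face six months of pretransplant mortality before listing. The monthly mortality rates, horizon, and cohort size are illustrative assumptions, not the joint-effect (MELD/Lille) model used in the study.

```python
# Crude Monte Carlo sketch of early vs delayed transplant life expectancy.
# All rates below are illustrative assumptions, not study inputs.
import random

random.seed(1)
MONTHLY_PRE_LT_DEATH = 0.20      # assumed monthly mortality with severe AH, untransplanted
MONTHLY_POST_LT_DEATH = 0.004    # assumed monthly mortality after transplant
HORIZON_MONTHS = 240             # 20-year horizon

def simulate_life_years(delay_months, n=20_000):
    total_months = 0
    for _ in range(n):
        alive = True
        months = 0
        # Pretransplant waiting period (0 months for early transplant).
        for _ in range(delay_months):
            if random.random() < MONTHLY_PRE_LT_DEATH:
                alive = False
                break
            months += 1
        # Posttransplant survival, truncated at the horizon.
        if alive:
            for _ in range(HORIZON_MONTHS - months):
                if random.random() < MONTHLY_POST_LT_DEATH:
                    break
                months += 1
        total_months += months
    return total_months / n / 12.0

print(f"Early LT:   {simulate_life_years(delay_months=0):.2f} life-years")
print(f"Delayed LT: {simulate_life_years(delay_months=6):.2f} life-years")
```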


Subject(s)
End-Stage Liver Disease/surgery, Alcoholic Hepatitis/surgery, Liver Transplantation/methods, Biological Models, Time-to-Treatment, Adult, Alcohol Abstinence, Alcohol Drinking/adverse effects, Alcohol Drinking/prevention & control, End-Stage Liver Disease/diagnosis, End-Stage Liver Disease/etiology, End-Stage Liver Disease/mortality, Female, Alcoholic Hepatitis/complications, Alcoholic Hepatitis/diagnosis, Alcoholic Hepatitis/mortality, Humans, Life Expectancy, Liver Transplantation/standards, Male, Middle Aged, Prospective Studies, Risk Assessment/methods, Severity of Illness Index, Survival Analysis, Time Factors, Treatment Outcome, Waiting Lists
5.
JAMA Health Forum ; 3(4): e220760, 2022 04.
Article in English | MEDLINE | ID: mdl-35977324

ABSTRACT

Importance: A key question for policy makers and the public is what to expect from the COVID-19 pandemic going forward as states lift nonpharmacologic interventions (NPIs), such as indoor mask mandates, to prevent COVID-19 transmission. Objective: To project COVID-19 deaths between March 1, 2022, and December 31, 2022, in each of the 50 US states, District of Columbia, and Puerto Rico, assuming different dates of lifting of mask mandates and NPIs. Design, Setting, and Participants: This simulation modeling study used the COVID-19 Policy Simulator compartmental model to project COVID-19 deaths from March 1, 2022, to December 31, 2022, using simulated populations in the 50 US states, District of Columbia, and Puerto Rico. The model projected current epidemiologic trends for each state through December 31, 2022, assuming that the current pace of vaccination is maintained and modeling different dates of lifting NPIs. Exposures: Date of lifting statewide NPI mandates as March 1, April 1, May 1, June 1, or July 1, 2022. Main Outcomes and Measures: Projected COVID-19 incident deaths from March to December 2022. Results: With the high transmissibility of currently circulating SARS-CoV-2 variants, the simulated lifting of NPIs in March 2022 was associated with resurgences of COVID-19 deaths in nearly every state. In comparison, delaying the lifting of NPIs by even 1 month, to April 2022, was estimated to mitigate the amplitude of the surge. For most states, however, no amount of delay was estimated to be sufficient to prevent a surge in deaths completely. The primary factor associated with recurrent epidemics in the simulation was the assumed high effective reproduction number of unmitigated viral transmission. With a lower level of transmissibility, similar to that of the ancestral strains, the model estimated that most states could remove NPIs in March 2022 and likely not see recurrent surges. Conclusions and Relevance: This simulation study estimated that the SARS-CoV-2 virus would likely continue to take a major toll in the US, even as cases continued to decrease. Because of the high transmissibility of the recent Delta and Omicron variants, premature lifting of NPIs could pose a substantial threat of rebounding surges in morbidity and mortality. At the same time, continued delay in lifting NPIs may not prevent future surges.
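To make the mechanism concrete, the sketch below runs a toy SIR-type compartmental model in which transmission is partially suppressed until a chosen NPI-lifting day and then returns to its unmitigated level. All parameters (R0, NPI effect, population, initial conditions, infection fatality ratio) are illustrative assumptions and do not reflect the COVID-19 Policy Simulator's structure or calibration.

```python
# Toy SIR model: transmission is reduced by NPIs until lift_day, then unmitigated.
# Parameters are illustrative assumptions only.
import numpy as np

def project_deaths(lift_day, days=300, N=1_000_000, r0=8.0, npi_multiplier=0.5,
                   gamma=1 / 7, ifr=0.003):
    S, I, deaths = N - 1000.0, 1000.0, 0.0
    for day in range(days):
        beta = r0 * gamma * (npi_multiplier if day < lift_day else 1.0)
        new_infections = min(beta * S * I / N, S)   # daily Euler step, capped at S
        recoveries = gamma * I
        S -= new_infections
        I += new_infections - recoveries
        deaths += ifr * recoveries                  # deaths as a fixed fraction of resolved cases
    return deaths

for lift_day in (0, 30, 60, 120):
    print(f"Lift NPIs on day {lift_day:3d}: ~{project_deaths(lift_day):,.0f} cumulative deaths")
```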


Subject(s)
COVID-19, SARS-CoV-2, Basic Reproduction Number, COVID-19/epidemiology, Humans, Pandemics/prevention & control
6.
JAMA Netw Open ; 5(9): e2230426, 2022 09 01.
Article in English | MEDLINE | ID: mdl-36098969

ABSTRACT

Importance: Quantitative assessment of disease progression in patients with nonalcoholic fatty liver disease (NAFLD) has not been systematically examined using competing liver-related and non-liver-related mortality. Objective: To estimate long-term outcomes in NAFLD, accounting for competing liver-related and non-liver-related mortality associated with the different fibrosis stages of NAFLD, using a simulated patient population. Design, Setting, and Participants: This decision analytical modeling study used individual-level state-transition simulation analysis and was conducted from September 1, 2017, to September 1, 2021. A publicly available interactive tool, the NAFLD Simulator, was developed to simulate the natural history of NAFLD by age and fibrosis stage at the time of (hypothetical) diagnosis defined by liver biopsy. Model health states were defined by fibrosis states F0 to F4, decompensated cirrhosis, hepatocellular carcinoma (HCC), and liver transplant. Simulated patients could experience nonalcoholic steatohepatitis resolution, and their fibrosis stage could progress or regress. Transition probabilities between states were estimated from the literature and through calibration, and the model reproduced the outcomes of a large observational study. Exposure: Simulated natural history of NAFLD. Main Outcomes and Measures: Main outcomes were life expectancy; all-cause, liver-related, and non-liver-related mortality; and cumulative incidence of decompensated cirrhosis and/or HCC. Results: The model included 1 000 000 simulated patients with a mean (range) age of 49 (18-75) years at baseline, including 66% women. The life expectancy of patients aged 49 years was 25.3 (95% CI, 20.1-29.8) years for those with F0, 25.1 (95% CI, 20.1-29.4) years for those with F1, 23.6 (95% CI, 18.3-28.2) years for those with F2, 21.1 (95% CI, 15.6-26.3) years for those with F3, and 13.8 (95% CI, 10.3-17.6) years for those with F4 at the time of diagnosis. The estimated 10-year liver-related mortality was 0.1% (95% uncertainty interval [UI], <0.1%-0.2%) in F0, 0.2% (95% UI, 0.1%-0.4%) in F1, 1.0% (95% UI, 0.6%-1.7%) in F2, 4.0% (95% UI, 2.5%-5.9%) in F3, and 29.3% (95% UI, 21.8%-35.9%) in F4. The corresponding 10-year non-liver-related mortality was 1.8% (95% UI, 0.6%-5.0%) in F0, 2.4% (95% UI, 0.8%-6.3%) in F1, 5.2% (95% UI, 2.0%-11.9%) in F2, 9.7% (95% UI, 4.3%-18.1%) in F3, and 15.6% (95% UI, 10.1%-21.7%) in F4. Among patients aged 65 years, estimated 10-year non-liver-related mortality was higher than liver-related mortality in all fibrosis stages (eg, F2: 16.7% vs 0.8%; F3: 28.8% vs 3.0%; F4: 40.8% vs 21.9%). Conclusions and Relevance: This decision analytical modeling study simulated stage-specific long-term outcomes, including liver- and non-liver-related mortality, in patients with NAFLD. Depending on age and fibrosis stage, non-liver-related mortality was higher than liver-related mortality in patients with NAFLD. By translating surrogate markers into clinical outcomes, the NAFLD Simulator could be used as an educational tool among patients and clinicians to increase awareness of the health consequences of NAFLD.
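The sketch below illustrates the general shape of such a state-transition model: an annual-cycle Markov chain over fibrosis stages F0-F4 with competing liver-related and non-liver-related death, from which 10-year mortality by starting stage can be read off. The transition probabilities are invented for the example and are not the NAFLD Simulator's calibrated inputs.

```python
# Annual-cycle Markov chain over fibrosis stages with competing causes of death.
# All transition probabilities are illustrative assumptions for this example.
import numpy as np

# States: F0, F1, F2, F3, F4, liver death, non-liver death (both deaths absorbing)
states = ["F0", "F1", "F2", "F3", "F4", "LiverDeath", "OtherDeath"]
P = np.array([
    # F0     F1     F2     F3     F4     LD     OD
    [0.908, 0.090, 0.000, 0.000, 0.000, 0.000, 0.002],  # F0
    [0.000, 0.905, 0.090, 0.000, 0.000, 0.001, 0.004],  # F1
    [0.000, 0.000, 0.908, 0.085, 0.000, 0.002, 0.005],  # F2
    [0.000, 0.000, 0.000, 0.905, 0.080, 0.005, 0.010],  # F3
    [0.000, 0.000, 0.000, 0.000, 0.930, 0.050, 0.020],  # F4
    [0.000, 0.000, 0.000, 0.000, 0.000, 1.000, 0.000],  # LiverDeath
    [0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 1.000],  # OtherDeath
])
assert np.allclose(P.sum(axis=1), 1.0)

def ten_year_mortality(start_stage):
    dist = np.zeros(len(states))
    dist[states.index(start_stage)] = 1.0
    dist = dist @ np.linalg.matrix_power(P, 10)   # 10 annual cycles
    return dist[states.index("LiverDeath")], dist[states.index("OtherDeath")]

for stage in ["F0", "F2", "F4"]:
    liver, other = ten_year_mortality(stage)
    print(f"{stage}: 10-year liver-related {liver:.1%}, non-liver-related {other:.1%}")
```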


Subject(s)
Hepatocellular Carcinoma, Liver Neoplasms, Non-alcoholic Fatty Liver Disease, Hepatocellular Carcinoma/complications, Female, Fibrosis, Humans, Liver Cirrhosis/epidemiology, Liver Cirrhosis/etiology, Liver Neoplasms/epidemiology, Male, Non-alcoholic Fatty Liver Disease/complications, Non-alcoholic Fatty Liver Disease/epidemiology
7.
J Neurol ; 266(3): 755-765, 2019 Mar.
Article in English | MEDLINE | ID: mdl-30684209

ABSTRACT

OBJECTIVE: To capture ALS progression in arm, leg, speech, swallowing, and breathing segments using a disease-specific staging system, namely the Tollgate-based ALS Staging System (TASS), where tollgates refer to a set of critical clinical events such as having slight weakness in the arms, needing a wheelchair, and needing a feeding tube. METHODS: We compiled a longitudinal dataset from medical records, including free-text clinical notes, of 514 ALS patients from Mayo Clinic, Rochester, MN. We derived tollgate-based progression pathways of patients over a 1-year period starting from the first clinic visit. We conducted Kaplan-Meier analyses to estimate the probability of passing each tollgate over time for each functional segment. RESULTS: At their first clinic visit, 93%, 77%, and 60% of patients displayed some level of limb, bulbar, and breathing weakness, respectively. The proportion of patients at milder tollgate levels (tollgate level < 2) was smaller for the arm and leg segments (38% and 46%, respectively) compared with other segments (> 65%). Patients showed non-uniform TASS pathways; i.e., the likelihood of passing a tollgate differed based on the segments affected at the initial visit. For instance, stratified by impaired segments at the initial visit, patients with limb and breathing impairment were more likely (62%) to use a bi-level positive airway pressure device within a year compared with those with bulbar and breathing impairment (26%). CONCLUSION: Using TASS, clinicians can inform ALS patients about their individualized likelihood of critical disabilities and assistive-device needs (e.g., becoming dependent on a wheelchair or ventilation, or needing a walker, wheelchair, or communication devices), and help them better prepare for the future.
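A minimal version of the Kaplan-Meier step, estimating the probability that a given tollgate has been passed by a given time from right-censored observations, is sketched below using the lifelines package. The durations and follow-up times are synthetic stand-ins, not patient data.

```python
# Kaplan-Meier sketch for the cumulative probability of passing one tollgate.
# Times below are synthetic, illustrative assumptions.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(7)
n = 200
months_to_tollgate = rng.exponential(scale=18.0, size=n)   # hypothetical time to "needs wheelchair"
follow_up = rng.uniform(3, 36, size=n)                     # hypothetical follow-up (months)

durations = np.minimum(months_to_tollgate, follow_up)
observed = (months_to_tollgate <= follow_up).astype(int)   # 1 = tollgate passed, 0 = censored

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=observed)
for t in (6, 12, 24):
    passed = 1.0 - kmf.predict(t)  # cumulative probability the tollgate was passed by month t
    print(f"P(tollgate passed by month {t}): {passed:.0%}")
```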


Subject(s)
Amyotrophic Lateral Sclerosis/diagnosis, Amyotrophic Lateral Sclerosis/physiopathology, Disease Progression, Severity of Illness Index, Adult, Aged, Aged 80 and over, Female, Humans, Longitudinal Studies, Male, Medical Records, Middle Aged, Prognosis, Young Adult
8.
Sci Rep ; 9(1): 16849, 2019 11 14.
Article in English | MEDLINE | ID: mdl-31727921

ABSTRACT

Hepatitis C virus (HCV) is 15 times more prevalent among persons in Spain's prisons than in the community. Recently, Spain initiated a pilot program, JAILFREE-C, to treat HCV in prisons using direct-acting antivirals (DAAs). Our aim was to identify a cost-effective strategy to scale up HCV treatment in all prisons. Using a validated agent-based model, we simulated the HCV landscape in Spain's prisons considering disease transmission, screening, treatment, and prison-community dynamics. Costs and disease outcomes under the status quo were compared with strategies to scale up treatment in prisons considering prioritization (HCV fibrosis stage vs. HCV prevalence of prisons), treatment capacity (2,000/year vs. unlimited), and treatment initiation based on sentence length (>6 months vs. any). Scaling up treatment by treating all incarcerated persons irrespective of their sentence length provided the maximum health benefits, preventing 10,200 new cases of HCV and 8,300 HCV-related deaths between 2019 and 2050; 90% of the prevented deaths would have occurred in the community. Compared with the status quo, this strategy increased quality-adjusted life years (QALYs) by 69,700 and costs by €670 million, yielding an incremental cost-effectiveness ratio of €9,600/QALY. Scaling up HCV treatment with DAAs for the entire Spanish prison population, irrespective of sentence length, is cost-effective and would reduce the HCV burden.
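The headline cost-effectiveness figure follows directly from the incremental cost-effectiveness ratio (ICER) arithmetic, sketched below with the abstract's own totals; the willingness-to-pay threshold is an illustrative assumption.

```python
# ICER arithmetic: incremental cost divided by incremental QALYs gained.
# Totals are taken from the abstract; the threshold is an assumption for illustration.
incremental_cost_eur = 670e6        # additional cost of the treat-all strategy vs status quo
incremental_qalys = 69_700          # additional QALYs gained

icer = incremental_cost_eur / incremental_qalys
print(f"ICER: about {icer:,.0f} EUR per QALY gained")   # about 9,600 EUR/QALY, as reported

willingness_to_pay = 25_000         # assumed threshold (EUR/QALY), illustrative only
print("Cost-effective at the assumed threshold:", icer < willingness_to_pay)
```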


Subject(s)
Antiviral Agents/economics, Cost-Benefit Analysis, Hepacivirus/drug effects, Chronic Hepatitis C/economics, Chronic Hepatitis C/epidemiology, Prisoners, Adult, Antiviral Agents/therapeutic use, Female, Health Care Costs/statistics & numerical data, Hepacivirus/growth & development, Hepacivirus/pathogenicity, Chronic Hepatitis C/drug therapy, Chronic Hepatitis C/transmission, Humans, Male, Middle Aged, Statistical Models, Prevalence, Prisons, Quality-Adjusted Life Years, Spain/epidemiology
9.
PLoS One ; 12(2): e0172261, 2017.
Article in English | MEDLINE | ID: mdl-28222123

ABSTRACT

Individuals are prioritized based on their risk profiles when allocating limited vaccine stocks during an influenza pandemic. Computationally expensive but realistic agent-based simulations and fast but stylized compartmental models are typically used to derive effective vaccine allocation strategies. A detailed comparison of these two approaches, however, is often omitted. We derive age-specific vaccine allocation strategies to mitigate a pandemic influenza outbreak in Seattle by applying derivative-free optimization to an agent-based simulation and also to a compartmental model. We compare the strategies derived by these two approaches under various infection aggressiveness and vaccine coverage scenarios. We observe that both approaches primarily vaccinate school children; however, they may allocate the remaining vaccines in different ways. The vaccine allocation strategies derived by using the agent-based simulation are associated with up to a 70% decrease in total cost and a 34% reduction in the number of infections compared with the strategies derived by using the compartmental model. Nevertheless, the latter approach may still be competitive for very low and/or very high infection aggressiveness. Our results provide insights about potential differences between the vaccine allocation strategies derived by using agent-based simulations and those derived by using compartmental models.
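The sketch below shows the general recipe in miniature: a derivative-free optimizer searching over an allocation of limited vaccine doses between two age groups in a stylized compartmental model. The population sizes, contact matrix, and epidemiological parameters are illustrative assumptions, and the study's agent-based simulation and compartmental model are far more detailed.

```python
# Derivative-free optimization of a vaccine split between two groups in a toy
# two-group SIR-type model. All parameters are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize_scalar

POP = np.array([250_000.0, 750_000.0])      # children, adults (assumed sizes)
CONTACTS = np.array([[18.0, 6.0],           # assumed daily contact matrix
                     [2.0, 10.0]])
DOSES = 300_000                             # assumed vaccine supply
GAMMA, Q = 1 / 4, 0.025                     # recovery rate, per-contact transmission probability

def total_infections(frac_to_children, days=200):
    doses = np.array([frac_to_children, 1 - frac_to_children]) * DOSES
    I = np.array([10.0, 10.0])                      # initial infectious seed per group
    S = np.maximum(POP - doses - I, 0.0)            # vaccinated people start immune
    cum = 0.0
    for _ in range(days):
        force = Q * CONTACTS @ (I / POP)            # force of infection per group
        new_inf = np.minimum(force * S, S)          # daily Euler step, capped at S
        S = S - new_inf
        I = I + new_inf - GAMMA * I
        cum += new_inf.sum()
    return cum

res = minimize_scalar(total_infections, bounds=(0.0, 1.0), method="bounded")
print(f"Optimal fraction of doses to children: {res.x:.2f}")
print(f"Projected infections: {res.fun:,.0f}")
```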


Subject(s)
Computer Simulation, Influenza Vaccines/supply & distribution, Human Influenza/prevention & control, Theoretical Models, Pandemics/prevention & control, Resource Allocation, Systems Analysis, Adolescent, Adult, Age Factors, Aged, Child, Child Preschool, Infectious Disease Transmission/statistics & numerical data, Humans, Infant, Human Influenza/epidemiology, Human Influenza/transmission, Middle Aged, Risk, Time Factors, Urban Population, Washington, Young Adult