Results 1 - 20 of 99
1.
J Math Biol ; 89(2): 21, 2024 Jun 26.
Article in English | MEDLINE | ID: mdl-38926228

ABSTRACT

For some communicable endemic diseases (e.g., influenza, COVID-19), vaccination is an effective means of preventing the spread of infection and reducing mortality, but must be augmented over time with vaccine booster doses. We consider the problem of optimally allocating a limited supply of vaccines over time between different subgroups of a population and between initial versus booster vaccine doses, allowing for multiple booster doses. We first consider an SIS model with interacting population groups and four different objectives: those of minimizing cumulative infections, deaths, life years lost, or quality-adjusted life years lost due to death. We solve the problem sequentially: for each time period, we approximate the system dynamics using Taylor series expansions, and reduce the problem to a piecewise linear convex optimization problem for which we derive intuitive closed-form solutions. We then extend the analysis to the case of an SEIS model. In both cases vaccines are allocated to groups based on their priority order until the vaccine supply is exhausted. Numerical simulations show that our analytical solutions achieve results that are close to optimal with objective function values significantly better than would be obtained using simple allocation rules such as allocation proportional to population group size. In addition to being accurate and interpretable, the solutions are easy to implement in practice. Interpretable models are particularly important in public health decision making.
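The priority-order allocation the abstract describes can be sketched in a few lines. This is a hedged illustration of a discrete-time multi-group SIS step with vaccination, not the authors' Taylor-series solution method; the function name, parameters, and numbers are invented for the example.

```python
def sis_step_with_vaccination(S, I, beta, gamma, supply, priority):
    """One discrete step of a multi-group SIS model with vaccination:
    doses go to groups in priority order until the supply runs out."""
    V = [0.0] * len(S)
    for g in priority:
        V[g] = min(S[g], supply)
        supply -= V[g]
    force = beta * sum(I) / (sum(S) + sum(I))   # homogeneous mixing
    S_next, I_next = [], []
    for g in range(len(S)):
        new_inf = force * (S[g] - V[g])
        S_next.append(S[g] - V[g] - new_inf + gamma * I[g])
        I_next.append(I[g] + new_inf - gamma * I[g])
    return S_next, I_next, V

# With transmission and recovery switched off, the step is pure allocation:
S_next, I_next, V = sis_step_with_vaccination(
    [100.0, 100.0], [10.0, 10.0], beta=0.0, gamma=0.0,
    supply=120.0, priority=[0, 1])
```

With 120 doses and priority order [0, 1], all 100 susceptibles in group 0 are vaccinated and the remaining 20 doses go to group 1 — the "fill by priority until the supply is exhausted" structure of the closed-form solutions.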


Subjects
COVID-19 , Computer Simulation , Endemic Diseases , Immunization, Secondary , Mathematical Concepts , Vaccination , Humans , Immunization, Secondary/statistics & numerical data , Endemic Diseases/prevention & control , Endemic Diseases/statistics & numerical data , COVID-19/prevention & control , COVID-19/epidemiology , Vaccination/statistics & numerical data , COVID-19 Vaccines/administration & dosage , COVID-19 Vaccines/supply & distribution , Models, Biological , Influenza, Human/prevention & control , SARS-CoV-2/immunology , Quality-Adjusted Life Years , Influenza Vaccines/administration & dosage , Communicable Diseases/epidemiology
2.
Health Care Manag Sci ; 26(4): 599-603, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37804456

ABSTRACT

The US is experiencing a severe opioid epidemic with more than 80,000 opioid overdose deaths occurring in 2022. Beyond the tragic loss of life, opioid use disorder (OUD) has emerged as a major contributor to morbidity, lost productivity, mounting criminal justice system costs, and significant social disruption. This Current Opinion article highlights opportunities for analytics in supporting policy making for effective response to this crisis. We describe modeling opportunities in the following areas: understanding the opioid epidemic (e.g., the prevalence and incidence of OUD in different geographic regions, demographics of individuals with OUD, rates of overdose and overdose death, patterns of drug use and associated disease outbreaks, and access to and use of treatment for OUD); assessing policies for preventing and treating OUD, including mitigation of social conditions that increase the risk of OUD; and evaluating potential regulatory and criminal justice system reforms.


Subjects
Drug Overdose , Opioid-Related Disorders , Humans , Opioid Epidemic , Analgesics, Opioid/adverse effects , Opioid-Related Disorders/epidemiology , Opioid-Related Disorders/drug therapy , Drug Overdose/epidemiology , Drug Overdose/prevention & control , Drug Overdose/drug therapy , Decision Making
3.
Emerg Infect Dis ; 28(7): 1375-1383, 2022 07.
Article in English | MEDLINE | ID: mdl-35654410

ABSTRACT

Despite extensive technological advances in recent years, objective and continuous assessment of physiologic measures after vaccination is rarely performed. We conducted a prospective observational study to evaluate short-term self-reported and physiologic reactions to the booster BNT162b2 mRNA (Pfizer-BioNTech, https://www.pfizer.com) vaccine dose. A total of 1,609 participants were equipped with smartwatches and completed daily questionnaires through a dedicated mobile application. The extent of systemic reactions reported after the booster dose was similar to that of the second dose and considerably greater than that of the first dose. Analyses of objective heart rate and heart rate variability measures recorded by smartwatches further supported this finding. Subjective and objective reactions after the booster dose were more apparent in younger participants and in participants who did not have underlying medical conditions. Our findings further support the safety of the booster dose from subjective and objective perspectives and underscore the need for integrating wearables in clinical trials.


Subjects
COVID-19 , BNT162 Vaccine , COVID-19/prevention & control , Humans , RNA, Messenger , Self Report , Vaccination
4.
Stat Med ; 41(17): 3336-3348, 2022 07 30.
Article in English | MEDLINE | ID: mdl-35527474

ABSTRACT

Outbreaks of an endemic infectious disease can occur when the disease is introduced into a highly susceptible subpopulation or when the disease enters a network of connected individuals. For example, significant HIV outbreaks among people who inject drugs have occurred in at least half a dozen US states in recent years. This motivates the current study: how can limited testing resources be allocated across geographic regions to rapidly detect outbreaks of an endemic infectious disease? We develop an adaptive sampling algorithm that uses profile likelihood to estimate the distribution of the number of positive tests that would occur for each location in a future time period if that location were sampled. Sampling is performed in the location with the highest estimated probability of triggering an outbreak alarm in the next time period. The alarm function is determined by a semiparametric likelihood ratio test. We compare the profile likelihood sampling (PLS) method numerically to uniform random sampling (URS) and Thompson sampling (TS). TS was worse than URS when the outbreak occurred in a location with lower initial prevalence than other locations. PLS had lower time to outbreak detection than TS in some but not all scenarios, but was always better than URS even when the outbreak occurred in a location with a lower initial prevalence than other locations. PLS provides an effective and reliable method for rapidly detecting endemic disease outbreaks and is robust to uncertainty about where the outbreak arises and its initial prevalence there.
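For context, the Thompson sampling baseline in the comparison can be sketched with a Beta posterior over each location's test-positivity rate. This is a generic TS sketch under assumed conjugate priors, not the authors' PLS method; the counts and seed are invented.

```python
import random

def thompson_pick(alphas, betas, rng):
    # Draw one positivity rate per location from its Beta posterior
    # and sample the location with the largest draw.
    draws = [rng.betavariate(a, b) for a, b in zip(alphas, betas)]
    return max(range(len(draws)), key=draws.__getitem__)

def update_posterior(alphas, betas, loc, positives, n_tested):
    # Conjugate Beta-binomial update after testing at one location.
    alphas[loc] += positives
    betas[loc] += n_tested - positives

rng = random.Random(7)
alphas, betas = [1.0, 1.0], [1.0, 1.0]
update_posterior(alphas, betas, 0, 49, 50)  # 49/50 positive at location 0
update_posterior(alphas, betas, 1, 1, 50)   # 1/50 positive at location 1
choice = thompson_pick(alphas, betas, rng)
```

With such lopsided posteriors the next sample almost surely goes to location 0 — which also illustrates TS's weakness noted above: a location whose outbreak starts from low observed prevalence is rarely chosen.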


Subjects
Disease Outbreaks , Humans , Likelihood Functions , Prevalence
5.
AIDS Care ; 33(4): 441-447, 2021 04.
Article in English | MEDLINE | ID: mdl-31986900

ABSTRACT

High prevalence of depression among people living with HIV (PLHIV) impedes antiretroviral therapy (ART) adherence and viral suppression. We estimate the effectiveness and cost-effectiveness of strategies to treat depression among PLHIV in Sub-Saharan Africa (SSA). We developed a microsimulation model of HIV disease and care in Uganda which captured individuals' depression status and the relationship between depression and HIV behaviors. We consider a strategy of screening for depression and providing antidepressant therapy with fluoxetine at ART initiation or re-initiation (if a patient has dropped out). We estimate that over 10 years this strategy would reduce prevalence of depression among PLHIV by 16.0% [95% uncertainty bounds 15.8%, 16.1%] from a baseline prevalence of 28%, increase adherence to ART by 1.0% [1.0%, 1.0%], and decrease rates of loss to followup by 3.7% [3.4%, 4.1%]. This would decrease first-line ART failure rates by 2.5% [2.3%, 2.8%] and increase viral suppression rates by 1.0% [1.0%, 1.0%]. This strategy costs $15/QALY compared to the status quo, and was highly cost-effective over a broad range of sensitivity analyses. We conclude that screening for and treating depression among PLHIV in SSA with fluoxetine would be effective in improving HIV treatment outcomes and would be highly cost-effective.
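The cost-effectiveness figure quoted above is a standard incremental cost-effectiveness ratio. A minimal sketch with invented inputs, scaled only so the ratio lands at the $15/QALY magnitude reported (these are not the study's Uganda estimates):

```python
def icer(cost_new, qalys_new, cost_base, qalys_base):
    # Incremental cost-effectiveness ratio: extra cost per extra QALY
    # of the new strategy relative to the comparator (status quo).
    return (cost_new - cost_base) / (qalys_new - qalys_base)

# Hypothetical: $1,500 of extra spending buys 100 extra QALYs.
example = icer(11_500, 1_100, 10_000, 1_000)
```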


Subjects
Anti-HIV Agents/therapeutic use , Antidepressive Agents, Second-Generation/therapeutic use , Depression/drug therapy , Fluoxetine/therapeutic use , HIV Infections/complications , Selective Serotonin Reuptake Inhibitors/therapeutic use , Adult , Antidepressive Agents, Second-Generation/economics , Cost-Benefit Analysis , Depression/economics , Depression/epidemiology , Female , Fluoxetine/economics , HIV Infections/drug therapy , HIV Infections/psychology , Humans , Male , Mental Health , Middle Aged , Outcome Assessment, Health Care , Selective Serotonin Reuptake Inhibitors/economics , Uganda/epidemiology
6.
Health Care Manag Sci ; 24(3): 551-568, 2021 Sep.
Article in English | MEDLINE | ID: mdl-33666808

ABSTRACT

A safe supply of blood for transfusion is a critical component of the healthcare system in all countries. Most health systems manage the risk of transfusion-transmissible infections (TTIs) through a portfolio of blood safety interventions. These portfolios must be updated periodically to reflect shifting epidemiological conditions, emerging infectious diseases, and new technologies. However, the number of available blood safety portfolios grows exponentially with the number of available interventions, making it impossible for policymakers to evaluate all feasible portfolios without the assistance of a computer model. We develop a novel optimization model for evaluating blood safety portfolios that enables systematic comparison of all feasible portfolios of deferral, testing, and modification interventions to identify the portfolio that is preferred from a cost-utility perspective. We present structural properties that reduce the state space and required computation time in certain cases, and we develop a linear approximation of the model. We apply the model to retrospectively evaluate U.S. blood safety policies for Zika and West Nile virus for the years 2017, 2018, and 2019, defining donor groups based on season and geography. We leverage structural properties to efficiently find an optimal solution. We find that the optimal portfolio varies geographically, seasonally, and over time. Additionally, we show that for this problem the approximated model yields the same optimal solution as the exact model. Our method enables systematic identification of the optimal blood safety portfolio in any setting and any time period, thereby supporting decision makers in efforts to ensure the safety of the blood supply.
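The combinatorial growth mentioned above is easy to make concrete: with n candidate interventions there are 2^n portfolios. A small enumeration sketch (the intervention names are invented placeholders, not the paper's actual intervention set):

```python
from itertools import combinations

def all_portfolios(interventions):
    # Every subset of interventions is a candidate portfolio,
    # including the empty "do nothing" portfolio: 2**n in total.
    return [frozenset(c)
            for r in range(len(interventions) + 1)
            for c in combinations(interventions, r)]

portfolios = all_portfolios(["deferral", "testing", "modification"])
```

Three intervention classes already give 8 portfolios; a menu of 20 interventions gives over a million, which is why the structural properties and the linear approximation developed in the paper matter for tractability.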


Subjects
Zika Virus Infection , Zika Virus , Blood Safety , Computer Simulation , Humans , Retrospective Studies
7.
PLoS Med ; 17(10): e1003239, 2020 10.
Article in English | MEDLINE | ID: mdl-33048929

ABSTRACT

BACKGROUND: Cycles of incarceration, drug abuse, and poverty undermine ongoing public health efforts to reduce overdose deaths and the spread of infectious disease in vulnerable populations. Jail diversion programs aim to divert low-level drug offenders toward community care resources, avoiding criminal justice costs and disruptions in treatment for HIV, hepatitis C virus (HCV), and drug abuse. We sought to assess the health benefits and cost-effectiveness of a jail diversion program for low-level drug offenders. METHODS AND FINDINGS: We developed a microsimulation model, calibrated to King County, Washington, that captured the spread of HIV and HCV infections and incarceration and treatment systems as well as preexisting interventions such as needle and syringe programs and opiate agonist therapy. We considered an adult population of people who inject drugs (PWID), people who use drugs but do not inject (PWUD), men who have sex with men, and lower-risk heterosexuals. We projected discounted lifetime costs and quality-adjusted life years (QALYs) over a 10-year time horizon with and without a jail diversion program and calculated resulting incremental cost-effectiveness ratios (ICERs) from the health system and societal perspectives. We also tracked HIV and HCV infections, overdose deaths, and jail population size. Over 10 years, the program was estimated to reduce HIV and HCV incidence by 3.4% (95% CI 2.7%-4.0%) and 3.3% (95% CI 3.1%-3.4%), respectively, overdose deaths among PWID by 10.0% (95% CI 9.8%-10.8%), and jail population size by 6.3% (95% CI 5.9%-6.7%). When considering healthcare costs only, the program cost $25,500/QALY gained (95% CI $12,600-$48,600). Including savings from reduced incarceration (societal perspective) improved the ICER to $6,200/QALY gained (95% CI cost-saving to $24,300).
Sensitivity analysis indicated that cost-effectiveness depends on diversion program participants accessing community programs such as needle and syringe programs, treatment for substance use disorder, and HIV and HCV treatment, as well as diversion program cost. A limitation of the analysis is data availability, as fewer data are available for diversion programs than for more established interventions aimed at people with substance use disorder. Additionally, like any model of a complex system, our model relies on simplifying assumptions: For example, we simplified pathways in the healthcare and criminal justice systems, modeled an average efficacy for substance use disorder treatment, and did not include costs associated with homelessness, unemployment, and breakdown in family structure. CONCLUSIONS: We found that diversion programs for low-level drug offenders are likely to be cost-effective, generating savings in the criminal justice system while only moderately increasing healthcare costs. Such programs can reduce incarceration and its associated costs, and also avert overdose deaths and improve quality of life for PWID, PWUD, and the broader population (through reduced HIV and HCV transmission).


Subjects
Criminals/education , Drug Users/education , Substance-Related Disorders/rehabilitation , Adult , Cost-Benefit Analysis , Drug Users/psychology , Government Programs , HIV Infections/epidemiology , Health Care Costs/trends , Hepatitis C/epidemiology , Humans , Male , Middle Aged , Models, Theoretical , Outcome Assessment, Health Care/economics , Quality of Life , Substance Abuse, Intravenous/epidemiology , Substance-Related Disorders/epidemiology , Washington/epidemiology
8.
Health Care Manag Sci ; 23(4): 507-519, 2020 Dec.
Article in English | MEDLINE | ID: mdl-33017035

ABSTRACT

Low adherence to prescribed medications causes substantial health and economic burden. We analyzed primary data from electronic medical records of 250,000 random patients from Israel's Maccabi Healthcare services from 2007 to 2017 to predict whether a patient will purchase a prescribed antibiotic. We developed a decision model to evaluate whether an intervention to improve purchasing adherence is warranted for the patient, considering the cost of the intervention and the cost of non-adherence. The best performing prediction model achieved an average area under the receiver operating characteristic curve (AUC) of 0.684, with 82% accuracy in detecting individuals who had less than 50% chance of purchasing a prescribed drug. Using the decision model, an adherence intervention targeted to patients whose predicted purchasing probability is below a specified threshold can increase the number of prescriptions filled while generating significant savings compared to no intervention - on the order of 6.4% savings and 4.0% more prescriptions filled for our dataset. We conclude that analysis of large-scale patient data from electronic medical records can help predict the probability that a patient will purchase a prescribed antibiotic and can provide real-time predictions to physicians, who can then counsel the patient about medication importance. More broadly, in-depth analysis of patient-level data can help shape the next generation of personalized interventions.
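The threshold decision rule described above can be sketched as a simple expected-value comparison. The parameter names and the additive-uplift assumption are illustrative, not the paper's calibrated decision model:

```python
def should_intervene(p_purchase, intervention_cost, nonadherence_cost, uplift):
    # Intervene when the expected savings from raising the patient's
    # purchase probability (by at most `uplift`) exceed the intervention cost.
    recovered = min(uplift, 1.0 - p_purchase)
    return recovered * nonadherence_cost > intervention_cost
```

A patient predicted to have a 30% purchase probability clears the bar easily; one already at 98% does not, so the intervention budget concentrates on the low-probability patients the prediction model flags.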


Subjects
Anti-Bacterial Agents , Drug Prescriptions/statistics & numerical data , Medication Adherence/statistics & numerical data , Adult , Age Factors , Drug Prescriptions/economics , Electronic Health Records , Female , Humans , Israel , Male , Physician's Role , Socioeconomic Factors
9.
Health Care Manag Sci ; 22(4): 756-767, 2019 Dec.
Article in English | MEDLINE | ID: mdl-30387040

ABSTRACT

The operating room is a major cost and revenue center for most hospitals. Thus, more effective operating room management and scheduling can provide significant benefits. In many hospitals, the post-anesthesia care unit (PACU), where patients recover after their surgical procedures, is a bottleneck. If the PACU reaches capacity, patients must wait in the operating room until the PACU has available space, leading to delays and possible cancellations for subsequent operating room procedures. We develop a generalizable optimization and machine learning approach to sequence operating room procedures to minimize delays caused by PACU unavailability. Specifically, we use machine learning to estimate the required PACU time for each type of surgical procedure, we develop and solve two integer programming models to schedule procedures in the operating rooms to minimize maximum PACU occupancy, and we use discrete event simulation to compare our optimized schedule to the existing schedule. Using data from Lucile Packard Children's Hospital Stanford, we show that the scheduling system can significantly reduce operating room delays caused by PACU congestion while still keeping operating room utilization high: simulation of the second half of 2016 shows that our model could have reduced total PACU holds by 76% without decreasing operating room utilization. We are currently working on implementing the scheduling system at the hospital.
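The sequencing objective — minimize peak PACU occupancy — can be illustrated with a toy brute force. The paper solves integer programs over multiple operating rooms; this single-OR, exhaustive-search version is our simplification, with invented durations:

```python
from itertools import permutations

def peak_pacu_occupancy(seq, or_minutes, pacu_minutes):
    """Peak number of patients simultaneously in the PACU when the
    procedures in `seq` run back-to-back in a single operating room."""
    stays, t = [], 0
    for p in seq:
        t += or_minutes[p]                      # procedure ends here
        stays.append((t, t + pacu_minutes[p]))  # PACU entry, exit
    # Occupancy only increases at entry times, so checking those is exact.
    return max(sum(1 for s, e in stays if s <= x < e) for x, _ in stays)

def best_sequence(or_minutes, pacu_minutes):
    """Brute-force the order that minimizes peak PACU occupancy."""
    return min(permutations(range(len(or_minutes))),
               key=lambda s: peak_pacu_occupancy(s, or_minutes, pacu_minutes))

# Patient 0 has a long recovery; scheduling it last avoids PACU pile-up.
best = best_sequence([60, 60, 60], [300, 60, 60])
```

Running the long-recovery case first forces two patients into the PACU at once; deferring it to the end keeps peak occupancy at one, which is the intuition behind the optimized schedules.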


Subjects
Efficiency, Organizational , Operating Rooms/organization & administration , Personnel Staffing and Scheduling/organization & administration , Recovery Room/organization & administration , California , Computer Simulation , Hospitals, Pediatric , Humans , Machine Learning , Operating Rooms/economics , Program Evaluation , Recovery Room/economics
10.
PLoS Med ; 15(7): e1002586, 2018 07.
Article in English | MEDLINE | ID: mdl-29969442

ABSTRACT

BACKGROUND: Rising atmospheric carbon dioxide concentrations are anticipated to decrease the zinc and iron concentrations of crops. The associated disease burden and optimal mitigation strategies remain unknown. We sought to understand where and to what extent increasing carbon dioxide concentrations may increase the global burden of nutritional deficiencies through changes in crop nutrient concentrations, and the effects of potential mitigation strategies. METHODS AND FINDINGS: For each of 137 countries, we incorporated estimates of climate change, crop nutrient concentrations, dietary patterns, and disease risk into a microsimulation model of zinc and iron deficiency. These estimates were obtained from the Intergovernmental Panel on Climate Change, US Department of Agriculture, Statistics Division of the Food and Agriculture Organization of the United Nations, and Global Burden of Disease Project, respectively. In the absence of increasing carbon dioxide concentrations, we estimated that zinc and iron deficiencies would induce 1,072.9 million disability-adjusted life years (DALYs) globally over the period 2015 to 2050 (95% credible interval [CrI]: 971.1-1,167.7). In the presence of increasing carbon dioxide concentrations, we estimated that decreasing zinc and iron concentrations of crops would induce an additional 125.8 million DALYs globally over the same period (95% CrI: 113.6-138.9). This carbon-dioxide-induced disease burden is projected to disproportionately affect nations in the World Health Organization's South-East Asia and African Regions (44.0 and 28.5 million DALYs, respectively), which already have high existing disease burdens from zinc and iron deficiencies (364.3 and 299.5 million DALYs, respectively), increasing global nutritional inequalities. 
A climate mitigation strategy such as the Paris Agreement (an international agreement to keep global temperatures within 2°C of pre-industrial levels) would be expected to avert 48.2% of this burden (95% CrI: 47.8%-48.5%), while traditional public health interventions including nutrient supplementation and disease control programs would be expected to avert 26.6% of the burden (95% CrI: 23.8%-29.6%). Of the traditional public health interventions, zinc supplementation would be expected to avert 5.5%, iron supplementation 15.7%, malaria mitigation 3.2%, pneumonia mitigation 1.6%, and diarrhea mitigation 0.5%. The primary limitations of the analysis include uncertainty regarding how food consumption patterns may change with climate, how disease mortality rates will change over time, and how crop zinc and iron concentrations will decline from those at present to those in 2050. CONCLUSIONS: Effects of increased carbon dioxide on crop nutrient concentrations are anticipated to exacerbate inequalities in zinc and iron deficiencies by 2050. Proposed Paris Agreement strategies are expected to be more effective than traditional public health measures to avert the increased inequality.


Subjects
Carbon Dioxide/adverse effects , Computer Simulation , Crops, Agricultural/metabolism , Deficiency Diseases/epidemiology , Food Supply , Global Health , Iron Deficiencies , Zinc/deficiency , Atmosphere , Carbon Dioxide/metabolism , Climate Change , Comorbidity , Crops, Agricultural/growth & development , Deficiency Diseases/diagnosis , Deficiency Diseases/metabolism , Deficiency Diseases/prevention & control , Disability Evaluation , Environmental Monitoring , Feeding Behavior , Humans , Nutritional Status , Nutritive Value , Risk Assessment , Risk Factors , Time Factors
12.
Am J Public Health ; 108(10): 1394-1400, 2018 10.
Article in English | MEDLINE | ID: mdl-30138057

ABSTRACT

OBJECTIVES: To estimate health outcomes of policies to mitigate the opioid epidemic. METHODS: We used dynamic compartmental modeling of US adults, in various pain, opioid use, and opioid addiction health states, to project addiction-related deaths, life years, and quality-adjusted life years from 2016 to 2025 for 11 policy responses to the opioid epidemic. RESULTS: Over 5 years, increasing naloxone availability, promoting needle exchange, expanding medication-assisted addiction treatment, and increasing psychosocial treatment increased life years and quality-adjusted life years and reduced deaths. Other policies reduced opioid prescription supply and related deaths but led some addicted prescription users to switch to heroin use, which increased heroin-related deaths. Over a longer horizon, some such policies may avert enough new addiction to outweigh the harms. No single policy is likely to substantially reduce deaths over 5 to 10 years. CONCLUSIONS: Policies focused on services for addicted people improve population health without harming any groups. Policies that reduce the prescription opioid supply may increase heroin use and reduce quality of life in the short term, but in the long term could generate positive health benefits. A portfolio of interventions will be needed for eventual mitigation.


Subjects
Harm Reduction , Opioid-Related Disorders/epidemiology , Opioid-Related Disorders/prevention & control , Public Policy , Drug Overdose/mortality , Heroin Dependence/epidemiology , Heroin Dependence/mortality , Heroin Dependence/prevention & control , Humans , Naloxone/supply & distribution , Narcotic Antagonists/supply & distribution , Needle-Exchange Programs , Opiate Substitution Treatment , Opioid-Related Disorders/mortality , Quality-Adjusted Life Years , United States/epidemiology
13.
Health Care Manag Sci ; 21(4): 632-646, 2018 Dec.
Article in English | MEDLINE | ID: mdl-28861650

ABSTRACT

Effective treatment for tuberculosis (TB) patients on first-line treatment involves triaging those with drug-resistant (DR) TB to appropriate treatment alternatives. Patients likely to have DR TB are identified using results from repeated inexpensive sputum-smear (SS) tests and expensive but definitive drug sensitivity tests (DST). Early DST may lead to high costs and unnecessary testing; late DST may lead to poor health outcomes and disease transmission. We use a partially observable Markov decision process (POMDP) framework to determine optimal DST timing. We develop policy-relevant structural properties of the POMDP model. We apply our model to TB in India to identify the patterns of SS test results that should prompt DST if transmission costs remain at status-quo levels. Unlike previous analyses of personalized treatment policies, we take a societal perspective and consider the effects of disease transmission. The inclusion of such effects can significantly alter the optimal policy. We find that an optimal DST policy could save India approximately $1.9 billion annually.
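At the core of the POMDP is a belief about drug resistance that each sputum-smear result updates by Bayes' rule, with the expensive DST triggered once the belief is high enough. A stripped-down sketch — the smear-positivity probabilities and threshold are invented, and the real model also weighs transmission costs:

```python
def update_dr_belief(p_dr, pos_rate_dr, pos_rate_ds, smear_positive):
    """Bayes update of P(drug-resistant TB) after one sputum-smear result.

    pos_rate_dr / pos_rate_ds: probability the smear is positive under
    failing (drug-resistant) versus effective (drug-sensitive) treatment.
    """
    l_dr = pos_rate_dr if smear_positive else 1.0 - pos_rate_dr
    l_ds = pos_rate_ds if smear_positive else 1.0 - pos_rate_ds
    return p_dr * l_dr / (p_dr * l_dr + (1.0 - p_dr) * l_ds)

def order_dst(p_dr, threshold=0.3):
    """Trigger the definitive drug sensitivity test above a belief threshold."""
    return p_dr >= threshold

# One positive smear raises a 10% prior belief to 4/13 (about 31%).
belief = update_dr_belief(0.1, 0.8, 0.2, smear_positive=True)
```

Repeated positive smears push the belief up until `order_dst` fires; the optimal policy in the paper characterizes which such result patterns justify DST once transmission costs are included.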


Subjects
Antitubercular Agents/therapeutic use , Tuberculosis, Multidrug-Resistant/diagnosis , Tuberculosis, Multidrug-Resistant/drug therapy , Age Factors , Algorithms , Antitubercular Agents/administration & dosage , Communicable Disease Control/organization & administration , Cost of Illness , Health Policy , Humans , India , Markov Chains , Sex Factors , Sputum/microbiology , Time Factors , Tuberculosis/diagnosis , Tuberculosis/drug therapy
14.
PLoS Med ; 14(5): e1002312, 2017 05.
Article in English | MEDLINE | ID: mdl-28542184

ABSTRACT

BACKGROUND: The risks of HIV transmission associated with the opioid epidemic make cost-effective programs for people who inject drugs (PWID) a public health priority. Some of these programs have benefits beyond prevention of HIV-a critical consideration given that injection drug use is increasing across most United States demographic groups. To identify high-value HIV prevention program portfolios for US PWID, we consider combinations of four interventions with demonstrated efficacy: opioid agonist therapy (OAT), needle and syringe programs (NSPs), HIV testing and treatment (Test & Treat), and oral HIV pre-exposure prophylaxis (PrEP). METHODS AND FINDINGS: We adapted an empirically calibrated dynamic compartmental model and used it to assess the discounted costs (in 2015 US dollars), health outcomes (HIV infections averted, change in HIV prevalence, and discounted quality-adjusted life years [QALYs]), and incremental cost-effectiveness ratios (ICERs) of the four prevention programs, considered singly and in combination over a 20-y time horizon. We obtained epidemiologic, economic, and health utility parameter estimates from the literature, previously published models, and expert opinion. We estimate that expansions of OAT, NSPs, and Test & Treat implemented singly up to 50% coverage levels can be cost-effective relative to the next highest coverage level (low, medium, and high at 40%, 45%, and 50%, respectively) and that OAT, which we assume to have immediate and direct health benefits for the individual, has the potential to be the highest value investment, even under scenarios where it prevents fewer infections than other programs. 
Although a model-based analysis can provide only estimates of health outcomes, we project that, over 20 y, 50% coverage with OAT could avert up to 22,000 (95% CI: 5,200, 46,000) infections and cost US$18,000 (95% CI: US$14,000, US$24,000) per QALY gained, 50% NSP coverage could avert up to 35,000 (95% CI: 8,900, 43,000) infections and cost US$25,000 (95% CI: US$7,000, US$76,000) per QALY gained, 50% Test & Treat coverage could avert up to 6,700 (95% CI: 1,200, 16,000) infections and cost US$27,000 (95% CI: US$15,000, US$48,000) per QALY gained, and 50% PrEP coverage could avert up to 37,000 (22,000, 58,000) infections and cost US$300,000 (95% CI: US$162,000, US$667,000) per QALY gained. When coverage expansions are allowed to include combined investment with other programs and are compared to the next best intervention, the model projects that scaling OAT coverage up to 50%, then scaling NSP coverage to 50%, then scaling Test & Treat coverage to 50% can be cost-effective, with each coverage expansion having the potential to cost less than US$50,000 per QALY gained relative to the next best portfolio. In probabilistic sensitivity analyses, 59% of portfolios prioritized the addition of OAT and 41% prioritized the addition of NSPs, while PrEP was not likely to be a priority nor a cost-effective addition. Our findings are intended to be illustrative, as data on achievable coverage are limited and, in practice, the expansion scenarios considered may exceed feasible levels. We assumed independence of interventions and constant returns to scale. Extensive sensitivity analyses allowed us to assess parameter sensitivity, but the use of a dynamic compartmental model limited the exploration of structural sensitivities. CONCLUSIONS: We estimate that OAT, NSPs, and Test & Treat, implemented singly or in combination, have the potential to effectively and cost-effectively prevent HIV in US PWID. 
PrEP is not likely to be cost-effective in this population, based on the scenarios we evaluated. While local budgets or policy may constrain feasible coverage levels for the various interventions, our findings suggest that investments in combined prevention programs can substantially reduce HIV transmission and improve health outcomes among PWID.


Subjects
Cost-Benefit Analysis , HIV Infections/prevention & control , Models, Theoretical , Primary Prevention/economics , Substance Abuse, Intravenous/prevention & control , Drug Users , HIV Infections/epidemiology , Humans , Prevalence , Substance Abuse, Intravenous/epidemiology , United States/epidemiology
15.
J Theor Biol ; 428: 1-17, 2017 09 07.
Article in English | MEDLINE | ID: mdl-28606751

ABSTRACT

Economic evaluations of infectious disease control interventions frequently use dynamic compartmental epidemic models. Such models capture heterogeneity in risk of infection by stratifying the population into discrete risk groups, thus approximating what is typically continuous variation in risk. An important open question is whether and how different risk stratification choices influence model predictions of intervention effects. We develop equivalent Susceptible-Infected-Susceptible (SIS) dynamic transmission models: an unstratified model, a model stratified into a high-risk and low-risk group, and a model with an arbitrary number of risk groups. Absent intervention, the models produce the same overall prevalence of infected individuals in steady state. We consider an intervention that either reduces the contact rate or increases the disease clearance rate. We develop analytical and numerical results characterizing the models and the effects of the intervention. We find that there exist multiple feasible choices of risk stratification, contact distribution, and within- and between-group contact rates for models that stratify risk. We show analytically and empirically that these choices can generate different estimates of intervention effectiveness, and that these differences can be significant enough to alter conclusions from cost-effectiveness analyses and change policy recommendations. We conclude that the choice of how to discretize risk in compartmental epidemic models can influence predicted effectiveness of interventions. Therefore, analysts should examine multiple alternatives and report the range of results.
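For intuition, the unstratified SIS model in this comparison has the classic steady-state infected fraction 1 − γ/β. A quick Euler-integration sketch confirming it (step size, horizon, and initial condition are arbitrary choices of ours, not the paper's):

```python
def sis_steady_prevalence(beta, gamma, dt=0.01, steps=50_000, i0=0.01):
    """Integrate di/dt = beta*i*(1-i) - gamma*i and return the
    long-run infected fraction (analytically 1 - gamma/beta when beta > gamma)."""
    i = i0
    for _ in range(steps):
        i += dt * (beta * i * (1.0 - i) - gamma * i)
    return i

prev = sis_steady_prevalence(2.0, 1.0)   # analytic steady state: 1 - 1/2 = 0.5
```

The stratified variants in the paper are calibrated to match this overall steady-state prevalence; the point of the analysis is that the choice of how to split contacts across risk groups can nonetheless change the predicted effect of an intervention that lowers β or raises γ.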


Subjects
Epidemics , Models, Biological , Risk Assessment , Communicable Disease Control , Gonorrhea/epidemiology , Gonorrhea/transmission , Homosexuality, Male/statistics & numerical data , Humans , Male , Numerical Analysis, Computer-Assisted , Prevalence , Risk Factors
16.
Malar J ; 16(1): 403, 2017 10 06.
Article in English | MEDLINE | ID: mdl-28985732

ABSTRACT

BACKGROUND: Malaria is a leading cause of morbidity and mortality among HIV-infected pregnant women in sub-Saharan Africa: at least 1 million pregnancies among HIV-infected women are complicated by co-infection with malaria annually, leading to increased risk of premature delivery, severe anaemia, delivery of low birth weight infants, and maternal death. Current guidelines recommend either daily cotrimoxazole (CTX) or intermittent preventive treatment with sulfadoxine-pyrimethamine (IPTp-SP) for HIV-infected pregnant women to prevent malaria and its complications. The cost-effectiveness of CTX compared to IPTp-SP among HIV-infected pregnant women was assessed. METHODS: A microsimulation model of malaria and HIV among pregnant women in five malaria-endemic countries in sub-Saharan Africa was constructed. Four strategies were compared: (1) 2-dose IPTp-SP at current IPTp-SP coverage of the country ("2-IPT Low"); (2) 3-dose IPTp-SP at current coverage ("3-IPT Low"); (3) 3-dose IPTp-SP at the same coverage as antiretroviral therapy (ART) in the country ("3-IPT High"); and (4) daily CTX at ART coverage. Outcomes measured include maternal malaria, anaemia, low birth weight (LBW), and disability-adjusted life years (DALYs). Sensitivity analyses assessed the effect of adherence to CTX. RESULTS: Compared with the 2-IPT Low Strategy, women receiving CTX had 22.5% fewer LBW infants (95% CI 22.3-22.7), 13.5% fewer anaemia cases (95% CI 13.4-13.5), and 13.6% fewer maternal malaria cases (95% CI 13.6-13.7). In all simulated countries, CTX was the preferred strategy, with incremental cost-effectiveness ratios ranging from cost-saving to $3.9 per DALY averted from a societal perspective. CTX was less effective than the 3-IPT High Strategy when more than 18% of women stopped taking CTX during the pregnancy. 
CONCLUSION: In malarious regions of sub-Saharan Africa, daily CTX for HIV-infected pregnant women regardless of CD4 cell count is cost-effective compared with 3-dose IPTp-SP as long as more than 82% of women adhere to daily dosing.


Subjects
Antimalarials/economics , Coinfection/epidemiology , Cost-Benefit Analysis , HIV Infections/epidemiology , Malaria/economics , Pyrimethamine/economics , Sulfadoxine/economics , Trimethoprim, Sulfamethoxazole Drug Combination/economics , Africa South of the Sahara/epidemiology , Antimalarials/therapeutic use , Coinfection/parasitology , Coinfection/virology , Drug Combinations , Female , HIV Infections/virology , Humans , Malaria/prevention & control , Models, Theoretical , Pregnancy , Pyrimethamine/therapeutic use , Sulfadoxine/therapeutic use , Trimethoprim, Sulfamethoxazole Drug Combination/therapeutic use , Young Adult
17.
Health Care Manag Sci ; 20(1): 16-32, 2017 Mar.
Article in English | MEDLINE | ID: mdl-26188961

ABSTRACT

How long should a patient with a treatable chronic disease wait for more effective treatments before accepting the best available treatment? We develop a framework to guide optimal treatment decisions for a deteriorating chronic disease when treatment technologies are improving over time. We formulate an optimal stopping problem using a discrete-time, finite-horizon Markov decision process. The goal is to maximize a patient's quality-adjusted life expectancy. We derive structural properties of the model and analytically solve a three-period treatment decision problem. We illustrate the model with the example of treatment for chronic hepatitis C virus (HCV). Chronic HCV affects 3-4 million Americans and has been historically difficult to treat, but increasingly effective treatments have been commercialized in the past few years. We show that the optimal treatment decision is more likely to be to accept currently available treatment-despite expectations for future treatment improvement-for patients who have high-risk history, who are older, or who have more comorbidities. Insights from this study can guide HCV treatment decisions for individual patients. More broadly, our model can guide treatment decisions for curable chronic diseases by finding the optimal treatment policy for individual patients in a heterogeneous population.
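The treat-now-or-wait decision the abstract describes can be sketched as backward induction over a finite-horizon MDP. The numbers below are made up for illustration, not the paper's hepatitis C calibration: state 0 is a lower-risk patient, state 1 a higher-risk patient facing a per-period mortality risk while untreated, and treating is a terminal action whose QALY payoff improves over time as technology advances.

```python
def treat_or_wait(R, q, p_progress, p_death):
    """Backward induction over a discrete-time, finite-horizon MDP.

    R[t][s]    -- QALYs gained by treating in period t and state s
    q          -- QALYs accrued per period spent waiting
    p_progress -- chance a state-0 patient moves to state 1 while waiting
    p_death    -- chance a state-1 patient dies (value 0) while waiting
    Treatment is mandatory in the final period. Returns (V, policy).
    """
    T = len(R)
    V = [[0.0, 0.0] for _ in range(T)]
    policy = [["treat", "treat"] for _ in range(T)]
    V[T - 1] = list(R[T - 1])               # forced to treat at the horizon
    for t in range(T - 2, -1, -1):
        # expected value of waiting one more period, by current state
        cont = [q + (1 - p_progress) * V[t + 1][0] + p_progress * V[t + 1][1],
                q + (1 - p_death) * V[t + 1][1]]
        for s in (0, 1):
            if R[t][s] >= cont[s]:
                V[t][s] = R[t][s]           # stop: accept treatment now
            else:
                V[t][s], policy[t][s] = cont[s], "wait"
    return V, policy
```

With treatment values rising over time (e.g. R = [[10, 6], [12, 7], [14, 8]], q = 0.5, p_progress = 0.3, p_death = 0.25), the resulting policy has the higher-risk patient treat immediately while the lower-risk patient waits for better therapy, mirroring the paper's qualitative finding.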


Subjects
Hepatitis C, Chronic/therapy , Antiviral Agents/therapeutic use , Biomedical Technology , Decision Support Techniques , Female , Hepatitis C, Chronic/drug therapy , Humans , Male , Markov Chains , Middle Aged , Models, Theoretical , Quality Improvement , Quality-Adjusted Life Years , Risk Assessment , Treatment Outcome
18.
Ann Intern Med ; 165(1): 10-19, 2016 Jul 05.
Article in English | MEDLINE | ID: mdl-27110953

ABSTRACT

BACKGROUND: The total population health benefits and costs of HIV preexposure prophylaxis (PrEP) for people who inject drugs (PWID) in the United States are unclear. OBJECTIVE: To evaluate the cost-effectiveness and optimal delivery conditions of PrEP for PWID. DESIGN: Empirically calibrated dynamic compartmental model. DATA SOURCES: Published literature and expert opinion. TARGET POPULATION: Adult U.S. PWID. TIME HORIZON: 20 years and lifetime. INTERVENTION: PrEP alone, PrEP with frequent screening (PrEP+screen), and PrEP+screen with enhanced provision of antiretroviral therapy (ART) for individuals who become infected (PrEP+screen+ART). All scenarios are considered at 25% coverage. OUTCOME MEASURES: Infections averted, deaths averted, change in HIV prevalence, discounted costs (in 2015 U.S. dollars), discounted quality-adjusted life-years (QALYs), and incremental cost-effectiveness ratios. RESULTS OF BASE-CASE ANALYSIS: PrEP+screen+ART dominates other strategies, averting 26 700 infections and reducing HIV prevalence among PWID by 14% compared with the status quo. Achieving these benefits costs $253 000 per QALY gained. At current drug prices, total expenditures for PrEP+screen+ART could be as high as $44 billion over 20 years. RESULTS OF SENSITIVITY ANALYSIS: Cost-effectiveness of the intervention is linear in the annual cost of PrEP and is dependent on PrEP drug adherence, individual transmission risks, and community HIV prevalence. LIMITATION: Data on risk stratification and achievable PrEP efficacy levels for U.S. PWID are limited. CONCLUSION: PrEP with frequent screening and prompt treatment for those who become infected can reduce HIV burden among PWID and provide health benefits for the entire U.S. population, but, at current drug prices, it remains an expensive intervention both in absolute terms and in cost per QALY gained. PRIMARY FUNDING SOURCE: National Institute on Drug Abuse.
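The outcome measures above combine discounted streams of costs and QALYs into incremental cost-effectiveness ratios. A generic sketch of that arithmetic (not the paper's model or numbers; the 3% rate is simply a conventional default in U.S. cost-effectiveness analysis):

```python
def present_value(stream, rate=0.03):
    """Discount a per-period stream of costs or QALYs back to period 0
    at the given annual rate."""
    return sum(x / (1.0 + rate) ** t for t, x in enumerate(stream))

def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    """Incremental cost-effectiveness ratio of a strategy against a
    reference strategy. Returns the string 'dominant' when the new
    strategy is both cheaper and more effective, since the ratio is
    not meaningful in that case."""
    d_cost, d_qaly = cost_new - cost_ref, qaly_new - qaly_ref
    if d_cost <= 0 and d_qaly >= 0:
        return "dominant"
    return d_cost / d_qaly
```

For example, a strategy costing an extra $500,000 while gaining 10 discounted QALYs over the comparator has an ICER of $50,000 per QALY gained; a cheaper, more effective strategy "dominates" its comparator, the relationship the abstract reports for PrEP+screen+ART against the other PrEP strategies.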

19.
Health Care Manag Sci ; 19(4): 305-312, 2016 Dec.
Article in English | MEDLINE | ID: mdl-26003321

ABSTRACT

Operations research (OR)-based analyses have the potential to improve decision making for many important, real-world health care problems. However, junior scholars often avoid working on practical applications in health because promotion and tenure processes tend to value theoretical studies more highly than applied studies. This paper discusses the author's experiences in using OR to inform and influence decisions in health and provides a blueprint for junior researchers who wish to find success by taking a similar path. This involves selecting good problems to study, forming productive collaborations with domain experts, developing appropriate models, identifying the most salient results from an analysis, and effectively disseminating findings to decision makers. The paper then suggests how journals, funding agencies, and senior academics can encourage such work by taking a broader and more informed view of the potential role and contributions of OR to solving health care problems. Making room in academia for the application of OR in health follows in the tradition begun by the founders of operations research: to work on important real-world problems where operations research can contribute to better decision making.


Subjects
Decision Making , Health Policy , Health Services Administration , Operations Research , Research Personnel/organization & administration , Cooperative Behavior , Health Services Research/organization & administration , Humans , Information Dissemination , Universities
20.
J Theor Biol ; 371: 154-65, 2015 Apr 21.
Article in English | MEDLINE | ID: mdl-25698229

ABSTRACT

For many communicable diseases, knowledge of the underlying contact network through which the disease spreads is essential to determining appropriate control measures. When behavior change is the primary intervention for disease prevention, it is important to understand how to best modify network connectivity using the limited resources available to control disease spread. We describe and compare four algorithms for selecting a limited number of links to remove from a network: two "preventive" approaches (edge centrality, R0 minimization), where the decision of which links to remove is made prior to any disease outbreak and depends only on the network structure; and two "reactive" approaches (S-I edge centrality, optimal quarantining), where information about the initial disease states of the nodes is incorporated into the decision of which links to remove. We evaluate the performance of these algorithms in minimizing the total number of infections that occur over the course of an acute outbreak of disease. We consider different network structures, including both static and dynamic Erdős-Rényi random networks with varying levels of connectivity, a real-world network of residential hotels connected through injection drug use, and a network exhibiting community structure. We show that reactive approaches outperform preventive approaches in averting infections. Among reactive approaches, removing links in order of S-I edge centrality is favored when the link removal budget is small, while optimal quarantining performs best when the link removal budget is sufficiently large. The budget threshold above which optimal quarantining outperforms the S-I edge centrality algorithm is a function of both network structure (higher for unstructured Erdős-Rényi random networks compared to networks with community structure or the real-world network) and disease infectiousness (lower for highly infectious diseases).
We conduct a value-of-information analysis of knowing which nodes are initially infected by comparing the performance improvement achieved by reactive over preventive strategies. We find that such information is most valuable for moderate budget levels, with increasing value as disease spread becomes more likely (due to either increased connectedness of the network or increased infectiousness of the disease).
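The "reactive" idea of exploiting knowledge of the initially infected nodes can be illustrated with a toy discrete-time SIR simulation. The sketch below uses a simple S-I boundary rule (remove edges joining infected to susceptible nodes) as a crude stand-in for the paper's S-I edge centrality ranking; the graph, parameters, and removal rule are all illustrative assumptions.

```python
import random

def si_boundary_edges(edges, infected):
    """'Reactive' removal candidates: edges with exactly one infected endpoint."""
    return [e for e in edges if (e[0] in infected) != (e[1] in infected)]

def simulate_sir(edges, infected, p_transmit, rng):
    """Discrete-time SIR on an undirected edge list: each infectious node
    transmits across each surviving edge with probability p_transmit
    during its single infectious period, then recovers. Returns the
    total number of nodes ever infected."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    ever, active = set(infected), set(infected)
    while active:
        new = set()
        for u in active:
            for v in adj.get(u, ()):
                if v not in ever and rng.random() < p_transmit:
                    new.add(v)
        ever |= new
        active = new
    return len(ever)

# Toy network: a hub (node 0) attached to a ring of five nodes.
edges = [(0, k) for k in range(1, 6)] + [(1, 2), (2, 3), (3, 4), (4, 5), (5, 1)]
seeds = {0}
budget = 5
removed = set(si_boundary_edges(edges, seeds)[:budget])
remaining = [e for e in edges if e not in removed]
```

When the budget covers the entire S-I boundary, the seed nodes are cut off and the outbreak cannot spread at all; with a smaller budget some boundary edges survive and infections can leak through, which is the regime where the choice of ranking (the centrality measure) actually matters.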


Subjects
Algorithms , Epidemics/statistics & numerical data , Social Support , Basic Reproduction Number , Disease Outbreaks , Humans , Population Surveillance , Quarantine , Stochastic Processes