Results 1 - 20 of 266
1.
J Infect Dis; 230(2): 382-393, 2024 Aug 16.
Article in English | MEDLINE | ID: mdl-38581432

ABSTRACT

BACKGROUND: With coronavirus disease 2019 (COVID-19) vaccination no longer mandated by many businesses/organizations, it is now up to individuals to decide whether to get any new boosters/updated vaccines going forward. METHODS: We developed a Markov model representing the potential clinical/economic outcomes, from an individual perspective in the United States, of getting versus not getting an annual COVID-19 vaccine. RESULTS: For an 18-49-year-old, getting vaccinated at its current price ($60) can save the individual on average $30-$603 if the individual is uninsured and $4-$437 if the individual has private insurance, as long as the starting vaccine efficacy against severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection is ≥50% and the weekly risk of getting infected is ≥0.2%, corresponding to an individual interacting with 9 other people in a day under Winter 2023-2024 Omicron SARS-CoV-2 variant conditions with an average infection prevalence of 10%. For a 50-64-year-old, these cost savings increase to $111-$1278 and $119-$1706 for someone without and with insurance, respectively. The risk threshold increases to ≥0.4% (interacting with 19 people/day) when the individual has 13.4% preexisting protection against infection (eg, vaccinated 9 months earlier). CONCLUSIONS: There is both a clinical and an economic incentive for the individual to continue to get vaccinated against COVID-19 each year.
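The savings calculation above can be illustrated with a minimal individual-level Markov sketch of the decision to vaccinate. This is not the authors' model; the weekly infection risk, efficacy, and per-case cost values below are illustrative placeholders only.

```python
# Minimal sketch of an individual-level Markov cost model for annual COVID-19
# vaccination, in the spirit of the study above. All parameter values below
# (weekly infection risk, efficacy, costs) are illustrative placeholders,
# not the authors' inputs.

WEEKS = 52
WEEKLY_INFECTION_RISK = 0.002    # 0.2% chance of infection per week
VACCINE_EFFICACY = 0.50          # assumed 50% reduction in infection risk
VACCINE_PRICE = 60.0             # out-of-pocket price used in the abstract
COST_PER_INFECTION = 1500.0      # assumed cost of a case (medical + lost productivity)

def expected_annual_cost(vaccinated: bool) -> float:
    """Walk a simple susceptible -> infected chain week by week and accumulate
    the expected infection cost plus any vaccine cost."""
    risk = WEEKLY_INFECTION_RISK * ((1 - VACCINE_EFFICACY) if vaccinated else 1.0)
    p_still_susceptible = 1.0
    expected_infection_cost = 0.0
    for _ in range(WEEKS):
        expected_infection_cost += p_still_susceptible * risk * COST_PER_INFECTION
        p_still_susceptible *= 1 - risk
    return expected_infection_cost + (VACCINE_PRICE if vaccinated else 0.0)

savings = expected_annual_cost(vaccinated=False) - expected_annual_cost(vaccinated=True)
print(f"Expected annual savings from vaccination: ${savings:,.2f}")
```

Changing the assumed per-case cost or weekly risk shows how quickly the break-even point for the $60 vaccine moves, which is the threshold analysis the abstract reports.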


Subjects
COVID-19 Vaccines, COVID-19, Cost-Benefit Analysis, Markov Chains, SARS-CoV-2, Vaccination, Humans, COVID-19/prevention & control, COVID-19/economics, COVID-19/epidemiology, COVID-19 Vaccines/economics, COVID-19 Vaccines/administration & dosage, Middle Aged, Adult, Adolescent, SARS-CoV-2/immunology, Vaccination/economics, Young Adult, United States/epidemiology, Male, Female
2.
J Nutr; 154(8): 2566-2574, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38801862

ABSTRACT

BACKGROUND: National surveillance shows that food insecurity affects ∼1 in 10 Americans each year. Recently, experts have advocated for surveillance of nutrition insecurity alongside food insecurity. Nutrition security refers to the nutritional adequacy of accessible food and factors that impact one's ability to meet food preferences. OBJECTIVES: This study presents representative estimates of food insecurity and nutrition insecurity for Los Angeles County, CA, United States; compares predictors of these constructs; and examines whether they independently predict diet-related health outcomes. METHODS: In December 2022, a representative sample of Los Angeles County adults participating in the Understanding America Study (N = 1071) was surveyed about household food insecurity and nutrition insecurity over the past 12 months. Data were analyzed in 2023. RESULTS: Reported rates were similar for food insecurity (24%) and nutrition insecurity (25%), but the overlap of these subgroups was less than 60%. Logistic regression models indicated that non-Hispanic Asian individuals had higher odds of nutrition insecurity but not food insecurity. Moreover, nutrition insecurity was a stronger predictor of diabetes compared with food insecurity, and both constructs independently predicted poor mental health. CONCLUSIONS: Food and nutrition insecurity affect somewhat different populations. Both constructs are valuable predictors of diet-related health outcomes. Monitoring nutrition insecurity in addition to food insecurity can provide new information about populations with barriers to healthy diets.
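The analysis described is the kind that a logistic regression with both constructs entered together supports. A hedged sketch with statsmodels on synthetic data follows; the variables and data are invented for illustration and are not the Understanding America Study data.

```python
# Sketch of the logistic-regression comparison described above, fit with
# statsmodels on a small synthetic dataset. The variables and data are
# invented for illustration; they are not the Understanding America Study data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "food_insecure": rng.binomial(1, 0.24, n),
    "nutrition_insecure": rng.binomial(1, 0.25, n),
})
# Synthetic outcome loosely tied to nutrition insecurity.
logit_p = -2.0 + 0.8 * df["nutrition_insecure"] + 0.2 * df["food_insecure"]
df["diabetes"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Including both constructs in one model adjusts each for the other,
# mirroring the "independent predictor" comparison in the abstract.
model = smf.logit("diabetes ~ food_insecure + nutrition_insecure", data=df).fit()
print(np.exp(model.params))  # odds ratios
```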


Subjects
Diet, Food Insecurity, Humans, Los Angeles, Male, Female, Adult, Middle Aged, Food Supply, Young Adult, Aged, Nutritional Status, Adolescent, Food Security
3.
J Health Commun; 29(sup1): 1-10, 2024 Jun 03.
Article in English | MEDLINE | ID: mdl-38831666

ABSTRACT

Society is at an inflection point, both in terms of climate change and in the amount of data and computational resources currently available. Climate change has been a catastrophe in slow motion, with the relationships between human activity, climate change, and the resulting effects forming a complex system. However, to date, there has been a general lack of urgent responses from leaders and the general public, despite urgent warnings from the scientific community about the consequences of climate change and what can be done to mitigate it. Further, misinformation and disinformation about climate change abound. A major problem is that there has not been enough focus on communication in the climate change field. Since communication itself involves complex systems (e.g., information users, the information itself, communication channels), there is a need for more systems approaches to communication about climate change. Using systems approaches to understand and anticipate how information may be distributed and received before communication has even occurred, and adjusting accordingly, can lead to more proactive, precise climate change communication. The time has come to identify and develop more effective, tailored, and precise communication for climate change.


Subjects
Climate Change, Health Communication, Humans, Health Communication/methods, Systems Analysis, Communication
4.
J Health Commun; 29(sup1): 77-88, 2024 Jun 03.
Article in English | MEDLINE | ID: mdl-38845202

ABSTRACT

Over the past sixty years, scientists have been warning about climate change and its impacts on human health, but evidence suggests that many may not be heeding these concerns. This raises the question of whether new communication approaches are needed to overcome the unique challenges of communicating what people can do to slow or reverse climate change. To better elucidate the challenges of communicating about the links between human activity, climate change and its effects, and identify potential solutions, we developed a systems map of the factors and processes involved based on systems mapping sessions with climate change and communication experts. The systems map revealed 27 communication challenges such as "Limited information on how individual actions contribute to collective human activity," "Limited information on how present activity leads to long-term effects," and "Difficult to represent and communicate complex relationships." The systems map also revealed several themes among the identified challenges that exist in communicating about climate change, including a lack of available data and integrated databases, climate change disciplines working in silos, a need for a lexicon that is easily understood by the public, and the need for new communication strategies to describe processes that take time to manifest.


Subjects
Climate Change, Health Communication, Humans, Health Communication/methods, Systems Analysis, Communication
5.
JAMA; 331(18): 1544-1557, 2024 May 14.
Article in English | MEDLINE | ID: mdl-38557703

ABSTRACT

Importance: Infections due to multidrug-resistant organisms (MDROs) are associated with increased morbidity, mortality, length of hospitalization, and health care costs. Regional interventions may be advantageous in mitigating MDROs and associated infections. Objective: To evaluate whether implementation of a decolonization collaborative is associated with reduced regional MDRO prevalence, incident clinical cultures, infection-related hospitalizations, costs, and deaths. Design, Setting, and Participants: This quality improvement study was conducted from July 1, 2017, to July 31, 2019, across 35 health care facilities in Orange County, California. Exposures: Chlorhexidine bathing and nasal iodophor antisepsis for residents in long-term care and hospitalized patients in contact precautions (CP). Main Outcomes and Measures: Baseline and end of intervention MDRO point prevalence among participating facilities; incident MDRO (nonscreening) clinical cultures among participating and nonparticipating facilities; and infection-related hospitalizations and associated costs and deaths among residents in participating and nonparticipating nursing homes (NHs). Results: Thirty-five facilities (16 hospitals, 16 NHs, 3 long-term acute care hospitals [LTACHs]) adopted the intervention. Comparing decolonization with baseline periods among participating facilities, the mean (SD) MDRO prevalence decreased from 63.9% (12.2%) to 49.9% (11.3%) among NHs, from 80.0% (7.2%) to 53.3% (13.3%) among LTACHs (odds ratio [OR] for NHs and LTACHs, 0.48; 95% CI, 0.40-0.57), and from 64.1% (8.5%) to 55.4% (13.8%) (OR, 0.75; 95% CI, 0.60-0.93) among hospitalized patients in CP. When comparing decolonization with baseline among NHs, the mean (SD) monthly incident MDRO clinical cultures changed from 2.7 (1.9) to 1.7 (1.1) among participating NHs, from 1.7 (1.4) to 1.5 (1.1) among nonparticipating NHs (group × period interaction reduction, 30.4%; 95% CI, 16.4%-42.1%), from 25.5 (18.6) to 25.0 (15.9) among participating hospitals, from 12.5 (10.1) to 14.3 (10.2) among nonparticipating hospitals (group × period interaction reduction, 12.9%; 95% CI, 3.3%-21.5%), and from 14.8 (8.6) to 8.2 (6.1) among LTACHs (all facilities participating; 22.5% reduction; 95% CI, 4.4%-37.1%). For NHs, the rate of infection-related hospitalizations per 1000 resident-days changed from 2.31 during baseline to 1.94 during intervention among participating NHs, and from 1.90 to 2.03 among nonparticipating NHs (group × period interaction reduction, 26.7%; 95% CI, 19.0%-34.5%). Associated hospitalization costs per 1000 resident-days changed from $64 651 to $55 149 among participating NHs and from $55 151 to $59 327 among nonparticipating NHs (group × period interaction reduction, 26.8%; 95% CI, 26.7%-26.9%). Associated hospitalization deaths per 1000 resident-days changed from 0.29 to 0.25 among participating NHs and from 0.23 to 0.24 among nonparticipating NHs (group × period interaction reduction, 23.7%; 95% CI, 4.5%-43.0%). Conclusions and Relevance: A regional collaborative involving universal decolonization in long-term care facilities and targeted decolonization among hospital patients in CP was associated with lower MDRO carriage, infections, hospitalizations, costs, and deaths.
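The "group x period interaction" reductions reported above are the kind of estimate a difference-in-differences-style regression produces. Below is a hedged sketch using statsmodels on synthetic monthly culture counts; the facility-month counts, rates, and effect sizes are made up for illustration and are far simpler than the study's analysis.

```python
# Sketch of a group-by-period (difference-in-differences-style) model like the
# "group x period interaction" estimates above, fit to synthetic monthly MDRO
# culture counts with statsmodels. All numbers are made up for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for group, base_rate, intervention_effect in [
    ("participating", 2.7, 0.63),      # assumed ~37% drop during intervention
    ("nonparticipating", 1.7, 0.88),   # assumed smaller secular drop
]:
    for period, multiplier in [("baseline", 1.0), ("intervention", intervention_effect)]:
        for _ in range(24):            # 24 facility-months per cell
            rows.append({"group": group, "period": period,
                         "cultures": rng.poisson(base_rate * multiplier)})
df = pd.DataFrame(rows)

# The group:period interaction captures the extra change in participating
# facilities beyond the change seen in nonparticipating facilities.
model = smf.poisson("cultures ~ C(group) * C(period)", data=df).fit()
print(model.summary())
```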


Subjects
Anti-Infective Agents, Local, Bacterial Infections, Cross Infection, Drug Resistance, Multiple, Bacterial, Health Facilities, Infection Control, Aged, Humans, Administration, Intranasal, Anti-Infective Agents, Local/administration & dosage, Anti-Infective Agents, Local/therapeutic use, Bacterial Infections/economics, Bacterial Infections/microbiology, Bacterial Infections/mortality, Bacterial Infections/prevention & control, Baths/methods, California/epidemiology, Chlorhexidine/administration & dosage, Chlorhexidine/therapeutic use, Cross Infection/economics, Cross Infection/microbiology, Cross Infection/mortality, Cross Infection/prevention & control, Health Facilities/economics, Health Facilities/standards, Health Facilities/statistics & numerical data, Hospitalization/economics, Hospitalization/statistics & numerical data, Hospitals/standards, Hospitals/statistics & numerical data, Infection Control/methods, Iodophors/administration & dosage, Iodophors/therapeutic use, Nursing Homes/economics, Nursing Homes/standards, Nursing Homes/statistics & numerical data, Patient Transfer, Quality Improvement/economics, Quality Improvement/statistics & numerical data, Skin Care/methods, Universal Precautions
6.
J Health Commun; 28(sup1): 13-24, 2023 Apr 07.
Article in English | MEDLINE | ID: mdl-37390012

ABSTRACT

A major challenge in communicating health-related information is the involvement of multiple complex systems from the creation of the information to the sources and channels of dispersion to the information users themselves. To date, public health communications approaches have often not adequately accounted for the complexities of these systems to the degree necessary to have maximum impact. The virality of COVID-19 misinformation and disinformation has brought to light the need to consider these system complexities more extensively. Unaided, it is difficult for humans to see and fully understand complex systems. Luckily, there are a range of systems approaches and methods, such as systems mapping and systems modeling, that can help better elucidate complex systems. Using these methods to better characterize the various systems involved in communicating public health-related information can lead to the development of more tailored, precise, and proactive communications. Proceeding in an iterative manner to help design, implement, and adjust such communications strategies can increase impact and leave less opportunity for misinformation and disinformation to spread.


Subjects
COVID-19, Health Communication, Humans, Public Health, COVID-19/epidemiology
7.
Food Policy; 116: 102416, 2023 Apr.
Article in English | MEDLINE | ID: mdl-37234381

ABSTRACT

Translating agricultural productivity into food availability depends on food supply chains. Agricultural policy and research efforts promote increased horticultural crop production and yields, but the ability of low-resource food supply chains to handle increased volumes of perishable crops is not well understood. This study developed and used a discrete event simulation model to assess the impact of increased production of potato, onion, tomato, brinjal (eggplant), and cabbage on vegetable supply chains in Odisha, India. Odisha serves as an exemplar of the vegetable supply chain challenges found in many low-resource settings. Model results demonstrated that in response to increasing vegetable production to 1.25-5x baseline amounts, demand fulfillment at the retail level fluctuated by +3% to -4% from baseline; in other words, any improvements in vegetable availability for consumers were disproportionately small compared with the magnitude of increased production, and in some cases increased production worsened demand fulfillment. Increasing vegetable production led to disproportionately high rates of postharvest loss: for brinjal, for example, doubling agricultural production led to a 3% increase in demand fulfillment and a 19% increase in supply chain losses. The majority of postharvest losses occurred as vegetables accumulated and expired during wholesale-to-wholesale trade. To avoid inadvertently exacerbating postharvest losses, efforts to address food security through agriculture need to ensure that low-resource supply chains can handle increased productivity. Supply chain improvements should consider the constraints of different types of perishable vegetables, and they may need to go beyond structural improvements to include networks of communication and trade.
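A minimal day-stepped sketch of a perishable supply chain, in the spirit of the discrete event simulation described above, shows why raising production without raising downstream demand mostly raises spoilage. This is not the authors' Odisha model; the production, demand, and shelf-life values are assumptions.

```python
# Minimal day-stepped sketch of a perishable supply chain: farm production
# flows through a wholesale stage with a fixed shelf life, and unsold stock
# expires. Not the authors' Odisha model; all parameters are assumptions.
from collections import deque

def simulate(daily_production: float, daily_retail_demand: float,
             shelf_life_days: int = 3, days: int = 365):
    wholesale = deque()            # each lot: [age_in_days, quantity]
    sold = spoiled = 0.0
    for _ in range(days):
        wholesale.append([0, daily_production])   # new harvest arrives
        demand = daily_retail_demand
        for lot in wholesale:                     # oldest lots sell first
            take = min(lot[1], demand)
            lot[1] -= take
            sold += take
            demand -= take
            if demand <= 0:
                break
        for lot in wholesale:                     # age the remaining stock
            lot[0] += 1
        while wholesale and wholesale[0][0] >= shelf_life_days:
            spoiled += wholesale.popleft()[1]     # discard expired lots
    return sold, spoiled

for production in (100, 200):                     # doubling production...
    sold, spoiled = simulate(production, daily_retail_demand=90)
    print(f"production {production}/day -> sold {sold:,.0f}, spoiled {spoiled:,.0f}")
```

With these placeholder numbers, doubling production leaves sales essentially unchanged while spoilage roughly multiplies, which is the qualitative pattern the abstract reports.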

9.
Pediatr Res; 91(1): 254-260, 2022 Jan.
Article in English | MEDLINE | ID: mdl-33664477

ABSTRACT

BACKGROUND: Teaching caregivers to respond to normal infant night awakenings in ways other than feeding is a common obesity prevention effort. Models can simulate caregiver feeding behavior while controlling for variables that are difficult to manipulate or measure in real life. METHODS: We developed a virtual infant model representing an infant with an embedded metabolism and its daily sleep, awakenings, and feeds from a caregiver as the infant aged from 6 to 12 months (recommended age to introduce solids). We then simulated different night feeding interventions and their impact on infant body mass index (BMI). RESULTS: Reducing the likelihood of feeding during normal night wakings from 79% to 50% to 10% lowered infant BMI from the 84th to the 75th to the 62nd percentile by 12 months, respectively, among caregivers who did not adaptively feed (e.g., adjust portion sizes of solid foods with infant growth). Among caregivers who adaptively fed, all scenarios resulted in relatively stable BMI percentiles, and progressively reducing feeding probability by 10% each month showed the least fluctuation. CONCLUSIONS: Reducing night feeding has the potential to impact infant BMI (e.g., a 10% lower feeding probability can reduce BMI by 20 percentile points), especially among caregivers who do not adaptively feed. IMPACT: Teaching caregivers to respond to infant night waking with soothing behaviors other than feeding has the potential to reduce infant BMI. When the likelihood of feeding during night wakings was reduced from 79% to 50% to 10%, infants dropped from the 84th BMI percentile to the 75th to the 62nd by 12 months, respectively, among caregivers who did not adaptively feed. Night-feeding interventions have a greater impact when caregivers do not adaptively feed their infant based on growth than when caregivers do adaptively feed. Night-feeding interventions should be one of several tools in a multi-component intervention for childhood obesity prevention.
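A toy version of the virtual-infant idea can be sketched as a night-by-night simulation in which each waking may or may not be answered with a feed, and the share of that energy not offset elsewhere becomes weight gain. The energy values and the uncompensated fraction below are rough assumptions, not the authors' calibrated parameters.

```python
# Toy night-feeding sketch: each night, a waking may or may not be answered
# with a feed, and the share of that energy not offset by daytime intake
# becomes weight gain. All parameters are rough assumptions, not the authors'
# calibrated virtual-infant values.
import random

KCAL_PER_NIGHT_FEED = 120        # assumed energy content of one night feed
KCAL_PER_KG_GAIN = 6000          # assumed energy cost of 1 kg of weight gain
FRACTION_NOT_COMPENSATED = 0.25  # assumed share of feed energy that is truly surplus

def extra_weight_kg(p_feed_on_waking: float, nights: int = 180,
                    wakings_per_night: int = 2, seed: int = 42) -> float:
    rng = random.Random(seed)
    surplus_kcal = 0.0
    for _ in range(nights):
        for _ in range(wakings_per_night):
            if rng.random() < p_feed_on_waking:
                surplus_kcal += KCAL_PER_NIGHT_FEED * FRACTION_NOT_COMPENSATED
    return surplus_kcal / KCAL_PER_KG_GAIN

for p in (0.79, 0.50, 0.10):
    print(f"P(feed | waking) = {p:.2f} -> ~{extra_weight_kg(p):.2f} kg extra over 6 months")
```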


Subjects
Body Mass Index, Circadian Rhythm, Feeding Behavior, Caregivers, Humans, Infant, Models, Theoretical
10.
PLoS Comput Biol; 17(1): e1008470, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33411742

ABSTRACT

Finding medications or vaccines that may decrease the infectious period of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) could potentially reduce transmission in the broader population. We developed a computational model of the U.S. simulating the spread of SARS-CoV-2 and the potential clinical and economic impact of reducing the infectious period duration. Simulation experiments found that reducing the average infectious period duration could avert a median of 442,852 [treating 25% of symptomatic cases, reducing by 0.5 days, reproductive number (R0) 3.5, and starting treatment when 15% of the population has been exposed] to 44.4 million SARS-CoV-2 cases (treating 75% of all infected cases, reducing by 3.5 days, R0 2.0). With R0 2.5, reducing the average infectious period duration by 0.5 days for 25% of symptomatic cases averted 1.4 million cases and 99,398 hospitalizations; increasing to 75% of symptomatic cases averted 2.8 million cases. At $500/person, treating 25% of symptomatic cases saved $209.5 billion (societal perspective). Further reducing the average infectious period duration by 3.5 days averted 7.4 million cases (treating 25% of symptomatic cases). Expanding treatment to 75% of all infected cases, including asymptomatic infections (R0 2.5), averted 35.9 million cases and 4 million hospitalizations, saving $48.8 billion (societal perspective and starting treatment after 5% of the population has been exposed). Our study quantifies the potential effects of reducing the SARS-CoV-2 infectious period duration.
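The mechanism being quantified, a shorter infectious period lowering transmission, can be illustrated with a compact SEIR sketch. The parameters below are illustrative and the model is far simpler than the paper's calibrated U.S. model.

```python
# Compact SEIR sketch of the mechanism studied above: holding the transmission
# rate fixed, a shorter infectious period lowers the effective reproduction
# number and the cumulative number of infections. Parameters are illustrative,
# not the paper's calibrated U.S. model.

def seir_cumulative_infections(beta: float, infectious_days: float,
                               incubation_days: float = 5.0,
                               population: float = 330e6,
                               seed_infections: float = 100.0,
                               days: int = 730, dt: float = 0.1) -> float:
    gamma = 1.0 / infectious_days            # recovery rate
    sigma = 1.0 / incubation_days            # rate of leaving the exposed state
    s, e, i, r = population - seed_infections, 0.0, seed_infections, 0.0
    for _ in range(int(days / dt)):
        new_exposed = beta * s * i / population * dt
        new_infectious = sigma * e * dt
        new_recovered = gamma * i * dt
        s -= new_exposed
        e += new_exposed - new_infectious
        i += new_infectious - new_recovered
        r += new_recovered
    return population - s                    # cumulative infections

BETA = 2.5 / 7.0                             # R0 = 2.5 with a 7-day infectious period
baseline = seir_cumulative_infections(BETA, infectious_days=7.0)
shortened = seir_cumulative_infections(BETA, infectious_days=6.5)
print(f"Infections averted by shortening the infectious period by 0.5 days: "
      f"{baseline - shortened:,.0f}")
```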


Subjects
COVID-19 Drug Treatment, COVID-19/transmission, Models, Biological, Pandemics, SARS-CoV-2, COVID-19/epidemiology, COVID-19 Vaccines/therapeutic use, Computational Biology, Computer Simulation, Humans, Pandemics/prevention & control, Pandemics/statistics & numerical data, SARS-CoV-2/drug effects, Time Factors, United States/epidemiology, Virus Shedding/drug effects
11.
J Med Internet Res; 24(8): e30581, 2022 Aug 22.
Article in English | MEDLINE | ID: mdl-35994313

ABSTRACT

BACKGROUND: The increasing prevalence of smartphone apps to help people find different services raises the question of whether apps that help people find physical activity (PA) locations could help better prevent and control overweight and obesity. OBJECTIVE: The aim of this paper is to determine and quantify the potential impact of a digital health intervention for African American women prior to allocating financial resources toward implementation. METHODS: We developed our Virtual Population Obesity Prevention agent-based model of Washington, DC, to simulate the impact of a place-tailored digital health app that provides information about free recreation center classes on PA, BMI, and overweight and obesity prevalence among African American women. RESULTS: When the app is introduced at the beginning of the simulation, with app engagement at 25% (eg, 25% [41,839/167,356] of women aware of the app; 25% [10,460/41,839] of those aware downloading the app; and 25% [2615/10,460] of those who download it receiving regular push notifications), and a 25% (25/100) baseline probability to exercise (eg, without the app), there are no statistically significant increases in PA levels or decreases in BMI or obesity prevalence over 5 years across the population. When 50% (83,678/167,356) of women are aware of the app, 58.23% (48,725/83,678) of those who are aware download it, and 55% (26,799/48,725) of those who download it receive regular push notifications, in line with existing studies on app usage, introducing the app on average increases PA and decreases overweight and obesity prevalence, though the changes are not statistically significant. When app engagement increased to 75% (125,517/167,356) of women being aware, 75% (94,138/125,517) of those aware downloading it, and 75% (70,603/94,138) of those who downloaded it opting into the app's push notifications, there were statistically significant changes in PA participation, minutes of PA, and obesity prevalence. CONCLUSIONS: Our study shows that a digital health app that helps identify recreation center classes does not result in substantive population-wide health effects at lower levels of app engagement. For the app to result in statistically significant increases in PA and reductions in obesity prevalence over 5 years, at least 75% (125,517/167,356) of women need to be aware of the app, 75% (94,138/125,517) of those aware need to download it, and 75% (70,603/94,138) of those who download it need to opt into push notifications. Nevertheless, the app cannot fully overcome lack of access to recreation centers; therefore, public health administrators as well as parks and recreation agencies might consider incorporating this type of technology into multilevel interventions that also target the built environment and other social determinants of health.
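The engagement thresholds in the abstract follow a simple awareness x download x opt-in funnel, which can be reproduced directly from the reported denominators:

```python
# The engagement "funnel" from the abstract: awareness x download x opt-in,
# applied to the modeled population (167,356, per the reported denominators).
POPULATION = 167_356

def engaged(aware: float, download: float, notifications: float) -> int:
    return round(POPULATION * aware * download * notifications)

for level in (0.25, 0.75):
    print(f"{level:.0%} at each step -> {engaged(level, level, level):,} women "
          f"receiving push notifications")
print(f"Observed-usage scenario (50% / 58.23% / 55%) -> "
      f"{engaged(0.50, 0.5823, 0.55):,} women")
```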


Subjects
Mobile Applications, Black or African American, Exercise, Female, Humans, Obesity/epidemiology, Obesity/prevention & control, Overweight
12.
J Infect Dis; 224(6): 938-948, 2021 Sep 17.
Article in English | MEDLINE | ID: mdl-33954775

ABSTRACT

BACKGROUND: With multiple coronavirus disease 2019 (COVID-19) vaccines available, understanding the epidemiologic, clinical, and economic value of increasing coverage levels and expediting vaccination is important. METHODS: We developed a computational model (transmission and age-stratified clinical and economics outcome model) representing the United States population, COVID-19 coronavirus spread (February 2020-December 2022), and vaccination to determine the impact of increasing coverage and expediting time to achieve coverage. RESULTS: When achieving a given vaccination coverage in 270 days (70% vaccine efficacy), every 1% increase in coverage can avert an average of 876 800 (217 000-2 398 000) cases, varying with the number of people already vaccinated. For example, each 1% increase between 40% and 50% coverage can prevent 1.5 million cases, 56 240 hospitalizations, and 6660 deaths; gain 77 590 quality-adjusted life-years (QALYs); and save $602.8 million in direct medical costs and $1.3 billion in productivity losses. Expediting to 180 days could save an additional 5.8 million cases, 215 790 hospitalizations, 26 370 deaths, 206 520 QALYs, $3.5 billion in direct medical costs, and $4.3 billion in productivity losses. CONCLUSIONS: Our study quantifies the potential value of decreasing vaccine hesitancy and increasing vaccination coverage and how this value may decrease with the time it takes to achieve coverage, emphasizing the need to reach high coverage levels as soon as possible, especially before the fall/winter.
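The per-percentage-point figures quoted for the 40%-50% coverage range can be rolled up with simple arithmetic; the values per point below are taken from the abstract, and the only assumption is applying them uniformly across a 10-point increase.

```python
# Rolling up the per-percentage-point figures quoted in the abstract for the
# 40%-50% coverage range. The per-point values come from the abstract; the
# only assumption is applying them uniformly across a 10-point increase.
PER_POINT = {
    "cases averted": 1_500_000,
    "hospitalizations averted": 56_240,
    "deaths averted": 6_660,
    "QALYs gained": 77_590,
    "direct medical costs saved ($)": 602_800_000,
    "productivity losses saved ($)": 1_300_000_000,
}
POINTS = 10  # moving coverage from 40% to 50%

for outcome, per_point in PER_POINT.items():
    print(f"{outcome}: {per_point * POINTS:,.0f} over a {POINTS}-point increase")
```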


Subjects
COVID-19 Vaccines/economics, Cost-Benefit Analysis, Vaccination/economics, COVID-19/prevention & control, COVID-19 Vaccines/administration & dosage, Humans, Models, Economic, SARS-CoV-2, United States, Vaccination/statistics & numerical data
13.
Clin Infect Dis; 72(3): 438-447, 2021 Feb 01.
Article in English | MEDLINE | ID: mdl-31970389

ABSTRACT

BACKGROUND: When trying to control regional spread of antibiotic-resistant pathogens such as carbapenem-resistant Enterobacteriaceae (CRE), decision makers must choose the highest-yield facilities to target for interventions. The question is, with limited resources, how best to choose these facilities. METHODS: Using our Regional Healthcare Ecosystem Analyst-generated agent-based model of all Chicago metropolitan area inpatient facilities, we simulated the spread of CRE and different ways of choosing facilities to apply a prevention bundle (screening, chlorhexidine gluconate bathing, hand hygiene, geographic separation, and patient registry) to a resource-limited total of 1686 inpatient beds. RESULTS: Randomly selecting facilities did not impact prevalence but averted 620 new carriers and 175 infections, saving $6.3 million in total costs compared with no intervention. Selecting facilities by type (eg, long-term acute care hospitals) yielded a 16.1% relative prevalence decrease, preventing 1960 cases and 558 infections and saving $62.4 million more than random selection. Choosing the largest facilities was better than random selection, but not better than selection by type. Selecting facilities by considering connections to other facilities (ie, highest volume of discharged patients) yielded a 9.5% relative prevalence decrease, preventing 1580 cases and 470 infections and saving $51.6 million more than random selection. Selecting facilities using a combination of these metrics yielded the greatest reduction (19.0% relative prevalence decrease, preventing 1840 cases and 554 infections, saving $59.6 million compared with random selection). CONCLUSIONS: While choosing target facilities based on single metrics (eg, most inpatient beds, most connections to other facilities) achieved better control than randomly choosing facilities, more effective targeting occurred when considering how these and other factors (eg, patient length of stay, care for higher-risk patients) interacted as a system.
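One way to operationalize "selecting facilities using a combination of these metrics" is a weighted composite score over normalized facility characteristics. The sketch below is illustrative only; the facilities, metrics, and weights are hypothetical, not those used in the agent-based model.

```python
# Sketch of multi-metric facility ranking: combine several facility
# characteristics into one composite score and target the top-scoring
# facilities. Facilities, metrics, and weights are hypothetical.
from dataclasses import dataclass

@dataclass
class Facility:
    name: str
    beds: int                  # facility size
    discharges_shared: int     # volume of patients discharged to other facilities
    mean_length_of_stay: float

def normalize(values):
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def rank(facilities, weights=(0.4, 0.4, 0.2)):
    beds = normalize([f.beds for f in facilities])
    shared = normalize([f.discharges_shared for f in facilities])
    los = normalize([f.mean_length_of_stay for f in facilities])
    scores = [weights[0] * b + weights[1] * s + weights[2] * l
              for b, s, l in zip(beds, shared, los)]
    return sorted(zip(facilities, scores), key=lambda fs: fs[1], reverse=True)

facilities = [
    Facility("LTACH A", beds=60, discharges_shared=900, mean_length_of_stay=25.0),
    Facility("Hospital B", beds=500, discharges_shared=1200, mean_length_of_stay=4.5),
    Facility("SNF C", beds=120, discharges_shared=300, mean_length_of_stay=28.0),
]
for fac, score in rank(facilities):
    print(f"{fac.name}: composite score {score:.2f}")
```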


Subjects
Carbapenem-Resistant Enterobacteriaceae, Cross Infection, Enterobacteriaceae Infections, Chicago/epidemiology, Cross Infection/epidemiology, Cross Infection/prevention & control, Ecosystem, Enterobacteriaceae Infections/drug therapy, Enterobacteriaceae Infections/epidemiology, Enterobacteriaceae Infections/prevention & control, Humans
14.
Am J Epidemiol; 190(3): 448-458, 2021 Feb 01.
Article in English | MEDLINE | ID: mdl-33145594

ABSTRACT

Typically, long-term acute care hospitals (LTACHs) have less experience in and incentives to implementing aggressive infection control for drug-resistant organisms such as carbapenem-resistant Enterobacteriaceae (CRE) than acute care hospitals. Decision makers need to understand how implementing control measures in LTACHs can impact CRE spread regionwide. Using our Chicago metropolitan region agent-based model to simulate CRE spread and control, we estimated that a prevention bundle in only LTACHs decreased prevalence by a relative 4.6%-17.1%, averted 1,090-2,795 new carriers, 273-722 infections and 37-87 deaths over 3 years and saved $30.5-$69.1 million, compared with no CRE control measures. When LTACHs and intensive care units intervened, prevalence decreased by a relative 21.2%. Adding LTACHs averted an additional 1,995 carriers, 513 infections, and 62 deaths, and saved $47.6 million beyond implementation in intensive care units alone. Thus, LTACHs may be more important than other acute care settings for controlling CRE, and regional efforts to control drug-resistant organisms should start with LTACHs as a centerpiece.


Subjects
Carbapenem-Resistant Enterobacteriaceae, Clinical Protocols/standards, Enterobacteriaceae Infections/epidemiology, Enterobacteriaceae Infections/prevention & control, Hospital Administration, Infection Control/organization & administration, Computer Simulation, Humans, Infection Control/standards, Models, Theoretical
15.
Sex Transm Dis; 48(5): 370-380, 2021 May 01.
Article in English | MEDLINE | ID: mdl-33156291

ABSTRACT

BACKGROUND: Although current human papillomavirus (HPV) genotype screening tests identify genotypes 16 and 18 and do not specifically identify other high-risk types, a new extended genotyping test identifies additional individual high-risk genotypes (31, 45, 51, and 52) and groups of high-risk genotypes (33/58, 35/39/68, and 56/59/66). METHODS: We developed a Markov model of the HPV disease course and evaluated the clinical and economic value of HPV primary screening with Onclarity (BD Diagnostics, Franklin Lakes, NJ), which is capable of extended genotyping, in a cohort of women 30 years or older. In the model, women with certain genotypes were rescreened later instead of undergoing immediate colposcopy, and we varied which genotypes triggered rescreening, the disease progression rate, and the test cost. RESULTS: Assuming 100% compliance with screening, HPV primary screening using current tests resulted in 25,194 invasive procedures and 48 invasive cervical cancer (ICC) cases per 100,000 women. Screening with extended genotyping (100% compliance) and later rescreening women with certain genotypes averted 903 to 3163 invasive procedures and resulted in 0 to 3 more ICC cases compared with current HPV primary screening tests. Extended genotyping was cost-effective ($2298-$7236/quality-adjusted life-year) when costing $75 and cost saving (median, $0.3-$1.0 million) when costing $43. When the probabilities of disease progression increased (2-4 times), extended genotyping was not cost-effective because it resulted in more ICC cases and accrued fewer quality-adjusted life-years. CONCLUSIONS: Our study identified the conditions under which extended genotyping was cost-effective and even cost saving compared with current tests. A key driver of cost-effectiveness is the risk of disease progression, which emphasizes the need to better understand such risks in different populations.
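The cost-effectiveness comparison rests on an incremental cost-effectiveness ratio (ICER). A minimal sketch follows, with placeholder per-woman costs and QALYs rather than the study's results.

```python
# Minimal incremental cost-effectiveness sketch for comparing extended
# genotyping with current HPV primary screening. The per-woman costs and
# QALYs below are placeholders, not the study's results.

def icer(cost_new: float, qaly_new: float, cost_old: float, qaly_old: float):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY.
    Returns None when the new strategy gains no QALYs."""
    d_cost, d_qaly = cost_new - cost_old, qaly_new - qaly_old
    if d_qaly <= 0:
        return None
    return d_cost / d_qaly

current = {"cost": 1_000.0, "qaly": 20.000}    # placeholder per-woman values
extended = {"cost": 1_020.0, "qaly": 20.004}   # slightly costlier, slightly better

ratio = icer(extended["cost"], extended["qaly"], current["cost"], current["qaly"])
if ratio is None:
    print("Extended strategy gains no QALYs in this scenario")
else:
    print(f"ICER: ${ratio:,.0f} per QALY gained")
```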


Subjects
Alphapapillomavirus, Papillomavirus Infections, Uterine Cervical Dysplasia, Uterine Cervical Neoplasms, Cost-Benefit Analysis, Early Detection of Cancer, Female, Genotype, Humans, Papillomaviridae/genetics, Papillomavirus Infections/diagnosis, Papillomavirus Infections/epidemiology, Pregnancy
16.
J Infect Dis; 222(11): 1910-1919, 2020 Nov 09.
Article in English | MEDLINE | ID: mdl-32671397

ABSTRACT

BACKGROUND: Although norovirus outbreaks periodically make headlines, it is unclear how much attention norovirus may receive otherwise. A better understanding of the burden could help determine how to prioritize norovirus prevention and control. METHODS: We developed a computational simulation model to quantify the clinical and economic burden of norovirus in the United States. RESULTS: A symptomatic case generated $48 in direct medical costs and $416 in productivity losses ($464 total). The median yearly cost of outbreaks was $7.6 million (range across years, $7.5-$8.2 million) in direct medical costs and $165.3 million ($161.1-$176.4 million) in productivity losses ($173.5 million total). Sporadic illnesses in the community (incidence, 10-150/1000 population) resulted in 14 118-211 705 hospitalizations, 8.2-122.9 million missed school/work days, $0.2-$2.3 billion in direct medical costs, and $1.4-$20.7 billion in productivity losses ($1.5-$23.1 billion total). The total cost was $10.6 billion based on the current incidence estimate (68.9/1000). CONCLUSION: Our study quantified the burden of norovirus. Of the total burden, sporadic cases constituted >90% (thus, the annual burden may vary depending on incidence) and productivity losses represented 89%. More than half of the economic burden occurs in adults aged ≥45 years, more than half occurs in the winter months, and >90% of outbreak costs are due to person-to-person transmission, offering insights into where and when prevention and control efforts may yield returns.
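The headline totals can be approximated with back-of-envelope arithmetic: per-case costs from the abstract scaled by incidence and population. The U.S. population size (~330 million) is an assumption; the per-case costs and incidence values come from the abstract.

```python
# Back-of-envelope version of the burden arithmetic above: per-case costs from
# the abstract, scaled by incidence and population. The U.S. population size
# (~330 million) is an assumption; the other inputs come from the abstract.
US_POPULATION = 330_000_000
DIRECT_MEDICAL_PER_CASE = 48   # $ per symptomatic case
PRODUCTIVITY_PER_CASE = 416    # $ per symptomatic case

def annual_sporadic_cost(incidence_per_1000: float) -> float:
    cases = US_POPULATION * incidence_per_1000 / 1000
    return cases * (DIRECT_MEDICAL_PER_CASE + PRODUCTIVITY_PER_CASE)

for incidence in (10, 68.9, 150):
    print(f"incidence {incidence}/1000 -> ~${annual_sporadic_cost(incidence) / 1e9:.1f} billion per year")
```

With these inputs the sketch lands close to the abstract's $1.5 billion, $10.6 billion, and $23.1 billion figures for the low, current, and high incidence estimates, respectively.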


Subjects
Cost of Illness, Gastroenteritis/economics, Gastroenteritis/epidemiology, Norovirus, Adolescent, Adult, Aged, Child, Child, Preschool, Disease Outbreaks/economics, Gastroenteritis/virology, Health Care Costs, Hospitalization, Humans, Incidence, Infant, Infant, Newborn, Middle Aged, United States/epidemiology, Young Adult
17.
J Infect Dis; 222(7): 1138-1144, 2020 Sep 01.
Article in English | MEDLINE | ID: mdl-32386323

ABSTRACT

BACKGROUND: The protection that an influenza vaccine offers can vary significantly from person to person due to differences in immune systems, body types, and other factors. The question, then, is what the value would be of efforts to reduce this variability, such as making vaccines more personalized and tailored to individuals. METHODS: We developed a compartment model of the United States to simulate different influenza seasons and the impact of reducing the variability in responses to the influenza vaccine across the population. RESULTS: Going from a vaccine that varied in efficacy (0-30%) to one that had a uniform 30% efficacy for everyone averted 16.0-31.2 million cases, $1.9-$3.6 billion in direct medical costs, and $16.1-$42.7 billion in productivity losses. Going from 0-50% in efficacy to just 50% for everyone averted 27.7-38.6 million cases, $3.3-$4.6 billion in direct medical costs, and $28.8-$57.4 billion in productivity losses. Going from 0-70% to 70% averted 33.6-54.1 million cases, $4.0-$6.5 billion in direct medical costs, and $44.8-$64.7 billion in productivity losses. CONCLUSIONS: This study quantifies for policy makers, funders, and vaccine developers and manufacturers the potential impact of efforts to reduce variability in the protection that influenza vaccines offer (eg, developing vaccines that are more personalized to different individual factors).
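The comparison of variable versus uniform vaccine efficacy can be illustrated with a simple generation-based (Reed-Frost-style) simulation in which each person's infection risk is scaled down by their individual efficacy draw. This is a sketch, not the paper's compartment model; the population size, R0, and seeding are assumptions.

```python
# Generation-based (Reed-Frost-style) sketch comparing a vaccine whose efficacy
# varies across people (uniform on 0-30%) with one giving everyone 30%. The
# population size, R0, and seeding are assumptions; this is far simpler than
# the paper's compartment model.
import numpy as np

def attack_rate(efficacies: np.ndarray, r0: float = 1.3, seeds: int = 100,
                seed: int = 0) -> float:
    rng = np.random.default_rng(seed)
    n = efficacies.size
    susceptible = np.ones(n, dtype=bool)
    infectious = np.zeros(n, dtype=bool)
    infectious[rng.choice(n, seeds, replace=False)] = True
    susceptible[infectious] = False
    per_contact = r0 / n          # per-generation transmission probability per pair
    while infectious.any():
        n_inf = infectious.sum()
        # probability each susceptible is infected this generation, with each
        # person's risk scaled down by their individual vaccine efficacy
        p_inf = 1 - (1 - per_contact * (1 - efficacies)) ** n_inf
        newly = susceptible & (rng.random(n) < p_inf)
        susceptible &= ~newly
        infectious = newly        # previous generation recovers
    return 1 - susceptible.mean()

n = 100_000
rng = np.random.default_rng(42)
variable = rng.uniform(0.0, 0.30, n)     # efficacy varies from 0% to 30%
uniform = np.full(n, 0.30)               # everyone gets 30%
print(f"attack rate, variable 0-30% efficacy: {attack_rate(variable):.1%}")
print(f"attack rate, uniform 30% efficacy:    {attack_rate(uniform):.1%}")
```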


Subjects
Disease Transmission, Infectious/prevention & control, Epidemics, Influenza Vaccines/administration & dosage, Influenza, Human/epidemiology, Influenza, Human/prevention & control, Adolescent, Adult, Aged, Aged, 80 and over, Child, Child, Preschool, Cost-Benefit Analysis, Female, Humans, Infant, Infant, Newborn, Influenza Vaccines/economics, Influenza Vaccines/immunology, Influenza, Human/economics, Male, Middle Aged, Models, Statistical, Pharmacies, Seasons, Treatment Outcome, United States/epidemiology, Vaccination/economics, Vaccination Coverage, Young Adult
18.
J Infect Dis; 221(11): 1782-1794, 2020 May 11.
Article in English | MEDLINE | ID: mdl-31150539

ABSTRACT

BACKGROUND: Clinical testing detects a fraction of carbapenem-resistant Enterobacteriaceae (CRE) carriers. Detecting a greater proportion could lead to increased use of infection prevention and control measures but requires resources. Therefore, it is important to understand the impact of detecting increasing proportions of CRE carriers. METHODS: We used our Regional Healthcare Ecosystem Analyst-generated agent-based model of adult inpatient healthcare facilities in Orange County, California, to explore the impact that detecting greater proportions of carriers has on the spread of CRE. RESULTS: Detecting and placing 1 in 9 carriers on contact precautions increased the prevalence of CRE from 0% to 8.0% countywide over 10 years. Increasing the proportion of detected carriers from 1 in 9 up to 1 in 5 yielded linear reductions in transmission; at proportions >1 in 5, reductions were greater than linear. Transmission reductions did not occur for 1, 4, or 5 years, varying by facility type. With a contact precautions effectiveness of ≤70%, the detection level yielding nonlinear reductions remained unchanged; with an effectiveness of >80%, detecting only 1 in 5 carriers garnered large reductions in the number of new CRE carriers. Trends held when CRE was already present in the region. CONCLUSION: Although detection of all carriers provided the most benefits for preventing new CRE carriers, if this is not feasible, it may be worthwhile to aim for detecting >1 in 5 carriers.


Subjects
Carbapenem-Resistant Enterobacteriaceae/isolation & purification, Carrier State/diagnosis, Enterobacteriaceae Infections/transmission, Infection Control/methods, Carrier State/epidemiology, Carrier State/transmission, Contact Tracing, Enterobacteriaceae Infections/diagnosis, Enterobacteriaceae Infections/epidemiology, Enterobacteriaceae Infections/prevention & control, Hospitals/statistics & numerical data, Humans, Nursing Homes/statistics & numerical data, Prevalence
19.
Clin Infect Dis; 70(5): 843-849, 2020 Feb 14.
Article in English | MEDLINE | ID: mdl-31070719

ABSTRACT

BACKGROUND: Regions are considering the use of electronic registries to track patients who carry antibiotic-resistant bacteria, including carbapenem-resistant Enterobacteriaceae (CRE). Implementing such a registry can be challenging and requires time, effort, and resources; therefore, there is a need to better understand the potential impact. METHODS: We developed an agent-based model of all inpatient healthcare facilities (90 acute care hospitals, 9 long-term acute care hospitals, 351 skilled nursing facilities, and 12 ventilator-capable skilled nursing facilities) in the Chicago metropolitan area, surrounding communities, and patient flow using our Regional Healthcare Ecosystem Analyst software platform. Scenarios explored the impact of a registry that tracked patients carrying CRE to help guide infection prevention and control. RESULTS: When all Illinois facilities participated (n = 402), the registry reduced the number of new carriers by 11.7% and CRE prevalence by 7.6% over a 3-year period. When 75% of the largest Illinois facilities participated (n = 304), registry use resulted in a 11.6% relative reduction in new carriers (16.9% and 1.2% in participating and nonparticipating facilities, respectively) and 5.0% relative reduction in prevalence. When 50% participated (n = 201), there were 10.7% and 5.6% relative reductions in incident carriers and prevalence, respectively. When 25% participated (n = 101), there was a 9.1% relative reduction in incident carriers (20.4% and 1.6% in participating and nonparticipating facilities, respectively) and 2.8% relative reduction in prevalence. CONCLUSIONS: Implementing an extensively drug-resistant organism registry reduced CRE spread, even when only 25% of the largest Illinois facilities participated due to patient sharing. Nonparticipating facilities garnered benefits, with reductions in new carriers.


Subjects
Carbapenem-Resistant Enterobacteriaceae, Cross Infection, Enterobacteriaceae Infections, Chicago, Cross Infection/epidemiology, Cross Infection/prevention & control, Ecosystem, Enterobacteriaceae Infections/drug therapy, Enterobacteriaceae Infections/epidemiology, Enterobacteriaceae Infections/prevention & control, Humans, Illinois/epidemiology, Registries
20.
Pediatr Res; 88(4): 661-667, 2020 Oct.
Article in English | MEDLINE | ID: mdl-32179869

ABSTRACT

BACKGROUND: Studies show that by 3 months, over half of US infants receive formula, and guidelines play a key role in formula feeding. The question, then, is what might happen if caregivers follow guidelines and, more specifically, whether there are situations in which following guidelines can result in infants who are overweight or have obesity. METHODS: We used our "Virtual Infant" agent-based model representing infant-caregiver pairs, which allowed caregivers to feed infants each day according to guidelines put forth by Johns Hopkins Medicine (JHM), Children's Hospital of Philadelphia (CHOP), Children's Hospital of the King's Daughters (CHKD), and Women, Infants, and Children (WIC). The model simulated the resulting development of the infants from birth to 6 months. The two sets of guidelines (JHM/CHOP/CHKD versus WIC) vary in their recommendations and do not cite studies supporting the recommended amounts at given ages. RESULTS: Simulations identified several scenarios in which caregivers followed JHM/CHOP/CHKD or WIC guidelines but infants still became overweight or developed obesity by 6 months. For the JHM/CHOP/CHKD guidelines, this occurred even when caregivers adjusted feeding based on the infant's weight. For the WIC guidelines, when caregivers adjusted formula amounts, infants maintained a healthy weight. CONCLUSIONS: WIC guidelines may be a good starting point for caregivers who adjust as their infant grows, but the minimum amounts in the JHM/CHKD/CHOP recommendations may be too high. IMPACT: Our virtual infant simulation study answers the question: can caregivers follow current formula-feeding guidelines and still end up with an infant who is overweight or has obesity? Our study identified several situations in which unhealthy weight gain and/or weight loss could result from following established formula-feeding recommendations. Our study also suggests that the minimum recommended amount of daily formula feeding in the JHM/CHOP/CHKD guidelines should be lower, to give caregivers more flexibility in adjusting daily feeding levels in response to infant weight. WIC guidelines may be a good starting point for caregivers who adjust as their infant grows. To understand how to adjust guidelines, we can use computational simulation models, which serve as "virtual laboratories" that help overcome the logistical and ethical issues of clinical trials.


Subjects
Infant Formula, Infant Nutritional Physiological Phenomena, Overweight/prevention & control, Pediatric Obesity/prevention & control, Body Weight, Caregivers, Computer Simulation, Feeding Behavior/physiology, Female, Guidelines as Topic, Humans, Infant, Infant Food, Infant, Newborn, Male, Time Factors, United States, Weight Gain