Results 1 - 20 of 49
1.
Am J Epidemiol ; 2024 Jun 17.
Article in English | MEDLINE | ID: mdl-38885957

ABSTRACT

Studies of SARS-CoV-2 incidence are important for responding to continued transmission and future pandemics. We conducted active surveillance for SARS-CoV-2 in a rural community cohort with broad age representation from November 2020 through July 2022. Participants provided serum specimens at regular intervals and following SARS-CoV-2 infection or vaccination. We estimated the incidence of SARS-CoV-2 infection identified by study RT-PCR, electronic health record documentation or self-report of a positive test, or serology. We also estimated the seroprevalence of SARS-CoV-2 spike and nucleocapsid antibodies measured by ELISA. Overall, 65% of the cohort had ≥1 SARS-CoV-2 infection by July 2022, and 19% of those with a primary infection were reinfected. Infection and vaccination contributed to high seroprevalence: 98% (95% CI: 95%, 99%) of participants were spike or nucleocapsid seropositive at the end of follow-up. Among those seropositive, 82% were vaccinated. Participants were more likely to be seropositive to spike than nucleocapsid following infection. Infection among seropositive individuals could be identified by increases in nucleocapsid, but not spike, ELISA optical density values. Nucleocapsid antibodies waned more quickly after infection than spike antibodies. The high levels of SARS-CoV-2 population immunity found in this study are changing the epidemiology of the virus, necessitating ongoing surveillance and policy evaluation.

2.
Water Res ; 252: 121242, 2024 Mar 15.
Article in English | MEDLINE | ID: mdl-38342066

ABSTRACT

Water reuse is a growing global reality. In regulating water reuse, viruses have come to the fore as key pathogens due to high shedding rates, low infectious doses, and resilience to traditional wastewater treatments. To demonstrate the high log reductions required by emerging water reuse regulations, cost and practicality necessitate surrogates for viruses for use as challenge organisms in unit process evaluation and monitoring. Bacteriophage surrogates that are mitigated to the same or lesser extent than viruses of concern are routinely used for individual unit process testing. However, the behavior of these surrogates over a multi-barrier treatment train typical of water reuse has not been well-established. Toward this aim, we performed a meta-analysis of log reductions of common bacteriophage surrogates for five treatment processes typical of water reuse treatment trains: advanced oxidation processes, chlorination, membrane filtration, ozonation, and ultraviolet (UV) disinfection. Robust linear regression was applied to identify a range of doses consistent with a given log reduction of bacteriophages and viruses of concern for each treatment process. The results were used to determine relative conservatism of surrogates. We found that no one bacteriophage was a representative or conservative surrogate for viruses of concern across all multi-barrier treatments (encompassing multiple mechanisms of virus mitigation). Rather, a suite of bacteriophage surrogates provides both a representative range of inactivation and information about the effectiveness of individual processes within a treatment train. Based on the abundance of available data and diversity of virus treatability using these five key water reuse treatment processes, bacteriophages MS2, phiX174, and Qbeta were recommended as a core suite of surrogates for virus challenge testing.
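The dose-response relationships underlying a meta-analysis like this are commonly summarized as linear fits of log10 reduction against treatment dose. A minimal sketch, using ordinary least squares rather than the robust regression used in the study, on hypothetical UV dose/MS2 values (not data from the paper):

```python
# Fit log10 reduction of bacteriophage MS2 vs. UV dose with ordinary least
# squares, then invert the line to estimate the dose needed for a target
# log reduction. Data points are hypothetical; the meta-analysis pooled
# literature values and used robust regression, which down-weights outliers.

def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

def dose_for_log_reduction(target, slope, intercept):
    """Invert the fitted line to get the dose achieving a target log reduction."""
    return (target - intercept) / slope

# Hypothetical UV doses (mJ/cm^2) and observed MS2 log10 reductions
doses = [20.0, 40.0, 60.0, 80.0]
log_reductions = [1.0, 2.0, 3.0, 4.0]

slope, intercept = fit_line(doses, log_reductions)
print(dose_for_log_reduction(4.0, slope, intercept))  # dose for a 4-log target
```

Comparing such fitted lines between a surrogate phage and a virus of concern is what establishes whether the surrogate is conservative for a given process.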


Subject(s)
Bacteriophages, Water Purification, Water, Bacteriophage phi X 174, Water Purification/methods, Disinfection/methods, Levivirus
3.
J Water Health ; 21(9): 1209-1227, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37756190

ABSTRACT

Through a community intervention in 14 non-disinfecting municipal water systems, we quantified sporadic acute gastrointestinal illness (AGI) attributable to groundwater. Ultraviolet (UV) disinfection was installed on all supply wells of intervention communities. In control communities, residents continued to drink non-disinfected groundwater. Intervention and control communities switched treatments by moving UV disinfection units at the study midpoint (crossover design). Study participants (n = 1,659) completed weekly health diaries during four 12-week surveillance periods. Water supply wells were analyzed monthly for enteric pathogenic viruses. In the crossover analysis, no groundwater-borne AGI was observed. However, virus types and quantities in supply wells changed through the study, suggesting that exposure was not constant. We therefore also compared AGI incidence between intervention and control communities within the same surveillance period. During Period 1, norovirus contaminated wells and the AGI risk attributable to well water was 19% (95% CI, -4%, 36%) for children <5 years and 15% (95% CI, -9%, 33%) for adults. During Period 3, echovirus 11 contaminated wells and UV disinfection slightly reduced AGI in adults. Estimates of the AGI risk attributable to drinking non-disinfected groundwater were highly variable, but appeared greatest during times when supply wells were contaminated with specific AGI-etiologic viruses.


Subject(s)
Drinking Water, Groundwater, Adult, Child, Humans, Water Supply, Disinfection, Enterovirus B, Human
4.
J Environ Qual ; 52(2): 270-286, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36479898

ABSTRACT

Antimicrobial resistance is a growing public health problem that requires an integrated approach among human, agricultural, and environmental sectors. However, few studies address all three components simultaneously. We investigated the occurrence of five antibiotic resistance genes (ARGs) and the class 1 integron gene (intI1) in private wells drawing water from a vulnerable aquifer influenced by residential septic systems and land-applied dairy manure. Samples (n = 138) were collected across four seasons from a randomized sample of private wells in Kewaunee County, Wisconsin. Measurements of ARGs and intI1 were related to microbial source tracking (MST) markers specific to human and bovine feces; they were also related to 54 risk factors for contamination representing land use, rainfall, hydrogeology, and well construction. ARGs and intI1 occurred in 5%-40% of samples depending on target. Detection frequencies for ARGs and intI1 were lowest in the absence of human and bovine MST markers (1%-30%), highest when co-occurring with human and bovine markers together (11%-78%), and intermediate when co-occurring with just one type of MST marker (4%-46%). Gene targets were associated with septic system density more often than agricultural land, potentially because of the variable presence of manure on the landscape. Determining ARG prevalence in a rural setting with mixed land use allowed an assessment of the relative contribution of human and bovine fecal sources. Because fecal sources co-occurred with ARGs at similar rates, interventions intended to reduce ARG occurrence may be most effective if both sources are considered.


Subject(s)
Anti-Bacterial Agents, Manure, Animals, Humans, Cattle, Anti-Bacterial Agents/pharmacology, Livestock, Feces, Drug Resistance, Microbial/genetics
5.
Hum Vaccin Immunother ; 18(7): 2159215, 2022 12 30.
Article in English | MEDLINE | ID: mdl-36577134

ABSTRACT

The safety of the 9-valent HPV vaccine (9vHPV) has been established with regard to common and uncommon adverse events. However, investigation of rare and severe adverse events requires extended study periods to capture rare outcomes. This observational cohort study investigated the occurrence of three rare and serious adverse events following 9vHPV vaccination, compared to other vaccinations, among US individuals 9-26 years old, using electronic health record data from the Vaccine Safety Datalink (VSD). We searched for occurrences of Guillain-Barré syndrome (GBS), chronic inflammatory demyelinating polyneuropathy (CIDP), and stroke following 9vHPV vaccination from October 4, 2015, through January 2, 2021. We compared the risks of GBS, CIDP, and stroke following 9vHPV vaccination to the risks of those outcomes following comparator vaccines commonly given to this age group (Td, Tdap, MenACWY, hepatitis A, and varicella vaccines) from January 1, 2007, through January 2, 2021. We observed 1.2 cases of stroke, 0.3 cases of GBS, and 0.1 cases of CIDP per 100,000 doses of 9vHPV vaccine. Across more than 1.8 million administered doses of 9vHPV, we identified no statistically significant increase in risk for any of these adverse events, either overall or stratified by age (9-17 vs. 18-26 years) and sex. Our findings provide additional evidence supporting 9vHPV vaccine safety, over longer time frames and for more serious and rare adverse events.


Subject(s)
Papillomavirus Infections, Papillomavirus Vaccines, Polyradiculoneuropathy, Chronic Inflammatory Demyelinating, Adolescent, Adult, Child, Female, Humans, Male, Young Adult, Human Papillomavirus Viruses, Papillomavirus Infections/epidemiology, Papillomavirus Vaccines/administration & dosage, Papillomavirus Vaccines/adverse effects, Polyradiculoneuropathy, Chronic Inflammatory Demyelinating/chemically induced, Vaccination/adverse effects
6.
Sci Rep ; 12(1): 17138, 2022 10 13.
Article in English | MEDLINE | ID: mdl-36229636

ABSTRACT

Stable isotopes are useful for estimating livestock diet selection. The objective was to compare δ13C and δ15N for estimating the diet proportion of C3-C4 forages when steers (Bos spp.) were fed set proportions of rhizoma peanut (Arachis glabrata; RP; a C3 plant) and bahiagrass (Paspalum notatum; a C4 plant). Treatments were proportions of RP with bahiagrass hay: 100% bahiagrass (0%RP); 25% RP + 75% bahiagrass (25%RP); 50% RP + 50% bahiagrass (50%RP); 75% RP + 25% bahiagrass (75%RP); and 100% RP (100%RP). Feces, plasma, red blood cells (RBC), and hair were collected at 8-day intervals for 32 days. A two-pool mixing model was used to back-calculate the proportion of RP from the sample and forage δ13C or δ15N. Feces showed changes in δ13C by 8 days, and the adjusted R2 between predicted and observed RP proportion was 0.81 by 8 days. Plasma, hair, and RBC required more than 32 days to reach equilibrium and therefore were not useful predictors of diet composition during the study. Diets were best represented using fecal δ13C at both 8 and 32 days. By 32 days, fecal δ15N showed promise (R2 = 0.71) for predicting diet composition in C3-C4 diets. Further studies are warranted to corroborate fecal δ15N as a predictor of diet composition in cattle.
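The two-pool mixing model mentioned above is a standard linear end-member calculation. A minimal sketch, with typical C3/C4 δ13C end-member values that are illustrative rather than taken from this study:

```python
# Two-pool (two end-member) isotope mixing model: the fraction of the C3
# forage (rhizoma peanut) in the diet is back-calculated from a sample's
# delta-13C and the delta-13C of the pure C3 and C4 end members.
# End-member values below are typical of C3/C4 plants and are illustrative.

def c3_fraction(delta_sample, delta_c3, delta_c4):
    """Fraction of the C3 source implied by a linear two-pool mixing model."""
    frac = (delta_sample - delta_c4) / (delta_c3 - delta_c4)
    return min(1.0, max(0.0, frac))  # clamp to the physically meaningful range

DELTA_RP = -27.0          # rhizoma peanut (C3), per mil, illustrative
DELTA_BAHIAGRASS = -13.0  # bahiagrass (C4), per mil, illustrative

# A fecal sample exactly halfway between the end members implies a 50:50 diet.
print(c3_fraction(-20.0, DELTA_RP, DELTA_BAHIAGRASS))  # -> 0.5
```

The same arithmetic applies to δ15N once the sample pool (here, feces) has equilibrated with the diet.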


Subject(s)
Diet, Paspalum, Animal Feed/analysis, Animals, Cattle, Diet/veterinary, Feces, Isotopes
7.
J Anim Sci ; 100(3)2022 Mar 01.
Article in English | MEDLINE | ID: mdl-35137106

ABSTRACT

Recently, there has been increased interest in including triticale (×Triticosecale Wittmack) or other winter cereals within forage programs throughout the southwest United States. Our objectives were to screen 14 diverse triticale cultivars for agronomic and nutritive characteristics, with specific emphasis on identifying typical, as well as deviant, responses to calendar date and plant maturity for forages seeded in December and harvested from late February through May at Maricopa, AZ. Fourteen cultivars were established in a randomized complete block design, with each cultivar represented within each of three field blocks. Plots were clean tilled and established on December 18, 2018, and then harvested at 2-wk intervals beginning on February 27 and ending May 23, 2019. Across all harvest dates, forage (N = 315) energy density (NEL) exhibited strong negative correlations with growth stage (r = -0.879), plant height (r = -0.913), head weight (r = -0.814), and estimated dry matter (DM) yield (r = -0.886), but was positively associated with leaf percentage (r = 0.949) and weakly associated with stem percentage (r = 0.138). Through April 10, similar correlations were observed within individual harvest dates (N = 45) for growth stage, leaf percentage, and plant height, but not for stem or head-weight percentages. Within later harvest dates, only sporadic correlations with NEL were observed. Primarily cubic regression relationships for neutral detergent fiber, acid detergent lignin, 30- and 48-h in vitro disappearance of DM and fiber, and NEL were fit for the mean or typical cultivar using both days from February 1 and growth stage as independent variables. Coefficients of determination (R2 ≥ 0.860) in all cases indicated a good fit for the polynomial models. For NEL, deviation from the typical cultivar when days from February 1 was used as the independent regression variable was largely affected by cultivar maturation rate. When growth stage was substituted as the independent variable, plant height, stem percentage beginning at anthesis, and low grain-head percentage were associated with the maximum negative deviant cultivar (Merlin Max). The 0.23 Mcal/kg difference between the maximum positive and negative deviant cultivars at a common late-boot/early-heading stage of growth suggests that some attention should be placed on cultivar selection as well as forage inventory needs and overall cropping goals.


Recently, there has been increased interest in using triticale within forage programs in the southwest United States. Our objectives were to screen 14 triticale cultivars for agronomic and nutritive value characteristics with specific emphasis on identifying typical, as well as deviant, responses to the calendar date and plant maturity. Regression relationships for neutral detergent fiber, acid detergent lignin, 30- and 48-h in vitro disappearance of dry matter and fiber, and net energy of lactation (NEL) were fit for the mean or typical cultivar using both days from February 1 or growth stage at harvest as independent regression variables. Deviant cultivars usually demonstrated rapid or slow maturation rates, which were often accompanied by physical characteristics reflective of advanced or slow maturation, respectively. Overall, there were a limited number of cultivars that deviated from typical with respect to NEL, but the total range in energy density at a common late-boot/early-heading stage of growth (0.23 Mcal/kg) suggests that some attention should be placed on cultivar selection, especially when specific cultivars display atypical growth characteristics, such as greater canopy height. However, either positive or negative deviation with respect to energy density may be desirable depending on the energy needs of the targeted livestock class.


Subject(s)
Triticale, Animals, Dietary Fiber, Digestion, Edible Grain, Nutritive Value, United States
8.
Environ Health Perspect ; 129(6): 67003, 2021 06.
Article in English | MEDLINE | ID: mdl-34160247

ABSTRACT

BACKGROUND: Private wells are an important source of drinking water in Kewaunee County, Wisconsin. Due to the region's fractured dolomite aquifer, these wells are vulnerable to contamination by human and zoonotic gastrointestinal pathogens originating from land-applied cattle manure and private septic systems. OBJECTIVE: We determined the magnitude of the health burden associated with contamination of private wells in Kewaunee County by feces-borne gastrointestinal pathogens. METHODS: This study used data from a year-long countywide pathogen occurrence study as inputs into a quantitative microbial risk assessment (QMRA) to predict the total cases of acute gastrointestinal illness (AGI) caused by private well contamination in the county. Microbial source tracking was used to associate predicted cases of illness with bovine, human, or unknown fecal sources. RESULTS: Private well contamination could be responsible for as many as 301 AGI cases per year in Kewaunee County, with 230 and 12 cases per year associated with bovine and human fecal sources, respectively. Furthermore, Cryptosporidium parvum was predicted to cause 190 cases per year, the most of the 8 pathogens included in the QMRA. DISCUSSION: This study has important implications for land use and water resource management in Kewaunee County and illustrates the potential public health impacts of consuming drinking water produced in similarly vulnerable hydrogeological settings. https://doi.org/10.1289/EHP7815.
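The QMRA workflow summarized above can be sketched as a Monte Carlo simulation over pathogen dose with a dose-response model. Everything below (the exponential model form, the parameter r, and the concentration distribution) is an illustrative placeholder, not the study's actual inputs:

```python
import math
import random

# Exponential dose-response model commonly used in QMRA:
# P(infection | dose) = 1 - exp(-r * dose). The r value is illustrative.

def p_infection(dose, r):
    return 1.0 - math.exp(-r * dose)

def annual_risk(daily_risk, days=365):
    """Annual probability of at least one infection from independent daily exposures."""
    return 1.0 - (1.0 - daily_risk) ** days

def monte_carlo_daily_risk(n_iter, r, conc_mu, conc_sigma, liters_per_day, seed=1):
    """Average daily infection risk over a lognormal distribution of oocyst
    concentrations (oocysts/L); all distribution parameters are hypothetical."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_iter):
        conc = rng.lognormvariate(conc_mu, conc_sigma)  # oocysts per liter
        total += p_infection(conc * liters_per_day, r)
    return total / n_iter

daily = monte_carlo_daily_risk(10_000, r=0.09, conc_mu=-6.0, conc_sigma=1.5,
                               liters_per_day=1.0)
print(annual_risk(daily))
```

Scaling the simulated individual risk by the exposed population yields predicted cases, which microbial source tracking data can then apportion among fecal sources.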


Subject(s)
Cryptosporidiosis, Cryptosporidium, Groundwater, Animals, Calcium Carbonate, Cattle, Magnesium, Risk Assessment, Water Wells, Wisconsin/epidemiology
9.
Environ Health Perspect ; 129(6): 67004, 2021 06.
Article in English | MEDLINE | ID: mdl-34160249

ABSTRACT

BACKGROUND: Groundwater quality in the Silurian dolomite aquifer in northeastern Wisconsin, USA, has become contentious as dairy farms and exurban development expand. OBJECTIVES: We investigated private household wells in the region, determining the extent, sources, and risk factors of nitrate and microbial contamination. METHODS: Total coliforms, Escherichia coli, and nitrate were evaluated by synoptic sampling during groundwater recharge and no-recharge periods. Additional seasonal sampling measured genetic markers of human and bovine fecal-associated microbes and enteric zoonotic pathogens. We constructed multivariable regression models of detection probability (log-binomial) and concentration (gamma) for each contaminant to identify risk factors related to land use, precipitation, hydrogeology, and well construction. RESULTS: Total coliforms and nitrate were strongly associated with depth-to-bedrock at well sites and nearby agricultural land use, but not with septic systems. Both human wastewater and cattle manure contributed to well contamination. Rotavirus group A, Cryptosporidium, and Salmonella were the most frequently detected pathogens. Wells positive for human fecal markers were associated with depth-to-groundwater and the number of septic system drainfields within 229 m. Manure-contaminated wells were associated with groundwater recharge and the area of nearby agricultural land. Wells positive for any fecal-associated microbe, regardless of source, were associated with septic system density and manure storage proximity modified by bedrock depth. Well construction was generally not related to contamination, indicating that land use, groundwater recharge, and bedrock depth were the most important risk factors. DISCUSSION: These findings may inform policies to minimize contamination of the Silurian dolomite aquifer, a major water supply for the U.S. and Canadian Great Lakes region. https://doi.org/10.1289/EHP7813.


Subject(s)
Cryptosporidiosis, Cryptosporidium, Groundwater, Water Pollutants, Chemical, Animals, Calcium Carbonate, Canada, Cattle, Environmental Monitoring, Magnesium, Nitrates/analysis, Risk Factors, Water Pollutants, Chemical/analysis, Water Wells, Wisconsin
10.
Influenza Other Respir Viruses ; 14(5): 479-482, 2020 09.
Article in English | MEDLINE | ID: mdl-32390298

ABSTRACT

We developed and evaluated a model to predict serious outcomes among 243 adults ≥60 years old with medically attended respiratory illness and laboratory-confirmed respiratory syncytial virus (RSV); 47 patients had a serious outcome defined as hospital admission, emergency department (ED) visit, or pneumonia diagnosis. The model used logistic regression with penalized maximum likelihood estimation. The reduced penalized model included age ≥ 75 years, ≥1 ED visit in prior year, crackles/rales, tachypnea, wheezing, new/increased sputum, and new/increased dyspnea. The optimal score cutoff yielded sensitivity and specificity of 66.0% and 81.6%. This prediction model provided moderate utility for identifying older adults with elevated risk of complicated RSV illness.
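Choosing an "optimal score cutoff" of this kind is commonly done by scanning candidate cutoffs and maximizing Youden's index (sensitivity + specificity - 1). A minimal sketch on hypothetical scored patients (the study itself fit a penalized logistic regression model; the one-point-per-predictor score below is a simplification):

```python
# Scan candidate cutoffs of an integer risk score (one point per predictor
# present, as a simplification) and pick the cutoff maximizing
# Youden's J = sensitivity + specificity - 1.
# The scores and outcomes below are hypothetical.

def sens_spec(scores, outcomes, cutoff):
    """Sensitivity and specificity of the rule 'score >= cutoff predicts event'."""
    tp = sum(1 for s, y in zip(scores, outcomes) if s >= cutoff and y == 1)
    fn = sum(1 for s, y in zip(scores, outcomes) if s < cutoff and y == 1)
    tn = sum(1 for s, y in zip(scores, outcomes) if s < cutoff and y == 0)
    fp = sum(1 for s, y in zip(scores, outcomes) if s >= cutoff and y == 0)
    return tp / (tp + fn), tn / (tn + fp)

def best_cutoff(scores, outcomes):
    """Cutoff maximizing Youden's index over the observed score values."""
    candidates = sorted(set(scores))
    return max(candidates,
               key=lambda c: sum(sens_spec(scores, outcomes, c)) - 1.0)

# Hypothetical data: score = number of risk factors present, outcome = serious (1)
scores = [4, 3, 3, 2, 1, 2, 1, 0, 0, 1]
outcomes = [1, 1, 0, 1, 0, 0, 0, 0, 0, 1]

cut = best_cutoff(scores, outcomes)
print(cut, sens_spec(scores, outcomes, cut))
```

In practice the cutoff is chosen on the model's predicted probabilities, but the scan-and-maximize logic is the same.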


Subject(s)
Outpatients/statistics & numerical data, Respiratory Syncytial Virus Infections/complications, Respiratory Syncytial Virus Infections/diagnosis, Seasons, Age Factors, Aged, Emergency Service, Hospital/statistics & numerical data, Humans, Logistic Models, Middle Aged, Pneumonia/diagnosis, Pneumonia/virology, Predictive Value of Tests, Respiratory Syncytial Virus, Human/genetics, Respiratory Syncytial Virus, Human/pathogenicity, Risk Factors, Severity of Illness Index
11.
Water Res ; 178: 115814, 2020 Jul 01.
Article in English | MEDLINE | ID: mdl-32325219

ABSTRACT

Drinking water supply wells can be contaminated by a broad range of waterborne pathogens. However, groundwater assessments frequently measure microbial indicators or a single pathogen type, which provides a limited characterization of potential health risk. This study assessed contamination of wells by testing for viral, bacterial, and protozoan pathogens and fecal markers. Wells supplying groundwater to community and noncommunity public water systems in Minnesota, USA (n = 145) were sampled every other month over one or two years and tested using 23 qPCR assays. Eighteen genetic targets were detected at least once, and microbiological contamination was widespread (96% of 145 wells, 58% of 964 samples). The sewage-associated microbial indicators HF183 and pepper mild mottle virus were detected frequently. Human or zoonotic pathogens were detected in 70% of wells and 21% of samples by qPCR, with Salmonella and Cryptosporidium detected more often than viruses. Samples positive by qPCR for adenovirus (HAdV), enterovirus, or Salmonella were analyzed by culture and for genotype or serotype. qPCR-positive Giardia and Cryptosporidium samples were analyzed by immunofluorescent assay (IFA), and IFA and qPCR concentrations were correlated. Comparisons of indicator and pathogen occurrence at the time of sampling showed that total coliforms, HF183, and Bacteroidales-like HumM2 had high specificity and negative predictive values but generally low sensitivity and positive predictive values. Pathogen-HF183 ratios in sewage have been used to estimate health risks from HF183 concentrations in surface water, but in our groundwater samples Cryptosporidium oocyst:HF183 and HAdV:HF183 ratios were approximately 10,000 times higher than ratios reported for sewage. qPCR measurements provided a robust characterization of microbiological water quality, but interpretation of qPCR data in a regulatory context is challenging because few studies link qPCR measurements to health risk.
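The indicator performance measures reported above (sensitivity, specificity, and predictive values of an indicator for pathogen presence) derive from a 2×2 co-occurrence table. A minimal sketch, with hypothetical counts chosen to mimic the reported pattern of high specificity and negative predictive value but low sensitivity:

```python
# Evaluate a fecal indicator (e.g., HF183) as a predictor of pathogen
# detection in the same sample, from a 2x2 co-occurrence table.
# Counts below are hypothetical, not the study's data.

def indicator_performance(tp, fp, fn, tn):
    """tp: indicator+/pathogen+, fp: indicator+/pathogen-,
    fn: indicator-/pathogen+, tn: indicator-/pathogen-."""
    return {
        "sensitivity": tp / (tp + fn),  # pathogen-positive samples flagged
        "specificity": tn / (tn + fp),  # pathogen-negative samples not flagged
        "ppv": tp / (tp + fp),          # flagged samples that had a pathogen
        "npv": tn / (tn + fn),          # unflagged samples truly pathogen-free
    }

# Hypothetical counts: the indicator misses most pathogen-positive samples
# (low sensitivity) but a negative result is usually reliable (high NPV).
perf = indicator_performance(tp=10, fp=15, fn=90, tn=849)
print(perf)
```

A low-sensitivity indicator is the reason pathogen-positive, indicator-negative samples are expected, which complicates using indicators alone for regulatory screening.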


Subject(s)
Cryptosporidiosis, Cryptosporidium, Groundwater, Animals, Environmental Monitoring, Feces, Humans, Minnesota, Water Microbiology
12.
Pediatr Obes ; 15(1): e12572, 2020 01.
Article in English | MEDLINE | ID: mdl-31595686

ABSTRACT

BACKGROUND: Recent studies suggest children tend to gain the most weight in summer, yet schools are chastised for supporting obesogenic environments. Conclusions about circannual weight gain are hampered by infrequent body mass index (BMI) measurements, and guidance is limited on the optimal timeframe for paediatric weight interventions. OBJECTIVES: This study characterized circannual trends in BMI in Wisconsin children and adolescents and identified sociodemographic differences in excess weight gain. METHODS: An observational design was used, pooling data from 2010 to 2015 to examine circannual BMI z-score trends for Marshfield Clinic patients aged 3 to 17 years. Daily 0.20, 0.50, and 0.80 quantiles of BMI z-score were estimated, stratified by gender, race, and age. RESULTS: BMI z-scores increased from July to September, decreased from October to December, and began another increase-decrease cycle in February. For adolescents, the summer increase in BMI was greater among those in the upper BMI z-score quantile than among those in the lower quantile (+0.15 units vs. +0.04 units); in children, this pattern was reversed. CONCLUSIONS: BMI increased most rapidly in late summer. This growth persisted through autumn in larger adolescents, suggesting weight management support may be beneficial for children who are overweight at the start of the school year.


Subject(s)
Obesity/prevention & control, Weight Gain, Adolescent, Body Mass Index, Child, Child, Preschool, Female, Humans, Male, Seasons
13.
Pediatrics ; 144(6)2019 12.
Article in English | MEDLINE | ID: mdl-31740498

ABSTRACT

BACKGROUND AND OBJECTIVES: Human papillomavirus is the most common sexually transmitted infection in the United States and causes certain anogenital and oropharyngeal cancers. The 9-valent human papillomavirus vaccine (9vHPV) provides protection against additional types not included in the quadrivalent vaccine. We conducted near real-time vaccine safety surveillance for 24 months after the vaccine became available in the Vaccine Safety Datalink. METHODS: Immunizations and adverse events were extracted weekly from October 2015 to October 2017 from standardized data files for persons 9 to 26 years old at 6 Vaccine Safety Datalink sites. Prespecified adverse events included anaphylaxis, allergic reaction, appendicitis, Guillain-Barré syndrome, chronic inflammatory demyelinating polyneuropathy, injection site reaction, pancreatitis, seizure, stroke, syncope, and venous thromboembolism. The observed and expected numbers of events after 9vHPV were compared weekly by using sequential methods. Both historical and concurrent comparison groups were used to identify statistical signals for adverse events. Unexpected signals were investigated by medical record review and/or additional analyses. RESULTS: During 105 weeks of surveillance, 838,991 doses of 9vHPV were administered. We identified unexpected statistical signals for 4 adverse events: appendicitis among boys 9 to 17 years old after dose 3; pancreatitis among men 18 to 26 years old; and allergic reactions among girls 9 to 17 years old and women 18 to 26 years old after dose 2. On further evaluation, which included medical record review, temporal scan analysis, and additional epidemiological analyses, we did not confirm signals for any adverse events. CONCLUSIONS: After 2 years of near real-time surveillance of 9vHPV and several prespecified adverse events, no new safety concerns were identified.


Subject(s)
Adverse Drug Reaction Reporting Systems/trends, Epidemiological Monitoring, Papillomavirus Infections/epidemiology, Papillomavirus Infections/prevention & control, Papillomavirus Vaccines/administration & dosage, Papillomavirus Vaccines/adverse effects, Adolescent, Adult, Appendicitis/chemically induced, Appendicitis/epidemiology, Child, Drug Hypersensitivity/epidemiology, Female, Humans, Male, Pancreatitis/chemically induced, Pancreatitis/epidemiology, United States/epidemiology, Young Adult
14.
Vaccine ; 37(44): 6673-6681, 2019 10 16.
Article in English | MEDLINE | ID: mdl-31540812

ABSTRACT

INTRODUCTION: A recent study reported an association between inactivated influenza vaccine (IIV) and spontaneous abortion (SAB), but only among women who had also been vaccinated in the previous influenza season. We sought to estimate the association between IIV administered in three recent influenza seasons and SAB among women who were and were not vaccinated in the previous influenza season. METHODS: We conducted a case-control study over three influenza seasons (2012-13, 2013-14, 2014-15) in the Vaccine Safety Datalink (VSD). Cases (women with SAB) and controls (women with live births) were matched on VSD site, date of last menstrual period, age group, and influenza vaccination status in the previous influenza season. Of 1908 presumptive cases identified from the electronic record, 1236 were included in the main analysis. Administration of IIV was documented in several risk windows, including 1-28, 29-56, and >56 days before the SAB date. RESULTS: Among 627 matched pairs vaccinated in the previous season, no association was found between vaccination in the 1-28 day risk window and SAB (adjusted odds ratio (aOR) 0.9; 95% confidence interval (CI) 0.6-1.5). The season-specific aOR ranged from 0.5 to 1.7 with all CIs including the null value of 1.0. Similarly, no association was found among women who were not vaccinated in the previous season; the season-specific aOR in the 1-28 day risk window ranged from 0.6 to 0.7 and the 95% CI included 1.0 in each season. There was no association found between SAB and influenza vaccination in the other risk windows, or when vaccine receipt was analyzed relative to date of conception. CONCLUSION: During these seasons we found no association between IIV and SAB, including among women vaccinated in the previous season. These findings lend support to current recommendations for influenza vaccination at any time during pregnancy, including the first trimester.
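For a 1:1 matched case-control design like this, the unadjusted matched odds ratio is estimated from discordant pairs only (the published aORs additionally adjust for covariates, typically via conditional logistic regression). A minimal sketch with hypothetical pair counts:

```python
import math

# Matched-pair odds ratio for a 1:1 matched case-control design.
# b = pairs where the case was vaccinated in the risk window and the control was not
# c = pairs where the control was vaccinated in the risk window and the case was not
# Concordant pairs carry no information about the matched odds ratio.
# The counts below are hypothetical, not the study's data.

def matched_or(b, c):
    """Conditional MLE of the odds ratio from discordant-pair counts."""
    if c == 0:
        raise ValueError("odds ratio undefined with no control-only-exposed pairs")
    return b / c

def matched_or_ci(b, c, z=1.96):
    """Approximate 95% CI on the log scale: se(log OR) = sqrt(1/b + 1/c)."""
    log_or = math.log(b / c)
    se = math.sqrt(1.0 / b + 1.0 / c)
    return (math.exp(log_or - z * se), math.exp(log_or + z * se))

print(matched_or(18, 20))    # OR = 0.9: no evidence of increased risk
print(matched_or_ci(18, 20))
```

A confidence interval spanning 1.0, as here, is the null-result pattern the abstract describes.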


Subject(s)
Abortion, Spontaneous/epidemiology, Abortion, Spontaneous/etiology, Influenza Vaccines/adverse effects, Vaccination/adverse effects, Vaccines, Inactivated/adverse effects, Abortion, Spontaneous/history, Adolescent, Adult, Case-Control Studies, Female, History, 21st Century, Humans, Influenza Vaccines/administration & dosage, Odds Ratio, Public Health Surveillance, Seasons, Vaccination/methods, Vaccines, Inactivated/administration & dosage, Young Adult
15.
Open Forum Infect Dis ; 5(12): ofy316, 2018 Dec.
Article in English | MEDLINE | ID: mdl-30619907

ABSTRACT

BACKGROUND: The epidemiology and burden of respiratory syncytial virus (RSV) illness are not well defined in older adults. METHODS: Adults ≥60 years old seeking outpatient care for acute respiratory illness were recruited from 2004-2005 through 2015-2016 during the winter seasons. RSV was identified from respiratory swabs by multiplex polymerase chain reaction. Clinical characteristics and outcomes were ascertained by interview and medical record abstraction. The incidence of medically attended RSV was estimated for each seasonal cohort. RESULTS: RSV was identified in 243 (11%) of 2257 enrollments (241 of 1832 individuals), including 121 RSV type A and 122 RSV type B. The RSV clinical outcome was serious in 47 (19%), moderate in 155 (64%), and mild in 41 (17%). Serious outcomes included hospital admission (n = 29), emergency department visit (n = 13), and pneumonia (n = 23) and were associated with lower respiratory tract symptoms during the enrollment visit. Moderate outcomes included receipt of a new antibiotic prescription (n = 144; 59%), bronchodilator/nebulizer (n = 45; 19%), or systemic corticosteroids (n = 28; 12%). The relative risk of a serious outcome was significantly increased in persons aged ≥75 years (vs 60-64 years) and in those with chronic obstructive pulmonary disease or congestive heart failure. The average seasonal incidence was 139 cases/10 000, and it was significantly higher in persons with cardiopulmonary disease compared with others (rate ratio, 1.89; 95% confidence interval, 1.44-2.48). CONCLUSIONS: RSV causes substantial outpatient illness with lower respiratory tract involvement. Serious outcomes are common in older patients and those with cardiopulmonary disease.

16.
Vaccine ; 35(40): 5314-5322, 2017 09 25.
Article in English | MEDLINE | ID: mdl-28917295

ABSTRACT

INTRODUCTION: Inactivated influenza vaccine is recommended in any stage of pregnancy, but evidence of safety in early pregnancy is limited, including for vaccines containing A/H1N1pdm2009 (pH1N1) antigen. We sought to determine whether receipt of vaccine containing pH1N1 was associated with spontaneous abortion (SAB). METHODS: We conducted a case-control study over two influenza seasons (2010-11, 2011-12) in the Vaccine Safety Datalink. Cases had SAB and controls had live births or stillbirths; cases and controls were matched on site, date of last menstrual period, and age. Of 919 potential cases identified using diagnosis codes, 485 were eligible and confirmed by medical record review. Exposure was defined as vaccination with inactivated influenza vaccine before the SAB date; the primary exposure window was 1-28 days before the SAB. RESULTS: The overall adjusted odds ratio (aOR) was 2.0 (95% CI, 1.1-3.6) for vaccine receipt in the 28-day exposure window; there was no association in other exposure windows. In season-specific analyses, the aOR in the 1-28 day window was 3.7 (95% CI, 1.4-9.4) in 2010-11 and 1.4 (95% CI, 0.6-3.3) in 2011-12. The association was modified by influenza vaccination in the prior season (post hoc analysis). Among women who received pH1N1-containing vaccine in the previous influenza season, the aOR in the 1-28 day window was 7.7 (95% CI, 2.2-27.3); the aOR was 1.3 (95% CI, 0.7-2.7) among women not vaccinated in the previous season. This effect modification was observed in each season. CONCLUSION: SAB was associated with influenza vaccination in the preceding 28 days. The association was significant only among women vaccinated in the previous influenza season with pH1N1-containing vaccine. This study does not and cannot establish a causal relationship between repeated influenza vaccination and SAB, but further research is warranted.


Subject(s)
Abortion, Spontaneous/etiology , Influenza A Virus, H1N1 Subtype/immunology , Influenza A Virus, H1N1 Subtype/pathogenicity , Influenza Vaccines/adverse effects , Influenza Vaccines/therapeutic use , Influenza, Human/immunology , Influenza, Human/prevention & control , Adult , Case-Control Studies , Female , Humans , Odds Ratio , Pregnancy , Vaccines, Inactivated/adverse effects , Vaccines, Inactivated/immunology , Vaccines, Inactivated/therapeutic use , Young Adult
17.
Environ Health Perspect ; 125(8): 087009, 2017 Aug 16.
Article in English | MEDLINE | ID: mdl-28885976

ABSTRACT

BACKGROUND: Spray irrigation for land-applying livestock manure is increasing in the United States as farms become larger and economies of scale make manure irrigation affordable. Human health risks from exposure to zoonotic pathogens aerosolized during manure irrigation are not well understood. OBJECTIVES: We aimed to a) estimate human health risks due to aerosolized zoonotic pathogens downwind of spray-irrigated dairy manure; and b) determine which factors (e.g., distance, weather conditions) have the greatest influence on risk estimates. METHODS: We sampled downwind air concentrations of manure-borne fecal indicators and zoonotic pathogens during 21 full-scale dairy manure irrigation events at three farms. We fit these data to hierarchical empirical models and used model outputs in a quantitative microbial risk assessment (QMRA) to estimate risk [probability of acute gastrointestinal illness (AGI)] for individuals exposed to spray-irrigated dairy manure containing Campylobacter jejuni, enterohemorrhagic Escherichia coli (EHEC), or Salmonella spp. RESULTS: Median risk estimates from Monte Carlo simulations ranged from 10⁻⁵ to 10⁻² and decreased with distance from the source. Risk estimates for Salmonella or EHEC-related AGI were most sensitive to the assumed level of pathogen prevalence in dairy manure, while risk estimates for C. jejuni were not sensitive to any single variable. Airborne microbe concentrations were negatively associated with distance and positively associated with wind speed, both of which were retained in models as a significant predictor more often than relative humidity, solar irradiation, or temperature. CONCLUSIONS: Our model-based estimates suggest that reducing pathogen prevalence and concentration in source manure would reduce the risk of AGI from exposure to manure irrigation, and that increasing the distance from irrigated manure (i.e., setbacks) and limiting irrigation to times of low wind speed may also reduce risk.
https://doi.org/10.1289/EHP283.
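A QMRA of this kind samples uncertain inputs (inhaled dose, pathogen concentration) and summarizes the distribution of per-event risk. A toy Monte Carlo sketch with purely illustrative parameter values (not the study's fitted models), using an exponential dose-response model P = 1 - exp(-r * dose):

```python
import math
import random
import statistics

def qmra_median_risk(n=10_000, seed=1):
    """Monte Carlo QMRA sketch: sample a lognormal inhaled dose and push it
    through an exponential dose-response model. All parameters are
    illustrative assumptions, not values from the study."""
    random.seed(seed)
    r = 5e-5  # illustrative dose-response parameter (per organism)
    risks = []
    for _ in range(n):
        # lognormal dose in organisms per exposure event (illustrative)
        dose = random.lognormvariate(3.0, 1.5)
        risks.append(1 - math.exp(-r * dose))
    return statistics.median(risks)

print(f"median per-event AGI risk ≈ {qmra_median_risk():.1e}")
```

Because the dose-response function is monotone, the median risk tracks the median dose; in a fuller analysis, parameters such as r would themselves be sampled from uncertainty distributions.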


Subject(s)
Agricultural Irrigation/methods , Dairying , Manure/microbiology , Models, Theoretical , Risk Assessment
18.
Hydrogeol J ; 25(4): 903-919, 2017 Jun.
Article in English | MEDLINE | ID: mdl-30245581

ABSTRACT

Groundwater quality is often evaluated using microbial indicators. This study examines data from 12 international groundwater studies (conducted 1992-2013) of 718 public drinking-water systems located in a range of hydrogeological settings. The focus was on testing the value of indicator organisms for identifying virus-contaminated wells. One or more indicators and viruses were present in 37% and 15% of 2,273 samples and in 44% and 27% of 746 wells, respectively. Escherichia coli (E. coli) and somatic coliphage are 7-9 times more likely to be associated with culturable virus-positive samples when the indicator is present versus when it is absent, while F-specific and somatic coliphages are 8-9 times more likely to be associated with culturable virus-positive wells. However, single indicators are only marginally associated with viruses detected by molecular methods, and all microbial indicators have low sensitivity and positive predictive values for virus occurrence, whether by culturable or molecular assays; i.e., indicators are often absent when viruses are present, and the indicators have a high false-positive rate. Wells were divided into three susceptibility subsets based on presence of (1) total coliform bacteria or (2) multiple indicators, or (3) location of wells in karst, fractured bedrock, or gravel/cobble settings. Better associations of some indicators with viruses were observed for (1) and (3). Findings indicate the best indicators are E. coli or somatic coliphage, although both indicators may underestimate virus occurrence. Repeat sampling for indicators improves evaluation of the potential for viral contamination in a well.
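Sensitivity and positive predictive value, treating virus detection as the truth standard and the indicator as the "test", are simple ratios over co-occurrence counts. A sketch with hypothetical counts (not the study's data) illustrating the low-sensitivity, high-false-positive pattern described above:

```python
def sensitivity_ppv(tp, fp, fn):
    """Sensitivity = TP/(TP+FN); positive predictive value = TP/(TP+FP),
    where TP = indicator+ & virus+, FP = indicator+ & virus-,
    FN = indicator- & virus+ (virus detection taken as truth)."""
    sensitivity = tp / (tp + fn)
    ppv = tp / (tp + fp)
    return sensitivity, ppv

# Hypothetical sample counts for illustration only:
# 60 indicator+/virus+, 780 indicator+/virus-, 280 indicator-/virus+
sens, ppv = sensitivity_ppv(60, 780, 280)
print(f"sensitivity {sens:.2f}, PPV {ppv:.2f}")
```

With these made-up counts the indicator misses most virus-positive samples (low sensitivity) and most indicator-positive samples are virus-negative (low PPV), which is the qualitative behavior the abstract reports.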

19.
Open Forum Infect Dis ; 3(2): ofw081, 2016 Apr.
Article in English | MEDLINE | ID: mdl-27419158

ABSTRACT

Background. Respiratory syncytial virus (RSV) and influenza are significant causes of seasonal respiratory illness in children. The incidence of influenza and RSV hospitalization is well documented, but the incidence of medically attended, laboratory-confirmed illness has not been assessed in a well-defined community cohort. Methods. Children aged 6-59 months with medically attended acute respiratory illness were prospectively enrolled during the 2006-2007 through 2009-2010 influenza seasons in a Wisconsin community cohort. Nasal swabs were tested for RSV and influenza by multiplex reverse-transcription polymerase chain reaction. The population incidence of medically attended RSV and influenza was estimated separately and standardized to weeks 40 through 18 of each season. Results. The cohort included 2800-3073 children each season. There were 2384 children enrolled with acute respiratory illness; 627 (26%) were positive for RSV and 314 (13%) for influenza. The mean age was 28 months (standard deviation [SD] = 15) for RSV-positive and 38 months (SD = 16) for influenza-positive children. Seasonal incidence (cases per 10 000) was 1718 (95% confidence interval [CI], 1602-1843) for RSV and 768 (95% CI, 696-848) for influenza. Respiratory syncytial virus incidence was highest among children aged 6-11 months (2927) and 12-23 months (2377). Influenza incidence was highest (850) in children 24-59 months old. The incidence of RSV was higher than influenza across all seasons and age groups. Conclusions. The incidence of medically attended RSV was highest in children 6-23 months old, and it was consistently higher than influenza. The burden of RSV remains high throughout the first 2 years of life.
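A crude (unstandardized) seasonal incidence proportion with a Wilson score 95% CI can be computed directly from case and cohort counts. Note that the study's estimates were standardized to weeks 40-18 of each season, so this simplified calculation with illustrative inputs will not reproduce them:

```python
import math

def incidence_per_10k_ci(cases, population, z=1.96):
    """Crude incidence proportion per 10,000 with a Wilson score 95% CI."""
    p = cases / population
    denom = 1 + z**2 / population
    center = (p + z**2 / (2 * population)) / denom
    half = (z * math.sqrt(p * (1 - p) / population
                          + z**2 / (4 * population**2)) / denom)
    return 10_000 * p, 10_000 * (center - half), 10_000 * (center + half)

# Illustrative only: 627 RSV cases over ~12,000 child-seasons of follow-up.
rate, lo, hi = incidence_per_10k_ci(627, 12_000)
print(f"crude incidence {rate:.0f}/10,000 (95% CI {lo:.0f}-{hi:.0f})")
```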

20.
Open Forum Infect Dis ; 2(3): ofv100, 2015 Sep.
Article in English | MEDLINE | ID: mdl-26258157

ABSTRACT

We conducted a double-blind, randomized trial of 134 outpatients with polymerase chain reaction-confirmed influenza to assess the effects of oseltamivir initiated 48-119 hours after illness onset. Oseltamivir treatment did not reduce illness duration, severity, or duration of virus detection. However, the power of this study was limited by lower-than-expected enrollment.
