Results 1 - 20 of 49
1.
Am J Epidemiol ; 2024 Jun 17.
Article in English | MEDLINE | ID: mdl-38885957

ABSTRACT

Studies of SARS-CoV-2 incidence are important for responding to continued transmission and future pandemics. We followed a rural community cohort with broad age representation with active surveillance for SARS-CoV-2 identification from November 2020 through July 2022. Participants provided serum specimens at regular intervals and following SARS-CoV-2 infection or vaccination. We estimated the incidence of SARS-CoV-2 infection identified by study RT-PCR, electronic health record documentation or self-report of a positive test, or serology. We also estimated the seroprevalence of SARS-CoV-2 spike and nucleocapsid antibodies measured by ELISA. Overall, 65% of the cohort had ≥1 SARS-CoV-2 infection by July 2022, and 19% of those with primary infection were reinfected. Infection and vaccination contributed to high seroprevalence: 98% (95% CI: 95%, 99%) of participants were spike or nucleocapsid seropositive at the end of follow-up. Among those seropositive, 82% were vaccinated. Participants were more likely to be seropositive to spike than nucleocapsid following infection. Infection among seropositive individuals could be identified by increases in nucleocapsid, but not spike, ELISA optical density values. Nucleocapsid antibodies waned more quickly after infection than spike antibodies. The high levels of SARS-CoV-2 population immunity found in this study are changing the epidemiology, necessitating ongoing surveillance and policy evaluation.

2.
J Water Health ; 21(9): 1209-1227, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37756190

ABSTRACT

Through a community intervention in 14 non-disinfecting municipal water systems, we quantified sporadic acute gastrointestinal illness (AGI) attributable to groundwater. Ultraviolet (UV) disinfection was installed on all supply wells of intervention communities. In control communities, residents continued to drink non-disinfected groundwater. Intervention and control communities switched treatments by moving UV disinfection units at the study midpoint (crossover design). Study participants (n = 1,659) completed weekly health diaries during four 12-week surveillance periods. Water supply wells were analyzed monthly for enteric pathogenic viruses. Using the crossover design, we did not observe groundwater-borne AGI. However, virus types and quantities in supply wells changed throughout the study, suggesting that exposure was not constant. Alternatively, we compared AGI incidence between intervention and control communities within the same surveillance period. During Period 1, norovirus contaminated wells, and the AGI risk attributable to well water was 19% (95% CI, -4%, 36%) for children <5 years and 15% (95% CI, -9%, 33%) for adults. During Period 3, echovirus 11 contaminated wells and UV disinfection slightly reduced AGI in adults. Estimates of the AGI risk attributable to drinking non-disinfected groundwater were highly variable but appeared greatest when supply wells were contaminated with specific AGI-etiologic viruses.


Subjects
Drinking Water , Groundwater , Adult , Child , Humans , Water Supply , Disinfection , Enterovirus B, Human
3.
Hydrogeol J ; 25(4): 903-919, 2017 Jun.
Article in English | MEDLINE | ID: mdl-30245581

ABSTRACT

Groundwater quality is often evaluated using microbial indicators. This study examines data from 12 international groundwater studies (conducted 1992-2013) of 718 public drinking-water systems located in a range of hydrogeological settings. The focus was on testing the value of indicator organisms for identifying virus-contaminated wells. One or more indicators and viruses were present in 37% and 15% of 2,273 samples and in 44% and 27% of 746 wells, respectively. Escherichia coli (E. coli) and somatic coliphage are 7-9 times more likely to be associated with culturable virus-positive samples when the indicator is present versus when it is absent, while F-specific and somatic coliphages are 8-9 times more likely to be associated with culturable virus-positive wells. However, single indicators are only marginally associated with viruses detected by molecular methods, and all microbial indicators have low sensitivity and positive predictive values for virus occurrence, whether by culturable or molecular assays; i.e., indicators are often absent when viruses are present, and the indicators have a high false-positive rate. Wells were divided into three susceptibility subsets based on (1) presence of total coliform bacteria, (2) presence of multiple indicators, or (3) location of wells in karst, fractured bedrock, or gravel/cobble settings. Better associations of some indicators with viruses were observed for (1) and (3). Findings indicate the best indicators are E. coli or somatic coliphage, although both may underestimate virus occurrence. Repeat sampling for indicators improves evaluation of the potential for viral contamination in a well.

4.
J Infect Dis ; 211(10): 1529-40, 2015 May 15.
Article in English | MEDLINE | ID: mdl-25406334

ABSTRACT

BACKGROUND: During the 2012-2013 influenza season, there was cocirculation of influenza A(H3N2) and 2 influenza B lineage viruses in the United States. METHODS: Patients with acute cough illness for ≤7 days were prospectively enrolled and had swab samples obtained at outpatient clinics in 5 states. Influenza vaccination dates were confirmed by medical records. The vaccine effectiveness (VE) was estimated as [100% × (1 - adjusted odds ratio)] for vaccination in cases versus test-negative controls. RESULTS: Influenza was detected in 2307 of 6452 patients (36%); 1292 (56%) had influenza A(H3N2), 582 (25%) had influenza B/Yamagata, and 303 (13%) had influenza B/Victoria. VE was 49% (95% confidence interval [CI], 43%-55%) overall, 39% (95% CI, 29%-47%) against influenza A(H3N2), 66% (95% CI, 58%-73%) against influenza B/Yamagata (vaccine lineage), and 51% (95% CI, 36%-63%) against influenza B/Victoria. VE against influenza A(H3N2) was highest among persons aged 50-64 years (52%; 95% CI, 33%-65%) and persons aged 6 months-8 years (51%; 95% CI, 32%-64%) and lowest among persons aged ≥65 years (11%; 95% CI, -41% to 43%). In younger age groups, there was evidence of residual protection from receipt of the 2011-2012 vaccine 1 year earlier. CONCLUSIONS: The 2012-2013 vaccines were moderately effective in most age groups. Cross-lineage protection and residual effects from prior vaccination were observed and warrant further investigation.
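The test-negative vaccine effectiveness calculation above, VE = 100% × (1 - adjusted odds ratio), can be sketched as follows. The counts in the example are hypothetical, and the published estimates were additionally adjusted for confounders via logistic regression rather than computed from a crude 2×2 table:

```python
# Test-negative design: VE = 100% x (1 - odds ratio), where the odds ratio
# compares the odds of vaccination among test-positive cases versus
# test-negative controls.

def vaccine_effectiveness(vacc_cases, unvacc_cases, vacc_controls, unvacc_controls):
    """Return crude VE (%) from a 2x2 table of vaccination by case status."""
    odds_ratio = (vacc_cases / unvacc_cases) / (vacc_controls / unvacc_controls)
    return 100 * (1 - odds_ratio)

# Hypothetical counts: 400/1000 cases vaccinated, 560/1000 controls vaccinated.
ve = vaccine_effectiveness(400, 600, 560, 440)
print(round(ve, 1))  # 47.6
```

A VE of 0% corresponds to an odds ratio of 1 (vaccination equally common in cases and controls); negative VE is possible when cases are more often vaccinated than controls.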


Subjects
Influenza Vaccines/administration & dosage , Influenza Vaccines/immunology , Influenza, Human/prevention & control , Orthomyxoviridae/immunology , Orthomyxoviridae/isolation & purification , Adolescent , Adult , Age Factors , Aged , Aged, 80 and over , Child , Child, Preschool , Cross Protection , Female , Humans , Infant , Influenza, Human/immunology , Male , Middle Aged , Treatment Outcome , United States , Young Adult
5.
Clin Med Res ; 13(3-4): 103-11, 2015 Dec.
Article in English | MEDLINE | ID: mdl-25487238

ABSTRACT

OBJECTIVE: In this study, health event capture is broadly defined as the degree to which a group of people use a particular provider network as their primary source of health care services. The Marshfield Epidemiologic Study Area (MESA) is a valuable resource for population-based health research, but the completeness of health event capture has not been validated in recent years. Our objective was to determine the current level of outpatient and inpatient health event capture by Marshfield Clinic (MC) facilities and affiliated hospitals for people living within MESA. DESIGN: A stratified sample survey with strata defined by MESA region (Central or North) and age group (<18 years or ≥18 years). SETTING: 24 ZIP codes in central and northern Wisconsin, USA. PARTICIPANTS: 2,485 of the 4,313 sampled cohort members residing in MESA Central (N=61,041) and MESA North (N=25,906) on February 22, 2011 participated. METHODS: A health care utilization survey was mailed to a random sample stratified by age group and MESA region. Telephone interviews were attempted for nonrespondents. The survey requested information on sources of outpatient care and overnight hospital admissions. Population proportions representing health event capture metrics and corresponding 95% confidence intervals (CI) were estimated with analytic weights applied to account for the survey design. RESULTS: Among those with an outpatient visit during the past 24 months, the most recent visit of an estimated 93% (95% CI, 91% - 94%) was at a MC facility. The most recent admission of an estimated 93% (95% CI, 90% - 96%) of those hospitalized in the past 24 months was at a hospital affiliated with MC. The proportion admitted to MC-affiliated hospitals was higher for residents of MESA Central (97%) compared to MESA North (83%). CONCLUSION: A high proportion of outpatient visits and inpatient admissions in MESA Central and MESA North are accessible in the MC electronic health record.
This pattern of high health event capture has been demonstrated since the inception of MESA in 1991. The results from this study validate and support the continued use of MESA for population-based epidemiologic and clinical research.


Subjects
Ambulatory Care , Delivery of Health Care , Patient Admission , Age Factors , Female , Humans , Male , Wisconsin
6.
J Infect Dis ; 207(8): 1262-9, 2013 Apr 15.
Article in English | MEDLINE | ID: mdl-23341536

ABSTRACT

BACKGROUND: The 2009 influenza A virus subtype H1N1 (A[H1N1]pdm09) did not exhibit antigenic drift during the 2010-2011 influenza season, providing an opportunity to investigate the duration of protection after vaccination. We estimated the independent effects of 2010-2011 seasonal trivalent inactivated influenza vaccine (TIV) and A(H1N1)pdm09 vaccine for preventing medically attended influenza A virus infection during the 2010-2011 season. METHODS: Individuals were tested for influenza A virus by real-time reverse transcription polymerase chain reaction (rRT-PCR) after a clinical encounter for acute respiratory illness. Case-control analyses compared participants with rRT-PCR-confirmed influenza A virus infection and test-negative controls. Vaccine effectiveness was estimated separately for monovalent pandemic vaccine and TIV and was calculated as 100 × [1 - adjusted odds ratio], where the odds ratio was adjusted for potential confounders. RESULTS: The effectiveness of TIV against influenza A virus infection was 63% (95% confidence interval [CI], 37%-78%). The effectiveness of TIV against A(H1N1)pdm09 infection was 77% (95% CI, 44%-90%). Monovalent vaccine administered between October 2009 and April 2010 was not protective during the 2010-2011 season, with an effectiveness of -1% (95% CI, -146% to 59%) against A(H1N1)pdm09 infection. CONCLUSIONS: Monovalent vaccine provided no sustained protection against A(H1N1)pdm09 infection during the 2010-2011 season. This waning effectiveness supports the need for annual revaccination, even in the absence of antigenic drift in A(H1N1)pdm09.


Subjects
Influenza A Virus, H1N1 Subtype/pathogenicity , Influenza Vaccines/therapeutic use , Influenza, Human/prevention & control , Pandemics/prevention & control , Adolescent , Adult , Case-Control Studies , Child , Female , Humans , Influenza A Virus, H1N1 Subtype/immunology , Influenza Vaccines/immunology , Influenza, Human/epidemiology , Influenza, Human/immunology , Influenza, Human/virology , Male , Middle Aged , Seasons , Wisconsin/epidemiology , Young Adult
7.
Water Res ; 252: 121242, 2024 Mar 15.
Article in English | MEDLINE | ID: mdl-38342066

ABSTRACT

Water reuse is a growing global reality. In regulating water reuse, viruses have come to the fore as key pathogens due to high shedding rates, low infectious doses, and resilience to traditional wastewater treatments. To demonstrate the high log reductions required by emerging water reuse regulations, cost and practicality necessitate surrogates for viruses for use as challenge organisms in unit process evaluation and monitoring. Bacteriophage surrogates that are mitigated to the same or lesser extent than viruses of concern are routinely used for individual unit process testing. However, the behavior of these surrogates over a multi-barrier treatment train typical of water reuse has not been well-established. Toward this aim, we performed a meta-analysis of log reductions of common bacteriophage surrogates for five treatment processes typical of water reuse treatment trains: advanced oxidation processes, chlorination, membrane filtration, ozonation, and ultraviolet (UV) disinfection. Robust linear regression was applied to identify a range of doses consistent with a given log reduction of bacteriophages and viruses of concern for each treatment process. The results were used to determine relative conservatism of surrogates. We found that no one bacteriophage was a representative or conservative surrogate for viruses of concern across all multi-barrier treatments (encompassing multiple mechanisms of virus mitigation). Rather, a suite of bacteriophage surrogates provides both a representative range of inactivation and information about the effectiveness of individual processes within a treatment train. Based on the abundance of available data and diversity of virus treatability using these five key water reuse treatment processes, bacteriophages MS2, phiX174, and Qbeta were recommended as a core suite of surrogates for virus challenge testing.


Subjects
Bacteriophages , Water Purification , Water , Bacteriophage phi X 174 , Water Purification/methods , Disinfection/methods , Levivirus
8.
J Environ Qual ; 52(2): 270-286, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36479898

ABSTRACT

Antimicrobial resistance is a growing public health problem that requires an integrated approach among human, agricultural, and environmental sectors. However, few studies address all three components simultaneously. We investigated the occurrence of five antibiotic resistance genes (ARGs) and the class 1 integron gene (intI1) in private wells drawing water from a vulnerable aquifer influenced by residential septic systems and land-applied dairy manure. Samples (n = 138) were collected across four seasons from a randomized sample of private wells in Kewaunee County, Wisconsin. Measurements of ARGs and intI1 were related to microbial source tracking (MST) markers specific to human and bovine feces; they were also related to 54 risk factors for contamination representing land use, rainfall, hydrogeology, and well construction. ARGs and intI1 occurred in 5%-40% of samples depending on target. Detection frequencies for ARGs and intI1 were lowest in the absence of human and bovine MST markers (1%-30%), highest when co-occurring with human and bovine markers together (11%-78%), and intermediate when co-occurring with just one type of MST marker (4%-46%). Gene targets were associated with septic system density more often than agricultural land, potentially because of the variable presence of manure on the landscape. Determining ARG prevalence in a rural setting with mixed land use allowed an assessment of the relative contribution of human and bovine fecal sources. Because fecal sources co-occurred with ARGs at similar rates, interventions intended to reduce ARG occurrence may be most effective if both sources are considered.


Subjects
Anti-Bacterial Agents , Manure , Animals , Humans , Cattle , Anti-Bacterial Agents/pharmacology , Livestock , Feces , Drug Resistance, Microbial/genetics
9.
Environ Sci Technol ; 46(17): 9299-307, 2012 Sep 04.
Article in English | MEDLINE | ID: mdl-22839570

ABSTRACT

Acute gastrointestinal illness (AGI) resulting from pathogens directly entering the piping of drinking water distribution systems is insufficiently understood. Here, we estimate AGI incidence from virus intrusions into the distribution systems of 14 nondisinfecting, groundwater-source, community water systems. Water samples for virus quantification were collected monthly at wells and households during four 12-week periods in 2006-2007. Ultraviolet (UV) disinfection was installed on the communities' wellheads during one study year; UV was absent the other year. UV was intended to eliminate virus contributions from the wells and without residual disinfectant present in these systems, any increase in virus concentration downstream at household taps represented virus contributions from the distribution system (Approach 1). During no-UV periods, distribution system viruses were estimated by the difference between well water and household tap virus concentrations (Approach 2). For both approaches, a Monte Carlo risk assessment framework was used to estimate AGI risk from distribution systems using study-specific exposure-response relationships. Depending on the exposure-response relationship selected, AGI risk from the distribution systems was 0.0180-0.0661 and 0.001-0.1047 episodes/person-year estimated by Approaches 1 and 2, respectively. These values represented 0.1-4.9% of AGI risk from all exposure routes, and 1.6-67.8% of risk related to drinking water exposure. Virus intrusions into nondisinfected drinking water distribution systems can contribute to sporadic AGI.
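The Monte Carlo risk-assessment framing described above can be illustrated with a minimal sketch: sample a virus concentration at the tap, convert the daily ingested dose to an infection probability with an exponential dose-response model, and roll daily risk up to an annual estimate. Every parameter here (the dose-response coefficient r, the concentration range, 1 L/day consumption, 350 exposure days) is an illustrative placeholder, not the study-specific exposure-response relationships:

```python
import math
import random

def mean_annual_risk(r=0.02, n_sims=10_000, days=350, seed=1):
    """Mean annual probability of >=1 infection across Monte Carlo iterations."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sims):
        conc = 10 ** rng.uniform(-3, 0)   # viruses/L, log-uniform (hypothetical range)
        p_day = 1 - math.exp(-r * conc)   # exponential dose-response, assuming 1 L/day
        total += 1 - (1 - p_day) ** days  # chance of at least one infection per year
    return total / n_sims
```

Illness risk would then follow by multiplying infection risk by a probability of illness given infection; the published estimates used study-specific distributions for each step.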


Subjects
Drinking Water/adverse effects , Drinking Water/virology , Gastrointestinal Diseases/etiology , Gastrointestinal Diseases/virology , Virus Diseases/etiology , Adult , Child , Disinfection/methods , Gastrointestinal Tract/virology , Humans , Incidence , Risk Assessment , Ultraviolet Rays , Virus Diseases/virology , Water Purification/methods
10.
J Anim Sci ; 100(3)2022 Mar 01.
Article in English | MEDLINE | ID: mdl-35137106

ABSTRACT

Recently, there has been increased interest in including triticale (X Triticosecale Wittmack) or other winter cereals within forage programs throughout the southwest United States. Our objectives were to screen 14 diverse triticale cultivars for agronomic and nutritive characteristics, with specific emphasis on identifying normal, as well as deviant, responses to calendar date and plant maturity for forages seeded in December and harvested from late February through May at Maricopa, AZ. Fourteen cultivars were established in a randomized complete block design with each cultivar represented within each of three field blocks. Plots were clean tilled and established on December 18, 2018, and then harvested at 2-wk intervals beginning on February 27 and ending May 23, 2019. Across all harvest dates, forage (N = 315) energy density (net energy of lactation, NEL) exhibited strong negative correlations with growth stage (r = -0.879), plant height (r = -0.913), head weight (r = -0.814), and estimated dry matter (DM) yield (r = -0.886) but was positively associated with leaf percentage (r = 0.949) and weakly associated with stem percentage (r = 0.138). Through April 10, similar correlations were observed within individual harvest dates (N = 45) for growth stage, leaf percentage, and plant height, but not for stem or head-weight percentages. Within later harvest dates, only sporadic correlations with NEL were observed. Primarily cubic regression relationships for neutral detergent fiber, acid detergent lignin, 30- and 48-h in vitro disappearance of DM and fiber, and NEL were fit for the mean or typical cultivar using both days from February 1 and growth stage as independent variables. Coefficients of determination (R2 ≥ 0.860) in all cases indicated a good fit for the polynomial models. For NEL, deviation from the typical cultivar when days from February 1 was used as the independent regression variable was largely affected by cultivar maturation rate.
When the growth stage was substituted as the independent variable, plant height, stem percentage beginning at anthesis, and low grain-head percentage were associated with the maximum negative deviant cultivar (Merlin Max). The 0.23 Mcal/kg difference between maximum positive and negative deviant cultivars at a common late-boot/early-heading stage of growth suggests that some attention should be placed on cultivar selection as well as forage inventory needs and overall cropping goals.


Recently, there has been increased interest in using triticale within forage programs in the southwest United States. Our objectives were to screen 14 triticale cultivars for agronomic and nutritive value characteristics with specific emphasis on identifying typical, as well as deviant, responses to the calendar date and plant maturity. Regression relationships for neutral detergent fiber, acid detergent lignin, 30- and 48-h in vitro disappearance of dry matter and fiber, and net energy of lactation (NEL) were fit for the mean or typical cultivar using both days from February 1 or growth stage at harvest as independent regression variables. Deviant cultivars usually demonstrated rapid or slow maturation rates, which were often accompanied by physical characteristics reflective of advanced or slow maturation, respectively. Overall, there were a limited number of cultivars that deviated from typical with respect to NEL, but the total range in energy density at a common late-boot/early-heading stage of growth (0.23 Mcal/kg) suggests that some attention should be placed on cultivar selection, especially when specific cultivars display atypical growth characteristics, such as greater canopy height. However, either positive or negative deviation with respect to energy density may be desirable depending on the energy needs of the targeted livestock class.


Subjects
Triticale , Animals , Dietary Fiber , Digestion , Edible Grain , Nutritive Value , United States
11.
Hum Vaccin Immunother ; 18(7): 2159215, 2022 12 30.
Article in English | MEDLINE | ID: mdl-36577134

ABSTRACT

The safety of 9-valent HPV vaccine (9vHPV) has been established with regard to common and uncommon adverse events. However, investigation of rare and severe adverse events requires extended study periods to capture rare outcomes. This observational cohort study investigated the occurrence of three rare and serious adverse events following 9-valent human papillomavirus (9vHPV) vaccination compared to other vaccinations, in US individuals 9-26 years old, using electronic health record data from the Vaccine Safety Datalink (VSD). We searched for occurrences of Guillain-Barré syndrome (GBS), chronic inflammatory demyelinating polyneuropathy (CIDP), and stroke following 9vHPV vaccination from October 4, 2015, through January 2, 2021. We compared the risks of GBS, CIDP, and stroke following 9vHPV vaccination to risks of those outcomes following comparator vaccines commonly given to this age group (Td, Tdap, MenACWY, hepatitis A, and varicella vaccines) from January 1, 2007, through January 2, 2021. We observed 1.2 cases of stroke, 0.3 cases of GBS, and 0.1 cases of CIDP per 100,000 doses of 9vHPV vaccine. After observing more than 1.8 million doses of 9vHPV, we identified no statistically significant increase in risks associated with 9vHPV vaccination for any of these adverse events, either combined or stratified by age (9-17 years of age vs. 18-26 years of age) and sex (males vs. females). Our findings provide additional evidence supporting 9vHPV vaccine safety, over longer time frames and for more serious and rare adverse events.


Subjects
Papillomavirus Infections , Papillomavirus Vaccines , Polyradiculoneuropathy, Chronic Inflammatory Demyelinating , Adolescent , Adult , Child , Female , Humans , Male , Young Adult , Human Papillomavirus Viruses , Papillomavirus Infections/epidemiology , Papillomavirus Vaccines/administration & dosage , Papillomavirus Vaccines/adverse effects , Polyradiculoneuropathy, Chronic Inflammatory Demyelinating/chemically induced , Vaccination/adverse effects
12.
Sci Rep ; 12(1): 17138, 2022 10 13.
Article in English | MEDLINE | ID: mdl-36229636

ABSTRACT

Stable isotopes are useful for estimating livestock diet selection. The objective was to compare δ13C and δ15N for estimating the diet proportion of C3-C4 forages when steers (Bos spp.) were fed varying proportions of rhizoma peanut (Arachis glabrata; RP; C3) and bahiagrass (Paspalum notatum; C4). Treatments were proportions of RP with bahiagrass hay: 100% bahiagrass (0%RP); 25% RP + 75% bahiagrass (25%RP); 50% RP + 50% bahiagrass (50%RP); 75% RP + 25% bahiagrass (75%RP); and 100% RP (100%RP). Feces, plasma, red blood cells (RBC), and hair were collected at 8-day intervals for 32 days. A two-pool mixing model was used to back-calculate the proportion of RP based on the sample and forage δ13C or δ15N. Feces showed changes in δ13C by 8 days, and the adj. R2 between predicted and observed RP proportion was 0.81 by 8 days. Plasma, hair, and RBC required more than 32 days to reach equilibrium and therefore were not useful predictors of diet composition during the study. Diets were best represented using fecal δ13C at both 8 and 32 days. By 32 days, fecal δ15N showed promise (R2 = 0.71) for predicting diet composition in C3-C4 diets. Further studies are warranted to corroborate fecal δ15N as a predictor of diet composition in cattle.
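The two-pool mixing model amounts to linear interpolation between the two forage end-member isotope signatures. The δ13C end-member values below are generic C3 (~-27‰) and C4 (~-13‰) plant values chosen for illustration, not the study's measured forages:

```python
# Two-pool mixing model: the sample's delta-13C is a linear mix of the two
# forage end-members, so the RP (C3) diet fraction can be back-calculated.

def rp_fraction(delta_sample, delta_rp=-27.0, delta_bahiagrass=-13.0):
    """Fraction of diet from rhizoma peanut given a fecal delta-13C value."""
    return (delta_sample - delta_bahiagrass) / (delta_rp - delta_bahiagrass)

# A fecal sample at -20 per mil implies a 50% RP diet under these end-members.
print(rp_fraction(-20.0))  # 0.5
```

In practice the end-members come from the measured forage signatures, and a sample-specific trophic discrimination offset may be applied before solving.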


Subjects
Diet , Paspalum , Animal Feed/analysis , Animals , Cattle , Diet/veterinary , Feces , Isotopes
13.
J Water Health ; 9(4): 799-812, 2011 Dec.
Article in English | MEDLINE | ID: mdl-22048438

ABSTRACT

We tested the association of common events in drinking water distribution systems with contamination of household tap water with human enteric viruses. Viruses were enumerated by qPCR in the tap water of 14 municipal systems that use non-disinfected groundwater. Ultraviolet disinfection was installed at all active wellheads to reduce virus contributions from groundwater to the distribution systems. As no residual disinfectant was added to the water, any increase in virus levels measured downstream at household taps would be indicative of distribution system intrusions. Utility operators reported events through written questionnaires. Virus outcome measures were related to distribution system events using binomial and gamma regression. Virus concentrations were elevated in the wells, reduced or eliminated by ultraviolet disinfection, and elevated again in distribution systems, showing that viruses were, indeed, directly entering the systems. Pipe installation was significantly associated with higher virus levels, whereas hydrant flushing was significantly associated with lower virus levels. Weak positive associations were observed for water tower maintenance, valve exercising, and cutting open a water main. Coliform bacteria detections from routine monitoring were not associated with viruses. Understanding when distribution systems are most vulnerable to virus contamination, and taking precautionary measures, will ensure delivery of safe drinking water.


Subjects
Enterovirus/isolation & purification , Water Microbiology , Water Supply/standards , Humans , Sanitary Engineering , Wisconsin
14.
Environ Health Perspect ; 129(6): 67003, 2021 06.
Article in English | MEDLINE | ID: mdl-34160247

ABSTRACT

BACKGROUND: Private wells are an important source of drinking water in Kewaunee County, Wisconsin. Due to the region's fractured dolomite aquifer, these wells are vulnerable to contamination by human and zoonotic gastrointestinal pathogens originating from land-applied cattle manure and private septic systems. OBJECTIVE: We determined the magnitude of the health burden associated with contamination of private wells in Kewaunee County by feces-borne gastrointestinal pathogens. METHODS: This study used data from a year-long countywide pathogen occurrence study as inputs into a quantitative microbial risk assessment (QMRA) to predict the total cases of acute gastrointestinal illness (AGI) caused by private well contamination in the county. Microbial source tracking was used to associate predicted cases of illness with bovine, human, or unknown fecal sources. RESULTS: Results suggest that private well contamination could be responsible for as many as 301 AGI cases per year in Kewaunee County, and that 230 and 12 cases per year were associated with a bovine and human fecal source, respectively. Furthermore, Cryptosporidium parvum was predicted to cause 190 cases per year, the most out of all 8 pathogens included in the QMRA. DISCUSSION: This study has important implications for land use and water resource management in Kewaunee County and informs the public health impacts of consuming drinking water produced in other similarly vulnerable hydrogeological settings. https://doi.org/10.1289/EHP7815.


Subjects
Cryptosporidiosis , Cryptosporidium , Groundwater , Animals , Calcium Carbonate , Cattle , Magnesium , Risk Assessment , Water Wells , Wisconsin/epidemiology
15.
Environ Health Perspect ; 129(6): 67004, 2021 06.
Article in English | MEDLINE | ID: mdl-34160249

ABSTRACT

BACKGROUND: Groundwater quality in the Silurian dolomite aquifer in northeastern Wisconsin, USA, has become contentious as dairy farms and exurban development expand. OBJECTIVES: We investigated private household wells in the region, determining the extent, sources, and risk factors of nitrate and microbial contamination. METHODS: Total coliforms, Escherichia coli, and nitrate were evaluated by synoptic sampling during groundwater recharge and no-recharge periods. Additional seasonal sampling measured genetic markers of human and bovine fecal-associated microbes and enteric zoonotic pathogens. We constructed multivariable regression models of detection probability (log-binomial) and concentration (gamma) for each contaminant to identify risk factors related to land use, precipitation, hydrogeology, and well construction. RESULTS: Total coliforms and nitrate were strongly associated with depth-to-bedrock at well sites and nearby agricultural land use, but not septic systems. Both human wastewater and cattle manure contributed to well contamination. Rotavirus group A, Cryptosporidium, and Salmonella were the most frequently detected pathogens. Wells positive for human fecal markers were associated with depth-to-groundwater and the number of septic system drainfields within 229 m. Manure-contaminated wells were associated with groundwater recharge and the area of nearby agricultural land. Wells positive for any fecal-associated microbe, regardless of source, were associated with septic system density and manure storage proximity modified by bedrock depth. Well construction was generally not related to contamination, indicating that land use, groundwater recharge, and bedrock depth were the most important risk factors. DISCUSSION: These findings may inform policies to minimize contamination of the Silurian dolomite aquifer, a major water supply for the U.S. and Canadian Great Lakes region. https://doi.org/10.1289/EHP7813.


Subjects
Cryptosporidiosis , Cryptosporidium , Groundwater , Water Pollutants, Chemical , Animals , Calcium Carbonate , Canada , Cattle , Environmental Monitoring , Magnesium , Nitrates/analysis , Risk Factors , Water Pollutants, Chemical/analysis , Water Wells , Wisconsin
16.
Am J Public Health ; 100(6): 1116-22, 2010 Jun.
Article in English | MEDLINE | ID: mdl-20075320

ABSTRACT

OBJECTIVES: We performed a case-control study to determine if participants with herpes zoster had fewer contacts with persons with varicella or zoster, and with young children, to explore the hypothesis that exposure to persons with varicella zoster virus (VZV) results in "immune boosting." METHODS: Participants were patients of the multispecialty Marshfield Clinic in Wisconsin. We identified patients aged 40 to 79 years with a new diagnosis of zoster from August 2000 to July 2005. We frequency matched control participants to case participants for age. We confirmed diagnoses by chart review and assessed exposures by interview. RESULTS: Interviews were completed by 633 of 902 eligible case participants (70.2%) and 655 of 1149 control participants (57.0%). The number of varicella contacts was not associated with zoster; there was no trend even at the highest exposure level (3 or more contacts). Similarly, there was no association with exposure to persons with zoster or to children, or with workplace exposures. CONCLUSIONS: Although exposure to VZV in our study was relatively low, the absence of a relationship with zoster reflects the uncertain influence of varicella circulation on zoster epidemiology.


Subjects
Chickenpox Vaccine/immunology, Herpes Zoster/epidemiology, Human Herpesvirus 3, Adult, Aged, Case-Control Studies, Child, Family, Female, Herpes Zoster/etiology, Humans, Male, Middle Aged, Occupational Exposure/statistics & numerical data, Odds Ratio, Regression Analysis, Wisconsin/epidemiology
18.
Influenza Other Respir Viruses ; 14(5): 479-482, 2020 09.
Article in English | MEDLINE | ID: mdl-32390298

ABSTRACT

We developed and evaluated a model to predict serious outcomes among 243 adults ≥60 years old with medically attended respiratory illness and laboratory-confirmed respiratory syncytial virus (RSV); 47 patients had a serious outcome defined as hospital admission, emergency department (ED) visit, or pneumonia diagnosis. The model used logistic regression with penalized maximum likelihood estimation. The reduced penalized model included age ≥ 75 years, ≥1 ED visit in prior year, crackles/rales, tachypnea, wheezing, new/increased sputum, and new/increased dyspnea. The optimal score cutoff yielded sensitivity and specificity of 66.0% and 81.6%. This prediction model provided moderate utility for identifying older adults with elevated risk of complicated RSV illness.
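The sensitivity and specificity reported for the score cutoff follow directly from a 2x2 confusion matrix. The counts below are hypothetical, but chosen to be consistent with the reported cohort (243 patients, 47 with a serious outcome) and the reported 66.0%/81.6% operating point:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical confusion matrix: a cutoff that flags 31 of the 47
# serious outcomes and clears 160 of the 196 patients without one.
sensitivity, specificity = sens_spec(tp=31, fn=16, tn=160, fp=36)
print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}")
```

Shifting the score cutoff trades one quantity against the other, which is why the abstract refers to an "optimal" cutoff rather than a single fixed threshold.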


Subjects
Outpatients/statistics & numerical data, Respiratory Syncytial Virus Infections/complications, Respiratory Syncytial Virus Infections/diagnosis, Seasons, Age Factors, Aged, Hospital Emergency Service/statistics & numerical data, Humans, Logistic Models, Middle Aged, Pneumonia/diagnosis, Pneumonia/virology, Predictive Value of Tests, Human Respiratory Syncytial Virus/genetics, Human Respiratory Syncytial Virus/pathogenicity, Risk Factors, Severity of Illness Index
19.
Pediatr Obes ; 15(1): e12572, 2020 01.
Article in English | MEDLINE | ID: mdl-31595686

ABSTRACT

BACKGROUND: Recent studies suggest kids tend to gain the most weight in summer, but schools are chastised for supporting obesogenic environments. Conclusions on circannual weight gain are hampered by infrequent body mass index (BMI) measurements, and guidance is limited on the optimal timeframe for paediatric weight interventions. OBJECTIVES: This study characterized circannual trends in BMI in Wisconsin children and adolescents and identified sociodemographic differences in excess weight gain. METHODS: An observational study pooled data from 2010 to 2015 to examine circannual BMI z-score trends for Marshfield Clinic patients aged 3 to 17 years. Daily 0.20, 0.50, and 0.80 quantiles of BMI z-score were estimated, stratified by gender, race, and age. RESULTS: BMI z-scores increased from July to September, decreased from October to December, and began another increase-to-decrease cycle in February. For adolescents, the summer increase in BMI was greater among those in the upper BMI z-score quantile relative to those in the lower quantile (+0.15 units vs +0.04 units). The opposite pattern was observed in children. CONCLUSIONS: BMI increased most rapidly in late summer. This growth persisted through autumn in adolescents who were larger, suggesting weight management support may be beneficial for kids who are overweight at the start of the school year.
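The daily 0.20, 0.50, and 0.80 quantile estimates described above can be illustrated with Python's standard library; the z-scores below are invented for illustration and stand in for one day's observations:

```python
import statistics

# Hypothetical BMI z-scores observed on a single day; the study estimated
# the 0.20, 0.50, and 0.80 quantiles of such distributions daily.
z_scores = [-1.2, -0.8, -0.5, -0.3, 0.0, 0.1, 0.4, 0.7, 1.1, 1.6]

# Deciles give cut points at 0.1, 0.2, ..., 0.9; pick out 0.2, 0.5, 0.8.
deciles = statistics.quantiles(z_scores, n=10, method="inclusive")
q20, q50, q80 = deciles[1], deciles[4], deciles[7]
print(q20, q50, q80)
```

Tracking q20 and q80 separately, rather than only the median, is what lets the study show that children at the upper end of the distribution gained more over summer than those at the lower end.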


Subjects
Obesity/prevention & control, Weight Gain, Adolescent, Body Mass Index, Child, Preschool Child, Female, Humans, Male, Seasons
20.
Water Res ; 178: 115814, 2020 Jul 01.
Article in English | MEDLINE | ID: mdl-32325219

ABSTRACT

Drinking water supply wells can be contaminated by a broad range of waterborne pathogens. However, groundwater assessments frequently measure microbial indicators or a single pathogen type, which provides a limited characterization of potential health risk. This study assessed contamination of wells by testing for viral, bacterial, and protozoan pathogens and fecal markers. Wells supplying groundwater to community and noncommunity public water systems in Minnesota, USA (n = 145) were sampled every other month over one or two years and tested using 23 qPCR assays. Eighteen genetic targets were detected at least once, and microbiological contamination was widespread (96% of 145 wells, 58% of 964 samples). The sewage-associated microbial indicators HF183 and pepper mild mottle virus were detected frequently. Human or zoonotic pathogens were detected in 70% of wells and 21% of samples by qPCR, with Salmonella and Cryptosporidium detected more often than viruses. Samples positive by qPCR for adenovirus (HAdV), enterovirus, or Salmonella were analyzed by culture and for genotype or serotype. qPCR-positive Giardia and Cryptosporidium samples were analyzed by immunofluorescent assay (IFA), and IFA and qPCR concentrations were correlated. Comparisons of indicator and pathogen occurrence at the time of sampling showed that total coliforms, HF183, and Bacteroidales-like HumM2 had high specificity and negative predictive values but generally low sensitivity and positive predictive values. Pathogen-HF183 ratios in sewage have been used to estimate health risks from HF183 concentrations in surface water, but in our groundwater samples Cryptosporidium oocyst:HF183 and HAdV:HF183 ratios were approximately 10,000 times higher than ratios reported for sewage. qPCR measurements provided a robust characterization of microbiological water quality, but interpretation of qPCR data in a regulatory context is challenging because few studies link qPCR measurements to health risk.
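The indicator-performance metrics mentioned above (sensitivity, specificity, and positive and negative predictive values) come from cross-tabulating indicator and pathogen detections per sample. A sketch with hypothetical counts that sum to the abstract's 964 samples but are otherwise invented:

```python
def indicator_performance(both, indicator_only, pathogen_only, neither):
    """2x2 per-sample co-occurrence of a fecal indicator (e.g. HF183)
    and any pathogen: sensitivity, specificity, PPV, NPV."""
    sens = both / (both + pathogen_only)        # pathogen-positive samples flagged
    spec = neither / (neither + indicator_only) # pathogen-negative samples cleared
    ppv = both / (both + indicator_only)        # indicator hits with a pathogen
    npv = neither / (neither + pathogen_only)   # indicator misses without one
    return sens, spec, ppv, npv

# Hypothetical counts (12 + 28 + 190 + 734 = 964 samples), shaped to echo
# the abstract's pattern: high specificity, low sensitivity and PPV.
sens, spec, ppv, npv = indicator_performance(
    both=12, indicator_only=28, pathogen_only=190, neither=734)
print(f"sens {sens:.2f}, spec {spec:.2f}, PPV {ppv:.2f}, NPV {npv:.2f}")
```

Low sensitivity at the time of sampling is the practical takeaway: a negative indicator result does not rule out pathogen presence in a well.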


Subjects
Cryptosporidiosis, Cryptosporidium, Groundwater, Animals, Environmental Monitoring, Feces, Humans, Minnesota, Water Microbiology