Results 1 - 11 of 11
1.
Sci Total Environ ; 930: 172505, 2024 Jun 20.
Article in English | MEDLINE | ID: mdl-38636851

ABSTRACT

Human sewage contaminates waterways, delivering excess nutrients, pathogens, chemicals, and other toxic contaminants. Contaminants and various sewage indicators are measured to monitor and assess water quality, but these analytes vary in their representation of sewage contamination and the inferences about water quality they support. We measured the occurrence and concentration of multiple microbiological (n = 21) and chemical (n = 106) markers at two urban stream locations in Milwaukee, Wisconsin, USA, over two years. Five-day composite water samples (n = 98) were collected biweekly, and sewage influent samples (n = 25) were collected monthly at a Milwaukee, WI water reclamation facility. We found that the vast majority of markers were not sensitive enough to detect sewage contamination. To compare analytes for monitoring applications, five consistently detected human sewage indicators were used to evaluate temporal patterns of sewage contamination, including microbiological (pepper mild mottle virus, human Bacteroides, human Lachnospiraceae) and chemical (acetaminophen, metformin) markers. The proportion of human sewage in each stream was estimated using the mean influent concentration from the water reclamation facility and the mean concentration of all stream samples for each sewage indicator marker. Estimates of instream sewage pollution varied by marker, differing by up to two orders of magnitude, but four of the five sewage markers characterized Underwood Creek (mean proportions of human sewage ranging from 0.0025% to 0.075%) as roughly an order of magnitude less polluted than the Menomonee River (0.013% to 0.14%). Chemical markers correlated with each other and yielded higher estimates of sewage pollution than microbial markers, which exhibited greater temporal variability. Transport, attenuation, and degradation processes can influence chemical and microbial markers differently and cause variation in human sewage estimates. Given the range of potential human and ecological health effects of human sewage contamination, robust characterization of sewage contamination using multiple lines of evidence supports monitoring and research applications.
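
The sewage-proportion estimate described above reduces to a ratio of mean marker concentrations in the stream and in reclamation-facility influent. A minimal sketch of that arithmetic, with placeholder concentrations rather than the study's measured values:

```python
# Sketch: estimate the proportion of human sewage in a stream from a marker's
# mean concentration in stream samples versus sewage-plant influent.
# Values below are placeholders, not data from the study.

def sewage_fraction(stream_conc_mean: float, influent_conc_mean: float) -> float:
    """Fraction of stream water attributable to sewage for one marker."""
    return stream_conc_mean / influent_conc_mean

# Hypothetical mean concentrations (e.g., gene copies or ng per liter).
stream_mean = 2.0e3
influent_mean = 1.5e6

fraction = sewage_fraction(stream_mean, influent_mean)
print(f"Estimated human sewage proportion: {fraction:.4%}")
```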


Subject(s)
Environmental Monitoring; Rivers; Sewage; Water Pollutants, Chemical; Environmental Monitoring/methods; Humans; Rivers/microbiology; Rivers/chemistry; Rivers/virology; Wisconsin; Water Pollutants, Chemical/analysis; Pharmaceutical Preparations/analysis; Bacteria/isolation & purification; Water Quality; Water Microbiology; Viruses/isolation & purification
2.
Appl Environ Microbiol ; 90(3): e0162923, 2024 Mar 20.
Article in English | MEDLINE | ID: mdl-38335112

ABSTRACT

We used quantitative microbial risk assessment to estimate ingestion risk for intI1, erm(B), sul1, tet(A), tet(W), and tet(X) in private wells contaminated by human and/or livestock feces. Genes were quantified with five human-specific and six bovine-specific microbial source-tracking (MST) markers in 138 well-water samples from a rural Wisconsin county. Daily ingestion risk (probability of swallowing ≥1 gene) was based on daily water consumption and a Poisson exposure model. Calculations were stratified by MST source and soil depth over the aquifer where wells were drilled. Relative ingestion risk was estimated using wells with no MST detections and >6.1 m soil depth as a referent category. Daily ingestion risk varied from 0 to 8.8 × 10⁻¹ by gene and fecal source (i.e., human or bovine). The estimated number of residents ingesting target genes from private wells varied from 910 (tet(A)) to 1,500 (intI1 and tet(X)) per day out of 12,000 total. Relative risk of tet(A) ingestion was significantly higher in wells with MST markers detected, including wells with ≤6.1 m soil depth contaminated by bovine markers (2.2 [90% CI: 1.1-4.7]), wells with >6.1 m soil depth contaminated by bovine markers (1.8 [1.002-3.9]), and wells with ≤6.1 m soil depth contaminated by bovine and human markers simultaneously (3.1 [1.7-6.5]). Antibiotic resistance genes (ARGs) were not necessarily present in viable microorganisms, and ingestion is not directly associated with infection. However, results illustrate relative contributions of human and livestock fecal sources to ARG exposure and highlight rural groundwater as a significant point of exposure. IMPORTANCE: Antibiotic resistance is a global public health challenge with well-known environmental dimensions, but quantitative analyses of the roles played by various natural environments in transmission of antibiotic resistance are lacking, particularly for drinking water. This study assesses risk of ingestion for several antibiotic resistance genes (ARGs) and the class 1 integron gene (intI1) in drinking water from private wells in a rural area of northeast Wisconsin, United States. Results allow comparison of drinking water as an exposure route for antibiotic resistance relative to other routes like food and recreational water. They also enable a comparison of the importance of human versus livestock fecal sources in the study area. Our study demonstrates the previously unrecognized importance of untreated rural drinking water as an exposure route for antibiotic resistance and identifies bovine fecal material as an important exposure factor in the study setting.
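
A minimal sketch of the Poisson exposure model named above, assuming the daily risk is the probability of ingesting at least one gene copy given a well-water gene concentration and a daily consumption volume; both values below are illustrative, not taken from the study:

```python
import math

# Poisson exposure model for daily ingestion risk:
# P(ingest >= 1 gene copy) = 1 - exp(-C * V), where C is the gene concentration
# in well water (genes/L) and V is daily water consumption (L/day).

def daily_ingestion_risk(conc_genes_per_L: float, volume_L_per_day: float) -> float:
    expected_dose = conc_genes_per_L * volume_L_per_day
    return 1.0 - math.exp(-expected_dose)

# Illustrative values: 0.5 gene copies per liter, 1.1 L of well water per day.
risk = daily_ingestion_risk(0.5, 1.1)
print(f"Daily probability of ingesting >= 1 gene copy: {risk:.3f}")
```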


Subject(s)
Anti-Bacterial Agents; Drinking Water; Animals; Humans; Cattle; Anti-Bacterial Agents/pharmacology; Genes, Bacterial; Livestock; Feces; Soil; Risk Assessment; Drug Resistance, Microbial/genetics; Eating
3.
BMJ Open ; 13(3): e068560, 2023 03 02.
Article in English | MEDLINE | ID: mdl-36863739

ABSTRACT

INTRODUCTION: The burden of disease attributed to drinking water from private wells is not well characterised. The Wells and Enteric disease Transmission trial is the first randomised controlled trial to estimate the burden of disease that can be attributed to the consumption of untreated private well water. To estimate the attributable incidence of gastrointestinal illness (GI) associated with private well water, we will test whether household treatment of well water by ultraviolet light (active UV device), compared with a sham (inactive UV device), decreases the incidence of GI in children under 5 years of age. METHODS AND ANALYSIS: The trial will enrol (on a rolling basis) 908 families in Pennsylvania, USA, that rely on private wells and have a child 3 years old or younger. Participating families are randomised to either an active whole-house UV device or a sham device. During follow-up, families will respond to weekly text messages to report the presence of signs and symptoms of gastrointestinal or respiratory illness and will be directed to an illness questionnaire when signs/symptoms are present. These data will be used to compare the incidence of waterborne illness between the two study groups. A randomly selected subcohort submits untreated well water samples and biological specimens (stool and saliva) from the participating child in both the presence and absence of signs/symptoms. Samples are analysed for the presence of common waterborne pathogens (stool and water) or immunoconversion to these pathogens (saliva). ETHICS: Approval has been obtained from Temple University's Institutional Review Board (Protocol 25665). The results of the trial will be published in peer-reviewed journals. TRIAL REGISTRATION NUMBER: NCT04826991.


Subject(s)
Research Design; Water; Child; Humans; Child, Preschool; Ethics Committees, Research; Feces; Head
4.
J Environ Qual ; 52(2): 270-286, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36479898

ABSTRACT

Antimicrobial resistance is a growing public health problem that requires an integrated approach among human, agricultural, and environmental sectors. However, few studies address all three components simultaneously. We investigated the occurrence of five antibiotic resistance genes (ARGs) and the class 1 integron gene (intI1) in private wells drawing water from a vulnerable aquifer influenced by residential septic systems and land-applied dairy manure. Samples (n = 138) were collected across four seasons from a randomized sample of private wells in Kewaunee County, Wisconsin. Measurements of ARGs and intI1 were related to microbial source tracking (MST) markers specific to human and bovine feces; they were also related to 54 risk factors for contamination representing land use, rainfall, hydrogeology, and well construction. ARGs and intI1 occurred in 5%-40% of samples depending on target. Detection frequencies for ARGs and intI1 were lowest in the absence of human and bovine MST markers (1%-30%), highest when co-occurring with human and bovine markers together (11%-78%), and intermediate when co-occurring with just one type of MST marker (4%-46%). Gene targets were associated with septic system density more often than agricultural land, potentially because of the variable presence of manure on the landscape. Determining ARG prevalence in a rural setting with mixed land use allowed an assessment of the relative contribution of human and bovine fecal sources. Because fecal sources co-occurred with ARGs at similar rates, interventions intended to reduce ARG occurrence may be most effective if both sources are considered.
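
As a rough illustration of the stratification described above (detection frequency by MST co-occurrence category), a short pandas sketch on simulated data; the column names and probabilities are invented, not the study's variables or results:

```python
import numpy as np
import pandas as pd

# Sketch: tabulate ARG detection frequency by microbial source tracking (MST)
# co-occurrence category. The data frame is simulated for illustration only.

rng = np.random.default_rng(7)
n = 138
df = pd.DataFrame({
    "human_mst": rng.random(n) < 0.2,
    "bovine_mst": rng.random(n) < 0.3,
    "arg_detected": rng.random(n) < 0.25,
})

def mst_category(row):
    if row["human_mst"] and row["bovine_mst"]:
        return "human + bovine"
    if row["human_mst"] or row["bovine_mst"]:
        return "one source"
    return "no MST marker"

df["category"] = df.apply(mst_category, axis=1)
print(df.groupby("category")["arg_detected"].agg(["mean", "size"]))
```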


Subject(s)
Anti-Bacterial Agents; Manure; Animals; Humans; Cattle; Anti-Bacterial Agents/pharmacology; Livestock; Feces; Drug Resistance, Microbial/genetics
5.
Environ Sci Technol ; 56(10): 6315-6324, 2022 05 17.
Article in English | MEDLINE | ID: mdl-35507527

ABSTRACT

Infection risk from waterborne pathogens can be estimated via quantitative microbial risk assessment (QMRA) and forms an important consideration in the management of public groundwater systems. However, few groundwater QMRAs use site-specific hazard identification and exposure assessment, so prevailing risks in these systems remain poorly defined. We estimated the infection risk for 9 waterborne pathogens based on a 2-year pathogen occurrence study in which 964 water samples were collected from 145 public wells throughout Minnesota, USA. Annual risk across all nine pathogens combined was 3.3 × 10⁻¹ (95% CI: 2.3 × 10⁻¹ to 4.2 × 10⁻¹), 3.9 × 10⁻² (2.3 × 10⁻² to 5.4 × 10⁻²), and 1.2 × 10⁻¹ (2.6 × 10⁻² to 2.7 × 10⁻¹) infections person⁻¹ year⁻¹ for noncommunity, nondisinfecting community, and disinfecting community wells, respectively. Risk estimates exceeded the U.S. benchmark of 10⁻⁴ infections person⁻¹ year⁻¹ in 59% of well-years, indicating that the risk was widespread. While the annual risk for all pathogens combined was relatively high, the average daily doses for individual pathogens were low, indicating that significant risk results from sporadic pathogen exposure. Cryptosporidium dominated annual risk, so improved identification of wells susceptible to Cryptosporidium contamination may be important for risk mitigation.
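
Annual risks like those quoted above are typically built up from per-day infection probabilities. A sketch of that aggregation under the usual assumption of independent daily exposures, with placeholder daily risks chosen only to mimic the sporadic exposure pattern the abstract describes:

```python
import numpy as np

# Sketch: aggregate daily infection probabilities into an annual risk,
# assuming independence across days. Daily risks below are placeholders.

def annual_risk(daily_risks: np.ndarray) -> float:
    """P(at least one infection in a year) from 365 daily probabilities."""
    return 1.0 - np.prod(1.0 - daily_risks)

rng = np.random.default_rng(0)
# Hypothetical daily risks: near zero most days, occasional spikes.
daily = np.where(rng.random(365) < 0.02, 1e-3, 1e-7)

p_annual = annual_risk(daily)
print(f"Annual infection risk: {p_annual:.2e}")
print(f"Exceeds 1e-4 benchmark: {p_annual > 1e-4}")
```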


Subject(s)
Cryptosporidiosis; Cryptosporidium; Viruses; Bacteria; Humans; Minnesota; Risk Assessment; Water Microbiology; Water Supply; Water Wells
6.
Environ Health Perspect ; 129(6): 67003, 2021 06.
Article in English | MEDLINE | ID: mdl-34160247

ABSTRACT

BACKGROUND: Private wells are an important source of drinking water in Kewaunee County, Wisconsin. Due to the region's fractured dolomite aquifer, these wells are vulnerable to contamination by human and zoonotic gastrointestinal pathogens originating from land-applied cattle manure and private septic systems. OBJECTIVE: We determined the magnitude of the health burden associated with contamination of private wells in Kewaunee County by feces-borne gastrointestinal pathogens. METHODS: This study used data from a year-long countywide pathogen occurrence study as inputs into a quantitative microbial risk assessment (QMRA) to predict the total cases of acute gastrointestinal illness (AGI) caused by private well contamination in the county. Microbial source tracking was used to associate predicted cases of illness with bovine, human, or unknown fecal sources. RESULTS: Results suggest that private well contamination could be responsible for as many as 301 AGI cases per year in Kewaunee County, and that 230 and 12 cases per year were associated with a bovine and human fecal source, respectively. Furthermore, Cryptosporidium parvum was predicted to cause 190 cases per year, the most out of all 8 pathogens included in the QMRA. DISCUSSION: This study has important implications for land use and water resource management in Kewaunee County and informs the public health impacts of consuming drinking water produced in other similarly vulnerable hydrogeological settings. https://doi.org/10.1289/EHP7815.
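
A sketch of one way a case burden of this kind could be computed for a single pathogen, assuming the widely used exponential dose-response model for Cryptosporidium; the dose, dose-response parameter, morbidity ratio, and population served are illustrative assumptions, not the study's QMRA inputs:

```python
import math

# Sketch of one QMRA step, assuming an exponential dose-response model:
# P(infection) = 1 - exp(-r * dose). All values below are illustrative.

r = 0.09                 # assumed dose-response parameter for C. parvum
dose_per_day = 0.002     # hypothetical mean oocysts ingested per day via well water
p_ill_given_inf = 0.5    # assumed morbidity (illness given infection) ratio
population = 12_000      # hypothetical number of residents on private wells

p_inf_daily = 1.0 - math.exp(-r * dose_per_day)
p_inf_annual = 1.0 - (1.0 - p_inf_daily) ** 365
expected_cases = population * p_inf_annual * p_ill_given_inf

print(f"Annual infection probability per person: {p_inf_annual:.3e}")
print(f"Expected AGI cases per year: {expected_cases:.0f}")
```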


Subject(s)
Cryptosporidiosis; Cryptosporidium; Groundwater; Animals; Calcium Carbonate; Cattle; Magnesium; Risk Assessment; Water Wells; Wisconsin/epidemiology
7.
Environ Health Perspect ; 129(6): 67004, 2021 06.
Article in English | MEDLINE | ID: mdl-34160249

ABSTRACT

BACKGROUND: Groundwater quality in the Silurian dolomite aquifer in northeastern Wisconsin, USA, has become contentious as dairy farms and exurban development expand. OBJECTIVES: We investigated private household wells in the region, determining the extent, sources, and risk factors of nitrate and microbial contamination. METHODS: Total coliforms, Escherichia coli, and nitrate were evaluated by synoptic sampling during groundwater recharge and no-recharge periods. Additional seasonal sampling measured genetic markers of human and bovine fecal-associated microbes and enteric zoonotic pathogens. We constructed multivariable regression models of detection probability (log-binomial) and concentration (gamma) for each contaminant to identify risk factors related to land use, precipitation, hydrogeology, and well construction. RESULTS: Total coliforms and nitrate were strongly associated with depth-to-bedrock at well sites and nearby agricultural land use, but not septic systems. Both human wastewater and cattle manure contributed to well contamination. Rotavirus group A, Cryptosporidium, and Salmonella were the most frequently detected pathogens. Wells positive for human fecal markers were associated with depth-to-groundwater and the number of septic system drainfields within 229 m. Manure-contaminated wells were associated with groundwater recharge and the area of nearby agricultural land. Wells positive for any fecal-associated microbe, regardless of source, were associated with septic system density and manure storage proximity modified by bedrock depth. Well construction was generally not related to contamination, indicating land use, groundwater recharge, and bedrock depth were the most important risk factors. DISCUSSION: These findings may inform policies to minimize contamination of the Silurian dolomite aquifer, a major water supply for the U.S. and Canadian Great Lakes region. https://doi.org/10.1289/EHP7813.
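
A minimal sketch of the two model families named in the METHODS (log-binomial for detection probability, gamma with a log link for concentration), fit with statsmodels on simulated data; the predictor names, coefficients, and data are invented for illustration:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Sketch: log-binomial GLM for detection probability and gamma GLM (log link)
# for concentration. All data and predictors are simulated for illustration.

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "depth_to_bedrock_m": rng.uniform(0, 30, n),
    "ag_land_fraction": rng.uniform(0, 1, n),
})
lin = -1.5 - 0.05 * df["depth_to_bedrock_m"] + 1.0 * df["ag_land_fraction"]
df["detected"] = (rng.random(n) < np.exp(lin)).astype(float)
df["concentration"] = rng.gamma(shape=2.0, scale=np.exp(lin + 3.0))

X = sm.add_constant(df[["depth_to_bedrock_m", "ag_land_fraction"]])

# Log-binomial model: exponentiated coefficients are prevalence ratios.
start = np.array([np.log(df["detected"].mean()), 0.0, 0.0])  # safe starting values
logbin = sm.GLM(df["detected"], X,
                family=sm.families.Binomial(link=sm.families.links.Log())
                ).fit(start_params=start)

# Gamma model with log link for strictly positive concentrations.
gamma = sm.GLM(df["concentration"], X,
               family=sm.families.Gamma(link=sm.families.links.Log())).fit()

print("Prevalence ratios:", np.exp(logbin.params).round(2).to_dict())
print("Concentration ratios:", np.exp(gamma.params).round(2).to_dict())
```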


Subject(s)
Cryptosporidiosis; Cryptosporidium; Groundwater; Water Pollutants, Chemical; Animals; Calcium Carbonate; Canada; Cattle; Environmental Monitoring; Magnesium; Nitrates/analysis; Risk Factors; Water Pollutants, Chemical/analysis; Water Wells; Wisconsin
8.
Water Res ; 178: 115814, 2020 Jul 01.
Article in English | MEDLINE | ID: mdl-32325219

ABSTRACT

Drinking water supply wells can be contaminated by a broad range of waterborne pathogens. However, groundwater assessments frequently measure microbial indicators or a single pathogen type, which provides a limited characterization of potential health risk. This study assessed contamination of wells by testing for viral, bacterial, and protozoan pathogens and fecal markers. Wells supplying groundwater to community and noncommunity public water systems in Minnesota, USA (n = 145) were sampled every other month over one or two years and tested using 23 qPCR assays. Eighteen genetic targets were detected at least once, and microbiological contamination was widespread (96% of 145 wells, 58% of 964 samples). The sewage-associated microbial indicators HF183 and pepper mild mottle virus were detected frequently. Human or zoonotic pathogens were detected in 70% of wells and 21% of samples by qPCR, with Salmonella and Cryptosporidium detected more often than viruses. Samples positive by qPCR for adenovirus (HAdV), enterovirus, or Salmonella were analyzed by culture and for genotype or serotype. qPCR-positive Giardia and Cryptosporidium samples were analyzed by immunofluorescent assay (IFA), and IFA and qPCR concentrations were correlated. Comparisons of indicator and pathogen occurrence at the time of sampling showed that total coliforms, HF183, and Bacteroidales-like HumM2 had high specificity and negative predictive values but generally low sensitivity and positive predictive values. Pathogen-HF183 ratios in sewage have been used to estimate health risks from HF183 concentrations in surface water, but in our groundwater samples Cryptosporidium oocyst:HF183 and HAdV:HF183 ratios were approximately 10,000 times higher than ratios reported for sewage. qPCR measurements provided a robust characterization of microbiological water quality, but interpretation of qPCR data in a regulatory context is challenging because few studies link qPCR measurements to health risk.
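
The indicator-performance metrics mentioned above come from a 2 x 2 cross-tabulation of indicator and pathogen detections in the same samples. A short sketch of that calculation with invented counts (not the study's data), chosen only to reproduce the qualitative pattern of high specificity and NPV with low sensitivity and PPV:

```python
# Sketch: indicator performance for predicting pathogen presence from an
# indicator (e.g., HF183) detected in the same sample. Counts are invented.

# 2x2 table: indicator detected (rows) vs. pathogen detected (columns).
tp, fp = 30, 40    # indicator+/pathogen+, indicator+/pathogen-
fn, tn = 170, 724  # indicator-/pathogen+, indicator-/pathogen-

sensitivity = tp / (tp + fn)   # P(indicator+ | pathogen+)
specificity = tn / (tn + fp)   # P(indicator- | pathogen-)
ppv = tp / (tp + fp)           # P(pathogen+ | indicator+)
npv = tn / (tn + fn)           # P(pathogen- | indicator-)

for name, value in [("sensitivity", sensitivity), ("specificity", specificity),
                    ("PPV", ppv), ("NPV", npv)]:
    print(f"{name}: {value:.2f}")
```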


Subject(s)
Cryptosporidiosis; Cryptosporidium; Groundwater; Animals; Environmental Monitoring; Feces; Humans; Minnesota; Water Microbiology
9.
Environ Sci Technol ; 53(7): 3391-3398, 2019 04 02.
Article in English | MEDLINE | ID: mdl-30895775

ABSTRACT

Regulations for public water systems (PWS) in the U.S. consider Cryptosporidium a microbial contaminant of surface water supplies. Groundwater is assumed free of Cryptosporidium unless surface water is entering supply wells. We determined the incidence of Cryptosporidium in PWS wells varying in surface water influence. Community and noncommunity PWS wells (n = 145) were sampled (n = 964) and analyzed for Cryptosporidium by qPCR and immunofluorescence assay (IFA). Surface water influence was assessed by stable isotopes and the expert judgment of hydrogeologists using site-specific data. Fifty-eight wells (40%) and 107 samples (11%) were Cryptosporidium-positive by qPCR, and of these samples 67 were positive by IFA. Cryptosporidium concentrations measured by qPCR and IFA were significantly correlated (p < 0.001). Cryptosporidium incidence was not associated with surface water influence as assessed by stable isotopes or expert judgment. We successfully sequenced 45 of the 107 positive samples to identify species, including C. parvum (41), C. andersoni (2), and C. hominis (2), and the predominant subtype was C. parvum IIa A17G2R1. Assuming USA regulations for surface water-supplied PWS were applicable to the study wells, wells positive for Cryptosporidium by IFA would likely be required to add treatment. Cryptosporidium is not uncommon in groundwater, even when surface water influence is absent.
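
A brief sketch of the kind of paired-concentration correlation reported above; the data are simulated, and Spearman's rank correlation is assumed since the abstract does not state which statistic was used:

```python
import numpy as np
from scipy import stats

# Sketch: correlate paired qPCR and IFA Cryptosporidium concentrations.
# Values are simulated for illustration only.

rng = np.random.default_rng(3)
qpcr = rng.lognormal(mean=1.0, sigma=1.0, size=67)          # simulated gc per 100 L
ifa = qpcr * rng.lognormal(mean=-0.5, sigma=0.4, size=67)   # simulated oocysts per 100 L

rho, p_value = stats.spearmanr(qpcr, ifa)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.1e}")
```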


Subject(s)
Cryptosporidium; Groundwater; Incidence; Minnesota; Water; Water Supply
10.
Environ Health Perspect ; 125(8): 087009, 2017 08 16.
Article in English | MEDLINE | ID: mdl-28885976

ABSTRACT

BACKGROUND: Spray irrigation for land-applying livestock manure is increasing in the United States as farms become larger and economies of scale make manure irrigation affordable. Human health risks from exposure to zoonotic pathogens aerosolized during manure irrigation are not well understood. OBJECTIVES: We aimed to a) estimate human health risks due to aerosolized zoonotic pathogens downwind of spray-irrigated dairy manure; and b) determine which factors (e.g., distance, weather conditions) have the greatest influence on risk estimates. METHODS: We sampled downwind air concentrations of manure-borne fecal indicators and zoonotic pathogens during 21 full-scale dairy manure irrigation events at three farms. We fit these data to hierarchical empirical models and used model outputs in a quantitative microbial risk assessment (QMRA) to estimate risk [probability of acute gastrointestinal illness (AGI)] for individuals exposed to spray-irrigated dairy manure containing Campylobacter jejuni, enterohemorrhagic Escherichia coli (EHEC), or Salmonella spp. RESULTS: Median risk estimates from Monte Carlo simulations ranged from 10⁻⁵ to 10⁻² and decreased with distance from the source. Risk estimates for Salmonella or EHEC-related AGI were most sensitive to the assumed level of pathogen prevalence in dairy manure, while risk estimates for C. jejuni were not sensitive to any single variable. Airborne microbe concentrations were negatively associated with distance and positively associated with wind speed, both of which were retained in models as significant predictors more often than relative humidity, solar irradiation, or temperature. CONCLUSIONS: Our model-based estimates suggest that reducing pathogen prevalence and concentration in source manure would reduce the risk of AGI from exposure to manure irrigation, and that increasing the distance from irrigated manure (i.e., setbacks) and limiting irrigation to times of low wind speed may also reduce risk. https://doi.org/10.1289/EHP283.
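
A compact sketch of one Monte Carlo QMRA step of the sort described above, using an approximate beta-Poisson dose-response; the air-concentration distribution, breathing rate, dose-response parameters, and morbidity ratio are all illustrative assumptions rather than the study's fitted values:

```python
import numpy as np

# Sketch: Monte Carlo estimate of per-event AGI risk downwind of manure
# irrigation. Sample an airborne pathogen concentration, convert to an inhaled
# dose, and apply an approximate beta-Poisson dose-response model.
# All parameter values below are illustrative assumptions.

rng = np.random.default_rng(42)
n_sim = 100_000

# Hypothetical downwind air concentration (organisms per m^3), lognormal.
air_conc = rng.lognormal(mean=-2.0, sigma=1.5, size=n_sim)

breathing_rate_m3_per_h = 0.83   # assumed light-activity inhalation rate
exposure_hours = 1.0
dose = air_conc * breathing_rate_m3_per_h * exposure_hours

# Approximate beta-Poisson dose-response (illustrative Campylobacter-like values).
alpha, n50 = 0.145, 896.0
p_infection = 1.0 - (1.0 + dose * (2 ** (1 / alpha) - 1) / n50) ** (-alpha)

p_illness = 0.3 * p_infection    # assumed illness-given-infection ratio
print(f"Median AGI risk per event: {np.median(p_illness):.1e}")
print(f"95th percentile risk:      {np.percentile(p_illness, 95):.1e}")
```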


Subject(s)
Agricultural Irrigation/methods; Dairying; Manure/microbiology; Models, Theoretical; Risk Assessment
11.
Water Res ; 96: 105-13, 2016 06 01.
Article in English | MEDLINE | ID: mdl-27023926

ABSTRACT

The limit of detection (LOD) for qPCR-based analyses is not consistently defined or determined in studies on waterborne pathogens. Moreover, the LODs reported often reflect the qPCR assay alone rather than the entire sample process. Our objective was to develop an approach to determine the 95% LOD (lowest concentration at which 95% of positive samples are detected) for the entire process of waterborne pathogen detection. We began by spiking the lowest concentration that was consistently positive at the qPCR step (based on its standard curve) into each procedural step working backwards (i.e., extraction, secondary concentration, primary concentration), which established a concentration that was detectable following losses of the pathogen from processing. Using the fraction of positive replicates (n = 10) at this concentration, we selected and analyzed a second, and then third, concentration. If the fraction of positive replicates equaled 1 or 0 for two concentrations, we selected another. We calculated the LOD using probit analysis. To demonstrate our approach we determined the 95% LOD for Salmonella enterica serovar Typhimurium, adenovirus 41, and vaccine-derived poliovirus Sabin 3, which were 11, 12, and 6 genomic copies (gc) per reaction (rxn), respectively (equivalent to 1.3, 1.5, and 4.0 gc L⁻¹ assuming the 1500 L tap-water sample volume prescribed in EPA Method 1615). This approach limited the number of analyses required and was amenable to testing multiple genetic targets simultaneously (i.e., spiking a single sample with multiple microorganisms). An LOD determined this way can facilitate study design, guide the number of required technical replicates, aid method evaluation, and inform data interpretation.
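
A minimal sketch of a probit-based 95% LOD calculation like the one described above: fit detection outcomes against log10 spike concentration with a probit model, then solve for the concentration giving 95% detection probability. The spike levels and detection outcomes below are invented, not the study's data:

```python
import numpy as np
from scipy import stats
import statsmodels.api as sm

# Sketch: probit analysis of detection (0/1) vs. log10(spiked concentration),
# inverted at 95% detection probability. All spiking data are invented.

log10_conc = np.repeat(np.log10([2.0, 5.0, 10.0, 20.0]), 10)  # gc/reaction, 10 replicates each
detected = np.array([0,1,0,0,1,0,0,1,0,1,    # 2 gc/rxn: 4/10 detected
                     1,0,1,1,1,0,1,1,1,0,    # 5 gc/rxn: 7/10
                     1,1,1,1,1,0,1,1,1,1,    # 10 gc/rxn: 9/10
                     1,1,1,1,1,1,1,1,1,1])   # 20 gc/rxn: 10/10

X = sm.add_constant(log10_conc)
fit = sm.Probit(detected, X).fit(disp=0)

# Probit model: Phi(b0 + b1 * log10(conc)) = 0.95  =>  solve for conc.
b0, b1 = fit.params
lod95 = 10 ** ((stats.norm.ppf(0.95) - b0) / b1)
print(f"Estimated 95% LOD: {lod95:.1f} gc per reaction")
```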


Subject(s)
Limit of Detection; Water Microbiology; Polymerase Chain Reaction; Real-Time Polymerase Chain Reaction; Sensitivity and Specificity; Water