1 - 13 of 13
1.
mSphere ; 9(5): e0076023, 2024 May 29.
Article En | MEDLINE | ID: mdl-38606968

Antimicrobial resistance (AMR) poses a global health threat, causing millions of deaths annually, with expectations of increased impact in the future. Wastewater surveillance offers a cost-effective, non-invasive tool to understand AMR carriage trends within a population. We monitored extended-spectrum β-lactamase-producing Escherichia coli (ESBL-E. coli) weekly in influent wastewater from six wastewater treatment plants (WWTPs) in Switzerland (November 2021 to November 2022) to investigate spatio-temporal variations, explore correlations with environmental variables, develop a predictive model for ESBL-E. coli carriage in the community, and detect the most prevalent ESBL genes. We cultured total and ESBL-E. coli in 300 wastewater samples to quantify daily loads and the percentage of ESBL-E. coli. Additionally, we screened 234 ESBL-E. coli isolates using molecular methods for the presence of 18 ESBL gene families. We found a population-weighted mean percentage of ESBL-E. coli of 1.9% (95% confidence interval: 1.8-2.0%) across all sites and weeks, which can inform estimates of ESBL-E. coli carriage. Concentrations of ESBL-E. coli varied across WWTPs and time, with higher values observed in WWTPs serving larger populations. Recent precipitation (previous 24/96 h) showed no significant association with ESBL-E. coli, while temperature had a moderate impact (P < 0.05, correlation coefficients approximately 0.40) at some locations. We identified blaCTX-M-1, blaCTX-M-9, and blaTEM as the predominant ESBL gene families. Our study demonstrates that wastewater-based surveillance of culturable ESBL-E. coli provides insights into AMR trends in Switzerland and may also inform estimates of resistance carriage. These findings establish a foundation for long-term, nationally established monitoring protocols and provide information that may help inform targeted public health interventions. IMPORTANCE: Antimicrobial resistance (AMR) is a global health threat and is commonly monitored in clinical settings, given its association with the risk of antimicrobial-resistant infections. Nevertheless, tracking AMR within a community proves challenging due to the substantial sample size required for a representative population, along with high associated costs and privacy concerns. By investigating high-resolution temporal and geographic trends in extended-spectrum beta-lactamase-producing Escherichia coli in wastewater, we provide an alternative approach to monitoring AMR dynamics, distinct from the conventional focus on clinical settings. Through this approach, we develop a mechanistic model, shedding light on the relationship between wastewater indicators and AMR carriage in the population. This perspective contributes valuable insights into trends of AMR carriage, emphasizing the importance of wastewater surveillance in informing effective public health interventions.
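The population-weighted mean percentage reported above is a weighted average of per-WWTP means, with weights given by the population served by each catchment. A minimal sketch of the calculation in Python, using placeholder values rather than the study's data:

```python
import numpy as np

# Mean percentage of ESBL-E. coli and population served per WWTP;
# placeholder values, not the study's data.
pct_esbl = np.array([1.6, 2.1, 1.8, 2.3, 1.7, 1.9])          # %
population = np.array([450_000, 120_000, 90_000,
                       60_000, 200_000, 35_000])

weighted_mean = np.average(pct_esbl, weights=population)
print(f"Population-weighted mean ESBL-E. coli: {weighted_mean:.2f}%")
```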


Escherichia coli, Wastewater, beta-Lactamases, Wastewater/microbiology, Escherichia coli/genetics, Escherichia coli/drug effects, Escherichia coli/enzymology, Escherichia coli/isolation & purification, Switzerland, beta-Lactamases/genetics, Humans, Escherichia coli Infections/microbiology, Escherichia coli Infections/epidemiology, Carrier State/microbiology, Carrier State/epidemiology, Anti-Bacterial Agents/pharmacology
2.
Water Res ; 254: 121374, 2024 May 01.
Article En | MEDLINE | ID: mdl-38422696

Intense rainfall and snowmelt events may affect the safety of drinking water, as large quantities of fecal material can be discharged from storm or sewage overflows or washed from the catchment into drinking water sources. This study combined β-D-glucuronidase activity (GLUC) with microbial source tracking (MST) markers (human, bovine, and porcine mitochondrial DNA (mtDNA) markers and human-associated Bacteroidales HF183), chemical source tracking (CST) markers (caffeine, carbamazepine, theophylline, and acetaminophen), pathogens (Giardia, Cryptosporidium, adenovirus, rotavirus, and enterovirus), water quality indicators (Escherichia coli, turbidity), and hydrometeorological data (flow rate, precipitation) to assess the vulnerability of three drinking water intakes (DWIs) and identify sources of fecal contamination. Water samples were collected under baseline conditions and during snowmelt and rain events in urban and agricultural catchments (Québec, Canada). Dynamics of E. coli, HF183, and wastewater micropollutants (WWMPs) were similar during contamination events, and concentrations generally varied over one order of magnitude during each event. Elevated human-associated marker levels during events demonstrated that urban DWIs were impacted by recent contamination from an upstream municipal water resource recovery facility (WRRF). In the agricultural catchment, mixed fecal pollution was observed, with occurrences and increases of enteric viruses and human, bovine, and porcine mtDNA during peak contamination events. Bovine mtDNA qPCR concentrations were indicative of runoff of cattle-derived fecal pollutants to the DWI from diffuse sources following rain events. This study demonstrated that the suitability of a given MST or CST indicator depends on river and catchment characteristics. A sampling strategy using continuous online GLUC activity coupled with MST and CST marker analysis was a more reliable indicator than turbidity for identifying peak events at drinking water intakes.


Cryptosporidiosis, Cryptosporidium, Drinking Water, Enterovirus, Animals, Cattle, Swine, Humans, Escherichia coli, Environmental Monitoring, DNA, Mitochondrial, Glucuronidase
3.
FEMS Microbes ; 4: xtad016, 2023.
Article En | MEDLINE | ID: mdl-37705999

Legionella are natural inhabitants of building plumbing biofilms, where interactions with other microorganisms influence their survival, proliferation, and death. Here, we investigated the associations of Legionella with bacterial and eukaryotic microbiomes in biofilm samples extracted from 85 shower hoses of a multiunit residential building. Legionella spp. relative abundance in the biofilms ranged from 0% to 7.8%, of which only 0-0.46% was L. pneumophila. Our data suggest that some microbiome members were associated with high (e.g. Chthonomonas, Vrihiamoeba) or low (e.g. Aquabacterium, Vannella) Legionella relative abundance. Correlations among the different Legionella variants (30 zero-radius OTUs detected) showed distinct patterns, suggesting separate ecological niches occupied by different Legionella species. This study provides insights into the ecology of Legionella with respect to: (i) the colonization of a large number of real shower hose biofilm samples; (ii) the ecological meaning of associations between Legionella and co-occurring bacterial/eukaryotic organisms; and (iii) critical points and future directions for microbial-interaction-based ecological investigations.
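Taxon-taxon associations of this kind are commonly screened with rank correlations between per-sample relative abundances. A minimal sketch on a toy OTU count table (placeholder taxa and values, not the study's data or pipeline):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Toy OTU table: rows = 85 biofilm samples, columns = taxa;
# column 0 stands in for Legionella.
counts = rng.poisson(lam=[5, 50, 20, 80], size=(85, 4))

# Per-sample relative abundances.
rel = counts / counts.sum(axis=1, keepdims=True)

# Spearman correlation of each taxon with Legionella relative abundance.
for j in range(1, rel.shape[1]):
    rho, p = stats.spearmanr(rel[:, 0], rel[:, j])
    print(f"taxon {j}: rho = {rho:.2f}, p = {p:.3f}")
```

Note that correlations computed on relative abundances are compositionally constrained, one of the "critical points" such investigations must address.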

4.
Environ Sci Technol Lett ; 10(4): 379-384, 2023 Apr 11.
Article En | MEDLINE | ID: mdl-37064823

Preventing failures of water treatment barriers can play an important role in meeting the increasing demand for microbiologically safe water. The development and integration of failure prevention strategies into quantitative microbial risk assessment (QMRA) offer opportunities to support the design and operation of treatment trains. This study presents existing failure models and extends them to guide the development of risk-based operational monitoring strategies. For barriers with rapid performance loss, results show that a failure of 15 s should be reliably detected to verify a log reduction value (LRV) of 6.0; thus, detecting and remediating these failures may be beyond current technology. For chemical disinfection with a residual, failure durations on the order of minutes should be reliably detected to verify an LRV of 6.0. Short-term failures are buffered because the disinfectant residual concentration sustains partial reduction performance. Therefore, increasing the contact time and hydraulic mixing reduces the impact of failures. These findings demonstrate the importance of defining precise monitoring frequencies for barrier performance during operation. Overall, this study highlights the utility of process-specific models for developing failure prevention strategies for water safety management.
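The buffering logic can be illustrated with a time-averaged LRV: over a verification window, the fraction of organisms passing is the duration-weighted average of the passing fractions during normal operation and during failure. A minimal sketch (parameter values are illustrative, not the paper's calibrated model):

```python
import numpy as np

def time_averaged_lrv(nominal_lrv, failure_lrv, failure_s, window_s):
    """Time-averaged log reduction over a verification window that
    contains one failure of the given duration."""
    frac_pass = (failure_s * 10.0**-failure_lrv
                 + (window_s - failure_s) * 10.0**-nominal_lrv) / window_s
    return -np.log10(frac_pass)

# A barrier rated 7.0-log that fails completely (LRV 0) for 60 s in a day:
print(time_averaged_lrv(7.0, 0.0, failure_s=60, window_s=86_400))  # ~3.2

# With a residual sustaining a partial 3.0-log reduction, the same
# failure is strongly buffered and the 6.0-log target is still verified:
print(time_averaged_lrv(7.0, 3.0, failure_s=60, window_s=86_400))  # ~6.1
```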

5.
Water Res ; 229: 119437, 2023 Feb 01.
Article En | MEDLINE | ID: mdl-36476383

Waterborne enteric viruses in lakes, especially at recreational water sites, may have a negative impact on human health. However, their fate and transport in lakes are poorly understood. In this study, we propose a coupled water quality and quantitative microbial risk assessment (QMRA) model to study the transport, fate, and infection risk of four common waterborne viruses (adenovirus, enterovirus, norovirus, and rotavirus), using Lake Geneva as a study site. The measured virus load in raw sewage entering the lake was used as the source term in the water quality simulations for a hypothetical scenario of discharging raw wastewater at the lake surface. After discharge into the lake, virus inactivation was modeled as a function of water temperature and solar irradiance, both of which varied spatially and temporally during transport throughout the lake. Finally, the probability of infection while swimming at a popular beach was quantified and compared among the four viruses. Norovirus was found to be the most abundant virus, causing an infection probability at least 10 times greater than that of the other viruses studied. Furthermore, environmental inactivation was found to be an essential determinant of the infection risks posed by viruses to recreational water users. We determined that infection risks from enterovirus and rotavirus could be up to 1,000 times lower when virus inactivation by environmental stressors was accounted for, compared with scenarios considering hydrodynamic transport only. Finally, the model highlighted the role of the wind field in conveying the contamination plume and hence in determining infection probability. Our simulations revealed that, for beaches located west of the sewage discharge, the infection probability under eastward wind was 43% lower than under westward wind conditions. This study highlights the potential of combining water quality simulation and virus-specific risk assessment for safe water resource use and management.
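The inactivation and risk chain described here can be sketched as first-order decay, with a rate depending on temperature and sunlight, followed by a dose-response step. A minimal sketch with illustrative coefficients (the paper's calibrated model and virus-specific parameters differ):

```python
import numpy as np

def inactivation_rate(temp_c, irradiance_w_m2,
                      k20=0.2, theta=1.07, k_sun=0.001):
    """First-order inactivation rate (1/day): temperature-corrected dark
    decay plus a sunlight-driven term. Coefficients are illustrative."""
    return k20 * theta ** (temp_c - 20.0) + k_sun * irradiance_w_m2

def surviving_fraction(rates_per_day, dt_days):
    """Integrate first-order decay along a simulated transport path."""
    return np.exp(-np.sum(np.asarray(rates_per_day) * dt_days))

def infection_probability(dose, r=0.014):
    """Exponential dose-response; r varies by virus and is a placeholder."""
    return 1.0 - np.exp(-r * dose)

# Example: 3 days of transport at 15 degrees C under moderate sunlight,
# then ingestion of 50 mL while swimming where the plume arrives at
# 0.1 virus/mL.
k = inactivation_rate(15.0, 300.0)
c_beach = 0.1 * surviving_fraction([k, k, k], 1.0)
print(infection_probability(c_beach * 50.0))
```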


Enterovirus, Norovirus, Viruses, Humans, Lakes, Sewage, Water Microbiology, Environmental Monitoring
6.
Water Res ; 205: 117707, 2021 Oct 15.
Article En | MEDLINE | ID: mdl-34619609

Minimum treatment requirements are set in response to established or anticipated levels of enteric pathogens in the source water of drinking water treatment plants (DWTPs). For surface water, contamination can be determined directly by monitoring reference pathogens or indirectly by measuring fecal indicators such as Escherichia coli (E. coli). In the latter case, a quantitative interpretation of E. coli for estimating reference pathogen concentrations could be used to define treatment requirements. This study presents a statistical analysis of paired E. coli and reference protozoa (Cryptosporidium, Giardia) data collected monthly for two years in source water from 27 DWTPs supplied by rivers in Canada. E. coli/Cryptosporidium and E. coli/Giardia ratios in source water were modeled as the ratio of two correlated lognormal variables. To evaluate the potential of E. coli for defining protozoa treatment requirements, risk-based critical mean protozoa concentrations in source water were determined with a reverse quantitative microbial risk assessment (QMRA) model. Model assumptions were selected to be consistent with the World Health Organization (WHO) Guidelines for drinking-water quality. The sensitivity of mean E. coli concentration trigger levels for identifying these critical concentrations in source water was then evaluated. Results showed no proportionality between the log of mean E. coli concentrations and the log of mean protozoa concentrations. E. coli/protozoa ratios at DWTPs supplied by small rivers in agricultural and forested areas were typically 1.0- to 2.0-log lower than at DWTPs supplied by large rivers in urban areas. The analysis of seasonal variations revealed that these differences were related to low mean E. coli concentrations during winter in small rivers. To achieve the WHO target of 10^-6 disability-adjusted life years (DALY) per person per year, a minimum reduction of 4.0-log of Cryptosporidium would be required for 20 DWTPs, and a minimum reduction of 4.0-log of Giardia would be needed for all DWTPs. A mean E. coli trigger level of 50 CFU/100 mL would be a sensitive threshold to identify critical mean concentrations for Cryptosporidium but not for Giardia. Treatment requirements higher than 3.0-log would be needed at DWTPs with mean E. coli concentrations as low as 30 CFU/100 mL for Cryptosporidium and 3 CFU/100 mL for Giardia. Therefore, an E. coli trigger level would have limited value for defining health-based treatment requirements for protozoa at DWTPs supplied by small rivers in rural areas.
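The ratio model has a convenient closed form: if ln X and ln Y are jointly normal with correlation rho, then ln(X/Y) is normal with mean mu_X - mu_Y and variance sigma_X^2 + sigma_Y^2 - 2*rho*sigma_X*sigma_Y, so the indicator/pathogen ratio is itself lognormal. A minimal sketch with placeholder parameters (not the study's fitted values):

```python
import numpy as np

rng = np.random.default_rng(7)

# Natural-log-scale parameters; placeholders, not the study's estimates.
mu_ec, sd_ec = np.log(100.0), 1.2    # E. coli (CFU/100 mL)
mu_cr, sd_cr = np.log(0.05), 1.0     # Cryptosporidium (oocysts/L)
rho = 0.5

cov = [[sd_ec**2, rho * sd_ec * sd_cr],
       [rho * sd_ec * sd_cr, sd_cr**2]]
log_ec, log_cr = rng.multivariate_normal([mu_ec, mu_cr], cov, 100_000).T

log_ratio = log_ec - log_cr
# Check the simulation against the closed form:
print(log_ratio.mean(), mu_ec - mu_cr)
print(log_ratio.var(), sd_ec**2 + sd_cr**2 - 2 * rho * sd_ec * sd_cr)
```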


Cryptosporidiosis, Cryptosporidium, Drinking Water, Escherichia coli, Humans, Risk Assessment, Rivers, Water Microbiology
7.
Water Res ; 200: 117296, 2021 Jul 15.
Article En | MEDLINE | ID: mdl-34098267

A monitoring strategy was implemented at two drinking water treatment plants in Quebec, Canada, to evaluate microbial reduction performances of full-scale treatment processes under different source water conditions. β-D-glucuronidase activity in source water was automatically monitored in near-real-time to establish baseline and event conditions at each location. High-volume water samples (50-1,500 L) were collected at the inflow and the outflow of coagulation/flocculation, filtration, and UV disinfection processes and were analysed for two naturally occurring surrogate organisms: Escherichia coli and Clostridium perfringens. Source water Cryptosporidium data and full-scale C. perfringens reduction data were entered into a quantitative microbial risk assessment (QMRA) model to estimate daily infection risks associated with exposure to Cryptosporidium via consumption of treated drinking water. Daily mean E. coli and Cryptosporidium concentrations in source water under event conditions were in the top 5% (agricultural site) or top 15% (urban site) of concentrations observed throughout the year at these drinking water treatment plants. Reduction performances of up to 6.0-log for E. coli and 5.6-log for C. perfringens were measured by concentrating high-volume water samples throughout the treatment train. For both drinking water treatment plants, removal performances of coagulation/flocculation/sedimentation processes were at the high end of the range reported in the literature for bacteria and bacterial spores. Reductions of E. coli and C. perfringens by floc blanket clarification, ballasted clarification, and rapid sand filtration did not deteriorate during two snowmelt/rainfall events. QMRA results suggested that daily infection risks during two rainfall/snowmelt events were similar to those during baseline conditions. Additional studies investigating full-scale reductions would be desirable to improve the evaluation of differences in treatment performances under various source water conditions.
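The reduction and risk calculations follow directly from paired concentrations: the log reduction value is log10 of the inflow/outflow ratio, and a dose-response model converts the resulting exposure into a daily infection risk. A minimal sketch (the dose-response parameter and consumption volume are common illustrative choices, not necessarily those of the study):

```python
import numpy as np

def log_reduction(c_in, c_out):
    """Log reduction value between two treatment points (same units)."""
    return np.log10(c_in / c_out)

def daily_infection_risk(c_source, lrv, volume_l=1.0, r=0.018):
    """Exponential dose-response for Cryptosporidium; r = 0.018 and
    1 L/day consumption are illustrative assumptions."""
    dose = c_source * 10.0 ** -lrv * volume_l
    return 1.0 - np.exp(-r * dose)

# C. perfringens counts per 100 L at the inflow and outflow of the
# treatment train (placeholder values):
print(log_reduction(4.0e5, 1.0))             # ~5.6-log
print(daily_infection_risk(0.05, lrv=5.6))   # source water at 0.05 oocysts/L
```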


Cryptosporidiosis, Cryptosporidium, Drinking Water, Water Purification, Canada, Escherichia coli, Humans, Quebec, Water Microbiology
8.
Water Res X ; 11: 100091, 2021 May 01.
Article En | MEDLINE | ID: mdl-33598650

This study investigates short-term fluctuations in virus concentrations in source water and their removal by full-scale drinking water treatment processes under different source water conditions. Transient peaks in raw water faecal contamination were identified using in situ online β-D-glucuronidase activity monitoring at two urban drinking water treatment plants. During these peaks, sequential grab samples were collected at the source and throughout the treatment train to evaluate concentrations of rotavirus, adenovirus, norovirus, enterovirus, JC virus, reovirus, astrovirus, and sapovirus by reverse transcription and real-time quantitative PCR. Virus infectivity was assessed through viral culture by measurement of the cytopathic effect and by integrated cell culture qPCR. Virus concentrations increased by approximately 0.5-log during two snowmelt/rainfall episodes and by approximately 1.0-log following a planned wastewater discharge upstream of the drinking water intake and during a β-D-glucuronidase activity peak in dry weather conditions. Increases in the removal of adenovirus and rotavirus by coagulation/flocculation processes were observed during peak virus concentrations in source water, suggesting that these processes operate not under steady-state conditions but under dynamic conditions in response to source water quality. Rotavirus and enterovirus detected in raw and treated water samples were predominantly negative in viral culture. At one site, infectious adenoviruses were detected in raw water and in water treated by a combination of ballasted clarification, ozonation, GAC filtration, and UV disinfection operated at a dose of 40 mJ/cm². The proposed sampling strategy can inform the understanding of virus concentration dynamics at drinking water treatment plants susceptible to de facto wastewater reuse.

9.
Risk Anal ; 41(8): 1413-1426, 2021 08.
Article En | MEDLINE | ID: mdl-33103797

Temporal variations in concentrations of pathogenic microorganisms in surface waters are well known to be influenced by hydrometeorological events. Reasonable methods for accounting for microbial peaks in the quantification of drinking water treatment requirements need to be addressed. Here, we applied a novel method for data collection and model validation to explicitly account for weather events (rainfall, snowmelt) when estimating pathogen concentrations in source water. Online in situ β-D-glucuronidase activity measurements were used to trigger sequential grab sampling of source water to quantify Cryptosporidium and Giardia concentrations during rainfall and snowmelt events at an urban and an agricultural drinking water treatment plant in Quebec, Canada. We then evaluated whether mixed Poisson distributions fitted to monthly sampling data (n = 30 samples) could accurately predict daily mean concentrations during these events. We found that the gamma distribution underestimated high Cryptosporidium and Giardia concentrations measured with routine or event-based monitoring, whereas the log-normal distribution accurately predicted these high concentrations. Selecting a log-normal distribution in preference to a gamma distribution increased the annual mean concentration by less than 0.1-log but increased the upper bound of the 95% credibility interval on the annual mean by about 0.5-log. Therefore, considering parametric uncertainty in an exposure assessment is essential to account for microbial peaks in risk assessment.
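The tail-behavior contrast described here can be reproduced by fitting both candidate distributions to the same routine monitoring data and comparing upper quantiles. A minimal sketch with synthetic concentrations (not the study's data):

```python
import numpy as np
from scipy import stats

# Synthetic routine-monitoring concentrations (oocysts/100 L).
conc = np.array([0.1, 0.2, 0.2, 0.4, 0.5, 0.5, 0.8, 1.0, 1.5, 4.0,
                 0.3, 0.6, 0.2, 0.9, 2.5, 0.1, 0.4, 0.7, 1.2, 8.0])

g = stats.gamma.fit(conc, floc=0)      # maximum-likelihood fits with
ln = stats.lognorm.fit(conc, floc=0)   # location fixed at zero

# The log-normal upper tail is typically heavier: it predicts the
# event-scale peaks that the gamma fit underestimates.
for q in (0.95, 0.99, 0.999):
    print(q, stats.gamma.ppf(q, *g), stats.lognorm.ppf(q, *ln))
```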


Cryptosporidiosis/parasitology, Drinking Water/parasitology, Giardia, Giardiasis/parasitology, Rain, Risk Assessment/methods, Snow, Cities, Cryptosporidiosis/prevention & control, Cryptosporidium, Environmental Monitoring, Giardiasis/prevention & control, Humans, Quebec, Rivers, Water Microbiology, Water Purification
10.
Risk Anal ; 41(8): 1396-1412, 2021 08.
Article En | MEDLINE | ID: mdl-33103818

The identification of appropriately conservative statistical distributions is needed to explicitly predict microbial peak events in drinking water sources. In this study, Poisson and mixed Poisson distributions with different upper tail behaviors were used to model source water Cryptosporidium and Giardia data from 30 drinking water treatment plants. Small differences (<0.5-log) were found between the "best" estimates of the mean Cryptosporidium and Giardia concentrations obtained with the Poisson-gamma and Poisson-log-normal models. However, the upper bound of the 95% credibility interval on the mean Cryptosporidium concentrations of the Poisson-log-normal model was considerably higher (>0.5-log) than that of the Poisson-gamma model at four sites. An improper choice of model may therefore mislead the assessment of treatment requirements and health risks associated with the water supply. Discrimination between models using the marginal deviance information criterion (mDIC) was unachievable because differences in upper tail behaviors were not well characterized with the available data sets (n < 30). Therefore, the gamma and log-normal distributions fit the data equally well but may yield different risk estimates when used as input distributions in an exposure assessment. The collection of event-based monitoring data and the modeling of larger routine monitoring data sets are recommended to identify appropriately conservative distributions for predicting microbial peak events.
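The sensitivity of the upper tail to the mixing distribution can be illustrated by simulation: draw concentrations from a gamma and a log-normal matched on mean and variance, then generate Poisson counts for an assayed volume. A minimal sketch with placeholder parameters:

```python
import numpy as np

rng = np.random.default_rng(42)
n, volume_l = 100_000, 100.0        # Monte Carlo draws; assayed volume

mean_c, sigma_ln = 0.5, 1.5         # oocysts/L and log-sd; placeholders

# Poisson-log-normal mixture.
mu_ln = np.log(mean_c) - 0.5 * sigma_ln**2
c_ln = rng.lognormal(mu_ln, sigma_ln, n)

# Poisson-gamma mixture with the same mean and variance.
var_c = (np.exp(sigma_ln**2) - 1.0) * mean_c**2
c_g = rng.gamma(mean_c**2 / var_c, var_c / mean_c, n)

counts_ln = rng.poisson(c_ln * volume_l)
counts_g = rng.poisson(c_g * volume_l)

# Means agree, but the predicted extreme counts diverge.
print(counts_ln.mean(), counts_g.mean())
print(np.percentile(counts_ln, 99.9), np.percentile(counts_g, 99.9))
```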


Cryptosporidiosis/parasitology, Drinking Water/parasitology, Giardia/parasitology, Giardiasis/parasitology, Water Microbiology, Bayes Theorem, Cryptosporidiosis/prevention & control, Cryptosporidium, Environmental Monitoring/methods, Giardiasis/prevention & control, Humans, Oocysts, Poisson Distribution, Risk Assessment/methods, Water Purification/methods, Water Supply
11.
Water Res ; 170: 115369, 2020 Mar 01.
Article En | MEDLINE | ID: mdl-31830653

In several jurisdictions, the arithmetic mean of Escherichia coli concentrations in raw water serves as the metric for setting minimum treatment requirements for drinking water treatment plants (DWTPs). An accurate and precise estimation of this mean is therefore critical to define adequate requirements. Distributions of E. coli concentrations in surface water can be heavily skewed and require statistical methods capable of characterizing uncertainty. We present four simple parametric models with different upper tail behaviors (gamma, log-normal, Lomax, and a mixture of two log-normal distributions) to explicitly account for the influence of peak events on the mean concentration. The performance of these models was tested using large E. coli data sets (200-1,800 samples) from raw water regulatory monitoring at six DWTPs located in urban and agricultural catchments. Critical seasons of contamination and hydrometeorological factors leading to peak events were identified. Event-based samples were collected at an urban DWTP intake during two hydrometeorological events using online β-D-glucuronidase activity monitoring as a trigger. Results from event-based sampling were used to verify whether the selected parametric distributions predicted the targeted peak events. We found that the upper tails of the log-normal and Lomax distributions predicted large concentrations better than the upper tail of the gamma distribution. Weekly sampling for two years in urban catchments and for four years in agricultural catchments generated reasonable estimates of the average raw water E. coli concentrations. The proposed methodology can be easily used to inform the development of sampling strategies and statistical indices to set site-specific treatment requirements.
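Three of the four candidate models can be fitted directly with scipy (the two-component log-normal mixture needs an EM-style fit and is omitted here). A minimal sketch on synthetic concentrations (not the regulatory data sets), comparing fitted upper tails:

```python
import numpy as np
from scipy import stats

# Synthetic raw water E. coli concentrations (CFU/100 mL).
ecoli = np.array([12.0, 30, 45, 60, 85, 100, 150, 220, 400, 900,
                  25, 70, 55, 35, 1800, 15, 300, 120, 90, 5000])

fits = {
    "gamma": (stats.gamma, stats.gamma.fit(ecoli, floc=0)),
    "lognorm": (stats.lognorm, stats.lognorm.fit(ecoli, floc=0)),
    "lomax": (stats.lomax, stats.lomax.fit(ecoli, floc=0)),
}

# Heavier-tailed models (log-normal, Lomax) place the 99.9th percentile
# far above the gamma fit, matching the peak-event behavior noted above.
for name, (dist, params) in fits.items():
    print(name, dist.ppf(0.999, *params))
```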


Drinking Water, Rivers, Agriculture, Environmental Monitoring, Escherichia coli, Water Microbiology
12.
Water Res ; 164: 114869, 2019 Nov 01.
Article En | MEDLINE | ID: mdl-31377523

Past waterborne outbreaks have demonstrated that informed vulnerability assessment of drinking water supplies is paramount for the provision of safe drinking water. Although current monitoring frameworks are not designed to account for short-term peak concentrations of fecal microorganisms in source waters, the recent development of online microbial monitoring technologies is expected to fill this knowledge gap. In this study, online near real-time monitoring of β-D-glucuronidase (GLUC) activity was conducted for 1.5 years at an urban drinking water intake impacted by multiple point sources of fecal pollution. Parallel routine and event-based monitoring of E. coli and online measurement of physico-chemical parameters were performed at the intake, and their dynamics were compared over time. GLUC activity fluctuated on time scales ranging from seasonal to hourly. All peak contamination episodes occurred between late fall and early spring following intense rainfall and/or snowmelt. In the absence of rainfall, recurrent daily fluctuations in GLUC activity and culturable E. coli were observed at the intake, a pattern otherwise missed by regulatory monitoring. Cross-correlation analysis of time series retrieved from the drinking water intake and an upstream water resource recovery facility (WRRF) demonstrated a hydraulic connection between the two sites. Sewage bypasses from the same WRRF were the main drivers of intermittent GLUC activity and E. coli peaks at the drinking water intake following intense precipitation and/or snowmelt. Near real-time monitoring of fecal pollution through GLUC activity enabled a thorough characterization of the frequency, duration, and amplitude of peak contamination periods at the urban drinking water intake while providing crucial information for identifying the dominant upstream fecal pollution sources. To the best of our knowledge, this is the first characterization of a hydraulic connection between a WRRF and a downstream drinking water intake across hourly to seasonal timescales using high-frequency microbial monitoring data. Ultimately, this should help improve source water protection through catchment mitigation actions, especially in a context of de facto wastewater reuse.
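A hydraulic connection of this kind is typically inferred from lagged cross-correlation between the upstream and downstream time series. A minimal sketch of the lag estimate for evenly sampled series (15-minute spacing is an assumption; the study's preprocessing is not reproduced here):

```python
import numpy as np

def lag_of_max_xcorr(upstream, downstream, dt_hours=0.25):
    """Lag (in hours) that maximizes the normalized cross-correlation;
    a positive value means the downstream signal trails the upstream one."""
    u = (upstream - upstream.mean()) / upstream.std()
    d = (downstream - downstream.mean()) / downstream.std()
    xcorr = np.correlate(d, u, mode="full") / len(u)
    lags = np.arange(-len(u) + 1, len(u))
    return lags[np.argmax(xcorr)] * dt_hours

# GLUC series at the WRRF and the intake, the latter a noisy copy
# delayed by 20 samples (5 h); synthetic data for illustration.
rng = np.random.default_rng(3)
wrrf = rng.lognormal(0.0, 1.0, 2_000)
intake = np.roll(wrrf, 20) + rng.normal(0.0, 0.1, 2_000)
print(lag_of_max_xcorr(wrrf, intake))   # ~5.0 hours
```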


Drinking Water, Wastewater, Environmental Monitoring, Escherichia coli, Feces, Glucuronidase, Water Microbiology, Water Pollution, Water Supply
13.
Sci Total Environ ; 683: 547-558, 2019 Sep 15.
Article En | MEDLINE | ID: mdl-31146060

Urban source water protection planning requires the characterization of sources of contamination upstream of drinking water intakes. Elevated pathogen concentrations following combined sewer overflows (CSOs) represent a threat to human health. Quantifying peak pathogen concentrations at the intakes of drinking water plants is a challenge due to the variability of CSO occurrences and uncertainties regarding the fate and transport mechanisms from discharge points to source water supplies. Here, a two-dimensional deterministic hydrodynamic and water quality model is used to study fluvial contaminant transport and the impacts of upstream CSO discharges on downstream concentrations of Escherichia coli in the raw water supply of two drinking water plants located on a large river. Dynamic CSO loading characteristics were considered for a variety of discharges. Because of limited Cryptosporidium data, a probability distribution of the E. coli-to-Cryptosporidium ratio based on historical data was used to estimate microbial risk from simulated CSO-induced E. coli concentrations. During optimal operational performance of the plants, the daily risk target was met (based on the mean concentration during the peak) for 80% to 90% of CSO events. Under suboptimal performance, these values dropped to 40% to 55%. The mean annual microbial risk following CSO discharge events depended more on treatment performance than on the number of CSO occurrences. The effect of CSO-associated short-term risk on the mean annual risk is largely dependent on treatment performance as well as on the representativeness of the baseline condition at the intakes, demonstrating the need for assessment of treatment efficacy. The results of this study provide water utilities and managers with a tool to investigate potential alternatives for reducing the microbial risk associated with CSOs.
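The ratio-based risk step can be sketched as a Monte Carlo: sample the E. coli-to-Cryptosporidium ratio from a historical distribution, convert the simulated E. coli peak into an oocyst concentration, apply a treatment scenario and dose-response, and compare against a daily risk target. All parameters below are placeholders, not the study's calibrated values:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

ecoli_peak = 2_000.0          # simulated CSO peak at the intake, CFU/100 mL

# Log10 E. coli : Cryptosporidium ratio from historical paired data;
# the normal parameters are placeholders.
log10_ratio = rng.normal(4.0, 0.8, n)
crypto_per_100ml = ecoli_peak / 10.0 ** log10_ratio
crypto_per_l = crypto_per_100ml * 10.0

lrv = 4.0                                     # treatment scenario
dose = crypto_per_l * 10.0 ** -lrv * 1.0      # 1 L/day consumption
risk = 1.0 - np.exp(-0.018 * dose)            # exponential dose-response

daily_target = 1e-6 / 365                     # illustrative daily target
print((risk < daily_target).mean())           # fraction of draws meeting it
```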


Drinking Water, Environmental Monitoring/methods, Hydrodynamics, Rivers/chemistry, Water Microbiology, Water Quality