1.
Sci Total Environ ; 929: 172448, 2024 Jun 15.
Article in English | MEDLINE | ID: mdl-38615775

ABSTRACT

This study establishes site-specific risk-based threshold (RBT) concentrations for sewage-associated markers, including Bacteroides HF183 (HF183), Lachnospiraceae Lachno3 (Lachno3), cross-assembly phage (CrAssphage), and pepper mild mottle virus (PMMoV), utilizing quantitative microbial risk assessment (QMRA) for recreational estuarine waters (EW). The QMRA model calculates an RBT concentration corresponding to a selected target illness risk for ingestion of EW contaminated with untreated sewage. RBT concentrations were estimated considering site-specific decay rates and concentrations of markers and a reference pathogen (human norovirus; HNoV), aiding in the identification of high-risk days during the swimming season. Results indicated varying RBT concentrations for fresh (Day 0) and aged (Days 1 to 10) sewage contamination scenarios over 10 days. HF183 exhibited the highest RBT concentration (26,600 gene copies (GC)/100 mL) initially but decreased rapidly with aging (2570 to 3120 GC/100 mL on Day 10) depending on the decay rates, while Lachno3 and CrAssphage remained relatively stable. PMMoV, despite a lower initial RBT (3920 GC/100 mL), exhibited increased RBT (4700 to 6440 GC/100 mL) with aging due to its slower decay rate compared to HNoV. Sensitivity analysis revealed HNoV concentration as the most influential parameter. Comparison of marker concentrations at estuarine locations with RBT concentrations showed instances of marker exceedance, suggesting days of potentially higher risk. The observed discrepancies between bacterial and viral marker concentrations in EW highlight the need for optimized sample concentration methods and simultaneous measurement of multiple markers for enhanced risk predictions. Future research will explore the utility of multiple markers in risk management.
Overall, this study contributes to a better understanding of human health risks in recreational waters, aiding regulators and water quality managers in effective decision-making for risk prioritization and mitigation strategies.
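The RBT calculation described in this abstract can be sketched as follows. The dose-response form, its parameter, the ingestion volume, and the marker/HNoV concentrations and decay rates below are illustrative assumptions, not the study's fitted values:

```python
import math

def allowable_hnov(target_risk, ingestion_ml=30.0, k_dr=0.72):
    # Invert an exponential dose-response (an assumed stand-in for the
    # study's HNoV model): risk = 1 - exp(-k * dose)
    dose = -math.log(1.0 - target_risk) / k_dr
    return dose * 100.0 / ingestion_ml          # allowable GC per 100 mL

def rbt(marker0, hnov0, k_marker, k_hnov, age_days, target_risk=0.032):
    """Risk-based threshold (GC/100 mL) for a marker in sewage aged `age_days`.

    The fresh-sewage marker:HNoV ratio is adjusted for their relative
    first-order decay, then multiplied by the allowable HNoV concentration.
    """
    ratio = (marker0 / hnov0) * math.exp(-(k_marker - k_hnov) * age_days)
    return ratio * allowable_hnov(target_risk)

# A marker that decays faster than HNoV (e.g. HF183) has a falling RBT with
# age; one that decays slower (e.g. PMMoV) has a rising RBT.
```

This reproduces the qualitative pattern reported above: the aging direction of each marker's RBT is set by the sign of its decay rate relative to HNoV's.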


Subject(s)
Environmental Monitoring , Estuaries , Sewage , Risk Assessment , Environmental Monitoring/methods , Water Microbiology , Tobamovirus , Swimming , Biomarkers/analysis
3.
Environ Sci Technol ; 57(49): 20542-20550, 2023 Dec 12.
Article in English | MEDLINE | ID: mdl-38014848

ABSTRACT

Influenza A virus (IAV) causes significant morbidity and mortality in the United States and has pandemic potential. Identifying IAV epidemic patterns is essential to inform the timing of vaccinations and nonpharmaceutical interventions. In a prospective, longitudinal study design, we measured IAV RNA in wastewater settled solids at 163 wastewater treatment plants across 33 states to characterize the 2022-2023 influenza season at the state, Health and Human Services (HHS) regional, and national scales. Influenza season onset, offset, duration, peak, and intensity determined using IAV RNA in wastewater were compared with those determined using laboratory-confirmed influenza hospitalization rates and outpatient visits for influenza-like illness (ILI). Onsets for HHS regions as determined by IAV RNA in wastewater roughly corresponded with those determined using ILI when the annual geometric mean of the IAV RNA concentration was used as the baseline (i.e., the threshold that triggers onset), although the offsets determined by the two approaches differed. IAV RNA in wastewater provided early warning of onset, compared to the ILI estimate, when the baseline was set at twice the limit of IAV RNA detection in wastewater. Peaks determined by IAV RNA in wastewater generally preceded peaks determined by the IAV hospitalization rate by 2 weeks or less. IAV RNA in wastewater settled solids is an IAV-specific indicator that can be used to augment clinical surveillance for seasonal influenza epidemic timing and intensity.
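A minimal sketch of the geometric-mean-baseline onset rule described above; the persistence requirement and the handling of nondetects are illustrative assumptions, not the study's exact definitions:

```python
import math

def season_onset(conc, baseline=None, persistence=3):
    """Index of influenza season onset in a wastewater IAV RNA series.

    Onset is the first sample whose concentration exceeds the baseline and
    stays above it for `persistence` consecutive samples. By default the
    baseline is the annual geometric mean of positive measurements, one of
    the two baseline choices compared in the study.
    """
    if baseline is None:
        positive = [c for c in conc if c > 0]
        baseline = math.exp(sum(math.log(c) for c in positive) / len(positive))
    above = [c > baseline for c in conc]
    for i in range(len(above) - persistence + 1):
        if all(above[i:i + persistence]):
            return i, baseline
    return None, baseline
```

Passing `baseline=2 * limit_of_detection` instead of the default reproduces the study's second, earlier-warning baseline choice.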


Subject(s)
Influenza, Human , United States/epidemiology , Humans , Influenza, Human/epidemiology , Wastewater , Seasons , Longitudinal Studies , Prospective Studies , Hospitalization , RNA
4.
Environ Sci Technol ; 57(26): 9559-9566, 2023 07 04.
Article in English | MEDLINE | ID: mdl-37342916

ABSTRACT

Pathogen log10 reduction targets for onsite nonpotable water systems were calculated using both annual infection (LRT_INF) and disability-adjusted life year (LRT_DALY) benchmarks. The DALY is a measure of the health burden of a disease, accounting for both the severity and duration of illness. Results were evaluated to identify whether treatment requirements change when accounting for the likelihood, duration, and severity of illness in addition to the likelihood of infection. The benchmarks of 10-4 infections per person per year (ppy) and 10-6 DALYs ppy were adopted along with multilevel dose-response models for Norovirus and Campylobacter jejuni, which characterize the probability of illness given infection (P_ill|inf) as dose-dependent using challenge or outbreak data. We found differences between treatment requirements, LRT_INF - LRT_DALY, for some pathogens, driven by the likelihood of illness rather than the severity of illness. For pathogens with dose-independent P_ill|inf characterizations, such as Cryptosporidium spp., Giardia, and Salmonella enterica, the difference, LRT_INF - LRT_DALY, was identical across reuse scenarios (
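The two benchmark calculations can be sketched with a simple root find. The dose-response form and slope, exposure values, and DALY-per-case figure below are illustrative placeholders, and the study's dose-dependent probability of illness given infection is simplified here to a constant:

```python
import math

def annual_inf_risk(lrt, conc, vol_l, events, k_dr):
    # Exponential dose-response (illustrative) applied after 10^-LRT treatment
    dose = conc * 10.0 ** (-lrt) * vol_l
    p_event = 1.0 - math.exp(-k_dr * dose)
    return 1.0 - (1.0 - p_event) ** events

def solve_lrt(f, lo=0.0, hi=20.0, tol=1e-10):
    # Bisection: f is decreasing in LRT and crosses zero inside [lo, hi]
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def lrt_pair(conc=1e5, vol_l=1e-3, events=365, k_dr=0.2,
             p_ill_inf=0.6, daly_per_case=1e-3):
    """LRT meeting 10^-4 infections ppy, and LRT meeting 10^-6 DALYs ppy."""
    lrt_inf = solve_lrt(lambda x: annual_inf_risk(x, conc, vol_l, events, k_dr) - 1e-4)
    lrt_daly = solve_lrt(lambda x: annual_inf_risk(x, conc, vol_l, events, k_dr)
                         * p_ill_inf * daly_per_case - 1e-6)
    return lrt_inf, lrt_daly
```

With a constant probability of illness given infection, the two targets differ by a fixed offset; the paper's point is that a dose-dependent characterization can change that relationship for some pathogens.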

Subject(s)
Cryptosporidiosis , Cryptosporidium , Water Purification , Humans , Disability-Adjusted Life Years , Cryptosporidiosis/epidemiology , Benchmarking , Risk Assessment
5.
Water Res ; 233: 119742, 2023 Apr 15.
Article in English | MEDLINE | ID: mdl-36848851

ABSTRACT

Onsite non-potable water systems (ONWS) collect and treat local source waters for non-potable end uses such as toilet flushing and irrigation. Quantitative microbial risk assessment (QMRA) has been used to set pathogen log10-reduction targets (LRTs) for ONWS to achieve the risk benchmark of 10-4 infections per person per year (ppy) in a series of two efforts completed in 2017 and 2021. In this work, we compare and synthesize the ONWS LRT efforts to inform the selection of pathogen LRTs. For onsite wastewater, greywater, and stormwater, LRTs for human enteric viruses and parasitic protozoa were within 1.5-log10 units between the 2017 and 2021 efforts, despite differences in the approaches used to characterize pathogens in these waters. For onsite wastewater and greywater, the 2017 effort used an epidemiology-based model to simulate pathogen concentrations contributed exclusively from onsite waste and selected Norovirus as the reference viral pathogen; the 2021 effort used municipal wastewater pathogen data and cultivable adenoviruses as the reference viral pathogen. Across source waters, the greatest differences occurred for viruses in stormwater, given the newly available municipal wastewater characterizations used for modeling sewage contributions in 2021 and the different selection of reference pathogens (Norovirus vs. adenoviruses). The roof runoff LRTs support the need for protozoa treatment, but these remain difficult to characterize due to pathogen variability in roof runoff across space and time. The comparison highlights the adaptability of the risk-based approach, allowing LRTs to be updated as site-specific or improved information becomes available. Future research efforts should focus on data collection for onsite water sources.


Subject(s)
Drinking Water , Norovirus , Viruses , Humans , Wastewater , Sewage , Risk Assessment , Adenoviridae
6.
ACS ES T Water ; 2(11): 2167-2174, 2022 Nov 11.
Article in English | MEDLINE | ID: mdl-36380770

ABSTRACT

Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) RNA concentrations in wastewater settled solids correlate well with coronavirus disease 2019 (COVID-19) incidence rates (IRs). Here, we develop distributed lag models to estimate IRs using concentrations of SARS-CoV-2 RNA from wastewater solids and investigate the impact of sampling frequency on model performance. SARS-CoV-2 N gene and pepper mild mottle virus (PMMoV) RNA concentrations were measured daily at four wastewater treatment plants in California. Artificially reduced data sets were produced for each plant with sampling frequencies of once every 2, 3, 4, and 7 days. Sewershed-specific models that related daily N/PMMoV to IR were fit for each sampling frequency with data from mid-November 2020 through mid-July 2021, which included the period of time during which Delta emerged. Models were used to predict IRs during a subsequent out-of-sample time period. When sampling occurred at least once every 4 days, the in- and out-of-sample root-mean-square error changed by <7 cases/100 000 compared to daily sampling across sewersheds. This work illustrates that real-time, daily predictions of IR are possible with small errors, despite changes in circulating variants, when sampling frequency is once every 4 days or more. However, reduced sampling frequency may not serve other important wastewater surveillance use cases.
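A minimal distributed-lag sketch in the spirit of the models described above: ordinary least squares of daily incidence on lagged N/PMMoV values. The sewershed-specific models in the study are more elaborate, so the lag count and function names here are illustrative:

```python
import numpy as np

def lag_matrix(x, n_lags):
    # Row for day t holds [1, x_t, x_{t-1}, ..., x_{t-n_lags+1}]
    cols = [x[n_lags - 1 - k: len(x) - k] for k in range(n_lags)]
    return np.column_stack([np.ones(len(cols[0]))] + cols)

def fit_distributed_lag(ratio, incidence, n_lags=7):
    """Least-squares fit of IR_t ~ b0 + sum_k b_k * (N/PMMoV)_{t-k}."""
    X = lag_matrix(np.asarray(ratio, dtype=float), n_lags)
    y = np.asarray(incidence, dtype=float)[n_lags - 1:]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

def predict(beta, ratio, n_lags=7):
    # Out-of-sample prediction with the fitted lag coefficients
    return lag_matrix(np.asarray(ratio, dtype=float), n_lags) @ beta
```

Fitting on one time window and calling `predict` on a later window mirrors the in-sample/out-of-sample evaluation the study uses to assess reduced sampling frequencies.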

7.
Environ Sci Technol ; 55(22): 15246-15255, 2021 11 16.
Article in English | MEDLINE | ID: mdl-34699171

ABSTRACT

The annual risks of colonization, skin infection, bloodstream infection (BSI), and disease burden from exposures to antibiotic-resistant and susceptible Staphylococcus aureus (S. aureus) were estimated using quantitative microbial risk assessment (QMRA). We estimated the probability of nasal colonization after immersion in wastewater (WW) or greywater (GW) treated across a range of treatment alternatives and subsequent infection. Horizontal gene transfer was incorporated into the treatment model but had little effect on the predicted risk. The cumulative annual probability of infection (resulting from self-inoculation) was most sensitive to the treatment log10 reduction value (LRV), S. aureus concentration, and the newly calculated morbidity ratios and was below the health benchmark of 10-4 infections per person per year (ppy) given a treatment LRV of roughly 3.0. The predicted annual disability-adjusted life years (DALYs), which were dominated by BSI, were below the health benchmark of 10-6 DALYs ppy for resistant and susceptible S. aureus, given LRVs of 4.5 and 3.5, respectively. Thus, the estimated infection risks and disease burdens resulting from nasal colonization are below the relevant health benchmarks for risk-based, nonpotable, or potable reuse systems but possibly above for immersion in minimally treated GW or WW. Strain-specific data to characterize dose-response and concentration in WW are needed to substantiate the QMRA.


Subject(s)
Communicable Diseases , Staphylococcus aureus , Anti-Bacterial Agents , Communicable Diseases/drug therapy , Humans , Risk Assessment , Wastewater
8.
Environ Sci Technol ; 54(20): 13101-13109, 2020 10 20.
Article in English | MEDLINE | ID: mdl-32969642

ABSTRACT

Fecal pollution at beaches can pose a health risk to recreators. Quantitative microbial risk assessment (QMRA) is a tool to evaluate the use of candidate fecal indicators to signify a health risk from enteric pathogens in sewage-impacted waters. We extend the QMRA approach to model mixtures of sewage at different ages using genetic marker concentrations for human-associated crAssphage, Bacteroides spp., and polyomavirus in sewage samples from 49 wastewater facilities across the contiguous United States. Risk-based threshold (RBT) estimates varied across different mixture and sewage age scenarios. Fresh sewage RBT estimates were not always protective when aged sewage was present, and aged sewage RBT estimates often fell below the marker lower limit of quantification. Conservative RBT estimates of 9.3 × 102 and 9.1 × 103 (copies/100 mL) for HF183/BacR287 and CPQ_056, respectively, were predicted when fresh sewage was greater (by volume) than aged at the time of measurement. Conversely, genetic markers may not be effective indicators when aged sewage contributes the majority of pathogens, relative to fresh contamination, but minimal marker levels. Results highlight the utility of QMRA that incorporates pollutant age and mixture scenarios, the potential advantages of a crAssphage fecal indicator, and the potential influence of site-specific factors on estimating RBT values.


Subject(s)
Environmental Monitoring , Sewage , Bacteria , Feces/chemistry , Genetic Markers , Humans , Sewage/analysis , Wastewater , Water Microbiology , Water Pollution
9.
J Water Health ; 18(3): 331-344, 2020 Jun.
Article in English | MEDLINE | ID: mdl-32589619

ABSTRACT

Water quality standards (WQSs) based on water quality measures (e.g., fecal indicator bacteria (FIB)) have been used by regulatory agencies to assess onsite, non-potable water reuse systems. A risk-based approach, based on quantitative microbial risk assessment, was developed to define treatment requirements that achieve benchmark levels of risk. This work compared these approaches using the predicted annual infection risks for non-potable reuse systems that comply with WQSs along with the benchmark risk levels achieved by the risk-based systems. The systems include a recirculating synthetic sand filter or an aerobic membrane bioreactor (MBR) combined with disinfection. The greywater MBR system had predicted risks in the range of the selected benchmark levels. However, wastewater reuse with systems that comply with WQSs had uncertain and potentially high predicted risks (i.e., >10-2 infections per person per year) in residential applications, due to exposures to viruses and protozoa. The predicted risks illustrate that WQSs based on FIB treatment performance do not ensure adequate treatment removal of viruses and protozoa. We present risk-based log10 pathogen reduction targets for intermediate-sized non-potable systems, which are 0.5 log10 less than those previously proposed for district-sized systems. Still, pathogen treatment performance data are required to better manage non-potable reuse risk.


Subject(s)
Water Purification , Water Quality , Disinfection , Quality Indicators, Health Care , Wastewater
10.
Environ Sci Technol Lett ; 7(12): 943-947, 2020.
Article in English | MEDLINE | ID: mdl-33409324

ABSTRACT

Exposure factors (e.g., ingestion volume and frequency) are required to establish risk-based treatment requirements (i.e., log10 reduction targets (LRTs)) for enteric pathogens using quantitative microbial risk assessment (QMRA). However, data to characterize nonpotable exposure factors are sparse. We calculated graywater and wastewater nonpotable LRTs (corresponding to 10-4 infections per person per year) for uses missing detailed exposure data (including showering and decorative fountains) and across a range of exposure factors. The LRTs decreased linearly toward zero as the log10-transformed volume or the frequency of reuse decreased. When nonroutine exposure was included, representing either accidental ingestion from misuse or cross-connection between potable and nonpotable waters, the LRTs remained high, even as the routine ingestion volume decreased. Therefore, uses with small anticipated routine ingestion volumes (i.e., roughly <10-5 L), e.g., domestic indoor or decorative fountain uses, share common LRTs, and further refinement of the routine exposure is of limited value. Additional data to characterize nonroutine exposures and uses with high routine ingestion, e.g., showering, remain valuable to better estimate LRTs. These results will assist regulators in the selection of LRTs for nonpotable uses that lack detailed exposure factor characterizations.
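The linear decrease and the nonroutine floor described above fall out of a low-dose approximation; the concentration, dose-response slope, and event sizes below are illustrative assumptions:

```python
import math

def lrt(conc_gc_l, vol_routine_l, freq_routine_yr,
        vol_nonroutine_l=0.0, freq_nonroutine_yr=0.0,
        k_dr=0.2, target=1e-4):
    """LRT meeting an annual infection target of `target` ppy.

    Uses a low-dose (linear) exponential dose-response, so annual risk is
    proportional to the expected annual ingested dose; all numeric defaults
    are illustrative placeholders.
    """
    annual_dose = conc_gc_l * (vol_routine_l * freq_routine_yr
                               + vol_nonroutine_l * freq_nonroutine_yr)
    return max(0.0, math.log10(k_dr * annual_dose / target))
```

Each tenfold drop in routine volume lowers the LRT by one log10 until a nonroutine term (e.g., a rare 0.1 L accidental ingestion) dominates the annual dose, after which the LRT flattens out, which is why further refining tiny routine volumes has limited value.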

11.
Microb Risk Anal ; 9: 72-81, 2018 Aug.
Article in English | MEDLINE | ID: mdl-35280215

ABSTRACT

We assessed the annual probability of infection resulting from non-potable exposures to distributed greywater and domestic wastewater treated by an aerobic membrane bioreactor (MBR) followed by chlorination. A probabilistic quantitative microbial risk assessment was conducted for both residential and office buildings and a residential district using Norovirus, Rotavirus, Campylobacter jejuni, and Cryptosporidium spp. as reference pathogens. A Monte Carlo approach captured variation in pathogen concentration in the collected water and pathogen (or microbial surrogate) treatment performance, when available, for various source water and collection scale combinations. Uncertain inputs such as dose-response relationships and the volume ingested were treated deterministically and explored through sensitivity analysis. The predicted 95th percentile annual risks for non-potable indoor reuse of distributed greywater and domestic wastewater at district and building scales were less than the selected health benchmark of 10-4 infections per person per year (ppy) for all pathogens except Cryptosporidium spp., given the selected exposure (which included occasional, accidental ingestion), dose-response, and treatment performance assumptions. For Cryptosporidium spp., the 95th percentile annual risks for reuse of domestic wastewater (for all selected collection scenarios) and district-collected greywater were greater than the selected health benchmark when using the limited, available MBR treatment performance data; this finding is counterintuitive given the large size of Cryptosporidium spp. relative to the MBR pores. Therefore, additional data on MBR removal of protozoa is required to evaluate the proposed MBR treatment process for non-potable reuse. 
Although the predicted Norovirus annual risks were small across scenarios (less than 10-7 infections ppy), the risks for Norovirus remain uncertain, in part because the treatment performance is difficult to interpret given that the ratio of total to infectious viruses in the raw and treated effluents remains unknown. Overall, the differences in pathogen characterization between collection type (i.e., office vs. residential) and scale (i.e., district vs. building) drove the differences in predicted risk; and, the accidental ingestion event (although modeled as rare) determined the annual probability of infection. The predicted risks resulting from treatment malfunction scenarios indicated that online, real-time monitoring of both the MBR and disinfection processes remains important for non-potable reuse at distributed scales. The resulting predicted health risks provide insight on the suitability of MBR treatment for distributed, non-potable reuse at different collection scales and the potential to reduce health risks for non-potable reuse.
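A compact Monte Carlo sketch of the annual-risk calculation described above. The distributions, ingestion volumes, and dose-response slope are illustrative stand-ins for the study's pathogen-specific inputs, and, as in the paper, the dose-response is treated deterministically:

```python
import numpy as np

def annual_risk_mc(n_sims=10_000, n_days=365, seed=1):
    """Monte Carlo distribution of annual infection probability.

    Each simulated year draws a source-water concentration and a treatment
    log10 reduction value; exposure combines routine daily ingestion with
    one rare accidental ingestion event.
    """
    rng = np.random.default_rng(seed)
    log_conc = rng.normal(3.0, 1.0, n_sims)    # log10 GC/L in source water
    lrv = rng.uniform(5.0, 6.5, n_sims)        # treatment performance
    conc = 10.0 ** (log_conc - lrv)            # treated-water conc, GC/L
    daily_dose = conc * 1e-5                   # routine 0.01 mL/day ingestion
    accidental = conc * 0.02                   # one 20 mL accidental event
    k = 0.1                                    # exponential dose-response slope
    p_day = 1.0 - np.exp(-k * daily_dose)
    p_acc = 1.0 - np.exp(-k * accidental)
    return 1.0 - (1.0 - p_day) ** n_days * (1.0 - p_acc)

risks = annual_risk_mc()
p95 = np.percentile(risks, 95)   # compare to the 10-4 ppy benchmark
```

Reporting the 95th percentile of `risks`, rather than the mean, matches the benchmark comparison used in the abstract.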

12.
Water (Basel) ; 10(10)2018 10.
Article in English | MEDLINE | ID: mdl-31297273

ABSTRACT

We used quantitative microbial risk assessment (QMRA) to estimate the microbial risks from two contamination pathways in onsite non-potable water systems (ONWS): contamination of potable water by (treated) reclaimed, non-potable water and contamination of reclaimed, non-potable water by wastewater or greywater. A range of system sizes, event durations, fractions of users exposed, and intrusion dilutions were considered (chlorine residual disinfection was not included). The predicted annual microbial infection risk from domestic, non-potable reuse remained below the selected benchmark given isolated, short-duration (i.e., 5-day) intrusion events of reclaimed water in potable water. In contrast, intrusions of wastewater into reclaimed, non-potable water resulted in unacceptable annual risk without large dilutions or pathogen inactivation. We predicted that 1 user out of 10,000 could be exposed to a 5-day contamination event of undiluted wastewater in the reclaimed, non-potable water system each year and still meet the annual benchmark risk of 10-4 infections per person per year, whereas 1 user out of 1000 could be exposed to a 5-day contamination event of undiluted reclaimed water in the potable water each year. Overall, the predicted annual risks support the use of previously derived non-potable reuse treatment requirements for a variety of ONWS sizes and support the prioritization of protective measures to prevent the intrusion of wastewater into domestic ONWS.

13.
Risk Anal ; 37(2): 245-264, 2017 Feb.
Article in English | MEDLINE | ID: mdl-27285380

ABSTRACT

The application of quantitative microbial risk assessments (QMRAs) to understand and mitigate risks associated with norovirus is increasingly common, as there is a high frequency of outbreaks worldwide. A key component of QMRA is the dose-response analysis, which is the mathematical characterization of the association between dose and outcome. For norovirus, multiple dose-response models are available that assume either a disaggregated or an aggregated intake dose. This work reviewed the dose-response models currently used in QMRA and compared predicted risks from waterborne exposures (recreational and drinking) using all available dose-response models. The review found that the majority of published QMRAs of norovirus use the 1F1 hypergeometric dose-response model with α = 0.04, β = 0.055. This dose-response model predicted relatively high risk estimates compared to other dose-response models for doses in the range of 1-1,000 genomic equivalent copies. The difference in predicted risk among dose-response models was largest for small doses, which has implications for drinking water QMRAs where the concentration of norovirus is low. Based on the review, a set of best practices was proposed to encourage the careful consideration and reporting of important assumptions in the selection and use of dose-response models in QMRA of norovirus. Finally, in the absence of one best norovirus dose-response model, multiple models should be used to provide a range of predicted outcomes for the probability of infection.
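The 1F1 model named above is the exact beta-Poisson, usually written P(inf) = 1 - 1F1(α, α + β, -dose), and can be evaluated directly with SciPy's confluent hypergeometric function. The parameter pair is the one the review reports as most common; treating the dose as disaggregated virions is an assumption of this sketch:

```python
from scipy.special import hyp1f1

def p_infection(dose_gec, alpha=0.04, beta=0.055):
    """Exact beta-Poisson (1F1 hypergeometric) dose-response for norovirus.

    P(inf) = 1 - 1F1(alpha, alpha + beta, -dose); dose is in genomic
    equivalent copies, assuming a disaggregated intake dose.
    """
    return 1.0 - hyp1f1(alpha, alpha + beta, -dose_gec)
```

The model's characteristic feature is high low-dose risk: even a dose of a few genomic equivalent copies yields a nontrivial infection probability, which is why model choice matters most for low-concentration drinking water QMRAs.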


Subject(s)
Caliciviridae Infections/prevention & control , Drinking Water/virology , Environmental Monitoring/methods , Risk Assessment/methods , Water Microbiology , Humans , Models, Theoretical , Norovirus/genetics , Recreation , Software , Wastewater/virology
14.
Microb Risk Anal ; 5: 32-43, 2017.
Article in English | MEDLINE | ID: mdl-31534999

ABSTRACT

This paper presents risk-based enteric pathogen log reduction targets for non-potable and potable uses of a variety of alternative source waters (i.e., locally-collected greywater, roof runoff, and stormwater). A probabilistic Quantitative Microbial Risk Assessment (QMRA) was used to derive the pathogen log10 reduction targets (LRTs) that corresponded with an infection risk of either 10-4 per person per year (ppy) or 10-2 ppy. The QMRA accounted for variation in pathogen concentration and sporadic pathogen occurrence (when data were available) in source waters for reference pathogens in the genera Rotavirus, Mastadenovirus (human adenoviruses), Norovirus, Campylobacter, Salmonella, Giardia, and Cryptosporidium. Non-potable uses included indoor use (for toilet flushing and clothes washing) with occasional accidental ingestion of treated non-potable water (or cross-connection with potable water), and unrestricted irrigation for outdoor use. Various exposure scenarios captured the uncertainty from key inputs, i.e., the pathogen concentration in source water; the volume of water ingested; and, for the indoor use, the frequency of and the fraction of the population exposed to accidental ingestion. Both potable and non-potable uses required pathogen treatment for the selected waters, and the LRT was generally greater for potable use than for non-potable indoor use and unrestricted irrigation. The difference in treatment requirements among source waters was driven by the microbial quality of the water - both the density and occurrence of reference pathogens. Greywater from collection systems serving 1000 people had the highest LRTs; those for greywater collected from a smaller population (~5 people), where pathogen occurrences are less frequent, were lower. Stormwater had highly variable microbial quality, which resulted in a range of possible treatment requirements.
The microbial quality of roof runoff, and thus the resulting LRTs, remains uncertain due to lack of relevant pathogen data.

15.
Microb Risk Anal ; 5: 44-52, 2017 Apr.
Article in English | MEDLINE | ID: mdl-30148198

ABSTRACT

As decentralized water reuse continues to gain popularity, risk-based treatment guidance is increasingly sought for the protection of public health. However, efforts to evaluate pathogen risks and log-reduction requirements have been hindered by an incomplete understanding of pathogen occurrence and densities in locally-collected wastewaters (i.e., from decentralized collection systems). Of particular interest are the potentially high enteric pathogen concentrations in small systems with an active infected excreter, but the generally lower frequency of pathogen occurrence in smaller systems compared to those with several hundred contributors. Such variability, coupled with low concentrations in many source streams (e.g., sink, shower/bath, and laundry waters), has limited direct measurement of pathogens. This study presents an approach to modeling pathogen concentrations in variously sized greywater and combined wastewater collection systems based on epidemiological pathogen incidence rates, user population size, and fecal loadings to various residential wastewater sources. Pathogen infections were modeled within various population sizes (5-, 100-, and 1,000-person) for seven reference pathogens (viruses: adenoviruses, Norovirus, and Rotavirus; bacteria: Campylobacter and Salmonella spp.; and protozoa: Cryptosporidium and Giardia spp.) on each day of 10,000 possible years, accounting for intermittent infection and overlap of infection periods within the population. Fecal contamination of fresh greywaters from bathroom sinks, showers/baths, and laundry, as well as combined greywater and local combined wastewater (i.e., including toilets), was modeled based on reported fecal indicators in the various sources. Simulated daily infections and models of fecal contamination were coupled with pathogen shedding characteristics to generate distributions of pathogen densities in the various waters.
The predicted frequency of pathogen occurrence in local wastewaters was generally low due to low infection incidence within small cohort groups, but increased with collection scale (population size) and infection incidence rate (e.g., Norovirus). When pathogens did occur, concentrations decreased from 5- to 100- and from 100- to 1,000-person systems; nonetheless, overall mean concentrations (i.e., including non-occurrences) remained the same due to the increased number of occurrences. This highlights the value of the model for characterizing scaling effects over averaging methods, which overestimate the frequency of pathogen occurrence in small systems while underestimating the concentration peaks that likely drive risk periods. Results of this work will inform the development of risk-based pathogen reduction requirements for decentralized water reuse.
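The scaling effect on occurrence frequency can be reproduced with a toy version of the epidemiology-based model; the incidence rate, shedding duration, and independence assumptions below are illustrative simplifications of the paper's approach:

```python
import numpy as np

def occurrence_fraction(pop, incidence_ppy, duration_d=7, n_days=100_000, seed=0):
    """Fraction of days on which at least one contributor is shedding.

    Each person-day independently starts an infection with probability
    incidence/365; an infection sheds pathogen for `duration_d` days.
    """
    rng = np.random.default_rng(seed)
    # Daily count of new infections across the population
    new = rng.binomial(pop, incidence_ppy / 365.0, n_days)
    # Number currently shedding = moving sum of new infections over duration
    shedding = np.convolve(new, np.ones(duration_d))[:n_days]
    return float(np.mean(shedding > 0))
```

Small systems rarely have any shedder present (so occurrence is infrequent but, when it happens, concentrated in few contributors), while large systems almost always have at least one, matching the scale dependence described above.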

16.
Water Res ; 109: 186-195, 2017 Feb 01.
Article in English | MEDLINE | ID: mdl-27888775

ABSTRACT

We compared water and sanitation system options for a coastal community across selected sustainability metrics, including environmental impact (i.e., life cycle eutrophication potential, energy consumption, and global warming potential), equivalent annual cost, and local human health impact. We computed normalized metric scores, which we used to discuss the options' strengths and weaknesses, and conducted sensitivity analysis of the scores to changes in variable and uncertain input parameters. The alternative systems, which combined centralized drinking water with sanitation services based on the concepts of energy and nutrient recovery as well as on-site water reuse, had lower environmental impacts, local human health impacts, and costs than the conventional, centralized option. Of the selected sustainability metrics, the greatest advantages of the alternative community water systems (compared to the conventional system) were in terms of local human health impact and eutrophication potential, despite large, outstanding uncertainties. Of the alternative options, the systems with on-site water reuse and energy recovery technologies had the least local human health impact; however, the cost of these options was highly variable, and their energy consumption was comparable to that of on-site alternatives without water reuse or energy recovery, due to on-site reuse treatment. Future work should aim to reduce the uncertainty in the energy recovery process and explore the health risks associated with less costly, on-site water treatment options.


Subject(s)
Global Warming , Sanitation , Eutrophication , Humans , Waste Disposal, Fluid/economics , Wastewater , Water
17.
Water Res ; 77: 155-169, 2015 Jun 15.
Article in English | MEDLINE | ID: mdl-25864006

ABSTRACT

Planning for sustainable community water systems requires a comprehensive understanding and assessment of the integrated source-drinking-wastewater systems over their life cycles. Although traditional life cycle assessment and similar tools (e.g., footprints and emergy) have been applied to elements of these water services (i.e., water resources, drinking water, stormwater, or wastewater treatment alone), we argue for the importance of developing and combining system-based tools and metrics in order to holistically evaluate the complete water service system based on the concept of integrated resource management. We analyzed the strengths and weaknesses of key system-based tools and metrics, and we discuss future directions to identify more sustainable municipal water services. Such efforts may include novel metrics that address system adaptability to future changes and infrastructure robustness. Caution is also necessary when coupling fundamentally different tools so as to avoid misunderstanding and, consequently, misleading decision-making.


Subject(s)
Conservation of Natural Resources , Waste Disposal, Fluid/methods , Water Resources/supply & distribution , City Planning , Environmental Monitoring , Models, Theoretical , Water Supply/methods , Water Supply/statistics & numerical data
19.
Water Res ; 66: 254-264, 2014 Dec 01.
Article in English | MEDLINE | ID: mdl-25222329

ABSTRACT

We simulate the influence of multiple sources of enterococci (ENT) as faecal indicator bacteria (FIB) in recreational water bodies on potential human health risk by considering waters impacted by human and animal sources, human and non-pathogenic sources, and animal and non-pathogenic sources. We illustrate that risks vary with the proportion of culturable ENT in water bodies derived from these sources and estimate corresponding ENT densities that yield the same level of health protection that the recreational water quality criteria in the United States seek (benchmark risk). The benchmark risk is based on epidemiological studies conducted in water bodies predominantly impacted by human faecal sources. The key result is that the risks from mixed sources are driven predominantly by the proportion of the contamination source with the greatest ability to cause human infection (potency), not necessarily the greatest source(s) of FIB. Predicted risks from exposures to mixtures comprising approximately 30% ENT from human sources were up to 50% lower than the risks expected from purely human sources when contamination is recent and ENT levels are at the current water quality criteria levels (35 CFU/100 mL). For human/non-pathogenic, human/gull, human/pig, and human/chicken faecal mixtures with relatively low human contribution, the predicted culturable enterococci densities that correspond to the benchmark risk are substantially greater than the current water quality criteria values. These findings are important because they highlight the potential applicability of site-specific water quality criteria for waters that are predominantly un-impacted by human sources.
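The potency-weighting idea can be sketched as follows, using a low-dose linear approximation; the per-CFU illness potencies and the benchmark value are hypothetical placeholders, not the study's fitted quantities:

```python
def mixture_risk(ent_per_100ml, fractions, potency):
    """Illness risk from a source mixture, low-dose linear approximation.

    `fractions[s]` is the share of culturable ENT from source s (summing
    to 1); `potency[s]` is a hypothetical per-CFU illness probability.
    Risk is driven by the most potent source's share, not total FIB.
    """
    assert abs(sum(fractions.values()) - 1.0) < 1e-9
    return ent_per_100ml * sum(fractions[s] * potency[s] for s in fractions)

def site_specific_criterion(benchmark_risk, fractions, potency):
    """ENT density (CFU/100 mL) yielding the benchmark risk for a mixture."""
    per_cfu = sum(fractions[s] * potency[s] for s in fractions)
    return benchmark_risk / per_cfu

# Hypothetical potencies: human faecal sources far more potent per CFU
potency = {"human": 1e-3, "gull": 1e-5}
mostly_human = site_specific_criterion(0.036, {"human": 1.0, "gull": 0.0}, potency)
mostly_gull = site_specific_criterion(0.036, {"human": 0.1, "gull": 0.9}, potency)
```

A gull-dominated mixture supports a much higher allowable ENT density than a purely human one at the same benchmark risk, which is the argument for site-specific criteria in waters un-impacted by human sources.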


Subject(s)
Bacteria , Feces/microbiology , Water Microbiology , Water Quality , Animals , Enterococcus , Environmental Monitoring , Escherichia coli O157 , Gastrointestinal Diseases/microbiology , Humans , Probability , Risk Assessment , Swine , United States , Water Pollutants/analysis , Water Pollution , Water Supply
20.
Environ Sci Technol ; 48(16): 9728-36, 2014 Aug 19.
Article in English | MEDLINE | ID: mdl-24988142

ABSTRACT

As a pilot approach to describing adverse human health effects from alternative decentralized community water systems compared to conventional centralized services (business-as-usual [BAU]), selected chemical and microbial hazards were assessed using disability-adjusted life years (DALYs) as the common metric. The alternatives included: (1) composting toilets with a septic system, (2) urine-diverting toilets with a septic system, (3) low-flush toilets with a blackwater pressure sewer and on-site greywater collection and treatment for nonpotable reuse, and (4) alternative 3 with on-site rainwater treatment and use. Various pathogens (viral, bacterial, and protozoan) and chemicals (disinfection byproducts [DBPs]) were used as reference hazards. The exposure pathways for BAU included accidental ingestion of contaminated recreational water, ingestion of cross-connected sewage in drinking water, and shower exposures to DBPs. For the alternative systems, pathways included ingestion of treated greywater via garden irrigation, toilet flushing, and crop consumption, and ingestion of treated rainwater while showering. The pathways with the highest health impact were the ingestion of cross-connected drinking water and ingestion of recreational water contaminated by septic seepage. These were also among the most uncertain when characterizing input parameters, particularly the scale of the cross-connection event and the removal of pathogens during groundwater transport of septic seepage. A comparison of the health burdens indicated potential health benefits from switching from BAU to decentralized water and wastewater systems.


Subject(s)
Environmental Exposure , Waste Disposal, Fluid/methods , Disinfection , Drinking Water , Humans , Recreation , Risk Assessment , Wastewater , Water Pollution