Results 1 - 16 of 16
1.
J Water Health ; 20(12): 1721-1732, 2022 Dec.
Article in English | MEDLINE | ID: mdl-36573675

ABSTRACT

Water safety plans (WSPs) are intended to assure safe drinking water (DW). WSPs involve assessing and managing risks associated with microbial, chemical, physical and radiological hazards from the catchment to the consumer. Currently, chemical hazards in WSPs are assessed by targeted chemical analysis, but this approach fails to account for the mixture effects of the many chemicals potentially present in water supplies and omits the possible effects of non-targeted chemicals. Consequently, effect-based monitoring (EBM) using in vitro bioassays and well plate-based in vivo assays is proposed as a complementary tool to targeted chemical analysis to support risk analysis, risk management and water quality verification within the WSP framework. EBM is frequently applied to DW and surface water and can be utilised in all defined monitoring categories within the WSP framework (including 'system assessment', 'validation', 'operational' and 'verification'). Examples of how EBM can be applied within the different WSP modules are provided, along with guidance on where and how frequently to apply EBM. Since this is a new area, guidance documents, standard operating procedures (SOPs) and decision-making frameworks are required for both bioassay operators and WSP teams to facilitate the integration of EBM into WSPs; these resources are currently being developed.


Subject(s)
Drinking Water , Water Supply , Water Quality , Risk Management , Risk Assessment , Environmental Monitoring
2.
Parasitol Res ; 120(12): 4167-4188, 2021 Dec.
Article in English | MEDLINE | ID: mdl-33409629

ABSTRACT

Waterborne diseases are a major global problem, resulting in high morbidity and mortality and massive economic costs. The ability to rapidly and reliably detect and monitor the spread of waterborne diseases is vital for early intervention and for preventing more widespread outbreaks. Pathogens are, however, difficult to detect in water and are not practicably detectable at the concentrations that must be achieved in treated drinking water (of the order of one organism per million litres). Furthermore, current clinical surveillance methods have many limitations, such as the invasive nature of the testing and the challenge of testing large numbers of people. Wastewater-based epidemiology (WBE), which is based on the analysis of wastewater to monitor the emergence and spread of infectious disease at a population level, has received renewed attention in light of the coronavirus disease 2019 (COVID-19) pandemic. The present review focuses on the application of WBE for the detection and surveillance of pathogens, particularly severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) and the waterborne protozoan parasites Cryptosporidium and Giardia. The review highlights the benefits and challenges of WBE and the future of this tool for community-wide infectious disease surveillance.


Subject(s)
COVID-19 , Cryptosporidiosis , Cryptosporidium , Giardia , Humans , SARS-CoV-2 , Wastewater , Wastewater-Based Epidemiological Monitoring
3.
Regul Toxicol Pharmacol ; 110: 104545, 2020 Feb.
Article in English | MEDLINE | ID: mdl-31778715

ABSTRACT

Small and brief exceedances of chemicals above their guideline values in drinking water are unlikely to cause an appreciable increase in risk to human health. As a result, short-term exposure values (STEV) can be derived to help decide whether drinking water can still be supplied to consumers without adverse health risks. In this study, three approaches to calculating STEV for pesticides were applied and compared: basing the STEV on the acute reference dose (ARfD) (Approach 1); removing conventional attribution rates and uncertainty factors from current guideline values (Approach 2); and extrapolating 1 d and 7 d no-observed-adverse-effect levels (NOAEL) from existing toxicity data using a log-linear regression (Approach 3). Despite being very different methods, the three approaches produced comparable STEV, generally within an order of magnitude of one another and often overlapping with other existing short-term exposure values such as short-term no-adverse-response levels (SNARL) and health advisories (HA). Adjusting the current guideline value using standard extrapolation factors (Approach 2) often produced the most conservative values. Approach 2 was then applied to two other chemical classes, disinfection by-products (DBPs) and cyanotoxins, demonstrating the wider applicability of the approach.
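As an illustration of the logic behind Approach 2, the sketch below derives a short-term value from a WHO-style chronic guideline value. The allocation fraction, the 10-fold uncertainty-factor reduction and the example TDI are assumptions for illustration, not the study's parameters.

```python
# Hypothetical sketch of Approach 2: deriving a short-term exposure value
# (STEV) by relaxing conservative factors built into a chronic guideline
# value. All parameter values are illustrative assumptions.

def guideline_value(tdi_mg_per_kg_day, body_weight_kg=70.0,
                    allocation=0.1, consumption_l_per_day=2.0):
    """Chronic guideline value (mg/L), WHO-style derivation."""
    return tdi_mg_per_kg_day * body_weight_kg * allocation / consumption_l_per_day

def stev_approach_2(gv_mg_per_l, allocation=0.1, uf_removed=10.0):
    """Short-term value: assign the whole TDI to drinking water (drop the
    allocation fraction, since exposure is brief) and remove one assumed
    10-fold uncertainty factor."""
    return gv_mg_per_l / allocation * uf_removed

gv = guideline_value(tdi_mg_per_kg_day=0.001)  # example pesticide TDI (assumed)
print(f"GV   = {gv:.4f} mg/L")
print(f"STEV = {stev_approach_2(gv):.2f} mg/L")
```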


Subject(s)
Bacterial Toxins/standards , Dietary Exposure/standards , Drinking Water/standards , Marine Toxins/standards , Pesticides/standards , Water Pollutants, Chemical/standards , Adult , Child , Disinfection , Humans , No-Observed-Adverse-Effect Level , Risk Assessment
4.
Emerg Microbes Infect ; 7(1): 50, 2018 Mar 29.
Article in English | MEDLINE | ID: mdl-29593246

ABSTRACT

Norovirus is estimated to cause 677 million cases of gastroenteritis worldwide each year, resulting in 210,000 deaths. As viral gastroenteritis is generally self-limiting, clinical samples for epidemiological studies only partially represent circulating noroviruses in the population and are biased towards severe symptomatic cases. Because both symptomatic and asymptomatic individuals shed virus into the sewerage system at high concentrations, wastewater samples are useful for the molecular epidemiological analysis of norovirus genotypes at a population level. Using Illumina MiSeq and Sanger sequencing, we surveyed noroviruses circulating within Australia and New Zealand from July 2014 to December 2016. Importantly, norovirus genomic diversity during 2016 was compared between clinical and wastewater samples to identify potential pandemic variants, novel recombinant viruses and the timing of their emergence. Although the GII.4 Sydney 2012 variant was prominent in 2014 and 2015, its prevalence decreased significantly in both clinical and wastewater samples over 2016. This was concomitant with the emergence of multiple norovirus strains, including two GII.4 Sydney 2012 recombinant viruses (GII.P4 New Orleans 2009/GII.4 Sydney 2012 and GII.P16/GII.4 Sydney 2012), along with three other emerging strains (GII.17, GII.P12/GII.3 and GII.P16/GII.2). This is unusual, as a single GII.4 pandemic variant is generally responsible for 65-80% of all human norovirus infections at any one time and predominates until replaced by a new pandemic variant. In summary, this study demonstrates that the combined use of clinical and wastewater samples provides a more complete picture of the noroviruses circulating within a population.


Subject(s)
Caliciviridae Infections/epidemiology , Caliciviridae Infections/virology , Norovirus/genetics , Norovirus/isolation & purification , Wastewater/virology , Caliciviridae Infections/diagnosis , Communicable Diseases, Emerging/diagnosis , Communicable Diseases, Emerging/epidemiology , Communicable Diseases, Emerging/transmission , Communicable Diseases, Emerging/virology , Gastroenteritis/epidemiology , Gastroenteritis/virology , Genotype , High-Throughput Nucleotide Sequencing , Humans , Norovirus/classification , Pandemics/prevention & control , Phylogeny , Prevalence , RNA, Viral/genetics
5.
Water Res ; 111: 100-108, 2017 03 15.
Article in English | MEDLINE | ID: mdl-28063282

ABSTRACT

Two hypothetical scenario exercises were designed and conducted to reflect the increasingly extreme weather-related challenges faced by water utilities as the global climate changes. The first exercise was based on an extreme flood scenario. The second involved a combination of weather events: a wild forest fire ('bushfire') followed by runoff from significant rainfall. For each scenario, a panel of diverse personnel from water utilities and relevant agencies (e.g. health departments) formed a hypothetical water utility and associated regulatory body to manage water quality following the simulated extreme weather event. A larger audience participated by asking questions and contributing key insights. As each simulated scenario unfolded, a facilitator confronted participants with unanticipated developments and with information that challenged conventional operational experience, in order to expose limitations in current procedures, assumptions and readily available information. The process worked towards identifying a list of specific key lessons for each event, and at the conclusion of each simulation a facilitated discussion was used to establish the key lessons of value to water utilities in preparing for similar future extreme events.


Subject(s)
Drinking Water , Weather , Climate Change , Floods , Humans , Water Quality
6.
Water Res ; 85: 124-36, 2015 Nov 15.
Article in English | MEDLINE | ID: mdl-26311274

ABSTRACT

Among the most widely predicted and accepted consequences of global climate change are increases in both the frequency and severity of a variety of extreme weather events. Such weather events include heavy rainfall and floods, cyclones, droughts, heatwaves, extreme cold, and wildfires, each of which can potentially impact drinking water quality by affecting water catchments, storage reservoirs, the performance of water treatment processes or the integrity of distribution systems. Drinking water guidelines, such as the Australian Drinking Water Guidelines and the World Health Organization Guidelines for Drinking-water Quality, provide guidance for the safe management of drinking water. These documents present principles and strategies for managing risks that may be posed to drinking water quality. While these principles and strategies are applicable to all types of water quality risks, very little specific attention has been paid to the management of extreme weather events. We present a review of recent literature on water quality impacts of extreme weather events and consider practical opportunities for improved guidance for water managers. We conclude that there is a case for an enhanced focus on the management of water quality impacts from extreme weather events in future revisions of water quality guidance documents.


Subject(s)
Disaster Planning , Disasters , Drinking Water/standards , Water Purification/methods , Weather , Climate Change , Cyclonic Storms , Droughts , Fires , Floods , Water Purification/standards , Water Quality/standards
7.
J Environ Manage ; 90(10): 3122-34, 2009 Jul.
Article in English | MEDLINE | ID: mdl-19515479

ABSTRACT

Waterborne pathogens such as Cryptosporidium pose a significant human health risk, and catchments provide the first critical pollution 'barrier' for mitigating risk in drinking water supply. In this paper we apply an adaptive management framework to mitigating Cryptosporidium risk in source water, using a case study of the Myponga catchment in South Australia. First, we evaluated the effectiveness of past water quality management programs in relation to the adoption of practices by landholders, using a socio-economic survey of land use and management in the catchment. The impact of past management on the mitigation of Cryptosporidium risk in source water was also evaluated through analysis of water quality monitoring data. Quantitative risk assessment was used in planning the next round of management in the adaptive cycle. Specifically, a pathogen budget model was used to identify the major remaining sources of Cryptosporidium in the catchment and to estimate the mitigation impact of 30 alternative catchment management scenarios. Survey results show that earlier programs resulted in the comprehensive adoption of best management practices by dairy farmers between 2000 and 2007, including exclusion of stock from watercourses and effluent management. While median Cryptosporidium concentrations in source water have decreased since 2004, they remain above target levels and put pressure on other barriers, particularly the treatment plant, to mitigate risk. Non-dairy calves were identified as the major remaining source of Cryptosporidium in the catchment; restricting their access to watercourses could reduce Cryptosporidium export to the Myponga reservoir by around 90% from current levels. The adaptive management framework was useful in guiding learning from past management and in analysing, planning and refocusing the next round of catchment management strategies to achieve water quality targets.
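A toy illustration of how such scenario comparisons can be ranked is sketched below; the baseline export and scenario loads are invented numbers, with only the roughly 90% reduction for calf exclusion echoing the abstract.

```python
# Toy ranking of catchment-management scenarios by predicted reduction in
# Cryptosporidium export. All loads are invented for illustration; the
# study's pathogen budget model is far more detailed.

baseline_export = 1.0e9  # oocysts per year reaching the reservoir (assumed)

scenario_exports = {
    "exclude non-dairy calves from watercourses": 1.0e8,
    "riparian buffer strips": 5.0e8,
    "additional effluent management": 7.5e8,
}

for name, export in sorted(scenario_exports.items(), key=lambda kv: kv[1]):
    reduction = 100.0 * (1.0 - export / baseline_export)
    print(f"{name}: {reduction:.0f}% reduction")
```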


Subject(s)
Agriculture , Cryptosporidium/isolation & purification , Water Supply/analysis , Water/parasitology , Animals , Environmental Monitoring/methods , Risk Assessment , South Australia
8.
Water Res ; 42(12): 3047-56, 2008 Jun.
Article in English | MEDLINE | ID: mdl-18486962

ABSTRACT

Studies undertaken to assess the performance of filter materials to remove phosphorus in decentralised sewage systems have not reported on the broader performance of these systems. This study aimed to identify virus fate and transport mechanisms at the laboratory scale for comparison with field experiments on a mound system amended with blast furnace slag. Inactivation was a significant removal mechanism for MS2 bacteriophage, but not for PRD1 bacteriophage. Column studies identified rapid transport of PRD1. Laboratory studies predicted lower removal of PRD1 in a full scale system than was experienced in the field study, highlighting the importance of considering pH and flow rate in pathogen removal estimates. The results highlight the necessity for studying a range of organisms when assessing the potential for pathogen transport.
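The abstract does not give its kinetic model, but bacteriophage inactivation is conventionally described by first-order decay; the sketch below uses that model with assumed rate constants purely for illustration of why inactivation can dominate removal for one phage and not another.

```python
import math

# First-order inactivation, C(t) = C0 * exp(-k * t), the conventional model
# for phage die-off. The rate constants below are assumed for illustration;
# the study found inactivation significant for MS2 but not for PRD1.

def log10_removal(k_per_day, days):
    """Log10 reduction achieved by first-order decay over a given time."""
    return k_per_day * days / math.log(10)

for phage, k in [("MS2", 0.3), ("PRD1", 0.01)]:  # assumed k values (1/day)
    print(f"{phage}: {log10_removal(k, days=30):.2f} log10 removal in 30 d")
```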


Subject(s)
Bacteriophage PRD1/physiology , Sewage/virology , Bioreactors/virology , Environmental Monitoring , Filtration/instrumentation , Silicon Dioxide , Soil Microbiology , Water Microbiology , Water Purification/methods
9.
J Water Health ; 6 Suppl 1: 1-9, 2008.
Article in English | MEDLINE | ID: mdl-18401123

ABSTRACT

A wide range of microbial and chemical characteristics in drinking water have the potential to affect human health. However, it is neither possible nor practical to test drinking water for all potentially harmful characteristics, and if drinking water is contaminated, people may already be exposed by the time test results are available. The 'boil water' alert issued in Sydney, Australia, in 1998, following the detection of Cryptosporidium and Giardia in the finished water supply, highlighted the uncertainties associated with the public health response to test results. The Sydney experience supports the international consensus that a preventive risk-management approach to the supply of drinking water (manifested as water safety plans (WSPs)) is the most reliable way to protect public health. A key component of a comprehensive WSP is that water suppliers and health authorities must have plans to respond to water contamination and/or outbreaks. These plans must include clear guidance on when to issue warnings to consumers and how those warnings are to be communicated. The pressure on health authorities to develop clear and systematic boil-water guidance will increase as utilities around the world develop their WSPs.


Subject(s)
Communication , Safety Management/organization & administration , Water Pollutants , Animals , Cryptosporidium/isolation & purification , Giardia/isolation & purification , Guidelines as Topic , Humans , New South Wales , Public Health , Risk Management , Water Supply/analysis
10.
J Water Health ; 5(2): 187-208, 2007 Jun.
Article in English | MEDLINE | ID: mdl-17674569

ABSTRACT

In drinking water catchments, reducing the pathogen loads delivered to reservoirs is an important priority for the management of raw source water quality. To assist with the evaluation of management options, a process-based mathematical model, the pathogen catchment budget (PCB), is developed to predict Cryptosporidium, Giardia and E. coli loads generated within and exported from drinking water catchments. The model quantifies the key processes affecting the generation and transport of microorganisms from humans and animals using land use and flow data, together with catchment-specific information including point sources such as sewage treatment plants and on-site systems. The resulting PCBs can be used to prioritise the implementation of control measures for the reduction of pathogen risks to drinking water. The model is applied to the Wingecarribee catchment and used to rank sub-catchments by the pathogen loads they would contribute in dry weather and in intermediate and large wet weather events. A sensitivity analysis identifies pathogen excretion rates from animals and humans, and manure mobilisation rates, as significant drivers of model output that warrant further investigation.
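A minimal sketch of the budget logic is given below: annual export is the sum, over sources, of animal numbers times excretion rate times a mobilisation fraction. The source types and all parameter values are placeholders, not the calibrated Wingecarribee inputs.

```python
# Minimal sketch of a pathogen catchment budget: annual load exported from
# a sub-catchment = sum over sources of (animals x excretion rate x days x
# mobilisation fraction). All values below are illustrative placeholders.

from dataclasses import dataclass

@dataclass
class Source:
    name: str
    animals: int
    oocysts_per_animal_day: float  # excretion rate
    mobilisation: float            # fraction of excreted load reaching the stream

def annual_export(sources):
    return sum(s.animals * s.oocysts_per_animal_day * 365 * s.mobilisation
               for s in sources)

subcatchment = [
    Source("beef cattle", 2000, 1.0e5, 0.005),
    Source("sewage treatment plant", 1, 1.0e8, 1.0),  # point source
]
print(f"Cryptosporidium export: {annual_export(subcatchment):.2e} oocysts/yr")
```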


Subject(s)
Environmental Monitoring/methods , Models, Economic , Water Microbiology , Water Supply/economics , Animals , Australia , Cryptosporidium/isolation & purification , Escherichia coli/isolation & purification , Giardia/isolation & purification , Humans , Sewage , Water Purification/economics , Water Supply/standards , Weather
11.
J Water Health ; 5(1): 83-95, 2007 Mar.
Article in English | MEDLINE | ID: mdl-17402281

ABSTRACT

The dispersion and transport of Cryptosporidium parvum oocysts, Escherichia coli and PRD1 bacteriophage seeded into artificial bovine faecal pats were studied during simulated rainfall events. Experimental soil plots were divided in two, one sub-plot with bare soil and the other with natural vegetation, and simulated rainfall events of 55 mm/h for 30 min were applied. Each experimental treatment was performed in duplicate and consisted of three sequential artificial rainfall events ('runs'): a control run with no faecal pats, a run with fresh faecal pats, and a run with faecal pats aged for one week. Transport efficiency increased with decreasing size of the organism: Cryptosporidium oocysts were the least mobile, followed by E. coli and then PRD1 phage. Rainfall events mobilised 0.5-0.9% of the Cryptosporidium oocysts, 1.3-1.4% of the E. coli and 0.03-0.6% of the PRD1 bacteriophages from the fresh faecal pats and transported them a distance of 10 m across the bare soil sub-plots. Subsequent rainfall events applied to aged faecal pats mobilised only 0.01-0.06% of the original Cryptosporidium oocyst load, 0.04-15% of the E. coli load and 0.0006-0.06% of the PRD1 bacteriophages.
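The reported percentages are simply counts recovered in runoff divided by the seeded load; a worked example with an assumed seeding of 10^7 oocysts and a hypothetical recovered count is shown below.

```python
# Worked example of the mobilisation percentages reported above. Both the
# seeded load and the recovered count are assumptions for illustration.

seeded_oocysts = 1.0e7
recovered_in_runoff = 7.0e4  # hypothetical count recovered at 10 m

fraction = recovered_in_runoff / seeded_oocysts
print(f"mobilised: {100 * fraction:.1f}%")  # 0.7%, within the 0.5-0.9% range
```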


Subject(s)
Feces/microbiology , Feces/parasitology , Rain , Animals , Bacteriophage PRD1/isolation & purification , Cattle , Cryptosporidium/isolation & purification , Escherichia coli/isolation & purification , Feces/virology , Humans , Oocysts , Soil/parasitology , Soil Microbiology , Time Factors , Water/parasitology , Water Microbiology
12.
J Health Popul Nutr ; 24(3): 346-55, 2006 Sep.
Article in English | MEDLINE | ID: mdl-17366776

ABSTRACT

The provision of alternative water sources is the principal arsenic mitigation strategy in Bangladesh but can lead to risk substitution. A study of arsenic mitigation options was undertaken to assess water quality and sanitary condition and to estimate the burden of disease associated with each technology in disability-adjusted life years (DALYs). Dugwells and pond-sand filters showed heavy microbial contamination in both the dry and monsoon seasons, and the estimated burden of disease was high. Rainwater was of good quality in the monsoon but deteriorated in the dry season. Deep tubewells showed microbial contamination in the monsoon but not in the dry season and were the only technology to approach the World Health Organization's reference level of risk of 10^-6 DALYs. A few dugwells and one pond-sand filter showed arsenic in excess of 50 µg/L. The findings suggest that deep tubewells and rainwater harvesting provide safer water than dugwells and pond-sand filters and should be the preferred options.
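A sketch of the standard QMRA calculation behind such DALY comparisons is given below, using an exponential dose-response model; every input value is an illustrative assumption rather than data from the study.

```python
import math

# Sketch of the quantitative microbial risk assessment (QMRA) burden
# calculation used to compare water options against WHO's reference level
# of 1e-6 DALYs per person per year. All inputs are assumed values.

def annual_dalys(daily_dose, r, p_ill_given_inf, dalys_per_case):
    """Exponential dose-response: P(inf per day) = 1 - exp(-r * dose)."""
    p_inf_day = 1.0 - math.exp(-r * daily_dose)
    p_inf_year = 1.0 - (1.0 - p_inf_day) ** 365
    return p_inf_year * p_ill_given_inf * dalys_per_case

burden = annual_dalys(daily_dose=1e-4,       # organisms ingested per day (assumed)
                      r=0.004,               # dose-response parameter (assumed)
                      p_ill_given_inf=0.7,   # illness given infection (assumed)
                      dalys_per_case=1.5e-3) # burden per case (assumed)
print(f"{burden:.1e} DALYs per person per year (WHO reference: 1e-6)")
```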


Subject(s)
Arsenic Poisoning/prevention & control , Arsenic/analysis , Risk Assessment , Water Microbiology , Water Supply/standards , Animals , Arsenic/adverse effects , Bangladesh , Consumer Product Safety , Cost of Illness , Environmental Exposure , Humans , Rain , Seasons , Water Supply/analysis
13.
Appl Environ Microbiol ; 71(10): 5929-34, 2005 Oct.
Article in English | MEDLINE | ID: mdl-16204506

ABSTRACT

A fecal analysis survey was undertaken to quantify animal inputs of pathogenic and indicator microorganisms in the temperate watersheds of Sydney, Australia. The feces from a range of domestic animals and wildlife were analyzed for the indicator bacteria fecal coliforms and Clostridium perfringens spores, the pathogenic protozoa Cryptosporidium and Giardia, and the enteric viruses adenovirus, enterovirus, and reovirus. Pathogen and fecal indicator concentrations were generally higher in domestic animal feces than in wildlife feces. Future studies to quantify potential pathogen risks in drinking-water watersheds should thus focus on quantifying pathogen loads from domestic animals and livestock rather than wildlife.


Subject(s)
Animals, Domestic/microbiology , Animals, Wild/microbiology , Feces/microbiology , Fresh Water/microbiology , Water Supply , Animals , Australia , Cats , Cattle , Clostridium perfringens/isolation & purification , Cryptosporidium/isolation & purification , Dogs , Enterobacteriaceae/isolation & purification , Giardia/isolation & purification , Spores, Bacterial/isolation & purification , Viruses/isolation & purification
14.
Appl Environ Microbiol ; 70(2): 1151-9, 2004 Feb.
Article in English | MEDLINE | ID: mdl-14766600

ABSTRACT

The dispersion and initial transport of Cryptosporidium oocysts from fecal pats were investigated during artificial rainfall events on intact soil blocks (1,500 by 900 by 300 mm). Rainfall events of 55 mm/h for 30 min and 25 mm/h for 180 min were applied to soil plots with artificial fecal pats seeded with approximately 10^7 oocysts. The soil plots were divided in two, with one side devoid of vegetation and the other left with natural vegetation cover. Each combination of event intensity and duration, vegetation status, and degree of slope (5° and 10°) was evaluated twice. Generally, a fivefold increase (P < 0.05) in runoff volume was generated on bare soil compared to vegetated soil, and significantly more infiltration, although highly variable, occurred through the vegetated soil blocks (P < 0.05). Runoff volume, event conditions (intensity and duration), vegetation status, degree of slope, and their interactions significantly affected the load of oocysts in the runoff. Surface runoff transported from 10^0.2 oocysts from vegetated loam soil (25 mm/h, 180-min event on a 10° slope) up to 10^4.5 oocysts from unvegetated soil (55 mm/h, 30-min event on a 10° slope) over a 1-m distance. Surface soil samples downhill of the fecal pat contained significantly higher concentrations of oocysts on devegetated blocks than on vegetated blocks. Based on these results, there is a need to account for surface soil vegetation coverage as well as slope and rainfall runoff in future assessments of Cryptosporidium transport and when managing pathogen loads from stock grazing near streams within drinking water watersheds.
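The factorial analysis described above can be reproduced in outline with an ordinary least squares model of log10 oocyst load. The sketch below fits main effects only (the full interaction model would need the study's replicated dataset) on invented stand-in data.

```python
# Outline of the factorial analysis: log10 oocyst load in runoff modelled
# against vegetation status, slope, and event type. The data frame holds
# invented values; the study's measurements are not reproduced here.

import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.DataFrame({
    "log10_oocysts": [0.2, 0.5, 4.5, 4.1, 1.0, 1.3, 3.2, 3.0],
    "vegetated":     ["yes", "yes", "no", "no", "yes", "yes", "no", "no"],
    "slope_deg":     [10, 5, 10, 5, 10, 5, 10, 5],
    "event":         ["long"] * 4 + ["short"] * 4,
})

# Main effects only; interactions require replication within each cell.
model = smf.ols("log10_oocysts ~ C(vegetated) + C(slope_deg) + C(event)",
                data=df).fit()
print(anova_lm(model))
```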


Subject(s)
Cryptosporidium/physiology , Feces/parasitology , Oocysts/physiology , Rain , Soil/parasitology , Animals , Cattle , Cattle Diseases/parasitology , Cryptosporidiosis/parasitology , Cryptosporidiosis/veterinary , Cryptosporidium/growth & development , Cryptosporidium/isolation & purification , Oocysts/isolation & purification , Water/parasitology
15.
Can J Microbiol ; 50(9): 675-82, 2004 Sep.
Article in English | MEDLINE | ID: mdl-15644920

ABSTRACT

This study compared the recovery of Cryptosporidium oocysts and Giardia cysts ((oo)cysts) from raw waters using 4 different concentration-elution methods: flatbed membranes, FiltaMax foam, Envirochek HV capsules, and Hemoflow ultrafilters. The recovery efficiency of the combined immunomagnetic separation and staining steps was also determined. Analysis of variance of arcsine-transformed data demonstrated that recovery of Cryptosporidium oocysts by 2 of the methods was statistically equivalent (flatbed filtration 26.7% and Hemoflow 28.3%), with FiltaMax and Envirochek HV recoveries significantly lower (18.9% and 18.4%). Recovery of Giardia cysts was significantly higher using flatbed membrane filtration (42.2%) compared with the other 3 methods (Envirochek HV 29.3%, FiltaMax 29.0%, and Hemoflow 20.9%). All methods were generally acceptable and are suitable for laboratory use; 2 of the methods are also suitable for field use (FiltaMax and Envirochek HV). In conclusion, with recoveries generally being statistically equivalent or similar, practical considerations become important in determining which filters to use for particular circumstances. The results indicate that while low-turbidity or "finished" waters can be processed with consistently high recovery efficiencies, recoveries from raw water samples differ significantly with variations in raw water quality. The use of an internal control with each raw water sample is therefore highly recommended.
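A minimal reproduction of the statistical comparison described above is sketched below: recovery proportions are arcsine-square-root transformed before a one-way ANOVA. The replicate recoveries are invented stand-ins for the paper's data.

```python
# Arcsine-square-root transform of recovery proportions followed by a
# one-way ANOVA, as described in the abstract. Replicate values invented.

import numpy as np
from scipy import stats

recoveries = {  # fraction of seeded (oo)cysts recovered, per replicate
    "flatbed":       [0.25, 0.28, 0.27],
    "Hemoflow":      [0.27, 0.30, 0.28],
    "FiltaMax":      [0.18, 0.20, 0.19],
    "Envirochek HV": [0.17, 0.19, 0.19],
}

transformed = {k: np.arcsin(np.sqrt(v)) for k, v in recoveries.items()}
f_stat, p = stats.f_oneway(*transformed.values())
print(f"F = {f_stat:.2f}, p = {p:.4f}")
```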


Subject(s)
Cryptosporidium/isolation & purification , Fresh Water/parasitology , Giardia/isolation & purification , Oocysts/isolation & purification , Water Purification/methods , Animals , Cryptosporidium/growth & development , Filtration/methods , Giardia/growth & development , Immunomagnetic Separation , Micropore Filters , Water Supply
16.
Appl Environ Microbiol ; 69(5): 2842-7, 2003 May.
Article in English | MEDLINE | ID: mdl-12732556

ABSTRACT

Accurate quantification of Cryptosporidium parvum oocysts in animal fecal deposits on land is an essential starting point for estimating watershed C. parvum loads. Due to the generally poor performance and variable recovery efficiency of existing enumeration methods, protocols were devised based on initial dispersion of oocysts from feces by vortexing in 2 mM tetrasodium pyrophosphate, followed by immunomagnetic separation. The protocols were validated using an internal control seed preparation to determine oocyst recovery for a range of fecal types. Recovery of 10^2 oocysts ranged from 31% to 46% for cattle feces (0.5 g of processed feces) and from 21% to 35% for sheep feces (0.25 g of processed feces). The within-sample coefficients of variation for the percentage recovery from five replicates ranged from 10% to 50%. The ranges of oocyst recovery from cattle, kangaroo, pig, and sheep feces (juveniles and adults) collected in a subsequent watershed animal fecal survey were far wider than predicted by the validation data: based on an internal control added to each fecal sample, recovery ranged from 0 to 83% for cattle, from 4 to 62% for sheep, from 1 to 42% for pigs, and from 40 to 73% for kangaroos. Given this variation across fecal matrices, it is recommended that an internal control be added to at least one replicate of every fecal sample analyzed to determine the percentage recovery. Depending on the animal type, and based on the lowest approximate recoveries, between 10 and 100 oocysts per gram of feces must be present to be detected.
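The closing detection limit follows from simple arithmetic: to recover at least one oocyst from the processed subsample, the fecal concentration must be at least 1/(recovery fraction × subsample mass). The recovery range and subsample mass below are assumptions chosen to show how a 10-100 oocysts per gram span arises.

```python
# Detection-limit arithmetic: the minimum fecal concentration at which one
# oocyst survives processing. Recovery values and the 0.5 g subsample are
# illustrative assumptions, not the study's per-animal figures.

def min_detectable_per_gram(recovery_fraction, subsample_g):
    return 1.0 / (recovery_fraction * subsample_g)

for recovery in (0.20, 0.02):  # assumed best and worst recoveries
    limit = min_detectable_per_gram(recovery, subsample_g=0.5)
    print(f"recovery {recovery:.0%}: >= {limit:.0f} oocysts/g needed")
```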


Subject(s)
Cryptosporidium parvum/isolation & purification , Feces/parasitology , Animals , Cattle/parasitology , Macropodidae/parasitology , Oocysts/isolation & purification , Parasitology/methods , Sheep/parasitology , Sus scrofa/parasitology , Water/parasitology