Results 1 - 20 of 69
1.
Aust Vet J ; 99(8): 319-325, 2021 Aug.
Article in English | MEDLINE | ID: mdl-33851419

ABSTRACT

OBJECTIVES: To measure ruminal pH throughout a 148-day feeding period in cattle fed commercial diets and to relate this to feed intake, growth rate and feed conversion ratio. Factors contributing to variation in rumen pH, including meal frequency, duration and weight, and total daily intake, were also evaluated. METHODS: Forty-eight cattle were randomly allocated to two pens, and 12 cattle randomly selected from each pen had rumen pH monitoring boluses inserted. Ruminal pH was measured every 10 min and feed intake was measured continually. The cattle were fed a commercial feedlot diet for 148 days and weighed into and out of the feedlot to measure growth rate and to calculate feed conversion ratio. Cattle from both pens were registered to collect individual feed intake data using the GrowSafe® feed management system. RESULTS: Mean ruminal pH decreased with days on feed. Mean daily dry matter intake was the major contributor to greater average daily gain and lower ruminal pH. Lower mean ruminal pH was associated with greater average daily gain and lower feed conversion ratio, provided mean ruminal pH remained above the threshold of 5.6. There was no association between ruminal pH and average daily gain or feed conversion ratio for mean ruminal pH below 5.6. CONCLUSIONS: Ruminal acidosis can occur at any time during the feeding period, and the risk could be greater as days on feed increase. Feedlot production outcomes are not improved by ruminal pH depression below the threshold of 5.6.
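Average daily gain (ADG) and feed conversion ratio (FCR) as used above are simple arithmetic on entry/exit weights and cumulative intake; a minimal sketch with purely illustrative numbers (not values from the study) is:

```python
# Illustrative calculation of average daily gain (ADG) and feed conversion
# ratio (FCR) over a feeding period; all numbers are hypothetical.

def adg_and_fcr(weight_in_kg: float, weight_out_kg: float,
                days_on_feed: int, total_dmi_kg: float) -> tuple[float, float]:
    """Return (ADG in kg/day, FCR as kg dry matter intake per kg gain)."""
    gain = weight_out_kg - weight_in_kg
    adg = gain / days_on_feed
    fcr = total_dmi_kg / gain          # lower FCR = better feed efficiency
    return adg, fcr

if __name__ == "__main__":
    adg, fcr = adg_and_fcr(weight_in_kg=380.0, weight_out_kg=650.0,
                           days_on_feed=148, total_dmi_kg=1450.0)
    print(f"ADG: {adg:.2f} kg/day, FCR: {fcr:.2f} kg DMI per kg gain")
```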


Subject(s)
Animal Feed , Rumen , Animal Feed/analysis , Animals , Australia , Cattle , Diet/veterinary , Feeding Behavior , Hydrogen-Ion Concentration
2.
J Appl Microbiol ; 131(1): 288-299, 2021 Jul.
Article in English | MEDLINE | ID: mdl-33174331

ABSTRACT

AIM: The family Arcobacteraceae, formerly the genus Arcobacter, has recently been reclassified into six genera. Among the nine species of the genus Aliarcobacter, Aliarcobacter faecis and Aliarcobacter lanthieri have been identified as emerging pathogens that potentially pose health risks to humans and animals. This study was designed to develop/optimize, validate and apply Arcobacteraceae family- and two species-specific (A. faecis and A. lanthieri) loop-mediated isothermal amplification (LAMP) assays to rapidly detect and quantify the total number of cells in various environmental niches. METHODS AND RESULTS: Three sets of LAMP primers were designed from conserved and variable regions of the 16S rRNA (family-specific) and gyrB (species-specific) genes. The optimized Arcobacteraceae family-specific LAMP assay correctly amplified and detected 24 species, whereas the species-specific LAMP assays detected A. faecis and A. lanthieri reference strains as well as 91 pure and mixed culture isolates recovered from aquatic and faecal sources. The specificity of LAMP amplification of A. faecis and A. lanthieri was further confirmed by restriction fragment length polymorphism analysis. Assay sensitivities were tested using variable DNA concentrations extracted from simulated target species cells in an autoclaved agricultural water sample, achieving a minimum detection limit of 10 cells mL⁻¹ (10 fg). Direct DNA-based quantitative detection from agricultural surface water identified A. faecis (17%) and A. lanthieri (1%) at a low frequency compared to the family level (93%), with concentrations ranging from 2.1 × 10¹ to 2.2 × 10⁵ cells 100 mL⁻¹. CONCLUSIONS: Overall, these three DNA-based, rapid and cost-effective novel LAMP assays are sensitive and can be completed in less than 40 min. They have potential for on-site quantitative detection of species of the family Arcobacteraceae, A. faecis and A. lanthieri in food, environmental and clinical matrices. SIGNIFICANCE AND IMPACT OF THE STUDY: The newly developed LAMP assays are specific, sensitive, accurate and highly reproducible; they can be performed in minimally equipped laboratory settings and can support early quantitative detection and estimation of prevalence in environmental niches. The assays can be adopted in diagnostic laboratories and epidemiological studies.


Subject(s)
Arcobacter/isolation & purification , Campylobacteraceae/isolation & purification , Molecular Diagnostic Techniques , Nucleic Acid Amplification Techniques , Water Microbiology , Agriculture , Animals , Arcobacter/classification , Arcobacter/genetics , Campylobacteraceae/classification , Campylobacteraceae/genetics , DNA Primers , DNA, Bacterial/analysis , DNA, Bacterial/genetics , Feces/microbiology , Humans , RNA, Ribosomal, 16S , Reproducibility of Results , Sensitivity and Specificity , Species Specificity
3.
Environ Monit Assess ; 192(1): 67, 2019 Dec 26.
Article in English | MEDLINE | ID: mdl-31879802

ABSTRACT

Optical sensing of chlorophyll-a (chl-a), turbidity, and fluorescent dissolved organic matter (fDOM) is often used to characterize the quality of water. Many site-specific factors and environmental conditions can affect optically sensed readings, in addition to the comparative implications of the different procedures used to measure these properties in the laboratory. In this study, we measured these water quality properties using standard laboratory methods, and in the field using optical sensors (sonde-based), at water quality monitoring sites located in four watersheds in Canada. The overall objective of this work was to explore the relationships among sonde-based and standard laboratory measurements of the aforementioned water properties, and to evaluate associations among these eco-hydrological properties and land use, environmental, and ancillary water quality variables such as dissolved organic carbon (DOC) and total suspended solids (TSS). Differences among sonde versus laboratory relationships for chl-a suggest such relationships are affected by laboratory methods and/or site-specific conditions. Data mining analysis indicated that the interactive site-specific factors predominantly affecting chl-a values across sites were specific conductivity and turbidity (variables with positive global associations with chl-a). The overall linear regression predicting DOC from fDOM was relatively strong (R² = 0.77). However, slope differences in the watershed-specific models suggest laboratory DOC versus fDOM relationships could be affected by unknown localized water quality properties influencing fDOM readings, and/or by the different standard laboratory methods used to estimate DOC. Artificial neural network (ANN) analyses indicated that higher relative chl-a concentrations were associated with low to no tree cover around sample sites and higher daily rainfall in the watersheds examined. Response surfaces derived from ANN indicated that chl-a concentrations were higher where combined agricultural and urban land uses were relatively higher.
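A watershed-specific DOC-versus-fDOM regression like the one summarized above (R² = 0.77 overall) reduces to an ordinary least-squares fit; a minimal sketch using synthetic placeholder data in place of the sonde and laboratory measurements is:

```python
# Fit a simple linear regression predicting laboratory DOC from sonde fDOM
# and report R^2; the data below are synthetic placeholders.
import numpy as np

fdom = np.array([12.0, 18.5, 25.0, 31.2, 40.8, 55.1])   # sonde fDOM readings, hypothetical
doc = np.array([3.1, 4.0, 5.2, 6.0, 7.9, 10.4])          # lab DOC (mg/L), hypothetical

slope, intercept = np.polyfit(fdom, doc, 1)
predicted = slope * fdom + intercept
ss_res = np.sum((doc - predicted) ** 2)
ss_tot = np.sum((doc - doc.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot

print(f"DOC ~ {slope:.3f} * fDOM + {intercept:.3f}, R^2 = {r_squared:.2f}")
```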


Subject(s)
Chlorophyll A/analysis , Environmental Monitoring/methods , Humic Substances/analysis , Hydrodynamics , Rivers/chemistry , Water Quality/standards , Agriculture , British Columbia , Ecology , Fluorometry , Ontario , Urbanization
4.
Water Res ; 151: 423-438, 2019 03 15.
Article in English | MEDLINE | ID: mdl-30639728

ABSTRACT

Predicting bacterial levels in watersheds in response to agricultural beneficial management practices (BMPs) requires understanding the germane processes at both the watershed and field scale. Controlled subsurface tile drainage (CTD) is a BMP that is highly effective at reducing nutrient losses from fields, and from watersheds when employed en masse, but little work has been conducted on CTD effects on bacterial loads and densities in a watershed context. This study compared fecal indicator bacteria (FIB) [E. coli, Enterococcus, fecal coliform, total coliform, Clostridium perfringens] densities and unit area loads (UAL) from a pair of flat tile-drained watersheds (∼250-467 ha catchment areas) during the growing season over a 10-year monitoring period, using a before-after-control-impact (BACI) design (i.e., a test CTD watershed vs. a reference uncontrolled tile drainage (UCTD) watershed during a pre-CTD-intervention period and a CTD-intervention period, in which the test CTD watershed had CTD deployed on over 80% of the fields). With no tile drainage management, upstream tile drainage to ditches comprised ∼90% of total ditch discharge. We also examined FIB loads from a subset of tile-drained fields to determine field load contributions to the watershed drainage ditches. Statistical evidence of a CTD effect on FIB UAL in the surface water systems was not strong; however, there was statistical evidence of increased FIB densities [pronounced when E. coli >200 most probable number (MPN) 100 mL⁻¹] in the test CTD watershed during the CTD-intervention period. This was likely a result of reduced dilution/flushing in the test CTD watershed ditch, because CTD significantly decreased the amount of tile drainage water entering the surface water system. Tile E. coli load contributions to the ditches were low; for example, during the 6-yr CTD-intervention period they amounted on average to only ∼3% and ∼9% of the ditch loads for the test CTD and reference UCTD watersheds, respectively. This suggests that in-stream or off-field FIB reservoirs, and bacteria mobilization drivers, dominated ditch E. coli loads in the watersheds during the growing season. Overall, this study suggested that decision making regarding deployment of CTD en masse in tile-fed watersheds should consider drainage practice effects on bacterial densities and loads, as well as CTD's documented capacity to boost crop yields and reduce seasonal nutrient pollution.
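A unit area load is, in essence, indicator density times discharge volume normalized by contributing area; the sketch below illustrates the arithmetic under assumed units (MPN 100 mL⁻¹, litres, hectares) with hypothetical values, not the study's monitoring data:

```python
# Compute a unit area load (UAL) for a fecal indicator bacterium:
# UAL = (density * discharge volume) / catchment area.  Values are hypothetical.

def unit_area_load(density_mpn_per_100ml: float,
                   discharge_litres: float,
                   catchment_area_ha: float) -> float:
    """Return load in MPN per hectare for the period the discharge covers."""
    mpn_per_litre = density_mpn_per_100ml * 10.0       # 100 mL -> 1 L
    total_mpn = mpn_per_litre * discharge_litres        # organisms exported
    return total_mpn / catchment_area_ha

if __name__ == "__main__":
    ual = unit_area_load(density_mpn_per_100ml=250.0,
                         discharge_litres=5.0e6,
                         catchment_area_ha=467.0)
    print(f"E. coli unit area load: {ual:.2e} MPN/ha")
```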


Subject(s)
Escherichia coli , Rivers , Agriculture , Bacteria , Seasons
5.
Qual Life Res ; 27(9): 2373-2382, 2018 Sep.
Article in English | MEDLINE | ID: mdl-29948600

ABSTRACT

PURPOSE: To determine the changes in each of the five dimensions of the EuroQol 5-dimension (EQ-5D) index associated with community-based physiotherapy. METHODS: Four thousand one hundred and thirty-six patients who received community-based musculoskeletal physiotherapy across five NHS centres completed the EQ-5D on entry into the service and upon discharge. Patients were categorised on symptom location and on response to treatment, based on their EQ-5D index improving by at least 0.1 ("EQ-5D responders"). For each symptom location, and for responders and non-responders to treatment, the mean (± SD) was calculated for each dimension pre- and post-treatment, as well as the effect size. RESULTS: The mobility dimension improved (p < 0.05) in all symptom locations for EQ-5D responders (d = 0.26-1.58) and in ankle, knee, hip and lumbar symptoms for EQ-5D non-responders (d = 0.17-0.45). The self-care dimension improved (p < 0.05) in all symptom locations for EQ-5D responders (d = 0.49-1.16). The usual activities dimension improved (p < 0.05) across all symptom locations for EQ-5D responders (d = 1.00-1.75) and EQ-5D non-responders (d = 0.14-0.60). The pain/discomfort dimension improved (p < 0.05) across all symptom locations for both EQ-5D responders (d = 1.07-1.43) and EQ-5D non-responders (d = 0.29-0.66), whereas the anxiety/depression dimension improved (p < 0.05), from higher starting levels, only in EQ-5D responders (d = 0.76-1.05), with no change seen for EQ-5D non-responders (d = -0.16 to 0.06). CONCLUSIONS: Clinicians should not assume that a patient presenting with pain but expressing high anxiety/depression is unlikely to respond to treatment, as such patients may show the best HRQoL outcomes. For patients presenting with pain/discomfort and low levels of anxiety/depression, the EQ-5D index is perhaps not a suitable tool for sole use in patient management and service evaluation.
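The responder definition (EQ-5D index improvement of at least 0.1) and the effect sizes reported above (Cohen's d) can be computed as follows; a minimal sketch on synthetic pre/post scores, not patient data:

```python
# Classify EQ-5D responders (index improves by >= 0.1) and compute Cohen's d
# for pre- vs post-treatment scores; inputs are synthetic.
import numpy as np

def cohens_d(pre: np.ndarray, post: np.ndarray) -> float:
    """Cohen's d using the pooled standard deviation of pre and post scores."""
    pooled_sd = np.sqrt((pre.std(ddof=1) ** 2 + post.std(ddof=1) ** 2) / 2.0)
    return (post.mean() - pre.mean()) / pooled_sd

rng = np.random.default_rng(0)
pre_index = rng.normal(0.55, 0.20, size=200)                   # hypothetical EQ-5D index at entry
post_index = pre_index + rng.normal(0.15, 0.15, size=200)      # hypothetical change at discharge
post_index = np.clip(post_index, None, 1.0)                    # EQ-5D index cannot exceed 1

responders = (post_index - pre_index) >= 0.1
print(f"Responders: {responders.mean():.0%}")
print(f"Cohen's d (index): {cohens_d(pre_index, post_index):.2f}")
```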


Subject(s)
Physical Therapy Modalities/psychology , Quality of Life/psychology , Female , Humans , Male , Surveys and Questionnaires
6.
Health Qual Life Outcomes ; 15(1): 212, 2017 Oct 25.
Article in English | MEDLINE | ID: mdl-29065895

ABSTRACT

BACKGROUND: Community-based musculoskeletal physiotherapy is used to improve function and health-related quality of life (HRQoL). The purpose of this retrospective, multi-centre observational study was to determine the association between community-based physiotherapy management for musculoskeletal disorders and changes in HRQoL. METHODS: Four thousand one hundred and twelve patients' data were included in the study. Patients were included if they received a single period of treatment for a musculoskeletal injury or disorder. Patients were only included if they were being treated for a single morbidity. Patients received standard physiotherapy appropriate to their specific disorder, which could include health education/advice, exercise therapy, manual therapy, taping, soft tissue techniques, electrotherapy and/or acupuncture. Health-related quality of life was assessed using the EQ-5D index. RESULTS: EQ-5D improved by 0.203 across all patients (d = 1.10). When grouped by anatomical site of symptom, the largest increases in EQ-5D were for foot pain (0.233; d = 1.29) and lumbar pain (0.231; d = 1.13). Improvements in EQ-5D greater than the minimum clinically important difference (MCID) were seen in 68.4% of all patients. The highest proportions of patients with positive responses to treatment were for ankle pain (74.2%) and thoracic pain (73.4%). The hand (40.5%), elbow (34.7%), and hip (33.9%) showed the greatest proportions of patients who did not respond to treatment. CONCLUSIONS: Community-based musculoskeletal physiotherapy is associated with improved health-related quality of life. A randomised controlled trial is needed to determine any causal relationship between community-based physiotherapy and health-related quality of life improvements.


Subject(s)
Musculoskeletal Diseases/psychology , Musculoskeletal Diseases/rehabilitation , Musculoskeletal System/injuries , Physical Therapy Modalities , Quality of Life , Adult , Community Health Services , Female , Humans , Male , Middle Aged , Retrospective Studies , Surveys and Questionnaires
7.
J Appl Microbiol ; 123(6): 1522-1532, 2017 Dec.
Article in English | MEDLINE | ID: mdl-28960631

ABSTRACT

AIM: A single-tube multiplex PCR (mPCR) assay was developed for rapid, sensitive and simultaneous detection and identification of six Arcobacter species, including two new species, A. lanthieri and A. faecis, along with A. butzleri, A. cibarius, A. cryaerophilus and A. skirrowii, on the basis of differences in the lengths of their PCR products. Previously designed monoplex, mPCR and RFLP assays do not detect or differentiate A. faecis and A. lanthieri from other closely related known Arcobacter spp. METHODS AND RESULTS: Primer pairs for each target species (except A. skirrowii) and the mPCR protocol were newly designed and optimized using variable regions of the housekeeping genes cpn60, gyrA, gyrB and rpoB. The accuracy and specificity of the mPCR assay were assessed using DNA templates from the six target species and 11 other Arcobacter spp., as well as 50 other bacterial reference species and strains. DNA templates of the target Arcobacter spp. were correctly identified, whereas all 61 DNA templates from other bacterial species and strains were not amplified. The detection sensitivity of the mPCR assay was 10 pg µl⁻¹ of DNA per target species. The optimized assay was further evaluated, validated and compared with other mPCR assays by testing Arcobacter cultures isolated from various faecal and water sources. CONCLUSIONS: Study results confirm that the newly developed mPCR assay is rapid, accurate, reliable, simple, and valuable for the simultaneous detection and routine diagnosis of six human- and animal-associated Arcobacter spp. SIGNIFICANCE AND IMPACT OF THE STUDY: The new mPCR assay is useful not only for pure but also for mixed cultures. Moreover, its ability to rapidly detect six species enhances the value of this technology for aetiological and epidemiological studies.


Subject(s)
Arcobacter/genetics , Multiplex Polymerase Chain Reaction/methods , Animals , Arcobacter/classification , Arcobacter/isolation & purification , DNA, Bacterial/chemistry , DNA, Bacterial/genetics , Gram-Negative Bacterial Infections/diagnosis , Humans , Sensitivity and Specificity , Species Specificity
8.
Water Res ; 105: 625-637, 2016 Nov 15.
Article in English | MEDLINE | ID: mdl-27721171

ABSTRACT

Many Cryptosporidium species/genotypes are not considered infectious to humans, and more realistic estimations of seasonal infection risks could be made using human-infectious species/genotype information to inform quantitative microbial risk assessments (QMRA). Cryptosporidium oocyst concentration and species/genotype data were collected from three surface water surveillance programs in two river basins [South Nation River, SN (2004-09) and Grand River, GR (2005-13)] in Ontario, Canada to evaluate seasonal infection risks. Main river stems, tributaries, agricultural drainage streams, water treatment plant intakes, and waste water treatment plant effluent impacted sites were sampled. The QMRA employed two sets of exposure data to compute risk: one assuming all observed oocysts were infectious to humans, and the other based on the fraction of oocysts that were C. hominis and/or C. parvum (the dominant human-infectious forms of the parasite). Viability was not considered, and relative infection risk was evaluated using a single hypothetical recreational exposure. Many sample site groupings for both river systems had significant seasonality in Cryptosporidium occurrence and concentrations (p ≤ 0.05); occurrence and concentrations were generally highest in autumn for SN, and in autumn and summer for GR. Mean risk values (probability of infection per exposure) for all sites combined, for each river system, were roughly an order of magnitude lower (avg. of SN and GR 5.3 × 10⁻⁵) when considering just C. parvum and C. hominis oocysts, in relation to mean infection risk (per exposure) assuming all oocysts were infectious to humans (5.5 × 10⁻⁴). Seasonality in mean risk (targeted human-infectious oocysts only) was most strongly evident in SN (e.g., 7.9 × 10⁻⁶ in spring and 8.1 × 10⁻⁵ in summer). Such differences are important if QMRA is used to quantify effects of water safety/quality management practices where inputs from a vast array of fecal pollution sources can readily occur. Cryptosporidium seasonality in water appears to match the seasonality of human infections from Cryptosporidium in the study regions. This study highlights the importance of Cryptosporidium species/genotype data to help determine surface water pollution sources and seasonality, as well as to help more accurately quantify human infection risks by the parasite.
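Infection risks of this kind are commonly computed in QMRA with an exponential dose-response model, P = 1 − exp(−r × dose), with the dose scaled by the fraction of oocysts assumed to be human-infectious; the sketch below uses that common form, but the r value, ingestion volume, concentration, and infectious fraction are illustrative assumptions, not values from the paper:

```python
# Single-exposure Cryptosporidium infection risk via an exponential
# dose-response model; all parameter values are illustrative only.
import math

def infection_risk(oocysts_per_100L: float,
                   ingested_volume_L: float,
                   human_infectious_fraction: float,
                   r: float) -> float:
    """P(infection) = 1 - exp(-r * dose) for a single recreational exposure."""
    dose = (oocysts_per_100L / 100.0) * ingested_volume_L * human_infectious_fraction
    return 1.0 - math.exp(-r * dose)

# Risk assuming all oocysts are infectious vs. only a C. hominis/C. parvum fraction.
all_infectious = infection_risk(5.0, 0.05, 1.0, r=0.004)
hominis_parvum = infection_risk(5.0, 0.05, 0.1, r=0.004)
print(f"All oocysts infectious:    {all_infectious:.1e}")
print(f"C. hominis/C. parvum only: {hominis_parvum:.1e}")
```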


Subject(s)
Cryptosporidium/genetics , Seasons , Animals , Cryptosporidiosis/epidemiology , Genotype , Humans , Oocysts
9.
J Environ Qual ; 44(5): 1589-604, 2015 Sep.
Article in English | MEDLINE | ID: mdl-26436276

ABSTRACT

Controlled tile drainage (CTD) regulates water and nutrient export from tile drainage systems. Observations of the effects of CTD imposed en masse at watershed scales are needed to determine the effect on downstream receptors. A paired-watershed approach was used to evaluate the effect of field-to-field CTD at the watershed scale on fluxes and flow-weighted mean concentrations (FWMCs) of N and P during multiple growing seasons. One watershed (467-ha catchment area) was under CTD management (treatment [CTD] watershed); the other (250-ha catchment area) had freely draining or uncontrolled tile drainage (UCTD) (reference [UCTD] watershed). The paired agricultural watersheds are located in eastern Ontario, Canada. Analysis of covariance (ANCOVA) and paired tests were used to assess daily fluxes and FWMCs during a calibration period when CTD intervention on the treatment watershed was minimal (2005-2006, when only 4-10% of the tile-drained area was under CTD) and a treatment period when the treatment (CTD) watershed had prolific CTD intervention (2007-2011, when 82% of tile-drained fields were controlled, occupying >70% of the catchment area). Significant linear regression slope changes assessed using ANCOVA (P ≤ 0.1) for daily fluxes from upstream and downstream monitoring sites, pooled by calibration and treatment period, were -0.06 and -0.20 (stream water; negative values represent flux declines in the CTD watershed), -0.59 and -0.77 (NH₄-N), -0.14 and -0.15 (NO₃-N), -1.77 and -2.10 (dissolved reactive P), and -0.28 and 0.45 (total P). Total P results for one site comparison contrasted with the other findings, likely due to unknown in-stream processes affecting total P loading, not the efficacy of CTD. The FWMC results were mixed and inconclusive but suggest physical abatement by CTD is the means by which nutrient fluxes are predominantly reduced at these scales. Overall, our study results indicate that CTD is an effective practice for reducing watershed-scale fluxes of stream water, N, and P during the growing season.
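The ANCOVA slope-change test in a paired-watershed calibration/treatment design amounts to testing a reference-flux × period interaction term in a regression; a minimal sketch with statsmodels on synthetic data (column names and values are hypothetical) is:

```python
# Paired-watershed ANCOVA: does the slope of treatment-watershed flux vs.
# reference-watershed flux change between calibration and treatment periods?
# Data and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 120
ref_flux = rng.gamma(2.0, 1.5, size=n)                 # reference (UCTD) daily flux
period = np.repeat(["calibration", "treatment"], n // 2)
slope = np.where(period == "calibration", 0.9, 0.7)    # built-in slope decline
test_flux = slope * ref_flux + rng.normal(0, 0.3, size=n)

df = pd.DataFrame({"test_flux": test_flux, "ref_flux": ref_flux, "period": period})
model = smf.ols("test_flux ~ ref_flux * period", data=df).fit()

# The ref_flux:period interaction term estimates the change in slope; its
# p-value is the ANCOVA-style test of a treatment effect on daily fluxes.
print(model.summary().tables[1])
```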

10.
J Environ Qual ; 44(2): 629-41, 2015 Mar.
Article in English | MEDLINE | ID: mdl-26023981

ABSTRACT

Controlled tile drainage (CTD) can reduce pollutant loading. The Annualized Agricultural Nonpoint Source model (AnnAGNPS version 5.2) was used to examine changes in growing season discharge, sediment, nitrogen, and phosphorus loads due to CTD for a ∼3900-km² agriculturally dominated river basin in Ontario, Canada. Two tile drain depth scenarios were examined in detail to mimic tile drainage control for flat cropland: control depths of 600 mm and 200 mm below the surface. Summed over five growing seasons for one CTD scenario, direct runoff, total N, and dissolved N were reduced by 6.6, 3.5, and 13.7%, respectively. However, five seasons of summed total P, dissolved P, and total suspended solid loads increased as a result of CTD, by 0.96, 1.6, and 0.23%, respectively. The AnnAGNPS results were compared with mass fluxes observed from paired experimental watersheds (250 and 470 ha) in the river basin. The "test" experimental watershed was dominated by CTD and the "reference" watershed by free drainage. Notwithstanding environmental/land use differences between the watersheds and the basin, observed and predicted seasonal discharge reductions were comparable in 100% of the respective cases. Nutrient load comparisons were more consistent for dissolved, relative to particulate, water quality endpoints. For one season under corn crop production, AnnAGNPS predicted a 55% decrease in dissolved N from the basin under one CTD scenario. AnnAGNPS v. 5.2 treats P transport from a surface pool perspective, which is appropriate for many systems. However, for assessment of tile drainage management practices for relatively flat tile-dominated systems, AnnAGNPS may benefit from consideration of P and particulate transport in the subsurface.

11.
Water Res ; 76: 120-31, 2015 Jun 01.
Article in English | MEDLINE | ID: mdl-25799976

ABSTRACT

Serovar prevalence of the zoonotic pathogen, Salmonella enterica, was compared among 1624 surface water samples collected previously from five different Canadian agricultural watersheds over multiple years. Phagetyping, pulsed field gel electrophoresis (PFGE), and antimicrobial resistance subtyping assays were performed on serovars Enteritidis, Typhimurium, and Heidelberg. Serovars and subtypes from surface water were compared with those from animal feces, human sewage, and serovars reported to cause salmonellosis in Canadians. Sixty-five different serovars were identified in surface water; only 32% of these were isolated from multiple watersheds. Eleven of the 13 serovars most commonly reported to cause salmonellosis in Canadians were identified in surface water; isolates of these serovars constituted >40% of the total isolates. Common phagetypes and PFGE subtypes of serovars associated with illness in humans such as S. Enteritidis and S. Typhimurium were also isolated from surface water and animal feces. Antimicrobial resistance was generally low, but was highest among S. Typhimurium. Monitoring of these rivers helps to identify vulnerable areas of a watershed and, despite a relatively low prevalence of S. enterica overall, serovars observed in surface water are an indication of the levels of specific S. enterica serovars present in humans and animals.


Subject(s)
Fresh Water/microbiology , Salmonella Infections/microbiology , Salmonella enterica/isolation & purification , Sewage/microbiology , Agriculture , Animals , Canada/epidemiology , Drug Resistance, Microbial , Feces/microbiology , Humans , Salmonella Infections/epidemiology , Salmonella enterica/drug effects , Salmonella enterica/genetics , Salmonella enteritidis/genetics , Salmonella enteritidis/isolation & purification , Salmonella typhimurium/genetics , Salmonella typhimurium/isolation & purification , Serogroup
12.
J Environ Qual ; 44(1): 236-47, 2015 Jan.
Article in English | MEDLINE | ID: mdl-25602339

ABSTRACT

When surface water levels decline, exposed streambed sediments can be mobilized and washed into the water course when subjected to erosive rainfall. In this study, rainfall simulations were conducted over exposed sediments along stream banks at four distinct locations in an agriculturally dominated river basin, with the objective of quantifying the potential for contaminant loading from these often overlooked runoff source areas. At each location, simulations were performed at three different sites. Nitrogen, phosphorus, sediment, fecal indicator bacteria, pathogenic bacteria, and microbial source tracking (MST) markers were examined in both pre-rainfall sediments and rainfall-induced runoff water. Runoff generation and sediment mobilization occurred quickly (10-150 s) after rainfall initiation. Temporal trends in runoff concentrations were highly variable within and between locations. Total runoff event loads were large for many of the pollutants examined. For instance, the maximum observed total phosphorus runoff load was on the order of 1.5 kg ha⁻¹. Results also demonstrate that runoff from exposed sediments can be a source of pathogenic bacteria: two pathogenic bacterial genera were detected in runoff, from one and from three locations, respectively. Ruminant MST markers were also present in runoff from two locations, one of which hosted pasturing cattle with stream access. Overall, this study demonstrated that rainfall-induced runoff from exposed streambed sediments can be an important source of surface water pollution.

13.
Water Res ; 47(16): 6244-57, 2013 Oct 15.
Article in English | MEDLINE | ID: mdl-24075721

ABSTRACT

Over 3500 individual water samples, spanning 131 sampling times, targeting waterborne pathogens/fecal indicator bacteria were collected during a 7-year period from 4 sites along an intermittent stream running through a small livestock pasture system with and without cattle access-to-stream restriction measures. The study assessed the impact of cattle pasturing/riparian zone protection on: pathogen (bacterial, viral, parasite) occurrence, concentrations of fecal indicators, and quantitative microbial risk assessments (QMRA) of the risk of Cryptosporidium, Giardia and Escherichia coli O157:H7 infection in humans. Methodologies were developed to compute QMRA mean risks on the basis of water samples exhibiting potentially human-infectious Cryptosporidium and E. coli, based on Cryptosporidium genotyping and on E. coli O157:H7 presence/absence information paired with enumerated E. coli. All Giardia spp. were considered infectious. No significant pasturing treatment effects were observed among pathogens, with the exception of Campylobacter spp. and E. coli O157:H7. Campylobacter spp. prevalence significantly decreased downstream through pasture treatments, and E. coli O157:H7 was observed in a few instances in the middle of the unrestricted pasture. Densities of total coliform, fecal coliform, and E. coli decreased significantly downstream in the restricted pasture system, but not in the unrestricted system. Seasonal and flow conditions were associated with greater indicator bacteria densities, especially in the summer. Norovirus GII was detected in 7-22% of samples and rotavirus in 0-7% of samples across the monitoring sites; pasture treatment trends were not evident, however. Seasonal and stream flow variables (and their interactions) were relatively more important than pasture treatments for initially stratifying pathogen occurrence and higher fecal indicator bacteria densities. Significant positive associations between fecal indicator bacteria and Campylobacter spp. detection were observed. For QMRA, adjusting for the proportion of Cryptosporidium spp. detected that are infectious for humans reduces downstream risk estimates by roughly one order of magnitude. Using QMRA in this manner provides a more refined estimate of beneficial management practice effects on pathogen exposure risks to humans.


Subject(s)
Bacterial Physiological Phenomena , Parasites/physiology , Rivers , Virus Physiological Phenomena , Water Microbiology , Animal Husbandry , Animals , Bacterial Load , Cattle , Humans , Population Density , Prevalence , Risk Assessment , Rivers/microbiology , Rivers/parasitology , Rivers/virology , Seasons , Water Movements , Zoonoses/epidemiology
14.
Water Res ; 47(10): 3255-72, 2013 Jun 15.
Article in English | MEDLINE | ID: mdl-23623467

ABSTRACT

Human campylobacteriosis is the leading bacterial gastrointestinal illness in Canada; environmental transmission has been implicated in addition to transmission via consumption of contaminated food. Information about Campylobacter spp. occurrence at the watershed scale will enhance our understanding of the associated public health risks and the efficacy of source water protection strategies. The overriding purpose of this study was to provide a quantitative framework to assess and compare the relative public health significance of watershed microbial water quality associated with agricultural BMPs. A microbial monitoring program was expanded from fecal indicator analyses and Campylobacter spp. presence/absence tests to the development of a novel, 11-tube most probable number (MPN) method that targeted Campylobacter jejuni, Campylobacter coli, and Campylobacter lari. These three types of data were used to make inferences about theoretical risks in a watershed in which controlled tile drainage is widely practiced, an adjacent watershed with conventional (uncontrolled) tile drainage, and reference sites elsewhere in the same river basin. E. coli concentrations (MPN and plate count) in the controlled tile drainage watershed were statistically higher (2008-11) relative to the uncontrolled tile drainage watershed, but yearly variation was high as well. Escherichia coli loading for the years 2008-11 combined was statistically higher in the controlled watershed relative to the uncontrolled tile drainage watershed, but Campylobacter spp. loads for 2010-11 were generally higher for the uncontrolled tile drainage watershed (although not statistically significantly so). Using MPN data and a Bayesian modelling approach, higher mean Campylobacter spp. concentrations were found in the controlled tile drainage watershed relative to the uncontrolled tile drainage watershed (2010, 2011). A second-order quantitative microbial risk assessment (QMRA) was used, in a relative way, to identify differences in mean Campylobacter spp. infection risks among monitoring sites for a hypothetical exposure scenario. Greater relative mean risks were obtained for sites in the controlled tile drainage watershed than in the uncontrolled tile drainage watershed in each year of monitoring, with pair-wise posterior probabilities exceeding 0.699, and the lowest relative mean risks were found at a downstream drinking water intake reference site. The second-order modelling approach was used to partition sources of uncertainty, which revealed that an adequate representation of the temporal variation in Campylobacter spp. concentrations for risk assessment was achieved with as few as 10 MPN data per site. This study demonstrates for the first time how QMRA can be implemented to evaluate, in a relative sense, the public health implications of controlled tile drainage on watershed-scale water quality.
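An MPN estimate such as the 11-tube Campylobacter values above is the maximum-likelihood concentration given the pattern of positive and negative tubes across dilution levels; the sketch below shows that estimator for a hypothetical 11-tube layout, which is not necessarily the dilution scheme used in the study:

```python
# Maximum-likelihood MPN estimate from presence/absence tube results.
# Each dilution has n tubes of volume v (mL); p of them are positive.
# The 11-tube layout below is hypothetical.
import numpy as np
from scipy.optimize import minimize_scalar

volumes_ml = np.array([10.0, 1.0, 0.1])   # sample volume per tube at each dilution
n_tubes = np.array([5, 3, 3])             # tubes inoculated per dilution (total 11)
positives = np.array([4, 2, 0])           # positive tubes per dilution

def neg_log_likelihood(conc_per_ml: float) -> float:
    """Negative log-likelihood of the tube outcomes at concentration conc."""
    p_pos = 1.0 - np.exp(-conc_per_ml * volumes_ml)          # P(tube positive)
    ll = np.sum(positives * np.log(p_pos)
                - conc_per_ml * volumes_ml * (n_tubes - positives))
    return -ll

result = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 100.0), method="bounded")
print(f"MPN estimate: {result.x * 100:.1f} organisms per 100 mL")
```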


Subject(s)
Campylobacter , Escherichia coli , Models, Theoretical , Risk Assessment/methods , Rivers/microbiology , Water Microbiology , Agriculture , Bayes Theorem , Campylobacter/pathogenicity , Campylobacter Infections/epidemiology , Canada , Environmental Monitoring/methods , Escherichia coli/pathogenicity , Escherichia coli Infections/epidemiology , Feces/microbiology , Humans , Ontario , Public Health , Water Quality
15.
J Environ Qual ; 41(4): 1301-14, 2012.
Article in English | MEDLINE | ID: mdl-22751075

ABSTRACT

This 5-yr study compared, via an upstream-downstream experimental design, nutrient and microbial water quality of an intermittent stream running through a small pasture (∼2.5 animals ha⁻¹) where cattle are restricted from the riparian zone (restricted cattle access [RCA]) and where cattle have unrestricted access to the stream (unrestricted cattle access [URCA]). Fencing in the RCA excluded pasturing cattle to within ∼3 to 5 m of the stream. Approximately 88% (26/32) of all comparisons of mean contaminant load reduction for lower, higher, and all stream flow conditions during the 5-yr study indicated net contaminant load reductions in the RCA; for the URCA, this percentage was 38% (12/32). For all flow conditions, mean percent load reductions in the RCA for nutrients and for bacteria plus F-coliphage were 24% and 23%, respectively. The respective percentages for the URCA were -9% and -57% (positive values are reductions; negative values are increases). However, potentially as a result of protected wildlife habitat in the RCA, mean percent load reductions under "all flow" conditions for two of the monitored microbial variables were -321% (RCA) vs. 60% (URCA) and -209% (RCA) vs. 73% (URCA). For "all flow" situations, mean load reductions for the RCA were significantly greater (P < 0.1) than those from the URCA for NH₄-N, dissolved reactive phosphorus (DRP), total coliform, and other monitored variables. For "high flow" situations, mean load reductions were significantly greater for the RCA for DRP, total coliform, and additional variables. For "low flow" conditions, significantly greater mean load reductions were in favor of the RCA for DRP, total P, total coliforms, fecal coliforms, and further variables. In no case were mean pollutant loads in the URCA significantly higher than RCA pollutant loads. Restricting pasturing livestock to within 3 to 5 m of intermittent streams can improve water quality; however, water quality impairment can occur if livestock have unrestricted access to a stream.


Subject(s)
Bacteria/isolation & purification , Cattle , Rivers/chemistry , Rivers/microbiology , Water Pollutants , Water/chemistry , Animal Husbandry , Animals , Coliphages/isolation & purification , Environmental Monitoring , Geologic Sediments/microbiology , Nitrogen/chemistry , Phosphorus/chemistry , Soil Microbiology , Water Microbiology , Water Pollution/prevention & control
16.
J Environ Qual ; 41(1): 21-30, 2012.
Article in English | MEDLINE | ID: mdl-22218170

ABSTRACT

Canada's National Agri-Environmental Standards Initiative sought to develop an environmental benchmark for low-level waterborne pathogen occurrence in agricultural watersheds. A field study collected 902 water samples from 27 sites in four intensive agricultural watersheds across Canada from 2005 to 2007. Four of the sites were selected as reference sites away from livestock and human fecal pollution sources in each watershed. Water samples were analyzed for Campylobacter spp., Salmonella spp., Escherichia coli O157:H7, Cryptosporidium spp., Giardia spp., and the water quality indicator E. coli. The annual mean number of pathogen species was higher at agricultural sites (1.54 ± 0.07 species per water sample) than at reference sites (0.75 ± 0.14 species per water sample). The annual mean concentration of E. coli was also higher at agricultural sites (491 ± 96 colony-forming units [cfu] 100 mL(-1)) than at reference sites (53 ± 18 cfu 100 mL(-1)). The feasibility of adopting existing E. coli water quality guideline values as an environmental benchmark was assessed, but waterborne pathogens were detected at agricultural sites in 80% of water samples with low E. coli concentrations (<100 cfu 100 mL(-1)). Instead, an approach was developed based on using the natural background occurrence of pathogens at reference sites in agricultural watersheds to derive provisional environmental benchmarks for pathogens at agricultural sites. The environmental benchmarks that were derived were found to represent E. coli values lower than geometric mean values typically found in recreational water quality guidelines. Additional research is needed to investigate environmental benchmarks for waterborne pathogens within the context of the "One World, One Health" perspective for protecting human, domestic animal, and wildlife health.


Subject(s)
Agriculture , Benchmarking , Escherichia coli/isolation & purification , Water Microbiology/standards , Water Movements , Water Pollutants/standards , Canada , Ecosystem , Water/parasitology
17.
Water Res ; 45(18): 5807-25, 2011 Nov 15.
Article in English | MEDLINE | ID: mdl-21889781

ABSTRACT

Over a five-year period (2004-08), 1171 surface water samples were collected from up to 24 sampling locations, representing a wide range of stream orders, in a river basin in eastern Ontario, Canada. Water was analyzed for Cryptosporidium oocyst and Giardia cyst densities and for the presence of Salmonella enterica subspecies enterica, Campylobacter spp., Listeria monocytogenes, and Escherichia coli O157:H7. The study objective was to explore associations among pathogen densities/occurrence and objectively defined land use, weather, hydrologic, and water quality variables using CART (Classification and Regression Tree) and binary logistic regression techniques. E. coli O157:H7 detections were infrequent, but detections were related to upstream livestock pasture density; 20% of the detections were located where cattle have access to the watercourses. The ratio of detections to non-detections for Campylobacter spp. was relatively higher (>1) when mean air temperatures were 6% below mean study period temperature values (relatively cooler periods). Cooler water temperatures, which can promote bacteria survival and represent times when land applications of manure typically occur (spring and fall), may have promoted the increased frequency of Campylobacter spp. Fifty-nine percent of all Salmonella spp. detections occurred when river discharge on a branch of the river system of Shreve stream order = 9550 was above the 83rd percentile. Hydrological events that promote off-farm/off-field/in-stream transport must manifest themselves in order for detection of Salmonella spp. to occur in surface water in this region. Fifty-seven percent of L. monocytogenes detections occurred in spring, relative to other seasons. It was speculated that a combination of winter livestock housing, silage feeding during winter, and spring application of manure that accrued during winter contributed to elevated occurrences of this pathogen in spring. Cryptosporidium oocyst and Giardia cyst densities were, overall, positively associated with surface water discharge, and negatively associated with air/water temperature during spring-summer-fall. Yet, some of the highest Cryptosporidium oocyst densities were associated with low discharge conditions on smaller order streams, suggesting wildlife as a contributing fecal source. Fifty-six percent of all detections of ≥2 bacterial pathogens (including Campylobacter spp., Salmonella spp., and E. coli O157:H7) in water were associated with lower water temperatures (<∼14 °C; primarily spring and fall) and with total rainfall in the week prior to sampling of >∼27 mm (62nd percentile). During higher water temperatures (>∼14 °C), a higher amount of weekly rainfall was necessary to promote detection of ≥2 pathogens (primarily summer; weekly rainfall >∼42 mm, >77th percentile; 15% of all ≥2 detections). Less rainfall may have been necessary to mobilize pathogens from adjacent land and/or in-stream sediments during cooler water conditions, as these are times when manures are applied to fields in the area and soil water contents and water table depths are relatively higher. Season, stream order, turbidity, mean daily temperature, surface water discharge, cropland coverage, and nearest upstream distance to a barn and to pasture were variables that were relatively strong and recurrent with regard to discriminating pathogen presence and absence, and parasite densities, in surface water in the region.
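The CART analysis described above splits pathogen presence/absence on thresholds of weather, hydrologic, and land-use covariates; a minimal sketch with scikit-learn, using synthetic data and assumed feature names, is:

```python
# Classification tree for pathogen presence/absence as a function of
# environmental covariates; data and feature names are synthetic placeholders.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(7)
n = 400
X = pd.DataFrame({
    "water_temp_C": rng.uniform(2, 26, n),
    "weekly_rain_mm": rng.gamma(2.0, 12.0, n),
    "stream_order": rng.integers(1, 10, n),
    "upstream_pasture_pct": rng.uniform(0, 60, n),
})
# Synthetic rule: detections more likely when conditions are cool and wet.
y = ((X["water_temp_C"] < 14) & (X["weekly_rain_mm"] > 27)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20, random_state=0)
tree.fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))
```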


Subject(s)
Agriculture , Bacteria/isolation & purification , Environment , Parasites/isolation & purification , Rivers/microbiology , Rivers/parasitology , Animals , Campylobacter/isolation & purification , Cryptosporidium/cytology , Cryptosporidium/isolation & purification , Geography , Giardia/cytology , Giardia/isolation & purification , Logistic Models , Ontario , Oocysts/cytology , Salmonella/isolation & purification , Surface Properties , Water Microbiology , Weather
18.
J Appl Microbiol ; 110(2): 407-21, 2011 Feb.
Article in English | MEDLINE | ID: mdl-21091592

ABSTRACT

AIMS: To isolate and characterize water enterococci from the South Nation River drainage basin, an area dominated by agriculture. METHODS AND RESULTS: A total of 1558 enterococci were isolated from 204 water samples from the South Nation River obtained over a 3-year period. PCR was used to identify isolates to the species level and to characterize them for carriage of 12 virulence determinants. Antibiotic resistance was evaluated phenotypically. Enterococcus faecalis (36.4%), Enterococcus faecium (9.3%) and Enterococcus durans (8.5%) were the major enterococci species isolated. Enterococci carrying more than two virulence determinants were more frequently detected in the summer (59.6%) than in other seasons (≤37.6%). Very few isolates (≤2.0%) were resistant to the category I antibiotics ciprofloxacin and vancomycin. CONCLUSIONS: Comparison of the major water enterococci species with the major faecal enterococci species obtained from various host groups (humans, domesticated mammals and birds, wildlife) in this drainage basin suggests that water enterococci may have varied faecal origins. The low level of antibiotic resistance among enterococci suggests that dispersion of antibiotic resistance via waterborne enterococci in this watershed is not significant. SIGNIFICANCE AND IMPACT OF THE STUDY: The data obtained in this study suggest that water enterococci in the South Nation River have a faecal origin and that their potential impact on public health regarding antibiotic resistance and virulence determinants is minimal.


Subject(s)
Drug Resistance, Bacterial , Enterococcus/drug effects , Enterococcus/pathogenicity , Genes, Bacterial , Rivers/microbiology , Virulence Factors/genetics , Enterococcus/genetics , Enterococcus/isolation & purification , Enterococcus faecalis/drug effects , Enterococcus faecalis/genetics , Enterococcus faecium/drug effects , Enterococcus faecium/genetics , Feces/microbiology , Humans , Ontario , Virulence/genetics
19.
QJM ; 101(7): 557-65, 2008 Jul.
Article in English | MEDLINE | ID: mdl-18400776

ABSTRACT

BACKGROUND: Widow spider bite causes latrodectism and is associated with significant morbidity worldwide. Antivenom is given by both the intravenous (IV) and intramuscular (IM) routes, and it is unclear which is more effective. AIM: To compare the effectiveness of IV vs. IM redback spider antivenom. DESIGN: Randomized controlled trial. METHODS: Patients with latrodectism were given either IV or IM antivenom according to a randomized, double-dummy, double-blind protocol. The first antivenom treatment was followed by another identical treatment after two hours if required. The primary outcome was a clinically significant reduction in pain two hours after the last treatment. A fully Bayesian analysis was used to estimate the probability of the desired treatment effect, predetermined as an absolute difference of 20%. RESULTS: We randomly allocated 126 patients to receive antivenom IV (64) or IM (62). After antivenom treatment, pain improved in 40/64 (62%) in the IV group vs. 33/62 (53%) in the IM group (+9%; 95% credible interval [CrI]: -8% to +26%). The probability of a difference greater than zero (IV superior) was 85%, but the probability of a difference >20% was only 10%. In 55 patients with systemic effects, these improved in 58% after IV antivenom vs. 65% after IM antivenom (-8%; 95% CrI: -32% to +17%). Twenty-four hours after antivenom, pain had improved in 84% in the IV group vs. 71% in the IM group (+13%; 95% CrI: -2% to +27%). A meta-analysis including data from a previous trial found no difference in the primary outcome between IV and IM administration. DISCUSSION: The difference between the IV and IM routes of administration of widow spider antivenom is, at best, small and does not justify routinely choosing one route over the other. Furthermore, antivenom may provide no benefit over placebo.
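The reported posterior probabilities (85% for any IV benefit, 10% for a benefit exceeding 20%) can be reproduced approximately by Monte Carlo sampling from Beta posteriors on each arm's response proportion; the sketch below assumes uniform Beta(1, 1) priors, which may differ from the trial's actual model:

```python
# Posterior probability that IV antivenom outperforms IM, using the reported
# primary-outcome counts and uniform Beta(1, 1) priors (an assumption).
import numpy as np

rng = np.random.default_rng(42)
draws = 200_000

p_iv = rng.beta(1 + 40, 1 + 64 - 40, size=draws)   # 40/64 improved in the IV group
p_im = rng.beta(1 + 33, 1 + 62 - 33, size=draws)   # 33/62 improved in the IM group
diff = p_iv - p_im

print(f"Mean absolute difference: {diff.mean():+.1%}")
print(f"P(IV better than IM):     {(diff > 0).mean():.0%}")     # ~85% reported in the trial
print(f"P(difference > 20%):      {(diff > 0.20).mean():.0%}")  # ~10% reported in the trial
```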


Subject(s)
Antivenins/administration & dosage , Pain/drug therapy , Spider Bites/drug therapy , Spider Venoms/antagonists & inhibitors , Adult , Antivenins/pharmacology , Bayes Theorem , Dose-Response Relationship, Drug , Epidemiologic Methods , Female , Humans , Injections, Intramuscular , Injections, Intravenous , Male , Middle Aged , Monitoring, Ambulatory