Results 1 - 10 of 10
1.
J Dairy Sci ; 102(7): 6682-6698, 2019 Jul.
Article in English | MEDLINE | ID: mdl-31128869

ABSTRACT

Our objective was to compare the composition of bedding materials and manure, cow welfare and hygiene assessments, measures of milk production and quality, and incidence of mastitis during a 3-yr trial with lactating Holstein cows housed in a freestall barn containing 4 identical pens with 32 freestalls/pen. Bedding systems evaluated included deep-bedded organic manure solids (DBOS), shallow-bedded manure solids spread over mattresses (MAT), deep-bedded recycled sand (RSA), and deep-bedded new sand (NSA). The experiment was designed as a 4 × 4 Latin square with 4 bedding systems and 4 experimental periods, but was terminated after 3 yr following discussions with the consulting statistician; therefore, data were analyzed as an incomplete Latin square. A total of n = 734 mostly primiparous cows (n = 725 primiparous, n = 9 multiparous; 224 to 267 cows/yr) were enrolled in the trial. Before placement in freestalls, organic solids (OS) exhibited lower concentrations of dry matter (36.5 vs. 94.3%), and greater concentrations of volatile solids, C, N, NH4-N, P, water-extractable P, K, and S compared with RSA or NSA. Cow comfort index was greater for sand-bedded systems compared with those using OS (88.4 vs. 82.8%). Cows bedded in systems using OS (DBOS and MAT) exhibited greater mean hock scores (1 = no swelling, no hair loss; 2 = no swelling, bald area on hock) than those bedded in sand (1.25 vs. 1.04), but this effect was entirely associated with use of mattresses (MAT), which differed sharply from DBOS (1.42 vs. 1.07). Generally, hygiene scores for legs, flanks, and udders were numerically similar for DBOS, NSA, and RSA bedding systems, and differences between bedding systems were associated entirely with MAT, yielding detectable contrasts between MAT and DBOS for legs (2.94 vs. 2.20), flanks (2.34 vs. 1.68), and udders (1.83 vs. 1.38). No significant contrast comparing bedding systems was detected for measures of milk production or quality. Documented cases of clinical mastitis requiring treatment ranged from a low rate of 7.4 cases/yr for RSA to a high of 23.1 cases/yr for DBOS, based on a mean enrollment of 60.7 to 63.0 cows/treatment per yr. Cows bedded with OS exhibited a greater incidence of mastitis than those bedded with sand (19.0 vs. 8.4 cases/yr), but no differences were observed for comparisons within individual bedding-material types. Collectively, these results generally favored use of sand-bedding materials over systems using OS.
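As an illustrative back-of-the-envelope check (not part of the published analysis), the raw mastitis case counts can be expressed per 100 cow-years. The abstract reports mean enrollments of roughly 61 to 63 cows per treatment per year; the sketch below simply uses 62 for both treatments and treats each enrolled cow as contributing about one cow-year at risk.

def incidence_per_100_cow_years(cases_per_year, cows_enrolled_per_year):
    # Assumes each enrolled cow contributes roughly one cow-year at risk
    return 100.0 * cases_per_year / cows_enrolled_per_year

print(round(incidence_per_100_cow_years(23.1, 62.0), 1))  # DBOS, ~37 cases/100 cow-years
print(round(incidence_per_100_cow_years(7.4, 62.0), 1))   # RSA, ~12 cases/100 cow-years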


Subject(s)
Cattle , Dairying/methods , Housing, Animal , Silicon Dioxide , Animal Husbandry , Animals , Female , Hygiene , Incidence , Lactation , Mastitis, Bovine/epidemiology , Mastitis, Bovine/prevention & control , Random Allocation
2.
Water Res ; 157: 356-364, 2019 Jun 15.
Article in English | MEDLINE | ID: mdl-30970285

ABSTRACT

Enteric viruses pose the greatest acute human health risks associated with subsurface drinking water supplies, yet quantitative risk assessment tools have rarely been used to develop health-based targets for virus treatment in drinking water sourced from these supplies. Such efforts have previously been hampered by a lack of consensus concerning a suitable viral reference pathogen and dose-response model, as well as by difficulties in quantifying pathogenic viruses in water. A reverse quantitative microbial risk assessment (QMRA) framework and quantitative polymerase chain reaction data for norovirus genogroup I in subsurface drinking water supplies were used herein to evaluate treatment needs for these supplies. Norovirus was not detected in over 90% of samples, which emphasizes the need to consider the spatially and/or temporally intermittent patterns of enteric pathogen contamination in subsurface water supplies. Collectively, this analysis reinforces existing recommendations that a minimum 4-log treatment goal is needed for enteric viruses in groundwater in the absence of well-specific monitoring information. This result is sensitive to the virus dose-response model used, as there is approximately a 3-log discrepancy among virus dose-response models in the existing literature. This underscores the need to address the uncertainties and lack of consensus surrounding the various QMRA modelling approaches, and the analytical limitations that preclude a more accurate description of virus risks.
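For context, the following is a minimal reverse-QMRA sketch in Python; the dose-response form, parameter values, ingestion volume, and risk target are hypothetical illustrations, not values taken from the study. It works backward from a tolerable per-exposure infection risk to the log10 virus reduction a treatment process would need to provide.

import math

def required_log_reduction(conc_per_L, ingestion_L=1.0, target_risk=1e-4, r=0.5):
    """Back-calculate treatment needs with an exponential dose-response model,
    P(infection) = 1 - exp(-r * dose). All parameter values are illustrative."""
    max_dose = -math.log(1.0 - target_risk) / r          # tolerable dose per exposure
    max_conc = max_dose / ingestion_L                    # tolerable post-treatment concentration
    return max(0.0, math.log10(conc_per_L / max_conc))   # log10 reduction required

# Example: a hypothetical source-water concentration of 10 genomic copies/L
print(round(required_log_reduction(conc_per_L=10.0), 1))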


Subject(s)
Drinking Water , Groundwater , Viruses , Disinfection , Humans , Water Microbiology , Water Supply
3.
J Anim Sci ; 96(3): 964-974, 2018 Apr 03.
Article in English | MEDLINE | ID: mdl-29401268

ABSTRACT

Dairy slurry is used commonly as an animal-sourced fertilizer in agronomic production. However, residual effects of slurry application on intake and digestibility of alfalfa (Medicago sativa L.) silage from subsequent harvests are not well known. The objective of this study was to determine whether moisture concentration of alfalfa silage and timing of dairy slurry application relative to subsequent harvest affected intake and digestibility by sheep. Katahdin crossbred ewes (n = 18; 48 ± 5.3 kg) in mid-gestation were stratified by BW and allocated randomly in each of two periods to one of six treatments in a 2 × 3 factorial arrangement. Treatments consisted of recommended (RM; 46.8%) or low (LM; 39.7%) moisture at baling after either no slurry application (NS), slurry application to stubble immediately after removal of the previous cutting (S0), or slurry application 14 d after removal of the previous cutting (S14). Silages were chopped through a commercial straw chopper, packed into plastic trash cans, and then offered to ewes within 4 d of chopping. Period 1 of the intake and digestion study consisted of a 14-d adaptation followed by a 7-d fecal collection period. Period 2 followed period 1 after a 4-d rest and consisted of an 11-d adaptation followed by 7 d of fecal collection. Ewes were housed individually in 1.4 × 4.3-m pens equipped with rubber mat flooring. Feces were swept from the floor twice daily, weighed, and dried at 50 °C. Ewes had ad libitum access to water and were offered chopped silage at a level allowing a minimum of 10% refusal (DM basis). Blood samples were collected immediately prior to feeding, and 4 and 8 h after feeding, on the day prior to the end of each period. Organic matter intake (g/kg BW) and OM digestibility tended (P < 0.10) to be reduced, and digestible OM intake (g/kg BW) was reduced, by slurry application. Lymphocytes (% of total white blood cells) were greater (P < 0.05) for LM vs. RM and for NS vs. S0 and S14. Red blood cell concentrations were greater (P < 0.05) for S14 vs. S0 and for S0 and S14 vs. NS. Serum urea N concentrations did not differ (P > 0.17) across treatments. Therefore, moisture concentration of alfalfa silage within the range used in this study may not affect voluntary intake or digestibility, but slurry application may reduce digestible OM intake. Moisture concentration of alfalfa silage and timing of dairy slurry application may also affect specific hemogram measures.


Subject(s)
Fertilizers/analysis , Medicago sativa , Sheep/physiology , Silage/analysis , Animals , Blood Urea Nitrogen , Diet/veterinary , Digestion , Feces , Female , Fermentation , Fertilizers/adverse effects , Manure , Random Allocation
4.
Water Res ; 113: 11-21, 2017 Apr 15.
Article in English | MEDLINE | ID: mdl-28187346

ABSTRACT

Great Lakes tributaries are known to deliver waterborne pathogens from a host of sources. To examine the hydrologic, land cover, and seasonal patterns of waterborne pathogens (i.e., protozoa (2), pathogenic bacteria (4), human viruses (8), and bovine viruses (8)), eight rivers were monitored in the Great Lakes Basin over 29 months from February 2011 to June 2013. Sampling locations represented a wide variety of land cover classes, from urban to agriculture to forest. A custom automated pathogen sampler was deployed at the eight sampling locations, providing unattended, flow-weighted, large-volume (120-1630 L) sampling. Human viruses, bovine viruses, and pathogenic bacteria were detected by real-time qPCR in 16%, 14%, and 1.4% of the 290 samples collected, respectively, whereas protozoa were never detected. The most frequently detected pathogens were bovine polyomavirus (11%) and human adenovirus C, D, F (9%). Human and bovine viruses were present in 16.9% and 14.8% of runoff-event samples (n = 189) resulting from precipitation and snowmelt, and in 13.9% and 12.9% of low-flow samples (n = 101), respectively, indicating that multiple delivery mechanisms could be influential. Human and bovine virus prevalence differed with land cover within the watershed. Occurrence, concentration, and flux of human viruses were greater in samples from the three sampling locations with more than 25% urban influence than in those with less than 25% urban influence. Similarly, occurrence, concentration, and flux of bovine viruses were greater in samples from the two sampling locations with more than 50 cattle/km2 than in those with less than 50 cattle/km2. In the seasonal analysis, human and bovine viruses occurred more frequently in spring and winter than in fall and summer. Concentration, occurrence, and flux must be considered in the context of hydrologic condition, seasonality, and land use for each watershed individually to develop effective watershed management strategies for pathogen reduction.
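As a point of reference, here is a small illustrative calculation (numbers invented for the example, not taken from the monitoring data) of how an instantaneous virus flux follows from a concentration and stream discharge.

def daily_flux(conc_gc_per_L, discharge_m3_per_s):
    """Virus load delivered per day (genomic copies/day): concentration times
    the volume of water passing the station in a day."""
    litres_per_day = discharge_m3_per_s * 1000.0 * 86400.0  # m3/s -> L/day
    return conc_gc_per_L * litres_per_day

# Hypothetical example: 2 gc/L at a discharge of 5 m3/s
print(f"{daily_flux(2.0, 5.0):.2e} gc/day")   # ~8.6e+08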


Subject(s)
Lakes , Seasons , Animals , Cattle , Environmental Monitoring , Humans , Hydrology , Rivers
5.
J Dairy Sci ; 97(11): 7197-211, 2014 Nov.
Article in English | MEDLINE | ID: mdl-25242431

ABSTRACT

Dairy producers frequently ask questions about the risks associated with applying dairy slurry to growing alfalfa (Medicago sativa L.). Our objectives were to determine the effects of applying dairy slurry on the subsequent nutritive value and fermentation characteristics of alfalfa balage. Dairy slurry was applied to 0.17-ha plots of alfalfa; applications were made to the second (HARV1) and third (HARV2) cuttings during June and July of 2012, respectively, at mean rates of 42,400 ± 5271 and 41,700 ± 2397 L/ha, respectively. Application strategies included (1) no slurry, (2) slurry applied directly to stubble immediately after the preceding harvest, (3) slurry applied after 1 wk of post-ensiled regrowth, or (4) slurry applied after 2 wk of regrowth. All harvested forage was packaged in large, rectangular bales that were ensiled as wrapped balage. Yields of DM harvested from HARV1 (2,477 kg/ha) and HARV2 (781 kg/ha) were not affected by slurry application treatment. By May 2013, all silages appeared to be well preserved, with no indication of undesirable odors characteristic of clostridial fermentations. Clostridium tyrobutyricum, which is known to negatively affect cheese production, was not detected in any forage on either a pre- or post-ensiled basis. On a pre-ensiled basis, counts for Clostridium cluster 1 were greater for slurry-applied plots than for those receiving no slurry, and this response was consistent for HARV1 (4.44 vs. 3.29 log10 genomic copies/g) and HARV2 (4.99 vs. 3.88 log10 genomic copies/g). Similar responses were observed on a post-ensiled basis; however, post-ensiled counts also were greater for HARV1 (5.51 vs. 5.17 log10 genomic copies/g) and HARV2 (5.84 vs. 5.28 log10 genomic copies/g) when slurry was applied to regrowth compared with stubble. For HARV2, counts also were greater following a 2-wk application delay compared with a 1-wk delay (6.23 vs. 5.45 log10 genomic copies/g). These results suggest that the risk of clostridial fermentations in alfalfa silages is greater following applications of slurry. Based on pre- and post-ensiled clostridial counts, applications of dairy slurry on stubble are preferred (and less risky) compared with delayed applications on growing alfalfa.


Subject(s)
Fertilizers/analysis , Medicago sativa/metabolism , Nutritive Value , Animal Nutritional Physiological Phenomena , Animals , Diet/veterinary , Fermentation , Fertilizers/adverse effects , Manure , Medicago sativa/chemistry , Silage/analysis
6.
Sci Total Environ ; 490: 849-60, 2014 Aug 15.
Article in English | MEDLINE | ID: mdl-24908645

ABSTRACT

To examine the occurrence, hydrologic variability, and seasonal variability of human and bovine viruses in surface water, three stream locations were monitored in the Milwaukee River watershed in Wisconsin, USA, from February 2007 through June 2008. Monitoring sites included an urban subwatershed, a rural subwatershed, and the Milwaukee River at its mouth. To collect samples that characterize variability throughout changing hydrologic periods, a process control system was developed for unattended, large-volume (56-2800 L) filtration over extended durations. This system provided flow-weighted mean concentrations during runoff and extended (24-h) low-flow periods. Human viruses and bovine viruses were detected by real-time qPCR in 49% and 41% of samples (n=63), respectively. All human viruses analyzed were detected at least once, including adenovirus (40% of samples), GI norovirus (10%), enterovirus (8%), rotavirus (6%), GII norovirus (1.6%) and hepatitis A virus (1.6%). Three of seven bovine viruses analyzed were detected, including bovine polyomavirus (32%), bovine rotavirus (19%), and bovine viral diarrhea virus type 1 (5%). Human viruses were present in 63% of runoff samples resulting from precipitation and snowmelt, and 20% of low-flow samples. Maximum human virus concentrations exceeded 300 genomic copies/L. Bovine viruses were present in 46% of runoff samples resulting from precipitation and snowmelt and 14% of low-flow samples. The maximum bovine virus concentration was 11 genomic copies/L. Statistical modeling indicated that stream flow, precipitation, and season explained the variability of human viruses in the watershed, and that hydrologic condition (runoff event or low-flow) and season explained the variability of the sum of human and bovine viruses; however, no model was identified that could explain the variability of bovine viruses alone. Understanding the factors that affect virus fate and transport in rivers will aid watershed management for minimizing human exposure and disease transmission.
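For illustration, a brief sketch (with made-up numbers) of the flow-weighted mean concentration concept referenced above: each discrete measurement is weighted by the discharge at the time it was taken, so high-flow periods dominate the event-mean value.

def flow_weighted_mean(concentrations, discharges):
    """concentrations: virus concentration (e.g., genomic copies/L) per subsample.
    discharges: stream discharge at the time of each subsample (consistent units)."""
    total_flow = sum(discharges)
    return sum(c * q for c, q in zip(concentrations, discharges)) / total_flow

# Hypothetical event: three subsamples taken at rising, peak, and falling flow
print(round(flow_weighted_mean([5.0, 120.0, 40.0], [1.0, 8.0, 3.0]), 1))  # ~90.4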


Subject(s)
Environmental Monitoring , Rivers/virology , Viruses/growth & development , Water Microbiology , Animals , Cattle , Humans , Viruses/classification , Wisconsin
7.
Water Res ; 46(13): 4281-91, 2012 Sep 01.
Article in English | MEDLINE | ID: mdl-22673345

ABSTRACT

Naturally occurring inhibitory compounds are a major concern during qPCR and RT-qPCR analysis of environmental samples, particularly large-volume water samples. Here, a standardized method for measuring and mitigating sample inhibition in environmental water concentrates is described. Specifically, the method 1) employs a commercially available standard RNA control; 2) defines inhibition by the change in the quantification cycle (Cq) of the standard RNA control when added to the sample concentrate; and 3) calculates a dilution factor, using a mathematical formula applied to the change in Cq, to indicate the specific volume of nuclease-free water necessary to dilute the effect of inhibitors. The standardized inhibition method was applied to 3,193 large-volume water (surface, groundwater, drinking water, agricultural runoff, sewage) concentrates, of which 1,074 (34%) were inhibited. Inhibition level was not related to sample volume. Samples collected from the same locations over a one- to two-year period had widely variable inhibition levels. The proportion of samples that could have been reported as false negatives if inhibition had not been mitigated was between 0.3% and 71%, depending on water source. These findings emphasize the importance of measuring and mitigating inhibition when reporting qPCR results for viral pathogens in environmental waters, to minimize the likelihood of reporting false negatives and under-quantifying virus concentration.
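The abstract does not give the dilution formula itself, so the following Python sketch uses a common assumption (perfect doubling per cycle, i.e., an amplification factor of 2) and should be read as an illustration of the idea rather than the authors' exact method.

def dilution_factor(cq_sample, cq_reference, efficiency=2.0, threshold=1.0):
    """Estimate the fold-dilution needed to relieve inhibition.
    cq_sample: Cq of the control RNA spiked into the sample concentrate.
    cq_reference: Cq of the same control in nuclease-free water.
    efficiency: assumed amplification factor per cycle (2.0 = perfect doubling).
    threshold: Cq shift below which the sample is treated as uninhibited."""
    delta_cq = cq_sample - cq_reference
    if delta_cq <= threshold:
        return 1.0                       # no dilution needed
    return efficiency ** delta_cq        # e.g., a 3-cycle shift -> ~8-fold dilution

print(round(dilution_factor(31.5, 28.5), 1))   # hypothetical values -> 8.0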


Subject(s)
GB virus C/genetics , RNA, Viral/genetics , Real-Time Polymerase Chain Reaction/methods , Reverse Transcriptase Polymerase Chain Reaction/methods , Water Microbiology , Drinking Water/virology , Environmental Monitoring/methods , GB virus C/isolation & purification , Groundwater/virology , RNA, Viral/isolation & purification , Reproducibility of Results , Sewage/virology
8.
J Virol Methods ; 163(2): 244-52, 2010 Feb.
Article in English | MEDLINE | ID: mdl-19835913

ABSTRACT

Quantifying infectious viruses by cell culture depends on visualizing cytopathic effect (CPE) or, for integrated cell culture-PCR, on attaining confidence that a PCR-positive signal is the result of virus growth and not inoculum carryover. This study developed mathematical methods to calculate infectious virus numbers based on viral growth kinetics in cell culture. Poliovirus was inoculated into BGM cell monolayers at 10 concentrations from 0.001 to 1000 PFU/mL. Copy numbers of negative-strand RNA, a marker of infectivity for positive-sense single-stranded RNA viruses, were measured over time by qRT-PCR. Growth data were analyzed by two approaches. First, data were fit with a continuous function to directly estimate the initial virus number, expressed as genomic copies. Such estimates correlated with actual inoculum numbers across all concentrations (R² = 0.62, n = 17). Second, the length of the lag phase appeared to vary inversely with inoculum titers; hence, standard curves to predict inoculum virus numbers were derived based on three definitions of lag time: (1) the time of first detection of (-)RNA, (2) the second-derivative maximum of the fitted continuous function, and (3) the time when the fitted curve crossed a threshold (-)RNA concentration. All three proxies yielded standard curves with R² = 0.69-0.90 (n = 17). The primary benefit of these growth kinetics approaches is the ability to quantify virions that are unambiguously infectious, a particular advantage for viruses that do not produce CPE.
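To make the curve-fitting idea concrete, here is a small Python sketch with invented data: a logistic curve is fitted to (-)RNA copy numbers over time, and the lag time is taken as the point of maximum curvature (second-derivative maximum) of the fitted curve, one of the three lag definitions listed above.

import numpy as np
from scipy.optimize import curve_fit

def logistic(t, lower, upper, rate, midpoint):
    return lower + (upper - lower) / (1.0 + np.exp(-rate * (t - midpoint)))

# Hypothetical time course of log10 (-)RNA copies, not data from the study
hours = np.array([0, 2, 4, 6, 8, 10, 12, 16, 20, 24], dtype=float)
log_copies = np.array([1.0, 1.0, 1.1, 1.4, 2.2, 3.5, 4.6, 5.3, 5.5, 5.5])

params, _ = curve_fit(logistic, hours, log_copies, p0=[1.0, 5.5, 0.8, 10.0])

fine_t = np.linspace(hours.min(), hours.max(), 2000)
fitted = logistic(fine_t, *params)
second_deriv = np.gradient(np.gradient(fitted, fine_t), fine_t)
lag_time = fine_t[np.argmax(second_deriv)]
print(round(float(lag_time), 1))  # earlier lag implies a larger inoculum on the standard curve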


Subject(s)
Environmental Microbiology , Models, Theoretical , Polymerase Chain Reaction/methods , Viral Load , Virology/methods , Viruses/growth & development , Animals , Cell Culture Techniques , Chlorocebus aethiops , Gene Dosage , Viruses/genetics
9.
J Appl Microbiol ; 107(4): 1089-97, 2009 Oct.
Article in English | MEDLINE | ID: mdl-19486387

ABSTRACT

AIMS: To evaluate the effectiveness of continuous separation channel centrifugation for concentrating Toxoplasma gondii and Cyclospora cayetanensis from drinking water and environmental waters. METHODS AND RESULTS: Ready-to-seed vials with known quantities of T. gondii and C. cayetanensis oocysts were prepared by flow cytometry. Oocysts were seeded at densities ranging from 1 to 1000 oocysts/L into 10- to 100-L test volumes of finished drinking water, water with manipulated turbidity, and the source waters of nine drinking water utilities. Oocysts were recovered using continuous separation channel centrifugation and counted on membrane filters using epifluorescence microscopy. Recovery efficiencies of both parasites were ≥84% in 10-L volumes of drinking water. In source waters, recoveries ranged from 64% to 100%, with the lowest recoveries in the most turbid waters. Method precision was between 10% and 20% coefficient of variation. CONCLUSION: Toxoplasma gondii and C. cayetanensis are effectively concentrated from various water matrices by continuous separation channel centrifugation. SIGNIFICANCE AND IMPACT OF THE STUDY: Waterborne transmission of T. gondii and C. cayetanensis presents another challenge in producing clean drinking water and protecting public health. Detection of these parasites relies on effectively concentrating oocysts from ambient water; otherwise, false negatives may result. Validation data specific to T. gondii and C. cayetanensis concentration methods are limited. Continuous separation channel centrifugation recovers oocysts with high efficiency and precision, the method attributes required to accurately assess the risk of waterborne transmission.
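For clarity, the two performance metrics reported above (percent recovery and coefficient of variation) can be computed as in this short sketch; the replicate counts are invented for the example.

from statistics import mean, stdev

def percent_recovery(counted, seeded):
    return 100.0 * counted / seeded

# Hypothetical replicates: oocysts counted after seeding 100 oocysts per trial
recoveries = [percent_recovery(c, 100) for c in (92, 85, 88, 95)]
cv_percent = 100.0 * stdev(recoveries) / mean(recoveries)
print(round(mean(recoveries), 1), round(cv_percent, 1))  # mean recovery %, CV %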


Subject(s)
Centrifugation/methods , Cyclospora/isolation & purification , Fresh Water/parasitology , Toxoplasma/isolation & purification , Water Microbiology , Water Supply , Animals , Environmental Monitoring , Oocysts/parasitology , Water Supply/standards
10.
J Appl Microbiol ; 92(4): 649-56, 2002.
Article in English | MEDLINE | ID: mdl-11966905

ABSTRACT

AIMS: The aim of this study was to determine the effectiveness of continuous separation channel centrifugation for concentrating water-borne pathogens of various taxa and sizes. METHODS AND RESULTS: Cryptosporidium parvum oocysts, Giardia lamblia cysts, Encephalitozoon intestinalis spores and Escherichia coli were seeded into different water matrices at densities ranging from 5 to 10,000 organisms/L and recovered using continuous separation channel centrifugation. All pathogens were enumerated on membrane filters using microscopy. Recovery efficiencies were usually >90%. Oocyst recovery did not vary with source water turbidity or with centrifuge flow rate up to 250 mL/min. Based on excystation, this concentration method did not alter oocyst viability. CONCLUSIONS: Continuous separation channel centrifugation is an effective means of concentrating water-borne pathogens. SIGNIFICANCE AND IMPACT OF THE STUDY: Methods are needed for detecting pathogens in drinking water to ensure public health. The first step for any pathogen detection procedure is concentration. However, this step has been problematic because recovery efficiencies of conventional methods, like filtration, are often low and variable, which may lead to false negatives. Continuous separation channel centrifugation can simultaneously concentrate multiple pathogens as small as 1 µm with high and reproducible efficiency in a variety of water matrices.


Subject(s)
Cryptosporidium/isolation & purification , Escherichia coli/isolation & purification , Microsporidia/isolation & purification , Water Microbiology , Water/parasitology , Animals , Centrifugation/methods , Humans