ABSTRACT
We used quantitative microbial risk assessment to estimate ingestion risk for intI1, erm(B), sul1, tet(A), tet(W), and tet(X) in private wells contaminated by human and/or livestock feces. Genes were quantified along with five human-specific and six bovine-specific microbial source-tracking (MST) markers in 138 well-water samples from a rural Wisconsin county. Daily ingestion risk (the probability of swallowing ≥1 gene copy) was based on daily water consumption and a Poisson exposure model. Calculations were stratified by MST source and by the soil depth over the aquifer where wells were drilled. Relative ingestion risk was estimated using wells with no MST detections and >6.1 m soil depth as the referent category. Daily ingestion risk varied from 0 to 8.8 × 10⁻¹ by gene and fecal source (i.e., human or bovine). The estimated number of residents ingesting target genes from private wells varied from 910 (tet(A)) to 1,500 (intI1 and tet(X)) per day out of 12,000 total. Relative risk of tet(A) ingestion was significantly higher in wells with MST markers detected, including wells with ≤6.1 m soil depth contaminated by bovine markers (2.2 [90% CI: 1.1-4.7]), wells with >6.1 m soil depth contaminated by bovine markers (1.8 [1.002-3.9]), and wells with ≤6.1 m soil depth contaminated by bovine and human markers simultaneously (3.1 [1.7-6.5]). Antibiotic resistance genes (ARGs) were not necessarily present in viable microorganisms, and ingestion is not directly associated with infection. However, the results illustrate the relative contributions of human and livestock fecal sources to ARG exposure and highlight rural groundwater as a significant point of exposure.
IMPORTANCE
Antibiotic resistance is a global public health challenge with well-known environmental dimensions, but quantitative analyses of the roles played by various natural environments in the transmission of antibiotic resistance are lacking, particularly for drinking water.
This study assesses risk of ingestion for several antibiotic resistance genes (ARGs) and the class 1 integron gene (intI1) in drinking water from private wells in a rural area of northeast Wisconsin, United States. Results allow comparison of drinking water as an exposure route for antibiotic resistance relative to other routes like food and recreational water. They also enable a comparison of the importance of human versus livestock fecal sources in the study area. Our study demonstrates the previously unrecognized importance of untreated rural drinking water as an exposure route for antibiotic resistance and identifies bovine fecal material as an important exposure factor in the study setting.
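The Poisson exposure model described above can be sketched briefly. This is a minimal illustration of the general approach, not the authors' exact implementation; the concentration and consumption values in the example are hypothetical.

```python
import math

def daily_ingestion_risk(conc_gc_per_liter, liters_per_day):
    """Probability of ingesting >= 1 gene copy in a day, assuming gene
    copies are Poisson-distributed in the water consumed:
    P(X >= 1) = 1 - exp(-lambda), where lambda = C * V."""
    expected_copies = conc_gc_per_liter * liters_per_day
    return 1.0 - math.exp(-expected_copies)

# Hypothetical example: 0.5 gene copies per liter, 1.1 L consumed per day
risk = daily_ingestion_risk(0.5, 1.1)
```

The same form underlies most QMRA exposure steps: the mean dose sets the rate parameter of a Poisson count, and risk is one minus the probability of a zero count.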
Subject(s)
Anti-Bacterial Agents; Drinking Water; Animals; Humans; Cattle; Anti-Bacterial Agents/pharmacology; Bacterial Genes; Livestock; Feces; Soil; Risk Assessment; Microbial Drug Resistance/genetics; Eating
ABSTRACT
Through a community intervention in 14 non-disinfecting municipal water systems, we quantified sporadic acute gastrointestinal illness (AGI) attributable to groundwater. Ultraviolet (UV) disinfection was installed on all supply wells of intervention communities, while residents of control communities continued to drink non-disinfected groundwater. Intervention and control communities switched treatments by moving the UV disinfection units at the study midpoint (crossover design). Study participants (n = 1,659) completed weekly health diaries during four 12-week surveillance periods, and water supply wells were analyzed monthly for enteric pathogenic viruses. In the crossover analysis, no groundwater-borne AGI was observed. However, virus types and quantities in supply wells changed through the study, suggesting that exposure was not constant. As an alternative, we compared AGI incidence between intervention and control communities within the same surveillance period. During Period 1, norovirus contaminated supply wells, and the AGI risk attributable to well water was 19% (95% CI: -4%, 36%) for children <5 years and 15% (95% CI: -9%, 33%) for adults. During Period 3, echovirus 11 contaminated supply wells, and UV disinfection slightly reduced AGI in adults. Estimates of the AGI risk attributable to drinking non-disinfected groundwater were highly variable but appeared greatest during times when supply wells were contaminated with specific AGI-etiologic viruses.
Subject(s)
Drinking Water; Groundwater; Adult; Child; Humans; Water Supply; Disinfection; Human Enterovirus B
ABSTRACT
Antimicrobial resistance is a growing public health problem that requires an integrated approach among human, agricultural, and environmental sectors. However, few studies address all three components simultaneously. We investigated the occurrence of five antibiotic resistance genes (ARGs) and the class 1 integron gene (intI1) in private wells drawing water from a vulnerable aquifer influenced by residential septic systems and land-applied dairy manure. Samples (n = 138) were collected across four seasons from a randomized sample of private wells in Kewaunee County, Wisconsin. Measurements of ARGs and intI1 were related to microbial source tracking (MST) markers specific to human and bovine feces; they were also related to 54 risk factors for contamination representing land use, rainfall, hydrogeology, and well construction. ARGs and intI1 occurred in 5%-40% of samples depending on target. Detection frequencies for ARGs and intI1 were lowest in the absence of human and bovine MST markers (1%-30%), highest when co-occurring with human and bovine markers together (11%-78%), and intermediate when co-occurring with just one type of MST marker (4%-46%). Gene targets were associated with septic system density more often than agricultural land, potentially because of the variable presence of manure on the landscape. Determining ARG prevalence in a rural setting with mixed land use allowed an assessment of the relative contribution of human and bovine fecal sources. Because fecal sources co-occurred with ARGs at similar rates, interventions intended to reduce ARG occurrence may be most effective if both sources are considered.
Subject(s)
Anti-Bacterial Agents; Manure; Animals; Humans; Cattle; Anti-Bacterial Agents/pharmacology; Livestock; Feces; Microbial Drug Resistance/genetics
ABSTRACT
Infection risk from waterborne pathogens can be estimated via quantitative microbial risk assessment (QMRA) and forms an important consideration in the management of public groundwater systems. However, few groundwater QMRAs use site-specific hazard identification and exposure assessment, so prevailing risks in these systems remain poorly defined. We estimated the infection risk for nine waterborne pathogens based on a 2-year pathogen occurrence study in which 964 water samples were collected from 145 public wells throughout Minnesota, USA. Annual risk across all nine pathogens combined was 3.3 × 10⁻¹ (95% CI: 2.3 × 10⁻¹ to 4.2 × 10⁻¹), 3.9 × 10⁻² (2.3 × 10⁻² to 5.4 × 10⁻²), and 1.2 × 10⁻¹ (2.6 × 10⁻² to 2.7 × 10⁻¹) infections person⁻¹ year⁻¹ for noncommunity, nondisinfecting community, and disinfecting community wells, respectively. Risk estimates exceeded the U.S. benchmark of 10⁻⁴ infections person⁻¹ year⁻¹ in 59% of well-years, indicating that the risk was widespread. While the annual risk for all pathogens combined was relatively high, the average daily doses for individual pathogens were low, indicating that significant risk results from sporadic pathogen exposure. Cryptosporidium dominated annual risk, so improved identification of wells susceptible to Cryptosporidium contamination may be important for risk mitigation.
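Annual risks of the kind reported above are typically built up from per-day risks by assuming independent daily exposures. A sketch of that aggregation step (a standard QMRA convention, not necessarily the study's exact computation):

```python
def annual_risk(daily_risks):
    """Annual infection risk assuming independent daily exposures:
    1 minus the probability of escaping infection on every day."""
    p_escape_all = 1.0
    for p in daily_risks:
        p_escape_all *= (1.0 - p)
    return 1.0 - p_escape_all

# Constant daily risk of 1e-4 over a year
p_year = annual_risk([1e-4] * 365)
```

With a constant daily risk p, this reduces to 1 - (1 - p)^365, which is why even small sporadic exposures can push annual risk above the 10⁻⁴ benchmark.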
Subject(s)
Cryptosporidiosis; Cryptosporidium; Viruses; Bacteria; Humans; Minnesota; Risk Assessment; Water Microbiology; Water Supply; Water Wells
ABSTRACT
Anaerobic digestion has been suggested as an intervention to attenuate antibiotic resistance genes (ARGs) in livestock manure, but supporting data have typically been collected at laboratory scale. Few studies have quantified ARG fate during full-scale digestion of livestock manure. We sampled untreated manure and digestate from seven full-scale mesophilic dairy manure digesters to assess ARG fate through each system. Samples were collected biweekly from December through August (i.e., winter, spring, and summer; n = 235 total) and analyzed by quantitative polymerase chain reaction for intI1, erm(B), sul1, tet(A), and tet(W). Concentrations of intI1, sul1, and tet(A) decreased during anaerobic digestion, but their removal was less extensive than expected based on previous laboratory studies. Removal of intI1 during anaerobic digestion equaled 0.28 ± 0.03 log10 units (mean ± SE), equivalent to only 48% removal, which is notable given intI1's role in horizontal gene transfer and multiple resistance. Furthermore, tet(W) concentrations were unchanged during anaerobic digestion (p > 0.05), and erm(B) concentrations increased by 0.52 ± 0.03 log10 units (3.3-fold), which is important given erythromycin's status as a critically important antibiotic for human medicine. Seasonal log10 changes in intI1, sul1, and tet(A) concentrations were ≥50% of the corresponding log10 removals by anaerobic digestion, and variation in ARG and intI1 concentrations among digesters was quantitatively comparable to anaerobic digestion effects. These results suggest that mesophilic anaerobic digestion may be limited as an intervention for ARGs in livestock manure and emphasize the need for multiple farm-level interventions to attenuate antibiotic resistance.
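The conversion between log10 removal and percent removal quoted above (0.28 log10 units ≈ 48%) follows directly from the definitions; a small sketch:

```python
import math

def log10_removal(conc_in, conc_out):
    """Log-removal value (LRV) across a treatment step."""
    return math.log10(conc_in) - math.log10(conc_out)

def percent_removal(lrv):
    """Convert an LRV to percent removal: (1 - 10**-LRV) * 100."""
    return (1.0 - 10.0 ** (-lrv)) * 100.0
```

For example, percent_removal(0.28) is about 47.5%, consistent with the "only 48% removal" figure, while a 1.0 log10 removal corresponds to 90%.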
Subject(s)
Anti-Bacterial Agents; Manure; Anaerobiosis; Animals; Anti-Bacterial Agents/pharmacology; Cattle; Bacterial Drug Resistance/genetics; Bacterial Genes; Livestock/genetics
ABSTRACT
BACKGROUND: Private wells are an important source of drinking water in Kewaunee County, Wisconsin. Due to the region's fractured dolomite aquifer, these wells are vulnerable to contamination by human and zoonotic gastrointestinal pathogens originating from land-applied cattle manure and private septic systems. OBJECTIVE: We determined the magnitude of the health burden associated with contamination of private wells in Kewaunee County by feces-borne gastrointestinal pathogens. METHODS: This study used data from a year-long countywide pathogen occurrence study as inputs into a quantitative microbial risk assessment (QMRA) to predict the total cases of acute gastrointestinal illness (AGI) caused by private well contamination in the county. Microbial source tracking was used to associate predicted cases of illness with bovine, human, or unknown fecal sources. RESULTS: Results suggest that private well contamination could be responsible for as many as 301 AGI cases per year in Kewaunee County, and that 230 and 12 cases per year were associated with a bovine and a human fecal source, respectively. Furthermore, Cryptosporidium parvum was predicted to cause 190 cases per year, the most of all 8 pathogens included in the QMRA. DISCUSSION: This study has important implications for land use and water resource management in Kewaunee County and informs the public health impacts of consuming drinking water drawn from similarly vulnerable hydrogeologic settings. https://doi.org/10.1289/EHP7815.
Subject(s)
Cryptosporidiosis; Cryptosporidium; Groundwater; Animals; Calcium Carbonate; Cattle; Magnesium; Risk Assessment; Water Wells; Wisconsin/epidemiology
ABSTRACT
BACKGROUND: Groundwater quality in the Silurian dolomite aquifer in northeastern Wisconsin, USA, has become contentious as dairy farms and exurban development expand. OBJECTIVES: We investigated private household wells in the region, determining the extent, sources, and risk factors of nitrate and microbial contamination. METHODS: Total coliforms, Escherichia coli, and nitrate were evaluated by synoptic sampling during groundwater recharge and no-recharge periods. Additional seasonal sampling measured genetic markers of human and bovine fecal-associated microbes and enteric zoonotic pathogens. We constructed multivariable regression models of detection probability (log-binomial) and concentration (gamma) for each contaminant to identify risk factors related to land use, precipitation, hydrogeology, and well construction. RESULTS: Total coliforms and nitrate were strongly associated with depth-to-bedrock at well sites and nearby agricultural land use, but not with septic systems. Both human wastewater and cattle manure contributed to well contamination. Rotavirus group A, Cryptosporidium, and Salmonella were the most frequently detected pathogens. Wells positive for human fecal markers were associated with depth-to-groundwater and the number of septic-system drainfields within 229 m. Manure-contaminated wells were associated with groundwater recharge and the area of nearby agricultural land. Wells positive for any fecal-associated microbe, regardless of source, were associated with septic system density and manure storage proximity modified by bedrock depth. Well construction was generally not related to contamination, indicating that land use, groundwater recharge, and bedrock depth were the most important risk factors. DISCUSSION: These findings may inform policies to minimize contamination of the Silurian dolomite aquifer, a major water supply for the U.S. and Canadian Great Lakes region. https://doi.org/10.1289/EHP7813.
Subject(s)
Cryptosporidiosis; Cryptosporidium; Groundwater; Chemical Water Pollutants; Animals; Calcium Carbonate; Canada; Cattle; Environmental Monitoring; Magnesium; Nitrates/analysis; Risk Factors; Chemical Water Pollutants/analysis; Water Wells; Wisconsin
ABSTRACT
Giardia is a zoonotic gastrointestinal parasite responsible for a substantial global public health burden, and quantitative microbial risk assessment (QMRA) is often used to forecast and manage this burden. QMRA requires dose-response models to extrapolate available dose-response data, but the existing model for Giardia ignores valuable dose-response information, particularly data from several well-documented waterborne outbreaks of giardiasis. The current study updates Giardia dose-response modeling by synthesizing all available data from outbreaks and experimental studies using a Bayesian random effects dose-response model. For outbreaks, mean doses (D) and the degree of spatial and temporal aggregation among cysts were estimated using exposure assessment implemented via two-dimensional Monte Carlo simulation, while potential overreporting of outbreak cases was handled using published overreporting factors and censored binomial regression. Parameter estimation was by Markov chain Monte Carlo simulation and indicated that a typical exponential dose-response parameter for Giardia is r = 1.6 × 10⁻² [3.7 × 10⁻³, 6.2 × 10⁻²] (posterior median [95% credible interval]), while a typical morbidity ratio is m = 3.8 × 10⁻¹ [2.3 × 10⁻¹, 5.5 × 10⁻¹]. Corresponding (logistic-scale) variance components were σr = 5.2 × 10⁻¹ [1.1 × 10⁻¹, 9.6 × 10⁻¹] and σm = 9.3 × 10⁻¹ [7.0 × 10⁻², 2.8 × 10⁰], indicating substantial variation in the Giardia dose-response relationship. Compared to the existing Giardia dose-response model, the current study provides more representative estimation of uncertainty in r and novel quantification of its natural variability. Several options for incorporating variability in r (and m) into QMRA predictions are discussed, including incorporation via Monte Carlo simulation as well as evaluation of the current study's model using the approximate beta-Poisson.
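The exponential dose-response model with a morbidity ratio can be written compactly. The sketch below uses the posterior medians quoted above (r = 1.6 × 10⁻², m = 3.8 × 10⁻¹) purely as example values; it is the standard exponential form, not the full Bayesian random effects model described in the abstract.

```python
import math

def p_infection(dose, r):
    """Exponential dose-response: each ingested organism independently
    initiates infection with probability r, so P(inf) = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

def p_illness(dose, r, m):
    """Illness probability = morbidity ratio m times infection probability."""
    return m * p_infection(dose, r)

# Posterior medians reported in the abstract
r_med, m_med = 1.6e-2, 3.8e-1
risk_10_cysts = p_illness(10.0, r_med, m_med)
```

The random effects extension in the study lets r and m vary across outbreaks and experiments; the fixed-parameter form here is what a downstream QMRA would evaluate for a single draw of (r, m).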
ABSTRACT
Anaerobic digestion can inactivate zoonotic pathogens present in cattle manure, which reduces transmission of these pathogens from farms to humans through the environment. However, the variability of inactivation across farms and over time is unknown because most studies have examined pathogen inactivation under ideal laboratory conditions or have focused on only one or two full-scale digesters at a time. In contrast, we sampled seven full-scale digesters treating cattle manure in Wisconsin biweekly for 9 months (n = 118 pairs of influent and effluent samples) and used real-time quantitative polymerase chain reaction to analyze these samples for 19 different microbial genetic markers. Overall, inactivation of pathogens and fecal indicators was highly variable. When aggregated across digester and season, log-removal values for three representative microbial targets (a bovine-associated bacterial marker, the bovine-associated marker CowM3, and bovine polyomavirus) were 0.78 ± 0.34, 0.70 ± 0.50, and 0.53 ± 0.58, respectively (mean ± SD). These log-removal values were up to two times lower than expected based on the scientific literature. Thus, our study indicates that full-scale anaerobic digestion of cattle manure requires optimization with regard to pathogen inactivation. Future studies should focus on identifying the potential causes of this suboptimal performance (e.g., overloading, poor mixing, poor temperature control). Our study also examined the fate of pathogens during manure separation and found that the majority of the microbes we detected ended up in the liquid fraction of separated manure. This finding has important implications for the transmission of zoonotic pathogens through the environment to humans.
Subject(s)
Bacteria/isolation & purification; Bioreactors; Manure/microbiology; Anaerobiosis; Animals; Cattle; Temperature; Viruses; Wisconsin
ABSTRACT
Residual wastewater solids are a significant reservoir of antibiotic resistance genes (ARGs). While treatment technologies can reduce ARG levels in residual wastewater solids, the effects of these technologies on ARGs in soil during subsequent land application are unknown. In this study we investigated the effects of six treatment technologies (air drying, aerobic digestion, mesophilic anaerobic digestion, thermophilic anaerobic digestion, pasteurization, and alkaline stabilization) on the fate of ARGs and class 1 integrons in soil microcosms amended with wastewater solids. Six ARGs [erm(B), qnrA, sul1, tet(A), tet(W), and tet(X)], the integrase gene of class 1 integrons (intI1), and 16S rRNA genes were quantified using quantitative polymerase chain reaction. The quantities of ARGs and intI1 decreased in all microcosms, but thermophilic anaerobic digestion, alkaline stabilization, and pasteurization led to the most extensive decay of ARGs and intI1, often to levels similar to those of control microcosms to which no wastewater solids had been applied. In contrast, the rates at which ARGs and intI1 declined under the other treatment technologies were generally similar, typically varying by less than 2-fold. These results demonstrate that wastewater solids treatment technologies can be used to decrease the persistence of ARGs and intI1 following subsequent application to soil.
Subject(s)
Anti-Bacterial Agents; Microbial Drug Resistance; Integrons/genetics; Wastewater; Bacterial Genes; 16S Ribosomal RNA; Soil
ABSTRACT
BACKGROUND: Spray irrigation for land-applying livestock manure is increasing in the United States as farms become larger and economies of scale make manure irrigation affordable. Human health risks from exposure to zoonotic pathogens aerosolized during manure irrigation are not well understood. OBJECTIVES: We aimed to a) estimate human health risks due to aerosolized zoonotic pathogens downwind of spray-irrigated dairy manure; and b) determine which factors (e.g., distance, weather conditions) have the greatest influence on risk estimates. METHODS: We sampled downwind air concentrations of manure-borne fecal indicators and zoonotic pathogens during 21 full-scale dairy manure irrigation events at three farms. We fit these data to hierarchical empirical models and used model outputs in a quantitative microbial risk assessment (QMRA) to estimate risk [probability of acute gastrointestinal illness (AGI)] for individuals exposed to spray-irrigated dairy manure containing Campylobacter jejuni, enterohemorrhagic Escherichia coli (EHEC), or Salmonella spp. RESULTS: Median risk estimates from Monte Carlo simulations ranged from 10⁻⁵ to 10⁻² and decreased with distance from the source. Risk estimates for Salmonella- or EHEC-related AGI were most sensitive to the assumed level of pathogen prevalence in dairy manure, while risk estimates for C. jejuni were not sensitive to any single variable. Airborne microbe concentrations were negatively associated with distance and positively associated with wind speed, both of which were retained in models as significant predictors more often than relative humidity, solar irradiation, or temperature. CONCLUSIONS: Our model-based estimates suggest that reducing pathogen prevalence and concentration in source manure would reduce the risk of AGI from exposure to manure irrigation, and that increasing the distance from irrigated manure (i.e., setbacks) and limiting irrigation to times of low wind speed may also reduce risk.
https://doi.org/10.1289/EHP283.
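A Monte Carlo QMRA of the kind described reduces to repeatedly sampling a dose and pushing it through a dose-response model, then summarizing the resulting distribution (here, its median). The sketch below assumes an exponential dose-response model and a hypothetical lognormal dose distribution; the study's fitted hierarchical models and parameters are not reproduced here.

```python
import math
import random

def mc_median_risk(n_sim, dose_sampler, r, seed=1):
    """Median infection risk over n_sim Monte Carlo draws.
    dose_sampler takes an RNG and returns one sampled dose."""
    rng = random.Random(seed)
    risks = sorted(1.0 - math.exp(-r * dose_sampler(rng)) for _ in range(n_sim))
    return risks[n_sim // 2]

# Hypothetical lognormal dose (median 0.1 organisms) and dose-response r
median_risk = mc_median_risk(
    10001, lambda rng: rng.lognormvariate(math.log(0.1), 1.0), r=0.02
)
```

Sensitivity analyses like those in the abstract then vary one input distribution at a time (e.g., pathogen prevalence) and observe how the summary risk shifts.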
Subject(s)
Agricultural Irrigation/methods; Dairy Industry; Manure/microbiology; Theoretical Models; Risk Assessment
ABSTRACT
The limit of detection (LOD) for qPCR-based analyses is not consistently defined or determined in studies on waterborne pathogens. Moreover, the LODs reported often reflect the qPCR assay alone rather than the entire sample process. Our objective was to develop an approach to determine the 95% LOD (the lowest concentration at which 95% of positive samples are detected) for the entire process of waterborne pathogen detection. We began by spiking the lowest concentration that was consistently positive at the qPCR step (based on its standard curve) into each procedural step, working backwards (i.e., extraction, secondary concentration, primary concentration), which established a concentration that was detectable following processing losses of the pathogen. Using the fraction of positive replicates (n = 10) at this concentration, we selected and analyzed a second, and then a third, concentration. If the fraction of positive replicates equaled 1 or 0 for two concentrations, we selected another. We calculated the LOD using probit analysis. To demonstrate our approach, we determined the 95% LOD for Salmonella enterica serovar Typhimurium, adenovirus 41, and vaccine-derived poliovirus Sabin 3, which were 11, 12, and 6 genomic copies (gc) per reaction (rxn), respectively (equivalent to 1.3, 1.5, and 4.0 gc L⁻¹ assuming the 1,500 L tap-water sample volume prescribed in EPA Method 1615). This approach limited the number of analyses required and was amenable to testing multiple genetic targets simultaneously (i.e., spiking a single sample with multiple microorganisms). An LOD determined this way can facilitate study design, guide the number of required technical replicates, aid method evaluation, and inform data interpretation.
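Probit analysis for a 95% LOD models detection probability as Φ(a + b·log10 C) and solves for the concentration at which that probability reaches 0.95. The sketch below shows only this final step, with hypothetical fitted coefficients a and b (the study's actual fits are not reproduced here):

```python
import math

def detect_prob(log10_conc, a, b):
    """Probit model: P(detect) = Phi(a + b * log10(C)),
    with Phi the standard normal CDF (via math.erf)."""
    z = a + b * log10_conc
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def lod95(a, b):
    """Concentration where P(detect) = 0.95; Phi^-1(0.95) ~= 1.6449."""
    return 10.0 ** ((1.6449 - a) / b)

# Hypothetical fitted intercept and slope
a_fit, b_fit = -2.0, 3.0
lod = lod95(a_fit, b_fit)
```

In practice a and b would be estimated from the replicate detect/non-detect fractions at the tested concentrations (e.g., by maximum likelihood), which is the regression step the abstract refers to as probit analysis.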
Subject(s)
Limit of Detection; Water Microbiology; Polymerase Chain Reaction; Real-Time Polymerase Chain Reaction; Sensitivity and Specificity; Water
ABSTRACT
Waterborne pathogens were measured at three beaches in Lake Michigan, environmental factors for predicting pathogen concentrations were identified, and the risk of swimmer infection and illness was estimated. Waterborne pathogens were detected in 96% of samples collected at the three Lake Michigan beaches in summer 2010. Samples were analyzed for 22 pathogens in four microbial categories (human viruses, bovine viruses, protozoa, and pathogenic bacteria). All beaches had detections of human viruses, bovine viruses, and pathogenic bacteria, indicating the influence of multiple contamination sources. Occurrence ranged from 40-87% for human viruses, 65-87% for pathogenic bacteria, and 13-35% for bovine viruses. Enterovirus, adenovirus A, Salmonella spp., Campylobacter jejuni, bovine polyomavirus, and bovine rotavirus A were present most frequently. Variables selected in multiple regression models exploring environmental factors that influence pathogen concentrations included wave direction, cloud cover, currents, and water temperature. Quantitative microbial risk assessment was performed for C. jejuni, Salmonella spp., and enteroviruses to estimate the risk of infection and illness. Median infection risks for one-time swimming events were approximately 2 × 10⁻⁵, 8 × 10⁻⁶, and 3 × 10⁻⁷ for C. jejuni, Salmonella spp., and enteroviruses, respectively. Results highlight the importance of investigating multiple pathogens within multiple categories to avoid underestimating the prevalence and risk of waterborne pathogens.
Subject(s)
Bacteria/isolation & purification; Lakes/microbiology; Lakes/virology; Viruses/isolation & purification; Animals; Bacteria/pathogenicity; Beaches; Campylobacter jejuni/isolation & purification; Campylobacter jejuni/pathogenicity; Cattle; Enterovirus/isolation & purification; Enterovirus/pathogenicity; Environmental Monitoring; Great Lakes Region; Humans; Risk Assessment/methods; Salmonella/isolation & purification; Salmonella/pathogenicity; Seasons; Viruses/pathogenicity; Water Microbiology
ABSTRACT
This study investigated the use of thermophilic anaerobic digestion for removing antibiotic resistance genes (ARGs) from residual municipal wastewater solids. Four laboratory-scale anaerobic digesters were operated in 8-day batch cycles at temperatures of 40, 56, 60, and 63 °C. Two tetracycline resistance genes (tet(W) and tet(X)), a fluoroquinolone resistance gene (qnrA), the integrase gene of class 1 integrons (intI1), 16S rRNA genes of all Bacteria, and 16S rRNA genes of methanogens were quantified using real-time quantitative PCR. ARG and intI1 quantities decreased at all temperatures and were described well by a modified form of the Collins-Selleck disinfection kinetic model. The magnitudes of Collins-Selleck kinetic parameters were significantly greater at thermophilic temperatures compared to 40 °C, but few statistically significant differences were observed among these parameters for the thermophilic anaerobic digesters. This model allows for the direct comparison of different operating conditions (e.g., temperature) on anaerobic digestion performance in mitigating the quantity of ARGs in wastewater solids and could be used to design full-scale anaerobic digesters to specifically treat for ARGs as a "pollutant" of concern.
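The Collins-Selleck model referenced above is commonly written, for disinfection, as a lag period followed by log-linear decay in log time; the modified form below substitutes digestion time for the disinfectant Ct product. This is a sketch of the general functional form with hypothetical parameter values, not the authors' fitted coefficients:

```python
import math

def collins_selleck_log10_survival(t, k, tau):
    """log10(N/N0): no removal until lag time tau, then
    -k * log10(t / tau) for t > tau."""
    if t <= tau:
        return 0.0
    return -k * math.log10(t / tau)

# Hypothetical parameters: k = 2 log10 units per decade of time, tau = 1 day
lrv_day8 = -collins_selleck_log10_survival(8.0, 2.0, 1.0)
```

Comparing fitted k (and tau) across operating temperatures is what allows the direct comparison of digestion conditions that the abstract describes.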
ABSTRACT
Substantial quantities of antibiotic resistance genes (ARGs) are discharged with treated residual municipal wastewater solids and subsequently applied to soil. The objective of this work was to determine the decay rates for ARGs and class 1 integrons following simulated land application of treated wastewater solids. Treated residual solids from two full-scale treatment plants were applied to sets of triplicate soil microcosms in two independent experiments. Experiment 1 investigated loading rates of 20, 40, and 100 g kg⁻¹ of residual solids to a sandy soil, while experiment 2 investigated a loading rate of 40 g kg⁻¹ to a silty-loamy soil. Five ARGs (erm(B), sul1, tet(A), tet(W), and tet(X)), the integrase gene of class 1 integrons (intI1), 16S rRNA genes, 16S rRNA genes of all Bacteroides spp., and 16S rRNA genes of human-specific Bacteroides spp. were quantified using real-time polymerase chain reaction. ARG and intI1 quantities declined in most microcosms, with statistically significant (P < 0.05) half-lives varying between 13 d (erm(B), experiment 1, 100 g kg⁻¹) and 81 d (intI1, experiment 1, 40 g kg⁻¹). These kinetic rates were much slower than those previously reported for unit operations used to treat wastewater solids (e.g., anaerobic digestion). This research suggests that the design and operation of municipal wastewater treatment facilities with the explicit goal of mitigating the release of ARGs should focus on using technologies within the treatment facility, rather than depending on attenuation subsequent to land application.
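The half-lives reported above follow from first-order decay kinetics; a small sketch of the conversion between rate constant and half-life:

```python
import math

def half_life(k_per_day):
    """First-order decay: t_1/2 = ln(2) / k."""
    return math.log(2.0) / k_per_day

def remaining_fraction(k_per_day, days):
    """Fraction of initial gene copies remaining after `days` of decay."""
    return math.exp(-k_per_day * days)

# A 13 d half-life (erm(B)) corresponds to k = ln(2)/13 per day
k_ermB = math.log(2.0) / 13.0
```

Under this model an 81 d half-life (intI1) leaves roughly 73% of gene copies after a month, which is why the abstract argues for removal within the treatment facility rather than reliance on decay in soil.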
Subject(s)
Cities; Microbial Drug Resistance/genetics; Bacterial Genes; Integrons/genetics; Sewage/microbiology; Soil Microbiology; Wastewater/microbiology; Anti-Bacterial Agents/pharmacology; Humans; Kinetics; 16S Ribosomal RNA/genetics; Real-Time Polymerase Chain Reaction; Soil; Water Purification
ABSTRACT
This study investigated whether air-drying beds reduce antibiotic resistance gene (ARG) concentrations in residual municipal wastewater solids. Three laboratory-scale drying beds were operated for a period of nearly 100 days. Real-time PCR was used to quantify 16S rRNA genes, 16S rRNA genes specific to fecal bacteria (AllBac) and human fecal bacteria (HF183), the integrase gene of class 1 integrons (intI1), and five ARGs representing a cross-section of antibiotic classes and resistance mechanisms (erm(B), sul1, tet(A), tet(W), and tet(X)). Air-drying beds were capable of reducing all gene target concentrations by 1 to 5 orders of magnitude, and the nature of this reduction was consistent with both a net decrease in the number of bacterial cells and a lack of selection within the microbial community. Half-lives varied between 1.5 d (HF183) and 5.4 d (tet(X)) during the first 20 d of treatment. After the first 20 d of treatment, however, half-lives varied between 8.6 d (tet(X)) and 19.3 d (AllBac), and 16S rRNA gene, intI1, and sul1 concentrations did not change (P > 0.05). These results demonstrate that air-drying beds can reduce ARG and intI1 concentrations in residual municipal wastewater solids within timeframes typical of operating practices.
Subject(s)
Bacteria/isolation & purification; Bacterial Proteins/analysis; Liquid Waste Disposal/methods; Wastewater/analysis; Chemical Water Pollution/prevention & control; Anti-Bacterial Agents/analysis; Anti-Bacterial Agents/pharmacology; Bacteria/genetics; Bacteria/metabolism; Bacterial Proteins/genetics; Bacterial Proteins/metabolism; Microbial Drug Resistance; Integrons; 16S Ribosomal RNA/genetics; 16S Ribosomal RNA/isolation & purification; 16S Ribosomal RNA/metabolism; Real-Time Polymerase Chain Reaction; Wastewater/microbiology; Chemical Water Pollutants/analysis; Chemical Water Pollutants/pharmacology
ABSTRACT
Numerous initiatives have been undertaken to circumvent the problem of antibiotic resistance, including the development of new antibiotics, the use of narrow-spectrum antibiotics, and the reduction of inappropriate antibiotic use. We propose an alternative but complementary approach to reduce antibiotic-resistant bacteria (ARB): implementing more stringent technologies for treating municipal wastewater, which is known to contain large quantities of ARB and antibiotic resistance genes (ARGs). In this study, we investigated the ability of conventional aerobic digestion to reduce the quantity of ARGs in untreated wastewater solids. A bench-scale aerobic digester was fed untreated wastewater solids collected from a full-scale municipal wastewater treatment facility. The reactor was operated under semi-continuous flow conditions for more than 200 days at a residence time of approximately 40 days. During this time, the quantities of tet(A), tet(W), and erm(B) decreased by more than 90%. In contrast, intI1 did not decrease, and tet(X) increased in quantity by 5-fold. Following operation in semi-continuous flow mode, the aerobic digester was converted to batch mode to determine first-order decay coefficients, with half-lives ranging from as short as 2.8 days for tet(W) to as long as 6.3 days for intI1. These results demonstrated that aerobic digestion can be used to reduce the quantity of ARGs in untreated wastewater solids, but that rates can vary substantially depending on the reactor design (i.e., batch vs. continuous-flow) and the specific ARG.
ABSTRACT
In this study, the impact of tertiary-treated municipal wastewater on the quantity of several antibiotic resistance determinants in Duluth-Superior Harbor was investigated by collecting surface water and sediment samples from 13 locations in Duluth-Superior Harbor, the St. Louis River, and Lake Superior. Quantitative PCR (qPCR) was used to target three different genes encoding resistance to tetracycline (tet(A), tet(X), and tet(W)), the gene encoding the integrase of class 1 integrons (intI1), and total bacterial abundance (16S rRNA genes), as well as total and human fecal contamination levels (16S rRNA genes specific to the genus Bacteroides). The quantities of tet(A), tet(X), tet(W), intI1, total Bacteroides, and human-specific Bacteroides were typically 20-fold higher in the tertiary-treated wastewater than in nearby surface water samples. In contrast, the quantities of these genes in the St. Louis River and Lake Superior were typically below detection. Analysis of sequences of tet(W) gene fragments from four different samples collected throughout the study site supported the conclusion that tertiary-treated municipal wastewater is a point source of resistance genes into Duluth-Superior Harbor. This study demonstrates that the discharge of exceptionally treated municipal wastewater can have a statistically significant effect on the quantities of antibiotic resistance genes in otherwise pristine surface waters.