ABSTRACT
We characterized concentrations of trihalomethanes (THMs), a measure of disinfection byproducts (DBPs), in tap water samples collected from households with utility-supplied water in two rural counties in Appalachian Virginia, and assessed associations with pH, free chlorine, and metal ions that can influence THM formation. Free chlorine concentrations in all samples (n = 27 homes) complied with EPA drinking water guidelines, though 7% (n = 2) of first-draw samples and 11% (n = 3) of 5-min flushed-tap samples exceeded the US Safe Drinking Water Act (SDWA) maximum contaminant level (MCL) for THMs (80 ppb). Regression analyses showed that free chlorine and pH were positively, though not significantly, associated with THM levels above the SDWA MCL (OR = 1.04, p = 0.97 and OR = 1.74, p = 0.79, respectively), while temperature was negatively associated (OR = 0.78, p = 0.38). Of the eight utilities serving study households, water served by three exceeded the EPA MCL for THMs. Overall, these findings do not indicate substantial exposures to DBPs for rural households with utility-supplied water in this region of southwest Virginia. However, given the observed variability in THM concentrations between and across utilities, and the established adverse health impacts associated with chronic and acute DBP exposure, more research on DBPs in rural Central Appalachia is warranted.
Subjects
Chlorine, Drinking Water, Rural Population, Trihalomethanes, Water Pollutants, Chemical, Water Supply, Virginia, Chlorine/analysis, Drinking Water/chemistry, Drinking Water/analysis, Water Pollutants, Chemical/analysis, Trihalomethanes/analysis, Water Purification/methods, Disinfection, Humans, Disinfectants/analysis, Appalachian Region, Family Characteristics
ABSTRACT
In 1974, the United States established the Safe Drinking Water Act (SDWA) to protect consumers from potential exposure to drinking water contaminants associated with health risks. Each contaminant is assigned a health-based standard meant to reflect the maximum level at which an adverse human health outcome is unlikely; measurements beyond that level have greater potential to result in adverse health outcomes. Although there is extensive research on human health implications following water contaminant exposure, few studies have specifically examined associations between fetal health and municipal drinking water violations. Therefore, the objective of this study was to assess whether SDWA drinking water violations are associated with fetal health outcomes, including preterm birth (PTB), low birth weight (LBW), and term-low birth weight (tLBW), in the Commonwealth of Virginia. Singleton births (n = 665,984) occurring between 2007 and 2015 in Virginia were geocoded and assigned to a corresponding estimated water service area. Health-based (HB) and monitoring and reporting (MR) violations for 12 contaminants were acquired from the US EPA Safe Drinking Water Information System, with exposure defined at the approximate service area level to limit exposure misclassification. A logistic regression model for each birth outcome assessed potential relationships with SDWA violations. When examining associations between individual MR violations and birth outcomes, nitrate-nitrite violations were positively associated with PTB (OR = 1.10; 95% CI = 1.02-1.18; p = 0.01), and total coliform rule violations were negatively associated with tLBW (OR = 0.93; 95% CI = 0.87-1.00; p = 0.04).
These findings indicate that a lack of regular monitoring and reporting by water providers (resulting in monitoring and reporting violations) may conceal health-based violations, as such health concerns cannot be revealed without testing. They suggest a need for additional technical, managerial, and financial support to enable often-underfunded water systems to adhere to the monitoring and reporting requirements meant to protect public health.
Subjects
Drinking Water, Pregnancy Complications, Premature Birth, Female, Infant, Newborn, United States, Humans, Drinking Water/analysis, Virginia/epidemiology, Infant, Low Birth Weight
ABSTRACT
Bioretention cells (BRCs) are effective tools for treating urban stormwater, but nitrogen removal by these systems is highly variable. Improvements in nitrogen removal are hampered by a lack of data directly quantifying the abundance or activity of denitrifying microorganisms in BRCs and how they are controlled by original BRC design characteristics. We analyzed denitrifiers in twenty-three BRCs of different designs across three mid-Atlantic states (MD, VA, and NC) by quantifying two bacterial denitrification genes (nirK and nosZ) and potential enzymatic denitrification rates within the soil medium. Overall, we found that BRC design factors, rather than local environmental variables, had the greatest effects on variation in denitrifier abundance and activity. Specifically, denitrifying populations and denitrification potential increased with organic carbon and inorganic nitrogen concentrations in the soil media and decreased in BRCs planted with grass compared to other types of vegetation. Furthermore, the top layers of BRCs consistently contained greater concentrations and activity of denitrifying bacteria than bottom layers, despite longer periods of saturation and the presence of permanently saturated zones designed to promote denitrification at lower depths. These findings suggest that there is still considerable potential for design improvements when constructing BRCs that could increase denitrification and mitigate nitrogen export to receiving waters.
Subjects
Denitrification, Soil Microbiology, Bacteria, Nitrogen, Soil
ABSTRACT
Identification of agricultural practices that mitigate the environmental dissemination of antibiotics is a key need in reducing the prevalence of antibiotic-resistant bacteria of human health concern. Here, we aimed to compare the effects of crop (lettuce [Lactuca sativa L.] or radish [Raphanus sativus L.]), soil amendment type (inorganic fertilizer, raw dairy manure, composted dairy manure, or no amendment), and prior antibiotic use history of the manure-derived amendments (no antibiotics during previous lactation cycles vs. manure mixed from cows administered pirlimycin or cephapirin) on the incidence of culturable antibiotic-resistant fecal coliforms in agricultural soils through a controlled field-plot experiment. Antibiotic-resistant culturable fecal coliforms were recoverable from soils across all treatments immediately after application, although persistence throughout the experiment varied by antibiotic class and time. The magnitude of observed coliform counts differed by soil amendment type. Compost-amended soils had the highest levels of cephalosporin-resistant fecal coliforms, regardless of whether the cows from which the manure was derived were administered antibiotics. Samples from control plots or those treated with inorganic fertilizer trended toward lower counts of resistant coliforms, although these differences were not statistically significant. No statistical differences were observed between soils that grew leafy (lettuce) versus rooted (radish) crops. Only pirlimycin was detectable past amendment application in raw manure-amended soils, dissipating 12 to 25% by Day 28. Consequently, no quantifiable correlations between coliform counts and antibiotic concentrations could be identified. This study demonstrates that antibiotic-resistant fecal coliforms can become elevated in soils receiving manure-derived amendments, but that a variety of factors likely contribute to their long-term persistence under typical field conditions.
Subjects
Clindamycin/analogs & derivatives, Composting, Drug Resistance, Bacterial, Enterobacteriaceae, Manure, Soil Microbiology, Animals, Anti-Bacterial Agents, Cattle, Clindamycin/metabolism, Female, Humans, Soil, Vegetables
ABSTRACT
Poor sanitation infrastructure in rural areas is often associated with high levels of fecal contamination in adjacent surface waters, which presents a community health risk. Although microbial source tracking techniques have been widely applied to identify primary remediation needs in urban and/or recreational waters, use of human-specific markers has been more limited in rural watersheds. This study quantified the human source tracking marker Bacteroides-HF183, along with more general fecal indicators (i.e., culturable Escherichia coli and a molecular Enterococcus marker), in two Appalachian streams above and below known discharges of untreated household waste. Although E. coli and Enterococcus were consistently recovered in samples collected from both streams, Bacteroides-HF183 was only detected sporadically in one stream. Multiple linear regression analysis demonstrated a positive correlation between the concentration of E. coli and the proximity and number of known waste discharge points upstream; this correlation was not significant with respect to Bacteroides-HF183, likely due to the low number of quantifiable samples. These findings suggest that, while the application of more advanced source targeting strategies can be useful in confirming the influence of substandard sanitation on surface waters to justify infrastructure improvements, they may be of limited use without concurrent traditional monitoring targets and on-the-ground sanitation surveys.
Subjects
Bacteroides/isolation & purification, Enterococcus/isolation & purification, Environmental Monitoring, Escherichia coli/isolation & purification, Feces/microbiology, Rivers/microbiology, Sewage/microbiology, Colony Count, Microbial, Polymerase Chain Reaction, Rivers/chemistry, Sewage/analysis, Virginia, Water Quality
ABSTRACT
The remediation of mine water to preserve receiving water quality has advanced substantially over the past half century, but prospective regulations to limit the conductivity of mining-impacted waters pose a significant new challenge. Conventional approaches to reduce high levels of conductivity in these mine waters are often costly, requiring high levels of maintenance and significant inputs of energy and refined chemicals. In contrast, passive biological treatment (PBT) systems are a relatively low-cost, low-maintenance treatment technology that has been used for mine waters for over three decades. However, their practical ability to reduce conductivity is unclear, given that previous research reports focused on the removal of metals, acidity, and solids. A systematic literature review to identify previous reports of PBT systems at the laboratory or field scale that include evaluations of changes in conductivity suggests that decreases in conductivity of 30 to 40% are achievable. Substantial variability in performance is common, however, and conductivity increased markedly in some systems. This variation may be associated with the dissolution of limestone, which is a key treatment material in some systems. Although the development of PBT to serve as pre-, post-, or stand-alone treatment systems targeting conductivity may reduce overall treatment cost in some settings, optimization of these designs requires an increase in the number of published conductivity datasets from similar systems, detailed reports on the key ions contributing to elevated conductivity from region to region, and further investigation of the underlying biochemical processes responsible for conductivity reductions.
Subjects
Mining, Water Pollutants, Chemical, Water Purification, Metals, Prospective Studies, Review Literature as Topic
ABSTRACT
Macroinvertebrate community assessment is used in most US states to evaluate stream health under the Clean Water Act. While water quality assessment and impairment determinations are reported to the US Environmental Protection Agency, there is no national summary of biological assessment findings. The objective of this work was to determine the national extent of invertebrate-based impairments and to identify the pollutants primarily responsible for those impairments. Evaluation of state data in the US Environmental Protection Agency's Assessment and Total Maximum Daily Load Tracking and Implementation System database revealed considerable differences in reporting approaches and terminologies, including differences in whether and how states report specific biological assessment findings. Only 15% of waters impaired for aquatic life could be identified as having impairments determined by biological assessments (e.g., invertebrates, fish, periphyton); approximately one-third of these were associated with macroinvertebrate bioassessment. Nearly 650 invertebrate-impaired waters were identified nationwide, and sediment was the most common pollutant in bedded (63%) and suspended (9%) forms. This finding is not unexpected, given previous work on the negative impacts of sediment on aquatic life, and highlights the need to more specifically identify the mechanisms driving sediment impairments to design effective remediation plans. It also reinforces the importance of efforts to derive sediment-specific biological indices and numerical sediment quality guidelines. Standardization of state reporting approaches and terminology would significantly increase the potential application of water quality assessment data, reveal national trends, and encourage sharing of best practices to facilitate the attainment of water quality goals.
Subjects
Geologic Sediments/chemistry, Invertebrates/physiology, Rivers/chemistry, Water Pollutants, Chemical/analysis, Water Quality, Animals, Environmental Monitoring/methods, United States, United States Environmental Protection Agency, Water Pollutants, Chemical/chemistry
ABSTRACT
Although extensive literature documents corrosion in municipal water systems, only minimal data are available describing corrosion in private water systems (e.g., wells), which serve as a primary source of drinking water for approximately 47 million Americans. This study developed a profiling technique specifically tailored to evaluate lead release in these systems. When applied in an intensive field study of 15 private systems, three patterns of lead release were documented: no elevated lead or lead elevated in the first draw only (Type I), erratic spikes of particulate lead (Type II), and sustained detectable lead concentrations (Type III). While flushing protocols as short as 15-30 s may be sufficient to reduce lead concentrations below 15 µg/L for Type I exposure, flushing may not be an appropriate remediation strategy for Type II exposure. In addition, the sustained detectable lead concentrations observed with Type III exposure likely result from corrosion of components within the well itself and therefore cannot be reduced by increased flushing. As profiling techniques are labor- and sample-intensive, we discuss recommendations for simpler sampling schemes for initial private system surveys aimed at quantifying lead and protecting public health.
Subjects
Environmental Exposure/analysis, Lead/analysis, Water Pollutants, Chemical/analysis, Water Supply, Acids/chemistry, Corrosion, Drinking Water/analysis, Particulate Matter/analysis, Virginia, Water Wells
ABSTRACT
Although recent studies suggest contamination by bacteria and nitrate in private drinking water systems is of increasing concern, data describing contaminants associated with the corrosion of onsite plumbing are scarce. This study reports on the analysis of 2,146 samples submitted by private system homeowners. Almost 20% of first-draw samples submitted contained lead concentrations above the United States Environmental Protection Agency action level of 15 µg/L, suggesting that corrosion may be a significant public health problem. Correlations between lead, copper, and zinc suggested brass components as a likely lead source, and dug/bored wells had significantly higher lead concentrations compared with drilled wells. A random subset of samples selected to quantify particulate lead indicated that, on average, 47% of lead in the first draws was in the particulate form, although the occurrence was highly variable. While flushing the tap reduced lead below 15 µg/L for most systems, some systems experienced an increase, perhaps attributable to particulate lead or lead-bearing components upstream of the faucet (e.g., valves, pumps). Results suggest that, without a focus on private as well as municipal systems, it will be very difficult to meet the existing national public health goal of eliminating elevated blood lead levels in children.
Subjects
Drinking Water/analysis, Groundwater/analysis, Lead/analysis, Water Pollutants, Chemical/analysis, Humans, Incidence, Metals, Heavy/analysis, Virginia/epidemiology, Water Quality/standards
ABSTRACT
Elevated levels of fecal indicator bacteria (FIB) remain the leading cause of surface water-quality impairments in the United States. Under the Clean Water Act, basin-specific total maximum daily load (TMDL) restoration plans are responsible for bringing identified water impairments into compliance with applicable standards. Watershed-scale model predictions of FIB concentrations that facilitate the development of TMDLs are associated with considerable uncertainty. An increasingly cited criticism of existing modeling practice is the common strategy that assumes bacteria behave similarly to "free-phase" contaminants, although ample field evidence indicates that a nontrivial fraction of cells preferentially associates with particulates. Few attempts have been made to evaluate the impacts of sediment on predictions of in-stream FIB concentrations at the watershed scale, with limited observational data available for model development, calibration, and validation. This study evaluates the impacts of bacteria-sediment interactions in a continuous, watershed-scale model widely used in TMDL development. In addition to observed FIB concentrations in the water column, streambed sediment-associated FIB concentrations were available for model calibration. While improved model performance was achieved compared with previous studies, model performance under a "sediment-attached" scenario was essentially equivalent to the simpler "free-phase" scenario. Watershed-specific characteristics (e.g., steep slope, high imperviousness) likely contributed to the dominance of wet-weather pollutant loading in the water column, which may have obscured sediment impacts. As adding a module accounting for bacteria-sediment interactions would increase model complexity considerably, site evaluation preceding modeling efforts is needed to determine whether the additional model complexity and effort associated with partitioning phases of FIB is sufficiently offset by gains in predictive capacity.
ABSTRACT
Dairy manure has much potential for use as an organic fertilizer in the United States. However, the levels of indicator organisms and pathogens in dairy manure can be ten times higher than the use guidelines stipulated by the National Organic Standards Board (NOSB), even after undergoing anaerobic digestion at mesophilic temperatures. The objective of this study was to identify pasteurization temperatures and treatment durations that reduce fecal coliform, E. coli, and Salmonella concentrations in separated liquid dairy manure (SLDM) from a mesophilic anaerobic digester effluent to levels sufficient for use as an organic fertilizer. Samples of SLDM were pasteurized at 70, 75, and 80°C for durations of 0 to 120 min. Fecal coliform, E. coli, and Salmonella concentrations were assessed via culture-based techniques. All of the tested pasteurization temperature and duration combinations reduced microbial concentrations to levels below the NOSB guidelines. The fecal coliform and E. coli reductions ranged from 0.76 to 1.34 logs, while Salmonella concentrations were reduced by more than 99% at all the pasteurization temperatures and active treatment durations.
Subjects
Dairying/methods, Fertilizers/analysis, Fertilizers/microbiology, Manure/analysis, Manure/microbiology, Medical Waste Disposal/methods, Organic Agriculture/methods, Anaerobiosis, Escherichia coli/isolation & purification, Pasteurization/methods, Salmonella/isolation & purification, United States
ABSTRACT
The development of models for understanding antibiotic resistance gene (ARG) persistence and transport is a critical next step toward informing mitigation strategies to prevent the spread of antibiotic resistance in the environment. A field study was performed that used a mass balance approach to gain insight into the transport and dissipation of ARGs following land application of manure. Soil from a small drainage plot including a manure application site, an unmanured control site, and an adjacent stream and buffer zone was sampled for ARGs and metals before and after application of dairy manure slurry and a dry stack mixture of equine, bovine, and ovine manure. Mass balance results suggest growth of bacterial hosts containing ARGs and/or horizontal gene transfer immediately following slurry application for ermF, sul1, and sul2, and following a lag (13 days) for dry-stack-amended soils. Generally, no effects on tet(G), tet(O), or tet(W) soil concentrations were observed despite the presence of these genes in applied manure. Dissipation rates were fastest for ermF in slurry-treated soils (logarithmic decay coefficient of -3.5) and for sul1 and sul2 in dry-stack-amended soils (logarithmic decay coefficients of -0.54 and -0.48, respectively), and evidence for surface and subsurface transport was not observed. Results provide a mass balance approach for tracking ARG fate and insights to inform modeling and limit the transport of manure-borne ARGs to neighboring surface waters.
Subjects
Bacteria/genetics, Bacteria/isolation & purification, Drug Resistance, Bacterial/genetics, Genes, Bacterial/genetics, Geologic Sediments/microbiology, Manure/microbiology, Soil Microbiology, Animals, Anti-Bacterial Agents/pharmacology, Bacteria/drug effects, Cattle, Gene Transfer, Horizontal, Genes, Bacterial/drug effects, Horses, Manure/analysis, Seasons, Sheep, Virginia
ABSTRACT
Over 1.7 million Virginians rely on private water sources to provide household water. The heaviest reliance on these systems occurs in rural areas, which are often underserved with respect to available financial resources and access to environmental health education. This study aimed to identify potential associations between concentrations of fecal indicator bacteria (FIB) (coliforms, Escherichia coli) in over 800 samples collected at the point of use from homes with private water supply systems and homeowner-provided demographic data (household income and education). Of the 828 samples tested, 349 (42%) were positive for total coliforms and 55 (6.6%) for E. coli. Source tracking efforts targeting optical brightener concentrations via fluorometry and the presence of a human-specific Bacteroides marker via quantitative real-time polymerase chain reaction (qPCR) suggest possible contamination from human septage in over 20 samples. Statistical analyses suggested an association between household income and the proportion of samples positive for total coliforms, though the relationship between education level and FIB is less clear. Further exploration of links between demographic data and private water quality will be helpful in building effective strategies to improve rural drinking water quality.
Subjects
Drinking Water/microbiology, Enterobacteriaceae/isolation & purification, Feces/microbiology, Adolescent, Adult, Aged, Bacteroides/genetics, Bacteroides/isolation & purification, Child, Child, Preschool, DNA, Bacterial/analysis, Enterobacteriaceae/genetics, Escherichia coli/genetics, Escherichia coli/isolation & purification, Female, Fluorometry, Humans, Infant, Infant, Newborn, Male, Middle Aged, Real-Time Polymerase Chain Reaction, Socioeconomic Factors, Virginia, Water Quality, Young Adult
ABSTRACT
High levels of fecal indicator bacteria (FIB) are the leading cause of surface water quality impairments in the United States. Watershed-scale models are commonly used to identify relative contributions of watershed sources and to evaluate the effectiveness of remediation strategies. However, most existing models simplify FIB transport behavior as equivalent to that of dissolved-phase contaminants, ignoring the impacts of sediment on the fate and transport of FIB. Implementation of sediment-related processes within existing models is limited by minimal available monitoring data on sediment FIB concentrations for model development, calibration, and validation purposes. The purpose of the present study is to evaluate FIB levels in streambed sediments as compared to those in the water column and to identify environmental variables that influence water and underlying sediment FIB levels. Concentrations of Escherichia coli and enterococci in the water column and sediments of an urban stream were monitored weekly for 1 yr and correlated with a variety of potential hydrometeorological and physicochemical variables. Increased FIB concentrations in both the water column and sediments were most strongly correlated with increased antecedent 24-h rainfall, increased stream water temperature, decreased dissolved oxygen, and decreased specific conductivity. These observations will support future efforts to incorporate sediment-related processes in existing models through the identification of key FIB relationships with other model inputs and the provision of sediment FIB concentrations for direct model calibration. In addition, the identified key variables can be used in quick evaluation of the effectiveness of potential remediation strategies.
ABSTRACT
Per- and polyfluoroalkyl substances (PFAS) are a class of man-made contaminants of human health concern due to their resistance to degradation, widespread environmental occurrence, bioaccumulation in living organisms, and potential negative health impacts. Private drinking water supplies may be uniquely vulnerable to PFAS contamination in impacted areas, as these systems are not protected under federal regulations and often include limited treatment or remediation, if contaminated, prior to use. The goal of this study was to determine the incidence of PFAS contamination in private drinking water supplies in two counties in Southwest Virginia, USA (Floyd and Roanoke) that share similar bedrock geologies but represent different state Department of Health risk categories, and to examine the potential for reliance on citizen science-based strategies for sample collection in subsequent efforts. Samples for inorganic ion, bacteria, and PFAS analysis were collected on separate occasions by participants and experts at the home drinking water point of use (POU) for comparison. Experts also collected outside tap samples for analysis of 30 PFAS compounds. At least one PFAS was detectable in 95% of POU samples collected (n = 60), with a mean total PFAS concentration of 23.5 ± 30.8 ppt. PFOA and PFOS, two PFAS compounds that presently have EPA health advisories, were detectable in 13% and 22% of POU samples, respectively. On average, each POU sample contained >3 PFAS compounds, and one sample contained as many as 8 compounds, indicating that exposure to a mixture of PFAS in drinking water may be occurring. Although there were significant differences in total PFAS concentrations between expert- and participant-collected samples (Wilcoxon, alpha = 0.05), collector bias was inconsistent and may be due to the time of day of sampling (i.e., morning or afternoon) or specific attributes of a given home.
Further research is required to resolve sources of intra-sample variability.
Subjects
Drinking Water, Environmental Monitoring, Fluorocarbons, Water Pollutants, Chemical, Water Supply, Water Pollutants, Chemical/analysis, Drinking Water/chemistry, Fluorocarbons/analysis, Virginia, Water Supply/statistics & numerical data
ABSTRACT
The sampling and analysis of sewage for pathogens and other biomarkers offers a powerful tool for monitoring and understanding community health trends and potentially predicting disease outbreaks. Since the early months of the COVID-19 pandemic, the use of wastewater-based testing for public health surveillance has increased markedly. However, these efforts have focused on urban and periurban areas. In most rural regions of the world, healthcare service access is more limited than in urban areas, and rural public health agencies typically have less disease outcome surveillance data than their urban counterparts. The potential public health benefits of wastewater-based surveillance for rural communities are therefore substantial, as are the methodological and ethical challenges. For many rural communities, population dynamics and insufficient, aging, and inadequately maintained wastewater collection and treatment infrastructure present obstacles to the reliable and responsible implementation of wastewater-based surveillance. Practitioner observations and research findings indicate that for many rural systems, typical implementation approaches for wastewater-based surveillance will not yield sufficiently reliable or actionable results. We discuss key challenges and potential strategies to address them. However, to support and expand the implementation of responsible, reliable, and ethical wastewater-based surveillance for rural communities, best practice guidelines and standards are needed.
Subjects
COVID-19, Wastewater-Based Epidemiological Monitoring, Humans, Wastewater, Rural Population, Pandemics, COVID-19/epidemiology
ABSTRACT
OBJECTIVES: In the US, violations of drinking water regulations are highest in lower-income rural areas overall, and particularly in Central Appalachia. However, data on drinking water use, quality, and associated health outcomes in rural Appalachia are limited. We sought to assess public and private drinking water sources and associated risk factors for waterborne pathogen exposures for individuals living in rural regions of Appalachian Virginia. METHODS: We administered surveys and collected tap water, bottled water, and saliva samples in lower-income households in two adjacent rural counties in southwest Virginia (bordering Kentucky and Tennessee). Water samples were tested for pH, temperature, conductivity, total coliforms, E. coli, free chlorine, nitrate, fluoride, heavy metals, and specific pathogen targets. Saliva samples were analyzed for antibody responses to potentially waterborne infections. We also shared water analysis results with households. RESULTS: We enrolled 33 households (83 individuals), 82% (n = 27) with utility-supplied water and 18% with private wells (n = 3) or springs (n = 3). 58% (n = 19) reported household incomes of <$20,000/year. Total coliforms were detected in water samples from 33% (n = 11) of homes, E. coli in 12%, all with wells or springs (n = 4), and Aeromonas, Campylobacter, and Enterobacter in 9%, all spring water (n = 3). Diarrhea was reported for 10% of individuals (n = 8) but was not associated with E. coli detection. 34% (n = 15) of saliva samples had detectable antibody responses for Cryptosporidium spp., C. jejuni, and Hepatitis E. After controlling for covariates and clustering, individuals in households with septic systems and straight pipes had a significantly higher likelihood of antibody detection (risk ratio = 3.28; 95% CI = 1.01-10.65).
CONCLUSIONS: To our knowledge, this is the first study to collect and analyze drinking water samples, saliva samples, and reported health outcome data from low-income households in Central Appalachia. Our findings indicate that utility-supplied water in this region was generally safe, and individuals in low-income households without utility-supplied water or sewerage have higher exposures to waterborne pathogens.
Subjects
Drinking Water, Humans, Drinking Water/microbiology, Virginia/epidemiology, Male, Adult, Female, Middle Aged, Saliva/microbiology, Water Microbiology, Water Quality, Water Supply, Young Adult, Adolescent, Rural Population/statistics & numerical data, Aged, Appalachian Region/epidemiology, Child, Poverty
ABSTRACT
Over one million households rely on private water supplies (e.g., wells, springs, cisterns) in the Commonwealth of Virginia, USA. The present study tested 538 private wells and springs in 20 Virginia counties for total coliforms (TCs) and Escherichia coli, along with a suite of chemical contaminants. A logistic regression analysis was used to investigate potential correlations between TC contamination and chemical parameters (e.g., nitrate (NO3-), turbidity), as well as homeowner-provided survey data describing system characteristics and perceived water quality. Of the 538 samples collected, 41% (n = 221) were positive for TCs and 10% (n = 53) for E. coli. Chemical parameters were not statistically predictive of microbial contamination. Well depth, water treatment, and farm location proximate to the water supply were factors in a regression model that predicted the presence/absence of TCs with 74% accuracy. Microbial and chemical source tracking techniques (Bacteroides Bac32F and HF183 detection via polymerase chain reaction, and optical brightener detection via fluorometry) identified four samples as likely contaminated with human wastewater.
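A presence/absence model of the kind described above can be fit with ordinary logistic regression. The sketch below uses entirely synthetic data (not the study's dataset) with hypothetical versions of the named predictors: well depth, any water treatment, and a farm near the supply.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400

# Synthetic predictors -- illustrative only, not the study's data
depth = rng.normal(60.0, 20.0, n)            # well depth
treated = rng.integers(0, 2, n).astype(float)    # any treatment
farm_nearby = rng.integers(0, 2, n).astype(float)  # farm proximate

depth_z = (depth - depth.mean()) / depth.std()  # standardize

# Assumed data-generating process: shallow, untreated wells near
# farms are more likely to test positive for total coliforms (TCs)
true_logit = 0.5 - 0.8 * depth_z - 1.0 * treated + 1.2 * farm_nearby
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

# Fit logistic regression by plain gradient descent on the log-loss
X = np.column_stack([np.ones(n), depth_z, treated, farm_nearby])
w = np.zeros(X.shape[1])
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.1 * (X.T @ (p - y)) / n

# In-sample classification accuracy at a 0.5 threshold
accuracy = float(np.mean(((X @ w) > 0) == (y == 1)))
```

The fitted coefficient signs recover the assumed effects (negative for depth and treatment, positive for farm proximity), mirroring how such a model yields an overall presence/absence accuracy figure like the 74% reported.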
Subjects
Drinking Water/microbiology, Family Characteristics, Water Microbiology, Water Pollutants, Chemical/chemistry, Animals, Virginia, Water Quality, Water Supply, Water Wells
ABSTRACT
Quantitative polymerase chain reaction (qPCR) offers a rapid, highly sensitive analytical alternative to the traditional culture-based techniques of microbial enumeration typically used in water quality monitoring. Before qPCR can be widely applied within surface water monitoring programs and stormwater assessment research, the relationships between microbial concentrations measured by qPCR and by culture-based methods must be assessed across a range of water types. Previous studies investigating fecal indicator bacteria quantification using molecular and culture-based techniques have compared measures of total concentration but have not examined particle-associated microorganisms, which may be more important from a transport perspective, particularly during the calibration of predictive water quality models for watershed management purposes. This study compared total, free-phase, and particle-associated Escherichia coli concentrations as determined by the Colilert defined substrate method and by qPCR targeting the uidA gene in stream grab samples partitioned via a calibrated centrifugation technique. Free-phase concentrations detected through qPCR were significantly higher than those detected using Colilert, although total concentrations were statistically equivalent, suggesting a source of analytical bias. Although a specimen processing control was used to identify and correct for qPCR inhibition, high particle concentrations may have resulted in underestimation of total cell counts, particularly at low concentrations. Regardless, qPCR-based techniques will likely have an important future role in stormwater assessment and management.
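qPCR quantifies a target such as the uidA gene by interpolating each sample's quantification cycle (Cq) on a log-linear standard curve of known dilutions. A minimal sketch of that calculation, with made-up standards (not this study's assay values):

```python
import numpy as np

# Hypothetical standard curve: Cq measured for known dilutions.
# A slope near -3.32 cycles per log10 decade indicates ~100%
# amplification efficiency (perfect doubling each cycle).
log10_copies = np.array([6.0, 5.0, 4.0, 3.0, 2.0])
cq = np.array([15.1, 18.4, 21.7, 25.0, 28.3])

slope, intercept = np.polyfit(log10_copies, cq, 1)
efficiency = 10 ** (-1 / slope) - 1  # 1.0 == 100% efficient

def cq_to_copies(sample_cq):
    """Interpolate an unknown sample's Cq back to target copies."""
    return 10 ** ((sample_cq - intercept) / slope)

copies = cq_to_copies(20.0)
```

Because concentration is recovered on a log scale, small Cq shifts from inhibition or particle interference translate into multiplicative errors in the estimated cell count, which is why the bias described above matters most at low concentrations.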
Subjects
Escherichia coli/genetics, Genes, Bacterial, Rivers/microbiology, Water Microbiology, Bacterial Load, DNA, Bacterial/genetics, Escherichia coli/isolation & purification, Polymerase Chain Reaction/methods, Water Pollutants/isolation & purification
ABSTRACT
Upstream anthropogenic land cover can degrade source drinking water quality and thereby inhibit the ability of a community water system to provide safe drinking water. This study aimed to predict differences in Safe Drinking Water Act (SDWA) compliance between water systems based on upstream land cover in Central Appalachia and to examine whether national trends correlating violations with system size and source type were relevant for this region. Multiple generalized linear mixed models assessed relationships between SDWA violations and the distance-weighted land cover proportions associated with each water system's contributing source watershed, as well as county economic status, system size, and water source. Results indicate that rates of monitoring and reporting violations were significantly higher for smaller water systems in more economically distressed counties. Interestingly, increases in surface mining land use and high-density development were associated with fewer monitoring and reporting violations, which may reflect impacts of associated economic development. Increases in low-intensity development increased the likelihood of health-based violations. To protect public health, community managers should consider source water protection and/or upgrading drinking water treatment capacity prior to developing previously undeveloped areas, and should further motivate compliance with monitoring and reporting requirements.
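The distance-weighted land cover proportions mentioned above can be illustrated with a simple inverse-distance scheme over watershed cells. This is a sketch under assumed weighting (the study's actual weighting function is not specified here), with made-up cells:

```python
import numpy as np

# Hypothetical watershed cells: distance upstream of the intake (km)
# and a land cover class per cell -- illustrative values only
distance = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
cover = np.array(["mining", "forest", "developed", "forest", "forest"])

# Inverse-distance weights: cells nearer the intake are assumed to
# influence source water quality more strongly
weights = 1.0 / distance
weights /= weights.sum()  # normalize so proportions sum to 1

def weighted_proportion(land_class):
    """Distance-weighted proportion of one land cover class."""
    return float(weights[cover == land_class].sum())

props = {c: weighted_proportion(c) for c in np.unique(cover)}
```

Under this scheme a small but near-intake mining cell can carry more weight than a larger but distant developed area, which is the intuition behind weighting land cover by proximity to the source.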