Results 1 - 9 of 9
1.
J Environ Manage; 347: 119213, 2023 Dec 01.
Article in English | MEDLINE | ID: mdl-37812899

ABSTRACT

Grazing management is an important factor affecting the delivery of ecosystem services at the watershed scale. Moreover, characterizing the impacts of climate variation on water resources is essential in managing rangelands. In this study, the effects of alternative grazing management scenarios on provisioning, regulating, and supporting services were assessed in two watersheds with contrasting climates: the Lower Prairie Dog Town Fork Red River (LPDTFR) Watershed in North Texas and the Apple Watershed in South Dakota. The impacts of heavy stocking continuous (HC) grazing, light stocking continuous grazing, Adaptive Multi-Paddock (AMP) grazing, and an ungrazed exclosure were compared using the Soil and Water Assessment Tool (SWAT) model. Our results indicate that the quantity of snow and the timing of snowmelt substantially influenced grazing management effects on ecosystem services in the Apple Watershed. In contrast, precipitation was the main factor influencing these effects in the LPDTFR Watershed because it strongly affected the variation in water cycling, streamflow, sediment, and nutrient controls. Simulated results indicated that AMP grazing was the optimal grazing management approach for enhancing water conservation and ecosystem services in both watersheds regardless of climatic conditions. The Apple Watershed, which is snow-dominated, exhibited greater ecosystem service improvements under AMP grazing (50.6%, 58.7%, 74.4%, 61.5%, and 72.6% reductions in surface runoff, streamflow, sediment, total nitrogen (TN), and total phosphorus (TP) losses, respectively, compared to HC grazing) than the LPDTFR Watershed (46.0%, 22.8%, 34.1%, 18.9%, and 38.4% reductions in surface runoff, streamflow, sediment, TN, and TP losses, respectively). Our results suggest that improved grazing management practices enhance ecosystem services and water catchment functions in rangeland-dominated areas, especially in colder climates.
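The percentage reductions quoted in the abstract are relative to the heavy stocking continuous (HC) grazing baseline. A minimal sketch of that calculation, with illustrative numbers (not the study's raw SWAT outputs):

```python
# Sketch: percent reduction of a simulated loss term under AMP grazing
# relative to the HC grazing baseline. The numeric inputs below are
# illustrative assumptions, not data from the study.

def percent_reduction(hc_value: float, amp_value: float) -> float:
    """Reduction (%) of a loss term under AMP relative to the HC baseline."""
    if hc_value == 0:
        raise ValueError("HC baseline must be nonzero")
    return (hc_value - amp_value) / hc_value * 100.0

# Illustrative: an HC surface runoff of 100 mm vs an AMP runoff of 49.4 mm
# reproduces the ~50.6% reduction cited for the Apple Watershed.
print(round(percent_reduction(100.0, 49.4), 1))  # 50.6
```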


Subjects
Ecosystem, Soil, North Dakota, Texas, Water
2.
Sci Rep; 11(1): 20102, 2021 Oct 11.
Article in English | MEDLINE | ID: mdl-34635701

ABSTRACT

Determining optimum irrigation termination periods for cotton (Gossypium hirsutum L.) is crucial for efficient utilization and conservation of the finite groundwater resources of the Ogallala Aquifer in the Texas High Plains (THP) region. The goal of this study was to suggest optimum irrigation termination periods for different evapotranspiration (ET) replacement-based irrigation strategies to optimize cotton yield and irrigation water use efficiency (IWUE) using the CROPGRO-Cotton model. We re-evaluated a previously evaluated CROPGRO-Cotton model using updated yield and in-season physiological data from the 2017-2019 growing seasons of an IWUE experiment at Halfway, TX. The re-evaluated model was then used to study the effects of combinations of irrigation termination periods (between August 15 and September 30) and deficit/excess irrigation strategies (55%-115% ET replacement) under dry, normal, and wet years using weather data from 1978 to 2019. The 85% ET-replacement strategy was found to be ideal for optimizing irrigation water use and cotton yield, and the optimum irrigation termination period for this strategy was found to be the first week of September during dry and normal years and the last week of August during wet years. The irrigation termination periods suggested in this study are useful for optimizing cotton production and IWUE under different levels of irrigation water availability.
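IWUE, the metric optimized in this study, is commonly defined as yield per unit of irrigation water applied. A minimal sketch under that common definition, with hypothetical yields and irrigation depths (the units and values are assumptions, not the study's data):

```python
# Sketch: irrigation water use efficiency (IWUE) as commonly defined,
# i.e., yield per unit depth of irrigation water applied.

def iwue(yield_kg_ha: float, irrigation_mm: float) -> float:
    """IWUE in kg ha^-1 mm^-1: lint yield per mm of irrigation applied."""
    if irrigation_mm <= 0:
        raise ValueError("irrigation depth must be positive")
    return yield_kg_ha / irrigation_mm

# Hypothetical comparison: a deficit strategy can raise IWUE even if
# absolute yield is slightly lower than under full ET replacement.
print(iwue(1400.0, 350.0))  # 4.0 (deficit irrigation)
print(iwue(1500.0, 500.0))  # 3.0 (full replacement)
```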


Subjects
Agricultural Irrigation/methods, Gossypium/growth & development, Groundwater, Photosynthesis, Seasons
3.
Sci Total Environ; 490: 379-90, 2014 Aug 15.
Article in English | MEDLINE | ID: mdl-24867702

ABSTRACT

Rapid groundwater depletion has raised grave concerns about sustainable development in many parts of Texas, as well as in other parts of the world. Previous hydrologic investigations of groundwater levels in Texas were conducted mostly on an aquifer-specific basis and hence lacked a state-wide panoramic view. The aim of this study was to present a qualitative overview of long-term (1930-2010) trends in groundwater levels in Texas and identify spatial patterns by applying different statistical (boxplots, correlation-regression, hierarchical cluster analysis) and geospatial techniques (Moran's I, Local Indicators of Spatial Association) to 136,930 groundwater level observations from the Texas Water Development Board's database. State-wide decadal median water levels declined from about 14 m below land surface in the 1930s to about 36 m in the 2000s. The number of counties with deeper median water levels (water-level depth > 100 m) increased from 2 to 13 between the 1930s and 2000s, accompanied by a decrease in the number of counties with shallower median water levels (water-level depth < 25 m) from 134 to 113. Water-level declines across Texas, however, mostly followed logarithmic trends marked by a leveling-off phenomenon in recent times. Assessment of water levels by Groundwater Management Areas (GMAs), management units created to address groundwater depletion issues, indicated hotspots of deep water levels in the Texas Panhandle and GMA 8 since the 1960s. Contrasting patterns in water use, land cover, geology, and soil properties distinguished the Texas Panhandle from GMA 8. Irrigated agriculture is the major cause of depletion in the Texas Panhandle, as compared to increasing urbanization in GMA 8. Overall, our study indicated that the use of robust spatial and statistical methods can reveal important details about trends in water-level changes and shed light on the associated factors. Due to their generic nature, the techniques used in this study can also be applied to other areas with similar eco-hydrologic issues to identify regions that warrant future management actions.
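Global Moran's I, one of the spatial statistics named in this abstract, measures whether similar values cluster in space. A minimal sketch on a hypothetical transect of four monitoring points (the values and weights below are illustrative, not the TWDB data):

```python
# Sketch: global Moran's I from a value list and a binary spatial
# weights matrix. The 4-point transect and weights are hypothetical.

def morans_i(values, weights):
    """I = (n / W) * sum_ij w_ij (x_i - m)(x_j - m) / sum_i (x_i - m)^2."""
    n = len(values)
    m = sum(values) / n
    dev = [v - m for v in values]                       # deviations from mean
    w_total = sum(sum(row) for row in weights)          # W: sum of all weights
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_total) * (num / den)

# Four points along a transect, adjacent points are neighbors; shallow
# water levels on one end and deep on the other (clustered pattern).
vals = [10.0, 12.0, 30.0, 32.0]
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
print(round(morans_i(vals, w), 3))  # 0.386 (positive: spatial clustering)
```

A positive I indicates clustering of similar values (e.g., hotspots of deep water levels), near zero indicates spatial randomness, and negative values indicate dispersion.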


Subjects
Environmental Monitoring, Groundwater/analysis, Water Resources/statistics & numerical data, Water Supply/statistics & numerical data, Agricultural Irrigation, Agriculture/statistics & numerical data, Conservation of Natural Resources, Soil, Texas, Urbanization, Water Supply/analysis
4.
Sci Total Environ; 472: 370-80, 2014 Feb 15.
Article in English | MEDLINE | ID: mdl-24295753

ABSTRACT

We assessed the spatial distribution of total dissolved solids (TDS) in shallow (<50 m), intermediate (50-150 m), and deep (>150 m) municipal (domestic and public supply) wells in nine major aquifers in Texas for the 1960s-1970s and 1990s-2000s periods using geochemical data obtained from the Texas Water Development Board. For both time periods, the highest median groundwater TDS concentrations in shallow wells were found in the Ogallala and Pecos Valley aquifers, and those in deep wells were found in the Trinity aquifer. In the Ogallala, Pecos Valley, Seymour, and Gulf Coast aquifers, >60% of observations from shallow wells exceeded the secondary maximum contaminant level (SMCL) for TDS (500 mg L⁻¹) in both time periods. In the Trinity aquifer, 72% of deep water quality observations exceeded the SMCL in the 1990s-2000s, as compared to 64% in the 1960s-1970s. In the Ogallala, Edwards-Trinity (Plateau), and Edwards (Balcones Fault Zone) aquifers, the extent of salinization decreased significantly (p < 0.05) with well depth, indicating surficial salinity sources. Geochemical ratios revealed strong contributions of chloride (Cl⁻) and sulfate (SO₄²⁻) to groundwater salinization throughout the state. Persistent salinity hotspots were identified in west (southern Ogallala, northwestern Edwards-Trinity (Plateau), and Pecos Valley aquifers), north-central (Trinity-downdip aquifer), and south (southern Gulf Coast aquifer) Texas. In west Texas, mixed-cation SO₄-Cl facies led to groundwater salinization, as compared to Na-Cl facies in the southern Gulf Coast, and Ca-Na-HCO₃ and Na-HCO₃ facies transitioning to Na-Cl facies in the Trinity-downdip regions. Groundwater mixing ensuing from cross-formational flow, seepage from saline plumes and playas, evaporative enrichment, and irrigation return flow led to progressive groundwater salinization in west Texas, as compared to ion-exchange processes in north-central Texas, and seawater intrusion coupled with salt dissolution and irrigation return flow in the southern Gulf Coast regions.


Subjects
Environmental Monitoring, Groundwater/chemistry, Salinity, Water Supply/statistics & numerical data, Seawater/analysis, Texas, Water Pollutants, Chemical/analysis, Water Supply/analysis
5.
J Environ Qual; 43(4): 1404-16, 2014 Jul.
Article in English | MEDLINE | ID: mdl-25603087

ABSTRACT

Groundwater quality degradation is a major threat to sustainable development in Texas. The aim of this study was to elucidate spatiotemporal patterns of groundwater fluoride (F⁻) contamination in different water use classes in 16 groundwater management areas in Texas between 1960 and 2010. Groundwater F⁻ concentration data were obtained from the Texas Water Development Board and aggregated over a decadal scale. Our results indicate that observations exceeding the drinking water quality threshold of the World Health Organization (1.5 mg F⁻ L⁻¹) and the secondary maximum contaminant level (SMCL) (2 mg F⁻ L⁻¹) of the USEPA increased from 26 and 19% in the 1960s to 37 and 23%, respectively, in the 2000s. In the 2000s, F⁻ observations > SMCL among different water use classes followed the order: irrigation (39%) > domestic (20%) > public supply (17%). The extent and mode of interaction between F⁻ and other water quality parameters varied regionally. In western Texas, high F⁻ concentrations were prevalent at shallower depths (<50 m) and were positively correlated with bicarbonate (HCO₃⁻) and sulfate anions. In contrast, in southern and southeastern Texas, higher F⁻ concentrations occurred at greater depths (>50 m) and were correlated with HCO₃⁻ and chloride anions. A spatial pattern has become apparent, marked by "excess" F⁻ in western Texas groundwaters as compared with "inadequate" F⁻ contents in the rest of the state. Groundwater F⁻ contamination in western Texas was largely influenced by groundwater mixing and evaporative enrichment, as compared with water-rock interaction and mineral dissolution in the rest of the state.

6.
Sci Total Environ; 452-453: 333-48, 2013 May 01.
Article in English | MEDLINE | ID: mdl-23532041

ABSTRACT

A vast region in north-central Texas, centered on the Dallas-Fort Worth metroplex, suffers from intense groundwater drawdown and water quality degradation, which led to the inclusion of 18 counties of this region in Priority Groundwater Management Areas. We combined aquifer-based and county-based hydrologic analyses to (1) assess spatio-temporal changes in groundwater level and quality between 1960 and 2010 in the Trinity and Woodbine aquifers underlying the study region, (2) delve into major hydrochemical facies with reference to aquifer hydrostratigraphy, and (3) identify county-based spatial zones to aid in future groundwater management initiatives. Water-level and quality data were obtained from the Texas Water Development Board (TWDB) and analyzed on a decadal scale. Progressive water-level decline was the major concern in the Trinity aquifer, with >50% of observations occurring at depths >100 m since the 1980s, an observation becoming apparent only in the 2000s in the Woodbine aquifer. Water quality degradation was the major issue in the Woodbine aquifer, with a substantially higher percentage of observations exceeding the secondary maximum contaminant level (SMCL; a non-enforceable threshold set by the United States Environmental Protection Agency (USEPA)) and/or maximum contaminant level (MCL; a legally enforceable drinking water standard set by the USEPA) for sulfate (SO₄²⁻), chloride (Cl⁻), and fluoride (F⁻) in each decade. In both aquifers, however, >70% of observations exceeded the SMCL for total dissolved solids, indicating high groundwater salinization. Water-level changes in the Trinity aquifer also had a significant negative impact on water quality. Hydrochemical facies in this region evolved sequentially from Ca-Mg-HCO₃ and Ca-HCO₃ in the fluvial sediments of the west to Na-SO₄-Cl in the deltaic sediments to the east. The sequentially evolving hydrogeochemical facies and increasing salinization closely resembled the regional groundwater flow pattern. Distinct spatial zones based on homogeneous hydrologic characteristics have become increasingly apparent over time, indicating the necessity of zone-specific groundwater management strategies.


Subjects
Groundwater/analysis, Water Pollution/analysis, Water Quality, Chlorides/analysis, Cluster Analysis, Environmental Monitoring/methods, Fluorides/analysis, Sulfates/analysis, Texas, Water Pollutants, Chemical/analysis
7.
J Environ Qual; 42(6): 1699-710, 2013 Nov.
Article in English | MEDLINE | ID: mdl-25602410

ABSTRACT

Subsurface tile drains in agricultural systems of the midwestern United States are a major contributor of the nitrate-N (NO₃-N) loadings that drive hypoxic conditions in the Gulf of Mexico. Hydrologic and water quality models, such as the Soil and Water Assessment Tool, are widely used to simulate tile drainage systems. The Hooghoudt and Kirkham tile drain equations in the Soil and Water Assessment Tool have not been rigorously tested for predicting tile flow and the corresponding NO₃-N losses. In this study, long-term (1983-1996) monitoring plot data from southern Minnesota were used to evaluate the SWAT version 2009 revision 531 (hereafter referred to as SWAT) model for accurately estimating subsurface tile drain flows and associated NO₃-N losses. A retention parameter adjustment factor was incorporated to account for the effects of tile drainage and slope changes on the computation of surface runoff using the curve number method (hereafter referred to as Revised SWAT). The SWAT and Revised SWAT models were calibrated and validated for tile flow and associated NO₃-N losses. Results indicated that, on average, Revised SWAT predicted monthly tile flow and associated NO₃-N losses better than SWAT by 48 and 28%, respectively. For the calibration period, the Revised SWAT model simulated tile flow and NO₃-N losses within 4 and 1% of the observed data, respectively. For the validation period, it simulated tile flow and NO₃-N losses within 8 and 2%, respectively, of the observed values. Therefore, the Revised SWAT model is expected to provide more accurate simulation of the effectiveness of tile drainage and NO₃-N management practices.
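Model performance here is reported as percent deviation from observed totals; the related study below (entry 9) also uses the Nash-Sutcliffe efficiency (NSE). A minimal sketch of both statistics on hypothetical monthly series (the data are illustrative, not the Minnesota plot records):

```python
# Sketch: Nash-Sutcliffe efficiency and percent error, two standard
# goodness-of-fit statistics for hydrologic model evaluation.
# The observed/simulated series below are hypothetical.

def nse(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1 is perfect."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ssd = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / ssd

def percent_error(observed, simulated):
    """Percent error of the simulated total relative to the observed total."""
    return (sum(simulated) - sum(observed)) / sum(observed) * 100.0

obs = [2.0, 4.0, 6.0, 8.0]   # e.g., monthly tile flow, hypothetical units
sim = [2.5, 3.5, 6.5, 7.5]
print(nse(obs, sim))            # 0.95
print(percent_error(obs, sim))  # 0.0
```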

8.
J Environ Qual; 41(6): 1806-17, 2012.
Article in English | MEDLINE | ID: mdl-23128738

ABSTRACT

Nitrate (NO₃⁻) is a major contaminant and threat to groundwater quality in Texas. High-NO₃ groundwater used for irrigation and domestic purposes has serious environmental and health implications. The objective of this study was to evaluate spatio-temporal trends in groundwater NO₃ concentrations in Texas on a county basis from 1960 to 2010, with special emphasis on the Texas Rolling Plains (TRP), using the Texas Water Development Board's groundwater quality database. Results indicated that groundwater NO₃ concentrations have significantly increased in several counties since the 1960s. In 25 counties, >30% of the observations exceeded the maximum contaminant level (MCL) for NO₃ (44 mg L⁻¹ NO₃) in the 2000s, as compared with eight counties in the 1960s. In Haskell and Knox Counties of the TRP, all observations exceeded the NO₃ MCL in the 2000s. A distinct spatial clustering of high-NO₃ counties has become increasingly apparent with time in the TRP, as indicated by different spatial indices. County median NO₃ concentrations in the TRP region were positively correlated with county-based area estimates of croplands, fertilized croplands, and irrigated croplands, suggesting a negative impact of agricultural practices on groundwater NO₃ concentrations. The highly transmissive geologic and soil media in the TRP have likely facilitated NO₃ movement and groundwater contamination in this region. A major hindrance in evaluating groundwater NO₃ concentrations was the lack of adequate recent observations. Overall, the results indicated a substantial deterioration of groundwater quality by NO₃ across the state due to agricultural activities, emphasizing the need for more frequent and spatially intensive groundwater sampling.
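The county-level screening described above reduces to counting the fraction of observations above the MCL in each decade. A minimal sketch of that exceedance calculation (the sample concentrations are hypothetical, not TWDB records):

```python
# Sketch: percentage of groundwater observations exceeding the nitrate
# MCL (44 mg/L as NO3), the per-county, per-decade screening used in
# the abstract. The sample concentrations below are hypothetical.

NO3_MCL_MG_L = 44.0  # US drinking water MCL for nitrate, as NO3

def pct_exceeding(concentrations, threshold=NO3_MCL_MG_L):
    """Percent of observations strictly above the threshold."""
    if not concentrations:
        raise ValueError("no observations")
    n_over = sum(1 for c in concentrations if c > threshold)
    return n_over / len(concentrations) * 100.0

samples = [12.0, 51.0, 44.5, 30.0, 90.0]  # mg/L, hypothetical county data
print(pct_exceeding(samples))  # 60.0
```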


Subjects
Groundwater/chemistry, Nitrates/chemistry, Water Pollutants, Chemical/chemistry, Agricultural Irrigation, Conservation of Natural Resources, Environmental Monitoring/methods, Soil/chemistry, Texas, Time Factors
9.
J Environ Qual; 41(1): 217-28, 2012.
Article in English | MEDLINE | ID: mdl-22218190

ABSTRACT

The nitrate-N (NO₃-N) lost through subsurface drainage in the Midwest often exceeds concentrations that cause deleterious effects in receiving streams and leads to hypoxic conditions in the northern Gulf of Mexico. The use of drainage and water quality models, along with observed data analysis, may provide new insight into the water and nutrient balance in drained agricultural lands and enable evaluation of appropriate measures for reducing NO₃-N losses. DRAINMOD-NII, a carbon (C) and nitrogen (N) simulation model, was field tested for the high organic matter Drummer soil in Indiana and used to predict the effects of fertilizer application rate and drainage water management (DWM) on NO₃-N losses through subsurface drainage. The model was calibrated and validated separately for continuous corn (Zea mays L.) (CC) and corn-soybean [Glycine max (L.) Merr.] (CS) rotation treatments using 7 yr of drain flow and NO₃-N concentration data. Among the treatments, the Nash-Sutcliffe efficiency of the monthly NO₃-N loss predictions ranged from 0.30 to 0.86, and the percent error varied from -19 to 9%. The medians of the observed and predicted monthly NO₃-N losses were not significantly different. When the fertilizer application rate was reduced by ~20%, the predicted NO₃-N losses in drain flow from the CC treatments were reduced by 17% (95% confidence interval [CI], 11-25), while losses from the CS treatment were reduced by 10% (95% CI, 1-15). With DWM, the predicted average annual drain flow was reduced by about 56% (95% CI, 49-67), while the average annual NO₃-N losses through drain flow were reduced by about 46% (95% CI, 32-57) for both tested crop rotations. However, the simulated NO₃-N losses in surface runoff increased by about 3 to 4 kg ha⁻¹ with DWM. For the simulated conditions at the study site, implementing DWM along with reduced fertilizer application rates would be the best strategy to achieve the highest NO₃-N loss reductions to surface water. The suggested best strategies would reduce the NO₃-N losses to surface water by 38% (95% CI, 29-46) for the CC treatments and by 32% (95% CI, 23-40) for the CS treatments.


Subjects
Computer Simulation, Models, Theoretical, Nitrates/chemistry, Nitrogen/chemistry, Water Movements, Water Pollutants, Chemical/chemistry, Environmental Monitoring, Fertilizers, Soil/chemistry, Glycine max, Zea mays