ABSTRACT
Insect outbreaks affect forest structure and function and represent a major category of forest disturbance globally. However, the resulting impacts on evapotranspiration (ET), and especially the hydrological partitioning between the abiotic (evaporation) and biotic (transpiration) components of total ET, are not well constrained. Accordingly, we combined remote sensing, eddy covariance, and hydrological modeling approaches to determine the effects of bark beetle outbreak on ET and its partitioning at multiple scales throughout the Southern Rocky Mountain Ecoregion (SRME), USA. At the eddy covariance measurement scale, 85% of the forest was affected by beetles, and water year ET as a fraction of precipitation (P) decreased by 30% relative to a control site, with 31% greater reductions in growing season transpiration relative to total ET. At the ecoregion scale, satellite remote sensing masked to areas of >80% tree mortality showed corresponding ET/P reductions of 9-15% that occurred 6-8 years post-disturbance, and indicated that the majority of the total reduction occurred during the growing season; the Variable Infiltration Capacity hydrological model showed an associated 9-18% increase in the ecoregion runoff ratio. Long-term (16-18 year) ET and vegetation mortality datasets extended the length of previously published analyses and allowed for clear characterization of the forest recovery period. During that time, transpiration recovery outpaced total ET recovery, which lagged in part because of persistently reduced winter sublimation, and there was associated evidence of increasing late summer vegetation moisture stress. Overall, comparison of three independent methods and two partitioning approaches demonstrated a net negative impact of bark beetles on ET, and a relatively greater negative impact on transpiration, following bark beetle outbreak in the SRME.
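To make the flux-tower comparison concrete, the short Python sketch below computes water-year ET/P at a beetle-affected site relative to a control site, the comparison behind the roughly 30% reduction reported above. It is a minimal illustration with entirely hypothetical daily data, not the authors' code or data.

import numpy as np

def water_year_ratio(et_mm_day, p_mm_day):
    """Sum daily ET and precipitation over a water year and return ET/P."""
    return np.sum(et_mm_day) / np.sum(p_mm_day)

# Hypothetical daily totals (mm/day) for one water year at each site.
rng = np.random.default_rng(0)
et_affected = rng.uniform(0.5, 2.0, 365)   # beetle-affected site
et_control = rng.uniform(0.8, 2.5, 365)    # undisturbed control site
p_both = rng.uniform(1.0, 3.0, 365)        # shared precipitation forcing

etp_affected = water_year_ratio(et_affected, p_both)
etp_control = water_year_ratio(et_control, p_both)

# Relative reduction in ET/P at the affected site (the abstract reports ~30%).
relative_reduction = 1.0 - etp_affected / etp_control
print(f"ET/P reduction relative to control: {relative_reduction:.0%}")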
Subjects
Beetles, Weevils, Animals, Plant Bark, Forests, Trees
ABSTRACT
Extreme precipitation can have profound consequences for communities, resulting in natural hazards such as rainfall-triggered landslides that cause casualties and extensive property damage. A key challenge to understanding and predicting rainfall-triggered landslides comes from observational uncertainties in the depth and intensity of precipitation preceding the event. Practitioners and researchers must select from a wide range of precipitation products, often with little guidance. Here we evaluate the degree of precipitation uncertainty across multiple precipitation products for a large set of landslide-triggering storm events and investigate the impact of these uncertainties on predicted landslide probability using published intensity-duration thresholds. Average intensity, peak intensity, duration, and NOAA-Atlas return periods are compared for precipitation preceding 177 reported landslides across the continental United States and Canada. Precipitation data are taken from four products that cover disparate measurement methods: near real-time and post-processed satellite (IMERG), radar (MRMS), and gauge-based (NLDAS-2). Landslide-triggering precipitation was found to vary widely across precipitation products, with the depth of individual storm events diverging by as much as 296 mm and an average range of 51 mm. Peak intensity measurements, which are typically influential in triggering landslides, were also highly variable, with an average range of 7.8 mm/h and as much as 57 mm/h. The two products more reliant upon ground-based observations (MRMS and NLDAS-2) performed better at identifying landslides according to published intensity-duration storm thresholds, but all products exhibited hit ratios greater than 0.56. A greater proportion of landslides was predicted when including only manually verified landslide locations. We recommend practitioners consider low-latency products like MRMS for investigating landslides, given their near real-time data availability and good performance in detecting landslides. Practitioners would be well served by considering more than one product to confirm intense storm signals and minimize the influence of noise and false alarms.
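As an illustration of the threshold test behind the hit ratios above, the sketch below applies a power-law intensity-duration threshold of the form I = a * D**b to storm summaries and counts the fraction of landslide-preceding storms that exceed it. The coefficients and storms are placeholders, not the published thresholds or the study's data.

from dataclasses import dataclass

@dataclass
class Storm:
    duration_h: float          # storm duration (hours)
    mean_intensity_mmh: float  # average intensity (mm/h)

def exceeds_threshold(storm: Storm, a: float = 10.0, b: float = -0.5) -> bool:
    """True if mean intensity exceeds the power-law threshold I = a * D**b."""
    return storm.mean_intensity_mmh > a * storm.duration_h ** b

# Hypothetical storms preceding reported landslides.
storms = [Storm(6, 8.0), Storm(24, 2.5), Storm(48, 0.9)]
hits = sum(exceeds_threshold(s) for s in storms)
hit_ratio = hits / len(storms)  # fraction of landslide storms flagged by the threshold
print(f"hit ratio = {hit_ratio:.2f}")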
ABSTRACT
This manuscript describes an observationally based dataset of soil evaporation for the conterminous U.S. (CONUS), gridded to a 9 km resolution for the time period April 2015-March 2019. This product is termed E-SMAP (Evaporation-Soil Moisture Active Passive); soil evaporation is estimated for the surface layer, defined by the SMAP sensing depth of 50 mm, over intervals between SMAP overpasses that are screened on the basis of precipitation and SMAP quality control flags. Soil evaporation is estimated using a water balance of the surface soil that we show is largely dominated by SMAP-observed soil drying. E-SMAP soil evaporation is on average 0.72 mm day⁻¹, which falls within the range of soil evaporation estimates (0.17-0.89 mm day⁻¹) derived from operational land surface models and an alternative remote sensing product. E-SMAP is independent of existing soil evaporation estimates and therefore has the potential to improve understanding of evapotranspiration partitioning and model development.
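Over a rain-free interval between overpasses, the surface-layer water balance described above reduces to soil evaporation approximated by the observed drying of the 50 mm layer, E ≈ -Δθ · z / Δt. The sketch below illustrates that simplification with invented numbers and an illustrative screening rule; it is not the E-SMAP processing code.

SENSING_DEPTH_MM = 50.0  # SMAP surface-layer depth

def soil_evaporation(theta_start, theta_end, interval_days, precip_mm=0.0):
    """Return mean soil evaporation (mm/day) from surface-layer drying.

    theta_start, theta_end: volumetric soil moisture (m3/m3) at two overpasses.
    Intervals with precipitation are screened out (returned as None).
    """
    if precip_mm > 0.0:
        return None  # skip intervals that violate the no-rain screening
    storage_change_mm = (theta_end - theta_start) * SENSING_DEPTH_MM
    return -storage_change_mm / interval_days

# Example: drying from 0.25 to 0.21 m3/m3 over a 3-day dry interval -> ~0.67 mm/day.
print(soil_evaporation(0.25, 0.21, 3.0))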
ABSTRACT
Severe and persistent 21st-century drought in southwestern North America (SWNA) motivates comparisons to medieval megadroughts and questions about the role of anthropogenic climate change. We use hydrological modeling and new 1200-year tree-ring reconstructions of summer soil moisture to demonstrate that the 2000-2018 SWNA drought was the second driest 19-year period since 800 CE, exceeded only by a late-1500s megadrought. The megadrought-like trajectory of 2000-2018 soil moisture was driven by natural variability superimposed on drying due to anthropogenic warming. Anthropogenic trends in temperature, relative humidity, and precipitation estimated from 31 climate models account for 47% (model interquartiles of 35 to 105%) of the 2000-2018 drought severity, pushing an otherwise moderate drought onto a trajectory comparable to the worst SWNA megadroughts since 800 CE.
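As a highly simplified illustration of the attribution statement above, the sketch below expresses drought severity as the 2000-2018 mean soil-moisture anomaly and the anthropogenic share as the model-estimated forced component divided by the observed total. All values are invented and the calculation is far coarser than the study's multi-model analysis.

import numpy as np

years = np.arange(2000, 2019)
observed_anomaly = np.full(years.size, -0.40)       # hypothetical standardized soil-moisture anomaly
anthropogenic_anomaly = np.full(years.size, -0.19)  # hypothetical model-estimated forced component

severity_obs = observed_anomaly.mean()
severity_anthro = anthropogenic_anomaly.mean()

# Share of total drought severity attributable to the anthropogenic trend.
fraction_anthropogenic = severity_anthro / severity_obs
print(f"anthropogenic share of drought severity = {fraction_anthropogenic:.0%}")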