ABSTRACT
Trees remain sufficiently hydrated during drought by closing stomata and reducing canopy conductance (Gc) in response to variations in atmospheric water demand and soil water availability. The thresholds that control this reduction in Gc are proposed to balance hydraulic safety against carbon assimilation efficiency. However, the link between Gc and the ability of stem tissues to rehydrate at night remains unclear. We investigated whether species-specific Gc responses act to prevent branch embolism or to enable night-time stem rehydration, which is critical for turgor-dependent growth. To this end, we used a unique combination of concurrent dendrometer, sap flow and leaf water potential measurements, together with branch-vulnerability curves, for six common European tree species. Species-specific Gc reduction was only weakly related to the water potential at which 50% of branch xylem conductivity is lost (P50). Instead, we found a stronger relationship with stem rehydration: species with stronger Gc control were less effective at refilling stem-water storage as the soil dried, which appeared to be related to their xylem architecture. Our findings highlight the importance of stem rehydration for water-use regulation in mature trees, likely through the maintenance of adequate stem turgor. We conclude that stem rehydration must complement the widely accepted safety-efficiency paradigm of stomatal control.
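For context on how Gc is typically derived in such sap-flow studies: the abstract does not spell out the authors' exact procedure, so the following is the standard simplified relation rather than necessarily their method. Canopy conductance is commonly estimated by inverting a reduced form of the Penman-Monteith equation,

    G_c \approx \frac{\gamma \, \lambda \, E}{\rho \, c_p \, D}

where E is transpiration derived from sap flow, D the vapour pressure deficit, \gamma the psychrometric constant, \lambda the latent heat of vaporization of water, \rho the density of air and c_p its specific heat capacity at constant pressure. This inversion assumes the canopy is well coupled to the atmosphere.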
Subject(s)
Plant Leaves, Trees, Trees/physiology, Plant Leaves/physiology, Xylem/physiology, Water/physiology, Droughts, Fluid Therapy
ABSTRACT
Forests account for nearly 90% of the world's terrestrial biomass in the form of carbon and support 80% of global biodiversity. To understand the underlying forest dynamics, we need a long-term but also relatively high-frequency, networked monitoring system, as traditionally used in meteorology or hydrology. While numerous forest monitoring sites exist, particularly in temperate regions, the resulting data streams are rarely connected and do not provide information promptly, which hampers real-time assessments of forest responses to extreme climate events. The technology to build a better global forest monitoring network now exists. This white paper addresses the key structural components needed to achieve a novel meta-network. We propose to complement, rather than replace or unify, the existing heterogeneous infrastructure with standardized, quality-assured linking methods and interacting data processing centers to create an integrated forest monitoring network. These automated, research-topic-dependent linking methods across the atmosphere, biosphere and pedosphere play a key role in scaling up site-specific results and processing them in a timely manner. To ensure broad participation from existing monitoring sites and to encourage the establishment of new ones, these linking methods must be as informative, reliable, affordable and maintainable as possible, and should be supplemented by near-real-time remote sensing data. The proposed meta-network will enable the detection of emergent patterns that would not be visible from isolated analyses of individual sites. In addition, the near-real-time availability of data will facilitate predictions of current forest conditions (nowcasts), which are urgently needed for research and decision making in the face of rapid climate change. We call for international and interdisciplinary efforts in this direction.
ABSTRACT
Ecological research, like all Earth system sciences, is becoming increasingly data-rich. Tools for processing "big data" are continuously being developed to meet the corresponding technical and logistical challenges. However, even smaller data sets can be challenging to handle when best practices in data exploration, quality control and reproducibility are to be met, for example when conventional methods such as generating and assessing diagnostic visualizations or tables become unfeasible due to time and practicality constraints. Interactive processing can alleviate this issue and is increasingly used to ensure that large data sets are handled diligently. However, existing interactive tools rarely enable data manipulation, may not generate reproducible outputs, and are typically data- or domain-specific. We developed datacleanr, an interactive tool that facilitates best practices in data exploration, quality control (e.g., outlier assessment) and flexible processing for multiple tabular data types, including time series and georeferenced data. The package is open source and based on the R programming language. A key functionality of datacleanr is the "reproducible recipe": a translation of all interactive actions into R code, which can be integrated into existing analysis pipelines. This enables researchers experienced with script-based workflows to exploit the strengths of interactive processing without sacrificing their usual work style or functionality from other (R) packages. We demonstrate the package's utility by addressing two common issues during data analyses, namely 1) identifying problematic structures and artefacts in hierarchically nested data, and 2) preventing excessive loss of data from 'coarse' code-based filtering of time series. Ultimately, with datacleanr we aim to improve researchers' workflows and to increase confidence in and the reproducibility of their results.
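To make the intended workflow concrete, below is a minimal sketch in R. It assumes only the package's documented entry point, dcr_app(); the example data set (iris) and the comments on in-app behavior are illustrative, not part of the original text.

    # Install from CRAN and load the package
    install.packages("datacleanr")
    library(datacleanr)

    # Launch the interactive app on a built-in example data set;
    # inside the app, the data can be grouped, explored visually,
    # and individual observations can be flagged (e.g., as outliers).
    dcr_app(iris)

    # Closing the app yields the "reproducible recipe" described
    # above: R code reproducing every interactive action, ready to
    # be pasted into an existing script-based analysis pipeline.

In practice, the recipe is what lets interactive cleaning coexist with version-controlled, script-based workflows.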