Results 1 - 6 of 6
1.
Ecol Appl ; 26(5): 1456-1474, 2016 Jul.
Article in English | MEDLINE | ID: mdl-27755750

ABSTRACT

Re-establishing connectivity between protected areas isolated by habitat clearing is a key conservation goal in the humid tropics. In northeastern Costa Rica, payments for environmental services (PES) and a government ban on deforestation have subsidized forest protection and reforestation in the San Juan-La Selva Biological Corridor (SJLSBC), resulting in a decline in mature forest loss and the expansion of tree plantations. We use field studies and graph models to assess how conservation efforts have altered functional connectivity over the last 25 years for four species of insectivorous understory birds. Field playback studies assessed how reforestation habitat quality affected the willingness of Myrmeciza exsul, Henicorhina leucosticta, Thamnophilus atrinucha, and Glyphorynchus spirurus to travel outside forest habitat for territorial defense. Observed travel distances were greatest in nonnative and native tree plantations with high understory stem density, regardless of overstory composition. In contrast, tree plantations with low stem density had travel responses comparable to open pasture for three of the four bird species. We modeled landscape connectivity for each species using graph models based on varying possible travel distances in tree plantations, gallery forests, and pastures. From 1986 to 2011, connectivity for all species declined in the SJLSBC landscape (5825 km²) by 14% to 21% despite only a 4.9% net loss in forest area and the rapid expansion of tree plantations over 2% of the landscape. Plantation placement in the landscape limited their potential facilitation of connectivity because they were located either far from forest cover or within already contiguous forest areas. We mapped current connectivity bottlenecks and identified priority areas for future reforestation. We estimate that reforestation of priority areas could improve connectivity by 2% with only a 1% gain in forest cover, an impressive gain given the small area reforested. Results indicate key locations where spatial targeting of PES within the SJLSBC study region would protect existing forest connectivity and enhance the connectivity benefits of reforestation.
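A miniature version of such a graph model can be sketched as follows: treat each forest patch as a node, link two patches when their separation is within a species' travel distance, and score connectivity as the share of habitat area held by the largest linked cluster. The Python sketch below is illustrative only; the patch coordinates, areas, and the area-weighted metric are assumptions, not the paper's model.

```python
import math

def connectivity_index(patches, max_travel_m):
    """Fraction of total habitat area in the largest connected cluster.

    patches: list of (x, y, area) tuples (hypothetical patch centroids
    and areas). Two patches are linked when their centroid distance is
    at most max_travel_m (a stand-in for species travel capacity).
    """
    n = len(patches)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    # Link every pair of patches within the travel threshold.
    for i in range(n):
        for j in range(i + 1, n):
            (x1, y1, _), (x2, y2, _) = patches[i], patches[j]
            if math.hypot(x2 - x1, y2 - y1) <= max_travel_m:
                union(i, j)

    # Sum habitat area per cluster and report the largest share.
    clusters = {}
    for i, (_, _, area) in enumerate(patches):
        root = find(i)
        clusters[root] = clusters.get(root, 0.0) + area
    return max(clusters.values()) / sum(a for _, _, a in patches)

# Three hypothetical patches: a 100 m travel limit links only the first two.
patches = [(0, 0, 10.0), (80, 0, 5.0), (500, 0, 4.0)]
print(connectivity_index(patches, 100))  # 15/19 of the area is connected
print(connectivity_index(patches, 600))  # every patch is linked: 1.0
```

Raising the travel threshold, as dense-understory plantations did in the field playback studies, merges clusters and raises the index; this is the qualitative behavior the graph models quantify across scenarios.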


Subject(s)
Birds/physiology, Environmental Restoration and Remediation/methods, Forests, Animal Distribution, Animals, Costa Rica, Tropical Climate
2.
MethodsX ; 10: 101998, 2023.
Article in English | MEDLINE | ID: mdl-36660342

ABSTRACT

With the increased availability of hyperspectral imaging (HSI) data at various scales (0.03-30 m), the role of simulation is becoming increasingly important in data analysis and applications. There are few commercially available tools to spatially degrade imagery based on the spatial response of a coarser resolution sensor. Instead, HSI data are typically spatially degraded using nearest neighbor, pixel aggregate or cubic convolution approaches. Without accounting for the spatial response of the simulated sensor, these approaches yield unrealistically sharp images. This article describes the spatial response resampling (SR2) workflow, a novel approach to degrade georeferenced raster HSI data based on the spatial response of a coarser resolution sensor. The workflow is open source and widely available for personal, academic or commercial use with no restrictions. The importance of the SR2 workflow is shown with three practical applications (data cross-validation, flight planning and data fusion of separate VNIR and SWIR images).
• The SR2 workflow derives the point spread function of a specified HSI sensor based on nominal data acquisition parameters (e.g., integration time, altitude, speed), convolving it with a finer resolution HSI dataset for data simulation.
• To make the workflow approachable for end users, we provide a MATLAB function that implements the SR2 methodology.
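The core SR2 idea, blur with the coarser sensor's spatial response and then subsample, can be sketched in a few lines. The published workflow is a MATLAB function that derives the true PSF from acquisition parameters; the Python sketch below instead assumes a Gaussian PSF with a full width at half maximum of one coarse pixel, so both of those choices are illustrative assumptions.

```python
import numpy as np

def degrade_band(band, fine_gsd, coarse_gsd, psf_sigma_px=None):
    """Spatially degrade one HSI band to a coarser ground sample distance.

    Illustrative stand-in for the SR2 concept: convolve with the coarser
    sensor's point spread function (here a Gaussian, an assumption), then
    subsample at pixel centers of the coarse grid.
    """
    factor = int(round(coarse_gsd / fine_gsd))
    if psf_sigma_px is None:
        psf_sigma_px = factor / 2.355  # FWHM of one coarse pixel (assumption)
    # Build a normalized 1-D Gaussian kernel and convolve separably.
    radius = int(3 * psf_sigma_px) + 1
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / psf_sigma_px) ** 2)
    k /= k.sum()
    blurred = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, band)
    blurred = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, blurred)
    # Subsample to the coarse grid.
    return blurred[factor // 2 :: factor, factor // 2 :: factor]
```

Skipping the convolution and keeping only the subsampling line reproduces the "unrealistically sharp" nearest-neighbor result the abstract cautions against.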

3.
MethodsX ; 9: 101601, 2022.
Article in English | MEDLINE | ID: mdl-34984174

ABSTRACT

Our article describes a data processing workflow for hyperspectral imaging data to compensate for the water column in shallow, clear to moderate optical water types. We provide a MATLAB script that can be readily used to implement the described workflow. We break down each code segment of this script so that it is more approachable for use and modification by end users and data providers. The workflow initially implements the method for water column compensation described in Lyzenga (1978) and Lyzenga (1981), generating depth invariant indices from spectral band pairs. Given the high dimensionality of hyperspectral imaging data, an overwhelming number of depth invariant indices are generated in the workflow. As such, a correlation based feature selection methodology is applied to remove redundant depth invariant indices. In a post-processing step, a principal component transformation is applied, extracting features that account for a substantial amount of the variance from the non-redundant depth invariant indices while reducing dimensionality. To fully showcase the developed methodology and its potential for extracting bottom type information, we provide an example output of the water column compensation workflow using hyperspectral imaging data collected over the coast of Philpott's Island in Long Sault Parkway provincial park, Ontario, Canada.
• The workflow calculates depth invariant indices for hyperspectral imaging data to compensate for the water column in shallow, clear to moderate optical water types.
• The applied principal component transformation generates features that account for a substantial amount of the variance from the depth invariant indices while reducing dimensionality.
• The output (both depth invariant index image and principal component image) allows for the analysis of bottom type in shallow, clear to moderate optical water types.
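The first step of the workflow, Lyzenga's depth invariant index for a band pair, can be sketched as follows. The published implementation is a MATLAB script; this Python version shows only the band-pair calculation, and the synthetic inputs in the test assume deep-water-corrected radiances over a uniform bottom, as the method requires.

```python
import numpy as np

def depth_invariant_index(bi, bj):
    """Lyzenga (1981) depth invariant index for one spectral band pair.

    bi, bj: deep-water-corrected radiance samples over a uniform bottom
    at varying depth (1-D arrays). Returns (index values, attenuation
    coefficient ratio k_i/k_j). A simplified sketch of the method.
    """
    xi, xj = np.log(bi), np.log(bj)
    # Attenuation ratio from the variance-covariance of the log radiances.
    sii, sjj = xi.var(), xj.var()
    sij = np.cov(xi, xj, bias=True)[0, 1]
    a = (sii - sjj) / (2 * sij)
    ratio = a + np.sqrt(a * a + 1)  # k_i / k_j
    # The index is constant over a uniform bottom regardless of depth.
    return xi - ratio * xj, ratio
```

Applied across all band pairs of a hyperspectral cube, this is what produces the "overwhelming number" of indices that the correlation-based feature selection step then prunes.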

4.
MethodsX ; 8: 101429, 2021.
Article in English | MEDLINE | ID: mdl-34434852

ABSTRACT

Before pushbroom hyperspectral imaging (HSI) data can be applied in remote sensing applications, they must typically be preprocessed through radiometric correction, atmospheric compensation, geometric correction and spatial resampling procedures. After these preprocessing procedures, HSI data are conventionally given as georeferenced raster images. The raster data model compromises the spatial-spectral integrity of HSI data, leading to suboptimal results in various applications. Inamdar et al. (2021) developed a point cloud data format, the Directly-Georeferenced Hyperspectral Point Cloud (DHPC), that preserves the spatial-spectral integrity of HSI data more effectively than rasters. The DHPC is generated through a data fusion workflow that uses conventional preprocessing protocols with a modification to the digital surface model used in the geometric correction. Even with the additional elevation information, the DHPC is still stored with file sizes up to 13 times smaller than conventional rasters, making it ideal for data distribution. Our article aims to describe the DHPC data fusion workflow from Inamdar et al. (2021), providing all the required tools for its integration in pre-existing processing workflows. This includes a MATLAB script that can be readily applied to carry out the modification that must be made to the digital surface model used in the geometric correction. The MATLAB script first derives the point spread function of the HSI data and then convolves it with the digital surface model input in the geometric correction. By breaking down the MATLAB script and describing its functions, data providers can readily develop their own implementation if necessary. The derived point spread function is also useful for characterizing HSI data, quantifying the contribution of materials to the spectrum from any given pixel as a function of distance from the pixel center. Overall, our work makes the implementation of the DHPC data fusion workflow transparent and approachable for end users and data providers.
• Our article describes the Directly-Georeferenced Hyperspectral Point Cloud (DHPC) data fusion workflow, which can be readily implemented with existing processing protocols by modifying the input digital surface model used in the geometric correction.
• We provide a MATLAB function that performs the modification to the digital surface model required for the DHPC workflow. This MATLAB script derives the point spread function of the hyperspectral imager and convolves it with the digital surface model so that the elevation data are more spatially consistent with the hyperspectral imaging data as collected.
• We highlight the increased effectiveness of the DHPC over conventional raster end products in terms of spatial-spectral data integrity, data storage requirements, hyperspectral imaging application results and site exploration via virtual and augmented reality.
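One ingredient of such a PSF derivation, the along-track component driven by integration time, platform speed and ground sample distance, can be sketched as follows. This Python illustration models the smear as a simple boxcar in pixel units; the actual MATLAB function combines further optical and detector terms, so the boxcar model and parameter values are assumptions.

```python
import numpy as np

def along_track_psf(integration_time_s, speed_m_s, gsd_m, length=None):
    """Sketch of an along-track PSF for a pushbroom imager.

    During one integration period the platform advances
    integration_time_s * speed_m_s metres, smearing each image line.
    Modeled here as a 1-D boxcar with partial-pixel edges (a simplifying
    assumption for illustration only).
    """
    smear_px = integration_time_s * speed_m_s / gsd_m
    if length is None:
        length = int(np.ceil(smear_px)) | 1  # force an odd kernel length
    x = np.arange(length) - length // 2
    # Boxcar of width smear_px centered on zero, with fractional edges.
    psf = np.clip(smear_px / 2 + 0.5 - np.abs(x), 0, 1)
    return psf / psf.sum()  # normalize so convolution preserves totals
```

Convolving a kernel like this with the digital surface model (e.g. `np.convolve(dsm_row, psf, mode="same")` per row) is the spirit of the modification the workflow applies, making the elevation data spatially consistent with the imagery as collected.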

5.
MethodsX ; 8: 101471, 2021.
Article in English | MEDLINE | ID: mdl-34434871

ABSTRACT

Airborne remotely sensed data (e.g. hyperspectral imagery, thermal videography, full frame RGB photography) often require post-processing to be combined into a series of images or a mosaic for analysis. This is generally accomplished through the use of position and attitude hardware (i.e. a Global Navigation Satellite System (GNSS) and Inertial Measurement Unit (IMU)) in combination with specialized software. Occasionally, hardware failure in the GNSS/IMU instrumentation occurs; however, the data are still recoverable through a correction process that allows image registration to mosaic the data. Here we present a simple and flexible MATLAB® code package developed to combine video-based remotely sensed data. It first applies an iterative image registration process to align all frames, using pre-existing GPS information if supplied by the user, and then grids the frame data together to produce a final, single mosaic dataset that can be used for analysis. An example of this method using airborne infrared video data of a wildfire is shown as a demonstration.
• MATLAB functions are easily adaptable to specific user needs and datasets.
• The method outputs the combined data and positional information in three separate MATLAB variables that can be readily used for analysis in MATLAB or exported for use in other software.
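One common way to implement the frame-to-frame registration step is phase correlation, sketched below in Python. This is an illustrative stand-in, not the authors' MATLAB algorithm: it recovers integer cyclic shifts only, whereas the published package iterates over full video sequences and handles real frame overlap.

```python
import numpy as np

def estimate_shift(ref, frame):
    """Estimate the integer (row, col) translation of `frame` relative
    to `ref` by phase correlation (illustrative registration step)."""
    # Normalized cross-power spectrum; its inverse FFT peaks at the shift.
    cross = np.conj(np.fft.fft2(ref)) * np.fft.fft2(frame)
    cross /= np.abs(cross) + 1e-12
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Indices past the midpoint wrap around to negative shifts.
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)
```

Accumulating these pairwise offsets places each frame on a common grid, after which the per-pixel frame values can be averaged into the final mosaic.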

6.
Carbon Balance Manag ; 9(1): 9, 2014 Dec.
Article in English | MEDLINE | ID: mdl-25243018

ABSTRACT

BACKGROUND: The high spatio-temporal variability of aboveground biomass (AGB) in tropical forests is a large source of uncertainty in forest carbon stock estimation. Due to their spatial distribution and sampling intensity, pre-felling inventories are a potential source of ground level data that could help reduce this uncertainty at larger spatial scales. Further, exploring the factors known to influence tropical forest biomass, such as wood density and large tree density, will improve our knowledge of biomass distribution across tropical regions. Here, we evaluate (1) the variability of wood density and (2) the variability of AGB across five ecosystems of Costa Rica. RESULTS: Using forest management (pre-felling) inventories we found that, of the regions studied, Huetar Norte had the highest mean wood density of trees with a diameter at breast height (DBH) greater than or equal to 30 cm, 0.623 ± 0.182 g cm⁻³ (mean ± standard deviation). Although the greatest wood density was observed in Huetar Norte, the highest mean estimated AGB (EAGB) of trees with a DBH greater than or equal to 30 cm was observed in the Osa Peninsula (173.47 ± 60.23 Mg ha⁻¹). The density of large trees explained approximately 50% of EAGB variability across the five ecosystems studied. Comparing our study's EAGB to published estimates reveals that, in the regions of Costa Rica where AGB has been previously sampled, our forest management data produced similar values. CONCLUSIONS: This study presents the most spatially rich analysis of ground level AGB data in Costa Rica to date. Using forest management data, we found that EAGB within and among five Costa Rican ecosystems is highly variable. Combining commercial logging inventories with ecological plots will provide a more representative ground level dataset for the calibration of the models and remotely sensed data used to estimate AGB at regional and national scales. Additionally, because the non-protected areas of the tropics offer the greatest opportunity to reduce rates of deforestation and forest degradation, logging inventories offer a promising source of data to support mechanisms such as the United Nations REDD+ (Reducing Emissions from Deforestation and Forest Degradation) program.
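For context, single-tree AGB estimates of the kind aggregated in such inventories are typically computed from DBH and wood density with a pantropical allometric model. The sketch below uses the Chave et al. (2005) moist-forest, DBH-only equation as an example; the abstract does not state which model the authors applied, so both the equation choice and the sample values are illustrative.

```python
import math

def agb_kg(dbh_cm, wood_density_g_cm3):
    """Estimated aboveground biomass (kg) of a single tree.

    Uses the Chave et al. (2005) moist-forest allometry without height
    (an example model, not necessarily the one used in the study):
    AGB = rho * exp(-1.499 + 2.148 ln D + 0.207 (ln D)^2 - 0.0281 (ln D)^3)
    """
    d = math.log(dbh_cm)
    return wood_density_g_cm3 * math.exp(
        -1.499 + 2.148 * d + 0.207 * d ** 2 - 0.0281 * d ** 3
    )

# A tree at the 30 cm DBH inventory threshold, using the reported Huetar
# Norte mean wood density of 0.623 g cm^-3 (illustrative input values):
tree_agb = agb_kg(30, 0.623)
```

Summing such per-tree estimates over an inventory plot and dividing by plot area yields the per-hectare EAGB values (Mg ha⁻¹) that the study compares across ecosystems.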
