Results 1 - 2 of 2
1.
Article in English | MEDLINE | ID: mdl-36082135

ABSTRACT

Atmospheric radiative transfer models (RTMs) simulate light propagation in the Earth's atmosphere. As RTMs evolve, their increasing complexity makes them impractical for routine processing such as atmospheric correction. To overcome their computational burden, standard practice is to interpolate a multidimensional lookup table (LUT) of prestored simulations. However, accurate interpolation relies on large LUTs, which still imply long computation times for their generation and interpolation. In recent years, emulation has been proposed as an alternative to LUT interpolation. Emulation approximates the RTM outputs with a statistical regression model trained on a small number of RTM runs. However, a key concern is whether the emulator reaches sufficient accuracy for atmospheric correction. We have therefore performed a systematic assessment of key aspects that affect the precision of emulating MODTRAN: 1) regression algorithm; 2) training database size; 3) dimensionality reduction (DR) method and number of components; and 4) spectral resolution. Gaussian process regression (GPR) was found to be the most accurate emulator. Principal component analysis remains a robust DR method, and about 20 components reach sufficient precision. Based on a database of 1000 samples covering a broad range of atmospheric conditions, GPR emulators can reconstruct the simulated spectral data with relative errors below 1% at the 95th percentile. These emulators reduce the processing time from days to minutes while preserving sufficient accuracy for atmospheric correction, and they additionally provide model uncertainties and derivatives. We provide a set of guidelines and tools to design and generate accurate emulators for satellite data processing applications.
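As a concrete illustration of the emulation scheme described in the abstract, the sketch below builds a PCA-plus-GPR emulator in Python with scikit-learn. It is a minimal example under stated assumptions: the input variables, the synthetic spectra, the kernel choice, and the sample sizes are hypothetical placeholders standing in for actual MODTRAN runs, not the configuration used in the study.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Synthetic stand-in for a MODTRAN training set: 1000 input vectors
# (e.g., aerosol optical thickness, water vapour, geometry) and the
# corresponding simulated spectra. Replace with real RTM runs.
rng = np.random.default_rng(0)
X = rng.uniform(size=(1000, 5))                        # hypothetical input variables
wavelengths = np.linspace(400.0, 2500.0, 2101)         # nm
spectra = np.stack([np.sin(x @ np.arange(1.0, 6.0) + wavelengths / 500.0) for x in X])

# 1) Compress the spectra to ~20 principal components (the DR step).
pca = PCA(n_components=20)
scores = pca.fit_transform(spectra)

# 2) Train a multi-output GPR emulator on the component scores.
kernel = ConstantKernel() * RBF(length_scale=np.ones(X.shape[1]))
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, scores)

# 3) Emulate: predict scores for new inputs and invert the PCA to
#    reconstruct full spectra without running the RTM again.
X_new = rng.uniform(size=(3, 5))
emulated_spectra = pca.inverse_transform(gpr.predict(X_new))
print(emulated_spectra.shape)                          # (3, 2101)
```

In this setup the regression is learned in the low-dimensional component space rather than per wavelength, which is what keeps training and prediction fast enough to replace LUT interpolation.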

2.
IEEE Trans Geosci Remote Sens ; 57(2): 1040-1048, 2019 Feb.
Article in English | MEDLINE | ID: mdl-36082240

ABSTRACT

Physically based radiative transfer models (RTMs) are widely used in Earth observation to understand the radiation processes occurring on the Earth's surface and their interactions with water, vegetation, and the atmosphere. Through continuous improvements, RTMs have increased in accuracy and in their ability to represent complex scenes, at the expense of greater complexity and computation time, making them impractical in various remote sensing applications. To overcome this limitation, the common practice is to precompute large lookup tables (LUTs) for later interpolation. To further reduce the RTM computation burden and the LUT interpolation error, we have developed a method that automatically selects the minimum, optimal set of input-output points (nodes) to include in an LUT. We present the gradient-based automatic LUT generator algorithm (GALGA), which relies on an acquisition function that incorporates: 1) the Jacobian evaluation of the RTM and 2) information about the multivariate distribution of the current nodes. We illustrate the capabilities of GALGA in the automatic construction and optimization of MODTRAN-based LUTs for input variable spaces of different dimensionality. Our results indicate that, compared with a pseudorandom homogeneous distribution of the LUT nodes, GALGA reduces: 1) the LUT size by more than 24%; 2) the computation time by 27%; and 3) the maximum interpolation relative errors by at least 10%. We conclude that automatic LUT design can benefit from the GALGA methodology to reduce interpolation errors and computation time in computationally expensive RTMs.
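To make the node-selection idea concrete, the sketch below implements a simplified greedy, acquisition-driven LUT builder in Python. The expensive_rtm function and the acquisition heuristic (a crude local-gradient proxy weighted by distance to the existing nodes) are illustrative stand-ins; GALGA's actual acquisition function, based on the RTM Jacobian and the multivariate distribution of the nodes, is not reproduced here.

```python
import numpy as np

def expensive_rtm(x):
    """Placeholder for one costly RTM run (e.g., a single MODTRAN simulation)."""
    return np.sin(4.0 * x[0]) * np.cos(3.0 * x[1])

def acquisition(candidates, nodes, values):
    """Toy acquisition: |value difference between a candidate's two nearest
    nodes| (a crude local-gradient proxy) weighted by the distance from the
    candidate to its nearest node (a space-filling term)."""
    d = np.linalg.norm(candidates[:, None, :] - nodes[None, :, :], axis=2)
    order = np.argsort(d, axis=1)
    nearest, second = order[:, 0], order[:, 1]
    grad_proxy = np.abs(values[nearest] - values[second])
    return d[np.arange(len(candidates)), nearest] * grad_proxy

rng = np.random.default_rng(1)
nodes = rng.uniform(size=(10, 2))                 # initial pseudorandom nodes
values = np.array([expensive_rtm(x) for x in nodes])

# Grow the LUT one node at a time, spending each new RTM run where the
# acquisition value is highest.
for _ in range(20):
    candidates = rng.uniform(size=(200, 2))
    best = candidates[np.argmax(acquisition(candidates, nodes, values))]
    nodes = np.vstack([nodes, best])
    values = np.append(values, expensive_rtm(best))

print(f"Final LUT size: {len(nodes)} nodes")
```

The design choice mirrored here is that each new node is placed where the model output appears to vary strongly and the existing nodes are sparse, so the same interpolation accuracy is reached with fewer expensive RTM evaluations.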
