Results 1 - 3 of 3
1.
Appl Opt ; 63(12): 3015-3028, 2024 Apr 20.
Article in English | MEDLINE | ID: mdl-38856445

ABSTRACT

The accuracy of the absolute radiometric calibration (RadCal) for remote sensing instruments is essential to their wide range of applications. The uncertainty associated with the traditional source-based RadCal method is assessed at a 2% (k=1) or higher level for radiance measurement. To further improve the accuracy to meet the demands of climate studies, a detector-based approach using tunable lasers as a light source has been devised. The Goddard Laser for Absolute Measurement of Radiance, known as the GLAMR system, is a notable example of the incorporation of such technology. Using transfer radiometers calibrated at the National Institute of Standards and Technology as calibration standards, the absolute spectral response function of a remote sensing instrument is measured with its uncertainty traceable to the International System of Units. This paper presents a comprehensive uncertainty analysis of the detector-based absolute RadCal using the GLAMR system. It identifies and examines uncertainty sources during the GLAMR RadCal test, including those from the GLAMR system, the testing configuration, and data processing methodologies. Analysis is carried out to quantify the contribution of each source and emphasize the most influential factors. It is shown that the calibration uncertainty of GLAMR RadCal can be better than 0.3% (k=1) in the wavelength range of 350-950 nm and 0.6% (k=1) between 950 and 2300 nm, with the exception of regions with strong water absorption. In addition, recommendations are made to refine the calibration process to further reduce the uncertainty.
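The k=1 figures quoted above are combined standard uncertainties, conventionally formed by adding independent component contributions in quadrature (root-sum-square). A minimal sketch of that combination step, with an entirely illustrative component budget (the values below are not from the paper):

```python
import math

def combined_standard_uncertainty(components):
    """Combine independent standard (k=1) uncertainty components
    in quadrature (root-sum-square), per standard GUM practice."""
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical component budget, in percent (k=1): e.g. transfer-radiometer
# calibration, source stability, test geometry, data processing.
budget = [0.2, 0.1, 0.15, 0.1]
total = combined_standard_uncertainty(budget)
print(f"combined k=1 uncertainty: {total:.2f}%")
```

The combined value is dominated by the largest components, which is why the abstract's analysis emphasizes identifying and reducing the most influential factors.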

2.
Mar Pollut Bull ; 196: 115558, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37757532

ABSTRACT

The Geostationary Littoral Imaging and Monitoring Radiometer (GLIMR) will provide unique high-temporal-frequency observations of United States coastal waters to quantify processes that vary on short temporal and spatial scales. The frequency and coverage of observations from geostationary orbit will improve quantification and reduce uncertainty in tracking water quality events such as harmful algal blooms and oil spills. This study examines the potential for GLIMR to complement existing satellite platforms from its unique geostationary viewpoint for water quality and oil spill monitoring, with a focus on temporal and spatial resolution. Water quality measures derived from satellite imagery, such as harmful algal blooms, thick oil, and oil emulsions, are observable with glint <0.005 sr^-1, while oil films require glint >10^-5 sr^-1. Daily imaging hours range from 6 to 12 h for water quality measures and 0 to 6 h for oil film applications throughout the year, as defined by sun glint strength. Spatial pixel resolution is 300 m at nadir and the median pixel resolution is 391 m across the entire field of regard, with higher spatial resolution across all spectral bands in the Gulf of Mexico than existing satellites, such as MODIS and VIIRS, used for oil spill surveillance reports. The potential for beneficial glint use in oil film detection and quality flagging for other water quality parameters was greatest at lower latitudes and changed location throughout the day from the West and East Coasts of the United States. GLIMR scan times can change from the planned ocean color default of 0.763 s depending on the signal-to-noise ratio application requirement and can match existing and future satellite mission regions of interest to leverage multi-mission observations.
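The two glint thresholds quoted in the abstract define overlapping detectability regimes. A minimal sketch of that decision logic, assuming a per-pixel glint radiance reflectance in sr^-1 (the function name and labels are illustrative, not from the paper):

```python
def glint_detectability(glint_sr):
    """Classify which observables are detectable at a given sun-glint
    radiance reflectance (sr^-1), using the abstract's thresholds:
    water quality / thick oil / emulsions need glint < 0.005 sr^-1,
    while oil films need glint > 10^-5 sr^-1."""
    detectable = []
    if glint_sr < 0.005:
        detectable.append("water quality / thick oil / emulsions")
    if glint_sr > 1e-5:
        detectable.append("oil films")
    return detectable

# Moderate glint falls in both regimes; near-zero glint rules out films.
print(glint_detectability(1e-3))
print(glint_detectability(1e-6))
print(glint_detectability(0.01))
```

Note that for glint between 10^-5 and 0.005 sr^-1 both classes are observable, which is what makes beneficial glint use possible at lower latitudes.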


Subject(s)
Oil Pollution , Water Quality , United States , Satellite Imagery , Harmful Algal Blooms , Gulf of Mexico , Environmental Monitoring/methods
3.
Remote Sens (Basel) ; 8(2)2016 Feb.
Article in English | MEDLINE | ID: mdl-32818076

ABSTRACT

The VIIRS instrument on board the S-NPP spacecraft has operated successfully for more than four years since its launch in October 2011. Many VIIRS environmental data records (EDR) have been continuously generated from its sensor data records (SDR) with improved quality, enabling a wide range of applications in support of users in both the operational and research communities. This paper provides a brief review of sensor on-orbit calibration methodologies for both the reflective solar bands (RSB) and the thermal emissive bands (TEB), and an overall assessment of their on-orbit radiometric performance using measurements from the instrument's on-board calibrators (OBC) as well as regularly scheduled lunar observations. It describes and illustrates changes made, and planned, to improve calibration and data quality. Throughout the mission, all of the OBC have continued to operate and function normally, allowing critical calibration parameters used in the data production systems to be derived and updated. The temperatures of the on-board blackbody (BB) and the cold focal plane assemblies are controlled with excellent stability. Despite large optical throughput degradation discovered shortly after launch in several near- and short-wave infrared spectral bands, and strong wavelength-dependent solar diffuser degradation, the overall VIIRS performance has continued to meet its design requirements. Also discussed are the challenging issues identified and the efforts planned to further enhance sensor calibration and characterization, thereby maintaining or improving data quality.
