Machine learning efficiently corrects LIBS spectrum variation due to change of laser fluence.
Opt Express
; 28(10): 14345-14356, 2020 May 11.
Article in English
| MEDLINE
| ID: mdl-32403475
ABSTRACT
This work demonstrates the efficiency of machine learning in correcting spectral intensity variations in laser-induced breakdown spectroscopy (LIBS) due to changes in the laser pulse energy; such changes can occur over a wide range, from 7.9 to 71.1 mJ in our experiment. The developed multivariate correction model led to a precise determination of the concentration of a minor element (magnesium, for instance) in the samples (aluminum alloys in this work), with a precision of 6.3% (relative standard deviation, RSD), using LIBS spectra affected by the laser pulse energy change. A comparison with the classical univariate corrections based on laser pulse energy, total spectral intensity, ablation crater volume, and plasma temperature further highlights the significance of the developed method.
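The contrast between the two correction strategies can be sketched on synthetic data. This is not the authors' code: the channel count, the magnesium line position, the spectral model, and the noise level are all hypothetical, and plain ridge regression stands in for whatever machine-learning model the paper develops. Only the 7.9-71.1 mJ pulse-energy range is taken from the abstract. The univariate correction divides the analyte line by pulse energy; the multivariate model regresses concentration on the full spectrum, whose other channels encode the energy variation, so the correction is learned implicitly.

```python
import numpy as np

rng = np.random.default_rng(0)

n_spectra, n_channels, mg_line = 200, 50, 10    # hypothetical sizes
energy = rng.uniform(7.9, 71.1, n_spectra)      # pulse energy range from the abstract, mJ
conc = rng.uniform(0.1, 5.0, n_spectra)         # hypothetical Mg concentration, wt%

# Toy spectra: every channel scales with pulse energy; the Mg line
# additionally scales with the Mg concentration.
base = rng.uniform(0.5, 1.5, n_channels)
X = energy[:, None] * base[None, :]
X[:, mg_line] += 0.2 * energy * conc
X += rng.normal(0.0, 0.5, X.shape)

# Univariate correction: normalize the analyte line by pulse energy.
mg_corrected = X[:, mg_line] / energy

# Multivariate correction: ridge regression on the full spectrum.
# The non-analyte channels carry the energy-dependent variation, so the
# model compensates for it without an explicit normalization step.
Xb = np.hstack([X, np.ones((n_spectra, 1))])    # append a bias column
lam = 1.0
w = np.linalg.solve(Xb.T @ Xb + lam * np.eye(Xb.shape[1]), Xb.T @ conc)
pred = Xb @ w

print("corr(raw Mg line, conc):       %.2f" % np.corrcoef(X[:, mg_line], conc)[0, 1])
print("corr(energy-normalized, conc): %.2f" % np.corrcoef(mg_corrected, conc)[0, 1])
print("corr(multivariate pred, conc): %.2f" % np.corrcoef(pred, conc)[0, 1])
```

In this toy setting both corrections recover the concentration far better than the raw line intensity; the paper's point is that the multivariate model keeps working across a wide energy range where the simple univariate normalizations degrade.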
Full text:
1
Collections:
01-internacional
Database:
MEDLINE
Language:
En
Journal:
Opt Express
Journal subject:
OPHTHALMOLOGY
Year of publication:
2020
Document type:
Article