1.
Sensors (Basel) ; 24(4)2024 Feb 08.
Article in English | MEDLINE | ID: mdl-38400287

ABSTRACT

Accurate calibration between LiDAR and camera sensors is crucial for autonomous driving systems to perceive and understand the environment effectively. Typically, LiDAR-camera extrinsic calibration requires feature alignment and overlapping fields of view, and aligning features from different modalities can be challenging due to noise. Therefore, this paper proposes a targetless extrinsic calibration method for monocular cameras and LiDAR sensors with non-overlapping fields of view. The proposed solution uses pose transformations to establish data association across modalities; this turns the calibration problem into an optimization problem within a visual SLAM system without requiring overlapping views. To improve performance, line features serve as constraints in visual SLAM, and accurate positions of line segments are obtained with an extended photometric error optimization method. Moreover, a strategy is proposed for selecting an appropriate calibration method from among several alternative optimization schemes. This adaptive selection strategy ensures robust calibration performance in urban autonomous driving scenarios with varying lighting and environmental textures, while avoiding the failures and excessive bias that can result from relying on a single approach.
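The abstract does not spell out the underlying solver, but the core idea of using pose transformations to link sensors without a shared field of view can be illustrated with a classical hand-eye relation A·X = X·B between paired relative camera and LiDAR motions. The sketch below is only a minimal illustration of that idea, not the authors' method: the function name, the two-step rotation-then-translation solver, and the use of SciPy are assumptions for the example.

```python
import numpy as np
from scipy.spatial.transform import Rotation


def hand_eye_calibrate(cam_motions, lidar_motions):
    """Estimate a camera-to-LiDAR extrinsic X from paired relative motions.

    Illustrative sketch (not the paper's algorithm): solves A_i @ X = X @ B_i
    in the least-squares sense, where A_i are 4x4 relative camera poses (e.g.
    from visual SLAM) and B_i are relative LiDAR poses over the same time
    intervals. Only synchronized trajectories are needed, not overlapping views.
    """
    # Rotation part: the rotation axes satisfy alpha_i = R_X @ beta_i,
    # so R_X is the rotation that best aligns the two sets of axes.
    alphas = np.stack([Rotation.from_matrix(A[:3, :3]).as_rotvec() for A in cam_motions])
    betas = np.stack([Rotation.from_matrix(B[:3, :3]).as_rotvec() for B in lidar_motions])
    R_X, _ = Rotation.align_vectors(alphas, betas)
    R_X = R_X.as_matrix()

    # Translation part: (R_Ai - I) t_X = R_X t_Bi - t_Ai, stacked over all
    # motion pairs and solved by linear least squares.
    M = np.vstack([A[:3, :3] - np.eye(3) for A in cam_motions])
    b = np.concatenate([R_X @ B[:3, 3] - A[:3, 3]
                        for A, B in zip(cam_motions, lidar_motions)])
    t_X, *_ = np.linalg.lstsq(M, b, rcond=None)

    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R_X, t_X
    return X
```

In practice the motion pairs must excite at least two independent rotation axes for the problem to be well conditioned, which is one reason the paper embeds the estimation in a visual SLAM optimization with line-feature constraints and an adaptive choice among several schemes rather than relying on a single closed-form solve.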
