An Unsupervised Transfer Learning Framework for Visible-Thermal Pedestrian Detection.
Sensors (Basel); 22(12). 2022 Jun 10.
Article
En | MEDLINE | ID: mdl-35746199
Dual cameras with visible-thermal multispectral pairs capture both visual and thermal appearance, thereby enabling pedestrian detection around the clock in varied conditions and applications, including autonomous driving and intelligent transportation systems. However, because real-world scenarios vary greatly, the performance of a detector trained on a source dataset can change dramatically when it is evaluated on another dataset. A large amount of training data is often necessary to guarantee detection performance in a new scenario, and this data is typically labeled by human annotators, which is time-consuming, labor-intensive, and unscalable. To overcome this problem, we propose a novel unsupervised transfer learning framework for multispectral pedestrian detection, which adapts a multispectral pedestrian detector to the target domain based on pseudo training labels. In particular, auxiliary detectors are utilized, and different label fusion strategies are applied according to the estimated environmental illumination level. Intermediate domain images are generated by translating the source images to mimic the target ones, providing a better starting point for updating the parameters of the pedestrian detector. Experimental results on the KAIST and FLIR ADAS datasets demonstrate that the proposed method achieves new state-of-the-art performance without any manual training annotations on the target data.
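The abstract does not detail how the illumination-dependent label fusion is carried out, so the following is only a minimal sketch of one plausible realization: auxiliary visible and thermal detectors each produce boxes, a crude brightness cue decides which modality to trust, and the other modality contributes only confident, non-overlapping boxes as additional pseudo labels. All function names, thresholds, and the confirmation rule are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from dataclasses import dataclass
from typing import List


@dataclass
class Box:
    x1: float
    y1: float
    x2: float
    y2: float
    score: float


def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two boxes."""
    ix1, iy1 = max(a.x1, b.x1), max(a.y1, b.y1)
    ix2, iy2 = min(a.x2, b.x2), min(a.y2, b.y2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda c: (c.x2 - c.x1) * (c.y2 - c.y1)
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0


def estimate_illumination(visible_image: np.ndarray) -> float:
    """Hypothetical day/night cue: mean intensity of the visible frame, scaled to [0, 1]."""
    return float(visible_image.mean()) / 255.0


def fuse_pseudo_labels(vis_dets: List[Box], thermal_dets: List[Box],
                       illumination: float, day_thresh: float = 0.35,
                       iou_thresh: float = 0.5, add_score: float = 0.8) -> List[Box]:
    """Merge auxiliary-detector outputs into pseudo labels for the target domain.

    In bright scenes the visible detector is treated as the trusted modality;
    in dark scenes the thermal detector is. Boxes from the other modality are
    added only when they are confident and not already covered by a trusted box.
    """
    primary, secondary = ((vis_dets, thermal_dets) if illumination >= day_thresh
                          else (thermal_dets, vis_dets))
    fused = list(primary)
    for det in secondary:
        if det.score >= add_score and all(iou(det, p) < iou_thresh for p in fused):
            fused.append(det)
    return fused


# Example: a dark frame shifts trust to the thermal detections.
if __name__ == "__main__":
    night_frame = np.full((512, 640, 3), 20, dtype=np.uint8)  # dim visible image
    vis = [Box(100, 80, 140, 180, 0.55)]
    thr = [Box(102, 82, 141, 178, 0.90), Box(300, 90, 335, 190, 0.85)]
    labels = fuse_pseudo_labels(vis, thr, estimate_illumination(night_frame))
    print(len(labels), "pseudo labels")  # thermal boxes kept as the primary modality
```

The resulting pseudo labels would then serve to fine-tune the detector on the target domain, with the intermediate domain images (source images translated to resemble the target) used as the initialization data, as described in the abstract.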
Keywords:
Full text: 1
Collection: 01-internacional
Database: MEDLINE
Main subject: Automobile Driving / Pedestrians
Study type: Diagnostic_studies
Limit: Humans
Language: En
Journal: Sensors (Basel)
Year: 2022
Document type: Article
Affiliation country: Belgium
Country of publication: Switzerland