Identity-Guided Collaborative Learning for Cloth-Changing Person Reidentification.
IEEE Trans Pattern Anal Mach Intell ; 46(5): 2819-2837, 2024 May.
Article in English | MEDLINE | ID: mdl-38015700
Cloth-changing person reidentification (ReID) is a newly emerging research topic aimed at addressing the large feature variations caused by clothing changes and pedestrian view/pose changes. Although significant progress has been achieved by introducing extra information (e.g., human contour sketches, human body keypoints, and 3D human information), cloth-changing person ReID remains challenging because pedestrian appearance can change at any time. Moreover, human semantic information and pedestrian identity information are not fully exploited. To solve these issues, we propose a novel identity-guided collaborative learning scheme (IGCL) for cloth-changing person ReID, in which human semantics are effectively utilized and the unchangeable identity guides collaborative learning. First, we design a novel clothing attention degradation stream to reasonably reduce the interference caused by clothing information, employing clothing attention and mid-level collaborative learning. Second, we propose a human semantic attention and body jigsaw stream to highlight human semantic information and simulate different poses of the same identity. In this way, the extracted features not only focus on human semantic information unrelated to the background but are also robust to pedestrian pose variations. Moreover, a pedestrian identity enhancement stream is proposed to strengthen the importance of identity and extract more robust identity features. Most importantly, all these streams are jointly explored in an end-to-end unified framework, and the identity is utilized to guide the optimization. Extensive experiments on six public cloth-changing person ReID datasets (LaST, LTCC, PRCC, NKUP, Celeb-reID-light, and VC-Clothes) demonstrate the superiority of the IGCL method.
It outperforms existing methods on multiple datasets, and the extracted features have stronger representation and discrimination ability and are weakly correlated with clothing.
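The abstract describes three parallel streams trained jointly under a single identity supervision signal. A minimal sketch of that idea (not the authors' code; the linear "streams", layer sizes, and concatenation-based fusion below are illustrative assumptions standing in for the CNN branches of IGCL):

```python
# Toy sketch of the IGCL multi-stream idea: three parallel streams extract
# features from the same input, and the fused representation is what a shared
# identity loss would supervise end to end. All sizes are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

def stream(x, w):
    """One feature stream: a linear map with ReLU (stand-in for a CNN branch)."""
    return np.maximum(x @ w, 0.0)

def igcl_forward(image_feat, weights):
    """Run the three streams on the same input and fuse by concatenation."""
    clothing_deg = stream(image_feat, weights["clothing"])   # clothing attention degradation
    semantic_jig = stream(image_feat, weights["semantic"])   # semantic attention + body jigsaw
    identity_enh = stream(image_feat, weights["identity"])   # pedestrian identity enhancement
    return np.concatenate([clothing_deg, semantic_jig, identity_enh])

d_in, d_out = 8, 4
weights = {k: rng.standard_normal((d_in, d_out))
           for k in ("clothing", "semantic", "identity")}
x = rng.standard_normal(d_in)
feat = igcl_forward(x, weights)
print(feat.shape)  # fused feature from all three streams: (12,)
```

In the actual framework, a classification loss on the pedestrian identity would back-propagate through all three streams jointly, which is what "identity-guided collaborative learning" refers to.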
Subject(s)

Full text: 1 Database: MEDLINE Main subject: Pedestrians / Interdisciplinary Practices Limit: Humans Language: En Journal: IEEE Trans Pattern Anal Mach Intell Journal subject: MEDICAL INFORMATICS Year: 2024 Document type: Article
