Implicit 3D Human Reconstruction Guided by Parametric Models and Normal Maps.
Ren, Yong; Zhou, Mingquan; Wang, Yifan; Feng, Long; Zhu, Qiuquan; Li, Kang; Geng, Guohua.
Affiliation
  • Ren Y; School of Information Science and Technology, Northwest University, Xi'an 710127, China.
  • Zhou M; National and Local Joint Engineering Research Center for Cultural Heritage Digitization, Xi'an 710127, China.
  • Wang Y; School of Information Science and Technology, Northwest University, Xi'an 710127, China.
  • Feng L; National and Local Joint Engineering Research Center for Cultural Heritage Digitization, Xi'an 710127, China.
  • Zhu Q; School of Information Science and Technology, Northwest University, Xi'an 710127, China.
  • Li K; National and Local Joint Engineering Research Center for Cultural Heritage Digitization, Xi'an 710127, China.
  • Geng G; School of Information Science and Technology, Northwest University, Xi'an 710127, China.
J Imaging; 10(6), 2024 May 29.
Article in English | MEDLINE | ID: mdl-38921610
ABSTRACT
Accurate and robust 3D human modeling from a single image remains challenging. Existing methods show promise, but they often fail to produce reconstructions that match the level of detail in the input image, and they struggle in particular with loose clothing. These methods typically employ parametric human models to constrain the reconstruction, which keeps the result from deviating too far from the body model and producing anomalies, but this constraint also limits the recovery of loose clothing. To address this issue, we propose IHRPN, an end-to-end method for reconstructing clothed humans from a single 2D image. The method includes an image semantic feature extraction module designed to achieve pixel-to-model-space consistency and to improve robustness to loose clothing. Features extracted from the input image are used to infer the SMPL-X mesh, which is then combined with a normal map to guide an implicit function that reconstructs the complete clothed human. Unlike traditional methods, we use local features for implicit surface regression. Experiments on the CAPE and AGORA datasets show that IHRPN performs well and that the reconstruction of loose clothing is noticeably more accurate and robust.
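To illustrate the kind of pipeline the abstract describes (an implicit occupancy function conditioned on pixel-aligned local image features, an SMPL-X body prior, and normal-map guidance), the sketch below shows a minimal PyTorch query head. All names, dimensions, and inputs are assumptions for illustration; this is not the authors' IHRPN implementation.

```python
# Hypothetical sketch: implicit surface regression from local (pixel-aligned) features,
# an SMPL-X signed-distance prior, and normal-map features. Not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ImplicitSurfaceHead(nn.Module):
    """MLP predicting inside/outside occupancy for 3D query points."""
    def __init__(self, img_feat_dim=256, normal_feat_dim=64):
        super().__init__()
        # input = image features + normal-map features + SMPL-X signed distance + query depth
        in_dim = img_feat_dim + normal_feat_dim + 1 + 1
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, 512), nn.ReLU(inplace=True),
            nn.Linear(512, 256), nn.ReLU(inplace=True),
            nn.Linear(256, 1),
        )

    def forward(self, img_feats, normal_feats, points_xy, points_z, smplx_sdf):
        # img_feats:    (B, C_img, H, W)  local image feature map
        # normal_feats: (B, C_nrm, H, W)  features from the predicted normal map
        # points_xy:    (B, N, 2)  query points projected to the image plane, in [-1, 1]
        # points_z:     (B, N, 1)  query depth in camera space
        # smplx_sdf:    (B, N, 1)  signed distance from each query to the SMPL-X mesh
        grid = points_xy.unsqueeze(1)                                   # (B, 1, N, 2)
        f_img = F.grid_sample(img_feats, grid, align_corners=True)      # (B, C_img, 1, N)
        f_nrm = F.grid_sample(normal_feats, grid, align_corners=True)   # (B, C_nrm, 1, N)
        f_img = f_img.squeeze(2).permute(0, 2, 1)                       # (B, N, C_img)
        f_nrm = f_nrm.squeeze(2).permute(0, 2, 1)                       # (B, N, C_nrm)
        x = torch.cat([f_img, f_nrm, smplx_sdf, points_z], dim=-1)
        return torch.sigmoid(self.mlp(x))                               # (B, N, 1) occupancy

# Toy usage with random tensors standing in for encoder outputs.
if __name__ == "__main__":
    B, N = 2, 4096
    head = ImplicitSurfaceHead()
    occ = head(torch.randn(B, 256, 128, 128), torch.randn(B, 64, 128, 128),
               torch.rand(B, N, 2) * 2 - 1, torch.randn(B, N, 1), torch.randn(B, N, 1))
    print(occ.shape)  # torch.Size([2, 4096, 1])
```

The mesh itself would then be extracted by evaluating this occupancy field on a dense grid and running marching cubes, as is standard for pixel-aligned implicit reconstruction methods.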
Keywords

Full text: 1 Collections: 01-international Database: MEDLINE Language: English Journal: J Imaging Publication year: 2024 Document type: Article Country of affiliation: China Country of publication: Switzerland