ASMNet: Action and Style-Conditioned Motion Generative Network for 3D Human Motion Generation.
Li, Zongying; Wang, Yong; Du, Xin; Wang, Can; Koch, Reinhard; Liu, Mengyuan.
Affiliation
  • Li Z; School of Artificial Intelligence, Chongqing University of Technology, Chongqing, China.
  • Wang Y; School of Artificial Intelligence, Chongqing University of Technology, Chongqing, China.
  • Du X; School of Artificial Intelligence, Chongqing University of Technology, Chongqing, China.
  • Wang C; Multimedia Information Processing Laboratory at the Department of Computer Science, Kiel University, Kiel, Germany.
  • Koch R; Advanced Institute of Information Technology (AIIT), Peking University, Hangzhou, China.
  • Liu M; Hangzhou Linxrobot Co. Ltd., Hangzhou, China.
Cyborg Bionic Syst; 5: 0090, 2024.
Article in En | MEDLINE | ID: mdl-38348153
ABSTRACT
Extensive research has explored human motion generation, but the generated sequences are influenced by different motion styles. For instance, walking with joy and walking with sorrow produce distinctly different character motions. Because capturing motion with specific styles is difficult, the data available for style research are also limited. To address these problems, we propose ASMNet, an action- and style-conditioned motion generative network. This network ensures that the generated human motion sequences not only comply with the provided action label but also exhibit distinctive stylistic features. To extract motion features from human motion sequences, we design a spatial-temporal extractor. Moreover, we use an adaptive instance normalization layer to inject style into the target motion. Our results are comparable to those of state-of-the-art approaches and show clear advantages in both quantitative and qualitative evaluations. The code is available at https://github.com/ZongYingLi/ASMNet.git.
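The abstract mentions injecting style via an adaptive instance normalization (AdaIN) layer. Below is a minimal, hedged sketch of how such a layer is commonly implemented in PyTorch for sequence features; the layer name, tensor shapes, and the affine head are illustrative assumptions, not the authors' actual implementation.

    import torch
    import torch.nn as nn

    class AdaIN(nn.Module):
        """Adaptive instance normalization: normalize content features,
        then re-scale/shift them with statistics predicted from a style code.
        (Illustrative sketch; shapes and style_dim are assumptions.)"""
        def __init__(self, style_dim: int, num_features: int):
            super().__init__()
            # Hypothetical affine head mapping the style embedding to
            # per-channel scale (gamma) and shift (beta).
            self.affine = nn.Linear(style_dim, num_features * 2)

        def forward(self, content: torch.Tensor, style: torch.Tensor) -> torch.Tensor:
            # content: (batch, channels, time), style: (batch, style_dim)
            gamma, beta = self.affine(style).chunk(2, dim=1)
            mean = content.mean(dim=2, keepdim=True)
            std = content.std(dim=2, keepdim=True) + 1e-5
            normalized = (content - mean) / std
            return gamma.unsqueeze(-1) * normalized + beta.unsqueeze(-1)

    # Usage sketch: inject a style code into per-frame motion features.
    features = torch.randn(4, 256, 60)   # batch of motion feature sequences
    style_code = torch.randn(4, 64)      # one style embedding per sequence
    stylized = AdaIN(style_dim=64, num_features=256)(features, style_code)

The key design point is that normalization removes the instance-specific statistics of the content features, and the style embedding supplies replacement statistics, which is how style conditioning is typically injected without altering the action content.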

Full text: 1 Collections: 01-international Database: MEDLINE Study type: Qualitative_research Language: En Publication year: 2024 Document type: Article