Development of a Non-Contact Sensor System for Converting 2D Images into 3D Body Data: A Deep Learning Approach to Monitor Obesity and Body Shape in Individuals in Their 20s and 30s.
Sensors (Basel)
; 24(1); 2024 Jan 02.
Article
in En
| MEDLINE
| ID: mdl-38203129
ABSTRACT
This study demonstrates how to generate a three-dimensional (3D) body model from a small number of images and derive body measurements close to the actual values from the generated 3D body data. A 3D body model suitable for body type diagnosis was developed using two full-body photographs, front and side, taken with a mobile phone. For training, 400 3D body datasets (200 male, 200 female) provided by Size Korea were used, and four models were analyzed and compared: the 3D recurrent reconstruction neural network, the point cloud generative adversarial network, the skinned multi-person linear model, and the pixel-aligned implicit function for high-resolution 3D human digitization. A total of 10 men and women were analyzed, and the corresponding 3D models were verified by comparing the body measurements derived from the 2D image inputs with those obtained using an actual body scanner. Unlike the other 3D generation models examined in this study, which could not be used to derive body measurements, the proposed model successfully derived various body measurements, indicating that it can be used to identify various body types and monitor obesity in the future.
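For illustration, the verification step described above, comparing body measurements extracted from the image-derived 3D model against body-scanner values, could be expressed along the following lines. This is a minimal sketch, not the authors' code: the measurement names, the `predicted` and `scanner` dictionaries, and the choice of error metrics (mean absolute difference and mean absolute percentage error) are illustrative assumptions, not data or methods reported in the study.

```python
# Minimal sketch of the verification step: compare body measurements
# derived from the 2D-image-based 3D model with body-scanner values.
# Measurement names and numbers are illustrative placeholders only.
import numpy as np

# Hypothetical measurements (in cm) for one subject.
predicted = {"height": 174.2, "chest": 96.8, "waist": 82.1, "hip": 97.5}
scanner   = {"height": 173.6, "chest": 95.9, "waist": 83.0, "hip": 98.2}

def measurement_errors(pred: dict, truth: dict) -> dict:
    """Per-measurement absolute difference and an overall error summary."""
    keys = sorted(set(pred) & set(truth))
    p = np.array([pred[k] for k in keys], dtype=float)
    t = np.array([truth[k] for k in keys], dtype=float)
    abs_diff = np.abs(p - t)                     # absolute difference per measurement
    mape = float(np.mean(abs_diff / t) * 100.0)  # mean absolute percentage error
    return {
        "per_measurement_cm": {k: round(d, 2) for k, d in zip(keys, abs_diff)},
        "mean_abs_diff_cm": round(float(abs_diff.mean()), 2),
        "mape_percent": round(mape, 2),
    }

print(measurement_errors(predicted, scanner))
```

Repeating such a comparison over all ten subjects and all extracted measurements would yield the kind of model-versus-scanner difference the abstract uses for verification.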
Keywords
Full text:
1
Collections:
01-internacional
Database:
MEDLINE
Main subject:
Cell Phone
/
Deep Learning
Limits:
Female
/
Humans
/
Male
Language:
En
Journal:
Sensors (Basel)
Publication year:
2024
Document type:
Article