Sex classification of 3D skull images using deep neural networks.
Noel, Lake; Fat, Shelby Chun; Causey, Jason L; Dong, Wei; Stubblefield, Jonathan; Szymanski, Kathryn; Chang, Jui-Hsuan; Wang, Paul Zhiping; Moore, Jason H; Ray, Edward; Huang, Xiuzhen.
Affiliation
  • Noel L; Department of Computational Biomedicine, Cedars Sinai Medical Center, Los Angeles, CA, USA.
  • Fat SC; Department of Surgery, Cedars Sinai Medical Center, Los Angeles, CA, USA.
  • Causey JL; Center for No-Boundary Thinking (CNBT), Arkansas State University, Jonesboro, AR, USA.
  • Dong W; Department of Computer Science, Arkansas State University, Jonesboro, AR, USA.
  • Stubblefield J; Ann Arbor Algorithms, Ann Arbor, MI, USA.
  • Szymanski K; Center for No-Boundary Thinking (CNBT), Arkansas State University, Jonesboro, AR, USA.
  • Chang JH; Department of Computer Science, Arkansas State University, Jonesboro, AR, USA.
  • Wang PZ; School of Medicine, Creighton University, Omaha, NE, USA.
  • Moore JH; Department of Computational Biomedicine, Cedars Sinai Medical Center, Los Angeles, CA, USA.
  • Ray E; Department of Computational Biomedicine, Cedars Sinai Medical Center, Los Angeles, CA, USA.
  • Huang X; Department of Computational Biomedicine, Cedars Sinai Medical Center, Los Angeles, CA, USA. Jason.Moore@csmc.edu.
Sci Rep; 14(1): 13707, 2024 Jun 14.
Article in En | MEDLINE | ID: mdl-38877045
ABSTRACT
Determining the fundamental characteristics that define a face as "feminine" or "masculine" has long fascinated anatomists and plastic surgeons, particularly those involved in aesthetic and gender-affirming surgery. Previous studies in this area have relied on manual measurements, comparative anatomy, and heuristic landmark-based feature extraction. In this study, we retrospectively collected a dataset of 98 skull samples at Cedars Sinai Medical Center (CSMC), the first dataset of its kind for 3D medical imaging. We then evaluated the accuracy of multiple deep neural network architectures on sex classification with this dataset. Specifically, we evaluated methods representing three different 3D data modeling approaches: ResNet3D, PointNet++, and MeshNet. Despite the limited number of imaging samples, our testing results show that all three approaches achieve AUC scores above 0.9 after convergence. PointNet++ exhibits the highest accuracy, while MeshNet has the lowest. Our findings suggest that accuracy depends not solely on the sparsity of the data representation but also on the architecture design, with MeshNet's lower accuracy likely due to its lack of a hierarchical structure for progressive data abstraction. Furthermore, we studied a problem related to sex determination: the analysis of the morphological features that affect sex classification. We proposed and developed a new method based on morphological gradients to visualize the features that influence model decision making. This method is an alternative to the standard saliency map and provides better visualization of feature importance. Our study is the first to develop and evaluate deep learning models for analyzing 3D facial skull images to identify imaging feature differences between individuals assigned male or female at birth. These findings may be useful for planning and evaluating craniofacial surgery, particularly gender-affirming procedures such as facial feminization surgery.
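For readers unfamiliar with gradient-based attribution, the sketch below illustrates the standard saliency-map baseline that the abstract contrasts with the proposed morphological-gradient method: the gradient of a class score with respect to the input voxels highlights regions that influence the decision. This is not the authors' implementation; the toy 3D classifier (TinySkullNet), the input shape, and all names are hypothetical assumptions for illustration only.

# Minimal sketch (assumptions noted above): standard input-gradient saliency
# for a volumetric sex classifier, using PyTorch.
import torch
import torch.nn as nn

class TinySkullNet(nn.Module):
    """Toy 3D CNN standing in for a ResNet3D-style classifier (assumed)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),
        )
        self.head = nn.Linear(8, 2)  # two classes: assigned female / male at birth

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.head(x)

def input_saliency(model, volume, target_class):
    """Return |d(class score)/d(voxel)| for one 3D volume (saliency map)."""
    model.eval()
    volume = volume.clone().requires_grad_(True)
    score = model(volume)[0, target_class]
    score.backward()
    return volume.grad.abs().squeeze(0)

if __name__ == "__main__":
    model = TinySkullNet()
    vol = torch.randn(1, 1, 32, 32, 32)  # hypothetical skull CT volume
    sal = input_saliency(model, vol, target_class=1)
    print(sal.shape)  # torch.Size([1, 32, 32, 32])

The resulting voxel-wise magnitudes can be overlaid on the skull volume to visualize which regions drive the classification; the paper's morphological-gradient approach is presented as an improvement over this kind of raw saliency map.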
Subjects
Keywords

Full text: 1 Collections: 01-international Database: MEDLINE Main subject: Skull / Neural Networks, Computer / Imaging, Three-Dimensional / Deep Learning Limits: Adult / Female / Humans / Male Language: En Journal: Sci Rep Publication year: 2024 Document type: Article Country of affiliation: United States
