Human gender estimation from CT images of skull using deep feature selection and feature fusion.
Sci Rep; 14(1): 16879, 2024 07 23.
Article
| MEDLINE
| ID: mdl-39043755
ABSTRACT
This study estimates gender from skull computed tomography (CT) images, given the central role of gender identification in forensic identification. The dataset comprises skull CT images from 218 male and 203 female subjects, a total of 421 individuals aged 25 to 65 years. Using deep learning, convolutional neural network (CNN) models extract deep features from the skull CT images. Applying deep learning algorithms directly to the image data yields an accuracy of 96.4%, with a gender-estimation precision of 96.1% for male and 96.8% for female individuals. Precision varies with the number of selected features: 95.0%, 95.5%, and 96.2% for 100, 300, and 500 features, respectively, and 96.4% for all 1000 features without feature selection. Notably, gender estimation from radiographic images reduces measurement discrepancies between experts and produces estimates faster. Based on these findings, the choice of CNN model, the configuration of the classifier, and the selection of features are the key determinants of the proposed method's performance.
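The pipeline described above — ranking CNN-extracted deep features and keeping only the top k (100, 300, 500, or all 1000) before classification — can be illustrated with a minimal sketch. This is not the paper's implementation: the random feature matrix stands in for CNN outputs, and the Fisher-score ranking and nearest-centroid classifier are illustrative assumptions, not the study's actual selector or classifier.

```python
import numpy as np

# Stand-in for CNN deep features: 421 subjects (218 male, 203 female,
# cohort sizes from the abstract) x 1000 features. Purely synthetic.
rng = np.random.default_rng(0)
n_male, n_female, n_feat = 218, 203, 1000
X = rng.normal(size=(n_male + n_female, n_feat))
y = np.array([0] * n_male + [1] * n_female)
X[y == 1, :50] += 1.0  # make the first 50 features class-informative

def fisher_score(X, y):
    """Per-feature between-class separation over within-class variance."""
    m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
    v0, v1 = X[y == 0].var(0), X[y == 1].var(0)
    return (m0 - m1) ** 2 / (v0 + v1 + 1e-12)

def select_top_k(X, y, k):
    """Indices of the k highest-scoring features."""
    return np.argsort(fisher_score(X, y))[::-1][:k]

def nearest_centroid_accuracy(X, y, idx):
    """Train/evaluate a nearest-centroid classifier on the selected features."""
    Xs = X[:, idx]
    c0, c1 = Xs[y == 0].mean(0), Xs[y == 1].mean(0)
    pred = (np.linalg.norm(Xs - c1, axis=1)
            < np.linalg.norm(Xs - c0, axis=1)).astype(int)
    return (pred == y).mean()

# Feature counts examined in the study: 100, 300, 500, and 1000 (no selection).
for k in (100, 300, 500, 1000):
    idx = select_top_k(X, y, k)
    print(k, round(nearest_centroid_accuracy(X, y, idx), 3))
```

On this synthetic data the informative features dominate the ranking, so accuracy stays high across all four feature counts; the reported 95.0–96.4% precision figures come from the study's real CT features, not from this toy setup.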
Keywords
Full text:
1
Databases:
MEDLINE
Main subject:
Skull
/
Tomography, X-Ray Computed
/
Sex Characteristics
/
Forensic Anthropology
Limits:
Adult
/
Aged
/
Female
/
Humans
/
Male
/
Middle Aged
Language:
En
Journal:
Sci Rep
Year:
2024
Document type:
Article
Affiliation country:
Turkey