Gender-Driven English Speech Emotion Recognition with Genetic Algorithm.
Yue, Liya; Hu, Pei; Zhu, Jiulong.
Affiliations
  • Yue L; Fanli Business School, Nanyang Institute of Technology, Nanyang 473004, China.
  • Hu P; School of Computer and Software, Nanyang Institute of Technology, Nanyang 473004, China.
  • Zhu J; Fanli Business School, Nanyang Institute of Technology, Nanyang 473004, China.
Biomimetics (Basel); 9(6), 2024 Jun 14.
Article in English | MEDLINE | ID: mdl-38921240
ABSTRACT
Speech emotion recognition based on gender holds great importance for achieving more accurate, personalized, and empathetic interactions in technology, healthcare, psychology, and the social sciences. In this paper, we present a novel gender-emotion model. First, gender and emotion features were extracted from voice signals to lay the foundation for our recognition model. Second, a genetic algorithm (GA) processed the high-dimensional features, with the Fisher score used for evaluation. Third, features were ranked by importance, and the GA was enhanced through novel crossover and mutation methods based on feature importance, increasing recognition accuracy. Finally, the proposed algorithm was compared with state-of-the-art algorithms on four common English datasets using support vector machines (SVM), and it demonstrated superior performance in accuracy, precision, recall, F1-score, number of selected features, and running time. The proposed algorithm faced challenges in distinguishing between neutral, sad, and fearful emotions, due to subtle vocal differences, overlapping pitch and tone variability, and similar prosodic features. Notably, the primary features for gender-based differentiation mainly involved mel frequency cepstral coefficients (MFCC) and log MFCC.
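The feature-selection pipeline described in the abstract (Fisher-score evaluation of features, then a GA whose crossover and mutation are biased by feature importance) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the fitness function here is a cheap Fisher-score-based stand-in for the paper's SVM evaluation, and all parameter values (population size, mutation rate, size penalty) are illustrative assumptions.

```python
import numpy as np

def fisher_score(X, y):
    """Per-feature Fisher score: between-class variance / within-class variance."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    num = np.zeros(X.shape[1])
    den = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        num += len(Xc) * (Xc.mean(axis=0) - overall_mean) ** 2
        den += len(Xc) * Xc.var(axis=0)
    return num / (den + 1e-12)

def ga_select(X, y, pop_size=20, generations=40, mut_rate=0.05, seed=0):
    """Binary-mask GA; mutation is biased by Fisher importance (a sketch,
    standing in for the paper's importance-based crossover/mutation)."""
    rng = np.random.default_rng(seed)
    n_feat = X.shape[1]
    imp = fisher_score(X, y)
    rank_bias = imp / (imp.max() + 1e-12)  # in [0, 1]; 1 = most important
    pop = rng.random((pop_size, n_feat)) < 0.5  # random initial masks

    def fitness(mask):
        # Stand-in fitness: total importance of selected features,
        # penalised by subset size (an SVM accuracy would go here).
        if not mask.any():
            return -np.inf
        return imp[mask].sum() - 0.05 * mask.sum()

    for _ in range(generations):
        scores = np.array([fitness(m) for m in pop])
        pop = pop[np.argsort(scores)[::-1]]      # sort best-first
        elite = pop[: pop_size // 2].copy()      # elitism keeps the best masks
        children = elite.copy()
        for i in range(len(children)):
            # uniform crossover between two random elite parents
            a, b = elite[rng.integers(len(elite), size=2)]
            child = np.where(rng.random(n_feat) < 0.5, a, b)
            # importance-biased mutation: mutated genes are switched on with
            # probability proportional to their Fisher importance
            flip = rng.random(n_feat) < mut_rate
            child = np.where(flip, rank_bias > rng.random(n_feat), child)
            children[i] = child
        pop = np.vstack([elite, children])

    scores = np.array([fitness(m) for m in pop])
    return pop[int(np.argmax(scores))]
```

On synthetic data where one feature cleanly separates the classes, the returned mask reliably keeps that feature while pruning most of the noise features, which is the behavior the paper's penalty on the number of selected features is designed to encourage.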
Keywords

Full text: 1 Collections: 01-international Database: MEDLINE Language: English Journal: Biomimetics (Basel) Publication year: 2024 Document type: Article