Jaffar, Anum ORCID: https://orcid.org/0009-0006-8016-8429, Ali, Sara ORCID: https://orcid.org/0000-0002-5100-9430, Fahad Iqbal, Khawaja ORCID: https://orcid.org/0000-0001-6711-2574, Ayaz, Yasar ORCID: https://orcid.org/0000-0002-2425-9063, Ansari, Ali R ORCID: https://orcid.org/0000-0001-5090-7813, Fayyaz, Muhammad A B ORCID: https://orcid.org/0000-0002-1794-3000 and Nawaz, Raheel (2024) A comprehensive multimodal humanoid system for personality assessment based on the Big Five model. IEEE Access, 12. pp. 84261-84272. ISSN 2169-3536
Published Version
Available under License Creative Commons Attribution Non-commercial No Derivatives.
Abstract
Personality analysis allows experts to gain insights into an individual's conduct, vulnerabilities, and prospective capabilities. Common methods for personality prediction include text analysis, social media data, facial expressions, and emotional speech extraction. Recently, some studies have used the Big Five model to predict personality traits from non-verbal cues (gaze score, body motion, head motion). However, these studies mostly target only three aspects of the Big Five model; none so far have used non-verbal cues to target all five traits (extraversion, openness, neuroticism, agreeableness, and conscientiousness). In this paper, we propose a multimodal system that predicts all five traits of the Big Five model using non-verbal cues (facial expressions, head poses, body poses), the 44-item Big Five Inventory (BFI) questionnaire, and expert analysis. The facial expression module uses a Convolutional Neural Network (CNN) trained on the Face Emotion Recognition Plus (FER+) dataset, achieving 95.14% accuracy. Evaluating 16 subjects in verbal interaction with the humanoid robot NAO, we combined questionnaire feedback, human-robot interaction data, and expert perspectives to deduce their Big Five traits. Findings reveal 100% accuracy in personality prediction via expert insights and the proposed system, and 75% for the questionnaire-based approach.
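The abstract mentions the 44-item BFI questionnaire as one of the three inputs to the system. As a minimal sketch of how BFI-style scoring typically works: each item is rated on a 1-5 Likert scale, reverse-keyed items are recoded as 6 minus the rating, and each trait score is the mean of its items. The item-to-trait mapping below is a hypothetical illustration, not the official 44-item BFI key, and `score_bfi` is an assumed helper name, not part of the paper's system.

```python
# Illustrative BFI-style trait scoring. The item subsets and reverse-key
# flags below are hypothetical, chosen only to demonstrate the mechanics.

TRAIT_ITEMS = {
    # trait: [(item_number, reverse_keyed), ...]
    "extraversion":      [(1, False), (6, True), (11, False)],
    "agreeableness":     [(2, True), (7, False), (12, True)],
    "conscientiousness": [(3, False), (8, True), (13, False)],
    "neuroticism":       [(4, False), (9, True), (14, False)],
    "openness":          [(5, False), (10, False), (15, False)],
}

def score_bfi(responses):
    """responses: dict mapping item number -> rating on a 1-5 Likert scale.

    Returns a dict of trait -> mean score, with reverse-keyed items
    recoded as 6 - rating before averaging.
    """
    scores = {}
    for trait, items in TRAIT_ITEMS.items():
        values = [6 - responses[i] if rev else responses[i] for i, rev in items]
        scores[trait] = sum(values) / len(values)
    return scores

# Example: a respondent who answers 3 ("neutral") everywhere except item 1,
# an extraversion item, where they answer 5 ("strongly agree").
responses = {i: 3 for i in range(1, 16)}
responses[1] = 5
print(round(score_bfi(responses)["extraversion"], 2))
```

The reverse-keying step matters because roughly half of the BFI items are phrased negatively; averaging raw ratings without recoding them would cancel out genuine trait signal.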