Abstract
The automated recognition of human emotion, age, and gender from facial images is a significant area of research with applications in fields such as security, healthcare, and human-computer interaction. Although numerous systems exist for these tasks, their accuracy often remains unsatisfactory, and identifying robust methods is still a challenge. This study proposes a novel deep learning approach based on a two-tier architecture that combines a Convolutional Neural Network (CNN) for emotion recognition with a Local-Deep Neural Network (LDNN) for age and gender classification. The model was trained and evaluated on the AffectNet and UTKFace datasets and performed well in both training and real-time modes. The system identifies six basic emotions (happiness, sadness, anger, fear, disgust, and surprise), eight age ranges, and two genders. Our results show a significant improvement in emotion recognition accuracy over prior studies, validating the effectiveness of the proposed architecture.



