Real-Time Facial Emotion Recognition for AI-Enhanced Personalized Learning
Abstract
The integration of emotional intelligence into digital learning platforms has emerged as a significant avenue for enhancing learner engagement and academic performance. This research presents a Real-Time Facial Emotion Recognition (FER) framework designed to support AI-enhanced personalized learning environments. The study identifies key research gaps in existing FER-based learning systems, including limited real-time applicability, a lack of contextual adaptability, and insufficient accuracy in dynamic educational settings. To address these gaps, we develop a robust deep learning-based FER model that leverages Convolutional Neural Networks (CNNs) and real-time image processing to recognize learners' emotional states, such as happiness, confusion, boredom, and frustration, with high precision. The recognized emotional data is systematically integrated into a personalized learning system to dynamically adapt content delivery and pedagogical strategies to the learner's affective state. The research methodology includes a comprehensive literature review, a well-defined system architecture, and an empirically validated FER framework. Experimental evaluations, conducted on benchmark FER datasets and in real-time e-learning environments, demonstrate the efficacy and scalability of the proposed system. The study concludes with key insights, limitations, and potential future directions aimed at advancing emotion-aware, AI-driven learning systems.
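The abstract does not specify implementation details, but the kind of real-time CNN-based FER pipeline it describes can be sketched as follows. This is a minimal illustration only: the model file name (fer_cnn.h5), the 48x48 grayscale input size, the four-class label set, and the use of OpenCV's Haar cascade face detector are all assumptions, not details taken from the paper.

```python
# Minimal sketch of a real-time FER loop: detect a face in each webcam frame,
# classify its emotional state with a pre-trained CNN, and overlay the label.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

EMOTIONS = ["happiness", "confusion", "boredom", "frustration"]  # assumed label set

# Haar cascade face detector shipped with OpenCV
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
model = load_model("fer_cnn.h5")  # hypothetical pre-trained CNN (e.g., on FER-2013)

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
        face = face.astype("float32") / 255.0           # normalize pixel values
        probs = model.predict(face[None, ..., None], verbose=0)[0]
        label = EMOTIONS[int(np.argmax(probs))]         # most likely affective state
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow("FER", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

In a system of the kind described in the abstract, the predicted label from each frame would then be passed to the personalized learning component to drive adaptation of content delivery and pedagogical strategy; that integration step is outside the scope of this sketch.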