Explainable AI Methods for Predicting Student Grades and Improving Academic Success

E. Ben George, R. Senthilkumar, Fatma Al-Junaibi, Zakariya Al-Shuaibi

Abstract

Introduction: This study explores the application of Explainable Artificial Intelligence (XAI) techniques to predicting student performance in educational settings. AI and machine learning have made it possible to predict student outcomes in advance with increasing accuracy. However, many AI models offer little insight into how their predictions are produced; such models are termed black boxes. This opacity is a significant problem in education because it can erode administrators' and educators' trust in the predicted outcomes.
Objectives: This research aims to address the shortcomings of traditional AI models by making them more understandable through XAI. XAI gives stakeholders a clearer view of the logic underlying the predictions, enabling better decisions. By applying XAI techniques, this paper provides interpretable, data-driven student grade predictions that increase confidence in AI systems. These interpretable predictions can identify students at risk of performing poorly at an early stage.
Methods: This research employs XAI techniques, namely SHAP (SHapley Additive exPlanations) and LIME (Local Interpretable Model-agnostic Explanations), to explain the predictions. Students' continuous-assessment scores, from quizzes, midterm examinations, practical tests, assignments, and activities, were used as features to predict final grades with a Random Forest Classifier (RFC). Partial Dependence Plots (PDPs), SHAP, and LIME are then used to improve comprehension of the model's predictions.
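As a concrete illustration of the kind of pipeline the Methods describe, the sketch below trains a Random Forest on synthetic assessment scores and computes a one-feature partial dependence curve by hand. The feature names, grade weights, and data here are illustrative assumptions, not the study's actual dataset or model configuration.

```python
# Minimal sketch: Random Forest on continuous-assessment features, plus a
# manually computed partial-dependence curve for one feature. All data and
# weights below are synthetic and purely illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 300
# Synthetic scores (0-100) for five assessment components per student.
features = ["quiz", "midterm", "practical", "assignment", "activity"]
X = rng.uniform(0, 100, size=(n, len(features)))
# Toy labelling rule: pass (1) when the weighted total clears 50%.
total = X @ np.array([0.2, 0.3, 0.2, 0.2, 0.1])
y = (total >= 50).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

def partial_dependence(model, X, feature_idx, grid):
    """Average predicted pass-probability as one feature is swept over a grid,
    holding every other feature at its observed values."""
    values = []
    for v in grid:
        Xv = X.copy()
        Xv[:, feature_idx] = v  # force the chosen feature to the grid value
        values.append(model.predict_proba(Xv)[:, 1].mean())
    return np.array(values)

grid = np.linspace(0, 100, 5)
pdp_midterm = partial_dependence(model, X, features.index("midterm"), grid)
print(dict(zip(grid.round(0), pdp_midterm.round(3))))
```

In practice, per-student explanations of the kind the paper describes would come from the SHAP and LIME libraries themselves (e.g. `shap.TreeExplainer` for tree ensembles and `lime.lime_tabular.LimeTabularExplainer` for tabular data); the hand-rolled loop above only shows the partial-dependence idea behind PDPs.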
Results: Applying these XAI techniques enhanced comprehension of the critical features affecting student performance. The results give clear insights into the areas in which students can improve to achieve higher grades, and a broader view of the factors that influence academic success or failure, helping educators and other stakeholders make well-informed decisions.
Conclusions: The findings demonstrate that applying XAI to student performance data makes grade predictions transparent. The outcomes of this research can support the design of more effective instructional techniques and help students address their weaknesses early.
