SARSA-Based Reinforcement Learning Framework for Energy-Aware and Makespan-Optimized Workload Scheduling in Cloud Computing
Abstract
For most users, the question is no longer whether to utilize computational resources from the cloud, but how to do so efficiently. Dynamic workload scheduling, however, is difficult to optimize because of the interplay between energy consumption and makespan. To address this, we put forward a reinforcement learning (RL) framework grounded in SARSA, with the objective of balancing makespan against energy consumption. The framework autonomously adapts task scheduling decisions to real-time workload characteristics, optimizing the energy consumed without compromising throughput. Experiments show that the proposed SARSA-based scheduling algorithm improves over traditional scheduling strategies, yielding substantial energy savings and a reduced makespan. In this work, an adaptive mechanism is thus proposed that allows a cloud computing service to be tuned for sustainability while minimizing the detrimental effect on service quality.
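To make the abstract's description concrete, the sketch below illustrates how a tabular SARSA agent could assign incoming tasks to virtual machines with a reward that trades off energy and execution time. This is a minimal illustration under assumed definitions: the state encoding, the reward weights, and the helper callbacks (observe_state, execute) are hypothetical and not taken from the paper.

```python
import random
from collections import defaultdict

# Hypothetical illustration of SARSA-based task scheduling.
# State, action, and reward definitions here are assumptions,
# not the paper's exact formulation.

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1   # learning rate, discount, exploration rate
NUM_VMS = 4                              # actions = indices of candidate VMs

Q = defaultdict(float)                   # Q[(state, action)] -> estimated value

def choose_action(state):
    """Epsilon-greedy selection of a VM for the next task."""
    if random.random() < EPSILON:
        return random.randrange(NUM_VMS)
    return max(range(NUM_VMS), key=lambda a: Q[(state, a)])

def reward(energy_joules, exec_time_s, w_energy=0.5, w_time=0.5):
    """Negative weighted cost: lower energy and shorter execution time give higher reward."""
    return -(w_energy * energy_joules + w_time * exec_time_s)

def sarsa_update(s, a, r, s_next, a_next):
    """On-policy update: Q(s,a) += alpha * (r + gamma * Q(s',a') - Q(s,a))."""
    td_target = r + GAMMA * Q[(s_next, a_next)]
    Q[(s, a)] += ALPHA * (td_target - Q[(s, a)])

def schedule(tasks, observe_state, execute):
    """Dispatch a stream of tasks, learning from observed energy and runtime.

    observe_state() and execute(task, vm) are assumed environment hooks:
    the former returns a hashable workload-state descriptor, the latter
    runs the task on the chosen VM and returns (energy, exec_time).
    """
    s = observe_state()
    a = choose_action(s)
    for task in tasks:
        energy, exec_time = execute(task, vm=a)
        r = reward(energy, exec_time)
        s_next = observe_state()
        a_next = choose_action(s_next)
        sarsa_update(s, a, r, s_next, a_next)
        s, a = s_next, a_next
```

Because SARSA is on-policy, the agent learns the value of the scheduling policy it actually follows, including its exploratory assignments, which keeps the learned trade-off between energy and makespan consistent with the decisions made at run time.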