Adaptive edge service deployment in burst load scenarios using deep reinforcement learning

The Journal of Supercomputing (2024)

Abstract
The development of edge computing provides a novel deployment strategy for delay-aware applications, in which applications initially deployed on central servers are shifted closer to end-users for higher-quality, lower-delay service. However, as the number of end-users and devices grows, edge services become increasingly susceptible to sudden load spikes. In burst load scenarios, deploying services and allocating resources so as to maintain service quality and load balancing across edge servers is challenging, particularly because the resource requirements of different services are coupled. This paper addresses the challenge by modeling the burst load scenario as a Markov decision process and proposing a deep reinforcement learning-based (DRL-based) approach. The proposed approach ranks services by their migration status and request delay violations, then makes scaling and migration decisions for each service in turn, with the goal of maximizing total request throughput while satisfying delay requirements and resource constraints. Simulation results show that the proposed approach outperforms competing algorithms in terms of total throughput and delay violation rate.
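The abstract outlines a per-service decision loop: services are ranked by migration status and delay violation, and a DRL policy then chooses a scaling or migration action for each. The sketch below illustrates that loop under stated assumptions; the class names (EdgeService, TabularQAgent), the action set, the state discretization, and the reward shaping are all hypothetical stand-ins, not the authors' implementation (the paper uses a deep RL policy rather than a Q-table).

```python
# Illustrative sketch only. Names, actions, and reward shaping are assumptions;
# the paper's actual method uses a deep reinforcement learning policy.
import random
from dataclasses import dataclass

ACTIONS = ["keep", "scale_up", "scale_down", "migrate"]  # assumed action set

@dataclass
class EdgeService:
    name: str
    request_rate: float      # incoming requests per second
    observed_delay: float    # current average response delay (ms)
    delay_bound: float       # delay requirement (ms)
    migrating: bool          # whether a migration is already in progress

    def delay_violation(self) -> float:
        """Positive when the observed delay exceeds the requirement."""
        return max(0.0, self.observed_delay - self.delay_bound)

def rank_services(services):
    """Rank services by migration status, then by severity of delay
    violation (worst first), as the abstract describes."""
    return sorted(services, key=lambda s: (s.migrating, -s.delay_violation()))

class TabularQAgent:
    """Minimal epsilon-greedy Q-learning agent standing in for the DRL policy."""
    def __init__(self, epsilon=0.1, alpha=0.5, gamma=0.9):
        self.q = {}  # (state, action) -> estimated value
        self.epsilon, self.alpha, self.gamma = epsilon, alpha, gamma

    def act(self, state):
        if random.random() < self.epsilon:
            return random.choice(ACTIONS)
        return max(ACTIONS, key=lambda a: self.q.get((state, a), 0.0))

    def update(self, state, action, reward, next_state):
        best_next = max(self.q.get((next_state, a), 0.0) for a in ACTIONS)
        old = self.q.get((state, action), 0.0)
        self.q[(state, action)] = old + self.alpha * (reward + self.gamma * best_next - old)

def discretize(service: EdgeService) -> tuple:
    """Coarse state: is the service migrating, and is its delay bound violated?"""
    return (service.migrating, service.delay_violation() > 0)

# One decision round over a burst: rank the services, then decide for each in turn.
services = [
    EdgeService("video", request_rate=120.0, observed_delay=95.0, delay_bound=80.0, migrating=False),
    EdgeService("auth", request_rate=40.0, observed_delay=20.0, delay_bound=50.0, migrating=False),
]
agent = TabularQAgent()
for svc in rank_services(services):
    state = discretize(svc)
    action = agent.act(state)
    # Assumed reward shaping: served throughput minus a penalty for delay violations.
    reward = svc.request_rate - 10.0 * svc.delay_violation()
    agent.update(state, action, reward, state)
    print(f"{svc.name}: action={action}, reward={reward:.1f}")
```

The ranking step mirrors the paper's idea of handling the most constrained services first, while the reward term reflects the stated objective of maximizing throughput subject to delay requirements; the exact weights and state encoding here are placeholders.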
Key words
Burst load, Edge service, Scaling and migrating, Deep reinforcement learning