Dependency-Aware Computation Offloading In Mobile Edge Computing: A Reinforcement Learning Approach

IEEE Access (2019)

Cited 53 | Viewed 23
Abstract
Mobile edge computing (MobEC) builds an information technology (IT) service environment that enables cloud-computing capabilities at the edge of mobile networks. To overcome the limited battery power and computation capability of mobile devices, task offloading to MobEC has been developed to reduce service latency and ensure high service efficiency. However, most existing schemes focus only on one-shot offloading and pay little attention to task dependency. Since modern communication networks have become increasingly complicated and dynamic, a more comprehensive and adaptive approach that accounts for both the energy constraint and the inherent dependency among tasks is urgently needed. To this end, in this paper we study the problem of dependency-aware task-offloading decisions in MobEC, aiming to minimize the execution time of mobile applications under constraints on energy consumption. To solve this problem, we propose a model-free approach based on reinforcement learning (RL), namely a Q-learning approach that adaptively learns to jointly optimize the offloading decision and energy consumption by interacting with the network environment. Simulation results show that our RL-based approach achieves a significant reduction in total execution time with comparatively low energy consumption.
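The abstract describes a model-free Q-learning agent that chooses, per task, between local execution and offloading while respecting task dependencies and an energy budget. The sketch below is a hedged illustration of that idea, not the authors' exact formulation: the per-task times, energy costs, the chain-shaped dependency (task i must finish before task i+1), and the budget-violation penalty are all assumptions made for the example.

```python
import random

# Assumed toy parameters (not from the paper): local execution is slow but
# uses device energy; offloading adds a transmission delay but is faster and
# cheaper in device energy.
LOCAL_TIME, EDGE_TIME = 4.0, 1.5      # per-task execution times
TX_TIME = 0.5                          # transmission delay when offloading
LOCAL_ENERGY, TX_ENERGY = 2.0, 0.8    # per-task device energy costs
ENERGY_BUDGET = 10.0                   # total device-energy constraint
N_TASKS = 6                            # tasks form a simple dependency chain

def step(action, energy_used):
    """Execute one task; reward = -time, with a penalty if the budget is exceeded."""
    if action == 0:  # run locally
        t, e = LOCAL_TIME, LOCAL_ENERGY
    else:            # offload to the edge server
        t, e = EDGE_TIME + TX_TIME, TX_ENERGY
    energy_used += e
    reward = -t - (5.0 if energy_used > ENERGY_BUDGET else 0.0)
    return reward, energy_used

def train(episodes=2000, alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning; for simplicity the state is just the task index."""
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(N_TASKS)]  # Q[task][action]
    for _ in range(episodes):
        energy = 0.0
        for task in range(N_TASKS):  # dependency chain: tasks run in order
            # epsilon-greedy action selection
            if rng.random() < eps:
                a = rng.randrange(2)
            else:
                a = max((0, 1), key=lambda x: q[task][x])
            r, energy = step(a, energy)
            nxt = max(q[task + 1]) if task + 1 < N_TASKS else 0.0
            q[task][a] += alpha * (r + gamma * nxt - q[task][a])
    return q

q = train()
policy = [max((0, 1), key=lambda a: q[t][a]) for t in range(N_TASKS)]
```

With these assumed costs, offloading every task stays within the energy budget and is faster, so the learned greedy policy offloads all six tasks. The real problem in the paper is richer (general task-dependency graphs and a dynamic network environment), which is exactly why a model-free RL method is attractive there.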
Keywords
Mobile edge computing, offloading, resource allocation, reinforcement learning, task dependency