iCOS: A Deep Reinforcement Learning Scheme for Wireless-Charged MEC Networks

IEEE Transactions on Vehicular Technology (2022)

Abstract
Computation offloading is an effective method in mobile edge computing (MEC) to relieve user equipment (UE) of its limited computation resources and battery capacity. Meanwhile, simultaneous wireless information and power transmission (SWIPT) can be applied to MEC to extend the operating time of the equipment. However, in a multi-user network environment, diverse computation task requirements and changing network channel states make it challenging to obtain offloading strategies in a timely and accurate manner. To address this issue, we propose an intelligent computation offloading scheme (iCOS) based on an enhanced priority deep deterministic policy gradient (EPDDPG) algorithm that minimizes the energy consumption of all UEs by jointly optimizing the offloading decision, the central processing unit (CPU) frequency, and the power-split ratio in a dynamic SWIPT-MEC network. In particular, we improve the traditional fully-connected network structure to produce both discrete and continuous action outputs, and accelerate neural network parameter updates by using prioritized experience tuples. Furthermore, we use dynamic voltage and frequency scaling (DVFS) technology to dynamically adjust the CPU frequency for local computing, and employ SWIPT technology to balance charging and communication according to the obtained strategy. Simulation results show that the proposed algorithm can effectively reduce the energy cost of UEs and complete more computation tasks within the delay limit.
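The abstract describes an actor network that emits a hybrid action: a discrete offloading decision together with continuous CPU-frequency and power-split outputs. The sketch below is a minimal, hypothetical PyTorch illustration of such a hybrid output head, assuming a shared fully-connected trunk; the class name HybridActor, the state dimension, and the number of offloading choices are illustrative assumptions, not the paper's actual EPDDPG implementation.

```python
# Hypothetical sketch of a hybrid actor producing one discrete action
# (offloading decision) and two continuous actions (normalized CPU
# frequency for DVFS and SWIPT power-split ratio). Illustrative only.
import torch
import torch.nn as nn


class HybridActor(nn.Module):
    def __init__(self, state_dim: int, num_offload_choices: int, hidden_dim: int = 128):
        super().__init__()
        # Shared fully-connected trunk over the observed task/channel state.
        self.trunk = nn.Sequential(
            nn.Linear(state_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        # Discrete head: offloading decision (e.g., local vs. edge execution).
        self.offload_head = nn.Linear(hidden_dim, num_offload_choices)
        # Continuous heads: CPU-frequency factor and power-split ratio in (0, 1).
        self.freq_head = nn.Linear(hidden_dim, 1)
        self.split_head = nn.Linear(hidden_dim, 1)

    def forward(self, state: torch.Tensor):
        h = self.trunk(state)
        offload_logits = self.offload_head(h)             # discrete action logits
        cpu_freq = torch.sigmoid(self.freq_head(h))       # normalized CPU frequency
        power_split = torch.sigmoid(self.split_head(h))   # SWIPT power-split ratio
        return offload_logits, cpu_freq, power_split


# Example usage: one UE observing a 10-dimensional state, 3 offloading choices.
actor = HybridActor(state_dim=10, num_offload_choices=3)
logits, freq, rho = actor(torch.randn(1, 10))
offload_decision = torch.argmax(logits, dim=-1)  # greedy discrete choice
```

In practice the continuous outputs would be rescaled to the device's feasible CPU-frequency range and combined with prioritized experience replay during training, per the scheme outlined in the abstract.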
Keywords
Computation offloading, deep reinforcement learning, energy consumption, mobile edge computing, offloading strategy, wireless information and power transmission