A Deep Reinforcement Learning-Based Resource Management Game in Vehicular Edge Computing

IEEE Transactions on Intelligent Transportation Systems (2022)

Abstract
Vehicular Edge Computing (VEC) is a promising paradigm that enables vehicles to offload computation tasks to nearby VEC servers in order to support low-latency vehicular applications. Incentivizing VEC servers to participate in computation offloading and to make full use of their computation resources is critical to the success of intelligent transportation services. In this paper, we formulate the competitive interactions between VEC servers and vehicles as a two-stage Stackelberg game, with the VEC servers as leaders and the vehicles as followers. With full information about the vehicles, the VEC server computes the unit price of its computation resource; given the announced unit price, each vehicle determines the amount of computation resource to purchase from the VEC server. For the scenario in which vehicles are unwilling to share their computation demands, a deep reinforcement learning based resource management scheme is proposed to maximize the profits of both the vehicles and the VEC server. Extensive experimental results demonstrate the effectiveness of the proposed resource management scheme based on the Stackelberg game and deep reinforcement learning.
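The abstract describes the two-stage Stackelberg interaction but gives no concrete utility functions or pricing rules. The minimal sketch below only illustrates the structure of the full-information case: the leader (VEC server) searches over candidate unit prices while anticipating the followers' (vehicles') best responses. The log-utility model, the cost and budget parameters, and all function names are hypothetical placeholders, not the paper's formulation; the DRL scheme for the private-demand case is not shown.

```python
# Illustrative sketch of a two-stage Stackelberg pricing game.
# All utility functions, parameters, and names are assumptions for illustration.

import numpy as np

def follower_best_response(price, demand, budget):
    """Vehicle (follower): given the server's unit price, choose how much
    computation resource to buy, assuming a concave log-utility of the
    purchased amount minus a linear payment, capped by the budget."""
    # Unconstrained maximizer of demand*log(x) - price*x is x = demand/price.
    x = demand / price
    return float(np.clip(x, 0.0, budget / price))

def leader_profit(price, demands, budgets, unit_cost=0.1):
    """VEC server (leader): profit = revenue from all vehicles minus the
    cost of provisioning the resources they purchase."""
    sold = sum(follower_best_response(price, d, b)
               for d, b in zip(demands, budgets))
    return (price - unit_cost) * sold

# Stage 1: the leader picks the unit price that maximizes its profit,
# anticipating the followers' best responses (full-information case).
demands, budgets = [2.0, 3.0, 1.5], [5.0, 4.0, 6.0]
prices = np.linspace(0.2, 5.0, 200)
best_price = max(prices, key=lambda p: leader_profit(p, demands, budgets))

# Stage 2: each follower reacts to the announced equilibrium price.
purchases = [follower_best_response(best_price, d, b)
             for d, b in zip(demands, budgets)]
print(f"leader price ~ {best_price:.2f}, purchases = {[round(x, 2) for x in purchases]}")
```

When the vehicles' demands are private, the leader can no longer evaluate `leader_profit` directly; the paper replaces this full-information computation with a deep reinforcement learning agent that learns the pricing and purchasing decisions from observed interactions.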
Key words
Resource management, Servers, Games, Edge computing, Reinforcement learning, Pricing, Computational modeling