A deep reinforcement learning-based resilience enhancement framework for distribution networks under extreme weather events

CSEE Journal of Power and Energy Systems (2024)

Abstract
The frequent occurrence of extreme weather events due to global warming has significantly increased operating uncertainty and greatly reduced system resilience. To cope with high-impact, low-probability (HILP) extreme weather events, more sophisticated control strategies and smarter methods are required to enhance system resilience. In this paper, a data-driven deep reinforcement learning (DRL) framework is proposed that integrates different control strategies with system situational awareness to enhance grid resilience. Specifically, the resilience enhancement problem is formulated as a Markov decision process (MDP), taking into account the situational awareness and controllability improvements of modern power systems. Next, the probabilistic effects of extreme weather events on renewable energy and transmission lines are studied and leveraged in the proposed DRL framework to improve the forecasting and estimation of extreme weather impacts. Then, to speed up the training process of DRL, this paper adopts imitation learning and develops a safe topology search algorithm. Finally, an improved Soft Actor-Critic (SAC) algorithm is proposed for continuous learning and training. The proposed method is tested on a modified CIGRE 15-bus medium-voltage distribution network, and the results verify the effectiveness of the proposed model and method.
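To make the MDP formulation in the abstract concrete, the following is a minimal, hypothetical sketch of a resilience-enhancement environment: states combine grid measurements with a probabilistic weather signal, actions are continuous DER set-points, and the reward penalizes load shedding. All class names, dimensions, and dynamics here are illustrative assumptions, not the paper's actual model or the CIGRE benchmark data.

```python
import numpy as np

class ResilienceEnv:
    """Illustrative MDP for resilience enhancement (assumed, simplified dynamics)."""

    def __init__(self, n_buses=15, n_lines=15, horizon=24, seed=0):
        self.rng = np.random.default_rng(seed)
        self.n_buses = n_buses
        self.n_lines = n_lines
        self.horizon = horizon

    def reset(self):
        self.t = 0
        self.line_up = np.ones(self.n_lines)              # line availability flags
        self.load = self.rng.uniform(0.5, 1.0, self.n_buses)
        return self._state()

    def _state(self):
        # Situational awareness: bus loads, line status, and a crude estimate
        # of the probability that the storm damages each line next step.
        outage_prob = self.rng.uniform(0.0, 0.2, self.n_lines)
        return np.concatenate([self.load, self.line_up, outage_prob])

    def step(self, action):
        # action: continuous DER set-points in [0, 1], one per bus.
        der_output = np.clip(action, 0.0, 1.0)
        # Random line failures emulate the probabilistic weather impact.
        failures = self.rng.random(self.n_lines) < 0.05
        self.line_up = np.where(failures, 0.0, self.line_up)
        # Served load shrinks as lines fail; local DERs partially compensate.
        supply_ratio = self.line_up.mean()
        served = np.minimum(self.load, supply_ratio + 0.5 * der_output)
        shed = np.maximum(self.load - served, 0.0).sum()
        reward = -shed                                     # penalize load shedding
        self.t += 1
        done = self.t >= self.horizon
        return self._state(), reward, done

if __name__ == "__main__":
    env = ResilienceEnv()
    state = env.reset()
    total, done = 0.0, False
    while not done:
        # Placeholder random policy; a SAC agent would supply actions here.
        action = np.random.uniform(0.0, 1.0, env.n_buses)
        state, reward, done = env.step(action)
        total += reward
    print(f"episode return: {total:.2f}")
```

In an actual DRL pipeline, an off-policy agent such as SAC would interact with an environment of this shape, with the topology-switching and imitation-learning components the paper describes layered on top.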
Keywords
Deep reinforcement learning, Soft Actor-Critic, resilience enhancement, extreme weather events, distribution networks