
A Two-Terminal Fault Location Fusion Model of Transmission Line Based on CNN-Multi-Head-LSTM with an Attention Module

Chao Su, Qiang Yang, Xiaomei Wu, Chun Sing Lai, Loi Lei Lai

ENERGIES (2023)

Abstract
Most traditional artificial-intelligence-based fault location methods depend heavily on fault signal selection and feature extraction, which in turn rely on prior knowledge. These methods are also highly sensitive to line parameters and to the selected fault characteristics, so their generalization performance is poor and they cannot be transferred to different lines. To address these problems, this paper proposes a two-terminal fault location fusion model that combines a convolutional neural network (CNN), an attention module (AM), and multi-head long short-term memory (multi-head LSTM). First, the CNN extracts features from the fault data automatically. Second, a convolutional block attention module (CBAM) is embedded in the CNN so that the network selectively learns fault features on its own. The LSTM is then combined to learn deep temporal characteristics. Finally, an MLP output layer determines the optimal weights for fusing the results of the two-terminal relative fault location models and outputs the final location result. Simulation studies show that the method achieves high location accuracy, requires no hand-designed feature extraction algorithms, and generalizes well to lines with different parameters, which is of great importance for the development of AI-based fault location methods.
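The attention step described in the abstract can be illustrated with a minimal NumPy sketch of a CBAM-style forward pass over a 1-D feature map: channel attention (shared MLP over average- and max-pooled channel descriptors) followed by spatial attention (a 1-D convolution over stacked channel-wise average and max maps). All shapes, weights, and the kernel size here are hypothetical assumptions for illustration, not the paper's actual configuration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x, w1, w2):
    """Reweight channels of a (channels, length) feature map."""
    avg = x.mean(axis=1)   # global average pooling -> (channels,)
    mx = x.max(axis=1)     # global max pooling -> (channels,)
    # shared two-layer MLP (ReLU hidden layer) applied to both descriptors
    s = sigmoid(w2 @ np.maximum(w1 @ avg, 0) + w2 @ np.maximum(w1 @ mx, 0))
    return x * s[:, None]

def spatial_attention(x, kernel):
    """Reweight positions using a 1-D conv over avg/max channel maps."""
    feats = np.stack([x.mean(axis=0), x.max(axis=0)])   # (2, length)
    pad = kernel.shape[1] // 2
    padded = np.pad(feats, ((0, 0), (pad, pad)))        # keep output length
    conv = sum(np.convolve(padded[c], kernel[c], mode="valid")
               for c in range(2))
    return x * sigmoid(conv)[None, :]

# Toy example: 8 CNN channels over 32 time samples of a fault waveform.
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 32))
w1 = rng.standard_normal((4, 8))    # channel-reduction MLP: 8 -> 4
w2 = rng.standard_normal((8, 4))    # expansion back: 4 -> 8
kernel = rng.standard_normal((2, 7))  # spatial conv kernel, size 7
y = spatial_attention(channel_attention(x, w1, w2), kernel)
print(y.shape)  # (8, 32): attention preserves the feature-map shape
```

In the paper's pipeline, the attended feature map would then feed the multi-head LSTM stage; the sketch only shows why CBAM leaves tensor shapes unchanged, so it can be dropped between convolutional layers.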
Key words
fault location, convolutional neural network, long short-term memory, attention module