PSparseFormer: Enhancing Fault Feature Extraction Based On Parallel Sparse Self-Attention and Multiscale Broadcast Feed-Forward Block

Jie Wang, Haidong Shao, Ying Peng, Bin Liu

IEEE Internet of Things Journal (2024)

Abstract
Currently, various state-of-the-art Transformer variants have attracted widespread attention in the field of fault diagnosis. However, these Transformers typically adopt a global sequence modelling strategy to extract fault features, which, given the local and sparse nature of vibration signals, is susceptible to interference from redundant information and strong noise. Therefore, a new feature-enhancement, end-to-end fault diagnosis model named PSparseFormer is proposed in this paper. First, a parallel sparse self-attention module is designed to efficiently extract the local and sparse features at different locations of complex vibration signals, reducing over-sensitivity to irrelevant information. Second, a multiscale broadcast feed-forward block is developed to simultaneously facilitate the transmission of global and local spatial feature information and to adjust the contribution of features at different levels, enhancing the robustness of local feature extraction against noise. Experimental analysis on datasets from two planetary gearboxes demonstrates the effectiveness of the proposed method in addressing feature extraction and enhancement challenges, particularly under strong noise interference. Comparative evaluations against various state-of-the-art Transformers show that the proposed method achieves superior diagnostic performance.
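The abstract does not disclose implementation details, but the two components can be illustrated with a minimal PyTorch sketch. Everything below is an assumption for illustration only, not the authors' design: the top-k score thresholding used to sparsify attention, the depthwise-convolution kernel sizes (3, 5, 7), and all module and parameter names are hypothetical.

```python
# Minimal sketch of the two ideas described in the abstract.
# NOT the authors' implementation: the top-k sparsification rule, kernel
# sizes, and all names here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseSelfAttention(nn.Module):
    """Self-attention that keeps only the top-k scores per query,
    suppressing attention to irrelevant (redundant or noisy) positions."""

    def __init__(self, dim, num_heads=4, top_k=16):
        super().__init__()
        self.num_heads = num_heads
        self.top_k = top_k
        self.qkv = nn.Linear(dim, dim * 3)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x):                       # x: (B, N, C)
        B, N, C = x.shape
        qkv = self.qkv(x).reshape(B, N, 3, self.num_heads, C // self.num_heads)
        q, k, v = qkv.permute(2, 0, 3, 1, 4)    # each: (B, H, N, C//H)
        attn = (q @ k.transpose(-2, -1)) / (q.shape[-1] ** 0.5)
        # Sparsify: drop everything below the k-th largest score per query.
        k_eff = min(self.top_k, N)
        thresh = attn.topk(k_eff, dim=-1).values[..., -1:]
        attn = attn.masked_fill(attn < thresh, float("-inf")).softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B, N, C)
        return self.proj(out)


class MultiscaleBroadcastFFN(nn.Module):
    """Feed-forward block with parallel depthwise convolutions at several
    scales; learned softmax weights adjust each scale's contribution before
    a pointwise convolution mixes channels globally."""

    def __init__(self, dim, kernel_sizes=(3, 5, 7)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv1d(dim, dim, k, padding=k // 2, groups=dim)
            for k in kernel_sizes
        )
        self.scale_weights = nn.Parameter(torch.ones(len(kernel_sizes)))
        self.pointwise = nn.Conv1d(dim, dim, 1)  # global channel mixing

    def forward(self, x):                        # x: (B, N, C)
        x = x.transpose(1, 2)                    # (B, C, N) for Conv1d
        w = self.scale_weights.softmax(dim=0)
        local = sum(wi * b(x) for wi, b in zip(w, self.branches))
        return self.pointwise(F.gelu(local)).transpose(1, 2)


if __name__ == "__main__":
    x = torch.randn(2, 128, 64)                  # (batch, sequence, channels)
    y = MultiscaleBroadcastFFN(64)(SparseSelfAttention(64)(x))
    print(y.shape)                               # torch.Size([2, 128, 64])
```

In a full model, these two modules would presumably be stacked with residual connections and layer normalization, in the usual pre-norm Transformer-encoder arrangement, with a classification head on top for end-to-end fault diagnosis.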
Keywords
fault diagnosis, feature extraction and enhancement, multiscale broadcast feed-forward block, parallel sparse self-attention, PSparseFormer