CLFormer: A Lightweight Transformer Based on Convolutional Embedding and Linear Self-Attention With Strong Robustness for Bearing Fault Diagnosis Under Limited Sample Conditions

IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT (2022)

Abstract
As a rising star in the field of deep learning, Transformers have achieved remarkable results across numerous tasks. Nonetheless, owing to safety considerations, complex environments, and deployment-cost constraints in actual industrial production, fault-diagnosis algorithms face three challenges: limited samples, noise interference, and the need for lightweight models. These challenges impede the practical use of Transformers for fault diagnosis, since Transformers typically demand large sample sizes and parameter counts. For this reason, this article proposes a lightweight Transformer based on convolutional embedding and linear self-attention (LSA), called CLFormer. By modifying the embedding module and the form of self-attention, the model is made lightweight (0.12 MFLOPs; 4.88 K parameters) while preserving the high accuracy of the Transformer. Its effectiveness was demonstrated on a self-made dataset against four comparative models; in particular, with a limited number of training samples per class, CLFormer achieves the highest average accuracy of 83.58% over signal-to-noise ratios (SNRs) from −8 to 8 dB for three types of noise. As an early attempt to use the Transformer for fault diagnosis of rotating machinery, this work provides a feasible strategy for fault-diagnosis research aimed at practical deployment.
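The abstract does not give the exact form of the paper's LSA module, but the key idea behind linear self-attention in general is to avoid the n×n attention matrix of standard softmax attention by normalizing queries and keys separately and contracting keys with values first, reducing the cost from O(n²·d) to O(n·d²). A minimal sketch of this generic linear-attention form (the function name, dimensions, and weight initialization below are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def softmax(x, axis):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def linear_self_attention(x, wq, wk, wv):
    """Generic linear self-attention sketch (not the paper's exact LSA).

    Computes K^T V first, a (d, d) matrix independent of sequence
    length n, so the overall cost is O(n * d^2) instead of the
    O(n^2 * d) of standard softmax attention.
    """
    q = softmax(x @ wq, axis=-1)   # normalize each query over features
    k = softmax(x @ wk, axis=0)    # normalize each key over the sequence
    v = x @ wv
    context = k.T @ v              # (d, d): value aggregate weighted by keys
    return q @ context             # (n, d) output, no n x n matrix formed

# Toy usage with assumed dimensions: sequence length 64, embedding dim 16.
rng = np.random.default_rng(0)
n, d = 64, 16
x = rng.standard_normal((n, d))
wq, wk, wv = (0.1 * rng.standard_normal((d, d)) for _ in range(3))
out = linear_self_attention(x, wq, wk, wv)
print(out.shape)  # (64, 16)
```

Because the (d, d) context matrix replaces the (n, n) attention map, memory and compute stay small for long vibration-signal sequences, which is consistent with the lightweight figures (0.12 MFLOPs, 4.88 K parameters) reported in the abstract.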
Keywords
Transformers, feature extraction, fault diagnosis, data models, convolution, training, convolutional neural networks, anti-noise, lightweight, limited sample