Refining one-class representation: A unified transformer for unsupervised time-series anomaly detection

INFORMATION SCIENCES (2024)

Abstract
Deep unsupervised time-series anomaly detectors depend on a one-class representation, which is most effective when modeled from normal samples only. In practice, however, normal samples are mixed with anomalies in the unlabeled training dataset, so the learned one-class representation may be biased and violate the one-class setting. To address this problem, we refine the one-class representation and propose a unified AMFormer (Active Masked transFormer) framework, which integrates the Transformer with a masked-operation mechanism and cost-sensitive learning. Specifically, we first develop a network-driven masked operation based on a Hadamard-product transformation to damage the initial input samples. The encoder and decoder representations then rebuild the damaged, incomplete samples, which avoids identity shortcuts and further enhances robustness. Secondly, we exploit an active MSE (Mean Squared Error) loss function to purify the training samples: different weights are dynamically assigned to samples according to pseudo labels derived from their rebuilding errors, and pseudo anomalies with larger rebuilding errors are removed by assigning them lower weights. Finally, extensive experiments are conducted on four benchmark datasets. The results demonstrate that AMFormer outperforms nine relevant baseline algorithms, boosting the mean F1 score from 0.851 to 0.937.
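The two core ideas of the abstract, Hadamard-product masking of the input and a rebuilding-error-weighted "active" MSE, can be illustrated with a minimal NumPy sketch. This is an assumption-laden toy, not the paper's implementation: the mask ratio, the quantile-based pseudo-labeling, and the `low_weight` suppression value are all hypothetical stand-ins for the network-driven mask and the paper's actual weighting scheme, and the encoder-decoder reconstruction is replaced by a placeholder.

```python
import numpy as np

rng = np.random.default_rng(0)

def hadamard_mask(x, mask_ratio=0.3):
    """Damage the input by an element-wise (Hadamard) product with a binary mask.

    In AMFormer the mask is network-driven; here it is random for illustration.
    """
    mask = (rng.random(x.shape) >= mask_ratio).astype(x.dtype)
    return x * mask, mask

def active_mse(x, x_hat, anomaly_quantile=0.9, low_weight=0.0):
    """Weighted MSE that down-weights pseudo anomalies.

    Samples whose per-sample rebuilding error exceeds a quantile threshold are
    pseudo-labeled as anomalies and given `low_weight` (hypothetical choices).
    """
    errors = np.mean((x - x_hat) ** 2, axis=1)          # per-sample rebuilding error
    threshold = np.quantile(errors, anomaly_quantile)   # pseudo-label split point
    weights = np.where(errors > threshold, low_weight, 1.0)
    return np.sum(weights * errors) / np.maximum(weights.sum(), 1.0)

# Toy batch of "normal" samples; the reconstruction is a stand-in for the
# encoder-decoder output, so the loss here only demonstrates the weighting.
x = rng.normal(0.0, 1.0, size=(64, 16))
x_masked, _ = hadamard_mask(x)
x_hat = x_masked  # placeholder for the Transformer's rebuilt samples
loss = active_mse(x, x_hat)
```

Because the highest-error samples receive lower weight, this weighted loss is never larger than the plain per-sample MSE average, which is exactly the purification effect the abstract describes.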
Keywords
Anomaly detection,One-class representation,Hadamard product masked encoder,Active MSE