InTrans: Fast Incremental Transformer for Time Series Data Prediction

International Conference on Database and Expert Systems Applications (DEXA)(2022)

Abstract
Predicting time-series data is useful in many applications, such as natural disaster prevention, weather forecasting, and traffic control. Time-series forecasting has been extensively studied, and many existing forecasting models perform well when predicting short sequences. However, their performance degrades greatly on long sequences. Recently, more dedicated research has targeted this setting, and Informer is currently the most efficient prediction model. The main drawback of Informer is its inability to learn incrementally. This paper proposes an incremental Transformer, called InTrans, that addresses this bottleneck by reducing Informer's training/prediction time. The time complexities of InTrans compared to Informer are: (1) O(S) vs. O(L) for positional and temporal embedding, (2) O((S + k - 1) * k) vs. O(L * k) for value embedding, and (3) O((S + k - 1) * d_dim) vs. O(L * d_dim) for the computation of Query/Key/Value, where L is the length of the input, k is the kernel size, d_dim is the number of dimensions, and S is the length of the non-overlapping part of the input, which is usually significantly smaller than L. InTrans can therefore greatly improve both training and prediction speed over the state-of-the-art model, Informer. Extensive experiments show that InTrans is about 26% faster than Informer for both short-sequence and long-sequence time-series prediction.
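The O((S + k - 1) * k) vs. O(L * k) claim for value embedding rests on a simple observation: when the input window slides forward by S steps, the outputs of a kernel-size-k convolution over the overlapping part are unchanged, so only the last S outputs (which depend on the last S + k - 1 samples) need recomputing. A minimal NumPy sketch of that idea (illustrative only, not InTrans's actual implementation; all names here are hypothetical):

```python
import numpy as np

def conv1d_valid(x, w):
    """'Valid'-mode 1-D correlation of signal x with kernel w."""
    k = len(w)
    return np.array([np.dot(x[i:i + k], w) for i in range(len(x) - k + 1)])

# Toy setup: window length L, kernel size k, slide step S (S << L).
rng = np.random.default_rng(0)
L, k, S = 12, 3, 4
w = rng.standard_normal(k)
stream = rng.standard_normal(L + S)

old_win = stream[:L]          # previous input window
new_win = stream[S:]          # window after sliding forward by S samples

# Naive approach, O(L * k): recompute the embedding over the whole window.
full = conv1d_valid(new_win, w)

# Incremental approach, O((S + k - 1) * k): reuse the old outputs for the
# overlap and convolve only the last S + k - 1 samples to get the S new ones.
prev = conv1d_valid(old_win, w)
tail = conv1d_valid(new_win[-(S + k - 1):], w)
incr = np.concatenate([prev[S:], tail])

assert np.allclose(full, incr)  # incremental result matches full recompute
```

The same reuse argument applies to the Query/Key/Value projections, where recomputing only the non-overlapping part yields the O((S + k - 1) * d_dim) bound quoted above.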
Keywords
Incremental learning, Transformer, Time-series forecasting