Adversarial self-attentive time-variant neural networks for multi-step time series forecasting

Expert Systems with Applications (2023)

Abstract
Accurate time series forecasting mitigates the uncertainty of future outlooks and helps reduce errors in decision making. Despite years of research, several challenges to accurate forecasting remain, including the difficulty of dynamic modeling, the problem of capturing short-term correlations, and the conundrum of long-term forecasting. This paper proposes an Adversarial Truncated Cauchy Self-Attentive Time-Variant Neural Network (ASATVN) for multi-step-ahead time series forecasting. The proposed model builds on Generative Adversarial Networks, in which the generator is a novel time-variant model. The time-variant model learns dynamic time-series changes through its time-variant architecture and employs a newly proposed Truncated Cauchy Self-Attention block to better capture local sequential dependencies. For the discriminator, two self-attentive discriminators are presented to regularize predictions with respect to fidelity and continuity, which benefits prediction over longer time horizons. The proposed ASATVN outperforms state-of-the-art predictive models on eleven real-world benchmark datasets, demonstrating its effectiveness.
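The abstract does not give the exact formulation of the Truncated Cauchy Self-Attention block. The sketch below is a minimal, assumed interpretation: standard scaled dot-product self-attention re-weighted by a Cauchy-shaped locality kernel that is truncated outside a local window, so that nearby time steps dominate the attention. The function name and the `gamma` and `window` parameters are illustrative, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def truncated_cauchy_self_attention(x, w_q, w_k, w_v, gamma=1.0, window=5):
    """Hedged sketch: dot-product self-attention with a truncated Cauchy locality prior.

    x: (batch, seq_len, d_model) input sequence.
    The Cauchy weight 1 / (1 + (|i - j| / gamma)^2) favours nearby steps;
    pairs farther than `window` apart are masked out entirely (the truncation).
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v                  # linear projections
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5        # (batch, seq, seq)

    # Pairwise distance |i - j| between time steps.
    idx = torch.arange(x.size(1), device=x.device)
    dist = (idx[None, :] - idx[:, None]).abs().float()

    # Cauchy-shaped locality bias, truncated beyond the local window.
    cauchy = 1.0 / (1.0 + (dist / gamma) ** 2)
    cauchy = cauchy.masked_fill(dist > window, 0.0)

    # Re-weight the attention distribution and renormalize.
    weights = F.softmax(scores, dim=-1) * cauchy
    weights = weights / weights.sum(dim=-1, keepdim=True).clamp_min(1e-9)
    return weights @ v
```

Under this reading, the truncation plays the role the abstract attributes to the block: short-term (local) dependencies are emphasized while distant positions contribute nothing, in contrast to vanilla self-attention, which spreads weight over the whole sequence.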
Keywords
Time series forecasting, Dynamic modeling, Short-term correlations, Long-term forecasting