Multiplicative Attention Mechanism for Multi-horizon Time Series Forecasting

2021 International Joint Conference on Neural Networks (IJCNN)

Abstract
Multi-horizon time series forecasting plays an important role in many industrial and business decision processes. Capturing the complex and varied patterns across different time series is the crucial step toward strong forecasting performance. However, most deep learning-based forecasting approaches simply take series-specific static (i.e., time-invariant) covariates as input features, which can fail to capture the pattern variation of each individual time series. In this paper, we propose a novel multiplicative attention-based architecture to tackle this forecasting problem. Our modification to the multi-head attention layers leverages the series-specific covariates to build a flexible attention function for each time series. This improvement yields greater representational capacity to capture the differing patterns across related time series. Experimental results demonstrate that our approach achieves state-of-the-art performance on a variety of real-world datasets.
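The abstract describes attention functions modulated by series-specific static covariates, but not the exact formulation. Below is a minimal NumPy sketch of one plausible reading, in which a gate derived from the static covariates `z` multiplicatively scales the inputs to the query and key projections; the function name `multiplicative_attention` and the weight matrices `Wq`, `Wk`, `Wv`, `Wg` are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multiplicative_attention(x, z, Wq, Wk, Wv, Wg):
    """One attention head whose query/key projections are modulated
    multiplicatively by a series-specific static covariate embedding.

    x: (T, d) time-step inputs for one series
    z: (c,)   static (time-invariant) covariates for that series
    Wq, Wk, Wv: (d, d) projection weights; Wg: (c, d) gate weights
    (All names here are hypothetical; the paper's parameterization
    may differ.)
    """
    gate = np.tanh(z @ Wg)           # (d,) series-specific multiplicative gate
    q = (x * gate) @ Wq              # covariates rescale inputs per dimension,
    k = (x * gate) @ Wk              # giving each series its own attention function
    v = x @ Wv
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1) @ v

# Example: one series with 5 time steps, 4 features, 3 static covariates.
rng = np.random.default_rng(0)
T, d, c = 5, 4, 3
x = rng.standard_normal((T, d))
z = rng.standard_normal(c)
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
Wg = rng.standard_normal((c, d))
out = multiplicative_attention(x, z, Wq, Wk, Wv, Wg)
print(out.shape)  # (5, 4)
```

Because the gate depends only on `z`, two series with different static covariates attend over their histories differently even with shared weights, which is the kind of per-series flexibility the abstract attributes to the method.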
Keywords
multi-horizon time series forecasting, attention, deep neural networks