Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting
CoRR (2023)
Abstract
Over the past years, foundation models have caused a paradigm shift in
machine learning due to their unprecedented capabilities for zero-shot and
few-shot generalization. However, despite the success of foundation models in
modalities such as natural language processing and computer vision, the
development of foundation models for time series forecasting has lagged behind.
We present Lag-Llama, a general-purpose foundation model for univariate
probabilistic time series forecasting based on a decoder-only transformer
architecture that uses lags as covariates. Lag-Llama is pretrained on a large
corpus of diverse time series data from several domains, and demonstrates
strong zero-shot generalization capabilities compared to a wide range of
forecasting models on downstream datasets across domains. Moreover, when
fine-tuned on relatively small fractions of such previously unseen datasets,
Lag-Llama achieves state-of-the-art performance, outperforming prior deep
learning approaches, emerging as the best general-purpose model on average.
Lag-Llama serves as a strong contender to the current state-of-the-art in time
series forecasting and paves the way for future advancements in foundation
models tailored to time series data.
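The abstract notes that Lag-Llama uses lags of the target series as covariates. As a rough illustration of what that means, the sketch below builds a matrix of lagged values from a univariate series; the function name, the toy lag set, and the implementation are assumptions for exposition, not the paper's actual feature pipeline (which uses a much larger lag set spanning multiple frequencies).

```python
import numpy as np

def make_lag_features(series: np.ndarray, lags: list[int]) -> np.ndarray:
    """Build a matrix of lagged values to serve as covariates.

    Row t contains [series[t - lag] for lag in lags]; time steps that
    would index before the start of the series are dropped.
    (Hypothetical helper, not from the Lag-Llama codebase.)
    """
    max_lag = max(lags)
    # Stack each lagged copy of the series as a column.
    cols = [series[max_lag - lag : len(series) - lag] for lag in lags]
    return np.stack(cols, axis=1)

# Toy univariate series and a small, illustrative lag set.
y = np.arange(10, dtype=float)
X = make_lag_features(y, lags=[1, 2, 7])
print(X.shape)  # → (3, 3): one row per target step t = 7, 8, 9
```

Each row of `X` would accompany the corresponding target value as model input; in a decoder-only transformer these lag features are part of the token representation at each time step.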
Keywords
time series forecasting, foundation models, lag-llama