Timer-XL

Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [Paper], [Slides].

🚩 News (2025.01) Timer-XL has been accepted at ICLR 2025. See you in Singapore :)

🚩 News (2024.12) Released a univariate pre-trained model [HuggingFace]. A quickstart example is provided here (a minimal usage sketch also follows these news items).

🚩 News (2024.10) Model implementation is included in [OpenLTM].
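
A minimal loading sketch, assuming the checkpoint is published with custom modeling code on the Hugging Face Hub; the repository id and the `generate` call below are assumptions, so please follow the official quickstart for the exact interface:

```python
import torch
from transformers import AutoModelForCausalLM

# Assumed hub id for the univariate pre-trained model; check the model card.
model = AutoModelForCausalLM.from_pretrained(
    "thuml/timer-base-84m", trust_remote_code=True
)

# One univariate lookback series of length 2880: shape (batch, length).
context = torch.randn(1, 2880)

# Assumed generate-style autoregressive forecasting of the next 96 points.
forecast = model.generate(context, max_new_tokens=96)
print(forecast.shape)
```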

Introduction

Timer-XL is a decoder-only Transformer for time series forecasting. It can be used for task-specific training or scalable pre-training, and handles time series of arbitrary length and any number of variables.

💪 We observe performance degradation of encoder-only Transformers on long-context time series.

💡 We propose multivariate next token prediction, a paradigm to uniformly predict univariate and multivariate time series with decoder-only Transformers.

🌟 We pre-train Timer-XL, a long-context version of time-series Transformers (Timer), for zero-shot forecasting.

🏆 Timer-XL achieves state-of-the-art performance as a one-for-all time series forecaster.

What is New

For our previous work, please refer to Time-Series-Transformer (Timer).

Comparison

| Time-Series Transformers | PatchTST | iTransformer | TimeXer | UniTST | Moirai | Timer | Timer-XL (Ours) |
|---|---|---|---|---|---|---|---|
| Intra-Series Modeling | Yes | No | Yes | Yes | Yes | Yes | Yes |
| Inter-Series Modeling | No | Yes | Yes | Yes | Yes | No | Yes |
| Causal Transformer | No | No | No | No | No | Yes | Yes |
| Pre-Trained | No | No | No | No | Yes | Yes | Yes |

Generalize 1D Sequences to 2D Time Series

Multivariate Next Token Prediction

We generalize next-token prediction to multivariate time series. Each token is predicted from the tokens at all preceding positions across multiple variables:
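
As a sketch in our own notation (not verbatim from the paper): with $M$ variables, each split into patch tokens $x_{m,1}, \dots, x_{m,N}$, the token at position $n+1$ of every variable $m$ is predicted from all tokens of all variables up to position $n$:

$$
\hat{x}_{m,\,n+1} = f_\theta\big(\{\, x_{m',\,n'} : 1 \le m' \le M,\ 1 \le n' \le n \,\}\big), \qquad m = 1, \dots, M.
$$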

Universal TimeAttention

We design TimeAttention, a causal self-attention mechanism that enables intra- and inter-series modeling while preserving the causality and flexibility of decoder-only Transformers. It applies to univariate and covariate-informed contexts, enabling unified time series forecasting.
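
A minimal sketch of the masking idea, assuming tokens are flattened variable by variable; the function and names below are ours for illustration, not the repository's exact implementation:

```python
import torch

def time_attention_mask(num_vars: int, num_tokens: int) -> torch.Tensor:
    """Boolean mask where True marks an allowed attention edge."""
    # Causal mask over token positions: position i may attend to j <= i.
    causal = torch.tril(torch.ones(num_tokens, num_tokens))
    # Fully connected variable-dependency mask: every variable may attend to
    # every other variable (it could be restricted in covariate settings).
    inter = torch.ones(num_vars, num_vars)
    # Kronecker product: token (m, i) may attend to token (m', j) iff j <= i,
    # allowing inter-series attention while keeping temporal causality.
    return torch.kron(inter, causal).bool()

# Example: 3 variables with 4 patch tokens each -> a (12, 12) attention mask.
mask = time_attention_mask(num_vars=3, num_tokens=4)
```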

Main Results

Citation

If you find this repo helpful, please cite our paper.

@article{liu2024timer,
  title={Timer-XL: Long-Context Transformers for Unified Time Series Forecasting},
  author={Liu, Yong and Qin, Guo and Huang, Xiangdong and Wang, Jianmin and Long, Mingsheng},
  journal={arXiv preprint arXiv:2410.04803},
  year={2024}
}

Acknowledgment

We appreciate the following GitHub repos for their valuable code and efforts:

Contact

If you have any questions or want to use the code, feel free to contact:
