---
license: mit
---

# Large Time-Series Model (Timer)

Large time-series model introduced in the paper: [Timer: Generative Pre-trained Transformers Are Large Time Series Models](https://arxiv.org/abs/2402.02368). [[Poster]](https://cloud.tsinghua.edu.cn/f/91da8a3d06984f209461/), [[Slides]](https://cloud.tsinghua.edu.cn/f/b766629dbc584a4e8563/).

The base version is pre-trained on 260B time points, which supports zero-shot forecasting ([benchmark](https://cdn-uploads.huggingface.co/production/uploads/64fbe24a2d20ced4e91de38a/n2IW7fTRpuZFMYoPr1h4O.png)) and further adaptation.

See https://github.com/thuml/Large-Time-Series-Model for examples of using this model.
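
As the paper's title suggests, Timer is a generative, GPT-style model that treats segments of consecutive time points (rather than subwords) as tokens and predicts the next segment autoregressively. A minimal sketch of that segment-level tokenization is below; the segment length of 96 and the `segment` helper are illustrative assumptions, not the model's confirmed preprocessing (see the GitHub repository above for the actual pipeline).

```python
import numpy as np

def segment(series: np.ndarray, seg_len: int = 96) -> np.ndarray:
    """Split a 1-D series into non-overlapping segments ("tokens").

    Trailing points that do not fill a whole segment are dropped.
    seg_len=96 is a hypothetical choice for illustration.
    """
    n_tokens = len(series) // seg_len
    return series[: n_tokens * seg_len].reshape(n_tokens, seg_len)

# Toy context series: 2880 points -> 30 segment tokens of length 96.
context = np.sin(np.linspace(0.0, 60.0, 2880))
tokens = segment(context)
print(tokens.shape)  # (30, 96)
```

A decoder-only transformer trained on such tokens can then forecast by generating the next segment(s) from the context, which is what enables zero-shot forecasting on unseen series.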

## Acknowledgments

Timer is pre-trained mostly on publicly available time series datasets from the Internet, contributed by many research teams and data providers. We sincerely thank all the individuals and organizations who shared their data; without their generous contributions, this model would not exist.

* Time-Series-Library (https://github.com/thuml/Time-Series-Library)
* UTSD (https://huggingface.co/datasets/thuml/UTSD)
* LOTSA (https://huggingface.co/datasets/Salesforce/lotsa_data)

## Citation

```bibtex
@inproceedings{liutimer,
  title={Timer: Generative Pre-trained Transformers Are Large Time Series Models},
  author={Liu, Yong and Zhang, Haoran and Li, Chenyu and Huang, Xiangdong and Wang, Jianmin and Long, Mingsheng},
  booktitle={Forty-first International Conference on Machine Learning}
}

@article{liu2024timer,
  title={Timer-XL: Long-Context Transformers for Unified Time Series Forecasting},
  author={Liu, Yong and Qin, Guo and Huang, Xiangdong and Wang, Jianmin and Long, Mingsheng},
  journal={arXiv preprint arXiv:2410.04803},
  year={2024}
}
```