Time Series Foundation Model - TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting
The official code for "TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting (ICLR 2024)".
TEMPO (v1.0) is one of the first open-source time series foundation models for forecasting.
Please try to reproduce the zero-shot experiments on ETTh2 [here on Colab].
We use the following Colab page to demonstrate how to build a custom dataset and run inference directly with our pre-trained foundation model: [Colab]
Please try our foundation model demo [here].
Our pre-trained models are also available on Hugging Face: [Melady/TEMPO].
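For a quick look at the inference workflow, here is a minimal sketch of zero-shot forecasting with the Hugging Face checkpoint. The helper names (`TEMPO.load_pretrained_model`, `model.predict`), the import path, and the checkpoint filename are assumptions based on the Colab demo; please refer to the demo notebook for the exact API:

```python
import numpy as np
import torch

# Assumed import path; check the Colab demo for the current one.
from models.TEMPO import TEMPO

device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

# Pull the checkpoint from the Melady/TEMPO Hugging Face repo
# (filename is an assumption; see the model card).
model = TEMPO.load_pretrained_model(
    device=device,
    repo_id="Melady/TEMPO",
    filename="TEMPO-80M_v1.pth",
    cache_dir="./checkpoints",
)

# Toy univariate input window; replace with your own series.
input_window = np.random.randn(336)

# Zero-shot forecast of the next 96 steps.
forecast = model.predict(input_window, pred_length=96)
```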
conda create -n tempo python=3.8
conda activate tempo
pip install -r requirements.txt
Download the data from [Google Drive] or [Baidu Drive], and place it in the folder `./dataset`. You can also download the STL results from [Google Drive], and place them in the folder `./stl`.
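The STL files above hold precomputed seasonal-trend decompositions of the series. If you want to regenerate them for your own data, here is a minimal sketch using `statsmodels` (an assumption; the repository may ship its own decomposition script), illustrated on the ETTh2 target column:

```python
import pandas as pd
from statsmodels.tsa.seasonal import STL

# Load one channel; "OT" is the standard target column in the ETT files.
df = pd.read_csv("./dataset/ETTh2.csv")
series = df["OT"]

# Seasonal-trend decomposition; period=24 assumes hourly data
# with a daily seasonal cycle.
result = STL(series, period=24).fit()

trend, seasonal, residual = result.trend, result.seasonal, result.resid
print(trend.shape, seasonal.shape, residual.shape)
```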
bash [ecl, etth1, etth2, ettm1, ettm2, traffic, weather].sh
After training, we can test the TEMPO model under the zero-shot setting:
bash [ecl, etth1, etth2, ettm1, ettm2, traffic, weather]_test.sh
You can download the pre-trained model from [Google Drive] and then run the test script for fun.
Here are the prompts used to generate the corresponding textual information of the time series via the [OPENAI ChatGPT-3.5 API]:
The time series data come from the [S&P 500]. Here is the EBITDA case for one company from the dataset:
Example of the generated contextual information for the company marked above:
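Below is a minimal sketch of calling the chat API to produce such contextual summaries. The prompt text is a placeholder (the actual prompts are linked above), and the snippet assumes the current `openai` Python client with `OPENAI_API_KEY` set in the environment:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Placeholder prompt; substitute the actual prompts linked above.
prompt = (
    "Summarize recent business and financial events that could affect "
    "the quarterly EBITDA of the following S&P 500 company: ..."
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```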
You can download the processed data with GPT-2 text embeddings from: [TETS].
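If you would rather build the text embeddings yourself instead of downloading the processed data, here is a sketch using Hugging Face `transformers`; mean-pooling the last hidden state is an assumed choice and may differ from the paper's exact setup:

```python
import torch
from transformers import GPT2Model, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2Model.from_pretrained("gpt2")
model.eval()

text = "Generated contextual information for one company ..."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool token hidden states into one vector (assumed pooling).
embedding = outputs.last_hidden_state.mean(dim=1)
print(embedding.shape)  # torch.Size([1, 768])
```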
Feel free to contact DefuCao@USC.EDU / YanLiu.CS@USC.EDU if you're interested in applying TEMPO to your real-world application.
@inproceedings{cao2024tempo,
title={{TEMPO}: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting},
author={Defu Cao and Furong Jia and Sercan O Arik and Tomas Pfister and Yixiang Zheng and Wen Ye and Yan Liu},
booktitle={The Twelfth International Conference on Learning Representations},
year={2024},
url={https://openreview.net/forum?id=YH5w12OUuU}
}
@article{Jia_Wang_Zheng_Cao_Liu_2024,
title={GPT4MTS: Prompt-based Large Language Model for Multimodal Time-series Forecasting},
volume={38},
url={https://ojs.aaai.org/index.php/AAAI/article/view/30383},
DOI={10.1609/aaai.v38i21.30383},
number={21},
journal={Proceedings of the AAAI Conference on Artificial Intelligence},
author={Jia, Furong and Wang, Kevin and Zheng, Yixiang and Cao, Defu and Liu, Yan},
year={2024}, month={Mar.}, pages={23343-23351}
}