Time Series Library (TSLib)

TSLib is an open-source library for deep learning researchers, especially for deep time series analysis.

We provide a neat code base to evaluate advanced deep time series models or develop your own model; it covers five mainstream tasks: long- and short-term forecasting, imputation, anomaly detection, and classification.

🚩News (2024.04) Many thanks for the great work from frecklebars. The famous sequential model Mamba has been included in our library. See this file, where you need to install mamba_ssm with pip first.

🚩News (2024.03) Given the inconsistent look-back lengths used across papers, we split long-term forecasting in the leaderboard into two categories: Look-Back-96 and Look-Back-Searching. We recommend researchers read TimeMixer, which includes both look-back settings in its experiments for scientific rigor.

🚩News (2023.10) We added an implementation of iTransformer, the state-of-the-art model for long-term forecasting. The official code and complete scripts of iTransformer can be found here.

🚩News (2023.09) We added a detailed tutorial for TimesNet and this library, which is quite friendly to beginners of deep time series analysis.

🚩News (2023.02) We released TSLib as a comprehensive benchmark and code base for time series models, extended from our previous GitHub repository Autoformer.

Leaderboard for Time Series Analysis

As of March 2024, the top three models for the five tasks are:

| Model Ranking | Long-term Forecasting (Look-Back-96) | Long-term Forecasting (Look-Back-Searching) | Short-term Forecasting | Imputation | Classification | Anomaly Detection |
|---|---|---|---|---|---|---|
| 🥇 1st | iTransformer | TimeMixer | TimesNet | TimesNet | TimesNet | TimesNet |
| 🥈 2nd | TimeMixer | PatchTST | Non-stationary Transformer | Non-stationary Transformer | Non-stationary Transformer | FEDformer |
| 🥉 3rd | TimesNet | DLinear | FEDformer | Autoformer | Informer | Autoformer |

Note: We will keep updating this leaderboard. If you have proposed an advanced model, you can send us your paper/code link or raise a pull request, and we will add it to this repo and update the leaderboard as soon as possible.

Models compared in this leaderboard. ☑ means that the model's code has already been included in this repo.

  • TimeMixer - TimeMixer: Decomposable Multiscale Mixing for Time Series Forecasting [ICLR 2024] [Code].
  • TSMixer - TSMixer: An All-MLP Architecture for Time Series Forecasting [arXiv 2023] [Code].
  • iTransformer - iTransformer: Inverted Transformers Are Effective for Time Series Forecasting [ICLR 2024] [Code].
  • PatchTST - A Time Series is Worth 64 Words: Long-term Forecasting with Transformers [ICLR 2023] [Code].
  • TimesNet - TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis [ICLR 2023] [Code].
  • DLinear - Are Transformers Effective for Time Series Forecasting? [AAAI 2023] [Code].
  • LightTS - Less Is More: Fast Multivariate Time Series Forecasting with Light Sampling-oriented MLP Structures [arXiv 2022] [Code].
  • ETSformer - ETSformer: Exponential Smoothing Transformers for Time-series Forecasting [arXiv 2022] [Code].
  • Non-stationary Transformer - Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting [NeurIPS 2022] [Code].
  • FEDformer - FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting [ICML 2022] [Code].
  • Pyraformer - Pyraformer: Low-complexity Pyramidal Attention for Long-range Time Series Modeling and Forecasting [ICLR 2022] [Code].
  • Autoformer - Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting [NeurIPS 2021] [Code].
  • Informer - Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting [AAAI 2021] [Code].
  • Reformer - Reformer: The Efficient Transformer [ICLR 2020] [Code].
  • Transformer - Attention is All You Need [NeurIPS 2017] [Code].

See our latest paper [TimesNet] for the comprehensive benchmark. We will release a real-time updated online version soon.

Newly added baselines. We will add them to the leaderboard after a comprehensive evaluation.

  • Mamba - Mamba: Linear-Time Sequence Modeling with Selective State Spaces [arXiv 2023] [Code].
  • SegRNN - SegRNN: Segment Recurrent Neural Network for Long-Term Time Series Forecasting [arXiv 2023] [Code].
  • Koopa - Koopa: Learning Non-stationary Time Series Dynamics with Koopman Predictors [NeurIPS 2023] [Code].
  • FreTS - Frequency-domain MLPs are More Effective Learners in Time Series Forecasting [NeurIPS 2023] [Code].
  • TiDE - Long-term Forecasting with TiDE: Time-series Dense Encoder [arXiv 2023] [Code].
  • FiLM - FiLM: Frequency improved Legendre Memory Model for Long-term Time Series Forecasting [NeurIPS 2022] [Code].
  • MICN - MICN: Multi-scale Local and Global Context Modeling for Long-term Series Forecasting [ICLR 2023] [Code].
  • Crossformer - Crossformer: Transformer Utilizing Cross-Dimension Dependency for Multivariate Time Series Forecasting [ICLR 2023] [Code].
  • TFT - Temporal Fusion Transformers for Interpretable Multi-horizon Time Series Forecasting [arXiv 2019] [Code].

Usage

  1. Install Python 3.8. For convenience, execute the following command.
pip install -r requirements.txt
  2. Prepare Data. You can obtain the well pre-processed datasets from [Google Drive] or [Baidu Drive], then place the downloaded data in the folder ./dataset (see the sanity-check sketch after this list). Here is a summary of supported datasets.

  3. Train and evaluate the model. We provide experiment scripts for all benchmarks under the folder ./scripts/. You can reproduce the experiment results as in the following examples:
# long-term forecast
bash ./scripts/long_term_forecast/ETT_script/TimesNet_ETTh1.sh
# short-term forecast
bash ./scripts/short_term_forecast/TimesNet_M4.sh
# imputation
bash ./scripts/imputation/ETT_script/TimesNet_ETTh1.sh
# anomaly detection
bash ./scripts/anomaly_detection/PSM/TimesNet.sh
# classification
bash ./scripts/classification/TimesNet.sh
  4. Develop your own model (a minimal sketch follows this list).
  • Add the model file to the folder ./models. You can follow ./models/Transformer.py.
  • Include the newly added model in Exp_Basic.model_dict of ./exp/exp_basic.py.
  • Create the corresponding scripts under the folder ./scripts.
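
For step 2, a quick way to confirm that the downloaded data landed where the scripts expect it is the sanity check below. This is a minimal sketch assuming the ETT subset sits at ./dataset/ETT-small/ETTh1.csv (the layout the ETT scripts use); adjust the path for whichever dataset you downloaded.

import pandas as pd

# Path assumed from the ETT experiment scripts; adjust for your dataset.
df = pd.read_csv("./dataset/ETT-small/ETTh1.csv")
# The data loaders expect a 'date' column followed by the value columns.
print(df.columns.tolist())
print(df.shape)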
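
For step 4, the following is a minimal sketch of what a new model file could look like. The class name Model, the configs constructor argument, and the forward signature mirror the conventions of ./models/Transformer.py; the file name MyLinear.py, the configs fields used (seq_len, pred_len), and the linear-projection body are illustrative assumptions, not part of this repo.

# Hypothetical ./models/MyLinear.py -- a sketch, not an official model.
import torch.nn as nn

class Model(nn.Module):
    def __init__(self, configs):
        super().__init__()
        # Toy forecaster: one linear map from the look-back window (seq_len)
        # to the horizon (pred_len), applied independently to each variate.
        self.projection = nn.Linear(configs.seq_len, configs.pred_len)

    def forward(self, x_enc, x_mark_enc, x_dec, x_mark_dec, mask=None):
        # x_enc: [batch, seq_len, n_vars] -> output: [batch, pred_len, n_vars]
        return self.projection(x_enc.permute(0, 2, 1)).permute(0, 2, 1)

Registering it is then a one-line change: add an entry such as 'MyLinear': MyLinear to Exp_Basic.model_dict in ./exp/exp_basic.py, mirroring the existing entries.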

Note: The original code for the classification task can be found here. It is hard to fuse all five tasks into one library, and we are still working on it.

Citation

If you find this repo useful, please cite our paper.

@inproceedings{wu2023timesnet,
  title={TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis},
  author={Haixu Wu and Tengge Hu and Yong Liu and Hang Zhou and Jianmin Wang and Mingsheng Long},
  booktitle={International Conference on Learning Representations},
  year={2023},
}

Contact

If you have any questions or suggestions, feel free to contact our maintenance team, or describe them in Issues.

Acknowledgement

This project is supported by the National Key R&D Program of China (2021YFB1715200).

This library is constructed based on several existing open-source repos, and all the experiment datasets we use are public.
