
Selective Learning for Deep Time Series Forecasting


[Figure: img.png]

📌 Abstract

Selective Learning is a novel and powerful training strategy designed to make deep time series forecasting (TSF) models more robust against overfitting. Instead of learning uniformly from all timesteps, it focuses on reliable timesteps and filters out uncertain or anomalous ones. This is achieved through two complementary masks: an uncertainty mask based on residual entropy and an anomaly mask based on residual lower-bound estimation. Experiments on eight benchmark datasets show that Selective Learning consistently improves performance, reducing MSE by 37.4% on Informer, 8.4% on TimesNet, and 6.5% on iTransformer.
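
To make the idea concrete, here is a minimal, illustrative sketch in PyTorch of a ratio-based selective loss. It is not the code in this repository: the per-timestep uncertainty and anomaly scores are assumed to be supplied by the caller (the paper derives them from residual entropy and residual lower-bound estimation), and timesteps whose scores fall in the top r_u or r_a fraction are simply excluded from the MSE.

```python
import torch

def selective_mse(pred, target, uncertainty, anomaly_score, r_u=0.3, r_a=0.3):
    """Masked MSE over reliable timesteps only.

    All tensors share the same shape, e.g. [batch, horizon, num_series].
    `uncertainty` and `anomaly_score` are per-timestep scores supplied by the
    caller; the paper computes them from residual entropy and residual
    lower-bound estimation, respectively.
    """
    def keep_mask(score, ratio):
        # Drop the `ratio` fraction of timesteps with the highest score.
        k = int(ratio * score.numel())
        if k <= 0:
            return torch.ones_like(score, dtype=torch.bool)
        threshold = score.flatten().kthvalue(score.numel() - k).values
        return score <= threshold

    mask = keep_mask(uncertainty, r_u) & keep_mask(anomaly_score, r_a)
    # Standard MSE computed only on the timesteps kept by both masks.
    return ((pred - target) ** 2)[mask].mean()

# Example with dummy tensors of shape [batch, horizon, num_series].
B, H, N = 32, 96, 7
pred, target = torch.randn(B, H, N), torch.randn(B, H, N)
scores = (pred - target).abs()  # placeholder scores, for illustration only
loss = selective_mse(pred, target, uncertainty=scores, anomaly_score=scores)
```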

🛠️ Usage

  1. Environment and Dependencies
    This project is built on the BasicTS time series benchmarking library.
    Download the BasicTS wheel (version ≥ 1.0.0) from this release link and install it with pip:

     pip install basicts-1.0-py3-none-any.whl
  2. Prepare Datasets
    All datasets used in the paper are natively supported by BasicTS.
    Please refer to the dataset documentation for instructions on downloading and using them.

  3. Train the Estimation Model
    For each dataset, an estimation model must be trained first to guide anomaly masking.
    For example, you can train a DLinear estimation model on the ETTh1 dataset as shown in train_est_model in the demo; the logs and checkpoints will be saved under project_root_path/checkpoints/DLinear/.

  4. Train the Main Model with Selective Learning
    BasicTS now natively supports Selective Learning.
    Simply add the SelectiveLearning callback to your configuration to enable it during training.

     from basicts.runners.callback import SelectiveLearning
     # DLinear, DLinearConfig, and BasicTSForecastingConfig are also provided by
     # BasicTS; see the demo for the exact import paths.

     # Estimation model trained in step 3.
     estimator = DLinear(DLinearConfig(...))

     sl_callback = SelectiveLearning(
         r_u=0.3,                              # uncertainty masking ratio
         r_a=0.3,                              # anomaly masking ratio
         estimator=estimator,                  # estimation model
         ckpt_path="checkpoints/DLinear/...",  # .pt checkpoint file from step 3
     )

     config = BasicTSForecastingConfig(
         # ..., your other config options
         callbacks=[sl_callback],              # add the selective learning callback
     )

    The function train_main_model in the demo shows how to train iTransformer with Selective Learning; a small helper for locating the step-3 checkpoint is sketched below.
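
The ckpt_path argument above must point to the .pt checkpoint produced in step 3. If you prefer not to hard-code the path, a small helper such as the one below can pick up the newest checkpoint automatically; this is a convenience sketch for this README, not part of BasicTS, and the checkpoints/DLinear directory simply follows the example above.

```python
# Convenience sketch (not part of BasicTS): find the newest .pt checkpoint
# written by the estimation-model run in step 3 so it can be passed as
# `ckpt_path` to the SelectiveLearning callback in step 4.
from pathlib import Path

def latest_checkpoint(ckpt_dir: str = "checkpoints/DLinear") -> str:
    """Return the most recently modified .pt file under `ckpt_dir` (recursive)."""
    candidates = sorted(Path(ckpt_dir).rglob("*.pt"), key=lambda p: p.stat().st_mtime)
    if not candidates:
        raise FileNotFoundError(f"No .pt checkpoints under {ckpt_dir}; run step 3 first.")
    return str(candidates[-1])

# Usage: SelectiveLearning(r_u=0.3, r_a=0.3, estimator=estimator,
#                          ckpt_path=latest_checkpoint())
```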

📈 Results

[Results figure: img.png]

🔗 Citation

🔥🔥🔥 If you find this repository useful, please consider citing our NeurIPS'25 paper! 🔥🔥🔥

@misc{fu2025selectivelearningdeeptime,
      title={Selective Learning for Deep Time Series Forecasting}, 
      author={Yisong Fu and Zezhi Shao and Chengqing Yu and Yujie Li and Zhulin An and Qi Wang and Yongjun Xu and Fei Wang},
      year={2025},
      eprint={2510.25207},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2510.25207}, 
}
