CAPS: Unifying Attention, Recurrence, and Alignment in Transformer-based Time Series Forecasting

Official implementation of CAPS (Clock-weighted Aggregation with Prefix-Products and Softmax), as described in the paper.

Installation

Install PyTorch first, then:

pip install -r requirements.txt
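
For example, in a fresh virtual environment (a minimal sketch; the repository does not pin a PyTorch version here, so adjust the install for your CUDA setup):

python -m venv .venv && source .venv/bin/activate
pip install torch
pip install -r requirements.txt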

Datasets

All datasets are expected in the dataset/ directory (an example layout is sketched after the table). The 10 datasets used in the paper are:

Dataset   Channels
ETTm1     7
ETTm2     7
ETTh1     7
ETTh2     7
Weather   21
Solar     137
ECL       321
PEMS03    307
PEMS04    307
PEMS08    170

All of these can be obtained from the Time-Series-Library repository (https://github.com/thuml/Time-Series-Library).
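
A minimal sketch of one way to stage the data, assuming flat CSV filenames (the names below are hypothetical; check the data-loader paths used in caps.sh for the exact layout):

mkdir -p dataset
# Hypothetical filenames, e.g. as distributed with Time-Series-Library:
# dataset/ETTm1.csv, dataset/ETTh1.csv, dataset/weather.csv, ...
cp /path/to/downloads/*.csv dataset/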

Running Experiments

Full CAPS model:

bash caps.sh

LinAttn + RoPE ablation baseline:

bash caps_baseline.sh
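
To also keep a local copy of the console output, a generic shell pattern (not something the scripts themselves require) is:

mkdir -p logs
bash caps.sh 2>&1 | tee logs/caps_run.log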

Monitoring

Metrics are logged to Weights & Biases. Set WANDB_MODE=offline to log locally without syncing, or WANDB_MODE=disabled to turn W&B logging off entirely.
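
For example, to run the full model with W&B logging disabled:

WANDB_MODE=disabled bash caps.sh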
