# S-LAM

Local Attention Mechanism (LAM) for long-sequence time series forecasting with Transformer architectures.
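For intuition, a minimal NumPy sketch of the general sliding-window (local) attention idea: each query position attends only to keys within a fixed window, so the effective cost scales with the window size rather than the full sequence length. This is a generic illustration, not the LAM implementation from the paper; the function name and `window` parameter are assumptions.

```python
import numpy as np

def local_attention(q, k, v, window=4):
    """Generic windowed self-attention sketch (not the paper's LAM):
    each query attends only to keys within `window` steps either side."""
    T, d = q.shape
    scores = q @ k.T / np.sqrt(d)  # (T, T) scaled dot-product scores
    # Mask positions outside the local window before the softmax.
    idx = np.arange(T)
    scores[np.abs(idx[:, None] - idx[None, :]) > window] = -np.inf
    # Numerically stable softmax over the unmasked scores.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v  # (T, d) attended values
```

With `window` at least the sequence length, this reduces to ordinary full self-attention.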

## Citation

```bibtex
@article{aguilera2024local,
  title={Local Attention Mechanism: Boosting the Transformer Architecture for Long-Sequence Time Series Forecasting},
  author={Aguilera-Martos, Ignacio and Herrera-Poyatos, Andr{\'e}s and Luengo, Juli{\'a}n and Herrera, Francisco},
  journal={arXiv preprint arXiv:2410.03805},
  year={2024}
}
```