Non-stationary Transformers: Rethinking the Stationarity in Time Series Forecasting

Yong Liu, Haixu Wu, Jianmin Wang, Mingsheng Long

2022 · DOI: 10.48550/arXiv.2205.14415
arXiv.org · Cited 70 times

TLDR

The Non-stationary Transformers framework consistently boosts mainstream Transformers by a large margin, making them state-of-the-art in time series forecasting, while simultaneously mitigating the over-stationarization problem that limits the predictive capability of deep models.
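The framework's stationarization side amounts to normalizing each input series before the model and restoring its statistics on the output. A minimal sketch of that wrapper, assuming simple per-series mean/std normalization (function names are illustrative, not from the paper, which additionally uses De-stationary Attention inside the model):

```python
import numpy as np

def stationarize(x):
    """Normalize each series to zero mean and unit variance.

    x: array of shape (batch, length). Returns the normalized series
    plus the statistics needed to undo the transform later.
    """
    mu = x.mean(axis=-1, keepdims=True)
    sigma = x.std(axis=-1, keepdims=True) + 1e-5  # avoid division by zero
    return (x - mu) / sigma, mu, sigma

def destationarize(y, mu, sigma):
    """Restore the original scale and offset on the model's output."""
    return y * sigma + mu

# Round trip: normalize, (model would run here), then de-normalize.
x = np.array([[1.0, 2.0, 3.0, 4.0]])
z, mu, sigma = stationarize(x)
x_back = destationarize(z, mu, sigma)
```

The de-normalization step is what prevents the forecaster from emitting only "stationarized" outputs, while the normalized inputs keep training stable across series with very different scales.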