Pyraformer: Low-Complexity Pyramidal Attention for Long-Range Time Series Modeling and Forecasting

Shizhan Liu, Hang Yu, Cong Liao, Jianguo Li, Weiyao Lin, Alex X. Liu, Schahram Dustdar

2022 · DBLP: conf/iclr/LiuYLLLLD22
International Conference on Learning Representations · 582 Citations

TLDR

The paper introduces the pyramidal attention module (PAM), in which an inter-scale tree structure summarizes features at different resolutions and intra-scale neighboring connections model temporal dependencies of different ranges.

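The full construction is given in the paper itself; as a rough illustration of the connectivity the TLDR describes, the sketch below builds a boolean attention mask in which each node attends to its intra-scale neighbors, its parent at the next-coarser scale, and its children at the next-finer scale. The function name pyramidal_attention_mask and the parameters window, c, and num_scales are illustrative assumptions, not the paper's API or default settings.

```python
import numpy as np

def pyramidal_attention_mask(seq_len, window=1, c=4, num_scales=3):
    """Sketch of a pyramidal attention mask (hypothetical helper).

    Each node attends to:
      - intra-scale neighbors within `window` positions,
      - its parent node at the next-coarser scale,
      - its child nodes at the next-finer scale.
    `c` is an assumed pooling factor between adjacent scales.
    """
    # Nodes per scale: seq_len, seq_len // c, seq_len // c^2, ...
    sizes = [max(seq_len // (c ** s), 1) for s in range(num_scales)]
    offsets = np.cumsum([0] + sizes)   # start index of each scale
    n = int(offsets[-1])               # total nodes across all scales
    mask = np.zeros((n, n), dtype=bool)

    for s, size in enumerate(sizes):
        base = offsets[s]
        for i in range(size):
            # Intra-scale neighboring connections.
            lo, hi = max(0, i - window), min(size, i + window + 1)
            mask[base + i, base + lo:base + hi] = True
            # Inter-scale connection to the parent at the coarser scale.
            if s + 1 < num_scales:
                parent = min(i // c, sizes[s + 1] - 1)
                mask[base + i, offsets[s + 1] + parent] = True
            # Inter-scale connections to children at the finer scale.
            if s > 0:
                child_lo = i * c
                child_hi = min((i + 1) * c, sizes[s - 1])
                mask[base + i, offsets[s - 1] + child_lo:
                               offsets[s - 1] + child_hi] = True
    return mask

if __name__ == "__main__":
    m = pyramidal_attention_mask(seq_len=16, window=1, c=4, num_scales=3)
    # Each query row attends to only a small, bounded set of keys, which is
    # what keeps the attention cost roughly linear in sequence length.
    print(m.shape, m.sum(axis=1))
```

Under these assumptions, the number of connections per node stays bounded as the sequence grows, which is the intuition behind the low-complexity claim in the title; the paper's actual module also includes a coarser-scale construction step that this sketch omits.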