RoFormer: Enhanced Transformer with Rotary Position Embedding

Jianlin Su, Yu Lu, 2 more authors, Yunfeng Liu

2021 · DOI: 10.1016/j.neucom.2023.127063
Neurocomputing · 2,555 citations

TLDR

A novel method named Rotary Position Embedding (RoPE) is proposed to effectively leverage positional information in transformer-based language models. RoPE enables valuable properties, including flexibility of sequence length, decaying inter-token dependency with increasing relative distance, and the capability of equipping linear self-attention with relative position encoding.
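To make the relative-position property concrete, here is a minimal NumPy sketch of rotary embedding (not the paper's reference code): each pair of vector dimensions is rotated by a position-dependent angle, and the resulting query-key dot product depends only on the relative offset between positions. The function name `rope` and the frequency base `10000.0` follow common convention but are assumptions of this illustration.

```python
import numpy as np

def rope(x, pos, base=10000.0):
    """Illustrative rotary position embedding for a vector x at position pos.
    x has even dimension d; each pair (x[2i], x[2i+1]) is rotated by the
    angle pos * base**(-2i/d)."""
    d = x.shape[-1]
    theta = base ** (-np.arange(0, d, 2) / d)  # per-pair rotation frequencies
    ang = pos * theta
    cos, sin = np.cos(ang), np.sin(ang)
    x1, x2 = x[0::2], x[1::2]
    out = np.empty_like(x)
    out[0::2] = x1 * cos - x2 * sin  # standard 2-D rotation per pair
    out[1::2] = x1 * sin + x2 * cos
    return out

rng = np.random.default_rng(0)
q, k = rng.standard_normal(8), rng.standard_normal(8)
# The attention score <rope(q, m), rope(k, n)> depends only on m - n,
# so shifting both positions by the same amount leaves it unchanged:
s1 = rope(q, 3) @ rope(k, 1)    # relative offset 2
s2 = rope(q, 12) @ rope(k, 10)  # same relative offset 2
print(np.isclose(s1, s2))
```

Because each rotation is an orthogonal transform, the product of a rotated query and key reduces to a rotation by the angle difference, which is what gives RoPE its relative-position behavior without modifying the attention mechanism itself.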