Conjecture: RoPE structures keys and queries to enable better retrieval
Establish whether Rotary Position Embeddings (RoPE) structure the key and query vectors in Transformer attention in a way that conditions them for improved retrieval, relative to attention with no positional embedding (NoPE).
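For context on what the conjecture refers to, below is a minimal sketch of standard RoPE (Su et al., 2021) applied to query and key vectors. This is not the LUCID implementation; shapes, names, and the `rope` helper are illustrative assumptions. The sketch highlights the property the conjecture builds on: after RoPE, the query-key dot product depends only on relative position.

```python
import torch

def rope(x: torch.Tensor, base: float = 10000.0) -> torch.Tensor:
    """Apply Rotary Position Embeddings to a (seq_len, head_dim) tensor,
    rotating consecutive dimension pairs by a position-dependent angle.
    head_dim must be even. Illustrative sketch, not the LUCID code."""
    seq_len, dim = x.shape
    # Per-pair rotation frequencies, as in the original RoPE formulation.
    inv_freq = base ** (-torch.arange(0, dim, 2, dtype=torch.float32) / dim)
    angles = torch.arange(seq_len, dtype=torch.float32)[:, None] * inv_freq
    cos, sin = angles.cos(), angles.sin()
    x1, x2 = x[:, 0::2], x[:, 1::2]  # even / odd halves of each pair
    out = torch.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin
    out[:, 1::2] = x1 * sin + x2 * cos
    return out

# A query at position m and a key at position n yield a score that is a
# function of (m - n) only; this relative-position structure in the keys
# and queries is what the conjecture credits for better retrieval.
q = torch.randn(8, 64)  # (seq_len, head_dim), arbitrary example shapes
k = torch.randn(8, 64)
scores = rope(q) @ rope(k).T  # attention logits with RoPE applied
```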
References
We conjecture that RoPE structures the keys and queries, thereby conditioning them for better retrieval.
— LUCID: Attention with Preconditioned Representations
(Duvvuri et al., arXiv:2602.10410, 11 Feb 2026), Appendix A.5, Ablations (RoPE paragraph)