RATTENTION: Towards the Minimal Sliding Window Size in Local-Global Attention Models