Mar 9, 2024

Unveiling Infinite Context Windows: Leveraging LLMs in Streaming Apps with Attention Sinks

Posted in category: futurism



By keeping the key/value entries of a few initial "attention sink" tokens in the cache alongside a sliding window of recent tokens, LLMs trained with a finite attention window can be extended to effectively infinite sequence lengths without any fine-tuning.
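In practice this amounts to a KV-cache eviction policy: retain the first few sink tokens, retain the most recent window of tokens, and discard everything in between, so the cache stays bounded no matter how long the stream runs. The sketch below illustrates that policy under stated assumptions; the names (`evict_kv_cache`, `num_sink_tokens`, `window_size`) are illustrative and not tied to any particular library's API.

```python
from typing import List, Tuple

# Illustrative stand-in: one (key, value) pair cached per generated token.
KVEntry = Tuple[list, list]

def evict_kv_cache(cache: List[KVEntry],
                   num_sink_tokens: int = 4,
                   window_size: int = 1020) -> List[KVEntry]:
    """Attention-sink-style eviction (a minimal sketch, not a library API).

    Keeps the KV entries of the first `num_sink_tokens` tokens (the
    "attention sinks") plus the `window_size` most recent tokens, and
    drops everything in between, so the cache never exceeds
    num_sink_tokens + window_size entries regardless of stream length.
    """
    if len(cache) <= num_sink_tokens + window_size:
        return cache  # nothing to evict yet
    return cache[:num_sink_tokens] + cache[-window_size:]

# Toy usage: stream 10,000 tokens through a bounded cache.
cache: List[KVEntry] = []
for t in range(10_000):
    cache.append(([t], [t]))   # placeholder for this token's key/value tensors
    cache = evict_kv_cache(cache)

print(len(cache))  # 1024 == 4 sink tokens + 1020 recent tokens
```

The reason the sink tokens are kept rather than simply sliding the window is that, in this line of work, early tokens absorb a large share of attention; evicting them destabilizes generation, while preserving them keeps language modeling stable as the stream grows far beyond the training context.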
