Infini-attention: Google's approach to infinite context in LLMs

Can LLMs have infinite context? Researchers from Google say yes. A new paper proposes Infini-attention, a technique that lets LLMs scale to extremely long inputs with bounded memory and compute. How Infini-attention works:
• has local attention within each segment, like any standard transformer
• has global attention via a compressive memory built from past segments
• combines the local and global attention outputs with a learned gate (see the sketch after this list)
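
To make the three steps concrete, here is a minimal single-head NumPy sketch of the mechanism as the paper describes it, assuming the simpler linear memory update rather than the delta-rule variant; the function name `infini_attention_segment`, the toy dimensions, and the epsilon guard on the first segment are illustrative choices, not from the paper:

```python
import numpy as np

def elu_plus_one(x):
    # sigma(x) = ELU(x) + 1: keeps memory reads/writes positive so the
    # retrieval normalization below stays well behaved
    return np.where(x > 0, x + 1.0, np.exp(x))

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def infini_attention_segment(q, k, v, memory, z, beta):
    """One head, one segment: q, k, v have shape (seg_len, d_head).

    memory (d_head, d_head) and z (d_head,) carry compressed state
    from all earlier segments; beta is the learnable gate scalar.
    """
    seg_len, d = q.shape

    # Local path: ordinary causal dot-product attention inside the segment.
    scores = q @ k.T / np.sqrt(d)
    causal = np.tril(np.ones((seg_len, seg_len), dtype=bool))
    a_local = softmax(np.where(causal, scores, -np.inf)) @ v

    # Global path: read from the compressive memory (linear-attention
    # style); the epsilon avoids 0/0 on the very first segment.
    sq = elu_plus_one(q)
    a_mem = (sq @ memory) / ((sq @ z)[:, None] + 1e-6)

    # Write this segment's keys/values into the memory, then update
    # the normalizer.
    sk = elu_plus_one(k)
    memory = memory + sk.T @ v
    z = z + sk.sum(axis=0)

    # Learned gate blends the global (memory) and local outputs.
    g = 1.0 / (1.0 + np.exp(-beta))
    return g * a_mem + (1.0 - g) * a_local, memory, z

# Toy usage: stream segments through the head. The carried state stays
# a fixed (d_head x d_head) matrix no matter how many segments arrive.
rng = np.random.default_rng(0)
d_head, seg_len = 16, 8
memory, z, beta = np.zeros((d_head, d_head)), np.zeros(d_head), 0.0
for _ in range(4):
    q, k, v = (rng.standard_normal((seg_len, d_head)) for _ in range(3))
    out, memory, z = infini_attention_segment(q, k, v, memory, z, beta)
print(out.shape)  # (8, 16)
```

Because the memory is a fixed-size matrix, the cost per segment is constant regardless of how much context has been seen, which is what grounds the "infinite context" claim.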