AI Dynamics

Global AI News Aggregator

Infini-attention: Google’s approach to infinite context in LLMs

Can LLMs have infinite context? Researchers from Google say yes. A new paper proposes Infini-attention, a mechanism that lets LLMs handle effectively unbounded context with bounded memory and compute. How Infini-attention works:
• local attention within each segment, like any standard transformer
• global attention over past segments via a compressive memory
• combines the local and memory outputs with a learned gate (a minimal sketch follows below)
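
To make the mechanism concrete, here is a minimal single-head sketch in PyTorch of the segment-level loop described above: local softmax attention, a linear-attention read from a compressive memory, a learned gate blending the two, and a memory update. The function name, toy dimensions, and the omission of projection layers and the paper's delta-rule variant are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def elu_plus_one(x):
    # Positive feature map used for the linear (compressive) memory.
    return F.elu(x) + 1.0

def infini_attention_segment(q, k, v, M, z, beta):
    """One segment step (illustrative sketch, not the authors' code).

    q, k, v: (N, d) tensors for the current segment
    M:       (d, d) compressive memory carried across segments
    z:       (d,)   normalization term carried across segments
    beta:    scalar gate parameter (learned)
    """
    d = q.shape[-1]

    # 1) Local attention: ordinary scaled dot-product within the segment.
    scores = q @ k.T / d ** 0.5
    a_local = F.softmax(scores, dim=-1) @ v

    # 2) Memory read: linear attention against the compressed past.
    sq = elu_plus_one(q)
    a_mem = (sq @ M) / (sq @ z).clamp(min=1e-6).unsqueeze(-1)

    # 3) Gate: blend memory and local outputs with a learned scalar.
    g = torch.sigmoid(beta)
    out = g * a_mem + (1.0 - g) * a_local

    # 4) Memory update: add this segment's key-value associations.
    sk = elu_plus_one(k)
    M = M + sk.T @ v
    z = z + sk.sum(dim=0)
    return out, M, z

# Toy usage: iterate over segments, carrying M and z forward.
# (q = k = v here for brevity; a real layer would apply projections.)
N, d = 64, 32
M, z = torch.zeros(d, d), torch.zeros(d)
beta = torch.tensor(0.0)
for seg in torch.randn(4, N, d):
    out, M, z = infini_attention_segment(seg, seg, seg, M, z, beta)
```

Note that M and z have a fixed size no matter how many segments have been processed, so memory cost stays constant as the context grows, which is what makes the "infinite" context claim tractable.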

→ View original post on X: @swyx
