AI Dynamics

Global AI News Aggregator

Google Gemma 10M Context Window: Local LLM Breakthrough

Google Gemma extended to a 10M-token context window. Key features:
• 1,250x the context length of the base Gemma model
• Requires less than 32 GB of memory
• Uses Infini-attention plus activation compression

Imagine running a 10-million-token context LLM locally on your computer with @ollama or @LMStudioAI.
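Infini-attention keeps memory bounded by compressing each processed segment's key-value pairs into a fixed-size associative matrix, which later segments query via linear attention. Below is a minimal NumPy sketch of that compressive-memory mechanism (not Google's implementation); the ELU+1 nonlinearity, the update rule `M += σ(K)ᵀV`, and the small epsilon for numerical stability are assumptions based on the published Infini-attention formulation:

```python
import numpy as np

def elu_plus_one(x):
    # sigma(x) = ELU(x) + 1, a common nonlinearity for linear attention
    return np.where(x > 0, x + 1.0, np.exp(x))

def infini_attention_memory(segments, d):
    """Process segments of (K, V, Q), each of shape (seg_len, d).

    Memory stays a fixed (d, d) matrix plus a (d,) normalizer no
    matter how many segments stream through -- this is what keeps
    RAM constant as context length grows.
    """
    M = np.zeros((d, d))   # compressive associative memory
    z = np.zeros(d)        # normalization term
    outputs = []
    for K, V, Q in segments:
        sigma_q = elu_plus_one(Q)                          # (seg_len, d)
        # Retrieve content written by all previous segments
        A_mem = (sigma_q @ M) / (sigma_q @ z + 1e-8)[:, None]
        outputs.append(A_mem)
        # Compress this segment's KV pairs into the fixed-size memory
        sigma_k = elu_plus_one(K)
        M = M + sigma_k.T @ V
        z = z + sigma_k.sum(axis=0)
    return outputs
```

The key point the bullet list highlights: because `M` never grows with sequence length, memory use is flat, which is why a 10M-token context can fit in under 32 GB on local hardware.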

→ View original post on X — @saboo_shubham_
