AI Dynamics

Global AI News Aggregator

Mistral AI Releases Small 3.1 Multilingual Multimodal LLM

Mistral AI released Small 3.1, a state-of-the-art multilingual and multimodal LLM:

- 24B parameters (can run on a laptop)
- 128k-token context window
- Outperforms Gemma 3 and GPT-4o Mini on most benchmarks
- Inference speed of 150 tokens/sec
- Open source under the Apache 2.0 license
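A rough back-of-envelope check on the "runs on a laptop" claim: the sketch below estimates the weight memory of a 24B-parameter model at common precisions. These figures are our own estimates, not from the announcement, and cover weights only; real inference also needs memory for the KV cache and activations.

```python
# Approximate weight-memory footprint of a 24B-parameter model.
# Weights only; KV cache and activations add to this at inference time.
PARAMS = 24e9  # 24 billion parameters

BYTES_PER_PARAM = {"fp16": 2, "int8": 1, "int4": 0.5}

def weight_gb(precision: str, params: float = PARAMS) -> float:
    """Approximate weight memory in GB at the given precision."""
    return params * BYTES_PER_PARAM[precision] / 1e9

for p in BYTES_PER_PARAM:
    print(f"{p}: ~{weight_gb(p):.0f} GB")  # fp16: ~48 GB, int8: ~24 GB, int4: ~12 GB
```

At 4-bit quantization the weights fit in roughly 12 GB, which is why a 24B model is plausible on a well-equipped laptop.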

→ View the original post on X: @rowancheung
