AI Dynamics

Global AI News Aggregator

Mistral Releases Mixtral 8×22B Frontier LLM

French AI startup Mistral has quietly released Mixtral 8×22B, a powerful new frontier LLM, announced on X as a 281 GB file available for download. The model features a 65,000-token context window and 176B parameters, and is expected to surpass the previous Mixtral.
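The "8×22B" name implies the naive parameter arithmetic below: eight experts of 22B parameters each. A minimal sketch, noting that this is only an upper-bound estimate, since mixture-of-experts models typically share some layers across experts:

```python
# Naive parameter estimate for a mixture-of-experts model named "8x22B":
# 8 experts times 22B parameters each. Real MoE models share attention
# layers across experts, so the actual total can be lower than this bound.
num_experts = 8
params_per_expert_b = 22  # billions of parameters per expert

naive_total_b = num_experts * params_per_expert_b
print(f"{naive_total_b}B")  # → 176B
```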

→ View original post on X — @rowancheung
