French AI startup Mistral released Mixtral 8×22B, a powerful new frontier LLM, dropping it quietly via a 281GB file posted on X and made available for download. The model features a 65,000-token context window and 176B parameters, and is expected to surpass the previous Mixtral.
Mistral Releases Mixtral 8×22B Frontier LLM