Databricks just announced DBRX, a new standard for open-sourced LLMs. DBRX outperforms Mixtral MoE, Llama-2 70B, and Grok-1 in language understanding, programming, and math. Most surprising, the model was reportedly trained in just 2 months for $10M.
— Rowan Cheung (@rowancheung) March 28, 2024