AI Dynamics

Global AI News Aggregator

Databricks DBRX: New Open-Source LLM Standard Outperforming Competitors

Databricks just announced DBRX, a new open-source LLM that sets a new standard, outperforming Mixtral MoE, Llama-2 70B, and Grok-1 on language understanding, programming, and math benchmarks. Most surprising: the model was reportedly trained in just two months for about $10M.

→ View original post on X — @rowancheung
