AI Dynamics

Global AI News Aggregator

Falcon 3 Language Models Released: 1B to 10B Parameters

Falcon 3 is out! 1B, 3B, 7B, and 10B (Base + Instruct) plus a 7B Mamba variant, trained on 14 trillion tokens and Apache 2.0 licensed!
> 1B-Base surpasses SmolLM2-1.7B and matches gemma-2-2b
> 3B-Base outperforms larger models like Llama-3.1-8B and Minitron-4B-Base
> 7B-Base is on par with […]

→ View original post on X — @reach_vb
