LFM2 gets a new 2.6B-parameter model! With 30 layers mixing convolution and attention, it's a tiny powerhouse that outperforms other 3B-class models. Really proud of this one; it handles surprisingly complex tasks for such a small size!
LFM2 Releases Efficient 2.6B Model Outperforming Larger Models