AI Dynamics

Global AI News Aggregator

Mistral 7B Extended Context Training and Dataset Improvements

On the model side, longer context length for the 7B! If you could train over the new Mistral 7B w/ 32K that’d be awesome. As for the dataset, I haven’t found many issues with it 🙂

→ View original post on X: @mattshumer_
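Training at 32K context is memory-hungry, and the KV cache is a big part of why. A rough back-of-the-envelope sketch, using Mistral 7B's published shape (32 layers, 8 KV heads under grouped-query attention, head dimension 128, fp16), shows the per-sequence KV cache alone reaching about 4 GiB at 32K tokens:

```python
# Rough fp16 KV-cache size for Mistral 7B at a given context length.
# Model shape per the Mistral 7B release: 32 layers, 8 KV heads (GQA),
# head dimension 128. K and V are each cached, 2 bytes per fp16 value.

N_LAYERS = 32
N_KV_HEADS = 8
HEAD_DIM = 128
BYTES_FP16 = 2

def kv_cache_bytes(context_len: int, batch: int = 1) -> int:
    """Bytes of KV cache held for `context_len` tokens of one sequence."""
    per_token = 2 * N_LAYERS * N_KV_HEADS * HEAD_DIM * BYTES_FP16  # K and V
    return batch * context_len * per_token

if __name__ == "__main__":
    for ctx in (8_192, 32_768):
        print(f"{ctx:>6} tokens -> {kv_cache_bytes(ctx) / 2**30:.1f} GiB")
```

At 32,768 tokens that works out to exactly 4 GiB per sequence, before counting weights, activations, or optimizer state, which is why grouped-query attention (8 KV heads instead of 32) matters so much here: with full multi-head attention the same cache would be 16 GiB.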
