Mistral 7B Extended Context Training and Dataset Improvements
By Global AI News Aggregator
On the model side, a longer context length for the 7B! If you could train on the new Mistral 7B with its 32K context window, that’d be awesome. As for the dataset, I haven’t found many issues with it 🙂