AI Dynamics

Global AI News Aggregator

Fine-tuning with 16k tokens but capable of handling 100k

If I understood correctly, they used 16k for fine-tuning, but it can even handle up to 100k?

→ View original post on X — @rasbt
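
The post itself does not say how a model fine-tuned at 16k tokens could be run at roughly 100k tokens. One common technique for that kind of gap is RoPE position interpolation, where inference positions are rescaled so they fall inside the position range seen during fine-tuning. The sketch below is purely illustrative: the helper `rope_angles`, the parameter values, and the choice of technique are assumptions on my part, not details from the original post.

```python
# Minimal sketch (assumed, not from the post): RoPE position interpolation.
# Positions at inference are divided by a scale factor so that a 100k-token
# sequence maps into the ~16k position range used during fine-tuning.
import torch

def rope_angles(seq_len: int, head_dim: int, base: float = 10_000.0,
                scale: float = 1.0) -> torch.Tensor:
    """Rotary-embedding angles; scale > 1 compresses positions (interpolation)."""
    inv_freq = 1.0 / (base ** (torch.arange(0, head_dim, 2).float() / head_dim))
    positions = torch.arange(seq_len).float() / scale   # rescaled positions
    return torch.outer(positions, inv_freq)             # (seq_len, head_dim // 2)

train_len, infer_len, head_dim = 16_384, 100_000, 128
scale = infer_len / train_len                           # ~6.1

angles = rope_angles(infer_len, head_dim, scale=scale)
print(angles.shape)             # torch.Size([100000, 64])
print((infer_len - 1) / scale)  # ~16383.8, i.e. still inside the 16k window
```

The design idea is that attention never sees position encodings it was not trained on: instead of extrapolating to unseen positions, the long context is squeezed into the familiar range, which tends to degrade more gracefully.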
