If I understood correctly, they used 16k for fine-tuning, but it can handle up to 100k?
Fine-tuning with 16k tokens but capable of handling 100k
Global AI News Aggregator