AI Dynamics

Global AI News Aggregator

Full Finetuning Not Always Necessary for Large Language Models

I don't think full finetuning is necessary, except as a baseline perhaps. Even without PEFT, it often doesn't make sense to tune all layers: (from my article https://magazine.sebastianraschka.com/p/finetuning-large-language-models …)

→ View original post on X — @rasbt
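The idea of tuning only a subset of layers can be sketched in PyTorch: freeze all parameters, then re-enable gradients for just the top block and the output head. The `TinyTransformer` below is a hypothetical toy model for illustration, not the architecture from the article; the freezing pattern is what matters.

```python
# Sketch: partial finetuning by freezing lower layers.
# TinyTransformer is a made-up toy model; only the freeze/unfreeze
# pattern illustrates the point from the quoted post.
import torch.nn as nn


class TinyTransformer(nn.Module):
    def __init__(self, vocab=100, d=32, n_layers=4, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab, d)
        self.layers = nn.ModuleList(
            [nn.TransformerEncoderLayer(d, nhead=4, batch_first=True)
             for _ in range(n_layers)]
        )
        self.head = nn.Linear(d, n_classes)

    def forward(self, x):
        h = self.embed(x)
        for layer in self.layers:
            h = layer(h)
        return self.head(h.mean(dim=1))


model = TinyTransformer()

# Freeze everything first...
for p in model.parameters():
    p.requires_grad = False

# ...then unfreeze only the last transformer block and the classifier head.
for p in model.layers[-1].parameters():
    p.requires_grad = True
for p in model.head.parameters():
    p.requires_grad = True

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable parameters: {trainable} of {total}")
```

An optimizer built from `filter(lambda p: p.requires_grad, model.parameters())` then updates only the unfrozen subset, which cuts memory and compute relative to tuning all layers.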

Comments
