I don't think full finetuning is necessary, except perhaps as a baseline. Even without PEFT, it often doesn't make sense to tune all layers (from my article: https://magazine.sebastianraschka.com/p/finetuning-large-language-models).
Full Finetuning Not Always Necessary for Large Language Models
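A minimal sketch of the idea of tuning only some layers instead of all of them: freeze every parameter, then re-enable gradients only for the last few blocks. The `nn.Sequential` stack of `Linear` layers here is a hypothetical stand-in for a pretrained model's layer list; the same `requires_grad` pattern applies to a real checkpoint.

```python
import torch.nn as nn

# Hypothetical stand-in for a pretrained model with 8 layers.
model = nn.Sequential(*[nn.Linear(16, 16) for _ in range(8)])

# Freeze everything first.
for p in model.parameters():
    p.requires_grad = False

# Unfreeze only the last 2 layers for finetuning.
for layer in list(model)[-2:]:
    for p in layer.parameters():
        p.requires_grad = True

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable: {trainable}/{total}")  # only a fraction of params updated
```

The optimizer then only needs to see the trainable subset, e.g. `torch.optim.AdamW(p for p in model.parameters() if p.requires_grad)`, which also cuts optimizer-state memory.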