After playing with fine-tuning GPT-2 on some simple classification tasks, my honest take is that for most *supervised* ML projects you really, really don’t need LLMs. For simple image/text/signal problems, use a pre-trained NN, such as one of the deep NNs trained on
Fine-tuning GPT-2: When LLMs Aren’t Necessary for ML
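To make the point above concrete, here is a minimal sketch of the kind of non-LLM baseline that often suffices for a simple supervised text-classification task. It assumes scikit-learn is installed; the toy reviews and labels are invented for illustration only.

```python
# A classical TF-IDF + logistic regression baseline for text
# classification -- the kind of simple supervised problem where
# fine-tuning an LLM such as GPT-2 is usually overkill.
# Assumes scikit-learn is available; the data below is a toy example.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative dataset: 1 = positive, 0 = negative.
texts = [
    "great product, works perfectly",
    "terrible, broke after one day",
    "love it, highly recommend",
    "awful experience, do not buy",
]
labels = [1, 0, 1, 0]

# Bag-of-words features + a linear classifier, trained in milliseconds
# on CPU -- no GPU, no fine-tuning, no LLM weights to download.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)

preds = clf.predict(["really great, highly recommend"])
print(preds[0])
```

On real data you would of course use a proper train/test split and more samples, but the shape of the solution is the same: for many classification problems a linear model over simple features is a strong, cheap baseline to beat before reaching for an LLM.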