ULMFiT: Language Modeling as Self-Supervised Pre-training

For sure. And indeed CV pre-training was a key inspiration for ULMFiT. As far as I know, there were no previous examples of using language modeling on a general-purpose corpus as a self-supervised task, then fine-tuning it in two more steps for downstream tasks (i.e., like today's LLMs).

→ View original post on X: @jeremyphoward
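The "two more steps" Howard refers to are the stages described in the ULMFiT paper: general-domain language-model pre-training, target-domain language-model fine-tuning, and task classifier fine-tuning. A minimal toy sketch of that pipeline's shape is below; every function and data structure here is a hypothetical placeholder for illustration, not the actual fastai implementation.

```python
# Toy sketch of the ULMFiT three-stage pipeline. All names and the
# dict-based "model" are hypothetical placeholders; the real method
# trains an AWD-LSTM language model (Howard & Ruder, 2018).

def pretrain_lm(general_corpus: str) -> dict:
    """Stage 1: self-supervised language modeling on a large general
    corpus. Next-word prediction needs no labels."""
    return {"stage": "pretrained", "vocab": sorted(set(general_corpus.split()))}

def finetune_lm(model: dict, target_corpus: str) -> dict:
    """Stage 2: fine-tune the same language model on target-domain
    text, adapting it to the downstream data distribution."""
    vocab = sorted(set(model["vocab"]) | set(target_corpus.split()))
    return dict(model, stage="lm-finetuned", vocab=vocab)

def finetune_classifier(model: dict, labeled_examples: list) -> dict:
    """Stage 3: replace the language-model head with a classifier head
    and fine-tune on the (much smaller) labeled task data."""
    labels = sorted({y for _, y in labeled_examples})
    return dict(model, stage="classifier", labels=labels)

# The three steps run in sequence; only the last one needs labels.
lm = pretrain_lm("the cat sat on the mat")
lm = finetune_lm(lm, "a gripping and well acted film")
clf = finetune_classifier(lm, [("great film", "pos"), ("dull plot", "neg")])
print(clf["stage"])  # classifier
```

The point of the structure is the one the post makes: the expensive, label-free pre-training step is reused, and only the final step requires task labels, which is the same recipe today's LLMs follow at scale.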
