AI Dynamics

Global AI News Aggregator

Language Models Learn Thousands of Tasks During Pre-training

The first step is to understand that when LMs are pre-trained on next-word prediction, they are really doing massive multi-task learning on thousands (millions?) of tasks. Here is a list of some potential tasks.
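To make the idea concrete, here is a small illustrative sketch (the example prompts below are my own, not from the original post): each prompt is an ordinary next-word-prediction instance, yet completing it correctly amounts to performing a distinct task.

```python
# Hypothetical examples: predicting the next word of each prompt
# implicitly requires a different underlying skill.
implicit_tasks = {
    "grammar": "I put the fork down on the",            # plausible next word: "table"
    "world_knowledge": "The capital of France is",      # plausible next word: "Paris"
    "translation": "The French word for 'hello' is",    # plausible next word: "bonjour"
    "arithmetic": "Two plus three equals",              # plausible next word: "five"
    "sentiment": "The movie was dull, boring, and",     # plausible next word: "tedious"
}

for task, prompt in implicit_tasks.items():
    print(f"{task:15s} -> next-word prediction on: {prompt!r}")
```

Under this view, a pre-training corpus containing billions of such sentences is effectively a giant, unlabeled multi-task training set.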

→ View original post on X — @_jasonwei
