AI Dynamics

Global AI News Aggregator

OpenAI’s Research Foundations: Transformers and Self-Supervised Learning

You are wrong.
First, FAIR and GDM (Google DeepMind) aren't "academic".
Second, don't confuse achievements in product development with achievements in research.
Third, OpenAI builds on everyone else's research: GPTs use transformer architectures (from Google) and self-supervised pre-training (from several contributors) …

→ View original post on X — @ylecun
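The self-supervised pre-training LeCun refers to can be illustrated with a toy sketch. This is an assumption-laden simplification, not OpenAI's actual training code: it shows only the core idea that the training "labels" are derived from the raw text itself (each token's target is simply the next token), with random numbers standing in for a transformer's output.

```python
import math
import random

# Toy illustration of the self-supervised next-token objective used to
# pre-train GPT-style models. No human annotation is needed: inputs and
# targets both come from the same raw sequence.
text = "hello world"
vocab = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(vocab)}
ids = [stoi[ch] for ch in text]

# Self-supervision: the target sequence is the input shifted by one position.
inputs = ids[:-1]   # encodes "hello worl"
targets = ids[1:]   # encodes "ello world"

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Random logits stand in for the output of a transformer; a real model
# would produce these from the input context and be trained to lower
# this cross-entropy loss by gradient descent.
random.seed(0)
loss = 0.0
for t in targets:
    logits = [random.gauss(0.0, 1.0) for _ in range(len(vocab))]
    probs = softmax(logits)
    loss -= math.log(probs[t])
loss /= len(targets)
print(f"next-token cross-entropy: {loss:.3f}")
```

An untrained (random) model yields a loss near log(vocabulary size); pre-training consists of driving this same loss down over vast amounts of unlabeled text.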
