Impressive inference speed from Inception Labs’ diffusion LLMs. Diffusion LLMs are a fascinating alternative to conventional autoregressive LLMs. Well done @StefanoErmon and team! https://t.co/w4w5QZpyp6
— Andrew Ng (@AndrewYNg) February 25, 2026
Mercury 2 is live 🚀🚀 The world’s first reasoning diffusion LLM, delivering 5x faster performance than leading speed-optimized LLMs. Watching the team turn years of research into a real product never gets old, and I’m incredibly proud of what we’ve built. We’re just getting started on what diffusion can do for language.
— Stefano Ermon (@StefanoErmon) https://nitter.net/StefanoErmon/status/2026340720064520670#m
— @andrewyng, 2026-02-25 02:04 UTC
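The speed claim above comes from how the two model families decode. An autoregressive LLM emits one token per forward pass, left to right, so generation steps scale with sequence length; a diffusion LLM starts from a fully masked (noised) sequence and refines many positions in parallel each step, so it can finish in far fewer steps. The toy sketch below illustrates only that step-count difference; it is not Inception Labs' actual algorithm, and all names (`autoregressive_decode`, `diffusion_decode`, `tokens_per_step`) are hypothetical.

```python
# Toy comparison of decoding step counts: sequential autoregressive
# generation vs. parallel iterative denoising. No real model is involved;
# we "predict" tokens from a fixed target just to count the steps.
TARGET = ["diffusion", "models", "generate", "text", "in", "parallel"]
MASK = "<mask>"

def autoregressive_decode(target):
    """Sequential decoding: one token per step, left to right.
    Step count equals sequence length."""
    out, steps = [], 0
    for tok in target:
        out.append(tok)  # each token would need its own forward pass
        steps += 1
    return out, steps

def diffusion_decode(target, tokens_per_step=3):
    """Toy parallel denoising: start fully masked, then unmask several
    positions per refinement step. Step count scales with
    ceil(len(target) / tokens_per_step)."""
    seq = [MASK] * len(target)
    masked = list(range(len(target)))
    steps = 0
    while masked:
        # "denoise" a batch of positions in one parallel step
        batch, masked = masked[:tokens_per_step], masked[tokens_per_step:]
        for i in batch:
            seq[i] = target[i]
        steps += 1
    return seq, steps

_, ar_steps = autoregressive_decode(TARGET)
_, df_steps = diffusion_decode(TARGET)
print(ar_steps, df_steps)  # 6 sequential steps vs. 2 parallel steps
```

In a real diffusion LLM each refinement step is a full forward pass that re-predicts all masked positions at once, which is where the wall-clock speedup over token-by-token decoding comes from.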
