New course: Build and Train an LLM with JAX, built in partnership with @Google and taught by @chrisachard.
— Andrew Ng (@AndrewYNg) March 4, 2026
JAX is the open-source library behind Google's Gemini, Veo, and other advanced models. This short course teaches you to build and train a 20-million-parameter language model from scratch using JAX and its ecosystem of tools. You'll implement a complete MiniGPT-style architecture, train it, and chat with your finished model through a graphical interface.

Skills you'll gain:
– Learn JAX's core primitives: automatic differentiation, JIT compilation, and vectorized execution
– Build a MiniGPT-style LLM using Flax/NNX, implementing embedding and transformer blocks
– Load a pretrained MiniGPT model and run inference through a chat interface

Come learn this important software layer for building LLMs! deeplearning.ai/short-course…
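The three core primitives the course covers can be seen in a few lines of JAX. This is a minimal illustrative sketch (a toy scalar loss, not taken from the course material) showing `jax.grad`, `jax.jit`, and `jax.vmap` in action:

```python
import jax
import jax.numpy as jnp

def loss(w, x):
    # Toy scalar function standing in for a model's loss.
    return jnp.sum((w * x - 1.0) ** 2)

grad_loss = jax.grad(loss)                    # automatic differentiation (d loss / d w)
fast_grad = jax.jit(grad_loss)                # JIT compilation via XLA
batched = jax.vmap(loss, in_axes=(None, 0))   # vectorized execution over a batch of x

w = jnp.array(2.0)
x = jnp.array(3.0)
print(fast_grad(w, x))   # d/dw (w*x - 1)^2 = 2*(2*3 - 1)*3 = 30.0

xs = jnp.arange(4.0)
print(batched(w, xs))    # loss evaluated for each x in the batch: [1. 1. 9. 25.]
```

The same pattern scales up: the course's training loop applies `grad` to the model's loss, `jit` to the update step, and `vmap`-style batching to the data.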
→ View original post on X — @andrewyng, 2026-03-04 18:41 UTC
