AI Dynamics

Global AI News Aggregator

Pretraining LLMs entirely on Hugging Face Hub infrastructure

You can now pretrain LLMs entirely on the HF Hub 💥 Last week, @OpenAI launched a competition to see who can pretrain the best LLM in under 10 minutes. So over the weekend, I made a little demo to automate this end-to-end using the Hub as the infra layer:

– Jobs to scale compute
– Buckets to store all experiments
– Trackio to log all the metrics

The cool thing here is that everything is launched locally: no ssh shenanigans into a cluster or fighting with colleagues over storage and GPUs ⚔️ All that's left is coming up with new ideas, but luckily Codex can automate that part too 😁 Can I have a job now please @reach_vb 🙏?

→ View original post on X — @thom_wolf, 2026-03-23 16:29 UTC
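The post does not include the demo's code, but the "launched locally" workflow it describes can be sketched with the `hf jobs` CLI from `huggingface_hub`. Everything below is an illustrative assumption, not taken from the post: the GPU flavor, base image, and training script are placeholders, and the exact flags may differ across CLI versions.

```shell
# Sketch (assumptions throughout): launch a pretraining run as a
# Hugging Face Job from a local shell, no ssh into a cluster needed.
# Requires being logged in (`hf auth login`).

# Launch a containerized job on Hub-managed compute.
# `a10g-small` and `train.py` are illustrative placeholders.
hf jobs run --flavor a10g-small python:3.12 \
    python train.py

# List running jobs and stream a job's logs:
hf jobs ps
hf jobs logs <job-id>
```

Checkpoints and metrics would then be pushed to Hub repos from inside the training script (e.g. via `huggingface_hub` uploads and Trackio logging), which is how the post's "Buckets" and "Trackio" pieces fit in.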
