Didn't tweet nanoGPT yet (quietly getting it to good shape) but it's trending on HN so here it is 🙂 : https://github.com/karpathy/nanoGPT
…
Aspires to be the simplest, fastest repo for training/finetuning medium-sized GPTs. So far confirmed it reproduces GPT-2 (124M). 2 simple files of ~300 lines