https://twitter.com/karpathy/status/1777427944971083809
Have you ever wanted to train LLMs in pure C without 245MB of PyTorch and 107MB of CPython? No? Well now you can! With llm.c:
https://github.com/karpathy/llm.c
To start, it implements GPT-2 training on CPU in fp32 in only ~1,000 lines of clean code. It compiles and runs instantly, and it exactly matches the PyTorch reference implementation.
Karpathy is a really interesting person: he explains things in an accessible, step-by-step way, and he is extremely skilled.