— Michael Hla (@hla_michael), April 2, 2026
I trained an LLM from scratch on pre-1900 text to see if it could come up with quantum mechanics and relativity. While the model is too small to do meaningful reasoning, it has glimpses of intuition. When given observations from past landmark experiments, the model can declare that “light is made up of definite quantities of energy” and even suggest that gravity and acceleration are locally equivalent. I’m releasing the dataset + models and leaving this as an open problem for the research community. I also include what this project has taught me about intelligence in a mini essay linked below. 🧵(1/n)
→ View original post on X — @thom_wolf, 2026-04-02 18:14 UTC