Intelligence is Compression
Intelligence is the ability to reduce a complex world to a few structures that still predict what happens next.
A good framework compresses a complex system into a few variables that still explain its behavior. Supply and demand compress thousands of transactions into two forces that predict price. OODA compresses conflict into a four-step decision cycle: observe, orient, decide, act. Reflexivity compresses financial bubbles into a feedback loop between price and behavior.
Prediction and compression are mathematically linked. If a model predicts data well, it can encode that data in fewer bits because outcomes become less surprising. Conversely, if you can compress a dataset effectively, you have captured its underlying structure. Information theory formalizes this relationship in ideas like Minimum Description Length and Kolmogorov complexity.
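The link can be made concrete. Shannon's source coding bound says the ideal code length of a sequence under a probabilistic model is the sum of -log2 p(symbol) over its symbols: the better the model predicts each symbol, the fewer bits it needs. A minimal sketch (the biased source and both models are invented for illustration):

```python
import math

def code_length_bits(sequence, prob):
    """Ideal code length of `sequence` in bits under per-symbol model `prob`:
    -sum(log2 p(s)) — fewer bits when the model assigns high probability."""
    return -sum(math.log2(prob[s]) for s in sequence)

# A biased source: 90 'a's, 10 'b's.
data = "a" * 90 + "b" * 10

uniform = {"a": 0.5, "b": 0.5}  # ignores the bias
fitted = {"a": 0.9, "b": 0.1}   # captures the structure

print(code_length_bits(data, uniform))  # 100.0 bits
print(code_length_bits(data, fitted))   # ~46.9 bits
```

The uniform model spends a full bit per symbol; the fitted model, because it is less surprised by each observation, encodes the same data in roughly half the bits.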
This same pattern appears across many forms of thinking:
- Frameworks compress causal structure. They reduce complex systems to the forces that drive outcomes.
- Good updates compress evidence. The value of an update is not the observation itself but the belief change it justifies.
- Scientific laws compress reality. Newton’s laws compress planetary motion into a few equations.
- Machine learning compresses patterns. Models learn compact representations that allow them to predict future data.
In each case, knowledge is the same thing: the shortest explanation that still predicts reality.
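The converse direction — compressibility reveals captured structure — is easy to observe with a general-purpose compressor. A small sketch (the sample strings are invented for illustration): repetitive data shrinks dramatically because the compressor finds its regularity, while random bytes barely shrink at all because there is no structure left to exploit.

```python
import os
import zlib

structured = b"supply and demand " * 200  # highly repetitive: 3600 bytes
random_bytes = os.urandom(len(structured))  # no regularity to find

print(len(zlib.compress(structured)))    # tiny: the repetition is the structure
print(len(zlib.compress(random_bytes)))  # near original size: incompressible
```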
Compression is powerful because it changes the economics of thinking. It speeds judgment, enables generalization, and allows many people to coordinate around the same model of the world. But compression has a limit: strip away too much structure and prediction collapses.