Entropy is a concept from physics that stems from ideas a guy called Rudolf Clausius developed while studying thermodynamic processes.
The purists are going to hate this but I’m going to really simplify the concept so that we can cover it right now. The goal here isn’t to understand to the level of a physicist or chemist, but just to be able to wrap our heads around entropy enough to be able to add it to our latticework of mental models!
Here’s a simplistic definition:
Entropy is a measure of complexity (order or disorder) in a system.
The greater the level of disorder the higher the entropy.
Let's say you have a bag of balls. You grab one ball from the bag and put it on the table. How many ways can you arrange that ball? The answer: one way. What if we grab two balls and ask the same question? Now there are more ways to arrange the two balls. We keep doing this until all the balls are on the table. At this point, there are so many ways to arrange the balls that you might not even be able to count them all. This counting exercise captures the statistical idea behind entropy.
In this situation, entropy is defined as the number of ways a system can be arranged. The higher the entropy (meaning the more ways the system can be arranged), the more the system is disordered.
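We can make this counting idea concrete with a short sketch. The snippet below is a simplified illustration (not a physics simulation): it assumes the balls are distinct and arranged in a row, so the number of arrangements is n factorial, and it uses the Boltzmann-style formula S = ln(W) with the constant set to 1. The function names `arrangements` and `entropy` are just illustrative choices.

```python
import math

def arrangements(n):
    """Number of ways to order n distinct balls in a row: n!"""
    return math.factorial(n)

def entropy(n):
    """Boltzmann-style entropy in natural units (k = 1): S = ln(W)."""
    return math.log(arrangements(n))

# More balls -> vastly more arrangements -> higher entropy.
for n in [1, 2, 5, 10]:
    print(f"{n} balls: {arrangements(n)} arrangements, entropy ≈ {entropy(n):.2f}")
```

Notice how fast the number of arrangements grows: one ball gives a single arrangement (entropy zero), while ten balls already give over three million.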
Why is this interesting?
The interesting thing about entropy is that, left alone, it always increases. Everything in our universe naturally tends towards disorder. Think about simple systems like your bedroom or garden. If unmanaged, your bedroom tends to get increasingly messy, and your neat garden beds become increasingly weed-ridden.
Entropy is a great mental model to have in your personal latticework of models. You can use it to acknowledge and understand the natural slide towards disorder in the systems around you, and continue the good fight towards order!
I’ll leave you with the words of former Czech leader Vaclav Havel:
Just as the constant increase of entropy is the basic law of the universe, so it is the basic law of life to be ever more highly structured and to struggle against entropy.