June 27th, 2020

Book Notes: The Great Mental Models Volume 1

I've decided to start publishing notes on non-fiction books that I have found useful in some shape or form and that are worth re-reading in the future.

I don't remember precisely how The Great Mental Models ended up on my reading list. I'm a massive fan of Naval Ravikant, and he often talks about the importance of having good mental models for making good decisions. I read the first volume of the book back in May.

My favourite models here are probably the two "Razors" at the end; they have certainly come in handy when making decisions in both work and personal life.

The Map is Not the Territory

  • Maps and models are useful in specific contexts but are never an accurate depiction of reality.

    • The map of the London Underground is handy to travellers but worthless to train drivers.
  • A map or model can be thought of more broadly as an abstraction of reality.

  • A news story is an abstraction put together by a journalist who studied numerous sources.

  • Models are useful in certain situations and should always be checked against reality and updated accordingly.

    • Newton's discoveries and theories form a lot of the fundamentals of physics. However, they cannot explain certain phenomena. Einstein's theory of relativity extends Newton's models but still cannot accurately describe phenomena at the quantum scale.
  • Models are, by definition, a reduction of something much more complex and always have some level of subjectivity. Models describe reality at a particular point in time.

  • All models are flawed, but some are useful.

  • Mentioned Book: The Death and Life of Great American Cities

Circle of Competence

  • The story of a Lifer vs a Stranger in a small town.

    • The Stranger may feel like there is not a lot to know about a small town. He thinks he can draw from past experiences and set up a flourishing business. But he will hit problems, which the Lifer will be able to solve much more effectively.
  • Sherpas - an indigenous people living near Mt Everest. They can navigate the mountain and know how to survive at high altitude because they have lived there for generations. A person would be a fool to hire anyone but a Sherpa as a guide in these landscapes.

  • Competence is earned through years of experience, including the failures that come with it.

  • Circles are dynamic and must be kept up to date (just like Maps). They are built and kept up to date by:

    • Curiosity and a desire to learn
    • Monitoring what is going on in the field and not letting your ego convince you that you already know everything. Ego is the enemy (as Ryan Holiday's book title puts it).
    • Seeking feedback
  • Knowing how to operate outside of a circle is vital.

    • Learn the basics of the field but acknowledge that you are still a Stranger.
    • Talk to people who have strong circles and ask thoughtful questions.
    • Use other broad mental models to navigate the field in which your understanding is limited.
  • When seeking wisdom from experts, beware of their incentives! Go back to learning the basics before committing to anything.

  • The store manager is probably willing to tell small lies to get rid of an unpopular product.

First Principles Thinking

Clarifying complex problems by separating the underlying facts from any assumptions.

  • Plato and Socrates talked about first principles; later, so did Aristotle and Descartes.
  • Taking things apart and testing our assumptions, not blindly relying on what others tell us.

Everything that is not a law of nature is merely a shared belief. There are two ways to identify first principles:

  • Socratic questioning. Truly scrutinising our thinking and beliefs. Why do I think what I think? How do I back it up? What's the evidence?
  • The Five Whys. Repeatedly asking why until we get to a what or how. The goal is to reach a statement of falsifiable fact.
    • The dogma of the sterile stomach. For decades, scientists believed that the stomach was sterile and ulcers could only be caused by stress. Towards the end of the 20th century, the bacterium responsible (Helicobacter pylori) was discovered. An example of a belief that was not based on first principles.
    • Breaking things down to first principles allows us to unleash creativity.

Thought Experiment

Thought experiments are devices of the imagination used to investigate the nature of things.

  • They enable us to look at the world from angles that would otherwise be impossible.

  • They help us understand actual cause and effect.

Examples of thought experiments:

  • Imagining the physically impossible. If money were no object... If I had all the time/money in the world...
  • Reimagining history. What if person X hadn't done Y or been at location Z at such and such time? What could have happened?
  • Intuiting the non-intuitive. What laws and rules would I advocate for if I did not know what kind of person I myself was?

Second-Order Thinking

  • Thinking farther ahead, past the immediate consequences and results of our actions.

  • Comprehensively considering effects of effects.

  • We have been feeding antibiotics to cows to make safer meat (first-order consequence). By doing so, we have helped create bacteria resistant to antibiotics (second-order consequence).

  • Second-order thinking helps prioritise long-term outcomes over instant gains. This is important in building trust and non-transactional relationships.

  • Looking farther ahead helps construct an effective argument. Always show that you have thought of secondary effects.

  • Pitfalls of second-order thinking:

    • Analysis paralysis
    • The slippery slope effect - arguing extreme hypothetical secondary effects.
      • Example: if we let the kids choose the movie, they will expect to be able to choose everything in life.

Probabilistic Thinking

Using tools of maths and logic to predict the most likely outcomes when the real outcomes are unknown.

  • Similar to heuristics - a mental shortcut for making judgements and solving problems quickly (look up Daniel Kahneman).
  • Bayesian Thinking (Thomas Bayes)
    • As we get new information, we should weigh it against all prior information rather than making decisions on the new information alone (a small Python sketch follows this list).
    • Look at long-term trends and statistics before drawing conclusions from the latest news.
    • Bayes factor - a measure of how strongly a piece of new evidence should raise or lower our belief in a prior hypothesis.
  • Similar to Bayesian thinking is Conditional Probability. It looks at whether events are dependent or independent. Does the probability of an event occurring change given that another event has occurred?
  • Fat-Tailed Curves
    • In "fat-tailed" distributions, extremes are more likely to occur despite the "average".
    • Power Law curve has a very fat tail while the Bell Curve has a long and skinny one.
  • Asymmetries
    • Metaprobability - the probability of probability estimates themselves being accurate.
    • Investors estimating return on an investment.
    • Predicting the effect of traffic on travel time. How many people arrive early "because of traffic"?
    • Far more estimates turn out to be over-optimistic than overly pessimistic.
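
To make the Bayesian point concrete, here is a minimal Python sketch (my own illustration, not from the book; the hypothesis and all the numbers are made up) of updating a long-term prior with a single piece of news:

```python
# Hypothesis H: "crime in my city is rising."
# The long-term statistics (the prior) say this is unlikely,
# but tonight's news carries a violent-crime story (the evidence).

def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Return P(H | E) from the prior and the two likelihoods."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

prior = 0.05             # long-term base rate for "crime is rising" (made up)
p_news_if_rising = 0.9   # chance of such a report if crime really is rising
p_news_if_not = 0.6      # chance of such a report even if it is not

posterior = bayes_update(prior, p_news_if_rising, p_news_if_not)
print(f"Belief after one scary headline: {posterior:.3f}")      # ~0.073, barely moved

# The Bayes factor (likelihood ratio) summarises how strong the evidence is:
print(f"Bayes factor: {p_news_if_rising / p_news_if_not:.2f}")  # 1.5 - weak evidence
```

Because the headline is almost as likely in a world where crime is flat as in one where it is rising, the posterior barely moves from the prior, which is exactly the "check the long-term trend before reacting to the news" point above.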

Nassim Taleb's Books

  • Black Swan - small errors in probabilities can cause results that are orders of magnitude different.
  • Antifragile
    • Better to prepare than predict.
    • Create serendipity and go to places where the likelihood of good things happening is higher.
    • Manage risk and fail properly. Never put yourself in a position where you could be taken out completely.
    • How do insurance companies figure out the price of insuring a Victoria's Secret model's legs?

Inversion

  • Thinking backwards about a problem, starting from the end.
  • Instead of focusing on goals or what to do, think about what to avoid and not to do. Then see what's left over.
  • What else would have to be true for X to be true?
  • John Bogle created the index fund by inverting the problem: instead of asking how to beat the market, he asked how to avoid losing to it.

Occam's Razor

Simple explanations are more likely to be true than complex ones.

  • People tend to find complex explanations for simple things. My wife is still not home from the party; therefore, something bad must have happened to her.
  • We should always pick the explanation with the fewest moving parts (assuming equal explanatory power). Fewer assumptions mean a higher probability that all of them are correct (see the small sketch below).
  • Caveat: Not all things are always simple. Human flight would have been impossible if we hadn't looked at very complex explanations. Various frauds and schemes are very elaborate.
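
To illustrate the "fewest moving parts" point with some made-up numbers (my own sketch, not from the book): if each independent assumption in an explanation is, say, 90% likely to hold, the chance that all of them hold shrinks quickly as assumptions pile up.

```python
# Chance that every assumption in an explanation holds,
# assuming each is independent and 90% likely (illustrative numbers only).

def chance_all_hold(p_each: float, n_assumptions: int) -> float:
    return p_each ** n_assumptions

for n in (1, 3, 6, 12):
    print(f"{n:2d} assumptions -> {chance_all_hold(0.9, n):.0%} chance they all hold")

# 1 -> 90%, 3 -> 73%, 6 -> 53%, 12 -> 28%
```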

Hanlon's Razor

We should not attribute to malice that which is more easily explained by stupidity.

  • Road rage. When someone cuts you off on the road, it is more likely that they didn't see you than that they did it deliberately.
  • Someone doing something wrong is more likely than someone doing something wrong and doing it deliberately (see the small sketch at the end of this section).
  • The Linda problem - people judge "feminist bank teller" as more likely than just "bank teller", even though a conjunction can never be more probable than either of its parts.
  • Vasili Arkhipov refused to launch a nuclear torpedo during the Cuban Missile Crisis when American ships dropped signalling depth charges on his submarine. The crew did not know whether war had broken out, but Arkhipov judged it more likely that it had not.
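
A tiny sketch of the conjunction rule that underlies both the Linda problem and Hanlon's razor (the numbers are made up): "wrong and deliberate" can never be more probable than "wrong" on its own.

```python
# Illustrative numbers only - the point is the inequality, not the values.
p_wrong = 0.10                    # the driver does something wrong (cuts you off)
p_deliberate_given_wrong = 0.05   # ...and did it on purpose

p_wrong_and_deliberate = p_wrong * p_deliberate_given_wrong
print(f"P(wrong)                = {p_wrong:.3f}")
print(f"P(wrong and deliberate) = {p_wrong_and_deliberate:.4f}")  # always <= P(wrong)
```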

© Siim Männart 2020