Acquiring knowledge may seem like a daunting task. There is so much to know and time is precious. Luckily, we don’t have to master everything.

To get the biggest bang for our buck, we can study the big ideas from the big disciplines: physics, biology, psychology, philosophy, literature, sociology, history, and a few others. We call these big ideas mental models.

Our aim is not to remember facts and try to repeat them when asked, the way you studied for your high school history exams. We’re going to try to hang these ideas on a latticework of mental models, with vivid examples in our heads to help us remember and apply them.

The latticework of mental models puts them in a usable form to analyze a wide variety of situations and enables us to make better decisions. And when big ideas from multiple disciplines all point towards the same conclusion, we can begin to conclude that we’ve hit on an important truth.

The idea for building such a latticework comes from Charlie Munger, Vice Chairman of Berkshire Hathaway and one of the finest cross-disciplinary thinkers in the world.

Here, Charlie Munger explains his approach to worldly wisdom:

Well, the first rule is that you can’t really know anything if you just remember isolated facts and try and bang ’em back. If the facts don’t hang together on a latticework of theory, you don’t have them in a usable form.

You’ve got to have models in your head. And you’ve got to array your experience both vicarious and direct on this latticework of models. You may have noticed students who just try to remember and pound back what is remembered. Well, they fail in school and in life. You’ve got to hang experience on a latticework of models in your head.

What are the models? Well, the first rule is that you’ve got to have multiple models because if you just have one or two that you’re using, the nature of human psychology is such that you’ll torture reality so that it fits your models, or at least you’ll think it does…

It’s like the old saying, "To the man with only a hammer, every problem looks like a nail." And of course, that’s the way the chiropractor goes about practicing medicine. But that’s a perfectly disastrous way to think and a perfectly disastrous way to operate in the world. So you’ve got to have multiple models.

And the models have to come from multiple disciplines because all the wisdom of the world is not to be found in one little academic department. That’s why poetry professors, by and large, are so unwise in a worldly sense. They don’t have enough models in their heads. So you’ve got to have models across a fair array of disciplines.

You may say, "My God, this is already getting way too tough." But, fortunately, it isn’t that tough because 80 or 90 important models will carry about 90% of the freight in making you a worldly wise person. And, of those, only a mere handful really carry very heavy freight.(1)

John T. Reed, author of Succeeding, offers an important additional insight:

When you first start to study a field, it seems like you have to memorize a zillion things. You don’t. What you need is to identify the core principles – generally three to twelve of them – that govern the field. The million things you thought you had to memorize are simply various combinations of the core principles.

Mental Models

The central principle of the mental model approach is that you must have many of them: ideally, all the ones you need to solve the problem at hand. As with physical tools, lacking a mental tool at the crucial moment can lead to a bad result.

This seems self-evident, but it’s an unnatural way to think. Without the right training, your brain takes the other approach, which is to say: Which models do I already know and love, and how can I apply them here? Munger’s analogy for this is the man with a hammer, to whom everything looks a bit like a nail. Such narrow-minded thinking feels entirely natural to us, but it leads to far too many misjudgments.

As an example, Tim Wu writes about Chris Anderson, who wrote The Long Tail, a popular 21st-century business book. Here is what happens when you rely on one (powerful) mental model to solve everything.

Chris Anderson’s The Long Tail does something that only the best books do—uncovers a phenomenon that’s undeniably going on and makes clear sense of it. Anderson, the Wired editor-in-chief who first wrote about the Long Tail concept in 2004, had two moments of genius: He visualized the demand for certain products as a "power curve," and he came up with a catchy phrase to go with his observation. Like most good ideas, the Long Tail attaches to your mind and gets stuck there. Everything you take in—cult blogs, alternative music, festival films—starts looking like the Long Tail in action. But that’s also the problem. The Long Tail theory is so catchy it can overgrow its useful boundaries. Unfortunately, Anderson’s book exacerbates this problem. When you put it down, there’s one question you won’t be able to answer: When, exactly, doesn’t the Long Tail matter?

This insight goes only so far, but like many business books, The Long Tail commits the sin of overreaching. The tagline on the book’s cover reads, "Why the Future of Business Is Selling Less of More," which is certainly wrong or at least exaggerated. Inside we learn about "the Long Tail of Everything." Anderson’s book, unlike his original Wired article, threatens to turn a great theory of inventory economics into a bad theory of life and the universe. He writes that "there are now Long Tail markets practically everywhere you look," calling offshoring the "Long Tail of labor," and online universities "the Long Tail of education." He quotes approvingly an analysis that claims, improbably, that there’s a "Long Tail of national security" in which al-Qaida is a "supercharged niche supplier." At times, the Long Tail becomes the proverbial theory hammer looking for nails to pound.

What the book doesn’t get at is the relationship between these standards-driven industries where the Long Tail doesn’t matter, and the content industries where it does. There aren’t Long Tails everywhere.
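Wu’s "power curve" is easy to make concrete. The sketch below is purely illustrative (a Zipf-style curve with exponent 1 over a hypothetical 100,000-item catalog; none of these numbers come from Anderson’s book), but it shows both why the idea is seductive and where its limits lie:

```python
# Illustrative "long tail": demand for products ranked by popularity,
# following a Zipf-like power law. Exponent and catalog size are
# arbitrary assumptions, not figures from The Long Tail.

def zipf_demand(n_products, exponent=1.0):
    """Relative demand for products ranked 1..n by popularity."""
    return [1.0 / rank**exponent for rank in range(1, n_products + 1)]

demand = zipf_demand(100_000)
total = sum(demand)

head = sum(demand[:1000])   # the 1,000 best-sellers ("the head")
tail = sum(demand[1000:])   # the other 99,000 niche items ("the tail")

print(f"head share: {head / total:.0%}")  # prints "head share: 62%"
print(f"tail share: {tail / total:.0%}")  # prints "tail share: 38%"
```

Even in this toy model the tension is visible: the 99% of products in the tail collectively matter (more than a third of demand), yet the 1% of best-sellers still dominates. Whether a real market looks like this at all depends on the exponent and on whether demand follows a power law in the first place, which is exactly the boundary question Wu says the book never answers.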

The truth is that the good ideas are just as dangerous as the bad ones. Warren Buffett’s mentor, Ben Graham, used to put it this way:

You can get in way more trouble with a good idea than a bad idea, because you forget that the good idea has limits.

The best antidote to this sort of overreaching is to add more colors to your mental palette: expand your repertoire of ideas, make them vivid and available, and watch your mind grow.

You’ll know you’re on to something when the ideas start to compete with one another. At first, this is mildly uncomfortable. One idea says X and the other idea says the reverse of X: How do I decide which is right?

This process of letting the models compete and fight for superiority and greater fundamentalness is called thinking! It’s a little like learning to walk or ride a bike; at first you can’t believe all that you’re supposed to do at once, but eventually you wonder how you ever got along without it. As Charlie likes to say, going back to any other method would feel like cutting off your hands.

Good luck, and let’s explore the models.

The Farnam Street Latticework of Mental Models

Human Psychology

Biases emanating from the Availability Heuristic:
– Ease of Recall

Biases emanating from the Representativeness Heuristic:
– Bias from insensitivity to base rates
– Bias from insensitivity to sample size
– Misconceptions of chance
– Regression to the mean
– Bias from conjunction fallacy

Biases emanating from the Confirmation Heuristic:
– Confirmation bias
– Bias from anchoring
– Conjunctive and disjunctive-events bias
– Bias from over-confidence
– Hindsight Bias

– Bias from incentives and reinforcement
– Bias from self-interest
– Bias from association
– Bias from liking/loving
– Bias from disliking/hating
– Commitment and Consistency Bias
– Bias from excessive fairness
– Bias from envy and jealousy
– Reciprocation bias
– Over-influence from authority
– Deprival Super-Reaction Bias
– Bias from contrast
– Bias from stress-influence
– Bias from emotional arousal
– Bias from physical or psychological pain
– Fundamental Attribution Error
– Bias from the status quo
– Do-something tendency
– Do-nothing tendency
– Over-influence from precision/models
– Uncertainty avoidance
– Not-invented-here bias
– Short-term bias
– Tendency to avoid extremes
– Man-with-a-Hammer Tendency
– Bias from social proof
– Over-influence from framing effects
– Lollapalooza

Business

– Price Sensitivity
– Scale
– Distribution
– Cost
– Brand
– Improving Returns
– Porter’s Five Forces
– Decision Trees
– Diminishing Returns
– Double-Entry Accounting

Investing

– Mr. Market
– Circle of competence

Systems

– Complex adaptive systems
– Systems Thinking

Economics

– Utility
– Diminishing Utility
– Supply and Demand
– Elasticity
– Economies of Scale
– Opportunity Cost
– Marginal Cost
– Comparative Advantage
– Trade-offs
– Price Discrimination
– Positive and Negative Externalities
– Sunk Costs
– Moral Hazard
– Game Theory
– Prisoner’s Dilemma
– Tragedy of the Commons
– Bottlenecks
– Time Value of Money

Engineering

– Feedback loops
– Margin of Safety
– Tight coupling
– Breakpoints

Mathematics

– Bayes’ Theorem
– Power Law
– Law of large numbers
– Compounding
– Probability Theory
– Permutations
– Combinations
– Variability
– Standard Deviation and normal distribution
– Regression to the mean
– Multiplicative Systems

Statistics

– Outliers and self-fulfilling prophecy
– Correlation versus Causation
– Mean, Median, Mode
– Distribution

Chemistry

– Thermodynamics
– Kinetics
– Autocatalysis

Physics

– Newton’s Laws
– Momentum
– Quantum Mechanics
– Critical Mass

Biology

– Natural Selection

More Models:
– Asymmetric Information
– Occam’s Razor
– Deduction and Induction
– Basic Decision-Making Process
– Scientific Method
– Process versus Outcome
– And then what?
– The Agency Problem
– 7 Deadly Sins
– Network Effect
– Gresham’s Law
– The Red Queen Effect

Remember, the list above will keep growing and changing as we find better ways to organize our knowledge.

1. Charlie Munger, Poor Charlie’s Almanack
2. John T. Reed, Succeeding
3. Alice Schroeder, The Snowball: Warren Buffett and the Business of Life
