This book taught me that I probably don't know as much as I think I do. Many research studies show that we’re often overconfident about what we know. In this book, Steven Sloman and Philip Fernbach explain the nature of this illusion of knowledge, why we suffer from this illusion, and what implications this has for the way we talk about science and politics.
This book summary is divided into the following sections:
(1) We don’t know as much as we think.
(2) Why do we overestimate how much we know?
(3) What does this mean for the way we talk about science and politics?
Most of us know less than we think we do. We suffer from an illusion of understanding.
Humans are not warehouses of knowledge
“We wager that, except for a few areas that you’ve developed expertise in, your level of knowledge about the causal mechanisms that control not only devices, but the mechanisms that determine how events begin, how they unfold, and how one event leads to another is relatively shallow. But before you stopped to consider what you actually know, you may not have appreciated how shallow it is.”
To understand this, we first have to understand how people acquire knowledge. In this book, the authors discuss 2 main ways people come to know things:
(1) Causal reasoning
“Human beings are the world’s master causal thinkers. We can predict what will happen when we rub a match against a rough surface, if we go out in the rain without an umbrella, or if we say the wrong thing to our sensitive colleague. All of this requires causal reasoning.”
Causal thinking means that when we see two events A and B, we draw some kind of causal connection between them. We naturally infer all kinds of things that we’re not told and that we don’t have direct experience with. For example, when we see a toddler crying and their milk spilled on the floor, we infer that the spilled milk caused the toddler to cry. We do this for more complex events too. When we can’t see what vaccines are doing or how food is genetically modified, we fill in the missing pieces with what we’ve experienced or read about.
The most common way people pass causal information to one another is through storytelling. For example, one tale from the Bible discusses the root cause of everything: how the world was created. Many other biblical stories tell us about the consequences of our actions and why certain actions are right and others wrong.
(2) Human beings’ knowledge comes from stuff outside our brains.
"Our intelligence resides not in individual brains but in the collective mind. To function, individuals rely not only on knowledge stored within our skulls, but also on knowledge stored elsewhere: in our bodies, in the environment, and especially with other people."
There are at least 3 places where we “store” knowledge:
The physical world. Human beings have a type of “embodied intelligence”. We use the world as our memory store. For example, when catching a ball, we don’t calculate the physics of the ball’s movement, we just intuit where the ball is going. When we’re doing house chores, we let the pile of dirty dishes “tell us” that the dishes need to be done.
This is why the body plays such an important role in learning. For example, writing out mathematical calculations is much easier than doing them entirely in your head.
“In general, the fact that thought is more effective when it is done in conjunction with the physical world suggests that thought is not a disembodied process that takes place on a stage inside the head”
Other people. Humans harness the power of multiple entities working together to generate massive intelligence. We live in a community of knowledge.
This community of knowledge is organized through a division of cognitive labor. Groups split up cognitive tasks and specialize in one bit of the task (e.g. we have doctors treating patients, software engineers making new technology). Psychological research shows that people naturally divide up cognitive labor without thinking about it (e.g. cooking dinner with friends).
This division of cognitive labor allows humans to achieve great things. The “social brain” hypothesis suggests that humans got so intelligent relative to other mammals because of the increasing size and complexity of social groups, which allowed us to do more complex tasks.
“The social brain hypothesis posits that the cognitive demands and adaptive advantages associated with living in a group created a snowball effect: As groups got larger and developed more complex joint behaviors, individuals developed new capabilities to support those behaviors. These new capabilities in turn allowed groups to get even larger and allowed group behavior to become even more complex.”
Technology. A lot of our knowledge comes from technology. We store understanding on the Internet, just as we store it with other people. Many apps and software make this act of knowledge storage frictionless, accelerating the trend of storing knowledge in technology.
These two methods of acquiring knowledge make human beings brilliant, but they also trap us in the knowledge illusion.
(1) Humans do a lot of causal reasoning, but we are often bad at it.
Although causal reasoning is our modus operandi for understanding the world, most of us are still quite bad at it! Think about how a toilet flush works: it’s a mechanism most of us interact with daily, yet few of us understand how pressing the flush button causes water to rush into the toilet bowl.
The difficulty of causal reasoning is one reason why there’s no universal agreement about the best diet, the best economic policy, or the best foreign policy in the Middle East. A lot of the time we’re just guessing.
Our inclination for causal reasoning combined with the difficulty of making accurate causal inferences means that we often have false beliefs that we think are true beliefs!
(2) A lot of our knowledge comes from the world outside our brains, but we confuse it for stuff inside our brains.
Living in a community of knowledge means that only some of the knowledge you have resides in your head. A lot of it is actually contained in other people. There is no sharp boundary between one person’s ideas and knowledge and those of others in the community, so it’s easy to mistake what is in someone else’s head for what is in your own.
Recent studies have also shown that storing knowledge in technology leads us to overestimate how much we know. Adrian Ward (a psychologist at the University of Texas) finds that engaging in Internet searches increases people’s self-esteem: their sense of their own ability to remember and process information. Furthermore, people who searched the web for facts they didn’t know and were later asked where they found the information often misremembered, telling the researchers that they had known those facts all along! As the authors write, “They gave themselves the credit instead of Google.”
Unfortunately, the knowledge illusion can have pretty detrimental effects on the way we think and talk about science and politics.
The knowledge illusion in science: There are lots of controversial issues in science where people have strong opinions but little understanding. Examples include climate change, genetic engineering, and vaccines. These issues are often infused with anti-scientific sentiments, a rejection of scientific research, and mistrust in medical professionals.
The “deficit” model of scientific understanding says that these attitudes stem from people misunderstanding science, and hence can be fixed once this “deficit” is filled. However, there’s little evidence that the deficit model is correct. Showing people information about the effects of vaccines, and of not getting vaccinated, doesn’t actually change their attitudes.
One reason why the deficit model is wrong is that we typically don't know enough individually to form knowledgeable, nuanced views about new technologies and scientific developments. So, we have to adopt the positions of those we trust (community of knowledge). Our attitudes and those of the people around us become mutually reinforcing. The fact that we have a strong opinion makes us think there must be a firm basis for our opinions, reinforcing the knowledge illusion.
"Scientific attitudes are not based on rational evaluation of evidence, and therefore providing information does not change them. Attitudes are determined instead by a host of contextual and cultural factors that make them largely immune to change."
The knowledge illusion in politics: Public opinion about social policies is always more extreme than people’s understanding justifies.
There are two important reasons for this: