A common question of skeptics and science-based thinkers is “How could anyone believe that?” People do believe some really weird things and even some obviously false things. The more basic question is how we form all our beliefs, whether false or true.
Michael Shermer’s book Why People Believe Weird Things has become a classic. Now he has a new book out: The Believing Brain: From Ghosts and Gods to Politics and Conspiracies: How We Construct Beliefs and Reinforce Them as Truths. It synthesizes 30 years of research into the question of how and why we believe what we do in all aspects of our lives.
Some of the content is repetitious for those of us who have read Shermer’s previous books and heard him speak, but the value of the new book is that it incorporates new research and it puts everything together in a handy package with a new focus.
Shermer says:
I’m a skeptic not because I do not want to believe, but because I want to know. How can we tell the difference between what we would like to be true and what is actually true? The answer is science.
He includes a pithy quotation from Richard Feynman that I had not seen before:
If it disagrees with experiment, it is wrong. In that simple statement is the key to science. It doesn’t make any difference how beautiful your guess is, how smart you are, who made the guess, or what his name is. If it disagrees with experiment, it’s wrong. That’s all there is to it.
Our schools tend to teach what science knows rather than how science works. The scientific method is a teachable concept. But
our most deeply held beliefs are immune to attack by direct educational tools, especially for those who are not ready to hear contradictory evidence.
This is a problem. Shermer does not offer a solution.
The brain is a belief engine. It relies on two processes: patternicity and agenticity. It finds meaningful patterns in both meaningful and meaningless data. It infuses patterns with meaning, and imagines intention and agency in inanimate objects and chance occurrences. We believe before we reason. Once beliefs are formed, we seek out confirmatory arguments and evidence to justify them. We ignore contrary evidence or make up rationalizations to explain it away. We do not like to admit we are wrong. We seldom change our minds.
Our thinking is what Morgan Levy has called “intelligently illogical.” If our ancestors assumed that the wind rustling the bushes was a lion and they ran away, that wasn’t a big problem. If there really was a lion and they didn’t run away, they were in trouble. Natural selection favors strategies that make many false causal assumptions so as not to miss the true ones that are essential to survival. Superstition and magical thinking are natural processes of a learning brain. People believe weird things because of our evolved need to believe nonweird things.
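To see why this asymmetry favors jumpy pattern detectors, here is a minimal expected-cost sketch. The numbers are my illustrative assumptions, not figures from the book: fleeing a rustle wastes a little energy every time, while ignoring a real lion can be fatal.

```python
# Toy error-management model: when does "always flee the rustle" win?
# All numbers here are illustrative assumptions, not data from the book.

def expected_cost(p_lion: float, cost_flee: float, cost_eaten: float, flee: bool) -> float:
    """Expected cost of responding to one rustle under a given strategy."""
    if flee:
        return cost_flee           # small fixed cost paid on every rustle
    return p_lion * cost_eaten     # catastrophic cost paid only when it's a lion

p_lion = 0.01        # lions are rare: 1 rustle in 100 is real
cost_flee = 10       # calories wasted on a false alarm
cost_eaten = 10_000  # being eaten is (nearly) maximally bad

print(expected_cost(p_lion, cost_flee, cost_eaten, flee=True))   # 10
print(expected_cost(p_lion, cost_flee, cost_eaten, flee=False))  # 100.0
# Fleeing is the better bet whenever p_lion > cost_flee / cost_eaten
# (here 0.001), so selection favors hair-trigger pattern detectors
# that generate plenty of false positives.
```

The point of the toy model is only that even a very low base rate of real lions justifies constant false alarms when the two error costs differ by orders of magnitude.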
Belief comes quickly and naturally, skepticism is slow and unnatural, and most people have a low tolerance for ambiguity.
We rely on a feeling of conviction, but that feeling can become uncoupled from good reasons and good evidence. Science aims to counteract false beliefs by recoupling that feeling to reality through counterarguments supported by even better reasons and evidence.
As science advances, the things we once thought of as supernatural acquire natural explanations. Thunderstorms are caused by natural processes of electricity in clouds, not by a god throwing thunderbolts.
Belief in God is hardwired into our brains through patternicity and agenticity. We see patterns even when they are not there (the Virgin Mary on a toasted cheese sandwich), and we interpret events as having been deliberately caused by a conscious agent (the AIDS virus was created in a government lab for genocidal purposes). God is the ultimate pattern and agent that explains everything. And religious belief had survival value for human groups, encouraging conformity, group cooperation, and altruism.
Shermer covers a variety of subjects, from alien abductions to cosmology, from economics to politics, from belief in the afterlife to evolution, from ESP to morality, with a lot of entertaining examples. He doesn’t give much space to medical topics but he does mention AIDS denial, the vaccine/autism brouhaha, and alternative medicine, which he calls “a form of pseudoscience.”
Conspiracy theories abound, from Holocaust denial to 9/11 Truthers to the spread of AIDS. They result from wide-open pattern-detection filters and from the assumption that there must be a conscious agent behind everything. Shermer provides a handy list of 10 characteristics of a conspiracy theory that indicate it is likely to be false; for instance, the more people who would have to have been involved in a cover-up, and the longer the alleged cover-up has lasted, the less likely it is that no one would have spilled the beans by now.
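That last point can be made quantitative with a back-of-the-envelope model of my own (not Shermer’s): if each conspirator independently has even a small chance of leaking per year, the probability that everyone stays silent collapses as the group grows and the years pass.

```python
# Back-of-the-envelope leak model; the per-person leak rate is my
# illustrative assumption, not a figure from Shermer's book.

def prob_still_secret(n_people: int, years: int, p_leak_per_year: float = 0.01) -> float:
    """Probability that no conspirator has spilled the beans after `years`
    years, assuming each of `n_people` leaks independently each year."""
    return (1.0 - p_leak_per_year) ** (n_people * years)

print(f"{prob_still_secret(5, 2):.2f}")      # ~0.90: a small, short plot can hold
print(f"{prob_still_secret(1000, 40):.1e}")  # ~2.6e-175: effectively impossible
```

The independence assumption actually flatters the conspirators; in reality defections and deathbed confessions pile up, which only makes decades of perfect silence less plausible.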
He provides a useful discussion of the various biases we are prone to, from confirmation bias to the status quo bias, and points out that science is the ultimate bias-detection machine. He revisits the “Gorillas in our midst” video to remind us that we don’t see things that we’re not looking for. (In case you don’t know, that was an experiment demonstrating inattentional blindness: a gorilla walks through a group of people playing basketball and we don’t see him because our attention is fixed on counting the number of times the players in white shirts passed the ball.) He quotes Upton Sinclair:
It is difficult to get a man to understand something, when his salary depends upon his not understanding it.
When I read that, Dana Ullman came to mind.
I particularly got a kick out of one of Shermer’s examples. Galileo used an early telescope to observe four moons orbiting Jupiter. One of Galileo’s colleagues refused even to look through the telescope, calling it a parlor trick, saying he didn’t believe anyone else would see what Galileo saw, and claiming that looking through glasses would only make him dizzy. Other colleagues who did look were similarly dismissive; one tested the telescope in a series of experiments and said it worked fine for terrestrial viewing, but when pointed at the sky it somehow deceived the viewer. One professor of mathematics accused Galileo of putting the moons of Jupiter inside the tube.
We are beginning to develop a new understanding of how the brain generates beliefs and reinforces them. Mr. Spock is science fiction; humans are often illogical and emotional. We need emotion to motivate us and help us function. An emotional leap of faith beyond reason is often required for us to make decisions or just to get through the day.
This thought-provoking book is a good read and a good reference. Takeaway lessons:
- Beliefs come first, reasons follow.
- False beliefs arise from the same thought processes our brains evolved for learning about the world.
- Our faulty thinking mechanisms can’t be eliminated, but our errors can be corrected by science.
This article was originally published in the Science-Based Medicine Blog.