We
all have had the strange experience of conversing with a seemingly rational and
sane person who at some point expresses some belief that to us seems
unimaginably weird: that transdimensional reptilian aliens killed JFK or some
such thing. Most often the expressed belief isn’t so far outside the mainstream
as that, but even mainstream beliefs can seem inexplicable to those of us who
don’t share them. We like to believe (there’s a variant of that word again)
that we are logical and those other people are not. This is improbable. Even if
we happen to be right, or at least less wrong, the odds are we arrived at our
beliefs first and found reasons to justify them later. This is just the way the
human mind works; with few exceptions, those of us with mainstream opinions
come by them the same way the reptilian enthusiasts do.
Michael Shermer is a founding publisher of the magazine Skeptic and writes a column for Scientific American. I'd read numerous articles by him in the past.
Last week I picked up The Believing Brain,
in which he explains how people are hardwired to see
patterns and agency in the world. Shermer writes
about the physiology of belief and the effects of priming, anchoring, and
pre-existing expectations. There are sound evolutionary reasons for people to
be adept at discerning causes, patterns, meanings, and hidden links where they
exist, but also to be just as talented at seeing and believing in them where
they don't. For example, those of our primate ancestors who concluded that a
rustle in the tall grass meant a leopard was lurking there, because one had
been there the last time the grass rustled, were most often wrong; nonetheless,
they were more likely to survive than the primates who didn't believe in the
connection. Being credulous is less likely to get you
killed and thereby removed from the gene pool. Evolution favors belief based on
shoddy evidence over skepticism. Far from being a barrier to holding
outlandish beliefs, by the way, intelligence actually helps. Shermer argues that intelligent people can convolute,
reinterpret, and interconnect data in creative ways that dimmer bulbs cannot
match. It is especially hard to change
minds when a sense of self is tied to a system of beliefs such as a religion or
political ideology: “our most deeply held beliefs are immune to attack by
direct educational tools, especially for those who are not ready to hear
contradictory evidence.” The
truth, or something approximating it, sometimes can be found, however, and the
scientific method is the best way of finding it. It behooves us to recall the
admonition of physicist Richard Feynman:
“If
it disagrees with experiment, it is wrong. In that simple statement is the key
to science. It doesn’t make any difference how beautiful your guess is, how
smart you are, who made the guess, or what his name is. If it disagrees with
experiment, it’s wrong. That’s all there is to it.”
My math teacher
back in high school liked to quote Edgar Allan Poe: “Believe nothing you hear,
and only one half that you see.” This is hyperbolic (and of course logically
self-negating), but there is an underlying value to the advice. “Belief comes quickly
and naturally, skepticism is slow and unnatural, and most people have a low
tolerance for ambiguity,” says Shermer. Turning skepticism on ourselves is
especially hard, but in a world increasingly characterized by true believers
(secular and otherwise) and extreme partisanship, it is something to be
encouraged.
The Believing Brain is worth a read, as prods to question
ourselves (not just others) and our own belief systems usually are. Thumbs Up.
Thin Lizzy – Don't Believe a Word
"Belief comes quickly and naturally, skepticism is slow and unnatural": and so, it seems to me, is intelligence. Is it considered unnatural because evolution, or nature/nurture, is so predominantly hardwired into us? "And most people have a low tolerance for ambiguity": probably so, but I feel more ambiguous these days rather than knowing a great deal.
We take a lot of shortcuts when forming judgments, which is a handy skill when we need to make snap decisions. It is not a good habit when we have the time to reflect but don't use it. Take the old admonition, "To a man with a hammer, everything looks like a nail." Shermer relates a classic experiment in which several people falsely reported to psychologists that they heard voices (not uncommon in cases of stress or sleep deprivation) but that the voices had gone away. They otherwise answered all questions honestly. Every single one was diagnosed as schizophrenic and admitted to a psychiatric hospital; the therapists' notes showed that everything the subjects said (no matter how normal) was interpreted as supporting that conclusion. None of the staff in the hospitals questioned the diagnoses, though interestingly the other patients did. Some patients would ask things like, "You're a reporter for a newspaper, aren't you?"