This is an excellent book spanning the psychology, biology, and neuroscience of belief. Shermer has made the rounds through religious and political beliefs and come out the other side with an appreciation of the science of belief itself. He is a psychologist, the founding publisher of Skeptic magazine, and holds a Ph.D. in the history of science.
Shermer sees the post-modernist, media-reinforced idea that
‘truth is relative’ being taken too far and out of context, used to allow more
possibilities than is realistic. As The X-Files had it – we want to believe.
‘The truth is out there.’ He agrees, but adds that science is by orders of
magnitude the best method of discerning that truth. Many or most people believe in
the supernatural and the paranormal when asked. Perhaps it is the persistence
of life’s mysteries that leads to this, or as he says, a misunderstanding of the
scientific process. But why do people still believe, regardless of what science
says?
“Belief change comes from a combination of personal
psychological readiness and a deeper social and cultural shift in the
underlying zeitgeist, which is affected in part by education but is more the
product of larger and harder-to-define political, economic, religious, and
social changes.”
He notes that after we form our beliefs, we defend, justify,
and rationalize them. So, belief comes first, and our conception of reality is
based on that belief. He calls this idea belief-dependent realism, and the idea
is modelled on the ‘model-dependent realism’ theory of the reality of physics
put forth by Stephen Hawking and Leonard Mlodinow in their book, The Grand
Design (also reviewed in this blog). He takes the idea a step further to say
that all scientific models, including model-dependent realism, are, in essence, forms of belief-dependent realism.
Our brains interpret sensory data to find patterns and then
infuse those patterns with meaning:
“The first process I call patternicity: the tendency to
find meaningful patterns in both meaningful and meaningless data. The second
process I call agenticity: the tendency to infuse patterns with meaning,
intention, and agency.”
The meaningful patterns become beliefs and shape how we
view reality. Once beliefs are formed, we look for confirmatory evidence, which
strengthens beliefs with emotional support. It is rare, he notes, for humans to
change their beliefs. We tend to hold onto them even in light of new evidence,
probably because we have invested in them.
He begins with the story of retired bricklayer Chick
D’Arpino, who had a mystical experience at 4 A.M. in 1966 in which he heard the
clear voice of a “source” (he could not determine whether it was God or something
else) speaking just 13 words about love – the source knows us, loves us, and we can have a
relationship with it. However, he won’t tell anyone the 13 words and says he
never will. Why? I don’t know. He thought for sure the experience came from
outside his mind, but Shermer, who became friends with D’Arpino, thinks
otherwise.
He talks about his undergrad experience in an Abnormal
Psychology class where he visited clinics and hospitals for mental illness and
had to read about the famous experiment of psychologist David Rosenhan, in which his
associates clandestinely entered mental hospitals as patients after reporting
hallucinations. Seven of the eight were diagnosed as schizophrenic and one as
manic depressive. The hospital staff ‘believed’ the diagnoses were correct,
treated the patients accordingly, and interpreted their behavior as symptomatic.
However, quite a few of the real patients suspected the ruse, which is
interesting. The power of expectation is significant. Part of the
issue is the assumption that since the patient is there, then he or she must
have mental health issues, so the diagnostic bias or label is already in
place. Rosenhan also tried the experiment in reverse – to see whether genuine
patients would be judged sane when fakes were expected. He told an institution that he
would send fake patients. However, none were actually sent. Even so, the
institution judged about 20% of newly admitted patients as fake and suspected many
others – so, yes, the bias worked both ways, though not as well.
“What you believe is what you see. The label is the
behavior. Theory molds data. Concepts determine percepts. Belief-dependent
realism.”
Shermer describes himself as a materialist. He thinks mind
is generated solely by the activities of the brain.
Next, he tells the story of an atheistic scientist, a
geneticist, Francis Collins, M.D., Ph.D., who had an epiphany and became a
born-again Christian. He was influenced by the writings of C.S. Lewis. After
his epiphany, his belief was formed, and then his new reality unfolded –
remember, belief-dependent realism. He was the head of the National Institutes
of Health. He wrote a best-selling book called The Language of God in 2006, where he concluded that it is more rational to believe in God than to not believe
in God. He is well-versed in science, supports evolution, and debunks
intelligent design theory. He was raised very secular, noticed that some people
found comfort in religion and faith, and was swayed by Lewis’s arguments at a
critical point in his life, apparently. He calls himself a theistic
evolutionist. He describes his conversion as a choice – Lewis said one needed
to make a choice – and a leap. Shermer, a one-time believer turned non-believer, includes
parts of his interview with Collins, a one-time non-believer turned believer. Shermer
sees Collins’s conversion as having both intellectual and emotional components. Collins
even came to agree with Shermer that it is becoming clear our moral
sense evolved along with our tendencies to be social, cooperative, and altruistic
– they all increase our fitness. They agree to disagree, with Collins seeing our
inner voice or moral sense as deriving ultimately from God and Shermer seeing
it as derived solely from evolution.
Shermer also points to evidence that more educated people with higher IQs are more skilled at rationalizing beliefs. This leads to his rule of thumb:
“…smart people believe weird things because they are
skilled at defending beliefs they arrived at for nonsmart reasons.”
Another way he says it is that “reason’s bit is in the
mouth of belief’s horse.”
Shermer became ‘born again’ as a senior in high school, and he recounts his journey from there to skepticism. He was a “Jesus freak,” or
“bible thumper” for seven years. He hung around with others like him. He
recounts his slow and gradual ‘deconversion,’ becoming ‘unborn’ again. When at
college, he was among others more secular. That, along with philosophy classes,
secular and behaviorist professors, science, and seeking a master's degree in
experimental psychology, helped him deconvert. Nowadays, Shermer doesn’t even
believe in the existence of mind. He sees it as a form of dualism innate to our
cognition – perhaps a convenient (and functional) illusion. In grad school, he
studied ethology (one precursor to evolutionary psychology) and cultural
anthropology. Once deconversion was unstoppable, he realized what a pain in the
ass he must have seemed trying to evangelize and convert others. He understands
the worldview of religious indoctrination because he was once under its spell.
Perhaps that gives him great perspective to study belief. Again, he notes that
reality follows belief. We choose a way to believe and our reality tends to
accord with it. Everything one encounters is put through that belief framework
– one sees through the lens of belief. The final straw for him was the problem
of evil, which he could not reconcile. He wrote a book about it called The
Science of Good and Evil. He also found that morality is not at all dependent
on religion.
Continuing his own skeptic’s journey, Shermer turns to
politics. Similar to his religious conversion, he had a political conversion
sparked by the writings of Ayn Rand. He now acknowledges that Rand developed a
cult-like following. She was venerated like a cult leader and thought to be
right without question. He is still a fan of Rand’s ideas but not of her
infallibility. He began a study of economics and capitalism. He is basically a
free-market-advocating libertarian.
He says that science has three legs: data, theory, and
narrative. He splits narrative into formal (narrative of explanation) and
informal (narrative of practice). The informal narrative of practice is
messier, like life. While Shermer is a skeptic who does believe in science, he
does acknowledge that he might be wrong. He says, “Maybe. But I doubt it.”
Regarding religion, he notes the absurdity of the idea that belief in a
specific supernatural scenario could be the dividing line between intense joy and
intense suffering in some afterlife.
Next, he addresses patternicity. The main example involves
a hominid in the savanna encountering a rustling in the grass. It could be a
dangerous predator, or it could be the wind. Successful prediction could be a
life and death matter. If one assumes a predator but it turns out to be wind, then
that is called a ‘false positive,’ or Type I error in cognition. No major harm
done. If one assumes wind but it turns out to be a dangerous predator, then that
is called a ‘false negative,’ a Type II cognitive error, and death or serious
injury could be the result.
“Our brains are belief engines, evolved
pattern-recognition machines that connect the dots and create meaning out of
the patterns we think we see in nature. Sometimes A is connected to B;
sometimes it is not.”
This is patternicity, or ‘association learning.’ Success
in interpreting meaningful patterns aids survival, so natural selection
has strengthened the tendency over evolutionary time. All animals do it to some
extent. He defines patternicity as “the tendency to find meaningful patterns in
both meaningful and meaningless noise.” An earlier version of his theory (as he
says) was presented by biologists Foster and Kokko (from Harvard and Helsinki, respectively) in their 2008 paper “The Evolution of Superstitious and
Superstition-Like Behaviour,” and was based on Hamilton’s idea of inclusive
fitness and kinship. They determined that “whenever the cost of believing a
false pattern is real is less than the cost of not believing a real pattern,
natural selection will favor the patternicity.” Thus, they concluded that the
evolutionary rationale for superstition is that “natural selection will favor
strategies that make many incorrect causal associations in order to establish
those that are essential for survival and reproduction.” True pattern
recognition helps us survive, but false pattern recognition does not necessarily
have negative consequences, so it tends to stick around. Shermer states it this
way: “people believe weird things because of our evolved need to believe
nonweird things.”
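Foster and Kokko's rule is, at bottom, a comparison of expected costs. Here is a minimal sketch of that decision logic in Python, using invented numbers for the rustle-in-the-grass scenario (the numbers are mine, not theirs):

# Expected-cost comparison behind Foster and Kokko's argument, with
# made-up costs for the classic rustle-in-the-grass scenario.

def favors_belief(p_real, cost_false_positive, cost_false_negative):
    """Return True if assuming the pattern is real has the lower expected cost.

    p_real              -- probability the rustle really is a predator
    cost_false_positive -- cost of fleeing when it was only the wind (wasted energy)
    cost_false_negative -- cost of ignoring a real predator (injury or death)
    """
    expected_cost_of_believing = (1 - p_real) * cost_false_positive
    expected_cost_of_not_believing = p_real * cost_false_negative
    return expected_cost_of_believing < expected_cost_of_not_believing

# Even if predators are rare (5% of rustles), the asymmetry in costs
# makes "believe and flee" the favored strategy.
print(favors_belief(p_real=0.05, cost_false_positive=1, cost_false_negative=100))  # True

With costs this lopsided, selection favors the organism that routinely "believes" the pattern, which is exactly why false positives tend to stick around.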
He notes that anecdotal association is an example of
patternicity that often leads to “faulty conclusions.” Anecdotal thinking is
part of our history and folklore, and likely our biology too, so it competes with
the methods of science, which are only a few hundred years old and learned rather
than biological. Shermer mentions B.F. Skinner’s experiments, which explored the
superstitious behavior of pigeons presented with a variable feeding
schedule after a period of regular feeding. Skinner concluded that
the pigeons were attempting to recreate the body positions and orientations they had
happened to be in during the previous feeding in order to bring on the next. Did they first scan for a
pattern, then attempt to recreate it? Skinner was wholly convinced that their
gestures were performed in order to get the food via pattern matching, albeit
imagined pattern matching (aka superstition). The false connections have also
been termed “accidental learning.” Shermer notes that superstition can be
extinguished in pigeons but that it is much more difficult to do so in humans.
Learning by imprinting, as many animals do, involves forming
fixed and lasting memory patterns. This is what Shermer calls ‘hardwired
patternicity,’ our instinctual imprinting. In animals these are mostly stimulus-response
sequences: a pattern is recognized and acted upon according to imprinted or otherwise
associatively learned responses, and those response tendencies themselves evolve. Facial recognition learning occurs early in humans
and, he says, is likely a Sign Stimulus-Innate Releasing Mechanism-Fixed Action
Pattern (SS-IRM-FAP) process as first described in herring gulls feeding their
young by ethologists Tinbergen and Lorenz in the 1950s. Recognizing faces
offered evolutionary advantages to early humans. It also has an unconscious
aspect – we actually recognize and begin to react to faces (and other hardwired
patterns) unconsciously before we do so consciously. Our intentions appear to
be acted upon before we are aware of a conscious decision to act, as some
experiments using EEG suggest.
Another form of patternicity appears when an insect-eating
predator avoids insects whose colors resemble those of stinging insects. Thus,
evolution has influenced tendencies to consume or avoid. Evolution also
favors non-venomous snakes that resemble venomous ones. What we look for in
potential romantic partners is also evolutionarily hardwired to some extent. We
tend to respond to ‘super-normal’ stimuli like unusual or enhanced features.
These are examples of preprogrammed patternicities.
Regarding control of one’s environment, psychologists refer
to an internal locus of control and an external locus of control. Those with a
strong internal locus of control tend to think that they create their own
situations and experiences, while those with a dominant external locus of
control tend to think that things just happen to them. Skeptics, he says,
tend to have a stronger internal locus of control while believers in the
paranormal tend to have a stronger external locus of control, as measured by
tests designed to do so. Where the environment is more certain (as in modern
times in developed countries), the tendency for an internal locus of control is
higher. Magical thinking is more prevalent under anxiety and uncertainty. He also
recounts experimental psychologist Susan Blackmore’s change from a believer in
the paranormal to a skeptic and her experiments that showed believers were far
more likely to see hidden patterns and messages, which led her and others toward
skepticism. Believers and skeptics approach the data of experience differently.
Perhaps it was uncertainty and lack of control that led to the conspiracy
theories around the 9/11 terrorist attacks. Evidence suggests that negative
events, especially unexpected ones, are more likely to be attributed to
incorrect causes and conspiracies. Many of us have a tendency to make ‘illusory
correlations’ or illusory pattern detections under certain circumstances.
Experiments also suggest that a sense of control is associated with
positive health and feelings of well-being.
Patternicity can be useful or damaging. People indulging in delusional conspiracies have committed murder. Quackery and pseudo-science can
also occasionally be harmful or even deadly.
Agenticity often involves the presumption of an ‘other’
that is also an intentional agent. Shermer defines agenticity as “the tendency
to infuse patterns with meaning, intention, and agency.” Beliefs in spirits,
ghosts, souls, gods, demons, aliens, government conspiracies, etc. in most
forms are examples of agenticity. Patternicity and agenticity make up the
cognitive basis of shamanism, paganism, animism, monotheism, polytheism,
spiritualism, intelligent design, and New Age-ism, he says – in most forms (I add)
since he does not consider here the possible metaphorical and psychological
aspects of such beliefs which may be given meaning without actually subscribing
to the beliefs. He also considers, and I agree, that such ideas are, or at least
seem, intuitive, that we often find patterns and ascribe agency to them in our
everyday analysis of experience. Essentialism, or belief in a life force that
can be transferred, strengthened, or weakened, is an example of such a
seemingly intuitive idea, as is animism. We all tend to do it sometimes. Shermer
has spent some of his scientific career (as a professional skeptic) debunking
the attribution of agenticity to meaningless patterns. He has participated in
experiments and debates related to ‘sensed presence,’ magnetically-induced
OBEs, and many psychic and parapsychology experiments. He has even had some
bizarre experiences himself, one of a stress-induced and
sleep-deprivation-induced alien-like visitation during a long cross-country
bicycle race. Such experiences are also common with mountain climbers and other
extreme sports enthusiasts. He attributes these experiences to stress-induced
‘sensed presence.’ He sees extreme conditions as the trigger, with deeper
causes in the brain as follows:
“1) an extension of our normal sense of presence of
ourselves and others in our physical and social environments; 2) a conflict
between the high road of controlled reason and the low road of automatic
emotion; 3) a conflict within the body schema, or our physical sense of self,
in which your brain is tricked into thinking that there is another you; or 4) a
conflict within the mind schema, or our psychological sense of self, in which
the mind is tricked into thinking that there is another mind.”
He goes into some of the possible neuroscience of these
causes, such as controlled vs. automated brain functions and the emotional
circuits, such as the amygdala fight-flight-freeze circuit and the autonomic
nervous system. He also mentions things like phantom limb syndrome, in which a learned
component of paralysis, based on past expectations and habits built into
our body schema, can persist. The sensed presence of another mind may have to do with our
‘theory of mind,’ the inference that there are other minds distinct from our own.
Shermer sees the mind as ‘what the brain does.’ It reduces
to the level of the neuron. Here, he reviews cognitive neuroscience, once
referred to as physiological psychology. Neurons make excitatory and inhibitory
postsynaptic potentials. They communicate information by firing frequency,
firing location, and firing number. They are considered similar to the binary
1-0 digits of a computer. Electrical signals course through neurons until they
reach the synapses, where chemical transmitter substances (CTS) are the chemical
signals that transfer information to subsequent neurons. Various drugs can
affect CTS release and uptake processes. The CTS dopamine has been called the
‘belief drug,’ and is involved with the learning and reward systems of the
brain (reward learning itself was explored by Skinner in his operant conditioning experiments). Skinner
called the reward ‘reinforcement,’ and the sequence of operant conditioning runs
Behavior-Reinforcement-Behavior, with the dopamine system involved throughout. There is
debate as to whether dopamine acts to stimulate pleasure or to motivate
behavior. The dopamine system is involved in addiction as the drugs or
behaviors take over the role of reward signals. UCLA neuroscientist Russell
Poldrack thinks that the dopamine system is more involved with motivation, and
the opioid system is involved with pleasure. He says that blocking the dopamine
system (in rats) will stop motivation but not enjoyment. Experiments have
suggested that increased dopamine boosts the signal, or rather the
signal-to-noise ratio (SNR), which can aid ‘error detection’ and other
patternicity – in effect, dopamine enhances our pattern-detection abilities by
boosting the SNR. Schizophrenics and creative
people may also develop enhanced pattern-detection abilities.
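To make the signal-to-noise idea concrete, here is a small, purely illustrative simulation (my own toy Python example with invented parameters, not anything from the book). Raising the signal relative to the noise improves detection without producing more false alarms, while simply lowering the decision threshold – becoming more pattern-hungry – raises hits and false alarms together, the false alarms being patterns found in meaningless noise.

import random

random.seed(1)

def detection_rates(threshold, signal_strength, trials=20000):
    """Simulate a yes/no detection task with unit-variance Gaussian noise.
    Returns (hit_rate, false_alarm_rate) for the given decision threshold."""
    hits = false_alarms = 0
    for _ in range(trials):
        if signal_strength + random.gauss(0, 1) > threshold:  # signal-present trial
            hits += 1
        if random.gauss(0, 1) > threshold:                    # noise-only trial
            false_alarms += 1
    return hits / trials, false_alarms / trials

# Boosting the signal-to-noise ratio at a fixed threshold: hits rise, false alarms do not.
for snr in (0.5, 1.0, 2.0):
    print("SNR", snr, detection_rates(threshold=1.0, signal_strength=snr))

# Lowering the threshold instead: hits rise, but so do false alarms (patterns in noise).
for th in (1.5, 1.0, 0.5):
    print("threshold", th, detection_rates(threshold=th, signal_strength=1.0))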
Shermer says he is a monist, meaning the brain is all,
rather than a mind-body, mind-brain, body-soul dualist, Descartes-style. Some
researchers think we are intuitively dualists, seeing mind and body as
separate. It just seems that way. This perhaps reaches into our views about
life after death. It seems plausible to many that the soul/mind lives on
somehow, yet there is no evidence. Some have even said we have a belief instinct
(see Jesse Bering’s book, The Belief Instinct). Neurologists like Oliver Sacks
showed us that changes in the brain are often the cause of hallucinations, some
of which are interpreted as real by the experiencer.
When we become aware that we and others have beliefs,
desires, and intentions, we engage in what is called Theory of Mind (ToM). ToM
is the basis for agenticity. We realize we are an agent and, taken to a higher
level, we recognize others as agents. ToM evolved out of the necessity to read the
intentions of others to enhance our own survival. He says ToM is an automatic
system that kicks in during social situations. ToM may be involved in learning
through imitation, transferring the movements of others into our own movements
when learning, possibly with the use of so-called ‘mirror neurons,’ which fire
during imitation learning. Shermer recounts 2007 experiments by neuroscientist
Sam Harris and colleagues that suggest that it is easier to believe than to
reject a belief, to accept appearances until proven false. Those experiments
also looked for neural correlates of belief and found activity in the
ventromedial prefrontal cortex, which links lower-order emotional responses with
higher-order cognitive factual evaluations. Shermer says this supports
“Spinoza’s conjecture: belief comes quickly and naturally, skepticism is slow
and unnatural, and most people have a low tolerance for ambiguity.” Other
experiments by Harris et al. suggested that there is no single belief or disbelief
module in the brain and that we often rely on feelings and convictions to support beliefs, especially as they decouple from reason and evidence. Shermer hopes that we can
use reason and evidence in counterarguments to re-couple them with emotion and change
beliefs.
He explores belief in an afterlife. After going through
some stats he comes up with the following observations: 1) belief in an
afterlife is a kind of agenticity; 2) it is also a kind of dualism; 3) it
derives from Theory of Mind; 4) it is an extension of our body schema (we
mentally project the body schema into the future); 5) afterlife belief is
probably mediated by our left-hemisphere interpreter (this neural
network/circuit is involved in creating narratives, which is how beliefs in
afterlife scenarios seem to work); 6) it is an extension of our normal ability
to imagine ourselves somewhere else in space and time.
He also says we are intuitive immortalists. Jesse Bering, in
the book ‘The Belief Instinct,’ noted that we have a hard time fathoming
what it would be like to not exist, as we have no basis for understanding, so we
just assume we will always exist in some way.
Shermer notes that there are four lines of evidence often
given by those who believe in life after death: 1) information fields and
universal life force – these are also intuitive notions without evidence; 2)
ESP and evidence of mind; 3) quantum consciousness; and 4) near-death
experiences. On the first point, he goes through the work of Rupert Sheldrake
regarding information fields and concludes that it is mostly bunk. He does the
same with ESP and has done so in many experiments where he was the skeptic. It
is the same with quantum consciousness and NDEs (and OBEs) – no real evidence.
He talks about a 2009 episode of Larry King Live on which he appeared with Dr.
Sanjay Gupta, Dr. Deepak Chopra, and conservative Christian apologist (recently
pardoned by Trump for illegal political contributions) Dinesh D’Souza (who also
wrote a book arguing in favor of life after death). His “baloney detector” was
going haywire. Their arguments were often simply – if one can’t provide a
natural explanation, then a supernatural one can suffice. Bull, he says. Chopra,
it appears, simply wants to validate fuzzy New Age consciousness mumbo
jumbo with some quantum mechanics and neuroscience thrown in. Shermer, for all
his skepticism, says he would like to believe in some sort of afterlife, but
there is simply no evidence.
Most people in the world believe in God or gods or some
higher power. According to surveys, America has some of the highest percentages
of believers. Darwin pondered whether evolution could account for the
universality of religious beliefs. Shermer believes it is indeed a powerful
influence, one of several. He defines religion as: “a social institution to
create and promote myths, to encourage conformity and altruism, and to signal
the level of commitment to cooperate and reciprocate among members of a community.”
He thinks that as human bands coalesced into larger tribes and eventually
city-states, religion co-evolved with government to codify moral behavior
into laws and principles. He thinks that specific human universals related to
religion (belief in the supernatural, anthropomorphizing, ideas about fortune,
etc.) are influenced by our genetic predispositions, and this is why they have
come to be recognized as human universals. He notes that small hunter-gatherer
bands are often very egalitarian, and that is likely because the needs of the
group are favored over the needs of the individual by strong enforcement of moral
rules with gossip, ridicule, shunning, and other forms of ostracization. Myths
and supernatural beings are often employed to promote fairness in the social
group, which also becomes a moral group. While our own culture gives us the
specifics of religion, the desire to be religious itself is influenced by
evolution. Studies of identical twins vs. fraternal twins have strongly
suggested that genetics influences one’s religious activities and, to a lesser
extent, one’s beliefs. However, it is doubtful that we possess a “God gene” as
geneticist Dean Hamer’s book title suggests (apparently, he did not approve of
the title chosen by the publisher), even though we may have genes that make us more
predisposed to engaging in spiritual activities.
Shermer notes that man created gods rather than the other
way around. Duh, this is obvious. Gods and myths often arise in response to the
conditions and trials of the tribe. Shermer’s section – Theist, Atheist,
Agnostic, and the Burden of Proof – goes through the various arguments for the
existence of God. As an agnostic, he favors the words of a bumper sticker he
once saw: “Militant Agnostic: I Don’t Know and You Don’t Either.”
Shermer asks the odd yet compelling question that
determines what he calls Shermer’s last law: “any sufficiently advanced
extraterrestrial intelligence is indistinguishable from God.” He relates
this idea to evolution, to SETI (the search for extraterrestrial intelligence),
and intelligent design theory. His arguments are interesting but inconclusive.
He considers ‘Einstein’s God’ (which I see as more or less simply giving a name
and creator rank to mystery itself) and whether Einstein meant his ideas
literally or metaphorically. Einstein considered God to be beyond comprehension. He
was also influenced by his Jewish identity. Einstein favored Spinoza’s God, that
is, the harmony of existence; however, he did not think that God was concerned
with the fate or actions of humans.
Shermer describes the ‘supernatural’ as simply a term given
to mysteries as of yet not understood fully. The history of science shows that
we now understand many things naturally and scientifically that were once
considered to have supernatural causes. Still, mysteries enthrall us.
Nonetheless, Shermer sees it this way:
“Flawed as they may be, science and the secular
Enlightenment values expressed in Western democracies are our best hope for
survival.” (I might add that many of those values are expressed in
many non-Western democracies as well.)
Shermer next considers belief in aliens. As a skeptic, he
has debated several so-called alien abductees, including the famed Whitley
Strieber. Shermer asked Strieber before the show what he does in his off time –
he said he writes science fiction! He also considers other causes for perceived
alien abductions, including hypnagogic hallucinations, sleep paralysis,
hypnosis, sleep deprivation, stress, and lucid dreams – especially since the
“visions” recounted are often similar. His own view of ETs is that they could
exist, but their rarity, combined with the vast distances, makes encounters
unlikely. He recounts a conversation with Richard Dawkins about what ETs are
likely to look like, assuming evolution occurs in a similar way in other parts
of the universe. Sci-Fi writer Michael Crichton went so far as to describe SETI
as a religion, having faith that there is ‘someone out there.’ While this may
be the case, SETI is also a science, run by scientists, attempting to answer a question
that may end up being more religious than scientific.
Conspiracy theories are next considered. Conspiracy
theories are different from actual conspiracies. They are often highly
improbable, illogical, tend to snowball, and yet can be held onto even in the
face of heaps of refuting evidence. Shermer thinks they are believed because we
fail to apply pattern-detection filters, and they are aided by confirmation bias
and hindsight bias, which bend information to fit the narrative. There are
patterns in the way they develop that are pretty easy to figure out. He goes through
several in detail, including 9/11 conspiracies and JFK murder conspiracies.
Next is the politics of belief, which also includes
economics and ideologies. This part was good, I thought. Psychologists have
studied why people tend to lump into the liberal and conservative edges of the
political spectrum. A 2003 Stanford study of conservatives concluded that they
suffer from “uncertainty avoidance” and “terror management” and have a “need
for order, structure, and closure” along with “dogmatism” and “intolerance of
ambiguity,” which lead to “resistance to change” and “endorsement of inequality”
in their actions. Many conservatives did not agree and dissed the study, which
also associated some conservatives with Nazis. Shermer acknowledges that there
has long been a liberal belief bias in academia, which Harvard psychologist
Steven Pinker has written about and which bubbled up recently, especially in
so-called alt-right circles. University of Virginia psychologist Jonathan Haidt
considered that the standard liberal bias for why people vote Republican is
that conservatives are “cognitively inflexible, fond of hierarchy, and
inordinately afraid of uncertainty, change, and death.” Haidt encouraged his
academic colleagues to move beyond such biases. Shermer flips the analysis to
suggest what conservatives might say about liberals: a lack of moral compass,
an inability to make clear ethical choices, a lack of certainty about social issues,
a fear of clarity that leads to indecisiveness, a naïve belief in equal talent,
and a belief that culture and environment matter far more than
biological human nature (these last two mesh with Pinker’s “myth” of
the Blank Slate – that we are all the same before culture takes over – not so,
says Pinker). So, both liberals and conservatives tend to be biased, especially
about the other. The belief that “bleeding heart” liberals are more generous
and that conservatives are “heartless” is not borne out by data that show
conservatives give more to charity (although religious motives and being
wealthier in general may account for some of that). One reason this might be
the case is that conservatives think charity should be private, provided by
individuals, companies, and non-profits, while liberals think charity should be
public, provided by the government.
A 2005 UCLA study suggested that the media have a liberal
bias, and the current period of Trumpism makes the same claim in a much more
over-the-top fashion. Of course, with Fox News, we have a conservative bias strongly
manifested. The more biased media sources are also the most predictable.
Moderates and libertarians tend to be less predictable. Liberals and
conservatives stereotype each other, and such stereotypes tend to be emotionally charged. Haidt proposed that our ‘moral sense’ is based on five
innate psychological systems: 1) Harm/care (empathy and sympathy); 2)
Fairness/reciprocity – reciprocal altruism evolved into our sense of justice
and morality; 3) In-group/loyalty – social evolution based in tribalism; 4)
Authority/respect – based on social hierarchies developed from our primate
histories onward; 5) Purity/sanctity – we evolved to equate morality and
civility with cleanliness, and immorality and barbarism with filth. On Haidt’s
survey, liberals score higher on the first two and conservatives on the last
three. In other words, liberals and conservatives emphasize different moral
values.
Psychological experiments about generosity and the rule of law
suggest to Shermer that “in order for there to be social harmony, society needs
to have in place a system that both encourages generosity and punishes free
riding.” Religion and government are two such systems, he says. When societies
became too large for informal sanctions such as gossip, ridicule, and shunning, the institutions of government and religion developed to take over the enforcement of moral codes. Conservatives tend to favor private regulation of
behavior through religion, while liberals tend to favor public regulation of
behavior through government (Shermer adds – except for sexual mores, where
liberals tend not to want government to interfere). A perhaps confounding issue
is that we also evolved tribal in-group and out-group biases that tend to
make us competitive as ‘us vs. them’ team players. Shermer admits that he, as a
civil libertarian, is conflicted politically. He hopes that identifying the
moral values of liberals and conservatives will help bridge the political
divide.
Shermer goes through economist Thomas Sowell’s ideas in his
book, ‘A Conflict of Visions,’ where he argues that conservatives have a
constrained moral vision of human nature and liberals an unconstrained moral
vision of human nature. The unconstrained vision is optimistic but perhaps
overly idealistic. It suggests that all social problems can be solved with
sufficient commitment. The constrained vision is pessimistic but also realistic
in the sense that it acknowledges that all attempts to solve social problems
have costs, can lead to other social ills, and there are always trade-offs. Steven
Pinker, in his 2002 book, The Blank Slate (which I plan to review here at some
point), relabeled Sowell’s visions as the Tragic Vision and the Utopian Vision.
The Tragic Vision emphasizes that things like bureaucracies can explode into
self-interest for the implementers of the policies, while the Utopian Vision
seems to emphasize an increase in what is now often invoked as ‘social
engineering.’ Issues like the size of government, the level of taxation, and free
trade vs. fair trade tend to split along this divide (oddly, Trump and the conservatives who back him seem to
have reversed this last one). Shermer further alters this conflict-of-visions idea
to say that it is more of a spectrum, with full constraint at one end and no
constraint at the other. He calls his position the Realistic Vision: in reality, human nature is partly constrained, especially by genetics and evolution. It
acknowledges that we have a dual nature of being both selfish and selfless,
that people vary (i.e., no blank slate), and that over-focusing on equality could
cause as many new problems as it would solve. He thinks political moderates on
both left and right generally favor such a partially constrained Realistic
Vision of human nature. He gives evidence in support of a partially constrained
model: genetic differences among people that lead to different abilities,
failed communist and socialist experiments, failed utopian experiments, the
enduring power of family ties, the power of reciprocal altruism, the desire to
punish cheaters, the ubiquity of hierarchical structures, and in-group/out-group
dynamics.
Next, he explains why he is a libertarian. He invokes John
Stuart Mill, who in his 1859 book, On Liberty, argued that it was democracy
that defeated the tyranny of the magistrate characteristic of European
monarchies, but that same democracy could also lead to the tyranny of the
majority, which can work against the rights of the individual. He notes that our
Bill of Rights is intended to prevent a tyranny of the majority. He explains
that libertarianism is based on the principle of freedom, without infringing on
the freedom of others. He says that libertarianism incorporates moral
principles embraced by both liberals and conservatives. He does not think
Libertarians will ever be a viable third party in the U.S., though. I think he
considers them a type of moderate, but one rooted in personal liberties. The
party itself seems to produce both reasonable politicians and nutty ones, the latter seemingly obsessed with particular liberties such as gun rights or
corporate rights, which is perhaps one reason it is not very popular.
Political beliefs are different from scientific beliefs.
One might simply believe that a certain policy, at this time and place, is the
most viable and useful. Timothy Ferris, author of The Science of Liberty, says
that liberalism and science are methods rather than ideologies. Extreme
Islamists and some fundamentalist Christians favor theocracies that restrict
freedoms. Shermer has also written about free-market capitalism and offers this
assessment of democracy and capitalism:
“Liberal democracy is not just the least bad political
system compared to all others; it is the best system yet devised for giving
people a chance to be heard, an opportunity to participate, and a voice to
speak truth to power. Market capitalism is the greatest generator of wealth in
the history of the world and it has worked everywhere that it has been tried.
Combine the two and Idealpolitik may become Realpolitik.”
‘Confirmations of Belief’ is the next chapter title and is
a summary of cognitive biases. He starts out with what he calls folk numeracy,
a form of patternicity where we have a natural tendency to misperceive
probabilities, to think anecdotally rather than statistically, and to focus on
trends that confirm our own biases. Confirmation bias, in which we confirm
our own beliefs by selecting data that conform to them, is, according to
Shermer, the mother of all cognitive biases.
He defines confirmation bias as follows: “the tendency to seek and find
confirmatory evidence in support of already existing beliefs and ignore or
reinterpret disconfirming evidence.” Experiments have shown that people will
often favor evidence that confirms their own beliefs over evidence that
disconfirms them. He notes that confirmation bias is particularly powerful in
political beliefs. We tend to have emotional reactions to data that conflicts
with our beliefs (cognitive dissonance), and neuroscience has confirmed this
somewhat. Our preconceptions about various subjects, people, and policies tend
to be entrenched, and the power of expectation is also in play – we tend to
expect reality to fit our beliefs, and if it doesn’t, we tend to get emotional.
Remember, in Shermer’s model, beliefs come first, then reality =
belief-dependent realism.
Next, the “hindsight bias is the tendency to reconstruct
the past to fit with present knowledge.” After accidents or weather events and
during wartime, the hindsight bias often appears. It is easy to conclude that
we should have known or have been prepared for such events, that the clues were
there. It can get conspiratorial. After 9/11 came much hindsight bias. A
related bias is the self-justification bias. This is “the tendency to
rationalize decisions after the fact to convince ourselves that what we did was
the best thing we could have done.” Most biases involve “cherry-picking” data
to conform to pre-existing beliefs. The self-justification bias is strong in
politicians, who spin things to depict themselves as right even when they are
wrong and have clearly made incorrect predictions.
Attribution bias is “the tendency to attribute different
causes for our own beliefs and actions than those of others.” We might attribute
the success of others to luck, circumstances, having connections, or to some
innate disposition they have. In contrast, we tend to attribute our own
successes to hard work and/or some positive disposition. Shermer and a
colleague, Frank Sulloway, discovered and presented new forms of attribution
bias they call intellectual attribution bias and emotional attribution bias. They
noticed that when people were asked why they believe in God, they tended to give
intellectual reasons for their own belief, such as the harmonious design of the
universe, but when the same people were asked why other people believe in God,
they tended to give emotional reasons, such as the fear of death. We
tend to do the same with political hot-button issues, where we give rational
reasons for our own beliefs and attribute emotional reasons to the beliefs of
others, particularly those whose beliefs are opposed to ours.
Sunk-cost bias is simply “the tendency to believe in
something because of the cost sunk into that belief.” This often leads to the
fallacy that we cannot abandon an idea simply because we have invested
considerable resources into it. This is one reason why beliefs are difficult to
change. It may also be why politicians are so hard-headed.
Status quo bias is similar. He defines it as “the tendency
to opt for whatever it is we are used to, that is, the status quo.” This
rewards our laziness! It is likely another reason why people don’t like to
change their beliefs. The status quo bias is influenced by the endowment
effect. Economist Richard Thaler defined the endowment effect as “the tendency
to value what we own more than what we do not own.” Evolution is likely an
influence here. Certain animals tend to mark and defend their chosen territories,
even when other ones are available. Shermer notes that “beliefs are a type of
private property – in the form of private thoughts with public expressions –
and therefore the endowment effect applies to belief systems.” I think that the
sunk-cost bias, the status quo bias, and the endowment effect are much about
the energy required to overcome or redesign the past and about laziness.
Next, are framing effects – “the tendency to draw different
conclusions based on how data are presented.” How data is presented or
“pitched” can affect how we perceive it. This is one method of neuro-linguistic
programming. It is also often used in behavioral economics, and it is ubiquitous
in sales.
The anchoring bias is “the tendency to rely too heavily on
a past reference or on one piece of information when making decisions.” I see
this in politics, among environmental activists, and among those opposed to
environmental activists. They might overly rely on one particular study, or
someone might reuse, over and over, a technique that once worked well for them in
the past, even though it doesn’t work so well now.
The availability heuristic refers to “the tendency to
assign probabilities of potential outcomes, based on examples that are
immediately available to us.” This is especially true of emotionally-charged
situations. He gives the example that we especially notice every red light when
we are late for an appointment. This is also a factor in how we tend to assess
risk. If some disaster or epidemic happened recently, even though it is
statistically rare, we will tend to see it as riskier than it really is.
Related to the availability heuristic is the representative
bias, which was described by psychologists Amos Tversky and Daniel Kahneman as
follows: “An event is judged probable to the extent that it represents the
essential features of its parent population or generating process.” People tend
to use shortcuts when they need to decide on something and those shortcuts
often employ biases. We might throw out candidates for a job for biased reasons, just to lighten the load.
Inattentional blindness has more to do with our sensory
perception and its sometimes automatic nature. Psychologists define it
as “the tendency to miss something obvious and general while attending to
something special and specific.” The classic experiment here has a guy in a
gorilla suit walking through while the subjects are told to count the number of
basketball passes by a team in black shirts and another in white shirts. A
1-minute video is shown and the gorilla walks in at 30 seconds, thumps his chest, and walks out. Consistently (and amazingly), 50% of the subjects do not report
seeing a gorilla-suited guy!
Shermer gives a long list of other cognitive biases, including a bias to trust authority, to jump on bandwagons, to believe what
seems believable, to over-rely on expectations, to conflate cause with
correlation, to overvalue initial events, to overvalue events that are recent,
and the ubiquitous overgeneralization known as stereotyping.
He also mentions the bias blind spot. This is “the tendency
to recognize the power of cognitive biases in other people but to be blind to
their influence upon our own beliefs.”
Shermer describes science as “the ultimate
bias-detection machine.” Mechanisms such as double-blind controls in
experiments are designed to weed out bias. The peer-review process is another
bias-reduction technique. Skepticism and the ability to falsify are given
importance in the scientific process. Scientists must defend their conclusions
to the satisfaction of other scientists.
Science is our best means of separating meaningful patterns
from meaningless ones. Shermer uses the model of exploration of new lands to
explore the psychology of science here. Prevailing paradigms shape our
perceptions. Explorers of the past used the prevailing paradigms of the time to
describe their new discoveries. The set of beliefs about reality that make up
science has changed as new discoveries have been made. Paradigms have shifted
and will likely continue to shift. As Galileo found out, paradigms can be slow
to shift when belief systems are entrenched. The shift from Aristotelian logic
and deduction to Francis Bacon’s ‘observational method’ of induction took time, but rewarded us with a less entrenched societal belief system. This is akin to
what I call ‘cultivating the shiftable paradigm,’ or simply reminding oneself
that what is or seems true today may be refuted at any time with better and
more detailed experimental evidence. Of course, pure empiricism is not always
perfect. We may be tricked by our eyes, even when great instruments are
employed. If some new structure or function is revealed to us we might not
recognize or value it if it doesn’t fit into our current paradigm based on
empirical observation. Shermer notes Galileo’s mistaking of the rings of Saturn
for three stars as an example. At the time, there was no available concept of
planetary rings, and the resolution of telescopes of the day was not enough
to make out the rings at Saturn’s distance. Thus, even direct observation combined
with the limits of the current paradigm can fool us. He notes the triad
data-theory-presentation as most important in combination for complete
scientific understanding. Another way of expressing the triad is
induction-deduction-communication, or what we see, what we think, and what we
say. There is an interplay of the three. Stephen Jay Gould called this the
‘power and the poverty of pure empiricism.’ Observation can trick us into
seeing something that is not really there but is partly a product of previous paradigms.
This is perhaps most true of the vast and the tiny, both of which are beyond
our sensory ranges.
A history of astronomy and cosmology is perhaps a reminder
that observations can change with new observing techniques (such as
spectroscopy). Shermer gives a short history here, which was unexpected and
perhaps a bit of a digression, but it is relevant to the philosophical aspects
of astronomy. He notes Arthur C. Clarke's first law: “When a distinguished
elderly scientist states that something is possible, he is almost certainly
right. When he states that something is impossible, he is very probably wrong.”
The history of science is one in which established theories have been upended
to the astonishment of scientists and with much resistance. Orthodoxy can
permeate science, and this has been called “scientism.” However, the word
scientism has also often been used as a charge, often a lame charge, by
believers in the supernatural and less credible ideas to discredit mainstream
science. Time tends to resolve debates in science as new discoveries are made.
Mystery, paradox, and the inadequacy of language and the
nature of meaning itself have kept us from uncovering the deeper picture of the
nature of reality. Will it always be so? We don’t know, but some mysteries have
yielded little to no ground. Astronomy has meshed with philosophy to give us
further curious angles to explore. There is what is called the fine-tuning
problem – why our universe seems so finely-tuned toward certain conclusions, including existence itself. The Big Bang is said to be “sensitive” to the ‘six
cosmic numbers.’ (There are more, but these are considered the most important.)
These are: 1) the amount of matter in the universe, 2) how firmly atomic nuclei
bind together, 3) the number of dimensions in which we live, 4) the ratio of
the strength of electromagnetism to that of gravity, 5) the fabric of the
universe, and 6) the cosmological constant, or “antigravity” force. This
fine-tuning has been dubbed the anthropic principle. There is a
counter-principle known as the Copernican principle that concludes that we are
not special. The universe’s apparent fine-tuning has given much energy to the
intelligent design advocates, including biblical creationists. Shermer and many
others argue that there are many alternatives to the anthropic principle,
that is, to the notion that the universe was designed especially for us. Carl Sagan
mentioned “carbon chauvinism,” the belief that life cannot be based on anything
but carbon, and Shermer takes that a step further to call it “cosmic
chauvinism,” the idea that the universe is not fine-tuned for us but rather
that we are fine-tuned for it. There is still much not understood about
relativity and quantum mechanics, and the cosmic numbers may not be as constant
as thought. They may also be all related in some other fashion. Shermer delves
deeper here and mentions six types of theories of a multiverse, of which our
universe may be but one component. Stephen Hawking rejected any kind of
intelligent design notion based on the anthropic principle. He and Leonard
Mlodinow presented their ideas about this in the 2010 book The Grand Design
(also reviewed in this blog). Their idea is called model-dependent realism.
They stated that it is only useful to ask if a model agrees with observation.
If it does, we may use it to describe reality. They model the universe with an
extension of string theory called M-theory with eleven dimensions. If the
universe is somehow determined to be finite, then M-theory would say the
universe created itself. These ideas are conjectural, perhaps as conjectural as
God. One may posit God, but there is little reason to believe, especially in the
case of the Gods of our typical religions.
In the epilogue, he states simply that skepticism is
science. Science employs the null hypothesis: a hypothesis under investigation is
assumed to be not true until the evidence shows otherwise. The burden of proof is
always on the one claiming the hypothesis is true, not on the skeptics to disprove it. This is
important to realize when dealing with the supernatural, which has a history of
claiming something is true simply because it can’t be conclusively disproven.
Of course, in the final analysis, many of the things we regard as true may not
really be so, so much may be regarded as provisional truth in contrast to
definitive truth. The opposite argument, from a perspective of negative
evidence, might be something like – if you can’t prove it wasn’t God, spirits,
UFOs, Jews, Rothschilds, Masons, etc., etc., then it must have been them. That is
a ridiculous argument.
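As a toy illustration of the null hypothesis and the burden of proof (my own example, not Shermer's): suppose someone claims to guess the suit of hidden cards better than chance. The null hypothesis is that they are merely guessing at a rate of one in four, and the data must make that assumption untenable before the claim deserves belief. A minimal Python sketch:

from math import comb

def p_value_at_least(successes, trials, p_chance=0.25):
    """One-sided binomial test: the probability of doing at least this well
    if the null hypothesis (pure guessing at rate p_chance) is true."""
    return sum(comb(trials, k) * p_chance**k * (1 - p_chance)**(trials - k)
               for k in range(successes, trials + 1))

# 30 correct out of 100 cards: easily consistent with guessing (p is large).
print(p_value_at_least(30, 100))
# 40 correct out of 100 cards: very unlikely under pure guessing (p well below 0.01),
# so the null hypothesis of mere guessing starts to look untenable.
print(p_value_at_least(40, 100))

The skeptic's job is simply to keep the null hypothesis in place until the data force it out; it is not the skeptic's job to prove the psychic cannot do it.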
Science might also proceed toward a ‘convergence of
evidence.’ Here, lines of inquiry from different inferential sciences converge
to form the current scientific paradigm around a subject. This is typical of
sciences that rely less on laboratory evidence and direct experimentation. It
is called the convergence method. Geology, archaeology, and cosmology are
examples of fields whose conclusions rest on a convergence of evidence from many other sciences. History
can often be tested through the ‘comparative method’, which was exemplified by
Jared Diamond in his book Guns, Germs, and Steel (also reviewed in this blog).
By comparing the resources available to ancient peoples in different parts of
the world and their geographical boundaries and constraints, he realized that
the variance of those resources and geographies accounted for much of the
lopsided development. Both convergence and comparative methods are employed by
paleontologists in testing hypotheses about evolution. The principle of
positive evidence states that “you must have positive evidence in favor of your
theory and not just negative evidence against rival theories.” Thus, bunk
creationist arguments are only against evolution as they have zero positive
evidence of creationism. Shermer says man as Homo rationalis probably never
existed, as we are never really purely rational but are always affected by
emotion, pain, and the difficulty of life.
This is an excellent book – highly recommended. This is one
reason I wanted to do a detailed review. I hope to read a few more of Shermer’s
books as well. He also does short video segments on Big Think.