By Vexen Crabtree 2018
Skepticism is an approach to understanding the world in which anecdotal stories and personal experience are not automatically accepted as reliable indicators of truth. There are a great many ways in which we can come to faulty conclusions - often based on faulty perceptions - given our imperfect knowledge of the world around us1. With practice, we can all become better thinkers by developing our critical-thinking and skeptical skills2. Facts must be checked, especially if they contradict established and stable theories that do have evidence behind them3,4. Thinking carefully, slowly and methodically is a proven method of coming to more sensible conclusions5,6,7. Skepticism demands that 'truth' is taken seriously, approached carefully, and investigated thoroughly.
“We all suffer from systematic thinking errors8,9 which fall into three main types: (1) internal cognitive errors; (2) errors of emotion10, perception and memory; and (3) social errors that result from the way we communicate ideas and the effects of traditions and dogmas. Some of the most common errors are the misperception of random events as evidence that backs up our beliefs, the habitual overlooking of contradictory data, our expectations and current beliefs actively changing our memories and perceptions, and the use of assumptions to fill in unknown information. These occur naturally and subconsciously even when we are trying to be truthful and honest. Many of these errors arise because our brains are highly efficient (rather than accurate) and we are applying evolutionarily developed cognitive rules of thumb to the complexities of life6,11. We will fly into defensive and heated arguments at the mere suggestion that our memory is faulty, and yet memory is infamously unreliable and prone to subconscious inventions. They say "few things are more dangerous to critical thinking than to take perception and memory at face value"12. We were never meant to be the cool, rational and logical computers that we pretend to be. Unfortunately, and we find it hard to admit this to ourselves, many of our beliefs are held because they're comforting or simple13. In an overwhelming world, simplicity lets us get a grip. Human thinking errors can lead individuals, or whole communities, to explain types of events and experiences in fantastical ways. Before we can guard comprehensively against such cumulative errors, we need to learn the ways in which our brains can misguide us - lack of knowledge of these sources of confusion leads many astray14.”
Prof. de Bono produced a book in 1985 that endeavoured to offer some training in how to think in a structured and methodical way15. He named the critical aspect 'black hat' thinking. He found that even with a little training, people's abilities in brainstorming and analysis improve manifold. We naturally want to come to wise conclusions, if only we would let ourselves be a little more thoughtful and methodical about it. Likewise, in "Thinking, Fast and Slow" by Daniel Kahneman (2011), the author explains how slow, deliberate and organized thinking on a subject is one of the best ways to overcome the errors introduced by instinctual fast thinking - which has given us great evolutionary advantage but, of course, introduces many errors into how we perceive and rationalize things. Finally, psychologist Jonah Lehrer, in The Decisive Moment: How the Brain Makes Up Its Mind, teaches that we must encourage ourselves to face "inner dissonance" and think about the information we don't want to think about, and "pay attention to the data that disturbs our entrenched beliefs"7.
The scientist and public educator Carl Sagan explains that although many people believe daft things, it is not a lack of intelligence that should be blamed, but a lack of knowledge of skeptical thinking techniques:
“Both Barnum and H.L. Mencken are said to have made the depressing observation that no one ever lost money by underestimating the intelligence of the American public. The remark has worldwide application. But the lack is not intelligence, which is in plentiful supply; rather, the scarce commodity is systematic training in critical thinking.”
Carl Sagan (1979)16
Although we know a sensible mechanism for overcoming individual cognitive bias - science - a new problem emerges: we do not have time to subject every one of our beliefs to careful skeptical analysis. It is practically impossible. Our instinct is to analyse only new information critically; re-examining opinions we already hold is very hard. Also, because knowledge is highly specialized, we can simply never know enough to skeptically analyze everything that we should.
“Eventually one must do everything oneself in order to know something; which means that one has much to do!”
An article in Skeptical Inquirer bemoans these facts:
“We all have limitations and built-in biases that hinder our ability to apply the methods of skepticism objectively and consistently. Nonskeptics and professed skeptics alike are equally vulnerable to developing beliefs that have not been subjected to rigorous skeptical inquiry. [...] This is because (a) we do not have time to evaluate every claim that becomes part of our belief system and may rely upon what is commonly believed or what we would like to be true; (b) we are more likely to perform a skeptical evaluation for claims that are inconsistent with our current belief systems [...] while simply accepting claims consistent with our beliefs [...]; (c) many beliefs are already formed and reinforced prior to learning how to think skeptically; (d) some beliefs are formed based primarily upon an emotional evaluation; and (e) skeptics have limited areas of expertise (e.g., a biologist may know little about economics), which restricts our ability to skeptically evaluate all potential claims because knowledge is extremely specialized.”
To overcome these personal difficulties, we need to reach rational conclusions in groups. So, scientific institutions that specialize in specific subjects are trusted in their pronouncements on those subjects, and authority is granted to subject-matter experts who have proven qualifications in the correct fields of knowledge. They do the hard work of conducting research and submitting findings for peer review. This careful and cautious process by which we, in groups, formulate theories about the world and put them to the test, in order to minimize human error, is the core of what we call science; formulated technically, it is the scientific method.
But this itself causes another problem. Multiple institutions and organisations all clamour for space in trying to sell their truths, and many people claim to be trustworthy. Given that scientific language can also be hard to comprehend, many rely on news services and mass media products for their information. We end up in a situation where the main obstacle to the discovery of truth is the politics and sociology of the passage of accurate information, in a globalised world full to the brim with broadcasters and public conversations. How do we know who to trust?
The extent to which we can trust a body of experts who make pronouncements on any topic is tied directly to the extent to which they follow the scientific method, and are therefore provably making every attempt to reduce errors. Likewise, some institutions have long histories of skepticism and cautiousness; we should learn to seek out those bodies within our own countries and internationally, and support them.
Skepticism does not include disbelief in theories that are supported by good scientific consensus4 (that's denialism). Supporting ideas that are rejected by a large portion of scientists and academics (by doubting all of them) is almost always unwise unless there is particularly strong evidence, and a good argument (a very good one!) to explain why they are all wrong - else you become a maverick or a pseudoscientist (depending on the reasons for your wild theories!).4 Likewise, cynicism is a case of skepticism taken too far with regards to human motivations - a more balanced skeptic shouldn't conclude that a person's motives are selfish without evidence of it first. One final case of skepticism taken too far is found in the stereotype of the grumpy, moody, anti-social skeptic who has no fun: there is no reason at all for skeptics to be impolite, rude or intolerant of others' beliefs.
That final example, about some skeptics' negative approach to others, was taken up as a theme of talks at The Amazing Meeting 8 (a skeptics' convention). Here are some comments:
“TAM8 acknowledged the importance of outreach and grassroots activism, but it also recognized the need to improve the public image of skepticism. The message was that we can increase our effectiveness and increase our numbers by simply being... nicer.
We all know the stereotypes of skeptics. There is the boring, bearded, bespectacled old man. There is the cynical and cantankerous curmudgeon. There is the self-righteous, smug, superior know-it-all. [...] We should worry about approaches and attitudes. [...]
Phil Plait also tackled these themes. [He] voiced his concerns over the public perception of skeptics as antisocial, egotistical, and abrasive. [...] Plait's plea was that we should avoid undue attacks and insults because we're most persuasive when we're respectful and rational.
Of course, there's a time to be confrontational, but there's also a time to be considerate.”
Skeptical Inquirer. Magazine. Published by Committee for Skeptical Inquiry, NY, USA. Pro-science magazine published bimonthly.
de Bono, Prof E. (1985) Six Thinking Hats. Paperback book. 2000 edition based on revised First Back Bay 1999 edition. Published by Penguin Books Ltd, The Strand, London, UK.
Carroll, Robert Todd (1945-2016). Taught philosophy at Sacramento City College from 1977 until retirement in 2007. Created The Skeptic's Dictionary in 1994.
(2011) Unnatural Acts: Critical Thinking, Skepticism, and Science Exposed!. E-book. Amazon Kindle digital edition. Published by the James Randi Educational Foundation.
(1991) How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. Paperback book. 1993 edition. Published by The Free Press, NY, USA.
(2009) The Decisive Moment: How the Brain Makes Up Its Mind. Hardback book. Published by Canongate Books, Edinburgh.
(1999) Social Psychology. Paperback book. 6th ('international') edition. Originally published 1983. Current version published by McGraw Hill.