The Human Truth Foundation

Selection Bias and Confirmation Bias

By Vexen Crabtree 2014


#confirmation_bias #selection_bias #social_psychology #thinking_errors


1. Poor Sampling Techniques

It is the peculiar and perpetual error of the human understanding to be more moved and excited by affirmatives than negatives.

Francis Bacon (1620)1

Selection Bias is the result of poor sampling techniques2 whereby we use partial and skewed data in order to back up our beliefs. It includes confirmation bias, which is the unfortunate way in which we humans tend to notice evidence that supports our ideas while ignoring evidence that contradicts it3,4. We are all pretty hard-working when it comes to finding arguments that support us, and when we find opposing views, we work hard to discredit them. But we don't work hard to discredit those who agree with us. In other words, we work at undermining sources of information that sit uncomfortably with what we believe, and we idly accept those that agree with us.

If we believe a correlation exists, we notice and remember confirming instances. If we believe that premonitions correlate with events, we notice and remember the joint occurrence of the premonition and the event's later occurrence. We seldom notice or remember all the times unusual events do not coincide. If, after we think about a friend, the friend calls us, we notice and remember the coincidence. We don't notice all the times we think of a friend without any ensuing call, or receive a call from a friend about whom we've not been thinking.

People see not only what they expect, but correlations they want to see. This intense human desire to find order, even in random events, leads us to seek reasons for unusual happenings or mystifying mood fluctuations. By attributing events to a cause, we order our worlds and make things seem more predictable and controllable. Again, this tendency is normally adaptive but occasionally leads us astray.

"Social Psychology" by David Myers (1999)3

Our brains are good at jumping to conclusions, and it is often hard to resist the urge. We often feel clever, even while deluding ourselves! If a bus is late twice in a row on Tuesdays whilst we are standing there waiting for it, we try to work out the cause. We would do much better to accurately note how many other days it is late, and how many times it isn't late on a Tuesday. But rare is the person who engages methodically in such trivial investigations. We normally just go with the flow, and think we have arrived at sensible conclusions based on the data we happen to have observed in our own little bubble of life.
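
To make the bus example concrete, here is a minimal worked sketch in Python - the figures are invented purely for illustration - of the tally recommended above: count late and on-time days both for Tuesdays and for all other days before drawing any conclusion.

    # Illustrative tally only - every number here is invented.
    late_on_tuesdays      = 2    # the two delays we happened to notice
    on_time_on_tuesdays   = 6
    late_on_other_days    = 11
    on_time_on_other_days = 33

    tuesday_late_rate = late_on_tuesdays / (late_on_tuesdays + on_time_on_tuesdays)
    other_late_rate = late_on_other_days / (late_on_other_days + on_time_on_other_days)

    print(f"Late rate on Tuesdays:   {tuesday_late_rate:.0%}")   # 25%
    print(f"Late rate on other days: {other_late_rate:.0%}")     # 25%

    # The two memorable Tuesday delays suggested a pattern, but the full
    # tally shows the bus is no more likely to be late on a Tuesday.

Only when all four counts are on the table can the apparent Tuesday pattern be tested at all; the noticed instances alone say nothing.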

All of this is predictable enough for an individual, but another form of Selection Bias occurs in mass media publications too. Cheap tabloid newspapers publish every report that shows foreigners in a bad light, or shows that crime is bad, or that something-or-other is eroding proper morality. And they mostly avoid any reports that say the opposite, because such reassuring results do not sell as well. Newspapers as a whole simply never report anything boring - therefore they constantly support world-views that are divorced from everyday reality. Sometimes commercial interests skew the evidence that we are exposed to - drugs companies conduct many pseudo-scientific trials of their products, but their choice of what to publish is manipulative - studies in 2001 and 2002 showed that "those with positive outcomes were nearly five times as likely to be published as those that were negative"5. Interested companies just have to keep paying for studies to be done, wait for a misleading positive result, and then publish it and make it the basis of an advertising campaign. The public have few resources to overcome this kind of orchestrated Selection Bias.
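
The publication mechanism is easy to simulate. The Python sketch below is a hedged illustration only - the trial counts, effect sizes and publication threshold are assumptions, not figures from the cited studies. A product with no real effect is trialled repeatedly, and only the results that happen to look good are released.

    import random

    random.seed(1)
    N_TRIALS, N_PATIENTS = 40, 50
    TRUE_IMPROVEMENT_RATE = 0.5            # the product is no better than chance

    published = []
    for _ in range(N_TRIALS):
        improved = sum(random.random() < TRUE_IMPROVEMENT_RATE
                       for _ in range(N_PATIENTS))
        rate = improved / N_PATIENTS
        if rate > 0.55:                    # only flattering results are released
            published.append(rate)

    print(f"Trials run:       {N_TRIALS}")
    print(f"Trials published: {len(published)}")
    if published:
        print(f"Average improvement rate in the published record: "
              f"{sum(published) / len(published):.0%}")

    # Every trial sampled the same no-effect process, yet the published
    # record alone describes a product that appears to work.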

RationalWiki mentions that "a website devoted to preventing harassment of women unsurprisingly concluded that nearly all women were victims of harassment at some point"6. Another example is a statistic most famous for being wrong: that one in ten males is homosexual. This was based on a poll of the community surrounding a gay-friendly publication - the sample of respondents was skewed away from the true average, hence misleading data were observed.
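
The effect of a skewed sampling frame is easy to quantify. In the hypothetical Python sketch below (all proportions are invented for illustration), a trait that is common inside a special-interest community but rare outside it is estimated by polling through that community's own publication.

    # All proportions below are invented, purely to illustrate a skewed frame.
    population_share = {"in_community": 0.02, "outside_community": 0.98}
    trait_rate       = {"in_community": 0.60, "outside_community": 0.03}

    # The true population-wide rate weights each group by its real share.
    true_rate = sum(population_share[g] * trait_rate[g] for g in trait_rate)

    # A poll run through the community's own publication draws almost all of
    # its respondents from inside that community.
    poll_share = {"in_community": 0.95, "outside_community": 0.05}
    polled_rate = sum(poll_share[g] * trait_rate[g] for g in trait_rate)

    print(f"True population rate:  {true_rate:.1%}")    # roughly 4%
    print(f"Rate the poll reports: {polled_rate:.1%}")  # well over 50%

The poll faithfully measures its own respondents; the error lies in quoting that figure as if it described the whole population.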

The only solution to these problems is proper and balanced statistical analysis after actively and methodically gathering data, alongside raising awareness of Selection Bias, Confirmation Bias and other thinking errors. This level of critical thinking is quite a rare endeavour in anyone's personal life - we don't have the time, the inclination, the confidence or the skill to properly evaluate subjective data. Unfortunately, even basic fact-checking is rare in mass media publications. Hence personal opinions (including your own) and news outlets ought to be given little trust.

2. Case Examples: The Evidence for Supernatural Powers

Take a very simple example provided by skeptical thinker Robert Todd Carroll in "Unnatural Acts: Critical Thinking, Skepticism, and Science Exposed!" (2011)7: "If two psychics pick opposite winners in an athletic contest, one of them may appear to have more knowledge than the other, but the appearance is an illusion". If both published their predictions in a newspaper a week before the event, you can guarantee that the paper which had (by luck) hosted the correct prediction is more likely to run a second article announcing that the psychic was correct. The other paper is very unlikely to run an article saying that its psychic was wrong: it will simply ignore it and move on to more interesting topics. The poor public, therefore, only ever read of success stories, and are thereby misled into thinking that evidence exists for psychic fortune-telling, when in fact it is pure luck.

Let us imagine that one hundred professors of psychology throughout the country read of Rhine's work and decide to test a [human subject for signs of ESP capability]. The fifty who fail to find ESP in their first preliminary test are likely to be discouraged and quit, but the other fifty will be encouraged to continue. Of this fifty, more will stop work after the second test, while some will continue because they obtained good results. Eventually, one experimenter remains whose subject has made high scores for six or seven successive sessions. Neither experimenter nor subject is aware of the other ninety-nine projects, and so both have a strong delusion that ESP is operating. The odds are, in fact, much against the run. But in the total (and unknown) context, the run is quite probable. (The odds against winning the Irish sweepstakes are even higher. But someone does win it.) So the experimenter writes an enthusiastic paper, sends it to Rhine who publishes it in his magazine, and the readers are greatly impressed.

At this point one may ask, "Would not this experimenter be disappointed if he continues testing his subject?" The answer is yes, but as Rhine tells us, subjects almost always show a marked decline in ability after their initial successes.

"Fads & Fallacies in the Name of Science" by Martin Gardner (1957)8

3. Other Sources of Thinking Errors

Current edition: 2014 Nov 10
http://www.humantruth.info/selection_bias.html


References


The Economist. Published by The Economist Group, Ltd. A weekly newspaper in magazine format, famed for its accuracy, wide scope and intelligent content. See vexen.co.uk/references.html#Economist for some commentary on this source.

Carroll, Robert Todd. (1945-2016). Taught philosophy at Sacramento City College from 1977 until retirement in 2007. Created The Skeptic's Dictionary in 1994.
(2011) Unnatural Acts: Critical Thinking, Skepticism, and Science Exposed!. E-book. Amazon Kindle digital edition. Published by the James Randi Educational Foundation.

Coolican, Hugh
(2004) Research Methods and Statistics in Psychology. 4th edition. Published by Hodder Headline, London, UK.

Gardner, Martin. Died 2010 May 22 aged 95.
(1957) Fads & Fallacies in the Name of Science. Paperback book. Originally published 1952 by G. P. Putnam's Sons as "In the Name of Science". Current version published by Dover Publications, Inc., New York, USA.

Gilovich, Thomas
(1991) How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. Paperback book. 1993 edition. Published by The Free Press, NY, USA.

Lehrer, Jonah
(2009) The Decisive Moment: How the Brain Makes Up Its Mind. Hardback book. Published by Canongate Books, Edinburgh.

Myers, David
(1999) Social Psychology. Paperback book. 6th ('international') edition. Originally published 1983. Current version published by McGraw Hill.

Footnotes

  1. Gilovich (1991) p32 cites Francis Bacon (1620) The new organon and related writings (1960). Published by Liberal Arts Press, New York, USA.
  2. Coolican (2004) p51.
  3. Myers (1999) p114.
  4. Lehrer (2009).
  5. The Economist (2008 Nov 29) article "Pharmaceuticals: Absence of evidence" discusses this (p94). It states: "A study published this week in PLoS Medicine, an online journal, confirms [that] drugs companies try to spin the results of clinical trials. [... There is] troubling evidence of suppression and manipulation of data in studies published in (or often withheld from) peer-reviewed medical journals". And the full text on the 2001 and 2002 study states: "The results are distressing. Only three-quarters of the original trials were ever published, and it turned out that those with positive outcomes were nearly five times as likely to be published as those that were negative. Earlier investigations have shown that the explanation is not editorial bias; well-designed studies in which drugs fail have as good a chance of being published in leading journals as those in which drugs succeed". Of the 155 sets of data with published results, 17 "appeared in publications without having first been discussed in regulatory filings. Fifteen of these 17 made the drugs in question look better".
  6. RationalWiki.org article "Selection bias" accessed 2014 Nov 10.
  7. Carroll (2011) p200.
  8. Gardner (1957) p303.

©2017. All rights reserved.