The challenge is recognizing it in ourselves.
But the term confirmation bias actually includes several elements that are useful and important.
A relevant article in Psychology Today pointed out that confirmation bias "can refer to the following tendencies:
• Search: to search only for confirming evidence (Wason’s original definition)
• Preference: to prefer evidence that supports our beliefs
• Recall: to best remember information in keeping with our beliefs
• Interpretation: to interpret evidence in a way that supports our beliefs
• Framing: to use mistaken beliefs to misunderstand what is happening in a situation
• Testing: to ignore opportunities to test our beliefs
• Discarding: to explain away data that don’t fit with our beliefs"
What we really need to watch out for is fixation, a term that encompasses the last three tendencies.
The topic is explored very effectively by Dr. Gary Klein.
The concept of fixation is that we get stuck on an initial explanation. Often, that initial explanation will be accurate, but when it is wrong, with hindsight we can see that we held on to it too long.
In my view, M2C persists because of fixation; M2C intellectuals got stuck on an initial explanation (M2C) and, with hindsight, we can see that we held onto it for too long.
Of course, the M2C intellectuals reject the teachings of the prophets about the New York Cumorah on the same rationale.
That's why it's up to each individual to make an informed decision.
And that's why the ongoing censorship by Book of Mormon Central is so pernicious.
But fixation errors aren’t just holding onto our initial explanation too long—fixation gets compounded when we dismiss any anomalous evidence that runs counter to our original diagnosis instead of taking these anomalies into account and revising our beliefs. DeKeyser and Woods (1990) speculated about some ways that fixation works, and Feltovich et al. (2001) called these tactics "knowledge shields" that we use to deflect contrary data.
These knowledge shields are pervasive in the literature of the M2C citation cartel.
Chinn & Brewer (1993) listed six basic ways that knowledge shields can operate, ways that we can react to anomalous data that are inconsistent with our beliefs:
(i) we can ignore the data;
(ii) we can reject the data by finding some flaw or weakness in the way the data were collected or analyzed or even speculate that the data reflected a random occurrence;
(iii) we can decide that the data don’t really apply to the phenomenon of interest;
(iv) we can set the data aside for the present in the expectation that future developments will show why the anomaly is not really a problem;
(v) we can find a way to interpret the data that allows us to preserve our beliefs;
(vi) we can make cosmetic changes to our beliefs and fool ourselves into thinking that we have taken the data into account.
Chinn and Brewer found that college students displayed each of these tactics, and so did established scientists. Chinn and Brewer also listed a seventh type of reaction—we can accept the data and change or discard our initial beliefs.
This seventh reaction is the one I eventually used to discard my belief in M2C. I'll explain in upcoming posts.