
“The prevalence of a spirit of contention amongst a people is a certain sign of deadness with respect to the things of religion. When men's spirits are hot with contention, they are cold to religion.” - Jonathan Edwards

“The Book of Mormon does not supplant the Bible. It expands, extends, clarifies, and amplifies our knowledge of the Savior. Surely, this second witness should be cause for great rejoicing by all Christians.” - Joseph B. Wirthlin

Saturday, August 24, 2019

Why we believe alternative facts

Motivated reasoning drives most of our beliefs, according to this article:

https://www.apa.org/monitor/2017/05/alternative-facts

"Motivated reasoning is a pervasive tendency of human cognition," says Peter Ditto, PhD, a social psychologist at the University of California, Irvine, who studies how motivation, emotion and intuition influence judgment. "People are capable of being thoughtful and rational, but our wishes, hopes, fears and motivations often tip the scales to make us more likely to accept something as true if it supports what we want to believe."

Motivated reasoning and the "expertise paradox" account for much of what we see happening with M2C and the peep stone-in-a-hat theory.

The article explains this quite well.

The more you know

People often dismiss those who hold opposing views as idiots (or worse). Yet highly educated people are just as likely to make biased judgments—and they might actually do it more often.
In one example of this "expertise paradox," Kahan and colleagues asked volunteers to analyze a small data set. First, they showed data that purportedly demonstrated the effectiveness of a cream for treating skin rash. Unsurprisingly, people who had a greater ability to use quantitative information did better at analyzing the data.
But there was a twist. When participants saw the very same numbers, but were told they came from a study of a gun-control ban, their political views affected how accurately they interpreted the results. And those who were more quantitatively skilled actually showed the most polarized responses. In other words, expertise magnified the tendency to engage in politically motivated reasoning (Behavioural Public Policy, in press). 
"As people become more proficient in critical reasoning, they become more vehement about the alignment of the facts with their group's position," Kahan says.
The pattern holds up outside the lab as well. In a national survey, Kahan and colleagues found that overall, people who were more scientifically literate were slightly less likely to see climate change as a serious threat. And the more they knew, the more polarized they were: Conservatives became more dismissive of climate change evidence, and liberals became more concerned about the evidence, as science literacy and quantitative skills increased (Nature Climate Change, 2012).
"It's almost as though the sophisticated approach to science gives people more tools to curate their own sense of reality," says Matthew Hornsey, PhD, a professor of psychology at the University of Queensland who studies the processes that influence people to accept or reject scientific messages.

Monday, August 12, 2019

Why facts don't change our minds

For those who wonder why M2C continues to be taught, consider these two sentences:

We don't always believe things because they are correct. Sometimes we believe things because they make us look good to the people we care about.

There are few more obvious examples than M2C. Employees at Book of Mormon Central, for example, are unusually concerned with what their bosses and mentors think. 

The two lines in that quotation come from a wonderful essay that explains a fascinating aspect of human nature: People like to think their opinions are based on facts, but that is not the case.

The essay is found here:

Here are two fun quotations from the essay:

The economist J.K. Galbraith once wrote, “Faced with a choice between changing one’s mind and proving there is no need to do so, almost everyone gets busy with the proof.”
Leo Tolstoy was even bolder: “The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of doubt, what is laid before him.”

Monday, August 5, 2019

M2C: Fixation, a subset of confirmation bias

I've written a lot about M2C and confirmation bias because it's an easy paradigm for people to understand. People experience confirmation bias all the time.

The challenge is recognizing it in ourselves.

But the term confirmation bias actually includes several distinct elements that are useful and important to understand.

A relevant article in Psychology Today pointed out that confirmation bias "can refer to the following tendencies:

• Search: to search only for confirming evidence (Wason’s original definition)

• Preference: to prefer evidence that supports our beliefs

• Recall: to best remember information in keeping with our beliefs

• Interpretation: to interpret evidence in a way that supports our beliefs

• Framing: to use mistaken beliefs to misunderstand what is happening in a situation

• Testing: to ignore opportunities to test our beliefs

• Discarding: to explain away data that don’t fit with our beliefs"

What we really need to watch out for is fixation, a term that encompasses the last three tendencies.

The topic is explored very effectively by Dr. Gary Klein here:

https://www.psychologytoday.com/us/blog/seeing-what-others-dont/201906/escaping-fixation

The concept of fixation is that we get stuck on an initial explanation. Often that initial explanation is accurate, but when it is wrong, hindsight shows that we held on to it too long.

In my view, M2C persists because of fixation: the M2C intellectuals got stuck on an initial explanation and, with hindsight, we can see that they have held onto it too long.

Of course, the M2C intellectuals use the same rationale to reject the teachings of the prophets about the New York Cumorah.

That's why it's up to each individual to make an informed decision.

And that's why the ongoing censorship by Book of Mormon Central is so pernicious.

But fixation errors aren’t just a matter of holding on to our initial explanation too long; fixation is compounded when we dismiss anomalous evidence that runs counter to our original diagnosis instead of taking those anomalies into account and revising our beliefs. DeKeyser and Woods (1990) speculated about some of the ways fixation works, and Feltovich et al. (2001) called these tactics “knowledge shields” that we use to deflect contrary data.

These six knowledge shields are pervasive in the literature of the M2C citation cartel.

Chinn & Brewer (1993) listed six basic ways that knowledge shields can operate, ways that we can react to anomalous data that are inconsistent with our beliefs: 

(i) we can ignore the data; 

(ii) we can reject the data by finding some flaw or weakness in the way the data were collected or analyzed or even speculate that the data reflected a random occurrence; 

(iii) we can decide that the data don’t really apply to the phenomenon of interest; 

(iv) we can set the data aside for the present in the expectation that future developments will show why the anomaly is not really a problem; 

(v) we can find a way to interpret the data that allows us to preserve our beliefs; 

(vi) we can make cosmetic changes to our beliefs and fool ourselves into thinking that we have taken the data into account. 

Chinn and Brewer found that college students displayed each of these tactics and so did established scientists. Chinn and Brewer also listed a seventh type of reaction—we can accept the data and change or discard our initial beliefs.
_____

This seventh reaction is the one I eventually used to discard my belief in M2C. I'll explain in upcoming posts.