Motivated reasoning drives most of our beliefs, according to this article:
https://www.apa.org/monitor/2017/05/alternative-facts
"Motivated reasoning is a pervasive tendency of human cognition," says Peter Ditto, PhD, a social psychologist at the University of California, Irvine, who studies how motivation, emotion and intuition influence judgment. "People are capable of being thoughtful and rational, but our wishes, hopes, fears and motivations often tip the scales to make us more likely to accept something as true if it supports what we want to believe."
Motivated reasoning and the "expertise paradox" account for much of what we see happening in terms of M2C and the peep stone-in-a-hat theory.
The article explains this quite well.
The more you know
People often dismiss those who hold opposing views as idiots (or worse). Yet highly educated people are just as likely to make biased judgments—and they might actually do it more often.
In one example of this "expertise paradox," Kahan and colleagues asked volunteers to analyze a small data set. First, they showed data that purportedly demonstrated the effectiveness of a cream for treating skin rash. Unsurprisingly, people who had a greater ability to use quantitative information did better at analyzing the data.
But there was a twist. When participants saw the very same numbers but were told they came from a study of a gun-control ban, their political views affected how accurately they interpreted the results. And those who were more quantitatively skilled actually showed the most polarized responses. In other words, expertise magnified the tendency to engage in politically motivated reasoning (Behavioural Public Policy, in press).
"As people become more proficient in critical reasoning, they become more vehement about the alignment of the facts with their group's position," Kahan says.
The pattern holds up outside the lab as well. In a national survey, Kahan and colleagues found that, overall, people who were more scientifically literate were slightly less likely to see climate change as a serious threat. And the more they knew, the more polarized they were: as science literacy and quantitative skills increased, conservatives became more dismissive of climate change evidence, and liberals became more concerned about it (Nature Climate Change, 2012).
"It's almost as though the sophisticated approach to science gives people more tools to curate their own sense of reality," says Matthew Hornsey, PhD, a professor of psychology at the University of Queensland who studies the processes that influence people to accept or reject scientific messages.