How the disconnect between information and insight explains our dangerous self-righteousness.
“Allow yourself the uncomfortable luxury of changing your mind,” I wrote in reflecting on the 7 most important things I learned in 7 years of Brain Pickings. It’s a conundrum most of us grapple with — on the one hand, the awareness that personal growth means transcending our smaller selves as we reach for a more dimensional, intelligent, and enlightened understanding of the world, and on the other, the excruciating growing pains of revising or completely abandoning our former, inferior beliefs as we integrate new knowledge and insight into our comprehension of how life works. That discomfort, in fact, can be so intolerable that we often go to great lengths to disguise or deny our changing beliefs, paying less attention to information that contradicts our present convictions and more to that which confirms them. In other words, we fail the fifth tenet of Carl Sagan’s timelessly brilliant and necessary Baloney Detection Kit for critical thinking: “Try not to get overly attached to a hypothesis just because it’s yours.”
That humbling human tendency is known as the backfire effect and is among the seventeen psychological phenomena David McRaney explores in You Are Now Less Dumb: How to Conquer Mob Mentality, How to Buy Happiness, and All the Other Ways to Outsmart Yourself (public library) — a fascinating and pleasantly uncomfortable-making look at why “self-delusion is as much a part of the human condition as fingers and toes,” and the follow-up to McRaney’s You Are Not So Smart, one of the best psychology books of 2011. McRaney writes of this cognitive bug:
Once something is added to your collection of beliefs, you protect it from harm. You do this instinctively and unconsciously when confronted with attitude-inconsistent information. Just as confirmation bias shields you when you actively seek information, the backfire effect defends you when the information seeks you, when it blindsides you. Coming or going, you stick to your beliefs instead of questioning them. When someone tries to correct you, tries to dilute your misconceptions, it backfires and strengthens those misconceptions instead. Over time, the backfire effect makes you less skeptical of those things that allow you to continue seeing your beliefs and attitudes as true and proper.
But what makes this especially worrisome is that in the process of exerting effort to deal with the cognitive dissonance produced by conflicting evidence, we actually end up building new memories and new neural connections that further strengthen our original convictions. This helps explain such gobsmacking statistics as the fact that, despite towering evidence proving otherwise, 40% of Americans don’t believe the world is more than 6,000 years old. The backfire effect, McRaney points out, is also the lifeblood of conspiracy theories. He cites the famous neurologist and conspiracy-debunker Steven Novella, who argues that believers see contradictory evidence as part of the conspiracy and dismiss the lack of confirming evidence as part of the cover-up, thus only digging their heels deeper into their position the more counter-evidence they’re presented with.
For the rest of the story: http://www.brainpickings.org/index.php/2014/05/13/backfire-effect-mcraney/