This morning I briefly engaged in a spirited political debate with a friend of mine. He’s a staunch conservative and a great guy, two things I don’t consider mutually exclusive. Me? I’m part old-school conservative, part social liberal. Whatever that means. The conversation started with the state of global markets, but within minutes it devolved into a pointless talking-points standoff, with my friend presenting all sorts of plausible but completely unscientific reasons why we need a Republican president. I suggested we not talk about politics any more, and maybe talk about the new Foster the People album instead.

Why? Because science shows that our brains are wired to reject things that go against what we think we know. In a series of studies, University of Michigan researcher Brendan Nyhan found that when misinformed people, especially those with a partisan stance, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more rigid in their beliefs.

The Boston Globe piece How Facts Backfire explains this in more detail, and the Mother Jones piece The Science of Why We Don’t Believe Science goes WAY in depth, with lots of links to academic studies that also explain why, for instance, people’s beliefs about climate change are so inconsistent.

All of these things are examples of Confirmation Bias. But I’ll never convince you of THAT. And you can’t convince me otherwise.