"If fifty million people say a foolish thing, it is still a foolish thing."
I was reading an article from Science Daily on How We Support Our False Beliefs, and wondering if I'm any more likely to challenge my own "beliefs" than the average person. The first difference I can point to is that I can't think of anything I believe in. The dictionary definition of "believe" is "to have confidence in the truth, the existence, or the reliability of something, although without absolute proof that one is right in doing so." I prefer phrases like "I think," "I know," or "I feel." To me, believing in something means taking a leap of faith, and faith is, by definition, something not based on proof. I don't like either word, and I try to stay away from them. Even so, when I really think about it, I still get caught in the trap of thinking I know something and not wanting to listen to conflicting information. Can I break that cycle?
Here's what the article says:
Co-author Steven Hoffman, Ph.D., visiting assistant professor of sociology at the University at Buffalo, says, "Our data shows substantial support for a cognitive theory known as 'motivated reasoning,' which suggests that rather than search rationally for information that either confirms or disconfirms a particular belief, people actually seek out information that confirms what they already believe. In fact," he says, "for the most part people completely ignore contrary information."
"The argument here is that people get deeply attached to their beliefs," Hoffman says. "We form emotional attachments that get wrapped up in our personal identity and sense of morality, irrespective of the facts of the matter. The problem is that this notion of 'motivated reasoning' has only been supported with experimental results in artificial settings. We decided it was time to see if it held up when you talk to actual voters in their homes, workplaces, restaurants, offices and other deliberative settings."
The study team employed a technique called "challenge interviews" on a sample of voters who reported believing in a link between Saddam and 9/11. The researchers presented the available evidence of the link, along with the evidence that there was no link, and then pushed respondents to justify their opinion on the matter. For all but one respondent, the overwhelming evidence that there was no link had no impact on their arguments in support of the link. "They wanted to believe in the link," Hoffman says, "because it helped them make sense of a current reality. So voters' ability to develop elaborate rationalizations based on faulty information, whether we think that is good or bad for democratic practice, does at least demonstrate an impressive form of creativity."
After reading the article, I think there's a very important lesson here, but it's not just that we should be fighting against these false beliefs. Don't get me wrong, that is important, but it might be a losing battle. Instead, I think we should ask: why don't we want to change our thinking? Because being proven wrong is an uncomfortable feeling, and it makes sense that we would try to avoid it. Empathy is the real lesson. As a parent, it's an important part of helping my kids when they make incorrect assumptions about the world. If I can put myself in their shoes, I can ease that pain a little. I hope that over time it will teach them that challenging their "beliefs" is an important thing to do, even if it's not a comfortable feeling. By empathizing with them, they'll know that everyone feels that way sometimes. If they can understand that, well, that's half the battle right there.