I had the opportunity recently to chat with Katy Byron, of the Poynter Institute’s MediaWise, and Erik Palmer, who consults with schools about, among other things, operating in a digital world. They offered very helpful strategies to help students (and teachers) become savvy digital-media consumers, including how to determine the integrity and validity of the information they receive and share. Social media certainly inundates us with content, much of it designed to influence rather than objectively inform us. How can we distinguish truth from fiction, partisan spin from nonpartisan reporting? Fortunately, resources are available to help.
Still, I’m worried.
Knowing how to do lateral reading, for instance, is great, but you still need the motivation to apply that strategy. Since social media generally feeds us information we already agree with, why would any of us take the time to verify it? I have a dream (fantasy?) of a world where people see confirming news and think, “I wonder if there’s a conflicting view.” Can we actually get people to seek information that contradicts their existing beliefs, to work to understand, rather than convince, people with other viewpoints? To achieve such a world, I think it helps to understand what drives our information-seeking and evaluating behavior, and here’s where it gets both interesting and complicated.
You likely have heard about confirmation bias, our tendency to seek and embrace information that confirms what we already believe. It makes us feel good. However, recent research is revealing that our relationship with truth and contradictory information is full of nuance. A recent piece in The Economist titled “This Article is Full of Lies” does a nice, concise job of making much of this research accessible. One fundamental question is whether truth even matters. As the article notes, according to polls, some politicians are more liked than they are trusted.
Do we simply accept that politicians lie and embrace them based on other characteristics?
A brain study at Emory University, for instance, found that when both Democrats and Republicans were given threatening information about their candidate before the 2004 election, the parts of the brain related to emotion and emotion regulation lit up. The parts typically related to reasoning were quiet. Does emotion rule and rational thinking drool, at least when it comes to partisan issues?
Research by Dan Kahan at Yale Law School suggests that something called Identity-Protective Cognition may be at play. It’s not just that we seek out confirming information, even if it may be misinformation; we defend ourselves against information, including research and data, that threatens the beliefs related to our cultural identities. We all belong to groups that give us friendship and support and help define who we are. Those groups matter to us, and we want to maintain those affiliations. When those groups share core beliefs, we risk alienation by adopting ideas that contradict those beliefs. So, if you belong to a group that questions the validity of human-induced climate change, you are likely going to weigh evidence in a way that favors the beliefs of the group. In a time of hyper-partisan affiliations, we are increasingly likely to see Identity-Protective Cognition at work.
Sadly, more education doesn’t seem to improve the chances of open-mindedness. Kahan’s results mirrored what my Harvard colleagues David Perkins and Shari Tishman reported a number of years ago. Across a series of studies looking at responses to contentious political questions, here’s what they found: “People with higher IQs were no more likely to attend to the other side of the case than people with lower IQs, although people with higher IQs did tend to offer more elaborate justifications of their preferred side of the case.” More knowledge led to more sophisticated justifications for maintaining existing beliefs. Sigh . . .
Are you beginning to understand why I’m still worried about our ability to engage in truth-finding and civil discourse when our lives are filled with inflammatory social media bubbles?
Looking across the research landscape, though, I find reasons to be hopeful, and I have some suggestions to guide experiments that might lead to the contradiction-seeking world I imagine.