Many people love social media because their feeds are a safe space to share pictures of celebrities who #stayhome wearing masks, discuss the Black Lives Matter protests, and strategize to solve climate change.
And many people love social media because their feeds are a safe space to share pictures of celebrities #tricked into wearing masks, discuss the Black Lives Matter rioters, and strategize to stop the climate change hoax.
It’s a tale of two feeds: thanks to confirmation bias and powerful proprietary algorithms, social media platforms ensure we each see only one side of every story.
Even though most Americans continue to describe themselves as holding balanced views, we still naturally gravitate toward certain content online. Over time, algorithms turn slight preferences into a polarized environment in which only the loudest voices and most extreme opinions on either side can break through the noise.
What Is Confirmation Bias?
Confirmation bias is the natural human tendency to seek, interpret, and remember new information in accordance with preexisting beliefs. Consider it our brains’ default setting. Just by going through life, humans discover all sorts of information through focused research, general experience, and wild hunches—and it feels especially good to our brains when what we learn matches what we already expected.
Also called “myside bias,” confirmation bias is an innate, universal trait that shows up across cultures. It’s a part of all of us, although once we acknowledge its presence we can take steps to diminish the hold it has on our thinking. The scientific method and the judicial process are both inventions humans created to work around our tendency to jump to conclusions (even if those same systems are sometimes subject to it). Now that confirmation bias has become so prevalent on social media, we need additional tools to manage its impact.
How Does Confirmation Bias Manifest on Social Media?
Science journalist David McRaney, host of the You Are Not So Smart podcast, believes confirmation bias is the root of why we’re drawn to social media.
“The fact that social media platforms confirm what we already believe is the reason many people use them in the first place,” he says. “If the platforms didn’t do that, they wouldn’t be successful.”
For the biggest brands in social media—think Facebook, YouTube, and Twitter—success is defined by the hours users spend engaged with content, and measured in advertising dollars our attention generates.
Social media companies therefore rely on adaptive algorithms to assess our interests and flood us with information that will keep us scrolling. Rather than prioritizing the recency or frequency of our friends’ posts, the algorithms focus on what we “like,” “retweet,” and “share,” feeding us more of the content we’ve signaled we’re comfortable with.
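To see why engagement-based ranking produces echo chambers, consider a deliberately simplified sketch. This is a toy model, not any platform’s actual code; the topic labels, the `rank_feed` function, and the scoring scheme are all hypothetical illustrations of the general idea that past engagement outweighs recency.

```python
# Toy illustration (not any platform's real algorithm): rank posts by how
# well their topics match a user's past engagement, ignoring recency entirely.

def rank_feed(posts, engagement_history):
    """Score each post by its overlap with topics the user has engaged with."""
    # Count how often the user has liked/shared each topic.
    topic_weights = {}
    for topic in engagement_history:
        topic_weights[topic] = topic_weights.get(topic, 0) + 1

    def score(post):
        # Posts matching familiar topics outrank everything else,
        # regardless of how new they are or who posted them.
        return sum(topic_weights.get(t, 0) for t in post["topics"])

    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "topics": ["politics_left"]},
    {"id": 2, "topics": ["politics_right"]},
    {"id": 3, "topics": ["cats"]},
]
history = ["politics_left", "politics_left", "cats"]
print([p["id"] for p in rank_feed(posts, history)])  # → [1, 3, 2]
```

Even in this crude model, the post from the unfamiliar perspective sinks to the bottom of the feed every time, which is the echo-chamber effect in miniature.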
Social media has removed the traditional gatekeepers who evaluated stories for newsworthiness and accuracy. That has been a boon for the discovery of niche online groups who appreciate the same things you do, but it also creates echo chambers in which a user is never presented with alternative perspectives.
Why Should We Care?
According to Kristina Lerman, a USC professor whose research focuses on the structure of modern social networks, echo chambers strengthen polarization and deepen the divisions in our society. It’s common to feel uneasy about the dissonance between the warm blanket of a like-minded social media community and the cold reality of a real world populated with challenging perspectives.
Still, it’s possible to find a balance. Jess Davis, a digital marketer who founded the brand Folk Rebellion, specializes in the responsible use of technology. “If the companies and algorithms aren’t doing it for us,” she says, “it’s up to us to regulate ourselves.”
Davis, who formerly managed the social media portfolios for several well-known brands, noticed how most of the discussion around algorithms came from marketers who wanted to harness their power to increase sales. She rejected the premise of attention as a product to be sold and has attempted to reclaim social media for a more personal purpose: the making and maintaining of connections that complement life offline.
Of course there are trolls online who actively peddle disinformation to incite hatred and violence, but usually it’s possible to separate them from regular users who just happen to have a different perspective. Davis has found satisfaction ignoring the aggressors and using the rest of her social media network as a launching point for in-person relationships.
“It feels more fulfilling to have a conversation over coffee with someone you disagree with than to reply to a thread, alone and angry at home,” she says.
What We Can Do About It
McRaney encourages listeners to seek disconfirmation, a term for information that actively contradicts preconceived opinions. But ever the realist, he also describes this important effort as one of the hardest things for a human brain to do.
“Brains seek confirmation as a knee-jerk reaction even when we’ve only just formed an idea,” he points out, “and then will defend instant conclusions with the same fervor as long-held religious beliefs. It’s frustrating, but these are the brains we have.”
So it’s up to us to construct an online environment that encourages our lazy brains to burst the confirmation bias bubble and consider all available angles.
Here are five steps we can take today to fight back against the algorithms and reclaim our social media feeds.
1. “Like” everything. Algorithms can’t categorize you if they can’t determine what you really like. Be generous with your thumbs-ups and hearts and you’ll be rewarded with something beyond grateful friends who are glad you noticed their posts: a continued churn of content from confused platforms that must show you everything as they assess and reevaluate your preferences.
2. Actively cultivate prestige media on all sides. Step one is about baby pictures and vacation brags. Step two is about avoiding fake news. Swallow your preconceived opinions and follow prestige publications across the political spectrum. A profile that follows both The National Review and The New Yorker will keep your newsfeed clear of the most polarizing stories pushed by trolls on both the right and left who are out to influence and incite anger rather than inform.
3. Pay attention to the number of followers the people you follow have. Lerman’s research revealed that individuals with a large disparity between their number of followers and the number of accounts they themselves follow often acquire outsized influence on social media. Be cognizant of friends who post frequently but follow few others, and consider muting their accounts if they appear to engage with only like-minded thinkers.
4. Change feeds to focus on recency rather than personalization. Both Facebook and Twitter allow users to view the most recent posts first, but the setting is difficult to find and often reverts without warning to the default view, which highlights how much the platforms want you to rely on their algorithms. Changing this setting is worth the effort. Some users will immediately see posts from accounts that have been hidden for years.
5. Create space for new voices. Chances are you aren’t coming to the confirmation bias conversation with a clean slate, so to really start to see changes in your feed you may need to actively undo old habits to enable different voices to appear. Consider temporarily muting celebrities whose accounts share your perspectives to make room for the rest of your efforts to break through.
But Is This All Worth It?
There’s no question it’s a lot harder to fight confirmation bias than to just open your apps and let their algorithms control what you see. But in addition to the personal sense of calm a decreased dependence on confirmation bias creates, Lerman’s research also demonstrated that “balanced” feeds cause a positive effect that cascades through social media.
In essence, posts from a balanced account will be shown more frequently and to a wider array of individuals across all algorithmic spectrums while the platforms attempt to categorize you.
So try out the above steps to keep the platforms wondering what you really think. The ripple effect will keep confirmation bias away, and help build an environment that makes it possible for others to achieve the same result.