The Small but Mighty Danger of Echo Chamber Extremism

Research shows that relatively few people exist in perfectly sealed-off media bubbles—but they’re still having an outsize impact on US politics.
Angry MAGA mob yelling before storming the US Capitol on January 6, 2021. Photograph: Graeme Sloan/Bloomberg/Getty Images

One of the top concerns about the harms of social media and political polarization in the United States is the fear of echo chambers, or of people operating in media bubbles. If people only hear opinions they already agree with or only see stories that align with their worldview, they may become more entrenched in their beliefs, whether or not those beliefs reflect the real world. They may also become easier to manipulate and more extreme.

Interestingly, research largely shows that the vast majority of people don’t inhabit perfectly sealed-off echo chambers. Studies have found that only about 4 percent of people operate in online echo chambers, and most people on Twitter, for example, don’t follow any political accounts at all. Essentially, most people aren’t following politics, and many of those who do get at least a little information from different sides of the political spectrum. That said, echo chambers and media bubbles are still an issue, because they can radicalize people, harm those who inhabit them, and distort the broader political landscape.

“The subset of the population that does consume hyper-partisan media and inhabit echo chambers on social platforms is very consequential,” says Magdalena Wojcieszak, a professor of communication at the University of California, Davis. “They’re more politically interested, more participatory, more strongly partisan, and more polarized. Because of all these things, they’re more likely to take part in politics.”

Wojcieszak says that because these people are so politically involved, they have a disproportionate influence on American politics. They’re often the loudest voices in the room. People who are politically active like to have their views confirmed, she says, so they tend to follow accounts that align with their views and can end up in echo chambers. Social media makes it easier for them to find people who share their politics, and algorithms often feed them more of the content they’re likely to engage with. All of this can ultimately send people down rabbit holes and make them more politically extreme.
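To make that feedback loop concrete, here is a minimal, purely illustrative Python sketch of the dynamic, not a description of any real platform’s ranking system. It imagines a feed that always shows whichever of two content types (congenial or cross-cutting) it predicts will get more engagement, while each engagement nudges the user’s lean a little further; every name and number in it is an assumption made for illustration.

```python
import random

def simulate_feed(steps=1000, learning_rate=0.05, seed=0):
    """Toy model of an engagement-driven feed.

    The 'algorithm' shows whichever of two content types (congenial vs.
    cross-cutting) it predicts will be engaged with more, and engagement in
    turn strengthens the user's lean. Purely illustrative; not any real
    platform's code.
    """
    rng = random.Random(seed)
    preference = 0.55  # hypothetical slight initial lean toward congenial content

    for _ in range(steps):
        # Show the item predicted to get more engagement.
        shown_congenial = preference >= 0.5
        # Engagement is more likely when the item matches the user's lean.
        p_engage = preference if shown_congenial else 1 - preference
        engaged = rng.random() < p_engage
        # Engaging with congenial content strengthens the lean; ignoring
        # cross-cutting content strengthens it a little, too.
        if shown_congenial and engaged:
            preference += learning_rate * (1 - preference)
        elif not shown_congenial and not engaged:
            preference += learning_rate * (1 - preference) * 0.5
    return preference

print(f"Preference for congenial content after simulation: {simulate_feed():.2f}")
```

In this toy run, even a barely perceptible starting lean hardens toward near-total preference for congenial content, which is the rabbit-hole dynamic the researchers describe.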

“It makes you more extreme or polarized. It reinforces your attitudes. It also reinforces your sense of belonging to this group, and it reinforces your negativity and hostility toward other groups,” Wojcieszak says. “You think you’re the legitimate one, the good one, the virtuous one. The others are evil.”

People can start to believe they’re the only ones with the facts and that the other side is illegitimate. (Perhaps you’ve seen this in a person who paid tens of billions of dollars for a social media company not long ago.) Wojcieszak says the process of people becoming radicalized can start with them having just a few political views in common with those who are more extreme than they are. Having a few stances that align with these extreme actors online can be the snare that pulls them into the rabbit hole.

“In order to enter that process of this individual psychological and algorithmic confirmation, you do need to have some extent of susceptibility to some sort of narratives from the left or the right,” Wojcieszak says. “If there are some social or political issues in which you have some views, that can start the process.”

You can imagine someone who isn’t particularly politically extreme but who harbors certain fears about the ways the country is changing. Extremists can pull that person in, and as they get increasingly embedded in that community, they can become extreme themselves. People need community, and extremists can provide one. They’ll be welcomed by this community, Wojcieszak says, and they’ll feel a psychological need to start going along with whatever that community’s narrative is on any number of issues.

Mike Gruszczynski, an assistant professor of communication science at Indiana University, says that distrust of institutions, such as the news media and the government, can lead people to create echo chambers and, often, to fall for disinformation because it appeals to their political beliefs. He says this has been found to be more common on the political right than on the political left.

“You have a lot of people on the right wing of the political spectrum who have been highly distrustful of traditional journalism for quite a while,” Gruszczynski says. “Not only are they distrustful of it, but they exist in a kind of feedback loop where their chosen leaders tell them that the things that come out of the media are false or biased.”

One of the ways society can help prevent people from going down these rabbit holes and becoming more extreme is by teaching them media literacy. Gruszczynski says it won’t necessarily be easy to do, especially because there’s so much disinformation out there and it’s often quite convincing. But it’d be worth the effort. “Everyone kind of has to be their own detective in a way now,” Gruszczynski says.

It often feels like an insurmountable challenge, Wojcieszak says, because those who have become politically extreme live in such a different reality from the rest of the populace. If someone is spending most of their time on extremist forums or in extreme groups on social media, for example, it’s hard to reach them and bring them back to reality. She says improving social media algorithms, so that these platforms are less likely to make people more extreme in the first place, could be a good place to start in attacking the problem.
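That suggestion is abstract, so here is one hypothetical sketch, again in Python, of what such a change could look like: re-ranking a feed with a small bonus for unfamiliar viewpoints instead of scoring purely on predicted engagement. The field names and weights are assumptions for illustration, not how any actual platform works.

```python
def rerank_with_diversity(candidates, diversity_weight=0.3):
    """Illustrative re-ranking sketch: blend predicted engagement with a bonus
    for viewpoints the user rarely sees, so the top of the feed isn't purely
    confirmation. All fields and weights here are hypothetical."""
    def score(item):
        # item: dict with hypothetical keys 'engagement' (0-1 predicted
        # engagement) and 'novelty' (0-1, how unfamiliar the viewpoint is).
        return (1 - diversity_weight) * item["engagement"] + diversity_weight * item["novelty"]
    return sorted(candidates, key=score, reverse=True)

feed = rerank_with_diversity([
    {"id": "a", "engagement": 0.9, "novelty": 0.1},  # congenial, familiar
    {"id": "b", "engagement": 0.6, "novelty": 0.9},  # cross-cutting, unfamiliar
])
print([item["id"] for item in feed])  # the cross-cutting post ranks first
```

In this toy example, the small diversity bonus is enough to lift a cross-cutting post above a slightly more engaging congenial one.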

“In the US, things have gotten so deeply bad for some groups. The people who are, say, true Trump believers or who are convinced Covid was a hoax—I’m not sure if you can deprogram them,” Wojcieszak says.

Society may not be able to pull everyone out of these rabbit holes, but increased media literacy, along with social media platforms that aren’t designed to confirm people’s existing beliefs and make them more extreme, could mean fewer people become radicalized in the first place. It’s a widespread problem that will take time to address, but the status quo does not seem sustainable.