Government & Politics

A Surprising Discovery About Facebook’s Role in Driving Polarization

Recent studies find that tweaking the site’s feeds did not change users’ political attitudes during the 2020 election.

September 14, 2023

by Katia Savchuk

About half of the content Facebook users saw during the last presidential campaign came from sources with like-minded political views. | Cory Hall

Polarization has become the defining feature of the U.S. political landscape, and a common refrain is that social media is to blame. Online echo chambers and filter bubbles spread misinformation, fuel extremism, and stoke antipathy toward those with different beliefs — so the thinking goes.

Facebook, in particular, has come under fire for its role in spreading divisive content. Lawmakers have even weighed proposals to regulate the algorithms that determine what the platform’s users see in their feeds.

Yet a series of recent articles published in Science and Nature suggests that such measures alone may not make much of a dent in ideological polarization. Four studies that tracked tens of thousands of Facebook and Instagram users during the 2020 election concluded that altering their feeds didn’t significantly affect their political attitudes.

“The role of social media on increasing polarization is potentially overstated,” says Neil Malhotra, a professor of political economy at Stanford Graduate School of Business and a lead author on two of the studies, conducted as part of an ongoing project examining Facebook’s role in the last national election. “That doesn’t mean that, historically, social media didn’t create polarization. But if you want to decrease it, policy proposals [to regulate social media] are not going to be a silver bullet.”

New Algorithms, New Experiences

Facebook plays a central role in the online information ecosystem: In late 2020, around 231 million American adults used the platform monthly, and about half of the content they saw came from sources with like-minded political views, while less than 15% originated from sources with opposing perspectives. And the political news users consumed was highly ideologically segregated, with most misinformation clustering in sources that reached conservative audiences.

In partnership with Facebook and Instagram’s parent company, Meta, the researchers recruited nearly 200,000 study participants through an invitation in their feeds. Volunteers consented to have their online activity analyzed and were given five surveys between August and December 2020 to assess their political attitudes.


The researchers made several experimental alterations to the algorithm that drives users’ Facebook feeds. In one study, they reduced posts from like-minded sources by one-third for a sample of more than 23,000 people. This slightly increased the amount of content people saw from sources with opposing views and decreased their exposure to “uncivil” language and known sources of misinformation. The change didn’t affect how much time users spent on Facebook, and while they engaged less overall with posts from like-minded sources, they were more likely to interact with the like-minded content they did see.

Another study reordered the feeds of a subset of Facebook and Instagram users chronologically, so that they saw the newest posts first. Compared with users who kept the standard ranked feed, these Facebook users saw more political content and more posts deemed untrustworthy by Meta’s machine-learning model, but also fewer posts containing slurs and more content from sources with moderate political views and politically diverse audiences. Instagram users likewise saw more political and untrustworthy content (Meta doesn’t classify posts by political ideology on that platform). The result: Users spent much less time on both platforms and reacted to fewer posts, instead spending more time on alternative sites such as TikTok, YouTube, and Reddit.

A third study removed reshares from the feeds of a random sample of Facebook users. This drastically reduced their exposure to political news, including news from untrustworthy sources, and decreased how often they clicked on or reacted to posts overall.
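Conceptually, the three interventions are simple transformations of a ranked feed. The Python sketch below is purely illustrative: it is not Meta’s production code, and the Post fields, the alignment labels, and the random-sampling rule are assumptions made for the example.

```python
import random
from dataclasses import dataclass


@dataclass
class Post:
    # Hypothetical fields standing in for Meta's internal post metadata.
    timestamp: float   # Unix time the post was created
    is_reshare: bool   # True if the post reshares another post
    alignment: str     # "like_minded", "cross_cutting", or "neutral"


def reduce_like_minded(feed: list[Post], fraction: float = 1 / 3) -> list[Post]:
    """Randomly drop roughly `fraction` of posts from like-minded sources."""
    return [p for p in feed
            if p.alignment != "like_minded" or random.random() >= fraction]


def reverse_chronological(feed: list[Post]) -> list[Post]:
    """Show the newest posts first instead of ranking by predicted engagement."""
    return sorted(feed, key=lambda p: p.timestamp, reverse=True)


def remove_reshares(feed: list[Post]) -> list[Post]:
    """Drop reshared posts entirely, keeping only original content."""
    return [p for p in feed if not p.is_reshare]
```

In the actual experiments, each treatment group received only one of these alterations, and the researchers then compared its surveyed attitudes and on-platform behavior against a control group that kept the standard feed.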

The Stickiness of Polarization

While these changes significantly reshaped users’ online experiences, they had no substantive effect on political polarization — neither extremism on the issues nor hostility toward the other side. Nor did they significantly alter other political attitudes or behaviors. For example, a reverse-chronological feed made Facebook users less likely to engage with political content, such as liking a political post or mentioning candidates, but had no effect on their political knowledge or their participation in real-world activities such as signing petitions or attending rallies. Seeing less like-minded content also didn’t change how users evaluated presidential candidates or whether they believed false statements. And the only significant result of eliminating reshares was a drop in users’ knowledge of the news.

Malhotra suggests this lack of impact could stem from how firmly entrenched people’s political views are. “Attitudes are very sticky,” he says. “They don’t change a lot, and they’re very slow to change.”

Users also consume plenty of content beyond social media. “When you make an algorithm not engaging for people, they do other stuff with their time,” Malhotra says. “They might watch cable TV or go on other internet forums where they could be exposed to misinformation or polarizing content.”

Malhotra points out that the research only speaks to Facebook’s and Instagram’s algorithms as they were in the fall of 2020, not before or since. He and his coauthors also note that their studies’ results could differ over a longer period or outside of a divisive election campaign. And boosting, rather than decreasing, users’ exposure to certain content could have a stronger effect.

No Easy Scapegoats

The research involved an unusual collaboration between more than 25 academics and Meta. The tech giant covered the costs of the research and handled individual-level data to protect users’ privacy, but it did not compensate the researchers and did not have the right to approve their findings. Malhotra says his team had independence and that such a study would be impossible without Meta’s collaboration. “If you actually want data housed internally within these companies,” he says, “the choice is you wait for the laws to change, which I don’t think is wise, or you just can’t do the research.”

The partnership will produce a handful of other studies, including one in which users were paid to deactivate their accounts and another that eliminated political ads from users’ feeds.

Malhotra says future research could also focus on the effect of social media on extreme partisanship among political elites, who have an outsized role in shaping the broader conversation. “It could be that social media is accelerating polarization among this group — not just the people who run for political office, but the people who work for campaigns and interest groups,” he says.

The authors also propose further studies that look at users who could be most susceptible to like-minded content, users in other countries, and the subset of users who see an overwhelming amount of like-minded posts.

In the meantime, Malhotra says this research provides compelling evidence that fixing our political discourse requires looking beyond social media. “It’s a very easy scapegoat,” he says. “People wanted a clean story: Our politics are broken — it must be a single company that’s doing it. The real reason our politics are broken is because humans in our society are broken, and there are many reasons for that separate from social media. That’s not what people want to hear.”

