Technology & AI

When Social Media and Political Speech Collide

A deeper look at how Facebook’s content played a role in elections and the dissemination of hate speech around the globe.

September 29, 2023

by Kelsey Doyle and Jenny Luna

This podcase is a case study which examines Facebook’s history as it relates to political speech in the U.S. and around the world. It covers the “Facebook era” in American presidential elections, including charges of Russian interference, as well as how Facebook content may have contributed to ethnic violence in Myanmar. The podcase also discusses how the company, now called Meta, has navigated contentious decisions about political speech in an ever-evolving social media landscape.

The case is designed for use in classes on business ethics or corporations and society to discuss the social responsibility of business. It raises ethical questions such as: How should social media balance a commitment to freedom of speech with the need to prevent violence and civil unrest? When is censorship OK, if ever? And what role — if any — should government play in legislating how internet companies regulate what elected officials can say on their platforms?

Podcases: Case Studies, Reimagined

A “podcase” is a teaching tool: an audio version of a traditional case study, designed to provide an alternate learning method for students. It includes audio enhancements, such as sound effects, intended to illuminate the material.

Full Transcript

Mark Zuckerberg: I don’t think Facebook or Internet platforms in general should be arbiters of truth. I think that’s kind of a dangerous line to get to in terms of deciding what is true and what isn’t…

Frances Haugen: Facebook’s products harm children, stoke division and weaken our democracy. The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people.

Mark Zuckerberg: In general, you want to give as wide of a voice as possible, and I think you want to have a special deference to political speech.

Jenny Luna: Hi there. I’m Jenny Luna, a multimedia producer at Stanford Graduate School of Business.

Kelsey Doyle: And I’m Kelsey Doyle, also a producer here at the GSB, and you’re listening to “Facebook and Political Speech.” We’ve taken a written case by political economy professor Ken Shotts and case writer Sheila Melvin and turned it into… what we’re calling a podcase.

Jenny Luna: Yeah, this podcase is designed for use in classes on business ethics or corporations and society to discuss the social responsibility of business. As you’re listening, questions may arise for you like… how should social media balance a commitment to freedom of speech with the need to prevent violence and civil unrest? When is censorship OK, if ever? And what role should government play — if any — in legislating how internet companies regulate what elected officials can say on their platforms?

Kelsey Doyle: To dive deeper into these questions, Jenny and I will dig into Facebook’s history, particularly in US politics, and examine how it’s navigated contentious decisions about political speech in an ever-evolving digital landscape.

Jenny Luna: The podcase won’t be a comprehensive overview of Facebook and elections in the US or anywhere else. It’s really meant to see how you respond to it. What is your moral reaction, and how does it differ from that of others who hear the same thing?

Kelsey Doyle: We’re excited to guide you through this auditory experience. You can find a link to the written case study via our website: gsb.stanford.edu.

Jenny Luna: When a college student named Mark Zuckerberg created a website called “The Facebook” back in 2004, he had no idea it would eventually become the world’s largest social media company. With around 3 billion users sharing 140 billion messages and a billion stories a day, Facebook has changed the way humans communicate and relate to one another.

Kelsey Doyle: More than 80 percent of adult social media users in the United States reported visiting Facebook at least once a week in 2022, and the platform’s global penetration was nearly 40 percent.

Jenny Luna: Facebook’s original mission statement was “Making the world more open and connected.” As the audience at the 2010 AllThingsD conference found out, company founder Mark Zuckerberg even had it printed inside his trademark hooded sweatshirt.

Kara Swisher: It is a warm hoodie.

Mark Zuckerberg: Yeah, it’s a thick hoodie. It’s a company hoodie. We print our mission on the inside.

Kara Swisher: What? Oh, the inside of the hoodie, everybody. What is it? Making the…

Mark Zuckerberg: Making the world more open and connected.

Kara Swisher: Oh my god, it’s like a secret cult.

Jenny Luna: But in 2017, Zuckerberg announced a change…

Mark Zuckerberg: The idea behind our new mission is to bring the world closer together. And the full formal mission statement is gonna be: “To give people the power to build community and bring the world closer together.”

Kelsey Doyle: The Facebook company, now named Meta, also includes Instagram and WhatsApp. Combined, Meta’s apps are used by over 200 million businesses. On the Meta website, the company’s principles are, quote, “give people a voice; build connection and community; serve everyone; keep people safe and protect privacy; promote economic opportunity.”

Jenny Luna: Meta’s primary source of revenue is advertising. Connecting people… bringing them closer together… with economic pressure to make sure that users engage with the site and generate revenue for advertisers… balancing these sometimes competing missions and business requirements has been a difficult tightrope for Meta to walk.

Kelsey Doyle: Facebook and Meta have been accused of bias and censorship, of helping foment violence and fuel genocide, and of tipping the balance of national elections. In many ways, the rise of Facebook’s influence is the story of how the internet became the most powerful political tool on earth.

News Clip: Please welcome President Clinton and Senator Dole.

Jenny Luna: The 1996 U.S. presidential campaign was the first that incorporated the world wide web, as it was known back then. Only 20 million American adults had access to the Internet at the time, mostly through dial-up modems. The New York Times suggested that the Web presences of both the Republican and Democratic campaigns were quote, “partly marketing gestures intended to emphasize that their candidates are men of the future, at ease with modern technology.”

Senator Bob Dole: This election is important. I ask for your support, I ask for your help. And if you really want to get involved, just tap into my home page, www.DoleKemp96 o-r-g. Thank you. God bless America.

Jenny Luna: Republican presidential candidate Bob Dole’s forgetting the “dot” before “org” showed just how new the Internet was for most people. Nonetheless, longtime journalist Marvin Kalb predicted that the Web would play a growing role in politics. He wrote:

Male Voice: In an uncensored, open-access medium where everyone can have a say, everyone will have a say in one way or another. What people say will be opinionated and partisan, it will play fast and loose with the facts, and it will largely be undocumented. Much of the Web already is information by mob rule, and that isn’t likely to change as this decentralized medium becomes more populated.

Kelsey Doyle: Eight years later, Howard Dean, a candidate for the 2004 Democratic presidential nomination, was generally considered the first politician to make effective use of the Web. His followers connected through meetup dot com, and his campaign staff embraced a decentralized work structure.

News Clip: 5,000 strong, the most to ever come out on a caucus. So this is the crew that you guys will be seeing a lot of. This is Ken Sanguin, who’s directing the effort right here.

Ken Sanguin: This site is where we’re gonna have training for the Iowa caucus for Iowans who want to learn about it. There will be video showing them exactly what they’re going to do at the caucus so they’re not scared of it and they’ll have a good time and help Dean win. We’re gonna have places for people to come in, log in, find out how they’re gonna get to Iowa. There’s gonna be a ride board so people can organize trips from all over the country to come into Iowa. It’s gonna be awesome!

Kelsey Doyle: Some argue that the Internet “invented” Dean as a candidate, helping the little-known Vermont governor soar to national attention and, for a time, be a serious presidential contender. But the Internet may have also led to Dean’s demise. A passionate speech after the Iowa Democratic caucuses ended in what came to be called the “Dean scream.”

Howard Dean: And we’re going to South Dakota and Oregon and Washington and Michigan, and then we’re going to Washington, D.C., to take back the White House! Yeah!

Jenny Luna: That scream was ridiculed, and became what many consider the first political meme.

Jay Leno: Oh my god, did you see Dean’s speech last night? Oh my god. Now I hear the cows in Iowa are afraid of getting mad Dean disease.

(News clips and memes of scream)

Jenny Luna: As it went viral on the internet, and on TV, news coverage of the scream drowned out Dean’s political messages.

(Memes of scream)

Kelsey Doyle: The 2004 Dean scream coincidentally took place about two weeks before a 19-year-old Mark Zuckerberg launched “The Facebook” on Harvard’s campus. Politics were not even a consideration. The goal was to let Harvard University students connect with one another online.

Jenny Luna: But four years later, when the 2008 presidential campaign rolled around, Facebook had accumulated 100 million active users, and politicians saw it as a powerful tool on the campaign trail. U.S. News and World Report underlined the role that Facebook played in Barack Obama’s 2008 victory, claiming, quote, “Obama enjoyed a groundswell of support among, for lack of a better term, the Facebook generation. He will be the first occupant of the White House to have won a presidential election on the Web.” The article further noted that Chris Hughes, a Facebook co-founder who left the company to volunteer for Obama, had masterminded the Obama campaign’s highly effective Web blitzkrieg — everything from social networking sites to podcasting and mobile messaging.

Barack Obama: Part of what makes for a healthy democracy, what is a good politics, is when you’ve got citizens who are informed, who are engaged. And what Facebook allows us to do is make sure this isn’t just a one-way conversation; makes sure that not only am I speaking to you but you’re also speaking back and we’re in a conversation, we’re in a dialogue.

Kelsey Doyle: During the 2010 congressional elections, Facebook conducted an experiment to gauge the political influence of its platform. The randomized controlled trial sent political mobilization messages to 61 million Facebook users to test the hypothesis that political behavior can spread through an online social network.

Automated Female Voice: Find your polling place on the U.S. Politics page and click the “I Voted” button to tell your friends you voted. Jamie Settle, Jason Jones, and 13 of your friends have voted.

Kelsey Doyle: The results, which were published in Nature, found that the messages directly influenced political self-expression, information seeking and real-world voting behavior of millions of people. Furthermore, the messages not only influenced the users who received them, but also the users’ friends, and friends of friends.

Jenny Luna: By the 2012 presidential election, Facebook had over 1 billion users. This exponential growth, the Guardian wrote, meant, quote, “this will be the first election cycle in which Facebook could become a dominant political force… a major campaigning tool that has the potential to transform friendship into a political weapon… Facebook is also being seen as a source of invaluable data on voters.”

Kelsey Doyle: Using its Targeted Sharing app, Obama’s campaign asked supporters to share their friend lists, which most did. Here’s Betsy Hoover, the campaign’s online organizing director, being interviewed on NPR:

Betsy Hoover: We got your list of friends. And then we matched it to our model, our list of voters that we didn’t build with Facebook data. We built with voter history and, you know, all of the other data points that Democratic campaigns use to build models. But we matched the data of your friends to that model and then reflected it back to the person who had authorized the app and said, if you want to reach out to your friends about this election on Facebook, here are the ones that you should reach out to first.

Jenny Luna: In 2014, responding to privacy concerns, Facebook began to limit how much information third party apps could collect. The shift was lamented by the industry newsletter Campaigns & Elections in an article headlined, “Facebook Kills a Grassroots Tool.” Nonetheless, analysts predicted that Facebook, which had become a major conduit of news and information, would significantly impact future elections. Tarun Wadhwa, writing on HuffPost, argued:

Male Voice: The design, policies, and algorithms chosen by the company are having a major impact on how elections are run and how the electorate gets their information… this puts Facebook in an incredibly powerful position to determine the political future of the several countries where it is most popular. Whether it wants this responsibility or not, Facebook has now become an integral part of the democratic process globally.

Hillary Clinton: When your kids and grandkids ask what you did in 2016, when everything was on the line, I hope you’ll be able to say that you voted for a better, stronger, fairer America.

Kelsey Doyle: In 2016, Democratic Party presidential candidate Hillary Clinton was widely expected to win — and did win the popular vote by 2.8 million votes — but Republican Party nominee Donald Trump won the election, with victories in enough states to earn a 304 to 227 margin in the Electoral College.

Donald Trump: It’s been what they call a historic event, but to be really historic, we have to do a great job. And I promise you that I will not let you down. We will do a great job. We will do a great job. I look very much forward to being your president.

Kelsey Doyle: Analysts from across the political spectrum attributed his victory to Facebook for a number of reasons. First and foremost was the use of the platform for messaging and advertising.

Jenny Luna: Trump’s digital media manager, Brad Parscale, told Wired magazine that Facebook helped generate most of the $250 million that the campaign raised online. And he said that “Facebook and Twitter were the reason we won this thing. Twitter for Mr. Trump. And Facebook for fundraising.” Here’s Parscale speaking to PBS Frontline’s James Jacoby:

Brad Parscale: We knew that Facebook, the audiences were there; the people were there; the people we needed to touch, and there was probably no better way to, per dollar, connect with them other than Facebook.

James Jacoby: And so what was, in the primary season, what was the strategy on Facebook, and how did it kind of shift going into 2016?

Brad Parscale: Shock and awe.

James Jacoby: Shock and awe? How so? What does that mean?

Brad Parscale: Which means is, put Mr. Trump’s message, let him speak directly to camera, and get it to as many people as possible.

James Jacoby: And why was Facebook the ideal medium for that?

Brad Parscale: Low-cost CPM, large numbers of conservative voters, ability to broadcast all day multiple times to the same audience, and the numbers were showing in the consumer side that people were spending more and more hours of their day consuming Facebook content and aggregated news feed.

Kelsey Doyle: With the advent of social media, it became possible to target advertising with greater precision at a lower cost; this ability accelerated after Facebook Ads was launched in 2007. In reference to the Obama campaign, media analyst Daniel Kreiss wrote that Facebook:

Male Voice: provided a wealth of new ways to target groups of voters. These ads were based on a cost per click model, where the campaign only pays when an individual sees an ad and clicks on it. On Facebook, the campaign targeted advertising based on a host of different characteristics revealed on voters’ profile pages, from political persuasion and religion to hobbies.

Jenny Luna: Facebook launched its mobile app in 2011, and in 2012 the platform provided advertisers the ability to place ads in users’ newsfeeds, including on the app. In 2013, Facebook launched a feature that was later renamed Meta Pixel, which enabled advertisers to track conversions, and understand how users interacted with their brand, or campaign, both on and off Facebook. It also launched Lookalike Audiences, algorithmically generated groups of people most likely to be interested in whatever someone was selling. As machine learning improved, so did the algorithms used for targeting readers.
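
To make that targeting machinery concrete, here is a minimal, purely illustrative sketch of how a lookalike audience can be built: average the feature vectors of a “seed” audience (say, past donors) and rank candidate users by cosine similarity to that average. The three-dimensional toy features, the user IDs, and the centroid-plus-cosine approach are all assumptions made for illustration; Meta has not published its actual algorithm, which draws on far richer behavioral signals.

```python
import numpy as np

def lookalike_audience(seed_vectors, candidate_vectors, candidate_ids, top_k):
    """Rank candidate users by cosine similarity to the centroid of a seed audience."""
    centroid = seed_vectors.mean(axis=0)  # the "average" seed-user profile
    norms = np.linalg.norm(candidate_vectors, axis=1) * np.linalg.norm(centroid)
    sims = candidate_vectors @ centroid / np.where(norms == 0.0, 1.0, norms)
    top = np.argsort(-sims)[:top_k]       # indices of the most similar candidates
    return [(candidate_ids[i], float(sims[i])) for i in top]

# Toy engagement-feature vectors; every number here is invented for illustration.
seed = np.array([[1.0, 0.9, 0.1],   # e.g., existing supporters or past donors
                 [0.8, 1.0, 0.0]])
candidates = np.array([[0.9, 0.8, 0.2],
                       [0.1, 0.2, 1.0],
                       [0.7, 0.9, 0.1]])
print(lookalike_audience(seed, candidates, ["user_a", "user_b", "user_c"], top_k=2))
```

Run on these toy numbers, the sketch keeps the two candidates whose profiles most resemble the seed group and drops the dissimilar one, which is the basic idea behind extending a known audience to “people most likely to be interested.”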

Kelsey Doyle: In 2016, the Trump campaign ran 40,000 to 50,000 variants of its Facebook ads daily; on the day of the third presidential debate, it ran 175,000, a technique that Gary Coby, director of advertising at the Republican National Committee, called “A/B testing on steroids.” The campaign spent 80 percent of its digital ad budget on Facebook. Facebook embedded staff to work alongside the Trump campaign and help them target potential donors and voters. The Trump team’s key contact at Facebook encouraged the campaign to run ads that targeted Facebook users who had liked or commented on Trump’s posts in the previous month; every dollar spent on such ads yielded the campaign $2 to $3 in donations, bringing in millions in just a few days. The company offered the same services to the Clinton campaign, but found them less receptive.

Jenny Luna: But it wasn’t just advertising. Disinformation and so-called fake news were also on the list of election-related criticisms coming Facebook’s way.

Kelsey Doyle: Around the time of the 2016 election, BuzzFeed reported that scores of websites focused on U.S. politics were run by teenagers in Macedonia, who had found that the best way to generate shares on Facebook was to publish sensationalist and often false content that caters to Trump supporters. The creators of the sites, whose goal was to make money, used information from right-wing American websites to garner clicks. Much of the content was completely made up, like a viral post claiming that the Pope had endorsed Trump.

Jenny Luna: Several days after the election, CEO Mark Zuckerberg appeared at the Techonomy conference, where he addressed claims that fake news on Facebook had helped Trump win:

Mark Zuckerberg: You know, personally I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way is, I think, a pretty crazy idea… there is a certain profound lack of empathy in asserting that the only reason why someone could have voted the way they did is they saw some fake news. Right, if you believe that, then I don’t think you have internalized the message the Trump supporters are trying to send in this election.

Jenny Luna: Zuckerberg later walked back this statement, writing: “After the election, I made a comment that I thought the idea that misinformation on Facebook changed the outcome of the election was a crazy idea. Calling that crazy was dismissive and I regret it. This is too important an issue to be dismissive.” Yet, Zuckerberg argued, “The data has always shown that our broader impact — from giving people a voice to enabling candidates to communicate directly to helping millions of people vote — played a far bigger role in this election.”

Kelsey Doyle: There was also the issue of Russian interference. In June 2016, the Democratic National Committee, or DNC, announced that it had been attacked by Russian hackers; this led Facebook to investigate whether any Russian activity had been directed at the social media site. By November, according to the New York Times, Facebook had discovered that Russian operatives were pushing DNC leaks and propaganda on the platform.

Donald Trump: Russia, if you’re listening, I hope you’re able to find the 30,000 emails that are missing.

Kelsey Doyle: In February 2018, the Justice Department indicted 13 Russian individuals and three companies for trying to criminally interfere in the 2016 U.S. presidential election. This included the Russian Internet Research Agency, or IRA, which allegedly placed ads on Facebook to promote divisive content and stoke political and social unrest. These ads used Facebook targeting tools to reach narrow groups of users. The IRA created an estimated 470 fake Facebook accounts, some pretending to belong to activist organizations in the U.S., and used these accounts to influence or incite people around issues like racism, immigration, and LGBTQ+ rights. The IRA also bought ads or hosted fake accounts on Google and Twitter, as well as Pokemon Go, Tumblr, and PayPal.

Automated Female Voice: Hillary Clinton is the co-author of Obama’s anti-police and anti-Constitutional propaganda.

Automated Female Voice: Join us because we care. Black Matters!

Automated Male Voice: Who is behind this mask? A man? A woman? A terrorist? Burqa is a security risk and it should be banned on U.S. soil!

Automated Female Voice: We call for disqualification and removal of Hillary Clinton from the presidential ballot as dynastic succession of the Clinton family in American politics breaches the core democratic principles.

Jenny Luna: In September 2017, Facebook agreed to give Congress detailed information on Russian-backed ads, improve disclosure requirements for political ads, change its ad review process, and more than double the size of its election integrity team. “It is a new challenge for Internet communities to deal with nation-states attempting to subvert elections,” Zuckerberg said. “But if that’s what we must do, we are committed to rising to the occasion.” Experts disagreed on the impact of the Russian ads; a 2018 Washington Post analysis, for example, called the scale of Russian ads vastly overstated.

Kelsey Doyle: As Facebook scrambled to make changes, the focus wasn’t just on the United States. Groups all over the world had become fully aware of Facebook’s ability to shift people’s views and perspectives on topics far beyond electoral politics. In 2018, the company created a 24/7 “command center” to monitor and remove abuses and limit the spread of hateful or untrue content by tweaking its algorithms or adding banners and labels with relevant, authoritative information.

News Clip: Inside this war room, a team of Facebook analysts monitor elections around the world. They’re looking for fake accounts, suspicious patterns, any sign of election tampering.

Kelsey Doyle: The initial focus was elections in Brazil and the US. But events in Myanmar, formerly known as Burma, had demonstrated that Facebook could be used for purposes far beyond electoral politics.

Jenny Luna: Myanmar was ruled by a military junta from 1962 to 2011. During that time, few people had cell phones or Internet access. When Myanmar began to liberalize in 2011, mobile adoption was rapid; Facebook initially allowed its app, which came pre-loaded on new phones, to be used without data charges.

Kelsey Doyle: People who once visited tea shops to get word-of-mouth news could now get it from Facebook, which one commentator called the nation’s digital tea shop. By 2019, there were an estimated 21 million Facebook users out of a population of 54 million; Facebook reportedly accounted for 99 percent of social media usage and had become a virtual synonym for the Internet.

Jenny Luna: Beginning in 2013, members of Myanmar’s military were believed to have used fake identities on Facebook to make incendiary posts against the Rohingya, the nation’s largest Muslim minority group. The Rohingya had long been discriminated against, perceived as illegal immigrants from Bangladesh, even though many could trace their roots in Myanmar back centuries. They were banned from holding citizenship or owning land, and had to seek permission in order to marry or travel. According to reporting in the New York Times, some of the propaganda techniques used by the military were based on tactics used in Russia, while others were homegrown. These included Facebook pages and news pages ostensibly dedicated to pop stars and other celebrities that displayed gruesome photos, fake news, and inflammatory charges targeting the Rohingya and democracy activists like the Nobel Peace Prize winner Aung San Suu Kyi.

News Clip: Social media was slow in coming to Myanmar. It’s now booming, providing new platforms to figures such as Buddhist monk Ashin Wirathu. When he goes on Facebook and calls Rohingyas snakes and mad dogs, he has more than 400,000 followers who can spread that across other platforms.

Kelsey Doyle: In August 2017, the military in Myanmar began a series of attacks against the Rohingya that included murder, rape, and torture; human rights groups widely characterized these attacks as acts of genocide. Villages were razed and thousands killed, leading to the exodus of nearly a million Rohingya, primarily to Bangladesh.

News Clip: The violence against the Muslim minority and the resulting exodus by land and sea are unprecedented, and deadly. And must be stopped, says the UN secretary general: “The authorities in Myanmar must take determined action to put an end to this vicious cycle of violence, and to provide security and assistance to all those in need.”

Kelsey Doyle: U.N. human rights investigators stated that social media had played a determining role in the events. Marzuki Darusman, chairman of the U.N. Independent International Fact-Finding Mission on Myanmar, added that “It has… substantively contributed to the level of acrimony and dissension and conflict, if you will, within the public. Hate speech is certainly of course a part of that. As far as the Myanmar situation is concerned, social media is Facebook, and Facebook is social media.”

Jenny Luna: In August 2018, Meta acknowledged that, “The ethnic violence in Myanmar is horrific and we have been too slow to prevent misinformation and hate on Facebook.” The company hired more Burmese language experts and created better AI tools to detect posts that broke the rules. Facebook also commissioned an independent human rights assessment and published the findings in November 2018. Product policy manager Alex Warofka wrote: “The report concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more.”

Kelsey Doyle: According to the Los Angeles Times, in August 2018 “Facebook banned 20 individuals and organizations, including Myanmar’s military chief, and removed pages that together were followed by almost 12 million people. The company said it was the first time it had banned a country’s military or political leaders.”

Jenny Luna: Meanwhile, in Brazil, Facebook and WhatsApp were flooded with fake news during that country’s 2018 election.

Kelsey Doyle: And in the US, Facebook continued to uncover ongoing Russian attempts to influence voters, including fake accounts and identities linked to individuals associated with past activity by the Russian Internet Research Agency. The company announced the creation of an independent Oversight Board to review content decisions. The board, which began operating in 2020, was to “use its independent judgment to support people’s right to free expression and ensure those rights are being adequately respected,” and its decisions were binding.

Jenny Luna: Facebook also addressed the issue of political ads. It stopped paying sales commissions to employees selling political ads and stopped embedding employees with presidential campaigns. Google and Twitter also had offered political advertising support to Republicans and Democrats but, according to an article in the Wall Street Journal, “Facebook’s size and questions about misuse of data siphoned from the platform meant it took the brunt of the public backlash.”

Kelsey Doyle: In 2018, Facebook launched a searchable database of political ads, and soon expanded it to include all ads on the site. The company reportedly considered dropping political advertising altogether — as Twitter opted to do in October 2019. But Facebook ultimately didn’t do that. It did, however, update its advertising policies to offer expanded transparency and more controls over political ads. Rob Leathern, Facebook’s director of product management, explained the reasoning by writing on the company’s website:

Male Voice: Unlike Google, we have chosen not to limit targeting of these ads. We considered doing so, but through extensive outreach and consultations we heard about the importance of these tools for reaching key audiences from a wide range of NGOs, non-profits, political groups and campaigns, including both Republican and Democratic committees in the US. Ultimately, we don’t think decisions about political ads should be made by private companies, which is why we are arguing for regulation that would apply across the industry. In the absence of regulation, Facebook and other companies are left to design their own policies. We have based ours on the principle that people should be able to hear from those who wish to lead them, warts and all, and that what they say should be scrutinized and debated in public.

Jenny Luna: Leathern concluded by noting that all advertisers, including politicians, had to abide by the company’s Community Standards that “ban hate speech, harmful content and content designed to intimidate voters or stop them from exercising their right to vote.”

Kelsey Doyle: That policy was soon put to the test. In May 2020, George Floyd, an unarmed 46-year-old Black man, was murdered by Minneapolis police while being arrested on suspicion of using a counterfeit $20 bill. During the massive racial justice protests that followed, President Trump posted remarks on both Facebook and Twitter that included the phrase “when the looting starts, the shooting starts.”

Jenny Luna: Twitter flagged Trump’s post and made it difficult to share, but Facebook kept it accessible, an approach that was consistent with its practice of allowing political speech that was newsworthy to remain on the site, even if it otherwise violated the company’s standards.

Kelsey Doyle: Major companies unhappy with Facebook’s decision launched an advertising boycott, and more than 5,000 Facebook employees denounced the decision.

News Clip: Some Facebook employees perform a virtual walkout, choosing not to show up for work on Monday. It’s all part of a protest over CEO Mark Zuckerberg’s decision not to take action against President Donald Trump’s controversial post referencing violent protests around the country. One Facebook employee says he disagrees with Zuckerberg’s decision to do nothing about Trump’s recent posts, and said there isn’t a neutral position on racism.

Jenny Luna: In response to the uproar over this post and others in which Trump promoted falsehoods about COVID-19 cures, election fraud, and more, Facebook announced that it would remove posts, including those by political leaders, that incited violence or attempted to suppress voting, and said that it would label posts that violated its policies, including on hate speech.

Kelsey Doyle: Closer to the 2020 elections, Facebook implemented “break the glass” measures, or temporary interventions to keep the platform safe, like slowing down the growth of political groups that might spread misinformation and foment extremism. It also suspended political and social ads in the week before election day, November 3rd.

Jenny Luna: Facebook’s election-related efforts, described by one former company executive as an attempt to make sure Facebook wasn’t the story, were initially seen as largely successful. But immediately after Democratic candidate Joe Biden won the election, by more than 7 million votes and with a majority in the Electoral College, Trump and his allies also declared victory and spread false stories claiming the election had been stolen.

Donald Trump: This is a fraud on the American public. This is an embarrassment to our country. We were getting ready to win this election. Frankly, we did win this election. We did win this election. So our goal now is to ensure the integrity for the good of this nation. This is a very big moment. This is a major fraud in our nation.

Kelsey Doyle: Facebook, meanwhile, began a previously planned rollback of the emergency measures that had helped control toxic speech and misinformation before election day. It also dismantled the Civic Integrity group and redeployed its members to other teams. Guy Rosen, Facebook’s vice president for integrity, outlined the new structure on December 2, 2020, noting, “We’ve made incredible progress in getting problems under control, and many of us looked to the US 2020 elections as a major milestone on this path. I’m so very proud of this team and everything we’ve accomplished.”

News Clip: “Stop the Steal” chant

Jenny Luna: The success of Facebook’s election-related efforts was widely questioned when thousands of Trump supporters shouting “stop the steal” attacked the United States Capitol on January 6, 2021, trying to prevent the constitutionally mandated certification of President-elect Biden’s victory.

Donald Trump: We’re gonna walk down to the Capitol! Let’s take the Capitol! Take the Capitol!

Kelsey Doyle: Many false claims of election fraud had been shared in Facebook groups; when Facebook tried to shut these “stop the steal” groups down, their members just moved to other groups. It was subsequently revealed that between election day on November 3 and January 6, some 650,000 posts in Facebook groups attacked the legitimacy of the Biden victory, and many of these posts aimed to incite political violence.

Jenny Luna: The day after the January 6 insurrection, Facebook placed an indefinite ban on posts by Donald Trump. “We believe the risks of allowing the President to continue to use our service during this period are simply too great,” Zuckerberg wrote. Facebook’s Oversight Board upheld the suspension, but not its indefinite term; Facebook then set it at two years. After that, said Nick Clegg, vice president of global affairs, the company would “evaluate external factors, including instances of violence, restrictions on peaceful assembly and other markers of civil unrest.”

Kelsey Doyle: Facebook rejected suggestions that its platform was to blame for the violent attack on the Capitol. In a statement to the Washington Post, the company said:

Male Voice: The notion that the January 6 insurrection would not have happened but for Facebook is absurd. The former President of the United States pushed a narrative that the election was stolen, including in person a short distance from the Capitol building that day. The responsibility for the violence that occurred on January 6 lies with those who attacked our Capitol and those who encouraged them.

Jenny Luna: Facebook increasingly faced allegations of censorship and double standards as it struggled to combat incendiary and divisive posts.

News Clip: We have evidence of them blacklisting, in some cases, conservative news.

News Clip: There was a tremendous bias against conservative news and content. And a favorable bias towards liberal content.

News Clip: There are instances in which your platforms are taking a very distinctively partisan approach, and not a neutral one.

Jenny Luna: As the Wall Street Journal wrote in 2021,

Male Voice: The issue sits in the middle of one of the most sensitive debates around Facebook. Activists on the left have been urging the company to more forcefully block what they see as harmful content. Other activists, largely on the right, accuse Facebook of censorship directed at conservative voices.

Kelsey Doyle: Supporters of Trump strongly criticized Facebook for deplatforming him. On TheFederalist.com, David Marcus argued that Trump, quote “did not, for example, tweet out to his supporters asking them to donate to a bail fund for the rioters, as Vice President-elect Kamala Harris did during the Minneapolis riots. This is a blatant and obvious double standard.” Marcus argued that “very likely owing to the progressive bent of the company itself, it disproportionately hides content from conservative sources.” More broadly, he argued that “Facebook’s product is very powerful: It feeds the news to people all over the world. If it claims simply to be a neutral marketplace of ideas, it has no justification in banning political speech. People deserve an opportunity to engage in free political discourse, not just the discourse that Zuckerberg thinks is appropriate.”

Jenny Luna: Senator Chuck Grassley, a Republican from Iowa, criticized Facebook for flagging a Fox News article he posted in early 2022.

Senator Chuck Grassley: It’s truly mind-blowing that these companies continue to interfere in free expression. Big Tech is silencing everyone they disagree with and clearly they see no check to their power. Why does Facebook and one of its third-party fact-checkers, partners they are, get to make the decision that this news article is considered false information? That decision should be made by the American people who should be able to view that content and decide that fact for themselves.

Jenny Luna: NYU researchers Paul Barrett and J. Grant Sims took issue with claims of anti-conservative bias by Facebook and other platforms, noting that through the 2020 election, Trump and conservative media outlets had vastly more engagement on social media than Biden and liberal media outlets. They argued that “In connection with the Capitol Riot, Facebook and Twitter did not censor an extreme conservative cause, but facilitated it,” adding that post-insurrection bans on Trump “constitute reasonable attempts to forestall additional violence and avoid real risks to the workings of American democracy.”

Kelsey Doyle: Once again, the world was watching events in the USA… and taking notes.

Jenny Luna: Activist Thet Swe Win compared the company’s slow reaction in Myanmar with its handling of the U.S. Capitol insurrection, saying quote, “It took a really long time to respond, but in the U.S. it was just days. Maybe this is the privilege of living in a First World country.” Then, on February 1, 2021, less than a month after the insurrection at the U.S. Capitol, Myanmar’s military seized power in a coup and imprisoned the nation’s democratically elected leaders. Facebook declared an emergency on February 11 and took action, including using its Integrity Operations Center to quote, “bring together subject matter experts from across the company, including Myanmar nationals with native language skills, so we can monitor and respond to any threats in real time.” On February 24, Facebook banned the military from the platform, while working to protect quote, “political speech that allows the people of Myanmar to express themselves and to show the world what is transpiring inside their country.” Protesters and activists who opposed the coup used Facebook to share news and voice their opposition.

News Clip: Her name is Frances Haugen, that is a fact that Facebook has been anxious to know since last month when an anonymous former employee filed complaints with federal law enforcement. The complaints say Facebook’s own research shows that it amplifies hate, misinformation and political unrest, but the company hides what it knows.

Kelsey Doyle: On October 3, 2021, 60 Minutes broadcast a bombshell interview with a whistleblower who had been leaking front-page news for weeks.

Jenny Luna: Frances Haugen served as a product manager on Facebook’s Civic Misinformation team from 2019 to 2021. According to Haugen’s personal website, she became increasingly alarmed by the choices Facebook was making, prioritizing their own profits over public safety and putting people’s lives at risk. Her decision to blow the whistle on Facebook, Haugen wrote, was a last resort.

Frances Haugen: I don’t think Facebook ever set out to intentionally promote divisive, extreme polarizing content. I do think though, that they are aware of the side effects of the choices they have made around amplification.

Jenny Luna: Haugen left Facebook in May 2021 and took with her thousands of pages of internal documents, then filed a federal whistleblower complaint with the U.S. Securities and Exchange Commission. She also provided these documents to the Wall Street Journal and Congress. The Journal published a series of articles based on the leaked documents, which it called “The Facebook Files.” Haugen testified before the U.S. Senate, the British Parliament, and government agencies regarding the impact of Facebook’s products on society.

Frances Haugen: Anger and hate is the easiest way to grow on Facebook.

Kelsey Doyle: In her testimony, Haugen placed considerable emphasis on Facebook’s algorithmic approach, arguing that it lay at the core of the company’s problems.

Frances Haugen: Algorithmic bias issues are a major issue for our democracy.

Kelsey Doyle: The algorithm to which she referred was the system Facebook used to determine the position of a post in each user’s newsfeed.

Frances Haugen: The algorithms are very smart in the sense that they latch onto things that people want to continue to engage with. And unfortunately, in the case of teen girls and things like self-harm, they develop these feedback cycles where children are using Instagram to self-soothe, but then are exposed to more and more content that makes them hate themselves.

Kelsey Doyle: A complex formula assessed a minimum of 10,000 data points, related to both the user and to each available post, to predict what a specific user was most likely to engage with. While the algorithm was tailored to each user, it also reflected the company’s overall strategy regarding what to promote.
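
As a rough illustration of what engagement-prediction ranking looks like in principle, the toy sketch below scores each candidate post by a weighted sum of predicted engagement probabilities and sorts the feed by that score. The three signals, the hand-picked weights, and the post names are all invented for this example; Facebook’s actual ranking system is proprietary and, as noted above, draws on thousands of data points per user and post.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    p_click: float    # predicted probability the user clicks
    p_comment: float  # predicted probability the user comments
    p_share: float    # predicted probability the user shares

# Hypothetical weights: active interactions (comments, shares) count for more
# than passive clicks.
WEIGHTS = {"p_click": 1.0, "p_comment": 5.0, "p_share": 8.0}

def rank_feed(posts):
    """Order posts by a weighted sum of predicted engagement signals."""
    def score(p):
        return (WEIGHTS["p_click"] * p.p_click
                + WEIGHTS["p_comment"] * p.p_comment
                + WEIGHTS["p_share"] * p.p_share)
    return sorted(posts, key=score, reverse=True)

feed = rank_feed([
    Post("calm_news", p_click=0.30, p_comment=0.02, p_share=0.01),
    Post("outrage_bait", p_click=0.25, p_comment=0.10, p_share=0.08),
])
print([p.post_id for p in feed])  # outrage_bait outranks calm_news
```

Under these made-up weights, the post that provokes comments and shares outranks the calmer one even though it draws fewer clicks, which is the dynamic critics like Haugen describe.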

Frances Haugen: Facebook’s own research says they cannot adequately identify dangerous content. And as a result, those dangerous algorithms that they admit are picking up the extreme sentiments, the division. They can’t protect us from the harms that they know exist in their own system.

Jenny Luna: For many years, Facebook’s algorithms had optimized for time spent on the site, and thus gave prominence to articles that people were likely to click on, as well as professionally produced videos. However, in 2018, the company made a significant change, which Zuckerberg described in a post on Facebook, writing:

Male Voice: The first changes you’ll see will be in News Feed, where you can expect to see more from your friends, family, and groups. As we roll this out, you’ll see less public content like posts from businesses, brands, and media. And the public content you see more will be held to the same standard — it should encourage meaningful interactions between people.

Jenny Luna: Haugen told British lawmakers that the changes had failed.

Kelsey Doyle: Other critics argued that the post-2018 algorithm prioritized divisive content. Facebook, meanwhile, insisted in a memo distributed to employees that academic research did not support “the idea that Facebook, or social media more generally, is the primary cause of polarization.”

Jenny Luna: Zuckerberg himself wrote, quote, “The argument that we deliberately push content that makes people angry for profit is deeply illogical. We make money from ads, and advertisers consistently tell us they don’t want their ads next to harmful or angry content.” Clegg, the Facebook VP, told ABC’s George Stephanopoulos that the algorithm actually made Facebook safer.

Nick Clegg: If you were just sort of to, across the board, remove the algorithm, the first thing that would happen is that people would see more, not less hate speech. More, not less misinformation. More, not less harmful content. Why? Because those algorithmic systems precisely are designed like a great sort of giant spam filter to identify and deprecate and downgrade bad content.

Jenny Luna: Separately, Clegg noted that each person’s newsfeed was heavily shaped by individual behavior and that “ultimately, content ranking is a dynamic partnership between people and algorithms.” “On Facebook,” he said, “it takes two to tango.”

Kelsey Doyle: A company statement argued “If it were true that Facebook is the chief cause of polarization, we would expect to see it going up wherever Facebook is popular. It isn’t… Is a ranking change the source of the world’s divisions? No.”

Jenny Luna: But controversies continued to arise all over the world. In Ethiopia, reports provided to CNN revealed that Facebook employees had become concerned that the company was allowing people to use the platform “to incite violence against ethnic minorities.”

Kelsey Doyle: And one of the documents taken by Haugen, the whistleblower, was a report based on a Facebook team’s research trip to Europe.

Jenny Luna: The report said that the long-term health of the company necessitated revisions to the algorithm, likening the emphasis on outrage-generating content to eating junk food. “We can choose to be idle and keep feeding users fast-food, but that only works for so long,” it said. “Many have already caught on to the fact that fast-food is linked to obesity and, therefore, its short-term value is not worth the long-term cost.”

Kelsey Doyle: The report cited specific concerns from Poland. After decades of communist rule, Poland transitioned to democracy in the 1990s. But in 2015, the populist and socially conservative Law and Justice Party came to power, espousing traditional Polish values. In the Facebook team’s report, political parties described a social civil war online, with fierce battle lines drawn on topics like abortion, LGBTQ+ rights, and political issues.

News Clip: A mob of right-wing Poles attacked a pride march in the town of B–, one of several towns that have declared themselves to be LGBT-free.

Jenny Luna: According to Freedom House, nationalist and discriminatory rhetoric had grown, and the government enacted “numerous measures that have… damaged Poland’s democratic progress.” Both major parties blamed social media for worsening Poland’s political polarization and called the situation unsustainable.

Kelsey Doyle: Poland’s far-right Confederation party had the biggest Facebook presence — even though it had only a few seats in Parliament. Tomasz Grabarczyk, the party’s social media director, told the Washington Post that Facebook was “a hate algorithm” and that his party did well with emotional messages. One party leader summarized its platform, saying, “We don’t want Jews, homosexuals, abortion, taxes, and the European Union.”

Jenny Luna: The Confederation party won 11 seats in the 2019 parliamentary election, despite not expecting to win any. Grabarczyk credited his party’s online efforts, explaining, “We did everything on the Internet, everything. I’d say it’s 70 percent thanks to Facebook.”

Kelsey Doyle: Facebook removed the page of one of Confederation’s leaders in 2020, and later banned the entire party for “repeated violations” of its standards on COVID-19 disinformation and hate speech. A Confederation party leader criticized Meta, saying an “American corporation wants to suppress freedom of debate and interfere with democracy and the electoral process in Poland,” and called on the government to pass a proposed law to stop quote, “totalitarian censorship” by social media companies.

Jenny Luna: Prime Minister Mateusz Morawiecki of the Law and Justice Party had previously compared “censorship” by social media companies to the authoritarian regimes that once ruled his country. In a 2021 Facebook post, he wrote:

Male Voice: We lived in a censored country for nearly 50 years, a country where Big Brother told us… what we don’t have the right to think, to say, to write. That is why we look with concern at any attempts to limit freedom. A byword for freedom has always been the Internet — the most democratic medium in history… gradually, large, transnational corporations, richer and more powerful than many countries, have started to dominate… Poland will always uphold democratic values, including freedom of speech. Social media cannot act above the law. Therefore we will do all we can to set out a framework for the functioning of Facebook, Twitter, Instagram and other similar platforms.

Kelsey Doyle: In April 2021, Facebook instituted changes to give users more choice over their newsfeeds, including a “most recent” mode, where new posts appeared first, and the ability to prioritize up to 30 friends or pages. Facebook VP Clegg told NPR, quote, “Everybody accepts that new rules of the road need to be written,” and explained the new features during an interview.

Nick Clegg: What I announced in this article is some additional controls which give people real transparency in how the systems work and allow people to pull levers. So for instance, in a new feature I announced today, you’ll be able to, in effect, override the algorithm and curate your own news feed.

Jenny Luna: In October 2021, the Facebook company officially changed its name to Meta. It framed the rebranding as a shift in focus toward building a virtual world called the metaverse. But the new name hasn’t quieted the debate about social media’s role as a platform for political speech.

Kelsey Doyle: On January 25, 2023, a few months after Twitter reactivated President Trump’s account, Meta announced it would do the same, ending its suspension of Trump and reinstating him on Facebook and Instagram. In a written statement, the company said that, in light of Trump’s violations, he would also face heightened penalties for repeat offenses. It continued, quote, “We believe it is both necessary and possible to draw a line between content that is harmful and should be removed, and content that, however distasteful or inaccurate, is part of the rough and tumble of life in a free society.”

Jenny Luna: We hope the information in this podcase helps you understand how political speech and social media intersect. Our goal is to spark conversations and ideas about business ethics and business’s role in society.

Kelsey Doyle: This podcase was based on the original written case titled, “Facebook and Political Speech,” written by Ken Shotts, professor of political economy at Stanford Graduate School of Business, and case writer, Sheila Melvin. It was produced by Andrew Stelzer and Pablo Woythaler. I’m Kelsey Doyle.

Jenny Luna: And I’m Jenny Luna. Thanks for listening.
