Facebook plans to announce Friday that it will no longer automatically give politicians a pass when they break the company’s hate speech rules, a major reversal after years of criticism that it was too deferential to powerful figures during the Trump presidency.
Since the 2016 election, the company has applied a test to political speech that weighs the newsworthiness of the content against its propensity to cause harm. Now the company will throw out the first part of the test and will no longer consider newsworthiness as a factor, according to a person familiar with the company’s thinking, who spoke on the condition of anonymity because they were not authorized to speak publicly.
But Facebook doesn’t plan to end the newsworthiness exception entirely. In the cases where an exception is made, the company will now disclose it publicly, the person said — after years of such decisions being closely held. And it will also become more transparent about its strikes system for people who violate its rules.
The moves, first reported by the Verge, are part of a set of responses to the Facebook Oversight Board’s recommendations. The largely independent Facebook-funded body recently ruled on whether the social network should reinstate former president Donald Trump’s account on its service. The company’s responses are the first major test of how a nongovernment watchdog might act as a check on the powerful social network, which is used by 3.45 billion people globally on a monthly basis.
Trump has been suspended from the platform since Jan. 6, when the company determined that his posts incited violence during the Capitol insurrection. But soon after, Facebook turned its decision — which it said would be enforced indefinitely — over to the Oversight Board to decide whether the company made the right call.
After four months of deliberations, the Oversight Board unexpectedly kicked the Trump decision back to the social network, giving it six months to decide whether to ban Trump permanently or reinstate him. It also recommended that the company publish a report about its role in the Jan. 6 riot and make changes to its newsworthiness exception. The company has committed to responding to the board’s recommendations within 30 days.
The Oversight Board ruled that Facebook was right to suspend Trump in the moment. But it said that Facebook had not provided a rationale for making the suspension indefinite, noting that indefinite suspensions are not part of Facebook’s policies. That rationale needed to be clear and transparent, the board said.
Publicly, Facebook executives have deflected blame for the events at the Capitol onto other companies. The Washington Post and others have reported, however, that rioters used Facebook to help organize.
The Post reported last year that the newsworthiness exception was first created in response to Trump’s inflammatory remarks about Muslims during his candidacy. Since then, the company has maintained that it rarely used the exception and has acknowledged using it only six times. Those incidents were all outside the United States, and include political speech in Hungary, Vietnam and Italy.
In practice, however, Facebook has appeared to give politicians and political leaders a pass in many more instances. In 2019, CEO Mark Zuckerberg said the company would not apply its fact-checking to political ads, for example.
And throughout his presidency, Trump repeatedly flooded the platform with misinformation. He promoted baseless claims of voter fraud and repeatedly stated without evidence that the 2020 election was stolen. Facebook chose to append a generic label to most of that content rather than ban it.
The strikes system is an even more opaque area of Facebook’s policies and practices than the newsworthiness exception. Users can be censored or demoted after a certain number of strikes for breaking rules. But the company has said it does not want to share its policing strategies for fear that doing so will enable loopholes.
The result, critics said, was an arbitrary system. People whose content was removed often did not know what rule they had broken, and seemingly routine violators sometimes appeared to be treated with kid gloves.
Facebook’s response to the Oversight Board is being watched as a key test for the possibility of self-regulation by powerful social media companies. Facebook and other Silicon Valley giants are facing a wave of potential new regulation over issues such as privacy and algorithmic transparency all over the world, as well as a major antitrust lawsuit in the United States.
If the Facebook-created board is viewed as a legitimate check on the company’s power, experts have said it could become a model for countries looking at ways to regulate how social media companies police content on their platforms, or for other companies in a similar position. But it also could make the need for regulation seem less urgent because a solution already exists, they said.
In 2018, Zuckerberg — under immense political pressure over the company’s content moderation practices — presented the idea for an independent body that would oversee controversial decisions made by the social network. The idea was to put a check on the social network’s power, which was being roundly criticized by government officials, academics and the public over allowing the spread of Russian disinformation, inflammatory political discourse and hate speech.
Facebook funded the Oversight Board through an independent trust and selects its members but has given it the power to make binding decisions on content that the board determines has been wrongly removed or kept up. The 20-member board also can issue voluntary policy recommendations. Members include a Nobel laureate, free-speech experts, and a former Danish prime minister.
Trump also has been suspended indefinitely from YouTube, the gaming platform Twitch, Snapchat and other platforms, and has been banned from Twitter over the same set of comments from Jan. 6.
Trump built one of the world’s most powerful and passionate online audiences during his tenure as president. But researchers have shown that he has not been able to garner the same level of online attention since he was taken off mainstream platforms. He recently turned to using his own website to put out statements, but his team shut it down this week.