MENLO PARK, Calif. - Bracing for a contentious election with no immediate results and possible “civil unrest,” Facebook is enacting a host of measures to ensure its platform is not used to sow chaos and spread misinformation before, during and after the U.S. presidential election.
But it's not clear the changes are enough.
The company said Thursday it will restrict new political ads in the week before the election and remove posts that convey misinformation about COVID-19 and voting. It will also attach links with official results to posts by candidates and campaigns that prematurely declare victory.
“This election is not going to be business as usual. We all have a responsibility to protect our democracy,” Facebook CEO Mark Zuckerberg said in a post on Thursday. “That means helping people register and vote, clearing up confusion about how this election will work, and taking steps to reduce the chances of violence and unrest.”
Some activists hailed the new policies but said the onus will be on Facebook to enforce them. And others were skeptical that they'll really make a difference.
“Voting starts in North Carolina tomorrow. Election Day isn’t in two months, it’s tomorrow and every day after. Which means voters in that state and many others that vote early will be subject to months of dishonest ads on Facebook’s platform," said Shaunna Thomas, co-founder and executive director of UltraViolet, a women's organization critical of Facebook.
She called the announcement a “PR stunt designed to distract from the fact that Facebook is the single biggest vector of dangerous misinformation and voter suppression campaigns in the United States.”
Siva Vaidhyanathan, a Facebook expert at the University of Virginia, said the company again proved itself incapable of effectively snuffing out dangerous misinformation last week when it failed to remove postings by right-wing militia organizers urging supporters with rifles to converge on Kenosha, Wisconsin.
“Facebook’s biggest problem has always been enforcement,” he said. “Even when it creates reasonable policies that seem well-meaning, it gets defeated by its own scale. So I am not optimistic that this will be terribly effective.”
Facebook and other social media companies are being scrutinized over how they handle misinformation, given instances of President Donald Trump and other candidates posting false information and Russia's ongoing attempts to interfere in U.S. politics.
Facebook has long been criticized for not fact-checking political ads or limiting how they can be targeted at small groups of people.
With the nation divided, and election results potentially taking days or weeks to be finalized, there could be an “increased risk of civil unrest across the country," Zuckerberg said.
Civil rights groups said they directly pitched Zuckerberg and other Facebook executives to make many of the changes announced Thursday.
“These are really significant steps but everything is going to depend on the enforcement,” said Vanita Gupta, who was head of the Obama Justice Department’s Civil Rights Division and now leads the Leadership Conference on Civil and Human Rights. “I think they’re going to be tested on it pretty soon."
In July, Trump refused to publicly commit to accepting the results of the upcoming election as he scoffed at polls that showed him lagging behind Democratic rival Joe Biden. Trump also has made false claims that the increased use of mail-in voting because of the coronavirus pandemic allows for voter fraud. That has raised concern over the willingness of Trump and his supporters to abide by election results.
Under the new measures, Facebook says it will prohibit politicians and campaigns from running new election ads in the week before the election. However, they can still run existing ads and change how those ads are targeted. Many voters, meanwhile, are expected to vote by mail well ahead of Election Day.
Trump campaign spokeswoman Samantha Zager criticized the ban on new political ads, saying it would prevent Trump from defending himself on the platform in the last seven days of the presidential campaign.
Posts with obvious misinformation about voting policies and the coronavirus pandemic will also be removed. On Messenger, Facebook's messaging app, users will be limited to forwarding articles to a maximum of five others. The company also will work with Reuters to provide official election results and make the information available both on its platform and through push notifications.
After being caught off-guard by Russia’s efforts to interfere in the 2016 U.S. presidential election, Facebook, Google, Twitter and other companies put safeguards in place to prevent it from happening again. That includes taking down posts, groups and accounts that engage in “coordinated inauthentic behavior" and strengthening verification procedures for political ads. Last year, Twitter banned political ads altogether.
Zuckerberg said Facebook had removed more than 100 networks worldwide engaging in such interference over the last few years.
“Just this week, we took down a network of 13 accounts and two pages that were trying to mislead Americans and amplify division,” he said.
But experts and Facebook’s own employees have said the measures have not been enough to stop the spread of misinformation, including from politicians. Internal dissent among Facebook employees might have helped influence Zuckerberg’s decision to do something, said Joan Donovan, a disinformation researcher at Harvard University.
“This is a huge about-face for Facebook in this moment because for so long they said they were unwilling to moderate political speech and now at this stage they are drawing very sharp lines and I think that’s because their company cannot survive another four-year scandal,” she said.
Facebook had previously drawn criticism for its ads policy, which cited freedom of expression as the reason for letting politicians like Trump post false information about voting.
Associated Press Writers Matt O'Brien, Barbara Ortutay and Frank Bajak contributed to this report.