Facebook expanded its mea culpa on Monday for failing to prevent “bad actors” such as Russian entities from using the social media platform to distort democratic processes—notably the U.S. presidential election in 2016.
The company unveiled another series of measures it’s taking to prevent malefactors hiding behind false names from spreading fake and incendiary political stories through the network’s news feed to influence voters.
For example, advertisers will have to unmask their identities and reveal who’s funding their ads, Samidh Chakrabarti, a member of Facebook’s civic engagement team, wrote in a blog post. Company CEO Mark Zuckerberg had already announced Friday that Facebook would try to give more prominent play to reliable news outlets by asking its members which news sources they trust the most.
Debate immediately erupted over whether those measures will fail or even backfire. But perhaps the most striking thing about the current stance taken by Facebook—which had previously denied its platform was harming democracy—is that the company now declares it cannot singlehandedly control the potential damage from the powerful communication mechanisms it has created.
“If there’s one fundamental truth about social media’s impact on democracy it’s that it amplifies human intent—both good and bad. At its best, it allows us to express ourselves and take action. At its worst, it allows people to spread misinformation and corrode democracy,” Chakrabarti writes. “I wish I could guarantee that the positives are destined to outweigh the negatives, but I can’t.”
Despite all the countermeasures Facebook is installing, “the battle will never end,” Chakrabarti wrote, as he referenced what he called “a cyberwar intended to divide society.” Facebook is up against professionalized campaigns constantly trying to game the system, he says.
Struggles like Facebook’s would have been no surprise to 19th-century novelist Mary Shelley, whose enterprising Dr. Frankenstein cobbled together a Creature from human parts, only to see it escape from his control. Immensely taller, stronger, and smarter than Dr. Frankenstein, the Creature at first wanted only to win approval from the human community—but ended up unleashing violence on it. Shelley framed the book as a warning to scientific explorers motivated by ambition.
Facebook is now trying to enlist the community at large to help rein in the dangerous consequences of its creation—an information-sharing platform open to all comers that, critics say, tends to spread stories trading in drama and tribal resentment, rather than well-grounded reports that can help readers make rational decisions. Facebook acknowledged that violence can result.
“Our concerns with political hate speech aren’t limited to the online sphere—we also need to be vigilant that social media doesn’t facilitate offline violence,” Chakrabarti says.
For months, Facebook and Twitter have been taking heat from committees of Congress and other critics for failing to police the messages amplified on their networks as Russian entities manipulated the national political debate online. Both companies have been struggling to curate their content by enforcing community standards governing discourse. But arguably, the attempt has been a minefield of unintended consequences, just as it would be for a benign government that tried its hand at well-intentioned political censorship.
Menlo Park, CA-based Facebook now wants to rely on means such as crowdsourced surveys to rank the trustworthiness of news organizations, rather than acting as a media organization and setting up its own editorial controls.
“In the public debate over false news, many believe Facebook should use its own judgment to filter out misinformation,” Chakrabarti wrote in his blog post Monday. “We’ve chosen not to do that because we don’t want to be the arbiters of truth, nor do we imagine this is a role the world would want for us.”
But The Atlantic’s Alexis Madrigal says Facebook hasn’t shared enough about the survey methodology it’s using to give some news organizations better placement in its News Feed than it gives others. Madrigal wants to know whether the mass surveys will become the sole arbiter of news rankings, and whether any humans will review the results.
“It seems like this system cannot be truly and fully automated,” Madrigal writes. “In particular, what kind of steps has Facebook taken to ensure that this kind of survey can’t be manipulated en masse by some subset of users to tank the rating of news outlets that they don’t like? Surely—hopefully—they have considered this possibility.”
Mathew Ingram, chief digital writer for the Columbia Journalism Review, concluded that Facebook’s new methods for prioritizing news items are likely to give misinformation the edge. Statements by Zuckerberg indicate that Facebook plans to emphasize news reports that generate online conversations, spur reactions, and increase engagement on the site, Ingram writes. Most likely, these will come from “publishers who specialize in either completely fake stories, or stories that have a grain of truth but are wildly exaggerated,” he says.
“This approach, besides putting news judgment in the hands of the same users who were duped into trusting fake news distributed by Russian troll factories, brings with it its own problems—namely, that the question of trust is all wrapped with political ideology, as a recent Knight-Gallup poll found,” Ingram writes. “People trust the outlets they agree with, which makes trust an ineffective metric for measuring the truth.”
Among the other countermeasures unveiled by Facebook’s Chakrabarti:
—Facebook will allow readers to go to an advertiser’s page and see the ads it is currently running. Electoral ads will be archived and searchable. This is apparently designed to help voters check whether political candidates, or their advocates, are changing their messages as they move from one audience to another.
“Micro-targeting can enable dishonest campaigns to spread toxic discourse without much consequence,” Chakrabarti writes. “Democracy then suffers because we don’t get the full picture of what our leaders are promising us.”
—Facebook has made it easier to report false stories to fact-checkers, who can push the stories lower on the News Feed. The site will also provide more “context” on the publishers behind news stories, such as their ownership structures, and policies on fact-checking and ethics.
But such fixes may not satisfy Facebook’s deepest critics, such as Roger McNamee, an early Facebook investor and former mentor to Zuckerberg who believes that societal harms are now an inevitable consequence of Facebook’s current business model.
McNamee, who is co-founder