Roiled by election, Facebook struggles to balance civility and growth
By Kevin Roose, Mike Isaac and Sheera Frenkel
In the tense days after the presidential election, a team of Facebook employees presented the chief executive, Mark Zuckerberg, with an alarming finding: Election-related misinformation was going viral on the site.
President Donald Trump was already casting the election as rigged, and stories from right-wing media outlets with false and misleading claims about discarded ballots, miscounted votes and skewed tallies were among the most popular news stories on the platform.
In response, the employees proposed an emergency change to the site’s news feed algorithm, which helps determine what more than 2 billion people see every day. It involved emphasizing the importance of what Facebook calls “news ecosystem quality” scores, or NEQ, a secret internal ranking it assigns to news publishers based on signals about the quality of their journalism.
Typically, NEQ scores play a minor role in determining what appears on users’ feeds. But several days after the election, Zuckerberg agreed to increase the weight that Facebook’s algorithm gave to NEQ scores to make sure authoritative news appeared more prominently, said three people with knowledge of the decision, who were not authorized to discuss internal deliberations.
The change was part of the “break glass” plans Facebook had spent months developing for the aftermath of a contested election. It resulted in a spike in visibility for big, mainstream publishers like CNN, The New York Times and NPR, while posts from highly engaged hyperpartisan pages, such as Breitbart and Occupy Democrats, became less visible, the employees said.
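The weighting change described above can be pictured as a toy re-ranking exercise. Everything in this sketch is a hypothetical assumption for illustration only: the field names, score ranges, and the `neq_weight` parameter are invented here, and Facebook's actual ranking system is proprietary and not public. The point is simply that raising the weight on a publisher-quality signal can reorder a feed that engagement alone would rank differently.

```python
# Toy illustration of blending an engagement signal with a publisher-quality
# signal in feed ranking. All names, weights, and score ranges are
# hypothetical assumptions; this is not Facebook's actual algorithm.

from dataclasses import dataclass

@dataclass
class Post:
    publisher: str
    engagement: float  # assumed predicted-engagement signal, 0..1
    neq: float         # assumed publisher-quality score, 0..1

def rank_feed(posts, neq_weight=0.2):
    """Order posts by a blended score; a larger neq_weight makes
    higher-quality publishers more prominent in the feed."""
    return sorted(posts,
                  key=lambda p: p.engagement + neq_weight * p.neq,
                  reverse=True)

feed = [
    Post("HyperpartisanPage", engagement=0.9, neq=0.1),
    Post("MainstreamOutlet", engagement=0.7, neq=0.9),
]

# With a small NEQ weight, raw engagement dominates the ordering.
normal = rank_feed(feed, neq_weight=0.2)

# A "break glass" setting: increase the weight so the quality signal
# can outrank engagement for low-quality but highly engaging posts.
break_glass = rank_feed(feed, neq_weight=0.5)
```

Under these invented numbers, the hyperpartisan post ranks first at the low weight (0.9 + 0.2 x 0.1 = 0.92 vs. 0.88), while the mainstream post wins at the higher weight (1.15 vs. 0.95), mirroring the spike in visibility for mainstream publishers the article describes.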
It was a vision of what a calmer, less divisive Facebook might look like. Some employees argued the change should become permanent, even if it was unclear how that might affect the amount of time people spent on Facebook. In an employee meeting the week after the election, workers asked whether the “nicer news feed” could stay, said two people who attended.
Guy Rosen, a Facebook executive who oversees the integrity division that is in charge of cleaning up the platform, said on a call with reporters last week that the changes were always meant to be temporary. “There has never been a plan to make these permanent,” he said. John Hegeman, who oversees the news feed, said in an interview that while Facebook might roll back these experiments, it would study and learn from them.
The news feed debate illustrates a central tension that some inside Facebook are feeling acutely these days: that the company’s aspirations of improving the world are often at odds with its desire for dominance.
In the past several months, as Facebook has come under more scrutiny for its role in amplifying false and divisive information, its employees have clashed over the company’s future. On one side are idealists, including many rank-and-file workers and some executives, who want to do more to limit misinformation and polarizing content. On the other side are pragmatists who fear those measures could hurt Facebook’s growth, or provoke a political backlash that leads to painful regulation.
“There are tensions in virtually every product decision we make and we’ve developed a companywide framework called ‘Better Decisions’ to ensure we make our decisions accurately, and that our goals are directly connected to delivering the best possible experiences for people,” said Joe Osborne, a Facebook spokesperson.
These battles have taken a toll on morale. In an employee survey this month, Facebook workers reported feeling less pride in the company compared to previous years. About half felt that Facebook was having a positive impact on the world, down from roughly three-quarters earlier this year, according to a copy of the survey, known as Pulse, which was reviewed by The New York Times. Employees’ “intent to stay” also dropped, as did confidence in leadership.
BuzzFeed News previously reported on the survey results.
Even as Election Day and its aftermath have passed with few incidents, some disillusioned employees have quit, saying they could no longer stomach working for a company whose products they considered harmful. Others have stayed, reasoning they can make more of a difference on the inside. Still others have made the moral calculation that even with its flaws, Facebook is, on balance, doing more good than harm.
“Facebook salaries are among the highest in tech right now, and when you’re walking home with a giant paycheck every two weeks, you have to tell yourself that it’s for a good cause,” said Gregor Hochmuth, a former engineer with Instagram, which Facebook owns, who left in 2014. “Otherwise, your job is truly no different from other industries that wreck the planet and pay their employees exorbitantly to help them forget.”
Several employees said they were frustrated that to tackle thorny issues like misinformation, they often had to demonstrate that their proposed solutions wouldn’t anger powerful partisans or come at the expense of Facebook’s growth.
Facebook’s moves to clean up its platform will be made easier, in some ways, by the end of the Trump administration. For years, Trump and other leading conservatives accused the company of anti-conservative bias each time it took steps to limit misinformation.
But even with an incoming Biden administration, Facebook will need to balance employees’ desire for social responsibility with its business goals.
“The question is, what have they learned from this election that should inform their policies in the future?” said Vanita Gupta, chief executive of the civil rights group Leadership Conference on Civil and Human Rights. “My worry is that they’ll revert all of these changes despite the fact that the conditions that brought them forward are still with us.”