White House dispute exposes Facebook blind spot on misinformation
By Sheera Frenkel
At the start of the pandemic, a group of data scientists at Facebook held a meeting with executives to ask for resources to help measure the prevalence of misinformation about COVID-19 on the social network.
The data scientists said figuring out how many Facebook users saw false or misleading information would be complex, perhaps taking a year or more, according to two people who participated in the meeting. But they added that by putting some new hires on the project and reassigning some existing employees to it, the company could better understand how incorrect facts about the virus spread on the platform.
The executives never approved the resources, and the team was never told why, according to the people, who requested anonymity because they were not authorized to speak to reporters.
Now, more than a year later, Facebook has been caught in a firestorm about the very type of information that the data scientists were hoping to track.
The White House and other federal agencies have pressed the company to hand over data about how anti-vaccine narratives spread online, and have accused Facebook of withholding key information. President Joe Biden on Friday accused the company of “killing people” by allowing false information to circulate widely. On Monday, he walked that back slightly, instead directing blame at people who originate falsehoods.
“Anyone listening to it is getting hurt by it,” Biden said. He said he hoped that instead of “taking it personally,” Facebook would “do something about the misinformation.”
The company has responded with statistics on how many posts containing misinformation it has removed, as well as how many Americans it has directed to factual information about the government’s pandemic response. In a blog post Saturday, Facebook asked the Biden administration to stop “finger-pointing” and blaming the company after the administration missed its goal of vaccinating 70% of American adults by July 4.
“Facebook is not the reason this goal was missed,” Guy Rosen, Facebook’s vice president of integrity, said in the post.
But the pointed back-and-forth struck an uncomfortable chord for the company: It doesn’t actually know many specifics about how misinformation about the coronavirus, and the vaccines to combat it, has spread. That blind spot has reinforced concerns among misinformation researchers over Facebook’s selective release of data, and how aggressively — or not — the company has studied misinformation on its platform.
“The suggestion we haven’t put resources toward combating COVID misinformation and supporting the vaccine rollout is just not supported by the facts,” said Dani Lever, a Facebook spokesperson. “With no standard definition for vaccine misinformation, and with both false and even true content (often shared by mainstream media outlets) potentially discouraging vaccine acceptance, we focus on the outcomes — measuring whether people who use Facebook are accepting of COVID-19 vaccines.”
Executives at Facebook, including its CEO, Mark Zuckerberg, have said the company committed itself to removing COVID-19 misinformation when the pandemic began. The company said it had removed more than 18 million pieces of COVID-19 misinformation since the start of the pandemic.
Experts who study disinformation said the number of pieces that Facebook removed was not as informative as how many were uploaded to the site, or in which groups and pages people were seeing the spread of misinformation.
“They need to open up the black box that is their content ranking and content amplification architecture. Take that black box and open it up for audit by independent researchers and government,” said Imran Ahmed, CEO of the Center for Countering Digital Hate, a nonprofit that aims to combat disinformation. “We don’t know how many Americans have been infected with misinformation.”
Ahmed’s group, using publicly available data from CrowdTangle, a Facebook-owned program, found that 12 people were responsible for 65% of the COVID-19 misinformation on Facebook. The White House, including Biden, has repeated that figure in the past week. Facebook says it disagrees with the characterization of the “disinformation dozen,” adding that some of their pages and accounts were removed, while others no longer post content that violates Facebook’s rules.
Renée DiResta, a disinformation researcher at Stanford’s Internet Observatory, called on Facebook to release more granular data, which would allow experts to understand how false claims about the vaccine were affecting specific communities within the country. The information, which is known as “prevalence data,” essentially looks at how widespread a narrative is, such as what percentage of people in a community on the service see it.
“The reason more granular prevalence data is needed is that false claims don’t spread among all audiences equally,” DiResta said. “In order to effectively counter specific false claims that communities are seeing, civil society organizations and researchers need a better sense of what is happening within those groups.”
Many employees within Facebook have made the same argument. Brian Boland, a former Facebook vice president in charge of partnerships strategy, told CNN on Sunday that he had argued while at the company that it should publicly share as much information as possible. When asked about the dispute with the White House over COVID-19 misinformation, he said, “Facebook has that data.”
“They look at it,” Boland said. But he added: “Do they look at it the right way? Are they investing in the teams as fully as they should?”
Boland’s comments were widely repeated as evidence that Facebook has the requested data but is not sharing it. He did not respond to a request for comment from The New York Times, but one of the data scientists who pushed inside Facebook for deeper study of coronavirus misinformation said the problem was more about whether and how the company studied the data.
Technically, the person said, the company has data on all content that moves through its platforms. But measuring and tracking COVID-19 misinformation first requires defining and labeling what qualifies as misinformation, something the person said the company had not dedicated resources toward.