Disinformation for hire, a shadow industry, is quietly booming
By Max Fisher
In May, several French and German social media influencers received a strange proposal.
A London-based public relations agency wanted to pay them to promote messages on behalf of a client. A polished three-page document detailed what to say and on which platforms to say it.
But it asked the influencers to push not beauty products or vacation packages, as is typical, but falsehoods tarring Pfizer-BioNTech’s COVID-19 vaccine. Stranger still, the agency, Fazze, claimed a London address where there is no evidence any such company exists.
Some recipients posted screenshots of the offer. Exposed, Fazze scrubbed its social media accounts. That same week, Brazilian and Indian influencers posted videos echoing Fazze’s script to hundreds of thousands of viewers.
The scheme appears to be part of a secretive industry that security analysts and U.S. officials say is exploding in scale: disinformation for hire.
Private firms, straddling traditional marketing and the shadow world of geopolitical influence operations, are selling services once conducted principally by intelligence agencies.
They sow discord, meddle in elections, seed false narratives and push viral conspiracies, mostly on social media. And they offer clients something precious: deniability.
“Disinfo-for-hire actors being employed by government or government-adjacent actors is growing and serious,” said Graham Brookie, director of the Atlantic Council’s Digital Forensic Research Lab, calling it “a boom industry.”
Similar campaigns have been recently found promoting India’s ruling party, Egyptian foreign policy aims and political figures in Bolivia and Venezuela.
Brookie’s organization tracked one such campaign operating amid a mayoral race in Serra, a small city in Brazil. An ideologically promiscuous Ukrainian firm boosted several competing political parties.
In the Central African Republic, two separate operations flooded social media with dueling pro-French and pro-Russian disinformation. Both powers are vying for influence in the country.
A wave of anti-American posts in Iraq, seemingly organic, was tracked to a public relations company that was separately accused of faking anti-government sentiment in Israel.
Most trace to back-alley firms whose legitimate services resemble those of a bottom-rate marketer or email spammer.
Job postings and employee LinkedIn profiles associated with Fazze describe it as a subsidiary of a Moscow-based company called Adnow. Some Fazze web domains are registered as owned by Adnow, as first reported by the German outlets Netzpolitik and ARD Kontraste. Third-party reviews portray Adnow as a struggling ad service provider.
European officials say they are investigating who hired Adnow. Sections of Fazze’s anti-Pfizer talking points resemble promotional materials for Russia’s Sputnik V vaccine.
For-hire disinformation, though only sometimes effective, is growing more sophisticated as practitioners iterate and learn. Experts say it is becoming more common in every part of the world, outpacing operations conducted directly by governments.
The result is an accelerating rise in polarizing conspiracies, phony citizen groups and fabricated public sentiment, eroding our shared reality beyond even the depths of recent years.
An open frontier
The trend emerged after the Cambridge Analytica scandal in 2018, experts say. Cambridge, a political consulting firm linked to members of Donald Trump’s 2016 presidential campaign, was found to have harvested data on millions of Facebook users.
The controversy drew attention to methods common among social media marketers. Cambridge used its data to target hyperspecific audiences with tailored messages. It tested what resonated by tracking likes and shares.
The episode taught a generation of consultants and opportunists that there was big money in social media marketing for political causes, all disguised as organic activity.
Some newcomers eventually reached the same conclusion as Russian operatives had in 2016: Disinformation performs especially well on social platforms.
At the same time, backlash to Russia’s influence-peddling appeared to have left governments wary of being caught — while also demonstrating the power of such operations.
“There is, unfortunately, a huge market demand for disinformation,” Brookie said, “and a lot of places across the ecosystem that are more than willing to fill that demand.”
Commercial firms conducted for-hire disinformation in at least 48 countries last year — nearly double from the year before, according to an Oxford University study. The researchers identified 65 companies offering such services.
New technology enables nearly anyone to get involved. Programs batch-generate fake accounts with hard-to-trace profile photos. Instant metrics help to hone effective messaging. So does access to users’ personal data, which is easily purchased in bulk.
The campaigns are rarely as sophisticated as those by government hackers or specialized firms like the Kremlin-backed Internet Research Agency.
But they appear to be cheap. In countries that mandate campaign finance transparency, firms report billing tens of thousands of dollars for campaigns that also include traditional consulting services.
The layer of deniability frees governments to sow disinformation more aggressively, at home and abroad, than might otherwise be worth the risk. Some contractors, when caught, have claimed they acted without their client’s knowledge or only to win future business.
Platforms have stepped up efforts to root out coordinated disinformation. Analysts especially credit Facebook, which publishes detailed reports on campaigns it disrupts.
Still, some argue that social media companies also play a role in worsening the threat. Engagement-boosting algorithms and design elements, research finds, often privilege divisive and conspiratorial content.
Political norms have also shifted. A generation of populist leaders, like Rodrigo Duterte of the Philippines, has risen in part through social media manipulation. Once in office, many institutionalize those methods as tools of governance and foreign relations.
In India, dozens of government-run Twitter accounts have shared posts from India Vs Disinformation, a website and set of social media feeds that purport to fact-check news stories on India.
India Vs Disinformation is, in reality, the product of a Canadian communications firm called Press Monitor.
Nearly all the posts seek to discredit or muddy reports unfavorable to Prime Minister Narendra Modi’s government, including on the country’s severe COVID-19 toll. An associated site promotes pro-Modi narratives under the guise of news articles.
A Digital Forensic Research Lab report investigating the network called it “an important case study” in the rise of “disinformation campaigns in democracies.”
But governments may find that outsourcing such shadowy work also carries risks, Brookie said. For one, the firms are harder to control and might veer into undesired messages or tactics.
For another, firms organized around deceit may be just as likely to turn those energies toward their clients, bloating budgets and billing for work that never gets done.
“The bottom line is that grifters are going to grift online,” he said.