By Shelby Grossman & Khadeja Ramali
In December 2019, Twitter suspended 88,000 accounts that were pushing narratives aligned with positions of the government of Saudi Arabia.
These accounts had published tweets that, among other things, denigrated Jamal Khashoggi, the journalist who was murdered inside the Saudi consulate in Istanbul.
What made this social media takedown notable was that Twitter attributed the pro-Saudi messaging operation not to the government of Saudi Arabia but, rather, to SMAAT, a Saudi digital marketing firm with a mix of political and corporate clients.
SMAAT was co-founded by Ahmed Almutairi, an agent of the Saudi royal family who also recruited two Twitter employees to spy on the accounts of critics of the Saudi government.
While the details of this case are intriguing, in many ways SMAAT’s Twitter network is illustrative of the new normal: a disinformation campaign linked to a digital marketing firm.
We are increasingly seeing state actors outsourcing their disinformation operations to these companies.
Outsourcing an influence campaign to a private company has benefits for national governments.
The primary advantage is that it gives the government a level of plausible deniability.
If the operation is uncovered, government actors can claim that it was simply a rogue social media marketing agency, and that they had nothing to do with the activities.
Similarly, if one firm gets banned from social platforms, governments can switch to working with a new one.
This strategy is effective.
Facebook and Twitter, the two platforms that are most transparent about the political disinformation networks they uncover, increasingly attribute operations not to governments, but to firms like SMAAT, even though there are often strong suspicions that the firms are working at the behest of state actors.
Like SMAAT, many of these digital marketing firms are headed by individuals with one foot in the media marketing world and one in the government.
For example, in 2019, Facebook suspended a network of accounts linked to a firm called New Waves, based in Egypt, a country that hosts many such firms.
New Waves’s founder, Amr Hussein, is a former Egyptian army officer and previously worked at Al Bawaba News, a private pro-government Egyptian newspaper.
He currently calls himself a social media consultant and is regularly hosted on Egyptian news shows where, ironically, he discusses the harms of social media and calls for greater restrictions on social media use.
Information operations like those run by SMAAT and New Waves can have multiple objectives that can include ginning up conflict or creating the illusion of popular support for unpopular policies.
New Waves created Facebook pages that resembled local news media to push political messaging aligned with several Gulf governments.
The company ran campaigns targeting people living through regional conflicts, including in Sudan and Libya, and aimed to stoke political tension and divisions within an already complex geopolitical region.
Following Sudan’s 2019 coup, the company also worked to boost the image of Sudan’s military regime and create the impression that it had grassroots support, a goal shared by Saudi Arabia and the United Arab Emirates, which backed the regime.
In addition to deniability, outsourcing can offer foreign campaigns other benefits, including lower costs and local knowledge.
In October 2019, Facebook suspended pages targeting Libyans that were linked to Russian businessman Yevgeny Prigozhin.
There is some evidence that an Egyptian digital marketing firm created the content for these pages. This likely saved Prigozhin money, as Egypt is a source of cheap labor, and provided him access to people who may have a better understanding of local politics.
U.S. companies have engaged in these social media disinformation campaigns as well.
In August 2020, Facebook suspended a network of accounts linked to CLS Strategies, a U.S.-based communications firm. This firm worked on behalf of the Bolivian government, which at the time was headed by Interim President Jeanine Áñez, who was considering a run for president during the country’s 2020 elections.
The Facebook pages supported Áñez and had page titles like “Todos con Áñez” (Everyone with Áñez).
These activities are noteworthy for the damage they could cause to U.S. foreign policy.
As long as a public relations firm discloses that it is working for a foreign government—and at least in the case of CLS Strategies, it did disclose that it was working for the Bolivian government—the U.S. government is not able to limit the firm’s role in pursuing these sorts of online disinformation campaigns.
The use of digital marketing firms provides unique and as yet underutilized opportunities for civil society advocacy.
These firms typically have corporate clients in addition to their political clients. For example, Twitter’s SMAAT takedown included tweets not just about Khashoggi but also about Dunkin’ Donuts, one of SMAAT’s other clients. SMAAT’s tweets had defended the brand against a scandal over an advertisement in which Dunkin’ Donuts used a four-finger hand gesture to communicate how cheap its coffee was, a gesture that has been used by the Muslim Brotherhood.
Advocacy groups could pressure companies like Dunkin’ Donuts to avoid working with digital marketing firms that have been linked to disinformation operations.
As social media companies increasingly attribute disinformation campaigns to digital marketing firms, journalists can play an important role in uncovering the firms’ connections to state actors and provide insights into how these firms function.
Public understanding of these campaigns would be enhanced if we knew which ministry or politician managed the contract, and tech companies are not always well placed to make such fine-grained attribution.
Investigative work, such as that of the CNN reporting team led by Clarissa Ward that uncovered details about the workings of a Russian troll farm in Ghana, can shed light on these questions and on how outsourcing is evolving.
Private actors are adapting their online disinformation campaigns to avoid detection as social media companies continue to target them with takedowns and suspensions, and governments will likely continue to work through these private firms because they benefit from the indirect approach.
Understanding these practices and the intent behind them is necessary to identify and address them.
Shelby Grossman is a research scholar at the Stanford Internet Observatory. Her research focuses on online disinformation targeting Africa. Grossman holds a doctorate in political science from Harvard University.
Khadeja Ramali is an independent social media researcher with an interest in the intersection of online spaces, culture and political discourse. Ramali’s work focuses on Arabic language online spaces and foreign interference campaigns in the Middle East and North Africa region.