Renée DiResta, Stanford University; Josh A. Goldstein, Stanford University; and Shelby Grossman, Stanford University
The 2009 Green Movement in Iran and 2011 Arab Spring uprisings across the Middle East and North Africa (MENA) region showed governments the power of social media activism and its potential threat to regime stability. Early media coverage and academic research posited that the new platforms would be democratizing. However, in the years that followed, the region’s governments transformed from passive targets of social media mobilization to active online agents themselves—shaping and constraining public opinion for their own political ends.
The regimes incorporated social media activities into their own domestic and foreign policy toolkits; social networks became yet another broadcast channel upon which to communicate state messages and transmit propaganda. The affordances of social media enabled a range of novel tactics for covert information operations in particular: profiles absent verification, for example, can be used to create personas to conduct agent-of-influence activities. Because much of the world now has accounts on the largest social media platforms, such as Facebook and Twitter, state actors can target the citizens of other nations directly. For instance, one information operation originating in Iran focused on countries ranging from Bosnia and Bangladesh to Mauritania and Morocco to Senegal and Sudan.
When platforms identify this type of manipulative activity targeting users, they take it down, removing (in the case of Facebook, for example) the Pages as well as the accounts identified as active participants. Assessing these operations is important for a number of reasons: they are happening (or, at least, being detected) with increasing frequency, and they have the potential to destabilize countries or exacerbate geopolitical tensions. The dominant model for thinking about state-sponsored influence operations remains what is sometimes called the “Russian playbook,” after the activity attributed to Russia’s Internet Research Agency. In its 2014-2017 operation targeting the United States, the Internet Research Agency used fake personas masquerading as citizens of the targeted country, front news media that on the surface appeared to be activist publications, cross-platform deployment of both personas and fronts, and attempts to leverage distinct facets of identity (race, gender, religion, etc.) to create tension with those from different demographic or ideological groups. However, studies of information operations initiated by other state actors reveal a breadth of tactics: examination of networks attributed to the Chinese Communist Party, for example, suggests that it does not invest the time or effort required to develop convincing personas, but instead creates new clusters of accounts to address particular topics, often using them as amplifiers rather than as message initiators. Another novel approach appeared in a network attributed to the Pakistani military, which leveraged many real accounts (alongside some fake ones) to create fan Pages for the military. The accounts mass-reported perceived enemies of the government to silence counter-speech. So, the “Russian playbook,” if one exists at all, is but one approach.
In this paper we take information operations originating in the MENA region as a class. Since most MENA operations have not targeted Western elections—where much of the research effort has focused—they remain relatively understudied, as Unver notes in this collection. We ask: what are the trends, tactics, and promoted narratives from the networks disrupted in the MENA takedowns? Our goal is not to conduct new research on specific campaigns, but to contribute to a broader effort to look across existing information operations research for emerging themes and trends. Is there a discernible “playbook” common to individual country or regional political operations?
Data and Methods
To answer these questions, this paper examines a dataset of all known Facebook and Twitter takedowns centered on the MENA region. We built the dataset in three steps. First, we sifted through all takedown announcements by Facebook and Twitter and identified MENA-centered takedowns. A takedown is considered “MENA-centered” if both the attributed country of origin and at least one target country are from the MENA region. To assess an operation’s location of origin, we rely on attributions made by Facebook and Twitter. Assessing the target was more difficult; although Facebook and Twitter often include in their takedown announcements the “focus” of an operation, what a network of accounts discusses and whom it targets are two distinct questions. Who, for example, is the target of an information operation in Arabic that discusses Libya? It may be Libyans, but it could also be regional governments; information operations frequently try to convince governments to act in a manner favorable to the perpetrator. We estimated the target based on language and the most frequently discussed topics.
Second, we coded key variables for each takedown: actor attribution, target audiences, platforms used, size of the network, number of followers amassed, narratives promoted, and tactics used to advance those narratives. To assess these variables, we relied on platform and researcher takedown reports for individual operations. Since August 2018, Facebook and Twitter have announced dozens of takedowns with brief summaries of the influence operation activity and the reason for removal. The platforms also partner with third-party cybersecurity research firms and academic institutions, which publish deeper independent analyses of the networks. Coding some attributes—like the number of accounts removed in the takedown—was straightforward, while coding other variables—like tactics—was more challenging. If a social media platform or a researcher report noted that a tactic was used, we concluded that the operation included that tactic. However, the absence of a mention in a platform or researcher report does not necessarily imply that a tactic was not used; our coding of tactics should thus be considered a lower bound on total uses.
Third, in cases where researcher reports were not available, we looked at the hashed Twitter takedowns directly. This approach was not possible for Facebook takedowns without researcher reports, as Facebook does not provide a public archive of information removed in takedowns.
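The selection and coding logic described in these steps can be sketched in a few lines of code. This is purely an illustration of the stated criteria; the records, field names, and tactic labels below are hypothetical and are not drawn from the authors' actual dataset.

```python
# Hypothetical sketch of the dataset-construction logic described above.
# The takedown records below are illustrative, not real data.
MENA = {"Iran", "Egypt", "UAE", "Saudi Arabia", "Morocco", "Tunisia",
        "Yemen", "Iraq", "Israel", "Palestine", "Qatar", "Libya"}

takedowns = [
    {"origin": "Iran", "targets": ["Saudi Arabia", "US"], "tactics": ["front media"]},
    {"origin": "Russia", "targets": ["US"], "tactics": ["personas"]},
    {"origin": "Egypt", "targets": ["Libya", "Qatar"], "tactics": []},
]

def is_mena_centered(t):
    # Step 1: origin must be in MENA AND at least one target must be in MENA.
    return t["origin"] in MENA and any(c in MENA for c in t["targets"])

dataset = [t for t in takedowns if is_mena_centered(t)]

# Step 2: tactic counts are a lower bound -- a tactic is counted only when a
# platform or researcher report explicitly mentions it.
uses_front_media = sum("front media" in t["tactics"] for t in dataset)
print(len(dataset), uses_front_media)  # prints: 2 1
```

The key design point the sketch captures is that silence in a report is not treated as evidence of a tactic's absence, which is why the resulting counts are lower bounds.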
The result is a dataset of 46 information operations, originating from ten MENA countries, whose removals were announced by Facebook or Twitter between August 2018 and March 2021.
While comprehensive in terms of MENA takedowns announced by Facebook and Twitter, our dataset is not necessarily representative of all information operations in the MENA region, as shown by Jones in his contribution to this collection. Some operations may have evaded detection; others may not have been disclosed to the public.
Iran was the most frequent country of origin; 20 of the 46 takedowns in the dataset originated from Iran. Egypt was second (10 takedowns), the UAE third (6), and Saudi Arabia fourth (5). The coding of country of origin does not mean that the operation was directed by the government in that particular country, but simply that Facebook or Twitter reported that the accounts originated from the country in question.
We examined attributions made by Facebook and Twitter for each Middle East takedown, shown in Table 1. Among the 46 takedowns, 24% were linked to a government and 26% were attributed to a marketing, PR, or IT firm. The use of marketing firms is not unique to the Middle East. Governments increasingly outsource influence operations to digital mercenaries because of access to external expertise and plausible deniability. Around half of the takedowns attributed to marketing firms involve the UAE or Egypt.
Table 1: Summary of Middle East Takedowns by Facebook and Twitter, Coded by Attribution
Attributions are quoted from platform announcements; parentheses give the announcing platform (T = Twitter, F = Facebook) and the date of public disclosure.

- Unspecified Individuals (18 takedowns)
  - Iran: “linked to the network we removed in October 2020” (F: April 6, 2021); “individuals in Iran with academic backgrounds” (F: March 3, 2021); “individuals in Tehran” (F: March 3, 2021); “links to individuals in Iran” (F: January 12, 2021); “originated in Iran” (F: October 21, 2019); “originated in Iran” (F: May 28, 2019); “our review linked these accounts to Iran” (F: March 26, 2019); “tied to Iran” (F: January 31, 2019); not specified (F: August 21, 2018, Iran Network 3); “originating in Iran” (T: August 21, 2018); “may have origins in Iran” (T: January 31, 2019); “operating from Iran” (T: October 8, 2020)
  - Morocco: “originated primarily in Morocco” (F: March 3, 2021)
  - Palestine, UAE: “individuals in Palestine and UAE” (F: February 9, 2021)
  - Yemen: “originating in Yemen” (F: August 6, 2020)
  - Iraq: “in Iraq” (F: September 16, 2019)
  - UAE: “operating uniquely from the UAE” (T: September 20, 2019)
  - Saudi Arabia: “associated with Saudi Arabia” (T: April 2, 2020)
- Marketing, PR, or IT Firm (12 takedowns)
  - Egypt: “Bee interactive, a marketing firm in Egypt” (F: April 6, 2021); “Maat, a marketing firm in Egypt” (F: April 2, 2020); “two marketing firms in Egypt, New Waves and Flexell” (F: March 2, 2020)
  - Palestine, UAE, Belgium: “a recently created marketing firm called Orientation Media in Belgium” (F: February 9, 2021)
  - Morocco: “Qualitia Systems, a marketing firm in Morocco, also known as Marketing Digital Maroc” (F: January 12, 2021)
  - Iran: “linked to individuals associated with EITRC, a Tehran-based IT company” (F: November 5, 2020)
  - Israel: “Israeli commercial entity, Archimedes Group” (F: May 16, 2019)
  - Tunisia: “a Tunisia-based PR firm Ureputation” (F: June 5, 2020)
  - UAE, Egypt, Nigeria: “Charles Communications in UAE, MintReach in Nigeria and Flexell in Egypt” (F: October 3, 2019)
  - UAE, Egypt: “New Waves in Egypt, and Newave in the UAE” (F: August 1, 2019); “created and managed by DotDev, a private technology company operating in the UAE and Egypt” (T: September 20, 2019)
  - Saudi Arabia: “Smaat, a social media marketing and management company based in Saudi Arabia” (T: December 20, 2019)
- Government-Linked (11 takedowns)
  - Iran: “individuals associated with the Iranian government” (F: October 27, 2020); “Islamic Republic of Iran Broadcasting Corporation” (F: May 5, 2020); “Iranian state media” (F: August 21, 2018, Iran Network 1); “all are associated with — or directly backed by — the Iranian government” (three networks) (T: August 21, 2018)
  - Kurdistan: “Zanyari Agency, part of the intelligence services of the Kurdistan Regional Government in Iraqi Kurdistan” (F: June 5, 2020)
  - Saudi Arabia: “individuals associated with the government of Saudi Arabia” (F: August 1, 2019); “linked to Saudi Arabia’s state-run media apparatus” (T: September 20, 2019); “with ties to the Saudi government” (T: October 8, 2020)
  - Egypt: “El Fagr network…Information we gained externally indicates it was taking direction from the Egyptian government.” (T: August 2, 2020)
- Third-Party News (3 takedowns)
  - Israel: “ElBaladd, a news website in Israel” (F: April 6, 2021)
  - Egypt: “an Egyptian newspaper El Fagr” (F: October 3, 2019)
  - Iran: “Liberty Front Press” (F: August 21, 2018, Iran Network 2)
- Political Group (2 takedowns)
  - Albania: “MEK, an exiled militant opposition group from Iran now based in Albania” (F: April 6, 2021)
  - Egypt, Turkey, Morocco: “individuals in Egypt, Turkey and Morocco associated with the Muslim Brotherhood” (F: November 5, 2020)
Goals and Narratives
After assessing the actors to which the operations were attributed, we examined the content shared by these networks to understand the potential goals of the operation. There were four primary objectives spanning all operations:
- Attempts to cast one’s own government, culture, or policies in a positive light
- Advocacy for or against specific policies
- Attempts to make allies look good and rivals look bad to third-party countries
- Attempts to destabilize foreign relations or domestic affairs in rival countries
In this section, we describe these objectives, and offer examples of narratives leveraged in the effort to achieve them.
Promoting (and Protecting) One’s Image
The networks engaged in promoting a positive image of their country of origin amplified narratives that cast their leadership and policies as beneficial to their own citizens, and often to the broader region as well. Datasets attributed to Iran-linked actors contained content that positioned Iran as the champion of the oppressed and the leader of the Muslim world. Iran-based networks often championed Palestinian rights and denounced US, Israeli, and Saudi regional interventions and collusion. Iran was portrayed as a bulwark against neocolonialism and the “West,” and a stabilizing force in the region. Showcasing Iran’s capability to stand up for the oppressed and confront the “West,” posts from other Iran-linked networks also boasted of the Iranian military’s threat to Israel and the United States. Furthermore, at least two takedowns had assets promoting the Iranian Supreme Leader’s religious teachings, potentially to increase his appeal among Muslims outside Iran and further Iran’s cultural diplomacy. Accounts that tweeted in many languages reveal Iran’s attempt to broaden its appeal among a global audience. Such messages are consistent with transparently attributable, “white propaganda” Iranian state media narratives. However, the covert influence campaigns additionally allowed accounts with no visible ties to Iran to launder Iranian propaganda to unsuspecting users. Examining a takedown linked to the Islamic Republic of Iran Broadcasting Corporation, Graphika researchers noted, “Many of its assets conducted what could have been considered classic public diplomacy, if it had been done overtly: promoting Iran’s successes and spiritual authority to Arabic- and English-speaking audiences.”
The Saudi networks largely promoted the country’s achievements to its domestic population, contrasting Saudi successes with Iran’s domestic failures. They also sought to present an attractive image of Saudi Arabia to the Western world. For instance, an August 2019 takedown attributed to individuals associated with the Saudi government promoted Saudi Arabia’s military and social achievements. The network promoted the successes of the Saudi Armed Forces and Crown Prince Mohammad bin Salman’s economic and social reform plan, “Vision 2030.” One Twitter takedown portrayed the Saudi Crown Prince as personable and relatable, showing him trying VR games and leading a traditional dance. Some of the network’s posts promoted the country’s progress in women’s rights, featuring Saudi women who were pushing traditional boundaries as horseback racers, top chefs, and more. These successes were implicitly credited to the government. This Saudi network also featured other feel-good posts highlighting national points of pride.
The Saudi networks also engaged in reputational damage control during controversies that garnered significant media coverage, such as when Jeff Bezos’ phone was hacked or after the journalist Jamal Khashoggi was murdered in the Saudi consulate in Istanbul in 2018. The takedown dataset revealed that the networks tried different strategies for deflecting blame for the Khashoggi murder away from Saudi Arabia, offering a range of overlapping (and at times conflicting) narratives that ranged from denying the murder, to claiming it occurred elsewhere, to attacking Khashoggi’s character.
As with the Saudi campaigns, the campaigns that originated in the UAE showcased the country’s social and economic achievements. Campaigns originating in the UAE attempted to create the perception of broad, widespread global praise for the country. A September 2019 Twitter takedown of 4,248 accounts operating uniquely from the UAE exemplifies this. A set of accounts claimed to be of diverse nationalities and posted praise of the UAE in languages including Arabic, Chinese, English, French, German, Hebrew, Italian, Japanese, Korean, Persian, Polish, Portuguese, Russian, Spanish, and Turkish. Many of the accounts posted about a visit that Pope Francis made to the UAE, and emphasized that the UAE is a tolerant country. Others promoted the UAE as an attractive tourist destination (e.g. “Summer goals #UAE 🇦🇪 #SaturdayMorning” from a Twitter account that purported to belong to an Australian activist). They touted the UAE’s international events, such as the World Government Summit, its celebration of Chinese New Year, and the Abu Dhabi International Triathlon.
Manufacturing Consensus For or Against Specific Policies
Astroturfing accounts were used to create an impression of domestic grassroots support or opposition not only to certain governments, but also to particular government policies.
For example, we saw astroturfing related to the Iran-Russia Defense Agreement: In an April 2020 Twitter takedown, a network of accounts associated with Saudi Arabia created the impression of local Iranian opposition to a potential joint defense agreement between Russia and Iran. Amidst rising U.S.-Iran tensions in the summer of 2019, reports surfaced that Iran was pursuing a joint defense agreement with Russia. Posing as Iranians, the accounts used the English hashtag #GetLostFromIranRussia. They portrayed the agreement as Russian colonial intervention infringing on Iran’s sovereignty.
We also observed astroturfing against the Grand Ethiopian Renaissance Dam project. A March 2021 Facebook takedown attributed to Bee Interactive, a marketing firm in Egypt, promoted domestic resistance to the project, which threatened Egypt’s fresh water supplies. Five Pages posed as independent news outlets and criticized the dam.
Denigrating Regional Rivals
Regional rivalries, often between a Saudi Arabia/UAE/Egypt axis on one hand, and an Iran/Turkey/Qatar axis on the other, feature prominently in MENA takedowns.
Networks attributed to the former set commonly portrayed the latter set as having a destabilizing presence in the region. Among the 15 takedowns that originated in Egypt, Saudi Arabia, and/or the UAE, at least 10 portrayed Qatar, Turkey, and/or Iran as sponsors of terrorism. In addition to underscoring the rivals’ destabilizing regional activities, the campaigns also criticized their domestic performance, potentially to reduce the rivals’ regional appeal or domestic stability. They showcased moral corruption in Iran, economic collapse in Turkey, and human rights violations in Qatar, frequently leveraging accounts posing as locals in the target country to push those narratives.
Isolating Qatar was by far the most salient example of MENA takedown networks trying to denigrate rivals. In June 2017, Saudi Arabia and other Arab countries cut diplomatic ties and launched a land, air, and sea blockade against Qatar. The intra-Gulf crisis reflected a geostrategic and ideological rift between Qatar and the blockading countries. The latter issued a list of demands that Qatar had to meet for the blockade to end. These included demands that Qatar curb its relations with Iran and Turkey, sever ties to Islamist and terrorist groups, and shut down its state-funded broadcaster, Al Jazeera, and affiliate stations. The blockade lasted three and a half years until a January 2021 GCC summit aimed at reconciliation. During the blockade, narratives attempting to isolate Qatar from any potential allies and create rifts featured prominently in the takedown datasets:
- Qatar-United States Rift: Some posts sought to create a wedge between Qatar and the United States. Accounts relentlessly portrayed Qatar as a sponsor of terrorism. One Page, from an October 2019 takedown attributed to three commercial firms, promoted the narrative that Qatar indirectly supported the 9/11 attacks against the United States.
- Qatar-Turkey Rift: Assets in the October 2020 Twitter takedown linked to the Saudi government posted or retweeted posts that Turkey killed and insulted several members of Qatar’s royal family, that Turkey was occupying Qatar, and that “Erdogan is used to exploiting the young #Qatari Emir.”
- Qatar-Iran Rift: For example, a February 2020 Facebook takedown attributed to two marketing firms in Egypt promoted allegations that Qatar played a role in the assassination of the nationally popular Iranian general, Qassim Soleimani.
The rift-creating narratives are in line with the blockading countries’ geopolitical interests. Qatar’s ties with Iran and Turkey were not only stimuli for the blockade; they also undermined the blockade’s aim of severely pressuring Qatar, as Iran and Turkey provided critical exports to Qatar during the blockade. Iran and Turkey also appeared in narratives claiming that their leaders support terror groups, including ISIS and the Muslim Brotherhood.
The networks also pushed claims of Qatar, Turkey, and Iran’s interference in particular countries. An August 2019 Facebook takedown attributed to two marketing firms – New Waves in Egypt, and Newave in the UAE – amplified claims that Qatar and Turkey support terrorist groups in Africa and the Middle East. The takedown particularly amplified narratives that Qatar was involved in a terror attack in Somalia. It used fake news outlets to amplify such reports. One Page, posing as the social media arm of the website Somalianow, also promoted its articles criticizing Qatar’s investments in Africa.
Iranian operations frequently criticized regional rivals. Of the 20 takedowns originating in Iran, at least 15 criticized Israel, Saudi Arabia, and/or the UAE. The takedowns commonly portrayed these countries as corrupt and complicit in Western crimes, or as un-Islamic. Their operations also painted a picture of Western neocolonialism in the region, abetted by rival regional governments like Saudi Arabia and the UAE. An April 2020 Facebook takedown attributed to the Islamic Republic of Iran Broadcasting Corporation (IRIB) likewise promoted narratives of the Saudi royal family’s corruption. For instance, one Arabic-language Page, largely active between 2014 and 2015, was named “Saudi opposition and free speech page.” Its ‘About’ section described itself as a “Private page of Saudi revolutionaries that transmits the truth to the outside and to anyone who is looking for the truth, in order to free the country [Saudi Arabia] from the [House of] Saud, may they be cursed by God.” Its memes and texts portrayed the Saudi ruling family as a puppet regime serving the United States and Israel.
Destabilizing Rival Governments
Denigrating regional rivals extended beyond policy criticism and into outright attempts to undermine leaders and destabilize regimes.
At least two takedowns promoted independence for Somaliland. The August 2019 Facebook takedown attributed to New Waves in Egypt and Newave in the UAE frequently posted on this topic. Five accounts from the April 2020 Twitter takedown associated with Saudi Arabia claimed to be based in Somaliland. Their tweets extolled the wildlife, nature, and physical beauty of Somaliland. Some mentioned the “rebirth of Somaliland,” and claimed they were “proud to be #Somalilanders.” Accounts also tweeted the English hashtag, #Somaliland_not_somalia.
The campaigns promoting Somaliland’s independence are an extension of Saudi Arabia and the UAE’s geopolitical interests. During the intra-GCC crisis, Saudi Arabia and the UAE reportedly pressured Somalia’s newly elected president, Mohamed Abdullahi Mohamed (also referred to as Farmajo), to sever ties with Qatar. Farmajo insisted on remaining neutral. However, reports that the president received funds from Qatar ahead of his election, and his appointment of officials close to Doha, raised Abu Dhabi’s doubts of his neutrality.
This delegitimization strategy was deployed against other governments as well:
- Sudan: The April 2020 El Fagr takedown promoted narratives undermining the Sudanese government. Amidst protests in Sudan in June 2019, many fake accounts supported the protestors, saying the protesters were rejecting the Muslim Brotherhood.
- Syria: The April 2020 Twitter takedown associated with Saudi Arabia advanced narratives of the Syrian regime’s domestic unpopularity. Thirty-six accounts had Syria-related usernames or references to Syria in their profile. Their tweets criticized Syria’s President Bashar al-Assad.
- Libya: Networks often attacked Fayez al-Sarraj, the former Prime Minister of Libya’s Government of National Accord (GNA). A Saudi-linked Twitter takedown amplified the hashtag “Sarraj the traitor of Libya” (translated from Arabic). Feeding off this inauthentically widespread use of the hashtag, a pro-Haftar YouTube channel and several articles reported on the “trending” hashtag, asserting that many Libyans viewed Sarraj’s agreement with Erdogan as a betrayal of their country.
- Iran: In at least one takedown, accounts boldly criticized Iran’s Supreme Leader Ayatollah Ali Khamenei, and the late IRGC Quds Force commander, Qassem Soleimani. In addition to employing hashtags like #IranRegimeChange (translated into English), the suspended accounts promoted candidates for the incumbent regime’s replacement. They used the pro-Iranian monarchy hashtag #G20RecognizePahlavi and promoted the Mojahedin-e Khalq (MEK).
Tactics

In our coding process we made note of tactics, techniques, and procedures that appeared within research reports describing each operation. Around three quarters of the takedowns included assets that claimed to be news outlets. Accounts masquerading as media outlets are an extremely common, recurring feature of influence operations. However, there is some nuance in how manipulators execute this approach. In the Middle East dataset, the “news outlet” accounts took on a range of forms:
- Leveraging quasi-real slanted news outlets. For example, two takedowns were attributed to the El Fagr newspaper in Egypt. While El Fagr claims to be an “independent weekly newspaper,” Twitter noted that information “gained externally indicates it was taking direction from the Egyptian government.”
- Creating front media. Most news outlets linked to suspended social media operations in the dataset fell into this category. They often reposted content from other news sites, selectively publishing news stories that advanced the network’s objectives. The accounts’ occasional original content was often poorly written, replete with grammar and spelling mistakes. The fake news outlets used standard newspaper naming conventions, like “Sudan Today” and “Afghan Mirror.”
- False franchising. Some Facebook Pages covered in takedown reports were designed to look like regional branches of authentic large news outlets. For example, one asset posed as the regional Page for the Huffington Post, calling itself the Huffpost Taounatepress.
- Impersonating real news outlets. Some takedowns included assets impersonating real news organizations. For instance, in a May 2019 Iran takedown, one Page, @AlArabyi, impersonated the Saudi-funded news channel @AlArabiya; the Arabic-language Page mimicked the name, logo, and visual branding of AlArabiya.net. Occasionally these outlets used “typo-squatting,” employing Facebook URLs that mimic the URLs of authentic media outlets, with minor typographical changes to trick users into mistaking them for the authentic domains. A November 2020 takedown attributed to individuals in Iran and Afghanistan engaged in typo-squatting of a popular Facebook Page: the Page facebook.com/aff.varzeshi spoofed the genuine facebook.com/aff.varezshi by reversing the “ez.”
The authentic facebook.com/aff.varezshi Page, right, and the mimicking facebook.com/aff.varzeshi Page, left.
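As a rough illustration of how such look-alike handles could be flagged programmatically, a simple edit-distance check suffices: the spoofed handle sits only two single-character edits from the genuine one. This is a sketch of the general idea, not a description of how the platforms actually detect typo-squatting.

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

# The spoofed handle from the November 2020 takedown differs from the
# genuine one only by swapping two adjacent characters ("ez" -> "ze").
genuine, spoof = "aff.varezshi", "aff.varzeshi"
print(levenshtein(genuine, spoof))  # prints: 2
```

A small distance between an unknown handle and a well-known one is a cheap signal that the unknown handle may be a deliberate look-alike; real detection pipelines would combine this with signals like logo and branding similarity.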
We observed a number of tactics to build audiences and increase account legitimacy. These included:
- Non-political, humorous or fashion content. Gathering a relatively large and broad following with light-hearted content, the assets would then occasionally redirect their viewers to the more narrowly politicized assets in the network. Sometimes these two roles – posting engaging content and advancing political narratives – were conducted by the same asset. They interspersed strategic political posts amidst filler content.
- Handle-switching, where an account grows its following, perhaps through spammy follow-back behavior, then deletes its old tweets and changes its handle. We saw this tactic used by an account that, once it had established a following, posed as an interim Qatari government.
- Early account creation dates, potentially from hacking or purchasing real accounts.
- Bolstering accounts on one platform with “off-platform” presence elsewhere online, for example by having a fictitious persona publish op-eds.
- Astroturfing. Creating accounts that posed as locals in target countries. The purposes of using ‘fake local’ accounts may be manifold. One may be to distort the picture for journalists and analysts who rely on social media to gauge public sentiment in the subject country. Another may be to lead real locals in the target country to overestimate the predominance of a particular public sentiment, which might in turn quiet the voices of those harboring opposing viewpoints, or intensify the attitudes of those sharing those views. Networks advocating protests and questioning a government’s legitimacy may seek to foment domestic overthrow of unfriendly governments; the 2011 Arab Spring made clear the power of social media as a tool to spur popular uprising. However, the networks’ ability to achieve those potential objectives is debatable; many of these accounts received little engagement, and researchers have long struggled to measure the impact of influence operations.
Marc Owen Jones’ article in this collection discusses an important recent tactical innovation, particularly on Twitter: the use of chopped hashtags. With chopped hashtags, “sock puppet accounts dilute and pollute critical hashtags using abbreviated versions of the real hashtags.” While we have not seen this tactic in the takedowns analyzed in this paper, we expect to see this tactic in future takedowns.
Conclusion

Assessing attributed influence operation narratives and methods offers visibility into what topics state and non-state actors have chosen to prioritize, and can potentially surface a set of recurring tactics, techniques, and procedures that help attribute subsequent operations. However, in our cross-dataset analysis of MENA region takedowns, we observed a very wide breadth of tactics and narratives even within operations attributed to a single state actor. Perhaps the best way to describe the “Middle East playbook” is that many information operations attributed to countries in the Middle East use a “kitchen sink” approach. Across multiple individual takedowns we saw a wide array of tactics and narratives employed in service of multiple geopolitical objectives – far more than, for example, the activity observed in Chinese operations, in which clusters of accounts are largely dedicated to producing or amplifying content about a single self-promotional narrative, with a recurring focus on bolstering China’s own image. The breadth of issues covered in these MENA region operations suggests, perhaps, that covert social media influence operations are becoming a normal part of propaganda and influence efforts by these regimes rather than something reserved for events or topics of extremely high impact or significance.

The narratives the networks promoted often systematically aligned with the geopolitical interests of their country of origin. However, the campaigns’ efficacy is debatable. Most of the Middle East takedowns achieved low engagement. Moreover, the causal mechanisms linking exposure to changes in attitudes and behavior have yet to be strongly established. Nonetheless, investigating these information operations helps us better understand regional and domestic politics in the Middle East and North Africa.
 Philip N. Howard and Muzammil M. Hussain. 2011. “The Upheavals in Egypt and Tunisia: The Role of Digital Media.” Journal of Democracy 22(3): 35-48.
 See for example: McGarty, Craig, Emma F. Thomas, Girish Lala, Laura GE Smith, and Ana‐Maria Bliuc. 2014. “New Technologies, New Identities, and the Growth of Mass Opposition in the Arab Spring.” Political Psychology 35(6): 725-740.
 For example, across five Middle Eastern and North African countries surveyed in a 2018 Pew study, 68% of respondents said that they use social networking sites. Poushter, Jacob, Caldwell Bishop, and Hanyu Chwe. 2018. “Social Media Use Continues to Rise in Developing Countries but Plateaus Across Developed Ones.” Pew Research Center.
 Renée DiResta, Kris Shaffer, Becky Ruppel, David Sullivan, Robert Matney, Ryan Fox, Jonathan Albright, and Ben Johnson. 2019. “The Tactics & Tropes of the Internet Research Agency.” New Knowledge Report.
 Renée DiResta, Carly Miller, Vanessa Molter, John Pomfret, and Glenn Tiffert. 2020. “Telling China’s Story: The Chinese Communist Party’s Campaign to Shape Global Narratives.” Hoover Institution/Stanford Internet Observatory Report. Accessed: https://fsi-live.s3.us-west-1.amazonaws.com/s3fs-public/sio-china_story_white_paper-final.pdf.
 For a thoughtful critique on the notion that a single “Russian Playbook” exists, see: François, Camille. “Moving Beyond Fears of the ‘Russian Playbook.’” Lawfare. September 15, 2020. Accessed: https://www.lawfareblog.com/moving-beyond-fears-russian-playbook
 For one data point, the Partnership for Countering Influence Operations at the Carnegie Endowment for International Peace created a catalogue of 460 counter-influence operations initiatives. The project found that 44% of initiatives studying influence operations are located in North America and 37% in Europe, perhaps partially explaining the over-indexing on elections in those regions. See: Smith, Victoria. “Mapping Worldwide Initiatives to Counter Influence Operations.” Carnegie Endowment for International Peace. December 14, 2020. Accessed: https://carnegieendowment.org/2020/12/14/mapping-worldwide-initiatives-to-counter-influence-operations-pub-83435
 For other recent comparative analyses, see: Bradshaw, Samantha, Hannah Bailey, and Philip N. Howard. 2021. “Industrialized Disinformation: 2020 Global Inventory of Organized Social Media Manipulation.” Computational Propaganda Research Project. Accessed: https://demtech.oii.ox.ac.uk/wp-content/uploads/sites/127/2021/01/CyberTroop-Report-2020-v.2.pdf; Goldstein, Josh A. and Shelby Grossman. “How disinformation evolved in 2020.” Lawfare. January 4, 2021. Accessed: https://www.brookings.edu/techstream/how-disinformation-evolved-in-2020/; Martin, Diego A., Jacob N. Shapiro, and Julia Ilhardt. 2020. “Trends in Online Influence Efforts.” Working Paper.
 In this paper, we consider MENA countries to consist of Algeria, Bahrain, Egypt, Iran, Iraq, Israel, Jordan, Kuwait, Lebanon, Libya, Morocco, Oman, Qatar, Saudi Arabia, Syria, Tunisia, United Arab Emirates, and Yemen.
 After we completed our research, Facebook independently published a dataset of all takedowns from its platform. We therefore checked our coding against Facebook’s attribution and target coding. Twitter, however, has not yet published a similar spreadsheet. See: Gleicher, Nathaniel, Margarita Franklin, David Agranovich, Ben Nimmo, Olga Belogolova, and Mike Torrey. May 2021. “Threat Report: The State of Influence Operations 2017-2020.” Facebook. Accessed: https://about.fb.com/wp-content/uploads/2021/05/IO-Threat-Report-May-20-2021.pdf
 As Facebook noted in one of its takedown reports, “We routinely take down less sophisticated, high-volume inauthentic behaviors like spam and we do not announce these enforcement actions when we take them” https://about.fb.com/wp-content/uploads/2020/05/April-2020-CIB-Report.pdf
 Some takedowns originated in multiple countries. For instance, one Twitter takedown announced in April 2020 and associated with Saudi Arabia operated out of three countries: Saudi Arabia, Egypt, and the UAE. https://twitter.com/TwitterSafety/status/1245682443241259010?s=20
 For Twitter operations, we cite Twitter’s announcement of the operation, which includes its attribution language. We do not include Facebook citations since Facebook itself now provides a single spreadsheet with links to each takedown announcement, which can be found here: https://about.fb.com/wp-content/uploads/2021/05/IO-Threat-Report-May-20-2021.pdf p. 43.
 Shelby Grossman and Khadeja Ramali. “Outsourcing Disinformation.” Lawfare. December 13, 2020.
 We double count this operation as both an example of an attribution to unspecified individuals and a marketing, PR, or IT firm. For more information, see footnote 34.
 We double count this operation as both an example of an attribution to unspecified individuals and a marketing, PR, or IT firm. For more information, see footnote 34.
 We double counted one takedown due to mixed attribution categories, so the total number of takedowns in the dataset is 46. The operation we double counted was attributed by Facebook as follows: “Our investigation found links to individuals in Palestine and UAE, in addition to links between a small portion of this network and individuals associated with a recently created marketing firm called Orientation Media in Belgium.” We thus code Orientation Media in Belgium under the ‘Marketing, PR, or IT Firm’ category, and individuals in Palestine and UAE under the ‘Unspecified Individuals’ category. For more information, see: https://about.fb.com/news/2021/02/january-2021-coordinated-inauthentic-behavior-report/.
 https://public-assets.graphika.com/reports/graphika_report_irib_takedown.pdf; https://medium.com/dfrlab/facebook-removes-iran-based-assets-again-f17358ef21f.
 See also: Brooking, Emerson T. and Suzanne Kianpour. 2020. “Iranian Digital Influence Efforts: Guerrilla Broadcasting for the Twenty-First Century.” Atlantic Council Report.
 https://about.fb.com/news/2019/08/cib-uae-egypt-saudi-arabia/; https://medium.com/dfrlab/royally-removed-facebook-takes-down-pages-promoting-saudi-interests-edc0ce8b972a
 The term “astroturf” refers to activities that appear to be grassroots activism from ordinary people, but are in fact paid for or executed by government, institutional, or corporate entities.
 https://fsi-live.s3.us-west-1.amazonaws.com/s3fs-public/20200402_blame_it_on_iran_qatar_and_turkey_v2_0.pdf p. 29.
 https://public-assets.graphika.com/reports/graphika_report_inauthentic_beehavior.pdf; https://about.fb.com/wp-content/uploads/2021/04/March-2021-CIB-Report.pdf, p. 17.
 We note that these axis groupings are broad generalizations. The countries’ alignments and relations pivot with respect to different issues. Nonetheless, these groupings helpfully depict the broad trends observed across the takedowns.
 https://github.com/stanfordio/publications/blob/main/twitter-SA-202009.pdf, p. 19.
 E.g. https://www.mobtada.com/details/892394
 Scholars have historically called this pre-propaganda: propaganda that is not directly related to the political message of the propagandist. See: Golovchenko, Yevgeniy, Cody Buntain, Gregory Eady, Megan A. Brown, and Joshua A. Tucker. 2020. “Cross-Platform State Propaganda: Russian Trolls on Twitter and YouTube during the 2016 U.S. Presidential Election.” The International Journal of Press/Politics 25(3): 357-389.
 https://about.fb.com/news/2021/03/february-2021-coordinated-inauthentic-behavior-report/; For a discussion of astroturfing and its potential effects, see: Keller, Franziska B., David Schoch, Sebastian Stier, and JungHwan Yang. 2020. “Political Astroturfing on Twitter: How to Coordinate a Disinformation Campaign.” Political Communication 37(2): 256-280; Zerback, Thomas, Florian Töpfl, and Maria Knöpfle. 2021. “The disconcerting potential of online disinformation: Persuasive effects of astroturfing comments and three strategies for inoculation against them.” New Media & Society 23(5): 1080-1098.
 Research on the ‘spiral of silence’ suggests that people are less likely to share their views when they believe that their views are not widespread. See, for example: Hampton, Keith, Lee Rainie, Weixu Lu, Maria Dwyer, Inyoung Shin, and Kristen Purcell. “Social Media and the ‘Spiral of Silence.’” Pew Research Center. August 26, 2014. Accessed: https://www.pewresearch.org/internet/2014/08/26/social-media-and-the-spiral-of-silence/; Noelle-Neumann, Elisabeth. 1974. “The Spiral of Silence: A Theory of Public Opinion.” Journal of Communication 24(2): 43-51.
Ben Nimmo. 2020. “The Breakout Scale: Measuring the Impact of Influence Operations.” Brookings Institution. Accessed: https://www.brookings.edu/wp-content/uploads/2020/09/Nimmo_influence_operations_PDF.pdf