Mahsa Alimardani and Mona Elswah, Oxford Internet Institute, University of Oxford
These voices will increase in number and volume. They cannot be ignored. (Mark Zuckerberg, 2012)
It is easy to forget, in the current climate of scholarship centered on Silicon Valley disillusionment, that social media companies were originally billed as conduits of revolution. The CEOs of these companies wanted the world to see them as tools that democratised information and access. The strategy of Facebook's Mark Zuckerberg during the company's stock market launch in 2012 was to take credit for pro-democracy movements like the 2011 Arab Spring. Zuckerberg described the state-controlled media systems of countries living under censorship as "the intermediaries controlled by a select few", which would be liberated by the emancipatory features of his technology. A decade after the Arab Spring, the roles have been reversed. During the crisis surrounding Israeli settlers seizing homes in the Sheikh Jarrah neighborhood of Jerusalem, the Palestinian movement against what B'Tselem and Human Rights Watch have recently called Israeli apartheid found the policies developed by Facebook, and to a lesser extent other social media companies, a major obstacle to its mobilization.
In this paper, we zero in on Arabic content moderation. First, we identify the systemic policies administered by social media companies, whether designed into the technology or implemented through policy. While there have been issues with other platforms, we focus our analysis on Facebook as the most egregious violator, with far-reaching systemic problems and impact on Arab and pro-Palestine content. Second, we identify the various forms of digital repression of speech regarding the online rights movement supporting Palestinians. Third, we look at the countermeasures users have employed to overcome this repression as part of a greater movement for platform accountability in the Arab region. We contend that the failures and subjectivity of platform governance have given rise to what we call a new orientalism in the digital sphere, or digital orientalism. Orientalism is the stereotypical and discriminatory lens through which Western nations view the Middle East and North Africa. Western countries have used this lens to assert dominance and colonial control through war, media, governance, and policy. We argue that this framework now defines the policies and actions Western social media companies use to disadvantage Internet users in the region.
A History of Facebook’s Problems with Content Moderation
While the issues in the region have been playing out across many platforms, Facebook retains a particularly significant hold over online communications across the Arab world, with millions of users scattered across its three platforms: Facebook, Instagram, and WhatsApp. Its influence on communications and media has come to the fore globally, especially since the 2016 United States presidential election and Facebook's role in scandals such as Cambridge Analytica. This increase in focus and concern has been exacerbated by studies outlining Facebook's outsized role in the Myanmar military's incitement of genocide against the Rohingya Muslims, documented in 2018. Given flaws in Facebook's architectural design, some scholars have questioned whether the company bears legal responsibility and could be held accountable for its role in these scandals.
Here we focus on the deep-rooted problems inherent specifically in Facebook's content moderation policies, which are ad hoc and inconsistent. In 2018, the New York Times published leaked content moderation guidelines and practices drawn from troves of PowerPoint presentations. It concluded that "the Facebook guidelines do not look like a handbook for regulating global politics. They consist of dozens of unorganized PowerPoint presentations and Excel spreadsheets with bureaucratic titles." The investigation revealed that moderators are expected to rely on Google Translate, as Facebook remains short of moderators who speak local languages. These translations often miss the nuances or facts of the speech at hand, a major problem in a region where Arabic is spoken across diverse dialects and cultures. Although the company has taken some measures to prevent further scandals, Facebook's content moderation has continued to impose harm, especially in the Arab world.
The Hurdles of Arabic Content Moderation
Arab activists have been part of the broader digital rights movement calling out commercially oriented social media platforms for their problematic positions. Since 2011, the policies and teams concerned with the Middle East and North Africa have developed in response to urgent pressure from users, governments, issues, and events. Prior to the Arab Spring, issues of content moderation were resolved only for elite and well-connected users. The most famous case is Facebook's removal of the popular "We Are All Khaled Saeed" page for violating its "real name policy" prior to the start of the 25 January 2011 Egyptian Revolution. The page was restored only because of the connections of Wael Ghonim, one of its anonymous administrators, who worked for Google and used his contacts to reach Facebook's Chief Operating Officer Sheryl Sandberg.
The issues within the region have been persistent over the past ten years (see Table 1). We identify five forms of platform bias. First, the removal of pro-democracy Arabic content (e.g., posts, tweets, pages) has harmed many activists in the region. Second, Arab activists have repeatedly had their accounts restricted and deleted on the basis of violating platforms' community standards. In Arab countries, many pro-democracy pages, groups, and accounts have been taken down, suspended, or de-platformed for what companies call "Terrorist and Violent Extremist Content" (TVEC), hate speech, organized hate, hateful conduct, or violent threats. Moreover, even unintentional removals through automated systems have far-reaching consequences. For example, YouTube's community guidelines against graphic and violent videos have mistakenly led its algorithms to take down several videos from Syria that documented the war crimes of the regime of Bashar al-Assad. From 2012 to 2019, YouTube erased some 206,077 videos related to Syria and removed several channels owned by activists and local news outlets.
Third, the Global South, including the MENA region, faces double standards compared with the rest of the world. For example, activists and researchers have noted the limited access to social media data during elections and other political crises. This was evident during Tunisia's 2019 elections, when civil society members could not benefit from the archive of political ads in the Facebook Ad Library. While researchers and activists in many countries were able to monitor political ads and learn which audiences politicians were targeting, Tunisian researchers and activists were denied access to this information.
Fourth, social media platforms employ discriminatory and unfair measures towards content from the Arab world. Facebook's organizational structure within the region speaks to systemic issues that reflect these orientalist tropes more broadly. While Israel and almost every European country has its own designated public policy head, the Middle East and North Africa region, despite vast linguistic, state, religious, and cultural differences, is lumped under one system of management. Facebook maintains a broad "MENA" office in Dubai, yet it has a country-specific office in Israel with its own public policy director, Jordana Cutler, a former adviser to Israeli Prime Minister Benjamin Netanyahu and Likud staffer. No equivalent position exists for Palestinians or any other Arab country or diaspora.
Fifth, as most recently documented by 7amleh, users posting pro-Palestine content found that the views and reach of their posts decreased. As Marwa Fatafta, and Ahmed Shaheed and Benjamin Greenacre, document in their contributions to this collection, these systemic policies have elevated and prioritised Israeli content and takedown requests.
Platform moderation is a key arena where these discriminatory politics play out. Jillian York's recent book Silicon Values outlines the free speech implications of the choices made by these social media corporations. In one of her interviews with former platform content moderation experts, an anonymous former Facebook moderator said that when confronted about harms faced by groups in the region's countries, Facebook would simply refuse to develop policies for those threats: "this kind of policy would never get any face time with the policy team because they were always busy with…whatever was prioritized by countries like Germany and the US."
Table 1: Types of Platform Bias in the Arab World
| Type | Definition | First Reported | Platforms Involved |
| --- | --- | --- | --- |
| Removal of pro-democracy content | Relying on algorithmic or human moderation to remove content that does not align with platforms' "community standards." | 2011 | Facebook and YouTube |
| Restricting and deleting activists' accounts | Suspending accounts temporarily or deleting them permanently for violating platforms' "community standards." | 2011 | |
| Limiting data access | Denying access to platform data despite providing it to Western researchers and civil society activists. | 2019 | |
| Lack of measures and resources for Arabic content | Not employing the same policies and measures applied in Western countries. | 2021 | All platforms |
| Reach reduction of activists' content | Adjusting algorithms to reduce the reach of certain types of content. | 2021 | Instagram and Facebook |
When Orientalism Goes Online: Understanding Digital Orientalism
The recent evidence of censorship of pro-Palestine content in May 2021 was dismissed by Facebook as merely "technical errors." This dismissal aligns with more insidious patterns of systemic design discrimination, exacerbated by a lack of resources and discriminatory policies. It has set the stage for recurring crises in the Arab world, especially the unfolding escalations of digital orientalism that pro-Palestine voices face in the midst of real-world repression.
While 2021 has been the year the concepts of segregation and systemic discrimination against Palestinians within Israel and the Palestinian territories started to gain mainstream currency, Palestinian activists have begun to label the injustices they face online, both from social media companies and from the Israeli Internet infrastructure that controls the flow of the Internet to Palestinians, as "digital apartheid." In this framework, Palestinian rights advocates argue that these forms of online discrimination continue, within the online realm, the systemic segregation, discrimination, and abuses to which the Israeli authorities subject Palestinians. While we believe the term digital apartheid is correctly applied by Palestinian rights advocates within the broader struggle for rights and dignity against the Israeli state, we situate the problem within a broader regional framework of digital orientalism. The discrimination and failures inherent in digital orientalism must be recognised as part of the growing strains of our online world. For too long, digital orientalism has plagued MENA countries, and in the case of Palestine, the platforms and their policies are in their own ways contributing to events that the United Nations and leading human rights organisations say amount to war crimes.
This is why we place the flaws in Facebook's policies in the context of more traditional media studies discourses. The argument put forward by Edward Said in Covering Islam, the last book of his Orientalism trilogy, broaches the power of American and European media to shape perceptions of Islam and the countries of the region. Said's central argument was that media language builds and maintains stereotypes, and attempts to turn these Western frameworks for describing a foreign culture into objective truths. We see the parameters by which social media companies police speech wielding a similar subjective power to build and maintain those same stereotypes. Said's analysis of media coverage of the 1979 Revolution in Iran, for example, centred on the media's creation of notions such as a "penchant for Shiite martyrdom" and the "return of Islam", Western tropes that supplanted the causes of the Iranian movement and tried to speak for Iranians themselves. This language, Said argued, obscured the complexities and contradictions within the region and within Islam itself.
We see this echoed now through the new gatekeepers of information and culture: social media platforms. In today's digital orientalism, the media narratives of Said's era have been adopted by the new gatekeepers of news and information and incorporated into the community guidelines of social media platforms. For example, Facebook's application of its Dangerous Individuals and Organisations policy is a quagmire of problems within the Middle East and North Africa. The Western-centric origins of the policies determining who these individuals and organisations are mean that Facebook relies on the US Department of State's Foreign Terrorist Organizations (FTO) list for its removal of accounts and content. The FTO list overwhelmingly comprises Islamist entities, as opposed to other lists such as the United Nations Security Council's terrorist list, whose designations have a more global and religiously diverse distribution.
This has become a major problem for freedom of expression, often hindering mere speech about events or news related to these designated FTO entities, which are often topics of everyday life and governance in the region. As Marwa Fatafta notes in this collection, Facebook has been known to flag words such as Shaheed, a common word within the Islamic, Persian, and Arabic lexicon, under its Dangerous Individuals and Organisations policy. Shaheed is a generic term for martyr in Arabic, but the company's content moderation automatically equates the word with terrorism, feeding into the Orientalist and Islamophobic conception that equates Islam with terrorism.
Digital Orientalism and Palestine
We now examine these issues in the context of the May 2021 crisis in Israel and Palestine, triggered by the movements against forced evictions in the Palestinian Jerusalemite neighborhood of Sheikh Jarrah. Social media became central in two ways during the crisis: first, as a protest and advocacy mechanism for Palestinian rights; and second, as a means of documenting possible war crimes.
Since 2016, digital repression against pro-Palestine content has been on the rise. Mounting evidence throughout May 2021 demonstrated a continued pattern of online discrimination by platforms, one that the Palestinian digital rights organisation 7amleh has been documenting for years.
The Israeli Cyber Unit has indicated in the past that 85 percent of its government requests to "remove content deemed harmful or dangerous" from platforms such as Facebook, Google, and Twitter are accepted. In September 2016, Facebook complied with deletion orders after Israel threatened to block the platform in the country. This move resulted in the deletion of many accounts of Palestinian activists and journalists. Following online protests, the company restored these accounts and apologised. The pattern repeated itself in 2021. On 13 May 2021, the Israeli Justice Minister held a Zoom meeting with Facebook and TikTok executives to urge them to remove "anti-Israel" content. The power of this Israeli pressure has been well documented and felt: hundreds of accounts, pages, and groups associated with Palestinian activists and media outlets have been documented as deleted or stripped of their content.
As mounting evidence shows the increasing erosion of Palestinians' rights, including the rights to protest, to life, and to worship, the coinciding online censorship has cemented fears that the systems of Israeli repression are being replicated online. While official statements by Facebook explained the initial issues as technical "glitches", digital rights activists have issued statements voicing their dissatisfaction with Facebook's accountability and investigations. Internal leaks have revealed that more systemic issues are at play behind the censorship on Facebook's platforms, beyond mere technical error. Hashtags such as "Al Aqsa Mosque" were systematically blocked for reasons unrelated to the technical glitch Facebook announced as causing the problems on Instagram. Further investigations reveal that problematic and discriminatory policies are at work, especially in Facebook's development of policies surrounding speech critical of "Zionism", the founding political ideology of Israel, which in turn limits pro-Palestinian speech.
Facebook announced that the situation that started in Sheikh Jarrah and led to the aerial bombardment of Gaza prompted it to develop an Israel-Palestine crisis centre. There is skepticism that the centre will do much beyond further emboldening existing pro-Israeli and anti-free-speech policies. But there is hope that the recent uptake of the Arabic digital rights movement by civil society and media, pressuring and seeking accountability from these companies, will lead to a shift in policies and prioritisation in the region. This public relations crisis for Facebook has highlighted that Arabic content moderation policies validate a theory of digital orientalism: users in this region are systemically relegated to a second-tier status, with less free speech and community support than other regions and languages.
Conclusion: The Future of Arabic Content Moderation
When Arab activists noticed the systematic repression of pro-Palestinian opinions by social media platforms, they took several steps to continue expressing their voices online. They promoted a campaign to downgrade Facebook's rating on Google Play and the Apple App Store; they used petitions, open letters, and articles to pressure social media companies to stop their algorithmic oppression; and they innovatively manipulated algorithms by tweaking written Arabic text, adding asterisks between letters, removing a letter from a word, adding "tanween" to words and hashtags, or changing the order of letters. One innovative approach has been to use old dot-less Arabic cryptology. Social media AI is trained to read and analyse standard Arabic letters, the ones with dots in them, and this dot-less tactic prevents takedowns of online content. While these measures seem promising, the digital repression facilitated by platforms' machine learning remains concerning. This chess-like game between Arab activists and platforms' architectural design is unbalanced and unfair: all the tactical innovations by Arab activists remain reactive and defensive against discriminatory systems that deprioritize their speech.
The digital orientalism applied to pro-Palestine content has silenced and censored the voices of hundreds of thousands of the Internet's Arab users and their networks. These practices have also assisted Israel in erasing, or drawing attention away from, evidence of its war crimes and human rights violations, and in weakening campaigns for Palestinian solidarity. Despite the grim events, the mobilization around Sheikh Jarrah has succeeded in generating an unprecedented amount of interest in the unfair practices and design of Arabic content moderation, across social media discourse, organic campaigns protesting policies, and media coverage.
We have contextualised the new digital orientalism of platform governance within the earlier framework of media orientalism that Edward Said conceived. That theory of media orientalism has underpinned much of the Islamophobic media tropes pervading Western society, and those tropes have seeped into the policies of social media companies, the new gatekeepers of information. The same colonial infrastructures that subjugate and repress Palestinians under an apartheid state manifest themselves online in the unequal conditions afforded to Arab users and the preferential treatment given to Israel. As Zuckerberg alluded to in his 2012 pitch to investors, these voices cannot be ignored.
In a letter to investors during Facebook's Initial Public Offering (IPO).
 Adrian Chen, “Mark Zuckerberg Takes Credit for Populist Revolutions Now That Facebook's Gone Public,” Gawker, February 2, 2012, http://gawker.com/5881657/facebook-takes-credit-for-populist-revolutions-now-that-its-gone-public.
 B’Tselem, “A Regime of Jewish Supremacy from the Jordan River to the Mediterranean Sea: This Is Apartheid,” B’Tselem, January 12, 2021, https://www.btselem.org/publications/fulltext/202101_this_is_apartheid; Human Rights Watch, “Israeli Authorities and the Crimes of Apartheid and Persecution | HRW,” April 27, 2021, https://www.hrw.org/report/2021/04/27/threshold-crossed/israeli-authorities-and-crimes-apartheid-and-persecution.
 Hunt Allcott and Matthew Gentzkow, “Social Media and Fake News in the 2016 Election,” Journal of Economic Perspectives 31, no. 2 (May 1, 2017): 211–36, https://doi.org/10.1257/jep.31.2.211.
 Paul Mozur, “A Genocide Incited on Facebook, With Posts From Myanmar’s Military,” New York Times, October 15, 2018, https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html.
 Max Fisher, “Inside Facebook’s Secret Rulebook for Global Political Speech – The New York Times,” New York Times, December 27, 2018, https://www.nytimes.com/2018/12/27/world/facebook-moderators.html.
The page would later be credited with mobilising the revolutionary momentum of the uprising that began on 25 January 2011 and would remove Hosni Mubarak.
 Anver Emon, Ellen Lust, and Audrey Macklin, “We Are All Khaled Said: An Interview with the Administrators of the Facebook Page That Fueled the Egyptian Revolution,” Boston Review 3, no. 1–2 (2011), https://bostonreview.net/archives/BR36.6/khaled_said_facebook_egypt_revolution.php.
 Jillian C. York, Silicon Values : The Future of Free Speech Under Surveillance Capitalism (Verso, 2021), 70.
 William Lafi Youmans and Jillian C. York, “Social Media and the Activist Toolkit: User Agreements, Corporate Interests, and the Information Infrastructure of Modern Social Movements,” Journal of Communication 62, no. 2 (April 1, 2012): 315–29, https://doi.org/10.1111/j.1460-2466.2012.01636.x.
 Abdul Rahman Al Jaloud et al., “Caught in the Net: The Impact of ‘Extremist’ Speech Regulations on Human Rights Content,” Electronic Frontier Foundation, May 30, 2019, https://www.eff.org/wp/caught-net-impact-extremist-speech-regulations-human-rights-content; Sarah El Deeb, “History of Syria’s War at Risk as YouTube Reins in Content,” AP NEWS, September 13, 2017, sec. AP Top News, https://apnews.com/d9f1c4f1bf20445ab06cbdff566a2b70; Kate O’Flaherty, “YouTube Keeps Deleting Evidence of Syrian Chemical Weapon Attacks,” Wired UK, 2018, https://www.wired.co.uk/article/chemical-weapons-in-syria-youtube-algorithm-delete-video.
 Mona Elswah and P. N. Howard, “The Challenges of Monitoring Social Media in the Arab World: The Case of the 2019 Tunisian Elections,” Data Memo 2020.1, Computational Propaganda Research Project (Oxford Internet Institute, University of Oxford, 2020), https://comprop.oii.ox.ac.uk/research/posts/the-challenges-of-monitoring-social-media-in-the-arab-world-the-case-of-the-2019-tunisian-elections/.
The Facebook Ad Library was released in 2019 as a hub where Facebook shows its running ads. It is also used to archive political ads and present additional information on them (e.g., target audience, budget, sponsor). However, the archiving of political ads is not active in all countries and is not enabled in the Arab region.
 AccessNow, “Open Letter to Facebook on the Upcoming Tunisian Elections of 2019,” Access Now (blog), September 2, 2019, https://www.accessnow.org/open-letter-to-facebook-regarding-the-upcoming-tunisian-elections-of-2019/.
 Masaar, “Statement from Global Civil Society on the Impact of Facebook, Google and Twitter,” Massar (blog), January 22, 2021, https://masaar.net/en/statement-from-global-civil-society-on-the-impact-of-facebook-google-and-twitter-concern-for-democracy-and-human-rights-must-not-end-at-the-uss-borders/.
 Ryan Mac, “Instagram Labeled One Of Islam’s Holiest Mosques A Terrorist Organization,” BuzzFeed News, May 12, 2021, https://www.buzzfeednews.com/article/ryanmac/instagram-facebook-censored-al-aqsa-mosque.
 York, Silicon Values : The Future of Free Speech Under Surveillance Capitalism, 20.
This indicates the first time, to our knowledge, that these forms of bias were reported by civil society organizations and media; it does not mean they were not taking place earlier.
Palestine-based digital rights activists at 7amleh have called this a "digital divide" in the past; however, as of May 2021, 7amleh has been using the term "digital apartheid" (7amleh, 2017; Nashif, 2017).
 United Nations, “UN Human Rights Chief Appeals for De-Escalation in Israel-Palestine Crisis,” UN News, May 15, 2021, https://news.un.org/en/story/2021/05/1092012.
 Edward W. Said, Covering Islam: How the Media and the Experts Determine How We See the Rest of the World (London: Vintage, 1997).
US Department of State, "Foreign Terrorist Organizations," https://www.state.gov/foreign-terrorist-organizations/.
 United Nations, “United Nations Security Council Consolidated List,” https://scsanctions.un.org/consolidated/
 Lawrence Pintak, “The Trump Administration’s Islamophobic Holy Grail,” Foreign Policy, February 22, 2017; Isobel Cockerell, “Instagram Shuts down Iranian Accounts after Soleimani’s Death,” Coda Story, January 10, 2021, https://www.codastory.com/authoritarian-tech/instagram-iran-soleimani/; ARTICLE19, “Turkey: ARTICLE 19’s Submission to the Facebook Oversight Board,” ARTICLE19, May 4, 2021.
 Layla Mashkoor, “Sheikh Jarrah Content Takedowns Reveal Pattern of Online Restrictions in Palestine,” The National News, May 10, 2021, https://www.thenationalnews.com/mena/sheikh-jarrah-content-takedowns-reveal-pattern-of-online-restrictions-in-palestine-1.1220037.
 Access Now, “Analysis: Facebook Zionism Hate Speech Policy Proposal,” Access Now (blog), March 2, 2021, https://www.accessnow.org/facebook-hate-speech-policy-zionism/.
 7amleh, “The Attacks on Palestinian Digital Rights: Progress Report,” 7amleh, May 21, 2021.
 7amleh, “Facebook and Palestinians: Biased or Neutral Content Moderation Policies?,” October 29, 2018, https://7amleh.org/2018/10/29/7amleh-releases-policy-paper-facebook-and-palestinians-biased-or-neutral-content-moderation-policies.
 Shahar Ilan, “Israeli Official Reports Increased Cooperation on Removing Content from Social Media,” CTech, December 29, 2017, https://www.calcalistech.com/ctech/articles/0,7340,L-3728439,00.html; Anan Abu Shanab, “Connection Interrupted: Israel’s Control of the Palestinian ICT Infrastructure and Its Impact on Digital Rights,” 7amleh, January 31, 2019, 44.
 Glenn Greenwald, “Facebook Says It Is Deleting Accounts at the Direction of the U.S. and Israeli Governments,” The Intercept, December 30, 2017, https://theintercept.com/2017/12/30/facebook-says-it-is-deleting-accounts-at-the-direction-of-the-u-s-and-israeli-governments/.
 Ylenia Gostoli, “Is Facebook Neutral on Palestine-Israel Conflict?,” Al Jazeera, December 26, 2016, https://www.aljazeera.com/news/2016/09/26/is-facebook-neutral-on-palestine-israel-conflict/.
 Sophia Hyatt, “Facebook ‘Blocks Accounts’ of Palestinian Journalists,” 2016, https://www.aljazeera.com/news/2016/9/25/facebook-blocks-accounts-of-palestinian-journalists.
 Emily Birnbaum, “Facebook Meets with Israeli and Palestinian Officials to Discuss Online Hate Speech, Threats as Violence Escalates,” May 14, 2021, https://www.politico.com/news/2021/05/14/facebook-israel-palestine-hate-speech-488400.
Access Now, "Analysis: Facebook Zionism Hate Speech Policy Proposal."
 Matthew Ingram, “Social Networks Accused of Censoring Palestinian Content – Columbia Journalism Review,” Columbia Journalism Review, May 19, 2021, https://www.cjr.org/the_media_today/social-networks-accused-of-censoring-palestinian-content.php.
Mac, "Instagram Labeled One Of Islam's Holiest Mosques A Terrorist Organization."
 Sam Biddle, “Facebook’s Secret Rules About the Word ‘Zionist’ Impede Criticism of Israel,” The Intercept, May 14, 2021, https://theintercept.com/2021/05/14/facebook-israel-zionist-moderation/.
 Elizabeth Culliford, “Facebook Deploys Special Team as Israel-Gaza Conflict Spreads across Social Media,” Reuters, May 19, 2021, https://www.reuters.com/technology/facebook-running-special-center-respond-content-israeli-gaza-conflict-2021-05-19/.
 Marwa Fatafta, “Palestine-Israel Facebook Special Operations,” Tweet, Twitter, May 20, 2021, https://twitter.com/marwasf/status/1395465697908727809.
 murdockism, “1 Star Facebook Ratings,” Tweet, Murdockism Tweet (blog), May 18, 2021, https://twitter.com/murdockism/status/1394790980679782402.
 SMEX, “#SavePalestinianVoices: Tech Companies Must Stop Silencing Palestinian Content,” Petition, Change.org, 2021, https://www.change.org/p/facebook-savepalestinianvoices-tech-companies-must-stop-silencing-palestinian-content.
 Mada Masr, “The Arab Revenge,” Mada Masr, May 18, 2021, https://www.madamasr.com/ar/2021/05/18/feature/%d8%b3%d9%8a%d8%a7%d8%b3%d8%a9/%d8%a7%d9%84%d8%a7%d9%ae%d8%b3%d8%a7%da%ba-%d8%b5%d8%af-%d8%a7%d9%84%d8%a7%d9%84%d9%87-%d9%ae%d9%88%d8%b1%d9%87-%d8%a7%d9%84%d9%ae%d9%af%d8%a7%d8%b7-%d8%a7%d9%84%d8%b9%d8%b1%d9%ae%d9%89%d9%87/.