As we have noted, social content moderation has long been a privilege of the platforms. However, two major political events in 2016 – the launch of the Brexit process and the US elections – led to a reconsideration of their role, both in the US and worldwide. Targeted ads spreading fake news to voters in the UK and the US raised concerns over the standards of moderation and over regulatory control of platforms. The mass spread of hate speech and disinformation led the platforms to police their content ever more actively, and regulatory pressure exerted by various States contributed to this. In 2017, Germany passed the Network Enforcement Act (NetzDG), a piece of legislation targeting social networks with more than two million registered users in Germany – namely Facebook, Twitter, and YouTube. This law did not introduce new criminal offences but sought to enforce the existing criminal legislation for specific offences [47].
In the meantime, the platforms began to engage in cartel-like behavior, whereby the removal of a piece of content by one platform would soon trigger its removal on other platforms [48]. This effect was beneficial for the suppression of terrorist recruitment and propaganda, as it led to the creation of the Global Internet Forum to Counter Terrorism – a forum for the exchange of information on terrorist practices in social networks. With the start of the COVID-19 pandemic, platforms moved aggressively to take down much of the disinformation and many of the conspiracy theories related to the pandemic. In this same period, Twitter flagged posts by the US President as factually untrue and limited the ability to distribute his tweets, thereby beginning to act as an editor and going beyond its obligations under Section 230 of the CDA [49]. This prompted fears of censorship and, as a political response, calls for amendments to Section 230 of the US CDA and for a regulatory approach that would use competition law to limit the power of the big companies [50].
These challenges to content moderation policy from both US and non-US regulators marked the end of the era in which the internal legal standards of the platforms were grounded in the permissive liberal tradition [51]. The platforms’ internal deliberation on content began to include elements of an approach typical of judicial deliberation before international human rights tribunals, including invocations of the International Covenant on Civil and Political Rights [52].
Once the social platforms began placing greater weight on balancing their traditionally liberal approach to content moderation against the pressure from national regulators, they found themselves in the ecosystem of inter-legality. Rules of three different legal orders apply here: international law, national regulation, and the private moderation standards developed by the platforms. Surely, the rules of international law, enshrined primarily in Articles 19 and 20 of the International Covenant on Civil and Political Rights, were relevant even when the platforms had just begun operating. However, it was only in 2018 that the UN Special Rapporteur highlighted international law as a source of normativity that should govern content moderation [53]. While academics have called for its application [54], a specific reference to this body of law was made only in January 2021, with the first decisions of the Facebook Oversight Board.
The second legality comes from the national regulators, which, as noted above, have become increasingly interested in regulating the platforms through legislation. The tools allowing regulators to challenge the platforms are now not only legal but also technological: they include the ability not only to impose fines but also to order a reduction in bandwidth or a complete block of traffic from a particular site [55]. Further, both the German law and other recently proposed legislation aim to apply extraterritorially by limiting access to content that originated outside these jurisdictions [56]. The recent decision of the Court of Justice of the EU in an Austrian defamation case went in the same direction: if a court in one member State orders a takedown of social media content, the platform is obliged to carry out the takedown by effectively banning the content in all EU member States, regardless of whether it was used for purposes other than defamation [57]. Thus, these laws and decisions limit the right to receive foreign speech as an integral part of freedom of expression [58].
Surely, the current state of technology enables the use of so-called Virtual Private Networks (VPNs), software that in some situations allows users in one jurisdiction to access Internet content banned there, making the reach of the aforementioned solutions imperfect [59]. However, this software is neither accessible nor affordable for many users, and its future availability and development are impossible to predict. The other alternative – users migrating to platforms more supportive of the kind of speech and content moderation they prefer – would also leave many of the problems unsolved. Suggested as the “market response” to the issue of platforms as early as 1995 [60], this method would fragment speech forums along political and cultural lines and would merely increase the number of platforms targeted by legislators.
Content moderation standards remain the third normativity governing the situation of platforms as mediators in the free speech triangle. The pressure we have described has revealed access to justice as the main problem of this normativity. It was only in October 2020 that Facebook introduced a clear complaint procedure for removed content or for requests that content be removed. Even so, the lengthy period between complaint and decision ensures neither a speedy resolution nor the protection of the rights of speakers. This third normativity thus remains essential for users and for the protection of their right to freedom of speech. However, this normativity balances not only the demands of national regulators and (to a much lesser extent) international law, but also proportionality – enshrined in the decisions of human content moderators – against probability – enshrined in the automatic takedowns of content performed by AI [61]. Thus, within this normativity, both inter-legal and intra-legal balancing [62] occur: inter-legal, when the platforms decide whether to follow their own guidelines or the freedom of speech standards existing within a jurisdiction; and intra-legal, when they run proportionality tests between the different principles and values on which their governance rules are based.

4. The Virtuality of the Vantage Point and a Three-Step Analysis

It is in this clash between the three legal orders (the national, the international, and that of the platform) that content moderation and, consequently, the limits to freedom of speech are shaped. Taking an inter-legality approach, we analyze this dilemma in three steps: (i) taking the vantage point of the affair – the case at hand – seriously; (ii) understanding the relevant normativities controlling the case; and (iii) looking at the demands of justice stemming from the case [63].
Regarding the first step, the angle of the case is fundamental to an inter-legality approach. As Palombella and Klabbers claim,
one does not need the ascent to a juridical heaven for ready-made and principled justice – a deracinated, universalist point – to realize that different legal orders may overlap normatively and reach beyond their own limits. On the contrary, an inter-legality perspective simply happens to be taken as soon as the vantage point of the concrete affair under scrutiny – the case at hand – is taken seriously [64].
Notes
[47] Gesetz zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken (Netzwerkdurchsetzungsgesetz – NetzDG) (2017), available at https://www.gesetze-im-internet.de/netzdg/BJNR335210017.html. For the platforms, it establishes the obligation to remove content featuring the use of symbols of unconstitutional organizations, the formation of terrorist organizations, incitement of the masses (including denial of the Holocaust), child pornography, insult, malicious gossip, defamation, violation of intimate privacy by taking photographs or other images, and threats to commit a serious criminal offence. See P. Zurth, The German NetzDG as Role Model or Cautionary Tale? Implications for the Debate on Social Media Liability, in «Fordham Intellectual Property, Media and Entertainment Law Journal», 2021, pp. 1084 ff.
[48] E. Douek, The Rise of Content Cartels, Knight First Amendment Institute at Columbia University, 2020.
[49] Trump’s social media bans are raising new questions on tech regulation, CNBC, January 11, 2021, available at https://www.cnbc.com/2021/01/11/facebook-twitter-trump-ban-raises-questions-in-uk-and-europe.html.
[50] I. Brown, Interoperability as a Tool for Competition Regulation, OpenForum Academy, 2020 (explaining the different regulatory approaches that could limit the power of the platforms).
[51] J. Thai, Facebook’s Speech Code and Policies: How They Suppress Speech and Distort Democratic Deliberation, in «American University Law Review», 69, 2020, pp. 1641 ff.
[52] The first few decisions of the Facebook Oversight Board, issued in January 2021, with their invocation of Articles 19 and 20 of the ICCPR as well as of the United Nations Human Rights Committee, are instructive in this sense. See Case Decision 2020-003-FB-UA, available at https://oversightboard.com/decision/FB-QBJDASCV/.
[53] Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, No. A/HRC/38/35 (June 2018).
[54] See S. Benesch, But Facebook’s Not a Country: How to Interpret Human Rights Law for Social Media Companies, in «Yale Journal on Regulation», 2020, pp. 86 ff.
[55] For example, the Turkish legislator has considered such a solution. See the recent discussion in Turkish law tightening rules on social media comes into effect, Euronews, October 1, 2020, available at https://www.euronews.com/2020/10/01/turkish-law-tightening-rules-on-social-media-comes-into-effect.
[56] See Zurth, The German NetzDG, cit., p. 257.
[57] Glawischnig-Piesczek v. Facebook Ireland (C-18/18).
[58] See the excellent discussion in J. Thai, The Right to Receive Foreign Speech, in «Oklahoma Law Review», 71, 2018, pp. 269 ff.
[59] See T. Sardá, S. Natale, N. Sotirakopoulos and M. Monaghan, Understanding Online Anonymity, in «Media, Culture & Society», 41, 2019, n. 4, pp. 559 ff.
[60] See D.G. Post, Anarchy, State, and the Internet: An Essay on Law-Making in Cyberspace, in «Journal of Online Law», 1995, article 3, available at SSRN: https://ssrn.com/abstract=943456.
[61] See Douek, The Rise of Content Cartels, cit., p. 52.
[62] See the chapter by Gabriel Encinas in this volume.
[63] See Klabbers and Palombella, Introduction, cit., pp. 1-16.
[64] Ibidem, p. 2.