In the US, the Communications Decency Act § 230 (CDA) shielded platforms with broad immunity from liability for user-generated content [18]. The first relevant case concerning the interpretation of the immunity granted under § 230 (also considered the most important case of Internet law to date) [19] was Zeran v. America Online [20]. Plaintiff Zeran claimed that America Online Inc. (AOL) was liable for defamatory messages posted by an unidentified third party. He argued that AOL had a duty to remove the defamatory post, notify its users of the post’s incorrectness, and screen future defamatory material [21]. However, the Court found that AOL was not liable, as § 230 provided it with federal immunity [22]. According to the Court, a purposive reading of the provision demonstrated that “the imposition of tort liability on service providers for the communications of others represented, for Congress, simply another form of intrusive government regulation of speech. Section 230 was enacted, in part, to maintain the robust nature of Internet communication and, accordingly, to keep government interference in the medium to a minimum” [23]. Thus, according to the Court, the “specter of tort liability” is precluded in order to avoid a chilling effect [24]. Furthermore, the holding encouraged “service providers to self-regulate the dissemination of offensive material over their services” [25].
Similarly, EU law, through the E-Commerce Directive, provided online intermediaries with liability exemptions for illegal content shared by users [26]. The two main reasons for this special regime were the intermediaries’ lack of effective control over, and knowledge of, the content generated by users, and the desire of the EU institutions to enhance the digital economy. The mechanism of “notice and takedown” was therefore considered an effective solution for illicit content, given the platforms’ lack of awareness of their users’ activities on social networks. As such, the platforms were free to remove or block illicit content once they became aware of its presence. They were granted a sort of “private” discretion over the fundamental rights of users, particularly the freedom of expression and the right to respect for private life. This delegation of public power by public actors ensured the effective implementation of public policies online [27]. The documents of the European institutions have also confirmed this public role of online platforms by clearly setting out rules, guidelines, and principles to fight illegal and harmful content, and by ensuring the platforms’ responsibility [28].
The case law of the European courts has also demonstrated this quasi-public “private” discretion. In 2000, the Tribunal de Grande Instance of Paris ordered Yahoo! to take all measures necessary to prevent access in France to an auction site selling Nazi objects, in accordance with the French law prohibiting such sales [29]. On the one hand, the case clearly demonstrated the inadequacy of the State in monitoring every criminal act occurring online [30], while on the other, it illustrated the power of private Internet intermediaries in enforcing fundamental rights [31]. The issue at stake was the balance between the company’s freedom of speech and the other rights of the claimants. Owing to the limited power of the State over the Internet, the private company was held responsible by the Court for user-generated content on its website. The case was the first sign of a door opening to the “privatization of right enforcement”, thus adding to the power of private entities in enforcing rights [32]. This “privatization” trend has also been named “the invisible handshake”, marking the peculiar nature of the collaboration between private actors and public functions [33].
Therefore, the platforms regulated access to content, but carried out this sui generis activity mainly on the basis of their own definition of freedom of speech standards [34]. In the early phase of platform operation, this responsibility meant that users participated in content moderation by flagging offensive and undesirable content. However, the weaknesses of such an approach materialized as the number of users grew and as reports for content takedown intensified. The platforms responded by creating their own elaborate content rules, such as Facebook’s “Community Standards” [35], which govern content and are divided into six sections: violence and criminal behavior, safety, objectionable content, integrity and authenticity, respecting intellectual property, and content-related requests [36].
With the platforms playing such a prominent role in shaping speech, free speech was transformed from a relationship between citizens and the State into a triangle composed of speakers, governments, and private governance actors, namely the platforms [37]. In this phase, which lasted approximately until 2016, the platforms played a role in expanding the borders of free speech, a role also recognized by the US Supreme Court, which praised the “vast democratic forums of the Internet” [38].
Still, the overwhelming amount of criticism and content takedowns led Facebook to announce the creation of an independent and global body to make decisions about user-generated content [39]. Called the Oversight Board, this body, composed of prominent legal experts and former judges, was established in 2020. Its function, defined in the Oversight Board Bylaws, is to “protect freedom of expression by making principled, independent decisions about important pieces of content and by issuing policy advisory opinions on Facebook’s content policies” [40]. In October 2020, the Board started hearing cases, limited to what Facebook itself referred to as “highly emblematic cases”, and released its first decisions in January 2021 [41].
In this way, online platforms have become first the “legislators”, by drafting their own rules; then the “courts”, by deciding on the basis of these rules; and finally the “executives”, by enforcing the decisions they have made. This perspective on the Internet threatens the separation of powers, according to which the authority of the State is divided into three branches, a legislature, an executive, and a judiciary, which mutually check and balance each other [42]. This concentration of powers in the hands of private entities [43] has led some authors to compare the regulation of speech on the platforms to a new form of feudalism [44].
These trends have demonstrated that the platforms cannot be considered solely within the private realm. Indeed, the Supreme Court of the United States acknowledged as much in 2017 in the case of Packingham v. North Carolina, viewing social platforms as the “modern public square” and “perhaps the most powerful mechanisms available to a private citizen to make his or her voice heard” [45]. Silhouetting the online form of a State, social platforms pose the question of whether they should be considered a legal order in the ecosystem of inter-legality. As the inter-legal approach “does not itself decide what counts as a legal order” [46], we use it to explain the legal relations between the platforms, national regulators, and international law, as well as to demonstrate the usefulness of the approach in (quasi-)judicial decision-making.

3. Social Platforms in the Ecosystem of Inter-legality

As we have noted, content moderation on social platforms has long been a privilege of the platforms themselves. However, two major political events in 2016 – the launching of the Brexit process and the US elections – led to a reconsideration of their role both in the US and worldwide. Targeted ads spreading fake news, used to influence voters in the UK and the US, raised concerns over the standard of moderation and regulatory control over platforms. The mass spread of hate speech and disinformation led the platforms to police their content ever more closely, and the regulatory pressure exercised by various States contributed to this. In 2017, Germany passed the Network Enforcement Act (NetzDG), a piece of legislation targeting social networks with more than two million registered users in Germany – namely Facebook, Twitter, and YouTube. This law did not introduce new criminal offences but sought to enforce the existing criminal legislation with respect to specific criminal acts [47].
Note
[18] Communications Decency Act (1996), 47 USC § 230(c)(1), states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider”.
[19] E. Goldman and J. Kosseff, Commemorating the 20th Anniversary of Internet Law’s Most Important Judicial Decision, in E. Goldman and J. Kosseff (eds.), Zeran v. America Online E-book at https://digitalcommons.law.scu.edu/cgi/viewcontent.cgi?article=3286&context=historical, 2020, p. 6.
[20] United States Court of Appeals, Fourth Circuit, Zeran v. America Online Inc. 129 F.3d 327, 1997.
[21] Ibidem.
[22] Ibidem.
[23] Ibidem.
[24] Ibidem. The judgment also held that Internet service providers were not liable even when they received notice of a potentially defamatory post.
[25] Ibidem. The conceptualization of online platforms within the First Amendment was another critical issue for the courts in the US. The reasoning analyzed analogies to the State, company towns, broadcasters, and editors. See this debate in Klonick, The New Governors: The People, Rules, and Processes Governing Online Speech, cit.
[26] Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain aspects of information society services, in particular electronic commerce, in the Internal Market (Directive on electronic commerce), OJL 178/1 of 17.7.2000.
[27] See the debate in Gregorio, From Constitutional Freedoms to Power of the Platforms: Protecting Fundamental Rights Online in the Algorithmic Society, cit.
[28] There are at least four leading documents to see the approach of the EU to online platforms: European Commission, COM (2016) 288 final, Online Platforms and the Digital Single Market Opportunities and Challenges for Europe; European Commission, COM (2017) 555 final, Tackling Illegal Content Online Towards an Enhanced Responsibility of Online Platforms; European Commission, C(2018) 1177 final, On Measures to Effectively Tackle Illegal Content Online; Council of the European Union, 2019, 12522/19, Progress on Combatting Hate Speech Online through the EU Code of Conduct 2016-2019 (all these documents have encouraged online platforms to take self-regulatory measures, and supported the special liability regime provided by the E-Commerce Directive).
[29] TGI Paris (22 May 2000) Licra et UEJF v. Yahoo! Inc. and Yahoo! France; US District Court for the Northern District of California (2001) Yahoo! Inc. v. La Ligue Contre Le Racisme, 169 F. Supp. 2d 1181 (ND Cal. 2001).
[30] See J.R. Reidenberg, States and Internet Enforcement, in «University of Ottawa Law & Technology Journal», 2004, pp. 215 ff.
[31] See M. Bassini, Fundamental Rights and Private Enforcement in the Digital Age, in «European Law Journal», 25, 2019, n. 2, pp. 188 ff.
[32] Ibidem (discussing the enforcement of the right to be forgotten recognized by the CJEU in the Case 131/12 Google Spain v. AEPD EU:C:2014:317, in the context of “privatization”).
[33] See M. Birnhack and N. Elkin-Koren, The Invisible Handshake: The Reemergence of the State in the Digital Environment, 2003, available at https://ssrn.com/abstract=381020.
[34] See K.S. Rahman, Democracy Against Domination, Oxford, Oxford University Press, 2016.
[35] See the Facebook Community Standards at https://www.facebook.com/communitystandards/
[36] Article 19, Facebook Community Standards Legal Analysis, June 2018, pp. 2-26. See the report on https://www.article19.org/wp-content/uploads/2018/07/Facebook-Community-Standards-August-2018-1-1.pdf.
[37] See J.M. Balkin, Free Speech is a Triangle, in «Columbia Law Review», 118(7), 2018, pp. 2011 ff.
[38] 137 S. Ct. 1730, 1735 (2017).
[39] K. Klonick and T. Kadri, How to Make Facebook’s “Supreme Court” Work, in «The New York Times», 17 November 2018, available at https://www.nytimes.com/2018/11/17/opinion/facebook-supreme-court-speech.html
[40] Facebook Oversight Board Bylaws, January 2020, available at https://about.fb.com/news/2020/05/welcoming-the-oversight-board/.
[41] B. Harris, Oversight Board to Start Hearing Cases, 22 October 2020, available at https://about.fb.com/news/2020/10/oversight-board-to-start-hearing-cases/.
[42] See L. Belli and J. Venturini, Private Ordering and the Rise of Terms of Service as Cyber-Regulation, in «Internet Policy Review», 5, 2016, n. 4.
[43] See ibidem.
[44] B. Schneier, Power in the Age of the Feudal Internet, in MIND, Collaboratory discussion paper #6 Internet & Security, 2013; L. Belli, Collaborative Policymaking: from Technical to Legal Interoperability. Presented at the XIX International Congress of Constitutional Law, Brasilia, Panel 7, 2016. Available at https://www.youtube.com/watch?v=KyQ5f--Yw44&t=236s.
[45] Packingham v. North Carolina (2017), 137 S. Ct. 1730.
[46] Klabbers and Palombella, Introduction, cit., p. 10.
[47] Gesetz zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken (Netzwerkdurchsetzungsgesetz - NetzDG) (2017). Available at https://www.gesetze-im-internet.de/netzdg/BJNR335210017.html. For the platforms, it establishes the obligation to remove content that features the use of symbols of unconstitutional organizations, the forming of terrorist organizations, incitement of the masses, including denial of the Holocaust, child pornography, insult, malicious gossip, defamation, violation of intimate privacy by taking photographs or other images, and threatening the commission of a serious criminal offence. See P. Zurth, The German NetzDG as Role Model or Cautionary Tale? Implications for the Debate on Social Media Liability, in «Fordham Intellectual Property, Media and Entertainment Law Journal», 2021, pp. 1084 ff.