Enforcement Through the Network: The Network Enforcement Act and Article 10 of the European Convention on Human Rights
This Comment explores the conflict between freedom of expression as defined by states and the autonomy of social media companies to regulate content on their platforms, through the lens of the Network Enforcement Act, passed by Germany in 2017, and the freedom of expression clause of the European Convention on Human Rights. The Network Enforcement Act, which compels social media companies to monitor and remove content from their sites that violates certain other provisions of German law, has thrust the issues of intermediary autonomy and censorship-by-proxy into the spotlight. Proponents support the law as a way to ensure that what is illegal offline remains illegal online. Opponents argue that the law essentially amounts to censorship and therefore violates freedom of expression under the German constitution and a host of international treaties. This Comment finds that while the law likely does not violate freedom of expression as enumerated under Article 5 of the Basic Law for the Federal Republic of Germany, it may violate freedom of expression under Article 10 of the European Convention on Human Rights, in part because the law incentivizes “overblocking,” which could lead to the removal of lawful speech without due process. To promulgate such regulations legitimately, countries need to band together to promote safety and international security without curtailing civil rights.
In January 2018, the Gesetz zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken1 went into full effect in Germany.
Gesetz zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken [Network Enforcement Act], Sept. 1, 2017, Bundesgesetzblatt, Teil I [BGBl I] at 3352 (Ger.).
An official English translation is available at http://perma.cc/72JK-3KNM.
The law is also referred to by its abbreviated German names, NetzDG or Netzwerkdurchsetzungsgesetz.
Philip Oltermann & Thomas Furmann, Tough New German Law Puts Tech Firms and Free Speech in Spotlight, The Guardian (Jan. 5, 2018), http://perma.cc/ENU5-4FKZ.
Stefan Engels, Network Enforcement Act in a Nutshell, DLA Piper Blog: IPT Germany (Jan. 31, 2018), http://perma.cc/E6QT-VSEL.
Netzwerkdurchsetzungsgesetz, supra note 1, at § 1(2).
Id. at § 1(3).
Id. at § 3(2)(2).
Id. at § 3(2)(3).
Id. at § 4(2).
Since the law went into effect in January 2018, it has faced a bevy of complaints. This Comment focuses on one—whether the law violates the freedom of expression clause, Article 10, of the European Convention on Human Rights (ECHR).9
European Convention for the Protection of Human Rights and Fundamental Freedoms, opened for signature Nov. 4, 1950, Eur. T.S. No. 5, 213 U.N.T.S. 221, at art. 10.
Deutscher Bundestag: Plenarprotokoll 18/235 at 23848 (statement of Heiko Maas, Bundesminister BMJV), http://perma.cc/WKU4-HSGJ (Ger.).
Linda Kinstler, Can Germany Fix Facebook?, The Atlantic (Nov. 2, 2017), http://perma.cc/B7EF-UDS8.
Because of the punitive nature of the fines, social media companies are incentivized to err on the side of caution and remove any content that is reported. This includes unlawful content, but also clearly satirical tweets parodying actual illegal content12 and otherwise harmless13 posts.14
Emma Thomasson, Germany Looks to Revise Social Media Law as Europe Watches, Reuters (Mar. 8, 2018), http://perma.cc/ZM6T-76XU (explaining that Titanic, a satirical magazine, had its content removed for parodying the language of a tweet from a far-right German political party, which was also removed).
“Harmless” varies from person to person. There are a number of online posts, some of which will be discussed in this Comment, which cause real emotional harm to users. However, harmless is used here to mean not tending to call for or incite immediate violence or threatening physical harm against a given user.
Christof Kerkmann, German Court Overturns Facebook ‘Censorship,’ Handelsblatt Today (Apr. 13, 2018), http://perma.cc/S4LZ-DRCE (including examples of comments, such as “the Germans are becoming more and more stupid. No wonder, as they are being clobbered daily by left-wing media with fake news about skilled workers, declining unemployment or Trump.”).
Facebook has allegedly recruited “several hundred staff” to deal with complaints.15
Germany Starts Enforcing Hate Speech Law, BBC News (Jan. 1, 2018), http://perma.cc/23QB-MXB7.
Bitkom is the German Association for IT, Telecommunications, and New Media. For more information see http://perma.cc/T56F-EGUT.
Guy Chazan, Berlin Forced to Defend Hate Speech Law, Fin. Times (Jan. 5, 2018), http://perma.cc/8GAN-72TF.
See generally Jarred Prier, Commanding the Trend: Social Media as Information Warfare, 11 Strategic Stud. Q. 50 (2017).
To say that the law is controversial is an understatement. Whether the European Court of Human Rights (ECtHR) would find that it in fact violates freedom of expression, however, is another story. While the law, and the way the lower courts are currently enforcing it, may harm individual freedom of expression, the legitimate national security and safety concerns behind it could allow the law to be upheld without further adjustment. This would alter the way these claims are handled: rather than the government or another complaining individual having to go to court to remove speech, affected individuals will have to go to court to get their accounts and posts reinstated. This could create a chilling effect on speech severe enough to warrant the ECtHR striking down at least part of the law. However, as this Comment explains, it is debatable whether the Network Enforcement Act is uniquely to blame for this problem, and it is unclear whether striking down the law would resolve these free expression concerns.
Section II addresses the history of the Network Enforcement Act. Although current discourse about controlling online speech has centered on fake news in the wake of the 2016 U.S. election, the passage of the Network Enforcement Act is the culmination of a decade of growing tension in Europe between lawmakers and social media companies as both attempt to combat terrorism. It also explains how the “Brussels Effect”19 allows German and E.U. regulation of the internet to reach far beyond their borders.
The Brussels Effect is the term coined to describe the E.U.’s growing ability to control and affect international regulations without entering into formal international agreements. For more information see Section II(C).
Section III briefly examines the history of the ECHR’s Article 10 and the role of freedom of expression in Europe. Because the ECHR was ratified in the shadow of World War II, its goals are rooted in a historical moment very different from the one that the mostly U.S.-based social media companies are accustomed to. This means that, although the Network Enforcement Act’s goals instinctively seem to violate the traditional definition of freedom of expression under which many of the affected social media companies operate, the Act is not necessarily antithetical to the historical goals of the treaty.
In Section IV, this Comment determines whether the Network Enforcement Act indeed violates freedom of expression under Article 10. Because the state has a negative obligation not to interfere with freedom of expression, and penalties are generally considered interferences, Article 10 is implicated. Although the goals the legislature is attempting to promote through this interference are legitimate, and the law is arguably necessary, the lack of oversight and the disproportionate fines mean that the ECtHR should find that the law violates Article 10. However, this Comment concludes that some form of regulation of social media companies is necessary on an international scale in order to maintain a unified digital environment. Finding the correct balance between maintaining freedom of expression and promoting other interests, such as the right to privacy or national security, is increasingly crucial, and increasingly difficult, as expression moves away from public, government-sponsored forums to privately owned ones.
In order to understand the Network Enforcement Act’s interaction with free expression rights, it is necessary to examine the law itself, as well as the forces that led to its passage. As Section II(A) discusses, the Network Enforcement Act must be understood in the context of the recent intensification of xenophobia in Europe, as well as the advent of terrorist attacks—coordinated online—in major European cities. Yet, lawmakers’ attempts to curtail this unlawful speech arguably harm the free expression rights of their citizens when they go beyond standard national security justifications. In Section II(B), this Comment examines what the Network Enforcement Act actually does. The law acknowledges that by the time a given piece of media makes its way through the court system, it may be too late; because of the internet, the effects of harmful speech or images can multiply in seconds. By moving the adjudication process from the courts to social media companies and speeding up the timeline, lawmakers are responding to real problems with monitoring online content, but in a way that arguably causes more harm.
Indeed, as Section II(C) discusses, because of the way the Network Enforcement Act is being interpreted, the German Bundestag is arguably extending domestic law far outside Germany’s borders. Because the internet has no boundaries, when German courts ask for content to be “removed,” they can, and have, asked companies to remove it anywhere a German citizen might view it. With current technology, this means that German law is superseding international law and infringing on other countries’ citizens’ rights. Thus, although this is a German law, the ECtHR should adjudicate it.
In order to analyze the freedom of expression concerns, it is important to understand the context of the Network Enforcement Act. As discussed in more detail in Section IV, one defense to a violation of freedom of expression is a compelling state interest. Here, Germany has frequently asserted an interest in national security, namely blocking terrorist and extremist content on the internet.
While the Network Enforcement Act feels like a law rooted in fears about populism and foreign election tampering, in many ways the worries that led to the act’s passage came to a head in the wake of the 2015 Charlie Hebdo attacks in Paris. On January 7, 2015, twelve people, including four cartoonists, were murdered because of the magazine’s publication of a satirical “Prophet Mohammed” cartoon.20
Agnes Callamard, Religion, Terrorism, and Speech in a ‘Post-Charlie Hebdo’ World, 10 Relig. & Hum. Rts. 207 (2015).
Id. at 208. While this Comment attributes the Charlie Hebdo attacks to ISIS, they were actually claimed by Al-Qaeda in the Arabian Peninsula. See Catherine E. Schoichet & Josh Levs, Al Qaeda Branch Claims Charlie Hebdo Attack was Years in the Making, CNN (Jan. 21, 2015), http://perma.cc/F5BL-4PRK.
March 2016 brought another terrorist attack, this time in Brussels, a city known to be the center of privacy regulation in Europe.22
Brussels is home to the Brussels Privacy Hub, “an academic privacy research centre with a global focus . . . Brussels is where key decisions are taken on data protection in the European Union, and EU rules set the standard for data protection and privacy law around the world.” About the Brussels Privacy Hub, Brussels Privacy Hub, http://perma.cc/W9P6-AX89.
Council of the E.U. Press Release 158/16, Joint Statement of E.U. Ministers for Justice and Home Affairs and Representatives of E.U. Institutions on the Terrorist Attacks in Brussels on 22 March 2016 (Mar. 24, 2016), http://perma.cc/L8XJ-V59D.
Liat Clark, Facebook and Twitter Must Tackle Hate Speech or Face New Laws, Wired (Dec. 5, 2016), http://perma.cc/YCN7-4AP6.
Id. See also European Commission Factsheet, Code of Conduct on Countering Illegal Hate Speech Online: First Results on Implementation (Dec. 2016).
Danielle Keats Citron, Extremist Speech, Compelled Conformity, and Censorship Creep, 93 Notre Dame L. Rev. 1035, 1038 (2018).
See Eur. Comm’n, Priority: Digital Single Market, http://perma.cc/74TV-9684.
Clark, supra note 24.
Id.
By passing the Network Enforcement Act, Germany signaled that it considered Facebook’s and other social media companies’ responses since Charlie Hebdo to have been inadequate. Germany is not alone.30
See generally Giancarlo F. Frosio, Why Keep a Dog and Bark Yourself? From Intermediary Liability to Responsibility, 26 Int’l J. L. & Info. Tech. 1 (2017).
Germany: Flawed Social Media Law, Human Rights Watch (Feb. 14, 2018), http://perma.cc/B87K-YGLJ.
Billy Perrigo, The U.K. Is About to Regulate Online Porn, and Free Speech Advocates Are Terrified, Time Magazine (Aug. 20, 2018), http://perma.cc/JH4U-LMXY.
James McAuley, France Weighs a Law to Rein in ‘Fake News,’ Raising Fears for Freedom of Speech, Washington Post (Jan. 10, 2018), http://perma.cc/HW79-N7LE. It is important to note that, while distasteful, printing fake news is usually not considered illegal today.
Saqib Shah, E.U. Will Fine Social Media Sites for Lingering Extremism, Engadget (Sept. 12, 2018), http://perma.cc/NR2E-RVYJ.
1. The Network Enforcement Act is a law designed to combat extremism and hate speech online.
The Network Enforcement Act is a deceptively simple law. It applies to “telemedia service providers” or “social networks,” defined as entities with over two million registered users in Germany “which, for profit-making purposes, operate internet platforms which are designed to enable users to share any content with other users or to make such content available to the public.”35
Netzwerkdurchsetzungsgesetz, supra note 1, at § 1.
Id.
Engels & Fuhrmann, supra note 3.
The law then outlines these companies’ reporting obligations. Companies which receive more than one hundred complaints per calendar year about unlawful content must produce semi-annual reports on how they handled that unlawful content.38
Netzwerkdurchsetzungsgesetz, supra note 1, at § 2(1).
Twitter Netzwerkdurchsetzungsgesetzbericht: Januar–Juni 2018, Twitter (2018), http://perma.cc/JU7L-ND3U (Ger.).
Removals under the Network Enforcement Law, Google (2018), http://perma.cc/L8PY-RAQ4.
Removals under the Network Enforcement Law, Google (2018), http://perma.cc/Z98S-25V5.
Centre For European Policy Studies, Germany’s NetzDG: A Key Test For Combatting Online Hate 9 (2018).
NetzDG Transparency Report, Facebook (July 2018), http://perma.cc/99SR-E3T9.
Netzwerkdurchsetzungsgesetz, supra note 1, at § 2(2)(1).
Id. at § 2(2)(3).
Finally, and most controversially, the law dictates how social media companies must handle certain kinds of complaints about unlawful content. It requires social media companies to address complaints relating to eighteen provisions of the criminal code:46
Id. at § 1(3).
See Michael Bohlander, trans., Criminal Code in the Version Promulgated on 13 November 1998, Federal Law Gazette [Bundesgesetzblatt] I p. 3322, last amended by Article 1 of the Law of 24 September 2013, Federal Law Gazette I p. 3671, and with the text of Article 6(18) of the Law of 10 October 2013, Federal Law Gazette I p. 3799, translation at http://perma.cc/Y6DK-FAEQ.
- Dissemination of propaganda material of unconstitutional organizations (§ 86)
- Using symbols of unconstitutional organizations (§ 86(a))
- Preparation of a serious violent offense endangering the state (§ 89(a))
- Encouraging the commission of a serious violent offense endangering the state (§ 91)
- Treasonous forgery (§ 100(a))
- Public incitement to crime (§ 111)
- Breach of the public peace by threatening to commit offenses (§ 126)
- Forming criminal or terrorist organizations (§§ 129–129(b))
- Incitement to hatred (§ 130)
- Dissemination of depictions of violence (§ 131)
- Rewarding and approving of offenses (§ 140)
- Defamation of religions, religious and ideological associations (§ 166)
- Insult (§ 185)
- Defamation (§ 186)
- Intentional defamation (§ 187)
- Violation of intimate privacy by taking photographs (§ 201a)
- Threatening the commission of a felony (§ 241)
- Forgery of data intended to provide proof (§ 269)
Content which is determined to be “manifestly unlawful,” because it violates one of the above criminal provisions, must be removed within twenty-four hours, although a company may work with law enforcement to receive an extension.48
Netzwerkdurchsetzungsgesetz, supra note 1, at § 3.
Id.
The statutes vary in effectiveness and clarity. For example, the provision on “insult” reads, in its entirety: “An insult shall be punished with imprisonment not exceeding one year or a fine and, if the insult is committed by means of an assault, with imprisonment not exceeding two years or a fine.” Bohlander, supra note 47, at § 185. By contrast, the definition of “[b]reach of the public peace by threatening to commit offences” states, in detail:
(1) Whosoever, in a manner capable of disturbing the public peace, threatens to commit
1. an offence of rioting indicated in section 125a 2nd sentence Nos 1 to 4;
2. murder under specific aggravating circumstances (section 211), murder (section 212) or genocide (section 6 of the Code of International Criminal Law) or a crime against humanity (section 7 of the Code of International Criminal Law) or a war crime (section 8, section 9, section 10, section 11 or section 12 of the Code of International Criminal Law);
3. grievous bodily harm (section 226);
4. an offence against personal freedom under section 232(3), (4), or (5), section 233(3), each to the extent it involves a felony, section 234, section 234a, section 239a or section 239b;
5. robbery or blackmail with force or threats to life and limb (Sections 249 to 251 or section 255);
6. a felony endangering the public under sections 306 to 306c or section 307(1) to (3), section 308(1) to (3), section 309(1) to (4), section 313, section 314 or section 315(3), section 315b(3), section 316a(1) or (3), section 316c(1) or (3) or section 318(3) or (4); or
7. a misdemeanour endangering the public under section 309(6), section 311(1), section 316b(1), section 317(1) or section 318(1),
shall be liable to imprisonment not exceeding three years or a fine.
(2) Whosoever intentionally and knowingly and in a manner capable of disturbing the public peace pretends that the commission of one of the unlawful acts named in subsection (1) above is imminent, shall incur the same penalty.
Id. at § 126.
If a user’s content is removed, the only recourse currently available lies at the discretion of the social media company. If the decision depends on the falsity of a factual allegation or other factual circumstances, the network may give the user an opportunity to respond.51
Netzwerkdurchsetzungsgesetz, supra note 1, at § 3(2)(3)(a).
2. The three initial concerns about the interaction between freedom of expression and the Network Enforcement Act are censorship, overblocking, and removal of lawful content
The Network Enforcement Act raises three main concerns with regard to freedom of expression. First, there is the issue of censorship. Second, there is the problem of overblocking, which leads to a chilling effect on speech. Finally, there is the issue of what “remove” really means and the exportation of censorship to other countries.
a) Censorship
Politicians from Germany’s far-right party, Alternative für Deutschland (AfD), are among the law’s staunchest opponents.52
AfD, Facebook (Nov. 21, 2017), http://perma.cc/VM4Q-LFVH (a post in which the AfD claims to have “kept their word” and requests the cancellation of the Network Enforcement Act; the post includes a link to a bill to repeal the act).
Linda Kinstler, Germany’s Attempt to Fix Facebook Is Backfiring, The Atlantic (May 18, 2018), http://perma.cc/9A3P-DDRF.
Philip Oltermann & Pádraig Collins, Two Members of Germany’s Far-Right Party Investigated by State Prosecutor, The Guardian (Jan. 2, 2018), http://perma.cc/R9U9-YC4U.
Id.
Id.
Id.
AfD is not alone in its objections to the law. The Left Party and the pro-business Free Democratic Party have their own concerns as well.58
Id.
Carol Anee Costabile-Heming, “Rezensur”: A Case Study of Censorship and Programmatic Reception in the GDR, 92 Monatshefte 53, 54 (2000) (“[GDR’s] structure [went] beyond censorship, a term that itself was taboo, and bec[ame] a type of systematic control.”).
Id. at 56.
Id. at 58.
While the Network Enforcement Act is not a prior restraint in the same way a licensing requirement is, the similarities are difficult to ignore. The Network Enforcement Act is the latest in a long line of attempts to censor content by proxy. Seth Kreimer describes several examples of internet proxy censorship perpetrated by France, Switzerland, Germany, and Britain.62
Seth F. Kreimer, Censorship by Proxy, 155 U. Pa. L. Rev. 11, 19–20 (2006) (providing examples of France attempting to impose liability on Yahoo! for making overseas Nazi messages, images, and paraphernalia available to French citizens; Swiss police inducing ISPs to block neo-Nazi sites; German courts requiring ISPs to block access to extraterritorial neo-Nazi websites; and British Telecom blocking access to sites on a child pornography blacklist).
Id. at 27.
The Council of Europe Commissioner of Human Rights has denounced such proxy censorship. As early as 2014 it stated “[r]ule of law obligations, including those flowing from Article[] . . . 10 . . . of the ECHR, may not be circumvented through ad hoc arrangements with private actors who control the internet and the wider digital environment.”64
Council of Europe Commissioner for Human Rights, The Rule of Law on the Internet and in the Wider Digital World (2014).
Id. at ¶ 16.
Values, Council of Europe (2019), http://perma.cc/Q857-ZSA7 (“The Council of Europe promotes human rights through international conventions. . . It monitors member states’ progress in these areas and makes recommendations through independent expert monitoring bodies.”).
b) Overblocking
Less maliciously, there is a concern of overblocking—the blocking of content which is not actually illegal. While the Bundestag assured companies that fines would only be levied against systematic actors, there are currently no checks on social media companies to determine whether the content they are blocking is actually unlawful. Determining the unlawfulness of content would “ordinarily take weeks in a German court,” according to Mirko Hohmann, a project manager at the Global Public Policy Institute in Berlin.67
Kinstler, supra note 53.
Kreimer, supra note 62, at 28.
Unfortunately, there is no way around this. Speed is among the primary reasons the law is considered necessary. Once content is placed on the web, it spreads like wildfire and becomes difficult to remove. The Bundestag was not thinking of fringe cases of people blowing off steam, or satire. Instead, it was thinking of imminent threats of violence that need to be removed immediately.69
Maas verteidigt Gesetz gegen Hass im Internet, Spiegel Online (Apr. 1, 2018), http://perma.cc/67DK-7X6K (“Calls for murder, threats and insults, sedition or Auschwitz lie [Holocaust denial] are not expressions of freedom of expression, but rather are attacks on the freedom of expression of others.”).
Id.
c) Removal
Finally, the lack of a definition for “removal” brings the law into an international context. What the German Bundestag likely had in mind was that a post would be taken down for German users. However, the case of AfD politician Alice Weidel provides an example of why this is far more complicated than it sounds. In May, a court in Hamburg held that Facebook had not done enough to prevent German users from viewing a comment on a Huffington Post article about Weidel’s opposition to gay marriage.71
Id.; see also Joachim Huber, Alice Weidel gewinnt Rechtsstreit mit Facebook, Der Tagesspiegel (Jan. 5, 2018), http://perma.cc/HSS3-QNBC.
Maas verteidigt Gesetz gegen Hass im Internet, supra note 69.
Id.
A VPN, or Virtual Private Network, allows users to create a secure connection to another network over the internet. VPNs can be used to access region-restricted websites, so a viewer in Germany can appear to be in France.
David Meyer, Facebook’s New Court Defeat: This Time it ‘May Have Free Speech Implications’, ZDNet (May 1, 2018), http://perma.cc/66DS-SRY2.
On the face of the law, this is a perfectly acceptable outcome. With respect to international norms, however, it is unprecedented. Of course, Facebook could simply pay the fine and refuse to remove the content—Facebook’s revenue for 2018 was 55.8 billion dollars, a figure that even the maximum fine would barely dent.76
Facebook, Inc., Annual Report (Form 10-K) 59 (Feb. 1, 2018).
This is not a classroom hypothetical. The Court of Justice of the European Union recently heard a case on substantially similar grounds related to the General Data Protection Regulation (GDPR). GDPR is a regulation intended to “protect[] fundamental rights and freedoms of natural persons and in particular their right to the protection of personal data.”77
Regulation (EU) 2016/679 of the European Parliament and the Council of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC (General Data Protection Regulation), 2016 O.J. L 119/1 [hereinafter GDPR], at art. 1(2), http://perma.cc/W56Q-AKZW.
Id. at art. 3(1).
Id.
Mark Scott, In Google Privacy Case, Europe’s Highest Court to Decide on Future of the Web, Politico (Sept. 12, 2018), http://perma.cc/TU2F-FPZB.
The Network Enforcement Act could lead to even bigger conflicts. Unlike GDPR’s data erasure provision, where a given consumer requests that information about themselves be removed, the Network Enforcement Act forces the removal of content that the consumer explicitly does not want removed. This could lead to novel and perhaps intractable conflict-of-laws issues. If an American college student visits a German news site for a class and is moved to insult the article’s subject on social media, the First Amendment and the Network Enforcement Act could have the ultimate legal showdown, with social media companies trapped in the middle.
What happens in Germany and the E.U. on the internet has an outsized effect on global internet culture. Anu Bradford has coined the term “The Brussels Effect” for this phenomenon, describing a “deeply underestimated aspect of European power that the discussion on globalization and power politics overlooks: Europe’s unilateral power to regulate global markets.”81
Anu Bradford, The Brussels Effect, 107 Nw. U. L. Rev. 1, 3 (2012).
For social media companies, this influence is often exerted without utilizing legal channels. For instance, the code of conduct to counteract hate speech mentioned previously is not binding law. These “voluntary” measures have their own advantages and disadvantages because they allow “[the circumvention of] the E.U. charter on restrictions to fundamental rights, avoiding the threat of legal challenges, and taking a quicker reform route.”82
Frosio, supra note 30, at 13–14.
Nevertheless, depending on how courts interpret the law, German criminal law might easily be projected far beyond Germany’s borders. Weidel’s case is a good example of this. A rude comment, such as the one she fought, would likely not have warranted removal in other parts of Europe. The German criminal code’s particular sensitivity to references to the Nazi Party, as well as its unusual Beleidigungsgesetz, or law protecting people against insults, goes well beyond standard defamation law—particularly as U.S.-based tech companies would understand it.83
Erik Kirschbaum, In Germany It Can be a Crime to Insult Someone in Public, L.A. Times (Sept. 6, 2016), http://perma.cc/7JTF-45KP.
Frosio, supra note 30, at 2.
This Section discusses the history of Article 10 of the ECHR, as well as the history of the ECtHR generally. Because Germany is a signatory to the treaty, the ECtHR can have jurisdiction over a case brought to it, provided that the petitioner has exhausted domestic remedies. Section III(B) explains why Germany’s highest court will likely find that the Network Enforcement Act is constitutional under the Basic Law for the Federal Republic of Germany, which acts as the country’s constitution. The final subsection explains how Article 10 cases are reviewed generally, in order to provide context for how those rights may apply to the Network Enforcement Act. It explains that the two principles of necessity and proportionality are key to examining Article 10 cases.
The ECHR was opened for signature in Rome in November 1950 and entered into force three years later.85
Council of Europe, The Conscience of Europe: 50 Years of the European Court of Human Rights 22 (Egbert Myjer et al. eds., 1st ed. 2010).
See generally Bernadette Rainey et al., Jacobs, White, and Ovey: The European Convention on Human Rights (7th ed. 2017).
David Harris et al., Harris, O'Boyle & Warbrick: Law of the European Convention on Human Rights 621 (3rd ed. 2014).
47 Member States, Council of Europe, http://perma.cc/G23F-TGVR.
Like most post-war international human rights treaties, the ECHR establishes a set of enumerated rights. Article 10 instituted freedom of expression as one of those rights.89
European Convention on Human Rights, supra note 9, at art. 10.
Handyside v. United Kingdom, 24 Eur. Ct. H.R. (ser. A) 49 (1976).
Mario Oetheimer, Protecting Freedom of Expression: The Challenge of Hate Speech in the European Court of Human Rights Case Law, 17 Cardozo J. Int'l & Comp. L. 427, 427 (2009) (citing Handyside v. United Kingdom, 24 Eur. Ct. H.R. (ser. A) 49 (1976)).
European Convention on Human Rights, supra note 9, at art.10.
Handyside v. United Kingdom, supra note 90, at 18.
Rainey et. al., supra note 86, at 484.
Id.
While the historical background of Article 10 provides a mandate of sorts to the ECtHR to protect freedom of expression, that same history has allowed the court to curtail freedom of expression that is seen to violate historical norms. This leads to biased jurisprudence when it comes to freedom of expression claims. The case law has evolved such that there is a presumption in favor of national authorities where they justify their laws based on “their fight against . . . anti-Convention values.”96
David Harris et al., Harris, O’Boyle, & Warbrick: Law of the European Convention on Human Rights 601 (4th ed. 2018).
Aleksandra Gliszczynska-Grabias & Grazyna Baranowska, The European Court of Human Rights on Nazi and Soviet Past in Central and Eastern Europe, 45 Polish Pol. Sci. Y.B. 117, 119 (2016).
Rainey et. al., supra note 86, at 490.
The ECtHR follows a four-part test in determining whether an action violates Article 10. First, the ECtHR must determine whether the state action actually interferes with free expression. The court has found that a wide variety of activities, from run-of-the-mill censorship and confiscation to prohibitions on wearing symbols that communicate resistance, constitute interference with expression.99
Id. at 485.
Harris et al. (3rd ed. 2014), supra note 87, at 616.
Article 10 explicitly outlines ways in which states may abridge expression.101
ECHR, supra note 9, at art. 10 (states are allowed to license television).
Id.
See generally Open Door Counselling Ltd. v. Ireland, 246 Eur. Ct. H.R. (ser. A) (1992); Delfi AS v. Estonia, App. No. 64569/09, 2015-II Eur. Ct. H.R. 319.
Delfi v. Estonia, supra note 103, at ¶ 122.
Instead, the bulk of the analysis lies in the third and fourth steps that make up the necessity test. The ECtHR must determine whether the free speech limitation is necessary in a democratic society, and, if it is necessary, whether it is proportionate to the legitimate aim pursued.105
See generally Handyside v. United Kingdom, supra note 90, at 16; European Convention on Human Rights, supra note 9, at art. 10.; Oetheimer, supra note 91, at 434.
Delfi v. Estonia, supra note 103, at ¶ 131.
Sunday Times v. United Kingdom, App. No. 6538/74, 30 Eur. Ct. H.R. (ser. A) ¶ 59 (1979).
ECHR, supra note 9, at art. 10.
Even if necessity is found, an interference can be a violation of Article 10 if it is not proportionate. Proportionality is the fuzziest portion of this test. It is unclear who has the burden of proving or disproving proportionality.109
Steven Greer, The Exceptions to Article 8 to 11 of the European Convention on Human Rights 15 (1997).
Id.
Vogt v. Germany, 323 Eur. Ct. H.R. (ser. A) (1995).
Greece v. United Kingdom, App. No. 176/56, 1959 Y.B. Eur. Conv. On H.R. (Eur. Comm’n on H.R.) (1958-59).
Norris v. Ireland, App. No. 10581/83, 13 Eur. H.R. Rep. 186 (1988).
Jersild v Denmark, App. No. 15890/89, 298 Eur. Ct. H.R. (ser. A) (1994).
In order to understand the Court’s approach to proportionality, it is important to briefly discuss the margin of appreciation doctrine. According to Professor Yutaka Arai, “the ‘margin of appreciation’ refers to the latitude a government enjoys in evaluating factual situations and in applying the provisions enumerated in international human rights treaties.”115
Yutaka Arai-Takahashi, The Margin of Appreciation Doctrine and the Principle of Proportionality in the Jurisprudence of the ECHR 2 (2002).
Id. at 136.
Id. at 127.
Harris (4th ed. 2018), supra note 96, at 608.
Id. at 612.
Outside of this test, Article 10 claims must also be analyzed in the context of other rights within the ECHR. This is mandated by Article 17, the prohibition on the abuse of rights. That provision states:
Nothing in this Convention may be interpreted as implying for any State, group or person any right to engage in any activity or perform any act aimed at the destruction of any of the rights and freedoms set forth herein, or at their limitation to a greater extent than is provided for in the Convention.120
ECHR, supra note 9, at art. 17.
In light of the historical context for the adoption of the ECHR, the ECtHR and Europe, in general, are “s[k]eptical of the ability of the democracy to resist the danger of racist propaganda leading to totalitarian dictatorships and massive abuses.”121
Harris (3rd ed.), supra note 87, at 621.
Oetheimer, supra note 91, at 429.
Id.
Similarly, individual instances of speech may implicate other provisions of the ECHR. For example, defamatory speech can be found to interfere with a person’s Article 8 rights. Article 8 of the ECHR protects the right to “respect for private and family life, home, and correspondence.”124
ECHR, supra note 9, at. art. 8.
Harris, (4th ed. 2018), supra note 96, at 608.
ECHR, supra note 9, at art. 9.
Only individuals, groups of individuals, and member states may bring claims to the ECtHR.127
Id. at art. 33.
European Court of Human Rights, The ECHR in 50 Questions, at Question 19 (Feb., 2014).
European Convention on Human Rights, supra note 9, at art. 35(1).
Article 5 of the Basic Law for the Federal Republic of Germany is the German constitutional provision governing freedom of expression. It states simply: “[e]very person shall have the right to freely express and disseminate his opinion in speech, writing and picture . . . There shall be no censorship.”130
Grundgesetz für die Bundesrepublik Deutschland, at art. 5(1), translated in Basic Law for the Federal Republic of Germany (Christian Tomuschat & David P. Currie trans., 2008).
Id.
Operating under the assumption that every individual law the Network Enforcement Act is supposed to enforce has been found constitutional in Germany, recent scholarship suggests that the Act will also be found constitutional.132
See generally Thomas Wischmeyer, ‘What is Illegal Offline is Also Illegal Online’ – The German Network Enforcement Act 2017, in Fundamental Rights Protection Online: The Future Regulation of Intermediaries (Bilyana Petkova & Tuomas Ojanen eds., 2019).
Id. at 15.
Sebastien Schwiddesen, German Attorney General: Video Game with Swastika Does Not Violate the Law; Constitutes Art, Lexology (May 8, 2018), http://perma.cc/E3F4-W3CV.
Id. (Wolfenstein is a series of games set in Third Reich era Germany. Wolfenstein 3D has players kill members of the National Socialist Party, and they even go on to kill Hitler in the end. Naturally, Nazi imagery is pervasive throughout the game.)
Id.
Judith Vonberg, Germany Lifts Ban on Nazi Symbols in Computer Games, CNN (Aug. 10, 2018), http://perma.cc/LVL9-C6SV.
Additionally, none of the problems that the Network Enforcement Act arguably creates are new to social media companies, nor are they unique to this law. Companies have always had the ability to regulate speech according to their own terms of service. Neither the European Commission’s code of conduct on hate speech nor the Network Enforcement Act changed the fact that the final step of review lies with the courts. Although the Network Enforcement Act acknowledges this fact wholeheartedly and moves even further towards private regulation, it is not clear that the Act itself is the problem.138
Wischmeyer, supra note 132, at 16.
Once the Network Enforcement Act is found constitutional within Germany, the ECHR is implicated. Article 1 states: “The High Contracting Parties shall secure to everyone within their jurisdiction the rights and freedoms defined in Section I of this Convention.”139
ECHR, supra note 9, at art. 1.
Rainey et. al., supra note 86, at 515.
Id.
However, this is a much closer issue than it might initially seem and highlights some of the problems the ECtHR will have to deal with when it comes to regulating social media companies in the future. As long as undesirable content is posted online, social media companies and global leaders will need to work together to keep the internet safe for everyone, while still providing a unified network.
When examining the legality of the Network Enforcement Act, there are a few cases and terms that will be used extensively, so it is important to outline them at the outset. While, as Section III(C) discussed, the court has a robust history of dealing with Article 10 claims generally, the internet and internet intermediaries have exposed some of the weaknesses in that framework. Most modern communication takes place over networks owned by private companies, which the court acknowledges “provides an unprecedented platform for the exercise of freedom of expression.”142
Delfi AS v. Estonia, supra note 103.
Intermediary Liability, The Center for Internet and Society at Stanford Law School, http://perma.cc/27LM-5SZG.
As such, the court is in the process of developing a new methodology for these internet intermediary liability cases. According to Robert Spano, a judge on the ECtHR, the court appears to be trying to strike a balance between two competing viewpoints. The first is net neutrality, described generally as the proposition that internet service providers should treat all traffic equally, regardless of origin.144 The second is the view that what is illegal offline is also illegal online, which places an enhanced responsibility on platforms to detect and remove unlawful content.145
Robert Spano, Intermediary Liability for Online User Comments under the European Convention on Human Rights, 17 Hum. Rts. L. Rev. 665, 667 n. 5 (2017).
Id. at 667. See also Communication from the Commission to the European Parliament COM 555, the Council the European Economic and Social Committee, and the Committee of the Regions, Tackling Illegal Content Online: Towards an Enhanced Responsibility of Online Platforms (Sep. 28, 2017), at 2. (“what is illegal online is also illegal offline . . . addressing the detection and removal of illegal content online represents an urgent challenge for the digital society today.”)
One of the first cases to help establish the contours of this emerging methodology is Delfi v. Estonia.146
Delfi v. Estonia, supra note 103.
Id. at ¶ 11.
Gemius Rating, Domains, http://perma.cc/4MSC-XHEZ.
Delfi v. Estonia, supra note 103, at ¶ 12.
Id. at ¶ 13.
Id. at ¶ 16.
Id.
Id. at ¶ 18.
Id. at ¶ 19.
Delfi AS v. Estonia, Columbia Global Freedom of Expression, http://perma.cc/5BPM-K6NG.
See generally id.; Spano, supra note 144, at 669-72; Delfi AS v. Estonia, supra note 103.
A Network Enforcement Act claim would not simply be a repeat of Delfi. First, the Network Enforcement Act explicitly excludes news sites.157
Netzwerkdurchsetzungsdesetz, supra note 1, at § 1(1).
Spano, supra note 144, at 670.
Delfi v. Estonia, supra note 103, at ¶ 116.
Lisl Brunner, The Liability of an Online Intermediary for Third Party Content - The Watchdog Becomes the Monitor: Intermediary Liability after Delfi v. Estonia, 16 Hum. Rts. L. Rev. 163, 173 (2016).
Second, Delfi concerned a series of comments, rather than an overarching notice-and-takedown regime like the one the Network Enforcement Act creates. Nevertheless, the general scenario is very similar. In both cases, an intermediary which does not have control over the content posted is asked to remove unlawful content and faces a fine for not doing so in a timely manner. The intermediary in Delfi was also a big name—Delfi is prominent in Estonia,161 just as the Network Enforcement Act applies only to the largest social networks operating in Germany.162
See Gemius Rating, supra note 148.
Recall that the Network Enforcement Act only affects social media companies with more than two million registered German users. See Netzwerkdurchsetzungsdesetz, supra note 1, at § 1.
Pihl v. Sweden, App. No. 74742/14, Eur. Ct. H.R. (2017).
Phil v. Sweden, Columbia Global Freedom of Expression, http://perma.cc/645V-R4H5.See also Pihl v. Sweden, supra note 163, at ¶ 35 (“However, the Court has previously found that liability for third-party comments may have negative consequences on the comment-related environment of an internet portal and thus a chilling effect on the freedom of expression via internet.”).
A similar intermediary liability case that reached the opposite result is Magyar Tartalomszolgáltatók Egyesülete and Index.hu v. Hungary.165
Magyar Tartalomszolàltatòk Egyesülete and Index.hu v. Hungary, App. No. 22947/13 (Eur. Ct. H.R. 2016).
Id. at para. 5.
Id. at para. 11.
Id. at para. 12.
Harris et al., supra note 87, at 623.
Magyar Tartalomszolàltatòk Egyesülete and Index.hu v. Hungary , supra note 165, at para. 64.
Id.
Magyar is important because it acknowledges that a law holding a “large Internet news portal” liable for third-party comments is enough to meet the “prescribed by law” portion of the Article 10 analysis.172
Id. at para. 51.
Spano, supra note 144, at 673.
The final applicable case is Tamiz v. United Kingdom,174 which concerned allegedly defamatory comments posted on a blog hosted on Google’s Blogger platform.
Tamiz v. United Kingdom, App. no. 3877/14 (Eur. Ct. H.R. 2017).
Id. at para. 7.
Id. at para. 17.
Id. at para. 18-21.
The court distinguished the case from Delfi by pointing out that here the Court was finally dealing with a social media platform “where the platform provider[] does not offer any content and where the content provider may be a private person running a website or blog as a hobby.”178
Id. at para. 85 (citing Delfi v. Estonia, supra note 103, at ¶ 115-116).
Id. at para. 90.
The threshold questions of the ECtHR’s freedom of expression analysis—whether a state action is an interference with expression and whether that interference was prescribed by law—are relatively simple to answer in light of the aforementioned case law. The Network Enforcement Act was a widely publicized law passed by the national legislature; it is therefore certainly foreseeable enough to be “prescribed by law.” As far as interference goes, the ECtHR in Delfi, Pihl, and Tamiz frequently referred to the Article 10 rights of providers as a given. That the social media companies tasked with removing comments have Article 10 rights has not been the subject of in-depth analysis by the court, but it is something the court recognizes. For example, in Tamiz, the court consistently refers not only to the Article 10 rights of readers but also to those of Google and information society service providers (“ISSPs”).180
See generally id.
Given the nature of policy developments regarding the internet, while intermediary liability is not required by the ECHR, it does not go against the treaty. Therefore, the ECtHR will likely find that Germany’s interference with freedom of expression through the Network Enforcement Act is necessary for many of the eighteen criminal provisions companies are asked to enforce. For those provisions, the Act falls within the sweet spot between “desirable” and “indispensable” that the court described in Sunday Times.181
Sunday Times v. United Kingdom, supra note 107, at para. 59.
It has never been a question that the internet, and by extension social media, needs to be regulated. The problem that courts have grappled with for the better part of two decades is how. If social media is like the press, then the ECtHR will more closely examine the necessity of regulations which restrict it. The ECtHR has emphasized on numerous occasions that the press is the “public watchdog in a democratic society.”182
Rainey et. al., supra note 86, at 496.
See id. at 487; ECHR, supra note 9, at art. 8 (outlining the right to respect for private and family life).
See, generally Tamiz v. United Kingdom, supra note 174, at para. 75 (noting that while many user comments are likely defamatory, the majority of comments are likely to be too trivial or limited in publication to cause any significant damages to a person’s reputation.); See also Kreimer, supra note 62, at 17.
Frosio, supra note 30, at 7 (citing Andrew Shapiro, The Control Revolution: How the Internet is Putting Individuals in Charge and Changing the World We Know 225 (2000)).
The ECtHR has stated that due to the important role that ISSPs play in facilitating access to information and public debate, the state has a wide margin of appreciation in cases similar to Tamiz.186
Tamiz v. United Kingdom, supra note 174, at para. 90.
ECHR, supra note 9, at art. 10(2)
Delfi v. Estonia, supra note 103, at para. 136 (“Moreover, the Court has held that speech that is incompatible with the values proclaimed and guaranteed by the Convention is not protected by Article 10 by virtue of Article 17 of the Convention.”).
Heiko Maas, Germany’s Federal Minister of Justice and Consumer Protection, described NetzDG as promoting, rather than chilling, freedom of expression because it removes violent and unlawful content online.189
Deutscher Bundestag, supra note 10.
Id.
Id.
Markus Reuter, Umfrage: Zeitungsredaktionen Schränken Kommentarfunktionen 2015 Weiter ein, Netzpolitik (Apr. 3, 2016), http://perma.cc/V5E2-7YXJ(translated “Survey: Newspaper editors continue to restrict commentary features in 2015.”).
Nevertheless, there is a strong argument that the Network Enforcement Act is not necessary for the more nuanced criminal provisions like defamation or insult. In those circumstances, the law certainly is not indispensable. Even assuming that, as the court presumed in Delfi, there is a tendency for platform providers to drag their feet when it comes to the removal of content, whether that is due to lack of knowledge, manpower, or actual bad faith varies from instance to instance. One reading of the Delfi judgment is that the fine was necessary to deter a notoriously bad actor from failing to remove content. After all, Delfi took six weeks and a lawsuit to remove the twenty comments against the applicant.193
Delfi v. Estonia, supra note 103.
Twitter Netzwerkdurchsetzungsgesetzbericht, supra note 39.
According to Internet Live Stats, 6,000 tweets are sent per second. Twitter Usage Statistics, Internet Live Stats, http://perma.cc/AAY7-6MKH. That means it would take fewer than 5 seconds to reach the amount of content removed over those six months, and 45 seconds to reach the number of complaints lodged.
Moreover, the law might introduce further confusion. In the case of Twitter, Facebook, and Google, content subject to complaints was first screened under each company’s own terms of service, which allow for the removal of both lawful and unlawful content.196
The Network Enforcement Act Apparently Leads to Excessive Blocking of Content, Reporters Without Borders (Aug. 3, 2018), http://perma.cc/H3D9-T7ND.
David Meyer, Facebook Can Block Hate Speech, Even If It's Not Illegal, Court Rules, ZDNet (Sept. 18, 2018), http://perma.cc/D6YD-7ABK.
Franklin Foer, The Death of the Public Square, The Atlantic (July 6, 2018), http://perma.cc/AXM5-8C5S.
Constantin van Lijnden, Facebook, Geben Sie Redefreiheit!, Frankfurter Allgemeine Zeitung (Sept. 6, 2018), http://www.faz.net/aktuell/feuilleton/medien/facebook-darf-nicht-eigenhaendig-beitraege-loeschen-15773244.html.
This highlights the Left Party’s initial hesitation with the Network Enforcement Act. In response to Heiko Maas’s impassioned speech about promoting freedom of expression, a party member stated “[d]as ist keine Durchsetzung gegenüber den Netzwerken, sondern durch die Netzwerke”—“The Network Enforcement Act is not enforced against the networks but through the networks.”200
Id.
This is a new direction for intermediaries, and its desirability is up for debate. Historically, anti-censorship laws and their enforcement have applied only to state actors. As Judge Spano points out, “Article 10 of the convention does not . . . mandate any particular form of intermediary liability.”201
Spano, supra note 144, at 668.
Dorothée Baumann-Pauly, German Companies Report on the Implementation of New Hate Speech Law, Commentary: NYU Stern Center for Business and Human Rights (Aug. 7, 2018), http://perma.cc/VB89-6AR7.
However, if not social media companies, it is unclear who could handle these complaints. The traditional justice system currently seems unable to deal with the sheer volume and speed of the dissemination of unlawful content on the internet in a timely manner. Thus, there are few alternatives to social media providers defining and enforcing the ground rules for online speech through private community standards.203
Wischmeyer, supra note 132, at 16.
Therefore, while the ECtHR could find that the Network Enforcement Act is necessary in its entirety, the better approach would be to find the law necessary for some of the eighteen criminal provisions, but not for others. For example, encouraging the commission of a serious violent offense endangering the state, as well as public incitement to crime, involve an imminence such that real harm could come from allowing the content to remain online. For those offenses, fines and criminal sanctions levied against a negligent intermediary may well be necessary to induce a speedy takedown of the content. Conversely, with crimes like “insult,” it is not clear that the law is doing anything more than what a company’s own terms of service already do. Additionally, as all of the court cases discussed have shown, whether something is insulting or defamatory to an individual is hard enough to determine. Whether it is insulting or defamatory enough to fine an intermediary is another question altogether. In those cases, without the immediate harm, it may be enough for Germany to do what it already has the ability to do—retrieve information about the perpetrator and pursue them through the normal court system.204
Kirschbaum, supra note 83.
There is no one standard for determining the proportionality of a law. As discussed in Section III(C), the ECtHR requires laws to be convincingly established and narrowly construed in order to be proportional. This analysis takes into consideration the kind of speech affected and the state’s margin of appreciation. Following these guidelines, there is a strong argument that the Network Enforcement Act is not proportional for most of the eighteen criminal provisions it covers. Because the harm inflicted by most of the affected content is not comparable to the potential chilling effect that fines and government intervention have on speech, there is an imbalance between “the interests served by the measure and the interests that are harmed by introducing it.”205
Janneke Gerards, How to Improve the Necessity Test of the European Court of Human Rights, 11 Int’l J. of Const. L. 466, 469 (2013).
Before addressing the reasons the law is not proportional, it is important to note one reason which does not come into play—extraterritorial removal. The court has previously found that, following the margin of appreciation, states have the ability to choose the measures by which they deal with issues of obscenity.206
Harris et al., supra note 87, at 637.
Perrin v. United Kingdom, App. No. 5446/03 (Eur. Ct. H.R. 2005). (“The fact that dissemination of the images in question may have been legal in other States, including non-Parties to the Convention such as the United States, does not mean that in proscribing such dissemination within its own territory and in prosecuting and convicting the applicant, the respondent State exceeded the margin of appreciation afforded to it.”)
Currently, the law provides no due process guarantees. Tech companies are not required to allow people to explain their comments or content before removing them, a requirement the European Commission has recommended.208
European Commission, C(2018) 1177, Commission Recommendation of 1.3.2018 on Measures to Effectively Tackle Illegal Content Online (Mar. 1, 2018).
Human Rights Watch, supra note 31.
Baumann-Pauly, supra note 202.
Likewise, companies have no way of knowing why a given piece of content was reported in the first place. This creates an environment conducive to discrimination, particularly with regard to categories, such as “insult,” that are more open to interpretation. It would not be hard to imagine a scenario in which potentially insulting comments are written, but only those relating to or posted by certain political parties or ethnic groups are targeted for reporting. There is a reason that many of the comments behind the cases this Comment has discussed were authored by AfD members, beyond the party’s anti-immigrant sentiment and Neo-Nazi ties.211
Kate Connolly, Chemnitz Riots Spark Calls for AfD to Be Put under Surveillance, The Guardian (Sept. 4, 2018), http://perma.cc/TB39-XHPV.
Kinstler, supra note 53.
The implications of this are particularly worrisome when it comes to administrative agencies. While the reports state whether complaints came from governmental entities or users, there is nothing to stop government agencies from tracking individuals and reporting their content, regardless of whether those individuals belong to an “unconstitutional organization.” If Facebook or Twitter then refuses to remove that content, the companies could be fined. Although this is unlikely to happen, the West’s fraught history with authoritarianism means that the court should find this argument persuasive.
The lack of transparency when it comes to fines is another issue in its own right. Although Heiko Maas emphasized that fines of any size will only be levied on “systematic” actors,213 two problems remain: companies have little way of knowing in advance when a fine will actually be imposed, and the burden the fines place on them is out of proportion to the harm at issue.214
Deutscher Bundestag, supra note 10.
Kinstler, supra note 11.
To the second point, placing a burden this heavy on tech companies is disproportionate to the harm caused by these posts. As Google’s lawyers in Tamiz v. United Kingdom argued,
[H]olding ISSPs liable from the moment the first letter of complaint was received, without allowing a reasonable period of time to investigate the merits of a complaint, to contact the author of the blog or comment, and take the necessary technical and practical steps to facilitate removal, would [] result in a disproportionate interference with the ISSP’s Article 10 rights. In order to strike a fair balance between the interests of the aggrieved person and the provider of the blogging platform, an ISSP must be afforded a reasonable period of time to investigate and evaluate a request to remove a comment and, where appropriate, to implement removal. To find otherwise would effectively compel ISSPs to remove comments immediately following a complaint, without first considering its merits, and this would likely stifle legitimate speech and suppress the publication of information on important matters of public interest.215
Tamiz v. United Kingdom, supra note 174, at para. 71.
Although the court did not directly address the proportionality of the response, Google’s point still stands. The Network Enforcement Act is not proportionate because it provides every incentive to over-police content with no oversight, and no equivalent incentive to ensure that lawful content is not deleted. There is no case of the Court finding disproportionality on these grounds because recent cases like Tamiz and Delfi have not addressed the issue. Nevertheless, the argument is persuasive. Unlike individual users, companies are presumed by governments to be rational actors that will do the bare minimum to maintain the culture of their platforms and avoid legal costs. This is the way fines are expected to work—by increasing the cost of unlawful behavior such that it is no longer in a company’s best interest to behave that way.216
Finally, the Network Enforcement Act is not the least restrictive means by which Germany can target the harm caused by this content. In rare cases, the court has found under a least-restrictive-means analysis that the harms a law causes outweigh the benefits it provides.217
The Network Enforcement Act, by design, creates circumstances in which no measures are in place to enable applicants to protect their Article 10 rights. In many ways, it prioritizes the Article 8 rights of complainants over the Article 10 rights of commenters. As the divided lower court decisions discussed earlier show, social media companies can get these decisions wrong. With no formal mechanism other than the courts to help individuals adjudicate their rights, the chilling of speech is inevitable.
On balance, it is arguable that the court should find in favor of the Network Enforcement Act as to content that incites violence or promotes terrorism. But for all other content, no matter how insulting or demeaning, the law is not proportionate. By drawing this line, the court could clarify its own jurisprudence and strike the balance it desires. Rather than erecting obstacles to a unified internet and encouraging a fragmented digital economy, the court could find that, while some content is internationally undesirable, for all other content the harm to the free expression rights of intermediaries outweighs the harm to individuals. This would pave the way for international cooperation on internet regulation, something that, as a policy matter outside the scope of this Comment, is desirable.
The Network Enforcement Act suffers from the flaws common to laws passed hastily and out of fear: it is vague, overbroad, and pins the moral blame for very real problems on the wrong actors. By laying the blame at the feet of social media intermediaries rather than the actual perpetrators of hate speech and violent acts, it effectively shoots the messenger. The ECtHR should find that the law violates Article 10 as to all criminal provisions other than those implicating imminent violence or threats to government agencies. Even then, the law is arguably unnecessary because the European Union is working toward its own regulation of online speech and conduct, on top of regulations that already exist. Still, at least with regard to the worst conduct, the fines companies could receive for failing to monitor their content are more proportionate to the harms the law seeks to prevent.
This is not to say that the impetus behind the law is misguided: social media companies do need to be regulated. We are past the point where such companies can be viewed as the paper and pens that radical and violent individuals use to write posters. Instead, they look more and more like billboards on the side of the highway that choose to sell their space and turn a blind eye to who posts on them and to the consequences. Speed also matters when regulating the internet.220
This leads to the most important question, one far outside the bounds of this Comment: what should the scope of social media regulation be? The Network Enforcement Act is a proxy war in the ultimate battle over how to tame the internet. Companies already censor lawful content outside the scope of the Network Enforcement Act all the time. For example, Facebook’s algorithms blocked and removed posts with variations on the phrase “men are scum,”222
What Germany’s law does show is the problem with regulating the internet on a nation-by-nation basis. Social media combines the trickiest portions of free expression jurisprudence and forces them across borders. If courts hold that content must be removed globally, it will be impossible both to uphold the Network Enforcement Act and to respect the margin of appreciation other states have for monitoring content within their borders. As such, the ECtHR will be hard-pressed to find that any domestic law like the Network Enforcement Act is necessary or proportionate as Article 10 requires. Such laws need to be promulgated by large international bodies such as the E.U. or the U.N. to ensure that a plurality of countries agrees on what content needs to be regulated, and when.
Given the variety of free expression regimes across borders, the best solution is to focus on content for which there is international consensus as to its egregiousness and unlawfulness. For example, the E.U. has already decided that “propaganda that prepares, incites or glorifies acts of terrorism” should be removed from the internet.224
- 1Gesetz zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken [Network Enforcement Act], Sept. 1, 2017, Bundesgesetzblatt, Teil I [BGBl I] at 3352 (Ger.).
Official English translation may be found here: http://perma.cc/72JK-3KNM.
The law is also referred to by its abbreviated German names, NetzDG or Netzwerkdurchsetzungsgesetz.
- 2Philip Oltermann & Thomas Furmann, Tough New German Law puts Tech Firms and Free Speech in Spotlight, The Guardian (Jan. 5, 2018), http://perma.cc/ENU5-4FKZ.
- 3Stefan Engels, Network Enforcement Act in a Nutshell, DLA Piper Blog: IPT Germany (Jan. 31, 2018), http://perma.cc/E6QT-VSEL.
- 4Netzwerkdurchsetzungsgesetz, supra note 1, at §1(2).
- 5Id. at § 1(3).
- 6Id. at § 3(2)(2).
- 7Id. at § 3(2)(3).
- 8Id. at § 4(2).
- 9European Convention for the Protection of Human Rights and Fundamental Freedoms, opened for signature Nov. 4, 1950, Eur. T.S. No. 5, 213 U.N.T.S. 221, at art. 10.
- 10Deutscher Bundestag: Plenarprotokoll 18/235 at 23848 (statement of Heiko Maas, Bundesminister BMJV), http://perma.cc/WKU4-HSGJ (Ger.).
- 11Linda Kinstler, Can Germany Fix Facebook?, The Atlantic (Nov. 2, 2017), http://perma.cc/B7EF-UDS8.
- 12Emma Thomasson, Germany Looks to Revise Social Media Law as Europe Watches, Reuters (Mar. 8, 2018), http://perma.cc/ZM6T-76XU (explaining that Titanic, a satirical magazine, had its content removed for parodying the language of a tweet from a far-right German political party which was also removed).
- 13“Harmless” varies from person to person. There are a number of online posts, some of which will be discussed in this Comment, which cause real emotional harm to users. However, harmless is used here to mean not tending to call for or incite immediate violence or threatening physical harm against a given user.
- 14Christof Kerkmann, German Court Overturns Facebook ‘Censorship,’ Handelsblatt Today (Apr. 13, 2018), http://perma.cc/S4LZ-DRCE (including examples of comments, such as “the Germans are becoming more and more stupid. No wonder, as they are being clobbered daily by left-wing media with fake news about skilled workers, declining unemployment or Trump.”).
- 15Germany Starts Enforcing Hate Speech Law, BBC News (Jan. 1, 2018), http://perma.cc/23QB-MXB7.
- 16Bitkom is the German Association for IT, Telecommunications, and New Media. For more information see http://perma.cc/T56F-EGUT.
- 17Guy Chazan, Berlin Forced to Defend Hate Speech Law, Fin. Times (Jan. 5, 2018), http://perma.cc/8GAN-72TF.
- 18See generally Jarred Prier, Commanding the Trend: Social Media as Information Warfare, 11 Strategic Stud. Q. 50 (2017).
- 19The Brussels Effect is the term coined to describe the E.U.’s growing ability to control and affect international regulations without entering into formal international agreements. For more information see Section II(C).
- 20Agnes Callamard, Religion, Terrorism, and Speech in a ‘Post-Charlie Hebdo’ World, 10 Relig. & Hum. Rts. 207 (2015).
- 21Id. at 208. While this Comment attributes the Charlie Hebdo attacks to ISIS, they were actually claimed by Al-Qaeda in the Arabian Peninsula. See Catherine E. Schoichet & Josh Levs, Al Qaeda Branch Claims Charlie Hebdo Attack was Years in the Making, CNN (Jan. 21, 2015), http://perma.cc/F5BL-4PRK.
- 22Brussels is home to the Brussels Privacy Hub, “an academic privacy research centre with a global focus . . . Brussels is where key decisions are taken on data protection in the European Union, and EU rules set the standard for data protection and privacy law around the world.” About the Brussels Privacy Hub, Brussels Privacy Hub, http://perma.cc/W9P6-AX89.
- 23Council of the E.U. Press Release 158/16, Joint Statement of E.U. Ministers for Justice and Home Affairs and Representatives of E.U. Institutions on the Terrorist Attacks in Brussels on 22 March 2016 (Mar. 24, 2016), http://perma.cc/L8XJ-V59D.
- 24Liat Clark, Facebook and Twitter Must Tackle Hate Speech or Face New Laws, Wired (Dec. 5, 2016), http://perma.cc/YCN7-4AP6.
- 25Id. See also European Commission Factsheet, Code of Conduct on Countering Illegal Hate Speech Online: First Results on Implementation (Dec. 2016).
- 26Danielle Keats Citron, Extremist Speech, Compelled Conformity, and Censorship Creep, 93 Notre Dame L. Rev. 1035, 1038 (2018).
- 27See Eur. Comm’n, Priority: Digital Single Market, http://perma.cc/74TV-9684.
- 28Clark, supra note 24.
- 29Id.
- 30See generally Giancarlo F. Frosio, Why Keep a Dog and Bark Yourself? From Intermediary Liability to Responsibility, 26 Int’l J. L. & Info. Tech. 1 (2017).
- 31Germany: Flawed Social Media Law, Human Rights Watch (Feb. 14, 2018), http://perma.cc/B87K-YGLJ.
- 32Billy Perrigo, The U.K. Is About to Regulate Online Porn, and Free Speech Advocates Are Terrified, Time Magazine (Aug. 20, 2018), http://perma.cc/JH4U-LMXY.
- 33James McAuley, France Weighs a Law to Rein in ‘Fake News,’ Raising Fears for Freedom of Speech, Washington Post (Jan. 10, 2018), http://perma.cc/HW79-N7LE. It is important to note that while distasteful, printing fake news is usually not considered illegal in modern times.
- 34Saqib Shah, E.U. Will Fine Social Media Sites for Lingering Extremism, Engadget (Sept. 12, 2018), http://perma.cc/NR2E-RVYJ.
- 35Netzwerkdurchsetzungsgesetz, supra note 1, at § 1.
- 36Id.
- 37Engels & Fuhrmann, supra note 3.
- 38Netzwerkdurchsetzungsgesetz, supra note 1, at § 2(1).
- 39Twitter Netzwerkdurchsetzungsgesetzbericht: Januar–Juni 2018, Twitter (2018), http://perma.cc/JU7L-ND3U (Ger.).
- 40Removals under the Network Enforcement Law, Google (2018), http://perma.cc/L8PY-RAQ4.
- 41Removals under the Network Enforcement Law, Google (2018), http://perma.cc/Z98S-25V5.
- 42Centre For European Policy Studies, Germany’s NetzDG: A Key Test For Combatting Online Hate 9 (2018).
- 43NetzDG Transparency Report, Facebook (July 2018), http://perma.cc/99SR-E3T9.
- 44Netzwerkdurchsetzungsgesetz, supra note 1, at § 2(2)(1).
- 45Id. at § 2(2)(3).
- 46Id. at § 1(3).
- 47See Michael Bohlander, trans., Criminal Code in the version promulgated on 13 November 1998, Federal Law Gazette [Bundesgesetzblatt] I p. 3322, last amended by Article 1 of the Law of 24 September 2013, Federal Law Gazette I p. 3671 and with the text of Article 6(18) of the Law of 10 October 2013, Federal Law Gazette I p 3799. Bundesgesetzblatt [Criminal Code], Nov. 13, 1998, BGBL II, last amended by Gesetz [G], Sept. 2013 BGBL II, translation at http://perma.cc/Y6DK-FAEQ.
- 48Netzwerkdurchsetzungsgesetz, supra note 1, at § 3.
- 49Id.
- 50The statutes vary in effectiveness and clarity. For example, the definition for “insult” only reads “An insult shall be punished with imprisonment not exceeding one year or a fine and, if the insult is committed by means of an assault, with imprisonment not exceeding two years or a fine.” Bohlander, supra note 47, at § 185. Whereas the definition of “[b]reach of the public peace by threatening to commit offences” states, in detail:
(1) Whosoever, in a manner capable of disturbing the public peace, threatens to commit
1. an offence of rioting indicated in section 125a 2nd sentence Nos 1 to 4;
2. murder under specific aggravating circumstances (section 211), murder (section 212) or genocide (section 6 of the Code of International Criminal Law) or a crime against humanity (section 7 of the Code of International Criminal Law) or a war crime (section 8, section 9, section 10, section 11 or section 12 of the Code of International Criminal Law);
3. grievous bodily harm (section 226);
4. an offence against personal freedom under section 232(3), (4), or (5), section 233(3), each to the extent it involves a felony, section 234, section 234a, section 239a or section 239b;
5. robbery or blackmail with force or threats to life and limb (Sections 249 to 251 or section 255);
6. a felony endangering the public under sections 306 to 306c or section 307(1) to (3), section 308(1) to (3), section 309(1) to (4), section 313, section 314 or section 315(3), section 315b(3), section 316a(1) or (3), section 316c(1) or (3) or section 318(3) or (4); or
7. a misdemeanour endangering the public under section 309(6), section 311(1), section 316b(1), section 317(1) or section 318(1),
shall be liable to imprisonment not exceeding three years or a fine.
(2) Whosoever intentionally and knowingly and in a manner capable of disturbing the public peace pretends that the commission of one of the unlawful acts named in subsection (1) above is imminent, shall incur the same penalty.
Id. at § 126.
- 51Netzwerkdurchsetzungsgesetz, supra note 1, at § 3(2)(3)(a).
- 52AfD, Facebook (Nov. 21, 2017), http://perma.cc/VM4Q-LFVH. A post where AfD says they “kept their word” and are requesting the cancellation of the Network Enforcement Act. It includes a link to a bill to repeal the act.
- 53Linda Kinstler, Germany’s Attempt to Fix Facebook Is Backfiring, The Atlantic (May 18, 2018), http://perma.cc/9A3P-DDRF.
- 54Philip Oltermann & Pádraig Collins, Two Members of Germany’s Far-Right Party Investigated by State Prosecutor, The Guardian (Jan. 2, 2018), http://perma.cc/R9U9-YC4U.
- 55Id.
- 56Id.
- 57Id.
- 58Id.
- 59Carol Anee Costabile-Heming, “Rezensur”: A Case Study of Censorship and Programmatic Reception in the GDR, 92 Monatshefte 53, 54 (2000) (“[GDR’s] structure [went] beyond censorship, a term that itself was taboo, and bec[ame] a type of systematic control.”).
- 60Id. at 56.
- 61Id. at 58.
- 62Seth F. Kreimer, Censorship by Proxy, 155 U. Pa. L. Rev. 11, 19–20 (2006) (providing examples of: France attempting to impose liability on Yahoo! for making overseas Nazi messages, images, and paraphernalia available to French citizens; Swiss police inducing ISPs to block neo-Nazi sites; German courts requiring ISPs to block access to extraterritorial neo-Nazi websites; British telecom blocking access to sites on a child pornography blacklist).
- 63Id. at 27.
- 64Council of Europe Commissioner for Human Rights, The Rule of Law on the Internet and in the Wider Digital World (2014).
- 65Id. at ¶ 16.
- 66Values, Council of Europe (2019), http://perma.cc/Q857-ZSA7 (“The Council of Europe promotes human rights through international conventions. . . It monitors member states' progress in these areas and makes recommendations through independent expert monitoring bodies.”).
- 67Kinstler, supra note 53.
- 68Kreimer, supra note 62, at 28.
- 69Maas verteidigt Gesetz gegen Hass im Internet, Spiegel Online (Apr. 1, 2018), http://perma.cc/67DK-7X6K (“Calls for murder, threats and insults, sedition or Auschwitz lie [Holocaust denial] are not expressions of freedom of expression, but rather are attacks on the freedom of expression of others.”).
- 70Id.
- 71Id.; see also Joachim Huber, Alice Weidel gewinnt Rechtsstreit mit Facebook, Der Tagesspiegel (Jan. 5, 2018), http://perma.cc/HSS3-QNBC.
- 72Maas verteidigt Gesetz gegen Hass im Internet, supra note 69.
- 73Id.
- 74A VPN, or Virtual Private Network, allows users to create a secure connection to another network over the internet. VPNs can be used to access region-restricted websites, so a viewer in Germany can appear to be in France.
- 75David Meyer, Facebook’s New Court Defeat: This Time it ‘May Have Free Speech Implications’, ZDNet (May 1, 2018), http://perma.cc/66DS-SRY2.
- 76Facebook, Inc., Annual Report (Form 10-K) 59 (Feb. 1, 2018).
- 77Regulation (EU) 2016/679 of the European Parliament and the Council of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC (General Data Protection Regulation), 2016 O.J. L 119/1 [hereinafter GDPR], at art. 1(2), http://perma.cc/W56Q-AKZW.
- 78Id. at art. 3(1).
- 79Id.
- 80Mark Scott, In Google Privacy Case, Europe’s Highest Court to Decide on Future of the Web, Politico (Sept. 12, 2018), http://perma.cc/TU2F-FPZB.
- 81Anu Bradford, The Brussels Effect, 107 Nw. U. L. Rev. 1, 3 (2012).
- 82Frosio, supra note 30, at 13–14.
- 83Erik Kirschbaum, In Germany It Can be a Crime to Insult Someone in Public, L.A. Times (Sept. 6, 2016), http://perma.cc/7JTF-45KP.
- 84Frosio, supra note 30, at 2.
- 85Council of Europe, The Conscience of Europe: 50 Years of the European Court of Human Rights 22 (Egbert Myjer et al. eds., 1st ed. 2010).
- 86See generally Bernadette Rainey et al., Jacobs, White, and Ovey: The European Convention on Human Rights (7th ed. 2017).
- 87David Harris et al., Harris, O'Boyle & Warbrick: Law of the European Convention on Human Rights 621 (3rd ed. 2014).
- 88 47 Member States, Council of Europe, http://perma.cc/G23F-TGVR.
- 89European Convention on Human Rights, supra note 9, at art. 10.
- 90Handyside v. United Kingdom, 24 Eur. Ct. H.R. (ser. A) 49 (1976).
- 91Mario Oetheimer, Protecting Freedom of Expression: The Challenge of Hate Speech in the European Court of Human Rights Case Law, 17 Cardozo J. Int'l & Comp. L. 427, 427 (2009), citing Handyside v. United Kingdom, 24 Eur. Ct. H.R. (ser. A) 49 (1976).
- 92European Convention on Human Rights, supra note 9, at art.10.
- 93Handyside v. United Kingdom, supra note 90, at 18.
- 94Rainey et al., supra note 86, at 484.
- 95Id.
- 96David Harris et al., Harris, O’Boyle, & Warbrick: Law of the European Convention on Human Rights 601 (4th ed. 2018).
- 97Aleksandra Gliszczynska-Grabias & Grazyna Baranowska, The European Court of Human Rights on Nazi and Soviet Past in Central and Eastern Europe, 45 Polish Pol. Sci. Y.B. 117, 119 (2016).
- 98Rainey et al., supra note 86, at 490.
- 99Id. at 485.
- 100Harris et al. (3rd ed. 2014), supra note 87, at 616.
- 101ECHR, supra note 9, at art. 10 (states are allowed to license television).
- 102Id.
- 103See generally Open Door Counselling Ltd. v. Ireland, 246 Eur. Ct. H.R. (ser. A) (1992); Delfi AS v. Estonia, App. No. 64569/09, 2015-II Eur. Ct. H.R. 319.
- 104Delfi v. Estonia, supra note 103, at ¶ 122.
- 105See generally Handyside v. United Kingdom, supra note 90, at 16; European Convention on Human Rights, supra note 9, at art. 10; Oetheimer, supra note 91, at 434.
- 106Delfi v. Estonia, supra note 103, at ¶ 131.
- 107Sunday Times v. United Kingdom, App. No. 6538/74, 30 Eur. Ct. H.R. (ser. A) ¶ 59 (1979).
- 108ECHR, supra note 9, at art. 10.
- 109Steven Greer, The Exceptions to Article 8 to 11 of the European Convention on Human Rights 15 (1997).
- 110Id.
- 111Vogt v. Germany, 323 Eur. Ct. H.R. (ser. A) (1995).
- 112Greece v. United Kingdom, App. No. 176/56, 1959 Y.B. Eur. Conv. On H.R. (Eur. Comm’n on H.R.) (1958-59).
- 113Norris v. Ireland, App. No. 10581/83, 13 Eur. H.R. Rep. 186 (1988).
- 114Jersild v. Denmark, App. No. 15890/89, 298 Eur. Ct. H.R. (ser. A) (1994).
- 115Yutaka Arai-Takahashi, The Margin of Appreciation Doctrine and the Principle of Proportionality in the Jurisprudence of the ECHR 2 (2002).
- 116Id. at 136.
- 117Id. at 127.
- 118Harris (4th ed. 2018), supra note 96, at 608.
- 119Id. at 612.
- 120ECHR, supra note 9, at art. 17.
- 121Harris (3rd ed.), supra note 87, at 621.
- 122Oetheimer, supra note 91, at 429.
- 123Id.
- 124ECHR, supra note 9, at. art. 8.
- 125Harris, (4th ed. 2018), supra note 96, at 608.
- 126ECHR, supra note 9, at art. 9.
- 127Id. at art. 33.
- 128European Court of Human Rights, The ECHR in 50 Questions, at Question 19 (Feb., 2014).
- 129European Convention on Human Rights, supra note 9, at art. 35(1).
- 130Grundgesetz für die Bundesrepublik Deutschland, at art. 5(1), translated in Basic Law for the Federal Republic of Germany (Christian Tomuschat & David P. Currie trans., 2008).
- 131Id.
- 132See generally Thomas Wischmeyer, ‘What is Illegal Offline is Also Illegal Online’ – The German Network Enforcement Act 2017, in Fundamental Rights Protection Online: The Future Regulation of Intermediaries (Bilyana Petkova & Tuomas Ojanen eds., 2019).
- 133Id. at 15.
- 134Sebastien Schwiddesen, German Attorney General: Video Game with Swastika Does Not Violate the Law; Constitutes Art, Lexology (May 8, 2018), http://perma.cc/E3F4-W3CV.
- 135Id. (Wolfenstein is a series of games set in Third Reich era Germany. Wolfenstein 3D has players kill members of the National Socialist Party, and they even go on to kill Hitler in the end. Naturally, Nazi imagery is pervasive throughout the game.)
- 136Id.
- 137Judith Vonberg, Germany Lifts Ban on Nazi Symbols in Computer Games, CNN (Aug. 10, 2018), http://perma.cc/LVL9-C6SV.
- 138Wischmeyer, supra note 132, at 16.
- 139ECHR, supra note 9, at art. 1.
- 140Rainey et al., supra note 86, at 515.
- 141Id.
- 142Delfi AS v. Estonia, supra note 103.
- 143Intermediary Liability, The Center for Internet and Society at Stanford Law School, http://perma.cc/27LM-5SZG.
- 144Robert Spano, Intermediary Liability for Online User Comments under the European Convention on Human Rights, 17 Hum. Rts. L. Rev. 665, 667 n. 5 (2017).
- 145Id. at 667. See also Communication from the Commission to the European Parliament COM 555, the Council the European Economic and Social Committee, and the Committee of the Regions, Tackling Illegal Content Online: Towards an Enhanced Responsibility of Online Platforms (Sep. 28, 2017), at 2. (“what is illegal online is also illegal offline . . . addressing the detection and removal of illegal content online represents an urgent challenge for the digital society today.”)
- 146Delfi v. Estonia, supra note 103.
- 147Id. at ¶ 11.
- 148Gemius Rating, Domains, http://perma.cc/4MSC-XHEZ.
- 149Delfi v. Estonia, supra note 103, at ¶ 12.
- 150Id. at ¶ 13.
- 151Id. at ¶ 16.
- 152Id.
- 153Id. at ¶ 18.
- 154Id. at ¶ 19.
- 155Delfi AS v. Estonia, Columbia Global Freedom of Expression, http://perma.cc/5BPM-K6NG.
- 156See generally id.; Spano, supra note 144, at 669-72; Delfi AS v. Estonia, supra note 103.
- 157Netzwerkdurchsetzungsgesetz, supra note 1, at § 1(1).
- 158Spano, supra note 144, at 670.
- 159Delfi v. Estonia, supra note 103, at ¶ 116.
- 160Lisl Brunner, The Liability of an Online Intermediary for Third Party Content - The Watchdog Becomes the Monitor: Intermediary Liability after Delfi v. Estonia, 16 Hum. Rts. L. Rev. 163, 173 (2016).
- 161See Gemius Rating, supra note 148.
- 162Recall that the Network Enforcement Act only affects social media companies with more than two million registered German users. See Netzwerkdurchsetzungsgesetz, supra note 1, at § 1.
- 163Pihl v. Sweden, App. No. 74742/14, Eur. Ct. H.R. (2017).
- 164Pihl v. Sweden, Columbia Global Freedom of Expression, http://perma.cc/645V-R4H5. See also Pihl v. Sweden, supra note 163, at ¶ 35 (“However, the Court has previously found that liability for third-party comments may have negative consequences on the comment-related environment of an internet portal and thus a chilling effect on the freedom of expression via internet.”).
- 165Magyar Tartalomszolgáltatók Egyesülete and Index.hu v. Hungary, App. No. 22947/13 (Eur. Ct. H.R. 2016).
- 166Id. at para. 5.
- 167Id. at para. 11.
- 168Id. at para. 12.
- 169Harris et al., supra note 87, at 623.
- 170Magyar Tartalomszolgáltatók Egyesülete and Index.hu v. Hungary, supra note 165, at para. 64.
- 171Id.
- 172Id. at para. 51.
- 173Spano, supra note 144, at 673.
- 174Tamiz v. United Kingdom, App. no. 3877/14 (Eur. Ct. H.R. 2017).
- 175Id. at para. 7.
- 176Id. at para. 17.
- 177Id. at paras. 18-21.
- 178Id. at para. 85 (citing Delfi v. Estonia, supra note 103, at ¶ 115-116).
- 179Id. at para. 90.
- 180See generally id.
- 181Sunday Times v. United Kingdom, supra note 107, at para. 59.
- 182Rainey et al., supra note 86, at 496.
- 183See id. at 487; ECHR, supra note 9, at art. 8 (outlining the right to respect for private and family life).
- 184See generally Tamiz v. United Kingdom, supra note 174, at para. 75 (noting that while many user comments are likely defamatory, the majority of comments are likely to be too trivial or limited in publication to cause any significant damages to a person’s reputation); see also Kreimer, supra note 62, at 17.
- 185Frosio, supra note 30, at 7 (citing Andrew Shapiro, The Control Revolution: How the Internet is Putting Individuals in Charge and Changing the World We Know 225 (2000)).
- 186Tamiz v. United Kingdom, supra note 174, at para. 90.
- 187ECHR, supra note 9, at art. 10(2)
- 188Delfi v. Estonia, supra note 103, at para. 136 (“Moreover, the Court has held that speech that is incompatible with the values proclaimed and guaranteed by the Convention is not protected by Article 10 by virtue of Article 17 of the Convention.”).
- 189Deutscher Bundestag, supra note 10.
- 190Id.
- 191Id.
- 192Markus Reuter, Umfrage: Zeitungsredaktionen Schränken Kommentarfunktionen 2015 Weiter ein, Netzpolitik (Apr. 3, 2016), http://perma.cc/V5E2-7YXJ (translated: “Survey: Newspaper editors continue to restrict commentary features in 2015.”).
- 193Delfi v. Estonia, supra note 103.
- 194Twitter Netzwerkdurchsetzungsgesetzbericht, supra note 39.
- 195According to Internet Live Stats, 6,000 tweets are sent per second. Twitter Usage Statistics, Internet Live Stats, http://perma.cc/AAY7-6MKH. That means it would take fewer than 5 seconds to reach the amount of content removed over those six months, and 45 seconds to reach the number of complaints lodged.
- 196The Network Enforcement Act Apparently Leads to Excessive Blocking of Content, Reporters Without Borders (Aug. 3, 2018), http://perma.cc/H3D9-T7ND.
- 197David Meyer, Facebook Can Block Hate Speech, Even If It's Not Illegal, Court Rules, ZDNet (Sept. 18, 2018), http://perma.cc/D6YD-7ABK.
- 198Franklin Foer, The Death of the Public Square, The Atlantic (July 6, 2018), http://perma.cc/AXM5-8C5S.
- 199Von Constantin Van Lijnden, Facebook, Geben Sie Redefreiheit!, Frankfurter Allgemeine Zeitung (Sep. 6, 2018), http://www.faz.net/aktuell/feuilleton/medien/facebook-darf-nicht-eigenhaendig-beitraege-loeschen-15773244.html.
- 200Id.
- 201Spano, supra note 144, at 668.
- 202Dorothée Baumann-Pauly, German Companies Report on the Implementation of New Hate Speech Law, Commentary: NYU Stern Center for Business and Human Rights (Aug. 7, 2018), http://perma.cc/VB89-6AR7.
- 203Wischmeyer, supra note 132, at 16.
- 204Kirschbaum, supra note 83.
- 205Janneke Gerards, How to Improve the Necessity Test of the European Court of Human Rights, 11 Int’l J. of Const. L. 466, 469 (2013).
- 206Harris et al., supra note 87, at 637.
- 207Perrin v. United Kingdom, App. No. 5446/03 (Eur. Ct. H.R. 2005) (“The fact that dissemination of the images in question may have been legal in other States, including non-Parties to the Convention such as the United States, does not mean that in proscribing such dissemination within its own territory and in prosecuting and convicting the applicant, the respondent State exceeded the margin of appreciation afforded to it.”).
- 208European Commission, C(2018) 1177, Commission Recommendation of 1.3.2018 on Measures to Effectively Tackle Illegal Content Online (Mar. 1, 2018).
- 209Human Rights Watch, supra note 31.
- 210Baumann-Pauly, supra note 202.
- 211Kate Connolly, Chemnitz Riots Spark Calls for AfD to Be Put under Surveillance, The Guardian (Sept. 4, 2018), http://perma.cc/TB39-XHPV.
- 212Kinstler, supra note 53.
- 213Deutscher Bundestag, supra note 10.
- 214Kinstler, supra note 11.
- 215Tamiz v. United Kingdom, supra note 174, at para. 71.
- 216Fine and Punishment, The Economist (July 21, 2012), http://perma.cc/4TTY-RG67.
- 217Gerards, supra note 205, at 483.
- 218Tamiz v. United Kingdom, supra note 174, at para. 82.
- 219Id.
- 220Take, for example, the case of Cody Wilson. Mr. Wilson wished to publish blueprints for 3-D-printed guns online in summer of 2018. The last time he posted blueprints for a handgun, they were downloaded 100,000 times from his site before the State Department ordered it to be taken down. See generally Kurtis Lee, Want to Make a Gun with a 3-D Printer? Here Is Why Gun Control Groups Oppose the Practice, LA Times (July 31, 2018), http://perma.cc/DP8A-KXLN.
- 221While Hercules would cut off the heads of the Hydra, Iolaus would cauterize the necks to prevent them from growing back.
- 222Samuel Gibbs, Facebook Bans Women for Posting 'Men Are Scum' after Harassment Scandals, The Guardian (Dec. 5, 2017), http://perma.cc/XV7F-8YUP.
- 223Wischmeyer, supra note 132, at 20.
- 224EU to Give Internet Firms 1 Hour to Remove Extremist Content, Associated Press (Sept. 12, 2018), http://perma.cc/U8KV-Z93T.