The new German social media law – a risk worth taking? An ‘extended look’ by Stefan Theil

19 February 2018 by Stefan Theil

The German Gesetz zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken (Netzwerkdurchsetzungsgesetz) (literally: Law on the improvement of law enforcement in social networks, known as the ‘NetzDG’) has attracted much media attention, e.g. here and here, since fully entering into force on 1 January 2018. This was sparked to a significant extent by a few high-profile deletions, including that of a tweet from the responsible Minister for Justice.

This contribution will give an overview of the NetzDG and explain how some of the criticisms are overstated and partially misguided. While the NetzDG is unlikely to resolve all challenges surrounding social media and freedom of expression, and undoubtedly presents a certain risk of stifling expression online, I believe it is nonetheless a significant step in the right direction. Rather than undermine freedom of expression, it promises to contribute to more inclusive debates by giving the loud and radical voices less prominence. In any case, it appears reasonable to let this regulatory experiment play out and observe whether fears over a ‘chilling effect’ on free expression are borne out by the evidence. A review of the law and its effects is planned after an initial three-year operation period, which should deliver ample data and regulatory experience while limiting the scope for potential harm.

 

The statute in a nutshell

The NetzDG provides compliance regulations for social media platform operators with at least two million users within Germany. Social media networks are defined as internet platforms that seek to profit from providing users with the opportunity to share content with other users and the broader public. Platforms which provide individualised communication services, such as email or messaging apps, as well as platforms providing editorialised content, such as news websites, are explicitly excluded from the scope of the law (§ 1 I NetzDG).

The core obligations are setting up an effective and transparent complaints management infrastructure (§ 3 NetzDG) and compiling half-yearly reports on the complaints management activity (§ 2 NetzDG). The reporting obligations are quite detailed and include provisions that set out the training and management oversight requirements of the social media platform operators. The complaints management infrastructure must chiefly ensure that the social networks delete or block illegal content within a specified timeframe. Deletion results in a global removal of the content from the platform, while blocking merely makes the content unavailable in Germany. Although blocking and deleting are thus distinct, they will be collectively referred to as deleting throughout the post.

Content is designated illegal if it falls under one of the enumerated provisions of the German criminal code (Strafgesetzbuch – StGB). The most important for the purposes of freedom of expression are insult (§ 185), defamation (§§ 186 and 187), public incitement to crime (§ 111), incitement to hatred (§ 130) and dissemination of depictions of violence (§ 131). It is important to note that the obligation to delete or block is not novel: the NetzDG merely enforces an existing legal obligation under § 10 of the Telemedia Act (Telemediengesetz – TMG), under which social media operators are liable for illegal content on their sites under criminal and private law.

The NetzDG further distinguishes between manifestly illegal and merely illegal content and prescribes different deadlines for deletion. Manifestly illegal content must be deleted within 24 hours of receiving a complaint, while merely illegal content allows for up to seven days before action must be taken. The most important exception to the seven-day deadline applies if operators refer the decision of whether to delete to an independent body of industry self-regulation.

Such bodies must be set up and funded collectively by social media platform operators and reach independent decisions that the operator accepts as binding. Certain conditions apply to bodies of industry self-regulation (§ 3 VI NetzDG), and they must be accredited by the Ministry of Justice. Such bodies are a common feature of the German regulatory landscape and have been set up, for instance, by the movie, TV and computer games industries to rate the age-appropriateness of content (the FSK, FSF and USK respectively).

Separately from this duty on social media platforms to delete content, it remains possible for anyone to seek criminal prosecution for any content that violates the criminal code, and social media platforms must preserve the deleted content for evidence purposes for 10 weeks (§ 3 II 4 NetzDG). Additionally, the NetzDG requires social media platform operators to name an agent in Germany that is responsible for receiving complaints. A failure to name a responsible agent, or a lack of response from one, attracts a fine of up to 500,000 euros, while any other failure to implement a complaints management scheme as set out in the NetzDG can draw a fine of up to 5 million euros. The latter increases to 50 million euros for legal persons and corporations under § 30 II 3 of the Code on Administrative Offences (Gesetz über Ordnungswidrigkeiten); see § 4 II 2 NetzDG.

 

NetzDG and ‘hate speech’

It is debatable whether it is useful to view the NetzDG as an attempt at curbing hate speech on social media. This is largely due to the specific criminal law provisions referenced by the statute and the peculiarities these produce. While the criminal law provisions referenced in the NetzDG cover some aspects of what is conventionally, and in the context of some jurisdictions legally, defined as ‘hate speech’, there is no general definition or use of this terminology in German law. Collectively, the enumerated provisions of the German criminal code simultaneously criminalise more and less than would be encompassed by a generic ban on hate speech.

For instance, one might attract criminal liability for defamation when describing an abortion doctor’s work as ‘babycaust’ even though this does not fall under most definitions of hate speech, as it is not based on attributes such as race, religion, ethnic origin, sexual orientation, disability, or gender. Conversely, posters by a far right party depicting ethnically stereotyped people on a flying carpet with the caption ‘Have a good flight home’ did not attract criminal liability even though they arguably constitute hate speech on the grounds of race, religion and ethnic origin.

Hence, the analytical value of hate speech is limited due to the particular criminal provisions NetzDG is based upon. It would be more accurate to say that the statute itself does precious little beyond seeking the removal of content that one cannot already express in public without the risk of criminal prosecution and sanctions. The law expressly avoids creating new criminal offences. It does not, in any real sense, seek to expand existing limitations on freedom of expression in Germany. The fact that one could in the past express many views that constitute incitement to hatred on social media platforms without any real fear of repercussions does not fundamentally alter the conclusion that the true focus of this law is on enforcement.

 

An assault on freedom of expression?

It is rare for a legal system to treat freedom of expression as an absolute right. Most European jurisdictions, including Germany under the Basic Law, recognise that there are limits. As a matter of German constitutional law, it is not clear that the NetzDG would run afoul of freedom of expression. At this stage it is useful to distinguish two scenarios.

In the first scenario, a social media platform operator deletes content that is illegal: in this case freedom of expression is not violated. Under the German Basic Law, freedom of expression does not cover insult and defamation, or incitement to hatred. To the extent that deletion of the illegal content amounts to an infringement, this is justified as it is provided for by general laws under Article 5 II Basic Law. Moreover, deleting illegal content appears a measured sanction, given that such statements, when made offline, often attract criminal prosecution, which may result in fines and prison sentences.

Conversely, in the second scenario the operator deletes content after mistakenly deeming it illegal. Here, the issues become more complicated. The German Federal Constitutional Court has recognised that there is a presumption in favour of freedom of expression whenever it is unclear whether the expression is illegal, at least on topics of public interest. This has been settled case law ever since the famous decision in Lüth and, more recently, Wunsiedel. Notably, this protection extends to public forums, even where access to them is regulated through private law relationships. However, the NetzDG does not require censorship (i.e. pre-emptively scrutinising content before it is shared, which is unconstitutional under Article 5 I 3 Basic Law), nor does it discriminate against specific content (which would generally be unconstitutional). Rather, it primarily enforces existing legal obligations under § 10 TMG, and the requirement to delete stretches only to content already illegal under criminal law provisions.

Hence, arguments alleging unconstitutionality rely primarily on the unsubstantiated contention that the NetzDG will promote an overly aggressive deletion policy (so-called ‘overblocking’) that will have a ‘chilling effect’ on freedom of expression for users of social media platforms: reducing their readiness to make use of their rights. If overblocking does take place as a result of the NetzDG, this would indeed be problematic under the German Basic Law.

 

The overstated danger of ‘overblocking’

Despite their prevalence in legal writing on the subject, concerns that social media platforms will, when in doubt, delete content rather than risk a fine appear overstated. Overblocking is likely to arise, so goes the argument, due to the structure of the fines that apply to a systematic failure to delete illegal content. Hence, a prudent social media platform operator would, when in doubt and confronted with a flurry of complaints, delete questionable content rather than risk a fine. This is discussed in an English-language article here.

With respect to illegal content, the matter is unproblematic from a constitutional perspective. For the reasons stated earlier, social media users do not benefit from protections under freedom of expression for illegal content.

Again, the more problematic scenario arises when the social media platform operator mistakenly deletes legal content. For the user, this represents an infringement of freedom of expression. Indeed, if overblocking is a prevalent phenomenon beyond the occasional erroneous decision of the complaints management infrastructure, it could dissuade users from expressing their views on the platform. This, in turn, would render the NetzDG significantly more problematic, and arguably unconstitutional. The Federal Constitutional Court has found a violation in an order requiring the publishers of a satirical magazine to pay compensation to an individual for an allegedly defamatory article, chiefly basing its ruling on the risk that such an order would discourage future exercise of freedom of expression.

However, it is not clear that such a chilling effect is inevitable: occasional, non-systematic mistakes by social media platform operators in an otherwise lawful complaints management infrastructure would arguably not suffice to produce such an effect. Notably, and contrary to the impression given by some reports, no fines attach to decisions in individual cases. Rather, a fine requires a systemic and persistent failure in the complaints management infrastructure which must be substantiated through content that has been ruled illegal by a court (§ 4 V NetzDG).

It is difficult to see why a social media platform operator, which ultimately requires continuous user engagement and content creation to be profitable, would adopt an overly aggressive deletion policy. An exodus of users would be sure to follow the consistent and arbitrary deletion of legal content, and thus critically undermine the viability of the social media platform. It therefore appears more likely that the limited scope of the fines and the inherent economic interests of social networks encourage a more nuanced deletion policy: one that complies with existing laws but avoids removing more content than necessary. However, even assuming a measurable ‘chilling effect’, this would not necessarily equate to the unconstitutionality of the NetzDG.

 

A limited free speech environment

The argument being advanced here is that freedom of expression does not necessarily equate to a right of access to any specific means of expression. For instance, a recipient of social security was not entitled to claim the transportation costs necessary to travel to a protest meeting. Moreover, access to and expression on most social networks is already significantly limited through private law terms and conditions. These grant platform operators wide-ranging powers to delete content or even indefinitely suspend accounts of users for actions that are unlikely to fall afoul of German criminal law. Against this backdrop, it is difficult to sustain the argument that the potential unintended side effects of the NetzDG are unique or would on their own suffice to find it unconstitutional. Social media platforms are hardly a free speech paradise: users already operate in an environment where arbitrary limitations are placed on freedom of expression, where deletion of content and suspension of accounts may occur without the possibility of appeal or redress to courts, and where the terms and conditions of participation can be altered at the sole discretion of the platform operators at any time.

 

Conclusion

Overall, the NetzDG might after all form part of a civilising influence on online debate, instead of having a one-sided chilling effect on freedom of expression. The fact that to date social media, and online interaction more generally, have created a space for a significantly more laissez-faire approach to expression is neither here nor there on the question of constitutionality. The obligations to delete illegal content are based on well-established limits to freedom of expression, to which the NetzDG chiefly adds a more robust enforcement mechanism.

The constitutionality of the NetzDG may to a considerable extent rest on an evaluation of the complaints management infrastructure that social media operators develop. Should it consistently, and inevitably, lead to a chilling effect on freedom of expression, then the argument for unconstitutionality grows stronger, though in my view it is by no means straightforward. Conversely, the Federal Constitutional Court would be less likely to take issue with this novel regulatory approach if the deletion of legal content is limited to individual, non-systematic mistakes. Ultimately, the goal must be to limit the divide between the online and offline worlds as far as possible: it is not evident why constitutionally acceptable limits on freedom of expression should not be extended to and enforced on social networks.

 

Stefan Theil is a Research Fellow at the Bonavero Institute of Human Rights, Faculty of Law, University of Oxford. This contribution previously appeared on Verfassungsblog.

1 comment


  1. tyelko says:

    I have myself had the experience of Facebook telling me that a post I reported did not violate community guidelines. I used the function provided to give negative feedback on the experience of reporting the post, and announced in writing that I would use the NetzDG reporting mechanism, since I was convinced the post violated § 130 of the criminal code (incitement to hatred). Even before I received a result for the NetzDG report, I received another answer to my initial report: the post had been removed and the person who made it banned – and the reply to my NetzDG report confirmed that Facebook now all of a sudden agreed with me that the post was a likely violation of § 130.
    This also illustrates part of why and how the law came about: if Facebook had applied more due diligence to regular reports, the NetzDG would likely never have come about. And that includes not just removing content more rigorously but also verifying reports more rigorously, because the notion that the NetzDG limits freedom of speech ignores that Facebook has been removing content and temporarily or permanently blocking people all along – sometimes for the most ludicrous reasons. What the NetzDG does is force them to compare a post more rigorously against legal standards. Before that, it sometimes seemed like they were rolling dice, and revenge reporting was certainly an issue.
