In a long-awaited judgment, the European Court of Human Rights upheld the decisions of the Estonian courts (a domestic court and the Tallinn Court of Appeal in 2008, and Estonia’s Supreme Court in 2009) and of the Court’s Chamber (judgment of October 2013).
Delfi had appealed previous judgments that found the publisher liable for manifestly offensive comments posted on its site, arguing that the court rulings violated the right to freedom of expression (Article 10 of the European Convention on Human Rights). In the final word on the matter, the Grand Chamber ruled – by fifteen votes to two – that there was no violation of the right to freedom of expression. Background on the case and the judicial process can be found here.
What does this judgment mean for publishers and for the future of online comments hosted on news organisations’ websites? We asked Dirk Voorhoof, Professor of Media Law, Copyright Law and Journalism & Ethics at Ghent University, for his view. Dirk Voorhoof commented on the Delfi case at an earlier stage of the proceedings (here) and took part in the coalition requesting a referral of the case to the Grand Chamber.
What are the implications of this judgment for publishers?
DV: “The Grand Chamber has left the door wide open for national authorities to bar publishers of news fora with readers’ comments from invoking, or relying on, the limited liability afforded to internet service (hosting) providers. From now on, it will no longer be sufficient for publishers of such online fora to remove illegal/defamatory/insulting content upon notice – they risk liability because the material has already been online for some time.
They will need to pre-monitor user comments or remove manifestly illegal content on their own initiative, or risk being held liable by national authorities in case of a complaint. The national judicial authorities won’t have much choice but to decide this way, as otherwise they can be held in breach of Article 8 of the Convention, as the judgment explains: not holding a platform liable for keeping defamatory or offending content online and accessible for some time would infringe the right to reputation and the rights of others. That the Grand Chamber refers only to manifestly defamatory content or hate speech inciting violence does not remove the general obligation to monitor, as general monitoring is necessary to identify such manifestly illegal or offending hate speech.”
In short, as Professor Voorhoof noted in a first comment on Strasbourg Observers, the Delfi “news platform was to be considered a provider of content services, rather than a provider of technical services, and therefore it should have effectively prevented clearly unlawful comments from being published.” Moreover, “The circumstance that Delfi had immediately removed insulting content after having received notice of it, did not suffice to exempt Delfi from liability”, Prof. Voorhoof commented.
Does this judgment set a precedent?
DV: “It creates an open standard! Let me quote what two members of the Grand Chamber wrote about the judgment delivered by the majority:
“In this judgment the Court has approved a liability system that imposes a requirement of constructive knowledge on active Internet intermediaries (that is, hosts who provide their own content and open their intermediary services for third parties to comment on that content). We find the potential consequences of this standard troubling. The consequences are easy to foresee. For the sake of preventing defamation of all kinds, and perhaps all “illegal” activities, all comments will have to be monitored from the moment they are posted. As a consequence, active intermediaries and blog operators will have considerable incentives to discontinue offering a comments feature, and the fear of liability may lead to additional self-censorship by operators. This is an invitation to self-censorship at its worst”.
DV continues: “It is correct that the judgment seems only to be applicable to fora that are managed by professional, commercially run publishing companies, and that this case does not concern “other fora on the Internet” where third-party comments can be disseminated – for example, an Internet discussion forum or a bulletin board where users can freely set out their ideas on any topic without the discussion being channelled by any input from the forum’s manager. Nor does the Grand Chamber’s finding apply to a social media platform where the platform provider does not offer any content and where the content provider may be a private person running the website or “a blog as a hobby”.
The Court indeed places very strong emphasis on liability where a professionally managed Internet news portal, run on a commercial basis, is concerned. Freedom of expression, in its far-reaching dimensions and with its high level of protection, seems no longer guaranteed on professional media platforms – only for the “hobby”!
In general, what I find most problematic is that the Grand Chamber substantially lowers the level of protection of Delfi (and of other professional platforms with user-generated content), shifting it from the limited liability of an ISP towards the traditional liability of a publisher, solely because of their commercial nature or economic aim.”
It seems, however, that the comments were considered hate speech and incitement to violence. Isn’t that a pertinent reason to have such comments removed from online fora as soon as possible?
DV: “The Grand Chamber indeed emphasises that the impugned comments in the present case mainly constituted hate speech and speech that directly advocated acts of violence. Hence, establishing their unlawful nature did not require any linguistic or legal analysis by Delfi, since the remarks were manifestly unlawful on their face. According to the Grand Chamber, its judgment is not to be understood as imposing a form of “private censorship”. However, the judgment treats interferences and removals taken on the initiative of online platform providers as the necessary way to protect the rights of others, while there are other ways to achieve the same goal with less overbroad (pre-)monitoring of all user-generated content, or with less collateral damage for freedom of expression and information. Such measures could include taking action against the content providers themselves, and effectively imposing obligations on providers to help identify the (anonymous) content providers in cases of manifest hate speech or other illegal content. A right of reply or rectification can also be considered a more proportionate response – alongside, of course, expeditiously taking down manifestly offensive comments upon request. Obliging online platforms to filter or monitor users’ comments in order to prevent any possible liability for illegal content creates a new paradigm for participatory online media.”