Tuesday, 16 July 2024

4 Times NO: Article 13 Censorship Filter Confirmed as Illegal

The Max Planck Institute for Innovation and Competition (‘the Max Planck Institute’) responds [PDF – link updated on 20 Sept, 2017] to the questions on the censorship filter (Article 13) addressed to the Council Legal Services by a series of Member States (Belgium, the Czech Republic, Finland, Hungary, Ireland and the Netherlands) in the ‘non-paper’ leaked by Statewatch (see our analysis here). In the meantime, it seems that the German government also submitted a contribution to the Council, wherein it too expresses concerns about Article 13.

The Max Planck Institute’s reply draws on their earlier work [PDF] on Article 13, which includes some alternative language (see pp. 20ff). This excellent contribution to the debate was authored by Professor Dr Reto Hilty, Director of the Institute and Professor at the University of Zurich, and Dr Valentina Moscon, Senior Research Fellow at the Institute.

The Conclusion: End this Madness & Do Not Adopt Article 13 and Recitals 38 & 39

In its paper, the Max Planck Institute calls on policymakers not to adopt Article 13 and Recitals 38 and 39, criticising the proposals on both their approach and their substance. More specifically, the authors consider that these provisions:

  1. create legal uncertainty, due to the use of:
    • undefined legal concepts; and,
    • barely understandable formulations.
  2. do not specify how they fit into and align with the InfoSoc Directive [2001/29/EC] and e-Commerce Directive (ECD) [2000/31/EC].
  3. are inconsistent with the ECD; and,
  4. could enable abusive behaviour that threatens fundamental human rights, such as the freedom of expression and information as specified under Article 11 of the Charter of Fundamental Rights of the European Union (‘the Charter’).

The authors suggest alternatives to Article 13, such as:

  • harmonising ‘notice-and-take down’ procedures;
  • introducing ‘counter notice’ procedures; and,
  • implementing a user-generated content (UGC) exception.

 

The thoroughness of the analysis (16 pages), set out in more detail below, and the damning nature of its conclusion are in stark contrast with the ‘answer’ the Member States got from the Council Legal Services: delivered orally and in less than 10 minutes. But then again, maybe French (the language used by the services for their ‘contribution’) is more to the point than English? Or is it just that in one case academic analysis prevails, whilst in the other the Council Legal Services’ exercise is a mere rubber-stamping of the European Commission’s rhetoric?

Shhh. Stop talking. The answer is still no. N. O. No.

Question 1: Article 13 vs. Copyright Exceptions & the Charter

“Article 13 – can lead to a significant limitation of the fundamental rights including freedom of expression and information”

The Member States asked the Council Legal Services: Is Article 13 compatible with the Charter and are the proposed measures justified and proportionate?

The Max Planck Institute’s answer: NO. The authors explain that: “the Proposed Article 13 entails serious risks of contrasts with the Charter of Fundamental Rights as well as with copyright exceptions” (p. 4).

The paper warns that “content recognition technology and procedures enable abuse” and that, since these technologies “can lead to a sensitive limitation of the fundamentally protected freedom of expression and information (Article 11 of the Charter of Fundamental Rights), it must remain reserved to legally authorised judges to decide on the legality of content” (p. 5).

The Max Planck Institute therefore calls for upholding the ECD’s fundamental principle, enshrined in its Article 15, that service providers are under no general obligation to filter or monitor user content.

Question 2: Article 13 vs. Article 14 of the ECD

Sub-question 1 – The Member States asked: Is it appropriate to modify the ECD’s horizontal application and interpretation in a Recital?

The Max Planck Institute’s answer: NO. The authors explain that: “Derogations from existing law – if any – should be specific and clearly stated in the text of the proposal” (p. 6).

The paper points out that Article 13 “contains a series of undefined legal concepts that make it difficult to identify points of contact and differences of the proposal with the E-Commerce Directive”. Such ‘undefined legal concepts’ are, for example, the scope of the service providers covered and the criterion of ‘large amounts’ of works or other subject-matter (p. 6).

The Max Planck Institute also highlights the EC’s failure to list the ECD amongst the directives that are left intact (see Article 1(2) of the proposal), as this “may raise interpretative doubts”. The paper adds that this problem could be aggravated “by the fact that some incompatibilities exist with regard to Article 14 of the E-Commerce Directive” (p. 7).

Recital 38, in referring to Article 14(1) of the E-Commerce Directive, misinterprets the CJEU case law on the matter

Sub-question 2 – The Member States asked: Is the current state of play of CJEU [Court of Justice of the European Union] jurisprudence regarding the ECD’s liability exemptions described in Recital 38 accurate and complete?

The Max Planck Institute’s answer: NO. The authors warn that “this legislative approach – if not corrected during the legislative process – will worsen the current patchwork causing further significant inconsistencies” (p. 8).

Sub-question 3 – The Member States asked: Would it not be preferable to replace part of Recital 38 with a “without prejudice” clause in respect of the ECD?

The Institute’s answer: NO. The authors consider that there is no quick fix, and that this would only be “a minimal achievement in the context of a proposal that needs to be completely reviewed” (p. 9).

Question 3: Article 13 vs. Article 15 of the ECD

The Member States also asked: Does the prohibition in Article 15 of the ECD on Member States imposing general monitoring obligations not apply in a situation where Member States’ legislation would oblige certain platforms to apply technology that identifies and filters all the data of each of their users before upload onto publicly available services?

The Max Planck Institute’s answer: NO.

“(…) obliging certain platforms to apply technology that identifies and filters all the data of each of its users before the upload on the publicly available services is contrary to Article 15 of the [E-Commerce] Directive as well as the European Charter of Fundamental Rights.” (p. 10)

The paper points out that the inconsistencies with Article 15 of the ECD, just as with Article 14, are not clearly resolved by the proposal.

To those who would claim that Article 13 does not impose an ex-ante general monitoring or filtering obligation, the authors explain that (pp. 10-11; a short illustrative sketch follows this list):

  1. “‘effective content recognition technologies’ will, by definition, require monitoring”;
  2. infringing content cannot be effectively recognised on a platform by means of a technological tool without oversight of the totality of the content on that platform;
  3. the EC’s own Impact Assessment (see Annex 12A – pp. 164-165) sets out that content recognition technologies check each piece of content that an end user attempts to upload onto the service. Therefore, “installing and applying a content recognition (i.e. filtering) system would involve the active monitoring of almost all of the data relating to all of service provider users”;
  4. “What matters is the breadth of the material being monitored, which represents the difference between general and specific monitoring. As made clear by the CJEU, Article 15(1) of the E-Commerce Directive is aimed at avoiding a system for filtering that applies indiscriminately to all users as a preventive measure” (see CJEU Case C-360/10, SABAM v Netlog); and,
  5. “(…) the introduction of “effective content recognition technologies” would also be incompatible with the Charter of Fundamental Rights of the EU. In the Netlog case, for instance, the imposition of a filtering obligation was seen as incompatible with Article 15, but also with a fair balance of the fundamental rights at stake.”
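
To make concrete why ‘effective content recognition technologies’ necessarily entail general monitoring, here is a minimal sketch in Python. It is purely illustrative: the names (REFERENCE_DB, fingerprint, handle_upload) are hypothetical, and a naive hash stands in for whatever perceptual fingerprinting a real system would use. The structural point is that the check runs on every upload from every user, before publication.

    import hashlib

    # Hypothetical database of fingerprints supplied by rightsholders.
    # (Illustrative placeholder entry only.)
    REFERENCE_DB = {
        "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae": "Some Label – Some Song",
    }

    def fingerprint(data: bytes) -> str:
        # Stand-in for a real audio/video fingerprinting algorithm;
        # a plain hash is used purely for illustration.
        return hashlib.sha256(data).hexdigest()

    def handle_upload(user_id: str, data: bytes) -> bool:
        # Runs on EVERY upload by EVERY user, before publication.
        # Note that the decision cannot depend on user_id: to recognise
        # infringing content, the filter must inspect the totality of
        # uploads, indiscriminately and as a preventive measure
        # (cf. SABAM v Netlog, C-360/10).
        if fingerprint(data) in REFERENCE_DB:
            return False  # blocked ex ante, without any judicial decision
        return True       # published

Whatever matching technique is substituted for the hash, the shape of handle_upload cannot change: there is no way to recognise infringing uploads without first inspecting all uploads.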

Question 4: Article 13 vs. the notion of a ‘communication to the public’

Finally, the Member States asked: Does the Legal Service consider it sufficient to “provide access to the public” to a copyrighted work to constitute an act of communication to the public under Directive 2001/29, or does the CJEU require that further conditions be met to establish a communication to the public?

The Institute’s answer: NO. The authors consider that the EC is cutting corners with its interpretation in Recital 38 that service providers that “store and provide access to the public to copyright protected works or other subject-matter uploaded by their users” are “thereby going beyond the mere provision of physical facilities and [are] performing an act of communication to the public”.

“Recital 38 of the proposal misunderstands EU copyright and related rights law by assuming that these providers go beyond the mere provision of physical facilities and perform an act of communication to the public”

In their view, “Recital 38 of the proposal misunderstands EU copyright and related rights law by assuming that these providers go beyond the mere provision of physical facilities and perform an act of communication to the public” (p. 16).

The paper explains why the EC’s reasoning is flawed on three levels (pp. 12-15):

  1. “provide access to the public to copyright protected works”: The authors consider that it is unclear what this looks like and how it differs from “storing”. In this context, they point to the L’Oréal v eBay CJEU case (Case C-324/09), where the Court referred to the conditions of ‘optimising the presentation’ or ‘promoting’, and question “the relationship between those conditions – which are referenced again in the Recital 38(2) in respect of Article 14 of the E-Commerce Directive – and the notion of ‘provide access to the public to copyright protected works’”. In their view, this raises yet further concerns.
  2. the notion of “communication to the public”: The authors explain that Recital 38 does not reflect the notion of “communication to the public” of the InfoSoc Directive as interpreted by the CJEU, adding that the InfoSoc Directive does not define the concept of “communication to the public”.

    “(…) we should not overlook the fact that the approach pursued by the CJEU aiming at broadening the scope of the right of communication to the public is the consequence of a lack of harmonized intermediaries’ liability and secondary copyright infringement.”

    They continue that “in its case law on Article 3(1), the CJEU has consistently stated that the essential requirements of Article 3(1) are an ‘act of communication’, directed to a ‘public’”, and remark that the CJEU has also introduced further criteria (e.g. a necessary and deliberate intervention on the side of the intermediary, a profit-making intention, etc.). One of these criteria is the need for “full knowledge of the protected nature of that work and the possible lack of consent to publication on the internet by the copyright holder” (see CJEU Case C-160/15, GS Media v Sanoma), a criterion that the Court has kept using in follow-up decisions, such as the more recent Pirate Bay case. The paper highlights that in this case the CJEU stresses that these criteria “must be applied case by case both individually and in their interaction with one another” (§25). The authors consider that even from this last decision “it is impossible to draw the conclusion that information society service providers perform an act of communication to the public by storing and ‘providing access’ to user-uploaded content”.
  3. widening the scope of copyright protection (and therefore copyright liability): The authors consider that “the attempt to bring the intermediaries’ liability within the field of copyright infringement (primary liability) is open to criticism: it was never the intention of the EU legislature, on the basis of the InfoSoc Directive, to harmonize the conditions for secondary liability of intermediary services”. In this context, they quote Advocate General Miguel Poiares Maduro of the CJEU, who explained, in a trademark infringement case, that “liability rules are more appropriate, since they do not fundamentally change the decentralised nature of the internet (…)”.

What other academics say

Herman Rucic is Senior Policy Manager in the secretariat of the Copyright 4 Creativity (C4C) coalition. He has been Senior Policy Manager at N-square Consulting since September 2010. [All content from this author is made available under a CC BY 4.0 license]