
Over 50 Academics Slam Censorship Filter & Join Calls to Stop © Madness

On 17 October, 56 respected academics co-signed a recommendation on measures to safeguard fundamental rights and the Open Internet in the framework of the EU copyright reform. This effort is a reaction to the multiple questions regarding the legality of the so-called censorship filter (Article 13 and its Recitals) that were raised by seven Member States, including Germany (see here and here).

The academic paper is unequivocal as regards Article 13:

“Article 13 (…) is disproportionate and irreconcilable with the fundamental rights guarantees in the Charter [of Fundamental Rights of the EU]” (p. 14) and “contains imbalanced, undefined legal concepts that make it incompatible with the existing acquis” (p. 23).

This is not the first time that academics speak up in the debate (see here, here, and here). The initiative comes just days after over 50 NGOs representing human rights and media freedom sent an open letter to the European Commission President, the European Parliament and the Council asking them to delete the censorship filter proposal (Article 13), as it “would violate the freedom of expression set out in (…) the Charter of Fundamental Rights” and “provoke such legal uncertainty that online services will have no other option than to monitor, filter and block EU citizens’ communications”. It is especially striking that organisations such as Reporters without Borders and Human Rights Watch, which are known to intervene for the protection of human rights in less democratic countries, have felt the need to voice their concerns in this matter, to ensure that EU citizens are safeguarded from having their fundamental rights crushed by the EU’s copyright agenda.

As pointed out in a previous post this week about POLITICO leaking the Council Legal Service’s written response to the Member States’ questions, the leaked Council document concluded that a censorship filter would be illegal at various levels, that Article 13 therefore cannot be interpreted as requiring such a measure, and that the questions raised were legitimate. The academics’ recommendation echoes this message, but whilst the forcefulness and tone of the Legal Service’s views were clearly tempered by internal EU politics, the academic paper sets out its conclusions without any ambiguity.

What’s at stake?

In practice “the application of filtering systems that would result from the adoption of Article 13 (…) would place a disproportionate burden on platform providers, in particular small and medium-sized operators, and lead to the systematic screening of personal data, even in cases where no infringing content is uploaded”, and on top of that these measures “would also deprive users of the room for freedom of expression that follows from statutory copyright exceptions” (p. 1).

The academics do not however dodge the debate, and propose concrete solutions to possible failures of the system in place: instead of looking at intrusive filters, EU legislators need to “clarify and further harmonize the rules for the hosting and provision of access to content uploaded by users, and reward authors and performers for the online use of their creations” (p. 3). These clarifications should result in “more specific, harmonized rules on the ‘notice and takedown’ procedure and include the introduction of a ‘counter notice’ procedure” (p. 4).

Bonus points for this solution: “It is in line with the current acquis and generates an additional revenue stream for authors and performers without encroaching upon fundamental rights and freedoms and eroding the safe harbour for hosting in Article 14 of the E-Commerce Directive” (p. 24). In other words: it’s legal AND smart!

It’s that easy, really. Consumers could continue to enjoy the virtues of the Internet, online platforms and start-ups would no longer be under threat, and authors and performers would get the fair remuneration they are calling for. However, will this solution please rightholders? Well, that’s another question, as their goal is to do everything in their power to obtain legislation that entrenches their business model. Some say it’s about the money, money, money… or maybe it’s just about control?

But let’s have a closer look at what these academics have to say on Article 13.

Compatibility with the Charter of EU Fundamental Rights? NO!

One major question the different Member States struggle with is whether Article 13 is compatible with the Charter of EU Fundamental Rights. The academics’ answer is quite straightforward: No.

In their view, the current measures foreseen in Article 13 “can hardly be deemed compatible with the fundamental rights and freedoms guaranteed under Articles 8 (protection of personal data), 11 (freedom of expression) and 16 (freedom to conduct a business) of the Charter of Fundamental Rights of the EU” (p. 1).

The conclusion of the academics goes against the view taken by the Council Legal Service in their response to the Member States, which is “that the Court would be likely to find the proposed Article 13 together with its recitals 37 and 39 as reflecting a fair balance between the fundamental rights at stake (ie the protection of intellectual property and the freedom to conduct a business)” (par. 13). This reasoning stems from the Legal Service’s starting point that a censorship filter would be illegal at various levels, and that Article 13 can therefore not be interpreted as requiring such a measure – a premise that reads more like wishful thinking, and that can only be understood in conjunction with the paper’s repeated criticism of the poor drafting and lack of legal clarity of the European Commission’s proposal.

So, how did the academics come to their more straightforward conclusion?

They start from the reasoning that the Court of Justice of the European Union (CJEU) has made it explicit that in transposing and implementing EU directives “Member States must […] take care to rely on an interpretation of the directives which allows a fair balance to be struck between the various fundamental rights protected by the Community legal order” (p. 10). In order to assess whether such a balance could be achieved, they look at the CJEU’s ruling in the Sabam/Netlog case (C-360/10), which they explain “offered the CJEU the chance to provide guidance on a filtering system that could become a standard measure if Article 13 (…) was implemented at the national level” (p. 11). In this case, they note that the “Court took as a starting point the explicit recognition of intellectual property as a fundamental right”, but that it also “recognized that intellectual property must be balanced against the protection of other fundamental rights and freedoms” (p. 11).

The academics can partially agree that providers’ freedom to conduct a business could be less impacted under Article 13 than in the Sabam/Netlog case. This is due to the involvement of the rightholders in the content identification process, something which the Council Legal Service also stressed in their response. However, this doesn’t alleviate the burden for all players, as the recommendation remarks that “content recognition technologies remain quite expensive for small and medium-sized businesses, particularly for start-ups” (p. 12). So, while it might have less impact on larger platforms’ freedom to conduct a business, it still hurts the smaller players, and this would “lead to further market concentration in favour of providers which already have a strong market position” (p. 4).

The recommendation points out that in the Sabam/Netlog case the CJEU considered that “the filtering system could potentially undermine freedom of information, as long as it was not capable of distinguishing adequately between unlawful content and lawful content, with the result that it could block lawful communications” (p. 11). Moreover, the academics support the remark from the Member States that “the filtering system may deprive users of the room for freedom of expression that follows from statutory copyright exceptions, in particular the quotation right and the right to parody” (p. 12).

In its response, the Council Legal Service puts a lot of faith in the obligation on Member States under Article 13 to provide a ‘complaints and redress mechanism’, treating it as the magic wand that solves the issues around freedom of expression and information. The academics note that in the Sabam/Netlog case, “the Court made no indication that put-back mechanisms are capable of sufficiently addressing the harm caused by incorrect automatic removals” (p. 13). They also point out that: (1) these mechanisms are only ‘vaguely sketched’; and (2) studies around such mechanisms have shown they can have chilling effects if improperly formulated. In their view, the Estonian Presidency’s compromise proposal is even more worrying, “as it would place decision-making in case of disputes in the hands of the right holder – a party with strong incentives to disallow use” (p. 13).

Blanket Monitoring Is Not General When It’s Specific, but That’s Not the Case Here

The recommendation stresses the broad scope of Article 13, as it “seems to encompass blogging platforms, news portals working with citizen journalists and/or offering discussion fora, photo/film/music portals, social networking sites, online marketplaces and search engines offering keyword advertising services” (p. 18). In this context, it also points out that CJEU jurisprudence shows that Article 15 of the E-Commerce Directive, the provision preventing Member States from imposing a general obligation on service providers to monitor, “is fully applicable to user-generated content platforms and intended to shield these platforms from general monitoring obligations” (p. 18).

“(…) the notion of ‘monitoring obligations in a specific case’ reflected in Recital 47 of the E-Commerce Directive should not be overstretched to justify acts of filtering which would target all content uploaded onto a given platform and apply indiscriminately to all users, even if the filtering seeks only to identify instances of infringement of an individual item of protected content.” – pp. 3-4

The recommendation clarifies that the CJEU has deemed that “monitoring should be deemed specific only if it relates to a specific content item, in respect of which infringement has been established previously, or if it targets a specific user who has previously been found to have engaged in such infringements” (pp. 3-4). Moreover, “all filtering requires examination of content posted by non-infringing users” (p. 5), and “in its effort to uncover infringing content, the identification systems will necessarily examine content which does not [infringe]” (p. 13).
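To make the mechanics concrete, here is a minimal sketch of the core loop any upload filter has to run (the names and the fingerprinting stand-in are illustrative assumptions of ours, not taken from the recommendation or from any real content-recognition product): even when the filter searches for a single protected work, it still has to inspect every upload from every user.

```python
# Illustrative sketch only: a filter hunting for even ONE protected work
# must still examine EVERY upload from EVERY user, which is why the
# academics argue it amounts to prohibited general monitoring.

from dataclasses import dataclass

@dataclass
class Upload:
    user_id: str
    content: bytes

def fingerprint(content: bytes) -> int:
    """Stand-in for a content-recognition fingerprint (e.g. an audio/video hash)."""
    return hash(content)

def filter_uploads(uploads: list[Upload], protected: set[int]) -> list[Upload]:
    """Returns the uploads that match the protected reference set."""
    blocked = []
    for upload in uploads:  # every single upload is examined here...
        if fingerprint(upload.content) in protected:
            blocked.append(upload)
    return blocked  # ...even though most users never uploaded anything infringing
```

The specificity of what is searched for does not change the generality of what is scanned: the loop necessarily runs over all content and all users.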

In this context, the recommendation also refers to the CJEU’s ruling in the Sabam/Netlog case, wherein the Court “ignored the broadness of the injunction in terms of the content which it sought to protect and focused exclusively on the breadth of the material being monitored”, as the CJEU considered that a filtering system would “require active observation of files stored by users with the hosting service provider and would involve almost all of the information thus stored and all of the service users of that provider” (p. 19). The academics point out that in the more recent McFadden case (C-484/14) “the question submitted to the Court made it clear that the contemplated measure would involve ‘examining all communications passing through [the provider’s systems] in order to ascertain whether the particular copyright-protected work is unlawfully transmitted again'” (p. 19).

The conclusion is thus that “the [CJEU’s] jurisprudence shows clearly that an obligation to filter any information uploaded to the server of a platform hosting user-generated content would lead to a prohibited general monitoring obligation and be incompatible with Article 15 of the E-Commerce Directive” (p. 2).

This disrupts the quirky belief some have that Article 13 does not impose an ex-ante general monitoring or filtering obligation, because “general monitoring can only be understood as searching for all potentially illegal content” and therefore does not apply when the infringing content to be searched for is identified. However, the academics are very clear: “This approach (…) cannot be accepted as correct” (p. 19).

The recommendation also warns that “the decision over the scope and reach of filtering measures must not be left to agreements between industry representatives that are likely to focus on cost and efficiency considerations instead of seeking to avoid unnecessary content censorship” (pp. 5-6). This could have especially detrimental effects for small and medium-sized platforms, as the academics warn that smaller players “should not be disadvantaged through the imposition of obligations to invest in filtering systems or due to an inability to purchase the most advanced systems” (p. 4).

Focus on Notice-and-Action Instead of Tinkering with the ‘Communication to the Public’

The academics consider that the European Commission’s initial proposal and the Estonian Presidency’s compromise proposals are confusing at best.

The recommendation explains that these proposals “confuse and mix different legal questions”, namely (p. 2):

  1. “the scope of the safe harbour for hosting under Article 14(1) of the E-Commerce Directive”; and,
  2. “the issue of whether (and when) platform providers themselves carry out an act of communication to the public and fulfill the requirements of Article 3(1) of the Information Society Directive.”

What’s causing all this confusion? One small word in Recital 38: ‘thereby’. Some seem to consider that it implies that “if information society service providers store and provide access to user-generated content, they inevitably do more than providing physical facilities and in fact perform an act of communication to the public” (p. 20).

They’re wrong! The academics warn that the ambiguous wording of Recital 38 leads to legal uncertainty, as it creates “a real risk of modifying the notion of ‘communication to the public’ considerably” (p. 2). The academics are clear that this is not the way forward, as “the broadening of the right of communication to the public (and corresponding copyright liability) does not constitute an appropriate compensatory measure for the lack of a harmonized system of intermediary liability” (p. 22). Therefore, the 1st paragraph of Recital 38 should simply be deleted, or heavily redrafted as set out below.

[Note by CopyBuzz: no mandate was given to the European Commission to redraft this concept, nor was any evaluation made of such a potential redraft in the Impact Assessment made by the European Commission]

The recommendation explains that the CJEU “has developed a complex set of conditions for identifying acts of communication to the public”. Therefore, the academics consider that “new legislation in this area should refrain from collapsing the different assessment criteria into one single test of providing access”, and that the starting point should instead be “to distinguish clearly between infringement criteria that apply to the primary act of uploading content, and those that apply to the secondary acts undertaken in relation to uploaded content” (p. 5). In this context, the recommendation explains that “to ‘provide access to the public’ is not sufficient to find a communication to the public, as the CJEU requires further conditions to be met” (p. 22).

Alternatively, the 1st paragraph of Recital 38 could be redrafted to clarify that “the requirement of ‘providing access to the public’ and ‘performing an act of communication to the public’ are two separate and cumulative requirements which must both be fulfilled to establish an infringement” (p. 22). The recommendation explains that the 2nd requirement of ‘performing an act of communication to the public’ “would reflect additional tests evolving from CJEU jurisprudence, such as the criterion of a ‘new public’, ‘knowledge’ and a ‘profit motive’ – the latter as a vehicle for a presumption of knowledge” (p. 22), as in the current wording of the European Commission’s proposal, the “general requirement of ‘knowledge of, or control over’ infringing user-generated content is missing” (p. 1).

“The corrosive effect of such legislation would be felt across the whole spectrum of relevant services: from online marketplaces and social media platforms to collaborative software development platforms and repositories of public domain material and scientific papers. It would render the safe harbour for hosting meaningless, destroy the equilibrium between affected fundamental rights and freedoms, erode the basis for investment in new online services – particularly new services developed by small and medium-sized enterprises – and, in consequence, lead to further market concentration in favour of providers which already have a strong market position” – p. 4

Article 14 of the E-Commerce Directive regulates the liability of intermediaries, such as online platforms. The problem identified by the academics is that the lack of the requirements of knowledge and control in Recital 38 would “lead to a remarkable restriction of eligibility for the liability privilege” (p. 16), as “the Recital gives the impression that any act of promoting or optimizing the presentation of user-generated content automatically excludes eligibility for the liability safe harbour established by Article 14 of the E-Commerce Directive” (p. 15). Moreover, the fact that the 2nd paragraph of Recital 38 tries to assess the ‘active role’ played by a service provider “irrespective of the nature of the means used therefor” is also considered a clear deviation from the CJEU jurisprudence; the academics highlight the danger of including “instances of optimization and promotion by automatic means in the absence of knowledge or control” (p. 16).

In the academics’ view, “the attempt to regulate intermediary liability on the basis of rules concerning primary copyright infringement is inconsistent and imbalanced”, as they emphasise the fact that the “Information Society Directive does not provide for the checks and balances necessary to achieve a proper equilibrium of all fundamental rights and freedoms involved” (p. 22). Achieving this balance is not that clear cut, not even for the courts, as the academics point out that the “CJEU established the rule that the infringement analysis required an ‘individual assessment’ of the circumstances of each case”, and that “even though an ‘act of communication’ has taken place, the use does not amount to infringement, because other criteria are not fulfilled” (p. 21).

What needs to be done instead? Focus on notice-and-action, instead of tinkering with the concept of a communication to the public.

To achieve this, the recommendation calls for the further clarification and harmonisation of the principle “that providers are not liable for users’ actions which they cannot reasonably be expected to know and control”, which is already grounded in the EU acquis (p. 2). Doing so should help “to pave the way for a uniform application of service provider immunity throughout the internal market” (p. 2). Finally, this harmonisation effort should go hand in hand with “a well-structured European legislative design of the ‘notice and takedown’ procedure”, coupled with “an appropriate ‘counter notice’ procedure” (p. 2).
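As a rough illustration of the difference with ex-ante filtering, here is a minimal, hypothetical sketch of a ‘notice and takedown’ flow with a ‘counter notice’ step (all states and names below are our own illustrative choices, not drawn from the recommendation): nothing is examined or removed until a rightholder identifies specific content, and the uploader can contest the removal.

```python
# Hypothetical sketch of a 'notice and takedown' + 'counter notice' flow.
# Unlike an upload filter, no content is touched until a rightholder files
# a notice identifying a specific item.

from enum import Enum, auto

class Status(Enum):
    ONLINE = auto()
    TAKEN_DOWN = auto()   # removed after a valid notice
    REINSTATED = auto()   # put back after a counter notice
    DISPUTED = auto()     # escalated to an independent dispute body

class HostedItem:
    def __init__(self, item_id: str):
        self.item_id = item_id
        self.status = Status.ONLINE

    def receive_notice(self) -> None:
        """A rightholder identifies this specific item as infringing."""
        if self.status is Status.ONLINE:
            self.status = Status.TAKEN_DOWN

    def receive_counter_notice(self) -> None:
        """The uploader contests the takedown; the item goes back online
        unless the dispute is escalated."""
        if self.status is Status.TAKEN_DOWN:
            self.status = Status.REINSTATED

    def escalate(self) -> None:
        """Disputes go to an independent body, not to the rightholder alone."""
        self.status = Status.DISPUTED
```

The design point the academics stress is visible in the last step: decision-making in case of disputes must not rest with the rightholder, “a party with strong incentives to disallow use” (p. 13).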

How to Ensure Fair Remuneration for Authors and Performers

Good news! The academics pondered over a solution that “does not encroach upon fundamental rights and freedoms, and leaves intact the safe harbour for hosting in Article 14 of the E-Commerce Directive” (p. 2). So, what did they come up with that the European Commission could not (or did not want to) think of?

Well, they propose to “introduce a new use privilege in favour of the creation of content remixes and mash-ups by users and the further dissemination of these remixes and mash-ups on online platforms“, which could be linked to “online platforms with user-uploaded content [being] responsible for the payment of fair compensation“. Online platforms could then “either pass on these additional costs to their users, or use a part of their advertising income to finance the payment of fair compensation” (p. 2).

The academics consider this to be a ‘fair’ solution, as (p. 5):

  1. “such a new copyright limitation would offer a sound basis for the payment of equitable remuneration which could become an additional source of income for authors and performers”; and,
  2. “the adoption of a new copyright limitation is also appropriate considering that not only platform providers, but also copyright holders and users, should contribute to the development of adequate solutions”.

This solution is not even rocket science, as they point out that “Article 5 of the Information Society Directive shows clearly that it is already established EU practice to combine the adoption of certain use privileges with an obligation to pay fair compensation” (p. 23). It should be remarked, though, that the collection and distribution of such compensation is not always without controversy, as in the case of private copy levies.
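Purely by way of illustration, the payment mechanics could be as simple as a pro-rata split of a share of advertising income (the levy rate and the usage-based allocation below are our assumptions, not part of the recommendation):

```python
# Hypothetical sketch: financing 'fair compensation' for a UGC use privilege
# out of a platform's advertising income, split pro rata by usage.

def distribute_compensation(ad_income: float, levy_rate: float,
                            usage_counts: dict[str, int]) -> dict[str, float]:
    """Splits levy_rate * ad_income among rightsholders in proportion
    to how often their works were used in remixes/mash-ups."""
    pool = ad_income * levy_rate
    total = sum(usage_counts.values())
    if total == 0:
        return {rightsholder: 0.0 for rightsholder in usage_counts}
    return {rightsholder: pool * count / total
            for rightsholder, count in usage_counts.items()}

# e.g. distribute_compensation(1_000_000.0, 0.02, {"author_a": 3, "author_b": 1})
# -> {'author_a': 15000.0, 'author_b': 5000.0}
```

Real-world levy schemes are of course far messier, which is exactly the controversy around private copy levies noted above.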

Guess what? The European Parliament has also been exploring the idea of such a user-generated content (UGC) exception, as the IMCO and CULT Committees agreed on such an exception in their Opinions. However, the proposal was kind of watered down in the CULT Committee, as it went from a mandatory to a voluntary exception.

The Council Legal Service’s Omission Brought Back on the Table: Academic Repositories

It should not come as a surprise that academics take to heart the question from the German Government on the impact of Article 13 on platforms such as academic repositories, which store researchers’ own works, or repositories of public-domain works. Not exactly a small issue, but the Council Legal Service did not respond to it.

It is regrettable that the question was not properly addressed by the Legal Service, but the academics come to the rescue and explain that “platforms with self-created works, public domain material or scientific papers would fall under the proposed new rules and thus be obliged to introduce filtering systems for all uploaded material, regardless of whether this material consists of an uploader’s own creations, unprotected works in the public domain or papers serving the academic debate” (pp. 13-14). Further proof that Article 13 is the wrong approach.

 



Herman Rucic is Senior Policy Manager in the secretariat of the Copyright 4 Creativity (C4C) coalition. He has been Senior Policy Manager at N-square Consulting since September 2010. [All content from this author is made available under a CC BY 4.0 license]