Sunday, 14 July 2024

Upload filters: heads – they’re illegal; tails – they’re illegal

As numerous CopyBuzz posts attest, three sections of the proposed EU Copyright Directive are particularly problematic: Articles 3, 11 and 13. Article 3 concerns text and data mining, while Article 11 is the infamous snippet tax, also known as the link tax. Both are worrisome, but it is Article 13, the upload filter, that is arguably the worst of all.

A filter, by its very nature, is about stopping people from uploading material. No system is perfect, so it is inevitable that some things will be blocked even though they are perfectly legal. That, in turn, will lead to people self-censoring because they fear that something near the boundary of what is permitted will be blocked. The supposedly bright line between legal and illegal will blur, taking out a big chunk of legitimate creativity in the process, and damaging freedom of expression in the EU.
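To see why over-blocking is unavoidable, consider a minimal sketch in Python of how a fingerprint-based filter might decide what to block. Everything below – the 64-bit fingerprints, the reference database and the matching threshold – is an illustrative assumption, not the actual workings of Content ID or any real system:

```python
# Minimal sketch of a fingerprint-based upload filter. All names,
# values and the threshold are illustrative assumptions, not any
# real Content ID or Audible Magic implementation.

def hamming(a: int, b: int) -> int:
    """Count the bits that differ between two 64-bit fingerprints."""
    return bin(a ^ b).count("1")

# Hypothetical database mapping fingerprints to copyrighted works.
REFERENCE_DB = {
    0x9F3A6C1E55D2B704: "Track A (Label X)",
    0x1234ABCD5678EF90: "Film B (Studio Y)",
}

# The operator must pick a threshold: too strict, and trivially
# altered copies slip through; too loose, and lawful parody,
# quotation or coincidentally similar works are blocked. No single
# value avoids both kinds of error.
MATCH_THRESHOLD = 10  # bits out of 64; an illustrative choice

def should_block(upload_fingerprint: int) -> bool:
    """Block the upload if it is 'close enough' to any reference work."""
    return any(
        hamming(upload_fingerprint, ref) <= MATCH_THRESHOLD
        for ref in REFERENCE_DB
    )

# A lawful remix whose fingerprint happens to fall within the
# threshold of "Track A" is blocked exactly as an infringing copy
# would be -- a false positive the uploader cannot argue with.
print(should_block(0x9F3A6C1E55D2B74B))  # True
```

Wherever the threshold is set, the filter can only measure similarity; it cannot tell infringement from parody, quotation or chance resemblance, which is precisely why some lawful uploads will always be blocked.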

The call for an upload filtering system in itself reflects an ignorance of the technical reality of such approaches. Proponents of the idea like to cite YouTube’s Content ID as proof that upload filters can be built. What this overlooks is that, according to Google, Content ID is the product of over 50,000 hours of coding and some $60 million of investment. Few other firms could match that scale of spending. One leading system used for music filtering, Audible Magic, has been in development for over a decade. An EU requirement for mandatory upload filters could therefore mean that a handful of US companies end up with a de facto monopoly on video and audio content filtering, deciding what can and cannot be posted online – an unsatisfactory situation for startups in the region.

Moreover, upload recognition systems comparable to Audible Magic and Content ID will be needed for every kind of material, not just audio and video. That will either require hundreds of millions of euros’ expenditure or lead to cheaper, flawed products. Trying to cut corners during development to save money would lead to systems that produce many false positives, with serious chilling effects for creativity in the EU.

It’s not only the range of filtered material that is broad. The list of sites that will be subject to a requirement to monitor everything that is uploaded to them goes way beyond deep-pocketed services like YouTube. In particular, it will impose impossible burdens on key sites like Wikipedia and the GitHub development platform. As a non-profit organisation, Wikipedia simply doesn’t have the resources to allocate to costly filtering systems. GitHub’s ability to act as a relatively friction-free vehicle for open source will be seriously harmed by a requirement to check every single file that is uploaded to its servers for possible copyright infringements. That will have knock-on effects that will put a brake on free software development in the EU. Similarly, these issues will have a devastating impact on open access and academic collaboration in the region. The EU’s standing in the world of research will suffer as a result.

As if those practical problems weren’t enough, there’s another serious issue, which concerns the compatibility of the general upload filter idea with existing EU law. Article 14 of the EU’s E-commerce Directive says that “the service provider is not liable for the information stored at the request of a recipient of the service”, while Article 15 unequivocally states: “Member States shall not impose a general obligation on providers … to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity.” The Copyright Directive’s requirement for major online services to install an upload filter that monitors everything for possible copyright infringement in order to avoid liability runs counter to both Article 14 and Article 15 of the E-commerce Directive, as well as key rulings on the issue from the EU’s highest court.

The European Commission is well aware of this. In a desperate attempt to square the circle, it recently published its “Communication on Tackling Illegal Content Online – Towards an enhanced responsibility of online platforms”. As its title suggests, the Commission wants online services to take “enhanced responsibility” for removing material from their platforms. The supposedly voluntary nature of this filtering – emphatically not “a general obligation” – is the trick that the Commission claims allows online companies to carry out constant pro-active monitoring without losing the immunity from liability for copyright infringement that the E-commerce Directive grants to sites functioning as passive conduits of user data.

However, even accepting this absurd idea of “passive” service providers that “pro-actively” filter all material, there’s another big problem with the fact that the European Commission “strongly encourages” online platforms to do this, but won’t require it by law. It was noted in a blog post by Dr Sophie Stalla-Bourdillon, Associate Professor in Information Technology/Intellectual Property Law at Southampton Law School, University of Southampton.

The issue involves one of the most important pieces of recent EU legislation, the General Data Protection Regulation (GDPR), which applies from 25 May 2018. The GDPR updates the EU’s already strong privacy protections, adding a number of important new features designed to enhance the rights of EU citizens in this field. One of them concerns “automated individual decision-making”, where Article 22 of the GDPR lays down: “The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.”

Naturally, there are some exceptions to that right: for example, if the automated processing is necessary for entering into or performing a contract, or if the person involved gives their explicit consent. Dr Stalla-Bourdillon’s blog post explains why the contract exception is unlikely to apply in the case of users uploading files to an online platform. There is clearly no consent to material being blocked by an automated filter, since the person uploading it wants it to appear online. That leaves just one other legal justification for carrying out the automated decision-making: if it is “authorised by Union or Member State law to which the [data] controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests”.

When it comes to automated processing carried out by upload filters installed to satisfy Article 13, the European Commission might try to claim it is legal under the GDPR because of that third exception, thanks to its “Communication on Tackling Illegal Content Online” – hoping that no one notices the Communication is a policy document with no mandatory authority. But even if it did have the force of law, the use of upload filters would no longer be voluntary but a requirement, in which case it is forbidden by Article 15 of the E-commerce Directive.

To summarise: Article 13’s automated general upload filters are either voluntary, in which case they are illegal under the GDPR, or they are mandatory, and therefore illegal under the E-commerce Directive. There’s no other possibility. What’s clear is that upload filters are illegal in all situations, and must therefore be dropped from the Copyright Directive completely.

Featured image by Nicu Buculei.

Writer (Rebel Code), journalist and blogger on openness, the commons, copyright, patents and digital rights. [All content from this author is made available under a CC BY 4.0 license]