
Upload filters, copyright and magic pixie dust

Last week, the European Commission unveiled a major initiative aimed at tackling “illegal content online”. As is so often the case when politicians want to be seen to be “doing something” about terrorism, it’s full of really bad ideas.

At the heart of the initiative is a plan for online platforms to “increase the proactive prevention, detection and removal of illegal content inciting hatred, violence and terrorism online.” Significantly, the ideas are presented as “guidelines and principles”. That’s because they are entirely voluntary. Except that the Commission makes it quite clear that if this totally voluntary system is not implemented by companies like Facebook and Google, it will bring in new laws to make them do it on a not-so-voluntary basis. The Commission is eager to see swift results from these voluntary efforts: legislative proposals could be on the table as early as May 2018.

One of the bad ideas is for online platforms to work with what are called “trusted flaggers” – “specialised entities with expert knowledge on what constitutes illegal content”. They may be experts, but they will not be judges, which means that the Commission wants Facebook and Google to take down material without needing to worry about the niceties of whether a real judge would rule that it is actually illegal.

But the worst idea, and one that appears multiple times in the latest plans, is the routine and pervasive use of upload filters. In a 20-page document describing the initiative in detail, entitled “Communication on Tackling Illegal Content Online – Towards an enhanced responsibility of online platforms”, there is great emphasis on “using technology to detect illegal content”. In particular, the use and further development of automatic detection and filtering technologies is “encouraged”.

One of the chief reasons the European Commission places such great hopes in automation to solve the problem of illegal material is that it apparently believes “in the field of copyright, automatic content recognition has proven an effective tool for several years”. Except that isn’t true. The Pirate Party MEP Julia Reda has written a helpful blog post detailing nine very different ways in which upload filters fail. Along the way, they have caused notable collateral damage, especially to fundamental rights.

One response to that parade of failure might be to concede that upload filters are imperfect, but to add that this simply shows more research is needed to improve them. This is the standard “nerd harder” argument that is frequently deployed in the context of creating backdoors in encryption programs. Despite the fact that security experts unanimously and repeatedly explain that it is not possible to design a weakness that can only be used by the authorities, and that is not vulnerable to attacks by criminals and hostile state actors, governments continue to insist that they know better, and that companies should just do it. And so it is here. Even though people who understand how upload filters operate patiently point out that it is not possible to capture the extreme complexities of copyright law in filtering rules that can be applied automatically and correctly, the authorities continue to press for this supposed panacea.
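
To see why, consider what an upload filter actually does at its core. The following is a minimal, hypothetical sketch in Python – the fingerprint database, thresholds and function names are all invented for illustration, and real systems use fuzzy perceptual or acoustic fingerprints rather than exact hashes. The point it illustrates is structural: the filter can tell whether an upload matches a known work, but nothing in it can determine whether that match is an infringement or a perfectly lawful quotation, parody or review, because those judgments depend on context and law, not on the bytes of the file.

```python
import hashlib

# Hypothetical database mapping fingerprints of known copyrighted
# works to their titles. Real deployments use perceptual hashing,
# but the structural problem shown below is the same.
KNOWN_WORKS = {
    "3f5a...": "Some Registered Film (1998)",  # placeholder entry
}

def fingerprint(data: bytes) -> str:
    """Stand-in for a content-recognition fingerprint."""
    return hashlib.sha256(data).hexdigest()

def filter_upload(data: bytes) -> str:
    match = KNOWN_WORKS.get(fingerprint(data))
    if match is None:
        return "ACCEPT"
    # At this point the filter knows only that the content matches.
    # It has no way to ask the questions copyright law actually
    # turns on:
    #   - Is the uploader the rights holder, or licensed?
    #   - Is this quotation, parody, review or news reporting?
    #   - Which national exceptions apply to this user, in this country?
    # Lacking answers, an automated system can only block -- which is
    # exactly how lawful uses become collateral damage.
    return f"BLOCK (matched: {match})"
```

Swapping in cleverer matching does not help: fuzzier fingerprints add false positives on top of this blindness to context, while the legal questions in the comments remain unanswerable by any matching step at all.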

Call it the “magic pixie dust” delusion – the belief that technology can be sprinkled on hard, real-world problems, and they will be solved, just like that. The European Commission is a great believer in high-tech pixie dust, as its demand for upload filters in both the Copyright Directive and the new framework for tackling illegal content makes clear. Last week’s announcement is a worrying sign that far from beginning to understand that upload filters are not a practical solution in the field of online copyright, it is doubling down on the idea, and now extending it to other domains.

The European Commission is well aware that Article 15 of the E-Commerce Directive explicitly prohibits Member States from imposing “a general obligation on providers … to monitor the information which they transmit or store, [or] a general obligation actively to seek facts or circumstances indicating illegal activity.” By foregrounding the “enhanced responsibility of online platforms”, as the front page of last week’s Communication does, the Commission seems to be underlining that its new approach does indeed involve a “general obligation” on those companies to filter all uploads for a vast range of “illegal content”. It’s not hard to see the Court of Justice of the European Union striking down any attempts to enshrine this “enhanced responsibility” in law.

Aside from the fact that they won’t work, and that they are illegal under the E-Commerce Directive, there’s another reason why the general upload filters of Article 13 of the proposed Copyright Directive should be dropped: there is no evidence that they are needed. Just as the European Commission has cheerfully propagated the incorrect view that automatic filtering works, so it has meekly accepted the bogus claim that unauthorised copies of copyright works are taking a terrible toll on the copyright industry and artists.

As we learned recently from the belated publication of a key report that cost the European Commission a princely €369,871, the evidence shows the opposite. It’s striking that the Commission tried to bury its own analysis, paid for by the EU public, presumably because the findings did not square with its agenda to introduce ever-harsher penalties for copyright infringement. As the report admitted, in general “the results do not show robust statistical evidence of displacement of sales by online copyright infringements”.

Two specific areas did show some effect of unauthorised sharing: new films were affected adversely, while for games, illegal consumption actually led to more legal sales. It is a sign of the European Commission’s biased approach in this area that its economists published a short report about the negative effect of downloads on films, but omitted to mention the positive effect they had on games.

That lack of good faith makes the Commission’s stubborn insistence on a non-existent technical solution to a non-existent problem even more frustrating. If it had the courage to admit the truth about the unproblematic nature of unauthorised sharing of copyright materials, it wouldn’t need to come up with unhelpful approaches like upload filters that are certain to cause immense harm to both the online world and to the EU’s Digital Single Market.

Feature image by Stromcarlson.

Writer (Rebel Code), journalist and blogger on openness, the commons, copyright, patents and digital rights. [All content from this author is made available under a CC BY 4.0 license]