
I read about the capture of Daniela Klette, via Google Translate:

In Germany, the criminal police claimed that they were not authorized to use this type of AI [PimEyes], due to data protection requirements.

Original text:

En Allemagne, la police criminelle a affirmé qu’elle n’était pas autorisée à utiliser ce type d’IA [PimEyes], protection des données oblige.

This "type of AI" refers to PimEyes, which

allows users to identify all images on the internet of a person given a sample image.

Anyone can use PimEyes. Which data protection requirements prevent the German police from using it, and why?

  • 4
    Might be worth asking on Law.SE. It seems more of a legal question and there are regulars who are German or live in Germany.
    – Lag
    Commented Mar 12 at 17:55
  • 1
    Added a related question: law.stackexchange.com/questions/101289/… Commented Mar 12 at 20:32
  • 3
    "Anyone can use PimEyes" is a bit irrelevant here with regards to what tactics are allowed to be used by police in order to ensure a conviction (or, more accurately, evidence to help ensure said conviction). Those are two very different rulesets.
    – Flater
    Commented Mar 13 at 5:24
  • 1
    @Flater it’s not about evidence to secure a conviction, it’s about evidence to help locate a known suspect. See my linked question above. Commented Mar 13 at 5:31
  • 1
    Sounds like the only reason why that person said that thing at that time was lobbying: "Look, we'd love a law that explicitly allows us to use PimEyes, so let me talk about how we didn't use PimEyes in this case and how the world would have been better if we had used it"
    – Stef
    Commented Mar 13 at 13:15

2 Answers


First, "the criminal police" quoted was, as far as I can make out, not the State Criminal Investigation Office (Landeskriminalamt) of Lower Saxony, which was responsible for the arrest. Comments on legal matters are typically left to the state prosecutors in charge of the case. Instead, statements like this have been made by Jochen Kopelke, chairman of one of the police unions, the Gewerkschaft der Polizei. He also commented on why his colleagues would not use this software platform:

The people in the police force who carry out internet-based searches are all familiar with software tools. But they don't use all of them. I also discussed the Klette case with colleagues who use OSINT [Open Source Intelligence, the professional research of information that people make publicly available on the internet and analysing it for usable findings]... All colleagues know Pimeyes. But to use it, they would have to send data to servers in non-European countries. And that's always a huge problem for the police. It always has to be a German server with a closed network. Before OSINT colleagues try out new tools, they understandably say: I don't even know if I'm allowed to do that, I'd better ask our data protection officer. They are then quickly told that this is not covered by the applicable police law or the Code of Criminal Procedure, it is not possible. What's more, we have to carry out OSINT searches from our service devices. However, these are not necessarily state of the art. We know the software, we can use it, but we often lack the right tools and the legal framework.

There is a good discussion of the legal issues in a German-language podcast. Since there is no official transcription, let me sum up the pertinent points made there:

[start of paraphrasing]

The relevant legislation is the EU General Data Protection Regulation, which in Article 9 states a general prohibition, together with exceptions:

  1. Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation shall be prohibited.

  2. Paragraph 1 shall not apply if one of the following applies:

    • (e) processing relates to personal data which are manifestly made public by the data subject;
    • (f) processing is necessary for the establishment, exercise or defence of legal claims or whenever courts are acting in their judicial capacity;
    • (g) processing is necessary for reasons of substantial public interest, on the basis of Union or Member State law which shall be proportionate to the aim pursued, respect the essence of the right to data protection and provide for suitable and specific measures to safeguard the fundamental rights;

The exceptions to the general prohibition should be able to cover all aspects of this case. In fact, the Federal Criminal Investigation Office (BKA) and the Criminal Investigation Offices of the states have used facial recognition software since 2008, a system called GES.

The software may only be used by trained officers who can act as expert witnesses in court. It compares pictures from the internet to pictures from the internal database INPOL. According to the BKA, around 20,000 such searches are conducted each year. Each of them requires a judge to have ordered a public manhunt. In the case of Daniela Klette, such an order had been in place for decades.

Neither the BKA nor the LKA of Lower Saxony has made any statement about the use of PimEyes. It is clear that the above-mentioned GES will have been used. Journalistic research in recent years has shown that police and intelligence agencies in Germany have additionally used other, external platforms. Individual cases of help from international agencies have been uncovered over the years, but the general rules covering this are opaque.

Independently of who executes the search, using PimEyes and comparable platforms might be legally problematic, though. For example, its Terms of Use forbid using it for anything but searching for one's own face.

[end of paraphrasing]

Another aspect was pointed out in an article in the Neue Zürcher Zeitung, naturally from a Swiss viewpoint. But German and EU legislation are not far apart in this case (translated with DeepL.com):

The software is technically impressive, but it is probably not legal. No court has yet ruled on whether PimEyes broke the law when creating its software. However, the people whose photos are stored and processed in the software were never asked for their consent - they ended up in a facial recognition database without their knowledge.

For Martin Steiger, lawyer and media spokesperson for the Digitale Gesellschaft association, it is clear that the very act of collecting data for the PimEyes software violates data protection law. This is because biometric data such as the face is particularly strongly protected. He says: "The police lack the legal basis to use such a tool."

Florent Thouvenin, professor of law at the University of Zurich, says that explicit consent for the use of personal data is only necessary in certain cases. "But tools like PimEyes are highly problematic. They enable new forms of surveillance."

Getting back to the use by police agencies, the article goes on:

The fact that this creates a situation in which journalists have tools at their disposal that are forbidden to the police is only absurd at first glance, says Thouvenin. On closer inspection, it makes sense: "It's about setting limits for the state when processing personal data."...

One of the reasons for this is that the authorities have more power and data about private individuals than most private organisations: Tax data, fingerprints and iris scans for passports, for example. The police are prohibited from accessing such data because they could otherwise use it to build up a surveillance apparatus, as has already happened in countries such as China.

Similar comments have been made by Khesrau Behroz, the main author of the journalistic researches that found Klette using PimEyes.


The legal situation regarding the use of AI tools is about to change. The EU Commission and Parliament are in the process of finalizing a regulation laying down harmonised rules on Artificial Intelligence. According to the currently agreed text, AI able to provide facial recognition will be placed in the "high-risk technology" category. Its providers, importers, deployers, or anyone offering a service will have to provide proof of conformity with regulatory rules and register the product with the EU.

This is not the place to provide details on the obligations. But notably, the real-time use of remote biometric identification systems in publicly accessible spaces by law enforcement will be restricted to a defined group of purposes, and only allowed under judicial oversight.

AI systems that create or expand facial recognition databases through the untargeted scraping of facial images from the internet or CCTV footage will be banned completely.

See this page for information by a public interest group on content and process.

  • 1
    "we often lack the right tools and the legal framework." This is not very specific. Often doesn't mean always. You're just quoting, so it is what it is, but maybe far from conclusive proof of anything that is possible or impossible. Commented Mar 13 at 7:13
  • 7
    Regarding the last sentence in the last quote about building up a surveillance apparatus, we don't even have to look to China (which is one of the currently existing ones), but Germany had at least two of those in the past 100 years (mostly associated with the "Stasi" in the GDR and the "Gestapo" during the Nazi regime, but of course there was a whole system behind it). People like to say "but we would never misuse it!", but we know it sadly does not work like that... Commented Mar 13 at 8:32
  • 2
    @NoDataDumpNoContribution I absolutely agree. The point of that first part of the answer is that neither did a police official say what the article quoted in the question claimed, nor did those that proposed the claim present clear evidence. Mr. Kopelke, as a union representative, wants to lobby the public for the acceptance of laws that give more rights to the police, and for that goal the impression of a prohibition is more important than the facts.
    – ccprog
    Commented Mar 13 at 14:23
  • @SimonLehmann You might be interested to know that all residents in Germany are required to register their religion and address. Commented Mar 13 at 17:10
  • 3
    @ReasonablyAgainstGenocide Correction: you are only required to register the formal membership in one of the religious organisations that are recognised as Körperschaft des öffentlichen Rechts. If you are not a member, nothing will be known about your religion.
    – ccprog
    Commented Mar 13 at 17:38

Question:

Why isn't the German police allowed to use PimEyes (social network tracking via face recognition)?

Broadly, because using facial recognition software to identify individuals in a crowd doesn't work very well yet. There is a huge technical difference between verifying an individual's identity in a 1-to-1 match (like an iPhone does) and identifying a specific individual in a crowd (a 1-to-many match).
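The gap between those two settings can be sketched numerically (a toy model: random unit vectors stand in for real face embeddings, and the dimension, gallery size, and threshold are illustrative assumptions, not any vendor's actual pipeline). A 1-to-1 check makes a single comparison, while a 1-to-many search makes one comparison per gallery entry, so the chance that some unrelated entry crosses the match threshold grows with the size of the gallery:

```python
import numpy as np

rng = np.random.default_rng(42)
dim = 16  # toy embedding dimension; real models use far more

def embed():
    """Stand-in for a face-recognition model: a random unit vector."""
    v = rng.normal(size=dim)
    return v / np.linalg.norm(v)

def similarity(a, b):
    """Cosine similarity between two embeddings (both already unit length)."""
    return float(a @ b)

threshold = 0.5  # decision threshold for "same person"

# 1:1 verification: one comparison, one chance of a false match.
probe = embed()
match_1to1 = similarity(probe, embed()) >= threshold

# 1:N identification: thousands of comparisons against a gallery,
# so unrelated faces cross the threshold purely by chance.
gallery = [embed() for _ in range(10_000)]
false_hits = sum(similarity(probe, g) >= threshold for g in gallery)
print(f"1:1 false match: {match_1to1}, 1:N false hits: {false_hits}")
```

With these toy numbers, the single 1-to-1 comparison almost never false-matches, while the 10,000-entry gallery search reliably produces false hits. This base-rate effect, not the quality of any particular model, is what makes 1-to-many identification unreliable at scale.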

More narrowly, because minorities tend to look alike to a computer, which is a lawsuit waiting to happen for any organization that relies on this technology for law enforcement.

While its defenders claim these technologies are a valuable tool for narrowing a large set of people down to a smaller one, others argue that relying on a primary tool which discriminates against minorities is probably not the best option if your ultimate goal is to find criminals and imprison them, given that any minority suspect identified by such software is going to have instant grounds for appeal, as they probably should.

This technology has had this particular peculiar bias for many years now. I think I first heard about it a decade ago when I was working in the bioidentity industry.

From Scientific American: Police Facial Recognition Technology Can’t Tell Black People Apart

And from August 2023: Law professor explores racial bias implications in facial recognition technology

when it comes to recognizing faces of colour, especially the faces of Black women, the technology seems to manifest its highest error rate, which is about 35 per cent.”

While computers themselves aren't prejudiced, the likely cause is that the engineers who developed the software were primarily white men who designed the algorithms around people like themselves, and those algorithms evidently don't generalize well across all minorities and sexes.

I know the FBI here in the United States has strict regulations on the use of such software. The software won't identify a single individual; rather, it will always return a set of candidates necessitating further investigation.

FBI BioSpecs / Face Interstate Photo System

Law enforcement may submit a probe photo for a search against over 30 million criminal mug shot photos. The response will return a minimum of 2, a maximum of 50, or a default of 20 candidates for investigative leads. The contributor is required to compare all available candidates.

Which is still controversial.
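That candidate-list policy quoted above — never a single identification, always a bounded, ranked set of leads — is easy to picture in code (a hypothetical helper; the function name, the score format, and the clamping to the quoted 2/20/50 figures are illustrative, not the FBI's actual implementation):

```python
def candidate_list(scores, requested=20):
    """Rank gallery entries by match score and return investigative leads.

    Never returns a single identification: the result is clamped to
    between 2 and 50 candidates, with a default of 20, mirroring the
    policy quoted above.
    """
    k = max(2, min(50, requested))
    # Sort gallery indices from highest to lowest match score.
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return ranked[:k]

# Seven gallery entries with made-up match scores for a probe photo:
scores = [0.91, 0.15, 0.77, 0.42, 0.88, 0.05, 0.63]
print(candidate_list(scores, requested=1))  # clamped up to 2 leads: [0, 4]
```

The point of the clamp is procedural rather than technical: because the system only ever emits leads, an analyst is always required to compare the candidates before anyone is treated as identified.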

  • The discussion about the accuracy of facial recognition has been moved to chat; please do not continue the discussion here.
    – Philipp
    Commented Mar 14 at 10:19
