When developing an application that requires some personal data to function, a good practice, from a security and integrity perspective, is to minimise the amount of user data one gets access to. Some data will be necessary for the application to function properly (for example authentication data) and some data might need to be recorded for legal reasons. Most other data can potentially be end-to-end (e2e) encrypted and made completely inaccessible to the application developers. If we assume this is the case, that all other user data is e2e encrypted and the encrypted data cannot be linked to the user except by the user themselves, that might affect how the GDPR applies to the situation.

The user data is personal data for the user, but by the time it reaches the servers it's encrypted and, from what I understand, should therefore not be regarded as personal data for the server owners. The way e2e encryption often works is that the user's data is processed by a program (distributed by the same company) running on the user's device, which encrypts the data with a key known only to the user before sending it back to the servers, roughly as in the sketch below. It's unclear to me whether that counts as the server owners processing the data or as the user processing their own data.
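
A minimal sketch of the flow I have in mind (assuming Python with the third-party cryptography package; the upload call at the end is hypothetical):

    from cryptography.fernet import Fernet  # symmetric authenticated encryption

    def encrypt_on_device(plaintext: bytes, user_key: bytes) -> bytes:
        # Runs on the user's device; only the ciphertext ever leaves it.
        return Fernet(user_key).encrypt(plaintext)

    # The key is generated and kept on the device; the server never sees it.
    user_key = Fernet.generate_key()
    ciphertext = encrypt_on_device(b"some personal data", user_key)
    # upload_to_server(ciphertext)  # hypothetical call: the server stores only the ciphertext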

So my questions are:
Does a company still qualify as a data controller even though the stored data is encrypted?
Does a company still qualify as a data processor even though all processing is done on the user's device?

  • Personal data is any information that relates to an identifiable natural person. Your users are natural persons. You can identify/distinguish them because you have a concept of user accounts. Thus, the encrypted blob is personal data, even if you don't have access to the plaintext contents. You are deciding purposes and means for how this data is processed, e.g. stored. Thus, you're a data controller for those processing activities. Depending on how exactly your app works, you may not decide how the plaintext data is processed, in which case you wouldn't be a controller for that processing.
    – amon
    Commented May 30 at 17:47
  • Regarding "data processors" – a data processor processes personal data on behalf of another data controller. If you're a processor, you need a contract with the controller pursuant to Art 28 GDPR. This matters a lot in B2B relationships, but you won't be a processor in B2C situations because there wouldn't be another controller who would be responsible instead.
    – amon
    Commented May 30 at 17:50
  • "Please don't ask questions seeking legal advice on a specific matter. These are off-topic for Law Stack Exchange. While users generally contribute answers in good faith, the answers are not legal advice, and contributors here are not your lawyer." Commented May 31 at 10:21
  • I edited the question to make the phrasing more general.
    – n-l-i
    Commented May 31 at 10:59

1 Answer

This has been a matter of some speculation over the years and I'll admit I'm not 100% up to date on the current legal position. That said, encrypted personal data is still considered personal data (because the anonymisation isn't irreversible: possession of the appropriate key can turn it back into regular old data). However, based on the judgment in T‑557/20 (Single Resolution Board v European Data Protection Supervisor), there's an argument that if the recipient of the data (i.e. you) doesn't possess the legal or practical means to de-anonymise the data, then it's not considered personal data.

So this encrypted data that you can't access wouldn't necessarily be considered personal data while you've got it. And "can't" really means can't: if any part of your organisation has access to the decryption key, it's back in scope as personal data; if the encryption chosen is weak and you can brute-force it easily, it's back in scope; and if you are in a position to legally and practically obtain the key, it's back in scope.

As for processing the data on the device: depending on what the application does with that data while it is unencrypted, it may or may not count as processing.

Of course, that's not the same as saying you aren't a data controller. You don't specify what the "small amount of necessary data (for example authentication data and data needed due to legal reasons)" consists of, but if it includes personal data (and it's hard to see how you could authenticate users without some form of identifier), then you're a Controller, and assuming you're using that data in some fashion (authenticating users, billing, etc.) then you're also a Processor.

Honestly, given the potentially tenuous nature of getting the encrypted data excluded completely, a good way to cover the bases would be to be transparent with the user that you are storing the encrypted data off-device but not the key. Of course, a better way would be to simply not store this data in the first place: it's of no real value to you anyway, and reassuring users after a breach that "don't worry, we don't store the keys, so the bad guys can't decrypt your data anyway" is hassle you don't need.

None of this means you shouldn't encrypt this user data in the manner you suggest; it's good practice under Article 32 of the GDPR and a smart way to minimise your risk exposure.

  • Now that you've pointed it out, I realise that the additional data we are required by law to store and process is still personal data despite being required. I think I mixed up that part with the need to get consent from the user. And I agree, I don't want a less secure application just because it could be. This is really helpful.
    – n-l-i
    Commented May 30 at 13:05
  • You're using the term "PII," which is a term of art in US data protection law. This term does not appear in the GDPR.
    – phoog
    Commented May 31 at 1:12
  • @phoog good shout. I'm in the UK, and here we use "PII" colloquially for personal data that is also an identifier, but you're right, the GDPR doesn't use that term. I've edited now. Commented May 31 at 8:25
