
It has been stated for over a decade in Telegram's Privacy Policy that:

All data is stored heavily encrypted and the encryption keys in each case are stored in several other data centers in different jurisdictions. This way local engineers or physical intruders cannot get access to user data.

For reference, Telegram has several data centers, with the two main ones being in the United States and the Netherlands.

This statement raises several questions. It implies that the full encryption key must never be stored on disk at any single location; instead, the key parts need to be refetched from the different jurisdictions at each server reboot. If the server is capable of doing this, then an engineer with broad access to Telegram's infrastructure across those jurisdictions would be capable of it as well. However, Telegram is known to be wholly uncooperative with law enforcement[1][2][3].
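To make the "keys stored in several other data centers" claim concrete, here is a minimal sketch of an n-of-n XOR secret-sharing split. This is an illustration only, not Telegram's actual implementation: every share is required to reconstruct the key, so a server seized in one jurisdiction holds nothing usable on its own, yet a process that can reach all jurisdictions can rebuild the key at boot.

```python
import secrets

def split_key(key: bytes, n: int) -> list[bytes]:
    """Split a key into n XOR shares; all n shares are required to rebuild it."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:
        # XOR the key with each random share; the final share is the remainder.
        last = bytes(a ^ b for a, b in zip(last, s))
    shares.append(last)
    return shares

def combine_key(shares: list[bytes]) -> bytes:
    """XOR all shares together to recover the original key."""
    key = bytes(len(shares[0]))
    for s in shares:
        key = bytes(a ^ b for a, b in zip(key, s))
    return key

master_key = secrets.token_bytes(32)
shares = split_key(master_key, 3)  # e.g. one share per jurisdiction
assert combine_key(shares) == master_key
# Each individual share is statistically independent of the key,
# so no single data center learns anything about it.
```

Any n-1 of the shares are uniformly random and reveal nothing; only the full set reconstructs the key, which is the property the Privacy Policy appears to describe.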

If an engineer in the US is served with a subpoena for user data, then a defense that reads like "Sorry, I cannot access the database because I would need to connect to and retrieve the password from a server in the Netherlands" sounds to me like it would never succeed, as it would be laughably easy for any service to avoid providing data this way.

Essentially, my question is: in what way can the key retrieval process described above be legally secure, given that it lacks the technical security of systems like end-to-end encryption, where the data is simply "impossible to access"? At what point does the effort required to access the data become unreasonable enough to provide a legal defense?

  • I'm having a hard time determining what you are really asking. The question and your musings on possible answers and the implications of different setups are so intertwined, it is hard to tell where one ends and the other begins. The final paragraph starts off looking like a question (although not quite the title question) but seems to answer itself later on in the same paragraph. I also added a subpoena tag as a subpoena would be more common than a law enforcement search warrant as a way to force disclosure of information in this context. "any thoughts on this" is not specific enough.
    – ohwilleke
    Commented Jun 19 at 23:16
  • @ohwilleke Thanks for the feedback, I'll try to rewrite some parts in a clearer way. It is a complex topic since it involves both the technical implementation of the encryption system, which I've tried to briefly explain, and also the legal ramifications of it. I would say the core of the question is maybe at which point does the effort required to access user data become unreasonable enough to provide a legal defense.
    – Dust
    Commented Jun 19 at 23:17
  • This is a complex question involving several layers. There are assumptions about what keys are needed during operations and where, and what access different sub-entities within the organization have. "Sorry, I cannot access the database because I would need to connect to and retrieve the password from a server belonging to and administered by Telegram Netherlands" is a very different defense.
    – user1937198
    Commented Jun 20 at 0:06
  • @user1937198 Even if several entities were involved, they are all ultimately controlled by one or two individuals whom law enforcement could target; it is not unheard of for a CEO to answer for their company's actions in situations like this
    – Dust
    Commented Jun 20 at 1:09
  • Law enforcement doesn't target engineers, they don't serve the order to the engineer directly. It may not matter whether the engineer can access the relevant data, if the company says "that data is outside your jurisdiction, you have no right to access it, so we won't tell our employees to work on it". Microsoft won a case over emails stored in Ireland because of this. That aside, it's often generally possible to arrange things so that engineers do not have normal access to the relevant data anyways (such as to prevent spying on users generally). Commented Jun 20 at 2:59

2 Answers

2

The short answer to what appears to be the actual question is that in the United States, the required effort provides a defense to complying with a subpoena when compliance is "unduly burdensome." Fed. R. Civ. P. 45(d)(3)(A)(iv).

This is a flexible standard that means different things in different contexts.

I have no knowledge of Telegram's setup, but let's just imagine that they came into court with credible evidence that complying with a subpoena would cost them $1 million. If you had sent the subpoena in connection with a small-claims lawsuit over a $200 bill from your plumber, Telegram would have a good case that it's unduly burdensome, because nonparties shouldn't have to take on substantial burdens to resolve relatively trivial disputes.

But if the subpoena came from the government as it sought to disrupt and prosecute a terrorist network on the verge of launching a domestic attack, that argument is going to be a lot weaker.

  • "unduly burdensome" has no relevance in this context. Either the data can be reasonably easily retrieved and provided, or it is technically impossible. It is not the case that they could retrieve the data but it would cost them $$$.
    – Greendrake
    Commented Jun 20 at 1:54
  • @Greendrake That's definitely false, but OK.
    – bdb484
    Commented Jun 20 at 1:55
  • @Trish How is that relevant to this question? Telegram either can fetch the data without much trouble, or can't at all. There is nothing in between here.
    – Greendrake
    Commented Jun 20 at 8:45
  • @Greendrake You'd need to prove it can be done trivially - For all I know, there might be considerable costs involved you are not aware of.
    – Trish
    Commented Jun 20 at 8:47
  • And my bona fides: I've been a programmer and system administrator for 40+ years. I've never actually seen a system locked down this well myself, but I believe it's possible.
    – Barmar
    Commented Jun 21 at 0:06
2

Telegram is one of the most agile and smart players on the market

TL;DR

Telegram is ahead of the game. The law is lagging behind.

Data accessibility varies

A platform like Telegram has different kinds of data with varying accessibility levels:

  • Messages. Genuine end-to-end encryption makes it technically impossible to access messages by anything other than the conversation participants' devices. Whether Telegram's end-to-end encryption is genuine (and whether the messages are not back-synced to the servers) is beside the point. The point is that it can plausibly claim that providing this data is technically impossible. This is what it does here.
  • User identification data: phone numbers, IP addresses, devices, timezones etc. This would be the data that:

... is stored in multiple data centers around the globe that are controlled by different legal entities spread across different jurisdictions. The relevant decryption keys are split into parts and are never kept in the same place as the data they protect. As a result, several court orders from different jurisdictions are required to force us to give up any data

The law requires providing what is stored, not what can be seen

Subpoenas and search warrants can compel access to physical devices and to the encryption keys for those devices.

However, the individual chunks of data (and chunks of the keys) physically stored in each specific jurisdiction are useless without all the remaining chunks.

But this data is needed by the business on a daily basis for analytics and operational decision-making. So how is it made accessible? This is what puts platforms like Telegram ahead of the game. Simply put, the internal systems would be designed to connect to all the chunks remotely, assemble them in RAM on the terminal device, decrypt the data, and display it on screen. The assembled data is never saved to disk anywhere. Close the application that puts the chunks together, and all the readable data vanishes into thin air.
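The workflow just described can be sketched as follows. This is a toy model under loud assumptions: the "shares" are local variables standing in for remote servers, and a SHA-256 counter-mode keystream stands in for a real cipher; none of this reflects Telegram's actual stack. The point is that the full key and the plaintext exist only inside the function's local variables, never on disk.

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Toy keystream: SHA-256 in counter mode (illustration only, not a real cipher)."""
    out = b""
    ctr = 0
    while len(out) < length:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:length]

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Three hypothetical jurisdictions, each holding one XOR share of the key.
shares = [secrets.token_bytes(32) for _ in range(3)]
full_key = shares[0]
for s in shares[1:]:
    full_key = xor_bytes(full_key, s)

record = b"user metadata"
# This encrypted blob is what would actually sit on disk in a data center.
ciphertext = xor_bytes(record, keystream(full_key, len(record)))

def read_record(fetch) -> bytes:
    """Reassemble the key in RAM from remotely fetched shares, decrypt,
    and return the plaintext without persisting the key or the result."""
    key = fetch(0)
    for i in range(1, 3):
        key = xor_bytes(key, fetch(i))
    return xor_bytes(ciphertext, keystream(key, len(ciphertext)))

# `fetch` here is a placeholder for a network call to each jurisdiction.
assert read_record(lambda i: shares[i]) == record
```

Once `read_record` returns and its locals are garbage-collected, nothing recoverable remains on the terminal; what persists in any single jurisdiction is only `ciphertext` or one random-looking share.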

In other words, Telegram can quickly and cheaply pull up the data that law enforcement wants, but it can't be compelled to hand it over: it can only be compelled to produce the individual, useless chunks of data stored in the particular jurisdiction of the requesting authority. This is what lets Telegram cooperate where it sees fit (e.g. to nail terror suspects) but refuse otherwise.

  • Like the question, this answer is larded with unfounded and unsupported assumptions.
    – bdb484
    Commented Jun 20 at 0:55
  • @Greendrake All of them are unsupported, as you haven't cited any sources at all. For instance, I'd be particularly interested to learn how you came to believe that Telegram can't and doesn't defend itself from law enforcement action.
    – bdb484
    Commented Jun 20 at 1:11
  • @Greendrake I find it hard to believe that Telegram would receive no other subpoenas given that we know from transparency reports that other companies such as Discord handle many of these each year. My suspicion is that Telegram has figured out some defense that allows them to claim in response that they are unable to access any data, similar to E2E encrypted apps.
    – Dust
    Commented Jun 20 at 1:21
  • @Greendrake In their FAQ, Telegram states "To this day, we have disclosed 0 bytes of user data to third parties, including governments.". Further, in their Privacy Policy, it is stated: "If Telegram receives a court order that confirms you're a terror suspect, we may disclose your IP address and phone number to the relevant authorities. So far, this has never happened. When it does, we will include it in a semiannual transparency report published at: t.me/transparency." To my knowledge no such transparency report has ever been issued for any nation.
    – Dust
    Commented Jun 20 at 1:33
  • Despite the overhaul, this answer continues to omit any discussion of any actual laws and naturally ends up reaching incorrect conclusions.
    – bdb484
    Commented Jun 21 at 3:49
