132

I think 1024-bit RSA keys were considered secure ~5 years ago, but I assume that's not true anymore. Can 2048- or 4096-bit keys still be relied upon, or have we gained too much computing power in the meantime?

Edit: Let's assume an appropriate padding strategy. Also, I'm asking both about the security of signatures and the security of data encryption.

  • Secure against what? Me, a script kiddie, the NSA? The answer depends on the threat. Secure for how long? A day, a year, fifty years?
    – Ram
    Commented Dec 30, 2019 at 1:23

7 Answers

157

Since 2000, on a given $\text{year}$, no RSA key bigger than $(\text{year} - 2000) \cdot 32 + 512$ bits has been openly factored other than by exploitation of a flaw of the key generator (a pitfall observed in poorly implemented devices including Smart Cards). This linear estimate of academic factoring progress should not be used for choosing a key length so as to be safe from attacks with high confidence (or, equivalently, conforming to standards with that aim), a goal best served by this website on keylength.
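
To make the linear estimate concrete, here is a minimal Python sketch (mine, not part of the original answer) that evaluates it for a few years; the function name is hypothetical.

```python
# The linear estimate above: in a given year, no RSA key larger than
# (year - 2000) * 32 + 512 bits has been openly factored (absent flaws in
# the key generator).  A back-of-the-envelope illustration, not a tool for
# choosing key sizes.

def openly_factored_bound_bits(year: int) -> int:
    """Bound in bits suggested by the linear estimate for the given year."""
    return (year - 2000) * 32 + 512

for year in (2009, 2019, 2020):
    print(year, openly_factored_bound_bits(year))
# 2009 -> 800 (the 768-bit RSA-768 record of Dec. 2009 sits just below)
# 2019 -> 1120, 2020 -> 1152 (the 795- and 829-bit records are well below)
```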

The current factoring record is the 829-bit RSA-250 in late Feb. 2020, see the summary by the CADO-NFS team. That came shortly after the 795-bit RSA-240 in Dec. 2019, see the detailed paper.

I emphasize that the above is about attacks actually performed by academics. As far as we know, hackers have always been some years behind (see below). On the other hand, it is very conceivable that well-funded government agencies are many years ahead in the factoring game. They have the hardware and CPU time. And there are so many 1024-bit keys around that it is likely a worthwhile technique to be in a position to break these. It is one of the few credible and conjectured explanations for claims of a cryptanalytic breakthrough by the NSA. Also, dedicated hardware could change the picture someday; e.g. as outlined by Daniel Bernstein and Tanja Lange: Batch NFS (in proceedings of SAC 2014; also in Cryptology ePrint Archive, November 2014). Or in the distant future, quantum computers usable for cryptanalysis.

By 2020, the main practical threat to systems still using 1024-bit RSA to protect commercial assets often is not factorization of a public modulus, but rather penetration of the IT infrastructure by other means, such as hacking, and trust in digital certificates issued to entities that should not be trusted. With 2048 bits or more, we are safe from that factorization threat for perhaps two decades, with fair (but not absolute) confidence.

Factorization progress is best shown on a graph (to get at the raw data, e.g. to make a better graph, edit this answer).

[Graph of academic RSA factorization records]

This also shows the linear approximation at the beginning of this answer, which actually is a conjecture at even odds for the [2000-2016] period that I made privately circa 2002, in the context of deciding if the European Digital Tachograph project should be postponed to upgrade its 1024-bit RSA crypto (still widely used today). I committed to it publicly in 2004 (in French). Also pictured are the only three events I know of where an RSA key was hostilely factored (other than copycats of these events, or exploitation of a flawed key generator):

  • The BlackNet PGP Key in 1995. Alec Muffett, Paul Leyland, Arjen Lenstra, and Jim Gillogly covertly factored a 384-bit RSA key that was used to PGP-encipher "the BlackNet message" spammed over many Usenet newsgroups. There was no monetary loss.

  • The French "YesCard" circa 1998. An individual factored the 321-bit key then used (even though it was clearly much too short) in issuer certificates for French debit/credit bank Smart Cards. By proxy of a lawyer, he contacted the card issuing authority, trying to monetize his work. In order to prove his point, he made a handful of counterfeit Smart Cards and actually used them in metro tickets vending machine(s). He was caught and got a 10 months suspended sentence (judgment in French). In 2000 the factorization of the same key was posted (in French) and soon after, counterfeit Smart Cards burgeoned. These worked with any PIN, hence the name YesCard (in French) (other account in English). For a while, they caused some monetary loss in vending machines.

  • The TI-83 Plus OS Signing Key in 2009. An individual factored the 512-bit key used to sign downloadable firmware for this calculator, easing installation of a custom OS and making him a hero among enthusiasts of the machine. There was no direct monetary loss, but the manufacturer was apparently less than amused. Following that, many 512-bit keys (including those of other calculators) have been factored.

Note: 512-bit RSA was no longer providing substantial security by 2000-2005. Despite that, reportedly, certificates with this key size were issued until 2011 by official Certification Authorities and used to sign malware, possibly by means of a hostile factorization.

  • also a consideration - factoring is being done more and more on GPU rather than CPU systems, which offer higher parallelization and shorter crack times (at least on symmetric systems)
    – warren
    Commented Apr 10, 2012 at 22:39
  • @warren: do you have a link describing sieving or linear algebra (the bottlenecks for GNFS) on GPU?
    – fgrieu
    Commented Apr 11, 2012 at 7:49
  • @fgrieu - some of the more detailed references off Jeff Atwood's recent blog post on hashing may have some of those details
    – warren
    Commented Apr 11, 2012 at 18:26
  • @warren: as far as I can tell, nothing in there is even remotely applicable to factoring an RSA modulus (except if generated from a password using a PBKDF).
    – fgrieu
    Commented Apr 12, 2012 at 6:54
  • @fgrieu - I looked at it more as general coverage with reasonable resources on how attacks are being carried out from a hardware perspective... not that it's necessarily directly related to RSA :)
    – warren
    Commented Apr 12, 2012 at 13:07
28

You might want to look at NIST SP800-57, section 5.2. As of 2011, new RSA keys generated by unclassified applications used by the U.S. Federal Government should have a modulus of at least 2048 bits, equivalent to 112 bits of security. If you are not asking on behalf of the U.S. Federal Government, or a supplier of unclassified software applications to the U.S. Federal Government, other rules might of course apply.
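
For quick reference, the comparable-strength figures from SP 800-57 Part 1 (Table 2) can be captured in a small lookup; this is an illustrative sketch with names of my own choosing, not an official API.

```python
# Approximate equivalences from NIST SP 800-57 Part 1 (Table 2):
# RSA modulus size in bits -> comparable symmetric security strength in bits.
NIST_RSA_STRENGTH = {
    1024: 80,    # no longer acceptable for new keys
    2048: 112,
    3072: 128,
    7680: 192,
    15360: 256,
}

def smallest_listed_modulus(security_bits: int) -> int:
    """Smallest listed RSA modulus size giving at least the requested strength."""
    sizes = [m for m, s in NIST_RSA_STRENGTH.items() if s >= security_bits]
    if not sizes:
        raise ValueError("no listed modulus size reaches that strength")
    return min(sizes)

print(smallest_listed_modulus(112))  # 2048
print(smallest_listed_modulus(128))  # 3072
```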

However, at the very least, these figures indicate what the U.S. Federal Government thinks about the computational resources of its adversaries, and presuming they know what they are talking about and have no interest in deliberately disclosing their own sensitive information, it should give some hint about the state of the art.

  • The pertinent tables are located in Section 5.6.1 Table 2 and Section 5.6.2 Table 4, at least in NIST SP800-57 Revision 4.
    – jtpereyda
    Commented Sep 21, 2016 at 21:19
13

The simplest answer would be to look at the keylength.com site, and if you don't trust that, to the linked papers, particularly by NIST and ECRYPT II. Note that those mainly agree with the Lenstra equations, so you could use those as well.

You may have additional restrictions and - if you are brave or stupid - relaxations depending on the use case. But at least they establish a baseline that you can work with.

9

An adversary with a moderately large quantum computer to run Shor's algorithm will cut through a 1024-bit RSA modulus like a hot knife through butter, and maybe through a 2048-bit RSA modulus like a butter knife through a tough piece of steak.

The quantitative difference between ‘butter’ and ‘steak’ here is that the cost of Shor's algorithm run by an attacker is a quadratic function of the cost of computing RSA by a legitimate user. Traditionally in crypto we want the cost of attacks to be exponential in the cost of usage, like trying to use a rubber duckie to cut through a meter-thick steel bank vault door. The cost of the best classical attacks on RSA, NFS and ECM, is not actually an exponential function of the user's costs, but it's comfortably more than polynomial, which is why, for example, we use 2048-bit moduli and not 256-bit moduli for a >100-bit security level.

However, while the cost of Shor's algorithm is a quadratic function of the user's cost, it is a function of $\lg n$, where $n$ is the modulus. Users can use the well-known technique of multiple primes to drive $\lg n$ up, making Shor's algorithm (and the classical NFS) much costlier, at more or less linear cost to users. The best classical attack in that case is no longer the NFS but the ECM, whose cost depends on $\lg y$ where $y > p_i$ is an upper bound on all factors $p_i$ of $n$.
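
To illustrate the multi-prime idea, here is a toy sketch of my own (using sympy for prime generation; the parameters are far too small for real use, and this is not the pqRSA construction itself): the modulus $n$ grows with the number of primes, while each prime (the target of ECM) stays a fixed size.

```python
# Toy multi-prime RSA: n is a product of k primes of fixed bit size b,
# so lg(n) ~ k*b grows while each prime stays small.  Illustrative only;
# the parameters below are far too small for real use.
from math import gcd
from sympy import randprime

def multiprime_rsa_keygen(k: int, prime_bits: int, e: int = 65537):
    primes = []
    while len(primes) < k:
        p = randprime(2**(prime_bits - 1), 2**prime_bits)
        if p not in primes and gcd(e, p - 1) == 1:
            primes.append(p)
    n = 1
    lam = 1
    for p in primes:
        n *= p
        lam = lam * (p - 1) // gcd(lam, p - 1)   # Carmichael lambda = lcm(p_i - 1)
    d = pow(e, -1, lam)                          # modular inverse (Python 3.8+)
    return n, e, d

n, e, d = multiprime_rsa_keygen(k=4, prime_bits=64)
m = 123456789
c = pow(m, e, n)          # "encryption" (no padding; toy only)
assert pow(c, d, n) == m  # decryption recovers the message
print(n.bit_length())     # roughly k * prime_bits
```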

An adversary with a large quantum computer combines the ECM with Grover's algorithm to get a quadratic speedup, requiring the legitimate user to merely double their prime sizes. 1024-bit primes are considered safe enough for today against ECM, so we could double that to 2048-bit primes to be safe against Grover–ECM, but out of an abundance of caution, we might choose 4096-bit primes.
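
For a rough feel of why doubling the prime size is suggested, here is a back-of-the-envelope sketch (my own numbers, not the answer's) that plugs prime sizes into the heuristic ECM complexity $L_p[1/2, \sqrt{2}]$ and models the Grover speedup crudely as halving the exponent; $o(1)$ terms and constant factors are ignored.

```python
# Back-of-the-envelope only: heuristic asymptotic ECM cost L_p[1/2, sqrt(2)],
# ignoring o(1) terms and constant factors, with the Grover speedup modelled
# crudely as halving the exponent.  Serious estimates need far more care.
from math import log, sqrt

def ecm_log2_cost(prime_bits: int) -> float:
    ln_p = prime_bits * log(2)
    return sqrt(2) * sqrt(ln_p * log(ln_p)) / log(2)  # log2 of L_p[1/2, sqrt(2)]

for b in (1024, 2048, 4096):
    c = ecm_log2_cost(b)
    print(f"{b}-bit primes: ~2^{c:.0f} classical ECM, ~2^{c/2:.0f} Grover-ECM (naive)")
```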

At what size moduli does the cost of Shor's algorithm exceed the cost of Grover–ECM? It's hard to know for sure how to extrapolate that far out, but we might surmise from conservative estimates of costs what might be good enough.

Thus, to attain security against all attacks known or plausibly imaginable today including adversaries with large quantum computers, cryptographers recommend one-terabyte RSA moduli of 4096-bit primes. Cryptographers also recommend that you brush your teeth and floss twice a day.

Note that these estimates are very preliminary, because nobody has yet built a quantum computer large enough to factor anything larger than the dizzyingly large 21 with Shor's algorithm modified to get a little help from someone who knows the factors already. (Larger numbers like 291311 = 523*557 have been factored on adiabatic quantum computers, but nobody seems to know how the running time might scale even if we had enough qubits.)

So this recommendation may be unnecessarily conservative: maybe once we reach the limits on the cost of qubit operations, it will turn out to take only a few gigabytes to thwart Shor's algorithm. Moreover, standard multiprime RSA may not be the most efficient post-quantum RSA variant: maybe there is a middle ground between traditional RSA and RSA for paranoids, that will outperform this preliminary pqRSA proposal.

  • Everyone upvoting this is absurd. Commented Mar 12, 2019 at 3:49
  • Don't see that "butter knife" with 14 qubits and a measurement error of 30%, growing exponentially with the number of qubits (below 100%). Commented Jan 19, 2022 at 21:55
7

Another way of determining a key size that offers 'adequate security', presented initially by Lenstra, is the equivalence between symmetric and asymmetric key lengths: two systems offer cost-equivalent security if, at a given time, the hardware that allows a successful attack in a certain fixed amount of time costs the same amount of money for both systems. Adequate security was defined as the security offered by DES in 1982.

There are several factors that influence the choice of the key length, for example the life span of the data you want to protect, the estimated computational resources of the attacker (consider Moore's law), and cryptanalytic advances through the years (integer factorisation). According to Lenstra, by 2013 a symmetric key size of 80 bits and an asymmetric key size of at least 1184 bits were considered to offer adequate security.

A more recent method of determining adequate key sizes, again by Lenstra ("Using the cloud to determine key strengths"), is to use cloud services to estimate the computational cost required to factor keys, assuming that the fastest way is the Number Field Sieve algorithm. He used Amazon's cloud services to develop his cost-based model. Note that the Number Field Sieve algorithm is over 20 years old, and since 1989 there have been no major advances in this area besides small tweaks.
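
As an illustration of this style of extrapolation (not Lenstra's actual cloud-based cost model), one can plug modulus sizes into the heuristic GNFS complexity $L_n[1/3, (64/9)^{1/3}]$ and read off an approximate symmetric-equivalent strength; $o(1)$ terms and real-world constants are ignored, so the numbers are only indicative.

```python
# Indicative only: heuristic GNFS complexity L_n[1/3, (64/9)^(1/3)], ignoring
# o(1) terms and constant factors, converted to an approximate "symmetric
# equivalent" in bits.  This mimics how key-size recommendations are derived,
# but it is not Lenstra's published cost model.
from math import log

def gnfs_log2_cost(modulus_bits: int) -> float:
    c = (64 / 9) ** (1 / 3)
    ln_n = modulus_bits * log(2)
    return c * ln_n ** (1 / 3) * log(ln_n) ** (2 / 3) / log(2)

for bits in (1024, 1184, 2048, 3072):
    print(f"{bits}-bit modulus: ~2^{gnfs_log2_cost(bits):.0f} GNFS operations")
# Roughly 2^87, 2^92, 2^117 and 2^139 -- in the same ballpark as the usual
# 80/112/128-bit equivalences, though the constants matter in practice.
```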

In recent surveys it has been observed that people tend to move towards 2048-bit keys, although certificates holding 1024-bit keys have no reason to be revoked as long as they have not expired.

5

On my website, I have described why I personally have chosen a 10kbit RSA key:

  • The "do not cost significant performance penalty" depends very much on context. For email that may be true since you usually don't receive several emails per second. For many other contexts, such as SSL, several hundred milliseconds of decryption time for a single handshake are not acceptable. The other big problem is that many RSA implementations do not support such large keys. Commented Jun 20, 2014 at 9:55
  • It does not change the two overall conclusions: the penalty is paid solely by the key holder, not by the communication partner; and measure before you assume 10 kbit keys will be a problem.
    – Ole Tange
    Commented Jun 20, 2014 at 18:29
  • "measure before you assume" – For the fun of it, I just did, and as a result noticed that your claims are built on faulty conclusions. Skipping potential discussions about your somewhat naive benchmark strategy, I would like to point out that the numbers on your site are wrong. For example: you state CPU performance 4kbit --encrypt = <0.001-0.004 secs and CPU performance 10kbit --encrypt = 0.004-0.012 secs, but you come to the conclusion that Additional cost for --encrypt = 0 secs? Also, in one case you even claim 4kbit to be slower than 10kbit… I hope you notice what's wrong with that!
    – e-sushi
    Commented Jun 20, 2014 at 20:25
  • The additional cost is 0 seconds rounded to 0 decimals. It makes sense to round to 0 decimals when the measuring uncertainty is greater than the number (in this case the uncertainty is around 0.008 and the difference is less than 0.008). I have tried figuring out which case you are talking about, and have been unable to find it. Also, I do not claim: I measure and report the measurements. Will you care to share the numbers you got when you re-did the tests on your systems?
    – Ole Tange
    Commented Jun 20, 2014 at 22:57
  • "should last a life time if the future holds no surprises" A lifetime is a very long time, and the future is always full of surprises. Think of the advances in computing -- you can now keep in your pocket a device which is billions of times more powerful than a room-sized calculator of 50 years ago.
    – user26033
    Commented Jan 30, 2018 at 10:50
3

According to Schneier's book Cryptography Engineering, n = pq

...should be around 6800 bits...

in order to

design a system that will be used for 30 years, and the data must be kept secure for 20 years after it has first been processed.

It also states that this is not the same as for symmetric keys, since a

...variable-sized...public key only needs to protect data 21 years, rather than the 50 years needed for symmetric keys. Each year, you generate a new public key, and you can choose larger public keys as computing technology progresses.

but it also states that key size almost never matters, as the other parts of the system are, more often than not, weaker.

