
RSA key lengths, elliptic curve cryptography and quantum computing

Some tools, like PGP, are still stuck1 with legacy cryptography, mainly the RSA algorithm. For such tools, RSA-2048 is often described as strong enough for any foreseeable future, anything above being overkill. The GnuPG official documentation even goes so far as to consider that using RSA-3072 or RSA-4096 constitutes “an improvement so marginal that it’s really not worth mentioning”, adding that “the way to go would be to switch to elliptical curve cryptography”.

The assertion that this improvement is “marginal” is debatable, as is the trust placed in elliptic curves to protect us in the future.

Longer RSA keys

While NIST considers RSA-2048 to be safe for commercial use up to 2030, it advises the use of at least RSA-3072 beyond that date (see BlueKrypt’s Keylength website for an overview of the various recommendations).

Read quickly, such a recommendation sounds like RSA-2048 should indeed be safe for today’s world. In fact this depends on the use you intend for your keys: “safe up to 2030” doesn’t mean that you are safe as long as you plan to migrate to something else before 2030. This is not some kind of end-of-support date. It means that you must assume that whatever you encrypt now will be decrypted within a dozen years (and a dozen years goes by pretty fast).
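If you want to check where an existing key stands with respect to that recommendation, a minimal sketch (assuming the third-party Python cryptography package; the file name is a placeholder) could look like this:

    from cryptography.hazmat.primitives import serialization

    # Load a public key and compare its modulus size against the >= 3072-bit
    # recommendation; "mykey.pub.pem" is purely illustrative.
    with open("mykey.pub.pem", "rb") as f:
        public_key = serialization.load_pem_public_key(f.read())

    if public_key.key_size < 3072:
        print(f"{public_key.key_size} bits: deemed fine until ~2030, not beyond")
    else:
        print(f"{public_key.key_size} bits: meets the longer-term recommendation")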

For short-term secrets or, to some extent, signatures, this is usually less of a problem. The fact, for instance, that an attacker may be able to forge a signature from a dozen years ago shouldn’t cause an issue: by that time such a signature should have been revoked, and good software should refuse to trust key sizes or algorithms widely known to be weak.

But for systems which involve long-term storage or the exchange of valuable information, the fact that the data may be decrypted in a dozen years can be devastating. Concretely, if you store today an encrypted archive protected using RSA-2048 on a cloud service, you must assume that the content of this archive will be known to authorities and intelligence services a dozen years from now (and, again, time goes by very fast).

Even if the archive file is quickly deleted, some intelligence agencies attempt to process digital data exchange as a whole (the whole of the Internet, satellites, phone communications, etc.) and massively intercept and copy even remotely interesting data (an encrypted archive, for instance, is a perfect candidate) in order to analyze or decrypt it a few years down the road.

Data acquisition and long-term storage are a major investment for some intelligence agencies, the most widely known example being of course the NSA. A year before the Snowden revelations, Laura Poitras published a short documentary on William Binney, another former NSA employee. This documentary focused on the NSA’s “Stellar Wind” program and their Utah data center:

Binney calculates the facility has the capacity to store 100 years’ worth of the world’s electronic communications.2

Quantum computing

The NSA is a dual-headed organization, with both a national intelligence role and an advisory role protecting against foreign intelligence3.

As part of its advisory role, in January 2016 the NSA published a very interesting FAQ titled Commercial National Security Algorithm Suite and Quantum Computing FAQ (I highly encourage you to read it). As far as National Security Systems (NSS) are concerned, RSA-2048 should simply not be used anymore. It is as simple as that. If you want to protect your data, use RSA-3072 at a minimum, this minimum being kept relatively low for compatibility purposes, knowing that higher is better.
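For illustration, here is a minimal sketch of generating a key at that recommended minimum size with the third-party Python cryptography package (the choice of library and the passphrase are mine, not something the NSA paper prescribes):

    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric import rsa

    # 3072 bits is the floor discussed above; larger sizes (e.g. 4096) also work.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)

    # Serialize the private key, encrypted with a placeholder passphrase.
    pem = private_key.private_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PrivateFormat.PKCS8,
        encryption_algorithm=serialization.BestAvailableEncryption(b"change-me"),
    )
    print(pem.decode().splitlines()[0])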

This paper then focuses on the next real threat against modern cryptography. According to the NSA, it is no longer the natural evolution of processing power, as it used to be, but the progress toward effective quantum computing.

Professor Gilles Brassard explains the threat as follows:

It takes no more time to break RSA on a quantum computer (up to a multiplicative constant) than to use it legitimately on a classical computer.

This leads the NSA to conclude, in the above-mentioned paper:

A sufficiently large quantum computer, if built, would be capable of undermining all widely-deployed public key algorithms used for key establishment and digital signatures.

Quantum computing would affect RSA and ECC algorithms alike, so ECC is not a solution here. However, quantum computing is not some kind of magical threat affecting every kind of encryption. Symmetric algorithms, for instance, are said to be more resistant to quantum computing, and quantum-resistant asymmetric algorithms have already been proposed.
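As a rough, purely illustrative way to see this asymmetry (the back-of-the-envelope figures below are mine, not taken from the NSA paper): Grover’s algorithm only gives a quadratic speedup against symmetric ciphers, so doubling the key length restores the margin, whereas Shor’s algorithm factors RSA moduli in polynomial time.

    import math

    # Grover: searching an n-bit key space takes on the order of 2**(n/2) steps,
    # so a 256-bit symmetric key keeps roughly 128 bits of effective strength.
    def grover_effective_bits(key_bits):
        return key_bits / 2

    # Classical GNFS cost for an n-bit RSA modulus, expressed in bits of work
    # (sub-exponential), to contrast with Shor's roughly cubic gate count.
    def gnfs_bits_of_work(modulus_bits):
        ln_n = modulus_bits * math.log(2)
        return ((64 / 9) ** (1 / 3)) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3) / math.log(2)

    print(grover_effective_bits(256))      # 128.0
    print(round(gnfs_bits_of_work(2048)))  # ~117 by this formula (often quoted near 110)
    print(f"{2048 ** 3:.1e}")              # ~8.6e+09: a crude order of magnitude for Shor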

According to the NSA, the future of asymmetric encryption lies in these quantum-resistant algorithms, and not in ECC, despite the claim in the GnuPG documentation quoted at the beginning of this article.

This does not mean that ECC is not an improvement over older algorithms such as RSA: it certainly is. This is a matter of cost: if a company or a project cannot afford to implement both ECC and then quantum-resistant algorithms in a row, they should save their time and money and invest them in the upcoming quantum-resistant algorithms once standardization has been achieved (a process which should take a few years). If a project can afford both, then it’s obviously better. But one should not rush to ECC now only to find themselves unable to move on to quantum-resistant algorithms down the road.

ECC algorithms were an answer to the increase in computational power, but as the threat shifts the answer has to shift too.

Note

One of the advantages of ECC algorithms is a return to relatively small key sizes (a 256-bit ECC key provides the same strength as a 3072-bit RSA key).
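A quick way to visualize that difference, sketched here with the third-party Python cryptography package (my choice of tooling, not the NSA’s): generate one key of each kind and compare the size of the serialized public keys.

    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric import ec, rsa

    keys = {
        "RSA-3072": rsa.generate_private_key(public_exponent=65537, key_size=3072),
        "ECC P-256": ec.generate_private_key(ec.SECP256R1()),
    }

    # Both are commonly rated at roughly 128-bit classical security,
    # yet the encoded public keys differ greatly in size.
    for name, key in keys.items():
        der = key.public_key().public_bytes(
            encoding=serialization.Encoding.DER,
            format=serialization.PublicFormat.SubjectPublicKeyInfo,
        )
        print(f"{name}: {len(der)} bytes of DER-encoded public key")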

According to this NSA paper, this won’t be the case anymore with quantum-resistant algorithms:

The key sizes for these algorithms will be much larger than those used in current algorithms.

Because of this, the NSA also calls on interested parties to measure the potential side effects beforehand:

Work will be required to gauge the effects of these larger key sizes on standard protocols as well. NSA encourages those interested to engage with standards organizations working in this area and to analyze the effects of adopting quantum resistant algorithms in standard protocols.

Non-standard key lengths and algorithms

From time to time I encounter people advocating the use of non-standard algorithms or of standard algorithms used in non-standard or unusual ways:

  • New algorithms which didn’t go through the same amount of scrutiny as the standardized ones.
  • Non-standard algorithm combinations or usage.
  • Uncommon key sizes.

Cryptography is a very complex and sensitive matter; it should go without saying that none of these practices should be considered in the realm of any real security scheme.

  • Schneier’s law says that you should not roll your own crypto (this is discussed more in depth here); this also applies to choosing obscure algorithms which weren’t vetted by the cryptography community just because you somehow made a wrong connection between “less known” and “more secure”.

  • Cryptographic algorithms are designed to be used in a certain way, and they deliver the highest security when they are used exactly that way. As soon as you start to deviate from it, even “just a little”, you must assume that you reduce the resulting security.

    A perfect example is hashing algorithms: I regularly see misinformed people re-hashing something several times “to increase security”, while for several reasons hashing a hash will in fact decrease the resulting security (a short sketch contrasting this with a standardized construction follows this list).

  • Even if an algorithm may theoretically be designed to work with an arbitrary key size, once the community has agreed on a common set of sizes it is usually unwise to depart from them.

    Every software and device being tested with those sizes, using uncommon key sizes puts you outside the usual test cases and may trigger unexpected behaviors. In the best case, this will be an error message. In the worst case, this will be a weakness affecting the resulting security.

    As with the first bullet, such a practice comes from the frequent misconception that what is uncommon is more secure. Advocates of such measures usually explain that, assuming a state actor is able to break RSA-4096, it would require specific optimizations which won’t work against, say, a 3456-bit key, which would require specific development to be broken.

    To leave the realm of assumptions and come back to Earth, I’ve never encountered any report stating that a 120-bit key is harder to break than a 128-bit one. So if an attacker is able to break RSA keys up to 4096 bits, then a 3456-bit key will be broken too.

    Uncommon key sizes expose you to software bugs and interoperability issues without any real security gain.
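To make the re-hashing point above concrete, here is a small sketch using only the Python standard library (the password and iteration count are arbitrary placeholders): naive re-hashing adds neither a salt nor a meaningful work factor, whereas a standardized construction such as PBKDF2-HMAC is designed precisely for iterated hashing.

    import hashlib
    import os

    password = b"correct horse battery staple"

    # Naive "hash of a hash": no salt, no tunable work factor, and the second
    # pass only re-maps the already fixed-size output of the first one.
    naive = hashlib.sha256(hashlib.sha256(password).digest()).hexdigest()

    # Standardized alternative: salted PBKDF2-HMAC with an explicit work factor.
    salt = os.urandom(16)
    derived = hashlib.pbkdf2_hmac("sha256", password, salt, 600_000)

    print(naive)
    print(derived.hex())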


  1. The use of ECC in PGP tools was standardized by RFC 6637 in 2012. GnuPG added it in GPG 2.1, released in 2014, and it became stable with GPG 2.2 in August 2017.

  2. I have seen several websites trying to estimate the storage space required or available in such a facility with regard to cost, often ending up with astronomical numbers. The fact is that you don’t need this facility to be able to store 100 years’ worth of communications right from the beginning; that would be plain dumb. You need to be able to store one or just a few years’ worth, and simply ensure that the storage capacity grows at a sufficient pace compared to the quantity of incoming intercepted data, either by adding new storage units over the years or by replacing existing ones to take advantage of constant technological evolution. You don’t need storage capacity, you need storage scalability, and the NSA itself doesn’t say anything different:

    The Utah Data Center was built with future expansion in mind and the ultimate capacity will definitely be “alottabytes”!

  3. Of course these two roles don’t come without a certain amount of conflict of interest, as shown by the Dual_EC_DRBG case.

