Humans have been attempting to both advance and break encryption since the Spartan scytale and the infamous Caesar cipher. Modern cryptography is no different. From the Enigma machine to RSA, the methods we use to secure our data have always been and will always be tested.
The advent of quantum computing will up the ante once again, presenting an impending threat to today’s cryptographic algorithms. Because preparing for this eventuality will be complicated and time-consuming, it is imperative to get the ball rolling as soon as possible. We rely on these cryptographic algorithms to secure our applications, our most important data — our entire digital lives. The risk is too high to ignore.
The increasing pace of this age-old game of cat and mouse and our increased reliance on these algorithms calls for the accelerated development of a much-overlooked practice: crypto agility. Simply stated, crypto agility is the ability to rapidly change cryptographic algorithms and implementations. It is about to become a key driver of organizational survival and success.
This blog explains why the software industry must plan to adopt a cryptographically agile approach, both in software development and IT operations — well in advance of quantum computing. I’ll explain how quantum computing and the current state of our crypto portfolios are driving this change, as well as its impacts and available solutions.
Why is crypto agility so important?
There are two key business drivers for crypto agility. First, many organizations need to concurrently support multiple cryptographic algorithms across their infrastructure without creating unsustainable complexity. Diverse regulatory environments drive the use of varying algorithms, key characteristics, key-management systems, and more. On a practical level, this can be quite a pain to manage. Many organizations employ more cryptography than they need and flip between implementations as feature support in one library leapfrogs another. A needless diversity of cryptography makes software even harder to upgrade, debug, and manage. Implementing crypto agility enables developers to design applications and services that are decoupled from their cryptography, allowing them to support multiple policy-driven cryptographic requirements at once while consolidating their “crypto footprint.”
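As a minimal sketch of that decoupling (the registry, policy names, and `digest` wrapper here are all hypothetical, not taken from any particular library), an application can call one stable function while a configuration value selects the algorithm:

```python
import hashlib
from typing import Callable, Dict

# Hypothetical provider registry: the application never names an
# algorithm directly, so swapping one means editing policy, not code.
_PROVIDERS: Dict[str, Callable[[bytes], bytes]] = {
    "legacy": lambda data: hashlib.sha1(data).digest(),
    "current": lambda data: hashlib.sha256(data).digest(),
    "next": lambda data: hashlib.sha3_256(data).digest(),
}

# In practice this would come from configuration, not a constant.
ACTIVE_POLICY = "current"

def digest(data: bytes) -> bytes:
    """Hash `data` under whichever algorithm the active policy selects."""
    return _PROVIDERS[ACTIVE_POLICY](data)
```

The same pattern extends to signing, key exchange, and encryption: the abstraction layer owns the algorithm choice, so a policy change rolls out without touching application code.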
Second, cryptographic algorithms become less effective over time. As the sophistication of cryptanalysis techniques increases and computing power improves, the likelihood of breaking these algorithms increases. This requires organizations to implement more frequent changes to effectively protect data.
Unfortunately, remediating vulnerable cryptography is exceedingly expensive and error-prone. For example, a year after OpenSSL suffered an implementation error that led to the Heartbleed vulnerability, more than 50% of US organizations still had not patched all of their OpenSSL instances. This is because software applications are usually written with their cryptographic algorithms “baked in,” rendering them extremely difficult to change. This long tail of security vulnerabilities is exactly the problem crypto agility is designed to solve.
Currently, the industry is not prepared for its cryptographic workhorses — RSA, EC, and DSA — to be invalidated. Although they have not suffered a core algorithmic compromise within the last few decades, advancing technologies will eventually make key cracking a possibility.
Quantum computing is a new catalyst for cryptographic change. As quantum computers gain access to more qubits over time, they will be able to use Shor’s algorithm to efficiently recover private keys by finding the prime factors of the large numbers that underpin them. Our current standards of public-key cryptography, including encryption, key encapsulation, and digital signatures, rely on the one-way nature of these operations. (Note that the discrete-logarithm and elliptic-curve discrete-logarithm problems used by DSA and EC are also easily solved by a sufficiently powerful quantum computer running Shor’s algorithm.)
To explain: for integer factorization, it is easy to multiply two prime numbers into a very large number. Without a quantum computer, however, it is extremely difficult to determine which two primes are its factors; with one, the prime factors can be found quickly. This means our current methodologies of key exchange and public-key cryptography (such as RSA, Diffie-Hellman, and ECDH) will be weakened in the future. These are the keys to the kingdom, and soon they may be exposed.
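The asymmetry is easy to demonstrate with toy numbers (real RSA moduli are 2048 bits or more; the primes below are chosen purely for illustration):

```python
def trial_factor(n: int) -> int:
    """Naive trial division over odd candidates. The work grows with the
    size of the smaller prime factor, which is why classically factoring
    a 2048-bit modulus is infeasible."""
    f = 3
    while n % f:
        f += 2
    return f

p, q = 999_983, 1_000_003  # two small primes
n = p * q                  # generating n is a single multiplication
print(trial_factor(n))     # recovering p takes ~half a million divisions
```

Multiplying the primes is one machine operation; undoing it, even for these six-digit primes, already takes hundreds of thousands of steps, and the cost explodes as the primes grow.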
Unfortunately, this is not some distant problem future generations will have to worry about. Part of the threat is already knocking on the doors of our (currently rather vacant) offices. The exposure happening today will facilitate the threat we will face tomorrow. Here’s how: when public-key cryptography and key-exchange algorithms are broken, the symmetric keys we use for bulk encryption can be extracted from recorded transmissions. Like removing the keystone from an arch, a disruptive reduction in the difficulty of attacking public-key cryptography brings the rest of conventional cryptography down with it.
If our fundamental cryptographic algorithms are broken this decade, an attacker can record encrypted data now in preparation for breaking the encryption once scalable quantum computing is available. Once the public-key encryption is broken, they will be able to recover the symmetric keys and, from there, the encrypted data. This is known as a “harvest-now, decrypt-later” attack, and it is particularly frightening for data with long-term sensitivity. What’s more, this risk will only increase over time. As we get closer to the quantum-computing threat, sensitive data with shorter life spans also becomes worrisome. The longer we delay, the larger the risk.
What encrypted data does your organization send over the public internet that will still be sensitive in the next decade? Definite food for thought.
While quantum-safe algorithms (called post-quantum cryptography, or PQC) have been under development for some time, they have not been widely adopted. NIST initiated a process in 2017 to evaluate and standardize one or more quantum-resistant public-key cryptographic algorithms. The process is now in its third round of submissions, and you can read about each of the finalists in detail here. It’s important to realize that these PQC algorithms use a variety of approaches (code-based, lattice-based, and hash-based) that are fundamentally different from each other and from today’s conventional cryptography. Each algorithm makes significant tradeoffs, with wildly different key sizes, computational cost, number of round trips in key exchange, and length of ciphertext produced.
These differing characteristics mean that upgrading to these algorithms or switching between them will change resource utilization, disrupt application performance, and severely affect any dependent services. Crypto agility enabled through a cryptographic abstraction layer will allow for swift policy-based changes, while minimizing disruption to the application.
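To make those tradeoffs concrete, a policy layer can choose among candidate algorithms by their operational characteristics. The sizes below are approximate figures for two NIST round-3 KEM candidates (consult the official submissions for exact parameters), and the selection rule itself is purely illustrative:

```python
from dataclasses import dataclass
from typing import List

@dataclass(frozen=True)
class KemProfile:
    name: str
    public_key_bytes: int   # approximate; see the NIST submissions
    ciphertext_bytes: int   # approximate; see the NIST submissions

PROFILES: List[KemProfile] = [
    KemProfile("Kyber768", 1184, 1088),
    KemProfile("Classic-McEliece-348864", 261120, 128),
]

def pick_kem(max_public_key_bytes: int) -> KemProfile:
    """Illustrative policy: among KEMs whose public key fits the
    transport budget, prefer the one with the smallest ciphertext."""
    candidates = [p for p in PROFILES if p.public_key_bytes <= max_public_key_bytes]
    return min(candidates, key=lambda p: p.ciphertext_bytes)
```

A bandwidth-constrained handshake would land on Kyber768, while a deployment that can pre-distribute large public keys might prefer Classic McEliece for its tiny ciphertexts; encoding that choice as policy is exactly what an abstraction layer enables.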
What can we do now?
So now that you understand the problem and the impending threat, here’s how to help your organization begin the process of getting prepared:
- The first step in any plan is always information gathering. You can’t defend something you’re not aware of. You’ll need to inventory the cryptography you currently use. Whether it’s in the applications your organization has developed or in software provided by a vendor, an up-to-date list of your cryptography will form an important part of your asset-management strategy.
- Next, you’ll need to develop a plan for addressing things that go wrong. To that end, your organization’s incident-response plan should be updated to include a procedure to patch crypto libraries. It is also critical to include IoT devices in these planning phases. These devices often have long lives and have historically suffered from poor patch compliance.
- Consider the exposure that is happening now. Once an exposure happens, it is irreversible. CIOs should consider policy changes to minimize encrypted public exposure of long-term sensitive data.
- To mitigate exposure risk, implement hybrid PQC algorithms for secure communication over public networks. (Hybrid PQC implements a PQC algorithm in tandem with standard encryption to maintain compliance with current cryptography standards, while adding quantum safety.)
- Finally, your organization should plan for key rolling and data recovery, because data encrypted with old symmetric keys that have been exposed may need to be decrypted and re-encrypted.
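The hybrid approach recommended above can be sketched as a key-derivation step: combine the classical and PQC shared secrets so that the session key stays safe as long as either algorithm holds. The function name and context label here are illustrative, and the random bytes stand in for real key-exchange outputs; actual protocols (such as the hybrid TLS drafts) define their own KDF constructions:

```python
import hashlib
import hmac
import os

def hybrid_session_key(classical_ss: bytes, pqc_ss: bytes, context: bytes) -> bytes:
    """HMAC-based extract step over the concatenated shared secrets.
    An attacker must break BOTH exchanges to learn the session key."""
    return hmac.new(context, classical_ss + pqc_ss, hashlib.sha256).digest()

# Stand-ins for the outputs of a real ECDH exchange and a real PQC KEM.
ecdh_secret = os.urandom(32)
kem_secret = os.urandom(32)
session_key = hybrid_session_key(ecdh_secret, kem_secret, b"hybrid-demo-v1")
```

Because the derivation mixes both secrets, a future break of the classical exchange alone does not expose recorded traffic, which directly blunts the harvest-now, decrypt-later threat described earlier.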
The advent of quantum computing has amplified the need for crypto agility in the modern enterprise. Our reliance on specific algorithms means that our organizations are slow to adapt to changes in the cryptographic landscape. To be more resilient to the risks of weakening ciphers and a dynamic regulatory landscape, we must embrace crypto agility. We must architect our applications, services, and information-security postures to change more easily — if we are to stay ahead of the curve.