Security risks of outdated encryption: Is your data really secure?
They say that those who fail to learn from history are doomed to repeat it. A salient factor in Prussia's defeat of Austria in the 1866 Austro-Prussian War was the Prussian army's standardization of the (then) modern, rapid-firing, bolt-action Dreyse needle-gun. The Austrian army, in contrast, persisted with outdated, slow-loading muzzle-loading rifles. Austria was out-gunned, with disastrous battlefield results. It is an important lesson in the cost of failing to adapt to technological change, and one that is pertinent to encryption.
It’s well-known that encryption algorithms rarely stand the test of time. This is partly because such algorithms are devised against the methods of attack and cryptanalysis that exist, or are envisaged, at the time the algorithm is designed. As novel attack methods are uncovered, hardware processing speeds increase, and cryptanalysis advances, encryption algorithms fall prey to new vulnerabilities.
There are many examples of outdated encryption algorithms, now considered dangerously obsolete. Some common examples are discussed here.
Types of encryption algorithms
An algorithm is only as good as the testing it goes through in the field. Often, the insecurities inherent in an algorithm become apparent only after many years of use. Let’s look at some examples.
Hash algorithms and collision attacks
Hashing is an umbrella term for methods that transform data in a manner that cannot be reversed (unlike encryption, a hash is one-way by design). Data passed through a hash algorithm produces a fixed-size sequence of bytes, which should be unique for any given input. However, several once-trusted hash algorithms have since proved insecure.
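The two defining properties just described, a fixed-size output and one-way operation, can be seen in a minimal Python sketch (using SHA-256 purely as an illustration; the inputs are arbitrary):

```python
import hashlib

# Any input, whatever its length, yields a digest of the same fixed size.
short = hashlib.sha256(b"hi").hexdigest()
long_ = hashlib.sha256(b"x" * 1_000_000).hexdigest()

assert len(short) == 64  # SHA-256: 32 bytes = 64 hex characters
assert len(long_) == 64

# One-way: the digest reveals nothing that lets you recover b"hi" from `short`;
# there is no "decrypt" operation for a hash.
print(short)
```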
Invented in 1991 by the famous cryptographer Ron Rivest, the hash algorithm MD5 was considered secure enough for most cryptographic purposes throughout the early ‘90s. Later, however, it was discovered to be totally insecure.
Collision resistance is a measure of how hard it is to find two different inputs that produce the same hash output. If a hash algorithm is vulnerable to collisions, there is a practical probability of finding more than one input value that yields the same output value.
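To make the idea concrete, here is a toy Python sketch that finds a collision by birthday search against a deliberately weakened hash (MD5 truncated to 3 bytes, so a collision appears after only a few thousand attempts; the function names and inputs are illustrative). Real collision attacks on full MD5 and SHA1 exploit internal structure rather than brute force, but the end result is the same: two distinct inputs, one hash.

```python
import hashlib

def truncated_md5(data, nbytes=3):
    """First `nbytes` bytes of the MD5 digest -- a deliberately weak hash."""
    return hashlib.md5(data).digest()[:nbytes]

def find_collision(nbytes=3):
    """Birthday search: hash distinct inputs until two share a digest.
    For a 24-bit digest this takes on the order of a few thousand tries."""
    seen = {}
    i = 0
    while True:
        msg = b"message-%d" % i
        d = truncated_md5(msg, nbytes)
        if d in seen:
            return seen[d], msg
        seen[d] = msg
        i += 1

a, b = find_collision()
assert a != b
assert truncated_md5(a) == truncated_md5(b)
print(a, b)  # two different inputs with the same (truncated) hash
```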
It is clear that should such a hash be used for data verification or digital signatures, i.e., checking that data has not been altered or substituted, an attacker could potentially alter that data while still generating a hash that matches the original.
MD5 and the more modern SHA1 have both been shown to be susceptible to collision attacks; indeed, it has been demonstrated that digital certificates using a SHA1 hash as part of the signing algorithm can be counterfeited. Although these hash methods have been deprecated for some while, they still linger in older software and systems, leaving those systems vulnerable.
Similarly, the simple use of hashes for password storage is now considered obsolete. This is not because of collisions; rather, their inherent speed makes them unsuitable for the purpose: an attacker can simply try billions of candidate inputs until one produces the target hash. Yet hashing passwords with MD5 remained in use in 2020, and many online code examples still demonstrate MD5-based password storage.
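The speed problem is easy to demonstrate. In this Python sketch (the password and character set are illustrative), a short lowercase password stored as an unsalted MD5 hash falls to naive exhaustive search in a fraction of a second on a laptop; GPU-equipped attackers run the same loop billions of times per second:

```python
import hashlib
import itertools
import string

# A "leaked" unsalted MD5 password hash (the password is deliberately weak).
target = hashlib.md5(b"abc").hexdigest()

def crack(target_hex, max_len=3):
    """Exhaustively try short lowercase passwords. MD5 is so fast that
    even this naive single-threaded loop finishes almost instantly."""
    for length in range(1, max_len + 1):
        for combo in itertools.product(string.ascii_lowercase, repeat=length):
            guess = "".join(combo)
            if hashlib.md5(guess.encode()).hexdigest() == target_hex:
                return guess
    return None

print(crack(target))  # abc
```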
Best practices and hashing
Using obsolete hashing methods for digital signatures or data verification leaves your system vulnerable, and it is usually an easy thing to upgrade. It's well worth checking your code libraries and default settings for weak hash methods and password storage methods. For secure password storage, use a purpose-built algorithm such as bcrypt or Argon2, unless you want to join the ever-growing list of companies that have lost user data through poor password storage practice.
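bcrypt and Argon2 typically require a third-party library (the npm packages in the references below, for example). As a self-contained sketch of the same principle, deliberately slow, salted, key-stretched hashing, here is PBKDF2 from the Python standard library; the iteration count and helper names are illustrative, and current guidance favors iteration counts in the hundreds of thousands:

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # illustrative; tune upward per current guidance

def hash_password(password):
    """Return salt || digest using PBKDF2-HMAC-SHA256 (Python stdlib)."""
    salt = os.urandom(16)  # a unique random salt defeats precomputed tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt + digest

def verify_password(password, stored):
    salt, digest = stored[:16], stored[16:]
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    # compare_digest avoids leaking information via timing differences
    return hmac.compare_digest(candidate, digest)

stored = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", stored)
assert not verify_password("wrong guess", stored)
```

The key design point is that each guess now costs the attacker 200,000 hash computations instead of one, turning a billions-per-second attack into a thousands-per-second one.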
Reversible encryption
The term “encryption” here means reversible encryption, used to protect data at rest and in transit. These applications are frequently littered with the unexploded ordnance of poor practice and obsolete (or simply bad) algorithms, waiting to detonate at a hacker's prodding.
There has never been a period in which easy-to-use, secure encryption algorithms were more readily available, and yet there are still so many instances of poor practice. One case in point is securing HTTPS traffic: the danger of web servers permitting outdated and insecure protocols (TLS 1.0, TLS 1.1, SSLv2, SSLv3) has been known for some time, but many web servers still permit them. One estimate reported around 850,000 websites still using TLS 1.0 or 1.1 as of March 2020.
The danger is that these protocols are exploitable, leading to data loss. If you value your customers’ data, you should check your own websites for compliance with the more secure TLS 1.2 and 1.3 standards. You can test your site at, for example, https://www.cdn77.com/tls-test.
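Alongside testing externally, you can enforce a protocol floor in application code. A minimal sketch using Python's `ssl` module (the commented-out connection code is illustrative, with `host` as a placeholder):

```python
import ssl

# A client context that refuses anything older than TLS 1.2.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

print(ctx.minimum_version)  # TLSVersion.TLSv1_2

# To probe a live server you would wrap a socket with this context:
# import socket
# with socket.create_connection((host, 443)) as sock:
#     with ctx.wrap_socket(sock, server_hostname=host) as tls:
#         print(tls.version())  # the negotiated protocol, e.g. "TLSv1.3"
```

A handshake against a server that only offers TLS 1.0 or 1.1 then fails outright instead of silently downgrading.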
Encryption used in desktop and server applications (e.g., to protect files and other sensitive data during storage or transit) may also be obsolete and vulnerable. In some cases, deeply embedded obsolete algorithms will be difficult to replace, a matter to ponder when designing any system that utilizes encryption.
One such example is 3DES, based on the old DES algorithm designed in the 1970s. Although known to be vulnerable since 2016 (the Sweet32 attack), 3DES is still widely used, being embedded in many payment and other financial systems. This makes replacement problematic. Doubtless, it is only a matter of time before these systems are attacked using such an exploit.
Best practice and reversible encryption
Algorithms such as 3DES, RC4, and 1024-bit RSA have for some time been considered vulnerable, and using them invites having your, or your clients’, data compromised. Systems using these algorithms should be migrated to modern equivalents (AES for symmetric encryption; RSA with a minimum of 2048 bits, or ECC, for asymmetric encryption) as a priority.
A similar argument applies to the use of esoteric or in-house algorithms. Why risk your data when excellent established algorithms are readily available?
In designing new software or replacing obsolete code, there is no excuse not to use modern encryption algorithms. Provided, that is, that the encryption is correctly implemented — use of a modern algorithm by itself is no guarantee of success here, as the following example shows.
Encryption is not only used in desktop and server software, but also in applications such as smart cars, smart TVs, cameras and other IoT devices. Unfortunately, there is often a disparity between the skills shown in designing impressive hardware devices and those required to produce secure firmware and communications systems for them.
One such example is Texas Instruments’ DST80 encryption, which is used in many electronic car key systems. This is a secure system when used as Texas Instruments designed it. However, several car manufacturers decided that, rather than use the recommended full 80-bit key size, they would use 24-bit keys, which are easily brute-forced.
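Why the shortened key is fatal comes down to simple arithmetic; the attacker throughput below is an illustrative assumption, not a measured figure:

```python
# Key-space arithmetic for the two configurations.
full_keyspace = 2 ** 80   # DST80 as designed
weak_keyspace = 2 ** 24   # the reduced key size some manufacturers shipped

assert weak_keyspace == 16_777_216  # under 17 million candidate keys

# Assume a modest attacker testing one million keys per second:
seconds_weak = weak_keyspace / 1_000_000
years_full = full_keyspace / 1_000_000 / (3600 * 24 * 365)

print(round(seconds_weak, 1))  # 16.8  -> the 24-bit space falls in seconds
print(f"{years_full:.1e}")     # ~3.8e10 -> the full space would take billions of years
```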
If you want to keep winning the war against data theft, ensure that your encryption is up to the task, can be readily updated and (most importantly) is implemented securely. All of this requires effort that may not make your product more marketable or add new features, but it will give you peace of mind when it comes to the security of the data it handles.
Thanks to Dr. Stephen Hitchen, Security Architect at Avoco Secure, who helped compile the examples and information for this article.
- The SHA1 hash function is now completely unsafe, Computerworld
- node.bcrypt.js, npm
- node-argon2, npm
- Browsers to block access to HTTPS sites using TLS 1.0 and 1.1 starting this month, ZDNet
- Lennert Wouters, Jan Van den Herrewegen, Flavio D. Garcia, David Oswald, Benedikt Gierlichs and Bart Preneel, “Dismantling DST80-based Immobiliser Systems,” IACR Transactions on Cryptographic Hardware and Embedded Systems