The No Encryption Dystopia

Yet again, the US government is proposing that “warrant-proof” encryption be made illegal. Attorney General Barr essentially argues that encryption as it stands today allows criminals to completely hide their communications from the government. While that point is valid, the proposed alternatives are even worse than the status quo. Here are just a few of the many problems that giving the government the ability to break encryption would create.

Encryption Isn’t Software

Before getting into what could happen if the government could break encryption at will, it’s important to understand why such a scheme could never work as intended. Encryption isn’t like a piece of software or a website that simply asks for your password and grants or denies access. If it were, it would be fairly easy for companies to verify that the government has a valid warrant and then hand over the requested information. However, encryption just doesn’t work that way.

Encryption essentially scrambles data in a particular way using a key. With the correct key, it’s easy to restore the data to exactly what it was before it was scrambled; without the key, that task becomes virtually impossible. An encryption scheme can therefore be designed so that only the key can reverse it, or so that something else, such as a master key, can reverse it as well. Encryption is binary in this regard: either the data can be decrypted without the correct key, or it can’t. There is no middle ground where only someone with the correct key, or the government armed with a valid warrant, can decrypt the data; anyone who obtains the master key can.
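To make the “with the key or not at all” point concrete, here is a minimal sketch using Python’s cryptography library (my choice of library; the specific algorithm and message are illustrative assumptions). It shows AES-GCM decryption succeeding with the right key and failing outright with any other key.

```python
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.exceptions import InvalidTag
import os

# Generate a random 256-bit key and encrypt a message with AES-GCM.
key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)  # AES-GCM uses a 96-bit nonce
ciphertext = AESGCM(key).encrypt(nonce, b"meet at noon", None)

# With the correct key, decryption recovers the original data.
print(AESGCM(key).decrypt(nonce, ciphertext, None))  # b'meet at noon'

# With any other key, decryption simply fails -- there is no
# "partially correct" key and no built-in way for a third party to get in.
wrong_key = AESGCM.generate_key(bit_length=256)
try:
    AESGCM(wrong_key).decrypt(nonce, ciphertext, None)
except InvalidTag:
    print("wrong key: decryption fails")
```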

One more thing: I’m assuming the backdoor would be one held by the US government, not one implemented separately by each company. Although I believe the latter may be somewhat more secure, since an attacker would need multiple master keys to monitor absolutely everything rather than just one, the same general issues still apply.

The Master Key Would Be Stolen Or Cracked

Imagine a situation in which the government mandated that all encryption use a certain algorithm it has a backdoor to, such as a master key. That algorithm would behave just like any other encryption algorithm: data is encrypted using a key, and anyone with that key can easily decrypt the data. However, the government would also know a way to decrypt the data without knowing the key that was used to encrypt it. That secret would be guarded with the best security known to man, because having it stolen would compromise all data ever encrypted with that algorithm.
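For illustration only, here is a minimal sketch of one way such a backdoor could be structured: a key-escrow scheme in which every message key is also wrapped under a single master key. The names and the use of a symmetric master key are my assumptions; real proposals vary, but the core property is the same: whoever holds the master key can unwrap every message key.

```python
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
import os

# Hypothetical single master key held by the government.
MASTER_KEY = AESGCM.generate_key(bit_length=256)

def escrow_encrypt(plaintext: bytes, user_key: bytes) -> dict:
    """Encrypt normally, but also attach the message key wrapped under the master key."""
    nonce = os.urandom(12)
    ciphertext = AESGCM(user_key).encrypt(nonce, plaintext, None)

    wrap_nonce = os.urandom(12)
    wrapped_key = AESGCM(MASTER_KEY).encrypt(wrap_nonce, user_key, None)

    return {"nonce": nonce, "ciphertext": ciphertext,
            "wrap_nonce": wrap_nonce, "wrapped_key": wrapped_key}

def backdoor_decrypt(msg: dict) -> bytes:
    """Anyone holding MASTER_KEY -- lawfully or not -- can recover any message."""
    user_key = AESGCM(MASTER_KEY).decrypt(msg["wrap_nonce"], msg["wrapped_key"], None)
    return AESGCM(user_key).decrypt(msg["nonce"], msg["ciphertext"], None)

# The sender never shares user_key, yet the master key still opens the message.
user_key = AESGCM.generate_key(bit_length=256)
msg = escrow_encrypt(b"meet at noon", user_key)
print(backdoor_decrypt(msg))  # b'meet at noon'
```

The point of the sketch is that the backdoor is baked into the data itself: once the master key leaks or is cracked, every message ever escrowed this way is exposed at once.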

Everything works just as planned for years and years. The average citizen can message their friends without hackers being able to read the messages, while the government can keep tabs on known or suspected criminals. Each time the government wants to access data it doesn’t have the key to, it gets a search warrant and goes through some process to obtain special permission to use the backdoor. But regardless of how well the information needed to crack the encryption is stored, it will eventually be leaked. No computer is completely invulnerable to being hacked, and government computers are no exception. In fact, the exploit WannaCry used was found by the NSA (which didn’t alert Microsoft for more than half a decade) but was then leaked.

However, let’s imagine the process is bulletproof: the secret isn’t stored digitally, so it can’t be hacked, and it’s guarded with enough checks and balances to prevent abuse. Although that situation is impossible in the real world, assume it’s somehow magically implemented for this scenario. Eventually, some foreign government (or another entity with the resources) wants to keep tabs on citizens or officials of another country. Since much of the world’s communication software is created in the US, it must comply with US standards. So that foreign government spends billions of dollars and puts its best cryptographers to work cracking the algorithm. Eventually, they figure out the master key. That adversary can now decrypt literally any data it wants, as long as it was encrypted with that algorithm (which is legally required to be used by all software that uses encryption).

What About Modern Encryption?

Couldn’t a vulnerability just as easily be found in a modern encryption algorithm like AES? It is possible for flaws to be found, and some already have been, but it’s very unlikely that a flaw will be found that renders AES completely useless. The chances of a fatal flaw being found in an algorithm intentionally designed with one are much higher than in an algorithm designed to keep data secure against anyone without the key.

Slipping Standards

Another way the backdoor could go disastrously wrong is if the government changes the criteria under which it uses it. Remember that once the government decides to use the backdoor, it will work on any piece of encrypted data, regardless of whether that use is legal. At first, the government may want to show that it isn’t abusing the backdoor, and would therefore likely keep its use minimal. But over time, it’s easy to see how it could slowly use the backdoor more and more, even when doing so isn’t necessary or even legal. For example, the criteria might slip from requiring a warrant to merely suspecting someone, without evidence, of communicating with a terrorist organization.

Criminals Could Create Their Own Encryption

Lastly, nothing stops people from using their own encryption and/or infrastructure. Someone who truly wanted to hide their data from the government wouldn’t put their trust in some mainstream, government-approved app. They would either build their own solution using strong encryption without a backdoor, or simply keep using older versions of software created before the new algorithm was mandated. If there is demand for an app that hides data from the government, regardless of the legality of such an app, one will be created (see the sketch below for how little effort that takes). All a mandated backdoor would accomplish is harming citizens who follow the law, while leaving the people the law targets completely unaffected.
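To illustrate how low that bar is, here is a minimal sketch assuming the freely available open-source PyNaCl library (again, my choice; any number of libraries would do). A few lines give anyone authenticated, backdoor-free encryption, no matter what mainstream apps are required to ship.

```python
# pip install pynacl  -- open-source, freely available, no backdoor
from nacl.secret import SecretBox
from nacl.utils import random

# A random 256-bit key shared out of band between the two parties.
key = random(SecretBox.KEY_SIZE)
box = SecretBox(key)

# Encrypt and decrypt; anyone without the key learns nothing useful.
ciphertext = box.encrypt(b"a mandated backdoor does not reach this message")
print(box.decrypt(ciphertext))
```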


As Bruce Schneier puts it:

I wish I could give the good guys the access they want without also giving the bad guys access, but I can’t. If the FBI gets its way and forces companies to weaken encryption, all of us — our data, our networks, our infrastructure, our society — will be at risk.