Encryption Doesn’t Allow for Compromise

Why Encryption Is Under Threat

Encryption is once again a hot topic, as U.S. officials are pushing to undermine user IT security by demanding backdoors in encryption systems for law enforcement access. These anti-encryption advocates claim there’s a “golden mean”: strong encryption with “exceptional access” for authorities. They argue that tech companies are creating a world where people can commit crimes without fear of detection.

Despite this new rhetoric, most experts still agree—“exceptional access,” no matter how it’s implemented, weakens security. The terminology may have changed, but the core question remains: should tech companies build systems that fundamentally harm their users? The answer is still “no.” Here’s why.

Four Reasons Why Backdoors Are a Bad Idea

  1. It violates the First Amendment. A government mandate would run afoul of the U.S. Constitution’s “compelled speech” doctrine, which bars the government from forcing individuals or organizations to convey messages against their will—including, courts have held, the code they write.
  2. It endangers users. In the 1990s, the White House introduced the Clipper chip, which had a built-in backdoor. AT&T security researcher Matt Blaze found a serious flaw: a brute-force attack on the chip’s short checksum could defeat the key-escrow mechanism entirely.
  3. It harms U.S. businesses and stifles innovation. The U.S. government can’t stop the development of encryption technology; it can only push it overseas.
  4. It doesn’t prevent crime. No matter what requirements the U.S. government imposes on domestic companies, sophisticated criminals can still access strong encryption from abroad.

The Myth of a “Safe Backdoor”

Despite consensus among IT experts, some policymakers continue to chase the impossible “golden mean.” Last month, after years of research, the U.S. National Academy of Sciences released a report on encryption and “exceptional access,” reframing the question from “should the government require exceptional access?” to “how can the government require it without compromising user security?”

Cryptanalyst Susan Landau warned that some might misinterpret the report as proof that “exceptional access” is close to technical reality:

“The Academy’s report discusses approaches to ‘building… secure systems’ with ‘exceptional access,’ but these are only initial ideas… The committee’s presentations were brief descriptions from three experts, not detailed system architectures. There’s a huge difference between a sketch and an actual implementation—Leonardo da Vinci’s flying machine drawings are not the same as the Wright brothers’ airplane at Kitty Hawk.”

Research isn’t limited to the Academy. The international EastWest Institute recently published a report proposing two “golden mean” policies to support more constructive dialogue. Last week, Wired magazine covered former Microsoft CTO Ray Ozzie’s attempt to find an exceptional access model for phones that could satisfy both law enforcement and privacy advocates. While Ozzie may have had good intentions, experts like Matt Green, Steve Bellovin, Matt Blaze, Rob Graham, and others quickly pointed out serious flaws. No system is perfect, and a backdoored system for billions of phones carries enormous risk.

The search for a solution continues, but the truth remains: all attempts at “constructive dialogue” ignore a fundamental obstacle—the starting point for this dialogue is diametrically opposed to the very purpose of encryption. Here’s why.

Encryption: A User’s Guide to Keys

Encryption is often explained using the analogy of keys—anyone with the “key” can decrypt or read the “locked” information. But this metaphor has its limits.

For most of history, encryption relied on a shared set of instructions—an approach now called “symmetric encryption”—that explained how to both disguise and reveal a message. Sometimes the rules were simple, like shifting each letter or number to the next in the alphabet or sequence (so “A” becomes “B,” “B” becomes “C,” etc.). More complex methods involved converting letters to numbers and then transforming those numbers with mathematical formulas, making the result unreadable to outsiders.
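The letter-shifting rule above is easy to sketch in a few lines of Python. This is a toy illustration only—a shift cipher offers no real security:

```python
# Toy letter-shift ("Caesar") cipher: each letter moves one step
# forward in the alphabet, wrapping "Z" back around to "A".
# Illustrative only -- trivially breakable, never use for real secrets.

def shift_encrypt(text, shift=1):
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)  # leave spaces and punctuation untouched
    return ''.join(result)

def shift_decrypt(text, shift=1):
    # Reversing the instructions reveals the message again.
    return shift_encrypt(text, -shift)
```

Running `shift_encrypt("ABC")` produces `"BCD"`, and `shift_decrypt` reverses it—the same set of instructions, applied forward to disguise and backward to reveal.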

As encryption evolved, cryptographers began using “keys,” which provided greater security. A “key” is secret information known to both the sender and recipient, required to encrypt and decrypt messages. Keys still play a crucial role in modern encryption, but now more than one type of key is typically used.
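The defining property of a symmetric key—the same secret both encrypts and decrypts—can be sketched with a toy XOR cipher. (XOR is my choice of stand-in here, not a method the article names; a repeating XOR key is easily broken and is for illustration only.)

```python
# Toy symmetric cipher: XOR each byte of the message with the
# corresponding byte of a shared secret key. The SAME key both
# encrypts and decrypts -- the defining property of symmetric
# encryption. Toy only: repeating-key XOR is easily broken.
from itertools import cycle

def xor_crypt(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

key = b"shared-secret"                    # known to sender AND recipient
ciphertext = xor_crypt(b"meet at noon", key)
plaintext = xor_crypt(ciphertext, key)    # applying the key again reverses it
```

Note that both parties must already possess `key` before they can communicate—the very problem public-key encryption, described below, was invented to solve.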

Some digital devices encrypt stored data, and the password you enter unlocks a random key used for that encryption. For communication between people, like emails or chats, “public key encryption” is used. The advantage is that people don’t need to know the “key” in advance.
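How a password "unlocks" a key can be sketched with a key-derivation function. The sketch below uses PBKDF2 from Python's standard library; the password and salt values are made up for illustration. In real device encryption the derived key typically unwraps a separate random data-encryption key rather than encrypting data directly:

```python
# Sketch: deriving an encryption key from a password, as in device
# (disk) encryption. The password itself is never the key; a slow
# key-derivation function (here PBKDF2, from Python's standard
# library) stretches it into fixed-length key material.
import hashlib
import os

password = b"correct horse battery staple"
salt = os.urandom(16)                # random, stored alongside the data

derived_key = hashlib.pbkdf2_hmac(
    "sha256", password, salt, 200_000, dklen=32
)

# The same password + salt always yields the same key...
again = hashlib.pbkdf2_hmac("sha256", password, salt, 200_000, dklen=32)
# ...while a wrong password yields a completely different one.
wrong = hashlib.pbkdf2_hmac("sha256", b"hunter2", salt, 200_000, dklen=32)
```

The high iteration count is deliberate: it makes each password guess expensive, slowing down brute-force attacks against the stored data.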

With public key encryption, a user—whether a person, company, website, or server—receives two (sometimes more) related keys: one for encrypting data (the “public key”) and one for decrypting it (the “private key”). The public key can be shared with anyone, like an open set of instructions. Anyone wanting to send encrypted messages can use these instructions to encrypt data. The private key is never published and is used to decrypt data encrypted with the corresponding public key.

In modern encryption, public and private keys aren’t used to encrypt messages directly. Instead, they encrypt and decrypt a separate “session key,” which is then used for the actual data encryption and decryption. This session key is a secret set of instructions both sender and recipient can use. Public key encryption ensures the session key is protected and can’t be intercepted or used by outsiders. The fewer opportunities there are to steal or accidentally publish a private key, the higher the security.
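The hybrid pattern described above—a public key protecting a session key, which in turn protects the data—can be sketched with textbook RSA. The numbers below (p = 61, q = 53) are tiny classroom values, and the XOR step stands in for a real symmetric cipher; both are utterly insecure and for illustration only:

```python
# Toy hybrid-encryption sketch: the recipient's public key wraps a
# small "session key", which then encrypts the actual message.
# Textbook RSA with tiny primes (p=61, q=53) -- illustration only.
from itertools import cycle

# Recipient's toy RSA key pair: n = p*q = 3233, e public, d private.
n, e, d = 3233, 17, 2753

def rsa_encrypt(m: int) -> int:        # anyone can do this (public key)
    return pow(m, e, n)

def rsa_decrypt(c: int) -> int:        # only the key's owner can (private key)
    return pow(c, d, n)

def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

session_key = 42                       # sender picks a fresh session key
wrapped = rsa_encrypt(session_key)     # ...and sends it wrapped in the public key
ciphertext = xor_bytes(b"attack at dawn", bytes([session_key]))

# The recipient unwraps the session key with the private key, then decrypts.
recovered = rsa_decrypt(wrapped)
message = xor_bytes(ciphertext, bytes([recovered]))
```

Only the private key can unwrap `wrapped`—which is exactly why every extra copy of a private key, however well-intentioned, widens the attack surface.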

But “exceptional access” demands the opposite—more keys, more points of access, and more vulnerability. It undermines encryption security by requiring either that law enforcement hold its own private key for every encrypted device and personal key, or that duplicate keys be created and securely stored for handover on demand.

That’s why law enforcement proposals for a “responsible solution” are actually irresponsible. Any system with a separate channel for third-party access is inherently less secure than one without it. In encryption systems, the very existence of duplicate or special-purpose keys makes them attractive targets for attackers. It’s like making extra physical keys to a bank vault—the risk of one being lost or stolen is high. Copying encryption keys (for U.S. law enforcement and, eventually, worldwide) increases the risks.

There is no good-faith compromise in the government’s request for “exceptional access.” The “golden mean” between what law enforcement wants (weak encryption) and what users want (strong encryption) is still just weak encryption.

U.S. Deputy Attorney General Rod Rosenstein admitted in a Politico interview that a device with exceptional access “would be less secure than a product that does not have that capability.” He continued: “The legitimate question is: how much risk are we willing to accept because of that?” The answer must be based on solid arguments about the risks of abandoning strong encryption.

Common sense must prevail over populist political rhetoric.
