The terrorist attacks in Paris and Brussels have ignited a new debate about encryption. Let's take a closer look...
Encryption transforms readable content such as a note, picture or diagram into something that appears to be complete gibberish.
The algorithms involved are extremely sophisticated and come in different flavors. They rely on special mathematical operations that are relatively simple to perform but extremely difficult to reverse, unless you have a secret key.
Information before it is encrypted is known as plaintext (called that even if it isn't literally text; it could be a picture, a movie, anything). Information after it's encrypted is known as ciphertext. The key, often derived from a password or passphrase, is what the algorithm uses to transform (encrypt) plaintext into ciphertext, and to transform (decrypt) ciphertext back into plaintext.
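To make that vocabulary concrete, here is a tiny sketch in Python. Fair warning: the XOR cipher below is a deliberately insecure toy, not real cryptography (real systems use vetted algorithms such as AES). It only illustrates the roles of plaintext, key, and ciphertext, and how the same key drives both directions of the transformation.

```python
# Toy illustration of the plaintext -> ciphertext -> plaintext cycle.
# WARNING: XOR with a repeating key is NOT secure encryption; this
# sketch only shows the terminology, not a safe algorithm.
from itertools import cycle

def xor_transform(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the repeating key.
    Applying it twice with the same key restores the original."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

plaintext = b"Meet me at noon."
key = b"secret passphrase"

ciphertext = xor_transform(plaintext, key)   # looks like gibberish
recovered = xor_transform(ciphertext, key)   # same operation, same key

assert recovered == plaintext
```

Without the key, the ciphertext is unreadable; with it, decryption is trivial. Real algorithms achieve the same effect with mathematics that is vastly harder to reverse.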
The way the media talks about it, you'd think encryption was something special and rare. It's not. Encryption is in almost everything. You use it constantly. It's global.
Every time you unlock your car with your remote key, that transmission is encrypted. When your doctor accesses your electronic medical records, it's via encrypted connections. Each time you tap your employee ID card to enter your office building, that data is encrypted.
Whenever you log into WiFi, the connection is encrypted. Each time you swipe your card to make a purchase or use an ATM, the interaction is encrypted. Every time you access a website with that mysterious "https" in its address, your interaction with the site is encrypted.
And that's just the beginning. The list is thousands of items long, and includes encryption used in some surprising places, such as smart appliances like refrigerators, or taxi meters.
Fact #1: Encryption is ubiquitous and freely available.
Any developer can download high-quality open-source implementations from hundreds of sources. There's simply no way to close Pandora's box. Anyone can get it, and everyone uses it in endless ways.
There's been a lot of talk about forcing companies to provide what's called a 'back door' into their encrypted systems. Such a facility would be akin to a master key law enforcement could use to open anything that's encrypted. Agencies such as the FBI and GCHQ clamor for this. And they don't just want it for messaging apps; they want it for everything that could possibly be used by terrorists to communicate.
Of course, it wouldn't be just one key, it would be millions. Why? Because there are countless encrypted systems out there, each with its own specific implementation.
For years this idea has been examined by experts from every perspective, and the conclusion is the same every time.
Fact #2: Back doors are extremely dangerous.
Managing a system of back doors would require a massive new catalogue and control system with global reach, so that companies from Apple to Zenith and from General Motors to Hyundai to Sony could submit the mandatory back doors to the encryption that protects critical aspects of their products and services.
A huge new bureaucracy would have to be created, and it would have to guard this precious treasure box of secret keys with perfection. Absolute perfection.
Right, sure.... like that's going to happen.
Every encryption system on earth — from ATMs to video games, from refrigerators to chat programs — would have to be registered and monitored. Registration compliance would have to be global, in every country and territory, with no exceptions. And the entire thing would have to be protected like nothing else has ever been.
Rather than a central repository, it's been proposed that companies maintain back doors privately. For example, the Manhattan DA suggests that…
"…any designer of an operating system for a smartphone or tablet manufactured, leased, or sold in the U.S. to ensure that data on its devices is accessible pursuant to a search warrant".
Doesn't that sound like a good compromise between respecting privacy and providing law enforcement with reasonable investigative tools?
Well, not so much...
Look at the implications: In order to comply with such a law, the operating system would have to spy on all apps, recording all images, keystrokes and sounds, before data is even passed by the operating system to the apps it hosts.
And if an app attempts to encrypt something, say a file, to comply with the law the operating system would have to take a secret copy before allowing the app to encrypt it. Seriously, that's really what it means.
The DA's seemingly reasonable statement actually proposes a breathtaking level of intrusion.
Then there's the fact that the mere existence of private decryption capability would place US companies at a huge competitive disadvantage relative to companies in countries that do not require it.
Here's something you won't learn from the media: You don't have to be a mega-corporation to build an encrypted chat app.
In fact, two guys with a couple of servers could build a system good enough to fulfill the needs of criminals or terrorists in just a month.
The hard part is not making it work. The difficult bits have to do with features that terrorists don't have to worry about, such as an appealing interface or smooth usability.
Even a crappy app running on crude network infrastructure is good enough if you have a guaranteed user base. And with some clever dynamic server management, it would be virtually impossible to stop.
Fact #3: Terrorists can and will build their own.
Governments believe that terrorists have the skill to develop sophisticated cyber weapons. Billions have been allocated to combat the possibility of cyber attacks on critical infrastructure such as power stations or electrical transmission systems. Developing their own encrypted communication apps is absurdly trivial by comparison.
Actually, there's a pretty good chance they already have.
If we punch holes in systems ordinary people use, law enforcement will have keys to everything the terrorists don't use, and no keys to what they do. Pause for a moment to think about that.
For example, anyone with a video game console like Xbox or PlayStation, or who uses a computer to play an online game such as Battlefield 4, has access to built-in chat and voice functions that enable "team play".
At any given time there are millions of simultaneous conversations happening in systems like TeamSpeak.
Fact #4: The scale of the surveillance would be stunning.
Even if they had all the keys, to be even remotely effective governments would have to monitor hundreds of millions of simultaneous conversations across a staggering array of systems terrorists wouldn't use anyway, for precisely that reason.
That bears repeating: They won't use them. They'll just pick systems from countries that lack back door legislation, or build their own.
Terrorists may be insane, but they're not stupid. We treat them that way at our great peril.
Imagine that despite all the reasons creating back doors is a terrible idea, the USA and EU do it anyway. OK, great. But what about creating the same legislation for all the encrypted systems erupting out of China? What about Russia? How about India, or Brazil? What about Iraq or North Korea?
Fact #5: The global cooperation needed is politically impossible.
How do you get them all to create legislation that forces back doors into systems built within their borders, systems that are, of course, just as globally available as anything else on the internet?
In fact, how do you stop even just two talented teenagers in Damascus, Kabul or Tehran from cobbling something together, dumping it onto file sharing sites and quietly telling all the wrong people about it? No back doors there.
If the USA and the EU were to build a database of back doors into the millions of different ways there are to use encryption, that database would itself become a massively attractive target for hackers of all stripes, not the least of which would be the terrorists themselves.
Both industry and government have a terrible track record of protecting confidential information, as you can plainly see from this 'top 10' list.
The problem is, even one breach of the system that protects the back door codes – even one bad actor in a position of authority – could result in disaster. Suddenly the bad guys would have the keys to the kingdom: total access to banks, power grids, health care systems, corporate databases, air traffic control systems… everything.
Fact #6: It's insane to concentrate such value.
The only thing that mandating back doors will accomplish is to lull us into a false sense of security. It would delude us into believing our governments are doing something effective while actually placing us in far greater danger than ISIS could ever hope to inflict.
Wisely, even the White House has decided that it will not call for legislation requiring companies to decode messages for law enforcement.
We believe terrorism is such a serious problem that society must develop solutions that actually work. The situation is too grave for half-measures that have no chance of success. Encryption is not the enemy.
There is no way to ban encryption or implement a system of back doors without creating huge problems and opening horrifying new vulnerabilities.
Instead, let's do something effective about the underlying problem rather than plunging ourselves still deeper into a surveillance society that will ultimately destroy everyone's freedom. That is what the bad guys are trying to accomplish. Let's not help them.
Could an organization like ISIS use Merlin as a social recruiting tool?
No. Merlin's social networks are completely private; there's no way to reach anyone you don't already know (which is the entire point). So as a recruiting tool it's useless. If bad guys want to recruit new people, they can use an open social network, such as Facebook.
Could terrorists or criminals use your apps?
We suppose that some will, just like some also drive cars or use smart phones. But we offer them nothing they don't already have now, in great abundance. For example, here's a partial list of completely encrypted ways they can communicate right now:
• iPhone text messages (recently fully encrypted by Apple)
• Silent Circle
• TOR Chat
• Chat Secure
• PGP / GPG Mail
• … and many others
As you can see, there's no shortage of ways for bad people to communicate. What is in short supply are safe and simple ways for good people to protect themselves.
Isn't Merlin Voice a great way for criminals to communicate?
Merlin is a great way for anyone to communicate, and unfortunately that includes criminals.
It's also true that cars are a great way for criminals to travel, and software is a great way to organize complex criminal activity (for example, Al Qaeda's primary planning tool for the attacks of September 11, 2001 reportedly was Microsoft Excel).
Phone scramblers are a great way to talk in secret, and amateur (ham) radio is a great way for criminals to communicate anonymously over short or very long distances, especially when used with hardware encryption devices.
So are encrypted or coded telephone calls or voice mails, or just plain old snail mail (presumably they'd use something better than 'invisible ink').
And there are many others, including steganography, encoding in publicly published messages (the classified ads one sees in old movies), walkie-talkies, overnight package or letter delivery services, microdots printed on paper, semaphores of all kinds, or just messengers.
Most of these are hard to discover and difficult or impossible to trace, monitor or crack.
Can you stop bad people once it's discovered they are using Merlin?
Yes. If we are presented with credible evidence (by law enforcement, for example) that someone is using our products in violation of our Terms of Service – which prohibit any sort of criminal activity – we will immediately exercise our right to cancel the user accounts and Kahuna™ IDs involved.
We can't see inside Merlin to know what anyone is saying, but we have both the right and the ability to stop people who attempt to use our tools for bad things. This actually makes Merlin safer than a lot of the systems listed above, over which no one has any control.