The hard truths about the encryption debate


In my latest piece in TechCrunch, I gave a full breakdown of the ongoing debate between government agencies and tech firms over whether consumer devices and software should ship with strong encryption, and whether manufacturers should bake backdoors into their products to give security agencies access to encrypted communications.

I had started work on the piece weeks earlier, and by coincidence, the brutal Paris attacks took place just as I was about to submit the final draft. The tragedy has added a new twist to this ongoing conflict, and government officials now blame secure apps and hardware – and their vendors – for giving terrorists the tools to keep their schemes hidden.

I don’t usually voice my own opinion on such issues in my pieces, and the TechCrunch article was a pure analysis of the issue and a reiteration of the arguments each of the parties involved puts forth. In this blog post, I will give you the hard facts, which I believe prove that blocking encrypted communications and installing backdoors on devices isn’t the way to stop such attacks from happening again.

Backdoors won’t remain in the exclusive possession of government

Forcing companies to install backdoors into their software and devices and handing the keys over to the government would only work if the knowledge and the tools remained in the sole possession of government agencies. But as Apple’s Tim Cook puts it, “You can’t have a back door that’s only for the good guys.”

The vulnerabilities will eventually fall into the hands of malicious actors, who will use them to hack into devices and steal critical data. In the worst-case scenario, the backdoors will find their way into the hands of oppressive regimes, which will use them to crack down on dissent and human rights activists.

And after a long and tortuous road, we’ll be back to square one, left with no option other than to plug the hole and shut the backdoor for good, because the harm it does is far greater than the advantages it gives us.

There are more communication methods than governments can control

Government agencies think that by having the keys to break into iMessage or WhatsApp (or Viber, Snapchat and a dozen other services), they’ll control every method that terrorists might use to communicate.

But the truth is that there are far too many services and programs that criminals can use to communicate, and there’s only so much government agencies can do to control them all. Moreover, criminals are finding new, innovative channels – messaging inside video games, relaying signals through shooting patterns in shooters or move patterns in online chess – and in these cases they don’t even need to encrypt their communications, because spy agencies don’t know the channels exist. And with scores of titles to choose from, try controlling that.

And the more information governments collect, the more data they’ll be drowning in, and the more time they’ll need to analyze it and pick out communication patterns in that endless sea of bits and bytes.

Oh, and did I forget to mention that the new, tech-savvy generation of terrorists is fully capable of brewing its own encryption software? They’re not legally bound to hand over the keys to government agencies – and even if they were, they wouldn’t abide anyway.

There is already so much unencrypted data that agencies can use

Government agencies already have access to vast amounts of unobscured metadata that can be used to discern and identify activity habits and patterns. This data includes phone numbers, the IP addresses communicating with each other, the dates and times of communication, and in some cases the location of the people communicating. State-level technology enables surveillance agencies to scoop up such metadata in bulk from undersea cables.

Metadata can be extremely powerful in discerning relationships and connection networks. The fact of the matter is that the U.S. is already using such data to select drone strike targets, which shows just how much reliability metadata is already credited with.
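To make that point concrete, here is a minimal sketch (with made-up identifiers and timestamps, purely illustrative) of how an analyst could map a contact network from nothing but call records – no message content required:

```python
from collections import defaultdict, deque

# Hypothetical call-metadata records: (caller, callee, timestamp).
# Only who contacted whom and when -- no content at all.
records = [
    ("A", "B", "2015-11-01T10:00"),
    ("B", "C", "2015-11-01T10:05"),
    ("C", "D", "2015-11-02T09:30"),
    ("E", "F", "2015-11-02T11:00"),
]

def build_graph(records):
    """Build an undirected contact graph from metadata records."""
    graph = defaultdict(set)
    for caller, callee, _ in records:
        graph[caller].add(callee)
        graph[callee].add(caller)
    return graph

def linked(graph, start, target):
    """Breadth-first search: are two identifiers connected at all?"""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        if node == target:
            return True
        for nxt in graph[node] - seen:
            seen.add(nxt)
            queue.append(nxt)
    return False

graph = build_graph(records)
print(linked(graph, "A", "D"))  # True: A reaches D via B and C
print(linked(graph, "A", "F"))  # False: E and F form a separate cluster
```

Even this toy version surfaces who sits in the same cluster and who bridges two groups – exactly the kind of relationship mapping the drone-targeting programs rely on, at a vastly larger scale.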

Some might argue that IP addresses are hidden when services like Tor are used. That doesn’t seem to be a problem at the government level, though, as it was recently exposed that the FBI had hired security researchers at Carnegie Mellon University (CMU) to break into Tor and locate suspects, including those involved in the huge Silk Road bust.

Final thoughts

All in all, I think the case for weakening encryption in software and hardware for the sake of national security is hollow, and the goal itself is self-defeating. We shouldn’t halt the advancement of technology for fear of it being used by criminals and terrorists. Those with evil intentions will end up finding other methods to carry out their deeds anyway.

We should let the tech industry progress at its own pace and encourage it to speed up. In tandem, we have to look for other ways to secure our world, and to raise general awareness and collaboration to prevent tragedies like the November 13 Paris attacks from happening again.

Have any suggestions? Share with me in the comments section.
