The rapid development of technology has brought with it countless benefits to citizens across the world but, as has become abundantly clear over the last couple of years, there is very much a dark side.
The tech industry – along with government and law enforcement – is currently faced with an issue which, at its worst, puts the lives of innocent citizens at risk and, at its best, keeps us chasing cyber shadows.
I’m talking, of course, about the privacy vs encryption debate which has become a hugely emotive one in the wake of terrorist attacks in America, Europe and most recently in the UK.
The debate took centre stage last year when Apple refused to give the FBI access to an iPhone belonging to a terrorist who was involved in the attack which killed 14 people in San Bernardino, California.
A federal judge ordered Apple to provide “reasonable technical assistance” to authorities, but Apple would not budge. CEO Tim Cook warned that the court order set a “dangerous precedent” and that introducing a backdoor into the company’s system would create the “software equivalent to cancer”.
He said: “We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data.”
Apple, rightly or wrongly, received a lot of praise from within the tech community for taking a stand, as the likes of Google chief executive Sundar Pichai and Box CEO Aaron Levie publicly supported the company.
Even the UN’s human rights commissioner got in on the act, saying that the US Department of Justice risked “unlocking a Pandora’s Box” in its efforts to force Apple to decrypt the iPhone in question and that the case could have “extremely damaging” implications for human rights.
David Emm, principal security researcher at Kaspersky Lab, agrees, telling Silicon that “the requirement for application vendors who use encryption to provide a way for government or law enforcement agencies to ‘see through’ encryption, poses some real dangers. Creating a ‘backdoor’ to decipher encrypted traffic is akin to leaving a key to your front door under the mat outside.
“Your intention is for it to be used only by those you have told about it. But if someone else discovers it, you’d be in trouble. Similarly, if a government backdoor were to fall into the wrong hands, cybercriminals, foreign governments or anyone else might also be able to inspect encrypted traffic – thereby undermining not only personal privacy, but corporate or national security. It would effectively create a zero-day (i.e. unpatched) vulnerability in the application.”
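Emm’s “key under the mat” analogy can be made concrete. The toy sketch below (illustrative only, using a deliberately insecure XOR cipher rather than any real messaging protocol) imagines a hypothetical escrowed scheme in which every session key is also wrapped under a single master “escrow” key. The function names and scheme are invented for illustration; the point is that whoever obtains the one escrow key, legitimately or not, can read every conversation.

```python
# Toy sketch of a hypothetical key-escrow scheme. The XOR "cipher" here
# is NOT real cryptography; it only illustrates the structure of the risk.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Repeat the key across the data (insecure toy stream cipher).
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

ESCROW_KEY = secrets.token_bytes(16)  # the single "key under the mat"

def send(plaintext: bytes):
    # Each message gets its own session key...
    session_key = secrets.token_bytes(16)
    ciphertext = xor_bytes(plaintext, session_key)
    # ...but a copy of that key is wrapped under the master escrow key.
    wrapped_key = xor_bytes(session_key, ESCROW_KEY)
    return ciphertext, wrapped_key

def escrow_read(ciphertext: bytes, wrapped_key: bytes, escrow_key: bytes) -> bytes:
    # Anyone holding escrow_key recovers the session key, then the message.
    session_key = xor_bytes(wrapped_key, escrow_key)
    return xor_bytes(ciphertext, session_key)

ct, wk = send(b"meet at noon")
assert escrow_read(ct, wk, ESCROW_KEY) == b"meet at noon"
# If ESCROW_KEY leaks, every past and future message is exposed at once --
# the "zero-day vulnerability" Emm describes.
```

The design flaw is structural, not mathematical: however strong the cipher, the escrow key is a single point of failure whose compromise silently breaks every user’s traffic.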
In the end, the FBI reportedly managed to break into the iPhone without Apple’s help, but the whole episode triggered a debate about encryption that spread across the world and, whether you agree or disagree with Apple, the fact that this discussion is now so front of mind is undoubtedly a good thing.
The latest issue focuses on mobile messaging service WhatsApp. Earlier this week, Home Secretary Amber Rudd said it is “completely unacceptable” that police are unable to access the encrypted WhatsApp messages of the man who killed four people in Westminster in a very public attack that was felt around the world.
“There should be no place for terrorists to hide,” Rudd told the BBC’s Andrew Marr Show. “We need to make sure that organisations like WhatsApp, and there are plenty of others like that, don’t provide a secret place for terrorists to communicate with each other.”
Her comments, while totally justifiable after the attack that tragically took the lives of four innocent people, open up a whole barrel of complications to which there is no easy solution.