NEW YORK (AP) — Your phone is getting better and better at protecting your privacy. But Uncle Sam isn’t totally comfortable with that, because it’s also complicating the work of tracking criminals and potential national-security threats.
For decades, tech companies have steadily expanded the use of encryption — a data-scrambling technology that shields information from prying eyes, whether it’s sent over the Internet or stored on phones and computers. For almost as long, police and intelligence agencies have sought to poke holes in the security technology, which can thwart investigators even when they have a legal warrant for, say, possibly incriminating text messages stored on a phone.
The authorities haven’t fared well; strong encryption now keeps strangers out of everything from your iMessages to app data stored on the latest Android phones. But in the wake of the Paris attacks, U.S. officials are again pushing for limits on encryption, even though there’s still no evidence the extremists used it to safeguard their communications.
While various experts are exploring ways of resolving the impasse, none are making much headway. For now, the status quo favors civil libertarians and the tech industry, although that could change quickly — for instance, should another attack lead to mass U.S. casualties. Such a scenario could stampede Congress into passing hasty and potentially counterproductive restrictions on encryption.
“There are completely reasonable concerns on both sides,” said Yeshiva University law professor Deborah Pearlstein. The aftermath of an attack, however, “is the least practical time to have a rational discussion about these issues.”
Encryption plays a little heralded, yet crucial role in the modern economy and daily life. It protects everything from corporate secrets to the credit-card numbers of online shoppers to the communications of democracy advocates fighting totalitarian regimes.
At the same time, recent decisions by Apple and Google to encrypt smartphone data by default have rankled law enforcement officials, who complain of growing difficulty in getting access to the data they feel they need to build criminal cases and prevent attacks. For months, the Obama administration — which has steered away from legislative restrictions on encryption — has been in talks with technology companies to brainstorm ways of giving investigators legal access to encrypted information.
But technology experts and their allies say there’s no way to grant law enforcement such access without making everyone more vulnerable to cybercriminals and identity thieves. “It would put American bank accounts and their health records, and their phones, at a huge risk to hackers and foreign criminals and spies, while at the same time doing little or nothing to stop terrorists,” Sen. Ron Wyden, D-Ore., said in an interview Monday.
Lawmakers on the U.S. Senate Select Committee on Intelligence remain on what they call an “exploratory” search for options that might expand access for law enforcement, although they’re not necessarily looking at new legislation.
The FBI and police have other options even if they can’t read encrypted files and messages. So-called metadata — basically, a record of everyone an individual contacts via phone, email or text message — isn’t encrypted, and service providers can make it available when served with subpoenas. Data stored on remote computers in the cloud — for instance, on Apple’s iCloud service or Google’s Drive — is also often available to investigators with search warrants. (Apple and Google encrypt that data, but also hold the keys.)
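The provider-held-key arrangement described above can be sketched with a toy cipher. This is a minimal illustration only, not how Apple or Google actually implement cloud encryption; the XOR one-time pad below simply stands in for whatever real cipher a provider uses:

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy one-time pad: XOR each byte of data with the key.
    Encrypting and decrypting are the same operation, since
    XOR is its own inverse."""
    return bytes(k ^ b for k, b in zip(key, data))

user_data = b"photos and messages synced to the cloud"

# The provider generates the key and keeps a copy of it
# alongside the stored ciphertext.
provider_key = secrets.token_bytes(len(user_data))
stored_ciphertext = xor_cipher(provider_key, user_data)

# A stranger who obtains the ciphertext sees only scrambled bytes...
assert stored_ciphertext != user_data

# ...but the provider, which held on to the key, can decrypt on
# demand -- for example, in response to a search warrant.
assert xor_cipher(provider_key, stored_ciphertext) == user_data
```

The point of contention in the article is exactly this key-retention step: when the provider discards the key (or never has it, as with end-to-end encrypted iMessages), the warrant can compel the ciphertext but not the plaintext.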
Some security experts suggest that should be enough. Michael Moore, chief technology officer and co-founder of the Baltimore, Maryland-based data security firm Terbium Labs, noted that police have managed to take down online criminals even without bypassing encryption. He pointed to the 2013 takedown of Silk Road, a massive online drug bazaar that operated on the “dark Web,” essentially the underworld of the Internet.
“The way they figured that out was through good old-fashioned police work, not by breaking cryptography,” Moore said. “I don’t think there’s a shortcut to good police work in that regard.”
Others argue that the very notion of “compromise” makes no sense where encryption is concerned. “Encryption fundamentally is about math,” said Mike McNerney, a fellow on the Truman National Security Project and a former cyber policy adviser to the Secretary of Defense. “How do you compromise on math?” He called the idea of backdoors “silly.”
Some in law enforcement have compromise ideas of their own. The Manhattan District Attorney’s office, for instance, recently called for a federal law that would require smartphone companies to sell phones they could unlock for government searches — in essence, forcing them to hold the keys to user data.
In a report on the subject, the office called its suggestion a “limited proposal” that would only apply to data stored on smartphones and restrict searches to devices that authorities had already seized. Privacy advocates and tech companies aren’t sold, saying it would weaken security for phones that are already too vulnerable to attack.
Marcus Thomas, the chief technology officer at Subsentio and former assistant director of the FBI’s operational technology division, argued that it’s too late to turn back the clock on strong encryption, putting law enforcement in a “race against time” to obtain investigatory data whenever and wherever it can. But he urged security experts to find ways to help out investigators as they design next-generation encryption systems.
The idea of allowing law enforcement secure access to encrypted information doesn’t faze Nate Cardozo, a staff attorney for the San Francisco-based Electronic Frontier Foundation — provided a warrant is involved. Unfortunately, he says, cryptographers agree that the prospect is a “pure fantasy.”
___
Follow Bree Fowler at https://twitter.com/APBreeFowler and Tami Abdollah at https://twitter.com/latams .
Copyright 2015 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.
The people I’ve known in government follow the law and the Constitution; a person becomes a target only after investigators have convinced a judge there is reason to believe they are a criminal or terrorist. These people have a moral compass.
The people in the “encryption everywhere” community that I know have no moral qualms about helping anyone who wants to evade law enforcement or intelligence. They don’t ask whether the person who wants to encrypt their email is a child pornographer, and they don’t ask whether the person who wants to use Tor intends to set up an online illicit drug trading business; they think anything that makes governing more difficult is good. They have no moral compass.
One answer may be that if you encrypt your data and you behave in a way that gets the attention of law enforcement, we assume you are hiding your guilt until you provide law enforcement with the key. If all you are hiding is love letters or your porn viewing habits, provide the key and prove it. Otherwise we will take the allegations of law enforcement as true, and convict you on that basis. We do not permit people to refuse a subpoena for access to their home or business; we cannot allow them to refuse a subpoena for access to their data.
So you’re saying we should make policy based on your assessment of whichever people you happen to know. Can’t say I’m sold, I’m afraid.
In the news today is a pair of stories about two groups in government that I’m not interested in entrusting with my secure information. One is the Chicago police, who suppressed for a year a video of an officer killing a helpless man with multiple shots, after which they mischaracterized the encounter to protect themselves. Also we have the story of the bombing in Afghanistan of a Doctors Without Borders hospital, where the official story is pretty damning, but we can’t trust it because the Pentagon refused to allow an independent investigation.
Police and military have long records of untrustworthiness in many instances. Yet those are precisely the people to whom we’d be handing the keys to our private data. No thanks.
You could mandate that large corporations use standardized encryption packages, which would have built-in backdoors for government, and they would have to do it. But this wouldn’t stop everyday folk, or the odd terrorist, from using open-source encryption without the backdoors. So even leaving aside the laughable notion that major corporations would acquiesce to having weaker encryption than Joe Blow, this would still not do much to help security officials.
I couldn’t disagree more. While I agree that your “answer” is possible to implement from a technological standpoint, your premise that you should be guilty until proven innocent strikes me as completely un-American. Not only does it violate the “innocent until proven guilty” maxim, and probably the Fourth Amendment, it also strikes me as a really, really bad idea.
It’s times like these that I think it’s important to put things in perspective. How many preventable violent crimes have happened as a result of cryptography? Compared to, say, drunk driving or gun violence? Why would we ever trade our liberty for such a marginal gain?
Backdoors are HUGELY problematic. Even setting aside the fact that criminals could simply use encryption that doesn’t have backdoors, who is going to keep these keys safe?
If there were a key that could decrypt anyone’s data, how could you possibly keep that safe and only in the hands of “good guys”? Can you imagine how incredibly valuable that key would be? And how incredibly simple it would be to steal it without a trace? Basically you are creating a tiny piece of information, essentially a string of text, that would be worth billions of dollars.
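To put that "tiny piece of information" in concrete terms, here is a sketch using an assumed key size: a 256-bit symmetric key (a common size in practice) is nothing more than 32 random bytes, 64 characters when written out as hexadecimal text. Any hypothetical master key would be similarly small, and just as easy to copy without a trace:

```python
import secrets

# A hypothetical master key: 256 bits of randomness, i.e. 32 bytes.
master_key = secrets.token_bytes(32)

# Written as text, it fits in a single short line -- easy to
# memorize a chunk of, photograph, or exfiltrate unnoticed.
as_text = master_key.hex()

print(len(master_key))  # 32 bytes
print(len(as_text))     # 64 hexadecimal characters
```

A secret that small has no physical footprint to guard, which is exactly why security experts argue that an escrowed universal key could not be kept safe indefinitely.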
That gives a whole lot of power to a select few. Seems like a very bad idea.