One of the big revelations to come out of the National Security Agency (NSA) documents leaked in 2013 by Edward Snowden didn't have to do with what the NSA was doing with our data. Instead, it had to do with what the NSA couldn't do: Namely, the agency couldn't break cryptography.
"Properly implemented encryption does work," Snowden said last week at Harvard University's fourth annual Symposium on the Future of Computation in Science and Engineering. Snowden, the former systems administrator-turned-whistleblower who has taken refuge in Moscow, was beamed into the Cambridge auditorium via Google Hangouts, for a conversation with security technology expert Bruce Schneier, a fellow at Harvard's Berkman Center for Internet and Society.
Schneier said it was surprising to learn that the government doesn't seem to have a secret sauce or advanced technology such as quantum computers to break encryption. "Ten to 20 years ago, we would assume that we, in the academic world, were a decade behind the NSA and other countries," Schneier said. "That seems like that might not be true."
When Snowden and Schneier refer to "properly implemented encryption," they mean open source encryption tools such as Tor (an anonymity network), OTR (an encrypted instant messaging protocol) and PGP (data encryption software). They were not referring to what Snowden called "homebrewed," "boutique" or closed-source cryptography, or to hardware implementations of cryptography, which he said have been successfully broken.
Schneier said it's a credit to math that agencies and governments aren't producing "fantastic results" at breaking encryption, especially considering the amount of money they're spending to do so. Based on figures from the U.S. intelligence black budget, 4% of the Department of Defense Consolidated Cryptologic Program's $11 billion budget, or $440 million, is set aside for research and technology, he said.
Yet despite sound encryption, data is still at risk. "When they do attack, it's typically through some kind of weakness, some sort of shortcut that reduces the resistance," Snowden said. He pointed to the ongoing Silk Road trial as a timely case in point. Ross Ulbricht, the alleged mastermind behind the online drug market, used PGP to encrypt personal documents. "He had fully, irresistibly encrypted material. Yet just yesterday in court, [members of the prosecution] were reading out encrypted diary entries to a room full of reporters," Snowden said. "Encryption is not foolproof."
Prosecutors didn't break the encryption; instead, they found a way around PGP, Snowden said, by pulling a key off of Ulbricht's laptop. "The way everyone gets around cryptography is by getting around cryptography," Schneier said.
The weakness in encryption, in other words, isn't the algorithms and it isn't data in transit; it's everything else, Schneier said. "What we really have to worry about is the rest of everything -- so the bad implementations, the weak keys, any kind of back doors being inserted in the software," he said.
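Schneier's point about weak keys can be made concrete with a small sketch. The toy stream cipher below (built on SHA-256 purely for illustration; it is not a real cipher, and the PIN and message are invented for the example) is mathematically sound in the sense that the attacker never "breaks" it. The attack succeeds anyway, because the key was derived from a 4-digit PIN, so the attacker simply enumerates the tiny keyspace:

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Expand the key into a pseudorandom byte stream (illustrative only).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR with the keystream; the same call also decrypts.
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

# A "weak key": derived from a 4-digit PIN instead of real entropy.
pin = "7294"
key = hashlib.sha256(pin.encode()).digest()
ciphertext = encrypt(key, b"meet at the usual place")

# The attacker never attacks the cipher itself; they walk the 10,000-PIN keyspace
# and stop when a candidate decryption looks like printable text.
found_pin = None
recovered_text = None
for guess in range(10000):
    candidate_key = hashlib.sha256(f"{guess:04d}".encode()).digest()
    recovered = encrypt(candidate_key, ciphertext)  # XOR is its own inverse
    if all(32 <= b < 127 for b in recovered):
        found_pin = f"{guess:04d}"
        recovered_text = recovered
        break

print(found_pin, recovered_text)
```

The same logic scales up: a 256-bit cipher keyed from a guessable password gives only the security of the password, which is why attackers target keys and endpoints rather than the algorithm.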
That includes weaknesses commonly found at the endpoints. Surveillance vendors such as Hacking Team sell products, including its Remote Control System, to third-world countries to perform NSA-like activities on a smaller scale. Those activities include hacking into computers and reading encrypted traffic after it has been decrypted, or covertly recording passwords through keystroke logging, Schneier said.
And it includes how encryption keys are stored, Snowden said. "One of the real dangers of the current security model at scale for defenders is the aggregation of key material," he said. "If you have a centralized database of keys, that is a massive target." If attackers can't access that material remotely, they could very well send someone to get hired into your organization to develop that access, Snowden said.
"We've got to focus on end points, we've got to focus on the keys [and make them] more defensible," he said.
Snowden tips for the nation's technocrats
Just because you can doesn't mean you should. One change Snowden observed following his exposure of the NSA data collection tactics is a more pronounced "just because we can, doesn't mean we should" attitude at the highest levels of government. By simply acknowledging this idea, the government has tempered its data collection practices, he believes. Businesses should follow suit.
One check and balance is to ask the question: Is the intelligence worth the potential cost? Snowden asked this question of himself when he shared an image of an NSA Tailored Access Operations (TAO) unit stuffing "Trojan horse systems" into a Cisco router for surveillance purposes. The same caution could make corporations think twice about exposing private customer data. (Case in point: Ride-sharing service Uber, which used GPS data to map "rides of glory" or one-night stands.)
Communication is vital. The NSA might be the bad guy, but, as Schneier pointed out, corporations got there first. "It's not that the NSA woke up one morning and said, 'We want to spy on the Internet.' They woke up one morning and said, 'Corporations are spying on the entire Internet; let's get ourselves a copy,'" he said. The government did so without public consent, a decision Snowden characterized as a divorce between government and its constituency. "We should at least have a reasonable understanding of the broad outlines of policies and powers they're investing themselves with," Snowden said. "That's happening behind closed doors, and they can't really say to be representing our interests because they are divorced from our interests. When there's no communication, they're no longer part of the community."