
Snowden: Data encryption is good, but not good enough

Edward Snowden and Bruce Schneier talk data encryption and privacy in a networked world. The Data Mill reports.

One of the big revelations to come out of the National Security Agency (NSA) documents leaked in 2013 by Edward Snowden didn't have to do with what the NSA was doing with our data. Instead, it had to do with what the NSA couldn't do: namely, the agency couldn't break properly implemented encryption.

"Properly implemented encryption does work," Snowden said last week at Harvard University's fourth annual Symposium on the Future of Computation in Science and Engineering. Snowden, the former systems administrator-turned-whistleblower who has taken refuge in Moscow, was beamed into the Cambridge auditorium via Google Hangouts, for a conversation with security technology expert Bruce Schneier, a fellow at Harvard's Berkman Center for Internet and Society.

Schneier said it was surprising to learn that the government doesn't seem to have a secret sauce or advanced technology such as quantum computers to break encryption. "Ten to 20 years ago, we would assume that we, in the academic world, were a decade behind the NSA and other countries," Schneier said. "That seems like that might not be true."

When Snowden and Schneier refer to "properly implemented encryption," they mean open source tools such as Tor (an anonymity network), OTR (an encrypted instant messaging protocol) and PGP (data encryption software). They are not referring to what Snowden called "homebrewed," "boutique" or closed-source cryptography, or to hardware implementations of cryptography, which he said have been successfully broken.
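To make the distinction concrete, here is a minimal, hypothetical Python sketch, not drawn from the talk, of what "properly implemented" tends to look like in practice: rather than inventing an algorithm, the code leans on a widely reviewed open source library (the cryptography package is used here as the example), whose Fernet construction pairs AES encryption with an integrity check.

# Hypothetical illustration: rely on a vetted open source library rather than
# "homebrewed" crypto. Fernet handles key generation, AES encryption and
# tamper detection in one audited construction.
from cryptography.fernet import Fernet

key = Fernet.generate_key()               # 32 random bytes, base64-encoded
cipher = Fernet(key)

token = cipher.encrypt(b"meet at noon")   # ciphertext plus authentication tag
assert cipher.decrypt(token) == b"meet at noon"   # raises InvalidToken if altered

The point is not this particular library; it is that every design decision an amateur could get wrong has already been made, and reviewed, by someone else.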

Schneier said it's a credit to math that agencies and governments aren't producing "fantastic results" at breaking encryption, especially considering the amount of money they're spending to do so. Based on figures from the U.S. intelligence black budget, 4% of the Department of Defense Consolidated Cryptologic Program's $11 billion budget, or $440 million, is set aside for research and technology, he said.

Photo: Edward Snowden, the former systems administrator-turned-whistleblower, spoke with security technology expert Bruce Schneier at the Privacy in a Networked World symposium at Harvard University on Friday, January 23, 2015.

Yet despite sound encryption, data is still at risk. "When they do attack, it's typically through some kind of weakness, some sort of shortcut that reduces the resistance," Snowden said. He pointed to the ongoing Silk Road trial as a timely case in point. Ross Ulbricht, the alleged mastermind behind the online drug market, used PGP to encrypt personal documents. "He had fully, irresistibly encrypted material. Yet just yesterday in court, [members of the prosecution] were reading out encrypted diary entries to a room full of reporters," Snowden said. "Encryption is not foolproof."

Prosecutors didn't break the encryption; instead, they found a way around PGP, Snowden said, by pulling a key off of Ulbricht's laptop. "The way everyone gets around cryptography is by getting around cryptography," Schneier said.

The weakness in encryption, in other words, isn't the algorithms and it isn't data in transit; it's everything else, Schneier said. "What we really have to worry about is the rest of everything -- so the bad implementations, the weak keys, any kind of back doors being inserted in the software," he said.
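A hypothetical Python fragment, not taken from the talk, shows how small that "everything else" can be: both keys below feed the same sound algorithm, but one of them is exactly the kind of shortcut attackers look for. The passphrase string is a placeholder.

import base64, hashlib, secrets

# Weak in practice: the key is derived from a guessable value, so the cipher's
# strength never comes into play; an attacker guesses the input instead.
weak_key = base64.urlsafe_b64encode(hashlib.sha256(b"hunter2").digest())

# Sound in practice: 256 bits straight from the operating system's CSPRNG.
strong_key = base64.urlsafe_b64encode(secrets.token_bytes(32))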

That includes weaknesses commonly found at the endpoints. Surveillance vendors such as Hacking Team sell products, like its Remote Control System, that let third-world governments perform NSA-like activities at a smaller scale. Those activities include hacking into computers and reading encrypted traffic after it's been decrypted, or covertly recording passwords through keystroke logging, Schneier said.

And it includes how encryption keys are stored, Snowden said. "One of the real dangers of the current security model at scale for defenders is the aggregation of key material," he said. "If you have a centralized database of keys, that is a massive target." If attackers can't access that material remotely, they could very well send someone to get hired into your organization to develop that access, Snowden said.

"We've got to focus on end points, we've got to focus on the keys [and make them] more defensible," he said.

Snowden tips for the nation's technocrats

Just because you can doesn't mean you should. One change Snowden observed following his exposure of the NSA's data collection tactics is a more pronounced "just because we can, doesn't mean we should" attitude at the highest levels of government. By simply acknowledging this idea, the government has tempered its data collection practices, he believes. Businesses should follow suit.

One check and balance is to ask the question: Is the intelligence worth the potential cost? Snowden asked this question of himself when he shared an image of an NSA Tailored Access Operations (TAO) unit stuffing "Trojan horse systems" into a Cisco router for surveillance purposes. The same caution could make corporations think twice about exposing private customer data. (Case in point: Ride-sharing service Uber, which used GPS data to map "rides of glory" or one-night stands.)

Communication is vital. The NSA might be the bad guy, but, as Schneier pointed out, corporations got there first. "It's not that the NSA woke up one morning and said, 'We want to spy on the Internet.' They woke up one morning and said, 'Corporations are spying on the entire Internet; let's get ourselves a copy,'" he said. The government did so without public consent, a decision Snowden characterized as a divorce between government and its constituency. "We should at least have a reasonable understanding of the broad outlines of policies and powers they're investing themselves with," Snowden said. "That's happening behind closed doors, and they can't really claim to be representing our interests because they are divorced from our interests. When there's no communication, they're no longer part of the community."

Welcome to The Data Mill, a weekly column devoted to all things data. Heard something newsy (or gossipy)? Email me or find me on Twitter at @TT_Nicole.


Previously on The Data Mill

Secrets to big data success

Systems diagrams help leaders manage change

The top five Data Mills from 2014


Join the conversation


How is data security different from data privacy?

Data security and data privacy are two related terms that people often mistake for synonyms. Data security refers to the availability, integrity, and confidentiality of data. In other words, it involves ensuring that only authorized users can access specific data. Data privacy, on the other hand, refers to the proper use of data: when companies and merchants practice data privacy, they use the data entrusted to them for the agreed purposes only.

Computing power doesn't exist today to break good encryption: nice, for now. Other measures are needed, as is a proper understanding of the tools and technology used.

Certainly brings a lot of moral questions into view, which is a blessing.

Encryption only helps against an attacker who doesn't have sufficient bandwidth and compute resources to brute force it, but in many cases, that doesn't matter. All it takes is compromising just one person's credentials and a whole cavalcade of malicious opportunity opens.
