In a nod to the Easter holiday ahead, I offer a variation on the age-old riddle: Which comes first, privacy or...
the golden egg of big data? The riddle is far from being solved, at least when it comes to personal information. And it isn't even being pondered in a serious way at many companies, if the enterprise risk managers I rubbed shoulders with last week are any indicator.
Sitting at lunch with risk managers employed across a swath of industries, from real estate to financial services to hospitality, I learned in no uncertain terms that the risks posed to the enterprise by potential abuses in the handling of big data paled in comparison to the risks burning up their heat maps.
Perhaps they are right to put personal data privacy on the back burner. Yes, privacy is in the news, but what are the enterprise risks of dealing in big data compared with liquidity risk, talent risk, competitive pressure, fraud, big weather? Besides, thanks to the communication technologies that make it easier than ever to disseminate information, privacy is on its deathbed -- or it will be by the time we boomers meet Joe Black.
About the same time I was hearing about heat maps and risk frameworks, the CIA's chief technology officer, Ira 'Gus' Hunt, dropped his bombshell about the value of dots at GigaOm's Structure: Data event in New York: To wit, "Since you can't connect dots you don't have, it drives us into a mode of, we fundamentally try to collect everything and hang on to it forever." "Forever," according to some reports, was in air quotes, but make no mistake, Hunt is not just in the hunt for big personal data. In his view, "it is really very nearly within our grasp to compute on all human-generated information." The aim is to analyze big data sets and take covert action based on the analysis. We even have some idea of how that will go down. Just two days before Hunt's remarks in New York, Federal Computer Week, citing unnamed sources, reported that the CIA has signed a contract with Amazon Web Services, valued at up to $600 million over 10 years, to build a private cloud behind its firewall.
'Personal' vs. 'private' in personal data privacy
After lunch, I attended the one and only session on data security and privacy risks at this two-day enterprise risk management conference, and left understanding a little better why my tablemates might be inclined to shrug off questions about the personal data privacy guidelines at their companies. The session featured the chief security officer at a regional insurance company in upstate New York, schooled by HIPAA regulations on the need to manage privacy risks and -- as a board member of a Boston firm using big data to track terrorists -- privy to the lucrative uses of big data. He admirably laid out the hazards of the privacy terrain, from rationalizing the muddle of multiple privacy laws (since 1998, more than 200 privacy laws have been passed in more than 150 countries) to sorting out the now-muddied distinction between what is personal and what is private.
"A picture, your face, name and address may be personal but it is not private, and it is not going to become any more private than it is today. In fact it is going to get worse," the security executive said. He warned the audience that using information for other than the stated purpose for which it was collected can get companies into hot water (for example, insurance companies that use personal data to ascertain how much people cost the plan, then find a way to kick them out).
The main point is this: Big data, which aggregates many types of public information, exposes personal information and, from a company perspective, exposes the enterprise to great risk. While a company can't be held responsible for personal data already out in public, using that data in its models could run afoul of laws and regulations, the security exec said. Making matters worse, people are becoming desensitized to what is private, which only makes it harder to instill a "culture of privacy into how your company handles data," he said. "People are looking at what lies ahead for privacy. There's really not much to say. We're kind of on our heels with respect to the topic."
From the back of the room, someone called out, "Let's talk solutions. We have all these experts in the room. I'm tired of hearing how difficult it is. Let's talk about solving the problem." Good luck with that.
As the CIA's Hunt was quoted as saying, anyone who carries a smartphone is "already a walking sensor platform," visible even when the device is turned off. "The inanimate is becoming sentient," he said. All of us, not just risk managers, are in jeopardy of being left in the digital dust by Jeopardy-busters such as Watson, which (who?) can compute on all that human-generated information much faster, of course, than our other human systems -- governmental, legal, ethical -- can figure out what's appropriate.
Let us know what you think about the story; email Linda Tucci, Executive Editor.