If you’re about to take an analytic crack at unfiltered terabytes or even petabytes of complex, unstructured data, that data can stay with a public cloud provider like Amazon Web Services, said Judith Hurwitz.
The consultant and author of numerous IT books, including several in the For Dummies instructional series, said analyzing big data in the cloud often works “because you’re still in the process of separating the wheat from the chaff,” so there’s no reason for the extra security of keeping it in-house.
“It has not become mission-critical at that point,” Hurwitz said.
But once you complete your analysis and have your unique, core data, bring it back to terra firma. A private cloud, especially one built in a company’s own data center, is the preferred home, she said.
“Companies will be more likely to then move their data into a private environment, into the data center, into a private cloud, because those are now the crown jewels.”
John Burke, an analyst at Nemertes Research, said if you have data in several places — some in on-premises systems and some in the public cloud — “you have to look carefully at whether or not it would make more sense to either leave the data where it sits or bring it all into one of the cloud environments from others and do the analysis there.”
There are lots of variables, Burke said, from the method and cost of connectivity, to the infrastructure you have on-site, to how varied your big data workloads are.
“Is it something where you’re running roughly the same volume of information through your system and you’re maintaining roughly the same volume of data in storage all the time — or does it fluctuate wildly?” Burke said.
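The trade-off Burke describes — consolidating data into one environment versus analyzing it where it sits — can be sketched as a back-of-the-envelope cost comparison. This is a minimal illustration only; the dollar rates and run counts below are hypothetical placeholders, not actual provider pricing:

```python
# Back-of-the-envelope comparison: pay once to consolidate all data into one
# environment, versus leave data in place and ship partial results per run.
# All rates are hypothetical placeholders, not real provider pricing.

EGRESS_PER_TB = 90.0       # hypothetical network transfer cost, $/TB
PER_RUN_TRANSFER_TB = 0.5  # hypothetical data shipped between sites per analysis run

def consolidate_cost(data_tb: float) -> float:
    """One-time cost to move the full data set into a single environment."""
    return data_tb * EGRESS_PER_TB

def in_place_cost(runs_per_month: float, months: int) -> float:
    """Cumulative cost of leaving data where it sits and moving results per run."""
    return runs_per_month * months * PER_RUN_TRANSFER_TB * EGRESS_PER_TB

if __name__ == "__main__":
    data_tb, runs, months = 10.0, 20, 12
    print(f"consolidate once:   ${consolidate_cost(data_tb):,.0f}")
    print(f"analyze in place:   ${in_place_cost(runs, months):,.0f}")
```

A steady, predictable workload favors the one-time consolidation; the wildly fluctuating workload Burke mentions makes the per-run figure hard to pin down, which is exactly why he suggests looking carefully before moving anything.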