Alec Chattaway is committed to getting the most out of cloud — and lowering the costs of cloud computing. The director of cloud infrastructure operations at Informatica is increasing the efficiency of cloud infrastructure use at the data management software company. He’s doing that by almost eliminating overprovisioning, or buying more computing resources in case they’re needed — a bad habit that could have a drastic impact on the bottom line.
“It’s not 100%. It’s such a large percent that it may as well be,” Chattaway said. He reckons the figure is about 95% — and for Informatica, getting any closer wouldn’t be worth it.
To unwind what sounds like a counterintuitive argument, start with overprovisioning itself: On its face, it makes sense. If there’s an unexpected surge in visitors to a website, for example, there will be no shortage of processing power, memory and storage to handle it.
But if there isn’t, those resources are wasted — and so is the cash that bought them. Cue the increase in costs of cloud computing. (Amazon Web Services, for one, isn’t complaining, Chattaway said. The No. 1 cloud infrastructure provider is “relying on people being inefficient so that they can make money.”)
Predicting out of a predicament
Chattaway’s plan is to run predictive analytics on all of Informatica’s cloud use to determine how much computing power he’ll need at any given time — and then supply just the right amount. He won’t entirely eliminate the need to overprovision, because he can’t predict everything — a massive surge in demand for the company’s integration tool, for example.
And he can’t underprovision, or supply too conservatively. If he does, and the company’s web tools slow down and customers flee in frustration, “that’s a really bad thing.”
To avoid erring on the side of too few resources, Chattaway needs a “fudge factor,” or a margin of safety, which he expects to be approximately 5%. With the help of predictive analytics, then, about 95% of supplied computing resources will meet actual demand.
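The arithmetic behind that margin can be sketched in a few lines. This is an illustrative model only — the forecast value and the `provision` helper are assumptions, not Informatica’s actual figures or tooling:

```python
# Minimal sketch: provision the predicted demand plus a safety margin,
# so predicted demand accounts for roughly 95% of supplied capacity.
# All numbers here are illustrative assumptions.

def provision(predicted_demand: float, fudge_factor: float = 0.05) -> float:
    """Return capacity to supply: predicted demand plus a fudge factor."""
    return predicted_demand * (1 + fudge_factor)

# A forecast of 1,000 compute units gets 1,050 provisioned.
capacity = provision(1000)
utilization = 1000 / capacity  # share of supplied capacity actually used
print(f"capacity={capacity:.0f}, utilization={utilization:.1%}")
```

The smaller the fudge factor, the less waste — which is why the next section asks how far down it can realistically go.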
‘A pipe dream’
As Informatica gets better at the analysis, Chattaway plans to edge down from 5% overprovisioning. But getting to 1% is “basically a pipe dream.” It’s too costly. Chattaway thinks of the calculus as a modified 80/20 rule. The original says 80% of the effects come from 20% of the causes.
“It’s really easy to do the 80%,” he said of slashing the costs of cloud computing. “The 20% is 10 times more expensive, and then that last 1% is 100 times more expensive.”
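Chattaway’s cost curve can be written out as a rough table. The unit costs below simply restate his multipliers — one unit of effort for the easy savings, 10x and 100x for the rest — and are illustrative, not measured figures:

```python
# Illustrative restatement of the modified 80/20 rule described above:
# effort multipliers for each tranche of cloud-cost savings (assumed units).

BASE_EFFORT = 1.0  # effort to capture the "easy" 80% of savings

tranches = [
    ("first 80% of savings", BASE_EFFORT),
    ("next tranche (the 20%)", BASE_EFFORT * 10),
    ("final 1%", BASE_EFFORT * 100),
]

for name, effort in tranches:
    print(f"{name}: {effort:.0f}x base effort")
```

On this curve, squeezing out the last percentage point costs two orders of magnitude more than the first 80% — which is why 1% overprovisioning stays a pipe dream for most.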
Some can do it. Multinationals with many billions of dollars in revenue — say, Johnson & Johnson or Coca-Cola — can afford to do the kind of predictive analytics that would be needed to get to 1%. And for those companies, the difference between 5% and 1% is colossal.
Informatica, a midsize company with $1.05 billion in revenue, isn’t in that league. “We’ve got some good resources, but not enough to make those final few percentages.”
For more on Alec Chattaway’s efforts at cutting the costs of cloud computing at Informatica, read this SearchCIO column.