No one disputes that the Internet of Things has the potential to generate
large amounts of data. Gartner has predicted there will be close to 26 billion devices on the IoT by 2020.
The question is how to handle IoT sensor traffic. One approach getting a lot of buzz is mesh networking, a computing architecture built from nodes that connect wirelessly to one another; data hops from node to node across the network until it reaches its destination. But while mesh networks show promise (and would be cheaper than routing this traffic over the Internet), some enterprise IT folks are not so certain these self-organized, self-governed networks are a viable IoT architecture for the enterprise.
Enter the newly bisected HP, which thinks it has found a way to use a highly managed cloud infrastructure to efficiently handle the sensor data.
HP calls this “distributed mesh computing,” explained Bobby Patrick, chief marketing officer of HP Cloud.
Distributed mesh computing, as described by Patrick, consists of small servers “out on the edge” (literally on street corners, collecting data close to the ground), medium-sized servers and storage in the middle that hold recent data, and big servers in a central location where the heavy analytics and long-term storage take place.
“All of these things will work together in one big distributed mesh computing system to serve this massive growth in data and connected devices expectation that we’ll have in the next five years,” Patrick said.
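The three-tier layout Patrick describes can be sketched in a few lines of Python. This is purely illustrative: the class names, methods, and the placeholder "analytics" are assumptions for the sake of the example, not anything HP has published.

```python
# Hypothetical sketch of the three tiers: edge servers collect raw sensor
# readings, mid-tier servers aggregate and hold recent data, and the core
# tier runs the heavy analytics. All names here are illustrative.

class EdgeNode:
    """Small server 'on the street corner' collecting raw sensor readings."""
    def __init__(self):
        self.buffer = []

    def collect(self, reading):
        self.buffer.append(reading)

    def flush(self):
        # Hand the accumulated batch upstream and start fresh.
        batch, self.buffer = self.buffer, []
        return batch

class MidTier:
    """Medium-sized server that stores batches forwarded from the edge."""
    def __init__(self):
        self.store = []

    def ingest(self, batch):
        self.store.extend(batch)

class Core:
    """Central servers where long-term storage and heavy analytics live."""
    @staticmethod
    def analyze(readings):
        # Placeholder analytic: average of all readings seen so far.
        return sum(readings) / len(readings) if readings else None

# Data flows edge -> middle -> core, hop by hop.
edge = EdgeNode()
mid = MidTier()
for value in (21.0, 22.5, 23.5):
    edge.collect(value)
mid.ingest(edge.flush())
print(Core.analyze(mid.store))  # average of the collected readings
```

The point of the tiering is that only summaries and long-lived data need to travel to the center; the bulk of the raw sensor chatter stays near where it is produced.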
HP is not the only company pursuing this idea of distributing computation and storing data closer to the ground for the IoT; Cisco is doing much the same under the name fog computing.
As for distributed mesh computing’s relation to mesh networks, Patrick said HP’s cloud-based solution for IoT traffic is similar in that data is stored in multiple locations, so if one server goes down, the data can still be accessed. (The HP Helion hybrid cloud solution is built on OpenStack, the open-source cloud computing software platform.) With its always-on mesh infrastructure, the architecture would also be more resilient in a catastrophe, he said: to completely cut off access to the data, every single server holding a copy would have to be destroyed.
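The resilience claim boils down to simple replication: every piece of data lives on several servers, and a read succeeds as long as at least one replica is still up. A minimal sketch, with illustrative names that do not reflect HP Helion's or OpenStack's actual APIs:

```python
# Toy replicated key-value store: each replica is a dict standing in for
# one server. Writes go to every replica; reads fall back to any replica
# that is still up. A real system would use quorum writes and repair.

class ReplicatedStore:
    def __init__(self, num_replicas=3):
        self.replicas = [dict() for _ in range(num_replicas)]
        self.down = set()  # indices of failed servers

    def put(self, key, value):
        for replica in self.replicas:
            replica[key] = value

    def fail(self, index):
        # Simulate a server being destroyed or going offline.
        self.down.add(index)

    def get(self, key):
        for i, replica in enumerate(self.replicas):
            if i not in self.down and key in replica:
                return replica[key]
        raise KeyError(f"{key}: all replicas holding this key are down")

store = ReplicatedStore(num_replicas=3)
store.put("sensor-42", 19.5)
store.fail(0)                  # one server is lost...
print(store.get("sensor-42"))  # ...but the data is still reachable
```

Only when every replica fails does the data become unreachable, which is the scenario Patrick argues is vanishingly unlikely in a geographically spread mesh.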
Said Patrick: “This era of distributed mesh computing is something that I guarantee you, over the next few years, everyone will be talking about. This is really the new architecture of the future.”