Edge computing, in which data processing is pushed closer to the "edge" of the network, is becoming an increasingly important part of network topology. With the devices that produce the data also processing it, edge computing has helped ease the bandwidth limitations and network costs organizations were experiencing with cloud computing.
Currently, edge computing is deployed in three ways:
- Local devices. For instance, your iPhone can be an edge computing device. Face recognition once required processing in the cloud; now the algorithm runs on the device itself and only its result is sent to the cloud, making the phone part of the edge.
- Localized data centers. These consist of one to 10 racks and suit cases where the computational load would be too much for the device. They act as an intermediary between the cloud and the edge devices.
- Regional data centers. These are large centers with more than 10 racks that typically function as a local copy of the cloud, although they do have their own characteristics.
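The local-device pattern above can be sketched in a few lines. This is an illustrative sketch, not a real API -- `run_face_recognition` stands in for a hypothetical on-device model, and the point is only how much less data crosses the network when just the result is uploaded:

```python
import json

def run_face_recognition(image_bytes: bytes) -> dict:
    """Stand-in for an on-device inference step (hypothetical)."""
    return {"person_id": "user-42", "confidence": 0.97}

def cloud_payload(image_bytes: bytes) -> int:
    # Cloud-side recognition must ship the entire image upstream.
    return len(image_bytes)

def edge_payload(image_bytes: bytes) -> int:
    # Edge recognition ships only the small JSON result upstream.
    result = run_face_recognition(image_bytes)
    return len(json.dumps(result).encode())

image = bytes(3 * 1024 * 1024)          # a 3 MB photo
print(cloud_payload(image))             # 3145728 bytes over the network
print(edge_payload(image))              # 44 bytes over the network
```

The three orders of magnitude between the two payloads is the bandwidth saving that motivates pushing inference to the device.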
But with every benefit come challenges. Here we take a look at the top edge computing challenges IT leaders are facing in the enterprise today.
Security
The top concern is security. "The edge creates an increase in attack surfaces," said Theresa Miller, principal technologist at Cohesity, which develops data management software. Moreover, edge computing devices are more easily accessible to bad actors than data center servers. "It is more difficult to secure the edge, and a higher level of visibility is necessary at the edge to discern attacks from bad actors."
"More devices" also translates to more maintenance requirements. Application upgrades and patches must be orchestrated at hundreds or thousands of devices in a secure way while minimizing user-visible services. "The biggest edge computing [challenge] is security and access," said Partho Mishra, president and chief product officer at Cumulus Networks, which offers networking for data centers.
Patents and company secrets
A startup's "secret sauce," such as a proprietary AI algorithm, is well protected on a central server: users don't have access to the physical machines, and firewalls and anti-malware programs prevent unwanted access. Put that algorithm on an edge device, however, and malicious actors may well get their hands on it. Companies must be careful about what they allow on external devices.
Monitoring and maintenance
One of the benefits of cloud computing was being able to forget about the computing infrastructure. With edge computing, some of that burden returns. "Troubleshooting and remote monitoring for the edge is much harder than for the traditional data center," Mishra shared. "Physical access is difficult, network access does not always support redundancy and even the most modern tools do not always provide real-time visibility."
Visibility is crucial, particularly when there are performance issues. "The edge is often used to interconnect between enterprises and service providers for latency-sensitive applications," said Michael Segal, area vice president of strategic alliances at NetScout, a provider of network performance management products. "When there are problems with application or service performance, one challenge that enterprises face at the edge is identifying with whom the issue lies. The problem could be with the service provider, cloud provider or with the enterprise itself. Quickly identifying the owner is imperative, but difficult."
But the edge computing challenge does not stop there. "Once the owner of the issue is identified, a secondary challenge of establishing the underlying cause arises," Segal continued. "Enterprises have many dependencies within their service delivery chain, such as networks, load balancers, service enablers and databases, to name a few, making isolating and identifying the cause of the issue very challenging."
Hardware and network requirements
"Companies need to focus on identifying hardware needs in advance," said Patrick Hubbard, Head Geek at SolarWinds, an analytics software development company. "You shouldn't assume edge deployments are nearly as flexible in terms of appropriateness for any task as the server infrastructure you're currently using in your larger, centralized environment."
The extra hardware brings in several maintenance requirements, Bridget Karlin, global CTO and vice president of IBM Services, explained. "There can be challenges with managing significant quantities of remote devices including field servicing, capital costs, managing the host environment, including power, cooling, physical security and more."
Network requirements at the edge are substantially different. "Enterprises will have to go through an infrastructure modernization process," said Shamik Mishra, vice president at Altran, a global engineering consulting firm. "There will be large uplink data as sensors [and] devices generate more data that gets offloaded to edge computes. This will require expanding networking capabilities."
The structure of the network also needs to change, particularly when it comes to network resilience and business continuity. "Redundant systems, like those used in data centers, are impractical at smaller edge locations," warned Marcio Saito, CTO at remote IT management company Opengear. "As compute loads move from core data centers to edge locations, organizations must adjust their network management processes to deliver always-on uptime in a more dispersed networking ecosystem with more remote locations."
In a network that spans multiple devices across a wide geographic area, database challenges also start to surface. "In edge computing architectures, every edge device needs to be able to operate independently, while simultaneously sharing and synchronizing data with other devices and nodes," said Amber Caramella, chief revenue officer at Netrality, a data center provider.
"However, coordinating multiple edge devices while enabling them to work independently can be challenging. Conventional databases can scale up to meet stateful data coordination challenges within a data center," she said. "But due to their outdated design, they can't effectively scale out across large, dispersed geographic areas. The coordination of stateful data, therefore, becomes a constraining factor in how many devices can perform complex tasks in an edge computing system."
Cultural and workforce challenges
Along with the requirements for hardware upgrades and maintenance, the need for skilled IT personnel surfaces as an edge computing challenge as well. "Edge computing is a new frontier," warned Rich Carpenter, GM of product management at Emerson Machine Automation Solutions. "Make sure you have an empowered, skilled workforce equipped to operate new machinery and ready to navigate new challenges."
Equally important to trained IT staff is acceptance from leadership, Carpenter noted. "Many initiatives require a culture change, so buy-in from leadership is vital to run fast and lean and make sure that organizational objectives are handled swiftly," he said. Edge computing carries an extra burden compared with cloud computing, so leadership needs to be fully aware of why the organization is taking that path.
Limited processing power
Unlike the cloud, edge computing does not have access to infinite resources. "Moving to the edge means coping with limited CPU processing power. That means that algorithms need to be leaner and achieve their goals with less compute power," said Erez Steinberg, vice president of R&D at Eyesight Technologies, which aims to create safer driving experiences using AI.
"If we talk about deep learning specifically, there are very powerful GPUs and TPUs on the cloud that can run huge networks very fast. On edge devices, even with embedded GPUs and NPU, the capabilities are much smaller," he said.
Bandwidth savings vs. total cost
One of the primary reasons for edge computing is to save costs -- instead of sending every piece of data the sensors record to the cloud, the devices process that data locally and only connect to the cloud when necessary.
However, you should assess where this approach is actually feasible. In particular, do the cost of computing power at the edge and all related upgrade, maintenance and monitoring costs exceed the cost of network connectivity? In short, bandwidth savings should not be your only measure.
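That comparison can be framed as a simple break-even calculation. All figures below are made-up assumptions for illustration, not benchmarks:

```python
def monthly_cloud_only_cost(gb_sent: float, egress_per_gb: float) -> float:
    # Ship every byte the sensors record straight to the cloud.
    return gb_sent * egress_per_gb

def monthly_edge_cost(hardware_amortized: float, maintenance: float,
                      gb_after_filtering: float, egress_per_gb: float) -> float:
    # Process locally; pay for hardware and upkeep, but send far less data.
    return hardware_amortized + maintenance + gb_after_filtering * egress_per_gb

cloud = monthly_cloud_only_cost(gb_sent=5000, egress_per_gb=0.09)
edge = monthly_edge_cost(hardware_amortized=150, maintenance=200,
                         gb_after_filtering=250, egress_per_gb=0.09)
print(f"cloud-only: ${cloud:.2f}/mo, edge: ${edge:.2f}/mo")
# Edge wins here ($372.50 vs. $450.00) only because hardware plus
# maintenance stays below the bandwidth saved; at lower data volumes
# the cloud-only setup would be cheaper.
```

Plugging in your own data volumes, egress rates and hardware costs shows quickly whether an edge deployment pays for itself.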