Where To Pick Up the Speed Necessary for Big Data Analytics
Big data is about, well, data: historically business data, more recently human data, and soon machine data from the Internet of Things. The big data opportunity is about enabling companies to take wide varieties of raw data from many sources and turn it into actionable insight. That insight can provide competitive advantage, helping companies deliver better products and services to customers.
Central to handling big data effectively is a focus on infrastructure: putting in place an IT foundation that enables the new levels of speed and agility critical to all aspects of big data analytics. Respondents to Intel’s 2013 IT Managers Survey on How Organizations Are Using Big Data cited “data center infrastructure update” as their number-one priority.
IT professionals recognize that speed is key to delivering successful big data analytics. Analysts have talked for years about the three Vs of big data: volume, variety, and velocity. Speed is not just a matter of how quickly data is generated; it is also a matter of how quickly data can be accessed, analyzed, and acted upon, which directly affects real-time (or near-real-time) decision-making and business responsiveness.
Legacy data center infrastructures are, unfortunately, laden with bottlenecks that prevent the kind of speed necessary to successfully perform and deliver big data analytics. Among other places, these bottlenecks can be found most prominently in:
- Older-generation servers that lack the compute performance and memory capacity to process and analyze data in a timely manner. And time is money.
- Legacy rotating hard disk drives, whose IOPS and throughput pale in comparison to solid-state storage. When data retrieval on individual servers is slow, the entire job waits on the slowest node.
- Legacy network switches and infrastructures that don’t deliver the speed and performance to address the needs of applications in more highly virtualized environments. As datasets scale and interconnect, moving them effectively with sufficient bandwidth becomes a priority.
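The storage bottleneck above can be made concrete with a quick measurement. Below is a minimal Python sketch that estimates random-read IOPS on whatever device backs the temp directory; the file size, block size, and read count are illustrative assumptions, and the OS page cache may serve many of the reads, so treat the result as an upper bound rather than a raw-device figure:

```python
import os
import random
import tempfile
import time

BLOCK = 4096                  # 4 KiB reads, a common small-I/O size
FILE_SIZE = 64 * 1024 * 1024  # 64 MiB test file (illustrative size)
READS = 2000                  # number of random reads to time

def measure_random_read_iops(path):
    """Time READS random 4 KiB reads and return reads per second."""
    with open(path, "rb") as f:
        start = time.perf_counter()
        for _ in range(READS):
            # Seek to a random block-aligned offset, then read one block.
            f.seek(random.randrange(FILE_SIZE // BLOCK) * BLOCK)
            f.read(BLOCK)
        elapsed = time.perf_counter() - start
    return READS / elapsed

# Create a throwaway file of random bytes to read from.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(FILE_SIZE))
    path = tmp.name

iops = measure_random_read_iops(path)
print(f"~{iops:,.0f} random {BLOCK // 1024} KiB reads/sec")
os.remove(path)
```

Run on a rotating disk versus an SSD (with caches dropped), a sketch like this typically shows the order-of-magnitude IOPS gap that motivates moving hot analytics tiers to flash; purpose-built tools such as fio give far more rigorous numbers.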
As IT professionals look to update their infrastructures to deliver the right levels of performance to support big data analytics, these are key areas and technologies they should be looking into:
- Storage: Several important advances have taken place in solid-state storage during the past few years: prices have come down significantly, and quality has improved markedly for enterprise environments. Incorporating flash is an important step in increasing the speed of the infrastructure for high-performance big data analytics. Flash doesn’t have to be used everywhere, but it should be considered in tier-one environments that require high performance.
- Servers: To enable the speed required for big data analytics, it will be important to look at your server infrastructure and upgrade where necessary. Current-generation servers are available with more processing cores, faster memory, and additional cache, among other important features.
- Networks: Faster is better, so moving up to 10GbE is certainly a next step for many data centers, and 40GbE should be considered for strategic applications. In addition, IT will want to look at innovations such as Intel Ethernet Switch Silicon, as well as Ethernet fabrics, which can improve the performance of legacy Ethernet networks in highly virtualized and cloud environments.
- Scalable infrastructure: A converged infrastructure will not only improve performance, but also make the entire infrastructure much easier to manage and scale. IT will be able to manage, monitor, troubleshoot, and remediate problems anywhere in the infrastructure—physical or virtual—from a central management platform.
- Intelligent things: Client endpoints are critical in big data environments, both as the places where new types of unstructured data are gathered—think social media—and as the places where many of the benefits of big data analytics are delivered and acted upon. Modern endpoint devices have an increasingly important role to play in successful big data initiatives—from more powerful consumer devices, to faster devices embedded in machines, to innovations in desktop computing driven by technologies such as virtual desktop infrastructure.
They say speed kills. However, in building the right infrastructure to support big data analytics, it is actually the lack of speed that is the real danger. Fortunately, technology innovation is moving quickly to not just keep up with the demands of next-generation applications, but to actually drive the advancement towards big data analytics.
© 2014 Intel Inc.