Implementing a big data technology framework requires a strategy, as Gahl Berkooz illustrated during his sweeping presentation, titled "In Search of Actionable Insights," at the recent Big Data Innovation Summit.
Berkooz, the head of data and governance at Ford Motor Company in Dearborn, Mich., briefly walked attendees through Ford's big data technology framework for the data supply chain, which comprises seven categories, ranging from analytics enablement to infrastructure management.
But -- and this is a key point -- Ford didn't attack all seven categories at the same time with the same fervor. When one attendee asked about his prioritization strategy, Berkooz didn't hesitate. "We look at what are the things that are Ford-unique and differentiating for Ford versus the things that we can rely on an industry out-of-the-box capability," he said.
It's becoming a familiar piece of advice for IT leaders: Focus departmental efforts on what differentiates the business and rely on vendor partners to do the rest. Mark Shaver, vice president and CIO at Joy Global Inc., a mining equipment and services company in Milwaukee, offered a similar perspective at the Fusion 2015 CEO-CIO Symposium last spring. The solution for the company's industrial IoT initiative wasn't "off the shelf" technology; it was the result of unique partnerships with companies like Verizon. Those partners added expertise Joy Global didn't have -- and didn't want to over-invest in.
"I'm not your standard CIO," Shaver said. "I have no data center. Outlook is outsourced. Security is outsourced. I don't add value to those things."
At Ford, data standards and data quality are considered fundamental building blocks, areas where time and energy are devoted. And for good reason: "Data is the input to our analytics that produce insight," Berkooz said. "If the quality of the data is not controlled, the value of the insight is going to be very limited."
For capabilities that aren't a Ford differentiator, such as infrastructure management, identity and security management, or data ingestion, Berkooz and his team turn to the market.
Firefighting with analytics
Timothy Persons, chief scientist at the U.S. Government Accountability Office, trotted out this analytics nugget during his presentation at the Big Data summit: an analysis of how New York City's analytics squad used data to crack down on "illegal conversions," or properties that have been subdivided without authorization.
The story has been written about before, but it remains a compelling example of how powerful analytics can be. Illegally converted properties are a magnet for pests, crime and fire hazards, posing hidden dangers to firefighters. At the CDO Summit last spring, Amen Ra Mashariki, the current chief analytics officer for New York City, explained to attendees that during one such fire, a couple of firefighters lost their lives because they didn't have an accurate understanding of how the residence was laid out. He posed a simple question: "So, how do you keep firefighters safe?"
The answer turned out to be data. According to Persons, the analytics team started by pulling data from different departments' silos -- from 911 calls to property tax payments -- to spot patterns that flagged likely illegal conversions more reliably than gut feel.
The team discovered that city departments record similar location data, but in different languages: Police use traditional mapping coordinates, firefighters map neighborhoods by fire hydrant and the tax department goes by lot number, Persons said. So the team created a universal language, integrating data from the different departments to describe the city's 900,000 lots.
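The idea of translating each department's location "language" into one shared key can be sketched as follows. This is a minimal illustration, not the city's actual system: the department names, lookup tables and lot IDs below are all invented for the example.

```python
# Hypothetical sketch: reconciling city records that describe the same
# location in different "languages" by mapping each to a common lot ID.
# All data and lookup tables here are invented for illustration.

# Invented translation tables from each department's key to a lot ID.
coords_to_lot = {(40.7128, -74.0060): "LOT-001"}   # police: map coordinates
hydrant_to_lot = {"HYD-77": "LOT-001"}             # fire dept: nearest hydrant
# The tax department already keys its records by lot number.

records = [
    {"dept": "police", "key": (40.7128, -74.0060), "event": "911 call"},
    {"dept": "fire",   "key": "HYD-77",            "event": "inspection"},
    {"dept": "tax",    "key": "LOT-001",           "event": "tax arrears"},
]

def to_lot_id(record):
    """Translate a department-specific location key to the shared lot ID."""
    if record["dept"] == "police":
        return coords_to_lot[record["key"]]
    if record["dept"] == "fire":
        return hydrant_to_lot[record["key"]]
    return record["key"]  # tax records already use lot numbers

# Group events by lot: lots with signals from several silos stand out.
by_lot = {}
for r in records:
    by_lot.setdefault(to_lot_id(r), []).append(r["event"])

print(by_lot)  # {'LOT-001': ['911 call', 'inspection', 'tax arrears']}
```

Once every record carries the same lot ID, a property that shows up in 911 logs, fire inspections and tax arrears at once becomes an obvious candidate for an inspection visit.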
And it worked. Using analytics instead of gut feel, city inspectors were able to issue vacate warnings on 70% of the properties they visited -- rather than the typical 12%, Persons said.
Prescriptive analytics isn't the 'final frontier'
Erik Andrejko, vice president of data science at the Climate Corp., a San Francisco-based data science company that works with the farming community, made a bold statement at the summit: The "final frontier of analytics capabilities" isn't prescriptive analytics; it's explanatory analytics.
"When you're telling a decision maker -- in this case, a farmer -- that they should apply this much fertilizer to this piece of land on this day, it is very helpful to answer the question, 'why,' when it's asked," he said.
The key to explanatory analytics may be models. He pointed to two techniques that can help: structural modeling, which he said is "used to illustrate statistical connections between the environment and crop outcomes," and mechanistic modeling, which he said is "used to capture physical phenomena when the underlying physics are well understood or can be captured directly."
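The distinction can be sketched in miniature. This is not Climate Corp.'s method, only an illustration under invented data: the structural model fits a statistical link between an environmental input and a crop outcome, while the mechanistic model encodes a physical relationship (here, a toy growing-degree-day calculation) directly.

```python
# Hypothetical sketch of the two modeling styles; all values are invented.

# Structural model: fit a statistical relationship between an
# environmental input (rainfall, mm) and crop yield (bushels/acre).
rain = [300, 400, 500, 600]
crop_yield = [90, 110, 130, 150]

n = len(rain)
mx, my = sum(rain) / n, sum(crop_yield) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(rain, crop_yield))
         / sum((x - mx) ** 2 for x in rain))
intercept = my - slope * mx  # yield ~ intercept + slope * rainfall

# Mechanistic model: encode physics we already understand, e.g. heat
# accumulation above a base temperature driving crop development.
def growing_degree_days(daily_temps, base=10.0):
    """Sum of degrees above the base temperature across days."""
    return sum(max(t - base, 0.0) for t in daily_temps)

predicted = intercept + slope * 450            # structural prediction
gdd = growing_degree_days([12.0, 15.0, 9.0])   # mechanistic quantity
print(round(predicted, 1), gdd)                # 120.0 7.0
```

The structural fit can answer "why this recommendation" in statistical terms ("yields rise about 0.2 bushels per extra mm of rain in this data"), while the mechanistic model can answer in physical terms, which is what makes the combination useful for explanation.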
Andrejko said both methods are similar and are used in addition to "black box" approaches to data mining. "We need multiple techniques because there is no 'free lunch' and, typically, one technique will not work for every problem," he said after the event via email.
"You put your finger on one of the key issues the federal government faces, which is just how do you buy IT systems and services now? It sounds simple. If you're a small business, you can just go to Google and lease a cloud. The problem with the U.S. federal system is that we have -- and I'm not making things up, for you computer science folks who know your history -- we have things running in FORTRAN and COBOL that we still have to maintain." -- Timothy Persons, chief scientist, U.S. Government Accountability Office
"The Ebola crisis: Liberia, Sierra Leone, Guinea, West Africa. It hit us last year. We predicted that we would have 13 million people impacted in three months. And the question to me, when I wake up in the morning, is what could I have done if I knew a month earlier? How many lives could we have saved? Now, the world reacted well and responded well, but that's just the beginning. And that's where big data is crucial when it comes to addressing social issues." -- Atti Riazi, assistant secretary general and chief IT officer, United Nations
"I assume a lot of people [here] are working on big data projects, and that's a pet peeve of mine. I don't have a single big data project. I have business projects." -- Jack Levis, director of process management, UPS