Think different: Cognitive computing systems will bring data-led change

Today's programming models are things of the past, says consultant Judith Hurwitz. Cognitive computing systems use data to devise applications that reflect the world as we see it.

The way business applications are built today is flawed.

Judith Hurwitz, a longtime consultant and author of numerous books on IT, made the declaration at the recent Cloud Expo in New York. The building process is flawed, she said, because it relies on business logic -- the programming that lays out what operations will be set in motion and what tasks will be done.  

"We can't continue to write programs the old-fashioned way -- with logic, with a beginning, a middle and an end -- and then feed in the data and everything will be OK," Hurwitz said. "That's why we have so many problems as we do today with our systems. They were built on how we thought about a problem 20, 50, 5,000 years ago."

Over the years, as business models have changed and the number of data sources multiplied, organizations tweaked their old programs to fit, patching the logic and creating what Hurwitz called "monster systems."

Cognitive computing systems, which use hardware or software to approximate human cognitive functions, will change all that, Hurwitz said. The business applications of the future will be based on fast-moving, ever-changing data from an ever-growing number of sources. Gone will be step-by-step instructions based on the past. Cognitive computing learns from patterns and anomalies, makes guesses about what could happen -- and it doesn't assume there is one correct answer. As more data is ingested and analyzed, the system changes, too.
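That kind of system -- one that changes as more data is ingested, rather than following fixed step-by-step instructions -- can be sketched in a few lines. This is my own minimal illustration, not code from Hurwitz's talk: a detector that flags anomalies against a running model of "normal" and revises that model with every new data point.

```python
# Hypothetical sketch (not from the talk): a model that revises itself
# as data streams in, instead of executing fixed business logic.
class StreamingAnomalyDetector:
    """Flags values far from the running mean; updates with every point."""

    def __init__(self, threshold=3.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0          # sum of squared deviations (Welford's method)
        self.threshold = threshold

    def observe(self, x):
        """Return True if x looks anomalous, then learn from it anyway."""
        anomalous = False
        if self.n >= 2:
            std = (self.m2 / (self.n - 1)) ** 0.5
            anomalous = std > 0 and abs(x - self.mean) > self.threshold * std
        # Update the running statistics -- the "model" changes with each datum.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous


detector = StreamingAnomalyDetector()
readings = [10, 11, 9, 10, 12, 10, 11, 50]   # the last value is an outlier
flags = [detector.observe(r) for r in readings]
print(flags)   # only the 50 is flagged; the model adapted to the earlier values
```

There is no hard-coded rule saying what counts as abnormal; the data itself defines normal, and the definition keeps moving as new readings arrive.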

IT consultant Judith Hurwitz speaks on cognitive computing at the recent Cloud Expo in New York.

Though cognitive computing is designed to operate in much the same way as a human brain does -- absorbing a large amount of information, learning from it and making hypotheses based on it -- science has yet to replicate the brain. So, cognitive computing systems -- "unless we fast-forward 100 years when we can simulate all the synapses in the brain" -- won't work on their own; they'll rely on collaboration between humans and machines, Hurwitz said. Humans bring with them the knowledge they already have, and machines bring the capacity to process and store much, much more of it.

A job well done together

Hurwitz, whose talk was based on the 2015 book she co-wrote, Cognitive Computing and Big Data Analytics, used this example: A doctor sees a patient with an unusual set of symptoms he's never seen before. But because he just read a medical journal article about them, he's able to make a diagnosis. Happenstance, Hurwitz said.

We can't continue to write programs the old-fashioned way -- with logic, with a beginning, a middle and an end.
Judith Hurwitz, IT consultant

"It just happened that the doctor saw that article and then came up with an answer," she said. In contrast, a cognitive computing system "can take in huge volumes of data, huge volumes of research in a medical situation and actually collaborate with that doctor and say, 'I see these symptoms.' And then the system may say, 'Well, based on this, have you thought of this?'"

What's needed for this to work is data -- lots of it, Hurwitz said. And lots of types of it: structured records and files, for example, plus unstructured written reports and audio recordings. A system also needs context to understand what data means. Consider brown leaves falling off a tree. That could be bad for the tree -- unless it's happening in late October in the U.S. Northeast or Midwest. If the tree is shedding in mid-April, that's a different story.
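Hurwitz's falling-leaves example boils down to the same observation carrying different meanings in different contexts. A toy function (the names and rules here are my own illustration, not hers) makes the point concrete:

```python
# Hypothetical sketch: identical data, different interpretation by context.
def interpret_leaf_drop(month, region="US Northeast"):
    """Judge whether a deciduous tree shedding brown leaves is cause for concern."""
    autumn_months = {"September", "October", "November"}
    if region in ("US Northeast", "US Midwest") and month in autumn_months:
        return "normal: seasonal shedding"
    return "worrying: possible disease or stress"


print(interpret_leaf_drop("October"))   # seasonal shedding, nothing unusual
print(interpret_leaf_drop("April"))     # same observation, different meaning
```

The observation "brown leaves are falling" never changes; only the surrounding context -- month and region -- determines what it means.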

Machine learning is an important part of the cognitive computing process. It's the computer science discipline with deep roots in statistics, psychology and cognitive science that's getting so much attention today. Instead of being explicitly programmed, machines are fed information. Algorithms take that information, learn from the patterns they find in it, form a hypothesis about what's happening and then test it.
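That learn-hypothesize-test loop can be shown end to end in a few lines. This is my own minimal sketch -- the scenario and numbers are invented for illustration -- using machine temperature readings and a held-out test set:

```python
# Minimal sketch of the loop described above (my illustration, not a
# quoted system): learn a pattern, form a hypothesis, test it on new data.

# Labeled history: (temperature reading, did the machine fail?)
history = [(65, False), (70, False), (72, False), (74, False),
           (88, True), (91, True), (86, True), (93, True)]

train, holdout = history[:6], history[6:]   # keep some data aside for testing

# Learn: find a temperature threshold that separates the two outcomes.
ok_temps = [t for t, failed in train if not failed]
bad_temps = [t for t, failed in train if failed]
threshold = (max(ok_temps) + min(bad_temps)) / 2

# Hypothesis: readings above the threshold predict failure.
def predict_failure(temp):
    return temp > threshold

# Test: check the hypothesis against data the model has not seen.
correct = sum(predict_failure(t) == failed for t, failed in holdout)
accuracy = correct / len(holdout)
print(f"threshold={threshold}, holdout accuracy={accuracy}")
```

No one wrote a rule saying "86 degrees means trouble"; the threshold came from the data, and the holdout test is what validates (or refutes) the hypothesis.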

Humans, machines: Lots in common

"That's how we all work," Hurwitz said. For example, when people start a new job, they go in with certain assumptions. "Your boss tells you, 'This is the way we do our jobs,' and you follow that way. But as you learn more, as you talk to more customers, as you gather more data, you change, your level of sophistication changes, what you understand changes."

And that's what cognitive computing systems do, Hurwitz said. They learn from the facts, even as those facts change, and can build programs by "letting the data lead." That sets aside the biases and assumptions that could be baked into business applications for a range of industries -- especially ones that deal in mammoth amounts of data, such as healthcare and manufacturing -- and lead decision makers to the wrong conclusions.

Hurwitz illustrated how this could happen with another area that cognitive computing systems will likely have a big effect on: cities, which encompass everything from transportation to policing to environmental planning. Imagine, she said, writing business logic for urban transportation 150 years ago.

"We would build logic around how horses operate in a city, because that would be our reality and how we look at the world," she said. "Clearly, if we tried to apply those same rules today, it would be silly."

Consultant Judith Hurwitz discusses how cognitive computing systems will become available in the second part of this two-part report.

Next Steps

Will deep learning make machines more human?

The future of AI: A human-machine partnership

All walks of IT life walk the halls at Cloud Expo
