In this second half of a two-part interview, Harvard Business School Professor and author Robert Kaplan discusses how he defines two subjects that are receiving a lot of buzz these days: agile business and predictive analysis. He also shares his thoughts on why companies overlooked signs of the recession and why risk management deserves its own scorecard.
The first half of this Q&A focused on
There is a lot of talk about agile business and agile methodologies. How do you define agility?
Kaplan: That's one of those buzzwords that people define differently. To be agile is to sense changes in markets and customer preferences as they are evolving, and to be able to respond to them. It's not just information; it's analysis that reveals patterns in customers' purchasing decisions and preferences, with that [data] coming into the company so it can respond to whatever these evolving needs are. It also means keeping track of competitive forces, to be able to offset them. But the front end of agility is information, because information is what you're being agile with respect to. It's not just that you're doing things within the company faster; the response has to meet some market need.
Agile methodologies tie into the idea of being able to respond to change faster, but can companies get to a point of predictive analysis?
Kaplan: Predictive analysis comes from analytics applied to historical data. In the old days, you used historical data to evaluate performance and reward people. Now you're trying to use data to help understand the future. Wal-Mart, for example, does a very good job of understanding the types of bundles consumers are likely to purchase. It tries to predict the patterns of consumer purchasing and then arranges its offerings to encourage the buying of multiple products and services. We have crystal balls, but they're not very accurate. What we do have is data, and with access to large quantities of data on consumer purchasing, then yes, that does help you predict the future.
The question is, are companies investing sufficiently in analytic methods to make sense of the data? Raw data is useless, but if you can study the past and use various statistical methods to process the data, then you really can provide information and knowledge that's actionable and predictive of the future, as long as historical patterns persist.
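The bundle analysis Kaplan attributes to Wal-Mart rests on a simple idea: count which products appear together in the same transactions. A minimal sketch of that co-occurrence counting, using invented basket data (not Wal-Mart's actual methods or data), might look like this:

```python
from collections import Counter
from itertools import combinations

# Hypothetical transaction data -- each set is one customer's basket.
baskets = [
    {"diapers", "beer", "wipes"},
    {"diapers", "wipes", "formula"},
    {"beer", "chips"},
    {"diapers", "beer"},
    {"chips", "salsa", "beer"},
]

# Count how often each pair of products shows up in the same basket.
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The most frequent pairs suggest bundles worth offering together.
for pair, count in pair_counts.most_common(3):
    print(pair, count)
```

Real retail analytics layers statistical measures (support, confidence, lift) on top of these raw counts, but the historical co-occurrence data is the starting point in either case.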
Is there a technology or role that IT can play in taking that data and making sense of it, or is that something the business should be responsible for?
Kaplan: The more you can make life easier for applied statisticians by providing an analytic interface between transactional data and the methods they want to use to explore it, the better -- it could be as simple as powerful ways to display the data on a screen so they can see patterns. I don't think we've reached the stage where you can completely automate this process [with technology], though. There are judgments that need to be made as you build the models. It's easy to get fooled by the data and make the statistics lie for you.
Why were so many financial institutions seemingly caught off guard by the falling housing market and mortgage lending crisis?
Kaplan: They didn't have good models of the values of the securities they held or of the risks they carried. There were a couple of banks that had much better internal staff for looking at the transactions that were taking place in the market and then thinking about what that meant for their own portfolios. But I think companies like Bear Stearns, Lehman Brothers and Wachovia … I mean, they failed miserably in understanding the deteriorating value of the securities they were holding, and this was knowable. If they had had better models and analytics, they could have seen this much earlier and perhaps avoided the kind of failures that they had.
Didn't many financial institutions have models and analytics in place, though?
Kaplan: The people who built the models didn't fully understand the businesses behind the data they were looking at. And they didn't understand the pressure testing -- the things to watch out for. The data they were seeing in a period of rising housing prices would not necessarily be representative of what that data would look like if housing prices were flat to declining. They needed someone to understand that the housing market had boomed and gotten overpriced, and while you couldn't predict when housing prices would level off or start to decline, you could test "what if" that happened and see the sensitivity of your asset holdings to that economic event. Very few banks did that type of pressure testing.

We have to have robust software that enables you to look at scenarios and do the "what ifs." Even though you're not predicting the future, you're thinking about what the consequences are under various alternatives, and that's what they failed to do. I don't think it was so much a lack of software as a lack of imagination. Maybe they didn't want to think that the good times had a possibility of ending.
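The "what if" testing Kaplan describes doesn't require forecasting; it only requires revaluing a portfolio under assumed scenarios. A toy sketch of that idea follows, in which every figure -- the portfolio mark, the loss sensitivity, the scenarios -- is invented for illustration, not drawn from any bank's model:

```python
# Toy scenario analysis: revalue a mortgage-heavy portfolio under assumed
# housing-price paths.  All figures here are hypothetical.

portfolio_value = 100.0  # current mark, in millions

# Assumed sensitivity: each 1% decline in housing prices erodes 2% of
# portfolio value.  Rising prices are assumed to add nothing.
loss_per_pct_decline = 0.02

scenarios = {
    "prices rise 5%": 0.05,
    "prices flat": 0.00,
    "prices fall 10%": -0.10,
    "prices fall 25%": -0.25,
}

results = {}
for name, price_change in scenarios.items():
    decline_pct = min(price_change, 0.0) * 100  # only declines cause losses
    results[name] = portfolio_value * (1 + decline_pct * loss_per_pct_decline)

for name, value in results.items():
    print(f"{name}: portfolio marked at {value:.1f}")
```

The point of the exercise is visible even in this caricature: you don't need to know *when* prices will fall, only what your holdings are worth *if* they do.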
Have you adjusted the balanced scorecard methodology due to the recession?
Kaplan: If I had to name one thing missing that has been revealed in the last few years, it's that there's nothing about risk assessment and risk management. My current thinking is that companies need a parallel scorecard to their strategy scorecard -- a risk scorecard. The risk scorecard asks: What are the things that could go wrong? What hurdles could spring up, and how do we get early warning signals when some of these barriers have suddenly appeared, so we can act quickly to mitigate them? [Risk management] turned out to be an extremely important function that was not done well by many of the [financial services] companies we talked about earlier. Risk management was siloed and considered more of a compliance issue than a strategic function. Now we see that identification, mitigation and management of risk have to be on an equal level with the strategic process.
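The early-warning-signal idea in Kaplan's risk scorecard can be sketched as indicators paired with thresholds that trigger alerts when breached. All indicator names and numbers below are invented for illustration; a real scorecard would be built from a company's own risk inventory:

```python
# Hypothetical risk scorecard: each indicator pairs a current reading with
# a threshold and the direction in which a breach is bad.
indicators = [
    # (name, current reading, threshold, bad side of the threshold)
    ("mortgage delinquency rate", 0.07, 0.05, "above"),
    ("wholesale funding share",   0.30, 0.40, "above"),
    ("housing price index, YoY", -0.02, 0.00, "below"),
]

def early_warnings(scorecard):
    """Return the names of indicators whose readings breach their thresholds."""
    alerts = []
    for name, reading, threshold, bad_side in scorecard:
        breached = reading > threshold if bad_side == "above" else reading < threshold
        if breached:
            alerts.append(name)
    return alerts

for name in early_warnings(indicators):
    print(f"early warning: {name}")
```

The structural parallel to the strategy scorecard is the design point: the same discipline of explicit measures and targets, applied to what could go wrong rather than what should go right.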