This article can also be found in the Premium Editorial Download "CIO Decisions: BI and big data: Enterprise CIO Decisions Vol. 13."
If the myriad CIO surveys performed recently are to be believed, analytics sits at or near the top of most IT agendas. Before launching an analytics project, though, it pays to understand how such projects differ from other IT work. Here are four things to keep in mind.
1. Analytics projects can be highly nuanced.
Other IT projects -- accounting or production planning, for example -- follow a fairly well-understood process. Analytics projects, on the other hand, reflect the way humans make decisions. And because humans make decisions in a nearly infinite number of ways, analytics projects often do not follow a prescribed path.
Instead of mapping business rules, transactions and workflows, analytics projects require that we stay in very close contact with our project stakeholders so the project can track to the meanderings of their human minds.
2. Analytics projects might evolve in unanticipated ways.
Both the nuanced nature of human decision making and our successful analytics projects argue for projects that are designed to evolve and adapt to change. This means that there likely is no such thing as a "Big Bang" analytics project. These projects work best when we use Agile methods and create and get feedback on rapid prototypes, plan multiple project iterations, and make frequent go/no-go decisions.
For example, one of our departments was adamant about the end state of an analytics project. They knew exactly what they needed, and they wanted it in one major release. When I talked with them about breaking the project into adaptable phases, they told me, "No need, we know how to make the decisions; we just need you to get us the data."
Not wanting to give in to doing a Big Bang analytics project, I issued a challenge: Let's take an Agile approach to the project, then decide whether it worked. If the Agile approach was inferior, I would buy the project sponsor and her staff a lunch. With that gauntlet thrown down, the sponsor and I agreed on a first phase that supported what her team felt would be the most obvious benefit of the project.
For the next few weeks, as we worked on the first phase and the sponsor started playing with an early version of our product, she and her team recognized they were gaining insights into cause-effect relationships they had not anticipated. As we started planning for the next phase, their requirements changed completely. No longer did they want to gather data about product returns. Instead, lack of inventory, or stockouts, became the driving factor for the next phase. When the project ended (successfully, I might add), the end state in no way resembled what the team had thought of originally. Rather, it had evolved along with their thinking and decision making.
3. Effective analytics projects incorporate external data.
IT projects typically focus on information that is inside the organization: accounts and amounts, items, inventories, sales orders, customer contacts. In order to lead effectively to improved decision making, however, analytics projects must gather information from outside the enterprise as well. Because we might not control or even know how to get to this data, our planning should include experiments and tasks to find it.
Suppose our analytics project is designed to improve decisions about our product lifecycle management. We can better manage these lifecycles if we have such internal information as sales transactions, inventory records, product-line goals, pricing and discount history, success criteria, and marketing campaign results. Internal information alone, however, might not lead to better lifecycle management. It also might be worthwhile to factor in external information: competitive products, product alternatives, product placement, macroeconomic data, weather, school calendars -- pretty much anything that might affect a product's potential success.
Once we understand which external information might improve our decision making, we can determine how critical and how available it is -- as well as how much we are willing to pay for it.
4. Analytics projects focus on cause-effect relationships.
Three underlying beliefs drive my approach to analytics projects. First, such projects should improve decision making demonstrably. Second, better decision making comes from establishing cause-effect relationships accurately. Third, establishing accurate cause-effect relationships is extremely difficult and fraught with risk.
Why "fraught with risk"? Because we often mistake correlation for causation: even an excellent correlation does not mean that a cause-effect relationship exists.
Establishing cause is the foundation of improved decision making, and it is what makes business intelligence incredibly powerful. But how do we do it? For starters, we can think through the (hopefully obvious) cause-effect relationships. Before we commit to them, though, we should test our assumptions to see whether those relationships actually hold. Will reducing call-center queue times increase sales? Try it and see. Will longer testing cycles improve product quality? Try them and see.
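The correlation trap above is easy to demonstrate. The following sketch (a hypothetical example, not from the article) builds two data series that both respond to a common driver; the series correlate strongly with each other even though neither causes the other:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical confounder: summer heat drives both series below.
heat = rng.normal(size=1000)

# Ice cream sales and air-conditioner repairs both respond to heat,
# but neither one causes the other.
ice_cream_sales = 2.0 * heat + rng.normal(scale=0.5, size=1000)
ac_repairs = 1.5 * heat + rng.normal(scale=0.5, size=1000)

# The Pearson correlation between the two "effects" is high...
r = np.corrcoef(ice_cream_sales, ac_repairs)[0, 1]
print(f"correlation: {r:.2f}")

# ...yet intervening on one would not move the other: only a change
# in `heat` (the true cause) shifts both. Hence "try it and see" --
# a controlled test, not a correlation, is what establishes cause.
```

A correlation that large would pass most dashboards' sniff test, which is exactly why the "try it and see" experiments recommended above matter.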
Analytics projects also require IT leadership that is both credible and consultative: credible about delivering phased, high-impact projects; credible about changing the role of IT to deliver business value through analytics; consultative about the power that quality analytics can unleash; consultative about taking a different approach to BI projects.
Niel Nickolaisen is CIO at Western Governors University in Salt Lake City. He is a frequent speaker, presenter and writer on IT's dual role enabling strategy and delivering operational excellence. Write to him at firstname.lastname@example.org.
This was first published in February 2012.