It's 2003. May. IT is becalmed, in the doldrums, in limbo. The flurry of activity and inexplicable spending that was the Y2K tsunami has long since blown over. Some still refer to the millennium bug as IT's finest hour. Others believe it was a totally manufactured crisis. Either way, in 2003, the business side of the house is no longer supporting any of IT's fantasies. The Standish Group has just published the results of five years of analysis on the failure rate of IT projects: A depressing 65% of IT projects fail. The dominant project methodology is Waterfall (popular since the 1970s). Looking at the breakdown of the average IT spend, approximately 20-25% is committed to keeping everyday IT operations up and running, with the remainder going to innovation and new solutions for the business. The business is reacting to Y2K and pushing for more stability, availability and reliability from its IT systems. The business is demanding real business value and ROI from its IT spend, not pie-in-the-sky touchy-feely measurements. IT is struggling to respond. Struggling to translate real IT performance gains into demonstrable value for the business. Struggling to understand its role in the confusing new millennium. Struggling, in some way, to justify its very existence.
Around this same time, a small group of unconventional programmers gathered at a ski resort in Utah in February 2001 and created the Agile Manifesto. More about this milestone in the evolution of IT later…
Economically, in 2003, we are in a downturn. The collapse of Enron and WorldCom has vividly demonstrated that no company is "too big to fail." These scandals led to the passage of the Sarbanes-Oxley Act in 2002, which contains provisions under which key executives can now be sent to prison for falsifying the financials of public companies. IT is to play a major role in the financial reporting systems of public companies. So, in addition to the heavy operational focus of most IT shops, new compliance requirements are piled onto departments already burdened with a plethora of audits.
Politically, in 2003, we are reeling from recent events. Our first presidential election of the new millennium took six weeks to decide because of the method used by Florida to collect simple data ("hanging chads"). The headlines in Europe call it a real "Mickey Mouse" operation. Less than two years prior, 9/11 had caused us all to question everything. Even the long-vaunted FBI had been caught with its IT computing pants down. The nation's top cops, famed for their ability to gather and sift through huge volumes of information, are exposed as laggards in 2001, dependent on outdated systems that do not have a prayer of keeping up with the exponentially increasing demand. Systems that were linked -- at least, indirectly -- to the domestic intelligence failures leading up to 9/11. We went from invincible to vulnerable in the span of one sunny Tuesday morning. It is a difficult time for the U.S. and a difficult time to be in IT.
As if things couldn't get any worse for IT, in the May 2003 issue of the Harvard Business Review comes an article by Nicholas Carr entitled, "IT Doesn't Matter." The premise is simple and direct: IT, like so many other technological innovations before it, has become a commodity. IT no longer represents a strategic business advantage. No longer can one organization exercise technological dominance using IT as its lever. Now, every organization has roughly the same IT: networks, routers, servers, databases, websites, email and so on.
Carr further points out that, now that IT has reached this stage of commoditization, it represents a major risk to the business enterprise: The risk of not being there. IT outages that were mere annoyances in the past now place the business at a significant disadvantage. This development in the evolution of IT was not just a shift in thinking, but a strategic shift in where you put your IT spend: Risk mitigation versus innovation. In fact, this fundamental shift in thinking (and spending) was already well underway. Every organization on the planet had been caught on the horns of this dilemma three years earlier during the Y2K mess. But there was simply far too much to do at the time and no time to think about why we were doing it. Y2K was the embodiment of Carr's basic premise: IT represented a strategic risk for the business.
The industry's response to Carr's article was swift and pointed. Analysts and pundits lined up to debunk the premise that IT and its technology were little more than 21st century plumbing. The overriding theme of the responses to Carr's article can be paraphrased as, "Sure, we all have the same technology, but it totally depends on what you do with it! Look at Cisco and Wal-Mart and Dell -- they are innovative." (Remember, this is 2003!) So, the real question is: Was Nicholas Carr right in 2003?
Is he right today?
It's time for a change!
Fast forward to 2017. What has changed? What hasn't? The Standish Group has now been measuring the failure rate of IT projects for 20 years. The IT project failure rate still hovers at a depressing 65%, with very little fluctuation from year to year. The dominant project methodology is still Waterfall.
In 2017, most IT operations groups remain focused on delivering reliable, stable, secure services with a minimum of downtime. Now, however, approximately 75-80% of the IT spend is committed to keeping everyday IT operations up and running, with the tiny remainder going to innovation and business solutions. Is this progress? Today, the business is pushing for more creativity, flexibility and innovation from its IT systems. Yet, many IT operations groups -- stuck in post-Y2K mode -- remain hunkered down and determined to create the most bulletproof environments in history.
Meanwhile, the business side has been scrambling to maintain its relevance while competing in our new digital world. Smartphones and tablets that are always connected. More savvy consumers. Drastically different customer expectations and timeframes. The business now knows what needs to be done and they know how fast they need it done. The business has started pushing development (dev) and project management (PM) to get with the program. Pushing them hard. Very hard. With 20 years of data to back it up, dev and PM have to admit that they are broken. The Waterfall project methodology is simply not working for most development projects. There are better ways to develop software.
Phew! Crisis averted. Business, dev and project management are all in sync, oh my! Breaking into their happy dance and all's right with the world!
Evolution of IT leaves IT ops behind
Whoa! Not so fast. Yes, things are finally good with biz, dev and PM, but nobody told ops. You remember IT ops, the people who actually deploy all these new wondrous Agile-developed business solutions so that actual value can actually be realized by the business. For real.
IT operations is still operating under the old rules. The 2003 rules. Nothing has changed for them. Well, that's not quite true. They are trying to apply the 2003 rules to a world with smartphones, Wi-Fi and ubiquitous cloud environments. They are still trying to: Lock it down, resist change, make it bulletproof, defeat hackers, safeguard availability, increase reliability and maintain stability … and, indirectly, frustrate the heck out of the business and dev and PM as they try to maximize value at a totally different pace.
Ops seems to represent a roadblock in the fast lane of Agile deployment. Ops is not a little off-kilter, but rather is suffering from a complete disconnect. A different definition of value and success. Like landing on a different planet where everything seems upside down. A Twilight Zone episode where all the beliefs you have valued and cherished go up in smoke. Poof! Life, literally, doesn't make sense anymore.
Can this chasm be crossed? Can two such diametrically opposed sets of values ever be brought into alignment? Even more than that: Can two sides that seem to be polar opposites actually exist -- no, thrive -- under a common set of rules? They can. The popular buzzword is digital transformation. But haven't we actually been digitally transforming since before the mid-1990s? Yes, so let's call it what it is -- a value transformation. (Which means it's a people transformation.)
Go to part two of Spalding's essay on the evolution of IT, "Soar toward 'value transformation' on DevOps wings," for a discussion of DevOps' role in delivering today's definition of business value.