In the late 1990s, IT hit upon the concept of prototyping and applied it to application development. The prototyping...
model was a forerunner of today's iteration process -- design and build small, do it fast in components, and then test and tweak the components. Like the iteration approach practiced today, this forerunner was a good idea, but it didn't work well in practice. The tweaking typically took longer than expected, business users often lost patience, and the practice was never widely accepted.
In the early 2000s, the iteration process was adopted as a central tenet by the folks who did business process management (BPM), a discipline that itself evolved from business re-engineering. At roughly the same time, IT reimagined the prototyping model and, this time around, embedded it in the Agile software development approach as the "iteration" of the design and solution.
So, here we are again, struggling to make iteration a mainstream process and, again, getting mixed results.
Before we go further, I want to state clearly that I am a proponent of the iterative approach -- I simply want it to work right and really be helpful.
But, in practice, as its history suggests, the iteration process has both good and bad points.
Let's discuss the bad part first, so we can end on a positive note.
Where the iteration process goes bad
There is a myth that iteration makes a project end sooner. Neither I nor the many people I have worked with professionally can point to a project where iteration actually saved time. (Certainly, there may be people who can testify to iteration's time-saving properties, but I don't personally know of any.) I do know of evaluation groups that have praised the Agile approach -- and, thus, iteration -- and still reported unbelievably high project failure rates, reaching 70% or higher for BPM-related work. It is noteworthy that many of these projects do seem to eventually succeed, once they have iterated enough to get past the never-ending design, construction, evaluation and redesign work.
So, what are we to make of the gap between myth and reality when it comes to the iteration process? Let's look first at a few of the iteration realities I have experienced firsthand.
Never-ending: Iterations can go on and on, expanding the design and construction of the solution well beyond estimates. In iteration, there is no real attempt to get the new design right the first time -- the goal is to move fast and be "nimble," then improve over multiple iterations. For the more analytically predisposed managers, the solution can always be better, so it is never finished. In this situation, the managers believe that the next iteration will make things much better than the last one.
Increased risk: Each time the team creates a new iteration of the model, it must be completely rechecked -- if not, the risk of delivering a product with problems increases with each iteration. This, again, extends the time needed to build the solution. Trial and error will eventually lead to a good solution, but it may not be the most efficient way to get there.
Disruption: At some point, success is just declared and the solution is installed. But the iteration process continues because what was installed is likely incomplete. This causes ongoing business disruption as parts or versions of the solution keep getting implemented and then changed again.
Confusion: After a couple of iterations of the solution, no one on the "floor" has a clue as to what they should really do or how they should do it. The obvious result is business staff and manager frustration.
Control the number of iterations with simulation
Iteration is a good concept and when used -- and controlled -- in the right way, it can work well.
For some CIOs and applications development leaders, this "right way" involves mixing business and IT BPM-related concepts, methodologies and techniques, taking the best of each to compensate for the weaknesses of the others. However, when melding the business BPM methodology (normally a Waterfall-type methodology) with the IT BPM methodology (usually an Agile-based methodology), the key is to put mechanisms in place to control the number of iterations and what is expected of each one.
In this part of the discussion, I proceed from the assumption that the application solution development teams are able to create applications that fulfill the business and technical requirements they were given. This assumption means that the programs will deliver the required services. It does not mean that the application solution runs completely smoothly or is as efficient as it could be. It also does not mean the application solution is flexible or even complete. Nor does it mean that the application solution eliminates complexity.
But these problems that demand iteration can be dealt with efficiently. On both the IT BPM and business BPM sides, I recommend that teams consider using simulation modeling to evaluate each iteration's design. The simulation tools will point to bottlenecks, show how the solution will behave under different workloads and surface a host of other design issues.
By using the results of the simulation to focus design improvement, the team steadily optimizes the design through targeted changes. In this way, improvement decisions are driven by a rigorous, simulation-based efficiency evaluation and not by "Let's try this and see what happens." The result is that the number of iterations becomes controlled, and far fewer seem to be needed. Arguably, this approach also produces a much better business design.
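To make the idea concrete, here is a minimal sketch, in Python, of the kind of back-of-the-envelope utilization check that commercial simulation tools automate. The step names, service times and arrival rate are hypothetical, invented purely for illustration; a real simulation tool would also model queuing, variability and resource contention.

```python
# Sketch: flag bottleneck steps in a workflow before committing to an iteration.
# Uses the basic queueing relation: utilization = arrival_rate * mean_service_time.
# A step with utilization >= 1.0 cannot keep up with demand (a bottleneck).

def bottleneck_report(steps, arrival_rate):
    """Return (name, utilization, is_bottleneck) for each workflow step.

    steps        -- list of (step_name, mean_service_time_in_hours) tuples
    arrival_rate -- work items arriving per hour
    """
    report = []
    for name, mean_service_time in steps:
        utilization = round(arrival_rate * mean_service_time, 2)
        report.append((name, utilization, utilization >= 1.0))
    return report

# Hypothetical claims workflow: (step name, mean service time in hours per item)
steps = [("intake", 0.10), ("review", 0.50), ("approval", 0.20)]

for name, util, is_bottleneck in bottleneck_report(steps, arrival_rate=3.0):
    flag = "BOTTLENECK" if is_bottleneck else "ok"
    print(f"{name}: utilization {util} -> {flag}")
```

Run against a candidate design, a check like this tells the team exactly which step to improve in the next iteration -- the "review" step here -- instead of guessing.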
When the iteration process goes right
The new business process model is "good" when it has a high probability of delivering the target products or services and when the simulation and financial reviews say the workflow and other aspects of the business are optimal. By comparing current-state simulation results against the simulated operation of the new solution, the team can also predict project benefit -- exactly what will be saved by moving to the new operation (workflow), what will be saved by eliminating business problems through the new design and what will be saved by eliminating or reducing errors.
Once the business process models have reached a point where they can be shown to be effective and efficient using a simulation tool, a BPM suite (BPMS) tool can be used to generate a "straw man" version of the applications.
In addition, using the old technique of testing the applications with "stubs" to simulate passing data to another application and "drivers" to simulate situations where the solution systems receive data from other applications, the models and solution design can be further optimized to assure that the workflow and the operation of the supporting computer applications deliver the target results.
This stub- and driver-type of iteration should be considered in BPM projects -- especially ones supported by a BPMS tool. As with the business design iteration process, it must be planned and carefully controlled. Also, as with the business design iteration, this program testing and modification cycle can produce much improved results when properly governed.
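The stub-and-driver pattern described above can be sketched as follows. All of the names here (the claim-processing function, the payment stub) are hypothetical, invented purely to show the shape of the technique, not taken from any particular BPMS.

```python
# Sketch of the classic stub/driver testing pattern:
# - a "driver" feeds input to the solution, as an upstream system would;
# - a "stub" stands in for a downstream system the solution normally calls.

def process_claim(claim, payment_system):
    """The solution under test: validates a claim and submits a payment request."""
    if claim.get("amount", 0) <= 0:
        return {"status": "rejected", "reason": "invalid amount"}
    confirmation = payment_system.submit(claim["amount"])  # downstream call
    return {"status": "approved", "confirmation": confirmation}

class PaymentStub:
    """Stub: simulates the downstream payment application without invoking it."""
    def submit(self, amount):
        return f"STUB-CONF-{amount}"

def run_driver():
    """Driver: simulates upstream systems sending claims into the solution."""
    test_claims = [{"amount": 125.0}, {"amount": -5.0}]
    return [process_claim(claim, PaymentStub()) for claim in test_claims]

print(run_driver())
```

Because the stub and driver isolate the solution from the systems around it, each iteration of the design can be exercised end to end before those neighboring applications exist or are connected.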
Taken together, these two uses of controlled iteration eliminate many of the inherent problems with business process development, while allowing teams to create better results, faster.
As always, I welcome your thoughts -- even disagreements. Please contact me at email@example.com.
More from Dan Morris
Business process management system tools drive software development