Thursday, August 31, 2006

 

Methodology Musical Chairs

Here is another idea while we are writing the book "Beyond Agile - What is Wrong with Agile Development and How to Fix It to Scale Massively and Succeed Consistently."

In the 1970s, we had several competing methodologies: IBM's SDM/70, Arthur Andersen's Method/1 and many more. These methodologies seemed to have many differentiators, but, looking back with 20/20 hindsight, they were more alike than different. They were all "waterfall" methodologies, meaning they advocated completely finishing each phase - requirements, analysis, design, coding, testing, and so on - before proceeding to the next.

This model has an interesting history. The waterfall model is usually traced to Winston Royce, one of the first prominent methodologists in the computer field (the term "waterfall" itself was attached to his diagram later). Royce described the model in a 1970 paper, but then immediately pointed out its flaws and suggested improvements, including feedback loops from testing back to design. The industry, however, seized on Royce's initial model and built its methodologies on an approach he had already admitted was flawed. We are told that Royce was not pleased with this, but his voice was lost in the noise.

Not long after, Fred Brooks wrote "The Mythical Man-Month," which proposed a very different way of developing software. Reading the book today, a reader could be forgiven for thinking it was written by the latest, hippest XP/Agile/Scrum evangelist, when, in fact, it was first published over thirty years ago. Tight feedback loops, iterative/incremental lifecycles, and deep involvement of business users were all concepts Brooks proposed.

In the 1980s, detailed processes like James Martin's Information Engineering and Yourdon/DeMarco's Structured Analysis and Design claimed prominence. These were also waterfall processes, and they continued the focus on rigorous, extensive piles of documentation throughout the lifecycle, making change more and more difficult as a project progressed through its phases.

In the 1990s, waterfall lifecycles still had plenty of momentum, with IBM introducing a new waterfall lifecycle called AD/Cycle and Andersen Consulting, somewhat inexplicably, introducing a new waterfall process in the late 1990s called "Business Intelligence" (not related to data warehousing). The 1990s was also when the CASE tool phenomenon - Computer-Aided Software Engineering - hit our industry. Six-figure tools like the Application Development Workbench (ADW) and the Information Engineering Facility (IEF) were purchased by many IT departments in hopes of reducing the time-to-market of software applications. Instead, these costly tools only served to lengthen the development lifecycle, handcuff developers in their implementation choices, and frustrate business users with their ever-changing demands. CASE tools, also called I-CASE (Integrated CASE), may have been our biggest mistake so far in this industry (although we did learn some good things from our CASE experiences, so nothing is a complete loss).

In the late 1990s, the most noteworthy event in the methodology world was Rational Software's introduction of the Rational Unified Process (RUP). It was heralded as a process that might unite the industry, much as the Unified Modeling Language (UML) had done for the notation of diagrams and models. The RUP included all the latest ideas in software development: an iterative/incremental lifecycle, use cases, automated testing, traceability, object-oriented design and construction, component-based development, an architecture-centric approach - you name a buzzword, RUP had it.

But RUP was no sooner out of the chute than a series of books appeared pointing to the "new" great white way: eXtreme Programming (XP). XP could hardly be called a methodology; instead, it was a collection of "best practices" drawn from a failed project at Chrysler called the C3 Project. The proponents of XP were already well known in the industry, having led the practice of "Class-Responsibility-Collaboration Cards," or CRC Card sessions, which remains a successful technique to this day.

Again, XP hardly had a chance to gain ground before a high-profile meeting of methodologists at a ski resort in Utah brought us "Agile Development." Agile was an extension of XP, not a dismissal of it. The Agile banner allowed multiple people to proclaim their adherence to a common set of baseline principles while varying their own processes slightly to maintain the autonomy of their individual brands. Today, under Agile, we have XP, Scrum, Crystal, Lean Software Development, Feature Driven Development and Dynamic Systems Development (among others). Even the proponents of these processes would admit that the approaches have much more in common than they have differentiators. That can largely be attributed to that first meeting at the ski resort, where the industry experts agreed to be cooperative rather than political in managing the competition between their ideas.

Why has our industry faced such tremendous change in our short history? Every five to eight years, we seem to completely overhaul our view of software development. We jump from one train-of-thought to another, always hoping to find something that will help, and (so far, at least) always being disappointed.

Most notably, all this change has not led us to converge on a single best answer. Instead, we've seesawed from waterfall to iterative, from mountains of documentation to almost none, from involving the business users to pushing them away, from trying to automate our processes to doing things manually.

One client I was consulting with last year was moving steadily toward an iterative/incremental, use case driven lifecycle. Suddenly, with the installation of a new CIO, they've jumped back into the waterfall camp and have forgotten about use cases as well.

Why do we play these games of "Methodology Musical Chairs?"

Should computer methodologies resemble hairstyles? Substitute an afro for Method/1, big hair for SA/SD, and a buzz cut for XP, and don't we have methodology "fashions" that closely mirror the hairstyles of days gone by? Is that what we want? Isn't there more to software development than jumping from fad to fad?

Let's examine a little more closely what these "fashions" are based on. It's fairly safe to say that each new trend in methodology is based on someone's experiences. A highly intelligent person looks at what is and isn't working on their own projects and begins to experiment. If they find something that works, maybe they write a book about it.

That is certainly how our book "Use Cases: Requirements in Context" was written. Eamonn Guiney and I noticed that requirements phases were going very poorly on certain projects, but that the approaches using use cases seemed to work better. When we noticed this trend, we began paying more and more attention to the "best practices" of requirements on our teams, and the book resulted from those observations.

But are best practices the right way to develop methodologies? See our post on best practices here.

Perhaps it's time to come up with a new way of coming up with methodologies.
