Software development is complicated and complex, which is why many projects fail. It is difficult even to determine whether a project was successful: goals may not be clearly defined, failures can be redefined as successes for political reasons, and there are different parameters by which success can be evaluated. Subjectively, though, the proportion of projects that fail to achieve at least one of their objectives seems quite significant.

Chaos Report

A long-established study on project success is the Standish Group’s Chaos Report, which has been compiling findings on IT projects since 1994 and now draws on a database of more than 20 years and over 50,000 projects. Analysts evaluate the projects so that the classification as success or failure does not depend on the subjective impressions of the people involved. The 2015 report is available free of charge. It shows that for several years about 20 percent of all projects have been failures, about 40 percent have not achieved all of their goals, and the remaining 40 percent have been successful. In individual categories such as schedule, costs, or deadlines, between 40 and 60 percent of projects miss their targets. According to the report, agility reduces these numbers considerably, sometimes by half.

These figures can certainly be criticized: success can hardly be made objective. For a long time, the Chaos Report used budget, scope, and time as success criteria. Agile projects, however, do not commit to a scope in advance; they only plan the next increment. A truly agile project therefore cannot “meet the scope”, because the scope is unknown at the start of the project. This raises the question of how meeting the expected scope can be evaluated at all if the projects are actually agile.

The Best is the Enemy of the Good

Because of these contradictions, it may well be that the “agile” projects in the Chaos Report implement some agile practices, but not all. There are other reasons to doubt whether these “agile” projects are really agile: misunderstandings abound, and many projects call themselves agile even though they fail to implement fundamental concepts. The agile projects in the Chaos Report are therefore probably projects that implement a “broken” version of agility. On the positive side, they have started to adopt agility, but have not yet implemented all the important practices. Since agility should be a continuous improvement process, the adoption of agility is never really finished. But even this important aspect of continuous improvement is often neglected.

If a “broken” implementation of agile mechanisms already results in significant improvements in project results, then this approach might make economic sense. One could even ask what additional economic benefit a full and correct implementation of agility would bring. There are good indications of such advantages, but implementing full agility, and especially the agile values, could be costly without a corresponding economic payoff. In that case, even poorly implemented agile processes could be economically optimal from a cost-benefit perspective, because they are relatively easy to achieve and already offer considerable advantages.

To paraphrase Voltaire: the best, i.e. “real” agility, is the enemy of the good, i.e. “broken” agility. Further improvement only happens if you are not satisfied with something that is already good enough. As a consequence, broken agile processes might be here to stay. That may make economic sense, but “broken” processes demotivate; values and culture often deteriorate, and people suffer as a result. So let’s hope that this is not the end and that agility will improve even further.

tl;dr

If “broken” agility already brings significant benefits, that explains why processes are often broken and why they are not improved further.

Many thanks to my colleagues Anja Kammer, Martin Kühl, and Hanna Prinz for their comments on an earlier version of the article.
