Software development projects almost always run longer than estimated, and the quality of the result is often poor. Books about software development have been pointing out exactly the same problems since the 70s, and they are still as valid as ever. Many people I know wholeheartedly agree that the problems they describe are a common pattern in the IT business.
So how come the situation never changes? I had parked this posting for a while when I came across an IEEE article about lying in the IT business (“Lying on Software Projects”, IEEE Software, November/December 2008) and found an interesting overlap.
Frequency of lying:
- Cost or schedule estimation: 66%
- Status reporting: 65%
- Political maneuvering: 58%
- Hype: 32%
Who is lying, and why (Cost/Status/Politics/Hype):
- Management (53%/49%/44%/31%)
- Project lead (48%/54%/34%/32%)
- Developer (45%/30%/19%/29%)
- Marketing (40%/20%/26%/36%)
- Customer (11%/12%/13%/16%)
From these numbers I gather that the most frequent liars are management, from the project manager on up, and that most lying concerns cost/schedule and project status. Is anybody who knows the IT world from the inside surprised?
(A) There is always pressure on estimates and schedules to beat the competition and to please management (knowing that the difference between what the customer is willing to pay and what the developers are willing to commit to is income). (B) Later in the project there is pressure to hide cost or schedule overruns, a subject closely linked to status reports.
What has all that to do with fighting software entropy? There are three basic screws you can turn to adjust project parameters: scope, cost/time (the number of people, aka “resources”, is part of this aspect), and quality.
Scope can rarely be discussed. Certainly not at the beginning, because nobody wants to try to win a bid by telling the customer “OK, we can do it, but only if you drop this, this, and that requirement”. Later in the project cutting scope is still awkward, because you can’t do it without discussing it with the customer, or without the customer noticing if you skip the discussion part.
Often there are hard constraints on cost and time at the beginning: you have to (or figure you have to) beat somebody’s price, and there is a hard date by which the project must be finished. Often enough cost and/or time slip later in the project, but that is always very visible to management and customer, and nobody likes it.
That all makes quality an easy target. The problem already starts with the definition and perception of software quality. For some, the absence of (noticeable) bugs is pretty much the only criterion. The next person cares about maintenance and extension, and therefore about programming style and architecture. But the absence of that kind of quality is rather abstract for management, the customer, and regrettably for many developers as well.
So what happens? In order to cut cost and development time, proper software design is all too often the first thing tossed overboard (and I have seen that happen while developers were cheering). Enhancing software quality seems to be an extra, some sort of beautification that can be lived without. And it can be left out, but not without very real consequences in the future (which is what this blog is all about).
Lack of awareness. Many project managers don’t know all that much about the presence, absence, or effects of architecture and programming style. How many projects do you know where the project lead, or somebody appointed for the task, regularly checks the source code for software quality? And if you know some: were those checks about formatting and naming conventions, or about architecture, coding philosophy, and programming style?
If management is not aware of software quality, they can’t pursue it, plan for it, educate people on it, foster it, advertise it to their own managers, or sell it to the customer. Software quality is worth something, and not everybody has it! Others (competitors?) may claim they do, but ask them what software quality means in their eyes, and educate yourself about what it should be and what the advantages are.
Low software quality does not poke you in the eye like bugs noticed by a customer. But there are a number of symptoms that are easy to spot even without ever looking at the code:
- It becomes more difficult from release to release to get features implemented (the cost and time per feature increase sharply).
- Reasonable features are being rejected by the developers because they know what trouble the implementation would cause.
- Developers tell you it makes no sense to take on more developers because they “don’t know the internals” (of course we all know that programming does not scale the way filling sandbags does, but it should not be impossible to add new programmers or replace some).
- The cost of even small changes becomes so large that even when all details can be explained, it simply doesn’t feel right, and it is embarrassing to tell the customer.
- Programmers strongly avoid or plainly refuse to do any work on certain parts of the application (knowing that any change to that cesspool of code will break things in unforeseeable ways).
- Programmers are leaving the project in droves.
While this may sound like the developers are responsible, quite often they are not: sufficient time spent on architecture and design cannot be expected when the only goal is “to have something deliverable by date x”. On the contrary, I have worked on projects where the programmers “ganged up” and “secretly” refactored code, sometimes even on their own time, because they felt they would never be granted the time to change something that had turned out to be awkward, an obstacle, or a constant pain in the neck.
Programmers must know about and code according to good programming standards, and management must allocate the resources for this. That includes selecting and/or educating programmers with the right skills, and factoring software quality into estimation, planning, implementation, and marketing.
Conflicting goals. I worked for a company where one unit was responsible for initial software development and another for operations, maintenance, and extension. I know this is not exactly an exotic situation. The problem was that nobody watching the project was high enough in the hierarchy to oversee both initial development and continued maintenance.
Good architecture costs some extra effort, no doubt about that. Good architecture, and the improved software quality that comes with it, often doesn’t get you that much of an advantage in the first release. The payoff comes later, when changes and additions are easier and the code is more understandable for new team members (just to name a few points).
In our case we simply could not get the customer to pay for the additional architectural effort. They even agreed that it would be better to invest more in a better architecture, but since they would not receive the benefits, they could not justify the expenditure. Of course there would have been overall savings, but no, the decisions were made by two different units with two different goals. And, of course, we got blamed again and again for high development costs as the software was extended over more than a dozen releases.
[This article very likely is going to be extended, and/ or broken up in smaller units]