Saturday, September 21, 2013

How management consultants will kill technology innovation

Recently a colleague forwarded me an article published by McKinsey & Company, with the subject line "FW: Improving application development". Ironically, after reading it, I was left with a stark vision of the future of technology with McKinsey consultants in the driver's seat. Spoiler: it would look something like an assembly line.

The article, titled "Enhancing the efficiency and effectiveness of application development", introduces a pedestrian concept: measure the input side of application development as a means to measure productivity on the output side. The approach attempts to quantify use cases with a point system and then uses those points to measure productivity. While there is nothing inherently problematic with the concept in the abstract, reading the article raised huge red flags.

According to the article:

"organizations often don’t have a robust way to gather and organize functional and technical requirements for application-development projects. Instead, they list requirements in what often amounts to little more than a loosely structured laundry list. Organizations may have used this laundry-list approach over a long period of time, and it thus may be deeply entrenched."

The diction suggests that requirements gathering is not robust and organized because the individuals or processes doing the gathering are inefficient. A major flaw in this line of thinking is that the organization itself often has an insufficient understanding of the root problems it faces and how to solve them. The word "entrenched" suggests irrationality, with the implication that something like a hostile takeover may be required to correct the problem. But what is the problem?


"organizations find it difficult to fully and accurately capture requirements and align them with the needs of their internal or external business clients. Their application-development projects tend to suffer the inefficiencies of shifting priorities, last-minute change requests, and dissatisfied business users. In our experience, these changes often amount to cost overruns of 30 to 100 percent... ...use cases provide a logical and structured way to organize the functional requirements of an application-development project. Each use case is a description of a scenario under which the user of an application interacts with that application. For example, a UC for an online-banking application might describe each of the steps that a bank customer follows to log into her account and check the balance of available funds, as well as the transactions involved when that application calls on a database to pull up the stored information."

According to the authors, the problem is that the requirements aren't logical or structured, and that lack of logic and structure is responsible for project overruns. But use cases have long been employed for documenting requirements during object-oriented analysis, using UML use case diagrams. That approach is structured, and a well-known benefit of use case diagrams is that they are readily understood by business people without technical backgrounds. So logic and structure are not what is missing. What is missing is convergence on an understanding of the root problem, a solution (or solutions), and an accurate prediction of the time and resources involved in implementing it. Project analyses are commonly logical and structured; overruns come from a changing understanding of the problem(s) or of the resources required to solve them.

Traditional waterfall design promoted large up-front analysis, assuming that understanding the problem and its costs was just a matter of sufficiently robust organization and structure. Modern methods start instead from the assumption that it is natural and expected for the solution and the estimates to change as you come to better understand what problem you are trying to address and the details of a solution. Initial predictions matter less because they are based on insufficient understanding. Modern methods incorporate this learning to deliver concrete working software iteratively, so that all along you produce something you can potentially use.

From the sidebar, it appears the authors don't understand these methods.

"...where the primary purpose of SPs is to allocate the workload across the team. ...However, because SPs are based solely on gut feel, they are too subjective or too easy to game to compare different development teams or even the performance of a single team over multiple periods.
Use-case points (UCPs) represent a sweet spot between FPs and SPs. UCPs are easier to calculate than FPs, provide a similar level of accuracy and objectivity, and require far less overhead. At the same time, UCPs provide significantly more accuracy and objectivity than SPs, without unduly adding overhead."
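
For readers unfamiliar with the technique: the article doesn't spell out its formula, but UCP methods descend from Gustav Karner's classic use-case-point calculation. Here is a minimal sketch of that calculation; the weights are Karner's published ones, while the counts and factor totals are entirely made-up example inputs.

```python
# Weights are from Gustav Karner's published use-case-point method;
# the counts and factor totals below are made-up example inputs.

# Actor weights: simple (system with a defined API) = 1,
# average (protocol or text interface) = 2, complex (human via GUI) = 3.
actor_counts = {"simple": 2, "average": 1, "complex": 3}
UAW = (1 * actor_counts["simple"]
       + 2 * actor_counts["average"]
       + 3 * actor_counts["complex"])  # unadjusted actor weight

# Use-case weights by transaction count: <=3 -> 5, 4-7 -> 10, >7 -> 15.
use_case_counts = {"simple": 6, "average": 8, "complex": 2}
UUCW = (5 * use_case_counts["simple"]
        + 10 * use_case_counts["average"]
        + 15 * use_case_counts["complex"])  # unadjusted use-case weight

UUCP = UAW + UUCW  # unadjusted use-case points

# Adjustment factors: weighted sums of 13 technical and 8 environmental
# ratings, each scored 0-5; we simply assume the totals here.
TCF = 0.6 + 0.01 * 30   # technical complexity factor
ECF = 1.4 - 0.03 * 17   # environmental complexity factor

UCP = UUCP * TCF * ECF
print(f"UCP = {UUCP} x {TCF:.2f} x {ECF:.2f} = {UCP:.1f}")

# Karner suggested roughly 20 staff-hours per UCP as a starting rate.
print(f"effort ~= {UCP * 20:.0f} staff-hours at 20 hours/UCP")
```

Notice that even here, the "objective" adjustment factors are built from subjective 0-5 ratings, so a degree of judgment is baked into the number.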

Story points aren't an absolute measure because, if you are being honest with yourself, your true ability to define and estimate work is not good enough. You are more likely to be able to identify relative complexities, and ultimately that's good enough to predict a velocity once you have established a baseline on real work with your team. Differences in experience levels, problems, tools, and techniques will drastically reduce the efficacy of an estimate taken from another company or team. But if you estimate your entire backlog of identified work (stories) in relative terms, then after measuring the team's velocity you can project delivery dates, as sketched below. Just remember that your understanding keeps changing as you learn more about your problem(s) and solution(s).
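
To make that concrete, here is a minimal sketch of projecting a delivery date from relative estimates and measured velocity. All the numbers are hypothetical, and the projection is only as good as the assumption that the backlog and the team stay put, which, as argued above, they won't.

```python
import math
from datetime import date, timedelta
from statistics import mean

# Hypothetical history: relative story points completed per two-week
# sprint by one specific team (illustrative numbers only).
completed_points_per_sprint = [21, 18, 24, 19]

# Velocity is the observed average throughput for *this* team. Points
# are relative to the team's own baseline, so velocity is meaningless
# as a comparison across teams or companies.
velocity = mean(completed_points_per_sprint)  # 20.5 points per sprint

remaining_backlog_points = 164  # sum of relative estimates on the backlog
sprint_length = timedelta(weeks=2)

# Round up: a partially used sprint still occupies the calendar.
sprints_remaining = math.ceil(remaining_backlog_points / velocity)
projected_delivery = date.today() + sprints_remaining * sprint_length

print(f"velocity ~ {velocity:.1f} points/sprint")
print(f"projected delivery ~ {projected_delivery} ({sprints_remaining} sprints out)")
```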

Trying to optimize productivity when you aren't ensuring that you regularly converge on the right problem(s) is misguided. Applying metrics built on poor assumptions, and using them to steer decisions without full awareness of their biases and limitations, undermines the whole point of building something: to deliver something valuable. But then, how do you measure value? Maybe we haven't done a good job of that one either.


There are people out there doing better, but not by mashing together old-school business thinking and antiquated software development practices. It is natural to fail to meet our expectations when they are unrealistic. Starting with an appreciation of how poor your initial understanding is, plus the ability to fail fast and pivot quickly as you learn, allows you to converge on something valuable. It might not be what you thought you needed, but it will be something of value. Repeatedly delivering valuable things, not hitting use-case-point estimates, will propel the world's next wild success story.
