Friday 19 October 2012

The Perils of the Large Backlog

This post is the result of some thinking triggered by a recent quote from Dan North: "How can you respond to change when you have 600 stories in your backlog?"

The point seems obvious, yet it is so often overlooked. We know that throughout the course of a project new requirements will emerge and existing requirements will change. As an Agile team we want to be able to respond to that change effectively and, as the Agile Manifesto tells us, be positioned to harness that change for the customer's competitive advantage.

That said, the first thing we see many project teams doing in their quest to be Agile is creating massively long product backlogs: backlogs containing so many stories that the team immediately find themselves rooted in a level of detail which directly inhibits their responsiveness to change. I refer to 'level of detail' in two contexts:

- Low level (overly granular short-term requirements)
- Long term (overly vague and non-relevant longer-term requirements)

Both contribute to the problem by creating a backlog which is so low level and long term that any requirements change requires a complete review, restructure and rewrite of the backlog. This is obviously a massive overhead and will probably lead to a team actually trying to avoid change, never mind trying to harness it. It isn't Agile and in turn the customer/business will struggle to be agile in their operation.

We can even go a step further and use queuing theory to reflect on the effect that long backlogs have on our Agile process. Not only does a long backlog affect our ability to respond to change, it also affects our ability to deliver change in the first place. Little's Law, part of queuing theory, tells us that:

Size of Inventory = Processing Rate x Delivery Time

We can adapt this, as below, to see the impact that the Size of Inventory has on Delivery Time.

Delivery Time = Size of Inventory / Processing Rate

We can see from this that Delivery Time can be reduced by either a reduction in the Size of Inventory or an increase in the Processing Rate. Let's now place this in an Agile context:

Story Delivery Time = Size of Backlog / Story Throughput
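
As an illustrative aside (the function name and the weekly units are my own, purely a sketch), the rearranged formula can be expressed in a few lines of Python:

def story_delivery_time(backlog_size, story_throughput):
    """Little's Law rearranged for an Agile backlog.

    backlog_size: number of stories sitting in the backlog
    story_throughput: stories completed per week
    Returns the average Delivery Time in weeks.
    """
    return backlog_size / story_throughput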

Now let's apply some simple numbers:

100 stories in the backlog / 10 stories completed a week = 10 weeks for a story to be delivered

This 10-week Delivery Time is the combination of the time it takes for a story to get into the backlog, the time it takes for work to start on the story and the time it takes to complete work on the story.
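
Plugging the same numbers into the sketch above gives the same answer:

print(story_delivery_time(100, 10))  # 100 stories / 10 per week = 10.0 weeks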

There are two things that can be done here to reduce the Delivery Time. Firstly, we can reduce the time it takes to complete stories. There are many ways in which this could be done, e.g. better specification and engineering practices; those, however, are a subject for another post. The second way in which we can reduce Delivery Time is by reducing the number of stories in the backlog. With a smaller backlog we can start working on stories sooner, complete the work and deliver value for the business sooner.
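
Continuing the sketch above, the two levers can be compared side by side; the numbers are made up purely for illustration:

# Baseline from the example above: 100 stories, 10 completed per week.
print(story_delivery_time(100, 10))   # 10.0 weeks

# Lever 1: increase Story Throughput (e.g. better engineering practices).
print(story_delivery_time(100, 20))   # 5.0 weeks

# Lever 2: reduce the Size of Backlog.
print(story_delivery_time(50, 10))    # 5.0 weeks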

Now, in reality this may be slightly complicated by the fact that stories don't necessarily go through a backlog sequentially; indeed, some stories might end up not being done at all. Generally speaking though, queuing theory shows us that we can get items through a system more quickly if there are fewer items waiting to be completed in the first place. Lean thinking, Work in Progress limits and flow systems all provide us with practical ways in which to benefit from this understanding. Further examination of these, however, is again a subject for another post.

Future posts aside, the message from this post is a fairly simple one. Long backlogs inhibit our ability to respond to change and deliver value. They inhibit the agility of the project team and in turn inhibit the agility of the business. We should therefore strive to avoid them. Down with the massive backlog!

