Monday, May 7, 2012

Release Planning: Effort (1)

Now we move on to Effort in Release Planning.

For those who have not been following along, this is a continuation of a series that starts here.

***
Estimating the effort of the whole Release Plan is not quite what I want to do in this part.  What I want to do is estimate the relative effort of each User Story that we have so far.

So, imagine that we have 50 user stories representing (per gut check) roughly six months' worth of work. Ballpark.

Now we have to do two things:
* Establish a Definition of Done for the Team.
* Do Planning Poker, which estimates a "story point" number for each user story.
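As an aside, here is a minimal sketch of the arithmetic that later turns relative story points plus a Team's velocity into a ballpark timeline. The story point values, velocity, and Sprint length below are made-up numbers of mine, not figures from this series; they only show the shape of the calculation.

```python
# Minimal sketch with made-up numbers: story points, velocity, and Sprint
# length are all assumptions, used only to illustrate the arithmetic.

story_points = [3, 5, 8, 2, 13, 5, 8, 3, 5, 8]  # relative estimates from Planning Poker
velocity_per_sprint = 20   # assumed points the Team finishes per Sprint
weeks_per_sprint = 2       # assumed Sprint length

total_points = sum(story_points)
sprints_needed = -(-total_points // velocity_per_sprint)  # ceiling division
weeks_needed = sprints_needed * weeks_per_sprint

print(f"Total: {total_points} points -> about {sprints_needed} Sprints (~{weeks_needed} weeks)")
```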

Definition of Done 

Stealing from Taiichi Ohno, I suggest we do this differently from the way I see many people do it.

What others do is write a list that describes what 'done, done' means once we get there.

Maybe they say:
* Coded
* Unit tested
* Basic documentation written and reviewed
* Functionally tested
* Small regression test
* No bugs (all identified bugs fixed)
* Product Owner Review (any issues fixed)
* No increased technical debt
* Promoted to the QA2 Server

What I would prefer is greater clarity about how we get to this state. Or even about whether we really can get to this state.

What Taiichi Ohno proposed is that we ask the workers to write down the process that they currently use.

Once the workers do that, they themselves can see that it has weaknesses. And once the process is (more) visible, then everyone can help improve it. But especially the workers themselves will improve it. And so, they start to 'own' the process.  Which makes for better motivation. And better results.

Some in agile are concerned. Their concern is that, by writing down the process, we have set the process in stone. And have turned people into machines.

And that concern is actually the opposite of what we are doing.  We are making the process visible so that it can be improved.  So that it can be changed.  NOT so it will remain the same.  Now, by making the process visible, we do enable anyone who sees the process to open his mouth. And if the Team is not strong enough, then a 'bad' manager could try to force them into his process. But that scenario seems unduly negative in the general case.
 
Let's make our suggestion more concrete.

[Insert picture of sample DOD.]

So, before or during the Sprint Planning Meeting, the team can do many things to 'get the stories ready, ready'. But what we want to estimate in story points are the things that happen during the Sprint.

We recommend that the first thing done in the Sprint is a conversation about Story 1. The conversation, in classic form, is between the PO, the Coder, and the Tester. (Technically, it is among all the people with the skill sets needed to get that story done, done.)  In this (short) conversation, all three people try to ensure they are on the same page about the story.

Then we list everything that the Team says it can and will do to get Story 1 to done in the Sprint.

In my opinion, the last step (or nearly the last) must be PO Review, meaning that the PO looks at the working product and gives feedback. If the PO feels the customer will not like it, he gives that feedback and the implementers must fix it within the Sprint.

And any work done after the Sprint to make that story into something that can go into the live production product is listed as well. Below the line. And it represents, in my opinion, all the bad news getting better with age. But we have to accept that we can't always get to "live, in production, in use by the customer" within the Sprint.
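To make the 'above the line / below the line' split concrete in another form, here is a small illustrative sketch. The items and grouping are my own assumptions for illustration only; they are not the sample DOD picture referenced above.

```python
# Illustrative only: a hypothetical DOD, with in-Sprint work "above the line"
# and post-Sprint work "below the line". The items are assumptions, not the
# author's sample DOD.

definition_of_done = {
    "above the line (within the Sprint)": [
        "Conversation: PO, Coder, Tester get on the same page about the story",
        "Coded",
        "Unit tested",
        "Functionally tested",
        "PO Review, with any feedback fixed in the Sprint",
    ],
    "below the line (after the Sprint)": [
        "Full regression test",
        "Promotion to the live production product",
    ],
}

for section, items in definition_of_done.items():
    print(section)
    for item in items:
        print(f"  - {item}")
```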

We think this approach to the DOD gives much more clarity and transparency.

And starts the Team on the road to becoming more professional. And enables the Team to improve their own process.

***
In the next post, we will go into Planning Poker.


