How Agile Are You? (Take This 42 Point Test)
Recently I saw a brief set of questions from Nokia to assess whether or not a team is 'agile'.
And by 'agile', I think they meant the extent to which the team was following the agile practices of Scrum and XP (eXtreme Programming), not whether or not they could touch their toes :-)
I'm not sure if it was deliberately brief to emphasise the things that are the real *essence* of agile, but we've developed the questions into a more comprehensive set of statements, one that makes a fuller assessment of a team's adherence to agile principles and methods.
Here they are...
- The team is empowered to make decisions.
- The team is self-organising and does not rely on management to set and meet its goals.
- The team commits and takes responsibility for delivery and is prepared to help with any task that helps the team to achieve its goal.
- The team knows who the product owner is.
- Each sprint/iteration has a clear goal.
- All team members, including testers, are included in requirements workshops.
- Requirements documentation is barely sufficient and the team collaborates to clarify details as features are ready for development.
- Test cases are written up-front with the requirements/user story.
- There is a product backlog/feature list prioritised by business value.
- The product backlog has estimates created by the team.
- The team knows what their velocity is.
- Velocity is used to gauge how many user stories should be included in each sprint/iteration.
- Sprints/iterations are timeboxed to four weeks or less.
- Sprint budget is calculated to determine how many product backlog items/features can be included in the sprint/iteration.
- The sprint/iteration ends on the agreed end date.
- All tasks on the sprint backlog are broken down to a size that is less than one day.
- Requirements are expressed as user stories and written on a card.
- The team estimates using points which indicate the relative size of each feature on the product backlog/feature list.
- The team generates burndown charts to track progress daily.
- Software is tested and working at the end of each sprint/iteration.
- The team is not disrupted during the sprint/iteration.
- Changes are integrated throughout the sprint/iteration.
- Automated unit testing is implemented where appropriate.
- There is an automated build and regression test.
- Stretch tasks are identified for inclusion in the sprint/iteration if it goes better than expected.
- The Product Owner is actively involved throughout each sprint.
- All code changes are reversible and it is possible to make a release at any time.
- Testing is integrated throughout the lifecycle and starts on delivery of the first feature.
- Impediments that hold up progress are raised, recorded on the whiteboard and resolved in a timely fashion.
- When someone says 'done', they mean DONE! (ie shippable).
- The team uses the whiteboard to provide clear visibility of progress and issues on a daily basis.
- The sprint/iteration goal(s) is clearly visible on the board.
- All user stories and tasks are displayed on the whiteboard for the duration of the sprint/iteration.
- Daily scrums happen at the same time every day – even if the scrum master isn’t present.
- The daily scrum is restricted to answering the standard 3 scrum questions and lasts no more than 15 minutes.
- There is a product demonstration/sprint review meeting at the end of each sprint/iteration.
- All team members, including testers and Product Owner, are included in the sprint/iteration review.
- The sprint/iteration review is attended by executive stakeholders.
- There is a sprint retrospective at the end of each sprint/iteration.
- Key metrics are reviewed and captured during each sprint retrospective.
- All team members, including testers, are included in the sprint retrospective meeting.
- Actions from the sprint retrospective have a positive impact on the next sprint/iteration.
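A couple of the statements above (knowing the team's velocity, and using it to gauge how many backlog items fit in a sprint) imply a simple calculation. Here is a minimal sketch of one common interpretation, greedily filling the sprint from a priority-ordered backlog; the story names and point values are hypothetical, not from the post:

```python
# Sketch: using velocity to decide how many product backlog items
# fit in the next sprint. Backlog is ordered by business value.

def plan_sprint(backlog, velocity):
    """Take stories from a priority-ordered backlog until the
    velocity budget is used up. Oversized items are skipped so a
    later, smaller item can still fit."""
    selected, budget = [], velocity
    for story, points in backlog:
        if points <= budget:
            selected.append(story)
            budget -= points
    return selected

# Hypothetical backlog: (story, estimate in points), highest value first.
backlog = [("login", 5), ("search", 8), ("export", 3), ("reports", 13)]
print(plan_sprint(backlog, velocity=16))  # ['login', 'search', 'export']
```

Some teams instead stop at the first story that doesn't fit, to avoid cherry-picking lower-value work; either way, the key idea is that yesterday's measured velocity, not wishful thinking, sets the sprint budget.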
Our approach to these statements is this:
* Ask every team member of an agile team (including the product owner, tester, manager, everyone) to review the statements honestly.
* Ask them to mark a statement with a 1 if - and only if - they believe the practice is consistent and could be audited. In other words, if someone were to turn up at any time and ask for evidence, could you confidently provide it? Otherwise score 0.
* Add up the 1's for each team member. Then average the score for the team.
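The scoring approach above is simple enough to sketch in a few lines; the member names and sample answers below are hypothetical, and only a handful of the 42 statements are shown for brevity:

```python
# Sketch of the scoring approach: each team member marks each statement
# 1 (consistent and auditable) or 0, individual totals are summed, then
# the totals are averaged across the team.

def agility_score(answers_by_member):
    """answers_by_member maps a member's name to their list of 0/1 answers.
    Returns per-member totals and the team average."""
    totals = {name: sum(answers) for name, answers in answers_by_member.items()}
    average = sum(totals.values()) / len(totals)
    return totals, average

answers = {
    "developer": [1, 1, 0, 1],  # 4 of the 42 statements, for illustration
    "tester":    [1, 0, 0, 1],
}
totals, average = agility_score(answers)
print(totals, average)  # {'developer': 3, 'tester': 2} 2.5
```

A team that has every practice consistently nailed, according to every member, averages the full 42.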
To what extent a team is really effective at all these points is another matter, of course.
But if a team has really got agile principles and practices consistently nailed, and according to every team member...
They score 42 of course! :-)
How agile are you?
22 January 2008 07:26
One of the main tenets of Scrum is that the adoption of agile practices is sensitive to context. Scoring based on items like "Requirements are expressed as user stories and written on a card" is a bit too specific and context-dependent to be used generally. What is wrong with not using user stories? What is wrong with not having them on cards? It might be that one organisation uses one-liners as requirements and Excel spreadsheets to track them, printing out progress reports from that. Would you deem this less agile? On what grounds?
22 January 2008 08:20
You make a fair point. If you're only using Scrum, then you are absolutely right.
However, as I implied in the introduction to this post, these statements are intended to assess to what extent a team is following the principles and practices of Scrum and XP.
Whereas Scrum is an agile management framework, XP focuses more on agile engineering and the approach to the actual development. User stories, the bit you question, are an XP practice, so if you're just using Scrum they wouldn't be relevant. Likewise with the other questions about XP.
When I use the word 'agile' here, I'm referring to agile in the context of a Scrum/XP approach, not all agile methodologies or interpretations of the word.
If you find any of the questions irrelevant in your situation, yet you still think it might be a useful dip-test, you could always adapt this list. Remember a key principle of Scrum is that it's adaptive :-)
Kelly.
22 January 2008 16:21
Hi Kelly,
Fantastic list – really is the answer to life, the universe, but not everything. There is little mention of the scrum master apart from – and I paraphrase – continuing without him/her. Have you thought of making a list of check points for the scrum master? Is the scrum master part of the team? In waterfall, projects tend to fall away without the influence of the typical Project Manager. In Scrum I think the budget could be wasted if the team drifts without that influence (and the team includes the DBA, testers, systems analyst, product marketing manager, business analyst, usability expert, QA engineer, graphic designer, and the list goes on…).
Do other methodologies use some sort of master or do they use project managers?
23 January 2008 16:09
Although the list doesn't talk specifically about the Scrum Master, there are two key aspects to their role:
1) ensure that the Scrum process is adhered to, or at least that any deviations are the decision of the team and appropriate for the team/organisation's situation. In this way the list should help a Scrum Master to do that, as it lists all the key elements of the process and the principles behind them.
2) resolve impediments that are delaying the team's progress. This aspect of the Scrum Master's role is covered specifically in the list.
Kelly.
25 January 2008 10:39
Basic question: why is it called agile?
At our company we work with three month sprints which only permit one release to live every three weeks.
This means that if something doesn't work when it goes live, it's another three weeks until you get the chance to (hopefully) fix it.
Of course, this would be reasonable if the test environments bore any resemblance to the live environments, but they don't. This means that things tested and passed on lower environments quite often fail to work when pushed to live.
It's also not uncommon for things to work with test data, but not with live data.
I'm working on some projects which are now nine weeks behind schedule for these reasons.
So why is it called "agile"?
25 January 2008 18:56
Why is it called agile? I would say because the scope/features are flexible and the timescale is fixed, whereas traditional development projects seek to control scope/features.
In many traditional development environments, doing a release every three weeks is very frequent, with typical projects lasting many months. It may not be agile compared with a newspaper environment, but it certainly is compared with many traditional project environments.
In my opinion, it shouldn't necessarily be the case that you can only release at the end of a sprint or iteration, particularly of course if the changes are critically important.
It is convenient and efficient to stick to a fixed release cycle, however, so being disciplined about release cycles, making exceptions only for genuinely important changes, benefits the team's productivity and therefore the product in the end.
The issue to address in the situation you describe is the underlying quality. In this case that's where the principles and practices of XP (eXtreme Programming), which is an agile engineering methodology, would help more than Scrum, which is a (complementary) agile management methodology.
Kelly.
25 January 2008 23:43
Kelly,
Nice work . . .
I've been investigating "personal agility." It's based on the ongoing problems of trying to implement or scale agile in traditional cultures.
I should confess that my expertise is not in software dev or project management but in applied team/peer leadership.
I just posted a brief four-item questionnaire you can use to assess your Personal Agility Quotient. See what you think:
http://www.christopheravery.com/blog/whats-your-personal-agility-quotient/
Thanks,
Christopher
10 August 2009 00:58
This is a great list. However, there is little mention about the Information Developers (Technical Writers) and user assistance. I would make a slight modification, "Software is tested, documented, and working at the end of each sprint/iteration."
21 October 2009 07:37
wow, this list doesn't look "agile" at all.
23 April 2010 17:21
Excellent post. But what was missing was the customer's (real user, product owner) CONSTANT USAGE and feedback. The PO can do (own) acceptance testing. If that isn't done, it is like old waterfall, right?
ET testing was missing too.