Putting the *Analyst* into Test Analyst

For years, I've given Software Testers in my teams the official job title of Test Analyst, or something along those lines.
Yet (informally) I've always referred to them as Testers.
Only in more recent years - and especially since adopting Agile Software Development and User Stories - have I really discovered how to put the *Analyst* into Test Analyst.
I've written before about 'Why Agile Testers Should Be In At The Start'. Early involvement is clearly important in agile development, and it brings a number of real benefits.
With User Stories, we've gone a step further.
A Business Analyst, Product Manager, Product Owner and/or the team should identify the relevant User Stories for the Product Backlog. For the items selected from this feature list for the next Sprint, the requirements then need to be clarified and expanded on.
The requirements for each User Story should be discussed and clarified as a team. But, in my view, the Test Analyst is an ideal person to lead this discussion and write up the User Story cards.
Test Analysts tend to be very analytical in their nature. They tend to be good communicators. And, as we all know, they can think of scenarios that business people and developers never even dream of! :-)
What's more, when it comes to writing test cases on the back of the User Story card, they are obviously the ideal person to do this. Writing these test cases up-front, when the feature is being defined, helps to improve quality from the outset, as developers are more likely to write their code to pass the tests (because they know what they are).
But it also makes perfect sense to me for the person who's going to test a User Story to be the person who defined it.
Kelly.
Testers! Are You Uneasy About The Transition To Agile Testing? You Are Not Alone.
I've written before about agile testing, and how I feel the role of tester is possibly one of the hardest to transition to agile software development.
In so many ways, agile testing goes against what testers are traditionally taught. Some of the things that worked, and were important, in traditional projects are no longer relevant in agile projects. Some even conflict with key agile principles.
The testing process, the principles, and the deliverables a tester produces are all somewhat different in agile testing. That's what makes the transition hard, particularly for professional testers with years of training and experience in more traditional methods.
The skills may be the same, but the role of a tester changes in agile software development.
Nevertheless, the skills of a tester are an essential ingredient of any agile team. I believe strongly that a professional tester is a huge asset to any team, because we can kid ourselves, but we all really know that most developers can't test for toffee!
If you are uneasy about the transition to agile testing, or you're just new to it, I recommend watching this video (above). It's a TechTalk presentation at Google. It's about an hour long, so you'll have to put some time aside to watch it. Elisabeth Hendrickson from Quality Tree Software is an excellent speaker, and gives a real, practical insight into the transition to agile testing.
Most importantly, she does so from the perspective of a professional tester.
Kelly.
Agile Testing: The Changing Role of Testers
In my opinion, the most challenging role to adapt to agile development is the role of Tester.
That's because agile development contradicts so many things that many Testers have been taught as 'best practice'.
Testers might typically have gone through some recognised training such as ISEB certification, and ISEB testing qualifications do specifically acknowledge iterative-incremental development models, including agile methods.
However, many testers will have undertaken their training, or gained their experience, when the waterfall model was more prevalent, and consequently may have spent years practising the V-model. With the V-model, system testing correlates directly to the system's specification, and testing is conducted once the software is complete.
Put simply, a Tester's life in traditional development methods was reasonably straightforward. Give a Tester a spec and a finished piece of software, and they can check it works as specified.
Forget whether what was specified was really what the user wanted, or whether all the requirements were adequately captured; if it meets the spec, the quality is supposedly good.
With agile development methods, a tester's life is rather more complicated.
First of all, there are no big documents specifying every detail of the requirements and functionality to test against - only small pieces of documentation per feature, with further details captured verbally through collaboration.
Secondly, the software is tested early and throughout the lifecycle, while it is still being developed - in other words, a moving target.
Put like that, agile testing can be a real challenge.
Add to that the idea of writing test cases up-front, before the software is developed, so acceptance tests form part of the requirements analysis.
Add to that the idea that some tests will be automated at code level and implemented by developers.
Add to that a much greater emphasis on automated regression testing due to the fact that feature-level testing has been completed while the code was still being developed.
To cope with such demands, an agile tester's role must change.
Some of the differences I've observed are as follows:
• With User Stories, there's little difference between a requirements scenario and a test case. Some requirements are implied from the test cases.
• If test cases are written up-front, there's a fine line between requirements analysis and test analysis (see the sketch after this list).
• A Business Analyst may have special skills in workshop facilitation, interviewing, gathering business requirements, etc. But, regarding functional analysis, Tester and Analyst roles start to converge.
• If unit tests are automated, a tester needs to work with developers to ensure that the tests are complete and appropriate, and that all important scenarios have been identified.
• When they and others come to system-test the features later in the lifecycle, testers need to avoid duplicating tests that have already been adequately covered at unit level.
• Automated regression tests need to be written and maintained.
• The tester's involvement is increasingly important throughout the entire lifecycle of the development, not just in the latter stages of development.
• The role of tester is more aptly described as Test Analyst, and is more of a general QA role for the team, not necessarily the person who devises and executes all the tests.
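To make that convergence concrete, here's a minimal sketch in Python. The story, function names and numbers are all invented for illustration - the point is simply that tests written up-front read as the requirements scenario.

```python
import pytest

# Hypothetical User Story (invented for illustration):
# "As a shopper, I want 10% off orders over 50, so that bulk orders
# are rewarded." Each test below is both a test case and a
# requirements scenario: read the test names and you have the requirement.

def discounted_total(order_total):
    """The feature under test: 10% discount on orders over 50."""
    return order_total * 0.9 if order_total > 50 else order_total

def test_orders_over_threshold_get_ten_percent_discount():
    assert discounted_total(100) == 90

def test_orders_at_or_below_threshold_pay_full_price():
    assert discounted_total(50) == 50

def test_boundary_just_above_threshold_is_discounted():
    # The kind of boundary scenario a Test Analyst typically adds,
    # implied by the tests rather than spelled out in a document.
    assert discounted_total(50.01) == pytest.approx(45.009)
```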
"One Team"
One of the key principles of agile development, and particularly Scrum, is the concept of "One Team".
The Scrum team should include all key roles for the product, regardless of where they report, including Product Owner, Product Manager, Test Analyst, Developers, Business Analysts, and any others that might be appropriate, such as SEO, Creative, User Research, etc.
This is important for all roles, but especially for Testers, as they need to be aware of the original requirements, and any changes to them, or they can't really do their job effectively. Or certainly not to the level a professional tester expects of themselves, anyway.
For agile to work effectively for professional testers, Test Analysts need to be included in Sprint Planning, or Pre-Planning, wherever the requirements are discussed. And they need to be informed, either at Scrums or as they happen, about any clarifications or changes to the requirements.
Increasingly, with the use of User Stories, test cases will be defined up-front as part of the requirements gathering, and written on the back of the story card. This means that the traditional Tester role is starting to converge with the Analyst role, putting much greater emphasis on the Analyst part of many testers' job titles: Test Analyst.
Using an agile approach, collaboration between team members becomes a key principle. Without a full product specification, agile requirements are 'barely sufficient', so collaboration is essential.
It is imperative, therefore, that all Scrum team members - and especially Test Analysts - are included in all key aspects of your regular Scrum process. Irrespective of line management boundaries, which may well be different, the Scrum team must act as one.
Example of a User Story
I recently described User Stories and the composition of a User Story Card - Card, Conversation and Confirmation.
I'm not really sure if you would consider this example to be good, bad or indifferent - I guess it depends what you're used to - but here is an example nevertheless!
This is the front of the card.
The Card section describes the user story. The Conversation section provides more information about the feature.
Note the feature (for a user to log in to a web site) is small, so the story can be fairly well described on a small card.
Clearly it's not as detailed as a traditional specification, but annotating a visual representation of one small feature at a time makes it fairly self-explanatory for team members.
And I would certainly argue it's more easily digestible than a lengthy specification, especially for business colleagues.
Here is the back of the card:
The back of the card outlines the test cases for this feature - how it's going to be confirmed.
Whether or not these are the right scenarios, or cover all possible scenarios, isn't really the point of this example.
The point is that the test cases for this feature are written on the back of the card, in support of the information about the feature, and before the feature is developed.
Generally speaking, there is a very fine line between a requirements scenario and a test case, so it isn't necessary to capture too much detail on the front of the card and clutter it up. The card in its entirety represents the requirements for the feature; whether captured in the Conversation section or the Confirmation section.
Even the description of the user story in the Card section carries some important information. In this case, there is a pre-condition (the user must be registered) and a post-condition (the user can access subscriber-only content). All of the card must be read to get the whole story.
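As an illustration only - the actual scenarios on the card aren't reproduced here, and the function and variable names below are invented - this is roughly how the card's Confirmation scenarios, with their pre-condition and post-condition, might later be turned into automated checks:

```python
# Illustrative only: the real card's scenarios aren't reproduced here,
# and log_in / REGISTERED_USERS are invented names.

REGISTERED_USERS = {"alice": "s3cret"}  # pre-condition: the user is registered

def log_in(username, password):
    """Returns True if the user may access subscriber-only content."""
    return REGISTERED_USERS.get(username) == password

def test_registered_user_with_correct_password_gets_access():
    assert log_in("alice", "s3cret")  # post-condition satisfied

def test_registered_user_with_wrong_password_is_refused():
    assert not log_in("alice", "guess")

def test_unregistered_user_is_refused():
    assert not log_in("mallory", "anything")
```

The card itself stays in business language, of course; the code form is just what those same scenarios become once automated.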
Importantly, the User Story is expressed in business language, and in a micro, more easily digestible, information-packed format.
Agile Principle #9: Agile Testing Is Not For Dummies!
In agile development, testing is integrated throughout the lifecycle; the software is tested continuously throughout its development.
Agile development does not have a separate test phase as such. Developers are much more heavily engaged in testing, writing automated repeatable unit tests to validate their code.
Apart from being geared towards better quality software, this is also important to support the principle of small, iterative, incremental releases.
With automated repeatable unit tests, testing can be done as part of the build, ensuring that all features are working correctly each time the build is produced. And builds should be regular, at least daily, so integration is done as you go too.
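As a minimal sketch of what that build step might look like - assuming, purely for the sake of example, a Python project whose tests live under tests/ and that pytest is installed:

```python
# build_check.py - run the unit test suite as part of the (at least
# daily) build. The tests/ path and pytest are illustrative assumptions.
import subprocess
import sys

def main():
    # A non-zero exit code fails the build, so a feature broken today
    # is caught today, keeping the software releasable.
    result = subprocess.run([sys.executable, "-m", "pytest", "tests/"])
    sys.exit(result.returncode)

if __name__ == "__main__":
    main()
```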
The purpose of these principles is to keep the software in releasable condition throughout the development, so it can be shipped whenever it's appropriate.
The XP (eXtreme Programming) agile methodology goes further still. XP recommends test driven development, writing tests before writing the software.
But testing shouldn't only be done by developers throughout the development. There is still a very important role for professional testers because, as we all know, "developers can't test for toffee!" :-)
The role of a tester can change considerably in agile development, into a role more akin to quality assurance than purely testing. There are considerable advantages to having testers involved from the outset.
This is compounded further by the lightweight approach to requirements in agile development, and the emphasis on conversation and collaboration to clarify requirements, rather than the traditional approach of specifications and documentation.
Although requirements can be clarified in some detail in agile development (as long as they are done just-in-time and not all up-front), it is quite possible for this to result in some ambiguity and/or some cases where not all team members have the same understanding of the requirements.
So what does this mean for an agile tester? A common concern from testers moving to an agile development approach - particularly from those moving from a much more formal environment - is that they don't know precisely what they're testing for. They don't have a detailed spec to test against, so how can they possibly test it?
Even in a more traditional development environment, I always argued that testers could test that software meets a spec, and yet the product could still be poor quality - maybe because the requirement was poorly specified, or because it was clearly written but just not a very good idea in the first place! A spec does not necessarily make the product good!
In agile development, there's a belief that sometimes - maybe even often - these things are only really evident when the software can be seen running. Delivering small incremental releases, and measuring progress only by working software, brings that acid test forward: only when you see the software running can you really judge for sure whether or not it's good quality.
Agile testing therefore calls for more judgement from a tester: the application of more expertise about what's good and what's not, the ability to be more flexible, and the confidence to work more from your own knowledge of what good looks like. It's certainly not just a case of following a test script, making sure the software does what it says in the spec.
And for these reasons, agile testing is not for dummies!
For further discussion on this or other agile topics, go to the 'all about agile' forum at http://groups.google.com/group/allaboutagile.
For further reading about agile development principles, see below...
Developers Can’t Test For Toffee!
In the more traditional world of managing software development projects, it is widely acknowledged that developers can’t test for toffee!
Yet Agile Development methods increasingly seem to require or imply that all people in the project team should test, including developers.
So, first of all, why is it that developers can’t test? Are we to believe that these highly intelligent individuals somehow don’t have what it takes to test? Of course not.
The trouble is simply this: Good developers build their code to handle every scenario they can think of. If they’re really good developers, they write automated unit tests to test everything they’ve built. And if they’re really good developers, they even do this up-front so everything is designed and built to pass first time.
But still it’s not enough.
For developers only build (and therefore test) for the scenarios they can think of. And testers - partly due to their nature, and partly because they're not buried in the detail of implementing the code - are able to think up scenarios that the developer didn't. And that's the problem.
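A small invented example of that gap: the developer's test covers the scenario the code was designed for, while the tester asks about the one it wasn't.

```python
import pytest

def split_name(full_name):
    """Developer's code: split 'First Last' into a (first, last) pair."""
    first, last = full_name.split(" ", 1)
    return first, last

def test_simple_name():
    # The developer's scenario: the case the code was designed for.
    assert split_name("Ada Lovelace") == ("Ada", "Lovelace")

def test_single_word_name():
    # The tester's scenario: a single-word name was never catered for,
    # and the unpacking raises ValueError.
    with pytest.raises(ValueError):
        split_name("Madonna")
```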
So what do we do in Agile Development, where everyone is expected to test?
My answer is this:
Wherever possible, do include at least one dedicated tester within the Agile Development team. If the tester is not able to test everything personally because there are more developers than the tester can handle, have the tester take a special QA role, including the following activities:
- Identify the test strategy and test scenarios
- Ensure the appropriate test environments are in place and controlled
- Write the test cases/scripts – ideally up-front but on a just-in-time basis per feature
- Review the developers’ automated unit tests, to avoid re-testing the same things later and to QA the scope of the tests
- Execute some of the most important test scripts personally, particularly where there is higher complexity or risk, less clarity or where more attention to detail is required
- Coordinate the test efforts of others (including developers), so one person knows what scripts have been executed, what areas have been tested, their status at any time, and the issues logged against them
- Manage the bug log to ensure issues are logged clearly and prioritised consistently
- Liaise with customers and/or business users to organise acceptance testing and advise on approach
- Ensure that each developer's code is also tested by someone else, even if it's another developer
In my experience, there is real business value in having tester expertise for this quality management / quality assurance role, even in a situation where there aren’t enough testers to go round. In an Agile Development environment, the emphasis of a tester’s role is more on QA than it is on testing per se.
See also:
Why Agile Testers should be in at the start
10 Key Principles of Agile Development
Why Agile Testers Should Be In At The Start
In my experience, some people implement agile principles within the development team itself, but leave other key roles (for instance business users or testers) out of, or on the fringes of, the agile team.
Earlier in my blog I wrote that active user involvement is imperative in agile development for a wide variety of reasons. It's just as important for the agile team to include all the roles required to ensure that each feature is actually complete - really complete - at the end of each iteration or Sprint.
In particular it's a good idea to include Testers from the outset, and especially in planning, because:
- Testers tend to be exceptionally good at clarifying requirements and identifying alternative scenarios. Doing this thinking at the requirements clarification and planning stage improves the quality of estimates given for the work, and therefore increases the chances of successfully delivering in the timescales.
- When more is known about the test approach up-front, developers are more inclined to write their code to pass! (see my earlier post on test driven development).
- Testers that are involved in requirements clarification and planning first-hand will have a much better understanding of what's needed and how it's being implemented, enabling them to do a better job of the testing. In the absence of a lengthy specification, this is essential in agile development projects to allow a Tester to do their job effectively.
- And testing must be completed within each Sprint for any completed features to be really 100% "DONE!", i.e. production ready.
See also: 10 Key Principles of Agile Development
Testing Testing 123 (test driven development)
XP (eXtreme Programming) advocates Test Driven Development, where test cases are written before the code. Radical, huh?
If you think about it, it makes complete sense. Assuming you are planning to write test cases anyway, it’s no more effort than writing them later. And the big advantage of writing them first?
If you know how you’re going to test it, you write the code to pass first time! Simple really. Simple but inspired.
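A minimal test-first sketch (the function and the example are invented for illustration): the test is written before any implementation exists, and the code is then written just to make it pass.

```python
# Step 1: write the test first. Run it now and it fails, because
# is_leap_year doesn't exist yet.
def test_leap_year_rules():
    assert is_leap_year(2004) is True    # divisible by 4
    assert is_leap_year(1900) is False   # century year, not divisible by 400
    assert is_leap_year(2000) is True    # divisible by 400
    assert is_leap_year(2001) is False

# Step 2: write just enough code to make the test pass first time.
def is_leap_year(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
```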
See also:
10 Key Principles of Agile Development
Principle #9: Testing is integrated throughout the lifecycle