- How can we be sure we don’t miss anything important when testing?
- If we don’t have scripts, what can we use to train new joiners?
- What if the assigned tester goes off sick?
- How can we convince the project team it’s an acceptable method?
- How can we provide evidence for an audit if we have no scripts?
I’ll post this blog in two parts: this post focuses on the issues raised by testers, and part 2 will consider the concerns from a management point of view.
When I first tried exploratory testing, some of my team really struggled with the concept of no scripts. Personally, I found writing test scripts a chore, but I came to realise that for a lot of testers, writing scripts is an essential stage in developing their understanding of the system. They were not comfortable letting go of a process they had used for years without having something structured to replace it. I have since learned to explain that time not spent writing detailed test scripts can instead be spent actively learning about the system, assessing the risks and forming a test approach.

Wherever possible, talk with others to develop your test ideas. We use a combination of brainstorming with heuristic checklists and mind mapping to capture the results. This detailed analysis session (also known as a survey or intake session) builds a shared understanding of the system and gives everyone the chance to check that understanding with other people. Once the system and risks are better understood, the test design approach is also discussed. This is proving more efficient than designing tests on a solitary basis and then asking for the design ideas (scripts) to be reviewed afterwards; it is a more proactive form of peer review, and one which does not involve rework.
Having gone through these sessions, testers are more confident they know what is important to test. I should mention at this point: no scripts does not equal no test documentation. If something is important to capture, then capture it! Consider using a test matrix or another form of documentation which requires less maintenance than a detailed script.
Without fail, in each class people say they need test scripts so new joiners can get up to speed. So I ask how they feel when joining a team and being told to sit on their own and learn the system by running the regression pack. The answer is unanimous: they hate it! Perhaps we should make joining a new team an exciting part of the job instead of the most boring? There is an inevitable amount of reading to be done when joining a new team, but I’d argue test scripts are not an effective learning tool and may actually cause a tester to switch off their brain.
One effective method is to capture the essential project information (domain knowledge, technical aspects, testing considerations etc) into a wiki. Some of the time saved by not writing scripts can be invested into updating the wiki, capturing the salient points into a concise document which is readable and available to anyone in the project who may be interested, not just the test team.
When a new joiner starts, they should begin by reading the wiki… but to make it more interactive, ask them to review it for accuracy and be responsible for making updates and improvements to the knowledge base. Would you be more engaged in a team where you’re asked for your contribution from the start, or one where you’re left on your own and told to follow instructions? Once basic knowledge is gathered from the wiki, allow your new joiner to explore the system, and engage them in pair testing or shadowing with an experienced tester. Encourage them to ask questions, to suggest ideas for test design, and to challenge yours. Without fail, when we’ve done this, the new joiner has helped the experienced tester look at the system in a new way too. This also sets the expectation for the new joiner… “This is not a team where you can sit back and be spoon-fed instructions. We expect you to engage and contribute intellectually from day one, regardless of your experience”. If that’s not the kind of team your new joiner wants to be in, it’s best for all to find that out sooner rather than later!
By using the techniques above, I’ve seen testers gain confidence in their ability to test and no longer rely on detailed test scripts. Time saved not writing detailed test scripts has been spent collaborating and learning about the system, and has allowed testing to start sooner. I’ve seen stronger bonds amongst the team and improved morale as our testers become recognised as experts within their projects. This has been my experience; I’d love to hear about yours.
In part 2, I’ll talk about some of the risks raised by management when considering exploratory testing.