Meditations on Rapid Software Testing

I should have written this a couple of weeks ago when I actually sat the course, but it was an incredibly hectic week (3 days on the course, 1 evening #BSTC Meetup, #BDDX on the Friday) and, as you might well imagine, there was a whole lot of information to take in. I reckon James probably only made his way through about a third of the actual slides for the RST content, and then there’s another 200 pages or so of appendix content as well. This is a good thing, though, since you leave the class feeling as though you’ve only scratched the surface of what is potentially a lifelong learning exercise. The longest journey starts with a single step, and all that…

I expected big things from the course, and I wasn’t disappointed. I’ve been following James’ work for the last couple of years, ever since I came to the conclusion that scripted manual testing was a waste of time and effort and began looking for a better way of doing things. During my search I came across the phrase “Exploratory Testing” and, well… you can guess the rest.

So now I’ve had a bit of time to reflect, what did I take away from 3 days with James Bach and some fellow testers?

Draw Stuff – Putting things down on paper encourages your brain to think about what you’re testing in different ways. Many of the problems encountered on software engineering projects come down to communicating what is actually going to be delivered. When we test, we build a model inside our heads of what the product under test looks like; if the model is wrong, our tests will be wrong (pattern recognition is used when first creating a mental model, and your pattern may not be correct). If we can communicate the model, via a diagram or some other method (e.g. a mind map or flowchart), our test model becomes clearer not only to the testers but to the whole team, allowing for more and higher-quality communication, the identification of problems, gaps and misunderstandings, and hopefully much improved development and testing.

I’m not very good at this. Drawing a diagram of what it is that I’m meant to be testing isn’t usually the first thing that springs to mind when I’m planning my work, although I do sketch out architecture and stuff like that quite often. I’m using mind maps a bit more now, but I still have a ways to go in this area.

Tell a Convincing Test Story – A large part of the software testing role is communication: of tests, results and issues. Being able to communicate what is planned, how it will be achieved, what problems arose and what the results were, clearly and convincingly, can only improve our effectiveness and elevate our credibility.

I’m not too bad in this area. Once I build up a bit of steam I can give a reasonable talk or presentation and engage an audience. Communication with my team tends to be pretty good too, though there’s room for improvement. Writing is an area I can definitely improve upon, so I’ll be trying to blog a bit more.

Gather Evidence – James uses video a lot in his testing. Using video timestamps in bug reports instead of steps to reproduce seems like a great idea!

Test Scripting – James isn’t against test scripting. He’s against bad test scripts, written ostensibly so that “anyone can read them.” That’s a mantra you’ll have heard often (I certainly did) in the “Fake Testing Industry.” If you’ve been there, you’ll know what I’m talking about. Test scripts should instead be written so that a person who meets a minimum set of requirements can read them, e.g. somebody who has had the prerequisite training for the product to be tested.

James had a bit of a rant about fake versus real testing at this point during the course. Come to think about it, he had a bit of a rant about something or other at quite a few points during the training, but he’s very good at it [ranting], so I’m not complaining!

Heuristics – and models were a major part of the training. Heuristics are the rules of thumb and experience-based judgements (“common sense”) that everybody applies to everyday situations. Because they’re informed by our subjective experiences, they can become biases if we aren’t aware of what our heuristics are. This is a tricky subject, and one I’m not sure I understand completely myself yet, but I think it boils down to examining the way I think, and what my mental models are, to ensure I’m not making ill-informed assumptions or leaps of faith.

And lastly, De-Focusing – James made a big deal of this, to the effect that we should [consciously] use focusing and de-focusing techniques during our testing: focusing when analysing a bug or behaviour, de-focusing when feeling confused or unsure what to do or where to go next. This is another area I feel I need to dig into some more, since it’s not something I use consciously, yet. I’m aware that I probably do it at an unconscious level, since I quite often find myself doing things to distract myself (talking to colleagues, checking my phone, looking on the internet etc.) when testing isn’t progressing the way I think it should. I think I just need to find a better way of managing this behaviour. Easier said than done though!

The above is just a small part of what I actually took away from the course. I’ll probably return to the subject in the future as I process what I’ve learnt some more.

If you’re interested in doing the course yourself, James is in Cambridge March 7th to 9th 2012 and taking bookings here.

- Simon

P.S. If you're interested in learning more about performance testing, check out my Performance Testing 101 course here.