I’ve never really been much of a fan of test cases. I like to call the method of working through a list of tests one by one “checkbox testing”. I don’t like how it pushes you down a specific path and discourages you from doing what testers are good at: checking the quality of a product.

I always thought I was in the minority, and that my dislike of test cases made me, in some ways, a bad tester. I can write test cases, of course, and almost every job description I see requires experience in doing so. But I never understood this emphasis on something I didn’t think was a very good way of thoroughly checking the quality of a product.

So it was heartening to see the tweet thread, and this later blog post, by Michael Bolton, which includes this brilliant thought:

We also agreed that test cases often lead to goal displacement. Instead of a thorough investigation of the product, the goal morphs into “finish the test cases!” Managers are inclined to ask “How’s the testing going?” But they usually don’t mean that. Instead, they almost certainly mean “How’s the product doing?” But, it seems to me, testers often interpret “How’s the testing going?” as “Are you done those test cases?”, which ramps up the goal displacement.

I might have to bookmark this one as a well-thought-out argument outlining the problems that come with test cases.