Thursday, September 01, 2011

The 10 Minute Test Plan

By James Whittaker

Anything in software development that takes ten minutes or less to perform is either trivial or not worth doing in the first place. If you take this rule of thumb at face value, where do you place test planning? Certainly it takes more than 10 minutes. In my capacity as Test Director at Google I presided over teams that wrote a large number of test plans, and every time I asked how long one would take I was told “tomorrow” or “the end of the week” and a few times, early in the day, I was promised one “by the end of the day.” So let’s establish the task of test planning to be of hours-to-days duration.

As to whether it is worth doing, well, that is another story entirely. Every time I look at any of the dozens of test plans my teams have written, I see dead test plans. Plans written, reviewed, referred to a few times and then cast aside as the project moves in directions not documented in the plan. This raises the question: if a plan isn’t worth updating, is it worth creating in the first place?

Other times a plan is discarded because it went into too much detail or too little; still others because it provided value only in starting a test effort and not in the ongoing work. Again, if this is the case, was the plan worth the cost of creating it given its limited and diminishing value?

Some test plans document simple truths that likely didn’t really need documenting at all or provide detailed information that isn’t relevant to the day to day job of a software tester. In all these cases we are wasting effort. Let’s face facts here: there is a problem with the process and content of test plans.

To combat this, I came up with a simple task for my teams: write a test plan in 10 minutes. The idea is simple, if test plans have any value at all then let’s get to that value as quickly as possible.

Given ten minutes, there is clearly no room for fluff. It is a time period so compressed that every second must be spent doing something useful or any hope you have of actually finishing the task is gone. This was the entire intent behind the exercise from my point of view: boil test planning down to only the essentials and cut all fat and fluff. Do only what is absolutely necessary and leave the details to the test executors as opposed to the test planners. If I wanted to end the practice of writing test plans that don’t stand the test of time, this seemed a worthwhile exercise.

However, I didn’t tell the people in the experiment any of this. I told them only: here is an app, create a test plan in 10 minutes or less. Remember that these people work for me and, technically, are paid to do as I tell them. And, again technically I am uniquely positioned to begin termination procedures with respect to their Google employment. On top of that I am presuming they have some measure of respect for me, which means they were likely convinced I actually thought they could do it. This was important to me. I wanted them to expect to succeed!

As preparation they could spend some time with the app in question and familiarize themselves with it. However, since many of the apps we used (Google Docs, App Engine, Talk Video, etc.) were tools they used every week, this time was short.

So here's how the task progressed:

They started, did some work, and when ten minutes passed I interrupted them. They said they weren't done yet. I responded by telling them they were out of time, nice try, here's a different problem to work on. Ten minutes later the same thing happened, and I changed the problem again. They began working faster and trying different angles; things that were too time consuming or not worth the effort got jettisoned really quickly!

In each case, the teams came up with techniques that helped speed things along. They chose to jot down lists and create grids rather than write long paragraphs of prose. Sentences … yes, paragraphs … no. They wasted little time on formatting and explanations and chose instead to document capabilities. Indeed, capabilities, or what the software actually does, were the one commonality of all the plans. Capabilities were the one thing all the teams gravitated toward as the most useful way to spend the little time they were given.

The three things that emerged as most important:

1. Attributes: the adverbs and adjectives that describe the high-level concepts testing is meant to ensure. Attributes such as fast, usable, secure, accessible and so forth.

2. Components: the nouns that define the major code chunks that comprise the product. These are the classes, module names and features of the application.

3. Capabilities: the verbs that describe user actions and activities.

None of the teams finished the experiment in the 10 minutes allotted. However, in 10 minutes they were all able to get through both the Attributes and Components (or things that served a similar purpose) and begin documenting Capabilities. At the end of an additional 20 minutes most of the experiments had a large enough set of Capabilities that it would have been a useful starting point for creating user stories or test cases.

Which, at least to me, made the experiment a success. I gave them 10 minutes and hoped for an hour. They had 80% of the work complete in 30 minutes. And really, isn’t 80% enough? We know full well that we are not going to test everything, so why document everything? We know full well that as we start testing, things (schedules, requirements, architecture, etc.) are going to change, so insisting on planning precision when nothing else holds to such a standard of completeness seems out of touch with reality.

80% complete in 30 minutes or less. Now that’s what I call a 10 minute test plan!


  1. Great article. I don't think I'm alone on this, but I'd love to see what Google QA uses for test case management... especially if it's an internally developed one. Is there any news about this?

  2. I'd have to say that this article is most insightful! I view myself as pragmatic and somewhat of a purist when it comes to testing processes and documentation.
    This is my take away:
    For US Federal IT test contractors/consultants, I think the "10 Minute Test Plan" brings to attention a perspective that we all probably were already aware of for some time now – most test plans (documents) are nothing but "paragraphs of prose". The concept of using a "time box" method forced the team members to really focus on the three areas of project management that mattered most: time, cost, and scope. Time – as limited as it was, it required finding creative and effective ways to cut corners that were never cut before. Cost – the fear of either losing your job or having your performance look less than par in comparison with coworkers. Scope – the need to focus on what was "really" important to convey to users of the test plan and ignore what was not.

    I’ve worked for a variety of Federal agencies and many have templates which contractors have to tailor and adhere to. While it would be near impossible to cull the fat from these test documents, perhaps the real lesson to learn is the necessity of focusing on ONLY what is important to document and ONLY document the things that will be continuously referenced throughout the testing life cycle.

    Sometimes less is more when less contains JUST the essentials.

  3. Great experiment, has it succeeded in freeing up your testers to actually do more testing than documentation?

  4. I agree, but the remaining 20% will take a few more hours/days to complete. It's the fine-precision things that take a lot of time: the complicated workflows that verify edge conditions and need detailed descriptions and long lists of expected results. Generating data for a security check, or configuring a system for testing, consumes the most time.

  5. " ...plans are useless but planning is indispensable" - Dwight D. Eisenhower

  6. Sounds really nice, but am I right to assume that it works only for "self-contained" projects, and not really for integration projects, where the final product is a collection of smaller projects? I mean, test plans for single projects are almost always a waste of time, and can be better represented by the code itself[1], but I have yet to find a good way to reduce the size of the test plan for integration projects, where several components are released simultaneously, and you should be prepared to test the scenarios you thought of in advance, even without knowledge of the implementation.

    [1] Of course, there are cases where a test plan certainly helps, but I usually see the test plan as a tool for test design, not something which should live forever.

  7. If you would share the plans and experiments, or explain how you determined that 80% was complete, this would be a much, much better post.

    80% of the task can take 20% of the time :)

  8. You asked some questions.
    really isn’t 80% enough?

    Most of the time maybe it is. If your life, business or your livelihood depended on it, probably not. For example, if your bank calculated your balance correctly 80% of the time, is that enough?

    We know full well that we are not going to test everything so why document everything?

    We are not even going to test everything we think we are going to test, even after we exclude some things. We always seem to run out of time. The reason to have a complete list of everything we'd like to test is so that when we report the results, we can report not only what we tested but also what we did not test. A test report exists to enable those who make the go/no-go decision to make a well-informed decision.

    But, taking my comments into consideration, this appears to be a useful exercise. Time boxing the planning to a very short amount of time is something I'll make part of my approach. If it generates 80% completion in less than an hour, it is well worth it.

  9. Jason: It's called Google Test Case Manager and will be mentioned/demoed at GTAC 2011.

    Rich: That is precisely the idea!

    James: I've heard this as "the value is in the process, not the artifact." Couldn't agree more.

    Juraci: Google doesn't have any self contained projects so I can't say. Everything we have is integrated.

    Julio: The appendix of How Google Tests Software will have a complete ACC test plan.

    RentonRebel: Calculating bank balances is easy, testing is not. Apples to Grenades dude.

  10. I'd say it takes me longer than 10 minutes to figure out what the product/feature is about.

    While I agree that #1 (Attributes) is really needed as part of a test plan, I claim that
    #2 (Components) and #3 (Capabilities) should already be there as part of the requirements.

    We do waste a lot of time "planning" items which could be automatically generated by an ALM – if we just evaluate sub-features, an easy calculation can give us a rough estimate of the test writing, execution and automation effort.
    That will leave us with lots of free time to really put effort into strategic planning.

    The reason most plans are not maintained is that it is simply too hard to do.
    Again, proper features in ALM tools can make it feasible, and reduce the time we spend "managing" papers and statistics for status meetings.

    halperinko - Kobi Halperin

  11. Again, this sounds really nice and I'll try to use it in the future, but I still fail to see how effective (or dangerous) this would be. If you have complex systems with complex interactions and you spend at most 10 minutes analyzing the relationships, thinking about which parts are critical to test and writing down your thoughts, you will certainly miss something important sooner or later.

    Unless the idea is to spend 10 minutes on the writing only; then I agree 100% :-)

  12. One way to think about a test plan is that the resulting document is not important, but the process of writing it is. By asking people to write a test plan, we force them to think about the feature to be tested.

    Another way is from the view of the test plan reviewer. As a reviewer, I like to know
    1) whether the tester understands the feature to be tested, such as the user scenarios and risk ideas;
    2) whether the tester has a reasonable testing strategy;
    so that I can get some feeling for the testing.

  13. With this experience behind you, could you tell whether complex tests were also produced during this phase?
    My fear is that, in most cases, only basic and similar kinds of flows would be produced...
    Also, do you have any indicators on the "productivity" of these test plans? Did the 80% find the interesting stuff you expected? The most important bugs?

  14. This is indeed a nice experiment. I agree with you that most of the contents of a test plan are nothing but copy-paste from the previous release. Instead of doing this copy-paste, one should rather focus on the actual or important things; then it makes much more sense. Good one.

  15. James,

    How does the 10 minute test plan fit in with the CFC (Component Feature Capability) analysis?


  16. Really liked the post! I do agree that we spend too much time on documentation like the test plan, and we could do a lot better with less.
    But we have a hard time changing this, since clients and some process vigilantes keep demanding more...

  17. I would like to ask one question: in which phase is the test plan created, and is the test strategy covered under the test plan or not?

  18. Nice post. I am wondering if you are actually using this approach in daily testing activities (or whenever you need a plan)? If yes, how often and how is that working?

  19. This is true and really happening in most product development companies. The thought is very insightful.

  20. Nice article. But I won't agree with the statement below.

    "Anything in software development that takes ten minutes or less to perform is either trivial or is not worth doing in the first place."

    Because we can do a lot of good things within 10 minutes.

  21. James Whittaker has followed up on this blog post with a video presentation on the 'ten minute test plan.' Catch it in full on EuroSTAR TV:

  22. In the test plan, describe only the system capabilities, the functionality to be tested, and the test schedule; that will be enough to document.

  23. I hate creating the document called Test Plan – because, as James W has mentioned, it becomes a dead document. In my case, even before test execution starts.

    So, I'm going to try this ACC to see whether it works. I've yet to come across a practically meaningful method/approach to derive a great set of test cases.

    Btw, I like this phrase in this article "...We know full well that as we start testing, things are going to change so insisting on planning precision when nothing else obeys such a calling for completeness seems out of touch with reality " !

  24. That's great James :)

    I really like this 10 minute test planning approach. I'm going to try it on my side too and look for the fastest, most test-covered output.

    I think we can also try your experiment on test case creation and their execution priority.

    Thanks for the great idea :)


  25. Nice! Love the idea! I think it's the most sensible thing I've seen that "upgrades" the thinking around test planning to a similar level as Agile did with highly collaborative planning meetings.

    The comments from people who are worried about "incompleteness" assume that adding more time for analysis will get them better test plans, which is true, but depending on how you facilitate this meeting (I'll get to that, and maybe this is what you've done), you may get 80% meat, skip a bunch of garbage (the premise of this article) and do it all fast. By reducing the garbage, you're reducing the cost of information maintenance so that more time is spent on effective work rather than BS.
    Also, I bet the effort of test plans follows the usual quality-versus-time curve, where increasing quality requires an asymptotic increase in time (getting beyond 80% test plan quality is very expensive and likely not worth it except in cases of life and death). So overall, the process "sounds right": get good enough quality in ten minutes so we can get back to our workstations, and reduce the inventory of useless information which drags at us daily. If done highly collaboratively – leveraging group think – I expect the results will often outstrip the old process of working alone. One important filter I will add is that the team (or a large percentage of them) must have some history with the project under test. Otherwise, the old process of working alone or outside the meeting will allow them to develop some competency. In that case they are doing more than creating test plans.

    Highly Collaborative Test Planning Meeting (leveraging group think, in situ review and plan creation)
    Get the group together, outline the information as Whittaker mentioned (ten-minute time box), and then ask the group to debrief for ten minutes, creating collated notes which become the test plan.

    During the ten minute time box, do it as a team brainstorm (properly conducted brainstorms are very efficient, and 10 minutes is a lot of time for a brainstorm; many people don't know the rules of brainstorming).
    During the 10 minute time box, the facilitator silently posts each team member's artifacts (one detail per sticky).
    For some apps, the 10 minute process will surface a *few* items that need further research because you don't have the right person in the room, or it's obscure, or it needs a PhD to research the topic (I'm serious here). So the ten minute process will allow you to discover the *few* things that will require days to weeks. (Whereas if the team members work alone, lack of communication will drive *most* of the items to require independent research.)

    Whittaker, glad you posted this!
    Lance Kind

  26. First I would like to say thank you for sharing such a time saving and cost saving concept. I watched a short video on this concept, and instantly felt relief. I'm now able to complete 25 test cases in one day by myself.

  27. That incomplete 20% may contain 80% of the bugs... just like the Pareto Principle. :)

  28. Our team is moving to Agile from waterfall and deciding whether we will drop written test plans so this article is timely. I think the 10-minute idea is great. Writing something down at least forces some focused thinking. At a minimum a bullet list of key test areas would provide a good list for reviewing the test effort with the developers and PMs.

  29. This concept is really cool. I've been trying to see if the concept could be used on our projects. I've a few questions though.. Do you write test plans at the application level? So for example, you'd have a test plan for Google Plus. How does it work with features within an application? For example a comment feature? Would you write this type of test plan for things like that? Or is that too granular?


  30. The 10 minute test plan was an untested theory proposed by James (who no longer works at Google) in 2011. I don't know of a single team in Google that uses this approach in practice.

    On this topic, I will be posting a Google Testing Blog article on a very different approach to authoring test plans in the next several weeks. Stay tuned...

