6 Steps to Validate JTBD with Surveys

by Scott Middleton, June 7th, 2020

You want to get quantitative validation of the problem your new product idea is solving as soon as possible. This is why early surveys to validate Jobs-to-be-Done with your customers are essential.

Understanding the nature of the problem, independent of your product or solution, will help you substantially improve the chances of your new product idea being successful. The better you can narrow in on the problem you solve, the better you can market your product and the better you can solve the problem.

Once you have a rough idea of your customer’s problems - we like to express these through Jobs-to-be-Done - it is worth validating these JTBD in a way that gives you confidence they’re real and not just your opinion, your boss’s opinion or the result of a few friendly interviews. The best way to do this is a survey.

This post looks at how to design and run an Early Jobs-to-be-Done Validation Survey. The steps you will take to design and run this type of survey are:

  1. Prepare
  2. Plan
  3. Design
  4. Test
  5. Run
  6. Analyse

Step 1: Prepare

Before you get started planning, designing and running your survey you need to get some information together. The information you need before you start is:

  1. Who your customers are - the more specific (segmented) the better, even if just a hypothesis.
  2. Your JTBD hypotheses - what you think your customer’s Jobs-to-be-Done are. These may already be partially validated through Early Customer Interviews.

Step 2: Plan

This is where you plan exactly who you are going to target with the survey, how you are going to reach them and, in broad terms, what you are going to ask them.

What to ask

With an Early Jobs-to-be-Done Validation Survey, what to ask is fairly straightforward: you are going to ask people about their Jobs-to-be-Done.

However, you do need to think about which Jobs-to-be-Done you are going to ask about. People have a limited attention span, and the more questions you ask, the higher your drop-off rate will become. So you may need to prioritise the JTBD and focus on the few jobs you believe to be most important. You may want to combine some or cut others out entirely.

Who are you going to target

You likely have an idea of who your customer is. But it is worth thinking further about whether your general persona is the best target for this survey, or whether there are more specific subsets.

The subsets you might want to think about are early adopters, certain demographics or who is most likely to have and value the problem you are focused on. You need to balance this with using the survey as an opportunity to also understand your customer in a broader sense (i.e. maybe only 5% of people care about the problem you’re solving, but maybe for that 5% it’s a massive problem).  

How are you going to reach them

You need to work out how you plan to get your survey to your customers and have them respond. This partly depends on how many people you want to reach and what type of people they are.

There are 6 broad approaches to reaching or finding people to respond to your survey:

  1. In-house lists - databases and email lists of customers, employees and suppliers within your organisation.
  2. Panels - paid groups of people with relevant backgrounds who participate in research on an ongoing basis. If you don’t already have one set up, and you’re only planning this one survey, this probably isn’t a fit.
  3. Website visitors - show a popup linking to the survey to visitors on your website.
  4. Survey tools - some survey tools, like Survey Monkey, will recruit respondents for you.
  5. Partners - there may be industry partners you can work with to access their lists (like a staff or customer list). This usually comes with some kind of exchange (which could just be sharing what you learn).
  6. Hitting the pavement - get outside the building or jump on the phone and get people to respond to the survey by getting in front of them. This more manual approach is time consuming but can work.

Step 3: Design

The design of the survey is the most interesting yet challenging part. It’s the core of what this whole activity is about. A well-designed survey sets you up for success; a poorly designed one brings back rubbish or, worse, results in poor decisions.

Survey design is a skill unto itself that won’t be done justice if we try to cover it here (my mother was a statistician; she’d be upset if I tried). Instead, you’ll find some links to resources from recognised institutions that will give you solid foundations in survey design. Then, in this section, you’ll find some example question formats to help you fast track validating your Jobs-to-be-Done.

Foundations in Survey Design links

For an understanding of survey design grounded in statistics, research and tried-and-tested methods, the best place to start is with guides from institutions recognised for the quality of their survey-based research.

Here are two starting points for you: the survey design guides published by the Australian Bureau of Statistics and by Harvard.

Reading these will give you the foundations to complete your own survey design, or to continue your research into the areas of survey design most relevant to you.

Example Jobs-to-be-Done

Here are some examples to help you fast track your Early Jobs-to-be-Done Validation Survey. 

Each example question lists the type of data being sought (e.g. Behaviour, Opinion or Fact), the answer format and some example answers. You’ll also want to add some Classification/Demographic questions to your survey.

The answer formats used below are:

  1. Checklist: User selects multiple items
  2. Multiple choice: User selects one item from multiple items
  3. Open: User writes whatever they want
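
If you want to keep your question bank somewhere structured before loading it into a survey tool, here is a minimal sketch of how the format above could be captured in code. The question wording, answer options and field names are hypothetical placeholders, not questions from this article.

```python
# A minimal sketch of capturing the question format described above
# (type of data, answer format, example answers) as plain data.
# Everything below is a hypothetical placeholder - substitute your own
# JTBD hypotheses and answer options.

from dataclasses import dataclass, field
from typing import List

@dataclass
class SurveyQuestion:
    text: str            # the question as the respondent will read it
    data_type: str       # "Behaviour", "Opinion" or "Fact"
    answer_format: str   # "Checklist", "Multiple choice" or "Open"
    options: List[str] = field(default_factory=list)  # left empty for Open questions

questions = [
    SurveyQuestion(
        text="How do you currently keep track of project deadlines?",  # hypothetical JTBD question
        data_type="Behaviour",
        answer_format="Checklist",
        options=["Spreadsheet", "Dedicated tool", "Email or calendar", "I don't"],
    ),
    SurveyQuestion(
        text="How many years have you been in your industry/role?",  # classification question
        data_type="Fact",
        answer_format="Multiple choice",
        options=["Under 1 year", "1-2 years", "3-5 years", "5-10 years", "10+ years"],
    ),
]

for q in questions:
    print(f"[{q.data_type} / {q.answer_format}] {q.text} {q.options}")
```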

Surveying for a single Job-to-be-Done

If you’re trying to understand a single Job-to-be-Done:

Surveying for multiple Jobs-to-be-Done

If you’re trying to understand more about the customer’s behaviour overall and their context:

Classification Examples

Classification helps you understand more about the people behind the responses. The most common classification questions are demographics, but demographics may not be the most useful for every product.

Here are some examples of classification questions that you may want to consider for your JTBD survey:

  • Years of role or industry experience - “How many years have you been in your industry/role?”. Multiple choice with some brackets might help you (e.g. Under 1 year, 1-2 years, 3-5 years, 5-10 years, 10+ years)
  • Education - “What is the highest level of education you have completed?”
  • Role - this can be open or multiple choice

Step 4: Test

Before you send your survey to a large number of people you need to conduct some tests. These tests may send you back to the design step, but at least you’ve caught the problems early, before incurring the costs and energy involved in running the survey itself.

The goal of testing, for our purposes here, is to make sure the questions you’ve asked will deliver the outcomes you’re looking for. Amongst other things, you want to make sure the people you are targeting understand the words you’ve used, are able to respond in a timely manner and don’t get stuck.

If you think you need additional rigour then you may want to turn back to the Australian Bureau of Statistics or Harvard for ways to increase the rigour of your testing.

Generally, for Early Jobs-to-be-Done Validation Surveys it is sufficient to send your survey to some friendlies and some colleagues. You may also want to sit next to some people as they try to complete it to see where they get stuck, how they react or where they have to think harder than you expected.

Step 5: Run

Now you’re ready to get responses to your survey.

The two steps here are:

  1. Putting together the list of respondents 
  2. Sending the survey to them

Putting together the list

In the planning step you identified where you were going to get your respondents from. You now need to get that list together and in a form you can use to send the survey. 

This can be almost instantaneous, taken care of by the method you’ve chosen (e.g. your panel provider may organise it), or it can be a time-consuming process requiring research.
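
If your list is coming from a few in-house exports, a small script can save some manual work. The sketch below assumes hypothetical CSV exports with an email column; adapt the file and column names to whatever your systems actually produce.

```python
# A rough sketch of pulling a respondent list together from in-house CSV
# exports, de-duplicating on email address so nobody is surveyed twice.
# File names and the "email" column are assumptions, not a fixed format.

import csv

SOURCE_FILES = ["customers.csv", "newsletter_subscribers.csv"]  # hypothetical exports

def load_emails(path):
    """Yield lower-cased, non-empty email addresses from a CSV with an 'email' column."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            email = (row.get("email") or "").strip().lower()
            if email:
                yield email

unique_emails = set()
for path in SOURCE_FILES:
    unique_emails.update(load_emails(path))

# Write a single de-duplicated list you can import into your survey tool.
with open("respondents.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["email"])
    for email in sorted(unique_emails):
        writer.writerow([email])

print(f"{len(unique_emails)} unique respondents written to respondents.csv")
```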

Send the survey

Now you need to send the survey to your list. 

If you’re lucky, this step of the process will be short. You load your questions into your tool of choice and hit send. The tools we’re currently using are Survey Monkey, Typeform and Google Forms.

Otherwise, you or your team may need to start manually working through the list, printing and sending letters or doing something else to get your survey out there and collect the results.

Step 6: Analyse

Now that you have your survey responses, you can perform your analysis.

You are ultimately looking to validate or invalidate the Jobs-to-be-Done that you’re looking at. You may get some black and white data but most likely you’ll get shades of grey. Either way, you’ve hopefully made some advances with how you think about your customer, their problem and their Jobs-to-be-Done.

Importantly, your analysis needs to focus exclusively on the customer, their problem and Jobs-to-be-Done. Try not to let your solution or product creep into your analysis and bias your thinking. Leave that for after this survey activity is complete. Focus your analysis on the data itself.

The analysis of data is a topic for another day. However, there are some considerations at this stage that regularly come into play with Early Jobs-to-be-Done Validation Surveys:

  • Consider the percentage difference between results. For example, if 75 of 100 people do activity A and 60 do activity B, that’s a gap of 15 percentage points, or 25% more people doing A than B.
  • Be careful drawing conclusions from small numbers. This often comes about when looking at relationships between two responses. For example, in a fake survey of 100 people, 10 people like fish and 9 of those also like rice. You may be tempted to say that 90% of people who like fish also like rice, but you need to be cautious that you’re only talking about 9-10 people. If it had actually been 11 people who said they liked fish, your figure would change by almost 10 percentage points (see the sketch after this list).
  • Consider the biases in your sample. Now that you have your results, you’ll probably have a better view on who actually responded (versus who you planned). Consider this along with the biases it introduces to your data.
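
To make the small-numbers point concrete, here is a rough sketch (plain Python, no particular survey tool assumed) that puts a 95% confidence interval around the fish-and-rice example above. The numbers are the hypothetical ones from the bullet, not real survey data.

```python
# Wilson score intervals show how wide the uncertainty is when a
# proportion comes from only a handful of respondents.

import math

def wilson_interval(successes, n, z=1.96):
    """Approximate 95% Wilson score interval for a proportion (z=1.96)."""
    if n == 0:
        return (0.0, 0.0)
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    margin = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (centre - margin, centre + margin)

# The bullet's example: 10 of 100 respondents like fish, 9 of those also like rice.
low, high = wilson_interval(9, 10)
print(f"Point estimate 90%, but the 95% interval is roughly {low:.0%} to {high:.0%}")

# A larger sample tightens the interval considerably.
low, high = wilson_interval(90, 100)
print(f"With 90 of 100 respondents: roughly {low:.0%} to {high:.0%}")
```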