Early in my career, I was called on to manage a significant development project. The team I had to manage was both skilled and wise enough to know when their skills weren’t adequate for the challenge at hand. We were well-resourced, so we could, and did, draw on external expertise where needed. We successfully developed a product that has gone on to do very well. My immediate manager and other senior management were pleased with the results and my performance. When I looked back over it myself, I was happy overall with my performance and how the team had done.
But.
There was one thing we did that didn’t match the rest. We put a lot of work into a consumer evaluation study, using questionnaires sent to potential customers, and it could not be described as a good job. And when I say not a good job, I mean we would have reached our goal quicker by not doing it at all. The only good thing I can say is that at least we ignored the results. The data we generated not only failed to help with any of the decisions we had to make, but if anything, it would have led us in the wrong direction had we followed it.
Luckily, nobody outside the team noticed. I was thankful and a bit sheepish about this at the time. We didn’t know what we were doing and should have got some expertise in. But now, many years later, after reading a great many not very good consumer studies, I realise that it is a subject that very few people who don’t do it as a full-time job understand. It is also a subject I get a lot of questions on. Many projects need some input from the target market but don’t have the budget to employ a professional.
I can’t claim to be an expert on designing great studies, and if your pockets are deep enough, I’d recommend talking to a company that does this kind of work regularly. But I can give you some tips on how to waste time when doing them yourself.
1. Make Questions Too Complicated
One of the most common mistakes in consumer surveys is crafting overly complex questions. Often, this happens because the survey creator wants to cover all bases and gather detailed data. You know your product – it is easy to imagine your consumer does too.
However, if your respondents have to read a question multiple times to understand it, you’re likely to end up with inaccurate data or, even more likely, no data as they give up.
Keep it simple! Use clear, concise language and avoid industry jargon that might confuse participants.
2. Ignore the Importance of Relevance
Ask questions about everything you can think of, and if possible, ask the same question in different ways. Did it clean your hair? Did it leave your hair feeling soft? Did it make your hair shiny? Was your hair less brittle? How do you rate, on a scale of 1-5, how brittle your hair was? Make sure you ask about their gender, age, location, occupation, and what their skin type and hair type are. While you’re at it, why not ask for their PIN?
Every question in your survey should serve a clear purpose. Irrelevant questions often sneak in because survey designers want to collect as much information as possible, just in case it might be useful later. However, this can frustrate respondents and lead to drop-offs. Focus on what really matters for your product evaluation—such as product effectiveness and overall satisfaction. If a question doesn’t directly contribute to your evaluation goals, it’s best to leave it out. Gender might be important for, say, a lipstick—you might discover that your market isn’t what you thought it was. Does it matter for a moisturiser? Unless you are going to use the data for something, it is better not to trouble the respondent with a question.
3. Don’t Test Your Survey
You are too busy to check things out. Just send it out and get the data back.
Skipping a pilot test is a great way of acquiring unusable data. This mistake often occurs because of time constraints or overconfidence in the survey design. However, a small test run can reveal unforeseen issues, such as confusing questions or technical glitches. Gather a small group from your target demographic to complete the survey and provide feedback. This step can save you from larger headaches down the line.
4. Make the Survey Long
Long questions, and plenty of them, are a popular way of wasting not only your own time but also that of everyone involved in filling in the form.
Lengthy surveys are a surefire way to lose your audience. The desire to gather comprehensive data often leads to overly long surveys. Aim to keep your survey as brief as possible while still gathering essential information. A good rule of thumb is that it should take no longer than ten minutes to complete. If you find your survey running long, consider which questions are absolutely necessary and trim the rest.
5. Don’t Think About Data Analysis Until You’ve Got Some Data
Just send out the questionnaire. You can sort out the analysis later, or better still, get a boffin to do so instead.
Collecting data should only be done when you have a clear idea of what you want to learn. Failing to analyse it properly can waste all your efforts. This oversight often happens because the sheer volume of data can be overwhelming, or there might be a lack of expertise in data analysis. Use statistical tools to interpret the data and look for trends and patterns if they help answer the questions you are trying to answer. But don’t use them for the sake of it. If you want to know if most people like your product, it really is simply a matter of totting up the numbers. If you are comparing two different levels of a moisturising agent, you might need something like a paired t-test. Decide in advance.

Incidentally, a very common question I get is how many subjects you need to test a product on to make a claim. Sometimes the questioner seems to assume that there is a rulebook somewhere that lays it down. The reality is that the only rule is that you aren’t allowed to lie. If you want to claim something, you have to design a study that demonstrates that claim, and the number of respondents you need depends on the design you use. If you want to claim that you’ve tried it on three people and they all liked it, there is nothing to stop you from doing so. You won’t sell much on the back of such a claim, but it is honest.
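To make the “decide in advance” point concrete, here is a minimal sketch of the paired t-test calculation in plain Python. The 1-5 ratings for the two moisturiser levels are invented for illustration, and the critical value is hard-coded for this sample size; in real work you would let a statistics package handle the distribution lookup.

```python
import statistics

def paired_t_statistic(sample_a, sample_b):
    """t statistic for a paired t-test on two matched samples."""
    diffs = [b - a for a, b in zip(sample_a, sample_b)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)  # sample standard deviation of the differences
    return mean_d / (sd_d / n ** 0.5)

# Hypothetical 1-5 ratings from ten panellists who tried both levels
low_level  = [3, 4, 3, 2, 4, 3, 3, 4, 2, 3]
high_level = [4, 4, 5, 3, 4, 4, 3, 5, 3, 4]

t = paired_t_statistic(low_level, high_level)

# Two-tailed 5% critical value for 9 degrees of freedom is about 2.262;
# a |t| above that suggests the two levels really do differ.
print(round(t, 2), abs(t) > 2.262)
```

The point is not the arithmetic, which any spreadsheet can do, but that the test, the scale, and the panel size were all chosen before the questionnaires went out.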
Incidentally, one mistake that is made more often than I would expect is failing to ask for testimonials. Asking people to say in everyday language what they think about a product is just as valid data as anything that can be plotted with a spreadsheet. In some ways, it is more valuable, especially for formulators, because it gives you insight into what the user thinks is important about the product. This can be very different from what you think is important. Also, you can use them in your advertising—so long as you keep them on file somewhere.
Avoiding these common pitfalls can significantly improve the quality of your consumer product evaluations. By understanding the motivations behind these mistakes, keeping questions clear and relevant, testing your survey, keeping it short, and thoroughly analysing the data, you can gather valuable insights that will enhance your future formulating and marketing efforts.