Four common mistakes when designing questionnaires

There are many things that can go wrong in the world of research. Bad questionnaire design doesn’t have to be one of them: it can easily be avoided. We have identified some of the most common mistakes.

Not designing for the right target group

Surprisingly often, questionnaires are not designed for the right target group, and there are a couple of ways this can go wrong.

Let’s start with the composition of the sample: if you want to understand the whole market for your product, it is not enough to ask only (frequent) users of that product. These people certainly have a good idea of how your product actually works, but they already have some degree of engagement with the brand. Their perspective is based on previous experiences with your brand and cannot be assumed for all consumers in the relevant segment. The people who already use your product will not help you understand the reasons and motivations of consumers who reject it, the hidden needs of consumers who don’t think your product is relevant, or how to reach potential consumers who are still completely unaware of your product or brand. The point is that most samples should contain more than the core constituency of the topic at hand; perhaps they should even be representative of the whole population. Otherwise, the feedback may be very one-sided.

Another common mistake is choosing the wrong sample size. If your data set is too small, it is very hard to draw dependable conclusions from it. This is especially true if you want to compare different subsets of your sample, e.g. brand awareness among readers of a certain magazine (in which your brand is promoted) versus brand awareness among non-readers. Every segment needs a decent sample size within the overall study if you don’t want to end up speculating about it when interpreting the data.
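
To put the subgroup point into rough numbers, here is a minimal sketch using the standard formula for the sample size needed to estimate a proportion within a given margin of error. The 95% confidence level, 5% margin of error, conservative p = 0.5 and the ‘readers’/‘non-readers’ labels are illustrative assumptions, not figures from this article.

```python
import math

def required_sample_size(margin_of_error: float = 0.05,
                         confidence_z: float = 1.96,
                         expected_proportion: float = 0.5) -> int:
    """Sample size needed to estimate a proportion within the given
    margin of error at ~95% confidence (z = 1.96). Using p = 0.5 gives
    the most conservative (largest) estimate."""
    n = (confidence_z ** 2) * expected_proportion * (1 - expected_proportion) \
        / margin_of_error ** 2
    return math.ceil(n)

# Hypothetical subgroups from the magazine example above.
for subgroup in ("readers", "non-readers"):
    print(subgroup, required_sample_size())  # -> 385 completes per subgroup
```

The key point the sketch makes is that the roughly 385 completes apply to each subgroup you want to compare, not to the sample as a whole.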

But even if you get the sampling right, that doesn’t mean your questionnaire is fully designed for your target group. If you want to hear the opinions of children or less-educated people, you need to pay special attention to language. You might even want to illustrate complex ideas and concepts with images to make them more comprehensible. If you’re targeting business decision-makers, in contrast, the tone and appearance of the questionnaire should be different. In short, your questionnaire has to resonate with your target group. This argument may be a bit on the qualitative side and requires some empathy; the following sections provide some more general principles for creating a good, respondent-centric user experience.

Not designing for a great user experience

Just as they neglect the tone of a survey, many people make the mistake of neglecting the user experience when setting up questionnaires. Our experience shows that you should always take a respondent-centric approach to survey design, in which the user experience is key.

If you fail to design your questionnaire for a great user experience, you will fail to engage your target group(s). Answering surveys can be tedious, so do what you can to make it interesting. Keep the questions short and simple. To keep the respondent engaged, it helps to vary the question types and to use interactive options such as sliders or image maps. If it suits the survey type, you may even use illustrations, videos or other elements to keep up the respondent’s interest and motivation.

And while the visual elements are really important, don’t forget to pay attention to the text elements as well; keep them as comprehensible as possible. The first few pages of a survey in particular decide whether the respondent loses interest or keeps going, so the first impression of the questionnaire really matters and you should make the beginning as engaging as possible. Avoid questions that require a lot of thought or reading at the start of the study, so that people don’t break off, and try not to open with long lists that require a lot of scrolling or with many repetitive questions. Start with some easy questions as a warm-up and, if possible, move the open questions further back; even better, make them optional. Also, make sure you do not force your respondents to take a stance if they don’t want to: always include a neutral middle position on sliders and a ‘don’t know’ option in multiple-choice questions.

Not designing for all devices

Mobile-friendly is just another word for user-friendly. ‘Mobile first’ should not be an empty phrase when it comes to surveys.

As many of us access the online world most often via a mobile device, this should really be a priority. If you do not design your surveys for mobile, you will probably not reach large parts of the population and will most likely exclude exactly the hard-to-reach target groups (such as young people). Just as important when optimising for mobile is the technical side: keep the particularities of mobile devices in mind, for example the fact that there is no hovering on touchscreens.

Just as important as the technical aspects are the conceptual aspects of the questionnaire. With mobile in mind, you have to watch the overall length of the survey, as attention spans go down on small screens. Another thing to consider is the length of individual questions: nobody likes endless scrolling, and, as mentioned above, you should keep questions as short and clear as possible. This of course also holds for all other devices on which the survey can be accessed.

Not designing for the human flaw

We shouldn’t treat opinions as if they were assured knowledge that respondents can retrieve simply by being asked.

You may treat demographic information this way (“What’s your age?” – “thirty-eight”), but opinions, feelings, sensations or tastes (“To what degree do you agree with this?”) are a different thing entirely. There’s nothing wrong with opinion polls in general, but we have to be careful about taking answers too literally. Most responses should be treated as indications of vague dispositions that gradually gain conscious realisation during the research process, rather than as ready-made facts.

If you approach questionnaires with this in mind, you can avoid taking everything at face value. It will also help you account for the very human tendency to answer surveys in the way respondents think is desirable. This, in turn, may lead to a discrepancy between what respondents say they would do in surveys and what they actually do in their daily lives.

However, calling respondents liars just because their explicit answers to a questionnaire don’t match their behaviour is a bit too harsh. Sure, as researchers we are aware that attitude and behaviour can differ considerably. We know that intentions don’t always translate into actions and that cognitive biases distort our (self-)perception. None of this means respondents lie deliberately; quite the contrary: most of them respond sincerely. The fact that their behaviour doesn’t match their attitude is the normal case, not a deviation from it, because we all sometimes want to be (perceived as) better than we really are. Explaining the discrepancy between attitude and behaviour is our problem, not theirs, and we should be aware of this.

At the same time, conscious cheating should also be considered, and there are a couple of ways of doing so. Red-herring questions are one good example: the main reason we keep asking respondents which device they are currently using is not a genuine interest in the device, but to check whether they are telling the truth before the actual survey starts. Comparing metadata with the given answer can help us detect respondents who deliberately lie.
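
As a minimal sketch of such a metadata check – assuming the respondent’s self-reported device and the browser user-agent string are available as plain values, and using deliberately simplified classification rules rather than any production logic – a comparison could look like this:

```python
import re

def device_from_user_agent(user_agent: str) -> str:
    """Very rough device classification from a browser user-agent string.
    Real panels use far more robust detection; this is only a sketch."""
    if re.search(r"iPad|Tablet", user_agent, re.IGNORECASE):
        return "tablet"
    if re.search(r"Mobile|iPhone|Android", user_agent, re.IGNORECASE):
        return "smartphone"
    return "desktop"

def answer_matches_metadata(claimed_device: str, user_agent: str) -> bool:
    """Flag respondents whose claimed device contradicts the metadata."""
    return claimed_device.lower() == device_from_user_agent(user_agent)

# Hypothetical respondent: claims to be on a desktop, metadata says smartphone.
ua = "Mozilla/5.0 (iPhone; CPU iPhone OS 17_0 like Mac OS X) Mobile/15E148"
print(answer_matches_metadata("desktop", ua))  # False -> review this interview
```

A mismatch like the one above would not automatically disqualify an interview, but it is a useful signal for deciding which responses deserve a closer quality check.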

As a data collection company, we work hand in hand with our clients. In a typical project, they provide us with the questionnaire, we do all the fieldwork and we deliver the interview data. However, we do not merely take the questionnaire and put it into the field regardless of its content; we also advise our clients to make sure they get the best out of their study. Keeping these four pitfalls in mind – target group, user experience, device and the human flaw – will help you design engaging surveys for your respondents.
