The simplest ranking questions use a series of open-ended questions, for example: "Rank the following brands according to how much you like them. Please place a 3 next to the brand you like most, a 2 next to your next preferred brand, and a 1 next to your least preferred brand."
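To make the ranking format concrete, here is a minimal sketch (not part of the original text) showing how responses to a question like the one above can be tallied into average scores, assuming 3 = most liked, 2 = next preferred and 1 = least preferred. The brand names and responses are hypothetical.

```python
from collections import defaultdict

# Hypothetical responses: one dict per respondent, mapping brand -> rank score.
responses = [
    {"Brand A": 3, "Brand B": 2, "Brand C": 1},
    {"Brand A": 2, "Brand B": 3, "Brand C": 1},
    {"Brand A": 3, "Brand B": 1, "Brand C": 2},
]

# Sum the rank scores each brand received across respondents.
totals = defaultdict(int)
for response in responses:
    for brand, score in response.items():
        totals[brand] += score

# Average score per brand; a higher average means the brand is better liked overall.
averages = {brand: total / len(responses) for brand, total in totals.items()}
print(sorted(averages.items(), key=lambda kv: kv[1], reverse=True))
```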
A constant-sum question asks the respondent to allocate a specific number of points, tokens, dollars or some other quantity across different alternatives (an illustration appears at the end of this paragraph). There are many more advanced question types; some of these are described in Advanced Questions and Questionnaires. Writing a question involves selecting a question type and phrasing the question. Plagiarism is at the heart of asking good questions: there are many, many ways of wording a question poorly, so it is usually safer to reuse question wording that has already been tested.
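As an illustration of the constant-sum format, here is a minimal sketch (not from the original article) of the check that defines this question type: the respondent's allocations must add up to the fixed total. The 100-point total and the brand names below are illustrative assumptions.

```python
def validate_constant_sum(allocations: dict[str, int], total: int = 100) -> bool:
    """Return True if the respondent's allocations are non-negative and sum exactly to `total`."""
    return all(points >= 0 for points in allocations.values()) and \
        sum(allocations.values()) == total

# Example: a respondent splits 100 points across three brands.
answer = {"Brand A": 50, "Brand B": 30, "Brand C": 20}
print(validate_constant_sum(answer))  # True
```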
The easiest way to get across the basics is to look at a few examples of bad questions, all from real studies. The first: "What is your principal brand of soft drink?" What problems can you see with this question? One problem is the word "principal"; replacing it with a plainer word helps, but does not remove the ambiguity from the question: is it asking which brand is liked the most, or bought the most often? Such a question will likely end up measuring brand salience (which brands come to mind) rather than anything else.
There are lots of problems with this question. What about cider drinkers? They are ignored, as the categories are not exhaustive. Furthermore, people who do drink are likely to drink different types of alcohol at different times, such as beer after sport, wine with dinner and spirits after dinner; how should such a person answer the question? The least-educated person who is likely to have to answer the question needs to be able to work out how to answer it accurately.
The issue is not one of respondent confusion, as this is already addressed by the first principle, and respondents are remarkably adept at answering incomprehensible questions (presumably because this is the only way they can finish many questionnaires). The point of avoiding ambiguity is to prevent situations where, once we have collected the data, we cannot discern what it means. When that happens, the research is ambiguous and cannot be used to derive valid insights.
To avoid being ambiguous, questions need to focus on the current, the specific and the real. A good way of checking for ambiguity is to use think-aloud interviews, in which real respondents are asked to answer the question but are required to verbalize all their thoughts while answering it. The third principle is that questions must be incentive compatible. This is a term of art in economics; it means that questions need to be written in such a way that people have an incentive to provide honest data.
Consider the question: "Are you aged under 35?" If asked as a screener at the very beginning of a questionnaire, respondents conclude that an answer of "No" will cause them to be screened out of the study (i.e., not allowed to complete it). Consequently, when this question is asked at the beginning of questionnaires where people are paid more if they complete the whole questionnaire, some respondents lie and pretend to be aged under 35 when they are actually older.
By contrast, if asked at the beginning of a questionnaire where people are not being paid to take part, people aged under 35 pretend to be older, as this becomes a polite way of refusing to participate. An incentive-compatible screener instead asks for the respondent's age in a way that does not reveal which answers will lead to being screened out.
It may come as no surprise that the relationship between the number of questions in a survey and the time spent answering each question is not linear. The more questions you ask, the less time your respondents spend, on average, answering each question.
On average, we discovered that respondents take just over a minute to answer the first question in a survey (including the time spent reading any survey introductions) and spend about five minutes in total answering a 10-question survey. However, respondents take more time per question when responding to shorter surveys than to longer surveys.
Can we always assume that longer surveys contain less thorough answers? Not always: it depends on the type of survey, the audience, and the respondents' relationship to the surveyor, among other factors. However, the data shows that the longer a survey is, the less time respondents spend answering each question. For surveys longer than 30 questions, the average amount of time respondents spend on each question is nearly half that of surveys with fewer than 30 questions.
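As a rough illustration of this non-linear pattern, the sketch below encodes the figures quoted above (about a minute for the first question, roughly five minutes for a 10-question survey, and a per-question average that roughly halves beyond 30 questions) into a simple piecewise estimate of completion time. The exact constants and the piecewise form are assumptions for illustration, not the researchers' model.

```python
def estimated_completion_seconds(num_questions: int) -> float:
    """Rough, assumption-laden estimate of total survey completion time."""
    first_question = 60.0                       # first question plus introduction
    per_question_short = (5 * 60 - 60) / 9      # ~27 s/question, backed out of the 10-question figure
    per_question_long = per_question_short / 2  # roughly halves beyond 30 questions

    if num_questions <= 0:
        return 0.0
    remaining = num_questions - 1
    if num_questions <= 30:
        return first_question + remaining * per_question_short
    return (first_question
            + 29 * per_question_short
            + (num_questions - 30) * per_question_long)

for n in (10, 30, 50):
    print(n, "questions ->", round(estimated_completion_seconds(n) / 60, 1), "minutes")
```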
The tolerance for lengthier surveys was greater for surveys that were work- or school-related, and lower for those that were customer-related. Take survey completion time into consideration as you design your next survey. You can find the details of our method in the Versta Research Winter Newsletter. The method is easy to learn and easy to implement. That being said, and knowing that all the different question types tend to average themselves out in most of the surveys we write, here is what we generally proffer as a guideline for the number of questions you can ask in a survey:
For most surveys, 20 minutes is about the maximum you can go before respondent attention lags and data quality deteriorates.
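As a back-of-the-envelope companion to the 20-minute guideline, the sketch below converts that ceiling into a question budget for an assumed average answering time per question. The 30- and 45-second figures are assumptions, not numbers from the research.

```python
MAX_MINUTES = 20  # the rough ceiling before attention lags

def question_budget(avg_seconds_per_question: float = 30.0) -> int:
    """Largest number of questions that fits inside the 20-minute ceiling."""
    return int(MAX_MINUTES * 60 // avg_seconds_per_question)

print(question_budget())      # 40 questions at an assumed 30 s each
print(question_budget(45.0))  # 26 questions for slower, more involved items
```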
Keeping your survey question count low is crucial, because survey fatigue is a real danger for survey makers hoping to collect the best, most accurate data. A few well-worded, well-designed survey questions are usually no problem for respondents to complete.
But once a survey starts to get bogged down with page after page of radio buttons, essay boxes, and convoluted question phrasing, respondents either lose interest or become too frustrated to complete the rest of the survey.
Deciding the exact number of survey questions you need to reach your goals is, of course, more complicated. It depends largely on your purpose and audience. In this post, I cover each of these considerations and give you tips for determining the optimal length for your next survey project. The answers to questions about your purpose and audience will help you determine the kind of survey you are running, the survey question types you will use, and how many survey questions you need to ask to get you where you want to be.
A small business owner wants to expand his current web design business to include new services. He has a few ideas for what he could offer, like mobile app development, copywriting, or digital marketing consulting, but before he invests in new personnel, he wants to make sure his customers are interested. The purpose of his survey is to determine which services existing customers would be most interested in seeing from his team.
The goal is to identify which service his business should develop next and, importantly, where he will be investing his time and money. He wants to make sure the survey data points him in the right direction!