Questionable Surveys

Asking the right question is as important as asking the right people.

We do a lot of primary research work at Silicon Strategies Marketing, and that means we do quite a bit of surveying (so much that we run our own software suite for online surveys). Surveying remains the cornerstone of quantitative investigation, even in this age of constant polling where nominal surveys receive response rates below 0.1%.

Which makes the oversupply of sloppy surveying methodologies downright depressing.

There is both science and art in survey design. It begins with knowing what critical information is to be acquired – what business decision you are making. Non-specific problem definitions lead to non-specific questions, which produce non-specific answers (to be specific). Surprisingly, many folks do not precisely understand the business decision they face, and we spend a lot of time in business coaching to improve their focus before assembling their surveys.

Even after establishing the business need, formulating survey questions requires hard work. Every industry and demographic has its own language. Survey questions (and invitations, for that matter) need to communicate in that language, using instantly recognizable idioms and terminology (start-ups and companies defining new markets carry a significant question-formulation burden, since the language that describes the problem and solution may not yet exist). Composing clearly stated and vernacularly correct questions sounds simple, but if it were simple we would not be doing as much survey design work as we do.

Even then, human bias enters answers, and survey design needs to control for it (and if the language base for the respondents is vague, you need to double the corrective effort). Asking a respondent whether they prefer black or white requires no sanity checking of their answers. Asking the same respondent to rate their opinion of Congress requires extra investigation (public opinion surveys have relentlessly shown that people think Congress is inept but that their personal congressman is a genius). Surveys can take counter-measures – such as restating the same question in different ways in different parts of the survey – to catch incongruities.
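The restated-question counter-measure above lends itself to simple automation. As a minimal sketch (the scale, tolerance, and data are hypothetical), assume a 1–5 agreement scale where one version of the question is reverse-worded, so a consistent respondent's two answers should roughly mirror each other:

```python
# Sketch: flag respondents whose paired (restated) answers disagree.
# Assumes a 1-5 Likert scale; the second question is reverse-worded,
# so forward answer f should be close to (scale_max + 1 - r).

def inconsistent(q_forward, q_reversed, scale_max=5, tolerance=1):
    """Return indices of respondents whose paired answers conflict."""
    flagged = []
    for i, (f, r) in enumerate(zip(q_forward, q_reversed)):
        if abs(f - (scale_max + 1 - r)) > tolerance:
            flagged.append(i)
    return flagged

# Respondent 2 strongly agrees with both a statement and its negation,
# which is exactly the incongruity the restated question is meant to catch.
forward = [5, 2, 5, 4]
restated = [1, 4, 5, 2]
print(inconsistent(forward, restated))  # → [2]
```

Flagged respondents need not be discarded outright; their other answers can simply be weighted with more suspicion during analysis.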

Value measurements are the toughest, and price sensitivity studies are the worst. The value people place on anything tends to be skewed by frugality or the Mercedes Effect, causing people to place a low or high dollar value on products. In this age of hyper-automation, one trick is to give each respondent in an online survey a random price question, where each of the possible questions shows a different range of prices. If the response rate is high enough, you can map price lifts and drops visually. Other tools, such as Van Westendorp’s Price Sensitivity Meter, achieve good results with even low response rates or small populations.
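The randomized-price trick above reduces to a small amount of code in an online survey tool. As a hedged sketch (the price points, question wording, and data are illustrative, not from any real study), each respondent is randomly shown one price and asked a yes/no purchase question, and the aggregated yes-rates form the curve you would plot to spot lifts and drops:

```python
import random
from collections import defaultdict

# Hypothetical price points to test; each respondent sees exactly one,
# chosen at random, and answers "would you buy at this price?"
PRICE_POINTS = [49, 79, 99, 129, 149]

def assign_price(rng=random):
    """Randomly pick the single price point shown to a respondent."""
    return rng.choice(PRICE_POINTS)

def acceptance_by_price(responses):
    """responses: list of (price_shown, would_buy) tuples.
    Returns {price: fraction answering yes}, sorted by price -- the
    demand curve whose lifts and drops you can then map visually."""
    counts = defaultdict(lambda: [0, 0])  # price -> [yes_count, total]
    for price, would_buy in responses:
        counts[price][1] += 1
        if would_buy:
            counts[price][0] += 1
    return {p: yes / total for p, (yes, total) in sorted(counts.items())}

responses = [(49, True), (49, True), (99, True), (99, False), (149, False)]
print(acceptance_by_price(responses))  # → {49: 1.0, 99: 0.5, 149: 0.0}
```

As the text notes, this only works when the response count is high enough that each randomly assigned price point gathers a meaningful sample; with thin data, a method like Van Westendorp's is the safer choice.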

If you need to survey, take time to work and rework your survey instrument. Asking the wrong question in the wrong way will produce the wrong answer.
