How to Protect Your Survey from Disengaged Responders and Ensure Quality Data

For market research, online surveys are powerful tools for uncovering consumers' motivations and their attitudes toward brands. They are a cost-efficient way to collect large amounts of data quickly. However, if participants respond arbitrarily, the survey can be a waste of time and resources, often yielding marred, unusable data.

Survey respondents may be intentionally disengaged, or they may not realize they need to spend more time reading and answering questions. Collecting haphazard answers from disengaged respondents can undermine your efforts and tarnish your data. Below are some common ways to tell if you have a disengaged respondent:

  • Straight-lining – When given a set of questions in a matrix format, it is easy for respondents to click the same response in a column from top to bottom just to move on to the next part of the survey. Straight patterns in matrix questions are a sign that respondents are rushing through. One way to discourage this is to break these questions into separate sections or pages, so the respondent must click a button to advance. Also consider setting up rules in your survey tool that detect straight-lining and terminate those respondents from the survey.
  • Timing – The amount of time respondents spend on questions is another signal of whether they are really engaging with them. Use your survey tool to monitor the time spent on each response. If it's clear that respondents are spending only a couple of seconds per question, it's time to terminate them from the survey.
  • Attention checks – If respondents are speeding through a survey, they are likely not even taking the time to read the questions. Strategically place attention-check questions with discernible correct answers to catch respondents who aren't fully reading each question. Consider embedding a three-strike rule: if a respondent gives bogus answers to multiple attention checks, they are terminated from the survey. Or insert the same question twice to check whether responses are consistent.
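The straight-lining and timing checks above can be automated after (or during) fieldwork. Below is a minimal sketch in Python, assuming a hypothetical data layout in which each respondent record holds their matrix-question answers and per-question timings; the thresholds are illustrative and should be tuned for your own survey length and question difficulty.

```python
from statistics import median

def flag_disengaged(responses, min_seconds=3.0, max_identical_ratio=0.9):
    """Flag respondents who straight-line matrix questions or speed through.

    `responses` maps a respondent ID to a dict with (assumed keys):
      - "matrix": list of answers to one matrix-question block
      - "seconds_per_question": list of seconds spent on each question
    Returns a dict mapping flagged respondent IDs to the reasons found.
    """
    flagged = {}
    for rid, data in responses.items():
        reasons = []
        matrix = data.get("matrix", [])
        # Straight-lining: nearly every answer in the matrix block is identical.
        if matrix:
            top = max(matrix.count(value) for value in set(matrix))
            if top / len(matrix) >= max_identical_ratio:
                reasons.append("straight-lining")
        # Speeding: median time per question falls below a sanity floor.
        times = data.get("seconds_per_question", [])
        if times and median(times) < min_seconds:
            reasons.append("speeding")
        if reasons:
            flagged[rid] = reasons
    return flagged

# Example: r1 picks the same option everywhere and races through;
# r2 varies their answers and takes a plausible amount of time.
data = {
    "r1": {"matrix": [4, 4, 4, 4, 4], "seconds_per_question": [1.2, 0.9, 1.1]},
    "r2": {"matrix": [2, 4, 3, 5, 1], "seconds_per_question": [8.0, 12.5, 6.3]},
}
print(flag_disengaged(data))  # only r1 is flagged, for both reasons
```

Using the median rather than the mean for timing keeps one long pause (a phone call, a tab switch) from masking a respondent who otherwise clicked through in a second or two.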

Keeping these behaviors in mind can help you identify and weed out bad responses. However, before blaming disengaged respondents and discarding all of their responses, consider what can be done in the survey's design to keep respondents engaged and to avoid bias that pushes them to answer a certain way.

  • Avoid leading and confusing questions – Craft questions so that they don’t lead or confuse the respondent. Make questions and response choices simple and clear for your intended audience.
  • Limit cognitively demanding questions – Based on the topic of your survey, limit the complexity of questions that may be cognitively taxing on the respondent. For example, if you ask, "What percent of your income has gone to clothing in the last 8 months?" the respondent may not know how to answer, since they probably don't track their clothing expenses. Questions like this can add unnecessary length to the survey, which in turn may lose the respondent's interest.
  • Use a neutral design and accentuate sparingly – Bold, italicized, underlined, and colored words can draw attention to certain parts of a question. They may be helpful in underscoring specific terms to help guide the respondent. However, it’s best to keep a clean and neutral layout so that the respondent is not distracted or influenced by any extraneous elements.
  • Lastly, don’t let your own bias take over – When constructing the survey and reviewing the data, be careful not to let your own opinions creep in and tweak the design or results to make more sense in your own mind. Bring in a few people who fall into your intended target audience and have them take the survey while you listen to their question-by-question commentary. You’ll be amazed at what you learn.

These tips help identify disengaged respondents and offer a few ways to promote thoughtful responses. As consumer behaviors evolve and new survey techniques are introduced, the way we design questions and analyze data will keep changing.

What other ideas do you have for detecting disengaged respondents? What other techniques have you seen for keeping respondents engaged?

Join the Discussion