Response vs Non-Response Bias in Surveys + [Examples]


When conducting research, response bias and non-response bias are two of the errors researchers need to avoid to get accurate results. These biases may come from the researcher or the respondents.

On the researcher’s side, bias may arise from indirectly putting words in the respondent’s mouth or restricting the answer choices to the ones the researcher wants to hear. That way, respondents have no option but to choose what the researcher wants.

Respondents may also be the source of bias by intentionally giving subjective responses to the questions researchers ask. There are, however, various reasons that can make a respondent give biased responses to questions.

What is Response Bias? 

Response bias is a situation whereby a respondent or participant gives inaccurate or false answers to a question. It is very common in research that relies on participants’ self-reports: questionnaires, surveys, interviews, etc. Response bias may be caused by different factors, especially the human factor: unlike robots, people do not always give straightforward answers to questions.

Response bias is not something that can be completely avoided, as it is inherent in any survey that relies on self-reported answers. Giving biased responses to questions can be intentional or accidental. Either way, it undermines the research, rendering the collected data useless.

Types of Response Bias

Social Desirability Bias

Social desirability bias is a type of response bias in which respondents give socially acceptable rather than sincere responses to questions. The questions, in this case, are usually worded in a particular manner that may make people give in to social conditioning.

Although people may hold a different personal opinion, they give the socially acceptable answer so as not to look bad. This can also happen when people want to appear classy or belong to an elite group, so they lie to look better.

Consider a question asking whether the respondent has ever cheated on an exam. A respondent will likely choose “no” here, even if he/she has cheated on an exam before and felt there was nothing wrong with it. This is because society has conditioned people to see cheating as something bad.

Demand Characteristics

This comes up when a respondent gives a particular response to a question because he/she thinks that is the answer the researcher wants. For example, a company looking to improve its product quality may send out a survey to find out what customers think should be changed in the product design.

In this case, the customer may have no improvement to suggest. But since he/she thinks the company needs answers, a response will be given regardless.

In practice, only a few respondents may care about something like brand identity. Yet, many will go for “Yes” because they think that is what the brand wants to hear.

Extreme Response

The extreme response bias arises when respondents give exaggerated responses to a question. This may be done to make themselves look good or to make another person look bad to the researcher.

For example, a respondent who visited a restaurant may be asked how the food was and give an exaggerated response detailing how terrible it was. Some may even do this to get an extra plate of food as compensation from the restaurant.

Assume that a philanthropist is giving out $5,000 to people who need it, and the survey asks whether that amount is enough. Most people will choose the “More than enough” option to appear grateful and deserving, even if the amount is actually too small for their needs.

This is because the “Too small” option may make the respondent come off as being proud and not content.

Neutral Response

This occurs when the respondent gives a neutral response to every question asked. A lot of respondents do this when they are indecisive or have no answer to the question being asked, so they simply stay neutral throughout. This type of response is just as bad as the extreme response.

For example, a respondent may pick the neutral midpoint, such as “Neither agree nor disagree,” for every item on a rating-scale survey.

Acquiescence Bias

Acquiescence bias is a form of response bias where respondents give positive answers to all the questions asked. In some cases, they don’t bother to read the questions before choosing affirmative answers. This type of response bias is usually the respondent’s fault and is mostly intentional. It is also relatively easy to spot, as it leads to a train of contradictory statements from the respondent.

For example, a respondent who answers “Yes” to both “Are you an atheist?” and “Do you believe in God?” has, by choosing an affirmative answer to every question, made a contradictory statement. How can you be an atheist and still believe in God?

Dissent Bias

The dissent bias is the opposite of acquiescence bias: it is a form of response bias where respondents give negative answers to all the questions asked. This bias is mostly intentional, because respondents often don’t bother to read the questions.

This may happen when there are too many questions for the respondent to answer. Therefore, at some point, respondents just start choosing random answers to the questions.
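Patterns like acquiescence, dissent, and blanket neutral responses often show up as straight-lining, where a respondent picks the same answer for every item. As a rough illustration (this is generic Python, not a Formplus feature, and the data format below is hypothetical), a simple script can flag such respondents in collected data:

```python
# Minimal sketch: flag respondents who gave the same answer to every question.
# The response data below is a hypothetical example used only to illustrate the idea.

responses = {
    "respondent_1": ["Agree", "Agree", "Agree", "Agree", "Agree"],            # acquiescence pattern
    "respondent_2": ["Agree", "Neutral", "Disagree", "Agree", "Neutral"],
    "respondent_3": ["Neutral", "Neutral", "Neutral", "Neutral", "Neutral"],  # neutral-response pattern
}

def is_straight_lining(answers):
    """Return True if every answer in the list is identical."""
    return len(set(answers)) == 1

flagged = [name for name, answers in responses.items() if is_straight_lining(answers)]
print(flagged)  # ['respondent_1', 'respondent_3']
```

Responses flagged this way are not automatically invalid, but they are worth reviewing before analysis.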

 

Reasons for Response Bias

  • Emotional Questions

Some questions appeal to a person’s emotions and make them give biased answers. This is due to people’s emotional nature.

For example, here is an emotional approach to asking someone if they are planning to have children anytime soon.

Your parents are getting old and would like to see their grandchildren. Would you consider having a child anytime soon?____

The researcher could simply have asked whether the respondent would consider having a child soon. But to appeal to the person’s emotions, the parents’ situation was brought into the question.

  • Reward Questions

Many online survey platforms reward respondents when they participate in surveys. Before participating in a paid survey, respondents are usually made to go through a pre-screening phase to confirm whether they are fit to participate in the survey.

At this point, respondents want to qualify for these surveys, so they can take them and get rewarded. Therefore, they resort to lying for rewards.

For example, a survey company may need people aged 40-60 to carry out a survey. So, they ask respondents whether they fall into this age range during the pre-screening stage.

Some respondents will answer in the affirmative, even if they don’t fall within that age range just to qualify for the reward.

  • Complex Questions

Respondents often find it difficult to answer some questions because they are too complex. So, they just choose a random answer to the question.

The response bias is usually intentional in this case, and it can be compared to multiple-choice questions in the educational system: when students are assessed and asked to choose an answer from the options, they may pick a random answer if they are not sure of the correct one.

How to Avoid Response Bias

  • Stop asking emotional questions. Researchers should avoid an emotional approach when asking respondents questions, so that respondents give only objective answers.
  • Keep it simple. The questions should be simple enough for a layman to understand. Do not use big words or confusing sentences.
  • Embrace anonymity. Respondents are known to be more blunt and sincere when they are anonymous. That way, there is less room for social desirability bias because answers cannot be traced back to them.
  • Add response validation. Researchers should validate responses in such a way that respondents are unable to give contradictory or biased answers to a question.

For example, if a respondent answers “Yes” to a question, he/she shouldn’t be allowed to give a contradictory answer to a related question. You can do this with the Logic feature on the Formplus form builder.
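As a rough sketch of the idea behind such validation (this is generic Python, not the Formplus Logic feature, and the question IDs and rule below are hypothetical), a researcher could check submitted answers against pairs of questions that cannot both be answered “Yes”:

```python
# Minimal sketch: flag contradictory answer pairs in a submitted response.
# Question IDs and the contradiction rule are hypothetical examples.

contradictory_pairs = [
    ("are_you_an_atheist", "do_you_believe_in_god"),  # cannot both be "Yes"
]

def find_contradictions(response, pairs):
    """Return the question pairs that were both answered 'Yes'."""
    return [
        (first, second) for first, second in pairs
        if response.get(first) == "Yes" and response.get(second) == "Yes"
    ]

response = {"are_you_an_atheist": "Yes", "do_you_believe_in_god": "Yes"}
print(find_contradictions(response, contradictory_pairs))
# [('are_you_an_atheist', 'do_you_believe_in_god')]
```

A form builder with logic rules handles this at submission time; the same check can also be run afterwards on collected data.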

  • Allow respondents to save & continue later. Some respondents choose random answers when they are tired and there are still many questions left.

If they are allowed to save and continue later, respondents can take a break when they are tired and respond properly when they are ready.

Disadvantages of Response Bias

  • Response bias produces unusable data. Researchers usually cannot work with biased responses because they are incorrect; if they were going to use such data anyway, they might as well have made up the responses themselves.
  • It results in wrong assertions or conclusions. If the researcher doesn’t notice the flaws in the collected data and goes ahead to use it, a wrong conclusion will be drawn.
  • It leads to an expensive data collection process. Researchers may need to go through another data collection process if the first one is unusable, costing more money and time.
  • It may lead to a business incurring a loss. A lot of businesses use respondents’ opinions to make important decisions, like launching a new product. Biased responses may inform a business decision that eventually turns out to be wrong.

What is Non-Response Bias?

Non-response bias is a type of bias that occurs when people who are unwilling or unable to respond to a survey differ greatly from those who do respond. The factor responsible for the lack of response is usually also what sets non-respondents apart from respondents.

Sometimes called participation bias, non-response bias may be due to poor survey construction and targeting on the researcher’s part. It may also result from a decision by the respondent not to participate.

For example, a survey asking about the best alcoholic drink brand, targeted at older religious people, will likely receive little or no response. In other cases, the survey may not even reach the target respondent, like an email that landed in the spam folder.

Examples of Non-Response Bias 

Example One

Let us consider a database of 1,000 email addresses belonging to older people who only use their email accounts to stay in contact with their children. In most cases, these people only know how to send and read emails, something their children taught them.

Now consider an individual who is about to open a new club and needs to perform a competitive analysis by running a survey asking a few questions about existing nightclubs. If this survey is sent to these 1,000 email addresses, about 90% of them will likely not respond.

A nightclub will not resonate with older people the way it does with young folks, unless, of course, it is a nightclub for older people.

Example Two

A lot of young people watch adult videos on their internet-enabled devices, but most of them are ashamed to talk about it. Therefore, a survey that contains questions like “How often do you watch adult videos?” or “What is your favorite adult video site?” may not receive as many responses as it should.

This is because they are mostly embarrassed to talk about it. A better way to receive more responses to questions like this is to allow respondents to remain anonymous. Even then, some will choose not to respond for fear that they may be exposed in the future.

Reasons for Non-Response Bias 

  • Request for sensitive information

Consider a survey measuring the rate at which separated parents comply with child support payment directives. A parent who does not regularly pay child support will likely be too uncomfortable to fill out this survey.

This creates a bias that skews the data towards a more law-abiding net sample than the original sample. For the same reason, surveys that explicitly state their involvement with a government agency will likely suffer from non-response bias.

  • Email Invitation/SPAM

Sometimes, researchers themselves are the cause of non-response bias because they do not pretest their invites properly. Pretesting is very important when sending email invites, which is why email sending platforms allow you to send a test email to yourself first.

This way, you can confirm whether the email renders well on both mobile and PC. Most young adults respond to emails on their mobile devices, and if the survey doesn’t render well there, responses from smartphone users will drop dramatically.

Sometimes, the mail lands in the spam folder, preventing potential respondents from ever seeing it.

  • Wrong Audience

It is very important to choose the right target audience when sending out surveys. For example, sending a survey about weekly work hours and earnings to undergraduates or unemployed young graduates will not receive as many responses as it would from employed people.

Similarly, a survey about how many hours are spent studying for exams will not receive many responses from non-students.

Disadvantages of Non-Response Bias

  • It invalidates the results of an investigation or research.
  • It may result in higher variance in the estimates, since the sample size the researcher ends up with is smaller than expected (see the short calculation after this list).
  • It may lead to inconclusive research.
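To see why a smaller realised sample inflates variance, recall the standard error of a sample mean. The 40% response rate below is only an illustrative assumption, not a figure from this article.

```latex
% Standard error of a sample mean, with standard deviation \sigma and sample size n
SE(\bar{x}) = \frac{\sigma}{\sqrt{n}}

% If 1{,}000 surveys are sent out but only 40\% come back (n = 400 instead of 1{,}000),
% the standard error grows by a factor of
\sqrt{\frac{1000}{400}} \approx 1.58
```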

How to Avoid Non-Response Bias in Surveys and Polls 

  • Use Close-ended Questions

Close-ended questions are usually more straightforward and require no descriptive response from respondents. Therefore, respondents find it easy to answer them.

In this case, respondents will not abandon the questions because they find it difficult to give answers. You can also control the kind of responses you get.

Easily create close-ended questions using the Formplus choice option feature. For example:

How many glasses of water do you drink daily?

  • Less than 1
  • 1
  • 2
  • 3
  • 4 or more

  • Keep your questions neutral

Do not attempt to put words into your respondents’ mouths by restricting the options to only the responses that suit you. Researchers need to set their personal opinions aside when preparing a questionnaire.

That is, don’t ask:

How great was President Trump’s State of the Nation Address? 

  • Excellent
  • Very Great
  • Better than other Presidents

Rather, say:

How was President Trump’s State of the Nation Address?

  • Poor
  • Fair
  • Good
  • Very good
  • Excellent.  

In the first example above, the options limit the respondent to saying something positive about President Trump. This may not sit well with non-Trump supporters, and they may refuse to respond.

The second example, on the other hand, gives room for both positive and negative opinions.

  • Avoid Double-Barreled Questions

Double-barreled questions are questions that touch on more than one issue but allow for only one answer. They are usually confusing and may discourage respondents from answering.

An example of a double-barreled question would be the following:

“Do you think our chef and waiter were great?”.

Here, “Do you think our chef was great?” and “Do you think our waiter was great?” have been combined into one question.


Aside from confusing the respondent, this kind of question also prevents the researcher from drawing proper conclusions from the data.

  • Understand your target audience

It is important to understand your target audience so you know how questions should be worded. When dealing with older folks, for instance, the wording should be formal, as that is more appealing to them.

This may not be the case when dealing with the younger folks.

  • Ensure your options cover all possible answers

Always ensure that your options are inclusive and cover all the possible answers.

For example, when asking a person about their gender, it is no longer enough to offer only “male” and “female” as options.

This is because some people identify as non-binary, transgender, etc. Adding these to the available options allows such respondents to answer.

  • Provide Incentives

Paid surveys are a way of providing incentives to respondents. That way, you are more likely to get the number of responses your survey requires.

You don’t have to run the paid survey yourself, as many companies are available to handle paid surveys for you.

Conclusion

Biases may be impossible to avoid completely when carrying out surveys. They can, however, be reduced by making a deliberate effort to prevent them.

The first step is to understand response bias and non-response bias, both of which may be caused by either the researcher or the respondent. Sometimes, these biases are unintentional, as in the case of a respondent who simply forgets to respond to a survey.