The Enigma of the Response Rate and How to Ensure Your Sample Doesn't Mislead You

A client asked me whether, with a 9% response rate, the sample resulting from a survey of a customer database can still be considered representative.

Situation:

We have 1,400 unique contacts—customers who purchased online in 2023. The product is industrial, sold in large volumes but with low frequency, meaning customers return to purchase multiple times a year only if they are professionals or B2B buyers.

This is the first time the client has conducted this survey, so there is no prior information about the response rate to expect. Some customers have purchased multiple times, while others have purchased only once. Their purchase experience may be recent or from some time ago.

I propose you take 5 minutes to read through my reasoning and the response I gave to the client, who is keen on doing things the right way.

The answer comes from practical realities, not just what statistics theory teaches us. We build our reasoning based on theory and validate or invalidate the sampling parameters to conclude whether the sample is “good.”

Finite Population Context:

It’s important to note that we are sampling from a finite population. We have a limited number of people we can approach and convince to respond to our survey. In infinite population sampling, volume helps significantly—if someone refuses, we move on to the next, like a bottomless bag.

Approach Method:

Phone, email, or SMS?
If I send the survey invitation via email, do I have a guarantee that the message reaches everyone’s inbox?
A phone approach is the best option because human interaction generates more responses. SMS ensures that the survey link reaches everyone in the contact base. The real question is who will respond.

Experience Status:

  • How recent the customer’s purchase experience is affects both the response rate and the relevance of the answers.
  • There are newer and older contacts, and we will see who is more likely to respond to the survey.

Difficulty/Simplicity of the Questionnaire:

This factor greatly impacts the response rate and can lead to high abandonment rates. Customers may intend to complete the survey but give up midway.

Survey Invitation Recipients:

Should I pre-select contacts to send the invitation to, or should I send it to everyone at once?
I can send the invitations in batches, provided these batches are randomly selected and maintain the structure of the contact database. Alternatively, I could use the purchase month/period as a criterion to build the batches.
In both cases, I have a chance to monitor the response rate dynamics and adjust my approach.
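
To make the batching idea concrete, here is a minimal sketch in Python. The contact fields, the stratification variable (purchase month), and the batch count are all hypothetical; the point is that shuffling within each stratum and dealing contacts out in round-robin keeps every batch close to the structure of the contact database:

```python
import random
from collections import defaultdict

def stratified_batches(contacts, key, n_batches, seed=42):
    """Split contacts into random batches that each mirror the
    database structure on `key` (e.g. purchase month)."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for c in contacts:
        strata[c[key]].append(c)
    batches = [[] for _ in range(n_batches)]
    offset = 0  # rolling offset keeps batch sizes equal across strata
    for members in strata.values():
        rng.shuffle(members)  # random order within the stratum
        for c in members:
            batches[offset % n_batches].append(c)
            offset += 1
    return batches

# hypothetical contact records: 1,400 customers spread over 12 purchase months
contacts = [{"id": i, "month": f"2023-{(i % 12) + 1:02d}"} for i in range(1400)]
batches = stratified_batches(contacts, "month", 4)
print([len(b) for b in batches])  # → [350, 350, 350, 350]
```

Sending the first batch, reading the field report, and only then releasing the next one is what gives you room to adjust the approach mid-flight.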

Sample Objective and Tolerance for Error:

I want a robust sample that allows me to analyze data by sub-groups.

Let’s say I want a sample of 600 customers, which requires a response rate of roughly 43% (600 out of 1,400). Great! Thanks to the finite population correction, this also gives me a small sampling error of +/- 3%.

What are the alternatives? I could accept a larger error tolerance, say +/- 5%, and settle for a sample of 300, which requires a response rate of just over 20%.
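
These figures can be verified with the standard margin-of-error formula for a proportion, applying the finite population correction for our base of 1,400 contacts (worst-case p = 0.5, 95% confidence):

```python
import math

def margin_of_error(n, N, p=0.5, z=1.96):
    """95% margin of error for a proportion, with finite population correction."""
    se = math.sqrt(p * (1 - p) / n)       # standard error, worst case p = 0.5
    fpc = math.sqrt((N - n) / (N - 1))    # finite population correction
    return z * se * fpc

N = 1400  # unique contacts in the database
print(round(margin_of_error(600, N), 3))  # → 0.03, i.e. +/- 3%
print(round(margin_of_error(300, N), 3))  # → 0.05, i.e. +/- 5%
```

Without the correction, a sample of 600 would carry about +/- 4%; it is the finite base of 1,400 that shrinks the error to +/- 3%.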


Response Rate Objective:

It’s hard to decide without benchmarks, even approximate ones. I know how it feels—just like when I’m guessing the incidence rate of a product’s consumption.

  • Phone approach: If phone numbers are accurate, the response rate is around 30%.
  • Online panels: Typically have a response rate of around 10%.
  • Email approach: Response rates are lower due to spam/junk filters. For direct email marketing, response rates are about 1%.

When you have no idea, you can set a desired sample size. Ultimately, the final number of responses is the most important factor for the analysis you plan to conduct, and the response rate will result from this.
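
A quick back-of-the-envelope calculation shows why these benchmarks matter when you start from a desired sample size (the targets below are illustrative):

```python
import math

def invitations_needed(target_responses, rate_percent):
    """Invitations required for a desired number of completed surveys,
    given an assumed response rate in percent."""
    return math.ceil(target_responses * 100 / rate_percent)

print(invitations_needed(300, 30))  # phone benchmark → 1000 invitations
print(invitations_needed(600, 30))  # → 2000, more than the 1,400 contacts we have
```

The second line is the finite-population constraint biting: even at the optimistic phone benchmark, a 600-complete target cannot be reached from a base of 1,400 without a response rate well above 30%.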


Target Population Profiling:

What variables are important for my business that are available in the database and can be used for minimal segmentation?
It’s important to know my resources, understand who I’m asking for feedback, and anticipate who will respond. Profiling is critical for representativeness—ensuring the sample reflects the target population’s structure after selection.


Sample Profiling:

The descriptive variables of the target population also apply to the sample.
I monitor daily progress and check for deviations between the population structure and the sample structure. I hope there’s no sub-group of customers stubbornly refusing to respond. If that happens, I need to get creative in motivating their participation.
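
That daily check can be sketched in a few lines. The profiling variable (purchase recency) and the counts below are invented for illustration; the output is the percentage-point gap between each sub-group's share of the population and its share of the sample:

```python
def structure_deviation(population_counts, sample_counts):
    """Percentage-point gap between population and sample shares per sub-group."""
    pop_total = sum(population_counts.values())
    smp_total = sum(sample_counts.values())
    gaps = {}
    for group, n in population_counts.items():
        pop_share = n / pop_total
        smp_share = sample_counts.get(group, 0) / smp_total
        gaps[group] = round((smp_share - pop_share) * 100, 1)
    return gaps

# hypothetical profiling variable: purchase recency
population = {"last 3 months": 560, "3-12 months": 840}
sample     = {"last 3 months": 80,  "3-12 months": 46}
print(structure_deviation(population, sample))
# → {'last 3 months': 23.5, '3-12 months': -23.5}
```

A gap like this is exactly the signal that one sub-group is over-responding and another needs extra motivation to participate.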


Survey Participant Rewards:

  • Are rewards a good strategy for increasing the response rate? If so, what kind of rewards should I offer?
  • In my opinion, rewards should be seen as tokens of appreciation, not incentives. They can be symbolic gestures, especially when surveying customers who’ve already had an experience with your business.
  • There’s no need to turn customer interaction into a condition or irresistible attraction, as it’s preferable for the results to reflect reality.

Monitoring Field Parameters:

Also known as a field report, this involves tracking statistics on the status of the contacts you invited to participate—who abandoned (and at what question), who completed. Unfortunately, you’ll never know how many messages landed in the inbox versus the promotions folder.
For this reason, the SMS approach (with a link in the message) is better. By phone, you’ll know exactly who refused outright, who abandoned, and which phone numbers were incorrect.


Final Thoughts:

Returning to the challenge: Is a 9% response rate enough to ensure a representative sample?

Strictly in this context, a higher response rate is desirable, around 20%. Much depends on the reference point and what you are comparing against. The answer might lie in profiling. For example, you might find that most responses came from customers who purchased within the last three months. In that case, you can adjust the base of the response rate calculation and redefine the target population, which raises the response rate.

If we consider that the invitation was sent via email, then a 9% response rate could be considered good. The question remains whether the sample of 126 respondents is enough to answer the business objectives. Profiling will again provide the answer regarding representativeness.
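
The arithmetic behind the two readings is simple; the recent-purchaser split below is hypothetical, just to show how redefining the base changes the rate:

```python
# response rate over the full contact base
responses_total, base_total = 126, 1400
print(f"{responses_total / base_total:.1%}")  # → 9.0%

# hypothetical: 100 of the responses came from the 560 customers
# who purchased within the last three months
recent_responses, recent_base = 100, 560
print(f"{recent_responses / recent_base:.1%}")  # → 17.9%
```

Same fieldwork, but restricting the target population to recent purchasers nearly doubles the rate, which is why the profiling step matters before judging the 9%.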

Statistics remains fascinating for me because it allows us to craft the story of a representative sample in a given context. Even if I’m unlucky and don’t find a satisfying story that checks all the above boxes, I can be content with the opportunity to experiment and learn. 126 responses are not to be dismissed… Next time, I’m sure to craft a better story and know how to increase the response rate.

Image Credit: Shubham Dhage on Unsplash
