
Why your market research risks getting it wrong

The problem no one is talking about

Did you know that around 40% of the population is systematically missing from surveys that claim to represent society as a whole? Using reports based on responses from what is, in many ways, a homogeneous group to make decisions that affect the entire population is not only methodologically unsound, it is unethical and, in the long run, very costly.

Whether you work in the private or public sector, there are a few simple but crucial steps you can take to safeguard the quality of your decision-making data. Otherwise, we risk making decisions and shaping a society that does not work for everyone. But first, we need to understand why your data may be leading you astray.

In the communications industry, we like to talk about the importance of understanding the target audience. We emphasize insight-driven communication, human-centered strategy, and keeping “our ear to the ground.” But what happens when the insights we rely on do not reflect reality?

Today, surveys are used at almost every stage of marketing and communication strategies, from campaign development to brand tracking, positioning, and policy development. At its core, this is a positive development: making decisions based on data rather than speculation or gut feeling. But it also places higher demands on the quality of the data we rely on. The truth is, not all data is good data, and in the worst cases, poor data can do more harm than good.

Many people talk about “non-response” as a natural part of data collection, and that’s true. Not everyone is willing or able to respond to your survey. What we often fail to discuss, however, is that certain groups are far more likely to be missing than others: young people, individuals born abroad, people over 65, and those with literacy challenges. Together, these groups make up around 40% of the population.

For data to be reliable, the respondents (the people who actually answer your survey) must reflect the target group you are trying to understand.

In the research industry, non-response is often addressed through statistical weighting after the fact. This means that responses from smaller groups are given greater weight to represent more people. But when certain groups are significantly underrepresented from the outset, there are limits to what can be corrected statistically. That’s why the work of addressing non-response must start earlier, in how surveys are designed, distributed, and communicated, so that more people are actually reached and willing to participate.
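The weighting described above can be sketched in a few lines of Python. The group labels and shares below are invented for illustration, not real survey figures:

```python
# A minimal sketch of post-stratification weighting for a single
# "age group" variable. All numbers are illustrative assumptions.

population_share = {"18-29": 0.20, "30-64": 0.60, "65+": 0.20}
sample_share     = {"18-29": 0.05, "30-64": 0.80, "65+": 0.15}

# Each respondent's weight is their group's population share divided
# by its sample share: underrepresented groups get weights above 1.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

for group, w in weights.items():
    print(f"{group}: weight {w:.2f}")

# When a group is badly underrepresented (here 18-29 gets weight 4.00),
# a handful of respondents stand in for many people, which inflates
# uncertainty. A very large weight is a warning sign, not a fix.
```

This is exactly the limit the article points to: if only a sliver of a group ever responds, weighting stretches a few voices to cover many, and no amount of after-the-fact arithmetic recovers the perspectives that were never collected.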

So what’s behind this problem? It’s certainly not ill intent. Rather, it stems from a range of factors. One is the shift to our increasingly digital lives where responding to surveys via post or telephone no longer feels as natural as it once did. Trust is another. Being asked to share personal information with an unfamiliar sender, at a time when we are constantly warned about fraud, does not make participation easier.

We recently conducted a survey asking people about their experience of participating in surveys. The results should serve as a wake-up call for the entire industry:

  • 95% prefer not to respond to surveys by phone.

  • 40% would respond more often if they had greater trust in the sender.

  • 36% say they do not recognize themselves in the surveys they encounter.

In other words, it is hardly surprising that results are often skewed. If we want to create communication, strategies, and decision-making frameworks that actually work, we must start with the basics: measuring the right things in the right way.

Here are three tips for anyone commissioning research:

1. Be clear in your requirements
The number of responses alone does not determine the quality of a survey. Ensure that your provider can deliver representativeness among respondents - that those who respond reflect your target group. This is critical for ensuring that analysis and reporting meet statistical standards. As a client, you have both the right and the responsibility to set clear expectations.

2. Ask for transparency about who responded
To trust the results, you need to know who has answered. Always request information about respondents’ age, gender, geographic distribution, background, or other relevant factors. This makes it easier to assess whether the survey has reached the right audience, or whether key perspectives are missing.

3. Use the right methodology
Make sure your surveys are adapted to how people actually live and communicate today. Whether that means responding via apps, social media, phone, or in multiple languages - choose methods that fit the target audience, not just you as the sender. This will increase overall response rates and broaden representation.

Article published in Dagens Opinion.


Perspetivo AB

Org. 559378-5818

Jakobsbergsgatan 24
111 44 Stockholm
