Can we trust consumer research?

85% of all research is made up on the spot, or so goes the old joke. Pick up any newspaper or visit any news website and you will find some sort of frothy research insight into attitudes and behaviour. Recently there have been a number of investigations into just how accurate consumer research is: ABC’s Media Watch devoted a program to the behaviour of McCrindle Research, and a piece in the Sunday Telegraph looked at research company Canstar Blue. The underlying question: can we trust research?

Canstar Blue provides research ‘by consumers for consumers’. One would imagine, then, that the consumer is its most important stakeholder, and that by positioning itself as the authority ‘for consumers’ its research can be completely trusted. So can it?

Let’s look at what Canstar Blue has allegedly been pulled up on. The Sunday Telegraph article focuses on a piece of research into coffee shop chains in Australia in which McDonald’s McCafe came out on top for ‘customer satisfaction’. Judging by the comments on blogs, this came as something of a surprise to many, which in part led to the Sunday Telegraph investigation. It is worth noting that only three months earlier McDonald’s had apologised for the quality of its coffee. Canstar Blue says it undertook the research independently and then sold it to McDonald’s. You might think this raises a number of ethical questions, but it isn’t unusual; many research companies do this. And Maccas really did love it: a media release was issued and the brand proudly showed off its top rating in its TV adverts. But when you dig into the research, some far-reaching questions arise.

The first is that the research claimed 2,500 people had been surveyed, when it turns out only 1,700 qualified to actually answer the survey. Why did Canstar Blue not make this far clearer? After all, 1,700 respondents is more than enough to provide a strong result.

A further point is that it doesn’t appear to be comparing apples with apples. Asking customers of McDonald’s about ‘satisfaction’ is very different from asking the same question of a customer at the premium end of the coffee chain spectrum. It is like asking a customer of Big W how satisfied they are and then asking a customer of David Jones: the businesses are very different, and so are their customers’ expectations. In the automotive industry you would only ever compare your car with others in its class.

The biggest issue, however, is that Canstar Blue wouldn’t share its data. If Canstar Blue is 100% confident in its methodology and raw data, surely it would happily provide the full breakdown and stand behind it. The results are already being promoted, so where is the sensitivity? By withholding the data, it looks as though there is something to hide.

This leads to the question: what might be wrong with the data? Looking at how the research was undertaken, people only had to have visited one of the listed coffee shop chains in the past six months (and remember, only three months earlier McDonald’s had admitted that its coffee wasn’t up to scratch). So what if only 100 of the 1,700-strong pool had visited Hudsons or The Coffee Club? Would that be representative? Canstar Blue won’t release the numbers so we don’t know, but if they were as low as this then surely they can’t be representative; these places have thousands of people through their doors every day. More worryingly still, Canstar Blue admits that a coffee shop chain only needed 30 responses to qualify for inclusion in the research.
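
To put those sample sizes in perspective, here is a rough back-of-the-envelope sketch: a standard 95% margin-of-error calculation for a proportion, assuming a simple random sample. The n=100 figure is the illustrative scenario above, and none of these assumptions about the methodology have been confirmed by Canstar Blue.

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion, assuming a simple random sample."""
    return z * math.sqrt(p * (1 - p) / n)

# Compare the full qualifying pool with the smallest group allowed to represent a chain.
for n in (1700, 100, 30):
    print(f"n = {n:>4}: +/- {margin_of_error(n) * 100:.1f} percentage points")
```

On those (unconfirmed) assumptions, a chain rated by the full 1,700 respondents is pinned down to within roughly ±2 percentage points, but one that scraped in with 30 responses could be out by almost ±18 points.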

Marketers have always spun research to their own advantage; political parties are particular offenders. How many times have we seen a piece of health or economic research argued as a success or a failure from two different standpoints? The fundamental difference is that we are given access to the raw data, so they know that if they dare argue a point that can’t be justified in some way, they will be found out.

The Sunday Telegraph investigation is quite revealing, and it raises issues that shouldn’t just be applied to Canstar Blue but more broadly to consumer research companies. But let’s stop for a moment and ask ourselves, as marketers: when we commission a piece of research, are we primarily trying to find out something revealing about consumers, or are we primarily interested in creating a marketing story? If we are honest, it is mostly the latter. The media also has to shoulder some of the blame. They love surveys; if they didn’t, we wouldn’t bother trying to get them published. So we can all take a little responsibility for creating an environment in which these types of investigations are undertaken. In fact, it has probably been a long time coming.

James Wright

James joined Red Agency as general manager from the UK, where he was a board director at international PR agency Grayling. With broad-ranging experience across consumer and corporate communications, James has particular expertise in corporate communications, stakeholder engagement, digital strategy, and crisis and issues management. He has worked on some of Europe’s most influential and challenging campaigns for major blue chips, governments and charities. He was named PRCA Consultant of the Year 2008 and, in the same year, International CSR Leader at the PR News (USA) annual All Stars Award ceremony. Red Agency has offices in Brisbane, Melbourne and Sydney, and operates across four dedicated sectors: consumer, corporate, technology and government.
