World views collide to distort public discourse.

The media exist to inform, to educate, to convert and, particularly, to monetise their contact with the public.

The market research discipline uses applied scientific techniques to gather replicable, reliable insights and knowledge that support decision-making.

These two different worlds do meet.

The two worlds collided in the last month, when four media reports based on survey results implicitly or expressly argued for public, and perhaps Government, action on three topical issues: hunting foxes, reducing obesity, and Muslims in Australia.

In one article, the journalists noted the survey had been repeated twice and that its usual sample was 1,000.  The second article noted a sample of 304 people.  The foxhunting story mentioned no sample size, and the obesity warning report mentioned a sample of more than 86,500 people collected over a 12-month period (1).

Relevant issues for market researchers in assessing the data would include the sample size, sample selection, sample representativeness, sampling error, survey method, whether the results were weighted to their population, and the impact these issues may have on the findings reported.  The language used, the question wording and any information supplied as context for the questions asked were not discussed in these reports.
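To illustrate why sample size matters, the sampling error for the two samples actually reported can be estimated with the standard margin-of-error formula.  This is a minimal sketch, assuming a simple random sample, a proportion near 50% (the worst case) and a 95% confidence level; the surveys reported may have used different designs, for which these figures would not apply directly.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p
    from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# The two sample sizes the articles actually reported:
print(f"n=1000: about ±{margin_of_error(1000):.1%}")  # roughly ±3.1 points
print(f"n=304:  about ±{margin_of_error(304):.1%}")   # roughly ±5.6 points
```

On these assumptions, a finding of, say, 49% versus 51% from the 304-person sample is well within sampling error, yet could still be reported as a majority view.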

Discussion of these methodological issues may not be as exciting as commentary on incendiary findings, but discussion of the validity of findings is needed to ensure the community draws the correct conclusions from the studies.

People may still act on their prejudices, as the 200,000 witches tortured, burnt or hanged in the UK over three centuries could attest, were they here (2).  But reporting survey findings with the careful caveats we market researchers would include may better inform, and so improve, public discourse.

We market researchers can contact individual journalists to suggest these questions, but one-off individual action may not have the influence required.

Rather, the conclusion from the reporting of these four surveys is that the Australian market research organisations should develop, distribute and promote a journalists' guide to the questions that should be answered before survey results are reported.

As Kirkpatrick said, reporters should make sure of the validity of their survey-based facts before reporting them (3).

To do so is to enrich rather than to distort public discourse.


1  The (UK) Guardian, October 3, 2016; September 21, 2016; September 27, 2016; September 21, 2016.



Research News Live –

Knowing, not guessing, what motivates customers really helps you sell.

Joining was easy.  I was in the store.

I’d bought some hiking gear and would get a slight discount if I joined the retailer’s loyalty scheme, giving them personal details to add to the product purchase details.

A great start to an ongoing sales relationship?  No!  No, because the retailer's monthly promotional email is a standard listing of things I could buy for myself, at low prices.

Yet, in this sector, what motivates me is hiking gear for children.  They love the outdoors, and I’m a sucker for anything that makes their hiking experiences better.

The retailer’s lack of knowledge of what motivates me, and people like me, means their emailed promotions do not generate sales from me – nor from people like me.

After all, I only need one pack and one pair of boots.  The children's equipment needs are far, far greater, and they keep growing, so they need new clothing, new boots and new gear.

Our customer research can easily, quickly and cost-effectively tell this retailer what really motivates groups of its customers.  The retailer can then market to those actual motivations and generate more sales as a result.

If you feel your marketing could also be sharpened with appeals to your customer groups’ motivations, and want assistance identifying those, call, email or use the form below to contact Philip Derham for the next steps.