Category Archives: News and updates

News and updates, Derham Marketing Research’s blog

The wheel? Reinvent or reuse?

A problem can emerge when you need to decide now, but know a better decision would follow if your customers’ views, intentions, expectations or needs were there to help guide that decision.

Research – by asking people directly, via a survey – can often give you that decision-strengthening knowledge.  And it can do so speedily and inexpensively, with big audiences, as today most surveys are online.

The question then is whether your knowledge-need is unique, or one sometimes faced by others, too.

If your problem or your circumstances are unique, you will need a specifically written survey.

But, if what you need to know is not uncommon, there may be a more cost-effective option than writing your own unique survey.

That option can be to take an existing relevant survey and tweak it to your specific needs.  That can be quicker, cheaper and more effective.  And those pre-written surveys include the relevant supplementary questions needed to clarify the answers given.

As we’ve found commonalities in knowledge needs over the years, we’ve developed a range of online Monitors – our 20 Questions One Topic Monitors © – that are pre-written surveys.   These get answers to questions about communications, customer needs and intentions, customer practice (that can supplement your own database analyses), customer satisfaction, experience, recommendation and more. 

The surveys can be undertaken with your own customers or customers from a broader pool, as needed.  The 20 Questions One Topic Monitors©  are modified to match your specific market and competition, branding and need.  And for a very cost-effective investment.

If you have a decision need that can be strengthened with answers from a survey, these Monitors may answer your need. 

To find out more, please email or ring me on 0414 543 765.

Research helps marketers turn their sales back up.

What to do when sales decline?

A long-term profitable and well-known product presented its marketing team with just that challenge. 

Sales had been strong and then began a slow and gradual decline.  Some careful changes were tested and didn’t really alter the pattern.

The client then had us review their own internal customer data.  This revealed customer type and customer location change over time. 

While the reasons were open to speculation, our exploratory purchase motivation research (via discussion in online and in-person focus groups and individual interviews) gave good direction on the likely reasons, which were confirmed by subsequent survey findings.

The client modified the product contents, to better reflect purchase motivation needs. 

They then re-advertised, emphasising how the product now satisfied buyer needs. 

And sales gradually increased.

If you would like to find out how your own customer data and purchase motivation needs research can help you to the answers you need to strengthen your sales, please call or email Philip Derham, whose contact details are on this site.

Quick answers can solve emerging customer concerns.

When, at one of your regular management meetings, someone raises a problem emerging in some customer groups, do you and your colleagues go with gut feeling and decide then?

Do you decide to wait and see what more may happen? Do you decide to research the scope and impact of the problem?

Waiting or researching may take more time than you want to risk.

A research solution to get answers quickly, so you can decide quickly, is to use one of our 20 Questions One Topic Monitors.

These need-specific and industry-specific Monitors can be undertaken with your customers and reported very quickly. They are targeted, effective and cost-effective.

To find out how the 20 Questions One Topic Monitors can get the answers you need and strengthen your quick decisions, call or email Philip Derham now.

Printed words – on screen or on paper – convince and sell!

Recently, we measured the value and effectiveness of three different customer communications channels. The client was concerned about the budget for each – and their value for money.

The client had well trained sales executives who cold called, wrote and sent out compelling email content, and sent out a regular newsletter. 

The client knew the ROI for the sales calls and for the emails, but was unsure whether the newsletter channel worked to generate enough in-bound calls to justify the budget required. 

We asked their customers which information channels most effectively stimulated enquiry and purchase, and which were of most value to them.

The answer surprised, as the graph shows.
Customers reported that calls in were more often stimulated by the newsletter than by the outbound contacts.

So, now knowing what was most effective and of most value to their customers, this client can justify continuing their newsletter, and support it with their other channels.

And they can do so knowing that they are not being old-fashioned, but are giving their customers the information they need, in the style they need – information that cost-effectively gets them more enquiries and sales opportunities.

This example may be client-specific and industry-specific, but it shows the value of testing channels and customer preferences to identify your more effective communications approaches, in terms beyond the immediate sales response.

So, please call, use the contact form or email Philip Derham to discuss how we can help generate a better understanding of your customers’ preferences and responses, if knowing those specific triggers can help you to further strengthen your business.

Merging what you know with survey answers enables you to sell more effectively.

Analysing the information you have about your customers helps strengthen your marketing.

Getting actual customer answers via surveys helps strengthen your marketing.

Combining both measures can supercharge your marketing, as a client recently reported.

When we analysed the client’s own customer data, we identified three major age/gender groups.  Then, when customer-spend-with-the-client data was added, we found seven spend and purchase frequency groups.

Each group was sufficiently different in characteristics that a single message would be unlikely to effectively motivate each.

We then surveyed customers to find what would motivate people in the different groups to respond – and also found their overall spending patterns (as they did not always shop with the client).

The pre-survey data analysis and group identification allowed the survey to answer specifically what people in each group did, and what motivated them.  The combined knowledge then enabled the client to extend their marketing effectiveness and marketing ROI. 
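The two-step approach described above – broad demographic groups first, then finer spend-based groups – can be sketched in a few lines.  Everything in this sketch (the records, the field names, the band edges) is hypothetical, for illustration only:

```python
from collections import Counter

# Hypothetical customer records; the fields and band edges are
# illustrative, not any client's actual data.
customers = [
    {"age": 24, "gender": "F", "spend": 420},
    {"age": 31, "gender": "M", "spend": 1800},
    {"age": 45, "gender": "F", "spend": 950},
    {"age": 52, "gender": "F", "spend": 3100},
    {"age": 38, "gender": "M", "spend": 760},
    {"age": 61, "gender": "M", "spend": 2600},
]

def age_band(age):
    # Step 1: broad demographic bands from the database alone
    return "<35" if age < 35 else "35-49" if age < 50 else "50+"

def spend_band(spend):
    # Step 2: spend bands split the demographic groups further
    return "higher" if spend >= 1000 else "lower"

# Count customers in each combined segment
segments = Counter(
    (age_band(c["age"]), c["gender"], spend_band(c["spend"]))
    for c in customers
)
for segment, count in sorted(segments.items()):
    print(segment, count)
```

In practice the grouping would run over the full customer database, and the follow-up survey would then be analysed by these segments.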

If you feel your 2019 can be strengthened with a similar customer data analysis/customer needs and motivations survey project, please call or email Philip Derham to discuss those next steps.

Do you see what I see? Or, more importantly, do you see what your customers see?

As managers, we’re often so busy with our jobs that we tend to see customers with problems, very important customers, or customers only episodically.  A result of this occasional seeing can be that our perceptions may not fully match the reality of our current, or of our prospective, customers.

This implicit perception problem was reinforced in a recent project, when we spoke personally to a mix of a client’s customers and prospective customers.

It seemed that the client’s advertising was cutting through very effectively, as all knew the client, its products and its locations.

However, one major barrier for prospective customers emerged from these discussions.

The people shown in the client’s advertising were almost entirely blue-eyed and blonde.  This was unremarked on by customers.

But the prospective customers said that the advertising showed that this business was a club for one group in the community, but not for all.

The reality was that the client’s business was a totally inclusive business – and would serve any customer very happily.

The problem was that the client had just not seen what prospective customers saw.

The solution was quick and easy.

A more varied and population-representative group of people was used in the following advertising campaigns.  And business rose as a result of the more inclusive vision of who its customers were.

The busy managers just hadn’t had time to see what the prospective customers saw, and so had been missing opportunity.

If you think there are opportunities to increase your business, by knowing whether what you see is what your customers and prospective customers see, please call or email Philip Derham to discuss how we can help strengthen that commonality of view – and so your business.

Getting satisfaction, recommendation and action!

Customer satisfaction is always a goal which, hopefully, leads to repeat business and recommendations from existing customers to prospective new customers.

The Net Promoter Score is often used to measure that recommendation intent.

Sometimes, as we found recently, the Net Promoter Score can be quite low while customer satisfaction measures are quite high.

Rather than report just the single (and low) Net Promoter Score against the apparently contradictory high satisfaction measure, we looked at satisfaction, then at recommendation in the last year, and then at those who gave the promoters’ rating of 9 or 10, separately from the overall Net Promoter Score.

The graph below shows the findings, by age groups.

Key takeaways are that, for this client, customer satisfaction rises with age.

Conversely, the positive Net Promoter Score ratings of 9 or 10 decline with age (under 20s excepted).

The “have recommended in the last year” measure also declines with age (under 20s excepted).

But what is very useful for managers is the finding that the “have recommended in the last year” measure mirrors, but at a higher level, the positive Net Promoter Score ratings of 9 or 10 (the promoters).

These findings suggest two practices relating to satisfaction and the Net Promoter Score.

1. Satisfaction is not a measure comparable with recommendation, and the two should be read and used as two different measures, not as complementary measures.

2. When using the Net Promoter Score, managers should include comparisons with:

a. past recommendation practice proportions, and

b. those who scored 9 or 10 on the Net Promoter Score (the positive promoters)

to give more rounded views of the Net Promoter Score measures, when those are used.
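As a rough illustration, here is a minimal sketch of computing the standard Net Promoter Score alongside the two comparison measures suggested above – the promoter share and the have-recommended share.  The ratings and recommendation flags are made up for the example:

```python
def nps_measures(ratings, recommended_last_year):
    """Return the Net Promoter Score alongside the two comparison
    measures: the promoter share (9s and 10s) and the
    have-recommended-in-the-last-year share."""
    n = len(ratings)
    promoters = sum(1 for r in ratings if r >= 9) / n
    detractors = sum(1 for r in ratings if r <= 6) / n
    return {
        "nps": round(100 * (promoters - detractors)),
        "promoter_share": round(100 * promoters),
        "recommended_share": round(100 * sum(recommended_last_year) / n),
    }

# Made-up 0-10 ratings and recommendation flags for ten respondents
ratings = [10, 9, 8, 8, 7, 7, 7, 6, 5, 9]
recommended = [True, True, False, True, False, False, True, False, False, True]
print(nps_measures(ratings, recommended))
# → {'nps': 10, 'promoter_share': 30, 'recommended_share': 50}
```

Reporting all three numbers side by side, rather than the single score, is what gives the more rounded view.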

If you’d like to discuss how these extended measures can help you manage your customer experience and recommendation marketing, please contact Philip Derham, using the Contact page form, email or phone.

Why research customers? Because knowing their views will strengthen your decisions.

We’ve long explained that when you know what motivates your customers, you can market more effectively to them.  And we use this same approach when reviewing what research tools will work most effectively for you.

Sometimes, the tools we research are major tools – online focus groups or face-to-face focus groups, for example.  And sometimes they are smaller tools – the type of questions that most effectively get the answers you need.

One type of question – the matrix question – is commonly used when several statements or products or services are measured on the same scale.  For visual economy and for ease in answering, the statement and scale questions are shown as one question, as in the example below.

This read-from-top-to-bottom format has led some to ask whether people may be more likely to answer the first listed statement and then less likely to answer statements lower on the list.

We tested whether this hypothesis was valid.  We found:

  1. No relationship between the statement order and whether people answered the statement.
  2. Other factors – participants’ awareness of the product, their use of the product, or the importance or relevance of the product to them – determine whether they answer the statements in a matrix question list.
  3. Hence, there is research method value in using matrix questions, as these enable more effective answers.

Full details are in this month’s Quirk’s Marketing Research Review (USA) – which you can see, in full, at     .

While this may be a little more of a researcher’s interest, we thought it may also be of interest to you, as it evidences the thought and care we put into every element of the research projects we undertake – to your benefit.

So, if you’d like to talk further about accurate customer research measures to help strengthen your decisions, please call or email Philip Derham.  His contact details are below.

Strengthening your Net Promoter Score effectiveness by strengthening its relevance.

The Net Promoter Score is widely used to measure customer relationships and particularly customer loyalty to the organisation.  Its simple metric enables comparisons across time, across organisations and across staff.  Changes in it should reflect the effectiveness of management-initiated action.

The risk is that its very simplicity of measure enables the Net Promoter Score to be readily used, even when the context is not appropriate.

Two utility survey examples:

  • After a customer spent two hours on the phone, resolving a simple issue, the last staff member spoken with asked if the customer would complete a short Net Promoter Score survey – and asked expressly for a good number, as that would help them remain employed.
  • The second utility performed its job, in the time advised.  On completion, it emailed the customer a brief Net Promoter Score survey.

In both cases, the service provision was the minimum required and the minimum expected.  Each delivered the service promised.  Each utility thus appeared to think that a reasonable Net Promoter Score could be expected from the contact, and so undertook their Net Promoter Score survey.

But as the context for each utility contact was to rectify a problem, the contact was not an engagement from choice.

That the engagements were forced and were engagements that could not be concluded until the problems had been resolved meant the context of the contact was more significant and more likely to influence the rating.  And of course, a direct staff request to get a good rating can also skew the genuine Net Promoter Score measures.

The research issue thus is to ensure the Net Promoter Score measures are used in the appropriate context, not as a simple one-size-fits-all-contacts remedy.

If you’d like to strengthen the value and utility of your Net Promoter Score use, and so ensure your measures are more effective, please call or email Philip Derham to discuss this further.  His contact details are on the Contact Us page (to the right on the screen menu).

Customer data reveals great opportunity!

Customer data reveals great opportunity! It enables you to identify more business opportunity and build on what you have.

If you have a customer database, you can analyse it and find useful facts.

Facts such as frequent customers using their discount cards on most of their visits, while infrequent customers use their discount cards on fewer of their visits, so missing the reward benefits and the reasons to repeat their visits.

Facts such as more of your mortgages being held by customers in Sydney’s western suburbs, though more of your customers live in Canberra.

Or that high value purchase customers are infrequent customers.

Finding opportunities like these can enable you to implement more effective marketing – perhaps to reinforce frequent shoppers’ practices, or to encourage occasional customers to shop more often.

Finding opportunities may cause you to review your branch or broker distribution, or to review product profitability to see whether frequent lower value purchases give better profits than occasional higher value purchases.

But the bit that is missing is the “Why?”

Is daily shopping just because the buyer works in an office next door, and will the buyer shift elsewhere just as happily if they change jobs? Is occasional high value shopping linked to particular emotional states or events? And beyond the obvious Mothers’ or Fathers’ Days, what stimulates those emotional states?

Databases show the opportunity, but may not tell the motivating “why”.

We can add that extra element and help you take a marketing campaign to being a motivating marketing campaign.

If you’d like to ensure your next campaign motivates most effectively, please call or email Philip Derham on T: 0414 543 765 or E: .

Identifying and then retaining disengaged customers.

When surveying customers and former customers for clients, the survey response rates from each group differ.  Customers are usually happy to participate and ex-customers rarely bother doing so.  That is expected, given the different levels of involvement with the client organisation.

But, when customers are analysed by their involvement, using measures such as recency of purchase, value of ongoing business, etc., we find different patterns of response among those the client describes as customers, as the graph shows:

Customers who have not bought, or who are not using profitable services, respond at levels similar to those of ex-customers.  Customers who are active purchasers, or users of high value products, are even more likely to participate in customer surveys than customers overall.

This is more than just nice-to-know.

This post-survey analysis enables you to identify disengaged customers before they become ex-customers.  You can then decide whether those disengaged customers are worth working to retain, or whether it’s more realistic and cost-effective to just let them drift away.

As an example, realistically, how many packs, tents, hiking boots, sleeping bags, or cooking equipment can any one person buy?  After they have bought all they can carry, are they really ongoing prospective customers – or just people who once were?

Please email or call Philip Derham to find out how our customer surveys can strengthen your marketing by helping you to identify your disengaged customers, so you can decide the most effective next steps.

New and more business from new insights.

You know what you’ve had to do to get the business you have now, and getting more and new business may just require more of the same – or may need to evolve to satisfy new needs.

And the insight into whether more of the same, or something new, is required can be effectively identified by surveys – usually, for speed and economy, by online surveys.

In designing an online survey, three key points must be followed to get the insights needed.

1.    Keep the survey short.

Our research on past surveys shows that while online surveys longer than 15 minutes still get some completions, these longer surveys have more people dropping out part-way through.

2.    Don’t ask what you already know. 

When you ask customers for information you already know (e.g. age, gender, home postcode or State), you waste questions unnecessarily.

3.    Make it easy to do on mobiles and on computers. 

While more people still do their online surveys on computers and laptops, an ever-increasing proportion use their mobile phones – with few using tablets.

Any question that is not easy to answer on a small phone screen will lose people and lessen the insights available to you.

When you are ready to gain fresh insights to help you grow your business, please call or email Philip Derham, so we can design the survey that will get you those needed insights.

Mixing or matching? Hearing or missing new business opportunities.

Should we mix or match when running focus groups for you?

Mixing different types of people in a group can expose a new business opportunity – or can silence the exposure of that same opportunity.

Part of our skill in identifying new business opportunities for you comes from deciding whether to mix or to match.

Should we mix different types of people in a focus group?

As an example, while men see themselves as silent, stoic and strong when ill (man-flu excepted), mixing the well and the ill in focus groups showed that the business-opportunity differences related more to education and income than to health.

This insight then led to quite different and more effective communications.

Should we have similar types of people in the focus group?

Conversely, matching can also highlight opportunity.

If you were looking to boost your mortgage lending, matching like-people with like-people can uncover common difficulties in saving and in meeting lending criteria, identifying appropriate communication approaches.

Mixing different types of people may, in contrast, lessen the understanding of the motivations and needs of those with average incomes, if those with average incomes were in a group mixed with people who earn seven figure incomes each year.

Choosing the approach that will work best for you

Mixing or matching to make focus groups more effective depends on the products or services from which you want to increase your sales.  In mixed or in matched focus groups, our accurate identification of your customers’ motivations can help you to further strengthen your business.

If you’d like to discuss how we can help you by using focus groups that reveal customer or prospective customer motivations to strengthen your communications and so results, please call or email Philip Derham now.

How knowing what they think enables more business.

We see our customers all the time – in our stores, in our branches, on the phone, in their emails, on social media, in the daily or regular customer reports, in the Net Promoter Scores, and particularly in the profit and loss accounts.

So we think we know them.  We think we know what motivates them and market to our thoughts and get commensurate results.

But thinking and knowing may not be the same, as a client recently found.  Volunteers give it over 25,000 days of free labour every year, but the client, like any business, needs both labour and cash to operate. 

Knowing how committed their volunteers were made the volunteers seem a perfect segment for cash donations as well, but the volunteers gave less cash than expected.

We ran focus groups with the volunteers and clarified their motivations, which explained why the marketing had generated less than expected.

Post-focus group result?  A revised marketing campaign that appealed to the volunteers’ actual motivations.

So, if identifying your customers’ actual motivations can further strengthen your marketing, please call or email me, and we can discuss the next steps to your strengthened results, just as the client has done.

Five facts that form a trend that influences your business.

The Australian Bureau of Statistics tells us facts – lots of facts – and leaves the drawing together and understanding of the influence of those facts to us.

Five recent facts that influence your business are that the proportions of people who:

  1. Own their homes outright declined from 43% to 38% (1995-96 to 2015-16),
  2. Own their homes with a mortgage rose from 28% to 37% in the same time,
  3. Are children fell from 22% of the population to 19% (1995 to 2015), and
  4. Were in families with children fell from 64% of all in 1996 to 61% in 2016. And yet
  5. Spending on non-essentials was down.  In 1984, 44% of household income was spent on discretionary items, but by 2015-16, this had declined to just 41%.

Individually, these facts may or may not be interesting but, as a collective, they raise concerns for your business.

Proportionately fewer families have children, yet there is less discretionary spending.   More is being spent on basic items, including mortgages.

Less discretionary spending means less on optional purchases – the small or the larger pleasures, as well as other products, services or facilities.

Seeing these facts as parts of a trend, you could continue marketing as before or you could find out what your customers do and want, then see what motivates them, so they’ll spend more of limited discretionary spending with you.

We specialise in finding these opportunities and motivations quickly and cost-effectively.  So, as you reflect on the new year and its challenges, call or email Philip Derham now to start identifying the trends that are influencing your business and to find out how to motivate your customers your way.

Improve your Satisfaction KPIs – by up to 3 SEs.

When customers do more with you, you can assume they are satisfied with your products, prices and service.

But as assumptions are just assumptions, many organisations survey their customers to establish their satisfaction levels and report those as formal KPI measures.  We’ve found it strengthens your satisfaction KPI results if you let your survey run for a little longer.

Most people complete their online or on-mobile surveys within 48 hours of their emailed invitation, so it can be tempting to close then and report the results quickly.

For customers who are just satisfied, that approach is fine.

The proportions of customers who are just satisfied remain stable from the first to the tenth day after the survey opens.

But the very satisfied customers are different.

While more of the very satisfied customers complete their survey within 48 hours, quite a few take longer.  They keep your email invitation and return to it three, five or even seven days after it was sent to them.

In surveys, we’ve seen the extra time help the very satisfied proportion grow from 54% after 48 hours to 62% after five days – a significant gain of three standard errors.

This very satisfied proportion growth also boosts the net satisfaction level, making your net satisfaction KPI stronger too.
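For readers who like the arithmetic, a gain like that can be expressed in standard-error units as follows.  The 54% and 62% figures are from the example above; the sample size of 400 is purely an assumption for illustration:

```python
import math

def se_gain(p_before, p_after, n):
    """Express the change between two proportions in standard-error
    units, using the standard error of the first proportion,
    sqrt(p * (1 - p) / n), as the yardstick."""
    se = math.sqrt(p_before * (1 - p_before) / n)
    return (p_after - p_before) / se

# Very satisfied grows from 54% to 62%; n = 400 is assumed
gain = se_gain(0.54, 0.62, 400)
print(f"Gain of {gain:.1f} standard errors")
# → Gain of 3.2 standard errors
```

The larger the sample, the smaller the standard error, so the same eight-point rise would count for even more standard errors on a bigger survey.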

As we effectively measure customers’ satisfaction, please call or email Philip Derham to discuss your next steps to better measures and strengthened satisfaction KPIs.  His contact details are below.


Telephone: (+61) 0414 543 765

Evolving concepts – new names, new logos, new types of ADI.

As you well know, the ADI financial services sector is evolving, and rapidly.

New entrants and new payment or transfer methods are appearing.

The Government is encouraging more competition, seeking to further benefit customers.

In response,

  • Some ADIs have merged.
  • Some former Credit Unions have become banks – and more are considering that change.
  • Some have renamed.  Some have re-branded – with changes in logos and identity.
  • Some remain as they have been.  And
  • Some are hedging their bets, registering possible new names and exploring the opportunities.

Much of the reviewing is being quietly undertaken in-house, with staff and Directors.  But knowing what your current members and eligible prospects think can clarify the likely outcomes, by telling you how changes may influence financial activity with you – or with another ADI.

Over the last twenty-five years, we’ve researched the impact of changes or not-changes of ADI type, of ADI name and branding, logo and presentation.  And seen some very positive outcomes follow.

Please call or email me to know more – whether from using our expressly developed 20 Questions Brand Monitor™, or our logo options testing facilities, or by undertaking your own bespoke customer and prospect research.

Then, with sound knowledge of the impact of change – or of no-change – we can strengthen your final decision.  So please call or email me now, to activate the steps to that knowledge.  Click on the Contact page for quick contact now!

Perceptions matter because they influence action.


Since the Australian Bureau of Statistics’ August retail sales results were released last Thursday, media commentators have bemoaned the retail sales decline of 0.6% since the previous month.

And if Australians are tightening their wallets and not spending on retail, and if the mood is one of caution, investing to generate sales and more sales needs to be targeted.  It needs to be targeted to groups more likely to spend, borrow, invest as you need.  And targeted to appeal to the motivations of people in those groups to act.

Clarifying those two targets is what we do.

Our research shows who the targets for your profitable products and your profitable services are.  Our research then identifies what you need to offer to motivate them – to excite their interest and ultimately, their action.

If you feel your sales could be strengthened now, with your profitable products and services sold more effectively, please call me to discuss the steps to achieve that knowledge.

Brands. To have and to hold.

Known brands are shorthand that gets quick customer and prospect recognition, response and purchase.  Unknown brands get querying and questioning; may not be seen; and get lower consideration and lower purchase likelihood.

If your brand is known, why change?

  • To strengthen the business. New laws allowing Credit Unions to rebrand as Banks may enable more new business.
  • To enter new markets.  The Surrey Hills Café brand may be too localised to work elsewhere.
  • To become current.  The Buggy Whip Manufactory brand may be too dated for a phone app developer.
  • The brand is damaged.  The White Pointer Bait surfboard brand may sell more if rebranded.

Choices for the better brand decision.

Take a punt and hope for the best?

Or you can find out what your current and future customers’ views of your brand are now, and the impact of alternatives. Then, with knowledge, decide.  (The business-strengthening decision.)

Identifying your customers’ views can be done using online and offline focus groups, specific surveys or our 20 Questions Brand Monitor™.

The 20 Questions Brand Monitor™ is an online survey, for all your email-accessible customers, with personalised questions about brand awareness, brand attributes, brand perceptions, brand preferences, and likely actions if the brand were…

We can add a sample of others not from your database as a comparison, if needed. To know more, please contact me today!  Details below.

Extending customer research reach finds more sales opportunities.

Working in an office normalises office practices.  We receive, read and reply to emails at work on work computers.  And we mostly complete online surveys at work, on our work computers.

These practices are seen as normal for people in offices.

But as we’ve seen in recent surveys, one in four read emails and complete online surveys on their small-screen smartphones, at work or elsewhere.

Up to the age of 40, half or more in each age group used their smartphone to complete the survey.

Smartphone and computer users took much the same time to complete their surveys (14 and 13 minutes), indicating our online surveys are easy to do on any device.

This enables a full range of your customers to complete our online surveys, giving a better knowledge of their needs and so of sales opportunities.  Please ring or email Philip Derham to find out more.

Time to reflect on our assumptions.

Down time – particularly that of a relaxing summer break – can allow for reflections on the year just past. On its highs, on its what could have been, on the assumptions that were made.

That reflective time can enable you to return, as some did last year, with a determination to test marketing assumptions they’d made during the year. Some of the assumptions we tested for them included:

* That people “like me” were the customers. We found the customers were less “like me” and so a changed sales approach became profitable, quickly, as it targeted actual buyers more directly.

* That the most profitable customers were the most satisfied. Data analysed with survey results found the most satisfied customers were not necessarily the most profitable, requiring a rethink of the core customer group.

* That sponsoring a local sports team would get almost total audience reach, which we found was not necessarily the case. Knowing this enabled them to allocate their sponsorship investment more effectively.

So, if you’ve had the time and the opportunity to consider assumptions, and would like to test those, to ensure your marketing and sales investment delivers the maximum returns, please ring or email or use the form on this site to contact Philip Derham to discuss the steps.

Does being patient pay off?

Are laggards more likely to be satisfied or dissatisfied? The answer might surprise you. Philip Derham reports.

This article first appeared in the June 2016 edition of the AMSRS Research News and is republished with permission.

Clients often use measures of customer satisfaction as an element in their assessments of their own corporate effectiveness, competitive advantage, and senior management’s performance – and related senior managers’ bonuses. Our online surveys thus normally include satisfaction measures, which we seek to make as effective and accurate as possible.

When reviewing satisfaction measure effectiveness, we look at the data itself and how we have measured that. Today, this can often include how long the online survey is open, as clients are often keen on speedy, though accurate, results.

Our online survey software allows us to see when each survey response is received. When graphed, the time and day results from a recent 2,934-sample survey, with three email contacts, raised questions, as Figure 1 indicates.

The double peaks of high response within 24 hours of each email’s dispatch, shown in Figure 1, were initially interesting but had a simple explanation. The survey invitation emails were sent in the early afternoon and would have been received shortly after. Most who responded did so when they received the emails – mainly in the afternoon and evening of the day of dispatch or the next morning, with few accessing their emails overnight. Hence, the response was largely immediate, as Figure 2 shows.

Figure 2: Survey completion in the 24 hours from the dispatch of the survey invitation, shown on a 24-hour clock face

This led us to question why a few took two, three, and up to six days after the email was received to complete the survey. We first wondered if these laggards were dissatisfied customers.

The evidence indicated they were not. Only three per cent of all who responded were dissatisfied. Similarly, three per cent of those who answered within 24 hours of the email were dissatisfied, and two per cent of those who responded two, three or four days later were dissatisfied. Hence, the laggards were not dissatisfied customers. Nor did the second or third email affect the low dissatisfaction rate, suggesting more prompts do not, of themselves, cause dissatisfaction.

However, there was a significant difference between groups of satisfied customers, depending on their time of response.

Overall – and this was the result reported to the client – 86 per cent of customers were at least satisfied with the client, and 55 per cent were very satisfied. The remaining 11 per cent were neutral, had not used the service, or had not answered the question.

The significant difference was that while 55 per cent of all customers were very satisfied, and 54 per cent of those who did the survey within the first 24 hours were very satisfied, proportionately more customers who did the survey two, three, four or five days later were very satisfied – 62 per cent, as Figure 3 illustrates.

Figure 3: Comparisons of very satisfied customers, in total, of those who responded within 24 hours of an invitation, and those who responded later

It is unlikely that the difference between the two groups is due to chance, as the late responders’ group results were nearly three standard errors larger than the results from the group that responded within 24 hours of the invitations.
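That “nearly three standard errors” comparison is, in effect, a standard two-proportion z-test. As a minimal sketch – using illustrative counts, since the exact group sizes are not reported in the article and the figures below are assumptions – the calculation looks like this:

```python
from math import sqrt

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for the difference between two independent sample proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                        # pooled proportion
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))  # SE of the difference
    return (p2 - p1) / se

# Illustrative, assumed counts: ~54% very satisfied among ~1,790 early
# responders vs 62% among an assumed 300 later responders.
z = two_proportion_z(966, 1790, 186, 300)
print(round(z, 2))  # close to three standard errors with these assumed counts
```

A z value above 1.96 would conventionally be read as significant at the 95 per cent level; the exact figure depends on the real group sizes, which this sketch only assumes.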

An interpretation of these results is that while most of the very satisfied customers will complete an online survey immediately, a separate, numerically smaller but proportionately larger group of very satisfied customers will complete the survey some days after they receive their invitations.

The conclusions from this one survey analysis are that:

  • The practice of sending up to three email invitations to customers on client databases does not increase the proportion of dissatisfied customers
  • The longer the online survey is open (to the 14 days tested), the more very satisfied customers will complete it. This may strengthen the proportion of satisfied customers reported to clients, and so strengthen their own subsequent satisfaction-related actions.
  • However, if results are urgently required, online satisfaction surveys can be closed earlier, with slight reductions in overall net satisfaction scores likely.
  • It would be useful to undertake further analysis to compare very satisfied customers who immediately complete the survey with those who take two or more days to do so.
  • These single online survey findings have stimulated us to look at more of our surveys to see whether this is a singular result, or a harbinger, and to report those findings later. We suggest others may wish to review their satisfaction survey results to see if we can develop a broader industry view on this.

Philip Derham, Director, Derham Marketing Research Pty. Ltd.


What is a recommendation really worth?

For one online retailer, the value of recommendation is clear.

For them, a recommendation is worth $20.

Just give them a friend’s email and, if your friend buys, they get $10 off that purchase and you get $10 off your next purchase (T&Cs apply).

Otherwise, implicitly, your recommendation is not of value.

Thinking on this, several questions follow.

  1. What is the value of recommendation to your business?
  2. Who recommends?
  3. Who do they recommend to?
  4. Do the recipients follow the recommendation?
  5. And if they do, when?
  6. Do recommendations generate business – or just nice warm feelings? And,
  7. If recommendations do generate more business for you, how can you stimulate more of them?

Our Recommender Advantage Monitor answers these questions (and more) and can help you establish the value of recommendations for you, and how you can stimulate more of them.

If you want more useful recommendations, please contact Philip Derham for more details.

World views collide to distort public discourse.

The media exist to inform, to educate, to convert and, particularly, to monetise contact with people in societies.

The market research discipline uses applied scientific techniques to gather replicable and reliable insights and knowledge to support decision-making.

These two different worlds do meet.

As examples of the collision between the two worlds, in the last month, four media reports based on survey results implicitly or expressly argued for public and perhaps Government action on three topical issues – hunting foxes, lessening obesity, and Muslims in Australia.

In one article, the journalists noted the survey had been repeated twice and the survey’s usual sample was 1,000.  The second article noted a sample of 304 people.  The fox-hunting story mentioned no sample size, and the obesity warning report mentioned a sample of more than 86,500 people collected over a 12-month period (1).

Relevant issues for market researchers in assessing the data would include the sample size, sample selection, sample representativeness, sampling error, survey method, whether the results were weighted to their population, and the impact these market research issues may have on the findings reported.  The language, the question wording and any information supplied as context for the questions asked were not discussed in these reports.
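Of those checks, sampling error is the easiest to quantify. For a simple random sample, the worst-case 95 per cent margin of error is 1.96 × √(0.25 / n). A brief sketch comparing the two sample sizes mentioned above (assuming simple random sampling, which media polls often are not):

```python
from math import sqrt

def margin_of_error(n, z=1.96):
    """Worst-case (p = 0.5) 95% margin of error for a simple random sample of n."""
    return z * sqrt(0.25 / n)

print(f"n=1,000: +/- {margin_of_error(1000):.1%}")  # about +/- 3.1 points
print(f"n=304:   +/- {margin_of_error(304):.1%}")   # about +/- 5.6 points
```

So the 304-person survey carries nearly double the sampling error of the 1,000-person one – before any questions of sample selection or weighting are considered.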

Discussion of these methodological issues may not be as exciting as commentary on incendiary findings, but discussion of the validity of findings is needed to ensure the community draws the correct information from the studies.

People may still then act on their prejudices – as the 200,000 witches tortured, burnt or hanged in the UK over the past 300 years would attest, were they here (2) – but reporting survey findings with the careful caveats we market researchers would include may better inform, and so influence, public discourse.

We market researchers can and may contact individual journalists to suggest these questions, but one-off individual action may not have the influence required.

Rather, the conclusion from the reporting of these four surveys is that the Australian market research organisations should develop, distribute and promote a journalists’ guide to the questions that should be answered before survey results are reported.

As Kirkpatrick said, reporters should make sure of the validity of their survey-based facts before reporting them (3).

To do so is to enrich rather than to distort public discourse.


1  The (UK) Guardian, October 3, 2016; September 21, 2016; September 27, 2016; September 21, 2016.




Knowing, not guessing, what motivates really helps you sell.

Joining was easy.  I was in the store.

I’d bought some hiking gear and would get a slight discount if I joined the retailer’s loyalty scheme, giving them personal details to add to the product purchase details.

A great start to an ongoing sales relationship?  No!  No, because the retailer’s monthly promotional email is a standard listing of things I could buy for myself, at cheap prices.

Yet, in this sector, what motivates me is hiking gear for children.  They love the outdoors, and I’m a sucker for anything that makes their hiking experiences better.

The retailer’s lack of knowledge of what motivates me, and people like me, means their emailed promotions do not generate sales from me – nor from people like me.

After all, I only need one pack and one pair of boots.  The equipment opportunities for the children are far, far greater – they keep growing and so need new clothing, new boots and new gear.

Our customer research can easily, quickly and cost-effectively tell this retailer what really motivates groups of their customers.  Then they can market to those actual motivations and generate more sales as a result.

If you feel your marketing could also be sharpened with appeals to your customer groups’ motivations, and want assistance identifying those, call, email or use the form below to contact Philip Derham for the next steps.

Do you see what I see?

The risk is that not all customers, not all prospects, see what you see.

In the real-life, graphed example, staff see their business benefits as being friendly, secure and offering competitive products.

Fewer customers agree.

Prospects do not see what staff see and scarcely see the client at all.

Benefits seen from within may not be the reasons customers or prospects buy.

Is, for example, the benefit sold to shoppers of “the great tasting coffee” the main reason to buy coffee from a shopping centre café?  Or is it the chance to sit and rest, and have some refreshment?

Marketing to “what the buyers see” rather than to what you see increases your marketing return on investment.

Knowing what your customers and prospects see, want and will buy makes your marketing more effective and increases its ROI.

Research identifies buyers’ and prospects’ perceptions.  So by knowing, research sharpens your marketing and your marketing investment return.

To ensure your views and your customers’ views are aligned, please call or email me for the next steps.

Getting more recommendations. A very cheap way to advertise – effectively.

Recommendations from customers are a very cheap form of advertising.  And one which really extends your marketing budget further and further at no extra cost.

The question “How to get more recommendations from your customers?” has an answer.  Our past research has identified, in specific industries:

  • One essential attribute that drives recommendations;
  • Four characteristics that recommenders share; and
  • Three core groups who may be influenced by recommenders.

Knowing your own specific groupings will enable you to work on the essential attributes that drive recommendations for you.

Knowing what your recommender/promoter groups are, will enable you to target those people most likely to recommend, with suggestions that they do so.

Hence, we’ve developed a brief Recommender Advantage Monitor that identifies which of your customers are most likely to recommend you – and why.

If you’d like to know more, please call or email Philip Derham, or use the form below.

How we can strengthen the results from your online surveys

Improved online survey techniques help you get strengthened results at no extra cost, which is why we continually review our research techniques.

One recent review showed that we could get good numbers of responses from 1 email invitation and could close the survey in 48 hours (as the graph shows).

But additional customers responded to the 2nd invitation.

Keeping the survey open longer also enabled more very satisfied customers to respond.

In this reviewed customer satisfaction survey, 3,000 responded – most (61%) within 24 hours.  But responses continued for days after each email.

We analysed the “within 24 hours”, and the “2+ days later” responses.

We found that:

  • 54% of those who responded within 24 hours were very satisfied with the client,
  • 62% of customers who responded 2+ days later were very satisfied.
  • This was a difference of nearly 3 standard errors, and so statistically significant.

Including these later-responding very satisfied customers increased the overall satisfaction measure, a useful benefit if customer satisfaction is a KPI.

The conclusion is that while we can close a survey after 48 hours and have a good sample size, to give you the best results our preference is to keep your online surveys open for at least 10 days.

To know more about our business-strengthening online surveys, please call or email Philip Derham.  Please click on the Contact Us tab to email or to ring him.

Smartphones and strengthened survey responses


Solutions to strengthen your online survey response.

Research technique matters to clients when we (or others) find that technique improvements can deliver strengthened results with no increase in fees.

One recent research technique we reviewed was how the increased use of smartphones now influences online survey completions.

We analysed 14,111 recent online surveys and found that:

*  4 in 5 of the people who started a survey completed it,

*  84% of those who used desktops completed their survey,

*  66% of those who used smartphones completed their survey.

As every participant is valuable, we asked why fewer smartphone users completed their surveys.  The reasons were clear:

*  The smartphones’ smaller screen sizes; and so

*  A reduced capacity to answer the questions easily (“my fingers are too big”); and

*  Smartphones’ often slower Internet speeds than desktops’.

The solutions were just as clear and so we recommend:

*  Shorter surveys (with split samples and questionnaires, if needed), specifically written for you;

*  When relevant, using one of our smartphone-specific 20 Questions One Topic Monitors or our Summary Spotlight Surveys.

For more information about how we can strengthen your survey findings, and so strengthen your decisions, please call or email Philip Derham now!  His contact details are under the Contact Us tab at the right/top of your screen.

The article and the data in full are available in the June 2016 edition of Quirk’s Marketing Research Review.


3 steps to more cost-effective focus groups.

People are influenced by where they live, and where they live influences their purchase behaviour.  These geographically-based differences in attitudes and behaviour can affect your sales.

So when looking to understand the influence of these geographically-based differences on your sales, we have found face-to-face focus groups to be a useful research tool.

But face-to-face focus groups across the country can be costly.

We may have to travel from Dubbo to Darwin and to Doncaster and beyond to ensure your key customer or prospect segments are researched.  And you can’t always be there yourself to see what they are saying.

This is the tyranny of distance in Australia – and beyond, if you have an international business.  But there is a solution.

That solution is that of online focus groups.  We run each of those online focus groups with 4 or 5 or 6 people selected for relevance from a range of areas.  All participate at the same time.  They use their screens and web cams, so we can all see each other and the images or words we want them to see, just as we do in face-to-face focus groups.

With three key differences.

  1. The participants are in their homes (in Sydney or in New York, in Bendigo or in Idaho, in Penrith or in Perth) and I’m in my office.  No travel.  No travel costs.
  2. You can watch each group from work or at home, as fits your time.
  3. You can, via the researcher, seek more comment on a participant’s comment, while that topic is relevant and fresh.

If you want to know more about online focus groups and their capacity to inform you, more cost-effectively, call or email me, Philip Derham, and I’ll tell you more – online or offline as you wish.