Three steps to help you benefit from COVID-19 lockdown-generated changes in customers’ behaviour.

Recently, a client had a sales problem and needed help to find out why.

We undertook a three-step research project to find out where sales were falling, why, and what was needed to reverse the decline.

That process is summarised in the short (1 minute 45 second) video, Insights to help reverse sales declines or to increase sales, which you may find useful.

Useful?  Yes. 

Why?  Because the COVID-19 lockdowns are very likely to change customers’ future behaviour.  And when you know what those changes are, you can manage for them.

If the three steps detailed in the video can help you plan to benefit from those behaviour changes, please email or call Philip Derham on 0414 543 765 or at derhamp@derhamresearch.com.au.

Online tools will find insights, despite the coronavirus.

Online tools can generate insights, despite the coronavirus, as this two-minute video outlines.

People participate in interviews, group discussions or surveys online, talking into their own computer or mobile (cell) phone.

People participate from their homes or wherever they feel safe. The interviewer is in another place, and the discussion is via computer or phone, online.

As the video notes, online interviews with participants at home, with real-life family dynamics around them, can reveal further motivations and behaviour.  These additional insights can strengthen your marketing.

We can keep your business-strengthening insights coming – despite the coronavirus.

As you know, people are becoming cautious about associating with others because of fears of catching coronavirus.

We can stop neither the fears nor the virus, but those need not stop the customer insights needed to strengthen your business.

Our online interview and survey tools enable us to talk to and listen to people, in their homes or wherever they feel safe.  And they are still willing to assist by sharing their thoughts and motivations with us – online.

As an example of how well the online interview tool can work, we talked recently with a couple, online, about their finances and mortgage intentions.  We wanted to understand why they’d not sought a mortgage, though employed and eligible.  This insight was to help our client strengthen marketing approaches to people in similar situations.

After twenty minutes of online/computer-visual chat with the apparently happy couple, a small child appeared in the background, saw me on the screen, waved and said “Hi”.

One partner sweetly asked the other if she would put the child to bed, again.

After the female partner left, the almost lovey-dovey appearance also disappeared.

For the next few minutes, the man explained their previous marriages, that the child was hers, not his, that only she could discipline the child, and so on.

The female partner reappeared. 

As did the image of the apparently happy couple facing me on the screen – a couple still unclear about their motives for not seeking a mortgage.

The online interview provided an interesting insight to be tested in the subsequent online survey.

But the additional takeout was that the insight was obtained safely, for the couple and for me, so there was no fear of catching anything from the interview.

Thus, even in a time of heightened caution about personal interaction, we can get valuable insights by using our online research tools.

If you’d like to know more, please call or email Philip Derham to discuss how we can help you.

T: (+61) 0414 543 765

E: derhamp@derhamresearch.com.au

Satisfaction, the NPS, and Recommendation

Satisfaction is a key metric in measuring how customers feel after they have dealt with your organisation.  It can indicate ongoing business intention.

The Net Promoter Score (NPS) is designed to measure customer experience and indicate future business growth.
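
The arithmetic behind the NPS is simple: the percentage of promoters (ratings of 9 or 10 on the 0-10 scale) minus the percentage of detractors (ratings of 0 to 6).  A minimal Python sketch, with hypothetical ratings, is below.

    def net_promoter_score(ratings):
        # NPS = % promoters (9-10) minus % detractors (0-6).
        promoters = sum(1 for r in ratings if r >= 9)
        detractors = sum(1 for r in ratings if r <= 6)
        return round(100 * (promoters - detractors) / len(ratings))

    ratings = [10, 9, 8, 7, 6, 10, 9, 3, 8, 9]  # hypothetical 0-10 ratings
    print(net_promoter_score(ratings))  # 50% promoters - 20% detractors = +30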

Recommendation measures record what people have actually done and to whom the recommendation was given.

Combined, the three measures provide clear KPIs that you can use to further strengthen your business performance and profitability.

Our research has found there can be disconnects between customer satisfaction levels, the NPS and past recommendations.

These disconnects can make it difficult to know which business elements need improvement, so that high satisfaction and past recommendation practices also generate high NPS measures. 

We’ve developed a quick and cost-effective solution to identify which customer groups you may need to concentrate on, so all the measured elements are in accord.

The solution is our Satisfaction NPS Recommender Monitor.

This Monitor enables you to identify those customers who are satisfied, who are intending promoters (yes, those with good intentions – think diets and weight loss intentions), and those who are actual recommenders.

When you know, you can market to the specific customer groups that need additional attention to strengthen your business. 

To know how the Satisfaction NPS Recommender Monitor can help you strengthen your business, please contact Philip Derham.

Words and thoughts

Thoughts and words can provide fascinating insights into motivations, action and outcomes.

But what is said in words can be different to what is thought, and so can lead to less than desired business outcomes.

As examples:

  • When your boss asks you to do something, your words may sensibly express agreement, while your thoughts, and perhaps your subsequent practice, may rebel.
  • When asked about product features on a pre-set list, the words given may be chosen from the list, but the thoughts that motivate purchase may be different – “Yes, milk chocolate is delightful, flavour-filled, sweet and value-for-money”.  The thought may be “Don’t eat milk chocolate because of the calories”.  Hence words and thoughts can generate quite different sales outcomes.

The insights that follow from aligning words and thoughts enable more effective management, more effective product or service integration to meet needs, and so better business outcomes.

We use a range of in-person, online and other market and business research tools and techniques to ensure the words that our participants express match closely to their thoughts – and subsequent behaviour.

If you think there is a gap between the words you hear and what your staff or your customers actually do (perhaps because of what they think), and there is business benefit in marketing to better-aligned words and thoughts, please ring or email Philip Derham to find out how we can help you now.

Insights explain divergences between satisfaction and the NPS

Recently, a client asked us to add the Net Promoter Score (NPS) to their customer engagement KPIs.  In our next customer survey for them, we found they had maintained their high and positive 90% customer satisfaction measure but had an NPS of just +19.

As the two measures seemed a little at odds, we looked for insights to explain the divergence.

Our first insight was the divergence between the client’s very satisfied and their (just) satisfied customers, as the graph shows.

The very satisfied customers generated a very positive NPS of +54, but the (just) satisfied customers generated a very negative NPS of -117.

We then saw that the (just) satisfied customers were primarily detractors, who were unlikely to recommend the client to others.  For the NPS to rise, the client knew they’d have to improve but needed to know what, where and for which customers.

The survey results showed the very satisfied customers differed noticeably from the (just) satisfied customers in demographics and use of the client’s services.  Research also highlighted differences in attitudes and in motivation between the very satisfied and the (just) satisfied customers. 

These insights enabled the client to use their database more effectively to promote relevant messages to the (just) satisfied customers and so strengthen their next NPS results.

If similar insights about your customers and their motivations can help you strengthen your business in the next year, please call or email Philip Derham – at derhamp@derhamresearch.com.au or on 0414 543 765.

The wheel? Reinvent or reuse?

A problem can emerge when you need to decide now but know a better decision would follow if your customers’ views or intentions, expectations or needs were there to help guide it.

Research – asking people directly, via a survey – can often give you that decision-strengthening knowledge.  And it can do so speedily and inexpensively, with big audiences, because today surveys are mostly online.

The question then is whether your knowledge need is unique, or one that others sometimes face, too.

If your problem or your circumstances are unique, you will need a specifically written survey.

But, if what you need to know is not uncommon, there may be a more cost-effective option to writing your own unique survey.

That option can be to take an existing relevant survey and tweak it to your specific needs.  That can be quicker, cheaper and more effective.  And those pre-written surveys include the relevant supplementary questions needed to clarify the answers given.

As we’ve found commonalities in knowledge needs over the years, we’ve developed a range of online Monitors – our 20 Questions One Topic Monitors © – that are pre-written surveys.   These get answers to questions about communications, customer needs and intentions, customer practice (that can supplement your own database analyses), customer satisfaction, experience, recommendation and more. 

The surveys can be undertaken with your own customers or customers from a broader pool, as needed.  The 20 Questions One Topic Monitors©  are modified to match your specific market and competition, branding and need.  And for a very cost-effective investment.

If you have a decision need that can be strengthened with answers from a survey, these Monitors may answer your need. 

To find out more, please email or ring me – Philip.Derham@derhamresearch.com.au or 0414 543 765.

Research helps marketers turn their sales back up.

What to do when sales decline?

A long-term profitable and well-known product presented its marketing team with just that challenge. 

Sales had been strong and then began a slow and gradual decline.  Some careful changes were tested and didn’t really alter the pattern.

The client then had us review their own internal customer data.  This revealed customer type and customer location change over time. 

While the reasons were open to speculation, our exploratory purchase motivation research (via discussion in online and in-person focus groups and individual interviews) gave good direction on the likely reasons, which subsequent survey findings confirmed.

The client modified the product contents, to better reflect purchase motivation needs. 

They then re-advertised, emphasising how the product now satisfied buyer needs. 

And sales gradually increased.

If you would like to find out how your own customer data and purchase motivation needs research can help you to the answers you need to strengthen your sales, please call or email Philip Derham, whose contact details are on this site.

Quick answers can solve emerging customer concerns.

When, at one of your regular management meetings, someone raises a problem emerging in some customer groups, do you and your colleagues go with gut feeling and decide then?


Do you decide to wait and see what more may happen? Do you decide to research the scope and impact of the problem?


Waiting or researching may take more time than you want to risk.

A research solution to get answers quickly, so you can decide quickly, is to use one of our 20 Questions One Topic Monitors.

These need-specific and industry-specific Monitors can be undertaken with your customers and reported very quickly. They are targeted, effective and cost-effective.

To find out how the 20 Questions One Topic Monitors can get the answers you need and strengthen your quick decisions, call or email Philip Derham now.

Printed words – on screen or on paper – convince and sell!

Recently, we measured the value and effectiveness of three different customer communications channels.  The client was concerned about the budget for each – and their value for money.

The client had well trained sales executives who cold called, wrote and sent out compelling email content, and sent out a regular newsletter. 

The client knew the ROI for the sales calls and for the emails, but was unsure whether the newsletter channel worked to generate enough in-bound calls to justify the budget required. 

We asked their customers which information channels they saw as most effective, and of most value to them, in stimulating enquiries and purchases.

The answer surprised, as the graph shows.

Customers reported that inbound calls were more often stimulated by the newsletter than by the outbound contacts.

So, now knowing what was most effective and most value to their customers, this client can justify continuing their newsletter, and support it with their other channels. 

And they can do so knowing that they are not being old-fashioned, but are giving their customers the information needed, in the style needed – which cost-effectively gets them more enquiries and sales opportunities.

This example may be client-specific and industry-specific, but shows the value of testing channels and customer preference to identify your more effective communications approaches in terms beyond the immediate sales response.

So, please call, use the contact form or email Philip Derham to discuss how we can help generate a better understanding of your customers’ preferences and responses, if knowing those specific triggers can help you to further strengthen your business.

Merging what you know with survey answers enables you to sell more effectively.

Analysing the information you have about your customers helps strengthen your marketing.

Getting actual customer answers via surveys helps strengthen your marketing.

Combining both measures can supercharge your marketing, as a client recently reported.

When we analysed the client’s own customer data, we identified three major age/gender groups.  When customer-spend-with-the-client data was added, we found seven spend and purchase-frequency groups.

Each group was sufficiently different in its characteristics that a single message would be unlikely to motivate them all effectively.

We then surveyed customers to find what would motivate people in the different groups to respond – and also found their overall spending patterns (as they did not always shop with the client).

The pre-survey data analysis and group identification allowed the survey to answer specifically what people in each group did, and what motivated them.  The combined knowledge then enabled the client to extend their marketing effectiveness and marketing ROI. 
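
As an illustration of that pre-survey step, here is a minimal sketch that bands a customer file by spend and visit frequency and cross-tabulates the resulting segments.  The column names and figures are illustrative assumptions, not the client’s data.

    import pandas as pd

    customers = pd.DataFrame({
        "age": [23, 34, 45, 56, 29, 61, 38],
        "gender": ["F", "M", "F", "F", "M", "M", "F"],
        "annual_spend": [120, 900, 450, 2100, 300, 80, 1500],
        "visits_per_year": [2, 12, 6, 24, 4, 1, 18],
    })

    # Band spend into terciles and visit frequency into named bands,
    # then cross-tabulate to see the segment sizes.
    customers["spend_band"] = pd.qcut(customers["annual_spend"], 3,
                                      labels=["low", "mid", "high"])
    customers["freq_band"] = pd.cut(customers["visits_per_year"],
                                    [0, 3, 12, 100],
                                    labels=["rare", "regular", "frequent"])
    print(pd.crosstab(customers["spend_band"], customers["freq_band"]))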

If you feel your 2019 can be strengthened with a similar customer data analysis/customer needs and motivations survey project, please call or email Philip Derham to discuss those next steps.

Do you see what I see? Or more, importantly, do you see what your customers see?

As managers, we’re often so busy with our jobs that we tend to see customers with problems, very important customers, or customers only episodically.  A result of this occasional seeing can be that our perceptions may not fully match the reality of our current, or of our prospective, customers.

This implicit perception problem was reinforced in a recent project, when we spoke personally to a mix of a client’s customers and prospective customers.

It seemed that the client’s advertising was cutting through very effectively, as all knew the client, its products and its locations.

However, one major barrier for prospective customers emerged from these discussions.

The people shown in the client’s advertising were almost entirely blue-eyed and blonde.  Customers did not remark on this.

But the prospective customers said that the advertising showed that this business was a club for one group in the community, but not for all.

The reality was that the client’s business was a totally inclusive business – and would serve any customer very happily.

The problem was that the client had just not seen what prospective customers saw.

The solution was quick and easy.

A more varied and population-representative group of people was used in the following advertising campaigns.  And business rose as a result of the more inclusive vision of who its customers were.

The busy managers just hadn’t had time to see what the prospective customers saw, and so had been missing opportunity.

If you think there are opportunities to increase your business, by knowing whether what you see is what your customers and prospective customers see, please call or email Philip Derham to discuss how we can help strengthen that commonality of view – and so your business.

Getting satisfaction, recommendation and action!

Customer satisfaction is always a goal which, hopefully, leads to repeat business and recommendations from existing customers to prospective new customers.

The Net Promoter Score is often used to measure that recommendation intent.

Sometimes, as we found recently, the Net Promoter Score can be quite low while customer satisfaction measures are quite high.

Rather than report just the single (and low) Net Promoter Score beside the apparently contradictory high satisfaction measure, we looked at satisfaction, then at recommendations made in the last year, and then at those who gave the promoters’ Net Promoter Score ratings of 9 or 10, separately from the Net Promoter Score alone.

The graph below shows the findings, by age groups.

Key out-takes are that, for this client, customer satisfaction rises with age.

Conversely, the proportion giving positive Net Promoter Score ratings of 9 or 10 declines with age (under-20s excepted).

The “have recommended in the last year” measure also declines with age (under-20s excepted).

But what is very useful for managers is the finding that the “have recommended in the last year” measure mirrors, but at a higher level, the positive Net Promoter Score ratings of 9 or 10 (the promoters).

These findings suggest two practices relating to satisfaction and the Net Promoter Score.

1.     Satisfaction is not comparable with recommendation; the two should be read and used as two different measures, not as complementary measures.

2.     When using the Net Promoter Score, managers should include comparisons with:

a.       past recommendation practice proportions, and

b.     those who scored 9 or 10 on the Net Promoter Score (the positive promoters)

to give more rounded views of the Net Promoter Score measures, when those are used.

If you’d like to discuss how these extended measures can help you manage your customer experience and recommendation marketing, please call or email Philip Derham, using the Contact page form or email or phone.

Why research customers? Because knowing their views will strengthen your decisions.

We’ve long explained that when you know what motivates your customers, you can market more effectively to them.  And we use this same approach when reviewing what research tools will work most effectively for you.

Sometimes, the tools we research are major tools – online focus groups or face-to-face focus groups, for example.  And sometimes they are smaller tools – the type of questions that most effectively get the answers you need.

One type of question – the matrix question – is commonly used when several statements or products or services are measured on the same scale.  For visual economy and for ease in answering, the statement and scale questions are shown as one question, as in the example below.

This read-from-top-to-bottom format has led some to ask whether people may be more likely to answer the first listed statement and less likely to answer statements lower on the list.

We tested whether this hypothesis was valid (a sketch of one such test follows the list below).  We found:

  1. No relationship between the statement order and whether people answered the statement.
  2. Other factors – participants’ awareness of the product, their use of the product, or the importance or relevance of the product to them – determine whether they answer the statements in a matrix question list.
  3. Hence, there is research method value in using matrix questions, as these enable more effective answers.
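
One way to run such a test – a sketch only, with made-up counts rather than the study’s data – is a chi-square test of independence between a statement’s position in the matrix and whether it was answered:

    from scipy.stats import chi2_contingency

    # Hypothetical counts: rows are statement positions (1st to 4th);
    # columns are [answered, skipped] counts for each position.
    counts = [
        [480, 20],
        [475, 25],
        [478, 22],
        [473, 27],
    ]
    chi2, p, dof, expected = chi2_contingency(counts)
    print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # a large p is consistent with no order effect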

Full details are in this month’s Quirk’s Marketing Research Review (USA) – which you can see, in full, at https://www.quirks.com/articles/is-there-a-list-order-bias-in-online-survey-matrix-questions.

While this may be more of a researcher’s interest, we thought it may also be of interest to you, as it evidences the thought and care we put into every element of the research projects we undertake – to your benefit.

So, if you’d like to talk further about accurate customer research measures to help strengthen your decisions, please call or email Philip Derham.  His contact details are below.

Strengthening your Net Promoter Score effectiveness by strengthening its relevance.

The Net Promoter Score is widely used to measure customer relationships and particularly customer loyalty to the organisation.  Its simple metric enables comparisons across time, across organisations and across staff.  Changes in it should reflect the effectiveness of management-initiated action.

The risk is that its very simplicity of measure enables the Net Promoter Score to be readily used, even when the context is not appropriate.

Two utility survey examples:

  • After a customer spent two hours on the phone, resolving a simple issue, the last staff member spoken with asked if the customer would complete a short Net Promoter Score survey – and asked expressly for a good number, as that would help them remain employed.
  • The second utility performed its job, in the time advised.  On completion, it emailed the customer a brief Net Promoter Score survey.

In both cases, the service provision was the minimum required and the minimum expected.  Each delivered the service promised.  Each utility thus appeared to think that a reasonable Net Promoter Score could be expected from the contact, and so undertook their Net Promoter Score survey.

But as the context for each utility contact was to rectify a problem, the contact was not an engagement from choice.

That the engagements were forced and were engagements that could not be concluded until the problems had been resolved meant the context of the contact was more significant and more likely to influence the rating.  And of course, a direct staff request to get a good rating can also skew the genuine Net Promoter Score measures.

The research issue thus is to ensure the Net Promoter Score measures are used in the appropriate context, not as a simple one-size-fits-all-contacts remedy.

If you’d like to strengthen the value and utility of your Net Promoter Score use, and so ensure your measures are more effective, please call or email Philip Derham to discuss this further.  His contact details are on the Contact Us page (to the right on the screen menu).

Customer data reveals great opportunity!

Customer data reveals great opportunity! It enables you to identify more business opportunity and build on what you have.
 
If you have a customer database, you can analyse it and find useful facts.
 
Facts such as frequent customers using their discount cards on most of their visits, while infrequent customers use theirs on fewer visits – so missing the reward benefits and the reasons to repeat their visits.
 
Facts such as more of your mortgages are held by customers in Sydney’s western suburbs, though more live in Canberra.
 
Or that high value purchase customers are infrequent customers.

Finding opportunities like these can enable you to implement more effective marketing – perhaps to reinforce frequent shoppers’ practices, or to encourage occasional customers to shop more often.
 
Finding opportunities may cause you to review your branch or broker distribution, or to review product profitability to see whether frequent lower value purchases give better profits than occasional higher value purchases.
 
But the bit that is missing is the “Why?”
 
Is daily shopping just because the buyer works in an office next door and will shift elsewhere just as happily if they change jobs? Is occasional high value shopping linked to particular emotional states or events? And beyond the obvious Mothers’ or Fathers’ Days, what stimulates those emotional states?
 
Databases show the opportunity, but may not tell the motivating “why”.
 
We can add that extra element and help you turn a marketing campaign into a motivating marketing campaign.
 
If you’d like to ensure your next campaign motivates most effectively, please call or email Philip Derham on T: 0414 543 765 or E: Philip.Derham@derhamresearch.com.au .
 

Identifying and then retaining disengaged customers.

When surveying customers and former customers for clients, the survey response rates from each group differ.  Customers are usually happy to participate and ex-customers rarely bother doing so.  That is expected, given the different levels of involvement with the client organisation.

But, when customers are analysed by their involvement, using measures such as recency of purchase, value of ongoing business, etc., we find different patterns of response among those the client describes as customers, as the graph shows:

Customers who have not bought, or who are not using profitable services, respond at levels similar to those of ex-customers.  Customers who are active purchasers, or users of high value products, are even more likely to participate in customer surveys than customers overall.
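
As a sketch of that post-survey analysis – with hypothetical field names and figures, not a client’s data – response rates can be compared across engagement segments built from recency of purchase:

    import pandas as pd

    db = pd.DataFrame({
        "months_since_purchase": [1, 2, 26, 3, 40, 5, 30, 1, 24, 2],
        "responded": [1, 1, 0, 1, 0, 1, 0, 1, 0, 1],
    })

    # Band customers into active vs disengaged by purchase recency,
    # then compare survey response rates across the two segments.
    db["segment"] = pd.cut(db["months_since_purchase"], [0, 12, 999],
                           labels=["active", "disengaged"])
    print(db.groupby("segment", observed=True)["responded"].mean())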

This is more than just nice-to-know.

This post-survey analysis enables you to identify disengaged customers before they become ex-customers.  You can then decide whether those disengaged customers are worth working to retain, or whether it’s more realistic and cost-effective to just let them drift away.

As an example, realistically, how many packs, tents, hiking boots, sleeping bags, or cooking equipment can any one person buy?  After they have bought all they can carry, are they really ongoing prospective customers – or just people who once were?

Please email or call Philip Derham to find out how our customer surveys can strengthen your marketing by helping you to identify your disengaged customers, so you can decide the most effective next steps.

New and more business from new insights.

You know what you’ve had to do to get the business you have now, and getting more and new business may just require more of the same – or may need to evolve to satisfy new needs.

And whether more of the same or something new is required can be effectively identified by surveys – usually, for speed and economy, online surveys.

In designing an online survey, three key points must be followed to get the insights needed.

1.    Keep the survey short.

Our research on past surveys shows that while online surveys longer than 15 minutes still get some completions, these longer surveys have more people dropping out part-way through.

2.    Don’t ask what you already know. 

When you ask customers for information you already know (e.g. age, gender, home postcode or State), you waste questions unnecessarily.

3.    Make it easy to do on mobiles and on computers. 

While more people still do their online surveys on computers and laptops, an ever-increasing proportion use their mobile phones – with few using tablets.

Any question that is not easy to answer on a small phone screen will lose people and lessen the insights available to you.

When you are ready to gain fresh insights to help you grow your business, please call or email Philip Derham, so we can design the survey that will get you those needed insights.

Mixing or matching? Hearing or missing new business opportunities.

Should we mix or match when running focus groups for you?

Mixing different types of people in a group can expose new business opportunities – or can silence the exposure of that same opportunity.

Part of our skill in identifying new business opportunities for you comes from deciding whether to mix or to match.

Should we mix different types of people in a focus group?

As an example, while men see themselves as silent, stoic and strong when ill (man-flu excepted), mixing the well and the ill in focus groups showed that business-opportunity differences may relate more to education and income than to health.

This insight then led to quite different and more effective communications.

Should we have similar types of people in the focus group?

Conversely, matching can also highlight opportunity.

If you are looking to boost your mortgage lending, matching like-people with like-people can uncover common difficulties in saving and in meeting lending criteria, identifying appropriate communication approaches.

Mixing different types of people may, in contrast, lessen the understanding of the motivations and needs of those with average incomes, if those with average incomes were in a group mixed with people who earn seven figure incomes each year.

Choosing the approach that will work best for you

Mixing or matching to make focus groups more effective depends on the products or services from which you want to increase your sales.  In mixed or in matched focus groups, our accurate identification of your customers’ motivations can help you to further strengthen your business.

If you’d like to discuss how we can help you by using focus groups that reveal customer or prospective customer motivations to strengthen your communications and so results, please call or email Philip Derham now.

How knowing what they think enables more business.

We see our customers all the time – in our stores, in our branches, on the phone, in their emails, on social media, in the daily or regular customer reports, in the Net Promoter Scores, and particularly in the profit and loss accounts.

So we think we know them.  We think we know what motivates them, market to those thoughts and get commensurate results.

But thinking and knowing may not be the same, as a client recently found.  Volunteers give it over 25,000 days of free labour every year, but the client, like any business, needs both labour and cash to operate.

Knowing how committed their volunteers were made the volunteers seem a perfect segment for cash donations as well, but the volunteers gave less cash than expected.

We ran focus groups with the volunteers and clarified their motivations, which explained why the marketing had generated less than expected.

Post-focus group result?  A revised marketing campaign that appealed to the volunteers’ actual motivations.

So, if identifying your customers’ actual motivations can further strengthen your marketing, please call or email me, and we can discuss the next steps to your strengthened results, just as the client has done.

Five facts that form a trend that influences your business.

The Australian Bureau of Statistics tells us facts – lots of facts – and leaves the drawing together and understanding of the influence of those facts to us.

Five recent facts that influence your business are that the proportions of people who:

  1. Own their homes outright has declined from 43% to 38% (1995-96 to 2015-16).
  2. Own their homes with a mortgage has risen from 28% to 37% over the same period.
  3. Are children has fallen from 22% of the population to 19% (1995 to 2015).
  4. Live in families with children has fallen from 64% in 1996 to 61% in 2016. And yet
  5. Spend on non-essentials has declined: in 1984, 44% of household income went on discretionary items, but by 2015-16 this had declined to just 41%.

Individually, these facts may or may not be interesting but, taken together, they raise concerns for your business.

Proportionately fewer families have children, yet there is less discretionary spending.   More is being spent on basic items, including mortgages.

Less discretionary spending means less on optional purchases – the small or the larger pleasures, as well as other products, services or facilities.

Seeing these facts as parts of a trend, you could continue marketing as before or you could find out what your customers do and want, then see what motivates them, so they’ll spend more of limited discretionary spending with you.

We specialise in finding these opportunities and motivations quickly and cost-effectively.  So, as you reflect on the new year and its challenges, call or email Philip Derham now to start identifying the trends that are influencing your business and to find out how to motivate your customers your way.

Improve your Satisfaction KPIs – by up to 3 SEs.

When customers do more with you, you can assume they are satisfied with your products, prices and service.

But as assumptions are just assumptions, many organisations survey their customers to establish their satisfaction levels and report those as formal KPI measures.  We’ve found that your satisfaction KPI results strengthen if you let your survey run a little longer.

Most people complete their online or on-mobile surveys within 48 hours of their emailed invitation, so it can be tempting to close then and report the results quickly.

For customers who are just satisfied, that approach is fine.

The proportions of customers who are just satisfied remain stable from the first to the tenth day after the survey opens.

But the very satisfied customers are different.

While more of the very satisfied customers complete their survey within 48 hours, quite a few take longer.  They keep your email invitation and return to it three, five or even seven days after it was sent to them.

In surveys, we’ve seen the extra time help the very satisfied proportion grow from 54% after 48 hours to 62% after five days – a gain of nearly three standard errors.
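
The arithmetic behind that claim can be sketched as follows, using the reported proportions and an assumed, illustrative sample size of 700 completes per group (the actual sample sizes are not given here).

    import math

    p1, n1 = 0.54, 700  # very satisfied after 48 hours (assumed sample size)
    p2, n2 = 0.62, 700  # very satisfied after five days (assumed sample size)

    # Standard error of the difference between two independent proportions.
    se_diff = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    z = (p2 - p1) / se_diff
    print(f"difference = {p2 - p1:.2f}, SE = {se_diff:.3f}, z = {z:.1f}")
    # With these assumed sample sizes, z is about 3.0 - a near-three-standard-error gain.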

This very satisfied proportion growth also boosts the net satisfaction level, making your net satisfaction KPI stronger too.

As we effectively measure customers’ satisfaction, please call or email Philip Derham to discuss your next steps to better measures and strengthened satisfaction KPIs.  His contact details are below.

Email:     Philip.Derham@derhamresearch.com.au

Telephone:     (+61)  0414 543 765

Evolving concepts – new names, new logos, new types of ADI.

As you well know, the ADI financial services sector is evolving, and rapidly.

New entrants and new payment or transfer methods are appearing.

The Government is encouraging more competition, seeking to further benefit customers.

In response,

  • Some ADIs have merged.
  • Some former Credit Unions have become banks – and more are considering that change.
  • Some have renamed.  Some have re-branded – with changes in logos and identity.
  • Some remain as they have been.  And
  • Some are hedging their bets, registering possible new names and exploring the opportunities.

Much of the reviewing is being quietly undertaken in-house, with staff and Directors.  But knowing what your current members and eligible prospects think can clarify the likely outcomes, by telling you how changes may influence financial activity with you – or with another ADI.

Over the last twenty-five years, we’ve researched the impact of changes or not-changes of ADI type, of ADI name and branding, logo and presentation.  And seen some very positive outcomes follow.

Please call or email me to know more – whether from using our expressly developed 20 Questions Brand Monitor™, or our logo options testing facilities, or by undertaking your own bespoke customer and prospect research.

Then, with sound knowledge of the impact of change – or of no-change – we can strengthen your final decision.  So please call or email me now, to activate the steps to that knowledge.  Click on the Contact page for quick contact now!

Perceptions matter because they influence action.

Since the Australian Bureau of Statistics’ August retail sales results were released last Thursday, media commentators have bemoaned the retail sales decline of 0.6% since the previous month.

And if Australians are tightening their wallets and not spending on retail, and if the mood is one of caution, investing to generate sales and more sales needs to be targeted.  It needs to be targeted to groups more likely to spend, borrow, invest as you need.  And targeted to appeal to the motivations of people in those groups to act.

Clarifying those two targets is what we do.

Our research shows who the targets for your profitable products and your profitable services are.  Our research then identifies what you need to offer to motivate them – to excite their interest and ultimately, their action.

If you feel your sales could be strengthened now, with your profitable products and services sold more effectively, please call me to discuss the steps to achieve that knowledge.

Brands. To have and to hold.

Known brands are shorthand to get quick customer and prospect recognition; and response and purchase.  Unknown brands get querying and questioning; may not be seen; and get lower consideration and lower purchase likelihood.

If your brand is known, why change?

  • To strengthen the business. New laws allowing Credit Unions to rebrand as Banks may enable more new business.
  • To enter new markets.  The Surrey Hills Café brand may be too localised to work elsewhere.
  • To become current.  The Buggy Whip Manufactory brand may be too dated a brand for a phone app developer.
  • Because the brand is damaged.  The White Pointer Bait surfboard brand may sell more if rebranded.

Choices for the better brand decision.

Take a punt and hope for the best?

Or find out what your current and future customers’ views of your brand are now, and the impact of alternatives.  Then, with knowledge, decide.  (The business-strengthening decision.)

Identifying your customers’ views can be done using online and offline focus groups, specific surveys or our 20 Questions Brand Monitor™.

The 20 Questions Brand Monitor™ is an online survey, for all your email-accessible customers, with personalised questions about brand awareness, brand attributes, brand perceptions, brand preferences, and likely actions if the brand were…

We can add a sample of others not from your database as a comparison, if needed. To know more, please contact me today!  Details below.

Extending customer research reach finds more sales opportunities.

Working in an office normalises office practices.  We receive, read and reply to emails at work on work computers.  And we mostly complete online surveys at work, on our work computers.

These practices are seen as normal for people in offices.

But as we’ve seen in recent surveys, one in four read emails and complete online surveys on their small-screen smartphones, at work or elsewhere.

Up to the age of 40, half or more in each age group used their smartphone to complete the survey.

Smartphone and computer users took much the same time to complete their surveys (14 and 13 minutes), indicating our online surveys are easy to do on any device.

This enables a full range of your customers to complete our online surveys, giving a better knowledge of their needs and so of sales opportunities.  Please ring or email Philip Derham to find out more.

Time to reflect on our assumptions.

Down time – particularly that of a relaxing summer break – can allow for reflections on the year just past. On its highs, on its what could have been, on the assumptions that were made.

That reflective time can enable you to return, as some did last year, with a determination to test marketing assumptions they’d made during the year. Some of the assumptions we tested for them included:

* That people “like me” were the customers. We found the customers were less “like me” and so a changed sales approach became profitable, quickly, as it targeted actual buyers more directly.

* That the most profitable customers were the most satisfied. Data analysed with survey results found the most satisfied customers were not necessarily the most profitable, requiring a rethink of the core customer group.

* That sponsoring a local sports team would get almost total audience reach, which we found was not necessarily the case. Knowing this enabled them to allocate their sponsorship investment more effectively.

So, if you’ve had the time and the opportunity to consider assumptions, and would like to test those, to ensure your marketing and sales investment delivers the maximum returns, please ring or email or use the form on this site to contact Philip Derham to discuss the steps.

Does being patient pay off?

Are laggards more likely to be satisfied or dissatisfied? The answer might surprise you. Philip Derham reports.

This article first appeared in the June 2016  edition of the AMSRS Research News and is republished with permission.

Clients often use measures of customer satisfaction as an element in their assessments of their own corporate effectiveness, competitive advantage, and senior management’s performance – and related senior managers’ bonuses. Our online surveys thus normally include satisfaction measures, which we seek to make as effective and accurate as possible.

When reviewing satisfaction measure effectiveness, we look at the data itself and how we have measured that. Today, this can often include how long the online survey is open, as clients are often keen on speedy, though accurate, results.

Our online survey software allows us to see when each survey response is received. When graphed, the time and day results from a recent 2,934-sample survey, with three email contacts, raised questions, as Figure 1 indicates.

The double peaks of high response within 24 hours of each email’s dispatch, shown in Figure 1, were initially interesting but had a simple explanation. The survey invitation emails were sent in the early afternoon and would have been received shortly after. Most who responded did so when they received the emails – mainly in the afternoon and evening of the day of dispatch or the next morning, with few accessing their emails overnight. Hence, the response was largely immediate, as Figure 2 shows.

Figure 2: Survey completion in the 24 hours from the dispatch of the survey invitation, shown on a 24-hour clock face

This led us to question why a few took two, three, and up to six days after the email was received to complete the survey. We first wondered if these laggards were dissatisfied customers.

The evidence indicated they were not: only three per cent of all who responded were dissatisfied. Among those who answered within 24 hours of the email, three per cent were dissatisfied, and two per cent of those who responded two, three or four days later were dissatisfied. Hence, the laggards were not dissatisfied customers. Nor did the second or third email affect the low dissatisfaction rate, suggesting more prompts do not, of themselves, cause dissatisfaction.

However, there was a significant difference between groups of satisfied customers, depending on their time of response.

Overall – and this was the result reported to the client – 86 per cent of customers were at least satisfied with the client, and 55 per cent were very satisfied. The remaining 11 per cent were neutral, had not used the service, or had not answered the question.

The significant difference was that while 55 per cent of all customers were very satisfied, and 54 per cent of those who did the survey within the first 24 hours were very satisfied, proportionately more customers who did the survey two, three, four or five days later were very satisfied – 62 per cent, as Figure 3 illustrates.

Figure 3: Comparisons of very satisfied customers, in total, of those who responded within 24 hours of an invitation, and those who responded later

It is unlikely that the difference between the two groups is due to chance, as the late responders’ group results were nearly three standard errors larger than the results from the group that responded within 24 hours of the invitations.

An interpretation of these results is that while most of the very satisfied customers will complete an online survey immediately, a separate, numerically smaller but proportionately larger group of very satisfied customers will complete the survey some days after they receive their invitations.

The conclusions from this one survey analysis are that:

  • The practice of sending up to three email invitations to customers on client databases does not increase the proportion of dissatisfied customers
  • The longer the online survey is open (to the 14 days tested), the more very satisfied customers will complete it. This may strengthen the proportion of satisfied customers reported to clients, and so strengthen their own subsequent satisfaction-related actions.
  • However, if results are urgently required, online satisfaction surveys can be closed earlier, with slight reductions in overall net satisfaction scores likely.
  • It would be useful to undertake further analysis to compare very satisfied customers who immediately complete the survey with those who take two or more days to do so.
  • These single online survey findings have stimulated us to look at more of our surveys to see whether this is a singular result, or a harbinger, and to report those findings later. We suggest others may wish to review their satisfaction survey results to see if we can develop a broader industry view on this.

Philip Derham, Director, Derham Marketing Research Pty. Ltd.

This article first appeared in the June 2016  edition of the AMSRS Research News and is republished with permission.

What is a recommendation really worth?

For one online retailer, the value of recommendation is clear.

For them, a recommendation is worth $20.

Just give them a friend’s email and, if your friend buys, they get $10 off that purchase and you get $10 off your next purchase (T&Cs apply).

Otherwise, your recommendation is, implicitly, not of value.

Thinking on this, several questions follow.

  1. What is the value of recommendation to your business?
  2. Who recommends?
  3. Who do they recommend to?
  4. Do the recipients follow the recommendation?
  5. And if they do, when?
  6. Do recommendations generate business – or just nice warm feelings? And,
  7. If recommendations do generate more business for you, how can you stimulate more of them?

Our Recommender Advantage Monitor taps into the answers to these questions (and more) and can help you establish the value of recommendations for you, and how you can stimulate more of them.

If you want more, and more useful, recommendations, please contact Philip Derham for more details.

World views collide to distort public discourse.

The media exist to inform, to educate, to convert and, particularly, to monetise contact with people in societies.

The market research discipline uses applied scientific techniques to gather replicable and reliable insights and knowledge to support decision-making.

These two different worlds do meet.

As examples of the collision between the two worlds, in the last month, four media reports based on survey results implicitly or expressly argued for public and perhaps Government action on three topical issues – hunting foxes, lessening obesity, and Muslims in Australia.

In one article, the journalists noted the survey had been repeated twice and that its usual sample was 1,000.  The second article noted a sample of 304 people.  The foxhunting story mentioned no sample size, and the obesity warning report mentioned a sample of more than 86,500 people collected over a 12-month period (1).

Relevant issues for market researchers in assessing the data would include the sample size, sample selection, sample representativeness, sampling error, survey method, whether the results were weighted to their population, and the impact these market research issues may have on the findings reported.  The language, the question wording and any information supplied as context for the questions asked were not discussed in these reports.

Discussion of these methodological issues may not be as exciting as commentary on incendiary findings, but discussion of the validity of findings is needed to ensure the community draws the correct information from the studies.

People may still then act on their prejudices, as the 200,000 witches tortured, burnt or hanged in the UK over 300 years past would attest, were they here (2), but reporting survey findings that include the careful caveats we market researchers would include, may better inform and so influence public discourse.

We market researchers can and may contact individual journalists to suggest these questions, but one-off individual action may not have the influence required.

Rather, the conclusion from the reporting of these four surveys is that the Australian Marketing Research organisations should develop, distribute and promote a journalists’ guide to the questions that should be answered before survey results are reported.

As Kirkpatrick said, reporters should make sure of the validity of their survey-based facts before reporting them (3).

To do so is to enrich rather than to distort public discourse.

Footnotes:

1  The (UK) Guardian, October 3, 2016.  September 21, 2016:  http://www.smh.com.au/federal-politics/political-news/half-of-all-australians-want-to-ban-muslim-immigration-poll-20160920-grkufa.html  September 27, 2016: www.smh.com.au/federal-politics/political-news/new-national-snapshot-finds-60-per-cent-of-australians-would-be-concerned-if-a-relative-married-a-muslim-20160926-grp4x0.html  September 21, 2016: http://www.theage.com.au/national/health/australian-diets-below-benchmark-construction-workers-main-culprits-20160925-gro3pm.html

2  http://www.historic-uk.com/CultureUK/Witches-in-Britain/

3  https://en.wikipedia.org/wiki/Roger_de_Kirkpatrick

Source:

Research News Live –  http://amsrslive.com.au/2016/11/10/world-views-collide-to-distort-public-discourse/