Explained in 90 seconds!
See how this 90-second video explains how your business can benefit from the insights gained for you by our effective Five Customer Key Insights Monitor.
Why insights research is more than just asking a question and getting an answer.
You can get an answer to a business need by asking a question.
But unless the question gives you the background context of the people answering (their emotional contexts, their self-perceptions, and their actual behaviours), the answer may not give you the insight you need to make a better business decision.
To illustrate, the Australian Bureau of Statistics just released their May 2020 Household Impacts of COVID-19 survey findings, drawn from a national sample of 2,612 people aged 18 years or older.
The first question reported was whether the participants were concerned about their personal health due to spread of COVID-19. To that one question,
>> 24% of Australians said no, they were not concerned about their personal health because of COVID-19.
This answer suggests an immediate need for a new Australian Government Grim Reaper-style advertising campaign!
Australians, particularly Queenslanders and South Australians, must be told and scared into caring.
And advertising agencies must start spruiking that need to Government immediately, so that it commissions them to do just that!
But that single question does not show a business opportunity, because the same survey found that 99% of Australians had changed their behaviour.
And if 99% had changed their behaviour, then the finding that 24% were not concerned for their personal health because of COVID-19 is clearly not a Grim Reaper-style campaign need.
The 24% finding may indicate new skin care sales opportunities or skin-friendly brand extensions for hand sanitisers, but it does not indicate other insights, because we don’t know why the 24% are unconcerned. As data collection, the survey is very useful, as it tells us a lot about the 24%: their age, gender, work status, education, health, insurance cover and use of Government payments. From this, we could infer, but do not know, why the 24% are not concerned.
Is it because they:
>> Have acted and so feel safe?
>> Or had action forced on them by Government requirements?
>> Are genuinely unconcerned anyway (perhaps because of some self-perception)?
As we don’t know, we can’t generate insights to help strengthen business decisions.
Which brings us back to the initial point.
Getting you the right answers is why insights research is more than just asking a question and getting an answer.
To get insights you can use, we need more. More questions, more reflection, more testing and more thought. Then the questions we ask for you will reveal the business-building insights you need.
If you’d like our assistance to understand more about your customers and prospects, and to gain business-usable insights rather than “interesting, but so what?” answers, ring (0414 543 765) or email Philip Derham (email@example.com) now.
For the past few years, Philip Derham, an ex-Monash University student, has been assisting current Monash students by being a mentor.
He is one of a large number of ex-Monash University and other volunteers who mentor students, and gain fresh approaches from those interactions.
The next semester’s mentoring program has just begun and Philip is again pleased to be involved.
Things were looking up, the Australian Bureau of Statistics reported on Monday July 13, 2020 in its June 2020 COVID-19 survey report*.
Key (and positive) points were that in June 2020:
* Fewer Australians reported personal stress because of COVID-19 (24% compared with 43% in April 2020).
* Only 9% said they were suffering from loneliness (down from 22% in April 2020).
* And particularly, among those aged 18-64 years, those who said they had problems maintaining a healthy lifestyle declined to only 9% compared with 22% in April 2020.
Then COVID-19 hit back.
Melbourne is in another 6-week lockdown. Victorians are locked out of other States. There are sporadic COVID-19 outbreaks elsewhere.
All of which means that the good-news Australian Bureau of Statistics report is out of date and so difficult to use for business decisions.
Not because of any fault in its research or delivery, just in its timing. But the timing means its results are less of a guide to action than may have been hoped when it was undertaken.
Ups and downs among your customers
Given the rapid mood changes reported by the Australian Bureau of Statistics, you too could expect similar ups and downs in behaviours, in attitudes, and in intentions among your customers.
And from something just outside your control: the unfortunate, continually changing COVID-19-induced lockdowns, outbreaks, and publicity.
Staying on top of your customers’ changes of mind and behaviour, so you can plan and market accordingly, requires quick response.
Stay on top with our online services
Our online services can tell you quickly what your customers are thinking, why they are thinking it, and what they are doing or plan to do, when…
Our online surveys get quick and effective responses.
Most participants complete our surveys in the first forty-eight hours, giving a solid sample for fast, thorough analysis.
We report your insights speedily, so you get clear insights while they are current. That allows you to plan and market to the present, rather than to the past.
Contact Philip Derham to stay on top of what is now, not past
If you feel that knowing what is now, rather than what is past, will help you strengthen your business in the next COVID-19 months, please call or email Philip Derham now.
His contact details are: email firstname.lastname@example.org or telephone (+61) 0414 543 765.
* Source: 4940.0 – Household Impacts of COVID-19 Survey, 24-29 June 2020, released July 13, 2020.
When you seek to strengthen your business, post-COVID-19, one key group to have on side is your staff. Their interactions, now and then, can have a significant impact.
You can find their worries and motivations by survey, and we can help you with those staff surveys. Because we are outsiders, staff are more comfortable telling us what they think than they are telling colleagues in-house.
When planning your staff survey, you’ll set out what you need to know, so we get you those answers.
Other factors may be relevant and worth testing.
As many employees have or are still working from home, one relevant factor may be the time spent commuting to work, when they return to your workplace.
Until the COVID-19 lockdowns, on average, employees were commuting for an hour a day to and from work.
And employees in Sydney and Melbourne were spending more time commuting (71 and 65 minutes respectively). This has an impact on job satisfaction and engagement too: job satisfaction averages only 7 on a 10-point scale*.
There may be other, specific-to-your-workplace factors too. We can find those before the survey, by interviewing some staff anonymously. And those can then be included in your staff survey, covering all bases.
If insights into your staff’s motivations to really strive for you will help you strengthen your business post-COVID-19, I’m happy to discuss those further with you.
My contact details are: E: email@example.com and T: (+61) 0414 543 765.
* Source: The Household, Income and Labour Dynamics in Australia Survey: Selected Findings from Waves 1 to 17. Melbourne Institute, University of Melbourne.
The COVID-19 crisis is tragic for the families of those who have died and for those who have had the disease, and it generates different views about what will happen after.
Will customers’ behaviour return more or less to normal, as they return to work?
Yes: Dr Cristina de Balanzo suggests that the behaviour changes are the result of enforced change, while people’s attitudes have not changed much, so expect a return to past behaviour.
No: Sam Zell says the pandemic will have the same long-lasting changes in human behaviour, and so the same economic and social effects, as the 1930s Depression, which will imperil many business models.
Shall we wait and see? Or should we, as appropriate, find out what our customers want, intend, are motivated by, and are doing, as the months go by?
To get ahead of your competitors, now and the next months are the right times to find out what your customers are doing. We can tell you these customer behaviour insights quickly and cost-effectively, using our online interviews, our online discussions and our online surveys.
So, if knowing your customers’ actions, intents and motivations will help you plan and implement stronger responses now and over the next twelve months, please email or call Philip Derham to discuss how we can find and report those customer-strengthening insights for you.
Sources: https://www.warc.com/newsandopinion/news/arebehavioursreallychangingbecauseoflockdown/43581?utm_source=daily-email-free-link&utm_medium=email&utm_campaign=daily-email-apac-prospects-20200507 May 7 2020.
Sometimes better decisions follow when you know what your customers want, need or can do. This need to know can be particularly strong now, as purchase patterns and habits change under lockdown.
The question is, then, are your knowledge needs unique or are they ones likely to be faced by others too?
If your decision-need problem or circumstances are unique, a specifically written survey may answer your insights needs.
If what you need to know is likely to be needed by others too, there may be a cost-effective alternative to writing your own unique survey: using an existing relevant survey, tweaked to your specific needs.
We’ve found commonalities in knowledge needs now, in this unusual time.
Our pre-written, online 20 Questions One Topic Monitors© may answer your need, as this 95 second video explains – https://youtu.be/4ynTt6ZzLgo
And if you’d like to know more, my contact details follow.
If customer satisfaction is a key performance indicator, it is worth measuring most effectively.
Online surveys measuring customer satisfaction can be undertaken quickly, but we have found that if we wait for six or more days before closing, we get more accurate customer satisfaction measures than if we close after 48 hours.
The difference can be, as an example, from a customer satisfaction measure of 55% very satisfied on day 1 to 68% being very satisfied after six or more days.
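To illustrate why closing too early can mislead, here is a minimal sketch. The daily figures are made up for illustration, not client data; they simply assume early responders skew less satisfied than later ones.

```python
# Illustrative, made-up fieldwork data: (day, very_satisfied_responses, total_responses).
daily = [(1, 55, 100), (2, 30, 50), (3, 20, 30),
         (4, 15, 20), (5, 10, 12), (6, 8, 10)]

def very_satisfied_pct(daily, close_day):
    """Cumulative % 'very satisfied' if the survey closes on close_day."""
    satisfied = sum(v for d, v, n in daily if d <= close_day)
    total = sum(n for d, v, n in daily if d <= close_day)
    return round(100 * satisfied / total)

# Closing after 48 hours vs after six days gives different measures.
early = very_satisfied_pct(daily, close_day=2)   # → 57
final = very_satisfied_pct(daily, close_day=6)   # → 62
```

With these invented numbers, the measure drifts upward as later responses arrive, which is the pattern described above: the day-one figure and the day-six figure are answers to the same question, at different accuracies.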
This short (ninety-second) video details more about effective customer satisfaction measures.
Recently, we measured the value and the effectiveness of three different customer communications channels.
The client was concerned about the budget for each – and their value for money.
We asked their customers which of the information channels that stimulate queries and purchases they saw as most effective and of most value to them.
The answer surprised.
Customers reported that calls in were more often stimulated by the newsletter than by the outbound contacts.
This example may be client-specific and industry-specific, but shows the value of testing channels and customer preference to identify your more effective communications approaches in terms beyond the immediate sales response.
Recently, a client had a sales problem and needed help to find out why.
We undertook a three-step research project to find out where, why and what was needed to reverse the decline.
That process is summarised in the short (1 minute 45 second) video, Insights to help reverse sales declines or to increase sales, which you may find useful.
Why? Because the COVID-19 lock-downs are very likely to change customers’ future behaviour. And when you know what those changes are, you can manage for them.
If the three steps detailed in the video can help you plan to benefit from those behaviour changes, please email or call Philip Derham, on 0414 543 765 or firstname.lastname@example.org
Online tools can generate insights, despite the coronavirus, as this two minute video outlines.
People participate in interviews, group discussions or surveys online, talking into their own computer or mobile (cell) phone.
People participate from their homes or wherever they feel safe. The interviewer is in another place, and the discussion is via computer or phone, online.
As the video notes, interviews online with participants at home, with real-life family dynamics around them, can reveal further motivations and behaviour. These additional insights can strengthen your marketing.
As you know, people are becoming cautious about associating with others because of fears of catching coronavirus.
We can neither stop the fears nor the virus, but those need not stop the customer insights needed to strengthen your business.
Our online interview and survey tools enable us to talk to and listen to people, in their homes or wherever they feel safe. And they are still willing to assist by sharing their thoughts and motivations with us – online.
As an example of how well the online interview tool can work, we talked recently with a couple, online, about their finances and mortgage intentions. We wanted to understand why they’d not sought a mortgage, though employed and eligible. This insight was to help our client strengthen marketing approaches to people in similar situations.
After twenty minutes of online/computer-visual chat with the apparently happy couple, a small child appeared in the background, saw me on the screen, waved and said “Hi”.
One sweetly asked the other if she would put the child to bed, again.
After the female partner left, the almost lovey-dovey couple also disappeared.
For the next minutes, the man explained about their previous marriages, that the child was hers, not his, that only she could discipline the child, and so on.
The female partner reappeared.
As did the image of the apparently happy couple facing me on the screen, still unclear about their motives for not seeking a mortgage.
The online interview provided an interesting insight to be tested in the subsequent online survey.
But the additional takeout was that the insight was obtained safely for the couple and for me, so there was no fear of catching anything from the interview.
Thus, even in a time of heightened caution about personal interaction, we can get valuable insights by using our online research tools.
If you’d like to know more, please call or email Philip Derham to discuss how we can help you.
T: (61) 0414 543 765
Satisfaction is a key metric in measuring how customers feel after they have dealt with your organisation. It can indicate ongoing business intention.
The Net Promoter Score (NPS) is designed to measure customer experience and indicate future business growth.
Recommendation measures record what people have actually done and to whom they gave their recommendations.
Combined, the three measures provide clear KPIs that you can use to further strengthen your business performance and profitability.
Our research has found there can be disconnects between customer satisfaction levels, the NPS and past recommendations.
These disconnects can make it difficult to know which business elements need improvement, so that high satisfaction and past recommendation practices also generate high NPS measures.
We’ve developed a quick and cost-effective solution to identify which customer groups you may need to concentrate on, so all the measured elements are in accord.
The solution is our Satisfaction NPS Recommender Monitor.
This Monitor enables you to identify those customers who are satisfied, who are intending promoters (yes, those with good intentions – think diets and weight loss intentions), and those who are actual recommenders.
When you know, you can market to the specific customer groups that need additional attention to strengthen your business.
To know how the Satisfaction NPS Recommender Monitor can help you strengthen your business, please contact Philip Derham.
Thoughts and words can provide fascinating insights into motivations, action and outcomes.
But what is said in words can be different to what is thought, and so can lead to less than desired business outcomes.
The insights that follow from aligning words and thoughts enable more effective management and more effective product or service integration to meet needs, and so provide better business outcomes.
We use a range of in-person, online and other market and business research tools and techniques to ensure the words that our participants express match closely to their thoughts – and subsequent behaviour.
If you think there is a gap between the words you hear and the actions your staff or customers take (perhaps because of what they think), and there is a business benefit in marketing to more closely aligned words and thoughts, please ring or email Philip Derham to find out how we can help you now.
Recently, a client asked us to add the Net Promoter Score (NPS) to their customer engagement KPIs. In our next customer survey for them, we found they had maintained their high and positive 90% customer satisfaction measure, but had an NPS of +19.
As the two measures seemed a little at odds, we looked for insights to explain the divergence.
Our first insight was the divergence between the client’s very satisfied and their (just) satisfied customers, as the graph shows.
The very satisfied customers generated a very positive NPS of +54, but the (just) satisfied customers generated a very negative NPS of -117.
We then saw that the (just) satisfied customers were primarily detractors, who were unlikely to recommend the client to others. For the NPS to rise, the client knew they’d have to improve but needed to know what, where and for which customers.
The survey results showed the very satisfied customers differed noticeably from the (just) satisfied customers in demographics and use of the client’s services. Research also highlighted differences in attitudes and in motivation between the very satisfied and the (just) satisfied customers.
These insights enabled the client to use their database more effectively to promote relevant messages to the (just) satisfied customers and so strengthen their next NPS results.
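The arithmetic behind that divergence is straightforward: the NPS is the percentage of promoters (9-10 ratings) minus the percentage of detractors (0-6 ratings), so a segment can call itself satisfied yet drag the overall score down. A minimal sketch, using invented ratings rather than the client’s data:

```python
def nps(ratings):
    """Net Promoter Score: % of promoters (9-10) minus % of detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Illustrative 0-10 'likelihood to recommend' ratings by satisfaction segment.
very_satisfied = [10, 9, 9, 10, 8, 9, 7, 10, 9, 6]   # mostly promoters
just_satisfied = [7, 5, 6, 4, 8, 3, 6, 5, 6, 2]      # mostly detractors

segment_nps = {"very satisfied": nps(very_satisfied),
               "just satisfied": nps(just_satisfied),
               "overall": nps(very_satisfied + just_satisfied)}
```

With these made-up ratings the segments score +60 and -80, combining to an overall -10: the passive 7 and 8 ratings count for nothing, which is why a largely satisfied customer base can still produce a weak combined NPS.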
If similar insights about your customers and their motivations can help you strengthen your business in the next year, please call or email Philip Derham – at email@example.com or on 0414 543 765.
A problem can emerge when you need to decide, but know a better decision will follow when your customers’ views or intentions, expectations or needs are there to help guide that decision.
Research – by asking people directly, via a survey – can often give you that decision-strengthening knowledge. And speedily and inexpensively, with big audiences, because today surveys are mostly online.
The question then is whether your knowledge-need is unique or is it one sometimes faced by others, too.
If your problem or your circumstances are unique, you will need a specifically written survey.
But, if what you need to know is not uncommon, there may be a more cost-effective option to writing your own unique survey.
That option can be to use an existing relevant survey and tweak it to your specific needs. That can be quicker, cheaper and more effective. And those pre-written surveys include the relevant supplementary questions needed to clarify the answers given.
As we’ve found commonalities in knowledge needs over the years, we’ve developed a range of online Monitors – our 20 Questions One Topic Monitors © – that are pre-written surveys. These get answers to questions about communications, customer needs and intentions, customer practice (that can supplement your own database analyses), customer satisfaction, experience, recommendation and more.
The surveys can be undertaken with your own customers or customers from a broader pool, as needed. The 20 Questions One Topic Monitors© are modified to match your specific market and competition, branding and need. And for a very cost-effective investment.
If you have a decision need that can be strengthened with answers from a survey, these Monitors may answer your need.
To find out more, please email or ring me – Philip.Derham@derhamresearch.com.au or 0414 543 765.
What to do when sales decline?
A long-term profitable and well-known product presented its marketing team with just that challenge.
Sales had been strong and then began a slow and gradual decline. Some careful changes were tested and didn’t really alter the pattern.
The client then had us review their own internal customer data. This revealed customer type and customer location change over time.
While the reasons were open to speculation, our exploratory purchase motivation research (via discussion in online and in-person focus groups and in individual interviews) gave good direction on the likely reasons, which were confirmed by subsequent survey findings.
The client modified the product contents, to better reflect purchase motivation needs.
They then re-advertised, emphasising how the product now satisfied buyer needs.
And sales gradually increased.
If you would like to find out how your own customer data and purchase motivation needs research can help you to the answers you need to strengthen your sales, please call or email Philip Derham, whose contact details are on this site.
When, at one of your regular management meetings, someone raises a problem emerging in some customer groups, do you and your colleagues go with gut feeling and decide then?
Do you decide to wait and see what more may happen? Do you decide to research the scope and impact of the problem?
Waiting or researching may take more time than you want to risk.
A research solution to get answers quickly, so you can decide quickly, is to use one of our 20 Questions One Topic Monitors.
These need-specific and industry-specific Monitors can be undertaken with your customers and reported very quickly. They are targeted, effective and cost-effective.
To find out how the 20 Questions One Topic Monitors can get the answers you need and strengthen your quick decisions, call or email Philip Derham now.
Analysing the information you have about your customers helps strengthen your marketing.
Getting actual customer answers via surveys helps strengthen your marketing.
Combining both can supercharge your marketing, as a client recently reported.
When we analysed the client’s own customer data, we identified three major age/gender groups, then, when customer-spend-with-the-client data was added, we found seven spend and purchase frequency groups.
Each group was sufficiently different in characteristics that a single message would be unlikely to effectively motivate each.
We then surveyed customers to find what would motivate people in the different groups to respond – and also found their overall spending patterns (as they did not always shop with the client).
The pre-survey data analysis and group identification allowed the survey to answer specifically what people in each group did, and what motivated them. The combined knowledge then enabled the client to extend their marketing effectiveness and marketing ROI.
If you feel your 2019 can be strengthened with a similar customer data analysis/customer needs and motivations survey project, please call or email Philip Derham to discuss those next steps.
As managers, we’re often so busy with our jobs that we tend to see only customers with problems, very important customers, or customers episodically. A result of this occasional seeing can be that our perceptions may not fully match the reality of our current, or of our prospective, customers.
This implicit perception problem was reinforced in a recent project, when we spoke personally to a mix of a client’s customers and prospective customers.
It seemed that the client’s advertising was cutting through very effectively, as all knew the client, its products and its locations.
However, one major barrier for prospective customers emerged from these discussions.
The prospective customers said that the advertising showed this business was a club for one group in the community, but not for all.
The reality was that the client’s business was a totally inclusive business – and would serve any customer very happily.
The problem was that the client had just not seen what prospective customers saw.
The solution was quick and easy.
The busy managers just hadn’t had time to see what the prospective customers saw, and so had been missing opportunity.
If you think there are opportunities to increase your business, by knowing whether what you see is what your customers and prospective customers see, please call or email Philip Derham to discuss how we can help strengthen that commonality of view – and so your business.
Customer satisfaction is always a goal which, hopefully, leads to repeat business and recommendations from existing customers to prospective new customers.
The Net Promoter Score is often used to measure that recommendation intent.
Sometimes, as we found recently, the Net Promoter Score can be quite low while customer satisfaction measures are quite high.
Rather than report just the single (and low) Net Promoter Score and the apparently contradictory high satisfaction measure, we looked at satisfaction, at recommendation in the last year, and, separately from the overall Net Promoter Score, at those who gave the promoters’ rating of 9 or 10.
The graph below shows the findings, by age groups.
The positive Net Promoter Score ratings of 9 or 10 decline with age (under 20s excepted).
The “have recommended in the last year” measure also declines with age (under 20s excepted).
But what is very useful for managers is the finding that the “have recommended in the last year” measure mirrors, but at a higher level, the positive Net Promoter Score ratings of 9 or 10 (the promoters).
These findings suggest two practices relating to satisfaction and the Net Promoter Score.
1. Satisfaction is not a comparable measure with recommendation and the two should be read and used as two different measures, not as complementary measures.
2. When using the Net Promoter Score, managers should include comparisons with:
a. past recommendation practice proportions, and
b. those who scored 9 or 10 on the Net Promoter Score (the positive promoters)
to give more rounded views of the Net Promoter Score measures.
If you’d like to discuss how these extended measures can help you manage your customer experience and recommendation marketing, please call or email Philip Derham, using the Contact page form or email or phone.
We’ve long explained that when you know what motivates your customers, you can market more effectively to them. And we use this same approach when reviewing what research tools will work most effectively for you.
Sometimes, the tools we research are major tools – online focus groups or face-to-face focus groups, for example. And sometimes they are smaller tools – the type of questions that most effectively get the answers you need.
One type of question – the matrix question – is commonly used when several statements or products or services are measured on the same scale. For visual economy and for ease in answering, the statement and scale questions are shown as one question, as in the example below.
We tested whether there is a list-order bias in online survey matrix questions. We found:
Full details are in this month’s Quirk’s Marketing Research Review (USA) – which you can see, in full, at https://www.quirks.com/articles/is-there-a-list-order-bias-in-online-survey-matrix-questions .
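For readers curious about the mechanics, one common safeguard against list-order bias (a general survey technique, not necessarily the approach used in the article above) is to randomise the order of the matrix statements separately for each respondent, so no statement is always seen first. A minimal sketch, with hypothetical statements:

```python
import random

# Hypothetical matrix-question statements, all rated on the same scale.
statements = [
    "Easy to contact",
    "Resolved my query quickly",
    "Staff were friendly and helpful",
    "Good value for money",
]

def respondent_order(statements, respondent_id):
    """Return this respondent's own randomised statement order.

    Seeding with the respondent ID keeps each person's order stable
    if they pause and resume the survey."""
    rng = random.Random(respondent_id)
    order = statements[:]          # copy, so the master list stays untouched
    rng.shuffle(order)
    return order
```

Across a reasonable sample, each statement then appears in each screen position roughly equally often, so any position effect averages out rather than biasing one statement’s measure.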
While this may be more of a researcher’s interest, we thought it may also interest you, as it evidences the thought and care we put into every element of the research projects we undertake – to your benefit.
So, if you’d like to talk further about accurate customer research measures to help strengthen your decisions, please call or email Philip Derham. His contact details are below.
The Net Promoter Score is widely used to measure customer relationships and particularly customer loyalty to the organisation. Its simple metric enables comparisons across time, across organisations and across staff. Changes in it should reflect the effectiveness of management-initiated action.
The risk is that its very simplicity of measure enables the Net Promoter Score to be readily used, even when the context is not appropriate.
Two utility survey examples:
In both cases, the service provision was the minimum required and the minimum expected. Each delivered the service promised. Each utility thus appeared to think that a reasonable Net Promoter Score could be expected from the contact, and so undertook their Net Promoter Score survey.
But as the context for each utility contact was to rectify a problem, the contact was not an engagement from choice.
That the engagements were forced and were engagements that could not be concluded until the problems had been resolved meant the context of the contact was more significant and more likely to influence the rating. And of course, a direct staff request to get a good rating can also skew the genuine Net Promoter Score measures.
The research issue thus is to ensure the Net Promoter Score measures are used in the appropriate context, not as a simple one-size-fits-all-contacts remedy.
If you’d like to strengthen the value and utility of your Net Promoter Score use, and so ensure your measures are more effective, please call or email Philip Derham to discuss this further. His contact details are on the Contact Us page (to the right on the screen menu).
When surveying customers and former customers for clients, the survey response rates from each group differ. Customers are usually happy to participate and ex-customers rarely bother doing so. That is expected, given the different levels of involvement with the client organisation.
But, when customers are analysed by their involvement, using measures such as recency of purchase, value of ongoing business, etc., we find different patterns of response among those the client describes as customers, as the graph shows:
Customers who have not bought, or who are not using profitable services, respond at levels similar to those of ex-customers. Customers who are active purchasers, or users of high value products, are even more likely to participate in customer surveys than customers overall.
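As a minimal sketch of that post-survey analysis (segment names and counts are invented for illustration), comparing response rates by engagement segment needs only invitation and completion counts:

```python
def response_rates(invited, completed):
    """% survey response rate per customer segment: completions / invitations."""
    return {segment: round(100 * completed.get(segment, 0) / n)
            for segment, n in invited.items()}

# Invented counts, shaped like the pattern described above.
invited   = {"active purchasers": 400, "all customers": 1000,
             "inactive customers": 300, "ex-customers": 500}
completed = {"active purchasers": 160, "all customers": 300,
             "inactive customers": 36,  "ex-customers": 55}

rates = response_rates(invited, completed)
# Inactive customers here respond at near ex-customer levels, an early warning sign.
```

With these made-up figures, active purchasers respond at 40%, customers overall at 30%, and inactive customers at 12%, barely above the ex-customers’ 11%: the disengaged group identifies itself before it leaves.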
This is more than just nice-to-know.
This post-survey analysis enables you to identify disengaged customers before they become ex-customers. You can then decide whether those disengaged customers are worth working to retain, or whether it’s more realistic and cost-effective to just let them drift away.
As an example, realistically, how many packs, tents, hiking boots, sleeping bags, or cooking equipment can any one person buy? After they have bought all they can carry, are they really ongoing prospective customers – or just people who once were?
Please email or call Philip Derham to find out how our customer surveys can strengthen your marketing by helping you to identify your disengaged customers, so you can decide the most effective next steps.
You know what you’ve had to do to get the business you have now. Getting more and new business may just require more of the same – or may need to evolve to satisfy new needs.
And the insight into whether more of the same, or something new is required, can be effectively identified by surveys – usually for speed and economy, by online surveys.
In designing an online survey, three key points must be followed to get the insights needed.
1. Keep the survey short.
Our research on past surveys shows that while online surveys longer than 15 minutes still get some completions, these longer surveys have more people dropping out part-way through.
2. Don’t ask what you already know.
When you ask customers for information you already know (e.g. age, gender, home postcode or State), you waste questions unnecessarily.
3. Design for mobile phone screens.
While more people still do their online surveys on computers and laptops, an ever-increasing proportion use their mobile phones – with few using tablets.
Any question that is not easy to answer on a small phone screen will lose people and lessen the insights available to you.
When you are ready to gain fresh insights to help you grow your business, please call or email Philip Derham, so we can design the survey that will get you those needed insights.
Should we mix or match when running focus groups for you?
Part of our skill in identifying new business opportunities for you comes from deciding whether to mix or to match.
Should we mix different types of people in a focus group?
As an example, while men see themselves as silent, stoic and strong when ill (man-flu excepted), mixing the well and the ill in focus groups showed that business-opportunity differences related more to education and income than to health.
This insight then led to quite different and more effective communications.
Should we have similar types of people in the focus group?
Conversely, matching can also highlight opportunity.
If you were looking to boost your mortgage lending, matching like-people with like-people can uncover common difficulties in saving and in meeting lending criteria, identifying appropriate communication approaches.
Mixing, in contrast, may obscure the motivations and needs of those with average incomes if they share a group with people who earn seven-figure incomes each year.
Choosing the approach that will work best for you
Whether mixing or matching will make your focus groups more effective depends on the products or services whose sales you want to increase. In mixed or in matched focus groups, our accurate identification of your customers’ motivations can help you to further strengthen your business.
If you’d like to discuss how we can help you by using focus groups that reveal customer or prospective customer motivations to strengthen your communications and so results, please call or email Philip Derham now.
We see our customers all the time – in our stores, in our branches, on the phone, in their emails, on social media, in the daily or regular customer reports, in the Net Promoter Score scores, and particularly in the profit and loss accounts.
So we think we know them. We think we know what motivates them, we market to those assumptions, and we get commensurate results.
But thinking and knowing may not be the same, as a client recently found. The client’s volunteers give it over 25,000 days of free labour every year, but, like any business, it needs both labour and cash to operate.
We ran focus groups with the volunteers and clarified their motivations, which explained why the marketing had generated less than expected.
Post-focus group result? A revised marketing campaign that appealed to the volunteers’ actual motivations.
So, if identifying your customers’ actual motivations can further strengthen your marketing, please call or email me, and we can discuss the next steps to your strengthened results, just as the client has done.
The Australian Bureau of Statistics tells us facts – lots of facts – and leaves the drawing together and understanding of the influence of those facts to us.
Five recent facts that influence your business are that the proportions of people who:
Individually, these facts may or may not be interesting, but taken together they raise concerns for your business.
Proportionately fewer families have children, and there is less discretionary spending: more is being spent on basic items, including mortgages.
Less discretionary spending means less on optional purchases – the small or the larger pleasures, as well as other products, services or facilities.
Seeing these facts as parts of a trend, you could continue marketing as before, or you could find out what your customers do and want, then see what motivates them, so they’ll spend more of their limited discretionary dollars with you.
We specialise in finding these opportunities and motivations quickly and cost-effectively. So, as you reflect on the new year and its challenges, call or email Philip Derham now to start identifying the trends that are influencing your business and to find out how to motivate your customers your way.