When surveying customers and former customers for clients, the survey response rates from each group differ. Customers are usually happy to participate and ex-customers rarely bother doing so. That is expected, given the different levels of involvement with the client organisation.
But, when customers are analysed by their involvement, using measures such as recency of purchase, value of ongoing business, etc., we find different patterns of response among those the client describes as customers, as the graph shows:
Customers who have not bought, or who are not using profitable services, respond at levels similar to those of ex-customers. Customers who are active purchasers, or users of high value products, are even more likely to participate in customer surveys than customers overall.
This is more than just nice-to-know.
This post-survey analysis enables you to identify disengaged customers before they become ex-customers. You can then decide whether those disengaged customers are worth working to retain, or whether it’s more realistic and cost-effective to just let them drift away.
As an example, realistically, how many packs, tents, pairs of hiking boots, sleeping bags or sets of cooking equipment can any one person buy? After they have bought all they can carry, are they really ongoing prospective customers – or just people who once were?
Please email or call Philip Derham to find out how our customer surveys can strengthen your marketing by helping you to identify your disengaged customers, so you can decide the most effective next steps.
You know what you’ve had to do to get the business you have now, and getting more and new business may just require more of the same – or may need to evolve to satisfy new needs.
And surveys – usually online surveys, for speed and economy – can effectively identify whether more of the same or something new is required.
In designing an online survey, three key points must be followed to get the insights needed.
1. Keep the survey short.
Our research on past surveys shows that while online surveys longer than 15 minutes still get some completions, these longer surveys have more people dropping out part-way through.
2. Don’t ask what you already know.
When you ask customers for information you already know (e.g. age, gender, home postcode or State), you waste questions unnecessarily.
3. Design for small phone screens.
While more people still complete their online surveys on computers and laptops, an ever-increasing proportion use their mobile phones – with few using tablets.
Any question that is not easy to answer on a small phone screen will lose people and lessen the insights available to you.
When you are ready to gain fresh insights to help you grow your business, please call or email Philip Derham, so we can design the survey that will get you those needed insights.
Should we mix or match when running focus groups for you?
Part of our skill in identifying new business opportunities for you comes from deciding whether to mix or to match.
Should we mix different types of people in a focus group?
As an example, while men see themselves as silent, stoic and strong when ill (man-flu excepted), mixing the well and the ill in focus groups revealed that business-opportunity differences related more to education and income than to health.
This insight then led to quite different and more effective communications.
Should we have similar types of people in the focus group?
Conversely, matching can also highlight opportunity.
If you were looking to boost your mortgage lending, matching like-people with like-people can uncover common difficulties in saving and in meeting lending criteria, identifying appropriate communication approaches.
Mixing different types of people may, in contrast, lessen the understanding of the motivations and needs of those on average incomes, if they are in a group mixed with people who earn seven-figure incomes each year.
Choosing the approach that will work best for you
Mixing or matching to make focus groups more effective depends on the products or services from which you want to increase your sales. In mixed or in matched focus groups, our accurate identification of your customers’ motivations can help you to further strengthen your business.
If you’d like to discuss how we can help you by using focus groups that reveal customer or prospective customer motivations to strengthen your communications and so results, please call or email Philip Derham now.
We see our customers all the time – in our stores, in our branches, on the phone, in their emails, on social media, in the daily or regular customer reports, in the Net Promoter Score scores, and particularly in the profit and loss accounts.
So we think we know them. We think we know what motivates them, market to those thoughts, and get commensurate results.
But thinking and knowing may not be the same, as a client recently found. Volunteers give the client over 25,000 days of free labour every year, but the client, like any business, needs both labour and cash to operate.
We ran focus groups with the volunteers and clarified their motivations, which explained why the marketing had generated less than expected.
Post-focus group result? A revised marketing campaign that appealed to the volunteers’ actual motivations.
So, if identifying your customers’ actual motivations can further strengthen your marketing, please call or email me, and we can discuss the next steps to your strengthened results, just as the client has done.
The Australian Bureau of Statistics tells us facts – lots of facts – and leaves the drawing together and understanding of the influence of those facts to us.
Five recent facts that influence your business are that the proportions of people who:
- Own their homes outright fell from 43% to 38% (1995-96 to 2015-16);
- Own their homes with a mortgage rose from 28% to 37% over the same period;
- Are children fell from 22% of the population to 19% (1995 to 2015);
- Live in families with children fell from 64% in 1996 to 61% in 2016. And yet
- Spend less on non-essentials: in 1984, 44% of household income went on discretionary items, but by 2015-16 this had declined to just 41%.
Individually, these facts may or may not be interesting but, taken together, they raise concerns for your business.
Proportionately fewer families have children, yet there is less discretionary spending. More is being spent on basic items, including mortgages.
Less discretionary spending means less on optional purchases – the small or the larger pleasures, as well as other products, services or facilities.
Seeing these facts as parts of a trend, you could continue marketing as before, or you could find out what your customers do and want, then see what motivates them, so they’ll spend more of their limited discretionary spending with you.
We specialise in finding these opportunities and motivations quickly and cost-effectively. So, as you reflect on the new year and its challenges, call or email Philip Derham now to start identifying the trends that are influencing your business and to find out how to motivate your customers your way.
When customers do more with you, you can assume they are satisfied with your products, prices and service.
But as assumptions are just assumptions, many organisations survey their customers to establish their satisfaction levels and report those as formal KPI measures. We’ve found it strengthens your satisfaction KPI results if you let your survey run a little longer.
Most people complete their online or on-mobile surveys within 48 hours of their emailed invitation, so it can be tempting to close then and report the results quickly.
For customers who are just satisfied, that approach is fine.
The proportions of customers who are just satisfied remain stable from the first to the tenth day after the survey opens.
But the very satisfied customers are different.
While more of the very satisfied customers complete their survey within 48 hours, quite a few take longer. They keep your email invitation and return to it three, five or even seven days after it was sent to them.
In surveys, we’ve seen the extra time help the very satisfied proportion grow from 54% after 48 hours to 62% after five days – a statistically significant gain of nearly three standard errors.
This very satisfied proportion growth also boosts the net satisfaction level, making your net satisfaction KPI stronger too.
As we effectively measure customers’ satisfaction, please call or email Philip Derham to discuss your next steps to better measures and strengthened satisfaction KPIs. His contact details are below.
Telephone: (+61) 0414 543 765
As you well know, the ADI financial services sector is evolving, and rapidly.
New entrants and new payment or transfer methods are appearing.
The Government is encouraging more competition, seeking to further benefit customers.
- Some ADIs have merged.
- Some former Credit Unions have become banks – and more are considering that change.
- Some have renamed. Some have re-branded – with changes in logos and identity.
- Some remain as they have been. And
- Some are hedging their bets, registering possible new names and exploring the opportunities.
Much of this reviewing is being quietly undertaken in-house, with staff and Directors. But knowing what your current members and eligible prospects think can clarify the likely outcomes, by telling you how changes may influence financial activity with you – or with another ADI.
Over the last twenty-five years, we’ve researched the impact of change – and of no change – in ADI type, name, branding, logo and presentation, and seen some very positive outcomes follow.
Please call or email me to know more – whether by using our expressly developed 20 Questions Brand Monitor™, our logo options testing facilities, or your own bespoke customer and prospect research.
Then, with sound knowledge of the impact of change – or of no-change – we can strengthen your final decision. So please call or email me now, to activate the steps to that knowledge. Click on the Contact page for quick contact now!
Perceptions matter because they influence action.
Since the Australian Bureau of Statistics’ August retail sales results were released last Thursday, media commentators have bemoaned the retail sales decline of 0.6% since the previous month.
And if Australians are tightening their wallets and not spending on retail, and if the mood is one of caution, investing to generate sales and more sales needs to be targeted. It needs to be targeted to groups more likely to spend, borrow, invest as you need. And targeted to appeal to the motivations of people in those groups to act.
Clarifying those two targets is what we do.
Our research shows who the targets for your profitable products and your profitable services are. Our research then identifies what you need to offer to motivate them – to excite their interest and ultimately, their action.
If you feel your sales could be strengthened now, with your profitable products and services sold more effectively, please call me to discuss the steps to achieve that knowledge.
Known brands are shorthand: they get quick customer and prospect recognition, response and purchase. Unknown brands get querying and questioning; may not be seen; and attract lower consideration and lower purchase likelihood.
If your brand is known, why change?
- To strengthen the business. New laws allowing Credit Unions to rebrand as Banks may enable more new business.
- To enter new markets. The Surrey Hills Café brand may be too localised to work elsewhere.
- To become current. The Buggy Whip Manufactory brand may be too dated for a phone app developer.
- The brand is damaged. The White Pointer Bait surfboard brand may sell more if rebranded.
Choices for the better brand decision.
Take a punt and hope for the best?
Or find out what your current and future customers think of your brand now, and the impact of alternatives. Then, with knowledge, decide. (The business-strengthening decision.)
Identifying your customer views can be done using online and offline focus groups, specific surveys or our 20 Questions Brand Monitor™.
The 20 Questions Brand Monitor™ is an online survey, for all your email-accessible customers, with personalised questions about brand awareness, brand attributes, brand perceptions, brand preferences, and likely actions if the brand were…
We can add a sample of others not from your database as a comparison, if needed. To know more, please contact me today! Details below.
Working in an office normalises office practices. We receive, read and reply to emails at work on work computers. And we mostly complete online surveys at work, on our work computers.
These practices are seen as normal for people in offices.
But as we’ve seen in recent surveys, one in four read emails and complete online surveys on their small-screen smartphones, at work or elsewhere.
Up to the age of 40, half or more in each age group used their smartphone to complete the survey.
This enables a full range of your customers to complete our online surveys, giving a better knowledge of their needs and so of sales opportunities. Please ring or email Philip Derham to find out more.
That reflective time can enable you to return, as some did last year, with a determination to test marketing assumptions they’d made during the year. Some of the assumptions we tested for them included:
* That people “like me” were the customers. We found the customers were less “like me” and so a changed sales approach became profitable, quickly, as it targeted actual buyers more directly.
* That the most profitable customers were the most satisfied. Data analysed with survey results found the most satisfied customers were not necessarily the most profitable, requiring a rethink of the core customer group.
* That sponsoring a local sports team would get almost total audience reach, which we found was not necessarily the case. Knowing this enabled them to allocate their sponsorship investment more effectively.
So, if you’ve had the time and the opportunity to consider assumptions, and would like to test those, to ensure your marketing and sales investment delivers the maximum returns, please ring or email or use the form on this site to contact Philip Derham to discuss the steps.
Are laggards more likely to be satisfied or dissatisfied? The answer might surprise you. Philip Derham reports.
This article first appeared in the June 2016 edition of the AMSRS Research News and is republished with permission.
Clients often use measures of customer satisfaction as an element in their assessments of their own corporate effectiveness, competitive advantage, and senior management’s performance – and related senior managers’ bonuses. Our online surveys thus normally include satisfaction measures, which we seek to make as effective and accurate as possible.
When reviewing satisfaction measure effectiveness, we look at the data itself and how we have measured that. Today, this can often include how long the online survey is open, as clients are often keen on speedy, though accurate, results.
Our online survey software allows us to see when each survey response is received. When graphed, the time and day results from a recent 2,934-sample survey, with three email contacts, raised questions, as Figure 1 indicates.
The double peaks of high response within 24 hours of each email’s dispatch, shown in Figure 1, were initially interesting but had a simple explanation. The survey invitation emails were sent in the early afternoon and would have been received shortly after. Most who responded did so when they received the emails – mainly in the afternoon and evening of the day of dispatch or the next morning, with few accessing their emails overnight. Hence, the response was largely immediate, as Figure 2 shows.
Figure 2: Survey completion in the 24 hours from the dispatch of the survey invitation, shown on a 24-hour clock face
This led us to question why a few took two, three, and up to six days after the email was received to complete the survey. We first wondered if these laggards were dissatisfied customers.
The evidence indicated they were not. Overall, only three per cent of those who responded were dissatisfied; among those who answered within 24 hours of the email, three per cent were dissatisfied, and of those who responded two, three or four days later, two per cent were dissatisfied. Hence, the laggards were not dissatisfied customers. Nor did the second or third email raise the low dissatisfaction rate, suggesting more prompts do not, of themselves, cause dissatisfaction.
However, there was a significant difference between groups of satisfied customers, depending on their time of response.
Overall – and this was the result reported to the client – 86 per cent of customers were at least satisfied with the client, and 55 per cent were very satisfied. The remaining 11 per cent were neutral, had not used the service, or had not answered the question.
The significant difference was that while 55 per cent of all customers were very satisfied, and 54 per cent of those who did the survey within the first 24 hours were very satisfied, proportionately more customers who did the survey two, three, four or five days later were very satisfied – 62 per cent, as Figure 3 illustrates.
It is unlikely that the difference between the two groups is due to chance, as the late responders’ group results were nearly three standard errors larger than the results from the group that responded within 24 hours of the invitations.
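The near-three-standard-error result above can be checked with a simple two-proportion comparison. The sketch below is illustrative only: the 54 and 62 per cent very satisfied figures come from the article, but the group sizes are assumptions (the article gives a 2,934 total and that most responded within 24 hours, not the exact split).

```python
from math import sqrt

# Illustrative comparison of "very satisfied" proportions by response speed.
# Group sizes are ASSUMED for this sketch; only the percentages are from
# the article.
n_fast, p_fast = 1790, 0.54   # responded within 24 hours of the invitation
n_late, p_late = 400, 0.62    # responded two to five days later

# Standard error of the difference between two independent proportions
se = sqrt(p_fast * (1 - p_fast) / n_fast + p_late * (1 - p_late) / n_late)

# Express the observed difference in standard errors
z = (p_late - p_fast) / se
print(f"difference = {p_late - p_fast:.2f}, about {z:.1f} standard errors")
```

With these assumed group sizes, the eight-point difference works out at roughly three standard errors, consistent with the article’s conclusion; a smaller late-responder group would shrink that figure.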
An interpretation of these results is that while most of the very satisfied customers will complete an online survey immediately, a separate, numerically smaller but proportionately larger group of very satisfied customers will complete the survey some days after they receive their invitations.
The conclusions from this one survey analysis are that:
- The practice of sending up to three email invitations to customers on client databases does not increase the proportion of dissatisfied customers
- The longer the online survey is open (to the 14 days tested), the more very satisfied customers will complete it. This may strengthen the proportion of satisfied customers reported to clients, and so strengthen their own subsequent satisfaction-related actions.
- However, if results are urgently required, online satisfaction surveys can be closed earlier, with slight reductions in overall net satisfaction scores likely.
- It would be useful to undertake further analysis comparing the very satisfied customers who complete the survey immediately with those who take two or more days to do so.
- These single online survey findings have stimulated us to look at more of our surveys to see whether this is a singular result, or a harbinger, and to report those findings later. We suggest others may wish to review their satisfaction survey results to see if we can develop a broader industry view on this.
Philip Derham, Director, Derham Marketing Research Pty. Ltd.
For them, a recommendation is worth $20.
Just give them a friend’s email address and, if your friend buys, the friend gets $10 off that purchase and you get $10 off your next purchase (T&Cs apply).
Otherwise, your recommendation is, implicitly, of no value.
Thinking on this, several questions follow.
- What is the value of recommendation to your business?
- Who recommends?
- Who do they recommend to?
- Do the recipients follow the recommendation?
- And if they do, when?
- Do recommendations generate business – or just nice warm feelings? And,
- If recommendations do generate more business for you, how can you stimulate more of them?
Our Recommender Advantage Monitor answers these questions (and more) and can help you establish the value of recommendations to you, and how you can stimulate more of them.
If you want more useful recommendations, please contact Philip Derham for more details.
The market research discipline uses applied scientific techniques to gather replicable and reliable insights and knowledge to support decision-making.
These two different worlds do meet.
As examples of the collision between the two worlds, in the last month, four media reports based on survey results implicitly or expressly argued for public and perhaps Government action on three topical issues – hunting foxes, lessening obesity, and Muslims in Australia.
In one article, the journalists noted the survey had been repeated twice and that its usual sample was 1,000. The second article noted a sample of 304 people. The foxhunting story mentioned no sample size, and the obesity report mentioned a sample of more than 86,500 people collected over a 12-month period (1).
Relevant issues for market researchers in assessing the data would include the sample size, sample selection, sample representativeness, sampling error, survey method, whether the results were weighted to their population, and the impact these issues may have on the findings reported. The language, the question wording and any information supplied as context for the questions asked were not discussed in these reports.
Discussion of these methodological issues may not be as exciting as commentary on incendiary findings, but discussion of the validity of findings is needed to ensure the community draws the correct information from the studies.
People may still then act on their prejudices, as the 200,000 witches tortured, burnt or hanged in the UK over the past 300 years would attest, were they here (2), but reporting survey findings with the careful caveats we market researchers would include may better inform, and so influence, public discourse.
We market researchers can and may contact individual journalists to suggest these questions, but one-off individual action may not have the influence required.
Rather, the conclusion from the reporting of these four surveys is that the Australian market research organisations should develop, distribute and promote a journalists’ guide to the questions that should be answered before survey results are reported.
As Kirkpatrick said, reporters should make sure of the validity of their survey-based facts before reporting them (3).
To do so is to enrich rather than to distort public discourse.
1 The (UK) Guardian, October 3, 2016.
September 21, 2016: http://www.smh.com.au/federal-politics/political-news/half-of-all-australians-want-to-ban-muslim-immigration-poll-20160920-grkufa.html
September 27, 2016: www.smh.com.au/federal-politics/political-news/new-national-snapshot-finds-60-per-cent-of-australians-would-be-concerned-if-a-relative-married-a-muslim-20160926-grp4x0.html
September 21, 2016: http://www.theage.com.au/national/health/australian-diets-below-benchmark-construction-workers-main-culprits-20160925-gro3pm.html
Research News Live – http://amsrslive.com.au/2016/11/10/world-views-collide-to-distort-public-discourse/
Joining was easy. I was in the store.
I’d bought some hiking gear and would get a slight discount if I joined the retailer’s loyalty scheme, giving them personal details to add to the product purchase details.
A great start to an ongoing sales relationship? No! No, because the retailer’s monthly promotional email is a standard listing of things I could buy for me, at cheap prices.
Yet, in this sector, what motivates me is hiking gear for children. They love the outdoors, and I’m a sucker for anything that makes their hiking experiences better.
The retailer’s lack of knowledge of what motivates me, and people like me, means their emailed promotions do not generate sales from me – nor from people like me.
After all, I only need one pack and one pair of boots. The equipment opportunities for the children are far, far greater – they keep growing, and so need new clothing, new boots and new gear.
Our customer research can easily, quickly and cost-effectively tell this retailer what really motivates groups of their customers. Then they can market to those actual motivations and generate more sales as a result.
If you feel your marketing could also be sharpened with appeals to your customer groups’ motivations, and want assistance identifying those, call, email or use the form below to contact Philip Derham for the next steps.
The risk is that not all customers, not all prospects, see what you see.
In the real-life, graphed example, staff see their business benefits as being friendly, secure and offering competitive products.
Prospects do not see what staff see and scarcely see the client at all.
Benefits seen from within may not be the reasons customers or prospects buy.
Is, for example, the benefit sold to shoppers of “the great tasting coffee” the main reason to buy coffee from a shopping centre café? Or is it the chance to sit and rest, and have some refreshment?
Marketing to “what the buyers see” rather than to what you see increases your marketing return on investment.
Knowing what your customers and prospects see, want and will buy makes your marketing more effective and increases its ROI.
Research identifies buyers’ and prospects’ perceptions and, by providing that knowledge, sharpens your marketing and your marketing investment return.
To ensure your views and your customers’ views are aligned, please call or email me for the next steps.
Recommendations from customers are a very cheap form of advertising – one which extends your marketing budget further at no extra cost.
The question “How do you get more recommendations from your customers?” has an answer. Our past research shows that in specific industries there are:
- One essential attribute that drives recommendations;
- Four characteristics that recommenders share; and
- Three core groups who may be influenced by recommenders.
Knowing who your recommender/promoter groups are will enable you to target those people most likely to recommend, with suggestions that they do so.
Hence, we’ve developed a brief Recommender Advantage Monitor that identifies which of your customers are most likely to recommend you – and why.
If you’d like to know more, please call or email Philip Derham, or use the form below.
One recent review showed that we could get good numbers of responses from 1 email invitation and could close the survey in 48 hours (as the graph shows).
But additional customers responded to the 2nd invitation.
Keeping the survey open longer also enabled more very satisfied customers to respond.
In this reviewed customer satisfaction survey, 3,000 responded – most (61%) within 24 hours. But responses continued for days after each email.
We analysed the “within 24 hours”, and the “2+ days later” responses.
We found that:
- 54% of those who responded within 24 hours were very satisfied with the client,
- 62% of customers who responded 2+ days later were very satisfied.
- This was nearly 3 standard errors difference, and so statistically significant.
Including these later-responding very satisfied customers increased the overall satisfaction measure, a useful benefit if customer satisfaction is a KPI.
The conclusion is that while we can close a survey after 48 hours and have a good sample size, to give you the best results our preference is to keep your online surveys open for at least 10 days.
To know more about our business-strengthening online surveys, please call or email Philip Derham. Please click on the Contact Us tab to email or to ring him.
Solutions to strengthen your online survey response.
Research technique matters to clients when we (or others) find that technique improvements can deliver stronger results for no increase in fees.
One technique we reviewed recently was how the increased use of smartphones influences online survey completions.
We analysed 14,111 recent online surveys and found that:
* 4 in 5 of the people who started completed their survey.
* 84% of those who used desktops completed their survey,
* 66% of those who used smart phones completed their survey.
As every participant is valuable, we asked why fewer smartphone users completed their surveys. The reasons were clear:
* Smartphones’ smaller screen sizes made the questions harder to answer easily (“my fingers are too big”); and
* Smartphones often had slower internet speeds than desktops.
The solutions were just as clear and so we recommend:
* Shorter surveys (with split samples and questionnaires, if needed) and specifically written for you;
* When relevant, using one of our smartphone-specific 20 Questions One Topic Monitors or our Summary Spotlight Surveys.
For more information about how we can strengthen your survey findings, and so strengthen your decisions, please call or email Philip Derham now! His contact details are under the Contact Us tab at the right/top of your screen.
The article and the data in full are available in the June 2016 edition of Quirk’s Marketing Research Review, http://www.quirks.com/articles/2016/20160605.aspx
People are influenced by where they live, and where they live influences their purchase behaviour. These geographically based differences in attitudes and behaviour can affect your sales.
So when looking to understand the influence of these geographically-based differences on your sales, we have found face-to-face focus groups to be a useful research tool.
But face-to-face focus groups across the country can be costly.
We may have to travel from Dubbo to Darwin and to Doncaster and beyond to ensure your key customer or prospect segments are researched. And you can’t always be there yourself to see what they are saying.
This is the tyranny of distance in Australia – and beyond, if you have an international business. But there is a solution.
That solution is online focus groups. We run each online focus group with 4, 5 or 6 people selected for relevance from a range of areas. All participate at the same time, using their screens and web cams, so we can all see each other and the images or words we want them to see, just as we do in face-to-face focus groups.
With three key differences:
- The participants are in their homes (in Sydney or in New York, in Bendigo or in Idaho, in Penrith or in Perth) and I’m in my office. No travel. No travel costs.
- You can watch each group from work or at home, as fits your time.
- You can, via the researcher, seek more comment on a participant’s comment while that topic is relevant and fresh.
If you want to know more about online focus groups and their capacity to inform you, more cost-effectively, call or email me, Philip Derham, and I’ll tell you more – online or offline as you wish.
At work, we all plan.
Our plans have formal, specific objectives and measurable outcomes. We plan for the growth anticipated and for the problems foreseen.
But as the year progresses, the unexpected can influence our plans and more particularly, our planned outcomes.
Then, we need to review and perhaps regroup and change our plans.
We can deal with the problems by using our accumulated experience and expertise and, often this works.
Sometimes, we need to go beyond experience and assumption and revise and make new plans and new decisions based on knowledge – knowledge of customer or competitor activity and behaviour.
But inevitably, time is tight.
The old plans need modifying and new solutions need to be implemented quickly, to enable us to overcome the emergent problems.
Here, our 20 Questions One Topic™ monitors (contact us for a brochure) can get you the rapid knowledge you need, enabling more effective change so your planned achievements are met as nearly as possible.
To know more, or to discuss your plans-disrupting problems, please call or email Philip Derham.
Most of us have been children, or parents, or observers of parents and so think we know how parents think and behave, and what motivates them.
This would seem to make marketing to parents quite straightforward.
Except that marketing based on what we know, as opposed to what we think we know, can be quite different.
As examples, our recent research found that what is seen as “normal” for some groups of parents is:
* Spending $2,500 to take a child to a one day dancing contest in another State.
* Working second jobs so they can afford private school fees.
* Giving up coffee with friends, to save the money for food.
* Travelling for a year in a small caravan with their children, so the children can see and experience Australia.
* Buying everything from eBay, for the cheaper prices.
* Borrowing to build extra bedrooms so each child can have their own 75 inch TV in their own room (keeps the peace, apparently).
The behaviours of parents in these groups differ from those of parents in other groups.
And, depending on the numbers of people in each motivational group, those differences could mean markedly different spending, saving, shopping, travel and café culture practices.
Hence, even with groups of people apparently as similar as parents, we need to be sure in our knowledge, so our marketing is relevant and targeted.
Just thinking we know is no longer a sufficient basis for effective marketing, whether to parents or to any other type of customer.
Now, to strengthen your marketing effectiveness with your own customers and prospects, you need to understand their motivations (and how big each different motivational group is) so you can sell more, more effectively.
If you would like to know more about identifying the motivations, groups and group sizes of your customers and prospects, please call or email Philip Derham via the “Contact us” tab.
New information about your customers from surveys and focus groups is always exciting – and can often reveal needs or attitudes perhaps not fully expected.
A second source of exciting information about your customers is your own customer database. Analysis and reanalysis of that can often reveal additional opportunities.
We can’t tell you about other clients’ database analyses, but we can tell you about our own reanalysis of the University of Melbourne’s Household, Income and Labour Dynamics in Australia (HILDA) survey.
That database first showed that happiness, analysed by age, is U-shaped. That is, people are happier when young, less so when middle-aged, and happier again when older. The first-run analysis stopped there.
Our reanalysis of this database found marked differences by commonly recorded database details, including varying satisfaction according to home type, home location and neighbourhood; age and gender; and the often-collected details of personal finances and personal health.
As examples, people living in separate houses were markedly more satisfied than those living in flats. The neighbourhood influenced satisfaction and perceptions of personal safety; and home type and health were linked.
This reanalysis of an existing database identified immediate opportunities for particular types of loan, home security products, specific types of insurance, particular foods and beverages, health services, and modifications to existing products or services. Any surveys or focus groups needed afterwards can be more specifically directed because of the database analysis – a powerful step in identifying further customer opportunities.
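For readers who work with their own customer databases, the kind of reanalysis described above is, at its core, a cross-tabulation of a satisfaction score against fields the database already holds. A minimal sketch in Python with pandas is below; the column names and data are invented for illustration and are not the actual HILDA variable names.

```python
# Illustrative sketch of reanalysing an existing database: mean
# satisfaction cross-tabulated by a field the database already records.
# Column names and values here are invented stand-ins, not HILDA fields.
import pandas as pd

# A toy stand-in for a customer or survey database.
df = pd.DataFrame({
    "home_type":    ["house", "house", "flat", "flat", "house", "flat"],
    "age_band":     ["18-34", "35-54", "18-34", "55+", "55+", "35-54"],
    "satisfaction": [8, 7, 6, 7, 9, 5],   # 0-10 scale
})

# Mean satisfaction by home type - the "separate houses vs flats"
# comparison described in the text.
by_home = df.groupby("home_type")["satisfaction"].mean()
print(by_home)
```

The same `groupby` step, repeated over neighbourhood, age band, finances or health fields, is what surfaces the differences the post describes – no new data collection required.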
If you’d like to know more about how we can strengthen your database reanalyses, please call or email Philip Derham now.
These questions are the subject line of our invitation today to thousands of Australians, to test their preferences for online survey question types.
When we know the sorts of online survey questions people prefer to answer, we can ensure we ask your questions in the ways that get more, and more accurate, answers – strengthening the value of your research investment.
We are testing the ease of answering questions again now, because the devices used to complete online surveys have changed.
Once, there were only desktop PCs or laptops.
Now, you can use an iPhone, iPad, MacBook, Surface, laptop, PC, Android tablet, Android smart phone, and more. All with different screen sizes and shapes. And all with different data input techniques – ranging from on-screen and off-screen keyboards to fingers.
Some styles of questions work really well on some devices. Other styles of questions work better on other types of devices.
Today, as PCs and laptops are used to answer only about two thirds of online surveys, it is necessary to know what question types are preferred by the people who complete our online surveys, on the screens they use.
This survey about chocolates, holidays and supermarkets is really about question types and question type preferences, made palatable by being about more interesting things.
We will report the results, but thought you may be interested to see what is being tested, and so invite you to also complete the brief survey. The link is:
If you’d like to know more, please call or email Philip Derham.
What encourages people to complete your survey?
A gift? Points towards future rewards? A chance to win a prize? Curiosity and wanting to know what others think?
For longer surveys, chances to win a prize and direct gifts are apt. For short surveys, knowing is the incentive. And knowing leads directly to our new Summary Spotlight Survey Report with its potentially extensive findings.
When its 5 questions are answered, people immediately see the answer totals, and their own answers, so they can compare themselves with everyone else. And knowing how you compare is a strong survey completion incentive!
The benefits for you are far greater.
With short surveys, you get quick responses, quick knowledge and quick decision-making capabilities. And far more knowledge: our Summary Spotlight Survey Report can deliver much more than the 5 questions we ask, answering your specific information need.
We can pre-load everything you know about your customers. This saves asking again the personal and purchase details you already know.
Such respondent-saving pre-loaded knowledge can include customers’:
- age and gender,
- home address (for mapping), time at that address (measuring mobility),
- family type and composition,
- profit value or other segments you use, or credit history perhaps,
- products held, bought, or applied for,
- personal or household income and work status,
- home or motor vehicle ownership, or assets and liabilities,
- purchase frequency, wider product use, and more.
All this pre-known detail can be built into your 5-question survey, giving much of the value of a longer survey, cleverly asked to answer your specific need.
The results are quick, detailed and thorough answers to your one specific knowledge need. For more information about our Summary Spotlight Survey Report, please call or email Philip Derham. Contact details are below.
As you know, smart phone screens are smaller than the more traditionally used PC, laptop, netbook or even tablet screens, so we wondered if there had been changes in survey responses too.
We reviewed our recent online survey results by types of the devices used. These included Android smart phones, iPhones, iPads, Android tablets, Mac computers, Windows 8 computers, and computers running older Windows versions.
We found that:
1. The average time taken to complete the survey was the same, regardless of device used.
2. The same proportions answered open ended questions, regardless of device used.
3. Older men in particular tended to use PCs or laptops; younger people were more likely to use smart phones or tablets; and the middle-aged were likely to use PCs, laptops, tablets or smart phones.
4. But, when answers differed, they differed by age or by gender, by activities or by intention, not by the type of device used.
These findings show that online surveys work regardless of the devices used, as the answers you get are consistent across all types of devices.
As our surveys are all device-agnostic – for easy completion on any device – we can get the maximum value from your online survey investment.
If you would like to know more, please call or email Philip Derham today!
You know that one nagging question that niggles away at you.
It’s the one you talk about with colleagues and friends, looking for a solution. They kindly offer well-meant ideas and suggestions that you know aren’t the answer – ideas and suggestions you’ve already thought of and discarded because they just didn’t give you the real solution you want.
Sometimes the question is about customers.
* “What more do we have to do to get their profitable business?”
Sometimes the question is about offers.
* “What do we have to offer to get more enquiries? We can convert those – it’s the leads we want.”
Sometimes the question is about satisfaction.
* “We have really high customer satisfaction, so why aren’t we used more? Or at least recommended more?”
And sometimes the question is about advertising.
* “Why aren’t we getting a response to this really compelling ad?”
We can get the answer you need – quickly and effectively, with our 20 Questions One Topic Monitors™.
These single issue surveys can answer that nagging question. The brochure (on this link: 20 Questions One Topic Monitors) gives the details for 2015-2016 – and highlights one new benefit!
If you supply the survey sample – your customers’ names, email addresses, ages, genders and home postcodes – you can add 3 extra questions of your own choice to the Monitor you are using!
To find out more, please call or email me today!
Today, 23.8 million Australians use 21 million smart phones to download an average of one gigabyte of data a month, showing the smart phone is a key entertainment, purchase and survey instrument.
I say survey instrument deliberately. Deliberately, because smart phone surveys can be undertaken instantly and anywhere, and can be used to upload images of products, materials, or the things that motivate purchase.
And the change in just two years has made smart phones and smart phone surveys mainstream. Once for the young and the early adopters, smart phones are now widely owned and widely used for surveys, as the graph shows.
With this survey instrument in the hands of your customers or prospective customers, we can survey them quickly and briefly, get more answers and give you more knowledge, so you can outpace the others.
And they can still do the surveys on their desktops at home or work, or on their laptops or tablets when they are using those.
When you need to know what your customers and prospective customers think, do and will do, in statistically reliable terms, our smart phone+ surveys may be your answer. Our surveys are device-agnostic (that is, they look good and work easily on small-screen smartphones, tablets, laptops or computers), and work wherever and whenever they are needed to get the surveys answered.
When you need to know how our smart phone survey knowledge can strengthen your business, please call or email Philip Derham today!
When we need to find out what your customers/prospective customers think, do and will do, in statistically reliable terms, we use surveys.
Online and mobile surveys can give the best reach and the most cost-efficient results.
We’ve found the best, most cost-efficient responses come from large customer databases that include names, addresses, email addresses and other relevant customer details.
With those, we can send personalised email invitations and reminders to your customers, so more complete the survey.
And email communications work.
We’ve found they give up to three times the responses of SMS invitations, and more again than other contact methods.
From recent email invitation surveys, we’ve found:
- 6 in 10 completed the surveys on PCs or laptops,
- 3 in 10 used smart phones,
- 1 in 10 used tablets.
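The device split above comes from tallying each response’s device category. A minimal sketch, assuming the responses have already been classified by device (for example, from user-agent strings; the data here is invented to match the reported 6-3-1 split):

```python
# Sketch of deriving device shares from classified survey responses.
# The response list is illustrative, matching the reported proportions.
from collections import Counter

responses = ["pc", "pc", "pc", "pc", "pc", "pc",
             "smartphone", "smartphone", "smartphone",
             "tablet"]

counts = Counter(responses)          # tally responses per device type
total = len(responses)
shares = {device: n / total for device, n in counts.items()}
print(shares)
```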
Our surveys are completely device-agnostic.
That is, they look good and work easily on large-screen laptops, on small-screen smartphones and mid-size tablets.
Email survey invitations reach your customers whenever and wherever they wish to get them – on holiday, on the train, at home at night, or during the working day in the boss’s time. And when they get the survey email invitation, they respond.
And it is much easier to respond on the device you use to get your emails than to take a phone call or post a letter. Hence, surveys based on email invitations work very well.
If you’d like to know how this knowledge can strengthen your business, please call or email Philip Derham, or use the Contact us form.