Satisfaction and recommendation – KPIs

Australian banks measure customer satisfaction and customer recommendation intention. We investigate whether these two KPIs could usefully be combined into one to simplify management response.


Satisfaction and Recommendation as KPIs

Banks in Australia receive regular public and private results from customer satisfaction surveys. These surveys measure customers’ attitudes to their recent customer-bank interactions.
The banks can use the public domain results to positively distinguish one brand from another where product ranges are similar. These results are also used to advocate the benefits of banking with mutually owned banks and credit unions, as their customer satisfaction results are usually markedly higher than those of other banks.

Public and private customer satisfaction measures are also used as key performance indicators (KPIs) in assessing staff and corporate performance overall.
The Net Promoter Score (NPS) measures of future recommendation intention are also significant KPIs. The NPS asks customers, on a 0 to 10 scale, how likely they are to recommend the business (in this case, a bank) in the future.
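The standard NPS summary derives a single score from those 0 to 10 answers: the percentage of promoters (9 or 10) minus the percentage of detractors (0 to 6). A minimal sketch, using invented responses rather than any of the survey data discussed here:

```python
# Sketch of the standard NPS calculation from 0-10 likelihood-to-recommend
# scores. The responses below are illustrative, not actual survey data.

def net_promoter_score(scores):
    """Return NPS: % promoters (9-10) minus % detractors (0-6)."""
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

responses = [10, 9, 8, 7, 7, 6, 5, 9, 10, 3]
print(net_promoter_score(responses))  # 4 promoters, 3 detractors -> 10
```

Note that the 7s and 8s (passives) affect the score only through the denominator, which is why analyses sometimes report the raw scale results instead.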


Both are reported as separate measures. But there can be an assumption that strong customer satisfaction leads to stronger future recommendation intention and so a strengthened business. If this hypothesis held, would bank end users benefit from merging the two separate customer satisfaction and NPS measures into a single metric? The utility of a single metric was noted by one manager juggling many: one fewer had its appeal!

In this preliminary study, we reviewed one bank’s customer satisfaction with its past performance and the bank’s customers’ intention to recommend it in the future. In light of the actual data, we considered whether these two key performance indicators should be retained as separate measures of different customer mindsets or could usefully be merged into a single continuum measure, which might make it easier for banks to respond to their results.

The findings

1. Nine in ten were satisfied!

We looked at three recent bank customer satisfaction online surveys.

In each, customers were asked about their activities with the bank over the last three months and to rate those activities on the five-point scale of very satisfied, satisfied, neither satisfied nor dissatisfied, dissatisfied, and very dissatisfied.

All surveys returned very high net satisfaction results.

In each survey, nine in ten customers were satisfied (very satisfied and satisfied combined) with their activities with the bank overall.  The individual survey customer satisfaction levels were 88%, 89%, and 92%.

2. The Net Promoter Score scale findings

We then reviewed the NPS scale results from the three surveys. We used the raw NPS 0 to 10 point scale rather than its summary, which groups customers into promoters, passives, and detractors[1]. We did so because we were comparing scale results rather than specific pre-set methodologies.

At the 9 and 10 levels of the NPS scale, 27%, 37%, and 20% of bank customers said they would recommend the bank in the future.

When we combined those who selected 7, 8, 9, or 10 (broadly equating to the “very satisfied” and “satisfied” scale points), we found that about half the bank’s customers said they were likely to recommend it in the next year (54%, 65%, and 42%).

We then looked at customers who were both satisfied and recommenders at the 9 or 10 scale level, and the results scarcely differed. We found that 26%, 36%, and 19% of customers were satisfied with the bank and intended to recommend it at the 9 or 10 level. The NPS scale 9 or 10 results on their own were 27%, 37%, and 20%. Again, there was little difference between the single and the combined groups.

We also tested the inclusion of customers who answered 7 or 8 on the NPS scale and were satisfied with the bank.  We found that about half the customers were both satisfied with the bank and intended to recommend it in the next year (47%, 61%, and 40% – again similar to the 7 to 10 NPS-only measures of 54%, 65%, and 42%).
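The overlap tests above amount to counting customers who pass both a satisfaction cut-off and an NPS cut-off. A sketch of that calculation, with invented paired records rather than the surveys’ actual data:

```python
# Illustrative overlap analysis: the share of customers who are both
# satisfied (top-two-box on a 1-5 scale) and likely to recommend
# (NPS 9-10, or 7-10). Records are invented for illustration only.

records = [
    # (satisfaction 1-5, with 5 = very satisfied; NPS likelihood 0-10)
    (5, 10), (5, 9), (4, 8), (4, 7), (4, 6),
    (3, 8), (4, 9), (5, 5), (2, 3), (4, 10),
]

def share(records, predicate):
    """Percentage of records satisfying the predicate."""
    return round(100 * sum(predicate(r) for r in records) / len(records))

satisfied = lambda r: r[0] >= 4            # satisfied or very satisfied
rec_9_10 = lambda r: r[1] >= 9
rec_7_10 = lambda r: r[1] >= 7

print(share(records, rec_9_10))                                 # NPS 9-10 only
print(share(records, lambda r: satisfied(r) and rec_9_10(r)))   # satisfied AND 9-10
print(share(records, lambda r: satisfied(r) and rec_7_10(r)))   # satisfied AND 7-10
```

In this invented sample, as in the surveys, the satisfied-and-recommender share tracks the NPS-only share closely, because most high scorers on one scale also score highly on the other.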

3. But does intention lead to action?

The question we were considering was whether satisfaction and recommendation intent answers could be combined into a single new metric.


Our data was not longitudinal, so we could not see whether intention was followed by action, but we had past recommendation data for one of the surveys and examined it. In that survey, customers were asked about their past recommendations, and only 32% of bank customers said they had recommended the bank in any way in the last year.

As only 31% of all customers were both satisfied and had made a recommendation in the last year, the connection between past recommendation practice and future recommendation intention is limited to a relatively small group of perhaps a third of all the bank’s customers.

Only 17% of the bank’s customers had recommended the bank and intended to recommend it at the 9 or 10 level on the NPS scale. Including those who answered 7 or 8 as well, this group increased to 23% of all the bank’s customers. Again, this level is very similar to the proportions that were satisfied and would recommend in the next year (27%, 37%, and 20% at the NPS 9 or 10 level).

Essentially, there was a core group of about one in four customers who were satisfied, had recommended in the past, and were likely to be future recommenders.


Merge into one metric or continue as two?

The data from these three surveys indicate low associations between satisfaction, intention to recommend in the future, and past recommendation. Comparatively few customers were satisfied, had recommended, and intended to recommend in the future. This suggests that best measurement practice in the Australian consumer banking market is to continue with the two measures as KPIs. That is, customer satisfaction with past bank activity and future recommendation intention (via the NPS) should remain separate measures.

Satisfaction and Recommendation: Should 2 KPIs Be Merged Into 1? | GreenBook

Thoughts, words and business opportunity

Thoughts and words give insight into customer motivations, actions and potential business outcomes.

But words can differ from thoughts. Unless you measure both, you may miss business opportunities. And when researching your customers, you need to be aware of their words and their thoughts.

Aligning both provides better business outcomes, as this two-minute video details – https://tinyurl.com/thoughtsandwords


Our research tools and techniques ensure words match closely to thoughts – and so to subsequent behaviour.

We can help you to strengthen your business – by finding the words expressed, and the thoughts unsaid, that do influence behaviour.

To know more, please contact me, Philip Derham, at Derham Insights Research – derhamp@derhamresearch.com.au or 0414 543 765.

The March 2021 retail sales data – and opportunities it shows

Yesterday, the Australian Bureau of Statistics released its March 2021 quarter retail sales data.

In this 2 minute video, Philip Derham notes key findings and the need to ensure that marketing activities are both sector-relevant and appeal to the current shopper motivations.

The different shopper motivations now are exemplified in the marked increase in spending in the leisure/pleasure cafes, restaurants and takeaway food sector, and the decline in the supermarkets and liquor sector sales.

https://tinyurl.com/RetailSalesOpportunities

How to make the shorter surveys customers want give you what you need, too.

Our research on survey completions over the last decade has found that online surveys averaging longer than 12 minutes have higher dropout rates. People simply stop if a survey runs too long, regardless of the incentive to complete it.

Knowing this, we plan our online surveys to average 12 minutes or less.

But the problem can often be that you need to know more about your customers or your staff and their needs. A 12-minute survey can be too short to ask all you need to know. 

One solution is for us to write the long survey that answers all your questions, then divide it into smaller surveys. When those shorter surveys are completed, we combine the answers to tell you all you need to know.

But an effective additional way to get more from your online surveys is to include the information you already know about your customers (or staff). We already upload their names and email addresses, so each customer or staff member receives a personal invitation to complete the survey, and each invitation has its own unique survey link. This stops them from doing the survey more than once.

And this allows us to add extra information for later analysis when we upload the email addresses. We can upload details from your loyalty program or from your customer database or staff list.
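In essence, the pre-known details are joined to the survey answers on each respondent’s unique identifier. A hypothetical sketch, with invented respondent IDs and fields standing in for a client’s actual database:

```python
# Hypothetical sketch: joining pre-known customer details to survey answers
# by a unique respondent ID, so demographics need not be asked in the survey.
# IDs and field names are invented for illustration.

preloaded = {  # from the client's customer database
    "r001": {"age": 34, "gender": "F", "postcode": "3000", "coffees_last_week": 4},
    "r002": {"age": 51, "gender": "M", "postcode": "3121", "coffees_last_week": 0},
}

answers = {  # collected via each respondent's unique survey link
    "r001": {"satisfaction": 5, "nps": 9},
    "r002": {"satisfaction": 3, "nps": 6},
}

# Merge the two record sets per respondent for analysis.
merged = {rid: {**preloaded.get(rid, {}), **answers[rid]} for rid in answers}
print(merged["r001"]["age"], merged["r001"]["nps"])  # pre-known + surveyed
```

The join means every pre-loaded descriptor is available as an analysis variable without costing the respondent a single extra question.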

We can upload the standard demographics of age, gender, and home postcode. Doing that saves three questions – and about a minute in survey time.

Then, we can upload, from your records, the number of coffees they bought from you last week, or the value of their cash investments with you, or the value of their loans with you, or how long it has been since they last shopped with you, or their rank or length of service with you. 

And more – up to 200 descriptors, which we can then use to identify opportunities, areas of strength, and segments that need strengthening. And all without adding another question for the participant to answer.

Hence, shorter but efficient, effective surveys.

If we can help you further build your business by running shorter surveys, rich with pre-loaded, pre-known details, giving you better insights, please call or email Philip Derham, derhamp@derhamresearch.com.au or call 0414 543 765.