Does being patient pay off?

Are laggards more likely to be satisfied or dissatisfied? The answer might surprise you. Philip Derham reports.

This article first appeared in the June 2016 edition of the AMSRS Research News and is republished with permission.

Clients often use measures of customer satisfaction as an element in their assessments of their own corporate effectiveness, competitive advantage, and senior management’s performance – and related senior managers’ bonuses. Our online surveys thus normally include satisfaction measures, which we seek to make as effective and accurate as possible.

When reviewing satisfaction measure effectiveness, we look both at the data itself and at how it was measured. Increasingly, this includes how long the online survey is open, as clients often want speedy, though accurate, results.

Our online survey software allows us to see when each survey response is received. When graphed, the time and day results from a recent 2,934-sample survey, with three email contacts, raised questions, as Figure 1 indicates.

The double peaks of high response within 24 hours of each email’s dispatch, shown in Figure 1, were initially interesting but had a simple explanation. The survey invitation emails were sent in the early afternoon and would have been received shortly after. Most who responded did so when they received the emails – mainly in the afternoon and evening of the day of dispatch or the next morning, with few accessing their emails overnight. The response was thus largely immediate, as Figure 2 shows.

Figure 2: Survey completion in the 24 hours from the dispatch of the survey invitation, shown on a 24-hour clock face

This led us to question why a few took two, three, and up to six days after the email was received to complete the survey. We first wondered if these laggards were dissatisfied customers.

The evidence indicated they were not. Only three per cent of all respondents were dissatisfied: three per cent of those who answered within 24 hours of an email, and two per cent of those who responded two, three or four days later. Hence, the laggards were not dissatisfied customers. Nor did the second or third email raise that low dissatisfaction rate, suggesting that further prompts do not, of themselves, cause dissatisfaction.

However, there was a significant difference between groups of satisfied customers, depending on their time of response.

The overall result, reported to the client, was that 86 per cent of customers were at least satisfied with the client – and 55 per cent were very satisfied. The remaining 11 per cent were neutral, had not used the service, or had not answered the question.

The significant difference was that while 55 per cent of all customers were very satisfied, and 54 per cent of those who did the survey within the first 24 hours were very satisfied, proportionately more customers who did the survey two, three, four or five days later were very satisfied – 62 per cent, as Figure 3 illustrates.

Figure 3: Comparisons of very satisfied customers, in total, of those who responded within 24 hours of an invitation, and those who responded later

It is unlikely that the difference between the two groups is due to chance, as the late responders’ group results were nearly three standard errors larger than the results from the group that responded within 24 hours of the invitations.
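As a rough check on a gap of this kind, the difference between the two proportions can be sketched as a two-proportion z-test. The 54 and 62 per cent figures come from the survey; the group sizes below are purely illustrative assumptions, since the article does not report how many respondents fell into each group.

```python
from math import sqrt

def two_proportion_z(p1, n1, p2, n2):
    """z-statistic for the difference between two independent proportions,
    using the pooled estimate of the common proportion for the standard error."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# 54% very satisfied among within-24-hour responders, 62% among late responders
# (proportions from the survey; the group sizes 2500 and 300 are hypothetical)
z = two_proportion_z(0.54, 2500, 0.62, 300)
print(round(z, 2))  # ≈ 2.63 with these assumed group sizes
```

With these placeholder group sizes the difference sits between two and three standard errors, broadly in line with the "nearly three standard errors" reported; the exact figure depends on the real group sizes, which only the original data would supply.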

An interpretation of these results is that while most of the very satisfied customers will complete an online survey immediately, a separate, numerically smaller but proportionately larger group of very satisfied customers will complete the survey some days after they receive their invitations.

The conclusions from this one survey analysis are that:

  • The practice of sending up to three email invitations to customers on client databases does not increase the proportion of dissatisfied customers.
  • The longer the online survey is open (to the 14 days tested), the more very satisfied customers will complete it. This may raise the proportion of very satisfied customers reported to clients, and so strengthen clients’ subsequent satisfaction-related actions.
  • However, if results are urgently required, online satisfaction surveys can be closed earlier, with slight reductions in overall net satisfaction scores likely.
  • It would be useful to undertake further analysis comparing very satisfied customers who complete the survey immediately with those who take two or more days to do so.
  • These single-survey findings have prompted us to look at more of our surveys to see whether this is a singular result or a harbinger, and to report those findings later. We suggest others review their satisfaction survey results too, so that a broader industry view can be developed.

Philip Derham, Director, Derham Marketing Research Pty. Ltd.
