Many companies spend too much to get too little from their customer experience surveys – don’t be one of them.
The Bad News
Mark Twain once opined, “There are three kinds of lies: lies, damned lies, and statistics.” Today he would likely counsel, “There are three kinds of lies: lies, damned lies, and customer experience surveys.”
At their worst, customer experience surveys can mislead. One of my favorite examples is the WellCare Health Plan fiasco. Two months after J.D. Power and Associates commended WellCare Health Plans Inc. for outstanding customer service, federal officials suspended WellCare’s privilege to sign up new Medicare clients.
Citing repetitive customer care problems, regulators claimed that WellCare’s “performance was substandard in numerous areas” and was “one of the overall worst performers among all plans.”1
Even at their best, most customer experience surveys aren’t especially telling or capable of providing actionable results. They take the pulse of the customer and provide a score, but they’re all too often ineffectual in guiding the development of a better customer experience and moving the needle.
In short, many companies today are spending much and getting little in return for their investments in customer experience surveys.
The Good News
Ineffective customer experience surveys aren’t a fait accompli. Done right, they offer rich, meaningful insights about the customer experience, as well as a trustworthy barometer of corporate well-being.
Taken on with more than just good intentions, they facilitate the discernment and implementation of the “right” actions – those actions that lie at the intersection of a better customer experience and a more profitable company.
Seven Habits For Getting A Better ROI For Your Customer Experience Survey Efforts
Achieving a better ROI for your customer experience surveys is simple, but it’s not easy. It won’t cost more money, it’s not reliant on new technologies, and it’s not about the adoption of some novel business strategy.
It does, however, require the discipline to practice seven habits.
- Embrace management by facts, not anecdotes.
Many companies rely on the individual, personal experiences of employees or other anecdotal data (e.g., focus groups, comment cards, complaint records, etc.) as a surrogate for customer experience surveys.
How often have you heard a colleague say, “I know a customer who…,” “I know about this one recent complaint where…,” or “In this focus group we just completed…”?
Sometimes called the “person who” fallacy, this reliance on rich, vivid anecdotes can have a hypnotic effect on the organization and disproportionately influence decision making.
As a senior executive of a large trade association once lamented to me, “if one member expresses an opinion about his needs, it’s a trend; if two members share that view, it’s a mandate.”
Worse yet, falling victim to the “person who” fallacy can lull a company into a sense of complacency about its need for a more intentional, empirical approach to listening to the voice of the customer.
Would you manage your sales pipeline, marketing campaigns, product development efforts, or financial forecasts by anecdote? Of course not. And neither should customer experience outcomes be left to chance.
Companies that are “built to last” recognize that using intuitions, gut feelings, and anecdotal data about the customer experience is a risky methodology for sustaining predictable, enduring, and profitable relationships with customers.
Instead, they institutionalize some type of formal, robust voice of the customer survey process that yields reliable, valid, and ongoing insights about the customer experience.
- Set your sights on improving performance, not chasing scores.
Score chasing comes in a variety of forms. Sometimes, companies literally beg for better scores.
For example, following a recent car repair, I received a “pre-survey” from the local dealership (“Will you rate your experience as ‘Very Satisfied’?”) a few weeks before I got the “real survey” from the auto manufacturer.
And during a recent hotel stay, the front desk receptionist wore a button that read, “How about a 10?” (referring to the survey I would receive from the corporate office).
In both cases, the aim was to “coach” me to give a higher score (which, ironically, our research shows to have a negative impact on the score).
Other times, companies “correct” the scores. I once worked with a domestic auto manufacturer to end a practice whereby dealerships could request the removal of certain surveys from their overall score (i.e., those with lower ratings) because the customer was “unreasonable,” “crazy,” or otherwise “wrong.”
Companies getting the most from their customer experience surveys have transformed their culture from one that’s score-centric to one that’s performance-inspired. I like Andrew Carnegie’s vision of philanthropic effectiveness as an allegory here; the aim is to engineer “real and permanent good.”
Analogously, companies devoted to improving performance value a better customer experience – not a score. Their objective isn’t solely to hit a target number tied to an “ultimate question” (i.e., Net Promoter) or some arcane industry-specific benchmark.
Rather, the survey leads an effort to engineer a better customer experience – and the scores follow.
- See the glass as half-empty.
An abundance of decision-making research shows that people may be more influenced by a fear of “loss” than by a promise of “gain.”
Yet – whether out of a fear of presenting bad news or a desire to use the results in promotional campaigns – many companies ignore this practical consideration when it comes to their customer experience surveys.
Using biased scales and questions (e.g., “Are you Very Satisfied, Mostly Satisfied, Somewhat Satisfied, or Only A Bit Satisfied?”), selecting “special” samples of customers (e.g., excluding those known to have had problems), and putting a positive “spin” on the results (e.g., combining Very Satisfied respondents with Somewhat Satisfied respondents to report a higher satisfaction score, even when those who were Somewhat Satisfied are three times less likely to buy again) are but a few of the tactics used to view the world through rose-colored glasses.
The most powerful results quantify the risk associated with not taking action to improve the customer experience. Companies earning a better ROI for their customer experience surveys purposefully include survey questions to ferret out areas of customer dissatisfaction and potential causes of customer defection (e.g., providing participants with a list of 50-60 problems they may have experienced, presenting respondents with a list of competitors and asking which company is the best, etc.).
While such practices are counter to conventional wisdom, the resulting data go a long way toward compelling action.
- Set credible targets.
Some years ago, a well-known consumer transportation company sought our counsel during their customer satisfaction crisis. Bad press coupled with a stagnant customer satisfaction index (CSI) made for a rather anxious Board of Directors.
Hoping to send a message, the Board set a CSI target of 80. The goal seemed completely defensible until you considered the current CSI of 66 and an average annual change that had not exceeded about 5 points over the prior few years. Target setting isn’t about sending messages or emotional appeasement.
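A rough back-of-the-envelope check makes the problem plain. The sketch below is illustrative only: the 66, the 80, and the roughly 5-point historical change come from the anecdote above, while everything else is hypothetical framing.

```python
# Illustrative sanity check on the CSI target from the example above.
# The 66, the 80, and the ~5-point historical change come from the anecdote;
# the rest of the framing is hypothetical.

current_csi = 66
proposed_target = 80
typical_annual_change = 5   # average annual change had not exceeded ~5 points

required_gain = proposed_target - current_csi
years_at_historical_pace = required_gain / typical_annual_change

print(f"Required gain: {required_gain} points")
print(f"Years needed at the historical pace: {years_at_historical_pace:.1f}")

# A more defensible one-year target stays within the range of change the
# organization has actually demonstrated.
credible_one_year_target = current_csi + typical_annual_change
print(f"A more credible one-year target: ~{credible_one_year_target}")
```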
Those companies realizing a better ROI for their customer experience surveys predicate their targets on a goal of encouraging continuous, long-term, incremental improvement.
Thus, targets are rationally, carefully, and methodically calibrated on the basis of current and past performance (e.g., what’s the floor and ceiling, what’s the observed average change, etc.), statistical considerations (e.g., is the proposed change statistically significant?), and credibility (e.g., will the organization embrace the target?).
- Concentrate on what matters most.
Sometimes less is more. Better a small success than a colossal failure. Of course, when applied to the customer experience, these maxims work best when you focus on the elements of the experience that have a significant impact on loyalty.
All too often, customer experience surveys measure many things – except the right things. We once worked with a US professional services organization to explore the validity of their metrics.
Their two-page questionnaire featured a few key outcome measures (e.g., overall satisfaction and loyalty) and 20 satisfaction questions that were designed to help them predict satisfaction and loyalty (covering various aspects of the customer experience).
Together, these 20 measures explained only about 30% of the variation in customer loyalty. In other words, the company was measuring a lot of things it assumed were predictors of customer loyalty; very few actually were.
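A basic key-driver analysis is one way to run this kind of validity check. The sketch below is illustrative only and is not the methodology we used with that client: it simulates survey data (assuming NumPy and scikit-learn are available), regresses a loyalty rating on the attribute ratings, and reports how much of the variation they jointly explain, along with the few attributes that carry most of the weight.

```python
# Illustrative key-driver sketch: which survey attributes actually predict loyalty?
# The data are simulated; in practice the matrix would hold real survey responses.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_respondents, n_attributes = 500, 20

# Simulated 1-10 attribute ratings; in this fake data only a few attributes
# genuinely drive loyalty, mirroring the situation described above.
X = rng.integers(1, 11, size=(n_respondents, n_attributes)).astype(float)
true_weights = np.zeros(n_attributes)
true_weights[[2, 7, 11]] = [0.5, 0.3, 0.2]   # the few that matter
loyalty = X @ true_weights + rng.normal(0, 2.0, n_respondents)

model = LinearRegression().fit(X, loyalty)
r_squared = model.score(X, loyalty)
print(f"Share of loyalty variation explained: {r_squared:.0%}")

# Rank attributes by the absolute size of their standardized coefficients.
std_betas = model.coef_ * X.std(axis=0) / loyalty.std()
ranking = np.argsort(-np.abs(std_betas))
for idx in ranking[:5]:
    print(f"Attribute {idx + 1:2d}: standardized weight {std_betas[idx]:+.2f}")
```

With real responses in place of the simulated matrix, the handful of attributes with the largest standardized weights becomes the short list of things worth measuring and improving.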
All else being equal, you’ll get a better ROI for your customer experience surveys if they help you target your limited resources so that you can improve where it truly counts.
You don’t have to measure everything, but at least measure those few things that really make a difference.
- Tell a good story.
When people look at the results of a customer experience survey, they tend toward one of three reactions. Some see nothing (the data resemble a television test pattern or the linkage between the results and their day-to-day job is muddled).
Others – especially when the results are negative – express confusion or get defensive (since the data are complicated or inconsistent with their own experience, there must be something wrong with the data).
And still others get it but ask, “So what?” Storytelling breaks down these barriers to using the results and engages the organization in prescribing actions. The best stories help the company establish a shared narrative and business case for change.
These stories create an economic imperative to act – by quantifying what’s at risk from not taking action – and connect the dots between the survey results, the right actions, and the benefits of effectively executing those actions.
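One simple way to give a story that economic edge is to translate dissatisfaction into revenue at risk. The arithmetic below is a hypothetical sketch; every input is a placeholder for a company’s own survey and financial figures.

```python
# Illustrative sketch: turning survey results into "revenue at risk."
# All inputs are hypothetical placeholders for a company's own figures.

total_customers = 200_000
avg_annual_revenue_per_customer = 1_200   # dollars

# From the (hypothetical) survey:
pct_dissatisfied = 0.18                   # rated their experience poor/fair
retention_if_satisfied = 0.90             # repurchase rate among satisfied customers
retention_if_dissatisfied = 0.55          # repurchase rate among dissatisfied customers

at_risk_customers = total_customers * pct_dissatisfied
extra_churn = retention_if_satisfied - retention_if_dissatisfied
revenue_at_risk = at_risk_customers * extra_churn * avg_annual_revenue_per_customer

print(f"Customers at risk: {at_risk_customers:,.0f}")
print(f"Incremental annual revenue at risk: ${revenue_at_risk:,.0f}")
```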
A long-time financial services client of ours demonstrated the value of good storytelling to me. Their corporate culture for sharing survey results was all too familiar.
Directors from every department were shepherded into a conference room to endure an annual, two-hour perfunctory PowerPoint briefing which almost always ended in the edict, “things must change.”
They didn’t; at least, not until the company reinvented its process for sharing results. Although senior leadership continued to set the priorities for improvement, the broader organization was held accountable for driving change.
Empowered cross-functional teams of mid-level managers were engaged in a day-long, structured, facilitated dialogue to formulate action plans for improving customer loyalty.
Since launching this new protocol, this company has been consistently ranked first in its syndicated customer satisfaction survey.
- Strive to evolve customer experience data literacy in the organization.
Data literacy is the ability to collect, manage, evaluate, and apply data in a critical manner.
While there’s no doubt companies are (slowly) becoming more advanced in collecting and using big data to drive transactions (e.g., service, sales, marketing, etc.), more often than not they are less skilled at leveraging the “small,” strategic data that offer insights into the broader customer experience.
The gap is one part knowledge deficit (“How do I read and apply simple statistics?”), one part cultural (“We just need one number, like NPS”), and one part “la visual résistance” (“I can’t/don’t want to look at data unless it’s in a colorful cartoon format”).
Companies that have scaled the customer experience mountain and achieved great things have progressively improved the data literacy of the organization.
These market leaders have an appetite for detail, understand the core concepts of customer experience statistics, and have developed their people’s ability to translate survey results into customer experience policy.
1 Richard Mullins, “Two Months After J.D. Power Honor, Regulators Stepped In At WellCare,” Tampa Bay Tribune, March 20, 2009.