I grew up a fan of the New York Jets. Not the best decision on my part – tying my loyalty to a team that (at least over the past 40 years) has not seemed to rise too far above mediocrity. Their ineptitude can clearly be illustrated by a string of their coaches – Richie Kotite, Bruce Coslet, Todd Bowles, Joe Walton – who never seemed to get their act together.
One coach was different: Herm Edwards – maybe not in terms of results, but certainly in spirit.
Herm’s attitude extended beyond the typical clichés of focusing on the fundamentals, controlling the tempo of the game and ensuring his guys left everything on the field: he did everything he could ahead of game time to increase the likelihood of a win. At one press conference, he described his approach with a quote often attributed to Antoine de Saint-Exupéry, the author of The Little Prince: “A goal without a plan is just a wish.”
When you think about it, that mindset can really apply to any task at hand. In order to maximize the likelihood of a successful outcome, you need to know your objectives, your plan and your intended outcome from the get-go.
Customer Experience Measurement – No Exception
Customer experience measurement is no exception to this mindset. There are a number of key moments in any project that dictate the success of an engagement: some come at the start – like setting the initial objectives; others come near the end – such as framing the ultimate storylines.
And in the middle there is a crucial transition point, when data collection is finished and attention shifts to analysis. As the baton is passed from collecting good data to analyzing and reporting results, resist the urge to rush; instead, put a few steps in place to ensure that the findings maximize insight and translate into action.
Here is a checklist of five items to consider to help with this post-data collection, pre-analysis transition point:
- Know What Customer Experience Research Analytics You Plan To Run Ahead Of Time
The day the data collection field closes should not be the first time you are thinking about how the analysis will be conducted. Every study is different – with unique questions and distinct customer experience strategy requirements – yet time pressure to meet (and beat) reporting deadlines is ever-present. So many moving parts, all managed under a “Go! Go! Go!” mindset, is a recipe for mistakes. One way to counter this is to review and pre-plan all the project-specific customer experience research analytics while the data collection is still ongoing. We operationalized this as our Analytical Protocol Form. Completing this form ensures all the key components are remembered – like the KPIs on which Key Driver Analysis will be conducted, if and how any legacy customer experience research data will be trended, or whether there are any special customer-specific “look and feel” requirements (such as PowerPoint templates or specific fonts). Accuracy and efficiency are critical, and something as simple as this Analytical Protocol Form helps ensure things are done right and done quickly.
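To make this concrete, here is a minimal sketch of what such a pre-analysis plan might look like if captured in code rather than on a form; the field names and values are hypothetical illustrations, not CCMC’s actual Analytical Protocol Form.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class AnalyticalProtocol:
    """Hypothetical pre-analysis plan, completed while fieldwork is still running."""
    study_name: str
    key_driver_kpis: List[str]      # KPIs on which Key Driver Analysis will be run
    trend_against: Optional[str]    # legacy wave to trend against, if any
    weighting_variables: List[str]  # characteristics used to check (and weight) the sample
    reporting_template: str         # client-specific "look and feel" requirements

protocol = AnalyticalProtocol(
    study_name="2024 Relationship Survey",
    key_driver_kpis=["overall_satisfaction", "likelihood_to_recommend"],
    trend_against="2022 wave",
    weighting_variables=["region", "customer_segment"],
    reporting_template="client_brand_deck.pptx",
)
print(protocol)
```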
- Ensure That The Customer Experience Research Respondents “Look” Like The Population To Which The Results Are Projecting
Customer experience research results are designed to be projectable to a broader population. However, if the profile of the respondents (that is, their distribution across key demographics or segmentation characteristics) is not aligned with that of the full population, the results may steer you off course. Say, for example, a medical product distribution company with 20% of its customers in the East region collects data in which 45% of the responses come from the East region. Clearly, the results would be unduly slanted towards the East region customer experience – such that any issues with the East region’s key personnel, distribution centers or competitive landscape would skew the across-the-board aggregate results. This can occur for a number of reasons – ranging from a deliberate oversampling of a particular type of customer to response rates differing in a systematic manner. The good news is that there is an easy remedy if it does occur: weighting the results. So the first post-data collection step should always be to generate a demographic profile of the respondents and compare it against the same profile of the population to determine whether a customer experience research weighting scheme needs to be applied to the results.
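As a minimal sketch of that weighting check (assuming survey responses in a pandas DataFrame and known population shares; all column names and figures are illustrative), a simple post-stratification weight is the population share divided by the sample share for each group:

```python
import pandas as pd

# Illustrative respondent file: one row per completed survey.
respondents = pd.DataFrame({
    "region": ["East"] * 45 + ["West"] * 30 + ["Central"] * 25,
    "overall_satisfaction": [8] * 45 + [7] * 30 + [9] * 25,
})

# Known population distribution (e.g., from the customer database).
population_share = pd.Series({"East": 0.20, "West": 0.45, "Central": 0.35})

# Sample distribution actually observed in the data.
sample_share = respondents["region"].value_counts(normalize=True)

# Weight = population share / sample share; East gets 0.20 / 0.45 ≈ 0.44,
# pulling its over-represented responses back toward their true footprint.
weights = population_share / sample_share
respondents["weight"] = respondents["region"].map(weights)

weighted_mean = (
    (respondents["overall_satisfaction"] * respondents["weight"]).sum()
    / respondents["weight"].sum()
)
print(weights.round(2))
print(f"Weighted mean satisfaction: {weighted_mean:.2f}")
```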
- Think Beyond The Standard Charts & Tables
Nothing makes eyes glaze over faster than yet another table or chart that looks just like the table or chart that preceded it. It does not have to be this way. There is now an abundance of data visualization options easily available to researchers – ranging from heat maps to pyramid charts and from ranking plots to the use of filled icons instead of bars – that can break up the monotony of legacy data presentation techniques. Thinking ahead about not just what data will be presented but how the information will be conveyed maximizes the likelihood that the customer experience research results will resonate and, most importantly, that better decisions will be made from the findings.
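As one small illustration (a sketch using matplotlib; the segments, attributes and scores are invented), the same satisfaction data that would normally fill a table can be rendered as a heat map so that hot and cold spots jump out at a glance:

```python
import matplotlib.pyplot as plt
import numpy as np

# Invented mean satisfaction scores (1-10) by customer segment and experience attribute.
segments = ["New customers", "Tenured customers", "At-risk customers"]
attributes = ["Ease of ordering", "Delivery speed", "Issue resolution", "Billing clarity"]
scores = np.array([
    [8.1, 7.4, 6.2, 7.9],
    [8.6, 8.0, 7.1, 8.3],
    [6.9, 6.1, 4.8, 6.5],
])

fig, ax = plt.subplots(figsize=(7, 3))
im = ax.imshow(scores, cmap="RdYlGn", vmin=1, vmax=10)  # green = strength, red = cold spot
ax.set_xticks(range(len(attributes)))
ax.set_xticklabels(attributes, rotation=20, ha="right")
ax.set_yticks(range(len(segments)))
ax.set_yticklabels(segments)

# Print the score in each cell so the chart can stand on its own in a report.
for i in range(len(segments)):
    for j in range(len(attributes)):
        ax.text(j, i, f"{scores[i, j]:.1f}", ha="center", va="center")

fig.colorbar(im, ax=ax, label="Mean satisfaction")
fig.tight_layout()
plt.show()
```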
- Voice of the Customer Aggregate Results Are Great, But Segmented Findings Generate Action
If you polled CCMC clients about what they hoped to achieve from their customer experience research, “actionability” of the insights would certainly be among the top cited outcomes. One way to make findings more actionable is to determine, from the outset, how the data can be a catalyst for brainstorming ideas, prioritizing among them and ultimately implementing changes to improve the customer experience. Often this comes down to segmentation. For example, CCMC recently conducted a customer experience measurement study of the likely actions Emergency Operators would take in response to specific emergency incidents. The Emergency Operators were made up of police officers and firefighters – think of them as “blue” and “red” respondents – each of whom had distinct actions to take that varied by role. The results could certainly be reported in aggregate – creating combined “purple” results – but, while technically accurate, these “purple” aggregated findings would not yield anything meaningful or of value. For this customer experience research, segmented results matter more than data presented in aggregate. Having a pre-set plan for how to look at these Emergency Operators’ outcomes in a more segmented view ensured insights were generated, not simply numbers.
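A minimal pandas sketch of why the segmented cut matters (the roles, question and figures are invented for illustration): the aggregate “purple” number is technically correct but hides the fact that the two groups plan to act very differently.

```python
import pandas as pd

# Invented example: share of respondents who said they would dispatch immediately.
responses = pd.DataFrame({
    "role": ["police"] * 100 + ["fire"] * 100,
    "would_dispatch_immediately": [1] * 85 + [0] * 15 + [1] * 20 + [0] * 80,
})

# The aggregate ("purple") view: technically accurate, practically useless.
print("Aggregate:", responses["would_dispatch_immediately"].mean())  # 0.525

# The segmented view: the two roles plan to act very differently.
print(responses.groupby("role")["would_dispatch_immediately"].mean())
# fire      0.20
# police    0.85
```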
- Try To Walk In The Shoes Of The Respondents
“Data. Data. Data.” Isn’t that often what is going through a researcher’s mind when he or she is generating reports? But for CCMC clients, it’s not just data – it’s a real-world situation with real-world consequences. So how do you overcome a situation in which researchers are unintentionally blind to the actions our clients have taken that shape the day-to-day experiences of the respondents? What we do – and this applies to both one-time and longitudinal studies – is have our clients prepare a “diary” detailing the events (both good and bad) that could have impacted the respondents’ customer experience. Was there a field force overhaul that changed half of the respondents’ key points of contact? Put it in the diary. Was there a change to a goodwill policy whereby more customers received more coupons? Put it in the diary. Were there IT challenges such that the website timed out during specific transactions? Put it in the diary. Having this sort of context for what is going on in the “real world” is critical for better tailoring the analyses that are conducted and for interpreting the subsequent results.
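One way such a diary might be operationalized (a sketch only; the events, dates and column names are hypothetical) is as a set of dated records laid alongside the fielding window, so each response can be flagged as landing before or after a given event:

```python
import pandas as pd

# Hypothetical client "diary" of real-world events during (or just before) fielding.
diary = pd.DataFrame({
    "event_date": pd.to_datetime(["2024-03-01", "2024-04-10", "2024-05-02"]),
    "event": [
        "Field force overhaul changed key contacts for about half of accounts",
        "Goodwill policy change: more customers received more coupons",
        "Website timed out during order-status transactions (IT incident)",
    ],
})

# Illustrative survey responses with their completion dates.
responses = pd.DataFrame({
    "respondent_id": [101, 102, 103, 104],
    "completed_on": pd.to_datetime(["2024-04-05", "2024-04-20", "2024-05-01", "2024-05-06"]),
})

# Flag, for each response, which diary events had already happened when it was completed,
# so analysts can split results into "before" and "after" views during interpretation.
for _, row in diary.iterrows():
    responses[f"after_{row['event_date'].date()}"] = responses["completed_on"] >= row["event_date"]

print(responses)
```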
Bottom line:
Embrace this mindset when conducting customer experience research – that the planning and preparation surrounding a project are as critical as the actual execution of the administration and analysis – and use (and expand on) this checklist to help achieve that outcome.