How to Create a Balanced Survey
It would be natural to assume that companies that invest in customer experience measurement (CEM) would put customer preferences at the top of the list, but this is not always the case. Companies do not consciously ignore customers in the survey process. Rather, it is more often a matter of doing what has come to be expected internally: populating a dashboard with metrics that provide a snapshot of performance at various levels of the organization. Overly structured surveys may do this efficiently while still falling short of adequately describing customers’ actual experiences. It doesn’t have to be this way. One approach to creating more customer-centric surveys is to make sure customers are able to tell their stories. By shifting the survey balance to include some unstructured feedback, everyone wins.
Unsatisfying Customer Satisfaction Surveys
It’s ironic, but many customers who take ‘satisfaction’ surveys find the experience less than satisfying. Surveys frustrate customers and the interviewers who have to administer them. The effects can be even more harmful with self-administered questionnaires completed online or through the mail, where there is nothing keeping a customer from prematurely ending an unsatisfying “exchange.”
Poorly Designed Surveys Have Real Consequences
Too often, customers are prevented from saying what’s on their minds and interviewers are stymied in their attempts to record valuable information. Completely close-ended customer experience surveys administered using inflexible software are all too common, and they contribute to:
- Declining response rates—Respondents fail to complete the survey. Others refuse to participate based on previous unpleasant experiences. The available respondent pool shrinks and survey costs increase.
- Poor quality data—Respondents rush to get through surveys filled with questions that are irrelevant to them, or are forced into selecting answers which do not represent their true or complete feelings.
- Missing or incomplete information—What company would not benefit from learning in a customer’s own words what went amiss in a service transaction, or the opposite—what went exactly right? Too many surveys simply do not provide this opportunity.
The bottom line: customers are becoming disengaged from the very feedback process designed to improve their experiences. Over time, this disengagement will damage perceptions of your brand, which you may find yourself reading about on social media or internet rating sites.
Creating the Right Kind of Survey
Today’s customers are not waiting to be asked what they think about customer experience surveys; they are telling us without reservation, and we need to give them the tools and technology to provide that feedback.
A key element is more flexible surveys that not only provide better data but also create a better survey experience. In other words, surveys that are more like everyday conversations. During conversations, people exchange information quickly and efficiently. They readily engage, react to each other’s statements and naturally probe for and provide further detail. Adding open-end questions to customer surveys helps create an environment in which interesting information surfaces and customers are able to tell their stories in their own words. It’s a matter of shifting the survey balance from 100 percent close-ended, ratings-based questions to providing targeted opportunities for unstructured feedback.
Qualitative research primarily entails an open-ended exchange between interviewer and customer. We are not advocating that all customer experience surveys go to this extreme, but there is certainly room to shift the balance and let customers give us the feedback they want to give rather than just the ratings organizations force on them.
This does not mean giving up performance metrics. A well-balanced experience survey will meet the needs of all stakeholders in the customer experience measurement process. How far a company moves along the continuum depends on a number of factors, including:
- Information goals: Is the survey’s focus on performance appraisal, diagnosis of systemic problems, rapid problem resolution or retention/relationship building?
- How the information will be used, and by whom
- The category/type of transaction
- The organization’s culture
What Should We Ask?
There is no magic formula for questions that elicit useful, unstructured feedback. It starts with deciding exactly what type of information you want, who will use it and for what purpose. General considerations are:
- Question selection/wording
- Placement in the survey
- Number of questions
- Probing and clarifying responses to best effect
Question Selection
Just as researchers agonize over the best wording for an attribute, they should also give careful thought to the wording of open-end questions. Start by matching the question to the specific information need, and then get creative. In general, the less specific or loosely defined the question is, the less specific the response will be.
Don’t be afraid to experiment with adjectives that might be considered too leading in a close-ended question. Words like unforgettable, terrific or disappointing may inspire respondents to give more focused and detailed responses. Don’t forget to communicate research concepts in customer-friendly ways and ask customers directly:
- What stood out?
- What matters the most to you?
- How do we keep your business?
Consider borrowing simple projective techniques from the qualitative arsenal, e.g., “If you were the President of the company, what would you do to improve this experience?”
Placement
Data continuity will be a consideration unless you are designing a new program. Where new questions are placed in the survey may influence responses to the questions that follow. Therefore, it is advisable to pretest the new questionnaire to understand these effects. An exception would be when new questions are placed at the end of the survey.
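One pragmatic way to pretest placement effects is a split-ballot design: half the sample sees the new question early, half sees it at the end, and ratings on the questions that follow are compared. The sketch below is a hypothetical illustration of that bookkeeping, not a prescribed analysis.

```python
import random
from statistics import mean

def assign_version(respondent_id: int) -> str:
    """Randomly (but reproducibly) split respondents into two questionnaire versions:
    'A' places the new open-end early in the survey, 'B' places it at the end."""
    return "A" if random.Random(respondent_id).random() < 0.5 else "B"

# Ratings on a question that follows the new open-end in version A
downstream_ratings = {"A": [], "B": []}

def record_rating(respondent_id: int, rating: int) -> None:
    downstream_ratings[assign_version(respondent_id)].append(rating)

def placement_gap() -> float:
    """A sizable gap between versions suggests the new question's placement
    is influencing answers to the questions that follow it."""
    return mean(downstream_ratings["A"]) - mean(downstream_ratings["B"])
```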
How Many Open-Ends Is Too Many?
There is not a one-size-fits-all answer, but in the same way an attribute list can become burdensome, it is possible to put too many open-end questions into a customer experience survey. A pretest will reveal the information each question produces, allowing you to judge incremental value and whether some questions are redundant.
If it turns out that there are several productive questions, consider splitting them across the sample; there should still be enough information to analyze. While it is important that all respondents provide an overall rating, it is not necessary for everyone to see the same set of open-end questions. The main point is to make sure each respondent gets the most relevant opportunities to provide feedback.
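One way to operationalize this is to assign each respondent a random subset of the open-end pool directly in the survey script. The sketch below is illustrative only; the question pool, per-respondent count and seeding scheme are assumptions, not a prescribed method.

```python
import random

# Hypothetical pool of productive open-end questions identified in the pretest
OPEN_END_POOL = [
    "What stood out?",
    "What matters the most to you?",
    "How do we keep your business?",
    "If you were the President of the company, what would you do to improve this experience?",
]

QUESTIONS_PER_RESPONDENT = 2  # everyone still answers the overall rating

def assign_open_ends(respondent_id: int, seed: int = 42) -> list[str]:
    """Randomly assign a subset of the open-end pool to one respondent."""
    rng = random.Random(seed + respondent_id)  # reproducible per respondent
    return rng.sample(OPEN_END_POOL, QUESTIONS_PER_RESPONDENT)

# Example: spread the pool across a sample of 1,000 respondents
assignments = {rid: assign_open_ends(rid) for rid in range(1000)}
```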
Be realistic about the survey subject, and especially the character of the transaction, when considering which and how many open-ends to include. Low-involvement transactions, especially those done repeatedly, become routine and unmemorable. A simple question at the end of the survey, such as “Please tell us about anything else that was memorably positive or negative,” may be all that is needed.
Getting the Most Out of Open-Ends
More companies are moving customer experience surveys online, and it is important that open-end questions be as effective in self-administered formats as in interviewer-administered ones. The success of open-ends administered by live interviewers depends on the quality of their probing and clarifying skills. The success of open-end questions in online surveys is likewise driven by effective probing. If the response to an online open-end question is left blank or is too brief, simply trigger a prompt such as, “Please can you tell us more?”
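In practice, that trigger can be a few lines of survey logic. The sketch below is a minimal illustration; the word-count threshold and prompt wording are assumptions to be tuned in a pretest.

```python
MIN_WORDS = 5  # assumed threshold for "too brief"; tune with pretest data
FOLLOW_UP_PROMPT = "Please can you tell us more?"

def needs_probe(response: str) -> bool:
    """True when an open-end response is blank or too brief to be useful."""
    return len(response.strip().split()) < MIN_WORDS

def next_prompt(response: str):
    """Return the follow-up probe to display, or None to continue the survey."""
    return FOLLOW_UP_PROMPT if needs_probe(response) else None

print(next_prompt("Fine."))  # -> "Please can you tell us more?"
print(next_prompt("The agent resolved my billing issue on the first call."))  # -> None
```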
Technology to the Rescue
Automated text analysis uses a combination of natural language processing and other computational linguistic techniques to:
- Categorize and summarize text
- Extract information into a suitable form for additional analysis
In other words, it turns unstructured text information into structured data that can be summarized and analyzed using familiar quantitative tools. Note that automated text analysis tools are capable of far more than comment categorization (comparable to human coding).
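As a rough illustration of that unstructured-to-structured conversion, the sketch below uses simple keyword rules; a real text analytics engine relies on far richer NLP, and the categories and keywords here are hypothetical.

```python
from collections import Counter

# Hypothetical category rules; production systems use NLP models, not keyword lists
CATEGORY_KEYWORDS = {
    "wait time": ["wait", "queue", "hold", "slow"],
    "staff":     ["agent", "representative", "staff", "rude", "friendly"],
    "billing":   ["bill", "charge", "refund", "invoice"],
}

def categorize(comment: str) -> list[str]:
    """Map one free-text comment to zero or more categories by keyword match."""
    text = comment.lower()
    return [cat for cat, words in CATEGORY_KEYWORDS.items()
            if any(w in text for w in words)]

comments = [
    "I was on hold for 40 minutes before anyone picked up.",
    "The agent was friendly but could not explain the extra charge on my bill.",
]

# Structured output: category counts that can feed the same dashboards as ratings
counts = Counter(cat for c in comments for cat in categorize(c))
print(counts)  # Counter({'wait time': 1, 'staff': 1, 'billing': 1})
```

Even a crude pass like this yields category counts that can sit alongside ratings on the same dashboard.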
Surveys Are a Reflection of Your Brand
Every interaction with your company—including a customer experience survey—is a reflection on your brand. One way to make sure the survey experience is positive is to shift the balance from completely structured to semi-structured. Open-end questions have the potential, when designed and executed well, to create a better survey experience for respondents and to generate data with significant diagnostic value. A more open-ended questionnaire design creates a survey experience that is more conversational and allows customers to tell their stories in their own words.