One-Third of Researchers Think Survey Ratings Are All They Need
You’d be hard-pressed to find anyone who doesn’t think customer feedback matters, but it seems an alarming number of researchers don’t believe they really need to hear what people have to say!
In fact, almost a third of market researchers we recently polled either don’t give consumers the opportunity to comment or flat out ignore their responses.
- 30% of researchers report they do not include an option for customer comments in longitudinal customer experience trackers because they “don’t want to deal with the coding/analysis.” Slightly more (34%) admit the same for ad hoc surveys.
- 42% of researchers also admit to launching surveys that include an option for customer comments with no intention of doing anything with the comments they receive.
Customer Comments Aren’t Necessary?
Part of the problem—as the first bullet indicates—is that coding/analysis of responses to open-ended questions has historically been a time-consuming and labor-intensive process. (Happily, this is no longer the case.)
But a more troubling issue, it seems, is a widespread lack of recognition for the value of unstructured customer feedback, especially compared to quantitative survey data.
- More than two in five (41%) researchers said actual voice-of-customer comments are of secondary importance to structured rating questions.
- Of those who do read/analyze customer comments, 20% said it’s sufficient to just read/code a small subset of the comments rather than each and every one.
In short, we can conclude that many researchers omit or ignore customer comments because they believe they can get the same or better insights from quantitative ratings data.
This assumption is absolutely WRONG.
Misconception: Ratings Are Enough
I’ve posted here before about the serious problems with relying exclusively on quantitative data for insights.
But before I discovered text analytics, I used to be in the same camp as the researchers highlighted in our survey.
My first mistake was that I assumed I would always be able to frame the right questions and conceive of all possible relevant answers.
I also believed, naively, that respondents actually give every question equal consideration, and that the decimal-point differences in mean ratings from (frequently onerous) attribute batteries are meaningful—especially if we can apply a t-test and declare a 0.23-point difference “significant” (even if only at a directional 80% confidence level).
Since then, I have found time and time again that nothing predicts actual customer behavior better than the comment data from a well-crafted open-end.
For a real-world example, I invite you to have a look at the work we did with Jiffy Lube.
There are real dollars attached to what our customers can tell us if we let them use their own words. If you’re not letting them speak, your opportunity cost is probably much higher than you realize.
Thank you for your readership,
I look forward to your COMMENTS!
[PS. Over 200 marketing research professionals completed the survey in just the first week in field (statistics above), and the survey is still fielding here. Ironically, what has impressed me most so far is the quality and thoughtfulness of the responses to the two open-ended questions. I will be doing initial analysis and reporting here on the blog over the next few days, so come back soon to see part II—and maybe even a part III—of the analysis of this very short but interesting survey of research professionals.]
10 Responses
Could not agree more. Where possible, in any work we have done in this area, we have used an anonymous identifier to cross-reference survey data with customer comments and other sales data. The findings are surprising and reveal a lot more about what is going on with this relationship. I rarely do any original research until we have exhausted data like you have flagged.
We just closed the survey mentioned above. If you would like to comment on this topic feel free to do so here in the comments section. We plan to share a deeper analysis of the results here on the blog over the next few days.
We always have an open end at the end of EVERY survey asking if there were issues with either the content of the questionnaire or the execution of the survey. We ALWAYS read these after a pre-launch to see if there are issues needing correcting before a full launch. But this is a qualitative read; we don’t code or tabulate the results. Any REALLY strong opinions about the client are passed on verbatim to our client, but these are anonymous surveys, so no direct feedback to the customer/respondent is possible.
And this is why people don’t take surveys: the accurate intuition that nobody is listening.
I have been conducting marketing and public affairs research for over 40 years. I use open-ends to give context, color and depth to closed-end responses. This is particularly useful on pharma studies when interviewing patients about the side-effects of certain drugs and general compliance. In addition, open-end responses are very useful when interviewing wealthy customers on private banking, interviewing users of specific software, and understanding the reasons the public doesn’t support certain public policy issues.
In your deeper dive into the data, it would be good to see if there are differences between researchers with resources (those in agencies who therefore have the analytical software) and those who DIY all phases of research. Thanks.
I always include opportunities for customers to add their comments or suggestions. It is the opportunity to capture something we might have not thought about to ask. I agree that text analytics tools definitely help in handling the qualitative aspect of our surveys. In some cases we also ask the customer for permission to call them to further expand on their feedback and for sharing their reply with the appropriate organizations for follow up. This approach also helps us to reinforce the value we place on our customers’ comments and helps us in keeping and improving our high response rates. Thanks for the opportunity to share.
Great post. While this is disturbing it is no surprise. There is a contingent of MR people who hang on to the old ways and refuse to see the writing on the wall. These customer comments represent the totality of the current “brand-customer experience” and measuring this stuff is critical.
Posted a follow-up post to this here today: http://odintext.com/blog/five-reasons-to-never-design-a-survey-without-a-comment-field/
I find this “trend” really hard to believe among professional researchers…do-it-yourself folks…maybe.