OdinText Discovers Job Satisfaction Drivers in Anonymous Employee Data
Employee satisfaction surveys go by different names – “stakeholder satisfaction,” “360-degree surveys,” “employee engagement.” I blog a lot about the shortcomings of survey data and about respondent and data-quality problems caused by bad samples, long surveys, and poorly structured questions (an approach that assumes the researcher already knows the best questions to ask), but I haven’t talked much about human resources/employee surveys.
HR surveys have a different and bigger problem than consumer surveys. It’s not the sample; we know exactly who our employees are. We know their email addresses and phone numbers, and where they sit. Heck, we can even mandate 100% survey participation (and some companies do). In fact, I’ve spoken to Human Resources directors who actually survey their employees once per week. The reasoning goes something like, “our millennial employees are in high demand and we want to keep them happy.” But frequency isn’t the problem, per se; in fact, I’m a believer in frequent data collection.
The Problem with Employee Satisfaction Surveys
NO ONE taking an employee survey trusts that their data will be kept anonymous and confidential. This is the case even when third parties are hired to conduct the fieldwork and promise to report all data only in aggregate.
It really doesn’t matter whether this mistrust is unfounded, only that it exists. And as it happens, it isn’t entirely unfounded: I know a former senior HR manager at a Fortune 500 company who admitted to successfully pressuring vendors into providing de-anonymized, individual-level data.
Even if you as an employee believed your data would remain anonymous, you might nonetheless be wary of being completely forthcoming. For instance, if you were one of three or four direct reports asked to rate your satisfaction with either your company or your manager on a five-point Likert scale, it might feel foolhardy to answer with anything less than a top-3-box score. There would be a lot of interest in who the ‘disgruntled’ employee was, after all.
This is a data problem, and there are two solutions:
- Text Analysis of Employee Satisfaction Comment Data
Unstructured, free-form comment data can be a window into the soul! I might never risk giving my company or supervisor anything below a top-2-box satisfaction rating on a Likert scale, but there are other ways to unearth the truth. For example, consider these open-ended questions:
Q: “What do you think about the prospects for your company’s success in the next two years?”
Or maybe a specific question about a boss I didn’t like? Such as:
Q: “Tell us about your relationship with your boss. Does he/she provide you with adequate X?”
While the respondent would obviously still be very careful about how he/she answers – probably even more so – it would be nearly impossible not to divulge important clues about how he/she really feels.
Why? Because we really aren’t very good at lying. In spoken or written language, our word choices leave clues that reveal the emotions we’re trying to hide.
Even in the absence of any negative terms or emotions, just the appearance of significantly lower levels of basic positive sentiment within a department or within a specific manager’s group might signal a serious problem.
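To make that “lower positive sentiment as a signal” idea concrete, here is a minimal, lexicon-based sketch in Python. The word lists, department names, and comments are all hypothetical, and real text-analytics platforms (OdinText included) use far richer models than a keyword lexicon; the point is only the aggregation logic — score each comment, average by group, and compare groups.

```python
# Toy positive/negative word lists; purely illustrative, not a real lexicon.
POSITIVE = {"great", "love", "enjoy", "supportive", "growth", "happy"}
NEGATIVE = {"stress", "overworked", "ignored", "unfair", "quit"}

def sentiment_score(comment: str) -> float:
    """Net sentiment: (positive hits - negative hits) / total words."""
    words = comment.lower().split()
    if not words:
        return 0.0
    pos = sum(w.strip(".,!?") in POSITIVE for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE for w in words)
    return (pos - neg) / len(words)

def department_sentiment(comments_by_dept: dict[str, list[str]]) -> dict[str, float]:
    """Average net sentiment per department; low outliers flag a problem area."""
    return {
        dept: sum(sentiment_score(c) for c in comments) / len(comments)
        for dept, comments in comments_by_dept.items()
        if comments
    }

# Hypothetical example data.
comments = {
    "Engineering": [
        "I love the growth opportunities here",
        "Supportive team, great manager",
    ],
    "Warehouse": [
        "Overworked and ignored",
        "Stress every day, thinking about quitting",
    ],
}
scores = department_sentiment(comments)
```

Even this crude version surfaces the pattern described above: a department whose average score sits well below its peers stands out, without anyone ever having typed an explicitly “disgruntled” rating.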
- Anonymizing Employee Satisfaction Data
The other solution is to collect data that truly is more anonymous. This is a second unmet opportunity to improve employee satisfaction and engagement research. The trick is not just providing an option for anonymous feedback such as a suggestion box, but making it top-of-mind and encouraging more frequent anonymous feedback.
Obviously, many companies know their customers are discussing them and giving them ratings both anonymously and with their real profiles on various sites—Amazon.com, BazaarVoice.com, Airbnb, TripAdvisor and Yelp, to name just a few.
But what about employee data? Back during the dotcom boom, while working at Snowball.com/Ign.com, I recall everyone at the company and at other dotcoms regularly visiting F*ckedCompany.com (the asterisk is mine), where anonymous feedback about management, impending layoffs, etc., would be shared. This isn’t necessarily a good thing for anyone except investors wanting to short a particular company.
Today there are sites like GlassDoor.com where employees rate companies on both general work satisfaction and even the interview process. The drawback is that the site skews toward larger companies (think Home Depot and Lowe’s), though there are many ratings for middle-market and smaller companies, too.
I think in the future there will be even more opportunities to access public reviews of satisfaction at companies, but also hopefully more private ways to collect unbiased data on your company’s employee satisfaction.
What to Expect from Text Analysis of Good/Anonymous Employee Data?
While I’ll be writing more on this topic in the future, what prompted the idea for this blog post was one of our most recent clients, TruckersReport.com. As the name suggests, TruckersReport.com is a trucking industry professional community that includes an anonymous discussion board.
Recently, OdinText was deployed to analyze anonymous, unstructured comments as well as structured ratings data from the TruckersReport.com website. Some rather unexpected findings were covered in a joint press release. For example, OdinText discovered 11 features driving job satisfaction, and salary didn’t make the top three! You may request a more detailed report here.
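For readers curious what “features driving satisfaction” means mechanically, here is a toy sketch of one common approach: flag whether each comment mentions a candidate feature, then correlate those flags with the rating the same respondent gave. This is not OdinText’s actual methodology — the feature names, keyword lists, and data below are all hypothetical — but it illustrates the general idea of ranking drivers.

```python
# Toy driver analysis: does mentioning a feature correlate with the rating
# the same respondent gave? Features, keywords, and data are hypothetical.
FEATURES = {
    "salary": ["pay", "salary"],
    "home_time": ["home", "family"],
    "equipment": ["truck", "equipment"],
}

def mentions(comment: str, keywords: list[str]) -> int:
    """1 if the comment mentions any keyword for the feature, else 0."""
    text = comment.lower()
    return int(any(k in text for k in keywords))

def pearson(x: list[float], y: list[float]) -> float:
    """Plain Pearson correlation (stdlib-only, no numpy needed)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def rank_drivers(comments: list[str], ratings: list[float]) -> list[tuple[str, float]]:
    """Rank candidate features by |correlation| between mention and rating."""
    ranked = []
    for name, kws in FEATURES.items():
        flags = [mentions(c, kws) for c in comments]
        if len(set(flags)) > 1:  # correlation is undefined without variance
            ranked.append((name, pearson(flags, ratings)))
    return sorted(ranked, key=lambda t: abs(t[1]), reverse=True)

# Hypothetical respondents: one comment and one 1-5 rating each.
comments = [
    "Good pay but I am never home",
    "Love my truck and the equipment",
    "The pay is terrible here",
    "Home time with family is great",
]
ratings = [3, 5, 1, 5]
drivers = rank_drivers(comments, ratings)
```

In this toy data, mentions of pay happen to correlate negatively with ratings, so “salary” surfaces as a strong driver; a real analysis does the same kind of ranking across thousands of comments and many more candidate features.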
I look forward to your thoughts and questions related to either the above study or human resources analytics in general.
2 Responses
Thanks, Tom. I 100% agree on employee surveys. They are marketed to HR and internally as anonymized, but when you have a minimum cutoff of, say, six reports to get the information on your team, it’s all too easy for the supervisor to figure out who might have said what. Glassdoor and Indeed are both rich sources of current and former employee insights for HR leaders and their bosses. They should be integrated into a total social listening program. True, they are best for big companies where you can count on a steady stream of reviews. But for the Fortune 500 and companies with outposts around the country, sales offices, service depots, etc., much can be learned about views of the corporate parent and local conditions on the ground.
I would never go below 10 for units of reporting and ideally not less than 20. That helps with anonymity.