Here are the top 10 employee engagement survey questions that should be included on your next survey.
Has living in a COVID-19 world left you or your team with emotional exhaustion, a reduced sense of personal accomplishment, feelings of cynicism and detachment from the job, or a sense of ineffectiveness?
On this episode of the Engaging People Podcast, we’re joined by DecisionWise President, Matt Wride, and Senior Consultants, Charles Rogel and Spencer Taylor. As champions of caring leadership, they share 10 questions used on employee engagement surveys to measure employee burnout and workload.
On this episode of the Engaging People Podcast, we’re joined by VP of Consulting, Christian Nielson, and Senior Consultants, Charles Rogel and Spencer Taylor.
Recently, our consultants have been spending more time advising our clients on their employee listening strategy during a crisis. Many organizations haven’t surveyed during these circumstances and need extra guidance on the following:
Our team goes into depth on these questions and more in an effort to help you engage employees with two-way communication.
Listen to more of this insightful conversation on best practices for surveying during a crisis.
Consider the following recommendations when determining your employee survey philosophy:
The primary goal of a survey is to make the organization more SUCCESSFUL, not simply to gather data. It should help you understand how to ALIGN the employee experience with the goals of the organization.
While some surveys provide interesting data, they do little to support the organization’s STRATEGY. Your survey process will be successful if it allows the organization to gather and ACT upon information that makes the company more successful.
A campaign is long-term and has a PURPOSE. It’s not a one-time event. Engagement doesn’t start and end with a survey. The survey should be part of a larger PROCESS to address the whole employee experience.
Survey data provides excellent information that can be INTEGRATED with other performance data such as revenue, quality, and customer service.
Technology simply provides data upon which to act. Don’t let the technology drive the process.
Organizations can get caught up in having the latest CAPABILITIES out of a survey software system.
However, survey technology doesn’t change the employee experience. The tried-and-true organization development concepts of “measure, act, and re-measure” are what create the CHANGE — not the technology.
Administering a “do-it-yourself” survey using internal resources may be useful and practical at times, such as asking 10 people what they thought of the company party.
But surveying an organization of 50,000 (or even 50) may be best handled through the EXPERTISE and DEDICATION of an outside firm.
Only survey as often as you are prepared to act. Let this statement be your guide when it comes to FREQUENCY. If your organization doesn’t have the capacity or intent to act on a monthly survey, don’t survey monthly. Choose a different frequency.
Nothing says, “We don’t care” like receiving feedback and not taking ACTION.
Which employee engagement survey method is best for measuring employee feedback? The simple answer is, it depends. It’s rare these days to pick up an HR publication or attend a conference that isn’t at least partially dedicated to employee engagement. Many of these articles or events begin with alarming—although not always accurate—quotes like, “over three-fourths of your employees are actively disengaged, and unlikely to be making a positive contribution to the organization.” Scary.
While most of these statistics are hyperbole (i.e., do you really think an organization can function if seven out of eight employees are either actively sabotaging your company or darting for the exits?), there appears to be little challenge to the idea that an organization’s success is directly tied to the employee experience (EX). Engaged employees are far more likely to deliver results than disengaged employees. Also, few dispute the notion that keeping your finger on the pulse of the organization is critical to business success. The question is no longer one of if an organization should gather feedback but, rather, how that feedback should be gathered.
For the past two years, consulting firm DecisionWise has surveyed HR practitioners to understand just how organizations go about measuring engagement and the employee experience. Based on responses from more than 200 companies across the globe (representing over 1.2 million employees), two-thirds of organizations (67%) claim to formally measure employee engagement on a regular basis and have specific initiatives in place to address their findings. Interestingly, over the past several years, the question of “how often and how should we solicit employee feedback?” seems to have replaced the previous question of “should we solicit employee feedback?” Much of this results from a new wave of technology that allows organizations to gather real-time feedback, as well as to continue collecting more strategic feedback through their traditional annual employee engagement surveys.
There has been an increased push in 2018 and over the past several years to downplay the role of the traditional annual employee engagement survey, with some recommending instead that feedback be gathered more frequently—even as often as once a day or in “real time.” Further, in a push fueled primarily by survey software providers, rather than HR professionals, some organizations are enticed by technology that allows them to both solicit and provide feedback at any time of the day or night, in real time. On the other end of the spectrum, many organizations still prefer a more traditional approach to surveying their employees, opting instead for an annual or semi-annual employee engagement survey. However, while most organizations won’t be abandoning the annual employee survey anytime soon, most agree that an annual check-in with their employees is not enough. It’s simply too infrequent to understand the employee experience (EX).
So, which of these employee engagement survey methods is the most effective? More importantly, which of these employee survey methodologies provides the best information upon which to make critical employee and strategic decisions?
As with most employee-related questions, the answer is, “it all depends.” While one solution or a group of solutions won’t be right for every organization, it is important to understand the options available before settling on a particular solution or set of solutions to use in your organization.
Although numerous variations are currently available, employee engagement survey solutions generally vary by two main factors:
1. Scope – Scope refers to the magnitude and depth of the survey (number of employees surveyed, number and depth of survey questions or items, level of reporting detail and analysis, how the survey results will be used, etc.).
2. Frequency – Frequency is simply how often the employee engagement survey will be conducted (annually, quarterly, weekly, always-on, etc.).
These two main factors, scope and frequency, create four primary types of employee engagement survey options. Remember, numerous variations of these four employee engagement survey methods exist. However, they can generally be broken down into the following, ordered from most frequent to least frequent in administration:
Always-on surveys or continuous feedback technology provides “real-time feedback” that can be quickly deployed and reviewed. These employee surveys are typically used in two ways:
1. To gather ongoing employee feedback for the company and/or for employee performance feedback (including reward and recognition). Always-on surveys generally ignore organizational structure, meaning that due to anonymity, survey results don’t typically roll up under departments, functions, or specific managers. When used for company feedback, these platforms act as a modern-day variation on the old suggestion box. When used for employee performance purposes, real-time feedback for colleagues, bosses, subordinates, and others can be provided with just a few taps of the screen. Employee engagement surveys can address guided questions, which are to be evaluated on a Likert scale (“How likely would you be to recommend XYZ Company to friends, based on today’s experience?”), specific topics (“Have you had a performance conversation with your manager sometime in the past week?”), or even general open-ended comments (“What would you like us to know about your experience?”).
2. To collect unstructured feedback: the ability to comment on anything, at any time (a modern take on yesteryear’s suggestion box).
Similar to an always-on employee engagement survey, spot surveys are also generally quick to deploy, address sentiment rather than engagement, and do not require (or allow for) in-depth analysis. They are often designed to address specific events (a conference), areas (benefits), or issues (a recent downsizing). Employee spot surveys are rolled out to measure current hot topics pertinent to the organization, such as gathering input about a change in benefits or soliciting opinions from employees. For this reason, industrial psychologists group employee spot surveys under the “polling and opinion” category. Employee spot surveys are fairly simple to launch or to roll out, and many survey software platforms allow managers or other users to design and administer their own surveys. In this case, these platforms are purchased as applications that may reside on local desktops or are made available through an online application software-as-a-service (SaaS)-licensed model (the latter has overtaken the former as far as popularity). Other systems are geared more to administration by human resources personnel, allowing them to get a feel for the thoughts or opinions of employees.
Called “pulse surveys” because they “take the pulse” of an organization or group, these types of surveys are helpful tools in gauging progress, warning of potential dangers, understanding trends in the employee experience, and promoting action. Pulse surveys share many similarities with spot surveys, but with two key differentiators:
1. They occur at regular or planned intervals, or with planned groups, and generally involve large pockets of the organization’s population (if not all employees).
2. Employee engagement pulse surveys are often intentional follow-ups or supplements to other employee surveys. Pulse surveys sometimes ride the coattails of annual employee engagement anchor surveys or other employee pulse surveys, in that they serve as a great way to drill down for more specific information or follow up on areas that need to be addressed.
When most companies talk about their annual employee engagement survey, they are referring to an “anchor survey.” These employee surveys have been used for decades. Our research found that 89% of organizations use annual employee anchor surveys (in some form or another) and, of those, only 6% said they would be moving away from anchor surveys in the foreseeable future. Despite what some software providers might advertise, this type of employee engagement survey is likely to be a part of gathering feedback for most organizations for years to come. Why? Most organizations simply find that, when implemented correctly, they work.
Employee engagement anchor surveys carry many different names: employee engagement survey, employee survey, well-being survey, climate survey, employee satisfaction survey, employee experience survey, culture survey, and so on. While there are actually differences between each of these types of employee anchor surveys, that’s a discussion for another day. However, they all fall under the category industrial/organizational psychology refers to as “anchor surveys,” because they presumably form the base around which other employee surveys operate.
While various employee engagement survey providers may use different names and features, these four survey types listed above generally cover the range of variations and options. So, which employee engagement survey method is best? As stated before, it depends on your goals, scope, and frequency.
John D. Rockefeller is widely considered the richest man in modern history. Adjusted for inflation, his peak wealth would be worth over $340 billion today. For perspective, Jeff Bezos is currently the richest man alive with a mere $131 billion fortune.
Someone once asked Mr. Rockefeller, “How much is enough?” His response: “Just a little bit more.”
When asked, the richest man in modern times still wanted more. And so it is for all of us. Ask me if I should be paid more and the answer will always be yes. That doesn’t necessarily mean that I’m not paid fairly now, or that my current salary isn’t sufficient for a comfortable lifestyle. But if I’m asked, I’ll always take more. Why not?
We have a complex relationship with compensation. Money is a motivator. It can mean more comfort, security, influence, and/or entertainment. We will go to great lengths for more money. But it is important to remember that money does not create meaning and purpose in our work, and it certainly doesn’t guarantee our engagement. To borrow a cliché, money isn’t everything.
When I’m helping my clients design an employee engagement survey, I strongly encourage them not to include questions about compensation. Salary and wages are important inputs into the employee experience, but a compensation strategy should be informed by market information, attrition data, and other relevant metrics. It should not rely on popular opinion from an engagement survey.
Here are three reasons why you should not include compensation questions in your next employee engagement survey.
For all the reasons I mentioned above, questions about compensation (and benefits, for that matter) almost always trend low. Since we can predict how compensation questions will score, why ask them? They tell us something we already know. Why waste our employee population’s time? Yes – they would like more money.
Employee engagement is unlocked when employees find the right mix of meaning, autonomy, growth, impact, and connection (ENGAGEMENT MAGIC®) in their work. Employees also need a foundation of what we call “satisfaction items.” Satisfaction items refer to the transactional items employees receive from their employer (compensation, benefits, tools and resources, etc.). These items are important, and questions about them need to be included in the survey. However, satisfaction items do not engage employees. By asking unnecessary satisfaction questions on a survey, we emphasize the transactional side of an employee’s work experience, and we may unnecessarily stir up negative perceptions. These questions distract from the true goal of an employee engagement survey – engagement.
At DecisionWise, we advise our clients not to ask a specific survey question if their organization will not be willing to take action on the results. Sometimes an organization insists on including compensation questions, but when the results come back low, they are powerless to change the existing compensation strategy. If a question is asked and returns unfavorable results, the employee population expects the organization to respond. When positive changes don’t occur, employees are disappointed and frustrated. When the results come in low, are you prepared to take action?
Occasionally, there are legitimate reasons to include questions about compensation in an engagement survey. For example, if the organization has been using compensation questions in prior surveys, it can create doubt and unrest if the questions are conspicuously absent from the next survey. However, even in such cases, it is wise to gradually move away from these questions.
When contemplating the use of compensation questions in your next employee survey, consider the risk/reward tradeoffs you will be making. Asked within an engagement survey, these questions frequently cause more harm than good. If you are still wondering how employees are feeling about compensation, let me help – they would like “just a little bit more.”
In the VIDEO: 6 Best Practices in Creating Employee Engagement Surveys, I cover six ways to form questions in order to create a survey that will obtain the best and most accurate responses possible. By creating a powerful employee engagement survey, you will accurately measure your organization’s levels of engagement like never before.
The most accurate way to measure employee engagement is to use the average score from a subset of validated anchor questions. We then use the score from the anchor questions to place employees in one of four different groups from Fully Engaged to Fully Disengaged.
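As a rough illustration of that scoring approach, here is a minimal sketch in Python. The 1–5 scale, the cutoff thresholds, and the two middle group labels are illustrative assumptions for this example; they are not the actual scoring model described in the text, which only names the endpoints Fully Engaged and Fully Disengaged.

```python
def engagement_group(anchor_scores):
    """Place an employee into one of four groups based on the mean of
    their validated anchor-question responses (assumed 1-5 Likert scale).

    Cutoffs and middle-group labels are hypothetical, for illustration only.
    """
    avg = sum(anchor_scores) / len(anchor_scores)
    if avg >= 4.5:
        return "Fully Engaged"
    elif avg >= 3.5:
        return "Engaged"          # hypothetical middle group
    elif avg >= 2.5:
        return "Opportunity"      # hypothetical middle group
    else:
        return "Fully Disengaged"
```

In practice, the cutoffs would be derived from the validation work behind the anchor questions rather than chosen by hand.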
These items will be used to determine unique perceptions among the different engagement groups of employees. Also include items to measure the five major drivers of employee engagement, which are: Meaning, Autonomy, Growth, Impact, and Connection.
Good questions are actionable, meaning that it is obvious what actions to take based on the item. So avoid vague or general statements that can be interpreted in a variety of ways. Also, don’t use double-barreled questions which are items that contain two ideas in one question. Finally, use questions that are positively worded with a consistent rating scale.
Every organization is unique, so make sure to include items that evaluate recent changes, alignment with company values, or other pertinent issues. But don’t go too far. If you customize too much, you won’t be able to utilize industry benchmarks on similar questions.
We recommend asking: “What are the areas that need the most improvement in our organization?” and “What are the greatest strengths of our organization?” These two items will generate good qualitative feedback that provides context to the scores on the other survey questions. Don’t use too many open-ended questions, however; they make the survey run long without providing much additional feedback.
A survey with 50 questions takes about 8 minutes to complete. If your survey is too short, it doesn’t provide enough information to draw specific conclusions and becomes unactionable. If it is too long, employees will not finish it, and you could end up with too much information, which leads to data paralysis.
Want to learn more? Download our whitepaper, “10 Questions to Consider Before Your Next Employee Engagement Survey” to learn more about building the most effective survey possible.
Take a look at more videos to assist in building a better organization:
Thanks for reading or watching and best of luck in your efforts to create an engaged workplace.
5 Criteria You Must Meet Before Running Your Own Employee Engagement Survey.
Let me share an example that will help clarify what I mean. The other day my seven-year-old came to me and told me he was sick and couldn’t go to school. Now, I love my kid, but let’s just say that when it comes to school attendance, he hasn’t earned my unadulterated trust just yet. So I checked his throat and, wouldn’t you know it, it was bright red. Flaming red. So, (a) I felt bad for questioning him, and (b) I immediately went to get a thermometer. His temperature was higher than normal, but nothing I would normally worry about. But coupled with how bad his throat looked, it was enough to put us on the road to an urgent care facility and the doctor running some strep tests.
So, back to your engagement survey. There are three basic steps when analyzing results of any kind (we’ll focus this article on differences in employee engagement scores). First, identify a difference. Second, determine if it’s a real difference. And third, if it is real, determine how meaningful it is.
In the example, my son came to me with a difference (he felt different from when he was healthy). I could have taken these results at face value, but I’m sure you can come up with a number of reasons why that would be inadequate parenting. So my next step was to determine whether or not that was a real difference. A visual check and a temperature check indicated that perhaps there was a difference from his normal functioning. At this point, it seemed likely that he should stay home from school. (OK, he’s starting to earn a bit of that academic trust back.)
Likewise, most organizations stop at step one or two and make the mistake of drawing conclusions and taking actions without understanding whether their engagement results are real and meaningful.
Let’s look at each step in more detail.
A typical employee engagement survey consists of participants answering questions along some sort of scale. This is your basic quantitative survey. Reporting generally falls along the lines of taking the percentage of respondents who answered a certain way, and ranking those numbers against the other questions or comparing them to past years or to industry norms. “We see a 5% increase in item X as compared to last year and we have lost 3.2% in item Y.” Does this sound familiar at all? Likely. We seldom see a company that doesn’t make this comparison.
This method of interpreting results is intuitive; it makes sense to almost any audience. The problem here is in the fundamental nature of measuring humans. We’re not atomic particles. You can ask us a question one day and it might be a completely different answer from when you asked us last week. We’re fickle and capricious and easily swayed. And even when we try to be consistent, we’re not. So any survey of employee attitudes and perceptions is going to have some natural fluctuations one way or the other.
Flip a coin 100 times and I’ll bet you $100 that it doesn’t end up 50-50. It will more likely be off balance, something like 53-47 or 45-55. If you then said “looks like we lost 8 percentage points from our first flip,” technically you’re correct. But does that loss mean anything?
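The coin-flip point is easy to verify yourself. A quick simulation in plain Python (the flip counts and number of runs are arbitrary) shows how much an observed percentage wanders from run to run even though the underlying coin never changes:

```python
import random

random.seed(7)  # fixed seed so the illustration is repeatable

def heads_pct(n_flips=100):
    """Flip a fair coin n_flips times and return the percentage of heads."""
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    return heads / n_flips * 100

# Five independent "surveys" of the same unchanging coin.
results = [heads_pct() for _ in range(5)]
# The scores differ run to run purely from random variation,
# just as survey scores can shift with no real change in engagement.
```

Comparing any two of these runs and announcing a "gain" or "loss" would be exactly the mistake the article describes.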
So when we say we “lost 3.2% in item Y from last year” not only are we being overly precise on something that shouldn’t be measured with such granularity, we’re not even sure if that’s a real difference or if it’s just random chance. This method is not robust enough to draw any conclusions yet. We need to know if the difference is real and significant.
Fortunately there are statistical analyses that we can use to analyze whether those differences are likely due to chance or to some sort of meaningful difference. Data are compared, and a score is spat out. That score corresponds with a percentage likelihood that the difference is due to chance—or not. The typical cutoff used by statisticians and sociologists when dealing with people is 5%, which for our purposes here means there is only a 5% chance that the difference between two things is due to random variation.
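To make the idea concrete, here is a sketch of one such analysis: a two-proportion z-test comparing percent-favorable scores from two survey years. This is a common choice for this kind of comparison, not necessarily the test any particular provider uses, and the respondent counts below are made up for illustration:

```python
import math

def two_proportion_p_value(fav1, n1, fav2, n2):
    """Two-sided p-value (normal approximation) for the difference
    between two favorability rates: fav1/n1 vs. fav2/n2."""
    p1, p2 = fav1 / n1, fav2 / n2
    pooled = (fav1 + fav2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Convert |z| to a two-sided p-value via the normal CDF (math.erf).
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Hypothetical: 68% favorable last year vs. 65% this year, 400 respondents each.
p = two_proportion_p_value(272, 400, 260, 400)
# With p well above the 5% cutoff, the 3-point "loss" is plausibly just noise.
```

Under the conventional 5% cutoff described above, a p-value above 0.05 means the apparent change should not yet be treated as a real difference.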
This level of rigorous testing is critical for interpreting data meaningfully. We have to know if a real difference exists before we start investing in something that was never actually a problem to begin with. However, statistical testing is not without its concerns.
Statisticians are quite good at what they do. As such, they have developed tests that are incredibly sensitive in picking up differences between groups, even when the groups are quite small. In industry, we often are testing large groups of employees with relatively sensitive tests, which can lead to complications.
Let’s look at this in real life. A 2013 Business Insider poll asked which American states were the most arrogant. New York won, although followed closely by Texas. Let’s say I want to examine whether New Yorkers have a reason to be more arrogant than Texans, and I choose “average IQ” as my measure. I want to make sure I capture the effect, so I test 4,000 people from each state—a statistically representative sample of the overall population. Lo and behold, I find that New Yorkers do, in fact, have a reason to be arrogant, as they have significantly higher IQs than Texans (corroborated by a 2006 study). After all, as Babe Ruth said, “It ain’t braggin’ if you can back it up.”
A deeper dive into the data, however, reveals that New Yorkers averaged a 100.7 on the test, and Texans an even 100. That means in a 75-minute test, with hundreds of activities and tasks, the average Texan possibly defined one fewer word, or could only recite six numbers backwards instead of seven. In other words, despite the statistical test indicating there was a significant difference, that difference is meaningless. Texans redeemed.
Using significance can work if the environment is just right. But, there are enough potential obstacles that could necessitate something a bit more powerful.
This is where effect size calculations come in. Effect size measures the magnitude of the difference between two groups. This simple procedure can shed a vast amount of light onto the true nature of the differences and allow for more meaning to be drawn from the results. The interpretation of effect size numbers is actually very straightforward.
For the IQ example, the effect size was .02; i.e., it shouldn’t merit a second thought.
Effect size can always be measured, and is independent of significance, meaning every significant result has an effect size, but not every effect size is significant.
Sometimes, when looking at the difference between two groups, it is readily apparent that the difference is so small as to be negligible (as in the IQ example). However, often it is not. Statistics take into account both the average of a group and also the variation of scores around that average. This can reveal insights that might go undetected.
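A standard way to compute effect size for two groups of scores is Cohen's d: the difference between the group means, scaled by their pooled standard deviation. The sketch below uses made-up scores; the rule-of-thumb thresholds in the comment are Cohen's widely cited conventions, not values from this article:

```python
import math
import statistics

def cohens_d(group_a, group_b):
    """Cohen's d: difference in means divided by the pooled
    standard deviation of the two groups."""
    na, nb = len(group_a), len(group_b)
    mean_a, mean_b = statistics.fmean(group_a), statistics.fmean(group_b)
    var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
    pooled_sd = math.sqrt(((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2))
    return (mean_a - mean_b) / pooled_sd

# Rough interpretation (Cohen's conventions): ~0.2 small, ~0.5 medium, ~0.8 large.
# A d of 0.02, as in the IQ example, is negligible regardless of significance.
```

Because d is scaled by the spread of the scores, a modest mean difference over a tight distribution (as in the client example that follows) can yield a large d, which is exactly what the averages alone would hide.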
Recently, I completed an analysis for one of our employee engagement client partners. When comparing their survey items to our national norms, several items jumped off the page as needing immediate attention. But, one that was lurking in the wings was a question that asked if they “…[understood] how [their] work contributes to the overall goals of the organization.” This particular item typically receives fairly high scores across industries, and while this organization came in lower than the industry benchmark, the scores were still relatively high. Under normal circumstances, this would have been treated as a non-factor.
However, the effect size was unusually large. Further analysis revealed that the scores on this question are relatively tight around the average score (there wasn’t much variance between employee opinions). When this company scored lower than the benchmark or average, that was a more meaningful difference than a question with a greater differential, but for which answers range all over the scale. In other words, it wasn’t just a few employees telling the company they were having a “below average employee experience,” it was most employees. Dealing with this issue ended up being one of this company’s highest priorities to come out of the survey. It wouldn’t have even registered without checking for effect size.
The reality is that any question you ask is going to have a distribution of answers. To ignore the distribution in favor of reporting simple comparisons, such as those we typically see from most surveys, is not only inadequate, it’s potentially misleading. Bad data result in bad decisions. The effect size is a straightforward calculation that resolves many of the problems arising from sample size or from comparing differently shaped distributions. Further, it has the added bonus of simplifying the ordering of priorities.
The next time you analyze results, ask yourself if you’d rather:
Without answering each question in the three-step process, organizations can be distracted by focusing on the wrong issue or miss an issue altogether. Remember:
The end of my story is that when the doctor swabbed my son’s throat for a strep test, the redness came off on the swab. It turns out my wonderful progeny had decided he didn’t want to go to school. So at breakfast, he ate around all the red marshmallows from his cereal until he was done and then swallowed them last, hence the bright red throat. I was too impressed to be angry. And more germane to this article, had I not followed the three-step process (i.e. seeing if the apparent difference was meaningful), my conclusion would have been incorrect. Maybe we all need to quit faking it and get back to school.