
Improve Your Data by Dropping These Five Poor Survey Questions

I often hear that surveys are not a good way to collect feedback from learners, that people are tired of answering surveys (which leads to low response rates), or that surveys never produce insights or actionable data. All of these complaints can create the perception that surveys are a waste of time.

How can we shift the perception of surveys?

I think one way is to eliminate poor survey questions. We often ask poor questions because that is the way we have always done it, and they may not seem that bad because we have gotten used to seeing them over and over again. Taking the time to critique your current survey, however, can help you spot and eliminate them. Here is my list of five poor survey questions to consider dropping or revising:

Number 5:

On a scale of 1 to 10, how did you enjoy the catering at today’s session?

You have probably seen questions that ask learners about the facility, the classroom, or the food. I am not sure any of these things has much impact on the learning that takes place. Yes, the environment does affect a person's ability to take in the information, but how much control do you, as a learning leader, have over the food and the classroom? The key to any survey question is knowing what you are going to do with the data. Do you currently share results with the catering company or with your facilities team? If we cannot act on the information we gather, we should ask other questions that we can act on instead.

Number 4:

The registration process for this course was efficient and effective. (strongly disagree to strongly agree)

This one is trying to collect data about how people registered for the training. But how much control does a learning experience designer have over the registration process? Often this question is really asking for feedback on the LMS or your organization's learning technology, which we already know is hard to change or improve. Another problem with this question is the word “and.” Asking a learner to rate two things in a single item can be confusing. What if the registration process was effective but not efficient? Also, use words that are easy for the learner to understand; learners may have different ideas of what counts as efficient and effective. Instead, you could ask, What problems did you have registering for the course, or How did you learn about this training opportunity?

Number 3:

The trainer was effective and knowledgeable. (strongly disagree to strongly agree)

We do need to understand what part the trainer played in the learning experience. But this question is vague and leaves us with data we cannot act on. What can we do if the trainer is rated as not effective and not knowledgeable? This item does not tell us which behaviors the trainer exhibited in the training. If you really want to know about the trainer, use a behavior checklist and observe the trainer in the classroom. Or design a series of questions that ask about specific behaviors, such as, The trainer encouraged participants to take part in class discussions, The trainer answered my questions, or The trainer gave me actionable feedback. If you don’t want to lengthen your current survey, you can gauge the effectiveness of the trainer by analyzing the comments from the question, How could this course be improved? And if you don’t have a way to quickly share the feedback with the trainer, then you should not collect the data in a survey.

Number 2:

The content of the course met the objectives. (strongly disagree to strongly agree)

We often ask this question to determine whether a course achieved its learning objectives. However, learners often do not know the objectives of the course. And even if the trainer explained the objectives at the start of the training, how would a learner know whether they met those objectives? One way to determine if a course met the objectives is to design a test and use the test scores to measure how well it did; we often use surveys precisely because we cannot use a test. Instead of asking about the objectives, ask the learner to note one thing they learned that they know they will use. You can then analyze the comments to see how well they match up with the objectives.

Number 1:

Overall, how satisfied are you with this learning experience? (not satisfied to very satisfied)

We often ask this question to determine if the course met the needs of the learner. However, the word “satisfied” leaves much of the meaning up to the learner. The learner could interpret satisfaction to mean the training was engaging, useful, relevant, or meaningful, or simply that they liked the people in the classroom. And what if the learning experience was compliance training? How satisfied were you with your last compliance training experience? We are usually happy just to be done with it.

So, what can you ask instead? We need to ask questions that determine the effectiveness of the training. Effectiveness can be gauged by rating agreement with items such as, The training will help me be more successful on the job, My job performance will improve as a result of this training, or This training was worth the time away from my job. Learners value a learning experience that is useful, helps improve their performance, and is a good use of their time.

 

About the Author

Scott Weersing
What is learning analytics, and why am I passionate about it? Way back when I was a newspaper photographer, I really wanted to know the who, what, when, where, and why of the story I was assigned to. I loved finding out more information so I could be in the right place at the right time to get the best photograph. The more information I had, along with personal experience, the better prepared I was to take an impactful photograph. My journey to learning analytics follows the same path of asking questions and finding the right tools.

When I started working in Learning and Development as an instructional designer, I was always curious about what the learners were going to do with the training on the job. Oftentimes, I would get a response from the SME that the new knowledge would simply change behavior on the job. I guess I am a little cynical about the magic of training: just wave the magic wand, attend the training, view the WBT, and your problems will be solved. I did not know the questions to ask to ensure that the training would be applied on the job, but my leaders noticed that I was curious and liked to ask questions. They asked me whether I would like to be a performance consultant. After they told me what a performance consultant does, I said it sounded great. Who wouldn’t want to solve business and performance problems with a series of interventions?

It was during my time as a performance consultant that I learned the right questions to ask to get to outcomes, and, in turn, I became fascinated with metrics. My favorite questions are still as follows: Can you tell me more about the problem? What have you already tried to solve the problem? What would it look like after this problem is solved? What metrics or data do you have that show there is a problem? I became data driven, finding the causes of problems and then tracking the solutions to see if we were moving the needle. The tools for finding the root cause of a problem are the same tools for seeing whether the training is being applied on the job: I use interviews, focus groups, observations, checklists, and surveys to find out what is causing a problem, and then I use the same tools to find out what is happening after training and, in turn, whether it is making an impact on business outcomes. I would say that learning analytics and photography are similar in that you need to plan with the end in mind to collect the right information in order to tell a story and make an impact.
