This is the third in a series of posts exploring the best practices that underlie successful event-based corporate market research programs. We offer these up to help you to assess the “health” of your own programs.
In our first post, we looked at the need to move event measurement programs from quantifying what happened at the event, to looking at the impact of the event on the business. Our second post explored why your first step should be to review the survey instruments that your company is already using.
In this post, we will give you five tips to help you design more user-friendly surveys. This matters because the more user-friendly the survey is, the higher the response rates tend to be. And because statistical confidence depends on sample size, the larger the sample you have (number of respondents) for a question, the more confidently you can project the results to the entire audience.
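To make the sample-size point concrete, here is a rough illustration using the standard margin-of-error formula for a proportion. The attendee counts and response rates below are hypothetical, chosen only to show how the margin shrinks as more people answer a question.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion,
    given n respondents; p=0.5 is the most conservative case."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical event: 2,000 attendees, 20% response rate -> 400 respondents
print(round(margin_of_error(400) * 100, 1))  # ~4.9 percentage points

# If drop-off leaves only 100 answers on a late question,
# the margin roughly doubles.
print(round(margin_of_error(100) * 100, 1))  # ~9.8 percentage points
```

This is why question order matters: the questions answered by your full sample can be projected with a much tighter margin than those answered only by the respondents who stuck around.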
In Enterprise IT and many similar industries, somewhere between 15% and 25% of attendees are willing to participate in event-related surveys. Your goal is to do everything you can to maintain their goodwill… By the way, expect some drop-off in participation during the survey – people get bored or frustrated, the phone rings, the plane takes off.
There is nothing much you can do about it, except to think about how you order your questions. And if you have the technology, make it easy for them to come back to finish the survey at a later date.
1/ Ask questions in order of importance. As a general rule, we recommend that you put the most important questions at the beginning of the survey. You have the largest sample at the beginning, and people will be fresh and interested in what you are asking. The first few questions set the tone of your survey – readers will quickly decide whether you are interested in what they think or are simply collecting data. (That is why we leverage registration data.)
2/ Challenge the value of each question. Less is more, and in surveys shorter is always better. The fundamental test that every question must pass to earn a place on the survey is, “what are you going to do with the data?”
Surveys seem to take on a life of their own and magically get longer every year. Most of the surveys that we see are filled with legacy questions that have been asked for years, and are kept because they provide “year over year” data. The funny thing is that no one remembers how they got there, and no one knows who uses the “year over year” data. These are the first ones we cut. It’s a good time for Spring Cleaning!
When a client wants to use an event survey to conduct market research or demonstrate event ROI, we remove almost all of the event logistics questions. This is in keeping with our philosophy of delivering actionable data – data that can be used to inform decisions that will help to improve the event in the future.
3/ Vary the question types. There are a lot of ways to ask questions, and mixing them helps keep the respondent engaged. But not all questions are created equal – some demand a lot of time and effort, and quickly become annoying.
The five-point balanced scale question is generally the foundation of the survey. These questions are easy to answer, and the data can be reported in multiple ways. True/false questions are also very easy to answer and don't tax the respondent.
Write-in questions (called literals) are invaluable because they provide much-needed insight into the quantitative data you are collecting. Structure the question to be open-ended – you want to know what's on your respondent's mind. Set up the text box in your survey tool so that people can write as much as they want – there is no point in cutting them off if they are on a roll. Two to three literal questions are about right for a medium to long survey; use one for a short survey.
There are two types of questions that really tax the respondent. All you have to do is look at them to know which ones they are. Yes, the incredibly long list and its even more offensive relative, the incredibly long list with rating columns.
We understand the need to know which of the 63 products each customer is interested in. We also understand the need to know which of the 63 products is of the greatest interest. And which of the 63 product sessions provided great information and which ones didn’t meet expectations.
But seriously, would you stick around the second or third time you saw that grid? Even with a very nice incentive a lot of people will bail when the mental effort becomes too great…
4/ Use “mandatories” sparingly. Many survey engines allow users to make specific responses mandatory – which means that the question has to be answered before the user can go on to the next question. Clearly this has its uses (e.g. shipping and billing information) but it’s frequently overused in survey design.
Remember that once you annoy a respondent, chances are that you are going to lose them… You always have to trade off the value of getting a specific question answered, against the risk of losing their participation in the rest of the survey.
5/ Use branching and collect data from multiple sources. The problem with the 63-product question set is that while it reflects your product offering and the event curriculum, it is not relevant to the individual respondent… No one is interested in 63 products, or goes to 63 sessions.
Many survey engines offer branching (sometimes called progressive or skip logic). In this instance, you might ask, "Which of the 63 products are you interested in?" The respondent makes their selection, and from that point on, the survey engine limits the questions to the products the respondent has selected – which means they only have to sort through the master list once. The ability to reconfigure (adapt) a survey to an individual respondent in real time is a significant advantage of a web-based instrument over paper.
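The branching idea can be sketched in a few lines of code. This is a minimal illustration, not any particular survey engine's API; the product names and question wording are hypothetical placeholders.

```python
# Minimal sketch of survey branching ("skip logic"): follow-up questions
# are generated only for the products a respondent actually selected,
# so nobody wades through the full master list twice.
# Product names and question text here are hypothetical examples.

PRODUCTS = ["Product A", "Product B", "Product C"]  # stands in for the full list

def follow_up_questions(selected):
    """Build the branched follow-up question list for one respondent."""
    questions = []
    for product in selected:
        if product in PRODUCTS:  # ignore anything not on the master list
            questions.append(f"How interested are you in {product}? (1-5)")
            questions.append(f"Did the {product} session meet your expectations? (Y/N)")
    return questions

# A respondent who picked two products sees four follow-ups, not 2 x 63.
qs = follow_up_questions(["Product A", "Product C"])
print(len(qs))  # 4
```

A real survey engine handles this through its skip-logic configuration rather than code, but the principle is the same: one selection question up front, then only relevant follow-ups.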
Here’s another idea… Want to know what they thought about the product sessions? Don’t ask them; incorporate the session attendance and evaluation scores into your report. In this example, the session evaluation data is more valuable because those questions are asked immediately after the session rather than days (and sometimes weeks!) later.
By prioritizing your questions, using various question types, using the tools in your survey engine and collecting data at multiple points before, during and after the event, you will increase your response rates and improve the accuracy of your projections.
Do you review your survey design from the respondent’s point of view? When is the last time you pruned the survey to refocus it on actionable data? Do you know who uses the answers to each question?