
Enabling corporate event sponsors and managers to demonstrate success

We came across an excellent whitepaper from the Aberdeen Group entitled Web Analytics: Marketing Beyond Online Customer Data. The whitepaper suggested a number of ideas that are equally applicable to measuring the impact of corporate events.

One comment that caught our attention was the statement that “Best-in-Class marketers are keen on getting granular with the specific impact of each marketing channel and campaign…”

Thinking about an event as a marketing channel that presents a collection of individual marketing campaigns is a powerful and useful idea, especially since the one thing events and web marketing have in common is an emphasis on (and reliance on) delivering valuable content.

Events remain hugely popular and highly effective because they offer a branded, immersive environment. In many ways they are everything that every other form of paid media strives to be… a venue for conducting multi-dimensional campaigns using a combination of messages, messengers and experiences. Again this is increasingly true of the kinds of experiences that the web at its best can deliver.

The abundance (perhaps plethora?) of options leads to the quandary that corporate event sponsors often find themselves in: the need to stimulate demand for their shiny new objects, versus the need to meet attendees’ needs for information that helps them maximize the return on their investment.

This division is not as much of a dichotomy as it might seem. In fact, it reflects our experience with technology conferences, tradeshows and events of all stripes and flavors… There is a simple value proposition at work: one cannot attract an audience, and thus build a sustainable marketing channel, without providing content that addresses the needs and interests of each attendee.

These motivations are no more self-serving than the corporate goal of hosting what a recent survey respondent termed “a three day marketing brochure.” Targeted, relevant content is the quid pro quo that defines the success of the channel and the campaign(s). When this agreement is not honored, the result is usually a drop in attendance, followed by a slow fade from the competitive landscape.

Balancing these two different types of initiatives – stimulating demand (i.e. pipeline) versus supporting customers (i.e. creating loyalty and evangelists) – requires that the sponsor have a clear vision of the role of the event as a marketing channel. That is why it is a worthwhile exercise to go back and revisit the original objectives, which first brought the event to life. It also suggests that current event objectives need to be considered in terms of the current audience – often a change in event objectives must be matched by a change in audience acquisition strategy.

With the channel defined – or redefined, since more often than not it is a moving target – one can then move on to campaign development. How many “things” is a particular conference going to be about? Each “thing” can be developed as a separate campaign built on attendee-relevant content and experiences. By prioritizing the many initiatives competing for attention, the event can support multiple campaigns simultaneously.

As in any medium, the key to assessing the success of this approach is an event measurement program that defines specific ways to measure each ‘event campaign’ and establishes the associated KPIs. Or as the old truism goes, don’t expect what you don’t inspect.

Since no audience is totally homogeneous, the measurements need to be assessed in terms of how they relate to each key audience segment. Both campaign ‘pushes’ and attendee content ‘pulls’ need to be evaluated in this fashion to provide a balanced, ‘attendee POV’ perspective on what the event actually accomplished and, just as importantly, where it fell short.

One of the areas that our Event Measurement Best Practices Survey explores is the role that setting objectives and defining metrics play in the overall content development process.

How does your team go about balancing the need to drive sales and the need to provide product content? What challenges are unique to your company?

Please consider subscribing to our blog. When you do, be sure to let us know what event measurement and market research topics you would like us to address in future posts.

Event survey results can be positive or negative

We had an interesting chat with one of our clients the other day.

A specific post-event survey score had dropped enough to attract executive notice. The very public presentation of the news upset the function owner, who was now looking for cover in the form of insight into what the drop meant… and what he would have to do to correct it.

After the call, we got an email from our client who wrote, “We should look at all of our survey questions to ensure the direction we want is always up as ‘better’.”

Which led us to ponder the question: is up always better? It would be an excellent question to put in our Event Measurement Best Practices Survey.

This may surprise you (blog posts are supposed to be surprising), but in our opinion the answer is either no or it depends – and it is definitely not an arbitrary yes.

The key to delivering actionable survey metrics is understanding the objective behind what is to be measured, then selecting a measurement technique that best captures progress towards that objective.

Many aspects of conference and event operations are focused on improving the attendee experience. Conference staffs work hard to improve attendee satisfaction in areas like registration, hotel reservations and ground transportation.

Creating survey questions that measure the success of these kinds of initiatives is an extension of the kinds of measurement practices used in the manufacturing and quality control departments of many organizations. Fewer failures, fewer orders returned, fewer repairs…

The result of reducing or minimizing certain factors is an increase in customer satisfaction… This is where the confusion about whether ‘up’ or ‘down’ is better sometimes comes from.

When you move from the service mindset of improving satisfaction by minimizing frustration to the marketing focus on achieving a competitive advantage, the objectives change. As do the survey questions and the way that the results are reported.

Success is achieved through some combination of offering a better product, service or solution, providing superior service before and after the sale and delivering greater perceived value.

Measurement is almost always used to determine if the event was effective at communicating the competitive advantage in a compelling manner. Research focuses on dimensions such as creating awareness, increasing or reinforcing preference, accelerating purchase intent and often enhancing brand loyalty or preference.

These are all objectives where an ‘up’ score is almost always better. Even when a sponsor is using the event to address a known problem, or to try to overcome a known negative perception, measuring the extent to which the event reduced the negative perception is generally reported as an ‘up’.

If you can think of a marketing objective where down is better, we’d like to hear from you – and we’d like you to consider writing a guest post on the subject.

Please consider subscribing to our blog. When you do, be sure to let us know what event measurement and market research topics you would like us to address in future posts.

AUDIENCE METRIX LAUNCHES FIRST GLOBAL EVENT MEASUREMENT BEST PRACTICES SURVEY

Taos, NM (May 31, 2012) – Audience Metrix, a market research firm specializing in corporate event research, today announced the first annual Event Measurement Best Practices Survey (EMBPS) for corporate event sponsors and managers.

The survey explores issues including how companies measure ROI, at what level of the organization event goals are set and how event data is used for planning. Each respondent will receive a personalized report, comparing their response to a global sample of their peers.

“This survey will be an important tool for corporate event sponsors and managers who want to evaluate and improve their meeting and event measurement practices,” said Audience Metrix Partner Christopher Korody. “We are actively encouraging people to share the survey with their peers so that we can get a global sample.”

“The survey explores many of the key issues in event research,” said Scott Schenker, Vice President of Global Events for SAP and an Audience Metrix client. “The findings will help to define best practices, and provide actionable information to anyone who wants to know the true success of an experience-marketing program. The more companies that take the time to respond, the greater the value for everyone.”

“By doing the survey annually, we are providing a way for companies everywhere to continuously improve their measurement practices,” adds Audience Metrix Partner Kevin O’Neill.

Qualtrics has agreed to co-sponsor the survey and is promoting it to their user community. The survey will run on the Qualtrics survey platform.

“The fact that Audience Metrix is sharing the results on an individualized basis makes this project unique,” said Daniel Russell, Qualtrics Account Director.

About Audience Metrix

Audience Metrix (http://audiencemetrix.com) is a marketing research firm that specializes in conducting primary research at corporate events, enabling event sponsors and managers to demonstrate the success of their programs. The two partners, Kevin O’Neill and Christopher Korody, have been conducting corporate event research since 2000.

About Qualtrics

Qualtrics is a worldwide leader in enterprise data collection and analysis, with software that is easy enough for an intern but sophisticated enough for a PhD. Global organizations and research firms of all types and sizes use Qualtrics software to make better decisions based on strategic research intelligence. For more information and a free trial, visit www.qualtrics.com.


It’s been an interesting and busy couple of weeks. We’ve been putting the finishing touches on a survey for one of our key accounts. In preparation, we did a series of interviews with account executives about why their customers attend the event.

It’s easy to understand that account guys want to get their customers and prospects to the event. Once there, they can dazzle them with shiny objects and overwhelm them with the full glory of the enterprise. It is as good a chance to get close as an account executive can hope for this side of the golf course…

The more interviews we conducted, the more we wondered, why do customers really come? Surely it’s not to avoid disappointing their account executive… Why do they give up two or three or four days if you count travel time? Corporate customers are busy people, and they are not likely to give up precious days unless they are going to get a return on time spent.

What’s in it for them?
Is there a quid pro quo?

What if customers use an event to accomplish their own goals, be it completing a procurement process, gathering information for a long-term plan, or networking to gain a deeper understanding of what it takes to do a successful implementation?

While we were in the middle of this fast-forming revelation, our friend Chris came calling. Chris is the quintessential entrepreneur who has built a very successful online boutique for unobtanium toys. He’s the guy who has figured out how to make a vocation out of our avocation. We had been working with him on his new website, and had been making good progress, but we felt that he had gone overboard on very picky design details. Ever helpful, we proposed that there might be other things that were more important to his customers than the gradient in the upper left corner, and gave him a list of for-instances.

The next morning, Chris presented us with the first pass of a survey he had put together using one of the many available free online survey tools. (You can read more about picking an online survey engine here.)

His first pass was pretty good for a rookie, but his focus was on the details that he was wrestling with in his site design – the ones that he thought mattered to making a sale… Sound familiar?

That’s when we thought about our new insight and wondered how his customers might want to use his site to accomplish their own goals. That would tell us a lot about what mattered in the design and how things needed to be organized. Chris got it right away, and we rewrote the survey together. He added a couple of nice incentives, then sent an invitation to participate out to his entire customer mailing list.

For the rest of the day, we were barraged with emails from him with messages like:

In an hour and a half, I have 127 fully completed surveys. As good as this is, there is gold in the text comments. Wow!

Within 24 hours, 700 customers from around the world had responded. It was about a 10% return, which is pretty impressive and a very solid sample. That is a good story with a happy ending. But as it turned out, the fun was just starting.

I’m embarrassed that I didn’t do this sooner. What an eye opener. The tools were there waiting for me. Ughhh. Moving forward.

So what is the moral of the story? And why should you consider doing something like this if you are a small business?

Just as I was pondering how I was going to explain this, Seth Godin’s daily post entitled Hard work on the right things showed up:

I don’t think winners beat the competition because they work harder. And it’s not even clear that they win because they have more creativity. The secret, I think, is in understanding what matters.

It’s not obvious, and it changes. It changes by culture, by buyer, by product and even by the day of the week. But those that manage to capture the imagination, make sales and grow are doing it by perfecting the things that matter and ignoring the rest.

It’s pretty clear that the survey gave Chris deeper insight into what matters to his customers. He scrapped the website design he had spent the last month on, because he learned that what he thought was important wasn’t. Hats off, because not everyone can do that. To be fair, it’s obviously easier to do when you are a one-man band.

In Chris’ typical cut-to-the-chase style, he wrote:

This survey had a profound effect on me. It got me WAY outside my bubble, and allowed a glimpse at how people perceive my biz. While mostly positive, there were clearly areas where I need to shore up some weaknesses. The good news is that those weaknesses can be addressed relatively easily. Nothing really too complex here. Lower pricing. More selection. More frequent communication. Oh, and update website 😉

What difference would these kinds of insights, communicated to you in your customers’ own words, make to your business? Have you ever done a survey or customer interviews? What did you learn? What do you need to know about how your customers see your business?


All of us at Audience Metrix are delighted to be launching the first global Event Measurement Best Practices Survey (EMBPS) today.

After 35+ years in the event business, we can tell you with confidence that one of the particular challenges of being in corporate events is that nobody knows what you do. It is easy to find out who has the best price on a projector, but it is next to impossible to find out much about the business thinking behind the events themselves.

Call them conferences, meetings, lollapaloozas or what have you, their ambitions range from modest to grand, their audiences from fifty people to small cities. Each one is different, purpose-built to meet a challenge or maximize an opportunity. Some are repeated annually as proud focal points of the company’s culture and proof of its prowess; others spring up to seize the moment and disappear just as quickly. Each requires armies of tremendously skilled and dedicated people, all of whom disappear the day after the audience departs, like gypsies looking for the next carnival.

This survey is both timely and important. Timely, because CMOs are under unprecedented pressure to justify every line item in their budgets, including customer events. Important, because for the first time professionals from around the world will be able to compare their event measurement practices with a global sample of their peers.

The Event Measurement Best Practices Survey explores a wide range of issues including how companies measure ROI, how they set event goals and at what level of the organization goals are set, how they execute global measurement programs and how they use the data for planning. It also looks at some of the new technology that is entering the space – like QR codes and Geo-tracking – to determine how people will be incorporating these new tools into their measurement programs.

The survey is specifically designed to give corporate customer event sponsors and managers the opportunity to compare their event measurement practices with a global sample of their peers.

In exchange for their participation, each respondent will be able to opt in to receive a personalized copy of the results, which will compare their responses with the group average on a question-by-question basis. We will also publish a Whitepaper, which will be available to the participants.

We are very pleased to be working with our survey provider, Qualtrics, on this project. To help us promote the survey, Qualtrics is featuring Audience Metrix as their Researchers Of The Month on Q-Munity, their blog, and will be co-sponsoring the Whitepaper. They make a great product and are terrific people to work with.

If you are involved in the event business – from either the agency or the client side – we would very much appreciate your participation. If you’re not, but know someone who is, please share this post with them.

Since 2008, corporations have become extremely conservative. Today, CMOs have to justify every expenditure. While events – especially large customer events – have traditionally been exempt from this type of scrutiny, that is no longer the case. We address some of this in our Whitepaper, 5 Things Your CMO Should Know About Your Event.

Recognizing the need for information on this topic, on May 17 Audience Metrix will launch the first annual Event Measurement Best Practices Survey (EMBPS). Our plan is to recruit 500 corporate employees who have responsibility for events at their company to share their event measurement practices. In exchange for participating, each respondent will receive a personalized copy of their survey results, comparing their answers to the average. This in and of itself will provide a very valuable benchmark, especially for companies that are in the process of examining their current approach to event measurement.

Later, the Executive Report will look at the answers from the perspective of “Is the average high enough to be the best practice?” We will also do a series of blog posts drilling down into specific areas.

Which brings us to today’s topic of best practices – what they are and why they matter – not just in market research and event measurement – but in many aspects of business.

Wikipedia tells us “A best practice is a method or technique that has consistently shown results superior to those achieved with other means…”

One of the biggest challenges of being in the event industry is that no one really knows what you do. Another is that no two companies – much less 200 – do things the same way. As a result, there is no standard – no batting .375 – no Nielsen or Arbitron rating, no CPM that allows companies to measure their own performance, compare their performance with others or predict the return they will achieve based on industry norms.

The first, and perhaps most important, question that the EMBPS will address is how organizations measure event ROI. The survey will explore what measurement techniques are used, who in the company is involved in goal setting and how the data is utilized. By design, taking the survey will offer a high-level overview of measurement options and techniques. Frankly, we will be astonished if any one company applies all of the techniques that we have included. We elected to take this broad approach in order to expose the respondents to a wide range of contemporary measurement techniques. This is part of our larger goal of making the survey a self-diagnostic tool.

We know that right now it’s “news” that companies are under pressure to justify their marketing spend. In contrast, individual managers have always been under pressure to justify what they do. This survey will provide managers with an objective, statistically valid foundation to support their recommendations, and, we hope, help them secure the funding necessary to implement them.

One last point. In 12 months, there will be a 2013 survey, and we’ll see what has changed. The EMBPS is not a static industry snapshot. It is the foundation for a continuous improvement process that will document how event measurement is evolving from the baseline we establish in 2012.

In terms of details… the survey has been developed using the Qualtrics survey engine. We have used a few of their more interesting question models to enhance the respondent experience. Qualtrics will also be promoting the survey to their clients on their blog, and co-sponsoring our findings.

To provide a meaningful sample, we will leave the survey open until we have at least 500 responses. Then the Qualtrics engine will run the individual reports and mail them to the respondents… And we’ll go to work on the report.

There is no doubt that this research is needed, and we are confident that our reports are going to have a significant impact on corporate events. So much so, that we are already planning to do a companion piece this Fall looking at the measurement practices used for Virtual Events including webinars.

We are definitely looking for ways to promote the survey to people who work in or on corporate events. We’d appreciate your support in reaching out to people who you think would be interested in participating. The survey will open May 17. Look for it in the blog post, on Qualtrics.com/blog and on PR Newswire.

Thanks! As always, we look forward to hearing from you.


How To Segment Your Event Audience

This is the last post in our Corporate Event Measurement Best Practices series.

In our first post, we looked at the need to move event measurement programs from quantifying what happened at the event, to analyzing the impact of the event on the business. Our second post explored why you should review the event market research that your company already has in place. In our third post, we offered five tips for how to maximize your sample by ensuring that your surveys are user friendly. In our fourth post, we looked at seven criteria for choosing the right event survey engine.

In this post, we are going to discuss how you can make your event data more valuable by segmenting your audience. Let’s start with the concept of segmentation itself, and why you should be concerned about it.

Our belief is that a successful, sustainable event depends on meeting both corporate objectives and attendee needs. The flip side is that different groups of attendees want different things from the event. Sometimes these are nuances, sometimes they are quite distinct. Just remember that these differences are always important to the individuals who make up your audience.

Once you start looking at the differences rather than the similarities, you will quickly understand the benefits of segmenting the responses.

Here are five segmentations that we commonly use; there are, of course, many other possibilities:

· Alumni versus First Time Attendee

· Small and Medium Business versus Large Enterprise

· Customers versus Partners

· Long time experience with the product or service versus little or no experience

· North America versus other regions

Each attendee in our hypothetical audience fits into all five segments. Depending on the issue being explored, only one or two segmentations may show a significant difference. In fact, many of the audiences that we have studied over the years tend to be quite homogeneous – teasing out the important differences takes a lot of thought and constant refinement.
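To make those differences visible in the data, the first cut is usually a simple cross-tab of a key score by segment. Here is a minimal sketch in Python using the pandas library; the scores, segment labels and column names are invented for illustration and are not drawn from any actual event data set:

import pandas as pd

# Hypothetical post-event survey responses with segment columns attached.
responses = pd.DataFrame({
    "overall_satisfaction": [5, 4, 3, 5, 2, 4, 5, 3],   # 5-point scale
    "attendee_type":        ["Alumni", "First Time", "Alumni", "First Time",
                             "Alumni", "First Time", "Alumni", "First Time"],
    "company_size":         ["SMB", "Enterprise", "SMB", "Enterprise",
                             "SMB", "Enterprise", "SMB", "Enterprise"],
})

# The same question, read two different ways: average satisfaction by segment.
print(responses.groupby("attendee_type")["overall_satisfaction"].mean())
print(responses.groupby("company_size")["overall_satisfaction"].mean())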

Think about the ways that it would be useful to segment your audience.

Identifying the segments that are relevant to your company is usually the easy part. The trick is deciding where you will ask the questions that will enable you to put each attendee into the appropriate segments.

There are two ways to do this. Each approach has its advantages.

1/ Registration Data

As we’ve previously discussed, registration data is an extremely valuable tool for anyone who studies audiences. We sometimes call it “census” data because it is the one data set that, at least in theory, contains all of the information about each attendee.

There are two things you have to do to leverage “reg” data. First, you have to make sure that a question exists to identify every segment that you want to track. As suggested in our second post, start by looking at the existing registration questions. We spend a lot of time tuning registration questions to get the information we need.

Second, and this can be very challenging depending on your skills and suppliers, you must be able to tie the registration data file to your survey responses. This requires the use of a unique identifier that we call a linchpin – typically a badge number works well.
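As a rough illustration of that linkage, here is how the join might look in Python with pandas. The column names, badge numbers and scores below are stand-ins for a real registration file and survey export:

import pandas as pd

# Stand-in for the registration ("census") file: one row per registrant.
registration = pd.DataFrame({
    "badge_id":      [1001, 1002, 1003, 1004],
    "attendee_type": ["Alumni", "First Time", "Alumni", "First Time"],
    "region":        ["NA", "EMEA", "NA", "APJ"],
})

# Stand-in for the survey export: only some registrants respond.
survey = pd.DataFrame({
    "badge_id":             [1002, 1003, 1004],
    "overall_satisfaction": [4, 5, 3],
})

# The badge number is the linchpin that ties each response back to its
# registration record, so segment questions never have to be re-asked.
linked = survey.merge(registration, on="badge_id", how="left")
print(linked)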

 2/ The Post Event Survey

We try to avoid asking census type questions in our Post Event Surveys for two reasons. The first reason should be obvious – by leveraging data we can keep the survey as short as possible.

The second is a bit more subtle. Data gathered this way only represents a portion of the audience (the respondents) as opposed to all attendees (the registrants). If your survey response rates are on the anemic side, this can provide a misleading picture.

Still, there are times when there are no other options – for instance, when registration for the upcoming event is already open. If the results are useful, then put the question into the next registration set.

Are you currently segmenting your audiences? If you are, how has the information been used to improve the event? If not, what are the challenges you need to overcome?

Thanks for reading. Please be sure to subscribe to the blog. Let us know what topics you would like to read about next.


Our dear friend, one-time boss and long-time client Scott Schenker, VP Global Events at SAP, is debuting his new blog today: JanusDialogs.com. It’s all about what Scott has dubbed Janus Moments – harbingers of “the world as it will be” (think tipping points on steroids). His editorial goal is to herald the arrival of the new normal in all of its myriad guises. It promises to be fascinating reading, and we are pleased to recommend it to you.

As a master event marketer, Scott’s particular focus is on prognosticating the impact of Janus Moments on the event industry; those of us whose success depends on exceeding the expectations of our very live audiences.

Since the advent of the Internet, lots of Janus Moments have impacted the world of corporate marketing. This particular one (which is curiously nameless) marks the profound shift in the interaction between audiences and companies: one in which customers have been gaining influence through the forums, posts and tweets of the online diaspora, while marketers continue to lose control of the brand and product discussion.

As a result it has become increasingly important for us to provide our clients with ways to forecast the impact of the live event on the virtual buzz that influences the market place long after the curtain comes down.

When we were first engaged, Scott wondered what clues we might be overlooking in the data that we were already gathering pre and post event that might help to forecast the likely online commentary.

One line of inquiry led us to consider the almost universal application of “Top Box Scoring” and “Averaging” to report events. “Top Box” scoring is of course specifically designed to focus the executive reader on the good news… Something that is completely understandable when people whose reviews and KPIs are predicated on audience satisfaction, are doing the reporting.

The “Average” score is more even-handed, but by definition it homogenizes the results. Neither approach tells us much about those on the fringes…

Which leads to our hypothesis: that attendees who are unhappy, dissatisfied or otherwise chagrined by some product, person or policy… are the very same ones who are most likely to be highly vocal, and potentially viral, online. At its heart, this is nothing more than a modern-day interpretation of “the unhappy customer tells one hundred people,” played out around the world at the speed of data.

Our own “aha” moment came with the realization that we could manipulate the widely used 5-point Likert scale to bring the vocal “Extremely Displeased” to the same level of prominence as the “Extremely Pleased.”

We dubbed this new calculation the AIR Score – short for Audience Impact Rating. This alternative interpretation is based on our observations of hundreds of surveys containing thousands of these questions…
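To ground the terminology, here is a minimal Python sketch of how top-box, average and bottom-box figures fall out of the same 5-point question. The responses are invented, and the bottom-box tally is only meant to illustrate the idea of giving the extremes equal prominence – it is not the actual AIR Score formula, whose math is not published in this post:

# Invented 5-point Likert responses (1 = Extremely Displeased ... 5 = Extremely Pleased).
responses = [5, 4, 4, 5, 3, 1, 5, 2, 4, 1]

n = len(responses)
top_box    = sum(r == 5 for r in responses) / n   # share of "Extremely Pleased"
bottom_box = sum(r == 1 for r in responses) / n   # share of "Extremely Displeased"
average    = sum(responses) / n

print(f"Top box:    {top_box:.0%}")     # the good news executives usually see
print(f"Average:    {average:.2f}")     # even-handed, but homogenized
print(f"Bottom box: {bottom_box:.0%}")  # the vocal fringe that drives online buzz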

Which brings us back to www.janusdialogs.com.

The AIR Score is now a standard part of SAP event analysis, where it plays a vital role in providing actionable data that enables managers to proactively deal with the consequences of this Janus Moment.

Scott has graciously invited us to be part of the debut of the Janus dialog, and is posting our whitepaper “Measuring The Modern Event,” in which we present the AIR Score, on his site at http://bit.ly/AMxJanus.

The paper is also available to you as a PDF at http://bit.ly/AMxBP424.

We believe that the AIR Score will be an important tool for anyone involved at the intersection of customer events and online communities – a location where we are certain that more Janus Moments will take place.

Reflective of another Janus Moment, the AIR Score calculation is Open – meaning that the underlying math is available to you to apply to your own research… all you have to do is ask. We look forward to hearing about your experiences applying the AIR Score to your own work.

As always, we encourage you to subscribe to the blog. Please let us know what topics you would like us to cover.

This is the fourth post in our Event Measurement Best Practices series.

In our first post, we looked at the need to move event measurement programs from quantifying what happened at the event, to analyzing the impact of the event on the business. Our second post explored why you should review the event market research that your company already has in place. In our third post, we offered up five tips for how to maximize your sample by ensuring that your surveys are user friendly.

In this post we are going to give you some ideas about choosing event survey software. It’s been top of mind for us, because after ten plus years with our old survey application, we are migrating to new software. Our business is growing and we need tools that leverage our time… along with the investment, support and security a bigger company provides.

To pick the right solution for your needs, you must make a fundamental assessment. Do you need a solution for a specific event, or do you need software that can satisfy a wide range of needs across your organization? Do you need something with bells and whistles, or do you just need something to get out the odd survey?

Here are seven of the criteria that we considered, which might impact your choice.

 

1/ Branching Logic

An online survey has one huge advantage over its paper equivalent: branching. Using branching logic (also called skip or progressive logic), a survey can be programmed to adapt to the respondent as they complete the survey.

Keeping questions relevant to the respondent is the key to maintaining engagement and getting the largest sample.

Here’s a common situation. You just introduced a new silver-bullet gizmo, and you need to determine just how successful the launch event was. You need to ask the Partners and the Customers a lot of the same questions – using branching, you can ask the Partners how many they think they are going to sell, and ask the Customers how many they think they are going to buy…

In our decision making process, we put a very high premium on picking a survey application that offered a full complement of logic options. For us this is the “gotta-have” feature that separates the serious tools from the lightweights. Can you think of ways you might apply branching to your surveys?
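Here is a minimal sketch of the underlying idea in Python, assuming a hypothetical command-line survey. Real survey engines configure branching visually rather than in code, but the principle is the same: the answer to one question determines which question is asked next.

QUESTIONS = {
    "role": {
        "text": "Are you a Partner or a Customer?",
        "choices": ["Partner", "Customer"],
        # Map each answer to the next question ID.
        "branch": {"Partner": "sell_forecast", "Customer": "buy_intent"},
    },
    "sell_forecast": {
        "text": "How many units do you expect to sell this year?",
        "choices": None,          # free-form answer
        "branch": {},             # end of this branch
    },
    "buy_intent": {
        "text": "How many units do you expect to buy this year?",
        "choices": None,
        "branch": {},
    },
}

def run_survey(start="role"):
    """Walk the question graph, following the branch chosen by each answer."""
    answers = {}
    current = start
    while current:
        q = QUESTIONS[current]
        prompt = q["text"]
        if q["choices"]:
            prompt += " [" + "/".join(q["choices"]) + "]"
        answer = input(prompt + " ")
        answers[current] = answer
        current = q["branch"].get(answer)  # no mapping ends the survey
    return answers

if __name__ == "__main__":
    print(run_survey())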

 

2/ Question Types

Can you present questions the way you want to, or are you limited to a few pre-formatted designs?

It used to be that if you had a couple of multiple-choice questions, a true/false and a write-in block, you were pretty well set. But people today are used to a more visual, more interactive communication style.

The ability to compare images, rate videos and interact with a specific image may be important to your company. Departments like Training, Customer Support and other groups in marketing may also be able to take advantage of these capabilities.

 

3/ Branding

There are two distinct schools of thought about branding surveys. One holds that a survey should be visually straightforward, to communicate that it is an objective, third-party instrument.

But there is also research that suggests that in some instances the response rate will improve if the survey is branded. Bottom line, you probably need a certain amount of flexibility to meet the needs of various projects.

Take a look at the graphic tool set. Does it require knowledge of HTML and CSS to get the most out of it? Can you easily incorporate logos, pick backgrounds and implement other details?

 

4/ Mobility

If you are looking to the future, you need a solution that is mobile device friendly… That automatically scales so your subject can respond on a smart phone, a tablet, a laptop or even on that old dinosaur, a desktop. After all, you want to make it as easy as possible for people to take your surveys when and where they want to.

 

5/ Reporting Dashboard

Our old software was great at spitting out numbers and tabular reports, but that was about it. The first thing we had to do was create charts, usually in Excel – an extra step that was slow and time-consuming.

Today, many systems offer real time reporting using pre-formatted dashboards of various kinds. It sounds impressive but let’s separate these two concepts.

We are not at all convinced that real time reporting is a good thing. At the end of the day, statistics is about sample size, and the worst-case scenario is for people to start over-reacting based on what the TV networks call “the early returns”.

But a pre-formatted dashboard, now that is a thing of beauty and genuine value. Once again, how much flexibility and control you will need is a question of resources and application. There are a number of things worth considering.

The first is access. For instance you might want your vendors to be able to see some of the data, but not other parts. How much control do you have over access?

How easy is it to create the reports you want? Can you define how the data will be reported? Can you easily present the data in the look and feel that is expected?

Finally, beyond displaying charts and graphs, how powerful is the software? Does it support going beyond counting responses to doing analysis, such as segmentation?

Which leads us to another key criterion: integration.

 

6/ Integration

In the case of survey software, integration is about two things: how easily the software can add information from outside sources to individual records, and then, after the survey is complete, how easily it can share the information collected with other software.

One of the things that we routinely do is integrate survey data with registration data. That requires a unique identifier (a linchpin) to connect the individual registration file and the survey file. If you need or want to be able to combine data sets, make sure your solution has this capability.

The second is sharing the data that is collected. This is all about the wonderful world of APIs – Application Program Interfaces – which enable one program to talk to another. The obvious one is exporting a .csv file to Excel. Essential, but boring.

But there is much more that can be done. For instance, our new application has a salesforce.com API – it can port responses directly to a client’s CRM system, where the data becomes part of the customer’s record. That’s pretty slick – to say nothing of fast and accurate. It’s the kind of thing that is of tremendous value to the right customer. How can you improve the value of your data to your internal customers?
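As a rough illustration of what that kind of integration involves, here is a minimal Python sketch that posts one completed response to a CRM-style REST service. The endpoint, field names and token are hypothetical – this is not the actual salesforce.com API, just the general shape of the call:

import requests  # third-party HTTP library

# Hypothetical endpoint and credentials - substitute your CRM's real API.
CRM_URL = "https://crm.example.com/api/contacts/{contact_id}/survey-responses"
API_TOKEN = "replace-with-a-real-token"

def push_response_to_crm(contact_id, survey_answers):
    """Attach one attendee's survey answers to their CRM record."""
    resp = requests.post(
        CRM_URL.format(contact_id=contact_id),
        json={"survey": "post_event_2012", "answers": survey_answers},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()  # fail loudly if the CRM rejects the record
    return resp.json()

if __name__ == "__main__":
    answers = {"overall_satisfaction": 5, "likely_to_return": "Yes"}
    print(push_response_to_crm("0031234567", answers))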

 

7/ Support

One of the nice things about the new generation of solutions is that many are cloud-based, so the software can be constantly updated and improved without the user having to deal with it. But all of the whiz-bang features in the world aren’t worth much if you can’t use them, or they don’t work the way that you expect them to, or it takes too long to get things done.

Here are three things we loosely lumped under support, the idea being that each makes it easier for us to get it right the first time.

First, libraries. Take a look at things like pre-built templates and pre-formatted question types which offer quick solutions to last minute projects.

Second, on-line tutorials. Whether you are just learning a system, or want to use a new feature, being able to go online to get a good tutorial is a real time-saver.

Finally, the tech support people. Start with the basics – can you find the phone number on the website? Talk about pet peeves!

A commitment to support is a direct manifestation of a company’s culture and philosophy. The more people in your organization are going to use the software, the more important support becomes.

So that’s it – seven areas that we looked at and you may wish to explore when you consider survey software applications. If you’ve read this far, you’re probably curious about which survey software solution we chose for Audience Metrix. For these reasons and others specific to our business and our clients, we went with Qualtrics.

Check out this sample survey – it’s an eye-opener!

Are you currently looking for new survey software? What would you recommend about your current survey application? What else would you like to know?

Design Better Surveys

This is the third in a series of posts exploring the best practices that underlie successful event-based corporate market research programs. We offer these up to help you to assess the “health” of your own programs.

In our first post, we looked at the need to move event measurement programs from quantifying what happened at the event, to looking at the impact of the event on the business. Our second post explored why your first step should be to review the survey instruments that your company is already using.

In this post, we will give you five tips to help you design more user-friendly surveys. This matters because the more user-friendly the survey is, the higher the response rates tend to be. And since statistics is based on sample size, the larger the sample (number of respondents) you have for a question, the more confidently you can project the results to the entire audience.
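To make “more confidently” concrete, here is a minimal Python sketch of the standard margin-of-error calculation for a reported percentage, with a finite-population correction for a known audience size. The attendee and respondent counts are made up for illustration:

import math

def margin_of_error(n_respondents, audience_size, p=0.5, z=1.96):
    """Approximate 95% margin of error for a reported percentage.

    Uses the usual formula for a proportion, z * sqrt(p*(1-p)/n),
    multiplied by a finite-population correction since an event
    audience is a fixed, known size.
    """
    base = z * math.sqrt(p * (1 - p) / n_respondents)
    fpc = math.sqrt((audience_size - n_respondents) / (audience_size - 1))
    return base * fpc

# Example: 5,000 attendees, 20% of whom complete the survey.
attendees = 5000
respondents = int(attendees * 0.20)
print(f"n={respondents}, margin of error = +/-{margin_of_error(respondents, attendees):.1%}")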

In Enterprise IT and many similar industries, somewhere between 15% and 25% of attendees are willing to participate in event-related surveys. Your goal is to do everything you can to maintain their goodwill… By the way, expect some fall-off in participation during the survey – people get bored or frustrated, the phone rings, the plane takes off.

There is nothing much you can do about it, except to think about how you order your questions. And if you have the technology, make it easy for them to come back to finish the survey at a later date.

1/ Ask questions in order of importance. As a general rule, we recommend that you put the most important questions at the beginning of the survey. You have the largest sample at the beginning, and people will be fresh and interested in what you are asking. The first few questions set the tone of your survey – readers will quickly decide if you are interested in what they think or are simply collecting data. (That is why we leverage registration data.)

2/ Challenge the value of each question. Less is more, and in surveys shorter is always better. The fundamental test that every question must pass to earn a place on the survey is: “What are you going to do with the data?”

Surveys seem to take on a life of their own and magically get longer every year. Most of the surveys that we see are filled with legacy questions that have been asked for years, and are kept because they provide “year over year” data. The funny thing is that no one remembers how they got there, and no one knows who uses the “year over year” data. These are the first ones we cut. It’s a good time for Spring Cleaning!

When a client wants to use an event survey to conduct market research or demonstrate event ROI, we remove almost all of the event logistics questions. This is in keeping with our philosophy of delivering actionable data – data that can be used to inform decisions that will help to improve the event in the future.

3/ Vary the question types. There are a lot of ways to ask questions. Mixing them helps to keep the respondent engaged. But not all questions are created equal – some of them demand a lot of time and effort, and quickly become annoying.

The five point balanced scale question is generally the foundation for the survey. These questions are easy to answer, and the data can be reported in multiple ways. True/false questions are also very easy to answer and don’t tax the respondent.

Write-in questions (called literals) are invaluable because they provide much-needed insight into the quantitative data you are collecting. Structure the question to be open-ended – you want to know what’s on your respondent’s mind. Set up the text box in your survey tool so that people can write as much as they want – no point in cutting them off if they are on a roll. Two to three literal questions are about right for a medium to long survey; use one for a short survey.

There are two types of questions that really tax the respondent. All you have to do is look at them to know which ones they are. Yes, the incredibly long list and its even more offensive relative, the incredibly long list with rating columns.

We understand the need to know which of the 63 products each customer is interested in. We also understand the need to know which of the 63 products is of the greatest interest. And which of the 63 product sessions provided great information and which ones didn’t meet expectations.

But seriously, would you stick around the second or third time you saw that grid? Even with a very nice incentive a lot of people will bail when the mental effort becomes too great…

4/ Use “mandatories” sparingly. Many survey engines allow users to make specific responses mandatory – which means that the question has to be answered before the user can go on to the next question. Clearly this has its uses (e.g. shipping and billing information) but it’s frequently overused in survey design.

Remember that once you annoy a respondent, chances are that you are going to lose them… You always have to trade off the value of getting a specific question answered, against the risk of losing their participation in the rest of the survey.

5/ Use branching and collect data from multiple sources. The problem with the 63-product list is that while it reflects your product offering and the event curriculum, it is not relevant to the individual respondent… No one is interested in 63 products, or goes to 63 sessions.

Many survey engines offer branching (sometimes called progressive or skip logic). In this instance, you might ask, “Which of the 63 products are you interested in?” The respondent makes their selection, and going forward the survey engine limits the questions to the products the respondent has selected – which means they only have to sort through the master list once. The ability to reconfigure (adapt) a survey to an individual respondent in real time is a significant advantage of a web-based instrument over paper.
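A rough Python sketch of that filtering step, assuming a hypothetical (and much shorter) master product list and a screener answer. A real survey engine would express the same rule declaratively through its skip or display logic settings:

# Limit follow-up questions to the products a respondent selected in a
# screener question. Product names and wording are made up for illustration.
MASTER_PRODUCT_LIST = ["Gizmo Pro", "Gizmo Lite", "Widget Cloud", "Widget On-Prem"]

def follow_up_questions(selected_products):
    """Generate one rating question per selected product - and nothing else."""
    selected = [p for p in selected_products if p in MASTER_PRODUCT_LIST]
    return [
        f"How satisfied were you with the {product} sessions? (1-5)"
        for product in selected
    ]

# Screener answer: the respondent picked only two items from the master list.
print(follow_up_questions(["Gizmo Pro", "Widget Cloud"]))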

Here’s another idea… Want to know what they thought about the product sessions? Don’t ask them – incorporate the session attendance and evaluation scores into your report. In this example, the session evaluations are more valuable because they are collected immediately after the session rather than days (and sometimes weeks!) later.

By prioritizing your questions, using various question types, using the tools in your survey engine and collecting data at multiple points before, during and after the event, you will increase your response rates and improve the accuracy of your projections.

Do you review your survey design from the respondent’s point of view? When is the last time you pruned the survey to refocus it on actionable data? Do you know who uses the answers to each question?