More companies are using research as a form of content marketing – be it survey-based studies that dive deep into an industry trend or analyses of internal user data to show off the brand’s expertise and point of view.

Yet, all that new interest comes with a learning curve. Original research is one of those areas where so much can go wrong, especially if you’re inexperienced. Most problems I see relate to one of two things: poor survey design or faulty statistical analysis.


Today, I’ll focus on one aspect of survey design – the survey experience.

Survey experience is about whether your survey takers find the questions relevant, intelligent, and appropriate. Will they be able (and feel motivated) to complete your survey? Will they answer honestly and openly? Would they answer another survey in the future based on this experience?

While it may seem like a nice-to-have, a good survey experience will boost completion and accuracy. A bad survey experience can nuke your survey results (more on that in a bit).


After 20 years of working with clients using research-as-content, I pay close attention to these eight experience elements.

1. Pre- and post-survey considerations

The survey-taker experience begins before they start the survey and continues after they hit complete. In the invitation to take the survey, be sure to explain why you’re conducting the survey, what you aim to do with the data, and how long the survey will reasonably take. (Be brief in these explanations. You don’t want someone to drop out before they’ve even begun.)

If you plan to collect survey responses from anyone in Europe, even unintentionally, turn on the GDPR opt-in settings available on most survey platforms and provide a link to your company’s data privacy policies.


Also, mind the post-survey experience. Someone who completes the survey should see a custom thank-you page, not the default page provided by your survey platform.

If you collect the survey takers’ email addresses, be crystal clear about your intent. For example, we collect emails from those who want a copy of the final report or to enter the raffle for completing the survey. We never use an email for any other purpose (and in fact, I strip out the email column in my spreadsheet and put it in a different tab to decouple identities from responses).
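If you handle that decoupling in a spreadsheet export, a small script can make the separation repeatable. Here’s a minimal sketch (not the author’s actual workflow) that splits a hypothetical "email" column out of a CSV of responses; the file and column names are assumptions.

```python
# Sketch: split email addresses away from survey responses so
# identities can't be linked back to individual answers.
# Column name "email" is a hypothetical example.
import csv

def decouple_emails(in_path, responses_path, emails_path, email_col="email"):
    with open(in_path, newline="") as f:
        rows = list(csv.DictReader(f))

    # Write emails (only non-empty ones) to their own file.
    with open(emails_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([email_col])
        for row in rows:
            if row.get(email_col):
                writer.writerow([row[email_col]])

    # Write responses with the email column stripped out.
    fields = [c for c in rows[0] if c != email_col]
    with open(responses_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        for row in rows:
            writer.writerow({k: row[k] for k in fields})
```

The same idea works in any tool: one file (or tab) holds identities, the other holds answers, and nothing joins them.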

If survey takers believe their responses will be used to market to them in any way, they will not want to take your survey.


2. Survey length

Don’t you love the “please-take-our-20-minute-survey” invitation? That’s a hard no for me. Unless you pay someone to take your survey, it should never be more than eight minutes – and even eight minutes is a big ask.


Consider this: Completion rates drop for each additional minute required to answer the questions. In my experience, they begin to fall off a cliff at around the eight-minute mark. Editing for survey length is an absolute necessity, and it’s an excellent way to ensure you’re tightly focused from an editorial standpoint.

An analysis of survey length by SurveyMonkey found drop-off rises with each additional question — an important reminder to keep survey length as tight as possible.


3. Question length and complexity

Most survey platforms warn you when questions or answer options run too long. That’s because long questions and answers lead to fatigue, speeding, and misunderstanding among survey takers, and they look like hell on mobile. Avoid long questions and answers unless they’re absolutely mission-critical, and even then, limit them to one or two questions.

Also, beware of the compound question (e.g., “Does your job give you satisfaction and pride?”). Boil your question down to a single idea or variable so your survey taker can answer easily and you can report findings clearly.

4. Tempo

Your survey’s pace should resemble a conversation between two strangers. Don’t dive into the most probing, sensitive questions up front. Wait until the survey taker can see your study is worthwhile based on the quality of your questions. Then, they may be more willing to share sensitive details. Income is always a sensitive topic, but others make people uneasy, too, such as plans to leave a job or requests to reveal sensitive company information.

I advise a few antidotes to this awkwardness: Put those types of questions toward the end of your survey and make them optional (or add an option for “prefer not to answer”). You may even remind survey takers at sensitive moments that their responses will be fully anonymized and never used for any other purpose.

5. Demographics

Capturing demographic information is essential to ensure the survey sample represents the audience you’re attempting to study. Demographic responses also can expand options for interesting “cuts” of the data, showing you how different cohorts of your study group (e.g., generations) differ.
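As a toy illustration of what a demographic “cut” looks like in practice, here’s a standard-library Python sketch that tallies answers to one question within each generation. The data and field names are invented for the example.

```python
# Sketch: "cut" survey results by a demographic cohort (here, generation).
# The responses and field names below are invented for illustration.
from collections import Counter, defaultdict

responses = [
    {"generation": "Gen Z", "uses_ai_tools": "Yes"},
    {"generation": "Gen Z", "uses_ai_tools": "Yes"},
    {"generation": "Boomer", "uses_ai_tools": "No"},
    {"generation": "Millennial", "uses_ai_tools": "Yes"},
    {"generation": "Boomer", "uses_ai_tools": "Yes"},
]

def cut_by(responses, cohort_key, answer_key):
    """Tally answers to one question within each cohort."""
    cuts = defaultdict(Counter)
    for r in responses:
        cuts[r[cohort_key]][r[answer_key]] += 1
    return cuts

cuts = cut_by(responses, "generation", "uses_ai_tools")
# cuts["Gen Z"]["Yes"] == 2
```

Any spreadsheet pivot table does the same job; the point is that each demographic question you add opens another dimension to compare cohorts on.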

In recent years, survey designers have evolved how they ask demographic questions to be more inclusive. For example, they ensure questions about sexual identity or gender identification use language that includes rather than alienates — an important aspect of survey experience.

The challenge is balancing inclusivity and brevity. Rather than list a dozen choices for gender identity, for example, I limit answer choices to man, woman, non-binary, and prefer to self-identify (write-in). The prefer-to-self-identify option ensures everyone has a choice that fits. And fewer answer choices make it more likely you will have segments large enough to compare.

Your survey platform can be a good resource when designing demographic questions. (They often have question libraries to draw from. SurveyMonkey’s library is particularly good.) I also like to look at what big research organizations like Pew Research Center use for demographic questions. Whether I’m asking about gender, race, ethnicity, age, sexual identity, or other characteristics, I consult established surveys to find a consensus.


When does this advice go out the window? When gathering granular detail is part of your study’s primary aim (e.g., the research focuses on gender identity) and commonly used demographic questions don’t provide the specificity you need.

6. Self-serving questions

Don’t even think about asking self-serving or promotional questions. Invariably, I work with companies that want to toss in a few questions that not-so-subtly promote their product/service. The problem? Your survey takers are smart, and they will resent the question and even punish you for it.


The most comical example came from a client a few years ago who was adamant about asking, “Which do you like better?” alongside illustrations of two analytics dashboards. One image was clearly superior (and belonged to the client); the other was the primitive Stone Age option. Guess what? A third of respondents chose the Stone Age option. I suspect they knew it was a setup. The results were not usable.

7. Survey testing

Testing your survey is the most important thing to do before you release it to the wild. Recruit at least five people (10 or more is better) to take the survey and comment on ANY issue that gives them pause. These individuals should be in your target study group so they can gut-test the question wording and answer choices.

Tools like Alchemer let you generate automated test responses. These dummy responses are an excellent way to test your survey’s conditional logic and piping.


My company pays testers to ensure they take it slowly and record all their questions and concerns. Is any question unclear? Do the answer choices make sense? Are they able to answer every question or do some not apply? Make sure some testers respond to the survey on mobile and others on a desktop to validate both experiences.

When your testers finish, run automated testing through your survey platform, which generates responses for you. Comb through the summary report; those dummy responses can help pinpoint problems with survey logic and piping.
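To show the idea behind dummy-response testing, here’s a hypothetical Python sketch that fills an invented two-branch survey with random answers and checks that its skip logic holds. None of this reflects any specific platform’s API; the survey structure and field names are made up.

```python
# Sketch: generate random dummy responses for an invented survey and
# verify its skip logic. All question and answer names are hypothetical.
import random

QUESTIONS = {
    "uses_analytics": ["Yes", "No"],
    "analytics_tool": ["Tool A", "Tool B", "Other"],  # shown only if uses_analytics == "Yes"
    "team_size": ["1-10", "11-50", "51+"],
}

def dummy_response(rng):
    """Fill the survey the way a random respondent would, honoring skip logic."""
    resp = {
        "uses_analytics": rng.choice(QUESTIONS["uses_analytics"]),
        "team_size": rng.choice(QUESTIONS["team_size"]),
    }
    # Skip logic: the follow-up question only appears for "Yes" answers.
    if resp["uses_analytics"] == "Yes":
        resp["analytics_tool"] = rng.choice(QUESTIONS["analytics_tool"])
    return resp

def check_logic(resp):
    """A 'No' respondent should never carry a follow-up answer, and vice versa."""
    if resp["uses_analytics"] == "No":
        return "analytics_tool" not in resp
    return "analytics_tool" in resp

rng = random.Random(0)
assert all(check_logic(dummy_response(rng)) for _ in range(200))
```

A survey platform’s automated testing does essentially this at scale, so contradictions in branching or piping surface in the summary report before real respondents ever hit them.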

8. Survey feedback

Make sure to include an email address for people who have questions: in the survey introduction, on disqualification pages, and at the end. If you’ve designed a great survey experience, you likely won’t receive any emails (we rarely do). But providing the option can serve as an early warning system for any survey problems missed in testing. (Note: You can’t substantially edit a question once you’ve gone live, but you can either restart the survey or drop the offending question.)

Better survey experience brings big benefits

Why does experience matter so much? A frictionless experience increases your number of completes and your sample size – boosting your study’s credibility and making it more likely that you can tell interesting stories. Plus, a good survey experience signals to survey takers that the research is worthwhile, which is critical when you ask customers or other audience members to participate.

Please note: All tools mentioned are suggested by the author. Feel free to add your favorite tools in the comments (from your company or ones you have used).

If you want to be one of the first to see the results of CMI’s annual benchmarks and trend research or niche reports, subscribe to the daily or weekly newsletter.

Cover image by Joseph Kalinowski/Content Marketing Institute





