If you couldn’t tell from the title, I’m not a huge fan of surveys.
Do they serve a purpose in marketing?
Absolutely!
Are they the be-all-end-all of customer research?
Absolutely not!
Marketers and business owners have been relying on surveys to do the heavy lifting for too long…
- Don’t know whether your customer is going to like your offer? Ask!
- Don’t know why your customer is leaving your page? Ask!
- Don’t know where your customer is coming from? Ask!
Whenever a gap in user behavior appears, it becomes the knee-jerk reaction to simply just ask the user “Why?”
Well, this can lead to huge problems and completely ruin your product development, offer construction, and your marketing campaigns.
In short – over-relying on surveys can ruin your business.
Why? I’m glad you asked. 🙂
Surveys can hurt your business because there is a HUGE difference between how a person reports they will behave and how they actually behave.
Even worse, a complex survey is HARD to make, and if it isn’t made by a professional, you could be leaving yourself open to a litany of biases that skew results and provide even worse data!
Think about it: Over-reliance on surveys provides potentially biased data from a source that is already WEAK in the accuracy department.
Today, I want to cover…
- Exactly where surveys fall short
- The situations where they’re still appropriate
- And customer behavior data sources you can use instead of surveys
Let’s get right to it!
Why Surveys Can Suck
Before I can talk about why surveys suck, I need to get a little nerdy about an important data distinction.
Surveys are a type of user data I refer to as “active user data.” Active user data is any type of data that is gathered when the visitor is aware they are in a testing (or observed) environment.
Its antithesis is “passive user data,” which is any type of data gathered while the subject is unaware they are in an observed environment.
The distinction between active user data and passive user data is very important because behavior changes between a visitor who knows they are being observed versus a visitor who does not.
Next time you look at your data, ask yourself whether it is active or passive.
Passive data is more accurate from the start, i.e., the data is useful as soon as you gather it, whereas active data requires validation against other data sources to verify its accuracy.
Okay, we’ve gotten the theoretical distinctions out of the way.
Now, let’s talk about the real stuff: Why surveys are bad for business.
People Don’t Know What They Want
I’m kind of beating a dead horse, but don’t just take my word for it! This research has already been done.
Ever hear of Howard Moskowitz? If not, he’s a market researcher and psychophysicist.
He’s helped develop new products for companies like Pepsi and Prego using traditional user research methods.
When Dr. Moskowitz was working with Pepsi to develop the perfect Diet Pepsi, the user preference data he got back was all over the map. It didn’t make sense: one person liked Diet Pepsi at a high sweetness level, whereas others preferred it less sweet.
In this field, having messy data isn’t anything new.
However, when Dr. Moskowitz pondered over the weird data set he realized something:
It’s not about creating the perfect Pepsi, it’s about creating the perfect Pepsis.
This concept wasn’t taken seriously, and eventually, he had to find work elsewhere.
Before I tell you where, have you ever walked down the pickle aisle in a grocery store? Have you seen just how many product variants they have? This is because they know there is no “Perfect Pickle,” only “Perfect Pickles.”
Dr. Moskowitz started working for Vlasic and they embraced this philosophy.
Rather than just improving a single line item, companies should focus on “horizontal segmentation,” which is a fancy way of saying they should offer a variety of options to meet the needs of an audience that:
- Doesn’t know what it wants, and
- Has preferences so varied that there is no overall “best” option
Now, let’s get away from Dr. Moskowitz’s resume and talk briefly about his research.
I want to share one of his research findings, because it drives home the most important part of his work (as far as surveys are concerned).
When consumers are asked what type of coffee they prefer, they respond with:
A rich, dark roast.
Yet after actually drinking coffee and being asked which they prefer, 25-27% of those same people choose:
Milky, weak coffee.
Dr. Moskowitz discovered that there is a massive difference between what people report to want and what they actually want.
There are all kinds of reasons why this phenomenon occurs, and survey biases are the first place to look.
1. So. Many. Survey. Biases.
Crafting a survey is hard.
Like really hard.
There are people who dedicate their professional careers to designing, launching, and analyzing surveys. You are likely not one of those people, so if you’re using surveys, you might have a few of these biases to deal with!
Hawthorne Effect
Also referred to as the Observer Effect, this occurs when a respondent alters their behavior in response to their awareness of being observed.
This effect is one of the reasons why I differentiate between active and passive user data. The moment a person realizes they are part of a study, or that their responses will be recorded, is the moment you lose truly candid answers.
During Traffic and Conversion Summit 2017, I asked the audience two questions:
- Raise your hand and keep it up if you know what a heat map is.
- Put your hand down if you haven’t used a heat map.
From the stage, I could see people putting their hands down and back up again as they surveyed the room to see which behavior was acceptable. After two seconds or so, most of the room still had their hands raised.
Clearly, some of the audience hadn’t used a heat map, but because they were among their peers and the surveyor (me) could see them, they altered their responses.
Sampling Bias
A sampling bias occurs when your surveyed audience isn’t a true representation of the entire population.
If you were to craft new messaging, new products, or new offers from a data set with a sampling bias, you’d be designing something that works for only part of your audience, not all of it.
Say your audience is represented by a bag of M&Ms. You’d want to get input from all the different M&Ms in order to get useful data.
However, say there is an error in targeting and you only get data from the green M&Ms (and I hear they have very different opinions than the red and yellow M&Ms).
Without the input from the other sources, you could be making decisions based on a skewed data set which would hurt your business!
Here’s a less cartoony example:
Say you run a survey on a campaign where 80% of your traffic comes from Google pay-per-click (PPC) campaigns.
People coming from Google are generally looking for a solution to a problem and have a different reason for visiting your page than someone arriving from an organic or referral source. If you polled everyone hitting this page, the responses would be dominated by your PPC audience, and decisions based on them could hurt the experience for your other visitors!
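To make the problem concrete, here’s a rough Python sketch (all numbers hypothetical) of how a blended survey average can hide what your non-PPC visitors actually think, which is exactly the risk when one source dominates your sample:

```python
# Hypothetical sketch: why a blended survey average can mislead when one
# traffic source (PPC) dominates the sample. All responses are made up.
responses = (
    [("ppc", 4)] * 80        # 80 PPC respondents, mostly happy
    + [("organic", 2)] * 15  # 15 organic respondents, mostly unhappy
    + [("referral", 2)] * 5  # 5 referral respondents, mostly unhappy
)

overall = sum(score for _, score in responses) / len(responses)
print(f"Blended average: {overall:.2f}  <- looks fine on the surface")

for source in ("ppc", "organic", "referral"):
    scores = [score for src, score in responses if src == source]
    print(f"{source:>8}: {sum(scores) / len(scores):.2f} across {len(scores)} respondents")
# The blended number hides that non-PPC visitors are having a bad experience.
```

Breaking the responses out by source is the quickest way to spot whether one segment is drowning out the rest.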
Researcher’s Bias
This is more of a bias category than an individual bias: it covers a whole set of biases that the researcher introduces.
These can include the following:
1. Confirmation Bias:
When a researcher uses new information to confirm an existing belief, theory, or hypothesis.
2. Question Order Bias:
The order of your questions can (and will) impact future responses.
This is somewhat unavoidable, but, generally, you want to use general questions before you ask specific questions and positive questions before negative questions to minimize the effect.
3. Leading Question or Wording Bias:
The more you elaborate on a question, the more likely you are to put words in your respondent’s mouth.
This can lead to a respondent thinking there is a “right or wrong” answer and makes the question susceptible to the Hawthorne Effect.
Non-Response Bias
This bias occurs when individuals chosen in the sample are unwilling or unable to take the survey.
This is only a problem if the non-respondents would have meaningfully different responses than the respondents and can cause a sampling bias.
2. Over-Reliance During Offer or Product Development Can Squash Innovation
Let’s go back to Dr. Moskowitz for a second.
When he started working at Vlasic, they were trying to create the “Perfect Pickle.” That approach would have led to a single pickle jar on the shelf that gets less retail exposure and appeals to only a fraction of the population.
Any zesty pickle fans here?
Well, your precious zesty pickles wouldn’t exist if Vlasic had kept using user data to chase a single perfect product instead of creating MULTIPLE appealing products and letting people choose.
Honestly, if you aren’t willing to launch a product or a new offering without consulting your user base you might as well just give up right now.
You need to be the innovator. You need to be proactive! Relying on user survey data is reactive and won’t lead to the breakthrough you’re looking for.
Remember, even if you are completely bias-free (which I doubt), you are still using a data source whose primary input, i.e., what respondents report, isn’t considered reliable.
This is why you need to use complementary data to verify the findings of even a properly designed survey.
Types of Surveys
So, I’ve trashed surveys pretty hard up to this point.
I want to make it clear that they are useful WHEN USED APPROPRIATELY.
There are two types of online surveys I want to cover:
- The microsurvey
- The standard “macro” survey
Spoiler alert: The microsurvey is WAY more appropriate and will help avoid a lot of the biases I covered earlier.
Microsurvey
Microsurveys are small surveys that collect information from users with just one or two questions.
You can choose where, when, and how they appear on your page.
I wrote at length about microsurveys here; I’d suggest giving it a read if you plan to start using them.
The power behind these types of surveys is they have the two following characteristics:
- They’re short: Visitors don’t like long surveys but will respond to a single question. Take that, non-response bias!
- They’re targeted: You can go after a broad or specific segment. This gives you the ability to ask the right people the right question.
When constructing these types of surveys, you want to ensure that you’re using “factual” questions instead of questions that are open to interpretation.
Say I want to find out how much traffic my paid subscribers get to their websites as a way to qualify them for a new product. This should be easy, right?
Let’s try!
Version A
Question:
How much traffic do you get monthly?
Possible Responses:
a) A lot of traffic
b) Moderate traffic
c) Low traffic
Version B
Question:
How much traffic do you get monthly?
Possible Responses:
a) 100,000+ visits
b) Between 10,001 and 99,999 visits
c) Fewer than 10,000 visits
Please pick B, please pick B, please pick B.
The answer is B! Why?
There isn’t any interpretation required by the respondent.
High traffic for someone who works at Amazon is likely very different than high traffic for your local plumber.
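To show why fixed ranges matter, here’s a minimal Python sketch (the buckets match Version B; the function and numbers are hypothetical) that maps a respondent’s actual monthly traffic from your analytics into the same buckets, so you can cross-check what people report against how they really behave:

```python
# Hypothetical example: map actual monthly traffic (e.g., pulled from an
# analytics export) onto the same fixed buckets used in Version B, so
# self-reported answers can be cross-checked against real behavior.

def traffic_bucket(monthly_visits: int) -> str:
    """Return the Version B answer that matches the actual visit count."""
    if monthly_visits >= 100_000:
        return "a) 100,000+ visits"
    if monthly_visits > 10_000:
        return "b) Between 10,001 and 99,999 visits"
    return "c) Fewer than 10,000 visits"

reported = "a) 100,000+ visits"      # what the respondent said (hypothetical)
actual = traffic_bucket(42_500)      # what their analytics show (hypothetical)

print(f"Reported: {reported}")
print(f"Actual:   {actual}")
print("Match!" if reported == actual else "Mismatch: reported vs. actual behavior differ.")
```

Because the answer options are concrete ranges, there’s nothing for the respondent (or you) to interpret, and the answers line up one-to-one with your analytics.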
For your next survey, try a microsurvey!
Keep it…
- Short
- Targeted
- Interpretation free
…and you’ll have a useful data set!
Macro Survey
These are the types of surveys we’re most familiar with. The ones that come with the possibility of winning an Amazon gift card if you take the entire survey…
I don’t have a lot to say about these types of surveys.
It is more difficult to get your visitors to take a long survey, which is why people generally offer incentives.
So, with these types of surveys, you’ll have a lower response rate, and if you do incentivize, you could run into “professional” survey takers, which can skew your data.
What’s worse is these are the types of surveys that can be hit with all kinds of survey biases.
Generally, I advise people to use survey templates and questions that have been developed by professional survey designers.
If you’re using a survey building tool, look at their templates and ask them how the questions were developed.
Despite all the risks, some of you still need to use these types of surveys. Here are tips to reduce the risk of macro surveys and increase the number of respondents:
- Use advanced targeting to get the right survey in front of the right people and avoid statistical noise.
- Be upfront with the time commitment and use liberal estimates. Let people know how many questions or how long the survey will take!
- Start with demographic and factual data then move to interpretive.
- Avoid leading questions – if you have to further clarify a question, the question is bad.
When It’s Appropriate to Use Surveys
I generally only use surveys and other active user data sources when I can’t get the answers I need from passive data sources, e.g.:
- Web analytics
- Funnel analysis
- Heat maps
- Session recordings
I haven’t used a long-form survey in roughly three years. That’s a LONG time considering I’m a digital experience optimizer.
When I did use long surveys they were targeted to a niche email list and the data was used for a benchmark report.
This data was highly factual and focused on user demographics; there were very few interpretative questions.
I prefer to use short, targeted microsurveys to fill in my knowledge gaps, and I do so with this very important understanding:
Surveys are supposed to be used as complementary data.
I’m not trying to find out a respondent’s desires; I’m trying to get more information about the respondent so I can gauge whether what I’m doing is effective!
For example: DigitalMarketer was trying to figure out whether our community, a closed Facebook group called Engage, impacted how much the average customer would spend with the company.
To test this theory, DigitalMarketer used a microsurvey tool to poll the audience.
To cut down on the number of questions, the team targeted visitors who logged into the subscription product DM Lab.
Then, two questions were asked:
- Are you a member of the DM Engage Facebook group?
- To confirm, what’s the email address you use to log in to DM Lab?
Based on the response to Question 1, a different tag was applied to the respondent’s record in Infusionsoft. From there, the DM team could export the data and look at the user’s spending habits to see if and how community impacted the buyer’s behavior.
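Here’s a rough sketch of what that last analysis step could look like, assuming the tagged records were exported to a CSV; the file and column names (tag, lifetime_spend) are hypothetical and purely for illustration:

```python
# Hypothetical sketch: compare spending between respondents tagged as Engage
# members vs. non-members after exporting the CRM records to CSV.
# The file name and column names are assumptions for illustration.
import pandas as pd

records = pd.read_csv("infusionsoft_export.csv")  # hypothetical export

spend_by_group = (
    records.groupby("tag")["lifetime_spend"]
    .agg(["count", "mean", "median"])
    .rename(columns={"count": "customers", "mean": "avg_spend", "median": "median_spend"})
)

print(spend_by_group)
# If average spend is meaningfully higher for Engage members, the community
# may be influencing buyer behavior (worth validating with more data).
```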
In fact, we could slice this data even more if we wanted!
Without having to ask the respondent, we were able to gather country of origin, city, operating system information, browser information, and more.
Using this type of technology allows you to ask targeted questions whose answers sync with your CRM, and it eliminates many of the qualifying questions you’d otherwise need to segment data by user demographics.
Passive Data Sources for Customer Behavior Analysis
Customer behavior analytics can be broken down into two camps:
- Active user data: A type of user data where the subject knows they are in an observed environment.
- Passive user data: A type of user data where the subject is unaware they are in an observed environment.
We’ve discussed active user data at length and where it tends to fall short.
Now, I want to show you some examples of passive user data and how it can be used as a substitute for the problematic active user data sources.
Heat Maps
This is a type of passive user data that will show you where people interact on your website.
You’ll see…
- Where they click
- Where they scroll
- How their mouse moves
- How long it takes to click a particular element
…and more.
Why is this better than a survey?
Well, you are seeing your customer act as they normally would. You’re observing them in the wild!
Furthermore, you aren’t asking about a particular preference or asking them to report on what they think they might do – this is exactly what they do, and how they behave reveals their preferences.
If you notice that people aren’t scrolling through your content, then you know you haven’t created a page that is compelling enough to get people to explore further.
If people aren’t clicking your main call-to-action, you’ll quickly see what is causing the distraction.
There isn’t any guessing: either a visitor makes the converting action or they don’t. It’s this type of data that will show you WHY they aren’t taking that action and what they’re doing instead.
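As a quick illustration of how that behavioral data answers the “why,” here’s a minimal Python sketch (with made-up numbers) of how scroll-depth data rolls up into the kind of summary a scroll map gives you:

```python
# Hypothetical sketch: summarize scroll-depth data the way a scroll map does.
# Each number is one visitor's deepest scroll point as a percent of page height.
max_scroll_depths = [100, 95, 80, 75, 60, 55, 50, 45, 30, 25, 20, 15, 10, 10, 5]

for threshold in (25, 50, 75, 100):
    reached = sum(1 for depth in max_scroll_depths if depth >= threshold)
    print(f"{reached / len(max_scroll_depths):.0%} of visitors reached {threshold}% of the page")
# If only a small share reach the section holding your call-to-action,
# the page isn't compelling enough to pull people that far down.
```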
If you’re using heat maps or are planning to use a heat map, I highly recommend giving this a read so you know what to do with the data once you have it.
Recordings
Recordings are an insanely powerful way to understand your user’s journey.
However, they are extremely time-consuming and should only be used if your heat maps don’t answer your underlying questions about the page.
Say you create a heat map for a checkout page and notice that people only click the first field and don’t click again until they get to the credit card field. You might start to think they aren’t filling in the other sections or that your workflow isn’t ideal.
This is the perfect time to explore using a recording.
In this case, you’ll likely see that a user is using the “tab” button to move from field to field. “Tab” users on desktop and “next” keyboard tappers on mobile have different user needs than standard field clickers.
For example, if you rely on placeholder text inside the field to act as the field’s label, a user tabbing from field to field won’t know what to type! And when a user doesn’t know what to type, they’re not going to convert.
Heat maps are extremely powerful for getting an averaged view of how users interact with your page. Even with more advanced filters, you still only get overarching data!
Recordings will show you everything your user goes through on the page (and if you want on subsequent pages).
So, if you use a heat map and still have questions, don’t jump directly to a survey! Try out a session recording to get the answers you need from your user’s behavior.
Form Analytics
The form analytics report is hands down one of my favorite types of data. When working on lead generation pages or checkout pages you have to optimize two crucial areas:
- The page itself
- The form experience
You could have the most fantastic page in the world, but if your form doesn’t make sense, you’re going to lose people.
A bad form is a deal breaker, and being able to dig into the form data itself will provide more insight than survey feedback that says “Your form sucked!”
Forms can show you:
- The fields your users spend the most time on
- What had to be refilled
- What was left blank
- The overall conversion rate of the form
When you know this information, you can make better decisions about what goes on the form and how to improve the usability.
A bad conversion rate will tell you the form sucks (so you don’t need a survey for that). Then, all of the other form data points will tell you WHY it sucks.
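As a rough illustration of how those form metrics come together, here’s a minimal Python sketch that derives time-per-field, re-entries, and the form conversion rate from a hypothetical event export; the event layout is an assumption, not any particular tool’s format:

```python
# Hypothetical sketch: derive basic form-analytics metrics from an exported
# event log. The tuple layout (session, field, event, timestamp) is an
# assumption for illustration, not any particular tool's export format.
from collections import defaultdict

events = [  # (session_id, field_name, event_type, timestamp_in_seconds)
    ("s1", "email", "focus", 0.0),  ("s1", "email", "blur", 4.2),
    ("s1", "phone", "focus", 4.5),  ("s1", "phone", "blur", 19.8),
    ("s1", "phone", "focus", 20.1), ("s1", "phone", "blur", 31.0),  # re-entered
    ("s1", None,    "submit", 32.0),
    ("s2", "email", "focus", 0.0),  ("s2", "email", "blur", 3.1),
    # session s2 never submits -> abandoned form
]

time_in_field = defaultdict(float)   # field -> total seconds focused
entry_counts = defaultdict(int)      # (session, field) -> times entered
open_focus = {}                      # session -> (field, focus_timestamp)
sessions, submissions = set(), set()

for session, field, event, ts in events:
    sessions.add(session)
    if event == "focus":
        open_focus[session] = (field, ts)
        entry_counts[(session, field)] += 1
    elif event == "blur" and session in open_focus:
        focused_field, start = open_focus.pop(session)
        time_in_field[focused_field] += ts - start
    elif event == "submit":
        submissions.add(session)

for field, seconds in time_in_field.items():
    refills = sum(n - 1 for (s, f), n in entry_counts.items() if f == field and n > 1)
    print(f"{field}: {seconds:.1f}s total focus time, re-entered {refills}x")

print(f"Form conversion rate: {len(submissions) / len(sessions):.0%}")
```

The fields with the longest focus times and the most re-entries are your first candidates for rewording, reordering, or removing.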
I actually ran form analytics on one of DigitalMarketer’s lead generation landing pages, and what I found was eye-opening. Check that out here!
Funnel Analytics
This leans more toward quantitative data than the other sources, but it’s still customer behavior data!
This report will tell you where you’re losing people in your funnel.
If you don’t know where your drop off points are, then you can’t possibly hope to improve your business.
Sequencing is everything, and if your offers are out of sequence, your funnel analytics will show you right away. No amount of survey-taking will tell you as accurately (or as quickly) as funnel analytics whether your sequencing is off.
Even better, you can deepen your understanding of the user’s journey by using advanced filters to slice the data by demographics.
Useful demographic and device data is collected automatically, so you can filter and inspect the behavior of specific user segments.
All of this data is readily available and doesn’t require you to ask “What country are you located in?” – it’s simply gathered on the visit.
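To make the idea concrete, here’s a minimal Python sketch (with made-up step names and counts) of the core calculation a funnel report does: step-to-step conversion and drop-off:

```python
# Hypothetical sketch of the math behind a funnel report: for each step,
# how many visitors continued and how many dropped off. All figures made up.
funnel = [
    ("Landing page", 10_000),
    ("Pricing page",  3_200),
    ("Checkout",        900),
    ("Purchase",        310),
]

for (step, count), (next_step, next_count) in zip(funnel, funnel[1:]):
    continued = next_count / count
    print(f"{step} -> {next_step}: {continued:.1%} continued, "
          f"{1 - continued:.1%} dropped off ({count - next_count} visitors lost)")

overall = funnel[-1][1] / funnel[0][1]
print(f"Overall funnel conversion: {overall:.1%}")
```

The step with the biggest drop-off is where your sequencing or your offer needs attention first.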
2 Passive Data Sets You Already Have
Everything I’ve shared so far requires a new piece of marketing technology, whether it’s an advanced niche survey tool like Formstack, WuFoo, or SurveyMonkey or a customer behavior suite like TruConversion.
New tools are intimidating, have a learning curve, and have a cost.
So, as my gift to you for getting this far, here are two game-changing passive data sources you already have!
1. Customer Service Questions
Customer service departments see the good, the bad, and the ugly and all of that data is useful!
When a customer is complaining or praising you at this level they are not worried about being observed, so the Hawthorne Effect goes out the window.
In short, you’re getting detailed insight into what your customers truly think!
You are getting people when they are most critical, and this can help fuel future updates or campaigns.
I recommend talking to your customer service department weekly and asking the following:
- What are the top five questions you get from customers?
- What are your answers to these questions?
- Are there any particular aspects of X that people don’t understand?
- What aspects of X do people like the most/least?
- Have we missed anything important? Anything to add?
Knowing these issues will tell you where to focus your optimization and improvement efforts, and give you several new ideas to try.
2. Sales Questions
This dataset will tell you the reasons customers buy or don’t buy from your company.
During the sales phase, you’re going to uncover things that worked surprisingly well and other things that absolutely tanked.
This type of user data will shed light on the messaging you should improve and the stuff you need to highlight more. This can be used for on-page optimization as well as for new ad collateral and media campaigns.
Ask your sales team these five questions regularly to get the most out of your sales inquiries:
- What are the top five questions you get from prospects?
- What are your answers to these questions?
- What is the biggest barrier keeping people from buying the product?
- Are there any selling points that work particularly well (or not so well)? If so, which one(s)?
- Have we missed anything important? Anything to add?
If you have a short sales cycle, I recommend asking these questions weekly. If you have a longer sales cycle you might want to ask monthly.
tl;dr
Relying on surveys to provide user data, product insights, and offer analysis is a shortsighted and inaccurate approach.
Get your user data from how your customers actually behave, not from how they report behaving. Your business will thank you for it!