As growth marketers, we love to swim in quantitative data. We tend to be analytical and love to find hard answers to tough problems.

But the “softer,” qualitative stuff is important, too. In fact, at certain stages and for certain problems, it can be even more valuable than quantitative data.

Qualitative Data: Finding The Why

Quantitative data is great at finding who your customers are, what the problems may be, and where you need to focus your efforts. At the beginning of any project, I set baselines using quantitative data. It’s simply hard to plan for growth otherwise.

Similarly, quantitative data can answer questions related to causality if you’re running A/B tests. This is the gold standard in web and product optimization, and that likely won’t change soon.

But if you’re chasing after answers to “why” and “how” questions, the qualitative stuff is your friend.

Qualitative data can help you answer questions like:

  • Who is my audience (not just demographically, but psychologically)?
  • Why are people not converting?
  • What do I test?
  • What specific language do people use when describing their problem/motivation?
  • What hesitations do people have that I need to address in an experience?
  • What part of an experience brings delight (Aha moment analysis comes to mind)?

Generally speaking, if you’re rigorous and asking important growth questions, qualitative data usually has a place in helping you solve them.

Qualitative insights are also an excellent complement to quantitative data. For example, your form builder tool may have analytics that show you the drop-off on form fields and submissions. But running a few user tests will help you understand what users feel while filling out the form and why they drop off. The two forms of data work in tandem.
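To make that pairing concrete, here's a minimal Python sketch of how you might rank form fields by drop-off from whatever field-level export your form tool provides. The event data here is entirely made up for illustration:

```python
from collections import Counter

# Hypothetical event log: each entry is the last field a visitor
# touched before abandoning the form, or "submitted" on success.
events = [
    "email", "submitted", "phone", "phone", "submitted",
    "company_size", "phone", "submitted", "email", "phone",
]

def drop_off_by_field(events):
    """Count abandonments per field and the overall completion rate."""
    counts = Counter(events)
    submitted = counts.pop("submitted", 0)
    completion_rate = submitted / len(events)
    return counts, completion_rate

drops, rate = drop_off_by_field(events)
# "drops" ranks fields by abandonment; "rate" is the share who finished.
# A field like "phone" topping the list tells you *where* to run user
# tests -- the tests themselves tell you *why* people bail there.
```

The quantitative tally points you at the worst field; the qualitative follow-up explains it.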

Note: I’m using the term “qualitative” kind of broadly here, and maybe not strictly by its technical definition. For the purpose of this article, qualitative data is basically any data that isn’t behavioral, revenue, demographic, or technical. It includes things like attitudinal surveys, user tests, and interviews. The outputs of this data are things like psychographics, user journey maps, “psych” scores, and voice of customer data.

Methods to Gather Qualitative Data

Depending on the creativity of your research design and your specific problem, you can gather qualitative insights in a variety of ways, from the colloquial (coffee shop user testing) to the formal (complex survey design and semantic analysis).

That being said, there are a few that are more common (and cheap) than others. Here are a few favorites that have helped me out of tough problems…

Customer Surveys

Customer surveys are sort of the go-to tool for marketers, probably because they don’t have to actually talk to customers (my cynical answer).

They’re also used often because, nowadays, you can get incredibly granular with who you target and when you do so. Additionally, survey tools today are endlessly customizable and personalizable, so they can be used for almost any purpose. You can even integrate survey tools with your marketing tools to create better campaigns (but that’s material for a different article).


Customer surveys also let you quantify responses using rating scales. However, in the pursuit of growth, I’ve always found the open-ended qualitative questions to be more valuable.

Customer Interviews

I’m a big fan of customer interviews, for two very different purposes.

First, for the exploratory phase of a campaign, product, or even a company. When you need help even determining the direction, well-executed conversations with customers (or potential customers) can really help put you on the right path.

Second, for homing in on the specific motivations and language that the customer uses. This is super useful when you’re setting up paid campaigns, writing copy for landing pages, or building narratives for product marketing.

In an “informal” conversation filled with open-ended questions, you can often get people to lower their guard and you can get closer to the truth (surveys are pretty stuffy after all).

On-Site Polls

On-site polls are another form of survey tool, but they’re usually anonymous and target website visitors at large instead of a segment of your customer, lead, or subscriber base.

These are great for finding “real-time” feedback on your site. For instance, you can slap an on-site poll on your thank you page to find out what factors led to the final purchase (and what almost prevented it).


Or you could use on-site polls to determine if your blog article or knowledge base article is providing enough context and value. While content analytics can provide a lot of insight, there’s nothing like an on-site poll to triage content-based problems.


The use cases here are pretty endless. Use a tool like Qualaroo to accomplish all of this super easily.

Session Replays

Session replays are anonymized videos of users actually using your site or product. Creepy? Somewhat. Useful? Totally.

The first time you watch people interact with a landing page or product you’ve poured time and energy into will probably be a painful experience. Inevitably, you’ll find usability errors, confusing user behavior, and “obvious” fixes to make.


I find that session replays are wonderful for homing in on user experience and usability problems. Sometimes, the fixes are obvious (if a button or link is broken, for instance). Sometimes the videos simply make you aware of problems that will require further research (e.g. why did these users all drop off after entering all their info? How can we fix this funnel step?).

User Tests

User tests are my favorite qualitative research method. I love both moderated and unmoderated tests, and I think that, run rigorously, they deliver the most bang for your buck in terms of cost vs. insight.

Basically, you can run them any way you please, but you’ll be watching users interact with your site or product in a way that you direct. You can set broad prompts (e.g. “find an item you would like to buy”), or you can set specific prompts (e.g. “find a pair of men’s boot cut jeans in size 33/32 for under $80”). I recommend doing both.

You can also ask survey questions after task completions, and you can make this stuff quantitative by timing completion rates and asking quantitative satisfaction survey questions between tasks.
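As a sketch of what that quantification might look like, here's some Python over hypothetical user-test results. The field names, the broad/specific task labels, and the 1–7 rating scale are assumptions for illustration, not any particular tool's export format:

```python
# Hypothetical results from an unmoderated user test: one dict per
# tester per task, with completion, time on task, and a 1-7 rating.
results = [
    {"task": "broad",    "completed": True,  "seconds": 48,  "rating": 6},
    {"task": "broad",    "completed": True,  "seconds": 95,  "rating": 5},
    {"task": "specific", "completed": False, "seconds": 210, "rating": 2},
    {"task": "specific", "completed": True,  "seconds": 130, "rating": 4},
]

def summarize(results, task):
    """Completion rate, average time for completers, average rating."""
    rows = [r for r in results if r["task"] == task]
    done = [r for r in rows if r["completed"]]
    return {
        "completion_rate": len(done) / len(rows),
        "avg_time": sum(r["seconds"] for r in done) / len(done) if done else None,
        "avg_rating": sum(r["rating"] for r in rows) / len(rows),
    }

# Comparing the broad vs. specific prompts side by side quickly shows
# which kind of task users struggle with.
```

Even a handful of testers summarized this way gives you numbers to track across test rounds, alongside the qualitative observations.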

I’ve covered user tests and satisfaction in much more depth in other articles.


“Qualitative” Does Not Mean Without Rigor

I mentioned that qualitative data is usually looked at as a “softer” and more malleable form of data than the quantitative stuff. Not totally true. While it’s easy to bias your quantitative analytics with poor tracking, poor analysis, or poor strategy, you can also easily (maybe more easily) pollute your qualitative data.

Like my university English literature professor said while knocking postmodernism, “you can always find what you’re looking for if you look hard enough.”

Think of it like this. If you have a strong suspicion that your form is too long, and you’re running user tests to find out if that’s the case, it’s very likely you’ll notice every little hesitation a user has when filling out the form. You’ll put on blinders to the testers who breeze through it without a problem.

Basically, if we’re not careful, we just see what we want to see. That’s a big problem with qualitative data, and there’s not an easy fix to it.

One big way to mitigate biases in decision making is to know that they exist and that you’re affected, too. No one is exempt.

Another way is to systematize your approach to analyzing qualitative data. Depending on what method of research you’re using, that could look like a few things.

For example, if you’re analyzing session replay videos, perhaps you have a scorecard with specific issues that you mark while watching the videos. You can scribble comments by these points, and every time you analyze videos, you do it with a cross-functional team of 4 other employees.

Perhaps you’ve systematized surveys to be analyzed into buckets of issues using classification algorithms. Or maybe you’re just looking for some new copy to write, so you snag a phrase you like and you’re fine with that.
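A bare-bones version of that bucketing could look like the Python below. The bucket names and keywords are invented for illustration; in practice you'd build the taxonomy from a read-through of a sample of responses first:

```python
# Keyword buckets for open-ended survey answers. These categories and
# trigger words are hypothetical examples, not a real taxonomy.
BUCKETS = {
    "pricing":   ["price", "expensive", "cost"],
    "usability": ["confusing", "hard to", "couldn't find"],
    "trust":     ["scam", "secure", "privacy"],
}

def bucket_response(text):
    """Return every bucket whose keywords appear in the response."""
    text = text.lower()
    hits = [name for name, words in BUCKETS.items()
            if any(w in text for w in words)]
    return hits or ["other"]

def tally(responses):
    """Count responses per bucket across a batch of answers."""
    counts = {}
    for r in responses:
        for b in bucket_response(r):
            counts[b] = counts.get(b, 0) + 1
    return counts
```

Deciding on the buckets before you read the full batch is exactly the kind of up-front systematization that keeps you from seeing only what you want to see.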

It’s all up to you. My point is that you should think about these things up front or risk post-hoc rationalization.

A lot of what the book Sprint covers is a framework for systematizing user feedback on early prototypes. It’s helpful to outline a system like this in advance and iterate and improve on it over time.


The topic of bias in research, especially qualitative research, runs deep and is well worth further reading.

In addition, make sure that your qualitative research is done in pursuit of insights you can actually take action on. Any type of information gathering is a cost, and you need to have proportional reward for that cost.

Most of the time, growth marketers are seeking out specific solutions to copy, design, or usability fixes, but you can also draw out more complex user psychology maps using frameworks like ELMR.

Whatever the case, make sure you’re actually using this stuff to grow your business, otherwise it’s just work for work’s sake.


As growth marketers, what we’re really working on is decision optimization, and we’re always operating under conditions of uncertainty (h/t Matt Gershoff).

More information, either through qualitative or quantitative data, can reduce that uncertainty (at the cost of collecting it). The best use of resources is to find the cheapest way to make an optimal decision at a certainty level you’re comfortable with.

Sometimes that’s quantitative data, as in the case of quick and cheap A/B tests. But sometimes it’s qualitative data (a few quick user tests are usually cheaper than building a predictive churn model).

Additionally, when you’re just starting out or when you’re trying to actually create better experiences, qualitative data is really your leg to stand on. It’s what can tell you your customers’ desires, fears, hesitations, and doubts better than any Google Analytics report ever could.