‘Wow, you captured some great information in this qualitative study. Can you quantify that for us?’
I can’t help but sigh every time a client asks that question. Quantifying qual is like hanging a Kryptonite chain around Superman’s neck. It might look pretty, but you just took away all the power.
One of qual’s most impressive features is its ability to highlight more nuanced responses. For example: when people answer the main question, but then tell you why they said that. Or – and I love when this happens – they figure out their answer by talking through the question and their thought process.
That’s gold, baby! You need to take those gems and run! Don’t bury them in a quantitative report.
OK, that was my soapbox. I will get down now and explain three ways that quantifying qual studies is dangerous.
- Too many perspectives in one bucket
The Olinger Group recently did a qual study where we talked to radiologists and oncologists. Both groups care for cancer patients, but they approach the problem from different angles. Lumping their perspectives together would have been unfair to each profession and to our client.
In another study, we talked to physicians who ranged from GPs to specialists. They worked in a variety of settings, including hospitals, group practices, and private practices. Aside from being physicians, our respondents had little in common.
Unsurprisingly, some of their responses were all over the board, and quantifying the answers would have been meaningless. Instead, we looked at the data and recommended that our client develop a multi-pronged approach to respect and address the variety of processes the physicians were working with.
- Soften the punch
Imagine a study where three out of 10 respondents did not like something. We could quantify the results and say that 70 percent of the people surveyed were favorable. Sounds great, right?
Now imagine if those three people really, really didn’t like the product. That is important information, especially if they give their reasons why. Especially if there is a pattern there that no one anticipated. Especially if that pattern could be addressed and the problem or barrier lessened.
However, quantifying that data would soften its punch, perhaps to the point that the decision makers and stakeholders might not even notice it.
- Funny math
Here are two (or is it three?) ways that quantifying qual data results in funny math.
- You can interview 30 people, and 20 people will love something while 22 would never use it. (Have fun with that.)
- “I can see how that service could be helpful for someone else.” (OK, but will you use it? What does that even mean?)
- “I love it, but no.” (Again, what does that mean? Is it a yes or a no?)
Bottom line
Qualitative and quantitative studies are both important, but they are not interchangeable. And they don’t always play well together.
To paraphrase Jim Croce:
You don’t put Kryptonite on Superman.
You don’t spit into the wind.
You don’t pull the mask off that old Lone Ranger.
And you don’t quantify your qual.