Monthly Archives: August 2010

Pro/Con: The Role of Social Media in Qualitative Research

Earlier this week on Twitter, some folks following the #MRX conversation had some strong reactions to a BNET article, “Why Social Networking Isn’t Customer Research.” One comment read, “I pity the fool who thinks social media leads to customer insights.” Really?
Then why did Quirk’s Marketing Research Review dedicate an entire issue to the topic last month? In the August issue, you could read about sampling social media data, best practices for surveying niche social media members and more.

Geoffrey James, author of the BNET tirade, gives some good reasons why you can’t understand what’s going on in a customer base through social media. Here are two that stood out to me:

  • Commenters are self-selected: “Real research involves statistical sampling of a random group,” he says. “People who comment are pre-disposed to comment, making their inputs statistically worthless.”
  • Paid commenting is endemic: “PR firms frequently ‘stuff’ comments with fake endorsements,” he says. “Contrariwise, competitors stuff comments with fake criticism.”

But Andrew Wilson, author of “When your consumers are talking online, here are some tips on how to listen” (registration required) in the August issue of Quirk’s, makes some valid points as well. Wilson says user-generated content (like that posted on Twitter, blogs, review sites, etc.) “should be viewed as a series of unstructured conversations that, when used appropriately, can add depth to and expand your understanding of who your customers are and their experiences with your products and services.”

Wilson recognizes social media’s limitations but still sees its value in qualitative research. No, it can’t replace your research – you still need to do that bulletin board focus group to understand your customers – but it can, in some situations, enhance your research efforts.

Throughout the article he provides examples of how user-generated content can be wrongly interpreted – and tips for maximizing the validity of your findings.

Is the “Online Focus Group” Changing?

Have you noticed that the meaning of the term “online focus group” is changing?
When we started doing online qualitative research in 2000, there were two basic types — the bulletin board focus group and the online focus group. At that time, the online focus group was defined as a real-time, text-chat focus group. The definition was clear and unambiguous.

Today, the term is evolving and creating confusion. One online moderator might use “online focus group” to mean a text-chat focus group, while another might use it to refer to a webcam focus group.

Today, the only consistency in the term “online focus group” is that the speaker is referring to a virtual focus group in real time. So, the next time someone asks you about an online focus group, you might want to ask for a bit more information before moving forward.

Don’t Make This Mistake With Your Next Bulletin Board Focus Group

When we talked to online qualitative research veteran Liz Van Patten of Consumer Advisory Panels last week about tips for engaging a bulletin board focus group, she brought up a great point that we thought deserved its very own post–the biggest mistake beginner online moderators make when conducting a bulletin board focus group.
 
(Drum roll, please…)
 
They ask too many questions!
 
“If you recruit participants based on a 30-45 minute time commitment each day, you have to really deliver on that promise,” Liz says. Otherwise, she says, you can expect engagement, participation and ultimately your results to suffer. Plus, that negative experience can spread by word of mouth, thus making recruitment more of a challenge for future projects. “It’s really about earning the participants’ trust by being truthful with them,” she explains. “If they feel they can trust the moderator and the process they will be more willing to open up and share their thoughts and feelings, which is the moderator’s goal.”
 
So how do you know how many questions you can ask in your bulletin board focus group? “It comes down to the math,” says Liz, who explains that you should allow 2-3 minutes to respond to the average online question–3-4 minutes (or more) if you want them to read and react to other participants. For a 30-45 minute daily time commitment, that works out to 10-12 questions each day. To make sure you’re on target, consider doing a run-through of the board yourself as you develop the discussion guide.
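If you want the math spelled out, here’s a rough back-of-the-envelope sketch (in Python, purely for illustration). The 30- and 45-minute budgets and the 3-4 minutes per question are simply the pacing figures Liz mentions above, so treat them as assumptions and adjust them for your own board.

    # Rough question budget for a bulletin board focus group.
    # The pacing numbers are assumptions taken from Liz's estimates above.
    def daily_question_budget(minutes_per_day, minutes_per_question):
        """Approximate number of questions that fit the daily time commitment."""
        return minutes_per_day // minutes_per_question

    print(daily_question_budget(30, 3))  # 30-minute commitment, quick questions -> 10
    print(daily_question_budget(45, 4))  # 45-minute commitment, read-and-react questions -> 11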
 
When creating your discussion guide, aim for a reasonable number of questions based on the time commitment you’ve asked for. If you find that you’re going over, consider combining repetitive questions or rethinking questions to be broader in scope. Liz also suggests asking a more general opening question and following up throughout the day with specific probes.

Secrets to Engaging Your Bulletin Board Focus Group

Engagement in a bulletin board focus group can be a tricky thing to gauge. For starters, you can’t see your participants. Plus, you’re not necessarily interacting with your participants in real-time. But those barriers don’t make engagement in a bulletin board focus group any less important–or impossible to achieve.

We talked with veteran qualitative research consultant Liz Van Patten, who runs Consumer Advisory Panels, to find out some of her secrets to engaging participants in a bulletin board focus group.

Manage their expectations from the beginning: Be real when talking up participation. “I tell them it should be engaging and entertaining,” Liz says, but doesn’t try to oversell the experience. “The message is ‘Let’s have fun with this, but I need to hear your sincere answers.'” She also stresses the importance of being truthful about time commitments. She says 30-45 minutes a day equals about 10-12 questions.

Remember, it’s a conversation: Liz starts a bulletin board by greeting each participant individually and asking them to post something about themselves, like where they live. “Then, I’ll comment on that,” she says. Liz explains that another secret is to use “I” instead of “we” (as in the online moderator and the client). “Responses are more open and honest when I talk about myself as an individual,” she says. She also pays attention to how she speaks when having a casual conversation and tries to communicate in a similar way as an online moderator.

Reward desired behavior: “It’s like training a child or a pet,” she says. “You reward behavior that you want to see happen and gently discourage behavior you don’t want to see.” Liz says if she wants participants to interact with each other, she’ll include it in the instructions, but that when she sees someone do it, she’ll deliberately thank them.

Be visual: Liz has found that incorporating visuals into the process helps keep participants engaged. She relies on colors and images of things like sticky notes to draw participants’ attention to certain areas, and she prefers inserting pop-up pages to writing “walls of text.”

Switch up formats: Liz suggests using different techniques to keep participants engaged over longer projects. “One week you might have them do a projective exercise, and the next week they keep a diary,” she says.

Is the market research industry slow to innovate?

I posted the following on the Next Gen Market Research LinkedIn forum. The discussion is a great read.

I’m intrigued by Tom’s contention that brands are becoming more innovative than agencies. I must admit that is our experience as well.

I agree with the comments about institutional conservatism and with the idea that researchers, by nature, tend to avoid risk. After all, a very valid definition of market research is that it reduces the risk of a decision. However, the greatest barrier to innovation that we encounter is that research firms have “tried and true” methods for making money. Therefore, research firms are VERY risk averse, so innovations face two major hurdles: effectiveness and profitability. These are huge hurdles for innovation.

Brands, on the other hand, have different hurdles. Though they can be risk averse, their hurdles are more likely to be effectiveness and cost. Cost is hugely different from profitability. Since we are in the online qualitative research arena, we are seeing brands come directly to us more now than at any point in the 10 years we have been in this space. Brand innovators are pushing their research firms to deliver, and when the firms don’t, the brands are bypassing them with increasing frequency.

3 Keys to Successful Online Concept Testing

Online concept testing shares many similarities with face-to-face testing – from the way participants are recruited to the types of concepts they’re shown. Going online also boasts many advantages over face-to-face: you can test more concepts before worrying about burnout, the format works well with iterative concept development, and you can test multiple groups simultaneously, not to mention the fact that it’s usually cheaper and faster.

But online concept testing isn’t foolproof. At 20|20 Research, we’ve helped hundreds of clients execute online concept testing projects using QualMeeting, our webcam focus group software, and QualBoard, our bulletin board focus group software, and we’ve learned a lot about what can make or break an online concept testing project. Here are three keys to successful online concept testing:   

1. Have concepts ready and test them first. It’s better to delay your online focus group than show up with no concept or one that doesn’t render correctly for participants.

2. Create professional concepts. Typos and sloppy concepts scream “I didn’t work very hard on this” to your participants. If it’s clear your team didn’t put much effort into the concepts, your participants won’t put much effort into their feedback to you.

3. Consider burnout. With online testing, you can show more concepts than F2F–but not that many more. Keep burnout at bay by limiting your concepts to five or fewer each day. If you have more concepts, it’s better to test across smaller groups than one large one.

Find more keys to successful online concept testing at 2020research.com.

How to Add Social Tools to Your Online Qualitative Research

Countless companies, including 20|20 Research, are experimenting with ways to reach current and potential customers using social media tools like Twitter, Facebook and LinkedIn, and in many cases they’re succeeding.

So what does that mean for market researchers? Can we add social tools to our online qualitative research toolbox? Yes and no, explains Tamara Barber in this week’s Quirk’s enewsletter. “While social market research is not a replacement for more traditional research, it can serve as a valuable complement to other insight-gathering techniques.”

A couple of her ideas for using it in online qualitative research:

To add a new recruitment channel: Hard to reach certain participants? “Think of tapping into open-affinity communities or LinkedIn as a way to do observational research or in-depth interviews with specific groups of people, such as certain types of business executives or enthusiasts,” Barber suggests.

To listen in on what consumers are already saying: Barber explains that consumers are already talking online about brands and that it’s easy to listen in. “Mining social media channels allows market researchers to understand, in the consumer’s own words, how well their — or their competitors’ — new product or campaign is being received on a real-time basis,” she says. “This can then inform future campaigns; be used to tweak messaging in the short term; or be a catalyst for further messaging research.”

The article also addresses the challenges in social market research, like dealing with information overload and unrepresentative samples.

Is There a Difference in Quality of F2F vs Online Qualitative Research?

Is face-to-face any better than online qualitative research when it comes to quality? That’s what one of my LinkedIn groups was mulling over last week.

This is a very complex question because there are many, many online qualitative research methods and many, many combinations of research objectives. And this doesn’t even begin to address the definition of “quality.” Since there is no easy answer, I’ll just toss in my 2 cents on a couple of items:

There is no doubt that the “data” from viewing people as they talk is largely missing from most online qualitative research methods. Viewing participants in person can reveal powerful insights, and the experience is insightful and memorable for the client viewer as well. This deficit is being overcome by online research software, but, frankly, it is still a downside to online. So, when a direct comparison is made using this criterion, online qualitative research often comes up short.

On the other hand, online offers many opportunities that F2F does not. I love the opportunity to do longitudinal qualitative research that several online methods facilitate (e.g., the bulletin board focus group). Longitudinal studies are very difficult and very expensive F2F. Online provides greater opportunity to go into the lives of a participant for a 360-degree view of their interaction with a product or service. Online also gives us tools that allow people to be totally open with anonymity. There are other advantages, but you get the idea.

So, at the end of the day, the question to me is not which is higher quality, but which methodology best fits my needs?

How to Engage in a Bulletin Board Focus Group

A bulletin board focus group is a terrific online qualitative research methodology if the moderator keeps participants engaged; otherwise, it can become little more than a series of open-ended questions. So how does an online moderator turn this methodology from hum-drum to WOW? Three ways:

  1. Set expectations upfront. During the qualitative recruiting process, be sure that participants understand what you expect. They need to know log-in expectations and participation expectations. Set the rules early and be sure they agree to them.
  2. Create engaging discussions. Ideally, the discussion itself is an engaging, high-involvement topic for the participants. However, often the moderator must take the lead and model the desired behavior for participants. This is especially important on Day 1. Create a lot of probes and generate discussion among participants. An online moderator who works hard on Day 1 will reap the benefits throughout the project.
  3. Offer proper incentives. The incentive should be in line with participant expectations. Doctors expect more than consumers. Experienced panelists from the major panel providers generally expect less. Also, be sure participants understand that they receive their incentive after the discussion concludes and only if they fully participate.

3 Keys to an Effective Hybrid Research Design

The current issue of Survey Magazine features an article I wrote on the growing capabilities of “hybrid” research design, particularly with the explosion of online research software.

In the research industry, “hybrid” research is quickly being defined as the integration of quantitative and qualitative research.  There are several options available in the market, including:

  • Chat intercepts during an online survey
  • Webcam intercepts during an online survey
  • “Smart” open-ended questions with automated probing
  • Bulletin board focus groups following an online survey

Each of these is optimal for different research objectives and problems to be solved.  A researcher who uses these methods should be aware of 3 keys to an effective design when picking a hybrid methodology.

  1. Depth. Some hybrid methods do not provide the depth of true qualitative research.
  2. Speed. Adding a qualitative “phase” to a quantitative project can push the schedule past an acceptable deadline.
  3. Integration. Is the quantitative and qualitative research truly integrated, or merely pieced together?

At 20|20 Research, we have answered many of these questions with QualLink, a simultaneous hybrid that creates a direct link between virtually any survey platform and a QualBoard bulletin board focus group. The method is deep, fully integrated and can often be completed before the survey closes.
