Archive for the ‘Qualitative General’ Category



No: One Simple Word for Better Research

Posted on: September 10th, 2018 by doyle

My friend and colleague Julia Eisenberg over at 20|20 Research published an article addressing why, when it comes to research, more isn't always better.

In these days, with every methodology at our fingertips, it can be harder than ever to find the right balance when designing a study. After all, we can reach our audiences and collect insights in so many ways. But when does too much of a good thing become a bad thing?

In Julia’s piece, found here, she suggests that the best way to gain the full picture and meaningful insights may mean we sometimes have to say no, keeping the scope of a project tighter or breaking it into phases.

At a high level, she suggests the following:

  • Reduce. Limit the scope of the project to two or three objectives and be overly critical of everything else.
  • Identify. Proactively discuss any potential issues and clearly identify expectations before research begins.
  • Protect. Guard your work and don’t let other stakeholders add unrelated objectives that draw attention away from the main focus.
  • Verify. Value the quality control process as much as the methodology being used. Check in early and check in often.

Three Tips to Avoid Surprise Objectives

Posted on: August 21st, 2018 by doyle

We’ve all been there. We ask thoughtful and detailed questions upfront to understand the business needs and the research objectives.  We use this information to create a study design and craft a targeted discussion guide.   And then suddenly, brand new objectives pop up and derail the carefully conceived plan. 

Here are three tips to help minimize the unwanted appearance of surprise objectives:

  1. Define success at the beginning of the project – and write it down. The definition of success varies from person to person, leaving room for misinterpretation, so make sure all key stakeholders are aligned on their expectations of a successful outcome. Document this in writing and be sure the definition of success is clearly stated on all collateral. If objectives change as the project evolves, update them and be sure they're clear and present at each stage of the project.
  2. Press for details. No question is a stupid question, especially when it comes to forming a partnership based on mutual agreement and understanding. Ask for as much detail upfront as possible – moderators asking the client, and the client asking their team members. When stakeholders and researchers begin to assume, we run a considerable risk of misalignment by the time the project ends. As curious, inquisitive researchers, our job of collecting information begins well before the first respondent engages – we must engage our stakeholders from the start by asking smart questions, clarifying any lingering assumptions, and confirming that we are meeting expectations.
  3. Check in early and often. Throughout the project, plan check-ins at key milestones where important information is shared efficiently. This gives ample opportunity for reaction, response, and redesign if needed, or to bring attention to initiatives that aren't aligning with expectations. When everyone participates in a well-designed, efficiently executed process, we can all share in the successful outcome and diminish the potential of surprise objectives coming to light too late.

Have any tips that have worked well for you? We’d love to talk further about how to avoid or manage surprise objectives.

 

Quant, Qual, and Quant+Qual: Addressing Data Quality

Posted on: August 8th, 2018 by doyle

My good friend and colleague Jim Bryson at 20|20 recently shared his thoughts in a blog post about qualitative research and its value for protecting sample and data quality. I agree with Jim's key point that data fraud is a huge industry problem. Another concern I have is that, despite massive panel sizes, only a fraction of those panelists are active. That means we are relying more and more on a smaller and smaller pool of participants for the information on which huge business decisions are based. It seems to me that the panel industry is ripe for disruption.

As for qualitative and the role it can play in addressing both issues, there is no question that qualitative recruiting can be more precise and more accurate, because the smaller numbers required allow for a high-touch recruit. It is much harder to be a "cheater or repeater" when you are "face-to-face" with the recruiter and the researcher. However, qualitative research is not always the solution our clients are seeking, and it is in no way a 1:1 replacement for a true quantitative study. That is why hybrid research is surging in popularity.

The ability to scale qualitative, quickly and efficiently, makes it a viable alternative or supplement to more traditional quantitative research. For example, 30-50 online chat interviews might just meet your need for a small-scale quant study, while still getting you the VOC insights that qualitative provides.  Best of both worlds.

Another approach the Doyle team is using more often is conducting a full-scale quantitative study supplemented by a small number of qualitative interviews. Respondents meeting key criteria can be routed to the qualitative exercise immediately after completing the survey, or they can be hand-picked after the fact to amplify specific quantitative findings.

While qualitative and quantitative research serve distinctly different purposes, the increasing capabilities available for blending them produce multiple benefits worth considering.

A Look at Our Time-Tested Project Process

Posted on: July 23rd, 2018 by doyle

Clients often ask me what makes Doyle Research special, how we're different, and how we consistently deliver impactful insights. To answer that, I always fall back on our tried-and-true project process. Sure, we have always been innovative in the methodologies we recommend, and sure, our team includes some of the best and brightest in the industry. But fundamentally it starts with a little extra effort upfront to really drill down to the core needs, and with partnering along the way to be sure we deliver the best output.

Here are the seven key steps we divide our projects into and what’s included in each.

Step 1: Understanding objectives

Our conversations always start broad.  As in, “What are three key takeaways you’d like to get from your research?” or “What business problem are you trying to solve?”

Sometimes it's big picture, exploratory, or diagnostic, trying to understand a decline in sales, or a recognition that a competitor is making gains. Other times it's more tactical — we need to pick a package for this cereal or know which ad to run for our new campaign. But starting with the broadest possible understanding of the reasons behind the research ensures we are aligned on the necessary outcomes.

Step 2: Picking the methodology

Once we understand the business question, we start to think about methodology. Audience plays a huge part in this — we always want to choose the approach that will be best aligned with the lifestyle or situation of the people we need to get information from. And the type of learnings we need is also a big driver. Are we trying to inform product development or R&D? Then seeing people use the product or perform the task can be key. Are we trying to identify key milestones in a customer journey? In that case, maybe online journaling with a video component is the right fit.

We are not beholden to or invested in a particular approach, and in fact one of our favorite things is to blend different tools and processes. Often there is no single right answer for which methodology is “best.” Instead, it’s a question of evaluating all of the different factors — timeline, cost, audience, outcomes, internal stakeholders, topic, background — and coming up with the best approach for success.

Step 3: Assigning the team and kickoff

Our goal is always to create an easy experience for our clients. We want them to feel confident that their work will run smoothly, be handled accurately, and get them what they need, so we have a dedicated project team that stays with the project the entire way, making sure that information transfers and details are managed accurately and efficiently.

To ensure that we all start on the same page, we initiate every project with a kick-off meeting that includes all internal and external stakeholders. We review previous discussions to ensure alignment on objectives, timelines, audience, and other expectations.

Step 4: Action!

Once we complete that step, typical milestones during this phase include:

  • Screener development
  • Recruiting
  • Programming (if a digital method is being used)
  • Discussion guide or survey development
  • Fielding / moderation

Each of these serves as a key touchpoint with our clients, whether we are keeping them up-to-date on the pace of recruiting, getting their input into the discussion guide, or helping them schedule stakeholders to observe virtual focus groups. In the spirit of transparency and partnership, nothing happens without them having a chance to weigh in.

Step 5: Analysis

This is where we really start to see the fruits of our labor. Our research analysts are a part of the project team, which means that by the time they sit down to write a report, they are already steeped in the background of the study and understand the nuances of how the project came together.

The payoff is learnings that are not just accurate, well thought out, and actionable, but that are also infused with the intangibles that accompany any project.

Step 6: Presentation and wrap up

Delivering a report is an exciting moment – it’s the precursor to the most important part of the work we do, which is helping our client digest, understand, and begin to apply the learnings.

Whenever possible we like to present the results of the research to our clients, whether in person or on the phone. Time and again we hear how valuable this phase is. Our team has been intimately engaged with the research questions for weeks, and we are in the best position to explain the findings and answer questions as they occur. To our minds, without this step the project isn't done.

Step 7: Debrief

After we send our client off with their report and are confident that they have what they need to move ahead, we're still not done. Our final step — and again, a critical piece of our success — is to conduct an internal debrief. While it's fresh in our minds, we make sure to circle up with the team and discuss what worked, what could have been better, and the things we definitely want to avoid next time, as well as those we want to try again.

Whether it's a hybrid quantitative/qualitative study, a six-month online community, or a series of in-person focus groups, all the work we do for our clients follows this same basic recipe. It's been tested and refined over time, and while the details can vary widely, we know from experience that it's getting these fundamental steps right that sets us up for success. Want to hear more? Simply contact us – we'd love to chat.

Revolutionary Developments in Research Best Practices: Insights from IIeX 2018

Posted on: June 25th, 2018 by doyle

From the desk of Carole Schmidt 

Well, all y'all, IIeX-NA in Atlanta last week made me happy as a pig in mud!

This was no southern drawl of a conference. IIeX, held recently in Atlanta, was packed with revolutionary developments in research best practices, scores of technological advancements in quant and qual tools, and a buzzing hornets' nest of collaborative energy.

 Here are my thoughts on five hot topics that emerged over the course of the event.

What the heck are blockchain, the "right to be forgotten," and the forward-thinking new Vermont law? And why should I care? As technologies advance, consumers are increasingly able to lock up their personal data warehouse and throw away (or more specifically, encode) the key. The future of market research lies in consumer control: "I decide what to share, when to share, and how much to share." Facilitating the transfer and decentralization of control is blockchain. What's great about this? As researchers, we will soon gain a 360-degree view of the consumer—via a channel by which we can directly access the consumer's knowledge warehouse.

Brokers and panel companies that collect and sell consumer data for advertisers are facing a particularly stringent new set of regulations in Vermont. The new law is aimed at cracking down on those who make a living tracking users' personal information. Data brokers must report information about their data collection activities publicly and have opt-out policies for consumers. They must also disclose security breaches that happened in the prior year—including the total known number of consumers affected. Vermont acted over concerns about online privacy. Brokers are usually invisible because they don't directly interact with consumers, who share data, knowingly or not, on platforms like Facebook, airline reservation sites, and other sites. From my perspective as a consumer, cheers to the people out there protecting me when I might not know fully how to protect my data myself.

Improving the respondent experience starts with “We need to stop asking questions that consumers can’t answer!”  It’s time to uncomplicate the survey, the journal assignments, the discussion guide.  Let’s get critical of HOW we ask questions! Are we using casual, human language? Are we keeping our questions simple? Are we editing to ensure we’re only asking what we really need to know to minimize the time commitment on the respondents’ part?

How else can we engage respondents more effectively? More productively? Use videos to welcome, inform, and instruct respondents on surveys and assignments. Embrace the use of avatars, colors, icons, and/or code names for respondents to encourage more candor from online participants. Use photos, images, or emoticons for responses instead of always relying on numeric scales. A memorable point from Misty Flantroy at J. M. Smucker Company: "Turn to the person to your right. Start a conversation and have them respond only with a number from 1 to 5!" Point taken.

 Get creative and utilize digital and product bonuses (tied to deadlines) to instill the urgency to complete surveys; e.g., “the first 50 to complete this survey or assignment are automatically entered to win an iPad!” Yes, these efforts lead to richer insights, and ultimately, smarter business decisions.

 “When people feel they’re being heard, they’ll tell you anything.” One-on-one, face-to-face conversations were cited by many presenters as being ever more essential for savvy marketers in this behind-the-screen, digital age.  Though described by a few as “old school,” face-to-face research is anything but, because customers and consumers are hungry for human-to-human connection. 80% of communication is non-verbal. Follow up large-scale surveys with a select number of webcam interviews or small group discussions. Visit your customer on site, or your consumer in home, wherever consideration and purchase decisions are being made. Walk in their shoes and listen. Querying the behaviors, exploring the attitudes, influences, influencers, and drivers behind those behaviors gets us on the other side of the screens, where real decisions are made. 

Force "quiet reflection" into research practices. Say what? Up-end how we observe and ingest the research we do, to get more bang for the buck. More than one roundtable discussion at IIeX-NA revolved around getting teams to put away the laptops, the phones, and the often self-important "I've gotta step out to take this call" behaviors. You're spending good money and allocating time to listen to the voice of the customer, so listen, dammit! Get out of the office and meet your customer where they work and live. Take notes in notebooks—gasp!—by hand, with a pen! Draw pictures, ask questions, challenge each other, build in work sessions—and never put the project fully to bed until everyone on the product or brand team has played an active part in the research. As one client mentioned, "we bring research to the executive table as a valued voice of the team."

Deliverables should inspire stakeholders to feel something. Shift our report writing approach and see ourselves as "Insight Journalists." What's the storyline? What's real news? What's important and relevant? Explore the "opening argument" approach, like that of a trial. Research learning should strive to create movement within the organization. Consider using other media formats to deliver or reinforce insights and action: wall posters, composites/personas, podcasts, documentaries. Incorporate "points of reflection" or "challenge the learning" or "how do we take action" slides as dividers in our insights decks—that is, make time to stop along the way and talk about: What have we learned? What do we do about what we learned?