Recently, we’ve read and written quite a bit on the subject of online qualitative research methods’ viability as a substitute for traditional offline methods; however, we as researchers shouldn’t always think about these methods in an either-or sense. When used in tandem, online qualitative methods can lend additional insights and deeper rapport to your traditional qualitative studies.
In more and more of the offline qualitative studies that we at Accelerant Research conduct for client organizations, we find ourselves creating online homework assignments for participants to complete in advance of their scheduled in-person focus groups or follow-up online exercises to be conducted after the in-person groups wrap up. Assigning these homework tasks requires very little additional time for recruiters and comes at minimal additional cost.
The following are just some of the many potential benefits of conducting online discussions in advance of your traditional qualitative research study:
Building rapport with your respondents. As all qualitative research consultants know, establishing trust and getting your participants to open up is an art. Giving your participants the opportunity to virtually get to know their moderator (as well as other participants) prior to meeting them – and to do so in the comfort of their own homes/offices – helps to increase their comfort level greatly. That’s not to say that you won’t still have to perform your moderator magic to get your participants engaged when the groups begin, but it certainly helps you take preemptive steps toward that end. We often find it helpful to kick off the online sessions with a brief video that introduces the moderator and lays down some initial ground rules for the study.
Getting participants thinking about the subject matter. Giving your respondents time, during the days leading up to your offline qualitative sessions, to talk about and reflect on their experiences with whatever subject matter you are studying helps drive them that much closer to the “sweet spot” frame of mind that will yield the deepest insights for both moderator and end-user clients.
Documenting behaviors or product usage. Asking participants to keep a journal or log of relevant behaviors (e.g., nutrition diaries, travel journals, product usage logs, countless other examples) leading up to their scheduled in-person sessions can yield incredibly informative insights. Instructing participants to use an online qualitative platform for organizing these records can save hours of time a moderator or research assistant might otherwise spend compiling this information from participants’ disparate note-taking sources.
Simpler execution of projective exercises. Projective techniques are fantastic ways to get at some of the underlying emotional connections participants have with the subject you’re studying. However, in some cases, pulling these exercises off logistically in an in-person setting can be quite an “arts & crafts” time suck. Giving your participants collage-building, storytelling, or perception mapping exercises before or as a follow-up to your group discussions can be incredibly insightful, and doing so online is much more efficient (from both an execution and analysis standpoint).
Early predictions on who to “pay and send.” No matter how strong the recruiter was or how well-crafted the screener, we’ve all had situations where more respondents than needed show up for groups and we end up wishing we had a do-over on who we decided to keep versus excuse. Using online homework assignments, you can get to know which of your participants are the best communicators. Plus, having an idea of which of your participants would be the appropriate candidate(s) for dismissal can help to save you from having to spend the first several minutes of a focus group in the back room flipping frantically through your respondent grids and re-screeners to decide who stays and who goes.
Trimming time from an already crowded discussion guide. Research clients are under an ever-increasing amount of pressure to squeeze as many insights as possible out of a given research project. The above methods are just a few that can help to trim precious minutes off the length of your in-person qualitative sessions, opening up opportunities to explore your subject matter at a deeper level or even to start tackling more of the “time permitting” sections of your discussion guides.
For more information about Accelerant's online qualitative and white glove recruiting services, please visit us at www.accelerantresearch.com, email us at firstname.lastname@example.org, or call us at 704.206.8500.
Wednesday, April 18, 2018
Monday, April 2, 2018
Last month, we suggested that, due to changes in business conditions, it may be a good time to revisit the value of the historical data collected in a customer satisfaction tracking study, and to consider whether revisions could improve the value of the data collected while reducing costs. As an example of how things can change, we noted that the advent of various technologies continues to change consumers’ opinions and expectations of brands, and that these new conditions may change consumers’ purchasing behavior and underlying brand loyalties, causing them to “redefine” their criteria for customer satisfaction. As such, satisfaction measures that are currently tracked, but were identified as key drivers in a study conducted prior to these societal changes, may have decreased in importance, while others may have increased in importance and reached the status of key driver, yet are not being tracked at all. Overall, the rank order of importance of key drivers may have changed in such a way as to render current tracking programs ineffective in doing what they are designed to do: monitor the effectiveness of various customer programs and initiatives in lifting overall customer satisfaction.
Assuming it’s time to re-assess and refresh key drivers, a number of excellent opportunities emerge to manage your research budget. First, by conducting a new key driver identification study, you can leverage its findings and include in your tracking survey only those items found to be key drivers, i.e., pare the data collected down to only what is needed.
This is where key driver studies pay for themselves. That is, you can avoid spending research budget on measurements of things that do not have a strong relationship with overall customer satisfaction (or another business-based dependent measure). Eliminate the “nice-to-know” survey items and keep your tracking questionnaire as brief as possible. A good rule of thumb for designing customer surveys is to keep them at 10 minutes or less. This amount of time is quite adequate to fit several topics of survey items that map back to key drivers identified in the previous study, as well as to accommodate important open-ended questions in which customers get to provide their opinions and feelings in their own words.
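To make the idea of a key driver analysis concrete, here is a minimal sketch in Python, using entirely hypothetical attribute names and ratings, that ranks survey attributes by the strength of their correlation with overall satisfaction. Real key driver studies typically rely on multivariate techniques such as regression or relative weights analysis (and far larger samples), but simple correlation illustrates the principle: attributes with weak relationships to the dependent measure are candidates for elimination from the tracker.

```python
from statistics import mean, pstdev

def pearson(xs, ys):
    """Pearson correlation between two equal-length rating lists."""
    mx, my = mean(xs), mean(ys)
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (pstdev(xs) * pstdev(ys))

# Hypothetical 1-10 ratings from six customers, per attribute.
ratings = {
    "wait_time":      [3, 5, 4, 8, 7, 9],
    "staff_courtesy": [6, 6, 5, 7, 6, 7],
    "price_value":    [2, 6, 5, 9, 8, 9],
}
# The dependent measure: overall satisfaction from the same six customers.
overall = [3, 6, 5, 9, 8, 9]

# Rank attributes by strength of association with overall satisfaction.
drivers = sorted(
    ((attr, pearson(vals, overall)) for attr, vals in ratings.items()),
    key=lambda kv: abs(kv[1]),
    reverse=True,
)
for attr, r in drivers:
    print(f"{attr}: r = {r:.2f}")
```

With this toy data, an attribute like the hypothetical staff_courtesy falls to the bottom of the ranking, flagging it as a “nice-to-know” item that could be cut to keep the tracking questionnaire under the 10-minute rule of thumb.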
Another opportunity is to review the methodology used in past tracking efforts and consider whether a less expensive process of data collection can be deployed without a loss of research quality, e.g., switching from CATI to web or interactive voice response (IVR), provided that sample representativeness can be maintained. Methodology changes can sometimes cut the research expenditure in half while delivering the same value.
And yet another opportunity exists during these tracker reviews in the form of re-bidding the study with a new set of research suppliers. A research manager’s fiduciary responsibility to his or her employer is to maximize the bang for the buck of the organization’s research spend. Indeed, there is nothing like an RFP for a tracking study to force research firms, both the incumbent supplier and prospective ones, to sharpen their pencils in pricing their offers of tracking services.
The third and final installment of “Have you thought about your tracker lately?” will be published next month. In it, we lay out a step-by-step process for migrating from one method to another and/or one supplier to another while minimizing the loss of historical data. This migration, when done with some care, can enable an organization to maintain at least some, if not most, of the historical data gathered in previous waves of tracking research.
Tuesday, March 20, 2018
When I was a young pup in the market research industry, circa 1985, I was in graduate school working on my doctorate in psychology and serving as a graduate student intern at JC Penney headquarters in Manhattan on 8th Avenue, across from the Winter Garden where Cats had been playing for years and years. It was a very exciting time in my life, as I was beginning my career juxtaposed between “school life” and “work life,” and being challenged to reconcile the stark differences between the two. Every day, I toggled between the conceptual and the practical in my pursuit of becoming a research professional; some days, I’d toggle so hard, I’d pull a muscle and need to stretch in order to work out the kinks that had developed.
At Fordham University, I was trained to write like an academician. At JC Penney, I was criticized for doing so and urged to write simply, succinctly, and in a style that would enable the reader to become enmeshed in the words I put forth. I must admit, it was a battle waged in my head, and I ultimately allowed the academic side to win. I thought that if my writing style reflected my educational training, displayed through multi-syllabic, highfalutin language, others would see me as “smart.”
Man, was I stupid. I had no intention of becoming a professor; rather, I had set my sights only on being a practitioner. But I couldn’t help leading with my degree, declaring without saying, to everyone in the room at almost every given moment, that I was highly educated. Throughout my internship, I never strayed from that posture and, in hindsight, it never benefited me.
Then I graduated and went to work for a large advertising agency in New York. Early on, I got my ass kicked for writing reports in a style better suited to publication in a scholarly journal. I was no longer an intern; now I was a paid employee working on studies that were delivered at high cost to our clients. But I remained steadfast in my style, stubbornly clinging to my degree and what (I thought) it meant to others in my sphere of colleagues. Needless to say, my boss was not patient with my learning curve as I struggled to write reports, and even questionnaires, that were easy to understand and digest, and that clearly tied back to the purpose of the study and the business objectives at hand.
As time went by, I was assigned to a new boss who took me under his wing and worked with me closely. I’ll never forget one (late) night at the office, working on a report that needed to be delivered before deadline, when he said, “Your report needs to read like you are telling a story.” It needs a beginning that draws in the reader, a middle that states what the study found, and an end that draws conclusions and provides closure such that the reader takes away the key learnings of what the study was designed to discover. In essence, the whole report must be a story that leaves the client with clarity on what decisions need to be made to drive business and why.
That experience was a career milestone for me. It was a great lesson that, not only provided the motivation to improve, but taught me the lesson that my work was not about me. Rather, it forced me to stare straight into the eyes of the main purpose of what I was doing -- helping clients figure out how to improve their own businesses. It finally dawned on me that storytelling was the way I was going to practice my profession. As a psychologist, I was trained to be compelled to help my clients in any way I could, and so I took this lesson about storytelling and ran with it as a way to do just that.
Fast forward to today. Our firm, Accelerant Research, has instituted a kind of “teaching hospital” model whereby we hire some staff at an entry-level position as Project Analysts. They come to us without any particular set of market research skills, and we train them to become professionals by having them work directly with seasoned veterans who teach them all of the skills they need to become proficient as market research suppliers. They first learn how to program surveys and then how to write them. They learn how to design PowerPoint decks that will serve as the final report, how to read crosstabs and populate .ppt slides with results, and then how to write the report itself. These rookies in the biz leverage previously written deliverables to learn how to execute the entire development of a final deliverable that is “client-ready.” What they get from archived studies is a set of stories told across a wide variety of study objectives, methodologies, industries, and findings. Each one reads like a story with a beginning, a middle, and an end.
The secret to success as a market research supplier is hardly a secret at all. Rather, the key is widely known and has been around for a very long time: design and deliver to clients the answers sought by the research undertaking, and format those answers in a way that laypeople (the folks actually paying for the research) can easily understand and digest. The products of our efforts as research professionals must take the form of a story, such that the conceptualization of the findings and recommendations fits like a glove into the client’s need to inform marketing strategies. At Accelerant Research, delivering insights in this manner is our ultimate promise.