The THINKerry
Article by Jack (Xinkang) Du
Connect with Jack on LinkedIn
August 31, 2022
Foreword by Kerry Edelstein
Connect with Kerry on LinkedIn
FOREWORD
During the pandemic, I had the good fortune to guest lecture in a remarkably thorough and rigorous market research class offered virtually by my alma mater, the Anderson School at UCLA. In that class, one of the full-time MBA students stood out immediately – Jack Du. So I felt very fortunate when he reached out to us to inquire about an internship.
Born and raised in China, then a resident of New Zealand for a decade, Jack came to us with a delightful Kiwi accent and many years of experience as an educator. When he embarked on his internship at Research Narrative, he was halfway through his second master's degree. From then on, we referred to Jack as "the most overqualified intern ever."
Jack was, and still is, building an entrepreneurial endeavor of his own – a language learning game called Lost Abroad. When he began working here, he’d recently written a survey to do some research on the market opportunity for his game, and he began asking me questions about sampling.
I stopped him immediately and said, "Instead of giving you the answers, let's make this a project." I figured it was a great way to teach Jack both sampling AND the methodology of secret shopping. But it also turned out to be a great primer on the challenges of sampling – and how those challenges affect our industry on many fronts, from DIY pitfalls to accurate cost estimates. I found Jack's experience fascinating and insightful, and I hope you will too.
– Kerry
LOST ONLINE – by Jack Du
Earlier this year, I was doing some market research for my EdTech startup – a language learning game – and needed a sample provider for a market survey we wanted to conduct. Having aced a rigorous market research course at UCLA Anderson last summer and worked at Research Narrative for 8 months as an MBA intern, I was feeling good about finding a sample partner who could meet our needs.
Indeed, after a couple of minutes with Uncle Google, I landed on the homepage of a major research company – we'll call them "Company A." The page looked clean and modern, with the "Get Started" button nice and big in the middle. "Good UI," I thought. I was off to a great start! Sadly, that's where all the good thoughts ended. Upon clicking the "Get Started" button, I was immediately prompted to create an account.
That seemed premature. I wondered, “Don’t you need to know what I’m here for, first?”
I was puzzled but proceeded to create an account anyway. And that’s when a whole page full of questions shot up in front of my eyes, with boxes and sliders in all shapes and sizes. Survey length. Days in the field. Prescreening questions…
"Ok, I see where this is going, but it'll probably scare the crap out of a newbie!" I mumbled, believing myself to be a veteran capable of navigating the situation.
Well, that confidence didn't last long – karma is a b*tch. About 30 seconds into answering questions, it was my turn to feel like a newbie as I tried my best to estimate the "incidence rate" – roughly, the share of people screened who actually qualify – for my target audience: Americans aged 13-35 who are interested in learning Mandarin Chinese and play casual games. To give the page designer some credit, there was a small-font line explaining the concept of incidence rate, but it did nothing to help me determine what it actually was for my survey. Also, incidence rate among whom? They never showed me how they were going to prescreen on their end.
I did some ballpark estimates and typed in a 33% IR, with a target budget of $3/complete. The page instantly returned an estimate of 235 field days for an N=500 sample.
My jaw dropped. 8 months!? For 500 completes? What?? And before I could pick my jaw up, the website was asking me to check out and type in my credit card info! I jumped up, livid: "You don't even know if what I just typed in is good or not, and you want to charge me a grand and a half!?"
We joke at Research Narrative that UIs like this – that jump straight to a transaction without knowing what you need – are "asking for your hand in marriage on the first date." I never closed a browser tab that fast in my entire life. The aggressive UI literally scared me off. In retrospect, Company A never asked me for my quotas, so even if I had been okay with the pricing and the UI, my survey results would have been "GIGO" – garbage in, garbage out.
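For readers wondering how a self-serve quote engine could land on a number like 235 field days, here is a rough back-of-envelope sketch. To be clear, this is not Company A's actual formula – every panel parameter below is an invented assumption – but it shows how a niche audience and a bargain per-complete budget can stretch field time into months.

```python
# Rough sketch of a field-time estimate. NOT Company A's actual algorithm:
# every panel parameter below is an invented, illustrative assumption.

def estimated_field_days(n_completes, incidence_rate, daily_invites,
                         response_rate, completion_rate):
    """Days needed to reach n_completes under simple panel assumptions."""
    daily_starts = daily_invites * response_rate  # people who open the survey each day
    daily_completes = daily_starts * incidence_rate * completion_rate
    return n_completes / daily_completes

days = estimated_field_days(
    n_completes=500,
    incidence_rate=0.33,    # my ballpark IR
    daily_invites=25,       # hypothetical: few invites go out at a $3 CPI
    response_rate=0.30,     # hypothetical panel response rate
    completion_rate=0.85,   # hypothetical share of qualifiers who finish
)
print(f"Estimated field days: {days:.0f}")  # ~238 days with these made-up inputs
```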
After that most unexpected "marriage proposal," I decided to only go with a sample provider with whom I could have a pleasant conversation first. It was destiny that I then met Suzie at "Company B," who responded warmly after I submitted my short and sweet note (kept short by a strict word limit) through the "Contact Us" form on their website:
“Hi, my name is Jack. I’m an MBA student at UCLA starting an EdTech company making language learning apps. I’m in the market research phase and I have created a survey on Qualtrics. I’d like to recruit maybe 500 respondents who are interested in learning languages, and I’d like a quote on pricing and the time it will take for the recruitment. Thank you very much for your help! Kind regards”
Suzie was delightful and efficient – her quote came in only 20 minutes after her initial email.
The quote looked pretty good at first glance – except that, due to the word limit, I had never mentioned the target segment, the incidence rate, or the length of the survey.
“Wonderful, she’s a mind reader!” I thought to myself.
At this point, thanks to the training from my market research class, I remembered that there is such a thing as survey fraud. So I immediately drafted an email asking Suzie if they had any fraud-mitigation measures to catch those "survey mills" on the other side of the world.
She replied in her soothing way: "we have strict security settings on our surveys… We do implement security and monitor carefully for 'bad actors' through various methods to ensure you are not getting fraudulent responses." Largely nonspecific, but at least it was a conversation I couldn't have had with the first company.
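Suzie never spelled out what those "various methods" were, but there are basic data-quality checks any researcher can run on their own export after fielding, regardless of what the panel does. Here is a minimal sketch – the column names and thresholds are hypothetical, not Company B's – that flags speeders, straightliners, and duplicate respondents.

```python
# A minimal sketch of post-field data-quality checks an analyst might run
# themselves. Column names and thresholds below are hypothetical.
import pandas as pd

def flag_suspect_responses(df, grid_cols, min_duration_sec=180):
    """Flag speeders, straightliners, and duplicate respondents."""
    flags = pd.DataFrame(index=df.index)
    # Speeders: finished the survey implausibly fast
    flags["speeder"] = df["duration_sec"] < min_duration_sec
    # Straightliners: gave the identical answer to every item in a grid
    flags["straightliner"] = df[grid_cols].nunique(axis=1) == 1
    # Possible duplicates: the same IP address appearing more than once
    flags["dup_ip"] = df.duplicated(subset="ip_address", keep=False)
    flags["suspect"] = flags.any(axis=1)
    return flags

# Usage (assuming a Qualtrics export loaded into `responses`):
# flags = flag_suspect_responses(responses, grid_cols=["q5_1", "q5_2", "q5_3"])
# clean = responses[~flags["suspect"]]
```

The specific thresholds matter less than the principle: fraud mitigation is something you can verify yourself, not just something you take on faith.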
This was when I realized that, beneath the warm demeanor and pleasantries, the delightfully responsive Suzie had overlooked one foundational requirement that the faceless algorithm of Company A had ignored as well: the actual mechanics of sampling.
In both cases, I didn't know whether our survey would reach our target audience in the right areas, with the correct age, race, and gender ratios. But I guess if it's cheap, who cares if your 500 people were all men from Kansas City, or from China or Eastern Europe, each with 8 phones hooked up to a VPN, clicking survey options at random at the speed of light? What Companies A and B had in common was that they seemed to prey on inexperience and jump straight to budgets without confirming sampling basics.
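For what it's worth, quotas aren't exotic. They are simply targets for how the completed sample should break out by gender, age, region, and so on, which the provider then manages in the field. A minimal sketch of what that looks like, with purely illustrative categories and proportions (not my actual targets):

```python
# A minimal sketch of quota targets and a check against the achieved sample.
# Categories and proportions are purely illustrative, not my actual quotas.
from collections import Counter

quota_targets = {            # desired share of the N=500 sample
    "gender:male": 0.49,
    "gender:female": 0.51,
    "age:13-17": 0.20,
    "age:18-24": 0.40,
    "age:25-35": 0.40,
}

def quota_gaps(respondents, targets, n_goal=500):
    """Return how many completes each quota cell still needs (negative = overfilled)."""
    achieved = Counter()
    for r in respondents:                      # r: dict of respondent traits
        achieved[f"gender:{r['gender']}"] += 1
        achieved[f"age:{r['age_bracket']}"] += 1
    return {cell: round(share * n_goal) - achieved[cell]
            for cell, share in targets.items()}

# Example: quota_gaps(completes_so_far, quota_targets)
# A provider that never asks for targets like these has nothing to manage against.
```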
But I did not give up. It takes perseverance to find true love, they say. After those two devastating heartbreaks, though, I decided to take a methodical approach and play the numbers game: I contacted another seven sample providers and put together an Excel spreadsheet (like a good MBA student would) as an evaluation study on the "love at first sight" question.
I'm very glad I did this, because I met some really interesting souls. There was no second robot like Company A, hallelujah, but there was the Mr. "If your budget is less than $3k, let's save each other some time" big shot; the Ms. Flaky who promised "Let me send you a quote after this call" but never did; and the Mr. "I will send you another intro email because I forgot who you are." Whether these companies understood sampling, I never found out, because they didn't understand communication or respect.
Eventually I did find diamonds worthy of courtship: a couple of companies really impressed me. They were warm and helpful. They asked all the right questions from the get-go, and they answered all my questions with great honesty, introspection, and insight. I really wanted to go on a second date with them, but sadly, they were way out of my league – 1.5x or even 2x as expensive as Suzie's quote – and I simply did not have the budget for it. In the end, I couldn't conduct the survey at the time. What a heartbreaking tragedy!
As I later learned through my internship and now my position as a research analyst, there was a solution: work with one of the companies that asked all the right questions and consider either a smaller sample size, such as N=300, or broader screening criteria to first validate whether a more niche target was even the right one. But in talking to nine sample companies, not one helped me understand that, let alone suggested either option as a sensible alternative.
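To put the N=300 option in perspective: the standard margin-of-error arithmetic (95% confidence, most conservative p = 0.5) shows that dropping from 500 to 300 completes only widens the interval from roughly ±4.4% to ±5.7% – often an acceptable trade for an early, directional market-sizing survey.

```python
# Margin of error at 95% confidence, conservative p = 0.5.
import math

def margin_of_error(n, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

for n in (300, 500):
    print(f"N={n}: ±{margin_of_error(n):.1%}")
# N=300: ±5.7%
# N=500: ±4.4%
```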
My less-than-ideal experience probably isn't unusual, and it has important implications for fellow entrepreneurs and brands who expect to DIY their way through survey research. I thought I could do it myself; after all, I took a graduate course on this topic and interned in this field. But DIY really requires knowing the "I" (as in "It") part with tremendous expertise. Yes, you will get answers on a budget, but they won't truly answer the survey questions you asked. Because the devil isn't just in the details of your survey questions – it's in the sampling questions you did not ask, and that no one is asking you.