As of this morning, the product page for the Kia Niro has a module with a still from the Melissa McCarthy ad that lets you watch it again. Why not add some information about the environmental causes McCarthy espouses in the ad? The NFL page has no mention of the ad with the babies in it, which seems odd for an organization that’s struggling to promote youth football. Bud’s immigration-story ad features prominently on the brand’s home page today but has no follow-up, such as Adolphus Busch’s real story or Anheuser-Busch’s pioneering role in American brewing.
Let’s talk about one advertiser who got it right with, ironically, the most controversial spot of the night: 84 Lumber’s “Journey” ad.
As part of a promotion for my two new ebooks, I’m sharing selections from “Building a Better BS Detector” parts 1 & 2 about market research. Interested in reading more? Email me at firstname.lastname@example.org.
Methodology Means a Lot
Confession time: I don’t really know what a “standard deviation” is. For that matter, I have never met a standard deviate (thank you, ladies and germs). However, I know enough about research methodologies to know when I can trust conclusions. Every marketer needs to know some basics.
Sample size. For qualitative research (focus groups, ethnographies, etc.), sample sizes mean relatively little, since insight comes from depth, not breadth. For quantitative research, they mean the difference between relevance and irrelevance.

By way of example, I once reviewed a survey conducted by a respected research organization among pharmaceutical manufacturers. In the notes, they mentioned that they spoke to respondents from seven (7) companies. Then I noticed that every percentage for every answer was a multiple of one-seventh (1/7). I concluded that the researcher had spoken to only seven (7) actual individuals, and I didn’t use any results from that survey.

That said, no hard-and-fast rules exist for minimum sample sizes. Or, rather, a blizzard of rules exists. If you have the inclination, you can learn all the factors that make up a viable sample. Since you probably don’t, here are some rules of thumb:
Any sample of under 100 strains credibility, unless the population described is very small (e.g. professional skiers in Florida). Use 100 as a bare minimum.
For general consumer studies, e.g. “adults 18-49,” use a larger base size, 500 or ideally 1,000. A larger sample reduces the risk of anomalies (e.g. the odds that 100 out of 150 respondents to a survey about fast food choices were vegans).
Professional and specific consumer studies can vary in base size between 100 and 1,000. In general, more = better.

Pro tip: samples are often listed in charts as “n=[x].” So “n=736” means a sample size of 736 individuals.
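The article doesn’t give the math behind these rules of thumb, but they roughly track the standard sample-size formula for estimating a proportion, n = z²·p(1−p)/e². Here’s a minimal Python sketch of that formula (the function name and defaults are mine, not the article’s; p = 0.5 is the worst case, i.e., maximum variance):

```python
import math

def required_sample_size(margin_of_error, z=1.96, p=0.5):
    """Minimum sample size to estimate a proportion.

    Standard formula: n = z^2 * p * (1 - p) / e^2, where z = 1.96
    corresponds to 95% confidence and p = 0.5 is the worst case.
    """
    return math.ceil(z ** 2 * p * (1 - p) / margin_of_error ** 2)

# A +/-5% margin at 95% confidence needs about 385 respondents;
# tightening to +/-3% pushes the requirement past 1,000.
print(required_sample_size(0.05))  # 385
print(required_sample_size(0.03))  # 1068
```

Note how the formula echoes the rules of thumb above: tolerating a wider margin of error gets you under 500 respondents, while tight precision demands a base near 1,000.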
Margin of error. While it may seem like minutiae, the margin of error can make or break survey results. The margin of error represents the amount of doubt around a survey’s results, based mostly on sample size. So a margin of error of +/-4% means that if 35% of a population made choice X, the real number lies between 31% and 39%.

Thus, if the top two choices on a question drew 41% for A and 40% for B with a margin of error of +/-3%, we may not conclude that A beat B. Professional researchers will supply a margin of error. As a rule of thumb, I use +/-4-5% if a survey does not supply one.
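The arithmetic above can be sketched in a few lines of Python. This assumes the textbook formula for a proportion’s 95% margin of error, z·√(p(1−p)/n), which the article doesn’t spell out; the function name is mine:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion: z * sqrt(p(1-p)/n)."""
    return z * math.sqrt(p * (1 - p) / n)

# A sample of n=736 (the "n=736" example above) carries a margin
# of roughly +/-3.6%.
moe = margin_of_error(736)
print(f"+/-{moe:.1%}")

# The tie-breaker check: 41% for A vs. 40% for B with a +/-3% margin.
# The intervals 38-44% and 37-43% overlap, so we can't call a winner.
a, b, m = 0.41, 0.40, 0.03
overlap = (a - m) <= (b + m) and (b - m) <= (a + m)
print("Statistical tie" if overlap else "A beats B")
```

The same check also shows why the +/-4-5% rule of thumb is conservative: a wider assumed margin makes more apparent “wins” read as ties, which is the safer default when a survey doesn’t disclose its methodology.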
Recruitment. If possible, learn how the researcher recruited respondents. Professionals use tactics such as random-digit dialing for phone surveys to get a good cross-section of respondents. Others often put a survey online (e.g. SurveyMonkey) and email links to friends or post the links on social networks.

Marketers need to take the latter approach with a grain of salt: the friends-and-family approach means that respondents may over-represent a certain age or social group and under-represent others.
Selective reporting. Unlike quantitative research, qualitative research does not always lend itself to simple reporting. We can easily grasp what “74% of respondents preferred skinny jeans” means. We have less certainty about what “respondents evaluated fit by shopping with friends” means. Do they make appointments to shop with friends? Do they simply grab a friend from the office and shop over lunch? Do they email friends links from websites with pictures of jeans?

By nature, qualitative reporting synthesizes input from a variety of respondents. It helps, then, to know more precisely what those respondents said. Look for complete videos, transcripts or even extended verbatims to get a fuller sense of what respondents said and what they meant.