Market Research Survey Silliness

I’ve spent more than a decade working in market research, both quantitative and qualitative, and still more years working with market research data in one form or another.  So, I’ve seen maybe not it all, but a whole lot.  I have managed and analyzed hundreds of surveys conducted by mail, on the phone, and online.  I’ve attended and moderated hundreds of focus groups.  I’ve managed segmentation surveys and discrete choice surveys, and I’ve randomized this and that and versioned that and this.  I’ve applied the unknowledgeable management consultant’s favorite – conjoint analysis – and regressed and forecasted.  I’ve observed natural consumer behaviors and stimulated behaviors, and followed people while they operated cars, computers, and cell phones, and while they dined.  I’ve talked to people about death and dying and about surviving.  I’ve talked to people about how they shop, why they shop, and why they don’t like to bend over to pick up items on the bottom shelf when they shop.

My career gave me a fascination with consumer behavior, from the little decisions people make to the big, life-changing ones.  People are fascinating, and what makes it even more interesting is that they very often don’t do what they say they do – imagine that!  Yes, consumers are captivating, and indeed, understanding them and catering to them is a product’s best route to success.

I’m in a bit of a different professional position these days.  I use market research data, but it is of the most basic kind:  I currently collect and use some customer satisfaction and general feedback data, with a few qualitative interactions thrown in for good measure.  Yet as a supporter of market research and a consumer myself, I continue to complete surveys, whether it’s a phone call at night, an online survey, or one in the mail, provided it is a legitimate survey and not a sales pitch for a product or a politician disguised as a survey.  I continue to respond out of a sense of duty to my former profession and to be helpful to the organizations out there attempting so dutifully to figure out their customers.  I continue to support data collection, even when it is excruciatingly painful.  And that brings me to this post.

Earlier this fall, I received a survey invitation via email asking for feedback on school supplies.  What timing!  I had just finished purchasing school supplies for my son, and I will be in the market for the next 16 years, until my youngest gets through high school.  This survey, I thought, would be interesting.

I go through the normal procedure of following the link.  I answer several demographic questions about my age, gender, household make-up, ages of my kids, and income level.  I answer these questions without hesitation because I know that I am one of probably 1,000+ people answering them and that the 25-year-old analyst reading this data is not going to care about who I actually am (nope, not even my income level), for I am just one of a thousand.  (Just so you know, analysts care about percentages of the total sample; they don’t care about individuals in these instances.  Yes, you are special, but so are the other 999 people completing the survey.  It’s okay to let people know your household income and whether you are a boy or a girl, etc. – trust me!  Now if they ask for a Social Security or credit card number, that’s different:  shut the survey down and/or hang up the phone.  That’s not market research; that’s a scam.)
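For the curious, here is a minimal sketch of what that aggregation looks like in practice, using made-up responses and my own hypothetical income bands rather than anything from this survey:

```python
from collections import Counter

# Made-up income-band answers from a hypothetical batch of respondents;
# none of these values come from the survey described above.
income_bands = ["under $50k", "$50k-$100k", "$50k-$100k", "over $100k",
                "$50k-$100k", "under $50k", "over $100k", "$50k-$100k"]

# The analyst sees each band as a share of the total sample,
# never as any one identifiable respondent.
counts = Counter(income_bands)
for band, n in counts.most_common():
    print(f"{band}: {n / len(income_bands):.0%} of the sample")
```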

So I’ve now answered the demographic questions, and indeed, I am one of the target customers from whom feedback is desired.  I am asked to continue with the survey.  I then answer several questions about which products and brands of school supplies I bought, which ones I considered, and from which stores I purchased them.  I am asked how much I spent and how much I had intended to spend.  It was a nicely designed survey, and I felt like I was really cruising through it and helping out.  And then it began – I fell into survey hell.

First, there was a battery of questions for every single product I bought and every single brand I considered.  Then there were batteries of questions for impressions of brands I had not considered.  The batteries were endless and included statements where I was to rate my agreement with things like:  “I feel like a leader for having bought this brand.”  “Others will admire me for buying this brand.”  “This brand is like me.”  “This brand is the only brand for my child.”  “This brand will make my child more successful.”  “I talked to friends and colleagues about this brand.”  “I encouraged others to buy this brand” – and many more questions along those lines.  Multiply these brand behavioral questions by the brands I considered and/or was aware of, times the number of back-to-school supplies I ticked off the list, and the total is staggering.  This was, in fact, a survey that I may be completing for the next 16 years!
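To put rough numbers on it (every figure here is my own assumption for illustration; the survey never disclosed its counts), the multiplication looks something like this:

```python
# Back-of-the-envelope math on how a rating battery explodes.
# All numbers below are hypothetical, chosen only to show the scale.
statements_per_brand = 10   # "I feel like a leader...", "This brand is like me.", etc.
brands_per_product = 6      # brands bought, considered, or merely recognized
products_purchased = 12     # pencils, erasers, markers, folders, glue...

total_ratings = statements_per_brand * brands_per_product * products_purchased
print(total_ratings)  # 720 separate agreement ratings for a single respondent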

Yes, I was in survey hell, and not just at the surface.  I was in the hottest, most uncomfortable level, with no hope of getting back to the surface anytime soon.  It is now several weeks later, and having taken some time to recover from the experience, I have some advice to offer those of you developing surveys for your product or service, or working with organizations that do so.

First of all, good market researchers/questionnaire designers realize that you never, I mean never, put people through an endless list of questions about everything they did.  You randomly select a number of products for someone to evaluate.  Why?  Because I felt committed to answering the full survey, but by the end, I was clicking answers just to get through it (sorry about that – I still feel the guilt of providing bad data, but I just couldn’t take it anymore); others will simply exit the survey.  Several questionnaire programs will randomly select a handful of items for a consumer to evaluate when the typical consumer has purchased or considered more than 5 or 6 items, and they will ensure that you still have enough sample to analyze each item – a rough sketch of the idea follows below.  Ask consumers to evaluate more than 5-6 items with associated statements, etc., and your data will be flawed.  Good researchers protect the validity of their data and the sanity of their respondents.
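Here is a minimal sketch of the idea, assuming a generic in-house survey script rather than any particular questionnaire platform: cap the battery at a handful of items per respondent and rotate which items each person sees, so every item still collects enough evaluations across the full sample.  The function name and the five-item cap are my own illustrative choices.

```python
import random
from collections import Counter

# Running tally of how many respondents have evaluated each item so far.
evaluation_counts = Counter()

def assign_items(respondent_items, max_items=5):
    """Cap the rating battery at max_items per respondent, favoring the
    items evaluated least so far so coverage stays balanced overall."""
    if len(respondent_items) <= max_items:
        chosen = list(respondent_items)
    else:
        # Shuffle first so ties are broken randomly, then keep the least-seen items.
        shuffled = random.sample(respondent_items, len(respondent_items))
        chosen = sorted(shuffled, key=lambda item: evaluation_counts[item])[:max_items]
    evaluation_counts.update(chosen)
    return chosen

# Example: a respondent who bought or considered nine kinds of supplies
# is only asked the full battery about five of them.
purchased = ["pencils", "erasers", "markers", "glue", "folders",
             "backpack", "scissors", "crayons", "notebooks"]
print(assign_items(purchased))
```

In a real fielded survey the tally would live in the survey platform’s database rather than in memory, but the balancing idea – fewer items per respondent, enough respondents per item – is the same.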

Secondly, let’s think about the behavioral and brand-leader issues related to buying a PENCIL!  Buying pencils and erasers is much different from buying, for example, a car, or apparel, or kitchen appliances or major electronics.  The latter are big-ticket or emotional purchases that consumers may use to define themselves and that they may talk about with friends and colleagues.  If I’m going to be a brand leader or influencer, it’s going to be with those kinds of ego purchases – a pack of standard #2 pencils?  Not so much.  You might do some brand loyalty and behavioral segmentation on backpacks or PDAs or other higher-profile supplies, but on pencils?  I’ve no clue what brand I even purchased, for goodness’ sake!  But I was still presented with the battery of questions because I was familiar with some pencil brands.  Whoever designed this survey somehow believed it was legitimate to apply the brand and behavioral attributes/measurements used to describe the purchase process of a muscle car ($40,000) to the purchase of a pencil (40 cents).  Does that seem logical?  Remember the analogy:  the clock-builder will know how the clock is built, how it keeps nearly precise time, how the gears (now chips) work together, and about when it will need repair.  The consumer will know – what time it is.  I’m guessing that same analogy can be applied to the #2 pencil.  Good researchers question their logic and the logic of their clients, who often get so caught up in their product that they don’t understand its role and importance in the lives of everyday consumers.

Third, and this one is important, good market researchers need to understand how the purchase is made before measuring and interpreting the actual purchase.  The survey should have asked me how I made decisions about the purchase of school supplies.  You see, my school district, and all the ones around me (I talked to other parents; yup, I’m an “influencer”), gave me a list of supplies for my elementary-aged son.  They asked for specific brands and sizes and colors (emphasizing “washable” products, of course) because we give the entire bag of supplies to the teacher, who puts them in community bins.  When the kids need a pencil or a marker, they just get one from the bin and return it when they are done using it.  It’s a great idea – things don’t get lost, there is no need to label anything, there are no equity issues (everything is the same brand, size, and color), and the kids who cannot afford supplies are not left without.  So all this “brand leader/influencer” stuff the surveyor forced upon me?  Unfortunately, it’s all bad data.  If this bin trend among elementary schools is widespread, the people who should be surveyed are the teachers who are making the brand decisions, not me.  Good researchers do some research before they do the research.

So if you are a product manager or a market research consultant, I beg you, please consider your product, the product purchase process, and your consumers.  Make it easy for us to provide you with good-quality data.  Fewer and fewer people are responding to surveys, so please don’t push away those of us who still do.  And I should also say, lest I receive emails from pencil managers, I’ve nothing against the #2 pencil.  Why, I am one of the few adults who still uses a pencil – not a mechanical pencil that breaks every time I use it, but a real, sturdy, old-fashioned pencil that requires a sharpener.  Hmm, I wonder what type of consumer segment that puts me in. . .