Wednesday, September 30, 2009

Attention Panel Companies – Your Panelists Want to Talk to You

In the course of conducting studies, we have inadvertently discovered something about panel companies. They don't want to talk to their panelists. True to the nature of their online business, they want all correspondence with panel members to go through email or the panel member portal. They make it close to impossible, if not completely impossible, for a panelist to talk to them on the phone. I understand being "true to the method," and I understand the cost implications a panel company would face if it actually talked to its panelists, but I still don't get it. As consumers, and yes, panelists are consumers (or we wouldn't want to talk to them), we all know how frustrating it is when you want to talk to a real person and can't. Think about the last time you called your bank, an airline, or your cable company. It probably was not a fun experience, but with some diligence, you hopefully did get to speak to a live person.

As a matter of policy, when we place product for an in-home use test, we provide respondents with a toll-free number. The intent is for them to have a way to contact us if they have any issues or concerns about using the product. But we have noticed that more and more respondents are calling us because they want to talk to their panel company. By default, we get the calls.

Panel companies, it is time for you to address this. If you don’t talk to your panel members, it will only be a matter of time before they are talking about you – and not necessarily in a good way.

Tuesday, September 22, 2009

Research Snowmen

Many of you who are friends, clients, or business partners of our company have received our Research Snowmen holiday cards over the years. What started as a humorous card in 2003, featuring a bunch of somewhat geeky Snowmen Researchers, has continued as an annual tradition. The 2009 card will mark our seventh edition.

We had planned for the Snowmen to make a one-time appearance. But they proved to be quite popular, and we have kept them around. The problem is that the designers who originated the idea are long gone. It has fallen on our shoulders to come up with a new idea every year. While I think we have come up with some great ideas, it has become a challenge. So we are turning to you for help.

If you aren’t familiar with our Snowmen, or you need some inspiration, turn on your speakers, and mosey on over to http://www.cooper-roberts.com/snowmen/. There you can experience all of the cards, with the appropriate holiday background music.

The rules are simple: the card must include the Snowmen, it should have some general (non-religious) holiday association, and ideally it should tie back into marketing research. If we select your idea, you will be given credit for the inspiration on the back of the card.

And BTW, there is no gender bias; it's just that "Snowpeople" doesn't sound right to us. I know for a fact that some of our Snowmen were inspired by women.

You can respond here, or, to keep your idea secret, email us at snowmen@cooper-roberts.com.

Wednesday, September 9, 2009

Is your research broad enough?

On a recent flight back from Dulles, the twenty-something sitting next to me asked me what time it was at least three times. This didn't surprise me at all. In fact, it happens frequently on flights. I've learned that his generation generally does not wear a watch; they count on their cell phones to tell time. An airplane is one of the few places where they can't use a cell phone to check the time. I wonder if watch manufacturers ever considered cell phones to be a threat to their industry. Actually, I wonder if many industries and businesses have considered the competition and threats coming from outside their core area.

  • Hollywood Video was probably worried about other brick & mortar video stores such as Blockbuster, but did they see Netflix or On-demand TV as a competitor?
  • Did parking meters (yes, they are a revenue stream, so I consider them to be a business) see the impact of debit cards, which mean people no longer carry the change needed to feed a meter?
  • Did FedEx realize the impact email would have on their business?
  • And we all know that most newspapers were blind to the crippling effect the Internet has had on so many aspects of their business.

Most research, in trying to maximize the information garnered for the minimum amount of respondents' time, concentrates on the core category and the known competition. Such studies would probably never give an early warning of competition from outside the core sector. But this path of efficiency could be dangerous, if not life threatening. I strongly recommend that at least once a year, and ideally twice a year, you take your research, whether qual or quant, outside the box. Study the evolving ways that consumers do things. Just as the cell phone has hurt the watch business, what is lurking out there that could hurt your business?

Wednesday, September 2, 2009

DIY Disasters: Part 2

My posting last week about DIY disasters seems to have touched a nerve with quite a few people. Several emailed me with their own examples. Given the nature of the topic, they chose to email rather than add a comment. After reading some of the notes, I saw a recurring theme. I had talked about design-driven disasters. The examples brought to my attention were all about interpretation disasters. In other words, bad business decisions were made based on the research.

A common problem was a lack of perspective. For example, one fairly new CPG company did a concept test. Based on a 35% "top 2 box" purchase interest score, they rolled out the product. A seasoned researcher would question the wisdom of rolling out a product with such a low score. But they thought it was a great score. They felt they could reach a third of the market, which was more than enough for them. What they missed is that a stated purchase interest score is not a market share forecast; only a fraction of the people who say they would buy a product ever actually do. The product went to market and it bombed. A detailed analysis of the original DIY results by a research professional easily identified the problem.

Another example was an ongoing customer satisfaction study that a service company was conducting via a DIY platform. Everyone was happy with the consistent 80% satisfaction score they were receiving. Then they started losing subscribers. They did another DIY survey, and still could not figure out why. They eventually turned to a professional researcher for help. First, they were advised that rather than gloating over the 80% of their customers who were satisfied, they should be worrying about the 20% who were not; the typical company threshold for unsatisfied customers is 5-10%. A little analysis gave them some big insights. They had failed to see that the dissatisfied customers were downright irate with the service. They also failed to see that these customers were the heaviest users of their services; losing 1% of the customers in this group equated to losing 5% of revenue. And they failed to see that these irate customers were experimenting with competitive services.

In both of these cases, the execution of the research was fairly strong, but the interpretation was weak. The actions taken on the basis of these projects were far more costly than calling in a professional in the first place would have been.