Wednesday, September 2, 2009

DIY Disasters: Part 2

My post last week about DIY disasters seems to have touched a nerve with quite a few people. Several emailed me with their own examples; given the nature of the topic, they chose to email rather than add a comment. After reading the notes, I noticed a recurring theme. I had written about design-driven disasters, but the examples brought to my attention were all interpretation disasters. In other words, bad business decisions were made based on the research.

A common problem was a lack of perspective. For example, one fairly new CPG company ran a concept test. Based on a 35% “top-2-box” purchase-interest score, they rolled out the product. A seasoned researcher would question the wisdom of launching a product with such a low score, but they thought it was a great one. They felt they could reach a third of the market, which was more than enough for them. The product went to market and bombed. A detailed analysis of the original DIY results by a research professional easily identified the problem.
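For readers unfamiliar with the metric, a top-2-box score is simply the share of respondents who picked one of the two most favorable points on the rating scale. Here is a minimal sketch with made-up data; the function name and the sample responses are hypothetical, not taken from the company in question.

```python
from collections import Counter

def top2box(responses, scale_max=5):
    """Share of respondents choosing the top two points on the scale
    (e.g. "definitely" and "probably would buy" on a 5-point scale)."""
    counts = Counter(responses)
    top2 = counts[scale_max] + counts[scale_max - 1]
    return top2 / len(responses)

# Hypothetical 5-point purchase-intent responses (5 = "definitely would buy")
sample = [5, 4, 3, 2, 1, 4, 3, 3, 2, 5, 4, 2, 3, 4, 2, 3, 1, 5, 2, 3]
print(top2box(sample))  # 0.35 -> the 35% score from the example
```

The point of the anecdote is that the number itself is easy to compute; knowing whether 35% is good or bad for a given category is where experience comes in.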

Another example was an ongoing customer satisfaction study that a service company was conducting via a DIY platform. Everyone was happy with the consistent 80% satisfaction score they were receiving. Then they started losing subscribers. They ran another DIY survey and still could not figure out why. They eventually turned to a professional researcher for help. First, they were advised that rather than gloating that 80% of their customers were satisfied, they should worry that 20% were not; a typical company threshold for dissatisfied customers is 5-10%. A little analysis gave them some big insights. They had failed to see that the dissatisfied customers were downright irate with the service. They had failed to see that these customers were the heaviest users of their services: losing 1% of customers from this group equated to losing 5% of revenue. And they had failed to see that these irate customers were already experimenting with competitive services.
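The 1%-of-customers-equals-5%-of-revenue arithmetic follows directly once you weight churn by per-customer revenue. A minimal sketch, with figures that are mine for illustration and not the company's actual numbers: if the departing customers all come from a segment spending five times the overall average, the revenue hit is five times the headcount loss.

```python
def revenue_share_lost(customers_lost_frac, segment_rev_per_cust, avg_rev_per_cust):
    """Fraction of total revenue lost when the lost customers all come
    from a segment with a given per-customer revenue.

    Losing f of all customers, each worth segment_rev_per_cust, out of a
    base averaging avg_rev_per_cust per customer, costs
    f * segment_rev_per_cust / avg_rev_per_cust of total revenue.
    """
    return customers_lost_frac * segment_rev_per_cust / avg_rev_per_cust

# Hypothetical figures: heavy users spend 5x the overall average per customer
print(revenue_share_lost(0.01, 500, 100))  # 0.05 -> 1% of customers = 5% of revenue
```

This is exactly the kind of segmentation a flat 80% satisfaction score hides: the average looks fine while the most valuable customers walk out the door.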

In both of these cases, the execution of the research was fairly strong, but the interpretation was weak. The actions taken on the basis of these projects proved far more costly than calling in a professional in the first place would have been.
