When it comes to marketing data, “With much data comes much responsibility”
Having once designed and managed market research projects for a living, I found the information reported in “Kraft tackles data integrity issues” a bit disturbing. Reporting on the Kraft Foods Group’s own research into the quality of purchased data supporting its marketing campaigns, Julie Fleischer, Kraft’s director of data, content and media, is quoted as saying the following after Kraft’s evaluation of third-party data on owners of Keurig brewing machines:
“The level of accuracy on these data sets ranged somewhere between 14% to 20% … 80% of this [data] buy was spent against non-Keurig owners when our data sets were Keurig owners.”
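The arithmetic behind that quote is worth making concrete. Here is a minimal, purely illustrative Python sketch (the $100,000 budget figure is invented for the example, not taken from Kraft’s actual spend):

```python
# Hypothetical illustration of the waste Fleischer describes: if only
# 14-20% of records in a purchased audience list are true Keurig owners,
# most of a media buy targeting that list misses its intended audience.

def wasted_spend(budget: float, accuracy: float) -> float:
    """Dollars spent against people outside the target segment."""
    return budget * (1 - accuracy)

for accuracy in (0.14, 0.20):
    print(f"accuracy {accuracy:.0%}: "
          f"${wasted_spend(100_000, accuracy):,.0f} of $100,000 wasted")
# → accuracy 14%: $86,000 of $100,000 wasted
# → accuracy 20%: $80,000 of $100,000 wasted
```

At 20% accuracy, four of every five dollars in the buy reach non-owners, which is exactly the “80% of this buy” figure in the quote.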
According to Fleischer, this problem is not unique to third-party market research data in this particular vertical. It has led Kraft to “double down” on obtaining data from first-party sources.
You can’t blame Kraft for wanting the best possible data on which to base important product development, distribution, and pricing decisions. As anyone who has been in the data business knows, the closer you are to the source of the data, the better off you’ll be, not only in terms of accuracy but in terms of overall data quality.
That’s one reason why vendors like Apple and Amazon are in such a good position to understand the thinking of their customers. Given the breadth of their data infrastructures, they can tap directly into a wide range of data associated with verified purchasing behaviors. Adding more sources to the mix, such as Apple Pay and iBeacon, will potentially offer even more data to those seeking to understand buyer behavior.
In theory, such comprehensive sources and the steady expansion of “Internet of things” data mean that the total population of customer behavior data is bound to grow – for those who can access the data.
One “catch” is that “With much data comes much responsibility.”
I’m referring here not only to privacy concerns but to the limits of how far you can replace traditional data gathering processes with automated processing. Then you (or your “data scientists”) have to be able to work through the burgeoning volumes of data to make sense of them. Then you have to apply what you learn to real-world problems, such as the introduction of coffee products that work with new Keurig machines.
There’s also an economic challenge for the intermediary and third-party services that gather, analyze, and sell data to companies like Kraft. If you’ve spent any time in the market research business, you know how cost-competitive data services can become. Pressures to maintain data quality mount even as customers seem to want more while paying less.
How much of the data quality problem faced by Kraft is at least indirectly related to pressure on data vendors to control and cut costs? Given current processes, isn’t there a limit to how much manual processing and decision-making can be replaced throughout the data-generating chain of events relevant to market planning?
Perhaps one fault here lies with Kraft and companies like it for taking so long to realize the quality limitations of purchased data services. I certainly have no way of knowing how prevalent this might be. But as a project manager with a lot of data management experience, I do think that paying attention to the real costs associated with data quality is always a good idea.
Copyright © 2015 by Dennis D. McDonald