I continue to be a skeptic of the Big Data vision in which marketers scrape all the data they can get their hands on to improve their ability to target prospects and Customers with so-called ‘personalized’ offers.
I’m a skeptic because I believe this approach ultimately suffers from the law of diminishing returns, and at an increasingly high speed.
For one, this has been the case with most one-to-one (predictive analytical) marketing programs I’ve seen running (or ran myself). In the beginning it looks as if conversions can only go up, but after a while you will be defending ever-decreasing conversions, to the point where it just doesn’t make sense anymore. That is because, over time, you will literally have exploited most of the opportunities in your Customer base.
But mostly this is true because these days your Customers and prospects are increasingly well equipped to find you when they have a need. They can safely ignore your targeted broadcast, for they don’t need to remember it. Not that I believe they actually did so in the past.
Or, to put it another way: we’ve long been convinced it was our specific offer for a specific prospect/customer that persuaded them to buy more, when in fact it was the Customer who was ALREADY interested in your product or service who purchased. You may have been convinced that people bought BECAUSE OF your doing, when in fact the customer purchased DESPITE your effort.
Control groups, statistical analysis and all, chances are that any net conversion you thought you realized was just the result of mere chance. You can’t really tell, because most evaluations are based upon too few data points anyway. This problem is not solved by big data (getting more different kinds of data), for it is a problem of too few of the same data points over a long period of time.
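To make the small-numbers point concrete, here is a minimal sketch (my own illustration, not from the post) of a two-proportion z-test: even a 20% relative lift over a control group can be statistically indistinguishable from chance at typical campaign sample sizes. All figures are hypothetical.

```python
# Illustrative sketch: is an observed conversion lift over a control group
# distinguishable from chance? All numbers below are hypothetical.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return the z-score and two-sided p-value for the difference in
    conversion rates between a target group and a control group."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 2.4% vs 2.0% conversion on 5,000 contacts each: a 20% relative "lift"
z, p = two_proportion_z(120, 5000, 100, 5000)
print(round(z, 2), round(p, 3))  # the lift is not significant at the 5% level
```

With ten times the sample over a longer period, the same lift would be highly significant, which is exactly the "same data points over a long period of time" argument.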
I like Nassim Taleb’s analogy (see the video embedded in this article; the article is in Dutch, but Taleb speaks English): we humans do not predict when it’s safe to cross a road by adding ever more data points to our decision-making process, such as the eye color of passing drivers, but by filtering the data and assessing only what’s relevant to get across safely.
Big Data programs that aim to do the same as old-fashioned predictive-analytics-based marketing programs will suffer from the same old-fashioned results: conversion rates will drop, efforts to improve (collecting more and more data) will increase, and as a result cost-per-order will rise beyond reason. Automation may improve productivity for some time, but all it will actually do is cloud the need to really and fundamentally innovate around value. Your effort to get to the bottom of things will cost you more time and money than you can afford. Time and money you cannot spend on increasing co-created value-in-use.
Thus, my statement of today is that the Big Data route towards increasing conversion rates, by predicting contextual relevance to deliver a personalized offer, knows only a slope of diminishing returns.
The alternative route, towards increasing co-created value by improving your value proposition and the experience of finding, understanding, customizing, purchasing and using it to generate desired outcomes, is limited only by our ability to understand what drives value co-creation, and by our imagination and capability to translate those insights into those very value propositions.
This is not a pie that can only get smaller; it’s one that has the potential to get bigger and bigger, which is a much more fun outlook than the inevitable downward slope of diminishing returns.
This will put you on the route where you try to understand how to bridge the gap between what Customers are trying to get done, or want to achieve, and them enjoying those outcomes, instead of putting yourself between your Customer and their desired outcomes.
This will put you on the route where you try to generate (relevant) data with Customers (not just harvest it from them) and put it to use for their benefit, not (just) yours.
This is where you do not eavesdrop but listen carefully and share back what you know. It may even put you in the spot where you are willing to pay your Customers for their contribution to your capability to better help them (and get paid in return). Or better: where your Customers want to share their (use) data with you, for they trust it will benefit them.
This is when you understand that you do not add value in each step of the value chain but jointly co-create value over its emergence.
This will ultimately put you in a spot where Customers value your help instead of ignoring your (re-)targeted messages.
Interesting thoughts again – however, the traffic analogy makes me think somewhat: crossing the road, we are filtering through a massive data stream that we are confronted with through our senses. We are using a kind of “big data” along with predictive analysis based upon past experience (“those cars are far enough away – they simply cannot accelerate fast enough to come near me if I start crossing the road now”). This strongly speaks in favor of using “big data” technologies – and I do not like this phrase either, it’s just that I do not have a better one.
The main difference is that in this example the customer does the analytics (I, who wants to cross the road), not the company (the approaching car).
So the real crux for companies is to create a consistent track record of helping customers achieve what they want (by being predictive?). Now, how else can they do that but by analysing a huge amount of data themselves? Or: wouldn’t that limit a company’s ability to generate completely new markets?
Just some tired random thoughts early morning in the airport lounge ;-)
Excellent comment! Although I do not (fully) agree.
What you are actually saying is that it is important to understand which data is relevant. We humans know instinctively that the color of one’s eyes is irrelevant for crossing a street. In business it’s not always that clear, though…
But that doesn’t mean you can simply declare ALL data relevant, as Big Data seems to imply. I am in favor of declaring all data suspicious for starters, and then having its relevancy proved by a rigorous process. This is standard procedure in professional analytics departments. Why should big data be treated any differently?
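As an illustration of “declare all data suspicious first” (my own sketch; the feature names, data and threshold are invented), one can score every candidate feature against the outcome and keep only those that clear a relevance bar. The street-crossing example would drop eye color this way:

```python
# Hypothetical sketch: score each candidate feature against the outcome
# and keep only those that clear a relevance threshold.
import random

random.seed(42)

def point_biserial(feature, outcome):
    """Correlation between a numeric feature and a 0/1 outcome."""
    n = len(feature)
    mean_f = sum(feature) / n
    mean_o = sum(outcome) / n
    cov = sum((f - mean_f) * (o - mean_o) for f, o in zip(feature, outcome)) / n
    var_f = sum((f - mean_f) ** 2 for f in feature) / n
    var_o = sum((o - mean_o) ** 2 for o in outcome) / n
    return cov / (var_f * var_o) ** 0.5

# Simulated data: "speed" genuinely predicts the outcome, "eye_color" does not.
n = 1000
speed = [random.random() for _ in range(n)]
eye_color = [random.random() for _ in range(n)]
outcome = [1 if s + 0.3 * random.random() > 0.6 else 0 for s in speed]

scores = {"speed": point_biserial(speed, outcome),
          "eye_color": point_biserial(eye_color, outcome)}
relevant = [name for name, r in scores.items() if abs(r) > 0.1]
print(relevant)  # only the genuinely predictive feature survives
```

The rigor lives in the process, not the tool: held-out data, a pre-agreed threshold, and no feature admitted until it earns its place.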
And yes, using lots of (contextual) data in real time to predict what kind of help a customer could use for the next step in their Customer Job makes a lot of sense to me. The problem is that predictability will be low as long as there is only a lot of data over a short period of time.
Thus, the only way to go is test, test, test, and use control groups to understand the effect you are causing. Oh, and take some time: you can’t decide what did or did not have an effect based on a one-day trial. Really.
Thx for taking the time to read and comment Thomas! Much appreciated,
Yes, predictability is low if the sample is not big enough or the data is not collected over a long enough period of time. However, it is possible to put previous experience into algorithmic knowledge too, which overcomes the lack of hard data to some extent.
Having said that, we seem to have slightly different notions of ‘big data’. For me it includes the intelligence (tools, models, etc.) to work this data and come to good conclusions – preferably without becoming, or even appearing, creepy, which adds another dimension. I also think that Ellen is far off: real time is not the value prop of ‘big data’. Real-time gathering of data has been possible for quite some time – it’s just that there was and is an increasing sense that the models used to analyse the data and come to useful conclusions do not cut the mustard. So one needs to collect more to be able to analyse better… and to build better models that potentially require less data and possibly less computing power.
Sticking to the example: kids do not immediately know that the driver’s eye color is not a good measure for deciding whether it is safe to cross the road, whereas distance, speed and the estimated ability and likelihood of acceleration are – and then they still need to learn to properly assess these. This takes time too, which confirms your point. But then the kids have a model that works (mostly) in an instant.
As you say, it is not the amount of data that is crucial but the ability to filter for and select the right data. As it is hard to predict what the right data is, people just collect the lot – hence also the current intense discussion about the conduct of worldwide spy agencies – but I digress…
However, needs and wishes change fast nowadays; often they are even generated by marketing – look at tablets or smartphones, to name but two. Then there seems to be an ever-increasing need to ‘get one’s job done’ right now.
The first point again confirms your argument; the second contradicts it by demanding that a company be, or become, aware at the right point in time, and in the right position if necessary.
So, maybe a possible synthesis is that there should be less mass marketing but more closely targeted, yet not creepy, marketing to fewer individuals, delivered via the appropriate channel?
As a result, everybody gets less annoying marketing, and more of the little that remains is relevant.
There seem to be a lot of assumptions in this article, particularly that all companies use similar data in the same way.
I don’t think many will argue against the point that big data does not inform on the value proposition (nor was that the goal of those using it); rather, it informs in REAL time on what is actually happening in the market(s). Somewhere in the last five years, the definitions of advertising, marketing and sales data have become murky, particularly where business is driven off the Internet. Companies need to understand the difference.
Customers relate to companies as a brand and through specific products. How customers feel about a product during and after the sale impacts the reputation of the brand and the satisfaction/loyalty metrics. Analysis of big data (purchases, returns, the impact of the product brand on peer brands and the corporate brand) ultimately does impact returns of purchases and near-term purchases by those within the customer’s ever-enlarging peer circle. It is all part of the ecosystem.
By harnessing the power of big data you can develop a quicker and more accurate picture of the factors likely to impact the value of your current value proposition. It’s a huge improvement over traditional trackers, which were slow to inform and often had large skews. Big data IS what’s happening: it’s the determinant action of the elements of the value proposition, and it informs on the relevance of the value proposition. It’s not a diminishing return, because your satisfaction and loyalty numbers do fluctuate, and now the subtleties of those changes can be addressed or dismissed as the data informs on specific populations and growing/shrinking segments.
Thx for reading and taking the time to engage.
I do not base my arguments on assumptions. They are all rooted in my own 15 years of experience in service, marketing/business intelligence, analytics and customer experience management.
Furthermore, I do see the potential of big data informing the value proposition, for the value proposition is not a static thing; it is a living, breathing construct formed in the eyes of the consumer as they interact with the brand. How well informed those interactions are could well be improved by adding a lot more (data-based) intelligence to the mix than is currently the case.
I also see the value in your real-time example, albeit that real time is vastly overrated. Daily, weekly or even monthly works fine for most companies and in most cases. And from a statistical point of view, the shorter the cycle, the less valuable the analytics you perform on it (if valuable at all).
One of my senior analysts told me a couple of weeks ago that he would be in favor of doing less, not more, ‘marketing’. It’s all the doing that blurs the data and gives us the false feeling of safety that it is actually us causing something to move. Yet the very fact that we do so much (at the same time) and shift strategies so often makes it impossible to establish just that. I agree with him.
I think we should all (as marketers) spend a lot more time trying to make informed improvements to the value proposition, and a lot less time trying to ‘market’ it.
My opinion of course.
I agree with Greg Satell. I understand your intent, Wim, but you should have used a different title, narrowing it to the ‘bad’ usage of Big Data, where it means only targeting people as in the old school. Moreover, I’m a fan of co-creation and I’m sure digital devices will help us interact more with companies, but what will the percentage of people really participating in the creation process be? Always low (imo, but maybe I’m wrong), and Big Data in the sense of big volumes of information units can give companies the opportunity to probe deeper and deeper into customers’ profiles, trying to understand them better. So you don’t need to use predictive analytics only for targeting; there are a lot of other applications that can benefit from it. For example, recommender systems using collaborative filtering (where predictive approaches are certainly used) can be improved a lot by integrating additional information into the system, giving mutual benefits to people and companies. (Last thing: I think that weather info together with traffic can help us not only when planning a trip, but also in real time when commuting to work in big cities :) )
To be honest, Wim, I think you’re missing the point a bit here. What’s important about big data is not that it helps us do old jobs better, but that it helps us do entirely new ones (or apply completely different approaches to old tasks).
So, while I think you’re right that big data might not be particularly helpful when it comes to targeting tasks, it is absolutely essential to do new things like simulations, predictive maintenance, etc.
Interesting point of view. Yet I don’t think it holds true, i.e. I don’t think I’m missing the point ;)
First of all, in this post I only address Big Data’s diminishing returns in relation to predictive targeting (see the first sentence), and we seem to agree on that.
Secondly I say:
“This [the alternative] will put you on the route where you try to generate (relevant) data with Customers (not just harvest it from them) and put it to use for their benefit, not (just) yours.”
A clear pointer to my firm belief that new data points and their analytics can really be beneficial in business, more specifically for improving Customer and company value co-creating capabilities. Yet the same rule of diminishing returns applies:
Think how we can use TomTom GPS traffic data to show where traffic jams are, and how that data can be re-used to help governments better plan road works. In combination with smartphone location data it becomes even more accurate. On the other hand: combining this data with weather data doesn’t really help so much, does it? It may for those planning trips ahead of time, but not so much for your daily commute. Diminishing returns.
Thirdly: jobs-to-be-done, and the outcomes desired, really do not change that much in our society. The tools, techniques and methods we use to get them done and meet those outcomes do. Much as consumers always had the job of hanging a painting, or enjoying art in their homes, they now have completely new (digital) tools to get that job done, but the job hasn’t changed. The same goes for companies, imho. But please correct me if I’m wrong.
Thx for reading, your comment and the shout. Much appreciated.
Couldn’t agree more, Wim.
I think there are factors that support your contention as well. For example, why would any marketer believe that he or she has access to “big” data that their competitors don’t? Generally, our shopping and buying behavior in many categories cuts across companies. Piecing together the “big” data picture requires third-party data aggregators (e.g., Google, Apple, etc.) who will sell that data to all competitors in a space.
Second – and this factor applies to marketing efforts from long before “big” data came along – marketers tend to ignore exogenous factors when evaluating the success of their marketing efforts: factors like the health of the economy, which might drive up purchases across all competitors, or the poor financial and operational health of a competitor (e.g., during the BP crisis, other gas stations picked up business without doing anything).
I removed the link. You can still check the link to the article. The video should show in there as well. For your convenience: here it is: https://decorrespondent.nl/310/de-lessen-van-nassim-taleb/5833439260-b86defe2
Have a great sunday :)
Excellent post, continuing to question current trends by going back to basics.
Pls check the video link by Nassim Taleb; it seems to be broken (temporarily?).