When it comes to data, more isn't always better. The challenge is applying the more insightful old-school methodologies to the digital customer experience.
As marketers, we're constantly drowning in data meant to deepen our understanding of customers' behaviors and needs; and, let's be honest, many of us tend to inflate how much that data actually tells us. Don't get me wrong: metrics can be very useful, but that's because they feed a feedback loop, not because any one of them is a "smoking gun." No metric on its own can explain the correlation between a marketing tactic and a customer behavior.
In addition, metrics can't tell you all of the dimensions you need to be paying attention to; for example, whether your engagement messaging is actually hitting the right target.
It all comes down to the difference between just seeing data and understanding what that data means in a larger business-impact context.
Good data and bad marketing
Consider this example from a previous company I worked at that placed a lot of emphasis on metrics. The email marketing team was very proud to have achieved a 15 percent lift in conversions in its latest campaign. Upon closer examination, though, we found that the actual benefit to the business had declined. Not only were opt-outs increasing – shrinking our marketable universe – but revenue per user was down compared to the previous campaign. The marketer in charge of the campaign meant well but was too focused on the tactical data and the KPIs set for the email program. Along the way, he'd completely lost sight of the context within which those KPIs lived.
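A quick back-of-the-envelope calculation makes the trap concrete. The figures below are entirely hypothetical (the original campaign's numbers weren't disclosed), but they show how a 15 percent conversion lift can coexist with lower net revenue once opt-outs and revenue per user move the wrong way:

```python
# Hypothetical numbers illustrating how a conversion lift can mask a net loss.
def campaign_revenue(audience, conversion_rate, revenue_per_user, opt_out_rate):
    """Net revenue, plus the audience left marketable for the next campaign."""
    revenue = audience * conversion_rate * revenue_per_user
    remaining_audience = audience * (1 - opt_out_rate)
    return revenue, remaining_audience

# Previous campaign: 100k reachable users, 2.0% conversion, $50/user, 1% opt-out.
prev_rev, prev_audience = campaign_revenue(100_000, 0.020, 50, 0.01)

# "Successful" campaign: a 15% conversion lift (2.0% -> 2.3%), but lower
# revenue per user and triple the opt-out rate.
new_rev, new_audience = campaign_revenue(prev_audience, 0.023, 40, 0.03)

print(f"Previous: ${prev_rev:,.0f} from {prev_audience:,.0f} marketable users")
print(f"New:      ${new_rev:,.0f} from {new_audience:,.0f} marketable users")
```

The conversion KPI improves, yet both revenue and the marketable universe shrink, which is exactly the pattern the dashboard alone didn't surface.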
What's more, many marketers let their objectives get blurry. The distinction between awareness and demand generation, for instance, can shift depending on which tactics you're using. I once worked on an online advertising campaign designed to create awareness of a new version of the company's core product. After three months of execution, the program was deemed a success, since the view-through and click-through data showed it was influencing our target audience. Yet, when we dug deeper, we realized that the ad creative didn't even mention the new product – and only people who clicked through would see the brand name at all. Logically, there is no way that ad could have worked, yet all the metrics said it did. Why?
The meanings of metrics
Executives are always asking for the ability to fit all the data together, to see it all in one place. What they want, in other words, is a centralized dashboard that provides insight into how and why certain marketing activities contribute to certain results. But as we all know, that kind of dashboard doesn't exist. A dashboard can show you elaborate lists of metrics, sure, but those metrics alone won't tell you why subscriptions went up or down. All they can tell you is that something has changed.
In fact, no single metric can tell you whether a given change is positive or negative, in terms of your sales and marketing goals. Say, for example, usage is going up. Seems like a good thing, right? But this might point back to the fact that usage was unexpectedly low for most of the product's history. By the same token, flat usage might seem like an obvious negative, but it may mean that you lost some of your lower-value customers while gaining higher-value users in other segments.
To put a finer point on it, no metric can perfectly tell you whether you're optimizing for the right goals. If you're optimizing to spend your budget more carefully, that may temporarily lower your revenue, which means a short-term rise in revenue might actually signal a drop in your return on investment (ROI). Conversely, if you're optimizing for more revenue, the return on each individual investment may temporarily drop, so a report that shows lower-than-usual ROI might be exactly what you want.
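To illustrate with made-up figures: the same channel run as a lean plan versus a growth plan can rank in opposite order depending on whether you score it by ROI or by total revenue. All numbers here are hypothetical and assume diminishing returns as budget grows:

```python
# Hypothetical spend/revenue figures; returns diminish as budget grows.
def roi(revenue: float, spend: float) -> float:
    """Return on investment: net gain per dollar spent."""
    return (revenue - spend) / spend

lean_spend, lean_revenue = 10_000, 40_000        # careful-spend plan
growth_spend, growth_revenue = 50_000, 120_000   # revenue-maximizing plan

print(roi(lean_revenue, lean_spend))      # 3.0 -> wins on ROI
print(roi(growth_revenue, growth_spend))  # 1.4 -> wins on revenue
```

Neither number is "good" or "bad" on its own; which report looks right depends entirely on which of the two goals you set out to optimize.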
Understanding metrics requires more than just a dashboard; it requires iteration and inspection. Marketers have to see beyond the individual pieces and understand how the puzzle fits together to create the whole picture. Even more, understanding requires actual observation of your customers.
It’s easy to bring in a sleek presentation, walk through the numbers and say, "I have this under control." Data looks neat, but the real world is messy, and the only way to get truly meaningful customer insights is to get your hands a little dirty.
In the pre-internet era, this truth tended to go without saying. Before we had metrics on views and click-throughs, we'd sit behind a one-way mirror and watch focus groups. We'd listen carefully as consumers watched our ads, played with our products and shared their thoughts. That approach wasn't fast, and it wasn't self-service the way modern metrics are. It was heavy on insight and light on data. Today's approach is fast and heavy on data, but we're sacrificing a lot of insight into how our customers actually behave and why they take the actions they do.
That's why more customer data isn't necessarily a good thing. To adapt and remain agile, you need to understand your audience, and the only way to get that understanding is to view their actions in context. The challenge is applying this old-school methodology to the digital customer experience.