Top 5 Metrics You're Measuring Incorrectly … or Not
Last night as I was casually perusing the day’s digital analytics news — yes, yes, I really do that — I came across a headline and article that caught my eye. While the article’s title (“Top 5 Metrics You’re Measuring Incorrectly”) is the sort I am used to seeing in our Buzzfeed-ified world of pithy “made you click” headlines, it was the article’s author that got my attention. Whereas these pieces are usually authored by well-meaning startups trying to “hack growth” … this one was written and published by Jodi McDermot, a Vice President at comScore and the current President of the Digital Analytics Association.
I have known Jodi for many years, and we were co-workers at Visual Sciences back in the day. I have tremendous respect for Jodi and the work she has done, both at comScore and in the industry in general. That said, her blog post is the kind of vendor-centric FUD (fear, uncertainty, and doubt) that, at least when published by a credible source like comScore, creates unnecessary consternation within Enterprise leadership, consternation that has the potential to trickle down to the very analysts she champions at the DAA.
Gross.
Jodi does not mince words in her post, opening with the following (emphasis mine):
“With the availability of numerous devices offering web access, daily usage trends, and multi-device ownership by individual consumers, **traditional analytics are not only misleading, but often flat out wrong**.”
While open to interpretation, it is not unreasonable to read Jodi as saying that companies that have invested heavily in analytics platforms from Adobe, Google, Webtrends, IBM, etc. are just wasting money and, worse, that the analysts they pay good salaries to are somehow allowing this to happen. She goes on to detail a handful of metrics that are negatively impacted by the multi-device issue, essentially creating fear, uncertainty, and doubt about the data that we all recognize is core to any digital analytics effort in the Enterprise.
Now, at this point it is worth pointing out that I don’t fundamentally disagree with Jodi’s main thesis: multi-device fragmentation is happening and, if not addressed, has the potential to impact your digital analytics reporting and analysis efforts. But making the jump from “potential” to “traditional analytics are not only misleading, but often flat out wrong” is a mistake for several reasons:
- Assuming analysts aren’t already taking device fragmentation into account is likely wrong. It’s not as if multi-device fragmentation is a new problem … we have been talking about issues related to the use of multiple computers/browsers/devices for a very, very long time. Jodi’s post seems to imply that digital analysts (and DAA members) are ignoring the issue and simply puking data into reports.
- Assuming consumers are doing the same thing on different devices is likely wrong. This is more of a gray area, since it depends on what the site is designed to do, but when Jodi says that “conversion rate metrics must follow the user, not the device,” she is assuming that consumers are just as likely to make a purchase on a small screen as on a large one. I am sure there is more recent data, but a quick Google search finds that less than 10% of the e-commerce market was happening on mobile devices in Q2 2013 (see the illustration just after this list).
- Assuming the technology exists to get a “perfect” picture of cross-device behavior is flat-out wrong. This is my main beef with Jodi’s post; while she never comes out and says “comScore Digital Analytix is the solution to all of these problems,” you don’t have to read between the lines very much to get to that conclusion. The problem is that, while many companies are working on this issue from an analytical perspective (e.g., Google, Adobe, Facebook, etc.), the consensus is that a universal solution has yet to emerge and, if you’re an old, jaded guy like me, you suspect one is unlikely to emerge anytime soon.
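To make the second bullet concrete, here is a minimal sketch (in Python, with entirely made-up device mix, visit, and order numbers) of why reporting conversion rate by device, rather than as a single blended figure, changes the story an analyst tells:

```python
import pandas as pd

# Hypothetical traffic rollup; the figures are invented purely to
# illustrate the point, not pulled from any real report.
traffic = pd.DataFrame({
    "device": ["desktop", "mobile", "tablet"],
    "visits": [800_000, 150_000, 50_000],
    "orders": [24_000, 1_500, 1_000],
})

# Per-device conversion rate: the view an analyst would actually report.
traffic["conversion_rate"] = traffic["orders"] / traffic["visits"]

# Blended rate: what you get if you pretend every device behaves the same.
blended = traffic["orders"].sum() / traffic["visits"].sum()

print(traffic[["device", "conversion_rate"]])
print(f"Blended conversion rate: {blended:.2%}")
```

In this toy example the blended 2.65% rate describes no individual device, and a shift in device mix could move it without any change in how consumers actually behave — which is exactly the kind of thing a competent analyst already segments for.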
I don’t fault Jodi for being a fangirl for comScore — that is her job — but implying that all other technology is broken and (by extension) that analysts not using comScore technology are misleading their business partners is unfair, irresponsible, or both. The reality is that, at least within our client base, this is a known issue that is being addressed in multiple ways. Through sampling, segmentation, the use of technologies like Digital Analytix, and good old-fashioned data analysis (sketched below), our clients have largely been able to reconcile the issues Jodi describes, to the point that the available data is treated as gospel within the digital business.
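For the curious, here is one way that kind of sampling-and-segmentation reconciliation might look — a back-of-the-envelope sketch, not anyone’s official methodology. The idea is to use the authenticated slice of traffic, where devices can be tied to a person, to estimate a devices-per-person ratio, and then apply it to the cookie-based visitor count. All of the figures and variable names below are invented for illustration.

```python
# Back-of-the-envelope cross-device adjustment (illustrative numbers only).
# Assumption: the logged-in audience's device behavior is roughly
# representative of the site as a whole -- a caveat worth documenting.

logged_in_device_ids = 120_000  # unique cookies/device IDs seen for authenticated users
logged_in_people = 80_000       # distinct authenticated users behind those IDs

devices_per_person = logged_in_device_ids / logged_in_people  # 1.5 in this example

reported_unique_visitors = 1_000_000  # the cookie-based count from the analytics tool
estimated_people = reported_unique_visitors / devices_per_person

print(f"Devices per person (logged-in sample): {devices_per_person:.2f}")
print(f"Reported unique visitors (cookies):    {reported_unique_visitors:,}")
print(f"Estimated unique people:               {estimated_people:,.0f}")
```

It is crude, but it turns “the data is flat out wrong” into “the data needs a documented adjustment,” which is a very different conversation to have with Enterprise leadership.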
What’s more, while comScore data can be useful for very large sites, in my experience sites that don’t have massive traffic volumes (and thus representation in the comScore panel) often fail the basic “sniff” test for data quality at the site level. I do admit, however, that as a firm we don’t see Digital Analytix all that often among our Enterprise-class clients, so perhaps there are updates we are not privy to that address this issue.
What do you think? Are you an analyst who lies awake at night, sweating and stressing over multi-device consumers? Do you dread producing analysis knowing that the data you are about to present is “misleading and flat out wrong”? Or have you taken consumer behavior into account and continue to monitor said behavior for other potential business and analysis opportunities?
Comments are always welcome. Or, if you want to debate this in person, find me at ACCELERATE 2014 in Atlanta, Georgia on September 18th.