In Defense of "Web Reporting"
Avinash’s last post attempted to describe The Difference Between Web Reporting and Web Analysis. While I have some quibbles with the core content of the post — the difference between reporting and analysis — I take real issue with the general tone that “reporting = non-value-add data puking.”
I’ve always felt that “web analytics” is a poor label for what most of us who spend a significant amount of our time with web behavioral data do day in and day out. I see three different types of information-providing:
- Reporting — recurring delivery of the same set of metrics as a critical tool for performance monitoring and performance management
- Analysis — hypothesis-driven ad hoc assessment geared towards answering a business question or solving a business problem (testing and optimization falls into this bucket as well)
- Analytics — the development and application of predictive models in the support of forecasting and planning
My dander gets raised when anyone claims or implies that our goal should be to spend all of our time and effort in only one of these areas.
Reporting <> (Necessarily) Data Puking
I’ll be the first person to decry reporting squirrel-age. I expect to go to my grave in a world where there is still all too much pulling and puking of reams of data. But (or, really, BUT, as this is a biggie), a wise and extremely good-looking man once wrote:
If you don’t have a useful performance measurement report, you have stacked the deck against yourself when it comes to delivering useful analyses.
It bears repeating, and it bears repeating that dashboards are one of the most effective means of reporting. Dashboards done well (and none of the web analytics vendors’ built-in dashboards are good enough to serve as the dashboarding tool) meet a handful of dos and don’ts:
- They DO provide an at-a-glance view of the status and trending of key indicators of performance (the so-called “Oh, shit!” metrics)
- They DO provide that information in the context of overarching business objectives
- They DO provide some minimal level of contextual data/information as warranted
- They DON’T exceed a single page (single eyescan) of information
- They DON’T require the person looking at them to “think” in order to interpret them (no mental math required, no difficult assessment of the areas of circles)
- They DON’T try to provide “insight” with every updated instance of the dashboard
The last item in this list uses the “i” word (“insight”) and can launch a heated debate. But, it’s true: if you’re looking for your daily, weekly, monthly, or real-time-on-demand dashboard to deliver deep and meaningful insights every time someone looks at it, then either:
- You’re not clear on the purpose of a dashboard, OR
- You count “everything is working as expected” as a deep insight
Below is a perfectly fine (I’ll pick one nit after the picture) dashboard example. It’s for a microsite whose primary purpose is to drive registrations to an annual user conference for a major manufacturer. It is produced weekly, in Excel, using data from SiteCatalyst, Twitalyzer, and Facebook. Is this a case of, as Avinash put it, us being paid “an extra $15 an hour to dump the data into Excel and add a color to the table header”? Well, maybe. But, by pulling from a clunky SiteCatalyst dashboard and taking a quick glance at Twitalyzer and Facebook, the weekly effort to compile it is 15 minutes. Is it worth $3.75 per week to get this? The client has said, “Absolutely!”
I said I would pick one nit, and I will. The example above does not do a good job of really calling out the key performance indicators (KPIs). It does, however, focus on the information that matters — how much traffic is coming to the site, how many registrations for the event are occurring, and what the fallout looks like in the registration process. Okay…one more nit — there is no segmentation of the traffic going on here. I’ll accept a slap on the wrist from Avinash or Gary Angel for that — at a minimum, segmenting by new vs. returning visitors would make sense, but that data wasn’t available from the tools and implementation at hand.
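To make the dos and don’ts above a bit more concrete, here is a minimal sketch of the kind of one-page layout I have in mind. To be clear, the dashboard described above is built in Excel; this is purely an illustration, in Python/matplotlib, with made-up metric names, numbers, and targets:

```python
# A hypothetical one-page KPI dashboard following the dos and don'ts above:
# at-a-glance trends, a target line for context, no chartjunk. All data here
# is invented for illustration.
import matplotlib.pyplot as plt

weeks = list(range(1, 11))   # ten weeks of (invented) history
kpis = {
    "Visits": [1200, 1350, 1280, 1500, 1620, 1580, 1710, 1850, 1790, 1930],
    "Event Registrations": [40, 55, 48, 62, 70, 66, 74, 81, 78, 90],
    "Registration Completion Rate (%)": [52, 54, 51, 56, 58, 55, 57, 60, 59, 61],
}
targets = {"Visits": 1800, "Event Registrations": 85,
           "Registration Completion Rate (%)": 60}

# One printed page, one panel per KPI: trend plus a target line for context.
fig, axes = plt.subplots(len(kpis), 1, figsize=(8.5, 11), sharex=True)
for ax, (name, values) in zip(axes, kpis.items()):
    ax.plot(weeks, values, color="steelblue", linewidth=2)
    ax.axhline(targets[name], color="gray", linestyle="--", linewidth=1)
    ax.set_title(f"{name}: {values[-1]:,} this week (target {targets[name]:,})",
                 loc="left", fontsize=11)
    # Strip non-data pixels so the trend is the only thing the eye has to parse.
    for spine in ("top", "right"):
        ax.spines[spine].set_visible(False)
axes[-1].set_xlabel("Week")
fig.suptitle("Conference Microsite: Weekly Performance", fontsize=14)
fig.tight_layout(rect=(0, 0, 1, 0.97))
fig.savefig("weekly_dashboard.pdf")   # a single page, a single eyescan
```

The tooling isn’t the point. The point is that each panel answers “are we on track?” at a glance: a trend, a target for context, and nothing that requires mental math.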
An Aside About On-Dashboard Text
I find myself engaged in regular debates as to whether our dashboards should include descriptive text. The “for” argument goes much like Avinash’s implication that “no text” = “limited value.” The main beef I have with baking a text block into a standardized report or dashboard is that it assumes there is roughly the same amount of commentary to deliver each time the report goes out. That isn’t my experience. In some cases, there are quite a few key callouts for a given report…and the text area isn’t large enough to fit them all. In other cases, in a performance monitoring context, there might not be much to say at all, other than, “All systems are functioning fine.” Invariably, when the latter occurs, in an attempt to fill the space, the analyst is forced to simply describe information already effectively presented graphically. This doesn’t add value.
If a text-based description is warranted, it can be included as companion material. <forinstance> “Below is this week’s dashboard. If you take a look at it, you will, as I did, say, ‘Oh, shit! We have a problem!’ I am looking into the [apparent calamitous drop] in [KPI] and will provide an update within the next few hours. If you have any hypotheses as to what might be the root cause of [apparent calamitous drop], please let me know.” </forinstance> This does two things:
- Enables the report to be delivered on a consistent schedule
- Engages the recipients in any potential trouble spots the (well-formed) dashboard highlights, and leverages their expertise in understanding the root cause
Which…gets us to…
Analysis
Analysis, by [my] definition, cannot be something that is scheduled/recurring/repeating. Analysis is hypothesis-driven:
- The dashboard showed an unexpected change in KPIs. “Oh, shit!” occurred, and some root cause work is in order
- A business question is asked: “How can we drive more Y?” Hypotheses ensue
If you are repeating the same analysis…you’re doing something wrong. By its very nature, analysis is ad hoc and varies from one effort to the next.
When it comes to the delivery of analysis results, the medium and format can vary. But, I try to stick with two key concepts — both of which are violated multiple times over in every example included in Avinash’s post:
- The principles of effective data visualization (maximize the data-pixel ratio, minimize the use of a rainbow palette, use the best visualization to support the information you’re trying to convey, ensure “the point” really pops, avoid pie charts at all costs, …) still need to be applied (see the sketch after this list)
- Guy Kawasaki’s 10-20-30 rule is widely referenced for a reason — violate it if needed, but only with extreme reluctance (aka, slideuments are evil)
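As a quick illustration of the first of those two concepts (again in Python/matplotlib, with invented numbers): rather than reaching for a rainbow-palette pie chart of traffic sources, a sorted bar chart in muted gray with a single highlight color keeps the data-pixel ratio high and makes “the point” pop:

```python
# A hypothetical "one point, one highlight" chart. The traffic-source numbers
# are invented for illustration.
import matplotlib.pyplot as plt

# Listed smallest-to-largest so the biggest bar lands at the top of the chart.
sources = ["Display", "Social", "Direct", "Organic Search", "Email"]
visits  = [600, 1900, 2400, 3100, 5200]

# Muted gray everywhere except the single finding we want to pop.
colors = ["#bbbbbb"] * len(sources)
colors[-1] = "#c0392b"   # "the point": Email is the workhorse channel

fig, ax = plt.subplots(figsize=(6, 3))
ax.barh(sources, visits, color=colors)
ax.set_title("Email drives more visits than any other channel", loc="left")

# Maximize the data-pixel ratio: direct labels, no frame, no axis ticks.
for spine in ax.spines.values():
    spine.set_visible(False)
ax.xaxis.set_visible(False)
for y, v in enumerate(visits):
    ax.text(v, y, f" {v:,}", va="center")

fig.tight_layout()
fig.savefig("one_point_chart.png", dpi=150)
```

One finding, one highlight color, and direct labels instead of gridlines: the chart does the interpreting so the audience doesn’t have to.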
While I am extremely wordy on this blog, and my emails sometimes tend in a similar direction, my analyses are not. When it comes to presenting analyses, analysts are well-served to learn from the likes of Garr Reynolds and Nancy Duarte on how to communicate effectively. It’s sooooo easy to get caught up in our own brilliant writing that we believe every word we write is being consumed with equal care (you’re on your third reading of this brilliant blog post, are you not? No doubt trying to figure out which paragraph most deserves to be immortalized as a tattoo on your forearm, right? You’re not? What?!!!). “Dumb it down” sounds like an insult to the audience, but it’s not. Whittle, hone, remove, repeat. We’re not talking hours and hours of iterations. We’re talking about simplifying the message and breaking it up into bite-sized, consumable, repeatable (to others) chunks of actionable information.
Analysis Isn’t Reporting
Analysis and reporting are unquestionably two very different things, but I don’t know that I agree with assertions that analysis requires an entirely different skillset from reporting. Meaningful reporting requires a different mindset and skillset from data puking, for sure. And, while reporting and analysis are distinct, you can’t be successful with the latter without being successful with the former.
Effective reporting requires a laser focus on business needs and business context, and the ability to crisply and effectively determine how to measure and monitor progress towards business objectives. In and of itself, that requires some creativity — there are seldom available metrics that are perfectly and directly aligned with a business objective.
Effective analysis requires creativity as well — developing reasonable hypotheses and approaches for testing them.
Both reporting and analysis require business knowledge, a clear understanding of the objectives for the site/project/campaign/initiative, a better-than-solid understanding of the underlying data being used (and its myriad caveats), and effective presentation of information. These skills make up the core of a good analyst…who will do some reporting and some analysis.
What About Analytics?
I’m a fan of analytics…but see it as pretty far along the data maturity continuum. It’s easy to pooh-pooh reporting by pointing out that it is “all about looking backwards” or “looking at where you’ve been.” But, hey, those who don’t learn from the past are condemned to repeat it, no? And, “How did that work?” or “How is that working?” are totally normal, human, helpful questions. For instance, say we ran a campaign for a client that, from the client’s perspective, was a fantastic success! But, when it came to what it cost us to deliver it, the results were abysmal. Without an appropriate look backwards, we very well might do the next project the same way — good for the client, perhaps, but not for us.
In general, I avoid using the term “analytics” in my day-to-day communication. The reason is pretty simple — it’s not something I do in my daily job, and I don’t want to put on airs by applying a fancy word to good, solid reporting and analysis. At a Web Analytics Wednesday (WAW) once, I actually heard someone say that they did predictive modeling. When pressed (not by me), it turned out that, to this person, predictive modeling meant “putting a trendline on historical data.” That’s not exactly congruent with my use of the term “analytics.”
Your Thoughts?
Is this a fair breakdown of the work? I scanned through the comments on Avinash’s post as of this writing, and I’m feeling as though I am a bit more contrarian than I would have expected.