Why I Don’t Put Recommendations on Dashboards
WARNING: Gilligan contrarianism alert! The following post posits a thesis that runs contrary to popular opinion in the analytics community.
Many companies these days rely on some form of internal dashboard(s). That’s a good thing. Even better is when those companies have actually automated these dashboards – pulling data from multiple data sources, structuring it in a way that directly maps to business objectives, and delivering the information in a clean, easy-to-digest format. That’s nirvana.
The reality, more often than not, is that the dashboards can only be partially automated. They wind up being something an analyst needs to at least lightly touch to bridge inevitable API gaps before delivering them on some sort of recurring schedule: through email, through an intranet, or even in person in a regularly scheduled meeting.
So, what is the purpose of these dashboards? Here’s where a lack of clearly communicated purpose becomes a slippery slope faster than Miley Cyrus can trigger a TV viewer’s gag reflex. Dashboards are, first and foremost, performance measurement tools. They are a mechanism for quickly (at a glance!) answering a single question:
“Are we achieving the goals we set out to achieve?”
They can provide some minimal context around performance, but everything beyond answering that question is a distant second purpose-wise.
It’s easy enough to wax sophomoric on this. It doesn’t change the fact, though, that one of the top complaints dashboard-delivering analysts hear is: “I get the [weekly/monthly/quarterly] dashboard from the analyst, but it doesn’t have recommendations on it. It’s just data!”
I get it. And, my response? When that complaint is leveled, it’s a failure on the part of the analyst to educate (communicate), and a failure of process — a failure to have mechanisms in place to deliver actionable analytical results in a timely and effective manner.
But…here…I’m just going to lay out the various reasons that dashboards are not the place to deliver recommendations, because, in my experience, analysts hear that complaint and respond by trying to add recommendations to their dashboards. Why shouldn’t they? I can give four reasons!
Reason No. 1: Dashboards Can’t Wait
Another complaint analysts often hear is that dashboards aren’t delivered quickly enough at the end of the reporting period. Well, no one, as far as I know, has found a way to stop time. It marches on inexorably, with every second taking exactly one second, every minute having a duration of 60 seconds, and every hour having a duration of 60 minutes (crappy Adam Sandler movies — pardon the adjectival redundancy — notwithstanding).
Given that, let’s step back and plot out a timeline for what it takes in an “insights and recommendations delivered with the dashboard” scenario for a dashboard that gets delivered monthly:
1. Pull data (can’t happen until the end of the previous month)
2. Consolidate data to get it into the dashboard
3. Review the data — look at KPIs that missed targets and supporting metrics that moved unexpectedly
4. Dig in to do analysis to try to figure out why those anomalies appeared
5. IF the root cause is determined, assess whether this is something that needs “fixing” and posit ways that it might be fixable
6. Summarize the results — the explanation for why those anomalies appeared and what might be done to remedy them going forward (if the root cause was something that requires a near-term change)
7. Add the results to the dashboard
8. Deliver the dashboard
9. [Recipient] Review the dashboard and the results
10. [Recipient] Decide whether to take action
11. [Recipient] If action will be taken, then take the action
Seems like a long list, right? I didn’t pad it with extra steps just to make it look long. What’s interesting is that steps 1 and 2 can (and should!) be shortened through automation. Aside from systems that are delayed in making their data available, there is no reason that steps 1 and 2 can’t be done within hours (or a day) of the end of the reporting period.
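To make “shorten steps 1 and 2 through automation” concrete, here is a minimal sketch in Python with pandas. The file names, column names, and KPIs are hypothetical placeholders, and any real version would depend on the actual data sources involved; the point is simply that the pull-and-consolidate work can run unattended right after the period closes.

```python
# Minimal sketch: automate the "pull" and "consolidate" steps of a monthly dashboard.
# File names and column names are hypothetical placeholders.
import pandas as pd

# Step 1: "Pull data" -- here, daily exports already dropped by scheduled jobs.
web = pd.read_csv("web_analytics_daily.csv", parse_dates=["date"])   # visits, orders
spend = pd.read_csv("media_spend_daily.csv", parse_dates=["date"])   # channel spend

# Step 2: "Consolidate data" -- roll both sources up to the month and join them.
web_monthly = web.resample("MS", on="date")[["visits", "orders"]].sum()
spend_monthly = spend.resample("MS", on="date")[["spend"]].sum()
dashboard = web_monthly.join(spend_monthly)

# Derive the KPIs the dashboard actually reports.
dashboard["conversion_rate"] = dashboard["orders"] / dashboard["visits"]
dashboard["cost_per_order"] = dashboard["spend"] / dashboard["orders"]

# Hand the consolidated table to whatever renders the dashboard.
dashboard.to_csv("dashboard_data.csv")
```

Scheduled to run the morning after month-end, something in this spirit is what turns steps 1 and 2 into hours rather than days.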
Steps 3 through 7, though, are time-consuming. And, often, they require conversations and discussion — not to mention time to actually conduct analysis. Despite vendor-perpetuated myths that “the tool” can generate recommendations… tools really suck at doing so (outside of highly operationalized processes).
Here’s the other kicker, though: steps 9 through 11 take time, too! So, realistically, let’s say that steps 1 and 2 take a day, steps 3 through 8 take a week, steps 9 and 10 take 3 days (because the recipient doesn’t drop everything to review the dashboard when it arrives), and then step 11 takes a week (because “action” actually requires marshalling resources and getting something done). That means — best case — we’re 2.5 weeks into the month before action gets taken.
So, what happens at the end of the month? The process repeats, but the change was actually in place for only about 1.5 weeks… which could easily get dwarfed by the 2.5 weeks of the status quo!!!
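For the record, here is the arithmetic behind those figures, using the illustrative durations above (in calendar days, so the totals land within rounding of the 2.5 and 1.5 weeks in the prose):

```python
# Tally the illustrative calendar-day durations from the scenario above.
timeline_days = {
    "steps 1-2 (pull and consolidate)": 1,
    "steps 3-8 (analyze, summarize, deliver)": 7,
    "steps 9-10 (recipient reviews and decides)": 3,
    "step 11 (action gets taken)": 7,
}

days_until_action_is_live = sum(timeline_days.values())              # 18 days
weeks_until_action_is_live = days_until_action_is_live / 7           # ~2.6 weeks (roughly 2.5)
weeks_remaining_in_4_week_month = 4 - weeks_until_action_is_live     # ~1.4 weeks (roughly 1.5)

print(f"~{weeks_until_action_is_live:.1f} weeks until the action is live; "
      f"~{weeks_remaining_in_4_week_month:.1f} weeks of the month left to see its effect")
```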
Let’s look at how a “dashboard without insights” process can work:
1. Pull data (can’t happen until the end of the previous month)
2. Consolidate data to get it into the dashboard
3. Deliver the dashboard (possibly calling out any anomalies or missed targets; see the sketch below)
4. [Recipient] Review the dashboard and home in on anything that looks troubling that she cannot immediately explain (more on that in the next section)
5. The analyst and the recipient identify what, if any, trouble spots require deeper analysis and jointly develop actionable hypotheses to dig into
6. The analyst conducts a very focused analysis (or, in some cases, proposes an A/B test) and delivers the results
7. [Recipient] If action is warranted, take action
Time doesn’t stop for this process, either. But, it gets the information into the business’s hands inside of 2 days. The analyst doesn’t waste time discovering root causes that the business owner already knows (see the next section). The analysis that gets conducted is focused and actionable, and the business owner is already primed to take action, because she participated in determining which analyses made the most sense.
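Since step 3 mentions calling out anomalies or missed targets, here is a minimal sketch of what that call-out might look like. The KPI names, actuals, targets, and the 5% tolerance are all hypothetical placeholders:

```python
# Minimal sketch of "calling out any anomalies or missed targets" at delivery time.
# The KPI names, values, and tolerance are hypothetical placeholders.
kpis = {
    # kpi: (actual, target)
    "visits":          (182_000, 200_000),
    "conversion_rate": (0.021,   0.020),
    "revenue":         (412_000, 450_000),
}

def build_callouts(kpis, tolerance=0.05):
    """Flag any KPI that missed its target by more than the tolerance."""
    callouts = []
    for name, (actual, target) in kpis.items():
        shortfall = (target - actual) / target
        if shortfall > tolerance:
            callouts.append(f"{name} missed target by {shortfall:.0%} -- worth a conversation?")
    return callouts

for line in build_callouts(kpis):
    print(line)
```

Note that the output is deliberately a prompt for the conversation in step 4, not a recommendation: it flags what missed, and the analyst and the business decide together whether it warrants a deeper look.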
Reason No. 2: Analysts Aren’t Omniscient
I alluded to this twice in the prior section. Let’s look at a generic and simplistic (but based on oft-observed real-world experience) example:
1. The analyst compiles the dashboard and sees that traffic is down
2. The analyst digs into the traffic sources and sees that paid search traffic is down dramatically
3. The analyst digs in further and sees that paid search traffic went to zero on the 14th of the month and stayed there
4. The analyst fires off an urgent email to the business that paid search traffic went to zero mid-month and that something must be wrong with the site’s SEM!
5. The business responds that SEM was halted mid-month due to budget adjustments, and they’ve been meaning to ask what impact that has had
What’s wrong with this picture? Steps 2 through 4 are largely wasted time and effort! There is very real analysis to be done… but it doesn’t start until after step 5, when the business provides some context and is ready for a discussion.
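As an aside, the mechanical part of steps 2 and 3, noticing that a channel flatlined mid-month, is cheap to script. Here is a minimal sketch, assuming a hypothetical daily export with date, channel, and sessions columns:

```python
# Minimal sketch of the mechanical detection in steps 2-3 above: find the day a
# channel's traffic went to zero and stayed there. Column names are hypothetical.
import pandas as pd

daily = pd.read_csv("traffic_by_channel_daily.csv", parse_dates=["date"])
paid = daily[daily["channel"] == "paid search"].sort_values("date")

# First date after which paid search sessions were zero for the rest of the month.
zero_days = paid[paid["sessions"].eq(0)]
if not zero_days.empty:
    first_zero = zero_days["date"].iloc[0]
    if paid.loc[paid["date"] >= first_zero, "sessions"].eq(0).all():
        print(f"Paid search flatlined to zero starting {first_zero:%Y-%m-%d}")
```

What no script can supply is the context the business already had: that SEM was deliberately paused. The detection is the cheap part; the conversation is where the value starts.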
This happens all the time. It’s one of the reasons that it is imperative that analysts build strong relationships with their marketing stakeholders, and one of the reasons that a sign of a strong analytics organization is that members of the team are embedded – literally or virtually – in the teams they support.
But, even with a strong relationship, co-location with the supported team, regular attendance at the team’s recurring meetings, and a spot on the team’s email distribution list, analysts are seldom aware of every activity that might result in an explainable anomaly in the results delivered in a dashboard.
This gets at a data source that is ignored all too often: the minds and memories of the marketing team. There is nothing at all wrong with an analyst making the statement: “Something unexpected happened here, and, after I did some cursory digging, I’m not sure why. Do you have any ideas as to what might have caused this?” There are three possible responses from the marketer who is asked this question:
- “I know exactly what’s going on. It’s almost certainly the result of X.”
- “I’m not sure what might have caused that, but it’s something that we should get to the bottom of. Can you do some more digging to see if you can figure it out?”
- “I’m not sure what might have caused that, but I don’t really care, either. It’s not important.”
These are quick answers to an easy question that can direct the analyst’s next steps. And, two of the three possible answers lead to a next step of moving on to a value-adding analysis — not pursuing a root cause that will lead to no action! Powerful stuff!
Reason No. 3: Insights Don’t Have a Predictable and Consistent Length
I see it all the time: a standard dashboard format that, appropriately, has a consistent set of KPIs and supporting metrics carefully laid out in a very tightly designed structure. Somewhere in that design is a small box – at the top of the dashboard, at the bottom right of the dashboard, somewhere – that has room for a handful of bullet points or a short paragraph. This area of the dashboard often has an ambitious heading: “Insights,” “Recommendations,” “Executive Summary.”
The idea – conceived either on a whiteboard with the initial design of the dashboard, or, more likely, added the first time the dashboard was produced – is that this is where the analyst’s real value will be manifested. THIS is where the analyst will place the Golden Nuggets of Wisdom that have been gleaned from the data.
Here’s the problem: some of these nuggets are flakes of dust, and some are full-on gold bars. Expecting insights to fit into a consistent, finite space week in and week out or month in and month out is naïve. Sometimes, the analyst has half a tweet’s worth of prose-worthy material to include, which makes for a largely empty box, leaving the analyst and the recipient to wonder if the analyst is slacking. At other times, the analyst has a handful of useful nuggets to impart…but then has to figure out how to distill a WordPress-sized set of information into a few tweet-sized statements.
Now, if you buy into my first two reasons as to why recommendations shouldn’t be included with the dashboard in the first place, then this whole section becomes moot. But, if not — if you or your stakeholders still insist that performance measurement include recommendations — then don’t constrain that information to a fixed box on the dashboard.
Reason No. 4: Insights Can’t Be Scheduled
A scene from The Marketer and the Analyst (it’s a gripping — if entirely fictitious — play):
Marketer: “This monthly dashboard is good. It’s showing me how we’re doing. But, it doesn’t include any insights based on the performance for the month. I need insights to take action!”
Analyst: “Well, what did you do differently this month from previous months?”
Marketer: “What do you mean?”
Analyst: “Did you make any changes to the site?”
Marketer: “Not really.”
Analyst: “Did you change your SEM investment or strategy?”
Marketer: “No.”
Analyst: “Did you launch any new campaigns?”
Marketer: “No.”
Analyst: “Were there any specific questions you were trying to answer about the site this month?”
Marketer: “No.”
Analyst: ???!
Raise your hand if this approximates an exchange you’ve had. It’s symptomatic of a completely ass-backward perception of analytics: that the data is a vast reserve of dirt and rock with various veins of golden insights threaded throughout. And, that the analyst merely needs to find one or more of those veins, tap into them, and then produce a monthly basket of new and valuable ingots from the effort.
The fact is, insights come from analyses, and analyses come from hypotheses. Some analyses are small and quick. Some are large and require gathering data – through an A/B or multivariate test, for instance, or through a new custom question on a site survey. Confusing “regularly scheduled performance measurement” with “hypothesis-driven analysis” has become the norm, and that is a mistake.
While it is absolutely fine to measure the volume and value of analyses completed, it is a recipe for failure to expect a fixed number of insights to be driven from and delivered with a scheduled dashboard.
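To show what hypothesis-driven analysis looks like next to scheduled reporting, here is a sketch of the A/B-test case mentioned above: a simple two-proportion z-test in Python, with entirely made-up counts (a real test would also need a pre-planned sample size and run length):

```python
# Minimal sketch of a hypothesis-driven analysis: a two-proportion z-test on
# entirely made-up A/B test counts.
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return the z statistic and two-sided p-value for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothesis: the new landing page (B) lifts conversion vs. the control (A).
z, p = two_proportion_z_test(conv_a=400, n_a=20_000, conv_b=470, n_b=20_000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

The trigger for this kind of work is a hypothesis and a deliberate data-gathering effort, not a calendar date rolling over.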
A Final Word: Dashboards vs. Reports
Throughout this post, I’ve discussed “dashboards.” I’ve steered clear of the word “report,” because it’s a word that has become pretty ambiguous. Should a report include insights? It depends on how you define a report:
- If the report is the means by which, on a regularly scheduled basis, you communicate how a [site/campaign/channel/initiative] is performing, then my answer is: “No.” Reasons 1, 2, and 4 explain why.
- If the report is the vehicle for delivering the results of a hypothesis-driven analysis (or set of hypothesis-driven analyses), then my answer is, “Perhaps.” But…why not call it “Analysis Results” to remove the ambiguity about what it is?
- If the report is intended to be a combination of both of the above, then you will likely wind up delivering a rambling deck of 25+ slides that — despite your adoration for the content within — is going to struggle to hold your audience’s attention and is going to do a poor job both of measuring performance and of delivering clearly actionable analysis results.
We live in a real-time world. Consumers — all marketers have come to accept — have short attention spans and consume content in bite-sized chunks. An effective analyst delivers information that is super-timely and is easily digestible.
So. Please. Don’t spend 3 weeks developing insights and recommendations to include on a 20-page document labeled “dashboard.”