There are few sentence openers that set off analytics alarm bells for me more than:
“It would be interesting to see…”
That phrase gives me a 15-minute cardio workout without needing to get up from my chair. Really. (Well. Almost. I definitely have developed a physiological reaction to the phrase — elevated heart rate, burning sensation in my ears, etc.)
The curse of the analyst is how often we find ourselves producing results that fall into the dastardly void of “interesting but not actionable.” The real kicker is that, if we let ourselves be guided by “interesting”-based requests, and we repeatedly deliver analyses that scratch those curiosity itches, then, over time, we wind up with the worst possible feedback:
“The reports you deliver have a lot of interesting information…but they don’t have insights. They don’t make recommendations. They don’t tell me what I should do.”
In other words, the reports and analyses we’ve been delivering are exactly what was requested, but seldom produce actionable insights and, ultimately, lead to nothing more than a big, fat, depressing, “So, what?”
The root cause of this vicious and soul-sucking ritual goes back to the beginning: a lack of discipline in the intake of the request itself. The “So what?” test can be applied much more cheaply at the point of intake than after a full-blown analysis has been conducted and presented!
In a perfect world, where egos do not have to be protected and where organizational pecking orders are not part of the identification and qualification of requests for analysis, it’s simple (the request below is essentially verbatim from an email, just with masked/bracketed details — the subsequent exchange is idealized and fictitious):
Executive: I woke this morning thinking about an interesting impact metric for our website. To what extent does a blip on Twitter result in a blip on our website? For example, I suspect that something related to [cultural topic] was trending on [date]. Did we also see an increase in traffic on our website? We could play around with the metric and use our website analytics to see where people were going.
Analyst: That’s a pretty broad ask. Can you help me understand what we would do with that data?
Executive: We’d have a better understanding of, when a relevant topic spikes in social media, how people behave on our site.
Analyst: Okay. But wouldn’t you expect that to change depending on the topic? And I’m trying to envision what the results of such an analysis would look like such that you could easily act on the information.
Executive: Well…er…hmmm. That’s a good question. I guess it would…well…no…maybe not. Um. Wow. I really hadn’t thought this through. Now that you’re asking me to…I don’t think this would actually be all that useful. And, you would probably have to spend a lot of time to respond to the request. Thanks for probing a bit rather than just running off and spinning your wheels on my behalf!
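For what it’s worth, the literal analysis the request describes isn’t hard: it boils down to a before/after comparison of site traffic around the trending date. Here’s a minimal sketch (all of the session counts and the 2-standard-deviation threshold are hypothetical; in practice you’d pull daily sessions from your analytics platform):

```python
from statistics import mean, stdev

# Hypothetical daily sessions for the two weeks before the trending date
# (in practice, pulled from your web analytics tool), plus the trending date.
baseline_sessions = [10_200, 9_800, 10_500, 9_900, 10_100, 10_400, 9_700,
                     10_300, 10_000, 9_600, 10_250, 10_150, 9_950, 10_050]
trending_day_sessions = 13_400

# Flag a "blip" if the trending day sits more than 2 standard deviations
# above the baseline mean. The threshold is an arbitrary choice.
mu, sigma = mean(baseline_sessions), stdev(baseline_sessions)
z_score = (trending_day_sessions - mu) / sigma

print(f"baseline mean: {mu:.0f}, trending day: {trending_day_sessions}, z = {z_score:.1f}")
print("Blip!" if z_score > 2 else "Within normal variation.")
```

Which is exactly the point: producing a “yes, there was a blip” is cheap. The expensive part, and the part the “So what?” test protects, is figuring out what anyone would actually do with that answer.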
We don’t live in a perfect world. Vague requests are going to get floated. Organizational hierarchies and relationship-building realities mean that, as analysts, we do have to chase “interesting” requests that lead nowhere. But, that doesn’t mean we shouldn’t recognize and strive to minimize how often that happens. Here’s how you can do that:
- Condition yourself to go to full alert whenever the word “interesting” is used (as well as “understand,” as in, “I want to understand…<some sort of behavior>,” and, heck, let’s throw in “play around” from the example above!)
- When you’re on full alert, probe for clarification as much as you can without being an ass — be delicate!
- Even if you are not able to probe very much, ask yourself the “So what?” question repeatedly — frame the specific questions that you might try to answer based on the request, envision a possible answer, and then apply the litmus test: “So what? If that was the answer, what would we do differently than we’re doing now?”
These tips are all pre-analysis, but they pay off downstream: they focus the analysis itself and sharpen the way the results get delivered.