Analytics Aphorisms — Gilligan-Style
Last week, I had the pleasure of presenting at a SEER Interactive conference titled “Marketing Analytics: Proving and Improving Online Performance.” The conference was at SEER’s main office, which is an old church (“old” as in “built in 1850” and “church” as in “yes…a church”) in the Northern Liberties part of Philadelphia. The space itself is, possibly, the most unique venue I’ve presented in to date (photo courtesy of @mgcandelori — click to view the larger version…that’s real stained glass!):
As luck would have it, Michele Kiss attended the conference, which meant all of the speakers got a pretty nice set of 140-character notes on the highlights of what they’d said.
Reviewing the stream of tweets afterwards, I realized I’ve developed quite a list of aphorisms that I tend to employ time and again in analytics-oriented conversations. I’m sufficiently self-aware that I’ll often preface them with, “So, this is soapbox #23,” but, perhaps, not self-aware enough to not actually spout them!
The occasion of standing on an actual altar (SEER maintained much of the space’s original layout) seemed like a good time to put together a partial catalog of my go-to one-liners. Enjoy!
Being data-driven requires People AND Process AND Technology
I beat this drum fairly often. It’s not enough to have a pristine and robust technology stack. Nor is it sufficient to have great data platforms and great analysts. I believe — firmly — that successful companies have to have a solid analytics process, too. Otherwise, those wildly-in-demand analysts sifting through exponentially growing volumes of data don’t have a prayer. Effective analytics has to be efficient analytics, and efficiency comes from a properly managed process for identifying what to test and analyze.
Identifying KPIs is nothing more than answering two questions: 1) What are we trying to achieve? and 2) How will we know if we’ve done that?
I’ve got to credit former colleague Matt Coen for the clarity of these. I like to think I’ve done a little more than just brand them “the two magic questions,” but it’s possible that I haven’t! The point here is that business-speak is, possibly, more vile than Newspeak. “Goals” vs. “strategies” vs. “objectives” vs. “tactics” — these are all words that different people define in different ways. I actually have witnessed — on multiple occasions — debates between smart people as to whether something is an “objective” or a “strategy.”
As soon as we use the phrase “key performance indicator,” the acronym “KPI,” or the phrase “success measure,” we’re asking for trouble. So, whether I verbally articulate the two questions above, or whether I simply ask them of myself and try to answer them, I try to avoid business-speak:
- What are we trying to achieve? Answer that question without data. It’s nothing more than the elevator pitch for an executive who, making conversation on the ride from the ground floor to the 8th floor, asks, “What’s the purpose of <insert whatever you’re working on>?”
- How will we know if we’ve done that? This question sometimes gets asked…but it skips the first question and invites spouting of a lengthy list of data points. As a plain English follow-on to the first question, though, it invites focus!
The K in KPI stands for “Key” — not for 1,000.
This one is a newer one for me, but I’ll be using it for a lonnnng time. All too often, “KPI” gets treated as a fancy-pants way to say “data” or “metrics” or “measures.” Sure, we feel like we’re business sophisticates when we can use fancy language…but that doesn’t mean that we should be using fancy language poorly! I covered this one in more detail in my last post…but I’m going to repeat the picture I used there, anyway, because it cracks me up:
A KPI is not a KPI if it doesn’t have a target.
“Visits” is not a KPI. Nor is “conversion rate.” Or “customer satisfaction.” A KPI is not a KPI without a target. Setting targets is an inexact science and is often an uncomfortable exercise. But…it’s not as hard as it often gets made out to be.
Human nature is to think, “If I set a target and I miss it…then I will be viewed as having FAILED!” In reality, that’s the wrong view of targets. If you work for a manager, a company, or a client where that is the de facto response…then you need to find a new job.
Targets set up a clear and objective way to: 1) ensure alignment on expectations at the outset of an effort, and 2) objectively determine whether you were able to meet those expectations. If you wildly miss a very-hard-to-set target, then you will have learned a lot more and will be better equipped to set expectations (targets) the next time.
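To make the “no target, no KPI” rule concrete, here is a minimal sketch (in Python, with made-up metric names and numbers rather than anything from an actual engagement) of what it looks like to carry the target alongside the metric instead of reporting the metric on its own:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class KPI:
    """A metric only becomes a KPI once a target is attached to it."""
    name: str
    target: float                   # set *before* the effort launches
    actual: Optional[float] = None  # filled in once results come in

    def status(self) -> str:
        """Compare the measured result to the expectation set up front."""
        if self.actual is None:
            return f"{self.name}: target {self.target:,.0f}, no result yet"
        return (f"{self.name}: {self.actual:,.0f} vs. target {self.target:,.0f} "
                f"({self.actual / self.target:.0%} of target)")

# Illustrative numbers only; the point is that the target exists before the result does.
signups = KPI(name="Newsletter signups", target=5000)
# ...the campaign runs...
signups.actual = 4200
print(signups.status())  # Newsletter signups: 4,200 vs. target 5,000 (84% of target)
```

Whether the result lands at 84% or 120% of the target, the conversation afterwards is about what was learned relative to an expectation everyone agreed to up front.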
This all leads into another of my favorites…
You’re never more objective about what you *might* accomplish than before you set out to achieve it.
“I have no idea and no expectations!” is almost always an unintentional lie. Somebody decided that time and energy would be spent on the campaign/channel/initiative/project. That means there was some expectation that it would be worthwhile to do so. And “worthwhile” means something. It’s really, really hard to, at the end of a 6-month redesign where lots of people pulled lots of long hours to hit the launch date, stand up and say, “This didn’t do as well as we’d hoped.” In the absence of targets, that never happens. The business owner or project manager or analyst automatically starts looking for ways to illustrate to the project team and to the budget owner that the effort paid off.
But, that’s short-sighted. For starters, without a target set up front, just about any reporting of success will carry with it a whiff of disingenuousness (“You’re telling me that’s good…but this is the first I’m hearing that we knew what ‘good’ would look like!”). And, that after-the-fact hunt for success means effort gets spent looking backwards, rather than spending minimal effort looking backwards so that the real effort can go into looking forward: “Based on how we performed against our expectations (targets), what should we do next, and what do we expect to achieve?”
Any meaningful analysis is based on a hypothesis or set of hypotheses.
I’ve had the debate many times over the years as to whether there are cases where “data mining” means “just poking around in the data to see what patterns emerge.” In some cases, the person is truly misguided and believes that, with a sufficiently large data set and sufficiently powerful analytical tools, that is truly all that is needed: data + tools = patterns → actionable insights. That’s just wrongheaded.
More often, though, the debate is an illustration that a lot of analysts don’t realize that, in reality, they and their stakeholders are actually testing hypotheses. Great analysts may subconsciously be doing that…but it’s happening. The more we recognize that that’s what we’re doing, the more focused and efficient we can be with our analyses!
Actionable hypotheses come from filling in the blanks on two questions: 1) I believe _______, and 2) If I’m right, I will ______.
Having railed against fancy business-speak…it’s really not all that cool of me to be floating the word “hypothesis” now, is it? In day-to-day practice…I don’t! Rather, I try to complete these two statements (in order) before diving into any analysis:
- I believe [some idea] — this actually is the hypothesis. A hypothesis is an assumption, an idea, a guess, a hunch, or a belief. Note that this isn’t “I know…” and it’s not even “I strongly believe…” It’s the lowest level of conviction possible, so we should be fine learning (quickly, and with data) when the belief is untrue!
- If I am right, I will [take some action] — this isn’t actually part of the hypothesis. Rather, it’s a way to qualify the hypothesis by ensuring that it is sufficiently focused that, if the belief holds up, action could be taken. In my experience, taking one broad belief and breaking it down into multiple focused hypotheses leads to much more efficient and actionable analysis.
Like the two magic questions, I don’t necessarily force my clients to use this terminology. I’ll certainly introduce it when the opportunity arises, but, as an analyst, I always try to put requests into this structure. It helps not only focus the analysis (and, often, prompt some probing and clarification before I dig into the time-intensive work of pulling and analyzing the data), but also focus the output of the analysis in a way that makes it more actionable.
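For what it’s worth, the same fill-in-the-blanks structure can be written down as a tiny template. Here is an illustrative sketch in Python (the scenario, field names, and wording are hypothetical, not drawn from any client work) that refuses to accept an analysis request unless both blanks have been filled in:

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    """Both blanks must be filled in before any data gets pulled."""
    belief: str          # "I believe ______"
    action_if_true: str  # "If I'm right, I will ______"

    def __post_init__(self):
        if not self.belief.strip() or not self.action_if_true.strip():
            raise ValueError("Fill in both blanks before starting the analysis.")

    def __str__(self) -> str:
        return f"I believe {self.belief}. If I'm right, I will {self.action_if_true}."

# Hypothetical example: one broad belief narrowed into a focused, actionable statement.
h = Hypothesis(
    belief="mobile visitors abandon the checkout form more often than desktop visitors",
    action_if_true="prioritize a mobile-specific redesign of the checkout form",
)
print(h)
```

The empty-blank check is just a stand-in for the probing-and-clarification conversation: if either blank can’t be filled in, the request isn’t ready to be analyzed yet.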
Do you have favorite analytics aphorisms?
I’d love to grow my list of meaningful analytics one-liners. Do you have any you use or have heard that you really like?