Analysis, Testing and Optimization

Big Data without Digital Insight Management Is a Big Hot Mess

One of the many exciting aspects of joining a new company is the opportunity for reflection. The lead-up to the job change forced some introspection — what was it I really most enjoyed about my profession, and what would a dream job look like that allowed me to spend as much of each day doing that as possible? And, at a new company, everyone has had to put their heads together to build out the processes needed to bring the vision for the company to life, which has required a different flavor of reflection: reflecting on what has and has not worked in our collective experience when it comes to enabling brands to be as data-informed as possible in their daily processes.

Shortly after joining Clearhead, I attended eMetrics in Boston. The conference, as always, was a great time. And, as often is the case, one of the conversations that stuck with me the most occurred where I didn’t expect it — in the exhibit hall during the sessions with a vendor I’d never heard of before the conference: Sweetspot Intelligence. Sergio Maldonado (@sergiomaldo) explained the vision for Sweetspot, gave me a brief product tour, and handed me a copy of the paper they sponsored Eric Peterson to write: Digital Insight Management: Ten Tips to Better Leverage Your Existing Investment in Digital Analytics and Optimization. The concept of “Digital Insight Management” is intriguing. And, luckily, it’s much more than an abstract idea — it’s real and, I believe, something that all analysts should be striving to implement.

Let’s Start with the Basics — Demystified’s Hierarchy of Analytical Needs

Early in the paper, Eric included Analytics Demystified‘s Hierarchy of Analytical Needs:

Experienced analysts look at this diagram and think, “Well…yeah. That’s a good depiction of the battle we fight every day.” Any sort of ho-hum response to the diagram is because we’ve been fighting the battle to move “up the pyramid” for a while, and we often feel undermined by the business environment in which we work. This is one of the more succinct and elegant depictions (not just the labels on the left — the assessment in the boxes on the right) that I’ve seen.

One Step Back Adds Another Element

When viewed through the lens of “what an analyst can do,” the hierarchy is complete. In some respects, the analyst can only lead the proverbial horse to water (clearly communicate a data-informed recommendation). The analyst can’t necessarily make the horse drink (take action). But, still, it’s worth recognizing that, if we take just one step back from this pyramid, we want to see one more level on the hierarchy:

Again, this is somewhat obvious. Yet, it’s where “we” (businesses) seem to so often stumble. There is so much “Data” now that marketers are conditioned to prepend any mention of the word “data” with the word “big!” Few reports rely on data from a single source, as analysts and marketers work hard to place the data into meaningful context. But, of course, the further up the pyramid we go, the easier it is to get derailed. Ultimately…limited action.

Pivoting the Process

While the hierarchies above are unequivocally true, the actual process for meaningful analytics — analysis that drives relevant action — actually looks quite different:

Let’s break this down a bit:

  • Everything hinges on having clear objectives and measures of success — it’s scary how often marketers stumble on this, and, as analysts, it behooves us to be skilled in helping marketers get these nailed down (these are soft skills!)
  • Performance measurement is key…but it’s not the source of insights — performance measurement is the alerting system; it tracks the KPIs against targets, as well as some supporting and contextual metrics. But, the reports and dashboards themselves don’t yield insights — they surface problems that then need to be further explored.
  • All analysis starts with a business problem, business question, or business idea — the lefthand column is where the magic happens (or, all too often, doesn’t!).
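The alerting idea in the second bullet can be sketched in a few lines of code. This is a minimal, hypothetical illustration: the KPI names, targets, and tolerance threshold are all invented, not pulled from any particular tool. The point is what the output is (and isn’t): a list of things to go investigate, not insights.

```python
# Hypothetical sketch: performance measurement as an alerting system.
# KPI names, targets, and the tolerance are illustrative inventions.

def flag_kpis(actuals, targets, tolerance=0.10):
    """Return KPIs that missed their target by more than `tolerance`.

    These are the items that warrant deeper analysis; the flags
    themselves are not insights, just surfaced problems."""
    flagged = {}
    for kpi, target in targets.items():
        actual = actuals.get(kpi)
        if actual is None:
            continue  # no data captured for this KPI this period
        shortfall = (target - actual) / target
        if shortfall > tolerance:
            flagged[kpi] = round(shortfall, 3)
    return flagged

actuals = {"conversion_rate": 0.021, "revenue_per_visit": 1.80}
targets = {"conversion_rate": 0.025, "revenue_per_visit": 1.85}

print(flag_kpis(actuals, targets))
```

Here, conversion rate misses its target by 16%, so it gets flagged for further exploration; revenue per visit is within tolerance and stays off the alert list.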

It is impossible to attend any analytics-oriented conference these days without being hit over the head with how critical it is to develop and foster strong relationships with your business partners: regularly communicate, listen for the problems they’re having that your analytical skills can help with, learn how to communicate effectively, etc. That is a recurring theme because actually teasing out the right business questions and problems can be tricky!

Conversely, the back end of the process can be tricky, too. We’ve all had cases where we completed the right analysis and got actionable results…but action never occurred. As I understand it, that is where Sweetspot comes in: technology that supports communication and workflow related to getting actionable information to the people who can take action:

So…Will Tag Management Solve This?

(Blog authors get to crack themselves up with their headings…)

What Eric’s paper, and Sweetspot’s product, got me thinking about are a couple of gaps that, hopefully, I’ve covered in this post:

  • As analysts, we need to develop, implement, and own workable processes within our companies to make analytics truly gain and sustain traction
  • There is an opportunity for better technology to support these processes…and that is analytics technology that has nothing to do with the mechanics of capturing customer data

Is “Digital Insight Management” the next big thing? I think it is. Big Data is just a big hot mess without it.

Analytics Strategy

"Tag! You’re It!" One More Analyst’s Tag Management Thoughts

Unless you’re living in a cave (and, you’re clearly not, because you’re spending enough time trolling the interwebtubes to wind up on this blog), you’ve seen, heard, and felt the latest wave of news and excitement about tag management. Google announced Google Tag Manager last month, rumors are swirling that Adobe is going to begin providing their tag manager for free to all Sitecatalyst clients, and, most recently, Eric Peterson wrote a post trying to bring it all together. (Well…that was the most recent post when I started writing this, but Rudi Shumpert actually weighed in with a wish list for tag management systems, and his post is worth a read, too!).

My awareness of tag management dates back just over two years — back to Eric’s initial paper on the subject, which was sponsored by Ensighten; this coincided with Josh Manion’s “Tagolution” marketing stunt at the Washington, D.C. eMetrics in the fall of 2010. Since then:

  • I’ve had multiple discussions and demos from multiple enterprise tag management vendors (and even training from one!).
  • I’ve had one client that used Ensighten.
  • I hosted a recent Web Analytics Wednesday in Columbus that was co-sponsored by BrightTag, who also presented there.
  • I’ve chatted with several local peers who either already have or are in the process of implementing tag management.
  • I’ve taken a crack at rolling out Google Tag Manager on this blog (I failed after an hour of fiddling — broke several things and couldn’t get Google Analytics working with it).

Along the way, of course, I’ve read posts, seen conference presentations, and chatted with a number of sharp analysts. I’ve also, apparently, derided the underlying need for tag management to Evan LaPointe of Satellite. This was over a year ago, and I don’t remember the conversation, but I am a cynic by nature, so I don’t doubt that I was somewhat skeptical.

My fear then, as it is now, is this:

Once again, we’re treating an emerging class of technology as a panacea. We’re letting vendors frame the conversation, and we’re putting our heads in the sand about some important realities.

Now, all told, I’ve had a lot of conversations and very little direct hands-on use of these platforms. On the one hand, since I love to tinker, that’s a symptom of some of what I’ll cover in this post — tag management requires wayyyy more than “just dropping a single line of Javascript” to actually use. On the other hand, I may just be doing that blogging bloviation thing. You decide!

Tag Management Doesn’t Simplify the Underlying Tools

Example No. 1: In the spring of 2011, I got a demo — via Webex — of one of the leading enterprise tag management systems. The sales engineer who was demoing the product repeatedly confused Sitecatalyst and Google Analytics functionality in his demo. It was apparent that he had little familiarity with any web analytics tool, and questions that we asked to try to understand how the product worked got very vague and unsatisfactory answers. If the sales guy couldn’t clearly show and articulate how to accomplish some of our most common tagging challenges, we wondered, how could we believe him that his solution was, well, a solution?

Example No. 2: When it came to our client who used Ensighten, we were set up such that analysts at my agency developed the Sitecatalyst tagging requirements as part of our design and development work for the client, while an analytics consultancy — also under contract with the client — actually implemented what we specified through the TMS. Time and again, new content got pushed to production with incorrect or nonexistent tagging. And, time and again, we were told by the analytics consultancy that we needed to make adjustments to the site in order for them to be able to get the tags to fire as specified. Certainly, this was not all the fault of the tag management platform, as a tool is only as good as the people and processes that use it. But, the experience highlighted that tag management introduces complexity at the same time that it introduces flexibility.

Example No. 3: I had a discussion with a local analyst who works for a large retailer that is using BrightTag. When I asked how she liked it, she said the tool was fine, but, what no one had really thought through was that the “owner of the tool” inherently needed to be fairly well-versed in every tool the TMS was managing. In her case, she was an analyst well-versed in Sitecatalyst. She had BrightTag added to her plate of responsibilities. Overnight, she found herself needing to understand her company’s implementations of ForeSee, Google Analytics, Brightcove, and a whole slew of media tracking technologies. In order to deploy a tag or tracking pixel correctly through the TMS, she actually needed to know what tags to deploy and how to deploy them in their own right.

Example No. 4: In my own experience with rolling out Google Tag Manager, I quickly realized how many different tags and tag customizations I’ve got on this blog. My documentation sucks, I admit, and over half of what is deployed is “for tinkering,” so, in that regard, my experience with this site isn’t a great example. On the other hand, sites that have been built up and evolved over years can’t simply “add tag management.” They have to ferret out where all of their tags are and how they’ve been tweaked and customized. Then, for each tagging technology, they need to get them completely un-deployed, and then redeploy them through the tag management system. That’s not a trivial task.

Putting all of these examples together is concerning, because there is a very real risk that, in an industry already facing a serious supply shortage of analysts, a significant number of very smart, multi-year-experienced analysts will find themselves spending 100% of their time as tool jockeys managing tags rather than as analysts focused on solving business problems.

Tag Management Doesn’t Simplify User Experience

Completely separate from the discussions around tag management is the reality of the continuing evolution and fragmentation of the online consumer experience.

I recently completed a tagging spec for a client whose technology will be rolled out onto the sites of a number of their clients. As I navigated through the experience — widget-ized content augmentation on our client’s clients’ sites — I was reminded anew how non-linear and non-“page”-based our online experiences have become. Developing tags that would enable the performance management and analysis we scoped out in our measurement framework, that would be reasonably deployable and maintainable, and that would deliver data interpretable by the casual business user while also having the underlying breadth and depth needed for the analyst, required many hours of thought and work.

And…the platform has pretty minimal built-in integration with social media.

And…the platform does not yet have a robust mobile experience.

In other words, in some respects, this was a pretty simple tagging exercise…and it wasn’t simple!

The truth: Most of our customers and potential customers are now multi-device (phones, tablets, laptops, desktops, TV,…) and multi-channel (Facebook, Twitter, Pinterest, apps, web site,…). Tag management only works where “the tag” can be deployed, and it doesn’t inherently provide cross-device and cross-channel tracking. (For the record, neither does branding the next iteration of your platform as “Universal Analytics”…but that’s a topic for another day.)

Tag Management IS Another Failure Point

I’ve developed a reverse Pavlovian response to the phrases “single line of code” and “no IT involvement” — it’s a reverse response because, rather than drooling, I snarl. Tag management vendors are by no means the only ones who laud their platforms’ ease of deployment. And, there is truth in what they say — a single line of Javascript that includes a file with a bunch of code in it is a pretty clever way to minimize the technical level of effort to deploy a tool.

But, with the power of tag management comes some level of risk. I can deploy and update my Webtrends tags through a TMS. That means there are risks that:

  • I could misdeploy it and not be capturing data at all.
  • I could deploy it in a way that hangs up the loading of the entire site.
  • I could use my tag management system to implement some UI changes rather than waiting for IT’s deployment cycle…and break the UI for certain browsers.
  • I could implement code that will capture user data that violates my company’s privacy policy.

IT departments, as a whole, are risk averse. They are the ones whose cell phones ring in the middle of the night when the site crashes. They’re the ones who wind up in front of the CEO to explain why the site crashed on Black Friday. They are the ones who wind up in the corporate counsel’s office responding to a request to provide details on exactly what systems did what and when in response to lawsuits.

In other words, every time a TMS vendor proudly delivers their, “No IT involvement!” claim…I feel a little ill for two reasons:

  • The friction between IT and Marketing is real, but it needs to be addressed through communication rather than a technology solution.
  • The statement illustrates that the TMS vendor is not recognizing that site-wide updates of Javascript should be vetted through some sort of rigorous (not necessarily lengthy!) process (yes, many TMSs have workflow capabilities, but the fact that, “Oh, we have workflow and you can have it go through IT before being deployed…if you need to do that” is a response to a question rather than an up-front recognition is concerning).

I have a lot of empathy for the IT staff that gets cut out of TMS vendor discussions until after the contract has been signed and are then told to “just push out this one line of code.” That puts them in a difficult, delicate, and unfair spot!

Yet…Tag Management Is the Future

Do my concerns above mean that I think tag management is misguided or a mistake? Absolutely not! Tag management is an important step forward for the industry, but we can’t ignore the underlying realities. Tag management isn’t easy — no more than web analytics is easy or testing and optimization are easy. The technology is a critical part of the whole formula, and I’m excited to see as many players as there are in the space — they’ll be innovating like crazy to survive! But, people (with knowledge) and processes (that cut across many systems) seem like they must be just as important to a successful TMS deployment as the TMS itself.

Analysis, Reporting, Social Media

"Demystifying" the Formula for Social Media ROI (there isn't one)

I raved about John Lovett’s new book, Social Media Metrics Secrets, in an earlier post, and, while I make my way through Marshall Sponder’s Social Media Analytics book, which arrived on bookshelves at almost exactly the same time, I’ve also been working on putting some of Lovett’s ideas into action.

One of the more directly usable sections of the book is in Chapter 5, where Lovett lays out pseudo formulas for KPIs for various possible (probable) social media business objectives. This post started out to be about my experiences drilling down into some of those formulas…but then the content took a turn, and one of Lovett’s partners at Analytics Demystified wrote a provocative blog post…so I’ll save the formula exploration for a subsequent post.

Instead…Social Media ROI

Lovett explicitly notes in his book that there is no secret formula for social media ROI. In my mind, there never will be — just as there will never be unicorns, world peace, or delicious chocolate ice cream that is as healthy as a sprig of raw broccoli, no matter how much little girls and boys, rational adults, or my waistline wish for them.

Yes, the breadth of social media data available is getting better by the day, but, at best, it’s barely keeping pace with the constant changes in consumer behavior and social media platforms. It’s not really gaining ground.

What Lovett proposes, instead of a universally standard social media ROI calculation, is that marketers be very clear as to what their business objectives are – a level down from “increase revenue,” “lower costs,”and “increase customer satisfaction” – and then work to measure against those business objectives.

The way I’ve described this basic approach over the past few years is using the phrase “logical model,” – as in, “You need to build a logical link from the activity you’re doing all the way to ultimate business benefit, even if you’re not able to track those links all the way along that chain. Then…measure progress on the activity.”

Unfortunately, “logical model” is a tricky term, as it already has a very specific meaning in the world of database design. But, if you squint and tilt your head just a bit, that’s okay. Just as a database logical model is a representation of how the data is linked and interrelated from a business perspective (as opposed to the “physical model,” which is how the data actually gets structured under the hood), building a logical model of how you expect your brand’s digital/social activities to ladder up to meaningful business outcomes is a perfectly valid way to set up effective performance measurement in a messy, messy digital marketing world.
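As a toy illustration of what such a logical model might look like, here is one way to represent the chain from activity to business outcome in code, with the measurable and (for now) unmeasurable links made explicit. Every step and metric name here is invented for illustration; the structure, not the specifics, is the point.

```python
# Hypothetical sketch of a "logical model": an ordered chain linking a
# social media activity to an ultimate business benefit. A metric of None
# marks a link we believe in but cannot (yet) measure.

logical_model = [
    {"step": "publish support content on Twitter", "metric": "posts per week"},
    {"step": "customers get answers in-channel",   "metric": "response rate"},
    {"step": "fewer inbound support calls",        "metric": None},  # not tracked
    {"step": "lower cost to serve",                "metric": None},  # the outcome
]

def measurable_steps(model):
    """Return the steps where progress can actually be measured today."""
    return [link["step"] for link in model if link["metric"]]

print(measurable_steps(logical_model))
```

The unmeasured links stay in the model on purpose: they are the logical bridge from the activity you can measure to the business benefit you care about.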

No Wonder These Guys Work Together

Right along the lines of Lovett’s approach comes one of the other partners at Analytics Demystified with, in my mind, highly complementary thinking. Eric Peterson’s post about The Myth of the “Data-Driven Business” postulates that there are pitfalls a-looming if the digital analytics industry continues to espouse “being totally data-driven” as the ultimate goal. He notes:

…I simply have not seen nearly enough evidence that eschewing the type of business acumen, experience, and awareness that is the very heart-and-soul of every successful business in favor of a “by the numbers” approach creates the type of result that the “data-driven” school seems to be evangelizing for.

What I do see in our best clients and those rare, transcendent organizations that truly understand the relationship between people, process, and technology — and are able to leverage that knowledge to inform their overarching business strategy — is a very healthy blend of data and business knowledge, each applied judiciously based on the challenge at hand. Smart business leaders leveraging insights and recommendations made by a trusted analytics organization — not automatons pulling levers based on a hit count, p-value, or conversion rate.

I agree 100% with his post, and he effectively counters the dissenting commenters (partial dissent, generally – no one has chimed in yet fully disagreeing with him). Peterson himself questions whether he is simply making a mountain out of a semantic molehill. He’s not. We’ve painted ourselves into corners semantically before (“web analyst” is too confining a label, anyone…?). The sooner we try to get out of this one, the better — it’s over-promising / over-selling / over-simplifying the realities of what data can do and what it can’t.

Which Gets Back to “Is It Easy?”

Both Lovett’s and Peterson’s ideas ultimately go back to the need for effective analysts to have a healthy blend of data-crunching skills and business acumen. And…storytelling! Let’s not forget that! It means we will have to be communicators and educators — figuring out the sound bites that get at the larger truths about the most effective ways to approach digital and social media measurement and analysis. Here’s my quick list of regularly used (in the past…or going forward!) phrases:

  • There is no silver bullet for calculating social media ROI — the increasing fragmentation of the consumer experience and the increasing proliferation of communication channels make it so
  • We’re talking about measuring people and their behavior and attitudes — not a manufacturing process; people are much, much messier than widgets on a production line in a controlled environment
  • While it’s certainly advisable to use data in business, it’s more about using that data to be “data-informed” rather than aiming to be “data-driven” — experience and smart thinking count!
  • Rather than looking to link each marketing activity all the way to the bottom line, focus on working through a logical model that fits each activity into the larger business context, and then find the measurement and analysis points that balance “nearness to the activity” with “nearness to the ultimate business outcome.”
  • Measurement and analytics really is a mix of art and science, and whether more “art” is required or more “science” is required varies based on the specific analytics problem you’re trying to solve

There’s my list — cobbled from my own experience and from the words of others!

Analysis, Analytics Strategy

Complex Processes and Analyses Therein

Stéphane Hamel, it seems, is a bit peeved with Eric Peterson. These are two pretty big names in web analytics — Eric as one of the fathers of web analytics, and Stéphane as both a thought leader in the space and the creator of one of the most practical, useful web analytics supplemental tools out there — WASP: The Web Analytics Solution Profiler plugin for Firefox. With the plugin, you visit any site, and a sidebar will tell you what web analytics solutions it looks like it’s running. It’s pretty cool.

I don’t know the full background of the current back-and-forth between these two guys, but I’m a huge fan of Stéphane, and my ears perked up when I read this observation in the post:

Business Process Analysis implies understanding & improving a collection of interrelated tasks which solve a particular issue. Nothing new here… Most businesses face complex and “hard” processes, and the way to make them “easy” is by decomposing them into smaller sub-processes until they are manageable.

For one thing, for a period of ~8 months, my job title was “Director of Business Process Analytics.” And, frankly, I was never sure what that meant. In hindsight, if I’d had these two sentences from Stéphane and if I’d replaced “Analytics” with “Analysis,” I would have seen a much clearer mapping from my label to what I was actually doing in the role.

More important, though, is the concept of “decomposition.” I find myself preaching the Decomposition Doctrine regularly. And I believe in it strongly.

As an example, when it comes to the Holy Grail of Marketing Analysis — calculating the ROI of your marketing spend — many, many B2B marketers start out looking for the correlation between leads generated and revenue. I have yet to see a case in B2B where this can be found with a sufficiently tight, sustained correlation to be meaningful. That actually makes sense. It’s like looking for a correlation between the state someone is born in and the achievement of a PhD. There’s a lot going on over time between Point A and Point Z.

In the case of B2B marketing, decomposition makes sense. Decompose the process:

  • The lead-to-qualified lead sub-process
  • The qualified lead to sales accepted lead sub-process
  • The sales accepted lead to sales qualified lead sub-process
  • The sales qualified lead to close sub-process

Each of these sub-processes has people who proceed to the next sub-process as well as people who do not — put simplistically: people who “fall out of the funnel.” But, you can further decompose — of the people who fell out, where did they fall out and why? And does that mean they are gone forever, or are there processes/sub-processes that can be used to reengage them in the future?
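Here is a minimal sketch of what measuring those sub-processes separately might look like. The stage names echo the list above; the counts are invented for illustration:

```python
# Hypothetical sketch: decomposing a B2B funnel into sub-processes and
# measuring each stage-to-stage conversion on its own. Counts are invented.

funnel = [
    ("lead",                 10_000),
    ("qualified lead",        2_500),
    ("sales accepted lead",   1_000),
    ("sales qualified lead",    400),
    ("close",                   120),
]

def stage_conversions(stages):
    """Conversion rate of each sub-process, plus who 'fell out' where."""
    results = []
    for (name_a, count_a), (name_b, count_b) in zip(stages, stages[1:]):
        results.append({
            "sub_process": f"{name_a} -> {name_b}",
            "conversion": round(count_b / count_a, 3),
            "fell_out": count_a - count_b,
        })
    return results

for row in stage_conversions(funnel):
    print(row)
```

Each row is now a small, analyzable question ("why did 7,500 leads never qualify?") rather than one sprawling, unanalyzable lead-to-revenue question.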

The key here is that, from a theoretical perspective, if you link together all of the simpler sub-processes, then you’ve got an accurate representation of the more complex master process. The problem is that this is only mostly true. There are probably other sub-processes that are unknown — those pesky “corner cases” that the real world insists on throwing at us. And, each sub-process likely experiences various anomalies over time. Add those together, and you’ve got a complex process that verges on the unanalyzable.

On the other hand, if you focus on a sub-process, you can analyze what is going on, including accounting for the anomalies. “But, isn’t there a risk that you’ll be missing the forest for the trees?” you ask. Absolutely. That’s why it’s important to start with a high-level view of the whole process, with a clear picture of the components that go into it. If you simply pick a “simple sub-process” to focus on, without understanding how and where that fits into the big picture, you run the risk of rearranging deck chairs on the Titanic. On the other hand, if you simply try to “analyze the Titanic,” without some level of decomposition, you’re equally doomed.