Analytics Strategy

Quick Tip: Track Your Omniture JS File Version [SiteCatalyst]

Have you ever had someone run a report in SiteCatalyst and come running to you saying something like this?

This report doesn’t make sense…There is obviously a tagging issue and you need to fix it ASAP!

If I had a dollar for every time this has happened to me, I’d be a rich man! The truth is that, after many wild goose chases, the problem is usually not tagging related (note that you can use tools like ObservePoint and DigitalPulse to verify). But when tagging did cause an issue, it was usually related to the release of a new JavaScript file. That has been the culprit many times for me over the years. Therefore, in this post, I will share a trick you can use to easily find out whether data issues you are experiencing are related to a new JavaScript file release.

Tracking Your JavaScript File

So how can you use SiteCatalyst to determine whether a new JavaScript file you released is wreaking havoc on your data? For example, let’s imagine a scenario where, on the morning of May 18th, you started seeing some strange data irregularities (possibly by checking Data Quality as described here!). Here is what you need to do:

  1. Each time you create a new version of your JS file, assign it a version number (e.g., 0.5, 0.8, 1.2)
  2. Pass this version number into a tool that can store it and let you know when it sees the version number change
  3. Look at a report that shows you when the version number value has changed (what date it changed and at what time)

Sounds easy right? If only we knew of a tool into which we could pass data, have it be time-stamped and report upon changes in version number values…Hmmm….Where would we find such a tool??

Obviously, we already have that tool, and it is SiteCatalyst! We can use the tool we know and love to track each version of the JavaScript file by simply passing the file’s version number into an sProp on every page (and yes, I get the irony that we are using a JavaScript file, which sets a beacon to enable tracking, to track itself!). By doing this, we will have a historical record of when each JavaScript file was released. After you pass in the JavaScript file version, you will see a report like this:
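The change to your JS file itself is tiny. Here is a minimal sketch, assuming prop1 is a free traffic variable in your implementation (the sProp slot and the version value are illustrative, not prescribed by Omniture):

```javascript
// Inside your SiteCatalyst JavaScript file (e.g., s_code.js).
// The "var s = s || {}" stub just makes this sketch self-contained;
// in your real file, "s" is the existing tracking object.
var s = s || {};

// Bump this value every time you release a new version of the file…
s.jsFileVersion = "0.7";

// …and copy it into a free sProp so it rides along on every beacon,
// on every page. The resulting sProp report breaks out page views
// by file version (0.5, 0.7, …) over time.
s.prop1 = s.jsFileVersion;
```

Because the sProp is set inside the file being versioned, it can never drift out of sync with the release itself.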

Here we can see the distribution of page views related to each JavaScript file version. In this case, we have been busy and have had four JavaScript file changes in one month! However, this report isn’t super-useful in answering our initial question: Were the issues we saw on the morning of May 18th related to a new JavaScript file release?

To answer this question, all we have to do is to switch to the “trended” view of this report and we will see a report like this:

Now we can start to see the flow of JavaScript file updates. Looking at this report, we can see that we moved from version 0.5 to version 0.7 (poor version 0.6!) on May 18th… This might support our hypothesis, but to be sure, we can look at this report by hour on May 17th & 18th and see this:

 

Now we can narrow things down to an hour, and it looks like the JavaScript file did, in fact, change around 9:00 am on May 18th. As you can see, the simple administrative step of keeping your JavaScript file version in an sProp can provide an easy way for you to do some sleuthing when you are in a pinch!

Additionally, if you want to further test your hypothesis, you can isolate data that is related to a specific JavaScript file to see if it represents the issue you are seeing. To do this, simply use DataWarehouse to create a segment that only pulls pages that had data collected using a specific JavaScript file version as shown here:

 


Analytics Strategy

U.S. Privacy and Data Security Legislation Summary/Recap

Andy Kennemer, VP of Social Marketing & Media at Resource Interactive, recently attended the NRF Washington Leadership Conference, which included a meeting of the Shop.org Policy Advisory Group (PAG), of which he is a member. A major focus of the PAG meeting was the increased legislative focus on privacy and data security. Andy agreed to summarize some of the highlights for me to share here.

Legislation is cyclical, and we’re in a hot period right now.

The focus of our meeting was discussing in detail the various legislative actions in Congress regarding both online privacy and data security. These two issues are separate, but related, and the more they are mixed together in legislation, the more complicated and ambiguous it will make things for retailers and brands.

Last year we saw 3 main efforts:

  1. The FTC’s attempt to establish rule-making authority through a new US Privacy Framework proposal;
  2. Rep Boucher’s attempt to introduce an online privacy bill, which primarily would support the notion of consumer opt-IN to 3rd party tracking; and
  3. Sen. Pryor’s introduction of a bill addressing Commerce Data Security.

The FTC is likely to release a final “staff report” on this matter sometime this year. The Boucher and Pryor bills never made it to the floor for debate.

This year, mainly in the last 3 months, we have seen a flurry of activity like never before.

Key privacy bills introduced this year:

  • Sens. Kerry & McCain introduce broad privacy bill (4/12/11)
  • Reps. Stearns & Matheson introduce broad privacy bill (4/13/11)
  • Sen. Rockefeller introduces “Do Not Track Online” bill (5/9/11)
  • Reps. Markey & Barton introduce “Do Not Track Kids” bill (5/13/11)
  • Sens. Franken & Blumenthal introduce Location Privacy bill (6/15/11)
  • Sen. Wyden & Rep. Chaffetz introduce “GPS” Privacy bill (6/15/11)

Key data security proposals:

  • White House releases Cyber Security proposal (5/25/11)
  • Sen. Leahy re-introduces Judiciary bill from 111th Congress (6/8/11)
  • Dept of Commerce “Green Paper” on Cyber Security framework (6/8/11)
  • Rep Bono Mack revises House-passed data security bill from 111th Congress (6/10/11)
  • Sen. Pryor re-introduces Commerce Bill from 111th Congress (6/15/11)

With so much activity, it’s challenging even to keep track of everything and to know which bills and proposals matter most. A few of these are gaining momentum and, as an industry, we need to watch them. The Kerry-McCain bill, the White House Cyber Security proposal, and the potential final FTC report on the privacy framework will have the broadest impact on brands and retailers.

For now, the feedback from a congressional staffer who attended the meeting was:

  • There is a fear of modernity within the government. What needs to be better articulated is how data collection is actually used to help consumers have a more relevant and enjoyable online experience.
  • We need the voice of actual consumers. Right now consumer advocacy groups have influence, but it isn’t clear if they really represent the concerns of average consumers.
  • Retailers have not adequately addressed the consequences of legislation in terms of actual economic harm or hindered innovation. Some sort of cost/benefit/risk analysis could be helpful (e.g., What does our online experience look like if advertising is not as effective? Are consumers ready to pay for services that are currently free and ad-supported?)

This continues to be a complex and rapidly evolving area, and brands cannot afford to simply put their heads in the sand and hope it goes away. Legislation will get passed, but the extent and impact of that legislation is far from clear.

Analytics Strategy, General

Three Great Jobs at Best Buy

Now that summer is upon us, I suspect that some of my personal blogging activity will slow down, but I wanted to call my readers’ attention to three great jobs that our good friends at Best Buy just posted:

  • Senior Analyst, Digital Analytics
  • Associate Manager, Digital Analytics
  • Manager, Digital Analytics

Those of you who were at Emetrics in San Francisco this Spring heard some of the story about the work we’ve been fortunate to help with at Best Buy. Those of you coming to Internet Retailer in San Diego on June 16th will get to hear a shortened version of the same story. If you can’t/didn’t make either event, I am happy to put interested parties directly in touch with the hiring manager at Best Buy; email me directly for details.

If you are coming to Internet Retailer, come and hear Lynn Lanphier (Best Buy) and me tell their amazing story.

Analytics Strategy

Web Analytics (How It Works) Explained in 4 Minutes

I was tinkering around a few weeks ago trying to figure out the best way to communicate an idea to a group of people. I hit on using Snagit to record myself talking my way through a few PowerPoint slides that had some basic diagrams on them and then uploading the resulting video to YouTube (in that case, as a private video). It worked great — perfectly okay audio quality (I used a USB headset) and perfectly okay graphics. Lo-fi, but using the tools I already had at hand.

Below is an audio slideshow that uses the same approach to provide a very basic overview of how page tag-based web analytics tools work. If you’re a web analyst, I sincerely hope there is nothing new to you here. But, if you’re a web analyst who has repeatedly beaten your head against a brick wall when trying to explain to some marketers you work with that they need to put campaign tracking parameters on the links they use…maybe it’s a video you can send their way! It’s right at 4 minutes long, with a subtle-but-shameless suck-up to my favorite Irish web analyst at the 1:30 mark (it really never hurts to suck up to an Irish(wo)man, now, does it?).

The video is a much simplified overview of what I went into in greater detail in an earlier blog post.

If you’d like to download the slides (.pptx) for your own use (attribution appreciated but not required, and edit at will), you can do so here.

I’d love to hear what you think (of the format and/or of the content)!

Analytics Strategy

Webtrends Table Limits — Simply Explained

A co-worker ran into a classic Webtrends speed bump a couple of weeks ago. A new area of a client’s web site had rolled out a few days earlier…and Webtrends wasn’t showing any data for it on the Pages report. More perplexingly, traffic to the new content group that had been created along with the launch was showing up in the Content Groups report. What was going on? I happened to walk by, and, although I haven’t done heavy Webtrends work in a few years, the miracle of cranial synapses meant that the issue jumped out pretty quickly (I can’t figure out how to say that without sounding egotistical; oh, well — it is what it is).

Heavy Webtrends users will recognize this as a classic symptom of “table limits reached.” There’s quite a bit written on the subject online…if you know where to look. The best post I found was You need to read this post about Table Limits by Rocky of the Webtrends Outsiders. The last sentence (well, sentence fragment, really) in the post is, “End of rant.” In other words, the post starts AND finishes strong, and the content in between is damn good, too.

What I found, though, was that it took a couple of conversations and a couple of whiteboard rounds to really explain to my colleague what was going on under the hood that was causing the issue in a way that he could really understand. That’s not a knock against him. Rather, it’s one of those things that makes perfect sense…once it makes sense. It’s like reading an analog clock or riding a bicycle (or, presumably, riding a RipStik…I wouldn’t know!).  So, I decided I’d take a crack at laying out a simplistic example in semi-graphical form as a supplement to the post above.

The Webtrends Report-Table Paradigm

First, it’s important to understand that every report in Webtrends has two tables associated with it:

  • Report Table — this is the table of data that gets displayed when you view a report
  • Analysis Table — the analysis table is identical in structure to the report table, but it has more rows, and it’s where the data really gets stored as it comes into the system

Webtrends aggregates data, meaning that it doesn’t store raw visitor-level, click-by-click data and then try to mine through a massive data volume any time someone runs a simple report. Rather, it simply increments counters in the analysis tables. That makes sense from a performance perspective, but can easily lead to a “hit the limits” issue.

Key: neither of these tables simply expands (adds rows) as needed. Both have their maximum row count configured in the admin console. Those limits can be adjusted…but that comes at a price in storage and processing load.

(Now, actually, there are multiple analysis tables for any single report — copies of the underlying table structure populated with data for a specific day, week, or month…but it’s beyond the scope of this post to go into detail there. Just tuck it away as another wrinkle to learn.)

In the rest of this post, I’m going to walk through an overly simplistic scenario of a series of visits to a fictitious site with unrealistically low table limits to illustrate what happens.

The Scenario

Let’s say we have a web site with a series of pages that we’ll call Page A, Page B, Page C,…Page Z. And, let’s say we have our Report Table limit for the Pages report set to “4” (in practice, it’s probably more like 5,000) and our Analysis Table limit set to “8” (in practice, it would be more like 20,000). That gives us a couple of empty tables that look something like this:

Now, we’re going to walk through a series of visits to the site and look at what gets put into the tables.

Visit 1

The first visitor to our site visits three pages in the following order: Page A –> Page B –> Page C  –> <Exit>.

The analysis table gets its first three rows loaded up in the order that the pages were visited, and each page gets a Visits value of 1. If we looked at the Pages report at that point, the Report Table would pull those top 3 values, and everything would look fine:

Visit 2

The next visitor comes to the site and visits 5 pages in the following order: Page B –> Page C –> Page D –> Page E –> Page F –> <Exit>

We’ve now had more unique pages visited than can be displayed in the report (because the report table limit is set to 4). But, that’s okay. After two visits to the site, our Analysis Table would still have a row or two to spare, and the Report Table could pull the top 4 pages from the Analysis Table and do a quick sort to display correctly, using the All Others row to lump in everything that didn’t make the top 4:

If you searched or queried for “Page F” at this point, you wouldn’t see it. It’s there in the Analysis Table, but you’re searching/querying off of the Report Table. That doesn’t mean Page F is lost, though. It just means it has less traffic than (or is tied with) the last item that fit in the Report Table.

Visit 3

Sequence of pages: Page F –> Page G –> Page H –> Page B –> <Exit>

Following the same steps above and incrementing the values in our Analysis Table, and again looking at a report for the entire period, we see (bolded numbers in the Analysis Table are the ones that got created or incremented with this visit):

Look! Page F is now showing up in the Report Table! Can you see why? Because the Analysis Table has a higher row limit, Page F’s row kept accumulating visits there, and now that it has risen into the top 4, the Report Table picks it up.

Visit 4

Sequence of pages: Page F –> Page I –> Page J –> Page B –> <Exit>

Here’s where we really start to lose page-level granularity. Our Analysis Table is full, so there are no rows to store Page I and Page J. So, that will add 2 visits to the All Others row in the Analysis Table (while this is a single visit, this is the pages report, and each of those pages received a visit). Our tables now look like this:

Until the Analysis Table gets reset, no pages after Page H will ever appear in a report.
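The whole walkthrough can be captured in a toy simulation. This is a sketch of the mechanics described above, not Webtrends code; the limits of 8 and 4 match our unrealistically small example:

```javascript
// Toy model of the Webtrends analysis/report table behavior.
// Real deployments use limits in the thousands.
var ANALYSIS_LIMIT = 8; // rows in the Analysis Table
var REPORT_LIMIT = 4;   // rows displayed in the Report Table

var analysisTable = {}; // page -> visit count
var allOthers = 0;      // overflow bucket once the table is full

function recordVisit(pages) {
  pages.forEach(function (page) {
    if (page in analysisTable) {
      analysisTable[page] += 1; // existing row: just increment the counter
    } else if (Object.keys(analysisTable).length < ANALYSIS_LIMIT) {
      analysisTable[page] = 1;  // new row, while space remains
    } else {
      allOthers += 1;           // table full: lumped into All Others for good
    }
  });
}

function reportTable() {
  // The report pulls only the top rows, sorted by count.
  return Object.keys(analysisTable)
    .sort(function (a, b) { return analysisTable[b] - analysisTable[a]; })
    .slice(0, REPORT_LIMIT);
}

// The four visits from the walkthrough:
recordVisit(["A", "B", "C"]);
recordVisit(["B", "C", "D", "E", "F"]);
recordVisit(["F", "G", "H", "B"]);
recordVisit(["F", "I", "J", "B"]); // I and J arrive after the table is full
```

After these four visits, Page I and Page J exist only as 2 anonymous visits in All Others, while Page B (4 visits) and Page F (3 visits) lead the report, exactly as in the tables above.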

Even if Page I Becomes the Most Popular Page on My Site?

It’s time for a direct quote from the Webtrends Outsider post referenced at the beginning of this post:

Ugly example #1: Your end users contact you wanting to know about traffic to their expensive new microsite.  You know you’ve been collecting the data correctly because you triple-checked the tagging before and after launch.  So you open the Pages report and WebTrends tells you those pages don’t exist.  Those  expensive pages got no traffic at all, apparently.  Knowing how the CEO’s been obsessed with the new microsite, you call in sick indefinitely.

It doesn’t matter if Page I becomes the only page on your site. Until the tables reset, you won’t see the page in your Pages report — it will continue to be lumped into All Others.

And That Is Why…

If you started out on Google Analytics and then switched over to Webtrends, you might have noticed something odd about the URLs being captured (I learned it going in the opposite direction): in Google Analytics, the full URL for each page, including any query string parameters (campaign tracking parameters excluded), is reported by default. In Webtrends, query string parameters are dropped by default. In Google Analytics, you can configure a profile to drop specific parameters, while, in Webtrends, you can configure the tool to include specific parameters.

Why does Webtrends exclude all parameters by default? Table limits are one of the reasons. If, for instance, your site search functionality passes the search terms and other details to the search engine using query parameters, the Analysis Table for the Pages report would fill up very quickly…with long-tail searches that only received one or a small handful of requests.
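A rough sketch of the two defaults (the URL and parameter names here are made up for illustration; neither tool exposes exactly these functions):

```javascript
// Webtrends-style: drop every query parameter EXCEPT an explicit keep-list.
function webtrendsStyle(url, keepParams) {
  var parts = url.split("?");
  if (parts.length < 2) return url;
  var kept = parts[1].split("&").filter(function (pair) {
    return keepParams.indexOf(pair.split("=")[0]) !== -1;
  });
  return kept.length ? parts[0] + "?" + kept.join("&") : parts[0];
}

// Google Analytics-style: keep everything EXCEPT an explicit drop-list.
function googleStyle(url, dropParams) {
  var parts = url.split("?");
  if (parts.length < 2) return url;
  var kept = parts[1].split("&").filter(function (pair) {
    return dropParams.indexOf(pair.split("=")[0]) === -1;
  });
  return kept.length ? parts[0] + "?" + kept.join("&") : parts[0];
}

var url = "/search/results?term=red+widgets&sessionid=12345&page=2";

// By default, Webtrends keeps nothing, so every long-tail search
// collapses into a single "/search/results" row:
webtrendsStyle(url, []);           // "/search/results"

// Configure a parameter to keep and the rows split back out --
// and start consuming table rows:
webtrendsStyle(url, ["term"]);     // "/search/results?term=red+widgets"

// GA keeps everything unless told otherwise:
googleStyle(url, ["sessionid"]);   // "/search/results?term=red+widgets&page=2"
```

The keep-everything default is exactly why a GA Pages report has a huge long tail, and the drop-everything default is exactly how Webtrends protects its Analysis Tables.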

What to Do?

The most important thing to do is to keep an eye on your table sizes and see which ones are getting close to hitting their limits. If they’re getting close, consider adjusting your configuration to reduce “fluff” values going in. If fluff isn’t the problem, then you need to bump up your table limits. That may slow down the time it takes for Webtrends to process your profiles, but it will save you from unpleasant conversations with the business users you support!

Analytics Strategy, Conferences/Community

Amazing news from Analysis Exchange

UPDATED: We got great quotes from the Vice President of Human Resources who hired Jan Alden Cornish that clarify how Analysis Exchange is making a difference when it comes to hiring web analysts.  See below!

If you’ve worked in web analytics and digital measurement for long, or if you’ve ever tried to hire an experienced web analyst, you know that there are not enough qualified, experienced, and well-trained web analysts in the world. What’s more, for the majority of our sector’s development there was literally nowhere someone new could go to get the kind of hands-on education and experience that most hiring managers are looking for. Taken together, these factors have left the web analytics industry stuck in a “lose/lose” situation.

The training gap was the central problem we set out to solve in 2009 when we launched the Analysis Exchange. Our goal was to bring “student learners” together with experienced mentors to provide guided education and work to ensure that entry-level analysts were familiar with both the theory and practice of web analytics. Analysis Exchange was designed as a logical “next step” for people who had read books, followed blogs, or taken online training from great groups like the WAA via their University of British Columbia coursework.

What’s more, so that our students would learn to “tell a story with data and analysis” we opted to work with nonprofits from around the globe — a traditionally under-served group when it came to site analysis and insight generation. This turned out to be a great idea, and we are honored every week by a handful of organizations who are willing to help us create valuable training opportunities for our community.

I set a lofty goal for Analysis Exchange when I first announced the effort was open to everyone at the Emetrics Summit in San Jose last May — I wanted to help 1,000 nonprofits and create training opportunities for 500 students. Unfortunately we didn’t meet that goal … but we have made amazing strides, a few of which I’d like to share with you today:

  1. We have grown to over 1,250 members around the world, including 205 nonprofit groups and nearly 650 students. Behind only the Web Analytics Association, we believe Analysis Exchange to be the single largest group of individuals interested in the subject of web analytics in the world — and we’re pretty excited about that!
  2. Our members have completed over 100 projects in the past year. What’s more, our students and mentors have earned awesome scores with an average “likelihood to recommend this mentor/student” score of 9.5 and an average rating for each member’s work of 9.4 (both out of 10.0)
  3. We won a prestigious award from the Web Analytics Association. Analysis Exchange was recognized as the “Most Influential Agency or Vendor” by the WAA at this year’s awards event.
  4. IQ Workforce has just agreed to help us grow and expand our efforts. Given our commitment to incubating new talent within the web analytics community this sponsorship makes great sense (read more about it here) and we’re delighted to have Corry Prohens and his team helping our mentors and students expand their horizons.
  5. We recently had our first student get a full-time job working in web analytics. This, more than anything, excites me … the fact that Analysis Exchange is working “as designed” for the web analytics community, helping individuals get the experience they need to bridge the gap between “knowledgeable” and “employed.”

On this last point I wanted to share a little more detail. We have some pretty motivated mentors and students in the Analysis Exchange. One of our students is Jan Alden Cornish from Carmel, California. Jan has done three projects with us and in one case stepped in and helped out at the very last minute. He’s bright, articulate, and one of the nicest guys you’ll ever meet … so when he called and asked me to provide a reference for him on a job interview I was more than happy to help.

According to Jan:

“Completing three projects with the Analysis Exchange afforded me a rare opportunity to work side by side with seasoned practitioners. Each project had its own unique set of challenges. Nothing can replace hands-on experience with real data and a need to solve real problems. Digital marketing doesn’t take place in an organizational vacuum. These projects also provided me an understanding of the organizational context in which web analytics takes place.”

We also heard from the Vice President of Human Resources who hired Jan, Cynthia Nelson Holmsky:

“As a major e-commerce website, we were recruiting for an E-Commerce Analyst and found an alumnus of Analysis Exchange. While the candidate had many years of business and software analytics experience, his only web experience was through Analysis Exchange. However, that Exchange experience provided just enough applied web analytics to win him the interview. During this recruitment I met other candidates with strong business analysis backgrounds who lacked any web experience, and I referred all of them to Analysis Exchange as a great place to learn web analytics and expand their career potential.”

Cynthia clearly understands the challenges facing recruiters and HR specialists looking for web analytics talent (emphasis mine):

“Web analytics is still a young discipline. Many individuals and businesses want to develop competencies in web analytics, but wonder ‘Where do you go to develop expertise?’ Many colleges and universities have yet to integrate web analytics into their curricula, or what they cover is not hands-on, so Analysis Exchange is meeting a key need in the marketplace for individuals who want real-world experience, while at the same time building supply to meet the demand for web analysis talent in the tech job market. Plus, the Exchange is meeting the needs of non-profit organizations that normally could not tap into this type of expertise. Analysis Exchange is a great idea, and a win-win-win model.”

Hopefully Jan will continue to support the Analysis Exchange — as a mentor, now that he is working professionally in the field. I also hope those of you reading this post will consider joining Jan in the Analysis Exchange. Signing up takes less than a minute and there are plenty of projects looking for mentors and students available right now.

Analytics Strategy

Would you pay $100 per year for Google Analytics?

Back in February, our newest partner Adam Greco waxed philosophical about Google offering a paid version of Google Analytics. He got a bunch of feedback, and all in all the post raised some interesting questions about Google’s place in the web analytics marketplace. Now there is a new rumor — one far less substantiated than the so-called “Enterprise” offering Adam discussed — but one with potentially more far-reaching implications.

On a call today, we heard that Google may be considering charging everyone for the use of Google Analytics.

Everyone? Yep. Everyone.

The details were sparse and wholly unsubstantiated, but came from a source that we generally trust as reliable. And while we normally don’t deal with rumors here at Analytics Demystified, given Google’s footprint — conservatively estimated to be around 30,000 business sites around the world, with perhaps an order of magnitude (or two) more non-business sites being tracked today — the implications of this rumor are interesting for two reasons:

  1. If Google were to charge a fee similar to their other paid offerings, say $100 per year, to use Google Analytics, these fees could produce millions of dollars in annual revenue (and profits) for Google and their investors. Outside projections for Google Analytics installations range into the millions, which, given a reasonable retention rate (say, 10%), would produce substantial revenue. Given the changes Google is going through right now on their management team, it’s hard to say how important “revenue” is, especially when the bandwidth and data storage costs for Google Analytics are likely significant given estimated volumes, and at a time when Google is being criticized over their increasing operating costs. Given the constant criticism over the years of Google’s inability to generate profits outside of their advertising business, perhaps this sort of obvious revenue is suddenly appealing.
  2. If Google were to start forcing folks to pay, this might be a huge boon to the emerging “secondary” market of web and digital analytics vendors, including Woopra, Chartbeat, Performable, Clicky, Kissmetrics, and a rapidly expanding set of web and mobile analytics vendors who largely charge tens to hundreds of dollars per month. Despite the odds (given the footprint “traditional” web analytics vendors have within the enterprise, combined with the hegemony Google has over entry-level businesses and small companies), in the last three years we have seen a surprising “second coming” of web analytics vendors gaining traction in a variety of niches. Be it real-time analytics for blogs (Chartbeat, Woopra), funnel analysis (Kissmetrics), heat-mapping and session recording (Robot Replay, ClickTale, Reinvigorate), or customer-focused analytics (Performable), these companies are, by and large, small, agile, and somehow managing to gain adoption despite the presence of Google and “the bigs”.
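For scale, the arithmetic behind that first point can be sketched out. The installed-base figure below is purely an assumption (the projections cited only say “millions”); only the 10% retention rate and $100 fee come from the discussion above:

```javascript
// Back-of-the-envelope revenue estimate for a paid Google Analytics.
var installs = 2000000;  // assumed installed base ("millions", per outside projections)
var retention = 0.10;    // share of installs who would actually pay
var annualFee = 100;     // dollars per year

var annualRevenue = installs * retention * annualFee;
// 2,000,000 * 0.10 * $100 = $20,000,000 per year
```

Real money, though arguably rounding error next to Google’s advertising business, which is part of why the rumor is hard to evaluate.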

This second implication is very interesting to me … the fact that against well-established, already-deployed, and (in Google’s case) completely free competition, these start-ups are able to grow and, at least in a few cases, thrive (see ClickTale; Robot Replay, which was acquired by ForeSee Results; Performable; etc.). Imagine the glee that founders and investors in these companies would experience if Google Analytics were to put up a real (albeit potentially small) barrier to entry. It likely wouldn’t be enough to stop companies, but it might be enough to make them think “Hmm, I wonder what else is out there?”

Given that most of these start-ups are focusing on ease-of-use and specific use cases, and in many instances are doing a pretty darn good job (my opinion), this pause might be exactly what these start-ups need. Heck, it might even help some of the bigs, given the trouble they have had selling against Google Analytics juxtaposed against the dramatic interface changes that some are poised to unleash. Don’t get me wrong — I’m not saying that Google charging $100 for analytics magically re-opens the door for traditional vendors with annual contracts in the tens of thousands of dollars. But every substantial change in the marketplace is an opportunity for great management teams, and Google suddenly charging anything would surely be a substantial change.

But I digress.

At the end of the day, I personally consider it highly unlikely that Google would start to charge everyone just because they can — it just seems like an unnecessary and “evil” thing to do (despite the fact that we did the same thing earlier this year at Twitalyzer without any negative impact on our business). Still, Google is held to a pretty high standard, and I suspect that the (relatively) small amount of revenue they would ultimately generate is hardly worth the negative press they would likely receive.

But I’m interested in what you folks think. Would you pay $100 per year for Google Analytics as it exists today? What if they offered more features or functionality? If the latter, what would they need to add to get you to pony up? Or would you immediately pull the code off your site if Google required any kind of payment? If so, why?

I welcome your comments and conversation.

Analytics Strategy, Social Media

Privacy: It's a 2.5-Dimensional Issue

I’m keeping the voting open for another week or so on my “choose a new profile picture” poll, so if you haven’t voted yet, please click over and do so. There’s a charitable donation (by me!) involved!

“Privacy” is a hot topic in the world of marketing analytics, driven primarily by shifting consumer (and, in turn, regulatory) sentiment on the subject. That shifting sentiment, I think, is largely being driven by the increasing integration of social media into our lives and our online behavior.

The WAA stepped up and put together a Code of Ethics a few months ago, and privacy is going to be a recurring topic at eMetrics and other conferences for the foreseeable future. Following the San Francisco eMetrics conference, Stéphane Hamel put together three scenarios and asked the #measure community to vote as to the ethics and allowability of each situation. He then revealed the results and added his own thoughts. Towards the end of that second post, Stéphane noted that he was disappointed by the lack of interest in the exercise, given the generally accepted importance of the topic.

Emer Kirrane responded in the comments:

It’s interesting that there seems to be a correlation between legality and ethics in the minds of your respondents. To me, the Code of Ethics is there as a flag against practices that are deemed unethical by the community, rather than deemed unethical by law.

Stéphane’s concern and Emer’s response have been bouncing around in my brain for several weeks. My conclusion: “ethics vs. legality” is going to continue to give us fits.

I realize this isn’t the first time that “ethics” and “the law” haven’t perfectly aligned (they almost never do, actually, even though, from a purist point of view, alignment is the goal), but bear with me — it’s worth using that lens to explore the issue and outline the challenges we’re going to have to deal with. These are two very different dimensions of the privacy debate, and one of them is in flux on several fronts.

Why 2.5 Dimensions?

Obviously, there is a legal/regulatory dimension, and there is an ethical dimension. But, really, the legal/regulatory dimension is heavily driven and influenced by consumer perceptions and fears. I actually wrote some thoughts on that a couple of years ago. With high-profile Facebook snafus and high-profile media outlets reporting on cookies and cross-site tracking, politicians have found an issue that their constituents care about (or can be prodded to care about). So, in a sense, the legal/regulatory dimension has some added “oomph” of consumer concerns behind it; I’m calling that “consumer perspective” another half a dimension.

It’s possible that “consumer perception” should be a third dimension in and of itself. But, oh boy, that would make for some hairy sketching in the remainder of this post. I’m pretty sure I’m not just punting, though — the will of the consumer when it comes to something like privacy does generally get manifested through some form of government regulation.

Start with the Basics

Two dimensions: legal and ethical. We can look at them like this:

Various practices raise privacy questions. In theory, we can plot each of them on this (conceptual) grid — there are more than shown here, but I’m just laying out the basic idea of the framework:

In Theory, We’d Have Harmonious Dimensions

If life was simple, we would have perfect clarity for each dimension, and perfect alignment between dimensions:

Notice the shaded quadrants at top left and bottom right — there would be no practices that were ethical but not legal, nor would there be any practices that were legal but unethical.

Alas! Privacy is Rife with Gray Areas!

Reality is more like this — gray areas rather than hard lines along both dimensions:

Ugh. Things get messy. There are more activities that are questionable — they may or may not be legal and/or they may or may not be ethical! Argh!

But Wait! There’s More!

Ever since the web went mainstream, it’s been a more global medium than anything that came before. And, we’ve all run into cases and concerns that our standard web analytics implementation runs afoul of the law in some country somewhere. This grid illustrates that wrinkle, too — the legal/regulatory gray areas live in different places depending on the country (only the U.S. and the E.U. are shown here — it’s an illustrative diagram, people! Not a comprehensive one!):

And the big blue arrow shows where pressure is being applied (back to that half-dimension of consumer fears mentioned at the beginning of this post). It’s a little counterintuitive that the arrow is pointing upward, isn’t it? How could it be that things are trending towards “allowed?” They’re not. Rather, the “interpretation zone” is moving upward — practices that used to be “clearly allowed” aren’t inherently changing what they are, but those practices are moving from “in the clear” towards the gray area.

Helpful?

This was definitely one of those situations where, when I initially had a rough picture in my mind that would represent these two dimensions, it was simple and clear. It was only as I put pen to paper to sketch it out that it turned out to be tricky. Shortly after I finished writing this post (but, obviously, before I published it…as I’m adding this comment at the end), Jason Thompson made a really good case as to what is (misguidedly) driving the legal dimension out of alignment with the ethical perspective. That reminded me that I keep meaning to go back and re-read the last chapter (chapter 9?) of Jim Sterne’s Social Media Metrics book, as I recall that it was an intriguing non-sequitur that considered turning the entire “tracking” model on its head. Food for thought for another post, that.

What do you think? Is this an effective representation of the shifting privacy landscape we’re dealing with? What does it miss?

Analytics Strategy, Conferences/Community

WAA Elections: I Support the Slate

While the voting period is mostly over, I wanted to drop a quick note and offer up some thoughts on the candidates and process for the current Web Analytics Association elections. This year is clearly different thanks to a new process, one that has the membership voting on both a “slate” of candidates and two “at large” positions. While I initially didn’t understand the need to change the process, upon further explanation and a little reflection, I believe the new process makes sense and has the best interests of the Association and its membership at heart.

Before you go and Tweet “he’s lost his mind …” hear me out.

As the Web Analytics Association has grown, the few board positions have become less of an obligation and more of an opportunity for people. To that point, in recent years we have seen an almost staggering number of people nominated into the election process. This, in my opinion, has created a problem in that A) most of the candidates, despite qualification, are relatively unknown to the web analytics community and B) because of the relatively low number of voters, a “popular vote” has become relatively easy to game. I have certainly thrown my weight behind individual candidates in the past and, because my blog has tens of thousands of readers worldwide (many of whom do vote in WAA elections), I believe I have been able to help folks get elected.

Yay for us and our friends, but boo for the process in general.

The popular vote has led to some truly great people participating in the WAA — folks (and my bias here) like John Lovett, June Dershewitz, Matt Langie, Dennis Mortensen, Ed Wu, and Peter Sanborn. But the popular vote has also led to some less-than-stellar participants in my humble opinion — people who either quit the board mid-stream or who served more as obstructionists than participants.

This new process, with what I believe to be a pretty well vetted board “slate” and list of “at large” candidates, has tremendous potential to do one very important thing: allow the Association to maintain the momentum they have today. From where I sit, in the past year the Association has:

  • Hired a very qualified Executive Director in Mike Levin
  • Started a very successful “local” event in the Symposium
  • Launched a very important community initiative with the Code of Ethics
  • Held a wonderful recognition event in the eMetrics/WAA Gala

and more. Plus, while I am not privy to any greater level of detail than anyone else, my general sense is that the current board is more productive and more collegial than many (or any) past boards and that bodes well for all of us.

So when it comes to the current election cycle, the “slate” has three returning Board members in Peter Sanborn (currently the Board President), Ed Wu, and Alex Yoder plus two new members who are, in my opinion, tremendously qualified to serve in Jodi McDermott and Shari Cleary. I have faith in Peter, Ed, and Alex based on their past work, Jodi has been a passionate contributor to WAA Standards and a number of other initiatives, and Shari is one of the most intelligent, level-headed people I know in life, much less web analytics.

The “at large” positions do create some problems, to be sure. The proposed group was whittled down from a larger group of folks, several of whom were qualified, passionate, and motivated, but my understanding is that the “secret selection committee” (which I offered to help with but asked too late) made decisions based on demonstrated commitment, involvement, and a willingness to work within the processes the Association has already established for the benefit of the membership. This strategy ends up recognizing folks like Chris Berry, a huge supporter of Research and Standards, Eric Feinberg and Lee Isensee, the “Laurel and Hardy” of the WAA and critical members of the membership committee, and Bob Page and Joe Megibow, two individuals who represent the level of leadership in web analytics that many (if not all) of us aspire to. In short, a brilliant group.

This list leaves off some pretty nice people as well, and this I think is what is creating some of the recent consternation on Twitter, but from where I sit the opportunity is clear: participate in the WAA at the level that Chris, Eric, and Lee have, or build the reputation that Bob and Joe have, and you’re a shoo-in for the “at large” slots in the future.

For the record, I am voting for Joe Megibow and Bob Page for the “at large” positions. Both are brilliant, both are passionate about measurement, and both serve as an excellent example of the kind of work we should all be doing. The Association needs more practitioners to represent the real needs of our industry and I cannot think of two better people to fill that role.

Anyway, for what it’s worth, I too was confused about the “slate” process and this election, but hopefully like me you are willing to give the process a chance.

I welcome your comments.

 

Analysis, Analytics Strategy, Reporting

In Defense of "Web Reporting"

Avinash’s last post attempted to describe The Difference Between Web Reporting and Web Analysis. While I have some quibbles with the core content of the post — the difference between reporting and analysis — I take real issue with the general tone that “reporting = non-value-add data puking.”

I’ve always felt that “web analytics” is a poor label for what most of us who spend a significant amount of our time with web behavioral data do day in and day out. I see three different types of information-providing:

  • Reporting — recurring delivery of the same set of metrics as a critical tool for performance monitoring and performance management
  • Analysis — hypothesis-driven ad hoc assessment geared towards answering a business question or solving a business problem (testing and optimization fall into this bucket as well)
  • Analytics — the development and application of predictive models in the support of forecasting and planning

My dander gets raised when anyone claims or implies that our goal should be to spend all of our time and effort in only one of these areas.

Reporting ≠ (Necessarily) Data Puking

I’ll be the first person to decry reporting squirrel-age. I expect to go to my grave in a world where there is still all too much pulling and puking of reams of data. But (or, really, BUT, as this is a biggie), a wise and extremely good-looking man once wrote:

If you don’t have a useful performance measurement report, you have stacked the deck against yourself when it comes to delivering useful analyses.

It bears repeating, and it bears repeating that dashboards are one of the most effective means of reporting. Dashboards done well (and none of the web analytics vendors provide dashboards well enough to use their tools as the dashboarding tool) meet a handful of dos and don’ts:

  • They DO provide an at-a-glance view of the status and trending of key indicators of performance (the so-called “Oh, shit!” metrics)
  • They DO provide that information in the context of overarching business objectives
  • They DO provide some minimal level of contextual data/information as warranted
  • They DON’T exceed a single page (single eyescan) of information
  • They DON’T require the person looking at them to “think” in order to interpret them (no mental math required, no difficult assessment of the areas of circles)
  • They DON’T try to provide “insight” with every updated instance of the dashboard

The last item in this list uses the “i” word (“insight”) and can launch a heated debate. But, it’s true: if you’re looking for your daily, weekly, monthly, or real-time-on-demand dashboard to deliver deep and meaningful insights every time someone looks at it, then either:

  • You’re not clear on the purpose of a dashboard, OR
  • You count “everything is working as expected” as a deep insight

Below is a perfectly fine (I’ll pick one nit after the picture) dashboard example. It’s for a microsite whose primary purpose is to drive registrations to an annual user conference for a major manufacturer. It is produced weekly, and it is produced in Excel, using data from SiteCatalyst, Twitalyzer, and Facebook. Is this a case of, as Avinash put it, us being paid “an extra $15 an hour to dump the data into Excel and add a color to the table header?” Well, maybe. But, by using a clunky SiteCatalyst dashboard and a quick glance at Twitalyzer and Facebook, the weekly effort to compile this is: 15 minutes. Is it worth $3.75 per week to get this? The client has said, “Absolutely!”

I said I would pick one nit, and I will. The example above does not do a good job of really calling out the key performance indicators (KPIs). It does, however, focus on the information that matters — how much traffic is coming to the site, how many registrations for the event are occurring, and what the fallout looks like in the registration process. Okay…one more nit — there is no segmentation of the traffic going on here. I’ll accept a slap on the wrist from Avinash or Gary Angel for that — at a minimum, segmenting by new vs. returning visitors would make sense, but that data wasn’t available from the tools and implementation at hand.

An Aside About On-Dashboard Text

I find myself engaged in regular debates as to whether our dashboards should include descriptive text. The “for” argument goes much like Avinash’s implication that “no text” = “limited value.” The main beef I have with any sort of standardized report or dashboard including a text block is that, when baked into a design, it assumes that there is the same basic word count of content to say each time the report is delivered. That isn’t my experience. In some cases, there may be quite a few key callouts for a given report…and the text area isn’t large enough to fit them all in. In other cases, in a performance monitoring context, there might not be much to say at all, other than, “All systems are functioning fine.” Invariably, when the latter occurs, in an attempt to fill the space, the analyst is forced to simply describe the information already effectively presented graphically. This doesn’t add value.

If a text-based description is warranted, it can be included as companion material. <forinstance> “Below is this week’s dashboard. If you take a look at it, you will, as I did, say, ‘Oh, shit! we have a problem!’ I am looking into the [apparent calamitous drop] in [KPI] and will provide an update within the next few hours. If you have any hypotheses as to what might be the root cause of [apparent calamitous drop], please let me know” </forinstance> This does two things:

  1. Enables the report to be delivered on a consistent schedule
  2. Engages the recipients in any potential trouble spots the (well-formed) dashboard highlights, and leverages their expertise in understanding the root cause

Which…gets us to…

Analysis

Analysis, by [my] definition, cannot be something that is scheduled/recurring/repeating. Analysis is hypothesis-driven:

  • The dashboard showed an unexpected change in KPIs. “Oh, shit!” occurred, and some root cause work is in order
  • A business question is asked: “How can we drive more Y?” Hypotheses ensue

If you are repeating the same analysis…you’re doing something wrong. By its very nature, analysis is ad hoc and varied from one analysis to another.

When it comes to the delivery of analysis results, the medium and format can vary. But, I try to stick with two key concepts — both of which are violated multiple times over in every example included in Avinash’s post:

  • The principles of effective data visualization (maximize the data-pixel ratio, minimize the use of a rainbow palette, use the best visualization to support the information you’re trying to convey, ensure “the point” really pops, avoid pie charts at all costs, …) still need to be applied
  • Guy Kawasaki’s 10-20-30 rule is widely referenced for a reason — violate it if needed, but do so with extreme bias (aka, slideuments are evil)

While I am extremely wordy on this blog, and my emails sometimes tend in a similar direction, my analyses are not. When it comes to presenting analyses, analysts are well-served to learn from the likes of Garr Reynolds and Nancy Duarte when it comes to how to communicate effectively. It’s sooooo easy to get caught up in our own brilliant writing that we believe every word we write is being consumed with equal care (you’re on your third reading of this brilliant blog post, are you not? No doubt trying to figure out which paragraph most deserves to be immortalized as a tattoo on your forearm, right? You’re not? What?!!!). “Dumb it down” sounds like an insult to the audience, but it’s not. Whittle, hone, remove, repeat. We’re not talking hours and hours of iterations. We’re talking about simplifying the message and breaking it up into bite-sized, consumable, repeatable (to others) chunks of actionable information.

Analysis Isn’t Reporting

Analysis and reporting are unquestionably two very different things, but I don’t know that I agree with assertions that analysis requires an entirely different skillset from reporting. Meaningful reporting requires a different mindset and skillset from data puking, for sure. And, reporting and analysis are two different things, but you can’t be successful with the latter without being successful with the former.

Effective reporting requires a laser focus on business needs and business context, and the ability to crisply and effectively determine how to measure and monitor progress towards business objectives. In and of itself, that requires some creativity — there are seldom available metrics that are perfectly and directly aligned with a business objective.

Effective analysis requires creativity as well — developing reasonable hypotheses and approaches for testing them.

Both reporting and analysis require business knowledge, a clear understanding of the objectives for the site/project/campaign/initiative, a better-than-solid understanding of the underlying data being used (and its myriad caveats), and effective presentation of information. These skills make up the core of a good analyst…who will do some reporting and some analysis.

What About Analytics?

I’m a fan of analytics…but see it as pretty far along the data maturity continuum. It’s easy to poo-poo reporting by pointing out that it is “all about looking backwards” or “looking at where you’ve been.” But, hey, those who don’t learn from the past are condemned to repeat it, no? And, “How did that work?” or “How is that working?” are totally normal, human, helpful questions. For instance, say we did a project for a client that, when it came to the results of the campaign from the client’s perspective, was a fantastic success! But, when it came to what it cost us to deliver the campaign, the results were abysmal. Without an appropriate look backwards, we very well might do another project the same way — good for the client, perhaps, but not for us.

In general, I avoid using the term “analytics” in my day-to-day communication. The reason is pretty simple — it’s not something I do in my daily job, and I don’t want to put on airs by applying a fancy word to good, solid reporting and analysis. At a WAW once, I actually heard someone say that they did predictive modeling. When pressed (not by me), it turned out that, to this person, that meant, “putting a trendline on historical data.” That’s not exactly congruent with my use of the term analytics.

Your Thoughts?

Is this a fair breakdown of the work? I scanned through the comments on Avinash’s post as of this writing, and I’m feeling as though I am a bit more contrarian than I would have expected.

Analytics Strategy, Social Media

eMetrics San Francisco 2011 — Recap by the Tweets

Note: There’s a lot of gushing I could do about how great it was to meet a lot of people in person whom I’d only known via Twitter prior, to see people I’ve met before, and to meet new people…but I’ll save some of that for a later post. This is the “content recap” post.

The last 3-4 conferences I’ve gone to, I’ve used Twitter in lieu of a notepad for my note-taking. What I realized after the first time I tried this was that it forced me to be succinct and to be selective as to what I noted. Now, for better or worse, my thumbs have gotten a bit more nimble at the same time that the input mechanisms on my mobile devices have improved. So, some might say I’m not as selective as I should be!

But, after eMetrics in D.C. last fall, I realized that there’s another benefit of tweet-based note-taking at conferences — it enables crowdsourcing the key takeaways. In theory, at least! Given that, I decided to organize this recap based on one thing: the top most retweeted tweets during the conference, as reported by a TweetReach tracker I set up. Scroll to the end of this post to download a CSV with the raw data, if you’re interested in crunching it yourself.
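For anyone who does want to crunch the raw export, a few lines of Python are enough to surface the most-retweeted tweets. This is just a sketch — the `text` column name and the reliance on the “RT @user: …” convention are my assumptions about the export format, not anything TweetReach guarantees:

```python
import csv
import io
from collections import Counter

def top_retweeted(csv_text, n=5):
    """Tally 'RT @user: ...' tweets by their underlying text and
    return the n most-retweeted, most frequent first."""
    counts = Counter()
    for row in csv.DictReader(io.StringIO(csv_text)):
        text = row["text"].strip()
        if text.upper().startswith("RT @"):
            # Strip the 'RT @user:' prefix so retweets of the
            # same original tweet collapse onto one key
            original = text.split(":", 1)[-1].strip()
            counts[original] += 1
    return counts.most_common(n)

# Tiny made-up sample in the assumed one-column format
sample = """text
RT @wendyertter: Data isn't actionable without storytelling
RT @wendyertter: Data isn't actionable without storytelling
RT @kenburbary: Quote from the closing keynote
A plain tweet that is not a retweet
"""

print(top_retweeted(sample))
```

Collapsing on the text after the “RT @user:” prefix is crude — manual retweets that add their own commentary won’t group together — but it’s close enough for a conference recap.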

With that, here are my six summary takeaways:

The Data Isn’t Actionable without Storytelling

Hands down, the most retweeted tweet (14 times) from the conference was this from Wendy Ertter:

Several presenters touched on the fact that one of the key challenges in our industry is communicating what the data means. As analysts, it can be easy to get absorbed in the data to the point that we can intuitively interpret our analyses. All too often, though, we forget that the business users we’re supporting are neither “wired for data” nor as immersed in it as we have been. So, rather than getting stamp-your-foot irritated that your brilliant insights have not led to action, take a look at how those insights are being communicated.

Now, not discussed was the fact that “tell a story with the data” can easily come across as “torture the data until it tells the story you want it to.” It’s a fine line, really, that means transparency has to come along with the storytelling. Storytelling must be merely a means of “effectively communicating the truth” — conveying what the data really is saying, but in a digestible manner.

Social Media, Social Media, Social Media

Ken Burbary’s tweet during Guy Kawasaki’s closing keynote (which garnered quite a bit of ire from the attendees, but that’s potential fodder for a future post) was retweeted 11 times:

Social media was a hot topic at the conference, with the sessions devoted to it concluding: “It’s tough to analyze.” In general, there was consensus that Performance Measurement 101 still applies — if you want to have any hope of measuring social media, you darn tootin’ better have clear objectives for your investment in the channel. Now, because social media isn’t the same as longer standing channels, there are different measures to work with.

One of the more intriguing sessions I attended was a panel, moderated by Michele Hinojosa, that featured Gary Angel of Semphonic and Michael Healy. The subject was sentiment analysis. Specifically, sentiment analysis of short-form text messages — Twitter and the like. Both by illustrating examples and talking through some of the advanced machine learning algorithms that have been applied to the challenge, they made a pretty strong case that trying to discretely quantify sentiment in a Twitter world is a fool’s errand.
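A toy example makes their point concrete. The lexicon scorer below is my own strawman, not anything Gary or Michael presented, but it shows how quickly word-counting approaches fall over on 140-character text:

```python
# Naive lexicon-based sentiment scoring -- the simplest possible approach
POSITIVE = {"great", "love", "awesome", "good"}
NEGATIVE = {"bad", "hate", "awful", "fail"}

def naive_sentiment(tweet):
    # Strip common punctuation/hashtag characters and lowercase each word
    words = [w.strip(".,!?#@").lower() for w in tweet.split()]
    # +1 for each positive word, -1 for each negative word
    score = sum((w in POSITIVE) - (w in NEGATIVE) for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(naive_sentiment("Love this product, works great"))
# -> positive, as hoped
print(naive_sentiment("Oh great, my flight is delayed again"))
# -> also "positive" -- sarcasm defeats the lexicon
print(naive_sentiment("not bad at all"))
# -> "negative" -- negation flips the true meaning
```

Sarcasm, negation, slang, and hashtag mashups all defeat the word counts, which is why the panel argued that a single discrete sentiment score for a tweet is misleading.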

Gary also made a distinction between “monitoring” and “measurement” and, later in the discussion, postulated that social media may be one case where you actually need to do analysis first and then set up your measurement. This makes sense, even in light of my “Performance Measurement 101” comment above. It does make sense to sift around in the conversation that is going on around a topic or a brand a bit to get a human and qualitative sense of the lay of the land before determining exactly what to measure and how.

[Update: I just realized that Gary wrote up a pretty detailed post about his key points in the session over on his blog last week — it’s worth a read.]

Attitude Is As Important As Behavior

This tweet from @SocialMedia2Day during Larry Freed’s opening day keynote was retweeted 10 times:

ForeSee Results was the Diamond Sponsor for eMetrics, and the company continues to push the web analytics industry to recognize attitudinal data as being every bit as important as behavioral data. Interestingly, VOC vendors overall had a much more prominent presence than web analytics vendors (only Google Analytics and Yahoo! Web Analytics were exhibitors at the event — Webtrends, Adobe/Omniture, and Coremetrics were nowhere to be seen in the exhibit hall).

I have to credit Chris Dooley from ForeSee Results for initially introducing me to (read: pestering me about) the rightful place of attitudinal data as a companion to behavioral data. He was right when he started preaching it, and he’s still right today. Another VOC vendor noted during his presentation that, when his company surveyed the top 500 retail sites and the top 500 overall trafficked sites, they found that only 15% were running on-site surveys. That is both surprising and alarming! OpinionLab also impressed a number of people with their presentations in the exhibit hall theater, and iPerceptions provided a bit more detail about their coming 4Q Premium product (which, seeing as how they announced it was coming back in October, is somewhat underwhelming given the price tag).

In short, lots of reinforcement that the voice of the customer matters and shouldn’t be ignored!

comScore’s Silver Bullet (A Bit Tarnished, IMHO)

Since I said I’d go with the most retweets, I have to include this one from John Lovett, which was retweeted 10 times:

The key here is that comScore announced all the problems they were solving. The main differentiator, as best as I can tell, is that comScore is combining web analytics capabilities with its rich demographic/audience-based data. That might be slick, although it seems that they’re overpromising a bit when it comes to the flexibility of the tool and the completeness of the demographic data. I trust John…a lot…so maybe I’m being unduly and prematurely cynical. We’ll see.

Consumers Are Cross-Channel — So Should Be Your Analysis

At the risk of inflating John’s ego (which I’m not all that worried about, but if, ages hence, he’s turned into a pompous ass, I’ll dig up this post and claim credit for starting a perfectly pleasant guy down that path!), the next tweet and the last one are all Lovett-related. Lovett. Love it! 🙂 This next one was Eric Peterson quoting John and was retweeted 10 times:

Data integration and cross-channel analytics were covered by a number of presenters. With the exception of one vendor (who shall remain nameless…but who announced a name change to his company at the conference), the overwhelming agreement was that cross-channel integration is hard, tedious, expensive…and necessary. That one vendor had a video that showed it as being simply a technology issue (and they had the technology!). I’ve dabbled in the customer data integration (CDI) world enough to know that doing this integration at the individual person level is a bear.

But, because customers are living in multiple channels — offline, digital, mobile, social — and are switching freely between them, it’s dangerous to narrow in on a single channel and draw too many conclusions. This challenge isn’t going to go away any time soon.

Several times, both in sessions and in hallway discussions, it came up that both “WAA” and “eMetrics” have quickly become misnomers. Most of the attendees at the conference have responsibilities well beyond simply “web site analytics” or even “digital metrics.” I put in a plug that we could start considering “eMetrics” to be “everywhereMetrics,” which is a shameless ripoff of Resource Interactive’s stance that “eCommerce” has become “everywhereCommerce.”

Fun times to come!

Consumer Privacy — the Regulations, the Law, the Ethics of It

Covered briefly in several sessions, touched on in the WAA Member Meeting, and then covered in depth in a panel were the challenges our industry faces with regard to consumer concerns about privacy:

John has been the face of a multi-person effort to craft a code of ethics that individuals can sign that lays out how we will treat customer data. What became evident at eMetrics is that there simply is no easy answer to “consumer privacy.” And, the fact that the FTC covers the U.S. and has taken differing stances from the EU, and the EU will get to “one policy…implemented and enforced by country,” just makes my head hurt.

The good news, it seems, is that there seems to be an emerging philosophical consensus as to what is “good/okay” and what is “bad” when it comes to user tracking. The kicker is that it’s really, really hard to write that down in an unambiguous, loophole-free way.

If anything, I took away a sense of empowerment when it comes to really living the Code of Ethics and speaking up if/when I see an initiative starting to get into a gray area — it’s not just a “do the right thing because it’s ethical” case at this point. It’s a “do the right thing…or it might come out that you didn’t, and your brand can get burned severely.”

The Tweets Themselves…

As promised at the beginning of this post, if you want to download the data file with all of the tweets from 14-Mar-2011 to 16-Mar-2011 (Eastern time) that came out of my TweetReach tracker, you can do so here. If you do anything interesting with them, please leave a comment here as to what that was.

Analytics Strategy, Reporting, Social Media

A Framework for Social Media Measurement Tools

Fundamental marketing measurement best practices apply to social media as much as they apply to email marketing and web site analytics. It all begins with clear objectives and well-formed key performance indicators (KPIs). The metrics that are actually available are irrelevant when it comes to establishing clear objectives, but they do come into play when establishing KPIs and other measures.

In a discussion last week, I grabbed a dry erase marker and sketched out a quick diagram on an 8″x8″ square of nearby whiteboard to try to illustrate the landscape of social media measurement tools. A commute’s worth o’ pondering heading home that evening, followed by a similar commute back in the next morning, and I realized I might have actually gotten a reasonable-to-comprehend picture that showed how and where the myriad social media measurement tools fit.

Here it is (yep — click on the image to view a larger version):

‘Splain Yourself, Lucy

The first key to this diagram is that it makes a distinction between “individual channel performance” and “overall brand results.” Think about the green box as being similar to a publicly traded company’s quarterly filing. It includes an income statement that shows total revenue, total expenses, and net income. Those are important measures, but they’re not directly actionable. If a company’s profitability tanks in any given quarter, the CEO can’t simply say, “We’re going to take action to increase profitability!” Rather, she will have to articulate actions to be taken in each line of business, within specific product lines, regarding specific types of expenses, etc., to drive an increase in profitability. At the same time, by publicly announcing that profitability is important (a key objective) and that it is suffering, line of business managers can assess their own domains (the blue boxes above) and look for ways to increase profitability. In practice, both approaches are needed, but the actions actually occur in the “blue box” area.

When it comes to marketing, and especially when it comes to the fragmented consumer world of social media, things are quite a bit murkier. This means performance measurement should occur at two levels — at the overall ecosystem (the green box above), which is akin to the quarterly financial reporting of a public company, and at the individual channel level, which is akin to the line of business manager evaluating his area’s finances. I use a Mississippi River analogy to try to explain that approach to marketers.

Okay. Got It. Now, What about These “Measurement Instruments?”

Long, long, LONG gone are the days when a “web analyst” simply lived and breathed a web analytics tool and looked within that tool for all answers to all questions. First, we realized that behavioral data needed to be considered along with attitudinal data and backend system data. Then, social media came along and introduced a whole other set of wrinkles. Initially, social media was simply “people talking about your brand.” Online listening platforms came onto the scene to help us “listen” (but not necessarily “measure”). Soon, though, social media channels became a platform where brands could have a formally managed presence: a Facebook fan page, a Twitter account, a YouTube channel, etc. Once that happened, performance measurement of specific channels became as important as performance measurement of the brand’s web site.

When it comes to “managing social media,” brand actions occur within a specific channel, and each channel should be managed and measured to ensure it is as effective as possible. Unfortunately, each of the channels is unique when it comes to what can be measured and what should be measured. Facebook, for instance, is an inherently closed environment. No tool can simply “listen” to everything being said in Facebook, because much of users’ content is available only to members of their social graph within the environment, or through interactions they have with a public fan page. Twitter, on the other hand, is largely public (with the exception of direct messages and users who have their profile set to “private”). The differing nature of these environments means that they should be managed differently, that they should be measured differently, and that different measurement instruments are needed to effectively perform that measurement.

Online listening platforms are not a panacea, no matter how much they present themselves as such. Despite what may be implied in their demos and on their sites, both the Physics of Facebook and the Physics of Twitter apply — data access limited by privacy settings in the former and by API throttling in the latter. That doesn’t mean these tools don’t have their place, but they are, and should be treated as, generalist measurement platforms.

Your Diagram Is Missing…

I sketched the above diagram in under a minute and then drew it as a formal diagram in under 30 minutes the next morning. It’s not comprehensive by any means — neither in the three “social media channels” (the three channels listed are skewed heavily towards North America and towards consumer brands…because that’s where I spend the bulk of my measurement effort these days) nor in the specific measurement instruments. I’m aware of that. I wasn’t trying to make a totally comprehensive eye chart. Rather, I was trying to illustrate that there are multiple measurement instruments that need to be implemented depending on what and where measurement is occurring.

As one final point, you can actually wipe out the “measurement instrument” boxes and replace those with KPIs at each level. You can swap out the blue boxes with mobile channels (apps, mobile site, SMS/MMS, mobile advertising). I’m (clearly) somewhat tickled with the construct as a communication and planning tool. I’d love to field some critiques so I can evolve it!

Analytics Strategy

Web Analytics Tools Comparison — Columbus WAW Recap Part 2

[Update: After getting some feedback from a Coremetrics expert and kicking around the content with a few other people, I rounded out the presentation a bit.]

In my last post, I recapped and posted the content from Bryan Cristina’s 10-minute presentation and discussion of campaign measurement planning at February’s Columbus Web Analytics Wednesday. For my part of the event, I tackled a comparison of the major web analytics platforms: Google Analytics, Adobe/Omniture Sitecatalyst, Webtrends, and, to a certain extent, Coremetrics. I only had five minutes to present, so I focused on just the base tools — not the various “warehouse” add-ons, not the A/B and MVT testing tools, etc.

Which Tool Is Best?

This question gets asked all the time. And, anyone who has been in the industry for more than six nanoseconds knows the answer: “It depends.” That’s not a very satisfying answer, but it’s true. Unfortunately, it’s also an easy answer — someone who knows Google Analytics inside and out, has never seen the letters “DCS,” referenced the funkily-spelled “eluminate” tag, or bristled at Microsoft usurping the word “Vista” for use with a crappy OS, can still confidently answer the, “Which tool is best?” question with, “It depends.”

And You’re Different?

The challenge is that very, very few people are truly fluent in more than a couple of web analytics tools. I’ve heard that a sign of fluency in a language is that you actually think in the language. Most of us in web analytics, I suspect, are not able to immediately slip into translated thought when it comes to a tool. So, here’s my self-evaluation of my web analytics tool fluency (with regards to the base tools offered — excluding add-ons for this assessment; since the add-ons bring a lot of power, that’s an important limitation to note):

  • Basic page tag data capture mechanics — 95th percentile — this is actually something pretty important to have a good handle on when it comes to understanding one of the key differences between Sitecatalyst and other tools
  • Google Analytics — 95th percentile — I’m not Brian Clifton or John Henson, but I’ve crafted some pretty slick implementations in some pretty tricky situations
  • Adobe-iture Sitecatalyst — 80th percentile — I’m more recent to the Sitecatalyst world, but I’ve now gotten some implementations under my belt that leverage props, evars, correlations, subrelations, classifications, and even a crafty usage of the products variable
  • Webtrends — 80th percentile — I cut my teeth on Webtrends and would have put myself in the 95th percentile five years ago, but my use of the tool has been limited of late; I’m actually surprised at how little some of the fundamentals change, but maybe I shouldn’t be
  • Coremetrics — 25th percentile — I can navigate the interface, I’ve dived into the mechanics of the different tags, and I’ve done some basic implementation work; it’s just the nature of the client work I’ve done — my agency has Coremetrics expertise, and I’m hoping to rely on that to refine the presentation over time
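The “basic page tag data capture mechanics” mentioned above boil down to the same idea in every tool: collect name/value pairs on the page, serialize them into a query string, and request a 1x1 image from the collection server. Here is a minimal, simplified sketch of that mechanic. The variable names (`pageName`, `prop1`, `eVar1`) follow SiteCatalyst conventions and the endpoint URL is made up for illustration; this is not any vendor’s actual tag code.

```javascript
// Minimal sketch of page-tag data capture: gather tracking variables,
// serialize them into a query string, and fire a 1x1 image request.
function buildBeaconUrl(endpoint, vars) {
  var pairs = [];
  for (var key in vars) {
    if (vars.hasOwnProperty(key) && vars[key] !== undefined) {
      // Each variable becomes a URL-encoded name=value pair
      pairs.push(encodeURIComponent(key) + '=' + encodeURIComponent(vars[key]));
    }
  }
  return endpoint + '?' + pairs.join('&');
}

// Hypothetical variables set on the page by the tagging code
var trackingVars = {
  pageName: 'home',
  prop1: 'v1.2',          // e.g., a JS file version passed into an sProp
  eVar1: 'email-campaign'
};

var beaconUrl = buildBeaconUrl('https://metrics.example.com/b/ss', trackingVars);

// In a browser, the beacon would be sent by requesting the image:
// new Image().src = beaconUrl;
```

The differences between tools largely come down to which variables the tag sends and how the collection server interprets them, which is why understanding this layer helps when comparing implementations.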

So, there’s my full disclosure. I consider myself to be pretty impartial when it comes to tools (I don’t have much patience for people who claim impartiality and then exhibit a clear bias towards “their” tool — the one tool they know really well), but, who knows? It’s a fine line between “lack of bias” and “waffler.”

Any More Caveats Before You Get to the Content?

My goal with this exercise was to sink my teeth in a bit and see what I could clearly capture and explain as the differences. Ideally, this would also get to the, “So what?” question. What I’ve found, though, is that answering that question gets circular in a hurry: “If <something one tool shines as> is important to you, then you really should go with <that tool>.” Two examples:

  • If enabling users to quickly segment traffic and view any number of reports by those segments is important, then you should consider Google Analytics (…or buying the “warehouse” add-on and plenty of seats for whatever other tool you go with)
  • If being able to view clickpaths through content aggregated different ways is important, then you should consider Sitecatalyst

These are “features”-oriented assessments, and they rely on a level of expertise with web analytics in order to judge their importance in a given situation. That makes it tough.

Any tool is only as good as its implementation and the analysts using it (see Avinash’s 10/90 rule!). Some tools are much trickier to implement and maintain than others — that trickiness brings a lot of analytics flexibility, so the implementation challenges have an upside. In the end, I’ll take any tool properly implemented and maintained over a tool I get to choose that is going to be poorly implemented.

Finally! The Comparison

I expect to continue to revisit this subject, but the presentation below is the first cut. You might want to click through to view it on SlideShare and click the “Speaker Notes” tab under the main slide area — I added those in after I presented to try to catch the highlights of what I spoke to on each slide.

Do you see anything I missed or with which you violently disagree? Let me know!


Analytics Strategy, Conferences/Community

Guest Post: Success in The Analysis Exchange!

Since Analysis Exchange has been honored with a nomination in the Web Analytics Association Gala Awards, I figured that while our community is considering their votes it was a good time to share some of the great email we get from Exchange participants. This one is from David Schuette, who started as a student and has already graduated to mentor! You can follow David on Twitter @TheCakeScraps. Thanks to David and everyone who has benefited from Analysis Exchange!

If you are in the WAA please consider that a vote for Analysis Exchange is a vote for EVERYONE who contributes to the effort around the world.

A Tale of Two Projects

In the middle of 2010 – 2.5 years into my career as a web analyst – I made one of the better decisions on my journey through the field of web analytics.  A friend of mine, active in the community for some time, pointed me to a project called the Analysis Exchange; he encouraged me to check it out and to sign up as a student.  I did some research and it seemed like a great match.  I would get to help nonprofits and learn a lot in the process.

I’ll be honest; it took a while for me to secure my first project.  I wasn’t sure what the problem was until Eric pointed out (to all members) that a complete profile greatly contributed to the likelihood that a student or mentor would be selected for a project.  I filled it out and started applying again.  At the time there were only a few projects available, in contrast to the 5+ open right now thanks to the hard work of Wendy and team, so it took some time but I was picked to work with Kids Matter, Inc. – an organization supporting foster children.

The experience couldn’t have been better.  Megan, the partner at Kids Matter, was filled with excitement and ambition.  She had done some great work for her organization and wanted to learn more.  She wanted to let the data take away some of the guessing and let it do part of the work for her.

I dove right in and, before long, I had a great presentation that I was able to tweak based on the feedback from my mentor.  The presentation went smoothly and the people at Kids Matter were extremely appreciative of the work.  I even got a thank-you card that was hand-made by one of the kids.  It really made me stop and appreciate just how much good can come from a little time given.

While I was busy working on my first project, the Analysis Exchange kept improving.  The Google Group, a bit quiet recently, contributed in a huge way to small but important improvements to the Exchange site.  It is cool to look at some of the discussions from just a few months ago and see the ideas already implemented in the site.  It made it all the easier to sign up for my next project, at the Apalachicola Bay Chamber of Commerce.

The second project went as well as the first.  My mentor and I provided a high-level usability driven analysis to Anita, our partner at Apalachicola Bay.  The analysis focused on opportunities to draw visitors deeper into the site so they could really see what the Apalachicola Bay area had to offer.  Again, our partner was excited about the results and was genuinely appreciative of the work we put in.  It was our pleasure.

And now I have transitioned to a Mentor on The Exchange.  If my next experiences are half as good as the first two, I will be thrilled.  I’m excited, even anxious, to have the chance to help another organization and provide some coaching to an up-and-coming analytics ninja.  But I also view this change to a mentor as a re-upping of my commitment to The Exchange; I have made it my goal to bring at least one local nonprofit to The Exchange this year and hopefully more!

Everything about my experience has been wonderful.  If you have thought about joining, or perhaps have not participated in a while, go check it out.  You won’t regret it.

Analytics Strategy, Conferences/Community, General

A few thoughts on the upcoming WAA Awards

I got a nice note this morning from Mike Levin at the Web Analytics Association:

“CONGRATULATIONS! You have been nominated for a WAA Award of Excellence in the category of: Most Influential Industry Contributor (individual). Your nomination recognizes the contributions you and/or your company have made to the web analytics industry. It is an honor to be nominated and the WAA congratulates you on your success.”

While I am honored by the recognition and delighted to have been nominated, I told Mike that I am declining to participate in the voting.

Mike wrote me back and seemed surprised, but my thinking is very simple: I have been very fortunate in my web analytics career and have received lots of recognition from my peers, my clients, and the press. I’m not one to bang my own drum and brag about my accomplishments … I prefer to just do my thing, help my clients and the community, and build a strong company for my partners and associates.

So I humbly and politely decline the honor and instead will cast my vote for folks I believe to be truly deserving of an industry honor. Here are the people I will be voting for:

  • Web Analytics Rising Star: Jason Thompson.  Jason is still a bit rough around the edges but I love his style and commitment to getting things done.  If I can vote twice I am voting for Michele “Jojoba” Hinojosa … her passion is palpable and her enthusiasm is infectious.
  • Most Influential Industry Contributor: John Lovett. I’m not sure John is actually eligible because he is on the WAA Board but his work on the WAA Code of Ethics is a monumental achievement and one that has the potential to shape our industry for years to come.  If I can vote twice my second nod goes to Jim Sterne … who has done more for this industry than Jim Sterne?  Damn right, nobody!
  • Most Influential Vendor: Google.  Most of the positive changes we have seen in the past two years in web analytics can be traced either directly or indirectly to the work that Brett Crosby and the team at Google Analytics put out there.  Second vote goes to Omniture, given the critical mass they have been able to create and the big strides they have made since the Adobe acquisition on customer support and overall focus.

UPDATE: OMG I didn’t realize that Corry Prohens was running a shameless and ruthless campaign to win the “Influential Agency/Vendor” award.  You should read his “shameless campaign” blog post and consider voting for Corry.

  • Client/Practitioner of the Year: Best Buy. It is difficult not to vote for one of your own favorite clients, but I hope you will all come to my keynote presentation with Lynn Lanphier at Emetrics and hear why I cast this vote.  Second vote? Dell, for taking the advice I gave them last year to heart; they are now kicking ass and taking names for testing and optimization. Bravo!
  • Technology of the Year: Analysis Exchange. Now, of course, I’m not really going to vote for something I helped create, but I am pretty damn proud of the work we have done and with Wendy Greco at the helm things are only getting better.  If I could vote twice … I wouldn’t, because I’d be tempted to vote for Twitalyzer LOL!

Again, I do appreciate the nod from the WAA and am looking forward to the party — the Analytics Demystified and Keystone Solutions crews will be there in force. I wish everyone nominated for the WAA awards the best of luck and, as a native of Chicago, remember to vote early and vote often!

Don’t forget to nominate your favorite web analytics superstar!

Adobe Analytics, Analytics Strategy, Conferences/Community, General

Conference Season is Upon Us

Wow, I just got done looking more closely at the Analytics Demystified team calendar for the next few months, and it is a doozy! Chances are, if you live in the U.S. and do any type of digital measurement, analysis, or optimization professionally, we are going to see you between now and the end of March.

If that is the case, we’d like to buy you a drink!

Despite each of us presenting, often multiple times, we are always happy to make time for our clients and potential clients when we are out and about.  If you realize you’re going to be at one of the following events, why not drop us a line and we’ll see if we can connect? Who knows, maybe we’re planning a great party or something …

After all that, the three of us are going to slink home to our loved ones and try to convince them we are in fact their fathers, husbands, and sons.

Seriously, though, we never get enough opportunities to meet with partners, friends, and prospects at these events, so if you’d like to meet with any or all of us please drop us a line sooner rather than later so that we can block time and make plans.

Analytics Strategy

The Web Analyst’s Code of Ethics

The Web Analyst’s Code of Ethics is a reality! This Code represents an industry effort to promote ethical data practices and treat consumer data with the respect and attention it deserves.

I’m writing this on the eve of the official launch announcement of the Web Analyst’s Code of Ethics here at the WAA Symposium in Austin, Texas. As you can see in the video above, this effort is the culmination of a ton of hard work by a community of contributors.

Yet, the conversation isn’t a new one. My partner Eric has been writing about the fact that We are our own worst enemy since August, and our internal conversations about privacy regulation and public opinion of tracking practices have been going on long before that. The issue received mainstream attention from the Wall Street Journal in their What They Know series, which took a biased view in our opinion. Anything that starts out with the phrase “Marketers are spying on Internet users…” is FUD in my opinion.

So, in September of last year we decided to do something about it. I must say that Eric never fails to amaze me in his ability to make things happen, because not 24 hours after our conversation about launching a Code of Ethics, he had one drafted and in my inbox. We decided that the best avenue for getting this code out to the community was to work in conjunction with the WAA, where I am a member of the board. Thus, I shopped it around to my fellow board members and we all agreed that it was something that our industry needed. The issue was brought before the WAA Standards Committee and a sub-committee was formed to hash out the details. And the Code was offered to the community for public comment. After numerous iterations and literally dozens of comments and contributors, we arrived at the final Code you see here.

It’s important to recognize that this Code is a pledge for individuals and not organizations. We created it as such because we know that not every individual will be able to enforce policy within their company, but every individual can inform and educate their peers. Yet, as we state in the pledge itself, “I recognize that we are far stronger as a community…”. And this effort is about a community showing its commitment to ethical data collection and utilization practices.

Momentum for this project has been incredible thus far, but our work is far from over. It’s just beginning. Like any good analyst, I’ve created goals and success metrics for the code of ethics that I’ll be tracking and reporting on over time. The video above is the first effort to share a glimpse of the metrics, but ultimately I’m shooting for the following goals:

  1) Gain 1,000 Pledges to the Code of Ethics in 2011
  2) Attract mainstream media attention to this community effort within the first 90 days of launch (e.g., recognition by @WhatTheyKnow)
  3) Ensure that our collective voice is heard by legislators and policy makers before regulation is forced upon us

Let us know what you think about the Code of Ethics here by leaving comments and joining the conversation. Or simply show your support by pledging to follow the Web Analyst’s Code of Ethics.

Adobe Analytics, Analytics Strategy, General

Free webcast on Tag Management Systems on Jan 25th

Given the considerable buzz in the marketplace regarding Tag Management Systems and vendors like Ensighten, TagMan, and BrightTag, I wanted to call your collective attention to a free webcast I am participating in next week on “The Myth of the Universal Tag.” On Tuesday, January 25th at 1:00 PM Pacific time I will be presenting with Josh Manion, CEO of Ensighten, and Brandon Bunker, Senior Manager of Analytics at Sony, detailing some of the advantages I see in the adoption of a tag management platform.

What’s more, the nice folks at Ensighten have taken the registration form off my white paper on tag management systems, so everyone is free to read all of my thoughts on Tag Management without prompting a sales call. How cool is that?

Spread the word:

“The Myth of the Universal Tag” free webcast sponsored by Ensighten
Tuesday, January 25th, 1:00 PM Pacific / 4:00 PM Eastern
Register online now at GoTo Meeting!

Don’t forget to download that free copy of my white paper on tag management systems!

Adobe Analytics, Analytics Strategy, Conferences/Community, General

Want to meet Adam Greco? Go to OMS 2011 in San Diego!

By now I hope you have heard that Adam Greco is joining John and me as a Senior Partner in Analytics Demystified. While his official start date is still a few weeks away, he’s already on the road as part of the Demystified team. If you’d like to meet Adam in person and talk with him about the practice he is building, there are a few places I just happen to know he will be in the coming months:

  • Adam will be participating in the Web Analytics Association (WAA) Symposium in Austin, Texas on Monday, January 24th. Adam is talking about integrating web analytics and CRM which is core to his practice area given his past work at Salesforce.com and Omniture.
  • Adam will also be presenting at the Online Marketing Summit in San Diego, California on Tuesday, February 8th. He’ll be giving the same presentation on web analytics and CRM, discussing how to move marketing analytics from the server room to the board room.
  • Adam will also be joining me in Minneapolis on Wednesday, February 16th for a special Web Analytics Wednesday sponsored by our good friends at SiteSpect and with the generous help from our friends at Stratigent.  We don’t have the details on the site yet but the event will be downtown Minneapolis and Adam and I will be doing some prognostication and fielding questions from Twin Cities locals.

Adam will also be at Webtrends Engage, Adobe’s Omniture Summit, and the Emetrics Marketing Optimization Summit, but we’ll post more on that when additional details emerge.  Suffice it to say, Adam will be busy in his first few months on the job.

If you haven’t met Adam I would encourage you to head out to one of these events and introduce yourself. Especially if you’re a marketer and are considering the Online Marketing Summit — if you haven’t been to OMS you really need to go.  Every year I am absolutely blown away by the job that Aaron Kahlow and the OMS team do bringing that conference together.  OMS draws amazing speakers, amazing sponsors, and most importantly amazing conference participants and delivers an absolute fire-hose of information.

I’m sincerely bummed that Adam is taking my place at OMS this year — I haven’t actually missed a big OMS event in California ever — but I am confident that the audience will benefit greatly from Adam’s message about CRM integration, his direct experience at Salesforce.com, and his distinct presentation style.

Analytics Strategy, Conferences/Community, General

Big Changes at Analytics Demystified

I suspect by now many of you have noticed but this week we made two pretty amazing announcements here at Analytics Demystified. Now that the dust is settling I have some time to take a step back and offer up some comments on the announcements and what I believe they mean for our clients, our prospects, and the web analytics industry in general.

On Tuesday we announced that respected industry veteran Adam Greco had joined John and me as a Senior Partner. Adam is well-known to many in our community thanks to his high-visibility work during his tenure at Omniture, his popular “Omni-Man” blog, and his fine, fine work on the Beyond Web Analytics podcast series.

For John and me, bringing Adam on board was a no-brainer. The guy is as bright as they come, he is articulate, and most importantly he knows how to squeeze every last drop of value out of the most widely deployed digital measurement solutions in use today — Adobe SiteCatalyst and Google Analytics. Adam is committed to extending that expertise to all of the popular platforms as quickly as possible, and our hope is that by mid-year he will be providing the same great insights he has for SiteCatalyst to Webtrends, Unica, Coremetrics, Nedstat, and other customers.

Adam will be running our Operational Use Audit and Framework Development practice as well as providing custom training and generally supporting the rest of the Demystified service offerings.  Which brings me to our second announcement …

On Wednesday we announced an exclusive partnership with tactical and technical consulting practice leaders Keystone Solutions. Keystone is a slightly better-kept secret than Adam Greco, although their current clients certainly know who they are. Founded years ago by former Omniture super-star Matthew Gellis, Keystone has grown into a talent magnet comparable to, well, Analytics Demystified, attracting Matt Wright from HP, Kurt Slater from Expedia, Rudi Schumpert from Ariba, and a host of other amazing analytics technicians.

We have doubled-down with Keystone for one simple reason: in our experience they are the best of the best when it comes to providing fundamental and foundational support for any digital measurement practice. Especially against those same two “most popular” solutions — Google Analytics and Adobe SiteCatalyst — Keystone delivers in a way that few others out there are capable of, and that is the kind of talent we prefer to work with in the field.

Through this partnership Analytics Demystified clients will be able to benefit from a dramatically expanded set of web analytics consulting service offerings ranging from on-the-ground implementation support to ongoing reporting and analysis to some pretty amazing custom solutions. They will also be taking the lead on our Tag Management Systems Audit and Deployment practice, an offering I expect to be red-hot in 2011 and beyond.

Now, unfortunate as it is, we were not able to pursue this type of relationship with Keystone without some cost. The immediate fall-out is that Analytics Demystified will no longer be participating in the X Change conference. While this breaks my heart after having put three years of sweat equity into the event, relationships change and so it is time to move on.

I do, however, promise every one of the hundreds of consultants, vendors, and practitioners we have personally invited to this conference over the past three years that we will be back, live and in-person, with something far more “Demystified” in nature. Based on our work with Web Analytics Wednesday, the Analysis Exchange, and hundreds of other events around the globe, we have a pretty good idea of what is truly missing from the web analytics event landscape … and now, thanks to Adam and the team at Keystone, we have the means to deliver.

I welcome your comments and questions about both pieces of news, and I hope you’ll keep your eyes open in the coming few weeks for even more news from our growing company. It is exciting times, indeed.

Analytics Strategy, General, Social Media

It's not about you, it's about the community …

Happy New Year, my readers! I hope the recent holidays treated you well regardless of your faith, persuasion, or geographic location. I wanted to take a quick break from all the heavy privacy chatter these past few months and tell a little story about the generosity of our community and one individual in particular.

If you follow me on Twitter you may have noticed me cryptically tweeting “it’s not about you, it’s about the community” from time to time. I started sending this update as a subtle hint to a few folks who harp on and on about their accomplishments, products, and “research” in the Twitter #measure community … but sadly those folks never got the hint (so much for being subtle, huh?)

Over time the tweet became something larger — it became a reminder about what we all are capable of when we think about more than our own little world.  “It’s not about you, it’s about the community” is about some of the greatest contributors in the history of web analytics, people like:

  • Jim Sterne, who years ago realized that we needed a place to gather, and who wisely picked the Four Seasons Biltmore in Santa Barbara, California.  While Emetrics may have become a profit-generating machine, those of you who know Jim and know history understand that the conference is as much about and for the community as it is anything else;
  • Jim Sterne, Bryan Eisenberg, Rand Schulman, Greg Drew, Seth Romanow, and others who founded the Web Analytics Association years ago when it was clear that we needed some type of organizing body, committing themselves to hundreds of hours of work without thinking about how they would make money off of the effort;
  • Jim Sterne (again!!!!) who has been making sure that we all know who is doing what where and when via his “Sterne Measures” email newsletter for as long as I can remember;
  • Avinash Kaushik, Google’s famed Analytics Evangelist, who has long committed the profits from his books on web analytics to two amazing charities;
  • Super-contributors to the Web Analytics Forum at Yahoo Groups, folks like Kevin Rogers, Yu Hui, Jay Tkachuk, and dozens more who still take the time to answer questions from newer members of this rapidly expanding community;
  • Past and current Web Analytics Association Board members and super-volunteers, folks like Alex Yoder, Jim Novo, Raquel Collins, Jim Humphries, and so many more who give their time and energy every month to make sure the Association continues to evolve and grow;
  • Activists and evangelists like my partner John Lovett, who in the midst of writing his first book on social media analytics has taken the time to shepherd our Web Analysts Code of Ethics effort through the Web Analytics Association Board of Directors;
  • Everyone who has ever hosted a Web Analytics Wednesday event, including luminaries like Judah Phillips, June Dershewitz, Tim Wilson, Bob Mitchell, Emer Kirrane, Perti Mertanen, Alex Langshur, Anil Batra, Ruy Carneiro, Dash Lavine, Jenny Du, David Rogers, and way too many more folks to list who contribute their valuable time to help grow organic web analytics communities locally;
  • All of the over 1,000 members of the Analysis Exchange, many of whom have contributed to multiple projects to make sure that nonprofit organizations around the world have access to web analytics insights;
  • Dozens of others I am forgetting, and probably hundreds more I have never even met …

When I think about this list of people and their individual contributions to the web analytics community it is almost overwhelming — how lucky we are to have such considerate and giving friends!  Still, people have been giving back for years and so it is rare that I see something or someone in the community that really blows me away …

Until recently.

Not everyone knows Jason Thompson, and I suspect he would be the first to admit that not everyone who knows him actually likes him, but if I had to pick one “web analytics super-hero” for 2010 Jason would be my hands-down, number one choice.  See, Jason was smart enough not to just get the web analytics community to give back to itself; he managed to get our community to help provide clean water to an entire community in a developing nation.

Having worked repeatedly as a volunteer with the Analysis Exchange, Jason was introduced to charity:water, a nonprofit organization whose vision is very simple: to provide clean, safe drinking water for everyone on the planet.

Water.

Not a great blog or free books, not data or solution profilers, but water that mothers can bring to their children. Clean, pure water that I would venture each and every one of the members of the web analytics community takes for granted and rarely even considers the source and its availability.

But Jason thought about it, and what’s more, Jason did something about it. Thanks to some cool new technology, Jason was able to donate his 36th birthday to help raise $500. By leveraging Twitter and his web analytics community he was able to raise that $500 by December 18th.  Having met his goal before his birthday, Jason didn’t stop and settle; he set the bar higher, working first to raise $1,000, then $3,000, and finally $5,000, enough to provide water for an entire village – 80 people for 20 years.

Jason’s effort brought out the best in our community again, collecting donations from luminaries and lay-users alike … hell, he even got money from his mom! Some of the biggest names in web analytics helped Jason along, and donations large and small rolled in right up until Ensighten’s Josh Manion put in the last $300 on Jason’s birthday, putting him over the top and completing his final goal.

Honestly I don’t know Jason very well, but I do know passion and greatness when I see it. Jason once again served as a reminder that “it’s not about you, it’s about the community” and he did more than just tweet obnoxiously … he put his time and money where his mouth is and did something real.

Bravo, Mr. Thompson.  Bravo.

If you don’t know Jason I highly recommend following him on Twitter (@usujason, if you’re into Twitter) and, if you see him at a conference or event, do like I will and buy the man a drink. I for one am going to let Jason be an example of how I can work even harder to make a difference both inside and outside of the web analytics community in 2011 and beyond.

Hopefully some of you will do the same.

Analytics Strategy

The Privacy Apogee

The biggest topic that you will grapple with in 2011 is consumer privacy. We are at the most liberal and lenient point of consumer privacy in the history of time, primarily because consumers spew digital data with reckless abandon with each click, like, tweet, share, and update. Consumers are barely aware of the digital footprints they’re creating, and we don’t know how to handle it. There are no rules here.

Consumers are racing to new digital media at breakneck speeds to be early adopters of the next best thing and are all but addicted to digital. Our obsession is so ravenous that almost half of smartphone users will wake up in the middle of the night to check for digital updates. It’s not their fault really; in fact, I include myself in this frantic race to get the newest browser, the latest app, or to connect with nearly anyone who asks. Heck, I downloaded the Owner’s Manual to a Hyundai on my iPad within seconds of watching a TV commercial just because I could. I have no idea what data Hyundai now has on me, or if or when I’ll start receiving ads or emails containing must-have offers for a car that I probably won’t ever buy (although it looks sweet!). My point is that we’re on the precipice of a substantive change in the way that consumer data is collected and utilized. If we (and by “we” I mean we digital measurers, organizations and institutions) don’t get our acts together in the first quarter of 2011, then we will have regulation forced upon us.

In my opinion, the number one most critical component for even getting off the ground with privacy protection is education. We must educate consumers, organizations, developers and governments to have a meaningful conversation about privacy. If we fall short of that, ignorance about how data is collected, how it’s used, and who uses it will continue to fuel vilification by consumers and by media sources that don’t know What They Know.

To that end, I’m working on a concept that I’m calling the Privacy Apogee.

Those of you who are up to speed on your celestial mechanics will know that an apogee reflects the furthest point of orbit from earth. What I seek to explore is the farthest point of ethical data collection from a consumer. My working diagram above depicts your average consumer at the epicenter of privacy and the way we track his digital activities using technology that extends from innocuous to invasive. My plan is to flesh out this concept with current tracking capabilities and potential consumer benefits. Moreover, I intend to create a blueprint for accountability. Ultimately the goal is to produce an infographic that conveys several things:

For consumers – The Privacy Apogee will illustrate data tracking capabilities that exist today and highlight some of the benefits of opting in to these tracking practices.

For developers – It will offer guidance on what methods of data collection to use and how to communicate data collection, storage and utilization practices in clear language.

For organizations – The Privacy Apogee will illustrate just how far is too far by showing what’s technically possible alongside what’s ethically acceptable.

In creating this work, I hope to educate and inform the masses by offering a public service that will open some eyes to the critical imperative for self-regulation before we have governmental mandates forced upon us. The Privacy Apogee will illustrate current technological capabilities for tracking consumers’ digital actions and offer both positive and negative repercussions of those actions.

So back to you, Captain Blackbeak…I’m listening, and this is what I’m doing to create change. It’s a change in perception. A change in education. And a change in direction for our industry. But like you, I cannot do this alone and need the support and mindshare of our industry. With the help of my partner Eric and the industry #measure pros out there, my goal is to crowdsource this idea to ensure that I’ve fully considered the technology capabilities and the benefits of tracking practices. So I need your help. The Web Analyst’s Code of Ethics is one part of this, but I’ll be working to define the pros and cons of data collection and the methods by which we accomplish our task. Stay tuned for more, as this is just the beginning…

But in the meantime, what do you think?

Analytics Strategy, Social Media

Is It Just Me, or Are There a Lot of #measure Tweets These Days?

<Standard “good golly I haven’t been blogging with my planned weekly frequency / been busy / try to get back on track in 2011” disclaimer omitted>

Update: This update almost warrants deleting this entire post…but I’m going to leave it up, anyway. See Michele Hinojosa’s comment below for a link to an Archivist archive of #measure tweets that goes back to May 2010; it doesn’t show anything like the spike the data below shows, and it shows an average monthly tweet volume of roughly 3X what the November spike below shows. Kevin Hillstrom also created a Twapper Keeper archive back in early November 2010, and the count of tweets in that archive to date looks to be in line with what the Archivist archive is showing. So…wholly invalid data and conclusion below!!!

Corry Prohens’s holiday e-greeting email included a list of his “best of” web analytics for 2010, and he really nailed it. That just further validates what all web analysts know: Corry is, indeed, “Recruiter Man” for our profession. He’s planning to turn the email into a blog post, so I’ll sit back and wait for that. But, I did suggest that the #measure hashtag probably deserved some sort of shout out (I actually dubbed #measure my “web analytics superhero-sans-cape” in my interview as part of Emer Kirrane‘s “silly series”).

That got me to thinking: how much, really, has the #measure community grown since its formal rollout in late July 2009 via an Eric Peterson blog post?

10 minutes in my handy-dandy online listening platform, and I had a nice plot of messages by month:

Yowza! My immediate speculation is that the jump that started in October was directly related to the Washington, D.C. eMetrics conference in the first week of October — the in-person discussions of social media, combined with the continuing adoption of smartphones, combined with the live tweeting that occurred at the conference itself (non-Twitter users at the conference picking up on how Twitter was being effectively used by their peers). That’s certainly a testable hypothesis…but it’s not one I’m going to test right now (add a comment if you’ve got a competing hypothesis or two — maybe I will dive a little deeper if we get some nice competing theories to try out; this will definitely — the horror! — fall in the “interesting but not actionable” category, so, shhhh!!!, don’t point your business users to this post!).

It’s also possible that the data is not totally valid — gotta love the messiness of social media! I’d love to have someone else do a quick “conversation volume” analysis of #measure tweets to see if similar results crop up. Unfortunately, Twitter doesn’t make that sort of historical data available, I shut off my #measure RSS feed archive a few months ago, and, apparently, no one (myself included) ever set up a TwapperKeeper archive for it. So, I can’t immediately think of an alternative source to use to check the data.

Thoughts? Observations? Harsh criticisms? Comment spammers (I know I can always count on you to chime in, you automated, Akismet-busting robots, you!)?


Analytics Strategy

Santa Puts Aprimo Under the Tree!

2011 is shaping up to be the year of big marketing. And luckily for us measurers, smart marketing is founded in data and measurement. With IBM’s recent acquisition rampage and now Teradata’s plans to buy Aprimo, there is unprecedented choice for integrated enterprise marketing solutions. Teradata announced today its intention to buy the Enterprise Marketing Management leader for $525M, with a closing date anticipated for sometime in Q1 2011. It’s a smart move in my opinion because the days of big data management and the ability to harness the consumer data firehose for elevated marketing are upon us.

On the executive briefing this morning, I posed a question asking if this acquisition was a response to IBM’s recent buying spree, and the answer was a definitive no. Bill Godfrey, Aprimo’s Chief Executive Officer, quickly pointed out that Aprimo’s technology set covers 8 categories and that only one competes directly with the IBM/Unica offering. He reiterated, “This is not a copy-cat move,” with mild umbrage. Mr. Godfrey went on to eloquently explain that the merger pursues an independent strategy that brings a unified platform covering a very broad end-to-end spectrum of functionality. While the story sounded familiar, it’s a good one. It leverages the database storage and business analytics capabilities of Teradata and layers the marketing management and operations proficiency of Aprimo on top. This enterprise-ready integrated solution fuels a marketer’s paradise where insights are churned from data, which pumps intelligent life into automated marketing. All this happens within a closed-loop system that improves over time. Sounds rosy, doesn’t it? To paraphrase Teradata’s CMO Darryl McDonald, “The combined solution will help accelerate revenue generating campaigns and leverage data for strategic insights and quick response.”

Keep in mind that this isn’t entirely new territory for Teradata, which has been offering marketing products to its customers for some time. With IWI (Integrated Web Intelligence) and TRM (Teradata Relationship Manager), it’s already delivering digital data integration and intelligent marketing to its customers. Yet, it will be interesting to see how many existing clients and new organizations adopt this complete functionality. My hunch is that this stack is not for the faint of heart nor the bootstrapped organization. It will work best with deeply integrated datasets, stored within big iron and activated using some complex Marketing Resource Management capabilities. All things that both Teradata and Aprimo excel at. But fair warning: Mom & Pop shops need not apply. However, if you’re a large enterprise looking to accelerate your marketing prowess, then this may be the solution you’ve had on your wish list all these years.

While integrating these technologies may take a while, and the promise of an end-to-end solution is no trivial pledge, I’m bullish on the deal. This is a step forward for marketers because it has the potential to deliver the ERP system they never had. It still doesn’t cover everything, but the combined solution sure does handle some critical moving parts.

Congrats to everyone at Aprimo for building an attractive offering and to Teradata for recognizing it. And Happy Holidays to all!

Analytics Strategy, Conferences/Community, General

FTC "Do Not Track?" Bring it on …

As the hubbub around consumer privacy continues, I was gently prodded by a friend to pipe up in the conversation.  While my feelings about how we have ended up in this position are pretty clear, and while my partner John and I have proposed what we believe is a step in the right direction regarding online privacy and the digital measurement community, it seems that some type of ban or limitation on online tracking is becoming inevitable.

Without getting political or debating the reality of what we can and cannot know about online visitors I have a single word response to the FTC:

Whatever.

Before you accuse me of changing my stripes or going completely nuts consider this: If the FTC is able to somehow pull off the creation of a universal opt-out mechanism, and if the browser developers support this mechanism despite clear and compelling reasons not to, and if consumers actually widely adopt the mechanism — all pretty big “ifs” in my humble opinion — then I believe the digital measurement industry will do what I have already described as inevitable:

We will hold a revolution!

Since my tenure at JupiterResearch back in 2005 I have been telling anyone who would listen to stop worrying about counting every visitor, visit, and page view and instead start thinking about statistically relevant samples, confidence intervals, and the algorithmic use of data to conduct analysis.  Yes, you need to work to ensure data quality — of course you do — but you don’t have to do it at the expense of your sanity, your reputation, or your job …

See, it turns out in our community it doesn’t really matter whether we are able to measure 100% of the population, 90% of the population, or even 80% of the population — what matters is that we are able to analyze our visitor populations and that we are able to draw reasonable conclusions from that analysis.  Oh, we have to be empowered to conduct analysis as well, but that’s a whole other problem …
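The sampling argument here can be made concrete with a little arithmetic: even if an opt-out mechanism shrank your measurable population, confidence intervals on what remains stay tight. A minimal sketch, assuming the normal approximation to the binomial; all numbers are hypothetical:

```javascript
// Sketch: 95% confidence interval for a conversion rate measured on a
// sample of visitors (normal approximation to the binomial).
function conversionRateCI(conversions, visitors, z) {
  z = z || 1.96; // z-score for ~95% confidence
  var p = conversions / visitors;
  var se = Math.sqrt(p * (1 - p) / visitors); // standard error
  return { rate: p, low: p - z * se, high: p + z * se };
}

// Suppose "Do Not Track" left us measuring only 80,000 of 100,000 visitors:
var ci = conversionRateCI(2400, 80000);
```

With 80,000 measured visitors, the 95% interval on a 3% conversion rate is roughly ±0.12 percentage points, which is plenty precise for nearly any business decision.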

Statistical analysis of the data … trust me, it’s going to be all the rage in a few years. I’m not saying this simply because I have a white paper describing the third generation of digital measurement tools that will empower this type of analysis … although I would encourage you to download and read “The Coming Revolution in Web Analytics” (freely available thanks to the generous folks at SAS!)

I’m saying this because every day I see the writing on the wall.  Data volumes are increasing, data sources are increasing, and demands for insights are increasing, all while professional journalists, politicians, and political appointees are supposedly protecting our “God-given right to surf the Internet in peace” without any regard to the businesses, employees, and investors who depend to a greater or lesser degree on web-collected data to provide a service, pay their bills, and make a profit …

Okay, sorry, that was editorializing.  My bad.

Still, rather than wring our hands and gripe about how much the credit card companies know (which is a silly argument given that credit card companies provide tangible value in exchange for the data they collect … it’s called “money”) I believe it is time to do three things:

  1. Suck it up.
  2. Hold yourself to a higher standard.
  3. Buy “Statistics in Plain English” and start reading.

The good news is that we have access to lots and lots of great statistical analysis of sampled data today — we just might not realize it.  Consider:

Have I mentioned Excel, Tableau, and R?  Hopefully by now you get the gist … statistics is already all around us all the time, perhaps just not exactly where we expect it or, in the context of lower rates of data collection, where we will ultimately need it to be.

Perhaps the most encouraging evidence that we will be able to make this shift is the increasing attention the digital world is getting from traditional business intelligence market leaders like Teradata, FICO, IBM, and SAS.  I, for one, am more or less convinced that the gap between “web analytics” and “Analytics” is about to be closed even further … and here’s one guy that seems to agree with me.

We don’t need to thumb our noses at the privacy people — quite the opposite, and to this end John and I will be sitting down with a representative from the Center for Democracy & Technology and Adobe’s Chief Privacy Officer MeMe Rasmussen at the next Emetrics in San Francisco! We also don’t need to stick our heads back in the sand and hope this issue will simply go away — it won’t, trust me.

We need to prepare.

Prepare by committing yourself to not being that scary data miner that consumers are supposedly so afraid of; prepare by improving your data quality to the extent that you are able; and prepare by starting to communicate to leadership that it really doesn’t matter if you can count every visitor, every visit, and every page view — what matters is your ability to analyze data using the tools at your disposal to deliver value back to the business.

If you’re not sure how to do that, call us.

Viva la revolution!

DISCLOSURE: I mentioned and linked to lots of vendors in this post which I normally do not do. Some are clients of Analytics Demystified, others are not. If you have concerns about why we linked to one company and not another please don’t hesitate to email me directly.

Adobe Analytics, Analytics Strategy, General

Tracking Lead Gen Forms by Page Name

Every once in a while, as a web analyst, I get frustrated by stuff and feel like there has to be a better way to do what I am trying to do. Sometimes you are able to find a better way; often you are not. In this case, I had a particular challenge and did find a cool way to solve it. You may not have the same problem, but, if for no other reason than to get it off my chest, I am writing this as a way to exhale and bask in my happiness of solving a web analytics problem…

My Recent Problem
So what was the recent problem I was facing that got me all bent out of shape? It had to do with Lead Generation forms, which are a staple of B2B websites like mine. Let me explain. Many websites out there, especially B2B websites, have Lead Generation as their primary objective. In past blog posts, I have discussed how you can track Form Views, Form Completes and Form Completion Rates. However, over time, your website may end up with lots of forms (we have hundreds at Salesforce.com!). In a perfect world, each website form would have a unique identifier so you can see completion rates independently. That isn’t asking too much, is it? However, as I have learned, we rarely live in a perfect world!

Through some work I did in SiteCatalyst, I found that our [supposedly unique] form identifier codes were being copied to multiple pages on multiple websites. While this causes no problems from a functionality standpoint – visitors can still complete forms – what I found was that the same Form ID used in the US was also being used in the UK, India, China, etc… Therefore, when I ran our Form reports and looked at Form Views, Form Completes and Form Completion Rate by Form ID, I had no idea that I was looking at data for multiple countries. For example, if you look at this report, nothing seems out of the ordinary, right?

However, look what happened when I broke this report (last row of above report) down by a Page Name eVar:

At first, I thought I was going crazy! How can this unique Form ID be passed into SiteCatalyst on eleven different form pages on nine country sites? This caused me to dig deeper, so I ran a DataWarehouse report of Form IDs by Page Name and found that an astounding number of Form Pages on our global websites shared IDs. Suddenly, I panicked and realized that whenever I had been reporting on how Forms were performing, I was really reporting on how they were performing across several pages on multiple websites. In the example above, I realized that the 34.669% Form Completion Rate I was reporting for the US version of the form in question was really reporting data with the same ID for forms residing on websites in Germany, China, Mexico, etc… While the majority was coming from the form I was expecting, 22% was coming from other pages! Not good!
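For what it’s worth, the audit itself doesn’t require anything fancy: given rows exported from a Form ID by Page Name report, you just group pages by ID and flag any ID that shows up on more than one page. A rough sketch; the row shape, page names, and IDs are made up for illustration:

```javascript
// Sketch: flag Form IDs that appear on more than one page.
// Input rows mimic a DataWarehouse export of Form ID by Page Name.
function findSharedFormIds(rows) {
  var pagesById = {};
  rows.forEach(function (row) {
    // Record each distinct page name seen for this Form ID
    (pagesById[row.formId] = pagesById[row.formId] || {})[row.pageName] = true;
  });
  var shared = {};
  Object.keys(pagesById).forEach(function (id) {
    var pages = Object.keys(pagesById[id]);
    if (pages.length > 1) shared[id] = pages.sort(); // duplicated ID
  });
  return shared;
}

var shared = findSharedFormIds([
  { formId: "7010", pageName: "us:demo-form" },
  { formId: "7010", pageName: "uk:demo-form" },
  { formId: "7011", pageName: "us:contact-form" }
]);
```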

The Solution
So there I was. Stuck in web analytics hell, reporting something different than I thought I was. What do you do? The logical solution would be to do an audit and make sure each Form page on the website had a truly unique ID. However, that is easier said than done when your web development team is already swamped. Also, even if you somehow manage to fix all of the IDs, what is preventing these IDs from getting duplicated again? We looked at all types of process/technology solutions and then realized that there is an easy way to fix this by doing a little SiteCatalyst trickery.

So what did we do? We simply replaced the Form ID eVar value with a new value that concatenated the Page Name and the Form ID on every Form Page and Form Confirmation Page. By concatenating the Page Name value, even if the same Form ID was used on multiple pages, the concatenated value would still be unique. For example, the old Form ID report looked like the one above:

But the new version looked like this:

With this new & improved report, when I was reporting for a particular form on a particular site/page, I could search by the form pagename and be sure I was only looking at results from that page. Also, a cool side benefit of this approach is that you could add a Form ID to the search function to quickly find all pages that had the same Form ID in case you ever did want to clean up your Form ID’s:

Implementation Gotcha!
However, there is one tricky part of this solution. While it is certainly easy to concatenate the s.pagename value with the Form ID on the Form page, what about the Form Confirmation page? The Form Confirmation page is where you should be setting your Form Completion Success Event and that page is going to have a different pagename. If your Form ID report doesn’t have the same Page Name + Form ID value for both the Form View and Form Complete Success Event, you cannot use a Form Completion Rate Calculated Metric. For this reason, you need to use the Previous Value Plug-in to pass the previous pagename on the Form Confirmation page. Doing this will allow you to pass the name of the “Form View” page on both the Form View and Form Complete page of your site so you have the same page name value merged with the Form ID.
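Conceptually, the concatenation boils down to a few lines. In a real implementation this would live in your s_code.js and lean on the Previous Value Plug-in for the prior page name; the function, page names, and Form ID below are all hypothetical:

```javascript
// Sketch: build the concatenated "Page Name | Form ID" eVar value.
// On the Form Confirmation page, the previous page's name is used
// (supplied by the Previous Value Plug-in in a real s_code.js), so the
// Form View and Form Complete events report into the same row.
function formIdEVar(formId, pageName, previousPageName, isConfirmationPage) {
  var effectivePage = isConfirmationPage ? previousPageName : pageName;
  return effectivePage + " | " + formId;
}

// The Form page and its confirmation page yield the same eVar value:
var viewValue = formIdEVar("7010", "us:demo-form", null, false);
var completeValue = formIdEVar("7010", "us:demo-form:thanks", "us:demo-form", true);
```

Because both the Form View and the Form Complete carry the same concatenated value, a Form Completion Rate calculated metric lines up row for row.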

A Few More Things
Finally, while the Form ID report above serves this particular function, it is not very glamorous and it might not be the most user-friendly report for your users. If you want to provide a more friendly experience you can do the following with SAINT Classifications:

  1. Classify the Form ID value by its Page Name so your users can see Form Views, Form Completions and the Form Completion Rate by Page Name
  2. Classify the Form ID value by the Form ID if for some reason you want to go back to seeing the report you had previously
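If you go the SAINT route, the upload is just a tab-delimited file whose first column is the raw key (here, the concatenated eVar value) and whose additional columns become classification reports. A hypothetical sketch with made-up page names and IDs:

```text
Key                     Page Name       Form ID
us:demo-form | 7010     us:demo-form    7010
uk:demo-form | 7010     uk:demo-form    7010
us:contact-form | 7011  us:contact-form 7011
```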

Final Thoughts
Well there you have it. A very specific solution to a specific problem I encountered. If you have Lead Generation Forms on your website, maybe it will help you out one day. If not, thanks for letting me get this out of my system!

Analytics Strategy, Conferences/Community, General

Are you in Atlanta? I will be, next week!

Just a quick note to those of you in the greater Atlanta (GA) metropolitan region to let you know I will be in town next week working and participating in two awesome events:

  1. A blow-out Atlanta Web Analytics Wednesday, sponsored by the fine folks at Unica, where I will be moderating a “practitioner panel” with Delta, Home Depot, and How Stuff Works. Sadly we only had room for 60-odd people and that list filled up almost right away … but you can write to Jane Kell and ask to be put on the waiting list just in case people have to back out.
  2. A presentation to Atlanta CHI on “Getting to Know Your Users Using Data” at the Georgia Tech Research Institute Conference Center. As this is a somewhat mixed audience my presentation will be a high-level walk through the systems and processes we all leverage on a daily basis.

As far as I know the CHI event is not sold out and costs $35 for the general public, $10 for students with ID, and nothing (free!) if you are a member of Atlanta CHI!

For those of you not in Atlanta, apologies for this utterly useless blog post. Hopefully I will make it to your town soon and can make it up to you …

Analytics Strategy, Conferences/Community

Updated "Web Analysts Code of Ethics"

Just in case you hadn’t seen this already I wanted to call your attention to the updated (version 2) “Web Analysts Code of Ethics” over at the Web Analytics Association blog. John Lovett and the members of the Standards Subcommittee did a wonderful job condensing my original work down into a more easily digested document.

The committee is still looking for comments on this version so please, please head over, read the update, and let us know what you think.

Thanks to John and the WAA for making this happen for all of us!

Analytics Strategy

My Letter To The C-Suite

The following was originally published in ExactTarget’s 10 Ideas To Turn Into Results report. It’s part of their Letters to the C-Suite series, and this is my letter…

To The Executive Team:

Do you even know who your customers are anymore? Chances are, you probably don’t. You catch fleeting glimpses of them as they open your emails or pop onto your website for a quick visit. You might even momentarily engage with them when they drop into your store to browse around or see your products firsthand. Or maybe you meet them ever so briefly as they feign interest in your brand by “liking” something you posted on Facebook.

If you’re doing it right, your business is collecting feedback across many customer touch points.

But you only really hear them when they shout from the rooftops, irate and full of vim. That’s probably where you begin to learn what’s on their minds. But do you even know that it’s the same person who was showing you all that love during your last promotion? Probably not.

In actuality, few companies really know their customers. Whether your customers are end users or other businesses, how they interact with your brand, where they discover new information, and how they communicate is changing at an astounding rate. Customers are increasingly unaffected by traditional marketing conventions, and their tolerance for redundant messaging, static content, and conflicting brand information is nonexistent. They don’t see your organization like you do—in departmentalized silos of categories, products, business units, and operating divisions. To them, you’re just that brand they either love, hate, or treat with ambivalence. That is, until you knock their socks off by impressing them with your service, support, and relevance. Yet, to really deliver value to your customers, you need to get to know them. This starts by remembering the interactions you have with them and building off of these activities.

Digital communication is the new reality, and treating customers through digital channels is synonymous with how you’d treat someone you meet in person. Listen to what they’re saying and respond with appropriate dialog. But most importantly, remember these things (because upon your next conversation, your customer might just remember you):

• Your memory of customers exists at the database level.

• By maintaining customer profiles and appending them with attributes that contain history, activity, and propensity (among other things), you can truly begin to have meaningful interactions.

• To do this effectively, the database must contain information from all your touch points. This includes transactional systems, web analytics, call centers, mobile devices, social media, ATMs, stores, email systems, and whatever else you’re using to reach out.

Bringing your data together through integrations enables you to achieve a holistic picture of your customers. A little scared by this? Well, you should be. Customer behaviors are going to fundamentally change the way you engage with your audience. If you’re not equipped, they’re going to take their conversations (and their wallets) elsewhere. By integrating your data, you open opportunities for new customer dialogs.

Take my word for it—it’s happening NOW.

Your Agent For Change,
John Lovett

 

Analysis, Analytics Strategy, Reporting, Social Media

Analyzing Twitter — Practical Analysis

In my last post, I grabbed tweets with the “#emetrics” hashtag and did some analysis on them. One of the comments on that post asked what social tools I use for analysis — paid and free. Getting a bit more focused than that, I thought it might be interesting to write up what free tools I use for Twitter analysis. There are lots of posts on “Twitter tools,” and I’ve spent more time than I like to admit sifting through them and trying to find ones that give me information I can really use. This, in some ways, is another one of those posts, except I’m going to provide a short list of tools I actually do use on a regular basis and how and why I use them.

What Kind of Analysis Are We Talking About?

I’m primarily focused on the measurement and analysis of consumer brands on Twitter rather than on the measurement of one’s personal brand (e.g., @tgwilson). While there is some overlap, there are some things that make these fundamentally different. With that in mind, there are really three different lenses through which Twitter can be viewed, and they’re all important:

  • The brand’s Twitter account(s) — this is analysis of followers, lists, replies, retweets, and overall tweet reach
  • References of the brand or a campaign on Twitter — not necessarily mentions of @<brand>, but references to the brand in tweet content
  • References to specific topics that are relevant to the brand as a way to connect with consumers — at Resource Interactive, we call this a “shared passion,” and the nature of Twitter makes this particularly messy, but, to whatever level it’s feasible, it’s worth doing

While all three of these areas can also be applied in a competitor analysis, this is the only mention (almost) I’m going to make of that — some of the techniques described here make sense and some don’t when it comes to analyzing the competition.

And, one final note to qualify the rest of this post: this is not about “online listening” in the sense that it’s not really about identifying specific tweets that need a timely response (or a timely retweet). It’s much more about ways to gain visibility into what is going on in Twitter that is relevant to the brand, as well as whether the time spent investing in Twitter is providing meaningful results. Online listening tools can play a part in that…but we’ll cover that later in this post.

Capturing Tweets?

When it comes to Twitter analysis, it’s hard to get too far without having a nice little repository of tweets themselves.  Unfortunately, Twitter has never made an endless history of tweets available for mining (or available for anything, for that matter). And, while the Library of Congress is archiving tweets, as far as I know, they haven’t opened up an API to allow analysts to mine them. On top of that, there are various limits to how often and how much data can be pulled in at one time through the Twitter API. As a consumer, I suppose I have to like that there are these limitations. As a data guy, it gets a little frustrating.

Two options that I’ve at least looked at or heard about on this front…but haven’t really cracked:

  • Twapper Keeper — this is a free service for setting up a tweet archive based on a hashtag, a search, or a specific user. In theory, it’s great. But, when I used it for my eMetrics tweet analysis, I stumbled into some kinks — the file download format is .tar (which just means you have to have a utility that can uncompress that format), and the date format changed throughout the data, so getting all of the tweets’ dates readable took some heavy string manipulation
  • R — this is an open source statistics package, and I talked to a fellow several months ago who had used it to hook into Twitter data and do some pretty intriguing stuff. I downloaded it and poked around in the documentation a bit…but didn’t make it much farther than that

I also looked into just pulling tweets directly into Excel or Access through a web query. It looks like I was a little late for that — Chandoo documented how to use Excel as a Twitter client, but then reported that Twitter made a change that means that approach no longer works as of September 2010.

So, for now, the best way I’ve found to reliably capture tweets for analysis is with RSS and Microsoft Outlook:

  1. Perform a search for the Twitter username, a keyword, or a hashtag from http://search.twitter.com (or, if you just want to archive tweets for a specific user, just go to the user’s Twitter page)
  2. Copy the URL for the RSS for the search (or the user)
  3. Add a new RSS feed in MS Outlook and paste in the URL

From that point forward, assuming Outlook is updating periodically, the RSS feeds will all be captured.

There’s one more little trick: customize the view to make it more Excel/export-friendly. In Outlook 2007, go to View » Current View » Customize Current View » Fields. I typically remove everything except From, Subject, and Received. Then go to View » Current View » Format Columns and change the Received column format from Best Fit to the dd-Mmm-yy format. Finally, remove the grouping. This gives you a nice, flat view of the data. You can then simply select all the tweets you’re interested in, press <Ctrl>-<C>, and then paste them straight into Excel.

I haven’t tried this with hundreds of thousands of tweets, but it’s worked great for targeted searches where there are several thousand tweets.
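The steps above lean on Outlook, but any RSS reader (or a few lines of script) can do the same flattening. As a rough sketch, here’s how a saved copy of such a feed could be turned into From/Subject/Received-style rows with Python’s standard library — the feed snippet below is made-up sample data, not a real Twitter response:

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 snippet of the kind a tweet search feed might
# contain (illustrative sample data only).
RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <item>
    <title>Loving the #emetrics sessions today!</title>
    <author>user_one</author>
    <pubDate>Mon, 04 Oct 2010 15:04:00 +0000</pubDate>
  </item>
  <item>
    <title>RT @user_one: Loving the #emetrics sessions today!</title>
    <author>user_two</author>
    <pubDate>Mon, 04 Oct 2010 15:10:00 +0000</pubDate>
  </item>
</channel></rss>"""

def parse_feed(xml_text):
    """Flatten an RSS feed into (author, tweet, date) rows, mirroring
    the From/Subject/Received columns in the Outlook view."""
    root = ET.fromstring(xml_text)
    rows = []
    for item in root.iter("item"):
        rows.append((item.findtext("author"),
                     item.findtext("title"),
                     item.findtext("pubDate")))
    return rows

rows = parse_feed(RSS)
print(len(rows))  # 2
```

From there, the rows can be written to a CSV and pulled into Excel just like the pasted Outlook view.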

Total Tweets, Replies, Retweets

While replies and retweets certainly aren’t enough to give you the ultimate ROI of your Twitter presence, they’re completely valid measures of whether you are engaging your followers (and, potentially, their followers). Setting up an RSS feed as described above based on a search for the Twitter username (without the “@”) will pick up both tweets by that account and tweets that reference that account.

It’s then a pretty straightforward exercise to add columns to a spreadsheet to classify tweets any number of ways using the IF, ISERROR, and FIND functions. These can be used to quickly flag each tweet as a reply, a retweet, a tweet by the brand, or any mix of things:

  • Tweet by the brand — the “From” value is the brand’s Twitter username
  • Retweet — tweet contains the string “RT @<username>”
  • Reply — tweet is not a retweet and contains the string “@<username>”
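A minimal sketch of those same flags outside of Excel (the brand username here is hypothetical):

```python
BRAND = "examplebrand"  # hypothetical brand Twitter username

def classify(author, text, brand=BRAND):
    """Mirror the spreadsheet flags: brand tweet, retweet, or reply."""
    text_lower = text.lower()
    if author.lower() == brand:
        return "brand tweet"
    if f"rt @{brand}" in text_lower:
        return "retweet"
    if f"@{brand}" in text_lower:
        return "reply"
    return "other mention"

print(classify("fan123", "RT @ExampleBrand: big news!"))    # retweet
print(classify("fan123", "@examplebrand love the product"))  # reply
print(classify("ExampleBrand", "Thanks everyone!"))          # brand tweet
```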

Depending on how you’re looking at the data, you can add a column to roll up the date — changing the tweet date to be the tweet week (e.g., all tweets from 10/17/2010 to 10/23/2010 are given a date of 10/17/2010) or the tweet month. To convert a date into the appropriate week (assuming you want the week to start on Sunday):

=C1-WEEKDAY(C1)+1

To convert the date to the appropriate month (the first day of the month):

=DATE(YEAR(C1),MONTH(C1),1)

C1, of course, is the cell with the tweet date.
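For comparison, the same roll-ups sketched in Python (Excel’s WEEKDAY defaults to Sunday = 1, which the modulo arithmetic below reproduces):

```python
from datetime import date, timedelta

def week_start(d):
    """Sunday on or before d -- equivalent to =C1-WEEKDAY(C1)+1."""
    return d - timedelta(days=(d.weekday() + 1) % 7)

def month_start(d):
    """First of the month -- equivalent to =DATE(YEAR(C1),MONTH(C1),1)."""
    return d.replace(day=1)

print(week_start(date(2010, 10, 20)))   # 2010-10-17 (a Sunday)
print(month_start(date(2010, 10, 20)))  # 2010-10-01
```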

Then, a pivot table or two later, and you have trendable counts for each of these classifications.

This same basic technique can be used with other RSS feeds and altered formulas to track competitor mentions, mentions of the brand (which may not match the brand’s Twitter username exactly), mention of specific products, etc.

Followers and Lists

Like replies and retweets, simply counting the number of followers you have isn’t a direct measure of business impact, but it is a measure of whether consumers are sufficiently engaged with your brand. Unfortunately, there are not exactly great options for tracking net follower growth over time. The “best” two options I’ve used:

  • Twitter Counter — this site provides historical counts of followers…but the changes in that historical data tend to be suspiciously evenly distributed. It’s better than nothing if you don’t have a time machine handy. (See the Twitalyzer note at the end of this post — I may be changing tools for this soon!)
  • Check the account manually — getting into a rhythm of just checking an account’s total followers is the best way I’ve found to accurately track total followers over time; in theory a script could be written and scheduled that would automatically check this on a recurring basis, but that’s not something I’ve tackled

I also like to check lists and keep track of how many lists the Twitter account is included on. This is a measure, in my mind, of whether followers of the account are sufficiently interested in the brand or the content that they want to carve it off into a subset of their total followers so they are less likely to miss those tweets and/or because they see the Twitter stream as being part of a particular “set of experts.” Twitalyzer looks like it trends list membership over time, but, since I just discovered that it now does that, I can’t stand up and say, “I use that!” I may very well start!

Referrals to the Brand’s Site

This doesn’t always apply, but, if the account represents a brand, and the brand has a web site where the consumer can meaningfully engage with the brand in some way, then measuring referrals from Twitter to the site is a measure of whether Twitter is a meaningful traffic driver. There are fundamentally two types of referrals here:

  • Referrals from tweeted links by the brand’s Twitter account that refer back to the site — these can be tracked by a short URL (such as bit.ly), by adding campaign tracking parameters to the URL so the site’s web analytics tool can identify the traffic as a brand-triggered Twitter referral, or both. The campaign tracking is the key piece, because it enables measuring more than simply “clicks”: whether the visitors are first-time visitors to the site or returning visitors, how deeply they engaged with the site, and whether they took any meaningful action (conversions) on the site
  • “Organic” referrals — overall referrals to the site from twitter.com. Depending on which web analytics tool you are using on your site, this may or may not include the clickthroughs from links tweeted by the brand.
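As an illustration of the first type, a link can be tagged before it’s shortened. The utm_* parameter names below are the Google Analytics convention and are an assumption for this sketch — SiteCatalyst and other tools use their own campaign parameters:

```python
from urllib.parse import urlencode

def tag_url(base_url, campaign, source="twitter", medium="social"):
    """Append campaign-tracking parameters to a link before shortening.
    The utm_* names follow the Google Analytics convention; other tools
    (e.g. SiteCatalyst's campaign variable) use different parameter names."""
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    sep = "&" if "?" in base_url else "?"
    return f"{base_url}{sep}{params}"

print(tag_url("http://www.example.com/landing", "fall_launch"))
# http://www.example.com/landing?utm_source=twitter&utm_medium=social&utm_campaign=fall_launch
```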

By looking at referral traffic, you can measure both the volume of traffic to the site and the relative quality of the traffic when compared to other referral sources for the site.

(If the volume of that traffic is sufficiently high to warrant the effort, you may even consider targeting content on the landing page(s) for Twitter referral traffic to try to engage visitors more effectively — you know the visitor is engaged with social media, so why not test some secondary content on the page to see if you can use that knowledge to deliver more relevant content and CTAs?)

Word Clouds with Wordle

While this isn’t a technique for performance management, it’s hard to resist the opportunity to do a qualitative assessment of the tweets to look for any emerging or hot topics that warrant further investigation. Because all of the tweets have been captured, a word cloud can be interesting (see my eMetrics post for an example). Hands-down, Wordle makes the nicest word clouds out there. I just wish it was easier to save and re-use configuration settings.

One note here: you don’t want to just take all of the tweet content and drop it straight into Wordle, as the search criteria you used for the tweets will dwarf all of the other words. If you first drop the tweets into Word, you can then do a series of search and replaces (which you can record as a macro if you’re going to repeat the analysis over time) — replace the search terms, “RT,” and any other terms that you know will be dominant-but-not-interesting with blanks.
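The same cleanup-and-count step can be sketched in a few lines of Python instead of Word search-and-replaces — the tweets and drop list here are illustrative:

```python
import re
from collections import Counter

# Terms known to dominate the cloud (the search term itself, "RT", etc.)
DROP = {"#emetrics", "rt", "data", "measure"}

def word_counts(tweets, drop=DROP):
    """Count words across tweets, skipping dominant-but-uninteresting
    terms -- the same cleanup done in Word before pasting into Wordle."""
    counts = Counter()
    for tweet in tweets:
        for word in re.findall(r"[#@\w']+", tweet.lower()):
            if word not in drop:
                counts[word] += 1
    return counts

tweets = ["RT @someone: great #emetrics keynote",
          "great keynote on segmentation #emetrics"]
print(word_counts(tweets).most_common(2))
```

The resulting frequencies can feed a word cloud or, just as usefully, a plain word-frequency bar chart.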

Not Exactly the Holy Grail…

Do all of these techniques, when appropriately combined, provide near-perfect measurement of Twitter? Absolutely not. Not even close. But, they’re cheap, they do have meaning, and they beat the tar out of not measuring at all. If I had to pick one tool that I was going to bet on that I’d be using inside of six months for more comprehensive performance measurement of Twitter, it would be Twitalyzer. It sure looks like it’s come a long way in the 6-9 months since I last gave it a look. What it does now that it didn’t do initially:

  • Offers a much larger set of measures — you can pick and choose which measures make sense for your Twitter strategy
  • Provides clear definitions of how each metric is calculated (less obfuscated than the definitions used by Klout)
  • Allows trending of the metrics (including Lists and Followers).

Twitalyzer, like Klout, Twitter Counter, and countless other tools, is centered on the Twitter account itself. As I’ve described here, there is more going on in Twitter that matters to your brand than just direct engagement with your Twitter account and the social graph of your followers. Online listening tools such as Nielsen Buzzmetrics can provide keyword-based monitoring of Twitter for brand mentions and sentiment — this is not online listening per se, really, but it is using online listening tools for measurement.

For the foreseeable future, “measuring Twitter” is going to require a mix of tools. As long as the mix and metrics are grounded in clear objectives and meaningful measures, that’s okay. Isn’t it?

Analytics Strategy, Social Media

eMetrics Washington, D.C. 2010 — Fun with Twitter

I took a run at the #emetrics tweets to see if anything interesting turned up. Rather than jump into Nielsen Buzzmetrics, which was an option, I just took the raw tweets from the event and did some basic slicing and dicing of them.

[Update: I’ve uploaded the raw data — cleaned up a bit and with some date/time parsing work included — in case you’d like to take another run at analyzing the data set. It’s linked to here as an Excel 2007 file]

The Basics of the Analysis

I constrained the analysis to tweets that occurred between October 4, 2010, and October 6, 2010, which were the core days of the conference. While tweets occurred both before and after this date range, these were the days that most attendees were on-site and attending sessions.

To capture the tweets, I set up a Twapper Keeper archive for all tweets that included the #emetrics hashtag. I also, certainly, could have simply set up an RSS feed and used Outlook to capture the tweets, which is what I do for some of our clients, but I thought this was a good way to give Twapper Keeper a try.

The basic stats: 1,041 tweets from 218 different users (not all of these users were in attendance, as this analysis included all retweets, as well as messages to attendees from people who were not there but were attending in spirit).

Twapper Keeper

Twapper Keeper is free, and it’s useful, but the timestamps were inconsistently formatted and/or missing for some of the tweets. I don’t know if that’s a Twapper Keeper issue, a Twitter API issue, or some combination. The tool does have a nice export function that got the data into a comma-delimited format, which is really the main thing I was looking for!

Twitter Tools Used

Personally, I’ve pretty much settled on HootSuite — both the web site and the Droid app — for both following Twitter streams and for tweeting. I was curious as to what the folks tweeting about eMetrics used as a tool. Here’s how it shook out:

So, HootSuite and TweetDeck really dominated.

Most Active Users

On average, each user who tweeted about eMetrics tweeted 4.8 times on the topic. But, this is a little misleading — there were a handful of very prolific users and a pretty long tail when you look at the distribution.

June Li and Michele Hinojosa were the most active users tweeting at the conference by far, accounting for 23% of all tweets between the two of them directly (and another 11% through replies and retweets to their tweets, which isn’t reflected in the chart below — tweet often, tweet with relevancy, and your reach expands!):

Tweet Volume by Hour

So, what sessions were hot (…among people tweeting)? The following is a breakdown of tweets by hour for each day of the conference:

Interestingly, the biggest spike (11:00 AM on Monday) was not during a keynote. Rather, it was during a set of breakout sessions. From looking at the tweets themselves, these were primarily from the Social Media Metrics Framework Faceoff session that featured John Lovett of Web Analytics Demystified and Seth Duncan of Context Analytics. Of course, given the nature of the session, it makes sense that the most prolific users of Twitter attending the conference would be attending that session and sharing the information with others on Twitter!

The 2:00 peak on Monday occurred during the Vendor Line-Up session, which was a rapid-fire and entertaining overview of many of the exhibiting vendors (an Elvis impersonator and a CEO donning a colonial-era wig are going to generate some buzz).

There was quite a fall-off after the first day in overall tweets. Tweeting fatigue? Less compelling content? I don’t know.

Tweet Content

A real challenge for listening to social media is trying to pick up hot topics from unstructured 140-character data. I continue to believe that word clouds hold promise there…although I can’t really justify why a word frequency bar chart wouldn’t do the job just as well.

Below is a word cloud created using Wordle from all 1,041 tweets used in this analysis. The process I went through was that I took all of the tweets and dropped them in MS Word and then did a handful of search-and-replaces to remove the following words/characters:

  • #emetrics
  • data
  • measure
  • RT

These were words that would come through with a very strong signal and dominate potentially more interesting information. Note: I did not include the username for the person who tweeted. So, occurrences of @usernames were replies and retweets only.

Here’s the word cloud:

What jumped out at me was the high occurrence of usernames in this cloud. This appears to be a combination of the volume of tweets from that user (opening up opportunities for replies and retweets) and the “web analytics celebrity” of the user. The Expedia keynote clearly drove some interest, but no vendors generated sufficient buzz to really drive a discussion volume sufficient to bubble up here.

As I promised in my initial write-up from eMetrics, I wasn’t necessarily expecting this analysis to yield great insight. But, it did drive me to some action — I’ve added a few people to the list of people I follow!

Analytics Strategy

Gilligan's eMetrics Recap — Washington, D.C. 2010

I attended the eMetrics Marketing Optimization Summit earlier this week in D.C., and this post is my attempt to hash out my highlights from the experience. Of all the conferences I’ve attended (I’m not a major conference attendee, but I’m starting to realize that, by sheer dint of advancing age, I’m starting to rack up “experience” in all sorts of areas by happenstance alone), this was one that I walked away from without having picked up on any sort of unintended conference theme. Normally, any industry conference is abuzz about something, and that simply didn’t seem to be the case with this one.

(In case you missed it, the paragraph above was a warning that this post will not have a unifying thread! Let’s plunge ahead nonetheless!)

Voice of the Customer

It’s good to see VOC vendors aggressively engaging the “traditional web analytics” audience. Without making any direct effort, I repeatedly tripped over Foresee Results, iPerceptions, OpinionLabs, and CRM Metrix in keynotes, sessions, the exhibit hall, and over meals.

My takeaway? It’s a confusing space. Check back in 12-18 months and maybe I’ll be able to pull off a post that provides a useful comparison of their approaches. If I had my ‘druthers, we’d pull off some sort of bracketed Lincoln-Douglas style debate at a future eMetrics where these vendors were forced to engage each other directly, and the audience would get to vote on who gets to advance – not necessarily judging which tool is “better” (I’m pretty sure each tool is best-in-class for some subset of situations…although I know at least one of the vendors above who would vigorously tell me this is not the case), but declaring a winner of each matchup so that we would get a series of one-on-one debates between different vendors that would be informative for the audience.

Cool Technology

I generally struggle to make my way around an exhibit hall, so I didn’t come anywhere close to covering all of the vendors. This wasn’t helped by the fact that I talked to a couple of exhibitors early on that were spectacularly unappealing. That wasn’t exactly a great motivator for continuing the process. There were, however, several tools that intrigued me:

  • Ensighten – if you’re reading this blog, then chances are you read “real” blogs, too, and you likely caught that Eric Peterson recently wrote a paper on Tag Management Systems (sponsored by Ensighten). It’s worth a read. Ensighten was originally developed in-house at Stratigent and then spun off as a separate business with Josh Manion at the helm. Their corny (but highly effective) schtick at the conference was that they were starting a “tagolution” (a tagging revolution). That gave them high visibility…but I think they’ve got the goods to back it up. Put simply, you deploy the Ensighten JavaScript on your site instead of all of the other tags you need (web analytics, media tracking, VOC tools, etc.). When the page loads, that JavaScript makes a call to Ensighten, which returns all of the tags that need to be executed. Basically, you get to manage your tags without touching the content on your site directly. And, according to Josh, page performance actually improves in most cases (he had a good explanation as to why — counter-intuitive as it seems). Very cool stuff. Currently, they’re targeting major brands, and the price point reflects this – “six figures” was the response when I asked about cost for deploying the solution on a handful of domains. Ouch.
  • DialogCentral – this is actually an app/service from OpinionLabs, and I have no idea what kind of traction it will get. But, as I stood chatting with the OpinionLabs CIO, I pulled out my Droid and had had a complete DialogCentral experience in under a minute. The concept? Location-based services as a replacement for “tell us what you think” postcards at physical establishments. You fire up their app (iPhone) or their mobile site (dialogcentral.com will redirect to the mobile site if you visit it with a mobile device). DialogCentral then pulls up your location and nearby establishments (think Foursquare, Gowalla, Brightkite-type functionality to this point), and then lets you type in feedback for the establishment. That feedback then gets sent to the establishment, regardless of whether the venue is a DialogCentral customer. Obviously, their hope is that companies will sign on as customers and actually promote the feedback mechanism in-store, at which point the feedback pipeline gets much smoother. It’s an intriguing idea — a twist-o’-the-old on all of the different “publicly comment on this establishment” aspects of existing services.
  • Clicktale – these guys have been around for a while, and I was vaguely familiar with them, but got an in-depth demo. They use client-side code (which, presumably, could be managed through Ensighten — I’m just sayin’…) to record intra-page mouse movements and clicks. They then use that data to enable “replays” of the activity as well as to generate page-level heatmaps of activities and mouse placement. Their claim (substantiated by research) is that mouse movements are a pretty tight proxy for eye movement, so you get a much lower cost / broadly collected set of (virtual) eye-tracking data. And, the tool has all sorts of triggering and filtering capabilities to enable honing in on subsets of activity. Pretty cool stuff.
  • ShufflePoint – this wasn’t an exhibiting vendor, but, rather, the main gist of one of the last sessions of the conference. The tool is a poor man’s virtual-data-mart enabler. Basically, it’s an interface to a variety of tool APIs (Google Analytics, Google Adwords, Constant Contact, etc. – Facebook and Twitter are apparently in the pipeline) that allows you to build queries and then embed those queries in Excel. I’ve played around with the Google Analytics API enough to get it hooked into Excel and pulling data…and know that I’m not a programmer. Josh Katinger of Accession Media was the presenter, and he struck me as being super-pragmatic, obsessive about efficiency, and pretty much bullshit-free (I found out after I got to the airport that a good friend of mine from Austin, Kristin Farwell, actually goes wayyy back with Josh, and she confirmed that this was an accurate read). We’ll be giving ShufflePoint a look!

Social Media Measurement

I was expecting to hear a lot more on social media measurement at the conference…but it really wasn’t covered in-depth. Jim Sterne kicked off with a keynote on the subject (he did recently publish a book on the topic, which is now sitting on my nightstand awaiting a read). And, there was a small panel early on the first day where John Lovett got to discuss the framework he developed with Jeremiah Owyang (which is fantastic) this past spring. But, other than that, there really wasn’t much on the subject.

MMM and Cross-Channel Analytics

Steve Tobias from Marketing Management Analytics conducted a session that focused on the challenges of marketing mix modeling (MMM) in a digital world. I felt pretty smart as he listed multiple reasons why MMM struggles to effectively incorporate digital and social media, because many of his points mirrored what I’ve put together on the exact same subject (to be clear, he didn’t get his content from me!). It was good to get validation on that front from a true expert on the subject.

Where things got interesting, though, was when Steve talked about how his company is dealing with these challenges by supplementing their MMM work (their core strength) with “cross-channel analytics.” By “cross-channel analytics,” he meant panel-based measurement. Again, I felt kinda’ smart (and, really, it’s all about me and my feelings, isn’t it?), as I keep thinking (and I’ve got this in some internal presentations, too), that panel-based measurement is going to be key in truly getting a handle on cross-channel/cross-device consumer interactions and their impact.

The People

One of the main reasons to go to a conference like eMetrics is the people — catching up with people you know, meeting people you’ve only “known” digitally, and meeting people you didn’t know at all.

For me, it was great to again get to chat with Hemen Patel from CRM Metrix, John Lovett from Analytics Demystified, Corry Prohens from IQ Workforce, and the whole Foresee Results gang (Eric F., Eric H., Chris, Maggie,…and more). And, it wound up being a really special treat to see Michelle Rutan, who I take credit for putting on the web analytics career path way back when we worked at National Instruments together…and she was presenting (as an amusing aside, I credit Michelle’s husband, Ryan — although they weren’t even dating at the time — as being pretty key to helping me understand the mechanics of page tagging; he’s credited by name in one of the most popular posts on this blog)!

I actually got to meet Stéphane Hamel in person, which was a huge treat (I saw a lot of other web analytics celebrities, but never wound up in any sort of conversation with them — maybe next time), as well as Jennifer Day, who I’ve swapped tweets with for a while.

Digital Analytics folk are good peeps. That’s all there is to it.

Twitter (and Twapper Keeper) Means More to Come!

I actually managed to have the presence of mind to set up a Twapper Keeper archive for #emetrics shortly before the conference started, and I’m hoping to have a little fun with that in the next week or two. We’ll see if any insights (I’m not promising actionable insights, as I’ve decided that term is wildly overused) emerge. I picked up a few new people to follow just based on the thoroughness and on-pointed-ness of their tweets — check out @michelehinojosa (who also is blogging her eMetrics takeaways) if you’re looking to expand your follower list.

It was a good conference!

Adobe Analytics, Analytics Strategy, General, Reporting

Presentations from Analytics Demystified

This week is somewhat bittersweet for me because it marks the very first time I have missed an eMetrics in the United States since the conference began. And while I’m certainly bummed to miss the event, knowing that my partner John is there representing the business makes all the difference in the world. If you’re at eMetrics this week, please look for John (or Twitter him at @johnlovett) and say hello.

If you’re like me and not going to the conference, perhaps I can interest you in one of the four (!!!) webcasts and live events I am presenting this week:

  • On Tuesday, October 5th I will be presenting my “Web Analytics 201” session to the fine folks at Nonprofit Technology Network (NTEN) who we partner with on The Analysis Exchange. You need to be a NTEN member to sign up but if you are I’d love to talk with you!
  • On Wednesday, October 6th I will be doing a free webcast for all our friends in Europe talking about our “no excuses” approach towards measuring engagement in the online world. The webcast is sponsored by Nedstat (now part of comScore), and all attendees will get a free copy of our recent white paper on the same topic.
  • Also on Wednesday, October 6th (although at a slightly more normal time for me) I will be presenting our Mobile (and Multi-channel) Measurement Framework with both our sponsor OpinionLab and a little consumer electronics retailer you may have heard of … Best Buy! The webcast is open to everyone and all attendees will also get a copy of our similarly themed white paper.
  • On Thursday, October 7th I will be at the Portland Intensive Social Media workshop presenting with Dean McBeth (of Old Spice fame) and Hallie Janssen from Anvil Media. I will be presenting John and Jeremiah’s Social Marketing Analytics framework and am pretty excited about the event!

All-in-all it promises to be a very busy week presenting content so I hope to hear from some of you on the calls or see you in person on Thursday.

Analytics Strategy, Excel Tips, Presentation

Data Visualization Tips and Concepts (Monish Datta calls it "stellar")*

Columbus Web Analytics Wednesday was sponsored by Resource Interactive last week, and it was, as usual, a fun and engaging event:

Web Analytics Wednesday -- Attendees settling in

We tried a new venue — the Winking Lizard on Bethel Road — and were pretty pleased with the accommodations (private room, private bar, very reasonable prices), so I expect we’ll be back.

Relatively new dad Bryan Cristina had a child care conflict with his wife…so he brought along Isabella (who was phenomenally calm and well-behaved, and is cute as a button!):

Bryan and Isabella

I presented on a topic I’m fairly passionate about — data visualization. The presentation was well-received (Monish Datta really did tweet that it was “stellar”)  and generated a lot of good discussion. I had several requests for copies of the presentation, so I’ve modified it slightly to make it more Slideshare-friendly and posted it. If you click through on the embedded version below, you can see the notes for each slide by clicking on the “Notes on Slide X” tab underneath the slideshow, or you can download the file itself (PowerPoint 2007), which includes notes with each slide (I think you might have to create/login to a Slideshare account, which it looks like you can do quickly using Facebook Connect).


I had fun putting the presentation together, as this is definitely a topic that I’m passionate about!

* The “Monish Datta” reference in the title of this post, while accurate, is driven by my never-ending quest to dominate search rankings for searches for Monish. I’m doing okay, but not exactly dominating.


Analytics Strategy

Minimize Robot Traffic

Robots are cool. I like robots when they build cars, try to plug oil spills and clean carpets. The only types of robots I don’t like are the ones that hit websites repeatedly and throw off my precious web analytics data! Do you have a problem with these types of robots? Would you know how to see if you do? I find that many web analytics customers don’t even know how to see this, so in this post I will share what I do to monitor robots and hope that others out there will share other ways they deal with robots.

Why Should I Care About Robots?
This is often the first question I get. Who cares? Here are my reasons for caring about minimizing robots hitting your site:

  1. If you use Visits or Unique Visitors as part of any of your website KPIs (i.e. Revenue/Unique Visitor), you should care because robots are inflating your denominator and dragging your conversion rates down
  2. If you are tasked with reducing Bounce Rates on your site, you should care as robots will often be seen as bounces
  3. Omniture (and other web analytics vendors) often bill you by website traffic (server calls), so you may be paying $$$ for junk data
  4. Oftentimes web analytics KPIs have razor-thin differences month over month, and having a lot of garbage data can mean the difference between making a good and a bad website business decision
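To put hypothetical numbers on the first reason — robots inflate the denominator of any per-visitor KPI (the figures below are made up for illustration):

```python
# Hypothetical numbers illustrating the denominator effect of robots.
revenue = 50_000.0
human_visitors = 20_000
robot_visitors = 5_000  # robots buy nothing but still count as visitors

true_rpv = revenue / human_visitors
reported_rpv = revenue / (human_visitors + robot_visitors)

print(f"true Revenue/Visitor:     ${true_rpv:.2f}")      # $2.50
print(f"reported Revenue/Visitor: ${reported_rpv:.2f}")  # $2.00
```

A 20% drag on the KPI with no change at all in actual customer behavior.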

Do I Have a Problem?
The first step is to identify if you have a problem with robots. Unfortunately, SiteCatalyst does not currently have an “out-of-the-box” way to alert you if you have a problem (@VaBeachKevin has added this to the Idea Exchange so please vote!), but in the meantime, here is my step-by-step approach to determining this:

  • Create a recurring DataWarehouse report that sends you Page Views and Visitors for each IP address hitting your site (if you store the Omniture Visitor ID in an sProp, I would use that in place of IP address). This can be daily, weekly or monthly depending on how much traffic your website receives. I sometimes add the Country/City as well (you’ll see why later).

  • When you receive this report, it should look something like this:

  • Once you have the data, create a calculation which divides Page Views by Visitors and then sort by that column (if you have a lot of data from different days/weeks, you can create a pivot table). The result should look like the report below, where you will start to see which IP addresses are viewing a lot of pages on your site per visitor. Keep in mind that this doesn’t mean they are all bad. It is common for small companies or individuals to share IP addresses. The goal of this step is just to identify the IP addresses that might be issues. In the example below, you can see that the top two IP addresses appear to be a bit different than the rest. While it may make you feel good that these unique visitors liked your website so much they viewed thousands of pages each, you might be fooling yourself!

  • Once you have this list, I like to do some research on the top IP address offenders. You can do this via a basic Whois IP lookup or you can invest in a reverse IP lookup service.
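The pages-per-visitor calculation in the steps above can be sketched in a few lines of plain Python (the IP addresses and counts below are invented for illustration; a real report would come from the recurring DataWarehouse export):

```python
# Hypothetical DataWarehouse export rows: (IP address, Page Views, Visitors).
rows = [
    ("203.0.113.10", 54000, 3),   # thousands of pages per visitor: suspicious
    ("198.51.100.7", 41000, 2),
    ("192.0.2.44",   1200, 300),
    ("192.0.2.99",    900, 450),  # a couple of pages per visitor: normal
]

def pages_per_visitor(rows):
    """Divide Page Views by Visitors for each IP and sort descending."""
    scored = [(ip, pv / visitors) for ip, pv, visitors in rows]
    return sorted(scored, key=lambda r: r[1], reverse=True)

ranked = pages_per_visitor(rows)
# The entries at the top of the list are the candidates worth a Whois lookup.
```

The same arithmetic works fine in a spreadsheet pivot table; the point is simply to surface the outliers before doing any IP research.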

What Do I Do If I Find Robots?
If after reviewing the top offending IP addresses you find that you do, in fact, have a robot hitting your site, you have a few options:

  1. Work with your IT group to exclude these IP addresses from hitting your website. This is your best option since it will be the most reliable and reduce your web analytics server call cost.
  2. Work with Omniture’s Engineering Services team to create a DB Vista Rule that will move these website hits to a new report suite so it will not pollute your data. The best part of this option is that you don’t have to engage with your IT team and you can add/remove IP addresses anytime you want via FTP. Unfortunately, you will still be hit with server call charges for this (not to mention the cost of the DB Vista Rule!), but if you also pass data to Omniture Discover, you might save money there by not passing bad data to Discover.
  3. Work with Omniture’s Engineering Services team to build a custom solution for dealing with robots…

Employee Traffic
While I don’t want to imply that your co-workers are robots, I wanted to mention employee traffic in this post as well since it is tangentially related. I find that many Omniture customers don’t exclude their own employees from their web analytics reports. This can be a huge mistake if you have a lot of employees or have employees who actively use the website. For example, at my employer (Salesforce.com), we use our website to log into our internal systems, which are all run on Salesforce.com! This means that we have thousands of employees hitting our website every day to log in to our “cloud” applications, and that traffic should not count towards our marketing/website goals. Therefore, we manually exclude all employee traffic from our reports by IP address to keep it from skewing our KPIs. While we don’t consider this to be robot traffic, we address it in the same manner by passing employee traffic to its own report suite. One cool by-product of placing employee traffic in its own report suite is that you can see how often your own employees are using your website, so you can show management that the dollars they give you serve multiple audiences!
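The IP-based routing described above can be sketched roughly as follows; the network range and report suite names are invented for illustration, and a real implementation would typically live in your JavaScript file or a VISTA rule rather than in Python:

```python
import ipaddress

# Hypothetical employee network range (invented for illustration).
EMPLOYEE_NETWORKS = [ipaddress.ip_network("203.0.113.0/24")]

def report_suite_for(ip, main="mainsuite", employee="employeesuite"):
    """Route employee IPs to a separate report suite, everything else to the main one."""
    addr = ipaddress.ip_address(ip)
    if any(addr in net for net in EMPLOYEE_NETWORKS):
        return employee
    return main
```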

Final Thoughts
As I stated in the beginning of this post, this is just one way to investigate and deal with robots. If you have other techniques, please share them here! Thanks!

Analytics Strategy

Free white paper on Tag Management Systems

This last week I was in London thanks to the good graces of our friends at Tealeaf to deliver a keynote speech at their EMEA customer conference. After the event, both reporters and conference attendees asked me “What is the most important technology trend in web analytics today?”

I have been asked this hundreds of times in my career as an analyst and consultant and the answer used to be tricky. In the past I’ve opined “multichannel integration”, “segmentation”, “application usability” and even “none, it’s about people and process, not technology.”

This time, however, my answer was clear: Tag Management Systems.

Tag management has become a nightmare for many companies. As we outlined in our white paper with ObservePoint on the need for a “Chief Data Officer”, tagging and data collection have gotten out of control in companies of all sizes. Information Technology supports one set of tag-based tools, marketing deploys their own stuff via content management systems (CMS), advertising and other individual stakeholders drop their tags, and before you know it you have a dozen or more scripts included at various points across your site.

In a way, because of fragmentation in the marketplace, this situation was inevitable. But it doesn’t have to be this way.

Emerging tag management systems (TMS) are rapidly transforming the data capture and technology deployment landscape, replacing inefficient, individual installations with a “one stop shop” able to manage any number of tag-based technologies via a single user interface. Early adopters of these systems are reporting a profound transformation of both their ability to manage data capture and their relationship with Information Technology.

From a web analytics perspective this is what we call a “win/win.”

One of these vendors is Ensighten, a company founded by a group of folks who have a long established reputation in digital measurement. Their CEO Josh Manion and I go back pretty far, and so when he told me about their platform I was immediately intrigued but somewhat skeptical.

I had already seen nearly all of the competing solutions in the market and walked away with a variety of concerns. I’d even gone so far as trying to establish an “Open Tag Alliance” initiative with one vendor, which unfortunately collapsed due to time constraints.

Needless to say, I was impressed with Ensighten, so much so that I asked Josh if we could partner with him (something we rarely do with technology vendors). He agreed, and so we are proud to announce that we are the first deployment and integration partner to sign up with Ensighten.

In support of our partnership we agreed to write a paper detailing what we see as the advantages of tag management systems. Titled “The Myth of the Universal Tag and the Future of Digital Data Collection”, this short paper outlines the need for TMS, the rationale behind deployments, and the opportunity for return on the investment. The paper is freely available now at the Ensighten web site or you can write us directly for a complimentary copy.

Readers should note that there are a handful of tag management solutions in the market today. At Analytics Demystified we believe the growth in the sector is validation of the opportunity — each of these companies has a good story to tell and an expanding customer base.

You should consider a tag management system if you are:

  • Frustrated with the “one tag, one project, one timeline” model of tag deployment;
  • Switching vendors and looking to gain leverage over future deployments;
  • Heavily invested in Flash but have long struggled to measure the technology;
  • Managing globally distributed sites but have little centralized control over tags;
  • Looking to add Q/A and workflow management to your tag deployments;
  • Concerned at all about the quality and data accuracy from your web analytics.

If any of these criteria apply to you I would strongly encourage you to give John or me a call. We’ll be more than happy to walk you through the current tag management system vendor landscape at no charge and point you towards whatever solution seems right for you.

We welcome you to the age of tag management systems and we hope you will join us in welcoming Ensighten to the market.

Analytics Strategy, Conferences/Community, General

Web Analysts Code of Ethics …

Following up on last week’s thread about how the web analytics industry is on the cusp of becoming our own worst enemy as the tide of public opinion increasingly turns against online and behavioral analytics, I wanted to make good on my offer to help the Web Analytics Association. I fully support the efforts of the Association to create a solid community for web analytics professionals around the world and have long been a contributor to their work, be it turning the Web Analytics Forum (at Yahoo! Groups) over to WAA management, opening the doors for WAA participation in Web Analytics Wednesday, or providing other “behind the scenes” support when asked.

To this end I composed a preliminary “Web Analysts Code of Ethics” that I had planned to work on here in my blog (with you all) and then turn over to the Web Analytics Association. Much to my surprise, according to my partner John Lovett (who is a Board member), the Board of Directors loved the preliminary code and asked to have it published at the Web Analytics Association blog.

Easy enough, and so I would like to redirect all of you over to the Association blog where I and the WAA both would like to hear what you have to say about this early effort. The comments have already started over there, and of course if you’re more comfortable commenting here then by all means, I welcome that.

As I mentioned a few times in my recent Beyond Web Analytics podcast (not live until early on September 13th), I believe that we need to start advocating on our own behalf and I see this code as one small step in the right direction. Hopefully the WAA Standards Committee, the Board, and all of you out there whether you’re in the Association or not will join me in this effort to help the wider world understand what we all do (and what we do and will not do.)

So go do two things right now:

  1. Read and comment on my “Web Analysts Code of Ethics” at the WAA Blog
  2. Listen to my interview with Adam Greco and Rudi Shumpert at Beyond Web Analytics

Adobe Analytics, Analytics Strategy, General

Internal Search Term Click-Through & Exit Rates

Recently, I was re-reading one of Avinash Kaushik’s older blog posts on tracking Internal Search Term Exit Rates and realized that I had never discussed how to report on this using Omniture SiteCatalyst. In a past Internal Search post, I covered many different things you can do to track internal search on your site, but did not cover ways to see which terms are doing well and which are not. In this post I will share how you can see this so you can determine which search terms need help…

Why Track Internal Search Term Click-Through & Exit Rates?
So why should you do this? In the era of Google, we are all slowly being trained to find things through search. Many of my past clients saw the percent of website visitors using search rise over the past few years. In addition, Internal Search and Voice of Customer tools are some of the few out there where you can see the intent of your visitors. Unfortunately, most websites have horrible Internal Search results which can lead to site exits. In my previous Internal Search post I demonstrated how to track your Search Results Page Exit Rate, but that only shows you if you have a problem or not. If you do have a high Search Results Page Exit Rate, the next logical step is to determine which search terms your users think have relevant search results and which do not. Note that this is not meant to show you which terms lead directly to website exits, but rather, which terms cause visitors to use or not use the search results you offer them after they search on a particular term.

How Do You Track Internal Search Term Click-Through & Exit Rates?
OK, so how do you do this? Follow these implementation steps:

  • Make sure that you are setting a Success Event when visitors conduct Internal Searches on your website. Hopefully you are already doing this so in many cases this step will be done!
  • Make sure that you are capturing the Internal Search Term the visitor searched upon in an eVar variable. Again, you should be doing this (if not, shame on you!).
  • Here is where we get into uncharted territory. The next step is to set a new Success Event when visitors click on one of the items on the search results page. Depending upon the technology you use for Internal Search, this could be hard or easy. Regardless of how you actually code it, the key here is to set the second Success Event (I call it Internal Search Results Clicks) only if visitors click on a search result item (not if they click on a second page of search results or go to another page through other navigation). It is also critically important that you only set this Search Results Clicks Success Event once per search term! Do not set it every time a visitor clicks on one of the search results after using the “Back” button. If you don’t do this correctly, your Click-Through and Exit Rates will be off. This could take a few iterations to get right, but stick with it!
  • Once you have both the Internal Searches and Internal Search Results Clicks Success Events set, you can create a Calculated Metric that divides Internal Search Results Clicks by Internal Searches to see the Internal Search Click-Through Rate as shown here:

  • From there you can create the converse metric which subtracts the Internal Search Click Through Rate by “1” to come up with the Internal Search Exit Rate as shown here:

  • After this is done, you can open the Internal Search Term eVar report and add all three metrics so you see a report like this:

In this case, it looks like the “Zune” Internal Search term might need some different search result content as it has a much higher exit rate than the others. Another cool thing you can do is to create a report which trends the Internal Search CTR % or Exit % for specific Internal Search terms so you can see if they have been good/bad over time. Also, if you use SAINT Classifications to group your Internal Search Terms into buckets, you can see the report above for groups of Internal Search terms. If you vote for my idea in the Idea Exchange, you would be able to set SiteCatalyst Alerts to be notified if your top Internal Search Terms have spikes in their Click-Through or Exit Rates. You can also segment your data to see how the Internal Search rates differ when people come from Paid Search vs. SEO, etc… and even use Test&Target to try out different promotional banners on your search results page…
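The arithmetic behind the two Calculated Metrics described above is simple enough to sketch in Python; the search terms and counts below are invented for illustration:

```python
# Hypothetical per-term tallies: total Internal Searches, and the number of
# searches where the visitor clicked at least one result (the once-per-term
# Success Event described in the steps above).
internal_searches = {"zune": 100, "chatter": 4, "api": 50}
search_clicks     = {"zune": 20,  "chatter": 4, "api": 1}

def search_rates(searches, clicks):
    """CTR = Search Results Clicks / Internal Searches; Exit Rate = 1 - CTR."""
    rates = {}
    for term, total in searches.items():
        ctr = clicks.get(term, 0) / total
        rates[term] = {"ctr": ctr, "exit": 1 - ctr}
    return rates

rates = search_rates(internal_searches, search_clicks)
# A term like "zune" with an 80% exit rate is a candidate for better results.
```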

Finally, don’t forget that when you create a new calculated metric like the Internal Search CTR % metric described above, you also get the bonus of seeing this metric across your entire website under the My Calc Metrics area of the SiteCatalyst toolbar. Simply find this new metric and click on it and you can see your overall Internal Search Click-Through Rate regardless of internal search term. Your report will look something like this:

For The True Web Analyst Geeks
If you were bothered when I mentioned above that you should only set the Search Results Clicks Success Event once per search term, then you are my kind of person (please apply for a job with me!)! You were probably saying to yourself: “If I only count once per search term, how will I know which search terms get visitors to click on multiple search result links?” Right you are! That could be valuable information. If you want to see that as well, all you have to do is set a second Success Event each time a visitor clicks on a search result [I call this Internal Search Result Clicks (All)]. Then you can compare how many searches produced at least one result click to how many result clicks occurred in total. Here is a sample report:

In this example, you can see that the search term “api” had one click in either scenario, but the search term “chatter” had people click on a search result 100% of the time, and five additional times visitors clicked on a second search result item. If you want, you can create another Calculated Metric that divides the Internal Search Result Clicks (All) by the # of Internal Searches to see how many search result clicks each term averages. In the case of “chatter” above, it would be 2.25 search result clicks per search!
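That last Calculated Metric is just a ratio of the two click events; here is a quick sketch with counts invented to match the “chatter” example above:

```python
# Hypothetical counts: 4 searches for "chatter" produced 9 total result clicks.
internal_searches = {"chatter": 4, "api": 1}
result_clicks_all = {"chatter": 9, "api": 1}

def avg_clicks_per_search(searches, clicks):
    """Internal Search Result Clicks (All) divided by Internal Searches."""
    return {term: clicks[term] / searches[term] for term in searches}

averages = avg_clicks_per_search(internal_searches, result_clicks_all)
# "chatter" averages 2.25 result clicks per search, matching the example.
```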

Final Thoughts
If Internal Search is important to your site, make sure you are tracking it adequately so you can improve it and increase your overall website conversion. Do you have any other cool Internal Search tracking tips I haven’t covered? If so, leave a comment here…

Analytics Strategy

Acquisitions Aplenty! comScore Buys Nedstat

We’re certainly on an acquisition hot streak here in our cozy little measurement industry. This week marked yet another buy-up of a web analytics company: Netherlands-based Nedstat was acquired by comScore. The sale price was reported at $36.7 million USD, which brings the tally of measurement buy-outs, including the $1.8 billion Omniture acquisition last year, to nearly $2.5 billion by my count. Those are some good multiples on revenue, since my Forrester Web Analytics Forecast didn’t peg market spending to hit even $1 billion until sometime in 2015. Granted, Omniture, Unica and to some extent Coremetrics were offering more than just web analytics in their product portfolios. But regardless, measurement technologies are all the rage these days and finally, big businesses are taking note of the value of web analytics.

Foreshadowing

Some might say that comScore and Nedstat, while serving similar industries for different purposes, were running on parallel paths and that an acquisition was a plausible outcome. But before I dive into that hypothesis, first I’ll toot my own horn by mentioning that I went on record predicting this one. The good fellas at Beyond Web Analytics interviewed me on the topic of market consolidation just after the IBM acquisition of Unica and we had a good chat about it here on the podcast. The closing question asked me to look into my crystal ball and guess who would be the next acquirer in the analytics market. While I didn’t guess that it would be comScore, I did speculate that there are some very interesting and valuable technologies that exist in Europe. I mentioned both Webtrekk in Germany and Nedstat as companies that would make appealing acquisition targets. Clearly comScore must have been listening (c’mon, I jest). But one of my clients across the pond also mentioned a couple of weeks ago that Nedstat’s CEO was quoted in a German newspaper as saying that there is no longer a place for a dedicated web analytics company in this environment. I’ve been saying this since early 2009, but coming from a chief officer of a successful technology operation…Foreshadowing indeed.

The Red Herring

So, bright and early on the morning of the acquisition, my friend Jodi McDermott reached out to me with the news by pointing out the press release on the deal, and I owe her a big thanks for that. When we spoke later that morning along with Magid Abraham, comScore founder and CEO, the first question Jodi asked me was…”Were you surprised?”. Now, the dirty little secret is that analysts can never show surprise, but heck yeah I was surprised that comScore was the buyer!?! I didn’t anticipate comScore because of their Unified Digital Measurement (UDM) solution, which currently handles over 500 billion transactions per month and is growing rapidly. So, they already had their own tag-based measurement solution. Additionally, just under a year ago comScore announced a strategic partnership with Omniture to deliver a newly created Media Metrix 360 solution predicated on UDM that would leverage a hybrid combination of Omniture page tags and comScore’s panel-based measurement.

It was brilliant actually, and demonstrated the first significant attempt to bring together advertising measurement with site-side data. Yet, just a month after this partnership was announced, Omniture was snatched up by Adobe, and I can only speculate that the momentum on the partnership was stymied. Don’t get me wrong, Media Metrix 360 still exists, and clients like Martha Stewart and the Wall Street Journal add marquee status to the initiative. Thus, I would expect that comScore will support Media Metrix 360 by continuing the partnership with Adobe’s Omniture Business Unit as well as continue development on their own proprietary solution. Whatever they choose to do, these efforts – their own hybrid UDM tags and the Omniture relationship – created a red herring for me that had me looking elsewhere. Now the real question is… Was Nielsen surprised, and how will they counter? Sorry friends, my crystal ball is not that good.

The Plot Twister

I saved the best for last because here’s where the plot starts to get really interesting. comScore has stated that its acquisition interests in Nedstat are to better serve the media and publishing industries. Web analytics and site-side measurement have long been focused on the transaction, and sites that don’t have traditional online transactions are left to quantify success by custom-fitting solutions to meet their needs. With most web analytics solutions you’re forced to follow the conversion funnel through to a transaction (or not) and attempt to tie things together or launch remarketing efforts from there. But when there’s no transaction at the end of the visit, many traditional web metrics have very little resonance with the business.

Nedstat has long been focused on key topics like engagement and rich media measurement – metrics that matter to publishers. Now with the acquisition by comScore, who has a stronghold within many media companies (not to mention a reserved line item in their budgets), they can create a very different value proposition for media companies looking to quantify metrics for their advertisers as well as optimize the experience for their visitors. I tend to agree with Magid, who stated that this new paradigm for publishers is likely to create a natural segmentation in the market. Stalwart web analytics firms (albeit in their current incarnations) Omniture, Coremetrics and Unica are working towards an analytical system that feeds marketing automation. Now we’ve got the potential for something entirely different.

For these reasons I’m bullish on the acquisition. We have a new opportunity for web analytics where site-side measurement meets audience (panel based) measurement. It’s the collision course that many have been talking about. And it sets the stage for propelling measurement into next generation devices, apps and mobile platforms that don’t have transactional elements. It’s still too soon to say how this will play out, but I applaud Magid, Gian and the comScore team on their vision for creating a new measurement paradigm. And a big congrats goes out to Michael, Michiel, Fred, Ulrike and the entire Nedstat team for building a globally attractive solution. Bravo.

But these are just my thoughts…I may be way off…I may be crazy. Readers, do you agree that this new duo can impact enterprise measurement on a new level? I’d love to know what others think.

Analytics Strategy

Congratulations to Nedstat and comScore!

This summer’s web analytics acquisition season has heated up to the point where when my phone rings and it’s John (who wakes up hours earlier) saying “comScore has acquired Nedstat” my response is “of course they have!” Not to say this isn’t an exciting acquisition, but wow the vendor landscape has changed a bunch this year …

John spent the morning on the phone with our friend Jodi McDermott and comScore’s CEO Dr. Magid Abraham talking about the decision for comScore to get more deeply into client-side measurement technology and he promises to have a more comprehensive post up in a day or two. I only wanted to weigh in and say “congratulations” to the entire team at Nedstat!

I have been lucky enough to have worked with Michael, Fred, Michiel, and Ulrike on a number of occasions and have produced two white papers with them (one on video, the other on mobile) with a third coming out in a few weeks. The management team and everyone I have interacted with at Nedstat are wonderful people and they will definitely add great value and expertise to the comScore family.

Again, watch for more detail and analysis from John in the coming days and congratulations to the comScore and Nedstat families from all of us here at Analytics Demystified.

Analytics Strategy, Conferences/Community, General

We are our own worst enemy …

Back in February of this year, in partnership with BPA Worldwide, Analytics Demystified published a white paper detailing the risks associated with the use of Flash Local Shared Objects (LSOs) in digital measurement. Titled “The Use of Flash Objects in Visitor Tracking: Brilliant Idea or Risky Business?” the paper drilled down into how some companies are using Flash LSOs and offered  the following guidance:

  1. Do not use Flash to reset browser cookies
  2. Disclose the use of Local Shared Objects
  3. Allow site visitors to disable Local Shared Objects

The first piece of advice turns out to be pretty important since companies are now being sued over their use of Flash to reset browser cookies. MTV, ESPN, MySpace, Hulu, ABC, NBC, Disney, and others are being dragged into a lawsuit based on their use of Quantcast and Clearspring, who were identified by Soltani, et al. as using Flash LSOs to reset deleted browser cookies. These lawsuits allege a “pattern of covert online surveillance” and seek status as a class action lawsuit.

Yikes.

Fortunately for Adobe, they do not seem to be one of the targets in these suits, which makes sense considering the position the company has taken regarding the use of Flash. In my interview with MeMe Rasmussen, Adobe’s Chief Privacy Officer, back in April of this year, Mrs. Rasmussen explicitly stated:

“… the position we outlined in the FTC Comment on condemning the misuse of local storage, was specific to the practice of restoring browser cookies without user knowledge and express consent.  We believe that there are opportunities to provide value to our customers by combining Omniture solutions with Flash technology while honoring consumers’ privacy expectations.”

On the topic of consumer privacy and web analytics, following up my partner John’s response to the Wall Street Journal article on online privacy (“Be still my analytical heart”), I recently wrote a piece for Audience Development Magazine titled “You are all evil …” While a little tongue-in-cheek, the article encourages marketers and business owners to:

  1. Have a rock-solid privacy policy
  2. Not use tracking software they don’t understand
  3. Know exactly what tracking software they have deployed
  4. Have a clear answer for “how and why do you track us?”
  5. Be transparent as hell when anybody asks what you’re doing

As I reflect back on the guidance we have provided in the past year I run the risk of becoming quite depressed. None of our recommendations are surprising, revolutionary, or particularly Earth shattering … but not nearly enough companies are doing most of these very simple things. Given this, one possible outcome is becoming increasingly apparent …

We are going to get screwed.

Go back to Walt Mossberg’s 2005 assertion that “cookies are spyware” and the related conversation around cookie deletion and you will see a clear pattern: media (ostensibly acting in the best interest of consumers) points out that what we do is somehow devious … and we more or less ignore the problem, hoping it will go away.

My friend Bob Page once referred to something he called the “Data Chernobyl” … an unexpected and massive meltdown in consumer trust associated with the data that we collect, store, and use to make business decisions. When you think about it for just a little bit the idea is terrifying … because everything we do depends entirely on our ability to collect, store, and use information about consumer behavior on the Internet.

Our livelihoods depend on everyone ignoring the fact that we track, understanding why we track, or getting something tangible out of the tracking we do. Sadly we have never offered anything tangible, we have never really made an effort to explain what we do in the court of public opinion, and it is increasingly clear that the bright light shining on our trade isn’t going to fade anytime soon.

What’s worse is that we are collecting even more information across mobile, social, and other emerging channels, perfecting our ability to integrate that data into over-arching consumer data warehouses, and occasionally using techniques that even the most hard-hearted of web analysts get all geeked-out about.

We have become our own worst enemy.

Now, as I declared in the Audience Development piece, I simply do not believe that consumers are as freaked out about tracking online as the media makes them out to be … the data I have seen just doesn’t support that conclusion. But consumers aren’t the real problem: the real problem is the media, lawyers, and potentially the Federal Government. All three of these groups continue to generate page views, make money, and “protect the common man” (sic) by throwing our industry under the bus … and we aren’t doing anything in our defense.

Dumb, dumb, dumb.

People much smarter than I am have repeatedly stated that they don’t want to engage the media or “privacy police” in a conversation that they cannot possibly win. To a small extent this makes sense, but at some point I wonder if we are going to collectively end up looking like my four year old when he knows he’s made a mistake. My son gets away with it because he’s awesomely cute and I love him, but I am beginning to think the collective web analytics industry is not going to get away with mumbling and making lame excuses for much longer.

The advertising industry has the IAB and NAI, both of whom appear to be responding to articles, lawsuits, and Congressional investigation on many of these issues.  (If you haven’t seen it yet, have a look at this amazing “privacy matters” campaign the IAB is running.) But we are not the advertising industry, we are the web analytics and digital measurement industry, and we need to have our own voice, our own lobby, and our own representation.

Since the framework for this already exists, I am officially asking that the Web Analytics Association formalize and finalize their Industry Advocacy program and represent the digital measurement community in the forum of public opinion.

I have already volunteered to help with this effort under the Presidency of Alex Langshur and reiterate that commitment to the current Board of Directors.  The WAA needs to bring together corporate members and key practitioner representatives to quickly hash out a clear, concise, and practical position on the relationship between digital measurement technology and consumers. The current WAA Board is in perhaps the best position in years to make the decision to represent the needs of our community … but decisive action is required.

Without the WAA’s leadership on this issue I fear that over time we will lose the battle of public opinion and my tongue in cheek assessment of the “evilness” of our industry will be far less funny than it seems today.

Let’s not let that happen.

We are an awesome industry full of brilliant people.  The work we do is some of the most valuable but least understood in the interactive world.  I believe it is time to come out of the closet, accurately describe the value of the work we do, and stop shying away from a conversation we feel is stacked against us and a battle we are unsure that we can win.  If we don’t try, without a doubt, we will remain our own worst enemy.

Analytics Strategy, Conferences/Community

Do not miss this year's X Change conference!

What a crazy week it has been, what with client visits with John here in the West, web casts with the fine folks at Tealeaf and Unica, and the end of summer fast approaching at the Peterson household. I was so busy I wasn’t able to pay close attention to our X Change registrations and when I looked just now I realized something …

X Change 2010 is damn near sold out.

Thanks to some quick thinking from Joel, Grace, and Gary over at Semphonic we have a few more seats available than last year, but with nearly a month to go before we convene at the beautiful Monterey Plaza Hotel and Spa in Monterey, California, we have already sold more seats than last year, and last year was completely sold out!

You’re not gonna miss the X Change because you waited too long to sign up, are you? You aren’t going to risk missing out on the chance to discuss digital measurement in our intense and intimate conversation format with practice leaders and managers from amazing brands like Best Buy, ESPN, Expedia, Facebook, MTV, New York Times, Lowe’s, Turner Broadcasting, HP, Salesforce.com, Nike, Charles Schwab, Comcast, eBay, and NBC Universal, are you?

Seriously, don’t miss out.

I only wish I could make a list of all the great participant companies coming to this year’s event … but I can’t.  If I could you would see that by coming to this year’s X Change you would be joining some of the most respected brands in technology, media, healthcare, advertising, software, and retail in the world.  Worse, you would realize that missing X Change means not getting to hear first-hand how some of the greatest minds in the digital measurement industry are getting it done today.

You would be bummed.

Don’t be bummed, come to X Change 2010, September 20, 21, and 22 in Monterey, California. Register at our web site now or contact me directly for more information.

Analytics Strategy

IBM buys Unica in $480M Deal

So the beauty of one public company buying another is that they usually hold an analyst call to explain the rationale. Props to IBM for holding this call just two short hours after the news broke and for giving a few of us a chance to pepper them with tough questions. On this call, Craig Hayman, General Manager of IBM Business Solutions (within the IBM Software Solutions Group), and Yuchun Lee, Founder and CEO of Unica, shared an insider’s perspective on the deal. In fact, Craig even shared the code name “Amaru,” which was his secret squirrel moniker for referring to the deal internally before it was done.

So here’s the scoop.

The IBM acquisition of Unica was largely driven by a recognized need for enterprises to get closer to their customers by understanding their experiences and interactions across a broad network of channels and customer touch points. They’ll accomplish this by using analytics technologies, building single-view profiles of customers, and delivering marketing process improvements.

Sounds a little like markety-speak doesn’t it? Well, regardless it’s still a pretty good story and one that I hope IBM is able to pull off. It’s actually similar to the one that Adobe told after the Omniture acquisition with perhaps more of an automation spin.

What does this mean for Web Analytics?

When Joe Stanhope of Forrester fame deftly asked how IBM planned to rationalize the overlap between NetInsight and the recent Coremetrics acquisition, the response was dominated by the word “synergies,” of which they see a lot between these firms. Yuchun rightly went on to describe NetInsight as only one product in the Unica portfolio and to note that Coremetrics and NetInsight serve different segments within the web analytics market. He explained that NetInsight has strength in the on-premises solution market (which it does) and that its ability to leverage web analytics within an online datamart is also a differentiator (while not entirely unique in the market at large, it is true when compared to Coremetrics; NetInsight uses a relational database construct for storing and accessing clickstream data). Yuchun also pointed out that Coremetrics has strength when it comes to collecting high-volume, high-transaction data, a result of the robust infrastructure Coremetrics has been building to collect and deliver data at scale without incurring exorbitant expenses (and they were doing a damn good job of this).

All in all, the comments about the synergies concluded by stating that both tools would accelerate the benefits of deep customer insights for IBM’s clients. Nonetheless, it will be very interesting to see which features and functions emerge from a combined solution of two web analytics powerhouses.

What does this mean for IBM?

As much as I’d like to think that analytics is the epicenter of the business world, this deal is about multi-channel campaign management and marketing automation. IBM is without question on a buying spree. They snatched up SPSS, Coremetrics, DataCap, Sterling Commerce and now Unica in short order. Presumably this is all part of IBM’s strategic growth plan that earmarked a whopping $20 billion for acquisitions through 2015. But from a web analytics perspective, this acquisition didn’t occur because of the NetInsight product. Don’t get me wrong, I’m a big fan of the technology, but Unica’s campaign automation solutions and interactive marketing prowess within the marketplace surely made them a tasty morsel for IBM to gobble.

The newly acquired Unica technology will sit within the Software Group business, or more specifically, IBM Software Solutions Group. Yuchun will own the BU within the software solutions group, which also holds WebSphere Commerce, Coremetrics, Cognos, and about a bazillion other software solutions. But as we learned on the call today, Craig Hayman will work to build out frameworks and the connections between this multitude of solutions.

What does this mean for clients?

So, when I look at the big picture, my speculation is that IBM is furthering the bifurcation of the marketplace in yet another direction that separates the “haves” from the “have nots”. What I mean by this is slightly different from what Eric described in his bifurcation of the analytics market as a separation of tools based on the level of experience of each user. He puts technologies like Adobe Omniture’s Discover and Coremetrics’ Explore into the exclusive camp of highly skilled analysts who are capable of performing true analysis on digital data sets. The rest of the population is left with simpler, yet still capable, tools (not meant in a disparaging way) like Google Analytics that are intuitive and require little training to begin garnering insights. While I agree with Eric, a new twist in this divide may also be developing on a financial level.

As we know, Google Analytics is free, while enterprise analytics can run into six, even seven, figures in a hurry. My thoughts on this financial divide and IBM’s perpetuation of it stem from Sam Palmisano’s scoff at the notion of consumer technologies dominating the enterprise. Clearly, IBM’s acquisition moves dictate that a set of tools designed for the professional marketer will be vastly different from the solutions accessible to consumers on the street. Thus, I see this as yet another wedge in the bifurcated divide between large enterprises, the ones that typically purchase software from the likes of IBM, Oracle, and SAS, and small and mid-sized companies who are forced to use a different toolset primarily because of price.

So at first blush, if you’re a big enterprise this all sounds pretty good. IBM and Unica join forces, which isn’t too much of a stretch as there’s also some history here…IBM is a Unica customer using campaign management, marketing resource management and other services to bring about a “marketing transformation” within their own organization (at least that’s how Craig Hayman put it). And Unica has also been OEM’ing IBM solutions for some time. The acquisition extends the growth trajectory that Unica was already on and helps to bring together IBM’s end-to-end story that they call “Blue Washing” (Err…hope that code word was okay for public consumption).

But, the acquisition also acts as a good milestone for IBM who is assembling all the key ingredients for a leading enterprise solution – Sterling Commerce is connecting to the back end – Coremetrics offers deep insights into customer behavior and segments – and now Unica delivers a marketing management solution. It’s hard to argue that they’re not connecting a very compelling story for marketing professionals.

Yet, if you’re a mid-sized business or even a small organization, the IBM “Blueness” may have just receded even further into the stratosphere, out of reach.

Analytics Strategy

Webtrends Acquires Transpond: Analytics, Meet Apps

The burning question on many a web analyst’s mind today is likely: Who is Transpond? That’s the question I asked when I first learned of Webtrends’ plan to acquire the San Francisco-based application development platform vendor.

Today the acquisition closed and word is out. At first glance, this may sound like a left turn for web analytics and perhaps it is. But in my mind it’s an interesting acquisition that’s headed in a positive direction. It also demonstrates that Webtrends isn’t afraid to make bold moves and assert its innovative position in the social analytics realm.


The History

Transpond was founded in 2007 as iWidgets, back when widgets were all the rage (here’s a view from the Wayback Machine). The company got off the ground with a $4M investment in early 2009, but found that it was limited by its chosen iWidgets moniker and went through a rebranding exercise in the summer of 2009 to become Transpond. All the while, they’ve been providing application development tools for companies to build and deliver apps on mobile devices like the iPhone or Android, web platforms like Facebook, and even apps for connected televisions. Transpond offers do-it-yourself development of applications such as quizzes, polls, games, and interactive commerce for distribution across multiple digital channels. They also provide development support if you’re looking for some expert dev resources to really make your apps sing. Under the new ownership of Webtrends all of these capabilities will be folded into the Webtrends Apps offering, and presumably reporting will become available within the Webtrends Analytics 9 interface.

What’s in it for Webtrends?

So, you may be asking yourself, why is Webtrends interested in this company? Well, the way I see it, Webtrends is tuned into the fact that more and more organizations are developing content that will live and breathe off-site. That is, apps that are not contained within your primary web presence. Whether it’s on a mobile phone, Facebook or the next new platform, users are interacting with your content and each other off-site. That’s a domain that web analytics has traditionally not been able to capture without some fancy footwork because most web analytics solutions rely on tracking contained within the pages of your primary web sites. While tracking within apps is not new either, this acquisition opens up the possibility of integrating behavior with applications that exist off your site into the data soup that is digital analytics. It’s really a logical extension of the analytics technology.

Why is this Cool?

What’s also really appealing about this technology from a development perspective is that the platform allows companies to build apps and deliver them across multiple platforms in a consistent manner: the ability to build once and deliver to many, in whatever format users choose to consume the content. Plus, when you bake measurement and analytics into the apps, you really have a means to evaluate interaction and compare across channels.

While this is certainly a new direction for web analytics acquisitions, I for one like the purchase and look forward to seeing Webtrends execute on the delivery of this new solution. Webtrends has the distinction of being one of the first pioneers in web analytics and of being the last independent vendor left standing. I’m pleased to see that this old dog ain’t afraid to learn new tricks.

Congratulations go out to Alex, Casey, Justin and the Webtrends team on the innovative move and to Peter Yared and Charles Christolini of Transpond for closing the deal.

Adobe Analytics, Analytics Strategy, General, Reporting

Our Mobile Measurement Framework is now available

Today I am really excited to announce the publication of our framework for mobile and multi-channel reporting, sponsored by OpinionLab. You can download the report freely from the OpinionLab web site in trade for your name and email address.

This paper builds on the “Truth About Mobile Analytics” paper we published with our friends at Nedstat last year and focuses both on measurement in mobile applications and, more importantly, on a cross-channel measurement framework built around interactions, engagement, and consumer-generated feedback.

  • Interactions occur in every channel, digital or not. Online and on mobile sites we call these “visits” (although that is a made-up word for interactions); in mobile apps the interaction starts when you click the icon and ends when you click “close”; in SMS it starts when you receive the message; on the phone it starts when you dial; and in stores interactions start when you walk up to an employee.
  • Engagement is simply “more valuable” interactions. Regardless of your particular belief about the definition of engagement, we all know it when we see it. Online it happens after some number of minutes, or clicks, or sessions, or whatever; in mobile apps it happens when you’ve clicked enough buttons; on SMS it happens when you respond to the message; on the phone it starts when you begin a conversation, and the same is true in a physical store.  We say engagement is “more valuable” because without engagement, value is unlikely to manifest.
  • Positive Feedback happens when you do a really, really good job. Measuring feedback is a critical “miss” for far too many organizations. Apple’s App Store and the value of its star-rating system have essentially proven that there are massive financial differences associated with positive and negative experiences … but most companies still make the mistake of ignoring qualitative feedback altogether.

These three incredibly simple metrics can be applied to every one of your channels, your sub-channels, and your sub-sub-channels (if you like). When applied, you can create an apples-to-apples comparison across your web, mobile web, mobile app, video, social, and other efforts.

Then you can apply cost data, and you’re really in business.
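A minimal sketch of what that apples-to-apples roll-up might look like in code. Every channel name, count, and cost below is invented purely for illustration; the point is simply that the same three metrics, plus cost, normalize into comparable rates for any channel:

```python
# Hypothetical per-channel tallies: interactions, engaged interactions,
# positive feedback events, and channel cost. All numbers are invented.
channels = {
    "web":        {"interactions": 120_000, "engaged": 18_000, "positive_feedback": 900, "cost": 40_000.0},
    "mobile_web": {"interactions":  45_000, "engaged":  5_400, "positive_feedback": 310, "cost": 12_000.0},
    "mobile_app": {"interactions":  20_000, "engaged":  6_000, "positive_feedback": 800, "cost": 15_000.0},
}

def channel_scorecard(data):
    """Derive comparable rates and unit costs for every channel."""
    scorecard = {}
    for name, d in data.items():
        scorecard[name] = {
            "engagement_rate": d["engaged"] / d["interactions"],
            "feedback_rate": d["positive_feedback"] / d["interactions"],
            "cost_per_engagement": d["cost"] / d["engaged"],
        }
    return scorecard

for name, row in channel_scorecard(channels).items():
    print(f"{name}: engagement {row['engagement_rate']:.1%}, "
          f"feedback {row['feedback_rate']:.2%}, "
          f"cost/engagement ${row['cost_per_engagement']:.2f}")
```

Once every channel reports the same three rates and a unit cost, comparing a mobile app against the web site is no longer an apples-to-oranges argument.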

I don’t want to say much more than that but I would really, really encourage you all to download and read this free white paper. When we put something like this out — something we believe has the power to really transform the way everyone thinks about the metrics they use to run their business, and something that has the potential to force dashboards everywhere to be scrapped and started over — we’d really like your collective feedback.

DOWNLOAD THE WHITE PAPER NOW

Thanks to Mark, Rick, Rand, and the entire team at OpinionLab for sponsoring this work. If you’re the one person reading my blog that hasn’t seen their application in action, head on over to their site and have a look.

Analytics Strategy, Conferences/Community

Analysis Exchange members going to X Change 2010

Earlier this year after we launched the Analysis Exchange we put out our first challenge to the membership. We asked people to “be exceptional” in their participation, to step up and make a difference by working harder than expected, by bringing crazy passion to their work, and by participating in unexpected ways.  In exchange for “being exceptional” we said we would provide a complimentary pass to one mentor and one student to this year’s X Change conference in Monterey, California September 20, 21, and 22.

Today I am pleased to announce your exceptional winners of this challenge.

While nearly everyone who has participated in Analysis Exchange thus far has really blown my mind with their energy, their commitment, and their willingness to do something special for the larger web analytics and nonprofit communities, five people really stood out in the crowd.

  • Sarah DeAtley, Mentor from Washington who worked like crazy to sign up her fellow Seattle-ites and continues to evangelize for the effort;
  • Victor Acquah, Mentor from Virginia who participated in both our Alpha and Beta tests with PBS and provided tremendously valuable feedback;
  • Jason Thompson, Mentor from Utah who has stepped up repeatedly to mentor projects and has helped a great deal to spread the word;
  • Jan Alden Cornish, Student from California who has not only participated in multiple projects but has been an invaluable source of ideas and feedback;
  • Michael Healy, Student from California who has helped out on numerous projects and who really understands what Analysis Exchange is trying to do.

Unfortunately not everyone would have been able to make the X Change this year due to previous commitments; fortunately that made our job selecting the finalists nominally easier.  To make our final decision we asked everyone to send us a short paragraph describing “what they have learned” in Analysis Exchange to date.  Here is what we heard back:

From Jason Thompson:

“First let me say that, with the amazing cast of students and mentors that make up the Analysis Exchange, I am truly humbled to be considered for this honor.

The Analysis Exchange has reminded me that what makes us truly rich is not the contents of our wallets or how much money we have in our bank accounts, whoa….sorry…started channeling Tyler Durden there for a second, but what makes us truly rich are the relationships we have in life.

So what have I learned thus far? I have learned that we are all students and that if we are open, there are many great lessons for us to be taught.

I didn’t join the AE with the thought of getting anything in return, although I have been given many wonderful gifts through my participation. I joined because the AE provided me with the opportunity to give back to an industry that has given me so much.

I honestly feel a little bit weird writing this email, and the humble Jason in me says this opportunity should go to the person who would benefit from it the most.”

From Jan Alden Cornish:

“My two Analysis Exchange student projects have demonstrated several key takeaways. First, project management principles are paramount. The project kick-off meeting should reinforce the Analysis Exchange project priorities:

  • Schedule: a short-term project
  • Cost: limited resource availability, with the expectation that the student has the most flexibility
  • Scope: the project deliverables given the constraints

The project kick-off meeting should confirm the core roles and responsibilities of the project team members.

  • Organizational Lead: the key business stakeholder who approves the project scope and project deliverables. The organizational lead may be the project manager.
  • Mentor: the web analysis expert/consultant. The mentor may be the project manager.
  • Student: the primary execution resource

The project team has to rapidly converge on meeting times for reviewing and also for approving deliverables. The team needs to determine web / audio conferencing details: Skype, GoToMeeting, etc.

The first milestone is approving the project charter, which provides the scope definition and defines the team members’ roles and responsibilities. Scope definition may require more than one meeting, particularly if the business plan of the nonprofit organization does not clearly set forth measurable expectations for its web presence. For example, the organizational lead may be accountable for other time-critical projects (e.g., migration to a new content management system). The Analysis Exchange project may run concurrently with other projects within the organization that impact its web presence (e.g., an outsourced web redesign).

A second key takeaway for me was that the nonprofit world is a microcosm of the real world. Thus, risk management is key. A risk is a potential issue that might adversely impact the success of the project. Risks may be categorized as technical or organizational. The web implementation may not allow certain analysis questions to be answered. Nonprofits often leverage third-party platforms for key business functions such as e-newsletter management, volunteer recruitment, and e-commerce. These functions might not be tightly integrated from an analytics reporting standpoint. They may “roll up” to other departments in the nonprofit’s organization. The nonprofit’s web implementation also may have a limited deployment of Google Analytics. Other web analysis tools that are deployed may not offer functionality similar to Google Analytics.

Armed with these takeaways from my first two Analysis Exchange projects, I eagerly look forward to my next project.”

And from Michael Healy:

“Superhuman effort isn’t worth a damn unless it achieves results. – E. Shackleton

In my experience thus far with the Analysis Exchange I learned that the bounce rate, page views, time on page and every other web metric pretty much aren’t worth anything. More accurately, they aren’t worth anything to the client unless they start to solve a business problem.

The fact that the organizations in the Analysis Exchange aren’t selling anything per se, but instead are providing a nonprofit service to others, presented a few challenges. Working with Gordon Holstlander, of the Circle Drive Alliance Church of Saskatoon, and Michael Helbling, my excellent mentor, I learned how to work together as a team to move beyond the prima facie challenge.

Our project involved what appeared to be a simple analysis of the home page real estate to determine the best usage of the page. I built out several usage personas for the CDAC website which showed dynamic access to information, with different goals at different times of the day, week, and year. Answering ‘it depends’ to Gordon’s original question delighted my Econ brain to no end.

Moving beyond the population of people who already accessed the site, I was able to show Gordon how to do a simple Google Trends search for the Saskatoon area. An examination of the entire CDAC website also revealed a great source of underutilized content. These two findings were passed on to the client for future SEO use.

The biggest lesson in the Analysis Exchange thus far has been the open dialog and client relationships developed. Websites can be very personal things, with people at nonprofits often pouring countless hours into improving them. Michael facilitated an exchange of ideas between Gordon and me, such that when I made my presentation all parties were open to improvements. That is a lesson I will lean on for the rest of my career.”

As you can see we are honored to have members who are so thoughtful, intelligent, and especially in Jan’s case, precise! In the end the decision was nearly impossible to make … until Michael Healy mentioned that he would be coming to the X Change regardless of the outcome.  To smooth that path we are sending Michael our “maximum discount” code for the conference and will ensure that he drinks his fill at the bar each evening.

Which leaves us with Jason Thompson and Jan Alden Cornish, this year’s Analysis Exchange at the X Change contest winners!

Jason and Jan will be coming to the X Change compliments of Analytics Demystified and Semphonic, co-hosts for the conference. I hope you will all join us in the comments in congratulating Jason, Jan, and all of our distinguished members! And if you are lucky enough to be joining us at this year’s X Change conference, make sure to find Jason and Jan and congratulate them in person, shake their hands, and ask them about their experience in this effort.

The Analysis Exchange is always looking for more volunteer students, mentors, and nonprofit organizations. The X Change conference will be held September 20, 21, and 22 at the beautiful Monterey Plaza Hotel and Spa in Monterey, California.

Analytics Strategy

Web’s Goldmine – or – Consumer Jackpot?

This weekend the Wall Street Journal produced a well-researched article called The Web’s New Gold Mine: Your Secrets. Apparently, it’s the first in a series of articles about Internet tracking practices. It’s entirely informative and chock full of quotes, anecdotes, video, and interesting visuals. I highly recommend giving this article a read if you subscribe to the WSJ, or encourage you to join the discussion on their blog. However, I take serious issue with the bias inherent in this first article. The author, Julia Angwin, uses phraseology like “the business of spying on consumers” and “…details about her, all to be put up for sale for a tenth of a penny”. Clearly, the conclusion drawn by the author and presented to readers is that tracking solutions are spawned from malice. I vehemently disagree.

While it’s true that some tracking can be used for devious purposes, the majority of uses are fully anonymous and serve to benefit end users enormously. The reality is that media fragmentation, facilitated by the Internet, has forced advertisers to compete for our attention. To do this, they’re hawking their wares in a significantly more relevant way. By serving up advertising content that’s based on activity, propensity, and preference, they are saving us from the irrelevant fire hose that is most advertising. Without being coarse, I find the fact that some consumers are self-conscious about and sensitive to advertising targeted to their browsing activity trivial. It’s trivial compared to the benefits that targeting delivers to the rest of us.

I’ve got more to say on this topic, a lot more in fact, but I’ll stop short for now. My closing thought is that, while the author of The Web’s New Gold Mine may see the art and science of tracking as a boon for advertisers… I see it as a significant win for consumers. A jackpot, perhaps. I hope and expect that my online and offline interactions with brands will get increasingly better and more relevant as my interactions continue. Tracking will enable this to happen. But, that’s just me…I’d love to know what you think.

Analytics Strategy, Reporting, Social Media

Marketing Measurement and the Mississippi River

At least once a week in my role at Resource Interactive, I get asked some flavor of this basic question: “How do I measure the impact of my digital/social media investment?” It’s a fair question, but the answer (or, in some cases, the impetus for the question) is complicated and, often, is related to the frustration gap — the logical leap that, since digital marketing is the most measurable marketing medium of all time, it enables a near-perfect linkage between marketing investments and financial results.

It’s no fun to be the bearer of Reality Tidings when asked the question, especially when it’s easy to sound like the reason we can’t make a clean linkage is because it’s really hard or we just aren’t smart enough to do so. There are countless sharp, well-funded people in the marketing industry trying to answer this exact question, and, to date, there is a pretty strong consensus when you get a group of these people together:

  1. We all wish we had “the answer”
  2. The evolution of consumers and the growth of social media adoption have made “the answer” more elusive rather than less
  3. “The answer” is not something that is just around the corner — we’re chipping away at the challenge, but the increasing fragmentation of consumer experiences, and the explosion of channels available for marketers to engage with those consumers, is constantly increasing the complexity of “the question”

That’s not an easy message to convey.

So, How’s That Explanation Working Out for Ya’?

It’s a tough row to hoe — not just being a data guy who expends a disproportionate amount of energy, time, and brainpower trying to find a clean way to come at this measurement, but trying to concisely explain the complexity. Of late, I’ve landed on an analogy that seems to hold up pretty well: measuring marketing is like measuring the Mississippi River.

If you are tasked with measuring the Mississippi, you can head to New Orleans, don hip waders, load up a rucksack with instruments, and measure all sorts of things at the river’s mouth: flow volume, fish count, contaminants, etc. That’s analogous to measuring a brand’s overall marketing results: brand awareness, share of voice in the industry, customer satisfaction, revenue, profitability, etc. The explosion of digital and social media actually makes some of this measurement easier and cheaper than ever before through the emergence of various online listening and social media analytics platforms.

While these “mouth of the river” measures are useful information — they are measures of the final outcome that really matters (both in the case of the Mississippi and brand marketing) — how actionable are they, really? As soon as results are reported, the obvious questions come: “But, what’s causing those results?”

What causes the Mississippi River to flow at a certain rate, with a certain number of fish, with a certain level of a certain contaminant where it empties into the Gulf of Mexico? It’s the combination of all that is happening upstream…and the Mississippi’s watershed reaches from Montana (and even western Canada) all the way to Pennsylvania! The myriad headwaters come together many times over — they interact with each other just as different marketing channels interact with and amplify each other — in thousands of ways over time.

If we’re looking to make the Mississippi cleaner, we could travel to western Kansas and check the cleanliness of the Smoky Hill River. If it’s dirtier than we think it should be, we can work to clean it up. But, will that actually make the Mississippi noticeably cleaner? Logic tells us that it certainly can’t hurt! But, rational thought also tells us that that is just one small piece in an almost incomprehensibly complex puzzle.

With marketing, we have a comparably complex ecosystem at work. We can measure the growth of our Facebook page’s fans, but how is that interacting with our Twitter feed and our web site and our TV advertising and blog posts that reference us and reviews of our products on retailer sites and our banner ads and our SEO efforts and our affiliate programs and our competitors’ presence in all of these areas and… ugh! At a high level, a marketer’s Mississippi River looks like this:

Not only does each of the “managed tactics” represent dozens or even hundreds of individual activities, but environmental factors can be a Mack truck that dwarfs all of the careful planning and investment:

  • Cultural trends — do you really think that the Silly Bandz explosion was carefully orchestrated and planned by Silly Bandz marketers? (The CEO of Silly Bandz certainly thinks so; I’m skeptical that there wasn’t a healthy dose of luck involved.)
  • Economic factors — during a global recession, most businesses suffer, and successful marketing is often marketing that manages to simply help keep the company afloat
  • Competition — if you are a major oil producer, and one of the top players in your market inadvertently starts dumping an unfathomable amount of crude into the Gulf of Mexico, your brand begins to look better by comparison (although your industry as a whole suffers on the public perception front)

“It’s complicated” is something of an understatement when trying to accurately measure either the Mississippi River or marketing!

So, We Just Throw Up Our Hands and Give Up?

Just because we cannot practically achieve the Holy Grail of measurement doesn’t mean that we can’t be data driven or that we can’t quantify the impact of our investments — it just means that we have to take a structured, disciplined approach to the effort and accept (and embrace) that marketing measurement is both art and science. In the Mississippi River example, there are really three fundamentally different measurement approaches:

  • Measure the river where it flows into the Gulf of Mexico
  • Measure all (or many) of the tributaries that feed into each other and, ultimately, into the main river
  • Model the whole river system by gathering and crunching a lot of data

The first two approaches are reasonably straightforward. The third gets complex, expensive, and time-consuming.

For marketers — and I’m just going to focus on digital marketing here, as that’s complex enough! — we’ve got an analogous set of options (as it should be…or I wouldn’t be calling this an analogy!):

Measuring the direct and cross-channel effect of each tactic on the overall brand outcomes is nirvana — that’s what we’d like to be able to do in some reasonably reliable and straightforward way. And, we’d like that to be able to factor in offline tactics and even environmental factors. For now, the most promising approach is to use panel-based measurement for this — take a sufficiently large panel of volunteers (we’re talking 10s or 100s of thousands of people here) who voluntarily have their exposure to different media tracked, and then map that exposure to brand results: unaided recall of the brand, purchase intent, and even actual purchases. But, even to do this in an incomplete and crude fashion is currently an expensive proposition. That doesn’t mean it’s not an investment worth making — it just means it’s not practical in many, many situations.

However, we can combine the other two approaches — measurement of tactics (tactics include both always-on channels such as a Facebook page or a web site, as well as campaigns that may or may not cut across multiple channels) and measurement of brand results. The key here is to have clearly defined objectives at the brand level and to align your tactic-level measurement with those same objectives. I’m not going to spend time here expanding on clear definition of objectives, but if you’re looking for some interesting thinking there, take a look at John Lovett and Jeremiah Owyang’s white paper on social marketing analytics. They list four basic objectives that social media can support. At the overall brand level, I think there are basically eight possible objectives that a consumer brand might be tackling (with room for any brand to have one or two niche objectives that aren’t included in that list) — and, realistically, focusing in on about half that many is smart business. But I said I wasn’t going to expand on objectives…

What is important is to apply the same objectives at the brand and the tactic level — each tactic isn’t necessarily intended to drive all of the brand’s objectives, so being clear as to which objectives are not expected to be supported by a given tactic can help set appropriate expectations.

Just because the objectives should align between the tactic and the brand-level measurement does NOT mean that the measures used to track progress against each objective should be the same. For instance, if one of your objectives is to increase engagement with consumers, at the brand level, this may be measured by the volume and sentiment of conversations occurring online about the brand (online listening platforms enable this measurement in near real-time). For the brand’s Facebook page (a tactic), which shares the objective, the measure may, instead, be the number of comments and likes for content posted on the page.

But…How Does That Really Help?

By using objectives to align the measurement of tactics and the measurement of the brand, you wind up with a powerful performance measurement tool:

As simplistic and extreme examples, consider the situation where all of your tactics are performing swimmingly, but the brand overall is suffering. This might be the result of a Mack truck environmental factor — which, hopefully, you are well aware of because you are a savvy marketer and are paying attention to the environment in which you are operating. If not, then you should consider revisiting your overall strategy — do you have the wrong tactics in place to support the brand outcomes you hope to achieve?

On the other hand, consider a situation where the brand overall is suffering and the tactics as a whole are suffering. In that case, you might have a perfectly fine strategy, but your tactical execution is weak. The first order of business is to get the tactics clicking along as designed and see if the brand results improve (in a sense, this is a preferable situation, as it is generally easier to adjust and improve tactics than it is to overhaul a strategy).

In practice, we’re seldom working in a world where things are as black and white (or as green and red) as this conceptual scenario. But, it can certainly be the case that macro-level measurement of an objective — say, increasing brand awareness — is suffering while the individual tactics are performing fine. Let’s say you heavily invested in your Facebook page as the primary tactic to drive brand awareness. The page has been growing total fans and unique page views at a rapid clip, but your overall brand awareness is not changing. You may realize that you’re starting from a very small number of fans on Facebook, and your expectation that that tactic will heavily drive overall brand awareness is not realistic — you need to introduce additional tactics to really move the brand-level awareness needle.

In the End, It’s Art AND Science

Among marketing measurement practitioners, the phrase “it’s art and science” is oft-invoked. It sounds downright cliché…yet it is true and it’s something that many marketers struggle to come to terms with. Look at marketing strategy development and execution this way:

“The data” is never going to generate a strategy — knowing your customers, your company, your competition, and a bevy of other qualitative factors should all be included in the development or refinement of your strategy. Certainly, data can inform and influence the strategy, but it cannot generate a strategy on its own. Performance measurement, though, is all about science — at its best, it is the quantitative and objective measurement of progress towards a set of objectives through the tracking of pre-defined direct and proxy measures. Dashboards can identify trouble spots and can trigger alerts, but their root causes and remediation may or may not be determined from the data — qualitative knowledge and hypothesizing (the “art”) are often just as valuable as drilling deeper into the data.

It’s a fun world we live in — lots of data that can be very valuable and can drive both the efficiency and effectiveness of marketing investments. It just can’t quite deliver nirvana in an inexpensive, easy-to-use, web-based, real-time dashboard! 🙂

Adobe Analytics, Analytics Strategy, General

Validating Orders & Revenue

I recently received an e-mail from a blog reader who was having issues tying their Orders in SiteCatalyst to Orders in their back-end system. Here is a snippet from the e-mail:

I have a little issue in my own SiteCatalyst setup that I recently discovered. Sad for me I had trusted the number of Orders for each day’s Conversion Funnel and recently I decided to validate the numbers in SiteCatalyst against what our back-end system has. SiteCatalyst is 5%-10% understated each day which makes for a heck of a difference at the end of the month! I’d rather be understated than overstated, but can you give me some ideas where I should look first?

Unfortunately, this is an all-too-common problem that I hear about out there. In this post I am going to share some ideas on how you can tackle this Order/Revenue validation issue head-on and make sure you can trust your critical Orders/Revenue data in SiteCatalyst.

Order ID eVar
If you have an online shopping cart, you should already be setting the s.purchaseID variable with a unique Order ID when an Order takes place on the website. This variable is used by SiteCatalyst to ensure Order uniqueness. Unfortunately, this variable is not readily available in the SiteCatalyst interface: it is available in DataWarehouse but not in regular SiteCatalyst reports or Discover. Carmen Sutter (@c_sutter) has submitted an idea in the Idea Exchange to change this, but until then, I recommend that you set what I call an Order ID eVar. To do this, all you need to do is pass the same value you assign to s.purchaseID into a custom eVar. This will allow you to see all Orders and Revenue by Order ID from within SiteCatalyst and Discover as you would any other eVar. Once you have done this, you can open up this new Order ID eVar and add your Orders or Revenue Success Event as needed:

In the example above, we can see that most Order IDs have only one Order, which is what we want. However, in this case, we can see that one ID was counted twice. That may require some research, and I like to schedule a report like the one above to be sent to me weekly so I can make sure nothing strange is going on.
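To make the tagging side of this concrete, here is a minimal sketch of order confirmation code. The s object normally comes from your s_code.js file (it is stubbed here so the snippet stands alone), and eVar20 is just an assumed slot; use whichever eVar is free in your report suite.

```javascript
// Order confirmation tagging sketch. "s" is normally created by s_code.js;
// a bare stub is used here so the example is self-contained.
var s = { events: "" };

function trackOrder(orderId) {
  s.purchaseID = orderId; // used by SiteCatalyst to de-duplicate Orders
  s.eVar20 = orderId;     // same value in a custom eVar, reportable anywhere
  s.events = "purchase";
}

trackOrder("ORD-20110518-0001");
```

Because the eVar carries the identical value, a duplicated Order ID will show up in the eVar report as one ID with two Orders against it, which is exactly the kind of anomaly the weekly report above is meant to catch.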

Data Sources Setup
However, while adding an Order ID eVar is helpful in seeing if you are over-counting Orders in SiteCatalyst, it won’t tell you if you are under-counting Orders or how close your SiteCatalyst data is to your back-end systems. To do this, I recommend you use Data Sources. As a quick refresher, Data Sources allows you to import external data/metrics into SiteCatalyst (see post link for more details). In this case, I recommend that you import a file from your back-end system which contains your unique Order ID, the number of Orders (which should always be “1”) and the Revenue amount. When you import data via Data Sources, you include the date that you want the data to be associated with, so it doesn’t matter if you import the data on a daily, weekly or monthly basis. That said, the more frequently you upload it, the faster you can find issues.

Here are step-by-step instructions on how to do this:

  • Create the Order ID eVar described above
  • Create two new Incrementer Success Events and name them “Back-End Orders” (Type=Numeric) and “Back-End Revenue” (Type=Currency)
  • Create a new Data Sources upload template (ClientCare or Omniture Consulting can assist with this). You want to be sure to map the two new “Back-End” Success Events to the Data Sources template. Even more critical is that you want to include the newly created Order ID eVar in the Data Sources template. If you do not do this, then you will not be able to see these two new Back-End metrics in the same Order ID eVar report that you have in SiteCatalyst (more on this later).

  • When you are done, you should have a Data Sources template that looks something like this:

  • Now all you have to do is work with your developers to have this file sent via FTP to the Data Sources FTP on a regular basis.
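To make the upload format concrete, here is a rough sketch of how a developer might generate the Data Sources file from back-end order records. The column headers, eVar number and event numbers are illustrative placeholders; use the ones in the template SiteCatalyst generates for you.

```javascript
// Build a tab-delimited Data Sources upload from back-end order records.
// Header names below are placeholders for your real template columns.
var backEndOrders = [
  { date: "05/18/2011", orderId: "ORD-1001", revenue: 129.0 },
  { date: "05/18/2011", orderId: "ORD-1002", revenue: 2350.0 }
];

function buildUpload(orders) {
  // Date, Order ID eVar, Back-End Orders event, Back-End Revenue event
  var lines = ["Date\tEvar 20\tEvent 100\tEvent 101"];
  for (var i = 0; i < orders.length; i++) {
    var o = orders[i];
    // One row per order: the Back-End Orders metric is always "1"
    lines.push(o.date + "\t" + o.orderId + "\t1\t" + o.revenue.toFixed(2));
  }
  return lines.join("\n");
}

var uploadFile = buildUpload(backEndOrders);
```

The resulting text file is what gets sent to the Data Sources FTP on whatever schedule you settle on.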

The Payoff
So by now, you are probably saying to yourself: “That’s a lot of work!” No argument here! However, hang with me as I share what the ultimate payoff is for doing this. As you recall, our primary objective was to see if our online Order and Revenue data was matching what our back-end systems indicated. Now that we have the Order ID eVar and two new “Back-End” Order and Revenue metrics, we have everything we need. This is where the fun begins and we put it all together!

All you have to do now is to open the new Order ID eVar report and add all of the relevant metrics. First, we will add the SiteCatalyst Orders and Revenue so we can see online Orders and Revenue by Order ID:

Next, we will add the two new “Back-End” metrics to the report and, since we were smart enough to include the Order ID eVar value in the Data Sources upload, SiteCatalyst knows which “Back-End” Order ID and dates line up with our online data:

Cool, huh? As far as SiteCatalyst is concerned, these offline metrics are connected to your Order ID eVar values just as if they had happened online. Using this report, we can see if there are any differences between our online and offline data. In the example above, it looks like the “Back-End” system had an order with $2,350 in revenue that wasn’t captured online. Having this information makes it much easier to troubleshoot order submission issues. You can even use DataWarehouse or Discover (only if you use Transaction ID Data Sources) to break down Order ID by browser, domain, IP address, etc… to see if you can figure out what is happening. In addition, you can export this data to Excel and look at the totals to see how far off you are in general.

Finally, for the true SiteCatalyst geeks, you can create a Calculated Metric that divides Orders by Back-End Orders and/or Revenue by Back-End Revenue to see a trended % that each is off, and set up Alerts to notify you if they deviate too much! When you take into account this level of assurance, all of a sudden the Data Sources work above might not seem like all that much in the long run!
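The math behind that Calculated Metric and Alert can be sketched in a few lines; the numbers and the 5% tolerance below are made up for illustration.

```javascript
// The reconciliation a Calculated Metric performs: online Orders divided by
// Back-End Orders, plus a simple tolerance check an Alert could enforce.
function captureRate(onlineOrders, backEndOrders) {
  return backEndOrders === 0 ? 1 : onlineOrders / backEndOrders;
}

function needsAlert(onlineOrders, backEndOrders, tolerance) {
  // tolerance = 0.05 flags any period more than 5% under- or over-stated
  return Math.abs(1 - captureRate(onlineOrders, backEndOrders)) > tolerance;
}

needsAlert(92, 100, 0.05); // 8% understated, so this fires
```

Trending captureRate by day is also a quick way to spot the exact date a new release started dropping orders.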

Final Thoughts
If you sell products online, nothing is more critical than believing in your key metrics. Even if you don’t sell online, the same principles here can be applied to lead generation forms, subscriptions or any other metrics you store in SiteCatalyst and also in your back-end systems.

Analytics Strategy, General, Social Media

Guest Post: Kevin Hillstrom

Kevin Hillstrom is one smart dude. President of MineThatData, author of Online Marketing Simulations, and prolific contributor to the Twitter #measure channel. Kevin spends a huge amount of time in Twitter challenging web analysts to think and work harder on behalf of their “clients,” 140 characters at a time.

A few weeks ago I asked Kevin “what five practices learned in the offline data analytics world would you like to see web analytics professionals adopt?” The following contributed blog post has Kevin’s answers which are, unsurprisingly, awesome. Near the end Kevin says “The Web Analyst has the keys to the future of the business, so it is a matter of getting the Web Analyst to figure out how to use keys to unlock the future potential of a business.”

Brilliant. We are the future of business … so what future will we be helping to create?

Kevin Hillstrom, President, MineThatData

In 1998, I became the Circulation Director at Eddie Bauer. Back in those days, Eddie Bauer printed money, generating more than a hundred million dollars of pre-tax profit on an annual basis.

One of the ways that Eddie Bauer generated profit was through the use of discounts and promotions. If a customer failed to purchase over a six month period of time, Eddie Bauer applied a “20% off your order” offer. The customer had to use a special promotion code, in order to receive discounted merchandise.

We analyzed each promotion code, using “A/B” test panels. Customers were randomly selected from the population, and then assigned to one of two test panels. The first test panel received the promotion, the second test panel did not receive the promotion. We subtracted the difference between the promotion segment and the control segment, and ran a profit and loss statement against the difference.

In almost all cases, the segment receiving the promotion generated more profit than the control segment. In other words, it became a “best practice” to offer customers promotions and incentives at Eddie Bauer. Over the course of a five year period of time, the marketing calendar became saturated with promotions. In fact, it became hard to find an open window where we could add promotions!

Being a huge fan of “A/B” testing, I decided to try something different. I asked my circulation team to choose two customer groups at random from our housefile. One group would receive promotions for the next six months, if the customer was eligible to receive the promotion. The other group would not receive a single promotion for the next six months. At the end of the six month test period, we would determine which strategy yielded the most profit.

At the end of six months, we observed a surprising outcome. The test group that received no promotions spent the exact same amount of money that the group receiving all promotions spent. After calculating the profitability of each test group, it was obvious that Eddie Bauer was making a significant mistake. It appeared that we would lose, at most, five percent of total annual sales, if we backed off of our promotional strategy. Eddie Bauer would be significantly more profitable by minimizing the existing promotional strategy.

In 1999, we backed off of almost all of our housefile promotions. At the end of 1999, the website/catalog division enjoyed the most profitable year in the history of the business.

This experience shaped all of my subsequent analytical work.

Just because we have the tools to measure our activities in real-time doesn’t mean we are truly optimizing business results. In the Eddie Bauer example, we had the analytical tools to measure every single promotion we offered the customer, and we used existing best practices and “A/B” testing strategies. All of it, however, was wrong, costing us $26,000,000 of profit on an annual basis. Simply put, we were measuring “conversion rate”. What actually happened was that we “shifted conversions” out of non-promotional windows, into promotional windows! Had we measured non-promotional windows, we would have noticed that demand decreased.
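The “shifted conversions” trap is easy to put into numbers. The figures below are invented (a 40% gross margin and the 20% discount), not Eddie Bauer’s actual data, but they show how a promo window can look like a win while the six-month view reveals the loss:

```javascript
// Invented illustration of "shifted conversions": margin and discount
// rates are assumptions for the sketch, not real Eddie Bauer numbers.
var MARGIN = 0.40, DISCOUNT = 0.20;

// Promo window viewed in isolation: the promoted panel out-sells control,
// so the promotion appears profitable.
var promoWindowLift =
  350000 * (MARGIN - DISCOUNT) - 150000 * MARGIN; // 70000 - 60000 = +10000

// Six-month view: both panels spend the same $1M in total (demand merely
// shifted into the promo window), so the discount is pure margin give-away.
var sixMonthLift =
  (1000000 * MARGIN - 350000 * DISCOUNT) - 1000000 * MARGIN; // -70000
```

Same customers, same measurement tools; the only thing that changed between +$10,000 and -$70,000 was the length of the window.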

So, by measuring customer behavior across a six month period of time, we made a significant change to business strategy, one that dramatically increased annual profit.

What does this have to do with Web Analytics?

The overwhelming majority of Web Analytics activity is focused on improving “conversion rate”. Our software tools are calibrated for easy analysis of events. Did a visitor do what we wanted the visitor to do? Did a promotion work? Did a search visitor from a long-tail keyword buy merchandise when they visited the website? All of these questions are easily answered by the Web Analytics expert: the expert simply analyzes an event to determine if the event yielded a favorable outcome.

Offline analytics experts (often called “Business Intelligence” professionals or “SAS Programmers” if they use SAS software to analyze data) frequently analyze business problems from a different perspective. They use whatever data is available, incomplete or comprehensive, to determine if the individual actions taken by a business over time cause a customer to become more loyal.

With that in mind, here are five offline practices I wish online analytics experts would adopt.

Practice #1 = Extend the Conversion Window: Instead of analyzing whether a customer converted within a single visit or session, it makes sense to extend the conversion window and learn whether the customer converted across a period of time. For instance, when I ran Database Marketing at Nordstrom, we learned that our best customers had a 5% conversion rate, when measured on the basis of individual visits, but our best customers nearly achieved a 100% conversion rate when combining website visits and store visits during a month. By extending the conversion window, we realized that we didn’t have website problems, instead, we had loyal customers who used our website as a tool in a multi-channel process.
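As a sketch of what extending the window changes in the math, here is the same (invented) visit log scored per visit and then per customer across the window:

```javascript
// Invented visit log: three customers, six visits, two conversions.
var visits = [
  { customer: "A", converted: false }, { customer: "A", converted: false },
  { customer: "A", converted: true },  { customer: "B", converted: false },
  { customer: "B", converted: true },  { customer: "C", converted: false }
];

// Per-visit conversion: conversions divided by visits.
function perVisitRate(log) {
  var conv = log.filter(function (v) { return v.converted; }).length;
  return conv / log.length;
}

// Per-customer conversion: a customer counts if ANY visit in the window converted.
function perCustomerRate(log) {
  var byCustomer = {};
  log.forEach(function (v) {
    byCustomer[v.customer] = byCustomer[v.customer] || v.converted;
  });
  var ids = Object.keys(byCustomer);
  var conv = ids.filter(function (id) { return byCustomer[id]; }).length;
  return conv / ids.length;
}

perVisitRate(visits);    // 2 of 6 visits convert
perCustomerRate(visits); // but 2 of 3 customers convert within the window
```

The same behavior reads as a 33% visit-level problem or a 67% customer-level success; widening to cross-channel visits, as in the Nordstrom example, widens that gap further.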

Practice #2 = Measure Long-Term Value: Offline analytics practitioners want to know if a series of actions results in long-term profit. In other words, individual conversions are relatively meaningless if, over the course of a year, individual conversions do not yield incremental profit. This is essentially the “Eddie Bauer” example I mentioned at the start of this paper: we learned that individual conversions (customers purchasing via a promo code) yielded increased profit during the promotional period, but generated a loss when measured across a six month timeframe. A generation of Web Analytics experts were trained, largely because of software limitations, to analyze short-term business results, and have not developed the discipline to do what is right for a business across a six month or one year timeframe. Fortunately, Web Analytics practitioners are exceptionally bright, and are easily able to adapt to longer conversion windows.

Practice #3 = Comfort with Incomplete Data: I recently analyzed data for a retailer that was able to tie 70% of store transactions to a name/address. During my presentation, an Executive mentioned that my results must be inaccurate, because I was leaving 30% of the transactions out of my analysis. When I asked the Executive if it would be better to make decisions on incomplete data, or to simply not make any decisions at all until all data is complete and accurate, the Executive acknowledged that inferences from incomplete data are better than inaction caused by data uncertainty. Offline analysts have been dealing with incomplete multi-channel data for decades, and have become good at communicating the benefits and limitations of incomplete data to business leaders. The same opportunity exists for Web Analytics practitioners. Don’t hide from incomplete data! Instead, make confident decisions based on the data that is available, simply communicating what one can and cannot infer from incomplete data.

Practice #4 = Demonstrate What Happens to a Business Five Years From Now Based on Today’s Actions: Believe it or not, this is how I make a living. I use conditional probabilities to show what happens if customers evolve a certain way. Pretend a business had 100 customers in 2009, and 44 of the 100 customers purchase again during 2010. This business must find 56 new customers in 2010 to replace the customers lost during 2010. I can demonstrate what the business will look like in 2015, based on how well the business can retain existing customers or acquire new customers. This type of analysis is the exact opposite of “conversion rate analysis”, because we are looking at the long-term retention/acquisition dynamics that impact every single business. I find that CEOs and CFOs love this type of analysis because, for the first time, they have a window into the future; they actually get to see where the business is heading if things remain as they are today. Better yet, the CEO/CFO can go through “scenario planning” to identify ways to mitigate problems or to capitalize on favorable business trends. The Web Analytics practitioner has the data to do this type of analysis; it is simply a matter of tagging customers or shaping queries in a way that allows the analyst to make inferences that impact long-term customer value.
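A bare-bones sketch of that projection, using the 100-customer example (the rounding and the steady acquisition rate are simplifying assumptions):

```javascript
// Roll the customer file forward from a repurchase rate and a yearly
// new-customer count (the core of the conditional-probability projection).
function projectCustomers(startCustomers, retentionRate, newPerYear, years) {
  var customers = startCustomers;
  for (var y = 0; y < years; y++) {
    customers = Math.round(customers * retentionRate) + newPerYear;
  }
  return customers;
}

projectCustomers(100, 0.44, 56, 6); // 44 retained + 56 new each year: holds at 100
// If acquisition slips to 40 new customers a year, the file shrinks instead.
projectCustomers(100, 0.44, 40, 6);
```

Swapping in different retention or acquisition assumptions is exactly the “scenario planning” exercise the CEO/CFO can then react to.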

Practice #5 = Communicate Better: This probably applies to all analysts, not just Web Analytics experts. Executives are frequently called “HiPPOs” by the Web Analytics community, a term that refers to “Highest Paid Person’s Opinion”. The term can be used in a negative manner, suggesting that the Executive is choosing to not make decisions based on data but rather on opinion or gut feel or instinct or internal politics. I was a member of the Executive team at Nordstrom for more than six years, and I can honestly say that I made far more decisions based on opinion than I made based on sound data and analytics … and I am an analyst by trade!! Too often, the analytics community tells an incomplete story. Once, I witnessed an analytically minded individual who made a compelling argument, demonstrating that e-mail marketing had a better return on investment than catalog marketing. This analyst used the argument to suggest that the company shut down the catalog marketing division. On the surface, the argument made sense. Upon digging into the data a bit more, we learned that 75% of all e-mail addresses were acquired when a catalog shopper was placing an online order, so if we discontinued catalog marketing, we would cut off the source of future e-mail addresses. This is a case where the analyst failed to communicate in an appropriate manner, causing the Executive to not heed the advice of the analyst. Too often, analysts fail to put data and customer findings into a larger context. Total company profit, long-term customer profitability, total company staffing strategies and politics, multi-channel customer dynamics, and Executive goals and objectives all need to be taken into account by the analyst when communicating a data-driven story. When this is done well, the analyst becomes a surrogate member of the Executive team. When this is not done well, the analyst sometimes perceives the Executive to be a “HiPPO”.

These are the five areas I’d like to see Web Analytics experts evolve into. The Web Analyst has the keys to the future of the business, so it is a matter of getting the Web Analyst to figure out how to use keys to unlock the future potential of a business. Based on what I have witnessed during the past forty months of multi-channel consulting, I am very confident that Web Analytics practitioners can combine offline techniques with online analytics. The combination of offline techniques and online analytics yields a highly-valued analyst that Executives depend upon to make good business decisions!

Analytics Strategy, General

Are you looking for experienced web analysts?

Anyone who has read my blog for long knows that I am passionate about two things in web analytics: process and people. Process is the glue that holds all the hard work we do as analysts together and allows our effort to translate into tangible business value. But without a doubt it is the people who are absolutely critical to any business’s ability to compete and succeed on web analytics.

Unfortunately people, especially really good ones, are incredibly hard to find. So much so that my partners and I have invested heavily in creating an entirely new way for novice and veteran analytics practitioners alike to gain valuable “hands-on” experience using data to answer business questions: The Analysis Exchange.

While the Analysis Exchange has exceeded every single short-term milestone we have established for the effort, it has long been clear to my partners and me that training alone is not enough to satisfy the immediate needs of businesses working to take advantage of their existing investment in web analytics. Companies need analytical talent now: not a year from now, not in six months, but right now.

Why the urgency? Myriad reasons. The money has been spent on technology, the clock is ticking, the promises have been made, offline revenues are in decline, and the company’s digital channels are the hope for the future and the difference between profitability and loss.

The web analytics promise is real — companies that have become adept at generating analytically-driven insights and then translating those insights into sound business decisions have staked a clear competitive advantage. The giants of our industry — brilliant people like Joe Megibow, Dylan Lewis, Shari Cleary, and Lynn Lanphier <plug>all of whom are coming to the X Change conference in September, are you?</plug> — have not only determined the value of people but have also figured out how to convince management of that value.

Have you? Most companies have not.

Most companies persist in their belief that web and digital analytics is something that they can do “part time” and still have the successes that Intuit, Expedia, MTV, Best Buy, and others gain by hiring brilliant people, giving them clear direction, and recognizing the value of the analytical output they produce. Despite being well-intentioned, far too many managers still believe that software alone will provide insights and make recommendations.

But I digress.

Because we at Analytics Demystified believe in people and process so strongly, and because we are pretty confident in our consulting as it relates to process, we have decided to put our money where our mouths are and start helping companies fill their open positions for “web analyst, senior.” Today we are extremely proud to announce our first-of-its-kind partnership with the web analytics community’s leading recruiting firm, IQ Workforce.

Working directly with IQ Workforce CEO Corry Prohens and his team, Analytics Demystified has crafted a “one-two” punch to help speed the process of finding, vetting, and hiring the kind of deep talent and teams required to take complete advantage of any investment in digital measurement technology. The Demystified partners and IQ Workforce will help you determine exactly which roles you need to fill, what strengths the ideal candidate will have, and how hired resources will fit into the organization in a way that creates both business value and a satisfying experience for the analyst (which has a surprisingly positive impact on retention!).

In essence, Analytics Demystified, with our 30+ years of experience in web analytics, will sit on your hiring panel and help you find and hire the critical difference between “web analytics as a cost center” and “web analytics as a profit center.”

Did we mention we will do it for a fixed price and in a way that allows most companies to circumvent HR’s aversion to “outside help?”

If you’re looking for an analytics guru for your organization, give us a call. We are more than happy to explain how this partnership creates a dramatic advantage for most companies, and would love to talk with you about our business and our partners at IQ Workforce. In the meantime please have a look at our press release on the announcement and more details about the offering:

Thanks to Corry and his team for making this idea a reality. On behalf of IQ Workforce and the Demystified Partners we look forward to helping you with your staffing needs.

Analytics Strategy

Does your data quality still suck?

Years ago Google’s Analytics Evangelist Avinash Kaushik told everyone “data quality sucks, get over it” which at the time was quite the funny and controversial thing to say. Among other things, Mr. Kaushik encouraged his readers to “resist the urge to dig deep” to understand data-related problems, to “assume a level of comfort with the data” and to focus more on trends and less on absolutes.

At the time this advice seemed good. Any number of companies were in the midst of switching vendors back in 2006 (a trend that has noticeably declined), and so guidance not to stress over the differences observed between old system “A” and new system “B” was sound, as was his encouragement to spend more time focusing on data quality in key areas (checkout, carts, etc.).

Unfortunately times have changed.

Since 2006 we have seen a slow but steady increase in the prominence that digitally collected data has within businesses of all sizes. Now in 2010, more senior managers, Vice Presidents, and CEOs than ever are incorporating both qualitative and quantitative data collected from web, mobile, and social sites than ever before. Among our clients we have seen a profound shift from “nice to have” to “critical” when it comes to data flowing through Omniture, Coremetrics, Unica, Google Analytics and other systems, and slowly web analytics is becoming an embedded component of business decision making.

While this shift has far-reaching implications, lately at Analytics Demystified we have been looking more closely at how we can help our clients not just “get over” the “suckiness” of data quality but actually do something about it. We are doing this for one simple reason: senior leadership doesn’t want a glib response to data quality issues; they want as high a level of accuracy as possible and concrete answers for why that accuracy isn’t forthcoming.

Don’t believe me? The next time your boss asks about the quality of the numbers you produce look them squarely in the eye and repeat Mr. Kaushik’s words, “Well Bob the data quality sucks and so you should just get over it, okay?”

When you’re done you can call my friend Corry Prohens to help you find a new job.

The alternative is, of course, to actually pay attention to your data’s quality and work diligently to incrementally improve data collection processes. Rather than be lazy about the very foundation of all of your valuable work (and the high-quality analysis you’re working to drive into the business) you can do a few simple things designed to make your data “less sucky” and thusly more valuable.

And what are those things? Thanks to our friends at ObservePoint we have authored a short white paper on this exact subject! Titled “When More is Not Better: Page Tags” and subtitled “The Dramatic Proliferation of Script-based Tagging and the Resulting Need for a Chief Data Officer” (okay, not my best title, I admit it) the paper outlines the business processes and technologies required to develop a little more trust and faith in your digitally collected data.

The paper is a free download from ObservePoint but you will need to trade some information with them. I can assure you Rob Seolas and his team are fine folks and given that they have a tendency to send out sweet USB devices to prospects there are worse things than having someone from ObservePoint call.

Get your copy of “When More is Not Better: Page Tags” at the ObservePoint web site today!

If you’ve read the paper I welcome your comments. While we recognize that few companies are going to appoint a Chief Data Officer to manage their digital data quality, we hope that our readers understand that the point is not the job title but rather the associated work. Our thesis is that as the push towards digital continues, those companies who have (and can communicate) a high level of trust in their data will gain a competitive advantage, and in a world where the competition is always only a click away, who doesn’t want every advantage they can create?

Analysis, Analytics Strategy, Social Media

Integrated View of Visitors = Multiple Data Sources

I attended the Foresee Results user summit last month, and John Lovett of Analytics Demystified was the keynote speaker. It’s a credit to my general lack of organization that I wasn’t aware he was going to be speaking, much less keynoting!

John showed this diagram when discussing the importance of recognizing your capabilities:

The diagram starts to get at the never-ending quest to obtain a “360 degree customer view.” A persistent misperception among marketers when it comes to web analytics is that behavioral data alone can provide a comprehensive view of the customer. It really can’t — force your customers to behave in convoluted ways and then only focus on behavioral data, and you can draw some crazily erroneous conclusions (“Our customers appear to visit our web site and then call us multiple times to resolve a single issue. They must like to have a lot of interactions with us!”).

Combining multiple data sources — behavioral and attitudinal — is important. As it happened, Larry Freed, the Foresee Results CEO, had a diagram that came at the same idea:

This diagram was titled “Analytics Maturity.” It’s true — slapping Google Analytics on your web site (behavioral data) is cheap and easy. It takes more effort to actually capture voice-of-the-customer (attitudinal) data; even if it’s with a “free” tool like iPerceptions 4Q, there is still more effort required to ensure that the data being captured is valid and to analyze any of the powerful open-ended feedback that such surveys provide. Integrating behavioral and attitudinal data from two sources is tricky enough, not to mention integrating that data with your e-mail, CRM, marketing automation, and ERP systems and third-party data sources that provide demographic data!

It’s a fun and challenging world we live in as analysts, isn’t it?

(On the completely off-topic front: I did snag 45 minutes one afternoon to walk around the University of Michigan campus a bit, as the conference was hosted at the Ross School of Business; a handful of pictures from that moseying is posted over on Flickr.)

Analytics Strategy

Be-Still My Analytical Heart

Okay…I’ve been quiet about the Coremetrics acquisition by IBM for long enough now. While the dust still won’t settle until sometime in Q3 ’10, when this deal passes FTC scrutiny, I’m compelled to weigh in and offer my $.02, mainly because there’s been some good dialog in the blogosphere from people I respect, like Eric, Joe Stanhope, Akin and, more recently, Brian Clifton.

I’ll take a slightly different approach and use the acquisition to talk about the state of the web analytics marketplace. For starters, let me just say that this acquisition was inevitable. So too will Webtrends be acquired by some player looking to incorporate metrics into their overarching set of technology capabilities. And as I blogged earlier this spring, yet another even bigger fish will eat the existing big fish and we’ll utter oooh’s and ahhh’s as the analytics technology market evolves into a vital organ for all businesses with a heartbeat. While not immune to arrhythmia, this course of events shouldn’t really take anyone by surprise. I’ve been saying this for a while now and even penned “Web Analytics is Destined to Become an Integrated Service” back in May 2009 when I wrote the Forrester US Web Analytics Forecast 2008-2014 (subscription required). I’ve been advocating web analytics as a function within the marketing organization, which seems to be a logical orientation. However, it’s interesting that the consumption of analytical technologies has come from a smattering of different perspectives.

Here’s how the post-acquisition landscape looks:

Creatively Speaking…

Adobe’s acquisition of Omniture undoubtedly took many by surprise (myself included – although you’re never allowed to admit surprise as an analyst). The promise Adobe made to investors was that they would incorporate the market-leading web analytics technology into the creative life-cycle by enabling measurement at the point of content creation. Perhaps that’s not exactly how they positioned it, but that was my impression, and they’re now executing on that promise. Say what you want about acquisitions and the slow-moving integration process, but Creative Suite 5 debuted in April, just six months after the deal closed, with measurement hooks from FlashPro and Dreamweaver into both SiteCatalyst and Test &amp; Target. They’ve also accomplished this remarkable feat with a visual interface that allows content editors and non-power users to begin measuring their digital assets. This utilization of analytics places measurement at the operational level, yet by and large it’s still within the marketing group.

The Marketer’s Toolbox…

Enter Unica with their rebranded Marketing Innovation product suite, where NetInsight (formerly Sane Solutions) web analytics sits at the core. While both Omniture and Coremetrics made pre-acquisition strides to amass a truly effective online marketing suite, they were merely playing second fiddle to Unica’s Campaign, Interact and Marketing Platform solutions. Unica is widely acclaimed as a leading campaign management tool and sits proudly in the marketing departments of many an enterprise business. They’ve worked web analytics into the DNA of their overall marketing perspective and use it to power the automation and decisioning that many organizations strive for with lust and admiration. Their utilization of analytics really does position it as a linchpin for integrated marketing.

Shoppers World…

With speculation still swirling about the how’s and why’s of IBM’s intended use of Coremetrics, it’s tough to ignore Coremetrics’ strength in the retail vertical. While Coremetrics has an impressive client base outside of retail, including publishers and financial institutions among others, they’ve clearly got some good mojo going with their triple-A retail clients. Just thinking of how Big Blue will assimilate the nimble teams of relentless Coremetrics marketers in San Mateo and Texas makes me slightly nervous. Not for any loss of focus by the Coremetrics team on their dedication to client support or on their delivery of the leading analytical capabilities they offer – rather – where will this newly acquired asset live within the IBM estate? The way I see it, two possible scenarios can play out here:

1. First is the scenario that Akin speculates upon, whereby Coremetrics is folded into the Websphere group and serves to illuminate the value of customer interactions within website platforms across IBM’s customer base. This would greatly benefit Websphere customers, although it would narrowly define a finite application of a technology that is so much bigger than just online commerce.

Or;

2. The scenario that Eric envisions (and one that I believe would benefit our industry exponentially) is the one where IBM becomes the “business analytics” juggernaut in the enterprise. If this were to occur, IBM would need to integrate its SPSS and Cognos acquisitions to get really crafty about delivering extremely high value digital insights.

These are two very different outcomes and both speculative, but I’m rooting for the latter simply because it has the potential to push analytics so much further along. My sources tell me that some long-time IBM’ers feel this way too. One confidant with access to IBM brass even shared with me that internally the acquisition will be deemed a failure by some at IBM if Coremetrics isn’t integrated with SPSS and Cognos. That’s great news, because wholesale failure of business analytics isn’t an option.

Switzerland…

So here we have Webtrends as the only standalone web analytics player remaining from the set of truly original US-based technologies. They’re doing a good job of playing the part of Switzerland as they not-so-quietly establish a platform of Open Analytics whereby data flows in and out of the interface, fueling other operations around the business. While this is not the same as an integrated approach, Webtrends is taking a strong stance on have-it-your-way analytics. Their open APIs and REST URLs make it easy to leverage their data collection and pump data to any application within the enterprise. Thus, they too offer an integrated approach, yet do so by maintaining a position that supports, rather than delivers, the adjacent marketing functions.
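To make the have-it-your-way idea concrete, here is a minimal sketch of the pattern such open REST-style analytics APIs enable: a report exposed at a parameterized URL, returned as JSON, and reshaped into rows any other enterprise system can ingest. The endpoint, parameter names, and payload shape below are invented for illustration (the real interface lives in the vendor’s API documentation), and a canned string stands in for the HTTP round trip.

```python
import json
from urllib.parse import urlencode

# Hypothetical base URL for a page-level report; not a real endpoint.
BASE_URL = "https://ws.example.com/v3/Reporting/profiles/1234/reports/pages"

def build_report_url(start: str, end: str, fmt: str = "json") -> str:
    """Compose a REST-style report URL with a date range and output format."""
    query = urlencode({"start_period": start, "end_period": end, "format": fmt})
    return f"{BASE_URL}?{query}"

def rows_from_payload(payload: str) -> list:
    """Flatten a JSON report payload into simple rows for a downstream system."""
    report = json.loads(payload)
    return [
        {"page": item["name"], "views": int(item["measures"]["Views"])}
        for item in report["data"]
    ]

# A canned response stands in for the actual HTTP request.
sample = (
    '{"data": ['
    '{"name": "/home", "measures": {"Views": "1204"}}, '
    '{"name": "/products", "measures": {"Views": "877"}}]}'
)

url = build_report_url("2010-05-01", "2010-05-31")
rows = rows_from_payload(sample)
```

The design point is simply that once report data is addressable by URL and serialized in a common format, “pumping data to any application within the enterprise” reduces to composing a URL and reshaping the response.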

The Low End Theory…

Any post about the state of the analytics marketplace would be remiss if Google Analytics wasn’t included in the conversation. I include the Big Googley in the Low End Theory – not because they’re trailing – but because they’re sneaky smart. Just in case you haven’t been watching, since Google acquired Urchin Software, GA has been quietly amassing millions of installations across businesses large and small adding to the democratization of web analytics. I’d argue that they’re not doing this in a concerted enterprise-wide way, but they are probably gaining the most ground across the enterprise by sheer adoption and hands-on utilization. What this means is that pockets of users are deploying Google Analytics for very focused use of the data and the organization is becoming more accustomed to seeing GA data and using it to make key decisions in their day-to-day operations.

Many other analytics programs are delivering similar value to business users, yet in an extremely isolated manner with tools like KissMetrics, Twitalyzer, Visible Measures and Radian6 just to name a few. This is truly the low end theory because the data is rarely seen by anyone outside the marketing group, but it’s driving key activity around specific marketing functions without the larger business really taking note. Think grassroots baby – under the radar – with potential super smartie effectiveness.

Can Marketing Come from the Heart?

By now you should be asking yourself: so where’s this all going? However each of the companies described above fits into a company’s overall business, I think we can all agree that analytics is about understanding business performance. Here is where Eric’s vision of the Coming Revolution in Web Analytics fits into the story, and the quietly powerful behemoth that’s already penetrated the enterprise garden sits in wait down in Cary, North Carolina. Whether it’s SAS, another player, or an amalgamation of services from multiple players – analytics needs to be at the heart of the organization. Here’s where my analogy pays off… because if this is to happen, then data becomes the lifeblood of the enterprise and analytics allows companies to relate to their customers and offer more tuned-in and relevant products and services. Marketing should control this blood flow but use it to power the brain and the working limbs of the organization. While this may start to look like Business Intelligence, I believe it’s different because it requires real-time information, automated decisioning and ultimately creativity. These are qualities that I have yet to see from a BI tool. But maybe I’m naive.

Before this diatribe gets any longer, and you, dear reader, need resuscitation, I’ll call it quits. But I’ll offer fair warning that this is just the beginning of my thoughts on the matter and there’s more to follow. I’d also love to hear what you think.

Analytics Strategy

Thoughts on IBM's acquisition of Coremetrics …

By now you’ve heard that IBM announced today their intent to purchase Coremetrics. While I was caught off guard by the news — it was all over Twitter before I’d had a single drop of espresso — I cannot say I was particularly surprised. Coremetrics has not been particularly coy about their intent to do what is best for their customers and company, and IBM is on a tear of acquisitions and investment in analytics, with over $10 billion spent on 14 (now 16) acquisitions since 2005, including SPSS, Cognos, and now Coremetrics.

In fact, according to CMS Wire, IBM is now #2 behind privately held SAS for analytics market share (14.5% versus SAS’s 33%). Couple this with the consulting services organization IBM established just over a year ago and the picture becomes incredibly clear: IBM has the potential to become a digital analytics juggernaut in the Enterprise.

You know what? I’m excited about this.

I’m excited for three reasons, none of which are particularly altruistic of me considering that I am a strategic business consultant who has spent years working to elevate the visibility of digital analytics within the Enterprise. From my point of view, my partners and I (and our peers) will only be more successful as traditional business pays more attention to the value of digitally collected data as an input to business intelligence.

But I digress.

I’m excited by IBM’s acquisition of Coremetrics for these three reasons:

  1. This acquisition validates my “Coming Revolution in Web Analytics” thesis. In 2009, with the support of SAS, we postulated that we were on the cusp of a grand revolution in web analytics, one that had profound implications on both the practice and the practitioner. In the same way SAS continues to refine their web and customer experience analytics offerings, IBM buying Coremetrics pushes us further along towards what I believe is an inevitable future where digital analytics is powered as much (or more) by statistical and predictive models as “pretty” spreadsheets and iPhone viewer apps.
  2. This acquisition validates my “Two Sets of Tools” thesis. Back in February of this year I wrote a post about the need for two sets of tools to do digital analytics professionally. I wrote it in response to a dramatic increase in the number of companies telling us their business users were frustrated with the complexity of the most widely deployed “Enterprise” web analytics tools. While Coremetrics 2010 is certainly not perfectly suited for business users of all shapes and sizes, I do like what they’ve done with the recent release (view source on the main site!). More importantly, Coremetrics plus SPSS is essentially the bifurcated solution I describe as appropriate for deployment within large companies.
  3. This acquisition has tremendous potential to elevate the visibility of “digital analytics” among business leaders across the globe. One of our vendor clients commented today, “don’t you think the need for web analytics has been validated already?” to which I could only respond “no.” Everywhere we go — conferences, clients, prospects, social events, you name it — we continue to see a half-hearted expression of commitment by senior leadership to web and digital analytics. Don’t get me wrong — they are 100% committed to connected efforts! Leadership LOVES mobile apps, social media, Flash and Flex and Silverlight and the iPad and QR codes and … well, you get the idea. But in my experience, when it comes to investing in the measurement of those efforts to validate their value to the business, well, let’s just say interest seems to wane. My hope is that with IBM backing digital analytics and pushing Coremetrics as part of their portfolio, this will change much more quickly than it would with, say, Adobe pushing the same agenda (sorry, sorry, sorry!)

My last point is clearly the most important. For web analytics to truly “grow up” and mature into the valuable contributor to the entire business we all know (or at least suspect) it can be, it has to become a greater priority for senior leadership. And despite the fact that the people I respect the most — guys like Jim Sterne, Gary Angel, and Josh Manion — continue to do the real evangelical work, pounding their fists and respectfully raising their voices in an attempt to get management to pay attention, I suspect that having John Squire and Joe Davis injected into the analytics sales process at IBM has far greater potential to change minds and focus within corner offices.

Sorry to put that on ya, Squire. Let me know if I can help.

Since I was slow to get this post up, two other well-written and thoughtful pieces about the acquisition have been published that I would encourage you to read:

  • Akin Arikan (Unica)’s “Farewell to Web Analytics” post, which I think is a little hopeful on Akin’s part but should be expected as IBM is a large Unica customer and so this deal creates some risk for Akin and his team;
  • Joe Stanhope (Forrester)’s take on the deal. Joe was lucky enough to have dinner with John Squire last night and has a lot of detail about IBM’s opportunity.

Again I am really excited for Coremetrics management and staff. Personally I think that IBM made a great decision, one that will create benefits for everyone truly committed to digital analytics.

As always I welcome your thoughts.

Analytics Strategy, Reporting, Social Media

Monish Datta Learns All about Facebook Measurement

Columbus Web Analytics Wednesday was last week — sponsored by Omniture, an Adobe company, and the topic wound up being “Facebook Measurement” (deck at the end of this post).

For some reason, Monish Datta cropped up — prominently — in half of the pictures I took while floating around the room. In my never-ending quest to dominate SEO for searches for Monish, this was well-timed, as I’m falling in the rankings on that front. You’d think I’d be able to get some sort of cross-link from http://www.monishdatta.com/, but maybe that’s not to be.

Columbus Web Analytics Wednesday -- May 2010

We had another great turnout at the event, AND we had a first for a Columbus WAW: a door prize. Omniture provided a Flip video camera and a copy of Adobe Premiere Elements 8 to one lucky winner. WAW co-organizer Dave Culbertson presented the prize to the lucky winner, Matt King of Quest Software:

Columbus Web Analytics Wednesday -- May 2010

Due to an unavoidable last minute schedule change, I wound up pinch-hitting as the speaker and talked about Facebook measurement. It’s been something I’ve spent a good chunk of time exploring and thinking about over the past six months, and it was a topic I was slated to speak on the following night in Toronto at an Omniture user group, so it wound up being a nice dry run in front of a live, but friendly crowd.

I made some subsequent updates to the deck (improvements!), but below is substantially the material I presented:

In June, Columbus Web Analytics Wednesday is actually going to happen in Cincinnati — we’re planning a road trip down and back for the event. We’re hoping for a good showing!

Analytics Strategy

The Impending Acquisition of Adobe

This is purely speculation. I have no inside knowledge into the possibility I present here other than hypothetical conversations with peers. Sean Power brought up the topic over dinner recently, which caused me to start thinking seriously about the realities of Microsoft buying Adobe. He blogged about it way back when Adobe acquired Omniture. At that time, I was at Forrester, and when we got wind of the deal we had an all-hands meeting to make sense of the awkward acquisition. Upon arriving at consensus, I quickly penned a missive about why the acquisition of the leading web analytics and optimization firm made sense for a creative software firm like Adobe. Yet, like most others, we had to squint at the deal to see any logic in it at all. Now it’s starting to become clear why Adobe shelled out $1.8B to add some attraction to its offering for a much larger suitor.

Adobe controls big chunks of the digital customer experience. Specifically, they play a major role in content creation through the CS5 suite of products. While web developers aren’t necessarily building their global digital offerings in Dreamweaver, surely they are using elements of Creative Suite to do just that. Further, any author who wants to control a document’s integrity will lock it down by saving it in PDF format. And now, through the acquisition of Omniture, they have gained the ability to measure and optimize consumer utilization of those assets as well as the web sites and marketing efforts of leading brands across the globe.

We’re just starting to see the fruits of this curious marriage between the two firms in the announcements of tracking capabilities within the CS5 release. Yet these tracking methods are not meant for the traditional users of Omniture’s set of highly robust analysis capabilities; they are designed for content creators and developers to gain insights about the digital assets they’re producing. I like to think of this as tricking people into using web analytics by not actually telling them that they’re using data to make day-to-day business decisions. Brilliant, actually. This introduction of tracking capabilities within CS5 falls precisely in line with what my partner Eric Peterson describes as the bifurcation of the web analytics marketplace. Analytics at the low end offer information that is helpful (dare I say critical) in making decisions about business activity. At the top end are trained web analysts who crunch the data to tease out insights and offer recommendations based on a holistic representation of data from numerous disparate sources. With Omniture Insights providing the analysis horsepower at the top of this scenario and CS5 empowering the bottom, all of a sudden Adobe becomes an invaluable resource for enterprises that deliver services in online, offline, B2C, B2B or B2B2C environments.

Now, let’s introduce Microsoft into this mix.

MSFT has labored [successfully] to own the consumer desktop with its operating system, indispensable productivity tools (MS Office), and, not-so-universally, rich media with Silverlight. Not to mention that they’re still working diligently to capture consumers with Bing, MSN and a slew of other services pointed at end users. All this traction across MSFT properties gives them a lofty vantage point from which to monitor consumer behavior across digital channels. Adding a stack of ubiquitous software for content creation and some world-class measurement capabilities may be quite attractive to the Redmond rotund. They’d immediately challenge Apple on a new level of customer intelligence and empower their enterprise customers with a whopping new set of capabilities. Despite the new consumer view to be gained from this possible acquisition, the real benefit is a nicely wrapped enterprise solution complete with MS servers, a .NET framework, SharePoint, Dynamics CRM, Dynamics ERP, and a kitchen sink of bells, whistles and anything else you might want. The opportunity to deliver, measure and manage the customer experience at a really deep and integrated level seems like an appealing bet to me.

I won’t drone on about how or when this impending acquisition will occur, mostly ’cause I have no idea. But I will hedge by saying that other suitors may actually line up before MSFT comes calling. Google, for instance, could parlay a nice entrance to the packaged software market and gain the ability to create, deliver and measure the largest advertising network on the globe. For that matter, Apple may be strategically sparring with Adobe’s crystal palace in a deliberate attempt to soften their value. Swooping in for an acquisition after some fierce battling on the street wouldn’t be completely unheard of… now would it?

Adobe Analytics, Analytics Strategy

Thoughts on the WAA Certification Exam

I’ve been pondering this blog post for a couple of weeks now since I took the WAA Certification Exam along with eight others in the inaugural proctored exam at eMetrics in San Jose.

To be totally honest, I probably didn’t need to take this test. For starters, I’m not a traditional web analyst who’s down in the trenches doing the hard work of analysis, reporting and translating the massive amounts of data we’re all so fond of collecting into insights and recommendations. While these web analysts have something to prove to their organizations about the value of their jobs and the expertise they possess – frankly, I have nothing to prove.

Additionally, I work for a well-established consultancy with a great brand reputation, and I’m not planning on looking for a new job anytime soon. Our clients are most likely going to work with us regardless of our certification status. Yet I wanted to take this test because I do advise my clients on what they should be doing with web analytics from a strategic perspective. I speak frequently about analytics and how to interpret and deliver data in the most effective ways. So my vantage point cannot be devoid of the practical knowledge that dictates what’s possible in the real world.

Thus, I took the test in part to illustrate to myself that I not only talk the talk, but am willing to put my practical skills to the test. And yes…I passed, so you’ll be seeing the CWA (Certified Web Analyst) designation show up on my credentials.

Further, many of you voted recently to elect me to the Web Analytics Association Board of Directors; and I thank you for that. I took the WAA Certification Exam, so that I could lead by example and educate others about what I genuinely believe to be a valuable test of digital measurement knowledge. I encouraged all of my fellow board members to take the test as well and several have done so and more are sure to follow.

But because I went through the experience of taking this exam, I am uniquely qualified to share experiences that stretch well beyond the speculation of any detractors who criticize it. Thus, I give you the Good… the Bad… and the Ugly of the WAA Certification Exam.

The Good…

This exam is a true test of analytical knowledge that requires both business acumen and a deep understanding of applied web analytics. Like all things analytics – it’s not easy. In fact, it’s downright hard. The guidance offered by the WAA regarding a recommended 3 years of practical experience is sound advice. And even then, this exam will require web analysts to dig deep into their skill set to come up with not just acceptable answers, but the best answer. Out of the initial nine exam-takers, seven passed the test, which is good. Yet the minimum passing grade for the exam is 60%, and the mean score for our inaugural group was 61.7% (maybe I should have saved that for the Ugly). The high score among all test-takers thus far was 70%. While this may open questions about whether or not this test is too hard, to me it shows that there is plenty of runway for analysts to showcase their superstar skills with high scores. And if it were easy, where everyone could pass, then what validation of knowledge would that really be?

As my fellow WAA Board member Vicky Brock Tweeted: “As an employer I’d hire folk who ace this, as it tests analytical skills not recall”. Vicky also shared thoughts on her experience. Much like Vicky, I believe this exam is a good test of knowledge that requires prospective certified analysts to know their stuff, which in turn demonstrates that the credential holds distinction.

The format is a familiar multiple-choice answer system with four possible answers. Like most diligent test-takers, I relied on the process of eliminating the ones that I knew were incorrect and then sorting through the remaining choices. This typically left me with two answer choices that could work, but knowing that one was better than the other, I was largely going on instinct to make the right choice. There is also a word-problem section that offers business scenarios and data sets, leaving you to solve problems within the context of a specific business. These questions were the real gems of the exam and guaranteed to make your head spin. I love these types of questions, but perhaps I’m a glutton for punishment.

The big elephant in the room is the price. Without question, taking this exam is a financial commitment. I shelled out the bucks from my own pocket to do it because I believe in the value of certification. We as an industry are gaining momentum so quickly that analytics and data-driven cultures are all the rage today. The use of data is permeating organizations from the tactical to the strategic and ending up on the boardroom table, and in some cases, in financial analyst reports that end up on Wall Street. Yet, despite these significant gains, we have no designation to acknowledge that our web analysts are qualified for the job. This certification exam is the designation that will identify the truly proficient practitioners. In my opinion, this exam is worth every penny, and I strongly believe that as more and more professionals acquire the CWA accreditation it will become the gold standard by which job candidates, consultants and trusted advisors are selected. When we reach this critical mass, those who aren’t Certified Web Analysts will be questioned with just cause… So why aren’t you certified?

The Bad…

I’ll be the first to admit that there are still some kinks in the system, so it’s not perfect. Yet nothing is, so I’m willing to offer some leniency. For me, just downloading the application to sign up for the test was a chore. I offered feedback, so hopefully a fix is in the works now [there is], but when I registered, the editable PDF application only worked if you had the full version of Acrobat on your machine, which I don’t. So after filling out the entire form, I couldn’t save it. I ended up printing out the pages and then scanning them back in to submit my application. Now, that’s more than I’d expect from your average exam-taker, but I was on a mission. Also, be prepared to dig out your resume, because the application requires listing all of your previous employers, their addresses, manager names and phone numbers. I was toggling between the application and my LinkedIn profile just to complete the darn thing.

**UPDATE** There is now a web based form that serves as the application, so no more downloading the PDF.

Next, it was very challenging for me to prepare for this exam. I did utilize the documents offered by the WAA, including the Knowledge Required for Certification and the practice questions. The practice questions were actually great. They helped me to decide whether I was going to take the test and did closely resemble the actual questions on the test. I just wish there were more of them. The Knowledge Required document also contained a great deal of useful information, but after poring over the 37 pages of material, I was still left feeling unprepared. The document mirrors the UBC course material, so it is thorough in describing what will be offered in terms of knowledge, but the meat of the work isn’t included in this document. It was all menu and no entree. So essentially, the document tells you what you will be tested on, but doesn’t teach any of the concepts. While they clearly state that “Taking these four courses is not required to sit for the certification test,” those that do will be much better prepared than I was. I know that these courses are incredibly valuable and students rave about their success, but most professionals like myself don’t have the time to endure them – despite their value.

The Ugly…

So, I already ranted about the preparation materials and the costs above, but the Ugly for me was determining whether I would actually re-take this test if I failed. The feedback that I received from the WAA did contain my scores for the four sections that were included in the test (Analytical Business Culture, Case Studies, Marketing Campaigns, and Site Optimization). Yet, this was the extent of the feedback on my performance. It was up to me to decipher which questions may have been within each of the four categories and where I needed to focus my efforts to better prepare for a re-test. To the credit of the Association, most standardized tests are scored this way and offer similar amounts of feedback – but most tests of this magnitude also have test preparation courses that teach the skills of taking the test and offer extensive feedback on the skills necessary to score well on the exam. Thus, it was ugly for me because I can sincerely admit that I wouldn’t have paid to retake this test, because I do not know how I would have prepared for a second attempt.

The bright spot in this potentially ugly situation is that the WAA Board is committed to endorsing organizations that choose to develop WAA Certification Exam training programs. Since this test is still very new, these programs have yet to emerge, but the opportunity is out there. I want the WAA Certification Program to succeed for the WAA and for our industry. If the test-takers are better prepared to take the test through the help of a training program, then that’s a win-win. This type of prep course would offer me the confidence I needed to take the test again if I had failed…or for those of you taking the exam for the first time. Stay tuned for more news on this front as it develops.

The Summary…

This post is already running long and I’ve said a lot. The bottom line for me is that this exam is a strong indication of the digital measurement skills that an individual brings to his or her organization. Passing the WAA Certification Exam means that an individual is an expert in the field of web analytics. It’s an accomplishment that anyone in our industry should be proud of, and one that should receive accolades on top of accolades.

But that’s enough of my rant…What do you think?

I look forward to starting a long-term dialog on this topic, so please comment, email me or otherwise shout your opinions from the rooftops.

Analytics Strategy, Conferences/Community, General

The Analysis Exchange is OPEN TO EVERYONE

Back in December of last year, Aurelie, John, and I announced an idea we believe has the potential to change the web analytics industry forever: The Analysis Exchange. Briefly, the Analysis Exchange is a totally new approach to web analytics training — one that depends less on what you read and more on what you do.

The Analysis Exchange lets experienced web analysts demonstrate their passion for their work and gives beginners valuable “hands on” experience with data and real business problems. What’s more, the output from Analysis Exchange projects directly benefits some of the most amazing organizations around the globe — nonprofits and non-governmental groups who work not for money but for the betterment of humanity, our planet, and all creatures great and small.

You can read more about the origination of this effort in our blog posts and a very nice write up by our friend Jim Sterne, founder of the Web Analytics Association:

Since December we have been hard at work building out a web site and perfecting the business process that would be required to accomplish our core goals. What are those goals, you ask? Very, very simple … between now and June 1, 2011 we want to:

  • Provide FREE analysis to 1,000 nonprofit organizations
  • Provide FREE training and certification to 500 web analytics students
  • Provide FREE certification and support to 150 web analytics experts

1,000/500/150 are the numbers that we will be living by, but we know we’re not living there alone. We know this because the initial response to The Analysis Exchange has been tremendous! In addition to the great stuff we learned in our first testing round we have had excellent feedback from nonprofits, mentors, and students alike.

I love what Amy Sample, Director of Web Analytics at PBS Interactive had to say:

“What I love about the Analysis Exchange is the learning is reciprocal.  Not only is the student learning about analytics and giving back to the organization, but the organization is learning from the student as well.  Many of our local PBS stations have little experience with Web Analytics.  Through the Exchange, the stations are able to learn how to tackle analytics problems along with the student and how to make a lasting impact to their own organization.”

Cindy Olnick from the Los Angeles Conservancy had similar enthusiasm for her project:

“Joy’s a terrific mentor from what I can tell, and she and Danielle are great at translating all the numbers into information I can use. They’ve given me a report and will set up some new parameters in Google Analytics targeted to my goal of increasing membership.”

Todd Bullivant, one of our students said:

“It was a great way to end the week! Thanks again to everyone for the opportunity. I learned a lot about analytics that I can use in my own organization as well as future projects. I hope to work on many more of these in the future! I also just heard that my company is planning to spotlight me in the next internal newsletter due to this project, so increased visibility!”

Susie Hall, Director of Outreach and Enrollment at Acton School of Business said:

“The Analysis Exchange project was very enlightening for me as well.  We found out some valuable information, and I’m excited to use this new-found knowledge to help shape our outreach efforts. This project could not have come at a better time, we are in the middle of changing pretty much all of our processes, so moving forward armed with such powerful information is invaluable. Andrew and Candace were lovely to work with, and I am very happy with the whole experience.”

One of our super-motivated students, Andrew Hall said:

“During the course of the project, I worked with almost every functionality of Analytics other than custom variables, got to understand how Adwords campaigns work, and learned the benefits of taking data from Analytics into an analysis software like Tableau to gain and communicate insights.  Most importantly, I confirmed that I really enjoy doing this!  I am waiting to hear back from a couple of jobs, but in the meantime I’ve decided the knowledge I now possess would be beneficial to a lot of organizations.  I feel confident enough to start approaching businesses and nonprofits in my community to get consulting work.”

By now I’m sure you get the picture, and our mentors have been having a great time as well. So much so that Joy Billings from Digitaria gave a positively glowing review of her work at the San Jose Emetrics, John Lovett’s student had her work go all the way to the CMO’s office at The Holocaust Museum, and Victor Acquah from Blue Analytics said:

“Just got off the presentation! Todd did such a great job with the analysis and presentation that it is hard to tell he hasn’t been in analytics for too long. Totally impressive output.”

We at Analytics Demystified have felt totally blessed to be part of the projects that have gotten us this far … but now is the time to take it to the next level: Starting with the publication of this blog post, The Analysis Exchange is open to all students, all mentors, and all qualifying organizations around the world.

If you haven’t already, please create an Analysis Exchange profile and join us in our effort to change web analytics forever. If you need more information first we have lots and lots of content including:

If you’re already in the Analysis Exchange and you want to help, please reach out to nonprofits you know and ask them to create a project so you can work with them. If you’re on Twitter, please use our short link (http://bit.ly/analysis-exchange) to help spread the news. If you’re a member of Web Analytics Wednesday, please consider mentioning the effort at your next meeting.

Finally, I want to offer a special “thank you” to Aurelie, John, Jim, Holly Ross, Beth Kanter, Sean Power, and each and every one of the mentors, students, and organizations who have helped us over the last five months. You are all amazing for contributing your time and energy to help make this effort run as smoothly as possible. Thank you!

Analysis, Analytics Strategy, Reporting

Answering the "Why doesn't the data match?" Question

Anyone who has been working with web analytics for more than a week or two has inevitably asked or been asked to explain why two different numbers that “should” match don’t:

  • Banner ad clickthroughs reported by the ad server don’t match the clickthroughs reported by the web analytics tool
  • Visits reported by one web analytics tool don’t match visits reported by another web analytics tool running in parallel
  • Site registrations reported by the web analytics tool don’t match the number of registrations reported in the CRM system
  • Ecommerce revenue reported by the web analytics tool doesn’t match that reported from the enterprise data warehouse

In most cases, the “don’t match” means +/- 10% (or maybe +/- 15%). And, seasoned analysts have been rattling off all the reasons the numbers don’t match for years. Industry guru Brian Clifton has written (and kept current) the most comprehensive of white papers on the subject. It’s 19 pages of goodness, and Clifton notes:

If you are an agency with clients asking the same accuracy questions, or an in-house marketer/analyst struggling to reconcile data sources, this accuracy whitepaper will help you move forward. Feel free to distribute to clients/stakeholders.

It can be frustrating and depressing, though, to watch the eyes of the person who insisted on the “match” explanation glaze over as we try to explain the various nuances of capturing data from the internet. After a lengthy and patient explanation, there is a pause, and then the question: “Uh-huh. But…which number is right?” I mentally flip a coin and then respond either, “Both of them” or “Neither of them” depending on how the coin lands in my head. Clifton’s paper should be required reading for any web analyst. It’s important to understand where the data is coming from and why it’s not simple and perfect. But, that level of detail is more than most marketers can (or want to) digest.

After trying to educate clients on the under-the-hood details, I almost always wind up getting asked the “Well, which number is right?” question anyway. That leads to a two-point explanation:

  • The differences aren’t really material
  • What matters in many, many cases is the trend and change over time of the measure — not its perfect accuracy (as Webtrends has said for years: “The trends are more important than the actual numbers. Heck, we put ‘trend’ in our company name!”)
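That second point is easy to make concrete. With hypothetical weekly visit counts from two tools that disagree by a consistent ~12% in absolute terms, the week-over-week trends they report are nearly identical (a sketch; all numbers and variable names here are invented for illustration):

```javascript
// Hypothetical weekly visit counts from two tools that "don't match"
const toolA = [10000, 11000, 9900, 12100];
const toolB = [8800, 9700, 8700, 10650]; // consistently ~12% lower

// Week-over-week percent change — the trend each tool reports
const trend = (series) =>
  series.slice(1).map((v, i) => (v - series[i]) / series[i]);

const trendA = trend(toolA); // [0.10, -0.10, 0.22...]
const trendB = trend(toolB);

// The absolute numbers differ by ~12%, but the trends agree closely
const maxTrendGap = Math.max(
  ...trendA.map((t, i) => Math.abs(t - trendB[i]))
);
console.log(maxTrendGap < 0.02); // trends within two percentage points
```

Either series answers the question “are visits up or down, and by how much?” the same way, which is usually the question that actually matters.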

This discussion, too, can have frustrating results.

I’ve been trying a different tactic entirely of late in these situations. I can’t say it’s been a slam dunk, but it’s had some success. The approach is to list out a handful of familiar situations where we get discrepant measures and are not bothered by it at all, and then use those to map back to the data in question.

Here’s my list of examples:

  • Compare your watch to your computer clock to the time on your cell phone. Do they match? The pertinent quote, most often attributed to Mark Twain, is as follows: “A man with one watch knows what time it is; a man with two watches is never quite sure.” Even going to the NIST Official U.S. Time Clock will yield results that differ from your satellite-synched cell phone. These are two (or more) measures of the time that seldom match up, and we’re comfortable with a 5-10 minute discrepancy.

Photo courtesy of alexkerhead

  • Your bathroom scale. You know you can weigh yourself as you get out of the shower first thing in the morning, but, by the time you get dressed, get to the doctor’s office, and step on the scale there, you will have “gained” 5-10 lbs. Your clothes are now on, you’ve eaten breakfast, and it’s a totally different scale, so you accept the difference. You don’t worry about how much of the difference comes from each of the contributing factors you identify. As long as you haven’t had a 20-lb swing since your last visit to the doctor, it’s immaterial.

Photo courtesy of dno1967

  • For accountants…”revenue.” If the person with whom you’re speaking has a finance or accounting background, there’s a good chance they’ve been asked to provide a revenue number at some point and had to drill down into the details: bookings or billings? GAAP-recognized revenue? And, within revenue, there are scads of nuances that can alter the numbers slightly…but almost always in non-material ways.

Photo courtesy of alancleaver_2000

  • Voting (recounts). In close elections, it’s common to have a recount. If the recount re-affirms the winner from the original count, then the result is accepted and everyone moves on. There isn’t a grand hullabaloo about why the recount numbers differed slightly from the original count. In really close races, where several recounts occur, the numbers always come back differently. And, no one knows which one is “right.” But, once the results converge, that is what gets accepted.

Photo courtesy of joebeone

    That’s my list. Do you have examples that you use to explain why there’s more value in picking either number and interpreting it than in obsessing over reconciling disparate numbers? I’m always looking for other analogies, so please share any you have.

    Adobe Analytics, Analytics Strategy, Conferences/Community, General

    Excited to Announce X Change 2010 Keynote!

    Now that Emetrics West is behind us, and what an Emetrics it was this year, Analytics Demystified and Semphonic officially start to ramp up our efforts to get the best of the best of you to join us for three days in Monterey September 20, 21, and 22. While I am excited about the entire event, I am particularly excited about our keynote offering this year titled “A Conversation with Management.”

    Because the X Change draws so many expert practitioners, managers, and directors of web analytics my general feeling has always been that we should be programming for “lifers” in the field, looking for opportunities to help participants expand their career horizons. Our “Conversation with Management” keynote is a conversation with three of the most successful web analytics professionals I personally know:

    • Shari Cleary, Vice President of Digital Research at MTV Networks
    • Joe Megibow, Vice President of Global Analytics at Expedia.com
    • Steve Bernstein, Vice President of Analytics at Myspace

    I have personally known Shari, Joe, and Steve for years and have had the great honor of watching each progress up the management chain, taking an increasing amount of responsibility with each step. Now all three of our keynote participants represent web analytics at the highest levels within each of their organizations, an incredible feat when you consider the footprint MTV, Expedia, and Myspace have on the Internet.

    During our keynote I will be leading the panel to explore common “lifer” challenges including staffing, vendor management, the balance between reporting and analysis, their relationship with senior-most management, and the importance of business process to each of their jobs. My goal will be to get each to share details regarding their own career path in hopes those insights will help X Change attendees accelerate their own goals.

    You can learn more about the 2010 X Change on our micro-site for the conference:

    • The 2010 X Change conference schedule
    • Hotel information for the Monterey Plaza Resort and Spa
    • Details about Analytics Demystified’s Think Tank offerings
    • Our X Change Frequently Asked Questions document
    • Registration for the 2010 X Change in Monterey, CA September 20, 21, and 22

    If you have questions about the conference please don’t hesitate to give any of the Analytics Demystified partners a call or email. Remember that the conference is limited to the first 100 people who register and registration has already started.

    See you at the X Change!

    Analytics Strategy, Conferences/Community, General

    Are you coming to Emetrics?

    Well folks, it’s that time of year again. The winds are dying down and the flowers have all started to bloom so it must be time to make our annual pilgrimage to San Jose to bask in the glory of Jim Sterne and the Emetrics Marketing Optimization Summit! As usual I will be there and have the honor of sharing a keynote slot with my long-time friend and uber-optimizer Bryan Eisenberg!

    • Emetrics Keynote: Wednesday at 1:00 PM in the Grand Ballroom

    Partner John Lovett will also be there, basking in his own glory on the heels of his Web Analytics Association victory … and taking the WAA’s new Certification test. I haven’t really had much time to think about the Certification yet but will be interested to hear what John and others taking the test have to say.

    I also have the rare honor of presenting with Brett Crosby, Group Product Manager for Google Analytics and one of the nicest guys in the entire industry, hands down. Oddly he and I are presenting IMMEDIATELY AFTER his “What’s new from Google Analytics” pitch on Tuesday … but to compensate we’re gonna try something new and have a very loose “conversation” about web analytics that is more similar to an X Change mini-huddle than a traditional presentation.

    • Talking Analytics: Tuesday at 2:00 PM at The Conversion Conference (co-located w/Emetrics)

    Finally I will be sharing the stage at Web Analytics Wednesday with Adam Laughlin from the nonprofit Save the Children. We will be talking about our respective community education efforts — his “Web Analytics Without Borders” WAA initiative and our own Analysis Exchange. I will be making a few exciting announcements about The Analysis Exchange next Wednesday so if you cannot attend Web Analytics Wednesday please watch my blog or follow me on Twitter.

    • Web Analytics Wednesday: Wednesday at 6:00 PM at the Fairmont in San Jose

    That schedule again:

    • Tuesday, 2:00 PM at The Conversion Conference with Brett Crosby (Google)
    • Wednesday, 1:00 PM at Emetrics with Bryan Eisenberg (Emetrics Keynote)
    • Wednesday, 6:00 PM at Web Analytics Wednesday (The Fairmont Hotel, Market Street Foyer)

    Thanks to Coremetrics and SAS for their generous support of Web Analytics Wednesday at Emetrics, by the way. Great companies like these are what keep WAW events around the world free and open to everyone!

    See you in San Jose!

    Analytics Strategy, General

    My Interview with Adobe Chief Privacy Officer

    Those of you paying close attention to issues regarding consumer privacy on the Internet are probably at least a little familiar by now with Flash Local Shared Objects (also called Flash “Cookies” by some.) I wrote a white paper on Flash objects’ use in web analytics on behalf of BPA Worldwide back in February, and had to update the blog post I wrote when I noticed that Adobe had wisely written a letter to the Federal Trade Commission regarding the use of Flash to reset browser cookies.

    After writing that update I got in contact with Adobe’s Chief Privacy Officer, MeMe Rasmussen, who politely agreed to answer a few questions that I had about their letter and Adobe’s position on the use of Flash as a back-up strategy for cookies.  Given that Scout Analytics is now reporting that Flash “Cookies” are increasingly being deleted by privacy-concerned Internet users I figured it was a good time to publish my questions and MeMe’s responses.

    The following are my questions (in bold) and Mrs. Rasmussen’s responses verbatim.

    Flash Local Shared Objects (LSOs) have been around for a long time and I have been aware of their use as a “backup” for browser cookies for reset and other calculations for a few years.  What made you write your letter to the FTC now?  Was there a specific event or occurrence?

    The topic of respawning browser cookies using Flash local storage was publicized after research conducted by UC Berkeley on the subject was published in August 2009.  The topic was also raised at the FTC’s First Privacy Roundtable in December, so when the FTC announced that its Second Roundtable would focus on Technology and Privacy, we felt it was the appropriate opportunity for Adobe to describe the problem and state our position on the practice.

    While I believe the position you outlined in your letter to the FTC is the correct one, you have put many of your customers in an uncomfortable position by condemning an act that they have been using for quite some time — essentially issuing negative guidance where none had been previously issued (to my knowledge.)  What has the response to this been if I may ask?

    We have not received any comments or concerns from customers about our Comment Letter to the FTC.  Adobe’s position specifically condemns the practice of using Flash local storage to back up browser cookies for the purpose of restoring them after they have been deleted by the user without the user’s knowledge and express consent.  We believe companies should follow responsible privacy practices for their products and services, regardless of the technologies they choose to use.

    On page 8 of your response to the FTC you discuss Adobe’s commitment to research the extent of this (mis)use of Flash LSOs.  Given the extent to which LSOs are being used perhaps “not as designed” and the sheer popularity of Flash on the web this seems quite a task.  Can you describe how you have started going about this effort?

    We are currently in the process of defining the research project and are working with a well-respected consumer advocacy group and university professor.  At this time, the specific details of the project have not yet been finalized.

    Within the web analytics community many have commented that your position on Flash LSOs may impact some of what Mr. Narayen and Mr. James have said about the integration of Omniture and Adobe products like Flash.  Specifically some of the commentary suggests a tight integration of Omniture’s tracking and Flash.  Does your position on LSOs as a tracking device change the guidance the company has issued to common customers?

    No, the position we outlined in the FTC Comment on condemning the misuse of local storage, was specific to the practice of restoring browser cookies without user knowledge and express consent.  We believe that there are opportunities to provide value to our customers by combining Omniture solutions with Flash technology while honoring consumers’ privacy expectations.

    One of the suggestions I made in the white paper with BPA Worldwide that you cited was to use Flash LSO as a back-up tracking mechanism but NOT to use it to re-spawn cookies.  From a measurement perspective there are a handful of good reasons to do this … does Adobe have a position on that strategy that you can outline?

    The point we made in our FTC Comment was that we considered the practice of using Flash local storage to respawn HTML cookies without user consent or knowledge to be an inappropriate privacy practice.  In your white paper, you identified some uses of Flash local storage whereby browser cookies are reset but the user is given clear notice and an opportunity to consent.  We believe that technology should be used responsibly and in ways that are consistent with user expectations.  The example you presented in your white paper was an example of a Web site that, by giving notice and control to the user, implemented our technology in what appeared to be a responsible manner.
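The distinction in that answer — restore only with clear notice and consent — can be sketched in a few lines of code. This is purely illustrative (the function and argument names are my invention; in practice the backup value would be read from Flash local storage via ActionScript’s SharedObject, which is simulated here as a plain argument):

```javascript
// Hypothetical sketch of the consent rule described above: a backed-up
// visitor ID may be restored after cookie deletion ONLY with explicit consent.
function restoreVisitorId(currentCookie, backupValue, userHasConsented) {
  if (currentCookie) return currentCookie; // cookie still present — nothing to do
  if (!backupValue) return null;           // no backup available
  if (!userHasConsented) return null;      // silent respawning — the condemned practice — refuse
  return backupValue;                      // consented restore
}

console.log(restoreVisitorId(null, 'abc123', false)); // null — no silent respawn
console.log(restoreVisitorId(null, 'abc123', true));  // 'abc123'
```

The key design point is that deletion plus a backup is not sufficient on its own; consent is a separate, required input.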

    (Thanks again to MeMe and the team at Adobe for getting these responses back to me! As always I welcome your comments and questions.)

    Analytics Strategy

    An Open Letter to Steve Jobs

    Update (April 22, 2010): This article at Venturebeat suggests that iPhone application measurement vendors are hearing good news from Apple regarding their ability to measure in-app data.  The article, however, is devoid of any kind of details whatsoever, and so the Tweets saying “Apple NOT banning analytics from OS 4.0” appear to be somewhat optimistic in my opinion.  I’d love to hear from Google, Omniture, Webtrends, Coremetrics, Unica, etc. and see if they are having the same “you have nothing to worry about” conversations with Apple. Obviously if Venturebeat is correct this is great news, but if I were developing an iPhone application I’d want a little more than rumor to contradict the language in Section 3.3.9.

    Wait and see I guess …

    Dear Mr. Jobs,

    As a very loyal Apple customer and user of your products I want to thank you for all that you’ve done for computing in general. Your attention to detail and your vision have resulted in many of the most useful and usable products I own, too many to list honestly. While I was able to hold off for three days before purchasing the original iPhone (now on my third since I upgrade with every release) I pre-ordered my iPad and absolutely love it.

    Thanks for that.

    Unfortunately as a long-time member of the digital measurement industry I am in the uncomfortable position of having to ask you to reconsider what will undoubtedly be viewed by many Apple customers, developers, and end-users as an egregious mistake. I am talking about Section 3.3.9 in your updated iPhone Developer Agreement in which you apparently ban all third-party in-app measurement. While I respect Apple’s right to privacy, for those not familiar with Section 3.3.9 I would encourage you to read the following articles:

    The summary statement is that your updated Developer Agreement, if my read is accurate, strips all of your Development partners of their ability to measure application usage with an eye towards improving the overall quality of their product.  Just as a reminder these Developer partners include Best Buy, Expedia, The New York Times, The Wall Street Journal, Netflix, and some 150,000+ other companies working to deliver great experiences on your iPhone, iPod Touch, and iPad devices.

    While I certainly understand the over-arching desire to have quality control in all things Apple, which the mobile family of applications essentially become by proxy, banning the ability to measure application use is likely to be met with some resistance among your larger Development partners.  Many of these companies are known to me as a consultant and have active programs in place to use solutions like Adobe’s Omniture, Webtrends, Coremetrics, Unica, Google Analytics, and Yahoo Web Analytics to determine which application functionality is working and which needs to be addressed in future updates.

    Given that Apple is a long-time Adobe/Omniture customer I rather suspect that this third-party tracking is embedded in many of your own applications.  Perhaps that’s not the case, but given the general utility of these applications I would be pretty surprised if your own developers aren’t in violation of the new Developer Agreement somewhere in the pre-installed application stack.

    If not, well, shame on your developers for not embedding application tracking in complex applications like Pages and Keynote on the iPad. While I certainly do love the freedom I have to write on the iPad, I suspect if you were using Adobe/Omniture to track Pages you’d see me continually tapping the page in landscape mode trying to get a menu to come up so I can make a bulleted list …

    But I digress.

    Since many of your best Development partners are companies well-known for their general prowess for digital analytics — companies like Best Buy, Expedia, Cisco, Netflix, Disney, ABC, ESPN, and many, many more — you may want to give a little more thought to Section 3.3.9. If this section remains you are essentially blocking all of these companies (and all mobile developers in your App Store) from gaining valuable insight into how their applications can be more useful, more delightful, and frankly, more like Apple.

    Hopefully this was just a huge oversight on someone else’s part within Apple, especially since as far as anyone knows you don’t have technology available to replace the data that would be lost if (or when) Developers comply with this requirement. It may seem a touch geeky but business owners are increasingly relying on this data to justify the expense and commitment it takes to participate in the App Store (and the mobile revolution in general.)

    Being such a fan of your work I’d like to offer a solution, just in case you’re open to the idea.

    Apple has an opportunity to do something that, well, nobody else has really done in terms of digital measurement. Because of how the App Store works, Apple could create a set of terms and conditions for application tracking that would simultaneously provide guidance to your Developer community and create an unprecedented level of transparency for technology end-users everywhere (or at least those using Apple products.)

    Imagine a tier of requirements and resulting notifications to the end-user based on the type of data the application wanted to pass.  Just like geo-location requires explicit one-time opt-in today, tracking of individually identifiable (e.g., device-level or personal) data could require the same type of opt-in.  For more basic tracking (e.g., completely anonymous interaction data) you could simply allow that without opt-in to foster the growth and development of the application development community.
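A rough sketch of what that tiering might look like as a policy table (entirely hypothetical — these category names and rules are mine, not Apple’s):

```javascript
// Hypothetical tiered tracking policy: each category of data an application
// wants to collect maps to the consent it would require from the end-user.
const trackingTiers = {
  anonymousInteraction: { requiresOptIn: false }, // aggregate data, no identifiers
  deviceLevel:          { requiresOptIn: true },  // individually identifiable
  personal:             { requiresOptIn: true },  // like geo-location's one-time opt-in today
};

function mayCollect(category, userOptedIn) {
  const tier = trackingTiers[category];
  if (!tier) return false; // unknown category: deny by default
  return !tier.requiresOptIn || userOptedIn;
}

console.log(mayCollect('anonymousInteraction', false)); // true
console.log(mayCollect('deviceLevel', false));          // false
```

The deny-by-default branch for unknown categories is the part that would keep such a scheme enforceable as new data types emerge.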

    The most important thing is you would have an opportunity to craft a set of mobile tracking requirements that could be extended and applied across the entire mobile universe. In the same way Apple has changed our relationship with “pocket computing” forever, your company could essentially resolve a problem that in some ways is an accident waiting to happen, and do so in a way that creates opportunities rather than creating tension with the very group that is making your products so successful today.

    If this is in any way interesting to you I’d love to discuss it more. My contact information is on my web site.

    Measurement is not as sexy as the iPad or iPhone, but at the end of the day it is just about as important. With every new technology comes the need to understand its use and justify related expenses. Your Development partners are intensely drawn to the iPhone opportunity to be sure, and it’s great that you’re making people like Loren Brichter and others rich thanks to their efforts.

    But not everyone will be as savvy as Apple or as fortunate as Atebits; most companies work to use the limited data they do have to understand user behavior in an effort to make incremental improvements to their applications. Section 3.3.9 seems to prevent this data from getting into your partners’ hands, preventing the very thing I suspect you’re working to promote: the highest quality applications possible delivered via amazing devices.

    Hopefully I and others are simply reading Section 3.3.9 the wrong way. I would be honored if you or someone from Apple would provide guidance on this point and I’m happy to help communicate that guidance in whatever way I am able.

    Sincerely,

    Eric T. Peterson
    CEO, Founder, and Senior Partner
    Analytics Demystified, Inc.

    Analytics Strategy, Reporting, Social Media

    Digital Measurement and the Frustration Gap

    Earlier this week, I attended the Digital Media Measurement and Pricing Summit put on by The Strategy Institute and walked away with some real clarity about some realities of online marketing measurement. The conference, which was relatively small (fewer than 100 attendees), had a top-notch line-up, with presenters and panelists representing senior leadership at first-rate agencies such as Crispin Porter + Bogusky and Razorfish, major digital-based consumer services such as Facebook and TiVo, major audience measurement services such as comScore and Nielsen, and major brands such as Alberto Culver and Unilever. Of course, having a couple of vocal and engaged attendees from Resource Interactive really helped make the conference a success as well!

    I’ll be writing a series of posts with my key takeaways from the conference, as there were a number of distinct themes and some very specific “ahas” that are interrelated but would make for an unduly long post for me to write up all at once, much less for you to read!

    The Frustration Gap

    One recurring theme both during the panel sessions and my discussions with other attendees is what I’m going to call The Digital Measurement Frustration Gap. Being at an agency, and especially being at an agency with a lot of consumer packaged goods (CPG) clients, I’m constantly being asked to demonstrate the “ROI of digital” or to “quantify the impact of social media.” We do a lot of measurement, and we do it well, and it drives both the efficient and effective use of our clients’ resources…but it’s seldom what is in the mind’s eye of our clients or our internal client services team when they ask us to “show the ROI.” It falls short.

    This post is about what I think is going on (with some gross oversimplification), an observation that both panelists and attendees actively confirmed.

    Online Marketing Is Highly Measurable

    When the internet arrived, one of the highly touted benefits to marketers was that it was a medium that is so much more measurable than traditional media such as TV, print, and radio. That’s true. Even the earliest web analytics tools provided much more accurate information about visitors to web sites – how many people came, where they came from, what pages they visited, and so on – than television, print, or radio could offer. On a “measurability” spectrum ranging from “not measurable at all” to “perfectly measurable” (and lumping all offline channels together while also lumping all online channels together for the sake of simplicity), offline versus online marketing looks something like this:

    Online marketing is wildly more measurable than offline marketing. With marketers viewing the world through their lens of experience – all grounded in the history of offline marketing – the promise of improved measurability is exciting. They know and understand the limitations of measuring the impact of offline marketing. There have been decades of research and methodology development to make measurement of offline marketing as good as it possibly can be, which has led to marketing mix modeling (MMM), the acceptance of GRPs and circulation as a good way to measure reach, and so on. These are still relatively blunt instruments, and they require accepting assumptions of scale: using massive investments in certain campaigns and media and then assessing the revenue lift allows the development of models that work on a much smaller scale.

    The High Bar of Expectation

    Online (correctly) promised more. Much more. The problem is that “much more” actually wound up setting an expectation of “close to perfect:”

    This isn’t a realistic expectation. While online marketing is much more measurable, it’s still marketing – it’s the art and science of influencing the behavior of human beings, who are messy, messy machines. While the adage that it requires, on average, seven exposures to a brand or product before a consumer actually makes a purchase decision may or may not be accurate, it is certainly true that it is rare for a single exposure to a single message in a single marketing tactic to move a significant number of consumers from complete unawareness to purchase.

    So, while online marketing is much more measurable than offline marketing, it really shines at measurement of the individual tactic (including tracking of a single consumer across multiple interactions with that tactic, such as a web site). Tracking all of the interactions a consumer has with a brand – both online and offline – that influence their decision to purchase remains very, very difficult. Technically, it’s not really all that complex to do this…if we just go to an Orwellian world where every person’s action is closely tracked and monitored across channels and where that data is provided directly to marketers.

    We, as consumers, are not comfortable with that idea (with good reason!). We’re willing to let you remember our login information and even to drop cookies on our computers (in some cases) because we can see that that makes for a better experience the next time we come to your site. But, we shy away from being tracked – and tracked across channels – just so marketers are better equipped to know which of our buttons to push to most effectively influence our behavior. The internet is more measurable…but it’s also a medium where consumers expect a decent level of anonymity and control.

    The Frustration Gap

    So, compare the expectation of online measurement to the reality, and it’s clear why marketers are frustrated:

    Marketers are used to offline measurement capabilities, and they understand the technical mechanics of how consumers take in offline content, so they expect what they get, for the most part.

    Online, though, there is a lot more complexity as to what bits and bytes get pushed where and when, and how they can be linked together, as well as how they can be linked to offline activity, to truly measure the impact of digital marketing tactics. And, the emergence and evolution of social media has added a slew of new “interactions with or about the brand” that consumers can have in places that are significantly less measurable than traffic to their web sites.

    Consumer packaged goods companies struggle mightily with this gap. Brad Smallwood, from Facebook, showed two charts that every digital creative agency and digital media agency gnashes its teeth over on a daily basis:

    • A chart that shows the dramatic growth in the amount of time that consumers are spending online rather than offline
    • A chart that shows how digital marketing remains a relatively small part of marketing’s budget

    Why, oh why, are brands willing to spend millions of dollars on TV advertising (in a world where a substantial and increasing number of consumers are watching TV through a time-shifting medium such as DVR or TiVo) without batting an eye, but struggle to justify spending a couple hundred thousand dollars on an online campaign? “Prove to us that we’re going to get a higher return if we spend dollars online than if we spend them on this TV ad,” they say. There’s a comfort level with the status quo – TV advertising “works” both because it’s been in use for half a century and because it’s been “proven” to work through MMM and anecdotes.

    So, the frustration gap cuts two ways: traditional marketers are frustrated that online marketing has not delivered the nirvana of perfect ROI calculation, while digital marketers are frustrated that traditional marketers are willing to pour millions of dollars into a medium that everyone agrees is less measurable, while holding online marketing to an impossible standard before loosening the purse strings.

    My prediction: the measurement of online will get better at the same time that traditional marketers lower their expectations, which will slowly close the frustration gap. The gap won’t be closed in 2010, and it won’t even close much in 2011 – it’s going to be a multi-year evolution, and, during those years, the capabilities of online and the ways consumers interact with brands and each other will continue to evolve. That evolution will introduce whole new channels that are “more measurable” than what we have today, but that still are not perfectly measurable. We’ll have a whole new frustration gap!

    Analytics Strategy, General

    iPad, Mobile Analytics, and Web Analytics 3.0

    If you follow me on Twitter (@erictpeterson) you are likely already annoyingly aware that I rushed right out last week and bought Apple’s new iPad. I got the device for a few reasons but fundamentally it was because I’m a technology geek–always have been really–and despite knowing the iPad will only get better over time I was happy to shell out $500 to see what the future of computing and all media would look like.

    Yeah, I see the iPad as the future of computing and all media. Bold, sure, but hear me out … and I promise I’ll make this relevant to web analytics, eventually.

    I believe that all that the “average user” of any technology really wants is a simple solution to whatever problem they may have at the time. At a high level people look towards their operating system to simplify access to the multitude of applications and documents they use; at a lower level we want our applications to simplify whatever process we’re undertaking.

    Proof points for my belief are everywhere, ranging from the adoption of speed dial on phones (simplifies calling your friends and family), power seats in cars (simplifies getting comfortable when you switch drivers), and even into web analytics where a substantial growth driver behind Google Analytics has been the profound simplicity with which important tasks such as custom report creation and segmentation are accomplished.

    The iPad, and to some extent the iPhone and its clones, absolutely crushes simplicity in a way that is simultaneously brilliant and powerful. Want to read a book? Touch the iBooks application, touch the book you want, and start reading. Want to send an email? Touch the Mail app, touch the new icon, and start writing. Want to play a game or send an SMS or Tweet something? It all works exactly the same way … tap, swipe, smile.

    Sure, the iPad is a little heavier than is optimal, and yeah it shows fingerprints and costs a lot of money and isn’t open source and … blah, blah, blah, blah. The complainers are gonna complain no matter what–you’re Apple or you’re not in this world, I guess. But the complainers, I think, fail to grasp the opportunity the iPad creates:

    • The iPad takes mobile computing to an entirely new level. With iPad you have a 1.5 lb device that will let you read, write, watch, and generally stay connected from just about anywhere for up to 10 hours between charges. What computer or phone does that? None that I know of, and so iPad gives us a simple answer to “I need to work but I’m away from the office.”
    • The iPad enforces usability of applications, and this is a very good thing. The complainers complain that Apple asserts too much control over app design via their App Store acceptance processes. Apparently these folks haven’t used enough crappy software in their lifetimes and are hungry for more. Apple’s model and their application design toolkit gives us a simple answer to “I wish this software was easier to use.”
    • The iPad changes media consumption forever. Despite the Flash-issue, one I suspect will become a non-issue very quickly thanks to the adoption of HTML5, the iPad is the most amazing media consumption device ever created. It is a portable, high-definition TV, it is a near-complete movie library, it provides access to hundreds of thousands of books, and it allows you to surf the Internet in a way that can only be described as “delightful”. By definition the iPad gives us a simple answer to “I wish I had a way to keep my books, my movies, my newspapers, my TV shows, … all of my media, in a single place that could be accessed anytime from anywhere.”
    • The iPad changes education forever. I’m making a bet that by the time my first grade daughter hits middle school a significant number of children will carry iPads to school, not expensive, heavy, and immediately out-dated textbooks. Think about this for a second: interactive textbooks that can be updated as easily as a web site, think about young people’s media consumption model today, and think for just a second about why Apple would be motivated to provide “significant educational discounts” for the device. The iPad in schools gives us a simple answer to “How can we provide a common platform for learning that any student or teacher can immediately master and reflects our rapidly changing world?”

    Think that last piece isn’t important? Have a look at the image at the right, sent to me by @VABeachKevin (thanks man!) where he has already translated all three of my books into the ePub format and placed them on his iBooks bookshelf! This collection gives any web analyst with the iPad instant access to hundreds of pages of web analytics insight, anywhere, anytime. How cool is that?

    (And heck, these aren’t even Jim, Avinash, or Bryan’s books … I bet Kevin’s converting those as we speak!)

    I suspect you cannot appreciate this until you have one in your hands, but the iPad has or soon will remove the necessity to purchase printed books, newspapers, and magazines. More importantly it gives the holder the ability to work efficiently from nearly any location around the world–all you need is a WiFi connection today, and later this month that will be augmented with a 3G option.

    Yeah, I’m an Apple fanboy, and yeah, I’m lucky to be able to drop $500 on technology without giving it much thought, but wait and see … I bet the adoption curve on the iPad will very much mirror that of the iPhone, which is essentially ubiquitous these days. And just wait until someone develops a full-featured web analytics data viewer that takes advantage of all the pinching, swiping, dragging, and zooming UI capabilities of the iPad; that will simply be awesome! Imagine:

    • Scrolling along through time by simply swiping left or right
    • Zooming in on data by tapping or dragging across several dates
    • Adding metrics and dimensions by dragging them onto the existing graph or table
    • Changing from graph to table by simply rotating the device

    Total “Minority Report” for web analytics … and I bet we see this within nine months’ time. In fact, if you’re an Apple developer looking for an awesome project … call me! I’d love to help guide a team developing next-generation web analytics interfaces on tablet computers.

    Why This Matters to Web Analytics Professionals

    I said I would try to make this relevant to web analytics practitioners, so here I go. The iPad matters to measurement folks for exactly the reason I outlined back in September 2007, when I first wrote about mobile’s impact on digital measurement. Web Analytics 3.0, a term I coined at the time and one I still use, is essentially the addition of a completely new dimension for analysis: user location.

    In a digitally ubiquitous world–again one I described in 2007 that has more or less come to pass (although the prediction was kind of like predicting gridlock in Washington or rain in Oregon in April)–where a visitor is accessing information from becomes increasingly important and adds potentially significant context to any analysis we conduct. Location coupled with the device they’re using will likely have a profound impact on their likelihood to transact or otherwise use your site.

    For example, a visitor accessing your site from home will likely have different needs and goals than one in their car, in an airport, in a coffeeshop, or in one of your competitors’ stores. In a world where an increasing number of visits are “out of home/out of office” visits conducted using mobile devices, our collective approach towards analysis needs to change, perhaps dramatically.

    To be fair, this is not something you need to solve today. While our ability to discern and differentiate mobile visits is getting better all the time, our overall analytical capabilities for mobile, including the ability to tie mobile, fixed web, and offline visitors together, are still unfortunately complicated. On top of that, while applications are increasingly able to pass over geographic information, most web browsers are not, and so our ability to gather large quantities of this data is still limited …

    … at least for the time being.

    For now I stand by what I said back in 2007–digital ubiquity and location-awareness change everything. Back then the devices and platforms were just an idea; now we have the iPhone and its clones, the iPad is about to usher in a new era of mobile computing, Google and Apple are both behind mobile advertising, and the full scope of our analytical challenges is just beginning to emerge. If you’re struggling with how to measure your mobile investment and thinking about how that strategy needs to evolve, please consider giving us a call.

    What do you think? Do you have an iPad or do you refuse to purchase one? Why or why not? Have you already started to struggle measuring mobile devices or do you have it all worked out? Is this all as exciting to you as it is to me? As always I welcome your thoughts and comments.

    Analytics Strategy

    Opening Day Baseball Brings Out New Reporting

    Baseball fans across the nation were smiling this weekend with opening day games around the league. Those of you who know me recognize that I’m a raging Red Sox fan, but last night’s 8pm start time against the Yankees was just too late for me to catch the entire game. So, upon checking the scores this morning I got to see this cool new interactive game summary on Redsox.com.

     

    The sparklines show Tweet volume with mouseovers that offer details on each individual tweet. And the highlights are actual video clips that fire off in a new window right from the summary page. Way to go, MLB.com, for delivering a simple yet innovative mix of professional and consumer-generated content.

    Oh, yeah…the Sox beat the Yankees 9 – 7 in the opener if you’re wondering.
    GO SOX!!

    Analytics Strategy

    Vote Lovett for the WAA Board of Directors

     

    Being a change agent for web analytics requires taking calculated risks, standing up for what you believe, and working diligently to make our industry stronger. I left my job at Forrester Research in part to become a change agent for web analytics, and my bid for a seat on the WAA board of directors is the next big step in my journey. But this is a quest I cannot fulfill alone – I need your vote. I’ve never run for elected office before, so to illustrate my conviction, I’ve borrowed words from John F. Kennedy’s 1960 presidential nomination speech and added a few of my own…

    “With a deep sense of duty and high resolve, I accept your nomination.” I’ve stated several times before that there is no industry better than web analytics. Our colleagues within web analytics – the practitioners, the vendors, the leaders and gurus – are by and large friendly, approachable, and always willing to lend a hand. It makes working within analytics gratifying and fun. My hope is to elevate these positive attributes of our industry by aligning under the professional organization that we can call our own.

    “The times are too grave, the challenge too urgent, and the stakes too high…”
    Despite my optimism, we are facing turbulent times as an industry. We need to strengthen our association at a global scale to ensure that we speak with a common voice in all countries and in all languages to distinguish the Web Analytics Association as the undisputed resource for education, standards, research, and advocacy.

    “…if we open a quarrel between the present and the past, we shall be in danger of losing the future.” It has recently come to my attention, based on feedback from some members, that not everyone receives value from the WAA. If elected to the Web Analytics Association board I will dedicate my term to proving the value of our association to members at every level, from student to vendor to advanced consultant. If we cannot recognize our own value, how then can we expect outsiders to accept our mission with the credibility and respect it deserves?

    “I believe the times demand new invention, innovation, imagination, decision.” Those of you who know me recognize that I am not one to dwell on the mistakes of our past. Instead, I look to the future to determine how we can improve our situation and our position within the industry. These are exciting times for web analytics but times that will relegate us to obscurity if we fail to demonstrate our vision through genuine contributions. We must think differently about how measurement technologies can be applied to today’s challenges and illustrate how the WAA is defining these efforts by taking a decisive leadership stand.

    “This is a platform on which I can run with enthusiasm and conviction.”
    It is this industry, its people and our collective challenges that I want to champion as a representative of the Web Analytics Association. I’ve stated my intentions on the WAA web site, but will reiterate the most important. It’s time to stop making excuses and start delivering value to the members of the WAA. A vote for me will get you a dedicated evangelist who is willing to shoulder the burden of hard work and diligence that’s required to orchestrate change in this industry.

    I welcome your thoughts and comments about how to improve our industry and will guarantee an open mind throughout my tenure on the Web Analytics Association board of directors if elected. Thanks for reading. Now Get Out And Vote!

    Analytics Strategy

    Columbus WAW Recap: Don't "Antisappoint" Visitors

    We had a fantastic Web Analytics Wednesday last week in Columbus, sponsored by (Adobe) Omniture, with just under 50 attendees! Darren “DJ” Johnson was the presenter, and he spoke about web site optimization (kicking off with a riff on how “optimization” is an over-used word!). I, unfortunately, forgot my “good” camera, which means my photojournalism duties were poorly, poorly performed (DJ is neither 8′ tall, nor was he ignoring his entire audience):

    Columbus Web Analytics Wednesday -- March 2010

    One of the anecdotes that stuck with me was when DJ explained a personal experience he had clicking through on a banner ad (“I NEVER click on banner ads!” he exclaimed) and then having the landing page experience totally under-deliver on the promise of the ad. He used the term “antisappointment” (or “anticappointment?”) to describe the experience. It’s a handy word that works better orally than written down, but I’ll be shocked with myself if I don’t start using it!

    I’ve been spending more and more time thinking about and working on optimization strategies of late, and DJ’s presentation really brought it all together. This post isn’t going to be a lengthy explanation of optimization and testing…because I’m really not qualified to expound on the subject (yet). But, I will drop down a few takeaways from DJ’s presentation that hit home the most with me:

    • Testing (and targeting) doesn’t typically deliver dramatic step function improvements, so don’t expect it to — it delivers incremental improvements over time that can add up to significant gains
    • (Because of the above) Testing isn’t a project; it’s a process — it’s not enough to plan out a test, run it, and evaluate the results; rather, it’s important to develop the organizational capabilities to always be testing
    • “Testing” without “targeting” is going to deliver limited results — while initial tests may be on “all visitors to the site,” it’s important to start segmenting traffic and testing different content at the segment level as quickly as possible

    Good stuff.

    In other news, I’ve got a few additional bullet points:

    • Our next Web Analytics Wednesday is tentatively slated to be a happy hour only (unsponsored or with a limited sponsor) on a Tuesday. If you don’t already get e-mail reminders and you’d like to, just drop me a note and I’ll add you to our list (tim at this domain)
    • The Ohio Interactive Awards are fast approaching! This event, started up by Teambuilder Search, huber+co. interactive, and 247Interactive, is shaping up to be a great one on April 29th at the Arena Grand Movie Theater (Resource Interactive is sponsoring the event happy hour)
    • The TechLife Columbus meetup.com group continues to grow and thrive, with over 1,500 members now — it’s free, and it’s a great way to find meetups and people who are involved in high tech and digital in central Ohio

    It’s been a lot of fun to watch social media get put to use in central Ohio and make it so easy to find interesting people with shared interests. I’ve certainly gotten to know some great people over the past couple of years with a relatively low investment of my time and energy, and I’m a better person for it!

    Adobe Analytics, Analytics Strategy, General, Reporting

    Integrating Voice of Customer

    In the Web Analytics space, we spend a lot of time recording and analyzing what people do on our website in order to improve revenues and/or user experience. While this implicit data capture is wonderful, you should be supplementing it with data that you collect directly from your website visitors. Voice of Customer (VOC) is the term often used for this and it is simply asking your customers to tell you why your website is good or bad. There are two main ways that I have seen people capture Voice of Customer:

    1. Page-Based Comments – Provide a way for website visitors to comment on pages of your site. This is traditionally used as a mechanism to get direct feedback about a page design, broken links or problems people are having with a specific page. Unfortunately, most of this feedback will be negative so you need to have “thick skin” when analyzing this data!
    2. Website Satisfaction – Provide a way for visitors to rate their overall satisfaction with your website experience (vs. specific pages). This is normally done by presenting visitors with an exit survey where you ask standard questions that can tell you how your website is doing and how it compares against your peers.

    There are numerous vendors in each of these spaces and the goal of this post is not to compare them, but rather discuss how you can integrate Voice of Customer data into your Omniture SiteCatalyst implementation. In this post, I am going to focus on the first of the aforementioned items (Page-Based Comments) and specifically talk about one vendor (OpinionLab) that I happen to have the most direct experience with (their headquarters was a mile from my home!). The same principles that I will discuss here can be applied to all Voice of Customer vendors so don’t get hung up on the specific vendor for the purposes of this post.

    Why Integrate Voice of Customer into SiteCatalyst
    So given that you can see Voice of Customer data from within your chosen VOC tool, why should you endeavor to integrate Voice of Customer and your web analytics solution? I find that integrating the two has the following benefits:

    1. You can more easily share Voice of Customer data with people without forcing them to learn [yet] another tool. People are busy and you are lucky if they end up mastering SiteCatalyst, let alone learning how to use OpinionLab, ForeSee Results, etc…
    2. Many Voice of Customer tools charge by the user so if you can port their data into SiteCatalyst, you can expose it to an almost unlimited number of users.
    3. You can use Omniture SiteCatalyst’s date and search filters to tailor what Voice of Customer data each employee receives.
    4. You can divide Voice of Customer metrics by other Website Traffic/Success Metrics to create new, interesting KPIs.
    5. You can use Omniture SiteCatalyst Alerts to monitor issues on your site.
    6. You can use Omniture Discover to drill deep into Voice of Customer issues.

    I hope to demonstrate many of these benefits in the following sections.

    How to Integrate Voice of Customer into SiteCatalyst
    So how exactly do you integrate Voice of Customer data into SiteCatalyst? For most VOC vendors, the easiest way to do this is by using Omniture Genesis. These Genesis integrations are already pre-wired and make implementation a snap (though there are cases where you may want to do a custom integration or tweak the Genesis integration). You can talk to your Omniture account manager or account exec to learn more about Genesis.

    Regardless of how you decide to do the implementation, here is what I recommend that you implement:

    1. Set three custom Success Events for Positive Page Ratings, Negative Page Ratings and Neutral Page Ratings. These Success Events should be set on the “Thank You” page after the visitor has provided a rating.
    2. Pass the free form text/comment that website visitors enter into an sProp or eVar. If they do not leave a comment, pass in something like “NO COMMENT” so you can make sure you are capturing all comments. If you are going to capture the comments in an sProp, I recommend you use a Hierarchy variable since those have longer character limits vs. normal sProps, which can only capture 100 characters.
    3. Pass the actual page rating (usually a number from 1 to 5) into an sProp. I also recommend a SAINT Classification of this variable such that you classify 1 & 2 as Negative, 3 as Neutral and 4 & 5 as Positive. This classification should take less than 5 minutes to create…
    4. Use the PreviousValue plug-in to pass the previous page name to an sProp.
    5. Create a 2-item Traffic Data Correlation between the Previous Page (step #4) and Page Rating (step #3). This allows you to see what page the user was on when they submitted each rating.
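    Pulled together, the steps above might look something like the following page-code sketch. Note this is only an illustration: the event numbers, variable slots, and function name are assumptions (map them to whatever is free in your report suite), and in a real implementation the previous page name would be supplied by the PreviousValue plug-in rather than passed in by hand:

    ```javascript
    // Stub of the SiteCatalyst "s" object so this sketch is self-contained;
    // on a real page it comes from your s_code.js include.
    var s = { events: "", prop10: "", prop11: "", hier1: "" };

    // Hypothetical handler fired on the rating "Thank You" page
    function trackPageRating(rating, comment, previousPageName) {
      // Step 1: set a custom Success Event for each rating bucket
      if (rating >= 4) {
        s.events = "event10"; // Positive Page Rating
      } else if (rating === 3) {
        s.events = "event11"; // Neutral Page Rating
      } else {
        s.events = "event12"; // Negative Page Rating
      }

      // Step 2: capture the free-form comment in a Hierarchy variable
      // (longer limit than a 100-character sProp); "NO COMMENT" when empty
      s.hier1 = comment && comment.length > 0 ? comment : "NO COMMENT";

      // Step 3: capture the raw 1-5 rating (a SAINT classification later
      // maps 1-2 to Negative, 3 to Neutral, and 4-5 to Positive)
      s.prop10 = String(rating);

      // Step 4: record the page the visitor was rating
      s.prop11 = previousPageName;
    }

    // Example: a visitor rates the checkout page 2 out of 5 with no comment
    trackPageRating(2, "", "checkout:payment");
    ```

    With the rating sProp and the previous-page sProp correlated (step 5), the resulting reports can break ratings down by the page they were left on.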

    All in all, this is not too bad. A few Success Events and a few custom variables and you are good to go. The rest of this post will demonstrate some of the cool reports you can create after the above implementation steps are completed.

    Share Ratings
    As I mentioned previously, you [hopefully] have users that have become familiar with the SiteCatalyst interface. This means that they have Dashboards already created to which you can add a few extra reportlets. In this first example, let’s imagine that you want to graphically represent how your site is doing by day with respect to Positive, Negative and Neutral ratings. To do this, all you have to do is open the Classification version of the Page Rating report (can be an sProp or eVar – your call) and switch to the trended view. You should have only three valid values, and I like to use a stacked graph type with the percentage view to see how I am doing each day, as shown here:

    This graph allows me to get a quick sense of how my site is doing over time and can easily be added to any Dashboard.

    You can also mix your newly created Voice of Customer Success Events with other SiteCatalyst metrics. For example, while you could look at a graph/trend of Positive or Negative Comments by opening the respective Success Events, a better way to gauge success is to divide these new metrics by Visits to see if you are doing better or worse on a relative basis. The following graph shows a Calculated Metric for Negative Comments per Visit so we can adjust for traffic spikes:
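    For reference, the formula behind such a Calculated Metric is simply the custom event divided by Visits. A sketch, assuming your negative-rating success event is named “Negative Page Ratings” (the metric names will vary by report suite):

    ```
    Negative Comments per Visit = [Negative Page Ratings] / [Visits]
    ```

    The same pattern works for Positive or Neutral ratings per Visit.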

    Find Problem Pages
    Another benefit of the integration is that you can isolate ratings for specific pages. The first way to do this is to see which pages your visitors tend to rate positively or negatively. In the following report, you can open the Rating variable report (or Classification of it as shown below) and break it down by the Previous Page variable to see the pages that most often had negative ratings:

    This will then result in a report that looks like this:

    Alternatively, if you want to see the spread of ratings for a specific page, all you need to do is find that page in the Previous Page report and break it down by the Rating variable (or its Classification) as shown here:

    Share Comments
    As noted above, if you capture the actual comments that people leave in a variable, you will have a SiteCatalyst report that captures the first 256 characters of the comments visitors enter. This report duplicates scheduled reports from your Voice of Customer vendor in that it allows you to share all of the comments people are leaving with your co-workers. However, by doing this through SiteCatalyst, you gain some additional functionality that some VOC vendors don’t provide:

    1. You can create a Traffic Data Correlation between the Comments variable and the Previous Page variable so you can break down comments for a specific page. Therefore, if you have users that “own” specific pages on the website, you can schedule daily/weekly reports that contain comments only for those pages so they don’t have to waste time reading all of the comments left by visitors.
    2. You can use the Search filter functionality of SiteCatalyst to scan through all of the visitor comments looking for specific keywords or phrases that your co-workers may be interested in. In the example below, the user is looking for comments that mention the words “slow” or “latent” to be notified of cases where the visitor perceived a page load speed issue:

    Set Alerts
    Another cool thing you can do with this integration is set automated Alerts in SiteCatalyst so you can be notified when you see a spike in Negative Comments on your site. This allows you to react quickly to broken links or other issues before they affect too many visitors (and help avoid #FAIL posts in Twitter!). Here is an example of setting this up:

    Review Problem Visits using Omniture Discover
    Finally, if you have access to Omniture Discover, after you have implemented the items above, you can use Discover to do some amazing things. First, you can use the unlimited breakdown functionality to zero in on any data attribute of a user that is complaining about your site. For example, if you had visitors complaining about not being able to see videos on your site, you might want to see their version of Flash, Browser, OS, etc… or even isolate when the problem took place as shown here:

    Additionally, you can use Discover to isolate specific comments and watch the exact visit that led to that comment. This is done through a little-known feature of Discover called the “Virtual Focus Group.” This feature allows you to review sessions on your site and see the exact pages people viewed and some general data about their visit (i.e. Browser, GeoLocation, etc…). While not as comprehensive as tools like Clicktale, it is good enough for some basic analysis. Here is how to do this:

    1. Open Discover and find the comment you care about in the custom sProp or eVar report
    2. Right-click on the row and create a Visit segment where that comment exists
    3. Save the segment in a segment folder
    4. Open the Virtual Focus Group (under Pathing in Discover)
    5. Add your new segment to the report by dragging it to the segment area
    6. Click “New Visit” in the Virtual Focus Group
    7. Click on the “Play” button to watch the visit

    Now you can watch how the user entered your site, what pages they went to and see exactly what they had done prior to hitting the Voice of Customer “Thank You” page.

    Final Thoughts
    So there you have it, a quick review of some cool things you can do if you want to integrate your chosen Voice of Customer tool and Omniture SiteCatalyst. If you are interested in this topic, I have written a white paper with OpinionLab that goes into more depth about Voice of Customer Integration (click here to download it). If you have done other cool things, please let me know…

    Analytics Strategy, General

    Why Google is really offering an opt-out …

    When I first saw the news of Google’s opt-out browser plug-in spread around Twitter I thought “hmm, I wondered when we’d see this” and moved on since opt-out is more or less a non-issue — basically because in the grand scheme of things nobody really opts-out. For all the hand-wringing and navel-gazing people do on the subject of privacy online, I have never, ever seen any data that indicates that web users actively opt-out of tracking in significant numbers.

    Never.

    If you have it, bring it on as I’d love to see it. But in my experience the only people really truly and actively interested in browser- or URL-based opt-out for tracking are privacy wonks, extreme bit-heads, and some Europeans. The privacy wonks and bit-heads are who they are and are unlikely to ever change; the Europeans have privacy concerns for other reasons but I will defer to Aurelie to try and make heads or tails of what those reasons are.

    Still, it has been interesting to see some bright folks like Forrester’s Joe Stanhope offer some explanations about why Google might be doing this and what the ramifications might be. And it has been less interesting to see some of the fear mongering and hyperbole offered by Marketing Pilgrim’s Andy Beal in his post “Why your web traffic is going to nosedive thanks to Google” although I found Econsultancy balances things out with their straightforward and tactful post “Will opt-out threaten Google Analytics?”

    What Andy, Patricio, and to some extent Joe, apparently didn’t notice is that Google Analytics is about to make a big, big push into Federal Government web sites, and this browser-based opt-out is just a check-box requirement to satisfy the needs of said privacy wonks who for better or worse have the Administration’s ear (or some body part, you choose!)

    Yep, the browser opt-out isn’t actually for anyone … except for perhaps the Electronic Freedom (sic) Foundation and their ilk. Google is somewhat brilliantly checking a box now so that when the Office of Management and Budget (OMB) releases all new Federal guidelines for browser cookie usage later this year any Federal site operator who wants can immediately dump their existing solution and go directly to Google Analytics.

    You do remember that Google Analytics comes at the amazing deficit reducing price of ABSOLUTELY FREE. Even a Republican can get his or her arms around that price tag, huh?

    You betcha.

    “Hey wait,” you say, “what about the fact that Federal web sites will probably never get permission to track visitors over multiple sessions?” Good point, except did you know you can override Google Analytics’ _setVisitorCookieTimeout() and _setCampaignCookieTimeout() methods and set their values to zero (“0”), which effectively converts all Google Analytics tracking cookies to session-only cookies?
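    With the classic ga.js async snippet, that override is just two extra commands pushed onto the queue. A sketch only — the UA number is a placeholder, and note these are method calls that take a timeout in milliseconds:

    ```javascript
    // Classic ga.js async command queue; ga.js executes these when it loads.
    var _gaq = _gaq || [];
    _gaq.push(['_setAccount', 'UA-XXXXXX-1']);   // placeholder account ID
    _gaq.push(['_setVisitorCookieTimeout', 0]);  // 0 ms: visitor cookies become session-only
    _gaq.push(['_setCampaignCookieTimeout', 0]); // 0 ms: campaign cookie becomes session-only
    _gaq.push(['_trackPageview']);
    ```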

    Yep.

    Not to mention that the little birds who sing songs in only hushed tones suggest that OMB is about to take a much more reasonable stance on visitor tracking anyway. This is not a done deal, but the situation that most Federal site managers work under today — one where many sites are more or less forced to use out-of-date log file analyzers and most are hamstrung in their ability to analyze multi-session behavior — seems to fly directly in the face of President Obama’s efforts to make government more transparent and effective.

    I said as much just after he was elected, and then I said it again when I pointed out that Barack Obama should not fear browser cookies! Federal managers need modern, easy-to-use tools to improve the overall quality of government web sites.

    Now, I could be wrong about all of this — I am human, and like Joe Stanhope I have not heard word-one from Google about the opt-out app — but I am pretty good at connecting dots and these are big, obvious dots:

    1. Google loves data
    2. Feds have tons of data
    3. Feds have requirements necessitating privacy controls
    4. Google builds privacy controls
    5. Google gets Feds data

    This is actually pretty brilliant of Google if you think about it. Assuming you’re with me in my belief that Google Analytics isn’t about AdWords or Analytics or anything other than Google’s desire to have all the world’s data, then you’ll surely see that providing Federal web site operators a web analytics solution that simultaneously solves a multitude of analysis problems AND saves money is, well, pretty freaking brilliant.

    Don’t take my word for it. Here’s a list of sites in the .gov domain that people are tracking using our free, browser agnostic web analytics solution discovery tool. We have about 100 sites total, the majority of which don’t appear to have any kind of tracking code at all, and of these:

    • 12% are using Google Analytics exclusively already
    • Another 3% are using Google Analytics with Omniture (1%) or Webtrends (2%)
    • 6% are using Omniture (one of them, GSA.gov, in tandem with Webtrends)
    • 15% are using Webtrends (including GSA.gov in tandem with Omniture)
    • 63% appear to have no hosted analytics of any kind

    If I’m right the evidence will be obvious as more of these “no hosted analytics” sites begin to have Google Analytics tags. Sites like Census.gov, the EPA, FCC, FEMA, HUD, and even FTC might all start to take advantage of Google’s largesse (and willingness to provide a browser-based opt-out, don’t forget that!)

    What do you think?

    As always I welcome your thoughts, observations, reaction, and even anti-tracking-pro-privacy rants. If you are a Federal site manager with insight to share but unable to voice your position publicly, then out of respect I am happy to have you post anonymously as long as you provide a valid email address that I will confirm and then convert to “anon@anonymous.gov” to protect your identity.

    Analytics Strategy, Social Media

    Facebook Analytics: Part II – Vendor Solutions

    Earlier this week we described the Facebook Analytics Ecosystem and some of the ways in which businesses can go about measuring components of the social networking empire. Today, we reveal two key pieces of additional information that will help organizations: a. Understand the benefits of measuring Facebook (a necessary element in forming social marketing business objectives) and b. Identify vendors that offer measurement solutions for Facebook.

    DISCLOSURE: Analytics Demystified works with many web analytics vendors including some of those discussed in this post. We rarely disclose our clients publicly but for the sake of transparency wanted the reader to know that we do have a financially beneficial relationship with three of these vendors and a mutually beneficial relationship with all four.

    Three Business Benefits of Measuring Facebook

    To demystify the ways in which businesses can measure, understand and capitalize on the growing Facebook phenomenon, we identified three pillars of Facebook measurement. These three pillars identify the “what”, the “who” and the “cha-ching” of marketing within Facebook.

    1. Observe Interactions: What are people on Facebook doing?

        This essential component of Facebook measurement includes the ability to track anonymous user information such as visits, friends, comments, likes and exposure across pages, custom tabs and applications. It’s not as easy as it sounds.

    2. Understand Demographics: Who are all these people on Facebook? In addition to knowing what they do, it’s important to know who Facebook users are to segment the massive population on attributes such as user ID, gender, age, education level, work experience, marital status or geographic information. Facebook is very protective of user privacy and many limitations apply here as well.

    3. Impact Conversions: How can businesses cash in on Facebook marketing? This includes the ability to associate impressions and exposure within Facebook back to conversion events on external sites. And the ability to target advertising within Facebook based on observations and demographic data, while maintaining the ability to track viewthroughs and conversions offsite. This too requires extensive development and fancy footwork to make it happen, as we’ll explain shortly.

    Vendor Capabilities For Measuring Facebook

    As mentioned in Part I of our series on Facebook Analytics, several of the major web analytics vendors are vying for position to deliver Facebook measurement capabilities. While we have not yet reached Facebook directly for comment, we concluded that no single vendor is likely to gain a long-term competitive advantage over the rest of the market for measuring Facebook. The partnership established between Omniture and Facebook does provide some short-term gains because Omniture is able to leverage a direct relationship with Facebook developers to fully utilize data provided by existing APIs; still, all of the vendors interviewed for this research informed us that they were actively engaged in talks with Facebook. Further, Analytics Demystified strongly believes that it is not in Facebook’s best interest to lock into exclusive vendor agreements or partnerships because of the risk of alienating significant portions of their business population using disparate tools.

    Here’s what we know:

    Facebook Insights

    Facebook Insights offers aggregate views of behavioral and demographic data across a number of areas within the ecosystem. The Wall Insights show behavioral and demographic info on unique Fan interactions and all Fan visits. While the offering is great for the price, one thing we heard repeatedly is that the data is sampled and slow in coming, sometimes delayed up to three days.

    To Facebook’s credit, they appear to be constantly working to improve the quality of insights they provide. Mashable broke unofficial news again this morning by reporting that Facebook is offering more analytics detail to page admins through weekly email alerts. These reports reveal new fan counts, page views, comments and likes over the week.

    Most useful for: Companies unwilling to invest in “pro tools” for Facebook measurement. Facebook Insights does offer value so if you’ve got no other measurement prospects then what they’re offering is better than no data at all.

    Coremetrics

    Coremetrics’ Facebook announcement described their ability to determine Facebook’s influence on site visits and conversions, which melds nicely with their impression attribution tool. This slick capability allows Coremetrics to reveal vendor, category, placement and item data collected from within Facebook back to the Coremetrics Analytics interface. They do this using image tags and claim that caching happens infrequently, yet they circumvent these occurrences with cache busters.

    Server-side rendering of image tags allows Coremetrics to segment and report on attributed data. This allows Coremetrics’ users to see how interaction with specific tabs led to web site engagement and conversions. While technically possible for them, Coremetrics hasn’t focused on reporting user interactions within Facebook in their interface. Instead, they’ve honed in on the ability to understand how the social networking site acts as a feeder channel to their customers’ primary web properties.

    Most useful for: Companies heavily focused on Facebook as an advertising channel. Coremetrics is a good choice for clients that are not heavily invested in Facebook, but more interested in understanding how it complements their other online acquisition marketing efforts.

    Omniture

    Omniture’s view on measuring Facebook is simple: Understand your audience > Target them with advertising > Optimize the message. They enable this by focusing on the custom tabs, apps and ads within Facebook. Their most recent announcements touted their partnership with Facebook to enable ad creation and demographic targeting directly within their Search Center Plus solution. This works through an Omniture Genesis integration that also enables even more granular behavioral and demographic data collection. We previewed each of these solutions in working demos, and both are scheduled for general release later this year (with the Genesis offering likely tied to the rumored announcements at Facebook f8.) Today, clients are targeting product ads using limited profile information, specifically gender.

    Omniture has also developed a “Facebook JavaScript” (FBJS) measurement library that allows them to track behavior data natively within Facebook, and despite competitors’ claims Omniture pointed out that they’ve been doing this since May of 2009. They also deploy output and image tags, though less frequently, and for permissioned applications Omniture is collecting a bevy of demographic data that will appear within Discover for slice-and-dice ability. They’ve also created default segments within Discover showing pathing reports for: visitors acquired from Facebook (conversion); visitors from Facebook (impressions); and known Facebook users (user association).

    Most useful for: Companies that have not yet fully determined what their approach towards Facebook will be. Given the breadth of their capabilities, Omniture is a good choice for companies looking to better understand how users interact with the platform and the demographic make-up of their audience in Facebook, and with the SearchCenter Plus release, Omniture has the potential to dramatically improve customers’ ability to purchase laser-targeted advertising on the platform.

    Unica

    Unica has been noticeably quiet during the Facebook Analytics Wars but we’re not shy so we called ‘em out and asked them to weigh in on their capabilities. It turns out that they too have been measuring Facebook for some time using both dynamic and static image tags. They’re collecting strictly according to Facebook’s published rules but include unique user IDs and other attributes such as friend counts from visitors to their custom tabs. They also get app data for average viewing time, views, visits and visitors – all passed to Unica’s NetInsight interface.

    Unica is taking a very conservative approach to the demographic data and like some others, waiting for a ruling from Facebook before developing capabilities in that area. While that capability is waiting in the wings, Unica’s longer term vision may include integration with their new search technology and conversion pathing visualizations.

    Most useful for: Companies using Unica’s other Affinium products and companies needing in-house analytical capabilities. Unica customers looking to develop or advertise within Facebook should explore the vast customization possibilities with Unica directly.

    Webtrends

    Webtrends is aggressively working to deliver social analytics solutions to their customers and also walking the Facebook talk. Their own corporate Facebook fan pages are the most developed of all vendors interviewed and they use these pages to test concepts and showcase capabilities. Webtrends also takes a conservative approach to data collection and privacy by adhering strictly to the letter of Facebook law, thus collecting and displaying fewer demographic attributes.

    In their own words Webtrends has been “throwing the book” at the Facebook API to obtain as much data as their published documentation allows, which includes: views, visits, bounce rates and time on site for Facebook shares, ads, apps and custom tabs. Webtrends refutes the long-term feasibility and accuracy of image tags and cache busting techniques within Facebook. And they’ve responded by developing a proprietary solution that uses a data call to pass parameters from the data collection API. This method captures all the typical data as well as flash, pop-ups and other custom fields with the potential to do a whole lot more if the data collection restrictions ease. But before you go snooping for details, Webtrends informed us they have filed a patent for this new method of data collection.

    Most useful for: Companies that want to gain deep visibility into interactions within the Facebook ecosystem. Webtrends has the potential to be very useful for social media marketers who are actively developing and tracking social media behavior in Facebook.

    Questions to ask your vendor…

    While some aspects of these Facebook measurement solutions have been around for a while, they are still very much nascent. Nearly all of the capabilities described above – as best we can tell – are deployed via customized consulting engagements with each vendor and likely will be for the foreseeable future. Keep this in mind as you think about pricing, development resources and timing.

    Also, because Facebook is changing its rules and these solutions are largely custom consulting jobs, please don’t even think about buying anything before you see it in action. Have the vendor demonstrate the functionality you’re looking for using live customer data. While mock data and in-house examples are fine for some purposes, ask to see real-world data or decide whether you want to be the test subject.

    Additionally here are a few questions that we recommend you pose to vendors when seeking out a Facebook Analytics solution:

    1. How long have you had an active measurement solution in place for Facebook?
    2. How many active customers do you have using your Facebook measurement capabilities?
    3. Can we speak with two or three of your customers actively using your Facebook measurement capabilities?
    4. Do you adhere to Facebook’s published data collection, storage and privacy regulations?
    5. Are you using your solution to measure your own Facebook efforts? Can we see your data?
    6. Do you have documented PHP and FBJS libraries that we are able to deploy on our own?
    7. How long, on average, do your Facebook measurement deployments take start to finish?
    8. Do I need to be a customer to purchase your Facebook measurement solution?
    9. Which Facebook profile data can you import into your application? Can we see it in your application?
    10. Which of your solutions are required to leverage your Facebook measurement solution?

    As always I welcome your comments, thoughts, and opinions about this exciting aspect of digital measurement. And if you think we got something wrong, please do let us know!

    Analytics Strategy, Conferences/Community

    Web Analytics Wednesday: Free and Independent!

    If you are one of the thousands of people who have attended one of our Web Analytics Wednesday events over the past few years, well, thank you! Thank you for showing your support of the web analytics community, your local community, and the practice of web analytics in general. I had no idea that our execution of June’s idea would progress to near the point it has … touching so many people and providing a gateway to jobs, employees, and all kinds of new ideas.

    That said, two challenges have emerged recently and I felt like a quick blog post that everyone could reference would be the best way to deal with each. In no particular order:

    1. Web Analytics Wednesdays are designed to be a free event. It has come to my attention that some local chapters of WAW are charging people to attend events. In most (probably all actually) cases these fees are designed to offset the cost of food or drinks, but here’s the thing: we have tons of money for Web Analytics Wednesday and we can almost certainly get more if we need it! If you find yourself in the position of having to ask local members for $10 for an event … please please please email me directly and let’s find you money! I am pretty creative, and the 2010 Global Sponsors have already donated very generously, so let me help you make a totally free event if at all possible, please!
    2. Web Analytics Wednesdays need to be run thoughtfully when held in conjunction with Web Analytics Association events. This gets back to the open-to-all atmosphere of Web Analytics Wednesday, but it has been brought to our attention that some WAA country hosts in Europe have been holding joint WAA + WAW events. This is excellent and wonderful, except if it happens at the expense of A) the global agreement between the WAA and Web Analytics Wednesday and B) the ability for anyone — WAA or not — to participate.

    The second point merits additional explanation. Web Analytics Wednesday, as many of you are already aware, is an independent entity created by Analytics Demystified, not the Web Analytics Association. Because it is a “community event” many people mistakenly assume it is WAA but it is not and never has been. We maintain WAW as a private entity because A) we believe it needs to remain open to all, not just those folks able to justify and afford the Association’s $199 annual fees and B) honestly, it’s a lot easier to get financial support for these events as an independent entity.

    To clarify this, a few years back June and I hammered out an agreement between the Association and WAW. Without boring you with the details, the agreement specifies that “Web Analytics Wednesday” is an independent brand, that all WAW registrations will occur on our web site and system, and that WAW will be open to all comers, not just WAA members. It’s an awesome agreement because it allows the Association access to WAW events around the globe without needing to have any infrastructure.

    The agreement also totally, totally supports local WAA events that want to have a social function as well! If a WAA coordinator or country manager wants to have a “social event” after a sanctioned WAA event that requires registration they have two very simple options:

    1. Call the event Web Analytics Wednesday, create the event on our platform, advertise the event for anyone and everyone who wants to attend, and ask people to sign up to participate at the official WAW web site;
    2. Call the event anything other than “Web Analytics Wednesday”

    Easy, huh?

    All we are seeking to do is ensure that “Web Analytics Wednesday” continues to be known as a totally free event, open to all comers regardless of financial disposition and willingness to support any association, vendor, or technology. And to that end we are working as hard as possible to provide resources — financial and otherwise — to event planners across the globe, working with great organizations like the WAA, and working with the brilliant and wonderful WAW hosts who have made Web Analytics Wednesday the amazing event it is.

    Personally I’m looking forward to getting to know the new WAA Executive Director and working to ensure bidirectional compliance with the long-standing agreement between WAA and WAW. I know the agreement has the board’s support, and we hope the spirit of the agreement continues to maintain the community’s support as well.

    I welcome questions, comments, and concerns, and with the Association’s permission I am happy to provide or publish a copy of the agreement between WAA and WAW (but do need the Association’s permission as it is a valid legal document.)

    Analytics Strategy, Social Media

    Web Analytics Tracking on a Facebook Page

    I’ve been on a quest now for several months to crack the code of how to get web analytics tracking on a Facebook fan page. My (and our clients’) desire to do so shines an interesting light on the way that social media has blurred the concept of a “web site.” Back in the day, it was pretty simple to identify what pages you wanted to track: if the user perceived the page as being part of your site, you wanted to track that page with your web analytics software (even if it was an area of your site that was hosted by some other third party that had specialized capabilities like managing job openings, events, or discussion forums).

    Social media, and Facebook in particular, is starting to blur those lines. If your company manages a branded fan page on Facebook, and that page is a place through which your customers and target customers actively engage with the brand, isn’t it acting a bit like your web site? Clearly, a Facebook page is not part of your site, but it’s a place on the web where consumers actively engage with brands, both to give and receive brand-related content. It acts a lot like a traditional web site in that regard.

    As companies begin to invest more heavily in Facebook pages — both through creative development and staff to engage with consumers who interact with their brand through a fan page — there is an increasing need to have better visibility into activity on those pages. I wrote an entire post on the subject of Facebook measurement back in January, and I’ve had to update it several times since then as Facebook has rolled out changes and as I’ve gotten a bit deeper into the web analytics aspects of that tracking.

    Just last week, Webtrends announced some damn slick enhancements to Analytics 9 that allow not only tracking well beyond what Facebook Insights offers, but that also bring in some specific (anonymous) user information so that the traffic can be segmented in useful ways (the post on mashable.com shows some screen captures of the resulting data). I fully expect that Omniture will come out with something comparable as soon as they can, but I don’t think they have that level of tracking yet (if you know differently, please leave a comment to let me know). [Update: Coremetrics announced some new Facebook tracking capabilities shortly after this post was published.]

    My one concern with the Webtrends solution is that, as best as I can tell, it requires the tracked pages to use a Facebook application that will pop up an “Allow Access?” question to the user — the user has to indicate this is okay before getting to the content on the page. Lots of applications have this, but, at Resource Interactive, we’ve also had lots of clients for whom we have built very rich and interactive experiences on their fan pages…without requiring anything of the sort. If the access is needed to enable the application to deliver value to the user, then this is fine, and the improved trackability is just scrumptious gravy that comes along for the ride. If the access is needed just for tracking, then I would have to think long and hard about it — data capture should always be somewhere between excruciatingly minimally visible to the user and not visible at all.

    The question, then, is, “What can be tracked unobtrusively, and how can it be done?” This post will attempt to answer that question.

    Why Is It So Tricky in the First Place?

    Facebook, largely for privacy reasons, locks down what can happen on its pages. It may make your head hurt (it certainly makes mine) to understand all of the cans vs. cannots for different scenarios, but I’ll take a crack at a short list. There are two basic scenarios that a customer might experience as a “tab on a brand’s page:”

    • The brand can add a tab to the page and drop some form of Facebook application into it; in this scenario, iFrames are not allowed, and Javascript cannot be executed
    • The brand can make a separate application, and, on the “application canvas,” they can drop an iFrame, and Javascript can be executed within that iFrame; but, since the application canvas cannot exist “in a tab,” the design for the page has to include tabs to mimic the fan page, which is a bit clunky and raises some other user experience challenges

    Okay, so that was easy enough…assuming you’re following the custom tab / application / application canvas terminology. Both of these scenarios allow the embedding of Flash objects on the page.

    Facebook doesn’t allow Javascript, but it does allow its own similar scripting language, called FBJS (these tabs also use “FBML” rather than HTML for developing the page — it’s similar to HTML but not identical).

    What all of this means is that it’s not as simple as “just drop your web analytics page tag on the page” and you’ll get tracking. But that doesn’t mean you’re entirely SOL. This post is almost entirely geared towards custom Facebook tabs — and, really, it assumes that the content on those pages is based on an FBML application.

    Tracking Basic Visits and Page Views for a Custom Tab?

    We’ve cracked this to varying degrees for two different web analytics tools: Google Analytics and Webtrends. We haven’t had a pressing need to tackle it for anything else, but I’m pretty sure the same principles will apply and we’ll be able to make it happen. In both cases, the approach is pretty much the same — you need to have the FBML and FBJS on the page make an image call to the web analytics program. To pull it off, you do need to have a good understanding of how web analytics tools collect data, which I wrote an extensive post about a few days ago.

    In the case of Webtrends, the simplest thing to do is treat the page like a page where every visitor who comes has Javascript disabled in their browser. I’ll cover that later in this post.

    For Google Analytics, things are a little dicier because Google Analytics doesn’t have out-of-the-box “noscript” capabilities. You have to figure out all of the appropriate parameter values and then just make a full image call (again, reference the link above for a detailed explanation of what that means). You’re not going to get all of the data that you would get from running the standard page tag (which I’ll touch on a bit more later in this post), but you can certainly get page views and unique page views with a little FBJS work.

    Start out by creating a new Google Analytics UA number for your Facebook tracking. This will give you a profile with a new ID of the form: UA-XXXXX-YY. You will have to provide a domain name, but what that domain name is is immaterial — “<brand>.facebook.com” makes sense, but it can really be anything you want.

    Then, it’s just a matter of figuring out the list of values that you are going to tack on as parameters to the Google Analytics image call (http://www.google-analytics.com/__utm.gif). Below are some tips on that front (refer to the Google Analytics documentation for a deeper explanation of what each parameter is), with the bolded ones being the ones that I’ll discuss in greater detail:

    • utmwv: 4.6.5 (or a newer version — I don’t think it’s critical)
    • utmn: needs to be a random number between 100000000 and 999999999 (more on this in a bit)
    • utmhn:  <brand>.facebook.com (or something else — again, not critical)
    • utmcs: leave blank
    • utmsr: leave blank
    • utmsc: leave blank
    • utmul: leave blank
    • utmje: leave blank
    • utmfl: leave blank
    • utmdt:  the title of the page (whatever you want to call it)
    • utmhid: leave blank
    • utmr: leave blank
    • utmp: a “URL” for the page
    • utmac: the Google Analytics ID you set up (UA-XXXXX-YY)
    • utmcc: __utma%3D1.<session-persistent ID>.1252000967.1252000968.1252000969.1%3B
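    Pulling the non-blank parameters above together, here is a minimal sketch in plain Javascript of assembling the image request (the hostname, page name, faux URL, and UA-XXXXX-YY account ID are all placeholders to substitute with your own; utmcc is omitted for brevity, and in a real tab the same logic would need to be expressed in FBJS):

```javascript
// Hypothetical values: substitute your own faux hostname, page names, and UA number.
var params = {
  utmwv: '4.6.5',                                   // tag version
  utmn: String(Math.floor(Math.random() * 900000000) + 100000000), // cache-busting random number
  utmhn: 'mybrand.facebook.com',                    // faux hostname
  utmdt: 'My Custom Tab',                           // page title
  utmp: '/facebook/my-custom-tab/',                 // faux page "URL"
  utmac: 'UA-XXXXX-YY'                              // your Google Analytics account ID
};

// Build the query string, URL-encoding each value (spaces become %20, slashes %2F).
var pairs = [];
for (var key in params) {
  pairs.push(key + '=' + encodeURIComponent(params[key]));
}
var gifUrl = 'http://www.google-analytics.com/__utm.gif?' + pairs.join('&');
```

    Requesting gifUrl as an image is all it takes for the page view to be counted.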

    This is as simple as it gets. Obviously, all of the “leave blanks,” as well as the limited number of “cookie values” being passed, mean that you’re not going to get nearly as rich information about the visitors to this tab (you should be able to just eliminate the “leave blank” parameters entirely from the image call). You will get page views and unique page views, and you can set up goals and funnels across tabs if you want. You can also start getting a little fancier and inserting campaign tracking parameters and other information, but start here and get the basics working first — you can always augment later (and please come back and comment here with what you figure out!).

    For the four bolded parameters in the list above, two are ones that you will predefine for the tab itself — they’re essentially static — and two are ones that will require a little FBJS magic to make happen.

    Let’s start with the two static ones:

    • utmdt: this is normally the <title> tag for the page that is being visited; you can make it any plain English set of text you want, but you need to replace spaces and other special characters with the appropriate URL encoding
    • utmp: this is the URL for the page; you certainly can navigate to the custom tab in Facebook and use that, but I suggest just making it a faux URL, similar to how you would name a virtual pageview when doing onclick tracking; again, you will need to make this an appropriately URL encoded value (that mainly means replacing each “/” in the URL you come up with with “%2F”)

    The two other values require a little more doing, although it’s apparently pretty straightforward with FBJS (if you’re not a Javascript / FBJS jockey, as I’m not, you may need to track down a willing collaborator who is):

    • utmn: the sole purpose of this value is to make the overall GIF request a “new” URL; it’s a random (or, at least, quasi-random) number between 100000000 and 999999999 that should change every time there is a new load of the page
    • utmcc: the main thing you want to do here is generate a value between 1000000000000000000 and 9999999999999999999 that will stay with the visitor throughout his visit to Facebook. The other values in the __utma subparameter of utmcc are various date-stamps; if you want to get fancy, you can try to populate some of those as well; overall, utmcc is supposed to be a set of cookie values that persists on the user’s machine — we’re not actually dropping a cookie here, which means we’re not going to be able to track any of the sorts of “lifetime unique visitors”-dependent measures within Google Analytics (that includes “new vs. returning” visitors — everyone’s going to look like a new visitor in your reporting)
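    A sketch of how these two dynamic values might be generated in plain Javascript (the date-stamp handling is deliberately simplified to a single timestamp; FBJS would need the equivalent logic):

```javascript
// utmn: quasi-random cache-buster, regenerated on every page load.
function makeUtmn() {
  return Math.floor(Math.random() * 900000000) + 100000000; // 100000000..999999999
}

// Visitor ID for the __utma subparameter of utmcc: generate once per visit
// and reuse it for every image call so the visit hangs together.
function makeVisitorId() {
  // 19-digit value between 1000000000000000000 and 9999999999999999999;
  // built as a string to stay safely inside Javascript's number precision.
  var id = String(Math.floor(Math.random() * 9) + 1);
  for (var i = 0; i < 18; i++) {
    id += String(Math.floor(Math.random() * 10));
  }
  return id;
}

// utmcc: the same Unix timestamp stands in for all three date-stamps here.
function makeUtmcc(visitorId) {
  var now = Math.floor(Date.now() / 1000);
  return '__utma%3D1.' + visitorId + '.' + now + '.' + now + '.' + now + '.1%3B';
}
```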

    Make sense? I built a spreadsheet that would concatenate values I’d populated for these variables, which just isn’t pretty enough to share. But, you just need to tack all of these values together as I described in my last post and drop that as an image call on your custom tab.

    This won’t work for every tab — you can’t do it on your wall or your Info tab or other pre-defined, unformattable tabs, but if you create a new tab and drop an FBML application in it, you can go nuts with this.

    [Update: At almost the exact same time that this post went live, an e-mail hit my inbox with a link to a Google Analytics on Facebook post that I failed to turn up during my research (the post is only a week old, and most of my research happened prior to that). This post includes a handy link generator which looks really promising and helpful.]

    Tracking Actions within a Tab

    Now, suppose you’ve got your custom tab, and you’ve got tracking to the tab working well. But, you’ve dropped some Flash objects on the tab, and you want to track interactions within Flash. You’ve got two options here:

    • Just use the Actionscript API for Google Analytics — as I understand it, this works fine; I’ve also heard, though, that this adds undue weight to the app (35 KB), and that it’s not super-reliable; but, if you or your Flash developer is already familiar with and using this approach, then knock yourself out
    • Manually generate image calls for each action you want to track — this really just means follow the exact same steps as listed in the prior section, but use Actionscript rather than FBJS for the dynamically generated pieces

    Because I work with motivated developers, we went the latter route and built a portable Actionscript class to do the heavy lifting.

    Presumably, you can also use FBJS to track non-Flash actions as well, depending on what makes sense.

    What About Webtrends?

    The same principles described above apply for Webtrends. But, Webtrends has an out-of-the-box “<noscript>” solution, so, rather than reverse-engineering the dcs.gif, you can use a call to njs.gif:

    http://statse.webtrendslive.com/<your DCS ID>/njs.gif?dcsuri=/<virtual URL for the page>&WT.ti=<name for the page>

    (I did confirm that you can leave off the WT.js parameter that is listed in the Webtrends documentation for using njs.gif).

    It also seems like it would make sense to tack on a random number in a parameter at the end (such as “&nocache=<random number>”) just to reduce the risk of caching of the image request (similar to what’s described for the utmn parameter for Google Analytics above). I haven’t even asked for confirmation that that would be useful, but it seems like it would be, and it’s just a parameter that Webtrends will ignore in the processing.
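    As a sketch (the DCS ID and page names here are made-up placeholders, and, as noted, the nocache parameter is my own unvetted addition), the whole njs.gif call could be assembled like this:

```javascript
// Hypothetical DCS ID and page names: substitute your own.
function buildNjsGifUrl(dcsId, pageUrl, pageName) {
  // Random cache-buster; Webtrends should simply ignore this parameter.
  var nocache = Math.floor(Math.random() * 900000000) + 100000000;
  return 'http://statse.webtrendslive.com/' + dcsId + '/njs.gif' +
    '?dcsuri=' + pageUrl +                       // virtual URL for the page
    '&WT.ti=' + encodeURIComponent(pageName) +   // name for the page
    '&nocache=' + nocache;
}

var url = buildNjsGifUrl('dcs222abc10000xyz', '/facebook/my-custom-tab/', 'My Custom Tab');
```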

    Chances are, you’ll want to set up a new profile in Webtrends that only includes this Facebook traffic (see my opening ramble about Facebook pages being quasi-web sites), and you’ll probably want to filter this traffic out of your various existing profiles. That may mean you need to think about how you are naming your pages to make for some easy Include and Exclude filter creation.

    (Oh, yeah, and the “statse.webtrendslive.com” assumes you’re using Webtrends Ondemand — if you’re running Webtrends software, you’ll need to replace this with the appropriate domain.)

    As you’ve probably deduced by now, we haven’t really vetted our “njs.gif” usage…yet, but we’ve gotten a lot of head nods from within Webtrends that this should work. I’ll update this post once I’ve got confirmation, but I wanted to go ahead and get the information published so that someone else can run with it and maybe figure it out in more detail and let me know!

    Webtrends also, apparently, allows Actionscript to interact with the Webtrends REST API directly, which, allegedly, is an option for action tracking within Flash on Facebook pages. We haven’t confirmed that, and, in what little looking I did on http://developer.webtrends.com, I didn’t turn up any particularly useful documentation, so either that’s not widely in use, or I’m a lousy user of their search function.

    It’s Not as Tough as It Looks…but It’s Not Perfect

    This may seem a little overwhelming, but the mechanics are really pretty straightforward once you dive in and start playing with it.

    To test your work, you don’t need to actually code up anything — just set up your new profiles (Google Analytics or Webtrends), build up some image request strings, and start hitting them. You can manually swap out the “dynamic” values — even have some friends or co-workers hit the URLs as well. To introduce a bit of rigor, it’s worth tracking the specific image requests you’re using, how many times you hit them, and from what browser. That way you can compare the results in your web analytics tool to see if you’re getting what you’d expect. Then you can move on to actually getting the calls dropped into a Facebook page.

    Realize, too, that this whole process is a dumbing down of what normally happens when Javascript or Actionscript is used to tell your web analytics tool that someone has visited the page. Your new vs. returning traffic is going to be inaccurately skewed heavily towards “new.” You’re not going to get browser or OS details (much less whether Javascript is enabled or not). But, you will get basic page views and visits/unique pageviews, and that’s something! You’re stepping back into the Bronze Age of web analytics, basically, but that’s better than the Stone Age, and you’re doing it within social media!

    I suspect that you can get a little fancier with FBJS and start to get more robust measurement. As a taste of that, we actually got some tracking working on users’ walls in Facebook, which was both wicked and rad (as the cool kids in the 80s would have said):

    • We posted a status update that was, basically, an invitation to click into a Flash object; if the user clicked into it, then a Flash-based box expanded on their wall, and Google Analytics would be passed an image call to record a page view for the activity
    • We also passed in a “utmv” value, which we then used to set up segments within Google Analytics — the idea being that each of these status updates will be a separate “campaign,” but our campaign tracking will be through custom segments within Google Analytics; that will enable all of our reporting, including the conversion funnels we set up, to be set up once and then re-used through Google Analytics segmentation

    Neat, huh? Or, as we’d say in the rural Texas town where I grew up, it’s slicker than greased baby poop. This is giving us highly actionable data — enabling us to see how people are interacting with these experiences through Facebook and enabling us to try different approaches to improve conversion over time! (To be clear, we’re not capturing personally identifiable Facebook information — exactly who is interacting is still invisible to us, which is as it should be).

    Fun stuff. If you’ve given anything along these lines a try (or if you’ve successfully taken a totally different tack), please leave a comment — I’d love to get other options added!

    Analytics Strategy

    All Web Analytics Tools Are the Same (when it comes to data capture)

    I started to write a post on using web analytics tools — Google Analytics, specifically, but with a nod to Webtrends as well — to track traffic to custom tabs and interactive elements on Facebook pages. But, as I started thinking through that content, I realized that I needed to back up and make sure I had a good, clean explanation of a key aspect of the mechanics of page tag-based web analytics tools. I poked around on the interweb a bit and found some quick explanations that were accurate, but that really weren’t as detailed as I was hoping to find.

    Regardless of whether you’re trying to track Facebook or not, it’s worth having a good, solid understanding of these underlying mechanics:

    • If you’re a web analyst, understanding this is like understanding gravity if you’re a human being — there are some immutable laws of the internet, and knowing how those laws drive the data you are seeing will open up new possibilities for capturing activity on your site
    • If you’re a developer, then this will be a quick read, but understanding it will make you the hero to both your web analysts and (assuming they’re not glory hogs) the people they support with their analysis, because you will be able to suggest some clever ways to capture useful information

    By the end of this post, you should understand both the title and why the monster image-request URLs you’ll see later in the post are what make it so.

    I’ve been deep under the hood with both Google Analytics and Webtrends for this, but the same principles apply to all tools (because they’re all bounded by the Physics of the Internet). I’m going to talk about Google Analytics the most in-depth, because it has the largest market share (measured by number of sites tagged with it), and I’ll try to call out key differences when appropriate.

    Let’s start with a simple picture of how all of these tools work. When a visitor comes to a page on your site, the following sequence of events happens:

    1. Javascript figures out stuff about the visitor
    2. Javascript packages that info into a single string of information
    3. Javascript makes an image request with that string tacked on the end
    4. The web analytics tool reads the string and puts the information into a database
    5. The web analyst queries the database for insights

    Steps 2 and 3 are really the crux of the biscuit, but we need to make sure we’re all clear on the first step, too, before getting to the fun there.

    1 – Javascript figures out stuff about the visitor

    We all know what Javascript is, right? It’s one of the key languages that can be interpreted by a web browser so that web pages aren’t just static text and images: dropdown menus, mouseovers, and such. But, Javascript also enables some things to go on behind the scenes. The basic data capture method for any tag-based web analytics tool is to run Javascript to determine what page the visitor is on, what relevant cookies are set on the user’s machine, whether the visitor has been to the site before, what browser the visitor is using, what language encoding is set for the browser, the user’s screen resolution, and a slew of other fairly innocuous details. This happens every time a visitor views a page running the page tag. So, great — a visitor has viewed a page, and the Javascript has figured out a bunch of details about the visitor and the page. Now what? It’s on to step 2!

    (I realize I’m saying “Javascript” here, and most tools also have Actionscript support for tracking activity within Flash — for the purposes of this post, I’m just going to stick with Javascript, but I’ll get back to Actionscript in my next post!)

    2 – Javascript packages that info into a single string of information

    The next step is pretty simple, but it’s where the magic starts to happen. Let’s say the Javascript in step 1 had figured out the following information about a visitor to a page:

    • Site = http://www.gilliganondata.com
    • Page title = The Fun of Facebook Measurement
    • Page URL = /index.php/2010/01/11/the-fun-of-facebook-measurement/
    • Browser language = en-us

    Converting that info into a single string is pretty straightforward. Let’s start by pretending we’re going to put it into a single row in a pipe-delimited file. It would look like this:

    Site (hostname) = http://www.gilliganondata.com | Page name = The Fun of Facebook Measurement | Page URL = /index.php/2010/01/11/the-fun-of-facebook-measurement/ | Browser language = en-us

    Now, rather than using the pretty, readable names for each of the four characteristics of the page view, let’s use some variable names (these are the Google Analytics variable names, but the documentation for any web analytics tool will provide their specific variable names for these same things):

    • Site (hostname) –> utmhn
    • Page title –> utmdt
    • Page URL –> utmp
    • Browser language –> utmul

    So, now our string looks like:

    utmhn = http://www.gilliganondata.com | utmdt = The Fun of Facebook Measurement | utmp = /index.php/2010/01/11/the-fun-of-facebook-measurement/ | utmul = en-us

    We used pipes to separate out the different variables, but there’s nothing really wrong with using something different, is there? Let’s go with using “&” instead and eliminate the spaces around equal signs and the delimiters. The single string now looks like this:

    utmhn=www.gilliganondata.com&utmdt=The Fun of Facebook Measurement&utmp=/index.php/2010/01/11/the-fun-of-facebook-measurement/&utmul=en-us

    Now, we’ve still got some “special” characters that aren’t going to play nice in Step 3 — namely spaces and “/”s — so let’s replace those characters with the appropriate URL encoding (%20 for the spaces and %2F for the “/”s):

    utmhn=www.gilliganondata.com&utmdt=The%20Fun%20of%20Facebook%20
    Measurement&utmp=%2Findex.php%2F2010%2F01%2F11%2Fthe-fun-of-
    facebook-measurement%2F&utmul=en-us

    It looks a little messy, but it’s a single, portable string that has the exact information that was listed in the four bullets that started this section. While it might be painful to reverse-engineer this string into a more reader-friendly format by hand, it’s a snap to do programmatically (which is exactly what web analytics tools do…as we’ll discuss in step 4) or in Excel.
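    As a quick illustration of that programmatic snap, a few lines of Javascript will split the encoded string from above back into its readable key-value pairs:

```javascript
// The single, portable string built in the steps above.
var raw = 'utmhn=www.gilliganondata.com&utmdt=The%20Fun%20of%20Facebook%20Measurement' +
          '&utmp=%2Findex.php%2F2010%2F01%2F11%2Fthe-fun-of-facebook-measurement%2F&utmul=en-us';

// Split on '&' to get the pairs, then on the first '=' to separate key from
// value, decoding each value back to its reader-friendly form.
var decoded = {};
raw.split('&').forEach(function (pair) {
  var idx = pair.indexOf('=');
  decoded[pair.slice(0, idx)] = decodeURIComponent(pair.slice(idx + 1));
});
```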

    Before we move on, let’s tack one more parameter onto our string. This is something that is actually hard-coded into the Javascript, and it identifies which web analytics account this traffic needs to go to. In the case of this blog, that account ID is “UA-2629617-3” and the variable Google Analytics uses to identify the account parameter is “utmac.” I’ll just tack that on the end of our string, which now looks like:

    utmhn=www.gilliganondata.com&utmdt=The%20Fun%20of%20Facebook%20
    Measurement&utmp=%2Findex.php%2F2010%2F01%2F11%2Fthe-fun-of-
    facebook-measurement%2F&utmul=en-us&utmac=UA-2629617-3

    A subtle point: what we’ve really done above is to combine all the information into a single string with a series of “key-value pairs.” In the case of the first variable, the “key” is “utmhn” and the “value” is “www.gilliganondata.com.” Notice that both the key AND the value are included in the string. If you’ve worked with comma-delimited or tab-delimited files, then you might be wondering why the key is included. Why can’t the Javascript always pass in the variables in the same order, so that the web analytics server would know that the first value is the hostname, the second value is the title, and so on? There are at least three reasons for this:

    • It just generally makes the process more robust because it reaffirms to the server exactly what each value means at the point the server receives the information; the internet is messy, so hiccups can happen
    • Most “advanced” features when it comes to capturing web analytics data rely on tacking on additional parameters to the master string — by including both the key and the value for every parameter, that fanciness doesn’t have to worry about the order the parameters are passed in, AND it means the custom parameters get viewed/processed exactly the same way that the basic parameters do
    • The “key-value pairs separated by the & sign” are standard on the internet. Go to any online retail site and poke around, and you will see them in the URL. It’s kind of a standard way to transmit a series of variables onto the back end of a web page or image request, and that’s really all that’s going to happen in step 3

    We’ve got our string, so now let’s do something with it!

    3 – Javascript makes an image request with that string tacked on the end

    Somehow, we need to pass that string back to the web analytics server. We do that by making an image call. In the case of Google Analytics that image request is always, always, always exactly the same, no matter the site using Google Analytics:

    http://www.google-analytics.com/__utm.gif

    Just like we covered in the “online retail site” URL structure discussion at the end of the last section, we’re going to tack some parameters on the end of the __utm.gif request. The standard way to take a base URL and tack on parameters is to add a “?” followed by one or more key-value pairs that are separated by an “&” sign. Lucky for us, the “&” sign is what we used when we were building our string in the last section! So:

    http://www.google-analytics.com/__utm.gif

    +

    ?

    +

    utmhn=www.gilliganondata.com&utmdt=The%20Fun%20of%20Facebook%20
    Measurement&utmp=%2Findex.php%2F2010%2F01%2F11%2F
    the-fun-of-facebook-measurement%2F&utmul=en-us&utmac=UA-2629617-3

    =

    http://www.google-analytics.com/__utm.gif?utmhn=www.gilliganondata.com&
    utmdt=The%20Fun%20of%20Facebook%20Measurement&utmp=%2F
    index.php%2F2010%2F01%2F11%2Fthe-fun-of-facebook-measurement%2F&
    utmul=en-us&utmac=UA-2629617-3

    Wow, that looks messy, but it just looks messy — it’s actually quite clean! In reality, there are way more than five parameters tacked onto the image request. As a matter of fact, the request above would really look more like this:

    http://www.google-analytics.com/__utm.gif?utmwv=4.6.5&utmn=1516518290&
    utmhn=www.gilliganondata.com&utmcs=UTF-8&utmsr=1920x1080&utmsc=24-
    bit&utmul=en-us&utmje=1&utmfl=10.0%20r45&utmdt=The%20Fun%20of%20
    Facebook%20Measurement%20%7C%20Gilligan%20on%20Data%20by%20Tim
    %20Wilson&utmhid=1640286085&utmr=http%3A%2F%2Fgilliganondata.com
    %2F&utmp=%2Findex.php%2F2010%2F01%2F11%2Fthe-fun-of-facebook-
    measurement%2F&utmac=UA-2629617-3&utmcc=__utma%3D116252048.
    1573621408.1267294551.1267294551.1267299933.2%3B%2B__utmz%3D
    116252048.1267294551.1.1.utmcsr%3D(direct)%7Cutmccn%3D(direct)%7C
    utmcmd%3D(none)%3B&gaq=1

    You can get a complete list of the Google Analytics tracking variables from Google (if you’re really into this, check out the utmcc value — that actually is a single parameter that includes multiple sub-parameters: each cookie name and value are joined by “%3D” (a URL-encoded equals sign) and terminated by “%3B” (a URL-encoded semicolon) rather than being separated by an “&”; these are the user cookie values, which you can find towards the end of the long string above if you look for it). You can inspect the specific calls using any number of tools. I like to use the Firebug plugin for Firefox, but Fiddler is another free tool, and Charles is the standard tool used at my company. And, there’s always WASP to provide the “clean” view of the parameters (I use WASP heavily…unless I’m trying to reverse-engineer the specific calls being made for some reason).
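    To make the utmcc sub-parameters concrete, here is a quick Javascript sketch that unpacks the __utma portion of the cookie string from the long request above (“%3D” decodes to an equals sign and “%3B” to a semicolon):

```javascript
// The __utma portion of the utmcc value from the request above (URL-encoded).
var utmcc = '__utma%3D116252048.1573621408.1267294551.1267294551.1267299933.2%3B';

// One pass of decoding turns %3D back into '=' and %3B back into ';' ...
var cookie = decodeURIComponent(utmcc);

// ...and the __utma value itself splits on '.' into its six sub-fields:
// domain hash, unique visitor ID, three date-stamps, and the session count.
var fields = cookie.replace('__utma=', '').replace(';', '').split('.');
```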

    The Javascript makes a request for that URL. This is the infamous “1×1 image.” Just to sharpen the edges a little bit on some common misconceptions about that image request:

    • The request for the image is what matters — while the 1×1 image will get delivered back, by the time http://www.google-analytics.com actually sends out the image, the page view has already been counted. As a matter of fact, if there were no __utm.gif image, the traffic would still get counted simply by virtue of the fact that the Google Analytics server received the image request. As it happens, some other little user experience hiccups can happen if there’s no actual image, but the existence of the file matters not at all from a data capture perspective!
    • Yes, you can actually just request the image directly from your browser. Go ahead — here’s the URL as a hyperlink: http://www.google-analytics.com/__utm.gif (yeah, it’s something of a letdown, but now you can say you’ve done it)
    • The image isn’t a 1×1 pixel image so that it’s small and not noticed by the user. If Google got a wild hair to replace the __utm.gif image with a 520×756 pixel image of a psychedelic interpretation of the Mona Lisa…no one would ever see the change (unless they were doing something silly like calling the image directly from their browser as described in the previous bullet). The image gets requested by the Javascript, but it never gets displayed to the user. It’s sort of like a Javascript dropdown menu — the text for the dropdown gets loaded into the browser memory so that, if you mouse over the menu, the text is already there and can be displayed immediately. The __utm.gif request is the same way…except there’s nothing in the Javascript that ever actually tries to render the image to the user

    And one more point: While we’ve been talking about “image requests” here, it doesn’t have to be an image request per se. In the case of Google Analytics, it is. In the case of Webtrends, it is, too (the image is called dcs.gif). In the case of other web analytics packages, it’s not necessarily an image request, but it is a request to the web analytics server. What matters is understanding that there are a bunch of key-value pairs tacked on after a “?” in the request, and that’s where all of the fun information about the visit to the page gets recorded and passed.

    4 – Web analytics tool reads the string and puts the information into a database

    So, the web analytics server has been getting bombarded with the requests from Step 3. Can you see how straightforward it is for software to take those requests and split them back out into their component parts? That’s the easy part. Where the tools really differentiate themselves is how exactly they store all of that data — the design of their database and then how that data is made available for queries and reports by analysts.

    Back in the day (and I assume it’s still an option), Webtrends would make the raw log files available to their customers as an add-on service. That was handy — once we understood the basics of this post and the Webtrends query parameters, we were able to sift through for some juicy nuggets to supplement our “traditional” web analytics (these were in the days before Webtrends had their “warehouse” solution, which would have made the same information available).

    5 – Web analyst queries the database for insights

    Like step 4, this is an area where web analytics tools really differentiate themselves. In the case of Google Analytics, there is the web-based tool and the API. In the case of paid, enterprise-class tools, there are similar tools plus true data warehouse environments that allow much more granular detail, as well as two-way integration with other systems.

    Why Understanding This Matters

    You’re still reading, so maybe I should have made this case earlier. But, the reason this matters is because, once you understand these mechanics, you can start to do some fun things to handle unique situations. For instance, what do you do if you have Google Analytics, and you want to track activity somewhere where Javascript won’t run (like…um…your Facebook fan page — that’ll be my next post!). Or, more generally, if you’re Googling around looking for ways to address some sort of one-off tracking need, you’ll understand the explanations that you’re finding — these solutions invariably involve twiddling around within the framework described here.

    As I read back through this post before publishing it, I was struck by how far into the tactical mechanics of web analytics it is. The overwhelming majority of web analytics blog posts focus on step 5 and beyond — how to use the data to be an analysis ninja rather than a report monkey. Understanding the mechanics described here is a foundational step that will support all of that analysis work. I was incredibly fortunate, early in my web analytics career, to have an opportunity to run the migration from a log-based web analytics package to a tag-based solution. I was triply fortunate that I worked on that migration with two brilliant and patient IT folk: Ernest Mueller as the web admin supporting the effort, and Ryan Rutan, the developer supporting the effort — he was hacking the Webtrends page tag before the consultant who we had on-site to help implement it had finished his first day. Ernest drew countless whiteboard diagrams to explain to me “how the internet works” (those “immutable laws” I mentioned early in this post), while Ryan repeated himself again and again until I understood this whole “image request with parameters” paradigm.

    If you’re a web analyst, seek out these types of people in IT. A hearty collaboration of cross-discipline skills can yield powerful results and be a lot of fun. I had similar collaborations when I worked at Bulldog Solutions, and the last two weeks saw the same thing happening at my current gig at Resource Interactive. Those are pretty energizing experiences that leave me scratching my head as to why so many companies wind up with an adversarial relationship between “the business” and “IT.” But THAT is a topic for a whoooollllle other post that I may never write…

    Adobe Analytics, Analytics Strategy, Conferences/Community

    Want to meet Analytics Demystified?

    Whoa, I cannot believe it is nearly March already, can you? Seems like 2010 took off like a rocket and is only moving faster and faster every day, which is great if you’re like me and you prefer “hectic” to “easy going” and are happiest when you’re fully engaged. And speaking of being busy … my travel schedule in the next few months looks awesome and will let me meet even more great companies working to take a more strategic approach towards web analytics. If you’re going to be at any of the following events, email me directly and we can arrange a time to chat!

    Here are some details about where you can meet me in the coming months:

    • March 2nd to 4th, Omniture Summit, Salt Lake City, Utah. Next week I will be attending what has become the biggest party in all of web analytics, the Omniture Summit. Say what you want about Omniture, these guys know how to put on an amazing event, chock full of content, presentations, and amazing “extras” (for example, last year they had Maroon 5 play and Glenn “Big Baby” Davis hanging out at the concert, talking about the Internet and web analytics. How cool is that?)
    • March 9th, SearchFest, Portland, Oregon. On Tuesday, March 9th I will be here in the Rose City presenting with Aaron Gray in the afternoon. Our presentation is “Measuring Online Success: Top Down and Bottom Up” in which Aaron will examine success from a more tactical perspective (his forte!) and I will focus on the strategy and governance issues that all companies need to consider. If you’re in Portland and haven’t registered for SearchFest 2010 you can use the promo code: SPKR-SEMPDXSF1020 and save a little green.
    • The week of March 22nd I will be in Austin, Texas with a client and am hoping to cajole the local crew into having a Web Analytics Wednesday, Texas-style. I know Jennifer Day in Dallas will be mad that I’m not getting to Dallas first (since I owe Dallas after their amazing photo contest win … did you all see this?) If you’re in Austin STAY TUNED!
    • Moving into April, John and I will both be going to the Coremetrics conference, also in Austin, Texas during the week of April 26th. Details about our participation are still being finalized but suffice it to say we will be there for the Armadillo races, the awesome BBQ, and to learn more about Coremetrics’ momentum in the market. If you are a Coremetrics client please contact the company for details about their conference.
    • The first week in May John and I will be at Emetrics in San Jose. I will be delivering a keynote speech with Bryan Eisenberg similar to the “Top Down/Bottom Up” presentation I will give with Aaron Gray at SearchFest (and to be fair, the idea was Jim Sterne’s in the first place, since he is the “idea guy!”)  I will also be presenting something at the concurrent Conversion Conference put on by Tim Ash so look for details on that.
    • Also in May I will be at the Unica conference in Orlando, Florida during the week of May 17th. I’ve not been to a big Unica event and am pretty excited about this one. Details are still being worked out and if you’re a Unica customer you should reach out to the company directly for details on the conference.

    Phew, huh? Between that, client work, our Analysis Exchange efforts, and oh yeah, being a dad I’m more or less busy until summertime … which is not to say we’re not still open to new clients. We’re always happy to talk with companies about how John, Aurelie, and I can accelerate business successes through web analytics. If you’re ready to take a more strategic approach towards digital measurement, we’d love to talk to you!

    Finally, I wanted to mention a conference that many of you might not know about but your peers in the marketing group definitely do: The Online Marketing Summit. I have been a speaker at OMS for several years and have always been impressed but this time I was absolutely floored by the event. Over 800 B2B and B2C marketers from around the country converged on San Diego this week to hear some amazing content. If you do online marketing you really need to have a look at what OMS has to offer.

    Analytics Strategy

    Columbus Web Analytics Wednesday — Feedback Analysis

    It’s been a crazy month work-wise. As a result, I’ve been chalking up thoughts in my head that I’d love to get written down. One of those thoughts is actually around time management and prioritization, which I’ve been pondering in the context of, among other things, my blogging…but that’s not the subject of this post!

    Following our last Web Analytics Wednesday (and somewhat in preparation for our next one), I put out a survey to my list of 150+ past registrants. I used Google Documents for the survey, which is the second or third time I’ve used Google as a survey tool instead of the free version of Survey Monkey or some other service. I think I’m hooked. While you don’t have a whole lot of control over how the data gets stored in the underlying Google spreadsheet, the user experience is pretty clean, which I like. And the data can be exported straight from the Google spreadsheet and manipulated in Excel (I know I could theoretically evaluate it within the Google spreadsheet, but I’ve never managed to spend enough time with Google Docs to really have the agility I’d like there).

    Who Answered?

    I received 21 responses to the survey, which is a 13% response rate. Not bad! The main “profile” question I asked was how many Columbus WAWs the respondent had attended. The results showed a pretty even mix from “little to none” to “some or a lot:”

    That’s good, as I feel like the results are pretty representative of the population we’re trying to serve. I didn’t go deep in the survey as to company, industry, role, etc. —  we’ve got a pretty good feel for that, and I knew I wasn’t going to go nuts with trying to segment the results as part of the analysis.

    What Attendees Are Looking to Get Out of WAWs

    So, what are attendees looking to get out of Columbus WAWs? This question had 3 options for each category: “Not really,” “Sort of,” and “Very Much So” (my friends at Foresee Results would probably tell me that a 10-point scale without intermediate labels would have been a much cleaner method…but I’ve never claimed to be an expert in this sort of thing). The chart below shows the “Very Much So” and the “Sort of” responses (I segmented by the number of times people had attended and did not glean anything of note there):

    Networking came out at the top of the list, although it was a virtual tie with increasing web analytics knowledge. That’s great, as these are two of the core goals for WAWs. We need to keep doing a “pure networking” event here and there. A challenge with those events is sponsorship — if we get a sponsor other than the Web Analytics Wednesday Global Sponsors, we really need to give them a forum to talk. All of our past sponsors have been great about not making their presentations “sales pitches” — we get good, practical content from them. But, they get to solidify their positions as experts in an area.

    Dave Culbertson and I have discussed several times that WAWs seem to draw SEO/SEM-interested people as much as web analytics-interested people. The disciplines have a heavy overlap, so that’s not a surprise. The survey results back this up, so we will continue to incorporate search-oriented topics.

    I was a bit surprised by the low number of people who indicated “find a job,” as it seems like I talk to one or two people each month who are between opportunities. That may be the result of an imperfectly sampled population.

    And, it’s good to know that there’s a healthy interest in drinking good beer (although that puts a tough constraint on the venue selection, which I’ll touch on later).

    WAW Scheduling

    On a highly practical front, we occasionally get feedback that Wednesday evenings are a bad time for a person — we’ve had some past regulars who simply haven’t been able to attend due to commitments elsewhere on Wednesday evenings (pool league, hockey league, teaching CCD, etc.). When we first started WAWs in Columbus, we held them on Tuesdays for this very reason, but a survey last year showed a shift to Wednesdays would work better.

    In this survey, Wednesday dinners did come out at the top of the pack:

    Now, there very well may be survey bias in the responses to these questions, because we’ve been consistently holding WAWs on Wednesdays over happy hour/dinner, which means most of the people invited to participate in the survey had registered for an event at that time. But, that’s the only group I have easy access to in order to survey, so I’m running with it.

    As one more check (and somewhat just for the data visualization challenge of it), I cross-tabbed these two questions and put it on a bubble chart:

    Again, Wednesday dinners are the clear winner. What’s a little troubling is that only 2/3 of the respondents indicated this combination was good for them. We’ll have to grapple with that a bit — you can’t please all the people all the time, certainly, but we also don’t want to shut out people all the time due to structural conflicts. My take is that we can definitely steer clear of Fridays and we should stick with “after work” time slots. But, we may try mixing up the days of the week a bit.
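
    If you’re curious, the cross-tab itself is nothing fancy — it’s just counting how many respondents fall into each (day, time) combination. A minimal sketch in JavaScript (the data shape and field names here are made up for illustration):

```javascript
// Count survey respondents by (day, time) combination.
// Input: an array of responses like { day: 'Wednesday', time: 'Dinner' }.
// Output: a map from "day / time" to the number of respondents.
function crossTab(responses) {
  var counts = {};
  responses.forEach(function (r) {
    var key = r.day + ' / ' + r.time;
    counts[key] = (counts[key] || 0) + 1;
  });
  return counts;
}
```

    Each of those counts then becomes one bubble, positioned by day and time.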

    Data visualization side note: the way I represented the data above works okay, I think, but it also is a good exercise in showing one of the reasons that pie charts are evil. The number inside each circle shows how many respondents had answers that fell in both categories. Compare the size of a “1” to the size of the “14” — does it look to you like the larger circle is fourteen times as big as the smaller one? It doesn’t to me. In this case, the bubbles have the values labeled inside of them, partly because the pure visualization seemed misleading. Human beings are notoriously bad at interpreting 2-dimensional areas.
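
    Since a bubble’s perceived value lives in its area, the radius has to scale with the square root of the value if you want the sizing to be honest. A quick sketch (the function name is mine):

```javascript
// Area-true bubble sizing: to make a bubble's AREA proportional to its
// value, scale its RADIUS by the square root of the value.
function bubbleRadius(value, baseRadius) {
  return baseRadius * Math.sqrt(value);
}
```

    By that math, the “14” bubble should be only about 3.7 times the radius of the “1” bubble — which is exactly why eyeballing relative areas is so unreliable.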

    Communication

    We’ve got a wide range of ways we promote WAWs, so I wanted to get a sense as to which ones people preferred.

    How do you prefer to stay informed of upcoming WAWs?

    The only surprise here was that “Running into Dave Culbertson” was at the bottom of the list! Of course, I didn’t ask Dave to mention to people he ran into that this survey was posted, so there’s that pesky survey bias again. We’ll keep up the e-mails (in almost two years of building up the Columbus WAW database, we’ve had a total of 2 opt outs, so I’ll keep the frequency of communication about the same, as it seems to be working).

    Open-Ended Feedback

    A number of respondents took the time to provide detailed thoughts on the event overall, and, specifically, on the request for other venue suggestions.

    I have an infatuation with Wordle at the moment (specifically when it comes to certain types of online listening), which is one of those subjects for a future post. Below is a Wordle of the general feedback responses — I can’t help but smile when I look at it:

    A summary of some of the specifics in the general feedback:

    • There were several suggestions that we occasionally have practitioners rather than vendors present: case studies, best practices, or even peer problem-solving sessions
    • There was a suggestion to try a round table or un-conference format around the state of SEM/SEO/analytics
    • One respondent suggested a competition of sorts — having attendees bring their “best stuff” on a topic or a challenge; maybe even trying to have a prize of some sort to the “winner”
    • “If you could get Avinash Kaushik to speak that would be SWEET!”
    • One person noted that our topics tend to be very consumer brand-oriented (which is true), and that it would be nice to have content that is more general and that could be applied to B2B
    • There was a pretty healthy level of general gushing about the quality and value of the event

    One person noted in the  general feedback that “I like Barleys as a venue- that room has nice square dimensions that keep the energy together- you can’t really get stuck off to the side.” We knew Barley’s was good, but this was a fresh perspective on one of the reasons as to “why.” On the “alternative venue suggestions” question, several people commented that any location needed to be central (as Barley’s is). The only specific alternative venue suggested was Spaghetti Warehouse, which we’ve used in the past. It is centrally located, and it has a pretty good meeting spot (we’ve been seated upstairs), and it’s quiet enough to have conversations. The two downsides are: 1) the area where it’s located (although they do have a security guard posted in the parking lot at all times), and 2) the beverage selection. One of the responses to the venue question was: “Anywhere with ‘Good Beer’!” Spaghetti Warehouse definitely falls short on that front, but at least it’s not entirely dry!

    We may give that another shot.

    Feedback Is ALWAYS Welcome

    If you didn’t participate in the survey (or if you did but have other comments), please leave a comment or drop me an e-mail (“tim” at this site’s domain).

    Analytics Strategy

    The Coming Bifurcation in Web Analytics Tools

    When John was with Forrester Research last year he had the opportunity to do some work for Google that published some pretty bold claims. Among these was his reporting that “a staggering 53% of enterprises surveyed currently use a free solution as their primary Web analytics tool, and 71% use free tools in some capacity” (PDF from Google). At the time I commented:

    “When [John] first told me that over half of Enterprise businesses were using free solutions I have to admit I didn’t believe him. In a way I still don’t, but perhaps that’s only because I work with a slightly different sample than he presents. Regardless, John’s report paints a picture of an increasingly challenging market for companies selling web analytics and a new sophistication among end users.”

    Increasingly my new partner is looking like some kind of prescient seer, although perhaps not for the reason some of you expect. Without a doubt Google is pushing hard to improve their analytics application, and by nearly all measures they are doing a phenomenal job. As I said back in November I personally believe their “Analytics Intelligence” feature is brilliant, and I have little doubt that we’ll continue to see little improvements here and there over the coming year.

    But as much as I love Google Analytics for what it does, I am also willing to be honest about what it does not do and what it is not. Google Analytics alone is simply not enough for truly sophisticated web analytics.

    Despite John’s findings at Forrester, and despite the fact that Google Analytics is easily the most widely deployed web analytics solution ever built, there are clearly limits to what Google Analytics is capable of today. What’s more, there is nothing wrong with having limits … what is wrong is trying to be all things to all people, which is what this post is really about.

    At Analytics Demystified we have been talking over the last six months to an increasing number of companies that are considering dropping their historical vendor, almost always in favor of Google Analytics. And at Analytics Demystified we don’t do that much work with small, mom-and-pop shops … these are global organizations, name brands, and market leaders in their respective categories. Most of these companies are spending well over $500,000 per year on analytics technology, and a handful are spending double that.

    What’s more, all of these companies have multiple dedicated resources for web analytics. These companies, in many cases, are freaking awesome at putting their tools to work, and in all cases understand that web analytics is a “people” thing, not a “technology” thing. So what the heck is driving them towards Google’s waiting arms?

    Limits.

    It turns out there are limits to the amount your average business user is willing to invest in learning web analytics tools. As more companies begin to truly take a strategic approach towards web analytics, many of them are realizing that their business users are simply not “getting” the fee-based solution they’ve invested so heavily in. The business users have more or less found their limit, and hit the wall, and are balking at the amount of time it actually takes to learn and become proficient with these tools.

    Apparently in an effort to further differentiate themselves from Google Analytics, the paid guys have inadvertently made their technology so complex that few people in the business are actually willing to use it.

    Oops.

    Having worked at WebSideStory in the past I have to admit I cringed when one business user complained that [market leading vendor X] “was just too complicated” and that she “really, really missed using HBX because it was simple.” But this is a story we are hearing over and over and over … to the point where I am having to revise my entire opinion about Google Analytics’ place in the true Enterprise, which I’m happy to do …

    … except.

    A problem with the wholesale shift to GA arises when we go to the dedicated analysts and consulting teams who actually do get the paid solutions and, in most cases, use them pretty well. Suggesting to them that they might try and get by solely on Google Analytics is kind of like telling LeBron James that he needs to do his job with only one leg, one arm, and a blindfold on if he wants to keep playing ball. He’d probably still be able to drop 30 on the Knicks, but he certainly wouldn’t be happy about it.

    I don’t personally know a single analyst worth their salt who would be happy and willing to standardize completely on Google Analytics, at least not today. Despite awesomeness galore, the shrinking-but-still-substantial list of things that GA doesn’t do is pretty important to these sophisticated users. True visitor-level segmentation, real flexibility in reporting on custom data, the ability to define custom metrics and dimensions, true data integration … it’s not an infinite list, but it’s still pretty long by my estimation.

    And while the list of third-party applications that provide additional functionality to Google Analytics is growing — for example ShufflePoint and their wonderful use of the GA APIs — we have still not seen a solution emerge that confers all of the necessary functionality that “professional” web analysts need to do their jobs well. In my experience there is nothing worse than knowing how to answer a question but not having the tools in place that you need to make the necessary connections.

    Notice that I’m not saying that the alternatives to Google Analytics necessarily have these features. You don’t have to spend much time following the Web Analytics Forum or the Twitter #measure tag to see complaints about how hard it is to do stuff that should be pretty simple. But for the most part this rich functionality can be found in the add-on ad hoc exploration tools (e.g., Omniture Insights, Coremetrics Explore, Webtrends Visitor Intelligence, Unica NetInsight, etc.), and it turns out that when you’re competing on web analytics these features are pretty important.

    So what am I saying, and what did I mean by “bifurcation” in this post’s title?

    I believe that we are about to see an increasing number of companies in the coming year drop their paid vendor’s “basic solution” in favor of Google Analytics and, at the same time, seriously consider adding their vendor’s high-end offering. More specifically, thanks to advances in “universal” (sic) tagging, the increasing cry from business users to “get us something simple that we can use”, and the true and present need for experienced operators to have a robust data exploration tool at their disposal, I think we’ll see an increasing number of “Google Analytics + Omniture Insights” implementations.

    Some caveats:

    • I am not saying that all companies will drop their paid vendor in favor of Google Analytics, mostly because “all companies” never do anything;
    • I am not saying that all companies should drop their paid vendor in favor of Google Analytics, or even that companies should drop their paid vendor at all, especially if you have a pretty solid web analytics strategy in place;
    • I am definitely not saying that I believe companies can manage a sophisticated web analytics operation using Google Analytics alone, although this statement hinges on the definition of “sophisticated”;
    • I am not evangelizing for Omniture Insights, even though I used to work at Visual Sciences and continue to use OI thanks to the good graces of Omniture/Adobe;
    • I am not evangelizing for Google Analytics, even though I do think the GA team has made amazing advances over the last 12 months;

    The final caveat is that I am only using Omniture Insights in the description below as an example — you can substitute any of the solutions I listed above just as easily, or even use SAS if you’re ready for the coming revolution in web analytics. Heck, if you’re super-motivated, you can take Hiten Shah of KISSmetrics’ suggestion and build your own clickstream data warehouse and analyze the results using Tableau.

    Regardless of the technology you choose, the bifurcated solution looks kind of like this:

    • For your business users you simply do an awesome job implementing all the great new functionality present in Google Analytics;
    • Using very simple JavaScript libraries you “copy” the data you’re sending to Google Analytics and pass it along to Omniture Insights (OI);
    • At the same time you add whatever other information you need to pass to OI, either because GA can’t handle it or you’ve filled all your custom variables;
    • In OI you transform the data to match what GA is doing as closely as possible, knowing full well the data will never match because of GA sampling and the reality of our industry;
    • In OI you add to the data with whatever you need, either via transformation, lookup tables, custom metrics and dimensions, whatever …
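
    To make the “copy the data” step a little more concrete, here’s a rough sketch of a dual-beacon wrapper. Everything here is illustrative — the wrapper name and the secondary collector URL are mine, not a real Omniture API — though the GA call uses the legacy _gaq async syntax of the day:

```javascript
// Hypothetical dual-tracking wrapper: fire the normal Google Analytics
// pageview, then relay the same payload to a secondary collector
// (a stand-in for whatever feeds your ad hoc analysis tool).
var SECOND_COLLECTOR = 'https://collector.example.com/beacon'; // illustrative URL

// Serialize a payload object into a query string for an image beacon.
function buildBeaconUrl(endpoint, payload) {
  var params = [];
  for (var key in payload) {
    if (payload.hasOwnProperty(key)) {
      params.push(encodeURIComponent(key) + '=' + encodeURIComponent(payload[key]));
    }
  }
  return endpoint + '?' + params.join('&');
}

function trackBoth(payload) {
  // 1) The normal GA call (legacy async syntax).
  if (window._gaq) {
    window._gaq.push(['_trackPageview', payload.page]);
  }
  // 2) "Copy" the same data to the secondary collector via an image request.
  new Image().src = buildBeaconUrl(SECOND_COLLECTOR, payload);
}
```

    The point isn’t this particular code — it’s that the business-user tool and the analyst tool are fed from the same payload, so the two data sets start life as close to each other as sampling will allow.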

    With these technologies in place you now have two things:

    1. A very appropriate solution for your internal business users, one they will likely embrace thanks to its simplicity, its beauty, and its Googliness;
    2. A very powerful solution for your web analysts that is largely based on what your business users are looking at.

    The way the solution set works practically within the business:

    • Business users get training on Google Analytics, which is surprisingly easy to provide, and if you’re big enough the rumor is that Google’s own evangelist will come visit with you (fun!)
    • Business users get used to the idea that the numbers in GA are not 100% perfect, especially in high-volume situations where GA is sampling;
    • Business users follow the age-old advice to “manage based on trends” and use some of the slickness that is GA to identify problems and opportunities;
    • When the business finds something interesting they ask the analytics group to work with them to look more closely and provide analysis (not reports);
    • When the business needs “more accurate” numbers they ask the analytics group to provide reporting from the complete set of data (normal accuracy and precision caveats still apply);
    • With their newly gained free time, the analytics group can become more of a proactive analytics service organization and less of a barrel full of “report monkeys”;

    Yes this involves some internal education, but c’mon people, all web analytics involves internal education. You’ll need a clear explanation about the “goodness” of the GA data in high-volume (e.g., sampling) situations; you’ll need to provide training on Google Analytics, but there are some amazing people out there who can help you; and you’ll need to manage two vendor relationships … although if John’s data from Forrester is correct, 71% of you are already doing that!

    Clearly this solution is not without risks, but from where I sit, I am having more and more trouble putting together a viable and workable alternative. Web analytics is becoming an increasingly critical function across the Enterprise and awareness of the solution set is bubbling up more rapidly than ever. As this happens, an increasing number of internal stakeholders are starting to ask for direct access to web data.

    But the fee vendors, again for pretty good and obvious reasons, have evolved their base solutions to an unprecedented level of complexity, especially when you look across many vendors’ “complete” base solutions (e.g., Webtrends 9 plus the requirement to use Webtrends Live for a lot of stuff, Omniture SiteCatalyst plus Omniture Genesis, etc.) Nobody is blaming them for the push upstream … especially since nobody I know could think of a better way to differentiate their solutions from the 8,000,000 lb gorilla that Google Analytics has become.

    At the end of the day in many, many cases you end up with business users frustrated by their inability to effectively and efficiently self-serve, and analytics professionals frustrated by the amount of time they spend pushing out basic reports. Quickly the situation becomes what is politely described as “inefficient” or, in more colloquial terms, FUBAR. Your choices are then to A) lump it and suffer or B) do something about it.

    I’m not a big fan of suffering.

    The bifurcated solution, if you think about it, is actually pretty awesome. You get the best of both worlds, and one of the solutions costs you nothing and so (hopefully) frees up budget to hire more and better people to manage your web analytics efforts. I’d rather see a company put $500,000 equally towards an ad hoc analysis engine and smart people to run it than the case I see too commonly today, where the lion’s share of that $500,000 goes directly to buy a solution that is not meeting the needs of the business.

    What do you think?

    If you’re a company of any size and history of investment with any of the big U.S. or European vendors, and if you’ve been considering something similar, we’d love to hear from you. While we’re already providing guidance to some pretty large clients making this move we are always eager to collect additional data as an input to our thinking. We’re also happy to hear from consultants and vendors who have a clearly vested interest in the outcome I’ve described. And yes, if you need to bitch at me for suggesting that Google Analytics is anything short of manna from heaven, I suppose I’ll approve those comments as well.

    Analytics Strategy

    Flash Cookies and Consumer Privacy

    Update: I should apologize to Adobe since I knew they had written to the FTC but didn’t mention it when I originally published this post. If you’re interested in this topic you should definitely download and read Adobe’s letter to the Secretary of the FTC regarding the use of Flash Local Shared Objects to re-spawn cookies. They cite my BPA white paper and do a great job outlining the company’s position on this particular use of their technology. I am writing to Adobe now to see if I can get someone on the phone to discuss in greater depth but if you know anyone there please ask them to email me directly.

    A few weeks back we published a white paper with our client BPA Worldwide on the use of Flash Local Shared Objects in web analytics practices. The paper, titled “Flash LSOs: Is Your Privacy at Risk?” is available for download at BPA Worldwide and does require a tiny bit of information (name, company, email). We wrote the paper with BPA Worldwide because we are seeing a resurgence in the use of Flash LSOs as a back-up mechanism for browser cookies and frankly I personally worry about the practice.

    Cookie deletion is what it is, and nothing anyone has done in the past five years has seemed to do anything to lessen (or worsen) the rate at which consumers clear cookies and history files. And yes, cookie deletion has a confounding effect on a variety of metrics web analytics professionals consider important — we’ve covered this more or less ad nauseam — although I certainly wonder how comScore’s recent reversal on the value of cookies will play out across combined web analytics + audience measurement efforts.

    My concern is that companies are increasingly using Flash cookies to override consumer preferences regarding cookie deletion. As documented by Soltani et al. in their paper “Flash Cookies and Privacy”, companies are actively using Flash LSOs, which are much more difficult to block and delete than their browser-based counterparts, to essentially “reset” browser cookie values and thus “remember” information that consumers are either implicitly or explicitly asking the web browser to forget.

    If you’re doing this, or even considering this, I would encourage you to download the white paper as we provide what I believe to be sound guidance regarding the use of Flash LSO in a measurement practice.  You might also want to check out this post over at the Adobe web site which details how Adobe Flash 10.1 will begin to support the “private browsing” feature in most browsers. While I don’t blame Adobe particularly for how companies are using LSO in digital measurement practices, this update is an excellent response from the company and shows their commitment to consumer privacy.

    As always your thoughts and feedback are welcome.

    Analysis, Analytics Strategy

    A Record-Setting WAW in Columbus with CRM Metrix

    Last week’s Columbus Web Analytics Wednesday set a new record for the meetup — we had exactly FIFTY attendees, which was a great showing. Part of the large draw was undoubtedly the event sponsor, CRM Metrix (@crm_metrix on Twitter).

    Pre-Meal Networking (and a Friendly Wave from Jonghee!)
    Columbus Web Analytics Wednesday -- Jan 2010

    Hemen Patel, CRM Metrix CTO, facilitated a lively discussion about incorporating the voice of the customer in web site measurement and optimization.

    Hemen Patel Presents
    Columbus Web Analytics Wednesday -- Jan 2010

    Hemen walked through a brief deck (below) that sparked some great back-and-forth with the crowd.

    A Rapt Audience
    Columbus Web Analytics Wednesday -- Jan 2010

    Monish Datta Asks a Question
    Columbus Web Analytics Wednesday -- Jan 2010

    With a crowd of fifty people, not only did I not get to meet the first-time attendees, but I barely had a chance to say, “Hi” to some of the long-time regulars. I guess we’ll just have to have another one in February (I’m working on it!) so I’ll get that chance!

    Analytics Strategy

    Building A Culture Of Measurement

    Building a Culture of Measurement is the title of the Keynote “Sprint” I’ll be delivering at Webtrends Engage next week in New Orleans. Like the other distinguished speakers during the keynote, I’ve only got 10 minutes to deliver my message and then get off the stage. Ten minutes isn’t nearly enough, so I thought that I’d elaborate here on my blog to hammer out the concepts behind my presentation.

    Let me put it right out there and state that culture isn’t built overnight. And changing culture takes even longer. So save your get-rich-quick schemes for some other Ponzi project. There is no quick fix for business culture because it exists in the ethos of your organization, not in the conference rooms, offices and cubicles. Culture consists of values, beliefs, legends, taboos and rituals that all companies develop over time. Attempting to force culture will most likely result in failed efforts and a disingenuous solution. Instead, organizations that don’t have an inherent culture of measuring their marketing efforts must ingrain some key measurement enablers into the system.

    Know your surroundings – know your audience.

    Start by understanding what you’re working with: take a realistic assessment of your organization’s culture. This may be easier for an outsider, who can spot promise and dysfunction much more quickly than the tenured veteran who is so ingrained within the culture that it’s a part of their daily routine. In either case, a realistic view of how the company utilizes data and reacts to data-driven ideas is the launching point.

    Find levers that trigger change. Once you’ve assessed the situation and gained your bearings, you need to find out what motivates individuals and business units within the organization. Measurement is largely about producing results, so if decisions are being made in the absence of data, perhaps digging up some examples of bad decisions, with proof from historical data, might offer some subtle hints about operating differently. Yet all companies operate differently, so if yours is one that wouldn’t react well to this tactic, then figure out how to push the buttons (whether positive or negative) that will effect change.

    Always ask why. Not so much in the way that a three-year-old persistently asks…why? Why? WHY? But more to determine whether data requests and new projects have a well-thought-out plan with measurable goals. So much of what we do in analytics is founded on having clearly defined objectives and goals that it is imperative for web analysts to insist that every data request has a purpose.

    Once you’ve established what you’re working with, the next step is to develop a measurement strategy that meshes with your culture. I advise my clients to create a “Waterfall Strategy”. I introduced my concept of the Waterfall Strategy in my manifesto, so I won’t attempt to recreate it – here it is:

    Strategy Credo #8: Establish a waterfall strategy. By this I mean strategy should flow from the headwaters of the organization and align with the corporate goals set forth by the executive team. Once your measurement team is clear and united on the goals, then identify objectives as the next tier in your waterfall that supports the corporate goals (these are your business promises). The base of your waterfall strategy consists of the tactics. Tactics are the actual campaigns and programs that emerge from your marketing machine (your creative promises). Each tier within the waterfall has specific metrics that indicate success. These metrics must be clearly defined and baked into the system at all levels to ensure proper measurement. It’s also critical to recognize that neither you nor an external consultant is likely to change your corporate goals, but you can refine the way in which you get there.
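    The tiered structure described in Strategy Credo #8 can be sketched as a simple data model: goals at the headwaters, objectives in the middle tier, tactics at the base, with success metrics attached at every level. This is purely illustrative; the goal, objective, tactic, and metric names below are invented examples, not prescriptions from the manifesto:

```python
# Illustrative sketch of a "waterfall strategy": corporate goal at the top,
# supporting objectives (business promises) beneath it, and tactics
# (creative promises) at the base. Every tier carries its own success metrics.
waterfall = {
    "goal": {
        "name": "Grow online revenue 20% year over year",
        "metrics": ["total online revenue"],
        "objectives": [
            {
                "name": "Increase repeat purchases",  # business promise
                "metrics": ["repeat purchase rate"],
                "tactics": [
                    {
                        "name": "Loyalty email campaign",  # creative promise
                        "metrics": ["email CTR", "campaign conversion rate"],
                    }
                ],
            }
        ],
    }
}

def all_metrics(node):
    """Walk the waterfall top-down and collect every tier's success metrics."""
    found = list(node.get("metrics", []))
    for child in node.get("objectives", []) + node.get("tactics", []):
        found.extend(all_metrics(child))
    return found

print(all_metrics(waterfall["goal"]))
```

    The walk-through function reflects the credo’s requirement that metrics be baked in at all levels: if any tier comes back empty, measurement hasn’t been defined for that part of the waterfall.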

    The third effort that you must undertake when attempting to build a culture of measurement is to make your data sing. And no, I don’t mean going on American Idol or belting out karaoke at your next company function. Here I’m talking about the ability to tell a story with your data. Think about culture for a minute…it’s built on stories. You need to become a storyteller within your organization and find the narrative within the data. Communicate to your constituents not with numbers and spreadsheets, but with examples of how their efforts and activities contributed to the success of the organization. In doing this, you will create heroes and legends within your organization who earned their status through data. The next thing you know, others will be knocking at your door and asking for metrics and measures to show the brilliance and success of their projects. You’ll inherit a whole new set of problems when this starts to happen, but we can tackle that at another time.

    So that’s my story and I’m sticking to it. Please let me know your thoughts and how you’ve built a culture of measurement at your organization.

    Oh yeah, if you’re going to be at Webtrends Engage next week please seek me out and let’s talk about building a culture of measurement; my concept of the waterfall strategy; or simply share a story over a cup of coffee. If you haven’t seen it yet, Webtrends has built out an awesome site for networking with fellow Engage attendees, so let’s meet.

    See you in NOLA!

    **Update** Here’s the video courtesy of Webtrends of my official presentation.

    Analytics Strategy, Conferences/Community, General

    Welcome to Analytics Demystified 2.0

    By now you’ve noticed that we’ve completely re-done the Analytics Demystified web site; that is, unless you only ever read my posts in an RSS reader, in which case I would ask you to click through and have a look. The new site is the culmination of nearly a year’s effort, starting with convincing my good friend Aurelie Pols to join Analytics Demystified and, more recently, convincing my other good friend John Lovett to leave his cushy job at Forrester Research to join Aurelie and me. Hopefully you find the new site more streamlined, easier to read, and a little more focused on the aspects of Analytics Demystified we are working to feature.

    My own personal highlights include:

    • Totally free copies of Analytics Demystified, The Big Book of Key Performance Indicators, and the KPI book’s companion worksheets. I made the decision to start giving my books away for one reason and one reason only: to continue to do everything humanly possible to educate as many future web analytics professionals as possible. The response today was good (see image below!)
    • Totally revamped mini-site for The Analysis Exchange, including the ability for everyone to start to create their member profiles. The Analysis Exchange has exceeded every single expectation that I had going in, thanks to many people’s efforts. If you’re interested in helping the Analysis Exchange or learning more about the effort please visit http://www.analysis-exchange.com
    • Partially revamped mini-site for Web Analytics Wednesday, with more features and updates coming in Q2. Web Analytics Wednesday has become such an automated delight, and with SiteSpect and Coremetrics renewing their sponsorship in 2010 we hope to do even more this coming year!
    • All new look and feel for my blog, as well as Aurelie’s and John’s, plus the addition of our new Emerging Technology blog. So much of our traffic is driven by the blogs, and so many of our clients find us based on our writing here, that we wanted to ensure that reading our blogs was as distraction-free as possible. The Emerging Technology blog is something we think of as “TechCrunch for Web Analytics” and we hope you’ll check it out.
    • We have also worked to clarify what the Analytics Demystified web analytics consulting business and Senior Partners do, when we’re not supporting the community at large. Perhaps a small point, but one that pays the bills, so if you need help getting your web analytics strategy defined, please give us a call.

    One thing about my last point, our consulting business and giving us a call. On past sites there were dozens of calls to action and conversion points I was trying to get people to and through. On this site there is one: getting YOU to reach out to US. It may sound glib, but we are able to do more for people who simply email, call, Skype, or Twitter us than most folks can imagine, and often times our help comes without any kind of fee.

    Put another way, if you need our professional help, we’ll help you and hopefully you’ll be satisfied with what we ask you to pay. But if you need our guidance, suggestions, or honest opinion, we’ll help you without ever bringing up fees or asking for money. Like the book giveaway, Web Analytics Wednesday, and The Analysis Exchange we have found that simply answering questions without expectation of compensation is often times better than getting paid.

    In closing I am totally delighted with the traffic we had to the site today thanks to Twitter, the #measure channel, and the book offer. Based on my Omniture Insights reporting we were completely off the charts in Europe and this AM in the U.S. We’d love your help spreading the word about the book! If you can, tell people to click through on http://bit.ly/demystified-books or simply to check out the new web site.

    As always I welcome your comments, critique, and feedback, especially if you have nice things to say about the new site or want to help me identify bugs (since not all of you use Chrome on the Mac … LOL!)

    Analytics Strategy, Reporting

    How To Tell A Story with Your Data

    A few weeks ago, my business partner Eric and I attended a basketball game in Minnesota. Eric purchased the tickets a few days ahead of time and I really didn’t have any expectations going into the game except to have a great time. Much to my surprise, our seats were incredible! We were sitting immediately behind the announcer’s table in the first row. Now, keep in mind, I’m a Boston sports guy, and even when the Celtics were struggling through the ’90s and the early part of this decade, you still couldn’t get a seat behind the announcer’s table or anywhere near the first row without taking out a second mortgage on your house. But this was Minnesota, and the Timberwolves are not necessarily a big-market team.

    Anyway, as we enjoyed the game we struck up a conversation with the woman sitting immediately in front of us, a coordinator for the announcers. Sitting on either side of her were two official NBA scorers recording all the action into their computers and generating reports at roughly ten-minute intervals. These reports were printed and handed to the announcers, ending up in a big pile on the desks in front of them. After a while our friendly coordinator began handing Eric and me her extra copy of these Official Scorer’s Reports. So, like any good web analysts would, we took a look and gave the report a critical review (see the image below).

    We were astounded by how poorly constructed the reports were. Sure, they contained all the critical information on each player like minutes played, field goals, field goal attempts and total points. Yet, there were no indicators of which metrics were moving, who was playing exceptionally well, or even shooting percentages for individual players. The announcers were undoubtedly skilled at their jobs, because these reports did nothing (or at least very little) to inform them of what to say to their television audiences. Clearly the NBA could benefit from some help from @pimpmyreports.

    So, here is where I get to the point about telling a story with your data. Sometime during the middle of the fourth quarter a young aspiring sportscaster came running down to the announcer’s row and handed off a stack of paper that offered some new information. Finally! His 4th-Quarter Notes recap was the first written analysis we’d seen that actually placed the statistics and metrics recorded during the game into meaningful context (see image below). The 4th-Quarter Notes showed that:

    • A win could bring the T’wolves to 3-3 in their last six games.
    • Al Jefferson was having a good night – approaching a career milestone for rebounds – and posting his 9th double-double of the season.
    • Rookie Jonny Flynn, needing only one more assist, was about to post his first double-double (a feat only five rookie players have accomplished).
    • Ryan Gomes was once again nearing a 20-point game, with a 58.6% field goal percentage over the past five games.

    This method of reporting used all of the same data that was contained within the Official Scorer’s Report but added historical context, which really brought the data to life. This was interesting stuff! Now T’wolves fans and casual observers alike could understand the significance of Jefferson’s 16 points and 28:27 minutes on the floor – or that Jonny Flynn needed just one more assist to achieve a significant feat. After reading this, (even as a Boston sports fan) I was invested in the game and had something to root for – Go Flynn!

    So here’s the moral of the story:

    • If you’re going to produce generic reports with no visual cues – do not show them to anyone, because they won’t use them – and make sure you hire some damn good analysts who can interpret these reports and give a play-by-play.
    • If you do want to distribute your reports widely – take the time to format them in a way that highlights important metrics and calls attention to what’s meaningful so that recipients can interpret them on their own.
    • And most importantly – place your data and metrics in context given historical knowledge; significant accomplishments; or some other method to bring the data to life. Give your executives and business stakeholders something to cheer about!
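    As a sketch of that last point, the same raw box-score numbers a generic report would print can be run through a small set of context rules that emit story-ready highlights. The player records, field names, and thresholds here are invented for illustration, not actual game data:

```python
# Turn raw per-player stats into contextual highlights instead of a flat table.
# Records and thresholds are illustrative, not real game data.
players = [
    {"name": "Flynn", "points": 12, "assists": 9},
    {"name": "Jefferson", "points": 16, "rebounds": 11, "season_double_doubles": 8},
]

def highlights(stats):
    """Emit story-ready notes where the numbers approach something meaningful."""
    notes = []
    for p in stats:
        # One assist shy of double figures in two categories: worth narrating.
        if p.get("assists", 0) == 9 and p.get("points", 0) >= 10:
            notes.append(f"{p['name']} needs one more assist for a double-double.")
        # Historical context turns 16-and-11 into a season storyline.
        if p.get("rebounds", 0) >= 10 and p.get("points", 0) >= 10:
            nth = p.get("season_double_doubles", 0) + 1
            notes.append(f"{p['name']} posts double-double #{nth} of the season.")
    return notes

for note in highlights(players):
    print(note)
```

    The report data stays exactly the same; only the presentation changes, from a grid of numbers to a sentence a stakeholder can cheer about.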

    Finally, if you ever have an opportunity to sit behind the announcer’s table, make sure you befriend the coordinator so you can get a copy of the reports for yourself.

    Analytics Strategy, General

    The Most Important Post on Web Analytics You'll Ever Read

    When John Lovett joined Aurelie and me here at Analytics Demystified earlier this month an awful lot of people said, “Hey, nice job getting such a nice guy on board,” “We love John, he’s great,” and “Man, what a great addition to your team!” Clearly John has the respect of the industry, but one thing that remained an open question in some people’s minds was “how will John make the transition from the ivory tower an analyst sits in to the ground floor where consultants actually do work?”

    I admit, I wondered that too in a way, having made a slightly different transition myself years ago. It’s not easy to come away from a situation where you provide advice but are tasked with, honestly, doing very little real work. During my own tenure at JupiterResearch years ago I ensured my own connection to practical web analytics by writing my second and third books. But John had been an analyst for nearly 10 years … and so wondering how he’d hit the ground was a reasonable question.

    Wonder no more.

    While John has already contributed greatly to the business’s bottom line and helped out with one of our largest new retail clients, he absolutely floored me this morning when he published his post Defining a Web Analytics Strategy: A Manifesto. I had asked him to elaborate on some comments he made at Emetrics, where he essentially pooh-poohed the use of so-called “Web Analytics Maturity Models,” describing the almost religious zeal some people seem to have when talking about models and declaring himself a “Model Atheist.”

    Having written the original Web Analytics Maturity Model back in 2005, I have had first-hand experience with their failure to produce anything more than a generalized awareness that most companies simply don’t “get” web analytics, something that we more or less all know already. But honestly I was surprised when John took this position on the subject because, well, in my experience those that don’t do, teach, and models are a classic teaching tool.

    I had assumed that as an analyst John was a teacher, not a do-er like I have been for years now in my capacity as a practice leader, consultant, and web analyst. Man was I wrong …

    John’s “Manifesto” is perhaps the most lucid yet succinct explanation I have ever read detailing the steps required to make web analytics work for your business (as opposed to the other way around). I almost asked him to edit the post for fear that he was opening our kimono too much, but if Social Media has taught us anything, it is that transparency is king. The fact that he managed to encapsulate what others have been trying to explain with long-winded speeches, tangential arguments, and downright rude behavior is a huge plus.

    Some of you may read John’s manifesto and think “Gee, this seems to point to the need for outside consultants” which is a fair criticism. But before you react consider two things:

    1. Consultants (like us) have a tendency to, you know, recommend consulting. Everyone’s perspective arises from their own personal biases, regardless of how many times they declare the contrary. We are consultants, consultants who want to feed their children. Forgive us our bias and we will forgive you yours …
    2. Consultants in the Enterprise are like death and taxes, we are more or less inevitable. Often times an outside perspective is exactly what the business needs to actually start to act upon the message that otherwise great employees have been stating for years. Other times the business simply stops listening to their employees and won’t make a move until McKinsey, Bain, or Demystified come in and charge big money for insights that were already there. Either way, ours is the second (or is it third) oldest profession and it must be for a reason …

    I would challenge you, dear reader, to spend some time reading John’s post and considering what he has to say. Think about how you could apply his ten insights to your business, regardless of whether you turn to consultants for advice or not. Listen to your business partners’ needs, put away your models and roll up your sleeves, transcend mediocrity, establish your own waterfall, and embrace change!

    When I said “web analytics is hard” I meant it, I really, really did. But I wasn’t trying to box anyone in or establish myself as some kind of amazingly wonderful “guru”, I was simply telling you all the truth based on my dozen years of experience in the sector. Yes, getting started can be easy; yes, making Google Analytics do stuff can be easy; and yes, you can do an awful lot in an hour a day if you simply apply yourself to the task … but the problem is that within any business of size, complexity, or nuance — which is to say all businesses everywhere — the act of getting from raw data to valuable business insights that you can repeatedly take action upon is apparently so freaking difficult that almost nobody does it.

    How is that “easy?”

    You all know I love a good debate so if you disagree with my comments here please let me know. If, however, you have something to add to John’s manifesto, I would encourage you to comment on his blog post directly.

    Happy Holidays, everyone.

    Analytics Strategy, Conferences/Community, General, Social Media

    Announcing The Analysis Exchange

    A few weeks ago I started pinging folks within the digital measurement community asking about the work we do, the challenges we face, and how we got where we are today. The responses I got were all tremendously positive and showed a true commitment to web analytics across vendor, consultant, and end-user practitioner roles. What I learned was, well, exactly what I expected given my decade-plus in the sector: “web analytics” is still a relatively immature industry, one populated by diverse opinions, experiences, and backgrounds.

    Those of you who have been following my work know that I have spent a great deal of time working to create solutions for the sector. As a matter of record I was the first to create an online community for web analytics professionals and explicitly point out the need for dedicated analysis resources back in 2004, and the first to publish a web analytics maturity model and change how web analytics practitioners interact with their local community back in 2005. I’ve also written a few books, a few blog posts, and have logged a few miles in the air working with some amazing companies to improve their own use of web analytics.

    I offer the preceding paragraph not to brag but rather to establish my credentials as part of setting the stage for what the rest of this post is about. Like many in web analytics — Jim Sterne, Avinash Kaushik, and Bryan Eisenberg all come to mind — I have worked tirelessly at times to evolve and improve the landscape around us. And with the following announcement I hope to have lightning strike a fourth time …

    But I digress.

    One of the key questions I asked in Twitter was “how did you get started [in web analytics?]” Unsurprisingly each and every respondent gave some variation on “miraculously, and without premeditation.” While people’s responses highlighted the enthusiasm we have in the sector, it also highlighted what I see as the single most significant long-term problem we face in web analytics.

    We haven’t created an entry path into the system.

    As a community of vendors, consultants, practitioners, evangelists, authors, bloggers, Tweeters, socializers, and thought-leaders, we have failed nearly 100% at creating a way for talented, motivated, and educated individuals who are “not us” to gain the real-world experience required to actually participate meaningfully in this wonderful thing that we have all created.

    Before the comments about the Web Analytics Association UBC classes or the new certification pour in, consider this: the UBC course offers little or no practical experience with real data and real-world business problems, and the certification is designed, as stated, “for individuals having at least three years of experience in the sector.” Both are incredibly valuable, but neither is the type of training that the average global citizen, wishing to apply their curiosity, their precision, and their individual talents to the study of web data, needs to actually get a good job coming from outside the sector.

    And while I have little doubt that people have landed jobs based on completion of the UBC course, given the resource constraints we face today, as a former hiring manager and a consultant to nearly a dozen companies that are constantly looking for experienced web analysts, I can assure you that book-based education is not the first thing hiring managers look for. Requirement number one is, and always will be, direct, hands-on experience using digitally collected data to tell a meaningful story about the business.

    Today I am incredibly happy to announce the solution to this problem that my partners, some very nice people, and I have developed. At 6:30 PM Eastern time at the Web Analytics Wednesday event in Cambridge, Massachusetts, my partner John Lovett shared the details of our newest community effort, The Analysis Exchange.

    What is The Analysis Exchange?

    The Analysis Exchange is exactly what it sounds like — an exchange of information and analytical outputs — and is functionally a three-partner exchange:

    • At one corner we have small businesses, nonprofits, and non-governmental organizations who rarely if ever make any substantial use of the web analytic data most are actively collecting thanks to the amazing wonderfulness of Google Analytics;
    • In the next corner we have motivated and intelligent individuals, our students, who are looking for hands-on experience with web analytics systems and data that they can put on their resume when looking for work or looking to advance in their current jobs;
    • And at the apex of the pyramid we have our existing community of analytics experts, many of whom have already demonstrated their willingness to contribute to the larger community via Web Analytics Wednesday, the WAA, and other selfless efforts.

    The Analysis Exchange will bridge the introductions between these three parties using an extremely elegant work-flow. Projects will be scoped to deliver results in weeks, effort from businesses and mentors is designed to be minimal, and we’re working on an entire back-end system to seamlessly connect the dots. And have I already mentioned that it will do so without any money changing hands?

    Yeah, The Analysis Exchange is totally, completely, 100 percent free.

    John, Aurelie, and I decided early on, despite the fact that we are all consultants who are just as motivated by revenue as any of our peers, that the right model for The Analysis Exchange would be the most frictionless strategy possible. Given our initial target market of nonprofits and non-governmental organizations, most of whom our advisers from the sector warned were somewhat slow to invest in technology and services, “free” offered the least amount of friction possible.

    Businesses bring data and questions, mentors bring focus and experience, and students bring a passion to learn. Businesses get analysis and insights, students gain experience for their resume, and mentors have a chance to shape the next wave of digital analysis resources … resources the mentor’s organizations are frequently looking to hire.

    More importantly, our mentors will be teaching students and businesses how to produce true analytical insights, not how to make Google Analytics generate reports. Our world is already incredibly data rich, but the best of us are willing to admit that we are still also incredibly information poor. Students will be taught how to actually create analysis — a written document specifically addressing stated business needs — and therein lies the true, long-term value to our community.

    Too many reports, not enough insights. This has been the theme of countless posts, a half-dozen great books, and nearly every one of the hundred consulting engagements I have done in the past three years. The Analysis Exchange is a concerted effort to slay the report monkeys and teach the “analysts” of the future to actually produce ANALYSIS!

    A few things you might want to know about The Analysis Exchange (in addition to the FAQ we have up on the official web site):

    • Initially we will be limiting organizational participants to nonprofit and non-governmental entities. We are doing this because we believe this approach simultaneously provides the greatest benefit beyond the web analytics community and provides a reasonable initial scope for our efforts. Plus, we’ve partnered with NTEN: The Nonprofit Technology Network, an amazing organization in their own right;
    • Initially we will be hand-selecting mentors wishing to participate in the program. Because we are taking a cautious approach towards the Exchange’s roll-out in an effort to learn as much as possible about the effort as it unfolds, we are going to limit mentor opportunities somewhat. Please do write us if you’re interested in participating, and please don’t be hurt if we put you off … at least for a month or two;
    • With the previous caution in mind, we are definitely open to help from the outside! If you have experience with this type of effort or just have a passion for helping other people, please let us know. Just like with Web Analytics Wednesday, we know that when The Analysis Exchange gets cranking we will need lots and lots of help.

    Because this post is beginning to approach the length at which I typically tune out myself I will stop here and point readers to three resources to learn more about The Analysis Exchange:

    1. We have a basic, informational web site at http://www.analysis-exchange.com that has a nice video explaining the Exchange model in a little greater detail;
    2. You can email us directly at exchange@analyticsdemystified.com for more information or to let us know if you’re willing to help with Exchange efforts;
    3. You can follow Exchange efforts in Twitter by following @analysisxchange

    As you can probably detect from the post I’m pretty excited about this effort. Like I did when I co-founded Web Analytics Wednesday, I have some amazing partners on this project. And like I did when I founded the Yahoo! group, I believe this effort will satisfy an incredible pent-up demand. Hopefully you will take the time to share information about The Analysis Exchange with your own network, and as always I welcome your thoughts, comments, and insights.

    Learn more at http://www.analysis-exchange.com

    Analysis, Analytics Strategy, Social Media

    The Spectrum of Data Sources for Marketers Is Wide (& Overwhelming)

    I’ve been using an anecdote of late that Malcolm Gladwell supposedly related at a SAS user conference earlier this year: over the last 30 years, the challenge we face when it comes to using data to drive actions has fundamentally shifted from a challenge of “getting the right data” to “looking at an overwhelming array of data in the right way.” To illustrate, he compared Watergate to Enron — in the former case, the challenge for Woodward and Bernstein was uncovering a relatively small bit of information that, once revealed, led to immediate insight and swift action. In the latter case, the data to show that Enron had built a house of cards was publicly available, but there was so much data that actually figuring out how to extract the underlying chicanery without knowing exactly where to look for it was next to impossible.

    With that in mind, I started thinking about all of the sources of data that marketers now have available to drive their decisions. The challenge is that almost all of the data sources out there are good tools. While they all claim competitive advantage and differentiation from other options, I believe in free markets enough to think that truly bad tools don’t survive (do a Google search for “SPSS Netgenesis” and the first link returned is a 404 page — the prosecution rests!).

    To avoid getting caught up in the shiny baubles of any given tool, it seems worth organizing the range of available data somehow — putting every source into a discrete bucket. It turns out that’s a pretty tricky thing to do, but one approach is to place each data source somewhere on a broad spectrum. At one end of the spectrum is data from secondary research — data that someone else has gone out and gathered about an industry, a set of consumers, a trend, or something else. At the other end of the spectrum is the data we collect on our customers in the course of conducting some sort of transaction with them — when someone buys a widget from our web site, we know their name, how they paid, what they bought, and when they bought it!

    For poops and giggles, why not try to fill in that spectrum? Starting from the secondary research end, here we go…!

    Secondary Research (and Journalism…even Journalism 2.0)

    This category has an unlistable number of examples. From analyst firms like Forrester Research and Gartner Group, to trade associations like the AMA or The ARF, to straight-up journalists and trade publications, and even to bloggers. Specialty news aggregators like alltop.com fall into this category as well (even if, technically, they would fit better into a “tertiary research” category, I’m going to just leave them here!).

    I stumbled across Iconoculture last week as one interesting company that falls in this category…although things immediately start to get a little messy, because they’ve got some level of primary research as well as some tracking/listening aspects to their offering.

    Listening/Collecting

    Moving along our spectrum of data sources, we get to an area that is positively exploding. These are tools that are almost always built on top of a robust database, because what they do is try to gather and organize what people — consumers — are doing/saying online. As a data source, these are still inherently “secondary” — they’re “what’s happening” and “what’s out there.” But, as our world becomes increasingly digital, this is a powerful source of information.

    One group of tools here are sites like compete.com, Alexa, and even Google’s various “insights” tools: Google Trends, Google Trends for Websites, and Google Insights for Search. These tools tend to be not so much consumer-focused as site-focused, but they’re getting their data by collecting what consumers are doing. And they are darn handy.

    “Online listening platforms” are a newer beast, and there seems to be a new player in the space every day; the space moves so quickly that the Forrester Wave report Suresh Vittal wrote in Q1 2009 already feels at least five years old. An incomplete list of companies/tools offering such platforms includes (in no particular order…except Nielsen is first because they’re the source of the registration-free PDF of the Forrester Wave report I just mentioned):

    And the list goes on and on and on… (see Marshall Sponder’s post: 26 Tools for Social Media Monitoring). Each of these tools differentiates itself from its competition in some way, but none has truly emerged as a sustained frontrunner.

    Web Analytics

    I put web analytics next on the spectrum, but recognize that these tools have an internal spectrum all their own. From the “listening/collecting” side of the spectrum, web analytics tools simply “watch” activity on your web site — how many people went where and what they did when they got there. Moving towards the “1:1 transactions” end of the spectrum, web analytics tools collect data on specifically identifiable visitors to your site and provide that user-level specificity for analysis and action.

    Google Analytics pretty much resides at the “watching” end of this list, as does Yahoo! Web Analytics (formerly IndexTools). But, then again, they’re free, and there’s a lot of power in effectively watching activity on your site, so that’s not a knock against them. The other major players — Omniture SiteCatalyst, Webtrends, Coremetrics, and the like — have more robust capabilities and can cover the full range of this mini-spectrum. They are all becoming increasingly open and easier to integrate with other systems, be that with back-end CRM or marketing automation systems, or with the listening/collecting tools described in the prior section.

    The list above covered “traditional web analytics,” but that field is expanding. A/B and multivariate testing tools fall into this category, as they “watch” with a very specific set of options for optimizing a specific aspect of the site. Optimost, Omniture Test&Target, and Google Website Optimizer all fall into this subcategory.

    And, entire companies have popped up to fill specific niches with which traditional web analytics tools have struggled. My favorite example there is Clearsaleing, which uses technology very similar to all of the web analytics tools to capture data, but whose tools are built specifically to provide a meaningful view into campaign performance across multiple touchpoints and multiple channels. The niche their tool fills is improved “attribution management” — there’s even been a Forrester Wave devoted entirely to tools that try to do that (registration required to download the report from Clearsaleing’s site).

    Primary Research

    At this point on the spectrum, we’re talking about tools and techniques for collecting very specific data from consumers — going in with a set of questions that you are trying to get answered. Focus groups, phone surveys, and usability testing all fall in this area, as well as a plethora of online survey tools. Specifically, there are online survey tools designed to work with your web site — Foresee Results and iPerceptions 4Q are two that are solid for different reasons, but the list of tools in that space outnumbers even the list of online listening platforms.

    The challenge with primary research is that you have to make the user aware that you are collecting information for the purpose of research and analysis. That drops a fly in the data ointment, because it is very easy to bias that data by not constructing the questions and the environment correctly. Even with a poorly designed survey, you will collect some powerful data — the problem is that the data may be misleading!

    Transaction Data

    Beyond even primary research is the terminus of the spectrum — it’s customer data that you collect every day as a byproduct of running your business and interacting with customers. Whenever a customer interacts with your call center or makes a purchase on your web site, they are generating data as an artifact. When you send an e-mail to your database, you’ve generated data as to whom you sent the message…and many e-mail tools also track who opened and clicked through on the e-mail. This data can be very useful, but, to be useful, it needs to be captured, cleansed, and stored in a way that sets it up for useful analysis. There’s an entire industry built around customer data management, and most of what the tools and processes in that industry focus on is transaction data.

    What’s Missing?

    As much as I would like to wrap up this post by congratulating myself on providing an all-encompassing framework…I can’t. While there are a lot of specific tools/niches that I haven’t listed here that I could fit somewhere on the spectrum of tools as I’ve described it, there are also sources of valuable data that don’t fit in this framework. One type that jumps out to me is marketing mix-type data and tools (think Analytic Partners, ThinkVine, or MarketShare Partners). I’m sure there are many other types. Nevertheless, it seems like a worthwhile framework to have when it comes to building up a portfolio of data sources. Are you getting data from across the entire spectrum (there are free or near-free tools at every point on the spectrum)? Are you getting redundant data?

    What do you think? Is it possible to organize “all data sources for marketers” in a meaningful way? Is there value in doing so?

    Adobe Analytics, Analytics Strategy

    Data Quality – The Silent Killer…

    In this post, I am going to talk about how Data Quality can kill an Omniture (or other Web Analytics) implementation. I will share some of the problems I have seen and show some ways that you can help improve Data Quality…

    Sound Familiar?
    So you have been managing an Omniture implementation for a while. You have your KPIs lined up. You have been sharing dashboards and reports with people throughout your company. People are starting to realize that they should talk to you before making website business decisions. Suddenly, you find yourself in the executive suite to answer some key website questions. Then, just as you are wrapping up your web traffic overview, an executive starts to calculate some numbers on a notepad and determines that the increase you show in Paid Search traffic doesn’t look right given other data she has seen from the SEM team. She also questions the rise in traffic data for EMEA, knowing that her VP in the region told you traffic has been down over the last few months. Suddenly, you are in a web analytics death spiral. In a split second, you have to decide: do you defend your Omniture data and risk your reputation, or do you back-pedal, saying you will re-check the web analytics data, and live to fight another day?

    Hopefully this hasn’t happened to you, but it has happened to most of us who have been around the web analytics space for long enough. Unfortunately, you only get so many chances to be wrong about data you are presenting and even if your data is right, if you aren’t confident enough to stand by it, it might as well be wrong.

    Minimizing Data Quality Risk
    So how do you avoid this situation? The first step is to realize that there is no way to be sure that all of your web analytics data is correct. 100% Data Quality is not only unattainable, but also not worth the time and effort it would take to achieve. Therefore, I use a philosophy of risk minimization in which I try various techniques to minimize the key things that cause data quality issues. The following will show you some of the ways to do this:

    Ensure all Pages are Tagged
    This is easier said than done. As we all know, IT is usually the group tasked with deploying JavaScript tags, and they often have more important things to do than guarantee that every website page has the correct JavaScript tag. Fostering a good relationship with IT helps, but at the end of the day, new website pages are created all the time, and tags will be missed.

    Use Technology
    As you can imagine, where there is a need, there are technology vendors. The main vendors that I have worked with or heard the most about are WASP and ObservePoint. Not entirely coincidentally, ObservePoint was founded by John Pestana, one of the co-founders of Omniture. In a great blog post, John Pestana talked about getting rid of asterisks in web analytics reports. I am sure there are many other vendors out there offering similar products, but the gist of the technology is that it can spider your website and let you know which pages are missing JavaScript tags so you can eliminate any obvious omissions.

    Blood, Sweat & Tears
    Unfortunately, the main way that I have minimized web analytics data loss is by downloading data and looking for anomalies. I normally do this by taking advantage of the Omniture SiteCatalyst Excel Client, downloading key data blocks by day or week, and then using formulas to compare yesterday to the same day last week, or last week to the week prior. Once you have the data in Excel, you can do any type of statistical analysis you want to see if anything looks “fishy.” One thing I like to do is use Excel conditional formatting to spot data issues.

    The following is a screenshot example of using Excel to spot potential data issues. In this example, I am looking at Page Views from one week prior to each day and if there is a change of more than 20%, I highlight it in red:

    dq_excel2

    Uh-oh… It looks like our daily data quality report indicates that we may have lost a tag on Friday for the Login page, and that something suspicious took place on the Search Results page the same day. Obviously, the downsides of this approach are that it is extremely manual and that it happens after the fact. As you know, once you miss a time slot of data in SiteCatalyst, there is no easy way to get that data back. While this approach can limit the data loss to a day, it won’t help you get the Login Page data back in the example above.

    Therefore, the way I employ this approach is to focus on the top items within each variable. This means I focus on the pages with the most Page Views, the Form IDs with the most Form Completions, the Orders for the most popular products, etc. With the Excel Client, you can download multiple data blocks at once and then use conditional formatting to easily spot the issues. Done intelligently, checking Data Quality for 80% of your data can take just a few hours each day. By doing this, you can feel more confident when your VP questions your data, knowing that if something were significantly off, you would have known about it ahead of time.
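    The same week-over-week check can be sketched outside of Excel as well. Below is a minimal Python illustration of the logic; the page names, Page View counts, and the 20% threshold are all hypothetical, standing in for an Excel Client data block:

    ```python
    # A minimal sketch, in plain Python, of the week-over-week anomaly
    # check described above. Page names and numbers are invented.
    pages = {
        # page: (Page Views yesterday, Page Views same day last week)
        "Home":           (12500, 11900),
        "Login":          (310, 2800),
        "Search Results": (4100, 8700),
    }

    THRESHOLD = 0.20  # flag changes of more than 20%, like the red cells

    flagged = {}
    for page, (current, prior) in pages.items():
        pct_change = (current - prior) / prior
        if abs(pct_change) > THRESHOLD:
            flagged[page] = round(pct_change, 2)

    print(flagged)  # pages that warrant a closer look
    ```

    In practice you would feed this from an automated export rather than hard-coded numbers, but the comparison logic is identical to the conditional-formatting rule.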

    Special Cases
    I have found that there are a few other situations that commonly lead to missing or bad data so I quickly wanted to bring them to your attention so you can apply some additional effort to ensure they are tagged correctly:

    1. “Lightbox” pages, where a new HTML page is often not loaded. These are often created as a window within a window, and developers frequently forget to put SiteCatalyst code within them.
    2. Flash/AJAX pages, where the page changes dynamically or you have an entire site/page developed in Flash. Be extra careful around these, as they are often missing tracking code (especially when done by an outside agency!).
    3. Dynamically generated content, such as a page that shows historical stock price data after a user enters a ticker symbol. Oftentimes, these dynamic pages are tagged as one single page, but might be better treated as unique pages from a web analytics viewpoint.

    SiteCatalyst Alerts
    If you have read my previous blog post on Alerts, you may have figured out that you can use Alerts to help with Data Quality as well. Alerts can be used to look for changes in key metrics by Month, Week, or Day (or Hour in some cases), and are handy for being notified when data is off by more than x%. However, I have found that if you want to look at more granular data (as in the example above), the current Alert functionality can be a bit limiting. You can set alerts for specific sProp and eVar values, but not as easily as you can by using Excel. Therefore, I would use Alerts as an early warning system and employ the previously mentioned techniques as your main defense against missing data.

    Classification Data
    Finally, when thinking about data quality/completeness, don’t forget about SAINT Classifications. If you have key reports that rely on SAINT Classifications, then even if the source data is collected perfectly, missing classifications will make those reports incorrect, which in the eyes of your users is indistinguishable from poor data quality. You will know you are missing SAINT Classification data if your classified reports have a large “None” row. So how do you ensure your SAINT Classification data is complete? What I do is create Excel data blocks for each Classification and isolate the “None” row for key metrics.

    In the screenshot below, you can see that I have created a data block that looks for “Unspecified” Site Locale Full Names (the Excel Client doesn’t use None, but it uses “Unspecified” instead for some reason). In this scenario, I store a 2-digit website country identifier in an eVar and use a SAINT Classification to provide a full name. I filter on “Unspecified” where the metric is Visits, Form Views and Form Completes.

    dq_excel4

    After running, you will see a succinct report that looks like this:

    dq_excel3

    In this case, there are no Form View or Form Complete Success Events missing a Full Site Locale SAINT Classification, but there are some Visits missing the classification. You can then easily go into SiteCatalyst or Discover, open the Full Locale Name report and break it down by its source to find out what values are left to be classified.
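    For those who prefer scripting to spreadsheets, the “Unspecified” check boils down to a simple percentage calculation. Here is a small Python sketch; the locale names and Visit counts are invented, standing in for an Excel Client data block:

    ```python
    # A sketch of the classification-completeness check: what share of a
    # metric falls into the unclassified ("Unspecified") row. The locale
    # names and Visit counts below are hypothetical.
    visits_by_locale = {
        "United States": 48200,
        "Germany":       9100,
        "Japan":         6400,
        "Unspecified":   1300,  # eVar values missing a SAINT classification
    }

    total = sum(visits_by_locale.values())
    unclassified_pct = visits_by_locale["Unspecified"] / total

    print(f"{unclassified_pct:.1%} of Visits are unclassified")
    ```

    Tracking this one number per classification over time is essentially what the summary workbook described below the screenshots does.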

    Finally, if you want to earn “extra credit,” you can do this for all of your SAINT Classifications in one Excel workbook and build a summary screen like the one below, which pulls the percentages that are unclassified into one view so you can see how you are doing overall. What is cool about this is that you can use the “Refresh All” feature of the Excel Client to check all of your Classifications while you get coffee, and when you get back, you have a fully updated view of your SAINT Classifications. In the example below, I have shaded in black the items that are OK if they aren’t fully classified, in green the items that are acceptable, and in red the items that require attention:

    dq_excel5

    Final Thoughts
    As you can see, Data Quality is a HUGE topic so it is hard to cover it all in one post, but hopefully some of the pointers here will get you thinking about how you can improve in this area. One last thing I will mention is that like most things related to web analytics, tools are good, but qualified people are better! Therefore, I think that any serious web analytics team will have a resource who has Data Quality as one of their primary performance objectives. Without this, Data Quality tends to fall by the wayside. Try to do whatever you can to convince your management that having a full or part-time person devoted to Data Quality will pay hefty dividends in the future…

    Analytics Strategy

    Welcome to our newest partner, John Lovett

    As you can see by the title of this post, Analytics Demystified has some amazingly huge news — respected industry veteran and former Forrester Research senior analyst John Lovett has come on board as a Senior Partner. I have known John for years and have been one of his biggest fans all along; imagine my delight when he decided to leave his awesome job and help Aurelie and me build something truly great here at Analytics Demystified!

    John has long blogged over at Analytics Evolution but he’s already set up shop here at http://john.analyticsdemystified.com.  As you can see he’s pretty excited about joining our team as well, and you can read the official press release on our web site.

    For the past two years at Analytics Demystified, and together with Aurelie this year, I have worked to build a truly great and well-differentiated consulting firm. Lots and lots of companies do implementations, reporting, and the basic block-and-tackle work that is the foundation of our industry. But when I left Visual Sciences I wanted to fill a completely different need. In the past two years we have, I believe, successfully done that.

    We count among our clients some of the best companies doing business online, some of the greatest technology firms in measurement today, and some of the nicest people working in web analytics today. Based on the early news from our trusted clients and partners John will only accelerate our growth and allow Analytics Demystified to focus on more strategic and more valuable engagements across the globe. Plus, since John is a total and complete Rock Star like Aurelie, I have a new partner that I know I can trust with the kinds of clients we work with here at Analytics Demystified.

    We will have more great announcements from the firm in the coming weeks but I will leave you with these few things:

    • If you’re an Analytics Demystified client on any kind of retainer, you have immediate and automatic access to both John and Aurelie. Contact me directly for more information;
    • If you’d like to talk to John about his practice at Analytics Demystified you can email him at john@analyticsdemystified.com;
    • If you’d like to know more about this announcement and how Analytics Demystified can help your business, please give me a call at (503) 282-2601;
    • If you’re in Boston and want to congratulate John and buy the man a drink, please join us at Web Analytics Wednesday in Cambridge on TUESDAY, DECEMBER 15TH (sponsored by our friends at Unica and SiteSpect and hosted by Judah Phillips)

    I hope everyone reading this blog will join me in welcoming John to the team and take the time to read his “Hello World” post titled “Let the Wild Rumpus Start.” Oh, and feel free to spread the word!

    Analytics Strategy, General

    Let The Wild Rumpus Start

    I feel liberated. For the first time in my professional career I don’t have to answer to anyone. Sure, I still carry accountability to my new partners Eric and Aurelie, to the Analytics Demystified brand that we will continue to grow and evolve together, and most importantly to myself to produce high caliber work. Yet, there isn’t anyone telling me what to do anymore. Not that I didn’t have autonomy in many of my previous roles…I did. But somehow working in an environment where I’m calling the shots – where the upside is big and the downside threatening – where I have an opportunity to make a difference that’s entirely my own creation – it is invigorating.

    Making mischief of one kind and another…
    So with this newfound freedom I plan to embark on initiatives and activities that weren’t previously available to me in former roles. And I will challenge the conventional measurement dogma along the way. I mentioned earlier that I intend to be an agent for change in Web analytics. To me that means: questioning the status quo of vendor measurement practices; challenging clients to fully develop their strategic vision for measurement; cultivating talent that instills measurement as a fundamental marketing discipline; and driving the industry to collectively embark on advancement. Simply forging ahead in using analytics and optimization technologies in the way we’ve done so during the past 5 years won’t get this industry to a better understanding of customer behavior and marketing intelligence.

    Sailing off through night and day…And in and out of weeks…
    Just for context, I’ll let you know that I did not arrive here overnight. I began my career 15 years ago as a marketer and realized that digital mediums could offer faster, better and more effective means of reaching customers. I watched the impact of my digital efforts blossom and realized that web technologies were the way of the future. I embarked on my analyst career back in 1999 by joining two former Forrester analysts at Gomez Advisors, where I immersed myself in the online experience. There I learned what it meant to have a well-founded digital strategy and consulted with firms on how to formulate one. As Gomez evolved into a performance management company, I continued my consulting and delved into the technical side of what it means to offer faster and more reliable online marketing. It was during this time that I realized that measurement was the basis for truly understanding marketing efforts. This led me to conduct analytics and optimization research at several analyst firms including: the Aberdeen Group, Jupiter Research and most recently Forrester. During my time at each of these companies, my quest remained the same: to help marketers understand how consumers receive, interact with and respond to digital marketing – and what to do about it.

    To where the wild things are…
    That was how I became acquainted with Web Analytics and the industry figures that are indeed the wild things. I’ve commented before that the people of Web Analytics are among the most inviting and hospitable bunch I’ve ever met. I can recall my first conversation with Jim Sterne where I pitched him an idea for eMetrics and his response was “more please, my antenna are tingling…”. Practitioners like Judah Phillips and Jim Hassert were willing to get on the phone and articulate their analytics frustrations along with their successes to help me create research that would resonate with the marketplace. Consultants like Jim Novo and June Li spoke freely about their experiences in education and evangelism of Web Analytics that helped me formulate a perspective on why people cared so much about this industry. And vendors spoke about their technologies with passion and excitement. Every individual that I approached regarding Web Analytics was willing to share a story, some valuable insight or their unique perspective. It was clear that the people that worked within this industry had passion for what they did and I wanted to make myself an indispensable part of that community.

    The most wild thing of all…
    Among all the characters I met who were involved with Web Analytics one stood out apart from the rest. Eric Peterson seemed to personify Web Analytics. His enthusiasm for analytics and his capacity to evangelize measurement somehow captivated the veterans and newly indoctrinated alike. His passion for Web Analytics was emphatic and his communication tactics resonated. I did and still do appreciate that he takes a stand on his opinions regarding Web Analytics topics, but is still willing to give audience to differing views. He’s also willing to change his opinion if he’s proven wrong. While I’ve accused him of apologizing like Larry David, he will acquiesce when he’s wrong. Most importantly, Eric has made an indelible impact on the Web Analytics industry. Thus, when Eric and Aurelie approached me about joining them as a partner at Analytics Demystified, I couldn’t refuse.

    It’s still hot…
    So here I am, beginning a new chapter in my career that includes: education, evangelism and inciting change for Web Analytics. I’m here because I truly believe that this space is hot and it’s one that I want to be a part of for years to come. I relish the opportunity to work side by side with Eric and Aurelie because we’re all equally invested in this industry and each offer unique perspectives on where it’s all going. This means that we won’t necessarily agree on everything, but we do share a common view of the big picture. I hope to bring balance to the partnership and the chance to offer my perspective through thought leadership, guidance and evangelism. I’m also looking forward to sharing my experience, my learnings and my viewpoint with you. This industry wouldn’t be here if not for people to move it forward. I welcome conversations about how we can collectively advance measurement technologies and encourage you to reach out and share your views.

    And now, let the wild rumpus start!

    Analytics Strategy

    Recap: Web Analytics Wednesday with Foresee Results

    Last week was our monthly Web Analytics Wednesday in Columbus. Foresee Results sponsored the event and provided a highly engaging speaker: Kevin Ertell, Foresee’s VP of Retail Strategy and the blogger behind Retail: Shaken Not Stirred.

    We had a good crowd — just under 30 people — and we did our usual half-hour of networking before sitting down to order food and cover the evening’s topic.

    Pre-Dinner Networking at Web Analytics Wednesday

    We had attendees from a wide range of companies: Nationwide Insurance, Resource Interactive, Victoria’s Secret (including Monish Datta…which I mention here solely for quasi-inside SEO joke purposes), DSW, Diaz & Kotsev Business (Web) Consulting, WebTech Analytics, Quest Software (makers of Foglight, actually, which I didn’t realize until I was writing the rest of this post), QStart Labs, Submerged Solutions, Bizresearch, Lightbulb Interactive, JoeMetric, Express, Cardinal Solutions, and various independent consultants. By my count, 30% of the attendees were first-timers, and the remaining attendees were a pretty even split between hard-core regulars and every-few-months dabblers.

    Kevin is a great speaker — one of those guys whose use of PowerPoint is primarily to provide images that back up the stories he weaves.

    Kevin Ertell presents at Web Analytics Wednesday

    One of the stories was the “tree stump on the conference room table” story, which was about how we get used to having odd, not-particularly-helpful aspects of our web sites that are jarring to first-time and infrequent visitors, but that we never think to address.

    Tree Stump on a Conference Room Table

    You can ping Kevin on Twitter directly for a more complete explanation on that analogy, if you want. If I try to recreate it entirely, I’ll butcher it for sure! I will take a shot at summarizing the four-step process Kevin laid out for going beyond web analytics data to drive site improvement, though, which was the meat of the presentation.

    Step One: Ask Your Visitors for Feedback

    On-site surveys provide valuable information because they let you ask your visitors questions directly, rather than simply trying to infer, strictly from behavioral data, what they were trying to do, how successful they were at doing it, and how smooth the process was. Web analytics = behavioral data. Survey data = attitudinal data. Got it?

    Some of the highlights on this step:

    • Incentives aren’t needed to get people to take a 15-30 question survey — I think Kevin said they see something like 6-10% of the people who are offered a survey actually accept the offer (not all visitors to a site get offered the survey) and they’re able to build up an adequate sample fairly quickly in most cases
    • The way Foresee Results offers surveys, typically, is that they offer the survey when visitors arrive on the site, but then conduct the survey on exit
    • The wording of the survey questions matters — there are good/valid ways to word questions and there are bad/invalid ways to word questions; there are oodles of research and expertise on that subject, and it’s worth partnering with someone (a consultant, a company) who really knows the ins and outs on that front to make sure that the data you collect is valid
    • The Foresee Results secret sauce is that they ask questions that fall into three broad categories: 1) questions about different aspects of the site (content, functionality, navigation, search, etc.), 2) questions to gauge customer satisfaction (very precisely worded questions that are backed up by the research behind The American Customer Satisfaction Index — ACSI), and 3) questions to gauge likely future behavior (likelihood to purchase online, likelihood to purchase offline, likelihood of returning to the site, etc.). Foresee Results then uses an analytic model to link these three elements together: the first category as drivers of customer satisfaction, and customer satisfaction, in turn, as a driver of the various future behaviors. It’s a pretty nifty tool that I’ve been learning more about over the past few months. Powerful stuff.
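    To make that two-stage linkage concrete, here is a toy illustration in Python. This is emphatically not Foresee’s actual model (which is built on the ACSI methodology); the survey scores are invented, and simple one-variable regressions stand in for their analytics:

    ```python
    import statistics

    # Invented survey scores for five respondents.
    nav_score     = [6, 7, 8, 9, 10]   # rating of site navigation
    satisfaction  = [5, 6, 8, 8, 10]   # overall satisfaction rating
    likely_return = [4, 5, 7, 8, 9]    # likelihood of returning to the site

    def slope(x, y):
        """Ordinary least-squares slope of y regressed on x."""
        mx, my = statistics.mean(x), statistics.mean(y)
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        den = sum((a - mx) ** 2 for a in x)
        return num / den

    # Stage 1: a site aspect (navigation) as a driver of satisfaction.
    stage1 = slope(nav_score, satisfaction)
    # Stage 2: satisfaction as a driver of a future behavior.
    stage2 = slope(satisfaction, likely_return)

    print(stage1, stage2)
    ```

    The point is simply the chain: an aspect score predicts satisfaction, and satisfaction predicts a future behavior, so improving the aspect should ripple through to outcomes.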

    This step, done right, gives you the basic diagnostics: where the most significant opportunities for driving improvements exist with your site.

    Step Two: Augment Quantitative with Qualitative

    This step is to augment the quantitative survey data with more qualitative information. The quantitative data can help you slice/segment the data so that you can review the responses to open-ended questions in a more meaningful way.

    Presumably, these qualitative questions are ones that you update over time as you identify specific areas on which you want to focus. If, for instance, you found out in Step One that navigation was an area where your site scores low and also has a significant impact on customer satisfaction, then you might want to gather some qualitative data specifically regarding navigation, and you might want to break that out between people who came to the site expecting to make a purchase and people who came simply to do comparison shopping.

    This sort of analysis will give you insight into the specific friction points on the site — what types of visitors hit them and what sorts of tasks they’re trying to accomplish when they do.

    Step Three: Watch Customers (in a Focussed Manner)

    This is a step that Kevin pointed out companies sometimes try to put first, which makes it unnecessarily expensive and time-consuming. The key here is to use the information from the first two steps to focus what you are going to observe and how. Various options for watching customers:

    • Session replay — what exactly did visitors on the site do and how; in the case of Foresee Results, these replays can be tied directly to specific survey respondents (pretty slick), but Tealeaf and Foglight are tools that provide replay functionality, too
    • Eye-tracking — this requires getting people into a lab of some sort, so, obviously, the more focussed you can get, the better
    • Usability testing — this may include eye-tracking, but it certainly doesn’t have to; obviously, there are benefits of being able to focus the usability testing, whether it’s conducted in a usability lab or even in-store

    Now, you should really have a good handle on specifically what’s not working. But, what if you don’t really have any good ideas as to what to do about it? Then…

    Step Four: Usability Audit

    Work with usability experts to assess the aspects of your site that are underperforming. Arm them with what you have learned in the first three steps!

    To me, it seems like you could swap steps three and four in some cases — let a usability expert audit your site and identify likely opportunities to improve the trouble spots.

    Driving Continuous Incremental Improvement

    By keeping the survey running on an on-going basis — adjusting questions as needed, but keeping the core questions constant — you can monitor the results of changes to the site as you roll them out. And, of course, your web analytics data — especially on-site conversion data — is one tool for monitoring if you are driving outcomes that matter.

    One point on the incremental changes front: during the Q&A, Kevin talked about how sites that roll out major redesigns invariably see a temporary dip in results while visitors get used to the new site. Incremental changes, on the other hand, can occur without that temporary drop in performance.

    Interesting stuff!

    Analytics Strategy, General

    Google Analytics Intelligence Feature is Brilliant!

    Long-time blog readers are likely aware that I’m not prone to writing about individual technologies or product features unless I have the opportunity to break the news about something new and cool (or not, as the case is from time to time). But once in a while a single feature comes along that is, in my mind, so compelling and cool that I need to bend my own rules; Google Analytics’ new “Intelligence” offering is exactly that feature.

    Just in case you’ve been living under a rock for the past month and haven’t already heard about “Intelligence” have a quick watch of the following video pulled from the Google Analytics blog:

    Pretty awesome, huh? What’s more, now that I’ve had a few weeks to play with the feature and think about it in the context of my published views on the Coming Revolution in Web Analytics, I think that “Intelligence” is one of the most important advances in web analytics since the JavaScript page tag.

    While Google is certainly not the first vendor to apply some level of statistical and mathematical rigor to web analytics data (an honor that would likely go to Technology Leaders for their Dynamic Alert product, or to Yahoo! for their use of confidence intervals when exposing demographic data in Yahoo! Web Analytics), in my humble opinion Google has done the best possible job making statistical analysis of web analytics data accessible, useful, and valuable.

    Some things I really like:

    • An approachable way to determine confidence intervals via their “Alert Sensitivity” slider. While the implementation doesn’t necessarily impart the level of detail some folks would like, the slider mitigates the prevalent concern that “people won’t understand confidence intervals.”
    • Great visual cues for alerts, especially when statistically relevant changes are not obvious based on traffic patterns. Sometimes traffic patterns just look like hills and valleys, even when something important is happening — for example, the next figure shows two alerts at the lowest threshold setting on September 16th that, upon exploration, turned out to be great news (that I might have missed otherwise.)
    • Good visual cues regarding the statistical relevance of the insight being communicated. This is tough since Google is trying to present moderately complex information regarding the underlying calculations and how much emphasis you should be putting on the insight. By showing a relative scale for “significance” I think Google has more or less nailed it.
    • Google Analytics finally starts communicating about web analytics data in terms of “expectations” instead of absolutes. All of us (present company included) have a tendency to get wrapped up in whole numbers, hard counts, and complete data sets. But we also know that Internet-based data collection just isn’t that accurate, and so any push to get us to start thinking in terms of predicted ranges and estimates is a step in the right direction. For example, I love knowing that on a given day Google Analytics “expects” between 311 and 388 people to come to my site from the UK!
    • Lots more, including the ability to pivot the views and look from a “metric-centric” and “dimension-centric” perspective, the ability to aggregate on day, week, and month, and the ability to add your own custom alerts based on changes in traffic patterns. Perhaps ironically this last functionality (“Custom Alerts”) is how we’ve all historically thought about “Intelligence” in reporting, and while useful seems somewhat weak compared to Google’s stats-based implementation.
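    The “expects between 311 and 388” example above is essentially a prediction interval. As a rough sketch of the underlying idea (this is not Google’s actual model, which isn’t public, and the daily visit counts below are made up), you can bound tomorrow’s traffic using the mean and standard deviation of recent daily counts:

```python
import statistics

def expected_range(daily_counts, z=1.96):
    """Rough 95% interval for the next day's count, assuming
    day-to-day variation is approximately normal (illustrative only)."""
    mean = statistics.fmean(daily_counts)
    spread = z * statistics.stdev(daily_counts)
    return (max(0, round(mean - spread)), round(mean + spread))

# Two weeks of (hypothetical) daily visits from the UK
history = [352, 340, 361, 298, 310, 375, 344, 330, 358, 366, 321, 349, 337, 355]
low, high = expected_range(history)
```

    An alert would fire when an observed day lands outside a band like this; moving the “Alert Sensitivity” slider is loosely analogous to widening or narrowing the multiplier z.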

    While awesome in its first instantiation, there are some obvious things that the Great GOOG could improve in the feature. Some ideas include:

    • More dimensions and metrics, although I believe both Nick and Avinash have commented that they are already working on adding intelligence to other data collected.
    • Some way to expose confidence intervals and p-values would be useful (perhaps as a mouse-over) so that the increasing number of analysts with experience in statistics could have that data in their back pocket when they went to present results.
    • Email alerts for the automatically generated insights, for example when “Intelligence” determines that five or more alerts have been generated it would be cool to get an email/SMS/Tweet/Wave notification.
    • The ability to generate alerts against defined segments, so that I could see the same analysis for different audiences that I’m tracking.

    Mostly ticky-tack stuff, but again I’m pretty damn impressed with their freshman effort. I suppose I shouldn’t be surprised since evangelist Avinash has been talking about the need for statistics in web analytics for an awfully long time, but given that so many in our industry have balked at bringing more mathematical rigor to our work (including said evangelist, oh well) it’s encouraging to see Google move in this direction.

    What do you think? Are you using “Intelligence”? Is it helping you make better decisions? Do you like the implementation as much as I do? I’d love to hear your thoughts and comments.

    Analytics Strategy, General

    Are You Ready for the Coming Revolution?

    Few would argue that the past few years in web analytics have been, well, intense: the emergence of Yahoo Web Analytics, multiple management shake-ups at WebTrends, Adobe’s acquisition of Omniture following Omniture’s acquisitions of Visual Sciences, WebSideStory, Offermatica, Instadia, and TouchClarity, and the continued push into the Enterprise from Google Analytics. From where I sit we have seen more changes in the last 24 months than in the entire 12 years previous (my tenure in the sector) combined.

    When I think about these changes, I find myself coming to the undeniable conclusion that our industry is undergoing a radical transformation. More companies than ever are paying attention to digital measurement, and despite my disbelief in Forrester’s numbers, an increasing number of these companies are forging a smart, focused digital measurement strategy. At the X Change, at Emetrics, and at Web Analytics Wednesday events around the world there is more and more evidence that this wonderful sector I call “home” is really starting to grow up.

    And we’re just getting started.

    If you pay close attention to the marketing you see from Omniture, WebTrends, Unica, Coremetrics, and the other “for fee” vendors you’ve surely noticed a dramatic change recently. Nobody is talking about web analytics anymore; the entire focus has become one of systems integration, multichannel data analysis, and cross-channel analytics.

    All of a sudden web analytics is starting to sound like, gasp, business and customer intelligence.

    Eek.

    Since it’s late and since this post will be over-shadowed by the hype around Google Analytics releasing more “stuff” on Tuesday I’ll cut right to the chase: I believe that we are (finally) on the cusp of a profound revolution in web analytics and that the availability of third-generation web analytics technologies will finally get digital measurement the seat at the table we’ve been fighting to get for years.

    Statistics, people … statistics and modeling, predictive analytics based on web data, true forecasting, and true analytical competition for the online channel. Yahoo’s use of confidence intervals when presenting demographic data and the application of statistical models in Google’s new “Analytics Intelligence” feature are just the beginning. As an industry it’s time to stop fearing math and embrace analytical sciences that have been around for longer than many of us have been alive. It’s time to stop grousing about how bad the data is and actually do something about it.

    Do I have your attention? Good.

    Thanks to the generosity of the kind folks at SAS I have a nicely formatted white paper that is now available for download titled “The Coming Revolution in Web Analytics.” Just so you can see if you might be interested here is the Executive Summary from the document:

    “Forrester Research estimates the market for web analytics will be roughly US $431 million in the U.S. in 2009, growing at a rate of 17% between now and 2014.  Gartner reports that the global market for analytics applications, performance management, and business intelligence solutions was US $8.7 billion in 2008—roughly 20 times the global investment in web analytics.  Among their top three corporate initiatives, most companies are focusing their efforts online, expanding their digital efforts to increase the organization’s presence in the least expensive, fastest growing channel.

    Today, a majority of companies are dramatically under-invested in analyzing data flowing from digital channels.  Even when business managers have committed money to measurement technology, they usually fail to apply commensurate resources and effort to make the technology work for their business.  Instead, most organizations focus too much on generating reports and too little on producing true insights and recommendations, opting for what is easy, not for what is valuable to the business.

    Analytics Demystified believes this situation is exacerbated by the inherent limitations found in first- and second-generation digital measurement and optimization solutions, provided by a host of companies primarily focused on short-term gains in the digital realm rather than long-term opportunities for the whole business and their customers.  Historically these companies worked to differentiate themselves from traditional business and customer intelligence, focusing on the needs of digital marketers.  Unfortunately, as the need for whole-business analysis increases, many of these vendors are playing catch-up, forced to bolt on data collection and processing technology as an afterthought.

    The current state of digital analytics is untenable over time, and Analytics Demystified believes that companies that persist in treating online and offline as “separate and different” will begin to cede ground to competitors who are willing to invest in the creation and use of a strategic, whole-business data asset.  These organizations are using third-generation digital analytics tools to effectively blur the lines between online and offline data—tools that bridge the gap between historical direct marketing and market research techniques and Internet generated data, affording their users unprecedented visibility into insights and opportunities.

    This white paper describes the impending revolution in digital analytics, one that has the potential to change both the web analytics and business intelligence fields forever.  We make the case for a new approach towards customer intelligence that leverages all available data, not just that data which is most convenient given the available tools.  We make this case not because we believe there is anything wrong with today’s tools when used appropriately, but because we believe digital analytics should take a greater role in business decision making in the future.”

    Since I pride myself on the quality of my readership, I sincerely hope that each of you will download this document and take the time to read it. More importantly, I’d love for you to share it with your co-workers, friends, and followers on Twitter. I believe we are at a critical juncture in our practice’s history, where the skills that have served us all along are not going to serve us for much longer, but I am always willing to admit that I’m wrong and more than anything I love a spirited debate.

    Are you ready for the revolution?

    Analytics Strategy

    SEO Tips and Thoughts at Web Analytics Wednesday

    Last week’s Columbus Web Analytics Wednesday had something of an odd vibe, but it was also one of the most tactically informative ones that we’ve had to date! The crowd was smaller than usual — 18 attendees — due to a confluence of factors ranging from the influenza virus (not H1N1, as far as I know, but appropriate precautionary non-attendance by several people), to business travel, to residential water line leaks, to touching-if-inconveniently-timed spousal romantic gestures! The silver lining is that, to a person, there was genuine regret about not being able to attend the event, which is a strong indication that our informal community of local analysts really has solidified. (Monish Datta was in attendance, so I am able to gratuitously make a reference to him — ask him or me at the next WAW what that is all about, if you don’t already know!)

    As for the event itself, we welcomed a new sponsor — Resource Interactive. The topic for the event was search engine optimization (SEO) with a little bit of search engine marketing (SEM). It wasn’t the first time that we relied on Dave Culbertson of Lightbulb Interactive to present, and it likely will not be the last, as his knowledge and enthusiasm about SEO, SEM, and web analytics is both entertaining and informative!

    Dave Culbertson at Web Analytics Wednesday

    Dave attended SMX East in New York the week before WAW, and he agreed to pull together the highlights of the sessions that he attended. One of my favorite tweets from Dave while he was at the conference was this one:

    “Ended up leading a lunchtime discussion on web analytics at #smxeast. Web analytics and SEO – like peanut butter and chocolate!”

    Partly because Dave is one of the organizers of Columbus Web Analytics Wednesday, and partly because, well, SEO/SEM and web analytics really should be integrated, “search” is a frequent cornerstone of our WAW topics. Dave’s presentation was titled SMX East 2009: The Spinal Tap Wrap-up. At least half of us (myself included) didn’t get the reference, while a solid quarter of the attendees immediately got it and thought it was quite clever and amusing. There were 11 slides in the deck, naturally (Spinal Tap amps, after all, go to eleven).

    The presentation focussed primarily on SEO tips, although there was some SEM here and there. An incomplete list of the nuggets/surprises that jumped out the most to me included:

    • PageRank sculpting — this is when you try to gently influence the Google PageRank for pages you control by making subtle, behind-the-scenes tweaks to both that page and other pages that you control that link to that page. Apparently, a somewhat common way to do this has been through the use of the NoFollow tag. While this may have worked at one point, Google now pretty much ignores the tag when it comes to assessing PageRank.
    • rel=”canonical” — this is a biggie, especially when it comes to web analytics and campaign tracking; this is a tag that can be added to a page to specify the exact “preferred” URL for the page. It’s important because many pages get linked to or arrived at with one or many extraneous parameters tacked on to the end of the URL: campaign tracking parameters for the web analytics tool, link tracking information for the e-mail engine from which a user may access the page, session ID or user ID information for the application that is rendering the page to enable it to make subtle tweaks in the content, etc. The full adoption of this tag by Google, Yahoo! Search, and Bing should go a long way towards removing the tension that exists between the SEO person pushing for the removal of these parameters in links (to avoid link dilution) and the web analyst who pushes to add them (to improve tracking capabilities). Google put together a nice write-up and video on the canonical tag after SMX West.
    • keywords — this is “keywords for SEO,” rather than the SEM usage of the term. A lot of information was presented about studies as to where the appearance of a keyword had the most/least impact. Having the keyword in the domain name itself was great, but, of course, you’re not going to be able to do that for too many keywords! (I couldn’t help but think of Clearsaleing’s http://www.attributionmanagement.com/ site, though!) Even better is to have the keyword in both the domain and the directory path (i.e., http://www.keyword.com/keyword). Having the keyword in a subdomain (http://keyword.company.com) is apparently not very effective (there was a quick side discussion about an online shoe retailer — and I can’t remember which one it was and, ironically, can’t seem to put together the right Google search to figure it out — that tried creating a subdomain for every type of shoe they sold…which then helped trigger Google to discount the tactic; I’m fuzzy on the specifics, obviously!) Another point here is that there are two factors at work: what the search engine algorithm puts weight on keyword-wise, and how user behavior — which links users follow — is affected by keywords showing up in subdomains, domains, query parameters, etc. It’s hard to tease out which is which, so the studies have focussed more on “what actually happens” than on “why it happens.”
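    On the rel=“canonical” point: the same collapsing of URL variants can be sketched in code, which is handy on the web analytics side when deduplicating page URLs in exported data. This is a minimal illustration only (the tracking parameter names are common conventions I’m assuming, not anything from Dave’s deck):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Parameters that identify a campaign or session rather than content.
# This list is an assumption for illustration; real sites vary.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "utm_term",
                   "utm_content", "sessionid", "cid"}

def canonical_url(url):
    """Drop tracking parameters so URL variants collapse to one page."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

canonical_url("http://example.com/shoes?color=red&utm_campaign=fall&cid=123")
# keeps color=red and drops the campaign/session parameters
```

    The tag itself does the same thing declaratively: adding <link rel="canonical" href="http://example.com/shoes?color=red" /> to the page head tells the engines which URL should accumulate the credit.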

    At the end of the day, search engine optimization still comes down to providing great content in a way that users can easily navigate to it and consume it. Google’s algorithms are geared around making the same recommendations that a human being with an infinite knowledge of what content was where on the web would recommend in response to a question from another human being. SEO efforts need to focus on helping that theoretical human out — not trying to fool him/her!

    I also distributed copies of the deck that Laura Thieme of Bizresearch presented at SMX East. That presentation was primarily SEM-focussed, but it also had some great nuggets in it. Unfortunately, Laura wasn’t able to attend WAW (see the first paragraph of this post!) this month. Laura presented at WAW back in July and really knows her way around SEM, so we missed having her there!

    All in all, it was a good event!

    Analytics Strategy

    An Apology of Sorts …

    Now that Omniture’s Q3 earnings are public, I sort of feel like I need to apologize to the company, or at least recognize that they did a good job last quarter leading into their sale to Adobe Systems. Despite what I had heard from multiple sources, their earnings announcement was right in line with guidance. Congratulations to the entire Omniture and Adobe team!

    It still leaves me scratching my head about the deal since the synergies are less obvious to me than they clearly are to Adobe and Omniture’s management and shareholders, but hey, with the sheer number of changes occurring in the industry right now who knows what might actually work. Hell, based on what I’m hearing about the Google Analytics announcement next Tuesday, it’s going to look like a great time to be focusing on something other than competing with Google Analytics …

    I’m going to get to spend time with many of their largest customers next week so I suspect I’ll hear a great deal more about how this sale is being met by HBX customers, Visual Sciences customers, and those folks who have a tremendous amount invested in the SiteCatalyst line of products. If you’re an Omniture customer going to Emetrics next week and have an opinion you’d like to share please reach out to me directly and we’ll arrange some time to chat.

    Again, congratulations to Josh James and all of the OMTR shareholders on what is increasingly looking like a great deal for all involved.

    Analytics Strategy, General

    New Data on the Strategic Use of Web Analytics

    Recently Google published the results of a Forrester Research study they had commissioned (PDF) to help the broader market understand the use and adoption of free web analytics solutions.  Google should be applauded for commissioning Forrester to conduct this work, especially given the quality of the research and the level of insights provided.  Without a doubt, free solutions like Google Analytics and Yahoo Web Analytics are having an impact on our industry and driving change in ways few of us ever imagined.

    I really did enjoy the Forrester report, primarily because the author (John Lovett) managed to surface totally new data.  When he first told me that over half of Enterprise businesses were using free solutions I have to admit I didn’t believe him.  In a way I still don’t, but perhaps that’s only because I work with a slightly different sample than he presents.  Regardless, John’s report paints a picture of an increasingly challenging market for companies selling web analytics and a new sophistication among end users.

    Speaking of sophistication, there are a few points in the report that I question, and since I have pretty good luck getting feedback from readers on big picture stories I figured I’d bring them up here in the blog.  Before I do I want to emphasize that I am not questioning Forrester or John’s work—I am merely trying to explore some data that I find contrary to my own experience in this public forum.  To this end I pose a handful of questions that I would love to discuss either openly in comments or via email.

    The first point I question is the observation in Figure 3 that 70% of companies report having a “well-defined analytics strategy.”  Two years ago my own research found that fewer than 10% of companies worldwide had a well-defined strategy for web analytics.  Last year Econsultancy reported that only 18% of the companies in their sample had a strategy for analytics.  To jump from these low numbers to the majority of Enterprises just doesn’t square with my general experience in the industry.


    Remember, the implication of this data point is that 70% of all companies with more than 1,000 employees have a “well-defined analytics strategy.”  According to a 2004 report from the U.S. Census Bureau there were just over 12,000 companies in the U.S. with more than 1,000 employees.  Without assuming any growth between 2004 and 2009, Forrester’s 70% figure would result in over 8,500 companies in the U.S. that have a “well-defined” strategy for web analytics. Does that sound right to you?

    Consider that the combined customer count for Omniture, WebTrends, Coremetrics, and Unica in the U.S. doesn’t even add up to 8,500 companies.  Even if you use the more conservative 13% who “strongly agree” with Forrester’s statement, you end up with over 1,500 U.S. companies.  I may suffer from sample bias, but personally I can barely think of 150 companies that I would identify as having any strategy for web analytics, much less a “well-defined” one.
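    The arithmetic behind those two checks is easy to reproduce with round numbers (taking the “just over 12,000” Census count as a flat 12,000; the exact count presumably nudges the first figure past 8,500):

```python
# U.S. companies with 1,000+ employees (2004 Census figure, rounded down)
companies = 12_000

# 70% report having a "well-defined analytics strategy" (Forrester, Figure 3)
agree = round(companies * 0.70)           # 8,400 companies

# The stricter 13% who "strongly agree" with the statement
strongly_agree = round(companies * 0.13)  # 1,560 companies
```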

    Most companies I talk to have the beginnings of an over-arching strategy—they’ve realized the need for people and are beginning to reduce their general reliance on click-stream data alone.  But given that I think about this topic from time to time, I think a “well-defined” strategy for web analytics takes into account multiple integrated technologies, appropriate staffing, and well thought-out business and knowledge processes for putting their technology and staff to work.  What does the phrase “well-defined strategy” imply to you?

    Similarly, if 60% of companies truly believed that “investments in Web analytics people are more valuable than investments in Web analytics technology” there would be THOUSANDS of practitioners employed in the U.S. alone.  But again, every conference, every meeting, every conference call, and every other data point suggests that the need for people in web analytics is still an emerging need.  Hell, Emetrics in San Jose earlier this year barely drew 200 actual practitioners by my count.  How many web analytics practitioners do you think there are in the United States?

    Same problem with the rest of the responses to Figure 3 on web analytics as a “technology we cannot do without” (75%) and the significance of the role web analytics plays in driving decisions (71%).  Perhaps I’m talking to entirely the wrong people, perhaps I’m interpreting these data wrong, and perhaps I’ve gone flat-out crazy, but these responses just don’t match my personal understanding and experience in the web analytics industry.

    This issue of data that simply does not make sense, while not universally manifest in the report, crops up elsewhere as well. For example, consider Figure 8, which reports the percentage of application functionality used, segmented by fee-based and free tools.


    When I look at these responses and see that 63 percent of respondents using fee-based tools and 50 percent of respondents using free tools claim to be effectively using more than half the available functionality, again I find myself scratching my head. As this data appears to speak to the general sophistication of use of analytics I went back and looked at Dennis Mortensen’s quantitative study of how IndexTools was being used around the world.

    Dennis reports that fewer than 10% of his customers were using even the most basic “advanced” features in web analytics (report customization) and that fewer than 4% of his customers (on average) are making any “advanced” use of the IndexTools application. While this dataset is somewhat biased towards European companies, who I believe, on average, to be somewhat behind their U.S. counterparts, it does provide an objective view into how web analytics tools are used that seems to directly contradict the self-reported responses in Forrester’s Figure 8.

    Clearly there is a gap between the responses John collected and the current state of the web analytics market.  Since John is a very smart guy I know part of his rebuttal will include the observation that he surveyed people directly responsible for web analytics (see Forrester’s methodology) and that people in general have a tendency towards positivism. Trust me, my son is the most handsome little boy ever born and my daughter’s beauty is only matched by that of Aphrodite … same for your kids, right?

    Given the difficulty associated with gathering truly objective data regarding the use of web analytics, this type of self-reported data is usually what we have to go on.  While Omniture, WebTrends, Coremetrics, and Unica all have the fundamental capability to report data similar to that provided by Mr. Mortensen, it may not be in their best interests to expose underwhelming adoption and unsophisticated use (if that is what the analysis uncovered.)  Ultimately we’re forced to accept these self-reported responses and then reconcile them against our own views, which is why I’m asking my readers what they think about the data Forrester is reporting!

    Regarding these self-reported attitudinal responses on how web analytics is used strategically, perhaps the truth is found in the companies who “strongly agree” with John’s statements.  If we apply this lens, as opposed to the more optimistic view, we get the following:

    • 17% of companies recognize that web analytics is a technology they cannot live without;
    • Web analytics plays a significant role in driving decisions at 12% of companies;
    • 13% of companies have a well-defined web analytics strategy;
    • 9% of companies recognize that investments in people are more valuable than investments in technology

    These numbers start to make a lot more sense to me.  Likely the truth, as with so much in our industry, lies somewhere in between, but I would love to hear what you think about these adjusted numbers.  Do the lower numbers make more sense to you, or do you agree with John’s more optimistic assessment?

    Unfortunately if the lower numbers are correct the implication is that despite the incredibly hard work that companies, consultants, and industry thought-leaders around the world have done for years we still have an incredibly long way to go before web analytics is recognized as the valuable business practice that you all know it can be!

    Regardless I want to state that I do not disagree at all with the fundamental thesis in this report, that “free” is creating a whole new level of interest in web analytics and that, given proper consideration, free is an excellent alternative to paid solutions.  Lacking clear strategy and resources, too many companies have wasted too much money on paid solutions for free to not be compelling.  Thanks to the dedication of the Google and Yahoo teams, the world now has access to great applications that are in some regards more compelling than fee-based alternatives.

    While I may not have said this a few years ago, today I honestly do believe that “free” is a viable and appropriate alternative to fee-based solutions. While not appropriate in every situation, it is irresponsible to suggest that any company not willing to fully engage in web analytics should pay for ongoing services and support. Given advances from Google and the availability of Yahoo Web Analytics, any motivated company large or small now has access to a wealth of data that can be translated into information, insights, and recommendations.

    Conversely I agree with John (and Jim, and almost every thought leader I respect) who states that you need to “prioritize your business needs and culture for analytics first and then evaluate the tools.”  This goes back to the fundamental value proposition at Analytics Demystified: It’s not the tools you use but how you use them. If you’re not invested in developing and executing a clearly defined strategy for digital measurement, you may as well be grepping your log files.

    I would love your feedback on this post, either directly in comments or via email. Thanks again to the folks at Google for making this awesome research freely available and to John Lovett for shedding light on this incredibly important aspect of our sector.  Remember: we are analysts—our jobs are to ask hard questions and then ask even harder ones!

    Adobe Analytics, Analytics Strategy, Conferences/Community, General

    Analytics Demystified European Tour

    Those of you who live in Europe are likely already aware that my good friend Aurelie Pols has joined me as a partner in Analytics Demystified. Over the next two weeks she and I will be making a series of presentations and announcements at events across Northern Europe. We will be at:

    • The Online Performance Management seminars, hosted by Creuna, in Copenhagen on Thursday, October 8th and in Oslo, Norway on Friday, October 9th. More information about our hosts and registration is available from Creuna.
    • While we’re in Copenhagen we will be having a Web Analytics Wednesday on Wednesday, October 7th. I will be giving a short presentation on testing and if you’re in Copenhagen please join us at this FREE EVENT sponsored by IIH Nordic and Webtrekk
    • Over the weekend Aurelie and I will be hanging out in Stockholm, Sweden. If you’re in Stockholm and want to meet-up please either shoot me an email or Twitter me and we’ll make plans!
    • On Monday, October 12th and Tuesday, October 13th Aurelie and I will be joining the excellent Emetrics crew at Emetrics Stockholm. I will be giving the keynote on Tuesday morning and Aurelie and I will both be participating on a series of panels and shorter presentations. Those of you keeping score will note that I have attended EVERY SINGLE Emetrics ever held in the United States but this is my FIRST EVER event in Europe. Yahoo!
    • On Wednesday, October 14th, I will be hanging out in Amsterdam with the Nedstat crew but have a fair amount of downtime during the day. I’m staying near Vondelpark and if you’d like to meet and get a cup of coffee (seriously, I mean coffee, I’m too old for the other stuff) Twitter me and we’ll make plans!

    Since I usually do three European cities in three or four days this trip is a lazy walkabout for me (four cities, seven days) but Aurelie and I have planning to do and, of course, we’ll spend a little time enjoying the local culture.

    If you live in any of these cities, or if you plan to come to Emetrics, please join us and come say hello!

    Analytics Strategy, Social Media

    Web Analytics Wednesday: A Segmentation Experiment

    Last night was another great Web Analytics Wednesday in Columbus, courtesy of the Web Analytics Wednesday Global Sponsors (Analytics Demystified, SiteSpect, Coremetrics, and IQ Workforce). We had a respectable turnout of ~25 people (not including children) and a great time! And, all the better, I got to blind people with the flash on my new camera. A few of the highlights on the picture front:

    Bryan Huber from huber+co. interactive and Jen Wells from TeamBuilder Search

    Bryan Huber and Jen Wells

    Todd Ehlinger from Nationwide, Mike Amer from DSW, and Elaine F.

    Todd, Mike, and Elaine

    The Erics — Goldsmith from AOL and Diaz from Diaz & Kotsev Business Consulting (not shown: the THIRD Eric — Eric Moretti from Quest Software)

    The Erics

    The picture that didn’t come out well was the one of Laura Thieme of BizResearch with her daughter, Melina — hanging out on her mom’s shoulder…and nary a peep the whole evening (why couldn’t I have had one of those kids?!)! And (cliche warning) cute as a button! As it turned out, Melina wasn’t the only kid who made an appearance — Dave Culbertson’s sons were in attendance on the periphery for the first part of the evening as well.

    Rather than a formal presentation, we did an interactive, get-to-know-each-other, have-a-chuckle activity — conceived of and coordinated by Dave Culbertson from Lightbulb Interactive. Unlike my attempts to photograph Melina and Laura — where I only took one shot and then figured the flash was just cruel — I kept clicking the shutter at Dave until he struck a sufficiently expressive pose:

    Dave Culbertson Explains the Rules of the Game

    What Dave walked us through was a segmentation exercise: he had a list of questions, each with four possible answers, and we had to segment / re-segment ourselves after each question by going to the area of the room designated for how we would answer that question. An incomplete list of the questions:

    1. Where did you go for your undergraduate degree? a) Ohio State, b) not Ohio State, but another school in Ohio, c) not in Ohio, but in the U.S., or d) outside the U.S.
    2. Which of the following most describes your opinion of social media? a) revolutionary, b) evolutionary, c) nothing new, or d) what’s social media?
    3. If you were going to read only one book this month, what kind of book would it be? a) non-fiction business, b) non-fiction non-business, c) fiction non-science fiction, d) science fiction (or something like that)
    4. If you took only one vacation this year, where would you most like to go? a) the beach, b) the mountains, c) a large city, d) Disneyland
    5. What kind of car do you drive? a) American, b) European, c) Japanese, d) Korean

After we’d segment ourselves, Dave would ask a few follow-up questions of the group. It really did turn out to be a lot of fun (and, if you’re reading this post and recommended a book on that question, please leave a comment with the book you recommended! It sounded like there were some excellent reads, and I wasn’t taking notes!)

    For my part, I enjoyed getting folks’ take on the Omniture acquisition by Adobe. And, Bryan Huber mentioned what sounds like a pretty slick tool for <social media buzzword>online listening</social media buzzword> that factors in the influence of the person who commented about your brand as well as what they said — another part of the evening where I wasn’t taking notes (but, come on, the pictures ARE fabulous, right?).

    So, that’s the hasty recap of the evening. By the time this post publishes, I’ll be on my descent into Boston for a lonnnnng weekend with Mrs. Gilligan:

    Julie

    (And, for you, Eric G., none of the photos used in this post were subjected to post-processing other than cropping. There’s no way I’m going to be able to stick with that, though!)

    Analytics Strategy

    More color on Adobe + Omniture

    Wow, everyone seems to have an opinion about this acquisition. Some people think Microsoft will ride in at the 11th hour and out-bid Adobe because Microsoft and Adobe compete, and because Google has Google Analytics. On this point I am inclined to agree with Joe Davis, CEO of Omniture competitor Coremetrics, who comments that Omniture has been shopping the company around for some time and it is unlikely that Redmond hasn’t already had the opportunity to play (given the significant investment Microsoft has in Omniture.)

    Other folks appear to be worried that Adobe will be integrating Omniture into Flash and this raises privacy concerns. While certainly folks have concerns about tracking and the possibility of embedding tracking into Flash Local Shared Objects (LSO) I just have to believe that management at Adobe is smart enough not to risk Flash’s dominance by subjecting the technology to the scrutiny, navel-gazing, and paranoia of the “privacy police.”

Omniture’s customers, at least the ones I am talking to, are more or less 2 to 1 against the acquisition at this point, citing a variety of concerns (transition, failure to execute on stated product plans, talent flight, Adobe is not adept at services, etc.) Far be it from me to tell anyone’s customers they are wrong when expressing concerns, especially since this is an out-of-sector acquisition and Omniture is now more or less a medium-sized cog in a very big machine. Arguments for include loving Adobe (I love Adobe!), being relieved that Adobe is a big, grown-up company, and hopes that Adobe will focus on fundamentals like customer support, product execution, and global expansion.

    Another customer complaint is that Omniture is now losing the (thin, pasty) veneer of third-party objectivity and that some companies may not actually want Adobe to have access to their site’s data.  I think this may be the same boondoggle that Omniture (and others) have used to explain why “the Enterprise wouldn’t use Google Analytics” — except there is more and more (and more) evidence that the Enterprise does use Google Analytics — but it will be interesting to see how the “free-standing” analytics vendors work to make Omniture eat their own words now that they too are part of something larger.

The comment that has me most concerned is one best detailed by Carter Malloy, a research analyst at Stephens, Inc., whom I have known for years and know to be pretty level-headed regarding the sector. Carter sent me this, which I am simply repeating with his permission:

    “I don’t understand the strategic rationale on adobe’s part. Different end market buyers. Very different products. No real cost savings or integration between the two products. OMTR is very capital intensive vs. adobe not much at all. Seems like Adobe is buying growth with hopes for cross sells. I would be surprised to find out that OMTR did not shop the business around before accepting the bid from Adobe – we should find out soon in public filings required by the SEC. Omniture will still have to report 3Q09 earnings in October, but I think the deal will get closed before Q4 in Jan/Feb. I also think Adobe will show Omniture’s revenue performance on an informal basis going forward. It will be <10% of Adobe’s total revs, but I still think they will give analysts at least some idea of what growth looks like.”

    This was in response to my comment detailing a thesis that I have heard from several of Carter’s peers: that Omniture was about to blow Q3 earnings and that the result would be a dramatic dip in OMTR share price as investors head for the exit. The rationale is, apparently, that the company has over-promised and under-delivered for too long, both to investors and customers, and the economy has been the “last straw” for many who have opted to look elsewhere for web analytics technology. This, combined with slower-than-hoped adoption of non-core solutions (data warehouse, Test & Target, Search Center, Survey, etc.) resulted in a “company who’s greatest days are behind them” (direct quote, and I begged to attribute but was told “no” due to company policy.)

    Don’t get me wrong: This is not my thesis, at least not yet.

    While I have seen evidence of larger Omniture customers switching, increasingly to Unica, I have not seen enough evidence of the kind of massive shift away from SiteCatalyst that would warrant a sudden exit. The good news is that Carter’s thesis can easily be tested: Either Omniture will make expectations for Q3 or they won’t. I’m sure this will make for an interesting Q3 call, at least for those investors who are taking a bath on the acquisition price.

    My concern is this: If the investment banker thesis is correct, if Omniture was about to report a second quarter of, um, disappointing results, then what does that mean for the larger industry? Is Adobe really evidence that the larger market is taking an interest in digital analytics? Or was the company thrashing about looking for something new to cover for recent declines and this really isn’t about Omniture or web analytics at all?

    Again, I don’t know, at least not yet, and I don’t think any of us do. But given the very mixed reviews about the acquisition I think we as an industry should take a step back and consider the larger ramifications. Personally I don’t think web analytics is going ANYWHERE — hell, I’m recruiting at Analytics Demystified — but we can all admit we collectively haven’t done the best job explaining what we do and what the data we live and die by means.

    This interesting acquisition will certainly get more interesting as the days pass. Congrats again to all involved.

    Analytics Strategy

    Thoughts on Adobe + Omniture

Wow, I have to admit that I was surprised mid-day today at a new client meeting in Chicago when my phone, my SMS, and my email all went off at the same moment. When we got to a break I quickly glanced down and the SMS message said “Adobe buys OMTR for $1.8B!!!!!!”

    Wow.

I didn’t get to talk to the press (John got the honors, congrats) and am just now getting a chance to cogitate a little on what Adobe’s entrance to the web analytics market means after non-stop phone calls for the past five hours.  A lot of interesting comments have already been published, so I will try to reference the stuff I think is insightful in an effort to avoid repetition.

    • In general, the more I think about the deal the more it makes sense, at least for Omniture. Given increasing pressure from lower-cost (and free) solutions, the economy, and a customer base that is more and more prone to complain about service issues and the high cost of doing business with the company, exiting now makes good sense.  Why fight the sea change in the analytics market when you can saddle someone else with the responsibility?
    • Like others, I don’t really see the synergy in the deal, but I admit that I love Adobe and so I’m willing to be surprised. I think of Adobe as a software company for creative types; Omniture sells software-as-a-service to analytical types; these are different business models and very different customers. The idea that somehow this acquisition bolsters Adobe’s position in content management or as a global delivery platform just doesn’t resonate with me.
    • Similarly, I don’t see this acquisition as creating anything new regarding measurement being embedded into rich media applications. Thanks, perhaps ironically, to Macromedia (owned by Adobe) we have been embedding tracking codes into Flash, Flex, Silverlight, AJAX, etc. for years … and while the integration is botched as often as not, I don’t see how adding a “Click here to Omniture-ize” button into Dreamweaver and Adobe’s RIA development suite will solve that problem.
    • I do agree with Alex Yoder’s general thesis that this acquisition increases the overall visibility of the sector and that this is a good thing. I also agree that this acquisition is likely not the last — both WebTrends and Coremetrics are owned by investors and you know how those guys are. His citation of Microsoft and Oracle is interesting given both companies’ historical interest in the sector (although neither has had the chutzpah to actually pull the trigger — at least in a substantial way.)
    • I also agree with Gary who is somewhat skeptical about acquisitions, especially out-of-sector ones like this (anyone remember NetIQ? How about you Deepmetrics customers?) and since the Instadia, HBX, and Visual Sciences acquisitions that Omniture made didn’t really generate the benefits promised. However, where Gary favored IBM (who I didn’t realize wanted back into the sector after selling SurfAid to Coremetrics) I liked the idea of WPP increasing their $25M investment by, well, I guess about $1775M or so. Given my position on how companies will deploy web analytics in the future, WPP adding a premium measurement brand to their analytics tools and giving them the ability to pass world-class analytics along to their best customers made sense to me. Oh well.

    Regarding Omniture customers … I am getting feedback from across the spectrum. Some customers are encouraged by the news, largely because they believe that Adobe will bring a new level of rigor to product development, integration, and customer support.  Others (including those customers still on HBX) are somewhat discouraged by the news, given that they’ve been hearing a lot of promises lately and they’re not sure what a new owner will mean.  Still others have expressed that they really liked what the company had been doing this past year and so are bummed that things might slow down while the deal and integration are completed.

    Prospects are a different question. Since I am working with a number of companies currently evaluating Omniture products … the best guidance I can give is “wait and see.” Again, I think Adobe is an awesome company and every interaction I have ever had with them has been positive. Hopefully this acquisition will be mostly painless and largely transparent to outsiders. We’ll know soon enough if Omniture’s recent aggressive pricing and willingness to cut deals to close business differs from Adobe’s business practices. And while competitors will almost certainly claim “Omniture is out of the game,” I am personally encouraging my clients to think carefully about what Omniture and Adobe have been able to do independently before writing the combined company off.

    At the end of the day I’m really happy for the bright folks I know who have been plugging away at Omniture all these years in a variety of their companies. The teams at Omniture, HBX and Visual Sciences, Offermatica, and hell even Matt Belkin (remember that guy!) who hopefully get to participate in the largess that Omniture has created should all get credit for the thousands of hours they spent on the road, fighting for the big green machine, never willing to concede until they’d finally lost (and sometimes even after they’d been asked to go home!)

    Congratulations to Josh, Chris, Brett, John, Kristi, and the entire senior management team at Omniture! Also, best of luck to the management team at Adobe with your new acquisition; your new customers are among the best in the business and will look to you to make a good thing even better.

    Analysis, Analytics Strategy, Reporting, Social Media

    The Most Meaningful Insights Will Not Come from Web Analytics Alone

    Judah Phillips wrote a post last week laying out why the answer to the question, “Is web analytics hard or easy?” is a resounding “it depends.” It depends, he wrote, on what tools are being used, on how the site being analyzed is built, on the company’s requirements/expectations for analytics, on the skillset of the team doing the analytics, and, finally, on the robustness of the data management processes in place.

    One of the comments on the blog came from John Grono of GAP Research, who, while agreeing with the post, pointed out:

    You refer to this as “web analytics”. I also know that this is what the common parlance is, but truth be known it is actually “website analytics”. “web” is a truncation of “world wide web” which is the aggregation of billions of websites. These tools do not analyse the “web”, but merely individual nominated “websites” that collectively make up the “web”. I know this is semantics … but we as an industry should get it right.

    It’s a valid point. Traditionally, “web analytics” has referred to the analysis of activity that occurs on a company’s web site, rather than on the web as a whole. Increasingly, though, companies are realizing that this is an unduly narrow view:

    • Search engine marketers (SEO and SEM) have, for years, used various keyword research tools to try to determine what words their target customers are using explicitly off-site in a search engine (although the goal of this research has been to use that information to bring these potential customers onto the company’s site)
    • Integration with a company’s CRM and/or marketing automation system — to combine information about a customer’s on-site activity with information about their offline interactions with the company — has been kicked around as a must-do for several years; the major web analytics vendors have made substantial headway in this area over the past few years
    • Of late, analysts and vendors have started looking into the impact of social media and how actions that customers and prospects take online, but not on the company’s web site, play a role in the buying process and generate analyzable data in the process

The “traditional” web analytics vendors (Omniture, Webtrends, and the like) were, I think, a little late realizing that social media monitoring and measurement was going to turn into a big deal. To their credit, they were just getting to the point where their platforms were opening up enough that CRM and data warehouse integration was practical. I don’t have inside information, but my speculation is that they viewed social media monitoring more as an extension of traditional marketing and media research companies than as an adjacency to their core business that they should consider exploring themselves. In some sense, they were right, as Nielsen, J.D. Power and Associates (through acquisition), Dow Jones, and TNS Media Group all rolled out social media monitoring platforms or services fairly early on. But, the door was also opened for a number of upstarts: Biz360, Radian6, Alterian/Techrigy/SM2, Crimson Hexagon, and others whom I’m sure I’ve left off this quick list. The traditional web analytics vendors have since come to the party through partnerships — leveraging the same integration APIs and capabilities that they developed to integrate with their customers’ internal systems to integrate with these so-called listening platforms.

Somewhat fortuitously, a minor hashtag snafu hit Twitter in late July when #wa, which had settled in as the hashtag of choice for web analytics tweets, was overrun by a spate of tweets about Washington state. Eric Peterson started a thread to kick around alternatives, and the community settled on #measure, which Eric documented on his blog. I like the change for two reasons (notwithstanding those five precious characters that were lost in the process):

    1. As Eric pointed out, measurement is the foundation of analysis — I agree!
    2. “Web analytics,” which really means “website analytics,” is too narrow for what analysts need to be doing

    I had a brief chat with a co-worker on the subject last week, and he told me that he has increasingly been thinking of his work as “digital analytics” rather than “web analytics,” which I liked as well.

    It occurred to me that we’re really now facing two fundamental dimensions when it comes to where our customers (and potential customers) are interacting with our brand:

    • Online or offline — our website, our competitors’ websites, Facebook, blogs, and Twitter are all examples of where relevant digital (online) activities occur, while phone calls, tradeshows, user conferences, and peer discussions are all examples of analog (offline) activities
    • On-site or off-site — this is a bit of a misnomer, but I haven’t figured out the right words yet; what it really means is that customers can interact with the company directly, or they can have interactions with the company’s brand through non-company channels

    Pictorially, it looks something like this:
    Online / Offline vs. Onsite / Offsite

I’ve filled in the boxes with broad descriptions of what sort of tools/systems actually collect the data from interactions that happen in each space. My claim is that any analyst who is expecting to deliver meaningful insight for his company needs to understand all four of these quadrants and know how to detect relevant signals that are occurring in them.
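The quadrant model can be sketched as a simple lookup (a minimal illustration; the channel names are just the examples given above, not drawn from any particular tool or taxonomy):

```python
# The two dimensions described above: (online/offline, onsite/offsite).
# Channel names are illustrative examples only, not a definitive taxonomy.
QUADRANTS = {
    ("online", "onsite"): ["company website"],
    ("online", "offsite"): ["Twitter", "Facebook", "blogs", "competitor websites"],
    ("offline", "onsite"): ["phone calls to the company", "user conferences"],
    ("offline", "offsite"): ["tradeshows", "peer discussions"],
}

def quadrant_for(channel):
    """Return the (online/offline, onsite/offsite) quadrant for a channel,
    or None if the channel isn't in the example lists."""
    for quadrant, channels in QUADRANTS.items():
        if channel in channels:
            return quadrant
    return None

print(quadrant_for("Twitter"))  # ('online', 'offsite')
```

The point of the exercise is simply that each quadrant implies a different collection mechanism, so an analyst looking for signals needs to know which bucket an interaction falls into before picking a tool to measure it.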

    What do you think?