Analytics Strategy, Conferences/Community, General

Interview: John Lovett from Forrester Research

Following up my interview with Bill Gassman a few weeks ago I realized that I would be remiss if I didn’t build on Forrester’s recent Web Analytics Wave report with an interview with John Lovett. John, like Bill, totally, totally understands the web analytics industry, and in that understanding is able to clarify the marketplace in a way few others can. Don’t believe me? Check out his response to possibly the worst article about web analytics, ever. Measured, polite, even complimentary … that’s John.

I am personally honored that John accepted my invitation to return to the X Change this year and both lead the huddle on “Industry Standards (or a lack thereof)” and co-lead a huddle on technology with Bill Gassman. If you haven’t met John personally, and if you are able to join us at the X Change, I strongly recommend you make a point of introducing yourself to him.

Finally, before my questions and John’s answers, I wanted to point out how incredibly deft Mr. Lovett really is: in response to a high-and-hard fastball question about “which vendor is really the best,” John knocked the ball clear out of the park with his answer: none of them. I’ll let you read the rest for yourself …


Your recent Wave report really emphasized a lot of conventional wisdom about the web analytics vendors but had some surprises for folks.  What surprised YOU the most about the Wave results?

Well Eric, I like to say that surprises are for birthdays and not for business. So in terms of actual surprises, there weren’t any big bombshells for me. I was however pleased that the vendors demonstrated innovation in a number of areas (like social media measurement) and that despite my attempts to develop extremely challenging criteria, the vendors continue to improve year over year.

One comment people have made to me is that they question the validity of comparing fee and free solutions in a single matrix due to the fundamental differences in their business model.  How would (or do) you respond to that challenge?

That’s preposterous! I respond by saying that it’s negligent not to compare free vs. fee-based solutions. In today’s economic environment if you’re not watching expenses by understanding the cost-to-benefit ratio of your Web analytics solution, you are acting irresponsibly. Free tools have merit for many organizations as both primary and secondary tools, while fee-based solutions are more appropriate for others based on their capabilities. Organizations must do their due diligence to understand what they need in a Web analytics solution to decide what’s right for them, which is really the insight the Wave attempts to provide.

I asked Bill Gassman from Gartner a variation on this question recently, but do you now or see in the near future a situation where you as a Forrester analyst are advising your clients to actively consider these free solutions in addition to “traditional” web analytics solutions from Omniture, Coremetrics, and Unica?  As a follow-up, how do you see free tools impacting the market in the next 12 to 24 months?

I advocate that a single system for measurement is always the best way to go, yet recognize that this isn’t always feasible. Duality of Web analytics tools is a reality for myriad reasons. Thus, companies need to manage their data dissemination practices to ensure comprehension and mitigate doubt. This is tricky, but certainly possible. I often help clients determine which solution is best suited to meet their needs, and financial implications are always a part of that discussion.

With regard to how free tools will impact the market: we are just witnessing the beginning of the incoming tide on this one. By this I mean that “free” will continue to disrupt the market by placing pressure for improvement on all vendors. Just look at the recent Webtrends product upgrade announcement – the majority of press around it cited a “look out Google Analytics” slant. Why the comparison…they’re worried! Fee-based vendors have even more to fear now that Yahoo! Web Analytics opened up its partner program.

Another comment I hear about the Wave results, and forgive me this, is that they’re lame because they do nothing to differentiate the “market leaders” who appear as a tight cluster.  The evidence cited is that all four vendors issued press releases declaring their “market leadership” which appears technically correct based on the Wave but as the Highlander said, “There can be only one.”  First, how do you respond to this and second, who is the real market leader in web analytics?

Here’s the dirty little secret – the real market leader is the wildly talented Web analytics practitioner. It’s not the tools that differentiate, it’s the craftsman. Any company that believes the Web analytics technology alone will make them incredibly successful is delusional or just plain out of touch. There is no get rich quick scheme here. Each of the leading vendors on the Wave offers a highly customized solution that can be tricked-out to meet nearly anyone’s individual needs. But this takes a great deal of work. For those organizations that are looking for the far-and-away winner in this technology category, guess what: the tools will only get you so far – you need talented people to really make it happen.

Rumors are that Omniture has a bunch of “800 lb gorillas” hanging in their offices right now.  Clearly they’re proud of their position, but last quarter’s results highlighted that there are clear risks to their business that are beginning to manifest.  What do you think are the greatest risks to Omniture’s business over the next 18 months?

Well, I don’t buy into rumors and sure don’t know where I left my crystal ball. But things are tough all over. As I stated earlier, free solutions are threatening all fee-based vendors and forcing them to work harder. I can tell you that measurement technologies are an imperative for executing on digital marketing endeavors. Solutions like Omniture’s, Webtrends’, Coremetrics’, Unica’s and everyone else’s will continue to play an important role in the evolution of organizations conducting business online. I believe that Web analytics is increasingly becoming an integrated service and expect to see things evolve to easier access to data through new and alternative means. The leading vendors, including Omniture, will play a role in this evolution.

What’s your take on the current hype cycle around “open”?  Omniture bangs the Genesis drum, Coremetrics connects, and now WebTrends appears to have decided that “open” will be the foundation of their future success (or lack thereof) … but some people think that “open” is a check-box requirement, not a competitive differentiator.  What do you think?

Open is not a feature, it’s a philosophy. The ability to get data into and out of a Web analytics solution is the crux of the issue, and leading vendors facilitate this through bi-directional APIs, other import and export functions, and data dissemination capabilities. Webtrends is currently doing this as well as anyone, but “open” also means talking to your customers about development plans, listening to criticism and demonstrating a willingness to change. These qualities aren’t unique to Webtrends, they’re characteristics that all vendors should exhibit. Webtrends is just marketing around them, and if that’s causing people to want open, then it appears to be working.

As a previous attendee to the X Change what do you like best about the conference and what would you like to see us change this year or next?

I appreciate the intimate conversational format of X Change. The huddles really facilitate deep thought, controversial leeway and provocative discussion. As someone who attends a number of conferences, it is refreshing to engage in dialogue with individuals who are passionate about what they do and to initiate a true collaborative thinking environment. As far as change goes, I really hope to be able to guide the huddles that I’m leading toward resolution. Within our industry, all too often we surface problems and issues without identifying solutions. I’ve taken your challenge to heart and hope to walk away with some tangible results from my huddles.


John will be joining Bill Gassman, Gary Angel, June Dershewitz, and over 100 expert users, consultants, and vendors at the 2009 X Change conference in San Francisco on September 9, 10, and 11. Registration is currently underway and we’d love to have you join us! For more information please visit:

http://www.xchangeconference.com

Analytics Strategy

How much do you pay for web analytics?

I was just cruising through the just-published WebTrends 9 update and thinking about how the web analytics vendor market is evolving. “9” looks neat and I’m sure glad to see some really important metrics like bounce rate appear in the UI. Still, I always scratch my head when I see vendors make statements like “[the] data visualization tool in Webtrends Analytics 9 lets anyone – even analytics novices – quickly and easily understand changes in key metrics” and then put up a feature list like this one.

Still, it’s nice to see WebTrends making some moves so congratulations to Jascha, Casey and the entire Portland crew for getting the update out the door!

Anyway …

I said I had been thinking about the evolution of the web analytics vendor market. A lot of my thinking this past week has been colored, well, purple, thanks to the announcement of Yahoo’s Web Analytics Consulting Network (the YWACN or, as I think about it, “the Yack’n!”).  On July 30th Yahoo announced that they were making Yahoo! Web Analytics much easier to get through 48 partners around the globe.

Now, when you look at the partner list you might not recognize a lot of the names — I sure don’t — but a few should stand out. Specifically Stratigent, Semphonic, Sapient, and my own company Analytics Demystified. While I can’t speak directly for any of these companies, all are run by very smart people, and I have to wonder if they’re not thinking about YWA much the same way I have been.

I mean, if you think about it, Yahoo! has basically come to us and said “Go sell excellent implementations of YWA and provide awesome ongoing support” for an application that, according to Forrester Research, has 77% of the core functionality of Omniture SiteCatalyst. Or, put another way, “Find companies that are struggling to get value from their existing investment in {pick a vendor}, kick that vendor out, and then make money helping them be successful for less than they spend today.”

Sweet, thanks Yahoo!

Not to brag (since it was pretty obvious) but I did say this would happen back in April 2008 given the hard work Google Analytics (who is ironically NOT a YWA competitor) had done with their similarly badly acronym’d GAAC. Yahoo wisely avoids having to support customers directly, leverages some incredibly smart folks, and lets companies reduce their annual analytics spend without having to forgo core functionality like multiple custom variables and visitor-level segmentation.

Hell, we’re not even talking about real-time updates or demographic reporting and segmentation. While the former often has more value ascribed to it than necessary, the latter, if I can say so based on my own usage, is pretty fantastic and not available in any other web analytics application in the market today. I mean, who would have guessed that so many mature, responsible adults love to Twitalyze themselves!

Now I sincerely doubt that any of the YWACN members are going to suddenly stop supporting the big for-fee applications out there … I know I’m not! And I fully expect the adoption of YWA to be slow and methodical (mostly because of existing contracts, Yahoo’s terms of service, and the fact that Yahoo is somewhat limiting YWACN access to new accounts, although I think their strategy is fair and makes perfect sense). But at the end of the day Yahoo has made quite possibly the single best move they could have if their goal was to provide an awesome service with excellent third-party support at the best possible price.

Now if you were paying attention you may have noticed I commented that Google Analytics and Yahoo Web Analytics are not competitive. Crazy, huh? But they’re not. Google Analytics (as it exists today) and Yahoo Web Analytics (as it exists today) serve two near completely distinct target markets.

Now I know I’ll get heat for saying this (again) but I just don’t think Google Analytics is appropriate for “free standing” use within the true Enterprise. I’ll point again to Bill Gassman’s recent note on the service (which I thought was excellent) and will obviously concede that it is well within Google’s power to make GA the “bestest, most Enterprisey” web analytics application the world has ever seen … but it isn’t today. More importantly when I go looking for companies mature in their use of web analytics who rely exclusively on Google Analytics and have chosen to do so explicitly, I simply don’t find them.

I could be wrong — if you’re an analytics samurai using nothing but GA please let me know —  but what I see a lot of is mature businesses using Google Analytics to back-fill some limitation in their for-fee vendor’s service. For example, up until today it was amazingly difficult to get WebTrends to calculate a bounce rate and some people think setting up visitor segments in SiteCatalyst is a lot of work. More importantly, while lots and lots of people complain about how difficult their analytics application is to use, the team at Google has done a freaking brilliant job with the GA user interface and in my humble opinion it sets the bar for ease-of-use in web analytics.
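For anyone new to the metric being back-filled here, bounce rate is just single-page visits divided by total visits. A minimal sketch in Python, with session records invented purely for illustration:

```python
# Bounce rate: the share of visits that viewed only one page.
# These session records are made up; no vendor's data format is implied.
sessions = [
    {"visitor": "a", "pages_viewed": 1},
    {"visitor": "b", "pages_viewed": 4},
    {"visitor": "c", "pages_viewed": 1},
    {"visitor": "d", "pages_viewed": 2},
]

# A "bounce" is a single-page session.
bounces = sum(1 for s in sessions if s["pages_viewed"] == 1)
bounce_rate = bounces / len(sessions)
print(f"Bounce rate: {bounce_rate:.0%}")  # 2 of 4 sessions -> 50%
```

The metric itself is trivial, which is exactly why it's so striking when a tool makes it hard to get at.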

Yahoo Web Analytics in an Enterprise context, and hopefully Dennis will forgive me this since he’s tanned and rested after a week or two in the islands, is not really that easy to use, not that easy to set up, and not that easy to configure — remember it’s 77% of Omniture SiteCatalyst which nobody ever describes as “easy to implement and easy to use” (except for Adam Greco, but he’s clearly an exception!)

But here’s the secret: Yahoo Web Analytics is not supposed to be easy to use, it’s supposed to be really, really powerful! Yahoo Web Analytics is an Enterprise-class web analytics application out of the box designed to support businesses with custom data collection needs, custom reporting needs, custom segmentation needs, and the challenges typically found within any company of size.

More importantly, because of this functionality I believe that Yahoo Web Analytics will be a gateway to a much deeper relationship between the YWACN and their customers than the GAAC has, for the most part, found.

Yes, Yahoo’s APIs are tightly held and thusly YWA is not as “open” as WebTrends or as “integrated” as Omniture. Yes, Yahoo is keeping Rubix under wraps so it is not as flexible as Affinium NetInsight or Coremetrics Explore. Yes the interface is kinda clunky and the terms of use were written by lawyers … I get the complaints and hear the FUD loud and clear. But given the massive adoption of Google Analytics I think that coupling exceptional services and support with a free Enterprise-class application has a lot of potential to be the permanent game changer that I first described last year.

What do you think? Is purple the new green? Is “9” too little, too late? Does Yahoo! have a chance to focus now that they have outsourced their search business? Or am I missing the point and despite two great free solutions will the world continue to pay for web analytics the same way we always have? I’m totally willing to be wrong about this one … but if you don’t believe me about how powerful Yahoo Web Analytics is either read this book or contact me directly and I’ll see about getting you your own YWA account.

As always your comments are welcome.

Analytics Strategy, Social Media

#measure is the new #wa in Twitter

UPDATE: John Lovett from Forrester Research, or @JohnLovett as I like to think of him, has weighed in on the use of #measure and appears to be on board. He also documented how quickly things change in our increasingly hectic world and how fast a “standard” can become yesterday’s news.

Just a quick post to help bring attention to the fact that the fine people of Washington state have officially over-run the #wa hashtag that many web analytics folks have been using in Twitter. While this is certainly our loss given how terse #wa is when you’re limited to 140 characters, it is difficult to fault those folks since WA is their state’s abbreviation.

Such is life in an unregulated world, huh?

As a replacement I have started using #measure when tagging my web analytics-related Tweets. And while there was some debate about #measure versus #waamo and #analytics and such, I would propose that #measure is the basis for everything we do. Without measurement there is no analysis; without measurement there is only gut feel.

That said I am choosing to use the #measure tag … you may choose to use something completely different. But since I was one of the catalysts to start using #wa in the first place I figured I would see if lightning might strike twice!

See you in the #measure cloud!

Analytics Strategy, Conferences/Community, General

Interview: Bill Gassman (Gartner) on Google Analytics

Bill Gassman from Gartner is one of those guys that just “gets” what we’re trying to do in the web and digital analytics industry. Perhaps because he’s been covering this space for nearly as long as I’ve been around, or perhaps because he has a deep business intelligence background and sees where all this is going. I dunno, but Bill gets it.

Recently Bill, who is incidentally coming to the 2009 X Change and leading a huddle on organizational issues and co-leading a huddle on technology with John Lovett from Forrester, published a short brief on Google Analytics that I thought really hit the nail on the head. Clear, honest, and fully taking the Enterprise into account, Bill’s report clarified a lot about how companies should be thinking about Google’s analytics solution.

Since I could not get permission to republish Bill’s report I did the next best thing — I came up with some questions and put them to the man himself. The following are my questions and Bill’s responses. Incidentally, if you want to follow-up on this interview Bill graciously said he would monitor the comments and respond there (so comment away!)

Or, you could just come to San Francisco on September 9, 10, and 11 and debate the goodness of Google Analytics with Bill in person.

Regarding your recent note on Google Analytics, can you characterize how the companies who are asking you about “free” analytics have changed in the last 12 months?  Is any one thing driving that change, do you think?

Since Google Analytics improved last October, most client inquiries about Web analytics touch on Google Analytics.  That is why I published the note “Is Google Analytics Right for You?”.  (Gartner account required to access)  Marketing departments ask if it is all they need, purchasing agents wonder why they should spend money on commercial tools, and corporate lawyers wonder about Google’s terms and conditions.   The economy and budget constraints trigger the questions, but the major driver to Google is its simplicity.  Many organizations do not have the processes in place to make use of the high-end products or have Web sites that do not need the sophistication they offer.  They perceive Google Analytics as good enough and “free” is a tempting offer.

If Google asked you which three things were most important to add to their functional set to be considered “Enterprise” what would those three things be?

Getting to functional parity with the commercial tools is not enough for Google Analytics to be considered enterprise class.  Google should charge for an enterprise class offering, because a lot is required, as is the accountability that goes with the exchange of money.  They must also provide enterprise class support and address issues with the terms of service policy.  The key function missing is a visitor centric repository, so users can define complex multi-session segments and export visitor information to campaign and content management tools.  Google would also have to extend personalized service from customers with significant AdWords accounts to those willing to pay for it.  Finally, the terms of use policy has no service level guarantee and users must look for a FAQ to find assurances of privacy.
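To make the “visitor centric repository” point concrete: a multi-session segment is a condition evaluated across all of a given visitor’s sessions, which a store that only sees one session at a time cannot answer. A hedged sketch in Python, with a wholly invented data layout:

```python
# Hypothetical visitor-centric store: all sessions grouped under a visitor ID.
# The field names here are illustrative, not from any vendor's schema.
visits = {
    "v1": [{"campaign": "email"}, {"converted": True}],
    "v2": [{"campaign": "search"}],
    "v3": [{"campaign": "email"}],
}

# Multi-session segment: visitors who arrived via an email campaign in one
# session AND converted in any session -- a cross-session condition that a
# session-by-session repository cannot express.
segment = [
    visitor_id
    for visitor_id, sessions in visits.items()
    if any(s.get("campaign") == "email" for s in sessions)
    and any(s.get("converted") for s in sessions)
]
print(segment)  # prints ['v1']
```

The same grouping is also what lets you export a clean visitor-level record to campaign and content management tools, rather than a pile of disconnected sessions.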

On the subject of “Enterprise-class” analytics … Google appears obsessed with this designation, regardless of their clear dominance from a deployment standpoint and the gains they’ve made within larger companies.  What do you think is behind their obsession?

Google appears to be fighting an asymmetric war with IBM, Microsoft and others, investing relatively little yet forcing competitors to take notice.  We see this especially for office applications and cloud computing.  Google is looking for products that give them credibility at the enterprise level, and Google Analytics is part of that story.  Other parts of the story include the recent news about the Chrome OS, Google Search Appliance, Google apps (premier edition), Geospatial Solutions, and Google App Engine for cloud computing.  While many large organizations are using some or all of these offerings, with few exceptions, only the Search Appliance has gained strategic status.  Google still brings in 97% of its revenue through advertising.  They might be showing obsession because of where they want to be, but then again, they could be throwing up a smoke screen to keep the competition too busy to attack Google on advertising.

What do you consider the single greatest risk to Google’s analytics business in the next 24 months?

There is no threat to Google’s analytics business, because Web analytics is not their business, yet.  Out of 72 million active Web servers (as reported by Netcraft), about 20,000 organizations pay for Web analytics.  Google gives away Google Analytics so that millions of Web site owners can see the impact of AdWords and buy more Google ads.  If there is a threat, it is Yahoo Web Analytics, who is using a similar tactic to go after Google’s advertising revenue.

Are you now, or do you see in the near future, a situation where as a Gartner analyst you are advising your clients to actively consider free solutions from Google and Yahoo alongside “traditional” web analytics solutions like Omniture, WebTrends, and Coremetrics?

Running two sets of tools on the same Web pages can be a recipe for trouble, because reported numbers will not match, reducing respect and therefore value for both tools.  There are situations however where two tools make sense.  It would be great if all organizations had the leadership, investment, skills and processes to use commercial tools to meet everyone’s needs, but for too many, it has not worked out that way.  When analytic resources are limited, it is pragmatic to focus commercial tools on the high-value parts of the site and let other site stakeholders use free tools. Analysis is a critical part of a customer centric Web strategy.  If some departments are happy with the free tools and a central group cannot support them, it is OK to let chaos reign until the business justification, investment and leadership are available to do things right.

Analytics Strategy

Columbus WAW July 2009 Recap — Bizwatch and More!

We had another great Columbus Web Analytics Wednesday last week at Barley’s Smokehouse and Brewpub. This month’s sponsors were Bizresearch and the Web Analytics Wednesdays Global Sponsors. We had right at 30 people attending:

Columbus Web Analytics Wednesday -- July, 2009


Laura Thieme of Bizresearch presented on search marketing and the challenges of trend analysis therein. She walked through one in-depth case study and sprinkled examples from other clients into the discussion as well.


Bizresearch has a product called Bizwatch that, when combined with some fundamental best practices of SEO and SEM, looks like it can yield some handy insights in a hurry! Laura is a self-professed constant tinkerer with her presentations, but I think the one below is pretty close to what she walked us through:

As the group grows, I’m finding that the evenings wind down and there are people I didn’t even get to say “Hi” to — both new attendees and long-timers. But, some of the discussions I had included:

  • Chatting more in-depth with Chris Dooley of Foresee Results. I’m looking forward to a future WAW when Chris will be challenging the group to think about the offline and post-visit behavior of site visitors and how often that doesn’t get considered by internet marketers. I also picked up a new blog to follow, as Chris mentioned that Kevin Ertell had joined Foresee Results and recommended his www.retailshakennotstirred.com blog. Ertell only started the blog last month, so it remains to be seen if it has legs. So far, his posts look to be pretty in-depth and grounded in real experience. Chris also mentioned that Eric Peterson wrote a white paper for Foresee Results, which had me poking around on the White Papers area of their site — it looks like there are some really good reads there! I’m not sure which one Eric wrote…so I may just have to download several of them!
  • Chris also mentioned a new “session replay” tool that Foresee Results just introduced called CS Session Replay. It sounds like a direct threat to Tealeaf, but it is apparently wayyyyy slicker. I don’t know if Chris (right below) was extolling the virtues of the product to Scott Zakrajsek of Victoria’s Secret or not. They might have just been discussing who put those half-drunk beers down next to the glasses of water they were drinking…

  • I chatted with Paul Hall of the Mastery Marketing Group about the work they’re doing to drive a “360 degree view of the customer” using data from multiple systems (CRM and other). That discussion led to me bringing up Webtrends, Omniture, and Eloqua all as tools that I know of that have very real capability to do user-level tracking and analysis of web activity.
  • Our “farthest travelled to attend” award (not really an award…just me pondering after the fact) went to Kim Merritt-Butler of TheURLdr.com. Kim was in town from Cumberland, Maryland, and is very interested in getting a similar group started up in the Washington, D.C. area. So, if you know of anyone in that area whom Kim should get in touch with, please let me know and I’ll pass the information along! We were both surprised that there is not already a WAW — even an older/dormant one — in that area.
  • I didn’t get to chat with him much, but Gareth Dismore could’ve made a case that he’d actually travelled the farthest, as he’s now based in Colorado Springs with SearchSpring. He was back in town for a couple of weeks, and he didn’t move away long enough ago to count. Or so I decided. After the fact. For an award that exists merely as a construct within this blog post.
  • I talked to several people who were new to web analytics — were starting to see it crop up and are attending WAWs as a way to dip their toes in the water (which is a great place to start!). I found myself giving my standard recommendations on that front: Occam’s Razor blog, Eric Peterson’s Analytics Demystified blog, and Web Analytics: An Hour a Day. I know scads of blogs and books have cropped up over the past few years…but these still nail the basics, IMHO.

I wrapped up the evening with a lengthy discussion in the parking lot with Bryan Cristina and headed home thinking Thursday was going to come awfully quick. Then I spent an hour getting my neighbor’s garage door unmangled, as she’d backed into it with her minivan before it was fully open and was leaving on vacation the next morning. I am clearly not as young as I used to be, as I didn’t fully recover until I cratered at 9:45 PM on Thursday night and got in a solid eight hours!

Next month’s Columbus WAW is already scheduled. It will be on August 12th at 6:30 PM, again at Barley’s Smokehouse and Brewpub. The event is being sponsored by IQ Workforce, and Corry Prohens will be presenting on job hunting and career management in web analytics and search marketing. I hope to see you there!

Adobe Analytics, Analytics Strategy, Conferences/Community, General

Want to Debate Standards?

One of the biggest problems we face in web analytics today is our industry’s lack of standards and common definitions. And while a great number of incredibly bright folks have put a ton of energy into solving these problems, in my humble opinion we are more or less where we started years ago — agreeing politely to disagree. Those of you who have been reading my blog for awhile know that I’m not shy about disagreement — perhaps more than anything my analyst’s mind loves a spirited debate — but I am also somewhat anxious to create tangible outcomes.

To this end I am incredibly excited about two huddles at X Change 2009, one that was just added! The first is Forrester’s John Lovett’s “Web Analytics Standards (or a Lack Thereof)” in which John will be leading us through the current state of industry standards, proposed definitions and our collective understanding of analytics terminology. The second, and one just added to the X Change, is Jim Hassert’s “When is a Visitor Not a Real Person?” huddle in which Jim will take John’s huddle one step further and drill-down into the often irreconcilable differences found in the seemingly harmless “visitor” metric and dimension.

Last year I was forced to miss a lot of good huddles. This year a team of wild horses couldn’t keep me away from these two.

While I have little doubt that both of these huddles will live up to the spirit of the X Change my hope is that they will go one step further. I would love to see both produce some kind of actionable outcome, something that we can carry forth into our careers and the wider conversation about our industry. Given that some serious talent is already signed up for the X Change — including some of the brightest minds in the practitioner and vendor community — I have little doubt that we have the brain power … now all we need is the resolve to do something and not just push words around on paper.

If you’re a reader of this blog and want to join us at the X Change I’m happy to help you out.  If you act before July 31st I am offering a 15% discount on the registration (a $300 savings!)

Come to the X Change. Agree to do more than “politely disagree” — take a stand, defend your ideas, and help shape tangible and positive outcomes.

Analytics Strategy

Data Management — As Sexy As a High Quality Mattress

Steve Woods of Eloqua invited me to write a guest post on his Digital Body Language blog after we’d gone back and forth a bit about contact data management and marketing automation. Over the past six or seven years, I’ve been thumped on the back of the ear with data management issues again and again. It always hurts, and by the time I realize I’ve got a mess…it’s a heckuva challenge to recover.

In my current job, I’m a full-time customer data management guy. It is not sexy. Like many large companies, we’ve got customer data that is created and managed in a wide range of disparate systems on diverse platforms, each with multiple decades of system evolution. It’s important. It’s painful.

There are some great opportunities in our increasingly electronic and e-based world to make some real headway with data management. In the case of the guest blog post, I focused on opportunities to use marketing automation tools and your web site to drive improvements in the quality of your customer data. As for how exactly I made the “high quality mattress” analogy? Click on over and check out the post!

Analytics Strategy

Columbus Web Analytics Wednesday — July 2009 with Bizresearch

Web Analytics Wednesdays are an opportunity for full-time web analysts, part-time web analysts, and anyone who is interested in learning more about web analytics to get together and share their experiences! We will informally network for a bit before sitting down and ordering food, at which point we will have a brief presentation/discussion about Bizwatch led by Laura Thieme.

Details:

When: Wednesday, July 15th at 6:30 PM

Where: Barley’s Smokehouse and Brewpub, 1130 Dublin Road, Columbus, OH 43215

Registration: the Web Analytics Wednesday site

How to find us: We have a room reserved — just go to the back of Barley’s and hang a right

We are excited to welcome a new sponsor this month! Bizresearch will be co-sponsoring the event with the Web Analytics Wednesdays Global Sponsors. The sponsors will be covering food and nonalcoholic beverages only, although you are welcome (and encouraged) to sample Barley’s fine offering of frothy beverages on your own tab.

Laura Thieme, a 12-year search marketing and analytics veteran, has developed a new search analytics application: Bizwatch. After observing the challenges of monthly trend reporting and analysis in search marketing, she developed an application that combines SEO, competitive intelligence, keyword research, paid search, and web analytics. It focuses on data integration among these areas of search marketing, on trend analysis, and on the keywords that convert.

Thieme is looking for feedback from industry colleagues on the search analytics application. She is also hoping to hear from search marketers regarding monthly reporting, applications they are using, and other search analytics data integration challenges they are experiencing.

It should be an engaging discussion!

Analytics Strategy, General

The Truth About Mobile Analytics

Perhaps the only thing hotter than social media right now is mobile. And with good reason — smartphones like the iPhone and Palm Pre are taking our ability to get information to entirely new levels and ushering in an era of “digital ubiquity” that is clearly without precedent. Unsurprisingly, businesses are responding by actively exploring how they can participate in the mobile opportunity, either by optimizing their sites for small screens or by going so far as to build cool new iPhone applications to support long-standing offline initiatives.

Fortunately, most business owners have learned from past mistakes and are showing interest in measuring the effect of their investment in mobile. But measuring mobile isn’t easy — the sheer diversity of technologies involved and the rapid evolution of the industry have created a monstrous landscape of devices, communication protocols, and requirements.

As a result, dozens of companies have sprung up, all claiming a unique ability to measure the mobile opportunity. Unfortunately, some of these companies have decided that hype, hyperbole, and sometimes outright lies are a better sales strategy than building a great product with a unique value proposition. We have seen CEOs bash other CEOs, salespeople obfuscate their identities while trying to provide “objective” answers, and antics that can only be described as “juvenile.”

Because the mobile opportunity is so great Analytics Demystified started taking a closer look at measurement earlier this year. I was fortunate enough to be able to rely on the expertise of folks like Michiel Berger and Thomas Pottjegort at Nedstat, the mobile team at NBC, dozens of analytics end-users, and some of the brightest product managers in the analytics sector tasked with integrating mobile into existing digital measurement offerings.

What I found was a series of surprising truths about how mobile analytics is evolving. Nedstat was kind enough to sponsor this research — and in the interest of clear disclosure: Nedstat has been measuring and integrating mobile data into their web analytics offerings for years — and I am happy to announce the availability of this research in a new white paper titled “The Truth about Mobile Analytics.”

You can download this paper from the Nedstat web site for free (but they do ask your name, email, and company name):

DOWNLOAD THE TRUTH ABOUT MOBILE ANALYTICS

We are also holding a special webcast on the subject on June 23rd at 10 AM Central European Time (CET) which is unfortunately quite late in the evening for those of us in the U.S. but quite well timed for Nedstat’s customers. I suspect the webcast will either be repeated or rebroadcast at a later date and time.

SIGN UP TO JOIN THE MOBILE ANALYTICS WEBCAST ON JUNE 23

Also, if you’re really into mobile and mobile analytics please consider joining us at the X Change Conference September 9, 10, and 11 in San Francisco. More details will be out next week but our mobile sessions will be led by Greg Dowling from Nokia (a company with some knowledge of mobile I am told.)

I encourage everyone to download the paper and give it a read, regardless of your position on mobile and mobile analytics today. As always I welcome your feedback and commentary.

Analytics Strategy, Social Media

Columbus Web Analytics Wednesday Meets #fiestamovement

Last night was the monthly Columbus Web Analytics Wednesday at Barley’s Smokehouse and Brewpub, and we were fortunate to have Webtrends sponsor for the second time this year! This time, we managed to get it scheduled in a way that lined up with Noé Garcia‘s travel plans, so he wore the dual crown of “Traveled Farthest to the Event” (from Portland, OR) and “Sponsor Representative.” The dual crown looked surprisingly like an empty beer glass:

Noe Garcia of Webtrends

Noe and Bryan Cristina of Nationwide co-facilitated a discussion about going beyond the application of web analytics tools within the confines of the tool itself. The most active discussion on that front was spawned by one of the regular participants in the group who works at a major, Columbus-based online retailer. Not necessarily this guy, but maybe it was him. My lips are sealed.

Monish Datta explains an approach to web analytics

We talked about how web analytics data, tied to order information, and then matched back to offline marketing channels such as printed catalogs, can be very effective at driving marketing efficiency. In the examples that triggered the discussion, as well as from the other participants’ experiences, the consensus was that, while the ideal world would have all of this data hooked together automatically…rolling up your sleeves and tying the data together manually can still yield a substantial payback. Part of the discussion got into volume — for companies that do a lot of direct mail-oriented promotion, using web analytics data to cut the mail volume by even a fraction of a percent (by using that data to better target who does/does not respond to printed mail) can provide significant and quantifiable savings for a company.

I didn’t think I’d ever hear anyone at a WAW say “Zip+4” (that’s shorthand for the 5-digit zip code plus the four additional digits that you see on a lot of your mail)…other than me! But I did! The person who said that may or may not be a different person pictured in the photo above. Again…my lips are sealed!
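The mechanics of the approach we discussed can be sketched in a few lines. This is purely illustrative — the field names, the data, and the idea of matching on a normalized Zip+4 are my own simplification of the discussion, not anyone’s production process:

```python
# Hypothetical sketch: use web analytics order data to trim a direct-mail
# catalog list, matching records on a normalized Zip+4. All field names
# and records here are made up for illustration.

def normalize_zip4(zip_code: str) -> str:
    """Collapse '43215-1130', '43215 1130', or '432151130' to '432151130'."""
    digits = "".join(ch for ch in zip_code if ch.isdigit())
    return digits[:9]  # 5-digit ZIP plus the 4-digit add-on

def suppression_list(online_orders, mail_list):
    """Return mail-list entries whose Zip+4 already shows online conversion."""
    converting = {normalize_zip4(o["zip4"]) for o in online_orders}
    return [m for m in mail_list if normalize_zip4(m["zip4"]) in converting]

orders = [{"order_id": 101, "zip4": "43215-1130"}]
mailings = [
    {"household": "A", "zip4": "432151130"},
    {"household": "B", "zip4": "43085-2200"},
]

to_suppress = suppression_list(orders, mailings)
```

Even a crude manual join like this, run monthly against a large catalog file, is the kind of “roll up your sleeves” effort that the group agreed can pay for itself in reduced mail volume.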


And…Ford’s Fiesta Movement

Dave Culbertson, a WAW promotional channel unto himself, kicked off an entirely different, but equally intriguing discussion:

Dave Culbertson Expounds

It all started as Dave was driving his Mazda in Grandview a couple of weeks ago. He got quasi-cut off by a 2011 Ford Fiesta two cars ahead of him. That prompted this tweet:

Dave Culbertson's "I just got cut off" tweet

Now, Dave regularly mocks people who promote themselves as being social media gurus/experts/mavens…but he’s one of the most social media savvy marketers I know. He also knows his cars. For one of those reasons (or maybe both) he immediately recognized that the car in front of him was part of Ford’s Fiesta Movement so he nailed a very relevant hashtag with his tweet. As it happened, someone else on Twitter saw the tweet, quickly realized who the likely culprit was, tweeted to her, and she wound up apologizing via Twitter less than an hour after the incident!

Ms. Single Mama's Cut Off Apology

Ms. Single Mama is a popular blogger, and this was the first time that she and Dave met in person. Everyone was curious about her Ford Fiesta agent experience. She obliged us by explaining, and, later, a good chunk of us headed out to the parking lot to see the 2011 Ford Fiesta she is driving for six months:

mssinglemama.com and her 2011 Ford Fiesta

Yes, we had name tags. Yes, the initial group that followed Alaina out to look at her car was entirely male. Yes, all told, about twice as many people as this wound up checking out the car. And, finally, yes, Alaina made a call in the midst of this picture! Andrew (far left) commented that the dashboard looked like the head of a Transformer. He…was right!

Transformer Head

2011 Ford Fiesta Dashboard

Dave even demonstrated his social media hipness by snapping a picture of the vehicle with his iPhone and then tweeting it:

Dave Culbertson iPhones a picture of a 2011 Ford Fiesta

All in all, it was an engaging, informative evening. I’m sure I’ll miss some of the companies that were represented, but they included JPMorgan Chase, Nationwide, Victoria’s Secret Online, Webtrends, Clearsaleing, Bath&Body Works, Cardinal Solutions, Highlights for Children, Rosetta, Foresee Results, Acappella Limited, DK Business Consulting, Lightbulb Interactive…and others! Not. A. Bad. Crowd!

The next WAW will be July 15th. We’re working hard to get our calendar for the rest of the year nailed down, which means we are looking for sponsors and presenters. Please contact me at tim at <this domain> if you are interested on either front.

Analytics Strategy

The Teeter-Totter of Customer Data Management

Teeter-totter

I had a professor in business school who used to explain the relationship between the stock market and the bond market as a teeter-totter (in rural southeast Texas, I grew up knowing this as a see-saw): as the yields on one went up, the yields on the other went down and vice versa. 

Managing your customer data can be like that, too — the more of a burden you put on your customers and prospects to keep your data about them clean, the less of a burden you put on yourself. And, likewise, the more of a burden you take on yourself, the less of a burden you’re putting on your customer.

While bouncing through links from a tweet, I stumbled across Steve Woods’s original Contact Washing Machine post, and it set some alarm bells off. Steve’s a damn sharp guy — he was a co-founder and remains the CTO of Eloqua, and he is pretty much an undisputed visionary when it comes to marketing automation technology. Yet, this post sparked an immediate reaction, as well as teeter-totter imagery. Since then, Steve has clarified…and I think I misread his initial premise. His point is that data cleansing should happen as early in the data acquisition process as possible — cleanse the data as it comes in, rather than crossing your fingers and waiting to run batch processes after the fact in the hopes that the data will get cleaned up.

That’s a valid point, but, after digging deeper into the cross-links in the post, I still think there’s some under-estimating of what it takes to “fix” dirty data as it comes in. For starters, when it comes to customer/prospect data, there are typically a range of incoming data entry points:

Web Data Entry

In the world o’ the web, data can come into your systems directly as typed by a visitor to your site — when a user is filling out a web form, for instance. On the surface, that’s a great place to do data validation, because you’ve got the actual user right there to clarify anything that has gone amiss. If he’s fat-fingered his phone number or put in an e-mail address that is clearly not valid, it’s best to prompt him right then and there to correct the mistake. But, the teeter-totter comes into play: if that piece of data is really not germane (as perceived by the user), it doesn’t take long for your cleansing to lead to a frustrated visitor to your site. Worse, if you don’t allow the user to bypass the validation step (with an “I don’t care what you think, I’ve entered the information correctly, so just keep it that way and let me move on” option), there is a very good chance that you will keep some visitors from ever getting to where they and you want them to go!

If you include field validation on your web forms, and if you don’t allow the user to override that validation, it behooves you to include detailed form abandonment tracking in your web analytics to make sure you haven’t set up an insurmountable barrier for some of your customers.
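The validate-but-allow-override idea can be sketched as a tiny server-side check. This is a minimal illustration, not a recommendation of any particular validation library or e-mail regex (real e-mail validation is notoriously subtle):

```python
# Hypothetical sketch of the teeter-totter in code: validate a form field,
# but always leave the user an explicit override so cleansing never becomes
# a barrier. The rule and return shape are illustrative.
import re

# Deliberately loose pattern: something@something.something
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_email(value: str, user_override: bool = False):
    """Return (accepted, flagged_for_review)."""
    if EMAIL_RE.match(value):
        return True, False   # looks clean; nothing to review
    if user_override:
        return True, True    # accept as-is, but flag for backend cleansing
    return False, False      # prompt the user to correct it

# The user insists their address is fine; accept it and flag it.
accepted, flagged = validate_email("jane@example", user_override=True)
```

The key design point is the second return value: an override doesn’t mean the data is clean, it means the burden of cleansing shifts from the visitor back to you.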

Human Data Entry

Call centers almost always serve a data entry function as part of the customer service process. In addition, many companies have dedicated data entry staff to translate mail, fax, tradeshow-collected leads, or other transactions. This can be a great opportunity to clean your data up front, as you can certainly place a higher burden of getting the data right and enforced data validation on employees of your own company than you can on your customers and prospects.

BUT, this turns out to be a stickier wicket than it seems at first blush. If I had a nickel for every time I heard someone living in the world of backend data propose data augmentation or enhancement by updating the human data entry processes to “just add one more quick step,” I’d be able to buy a Starbucks Venti Caramel Frappuccino® blended coffee (which is a lot of nickels, if you think about it). Two reasons that there should be a proceed-with-extreme-caution label placed prominently on any solution that heads down this path:

  • Call centers typically live and die by the average handle time (AHT) for their calls; yes, they want to meet the customer’s needs, but they also, out of necessity, can save big dollars by cutting the AHT by a few seconds on average. Adding 5 or 10 seconds to every call can have a very real impact (and can make you some quick enemies with call center managers)
  • It’s easy to identify the benefits of more, more complete, or cleaner data…when it comes to backend processes and data analysis. But, is that benefit readily evident to the people whom you’re relying on to capture it? Does it benefit them directly, either through smoothing the immediate next steps in their process or by impacting their compensation? Due to the high-volume nature of call center and data entry work, data that is “just another field you need to fill out” is data that is at risk of falling prey to shortcuts (the first value in the dropdown, “aaa” in a text field, etc.). The most successful introductions of process changes have a net-no-change or net decrease in the number of steps/time/complexity of the process into which it is being introduced.

Human data entry offers opportunities to get data that is more complete and cleaner…but those opportunities don’t come automatically.

There are many other ways that data can enter your systems: provided by an intermediary (often semi-independent sales channels: distributors, resellers, etc.), sourced from a third-party lead sourcing company, passed in from another system within your company (often a system that doesn’t store the data in the same format or even have the same definitions for what specific fields mean and are used for), etc. There’s value in inspecting the sources of your customer data, assessing how clean the data is that comes from those different sources, and then, with the teeter-totter firmly in mind, investigating where and how to get that data coming in cleaner!

Photo courtesy of jhirtz.

Analytics Strategy, General

Demystifying Europe …

When I quit my job at Visual Sciences back in May 2007 to form Analytics Demystified I did so because I had a vision of a new type of web analytics consulting group. I very much wanted to build a small practice made up of very senior people capable of solving the really hard problems most companies have after they’ve made the investment in web analytic technology. I wanted to establish a firm that would complement the highly tactical firms that I respected so much — companies like Semphonic, Stratigent, and Europe’s OX2.

After two years I am very proud of the work I’ve done and the clients I’ve worked with. I have had the opportunity to work with some of the best brands, the best companies, and the most visionary management teams who are actively working to do more than simply “run reports” and instead want to actively compete on web analytics. That said, I have come to the realization that there is no way I could satisfy the global need on my own … so I did what every good business owner should do: I went out and got someone smarter, more eloquent, and better looking to be my business partner!

At Emetrics last week in San Jose I was incredibly excited to announce that Aurélie Pols, Europe’s most widely known and well respected web analytics consultant, has joined Analytics Demystified as a Principal Consultant.  Aurélie brings depth and experience in web analytics that is rare anywhere in the world and exceedingly rare in Europe, she was the first consultant to break the “one vendor” stranglehold in Europe that forced firms to work exclusively with a single technology, and she brings a brilliance to the explanation and use of these tools that amazes even me.

Now Aurelie and I will be working together in Europe to “demystify web analytics” and help companies make significantly better use of their technology investment. Between the two of us and our contacts across Europe Analytics Demystified will now be providing a far greater level of service than was previously possible.

I highly recommend that you read Aurélie‘s “Hello, World” blog post and start following her at aurelie.analyticsdemystified.com. If you have any questions about Aurélie’s practice or how Analytics Demystified can help you regardless of where you’re located, please don’t hesitate to contact us directly.

I hope you will join me in welcoming Aurélie to the Analytics Demystified team.

Analytics Strategy, General

Is Your Attribution Model Appropriate?

Recently I have spent an awful lot of time thinking about and talking about data accuracy issues in the field of web analytics. The widespread use of cookies as a tracking mechanism and the underlying assumption that “one cookie = one visitor” is a big part of the problem, but cookies are not the only problem. Another problem, one that I actually believe to be more substantial than cookies and visitors, is the challenge of campaign attribution.

Challenge? What’s hard about campaign attribution? You tag campaigns and web analytics tells you what works, right? You get pretty ROI graphs and click-reports and all that fun stuff? Campaign analytics is easy!

Wrong.

One of the best-kept secrets in online marketing is that most campaign attribution data is completely wrong and the models used to evaluate campaign performance are wholly inappropriate.  The relative nascence of digital marketing practices, combined with conflicting measurement systems and poorly understood interaction between online marketing channels, likely means that hundreds of millions of dollars are wasted annually on marketing efforts that don’t produce their intended results.

Companies are increasingly responding to this observation by re-examining their marketing measurement systems.  Even the most cursory analysis yields a great deal of information about the “campaign attribution problem.”  Popularized recently by Microsoft with their “Engagement Mapping” efforts as well as analysis published by Forrester Research and others, it is clear that the most widely used online campaign attribution model is inherently flawed.

To correct these flaws and begin to improve both the accuracy of measurement and the general understanding of how marketing really works online, Analytics Demystified recommends a new approach to campaign analysis.  Dubbed “Appropriate Attribution”, the approach leverages widely available but infrequently used data to triangulate towards the true value of online marketing efforts.

Given that the majority of online advertisers have direct response goals, and that most marketers are still generally unsatisfied with the campaign measurement tools at their disposal, Analytics Demystified believes that Appropriate Attribution is the first step towards improving companies’ collective understanding of their digital marketing efforts.

Eventually marketers will have access to robust warehouses of data detailing consumer interaction with online media and advertising, but the adage “you must walk before you can run” is as true in digital marketing as it is in life.  Before business owners and marketers become fully equipped to benefit from complex marketing mix analysis of online and offline channels, they are well advised to address the campaign attribution problem to increase the return on their valuable dollars spent for online marketing efforts.

Thanks to the fine folks at Coremetrics you can read all about Appropriate Attribution and learn how you can start to get a better understanding of your online marketing efforts today.

Download your copy of the Appropriate Attribution paper from Coremetrics today.

Analytics Strategy

Columbus Web Analytics Wednesday: April 22, 2009

In the interest of not messing with a good thing, we’re returning to Barley’s Smokehouse and Brewpub this month for our regular gathering of full-time, part-time, and just-generally-interested web analyst types.

We had a great turnout last month, and we’re on pace to match that this month, which means we’re needing to go easy on the Web Analytics Wednesday Global Sponsors. Rather than asking them to cover the full bill, we’re just having them cover the food and having everyone be on their own for beverages, which is still a wickedly good deal!

We’ve gotten feedback in the past that it’s good to have every second or third meetup be presentation-free, and this month will be one of those. However, we are looking into doing a little post-dinner speed networking — it’ll be quick, and we’ll find out whether it works or not. The inimitable Dave Culbertson of Lightbulb Interactive is running point on that and has been noodling around as to the best approach. It should be fun!

The details:

When: Wednesday, April 22nd at 6:30 PM
Where: Barley’s Smokehouse and Brewpub (1130 Dublin Road, Columbus, OH)
I hope to see you there!
And, if you, or anyone you know, would be interested in sponsoring a future Columbus Web Analytics Wednesday, please drop me a line at tim at gilliganondata.com. Our sponsorship flexibility is unparalleled in the industry — think rhythmic gymnastics meets Reed Richards. It’s a great way to get high visibility in a group of elite local marketing professionals. It’s a great way to support Columbus as a hotbed of web analytics thought leadership. It’s good karma.
Analytics Strategy

Hosam Elkhodary

UPDATE: There is information at the bottom of this post about how to make a donation in Hosam’s memory from June Li in the Web Analytics Forum.

On Tuesday of this week the web analytics community lost a passionate advocate with the passing of Hosam Elkhodary. I had the pleasure of working with Hosam just after founding Analytics Demystified as well as spending time with him at many an Emetrics. There are many blog posts about Hosam out there but the most touching is Mike Sukmanowsky’s — Hosam clearly had the same impact on Mike as he did on many of us. I encourage you to read Mike’s post and comment there if you knew Hosam.

Hosam will be missed.

FROM THE WEB ANALYTICS FORUM:

Some additional information for those who are interested in contributing to the Heart and Stroke Foundation fund for Hosam. You can donate online here:

http://www.heartandstroke.on.ca/site/c.pvI3IeNWJwE/b.3581623/k.C08D/Donate.htm

In addition designating that the contribution is in memoriam of Hosam, please make sure you address the card as follows, which will doubly ensure the donation is directed properly:

Abdalla Elkhodary
7 Delaney Drive
Ajax, Ontario
L1T 4B2

Analytics Strategy

40 Million Reasons Your Customer Data Isn't As Current as You Think (or Hope)

While not getting as much buzz as social media when it comes to hot topics in 2009, “customer data management” is something that marketers are starting to take seriously. It’s easy to start envisioning fancy pictures of capturing and using customer data:

  • Using behavioral data to drive timely and relevant emails
  • Integrating information across different customer touchpoints/channels to deduce customers’ and prospects’ preferred communications medium
  • Building analytic models to predict which customers are most likely to churn and making special offers to retain them

Those are all admirable goals. And, they’re all attainable. AND, they’re all going to be expected baseline capabilities within five years.

Before you tackle these higher order applications, it’s worth grounding yourself in an understanding of how rapidly customer data decays. Here are a couple of fun facts to wrap your head around on that front:

  • The U.S. Postal Service processes over 40 million address changes annually [source]
  • The population of the United States is estimated as being just north of 300 million [source]

Clearly, this isn’t an apples-to-apples comparison. But, we tend to imagine that our customers and prospects are more static than, in reality, they are — who they work for, what their job title is, and, yes, even where they live.

Analytics Strategy, Conferences/Community, General

Unique Visitors ONLY Come in One Size

Back in January I published a note about the proposed IAB Audience Reach Measurement Guidelines that generated a fair amount of interest. At the time I applauded the IAB for providing guidance regarding the definition of a “unique user” or “unique visitor” while noting some concerns about how the proposed definition would actually manifest. In summary, the new IAB definition of “unique visitor” needed some basis in underlying data, derived from secondary research, that can be directly tied to “a person.” Now that the IAB Audience Reach Measurement Guidelines have been officially published we can use the IAB’s own words:

“… in order to report a Unique User, the measurement organization must utilize in its identification and attribution processes underlying data that is, at least in reasonable proportion, attributed directly to a person” and “In no instance may a census measurement organization report Unique Users purely through algorithms or modeling that is not at least partially traceable to information obtained directly from people, as opposed to browsers, computers, or any other non-human element.” (Section 1.2.4)

The last little bit references, I believe, the IAB’s distinction of four types of unique “countables” — Unique Cookies (Section 1.2.1), Unique Browsers (1.2.2), Unique Devices (1.2.3) and Unique Users or Unique Visitors (1.2.4).  The term “measurement organization” was a little, well, mystifying as was evidenced in my January post, and sadly the final document does little to clarify this term other than to say the “document is principally applicable to Internet Publishers, Ad-serving organizations, Syndicated Measurement Organizations and auditors” on the IAB web site.

This definition is important since in my last post the real conundrum appeared to be that if “measurement organization” included Omniture, WebTrends, Google, Coremetrics, etc. then the IAB was essentially saying that the vendors needed to change the way they reported Unique Visitors, at least for their clients who would be subject to the purview of the IAB and MRC. What’s more, George Ivey from MRC never got back to my repeated requests for information, despite two members of the IAB working group (Josh Chasin from comScore and Pete Black from BPA Worldwide) openly disagreeing in their interpretation of the definition …

Well, a few weeks back I got a call from Joe Laszlo, an old co-worker of mine at JupiterResearch who is now the IAB’s Director for Analytics, the guy basically responsible for the document.  I always liked Joe and it was nice to hear from him again.  And Joe did clarify for me what a “measurement organization” is … he just didn’t directly clarify the impact on web analytics vendors.

According to Joe (and he will surely correct me publicly if I am misinterpreting our conversation) the “measurement organizations” that should be guided by this new definition of “Unique Users” are publishing organizations who are outwardly reporting their metrics for consideration by advertisers in the open market. Companies like AOL, Weather.com, ESPN, etc.  This is, I think, much more clear than the sentence a few paragraphs up that includes “Syndicated Measurement Organizations and auditors” and puts at least this part of the document in context: Essentially when using numbers coming from census-based systems, the IAB and MRC want publishers to start reporting Unique Visitor counts that have some basis in reality.

Pretty hard to disagree with Joe and the IAB on that point. We all pretty much agree that cookie-based visitor counting is messed up, and I think we can even agree that the degree to which these counts are “messed up” is a function of the target audience, the duration under examination, and the type of site.  For example, we expect cookie-based counts on sites that attract highly technical users on a daily basis to be much more impacted over a 90-day measurement period than, say, sites that attract largely non-technical users on a monthly basis over the same 90-day period.

So I’ll make one really bold statement right now, the kind that I have a tendency to regret but hey, it’s Monday and I’m feeling pretty good about the coming week:

The IAB are to be applauded for taking such a bold stand on the subject of counting and reporting unique visitors based on what we traditionally consider “web analytic” data.

I said as much in my last post … right after I said that the likelihood of the web analytics vendors following these recommendations was about the same as everyone waking up tomorrow to realize that the financial meltdown was a bad dream and the Dow is still over 14,000 (zero). The team of folks that the IAB brought together, which I understand included both Omniture and WebTrends, should be congratulated for taking a firm stand on one of the most dogged issues plaguing our collective industries (web analytics, online advertising, online publishing, syndicated research, etc.) for at least the past five years.

It is about time that we all agreed that “Unique Visitor” reports coming from census-based technologies frequently have no basis in reality. Further, we should all admit that cookie deletion, cookie blocking, multiple computers, multiple devices, etc. have enough potential to distort the numbers as to render the resulting numbers useless when used to quantify the number of human beings visiting a site or property.

Yes, before you give me grief with your “but they are probably directionally correct” response: I agree with you, they probably are. But fundamentally I believe that advertising buyers are at least as interested in the raw numbers as they are in the direction those numbers are moving. I say “probably are” because if you’re not taking the IAB’s advice and reconciling census-based data with data derived directly from people, well, you’re never sure if that change in direction is because your audience is changing, technology is changing, or there is a real and substantial increase or decline.

I mentioned above that my conversation with Joe didn’t really clarify the impact on web analytics vendors under the IAB’s new definition. Since I spent a fair amount of time thinking about the IAB guideline’s impact in this regard, I will make another bigger and bolder statement:

Starting immediately, I think the web analytics vendors and any company reporting a cookie-based count that is not in compliance with the IAB’s definition of “Unique Visitor” should stop calling said metric “Unique Visitors (or Users)” and correctly rename the metric “Unique Cookies”.

Yep, I am 100% in favor of using the IAB’s new terminology and being semantically precise whenever possible. The “Unique Visitor” counts in the popular web analytics applications are always actually counting cookies and so we should just go ahead and say that explicitly by calling them “Unique Cookies”. This change would actually give the web analytics vendors a neat opportunity … to battle to be the first to have a real “Unique Visitor” count that is based, as the IAB has suggested, on underlying data that is, at least in reasonable proportion, attributed directly to a person.

How could they do this? Let me count the ways:

  1. Develop a standard practice around the use of log-in and registered user data
  2. Work with third-party partners who are focused on gathering more qualitative data (for example, Voice of Customer vendors like ForeSee Results)
  3. Work with third-party partners who are estimating cookie-deletion rates, or at least have the potential to (for example, Quantcast)
  4. Work with third-party partners who can actually calculate cookie-deletion and multiple-machine use rates with some accuracy (for example, comScore, Google, Yahoo!)

I’m sure there are a few ways I am not thinking of, but these are the big four that have been talked about since 2005. While I expect to get some grief from paying clients about this statement, and I fully expect my suggestion to be widely ignored by the vendor community (no offense taken), I think this change would be a big step towards the recognition that there is only ONE DEFINITION of a “Unique Visitor” and this definition is only tangentially related to the number of cookies being passed around.
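To make the distortion concrete, here is a deliberately crude sketch of how far a unique-cookie count can drift from a people count. The deletion and multi-device rates below are invented for illustration only; real reconciliation would rely on panel, login, or survey data of the kind the list above describes.

```python
def estimate_people(unique_cookies, deletion_rate=0.30, machines_per_person=1.4):
    """Roughly back out a people estimate from a unique-cookie count.

    deletion_rate: fraction of visitors who clear cookies during the
        period, each showing up as at least one extra cookie
        (hypothetical value).
    machines_per_person: average distinct devices/browsers per visitor
        (hypothetical value).
    """
    if unique_cookies == 0:
        return 0.0
    # Each person generates machines_per_person cookies, inflated
    # further by the share of people who delete and get re-cookied.
    cookies_per_person = machines_per_person * (1 + deletion_rate)
    return unique_cookies / cookies_per_person

# A site reporting 1,000,000 "Unique Visitors" may, under these
# assumed rates, be closer to ~549,000 actual people.
print(round(estimate_people(1_000_000)))
```

Even with generous assumptions, the gap is large enough that calling the raw count “Unique Cookies” is simply honest labeling.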

Like Soylent Green(TM), “Unique Visitors” are PEOPLE and our industry will go a long way towards maturation when we collectively agree on this fundamental truth. That is not to say that Unique Cookies is not a valuable count — hell, in the absence of a strategy for reconciling cookies against people-based data, unique cookies are all we have. But I do not believe that after nearly 15 years we are doing the online measurement community any justice by plugging our ears and singing “LA LA LA LA I CANNOT HEAR YOU GO AWAY!!!!!”

Which brings me to my last point …

I was really, really bummed out to read Jodi McDermott’s MediaPost article titled “Unique Visitors Come in Two Shapes and Sizes.” I was bummed because I have always liked Jodi since we worked together at Visual Sciences, because I think she is a brilliant member of our community, and because I knew I was going to end up writing these words … Jodi’s thesis is wrong and does the web analytics community a disservice by attempting to defend a mistake, asking to water down a good definition just because it isn’t “hers” (in quotes since Jodi is a member of a larger committee charged with defining standards within the WAA).

From Jodi’s article (which I recommend you read, especially the comments, and the emphasis is mine):

“Bravo to the IAB for forcing the issue with audience measurement companies to standardize the way that they report uniques, but from a Web analyst’s perspective — and as a member of the WAA Standards committee — I wish they would have not allowed the term “unique visitors” to be redefined in such a way as to allow for multiple definitions in the space. Web analysts and media planners today have a hard enough time trying to figure out which data source to use and which standard to apply when performing their job — but that issue is now compounded even more by multiple definitions of unique visitors. In defense of the IAB, its membership is comprised of some heavy-hitter companies who are not about to change that “tab” in their reporting UI that says “Unique Visitors” on it. But in defense of WAA individual and company members, which include vendors such as Omniture and WebTrends (who were both listed as “Project Participants” on the IAB document, interestingly enough), neither are we. The term will live on in both places.”

I think what Jodi has missed here is that the IAB has actually given the world a useful and more accurate definition of “Unique Visitors” than any used in the web analytics industry today. More importantly, given the relative weight, clout, and respect enjoyed by the IAB in the wider world, I don’t think their definition allows for “multiple definitions” … I rather think that over time the IAB expects their member companies, especially those who want to have their numbers audited and publicly used, will consider the IAB definition the definition of “Unique Visitors” and properly consider the term we web analysts widely use today to be “Unique Cookies.”

I’m not sure what Jodi means by “heavy-hitter companies who are not about to change their ‘tab’” since I’m aware of very few companies today that have implemented the IAB recommendation for practical and ongoing use. But I was incredulous when I read the statement regarding using the IAB’s new definition: “in defense of the WAA individual and company members, which include vendors such as Omniture and WebTrends, neither are we. The term will live on in both places.”

Seriously? Rather than start calling our cookie counts “Unique Cookies” and having a rational conversation with our bosses to explain that the technology we use is limited in its ability to discern real people, you prefer to throw down the gauntlet with the IAB and say “screw your definition?” Despite the criticism that has been both wrongly and rightly heaped on the WAA’s “standard” definitions, despite the considerable group that crafted the IAB’s definitions, and considering the fact that the WAA’s definition is wrong, you want to pick a fight?

Two wrongs never make a right, and you’re wrong twice here. Sorry.

I am not on the WAA Standards Committee, I am not on the WAA Board of Directors, and my dues with the WAA are about to lapse so I have no basis for representing the organization. Perhaps I am reading more into Jodi’s post than I should, given my knowledge of her passionate work in the WAA, but I would strongly encourage the current Board of Directors to examine Jodi’s statements in the context of the IAB relationship and the “bigger picture” at play. Because while Jodi may speak for the WAA Standards Committee and by extension the entire WAA, she certainly does not speak for me.

I will gladly use the term “Unique Cookies” when I am talking about a cookie-based count and reserve the term “Unique Visitors” for those situations where I have some basis for doing so. More importantly I will encourage my clients and vendor friends to consider doing the same. The IAB has given the entire measurement community a reason to take a huge leap forward and gain clarity around one of our most important metrics. To turn our back on this opportunity because it will necessitate change, require additional explanation, or because “we like our definition better” is wrong, wrong, wrong.

Harrrrumph.

I suspect like previous posts on the subject this will generate some conversation. As usual I do not pretend to have all the answers and I welcome your feedback. I am, unfortunately, traveling all day Monday and will have limited ability to approve and respond to comments but I promise to do so as quickly as possible.

Analytics Strategy

A Record-Setting Web Analytics Wednesday in Columbus

It turns out, as I’ve been looking at my records (read: old blog posts), that this month’s Columbus Web Analytics Wednesday marked our one year anniversary, which means it was our 13th WAW. As we were discussing it last night, we thought next month was the one-year mark, but, in a post from last May, I referenced that gathering as our third, and the Internet just doesn’t lie!

It’s only taken a year, but we just might have hit upon the perfect spot: Barley’s Smokehouse and Brewpub. For a nominal fee, we were able to reserve a room, which gave us some real volume control when we got into the group discussion portion of the evening, and there’s plenty of room for future expansion. As it was, the final headcount was 28 people, which was right around 50% higher than we’ve ever had before. Credit for that goes to:

  • WebTrends, our sponsor for the evening — we finally got some folks to come out who have been meaning to in the past but hadn’t quite made it; some of those people are WebTrends users; unfortunately, due to a scheduling conflict, Noé Garcia, the WebTrends account executive who supported our sponsorship request, wasn’t able to make the trip from Portland for the event.
  • WebTrends, who provided the evening’s speaker/topic — John Defoe, VP of Solution Services and A Bunch of Other Stuff, kicked off a discussion about using web analytics data outside of the web analytics environment; more on that in a bit.
  • The Amazing Blanqueras — we had some repeats who discovered us through past WAW promotions on the Columbus Tech Life Meetup site, and we again brought in some fresh faces from the site; as Columbus Tech Life grows, so will WAW!
  • Dave Culbertson — nary a WAW goes by without someone being there because Dave ran into them and encouraged them to attend
  • Twitter — a number of people tweeted about the event, but my unofficial observations put Jenny Wells of TeamBuilder Search as the lead tweeter on that front

Other than that, we ran our usual gamut of promotions and, presumably, picked up people through those channels as well. I’m sure we picked up a person or two who was Googling Monish Datta and wound up on this site (I’m up in the top 5 results for a Google search for Monish — I don’t think I’m ever going to overtake his LinkedIn profile). Feel free to take a jab at a WAW blog post not having definitive data on its own traffic sources…

As to the topic, John kicked off the discussion by sharing some examples of WebTrends customers who are using web analytics data beyond the web analytics environment:

  • A motorcycle manufacturer who uses web analytics data to score leads (site visitors) before passing them on to their dealers for follow-up — giving the dealers a prioritized list of who is more likely to be ready to buy
  • A media site that uses web analytics data to do an hourly refresh of the “most popular articles” on its home page (which led to a $2 million uplift in ad revenue, if I heard correctly) — I’ve always wondered how much that sort of functionality gets hit by a feedback loop (an article just barely cracks the “most popular” list, but, then, by being on the list, it gets more clicks and remains there), but I didn’t get a chance to ask
  • A company that uses web analytics data for targeted re-marketing via e-mail — identifying what content a person has viewed and using that to tailor e-mails promoting the same or similar products


Columbus Web Analytics Wednesday - March 2009

John used those examples as a way to launch discussions of where others are using web analytics data outside of the web analytics environment:

  • I chimed in with my experiences with using web analytics data for lead scoring that combines web activity data with CRM information and then pushes the lead score into the CRM system
  • Scott Zakrajsek briefly explained how Victoria’s Secret uses web analytics data for targeted e-mail re-marketing
  • Bryan Cristina shared a Nationwide car insurance example…that I totally missed (um…they’re my employer; you’d think I would’ve paid more attention there!)
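The lead-scoring integration described above can be sketched very roughly. Every field name and weight below is invented purely for illustration; this is not anyone’s actual scoring model, and a real implementation would tune the weights against closed-deal history.

```python
def score_lead(web_activity, crm_record):
    """Combine web analytics activity with CRM data into a lead score.

    Field names and weights are hypothetical; a real model would be
    calibrated against historical conversion data.
    """
    score = 0
    # Web behavior signals (capped so one metric can't dominate)
    score += min(web_activity.get("pricing_page_views", 0), 5) * 10
    score += min(web_activity.get("visits_last_30d", 0), 10) * 3
    if web_activity.get("downloaded_whitepaper"):
        score += 15
    # CRM signals
    if crm_record.get("industry") in {"insurance", "finance"}:
        score += 20  # assumed target verticals
    if crm_record.get("title", "").lower().startswith(("vp", "director")):
        score += 10
    return score

lead = score_lead(
    {"pricing_page_views": 3, "visits_last_30d": 6, "downloaded_whitepaper": True},
    {"industry": "insurance", "title": "Director of Marketing"},
)
print(lead)  # 93
```

The hard part, as the wrap-up discussion below notes, isn’t the scoring arithmetic — it’s getting the web analytics and CRM systems to reliably exchange the underlying data.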

The wrap-up thoughts, I think, could be summarized as follows:

  • Soooooo many companies aren’t even trying to do any of these sorts of things today
  • It won’t be long before these sorts of uses of web analytics data will be a must-have rather than a cutting-edge differentiation opportunity
  • It sounds easy enough, but, when you get down to it, getting different systems to really talk to each other (or to build a layer to pass information back and forth between them in a meaningful way) takes some roll-up-your-sleeves hard work and the tenacity to stick with it until it works
  • Having an engaged executive sponsor is darn near a must to pull these off
  • Having someone driving the project who really, really “gets it” makes things go a lot smoother…but outsourcing is a viable option

Columbus Web Analytics Wednesday - March 2009

If you’re interested in learning more, it’s not too late to book a trip to Vegas for WebTrends Engage ’09! John’s got a whole session on this basic subject on Wednesday, April 8th.

If you’re interested in sponsoring a future Web Analytics Wednesday, drop me a line at tim at gilliganondata.com!

Analytics Strategy

Columbus Web Analytics Wednesday: March 18, 2009

We’re a week later than maybe would’ve been ideal, but our sponsors are coming all the way from Portland, Oregon, and I expect it to be worth the wait! WebTrends will be sponsoring the event, and Noé Garcia (Strategic Account Executive) and John Defoe (VP of Solution Services) will be attending in person. John will be presenting and facilitating the topic for the evening: “Web analytics data beyond the web analytics platform.” This is a hot topic in a lot of circles and includes, in my mind:

  • Integrating user-level web behavior into a CRM system to provide Sales with greater insight into the interests of specific prospects and customers (this is standard Eloqua functionality…but don’t read that as me thinking Eloqua offers a remotely robust web analytics solution)
  • Using user-level web behavior to score leads (in conjunction with non-web analytics data) to improve the lead qualification and lead nurturing process
  • Closing the loop and using web analytics insights to dynamically drive relevant web content — something that lots of people talk and wave their hands about…but that is extremely hard to actually build in a way that really works

I’m sure there are other aspects of this topic. I, for one, am looking forward to hearing John’s thoughts on the subject.

As to the details:

When: Wednesday, March 18th at 6:30 PM
Where: Barley’s Smokehouse and Brewpub (1130 Dublin Road, Columbus, OH)
I’m looking forward to the new venue. We continue to struggle to find a place that has a suitably decent food and drink menu, is suitably centrally located, and is suitably non-noisy for us to be able to handle a presentation and discussion. Dave Culbertson, one of the co-organizers of the event, suggested Barley’s, and two other organizers responded enthusiastically (with the e-mail equivalent of a V8-style self-applied palm to the forehead), so I’m optimistic.
I’m hoping we have a good turnout!
Analysis, Analytics Strategy, Excel Tips, General, Presentation, Reporting

The Best Little Book on Data

How’s that for a book title? Would it pique your interest? Would you download it and read it? Do you have friends or co-workers who would be interested in it?

Why am I asking?

Because it doesn’t exist. Yet. Call it a working title for a project I’ve been kicking around in my head for a couple of years. In a lot of ways, this blog has been and continues to be a way for me to jot down and try out ideas to include in the book. This is my first stab at trying to capture a real structure, though.

The Best Little Book on Data

In my mind, the book will be a quick, easy read — as entertaining as a greased pig loose at a black-tie political fundraiser — but will really hammer home some key concepts around how to use data effectively. If I’m lucky, I’ll talk a cartoonist into some pen-and-ink, one-panel chucklers to sprinkle throughout it. I’ll come up with some sort of theme that will tie the chapter titles together — “myths” would be good…except that means every title is basically a negative of the subject; “Commandments” could work…but I’m too inherently politically correct to really be comfortable with biblical overtones; an “…In which our hero…” style (the “hero” being the reader, I guess?). Obviously, I need to work that out.

First cut at the structure:

  • Introduction — who this book is for; in a nutshell, it’s targeted at anyone in business who knows they have a lot of data, who knows they need to be using that data…but who wants some practical tips and concepts as to how to actually go about doing just that.
  • Chapter 1: Start with the Data…If You Want to Guarantee Failure — it’s tempting to think that, to use data effectively, the first thing you should do is go out and query/pull the data that you’re interested in. That’s a great way to get lost in spreadsheets and emerge hours (or days!) later with some charts that are, at best, interesting but not actionable, and, at worst, not even interesting.
  • Chapter 2: Metrics vs. Analysis — providing some real clarity regarding the fundamentally different ways to “use data.” Metrics are for performance measurement and monitoring — they are all about the “what” and are tied to objectives and targets. Analysis is all about the “why” — it’s exploratory and needs to be hypothesis driven. Operational data is a third way, but not really covered in the book, so probably described here just to complete the framework.
  • Chapter 3: Objective Clarity — a deeper dive into setting up metrics/performance measurement, and how to start with being clear as to the objectives for what’s being measured, going from there to identifying metrics (direct measures combined with proxy measures), establishing targets for the metrics (and why, “I can’t set one until I’ve tracked it for a while” is a total copout), and validating the framework
  • Chapter 4: When “The Metric Went Up” Doesn’t Mean a Gosh Darn Thing — another chapter on metrics/performance measurement: a discussion of the temptation to over-interpret time-based performance metrics. If a key metric is higher this month than last month…it doesn’t necessarily mean things are improving. This includes a high-level discussion of “signal vs. noise,” an illustration of how easy it is to get lulled into believing something is “good” or “bad” when it’s really “inconclusive,” and some techniques for avoiding this pitfall (such as using simple, rudimentary control limits to frame trend data).
  • Chapter 5: Remember the Scientific Method? — a deeper dive on analysis and how it needs to be hypothesis-driven…but with the twist that you should validate that the results will be actionable just by assessing the hypothesis before actually pulling data and conducting the analysis
  • Chapter 6: Data Visualization Matters — largely, a summary/highlights of the stellar work that Stephen Few has done (and, since he built on Tufte’s work, I’m sure there would be some level of homage to him as well). This will include a discussion of how graphic designers tend to not be wired to think about data and analysis, while highly data-oriented people tend to fall short when it comes to visual talent. Yet…to really deliver useful information, these have to come together. And, of course, illustrative before/after examples.
  • Chapter 7: Microsoft Excel…and Why BI Vendors Hate It — the BI industry has tried to equate MS Excel with “spreadmarts” and, by extension, deride any company that is relying heavily on Excel for reporting and/or analysis as being wildly early on the maturity curve when it comes to using data. This chapter will blow some holes in that…while also providing guidance on when/where/how BI tools are needed (I don’t know where data warehousing will fit in — this chapter, a new chapter, or not at all). This chapter would also reference some freely downloadable spreadsheets with examples, macros, and instructions for customizing an Excel implementation to do some of the data visualization work that Excel can do…but doesn’t default to. Hmmm… JT? Miriam? I’m seeing myself snooping for some help from the experts on these!
  • Chapter 8: Your Data is Dirty. Get Over It. — CRM data, ERP data, web analytics data, it doesn’t matter what kind of data. It’s always dirtier than the people who haven’t really drilled down into it assume. It’s really easy to get hung up on this when you start digging into it…and that’s a good way to waste a lot of effort. Which isn’t to say that some understanding of data gaps and shortcomings isn’t important.
  • Chapter 9: Web Analytics — I’m not sure exactly where this fits, but it feels like it would be a mistake to not provide at least a basic overview of web analytics, pitfalls (which really go to not applying the core concepts already covered, but web analytics tools make it easy to forget them), and maybe even providing some thoughts on social media measurement.
  • Chapter 10: A Collection of Data Cliches and Myths — This may actually be more of an appendix, but it’s worth sharing the cliches that are wrong and myths that are worth filing away, I think: “the myth of the step function” (unrealistic expectations), “the myth that people are cows” (might put this in the web analytics section), “if you can’t measure it, don’t do it” (and why that’s just plain silliness)
  • Chapter 11: Bringing It All Together — I assume there will be such a chapter, but I’m going to have to rely on nailing the theme and the overall structure before I know how it will shake out.
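The “simple, rudimentary control limits” mentioned in Chapter 4 amount to something like the following. This is a deliberately naive sketch with made-up numbers; a proper control chart would use moving ranges, but the banding idea is the same: frame the trend so that ordinary wiggle doesn’t get mistaken for a real change.

```python
from statistics import mean, stdev

def control_limits(history, sigmas=3):
    """Frame a trended metric with rudimentary control limits.

    Returns (lower, upper) bounds; a new value inside the band is
    probably noise, not a trend worth reacting to.
    """
    m, s = mean(history), stdev(history)
    return m - sigmas * s, m + sigmas * s

# Hypothetical eight weeks of a key metric
weekly_orders = [118, 124, 131, 120, 127, 122, 129, 125]
low, high = control_limits(weekly_orders)

this_week = 133  # higher than any prior week...
verdict = "signal" if not (low <= this_week <= high) else "noise"
print(verdict)  # noise -- "the metric went up" means nothing here
```

A week of 133 orders is the best on record, yet it sits comfortably inside the band — exactly the “good vs. inconclusive” trap the chapter describes.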

What do you think? What’s missing? Which of these remind you of anecdotes in your own experience (haven’t you always dreamed of being included in the Acknowledgments section of a book? Even if it’s a free eBook?)? What topic(s) are you most interested in? Back to the questions I opened this post with — would you be interested in reading this book, and do you have friends or co-workers who would be interested? Or, am I just imagining that this would fill a gap that many businesses are struggling with?

Adobe Analytics, Analytics Strategy, General, Social Media

Omniture, Europe, SAS, WebTrends, and Twitter!

You may be wondering “What do those things have in common?” You may also be wondering “Did Eric drop off the face of the Earth?” The answer to the first question is the explanation to the second …

Despite changes in Analytics Demystified’s client portfolio–changes that I believe accurately reflect the current economic climate–we are busier than ever here in Portland, Oregon.  Or rather not in Portland, Oregon as Q1 2009 has me bouncing around the globe to talk about web analytics, something I enjoy tremendously.

World Tour 2009 (Part I) got started a few weeks back at the Omniture Summit in Salt Lake City, Utah. If you haven’t been to an Omniture Summit, and assuming you are an Omniture, WebSideStory, Visual Sciences, Instadia, Mercado, Offermatica … am I forgetting anyone?! … customer, I definitely recommend attending if you have the chance. Aside from excellent production and plenty of attention to detail I felt like Omniture did a great job on the content, something they took some criticism for in years past. The break-out sessions I saw paired an Omniture employee with a customer, analyst, or industry leader and in general the result was informative without being overly sales-y.

Perhaps the thing I enjoyed the most was that, despite my occasional open criticism of Omniture and some of their practices, senior management seemed (or at least pretended) to be happy enough to see me. I had a wonderful conversation with President of Sales Chris Harrington, spent some time with Gail Ennis and John Mellor, and even got to share Swedish Fish with Brett Error (who is now on Twitter: @bretterror). Even Josh James and I had a chance to catch up … but no, I didn’t hug it out with Matt Belkin 😉

The World Tour continues here in Portland, then off to Milan, Madrid, and Washington, D.C. Locally I am excited to get to present at SearchFest 2009, but I have to admit I’m somewhat more excited about my first trip to Milan, Italy for Web Analytics Strategies 2009 and my first return to Madrid in several years. Perhaps most exciting, following a special presentation with MV Consultoria, I will get to meet Rene and Aurelie’s new baby Lucca! After a brief return home (to spend time reading with my five year old daughter who has recently adopted her dad’s great love for reading) I fly to D.C. to deliver a keynote presentation at the SAS Global Forum.

And that is just the beginning. You can see the complete schedule under “Consulting” at Analytics Demystified, and I am actively booking conferences and presentations in June and July.

Which brings me to Twitter …

I wouldn’t say I was an early adopter of Twitter, not by a long shot. I actually met co-founder Biz Stone in Rotterdam and admitted “No, I don’t really understand the service …” I was eventually goaded into trying Twitter by Aaron Gray of WebTrends and started seeing the inherent value after getting people to use the #wa hashtag to identify web analytics (and Washington State) related content.

Of course, if you know me, you know I was unlikely to stop there …

After a short beta test with something I called the “Twitter Influence Calculator”, last week I rolled out The Twitalyzer. With tongue in cheek I have described the service as “Google Analytics for Twitter” and by all measures the service has taken off. To date nearly 20,000 unique Twitter users have tried the service, which summarizes your use of Twitter and provides a handful of interesting measures of success (influence, generosity, velocity, clout, and the signal-to-noise ratio).
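Twitalyzer’s actual formulas aren’t spelled out here, so purely as an illustration, a signal-to-noise ratio for a Twitter stream might look something like this. The classification rules below are my own invention, not Twitalyzer’s, and the sample tweets are made up.

```python
def signal_to_noise(tweets):
    """Fraction of tweets carrying 'signal': links, references to
    other users (@), retweets, or hashtags.

    The markers are hypothetical classification rules, chosen only
    to illustrate the kind of metric being described.
    """
    if not tweets:
        return 0.0
    signal = sum(
        1 for t in tweets
        if any(marker in t for marker in ("http", "@", "RT ", "#"))
    )
    return signal / len(tweets)

sample = [
    "RT @erictpeterson: new post on unique visitors http://bit.ly/xyz",
    "eating a sandwich",
    "great #wa discussion at tonight's WAW",
]
print(round(signal_to_noise(sample), 2))  # 0.67
```

Two of the three sample tweets reference people, links, or the #wa channel, so the stream scores roughly two-thirds signal under these invented rules.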

Rather than spend a bunch of time telling you about it I encourage you to check it out at http://twitalyzer.com

While I have been incredibly busy between these travels, client work, writing proposals, and messing with Twitter I am of course always happy to hear from readers. Send email, Twitter me (@erictpeterson), or look for me at one of the conferences above!

Analytics Strategy, Social Media

Customer service done right in Twitter, #wa style

Like many people, over the past few months I have become quite the Twitter-wonk. I find myself spending an increasing amount of time monitoring the #wa channel in Twitter, even if my individual contribution has a tendency to ebb and flow. And while I watch the Twits ramble on, one thing I have developed is an appreciation for the work that Ben Gaines is doing on behalf of Omniture.

Who is Ben Gaines? Ben is the guy who monitors all of Twitter for things like “reported 25 hour latency in omniture conversion reporting. good thing we’re not ecommerce” and “really productive omniture call – happiness is helpful reporting tools!!” More importantly, Ben is the guy who is paid by Omniture to take the time to reach out to anyone and everyone who has a problem in an attempt to engage them in a positive conversation.

Yep, Ben Gaines is @OmnitureCare.

Given the challenges that every web analytics vendor faces, combined with the naked conversations happening in Twitter, the fact that the management team at Omniture has dedicated an even keel like Ben is a testament to the company’s awareness of the marketplace around them. And while other vendors have slowly started to dedicate similar resources, Ben has established himself (at least in my mind) as the standard against which all other analytics vendors’ representatives in Twitter will be judged.

Even though I’m heading to Salt Lake City in a few days and will have the opportunity to meet Ben face-to-face, I reached out to the team at Omniture and asked to interview him for my blog. My questions and Ben’s responses follow.

Q: Tell me a little about yourself … who is “Ben Gaines” and how did you get into web analytics?

A: I never quite know what to say in introducing myself, so I’m going to give you 10 words/phrases to describe me: Husband. Father. Boston expatriate (and, yes, Red Sox fan). Computer geek. Wannabe athlete. Omniture-ite. Web analytics student. MBA candidate. Writer. That’s me in a nutshell, I suppose. And it’s slightly embarrassing how hard it was for me to come up with that list.

Would it be cliché for me to say that I first got into web analytics in seventh grade when I put a hit counter on my first web site? My first serious foray into web analytics was at my last company, where I helped to run what was then Utah’s official travel web site. Analytics wasn’t part of my primary responsibilities, but I remember being fascinated by the technology involved and the business logic that defined how we used the data. When the opportunity to move to Omniture came along, I jumped at the chance.

Q: When did you start at Omniture and how did you get appointed to the role of “Twitter Support Rep?”

A: I started here in April 2006 in our ClientCare support group (then called “Live Support”), and moved into a role as a support engineer, with more of a programming emphasis, about a year later. Both of these positions helped me to become personally invested in our clients’ success, and I have tried to apply that sense of responsibility to everything I’ve done at Omniture.

I don’t believe that I have been given the opportunity to represent ClientCare on Twitter because I am singularly capable of doing so; my colleagues are similarly accomplished and insightful. What I believe I do offer is a strong understanding of the “under the hood” aspects of Omniture tools and implementation, a decent amount of experience working with these products as well as with our clients, and a strong desire to be out there helping people get the best value out of their Omniture experience.

Q: Do you do something else at Omniture other than monitor Twitter?

A: I currently help to manage our online documentation efforts (with particular emphasis on our Knowledge Base), and am involved with support issues in certain cases. I also dabble in building internal tools and scripts to help us serve our clients better and/or faster. While I do monitor Twitter very closely, I’ve always got something else going on my other monitor. There is more than enough to keep me busy.

Q: Describe the tools you use to monitor Twitter for Omniture?

A: I’ve tried probably a dozen Twitter apps. My favorite is currently TweetDeck, primarily because it allows me to monitor mentions of Omniture, SiteCatalyst, etc. perpetually in a separate column. That is really the most critical feature of any tool I’d consider using to interact with Twitter for customer service purposes. Most support requests via Twitter aren’t in replies to me directly; they’re found because someone—often someone not even following me—mentioned Omniture in their tweet. That’s when I step in, if I believe I can help in any way.

Q: Tell us a little about how you help customers using Twitter?

A: There are a few ways that I try to help customers using Twitter. One is to disseminate information quickly to a large group of people. During my time at Omniture, I’ve really tried to learn the “ins and outs” of SiteCatalyst and our other products, and I love sharing those hidden gems whenever possible. When there is an issue that everyone needs to know about, or a tip that I learned in a conversation with a colleague that I believe would benefit our users generally, I’ll throw it out there. I’ve gotten really good feedback on that practice.

Another way is as a resource for quick questions—things that may not warrant calling in to our ClientCare team and that I can handle on the spot or with just a minute or two of research—which clients are welcome to throw at me. These are actually my favorite in the context of Twitter because they often allow others to learn and contribute along with whoever is asking the question. What’s really cool about this is seeing other clients jump in and nail the answers to these questions before I do.

We’ve seen that our efforts on Twitter can sometimes even reduce the number of support calls. Many of these questions/issues are actually fairly straightforward, and can be resolved in one or two tweets.

Finally, of course, I watch for mentions of Omniture or our products that may be support or feature requests and do what I can with them. We’ve gotten some really excellent feature requests via Twitter, and our Product Management team very much appreciates it.

Q: Tell us a little about how you deal with non-customers / complaints about Omniture?

A: I suppose this depends on the nature of the tweet. There are certain complaints (as well as non-customer questions) which are completely legitimate, and I do my best either to address them or to point the individual in the direction of someone who can. I am not sure I can help people who are negative for the sake of negativity in 140 characters.

Q: What is the funniest Tweet you’ve seen/received about the company?

A: The funniest tweet about the company was one that said, “wondering when omniture will be able to provide users with a brain plug-in as part of the suite.” We’re working on it. I think it’s in beta.

Q: Who do you follow on Twitter?

A: The people I follow typically fall into two categories. Of course, I follow our customers. Finding our customers on Twitter can be tricky, so I often have to wait until one of them tweets about Omniture before I can follow them. Then I also follow industry thought leaders—yourself, Avinash, and others—from whom I am learning a ton about web analytics in general.

When someone begins to follow me without having tweeted about Omniture, I usually check his or her profile to see whether or not the person is likely to be a customer or to tweet about web analytics or Internet marketing (SEO, SEM, etc.). If so, I’ll follow. If not, I won’t.

The thing about using Twitter (or other social media) for customer support is that by following dozens or hundreds of people, I end up with a lot of updates regarding what so-and-so is eating for lunch, while I’m there mostly for professional, rather than personal, purposes. Maybe I’m a good candidate to represent ClientCare on Twitter because I don’t mind the personal updates at all. Frequently I find myself getting jealous of what our clients are eating for lunch, though.

Q: How important do you think Twitter is to customer relationship management?

A: First of all, I think it’s important to note that Twitter is only a part of our overall social media efforts. I will be starting to post on blogs.omniture.com shortly, and we’ve already got a ton of great content out there from 15 different experts. We want to hear from our customers about the issues they are facing and share information that will help them do their jobs better. The most important thing is staying on top of the latest trends in this area; today, a lot of our customers are on Twitter, but in six months it might be some other tool. Whatever it turns out to be, we’ll be there.

Regarding Twitter and customer relationship management, I know it’s been hugely important for us—for ClientCare, and really for Omniture as a whole. I love the idea that we can listen to our customers so easily. When there are support issues, we can deal with them more quickly than ever before. When there are feature requests, it’s easy to gauge whether there is a groundswell of support for the idea. When there are complaints, we can deal with them immediately and, in many cases, put customers’ minds at ease.

We’ve received a lot of very positive feedback regarding our efforts on Twitter. I think it’s important for customers to know that we are listening. It empowers them to interact with us in a new and powerful way. And that’s not just rhetoric—we really are listening.

The other way that Twitter is important is that it feeds into the two other main thrusts of ClientCare’s efforts—support and documentation—while those elements also feed into Twitter, allowing us to solve issues and answer questions more completely than ever before. When someone asks a question via Twitter, it often feeds into the Knowledge Base. Conversely, as I am working on our documentation I frequently find information that I believe would be useful to many of our clients, and will post it on Twitter. Support issues feed into the Knowledge Base and Twitter as well; when there are general questions asked of our ClientCare team, those will often find their way into both our documentation and onto Twitter. And tweets often result in support tickets being opened, and subsequently in additions to our documentation, when questions and issues go beyond what I can handle in 140 characters.

Q: What are your measures of success as a Twitter Support Rep?

A: I think I’m still trying to feel out what the correct metrics are. Certainly time to response and time to resolution are KPIs, but that goes without saying in customer support and relationship management. At this point, I suppose my goal is to leave 100% of clients who interact with me feeling more confident in their Omniture abilities. It’s always a success when I’m able to disseminate knowledge and help our customers get better value out of our tools.


Thanks to Ben and his managers for allowing me to conduct this interview. If you know of someone else in the web analytics arena doing excellent work in Twitter I’d love to hear about it.

Analytics Strategy

Monish Datta: "I can’t believe Sasha skipped WAW for the US-Mexico World Cup Qualifier!"

Actually, THAT’s almost a direct quote. Sarcastic as it may be. We were actually competing with a US-Mexico World Cup qualifying match that was being played in Columbus (in some crazy weather…but I’m getting ahead of myself). The US won 2-0, for what it’s worth, and I’m sure Sasha enjoyed the game. I’ll get an update next month!

This month’s WAW was something of a last-minute adventure. We once again had the event sponsored by the Web Analytics Wednesday Global Sponsors, and we had a drawing for a WASP for Analyst license, which David Ruen won. I’ve got to give a tip of the hat to Sandy and Ben Blanquera for the amazing work they’ve done getting Columbus Tech Life up and running, as we continue to bring in a few fresh faces from the Columbus Tech Life Meetup postings for WAW.

The real adventure this month was that the forecast for Columbus called for high winds Wednesday evening. And, as the day progressed, there were rumors floating around about a Level 2 Storm Alert and 60 mph winds. After a brief flurry of e-mails, we decided that we would go ahead and have WAW, and I sent out a quick note to that effect, but let people know not to worry if they’d registered but then weren’t going to make it due to the weather. Power is still out in some neighborhoods in the area a full day later — the wind lived up to the hype.

I was heading to the event between one wet, windy, heavy storm and what later turned out to be mostly just high winds, and caught a pretty spectacular double rainbow out my window. Lucky for me, I was heading to WAW, so I had more than just my Blackberry available to snag a picture!

 Double Rainbow in Columbus

We tried another new venue this month — Bar Louie in the Arena District. And, overall…too loud (a common theme). But, good food and good drink, topped off with a good crowd (we rattled off seven people who intended to come and then didn’t, either because of a last-minute conflict or because of the weather):

Web Analytics Wednesday Columbus - February 2009

Ironically, Monish Datta — the target of my running gag to make this site dominate organic searches for his name — is almost entirely obscured behind Brian in this picture.

The new faces this month included:

  • A co-founder of SearchSpring, which is an ASP site search tool geared towards small- to medium-sized e-commerce sites
  • The founder/owner of Jones Insight, a customer and marketing analytics consulting firm
  • A couple of folk from Bizresearch, which has developed a service for providing easy-to-understand SEO/SEM reporting
  • A jack-of-all-things-web-marketing marketer from Scotts

And, oh dear, I’m just not going to get into listing where everyone was from. As always, it was interesting to watch the interactions — the people who realized they actually had worked with each other, but only over the phone, the people who had 2 degrees of separation from each other, and, of course, the web analytics chatter.

Due to the noise level, we made only a half-hearted attempt to run our planned round table question: “What is the most interesting (or entertaining…or terrifying) example of MISinterpretation of web analytics data you have seen?” We got a few chucklers:

  • The company that had spent a lot of development time and money to roll out a new feature on their home page. The analytics showed that 0.03% of the visitors to the home page were using the feature. The analyst who provided that insight got a call from the person who had championed the development. She told him, “Thanks so much for that data. It helped me justify keeping that feature on the home page!” The analyst wondered…how?!
  • A related example from a different participant. He had a client who had a “My” feature on their home page — a “My Favorites”-type of link-saving feature on the site. They were just about to spend $15,000 (and it was a fairly small company) to have someone update the feature. The analyst spent 5 minutes demonstrating that there was virtually no actual use of the feature, and the updates they were planning weren’t really geared towards that, anyway. The project got canned. Hmmm… turns out that wasn’t a misuse of web analytics at all, was it? Well, we are a wild and crazy bunch, so we let the rebels say their piece.
  • The time that a product manager who did a lot of self-service on the web analytics front saw a sudden 10X increase in visits to one of his product pages several weeks after he made some minor content updates for SEO purposes. He showed the results to his manager, then he shared them with the VP of Marketing, then he shared them in a large staff meeting. He developed quite a spiel about his SEO results. Then he shared the data with the web analyst, who immediately applied one of his favorite filters: the “common sense” filter. It took some digging to find out that the web infrastructure team was testing a new web site monitoring service…and that page was one of the pages they used for the test. And the company was using a log-based analytics package. And the user agent for the monitoring service wasn’t being filtered. The step function was entirely bogus.
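That last story is a classic failure mode for log-based analytics: monitoring traffic inflating pageviews until someone filters it by user agent. As a rough sketch of the fix — the log lines and the “SiteMonitorBot” agent string here are entirely made up for illustration, and real packages have their own filter configuration — excluding a monitoring service comes down to checking the user-agent field of each log entry:

```python
import re

# Hypothetical monitoring user agents; real deployments would list the
# actual agent strings their monitoring service sends.
MONITOR_AGENTS = re.compile(r"SiteMonitorBot|UptimeCheck", re.IGNORECASE)

def count_real_visits(log_lines):
    """Count pageviews after excluding known monitoring user agents."""
    real = 0
    for line in log_lines:
        # Combined log format: the user agent is the last quoted field.
        agent = line.rsplit('"', 2)[-2] if line.count('"') >= 2 else ""
        if not MONITOR_AGENTS.search(agent):
            real += 1
    return real

logs = [
    '1.2.3.4 - - [11/Feb/2009:10:00:00] "GET /product HTTP/1.1" 200 512 "-" "Mozilla/4.0"',
    '5.6.7.8 - - [11/Feb/2009:10:00:05] "GET /product HTTP/1.1" 200 512 "-" "SiteMonitorBot/1.0"',
]
print(count_real_visits(logs))  # 1
```

One unfiltered agent string is all it takes to produce a bogus step function like the one in the story.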

The event started to wind down earlier than normal. I was drifting out myself after 2.5 hours. Dave and Andrew had started to head out earlier, but had gotten engrossed in a conversation and wound up sitting down to finish it. As I walked out, I got engrossed in their conversation. A half-hour later, as I started to leave (again), I realized that a number of Deloitte consultants who I work with had drifted in to watch the UNC-Duke basketball game. I wandered over for a quick, “Hey”…and didn’t leave until 11:15.

Which is why I’m going to end this post here and go to bed!

Adobe Analytics, Analytics Strategy, General

Free webinars on February 11th and 12th

If you are one of the many, many people out there who have been told in no uncertain terms that there is “no travel budget” to attend conferences in the near future, and you’re bummed out about missing some great learning opportunities, Analytics Demystified has a great solution! Rather than mope around the office, complaining about missing Ian Ayres at WebTrends Engage or Maroon 5 at the Omniture Summit, why not join Analytics Demystified, Forrester Research, Coremetrics, and Tealeaf for two free webinars next week!

And who doesn’t love “free?”

The first webinar will be held at 10 AM Pacific next Wednesday, February 11th and is sponsored by the nice folks at Coremetrics. The topic is campaign attribution, and while the “official title” of the presentation is “Effectively Managing Your Online Marketing Mix with Advanced Attribution” my personal subtitle for the event is “How LAST-Based Attribution is Wrecking Your Marketing Budget (and What To Do About It!)”

While I love the topic, I’m doubly excited about this webinar since I get to co-present with John Lovett from Forrester Research. The best thing about presenting with John is that he is never shy about his opinion, and we frequently get into “heated” debates, even in front of a live audience. We will also be joined by Coremetrics’ own John Squire, a gentleman also known for his willingness to have his opinion heard.

If you’d like to participate in this webcast with Analytics Demystified, Forrester Research, and Coremetrics please register for this totally free event through Coremetrics.

The second webinar will be held at 9 AM Pacific next Thursday, February 12th and is sponsored by the nice folks at Tealeaf. The topic here is the Web Site Optimization Ecosystem that I first described with Tealeaf and Foresee Results back in 2007. The ecosystem is a great topic when times are tough since most companies have at least two of the technologies we’ll discuss (web analytics, voice of customer, customer experience management, testing, personalization) but have done very little to actually integrate the systems.

I will be joined on the call by Geoff Galat, Tealeaf’s VP of Marketing and Product Strategy. If you’d like to listen to Geoff and me, please register for this totally free webcast through Tealeaf.

So there you have it. Two great topics, five smart presenters, one best-of-all-prices … FREE.

I hope you’ll be able to join us, and as always, please keep up with Analytics Demystified via our web analytics events page at http://www.analyticsdemystified.com.

Analytics Strategy

Win a WASP v1.09 for Analyst license at Web Analytics Wednesday

It hardly seems like it’s been a month since the last Web Analytics Wednesday in Columbus! Maybe that’s because it hasn’t been, but it’s that time again anyway!

Come share food and drink with a group of above-average people who are interested in and working with web analytics. A good time will be had by all…and this will be the first Columbus WAW with a door prize!
Jumping right to the details:
When: Wednesday, February 11th at 6:30 PM
Where: Bar Louie in the Arena District (504 N Park Street)
Format
We got rave reviews about last month’s format where we went around the table and asked a question about web analytics (read a summary at http://tinyurl.com/WAWJan09), so we’re going to give that approach another shot this month. This month’s question:
“What is the most interesting (or entertaining…or terrifying) example of MISinterpretation of web analytics data you have seen?”
Answering the question is entirely optional — if you are new to web analytics or simply have had nothing but the most sophisticated of business partners, then a “Pass” is an entirely acceptable answer.

What’s this WASP v1.09 door prize?
WASP is the Web Analytics Solutions Profiler — a Firefox extension that comes in handy in oodles of ways when it comes to sniffing out issues, debugging deployment problems, and generally just snooping around web pages to see what’s what on the tagging front. Read more at: http://webanalyticssolutionprofiler.com/. We’ll be conducting a drawing for an Analyst license (a $49 value) at the event.
As always, please forward this post along to your friends and colleagues who are interested in web analytics. The more the merrier!
Adobe Analytics, Analytics Strategy, Conferences/Community, General

Analytics Demystified speaking engagements

One of the things I love the most about my job is the work I do as a professional speaker and industry evangelist. While the economy is not showing any great signs of improving, the speaking circuit shows no signs of slowing down.  Here is a summary of some of the web analytics events Analytics Demystified will be at between now and the end of the summer, keeping in mind that more events will undoubtedly be added.  If you’d like to have Eric T. Peterson speak at your event please contact us directly.

Online Marketing Summit, San Diego, February 5th

At the Online Marketing Summit I will be giving a presentation titled “Attribution, Influence, and Engagement: The Digital Marketer’s New Nightmare” on Thursday, February 5th. We will also be having a special Web Analytics Wednesday on Thursday event at the Westin Hotel downtown. Learn more about OMS and sign up to join us at Web Analytics Wednesday.

The MindMeld at the Omniture Summit, Salt Lake City, February 17th

As part of the Omniture Summit, Matt Langie has organized an “X Change”-like, invitation-only event called the “MindMeld(TM)”. Co-hosted by Jim Sterne (WAA, eMetrics) and John Lovett (Forrester Research), the afternoon event will attempt to debate and develop solutions for Social Media measurement, Mobile and Video, and discuss how we can collectively elevate the role of analytics in the organization.

Given the somewhat tumultuous history I have with this organization’s management team, I am honored that Matt invited me, and I am looking forward to seeing the pageantry of the Omniture Summit first-hand. If you’d like an invitation to the MindMeld please let me know.

SearchFEST, Portland, March 10th

SearchFEST is a full-day search marketing conference hosted by SEMpdx and the fine folks at Anvil Media. This year’s keynote speaker is the great Danny Sullivan and I am honored to be on an analytics panel with Portent Interactive’s Ian Lurie and Widemile’s Bob Garcia. Hallie Jansen at Anvil was kind enough to give me a discount code so if you’d like to join us at SearchFEST and save a little money, reach out directly and I will hook you up!

Web Analytics Strategies, Milan Italy, March 17th

At the Web Analytics Strategies Conference and Expo I am a keynote speaker and listed as a “special GURU,” which is nice (although it makes me feel kind of old). I believe this is the first Web Analytics Expo in Milan, so I’m doubly excited about giving my presentation on “Competing on Web Analytics.” If you’re a reader of my blog in Italy and can make this event, please contact me so that we might meet for coffee! I believe we will also be having a special Web Analytics Wednesday event the evening of the 17th in Milan, so watch the Web Analytics Wednesday calendar.

Los Desayunos MVC con Analitica Web, Madrid Spain, March 18th

At this special web analytics event in Madrid, Spain I will be speaking on the “New Measures for Online Marketing” with the delightful Sergio Maldonado. I’m especially excited to be returning to Madrid since my very good friends Rene and Aurelie will be coming down from Brussels with their young baby Lucca, whom I have not yet had the pleasure of meeting. If you’re in Madrid, let’s meet for sangria, shall we?

SAS Global Forum, Washington, March 23rd

At the SAS Global Forum I am honored to be giving a keynote presentation on “Competing on Web Analytics.” I’m very excited about presenting at the SAS conference and hope to connect at the conference with web analytics professionals who are also using SAS.  If you’ll be at the Global Forum and would like to meet, please let me know.

WebTrends Engage, Las Vegas, April 7th

Being a Portlander and having gotten my start at WebTrends back in the late 90’s I was already excited to be going to the Engage conference. Then the company announced that Ian Ayres, author of Super Crunchers, will be giving a keynote presentation. I love what the company is doing with their conference web site and it seems like I’m constantly seeing news on Twitter about the event. If you’re heading to Vegas for Engage, make sure to look for me by the blackjack table.

DMA ACCM 2009, New Orleans, May 4th

I’ve never had the pleasure of presenting at the Annual Conference for Catalog and Multichannel Merchants but am very excited to be giving two presentations at ACCM 2009. The first will be more of an intermediate class in web analytics, and the second an advanced one. Both will be highly interactive, so I’m looking forward to meeting the DMA and ACCM audience.

eMetrics Marketing Optimization Summit, San Jose, May 6th

The big event, the really big show, the grand-daddy of them all … eMetrics. I am still proud to say I have never missed an eMetrics here in the USA, in part because I love the event and in part because of the profound respect I have for the conference’s organizer, Mr. Jim Sterne. I am more or less racing from New Orleans to San Jose but please look for me at eMetrics if you’d like to catch up! Plus, we will definitely be having a blow-out Web Analytics Wednesday at eMetrics this year …

Internet Retailer 2009 Conference and Exhibition, Boston, June 16th

I have always loved the Internet Retailer conference but have rarely been able to attend due to scheduling conflicts. Fortunately Kurt Peters got me this year before things started picking up, and so I’m happy to be presenting with Mike Fried from Backcountry.com and getting deep into the details that online retailers need in their web analytics efforts.

The X Change Conference, San Francisco, September 9, 10, and 11

I could not be more excited about this year’s X Change conference for a variety of reasons. We have great buzz after last year’s event, I have talked to dozens of past participants who have told me they’re saving their limited conference dollars for the X Change, and I love that Gary and Joel from Semphonic are always looking for ways to make a great event even better. If you’d like to join us at the web analytics industry’s premier event, or if you’d like to talk about possibly leading a huddle, please don’t hesitate to email me directly.

Well, that’s most of it, at least for now. There are a handful of webcasts I’m doing on behalf of companies like Coremetrics (with John Lovett from Forrester) and Tealeaf (on the Web Site Optimization Ecosystem) but I’ll try and cover those in another, shorter post.

Analytics Strategy

Fear vs. Convenience — The Customer Data Conundrum

I was in a presentation today where the presenter spoke enthusiastically and at length about how customer data is king — how companies that gather customer data (explicit data provided by the customer, behavioral data collected about the customer, and external data acquired from third-party sources) and put it together effectively can serve the customer (and prospective customers) in a more personalized fashion. Putting that data to use benefits the customer, which, in turn, benefits the company.

That’s entirely true. And, it’s a common theme in marketing these days: “one-to-one marketing” is the catchphrase that marketers use. It’s what being “customer centric” is all about. Waiting for Your Cat to Bark is based on this concept, as are countless other books, blog posts, conference sessions, classes, podcasts, and so on. My local paper ran a piece last weekend about some of the ins and outs of how Kroger uses data from its loyalty card program to provide customers with coupons that are most likely to be of interest to them (the article is actually about the company that Kroger contracts this work out to — dunnhumbyUSA). I’ve long suspected, and the article confirmed, that many grocery chains put these loyalty card programs into place and then find that it’s a lot harder to actually use the data they collect in a meaningful way than to set up the processes to collect the data…but that’s another post entirely.

What all of these specific examples miss — or at least drastically underestimate — is the fear component. And that was touched on briefly in the January 8, 2009 Slate Political Gabfest podcast. John Dickerson, the host of the gabfest, launched into a riff about electronic medical records. The initial point of the discussion was how the Obama administration is expected to be much more tech-savvy and tech-interested than the Bush administration, and how modernizing the medical record system was one possible initiative that might get funded by the economic stimulus package. This led to a discussion among all three of the gabfesters about how this seemed like a no-brainer and would lead to all sorts of improvements. But, Dickerson then explained how there is a lot of fear associated with the initiative — how members of Congress who are proponents of this sort of legislation are coached to talk about “medical modernization” rather than “electronic medical records.” We’ve all seen articles about the privacy concerns around electronic records of this sort — the prospective employer who finds out you have been treated by a psychiatrist, the health insurance company that finds out you have a genetic marker that increases your risk of contracting cancer (and thus increases your premium), etc.

And THAT is the crux of the biscuit.

For every benefit of having a centralized, comprehensive record of who you are and what your likes and dislikes are, there is a risk that that data will fall into the wrong hands and be put to nefarious uses:

  • By storing your credit card information on an online retail site, you’re risking that someone could hack into their system and steal it
  • By giving your social security number when registering for college, you’re risking that a hacker could get in and steal your identity
  • By letting a web site store your login information in a cookie, you’re risking that you lose your laptop and someone goes back to the site and completes transactions as you

The more comprehensive the information that gets stored about you in a single location, the greater the potential for convenience and benefits to you…but the greater the damage if that information gets misused (think Orwell’s 1984 for the bleakest, most extreme example).

Which means it all comes down to the balance of fear versus trust, and how long it will be before the societal balance tips from the former to the latter.

Think about the airplane. In the earliest days of manned flight, anyone who flew had to either overcome a high level of fear or be possessed with a high degree of recklessness. Flying was risky and dangerous. The technology improved rapidly and dramatically, and air travel became increasingly safer. “Fear of flying” has gone from being commonplace in my grandparents’ generation to being rare in mine.

Comprehensive, centralized (or at least cross-referenceable), electronic data about you and me can be viewed in a similar light. Which leads me to a plea for corporate responsibility and stewardship of the customer’s trust on two fronts:

  1. Protecting the privacy of your customer data is paramount. Every “lost data” story that hits the mainstream media is a step backwards for what really is the common good
  2. When using the customer data you do have, be triply sure to provide twice as much benefit to your customer as you garner for your company

It’s not going to be an immediate shift. It will take years and will rely on the overwhelming majority of companies and the government to be hyper-vigilant on both of these fronts before trust and value trump fear.

Analytics Strategy, Social Media

Monish Datta: "It was the best WAW yet!"*

Another month, another Web Analytics Wednesday (WAW) in Columbus. We had two sponsors — both the Web Analytics Wednesday Global Sponsors and Lightbulb Interactive, which was nifty. And, we headed back to O’Shaughnessy’s Public House because, by golly, we just knew if we gave them enough chances they could get up to batting .500 when it came to screwing up our reservations. They succeeded by having no record of our event in “the book.” We made do nonetheless.

The turnout was slightly below normal — we wound up with eleven people all told — but we tried a new format for the discussion that worked out well! Although the group was small, it was a good mix of people: web analysts from major financial institutions, web designers, SEM and web analytics types in online retail, a horse racing marketer, and a slightly-crazy-but-always-entertaining developer from an interactive agency:

Columbus Web Analytics Wednesday -- January 2009

The format for the formal part of the discussion was going around the table and asking everyone (who was willing) to describe the report or type of report that they felt was the most worthless or irrelevant, and then to also describe the type of report that they could not get or that was unduly difficult to get that they felt would be most useful. In other words — cheap and easy blog fodder for me! The results…

Most Worthless/Irrelevant Reports

  • “Hits” reports — we agreed that two cases where this wasn’t a worthless metric were: 1) error logging (e.g., missing images), and 2) server load monitoring; a late arrival proceeded to state how many hits she had to her company’s web site last year. Doh! She actually had a good recovery by proposing a third valid use: when your site is selling sponsorships and you need the biggest number you can find. Okay, so “valid” is a stretch here. Marketers. Yecchhhh!  🙂
  • Overlay reports — great eye candy for the vendor when they’re selling a web analytics product, but notoriously inaccurate; can’t handle links in Flash; require a lot of very careful link creation on the page that’s going to run the overlay to make sure all links are unique (which hurts SEO); and don’t work for pages that have their content updated with any regularity (try looking at an overlay from “two weeks ago”). Bryan provided us with an amusing medley of impersonations of business users asking questions about this sort of report.
  • Average time on page — this prompted some debate, but the general agreement, I think, was that the problem with this report is that many, many people use it without understanding its shortcomings (which Avinash covered in detail early last year in a blog post).
  • Path reports — again, we had general agreement that the killer is the persistent myth that a significant percentage of visitors to a site will follow the exact same path through the site (I call that the “people are cows” myth); we walked through the various alternatives that do have value — single-level paths to/from a page, bucketing of types of pages, looking at combinations of pages visited but not worrying so much about sequence, etc.
  • Geographic overlays — they have their uses in some very specific cases, but they really don’t warrant being on the main page of any tool’s dashboard
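The single-level path alternative mentioned for path reports is simple enough to illustrate. As a minimal sketch — the visit paths and page names here are invented for illustration — instead of chasing full path reports, you just tally the single next step visitors take from one page of interest:

```python
from collections import Counter

# Hypothetical visit paths (ordered pages within a visit), for illustration only.
visits = [
    ["/home", "/search", "/product", "/cart"],
    ["/home", "/product", "/home"],
    ["/search", "/product", "/cart"],
    ["/home", "/about"],
]

def next_pages(visits, target):
    """Tally where visitors go immediately after viewing the target page."""
    counts = Counter()
    for path in visits:
        for page, nxt in zip(path, path[1:]):
            if page == target:
                counts[nxt] += 1
    return counts

print(next_pages(visits, "/product").most_common())
# [('/cart', 2), ('/home', 1)]
```

A one-level view like this answers the useful question (“where do people go from here?”) without pretending that whole herds of visitors march down identical multi-page paths.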

My favorite from the “worst” discussion, though, was this: “Any report provided without context.” That one came from the aforementioned slightly-crazy-but-always-entertaining developer.
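On the average-time-on-page shortcoming above, the core issue is structural: a page’s time is inferred from the timestamp of the *next* pageview, so the last page of every visit gets no measurement at all. A hand-made sketch (the pages and timestamps are invented) makes the gap obvious:

```python
from datetime import datetime

# Hypothetical pageviews for a single visit: (page, timestamp).
visit = [
    ("/home",    datetime(2009, 2, 11, 18, 30, 0)),
    ("/pricing", datetime(2009, 2, 11, 18, 30, 45)),
    ("/contact", datetime(2009, 2, 11, 18, 33, 45)),  # exit page
]

def time_on_page(pageviews):
    """Infer per-page durations from consecutive pageview timestamps."""
    times = {}
    for (page, t), (_, t_next) in zip(pageviews, pageviews[1:]):
        times[page] = (t_next - t).total_seconds()
    times[pageviews[-1][0]] = None  # exit page: duration is unknowable
    return times

print(time_on_page(visit))
# {'/home': 45.0, '/pricing': 180.0, '/contact': None}
```

Averaging the known values while silently dropping (or zeroing) every exit page is exactly the kind of shortcoming people use the report without understanding.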

Most Wanted or Wanted-With-Less-Work

These reports got a bit more philosophical, but it was a good list nonetheless, with some common themes:

  • Several people brought up the need to marry web analytics data to other marketing channels as a biggie; they provided examples of where they had done, or were in the process of doing, this in some fashion, but the beef was with how painful it was. This also headed down a tangential discussion of “attribution” — siloed marketing channels lead to each channel vying for as much credit as possible when they “touched” someone who converted to a sale at some point. I think we all ordered another drink in the midst of this, and the conversation took a slightly maudlin turn. But the drinks arrived, and we recovered.
  • Forward attribution combined with segmentation — this was actually related to the prior one, and I scribbled it down as soon as Scott threw it out…but now realize it went totally over my head. Maybe he’ll elaborate in a comment on this post (after he nails down the venue for next month’s WAW, of course).
  • Form abandonment — this was one where it wasn’t that it’s not doable; it’s that it takes a lot of work to pull off effectively. Well worth the effort, but it would get more use if it were easier to set up.
  • Onsite search — this is akin to the form abandonment one, in that it’s a really useful set of data to look at, but, all too often, is tricky to get set up in a way that makes it practical to use
  • Social media integration with web analytics — this one is a result of the decentralized nature of social media, so much of what we’d want to integrate isn’t happening on sites that we “control.”

Other discussions/topics/mentions of note from my end of the table:

  • Dave went from being a social media skeptic less than a year ago to being an active user and evangelist. He’s even speaking on the subject in Cleveland next month (although he hasn’t yet plugged that on his blog)
  • In that same vein, Dave has also become a Gmail convert. Now…if I can just get him off of Blogspot and on to WordPress, my work here will be complete…
  • I found myself talking up Techrigy’s SM2 in two separate conversations — encouraging people to sign up for a freemium account to explore social media tracking, and plugging Connie Bensen as someone to ping on Twitter with questions.
  • I wound up talking about many of the people I met (in person or via social media) last fall when I moderated a panel on social media for nonprofits
  • We had a few chuckles about the <political> “Leaving Us in Great Pain” video </political> that I helped produce with some friends from Austin

I almost passed my notepad around asking people to put their Twitter usernames on it…but I decided against it. Feel free to add yours as a comment here, whether you were in attendance or not, if you’re interested in Columbus Web Analytics Wednesday. And/or, you can join our Facebook group. I was struck by the difference 10 months makes. We talked about Twitter during the first couple of WAWs last year, and Twitter users were in a distinct minority. Some people had not even heard of it. Everyone I talked to last night uses Twitter, and uses it enthusiastically. The times they are a’ changin’!

 

* While quotation marks would ordinarily indicate that this was a direct quote, those in the title of this post more indicate paraphrasing of Monish Datta’s take on the evening. Actually…”paraphrasing” is an overstatement. In other words, I totally made the quote up. But, Monish was smiling and laughing, so I don’t feel too bad about it. I really just needed to get his name in the title for SEO chuckles.

Adobe Analytics, Analytics Strategy, Conferences/Community

Great news from Web Analytics Wednesday!

Wow, once in a while in life you get involved in something that turns out differently than you had planned. I’ve written about this in the past, but for me Web Analytics Wednesday is one of those things.

Web Analytics Wednesday was started by my good friend (and soon-to-be-new-mom!) June Dershewitz and me back in 2005 with little expectation of success. Our goal? To give web analytics professionals around the world the means and motivation to gather locally and get to know each other.

Now, in 2009, by every measure we have succeeded and even exceeded expectations, having helped create regular Web Analytics Wednesday events in places like New York, San Francisco, Boston, London, Toronto, Paris, Madrid, Copenhagen, Stockholm, Sao Paulo, Sydney and smaller cities like Austin, Texas, Columbus, Ohio, and Nashville, Tennessee. These events were attended by over 4,500 web analytics professionals around the world in 2008 alone, and 2009 is off to a great start with over 600 attendees in January.

In 2008 we decided to do more to help Web Analytics Wednesday events get off the ground in more cities and to provide more financial support to as many groups as possible. We were able to do this with the generous support of Coremetrics and SiteSpect, the Web Analytics Wednesday Global Sponsors. Both of these companies had already been hosting events around the world, and senior management from both turned out to be enthusiastically supportive of Web Analytics Wednesday.

Today I am incredibly pleased to announce that the Web Analytics Wednesday Global Sponsors have joined Analytics Demystified in making a donation to Operation USA. This donation of roughly $1 per Web Analytics Wednesday participant was made in recognition of the relative success many in the web analytics industry have enjoyed in the context of the challenges faced by our fellow man, woman, and child across the globe.

Personally I have been blessed with a healthy family, a rich life, and moderate success in business, and thus have frequently been able to make charitable contributions. But as we all know, as the economy worsens, charities are often the first to feel the pinch despite the fact that an increasing number of people around the world need the support of groups like Operation USA. Because of this, I am incredibly grateful to Joe Davis, CEO of Coremetrics, and Eric Hansen, CEO of SiteSpect for their willingness to match my $1,500 contribution on behalf of both of their companies.

I hope everyone who hosts, sponsors, and participates in Web Analytics Wednesday will take the time to thank Coremetrics and SiteSpect for their generosity, either by commenting on this post or by emailing the companies directly.

On behalf of Analytics Demystified, June, our Global Sponsors, and all the Web Analytics Wednesday hosts we wish you all the best in 2009 and hope you’re able to make it out to an event near you soon!

Analytics Strategy

Columbus Web Analytics Wednesday — Jan 2009 Edition

The first Columbus Web Analytics Wednesday of the year is coming! As always, the meetup is open to anyone and everyone who is interested in web analytics — we just ask that you register so we have a sense of what our headcount will be.

We are heading back downtown this month, and back to our increasingly regular haunt: O’Shaughnessy’s. We will likely be upstairs, barring any SNAFUs. We are going to try something a little new on the format this month. In our normal presentation slot, we are going to go around the table and ask everyone to answer this question:

What web analytics report do you think has become the most irrelevant or overrated, and what report would you most want that you do not have?

We’re not going to require that every participant answer the question, but hopefully enough people will that we can have a good discussion.

Wednesday, January 21, 2009 at 6:30 PM
As always, thanks to our sponsors: Web Analytics Wednesday Global Sponsors and Lightbulb Interactive.
We’re close to having February’s WAW lined up as well, so stay tuned!
Analytics Strategy, Conferences/Community

Thoughts on the proposed IAB Guidelines

UPDATE ON JANUARY 19, 2009: Peter Black from BPA Worldwide, who was also on the IAB working group with Josh Chasin, wrote in and disagrees with Chasin’s characterization of who the “Unique User” language is targeting. I have an email into the IAB and the MRC’s George Ivie to clarify the situation. Watch this blog!

UPDATE ON JANUARY 18, 2009: Josh Chasin from comScore, who was a member of the IAB working group that defined the guidelines described in this post, wrote in to point out that I misinterpreted the IAB’s intent. While their web site clearly says …

“The IAB believes that all companies involved in audience measurement should be audited for their processes.  These audits are intended to establish the source of any measurement discrepancies and to find potential solutions.

All measurement companies that report audience metrics have a material impact on interactive marketing and decision-making. Therefore, transparency into these methodologies is critical to maintaining advertisers’ confidence in interactive, particularly now, as marketers allocate more budget to the platform.”

… according to Chasin the IAB is excluding web analytics vendors from “all companies involved in audience measurement” and the type of companies that have a material impact on interactive marketing and decision-making. Since this doesn’t sound right to me at all, I will warn the reader that some of the questions I raise in the following post may, in fact, be totally irrelevant (at least in the context of the IAB Proposed Measurement Guidelines).

If nothing else, with two days left in the open comment period, the IAB may want to use my confusion as an example and clarify the target for the recommendations made in the document.

Reader beware!

As long as we’re talking about web analytics standards I figured I would take the opportunity to offer up a few thoughts on the Interactive Advertising Bureau’s Audience Reach Measurement Guidelines that are open for public comment until January 20th. If you haven’t had a chance to read these proposed guidelines you should, especially if you have an interest in how we collectively communicate about data.

At 34 pages the document is certainly a slog to read–and I say this knowing full well that I have a tendency to write 50 page white papers! Since you’re all bright folks I’m just going to address some of the proposed language that stood out to me.  And, as always, if you have any thoughts or positions on the proposed guidelines I’d love to hear from you!

Starting in Section 1.2 the IAB clarifies the relationship between “Unique Cookies”, “Unique Browsers”, “Unique Devices” and “Unique Users / Visitors”.  The discussion about “Unique Devices” is interesting because this is a clear indication of the impact that mobile devices like the iPhone are having on audience measurement.  Things start to get really interesting, however, in Section 1.2.4 where the IAB says (emphasis mine):

“However, in order to report a Unique User, the measurement organization must utilize in its identification and attribution processes underlying data that is, at least in a reasonable proportion, attributed directly to a person. For instance, data collected from registrants is one possible source that can be utilized in creating a Unique Users measure by a census-based measurement organization, if registrants represent a reasonable proportion of the total user-base and when appropriate scientific projection methods are used for non-registrrants (sic).  In no instance may a census measurement organization report Unique Users purely through algorithms or modeling that is not at least partially traceable to information obtained directly from people, as opposed to browsers, computers, or any other non-human element.”

Did you get that? Keep in mind that while at JupiterResearch I was among the first to publicize the decline in accuracy of visitor counting due to cookie deletion. In fact the report was subtitled “Addressing the Decline in Accuracy of Cookie-Based Measurement.” At the time people called me crazy and Seth Godin even accused me of living in an echo chamber (I have since forgiven Seth.)

Now, three years later, the IAB is expressly telling measurement vendors to stop reporting a metric called “Unique Visitors” or “Unique Users” unless they have a research-based strategy for determining the correct proportion of cookies to “real people” and have applied that calculation in a transparent way.

Whoa.

Think about this for a minute. Every one of the fine census measurement packages (née web analytics) out there is reporting a Unique Visitor number, but I’ll go out on a limb here and propose that none of them are even vaguely adhering to the IAB proposed definition of a “Unique Visitor.”  I’ll go a step further and postulate that, at least in the base offerings, these vendors don’t currently have the technical capability required to report an estimated/algorithmically derived “Unique Visitor” count based on scientific projection methods.

If I’m wrong about this I suspect I’ll hear about it, but I don’t think I’m wrong when it comes to the base offerings like SiteCatalyst, WebTrends Web Analytics, Coremetrics 2009, etc. And yes, I’m aware that end-users can use higher-end products like Discover on Premise and the data warehousing tools to apply a correction factor to UV counts, but that is not what the IAB is saying. This guideline is saying that correcting for cookie-, browser-, and device-related over-counting of unique visitors is the responsibility of the measurement vendor.

Again, whoa.

And as if that’s not a radical enough move, the document goes on to state in Section 2.2 that the vendors need to actually break out these correction factors across three components: first-cookie acceptance, deletion, and browser denial (again, emphasis mine):

“Cookie deletion rates, calibration methods and sources or estimation methods used to account for first-use, deletion, and non-accepting cookie groups should be disclosed by the audience measurement organization. The audience measurement organization should disclose census-based unique cookie counts and the estimated unique activity from first-use, deletion and non-accepting cookie groups separately and in aggregate. If the measurement organization relies on a unified model that makes reporting among these separate groups impossible, it may report these counts in aggregate only, but should be prepared to demonstrate in an audit the ability of its unified model to address each type of cookie completely.”

The IAB goes on in Section 2.4 to push web analytics into what some will find an uncomfortable position, the use of algorithms and data models to better report on unique visitors:

“As noted above, Publishers and Ad-servers will generally need to rely on algorithms (data models) to estimate the number of users attributable to the counts of Unique Cookies they develop. The underlying basis for this algorithm should be a study of actual users (i.e, people).  Ideally, such a study would be based on direct contact and/or observation of people using the browser at the time of accessing web-site content or ads with the unique cookie, as well as observation of the number of browsers in use by these users.  Additionally, inferences will need to be made about advertising activity of users with non-cookied browsers, so these types of users should also be contacted and observed.  Also, the activity of users who access content from multiple locations (home, work, school) on different browsers should be factored into these algorithms.”

Finally, the IAB is telling the vendors they need to report the results of their research to their customers, essentially exposing flaws in their technology for all to see:

“The resulting study should be representative of, and projectable to, the users of the web-site or property, and periodically re-performed to reflect gradual changes in audience.  Known weaknesses in the projection processes should be disclosed to users of Audience Reach Measurements.”

If you’re keeping track, the IAB is telling the vendors A) to completely change their definition of “Unique Visitors”, B) to actively research sources of inaccuracy on behalf of their customers, and C) to proactively report known weaknesses in their systems to their customers. Anyone want to place any bets on when the vendor community will adopt these recommendations? I’m going to be a little snarky here and put my money on “never in a million years.”

Seriously, you have to love the IAB for putting this out there. Unlike the Web Analytics Association’s Standards, which I believe are an excellent start but a little soft in areas, the IAB is basically telling the measurement vendor community that they are doing the entire world a disservice by reporting unique visitor counts that are complete bollocks, and that they need to stop doing that post-haste! Okay, maybe I’m over-reading the document, but the scope of changes required for any vendor to become IAB-compliant is dramatic, both technically and psychologically.

I’m not sure if Brandt Dainow had seen the IAB proposal when he besmirched the fine work of the Web Analytics Association’s Standards Committee, but if you compare the two proposal documents (the WAA’s proposal can be found here in PDF form) you will detect a noticeable difference. Personally, I’m glad that my good friend Judah Phillips bridged the gap between the IAB and WAA, and I find myself wondering, at least a little bit, whether the IAB+WAA relationship should be even deeper.

This all brings me to an excellent point that Bryan Robertson made on my last post on standards regarding how standards are defined and moved into common use. Bryan’s thesis is based on the W3C’s move from HTML 1.0 to XHTML, and his point is that this transition to the XHTML standard came about because of A) a powerful standards body, B) a vocal community, and C) passionate thought leaders. Regarding a powerful standards body, Bryan specifically makes a point that other folks have made, usually behind closed doors:

“Is the WAA powerful enough at this point in time, or do we need to continue to build momentum before the standards can be more bold? For example, is the WAA hand wringing too much over the polite “we’ll share with you if you share with us” arrangement with the IAB over standards definitions? Is the WAA in a tough position in trying to bring practitioners and vendors together at the same table?”

Bonus points to Bryan for being willing to be direct about the conflict of interest arising from serving two masters, vendor and practitioner. Again, I have nothing but profound respect for Angie and all of the other members of the WAA Standards Committee, but since I do know that vendors participated in the definition process, I wonder a little bit how much impact they really had.

Anyway, I’m doing all the talking here and it’s a beautiful day so I will ask what you all think — either about the IAB proposal, Brandt Dainow’s assertion, Bryan’s thesis about the strength of the WAA, or anything else that strikes your fancy. Do you think the IAB standard for “Unique Users” has a snowball’s chance of being widely implemented? Do you think Brandt Dainow makes a good point (even if he does it in a lousy way)? Do you think the WAA may be better off working more closely with the IAB on Standards, given the IAB’s relative might?

My site host assures me that my comments table will not crash again so I look forward to hearing from you all.

Analytics Strategy, Conferences/Community

Sad to say, I partially agree with Brandt Dainow

Readers who are enthusiastic members of the web analytics community are by now familiar with Brandt Dainow and his sometimes antagonistic missives published at iMediaConnection. While I try pretty hard to follow the old “if you can’t say something nice” rule, I occasionally fail in my efforts. Perhaps the best evidence of my failing was my calling Brandt Dainow insane when he suggested that Google Analytics version 2.0 was “simply a quantum leap above any other analytics product on the planet.”

While I firmly believe that Google Analytics is a great, valuable, and appropriate application for a wide range of needs, I think that Dainow’s “quantum leap” claim and statements like “What Google has done is simply take every feature in every product on the market and put them all into one system, and then make it available for free” are so obviously hyperbolic that they beg criticism (which Mr. Dainow got in spades from many within the analytics community.)

Dainow has since turned on Google Analytics, more recently pointing out what he describes as “disturbing inaccuracies behind Google Analytics” and again getting our attention with irresponsible statements like “Google Analytics is different from other products in that it has been intentionally designed by Google to be inaccurate over and above the normal inaccuracies that are inevitable.”

Oddly enough, his rant about Google Analytics included some statements that rubbed members of the Web Analytics Association the wrong way.  When folks like Jodi McDermott commented on the article and questioned some of Dainow’s assertions, Brandt did what any normal person would do …

… he wrote a nasty follow-up piece critical of the Web Analytics Association and the WAA Standards Committee!

I will let you read his piece yourself, but the two-sentence summary of Dainow’s opinion is that “the work of the WAA standards committee is a disaster for the web analytics community. It will take years to undo the damage and create proper precise standards that can be implemented in software. The WAA ‘standard’ is not a standard, it’s just second-rate muttering.”

Clearly Dainow is not worried about making friends in the web analytics industry.

I personally am a big fan of the Web Analytics Association. I am pretty loyal to some of the current Board of Directors, I’ve done a bunch in the past to support the WAA and am about to announce more of the same, and I’ve even gone out of my way to help promote the work of the Web Analytics Association Standards Committee. So it was with great trepidation that I wrote this article’s title … but I find myself agreeing with one small part of Dainow’s otherwise unnecessary rant.

Towards the end of his article, right before he declares that some pretty nice people’s work has been little more than time wasted, he says this:

“The WAA should be setting the agenda, not following the crowd. The task of the WAA standards committee should be to determine how web analytics metrics should be calculated in order to achieve the highest degree of precision possible. The WAA should be laying out the roadmap for the way things should be. It then falls to the vendors to bring their software into line.”

I more or less made this same comment, although I like to believe I used a great deal more tact, when I commented on the original Web Analytics Association Standards published under the direction of former Director Avinash Kaushik back in August 2007.

At the time I preferred to focus on the reality of the situation–the fact that the WAA had proposed a set of standard definitions that, for good or ill, were better than anything else out there.  Instead of being openly critical of the definitions as written, I preferred to ask the question, “Now that we have these definitions, what are we going to do with them?”

While my call for a web analytics standards compliance matrix has since been answered by all of the major vendors except Omniture, I personally don’t believe that the Standards process is serving the needs of our community as well as it could. We all continue to be vexed by a lack of standard definitions, a situation that will likely get worse as the web analytics economy declines.

Not having participated in the process of drafting the WAA Standards, I can only express gratitude towards those members of the community who have volunteered their valuable time for this work. In my humble opinion, people like Jason Burby and Angie Brown are to be congratulated for their efforts, not denigrated and accused of having set our industry back into the dark ages.

But, in the spirit of having an open mind and building consensus, I would be interested in hearing my readers’ collective thoughts on Dainow’s point that the WAA should be setting standards without regard to their practicality today. Put another way, should the Association have written definitions that would be robust and useful in an analytics context and then presented that guidance to the entire community–vendor, consultant, and practitioner alike–saying “this is the result we should all be working towards”?

For example, should the WAA have been more explicit in its definition of a “visit” and proclaimed that a visit terminates after 30 minutes of inactivity, instead of saying “if an individual has not taken another action (typically additional page views) on the site within a specified time period, the visit will terminate by timing out”?  Being explicit about the timeout duration would make a clear statement about our collective expectation for the definition of a visit, and any technology or analysis that chooses a timeout other than 30 minutes would also need to justify its decision to eschew the WAA Standard for another value.
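Whatever value a standard eventually lands on, the timeout mechanics themselves are trivial to express. Here is a minimal sketch (my own illustration, not WAA language) of counting visits from a stream of hit timestamps with an explicit 30-minute inactivity rule:

```python
from datetime import datetime, timedelta

# The explicit timeout value under discussion; any other value would,
# under the stricter standard, need to be justified.
VISIT_TIMEOUT = timedelta(minutes=30)

def count_visits(hit_times):
    """Count visits for one visitor: a new visit starts whenever the
    gap since the previous hit exceeds the inactivity timeout."""
    if not hit_times:
        return 0
    ordered = sorted(hit_times)
    visits = 1
    for prev, curr in zip(ordered, ordered[1:]):
        if curr - prev > VISIT_TIMEOUT:
            visits += 1
    return visits

# Hits at 9:00, 9:10, then a 40-minute gap, then 10:05 → two visits
hits = [datetime(2009, 1, 21, 9, 0), datetime(2009, 1, 21, 9, 10),
        datetime(2009, 1, 21, 9, 50), datetime(2009, 1, 21, 10, 5)]
print(count_visits(hits))  # → 2
```

The point of a fixed value is that two tools fed the same hit stream would report the same visit count; with a “specified time period” left open, they need not.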

I know that the WAA is doing the best they can, and I am enthusiastic about the work Angie, Judith and their fellow volunteers have all been doing.  But I do think Dainow’s assertion that standards should be set based on overall value to the community in the long run, not necessarily the near-term practicality, is worth exploring.  Taking this approach would definitely penalize some vendors and reduce their self-generated “compliance score” but it does kind of make sense to be working collectively towards a more precise set of definitions we can all work from.

Doesn’t it?

These are the kinds of conversations that aren’t just magically resolved, so I’m sure we’ll have to add this to the list of issues worthy of discussion the next time we all meet. I’m sure it will come up at some of the upcoming vendor events, in San Jose at Emetrics, and likely at our own web analytics conference, the X Change (where last year Forrester analyst John Lovett led a conversation on the topic).

As always I welcome your thoughts, feedback, open disagreement, pointing out flaws in my logic, etc.  I consider myself fortunate to have such thoughtful and experienced analytics practitioners among my most loyal readers and sincerely hope Dainow’s otherwise disturbing rant will lead to something of value for our community.

Analytics Strategy

Getting Started with Master Data Management (MDM)

I stumbled across a March 2008 paper written by Mike Ferguson of Intelligent Business Strategies a couple of weeks ago that looked really interesting, and I finally got around to reading through it this weekend. First looks were not deceiving! The paper is Getting Started with Master Data Management. The link I went through required a two-page registration form, which really is unacceptable, but this post isn’t about web registration form best practices!

It is worth it to wade through the process to get at the 22-page paper if you:

  • Are struggling with data being managed in disparate repositories at your company and are thinking that this whole “master data” thing might be worth looking into
  • Are new to a master data management initiative and are trying to wrap your head around the myriad issues
  • Are deeply embedded in a master data initiative and would like some assurances that you are on the right track while also uncovering a new nugget or two that you can apply to your work

Some of the highlights that jumped out for me are below.

What Is Master Data Management?

The paper makes a very clear and clean distinction between two broad categories of structured business data:

  • Master data — the data associated with core business entities such as customer, employee, supplier, product, partner, asset, etc. regardless of where that data resides
  • Transaction data — the recording of business transactions such as orders in manufacturing, mortgage, loan, and credit card payments in banking, and premium payments and claims in insurance

I like the distinction, as it adds some nuance to the tendency to make a knee-jerk definition that master data is “all data that might need to be shared across systems or the enterprise.”
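To make the distinction concrete, here is a hypothetical sketch (entity and field names invented, not from the paper): master records describe a core entity once, while transaction records reference it by key and record events, rather than duplicating its attributes.

```python
from dataclasses import dataclass
from datetime import date

# Master data: describes a core business entity (here, a customer),
# regardless of which operational system the attributes came from.
@dataclass
class CustomerMaster:
    customer_id: str
    first_name: str
    last_name: str

# Transaction data: records a business event, pointing at master
# entities by key instead of carrying their attributes along.
@dataclass
class Order:
    order_id: str
    customer_id: str   # foreign key into the customer master
    amount: float
    placed_on: date

cust = CustomerMaster("C-42", "Timothy", "Wilson")
order = Order("O-1001", cust.customer_id, 99.95, date(2009, 3, 1))
```

The payoff of the separation is that a correction to the master record (a name change, a new address) happens in one place and every transaction referencing that key picks it up.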

Early on, the paper also makes a very, very, very, VERY key point about MDM:

…MDM is not just about data. It is about putting processes and policies in place with respect to governing master data, as well as putting services in place that provide common ways to access, maintain and manage it. MDM is, therefore, about data and the processes associated with it.

The fundamental principle described in that paragraph is that it is a fatal mistake to assume that MDM is mostly about technology. IBM deserves a negative shout-out for their irresponsibility in perpetuating this misperception by actually naming their MDM-supporting technology “MDM.” Technology vendors in all areas tend to overplay the importance of “the tool” and underestimate the importance of “the process” and “the people.” But, still… shame on you, IBM!

Why Is Master Data Management Needed?

Again, highlighting a quote from the paper is useful:

The problem this causes is that master data is fractured and partially duplicated in many different places. In addition, there is often no complete master data system of record in existence within the enterprise, except perhaps in a data warehouse where its use is restricted to analysis and reporting.

The paper goes on to include a “spaghetti architecture” picture that illustrates where many companies find themselves when they hit the point that they need to develop a master data management strategy. And, the paper includes several lists of the “…and here’s why it ain’t as easy to solve as you might think” variety. Issues such as:

  • Managing/reconciling data conflicts between different systems — is the first name for this customer “Tim” or “Timothy?” Our web site registration system has captured his name as “Tim,” while our billing system has captured it as “Timothy.” Which one should be recorded in the “master record” for the customer’s first name?
  • How do you know if the data is complete and accurate? — in the registration form I had to fill out on bitpipe.com to download the white paper, I intentionally stated that I was at a company that generates less than $1 million in revenue every year, because I didn’t want their lead management team to pounce on the fact that I’m at a very large company and should be contacted immediately by a salesperson. At the same time, I was honest about where I work and my job title. With a little bit of research, bitpipe.com can figure out what’s accurate…but manual research is expensive to operationalize, and automated research can be expensive to develop and still be less than completely accurate.
  • If changes happen to master data in one system, how long is it before these changes  get to all other parts of the enterprise that need to know about them? — this is another good one. Customers, for instance, typically have multiple entry points to interact with you: through your billing department, through your technical support department, through  your web site, etc. Each department may rely on a different operational system. Customers expect that, once they tell “the company” that they moved, that their name changed, that they had a life event that changed something relevant, that they bought a product or service, etc., that “the company” (and all of its systems) are aware. A key goal of master data management is to make that true…but it can be inordinately complex to actually pull that off across the entire enterprise.
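As an aside, the “Tim” vs. “Timothy” conflict above is typically resolved with survivorship rules. Here is a minimal sketch of one common policy, with invented source names, priorities, and records (the paper does not prescribe any particular rule): trust sources in a fixed order, and break ties with the most recently updated value.

```python
# Lower number = more trusted source (priorities invented for illustration)
SOURCE_PRIORITY = {"billing": 1, "crm": 2, "web_registration": 3}

def survive(records, attribute):
    """Pick the surviving value for one attribute across systems:
    highest-priority source wins; ties go to the most recent update."""
    candidates = [r for r in records if r.get(attribute)]
    best = min(candidates,
               key=lambda r: (SOURCE_PRIORITY[r["source"]], -r["updated"]))
    return best[attribute]

records = [
    {"source": "web_registration", "first_name": "Tim", "updated": 20090110},
    {"source": "billing", "first_name": "Timothy", "updated": 20081201},
]
print(survive(records, "first_name"))  # → "Timothy" (billing outranks registration)
```

Even this toy version shows why MDM is a process problem, not just a tool problem: someone has to decide, and govern, which source outranks which.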

This is just a subset of the questions and issues the paper lists. There are many more!

Getting Started with an MDM Project

The paper devotes several pages to how to build a business case for an MDM initiative. On the one hand, it’s a Business Case 101 summary — know your business, know your business’s competitive environment, build the case around relevant business value. But, it does call out some examples of where and how MDM can roll up to core business objectives.

The unwritten piece of this section is that MDM is typically expensive, requires a longer view than “this quarter,” and requires a certain degree of underlying business transformation. It’s not a case of “buy some software and install it and then the problem is solved.”

Scoping the MDM Initiative

There are lots of good tips about defining the different requirements and the potential scope — short-term and long-term — for the initiative. The paper includes a good explanation for why the goal-state architecture for the MDM system should be developed early on, even if it’s not fully realized in the first iteration. For more on the high-level alternatives on that front, I recommend pages 3-8 of Colin White’s paper that SAP commissioned back in 2007: Using Master Data in Business Intelligence (although his paper is geared a bit more towards the BI benefits of MDM, the different options for handling master data and the system-of-record vs. system-of-reference distinction are laid out really well).

The Importance of Data Governance in MDM

The paper explains why MDM should be part of a wider enterprise data governance program, going as far as saying that, “Adoption of data governance standards, policies and technologies are an essential part of any MDM project.” That data governance program needs to include data standards (shared business vocabulary, data integrity rules, taxonomies), common policies for enterprise data management and data integration development, an enterprise data governance technology platform, an enterprise data model, and processes for monitoring and managing enterprise data quality.

Conclusion

I’ve touched on some of the highlights from the paper. But, for a 22-page paper, it is packed with more information than can be easily summarized in a single blog post. It’s worth a download!

Analytics Strategy

2009 Predictions for CRM (not mine — Forrester's)

There’s an interesting piece over at SearchCRM with three experts’ predictions for 2009. The first expert — Bill Band of Forrester Research — had an intriguing list of six trends he expects to see in 2009 in CRM suites. The three that really jumped out at me (the bold text is Band’s label, and everything else is my description and observations):

  • Trend 1: The emergence of the Social Consumer — okay, sure, you’d have to have had your head buried in a backhoe-assisted hole in the sand to have missed that social media is a big deal. My sister went from telling me a year ago that instant messaging was too intrusive…to being an active user and advocate on Twitter — using it effectively and with success in her role as a manager at a nonprofit. That’s anecdote 1 of 20 that I could rattle off without thinking, and I’m sure you could, too. Web 2.0 is getting steadily adopted by, well, everyone eventually, just like Web 1.0 did. Band’s point, though, is that CRM suites “will be looking to enrich the customer experience through community-based interactions, and architecting solutions that are flexible and foster strong intra-organization and customer collaboration.” That’s an intriguing thought. CRM tie-ins to LinkedIn, Facebook, Twitter, and the like? It’s a recipe for disaster if the capabilities are developed, sold, and put to use with a Web 1.0 mentality. Pulling this sort of thing off effectively is going to require some nuance, both on the part of the CRM suite providers and their customers. Should be interesting.
  • Trend 3: The requirement to fully cost-justify CRM investments — Band boils this down to four questions that every business case will have to answer, and he indicates that his contacts are telling him that every CRM investment will need a rock-solid financial case built for it given the economic environment. I wonder. Certainly, I wouldn’t expect anyone to be out spending money frivolously, but, then, I wouldn’t really expect that to be happening in a robust economic environment, either. What I’ve seen happen time and again is that a company gets along with spreadsheets, a seasoned sales force, and relationships that have been built up over years to drive the business. Then, one day, they realize that they need to actually do some marketing and actually put some processes around how they do demand generation and demand management. Someone does a back-of-the-napkin calculation as to just how much money they’re leaving on the table, and it becomes pretty clear that they need to bring in some talent, some process, and some systems to realize that value. For the companies that have already made it over that hurdle, they tend to have the information they need to see where they are falling short and to articulate — even in the absence of guaranteed immediate financial returns — the value of further investment. For companies that have a strong cash position, downturns are a good opportunity to grow their market share by selectively investing…and CRM enhancements can be a relatively low-cost investment that can pay real dividends down the road.
  • Trend 5: The need to get more value from customer information — well, this is the one that actually hits the closest to my current bailiwick. Band nails it when he says “the right approach to customer data management is elusive.” His point, though, is that companies are going to have to get smarter about how they manage and use customer information. No doubt. I don’t see this nut truly getting cracked in 2009 — more and more companies are starting to realize they need to invest in master data management (MDM) and customer data integration (CDI). That’s easier said than done. For one thing…see Trend 3 above — no company tackles this issue until they’ve painted themselves into a corner that will require a hefty price tag to get out of. There are a handful of different approaches for getting out of that corner. All of them have their pros and cons. None of them are conceptually difficult to understand. ALL of them are wayyyyyy more involved to implement successfully than you’d expect. I’m struggling to envision any sort of step function change in the world of CRM on that front in 2009.

It will be interesting to look back in a year and see which of these areas really did see seismic shifts. I’m staying tuned!

Adobe Analytics, Analytics Strategy, General, Reporting

Measuring success in Twitter: Influence vs. Participation

I was reading a post recently outlining a somewhat incomplete attempt to measure something called “Influence” as a measure of success in Twitter. Being a champion for complicated and easily misunderstood metrics based on cognitive and behavioral psychology I was immediately drawn to the article but walked away unsatisfied … that is, until I found Twinfluence.

Twinfluence is this nifty little Twitter tool that lets you explore a Twitterer’s “influence” based on their reach (size of their network and second-level network), velocity, social capital, and centralization (see the explanation page at Twinfluence for the details behind each.) For example, here are some of the people I follow in Twitter analyzed by Twinfluence rank:

  • Rank #19: Jeremiah Owyang (jowyang) from Forrester Research
  • Rank #660: Bryan Eisenberg (thegrok) from Future Now, Inc.
  • Rank #2,893: Marshall Sponder (webmetricsguru) from Monster.com
  • Rank #3,577: Avinash Kaushik (avinashkaushik) from Google Analytics
  • Rank #6,124: Anil Batra (anilbatra) from ZeroDash1
  • Rank #7,195: Aaron Gray (agray) from WebTrends
  • Rank #7,591: Jim Sterne (jimsterne) from Emetrics
  • Rank #11,209: Omniture (omniture) from, yep, Omniture
  • Rank #11,786: Dennis Mortensen (dennismortensen) from Yahoo! Web Analytics
  • Rank #11,940: Nick Arnett (nick_arnett) a social media blogger

Whee, what fun! I could Twinfluence my friends and folks I follow all night and day if only client work, my family, and copious powdery snow didn’t get in the way. In case you were interested I have a rank of #5,754 based on my nearly 700 followers who are followed by over 375,000 other people and a very resilient social network.
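In case you were wondering where a number like that 375,000 comes from: Twinfluence’s “reach” is essentially your followers plus your followers’ followers, counted as unique accounts. Here’s a minimal sketch of that idea — the follower graph below is entirely hypothetical, and this is just my reading of Twinfluence’s explanation page, not its actual algorithm:

```python
# Sketch of a Twinfluence-style second-level "reach" computation.
# followers_of maps a user to the list of accounts that follow them.
# (Hypothetical data -- not pulled from Twitter or Twinfluence.)

def reach(user, followers_of):
    """Count unique accounts in the first- and second-level follower network."""
    first = set(followers_of.get(user, []))
    second = set()
    for follower in first:
        second.update(followers_of.get(follower, []))
    # Distinct accounts across both levels, excluding the user themselves
    return len((first | second) - {user})

followers_of = {
    "eric": ["ann", "bob"],
    "ann": ["bob", "cid"],
    "bob": ["dee"],
}
print(reach("eric", followers_of))  # 4 (ann, bob, cid, dee)
```

Overlap matters, by the way — that’s why the sets: if two of your followers share an audience, those shared accounts only count once.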

However, after a little while I started thinking that measuring someone’s “influence” in Twitter was the wrong way to think about success in social media in general. Especially since people who have been dubbed “influential” and successful in the blogosphere have a tendency to think about their popularity in somewhat ridiculous ways … say perhaps stating publicly that they’re going to charge to re-tweet content because they want to buy expensive stuff?

Anyway, when I went down this path I immediately thought “Hey, the two things I spend the most time on in Twitter are trying to find great people to follow and trying to share interesting ideas.” To find great people I use Tweetdeck and to a lesser extent MrTweet to find folks who are having a conversation I’m interested in. To share interesting ideas I limit the majority of my updates to the sharing of links on web analytics related topics.

These combined efforts have helped me find and share ideas with hundreds of folks in Twitter interested in web analytics. So I started thinking “Perhaps the true measure of success in Twitter is being as good a listener as you are a source of information!” Being a balanced participant in your efforts, not just a “social media rock star” who spends all their time talking at people, not to them …

Of course this line of thinking led me to Dave Donaldson’s Twitter Follower-Friend Ratio (or the Twitter Ratio for short.) The Twitter Ratio is dead simple: the number of followers you have divided by the number of people you follow — the perfect Twitter key performance indicator! Dave even provides benchmarks against which we can be measured:

  • A ratio of less than 1.0 indicates that you are seeking knowledge (and Twitter Friends), but not getting much Twitter Love in return.
  • A ratio of around 1.0 means you are respected among your peers. Either that or you follow your Mom and she follows you.
  • A ratio of 2.0 or above shows that you are a popular person and people want to hear what you have to say. You might be a thought leader in your community.
  • A ratio of 10 or higher indicates that you’re either a Rock Star in your field or you are an elitist and you cannot be bothered by Twitter’s mindless chatter. You like to hear yourself talk. Luckily others like to hear you talk, too. You may be an ass.

(The emphasis on that last sentence is mine … I laughed out loud when I read that!)
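Dead simple, indeed — the whole metric fits in a few lines of code. Here’s a quick sketch; the follower/following counts are just examples, and the band labels are my paraphrase of Dave’s benchmarks, not his exact wording:

```python
# Dave Donaldson's Twitter Follower-Friend Ratio: followers / following.
# Band labels below paraphrase his benchmarks for illustration.

def twitter_ratio(followers, following):
    """Followers divided by following; higher = more broadcasting than listening."""
    if following == 0:
        return float("inf")  # follows nobody at all: pure broadcast
    return followers / following

def ratio_band(ratio):
    """Map a ratio onto the rough benchmark bands described above."""
    if ratio < 1.0:
        return "seeking knowledge"
    if ratio < 2.0:
        return "respected among peers"
    if ratio < 10.0:
        return "popular / possible thought leader"
    return "rock star (or maybe an ass)"

# e.g. 697 followers and (hypothetically) 223 following:
print(round(twitter_ratio(697, 223), 2))  # 3.13
print(ratio_band(3.13))                   # popular / possible thought leader
```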

I think Dave’s Twitter Ratio of 10 or higher is the same thing as Perry Belcher’s “Twitter Snob” (funny YouTube video if you have 5 minutes.)  Perry comments that if your Twitter ratio is super high you may not be participating in “social media” but rather “solo media” — perfect!  Perry’s point: why are you even in social media if you don’t have time to listen to the conversation?

If I apply the Twitter Ratio to all of the fine folks I analyzed above, still ordered by their Twinfluence rank, here is what we get:

  • Jeremiah Owyang earns a score of 2.95 indicating that Jeremiah “may be a popular person” and “people want to hear what [Jeremiah] has to say” plus he “may be a thought leader in [his] community.” Sounds pretty much perfect to me, but I like Jeremiah.
  • Bryan Eisenberg earns a score of 1.04 indicating that Bryan is “respected among [his] peers” (or that he follows his Mom and she follows him, but with 1,951 followers we can assume the former is the best explanation)
  • Marshall Sponder earns a score of 2.30 which is pretty similar to Jeremiah’s score against his 851 followers.
  • Avinash Kaushik earns a score of 105.5 indicating that Avinash is “either a Rock Star in [his] field or an elitist [who] cannot be bothered by Twitter’s mindless chatter” who “likes to hear [himself] talk” but “luckily others like to hear [him] talk too.”
  • Anil Batra earns a score of 1.27 putting Anil in the same category as Bryan above although with only 266 followers his reach is somewhat lower than Bryan’s.
  • Aaron Gray earns a score of 1.49 pushing Aaron more towards Jeremiah Owyang than Bryan Eisenberg, at least on Dave’s scale.
  • Jim Sterne earns a score of 17.48 which is in the same “Rock Star” range as Avinash (although an order of magnitude less rock-starry than Google’s own analytics evangelist)
  • Omniture earns a score of 1.26 indicating respect among the company’s 247 followers
  • Dennis Mortensen earns a score of 13.85 showing that Dennis, like Jim and Avinash, is a true web analytics rock star!
  • Nick Arnett earns a score of 0.58 which indicates that Nick is trying but alas, “not getting much Twitter love in return.”

My own score is 3.13 against 697 followers which I’m pretty happy about (especially the part about not “being an ass!”) Incidentally Perry Belcher’s Twitter Ratio is 0.98 … about as balanced as it gets!  If you have 30 seconds you can go to Dave’s site and calculate your own Twitter Ratio.

What do you think?

Is “influence” the best measure of success in social media? Or should we pay closer attention to something like the Twitter Ratio as a measure of our likelihood to actively participate in the larger conversation? It’s not hard to imagine the Twitter Ratio combined with a measure of tenure or update velocity or even something like influence to come up with a system to help us better discover which members of Twitter are providing real and substantial value to the community.
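To make that combination concrete, here’s one purely speculative sketch — the weights, caps, and normalizations below are invented for illustration and are not anyone’s proposed standard:

```python
# Speculative composite "participation" score blending the Twitter Ratio
# with tenure and update velocity. All weights and caps are invented.

def participation_score(followers, following, days_on_twitter, updates):
    ratio = followers / max(following, 1)
    # Balance peaks at a ratio of 1.0 (equal parts listening and talking)
    balance = 1.0 / (1.0 + abs(ratio - 1.0))
    tenure = min(days_on_twitter / 365.0, 1.0)                    # cap at one year
    velocity = min(updates / max(days_on_twitter, 1) / 5.0, 1.0)  # cap at 5/day
    return round(0.5 * balance + 0.25 * tenure + 0.25 * velocity, 3)

# A year-old account tweeting once a day with a perfectly balanced ratio:
print(participation_score(700, 700, 365, 365))  # 0.8
```

The interesting design question is the balance term: unlike the raw ratio, it penalizes extreme broadcasters and pure lurkers alike, which is closer to the “good listener and good source” idea than influence alone.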

I welcome your thoughts, comments, suggestions, and perhaps more selfishly, recommendations for great and interesting people to follow and tools to help with the discovery process.

Adobe Analytics, Analytics Strategy, General

New data on the state of web analytics in 2009

Those of you who were unable to attend the webcast I did with Coremetrics and the Direct Marketing Association in December titled “Create Your Web Analytics 2009 Action Plan” are in luck — the nice folks at the DMA recorded this web analytics event and it is freely available for your listening pleasure.

Click Here to Watch the Webcast!

Also, we conducted a poll on the call asking about planned 2009 investment in technology and human resources for web analytics, in part because of the amazing response to my post about Web Analytics being recession proof?  For the 251 responses we got, here is what we heard:

Regarding investment in technology and tools:

  • 36 percent said they planned to spend more in 2009 than they did in 2008
  • 50 percent said they would be spending about the same in 2009 as they did in 2008
  • Only 14 percent said they would be spending less in 2009 than they did in 2008

Regarding staffing and resources for web analytics projects:

  • 26 percent said they would be increasing staffing levels in 2009 compared to 2008
  • 70 percent said staffing levels would stay about the same in 2009 as they were in 2008
  • Only 4 percent said they would be decreasing staffing levels in 2009 compared to 2008 (phew!)

Finally, regarding web analytics in general in their organization:

  • 47 percent said that senior management considers web analytics a priority investment
  • 29 percent said that senior management considers web analytics a discretionary investment
  • 24 percent said that senior management had poor visibility into their web analytics efforts

Now I’m a little suspicious of these numbers, especially the 47 percent saying that senior management considers web analytics a priority investment which sounds high to me by about half.  Keep in mind that there is clearly bias in this sample since respondents were DMA members with an expressed interest in web analytics …

If you believe these numbers it certainly sounds like 2009 will be somewhat stagnant in the industry compared to the last few years of rocket-like growth.  And while it’s encouraging to read that 47 percent of senior managers “get” the value of web analytics, the reality is that 53 percent don’t, which is still bad news no matter how you spin it.

What do you think?

Do you believe that senior managers in nearly half the companies out there consider web analytics a priority investment? Does your management team consider web analytics a priority investment? Or are you still trying to explain to the bosses what web analytics is capable of doing for your organization? I’d love to hear your thoughts, and feel free to use a pseudonym and “anon@anon.com” as an email address to post without your boss knowing how you really feel 😉

Thanks again to everyone who attended the webcast and participated in this informal poll.

Analytics Strategy, Social Media

"You only get one chance to do it right. Try not to screw it up."

Thus were the words that subtitled Bryan Cristina’s presentation (PPT) on campaign tracking at the December Web Analytics Wednesday in Columbus last Wednesday, sponsored by CoreMetrics, Analytics Demystified, and SiteSpect at BJ’s Restaurant in Powell.

 Columbus Web Analytics Wednesday - December 2008

When it comes to screwing things up, we certainly had our opportunities:

  • Originally, we had planned on meeting at O’Shaughnessy’s Pub down in the Arena District. After initially being told we were good to meet there, we got bumped by a private party (apparently, a private party that has been occurring for a number of years at O’Shaughnessy’s and that takes over the entire place; it’s understandable, but still a bit irksome).
  • When we started looking for nearby alternatives, we realized the Rockettes were performing at Nationwide Arena that night, which was likely to clog alternate venues. So, BJ’s it was.
  • I forgot my camera. I was 3/4 of the way home to pick it up en route from work to BJ’s, when Twitter came to the rescue — @heatherdee409 shot me a tweet that she had a camera in her purse and we could use that. Thanks, Heather!
  • BJ’s had told us that we would have “the back room.” Unfortunately, that just described a large area, rather than any sort of private/semi-private space.

Thanks, I assume, to some more proactive promotion of the event (Dave Culbertson of Lightbulb Interactive accounted for at least half of the first-timers), we had record attendance. Combine the turnout with the fact that we were in a shared space, and we had less-than-ideal conditions for Bryan’s presentation. He brought a handout (PPT) and managed to semi-holler for a few minutes to quickly walk people through it. That was unfortunate, but I do think we are at least learning that we may have to settle for lesser quality food and a limited beer selection (read: The Spaghetti Warehouse) when we have a presentation.

Nevertheless, the presentation had some great information. And, some great lines that are typical Bryan-funny-caustic:

  • “‘We want to see what people from this campaign do on the site’ is not a goal, it’s an excuse for those who don’t know what they want to measure or for campaigns that have no purpose”
  • (When setting the campaign up) “Never trust anyone, especially yourself”
  • “Know the last possible second you can get things taken care of. People will forget you were excluded from everything until the last minute and will just blame you for being stupid.”
  • “‘It’s not in test, but it will show up in production’ means they have no idea what you’re talking about, don’t care, and none of your tracking tags will ever make it onto your campaigns.”

That’s just a sampling. Good stuff!

We had some first-time attendees I was pretty excited about:

  • Mark Whitman and Jen Wells of TeamBuilder Search — a relatively new recruiting company focused on interactive talent; I had a good talk with Mark and got him to tentatively agree to do a presentation on building a career in the interactive space at a future WAW.
  • Noe Garcia of WebTrends — all the way from Portland! Bryan and I both go wayyyyyy back with Noe, and, interestingly, had had dinner with him earlier in the year at the same restaurant when he was in town; he’d been hoping that his travel schedule would line up with a Columbus WAW, and it finally did! Noe’s a great guy, and he’s tentatively agreed to have WebTrends sponsor a WAW in the spring and provide a speaker. Unfortunately, Noe was also partway through Super Crunchers, which I thought was a horrible book. We had a good-natured debate at the end of the evening about it and parted on speaking terms.
  • There were a few people I didn’t actually get to speak to, but who were new faces. And, embarrassingly, I had quite a conversation with a gentleman who has a local SEO/SEM firm…and I didn’t capture/record his name or his company! But, he did point me to Laura Thieme of OSU and bizresearch.com, who seems like another good contact for future WAWs.

List of tweeters in attendance who I could identify:

And, finally, I learned that there is apparently a Monish Datta fan page. Unfortunately, I couldn’t find it. So, I’m stuck just linking to Monish’s LinkedIn profile. But, hey, in the process of looking, I realized that last month’s post got me some serious Google Love on “Monish Datta” search results.

Adobe Analytics, Analytics Strategy, General, Industry Analysis

Web Analytics: One Month at a Time in 2009

As we look towards 2009 there are clearly some great challenges and great opportunities facing everyone who has more than a passing interest in web analytics. But regardless of the economic situation, we all need to stay focused on making the most of the people, process, and technology we have in place today, continuing to work towards positive business outcomes.

Towards this end, I would like to invite those of you wondering exactly where to begin and looking for some sense of structure for your digital measurement efforts in 2009 to a free webcast sponsored by Coremetrics and the DMA on Wednesday, December 3rd at 10:00 AM Pacific.

In this free event I will be focusing on helping companies of all sizes at all stages in web analytics maturation take a tactical look at their long-term strategic measurement efforts.  The net/net, I hope, is a “stratactical” (thanks Jennifer!) presentation that has something for everybody, regardless of the tools you’re using or how you’re currently using them.

Again, the webcast is free and open to everyone.  You can register with Coremetrics and the DMA at the Coremetrics web site:

Register Now to Attend this Free Webcast!

Again, the webcast is from 10:00 AM to 11:00 AM Pacific on Wednesday, December 3rd. I hope to see you there!

On a totally unrelated note, I wanted to say “Thanks” to Neil Mason of the Web Analytics Association (and now WebTraffiq) for bringing up my open letter to President-Elect Barack Obama in this week’s ClickZ column.  Neil makes a comparison between Europeans’ views on the use of cookies and the current situation within the Federal Government here in the U.S.

Particularly interesting was this passage:

“The European Parliament passed a directive in 2002 on privacy and electronic communications. Leading up to this directive, there had been a concern in the industry that cookies would effectively be made illegal as a breach of personal privacy. In the end, the European Parliament concluded it wasn’t cookies or Web bugs that infringed privacy but the inappropriate use of these devices.”

Not the cookies themselves but rather the inappropriate use of these devices.  Absolutely.  I would encourage any of you interested in this issue to give Neil’s column a read.

Analytics Strategy

And the Rebellion Is Over (for Now)

UPDATE: The venue for the event has changed. We got bumped from O’Shaugnessy’s by a private party and then realized that the Arena District would likely be overrun with people attending the Rockettes show that night. So, we’ve moved north to BJ’s Restaurant at 1414 Polaris Pkwy.

For almost a year*, Columbus Web Analytics Wednesday has been held on a Tuesday. Initially because that’s what seemed to work best, but, over time, simply because we enjoyed being rebels — bucking tradition, mixing things up. Nay! Walking on the wild side! It’s Ohio — that’s about the level of wild and craziness we reach.

But, alas! Of late, we’ve had some Tuesday conflicts cropping up. After conducting exhaustive market research**, we’ve decided to step in line and shift to Wednesday night. We’re not saying that’s a permanent shift, but that’s the case for December, at least.

Wednesday, December 10, 2008
6:30 PM until…it fades 

(Originally scheduled for O’Shaughnessy’s Public House, 401 N Front St — see the update above.)

BJ’s Restaurant
1414 Polaris Pkwy
Columbus, Ohio

For details and to RSVP so we’ve got a decent headcount, check out the Columbus WAW event page. The blogosphere is abuzz about it!

 

* 7 months, actually, but we’re trending towards a year, and web analytics is all about trends, right?

** A SurveyMonkey survey that had 13 respondents to date.

Analytics Strategy

Monish Datta Attends Another Web Analytics (Tuesday)

Ahhh…the timeliness of blogging. Not! The latest Web Analytics Wednesday in Columbus was last Tuesday, and I’m just now getting around to posting a brief recap. I’ve been battling a cold all week, which has combined with the mental rigors of starting a new job, and that’s just left me flat-out pooped in the evenings. That’s also my excuse for the lack of blogging overall, which I do hope to correct in the coming weeks, once I get my feet on the ground and figure out both topics and boundaries. So, be patient — no need to unsubscribe just yet!

Back to this month’s Columbus Web Analytics Wednesday. We headed back to O’Shaughnessy’s, and the event was again sponsored by CoreMetrics, Analytics Demystified, and SiteSpect. We had a good turnout:

Web Analytics Tuesday in Columbus

We again teamed up with the Adobe User Group folk, which added some additional perspectives and some new faces. I, for one, had a great conversation about SaaS-based CRM software with a fellow whose company is in that market…but I managed to totally space on the company’s name, even though I asked him for it twice! (See note above about my cold and my new job.)

We now officially have three regulars with recurring conflicts on Tuesday nights. And, this was our second consecutive meeting with no formal speaker. These two factors spawned a discussion that led to a new survey that I did manage to get sent out on Thursday night. Even if you have never attended a Columbus WAW, but you’re interested, I’d appreciate two minutes of your time to get some basic information on your ‘druthers:

Click Here to take survey

Preliminary results are leading me to think we’ll try December’s event on a Wednesday, but I’ll get the data posted here once I’ve got a few more respondents.

That’s about all for this month. Monish Datta again pointed out that I’ve made little headway when it comes to moving up the SEO ladder on searches for his name, so I figured I’d try getting him in the URL and the <title> tag to see if that makes a difference. I’ll stop short of flat-out link baiting. I promise! This is just a recurring diversion that cropped up in our first or second meeting and is now likely destined to continue indefinitely!

Oh, yeah. And Dave Culbertson is now on Twitter…although he’s still using that horrid default avatar. And, he’s still wildly skeptical about the whole application!

Analytics Strategy, General

Web Analytics is Recession Proof?

For the past few weeks I have been thinking about the economy and trying to reconcile two seemingly contradictory observations:

  1. The economy sucks, and it doesn’t seem likely to improve anytime very soon
  2. The web analytics sector is reportedly recession-proof and, in fact, predicted to grow in 2009

While I hardly need to provide any proof of the first observation, evidence for the latter has been emerging from a variety of voices in our community for the past few months.  Case in point:

  • In January, the Web Analytics Association reported that nearly 70% of companies planned to continue to invest in new tools and staff throughout 2008
  • Back in July, Corry Prohens from IQ Workforce wrote in a guest post in this blog that “74% of practitioners expect that spending on web analytics will increase at their company during the recession”
  • In September, my good friend Jim Sterne surveyed the web analytics crowd and found that 87% of practitioners plan to maintain or increase their budgets for web analytics tools and services in  the face of the current economy, and that nearly twice as many respondents indicated they planned to increase budgets for web analytics (21%) as planned to decrease same (13%)
  • Last week, the fine folks at E-consultancy splashed all across the news with the headline “Web Analytics: A Silver Lining in the Recession Cloud” by reporting that web analytics was poised to grow in 2008 by 12%

In the E-consultancy report, the organization’s head of research Linus Gregoriadis was quoted as saying: “The profile of Web analytics continues to grow as it becomes more integral to business decision-making and organisational strategy. The credit crunch is putting the spotlight on analytics as organisations work harder to understand where they are getting the best return on investment and where real value is being added.”

Recently, Josh James, the CEO of Omniture said something similar during the Q&A portion of the company’s Q3 earnings call in response to a question about whether businesses saw web analytics as discretionary:

“Every dollar that a marketer has, I think everyone has in every organization is under pressure right now and certainly marketing spend is where CFOs like to look and see if they can cut. But, what we’ve seen with our customers is their online channels are the ones that are performing the best. Their online channels are the ones that are giving the most direct impact within that quarter that spend is also taking place.

In terms of the way that they think about Omniture, even if they cut let’s say 10% of their marketing spend, they’re going to use us to a) identify the 10% they’re going to cut and b) use us to optimize the other 90% to try to get back up to the same results as they had with the 100% the year before. These kinds of times actually drive usage of our product.

When things are good it’s a lot easier when you want more sales just to throw more money at the top of funnel and to generate more leads and go through the process. When things get bad people try to focus on of everyone that’s already coming to our store, what can we do to keep them more attracted? What can we do to get them to look at other things? What can we do to get them to read additional articles? All of those behaviors drive uses of our product.”

All of this sounds absolutely spectacular. Except for one thing …

I’m not sure I believe any of it.

I think that we are collectively starting to suffer from the echo chamber effect, essentially reiterating that web analytics will be fine in this lousy economy because, unsurprisingly, we are all making money off of web analytics and we would very much like to continue doing so. The WAA, IQ Workforce, my friend Jim, E-consultancy, Omniture, me … our collective businesses are all more or less explicitly tied to continued investment in the sector. So why wouldn’t we look for data that suggests that the picture continues to be rosy and the future bright?

Why indeed.

In terms of the data presented above, as a former researcher I would offer this assessment: many of these surveys appear to suffer from sample bias. Asking the Yahoo! group, members of the Web Analytics Association, or the audience attending Emetrics about their interest, investment, or organizational focus on web analytics is kind of like asking your average Democrat in Portland, Oregon how they feel about Barack Obama.  The problem is not the audience, the problem is the interpretation: I think it is misleading to extrapolate the responses from a non-random sample of businesses and business people to the larger audience.

This kind of sampling leads to claims like “52% of online marketing managers are currently engaged in A/B or multivariate testing …” Fifty-two percent implies that tens of thousands of online marketing managers are testing. Which sounds great, except that when Offermatica and Optimost were acquired by Omniture and Interwoven they had a few hundred customers between them, and Stephane Hamel’s WASP tool reports that 0.4% (zero point four percent) of the Top 500 online retailers are using easily detected A/B or multivariate testing tools.

Don’t get me wrong, I too have been guilty of sampling biased audiences, although in the past year I have stopped conducting primary research due to both the sampling issue and the plethora of free research that suddenly appeared in the marketplace.

Ultimately I’m suspicious of this optimistic data that we’re seeing, especially in the context of statements like this one made by Mr. James on the earnings call referenced above about the effect the economy is having on Omniture’s ability to forecast Q4 and 2009:

“Towards the end of September however, it became apparent that the challenging macroeconomic and financial environment may have some impact on our business going forward although it remains difficult to quantify the uncertainties specifically.”

Mr. James and his CFO specifically don’t want to talk about 2009 on the call. Which makes sense to me, given some other data points:

  • The economy sucks, and without belaboring the obvious, it appears that this suckiness will stay with us for quite some time;
  • While I don’t question Mr. James’ assertion that his best customers make excellent use of web analytics, in my personal experience this is not universally true;
  • Some of the largest consumers of web analytics products are starting to struggle;
  • Despite the conventional wisdom that dictates that brilliant analysts are safe when times are tough, I am getting more and more calls from brilliant analysts who are being laid off or being offered severance packages to walk away.

It is this last point, coupled with something I learned at Emetrics, that has me the most concerned.  In D.C. at Emetrics I heard Liz Miller from the CMO Council say that most CMOs are a few years away from fully understanding the value of web analytics. If Liz is right, and her credentials are impeccable when it comes to the CMO’s office, then given the anecdotal evidence that continues to come in I wonder if web analytics is slightly more discretionary than we’d like to believe.

Don’t get me wrong, I sincerely hope to be wrong in this assessment. As an author, public speaker, evangelist, consultant, and conference co-producer focusing on web analytics I honestly hope to be able to write a follow-up post in six months saying, “Wow, I was really super-wrong about where the web analytics industry was going …”

But what can you do if I’m more right than not? What if you work in an affected sector, or work for a company known for their web analytics acumen that is suddenly faced with bankruptcy or worse? What if the folks you work for who profess a great love for data-driven decision making are really HiPPOs at heart and, when the real bloodletting begins, are just as likely to look for savings in areas that can be easily cut (human resources, for example) as opposed to those that would require breaking contracts?

What indeed.

If you’re in any way concerned about the current economy and your personal employment situation, here are five tips that I would offer to help you best prepare for the worst.

Tip #1: Focus on Increasing Profits, Not Minimizing Spend

My friend W. David Rhee just published a great response about the relationship between web analytics, sales, and marketing in a down economy.  To paraphrase Dave, if the bosses begin to panic, you don’t want to be in a situation where you appear to be an expendable marketing cost that can be cut.  It is far better to focus your analytical efforts on how the organization can increase profits, even if you have to fight to spend more time conducting analysis and less time generating reports.

Essentially you want to take Mr. James’ statement above to heart and work your butt off to optimize the lower levels of your conversion funnel, working with what you already have, not what you might be able to attract.  The good news is that the technology supports this analysis; the bad news is that, more often than not, the deeper you get in the funnel the more difficult optimization becomes for a variety of reasons, including the business, IT, and “the way we’ve always done it!”

Be a profit center, be big picture, become truly invaluable.

Tip #2: Don’t Be a Report Monkey

The unfortunate reality about web analytics work is that far too many smart people spend far too much time generating far too many reports that far too few people actually read and even fewer actually derive real value from.  Sound familiar?  When I started the conversation about process in web analytics in 2006 at Emetrics, over 80% of the audience said they spent too much time on “reports” and not nearly enough on “analysis” … sadly I’m not confident that things have changed much in the past two years, especially on a percent-of-practitioners basis.

While there are any number of great posts about why reporting is overrated and how the real value in web analytics comes from careful, business-focused analysis of the data, there are still too few companies that have put the hub-and-spoke model into practice and are able to effectively leverage web analytics resources.

My advice is to step up and find the real value in your data, even if you have to conduct the analysis on your own in the wee hours.  It’s not as if you can just stop generating reports (tempting as that may sound), but if you’re a good analyst, taking the time to figure out where the real opportunities to increase revenue are is the work you want to be doing anyway.  Taking the initiative to make data-powered recommendations and presenting them is a good way to demonstrate your skills and commitment to the business (but don’t stop doing the job you’re being paid to do!)

Analysts conduct analysis and make recommendations. Be an analyst.

Tip #3: Start Watching the Job Boards

Even if you feel pretty good about the situation you’re in, you have to admit that the most accurate term to describe the current economy is “dynamic.”  In situations like this the worst thing you can do is be caught off guard, and so I would offer that spending a little time surfing the Analytics Demystified Web Analytics Job Board (also see the WAA’s version) would be time well spent.

According to the nice folks at SimplyHired the number of job postings looking for “web analytics” experience of some kind continues to increase:

Assuming these postings are all accurate and still open, this is fantastic news since it contradicts my thesis that our sector is at risk.  The only thing that concerns me is that when I add a major market to the search, the trend graph starts to look substantially different. Here is the trend of jobs in SimplyHired for “web analytics” jobs in San Francisco:

Not quite as encouraging, huh? Now I might be using SimplyHired incorrectly but the general trend observed in the Bay Area makes me wonder if job growth in the sector is as strong as the first graph shows. Plus, anecdotal evidence suggests that an increasing number of companies are imposing hiring restrictions and outright freezes, meaning that many of these postings are effectively “inactive.”

By no means am I suggesting that any gainfully employed web analytics practitioner should jump ship in this economy unless you are absolutely confident about the situation you’ll be moving into.  But keeping your eyes, and your options, open makes increasingly good sense in my opinion.

Be smart about your current employment situation.

Tip #4: Think About Your Skill Set

I recently interviewed Corry Prohens from IQ Workforce and asked Corry about requirements for web analysts and what he looks for when trying to place folks. I recommend you read the entire interview, but here is what Corry had to say about what IQ Workforce looks for:

“In general we look for someone that has tool expertise, communication / interpersonal skills (these jobs are increasingly front-office), analysis & presentation skills and some complimentary kicker (testing, SAS, SQL, search marketing, development skills, search marketing skills, etc.) based on what our clients need at the moment.”

I went on to ask Corry about what two criteria he believed would help practitioners land a great job in this economy:

“If I were a web analyst I would learn how to use SAS to manipulate data & models.  I would also try to pick up experience in  testing/optimization.  Having one (or both) of these would open a lot more doors than a straight WA skill set.”

Real analytics experience and a focus on testing and optimization.  Great advice, even if the former is somewhat non-obvious (perhaps that’s why it’s such great advice!).  And while you may not be able to implement testing technology on the job, Google Analytics and Google Website Optimizer are free and easily implemented on a personal blog.

Push yourself and expand your skill set. Move ahead of the market.

Tip #5: Network, Network, Network

In my experience one of the most valuable things you can be doing during uncertain times is expanding your network of contacts.  Fortunately the web analytics industry is pretty well set in terms of opportunities to meet other practitioners, both locally and globally.  Here are a few networking opportunities that I highly recommend:

  • Attend or host a Web Analytics Wednesday event.  Web Analytics Wednesday is the world’s only local social networking event for web analytics professionals and it has helped dozens of folks find their next new job.  Take advantage of the many events happening before the end of the year or, if you don’t see an event in your town, contact me directly about getting a chapter started where you live!
  • Join the Analytics Demystified group at LinkedIn.  A few years ago I started a LinkedIn group for web analytics professionals.  Now the group has nearly 1,300 members worldwide and is open to anyone interested in getting connected via LinkedIn.
  • Join the Web Analytics Association.  The WAA is the only association we have and is actively working to create great value for their members around the world.  Joining the WAA gets you discounts to great conferences, access to their job board, and plugs you in to an increasingly vibrant community.

At the end of the day, despite the great demand for our skills and long-term opportunity afforded to all of us, a web analytics job is like any other job.  Your professional growth and development is as much a function of the people you know and your relationship to the community as your native analysis skills.

Get to know your peers. Have fun while you do it!

What Do You Think?

This has become a ridiculously long post considering that I could have just said, “I think there is more risk than we realize.  Be prepared.”  Most of us working in the web analytics arena have become quite used to the good times rolling and have every faith that they will continue to roll.  Only now, budgets are shrinking, jobs are being lost, and the general fear is that the President-Elect will create a business climate that is somewhat less friendly than most would like.

Still, my firm belief is that if you’re great at what you do and if you’re working for folks who clearly “get” the web analytics value proposition you have nothing to worry about.  All I would caution is that you not assume the latter is true, again especially in the context of the conversations I have constantly about senior-management not really understanding the art and science of digital measurement and analysis.

So now it’s your turn.  Do you think I’m way off base?  Do you believe the data I was somewhat critical of earlier in this post?  Does your boss “get” web analytics?  Are you optimistic like Mr. James that your company will be able to leverage your investment in Omniture (or whatever) to optimize your marketing spend?  Or are you worried about your job, or worse, have you been laid off?

I normally don’t allow anonymous comments but given the somewhat sensitive nature of this post and the feedback I’d love to hear, as long as the comments are appropriate I’ll approve them.

Analytics Strategy, Conferences/Community

WAA Standards Update: Thursday, November 6th

ANOTHER UPDATE: WebTrends just posted their own WAA Standards compatibility matrix at the new WebTrends blog. I have to say that when I first suggested that we needed a vendor-by-vendor assessment of Standards compliance back in August of last year I didn’t think it would take this long for the information to materialize. That said, despite the wait I’m glad to see most of the vendors stepping up!  I suppose I expect to have to update this post again when Omniture and Nedstat publish their information.

UPDATE: Coremetrics just posted a press release that describes their application’s WAA Standards compatibility at their web site. With Coremetrics participating in the creation of a global standards compliance matrix the only remaining vendors that need to provide this type of documentation are Omniture, WebTrends, and Nedstat.

One of the key challenges we face as an industry is without a doubt the lack of standard terms and definitions applied across different vendor solutions. John Lovett from Forrester led a huddle for us at the X Change conference this past August and the subject just keeps coming up.

The fine folks at the Web Analytics Association are actively trying to do something about this situation and have recently published a draft update of the WAA Standard Definitions.  When the definitions were first published back in August of last year I offered hearty congratulations but also commented that standards without any kind of transparency or enforcement are unlikely to be applied in any meaningful way.

Since that post, three companies have publicly offered documentation regarding their level of standards compliance — IndexTools (now Yahoo Web Analytics), Google Analytics (via Justin Cutroni at EpikOne), and most recently Unica (thanks Akin!) — but I am hopeful that this is about to change.  I am hopeful because of a conversation I had at Emetrics with Angie Brown, the head of the Standards Committee.

Angie and the Standards Committee will be presenting the updated definitions on Thursday, November 6th at noon Eastern / 9:00 AM Pacific.  I think this call is open to everyone (Angie, correct me if I’m wrong) but to help build awareness for this call and the Standards Committee in general I was lucky enough to interview Angie.  My questions (in bold) and Angie’s answers follow.

Tell me a little bit about the recent update to the standards document?

We took the feedback we received from last year — emails, blog postings, rumors, etc. — and revisited each of our original 26 definitions. This review caused us to drop one term (the distinction between “single page visits” and “single page view visits”, too confusing and not useful), we added four terms (frequency, recency, conversion rate, and impressions), and we spent the majority of the last year comparing our definitions with the tools we use in our everyday jobs.

As a result of the review, we’ve included a list of “ask your vendor” questions where we’ve been able to identify different methodologies used by different tools.

Who are some of the key people working on this project with you?

There are so many wonderful contributors on the Standards Committee. Most meetings have 8-12 participants, and they change over time based on employment status, workload, travel schedules, etc. But we do have a core group of people who joined way back in 2006 — Judith Pascual (my co-chair extraordinaire), Aseem Patel, Anna Long, and Bob Russotti — who are still active in the committee today. We have many newer members that contributed tons to this document as well, but having some continuity as we move from version to version to version has been critical.

We are also fortunate to have had outstanding guidance, enthusiasm, and representation on the WAA Board while we worked on this document, first from Avinash Kaushik and now from our current Director, Robbin Steif.

Which of the web analytics vendors are participating in the project?

Both Coremetrics and Unica are regular participants: Coremetrics since early 2006, and Unica since early 2007. A representative from Omniture came to our last meeting, and we’re looking forward to more participation from him in the future. I had a chance to talk with some of our other WA vendors at eMetrics last week, and hope we can get them interested.

Can you describe some of the ways the definitions/this work have been adopted so far?

I have heard from several vendor contacts that people are asking about compliance with the WAA definitions when sending out web analytics RFPs. We’ve also heard from people who have distributed the definitions throughout their company so everyone has a common terminology when discussing analytics.

Even more gratifying is the response from the international web analytics community, especially in Europe. Their WA market is much more fragmented than it is here in the US, with a larger variety of tools, so having common terminology is even more meaningful. One of our vendor members has a customer who translated the document into French on their own so they could share it with their coworkers! A high priority for next year is to work with the International Committee to get our document translated into other languages.

A while back I congratulated you on the work but complained that there needed to be teeth behind the document.  Since that time only IndexTools (whose CTO is now a WAA Board member) and Google Analytics (via Justin Cutroni from EpikOne) have produced any kind of documentation describing their application’s compliance with WAA definitions.  Why do you think the other vendors are slow to respond?

I applaud both Dennis and Justin for taking that initiative without any prodding, and I think it says volumes about both men and the companies they represent. To be fair, I should mention that both Unica and Coremetrics also presented comparisons to the committee as part of their regular participation, although neither has published their results publicly. (NOTE: Unica has since published their comparison via Akin Arikan’s Multichannel Marketing Metrics blog.)

I’m merely speculating on why there hasn’t been more public response, but I suppose it’s because we haven’t asked them point blank to do so. However, this will change. Part of the feedback we’re soliciting from the industry this year will include a letter to each of our WAA member vendors asking them for comparisons, and we intend to publish the responses through the committee. The only reason we haven’t done this yet is because we’re putting the finishing touches on some scenarios (more detail later) that we also want our vendors to address.

Do you support the idea I proposed of a “Percent Standards Compliant” rating for every web analytics application/vendor?  If so, why?  If not, why not?

For now, the committee’s focus is on transparency over compliance. When the “ask your vendor” questions are answered, and the scenarios addressed, we will be able to include methodologies in the definitions in such a way that an overall percent compliance will be more meaningful. I’m not opposed to the idea, nor would I try to discourage anyone else from doing so; I just think it’s premature. Here’s why:

Our current definitions are broad, and we know that vendors use different calculations. In fact, that’s the whole reason we created the “ask your vendor” questions. There are also cases where the metrics described are not exposed in “standard” reports, but can be easily configured by the analyst in some tools. Or situations where a concept exists, but is called something else or simply isn’t exposed in the tool. Because of that broadness, I don’t think percentages calculated today will be different enough from vendor to vendor to provide much differentiation. Or worse yet, “compliance” for specific metrics will be open to interpretation.

For example, look at both Dennis’ and Justin’s responses to our “Return Visitor” definition. Dennis says IndexTools does not comply, while Justin says Google Analytics does. Because they’ve given good explanations (yay transparency!), it’s clear that both tools treat this concept in a similar manner: neither exposes the “return visitor” in the tool, but rather the visits made by returning visitors. If that’s the prevailing way this concept is exposed in the other tools, then it’s not so much a matter of compliance/non-compliance (an argument can be made either way, and indeed both interpreted compliance differently), but perhaps we on the Standards Committee should be defining Return Visits instead of Return Visitors.

So while I want the vendors to publicly tell the web analytics community how their tools stack up to our definitions, it will be as much about informing our Standards as about compliance.

What’s next for the standards document and standards committee?

1) Finish scenarios (more info below) and submit to vendors
2) Keep soliciting industry and vendor feedback on our 9/22 document (download it here)
3) Once feedback has been incorporated, work with the WAA International Committee to translate the final document into other languages

One of the more interesting things we’re doing right now is creating scenarios to find out how situations are handled in different tools. Not only will this provide critical information for practitioners who are transitioning from one tool to another (yes, we’ll publish the replies), but it will form the basis for more specific methodology-based standards.

For example, if the only activity a visitor performs on a site is a non-page event, do our tools count this activity as a visit, and what are the properties of this visit (entry/exit/duration)? Another example concerns average time on site: does the tool include bounced visits, and if so what value is used for these visits (zero? an estimate based on non-bounced views?)?

These are just two of the many situations where tools make different calculations that ultimately affect the numbers you use in your analysis, and why running multiple tools on the same site can lead an analyst to drink. But before we address how we should calculate, it will be beneficial to know the different ways we do calculate.

We’ll talk more about the scenarios during our upcoming WAA Webcast on Thursday, November 6, “Web Analytics Standards Update.” You can sign up on the WAA Site.

How can my readers help the WAA and the standards committee out?

Your readers can help us by giving us feedback. If the document is useful, please tell us. If not, what can we do better? Feedback can be broad or it can be specific to a certain metric. Ultimately, standards need to help the web analytics practitioner, and if we’re not on the right track we need to know.

Download the latest document from our WAA Standards Committee Page. If you attended eMetrics Stockholm or DC, you received a paper copy in your bag. We’ve started a blog post soliciting public comments on the WAA site. Other ways to provide feedback are by email (standards at webanalyticsassociation dot org), participating in our webcast, or starting a discussion about a specific term on the Web Analytics Forum.

There are so many bloggers in our space right now that if feedback consists solely of a post on one’s own blog or a comment on someone else’s, we may not find it so please drop us a line or post a comment on the WAA Blog to let us know.

Anything I forgot to ask you?

I want to expand a bit on the issue of standards with “teeth.” Above, I mentioned that we are more interested in transparency than compliance right now, but I don’t want to mislead anyone and make them think compliance isn’t important. It is, and compliance is our ultimate goal.

However, our industry is still young. There are any number of industry reports that will tell us most of the major tools provide similar basic functionality. A competent analyst can get valuable business insights out of any of them. And compliance to standards most certainly carries a cost, not just to vendors who will have to rewrite scripts and database queries, but to practitioners who will have to explain to their stakeholders why a number they’ve been reporting for years as 11.5 is suddenly 9.2.

Not having that happen any more is one of the best arguments for standards, but in order to get there we’re all going to have to get through the transition. The Committee wants to make sure that the standards are well-vetted, reasonable, and that they ultimately benefit the practitioners of our trade and confer a competitive advantage on vendors who comply. After all, if nobody complies, then why bother with standards?

As I discussed above with my Return Visitors vs. Return Visits example, if we find that the industry is already used to seeing a concept expressed a certain way, we’re not interested in forcing the entire industry to change their mind if both metrics are equally valuable. In that case, we want to standardize on the metric that people already find useful. We can’t do that without your feedback. And we can’t be sensitive to changes we’re asking from vendors if they don’t tell us how they do things right now. We have tried to figure it out for ourselves, but honestly? Most vendor documentation sucks.


Again, you can register to attend the WAA Standards Committee webcast on Thursday, November 6th by going to the Web Analytics Association web site.  Thanks again to Angie for taking the time to answer my questions!

Adobe Analytics, Analytics Strategy

Track Visitor Engagement using Google Analytics!

One of the major complaints about my work on measures of Visitor and Audience Engagement is that unless you have Visual Site (= Omniture on Premise), Unica, Coremetrics, SAS, or a custom data warehouse solution you’re somewhat limited in your ability to make the calculation. Now, thanks to the recent upgrade to Google Analytics and the availability of session-level segmentation everyone can use my calculation to explore engagement patterns on their site.

Yep, free measures of Visitor Engagement from Analytics Demystified and Google Analytics!

It was a post from Alec Cochrane about engagement that got me thinking about the application of my calculation using Google’s segmentation features (thanks, Alec!). Heck, had I been paying more attention to his blog I would have noticed that even Avinash Kaushik (who persists in his dogmatic assertion that “engagement cannot be measured”) refers to GA’s ability to make the calculation.

Keep in mind, what I’m describing in this post is not a full-blown measure of Visitor Engagement for a lot of reasons. Still, as I’m kicking it around it appears to be a pretty good start and per my entire approach towards measures of engagement, I’d rather have all of you banging on the idea than work in a vacuum.

So how does it work?

Step 1: Gather Your Threshold Values

The first step is to determine what thresholds you want to set for your Click-Depth, Duration, Recency, and Loyalty indices.  You can get the first two from GA’s Visitors > Overview report (shown at right) while Recency and Loyalty come from Visitors > Visitor Loyalty > Recency and Visitors > Visitor Loyalty > Loyalty respectively.
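As a rough sketch of the idea, threshold selection amounts to taking site-wide averages. The session records and numbers below are hypothetical; in practice you simply read the averages off the GA reports named above:

```python
import math

# Hypothetical per-session records: (page_views, seconds_on_site).
# In practice these averages come from GA's Visitors > Overview report.
sessions = [(1, 10), (2, 45), (4, 200), (7, 420), (3, 150), (1, 5)]

avg_page_depth = sum(s[0] for s in sessions) / len(sessions)
avg_duration = sum(s[1] for s in sessions) / len(sessions)

# Round the averages up to whole numbers to use as threshold values.
click_depth_threshold = math.ceil(avg_page_depth)  # 3 page views
duration_threshold = math.ceil(avg_duration)       # 139 seconds
```

The same averaging approach applies to the Loyalty and Recency thresholds, though as noted below you may need to be more creative with those.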

Depending on your site you may need to be creative in how you set the Loyalty and Recency thresholds, especially since GA’s reporting on these measures is not super robust. Fortunately, since the segmentation tool is pretty flexible you can play with the threshold values once you’ve set them.

Step 2: Create Your Engaged Visitors Segment

The next step is to create a segment that lets you identify “engaged” visitors on your site. I’ll first describe the basic calculation, which is essentially the same as Audience Engagement only applied to click-stream data, and then expand in a follow-up post on the idea leveraging the Interaction Index, the Brand Index, and the Feedback Index.

Start by “creating a new custom segment” and adding the visitor dimension “Page Depth” (Google Analytics’ measure of Click-Depth during the session), setting the condition to “Greater than or equal to” the Click-Depth threshold value you discovered in Step 1:

Make sure to test the segment and confirm that things are working. In the example above you can see that about 25% of the sessions to my site last May included at least three page views. Next you’re going to add the Duration Index by adding an “and” statement, dragging in the visitor dimension for “Visit Duration”, and setting the condition to “Greater than or equal to” the time on site threshold determined in Step 1:

Because you’re using an “and” statement, we are getting the number of sessions that included both at least three page views and at least three minutes of duration. While this is imperfect compared to the visitor-by-session scoring strategy we described in the longer white paper, the use of “and” ensures that we’re identifying visitors who are paying Attention as measured by clicks and session duration.

The next step is to roll in the Loyalty and Recency indices using the visitor dimensions “Count of Visits” and “Days Since Last Visit”.  As I mentioned above you may need to play with the thresholds here, perhaps creating a visitor segment of goal converters (purchases, leads, etc.) and examining the return visit behavior for that segment.  Also, when you set “Days Since Last Visit” be sure to use the condition “Less than or equal to” to capture visitors who have been to the site recently:

If your site is like mine you’ll see a noticeable drop in the number of matching visits when you add “Count of Visits” or “Days Since Last Visit” because of the use of the “and” operator.  But this is good and to be expected since if everyone coming to your site was truly engaged then you wouldn’t be reading this post, you would just be rolling in money.

All you have to do now is name and save the segment and you’re in business!  I called my segment “Engaged Visitors” which is not technically correct — really what I’m tracking is “Engaged Visits” — but when you see the final application of the segment below you’ll understand why.
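For those who prefer to see the segment’s logic spelled out in code, here is a minimal Python sketch. The Session record and the specific threshold numbers are hypothetical stand-ins; substitute the values you gathered in Step 1:

```python
from dataclasses import dataclass

@dataclass
class Session:
    page_depth: int             # pages viewed during the visit
    duration_seconds: int       # "Visit Duration"
    visit_count: int            # "Count of Visits" for this visitor
    days_since_last_visit: int  # "Days Since Last Visit"

# Hypothetical thresholds; use the values from Step 1.
THRESHOLDS = {
    "page_depth": 3,             # Click-Depth Index
    "duration_seconds": 180,     # Duration Index (3 minutes)
    "visit_count": 3,            # Loyalty Index
    "days_since_last_visit": 7,  # Recency Index
}

def is_engaged(s: Session) -> bool:
    # All four conditions are joined with "and", mirroring the segment.
    # Note that recency uses "less than or equal to" while the other
    # three use "greater than or equal to".
    return (s.page_depth >= THRESHOLDS["page_depth"]
            and s.duration_seconds >= THRESHOLDS["duration_seconds"]
            and s.visit_count >= THRESHOLDS["visit_count"]
            and s.days_since_last_visit <= THRESHOLDS["days_since_last_visit"])
```

So a fourth visit of five pages and four minutes made two days after the last one would qualify (`is_engaged(Session(5, 240, 4, 2))` returns `True`), while dropping any single condition below its threshold disqualifies the session.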

Step 3: Mine Google Analytics for Engaged Visitors

Once you’ve created your “Engaged Visitors” segment you can start to apply it to the various reports in Google Analytics.  I recommend comparing the engagement segment against “All Visits” to get context — and GA does something nice here in calculating the percentage of segment members (= sessions where all four engagement criteria are met) for you.  Here you can see how this comparison looks in the Visitors > Map Overlay report:

While I’m only drawing a moderately engaged audience from Australia I am feeling the love from Spain! Probably since my good friend Rene Deschamps is Spanish or perhaps since I’m talking to a web analytics consulting group in Spain about coming over for a presentation and a big Web Analytics Wednesday event this coming Spring … who knows?
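The percentage GA calculates for each row of a report can be sketched in a few lines of Python. The (country, engaged) pairs below are made-up data standing in for real sessions tagged by the segment:

```python
from collections import defaultdict

# Hypothetical (country, engaged) pairs, one per session; the engaged
# flag comes from whatever segment logic you built in Step 2.
sessions = [
    ("Spain", True), ("Spain", True), ("Spain", False),
    ("Australia", False), ("Australia", True), ("Australia", False),
]

totals = defaultdict(int)
engaged = defaultdict(int)
for country, flag in sessions:
    totals[country] += 1
    if flag:
        engaged[country] += 1

for country in totals:
    pct = 100.0 * engaged[country] / totals[country]
    print(f"{country}: {pct:.1f}% engaged")  # Spain: 66.7%, Australia: 33.3%
```

The same grouping works for any report dimension, whether that’s geography, keyword, or referring network.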

Now, I am pretty delighted with how easily these segments can be applied to the various reports in Google Analytics … hell, just the fact that the segment stays applied when I navigate from report to report is nice.  And yes, there are some obvious improvements that could be made but for a first effort this is pretty nice.

The same segment can be applied to reports that are more critical to how you run your business, for example the keyword report.  When I look at the three top keywords driving traffic to my site you can see a clear pattern begin to emerge (and this is without adding the Brand Index into the engagement calculation):

Here you can see an obvious difference in the level of engagement associated with external searches for my brand’s name and “Web Analytics Wednesday.”  Even searches for Judah Phillips driving traffic to my site are bringing in a highly engaged audience (Judah, since I know you’re sensitive about this, nearly 30% of the visits associated with searches for you are scoring as engaged … nice work, buddy!)

If you’re willing to keep drilling down you can learn all kinds of wonderful things.  Here is a comparison of network traffic coming from WebTrends and Omniture:

Finally, if you’re using your one user-defined field to capture some type of visitor identifier (hopefully doing so in line with your privacy policy) you can actually apply the engagement segment to individuals or groups interacting with your web site and begin to measure true Visitor Engagement.  Here you can see my very good friends Judah and June, who are highly engaged at the Analytics Demystified web site, shown in stark contrast to another very active visitor who does not appear to be paying me any Attention at all:

This has become a long post so I’ll stop here for now and leave you with the following summary points:

  • Google Analytics, like any session-based system, is not perfectly suited for calculating a true measure of Visitor Engagement;
  • That said, given the recent availability of segmentation in Google Analytics, I would encourage those of you running GA to explore the use of my Visitor Engagement calculation;
  • My belief is that you will begin to see for yourselves that this measure will help you identify opportunities not easily uncovered using traditional measures like average time on site and bounce rate;
  • But you don’t have to take my word for it, do you? Play with the ideas I put forth in this post and let me know what you discover.  I would absolutely love to hear what you learn using this segmentation strategy or learn about applications of the segment that I haven’t thought of yet!

Last but not least, keep in mind that I have always put forth my work on Visitor and Audience Engagement as a hypothesis, one that is still being evolved and subject to testing and application in a variety of business situations. The thing I love about our community more than anything is the willingness that most of us show to explore new ideas and have an open mind.

As always I welcome your comments and feedback.

Analytics Strategy

The Google Analytics update: Thoughts and Implications

By now you are all well aware that the nice folks at Google wowed the web analytics world last week by announcing a suite of upgrades to Google Analytics at Emetrics.  Google’s “Analytics Evangelist” Avinash Kaushik — who was otherwise a no-show at the conference — made the flight out in Brett Crosby’s place to deliver a very energetic presentation describing the new features. More than anything after the talk I was left with this impression:

Google is serious about Google Analytics.  Period.

This is great news, at least for most of us, and I have to say after playing with some of the new features I am very impressed. Not that I didn’t expect to be — Brett and Paul’s team at Google has repeatedly demonstrated that they get it and that they have the programming talent to back them up.  With the addition of Kaushik as a full time product evangelist, especially given Avinash’s intensely competitive nature, the question wasn’t “will they improve the product” but rather “how much will they improve the product?”

This is a conversation I seem to have quite a bit with folks who are tangential to the industry, mostly because people who have invested heavily in OMTR are justifiably nervous about whether Google Analytics has the potential to slow SiteCatalyst sales.  All along the argument for OMTR was “Google Analytics is nowhere near as powerful as SiteCatalyst, and Google has no reason to improve Google Analytics by adding missing functionality like segmentation, customization, and data export, given the associated costs and the fact that Google Analytics already dominates the web analytics landscape with over 65% market share across all sites with tag-based analytics deployed.”

Except it appears that nobody told Google this. Or, if they did, Google didn’t listen.

Now don’t get me wrong, the new features are not perfect.  The segmentation feature receiving the most hype within the web analytics community is not true visitor-level segmentation but rather session-based segmentation, which severely limits an experienced practitioner’s ability to drill down into the data. But I suppose this is a perfect example of “you get what you pay for,” and since we’re not paying, having multidimensional session-level segmentation that can be immediately applied to all historical data is pretty sweet.

On the upside, I was actually pretty surprised by Motion Charts, which seemed like a tchotchke to me, but after playing with the feature for just a little bit I’m inclined to agree with Yahoo’s Dennis Mortensen that Motion Charts have potential. I especially like the “Link to Chart” option that seems to allow us to share the visualizations we create with other Google Analytics users.

Here’s a Motion Chart that I’m rather enjoying the use of: Keywords by goal conversion rate by bounce rate sized by Percent New Visits colored uniquely with trails turned on.

The other features (AdSense integration, Management Console upgrade) are nice but decidedly less sexy.  Oh, except for the Data Export APIs, which are easily the most exciting feature announced and the one with the greatest potential to reshape the web analytics landscape forever.  As I recently commented when talking about Yahoo Web Analytics, the availability of API-based access to web analytics data is the feature with the greatest potential to change the way larger and more sophisticated companies think about free tools like Google Analytics.

Judah Phillips commented as much back in May of last year when Google updated Analytics the last time.  Bright guy that Judah.  Now he can use Google Analytics in his new position to pull freely collected data into the corporate Intranet … niiiice!

I suspect that before long we’re going to see some pretty cool applications taking advantage of the Google Analytics APIs, which will erode the immediate need to invest in for-fee solutions.  When I ask my magic eight-ball if someone will develop something like HBX ReportBuilder for Google Analytics to allow companies to create custom reports in Excel and schedule delivery, or if we’re going to see vertical-specific web analytics applications like “Google Analytics for Real Estate Sites” and “Google Analytics for Law Firms,” the response is always the same:

“Signs point to yes.”

It’s not that there isn’t still a gap between functionality provided by Google Analytics and that provided by Yahoo Web Analytics, SiteCatalyst, Coremetrics, WebTrends Web Analytics, etc., there definitely is. Custom data collection, data integration into the visualization interface, visitor-level data collection and analysis, custom dimensions and metrics, and the like are all features more-or-less required for advanced analytics.  But I think the fact that so few companies are doing truly advanced analytics coupled with the inevitable ecosystem that will spring up around the Google Analytics APIs will create some pain within the sales organizations among the for-fee vendors.

Especially in this uncertain economy, if I have to choose between spending $20K to $50K on an entry-level SiteCatalyst/Coremetrics/WebTrends/Unica/Nedstat deployment or spending nothing to explore the use of segmentation, report customization, and Motion Charts while waiting for someone like DataLinks to port their application to the Google Analytics APIs so I can spend $995 to build totally customized key performance indicator reports in Excel … well, as a small business owner the choice is pretty clear.

WebTrends recent announcement about moving increasingly into BI, essentially as middleware between web data and traditional business analysis applications, is typical of the response I expect we’re going to see from the for-fee vendors. Some type of move up-market to continue to justify the expense of data collection, which will further limit opportunities for growth since I expect the end-user market to continue to mature at a much slower pace than the available technology set.

I mean, why pay for data collection and storage if Google and Yahoo are going to give it away? Especially in the context of those APIs and the low-cost applications we’ll inevitably see, I suspect the management teams at Omniture, Coremetrics, WebTrends, Unica, and Nedstat are looking suspiciously at their Q4 and Q1 2009 projections for SMB sales and global expansion trying to figure out exactly how much free web analytics will ultimately impact the business.

I know I would be.

It’s not to say they won’t figure it out — all of these guys are tough competitors with a lot of experience growing their business in an increasingly volatile market — but it will certainly be interesting to see how they go about it.  And I personally think it’s going to be a lot of fun to see how the market changes.  The best companies are doing absolutely outrageous things with web analytics, the mid-market is maturing more than ever thanks to an understanding of the right relationship between people, process, and technology, and the number of viable solutions is now approaching a very reasonable set reflective of the market’s maturation.

In summary:

  • The new Google Analytics features are totally awesome and the crew at Google should get some kind of altruism award for making this level of functionality available for free;
  • Once and for all we can agree that, without any doubt, Google is serious about Google Analytics and web analytics in general;
  • While GA’s segmentation is limited to session data only, the implementation is brilliantly intuitive to use, especially at the best of all price points;
  • Motion Charts are far cooler than I expected them to be, custom reporting is as brilliantly intuitive as I expected it to be;
  • I still don’t think Google Analytics is appropriate for advanced practitioners, at least not as a system of record, but the number of truly advanced practitioners working out there today is still relatively small;
  • I think the Data Export APIs are the most exciting aspect of this announcement and I’m looking forward to all the cool, new applications that will inevitably spring up based on these APIs;
  • I think that Google has sucked the wind out of Yahoo’s sails, whether they intended to or not, but I still don’t think that Google Analytics and Yahoo Web Analytics are directly competitive;
  • I think the vendor folks most impacted by this announcement are the teams responsible for SMB sales, the expansion into Europe and Asia, and anyone selling web analytics solutions at a sub-$50K price point;
  • I expect the for-fee vendors to respond to Google Analytics not by picking on the features (remember: voters don’t like negative campaigning!) but rather by working to more aggressively take their existing suites and platforms up-market;
  • I don’t expect this announcement to be a death blow to anyone. Rather it serves as yet another reinforcement of the inevitable commoditization of the web analytics data collection market and a wake-up call to any company with a ten-year plan to continue to make money counting page views.

Okay, thanks to Brett’s generosity I’m going to go back to playing with segmentation and Motion Charts. As always I welcome your feedback on my commentary and would love to hear from those of you also playing with the new features about your experience.

Analytics Strategy

I Wonder If chacha.com Came Up at eMetrics?

We had a great Columbus Web Analytics Wednesday last night at The Spaghetti Warehouse, sponsored by CoreMetrics, Analytics Demystified, and SiteSpect. As usual, we held it on a Tuesday, and we went double-rebellious by somewhat inadvertently scheduling it to conflict with eMetrics (which lost us Jonghee and Judy) and with a local pool league (which lost us Nicole).

Nevertheless, we had a healthy turnout, albeit one that had an exact 1:1 ratio of X to Y chromosomes (think about it):

Columbus Web Analytics Wednesday (Tuesday) -- October, 2008

As tends to be the case, we had heavy Coremetrics representation at the event, but we also had one company using WebTrends, one company that’s exploring IndexTools, and heavy dabblers in Google Analytics all around.

We didn’t have any formal topic, but both ends of the table had some interesting discussions. On my end, a couple of the big topics were:

  • The importance of web analysts — or any working professional — being passionate about their work and constantly pushing themselves to try new things; this led to a discussion about the importance of “making mistakes” — it means that you’re trying new things, and it means that you’re learning. As theoretical as the summary sounds, this was rooted in experience with specific people in web analytics roles who were not effective in the position
  • The inanity of the “clickpath” report — turns out that Bryan C. and I are both major anti-fans of classic clickstream reporting (it goes to my “people are cows” myth); single-level paths are much more practical and useful in most cases. We don’t like overlay reports, either, because they are so rife with pitfalls in their interpretation

We also covered a bit about my pending move to Nationwide, where I’ll be an official co-worker of Bryan C., albeit in a totally different department and not really involved with web analytics. More on that move to come — I expect the number of data management posts here to increase as I get more heavily back into that world.

And, on the “entertaining technology” front, Scott introduced us to chacha.com. It’s a free service where you can call or text in any question, and it will text back an answer. Scott demonstrated by calling and asking “Where do babies come from?” The answer that was returned was accurate, succinct, and, yet, would garner a PG rating on the big screen. I tried it this morning by asking “What is web analytics?” with lesser results:

Off-site web analytics refers to web measurement and analysis irrespective of whether you own or maintain a website.

I’m not sure where the “off-site” came in, but the response included a link with the question and answer: http://search.chacha.com/u/DIffZIPz. And, that link includes a link back to the “Source Website,” which was the Web Analytics Association, so, all in all, not bad for a one-minute response to a random question.

And, finally, Monish Datta said my goal should be for this site to show up as the number one Google result for a search on “Monish Datta.” Currently, I’m on the second page of results, lagging well behind the YouTube photo montage of The Monish set to the tune of Runaway. But, I’ll keep working on that.

Adobe Analytics, Analytics Strategy, General

Visitor Engagement + comScore = Audience Engagement!

About six months ago the management team at comScore approached me with some questions about my Visitor Engagement calculation and the Analytics Demystified engagement framework. Their Chief Research Officer, Josh Chasin, had taken an interest in my work and wondered how it may be extensible across multiple properties using the comScore dataset.

It was an excellent question, and today I’m happy to give readers a preview of what we believe to be an excellent answer. Today we’re announcing a measure of Visitor Engagement that, thanks to comScore, can be used to compare levels of engagement across multiple properties in a similar category.

Brand Marketing’s New Measure: Audience Engagement

Audience Engagement is a simple modification of Analytics Demystified’s Visitor Engagement calculation that focuses on the core site behavioral attributes, measured through the comScore panel. If you remember, the Visitor Engagement calculation is:

Σ(Ci + Di + Ri + Li + Bi + Fi + Ii)

The components of the Visitor Engagement calculation are:

  • Click Depth Index: Captures the contribution of page and event views
  • Duration Index: Captures the contribution of time spent on site
  • Recency Index: Captures the visitor’s “visit velocity”—the rate at which visitors return to the web site over time
  • Brand Index: Captures the apparent awareness of the visitor of the brand, site, or product(s)
  • Feedback Index: Captures qualitative information including propensity to solicit additional information or supply direct feedback
  • Interaction Index: Captures visitor interaction with content or functionality designed to increase level of Attention the visitor is paying to the brand, site, or product(s)
  • Loyalty Index: Captures the level of long-term interaction the visitor has with the brand, site, or product(s)

(More information about the measure of Visitor Engagement, including the details behind the calculation and several example use cases, can be obtained by reading the white paper that Joseph Carrabis and I recently published, Measuring the Immeasurable: Visitor Engagement which is freely available on this web site.)

The Audience Engagement calculation simplifies Visitor Engagement by applying a “zero weighting” to the Brand, Feedback, and Interaction indices. By removing these values from the core calculation we are left with Click Depth, Duration, Recency, and Loyalty:

Σ(Ci + Di + Ri + Li)

In English:

“Audience Engagement is a function of the number of clicks a visitor generates at a site, the amount of time they spent at the site, the frequency at which they return to the site, and their loyalty to the site as a member of the category for all of the sessions to that site during the reporting period.”

We’ve selected these four indices for one very simple reason: When scored using category-level thresholds (with the exception being the Loyalty Index, see below) comScore is able to automatically generate Audience Engagement values and engagement distributions across all of the sites they track.
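For illustration only, the combined score can be sketched in a few lines of Python. The function name is mine, as is the assumption that each component index has already been scored against its category threshold and normalized to the range [0, 1]; the precise scoring rules are defined in the white paper.

```python
# Illustrative sketch only: assumes each component index has already been
# scored against its category threshold and normalized to [0, 1].

def audience_engagement(click_depth, duration, recency, loyalty):
    """Average the four indices Audience Engagement retains (Ci + Di + Ri + Li).

    The Brand, Feedback, and Interaction indices are zero-weighted
    and so drop out of the calculation entirely.
    """
    indices = (click_depth, duration, recency, loyalty)
    if any(not 0.0 <= i <= 1.0 for i in indices):
        raise ValueError("each index must be normalized to [0, 1]")
    return sum(indices) / len(indices)  # averaging keeps the score in [0, 1]

# A hypothetical visitor who clicks deeply and returns frequently:
score = audience_engagement(0.85, 0.60, 0.47, 0.50)  # roughly 0.605
```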

The result is a unique view into the relationship visitors have with the thousands of web sites comScore tracks around the globe. Now, for the first time ever, marketers and advertisers are able to gain insight into the level of engagement using a much more robust measure than session duration, page views, or recency alone.

Using Audience Engagement we can say with a high level of certainty that a greater percentage of Internet users find CNN more engaging than MSNBC and Yahoo! News:

More importantly we can also say that CNN has a larger population of “highly engaged” visitors to their site (22.5% of visitors at CNN versus 15% at MSNBC and less than 10% at Yahoo! News.) We believe that assessment of the audience distribution will provide advertisers an entirely new way to evaluate sites, focusing on audience quality over more simplistic measures of quantity.

This same type of analysis applied to popular network sports sites yields similarly interesting insights:

Here we can see that ESPN, while trailing Yahoo! Sports across all traditional measures (page views, sessions, minutes spent, active days), dominates Yahoo! from an Audience Engagement perspective. A closer examination of these two sites shows that ESPN’s dominance is driven largely by the frequency at which their audience members return to the site (Recency Index of 47.2% versus Yahoo! Sports at 27.0%) — an insight that has clear value to advertisers looking to create brand awareness and drive brand impressions across a sports-minded audience.

While comScore and Analytics Demystified are still working on how this data will be packaged and presented, another way of visualizing the relationship between two sites or a site and the category average is using a spider chart:

This chart visually tells the same story as the table above — ESPN has a higher level of Audience Engagement (bigger footprint) that is largely driven by Loyalty and Recency.

We believe that brand advertisers, advertising planners, and marketing managers will be able to use this data to make better decisions during the ad planning and media buying process. The whole debate over the definition of engagement manifested largely from advertisers’ desire to find more engaged audiences, juxtaposed against a lack of faith in the simple measures being proposed as proxies for engagement. Thanks to comScore, these simple measures are about to become a thing of the past, giving way to a significantly more robust measure of the level of Attention audiences are paying at advertising-powered sites around the world.

Interpreting Individual Data Points

In case you don’t want to spend the time reading the 50-page white paper I recently wrote on the subject with the mathematician and cultural anthropologist Joseph Carrabis, I’ll provide a brief summary of how the data comScore is reporting can be used.

Here is a sample of sites from comScore’s automotive category:

The first line in this table says that 42.8% of the audience to KBB.com is appreciably engaged with the web site. Engagement at KBB.com is largely driven by visitors clicking deeply into the site and spending an appreciable amount of time doing so, with nearly 85% of audience members exceeding the category Click Depth threshold and over 60% exceeding the duration threshold. Finally, using the distribution data, we can also see that 63% of the audience is highly engaged versus less than 3% who are only poorly engaged.

Audience Engagement data provided by comScore can also be used in a comparative context. Looking at the most and least engaging sites in this group, the data suggests that the audience going to KBB.com is over 400% more engaged than the audience going to About.com Autos (42.8% versus 8.5%.)  This is not to say that advertising at About.com Autos is a bad idea — over 90 percent of the site’s audience appears to be moderately engaged and in some instances a moderate level of engagement may be exactly what the campaign is looking for.
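The relative comparison above is easy to verify; a quick back-of-the-envelope calculation (variable names are mine) shows where the “over 400%” figure comes from:

```python
# Percent of each site's audience that is appreciably engaged,
# per the comScore automotive-category sample above.
kbb = 42.8
about_autos = 8.5

# Relative lift of KBB.com over About.com Autos:
lift = (kbb - about_autos) / about_autos * 100
print(round(lift, 1))  # prints 403.5 -- i.e. just over 400% more engaged
```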

A Technical Note about Audience Engagement’s Loyalty Index

In the Audience Engagement calculation, the Loyalty Index is calculated differently than in the Visitor Engagement calculation because of an advantage conferred by the comScore system. Instead of simply counting the number of times a visitor has returned to the site, as we’re forced to do using a site-centric data model, comScore allows us to better approximate loyalty as it is more commonly used: a measure of a visitor’s likelihood to prefer a single site or brand over all others in the category. This is essentially the “share of requirements” model traditionally used in the brand advertising industry and is calculated as:

Li(AE) = Visits to Site / Visits to All Sites in the Category

So, for example, if a comScore panelist is going only to eBay in comScore’s “Auctions” category, their Loyalty Index for eBay would be 100%:

Li(AE) = 10 visits to eBay / 10 visits in the “Auctions” Category

Conversely, if another visitor goes to eBay half the time and Bidz.com half the time, their Loyalty Index for eBay would be 50%:

Li(AE) = 5 visits to eBay / 10 visits in the “Auctions” Category
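As a sketch of the share-of-requirements model (the function name is mine, and comScore’s production calculation may differ in detail), the two panelists above work out as follows:

```python
def loyalty_index(site_visits, category_visits):
    """Li(AE) = visits to the site / visits to all sites in the category."""
    if category_visits == 0:
        raise ValueError("panelist made no visits in the category")
    return site_visits / category_visits

loyalty_index(10, 10)  # exclusive eBay panelist -> 1.0 (100%)
loyalty_index(5, 10)   # half eBay, half Bidz.com -> 0.5 (50%)
```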

The result is a distribution of Loyalty Index scores for auction sites tracked by comScore in September that looks like this:

As you can see, eBay’s Audience Engagement component indices are higher than those of their competitors, but their Loyalty Index is much higher and tells us that nearly all visitors in this category strongly prefer eBay to its competitors.

One of the challenges comScore and Analytics Demystified face regarding the Loyalty Index is the refinement of categories. Some categories like “Auctions” are well defined and represent logical competitors in a sector; others, like “News/Information” include diverse sites like Weather.com, Discovery.com, and Court TV Online. Over time we hope to refine these categories in partnership with comScore clients to provide the most accurate view of category loyalty possible. If you’re interested in participating in this work, please contact me directly.

Next Steps for comScore and Analytics Demystified

This is the first time we’ve been able to apply the Analytics Demystified Engagement construct to a syndicated audience database.  We’re just announcing this work today, but we can already see possibilities for the measure’s evolution. Potential next-generation enhancements could include:

  • Allowing comScore clients to provide a set of branded search terms to support the inclusion of Visitor Engagement’s Brand Index (Bi)
  • Allowing comScore clients to provide a set of key site interactions designed to promote visitor Attention, supporting the inclusion of Visitor Engagement’s Interaction Index (Ii)
  • Incorporating third-party data sources measuring more qualitative aspects of the audience relationship with the site, supporting the inclusion of Visitor Engagement’s Feedback Index (Fi)
  • Allowing comScore clients to define their own competitive set in order to drill down into a more specific engagement profile in support of the advertising sales process
  • Providing comScore clients access to the details behind the Audience Engagement calculation for their site and category
  • Providing comScore clients custom access to Audience Engagement data, to provide a measure of Visitor Engagement in situations where the web analytic technology deployed does not support direct measurement

These are just a handful of examples of where this data offering can go. We’re presenting this model and starting the conversation because we want to hear from you. Regardless of whether you’re a current comScore or Analytics Demystified client, we would love your feedback regarding the calculation, the data, and the type of insights Audience Engagement is likely to provide to your organization.

Want to Know More about Audience Engagement?

Any reader of this blog knows that I have a passion for talking about the new measures of success on the Internet. I’m tremendously excited about this announcement and happy to talk if you’re interested in how you might be able to leverage Audience Engagement data.

Also, don’t hesitate to contact us if you have concerns about how we measure Audience Engagement or, in the extreme case, don’t think engagement can be measured at all. I firmly believe that the measures of Visitor and Audience Engagement I have proposed and the work I’ve done with Mr. Carrabis and now with comScore are only the beginning of the search for more useful measures of success on the Internet. Because these measures attempt to approximate something we agree is difficult to quantify, we believe that these measures will evolve over time; nothing is set in stone.

But we also believe that Visitor and Audience Engagement are better measures than “page views” and “average time spent,” and far more useful to the measurement industry as a whole than simply sticking our heads in the sand and exclaiming “engagement is an excuse” or, worse, taking a Luddite’s view and declaring that complex measures are destined to fail.

For the time being, comScore is previewing additional details on the measure of Audience Engagement with their clients selectively.  If you’d like more information about how to be added to comScore’s list, or would like to discuss the measure of Audience Engagement with me, please email me directly and we can arrange a time to chat.

Analytics Strategy

Yahoo Web Analytics does not compete with Google Analytics

While Dennis Mortensen was kind enough to give me some advance notice that Yahoo! had officially rebranded IndexTools and was making it available to a wider audience, I have been so swamped with client projects I haven’t had a chance to comment on the news until now. I’m excited to see the company making forward progress towards making IndexTools available to the larger market, especially in the context of the vote of “no confidence” I keep hearing from the investment bankers I talk to from time to time.

Given that I was honored to break the story about the acquisition it is no wonder that some people have commented, “Hey, Peterson, you were wrong … it’s not free for everyone!” To this I can only comment that I would rather see Yahoo! take a measured and thoughtful approach towards the deployment of the application than be right.

But the one thing I have seen a lot of these past few days is the assertion that Yahoo Web Analytics is designed to compete with Google Analytics and that Yahoo! is somehow lame for being so late to the game. Wrong, wrong, wrong, wrong, wrong.

For those of you keeping track, when Yahoo! announced that they were waiving all fees for existing customers I commented the following:

“Finally, I would personally offer that Google Analytics and IndexTools are (in their current state) dramatically different applications targeting very different audiences.”

Now, I suppose I don’t expect traditional media to look any deeper into web analytics than necessary, and so of course the logical conclusion is “Yahoo! and Google compete, ergo Yahoo Web Analytics and Google Analytics are competitors.” I just hope the team at Yahoo! doesn’t start to believe this positioning, as it A) is clearly wrong, B) minimizes the value of the acquisition, and C) only sets Yahoo! up to fail (something I suspect they have had quite enough of lately …)

In the midst of the media mini-frenzy I saw one quote that almost perfectly summarized why these two applications are unlikely competitors, published by the E-Commerce Times (emphasis mine):

“The data’s not aggregated — the data’s stored raw in our database,” Jitendra Kavathekar, vice president of Yahoo Web Analytics, told the E-Commerce Times.

“You basically get real-time reports and dashboards, allowing our customers to take immediate action rather than waiting half a day, or waiting a day, or waiting a week to get the information they need,” he explained.

That option, Kavathekar asserts, opens up a whole host of new options for end-users when it comes to data visualization and manipulation.

“The ability to drag-and-drop different filters — to be able to cut the data in different ways, in real-time, to get the data that you need, to get the insights you need — is something you don’t generally see out there in the market,” he said.

“The data’s not aggregated” is the difference between SiteCatalyst and Omniture Discover, WebTrends Web Analytics and Visitor Intelligence, etc. It is also the difference of tens if not hundreds of thousands of dollars, additional training and support, and the need for experienced professionals in the operator’s chair.

I said it before and I’ll say it again, just for emphasis: Game changing.

Regardless of the timeline — Christmas, Easter, Memorial Day, 4th of July, whatever — Yahoo! making a real-time raw data collection environment available to a widespread audience for free will change the web analytics market, especially if the company can get their arms around a reasonable GTM strategy. If Yahoo! can figure out how to get the application into the right people’s hands instead of pursuing the ludicrous strategy of duplicating Google’s success, well, I stand by all previous assertions regarding who IndexTools hurts and who it helps.

Even though the two don’t compete, the wildcard of course is still Google. I don’t talk to Brett as often as I should, but the rumors of new segmentation functionality coming soon are growing louder and the idea of Google Analytics APIs from Google itself (as opposed to a group of bright, enterprising individuals) is persistent. Hopefully some of this will come to light week after next at Brett’s Emetrics presentation, especially since his announcement at Emetrics San Francisco made us all yawn …

If Google is coming upmarket, driven by IndexTools or just their own internal strategy, a lot of the objections to deploying Google Analytics as a business standard start to disappear. Segmentation, custom variables, an API, better filters, etc. will all push GA up-and-to-the-right in the constellations, waves, and magic quadrants of the world. Couple that with Google Website Optimizer and the long-term view of the market’s evolution gets even cloudier.

One thing I keep forgetting to ask Dennis and my other contacts at Yahoo! is about the status of Rubix. Last I asked it was still on the table, something that only appears to be reinforced by Kavathekar’s statement about data visualization and manipulation. If you’ve seen Rubix, the “drag-and-drop filters” comment is especially telling, don’t you think?

In summary:

  • Yahoo Web Analytics does not compete with Google Analytics because they are fundamentally designed to serve different audiences;
  • Even if Google Analytics expands segmentation functionality, these applications are structurally different;
  • Provided the investment banking-types are wrong about Yahoo’s ability to execute, I believe that YWA will eventually emerge as a direct competitor in RFPs with SiteCatalyst, WebTrends Web Analytics, Coremetrics 2009, Nedstat, etc.;
  • If Yahoo! figures out how to scale raw data storage and ultimately gives away access to Rubix, the competitive set expands to Omniture Discover, WebTrends Visitor Intelligence, Coremetrics Explore, Unica’s Affinium NetInsight, etc.

Again, congratulations to Dennis, the team at Yahoo Web Analytics, and the folks at Yahoo who made the acquisition on getting a branded product out the door so quickly and opening the door to partners, developers, and Yahoo Store owners. Given all of the other news at Yahoo! and in the market in general this is no small feat.

Adobe Analytics, Analytics Strategy, General

Our white paper on Visitor Engagement is now available

A lot of you have been following the thread in my blog about measures of engagement on the Internet. Over the past year we have certainly had a spirited discussion about the topic, and for the most part people’s interest in the subject has apparently not subsided. About six months ago I started working with Mr. Joseph Carrabis from NextStage Global on the engagement calculation, and the byproduct of our work is now freely available to all as a somewhat lengthy white paper on the subject.

You can download the white paper from the Research > Published Research section of this web site:

http://www.analyticsdemystified.com/link_list.asp?l=Research

The white paper includes a great deal of information about the calculation including background on its derivation, the calculation itself, its use in a business context, and the underlying mathematics.  I welcome your feedback on the paper and am more than happy to discuss the contents via phone or email.

The direct measure of a Visitor’s Engagement with a web site or set of properties is still a work in progress to be sure.  And despite some naysayers, I believe that all of us working with this or similar calculations are quite excited about the possibilities associated with moving on from more simple measures and beginning to combine metrics to create a more interesting (and potentially more valuable) view of visitor interaction on the Internet.

Let the debate begin again!

Analytics Strategy

Columbus WAW — ExactTarget, CRM, Web Analytics, Google…Coupons/Chrome/Ad Manager, and More!

It’s another second Tuesday of the month, which means it’s another Web Analytics (Tuesday) in Columbus!

We had a good turnout — 13-14 people, including a good showing from the local Adobe User Group. ExactTarget graciously sponsored the event, and Brian Burson presented/led a discussion on integrating email, CRM, and web analytics. It was interesting, as he stressed that, while there is certainly a technology challenge in making that integration happen so that you can use web analytics and CRM data to improve the relevance of your communications to your prospects…an equally big challenge is setting up the heuristics (his $10 word — not mine!) to identify which people should get which messages and when based on that sea of data. It was great to hear a vendor acknowledge that it takes real thought and planning to pull that sort of thing off, rather than pretending that 99% of the battle is the integration of the systems! ExactTarget has a nice white paper on the subject on their site. Nice read.

Other random notes of interest from my end of the table:

  • Google Coupons — Pretty. Slick. Michael described the implementation as being very similar to using a sitemap.xml implementation for improved searchability. Try out the end result yourself: 1) Go to Google Maps, 2) Type in your home or work address, 3) When it comes up, search for “Jiffy Lube”, 4) Underneath the results that show up, click Coupons. You’ll get printable Valpak coupons for the location. NIF-ty…
  • Google Chrome — the two things that, when prompted, I could say I liked about it: 1) the fact that multiple tabs spawned multiple processes, and 2) the fact that the search box and URL box were one and the same.
  • Google Chrome — it’s not a browser, it’s an application environment (that’ll be a topic for me to ask more about next month…as I’m not sure what that means)
  • Google Ad Manager — very favorable reviews from the people who have seen it and played around with it
  • “Social media” is becoming a dirty phrase among a noticeable portion of this crowd

Other, people-oriented news of note:

  • Monish Datta arrived and, as such, garnered a mention in this post to boost his Google-ability
  • Scott Zakrajsek attended and shared disparaging remarks about various people and vendors freely 🙂
  • Nicole…pulled another “mysterious and unannounced absence” from the event; we’ll tell her what we made up as excuses for her next time
  • Several of our regulars and semi-regulars had to bail at the last minute due to late-breaking conflicts. They were missed. I won’t “Highlight” anyone in particular that might “Tie” an aura of self-doubt to those folk that could only be properly analyzed by Carl “Jung.” He would have an interesting assessment, I’m sure.

Next month’s event should fall on October 14th, which means I’ll probably have to miss it (unless I make my trip to Austin that week rather abbreviated). We’ll get it up and running on the WAW site a bit sooner this time. We hope.

Analytics Strategy

Your Customer Data Is Dirtier than You Think

Wow, has it ever been a while since I got a new post up. Three-part excuse: 1) Vacation that was wayyyy offline, 2) Vacation was preceded by a crazy week on the road for business, and 3) I’ve been working on a post that’ll turn out to be three separate posts. Stay tuned for that, as it’s my assessment/lessons learned from iterations over the course of a year on a corporate dashboard.

Immediately upon returning from vacation, I had a customer data management experience that highlighted just how bad we are as marketers at successfully tracking our customers and keeping our information on them current. Since we were gone for 10 days, we had our mail stopped. I returned home several days before the rest of my family, which meant I was on hand when the post office dropped off the bin of accumulated mail. As I sorted through it, I wound up with a surprisingly large stack of mail that was not intended for anyone in our house.

The next day — the first day that only a single day’s worth of mail came — I did a little analysis. We got seven pieces of mail (pictured at left). Of those, only three had totally correct information on the address:

  • One piece — was for the prior owner of the house; we’ve now owned the house for a year (side note: the prior owners were on the opposite end of the political spectrum from us, and, sorting through the 10 days of mail, the ratio of literature the prior owners received from “their” Presidential candidate outnumbered the literature we received from “ours” by 3:1)
  • One piece — included my wife’s middle initial…but it was not the correct middle initial
  • Two pieces — were addressed to the wife of the couple we bought our last house from…in another state five years ago!

We’ve gotten to the point, I think, where we just accept this. The problem is that we’re starting to get too smart for our own good. Sending mail to someone who no longer lives at the address has always happened.

Having a minor data entry issue — mistyping a middle initial — is going to happen any time there is human involvement (we can trace back the fact that I still occasionally get mail for “Jim Wilson”, rather than “Tim Wilson,” to a single phone company screw-up shortly after we got married a decade-and-a-half ago).

What was really interesting, though, was what happened when companies tried to address the first issue — identifying when people moved — and generated the last issue. In my wildly-not-statistically-valid anecdote of a single day, trying to “fix” the first issue generated twice the misdirected mail.

What I’m Sure Happened

We bought a house in Austin five years ago from the couple who built it and lived in it for 25 years; in that time, they had gotten embedded into countless systems with that address. We continued to receive a steady stream of mail for them for the entire time we lived in that house, and it was not all junk mail by any means. To make things a little tougher on the senders, we shared the same last name with the prior owners. At some point while we owned that house, some of those companies undoubtedly implemented some sort of customer data integration system that, undoubtedly, hooked into some external data sources to try to sniff out when their customers moved. The problem? Much of their data was already outdated…and they didn’t have a way to identify which data that was.

So, when a “relocation” was picked up — our sale of our house in Austin and the purchase of our house in Ohio — all of the identified “residents” of the Austin house were “moved” with us.
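The failure mode is easy to reproduce. Here is a minimal sketch of that naive propagation rule — the data, names, and function are all hypothetical (no real customer data integration product works exactly this way), but the logic matches what must have happened: when a move is detected, every record tied to the old address gets moved, stale records included.

```python
# Naive "relocation" propagation: a detected move updates EVERY record
# linked to the old address, including records for people who actually
# left that address years earlier. Hypothetical data, for illustration.

def propagate_move(records, old_address, new_address):
    """Update every customer record at old_address to point at new_address."""
    for rec in records:
        if rec["address"] == old_address:
            rec["address"] = new_address  # stale records move too -- the bug
    return records

customers = [
    {"name": "Actual Mover",   "address": "123 Austin Ln"},  # really did move
    {"name": "Prior Resident", "address": "123 Austin Ln"},  # moved out long ago
    {"name": "Someone Else",   "address": "456 Elsewhere"},  # unrelated
]

propagate_move(customers, "123 Austin Ln", "789 Ohio Ave")

# Both the real mover AND the stale record now point at the new house --
# exactly the misdirected mail described above.
```

The fix, of course, requires knowing which records at the old address are current before propagating the move, which is precisely the information these systems lacked.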

The key takeaways from me for this — and both are really of the “keep in mind” variety — are:

  1. Your customer data is always much dirtier than most people in your company assume it to be. A key role for the data analyst is to have a more realistic understanding and be the voice of reason when it comes to requested analysis projects or the planning of marketing campaigns that rely on the data being cleaner than it actually is
  2. The only real “solution” to this issue immediately dives into 1984-like paranoia — a single universal “profile” (or just a handful of them) that the customer maintains and that other systems can reference so their data stays current. OpenID is a move in this direction…but sidesteps the paranoia by being simply an identifier (OpenID itself doesn’t store any information about you — your name, social security number, address, friends, etc.). The issue almost seems intractable — any movement toward a universal identifier ratchets up privacy concerns at least as fast as it cleans up the data

It’s not pretty, is it?

Analytics Strategy, Conferences/Community, General

Forrester acquires JupiterResearch

I wanted to say congratulations to David Schatsky and the team at JupiterResearch, as well as the fine folks at Forrester Research, on the news of FORR’s acquisition of JupiterResearch announced this morning.  Forrester has acquired a great asset and a great group of analysts, researchers, and operational staff, and it was very encouraging to read David Schatsky’s post on the subject, especially:

“Jupiter’s employees are also going to benefit from the combination with Forrester. Forrester execs have enthusiastically expressed to me their respect for the quality of our staff and are eager for us to become part of the expanded company. Jupiter folks will reap the benefits of being part of a larger organization, with its rich resources, track record of effective execution, and commitment to employee growth and career development.”

Somewhat ironic that FORR has been actively looking for someone to cover web analytics since Megan Burns (who will be at the upcoming X Change conference) has transitioned to cover customer experience more broadly.  John Lovett, in my humble opinion, will make a great Forrester analyst and was almost certainly the best candidate for the job … <grin>not that Mr. Colony should have paid a $23M bonus to his current employer.</grin>

While I am excited for all involved, this combination of companies does raise one specific concern within the web analytics sector: Instead of three independent voices in the community providing an objective assessment of the competitive landscape that can be compared and contrasted over time, now there will be only two, Forrester’s view and Gartner’s view.

Don’t get me wrong, I have a tremendous respect for all involved here — otherwise I would never have advocated for inviting Megan, John, and Bill to keynote the upcoming X Change 2008 conference I’m a partner in.  But I do have some small concern that the market’s view of the vendor landscape will soon be defined by one fewer data points, especially since Gartner has not done a formal Magic Quadrant on the sector recently (although Bill did publish a market note on web analytics on July 3rd that I assume is available to Gartner clients.)

I suppose my fears may be unfounded, but given the unusual (and perhaps unreasonable) amount of weight vendors, consultants, and companies alike seem to put on these constellations, waves, and magic quadrants, the loss of one-third of the available information may have implications that won’t manifest for quite some time.  In the context of the consolidation our industry has gone through in the last 24 months, I think technology buyers are even more likely to look for that “objective” viewpoint and rely on published research.

Wait and see, I guess, but I have a few new questions to ask Megan, John, and Bill in a few weeks at the X Change!

Regardless, I’m excited for the folks at Forrester and JupiterResearch and sincerely hope the acquisition proves fruitful for all involved.

Adobe Analytics, Analytics Strategy, Conferences/Community

John Payne is running for WAA Board of Directors

Those of you in the WAA are hopefully aware that due to Avinash Kaushik’s untimely resignation the Web Analytics Association is currently having a special election.  While the entire organization without a doubt misses Avinash’s charisma, spirit, and great passion for measurement, a great slate of folks have been nominated to replace Mr. Kaushik.  While I don’t know all the candidates personally, I have had the great pleasure to work with one for several years now: John Payne from Coremetrics.

I first met John years ago while he was at IBM SurfAid and I was at WebSideStory.  I was privileged to work directly with John and his team while I was an analyst at JupiterResearch, as I am again privileged to work with John on occasion now that I have started my own company.  And while I have tremendous respect for Akin Arikan, Dennis Mortensen, and Mark Wachen based on my past interactions with each, I can think of few people who would add more value to the current WAA Board of Directors than John Payne.

John was gracious enough to answer a few questions I had about his candidacy and qualifications despite being on a well-deserved vacation in Alaska.

EP: Tell us briefly about your work in the web analytics industry?

JP: Well I am probably the one with the most tenure in web analytics … note I avoided saying the oldest, which is also true 😉 I co-founded IBM’s Web analytics solution (SurfAid) in 1996.  In that role I have been involved in all aspects of delivering web analytics.  I am intimately familiar with the data and the challenges associated with performing meaningful, actionable analysis.  I am currently responsible for Product Management at Coremetrics.

EP: Why are you running for the WAA Board of Directors?

JP: Because of my experience working with the data and building viable solutions around it, I bring a unique and seasoned perspective to the WAA. I am at a point where I want to help the larger web analytics community become more productive and effective.

EP: What three things do you believe make you the one candidate to vote for in this election?

JP: I am hands on … I know the data, the challenges, and the level of effort required to make the reports meaningful. While I have been doing this a long time, I also see a vision for how this can evolve in the future, to capabilities such as analytics across domains that result in targeted content and predictive modeling for more effective content delivery. Thirdly, my experience will help guide the WAA to deliverables that will hopefully have higher value to its members and the larger web analytics community.

EP:  How do you feel about term limits for WAA Board members?

JP: Term limits are a great idea because a regular infusion of new energy and talent will keep the WAA more vital and relevant.

EP: Some have accused the WAA of being somewhat close-minded and having a “not invented here” attitude, something that has the potential to negatively impact the community as a whole. Can you tell me if you encounter this how you might approach the problem?

JP: Each idea needs to be assessed on its own merit. I would encourage full review and dialogue of all new ideas before any decision is rendered.

EP: What is your favorite thing about web analytics?

JP: I really enjoy “playing” with the data and “torturing it until it confesses!”  Really, web site visitor behavior does not lie, and I enjoy finding the story that causes the web marketing team to grin with excitement as they lay out a strategy and measure their success.

EP: What is your least favorite thing about web analytics?

JP: I am always disappointed by some portion of the community who just wants answers without being willing to “bond” with their data and the tools that they need to use.  I guess my disappointment stems from the fact that I enjoy doing “deep data dives” so much!

EP: Thanks a ton John, now back to your vacation and best of luck in the election!

Those of you in the WAA should have an email from the association titled “Re: WAA Board of Directors Special Election – Voting Period Now Open” with your ballot ID.  Hopefully all of you will take the time to vote in this special election and show your support for John, Akin, Dennis, Nicholas, or whichever candidate you choose.

VOTE NOW IN THE WAA SPECIAL ELECTION (you will need your ballot ID and to log in to the WAA extranet)

Analytics Strategy, Conferences/Community, General

Jim Sterne, the Godfather of Web Analytics

In 2007 when Eric Enge asked Google’s Avinash Kaushik about me I was humbled when Avinash responded, “You know that Eric is obviously a leader in the industry. We are all following the trail that Eric has blazed. He is just an awesome guy and a really great thinker.”  But while I appreciate the sentiment, I think that Avinash got one part of this wrong: We are all following the trail that Jim Sterne blazed.

Jim, for the three of you who don’t know him already, is an accomplished author, an internationally known public speaker, the founder of the hands-down most popular conference on web analytics and marketing optimization, and a co-founder of the Web Analytics Association.  And did I mention that he is without a doubt the nicest guy in the entire industry?

Yep.

Jim is one of my personal heroes and he has had a greater influence on my career than anyone I know.  Jim was among the first to learn I was leaving Visual Sciences for greener pastures and has provided me invaluable advice over the past year.  So imagine my glee when, after his inviting me to participate in his conference for six years, I was finally able to repay the favor by inviting Jim to join us at the 2008 X Change conference in San Francisco!

He accepted.  Ecstasy!

In preparation for the X Change event I have been interviewing some of the great people who will be joining us.  While those interviews are being shared with other bloggers, I decided to keep the Jim Sterne interview all to myself.  Read on and learn a little more about “the Godfather of Web Analytics” …

EP: For the three people who ** don’t ** know you, tell me a little about yourself and how you got involved in the web analytics industry?

JS: My first life was in sales – business computers to companies that had never used one before. This was pre-PC and they were expensive, confusing, and confounding. It was great fun explaining to people just what they could accomplish with one. I love watching people’s eyes light up. That led to a life in marketing – software development tools mostly. Print ads, brochures, trade show booths, direct mail. That was pre-PowerPoint so we produced overhead transparencies on a copy machine.

In 1993 I saw my first website (Sun Microsystems) and got wildly excited. I kept asking webmasters from large companies for examples of good online marketing strategy and they kept asking me for my opinion. My opinion ended up coming out in the form of books, PowerPoint presentations and corporate workshops.

In 2000, after presenting at Matt Cutler’s NetGenesis user group meetings and a couple of national tours, we decided a white paper was needed to explain this stuff from soup to nuts. That led to the book which led to the eMetrics Marketing Optimization Summit which led to the Web Analytics Association which…. Oh – you asked for a “little” about myself. Sorry – got carried away.

EP: Honestly, did you think that the Emetrics white paper you did with Matt Cutler would have the impact it did?

JS: I love being at the leading edge, banging the drum to get people to understand what’s just over the horizon. I wrote five books about online marketing but they were just part of the noise. I had no idea that E-Metrics Business Metrics For The New Economy would be the only thing out there for so long and attract such attention.

EP: What made you decide to start a conference for web analytics folks? Had you done conferences before that?

JS: I got most of my consulting business from public speaking but the conference industry was very slow in 2000 and 2001; this was the “Dot-Bomb” era, after all. I decided to produce my own conference in the winter of 2001 but pushed it off until 2002 due to September 11. Web analytics was the most interesting subject to me and so few people were paying attention. It was something that needed a drum and a flag and a parade.

EP: Which of your books are you most proud of? What other authors’ books do you wish you had written yourself?

JS: Nobody ever forgets the first time – even if it takes three editions to get it right. Being introduced as “author of” for the first time is a head rush that is only equaled by handing a copy of the first edition to my father. But the most fun I had was writing a little volume for Lyris – the email company – called Advanced Email Marketing. It’s a work of fiction about a guy hired into a bicycle company to get them into email marketing. He has to explain how the numbers are of value to each of the different managers and executives in the company. Good story, not much of a plot but there’s a happy ending with a twist.

Which other authors’ books do I wish I had written? The usual suspects spring to mind: Yours, Avinash’s, Jason and Shane’s, and anything on Amazon’s Top Ten list.

EP: Corry Prohens from IQ Workforce recently asked me about “the Eric Peterson brand.” How much do you work to manage the “Jim Sterne” brand?

JS: I believe branding is the result of everything you say and do. Think whatever you like, but every time you make a statement, an appearance, or a product you are expressing your brand to the world. Form is as important as function. The means are as important as the ends. Therefore the answer to your question is: all the time.

EP: Which of the Web Analytics Association’s accomplishments in the past few years are you most proud of?

JS: I am ridiculously proud of approximately 300 people who are actively working all around the world to create value for other WAA members. I helped raise the flag, but all these people are energetically and enthusiastically building something to help the next generation of web analysts. I posted a list of WAA accomplishments as a Letter from the Chairman but it’s really the fact that so many people are donating their time and talent to the cause that has me beaming.

EP: Tell me, when you’re not making things happen in the marketing optimization industry, what do you do to relax?

JS: I love to travel – sick I know, given the frequent flier miles I’ve racked up. I collect meerschaum pipes, play Mah Jongg, edit church sermons, design jewelry and monitor some 4,500 Komodo dragons in the wild through a network of webcams and RFID tags from my iPhone.

EP: If you could change ** one thing ** about web analytics, what would that be?

JS: The same thing I’ve been trying to change all along: Get everybody to recognize the astonishing power and value of this information for improving customer experience.

EP: Given that your “Emetrics: Business Metrics for the New Economy” really got the ball moving in 2000, where do you see the web analytics industry in 2010?

JS: There will be more consolidation as larger web analytics companies buy smaller ones, business intelligence companies acquire web analytics companies and smaller firms drop off the radar in the wake of free tools. Some new tools and methods will come about but the Big Problem will continue to be growing awareness, in order to increase investment, in order to train more analysts. The people problem will be with us for some time to come because you can’t automate insight.

EP: Speed round: Short answers to the following questions …

Favorite food? Popcorn
Favorite hotel? Santa Barbara Biltmore
Favorite book? Cryptonomicon by Neal Stephenson
Favorite non-web analytics public speaker? Randy Pausch
Favorite professional athlete? Willie Mays
Favorite airline? United
Favorite saying? “Eighty percent of success is showing up.” – Woody Allen coupled with “Genius is one percent inspiration and ninety-nine percent perspiration.” – Thomas Edison

I hope many of you will be able to join Jim and me at the 2008 X Change in San Francisco, August 17, 18, and 19th. Learn more about the conference at the official web site.

Analytics Strategy, General

JupiterResearch Web Analytics Buyer's Guide

Many of you have probably already noticed this but John Lovett at JupiterResearch just released his “Web Analytics Buyer’s Guide: Assessing Vendors’ Competencies and Value” (requires registration).  Having done one of these reports myself back in the day I want to congratulate John on publishing an amazingly detailed and insightful piece of work.  John has a blog post on the report that is worth reading titled “It’s not the Tools, It’s the Craftsman” which reminded me of lyrics from the Phish song Bittersweet Motel:

“When the only tool you have is a hammer, everything looks like a nail, and you’re living at the bittersweet motel.”

Bittersweet is an apt assessment when it comes to producing this type of research as an analyst: non-vendor clients love the insights, vendors hate the comparisons, and all-in-all the results often fail to shed any truly new light on the market.  John should be complimented because despite publishing two somewhat poorly-resolved constellations, his work makes a few incredibly important points about the state of the market today.

I know that Stephane and Anil have already discussed the report, and nobody really asked me, but here are a few of my thoughts on John’s work.

If I’m Omniture, I’m not very happy about this constellation

Despite hundreds of millions of dollars of investment — including the acquisition of three of the company’s former rivals (WebSideStory, Visual Sciences, Instadia) and the roll up of Offermatica and TouchClarity — in the large Enterprise, John’s assessment has Omniture in a three-way tie for “first” with Unica and Coremetrics.  Compared to my assessment in 2004 and Greg Dowling’s work in 2006 (published by David Daniels in February ’07), John’s work shows that Coremetrics and Unica are actually gaining ground on Omniture from a business value and market suitability perspective.

This is important because it reinforces both John’s central thesis and one of the most important caveats in all of web analytics: it’s not the tool that matters, it’s how you use it!  Omniture’s own consultants make this point when they remind us that we need to work hard to take advantage of the systems we already have in place, and the reality of the situation is that you’re not going to be any more successful with Omniture than any other application until you invest in people, process, and technology with a realistic and well-considered business strategy.

Don’t get me wrong, I think that Omniture has brilliant technology and is in a great position in the market today — if they manage to actually integrate analytics, testing, targeting, and bid management in a truly meaningful way they will solve a bunch of real-world problems.  But despite the hyperbole, hype, and braggadocio, Omniture’s competitors nearly universally have a similar opportunity, and thus I agree with Lovett’s assessment that there is no single market leader in web analytics today, Omniture or otherwise.

If I’m Coremetrics, I am pleased as punch!

Coremetrics is in a funny position in the web analytics market.  Despite all of their competitors declaring them “done” and “yesterday’s news” they continue to rank well in both the JupiterResearch ranking and the Forrester Wave.  Maybe the reason is that Coremetrics is actually still very competitive and able to provide the level of functionality and service that their clients are looking for at a competitive price.  Could that be it?

In fair disclosure, I do some work from time to time for Coremetrics and I really like their team, but given their recent deployment of Coremetrics Explore and the expansion of Coremetrics Connect, I think Lovett’s work validates the observation that the only real difference between Omniture and Coremetrics is their general approach towards marketing and sales, not their technology.

Furthermore, despite having been long considered a high-end solution with a substantial price tag, Coremetrics actually takes first place for overall business value in the SMB sector beating not only Omniture but also Google Analytics and IndexTools which are free!  I commented as much in the press release Coremetrics issued for this report, mostly because this type of market expansion is no mean feat given the quality of the competition.  And to be fair, Lovett’s business value dimension encompasses more than just cost and includes flexibility, scalability, usability, and feature sets.

If I’m WebTrends, I’m bummed out!

Living here in Portland, Oregon I am perhaps more acutely aware of the challenges facing WebTrends.  Last week they lost their CFO to another local firm, they already had to part ways with their VP of Client Services, Kory Kimball, who was only appointed in January of this year, and they are still looking for a replacement for Kathleen Brush who was brought in by Francisco Partners as an interim CMO.  Now, to be fair, these staffing issues are offset by the fact that they still have some pretty bright folks on the team, guys like Barry Parshall and Aaron Gray, but leadership in this marketplace has to come from the top and right now, the top is looking kind of thin.

My advice to Dan and the Board at WebTrends is basically this: get someone who knows web analytics inside and out in a senior position ASAP and get them out there talking about the company, products, and market in general.  On this point I disagree with my good friend Jeff who says that “business is business” and executives don’t necessarily have to be domain experts.  When I look at the market I see folks like John Pestana from Omniture, John Squire and John Payne at Coremetrics, Akin Arikan at Unica, Dennis Mortensen at IndexTools, Brett Crosby and even the great Avinash at Google out there evangelizing for both their products and the entire field of web analytics.

Call me old school, but I think the same key insight that it’s not the technology, it’s how you use it applies everywhere.  WebTrends is not going to be able to compete on a feature/function level because, according to John, the feature/function war is over and done.  The competitive differentiation is going to have to come from somewhere else … and historically that “somewhere else” has been guys like John Squire, Akin, Dennis, and Brett working their butts off to help people understand that despite web analytics being hard, great gains are possible when everyone is invested in being successful.

Surprise, surprise, I was right about IndexTools

When I broke the story about Yahoo! acquiring IndexTools and pointed out that most people who have seen both applications consider IndexTools to be every bit as good as Omniture, Omniture complained.  Brent Hieggelke, their VP of Strategic Marketing, commented on Julien Coquet’s post about IndexTools:

“This move by Yahoo! was done to compete with Google. IndexTools does not compete “toe to toe” with Omniture. The majority of their customers are small businesses (80% of IndexTools customers are SMB according to CMS Watch.) This is great news for small businesses that use Yahoo advertising. However, mid-market and enterprise customers demand advanced functionality, deep domain expertise and specialized services.”

Sounds good, Brent, except you’re basically wrong.  Don’t hate me, but I’m gonna recommend that people go with Lovett’s assessment instead:

“[IndexTools] provides a profoundly capable framework for advanced analysis and offers flexible segmentation built on the premise that segment creation is best facilitated through exploration of data.  Although currently available only through certified partners, the new free pricing model of IndexTools (a Yahoo! Service) makes it suitable for businesses of all sizes that seek a flexible interface and possess in-house staff looking for insights within data.”

The notion that IndexTools is somehow inappropriate for the large Enterprise, is feature-poor, or is otherwise unworthy of consideration in an RFP process when available is just plain silly.  John said as much in his blog post, commenting:

“It turns out that IndexTools does have nearly 80 percent of Omniture’s standard off-the-shelf functionality (77 percent to be exact).”

Now, I think we all wish that John had published his list of “basic” and “advanced” features so we could better quantify the “missing 23%” in IndexTools.  My suspicion is that the gang at Yahoo! is pretty conscious of John’s assessment and working diligently on the next generation of IndexTools, much like the gang at Google did with Google Analytics.

So I will state again: Yahoo’s acquisition of IndexTools is a long-term game changer.  Yahoo! has still not given a timeline for making the application freely available to all, but an entire network of partners is already out there ramping companies up at a rumored rate of over 200 accounts per week!  Obviously if Yahoo! becomes so distracted with their current business problems that they never release IndexTools then my assessment will change, but everything I hear suggests that my Christmas 2008 prediction is more or less correct.

Despite proclamations otherwise, people still care about data accuracy

Avinash Kaushik is perhaps most loved for his exclamation, “The data quality sucks, get over it!” which to those of us tasked with explaining the unexplainable resonates like crazy.  The problem with “getting over it” is that those crazy wonks over in the business, and especially the gang in the corner offices, still want us to produce accurate reports that can be trusted over time.  If you’re not sure about this, go down the hall and tell your VP that the unique visitor counts you have been reporting all year may be off by as much as 30% in either direction (you’re not sure which), and see what he or she says …

Uh huh.

Lovett’s report seems to validate that nobody is getting over it and that accuracy is still important, especially in the vendor selection process (the number-three factor, following “flexibility of reporting options” and “ability to service business needs”).  I do disagree somewhat with John’s assessment of what to do about the problem: he seems to focus on the need for annotation capabilities in the product, but at the end of the day companies deploying web analytics solutions need defined business processes to account for tag coverage, data filtering, and cookie deletion, as well as a data collection validation process that is actually followed on an ongoing basis.
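To make the tag-coverage point concrete, this kind of validation is exactly the sort of process that can be partially automated.  Here’s a minimal sketch (the tag file name and the sample pages are hypothetical, not from any particular vendor’s implementation) of a check that flags pages with a missing or double-firing analytics tag:

```python
# Minimal sketch of a tag-coverage check: given a page's HTML, verify
# the analytics tag is present exactly once. In practice you'd feed
# this from a site crawler and run it on a recurring schedule.

TAG_SNIPPET = "s_code.js"  # hypothetical tag file name; substitute your own

def audit_page(url, html):
    """Return (url, status), where status flags missing or duplicate tags."""
    count = html.count(TAG_SNIPPET)
    if count == 0:
        return (url, "MISSING TAG")
    if count > 1:
        return (url, "DUPLICATE TAG (%d copies)" % count)
    return (url, "OK")

# Hypothetical pages standing in for crawler output:
pages = {
    "/home": "<html><script src='/js/s_code.js'></script>...</html>",
    "/checkout": "<html>... no tag here ...</html>",
}

for url, html in pages.items():
    print(audit_page(url, html))
```

Trivial as it looks, a report like this, reviewed on a schedule, is the difference between a validation process that exists on paper and one that is actually followed.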

The next big battle will be about data integration

This is something that John and I have discussed on and off for some time: the idea that data integration capabilities are key as “internet marketing” increasingly gives way to capital-M “Marketing” (and, because of this, “web analytics” is likely to give way to capital-A “Analytics”).  In light of this observation, John seems to be predicting that web analytics vendors will continue to build out functionality that allows them to be more deeply integrated into the business, while at the same time the existing Enterprise analytics vendors will enter the digital market via acquisition.

Hmm.

I think the problem with this is that these strategies have largely been tried and, for the most part, have failed to produce the expected results.  John predicts that web analytics vendors will build or acquire content management and relevance engines, which we have already seen with WebSideStory’s original acquisition of Atomz, which included the Atomz Publish platform (among other examples, mostly CMS vendors building out analytics capabilities, though Interwoven’s acquisition of Optimost is tangentially relevant, I suppose).  The same goes for Omniture’s acquisitions of Offermatica and TouchClarity.

Now I suppose it’s too soon to say whether Omniture will succeed with “Test and Target,” but there is no case to be made for WebSideStory + Atomz Publish having been successful.  Perhaps this was a problem of execution, but I rather believe that most true Enterprise shops either A) already have a CMS in place or B) are unlikely to purchase a CMS from a web analytics vendor given the otherwise complicated-but-entrenched landscape.  It does look like Omniture is still supporting Publish, so perhaps they will get traction where WebSideStory did not.  Still, I’m not going to hold my breath, especially given the recent upgrades that Lyris has launched around their Lyris HQ product and the integration of ClickTracks, mostly targeted at the same SMB market and available at a tiny fraction of Omniture’s price.

Similarly, the big Enterprise software players have all had the opportunity to invest in web analytics for years now and none have taken the plunge.  Oracle and others were widely known to be looking at the sector, but the only things that came of all that were A) Microsoft buying DeepMetrix (which became Gatineau and then adCenter Analytics) and B) Oracle buying Moniforce, which I’m not sure really counts.  In the meantime, SPSS stopped supporting NetGenesis as of February 28th of this year, and only SAS and Unica are still out there looking at deeper Enterprise integrations as far as I can tell.

Now, I have my own thoughts about the future of data integration and how web analytics will be leveraged in business, but hopefully you’ll come see me at IMC 2008 in Vancouver, where I’ll be talking about “Competing on Web Analytics,” and hear what I have to say in person.

What, are you still reading this post?

In summary, for the three of you still reading this exceptionally long post, I think John has written a great report on the state of the industry and the vendor landscape.  Every JupiterResearch client reading this blog should read it and give John a call to discuss.  Or, you could come to the X Change conference in San Francisco and talk to John in person, or you could come see both of us at Shop.ORG in September and watch us fight like cats and dogs about which one of us is right about data integration.  Up to you.

Analytics Strategy, Social Media

WAW(T) Columbus / Social Media Tools for Web Analysts

And…it’s the monthly installment of “Don’t These People Know that Wednesday Comes After Tuesday?” Also known as “Web Analytics Wednesday (on Tuesday) in Columbus.” This month’s event was graciously sponsored by Coremetrics.

We had a record turnout (um…by one), with participants from Victoria’s Secret, DSW, ECNext, ForeSee (all the way from Motor City!), Lightbulb Interactive, Highlights (current and former), Resource Interactive (current and soon-to-be former), Nationwide (former and soon-to-be-again), Franklin University, and, of course, Bulldog Solutions.

This month’s topic was “Social Media Tools for Web Analysts.” As usual, the presentation/handout was quick, and the more interesting part of the evening was the various side discussions it spawned. Several active Twitter users were in attendance: @bigbryc (who, apparently, I inadvertently “outed” as a Twitter user to some of his co-workers after last month’s WAW), @reubenyau, and @tgwilson (me).

The discussion centered around the various social media tools/sites that have web analyst-oriented activity. Presented from the perspective of…me, so by no means all-encompassing, and not really intended to be. We (mostly) steered clear of “social media measurement,” and we definitely steered clear of “leveraging social media as a marketing tool for your company.” The list of sites/tools and how/where I’ve seen them being used by the web analyst community is available in this Excel 2003 spreadsheet. I’ve tagged the sites/tools that, personally, I am a regular user of, as well as some of the sites/tools that I am likely to become a regular user of in the near future (or really ought to be a more regular user of) — print/print preview to see the two footnote indicators and what they mean.

It’s not comprehensive…and, yet, it’s longer than it really ought to be. I picked up a tip on Google Notebook, so I need to check that out.

I can’t figure out exactly how to work a couple of notes into this post, so I’ll just drop them in as non sequiturs:

  • Scott Zakrajsek was temporarily possessed by evil aliens recently. In reality, he always has and always will think that Coremetrics is the greatest web analytics tool on the planet
  • The soon-to-be-traditional Monish Datta direct reference so he can pop up on his friends’/co-workers’ Google Alerts…

As always, it was great to see the regular faces, great to see a few new faces, and we missed some of the regular faces.

Adobe Analytics, Analytics Strategy, Conferences/Community

Are you still wondering about X Change?

My good friends at Semphonic just published a bunch of video vignettes collected at last year’s X Change event in Napa Valley that are definitely worth checking out if you’re thinking about joining us in August.  Or, even if you’re already registered for the event, check out these videos and start getting pumped up for what promises to be a fantastic event!

As an example of the kind of content published at YouTube, here is my good friend and Web Analytics Association Board member June Dershewitz with some of her thoughts about the conference. Cool, huh?

There are 15 different videos at YouTube from brilliant folks like Anil Batra, Dan Shields, Manoj Jasra, Marshall Sponder, June, Phil Kemelor, Gary Angel, and even yours truly, who has never been known to shy away from the camera (although perhaps I should have that day; what the heck was up with my hair?!)

Don’t know what the X Change is? Read all about it at the official conference web site.

Adobe Analytics, Analytics Strategy, General

Omniture: Visitor Engagement is just a fad!

The same guys who want you all to believe web analytics is easy have now declared that “Visitor engagement formulas are largely another fad, just like parachute pants and the Hollywood diet. It’s a measure some consultants and vendors can pitch like snake oil.”

Omniture’s point that Visitor Engagement is a bad idea because it has subjective components fails to understand the work that folks like Jim Novo, Steve Jackson, Theo Papadakis, Joseph Carrabis and others have done; it makes me wonder if the author bothered to read anyone’s work on the subject.

Worse, it makes me question Omniture’s long-term commitment to Visual Site customers, since Visual Site (now Omniture Discover OnPremise) is, at least for now, the industry’s leading solution for creating derived measures and experimenting with visitor-level data.  The point seems to be that simple measures of success, such as those provided by SiteCatalyst, are all that are required.

Hmmm …

We pretty much had this same debate a year ago, when Avinash Kaushik disagreed with the use of calculated metrics to measure engagement, and I can see that Steve Jackson has already commented as much.  I wouldn’t normally have written about this except that the author said one smart thing when he commented that you shouldn’t “try to build a better mouse trap, when you’re not taking advantage of the one you’ve got today.”

Agreed.

If you’re thinking about trying to leverage any measure of visitor engagement, regardless of which measure you choose, you should definitely make sure your web analytics house is in order first.  Despite Omniture’s assertion, most people believe that web analytics is hard and requires a sometimes intense focus on people, process, and technology.  If you’re not staffed appropriately, if you haven’t defined your key performance indicators, if you haven’t established core web analytics business processes, and if you haven’t worked to optimize your web analytics implementation then trust me, Visitor Engagement is not for you.

A good analogy is the one provided in Tom Davenport’s book “Competing on Analytics,” where he describes how baseball teams like the Oakland A’s and my friend Judah’s beloved Boston Red Sox, and football teams like the New England Patriots, have used new and innovative metrics to evaluate the performance of players, concessions workers, and the entire fan experience.  Visitor Engagement is a new measure in web analytics, and as such it will take a special type of analytical competitor to recognize the opportunity that this “uber measure” can potentially provide.  And just as some teams have shown that they are not ready to adopt new measures to evaluate their business, some companies are simply not ready to explore complex key performance indicators in an effort to “Compete on Web Analytics.”

If you’re like most companies doing web analytics today, it is likely that you will benefit more from focusing internally and learning more about how to leverage people, process, and technology more effectively, rather than look externally for new metrics of success.  You could get a good book on the subject of fundamental key performance indicators and spend a great deal of time implementing what you learn.

But if you’re interested in learning more about an innovative metric, one that describes the behavior of (and the opportunity within) the 97% of visitors who don’t convert, a measure you can apply to your advertising, content, B2B, marketing, or lead generation site to complement your otherwise robust key performance indicator suite, and a calculation that describes the level of Attention that visitors are paying to your site, your content, your testing and targeting, and more … well then I guess you’ll have to keep reading my blog (and the work of Jim, Steve, Joseph, and a whole host of other people who are committed to working these ideas out rather than just saying “balderdash!”)

If you’re not content to just keep reading and want to know more about my thoughts on Visitor Engagement, know this: I have been exceedingly clear that my measures of Visitor and Audience Engagement are new, and in their newness there is risk in the level of insight they may be able to provide you.  I am not promising you better skin, new hair, or more friends, despite the validation that the measurement of engagement recently received when NextStage was granted a patent for their work on the subject.  But, unlike some people, I have done my homework on the subject, and I continue to have conversations with some of the best companies in the world about how they can use new measures to improve their overall use of web analytic technology.
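For readers wondering what a “calculated metric” for engagement even looks like mechanically, here is a deliberately oversimplified sketch.  The components, goals, and equal weighting below are illustrative assumptions only, not my published formula or anyone else’s; the point is simply that several normalized behavioral indices roll up into a single score:

```python
# Deliberately simplified sketch of a calculated Visitor Engagement
# metric: each session-level component is normalized to [0, 1] against
# an illustrative goal, then the components are averaged into one score.
# These components and thresholds are assumptions, not a published formula.

def engagement_index(pages_viewed, seconds_on_site, sessions_last_30d,
                     page_goal=5, duration_goal=300, loyalty_goal=4):
    depth = min(pages_viewed / page_goal, 1.0)             # click-depth index
    duration = min(seconds_on_site / duration_goal, 1.0)   # duration index
    loyalty = min(sessions_last_30d / loyalty_goal, 1.0)   # loyalty index
    return round((depth + duration + loyalty) / 3, 3)

# A casual visitor versus a highly engaged one:
print(engagement_index(2, 60, 1))    # low score
print(engagement_index(10, 600, 6))  # all components capped -> 1.0
```

Even a toy version like this makes the subjectivity debate tangible: the goals and weights are judgment calls, which is exactly why the measure demands an analytics house that is already in order.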

In the meantime, I guess I’ll put on my parachute pants, grab a glass of “Miracle juice”, and bust out the ol’ Snake Oil.

Analytics Strategy, Conferences/Community

X Change: Experts in Attendance!

You may have already seen the press release this morning about some of the industry experts that we have coming to the X Change conference in San Francisco, August 17, 18, and 19.  If not, you should check it out!  Some highlights:

  • Jim Sterne, yes … that JIM STERNE, will be joining us as a participant!  I’m not sure I have ever seen Jim at an event where he is not the absolute Master of Ceremonies and running the show, so it will be nice to see my old friend (hopefully) relax and have the time to share his own experiences in our industry
  • Many of the most influential vendors in the digital measurement ecosystem are coming, including Aaron Gray (WebTrends), John Squire (Coremetrics), John Dawes (Tealeaf), Eric Hansen (SiteSpect), Mark Treschl (OpinionLab), Eric Head (ForeSee Results), and more I am surely forgetting.
  • We’ve also invited some of the best and brightest consultants in the business including Josh Manion and Bill Bruno (Stratigent), Jacques Warren (WAO Marketing), Aurelie Pols (OX2), Andy Fisher (Avenue A/Razorfish), Matt Jacobs (Digitas), Dan Shields (Wicked Science), Craig Danuloff (Commerce360), and of course the team from Semphonic!

By design, the X Change conference will be about 1/5 expert practitioners (the huddle leaders, who are amazing), 1/5 experts from the vendor, agency, and consulting community, and 3/5 practitioner attendees.  Given that the huddle leaders are practitioners themselves, we feel that we have a pretty solid (and extremely rare) ratio of experts to attendees, and that this arrangement will provide amazing value to attendees.

It doesn’t hurt that the experts we selected are among the best in the business.

Yeah, there are some folks missing, to be sure.  It would have been great if someone from Google Analytics could have come, along with the guys from ZAAZ and folks like Steve Jackson from SATAMA and others across the U.S.  But there is always next year, right?

Anyway, if you’re thinking about joining us in San Francisco the time to register is now.  Check out the official web site at Semphonic and get signed up today.

Analytics Strategy, General

Guest Post: Web Analytics in a Recession?

This is a guest post from Corry Prohens of IQ Workforce.  Corry is a sponsor of the Analytics Demystified Job Board and one of the most plugged-in folks I know in our industry.  He’s helped some great companies find talent, and some amazing talent find great companies, which is, as we all know, one of the hardest things of all about web analytics.  Thanks to Corry and IQ Workforce for sponsoring the job board, and I hope all of you have either a safe and relaxing 4th of July or a nice respite from U.S.-based email, depending on where you live in the world!

Without further comment, Corry Prohens:

————————————————————————

This past spring I was growing concerned with the condition of the economy.  Skyrocketing oil and food prices, plummeting real estate values, an unprecedented credit crunch, investment banks folding and teetering…

The lead question for business publications and programs shifted from “Will there be a recession?” to “How long and how awful will the recession be?”

In a previous life I lived through the dot-com surge and bust as a technology recruiter.  I did NOT want to go there again.  The last few years have been very kind to our community / career landscape, and my paranoia was growing that the good times were going to end.

As a coping device and because I assumed that my colleagues shared my interest/concern, I decided to poll the community on the issue in our Summer 2008 industry survey.

It turns out that while most economists say that the United States is either experiencing or entering a recession, web analytics practitioners in the US are overwhelmingly optimistic about their career prospects in the short and intermediate-term future.

A sneak preview into the survey results shows that individuals and departments around the country are downright bullish:

  • 74% of practitioners expect that spending on web analytics will increase at their company during the recession (40% said it would increase a bit / 34% said it would increase significantly)
  • 60% of practitioners said that the recession would either increase the likelihood of hiring additional web analytics resources or have no impact
  • 17% said that their company was either somewhat or very likely to reduce web analytics headcount during the recession
  • 2% thought that the recession would have a major negative impact on their career

Thank goodness! And just to prove that these folks are answering with their heads and not their hearts, my team is literally busier right now than we have ever been.  Entering the short July 4th holiday week, we have been absolutely inundated with new requests from clients for permanent and contract web analytics resources.

As a longtime LinkedIn fan, I decided to throw the question up there last week to see what kind of response I would get.  Eight people – all web analytics practitioners – answered in a single voice:  “What recession?”

The only concrete difference / pattern that we have seen in our business over the past several months has been the exploding demand for web analytics contractors.  A year ago we were working on one contract position for every eight permanent positions.  Now contractor requests make up over a third of all new requests for resources.  I am not sure if I am ready to draw a direct correlation between the economy and the rising demand for contractors since there are several other viable explanations.

Here is the link to participate in the current survey (or to view results of previous surveys):

http://www.iqworkforce.com/survey.asp

————————————————————————

Thanks again to Corry for his support of Analytics Demystified!

Analytics Strategy, General

Welcome Paul Holstein to the Demystified weblogs!

Those of you who have been following the Web Analytics Forum at Yahoo! groups will no doubt be as excited about this announcement as I am: I am delighted to let you know that Paul Holstein, co-founder, Vice President, and COO at CableOrganizer.com will be taking over for Daniel Shields here at the Analytics Demystified weblogs.  Paul has been a long-time contributor to the Forum, both in the conversation and behind the scenes, and despite the fact that he once told me that “web analytics was just a hobby” he has demonstrated his expertise time and time again.

What happened to Daniel?  Dan has taken a big step forward in his career, co-founding Wicked Business Sciences with Paul and Nicolas Dubus and getting into the consulting game.  Given the list of service offerings these guys already have, I have little doubt they’ll be hugely successful.

I will keep Dan’s posts up for the time being, but they’ll be in Paul’s blog (although their original URLs should work just fine).

Paul has indicated that his style will be short-and-sweet, touching on relevant subjects as they come up in his business, in the Forum, and at great conferences like the X Change (where you can meet Paul in person!)

Paul’s “welcome, Demystified” post is up and I would encourage you to check it out, if for no other reason than to add Paul’s blog to your RSS feeds.  (I have already added Paul to the “All Analytics Demystified Weblogs” super-feed, just in case you want to subscribe to our writing en masse: http://feeds.feedburner.com/AllWebAnalyticsDemystifiedBlogs)

Analytics Strategy, Conferences/Community, General

Larry Freed and Lars Johannson interview me about X Change

If you’re considering coming to San Francisco for the X Change conference on August 17, 18, and 19 but are still wondering what the conference has to offer you may want to check out these interviews I did with Lars Johannson last week and Larry Freed this week.

Both guys asked good questions — my favorites were Lars asking “If someone only has the money and time to attend one conference, how should she choose between industry ‘default’ event eMetrics and challenger X Change?” and Larry’s “With a ‘total absence of sales messaging and sponsored talks,’ who will pick up the bar tab?”  Larry’s question reminded me that June, David, and I should probably start planning some kind of conference-associated Web Analytics Wednesday event since we had such a great turnout last time we were in San Francisco!

Check both posts out when you have a chance:

If you’re interested in the conference, check out the official web site at Semphonic.

Analytics Strategy, Social Media

Another Great Web Analytics Wednesday in Columbus

Tonight (Tuesday) was our fourth Web Analytics (Wednesday) in Columbus. We switched venues this month, and it looks like it’s going to stick — settling in at The Spaghetti Warehouse. Bryan Cristina’s concern was warranted — I don’t know that I’ve ever been to a restaurant that has beer on tap…but only two beers, and one of them really doesn’t count. But, I’ll deal with it. And I’ll make sure I’ve unzoomed the camera the next time I hand it to a waiter for a picture, so the flash is actually in range of the group!

We had attendees from far and wide. Judy Thaxton-Borlin from Brulant, who sponsored the evening (thanks!), headed down from Cleveland. And we had the entire Chicago office from Resource Interactive (that would be…Ty)! Unfortunately, our speaker fell through due to a scheduling mix-up — we were slated to have the Community Manager from Bazaarvoice, but settled for a couple of handouts from the recent Bazaarvoice Social Commerce Summit 2008. We had a good discussion about social media — where, when, and how ratings and feedback work on a site (Bazaarvoice’s specialty, and Nicole West of Bath & Body Works discussed how they’ve used the technology, as well as the challenges they’ve come across in mining the data and assessing the impact of the initiative). We had a conversation about Twitter — myself (@tgwilson) and Bryan Cristina (@bigbryc) being the biggest users in the group, although neither of us is a diehard advocate. That led to the tale of #wa and Twitter.

A good time was had by all. We’re planning a multi-pronged assault on various WebTrends contacts (Noe…we’re gunning for you!!!) to get beyond the Coremetrics and Google Analytics-centricity of the group.

We’re on tap to have our next one on July 15th — another Tuesday, again at 6:30, again at Spaghetti Warehouse, with Coremetrics as the tentative sponsor. Details to come at the WAW site!

Analytics Strategy, General

It is official: IndexTools is now free for everyone!

No, it’s really not, sorry.  But as long as I have your attention I wanted to talk about a thread developing at the Web Analytics Forum about IndexTools not yet being freely available to all that I think is pretty interesting.

Does anyone remember how long Google had Urchin before they gave away Google Analytics for free?  Eight months.  And everyone spent the entire time saying, “Naw, they’ll never give it away … it would be way too expensive!”  Then, remember when they did give Google Analytics away, and immediately had to stop accepting new signups because demand was so overwhelming?

So why would anyone expect Yahoo! to be giving IndexTools away for free to everyone barely two months after the acquisition?  Impatience?  Internet time?  An intense and building desire for really good tools for the best of all prices?!?

Eh?

I am very much in the “I bet it will be free” camp, but I suspect that in the midst of everything else Yahoo! has been dealing with lately that the acquisition and roll-out team will take a measured, thoughtful approach towards the next phase of IndexTools.

Based on the letter recently sent to existing partners, it kind of sounds like they want to build a few new data centers and make sure they can handle the needs of their existing customers and partners — nothing really wrong with that, is there?  I bet they’ll also take some time and think about how to avoid some of the problems the other guys had (rollout issues, service outages, extended betas, etc.)

I’ll go ahead and reiterate (and clarify) my original prediction: I think we’ll see free IndexTools for anyone and everyone sometime around Christmas 2008.  I know people around the world are still getting new IndexTools accounts through the partners — you can see that in my Vendor Discovery Tool — but Yahoo! has a lot to consider before they roll IndexTools out to the masses.  I mean, if you think web analytics is hard, you should try developing, maintaining, selling (or not selling), and supporting a web analytics application in this market …

I’ll also bet Yahoo! is going to ask for something in return for the accounts: agreeing to run ads on their sites, signing up for a developer account, allowing them to use your data to improve their search engine … something that justifies or offsets some of the cost associated with giving this fairly robust web analytics application away.

Finally, I suspect that Yahoo! will soft-pedal IndexTools when it is widely available.  What I’m hearing is that despite what a lot of people think, Yahoo! doesn’t really want to piss off Google, Omniture, WebTrends, or any of the existing web analytics providers.  But hey, why should they?  Why scream “AWESOME FREE WEB ANALYTICS SOLUTION!” when they can quietly release it and know that word will eventually get out?

Anyway, IndexTools is not free for all, at least not yet.  As Julien Coquet would likely say, c’est la vie!

Analytics Strategy, Conferences/Community, General, Social Media

X Change conference conversation leaders announced

As usual, Gary Angel has beaten me to the punch, this time with his great post about the conversation leaders we’ve announced for the 2008 X Change conference. The full line-up is included further down in this post, and you can read the press release in PDF format from the Semphonic site or download this PDF invitation to the conference more suitable for printing.

Since folks have been asking me via email what is really different about X Change, primarily to help make the case to management to attend the conference, and at the risk of sounding redundant, here are three great reasons to consider attending the conference:

  1. X Change is an “expert user” conference, and we’re doing everything we can to create tremendous value for expert users. Everyone coming to the event — the conversation leaders we’ve invited, the consultants and thought leaders we’re bringing to the event, and the select list of senior people from the vendors — has years of experience in web analytics. Their experience, combined with that of the 100 attendees, is designed to help those of you working on the cutting edge in web analytics get your concerns addressed and your questions answered.
  2. The conversational format is designed to allow every attendee to share their ideas and ask their questions, making X Change a very participatory “Web 2.0” conference. There is nothing wrong with sitting and listening — when you want to sit and listen. But the explosion of web analytics blogs, the growth of the Web Analytics Forum, and the number of web analytics folks on Twitter suggest that a bunch of us actually want to participate. X Change is the conference for the participants.
  3. We have a plan to allow you to share the insights you gain with your team back home. One of the chief complaints at last year’s conference was “I wanted to attend every session!” To help share the insights gleaned in each conversation, and help paint a picture of the industry today and where it is heading, after the event we will be publishing the “Proceedings of the Second Annual X Change Conference” document, free to all conference attendees.

If you’re still wondering about the value of the conference, or need more ideas to sell a luxurious stay at San Francisco’s Ritz Carlton to your manager, please don’t hesitate to reach out to me directly and we can chat.

The conference theme this year is “People, Process, and Technology” — the three-legged stool that all of our web analytics efforts rest upon — and we’ve broken the conversations down into similar groupings. We will have full descriptions of the conversations available online very soon, but here are the leaders, their companies, and the general topics they will be discussing.

PEOPLE

  • Steve Bernstein (PayPal): Getting Analysts to Produce Analysis and Getting the Business to Listen
  • Megan Burns (Forrester Research): Building the Business Case for Change
  • Bill Gassman (Gartner): Evolving Your Use of Analytics
  • John Lovett (JupiterResearch): Industry Standards or a Lack Thereof
  • Bob Page (Yahoo!): Web Analytics and Data Privacy

PROCESS

  • Steve Bernstein (PayPal): Driving Visitors Up the Value Chain
  • Dennis Bradley (Charles Schwab): Bridging the Gap from Web Analytics to Marketing
  • Marston Gould (Classmates.com): Where Does Web Analytics Stop and Customer Analytics Start?
  • Linda Hetcher (Avaya): Searching for Success with SEO and SEM
  • Dylan Lewis (Intuit): Campaign Analysis and Attribution Modeling: Dangerous Assumptions
  • Dylan Lewis (Intuit): Establishing a Web Analytics Center of Excellence
  • John Lovett (JupiterResearch): Data Integration: Myths and Realities
  • John Rosato (IBM): B2B Analytics: Challenges and Opportunities
  • Rachel Scotto (Sony Pictures Imageworks Interactive): Integrating Online and Offline (Market Research) Data
  • Michael Wexler (Yahoo!): Web Analytics for Brand Marketers

TECHNOLOGY

  • Dennis Bradley (Charles Schwab): Justifying the Need for Advanced Visualization Tools
  • David Cronshaw (MSN/Microsoft): Emerging Trends in Online Video: Measurement, Monetization, and Mobilization
  • David Cronshaw (MSN/Microsoft): The Metrics of Video: Cost per Engagement and Beyond!
  • Jim Hassert (AOL): Analytics Across the Enterprise
  • Jim Hassert (AOL): Managing Expectations: Panel-Based and Census-Based Methodologies
  • Seth Holladay (Rodale Publishing): Slicing and Dicing Visitors: Segmentation Strategies
  • Seth Holladay (Rodale Publishing): Tracking Non-Traditional Conversion Events
  • Judah Phillips (Reed Business Interactive): Building a Successful Web Analytics Team
  • Judah Phillips (Reed Business Interactive): Knowing When You’ve Outgrown Your Current Web Analytics Solution
  • Ron Pinsky (AIG): Data Collection: Implementation, Utility, and Ongoing Integrity
  • Ron Pinsky (AIG): Integrating Customer Experience and Marketing Data with Web Analytics
  • Bob Schukai (Turner Broadcasting): The Mobile Landscape: Challenges and Opportunities
  • Bob Schukai (Turner Broadcasting): Mobile Technology: Development, Deployment, and Measurement
  • Rachel Scotto (Sony Pictures Imageworks Interactive): Measuring Web 2.0: Widgets, Gadgets, and Social Networks
  • Jared Waxman (Intuit): Using Real-Time Surveys to Improve the Customer Experience
  • Jared Waxman (Intuit): Competitive Intelligence Tools and Methodologies
  • Michael Wexler (Yahoo!): Mobile Marketing, Mobile Measurement
  • David Yoakum (The Gap): Measuring Web 2.0: Interactions, Events, and Consumer Generated Content
  • David Yoakum (The Gap): Using Web Analytics to Inform Personalization and Remarketing Efforts

If you’re a long-time reader of my blog and you’re really interested in web analytics I would very much encourage you to consider the conference: read Gary’s post, download this PDF invitation to the conference, or email me directly so we can talk about how the conference might benefit you and your organization.

Analytics Strategy, Reporting

Baseball Stats and BI Musings Part II: Data Quality

In Part I, I took a run at assessing a couple of the most popular baseball statistics to see how they measured up as well-formed performance metrics. The other thought that has been running through my mind as I’ve been scoring my son’s baseball games has to do with data management and data quality.

Scoring a baseball game requires a couple of things:

  • Making judgment calls as to what actually happened
  • Capturing the right information on screwy plays where a lot of stuff happens (this happens a lot more in 9-year-old baseball games than it does in college or professional games)

The first item is one of the reasons why college and professional games have an “official scorekeeper.” There are some plays that are clearly fielding errors…but there are some that require a subjective assessment. And, even if there is clearly an error, it’s sometimes subjective as to whether it was a bad throw or a bad catch.

And, things can get a little complicated. For instance, if you look at this picture closely, you’ll be able to tell that my son is churning his 9-year-old legs as fast as he can (admittedly in pants that would fit most 12-year-olds) as he runs towards first base. And, yet, the catcher is standing right at home plate with the baseball, looking like he’s about to make a throw. What’s going on is either totally obvious to you — meaning you played baseball or have followed it with a decent level of interest — or it seems very bizarre. My son had just struck out. The rule in baseball is that, if a player strikes out AND the catcher drops the ball AND EITHER first base is unoccupied OR there are already two outs in the inning, the catcher needs to retrieve the ball and either tag the batter or throw the ball down to first base so the first baseman can tag first base. This is what’s called a “strike him out, throw him out.” You don’t see it very often in the major leagues or college, because catchers don’t drop that many balls. You see it quite a bit when the players are nine and ten years old.

Either way, my son had an official at bat with a strikeout, even if he made it to first base safely (if, for instance, the catcher overthrew first base). If that had happened (in this case, it didn’t), I would have needed to record a strikeout as well as an error on the catcher.
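For the rule-minded, the dropped-third-strike condition above reduces to a simple boolean check. This is just a toy sketch; the function and argument names are mine, not from any real scorekeeping software:

```python
def batter_may_run_on_dropped_third_strike(catcher_caught_ball: bool,
                                           first_base_occupied: bool,
                                           outs: int) -> bool:
    """The batter may attempt to reach first base only if the catcher
    drops strike three AND (first base is unoccupied OR there are
    already two outs in the inning)."""
    if catcher_caught_ball:
        # Routine strikeout; the batter is simply out.
        return False
    return (not first_base_occupied) or outs == 2

# The play in the picture: ball dropped, first base open, say one out
print(batter_may_run_on_dropped_third_strike(False, False, 1))  # True
```

Run against the play in the picture, it returns True, which is why the catcher was standing at the plate ready to throw.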

Sound complicated? It is, and it isn’t. Baseball has other semi-obscure rules — if a baserunner passes another baserunner, he is out. I didn’t learn that rule until I saw it happen to Baylor in the College World Series several years ago. So, scoring a baseball game correctly requires:

  • Paying close attention to every play throughout the game
  • Knowing the rules well
  • Knowing how to quickly and accurately record both “normal” plays and oddball plays
  • Being able to make the subjective calls quickly and effectively

I’ve never actually tried to verify this, but I am fairly certain that, if you take three run-of-the-mill scorekeepers and have them score the same game and then compare their results, you will get three slightly different versions of what happened. Yet, we view baseball stats and box scores as being completely black-and-white.

I worked with a data management guru at National Instruments who had a quote in her e-mail signature, attributed to Mark Twain, that said something to the tune of: “A man with one watch always knows what time it is. A man with two watches is never sure.” (I’ve tried to look up the exact wording and confirm that it indeed originated with Mark Twain, and I didn’t have much luck.) It’s an excellent point, and it applies to both baseball and business.

If we see a number that appears to be precise — 73 pitches, 10,327 visits to a web site, 2,342 leads — we equate precision with accuracy. It doesn’t cross our mind that a scorekeeper might have inadvertently clicked his pitch counter when the pitcher actually made a throw over to first base to try to pick off a runner. We ignore the fact that all data capture methods when it comes to web analytics are inherently noisy. We forget that sometimes our lead management processes break down and load a duplicate lead or miss a lead. We assume that the data that gets entered into our systems by humans gets entered by a robot rather than by a human — no judgment calls, no mental lapses. And that is simply not reality.

None of this is to say that we should throw out the data. At the end of the day, the ERAs that I calculate for the pitchers on my son’s team are going to be pretty close to the ERAs that another scorekeeper would calculate. Close enough. But it’s easy to get caught up first in assuming that precise numbers are perfectly accurate and then, when you spot a discrepancy, focusing on chasing the “right” number rather than asking, “Is the difference material?”

The moral? Well…baseball is a great sport!

Oh, wait. There’s more. Don’t rely too much on your data. Don’t expect it to be perfect. Don’t focus on making it perfect. Make sure it’s “good enough” and go from there.

Analytics Strategy, Social Media

#wa: A Twitter channel for web analytics professionals

Jason Egan started a really cool thread over at the Yahoo! group asking about people who Twitter and it got a ton of response. I think W. David Rhee from OX2/LBi is working on a master list but I thought of a more “Web 2.0” way to self-identify using Twitter channels and Twemes.

The channel “#wa” was more or less open (as are most channels, for now at least), so if you’re a Twitterer and you’re writing about web analytics, you can help create what might become a useful body of knowledge and help self-identify like-minded individuals on Twitter by simply adding “#wa” to your twits.

So I might Twitter: “#wa Boy, people sure are complaining about Omniture being slow today … is it slow for you too?”

You can follow the channel at http://twemes.com/wa or subscribe to the RSS feed http://twemes.com/wa.rss

For me the jury is still out on whether Twittering is actually useful, although I do admit it’s kinda fun (at least when Twitter is not down, which it is from time to time). Who knows, maybe the #wa channel will become the new Yahoo! group, complete with inappropriate Fred McMurray comments and everything 😉

Analytics Strategy, Conferences/Community

EUWAA Update from the WAA Board of Directors

Since summer is nearly here and since I am trying to be more European in my approach towards business (read: wanting to take long holidays during the summer) I have been somewhat lazy in my regular blogging (as opposed to my lazy-blogging on Twitter, where I have also been somewhat lazy …) But thinking about being more European brought my mind around to my idea of a European Web Analytics Association (EUWAA) that I proposed before the Emetrics Summit in San Francisco last month.

Since that post a few things have happened:

  1. My good friend Mr. Jim Sterne, Chairman of the Board at the WAA held a nice meeting over drinks in London to discuss the idea with Europeans in attendance. You can read Jim’s summary of the meeting and some interesting notes about the WAA Board of Directors over at Jim’s blog.
  2. The WAA held a more formal meeting on May 22nd with some very high-profile Europeans in attendance, including Neil Mason (England), Jim Williams (Scotland), Marianina Chaplin (England), Vicky Brock (Scotland), Zeljka Stojanovic (England), Tim Leighton-Boyce (England), Rene Dechamps Otamendi (Belgium), Oliver Schiffers (Germany), Julien Coquet (France), and Miles Bennet (England).

I certainly wish I could have been at the meeting since it sounds like it was pretty productive. Still, I have one concern as I read the summary document provided by Board Member Vicky Brock: it sounds like the pro-EUWAA conversation revolved around multiple local WAA chapters in Europe as opposed to a single, pan-European organization able to respond to the needs of “chapters” across the continent.

Perhaps I’m just reading the document incorrectly, and hopefully if I am one of the participants will write me and clarify the point. But if I am interpreting this correctly, I think this is a dangerous idea and one very likely to fail.

I suspect that the organizational process and volunteer commitment required to create and maintain multiple local WAAs would be far greater than is reasonable. Were this not the case, I suspect the WAA would have a local presence here in the U.S., instead of ceding most local web analytics efforts to Web Analytics Wednesday. Web Analytics Wednesday works because it is A) low effort and B) high value. I suspect the establishment of several mini-WAAs across Europe would be the exact opposite, at least in the short- to mid-term.

Instead, I would encourage those Europeans taking the lead on this issue (Neil, Vicky, and Rene Dechamps Otamendi from OX2) to focus first on creating a single, pan-European organization able to provide reasonable organizational structure for all Europeans. A single organization should be able to:

  • Provide focus for Europeans working in the web analytics arena
  • Create critical mass for EU-focused networking events, conferences, etc.
  • Publish in more languages to accommodate non-English speakers
  • Develop EU-focused content via blogs, social networks, etc.
  • Streamline organizational processes like reimbursement
  • Schedule meetings at a reasonable time of day locally

Fundamentally the idea of a single EUWAA was put forth to reflect the need for more value and more opportunities for European members of our global community. And while I have tremendous faith in Rene, Neil, and Vicky, I suspect that many more people will be necessary to provide the level of value I think the WAA is capable of providing.

According to the document, the WAA is planning to hold a vote of Europeans in the next few weeks designed to gauge member (and potential member) response to the idea of a separate EUWAA. It will be very interesting to see how the questions in this survey are worded and how the response is managed, but I suspect this process is in good hands.

Vicky concluded her letter with the following statement:

“Every action item in this report needs volunteers to step forward. If you would like to see any or all of the above come to fruition, please find something (research, identify, explore, review) and raise your hand.”

Nothing could be more true. Any association like the WAA or EUWAA depends on volunteer action for its lifeblood. Hopefully all of you who commented on my last post, and all of you who wrote me directly in support of the idea of a EUWAA, will take Vicky’s point to heart and step up when the time is right.

Since I’m not European, I suspect my involvement from this point on will be tangential at best (although I’m not opposed to volunteering my own time), but I welcome your comments and critiques of the ideas I put forth in this post. Also, if you’d like to know when the pan-European vote on this subject is, and if you’re not a WAA member, please don’t hesitate to comment or write me directly and I will make sure your email address is passed along.

I must say I am tremendously encouraged by the work that Jim, Seth, Neil, Vicky, Rene, and many others have already contributed to the general idea of providing more member value in Europe, especially in such a short period of time. I only hope the momentum will continue towards a product that everyone involved can be as proud of as Rand, Greg, Jim, Bryan, Seth, and the rest of the WAA founders surely are of their work.

Analytics Strategy

Multidimensional Lead Scoring in 8 Minutes

Talk about milking a topic in multiple channels! The Marketing News Radio interview I did on multidimensional lead scoring is now available. And…we’ve now added a “video white paper” to the mix, which you can view on the Bulldog Solutions web site [Update: since the redesign of the Bulldog Solutions web site, and, presumably, due to my departure from the company, the video no longer seems to be available.]

The story behind how this 8-minute video came to be is kinda interesting. At Bulldog Solutions, we’re constantly (too constantly, at times) looking for ways to put new Internet-based technologies to work for our clients. Generally, we like to put them to work for ourselves first. First, that lets us suffer through the hiccups and snags that come the first time you do anything new. And, second, it gives us something that we can use as an example of what it is our clients would get.

The truth is, this video white paper was driven more by the fact that we were starting to talk to prospects where the medium seemed like a fit than it was part of a glorious, fully-baked, multi-channel strategy for marketing multidimensional lead scoring.

In the case of the radio interview, our Field Marketing Manager pinged me months ahead of time about doing the show. She then wrote up an abstract for the interview. We had a timeline that we worked to, and the preparation included a pre-call with the interviewer the week before the show to get everyone on the same page and lay out the overall flow for the interview. And, the show itself was on my calendar for months.

Contrast that with the video white paper experience. I was in Austin for a week a couple of months ago. I flew in on Monday morning. On Tuesday, the same Field Marketing Manager said, “Tim, we’re going to be shooting a video of you talking about multidimensional lead scoring for 5-10 minutes on Thursday. It’s no big deal — just you talking through the concept. But, we’ve got a prospect who is interested in having us do some of these for them, and we need an example.” On Thursday morning, at the hotel, I realized that I’d brought all of one collared shirt on the trip (it’s Austin, we’re a startup — casual attire is the norm). The shirt was dark blue with some plaid-like stripes. I wore it to the office, and our video guru took one look and said, “We’re shooting you on a green screen — that’s a risky shirt to wear, as you may wind up with holes in your body on the final video.” At lunch, I headed to Stein Mart to pick up something that was solid (I clothes shop voluntarily approximately once every 3 years — this was not “voluntary”). I then spent an hour in one of our conference rooms with the A/C turned off (to eliminate background noise) doing three takes on the video.

The project then actually languished for a while (the urgent opportunity died or evolved to be something else — I’m not sure which). But, our video guru jumped back on it last week…and the end result turned out much better than I imagined it would, given the material he had to work with!

Analytics Strategy

Google Analytics = Strawberry?

Google Analytics recently had a bit of a hiccup with their data processing, which caused the tool to present inaccurate data for many users’ sites for the period between April 30th and May 5th. Google is reprocessing the data, and they expect to get most of it corrected. But, the situation triggered a lengthy and heated debate on the webanalytics Yahoo! group about what Google Analytics is/is not good for.

Stéphane Hamel assessed the situation using a strawberries/oranges/piña colada analogy that was pretty slick. I wrote up my thoughts on his thoughts over on the Bulldog Solutions blog.

(Yes, I’m still struggling to answer the weeks-old question: “If you contribute to two blogs, neither of which has a particularly large subscriber base, to which one should you post?”)

Analytics Strategy

Being Tim Wilson: Data Management, 2,142.7, and My Gilligan Moniker

It’s a bit of a curse in the world of social media and personal branding to sport a John Smith-like name, and “Tim Wilson” certainly qualifies. From a data management perspective, I definitely fall into the bucket where “FirstName + LastName” is nowhere close to a unique identifier. I’ve had some interesting “<groan>” moments over the years on that front:

  • High School in Sour Lake, Texas — my sister’s name is Kim, and our 400-person high school had two Kim Wilsons and two Tim Wilsons…and a lousy intercom. My sister and I kept our noses pretty clean, which was not the case for our counterparts (who were not related to us or to each other); we both got used to ignoring any between-class announcements for “..im Wilson, please come to the office.” Generally, someone was in trouble, and, generally, it was not one of us.
  • Registering for College Classes — as a small-town teenager in the big city of Cambridge, Massachusetts, the registration card that got handed to me was for Tim Wilson the graduate student, who was also registering for classes. That caused a bit of confusion that took a day or two to straighten out. Interestingly, that Tim actually cropped up on Jeopardy last month. I catch maybe 10 minutes of Jeopardy a year, but I happened to catch the end of Final Jeopardy a few weeks ago and saw Tim Wilson. He’s now a professor. In a quick Google search to track him down — using “Jeopardy” as part of the search — I turned up a post where the blogger calculated (with some help from the commenters…one of whom was Tim promoting his own appearance) the maximum theoretical one-day take on Jeopardy: $566,400.
  • Buying a House while On the Lam — when Julie and I were working on the mortgage approval paperwork for our first house, we hit a snag, in that “I” had a 4-year-old outstanding arrest warrant in Austin. We’d only lived in Austin for a year, and I had had no brushes with the law of any sort during past visits to the city (see first bullet above).
  • I’m a Funny Guy…but Not Professionally — in the early days of e-mail, I used to get occasional messages from random people telling me how funny I was and asking when I would next be in [random city]. These were followers of Tim Wilson the Deep South Comedian. I was tempted to respond by telling them I would be at [some club in the city they were asking about] on [a date in the near future] as an unpromoted surprise show, and that I would leave four tickets in their name at the door. I ultimately decided that would be too cruel. <sigh> See bullets above.

You get the idea. So, when I started this blog, I partially stole a page from Avinash Kaushik’s figurative book, in that his blog is Occam’s Razor by Avinash Kaushik. “Gilligan on Data by Tim Wilson” was born.

So, where did “Gilligan” come from? That’s a trail name I picked up when thru-hiking the Appalachian Trail from Georgia to Maine back in 1993 when I graduated from college. It’s a tradition on the Appalachian Trail to adopt a new moniker for the duration of your hike. There are practical reasons for this (“Buck,” who I met and hiked with quite a bit on the trail, was a female who was hiking alone, and she thought it wise not to advertise that fact when signing trail registers along the way). There were descriptive names — “Chowhound” really did eat like a horse, and “Bearanoid” did have a heightened fear of encountering a black bear.

At the time, the Appalachian Trail was 2,142.7 miles from the summit of Springer Mountain in Georgia to the summit of Mount Katahdin in Maine. The length of the trail varies from year to year as improvements are made and routes are updated, but that is what it was in 1993. I’m using my personal blog to enter my journal entries from that experience 15 years later. “Gilligan” — a name chosen largely in haste one evening with the assistance of the other hikers I was camping with that evening — remains fairly appropriate. I’m still clumsy. I still get myself into any number of improbable predicaments. And I’m skinny as…okay, so maybe that Bob Denver characteristic no longer particularly applies.

If “I had a common name” is one of the larger hurdles I face in my life, I’ll count myself fortunate, I suppose. But, sites like searchme.com just aren’t as much fun for me as they are for those of you who have more distinct names.

Analytics Strategy

Another Successful Web Analytics Wednesday in Columbus

I just got back from our third Web Analytics Wednesday — hosted on a Tuesday, because, doggonit, that’s just how we roll in Columbus. Deal with it!

We had a great guest speaker — Ken Barhoover from Brulant drove down from Cleveland and gave an excellent presentation on A/B and multivariate testing. Brulant uses Optimost for the most part, but they are tool-agnostic. The presentation covered both the basics of “why” on A/B testing, but Ken also went into some decent detail as to the logistics behind implementing A/B and multivariate tests. Very interesting stuff.

Finally, another Twitter user in the group, @johnboker, attended!

And, thanks to Analytics Demystified for sponsoring the event!

Analytics Strategy, Reporting, Social Media

Measuring ROI Around Web 2.0…and More Webinars (geesh!)

Awareness (the company) has a Measuring ROI Around Web 2.0 webinar this Thursday, May 22, at 2:00 PM EDT. That’s heavy on the buzzwords, but it sounds like it might have some interesting information. And, I found out about it thanks to a mention on Twitter from Connie Bensen, who will be leaving her new kayak behind and heading to London and Paris for some R&R, so will be missing the live event herself.

Unfortunately, it partially conflicts with Kalido’s What’s Behind Your BI? webinar, which starts at 2:30 PM EDT, and it conflicts with Fusing Field Marketing and Sales, which Hoover’s and Bulldog Solutions are putting on at 2:00 PM EDT on Thursday as well.

It looks like I’ll be doing some on-demand catch-up after the fact.

Adobe Analytics, Analytics Strategy, Conferences/Community, Reporting

Web Analytics Wednesday San Francisco Metrics and KPIs

Web Analytics Wednesday in San Francisco this week was an amazing success by every conceivable measure. But don’t take my word for it, here are the metrics and key performance indicators:

  • Budget for the event: $10,000.00
  • Actual amount spent: $14,500.00
  • Percent over budget: 45%
  • Percent extra expenses graciously covered by ForeSee Results and Tealeaf: 100%
  • Planned number of sponsors: 4
  • Actual number of sponsors: 5
  • Percent sponsors interested in this event: 125%
  • Estimated satisfaction of sponsors based on feedback sample: 100%
  • Projected number of attendees: 200
  • Projected expenditure per attendee: $50.00
  • Actual number of attendees: 400
  • Actual expenditure per attendee: $36.25
  • Percent of actual budget spent on drinks: 50%
  • Estimated number of drinks served: 1,450
  • Estimated number of drinks consumed per attendee: 3.6
  • Number of hours spent serving drinks: 1.5
  • Estimated number of drinks consumed per hour: 967
  • Estimated number of drinks consumed per hour per person: 2.4
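For anyone who wants to check my math, the derived figures above fall out of simple division from the raw numbers in the list. A quick sketch (values taken straight from the bullets; only the arithmetic is mine):

```python
# Recomputing the derived event metrics from the raw numbers above.
budget = 10_000.00
spent = 14_500.00
attendees = 400
drinks_served = 1_450
hours_serving = 1.5

pct_over_budget = (spent - budget) / budget * 100         # 45.0
spend_per_attendee = spent / attendees                    # 36.25
drinks_per_attendee = drinks_served / attendees           # 3.625, i.e. ~3.6
drinks_per_hour = drinks_served / hours_serving           # ~966.7, i.e. ~967
drinks_per_hour_per_person = drinks_per_hour / attendees  # ~2.4

print(f"{pct_over_budget:.0f}% over budget, "
      f"${spend_per_attendee:.2f} per attendee, "
      f"{drinks_per_hour_per_person:.1f} drinks per hour per person")
```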

I think the key measure of success is really satisfaction but I totally forgot to ask Larry Freed’s folks at ForeSee Results to conduct a survey during the event, we weren’t tagged with Coremetrics tags, and SiteSpect wasn’t able to test due to incredibly cramped conditions so we’ll have to rely on your comments and June’s pictures for the time being to make that determination. Maybe someone will post Tealeaf-esque replay video so we can estimate satisfaction based on qualitative data…

Speaking of the sponsors, I really want to thank all five sponsors of the event for their participation, willingness to help out, and excellent attitude … especially when the crowd volume prevented them from getting a word in edgewise during their 15 seconds of fame.

Suffice to say we could not have thrown a party like this without the help of these fine organizations.

I was also really pleased to see some of our industry thought-leaders out for the event, folks like Gary Angel, Jim Sterne, Larry, Judah Phillips, Brett Crosby, and Avinash Kaushik who has never attended Web Analytics Wednesday as far as I know but who just joined Google full-time, eschewing independent consulting for good old-fashioned job stability — congratulations Avinash and congratulations Google!

I was even more pleased to see many members of the Web Analytics Association Board of Directors at the show, including Jim, June, Avinash, Bryan Induni, April Wilson, Richard Foley, and probably a few more I am forgetting. I think this is great since the WAA has what can only be described as an estranged relationship with Web Analytics Wednesday … hopefully we can get that relationship worked out in 2008 so these two great organizations can work together for the benefit of our entire community!

Anyway, thanks to June, David Rogers, and all the volunteers and sponsors who made this great event happen. Mr. Sterne hinted that he’d like Web Analytics Wednesday to happen concurrently with every Emetrics conference around the world so hopefully we can work that out and take this great party on the road.

Analytics Strategy, Conferences/Community

Coremetrics and SiteSpect: Global Web Analytics Wednesday sponsors!

As Web Analytics Wednesday continues to gain momentum around the world leading into what is shaping up to be the largest event in WAW history in San Francisco on Tuesday, May 6th, I am pleased to announce that Coremetrics and SiteSpect have both signed on to be global sponsors of this event.

You can read the press release here, but the essence of this announcement is that these great companies share my vision for what Web Analytics Wednesday is becoming — the largest and most widely recognized gathering of web analytics professionals in the world. In sharing that vision, their management teams have graciously decided to provide financial support to allow Analytics Demystified to provide reimbursement to a larger number of events around the world.

Starting today, we are pleased to be able to help emerging Web Analytics Wednesday chapters cover the cost of providing food and drink to participants. You can read about the WAW sponsorship policy here and if you have any questions please don’t hesitate to write me directly.

On behalf of myself, my wife Amity who provides event support, June, and every Web Analytics Wednesday host and participant around the world, I want to thank Coremetrics and SiteSpect for their vision and contribution. So far in April nearly 700 people have signed up to participate in Web Analytics Wednesday, and we’re well on the way to beating my goal of 5,000 participants worldwide in 2008!

In other Web Analytics Wednesday news …

  • I will be in New York this coming Wednesday giving a short Web Analytics Wednesday presentation on “The Future of Web Analytics” at the headquarters of Avenue A/Razorfish. This event is being organized by Joel Collymore, one of the WAW hosting superstars, and promises to be a great event (over 115 people are signed up to attend already!) Sign up here to join us since we need to add you on the security list at the door!
  • Don’t forget, if you live in the Bay Area in Northern California or are coming to Emetrics, we are having what is likely to be the largest WAW event in history at the Fluid Nightclub on Mission Avenue. This special WAW on Tuesday, May 6th is open to all web analytics practitioners regardless of whether you’re going to Emetrics or not so please sign up to join us!
  • Just in case you live in French Polynesia, the first ever Web Analytics Wednesday event is being held on Bora Bora in Tahiti. Steve Jackson, blogger, consultant, and likely soon-to-be EUWAA leader is heading to Bora Bora for his honeymoon and has offered to help spread the good word. While his new bride will almost certainly not think it amusing, if you can get to Bora Bora I’m buying the drinks!

I am especially excited about the San Francisco event because Coremetrics, ForeSee Results, SiteSpect, and Tealeaf have all made significant sponsorship donations, giving us quite a lot of latitude to entertain. It looks like our efforts are paying off, as the sign-up list is a virtual who’s who of web analytics thought-leaders, practitioner talent, and consulting genius!

Analytics Strategy

The Data is Pristine and Accessible. In My Mind's Eye!

Tamara Gielen has a post titled Triggered Email Is Only As Good As Your Data over on the B2B E-mail Marketing blog. She describes a scenario where you want to send a satisfaction survey to customers 90 days before their contract expires with your company, and adds on some logic of resending the survey to customers who do not respond to the initial request. Her point is — that may be a lot trickier than it sounds if you’re actually living in the real world:

  • The data needed to trigger and send these e-mails lives in different systems
  • You don’t have a good way to determine who at the account should receive the e-mail (you probably don’t want to send the survey to everyone you have in your systems for the account)
  • You don’t have a mechanism for updating your data when someone leaves the account and is no longer an employee there
  • The list goes on…

The real world would put you in a position of needing to make some unpleasant decisions:

  • Do you cast the net broadly and risk sending a non-applicable e-mail to a bunch of people in your database, or do you cast the net very narrowly and only send to people you are absolutely sure are the right folk…but then limit the impact of doing the survey in the first place?
  • Do you limit the amount of manual cleanup on the data, or do you engage your sales and account management groups to manually flag who should receive the invitation (or something in between)?
  • Do you try to explain all of the caveats of the data to the person who initiated the project, or do you just make a series of judgment calls and be prepared to defend/explain them if asked later?

The truth is that, in most cases, this sort of initiative does make sense, but it also requires making a long list of trade-offs, assumptions, and judgment calls to balance the expected impact with the effort required.
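
The trigger-and-filter logic described above can be sketched in a few lines. This is a minimal illustration only; the field names (`contract_end`, `is_active_employee`, and so on) are hypothetical, and in practice this data would live across the different systems the post describes rather than in one tidy list:

```python
from datetime import date, timedelta

def pick_survey_recipients(contacts, already_sent, today=None):
    """Select contacts whose contract expires 90 days from today and
    who have not already received the survey invitation.

    `contacts` is assumed to be a list of dicts with hypothetical
    'email', 'contract_end', and 'is_active_employee' fields.
    """
    today = today or date.today()
    target = today + timedelta(days=90)
    return [
        c for c in contacts
        if c["contract_end"] == target       # the 90-day trigger window
        and c["is_active_employee"]          # data you may not reliably have!
        and c["email"] not in already_sent   # resend logic lives elsewhere
    ]
```

The hard part, of course, is not this loop; it is deciding how trustworthy each of those fields really is before you let it gate an e-mail send.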

The entry brought to mind some interesting data integrity snafus that I’ve come across in my personal life:

  • For years, the phone company thought my name was “Jim” rather than “Tim” — initially, this only impacted caller ID, but, over time, that list got sold to various direct marketers, and “Jim” started getting junk mail
  • Several months after we moved to Ohio, I went into REI in Austin, and when they looked up my account, they had my Ohio address with a phone number of my former employer — none of this was information that we had ever explicitly provided
  • After moving to Ohio, we signed up for a Giant Eagle grocery store card, with our new address and phone number; somehow, a couple of months later, when I didn’t have my card with me and they had to look me up…they had the address as our address in Austin!
  • I recently shifted my cell phone plan from my wife’s and my joint account with T-Mobile to my company’s account with AT&T; I found out last week that I show up as “Julie Wilson” on caller ID when I make calls with my cell phone
  • For years, Microsoft was convinced that I was a high-level IT manager responsible for my company’s system administration and infrastructure — I’ve never been within a light-year of that sort of role

All of this is to say that the data is messy. It’s never going to be perfect. Spending time and energy on data integrity initiatives makes sense, but that has to be balanced with the practical reality of the world — short of an Orwellian society, the data is always going to have some level of inaccuracy. Understand the data. Understand what you’re trying to accomplish with it. And then make a judgment call.

Photo by Daquella manera

Analytics Strategy, General

Now I too am a lazy blogger …

Because I have finally, after much goading, joined the Twitter generation. It took Aaron Gray from WebTrends and like 11 beers (which I felt this morning, mind you) after a very successful Web Analytics Wednesday event here in Portland to get me to join Twitter. Hell, I didn’t even join after meeting Biz Stone and boating around Rotterdam with him last summer (sorry Biz!) But Aaron made me wonder how Twitter streams might be used in the engagement calculation so like a cat to milk I went running.

Incidentally I did not say “Twitter has no value” or at least I don’t think I said that.  I suspect there was some qualification involved (although see my above comment about 11 beers … sheesh!)

Thanks Aaron. Yet another excuse to play with my iPhone, not my kids. You rule.

Want to follow me? I’m easy to find!

Analytics Strategy

Lead Scoring Revisited…or at Least Reiterated

One of the areas that I’ve spent a lot of time focusing on over the past six months is lead scoring. Specifically, multidimensional lead scoring. I’ve written about it here before. We shot a 10-minute video white paper a few weeks ago, which should be available soon. The slides I talk through in that video are available below courtesy of slideshare.net.


And, just yesterday, Laura Ramos posted an entry diving into lead scoring over on the Forrester Interactive Marketing blog.

All this is to say that the topic remains very much on the top of my mind. And, as is no great surprise, I’ve got additional thoughts that I’m still working out on the subject — mostly driven by our experiences putting multidimensional lead scoring into practice at Bulldog Solutions. I expect I’ll be working through some of these over on the Bulldog blog in the coming weeks. But, I thought I’d start here to at least get a little bit of a list going.

To start with, we haven’t uncovered anything that invalidates any of the concepts in the white paper. Quite the contrary, actually. It has been very well received by everyone to whom it has been presented and has sparked some interesting conversations. And, as we’ve put it into practice, it’s holding up to be as viable an approach as we expected.

The two areas where I’m most looking to flesh out my thinking are as follows:

  • The Buying Cycle Dimension — Partly because the paper itself was getting to be darn long, and partly because we were not yet in the process of implementing a third dimension, the paper only briefly speaks to the “buying cycle position” dimension of lead scoring. The paper was initially titled “2-D” lead scoring rather than “Multidimensional” lead scoring, and our CMO pointed out that this was overly limiting — we were only implementing two dimensions at the time, but we knew there were others that we would be moving on to. Specifically, he was referring to the buying cycle position. This is a bit dicey to work out, but it’s a straightforward concept: a lead who is checking out detailed product specifications and pricing on your web site is likely much closer to making a purchase decision than a lead who is simply reading white papers or analyst articles on a topic.
  • Multiple Offering Types — this was flat out an oversight on my part, largely due to the fact that I’m at a startup that does not yet have a wildly extensive set of offerings. But, in my prior life, this issue very much was at the fore, so missing it was definitely a whiff. The issue is that one lead may be highly qualified for one type of offer but very much not qualified for another offer. In our case, we have offerings where the economic buyer is a marketing executive. We have other offerings where the economic buyer is a marketing manager. The marketing executive is actually not a qualified lead when it comes to some offerings. And vice versa. So, how does this get handled with the lead scoring? I think that this situation is an indication that you may have multiple profile scores — one for each broad offer type. I don’t think the engagement score changes from offer to offer — this is a measure of engagement with the company / the brand and is offering-independent. And, I don’t think the buying cycle position will differ, either. But, I’m still working this out. We’re testing it out manually with some upcoming activities.

One of the benefits of multidimensional lead scoring is that it makes it manageable to add nuance and complexity over time. And, while lead scoring is primarily a lead qualification mechanism, it serves other purposes as well. It is one factor in segmenting your database for different nurturing programs (or for content within those programs), based on which quadrant of the multidimensional matrix the lead falls in. And it can be used for lead routing: the quadrant the lead is in may dictate whether the lead is routed to an inside sales organization or directly to another organization within Sales.
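
The quadrant-based routing idea above can be sketched very simply. The 0–100 scales, the threshold, and the route names here are all hypothetical placeholders for illustration, not the actual scoring model from the white paper:

```python
def quadrant(profile_score, engagement_score, threshold=50):
    """Place a lead in one quadrant of a 2-D lead-scoring matrix and
    return a (hypothetical) routing decision. Scores are assumed to
    run 0-100; real thresholds would be tuned per business."""
    good_fit = profile_score >= threshold       # profile dimension
    engaged = engagement_score >= threshold     # engagement dimension
    if good_fit and engaged:
        return "route to sales"                 # qualified and active
    if good_fit:
        return "nurture"                        # right profile, not yet engaged
    if engaged:
        return "monitor"                        # active but poor fit
    return "low priority"
```

Handling the multiple-offering-type problem would then mean maintaining one profile score per offer type and calling something like this once per offer, while the engagement score stays shared.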

As always, let me know if you disagree or can add color from your own experiences.

Analytics Strategy

Free IndexTools: Analysis and Market Implications

Last Wednesday I had the honor and pleasure of helping break the story about Yahoo’s acquisition of IndexTools. My post was pretty well received, generating over 40 comments and a handful of citations in the popular press. In that post I speculated on two things:

  1. That Yahoo! would offer IndexTools at no charge by Christmas 2008
  2. That this acquisition was potentially a permanent game changer

This morning Dennis Mortensen sent me an email with the following message:

“your predictions was not that far off!
:-)”

At his blog Dennis announced that Yahoo! was now waiving all fees for current clients, essentially making IndexTools free. Dennis also announced that current IndexTools partners would be able to deploy new IndexTools implementations free of charge. Finally, Dennis commented that for the time being IndexTools would be closed to new customers, giving Yahoo! a chance to determine what their next move would be.

Christmas, Tax day, whatever. Free is free.

While I am already on record as believing that Yahoo made an excellent strategic move in acquiring IndexTools, the “make it free now” decision is absolutely freaking brilliant and the guys who pulled the trigger deserve a medal.

In one fell swoop, Yahoo! and IndexTools:

  1. Assuage all of the fears of current customers regarding the transition, any hiccups, etc. by offering the best possible olive branch in the world: free stuff! Worried about the transition? Okay, don’t pay us. Concerned about your data? Okay, don’t pay us. Don’t like Yahoo! that much? Okay, don’t pay us.
  2. Create instant demand for the product, and an instant revenue stream to the few companies that have been loyal to IndexTools all along. Imagine your glee if you woke up this morning and learned that you’re part of a limited channel able to offer IndexTools at no charge to your clients? I know I’d be pretty happy!
  3. Serve notice to your competitors, whoever those competitors are, that you are not messing around and that they need to get their act together quickly. Regardless of what Google says (or rather, has not said) and what Omniture says, IndexTools competes directly with both of them for business (and with everyone else in the sector, to be sure.)

This third point is important, and it speaks directly to why I think this acquisition changes our market more or less forever. So far the team at Google Analytics has been eerily silent — or if they have spoken I have missed their comments — but I have already talked to multiple people who have basically said, “It’s a no brainer, I’ll deploy IndexTools and either run Google Analytics for validation or drop GA entirely.” More importantly, I’m talking to some pretty bright consultants that are chomping at the bit to get away from having to make excuses and write hacks to make Google Analytics do stuff it is not designed to do (screen scraping, really?!) and use a more full-featured application.

It’s not that Google cares all that much I suspect; rather I think they’re more than happy to have a real competitor in the free market, especially one that offers a logical “step up” from the basic functionality that Google Analytics has to offer. I won’t be surprised at all if Brett Crosby publicly welcomes Yahoo! (back) into the web analytics arena in the same way he welcomed Microsoft to the game last year at Emetrics. I’ll be less surprised if Google’s Analytics Evangelist has a happy post about IndexTools sometime this week, especially given his historical evangelism for IndexTools and professed respect for Mr. Mortensen.

The real challenge I see on the horizon is for the paid service providers, especially those focusing largely on the mid- to higher-end of the market (which is to say everyone.)

Consider the following:

  1. Like it or not, IndexTools is pretty much as good as most of the best analytics offerings out there today. It may not be as pretty, it may not be as “AJAX-y”, it may not be as fast, but this is an offering that goes toe-to-toe with the “mass market” analytics offerings from all the major vendors and in my opinion is every bit as good. Why would I say that? Simple, I have been running IndexTools on my web site for the past six months. IndexTools is not easy, but it is no more complicated than anything else out there (IMHO.) If you look around at comments and what some people are saying you’ll hear the same thing over and over: Companies of all sizes have been selecting IndexTools for years based on rich, comparative functionality made available at an affordable price point. IndexTools has great filters, great segmentation, great custom report building tools, great extras like color coding, notes, and dashboards. Did I mention it is now all available for free?
  2. The complaints about IndexTools are, for the most part, underwhelming. One (nameless) vendor has said that IndexTools is inferior because it wasn’t included in the last Forrester Wave or JupiterResearch Constellation report. The same vendor said that IndexTools is inferior because someone else said most of their clients are SMBs. The same vendor tries to create FUD around the data center being in Eastern Europe. The same vendor says that IndexTools doesn’t have “deep domain expertise” or “specialized services.” The problem is that A) both Forrester and JupiterResearch focused only on U.S.-based vendors and (I believe) excluded IndexTools based on geography, not functionality, B) IndexTools does have some very large clients (Vodafone, PriceRunner, Tesco, ToysRUs — read this interview with Dennis Mortensen for details), C) I suspect Yahoo! will be doing some work on the data collection and reporting architecture over the next few months, and D) IndexTools is very likely to follow Google’s model of relying on external web analytics experts to provide expertise and specialized services (YAAC, right?)
  3. The valid complaints about IndexTools are either being addressed currently or are likely to be addressed very shortly. Phil Kemelor from Semphonic, who also does some work for CMSWatch, has looked at the major analytics applications perhaps more than a normal human being should in the process of writing his really, really big book on web analytics tools. Regarding IndexTools, Phil said the following:

“IndexTools does not offer the functionality that distinguishes it from Omniture and WebTrends — for example the ability to analyze unaggregated data from a graphic UI and to perform repeatable Excel reporting. For now, you must use regular expressions to analyze unaggregated data and do manual updates of Excel…just like Google Analytics.”

Good assessment, Phil, and a reasonable critique of IndexTools. I would only offer that all the major vendors require add-on applications to analyze truly unaggregated data from a graphic UI (Omniture Discover, WebTrends Visitor Intelligence, etc.), and on this point IndexTools has already shown quite a lot of people “Rubix” which is the inevitable response (and which I sincerely hope that Yahoo! decides to release, even if they make us pay something for it.)

Personally I think that the ability to analyze unaggregated data should be in a separate interface, one designed for expert users working in the web analytics hub, and that IndexTools following the SiteCatalyst/Discover, WebTrends Analytics/Visitor Intelligence, NetTracker/NetInsight, etc. model is the right decision.

Regarding the “repeatable Excel reporting” … in my interface under “Settings” I have something called “Scheduled Email Reports” that seems to work pretty well. I can add a bunch of reports and schedule them for delivery in whatever format I choose; it’s not exactly what I want (having used HBX Report Builder, still the gold standard for Excel integration IMHO) but it certainly fills an important need for web analytics practitioners. This may be an IndexTools 10 feature that I am a BETA tester for, which may explain the confusion …

Finally, I would personally offer that Google Analytics and IndexTools are (in their current state) dramatically different applications targeting very different audiences. I am sure to take endless shit for this but I believe that Google Analytics is a great entry-level tool, something designed to seem “easy” and get folks used to the idea of doing web analytics on a professional level; IndexTools is not a great entry-level tool, rather it is a rich analysis engine that is the next logical “step up” from Google Analytics for practitioners and companies needing robust segmentation, customization, and drill-down capabilities.

Don’t believe me? Go to Google Analytics and create the following segment then apply it to your most commonly used reports:

In the body of the document, which I hope I am okay quoting a little bit from, Phil also says (and I paraphrase) that IndexTools strengths include “on-screen drill down detail in reports; ad-hoc analysis features; dashboard presentation and customization” and that in addition to the weaknesses listed above, that a weakness is that IndexTools requires the “manual entry of distribution list recipients.”

Those of you who have seen Phil’s book know that he has done an absolutely amazing job summarizing the strengths and weaknesses of each tool. In his conclusions, Phil says the following:

“IndexTools should receive consideration if you want a well-priced, commerce-focused reporting solution and do not want to pay Omniture, WebTrends, or Coremetrics prices. If most of your users are part-time analysts and marketers who basically need reporting, IndexTools may be a reasonable selection. If you require complex slicing and dicing, IndexTools should still be on your list. Automated data integration and multiple sites with huge volumes of traffic and multiple campaigns may present challenges to IndexTools. However, because the company has a history of accommodating custom requirements, you should consider the possibility of IndexTools meeting your needs even though you want something outside of their standard feature set.”

My interpretation of this is more or less “look at IndexTools” as part of your consideration process. There are instances where IndexTools may not be appropriate — absolutely true, no application is all things to all people — but if you compare the assessment above to the conclusions provided for Omniture and others I think you’ll see a favorable recommendation for IndexTools (at least I did.) If you’re interested in reading the rest of what Phil had to say, I strongly recommend buying a copy of The Web Analytics Report 2008 from CMSWatch.

Some things are missing in IndexTools 9, mostly the ability to create custom metrics (something I have become pretty used to in Visual Site) and a couple of other minor things I’d like to see in a mass-market analytics solution. But I think there are quite a few people who will be willing to look the other way on small points like these given the price point. My basis for saying this? Simple: a rumored 1.2 million Google Analytics deployments and the army of people willing to look the other way regarding the limitations in GA …

Which brings me to why I believe IndexTools is a permanent game changer:

  1. The paradigm shift I cited in my last post on the subject is going to happen a lot sooner than some people thought. Now, if you know an IndexTools partner, or soon if not, companies really don’t have to worry about the vendor selection process. If you’re new to web analytics you can get Google Analytics; if you’ve pushed past the limited functionality in GA, you can get IndexTools. Total cost for tools: nothing. Companies will be able to (finally) focus on how the tools are used and the process of doing web analytics, not haggle with vendors over pricing, fight with IT over implementation, etc., which is exactly where we need to be. Web analytics is not about the tools, web analytics is about how the tools are used to improve the business.
  2. The existing for-fee vendors have been served notice and will have to figure out a better sales proposition than “the competition sucks.” I’m willing to be wrong on this point, but I don’t think the current “anti-IndexTools” messaging I’m hearing is likely to hold up under scrutiny. Eventually buyers are going to realize that they’re talking to sales people, some of whom are somewhat integrity-challenged, who will say anything to get them to look away from Yahoo’s offering. This rocks, in my opinion, because it spells the end of the negative selling that has been a hallmark of some vendors’ sales efforts. Let’s focus on what makes you truly different, given that I can get something very similar for free, huh? As far as the claim that IndexTools sucks because the analysts don’t cover it? Um, are you sure?
  3. For the existing for-fee vendors to continue to thrive, they will need to move quickly up-market and focus on the needs of a very sophisticated audience. This is really very interesting since it highlights a growing schism between vendors trying to own the Enterprise and those trying to play nice with others. Don’t know what I mean? Look around for things like “Closed-loop Marketing” and the implication that you should be bringing all your Enterprise data into your web analytics system; compare that messaging to the idea of open architectures and the notion of integrating appropriate web analytic data back into the rest of the business. In fact, now that the pricing battle is coming to an end, I think that this is the next really interesting conversation we’re going to have …

So we start to focus on the application of the tools, not the tools themselves. Game changing. The vendors are forced to refine both their offering and their sales process. Game changing. Consultants have better free tools to work with. Game changing. Web analytics technology is pushed further along towards being a commodity. Game changing.

I’m not one to make a bunch of predictions, but I would challenge those of you who disagree with my assessment to set an alert in your calendar for twelve months from today. When the alarm goes off, take a look at the adoption rate for IndexTools, the trading price for OMTR, and the ownership status of the remaining privately held web analytics vendors in the marketplace today.  Again, I am perfectly happy to be wrong about how IndexTools might change the market …

I should reiterate that all of this is not without risks: there is still a lot that could go wrong as Yahoo! integrates IndexTools into their larger offering: the team could become de-focused, key people could leave Yahoo, Microsoft could succeed in their take-over efforts, etc. (and yeah I remember Keylime too, but that was a different time, a different technology, and frankly a different group of people managing the process.)  I am very impressed with what I’m hearing so far and look forward to the evolution of the entire web analytics sector, driven in part by Yahoo! and IndexTools.

Again, congratulations to the teams at Yahoo and IndexTools and, um, Merry Christmas to the few IndexTools partners who will have the market cornered on this technology for the time being.

Adobe Analytics, Analytics Strategy, Conferences/Community

Getting excited about Web Analytics Wednesday Emetrics

Those of you who regularly attend the Emetrics Marketing Optimization Summit here in the states know that from time to time we hold big Web Analytics Wednesday events in conjunction with the conference.  Especially in San Francisco where the philosophical founder of WAW lives (June Dershewitz!) we like to have big events.

This time June and her partner David are planning a doozy!

Thanks to the very generous support of Coremetrics, ForeSee Results, SiteSpect, Tealeaf, and Jim Sterne (Marketing Optimization Summit) we are having a big party at the Fluid Nightclub on Mission Street (just around the corner from the conference hotel.)

You can download a very snazzy PDF invitation or read the press release about the event.

If you’re coming to Emetrics, even if you’re planning on attending a vendor dinner or just hitting the town, we sincerely hope that you’ll join us at this special Web Analytics Wednesday (on Tuesday) event.  You can register to join us and get all the details here.

Analytics Strategy, General

Special free webcast for online retailers this Thursday

The nice folks at Elastic Path asked me awhile back to give a presentation on web analytics for online retailers.  Their request made me realize I am still sitting on data I collected last Fall — a mistake I know but a function of the velocity at which Analytics Demystified, Inc. has grown in the past year.  To make up for this delay in some small way I will be sharing the data I collected from online retailers in a free webcast this coming Thursday.

If you’re an online retailer doing web analytics please consider spending an hour with us Thursday at 9:00 AM Pacific / Noon Eastern.  Among other things I will be covering:

  • What the measurement landscape looks like for online retailers
  • How satisfied online retailers are with their web analytics vendors
  • How web analytics has impacted spend on paid search marketing

This last point I think many of you will find especially interesting.  Given Yahoo’s recent (re)entry into the web analytics market, and the assertion from Google that Google Analytics drives sites to spend more on search marketing, I set out to answer the question “Do you spend more on paid search marketing because of information gained via web analytics?”

Tune in Thursday morning to hear what I found out.

You can register now for free thanks to Elastic Path.  Talk to you Thursday!

Analysis, Analytics Strategy

Complex Processes and Analyses Therein

Stéphane Hamel, it seems, is a bit peeved with Eric Peterson. These are two pretty big names in web analytics — Eric as one of the fathers of web analytics, and Stéphane as both a thought leader in the space as well as the creator of one of the most practical, useful web analytics supplemental tools out there — WASP: The Web Analytics Solution Profiler plugin for Firefox. With the plugin, you visit any site, and a sidebar will tell you what web analytics solutions it looks like it’s running. It’s pretty cool.

I don’t know the full background of the current back-and-forth between these two guys, but I’m a huge fan of Stéphane, and my ears perked up when I read this observation in the post:

Business Process Analysis implies understanding & improving a collection of interrelated tasks which solve a particular issue. Nothing new here… Most businesses face complex and “hard” processes, and the way to make them “easy” is by decomposing them into smaller sub-processes until they are manageable.

For one thing, for a period of ~8 months, my job title was “Director of Business Process Analytics.” And, frankly, I was never sure what that meant. In hindsight, if I’d had these two sentences from Stéphane and if I’d replaced “Analytics” with “Analysis,” I would have seen a much clearer mapping from my label to what I was actually doing in the role.

More important, though, is the concept of “decomposition.” I find myself preaching the Decomposition Doctrine regularly. And I believe in it strongly.

As an example, when it comes to the Holy Grail of Marketing Analysis — calculating the ROI of your marketing spend — many, many B2B marketers start out looking for the correlation between leads generated and revenue. I have yet to see a case in B2B where this can be found with a sufficiently tight, sustained correlation to be meaningful. That actually makes sense. It’s like looking for a correlation between the state someone is born in and the achievement of a PhD. There’s a lot going on over time between Point A and Point Z.

In the case of B2B marketing, decomposition makes sense. Decompose the process:

  • The lead-to-qualified lead sub-process
  • The qualified lead to sales accepted lead sub-process
  • The sales accepted lead to sales qualified lead sub-process
  • The sales qualified lead to close sub-process

Each of these sub-processes has people who proceed to the next sub-process as well as people who do not — put simplistically: people who “fall out of the funnel.” But, you can further decompose — of the people who fell out, where did they fall out and why? And does that mean they are gone forever, or are there processes/subprocesses that can be used to reengage them in the future?
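
The decomposition above can be made concrete with a bit of arithmetic: instead of staring at one opaque lead-to-close number, compute a conversion rate for each sub-process. A minimal sketch, with entirely made-up stage counts for illustration:

```python
def stage_conversion_rates(stage_counts):
    """Given (stage name, lead count) pairs ordered top-of-funnel to
    bottom, return the conversion rate for each adjacent sub-process."""
    rates = {}
    for (name_a, n_a), (name_b, n_b) in zip(stage_counts, stage_counts[1:]):
        rates[f"{name_a} -> {name_b}"] = n_b / n_a if n_a else 0.0
    return rates

# Hypothetical counts for the four sub-processes listed above
funnel = [("lead", 1000), ("qualified", 400), ("sales accepted", 200),
          ("sales qualified", 100), ("closed", 25)]
```

The product of the per-stage rates reproduces the overall lead-to-close rate, but each individual rate is small and stable enough to be analyzed, and anomaly-checked, on its own.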

The key here is that, from a theoretical perspective, if you link together all of the simpler sub-processes, then you’ve got an accurate representation of the more complex master process. The problem is that this is mostly true. There are probably other sub-processes that are unknown — those pesky “corner cases” that the real world insists on throwing at us. And, each sub-process likely experiences various anomalies over time. Add those together, and you’ve got a complex process that verges on the unanalyzable.

On the other hand, if you focus on a sub-process, you can analyze what is going on, including accounting for the anomalies. “But, isn’t there a risk that you’ll be missing the forest for the trees?” you ask. Absolutely. That’s why it’s important to start with a high-level view of the whole process, with a clear picture of the components that go into it. If you simply pick a “simple sub-process” to focus on, without understanding how and where that fits into the big picture, you run the risk of rearranging deck chairs on the Titanic. On the other hand, if you simply try to “analyze the Titanic,” without some level of decomposition, you’re equally doomed.

Analytics Strategy

How Yahoo! buying IndexTools changes Web Analytics

Yahoo! just announced that it has acquired IndexTools. When I first heard about this deal I thought “Oh, that’s nice for Dennis and Dennis is a pretty nice guy so that’s nice … and the title of Director, Data Insights at Yahoo! is a pretty nice title.” But then I really stopped to consider what Yahoo! pushing IndexTools out to the world means to web analytics (Disclosure: I have been working on a white paper for IndexTools; unfortunately that work is on hold for the time being.)

While we have seen a lot of deals in the last two years, this one is potentially the permanent game changer.

Depending on the deployment model that Yahoo! uses to bring IndexTools to the masses, this acquisition may spell the beginning of the end for some folks who are pretty invested in the status quo. I was unfortunately forced to write this post well in advance of the announcement because of my travel schedule so I am sure I have missed some details but consider the following:

  • Many people consider IndexTools to be every bit as good as far more expensive solutions, offering strong support for visitor segmentation, customization, marketing workflow management, advanced merchandising, and reporting that is far superior to that currently offered by other free and low-cost solutions. In fact, IndexTools is often referred to as “Omniture at a fraction of the cost” and having used both applications I’m hard pressed to disagree — everything you need to do “Enterprise” (sic) analytics is in IndexTools, without exception, and Dennis has shown an uncanny ability to roll new features into the product that directly address emerging market needs at the point they’re needed, not years ahead of time and certainly not years too late. For an excellent review of IndexTools see Eric Enge’s piece at Stone Temple Consulting.
  • Prior to the acquisition, IndexTools was poised to release their own ad hoc segmentation and analysis engine, dubbed Rubix, that directly competes with the likes of Omniture Discover and similar high-end offerings. I have seen Rubix and my first reaction was “Oh man am I glad I left Visual Sciences when I did.” Dennis and his team have taken advantage of the work that Visual, Omniture, and others have done and essentially packaged it up in a much more user friendly and approachable way. The result is something that I believe a far greater number of analysts will be able to take advantage of, regardless of the price point, and something that other free and low-cost vendors simply have no response to today. Lars Johanssen of SATAMA and my very good friends Rene and Aurelie from LBi/OX2 have similar summaries of Rubix worth reading.
  • Yahoo! will almost certainly be able to take advantage of the good work that Google has done establishing their Google Analytics Authorized Consultant (GAAC) network, giving Yahoo! an immediate deployment network (oh man do I hope they call it the “YAAC” Network!). Having run IndexTools on my own site for some time I very much expect that many GAAC partners will actually prefer IndexTools for most of their deployments given the dramatically improved capabilities of the application. Here is a list of companies I expect to call Dennis and the team at Yahoo! to inquire about how they can become YAAC partners.
  • One of the things that people really like about Microsoft Gatineau is the inclusion of a small amount of demographic data available for segmentation. I like what Ian and his team have been doing, but I suspect that A) Yahoo! has access to very much the same data (only a whole lot more of it) and B) the existing segmentation capabilities in IndexTools, not to mention Rubix, will make that data a whole lot more useful to marketers. Imagine if you deployed IndexTools having real-time access to age, gender, income, and behavioral demographic data to apply to all the reports in your system, collected via Yahoo’s huge network, obfuscated, and presented properly showing sample sizes and statistical correlations. That would be cool, huh?

The net effect, again depending on the specific go to market strategy Yahoo! uses, is potentially a profound shift in the nature of the web analytics market today. Consider the following options Yahoo! has:

  • Yahoo! can simply slap a big “Y!” on the IndexTools products and continue to sell them, perhaps through IndexTools and Yahoo’s existing partner network. The advantage of this option would be minimal interruption to IndexTools customers and it gives Y! a chance to migrate data collection and reporting technology from Europe to locations around the world. This would also give Y! time to look slowly at adoption and use of the product and think about their long-term strategy towards web analytics.
  • Yahoo! can brand IndexTools and give away the E-Business edition while continuing to sell the Enterprise version, Rubix, etc. The advantage of this option would be driving adoption and taking some of the attention away from Google Analytics in a sane and measured way. This would also give Y! better data regarding the amount of effort really required to support a real web analytics application (because let me tell you, for most of us, it ain’t easy!)
  • Yahoo! can slap the logo on and say “Come and get it, people!” giving away the whole hog to all comers. The advantage of this option would very likely be a high adoption rate (for IndexTools’ target audience, which is definitely different than the G.A. audience and more directly competitive with the likes of Omniture.) The disadvantage of this option is that it has the potential to break IndexTools’ current architecture and could cause service interruptions, which is something nobody wants, regardless of how much they pay for the service.

If it were up to me, I would select the second option, driving adoption and interest in the application while protecting IndexTools’ most valuable (Enterprise) customers at the same time. The second option would give Y! the greatest amount of information about the road ahead — to be sure there will be challenges — but would also give the company some much needed love.

Knowing some of the players involved, however, my money is on the first option. Nice and conservative, and basically exactly what Google did when they acquired Urchin. Despite what I’m sure is a strong desire on the part of Dennis and Yahoo! to have a tremendous impact on the marketplace around them, they say that “slow and steady wins the race” and that is probably more true today than ever before.

If Yahoo! chooses the third option immediately, or as Google did, waits six months and then goes to a totally free model, suddenly there are far fewer reasons to pay for web analytics at all. Even if they charge a nominal fee for more advanced functionality like Rubix, or force us to buy ads on Yahoo! a la Gatineau version 1.0 (which Microsoft has since put an end to, for good reason), I suspect there will be a profound disparity between what Yahoo! will charge and the CPM rates that most companies are paying today.

Wait, before you say “Yahoo! will never give IndexTools away for free … it would be too expensive” keep in mind that you said that about Google and Urchin.  My money is on free IndexTools before Christmas 2008.

While it is very easy for the top-tier of vendors to dismiss Google Analytics as “pretty, but basically inadequate” and “little more than an entry-level tool” the same claims cannot be used against IndexTools; consider again that many reviewers have said that IndexTools provides 80 percent of the functionality in Omniture at 20 percent of the price. If Yahoo! begins to provide 80 percent of the functionality in Omniture for NONE of the price, well, you get the gist …

If Yahoo! provides a low/no cost option for IndexTools, suddenly companies wanting to invest in web analytics will be far more likely to take advantage of the 10/20/70 rule for web analytics success I described over a year ago, focusing their efforts on people and process and worrying less about the technology used. Companies will be able to get their feet wet with Google Analytics and then, as the need arises, upgrade to IndexTools when they’ve mastered the basic processes and have hired the right people to move beyond basic reports and start to generate more complex analysis.

Obviously this acquisition is not without risks — Yahoo! could take too long to integrate IndexTools into their arsenal, the Microsoft/Yahoo! drama could play out in an unexpected way, and Google could respond by bringing Google Analytics dramatically up-market to be more competitive with Yahoo’s new position. Mitigating these risks are the fact that the team at Yahoo! is exceptionally bright (Bob Page, Michael Wexler, many others), any MSFT/YHOO drama will inevitably take years to play out, and if Google Analytics comes up market, well, then we have two truly great free or low-cost tools to choose from!

We have long talked about web analytics technology becoming a commodity, forcing a paradigm shift from the vendor-focused world that we live in today to the end-user focused world we truly need. Instead of asking “What tools should we use?” the conversation is now poised to become “How do we best use the data to drive our businesses strongly forward?” In a market where some vendors are still, incredibly, insisting that “web analytics is easy,” I think this shift is long overdue and will be more than welcomed by the majority of us practicing today.

Again, congratulations to Dennis, the team at IndexTools, and the acquisition team at Yahoo! I am very much looking forward to seeing how this announcement is received by the marketplace and how this change in the landscape benefits consultants and practitioners alike.

Analytics Strategy, General

Great news from Brussels and our friends at OX2!

This past week I have been having a glorious time traipsing about Europe talking to some very, very nice people about web analytics. My week started in London at the largest Web Analytics Wednesday in history and progressed to Amsterdam, then Brussels, and now to Helsinki, Finland. As I begin to wind things down, finishing up in Finland tomorrow and heading back to London just prior to going home, I wanted to share some really great news!

It gives me great pleasure to meta-introduce you to the first baby to be born of a true “web analytics couple”: Lucca Dechamps Pols, due in late April to the very wonderful and talented Aurelie Pols and Rene Dechamps Otamendi from Belgium’s LBi/OX2!

Those of you who know Rene and Aurelie surely know how excited, proud, and a little bit stressed out both parents are right now. Since I love being a dad I spent part of my time in Belgium regaling the couple with stories about how much fun it is to be a parent. I’m sure they’d love to hear your encouragement as well, either via comments or directly!

This tremendous event comes hot on the heels of Rene and Aurelie selling OX2 to the well respected interactive agency, LBi. I was honored to meet the CEO and CFO of LBi and happy to hear that they’re both quite excited about the capabilities that Rene, Aurelie, and the entire team at OX2 bring to LBi. It will be quite a year for R/A learning the joys of parenthood amidst their earn-out period but I have great faith in both mom and dad!

Congratulations to you both and I’m excited to meet Lucca when I return to Europe in the Fall.

Analytics Strategy, General

Matt Belkin of Omniture: Web Analytics is Easy!

Matt Belkin of Omniture recently posted on a few of the pitfalls companies fall into when deploying web analytics. I was pretty surprised to see Matt, someone who has worked in this field nearly as long as I have, make the following statement:

“Analytics success is all about building a baseline for performance (your KPI trend), and trying new things to improve on this baseline. That’s it! That’s why I think it’s easy. I know other bloggers have argued that analytics is hard, but I’ve done this for a living and I can tell you that it’s not.”

Ironically enough I have been meeting many of Omniture’s largest customers recently, none of whom seem to think web analytics is easy. They universally have some difficulty associated with technology, people, or process—the triumvirate that is truly “web analytics”—and I suspect many of them had the same response I did when I read Matt’s statement above.

I quickly scribbled out the following response late last night but for some odd reason it has not been approved yet. I figured I’d post my comment here so that Matt and his customers would have a chance to read an opposing point of view.

“Hysterical! I talk to Omniture customers constantly who complain about how hard it is to do the most basic things like calculate bounce rate, integrate data using your Genesis platform, make sense of your reports, and even just get the data they need when they need it.

Perhaps the problem that you and people like Stéphane Hamel are having with my statement is something called the echo chamber effect. You say something for so long, and your buddies repeat it so often, that eventually you ignore the reality of the situation and begin to believe something that is clearly not true. Seth Godin accused me of doing this once (he was wrong, it turned out; people are deleting cookies … you’ve said so yourself!)

But you’re wrong, Matt. Web analytics is hard. Ask your customers, they’ll tell you.

It’s not just hard to improve your baselines, it’s hard to implement code properly, it’s hard to understand reports and definitions, it’s hard to find qualified staff to run these applications, it’s hard for HR to stomach the salaries we are asking for, it’s hard to train newbies, it’s hard to produce quality analysis based on only quantitative data, it’s hard to get management to listen, it’s hard to make management understand, it’s hard to select a good vendor when so many are failing, it’s hard to know if and when to migrate off of HBX, it’s hard to know which low hanging fruit to pick, …

You get the picture.

You make my point yourself in your post. If there are multiple versions of the truth, it’s hard to know who to trust. If there are multiple systems, it’s hard to know which system’s “click” is the right click to count. If you yourself have had to spend “countless hours trying to reconcile differences” in data, how is that “easy?”

In a way I’m happy you wrote this post because it reinforces everything I say when I travel the globe and meet with your biggest customers. They say “Our vendor says this is easy … there must be something we’re not getting.” I say, “Why would you expect your vendor to tell you that web analytics is hard? Would that make the sales process move forward more quickly? Would that make you more likely to buy their ever-expanding series of offerings? Would it make you think you won’t end up spending more money counting events, creating custom reports, or adding ad hoc segmentation tools?”

No. If you told the truth about web analytics, your prospects and customers would think twice about their investment. But that is exactly what companies need to do to be successful, really successful, with web analytics — take the science of audience measurement seriously!

When I say “web analytics is hard” I’m not saying that it is impossible, I’m not saying it’s not complex, I’m not saying that it is best left to the experts, and I’m not saying that companies should give up and go home. I’m saying that vendors, consultants, and customers should set their expectations regarding web analytics appropriately.

In my humble opinion, your customers need to know that web analytics is hard so they can:

  1. Plan to spend a reasonable amount of time determining their needs
  2. Allocate resources appropriately for implementation and deployment projects
  3. Set expectations with management about when results will begin to appear and what will need to be done with those results
  4. Make the case to management when they need additional resources, more software, or more time
  5. Have an appropriate relationship with their vendor, based on clear expectations

When people are told that “web analytics is easy” they take their investment for granted. They expect that a “standard implementation” or something that comes from a cut-and-paste template will serve their needs, that a 0.25 FTE will be enough to produce analysis, that results will be available in a matter of days, that the software they have will solve all their problems, and that they won’t need their vendor’s support from time to time.

In a way it’s ironic that you say “web analytics is easy” given Omniture’s obvious commitment to their customers’ satisfaction — Larry Freed of ForeSee Results taught me that satisfaction is a function of expectation; when you say “I’ve done it, it’s easy brah” then as soon as they realize the truth, you’ve failed to set their expectations correctly and thus they’re unsatisfied.

With the increasing numbers of your customers experimenting with less costly tools, I would think that customer satisfaction would be your #1 priority.

I doubt you’ll publish this comment and I suspect you’ll be pissed off at me (again) for voicing an alternative viewpoint but consider this: I’m not saying anything bad about Omniture or any of the companies you guys are buying. I think Omniture is a great organization full of incredible talent. I think the market position you’ve carved out is enviable. I think you guys have tremendous potential to advance the market, driving adoption of Web Analytics 2.0, Web Analytics 3.0, and beyond.

“Web analytics is hard” isn’t about any vendor technology or any one person. “Web analytics is hard” is about your customers and their ability to use your technology and your guidance to their greatest advantage.

When you say “web analytics is easy” you’re oversimplifying what is involved in being successful with web analytics. When you say “I’ve done [web analytics] for a living and I can tell you it’s not [hard]” you’re not paying attention to what your customers are going through. When you say “from your perspective, it’s just not that hard” you’re demonstrating your intelligence but not your wisdom. In fact, your statement “Analytics success is all about building a baseline for performance (your KPI trend), and trying new things to improve on this baseline. That’s it! That’s why I think it’s easy” really says it all.

Suffice to say I was bummed to see this hyperbole and tired rhetoric in an otherwise insightful post.

Sincerely,”

I think it’s one thing when people evangelize for free products as an easy-to-learn entry point into the market, and another entirely when one of the market leading vendors makes such bold and (in my opinion) unfortunately misleading statements.

Our collective ability to be successful depends on having clear expectations, not false ones, and our satisfaction is a function of our expectations. I think Matt is setting the wrong expectation with his comments. What do you think?

Analytics Strategy

Free white paper on the Web Site Optimization Ecosystem

I’m happy to say that my full-length white paper on the Web Site Optimization Ecosystem is now available from the fine folks at ForeSee Results (registration required for download). I’ve been talking about the Ecosystem as a component of Web Analytics 2.0 for quite some time now and I believe this white paper does a good job of outlining:

  • The relationship between purely quantitative web analytics systems and more qualitative inputs such as those available via Voice of Customer (VOC) and Customer Experience Management (CEM) systems.
  • The necessity for all three measurement systems to create a truly robust visitor analysis environment.
  • The relationship between these measurement technologies and the “action” systems designed to support multivariate testing, behavioral targeting, and ultimately personalization.

Lee, Larry, and the entire team at ForeSee were great to work with on this document and I think some of the customer examples I discuss in the white paper are testament to the great work that companies are doing when combining web analytic and VOC data.

I hope you’ll take the time to register with ForeSee and read my work.

Analytics Strategy, Conferences/Community

June Dershewitz is running for WAA Board of Directors

Long-time readers surely know that I hold June Dershewitz in high regard; not only do I consider her a friend, I respect June as one of the most talented web analytics practitioners and consultants I have ever met. More importantly, June is one of the most fair-minded and thoughtful people working in our industry today, which is why I’m so delighted that she has decided to run for Web Analytics Association board of directors.

To help spread the word about June’s candidacy she allowed me to interview her via email. My questions and her answers follow:

June, can you tell me what made you decide to run for the Web Analytics Association (WAA) Board of Directors?

The thought hadn’t even crossed my mind until last month, Eric, when you suggested that I put my name on the ballot. The more I considered it, and the more people I talked to about what I could possibly bring to the Board, the more I realized that it was a great idea. I’ve invested a lot of energy in developing the web analytics community in my own local area, and I know that I could bring the same energy up a level to help the web analytics community at large on behalf of the WAA.

In a nutshell, what are the top three reasons you believe yourself to be qualified for the board position?

  1. I’ve been a hands-on web analyst my entire professional career, I love this work, and I am one of the strongest advocates you’ll find for our trade.
  2. I take volunteer work seriously. The Board is made up of volunteers who’ve agreed to spend 15-20 hours per month on the cause. That’s a lot to ask, but I’m ready to make a serious commitment to the job.
  3. I don’t play favorites. I want to make sure that we all benefit from the WAA’s efforts, and to that end I will strive to move the organization in a direction that’s in the best interest for all of us.

Have you given any thought to the kinds of things you would like to see the WAA accomplish during your term if you are elected?

By all means we need to expand our member base at a rate that keeps up with the growth of our field, and at the same time we need to make sure that existing members continue to find value in their memberships. I’m all in favor of finding new ways to provide tangible, useful benefits to members. I also believe we need to form tighter bonds with related associations whose missions overlap with ours, especially as the scope of web analytics becomes broader. In terms of topics that are near and dear to me, I would like to see the formalization of local chapters and the development of a mentoring program.

I know you’re really involved in the web analytics community (being a founder of Web Analytics Wednesday!). Can you describe some of the other work you’ve done for our community in the past?

Web Analytics Wednesday has been a huge focus of my community involvement over the past couple of years. It’s evolved to the point where I’m not only heading up a monthly event series here in San Francisco, I’m also helping other organizers get started with their own events throughout the Bay Area and beyond. My work with WAW has really helped build an established local presence for the web analytics community, and I’m pleased to see similar developments in other cities where WAW has taken hold.

Within the WAA proper, I’ve been involved in the Education Committee, where I helped develop the document that will become the Web Analytics Body of Knowledge, and more recently I’ve become a member of Marshall Sponder’s Social Media Committee. I’m also contributing articles aimed at people who are new to the field of web analytics; my first one came out last month.

You’re running against some pretty heavy hitters in the field, folks like Avinash Kaushik and Jim Sterne to name a few. As you’ve looked at the slate of candidates, who would you like to see elected in this cycle: Who would you like to work with in 2008 and 2009 and why?

I would be honored to work with any previous Board member, especially those – like Avinash – who’ve voluntarily put themselves up for re-election a year early just so we’d have a balance of open positions this year and next.

As far as new people go, it would be great to get to work with Alex Langshur and Vicky Brock. After reading through all 17 candidate statements, I really like what those two have to say: Alex, because he’s committed to achievable goals aimed at bringing value to members, and Vicky, because she sees (as I do) web analytics evolving into business analytics.

Recently Lars Johansson of Satama proposed the idea of term limits for WAA board members, something that I don’t think is in place currently. What do you think? Should WAA board members be limited to one or two terms, or should people be free to serve for as long as they’re able to be re-elected?

The WAA has finally been around long enough for us to consider that issue. I’m definitely in favor of having a 2-term cap on Board membership. Our field is growing so quickly – there are great new people getting started all the time. If they have the energy and the inclination to run for the Board, I want to make sure they have every opportunity to get a spot – even without the name recognition that long-time Board members necessarily have.

Four years from now I’d love to see the Board made up of an entirely new set of people, full of enthusiasm and fresh ideas. By then we’ll be established enough as an association that our mission will be clear to whoever happens to be sitting in the driver’s seat.

You’re a web analytics blogger (one of my favorites!) so here’s an easy one: whose blogs do you like to read and why do you like them?

Blogs are such a great way to keep up with our field and our community; I make a point to subscribe to everything I come across. Lately I’ve enjoyed Florian Pihs’s blog (I’m outing myself, he doesn’t know I’m a fan) because he’s covering web analytics in China and it’s a unique perspective that I could never hope to get on my own. Oh, and I like to follow bloggers whose sense of humor shines through in their writing, like Alex Cohen and Ian Thomas and (the occasional) Bob Page.

Some have accused the WAA of being somewhat close-minded and having a “not invented here” attitude, something that has the potential to negatively impact the community as a whole. Can you tell me if you encounter this how you might approach the problem?

I believe this attitude tends to propagate when the WAA’s activities are perceived as being shrouded in mystery. As members it’s really tough to figure out why certain policy decisions have been made when you don’t know what’s swirling around behind the scenes. I think it would be a lot better for everybody if more of what the WAA did, decision-wise, happened out in the public. I think we can do a better job being open with all members about what’s going on. Better communication – more honest, thorough communication – would keep negative sentiment in check.

What is your favorite thing about web analytics?

It’s a good intellectual challenge. There’s something about the natural shape of the data that lends itself really well to interesting, solvable story problems.

What is your least favorite thing about web analytics?

The occasional mistaken belief that what we’re doing is spying.

Ours is an increasingly international community and I firmly believe that some of the most exciting opportunities for growth in the industry are in Europe and Asia. Can you describe your experience working with international members of our community?

Over the course of my career I’ve been fortunate enough to work with a pretty global group. For 2 years I was employed by a first-generation web analytics vendor whose main office was in Britain; I spent some time working there, which was a great learning experience. Later, as a member of the central web analytics group at Oracle, I collaborated with an international team of marketers, analysts and developers. Now, as a consultant, I often find myself on the phone with clients many time zones away. Outside of work I’ve also enjoyed meeting international members of our community at conferences and through my blog.

Like you, I believe that our industry has a lot of growth potential beyond North America, and I want to make sure that the WAA does all it can to support international members by encouraging regional/local community, providing non-English language resources, and acknowledging the differences in the way we do business.

Fill in the blanks (here June’s responses are in bold print)

  1. At Emetrics, after 10 PM, you’re going to find me in the middle of a great conversation.
  2. On a long flight, I spend most of my time photographing my snack to post on Flickr as airplanefood.
  3. The one thing most people don’t know about me is I have a herd of dairy goats named in my honor.
  4. Everything I know about web analytics I learned from all the smart people I’ve gotten to work with and for over the past decade.

Anything else you think my readers should know about you as they prepare to vote this week?

If, after reading this, you’ve got any questions about where I stand or what my values are, you can write to me directly at june.dershewitz@gmail.com. I aim to represent every one of you, and I welcome your feedback, now and at any point in the future.

Analytics Strategy, Reporting

ROI — the Holy Grail of Marketing (and Roughly as Attainable)

The topic of “Marketing ROI” has crossed my inbox and feed reader on several different fronts over the past few weeks. I don’t know if the subject actually has peaks and valleys, or if it’s just that my biorhythms periodically hit a point where the subject seems to bubble up in my consciousness.

The good news is that the recent material I’ve seen has had a good solid theme of, “Don’t focus too much on truly calculating ROI.” The bad news is that that message has been in response — directly or indirectly — to someone who is trying to do just that.

One really in-depth post came from — no surprise — My Hero Avinash Kaushik. He did a lengthy post, including five embedded videos, each 4-9 minutes long: Standard Metrics #5: Conversion / ROI Attribution. The post walks through a series of scenarios where a Marketer might be trying to calculate the ROI for their search engine marketing (SEM) spend. He starts with the “ideal” scenario: a visitor does a search, clicks on a sponsored link, comes to the site, moves through, and makes a purchase. In that case, calculating/attributing ROI is very simple. But that’s just a setup for the other scenarios…which are wayyyyyy closer to reality. The challenge is that, as Marketers, we all too often ignore our own typical behavior and common sense so that we can assume that most of our potential customers behave in an overly simplistic way. When was the last time you did a search, clicked on a sponsored link, and then, during that visit, made a purchase?
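To see why the ideal scenario is so seductive, here is what the arithmetic looks like when one paid click leads directly to one in-session purchase. This is just the textbook ROI formula with invented figures, not anything from Avinash’s post; the point is that attribution is only this unambiguous when reality cooperates, which it rarely does:

```python
# Hypothetical figures for the "ideal" last-click scenario:
# every dollar of revenue below came from visits that searched,
# clicked a sponsored link, and purchased in that same visit.
sem_spend = 250.00           # cost of the sponsored clicks
attributed_revenue = 400.00  # revenue from those in-session conversions
cost_of_goods = 100.00       # cost of fulfilling the orders

profit = attributed_revenue - cost_of_goods - sem_spend
roi = profit / sem_spend     # return per dollar of SEM spend

print(f"ROI: {roi:.0%}")     # 50 / 250 = 20%
```

The moment a buyer searches today but purchases next week from a bookmark, `attributed_revenue` stops being a number you can simply look up, and the tidy formula above is the least of your problems.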

Unfortunately, very, very, very few Marketing executives would ever actually spend the 45 minutes it would take to truly consume all of Avinash’s post.  And, honestly, that’s not really “the solution.” The smart Marketing executive will find the Avinashes of the world and will hire them and trust them. Avinash (and John Marshall) really make the case that “time on site” is a more useful metric for assessing the effectiveness of your SEM spend — ROI just brings in too many variables and too much complexity.

In short: Don’t treat ROI as the Holy Grail and try to tie every one of your marketing tactics to “revenue generated.” For one thing, you will head down so many rat holes that you’ll start drooling whenever someone says, “cheese.” For another thing, you will find yourself facing decisions that seem right based on your ROI calculation…but that you just know are wrong.

Another place where this topic came up was in a thread titled ROI Models – High Level Thinking on the webanalytics Yahoo! group. I responded, but others chimed in as well. Some of those responses, in my mind, are still a bit too accepting of the premise that “I need to calculate a hard ROI.” But, other responses go more to a “back up and don’t look at ROI as the be-all/end-all.”

And, finally, ROI crossed my inbox last week by way of a CMO Council press release from back in January. I saw this when it came out, but a colleague forwarded it along last week, which prompted me to re-read it. The press release emphasized how much marketers are focusing on accountability when it comes to their marketing investments. One data point that jumped out was “34 percent [of marketers] said they were planning to introduce a formal ROI tracking system.” This is an alarming statistic. Marketers absolutely should be focusing on accountability — finding ways that they can measure and analyze the results of their efforts. But, if they truly are framing this as the need for “a formal ROI tracking system,” then that means 34 percent of marketers are going to be largely chasing their tails rather than driving business value.

Analytics Strategy, Social Media

Old School Online Community Leads to a Dozen Data Geeks and Drinks

I’ve been a fairly avid follower and contributor to the webanalytics Yahoo! group for several years now. It’s a Yahoo! group that is almost 4,500 members strong and includes active participation by many of the top minds in the web analytics industry. I actually follow the group via e-mail, which seems awfully old school. As a matter of fact, the WAA Community and Social Media committee (which I’m a new…and not very active member of — Marshall Sponder does a great job of running the committee, and I do feel bad that I don’t help out more!) is trying to figure out how to get the group onto a better platform. There’s a bit of “if it ain’t broke, don’t fix it” discussion on the subject, honestly. And unfortunately. The fact is that I doubt that a majority of those 4,500 people are really embracing social media just yet. And this online community is already awfully vibrant and successful on the current platform.

The Yahoo! group was originally formed by Eric Peterson. As that list grew (Eric passed it over to the WAA a few years ago), Eric got the idea to start up a convention of having a “Web Analytics Wednesday” on the second Wednesday of the month. This would be a designated date for web analytics professionals throughout the world to get together for a few drinks, to network, and to share ideas and challenges. Initially, the organization and coordination of these meet-ups happened directly through the Yahoo! group. But, Eric eventually put up a nice little application on his web site to facilitate these, and they’ve continued to grow.

Several months after moving from Austin to Columbus, I caught two posts in rapid succession on the webanalytics group that were clearly from people in Columbus. A couple of e-mails and a lunch meeting later, and we were hosting the inaugural Web Analytics Wednesday in Columbus! We actually held it on a Tuesday, as the venue we found promised to be less crowded then. We had a dozen people show up, it lasted for over 3 hours, and the overwhelming consensus was that it was worth doing again. Now, we just have to figure out how to structure it!

Unfortunately, one of the key organizers — David Culbertson of Lightbulb Interactive — wasn’t able to make it. But, he did manage to get a nice post up on his blog, including the picture that we took with Jonghee Jo’s camera.

I guess I’m getting old enough that I’m still amazed at the power of the internet to pull together a group of people with a very focused area of interest. And to make the leap from online to in-person interactions so smoothly, no less!

Adobe Analytics, Analytics Strategy, General

Measuring Engagement Online: The Next Stage

In the last few months there has been a tremendous surge in interest in my framework for measuring engagement online. Lately, some of the largest and best-known companies in the world have approached me about working with them to bridge the gap between the metrics they have today and something similar to the composite metric I first described back in December 2006.

While I am tremendously flattered that I have somehow become the focal point for this conversation, I have been thinking lately about how the framework has been developed and how it might end up being used by the measurement industry in general. And while early tests using the framework I’ve described are very encouraging, the calculation in its current state was meant to move the discussion along and get more people to “think different” about how engagement could be calculated online.
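
To make the general shape of a composite engagement metric concrete, here is a deliberately simplified sketch. The component names, the normalization to a 0-to-1 scale, and the equal weighting are all my illustrative assumptions for this example, not the published framework itself:

```python
# Illustrative sketch of a composite engagement score: each component is
# assumed to be normalized to [0, 1], and the score is their simple average.
# The component names and equal weighting are assumptions, not the framework.
def engagement_score(components):
    """Average of component indices, each expected in [0, 1]."""
    if not components:
        return 0.0
    for name, value in components.items():
        if not 0.0 <= value <= 1.0:
            raise ValueError(f"component {name!r} must be normalized to [0, 1]")
    return sum(components.values()) / len(components)

# Hypothetical visitor, with each dimension scored against a site-defined threshold
visitor = {
    "click_depth": 0.6,  # pages viewed relative to a threshold
    "recency": 0.8,      # how recently the visitor returned
    "duration": 0.4,     # session length relative to a threshold
    "loyalty": 0.5,      # visit frequency over the measurement period
}
print(round(engagement_score(visitor), 3))  # 0.575
```

The point of the sketch is the mathematical-rigor question raised above: choices like the weights, the normalization thresholds, and even which components to include all need to be defensible before a result like 0.575 is repeatable and trustworthy.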

Given that interest in the framework has clearly increased, one primary concern comes up again and again: the need to apply mathematical rigor to the framework and calculation so that A) the result is repeatable, reliable, and trustworthy and B) when naysayers inevitably emerge to criticize this small side project of mine, that I have a suitable response to their criticism, regardless of where and why it comes.

I believe that the need for “A” is obvious. The need to address “B” is perhaps less obvious, but I believe that I owe it to those of you who are investing your time, energy, and money into this framework. Especially as the stakes seem to increase exponentially with every presentation, every conversation, and every high-visibility blog post on the subject, I believe now is the time to approach the engagement framework not just as a hobby but as a serious project with committed resources.

To this end, I am extraordinarily happy to say that the single smartest person I know, Joseph Carrabis the Founder and Chief Research Officer of NextStage Evolution and NextStage Global, has offered to bring mathematical rigor and analytical precision to what I am officially dubbing “The Engagement Project.” Those of you not familiar with Joseph and his work are advised to A) meet him in person at the upcoming Emetrics Summit in San Francisco or B) read some of his recent work at iMedia Connection.

Joseph will be working to make the formula universally applicable and universally defensible. Suffice to say I can think of nobody better to bring mathematical and scientific rigor to the framework I have been evolving over the past year. Watch this blog and Joseph’s blog at BizMediaScience over the next week or so for a more complete analysis of the framework in its current state, something we’ve agreed is the first step towards creating a true function capable of practically describing the degree and depth of engagement a visitor is displaying towards a web site over time.

At the end of the day, without regard to my framework, Joseph’s analysis, or any person or group’s particular position on the use of the word “engagement”, my goal is to solve one problem and one problem only:

If you’re interested in working with Joseph and me on The Engagement Project please feel free to contact me directly.

Analytics Strategy, General

What is the future of web analytics?

What does the future hold for web analytics, indeed? During my tenure at JupiterResearch I was more-or-less paid to predict the future, but at best I was right maybe 50% of the time. I think I predicted (correctly) that Google would give Urchin away, but I probably also predicted that Microsoft would acquire WebTrends (incorrectly) as a result. Such is life.

Fortunately I am smart enough to surround myself with really smart people, which is what I have done at the newest Analytics Demystified weblog: The Future of Web Analytics, Demystified.

What Joseph Carrabis (my partner) and I are doing at “The Future of” blog is creating an opportunity for some of the brightest voices in our community to wax philosophical about where we’re all going and what things will look like when we get there. In the last two months we’ve had excellent conversations started by the likes of Joseph Carrabis (NextStageEvolution), Rene Dechamps Otamendi (OX2), and most recently Mr. Ian Thomas of Microsoft fame.

The unique thing about this web analytics blog is that posts and comments are essentially peer-reviewed; we’re looking for long, well-thought-out comments that add something to the conversation. And, as odd as it sounds, we’re not approving navel-gazing, fawning, or trackback/ping comments, so that we can keep the conversation moving!

If you’re a long-time reader of my personal weblog I would strongly encourage you to subscribe to The Future of Web Analytics, Demystified. More importantly, if you’ve got a big brain and want to help us work collectively in an effort to figure the future out before we get there, we welcome your comments. Alternatively, if you have a prediction, see a problem, or want help resolving a problem that you struggle with, I’d love to hear from you about being an author in the Future Collective.

Analysis, Analytics Strategy

A Little Bit of Data Can Be a Time-Consuming Thing

I had an experience over the past week that, in hindsight, I really should have been able to avoid. The situation was basically this: several different people had made comments in passing about how we were probably “overcommunicating” to our database. “Overcommunication” being the tactful way to say “spamming.” In this case, I can actually trace the perception back to at least two different highly anecdotal events, which then spawned comments that led to assumptions, and so on.

Now, I am all for diligent database management, especially when it comes to how often and with what content we communicate with our contacts. My general sense was that we could be doing better, but we were far from reaching a crisis point (I lived through a situation at another company where we did reach that crisis point, and there were plenty of telltale signs leading up to that). “I can pull some quick data on that to at least get some basic facts circulated,” I innocently thought. And, that’s what I did.

I knew going in that, while the data was one thing, the definition of “good” vs. “bad” was likely all over the map, so I wasn’t likely to change many people’s opinions as to the situation by simply sharing the data. So, I shot an e-mail out to a group of interested parties and told them I had the data, and I’d be happy to share it, if they shared with me their opinions as to what an acceptable maximum of communications per week and per month would be.

As I suspected, I got a wide range of responses, and most of the responses had some form of qualifier — well-founded qualifiers regarding the type of communication, actually. So far, so good.

I then shared the data that I had spent 15 minutes compiling in a way to make for easy analysis, still knowing that there was no clear good/bad definition, and there was no clear hypothesis being tested or action being planned that this analysis would influence. The data did show a few things unequivocally — really just highlighting that the concerns were somewhat well-founded and that discussions should continue amongst the people who already tacitly owned the situation. But, it also spawned requests for additional data that was more curiosity-driven than actionability-driven. Several people asked that the data be pulled with their particular qualifiers addressed. Most of these people were in no position to actually take any action based on the results. And, unfortunately, as can happen with reporting and analysis systems — applying the qualifiers would have turned the analysis into a highly manual, multiple man-hours exercise, whereas the high-level, basic pull was a 15-minute task.

On the one hand, I could ding our data storage system. By golly, Tenet No. 1 of good BI/DW design is to design for flexibility, right? In this case, the system limitations are actually a boon — they give me an out for simply saying, “No,” rather than the much more involved discussion that begins, “Why?”

It’s a punt, I realize. And not an out I would take if it was throwing anyone in IT under a bus.

My point is that “interesting” can be a Siren Song that dwarfs the pragmatism of “actionability.”

Analytics Strategy, Reporting

Free white paper on measuring multimedia on the Internet

This morning the fine folks at Nedstat in Holland published a white paper that Michiel Berger and I co-wrote titled Measuring Multimedia Content in a Web 2.0 World. This free white paper explores the emerging direct measurement model for multimedia content by examining several common business cases for deploying video, and provides a new set of definitions and key performance indicators (KPIs) designed to help companies effectively track their investment in video-based content.

The timing is somewhat ironic because Judah has been writing a fair amount about Video Analytics over in his blog — I guess great minds think alike!

While video measurement has been around for a while, the new social media certainly increases the complexity associated with determining the efficacy of video from a business perspective. The folks at Nedstat are committed to helping their customers resolve these issues, and are generously making our white paper available without registration requirements.

You can read the press release about the paper’s availability or download your own copy right away.

Analytics Strategy, General, Reporting

What is your web analytics communication strategy: Part II

(Last week I published PART I of this post which you should read first if you haven’t already done so.)

STEP FOUR: DETERMINE YOUR KEY PERFORMANCE INDICATORS AND CRITICAL REPORTS

You’re probably thinking “shouldn’t we have done this after we defined our business objectives and activities?” Conventional wisdom would probably say you should, but in my experience if you don’t have a clear process for leveraging those key performance indicators (KPIs) and critical reports, you may end up with one of three things:

  1. A huge report of 40 KPIs distributed across the organization that few people are likely to read and even fewer likely to act upon
  2. No KPIs distributed at all, and the expectation that everyone will simply “log in” and get the information on their own
  3. Well-defined and clearly articulated KPIs distributed hierarchically throughout the organization (because, hey, maybe you read a great book on the subject at some point)

The problem is that only the third possibility will deeply benefit your organization. I know that some people talk about hundreds of internal users who really get web analytics and all make superb decisions with the data, but this is very much the exception, not the rule. Remember, in our Analytics Demystified Spring Survey 69 percent of respondents said that they did not believe the majority of people using web analytics data in their organization actually understood that data.

It is far better for your analytics hub, as mandated by their executive sponsor in agreement with his or her peers throughout the organization, to work directly with the individual spokes to ensure that appropriate KPIs are defined and the basis for those measures is clear. The hub then follows up with appropriate explanation about the measures, including training on the reports and data that form the basis of the indicators.

Your critical reports are directly tied to your key performance indicators (which, remember, are tied directly to your business objectives). If you belong to the marketing organization, then your KPIs will be measures like “Campaign Response Rate”, “Campaign Conversion Rate” and “Campaign Cost per Click”. Obviously as these KPIs change, appropriate tactical resources in the marketing spoke will review campaign response, conversion, and cost reports in your analytics application.

Your KPIs and critical reports will differ dramatically depending on what department you work for and where in that department you work — remember that the best practice for key performance indicator distribution is to deliver them specifically and hierarchically. Most attempts that I have seen to send “everything to everybody” have failed (often miserably).

STEP 5: DETERMINE HOW YOU’LL DELIVER ANALYSIS

Once you know what your KPIs and critical reports will look like, the next step is to determine how you’ll produce and deliver analysis. Let’s assume for a moment that you’ve got a hub-and-spoke model in place and the hub is receiving regular requests for more information, insights, and recommendations. The question then becomes “how will you deliver those insights and recommendations?”

As I said last week, there is no one “right” way to communicate about web analytics data but there are many, many wrong ways. The central challenge when delivering analysis stems from the fact that so few people really understand what web analytics terms mean, what the limitations of the technology are, and what is possible and impossible to report on. But it’s not like you can just give up and ignore the confusion, so what’s a great analyst to do?

The answer is “work harder, and think outside the box” (to use an overused term). While reports and raw data are best delivered using the Bottom Line Up Front (BLUF) method, analysis really needs to be more engaging. Remember: when you deliver analysis, what you really need to do is to convince the listeners that they need to take some action. To do this you absolutely have to be engaging.

Things that have worked for clients of mine in the past include:

  • Well-delivered presentations, given IN PERSON, not just sent via email in hopes that people will review and understand
  • Well-written documents, followed by a meeting to make sure that everyone READ the document and is on the same page
  • Short summary documents, written up like a newsletter or newspaper article, designed to get people to attend a meeting or presentation

Since we’re in a Web 2.0 world, and since many of you are increasingly comfortable using new technology, a few other things you may want to consider include:

  • An internal analysis Wiki that people can subscribe to and participate in. The Wiki is a good idea because it allows you to capture the conversation in a searchable format
  • A regular analysis podcast, providing an update on past analysis and summarizing the data currently being reviewed
  • An analysis video or vidcast, created with tools like TechSmith Camtasia that allow you to easily blend images, live screen capture (useful when showing people live data in your analytics application), and annotation

The advantage of the final two ideas is that they can be downloaded to an MP3 player like the iPod or iPhone. If you have busy executives, you might be better able to reach them if you give them something to watch on the airplane or listen to on the drive home.

Keep in mind that none of these “Web 2.0” strategies should replace well-written, well-presented analysis, delivered in person whenever possible and making specific recommendations for changes (including a testing plan when possible!)

STEP 6: PUT IT ALL TOGETHER!

Assuming you’ve completed the previous five steps, you now have a functional web analytics organization, one capable of delivering relevant reports and producing actionable analysis. Now the challenge is to stop spending all of your time generating reports and start delivering analysis!

Unfortunately, for many organizations this is really, really difficult. Even when there are dedicated resources — people specifically hired to do web “analytics” (not web “reporting”) — far too many bright folks end up spending all of their time churning out reports. Even worse, these reports often go unread, unused, and unnoticed despite the real and opportunity costs associated with generating them.

To be really, really successful with web analytics you have to train the organization to stop looking for reports and start asking for analysis, insights, and recommendations. While every situation is different, ask yourself how closely your organization follows these steps:

  1. Automated KPI reports arrive, highlighting a potential problem associated with a core business objective
  2. Line of business analytics resources consult critical reports directly looking for a reasonable explanation
  3. Failing a reasonable explanation, business resources request analysis resources from the analytics hub
  4. Analytics hub double-checks LOB’s cursory analysis, confirming the need for deeper exploration
  5. Analytics hub prioritizes analysis with the business based on pre-agreed criteria
  6. Analysis is delivered back to the business along with recommendations and a testing plan
  7. Recommendations are reviewed by the business, test plan is agreed upon
  8. Tests are run, results are socialized as follow-up to the original analysis
  9. Incremental value of change is recorded to help calculate web analytics return on investment

Individual departments are still getting their reports, but they’re generating them by themselves. Senior managers have an appropriate view into the metrics, and their own resources to evaluate observed changes. Those resources have a way to get help when help is needed. Help (the hub) isn’t bogged down generating ad hoc reports all the time and is able to focus on high-value priorities. People produce analysis and make recommendations. Recommendations are tested. Optimization happens.

Kinda brings a tear to your eye, doesn’t it?

I know there are a hundred other things that come up in the line of business for any of you who are working practitioners, but having a clear communication strategy is the first step towards whittling that list down to something reasonable and, more importantly, valuable to your organization. Defining your business objectives, clarifying ownership and organization structures, establishing KPIs and critical reports, and knowing what your analysis output will actually look like is fundamental.

Defining your web analytics communication strategy will let the data work for you, not make you work for the data. It will help you move from making purely tactical decisions and start using web analytics strategically as part of your entire business. Over time you’ll find that a clear strategy, no surprise, helps the entire organization better understand web analytics in general and the value your investment can provide. And perhaps most importantly, a clear strategy will cut down on the volume of under-used, unused, and ignored reports traveling across your network.

If you’re interested in defining a web analytics communication strategy in your organization, I’d love to talk to you. If you don’t need help, I’m still happy to provide encouragement. If I can help you, great. If I can’t help you, I bet I know somebody who can!

Analytics Strategy, Reporting

What is your web analytics communication strategy?

Judah’s recent post titled “what does your web analytics team look like” reminded me of something that has been on my mind a lot since I presented my Web Analytics: A Day a Month webcast for the American Marketing Association last month. As I travel the world talking about web analytics to companies of all shapes and sizes, one thing I’m struck by is the number of differences in how companies approach sharing web analytic data and information.

It’s not as if there is any one “right” way to communicate about web analytics, but it is clear that there are many, many wrong ways to do it. But rather than dwell on wrongness, I prefer to focus on rightness, so here are a few thoughts on developing a clear strategy for communicating web analytics.

This post may seem pretty basic to many of you, but if it does I would encourage you to ask yourself these questions:

  • What decisions are web analytics driving in your organization?
  • Are those decisions largely tactical or are they truly strategic?
  • Do you feel like most people in your organization understand web data?
  • Are you producing reports that are going under-used, unused, or are flat out being ignored?

If you are less than impressed with your responses I would encourage you to read on. I’m not saying you’ll necessarily learn anything new, but maybe you’ll read something that you think your boss should hear.

STEP ONE: DEFINE YOUR BUSINESS OBJECTIVES

I know, I know, you’ve heard me say this before. I’ve been saying this since 2002 but I’m going to keep on saying it since it bears repeating. By clearly defining your business objectives you get two things done:

  1. You remind everyone in your organization why you have a web site and why those of you who work online come to work every day.
  2. You build a framework against which you will define the core activities and interactions that are worth measuring and communicating

The second point is important: You cannot measure everything effectively and efficiently — you have to have some basis for deciding what to measure and what to report. I have seen any number of companies work hard to collect “all possible data” only to realize that few people are actually asking for that data and even fewer are doing anything with it.

When you define your business objectives and get consensus on what is most important to your online business, the measurable activities that you will be communicating across the organization become clear. Suddenly rather than struggling to measure every aspect of every page across every segment you’re able to focus on critical measures in critical paths in your most important visitor segments.

I covered all of this in Analytics Demystified what seems like years ago and again in Web Site Measurement Hacks (which you can now purchase direct from my site, have I mentioned that?) but again it is worth repeating. And while it is far less common now that I will ask companies about their business objectives and get conflicting opinions, many companies have still not gone through the process of clearly documenting these objectives and the associated activities to serve as the basis for their measurement efforts.

STEP TWO: DETERMINE WHO OWNS ANALYTICS AT YOUR COMPANY

One of the biggest problems I see in web analytics today is a lack of clarity regarding ownership of analytics inside the organization. On this point I will be as clear as possible:

The owner of web analytics in your company NEEDS to be someone senior enough to ensure that analysis is being produced and used!

I spend an awful lot of time as a consultant talking about ownership and structure in analytics. Your executive sponsor needs to be closely connected to web analytics and have a clear understanding of the value and opportunity measurement provides. If this is not the case, you may spend an awful lot of time producing reports that go unread and analysis that goes unused.

I suspect that my fellow blogger Daniel Shields can attest to the goodness in this recommendation, working for a great boss at CableOrganizer, but more often than not when I ask the question “Who owns web analytics?” I get responses that talk about budget centers, middle-management who haven’t got budget authority or enough political clout, or worse yet, nothing but uncomfortable laughter.

Clients almost always ask “Where should web analytics live? Should it live in Finance, I.T., Marketing, or Research?” to which I almost always answer “Who is the most senior, well-connected person in your organization that is likely to really understand what web analytics is good for?” and then give their department as my answer. Here are some additional thoughts:

  • Finance: Analytics living in your finance organization is fine because your CFO understands how to produce detailed analysis and make that analysis valuable internally
  • Marketing: Marketing is great since in many cases marketing has the most to gain (or lose) based on web analytics data and analysis
  • Research: If you have a market research organization this is also a great home since the analysis team in research usually has an excellent understanding of the customer and their (offline) behavior
  • Information Technology: I personally don’t usually recommend that web analytics live in I.T. There is often too much baggage and a disconnect between I.T. and the business for this to work (but I do know of a handful of examples where I.T. ownership of analytics does work)

At the end of the day the most successful analytics organizations are those where the executive sponsor “gets it” and is able to champion for the cause at a very high level. They will need money, resources, and time from the rest of the company to deeply integrate the necessary web analytics business processes, so seniority is an absolute must.

STEP THREE: DETERMINE YOUR ANALYTICS ORGANIZATIONAL STRUCTURE

This is the step I’ve been thinking a lot about lately: how analytics organizations are structured and integrated into medium, large, and very large companies. As I’m sure you know, this piece is far from a no-brainer — whether you subscribe to 10/90, 10/20/70, or some other percentage-wise distribution of effort, I think we can all agree that people are critical to web analytics success.

But as Judah deftly points out, just hiring someone is only the beginning of the work: The more important piece is determining how those resources are going to actually provide benefit back to the entire organization. You need to have a clear strategy for leveraging these resources to produce the maximum number of insights possible.

For about four years I have been talking about the “hub and spoke” model for web analytics organizations, especially to medium, large, and very large companies. The hub and spoke is basically a centralized/decentralized model for measurement, one that centralizes deep analysis expertise for use across the organization but mandates that each individual department and line of business takes responsibility for their own reporting needs.

The folks in the analytics hub are directly responsible for things like:

  • Producing analysis, real analysis, to support business decisions
  • Providing training out to the rest of the organization on tools and data
  • Communicating about the goodness (or lack thereof) in the data collected
  • Interfacing with the vendor(s) providing measurement software and services
  • Managing multivariate tests and analyzing their results
  • Working with I.T. to make changes to data collection and integration

Perhaps most importantly, the hub works directly for the executive sponsor for analytics (see STEP TWO above). Establishing a real web analytics hub is the first thing you need to do if you want to STOP spending 80 percent of your time generating reports (something a prospect recently referred to as being a “report monkey,” which they didn’t seem super-excited about …)

The folks in the individual departments and LOBs are responsible for things like:

  • Paying careful attention to their key performance indicators and reacting to observed changes
  • Spending enough time learning the available technology to answer at least basic questions when changes are observed
  • Generating whatever reports are necessary on a regular basis and modifying those as required
  • Interfacing with the analytics hub to ensure that requests for testing and analysis are clearly communicated
  • Responding to test results and analysis by putting the insights generated to work for the organization

The best possible news is that the folks in the spokes don’t have to be web analytics experts! Hell, they don’t even need to read the available literature if they don’t want to (but they should.) They really only need to take enough time to learn what their KPIs are telling them and which reports in the analytics application(s) are relevant when things change.

Thinking about the relationship between the hub and spokes:

  • The hub does analysis, and the spokes do reporting
  • The hub executes multivariate tests, but the spokes recommend them
  • The hub works directly with I.T., the spokes get to continue avoiding I.T.
  • The hub helps to plan, manage, and monitor KPIs, the spokes live and die by them
  • The hub runs something like Omniture Discover or IndexTools Rubix, the spokes use SiteCatalyst or Google Analytics

This is great news because there are many, many people out there who have a 0.2, a 0.33, or a 0.5 FTE for web analytics — not nearly enough time to really get deep into web analytics but enough to create the expectation that they’ll use the data to make business decisions. The hub and spoke model creates a business process to support partial FTEs in their endeavor to use and benefit from web analytics, which those partial FTEs seem to truly, truly appreciate!

In my experience, over time the people who really like this kind of work will pop up and ask great questions, looking to push the boundaries of their understanding of “our little craft.” They’ll read books, blogs, go to conferences, etc. and over time may realize that they really want to work in the field of web analytics full time. Which is great, because without those people flowing into the system, the multitude of recruiters and companies across the globe looking for experienced web analytics professionals haven’t got a prayer.

Since Judah, Daniel, and I have been talking about the length of our posts lately, I think I’ll stop here and publish Part II of this post later this week.

The key takeaways from the thoughts here are:

  1. You have to have a web analytics communication strategy
  2. You have to clearly define your business objectives and supporting activities
  3. You need to define and establish an analytics organization
  4. Your analytics organization needs to report to an appropriately senior person
  5. The hub and spoke model for web analytics has many advantages, especially in large organizations
  6. Web analytics done well has a tendency to make people more, not less, interested in web analytics (which is good!)

Analytics Strategy, General

Web Analytics: A Day a Month white paper now available

If you attended my American Marketing Association webcast titled “Web Analytics: A Day a Month” earlier this month, or if you missed it and would like something you can share with others in your organization, you can now download a (mostly) free white paper on the same subject from Tableau Software. I say mostly free because you have to give some information, and I suspect that when you do, a salesperson may call.

You can request the paper from this URL:

http://www.tableausoftware.com/web_analytics_wp_peterson

Thanks a ton to the nice folks from Tableau for sponsoring the webcast and this short white paper.  If, after you download the document, you have any questions, I’m more than happy to hear from you either through my blog or directly via email.

Analysis, Analytics Strategy, Reporting, Social Media

Bounce Rate is not Revenue

Avinash Kaushik just published a post titled History Is Overrated (Atleast For Us, Atleast For Now). The point of that post is that, in the world of web analytics, it can be tempting to try to keep years of historical data…usually “for trending purposes.” Unfortunately, this can get costly, as even a moderately trafficked site can generate a lot of web traffic data. And, even with a cost-per-MB for storage of a fraction of a penny, the infrastructure to retain this data in an accessible format can get expensive. Avinash makes a number of good points as to why this really isn’t necessary. I’m not going to reiterate those here.

The post sparked a related thought in my head, which is the title of this post: bounce rate is not revenue. Obviously, bounce rate (the % of traffic to your site that exits the site before viewing a second page) is not revenue. And, bounce rate doesn’t necessarily correlate to revenue. It might correlate in a parallel universe where there is a natural law that no dependent variable can have more than 2 independent variables. But, here on planet Earth, there are simply too many moving parts between the bounce rate and revenue for this to actually happen.
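For anyone newer to the metric, the definition above (the % of traffic that exits before viewing a second page) can be sketched in a few lines. This is a hypothetical illustration; the session data is made up:

```python
# Hypothetical sessions: each entry is the number of pages viewed in one visit.
sessions = [1, 3, 1, 5, 2, 1, 1, 4]

# A "bounce" is a session that ends after a single page view.
bounces = sum(1 for pages in sessions if pages == 1)

# Bounce rate is the share of all sessions that bounced.
bounce_rate = bounces / len(sessions)

print(f"Bounce rate: {bounce_rate:.0%}")  # 4 of 8 sessions bounced -> 50%
```

Real analytics tools compute this from sessionized clickstream data, of course, but the arithmetic is exactly this simple, which is part of why the metric caught on so quickly.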

But.

That’s not really my point.

What jumped out at me from Avinash’s post, as well as some of the follow-up comments, was that, at the end of the day, most companies measure their success on some form of revenue and profitability. Realizing that there is incredible complexity in calculating both of these when it comes to GAAP and financial accounting, what these two measures are trying to get at, and what they mean, are fairly clear intuitively. And, it’s safe to say that these are going to be key measures for most companies 10, 20, or 50 years from now, just as they were key measures for most companies 50 years ago.

Sales organizations are typically driven by revenue — broken down as sales quotas and results. Manufacturing departments are more focused on profitability-related measures: COGS, inventory turns, first pass yields, etc.  Over the past 5-10 years, there has been a push to take measurement / data-driven decision-making into Marketing. And, understandably, Marketing departments have balked. Partly, this is a fear of “accountability” (although Marketing ROI is not the same as accountability, it certainly gets treated that way). Partly, this is a fear of figuring out something that can be very, very, very difficult.

But, many companies are giving this a go. Cost Per Lead (CPL) is a typical “profitability” measure. Lead Conversion is a typical “revenue” measure. That is all well and good, but the internet is adding complexity at a rapid pace. Pockets of the organization are embracing and driving success with new web technologies, as well as new ways to analyze and improve content and processes through web analytics. No one was talking about “bounce rate” 5 years ago, and I’d be shocked if anyone is talking about bounce rate 5 years from now.
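As a rough sketch of how those two measures fall out of campaign data (all figures here are invented purely for illustration):

```python
# Invented campaign figures for illustration only.
campaign_spend = 12_000.00   # total marketing cost for the campaign
leads = 400                  # leads the campaign generated
customers = 50               # leads that converted to a sale

# Cost Per Lead: what each lead cost to acquire (a "profitability" lens).
cost_per_lead = campaign_spend / leads

# Lead Conversion: the share of leads that became customers (a "revenue" lens).
lead_conversion = customers / leads

print(f"CPL: ${cost_per_lead:.2f}, conversion: {lead_conversion:.1%}")
```

The calculations are trivial; the hard part, as the post argues, is that the inputs (what counts as a lead? what spend gets attributed to which campaign?) keep shifting as the channels themselves change.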

Social media, new media, Web 2.0 — call it what you like. It’s changing. It’s changing fast. Marketing departments are scrambling to keep up. In the end, customers are going to win…and Marketing is going to be a lot more fun. But we’ve got a lonnnnnnnnng period of rapidly changing definitions of “the right metrics to look at” for Marketing.

While it is easy to get into a mode of constantly reevaluating what your Marketing KPIs are, it is equally foolish to think that this is a one-time exercise that will not need to be revisited for several years.

Oh, what exciting times we live in!

Analytics Strategy

gapingvoid on the "Who Owns My Data?" Debate

That’s gapingvoid as in “Hugh Macleod draws quick, insightful, witty things.”

WARNING: the “Who Owns My Data?” cartoon is highly irreverent (it’s gapingvoid, people!) and includes profanity.

‘nuf said — it’s a chuckler: http://www.gapingvoid.com/Moveable_Type/archives/004409.html

There’s really nothing I can add to this.

Except that I’m still chuckling. Or, maybe I’m cackling maniacally, as my day started at 3:45 EST this morning in Columbus, and it’s now 10:00 CST in Austin!

Analytics Strategy

I'm an Omniture customer now … are you?

I’m sure a lot of you got the email or saw the big announcement but the Omniture acquisition of Visual Sciences is complete.  Omniture has set up a customer welcome program at their site and has outlined a little bit about the transition plan and new product line (one word: complex).  Perhaps the funniest piece of information is this brag page about some of the business Omniture had taken away from WebSideStory or Visual Sciences, talking about the transition from HBX to SiteCatalyst.

Being a brand new Omniture customer (I license Visual Site), I just noticed that the company has Lance Armstrong and Seth Godin (who is rumored to dislike traveling West) speaking at their client event in March, wow!  In an odd way I’m quite bummed I won’t be able to attend the event since it would be a nice chance to see up close and personal what this new supercompany will look like.

Congrats to Josh James and the Omniture team, Aaron Bird on the promotion to General Manager of the VS business unit, and everyone at WebSideStory/Visual Sciences — hopefully you all find a nice and stable home at Omniture!

Adobe Analytics, Analytics Strategy, Conferences/Community, General, Industry Analysis, Reporting

My AMA presentation is now online and much more

For those of you who missed my presentation yesterday, “Web Analytics: A Day a Month”, you can now listen to the re-recorded webcast at WebEx thanks to Tableau and the American Marketing Association. I say “re-recorded” since once again I managed to bring a large enough crowd to the webcast to break WebEx. Web analytics is hot!

You can listen to the webcast without registering (though it still requires a name and email), until next week I think, by going to:

amaevents.webex.com

Here are a few other things I should mention, as long as I’m writing:

  • I’m going to be in Boston next week for Judah’s Web Analytics Wednesday event (rescheduled from last month due to me being a weather-wimp) and if you’re in Boston or nearby I’d love to catch up. Please join us in Cambridge!
  • The next few weeks I will be in Chicago (Jan 25th), Seattle (Jan 30th), San Jose (Jan 31st) and New York (Feb 7th) giving the keynote address at OpinionLab’s client conferences. The nice folks at OpinionLab mentioned that they’re opening up the events to non-customers so if you’d like to hear me talk about how quantitative and qualitative data combined provide a much more actionable view of the online visitor, please join us!
  • The nice folks at the Direct Marketing Association who gave away PDF copies of my book Analytics Demystified in exchange for participation in their web analytics survey (written up by the amazing W. David Rhee) are holding a webinar on the research findings on January 23rd. The event is not free but the research is pretty good and if you’re in the DMA you should consider joining the call.
  • The nice folks at the Web Analytics Association are also holding a research call, tomorrow (Jan 17th) in fact, on the future of the web analytics industry. I think this event is free but it might only be free to WAA members (maybe if Richard or Andrea read this they can comment for all to see!) The call is tomorrow morning at 9 AM Pacific, noon Eastern and you can register to attend at the WAA web site.
  • Anil Batra has apparently jumped on the “bounce rate” bandwagon and is having a “bounce rate survey” that he’d like you to participate in. I haven’t had a chance to take it yet but I really enjoyed Anil’s salary research so I’m sure he’ll do a great job with bounce rate too!
  • I’ll be back in San Diego in late February at Aaron Kahlow’s Online Marketing Summit talking about Key Performance Indicators in a Web 2.0 World.  I really enjoyed OMS last year and am looking forward to getting back to Sea World (er, Aaron’s event)!
  • I had nothing to do with that movie on web analytics, despite it being filmed here in the Rose City, and have no idea what Ian is talking about.  Ian should spend less time at the movies and more time reading what experienced practitioners are saying about Gatineau.  <grin>

If I’m forgetting anything please comment below.  I think you’ll really like the webcast — the feedback I got has been excellent so far (despite some people going gossipy about the title of my last post on the subject … cage match indeed!)

Analytics Strategy, Social Media

Data Portability vs. Privacy

There is a lot of buzz of late regarding Robert Scoble getting knocked off of Facebook as he was testing out Plaxo and, in the process, scraping data from Facebook. The debate that has primarily raged has been around who “owns” our data when we load it into a social media site. I’m pretty sure that the Terms of Use we all blithely accept spell that out fairly clearly. I’m also pretty sure that legalese is largely irrelevant when it comes to the court of public opinion, as Facebook seems to continually rediscover!

Debbie Weil had an interesting take on the situation in her post: The controversial issue of “data portability” (or what we used to call “privacy”). She makes the point that, “With so many of us living so much of our lives online we are trusting both that our ‘data’ won’t be misused and that it won’t disappear.” We don’t often enough recognize that data portability and privacy, if not directly in conflict, apply pressure in two different directions. Chris Brogan, Jeremiah Owyang, and many, many others have touched on the subject. In Brogan’s case, and in many of the comments on Twitter, the emphasis is on the nuisance factor of having to re-enter the same information in multiple places. Generally, there is some nod to “privacy” — “it needs to be secure, private, with configurable access permissions” — but that gets thrown in almost as an afterthought. On the other hand, it only takes one or two examples of some form of identity theft to give people pause about making their data truly portable. As a matter of fact, an on-going discussion in the world of web analytics is, “How much detail can we — and should we — track and keep on visitors to our sites?” And, when governments get involved, the emphasis is virtually always on ensuring privacy rather than on improving efficiency (in the U.S., HIPAA and CAN-SPAM come to mind immediately).

This is a truly thorny issue, and it comes down to trying to accurately manage personal preferences across multiple interrelated/interconnected systems. On one end of the spectrum, the privacy paranoid person resists sharing any true information whatsoever, and he can aggressively tell sites not to share his information in any way whatsoever — even with him! This poor soul is almost definitely going to give himself high blood pressure, and the shorter life he is going to live is going to be inefficiently lived as he continually puts up barriers that he has to repeatedly climb over. On the other extreme is the person who will openly share even his bank account details because he doesn’t believe it will ever bite him in the ass (we can label this archetype Jeremy Clarkson).

The reality is that 99% of us live somewhere in between these two extremes. Most of us believe that where we have placed ourselves on this spectrum is the obviously logical place to be. And most of us are uncomfortable shifting even slightly from our current position towards either end of that spectrum.

The person who has a finite number of cell phone minutes each month on her plan may fiercely guard that number while freely sharing her home number. Another person may have unlimited minutes and no issues with screening her cell phone calls as they arrive, so may prefer that number as her primary, most public contact channel.

This means any “solution” will have to be highly configurable. Which, sadly, means that it may be cumbersome to manage. And may struggle to get adopted. I’ll continue to keep my fingers crossed that OpenID, The Todeka Project, or some other approach can allow us to personalize our point on the privacy/portability spectrum.