Analytics Strategy, Conferences/Community, General

Finally! Standards come to Web Analytics

Last week I had the pleasure of traveling to Columbus, Ohio to participate in Web Analytics Wednesday, hosted by Resource Interactive’s Tim Wilson and generously sponsored by the fine folks at Foresee. We opted for an “open Q&A” format that worked out pretty well. It turns out the web analysts in Ohio are a sharp bunch, so all of the questions I fielded were of the “hardball” type.

One question in particular surprised me, and the answer I gave forced me to elucidate a point I have been pondering for some time but have never voiced in public. The question came from Elizabeth Smalls (@smallsmeasures, go follow her now) who asked, and I paraphrase, “How can we best explain the differences in the numbers we see between systems?” and “Is there any chance the web analytics industry will ever have ‘standards’?”

Long-time readers know I have followed the Web Analytics Association’s efforts to establish standards closely over the years, helping to create awareness about the work and pushing the Association to “put teeth” behind their definitions and encourage vendors to either move towards the “standard” definitions or, at worst, elucidate where they are compliant and where they differ from the WAA’s work.

Sadly the WAA’s “standards” never really caught on as a set of baseline definitions against which all systems could be compared to help explain some of the differences in the data. As a result, practitioners around the globe still struggle when it comes time to explain these differences, especially when moving from one paid vendor to another. But none of this matters anymore for one simple reason …

Google Analytics has become the de facto standard for web analytics.

Google has become the standard for web analytics by sheer force of might, persistence, and dedication. By every measure, Google Analytics is the world’s most popular and widely deployed web analytics solution. Hell, in our Analysis Exchange efforts we focus exclusively on the use of Google Analytics because A) we know that 99 times out of 100 we will find it already deployed and B) nearly all of our mentors have had enough exposure to Google Analytics to effectively teach it to our students.

What’s more, as Forrester’s Joe Stanhope opined in the recently published Forrester Wave for Web Analytics, web analytics as we knew it doesn’t really exist anymore:

“Few web analytics vendors restrict their remit to pure on-site analytics. Most vendor road maps incorporate emerging media such as social and mobile channels, data agnostic integration and analysis features, usability for a broad array of analytics stakeholders, and scalability to handle the rising influx of data and activity.”

Joe says “few” vendors remain focused on on-site analytics, but it would be more precise to say “one” vendor — Google — has maintained interest in how site operators measure their efforts with any level of exclusivity and sincerity. In fact, I don’t think we need to call the industry “web analytics” anymore … it is probably more accurate to say we have “Google Analytics” and “Everything Else.”

Everything else is enterprise marketing platforms. Everything else is integrated online marketing suites. Everything else … is all of the stuff that has been layered on top of solutions we have historically considered “web analytics” as a response to an event that can only be accurately described as the single most important acquisition in our sector, period.

Google Analytics is the de facto standard for web analytics, and this is great news.

Assuming you take care with your Google Analytics implementation, whenever there is a question about the data you will have a fairly consistent[1] view for comparison. Switching from one vendor to another? Use Google Analytics to help explain the differences between the two systems! Worried that your paid vendor implementation is missing data? Compare it to Google Analytics to ensure that you have complete page coverage! Not sure if a vendor’s recent change in their use of cookies impacted their data accuracy? Yes, you guessed it, compare it to Google Analytics!

With Google Analytics you have a totally free standard against which all other data can be reconciled.
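If you want to see what that kind of reconciliation can look like in practice, here is a minimal Python sketch that compares page-level pageview counts from two hypothetical CSV exports and flags pages missing from one system or differing by more than an arbitrary threshold. The file names, column headers, and 10% threshold are all invented for illustration and do not reflect any vendor’s actual export format.

    import csv

    def load_pageviews(path):
        """Read a two-column CSV export (page, pageviews) into a dict."""
        data = {}
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                data[row["page"]] = int(row["pageviews"])
        return data

    ga = load_pageviews("google_analytics_pages.csv")   # hypothetical export
    vendor = load_pageviews("paid_vendor_pages.csv")    # hypothetical export

    for page in sorted(set(ga) | set(vendor)):
        g, v = ga.get(page, 0), vendor.get(page, 0)
        if g == 0 or v == 0:
            print(f"{page}: missing from one system (GA={g}, vendor={v})")
        else:
            diff = (v - g) / g * 100
            if abs(diff) > 10:                           # arbitrary threshold
                print(f"{page}: {diff:+.1f}% difference (GA={g}, vendor={v})")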

Now keep in mind, I am absolutely not saying that all you need is Google Analytics — nothing could be further from the truth. Despite a nice series of updates and the emergence of a paid solution that may be appropriate for some companies, I agree with Stanhope when he says that “Google Analytics Premium still lags enterprise competitors in several areas such as data integration, administration, and data processing …”

But that’s a debate for the lobby bar, not this blog post.

If you’re looking for a set of rules that can be universally applied when it comes to the most basic and fundamental definitions for the measures, metrics, and dimensions that our industry is built upon, you don’t have to look anymore. Google has solved that problem for the rest of us, and we should thank them. Now, thanks to Google, we can focus on some of the real problems facing our industry … which again, is a debate best left to the lobby bar.

What do you think? Are you running Google Analytics on your site? Do you use it when you see anomalies in data collected through other systems? Have you used it to validate a move from one paid vendor to another? Or do you believe that the WAA standards already provide the solution I am ascribing to Google?

As always I welcome your opinions and feedback.


[1] Yes, when Google changed the definition of a “session” it impacted their consistency, but once they corrected the bug they introduced, the number of complaints seems to have gone down significantly. What’s more, the change made sense, and in general we should be in favor of “improving on standards whenever possible,” don’t you think?

Analytics Strategy, General

The Myth of the "Data-Driven" Business

You may have noticed I have been pretty quiet on my blog lately aside from sharing news about our ACCELERATE event in San Francisco in November. That’s partially because, honestly, I’ve been swamped with new clients, existing work, and the never-ending effort to be a good husband, dad, and friend in the midst of Demystifying web analytics …

But being busy is no excuse to stop sharing ideas and encouraging conversation, so let’s dive into something that has increasingly become a pet peeve of mine: the notion of leveraging web analytics to create a “data-driven” business.

I’m sure I have used this phrase in the past in an effort to describe the transformation that companies need to go through in the digital world, relying less on “gut feel” and more on cold, hard data to guide business decision making. Hell, a lot of smart people have, including Omniture’s Brent Dykes and Google Analytics Evangelist Avinash Kaushik, who has gone so far as to describe creating a data-driven culture as the “holiest of holy grails.”

Becoming “data driven” is the way to silence the HIPPO and to more firmly establish the value of our collective investments in digital measurement, analysis, and optimization technology. It sounds great, except for one thing:

A “data-driven business” would be doomed to fail.

I think that perhaps what people mean when they talk about being “data-driven” is the need for a heightened awareness of the numerous sources of data and information we have available in the digital world, enough so that we are able to take advantage of these sources to create insights and make recommendations. On this point I agree — better use of available data in the decision making process is an awesome thing indeed.

My concern arises from the idea that any business of even moderate size and complexity can be truly “driven” by data. I think the right word is “informed” and what we are collectively trying to create is “increasingly data-informed and data-aware businesses and business people” who integrate the wide array of knowledge we can generate about digital consumers into the traditional decisioning process. The end-goal of this integration is more agile, responsive, and intelligent businesses that are better able to compete in a rapidly changing business environment.

Perhaps this is mere semantics — you say “potato” I say “tuberous rhizome”  — but given the sheer number of consultants, vendors, and practitioners talking about creating, powering, and working in the mythical “data-driven business” I have started to worry that we’re about to shoot ourselves in the collective foot. We (meaning the web analytics industry as a whole) have done this before, first by claiming that web analytics was easy, then by insisting that cookies were harmless … and personally I’d prefer we avoid yet another self-imposed crisis of credibility if possible.

And while this may be semantics, I do disagree with Brent Dykes’ assertion that, in the absence of carrot-and-stick accountability, web analytics breaks down and fails to create any benefit within the business, although I fully understand where Mr. Dykes is coming from. I simply have not seen nearly enough evidence that eschewing the type of business acumen, experience, and awareness that is the very heart-and-soul of every successful business in favor of a “by the numbers” approach creates the type of result that the “data-driven” school seems to be evangelizing for.

What I do see in our best clients and those rare, transcendent organizations that truly understand the relationship between people, process, and technology — and are able to leverage that knowledge to inform their overarching business strategy — is a very healthy blend of data and business knowledge, each applied judiciously based on the challenge at hand. Smart business leaders leveraging insights and recommendations made by a trusted analytics organization — not automatons pulling levers based on a hit count, p-value, or conversion rate.

Kishore Swaminathan, Accenture’s chief scientist, in his discussion on “What the C-suite should know about analytics” outlines how an over-dependence on data can lead to “analysis-paralysis”, stating:

“Data is a double-edged sword. When properly used, it can lead to sound and well-informed decisions. When improperly used, the same data can lead not only to poor decisions but to poor decisions made with high confidence that, in turn, could lead to actions that could be erroneous and expensive.”

Success with web analytics and optimization requires a balance, and business leaders who will be successful analytical competitors in the future will need to develop a top-down strategy to govern how their businesses will leverage both digitally-generated insights and the collective know-how of their organizations. Conversely, being “driven” implies imbalance and over-correction — going out of your way to devalue experience, ignore process, and eschew established governance in favor of a new, entirely metrics-powered approach towards decision making.

You can do this, but to Swaminathan’s point, what if the numbers you’re using are wrong?

I think that creating a “data-informed” business is a huge victory and for most companies a major step in the right direction. What’s more, working to create a “data-informed” business shows respect for the hard work, commitment, and passion your employees have for their jobs and your company and products.

Rather than walk in and “embarrass the boss” with your profound and amazing knowledge of customer interactions, you can actively work with your management team by providing insights and recommendations that reflect your knowledge of how the entire business works, not just your amazing talent as a web analytics implementer (or analyst, whatever …)

But I digress.

I’m interested in your collective thoughts here, people. Am I over-reaching after a blogging hiatus and unnecessarily sniping in hopes of an early Fall dust-up on Google+? Or have you had the same thoughts and/or concerns, that by insisting that everyone needs to do exactly what the data tells them we risk alienating (again) the very consumers of our efforts? Do you work at a truly “data driven” business and do what the numbers tell you each and every time? Or are you working to create a practice where otherwise smart, hard-working, and passionate marketers, merchandisers, and business leaders can benefit from the type of information and insights you are uniquely able to provide as a digital measurement, analysis, and optimization specialist?

While you consider your response I’ll leave you with a story that has shaped some of my thinking about web analytics over my career. Years ago my good friend Shari Cleary brought me into CBS News in New York to train her editorial team on Hitbox (yeah, Hitbox, I told you it was years ago!). Most of my clients at the time were “new school” but not these guys — they were hardcore news editors from the TV side of the business who had been tasked with making digital news work.

I talked and talked and talked about how powerful Hitbox was and how real-time analytics was going to power the content they put out there in the world. The editors were polite and showed real interest in the training until at one point the oldest and most grizzled of the group stopped me.

“Son, we’re not going to let the data make the decisions for us regarding editorial content,” he said with all sincerity. I was, of course, shocked to hear this — I mean, hell, that is what Hitbox was for! Figuring out which stories generated page views and which needed to be rolled off the page into obscurity.

“Umm, why is that?” I asked, figuring he’d lay into me about the inaccuracy of the system or how painful it was to use Hitbox …

“Because if we let the data drive editorial, all you will read about at CBS News is Paris Hilton’s breasts and Lindsay Lohan’s drinking problem.”

Needless to say, I stopped talking about real-time, data-driven changes to editorial content.

As always I welcome your comments, criticism, and feedback.

Analytics Strategy, General, Social Media

Massive Web Analytics Throw-down in Google+

Much to my chagrin, having been outed by the local newspaper for my original dismissal of Google+, it appears that the web analytics community is prepared to go “all in” on the social network. What’s more, because we’re no longer bound by 100-odd characters (after we @respond and #measure tag), suddenly some incredibly bright minds are able to rapidly contribute to an emerging meme.

Interested? I knew you would be.

Head on over to my stream at Google+ and catch up on the conversation stemming from Tim Wilson’s recent critique of Adobe SiteCatalyst 15. Certainly the thread has diverged somewhat but if you’re in web analytics and on Google+ we would all welcome your contribution.

>>> Web Analytics Platforms are Fundamentally Broken

If you’re not on Google+ click on this link as I have bunches of invites I can share.

Adobe Analytics, Conferences/Community, General

Great jobs and a great gathering in Atlanta next week

Just a quick note from my vacation getaway to call readers’ attention to two great jobs at The Home Depot and to let Atlanta-area readers know that I will be in town next week for a special “Web Analytics Wednesday on Tuesday” put together by Keystone Solutions’ Rudi Shumpert and HD’s own Wesley “Big Wes” Hall. The event will be at the Gordon Biersch in Buckhead and I’m hoping that Rudi and Wes will allow an informal Q&A session about some of the great things that are happening in our industry lately.

>>> Register to join us at Web Analytics Wednesday, Atlanta, on Tuesday, July 19th

Regarding the jobs, our client at Home Depot is aggressively putting together a team of digital measurement specialists to help lead the company’s digital efforts forward. We have been helping the company with their digital measurement strategy now for about six months and the effort is really beginning to pay off in terms of their use of technology, the talent they are getting in the door, and the value web analytics brings to the company both online and off.

Have a look at the Senior Analyst and Manager, Web Analytics jobs on our web site and come see me next week at Web Analytics Wednesday if you’d like a personal introduction or have any questions:

>>> Job description, Senior Web Business Analyst at The Home Depot

>>> Job description, Web Analytics Manager at The Home Depot

I hope you are all having a great, relaxing summer and look forward to seeing you at a conference, event, or Web Analytics Wednesday sometime in the near future.

Adobe Analytics, General

SiteCatalyst Advanced Search Filters [SiteCatalyst]

One of the features that I find deceptively difficult at times in SiteCatalyst is the use of the Search feature. I feel like there are many times I use this and end up messing it up. Therefore, I decided to do my best to share what I have learned about what works and doesn’t work in the hopes that it will save you aggravation and time! I also hope that many of you can add a comment to this post with your tips and tricks so we can all learn something…

The Basics

First, let’s start out with the basics. Hopefully if you are a SiteCatalyst user you know that the search function is used to filter results in eVar and sProp reports. You simply enter a value and SiteCatalyst will look for that value in the active report and return the matching rows. This is handy because you can bookmark reports, make custom reports or add reports to dashboards after you have created the filter so that you never have to apply it again.

For example, let’s start with a Pages report like this:

Obviously we have pages from all sorts of countries, but if we only wanted to look at pages from England, all we would have to do is enter “SFDC:uk:” in the search box (top-right) and we would then see a report like this:

But what if we wanted to see pages from England or France? At this point we have two options. You can either enter “SFDC:uk: OR SFDC:fr” in the search box or use the advanced search editor. Here is what it would look like with the OR statement in the regular search box (look at the top-right portion):

However, believe it or not, if you change the “OR” to be a lower case “or” you will get no results! I kid you not! I call that an “Omniture-ism” and you just have to remember it…

The other way to get to the same report is to use the Advanced Search tool. You get there by clicking on the Advanced link to the right of the search box. Once there, you would enter the appropriate phrase in the first box, click the “+” sign to add another search criteria and then enter the second phrase so it looks like this:

However, it is important that you change the top drop-down box from the default of “if all criteria are met” to “if any criteria are met” or you will get no results.

If you wanted to look for cases where there were pages on the UK website that had the phrase “form” in the pagename, that would be a case where you would use the “if all criteria are met” option and your query should look like this:

This would result in a report like this:

Finally, we can come full-circle and get more advanced and use an “AND” statement in the standard box to get the same result. Here is what the search box would look like:

Again, keep in mind that the “AND” is case-sensitive…
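For those who like to think about this in code, here is a rough Python model of the search-box behavior described above, using a handful of made-up page names. It is only a mental model of the rules as I understand them (uppercase OR/AND act as operators, lowercase does not); it is not how SiteCatalyst actually parses the search box.

    pages = [
        "SFDC:uk:home", "SFDC:uk:contact-form", "SFDC:fr:home",
        "SFDC:de:home", "SFDC:us:order-form",
    ]

    def matches(page, query):
        """Approximate the simple search box: only uppercase OR/AND act as operators."""
        if " OR " in query:
            return any(term.strip() in page for term in query.split(" OR "))
        if " AND " in query:
            return all(term.strip() in page for term in query.split(" AND "))
        return query.strip() in page

    print([p for p in pages if matches(p, "SFDC:uk: OR SFDC:fr")])
    # UK or France pages
    print([p for p in pages if matches(p, "SFDC:uk: AND form")])
    # UK pages containing "form"
    print([p for p in pages if matches(p, "SFDC:uk: or SFDC:fr")])
    # lowercase "or" is not an operator, so this matches nothing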

More Difficult Searches

So now that we have covered the basics, let’s get a bit more advanced. First, let’s keep going with our example and say that we need to find all pages in the UK or France that have the word “form” in them. This gets a bit tricky because we are mixing OR and AND statements. Using the Advanced Search query builder, here is how you would enter it:

Conversely, if for some reason we wanted to see any UK Pages that had the phrase “form” in them and all France pages (not sure why, but this is just an example), we would enter this:

Which would result in a report like this:

Note that in this case we had to change the drop-down box back to the “any criteria” option since we did the AND statement within one of the criteria (hey…I told you this was the difficult part!).

The trick here is to combine any OR and AND statements into each row since each of the individual search criteria has to be either an “AND” or an “OR” clause.

On a separate note, in the advanced search area, you can change the drop-down (which defaults to “Contains”) to “Does Not Contain.” So if, for example, you wanted to see all UK pages but exclude those that had “login” in the name, you would enter the following criteria:

Note that for this instance, we need the “all criteria are met” option…

Finally, just for fun I entered the following phrase in the “simple” search box…

…and miraculously it produced the same results!! I decided to stop here before I broke anything, but you can feel free to see how far you can push this!!

But wait…There’s more! I have been amazed by how few people I meet know this next one… Imagine that you are looking at an eVar report and you have broken it down by another eVar via Subrelations. Here is an example where I have taken the Site Locale eVar and broken it down by Internal Search Term:

Now, let’s say that you wanted to do a search filter to only see items that mention “Outlook.” The easy way to do this is to just enter the phrase “Outlook” in the search box and SiteCatalyst will show any rows that have that phrase. But what if you wanted to see the phrase “Outlook” in just United States or Japan? No matter what you put in the search box, you will not get the results you are looking for (i.e. outlook AND “united states” OR japan). Would you know how to do this? Most people I meet don’t. Here is how…

When you are using a Subrelation report, you have to keep in mind that SiteCatalyst is running two reports and it doesn’t know which report you want to filter on. Therefore, we need to tell SiteCatalyst which report we want the search term to be associated with. You can do this in the Advanced Search area. When you have a Subrelation report, and you click on the Advanced Search area, you will see a new option that allows you to select one of the two reports being subrelated like this:

Most people haven’t ever noticed this new option, so now that we know it is there, all we have to do is enter each search term against the report it applies to, and we can get our results. For the example above, we would enter “Outlook” in the search box next to Internal Search Term and “United States OR Japan” in the search box next to Site Locale like this:

Now, since we have been a bit more specific, we can get a nice, clean report like this:

Just keep this handy feature in mind the next time you are trying to search in a Subrelations report and pulling your hair out because you can’t get the results you think you should!
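If it helps to see that logic spelled out, here is a small Python sketch of a two-dimension filter where each dimension gets its own search term, as described above. The rows, locales, and search phrases below are invented purely for illustration.

    # (site_locale, internal_search_term, searches) -- invented sample rows
    rows = [
        ("United States", "Outlook settings",  1200),
        ("United States", "Salesforce login",   950),
        ("Japan",         "Outlook sync",       400),
        ("Germany",       "Outlook error",      300),
    ]

    def locale_matches(locale):
        # equivalent of filtering Site Locale on "United States OR Japan"
        return any(term in locale for term in ("United States", "Japan"))

    def term_matches(search_term):
        # equivalent of filtering Internal Search Term on "Outlook"
        return "Outlook" in search_term

    for locale, term, searches in rows:
        if locale_matches(locale) and term_matches(term):
            print(locale, term, searches)
    # the Germany row drops out even though it mentions "Outlook"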

Even More Difficult Stuff

Phew! If you’ve made it this far, you are really devoted to your craft. We’re almost there so hang on…

The next thing that is important to know is that you can use wildcards in your searches. To do this, you use the “*” symbol in the search query. For example, if we wanted to find any pages in the UK that have the phrase “landing” somewhere in the name, we could simply do a search like this:

The next thing to know is that Omniture can be a bit quirky when it comes to the [SPACE] separator in the search box. Let me illustrate. If I enter the phrase “home page” in the search box, here are the results I get:

This seems strange to me since none of these pages have a space in them. That would make you think that a [SPACE] is a valid separator and that this query is the same as “home OR page” right? But if I use that logic and enter this phrase “SFDC:uk: SFDC:fr:” which is really just two phrases separated by a space (just with a colon in the phrase), I get no results. I am sure there is a logical reason for this, but I am not sure what it is. Maybe if SiteCatalyst sees a “:” or a “|” it acts differently (maybe Jorgen can enlighten us on this)?

To be safe, I use the next feature – using quotes – whenever possible. My advice is that if you ever have phrases with spaces in them, you enclose them in quotes and stick to using OR statements. In the preceding example, if I change my “home page” query to be “home page” in quotes, I get the expected result, which is no results. Another lesson to be learned here is that you should, whenever possible, avoid putting spaces in values that you think you will search upon. I do my best to remove all spaces from page names since that is the variable I search on the most!

Finally, you can use the “-” sign to remove things from search results. This produces the same effect as using the “Does Not Contain” feature in the advanced search area. As in the previous example, if I want to see all UK pages, but not ones that have the phrase “login”, I can enter the following in the search box:

To see UK pages that do have login in the name, you can also enter this phrase:

But when the results come back, it will mysteriously remove the “+” sign and just use a space as the separator, producing the same results.
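Here is one last illustrative Python snippet that pulls the wildcard, quoted-phrase, and “-” exclusion ideas together. Again, this is just a toy model of the behavior described above (with invented page names), not Adobe’s actual search parser.

    import fnmatch

    pages = ["SFDC:uk:home", "SFDC:uk:landing-page", "SFDC:uk:login", "SFDC:fr:landing"]

    # Wildcard: "*" matches any run of characters
    print(fnmatch.filter(pages, "SFDC:uk:*landing*"))     # ['SFDC:uk:landing-page']

    # Quoted phrase: treat the whole quoted string as one literal term
    query = '"landing-page"'
    literal = query.strip('"')
    print([p for p in pages if literal in p])              # ['SFDC:uk:landing-page']

    # Exclusion: "SFDC:uk: -login" keeps UK pages without "login" in the name
    include, exclude = "SFDC:uk:", "login"
    print([p for p in pages if include in p and exclude not in p])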

Final Thoughts…

So there you have it! Pretty much everything I know about using search and advanced search in SiteCatalyst. Do you have any additional tips or tricks? If so, leave a comment here…Thanks!

Analytics Strategy, General

Three Great Jobs at Best Buy

Now that summer is upon us I suspect that some of my personal blogging activity will slow down, but I wanted to call my readers’ attention to three great jobs that our good friends at Best Buy just posted:

  • Senior Analyst, Digital Analytics
  • Associate Manager, Digital Analytics
  • Manager, Digital Analytics

Those of you who were at Emetrics in San Francisco this Spring heard some of the story about the work we’ve been fortunate to help with at Best Buy. Those of you coming to Internet Retailer in San Diego on June 16th will get to hear a shortened version of the same story. If you can’t/didn’t make either event I am happy to put interested parties directly in touch with the hiring manager at Best Buy; email me directly for details.

If you are coming to Internet Retailer, come and hear Lynn Lanphier (Best Buy) and me tell their amazing story.

General, Social Media

The Crowd Has Spoken: Gilligan It Is

(I’ll return to serious posts shortly!)

A couple of weeks ago, I asked for input as to my new profile picture on this blog and elsewhere across the socialmediaverse. The crowd has spoken, a $74 donation has been made to the Appalachian Trail Conservancy, and it looks like I’m now due to have photographic alignment with the blog name:

Part of the inspiration for this exercise was that I’ve had the experience before of thinking I know what someone looks like as I’ve gotten to know them digitally solely based on a single picture…and then been surprised in some way by their appearance when I actually meet them in person. This came up a couple of times at eMetrics in San Francisco. So, in addition to changing my standard profile picture, I’ve also added a collage of photos to my About page. The challenge there is that I’m an amateur photographer, so am more often behind the camera than in front of it. That made for slim pickin’s on the photo front, but there’s enough there that you can get a better sense o’ me, should you care to have that!

Conferences/Community, General, Social Media

Announcing "Demystified Days"

UPDATE MAY 6, 2011: Under threat of litigation we have decided to postpone Demystified Days for the time being. You can read more about this decision here.

I am incredibly excited to let all of you know about something that Adam, John, and our friends at Keystone Solutions will be doing this coming September that builds on our long-standing commitment to local web analytics communities and our more recent efforts to support nonprofits around the world … something we are calling “Demystified Days!”

Check out the mini-site for Demystified Days right now!

For years we have been helping local web analytics communities around the globe connect with each other as part of Web Analytics Wednesday, and by every measure, Web Analytics Wednesday works. Thanks to current and past sponsors — great companies like I.Q. Workforce, Coremetrics (an IBM Company), SiteSpect, and hundreds of other companies who have hosted regional events — Analytics Demystified has brokered more personal introductions (and served more beers) than any other organization or group in our industry.

This past year we have been trying to leverage our connections in the industry to do something truly good and solve bigger problems. The result was, of course, the Analysis Exchange — the world’s only effort to provide free analytics support to nonprofits and nongovernmental organizations — which, thanks to the efforts of great people like Wendy Greco, Emer Kirrane, Jason Thompson and our mentors and students, has changed how people learn how to tell stories with data.

Now we are taking it to the next level, one city at a time.

Starting September 12th in San Francisco we will be bringing a day-long educational and networking event to cities across the globe. The format will be one you are all familiar with — great presentations in the morning and great conversations in the afternoon, of course followed by drinks and networking at Web Analytics Wednesdays in the evening.

We could easily do these events for free … but we aren’t going to. Instead we are going to find awesome sponsors to help us offset costs and ask everyone who participates to buy a $99 ticket to the event. Then, at the end of the day, we are going to add up all of the revenues, subtract out all of the costs, and donate every penny that is left to two local charities decided on by the event participants.

Our hope is to be able to donate a total of $50,000 to six charities in the United States. You can help us achieve that goal by doing three very easy things:

  1. Helping us spread the word about Demystified Days within your social network. We have created a short URL http://bit.ly/demystifieddays and you can tag tweets about these events with #demystifieddays.
  2. Joining us in San Francisco, Atlanta, and Boston. We are finalizing venues right now and will post ticket purchasing information in the next few weeks so watch for that!
  3. Emailing us to let us know you are interested in Demystified Days. The mini-site has a form at the bottom that will let you indicate your interest. Fill out the form and we will keep you in the loop!

On behalf of the teams at Analytics Demystified and Keystone Solutions we sincerely hope you are excited about what Demystified Days can become. We welcome your questions in comments or directly via email.

Help spread the word!

 

General, Technical/Implementation

Need A Checkup? The Doctor Is In!

When it comes to your health, most doctors say that having a regular checkup is the easiest way to prevent major illness. By simply going to see your doctor once a year, you can get your vitals evaluated and see if your blood pressure is too high or low, check your cholesterol, etc… If you happen to be sick at the time you have your checkup, you can find out if it is serious or not, and if you feel fine, the checkup is a way to confirm that you are in good shape.

However, when it comes to web analytics implementations, it isn’t always easy to know how “healthy” you are. You might wonder the following:

  • Is my organization capturing the right data to ensure it can do the analysis needed to improve conversion rates?
  • Do the configuration settings of our web analytics tool make sense?
  • Are we maximizing the use of our web analytics tool or are we only using 20% of its capabilities?
  • How does our web analytics implementation compare to that of my peers/competitors?

Over the past decade, I have been associated with hundreds of web analytics implementations, and the above questions were ones that often kept my clients awake at night. And, truth be told, based upon my experience, many of them had reason to be worried. More often than not, when I crack open a client’s web analytics implementation, I am shocked by what I see. Here are a few examples of problems I encounter repeatedly:

  • Unusable pathing reports due to inconsistent page naming practices
  • Unusable campaign reports due to inconsistent tracking code naming conventions
  • Web analytic variables/reports defined, but with no data
  • Cookie settings that don’t line up with business goals (e.g. cookie settings using Last Touch when Marketing uses First Touch)
  • Data inconsistencies resulting in reports that are highly suspect or untrustworthy
  • Incomplete meta-data or look-up tables
  • Lack of critical KPIs and best practices specific to the industry vertical the website serves
  • Lack of appropriate usage of key web analytics tool features that could improve overall analytic success

The remainder of this post will discuss a new service offering Analytics Demystified will be providing to address the preceding concerns. If you are interested in knowing the “health” of your organization’s web analytics implementation, please read on…

Introducing the Web Analytics Operational Audit

So how do you know if you are doing well or poorly? Like anything, the best way to know where you stand is to perform a checkup or audit. In this case, I am referring to an audit that reviews which web analytics tool features you are utilizing and what data your web analytics implementation is currently collecting.

Since there is no official “doctor” when it comes to web analytics, we at Analytics Demystified have created what we believe is the next best thing. Taking advantage of our depth of experience in the web analytics arena, we have created a Web Analytics Operational Audit scorecard that encompasses the best practices we have seen across all company sizes and industry verticals. This scorecard is vendor-agnostic and has over 100 specific items and categories that allow you to see where your current web analytics implementation excels and where it is lacking.
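To give a flavor of how a scorecard like this can roll individual checks up into category-level scores, here is a tiny, hypothetical Python example. The categories, items, and pass/fail values below are invented for illustration and are not the actual Demystified audit framework.

    from collections import defaultdict

    # Hypothetical audit items: (category, item, passed)
    audit_items = [
        ("Page Naming",   "Consistent page naming convention",       True),
        ("Page Naming",   "No blank or duplicate page names",        False),
        ("Campaigns",     "Tracking codes follow a naming standard", False),
        ("Campaigns",     "Campaign classifications populated",      True),
        ("Configuration", "Cookie settings match marketing goals",   True),
    ]

    scores = defaultdict(lambda: [0, 0])            # category -> [passed, total]
    for category, _item, passed in audit_items:
        scores[category][0] += int(passed)
        scores[category][1] += 1

    for category, (passed, total) in scores.items():
        print(f"{category}: {passed}/{total} ({passed / total:.0%})")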

Over the years, I have done this type of scoring informally, but the Operational Audit framework we have created at Demystified takes this to a whole new level. Here is a snapshot of what the scorecard looks like so you can see the format:

Our goal in creating this Operational Audit project is to have a simple, yet powerful way to objectively score any web analytics implementation from a functionality point of view. Knowing where your organization stands with respect to its web analytics implementation is beneficial for the following reasons:

  • If you think you have a robust implementation, but it turns out that you do not, you may be making poor business decisions today based upon faulty data and/or incorrect assumptions
  • What if your implementation is worse than you thought? You can try and hide it, but I have found that in the long run, bad web analytics implementations are eventually found out…usually at the worst time when an executive needs something critical and you have to come back and say “sorry, we don’t have a way to know that…” Wouldn’t you like to know sooner, rather than later, what shape you are in so you can get your web analytics house in order?
  • Maybe you have an awesome web analytics implementation, but your boss doesn’t know it! What would it do to your job/career if your boss was told by an independent 3rd party that all of the time and money they invested in your web analytics implementation has paid off? What if your web analytics implementation was in the top 10% of the general web analytics population? Promotion anyone?
  • Your organization doesn’t have unlimited time and budget for web analytics implementation projects. When the stars align and you do get resources or budget, wouldn’t it be great to be armed and ready with the top things you should be doing so you don’t miss these golden opportunities?

These are just a few of the many reasons that auditing your implementation makes sense. One important note: this Operational Audit does not include a technical audit of JavaScript tagging (which can be equally as important!).

Go Forth and Audit!

As I stated earlier, the unfortunate truth is that there is more bad than good out there. People change roles, priorities change, people leave your company, companies merge. There can be any number of reasons contributing to the devolution of web analytics implementations, but regardless of how you got to where you are, if you want to be successful, you need to grab hold of the reins of your current web analytics implementation and take ownership of it.

For example, when I joined Salesforce.com, I could have spent my time blaming our implementation shortcomings on my predecessors, but that wouldn’t help me get to where I needed to go. Instead, I chose to audit our implementation and identify what was worth keeping and what had to go! In the end, our company was better for it, and the audit led to an implementation roadmap for the next year, allowing me to know how long it would take to turn things around and what type of resources I would need.

It is based upon this recent experience that I highly encourage you to consider this Operational Audit service for your organization. Long term, one of my hopes is that I can audit enough companies across various company sizes and verticals to create a benchmark of web analytics implementations, so I can let you know how your scores compare to others like you. This way, even if most companies score poorly, you can possibly claim to be the best of what is currently out there (can you tell I liked being graded on a curve in high school?). I am also looking forward to re-scoring companies next year so they can see how their implementation has improved year over year.

Intrigued? Interested? Scared?

If you’d like to learn more about having your web analytics implementation audited, please contact me and I’d be happy to answer any questions. Thanks!

 

General, Social Media

Measuring the Super Bowl Ads through a Social Media Lens

Resource Interactive evaluated the Super Bowl ads this year from a digital and social media perspective — how well the ads integrated with digital channels (web sites, social media, mobile, and overall user experience) before and during the game. I got tapped to pull some hard data. It was an interesting experience!

A Different Kind of Measurement

This was a different kind of measurement from what I normally do. I definitely figured out a few things that we’ll be able to apply to client work in the future, but, while on the surface this exercise seemed like just a slight one-off from the performance measurement we already do day in and day out, it actually has some pretty hefty differences:

  • Presumption of Common Objectives — we used a uniform set of criteria to measure the ads, which, by definition, means that we had to assume the ads were all, basically, trying to reach the same consumers and deliver the same results. Or, to be more accurate, we used a uniform set of criteria and then made some assumptions about the brand to inform how an ad and its digital integration was judged. That’s a little backwards from how a marketer would normally measure a campaign’s performance.
  • Over 30 Brands — the sheer volume of brands that advertise at the Super Bowl introduces a wrinkle. From Teleflora to PepsiMax to Kia to Groupon, the full list was longer than any single brand would normally watch as its “major competitors.”
  • Real-Time Assessment — we determined that we wanted to have our evaluation completed no later than first thing Monday morning. The reality of Marketing, though, is that, even as there is a high degree of immediacy and real-time-ness…successful campaigns actually play out over time. In this case, however, we had to make a judgment within a few hours of the end of the game itself.
  • No Iterations — I certainly could (and did) do some test data pulls, but I really had no idea what the data was going to look like when The Game actually hit. So, we chose a host of metrics, and I laid out my scorecard with no idea as to how it would turn out once data was plugged in. Normally, I would want to have some time to iterate and adjust exactly what data was included and how it was presented (certainly starting with a well-thought-out plan of what was being included and why, but knowing that I would likely find some not-useful pieces and some additions that were warranted).

It was a challenge, for sure!

The Approach

While the data I provided — the most objective and quantitative of the whole exercise — was not core to the overall scoring…the approach we took was pretty robust (I had little to do with developing the approach — this is me applauding the work of some of my co-workers).

Simply put, we broke the “digital” aspects of the experience into several different buckets, assigned a point person to each of those buckets, and then had that person and his/her team develop a set of heuristics against which they would evaluate each brand that was advertising. That made the process reasonably objective, and it acknowledged that we are far, far, far from having a way to directly and immediately quantify the impact of any campaign. Rather, we recognized that digital is what we do.  Ad Age putting us at No. 4 on their Agency A-List was just further validation of what I already knew — we have some damn talented folk at RI, and their experience-based judgments hold sway.

For my part, I worked with Hayes Davis at TweetReach, Eric Peterson at Twitalyzer, and my mouse and keyboard at Microsoft Excel to set up seven basic measures of a brand’s results on Twitter and in Facebook. For each measure, there were either two or three breakdowns of the measure, so I had a total of 17 specific measures. For each measure, I grouped each brand into one of three buckets: Top performer (green), bottom performers (red), all others (no color). My hope was that I would have a tight scorecard that would support the core teams’ scoring — perhaps causing a second look at a brand or two, but largely lining up with the experts’ assessment. And, this is how things wound up playing out.

The Metrics

The metrics I included on my scorecard came from three different angles with three different intents:

  • Brand mentions on Twitter — these were measures related to the overall reach of the “buzz” generated for each brand during the game; we worked with TweetReach to build out a series of trackers that reported — overall and in 5-minute increments — the number of tweets, overall exposure, and unique contributors
  • Brand Twitter handle — these were measures of whether the brand’s Twitter account saw a change in its effective reach and overall impact, as measured by Twitalyzer; Eric showed me how to set up a page that showed the scores for all of the brands we were tracking, which was nifty for sharing.
  • Facebook page growth — this was a simple measure of the growth of the fans of the brand’s Facebook page

The first set of measures was captured during the game, and we normalized those using the total number of seconds of advertising that the brands ran. The latter two sets of measures we assessed based on a pre-game baseline. We used Monday, 1/31/2011, as our baseline date. Immediately following the game, there was a lot of manual data refreshing — of Facebook pages and of Twitalyzer — followed by a lot of data entry.

As it turned out, many of the brands came up short when it came to integrating with their social media presence, which made for a pretty mixed bag of unimpressive results for the latter two categories above. Sure, BMW drove a big growth in fans of their page, but they did so by forcing fans to like the page to get to the content, which seems almost like having a registration form on the home page of a web site in order to access any content.

The Results

In the end, I had a “Christmas Tree” one-pager: for each metric, the top 25% of the brands were highlighted in green and the bottom 25% were highlighted in red. I’m not generally a fan of these sorts of scorecards as an operational tool, but, to get a visual cue as to which brands generally performed well as opposed to those that generally performed poorly, it worked. It also “worked” in that there were no hands-down, across-the-board winners.
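As a rough illustration of how that kind of top/bottom-quartile highlighting can be computed, here is a short pandas sketch for a single metric. The brand names and values are made up; the real scorecard covered 17 measures across 30-plus brands.

    import pandas as pd

    # Hypothetical normalized values for one metric (e.g. tweets per second of ad time)
    brands = pd.Series(
        {"Brand A": 48.0, "Brand B": 12.5, "Brand C": 30.2,
         "Brand D": 5.1,  "Brand E": 22.7, "Brand F": 61.3},
        name="tweets_per_ad_second",
    )

    hi, lo = brands.quantile(0.75), brands.quantile(0.25)

    def bucket(value):
        if value >= hi:
            return "green (top quartile)"
        if value <= lo:
            return "red (bottom quartile)"
        return "unhighlighted"

    print(brands.apply(bucket))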

What Else?

In addition to an overall scoring, we captured the raw TweetReach data and have started to look at it broken down into 5-minute increments to see which specific spots drove more/less social media conversations:

THAT analysis, though, is for another time!

General

Should Google Offer a Paid Version of Google Analytics?

Recently there have been some rumors and buzz about Google releasing a “paid” version of Google Analytics (beyond what is currently available through Urchin). Assuming, for a second, that something like this is coming in the future, the real question is whether this is a good or bad idea. In this post, I’ll examine some of the pros and cons of this potential move by Google.

Why Google Should Offer a Paid Version

So what are some of the reasons that Google should offer a paid version of its web analytics offering? I can think of the following:

  • There will always be a group of web analytics users that want advanced functionality and are willing to pay for it. These advanced features are often resource-intensive, and I could see Google wanting to recoup some of the cost of enabling these features or the additional data storage they necessitate.
  • There are millions of websites using Google Analytics for free and if Google can extract even a small amount of revenue from these, it can add up quickly. Since I don’t think Google is hurting for revenue, I assume that the money generated would be filtered back into the product which would mean even more enhancements to a product that is pretty robust already.
  • One of the reasons Google may be thinking about offering a paid version of the product is to open the door to its sales team to cross-sell other Google products and services. By being free, Google Analytics has infiltrated millions of websites which creates an easy entrée for a Google sales rep to say: “I see that you are using Google Analytics, did you know that Google also offers Google AdWords, Google Apps, etc…” While they can already do this, if a company has already started paying for Google Analytics (and it has made it through procurement!), that makes the cross-sell so much easier. It also helps identify the companies that are serious, which will often be the ones willing to pay.
  • Services baby! It is no secret that professional services are a huge money maker. When I was at Omniture, we had a sizable consulting group and there are a host of other firms (including Analytics Demystified of course!) offering services around web analytics. While I am not sure if it would be a good move or not, Google could offer paid-for services around a paid-for web analytics tool itself or through its certified partners.
  • Competition! I love competition. I think it helps drive innovation. In my opinion, the consolidation of the web analytics industry over the last few years has reduced the amount of innovation and I think Google having a paid product will ultimately mean that everyone in the industry gets more.

Why Google Should Be Careful About Offering a Paid Version

So what are the pitfalls that Google might want to look out for? Here are a few worth considering:

  • Too much functionality! One of the strengths of Google Analytics is its simplicity. Since it is a free tool for most users, it has not been beholden to the axiom that more features must always be added to continue justifying the investment. Like all software products, as time goes by, more features are added to meet the needs of the most advanced users, which often results in casual users leveraging 10% of the functionality. While it looks like Phil & Nick have done a great job adding the features their users want to date, once someone is paying you money, the balance of power tends to shift in a big way (think difference between privately held vs. publicly traded company). I hope that Google will not lose its simplicity “mojo” that got it to where it is today.
  • Customer Support? One of the biggest expenses for software products is the cost associated with supporting its customers. When I worked at Omniture, we had a massive customer support organization of account managers and client care that grew exponentially. If Google has paid clients, I would imagine that it would need to provide support at a level that far exceeds what it is offering today. This is not an easy task and Google is known for being somewhat hands off for most of its products. When your product is free, people accept that they are going to be on their own more than when they are paying for something and if support isn’t good, I could see Google Analytics losing a bit of its current luster. I also imagine that Google loses quite a bit of money on Google Analytics (which I assume it makes up for on the AdWords side), and this will be even worse once it has to staff up to support users unless it can find a way to get its partners to offer that support.
  • SLAs (Service Level Agreements). Paid-for vendors have legal requirements around the availability of the product and the handling of product issues. To date, it is my understanding that Google Analytics has not had SLAs since it is a free product, but I would imagine Google would need to provide a reasonable SLA for the paid side. SLAs are never fun and usually end up costing time and money…
  • What happens if no one buys it? Google has done a lot of things that have changed the market and some that have not done quite as well (e.g. Google Wave). Google shook up the web analytics industry in a huge way with free Google Analytics, but what would it say if only a small percentage of companies decide to pay for its product? Does this serve as a boost to its paid competitors? I guess the real question comes down to this. If I am a Fortune 500 company and am currently using Google Analytics and a paid product from Omniture, Webtrends, Coremetrics or Unica (which is very often the case!), what features will Google Analytics add to its paid product that will get me to only use Google Analytics and get rid of my other paid vendor? I would guess that the things I would be looking for are 1) my own dedicated servers so I know my data is really my data and can be kept as long as I want, 2) knowledge that Google is not seeing any of my data and using it in its search algorithms, 3) support and SLAs of the same caliber I am getting from my other paid vendors and 4) 90% of the features I can get from my other paid vendor. If Google can deliver on these items (and I am sure it can), I think it will make a compelling case as to why companies should standardize on Google Analytics, but I don’t think this will be something that happens overnight.

Obviously, all of this is still speculation, but I, for one, look forward to seeing what Google does and how they address some of the items I have described here.

I highly recommend you check out this YouTube video on disruptive innovation. I think it is very cool to watch this and think about Google being the “entrant” and the other paid web analytics vendors as being the “incumbents” described in the video. This video talks about what Google has done to the other paid vendors and how Google could one day become the incumbent and fall prey to even newer entrants (or reincarnations of the old incumbents!). Fascinating stuff!

So what do you think? Will they do it? Will people buy it? What things do you think Google needs to do to make it successful? Please share your thoughts by adding a comment here…

Analytics Strategy, Conferences/Community, General

A few thoughts on the upcoming WAA Awards

I got a nice note this morning from Mike Levin at the Web Analytics Association:

“CONGRATULATIONS! You have been nominated for a WAA Award of Excellence in the category of: Most Influential Industry Contributor (individual). Your nomination recognizes the contributions you and/or your company have made to the web analytics industry. It is an honor to be nominated and the WAA congratulates you on your success.”

While I am honored by the recognition and delighted to have been nominated, I told Mike that I am declining to participate in the voting.

Mike wrote me back and seemed surprised, but my thinking is very simple: I have been very fortunate in my web analytics career and have received lots of recognition from my peers, my clients, and the press. I’m not one to bang my own drum and brag about my accomplishments … I prefer to just do my thing, help my clients and the community, and build a strong company for my partners and associates.

So I humbly and politely decline the honor and instead will cast my vote for folks I believe to be truly deserving of an industry honor. Here are the people I will be voting for:

  • Web Analytics Rising Star: Jason Thompson.  Jason is still a bit rough around the edges but I love his style and commitment to getting things done.  If I can vote twice I am voting for Michele “Jojoba” Hinojosa … her passion is palpable and her enthusiasm is infectious.
  • Most Influential Industry Contributor: John Lovett. I’m not sure John is actually eligible because he is on the WAA Board but his work on the WAA Code of Ethics is a monumental achievement and one that has the potential to shape our industry for years to come.  If I can vote twice my second nod goes to Jim Sterne … who has done more for this industry than Jim Sterne?  Damn right, nobody!
  • Most Influential Vendor: Google.  Most of the positive changes we have seen in the past two years in web analytics can be traced either directly or indirectly to the work that Brett Crosby and the team at Google Analytics put out there.  Second vote goes to Omniture given the critical mass they have been able to create and the big strides they have made on customer support and overall focus since the Adobe acquisition.

UPDATE: OMG I didn’t realize that Corry Prohens was running a shameless and ruthless campaign to win the “Influential Agency/Vendor” award.  You should read his “shameless campaign” blog post and consider voting for Corry.

  • Client/Practitioner of the Year: Best Buy. Difficult to not vote for one of your own favorite clients but I hope you will all come to my keynote presentation with Lynn Lanphier at Emetrics and hear why I cast this vote.  Second vote? Dell, for taking the advice I gave them last year to heart and now kicking ass and taking names for testing and optimization. Bravo!
  • Technology of the Year: Analysis Exchange. Now, of course, I’m not really going to vote for something I helped create, but I am pretty damn proud of the work we have done and with Wendy Greco at the helm things are only getting better.  If I could vote twice … I wouldn’t, because I’d be tempted to vote for Twitalyzer LOL!

Again, I do appreciate the nod from the WAA and am looking forward to the party — the Analytics Demystified and Keystone Solutions crews will be there in force. I wish everyone nominated for the WAA awards the best of luck and, as a native of Chicago, remember to vote early and vote often!

Don’t forget to nominate your favorite web analytics superstar!

Adobe Analytics, Analytics Strategy, Conferences/Community, General

Conference Season is Upon Us

Wow, I just got done looking more closely at the Analytics Demystified team calendar for the next few months and it is a doozy! Chances are if you live in the U.S. and do any type of digital measurement, analysis, or optimization professionally we are going to see you between now and the end of March.

If that is the case, we’d like to buy you a drink!

Despite each of us presenting, often multiple times, we are always happy to make time for our clients and potential clients when we are out-and-about.  If you realize you’re going to be at one of the following events why not drop us a line and we’ll see if we can connect. Who knows, maybe we’re planning a great party or something …

After all that the three of us are going to slink home to our loved ones and try and convince them we are in fact their fathers, husbands, and sons.

Seriously, though, we never get enough opportunities to meet with partners, friends, and prospects at these events so if you’d like to meet with any or all of us please drop us a line sooner than later so that we can block time and make plans.

Adobe Analytics, Analytics Strategy, General

Free webcast on Tag Management Systems on Jan 25th

Given the considerable buzz in the marketplace regarding Tag Management Systems and vendors like Ensighten, TagMan, and BrightTag I wanted to call your collective attention to a free webcast I am participating in next week on “The Myth of the Universal Tag.” On Tuesday, January 25th at 1:00 PM Pacific time I will be presenting with Josh Manion, CEO of Ensighten, and Brandon Bunker, Senior Manager of Analytics at Sony, detailing some of the advantages I see in the adoption of a tag management platform.

What’s more, the nice folks at Ensighten have taken the registration form off of my white paper on tag management systems and so everyone is free to read all of my thoughts on Tag Management without prompting a sales call.  How cool is that?

Spread the word:

“The Myth of the Universal Tag” free webcast sponsored by Ensighten
Tuesday, January 25th, 1:00 PM Pacific / 4:00 PM Eastern
Register online now at GoTo Meeting!

Don’t forget to download that free copy of my white paper on tag management systems!

Adobe Analytics, Analytics Strategy, Conferences/Community, General

Want to meet Adam Greco? Go to OMS 2011 in San Diego!

By now I hope you have heard that Adam Greco is joining John and me as a Senior Partner in Analytics Demystified. While his official start date is still a few weeks away, he’s already on the road as part of the Demystified team. If you’d like to meet Adam in person and talk with him about the practice he is building, here are a few places I happen to know he will be in the coming months:

  • Adam will be participating in the Web Analytics Association (WAA) Symposium in Austin, Texas on Monday, January 24th. Adam is talking about integrating web analytics and CRM which is core to his practice area given his past work at Salesforce.com and Omniture.
  • Adam will also be presenting at the Online Marketing Summit in San Diego, California on Tuesday, February 8th. He’ll be giving the same presentation on web analytics and CRM, discussing how to move marketing analytics from the server room to the board room.
  • Adam will also be joining me in Minneapolis on Wednesday, February 16th for a special Web Analytics Wednesday sponsored by our good friends at SiteSpect and with generous help from our friends at Stratigent.  We don’t have the details on the site yet, but the event will be in downtown Minneapolis and Adam and I will be doing some prognostication and fielding questions from Twin Cities locals.

Adam will also be at Webtrends Engage, Adobe’s Omniture Summit, and the Emetrics Marketing Optimization Summit but we’ll post more on that when additional details emerge.  Suffice to say Adam will be busy in his first few months on the job.

If you haven’t met Adam I would encourage you to head out to one of these events and introduce yourself. Especially if you’re a marketer and are considering the Online Marketing Summit — if you haven’t been to OMS you really need to go.  Every year I am absolutely blown away by the job that Aaron Kahlow and the OMS team do bringing that conference together.  OMS draws amazing speakers, amazing sponsors, and most importantly amazing conference participants and delivers an absolute fire-hose of information.

I’m sincerely bummed that Adam is taking my place at OMS this year — I haven’t actually missed a big OMS event in California ever — but I am confident that the audience will benefit greatly from Adam’s message about CRM integration, his direct experience at Salesforce.com, and his distinct presentation style.

General

weThink Podcast — Digital Trends and What They Mean for Marketers

Last fall, I started listening to the weThink podcast (here’s the iTunes link), which is unique in that I personally know all of the people who work on it. They’re some of my co-workers at Resource Interactive, and most of them are part of the RI Lab — our “R&D” wing: Matthew Santone, Dan Shust, and Chris “Barce” Barcelona, with Lisa Richardson as the moderator. They’re the go-to folks when it comes to what’s hot and happening in the digital and social space, and what those happenings mean for consumers and for marketers. Seeing as how I’m both a consumer AND a marketer, I pick up great info from every episode. Even better, the format and style of these bi-weekly chats are entertaining and engaging.

The most recent episode was a bit longer than usual, but it’s a good sample of the breadth of material they cover.

Predictions for 2011

Lisa asked the guys to complete the statement: “2011 will be The Year of…” and she got a range of responses:

  • Barce: Facebook Credits and the superphone
  • Matthew: data — the year we actually start making sense of, and creating great experiences out of, all of the data we’re collecting from consumers
  • Dan: the year of “the internet of things” and the year of Kinect-like technology (using motion to deliver great experiences)

CES 2011 Recap

Dan attended CES, while Matthew and Barce followed the event closely from afar. The highlights they discussed:

  • Tablets — the Motorola Xoom, which runs Google’s Android Honeycomb OS; the RIM Playbook, the Samsung Galaxy Tab, the Razer Switchblade, and all of the questions and issues around how the myriad form factors and applications will evolve (and how marketers and developers will deliver content to such a wide range of devices)
  • Superphones — the Motorola ATRIX made a splash at the show, but the larger discussion was around how a single device would truly become the centerpiece of a consumer’s digital life
  • 3D — 3D experiences are here to stay, but there was some debate as to whether this is really going to be driven more by consumers or more by manufacturers (and not just device manufacturers — Oakley and other sunglasses manufacturers are now introducing 3D glasses). Glassless group viewing may never happen (lenticular displays, even as they evolve, are still reliant on the viewer being in a small sweet spot to get the 3D effect), and what kind of human interaction barriers do 3D glasses introduce that limit the practical application of 3D?
  • Automotive — Audi’s attempts to deploy vehicle-to-vehicle communication such that vehicles can automatically collect data about weather and road conditions and share that information with other vehicles. This, I believe, is one example of “the internet of things” — all sorts of devices floating around the world that have both data collection and network connectivity capabilities
  • Motion — centered around Microsoft’s lead press conference at the event, with Steve Ballmer discussing what’s next for the Kinect — controlling both Netflix and Hulu Plus using hand gestures, as well as Kinect-based avatars interacting in a virtual space (the Second Coming of SecondLife, perhaps?). And, the gang discussed how Kinect-like technologies can make for richer and more relevant consumer experiences both in-home and in-store.

The Mac App Store

Apple has now released an app store for the Mac — think iTunes, but for Mac laptops and desktops rather than just for iPhones. This appears to be a harbinger of a future that sounds a little funny: a future where laptops and desktops run apps. But, these are apps in the smartphone/superphone/tablet paradigm, rather than the “heavy overhead installed software applications” that have been a mainstay of computers for years. These apps will have much more of a platform-agnostic and cloud-centric orientation — enabling cross-device usage of an app in a seamless manner. The Chrome Web Store is another example of this shifting paradigm, with the Tweetdeck, Mashable, and Amazon Window Shop apps available there being examples of where it appears this world is heading.

The iPhone on Verizon

The consensus was that the announcement that, as of February 10th, the iPhone will be available on Verizon, rather than solely with AT&T, will be one of the biggest non-news events of the year. While iPhone users are frustrated with the dropped calls they get with AT&T, they’re going to be equally frustrated by the fact that they cannot simultaneously make a phone call and maintain a data connection with their iPhone when they switch to Verizon. AT&T’s 3G service is GSM-based, which allows data and phone service simultaneously…but is prone to call dropping. Verizon’s 3G service is CDMA-based, which is less prone to dropped calls, but which cannot run data and phone at the same time. Both AT&T and Verizon are migrating to the GSM-based 4G LTE technology, so users, presumably, will have similar experiences and similar limitations once that happens.

One way to look at this announcement is that it is a further leveling of the playing field for a 2-horse race </mixedmetaphor> between the iPhone and Android-based phones: Windows Mobile 7 is awesome, but it’s wayyyyy too late to the game, and RIM just can’t seem to get out of its own way.

Picks of the Week

  • Barce: personal hotspots coming to all iPhones in March (Verizon and AT&T)
  • Matthew: over the holidays, he purchased and installed a Filtrete WiFi Enabled Programmable Thermostat — controllable via an iPhone app or a web interface — and is loving it
  • Dan: the new eBay Fashion iPhone app — a very cool augmented reality app whereby you put your eyes between a couple of markers and you can then “try on” sunglasses

Pretty cool stuff. If you listen to podcasts, it’s worth subscribing (iTunes link)!

Analytics Strategy, Conferences/Community, General

Big Changes at Analytics Demystified

I suspect by now many of you have noticed but this week we made two pretty amazing announcements here at Analytics Demystified. Now that the dust is settling I have some time to take a step back and offer up some comments on the announcements and what I believe they mean for our clients, our prospects, and the web analytics industry in general.

On Tuesday we announced that respected industry veteran Adam Greco had joined John and me as a Senior Partner. Adam is well-known to many in our community thanks to his high-visibility work during his tenure at Omniture, his popular “Omni-Man” blog, and his fine, fine work on the Beyond Web Analytics podcast series.

For John and me, bringing Adam on board was a no-brainer. The guy is as bright as they come, he is articulate, and most importantly he knows how to squeeze every last drop of value out of the most widely deployed digital measurement solutions in use today — Adobe SiteCatalyst and Google Analytics. Adam is committed to extending that expertise to all of the popular platforms as quickly as possible, and our hope is that by mid-year he will be providing the same great insights he has for SiteCatalyst to customers of Webtrends, Unica, Coremetrics, Nedstat, and other platforms.

Adam will be running our Operational Use Audit and Framework Development practice as well as providing custom training and generally supporting the rest of the Demystified service offerings.  Which brings me to our second announcement …

On Wednesday we announced an exclusive partnership with tactical and technical consulting practice leaders Keystone Solutions. Keystone is a slightly better-kept secret than Adam Greco, although their current clients certainly know who they are. Founded years ago by former Omniture super-star Matthew Gellis, Keystone has grown into a talent magnet comparable to, well, Analytics Demystified, with Matt Wright from HP, Kurt Slater from Expedia, Rudi Schumpert from Ariba, and a host of other amazing analytics technicians.

We have doubled-down with Keystone for one simple reason: in our experience they are the best of the best when it comes to providing fundamental and foundational support for any digital measurement practice. Especially on those same two “most popular” solutions — Google Analytics and Adobe SiteCatalyst — Keystone delivers in a way that few others out there can match, and that is the kind of talent we prefer to work with in the field.

Through this partnership Analytics Demystified clients will be able to benefit from a dramatically expanded set of web analytics consulting service offerings ranging from on-the-ground implementation support to ongoing reporting and analysis to some pretty amazing custom solutions. They will also be taking the lead on our Tag Management Systems Audit and Deployment practice, an offering I expect to be red-hot in 2011 and beyond.

Now, unfortunate as it is, we were not able to pursue this type of relationship with Keystone without some cost. The immediate fall-out is that Analytics Demystified will no longer be participating in the X Change conference. While this breaks my heart after having put three years of sweat equity into the event, relationships change and so it is time to move on.

I do, however, promise every one of the hundreds of consultants, vendors, and practitioners we have personally invited to this conference over the past three years that we will be back, live and in-person, with something far more “Demystified” in nature. Based on our work with Web Analytics Wednesday, the Analysis Exchange, and hundreds of other events around the globe, we have a pretty good idea of what is truly missing from the web analytics event landscape … and now, thanks to Adam and the team at Keystone, we have the means to deliver.

I welcome your comments and questions about both pieces of news, and I hope you’ll keep your eyes open in the coming few weeks for even more news from our growing company. These are exciting times, indeed.

General

Adam Greco, Demystified

I am extremely excited to begin this next chapter in my career and wanted to get started at Analytics Demystified by describing my past, the present, and what I hope to do in the future.

The Past

I began my career in consulting working for Arthur Andersen’s technology group, first in the field of Customer Relationship Management and eventually in Marketing where I ended up managing the website of the Chicago Mercantile Exchange. It was there, as one of Omniture’s first customers, that I began to learn about web analytics and the data behind websites.

For some reason web analytics came naturally to me, and I loved figuring out fun, new ways to use the technology at my disposal to answer interesting business questions. As I had done in my consulting days, I dug into every Omniture training manual available and soon found that I knew the technology almost as well as those at Omniture. I think that is what led me to ultimately apply to work at Omniture in their newly founded Omniture Consulting group.

While at Omniture, I had the pleasure of working with many different clients, large and small, and helping them get the most out of Omniture products. As I showed Omniture clients how to use the technology, I often heard clients say “I had no idea you could do that…” so I decided to start a blog to teach people what they really ought to know.

Fast forward a few years and I decided that I wanted to go back to the “practitioner” side of the house and was given a great opportunity to head up web analytics at Salesforce.com. In the role of Senior Director for Web Analytics over the past two years at Salesforce I have had a great time reviving the web analytics program. More importantly, thanks to the generosity of the Salesforce organization, I have been able to continue writing about Omniture technology, speaking at industry events, and helping to establish and grow the Beyond Web Analytics podcast series.

The Present

While working at Salesforce.com has been one of the highlights of my career, when you are used to working on ten or more web analytics projects at a time as a consultant, sometimes working on just one for a few years in a row can be tough. Plus, despite being fully employed these past two years, I have been constantly approached by great people asking how to get the most out of their web analytics efforts — requests that I more often than not had to turn down.

Having been in consulting for most of my career, I had to face the inevitable truth: I have always been destined to become a consultant again … to work with dozens of companies at a time, sharing my knowledge of Omniture with companies large and small working to maximize their investment in web and digital analytics.

The Future

Once I decided that I wanted to go back into web analytics consulting, the difficult part was figuring out which organization would be the best fit for me. There are so many great web analytics consultancies, and over the years I have become friendly with many of the leaders of these firms. However, in my mind, Analytics Demystified was my first choice because of the brand that they have built, the caliber of their clients, and the overall thought-leadership their principals have displayed throughout the years.

When I think about the web analytics industry as a whole, I think about the books that helped launch the industry, the Yahoo Discussion Forum, the camaraderie found on Twitter, Web Analytics Wednesdays and most recently, the Analysis Exchange, and the new Web Analysts Code of Ethics. The common theme in all of these pillars is Eric Peterson and the Analytics Demystified brand he has created.

It was only last year that I first met Eric face-to-face, but I have always been impressed by his ability to understand where our industry has been and, more importantly, where it is going. When John Lovett joined the organization, I enjoyed reading about all that he was doing in helping clients with their strategy, vendor evaluations, and social media efforts. In the last year, I got to spend some time with John and have been equally impressed with his passion for the industry and the work he has done as a board member for the WAA.

I know there are any number of organizations that I could have joined where I could have helped teach people what I knew and managed teams of consultants. But at Analytics Demystified I believe that I am in a position to be both teacher and also a student, working with two of the smartest, most respected people in the field.

At Analytics Demystified, my hope is to help as many clients as possible get the greatest value from their investments in web analytics technology. I am excited to continue helping clients using Omniture technologies, but I also look forward to branching out into other vendor tools and helping their clients as well. It is my hope that the combination of Eric and John’s strategic work with my design and architecture background will have a synergistic effect, helping Analytics Demystified clients to further achieve their digital business objectives.

I look forward to hearing from all of you here in my new blog, and by all means if I can help your business, please don’t hesitate to contact me.

Adobe Analytics, General

Form Submit Button Clicks

At the end of last year, I spent a bunch of time showing how you could dissect your website forms to see which were performing well and not so well. While this post will be different from those, it is still related to website forms. In this post, I am going to share a concept that will let you determine which of those visitors seeing your forms have the intention to complete them and which do not. This information can be very valuable as I hope to show.

Which Forms Get Visitors to Take Action?
If you have forms on your website, I hope that you are at least doing the basics and tracking how many people View each Form and how many Complete each Form like this:

This will allow you to have a rudimentary view of how each website form is performing. However, one shortcoming of this is that you only have two points of comparison. As a web analyst, I always like to have more data points to slice, dice and analyze. The report above answers the question: “How many people who see each form decide to complete it?” What if you wanted to know how many people who see each form try to complete it? That might be an interesting data point, since sometimes when you do a lot of Paid Search or Display Advertising you could be driving less qualified traffic to your website. Therefore, what I like to do is to create a new metric that I call Form [Submit] Button Clicks. This Success Event is set when website visitors click the button that you place on your form (duh!). By doing this, you have essentially created a wedge between the Form Views and Form Completes metrics shown above such that you can create a report that looks like this:

As you can see here, in the first report above we knew that only 786 of the 2,246 Form Views turned into Form Completions. However, with the second report, we now know that visitors to that specific form clicked the Form Submit button 830 times. That means that 44 times they tried to complete the Form, but were unable to for one reason or another (maybe Form Errors).
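Setting the Success Event itself is only a few lines of JavaScript. Here is a minimal sketch of one way to wire it up — the button ID, event number, and eVar number are placeholders of my own, so map them to whatever slots are free in your report suite:

// Hypothetical example: fire a "Form [Submit] Button Clicks" Success Event
// when the visitor clicks the form's submit button.
var submitButton = document.getElementById('form-submit');   // assumed button id
submitButton.onclick = function() {
  var s = s_gi(s_account);                 // grab the SiteCatalyst object
  s.linkTrackVars = 'events,eVar10';
  s.linkTrackEvents = 'event12';           // event12 = Form [Submit] Button Clicks (example slot)
  s.events = 'event12';
  s.eVar10 = 'us:contact-us';              // example Form ID, matching the Form Views/Completes eVar
  s.tl(this, 'o', 'Form Submit Button Click');
};

Because this fires as a custom link call (s.tl) rather than a page view, it will not inflate your page view counts.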

Dig Deeper With Calculated Metrics
Once you have this cool new Form Button Clicks metric, you can then create some fun new Calculated Metrics that let you dig even deeper. Here are two that I suggest: Form Button Click Rate & Form Button Click Fail Rate. The Form Button Click Rate is the number of Form Button Clicks divided by the number of Form Views. This metric shows you what percent of people viewing the Form actually click the button as shown here:

In this report you can see which forms on your website are doing a good job at getting visitors to click the button. Forms with low percentages might indicate that there are too many fields, poor content or a bad offer. You can use this report to zero in on which forms represent the biggest opportunity for improvement. I like to bubble-chart this data such that the forms with the most Form Views and the lowest Button Click Rate move to the “magic quadrant.”

The next Calculated Metric is the Form Button Click Fail Rate. This represents the percentage of times visitors click the Form Submit button, but fail to have a Form Complete. These people represent your “lowest hanging fruit” as by clicking the button, they have implicitly told you they are somewhat interested in you! You create this metric by dividing the difference between Form Button Clicks and Form Completes by the number of Form Button Clicks as shown here:

In this case, for the first form, about 5% of people who click the button don’t make it to a Form Complete, but the last form shown in the report seems to have some issues since 62% of Form Button Clicks don’t make it to a Form Complete. You may want to start doing some testing on that form!
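To sanity-check the math with the example numbers above (2,246 Form Views, 830 Form Button Clicks, 786 Form Completes for that first form):

// Quick check of the two Calculated Metrics using the example numbers above
var formViews     = 2246;
var buttonClicks  = 830;
var formCompletes = 786;

var buttonClickRate     = buttonClicks / formViews;                        // ~37% of viewers click the button
var buttonClickFailRate = (buttonClicks - formCompletes) / buttonClicks;   // ~5.3% click but never complete

console.log((buttonClickRate * 100).toFixed(1) + '% Form Button Click Rate');
console.log((buttonClickFailRate * 100).toFixed(1) + '% Form Button Click Fail Rate');

That 5.3% is the “about 5%” fail rate mentioned for the first form.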

As is always the case, whenever you create new Calculated Metrics you can see them as general metrics in addition to using them in eVar reports. Therefore you can set Alerts and see trends for both of the metrics described above:

What I like about these two metrics is that one shows you how good you are at getting people to click the button on the form (how good your offer/content is) and the other tells you how good you are at closing the deal once a visitor has decided to give you a chance. Those who have managed websites realize that there are very different tactics used to solve these two very different questions so having these metrics can really help you focus and use your precious website resources as efficiently as possible.

Don’t Forget Your Other Reports!
While the above reports hopefully get you excited, don’t forget that you already have many reports that can be combined with the information above to get even more value. For example, one of the reports I use a lot is the Traffic Driver (Unified Sources) report which shows me how each visitor got to my website. Wouldn’t it be cool if I could see Form Button Clicks and the above two new Calculated Metrics by Traffic Source? Well…you can! All you have to do is add these metrics to your existing Traffic Sources report like this:

Now you can see how each channel is doing! Looks like Paid Search (SEM) is generating lots of Form Views, but only gets 12% of these to turn into Form Button Clicks. If they do get someone to click the button, it looks like 55% of them don’t end up successfully making it to a Form Complete. This can be contrasted by SEO which seems to fare a bit better by getting 30% of its Form Viewers to click the button and of those 75% make it through to Form Completion. You can imagine how powerful this data could be and how you could use a product like Test&Target to come up with ways to improve these conversion rates by traffic source.

If you want to get even more granular, you can break this report down by the root traffic driver so you can take specific actions. In the following report, I can see the Paid Search ID’s that make up the Form Views and the other metrics and see how each performs individually:

Here we can see that there are some Paid Search keywords that are doing well (getting people to click on the submit button over 20% of the time) and others that are under-performing (less than 15%). You can use these metrics to help drive your Paid Search strategy or possibly automate this using SearchCenter. Finally, in this fictitious example, I have made row three have zero Form Completes, but a 32% Form Button Click Rate, which would indicate a major issue with the form that should be addressed.

One last example of leveraging an existing report would be the Visit Number report:

Here we can see that the Form Button Click Rate is pretty consistent, up a bit in the 3rd visit; interestingly, our Form Button Click Fail Rate appears to decrease over time. Perhaps the more time visitors take to get to know us, the more willing they are to deal with all of the information we are asking for on our forms!

Final Thoughts
Well there you have it. I always find it so amazing that adding one simple Success Event in the right place can open up so many new web analysis opportunities. If you have forms on your website, I hope this will help you learn more about your users and how they are interacting with your forms. Let me know if you have any questions…

Analytics Strategy, General, Social Media

It's not about you, it's about the community …

Happy New Year, my readers! I hope the recent holidays treated you well regardless of your faith, persuasion, or geographic location. I wanted to take a quick break from all the heavy privacy chatter these past few months and tell a little story about the generosity of our community and one individual in particular.

If you follow me on Twitter you may have noticed me cryptically tweeting “it’s not about you, it’s about the community” from time to time. I started sending this update as a subtle hint to a few folks who harp on and on about their accomplishments, products, and “research” in the Twitter #measure community … but sadly those folks never got the hint (so much for being subtle, huh?)

Over time the tweet became something larger — it became a reminder about what we all are capable of when we think about more than our own little world.  “It’s not about you, it’s about the community” is about some of the greatest contributors in the history of web analytics, people like:

  • Jim Sterne, who years ago realized that we needed a place to gather, and who wisely picked the Four Seasons Biltmore in Santa Barbara, California.  While Emetrics may have become a profit-generating machine, those of you who know Jim and know history understand that the conference is as much about and for the community as it is anything else;
  • Jim Sterne, Bryan Eisenberg, Rand Schulman, Greg Drew, Seth Romanow, and others who founded the Web Analytics Association years ago when it was clear that we needed some type of organizing body, committing themselves to hundreds of hours of work without thinking about how they would make money off of the effort;
  • Jim Sterne (again!!!!) who has been making sure that we all know who is doing what where and when via his “Sterne Measures” email newsletter for as long as I can remember;
  • Avinash Kaushik, Google’s famed Analytics Evangelist, who has long committed the profits from his books on web analytics to two amazing charities;
  • Super-contributors to the Web Analytics Forum at Yahoo Groups, folks like Kevin Rogers, Yu Hui, Jay Tkachuk, and dozens more who still take the time to answer questions from newer members of this rapidly expanding community;
  • Past and current Web Analytics Association Board members and super-volunteers, folks like Alex Yoder, Jim Novo, Raquel Collins, Jim Humphries, and so many more who give their time and energy every month to make sure the Association continues to evolve and grow;
  • Activists and evangelists like my partner John Lovett, who in the midst of writing his first book on social media analytics has taken the time to shepherd our Web Analysts Code of Ethics effort through the Web Analytics Association Board of Directors;
  • Everyone who has ever hosted a Web Analytics Wednesday event, including luminaries like Judah Phillips, June Dershewitz, Tim Wilson, Bob Mitchell, Emer Kirrane, Perti Mertanen, Alex Langshur, Anil Batra, Ruy Carneiro, Dash Lavine, Jenny Du, David Rogers, and way too many more folks to list who contribute their valuable time to help grow organic web analytics communities locally;
  • All of the over 1,000 members of the Analysis Exchange, many of whom have contributed to multiple projects to make sure that nonprofit organizations around the world have access to web analytics insights;
  • Dozens of others I am forgetting, and probably hundreds more I have never even met …

When I think about this list of people and their individual contributions to the web analytics community it is almost overwhelming — how lucky we are to have such considerate and giving friends!  Still, people have been giving back for years and so it is rare that I see something or someone in the community that really blows me away …

Until recently.

Not everyone knows Jason Thompson, and I suspect he would be the first to admit that not everyone who knows him actually likes him, but if I had to pick one “web analytics super-hero” for 2010 Jason would be my hands-down, number one choice.  See, Jason was smart enough to not just get the web analytics community to give back to our community, he managed to get our community to help provide clean water to an entire community in a developing nation.

Having worked repeatedly as a volunteer with Analysis Exchange, Jason was introduced to charity:water, a nonprofit organization whose vision is very simple: to provide clean, safe drinking water for everyone on the planet.

Water.

Not a great blog or free books, not data or solution profilers, but water that mothers can bring to their children. Clean, pure water that I would venture each and every one of the members of the web analytics community takes for granted and rarely even considers the source and its availability.

But Jason thought about it, and what’s more, Jason did something about it. Thanks to some cool new technology Jason was able to donate his 36th birthday to help raise $500. By leveraging Twitter and his web analytics community he was able to raise that $500 by December 18th.  Having met his goal before his birthday Jason didn’t stop and settle, he set the bar higher, working first to raise $1,000, then $3,000, and finally $5,000, enough to provide water for an entire village – 80 people for 20 years.

Jason’s effort brought out the best in our community again, collecting donations from luminaries and lay-users alike … hell, he even got money from his mom! Some of the biggest names in web analytics helped Jason along, and donations large and small rolled in right up until Ensighten’s Josh Manion put in the last $300 on Jason’s birthday, putting him over the top and completing his final goal.

Honestly I don’t know Jason very well, but I do know passion and greatness when I see it. Jason once again served as a reminder that “it’s not about you, it’s about the community” and he did more than just tweet obnoxiously … he put his time and money where his mouth is and did something real.

Bravo, Mr. Thompson.  Bravo.

If you don’t know Jason I highly recommend following him on Twitter (@usujason) and, if you see him at a conference or event, do like I will and buy the man a drink. I for one am going to let Jason be an example of how I can work even harder to make a difference both inside and outside of the web analytics community in 2011 and beyond.

Hopefully some of you will do the same.

Adobe Analytics, General

Tracking Form Errors (Part 3)

(Estimated Time to Read this Post = 4 Minutes)

In this series of blog posts, I have been talking about how to see what types of Form Errors your website visitors are receiving so you can improve conversion. So far, we have learned how to see how many Form Errors your website is getting, which fields are causing them, and how many Form Errors you get per Form and per Visit. As my regular readers know, I like to go beyond the basics, so now we are going to kick it up a notch and get into some real fun stuff. Fasten your seat belts!

Which Fields on Which Forms?
In my first post of this series I shared a simplistic way to learn which form fields caused errors using a List sProp. However, correlating this to specific forms was a bit trickier. Here I will show how to do this, even if you don’t have Discover. The trick here is to set a Form Errors eVar that stores all of the fields that had an error when the Form Errors Success Event is set. Since eVars have a longer character limit, this should be possible for most forms that aren’t too long (which they shouldn’t be anyway!). I like to do this by concatenating the field values into one long string with a separator between each field. Here is an example of the report you want to have:

This report will look a bit like the one I described in the previous post, but as you will see, it is much more powerful since it is in the conversion area and can take advantage of Conversion Subrelations. Besides being able to see which combination of field errors are troubling users, you can open your Form ID reports, find a specific form and then break it down by this new Form Error eVar to see the specific form fields causing problems by form as shown here:

Using this report, we can see that for the first form shown above, 66% of the time visitors get a Form Error they had eight form field errors (or left eight fields blank). This data, when coupled with observational data from a tool like ClickTale, can be invaluable in driving increased form conversions!
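For reference, setting that concatenated eVar might look something like the sketch below — the event and eVar numbers, delimiter, and field names are placeholders, not values from an actual implementation:

// Hypothetical example: when the Form Errors Success Event fires, join the
// names of all erroring fields into one delimited string and store it in a
// "Form Errors" eVar, alongside the Form ID eVar for breakdowns.
var errorFields = ['Job Title', 'E-mail Address', 'Phone #'];  // collected by your form validation
s.events = 'event20';                // event20 = Form Errors (example slot)
s.eVar25 = errorFields.join('|');    // e.g. "Job Title|E-mail Address|Phone #"
s.eVar26 = 'us:contact-us';          // Form ID eVar, so the two reports can be broken down by each other
s.t();                               // or s.tl() if the error is shown without a page reload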

What % of Required Form Fields Have Errors?

While the above report, which shows Form Field Errors by Form, is powerful, one question it doesn’t answer is: How many of the required fields on my forms are not being filled out by users? The answer to this question can help you figure out which fields should/shouldn’t be required. So to answer this question, what you want to do is to look at each form that loads on your website and calculate how many fields the user received an error for and then divide that number by the total number of required form fields. For example, if you have a form with eight required fields, and the current user received two errors on that form, the calculation would be 2/8 or 25%. You should then pass this 25% value to an eVar when you are setting the Form Errors Success Event. Once you do this for all forms, you will have a report that looks like the one shown here. Using this report we can see that the highest number of Form Errors are cases where users are getting errors on every field (which is most likely people leaving all fields blank). Maybe our users don’t realize that these fields are required and we can do some testing to create a better experience or reduce the number of required fields?
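A rough sketch of the calculation and the tracking call — the variable numbers are again placeholders:

// Hypothetical example: compute the share of required fields that errored
// and pass it to an eVar when setting the Form Errors Success Event.
var requiredFieldCount = 8;    // total required fields on this form
var errorFieldCount    = 2;    // fields the visitor got wrong or left blank
var pctRequiredErrors  = Math.round((errorFieldCount / requiredFieldCount) * 100) + '%';  // "25%"

s.events = 'event20';          // Form Errors (example slot)
s.eVar27 = pctRequiredErrors;  // % of required form fields with errors
s.t();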

If we want to see which forms are the ones that have the highest 100% Form Field Error Rate, all we need to do is break the above report down by Form ID:

Finally, if you are doing a good job of grouping your website forms using SAINT Classifications, you can see some super-cool reports. In the following report, I have grouped all of my website forms into high-level buckets of Demo and Free Trial. Then I broke this report down by the percentage of required fields that result in Form Errors.

You can see here that most website visitors on Demo forms are getting errors for 100% of the fields (probably leaving them blank!), while for the Free Trial, the largest percentage of required fields with errors is 10%. Interesting data indeed!

Final Thoughts
In this post, we have covered some advanced ways to see which fields produce errors on each form, how to see this by form, and how to know which forms have the highest total required-field error rates. These reports can provide an enormous amount of insight into what is happening on your forms with respect to errors, and once you understand your visitors’ form behavior, you can apply these learnings to all forms on your site. In my next post, I will cover a tangentially related item (related to Forms, but not as much about Form Errors) that I think is super-cool.

Between this post and the last post, hopefully you have some food for thought when it comes to tracking how your website forms are doing so you can improve your conversion rates…

General

Commerce Department and WAA Code of Ethics

Thanks to Tim Evans I was alerted to a report about the Commerce Department weighing in on privacy issues online.  Suffice to say I agree with the direction Commerce is giving the Obama administration.  Specifically the idea that, according to CNN’s Money, “the government ‘enlist the expertise and knowledge of the private sector’ to create ‘voluntary codes of conduct that promote informed consent and safeguard personal information.'”

More or less exactly what John Lovett and I proposed back in September of this year.

I have started reaching out to the media on this point — that we in the digital measurement community are already taking matters into our own hands and stepping up — but we could use your help! Please, if you know anyone in the press, send a link to this blog post along to them and help spread the word that as a community we can take responsibility for our own actions and we are willing to do what is right for consumers around the globe.

This issue affects all of us in the digital measurement sector — vendors, consultants, and practitioners alike. Please help us create awareness about our efforts.

Here are links to the relevant background materials:

Here is the link to the near-final draft of the Web Analysts Code of Ethics:

  • Last chance to shape the Web Analysts Code of Ethics (WAA Blog)

The Standards sub-committee for the Code of Ethics met yesterday and as I publish this blog post John Lovett is presenting the final version to the Web Analytics Association Board of Directors.  We expect the Code to be available to sign at the WAA web site in the coming weeks.

Adobe Analytics, General

Tracking Form Errors (Part 1)

Almost all websites have forms. Whether you are a B2B/Lead generation site, an eCommerce site, a travel site, etc… you most likely have forms. More importantly, you have people who don’t fill out your forms correctly and get some sort of error message. While error messages are a fact of life, in the web analytics/optimization world these are painful since you work so hard to get people to your site, to read your content and then agree to give you personal information. That is a lot of time and money spent only to have someone potentially abandon because they have problems with your forms. This represents your “low hanging fruit” so to speak – people who have already decided they like you and want to give you their information! In this series of posts, I am going to share some techniques for seeing how much of a problem your website has with form errors and in the next few posts I will cover some more advanced things you can do to diagnose these form error issues.


Which Fields Produce the Most Errors?
The first step in diagnosing form error issues is understanding which form fields are causing issues. Unfortunately, since a user might receive more than one error message, you have to pass in multiple values to a SiteCatalyst variable. This can be done using the Products variable, but since that is often already being used for more important purposes, I will suggest that you use a List Traffic Variable (sProp) to capture these values. Unfortunately, List sProps are not well documented and have some specific limitations (see Knowledge Base ID# 2305). All you need to know is that List sProps allow you to pass in delimited values and when you view them in the sProp report, these values will be split out. Let’s look at an example. Here we see a form in which a user has attempted to submit the form without filling out some required fields. What we want to do is capture which fields this user messed up (could mean incorrect value or leaving blank) so we can see which ones are messed up the most often. In this case, we see that the form errors are related to Job Title, E-mail Address, Phone #, Company Name and the MSA checkbox.

So in this case we can use a List sProp to capture the fields giving us errors. Here is how it would look in the JavaScript Debugger:
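A minimal sketch of the call itself, using the fields from the example form above — the prop number is a placeholder, and the delimiter must match whatever you configure for the List Prop in the Admin Console:

// Hypothetical example: pass all erroring fields into a List sProp as one
// delimited string; the report will split the values back out automatically.
s.prop15 = 'us:job title,us:email address,us:phone,us:company name,us:msa checkbox';
s.events = 'event20';   // Form Errors Success Event (example slot)
s.t();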

Unfortunately, List sProps are still constrained to the 100 character limit, so if you have long forms you are out of luck, or you can select only the most important form fields to capture. Once you have captured the fields, you can open the sProp report and you will see something that looks like this:

In this case, we can see that we are getting the greatest number of errors on the Phone Number form field on the US website (I have added the site since forms exist in multiple sites). I could also filter this sProp report for just US or Japan form fields by using a text search of “us:” or “jp:” as needed. This report should help steer you in the right direction when it comes to fixing basic form field issues.

Correlating Form Field Errors to Forms
Once you have seen which form fields produce errors, the next logical question is which forms had which errors. Unfortunately, one of the limitations of List sProps is that they cannot be used in Traffic Data Correlations. Therefore, if you want to break down form field errors by Form, you will need to use the Discover product as shown here:

If you don’t have access to Discover and seeing this type of breakdown is important to you, you may want to consider using the Products variable instead of a List sProp since the Products variable comes with full Subrelations by default (though this implementation will be significantly more difficult). I will also be covering a different way to approach this in my next post so stay tuned!

Final Thoughts
If you are not currently tracking form field errors, hopefully this will give you some ideas on how you can start the process of seeing where you are tripping up your visitors. Keep in mind that this post is just a start and that the next few posts will go into more advanced stuff you can do and how you can identify your biggest opportunities for improving conversion.

Analytics Strategy, Conferences/Community, General

FTC "Do Not Track?" Bring it on …

As the hubbub around consumer privacy continues I was gently prodded by a friend to pipe up in the conversation.  While my feelings about how we have ended up in this position are pretty clear, and while my partner John and I have proposed what we believe is a step in the right direction regarding online privacy and the digital measurement community, it seems that some type of ban or limitation on online tracking is becoming inevitable.

Without getting political or debating the reality of what we can and cannot know about online visitors I have a single word response to the FTC:

Whatever.

Before you accuse me of changing my stripes or going completely nuts consider this: If the FTC is able to somehow pull off the creation of a universal opt-out mechanism, and if the browser developers support this mechanism despite clear and compelling reasons not to, and if consumers actually widely adopt the mechanism — all pretty big “ifs” in my humble opinion — then I believe the digital measurement industry will do what I have already described as inevitable:

We will hold a revolution!

Since my tenure at JupiterResearch back in 2005 I have been telling anyone who would listen to stop worrying about counting every visitor, visit, and page view and instead start thinking about statistically relevant samples, confidence intervals, and the algorithmic use of data to conduct analysis.  Yes, you need to work to ensure data quality — of course you do — but you don’t have to do it at the expense of your sanity, your reputation, or your job …

See, it turns out in our community it doesn’t really matter whether we are able to measure 100% of the population, 90% of the population, or even 80% of the population — what matters is that we are able to analyze our visitor populations and that we are able to draw reasonable conclusions from that analysis.  Oh, we have to be empowered to conduct analysis as well, but that’s a whole other problem …
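To make that concrete, here is a back-of-the-envelope sketch — the numbers are invented — showing that a conversion rate measured on a sample still comes with a perfectly quantifiable margin of error:

// Illustrative only: a conversion rate measured on partial data still supports
// a confident conclusion, because the margin of error is small and knowable.
var sampledVisits = 50000;    // visits we could actually measure (say, 80% of traffic)
var conversions   = 1500;
var p      = conversions / sampledVisits;                 // 3.0% observed conversion rate
var se     = Math.sqrt(p * (1 - p) / sampledVisits);      // standard error of a proportion
var margin = 1.96 * se;                                   // ~95% confidence interval half-width
console.log('Conversion rate: ' + (p * 100).toFixed(2) + '% ± ' + (margin * 100).toFixed(2) + ' points');

With fifty thousand sampled visits the interval works out to roughly 3.00% ± 0.15 points, which is more than enough precision for almost any business decision.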

Statistical analysis of the data … trust me, it’s going to be all the rage in a few years. I’m not saying this simply because I have a white paper describing the third generation of digital measurement tools that will empower this type of analysis … although I would encourage you to download and read “The Coming Revolution in Web Analytics” (freely available thanks to the generous folks at SAS!)

I’m saying this because every day I see the writing on the wall.  Data volumes are increasing, data sources are increasing, and demands for insights are increasing, all while professional journalists, politicians, and political appointees are supposedly protecting our “God-given right to surf the Internet in peace” without any regard to the businesses, employees, and investors who depend to a greater or lesser degree on web-collected data to provide a service, pay their bills, and make a profit …

Okay, sorry, that was editorializing.  My bad.

Still, rather than wring our hands and gripe about how much the credit card companies know (which is a silly argument given that credit card companies provide tangible value in exchange for the data they collect … it’s called “money”) I believe it is time to do three things:

  1. Suck it up.
  2. Hold yourself to a higher standard.
  3. Buy “Statistics in Plain English” and start reading.

The good news is that we have access to lots and lots of great statistical analysis of sampled data today — we just might not realize it.  Consider:

Have I mentioned Excel, Tableau, and R?  Hopefully by now you get the gist … statistics is already all around us all the time, perhaps just not exactly where we expect it or, in the context of lower rates of data collection, where we will ultimately need it to be.

Perhaps the most encouraging evidence that we will be able to make this shift is the increasing attention the digital world is getting from traditional business intelligence market leaders like Teradata, FICO, IBM, and SAS.  I, for one, am more or less convinced that the gap between “web analytics” and “Analytics” is about to be closed even further … and here’s one guy that seems to agree with me.

We don’t need to thumb our noses at the privacy people — quite the opposite, and to this end John and I will be sitting down with a representative from the Center for Democracy and Privacy and Adobe’s Chief Privacy Officer MeMe Rasmussen at the next Emetrics in San Francisco! We also don’t need to stick our heads back in the sand and hope this issue will simply go away — it won’t, trust me.

We need to prepare.

Prepare by committing yourself to not being that scary data miner that consumers are supposedly so afraid of; prepare by improving your data quality to the extent that you are able; and prepare by starting to communicate to leadership that it really doesn’t matter if you can count every visitor, every visit, and every page view — what matters is your ability to analyze data using the tools at your disposal to deliver value back to the business.

If you’re not sure how to do that, call us.

Viva la revolution!

DISCLOSURE: I mentioned and linked to lots of vendors in this post which I normally do not do. Some are clients of Analytics Demystified, others are not. If you have concerns about why we linked to one company and not another please don’t hesitate to email me directly.

Adobe Analytics, General

A/B Test Bounce Rates

(Estimated Time to Read this Post = 4 Minutes)

In the past, I have written about Bounce Rates, Traffic Source Bounce Rates, Segment Bounce Rates and Site Wide Bounce Rates. In the latter, I even promised I was finished writing about Bounce Rates, but, alas, I have yet another Bounce Rate installment. I was recently in a conversation with a peer and she asked me how they could see the bounce rates of the various landing page A/B tests they were running via Test&Target. I told her that this was easy to do if you follow my instructions in the Segment Bounce Rate post, but she asked if I could write a brief post with more specifics so here it is…

Why A/B Bounce Rates?
Before getting into the solution, let’s re-visit why this is of interest. Test&Target (and other tools like GWO) are wonderful when it comes to optimizing landing pages. They allow you to alter content/creative elements and see what works and what doesn’t. I have seen many cases where clients have used tools like Test&Target to change content based upon what brought the user to the website (i.e. Search Keyword) or demographic information (i.e. Location). Regardless of the reason you want to test, if it is a landing page, one of the questions you often get asked is related to Bounce Rate. Understanding how many people saw “Version A” of a test and bounced vs. those who saw “Version B” and bounced usually comes up for discussion. To answer this question using Omniture/Adobe tools you have the following options:

  • Create a unique page name for each test variation and use the regular Pages report and Bounce rate metric. However, this can get very messy, so unless your website is small, I don’t recommend this approach.
  • Use ASI or Discover to build a segment for people coming from “Version A” or “Version B” and then compare the bounce rates. This is a viable option if you have access to these tools and are well versed in Segmentation.
  • Attempt to track Bounce Rates from within Test&Target. This does not come out-of-the-box, but if you have mboxes on all of the pages the landing page links to, I have heard of some people setting conversion events on the landing page and the subsequent pages, but I don’t think this is for novices (if you are interested, I’m sure @brianthawkins could figure out a way to hack this together!)
  • Do what I suggest below!

Implementing A/B Bounce Rates
Luckily, implementing this in SiteCatalyst is relatively simple. All you need to do is the following:

  1. Enable a new Traffic Variable (sProp)
  2. In this new sProp, concatenate the Test&Target ID and the Page Name on each page of your website (see the sketch just after this list)
  3. Enable Pathing on the new sProp
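A sketch of step 2 — the prop number and the way the Test&Target experience ID is surfaced on the page are assumptions on my part:

// Hypothetical example: concatenate the Test&Target test ID with the page name
// in a pathing-enabled sProp so each combination becomes a unique "page."
var testId = '18964:1:0';                  // however your mbox/offer exposes the experience ID
s.prop30   = testId + ' | ' + s.pageName;  // e.g. "18964:1:0 | us:home page"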

That’s it! By concatenating the Test&Target ID and the Page Name, you create a unique join between the two and can find the combination of the Test ID you care about and the page name that you expect them to have landed on. Once you find this combination in the report, you can add your Bounce Rate Calculated Metric (Single Access/Entries – which hopefully you already have as a Global Calculated Metric) and you are done. Here is an example of a report:

In this report, you have all of the ID’s associated with the US Home Page, how many Entries each received and the associated Bounce Rate. If you wanted, you could perform a search for the specific Test&Target Test ID you care about and then your report would be limited to just those ID’s. In the example above, we have multiple tests taking place on the US Home Page. However, in the following example we can see a case where there is just one test taking place on the UK Home Page and the associated Bounce Rate of each:

Other Cool Stuff
But wait…there’s more! Since you have enabled Pathing on this new A/B Test sProp, there are some other cool things you can do. First, you can look at a trended view of the report above to see how the Bounce Rate fluctuates during the course of the test. To do this, simply switch to the trended view and choose your time frame:

Another benefit of having Pathing enabled on this sProp is that you can see how visitors from various tests navigated your site using all of the out-of-the-box Pathing reports. Here is an example of a next page flow for one of the tests:

You can run the preceding report for each test variation and compare the path flows to see if one version pushes people more often to the places you want them to go. Another report you could run is a Fall-Out report which can show you how often people from a specific test made it through your desired checkpoints:

In this example, instead of seeing how the general population falls-out from the Home Page to a Product Page and then to a Form Page, we can limit the funnel to only those people who were part of Test ID “18964:1:0.” I like to run this report and the corresponding one for the other test version(s) and add them all to a SiteCatalyst Dashboard where I can see the fall-out rates side by side.

Final Thoughts
As you can see, by doing a little up-front work, you can add an enormous amount of insight into how your A/B tests are performing on your site including Bounce Rates, Next Page Flows, Fall-Out, etc…Enjoy!

Adobe Analytics, Analytics Strategy, General

Tracking Lead Gen Forms by Page Name

Every once in a while, as a web analyst, I get frustrated by stuff and feel like there has to be a better way to do what I am trying to do. Many times you are able to find a better way, often times you are not. In this case, I had a particular challenge and did find a cool way to solve it. You may not have the same problem, but, if for no other reason than to get it off my chest, I am writing this as a way to exhale and bask in my happiness of solving a web analytics problem…

My Recent Problem
So what was the recent problem I was facing that got me all bent out of shape? It had to do with Lead Generation forms, which are a staple of B2B websites like mine. Let me explain. Many websites out there, especially B2B websites, have Lead Generation as their primary objective. In past blog posts, I have discussed how you can track Form Views, Form Completes and Form Completion Rates. However, over time, your website may end up with lots of forms (we have hundreds at Salesforce.com!). In a perfect world, each website form would have a unique identifier so you can see completion rates independently. That isn’t asking too much is it? However, as I have learned, we rarely live in a perfect world!

Through some work I did in SiteCatalyst, I found that our [supposedly unique] form identifier codes were being copied to multiple pages on multiple websites. While this causes no problems from a functionality standpoint – visitors can still complete forms – what I found was that the same Form ID used in the US was also being used in the UK, India, China, etc… Therefore, when I ran our Form reports and looked at Form Views, Form Completes and Form Completion Rate by Form ID, I had no idea that I was looking at data for multiple countries. For example, if you look at this report nothing seems out of the ordinary right?

However, look what happened when I broke this report (last row of above report) down by a Page Name eVar:

At first, I thought I was going crazy! How can this unique Form ID be passed into SiteCatalyst on eleven different form pages on nine country sites? This caused me to dig deeper, so I did a DataWarehouse report of Form ID’s by Page Name and found that an astounding number of Form Pages on our global websites shared ID’s. Suddenly, I panicked and realized that whenever I had been reporting on how Forms were performing, I was really reporting on how they were performing across several pages on multiple websites. In the example above, I realized that the 34.669% Form Completion Rate I was reporting for the US version of the form in question was really reporting data with the same ID for forms residing on websites in Germany, China, Mexico, etc… While the majority was coming from the form I was expecting, 22% was coming from other pages! Not good!

The Solution
So there I was. Stuck in web analytics hell, reporting something different than I thought I was. What do you do? The logical solution would be to do an audit and make sure each Form page on the website had a truly unique ID. However, that is easier said than done when your web development team is already swamped. Also, even if you somehow manage to fix all of the ID’s, what is preventing these ID’s from getting duplicated again? We looked at all types of process/technology solutions and then realized that there is an easy way to fix this by doing a little SiteCatalyst trickery.

So what did we do? We simply replaced the Form ID eVar value with a new value that concatenated the Page Name and the Form ID on every Form Page and Form Confirmation Page. By concatenating the Page Name value, even if the same Form ID was used on multiple pages, the concatenated value would still be unique. For example, the old Form ID report looked like the one above:

But the new version looked like this:

With this new & improved report, when I was reporting for a particular form on a particular site/page, I could search by the form pagename and be sure I was only looking at results from that page. Also, a cool side benefit of this approach is that you could add a Form ID to the search function to quickly find all pages that had the same Form ID in case you ever did want to clean up your Form ID’s:

Implementation Gotcha!
However, there is one tricky part of this solution. While it is certainly easy to concatenate the s.pagename value with the Form ID on the Form page, what about the Form Confirmation page? The Form Confirmation page is where you should be setting your Form Completion Success Event and that page is going to have a different pagename. If your Form ID report doesn’t have the same Page Name + Form ID value for both the Form View and Form Complete Success Event, you cannot use a Form Completion Rate Calculated Metric. For this reason, you need to use the Previous Value Plug-in to pass the previous pagename on the Form Confirmation page. Doing this will allow you to pass the name of the “Form View” page on both the Form View and Form Complete page of your site so you have the same page name value merged with the Form ID.
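To make the trickery above a bit more concrete, here is a minimal sketch of what the tagging might look like. It assumes the legacy H-code s object from s_code.js is loaded, that the getPreviousValue plug-in is included, and that the Form ID is also available on the confirmation page; the eVar/event numbers, cookie name and formId variable are hypothetical placeholders rather than the exact implementation described above:

```javascript
// On the Form page: concatenate the page name and the Form ID
var formId = "70130000000ABCD";                 // hypothetical Form ID read from the page
s.getPreviousValue(s.pageName, "gpv_fm");       // stash this page's name for the next hit
s.eVar5  = s.pageName + "|" + formId;           // e.g. "us:contact:request-demo|70130000000ABCD"
s.events = "event10";                           // hypothetical "Form View" Success Event

// On the Form Confirmation page: the page name differs, so read back the
// *form* page's name from the Previous Value plug-in and build the same key
var prevPage = s.getPreviousValue(s.pageName, "gpv_fm");
s.eVar5  = prevPage + "|" + formId;             // identical value to the Form View hit
s.events = "event11";                           // hypothetical "Form Complete" Success Event
```

Because both hits carry the identical Page Name + Form ID value, a Form Completion Rate Calculated Metric (Form Completes divided by Form Views) lines up row by row in the eVar report.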

A Few More Things
Finally, while the Form ID report above serves this particular function, it is not very glamorous and may not be the most user-friendly report for your users. If you want to provide a friendlier experience, you can do the following with SAINT Classifications:

  1. Classify the Form ID value by its Page Name so your users can see Form Views, Form Completions and the Form Completion Rate by Page Name
  2. Classify the Form ID value by the Form ID if for some reason you want to go back to seeing the report you had previously

Final Thoughts
Well there you have it. A very specific solution to a specific problem I encountered. If you have Lead Generation Forms on your website, maybe it will help you out one day. If not, thanks for letting me get this out of my system!

Analytics Strategy, Conferences/Community, General

Are you in Atlanta? I will be, next week!

Just a quick note to those of you in the greater Atlanta (GA) metropolitan region to let you know I will be in town next week working and participating in two awesome events:

  1. A blow-out Atlanta Web Analytics Wednesday, sponsored by the fine folks at Unica, where I will be moderating a “practitioner panel” with Delta, Home Depot, and How Stuff Works. Sadly we only had room for 60 odd people and that list filled up almost right away … but you can write to Jane Kell and ask to be put on the waiting list just in case people have to back out.
  2. A presentation to Atlanta CHI on “Getting to Know Your Users Using Data” at the Georgia Tech Research Institute Conference Center. As this is a somewhat mixed audience my presentation will be a high-level walk through the systems and processes we all leverage on a daily basis.

As far as I know the CHI event is not sold out and costs $35 for the general public, $10 for students with ID, and nothing (free!) if you are a member of Atlanta CHI!

For those of you not in Atlanta, apologies for this utterly useless blog post. Hopefully I will make it to your town soon and can make it up to you …

Adobe Analytics, General

Tracking Recurring Revenue

Recently, I had the pleasure of meeting a web analyst whose business relied on a subscription model. In a subscription model, you often sell a product initially and then there is a subsequent Recurring Revenue stream (normally monthly). During the conversation, I explained how I would address this in SiteCatalyst and, since it is a somewhat advanced concept, I thought I would share the same info here in case there are blog readers out there who also have Recurring Revenue models.

Why Is Recurring Revenue A Challenge?
So why is tracking Recurring Revenue in SiteCatalyst difficult? As always, I like to explain through an example. Let’s imagine that you sell a popular CRM product that has an initial sale price and then a monthly subscription. A visitor comes to your website from a Bing keyword of “CRM” and they end up purchasing your product for $10,000. You can track this $10,000 sale online and attribute it to the Bing keyword. However, what do you do after one month? Let’s say the customer pays $1,000 each month after the initial $10,000. How do you attribute the recurring monthly $1,000 to the Bing keyword that originally brought the customer? Many clients I have seen stop at the initial sale, but this is problematic. What if there are some marketing campaigns that bring in a lot of initial sales, but those campaigns produce customers who quit the subscription after two months? Perhaps there are other marketing campaigns that generate lower initial sale amounts, but result in customers who are retained for several years. How do you compare “apples to apples” in this case if you cannot tie both initial and subscription revenue to the original marketing campaign?

The answer for most clients is to simply pass the original marketing campaign to their back-end system and do all of the reporting outside of a tool like SiteCatalyst. However, this has the following negative consequences:

  1. As a web analyst, you are now out of the loop which is not good for your program (or your career!)
  2. There are hundreds of online web-only data points that you know about the original sale (i.e. visit number, internal search terms used, internal promos used, etc…). Are you going to pass all of these data points to your company’s data warehouse and do analysis there? If you are a big company that may be possible, but what if you are a small or mid-sized business?

If you are like me, you want, at a minimum, to have all important data in SiteCatalyst so you can have a seat at the table. With this in mind, the following section describes what you need to do if you want to get Recurring Revenue into your SiteCatalyst implementation so it can be tied to the same data points as the initial sale.

Recurring Revenue in SiteCatalyst Reports
If you have read my past blog posts or even just my last post on Product Returns, you may know that one of my favorite SiteCatalyst features is Transaction ID (I suggest re-reading this post!). At a high level, Transaction ID allows you to set an ID associated with a transaction and later upload offline metrics that are dynamically associated with any Conversion Variable (eVar) values which were active at the time the Transaction ID was set. Just as Transaction ID was important to solving our Product Returns issue, it can be used similarly to solve the aforementioned Recurring Revenue challenge.

When visitors make their initial subscription purchase, you can set a Transaction ID value on the confirmation page. Doing this allows you to establish a "key" that can be used later to upload Recurring Revenue and tie it to all of the eVar values associated with the original sale. Keep in mind that you will have to work with your Adobe account manager to get Transaction ID set up. Additionally, Transaction ID data is normally only retained for 90 days, but in this case you will need to work with your account manager to get it extended perpetually (or for as long as you want to include Recurring Revenue). For example, if you want to associate two years of revenue with the eVar values that contributed to the original sale, then your Transaction ID data must persist for two years.
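For reference, here is a minimal sketch of what setting Transaction ID on the subscription confirmation page might look like. It assumes the legacy H-code s object is loaded and that your commerce platform exposes its order/subscription ID to the page; the orderId value, product string and use of purchaseID are illustrative placeholders, not the exact Salesforce.com implementation:

```javascript
// On the subscription purchase confirmation page
var orderId = "SUB-2010-000123";          // hypothetical back-end subscription/order ID
s.transactionID = orderId;                // the "key" later referenced by the upload file
s.purchaseID    = orderId;                // optional: de-duplicates the purchase event on reloads
s.products = ";Sales Cloud;1;10000.00";   // product and initial sale amount
s.events   = "purchase";
```

Each Recurring Revenue row you upload later simply references the same Transaction ID, which is what lets SiteCatalyst tie the new Incrementor Event back to the campaign, visit number and other eVar values that were active when this page was viewed.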

Once you have Transaction ID enabled and have started passing Transaction ID's for online purchases, the next step is to create a new "Recurring Revenue" [Currency] Incrementor Event. Transaction ID uploads are similar to Data Sources and, as such, can only import data as Incrementor Success Events. Setting up a new Incrementor Event is easily done through the Admin Console.

Once you have Transaction ID set-up and your new Recurring Revenue currency Success Event, you need to generate a Data Sources template file that you can upload on a monthly/weekly/daily basis. This file will consist of each subscription account that is still active and the amount of [monthly] Recurring Revenue that should be recorded for each date. Normally, you will have subscriptions that expire at varying dates so you may decide to upload a file on a daily basis which represents all those who are starting a new subscription cycle on that date. The Data Sources template that you will create should have the following columns:

  1. Date – Use the date that the new subscription cycle starts, not the date of the original sale. This date will determine which month the Recurring Revenue will appear in when using SiteCatalyst. NOTE: Please keep in mind that there is currently a SiteCatalyst restriction that you cannot upload a Transaction ID file that has dates spanning more than 90 days. You can upload dates that are more than 90 days old, but the date ranges for the entire file upload cannot be more than 90 days (kind of lame in my opinion!).
  2. Transaction ID – The ID set when the initial subscription sale took place
  3. Product Name/ID – Same value that was passed to the Products Variable during the original online subscription purchase
  4. Recurring Revenue – Amount that the client will be charged for the next subscription cycle

When you are done, your Data Sources upload file might look something like this:

Seeing Recurring Revenue in SiteCatalyst Reports
Once you have successfully uploaded some Recurring Revenue data, it is time to see how all of this looks in SiteCatalyst. To do this, open the Products report and add Revenue and our new Recurring Revenue metrics. The report should look like this:

Next, you can create a Calculated Metric which combines the two metrics to create a “Total Revenue” metric as shown here:

Finally, since Transaction ID allows you to apply the eVar values that were associated with the original transaction to the new Recurring Revenue data, you can use Subrelation break downs for the Products report by Campaign and see both Revenue and Recurring Revenue (and the Total Revenue Calculated Metric)!

In the above report, we can see an example of the quandary I described at the beginning of this post. When we break down our first product (Sales Cloud) by marketing campaign, if we look at the Revenue column, it looks like we should be focusing our marketing spend on Bing Branded Keywords. However, when we add our Recurring Revenue, we can see that the majority of our Recurring Revenue and most of our Total Revenue is coming from E-mail. Perhaps that is the best place to concentrate our marketing budget…

Final Thoughts
So there you have it. A simple, yet [hopefully] effective approach for making sure that you show all Revenue you are helping generate whether it takes place during the initial sale or subsequently… If you have comments/questions, please leave a comment here…

Adobe Analytics, General, Industry Analysis

Our Engagement Metric in use at Philly.com

Those of you who have read my blog for a while know that I have written a tremendous amount about measures of visitor engagement online. In addition to numerous blog posts, we have published a 50-page white paper describing how to measure visitor engagement, and every year I give a half-dozen presentations on the subject. Unlike some people who seem to fear new ideas and others who disapprove of anything they themselves do not create, I have long been a champion for evolving our use of metrics in web analytics to satisfy business needs.

But don’t take my word for it, read about how the nice folks at Philly.com are using a near complete version of my calculation to better understand their audience.

Cool, huh?

The thing I love about this article is that Philly.com is openly talking about their use of my engagement metric.  What’s better is that their sharing prompted another super-great organization (PBS) to comment that they too have been using my engagement metric for years.

Awesome.

I have been honored to work with several companies in the past three years who have implemented my metric and variations thereof but most treat the metric as a competitive secret. Given that most are in the hard-pressed and hyper-competitive online media world I understand, but I’m certainly happy to see Philly.com and Chris Meares share their story with the world.

Anyway, check out the article and, if you’re brave, download our white paper on visitor engagement and give it a read. If you are in media and are stuck trying to figure out how to get web analytics to work for you (instead of the other way around) give me a call. I’m more than happy to discuss how our measure of engagement might be able to help your business grow.

Adobe Analytics, General

Tracking Product Returns

If you sell products or services on your website, you are probably working diligently to be sure that you are tracking the appropriate Orders, Units and Revenue associated with each product sold (you may even be doing some advanced stuff like I described here). Doing this allows you to see all sorts of wonderful things, like what pages lead to sales and what online campaigns have better ROI than others. However, one formidable challenge that web analysts don’t like to talk about is Product Returns. How often have you bought something online only to ship it back or return it to a brick & mortar store associated with the website? If customers return products in significant enough numbers, all of the great online data you have collected may be inaccurate.

I have seen some companies apply a "rule of thumb (dumb?)" in which they discount sales by 20% across the board to account for returns, but how does that help you determine if a specific marketing campaign is good or bad when your SiteCatalyst reports only show the good? By not tying product returns directly to their corresponding online sales, your web analytics reports will be inherently flawed. The truth is that I have seen very few clients who have adequately addressed this issue, so I thought I would suggest an idea that I think is an appropriate way to deal with Product Returns. Even if you don't sell things through a shopping cart, I encourage you to read this post as its principles are applicable to any situation in which you have an online success that is later retracted in some manner offline.

Tracking Product Returns
The best way to understand the tracking of Product Returns is through an example. Let's pretend that you are a web analyst for Apple and a first-time visitor comes to the website from the Google paid search keyword "ipod" and purchases two iPods for $50 each. In SiteCatalyst, when we open the Products report, we would see $100 for the Product labeled "ipod" and if we broke it down by visit number, the same $100 would be attributed to visit number one. So far, so good…

However, let’s imagine that one of these iPods was returned to a local Apple Store. Now our reality has changed. The paid search keyword and first visit combination has now only led to $50, but SiteCatalyst still shows $100. If we create calculated metrics to compare our Revenue per Marketing spend, suddenly our ROI just got cut in half, but this is not reflected in SiteCatalyst. This might cause us to misallocate marketing dollars to campaigns that look to be good at first, but in reality are not as profitable as others when Product Returns are added to the mix (especially if we automate Paid Search using SearchCenter!). I don’t know about you, but I certainly wouldn’t want to be the one telling my boss to invest in marketing campaigns that turn out to be “duds!”

So how do we fix this mess? In order to track Product Returns, you'll need to re-familiarize yourself with the Transaction ID feature of SiteCatalyst (I suggest re-reading this post!). At a high level, Transaction ID allows you to set an ID associated with a transaction and later upload offline metrics that are dynamically associated with any Conversion Variable (eVar) values which were active at the time the Transaction ID was set (phew!). In this case, you need to ensure that you are setting a Transaction ID value when the original online sale takes place. By doing this, you create a "key" that will allow you to upload Product Return data later and back it out of its corresponding online sale. Keep in mind that you will have to work with your Adobe account manager to get Transaction ID set up, and you'll want to be sure that the Transaction ID table persists for as long a time frame as you require to upload return data (the default is 90 days, but this can be extended).

Once you have Transaction ID enabled and have started passing Transaction ID’s for online purchases, the next step is to create a new “Product Return Amount” [Currency] Incrementor Event. Transaction ID uploads are similar to Data Sources and, as such, import data as Incrementor Success Events. Setting up a new Incrementor Event is easily done through the Admin Console.

Once you have Transaction ID set-up and your new Product Return Amount currency Success Event, you need to use Data Sources to generate a Product Returns template which you can populate and upload on a daily/weekly/monthly basis. This file will contain the following columns:

  1. Date – I suggest you use the date that the original purchase took place, not the date of the return, so there is no lag time. NOTE: Please keep in mind that there is currently a SiteCatalyst restriction that you cannot upload a Transaction ID file that has dates spanning more than 90 days. You can upload dates that are more than 90 days old, but the date ranges for the entire file upload cannot be more than 90 days (kind of lame in my opinion!).
  2. Transaction ID – The ID associated with the original online sale
  3. Product Name/ID – Same value that is passed to the Products Variable during the original online purchase
  4. Product Return Amount – This is the total $$ amount per product that is being returned

When you are done, your upload file might look something like this:

Using Product Returns in SiteCatalyst Reports
Once you have successfully uploaded some Product Return data, it is time to see how all of this looks in SiteCatalyst. To do this, open the Products report and add Revenue and our new Product Return Amount metrics. The report should look like this:

Next, we can create a Calculated Metric which subtracts the Product Return Amount from Revenue to create a “Net Revenue” metric as shown here:

Finally, since Transaction ID allows you to apply the eVar values that were associated with the original transaction to the new “return” transaction, you can even use Subrelation break downs for the Products report by Visit Number and see both Revenue and Returns (and the Calculated Metric) by Visit Number (pretty cool huh?)!

Orders & Units (Advanced)
For those that are a bit more advanced, I wanted to let you know that the above solution does not back out Orders or Units for Product Returns. Backing out Orders is a bit tricky since you may only want to remove the Order if the entire Order is returned. Units is a bit easier as you can simply create a second Incrementor Success Event for “Returned Units” as we did above. However, I suggest that you start with Revenue since most of your questions around Product Returns will be related to Revenue.

Finally, SiteCatalyst does provide an out of the box Data Sources template for Product Returns which can be found in the Data Sources Manager:

However, I have not used this template myself and I would have the following potential concerns:

  1. I don’t believe that this template uses Transaction ID which can be problematic as you will be unable to use the Product Return metric with all of your pre-existing eVar reports
  2. It looks like this template uses the same Revenue, Orders and Units Success Events to back out Product Return data. I feel like this can be a recipe for disaster if something goes wrong. With my approach, the worst case scenario (if you upload some bad Product Return data) is that your new Product Return metrics are temporarily off. If you use the standard Revenue, Orders and Units metrics, a mistake can be fatal and hidden amongst your normal online metrics (I never mess with Revenue, Orders or Units!).

For these reasons, I suggest you talk to your Adobe Account Manager if you want to pursue this route.

Final Thoughts
So if Product Returns are something that you have to deal with, the above is my suggested way to handle them. Those of you who work in Retail day in and day out may have come up with some other ways to deal with Product Returns, so if there are other best practices out there, please leave a comment here. Thanks!

Adobe Analytics, Analytics Strategy, General, Reporting

Presentations from Analytics Demystified

This week is somewhat bittersweet for me because it marks the very first time I have missed an Emetrics in the United States since the conference began. And while I’m certainly bummed to miss the event, knowing that my partner John is there representing the business makes all the difference in the world. If you’re at Emetrics this week, please look for John (or Twitter him at @johnlovett) and say hello.

If you’re like me and not going to the conference perhaps I can interest you in one of the four (!!!) webcasts and live events I am presenting this week:

  • On Tuesday, October 5th I will be presenting my “Web Analytics 201” session to the fine folks at Nonprofit Technology Network (NTEN) who we partner with on The Analysis Exchange. You need to be a NTEN member to sign up but if you are I’d love to talk with you!
  • On Wednesday, October 6th I will be doing a free webcast for all our friends in Europe talking about our “no excuses” approach towards measuring engagement in the online world. Sponsored by Nedstat (now part of comScore) all attendees will get a free copy of our recent white paper on the same topic.
  • Also on Wednesday, October 6th (although at a slightly more normal time for me) I will be presenting our Mobile (and Multi-channel) Measurement Framework with both our sponsor OpinionLab and a little consumer electronics retailer you may have heard of … Best Buy! The webcast is open to everyone and all attendees will also get a copy of our similarly themed white paper.
  • On Thursday, October 7th I will be at the Portland Intensive Social Media workshop presenting with Dean McBeth (of Old Spice fame) and Hallie Janssen from Anvil Media. I will be presenting John and Jeremiah’s Social Marketing Analytics framework and am pretty excited about the event!

All-in-all it promises to be a very busy week presenting content so I hope to hear from some of you on the calls or see you in person on Thursday.

Adobe Analytics, General

Hidden SiteCatalyst Features

(Estimated Time to Read this Post = 4 Minutes)

One of the funny things about SiteCatalyst (you will notice I can’t yet bring myself to call it Adobe SiteCatalyst!) is that there are some really cool features that are hidden. In some cases, it almost seems like someone has gone out of their way to hide them, but I like to look at these “hidden gems” as a sort of rite of passage. In this post I will share some of the ones I have found and hope that maybe you know of others so that all of us can learn! Also, if you haven’t read my old blog post on SiteCatalyst Time Savers, I encourage you to do so!

The Magic Triangle and Checkboxes
If you are like me, it may seem like you spend most of your day adding/removing metrics from reports! This can be a very time consuming process, so you might as well be as efficient as possible. However, I often find that new SiteCatalyst users add extra steps to the process because they don’t know a few easy tricks in the Metrics window. The first trick is that you can change the column that is used for sorting by clicking the [very] little triangle next to each metric. It amazes me how many people add metrics, wait for the report to load and then click on a column to sort and wait for the report to load again! Multiply that by twenty reports and it becomes a real time suck! Instead, simply click the triangle until it turns green (soon to be Adobe red?) and you are done!

But wait! There’s more…You will also notice that there are a bunch of check boxes next to each metric. Those check boxes are used to choose which metrics you want to graph with your report. You don’t have to graph every metric in the report, which may confuse your audience. Also, I find that many clients don’t take advantage of the fact that you can display two graphs per report. To do this, all you need to do is check off one of the boxes on the left and right side. This is helpful if some of your metrics are numbers and them are percentages. It is the closest SiteCatalyst comes to a secondary axis you may be used to in Excel.

Remove Subrelation BreakDowns
If you frequently use eVar Subrelation reports, you may find that after breaking one eVar down by another eVar, you want to go back to the report before it was subrelated. For example, let's say you have opened a Traffic Driver report and broken it down by Offer Type as shown here:

Now let’s say you change the date range and some other report settings and then decide you want to just see Traffic Driver Type by itself again. Unfortunately, if you use the trusty “Back” button in your browser, you will have to re-do all of those customized settings. However, there are actually two ways to remove this subrelation without losing any work.

The first way to do this is to click on the “Broken Down by:” link shown in red above. Once you click on this, you will see a list of all of your variables and you can choose the bottom-most one labeled “None.” The other way is to click the green magnifying glass icon you used to create the Subrelation and do the same thing as shown here:

Double Your Searching Pleasure
Another thing I have noticed that a lot of SiteCatalyst users don’t know is that you can add search criteria to two different variables if you are using a Subrelation report. To do this, click on the Advanced Search link and then you can use the drop down boxes to choose which variable/search term combination you want:

In this case, I have chosen to filter for all Traffic Driver Types containing “SEO” and can proceed to enter my search criteria for Offer Type…

Inherit Segments
If you use DataWarehouse or ASI, you probably spend a lot of time creating Segments. If so, you may find times where you want to re-use some parts of a segment you have already created. When I first started using SiteCatalyst, I did this by printing out my segments and re-creating them manually. This is both time-consuming and prone to error, so I found the trick to do this more efficiently:

There it is! See how easily you can copy an existing segment? Do you see it? If not, would you believe me if I told you that there are actually two different ways to re-use segments on the above screen?

The first way to do this is to click the icon to the right of the Segment title. This will pop-up a new window which allows you to pick an existing segment you want your new segment to be based upon.

The second way to do this is to use the Segment Library. You can access the library by clicking its name next to the "Components" item. The Library is used to store commonly used segment building blocks. In the example below, I have created a Page View container that looks for Pages where the IP Address Geography is in the United States. By dragging this over to the Library, I can re-use this anytime I am creating a new segment.

Filter Report Suites
If you have Admin rights to your SiteCatalyst implementation and deal with a lot of report suites, the Admin Console quickly becomes one of your best friends. One of the most time consuming Admin Console tasks is finding the report suites you are looking for among all of your report suites. I frequently see people scanning up and down over and over hunting for the report suites they need. Fortunately, there is a much better way that is somewhat hidden – using the “Saved Searches” feature of the Admin Console. Using this feature you can create a filter to find report suites such that even if you add new ones, they will be added to your saved search if they meet the criteria.

Here is a real-life example. When I joined Salesforce.com, we had a lot of report suites and I began creating new report suites. When I created the new report suites, I simply added the phrase "New" to my new report suite titles. Once I did this, I clicked on the "Add" link within the Saved Searches area of the report suite manager and created a "Saved Search" rule like this:

Keep in mind that this is a very basic rule. You can actually add multiple criteria items and can build rules that take into account any of the following report suite criteria:

Lastly, if you are an Admin, be sure to read my past blog post with even more Admin Console Tips.

Final Thoughts
So there you have it. As you can see, none of these items are critical showstoppers, but I have found that knowing them can help speed up your day and give you SiteCatalyst bragging rights! Do you know of others? If so, please share them here as comments!

Analytics Strategy, Conferences/Community, General

Web Analysts Code of Ethics …

Following up on last week's thread about how the web analytics industry is on the cusp of becoming our own worst enemy as the tide of public opinion increasingly turns against online and behavioral analytics, I wanted to make good on my offer to help the Web Analytics Association. I fully support the efforts of the Association to create a solid community for web analytics professionals around the world and have long been a contributor to their work, be it turning the Web Analytics Forum (at Yahoo! Groups) over to WAA management, opening the doors for WAA participation in Web Analytics Wednesday, or providing other "behind the scenes" support when asked.

To this end I composed a preliminary "Web Analysts Code of Ethics" that I had planned to work on here in my blog (with you all) and then turn over to the Web Analytics Association. Much to my surprise, according to my partner John Lovett (who is a Board member), the Board of Directors loved the preliminary code and asked to have it published at the Web Analytics Association blog.

Easy enough, and so I would like to redirect all of you over to the Association blog where I and the WAA both would like to hear what you have to say about this early effort. The comments have already started over there, and of course if you’re more comfortable commenting here then by all means, I welcome  that.

As I mentioned a few times in my recent Beyond Web Analytics podcast (not live until early on September 13th), I believe that we need to start advocating on our own behalf and I see this code as one small step in the right direction. Hopefully the WAA Standards Committee, the Board, and all of you out there, whether you're in the Association or not, will join me in this effort to help the wider world understand what we all do (and what we will and will not do).

So go do two things right now:

  1. Read and comment on my “Web Analysts Code of Ethics” at the WAA Blog
  2. Listen to my interview with Adam Greco and Rudi Shumpert at Beyond Web Analytics

Adobe Analytics, Analytics Strategy, General

Internal Search Term Click-Through & Exit Rates

Recently, I was re-reading one of Avinash Kaushik’s older blog posts on tracking Internal Search Term Exit Rates and realized that I had never discussed how to report on this using Omniture SiteCatalyst. In a past Internal Search post, I covered many different things you can do to track internal search on your site, but did not cover ways to see which terms are doing well and which are not. In this post I will share how you can see this so you can determine which search terms need help…

Why Track Internal Search Term Click-Through & Exit Rates?
So why should you do this? In the era of Google, we are all slowly being trained to find things through search. Many of my past clients saw the percentage of website visitors using search rise over the past few years. In addition, Internal Search and Voice of Customer tools are some of the few out there that let you see the intent of your visitors. Unfortunately, most websites have horrible Internal Search results, which can lead to site exits. In my previous Internal Search post I demonstrated how to track your Search Results Page Exit Rate, but that only shows you whether or not you have a problem. If you do have a high Search Results Page Exit Rate, the next logical step is to determine which search terms your users think have relevant search results and which do not. Note that this is not meant to show you which terms lead directly to website exits, but rather, which terms cause visitors to use or not use the search results you offer them after they search on a particular term.

How Do You Track Internal Search Term Click-Through & Exit Rates?
Ok, so how do you do this? Follow these implementation steps:

  • Make sure that you are setting a Success Event when visitors conduct Internal Searches on your website. Hopefully you are already doing this so in many cases this step will be done!
  • Make sure that you are capturing the Internal Search Term the visitor searched upon in an eVar variable. Again, you should be doing this (if not, shame on you!).
  • Here is where we get into uncharted territory. The next step is to set a new Success Event when visitors click on one of the items on the search results page. Depending upon the technology you use for Internal Search, this could be hard or easy. Regardless of how you actually code it, the key here is to set the second Success Event (I call it Internal Search Results Clicks) only if visitors click on a search result item (not if they click on a second page of search results or go to another page through other navigation). It is also critically important that you only set this Search Results Clicks Success Event once per search term! Do not set it every time a visitor clicks on one of the search results after using the "Back" button. If you don't do this correctly, your Click-Through and Exit Rates will be off. This could take a few iterations to get right, but stick with it (there is a rough code sketch below the report discussion that shows one way to handle this)!
  • Once you have both the Internal Searches and Internal Search Results Clicks Success Events set, you can create a Calculated Metric that divides Internal Search Results Clicks by Internal Searches to see the Internal Search Click-Through Rate as shown here:

  • From there you can create the converse metric, which subtracts the Internal Search Click-Through Rate from "1" to come up with the Internal Search Exit Rate as shown here:

  • After this is done, you can open the Internal Search Term eVar report and add all three metrics so you see a report like this:

In this case, it looks like the "Zune" Internal Search term might need some different search result content as it has a much higher exit rate than the others. Another cool thing you can do is to create a report which trends the Internal Search CTR % or Exit % for specific Internal Search terms so you can see if they have been good/bad over time. Also, if you use SAINT Classifications to group your Internal Search Terms into buckets, you can see the report above for groups of Internal Search terms. If you vote for my idea in the Idea Exchange, you would be able to set SiteCatalyst Alerts to be notified if your top Internal Search Terms have spikes in their Click-Through or Exit Rates. You can also segment your data to see how the Internal Search rates differ when people come from Paid Search vs. SEO, etc… and even use Test&Target to try out different promotional banners on your search results page…
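For those implementing this, here is the rough sketch promised earlier of how the two tagging points from the steps above might hang together. It assumes the legacy H-code s object, uses s.tl() to record the result click as a custom link, and relies on the s.c_r()/s.c_w() cookie utilities from s_code.js; the eVar/event numbers, function name and cookie prefix are hypothetical placeholders:

```javascript
// On the search results page: record the search and the term searched for
s.eVar3  = "zune";                          // hypothetical Internal Search Term eVar
s.events = "event20";                       // hypothetical "Internal Searches" Success Event
s.t();

// Called from each search result link: record the click without a page reload
function trackResultClick(term) {
  if (s.c_r("isrc_" + term)) return;        // already counted this term once (e.g. Back button)
  s.c_w("isrc_" + term, "1");               // remember so we only count once per search term
  s.linkTrackVars   = "events,eVar3";
  s.linkTrackEvents = "event21";
  s.eVar3  = term;
  s.events = "event21";                     // hypothetical "Internal Search Results Clicks" event
  s.tl(true, "o", "Internal Search Result Click");
}
```

With both Success Events in place, the Click-Through Rate and Exit Rate Calculated Metrics described above line up term by term in the Internal Search Term eVar report.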

Finally, don’t forget that when you create a new calculated metric like the Internal Search CTR % metric described above, you also get the bonus of seeing this metric across your entire website under the My Calc Metrics area of the SiteCatalyst toolbar. Simply find this new metric and click on it and you can see your overall Internal Search Click-Through Rate regardless of internal search term. Your report will look something like this:

For The True Web Analyst Geeks
If you were bothered when I mentioned above that you should only set the Search Results Clicks Success Event once per search term, then you are my kind of person (please apply for a job with me!)! You were probably saying to yourself: "If I only count once per search term, how will I know which search terms get visitors to click on multiple search result links?" Right you are! That could be valuable information. If you want to see that as well, all you have to do is set a second Success Event each time a visitor clicks on a search result [I call this Internal Search Result Clicks (All)]. Then you can compare how many searches led to at least one search result click with how many search result clicks took place in total. Here is a sample report:

In this example, you can see that the search term "api" had only one click in either scenario, but the search term "chatter" had people click on a search result 100% of the time, and five times they clicked on two search result items. If you want, you can create another Calculated Metric that divides the Internal Search Result Clicks (All) by the # of Internal Searches to see how many search result clicks each term averages. In the case of "chatter" above, it would be 2.25 search result clicks per search!

Final Thoughts
If Internal Search is important to your site, make sure you are tracking it adequately so you can improve it and increase your overall website conversion. Do you have any other cool Internal Search tracking tips I haven’t covered? If so, leave a comment here…

Analytics Strategy, Conferences/Community, General

We are our own worst enemy …

Back in February of this year, in partnership with BPA Worldwide, Analytics Demystified published a white paper detailing the risks associated with the use of Flash Local Shared Objects (LSOs) in digital measurement. Titled “The Use of Flash Objects in Visitor Tracking: Brilliant Idea or Risky Business?” the paper drilled down into how some companies are using Flash LSOs and offered  the following guidance:

  1. Do not use Flash to reset browser cookies
  2. Disclose the use of Local Shared Objects
  3. Allow site visitors to disable Local Shared Objects

The first piece of advice turns out to be pretty important since companies are now being sued over their use of Flash to reset browser cookies. MTV, ESPN, MySpace, Hulu, ABC, NBC, Disney, and others are being dragged into a lawsuit based on their use of Quantcast and Clearspring, which were identified by Soltani, et al. as using Flash LSOs to reset deleted browser cookies. These lawsuits allege a "pattern of covert online surveillance" and seek class action status.

Yikes.

Fortunately for Adobe they do not seem to be one of the targets in these suits, which makes sense considering the position the company has taken regarding the use of Flash. In my interview with MeMe Rasmussen, Adobe's Chief Privacy Officer, back in April of this year, Mrs. Rasmussen explicitly stated:

“… the position we outlined in the FTC Comment on condemning the misuse of local storage, was specific to the practice of restoring browser cookies without user knowledge and express consent.  We believe that there are opportunities to provide value to our customers by combining Omniture solutions with Flash technology while honoring consumers’ privacy expectations.”

On the topic of consumer privacy and web analytics, following up my partner John's response to the Wall Street Journal article on online privacy ("Be still my analytical heart"), I recently wrote a piece for Audience Development Magazine titled "You are all evil …" While a little tongue-in-cheek, the article encourages marketers and business owners to:

  1. Have a rock-solid privacy policy
  2. Not use tracking software they don’t understand
  3. Know exactly what tracking software they have deployed
  4. Have a clear answer for “how and why do you track us?”
  5. Be transparent as hell when anybody asks what you’re doing

As I reflect back on the guidance we have provided in the past year I run the risk of becoming quite depressed. None of our recommendations are surprising, revolutionary, or particularly Earth shattering … but not nearly enough companies are doing most of these very simple things. Given this, one possible outcome is becoming increasingly apparent …

We are going to get screwed.

Go back to Walt Mossberg’s 2005 assertion that “cookies are spyware” and the related conversation around cookie deletion and you will see a clear pattern: media (ostensibly acting in the best interest of consumers) points out that what we do is somehow devious … and we more or less ignore the problem, hoping it will go away.

My friend Bob Page once referred to something he called the "Data Chernobyl" … an unexpected and massive meltdown in consumer trust associated with the data that we collect, store, and use to make business decisions. When you think about it for just a little bit, the idea is terrifying … because everything we do depends entirely on our ability to collect, store, and use information about consumer behavior on the Internet.

Our livelihoods depend on everyone ignoring the fact that we track, understanding why we track, or getting something tangible out of the tracking we do. Sadly we have never offered anything tangible, we have never really made an effort to explain what we do in the court of public opinion, and it is increasingly clear that the bright light shining on our trade isn't going to fade anytime soon.

What’s worse is that we are collecting even more information across mobile, social, and other emerging channels, perfecting our ability to integrate that data into over-arching consumer data warehouses, and occasionally using techniques that even the most hard-hearted of web analysts get all geeked-out about.

We have become our own worst enemy.

Now, as I declared in the Audience Development piece, I simply do not believe that consumers are as freaked out about tracking online as the media makes them out to be … the data I have seen just doesn't support that conclusion. But consumers aren't the real problem: the real problem is the media, lawyers, and potentially the Federal Government. All three of these groups continue to generate page views, make money, and "protect the common man" (sic) by throwing our industry under the bus … and we aren't doing anything in our defense.

Dumb, dumb, dumb.

People much smarter than I am have repeatedly stated that they don't want to engage the media or "privacy police" in a conversation that they cannot possibly win. To a small extent this makes sense, but at some point I wonder if we are going to collectively end up looking like my four-year-old when he knows he's made a mistake. My son gets away with it because he's awesomely cute and I love him, but I am beginning to think the collective web analytics industry is not going to get away with mumbling and making lame excuses for much longer.

The advertising industry has the IAB and NAI, both of whom appear to be responding to articles, lawsuits, and Congressional investigation on many of these issues.  (If you haven’t seen it yet, have a look at this amazing “privacy matters” campaign the IAB is running.) But we are not the advertising industry, we are the web analytics and digital measurement industry, and we need to have our own voice, our own lobby, and our own representation.

Since the framework for this already exists, I am officially asking that the Web Analytics Association formalize and finalize their Industry Advocacy program and represent the digital measurement community in the forum of public opinion.

I have already volunteered to help with this effort under the Presidency of Alex Langshur and reiterate that commitment to the current Board of Directors.  The WAA needs to bring together corporate members and key practitioner representatives to quickly hash out a clear, concise, and practical position on the relationship between digital measurement technology and consumers. The current WAA Board is in perhaps the best position in years to make the decision to represent the needs of our community … but decisive action is required.

Without the WAA’s leadership on this issue I fear that over time we will lose the battle of public opinion and my tongue in cheek assessment of the “evilness” of our industry will be far less funny than it seems today.

Let’s not let that happen.

We are an awesome industry full of brilliant people.  The work we do is some of the most valuable but least understood in the interactive world.  I believe it is time to come out of the closet, accurately describe the value of the work we do, and stop shying away from a conversation we feel is stacked against us and a battle we are unsure that we can win.  If we don’t try, without a doubt, we will remain our own worst enemy.

Adobe Analytics, General

Tracking Navigation

Unless your website is very basic, odds are that you use some sort of navigation to help visitors find website content. Usually navigation is in the header or left side of web pages. Inevitably, there will be times when you are asked how often and in what ways visitors are using navigation. In this post I will cover some common navigation questions and how to answer them.

Common Questions
So what are the common questions you may get around navigation? Here are some that I have been asked over the years:

  1. Which individual navigation links are clicked the most?
  2. Which navigation areas are clicked the most? This is usually related to the main section area, not individual links.
  3. From which pages are visitors using each navigation link?
  4. For what percent of website visits is navigation used?
  5. In what order do website visitors use navigation links?
  6. Which navigation links lead to key website success milestones being accomplished?

The following sections show how to answer each of these questions:

Which individual navigation links are clicked the most?
In this scenario, people are looking to see which detailed navigation links are clicked the most. In the image below, this would represent such links as “Sales Cloud 2,” “Service Cloud 2,” “Custom Cloud 2,” etc…

To answer this question, you should have your developer write code that will pass the name of the link to a Traffic Variable (sProp) when a visitor clicks on each link in your navigation. In addition, I highly recommend that you have them include the high-level navigation area in the value passed to the sProp. For example, when a visitor clicks on “Sales Cloud 2” in the example above, I would pass the value of “products:sales cloud 2” (I always use lower case since sProps are case-sensitive) to the sProp. Passing the high-level area will ensure that your data is clean as there are times when the same link can occur more than once in a navigation structure. When this is complete, you can view a report that looks like this:
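As a reference, here is a minimal sketch of the link-tagging code described above. It assumes the legacy H-code s object is loaded; the sProp number, function name and onclick wiring are hypothetical placeholders for however your site actually binds its navigation click handlers:

```javascript
// Called from each navigation link, e.g. onclick="trackNavClick('Products', 'Sales Cloud 2')"
function trackNavClick(section, linkName) {
  var value = (section + ":" + linkName).toLowerCase();  // e.g. "products:sales cloud 2"
  s.linkTrackVars = "prop15";                            // hypothetical Top Navigation Links sProp
  s.prop15 = value;
  s.tl(true, "o", "Top Navigation Click");               // custom link call, no page reload
}
```

Lower-casing the value up front keeps the sProp report clean, given the case-sensitivity noted above.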

Which navigation areas are clicked the most?
In this question, people are generally asking to see (in the example above) if the "Products" section is clicked more than the "About" section and, if so, by how much. The good news is that if you have done the previous step correctly, you can answer this question by creating a SAINT Classification which rolls up the values in the preceding report into higher-level buckets. You can create this classification easily by exporting the above report to Microsoft Excel, splitting the column by the separator, and using the first part as the high-level navigation name. Here is what your SAINT file might look like:

After you create and process this SAINT file you will be able to see a new high-level navigation report that looks like this:

From which pages are visitors using each navigation link?
In this scenario, people at your company may want to know which top navigation link is clicked most often from the home page or from another page on your site. To see this, you need to have set up a Previous Page sProp. This sProp passes the name of the previous page to the current page, which allows you to create Traffic Data Correlations between it and any other Traffic variable. In this case, once we have a Previous Page sProp, we can correlate it to the Top Navigation Link sProp shown above to see what navigation links are clicked from each page. For example, I can open up the Previous Page sProp within a report suite and then break it down by the new Top Navigation sProp…

…to see a report like this:

In this case, we can see that the “customers:india customers” Top Navigation link was only clicked 482 times from the home page.

In addition, since this uses a correlation and correlations are bi-directional, you can also use this to find out all of the pages from which visitors clicked on a specific navigation link:

In this case, we can see that the “customers:india customers” link was clicked a total of 957 times and then see the breakdown of pages visitors were on when they clicked it. This can help your content people understand when visitors are reaching for the navigation… Finally, if you look closely, you can see that the “SFDC:in:homepage” shows the same 482 clicks referenced above, but in this case we can see that it accounts for 50% of all clicks this link gets across the entire website…

For what percent of website visits is navigation used?
In some cases, you may be asked how often website navigation is used (in general). One easy way to figure this out is to look at the total Page Views from the first SiteCatalyst report shown in this post and divide it by the number of website Visits. This can be done easily using the ExcelClient, where you can pull a Visits data block and the report above and divide the two. However, if you think you might need this on a recurring basis and if trending is important, I will show you another way to do this. When visitors click on navigation links, in addition to passing the link name to a Traffic Variable as shown above, set a "Navigation Clicks" Success Event. Once you have a Success Event, you can create a Calculated Metric that divides Navigation Clicks by Visits as shown here…

…which will allow you to see a report like this:

In what order do website visitors use navigation links?
If you are redesigning your navigation, a useful piece of data is the order in which visitors click on navigation links. Do they always click on the first items in the list? The ones that are farthest to the left? Fortunately, if you have implemented the items above, you can see this by simply enabling Pathing on the new Navigation Links sProp created above. This will allow you to view the Pathing reports including a Next Page Flow and Previous Page Flow just for navigation items:

Which navigation links lead to key website success milestones being accomplished?
Finally, I will occasionally be asked which navigation links are contributing to success. To answer this question, all you have to do is enable Participation for your key metrics on the Navigation links sProp described above. This will allow you to add a Participation metric to the first report shown above to see which links were in the flow of your key website Success Events.

Final Thoughts
Well, there you have it. Everything you wanted to know about tracking your website navigation, but were afraid to ask! If you have any comments/questions, use the form below.

Adobe Analytics, General

Previous Page Variable

I believe that every SiteCatalyst implementation should have a Previous Page sProp. There! I said it (I feel like I am channeling Avinash!). In past blog posts I have touched upon the use of a Previous Page sProp, but I feel like I have not done it justice and wanted to take time to explain it in greater detail. In this post, I will describe why I think this variable should always be set and provide some examples of its use.

Why You Need a Previous Page sProp
I find that in the web analytics world, I often receive the following question:

"What page was the visitor on when he/she _______?"

You can fill in the blank with many things. Here is a list of the ones I have been asked:

  • …searched for this phrase in our internal search box…
  • …clicked on a button to go to a web lead form…
  • …downloaded a white paper…
  • …added products to the shopping cart…
  • …clicked on a banner advertisement…
  • …started using the ROI calculator…
  • …clicked to fill out a website survey…

I could go on for days and never come to the end of these types of questions! People want to know this information because it helps them get inside the head of their visitors. Often times it leads to navigation or content changes. Regardless of the reason, I assure you that you will be asked this question at some point and the truth is that it is not easy to answer with out-of-the-box functionality (i.e. Pathing). The good news is that setting the Previous Page sProp is easy and will pay great dividends down the road…

How To Set the Previous Page sProp
Setting the Previous Page sProp could not be easier. All you have to do is use the Previous Value JavaScript Plug-in to pass the previous page name to a new Traffic Variable (sProp). You can even see a detailed description of the code for this in Ben Gaines' great Summit blog post. If you need help, call your Omniture Account Manager, Omniture Consulting or ClientCare.
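Here is a minimal sketch of what that looks like in practice. It assumes the legacy H-code s_code.js with the getPreviousValue plug-in pasted into your plug-ins section; the sProp number and cookie name are hypothetical placeholders:

```javascript
s.usePlugins = true;
function s_doPlugins(s) {
  // Return the page name stored on the previous hit and store the current one
  s.prop20 = s.getPreviousValue(s.pageName, "gpv_pn");   // hypothetical Previous Page sProp
  // Entry pages have no previous value, so label them explicitly (see the tip below)
  if (!s.prop20) s.prop20 = "[NO PREVIOUS PAGE AVAILABLE]";
}
s.doPlugins = s_doPlugins;
```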

Once you have your JavaScript setup to pass the Previous Page Name to the sProp, you need to enable a Traffic Data Correlation to any sProp for which you want to create a breakdown. For example, if you want to see what pages visitors were on when they searched for a particular internal search term, you would correlate the Previous Page Name sProp with the Internal Search Term sProp…

…so you can see a report like this:

In addition, if you are familiar with correlations, you may recall that they are bi-directional, so in addition to seeing the pages people searched for specific terms from, you can also see the converse. In this case, that would mean seeing all of the internal search terms visitors searched for on a specific page:

As you can see here, we see the same “4” searches for the phrase “chatter” from the selected page as we saw in the first internal search term report (in this case I am just using Internal Search as an example, but if you want to learn more check out my Internal Search post).

One Is Usually Enough
However, one word of caution: I have seen many clients implement several Previous Page sProps and I am not a fan of doing this, as I will now explain. Let's say you want to see what page people were on when they searched on a specific search term (as described above) and you also want to see what page they were on when they downloaded files on your site. A lot of people will set two Previous Page sProps in this situation – one for the search term and one for the file downloads. In my opinion, this just wastes a variable, wastes correlations and causes confusion for your users. The truth is that all you need is one Previous Page sProp to answer both questions. Since on each page there will be one and only one Previous Page value, there is really no reason to do this multiple times.

I have seen some clients who have chosen to pass the Previous Page Name to an eVar. There are some interesting uses of this. For example, if you want to see what pages visitors were on when they added a specific product to the shopping cart, you can pass the Product Name to the Products Variable, set the Cart Add Success Event and the Previous Page Name to an eVar. The main issue you will run into is that Conversion Subrelations are an “all or nothing” proposition so you can only do breakdowns by eVars that have Full Subrelations.

One final tip that I will throw out there is to consider having your developer pass a value of “[NO PREVIOUS PAGE AVAILABLE]” (or something similar) to the Previous Page sProp on entry pages (or any other time no Previous Page is available). I find that this is easier than dealing with questions around “Unspecified” in correlation reports and it is easier to remove this value using the search box than it is to hide the “Unspecified” values.

Final Thoughts
As I mentioned in the beginning, I highly recommend that you have a Previous Page sProp for all of your key report suites and add correlations as needed. If you have any questions/comments, feel free to leave them here…

Adobe Analytics, Analytics Strategy, General, Reporting

Our Mobile Measurement Framework is now available

Today I am really excited to announce the publication of our framework for mobile and multi-channel reporting, sponsored by OpinionLab. You can download the report free of charge from the OpinionLab web site in exchange for your name and email address.

This paper builds on our “Truth About Mobile Analytics” paper we published with our friends at Nedstat last year and focuses on both measurement in mobile applications and, more importantly, a cross-channel measurement framework built around interactions, engagement, and consumer-generated feedback.

  • Interactions occur in every channel, digital or not. Online and on mobile sites we call these “visits” (although that is a made up word for interactions); in mobile apps the interaction starts when you click the icon and ends when you click “close”; in SMS it starts when you receive the message; on the phone it starts when you dial, and in stores interactions start when you walk up to an employee.
  • Engagement is simply “more valuable” interactions. Regardless of your particular belief about the definition of engagement, we all know it when we see it. Online it happens after some number of minutes, or clicks, or sessions, or whatever; in mobile apps it happens when you’ve clicked enough buttons; on SMS it happens when you respond to the message; on the phone it starts when you begin a conversation, and the same is true in a physical store.  We say engagement is “more valuable” because without engagement, value is unlikely to manifest.
  • Positive Feedback happens when you do a really, really good job. Measuring feedback is a critical “miss” for far too many organizations. Apple’s App Store and the value of its star-rating system have essentially proven that there are massive financial differences associated with positive and negative experiences … but most companies still make the mistake of ignoring qualitative feedback altogether.

These three incredibly simple metrics can be applied to every one of your channels, your sub-channels, and your sub-sub-channels (if you like). When applied, they let you create an apples-to-apples comparison across your web, mobile web, mobile apps, video, social, etc. efforts.

Then you can apply cost data, and you’re really in business.

I don’t want to say much more than that but I would really, really encourage you all to download and read this free white paper. When we put something like this out — something we believe has the power to really transform the way everyone thinks about the metrics they use to run their business, and something that has the potential to force dashboards everywhere to be scrapped and started over — we’d really like your collective feedback.

DOWNLOAD THE WHITE PAPER NOW

Thanks to Mark, Rick, Rand, and the entire team at OpinionLab for sponsoring this work. If you’re the one person reading my blog that hasn’t seen their application in action, head on over to their site and have a look.

Adobe Analytics, Analytics Strategy, General

Validating Orders & Revenue

I recently received an e-mail from a blog reader who was having issues tying their Orders in SiteCatalyst to Orders in their back-end system. Here is a snippet from the e-mail:

I have a little issue in my own SiteCatalyst setup that I recently discovered. Sad for me I had trusted the number of Orders for each day’s Conversion Funnel and recently I decided to validate the numbers in SiteCatalyst against what our back-end system has. SiteCatalyst is 5%-10% understated each day which makes for a heck of a difference at the end of the month! I’d rather be understated than overstated, but can you give me some ideas where I should look first?

Unfortunately, this is an all-too-common problem I hear about out there. In this post I am going to share some ideas on how you can tackle this Order/Revenue validation issue head-on and make sure you can trust your critical Orders/Revenue data in SiteCatalyst.

Order ID eVar
If you have an online shopping cart, you should already be setting the s.purchaseID variable with a unique Order ID when an Order takes place on the website. This variable is used by SiteCatalyst to ensure Order uniqueness. Unfortunately, the downside of this variable is that it is not readily available in the SiteCatalyst interface. It is available in DataWarehouse but not in regular SiteCatalyst reports or Discover. Carmen Sutter (@c_sutter) has submitted an idea in the Idea Exchange to change this, but until then, I recommend that you set what I call an Order ID eVar. To do this, all you need to do is pass the same value you send to the purchaseID variable to a custom eVar. This will allow you to see all Orders and Revenue by Order ID from within SiteCatalyst and Discover as you would with any other eVar. Once you have done this, you can open up this new Order ID eVar report and add your Orders or Revenue Success Event as needed:
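
On the tagging side, here is a minimal sketch of what the order confirmation page code might look like, assuming eVar20 is the unused slot you have chosen for the Order ID eVar (the variable number, the orderId value and the product string are placeholders for whatever your cart actually produces):

  // Minimal sketch (placeholders): order confirmation page
  var orderId = "1000123";          // unique Order ID from your cart/back-end system
  s.purchaseID = orderId;           // used by SiteCatalyst to de-duplicate Orders
  s.eVar20 = orderId;               // same value, so Orders/Revenue can be broken down by Order ID
  s.events = "purchase";
  s.products = ";SKU-123;1;49.99";  // category;product;quantity;total price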

In the example above, we can see that most Order IDs have only one Order associated with them, which is what we want. However, in this case, we can see that one ID was counted twice. That may require some research, and I like to schedule a report like the one above to be sent to me weekly so I can make sure nothing strange is going on.

Data Sources Setup
However, while adding an Order ID eVar is helpful in seeing if you are over-counting Orders in SiteCatalyst, it won’t tell you if you are under-counting Orders or how close your SiteCatalyst data is to your back-end systems. To do this, I recommend you use Data Sources. As a quick refresher, Data Sources allows you to import external data/metrics into SiteCatalyst. In this case, I recommend that you import a file from your back-end system into SiteCatalyst which contains your unique Order ID, the number of Orders (which should always be “1”) and the Revenue amount. When you import data via Data Sources, you include the date that you want the data to be associated with, so it doesn’t matter whether you import the data on a daily, weekly or monthly basis; however, the more frequently you upload it, the sooner you can find issues.

Here are step-by-step instructions on how to do this:

  • Create the Order ID eVar described above
  • Create two new Incrementer Success Events and name them “Back-End Orders” (Type=Numeric) and “Back-End Revenue” (Type=Currency)
  • Create a new Data Sources upload template (ClientCare or Omniture Consulting can assist with this). You want to be sure to map the two new “Back-End” Success Events to the Data Sources template. Even more critical is that you include the newly created Order ID eVar in the Data Sources template. If you do not do this, you will not be able to see these two new Back-End metrics in the same Order ID eVar report that you have in SiteCatalyst (more on this later).

  • When you are done, you should have a Data Sources template that looks something like this:

  • Now all you have to do is work with your developers to have this file sent via FTP to the Data Sources FTP site on a regular basis (an illustrative file layout is sketched below).
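
To make the upload concrete, here is a rough sketch of what a daily back-end file might contain. This is illustrative only: the actual column headers and layout must match the template that ClientCare/Consulting generates for your report suite, and the eVar/event numbers shown are placeholders for the Order ID eVar and the two “Back-End” events (the real file is typically tab-delimited).

  Date        Evar 20    Event 10    Event 11
  05/01/2010  1000123    1           49.99
  05/01/2010  1000124    1           2350.00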

The Payoff
So by now you are probably saying to yourself: “That’s a lot of work!” No argument here! However, hang with me as I share the ultimate payoff for doing this. As you recall, our primary objective was to see whether our online Order and Revenue data matched what our back-end systems indicated. Now that we have the Order ID eVar and two new “Back-End” Order and Revenue metrics, we have everything we need. This is where the fun begins and we put it all together!

All you have to do now is to open the new Order ID eVar report and add all of the relevant metrics. First, we will add the SiteCatalyst Orders and Revenue so we can see online Orders and Revenue by Order ID:

Next, we will add the two new “Back-End” metrics to the report and, since we were smart enough to include the Order ID eVar value in the Data Sources upload, SiteCatalyst knows which “Back-End” Order IDs and dates line up with our online data:

Cool, huh? As far as SiteCatalyst is concerned, these offline metrics are connected to your Order ID eVar values just as if they had happened online. Using this report, we can see whether there are any differences between our online and offline data. In the example above, it looks like the “Back-End” system had an order with $2,350 in revenue that wasn’t captured online. Having this information makes it much easier to troubleshoot order submission issues. You can even use DataWarehouse or Discover (only if you use Transaction ID Data Sources) to break down Order ID by browser, domain, IP address, etc. to see if you can figure out what is happening. In addition, you can export this data to Excel and look at the totals to see how far off you are in general.

Finally, for the true SiteCatalyst geeks, you can create Calculated Metrics that divide Orders by Back-End Orders and/or Revenue by Back-End Revenue to see a trended percentage difference between the two, and set up Alerts to notify you if they deviate too much! When you take this level of assurance into account, the Data Sources work above might not seem like all that much effort in the long run!

Final Thoughts
If you sell products online, nothing is more critical than believing in your key metrics. Even if you don’t sell online, the same principles here can be applied to lead generation forms, subscriptions or any other metrics you store in SiteCatalyst and also in your back-end systems.

General

Sad to see Aurelie Pols go …

I am very sorry to say that our European partner Aurelie Pols has decided to leave Analytics Demystified and pursue other goals in her life. While I am very sad to announce this, I have certainly enjoyed working with Aurelie over the past year and on behalf of myself, my family, and our partner John Lovett we wish Aurelie, Rene, and little Luca all the best.

Analytics Strategy, General, Social Media

Guest Post: Kevin Hillstrom

Kevin Hillstrom is one smart dude. President of MineThatData, author of Online Marketing Simulations, and prolific contributor to the Twitter #measure channel. Kevin spends a huge amount of time in Twitter challenging web analysts to think and work harder on behalf of their “clients,” 140 characters at a time.

A few weeks ago I asked Kevin “what five practices learned in the offline data analytics world would you like to see web analytics professionals adopt?” The following contributed blog post has Kevin’s answers which are, unsurprisingly, awesome. Near the end Kevin says “The Web Analyst has the keys to the future of the business, so it is a matter of getting the Web Analyst to figure out how to use keys to unlock the future potential of a business.”

Brilliant. We are the future of business … so what future will we be helping to create?

Kevin Hillstrom, President, MineThatData

In 1998, I became the Circulation Director at Eddie Bauer. Back in those days, Eddie Bauer printed money, generating more than a hundred million dollars of pre-tax profit on an annual basis.

One of the ways that Eddie Bauer generated profit was through the use of discounts and promotions. If a customer failed to purchase over a six month period of time, Eddie Bauer applied a “20% off your order” offer. The customer had to use a special promotion code, in order to receive discounted merchandise.

We analyzed each promotion code, using “A/B” test panels. Customers were randomly selected from the population, and then assigned to one of two test panels. The first test panel received the promotion; the second test panel did not. We calculated the difference between the promotion segment and the control segment, and ran a profit and loss statement against the difference.

In almost all cases, the segment receiving the promotion generated more profit than the control segment. In other words, it became a “best practice” to offer customers promotions and incentives at Eddie Bauer. Over the course of a five year period of time, the marketing calendar became saturated with promotions. In fact, it became hard to find an open window where we could add promotions!

Being a huge fan of “A/B” testing, I decided to try something different. I asked my circulation team to choose two customer groups at random from our housefile. One group would receive promotions for the next six months, if the customer was eligible to receive the promotion. The other group would not receive a single promotion for the next six months. At the end of the six month test period, we would determine which strategy yielded the most profit.

At the end of six months, we observed a surprising outcome. The test group that received no promotions spent the exact same amount of money that the group receiving all promotions spent. After calculating the profitability of each test group, it was obvious that Eddie Bauer was making a significant mistake. It appeared that we would lose, at most, five percent of total annual sales, if we backed off of our promotional strategy. Eddie Bauer would be significantly more profitable by minimizing the existing promotional strategy.

In 1999, we backed off of almost all of our housefile promotions. At the end of 1999, the website/catalog division enjoyed the most profitable year in the history of the business.

This experience shaped all of my subsequent analytical work.

Just because we have the tools to measure our activities in real-time doesn’t mean we are truly optimizing business results. In the Eddie Bauer example, we had the analytical tools to measure every single promotion we offered the customer, and we used existing best practices and “A/B” testing strategies. All of it, however, was wrong, costing us $26,000,000 of profit on an annual basis. Simply put, we were measuring “conversion rate”. What actually happened was that we “shifted conversions” out of non-promotional windows, into promotional windows! Had we measured non-promotional windows, we would have noticed that demand decreased.

So, by measuring customer behavior across a six month period of time, we made a significant change to business strategy, one that dramatically increased annual profit.

What does this have to do with Web Analytics?

The overwhelming majority of Web Analytics activity is focused on improving “conversion rate”. Our software tools are calibrated for easy analysis of events. Did a visitor do what we wanted the visitor to do? Did a promotion work? Did a search visitor from a long-tail keyword buy merchandise when they visited the website? All of these questions are easily answered by the Web Analytics expert: the expert simply analyzes an event to determine if the event yielded a favorable outcome.

Offline analytics experts (often called “Business Intelligence” professionals or “SAS Programmers” if they use SAS software to analyze data) frequently analyze business problems from a different perspective. They use whatever data is available, incomplete or comprehensive, to determine if the individual actions taken by a business over time cause a customer to become more loyal.

With that in mind, here are five offline practices I wish online analytics experts would adopt.

Practice #1 = Extend the Conversion Window: Instead of analyzing whether a customer converted within a single visit or session, it makes sense to extend the conversion window and learn whether the customer converted across a period of time. For instance, when I ran Database Marketing at Nordstrom, we learned that our best customers had a 5% conversion rate when measured on the basis of individual visits, but nearly achieved a 100% conversion rate when combining website visits and store visits during a month. By extending the conversion window, we realized that we didn’t have website problems; instead, we had loyal customers who used our website as a tool in a multi-channel process.

Practice #2 = Measure Long-Term Value: Offline analytics practitioners want to know if a series of actions results in long-term profit. In other words, individual conversions are relatively meaningless if, over the course of a year, they do not yield incremental profit. This is essentially the “Eddie Bauer” example I mentioned at the start of this paper: we learned that individual conversions (customers purchasing via a promo code) yielded increased profit during the promotional period, but generated a loss when measured across a six month timeframe. A generation of Web Analytics experts were trained, largely because of software limitations, to analyze short-term business results, and have not developed the discipline to do what is right for a business across a six month or one year timeframe. Fortunately, Web Analytics practitioners are exceptionally bright, and are easily able to adapt to longer conversion windows.

Practice #3 = Comfort with Incomplete Data: I recently analyzed data for a retailer that was able to tie 70% of store transactions to a name/address. During my presentation, an Executive mentioned that my results must be inaccurate, because I was leaving 30% of the transactions out of my analysis. When I asked the Executive if it would be better to make decisions on incomplete data, or to simply not make any decisions at all until all data is complete and accurate, the Executive acknowledged that inferences from incomplete data are better than inaction caused by data uncertainty. Offline analysts have been dealing with incomplete multi-channel data for decades, and have become good at communicating the benefits and limitations of incomplete data to business leaders. The same opportunity exists for Web Analytics practitioners. Don’t hide from incomplete data! Instead, make confident decisions based on the data that is available, simply communicating what one can and cannot infer from incomplete data.

Practice #4 = Demonstrate What Happens to a Business Five Years From Now Based on Today’s Actions: Believe it or not, this is how I make a living. I use conditional probabilities to show what happens if customers evolve a certain way. Pretend a business had 100 customers in 2009, and 44 of the 100 customers purchase again during 2010. This business must find 56 new customers in 2010 just to replace the customers it lost. I can demonstrate what the business will look like in 2015, based on how well the business can retain existing customers or acquire new customers (a toy projection of this kind is sketched below). This type of analysis is the exact opposite of “conversion rate analysis”, because we are looking at the long-term retention/acquisition dynamics that impact every single business. I find that CEOs and CFOs love this type of analysis because, for the first time, they have a window into the future: they actually get to see where the business is heading if things remain as they are today. Better yet, the CEO/CFO can go through “scenario planning” to identify ways to mitigate problems or to capitalize on favorable business trends. The Web Analytics practitioner has the data to do this type of analysis; it is simply a matter of tagging customers or shaping queries in a way that allows the analyst to make inferences that impact long-term customer value.
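
As a toy illustration of that kind of projection: the 44% repeat-purchase rate comes from the example above, while the starting base of 100 customers and the 40-new-customers-per-year figure are made-up assumptions used only to show the mechanics. A few lines of code are enough to roll a customer base forward year by year:

  // Toy sketch (assumed numbers): project the customer base forward to 2015
  var retentionRate = 0.44;  // share of this year's buyers who also buy next year
  var newPerYear = 40;       // hypothetical number of newly acquired customers each year
  var customers = 100;       // starting base from the example above
  for (var year = 2010; year <= 2015; year++) {
    customers = Math.round(customers * retentionRate) + newPerYear;
    console.log(year + ": " + customers + " active customers");
  }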

Practice #5 = Communicate Better: This probably applies to all analysts, not just Web Analytics experts. Executives are frequently called “HiPPOs” by the Web Analytics community, a term that refers to “Highest Paid Person’s Opinion”. The term can be used in a negative manner, suggesting that the Executive is choosing to not make decisions based on data but rather on opinion or gut feel or instinct or internal politics. I was a member of the Executive team at Nordstrom for more than six years, and I can honestly say that I made far more decisions based on opinion than I made based on sound data and analytics … and I am an analyst by trade!! Too often, the analytics community tells an incomplete story. Once, I witnessed an analytically minded individual who made a compelling argument, demonstrating that e-mail marketing had a better return on investment than catalog marketing. This analyst used the argument to suggest that the company shut down the catalog marketing division. On the surface, the argument made sense. Upon digging into the data a bit more, we learned that 75% of all e-mail addresses were acquired when a catalog shopper was placing an online order, so if we discontinued catalog marketing, we would cut off the source of future e-mail addresses. This is a case where the analyst failed to communicate in an appropriate manner, causing the Executive to not heed the advice of the analyst. Too often, analysts fail to put data and customer findings into a larger context. Total company profit, long-term customer profitability, total company staffing strategies and politics, multi-channel customer dynamics, and Executive goals and objectives all need to be taken into account by the analyst when communicating a data-driven story. When this is done well, the analyst becomes a surrogate member of the Executive team. When this is not done well, the analyst sometimes perceives the Executive to be a “HiPPO”.

These are the five areas toward which I’d like to see Web Analytics experts evolve. The Web Analyst has the keys to the future of the business, so it is a matter of getting the Web Analyst to figure out how to use keys to unlock the future potential of a business. Based on what I have witnessed during the past forty months of multi-channel consulting, I am very confident that Web Analytics practitioners can combine offline techniques with online analytics. The combination of offline techniques and online analytics yields a highly-valued analyst that Executives depend upon to make good business decisions!

Adobe Analytics, General

X+ Page Visits

[I apologize in advance for such a horrible blog post title, but I couldn’t think of a succinct way to describe what I intend to cover. Maybe one of you out there will have a better suggestion after reading the post!]

If your website is like many I have seen, you get a fair amount of daily visits and unique visitors, but a large number of your visitors never go beyond the first few pages of your site. When I see this, I get very frustrated thinking about all that has been done to get people to the site and to optimize it for its designated conversion goals. But as web analysts, we need to put our emotions to the side and get down to the numbers. Therefore, one of the things I like to do is quantify how big a problem my website has with visitors who only view a small number of pages. In this post I will show you how to quantify this so you can begin to take action on addressing the issue.

The Setup
Before I get too deep into this topic, I’d like to set up the scenario, since I think this will help it make more sense. Let’s say that the main purpose of your website is to get visitors to view and complete lead generation forms. Let’s also say that on your website your most significant drop-off takes place after the third page of each visit. In this situation, you might have lots of Visits and relatively few Form Completes, so that your Conversion Funnel looks like this:

As you can see in this funnel, there is a pretty significant gap between Visits and Form Views. While that presents a huge optimization opportunity, I like to break massive efforts like this into smaller chunks that I can work towards (or, as Avinash points out, Micro-Conversions). Since we noted earlier that a large portion of visits exit after three pages, wouldn’t it be nice if we could bridge the gap between our Visits metric and our Form Complete metric in the funnel above? Having a middle ground between Visits and Form Views might get our team to think about ways to turn more Visits into Visits of four pages or more which, depending upon your site, might be a step in the right direction. On many sites I have worked with, there is a direct correlation between visitors viewing more pages and higher form conversion rates.

X+ Visits Explained
Now that we have set up the situation, it becomes a bit easier to understand what I mean by “X+ Visits.” I am really saying that you can set a new Success Event which represents how many Visits your website gets in which the visitor viewed “X” or more pages. What “X” represents is up to you and should be based upon your own data. In this example, we will call it “4+ Page Visits,” meaning the number of Visits in which visitors viewed four or more pages.

The implementation is very easy for any good JavaScript developer, since all that is involved is setting a Success Event as soon as a visitor hits the fourth page of the session. Once you have done this, you can update the conversion funnel shown above to look like this:
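
On the tagging side, here is a minimal sketch of one way to do it, assuming event10 has been reserved for “4+ Page Visits” and that the s.c_r/s.c_w cookie helpers in your s_code.js are available (the event number and cookie name are placeholders, and your implementation may track the page count differently):

  // Minimal sketch (placeholders): count pages in the visit and fire the event once
  var pageCount = parseInt(s.c_r('s_pageCount'), 10) || 0;
  pageCount++;
  s.c_w('s_pageCount', pageCount);                            // session cookie
  if (pageCount === 4) {
    s.events = s.events ? s.events + ',event10' : 'event10';  // "4+ Page Visits"
  }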

While this may not seem like much of a difference, here are some cool things you can do once you have this implemented:

  • Create a Calculated Metric that divides 4+ Page Visits by Total Visits to see what percentage of Visits make it to four or more pages, and trend this over time to see if you are getting better or worse

  • Use the filter feature of the conversion funnel to see your funnel by Visit Number or Traffic Source (i.e. SEO) to see how each impacts the mix of Visits and Visits of four or more pages

  • Create a Calculated Metric for the inverse (in this case, three pages or fewer) by subtracting 4+ Page Visits from Visits. I also like to pass both to Excel using the ExcelClient to create a stacked graph like this to show progress

Final Thoughts
There you have it. If you find that you consistently have significant website drop-off after a few pages, hopefully this new metric will help you better dissect what is happening so you can “Micro-Conversion” your way to more Macro-Conversions!

Analytics Strategy, General

Are you looking for experienced web analysts?

Anyone who has read my blog for long knows that I am passionate about two things in web analytics: process and people. Process is the glue that holds all the hard work we do as analysts together and allows our effort to translate into tangible business value. But without a doubt it is the people who are absolutely critical to any business’s ability to compete and succeed with web analytics.

Unfortunately people, especially really good ones, are incredibly hard to find. So much so that my partners and I have invested heavily in creating an entirely new way for novice and veteran analytics practitioners alike to gain valuable “hands-on” experience using data to answer business questions: The Analysis Exchange.

While the Analysis Exchange has exceeded every single short-term milestone we have established for the effort, it has long been clear to my partners and me that training alone is not enough to satisfy the immediate needs of businesses working to take advantage of their existing investment in web analytics. Companies need analytical talent now, not a year from now, not in six months, right now.

Why the urgency? Myriad reasons. The money has been spent on technology, the clock is ticking, the promises have been made, offline revenues are in decline, and the company’s digital channels are the hope for the future and the difference between profitability and loss.

The web analytics promise is real — companies that have become adept at generating analytically-driven insights and then translating those insights into sound business decisions have staked out a clear competitive advantage. The giants of our industry — brilliant people like Joe Megibow, Dylan Lewis, Shari Cleary, and Lynn Lanphier <plug>all of whom are coming to the X Change conference in September, are you?</plug> — have not only determined the value of people but have also figured out how to convince management of that value.

Have you? Most companies have not.

Most companies persist in their belief that web and digital analytics is something that they can do “part time” and still have the successes that Intuit, Expedia, MTV, Best Buy, and others gain by hiring brilliant people, giving them clear direction, and recognizing the value of the analytical output they produce. Despite being well-intentioned, far too many managers still believe that software alone will provide insights and make recommendations.

But I digress.

Because we at Analytics Demystified believe in people and process so strongly, and because we are pretty confident in our consulting as it relates to process, we have decided to put our money where our mouths are and start helping companies fill their open positions for “web analyst, senior.” Today we are extremely proud to announce our first-of-its-kind partnership with the web analytics community’s leading recruiting firm, IQ Workforce.

Working directly with IQ Workforce CEO Corry Prohens and his team, Analytics Demystified has crafted a “one-two” punch to help speed the process of finding, vetting, and hiring the kind of deep talent and teams required to take complete advantage of any investment in digital measurement technology. The Demystified partners and IQ Workforce will help you determine exactly which roles you need to fill, what strengths the ideal candidate will have, and how new hires will fit into the organization in a way that both creates business value and provides a satisfying experience for the analyst (which has a surprisingly positive impact on retention!)

In essence Analytics Demystified with our 30+ years of experience in web analytics will sit on your hiring panel and help you find and hire the critical difference between “web analytics as a cost center” and “web analytics as a profit center.”

Did we mention we will do it for a fixed price and in a way that allows most companies to circumvent HR’s aversion to “outside help?”

If you’re looking for an analytics guru for your organization, give us a call. We are more than happy to explain how this partnership creates a dramatic advantage for most companies, and would love to talk with you about our business and our partners at IQ Workforce. In the meantime please have a look at our press release on the announcement and more details about the offering:

Thanks to Corry and his team for making this idea a reality. On behalf of IQ Workforce and the Demystified Partners we look forward to helping you with your staffing needs.

Adobe Analytics, General

Product Pathing

In my last post, I discussed how you can see how much money you are leaving on the table when it comes to the online shopping cart. While I still have the shopping cart on my mind, I thought I would follow this up with a concept I call Product Pathing. Product Pathing answers a question I get from time to time: how can I see the order in which website visitors are looking at my products or product categories? The following will provide details on why you might want to do this and how to implement it.

Why Product Pathing?
So why would you want to implement Product Pathing? Here are a few reasons:

  1. Understanding how visitors jump between products or product categories, which shows you how they navigate your products
  2. Seeing which products are viewed concurrently, which helps you understand what cross-sell/up-sell opportunities might exist
  3. If one of your website goals is to get visitors to view multiple products, you can measure how you are doing against that goal

There may be more reasons, but the preceding items should help you build a case for implementing this, especially since it is not difficult to do.

Implementing Product Pathing
So the standard way to see the answers to the questions above is to use page name-based Pathing reports. You might find the page name of a particular product and then look at Pathing reports to see what visitors did after viewing the product. However, I find that this approach does not work because there are so many pages on the website that it is impossible to sift through them all and isolate just product pages. Therefore, I am going to propose the following alternative solution:

  1. On all Product View Pages, in addition to setting a Product View Success Event and the Products Variable, pass the Product Name (or ID if that is all you have) to a new “Product” Traffic Variable (sProp). Be sure that you pass nothing but the Product Name (or ID) to this sProp.
  2. After that is done, enable Pathing on this sProp (a tagging sketch follows this list)
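
Here is a minimal sketch of step 1 for a product detail page, assuming prop15 is the new “Product” Traffic Variable (the variable number and product ID are placeholders):

  // Minimal sketch (placeholders): product detail page
  s.events = "prodView";
  s.products = ";SKU-123";  // Products Variable: category;product
  s.prop15 = "SKU-123";     // pass ONLY the Product Name/ID so the Pathing reports stay clean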

Believe it or not, that is all you have to do! By passing only the Product Name (or ID) to this new sProp, you will have a clean, new sProp that allows you to see Pathing reports on only Products like this:

Moreover, keep in mind that you have access to all Pathing reports so you get the bonus benefits of seeing the following as well:

  • How often visitors looked at Product X and then didn’t look at any other Products (Exit % – 42.32% in this case)
  • All paths containing Product X (Full Paths Report)
  • What Products visitors see (if any) between Product X and Product Z (Pathfinder Report)
  • How often visitors saw Product X and then Product Y (Fallout Report)
  • Which Products were viewed first the most often (Entries) or last the most often (Exits)

A Few Other Cool Uses of Product Pathing
In addition to this, there are a few other cool things you can do:

  • Instead of passing Product Names (or IDs), you can pass in Product Categories to see the same data at a higher level
  • Instead of passing Product Name values at the Product View Success Event, you can set an additional sProp in which you pass Product Names when the Cart Add Event is set to see the order in which visitors add products to the shopping cart

Adobe Analytics, General

Money Left on the Table

(Estimated Time to Read this Post = 3.5 Minutes)

Imagine that you are in a retail store and you grab a bunch of items, bring them up to the counter and, just as you are about to pay, you decide to push a few of the items off to the side and not include them as part of your purchase. While this may not happen too often in real life, it happens quite often in eCommerce. If you run a retail website, these discarded items can add up quickly! In this post, I am going to show you how to quantify how much money you are leaving on the table. For those not involved in a Retail site, I will also do my best to show how this concept can be applied to non-Retail sites.

The Standard Cart Process
So before we get to the more advanced stuff, let’s make sure we are all on the same page when it comes to the eCommerce shopping cart process. Normally, here’s how it works:

  1. Visitors view products on your website and you capture this with a Product View Success Event and store the products viewed in the Products Variable.
  2. At some point, visitors add items to the shopping cart and you set the Cart Add Success Event and the Products Variable with the product ID or name(s).
  3. Hopefully, visitors get to the Checkout Page and you set the Checkout Success Event and the Products Variable with the Product ID or name(s).
  4. Finally, the order is completed and you set the Purchase Success Event which sets the Orders, Units and Revenue Success Events for each Product purchased.

Hopefully this is straightforward and if you sell online you have successfully implemented these steps on your site. If so, you are ready to take things to the next level and do some stuff that is not traditionally done as part of standard eCommerce implementations.

How Much $$$ Left on the Table?
As the post name implies, in this scenario we would like to see how much $$$ we are losing online by website visitors leaving items in their Cart. If you think back to the initial scenario above, this is equivalent to the Retail store adding up how much they could have made that day if no one had left stuff on the counter when they were checking out. In addition to seeing how much $$$ is being missed out on, the store owner would probably want to know what products are being left to see if there are any patterns he/she could identify. For example, it may be the case that items over $100 are left more often than products under $100, etc…

Well, the good news is that if you are doing business online, this is much easier, and you can see a lot more data on the items being abandoned and the visitors who abandon them. So here’s how you do it:

  • When a website visitor adds one or more products to the shopping cart, in addition to setting the Cart Add Success Event (scAdd), you should set a currency Incrementer Event with the dollar amount associated with the items added (a tagging sketch follows this list). As a refresher, an Incrementer Event allows you to pass a numeric/currency value to a Success Event instead of using it as a counter. By passing in the dollar amount associated with the items added to the Cart, you will have a new metric which represents the total potential revenue you could have earned had no one left anything in the cart. I call this new metric $$$ Added to Cart.
  • Once this is done, you can compare this “$$$ Added to Cart” metric with your Revenue metric, either in a conversion funnel report or in a normal Conversion Variable (eVar) report by creating a Calculated Metric dividing the two metrics to see what % of $$$ Added to Cart turns into Revenue.
  • If you want to be even more particular, you can set another incrementer event with the $$$ that the visitor has in the Cart at the time of Checkout. However, if you find that you don’t have much loss between Cart Add and Checkout or between Checkout and Purchase, this may prove to be unnecessary.
  • Finally, since you are setting the Products variable with the Cart Add event already, when you compare these two metrics, you can easily break it down by Product (or any other eVar variables you have set previously).
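
As a rough illustration, here is what the “Add to Cart” tagging might look like, assuming event20 has been configured as the currency “$$$ Added to Cart” Incrementer Event (the event number, product ID and amount are placeholders):

  // Minimal sketch (placeholders): visitor adds two units of SKU-123 at $49.99 each
  s.products = ";SKU-123";
  s.events = "scAdd,event20=99.98";  // Cart Add plus the dollar value of the items added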

Beyond Retail
As promised, I wanted to touch upon a few ways you could use this same concept if you manage a non-Retail website. Here are a few that come to mind:

  1. On a Financial Services site, pass in the total loan amount a person is requesting and compare that to how much they are eventually loaned.
  2. On a Media site, pass in the total amount of advertising your site could have earned if all ads were clicked.
  3. On an Auto site, pass in the total value of cars visitors configure to see your max potential.
  4. On a Lead Generation site, pass in a value for every visitor who starts completing a lead form.
  5. On a Travel site, pass in the total value of trips planned online and compare it to the amount actually booked.
  6. On a Manufacturing site, pass in the total Bill of Materials value the visitor has added.

As you can see, the concept of measuring your high-end potential and comparing it to actual performance can be applied to almost any website and gives you another data point for comparison. I like using this metric better than Visits or Unique Visitors since it is not realistic to expect that you will convert every person who comes to your site. However, once visitors take more deliberate actions, they are qualifying themselves, and capturing their potential revenue therefore gives you a high but realistic goal to strive for and a KPI that you can use to see how you are doing over time.

Final Thoughts
So there you have it. Just a quick, easy way to add some more data to your all-important shopping cart process. In general, I feel like Incrementer success events are under-utilized by SiteCatalyst users so hopefully this example helps to get your mind working in new and inventive ways to use them…

Adobe Analytics, General

Traffic Correlation Hack

(Estimated Time to Read this Post = 3.5 Minutes)

From time to time, I will see people talking about some of the limitations of SiteCatalyst Traffic Data Correlations on Twitter and blogs. Below is the most common request I see:

For those of you not familiar with Correlations, they are a way to break one Traffic Variable (sProp) down by another sProp, assuming that both are set on the same page (image request). Unfortunately, in SiteCatalyst, the only metric you can see for Correlations is Page Views. In the example below, I can use a Correlation to see what page a visitor was on when searching for a specific phrase in the internal search box:

While this is handy, what if I wanted to see in how many Visits people searched for this phrase from each page or, better yet, how many Daily, Weekly or Monthly Unique Visitors did this? Currently, the only way to get at this information is to use DataWarehouse or Omniture Discover. However, the following section will show you a handy little hack to get this information directly from SiteCatalyst…

The Hack
So in this scenario, we will say that we want to see how many Weekly Unique Visitors searched for the phrase above from each page on our website (as in the example above). Here is the trick to doing this:

  1. Just as in a Correlation, you must have both data points you want to correlate available on the same page. In this example, it is Previous Page Name (from the GetPreviousValue JavaScript Plug-in) and the Internal Search phrase
  2. Once you have the two data points available, create a new Traffic Variable (sProp) and concatenate the two values using a separator. In this example, if the user searched for the above phrase from the Japan Customers page, the value would be “キーワード検索:SFDC:jp:customers”
  3. After you have passed the concatenated value to the sProp, contact your Omniture Account Manager and tell him/her that you would like to enable Visits, Daily Unique Visitors, Weekly Unique Visitors, etc… on that sProp (a tagging sketch follows this list)
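
For reference, here is a minimal sketch of step 2 for the internal search results page, assuming prop10 already holds the Previous Page Name from the getPreviousValue plug-in and prop25 is the new concatenated sProp (the variable numbers, the searchTerm value and the “:” separator are all placeholders):

  // Minimal sketch (placeholders): concatenate search term and previous page into one sProp
  var searchTerm = "chatter";              // whatever the visitor searched for
  s.prop11 = searchTerm;                   // existing Internal Search Term sProp (hypothetical)
  s.prop25 = searchTerm + ":" + s.prop10;  // e.g. "chatter:SFDC:jp:customers"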

When all that is done, using the example above, you will have a report that looks like this:

The confusing part of this hack is that you won’t actually use a Correlation report anymore. You will no longer open one report and break it down by another; instead, you will simply open the new sProp report and add the metrics you want to see. In the report above, I have added Page Views, Visits and Weekly Unique Visitors and searched on the specific Internal Search phrase in which I am interested.

However, as you can imagine, you could have a lot of unique values in this report. One caveat is that you need to make sure you don’t exceed the SiteCatalyst variable limit which is 500,000 unique values per month. In this example, you would want to make sure that the combination of Page Name and Search Term does not exceed 500,000 per month, but for most sites this shouldn’t be a problem.

Another thing to keep in mind is that the above scenario is just one example. This “hack” is by no means limited to Internal Search Terms and Pages. Here are some other examples of what you can do with this “hack:”

  • Unique Visitors/Visits who saw a specific page by visit number (i.e. Home Page:Visit 1, Home Page: Visit 2, Demo Page: Visit 1, etc…)
  • Unique Visitors/Visits who searched for a term by visit number (i.e. Pricing: Visit 1, Pricing: Visit 2, Demo: Visit1, etc…)
  • Unique Visitors/Visits who saw a specific page by country (i.e. Home Page:US, Home Page:UK, Demo Page:US, etc…)
  • Unique Visitors/Visits viewing a specific product page by search engine (i.e. Product X:Google, Product Y: Google, Product X:Yahoo, etc…)
  • Unique Visitors/Visits viewing a specific product page by search keyword (i.e. Product X:walmart, Product Y:walmart, Product X: walmart.com, etc…)

These are just a few examples and my advice is to look at whatever you are currently correlating (in Admin Console) and determine the items for which you would like to see Visits and Unique Visitors.

Advanced Users
For advanced users out there, I wanted to call out a few more things you can do with this concept:

  • If you don’t have a lot of unique combinations, you can add multiple correlations to the same sProp and use the search box to find the item combinations you need. For example, you may use the Internal Search & Page example shown above, but also store Page Name & Visit Number combinations in the same sProp. As long as all of your data is underneath the 500,000 unique value monthly limit you are ok. Alternatively, you can multiple sProps assuming you have enough variables that can have Visits and Unique Visitors enabled remaining in your contract.
  • With this alternative approach, you can also view Trending Reports for each of your combinations if you enable Pathing. This means that you can trend Visits or Unique Visitors for any combination (i.e. Weekly Unique Visitors who view Product Page X on Visit Number 1 over the last 90 days). This temporarily solves the following Idea Exchange item (Allow Trended Versions of Correlation Reports)

Final Thoughts
So there you have it. Just a quick “hack” that allows you to get a bit more information out of SiteCatalyst. In the future, perhaps Omniture will allow you to see Visits and Unique Visitors right from the normal Correlation interface (please vote for this here: Provide Visits and Unique Visitors in Correlation Reports), but in the meantime, hopefully this will help bridge the gap…

Adobe Analytics, General, Reporting

Site Wide Bounce Rate

In the past, I have written about Bounce Rates, Traffic Source Bounce Rates and Segment Bounce Rates. Hopefully this will be my last post related to Bounce Rates, but I recently found a “hack” to calculate and trend a Site Wide Bounce Rate in SiteCatalyst, so I thought I would share it. I define Site Wide Bounce Rate as the total number of Single Access Visits divided by the total number of website Visits. Unfortunately, for some reason, this metric is very difficult to wrestle down in Omniture SiteCatalyst because you cannot use Pathing metrics (i.e. Entries, Single Access) in Calculated Metrics unless you are within a Traffic (sProp) report that has Pathing enabled.

To date, the way I have reported on Site Wide Bounce Rate was by pulling Visits and Single Access data into Excel using the SiteCatalyst ExcelClient. Once there, I could divide the two and if I wanted to see it by day (or week or month), all I needed to do was to pull both metrics by day. It was straightforward, but I could not add this to my SiteCatalyst Dashboards.

The Hack
So let’s say that you want to report a daily/weekly/monthly trend of your Site Wide Bounce Rate and add it to one of your executive dashboards. Here are the steps:

  • First you need to create the required calculated metric. In this case you want to divide Total Single Access by Total Visits (or Total Entries which is the same thing). I would recommend making this a Global Metric so all of your users have access to it going forward:

  • Once this metric is created, open your Pages report, click the Add Metrics link and add the new Site Wide Bounce Rate metric to your list of metrics. It will be under the Calculated Metrics area. Place this new Site Wide Bounce Rate metric so it is the first metric and then add your regular Bounce Rate metric and finally add the Page Views metric and click the small triangle to sort by Page Views. When you are done, it should look like this:

  • When you click OK, you will be able to see a report that shows your most popular pages, the Bounce Rate for each page and the overall Site Wide Bounce Rate. This report is handy for seeing how each page is doing in relation to the Site Wide Bounce Rate.

  • However, our original objective was to see the trend of the Site Wide Bounce Rate and add it to a dashboard, so let’s get back on track. To do this, all you have to do is click the “Trended” link shown in the report above. As is always the case, trending will show you the left-most metric trended over your chosen date range (which is why it was important to put Site Wide Bounce Rate in the first metric slot!). After clicking it, you will see a report that looks like this:

So the resulting graph is your Site Wide Bounce Rate, and you can now add it to any SiteCatalyst Dashboard. However, as you recall, I mentioned this is a “hack,” so if you look closely you will see a bunch of pages in the data table for this report. What is strange is that the values for each row are exactly the same. This is where you can see how much of a hack this is. The data table is pretty much useless, so I recommend just adding the graph to your dashboards and ignoring the table. Perhaps in the future Omniture will let us add this type of Calculated Metric to the “My Calc Metrics” area so we don’t have to take such a convoluted path to add this trend graph to a dashboard!

Final Thoughts
So there you have it. A quick hack in case you ever need to calculate Site Wide Bounce Rate for your HiPPOs! Enjoy!

Conferences/Community, General

Update from The Analysis Exchange …

I have been so busy with clients, presentations, the launch of Twitalyzer version 3.0, and trying to enjoy the onset of summer that I have been a very bad blogger. I have missed opportunities to follow up on Steve Jobs’ mixed messages about analytics in the iOS platform, to talk about some really amazing Web Analytics Wednesday events that have been happening … heck, I’ve even missed the chance to weigh in on a really interesting (albeit one-sided) flame war between Quantivo and Google.

Oh were there only 38 or perhaps 42 hours in every day.

Still, as busy as I have been, I have been amazed at some of the success folks are having in the Analysis Exchange, so I wanted to drop a note and share some of what is going on:

  • First, and perhaps coolest, is this article in Internet Retailer about how the Public Broadcasting Service (PBS) is using the Analysis Exchange to gain insights into their traffic data;
  • Second, we have announced that our first-year goals are to leverage the Analysis Exchange community to produce free analysis for 1,000 nonprofits and non-governmental organizations, to train 500 “student” web analysts, and to create opportunities for participation for 150 “mentor” analytics practitioners;
  • Third, we have also announced that, as an incentive to participate, we will be awarding a complimentary pass to this year’s X Change conference in Monterey, California to one (1) student and one (1) mentor who distinguish themselves as participants in the Analysis Exchange effort;
  • Finally, we are just pleased as punch to get so much great feedback from participants, both via email and on Twitter. Folks really seem to be enjoying themselves which is awesome!

To keep track of the Analysis Exchange I have created a pretty elaborate dashboard. I’ll spare you all the details, but in an effort to be transparent in this work, here is the top-line summary that I watch change and improve every day:

“Of our 529 members, 46% have completed a profile. Based on the number of active and staffed projects, our member participation rate is currently 7%. Of our 54 organizational members, 52% have created projects and 13% have completed projects. Of our 21 active projects, 62% are fully staffed and 100% of those have set a starting date for the work.”

We have work to do, but it is great work and we hope you will join us and participate!

Analytics Strategy, Conferences/Community, General

The Analysis Exchange is OPEN TO EVERYONE

Back in December of last year Aurelie, John, and I announced an idea we believe has the potential to change the web analytics industry forever: The Analysis Exchange. Briefly, the Analysis Exchange is a totally new approach to web analytics training — one that depends less on what you read and more on what you do.

The Analysis Exchange lets experienced web analysts demonstrate their passion for their work and gives beginners valuable “hands-on” experience with data and real business problems. What’s more, the output from Analysis Exchange projects directly benefits some of the most amazing organizations around the globe — nonprofits and non-governmental groups who work not for money but for the betterment of humanity, our planet, and all creatures great and small.

You can read more about the origination of this effort in our blog posts and in a very nice write-up by our friend Jim Sterne, founder of the Web Analytics Association:

Since December we have been hard at work building out a web site and perfecting the business process that would be required to accomplish our core goals. What are those goals, you ask? Very, very simple … between now and June 1, 2011 we want to:

  • Provide FREE analysis to 1,000 nonprofit organizations
  • Provide FREE training and certification to 500 web analytics students
  • Provide FREE certification and support to 150 web analytics experts

1,000/500/150 are the numbers that we will be living by, but we know we’re not living there alone. We know this because the initial response to The Analysis Exchange has been tremendous! In addition to the great stuff we learned in our first testing round we have had excellent feedback from nonprofits, mentors, and students alike.

I love what Amy Sample, Director of Web Analytics at PBS Interactive, had to say:

“What I love about the Analysis Exchange is the learning is reciprocal.  Not only is the student learning about analytics and giving back to the organization, but the organization is learning from the student as well.  Many of our local PBS stations have little experience with Web Analytics.  Through the Exchange, the stations are able to learn how to tackle analytics problems along with the student and how to make a lasting impact to their own organization.”

Cindy Olnick from the Los Angeles Conservancy had similar enthusiasm for her project:

“Joy’s a terrific mentor from what I can tell, and she and Danielle are great at translating all the numbers into information I can use. They’ve given me a report and will set up some new parameters in Google Analytics targeted to my goal of increasing membership.”

Todd Bullivant, one of our students said:

“It was a great way to end the week! Thanks again to everyone for the opportunity. I learned a lot about analytics that I can use in my own organization as well as future projects. I hope to work on many more of these in the future! I also just heard that my company is planning to spotlight me in the next internal newsletter due to this project, so increased visibility!”

Susie Hall, Director of Outreach and Enrollment at Acton School of Business said:

“The Analysis Exchange project was very enlightening for me as well.  We found out some valuable information, and I’m excited to use this new-found knowledge to help shape our outreach efforts. This project could not have come at a better time, we are in the middle of changing pretty much all of our processes, so moving forward armed with such powerful information is invaluable. Andrew and Candace were lovely to work with, and I am very happy with the whole experience.”

One of our super-motivated students, Andrew Hall said:

“During the course of the project, I worked with almost every functionality of Analytics other than custom variables, got to understand how Adwords campaigns work, and learned the benefits of taking data from Analytics into an analysis software like Tableau to gain and communicate insights.  Most importantly, I confirmed that I really enjoy doing this!  I am waiting to hear back from a couple of jobs, but in the meantime I’ve decided the knowledge I now possess would be beneficial to a lot of organizations.  I feel confident enough to start approaching businesses and nonprofits in my community to get consulting work.”

By now I’m sure you get the picture, and our mentors have been having a great time as well. So much so that Joy Billings from Digitaria gave a positively glowing review of her work at the San Jose Emetrics, John Lovett’s student had her work go all the way to the CMO’s office at The Holocaust Museum, and Victor Acquah from Blue Analytics said:

“Just got off the presentation! Todd did such a great job with the analysis and presentation that it is hard to tell he hasn’t been in analytics for too long. Totally impressive output.”

We at Analytics Demystified have felt totally blessed to be part of the projects that have run so far … but now is the time to take it to the next level: Starting with the publication of this blog post, The Analysis Exchange is open to all students, all mentors, and all qualifying organizations around the world.

If you haven’t already, please create an Analysis Exchange profile and join us in our effort to change web analytics forever. If you need more information first we have lots and lots of content including:

If you’re already in the Analysis Exchange and you want to help, please reach out to nonprofits you know and ask them to create a project so you can work with them. If you’re on Twitter, please use our short link (http://bit.ly/analysis-exchange) to help spread the news. If you’re a member of Web Analytics Wednesday, please consider mentioning the effort at your next meeting.

Finally, I want to offer a special “thank you” to Aurelie, John, Jim, Holly Ross, Beth Kanter, Sean Power, and each and every one of the mentors, students, and organizations who have helped us over the last five months. You are all amazing for contributing your time and energy to help make this effort run as smoothly as possible. Thank you!

Adobe Analytics, Analytics Strategy, Conferences/Community, General

Excited to Announce X Change 2010 Keynote!

Now that Emetrics West is behind us, and what an Emetrics it was this year, Analytics Demystified and Semphonic officially start to ramp up our efforts to get the best of the best of you to join us for three days in Monterey September 20, 21, and 22. While I am excited about the entire event, I am particularly excited about our keynote offering this year titled “A Conversation with Management.”

Because the X Change draws so many expert practitioners, managers, and directors of web analytics, my general feeling has always been that we should be programming for “lifers” in the field, looking for opportunities to help participants expand their career horizons. Our “Conversation with Management” keynote brings together three of the most successful web analytics professionals I personally know:

  • Shari Cleary, Vice President of Digital Research at MTV Networks
  • Joe Megibow, Vice President of Global Analytics at Expedia.com
  • Steve Bernstein, Vice President of Analytics at Myspace

I have personally known Shari, Joe, and Steve for years and have had the great honor of watching each progress up the management chain, taking an increasing amount of responsibility with each step. Now all three of our keynote participants represent web analytics at the highest levels within each of their organizations, an incredible feat when you consider the footprint MTV, Expedia, and Myspace have on the Internet.

During our keynote I will be leading the panel to explore common “lifer” challenges including staffing, vendor management, the balance between reporting and analysis, their relationship with senior-most management, and the importance of business process to each of their jobs. My goal will be to get each to share details regarding their own career path in hopes those insights will help X Change attendees accelerate their own goals.

You can learn more about the 2010 X Change on our micro-site for the conference:

  • The 2010 X Change conference schedule
  • Hotel information for the Monterey Plaza Resort and Spa
  • Details about Analytics Demystified’s Think Tank offerings
  • Our X Change Frequently Asked Questions document
  • Registration for the 2010 X Change in Monterey, CA September 20, 21, and 22

If you have questions about the conference please don’t hesitate to give any of the Analytics Demystified partners a call or email. Remember that the conference is limited to the first 100 people who register and registration has already started.

See you at the X Change!

Analytics Strategy, Conferences/Community, General

Are you coming to Emetrics?

Well folks, it’s that time of year again. The winds are dying down and the flowers have all started to bloom so it must be time to make our annual pilgrimage to San Jose to bask in the glory of Jim Sterne and the Emetrics Marketing Optimization Summit! As usual I will be there and have the honor of sharing a keynote slot with my long-time friend and uber-optimizer Bryan Eisenberg!

  • Emetrics Keynote: Wednesday at 1:00 PM in the Grand Ballroom

Partner John Lovett will also be there, basking in his own glory on the heels of his Web Analytics Association victory … and taking the WAA’s new Certification test. I haven’t really had much time to think about the Certification yet but will be interested to hear what John and others taking the test have to say.

I also have the rare honor of presenting with Brett Crosby, Group Product Manager for Google Analytics and one of the nicest guys in the entire industry, hands down. Oddly he and I are presenting IMMEDIATELY AFTER his “What’s new from Google Analytics” pitch on Tuesday … but to compensate we’re gonna try something new and have a very loose “conversation” about web analytics that is more similar to an X Change mini-huddle than a traditional presentation.

  • Talking Analytics: Tuesday at 2:00 PM at The Conversion Conference (co-located w/Emetrics)

Finally I will be sharing the stage at Web Analytics Wednesday with Adam Laughlin from the nonprofit Save the Children. We will be talking about our respective community education efforts — his “Web Analytics Without Borders” WAA initiative and our own Analysis Exchange. I will be making a few exciting announcements about The Analysis Exchange next Wednesday so if you cannot attend Web Analytics Wednesday please watch my blog or follow me on Twitter.

  • Web Analytics Wednesday: Wednesday at 6:00 PM at the Fairmont in San Jose

That schedule again:

  • Tuesday, 2:00 PM at The Conversion Conference with Brett Crosby (Google)
  • Wednesday, 1:00 PM at Emetrics with Bryan Eisenberg (Emetrics Keynote)
  • Wednesday, 6:00 PM at Web Analytics Wednesday (The Fairmont Hotel, Market Street Foyer)

Thanks to Coremetrics and SAS for their generous support of Web Analytics Wednesday at Emetrics, by the way. Great companies like these are what keep WAW events around the world free and open to everyone!

See you in San Jose!

Analytics Strategy, General

My Interview with Adobe’s Chief Privacy Officer

Those of you paying close attention to issues regarding consumer privacy on the Internet are probably at least a little familiar by now with Flash Local Shared Objects (also called Flash “Cookies” by some). I wrote a white paper on the subject of Flash objects’ use in web analytics on behalf of BPA Worldwide back in February and had to update the blog post I wrote when I noticed that Adobe had wisely written a letter to the Federal Trade Commission regarding the use of Flash to reset browser cookies.

After writing that update I got in contact with Adobe’s Chief Privacy Officer, MeMe Rasmussen, who politely agreed to answer a few questions that I had about their letter and Adobe’s position on the use of Flash as a back-up strategy for cookies. Given that Scout Analytics is now reporting that Flash “Cookies” are increasingly being deleted by privacy-concerned Internet users, I figured it was a good time to publish my questions and MeMe’s responses.

The following are my questions (in bold) and Mrs. Rasmussen’s responses verbatim.

Flash Local Shared Objects (LSOs) have been around for a long time, and I have been aware of their use as a “backup” for browser cookies for reset and other calculations for a few years.  What made you write your letter to the FTC now?  Was there a specific event or occurrence?

The topic of respawning browser cookies using Flash local storage was publicized after research conducted by UC Berkeley on the subject was published in August 2009.  The topic was also raised at the FTC’s First Privacy Roundtable in December, so when the FTC announced that its Second Roundtable would focus on Technology and Privacy, we felt it was the appropriate opportunity for Adobe to describe the problem and state our position on the practice.

While I believe the position you outlined in your letter to the FTC is the correct one, you have put many of your customers in an uncomfortable position by condemning a practice that they have been using for quite some time — essentially issuing negative guidance where none had been previously issued (to my knowledge). What has the response to this been, if I may ask?

We have not received any comments or concerns from customers about our Comment Letter to the FTC.  Adobe’s position specifically condemns the practice of using Flash local storage to back up browser cookies for the purpose of restoring them after they have been deleted by the user without the user’s knowledge and express consent.  We believe companies should follow responsible privacy practices for their products and services, regardless of the technologies they choose to use.

On page 8 of your response to the FTC you discuss Adobe’s commitment to research the extent of this (mis)use of Flash LSOs.  Given the extent to which LSOs are being used perhaps “not as designed” and the sheer popularity of Flash on the web this seems quite a task.  Can you describe how you have started going about this effort?

We are currently in the process of defining the research project and are working with a well-respected consumer advocacy group and university professor.  At this time, the specific details of the project have not yet been finalized.

Within the web analytics community many have commented that your position on Flash LSOs may impact some of what Mr. Narayen and Mr. James have said about the integration of Omniture and Adobe products like Flash.  Specifically some of the commentary suggests a tight integration of Omniture’s tracking and Flash.  Does your position on LSOs as a tracking device change the guidance the company has issued to mutual customers?

No, the position we outlined in the FTC Comment on condemning the misuse of local storage, was specific to the practice of restoring browser cookies without user knowledge and express consent.  We believe that there are opportunities to provide value to our customers by combining Omniture solutions with Flash technology while honoring consumers’ privacy expectations.

One of the suggestions I made in the white paper with BPA Worldwide that you cited was to use Flash LSO as a back-up tracking mechanism but NOT to use it to re-spawn cookies.  From a measurement perspective there are a handful of good reasons to do this … does Adobe have a position on that strategy that you can outline?

The point we made in our FTC Comment was that we considered the practice of using Flash local storage to respawn HTML cookies without user consent or knowledge to be an inappropriate privacy practice.  In your white paper, you identified some uses of Flash local storage whereby browser cookies are rest but the use is given clear notice and an opportunity to consent.  We believe that technology should be used responsibly and in ways that are consistent with user expectations.  The example you presented in your white paper was an example of a Web site that, by giving notice and control to the user, implemented our technology in what appeared to be a responsible manner.

(Thanks again to MeMe and the team at Adobe for getting these responses back to me! As always I welcome your comments and questions.)

Adobe Analytics, General

Advanced Comparison Reports

In my last post, I covered Comparison Reports and showed the basics of how to use them. In this post, I will cover a few advanced ways that I use these reports in conjunction with Pathing reports.

Date-Based Pathing Comparisons
For many of the analyses I perform, I like to see how website visitors are navigating my site via Pathing reports. However, these reports are usually static – for a specified time period. Therefore, what I like to do is view a few of the standard SiteCatalyst Pathing reports in a way that lets me see how Pathing is changing day to day, week to week, or month to month. While SiteCatalyst doesn’t let you use comparison reports in all Pathing reports, there are two key ones that do allow you to compare date ranges – the Next/Previous Page report and the Full Paths report. The following are some examples of how you can take advantage of this.

Next/Previous Page Report
Let’s say that you have a key page like your home page and you want to see what pages visitors are going to from it in March vs. February. To do this, simply open the Next Page report (under Pathing) and select the Home Page as the page of focus. Once there, you can use the calendar as shown in my last post to select two different time periods (February & March in this case) and see the report:

When looking at this type of report, I tend to change the graph so I am looking at percentages instead of the raw numbers since that is what I care about the most. Also, you can apply Normalization to this report by selecting it in the report settings (to learn about Normalization see my previous post). Finally, the same principles apply if you want to see the above report as a Previous Page report to see how people are getting to a specific page differently between two different time periods.

Full Paths Report
In this next example, let’s say that we don’t have a specific page in mind, but rather, want to see how all of our paths are changing between two different time periods. To do this, you can use the Full Paths report (found under Pathing). This report shows all of your paths and can be quite massive, but I tend to use it to see just my top few site paths. Using the date comparison feature, you can see how the paths of the same site differ between two distinct time periods. To do this, simply open the Full Paths report and use the calendar tool to select your two date ranges and you will see a report that looks like this:

As you can see, this report allows you to look at your top paths and see if there are any significant changes you should know about. This is handy if you have a special promotion on a page and want to see how it is changing your Pathing behavior. Finally, there are some cool advanced things you can do in the Full Paths report like limiting paths to a specific number of pages and specifying an entry page, but you can explore that as needed.

Site-Based Pathing Comparisons [If you don’t have multiple report suites or ASI slots you can skip this section!]
As I mentioned in my previous post, the other type of comparison report you can create is one based upon report suites/ASI slots. These comparisons allow you to see how one data set is doing compared to another. A perfect example of this is if you are part of a multinational company and have basically the same website for different geographies. Other examples might be a company that has multiple divisions/products that are similar enough that they use the same type of website template. In both of these cases, you have the same page names, just different locales or products, and might want to see how one is doing vs. another. In this example, we will assume that a website is basically the same except for the site geography and that we want to see how people are navigating from the home page of two different geographies. If you are clever, you will quickly realize that this might be problematic since each website might have different page names that include the site locale. This is why in my Page Naming Best Practices post I recommended that you set a custom sProp without the site locale (and have Pathing enabled). Here is a quick excerpt from that post:

One wrinkle that can emerge is the case where you have multiple geographic websites. For many companies, this results in a similar version of the website, but translated into different languages. If you have this situation, I recommend tweaking the above page names to include a site locale indicator. For example, each page in the UK site should have “uk:” in the page name and so on. When this is done, your page names might look like this:

[Advanced User Alert – If you have multiple site locales, I also recommend passing the page name without the site locale to a different sProp (with Pathing enabled) so you can see how a page does across all site locales (i.e. Participation). I also like to pass the site locale by itself to a separate sProp so in a global report suite I can create correlations between sProps and other variables (i.e. Internal Search Terms).]

If you have done this, then you can simply open the next page report for this custom sProp (that has no site locale) and choose the “Compare to Site” option and select the sites that you want to compare. In this example, I am looking to see what visitors from Spain and Italy do when they are on the home page. In this case, I am normalizing the data so I can get a better view of the next page differences between the two sites even if they have different amounts of traffic. As mentioned previously, you can do the same thing for Previous Page and Full Path reports…

Using Comparisons With Other Custom sProps
Lastly, if you have Pathing enabled for other custom sProps, you can take advantage of this functionality there as well. Let’s say you have Pathing enabled on what internal search terms people look for on your website. You can compare this between websites or for two different time periods. Below is an example of looking at the same internal search terms for two different time periods.

Final Thoughts
These two posts should cover pretty much all you need to know about SiteCatalyst comparison reports. If you are using them in any other creative ways, please leave a comment here. Thanks!

Adobe Analytics, General, Reporting

Comparison Reports

Oftentimes, both when I used to work with clients and now internally, I am surprised to see how many SiteCatalyst users don’t take advantage of Comparison Reports within the SiteCatalyst interface. In this post I will review these reports so you can decide if they will help you in your daily analysis.

Comparing Dates
Hopefully most of you are familiar with this type of Comparison Report. This report type allows you to look at the same report for two different date ranges. To do this, simply open up an sProp or eVar report and click the calendar icon and choose Compare Dates when you see the calendar. In the example shown here, I am going to compare February 2010 with March 2010:

For this example, I have chosen the Browser report, using Visitors as the metric. After selecting the above dates, my report will look like this:

As you can see, SiteCatalyst adds a “Change” column where it displays the difference between the two date ranges. This can be handy to spot major differences between the two date ranges. In this case we can see that “Microsoft Internet Explorer 8” had a big increase and that “Mozilla Firefox 3.5” had a decrease (probably due to version 3.6!). You can compare any date ranges you want from one day to one year vs. another year.

However, when you compare ranges that have different numbers of days, your results can be skewed. For example, in the report above, March had three more days than February so that may account for why the differences between the two are so stark. If this ever becomes an issue, you can take advantage of a little-known feature of Comparison Reports – Normalization. In the report settings, there is a link that allows you to normalize the data. When you normalize the data, SiteCatalyst makes the totals at the bottom of each report match and increases/decreases the values of one column to adjust for the different number of days. I am not 100% sure what specific formula or algorithm is used to do this, but for the number of times you will use it, I would go ahead and trust it. Below is an example of the same report with Normalization enabled:

If you look closely, you will see that the March 2010 column has been normalized when we clicked the “Yes” link shown in the red box above. By doing this, SiteCatalyst has reduced the numbers in the March 2010 column to assume the same number of Visitors as there were in February. If you want to normalize such that February is increased to match March, you simply have to reverse the date ranges so when you select your dates, March is the first column and February is the second column (the second column is always the one that gets adjusted). As you can see, the “Change” column is now dramatically different! In this version, “Microsoft Internet Explorer 8” no longer looks like it has changed much. I find that using this feature allows me to get a more realistic view of date range differences.
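My working assumption is that Normalization does simple proportional scaling based on the two report totals. As a purely hypothetical illustration (all of these numbers are made up):

    Scaling factor       = February total Visitors / March total Visitors
                         = 90,000 / 100,000 = 0.9
    Normalized March row = 10,000 raw Visitors x 0.9 = 9,000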

Finally, you may notice a tiny yellow box in the preceding report image (says “6,847”). This is a secret that not many people know about. When you normalize data, Omniture artificially reduces or increases the values in the normalized column. But if you want to see what the real value is (if not normalized), you can hover your mouse over any value and you will see a pop-up with the real number! If you look at the first version of the report (the one before we normalized), you will see the same “6,847” number in the first row of the report… Pretty cool huh?

Comparing Suites
This second type of Comparison Report is the one that fewer people are aware of or have used. In this type of comparison, instead of comparing date ranges you compare different report suites. Obviously, this only makes sense if you have more than one report suite, but it also works with ASI slots so don’t assume this isn’t relevant to you if you have just one report suite. Much of the mechanics of this are similar to the steps outlined above. You simply open one report (in this case we will continue to use the Browser report) and then choose the “Compare to Site” link and choose a second report suite or ASI slot. In this case, I am showing an example of the Browser report for two different geographic locations. Since most report suites have different totals, I tend to use Normalization more in these types of comparison reports.

Final Thoughts
This covers the basics of Comparison Reports. Hopefully you can use this to start creating these reports and adding them as scheduled reports or even to Dashboards. In my next post, I will take this a step further and demonstrate an advanced technique of using Comparison Reports…

Analytics Strategy, General

iPad, Mobile Analytics, and Web Analytics 3.0

If you follow me on Twitter (@erictpeterson) you are likely already annoyingly aware that I rushed right out last week and bought Apple’s new iPad. I got the device for a few reasons but fundamentally it was because I’m a technology geek–always have been really–and despite knowing the iPad will only get better over time I was happy to shell out $500 to see what the future of computing and all media would look like.

Yeah, I see the iPad as the future of computing and all media. Bold, sure, but hear me out … and I promise I’ll make this relevant to web analytics, eventually.

I believe that all that the “average user” of any technology really wants is a simple solution to whatever problem they may have at the time. At a high level people look towards their operating system to simplify access to the multitude of applications and documents they use; at a lower level we want our applications to simplify whatever process we’re undertaking.

Proof points for my belief are everywhere, ranging from the adoption of speed dial on phones (simplifies calling your friends and family), power seats in cars (simplifies getting comfortable when you switch drivers), and even into web analytics where a substantial growth driver behind Google Analytics has been the profound simplicity with which important tasks such as custom report creation and segmentation are accomplished.

The iPad, and to some extent the iPhone and its clones, absolutely crushes simplicity in a way that is simultaneously brilliant and powerful. Want to read a book? Touch the iBooks application, touch the book you want, and start reading. Want to send an email? Touch the Mail app, touch the new icon, and start writing. Want to play a game or send an SMS or Tweet something? It all works exactly the same way … tap, swipe, smile.

Sure, the iPad is a little heavier than is optimal, and yeah it shows fingerprints and costs a lot of money and isn’t open source and … blah, blah, blah, blah. The complainers are gonna complain no matter what–you’re Apple or you’re not in this world I guess. But the complainers I think fail to grasp the opportunity the iPad creates:

  • The iPad takes mobile computing to an entirely new level. With iPad you have a 1.5 lb device that will let you read, write, watch, and generally stay connected from just about anywhere for up to 10 hours between charging. What computer or phone does that? None that I know of, and so iPad gives us a simple answer to “I need to work but I’m away from the office.”
  • The iPad enforces usability of applications, and this is a very good thing. The complainers complain that Apple asserts too much control over app design via their App Store acceptance processes. Apparently these folks haven’t used enough crappy software in their lifetimes and are hungry for more. Apple’s model and their application design toolkit gives us a simple answer to “I wish this software was easier to use.”
  • The iPad changes media consumption forever. Despite the Flash-issue, one I suspect will become a non-issue very quickly thanks to the adoption of HTML5, the iPad is the most amazing media consumption device ever created. It is a portable, high-definition TV, it is a near-complete movie library, it provides access to hundreds of thousands of books, and it allows you to surf the Internet in a way that can only be described as “delightful”. By definition the iPad gives us a simple answer to “I wish I had a way to keep my books, my movies, my newspapers, my TV shows, … all of my media, in a single place that could be accessed anytime from anywhere.”
  • The iPad changes education forever. I’m making a bet that by the time my first grade daughter hits middle school a significant number of children will carry iPads to school, not expensive, heavy, and immediately out-dated textbooks. Think about this for a second: interactive textbooks that can be updated as easily as a web site, think about young people’s media consumption model today, and think for just a second about why Apple would be motivated to provide “significant educational discounts” for the device. The iPad in schools gives us a simple answer to “How can we provide a common platform for learning that any student or teacher can immediately master and reflects our rapidly changing world?”

Think that last piece isn’t important? Have a look at the image at the right, sent to me by @VABeachKevin (thanks man!) where he has already translated all three of my books into the ePub format and placed them on his iBooks bookshelf! This collection gives any web analyst with the iPad instant access to hundreds of pages of web analytics insight, anywhere, anytime. How cool is that?

(And heck, these aren’t even Jim, Avinash, or Bryan’s books … I bet Kevin’s converting those as we speak!)

I suspect you cannot appreciate this until you have one in your hands, but the iPad has removed, or soon will remove, the need to purchase printed books, newspapers, and magazines. More importantly it gives the holder the ability to work efficiently from nearly any location around the world–all you need today is a Wi-Fi connection, and later this month that will be augmented with a 3G option.

Yeah, I’m an Apple fanboy, and yeah, I’m lucky to be able to drop $500 on technology without giving it much thought, but wait and see … I bet the adoption curve on the iPad will very much mirror that of the iPhone, which is essentially ubiquitous these days. And just wait until someone develops a full-featured web analytics data viewer that takes advantage of all the pinching, swiping, dragging, and zoom UI capabilities of the iPad–that will simply be awesome! Imagine:

  • Scrolling along through time by simply swiping left or right
  • Zooming in on data by tapping or dragging across several dates
  • Adding metrics and dimensions by dragging them onto the existing graph or table
  • Changing from graph to table by simply rotating the device

Total “Minority Report” for web analytics … and I bet we see this within nine months’ time. In fact, if you’re an Apple developer looking for an awesome project … call me! I’d love to help guide a team developing next-generation web analytics interfaces on tablet computers.

Why This Matters to Web Analytics Professionals

I said I would try and make this relevant to web analytics practitioners so here I go. The iPad matters to measurement folks for exactly the reason I outlined back in September, 2007 when I first wrote about mobile’s impact on digital measurement. Web Analytics 3.0, a term I coined at the time and one I still use, is essentially the addition of a completely new dimension for analysis: user location.

In a digitally ubiquitous world–again one I described in 2007 that has more or less come to pass (although the prediction was kind of like predicting gridlock in Washington or rain in Oregon in April)–the location a visitor is accessing information from becomes increasingly important and adds potentially significant context to any analysis we conduct. Location coupled with the device they’re using will likely have a profound impact on their likelihood to transact or otherwise use your site.

For example, a visitor accessing your site from home will likely have different needs and goals than one in their car, in an airport, in a coffeeshop, or in one of your competitors’ stores. In a world where an increasing number of visits are “out of home/out of office” visits conducted using mobile devices, our collective approach towards analysis needs to change, perhaps dramatically.

To be fair, this is not something you need to solve and resolve today. While our ability to discern and differentiate mobile visits is getting better all the time, our overall analytical capabilities for mobile, including the ability to tie mobile, fixed web, and offline visitors together, are still unfortunately complicated. On top of that, while applications are increasingly able to pass over geographic information, most web browsers are not, and so our ability to gather large quantities of this data is still limited …

… at least for the time being.

For now I stand by what I said back in 2007–digital ubiquity and location-awareness change everything. Back then the devices and platforms were just an idea; now we have the iPhone and its clones, the iPad is about to usher in a new era of mobile computing, Google and Apple are both behind mobile advertising, and the full scope of our analytical challenges is just beginning to emerge. If you’re struggling with how to measure your mobile investment and thinking about how that strategy needs to evolve please consider giving us a call.

What do you think? Do you have an iPad or do you refuse to purchase one? Why or why not? Have you already started to struggle measuring mobile devices or do you have it all worked out? Is this all as exciting to you as it is to me? As always I welcome your thoughts and comments.

Adobe Analytics, Analytics Strategy, General, Reporting

Integrating Voice of Customer

In the Web Analytics space, we spend a lot of time recording and analyzing what people do on our website in order to improve revenues and/or user experience. While this implicit data capture is wonderful, you should be supplementing it with data that you collect directly from your website visitors. Voice of Customer (VOC) is the term often used for this and it is simply asking your customers to tell you why your website is good or bad. There are two main ways that I have seen people capture Voice of Customer:

  1. Page-Based Comments – Provide a way for website visitors to comment on pages of your site. This is traditionally used as a mechanism to get direct feedback about a page design, broken links or problems people are having with a specific page. Unfortunately, most of this feedback will be negative so you need to have “thick skin” when analyzing this data!
  2. Website Satisfaction – Provide a way for visitors to rate their overall satisfaction with your website experience (vs. specific pages). This is normally done by presenting visitors with an exit survey where you ask standard questions that can tell you how your website is doing and compares your site against your peers.

There are numerous vendors in each of these spaces and the goal of this post is not to compare them, but rather discuss how you can integrate Voice of Customer data into your Omniture SiteCatalyst implementation. In this post, I am going to focus on the first of the aforementioned items (Page-Based Comments) and specifically talk about one vendor (OpinionLab) that I happen to have the most direct experience with (their headquarters was a mile from my home!). The same principles that I will discuss here can be applied to all Voice of Customer vendors so don’t get hung up on the specific vendor for the purposes of this post.

Why Integrate Voice of Customer into SiteCatalyst
So given that you can see Voice of Customer data from within your chosen VOC tool, why should you endeavor to integrate Voice of Customer and your web analytics solution? I find that integrating the two has the following benefits:

  1. You can more easily share Voice of Customer data with people without forcing them to learn [yet] another tool. People are busy and you are lucky if they end up mastering SiteCatalyst, let alone asking them to learn how to use OpinionLab, Foresee Results, etc…
  2. Many Voice of Customer tools charge by the user so if you can port their data into SiteCatalyst, you can expose it to an almost unlimited number of users.
  3. You can use Omniture SiteCatalyst’s date and search filters to tailor what Voice of Customer data each employee receives.
  4. You can divide Voice of Customer metrics by other Website Traffic/Success Metrics to create new, interesting KPIs.
  5. You can use Omniture SiteCatalyst Alerts to monitor issues on your site.
  6. You can use Omniture Discover to drill deep into Voice of Customer issues.

I hope to demonstrate many of these benefits in the following sections.

How to Integrate Voice of Customer into SiteCatalyst
So how exactly do you integrate Voice of Customer data into SiteCatalyst? For most VOC vendors, the easiest way to do this is by using Omniture Genesis. These Genesis integrations are already pre-wired and make implementation a snap (though there are cases where you may want to do a custom integration or tweak the Genesis integration). You can talk to your Omniture account manager or account exec to learn more about Genesis.

Regardless of how you decide to do the implementation, here is what I recommend that you implement:

  1. Set three custom Success Events for Positive Page Ratings, Negative Page Ratings and Neutral Page Ratings. These Success Events should be set on the “Thank You” page after the visitor has provided a rating.
  2. Pass the free-form text/comment that website visitors enter into an sProp or eVar. If they do not leave a comment, pass in something like “NO COMMENT” so you can make sure you are capturing all comments. If you are going to capture the comments in an sProp, I recommend you use a Hierarchy variable since those have longer character lengths vs. normal sProps, which can only capture 100 characters.
  3. Pass the actual page rating (usually a number from 1 to 5) into an sProp. I also recommend a SAINT Classification of this variable such that you classify 1 & 2 as Negative, 3 as Neutral and 4 & 5 as Positive. This classification should take less than 5 minutes to create…
  4. Use the PreviousValue plug-in to pass the previous page name to an sProp.
  5. Create a 2-item Traffic Data Correlation between the Previous Page (step #4) and Page Rating (step #3). This allows you to see what page the user was on when they submitted each rating.

All in all, this is not too bad. A few Success Events and a few custom variables and you are good to go. The rest of this post will demonstrate some of the cool reports you can create after the above implementation steps are completed.
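As a rough sketch of what steps 1 through 4 might look like on the rating “Thank You” page — the Success Event and variable numbers here are entirely hypothetical, the rating and comment values would come from your VOC tool, and a Genesis integration would normally wire most of this up for you:

    // Hypothetical mapping: event10 = Positive, event11 = Negative, event12 = Neutral Page Rating
    if (rating >= 4)      s.events = "event10";
    else if (rating <= 2) s.events = "event11";
    else                  s.events = "event12";

    s.hier1  = comment || "NO COMMENT";  // free-form comment (Hierarchy variable for the longer character limit)
    s.prop10 = rating;                   // raw 1-5 rating, classified via SAINT as Positive/Neutral/Negative
    s.prop11 = s.getPreviousValue(s.pageName, "gpv_pn");  // previous page name (the plug-in stores the current page name in a cookie on each page)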

Share Ratings
As I mentioned previously, you [hopefully] have users that have become familiar with the SiteCatalyst interface. This means that they have Dashboards already created to which you can add a few extra reportlets. In this first example, let’s imagine that you want to graphically represent how your site is doing by day with respect to Positive, Negative and Neutral ratings. To do this, all you have to do is open the Classification version of the Page Rating report (can be an sProp or eVar – your call) and switch to the trended view. You should have only three valid values and I like to use a stack ranked graph type using the percentage to see how I am doing each day as shown here:

This graph allows me to get a quick sense of how my site is doing over time and can easily be added to any Dashboard.

You can also mix your newly created Voice of Customer Success Events with other SiteCatalyst metrics. For example, while you could look at a graph/trend of Positive or Negative Comments by opening the respective Success Events, a better way to gauge success is to divide these new metrics by Visits to see if you are doing better or worse on a relative basis. The following graph shows a Calculated Metric for Negative Comments per Visit so we can adjust for traffic spikes:
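Assuming the Success Event names used above (yours will match whatever you actually called them), that Calculated Metric is simply:

    Negative Comments per Visit = [Negative Page Ratings] / [Visits]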

Find Problem Pages
Another benefit of the integration is that you can isolate ratings for specific pages. The first way to do this is to see which pages your visitors tend to rate positively or negatively. In the following report, you can open the Rating variable report (or Classification of it as shown below) and break it down by the Previous Page variable to see the pages that most often had negative ratings:

This will then result in a report that looks like this:

Alternatively, if you want to see the spread of ratings for a specific page, all you need to do is find that page in the Previous Page report and break it down by the Rating variable (or its Classification) as shown here:

Share Comments
As noted above, if you capture the actual comments that people leave in a variable, you will have a SiteCatalyst report that captures the first 256 characters of the comments visitors enter. This report duplicates scheduled reports from your Voice of Customer vendor in that it allows you to share all of the comments people are leaving with your co-workers. However, by doing this through SiteCatalyst, you gain some additional functionality that some VOC vendors don’t provide:

  1. You can create a Traffic Data Correlation between the Comments variable and the Previous Page variable so you can breakdown comments for a specific page. Therefore, if you have users that “own” specific pages on the website, you can schedule daily/weekly reports that contain comments only for those pages so they don’t have to waste time reading all of the comments left by visitors.
  2. You can use the Search filter functionality of SiteCatalyst to scan through all of the visitor comments looking for specific keywords or phrases that your co-workers may be interested in. In the example below, the user is looking for comments that mention the words “slow” or “latent” to be notified of cases where the visitor perceived a page load speed issue:

Set Alerts
Another cool thing you can do with this integration is set automated Alerts in SiteCatalyst so you can be notified when you see a spike in Negative Comments on your site. This allows you to react quickly to broken links or other issues before they affect too many visitors (and help avoid #FAIL posts in Twitter!). Here is an example of setting this up:

Review Problem Visits using Omniture Discover
Finally, if you have access to Omniture Discover, after you have implemented the items above, you can use Discover to do some amazing things. First, you can use the unlimited breakdown functionality to zero in on any data attribute of a user that is complaining about your site. For example, if you had visitors complaining about not being able to see videos on your site, you might want to see their version of Flash, Browser, OS, etc… or even isolate when the problem took place as shown here:

Additionally, you can use Discover to isolate specific comments and watch the exact visit that led to that comment. This is done through a little-known feature of Discover called the “Virtual Focus Group.” This feature allows you to review sessions on your site and see the exact pages people viewed and some general data about their visit (i.e. Browser, GeoLocation, etc…). While not as comprehensive as tools like Clicktale, it is good enough for some basic analysis. Here is how to do this:

  1. Open Discover and find the comment you care about in the custom sProp or eVar report
  2. Right-click on the row and create a Visit segment where that comment exists
  3. Save the segment in a segment folder
  4. Open the Virtual Focus Group (under Pathing in Discover)
  5. Add your new segment to the report by dragging it to the segment area
  6. Click “New Visit” in the Virtual Focus Group
  7. Click on the “Play” button to watch the visit

Now you can watch how the user entered your site, what pages they went to and see exactly what they had done prior to hitting the Voice of Customer “Thank You” page.

Final Thoughts
So there you have it, a quick review of some cool things you can do if you want to integrate your chosen Voice of Customer tool and Omniture SiteCatalyst. If you are interested in this topic, I have written a white paper with OpinionLab that goes into more depth about Voice of Customer Integration (click here to download it). If you have done other cool things, please let me know…

Analytics Strategy, General

Why Google is really offering an opt-out …

When I first saw the news of Google’s opt-out browser plug-in spread around Twitter I thought “hmm, I wondered when we’d see this” and moved on since opt-out is more or less a non-issue — basically because in the grand scheme of things nobody really opts out. For all the hand-wringing and navel-gazing people do on the subject of privacy online, I have never, ever seen any data that indicates that web users actively opt out of tracking in significant numbers.

Never.

If you have it, bring it on as I’d love to see it. But in my experience the only people really truly and actively interested in browser- or URL-based opt-out for tracking are privacy wonks, extreme bit-heads, and some Europeans. The privacy wonks and bit-heads are who they are and are unlikely to ever change; the Europeans have privacy concerns for other reasons but I will defer to Aurelie to try and make heads or tails of what those reasons are.

Still, it has been interesting to see some bright folks like Forrester’s Joe Stanhope offer some explanations about why Google might be doing this and what the ramifications might be. And it has been less interesting to see some of the fear mongering and hyperbole offered by Marketing Pilgrim’s Andy Beal in his post “Why your web traffic is going to nosedive thanks to Google,” although I found Econsultancy balances things out with their straightforward and tactful post “Will opt-out threaten Google Analytics?”

What Andy, Patricio, and to some extent Joe, apparently didn’t notice is that Google Analytics is about to make a big, big push into Federal Government web sites, and this browser-based opt-out is just a check-box requirement to satisfy the needs of said privacy wonks who for better or worse have the Administration’s ear (or some body part, you choose!)

Yep, the browser opt-out isn’t actually for anyone … except for perhaps the Electronic Freedom (sic) Foundation and their ilk. Google is somewhat brilliantly checking a box now so that when the Office of Management and Budget (OMB) releases all new Federal guidelines for browser cookie usage later this year any Federal site operator who wants can immediately dump their existing solution and go directly to Google Analytics.

You do remember that Google Analytics comes at the amazing deficit reducing price of ABSOLUTELY FREE. Even a Republican can get his or her arms around that price tag, huh?

You betcha.

“Hey wait,” you say, “what about the fact that Federal web sites will probably never get permission to track visitors over multiple sessions?” Good point, except did you know you can override Google Analytics’ _setVisitorCookieTimeout() and _setCampaignCookieTimeout() methods and set their values to zero (“0”), which effectively converts all Google Analytics tracking cookies to session-only cookies?

Yep.
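As a rough illustration of what that looks like with the asynchronous ga.js snippet — the property ID below is just a placeholder, and the overrides need to run on every page before the pageview call:

    var _gaq = _gaq || [];
    _gaq.push(['_setAccount', 'UA-XXXXXXX-1']);   // placeholder property ID
    _gaq.push(['_setVisitorCookieTimeout', 0]);   // visitor cookie (__utma) becomes session-only
    _gaq.push(['_setCampaignCookieTimeout', 0]);  // campaign cookie (__utmz) becomes session-only
    _gaq.push(['_trackPageview']);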

Not to mention that the little birds who sing songs in only hushed tones suggest that OMB is about to take a much more reasonable stance on visitor tracking anyway. This is not a done deal, but the situation that most Federal site managers work under today — one where many sites are more or less forced to use out-of-date log file analyzers and most are hamstrung in their ability to analyze multi-session behavior — seems to fly directly in the face of President Obama’s efforts to make government more transparent and effective.

I said as much just after he was elected, and then I said it again when I pointed out that Barack Obama should not fear browser cookies! Federal managers need modern, easy-to-use tools to improve the overall quality of government web sites.

Now, I could be wrong about all of this — I am human, and like Joe Stanhope I have not heard word-one from Google about the opt-out app — but I am pretty good at connecting dots and these are big, obvious dots:

  1. Google loves data
  2. Feds have tons of data
  3. Feds have requirements necessitating privacy controls
  4. Google builds privacy controls
  5. Google gets Feds data

This is actually pretty brilliant of Google if you think about it. Assuming you’re with me in my belief that Google Analytics isn’t about AdWords or Analytics or anything other than Google’s desire to have all the world’s data, then you’ll surely see that providing Federal web site operators a web analytics solution that simultaneously solves a multitude of analysis problems AND saves money is, well, pretty freaking brilliant.

Don’t take my word for it. Here’s a list of sites in the .gov domain that people are tracking using our free, browser agnostic web analytics solution discovery tool. We have about 100 sites total, the majority of which don’t appear to have any kind of tracking code at all, and of these:

  • 12% are using Google Analytics exclusively already
  • Another 3% are using Google Analytics with Omniture (1%) or Webtrends (2%)
  • 6% are using Omniture (one, GSA.gov in tandem with Webtrends)
  • 15% are using Webtrends (including GSA.gov in tandem with Omniture)
  • 63% appear to have no hosted analytics of any kind

If I’m right the evidence will be obvious as more of these “no hosted analytics” sites begin to have Google Analytics tags. Sites like Census.gov, the EPA, FCC, FEMA, HUD, and even FTC might all start to take advantage of Google’s largesse (and willingness to provide a browser-based opt-out, don’t forget that!)

What do you think?

As always I welcome your thoughts, observations, reaction, and even anti-tracking-pro-privacy rants. If you are a Federal site manager with insight to share but unable to voice your position publicly, then out of respect I am happy to have you post anonymously as long as you provide a valid email address that I will confirm and then convert to “anon@anonymous.gov” to protect your identity.

Adobe Analytics, General

Twitter Integration Enhancement Ideas

While I was at Omniture Summit last week, I couldn’t believe that it had already been a year since I started talking about integrating Twitter data into Omniture SiteCatalyst! Since I haven’t seen many updates about this integration come from Omniture, I thought I would share a few enhancements I have made over the year in case any of them are useful to those out there using the integration…

Competitor Twitter Share
When I first envisioned importing Twitter data into SiteCatalyst, my primary focus was tracking how often my brand was mentioned and importing the brand-related tweets. This allowed me to monitor my brand usage and filter tweet reports to send the right tweets to the right people based upon search phrases. However, the more I thought about it, the more I realized that this integration could be used to keep tabs on competitors as well. Instead of setting one “Brand Mentions” Success Event, you could expand the scope of what is tracked and also grab tweets mentioning your competitors and set a second Success Event named “Competitor Tweets.” This second Success Event allows you to trend your competitors and track them on the same SiteCatalyst dashboard you use to track your own brand:

This led me to another cool idea…Why not track overall “Competitor Tweet Share” in which you quantify the % of tweets your brand gets in relation to those of your competitors? This would allow you to trend your “share of twitter” for your narrow competitive niche. To do this, create a Calculated Metric as follows:
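The exact definition is up to you, but one plausible form, using the two Success Events described above, is your brand’s share of the combined tweet volume (invert the numerator if you would rather trend your competitors’ share):

    Competitor Tweet Share = [Brand Mentions] / ([Brand Mentions] + [Competitor Tweets])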

This results in a graph like this, which allows you to see when spikes occur and whether local events or press releases move the needle:

You can also set Alerts based upon this Calculated Metric to be notified when you are spiking or tanking in relation to your competitors!

General Tweets
The next concept I thought about was “general tweets” that were related to a business. For example, if you are Coca-Cola, you might want to keep tabs on tweets mentioning “soda” or “soft drink.” However, you wouldn’t want these counted as “Brand Tweets” or “Competitor Tweets,” so instead you can set a third Success Event called “Twitter General Mentions” and specify a list of keywords that should trigger this Success Event. This allows you to see if a list of “general” keywords related to your business is rising or falling over time, so you can gauge the overall level of interest in your category:

#Fail
Lastly, I decided that the #Fail hashtag was too good to pass up. If your brand is mentioned in the same tweet as the #fail hashtag, you probably want your social media team (if you have one!) to be alerted at once! To do this, all you have to do is create a scheduled report with #Fail in the search box and schedule it to run hourly. Unfortunately, SiteCatalyst delivers hourly reports whether there is data or not (to stop this please vote for this idea) so you may need your social media folks to create an Outlook rule to filter the alerts that say “No Data” in the subject.

In addition, you can perform the same exercise for your “Competitor Tweets” since your social media team may want to be notified when your competitors have a #Fail hashtag in tweets mentioning their brand name!

So there you have it…a few minor updates or enhancements to the Twitter – SiteCatalyst integration. If you have other ideas, please leave a comment here…Thanks!

Adobe Analytics, General

Page, Section, Site Naming Best Practices

Recently on Twitter, there was some discussion about the best way to name your pages in SiteCatalyst. Therefore, I thought I would share some thoughts on how to set names for Pages, Sections, etc… when using SiteCatalyst. Hopefully, you are already doing most of what I will mention here, but just in case, here are my suggestions. Also, keep in mind that many people have different styles of naming pages (and feel very strongly about it!) so what is shown here are my personal preferences…

Naming Pages
If you don’t manually apply a “friendly” page name to the s.pageName Traffic variable (sProp) in SiteCatalyst, Omniture will capture the URL by default. This is not ideal for the following reasons:

  1. URLs can be very long and exceed the sProp character limit (normally 100 characters)
  2. URLs can be hard to understand by end-users
  3. URLs can have querystring parameters that get cut off, which means that many pages get treated as one page name in SiteCatalyst (which ruins Pathing reports)
  4. URLs can have http:// and https:// versions, which means two versions of each URL, which subdivides Pathing, Unique Visitors, etc…

Therefore, it is highly recommended that you name your pages the way you would like to see them in the Pages report. I generally recommend naming pages based upon directory structures or manually by adding fields to your content management system. Once you figure out how you will name your pages, the key question is what should you name your pages. Here are my recommendations:

  • Make sure all pages within each unique website have a common identifier. For example, if you have three distinct websites that serve different purposes, I like to assign a value in the page name for each website so I can easily filter those pages in a global report suite (one that has data from all websites). For example, for the Omniture website, I would have an identifier for the public (marketing) website (i.e. “omtr:”) and a different identifier for say the Idea Exchange (i.e. “ideas:”).
  • I like to include the section in the page name when possible. For example, if the Omniture public website has a section for “Products” and another for “Services,” I would include those in the page name (i.e. “omtr:products:” or “omtr:services:”). This allows you to easily filter Page reports to get all of the pages within a section. Some companies also include the sub-section in the page name which is fine as long as you don’t hit the sProp character limit.
  • Make sure all pages have a unique name. If you have two pages with the same exact page name, SiteCatalyst will treat them as a single page and all stats for that page will be merged (including paths).
  • Be mindful of case-sensitivity. sProps are case-sensitive, so if you have the same pagename value, but in different cases, you will get two distinct page names. A common best practice is to force upper or lower case in the JS file to avoid any issues.

So if you put all of these ideas together, a list of your pages might look like this:
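Purely as a hypothetical illustration, reusing the “omtr” and “ideas” identifiers from above, such a list might include entries like:

    omtr:products:sitecatalyst overview
    omtr:products:discover overview
    omtr:services:implementation consulting
    omtr:company:contact us
    ideas:home
    ideas:submit idea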

One wrinkle that can emerge is the case where you have multiple geographic websites. For many companies, this results in a similar version of the website, but translated into different languages. If you have this situation, I recommend tweaking the above page names to include a site locale indicator. For example, each page in the UK site should have “uk:” in the page name and so on. When this is done, your page names might look like this:
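Again, purely as a hypothetical illustration, the same pages with a site locale indicator might look like:

    omtr:uk:products:sitecatalyst overview
    omtr:uk:services:implementation consulting
    omtr:de:products:sitecatalyst overview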

[Advanced User Alert – If you have multiple site locales, I also recommend passing the page name without the site locale to a different sProp (with Pathing enabled) so you can see how a page does across all site locales (i.e. Participation). I also like to pass the site locale by itself to a separate sProp so in a global report suite I can create correlations between sProps and other variables (i.e. Internal Search Terms).]

One other item related to site locales is the use of different languages or translated pages. While I do recommend different page names for each site locale, I do not recommend that you have different page names for the same page translated into different languages. You can read more about this in my old Foreign Language post.

As you can see, this doesn’t look all that hard to implement, but by using the above items you can easily:

  • Filter pages for a website (i.e. omtr: vs. ideas:)
  • Filter all pages for a specific section (Search contains :services:)
  • Filter all pages for a particular site locale (Search contains “:uk:”)
  • Filter on a combination of the above items. For example, let’s say you wanted to see all UK Product pages (Search contains “:uk:products:”)

When you look at this it makes common sense, but I can’t tell you how many clients I ran into that had incomprehensible page naming which made everything more difficult. Even if it means losing some historical page data, I always recommend that clients have good page names as it pays great dividends down the road…

Site Sections (s.channel)
After dealing with Page Names, the next thing I like to tackle is Site Sections. These are useful if you want to see how visitors are navigating your website at a higher level than pages. If you have good page names, you really should be able to build your Site Sections by setting them equal to the page name minus everything after the last “:” symbol. For example, in the example above, if the page name is “omtr:us:services:consulting” then the section would be “omtr:us:services” (you choose whether you want to include the “:” at the end or not). I have seen many clients that can set Site Section values automatically based upon good page names, which saves a lot of development work and ensures consistency.

Site
One variable that many clients forget to include is the Site variable. Essentially, what you are doing here is to pass in a value for the website by itself into an sProp. In the example above, this would mean values of “omtr” or “ideas” by themselves in an sProp. Doing this allows you to see total Visits and Unique Visitors by site in one report and when Pathing is enabled, allows you to see how people are navigating from one website to another. Again, if you have good page names, you can set the Site variable by simply grabbing everything before the first “:” symbol in the page name.
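Putting the last two ideas together, here is a minimal sketch of how both values could be derived from a well-formed page name (the sProp number used for the Site value is hypothetical):

    // Assumes s.pageName is already set, e.g. "omtr:us:services:consulting"
    s.channel = s.pageName.substring(0, s.pageName.lastIndexOf(":"));  // Site Section: "omtr:us:services"
    s.prop1   = s.pageName.split(":")[0];                              // Site: "omtr" (hypothetical sProp)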

Page Type
Those of you who have read my previous blog posts know that I am a fan of setting a Page Type on each page that represents the function of the page. I won’t rehash this topic, but recommend you check out my prior post on this.

Advanced Stuff
For those who are a bit more advanced in their SiteCatalyst usage, you can check out my other posts covering more advanced page-related topics.

So there are a few items related to naming Pages, Sections, etc… Let me know if you find these tips helpful and/or if you have come up with best practice suggestions of your own you'd like to share…

Adobe Analytics, General

Cross-Visit Traffic Source Attribution

Last week I shared a way to capture the various traffic sources (i.e. SEM, SEO, E-mail, etc…) so you could calculate the Bounce Rate for each of these Traffic Source types. In this post I am going to build upon this and show you another cool way you can leverage this to have what I call Cross-Visit Traffic Source Attribution.

What is Cross-Visit Traffic Source Attribution?
As an online marketer, one of the things I want to see is how each traffic source leads to online success. Within a visit, it is relatively easy to see which Traffic Source types lead to success. Normally this is done by capturing the various campaign elements and using SAINT Classifications to roll these up into Traffic Source types. However, what many marketers want to see is the overall mix of Traffic Source types that lead to success over several visits. For example, maybe Paid Search is always the last thing your visitors are doing before placing an order, but maybe the first thing they did was to click on an SEO keyword. I touched upon this a bit in an old blog post on Cross-Visit Participation which you can review here. If your organization has a desire to see a high-level view of which combinations of Traffic Source types lead to success, then Cross-Visit Traffic Source Attribution may be your answer.

Implementing Cross-Visit Traffic Source Attribution
If you have followed the instructions I laid out in my last blog post, then you have already done much of the work required to enable this feature in your SiteCatalyst implementation. Now that you have an sProp that contains the Traffic Source type set on the first click of each website visit, all you have to do is the following:

  1. Pass this value to an eVar (Most Recent Allocation)
  2. Implement the Cross-Visit Participation plug-in (a simplified sketch of the idea appears below)
  3. Have the eVar expire when your primary success event takes place (i.e. Orders)
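
To make the mechanics a bit more concrete, here is a simplified illustration of the idea behind the plug-in. It is not the actual Cross-Visit Participation plug-in code (which also handles details like capping and clearing the list), and the cookie name, expiration, and variable numbers are made up for the example:

    // Simplified illustration only (not the real plug-in): keep a running, pipe-delimited
    // list of Traffic Sources in a persistent cookie and copy it into an eVar.
    function appendTrafficSource(trafficSource) {
      var existing = (document.cookie.match(/(?:^|; )s_tsList=([^;]*)/) || [])[1] || '';
      existing = decodeURIComponent(existing);
      var list = existing ? existing + '|' + trafficSource : trafficSource;   // e.g. "SEO|E-mail|SEM"
      document.cookie = 's_tsList=' + encodeURIComponent(list) +
                        '; path=/; max-age=' + (60 * 60 * 24 * 90);           // keep the list ~90 days
      return list;
    }
    // On the first page of a visit: s.eVar20 = appendTrafficSource(s.prop10);  // numbers are placeholders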

As a refresher, the Cross-Visit Participation plug-in stores a list of elements, in this case Traffic Sources, across visits so that when a Success Event takes place, you can attribute the success to the current string of cross-visit values. For example, if someone comes to your site three times, first from SEO, second from E-mail and third from SEM, and then places an order, the current value in the eVar would be "SEO|E-mail|SEM." As time goes by, and you have more website visitors, the combinations that occur most frequently will rise to the top (web analytics Darwinism?). Usually the single Traffic Sources will be at the top (i.e. SEO by itself or SEM by itself), but what I look for are the multi-visit combinations near the top of the list. I sometimes even hide the individual items using the advanced search feature (Tip: use "Show if it contains" with "|") so I can see only the multi-visit Traffic Source combinations.

The only warning I will give about using this functionality is that it might burst the bubble of some of your co-workers who think that their Traffic Source type is the "be-all, end-all" of success. In my experience, many visitors bounce around between Traffic Sources quite a bit and the results can surprise you!

First Touch, Last Touch
When it comes to attribution, many talk about First Touch, Last Touch and All Touch, meaning which Traffic Source visitors saw first in a sequence leading to success, which one they saw last, or the full list of Traffic Sources that influenced the success. In SiteCatalyst, the easiest way to implement First Touch and Last Touch is to use two separate eVars. Both capture Traffic Sources, but one has Original Allocation and a long expiration (never, or say 6 months), while the other eVar is set to Most Recent Allocation and expires at the Visit. However, you can also use the new Cross-Visit Traffic Sources eVar shown above to do this. Simply download the above report to Excel, isolate the first or last Traffic Source in each row, and add up the Orders (or use a Pivot Table) to see the total for each Traffic Source.
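
If you go the Excel route, the isolation step is just a matter of splitting the pipe-delimited value. Here is the same idea sketched in JavaScript with a hypothetical value, so you can see exactly which pieces First Touch and Last Touch refer to:

    // Sketch: pull First Touch and Last Touch out of a cross-visit value like "SEO|E-mail|SEM".
    var crossVisitValue = 'SEO|E-mail|SEM';
    var touches = crossVisitValue.split('|');
    var firstTouch = touches[0];                    // -> "SEO"
    var lastTouch  = touches[touches.length - 1];   // -> "SEM"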

Traffic Source Influence (All Touch)
For me however, I am most interested in seeing the total influence of a specific Traffic Source (All Touch). While this is not readily available in SiteCatalyst (since Linear eVar Allocation only works within one visit), you can use the new eVar mentioned above to quantify the potential impact/influence of a specific Traffic Source Type. Here is how you do it:

  1. Download the report above to Excel (you decide if you want to include the single Traffic Sources or only when multiple exist – as shown above)
  2. Use an Excel formula to flag a specific Traffic Source Type (i.e. SEO) in every row where it appears anywhere in the cross-visit string
  3. Create a Pivot table off this new column (i.e. SEO) and look at the total Success Events (Orders in this example) that are associated with a row that contains the Traffic Source Type you chose in step two (in this case 754,328)
  4. Take that total (i.e. SEO-influenced Orders in this case) and divide it by Total Orders (which works out to 76.07% in this example). This shows the share of Orders where SEO was involved in at least one of the visits that ultimately led to the Order (see the sketch below).
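
Here is the same calculation sketched in code with made-up numbers, just to show the logic of steps two through four (the rows mimic what you would download from the cross-visit eVar report):

    // Sketch: total the Orders for every row whose cross-visit value contains a given
    // Traffic Source, then divide by total Orders to get that source's "influence" share.
    var rows = [                                      // hypothetical report download
      { value: 'SEO',            orders: 500 },
      { value: 'SEO|E-mail|SEM', orders: 120 },
      { value: 'SEM|Display',    orders:  80 }
    ];
    function influence(rows, source) {
      var influenced = 0, total = 0;
      for (var i = 0; i < rows.length; i++) {
        total += rows[i].orders;
        if (rows[i].value.split('|').indexOf(source) > -1) { influenced += rows[i].orders; }
      }
      return influenced / total;
    }
    // influence(rows, 'SEO') -> 620 / 700, or roughly 88.6% of Orders "influenced" by SEO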

Finally, if you want to see Cross-Visit Attribution of individual Campaign elements (Tracking Codes) instead of Traffic Sources, you can apply the same principles shown in this post and my last post.

Hopefully, between this post and my last post, you will be able to answer the nagging Traffic Source questions that come up from time to time and help your organization better understand where it should use its precious marketing dollars…

Adobe Analytics, General

Traffic Source Bounce Rates

Recently on Twitter, someone asked how you can calculate Bounce Rate for various Traffic Sources. In the past I have shed light on how you can create Bounce Rates for campaign elements, visitor types, etc…, but I never covered how to see Bounce Rates for SEO, SEM, E-mail, etc… In this post I will share the way to do this. If you are reading this post, odds are you know what Bounce Rate is, but quickly, it is the percent of visitors who saw one thing (normally a page or section) and then went no further. If you need a refresher on Bounce Rates, you can look at my old Bounce Rate post, or better yet, check out Avinash's post on Bounce Rates.

Why Traffic Source Bounce Rate?
Often times, marketers want to see how each of their disparate online marketing channels is doing when compared to the others. Most will rate them by how well they perform against the website KPIs. However, due to its popularity, many also want to see the Bounce Rate for these online traffic sources. While my Segment Pathing post showed you how to see the bounce rate for a specific traffic source element (i.e. an SEO Google keyword going to your home page), what if you want to see the total Bounce Rate for SEO or SEM? Unfortunately, that doesn't come out of the box in Omniture SiteCatalyst (though you can derive it in Omniture Discover). I am not going to tell you whether the Traffic Source Bounce Rate is a valid metric, as that depends upon your business objectives, but the next section will outline how to implement it.

Implementing Traffic Source Bounce Rate
So how do you implement Traffic Source Bounce Rate? Like any Bounce Rate metric, you need to be able to calculate Single Access and Entries. In SiteCatalyst, that means you need Pathing to see these metrics, so you know right off the bat we are going to need a Traffic Variable (sProp). Once you have identified which sProp you are going to use and had Pathing enabled for it by ClientCare, you need to find a way to get the various Traffic Sources that you use into that sProp. The ones I commonly use are:

  1. SEM
  2. SEO
  3. E-mail
  4. Display Ads
  5. Affiliates
  6. Social Media
  7. Other Websites
  8. Typed/Bookmarked (a.k.a. the rest)

The key to this solution is that you need to find a way to identify the Traffic Source of the first click to your site. This can be done manually in your JS file or semi-automated using the Unified Sources VISTA Rule or the similar Channel Manager Plug-in. Regardless of the method, what I try to do is to find something that uniquely identifies each online marketing channel. Usually the best way to do this is through a query string identifier.

Here is how I do this:

SEM – If a click to your website comes from a Search Engine, you should have an identifier (i.e. ?s_kwid=) in the URL. If you do, you know the Traffic Source is SEM.

SEO – If the click comes from a Search Engine, but doesn’t have that identifier, it is SEO!

E-mail – When you send e-mails, you should be tracking the inbound clicks with a query string parameter. If so, set it to something unique for e-mails (i.e. ?eid=) so you know that if you see that identifier in the URL, the Traffic Source is E-mail.

Display Ads – In a similar manner, if you are buying Display Ads, you normally get to choose the destination URL. Therefore, you can set the destination URL on your site to have another unique identifier (i.e. ?displayid=) so you know which clicks have come from Display Ads.

Affiliates – See Display Advertising (i.e. ?affID=)

Social Media – This one is a bit trickier, but what I do is make a list of the key Social Media sites I want to track and, when I see that the referring URL contains one of those domains, I set the Traffic Source to Social Media.

Other Websites – If all of the above criteria have not been met, but there is a referring URL, set the Traffic Source equal to “Other Website.”

Typed/Bookmarked – If none of the preceding conditions have been met and there is no referring URL, set the Traffic Source to “Typed/Bookmarked.”

Phew! It sounds difficult, but you should be using distinct query string parameters across your campaigns anyway, and any good JavaScript developer can do this somewhat easily. It does require coding (which I don't do myself!), but maybe Omniture will provide this out of the box one day…
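
For the developers in the audience, here is a rough sketch of what that first-click logic might look like. The query string parameters match the examples above, but the search engine list, social site list, and sProp number are placeholders you would swap for your own:

    // Sketch of the first-click Traffic Source classification described above.
    function getTrafficSource() {
      var url = window.location.href;
      var ref = document.referrer;
      var searchEngines = /google\.|bing\.|yahoo\./i;      // trim or extend to taste
      var socialSites   = /twitter\.com|facebook\.com/i;   // your key Social Media sites
      if (/[?&]s_kwid=/.test(url))    return 'SEM';
      if (searchEngines.test(ref))    return 'SEO';
      if (/[?&]eid=/.test(url))       return 'E-mail';
      if (/[?&]displayid=/.test(url)) return 'Display Ads';
      if (/[?&]affID=/.test(url))     return 'Affiliates';
      if (socialSites.test(ref))      return 'Social Media';
      if (ref)                        return 'Other Website';
      return 'Typed/Bookmarked';
    }
    // On the first page of the visit: s.prop10 = getTrafficSource();  // prop number is a placeholder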

But Wait…There’s More!
Believe it or not, you are not done yet! Once you have found a way to distinguish the Traffic Source and are passing it into an sProp on the first page of every visit, you are 90% of the way there. The last step is a bit confusing (techie alert!). In order for SiteCatalyst to know that a visitor made it beyond the first page of the visit (and hence did not "Bounce"), it needs to see a different value in that sProp at some point during the Visit. If it doesn't see another value passed to the sProp, it will assume the visitor saw no other pages and exited the site (your boss won't want to see a 100% Bounce Rate for every channel – trust me!). Therefore, when a visitor navigates to a second page in the visit (any page – it doesn't matter which one), you need to force a "dummy" value into the same sProp that previously held the Traffic Source. My clever developer passes in the value "Did Not Bounce" as the dummy value. I will let those more technical than me debate the best way to pass this dummy value, but once you have done it, you will have an sProp with one value for each of your Traffic Source Types plus one extra value for the dummy. Since this sProp has Pathing enabled, you will have Single Access and Entries metrics for each of your Traffic Sources and can now calculate Bounce Rate (I recommend using Advanced Search to hide the dummy value and saving it as a Custom Report so you don't confuse your users).
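
Here is a sketch of one way to handle the dummy value, building on the classification sketch above (the cookie name, label, and prop number are examples, and a session cookie is used as a rough stand-in for "visit"):

    // Sketch: set the Traffic Source on the entry page only and "Did Not Bounce" on every page after it.
    function setTrafficSourceProp() {
      var isEntryPage = !/(?:^|; )s_visitStarted=1/.test(document.cookie);   // no marker yet = first page
      if (isEntryPage) {
        document.cookie = 's_visitStarted=1; path=/';   // session cookie approximates the visit
        return getTrafficSource();                      // from the sketch earlier in this post
      }
      return 'Did Not Bounce';                          // dummy value on every subsequent page
    }
    // e.g. s.prop10 = setTrafficSourceProp();
    // In reporting, Bounce Rate for each Traffic Source = Single Access / Entries for that sProp value.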

For the most part, this sProp won’t have much value beyond calculating the Bounce Rate since it is really only set on the first page of the visit, but here are some additional goodies:

  1. Use Trended reports to monitor Traffic Source Bounce Rates over time
  2. Enable Daily, Weekly, Monthly, etc… Unique Visitors on the sProp to see Uniques for each Traffic Source
  3. Correlate it to any other sProps that are most important on the first page of the visit (i.e. Referrer, Visit Number, etc…)
  4. There is one more cool thing you can do with this, but it is so cool, I need to do a full post about it so stay tuned for my next post…

Final Thoughts
Well there you have it. I wish it was not so convoluted, but don’t shoot the messenger! If anyone else knows an easier way to do this, I am all ears. I apologize for this being a bit more technical/complicated than most of my posts, but I don’t know of a non-technical way to explain this. Let me know what you think…

Analytics Strategy, Conferences/Community, General

Welcome to Analytics Demystified 2.0

By now you've noticed that we've completely re-done the Analytics Demystified web site, that is, unless you only ever read my posts in an RSS reader, in which case I would ask you to click through and have a look. The new site is the culmination of nearly a year's effort, starting with convincing my good friend Aurelie Pols to join Analytics Demystified and, more recently, convincing my other good friend John Lovett to leave his cushy job at Forrester Research to join Aurelie and me. Hopefully you find the new site more streamlined, easier to read, and a little more focused on the aspects of Analytics Demystified we are working to feature.

My own personal highlights include:

  • Totally free copies of Analytics Demystified, The Big Book of Key Performance Indicators, and the KPI book’s companion worksheets. I made the decision to start giving my books away for one reason and one reason only: to continue to do everything humanly possible to educate as many future web analytics professionals as possible. The response today was good (see image below!)
  • Totally revamped mini-site for The Analysis Exchange, including the ability for everyone to start creating their member profiles. The Analysis Exchange has exceeded every single expectation that I had going in, thanks to many people's efforts. If you're interested in helping the Analysis Exchange or learning more about the effort please visit http://www.analysis-exchange.com
  • Partially revamped mini-site for Web Analytics Wednesday, with more features and updates coming in Q2. Web Analytics Wednesday has become such an automated delight, and with SiteSpect and Coremetrics renewing their sponsorship in 2010 we hope to do even more this coming year!
  • All new look and feel for my, Aurelie's, and John's blogs, and the addition of our new Emerging Technology blog. Because so much of our traffic is driven by the blogs, and so many of our clients find us based on our writing here, we wanted to ensure that reading our blogs was as distraction-free as possible. The Emerging Technology blog is something we think of as "TechCrunch for Web Analytics" and we hope you'll check it out.
  • We have also worked to clarify what the Analytics Demystified web analytics consulting business and Senior Partners do, when we’re not supporting the community at large. Perhaps a small point, but one that pays the bills, so if you need help getting your web analytics strategy defined, please give us a call.

One thing about my last point, our consulting business and giving us a call. On past sites there were dozens of calls to action and conversion points I was trying to get people to and through. On this site there is one: getting YOU to reach out to US. It may sound glib, but we are able to do more for people who simply email, call, Skype, or Twitter us than most folks can imagine, and often times our help comes without any kind of fee.

Put another way, if you need our professional help, we’ll help you and hopefully you’ll be satisfied with what we ask you to pay. But if you need our guidance, suggestions, or honest opinion, we’ll help you without ever bringing up fees or asking for money. Like the book giveaway, Web Analytics Wednesday, and The Analysis Exchange we have found that simply answering questions without expectation of compensation is often times better than getting paid.

In closing I am totally delighted with the traffic we had to the site today thanks to Twitter, the #measure channel, and the book offer. Based on my Omniture Insights reporting we were completely off the charts in Europe and this AM in the U.S. We’d love your help spreading the word about the book! If you can, tell people to click through on http://bit.ly/demystified-books or simply to check out the new web site.

As always I welcome your comments, critique, and feedback, especially if you have nice things to say about the new site or want to help me identify bugs (since not all of you use Chrome on the Mac … LOL!)

Adobe Analytics, General

Basic Brand Awareness Tracking

One of the holy grails for online marketing teams is finding a way to track and measure a company's Brand Awareness. There are many different approaches to doing this, including the use of products like comScore, Compete, and Twitter, but more often than not, it takes place offline in research studies. While this trend is not going to change anytime soon, as a web analyst you may be looking for data that you can collect to provide an estimate of your Brand Awareness. Therefore, in this post, I wanted to share a "quick and dirty" way to use online data to see and trend the popularity of your company brand. While this will not be a comprehensive approach, it might provide a basic starting point into the larger "Brand Awareness" puzzle.

Why Track Brand Awareness?
There are many schools of thought on whether it is even worthwhile to try and track Brand Awareness. While people like us try to track everything, sometimes, there are things that are just not meant to be tracked. If you own a website that sells stuff, then there is so much you can do with Web Analytics that tracking Brand Awareness is probably way down on the list. However, there are many companies (i.e. B2B) that don’t sell products directly and inevitably the question arises:

“What is the true purpose of my website?”

If you are part of one of these companies, the above question is often followed by a spirited debate about whether success should be judged by lead counts, unique visitors, visitor engagement, etc… At some point one Marketer will say that the website should be used to build Brand Awareness, so success should be judged by increasing Unique Visitors, only to be countered by another saying that Unique Visitors don't mean anything if they aren't the right types… After about an hour of this, there is rarely a consensus on how to judge success. You can see why this is not a popular topic in Web Analytics circles!

Amid all of this confusion, I think that people sometimes forget the real reason that people care about Brand Awareness. At the end of the day, you want to measure how often consumers that are interested in a product/service that you provide think of you when the time comes to research or buy that product/service. If you are doing a really good job at branding your company such that you are top of mind when consumers are at this stage, then one way or another you have done something right. This is why I think there is some value in trying to quantify this and trend it over time.

So What Can Be Tracked?
So building upon the previous section, let’s assume that you don’t sell a product directly on your website, but that there are consumers out there who need your product/service (and have a blank checkbook in hand!). Do you think they would:

  1. Come to your office and ask to see your salespeople?
  2. Pick up the Yellow Pages and give you a call?
  3. Mail you a letter asking for information?

Maybe in the 1980s, but not today! Most are going to go to a Search Engine, and a few savvy ones will go to Twitter. So if the bulk of these consumers will go to a Search Engine, and you are truly "top of mind" from a branding standpoint, they would probably search for your company name or the name of one of your products. For example, if the consumer is looking for a "CRM" product they might search for "CRM." But if you are doing your job and have an awesome brand such that the first thing people think of when they think about "CRM" is your company brand (I don't know…maybe something like "salesforce.com" ;-)), then you would know that your brand is alive and kicking!

Following this logic, you can see that one interesting way to track your brand awareness is to quantify how often people are coming to your website from a list of “Branded” keywords of your choosing. This list of keywords would include your company name, product names, key executive names, etc… If you can aggregate these SEO keywords (I wouldn’t include Paid Search Keywords), then you have a number that you can trend over time. Keep in mind that this is not an exact way to track brand awareness, but the logic behind it is that the more people [organically] search for your key brand phrases, the more pervasive your brand is out there. In my consulting experience, I have often found that the number of SEO Brand Searches has a direct correlation with other key website success metrics.

So How Do I Implement SEO Branded Keyword Tracking?
In a perfect world, it would be great if there were an easy, reliable way to track how often your brand keywords were searched on all of the major search engines. Companies like comScore try to estimate this, but it is not always accurate due to the panel-based methodology. Another way I have tried to get at this data is through Google Trends, but I have not found ways to automatically export that data through APIs (if you know how, please let me know!).

That being said, if you want to use SEO Branded Keywords to track your brand, take the following steps:

  1. Work with your Marketing team to identify the list of keywords that everyone agrees are “Brand Keywords.” In order to not distort the trend, it is important that you not continually add to the list so try and get an exhaustive list and stick to it for an extended period of time (i.e. readjust yearly).
  2. The next step is to isolate these Branded Keywords in your SEO reports. One way to do this is to add each one to the advanced search criteria for your SEO Keywords report (in the interface or ExcelClient), but if you have a lot this can be difficult. My preferred approach is to pass SEO Keywords to a custom eVar. Once you have done this, you can use SAINT to classify these keywords as “Branded Keywords” and then use the trended view of reports. If you are using the Channel Manager plug-in or the Unified Sources Vista Rule, you should already have the data you need in a custom variable.
  3. Once you have these branded keywords isolated, you can create a trended report of your Branded Keyword searches (see the sketch below for one way to capture the keyword)
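
If you go the eVar route in step two, here is a rough sketch of capturing the natural-search keyword and flagging it against a brand list (the search engine list, query parameters, brand terms, and eVar number are all illustrative; SAINT classifications can do the same grouping after the fact):

    // Sketch: grab the keyword from a natural-search referrer and keep it only if it is a brand term.
    function getBrandedSeoKeyword() {
      if (/[?&]s_kwid=/.test(window.location.href)) return '';      // paid search click, skip it
      var ref = document.referrer;
      if (!/google\.|bing\.|yahoo\./i.test(ref)) return '';         // not a search engine referral
      var match = ref.match(/[?&][qp]=([^&]*)/);                    // q= / p= keyword parameter on the referrer
      if (!match) return '';
      var keyword = decodeURIComponent(match[1].replace(/\+/g, ' ')).toLowerCase();
      var brandTerms = ['salesforce', 'sales cloud', 'service cloud'];   // example brand keyword list
      for (var i = 0; i < brandTerms.length; i++) {
        if (keyword.indexOf(brandTerms[i]) > -1) return keyword;    // branded keyword, pass it along
      }
      return '';                                                    // non-branded, leave the eVar empty
    }
    // e.g. s.eVar30 = getBrandedSeoKeyword();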

In addition, if you have specific products that are brands of their own, you may want to apply the same technique to the SEO Keywords that represent those brands and chart the Brand Awareness of your different products against each other (maybe inspire some competitiveness?). For example, at Salesforce.com we group our products into "Clouds," so you might chart the SEO Keywords related to the various "Clouds" on one graph to see how each is doing.

Don’t Forget About Twitter!
As mentioned earlier, another way to look at how your brand is doing is to look at Twitter. This can be done using the Omniture Twitter Integration I proposed last year. Implementing this provides you with a way to see and trend how often your brand is being talked about.

If you want to get fancy, you can even measure how your brand compares to your competitors' brands on Twitter. I call this metric "Twitter Competitive Share" and it is calculated by the following formula:

Branded Tweets / (Branded Tweets + Competitors Branded Tweets)

The result is a percentage that you can trend over time to see whether you are gaining or losing share of the Twitter conversation.

Final Thoughts!
Well there you have it, definitely not world peace, but if you are looking for some different ways to leverage your web analytics data, hopefully these ideas give you some food for thought. If there are other ways that you are using web analytics data to track Brand Awareness, please leave a comment here as I’d love to hear about it…

Analytics Strategy, General

The Most Important Post on Web Analytics You'll Ever Read

When John Lovett joined Aurelie and me here at Analytics Demystified earlier this month, an awful lot of people said, "Hey, nice job getting such a nice guy on board," "We love John, he's great," and "Man, what a great addition to your team!" Clearly John has the respect of the industry, but one thing that remained an open question in some people's minds was "how will John make the transition from the ivory tower an analyst sits in to the ground floor where consultants actually do work?"

I admit, I wondered that too in a way, having made a slightly different transition myself years ago. It’s not easy to come away from a situation where you provide advice but are tasked with, honestly, doing very little real work. During my own tenure at JupiterResearch years ago I ensured my own connection to practical web analytics by writing my second and third books. But John had been an analyst for nearly 10 years … and so wondering how he’d hit the ground was a reasonable question.

Wonder no more.

While John has already contributed greatly to the business's bottom line and helped out with one of our largest new retail clients, he absolutely floored me this morning when he published his post Defining a Web Analytics Strategy: A Manifesto. I had asked him to elaborate on some comments he made at Emetrics where he essentially pooh-poohed the use of so-called "Web Analytics Maturity Models," describing the almost religious zeal some people seem to have when talking about models and declaring himself a "Model Atheist."

Having written the original Web Analytics Maturity Model back in 2005, I have had first-hand experience with their failure to produce anything more than a generalized awareness that most companies simply don’t “get” web analytics, something that we more or less all know already. But honestly I was surprised when John took this position on the subject because, well, in my experience those that don’t do, teach, and models are a classic teaching tool.

I had assumed that as an analyst John was a teacher, not a do-er like I have been for years now in my capacity as a practice leader, consultant, and web analyst. Man was I wrong …

John's "Manifesto" is perhaps the most lucid yet succinct explanation I have ever read detailing the steps required to make web analytics work for your business (as opposed to the other way around). I almost asked him to edit the post for fear that he was opening our kimono too much, but if Social Media has taught us anything it has taught us that transparency is king. The fact that he managed to encapsulate what others have been trying to explain with long-winded speeches, tangential arguments, and downright rude behavior is a huge plus.

Some of you may read John’s manifesto and think “Gee, this seems to point to the need for outside consultants” which is a fair criticism. But before you react consider two things:

  1. Consultants (like us) have a tendency to, you know, recommend consulting. Everyone’s perspective arises from their own personal biases, regardless of how many times they declare the contrary. We are consultants, consultants who want to feed their children. Forgive us our bias and we will forgive you yours …
  2. Consultants in the Enterprise are like death and taxes, we are more or less inevitable. Often times an outside perspective is exactly what the business needs to actually start to act upon the message that otherwise great employees have been stating for years. Other times the business simply stops listening to their employees and won’t make a move until McKinsey, Bain, or Demystified come in and charge big money for insights that were already there. Either way, ours is the second (or is it third) oldest profession and it must be for a reason …

I would challenge you, dear reader, to spend some time reading John's post and considering what he has to say. Think about how you could apply his ten insights to your business regardless of whether you turn to consultants for advice or not. Listen to your business partners' needs, put away your models and roll up your sleeves, transcend mediocrity, establish your own waterfall, and embrace change!

When I said “web analytics is hard” I meant it, I really, really did. But I wasn’t trying to box anyone in or establish myself as some kind of amazingly wonderful “guru”, I was simply telling you all the truth based on my dozen years of experience in the sector. Yes, getting started can be easy; yes, making Google Analytics do stuff can be easy; and yes, you can do an awful lot in an hour a day if you simply apply yourself to the task … but the problem is that within any business of size, complexity, or nuance — which is to say all businesses everywhere — the act of getting from raw data to valuable business insights that you can repeatedly take action upon is apparently so freaking difficult that almost nobody does it.

How is that “easy?”

You all know I love a good debate so if you disagree with my comments here please let me know. If, however, you have something to add to John’s manifesto, I would encourage you to comment on his blog post directly.

Happy Holidays, everyone.

General

Defining A Web Analytics Strategy: A Manifesto

…Strategy is all talk unless it can be executed in a way that delivers on both the creative and business promises. ~Liz Gebhardt, Thinking Out Loud

I've been thinking a great deal about strategy lately and what it means to truly build a winning strategy for Analytics. All too often strategies take shape after a plan is already in action. We're seeing rampant instances of this type of reactionary measurement strategy in social media today because marketers simply don't want to get left behind in the latest digital craze. Yet, they really don't know where to focus their strategic measurement efforts, so tactics take precedence. But even with more mature disciplines, like Web Analytics, strategies are often ill-formed and don't contain the vision necessary to carry an organization to the next level. And often times it's not their fault – getting something done requires tactical execution – but companies and marketers in it for the long haul eventually come around to a strategic approach.

In many cases, consultants are brought in to build strategic roadmaps for measurement practices, especially in the complex realm of Web Analytics. Yet, outside experts (and internal champions) are usually at a distinct disadvantage because organizations have already embarked on a process of web data analysis and, somewhere along the line, those efforts failed. New strategies must then clean up in the aftermath of failure to override distrust and misguided use of digital data. However, a well-defined strategy will set you on the right track. Whether you endeavor to take this strategy on by yourself or hire an external resource to guide you through it, this manifesto should help. It's one that I adhere to, and it offers some guidance that should make the difference between rudderless marketing efforts and well-defined programs with quantifiable measurement success.

Strategy Credo #1: Listen to your constituents. Building a sustainable Analytics strategy requires some serious fact-finding. To advise an organization on steps toward improvement, you must first fully understand its unique situation. For internal strategists, this means soliciting feedback from your cohorts and establishing a collaborative environment – while external consultants are required to get under the corporate covers by asking the right questions and listening carefully for ambient and recurring themes. When you stop and take the time to listen, you can find out some incredibly revealing things.

Strategy Credo #2: Roll up your sleeves. … And get to work. Our tight knit cottage industry that is Web Analytics is quickly outgrowing its humble origins and becoming a marketing imperative of global proportions. Yet, amid the experts and audible voices in our space there’s some derision between those who spout analytics theory and those that actually practice analytics. To become effective at delivering a strategy for Web Analytics, one must go beyond the academic exercise of offering models that propagate general best practices to proving real value through demonstrated client success. This success comes from working with clients to understand their unique issues and customizing a solution to meet specific needs.

Strategy Credo #3: Assimilate to the culture. This one is important because culture within the organization will dictate the Web Analytics strategy. If you're on the inside working for change, you are probably already ensconced in the culture, so the trick will be to separate your biases and established notions from how things really are. For outsiders, culture is something that you can actually pick up on pretty quickly. By observing who takes over a meeting immediately upon entering the room or listening to the pleas of a frustrated analyst, you get a good sense of how things operate. Culture is definitely the most difficult thing to change at an organization, so understand it in order to work with it rather than against it.

Strategy Credo #4: Mediate judiciously. Web Analytics managers and staff can typically identify 99% of their own issues – they often just need a story and an effective communicator to incite management to change. By 1) listening to your constituents, 2) demonstrating empathy and a desire to affect change by rolling up your sleeves, and by 3) understanding the culture you are working in you’ll be in a position to shape the story. However, often times, internal employees are unsuccessful at pleading for the cause. For this reason, consultants are called in to argue precisely the same points that internal advocates have been saying for years. When this is done judiciously and with conviction the light bulb goes off and management actually begins to take action.

Strategy Credo #5: Identify creative solutions. In my experience, I've found that when it comes to Web Analytics it's typical that: a) resources are limited, b) budgets are constrained, and c) inflexibility exists somewhere within the system. A valuable manager or crafty consultant will assess these limitations and deliver creative solutions. This means quit complaining that you don't have enough staff or resources for Web Analysis – nobody does! Instead, determine if contractors can get you over the hump – or perhaps offshore labor can free up bandwidth – or maybe just tin cupping it around your organization to get funding for that measurement project is a viable solution. Creativity requires new thinking, and developing a measurement strategy that works is predicated on fresh thinking.

Strategy Credo #6: Transcend mediocrity. I’ve been saying this one for a while, but it’s something that I strongly believe in – and it’s really hard to do. Mediocre analysts spend their days producing reports and processing data that can be (but usually isn’t) consumed and put into action by others. Too often, there’s little time for actually thinking about the data and translating what it means to take insights to action. This is essentially the equivalent of opening your mouth and letting the unfiltered drivel run out before thinking it through. It’s your Does-This-Actually-Make-Sense meter that should be pinned to the red when communicating analytics data. Climbing out from this sea of mediocre analysis requires a measurement strategy that ensures everyone is working towards the same goal and that the signposts are legible and in the same language for every traveler.

Strategy Credo #7: Actually solve the problem. Strategic Web Analytics goes beyond merely pointing out problems to actually solving them. Personally, I get immensely frustrated when efforts cease upon identifying the problem and fall short of providing tangible resolutions. This classic downfall of theory is quick to point out that your Web Analytics program blows, but beyond pointing to more technology or generalized tactics, there’s no real solution. This is a result of not actually doing the work (see #2 Roll up your sleeves) and thus not really knowing what is effective. Whether initiated from the inside or from an external consultant, a Web Analytics strategy must solve the problem and establish a working solution.

Strategy Credo #8: Establish a waterfall strategy. By this I mean strategy should flow from the headwaters of the organization and align with the corporate goals set forth by the executive team. Once your measurement team is clear and united on the goals, then identify objectives as the next tier in your waterfall that supports the corporate goals (these are your business promises). The base of your waterfall strategy consists of the tactics. Tactics are the actual campaigns and programs that emerge from your marketing machine (your creative promises). Each tier within the waterfall has specific metrics that indicate success. These metrics must be clearly defined and baked into the system at all levels to ensure proper measurement. It’s also critical to recognize that neither you nor an external consultant is likely to change your corporate goals, but you can refine the way in which you get there.

Strategy Credo #9: Ensure executable recommendations. Practice a crawl → walk → run approach to implementing a measurement strategy. This involves clearly illustrating the immensely lucrative and sexy benefits of being able to run, but knocking some reality into your key stakeholders by showing that serious work is required to get there. Successful strategies are designed in bite-sized chunks that align the components necessary to perform analysis wonders. If you try to take on too much too soon, then you’ll end up falling on your face and losing the confidence of your champions. By establishing clear expectations and milestones of accomplishment you will be on your way to executing on a measurement strategy.

Strategy Credo #10: Embrace change. Change management must be factored into any Web Analytics strategy overhaul. In most cases, refining (or defining) a strategy for marketing measurement is a monumental task. To offer an analogy, it’s like asking your grandfather to step out of his Oldsmobile so you can rip out the dashboard and replace it with an entirely new one that includes foreign instruments, dials and gauges. After you go through the painful process of getting this new system in place you need to explain to your grandfather that knowing how fast he was going using the speedometer was simply not good enough. Instead he should be focused on his fuel consumption per mile in order to conserve gas (tactic) – so that he can adhere to his fixed income budget (objective) – so that there will be some money left over when he eventually passes on (goal). Apologies for the crude analogy – but your grandfather’s Oldsmobile simply won’t cut it anymore! Your measurement strategy needs to create change – communicate the benefits – and deliver value. No one said this was easy 😉

So there you have it. That’s my Web Analytics Strategy Manifesto. I really believe that there’s a profound difference between “people who think about Web Analytics” and “people who do Web Analytics”. This Manifesto is based on doing. I’m curious to know what you other “doers” think and how you’ve embarked on establishing a strategy for measurement in your organizations?

General

Four Books That Will Change the Way You Communicate

I don’t think I will ever forget the first time that I made a presentation at work. It was just over a decade ago, I was just a few months into my employment at a company where I would work for the next eight years, and I was on the hook to present a new process to a room of 20 engineers. I diligently prepared my transparencies (I’m old enough to have used an overhead projector, but not old enough to refer to the medium they supported as “foils”). I rehearsed the material again and again.

And I bombed.

The material was dry as it was, but it wasn’t, by any means, unmanageable content. I just didn’t do a good job of managing it!

Fast forward 10 years, and I found myself giving a presentation to a room of 50-60 people, and the material was set up to be just as naturally engaging — presenting on an approach to measurement and analytics to…a bunch of marketers.

The presentation went much better, judging both from the engagement level of the audience and from the discussions it was still prompting weeks later. I'm no Steve Jobs, but I've paid attention to what seems to work and what doesn't (both in my presentations and in others'), read some articles here and there, and, I realized, read a few books along the way that have really helped.

So, with that — four books that all have a heavy component of “how the brain works” and that, collectively, have taught me a lot about how to present information, be it a dashboard, a report, or a presentation.

Gladwell and Gilbert

The first two books are books that I read within a few months of each other. To this day, I recall specific anecdotes with no idea which book they came from. Blink: The Power of Thinking Without Thinking made the rounds when it first came out as "another great book by Malcolm Gladwell" (following The Tipping Point: How Little Things Can Make a Big Difference). The fundamental premise of Blink has to do with our "adaptive unconscious" — our intuition and ability to "know" things without fully needing to process them. As he dives into example after example, Gladwell touches on various aspects of how the brain works.

Daniel Gilbert’s Stumbling on Happiness takes a more directly psychological angle, but it covers some of the same territory. One of Gilbert’s main points is that the human brain does not remember things like we think it does — pointing out that a vividly remembered, down-to-the-color-of-the-shirt-you-were-wearing memory is not really an as-recorded memory at all. Rather, the brain remembers a few specific details and then makes up / fills in the rest when the memory gets called up. It’s so good at filling in these blanks that it fools itself into not being able to tell fact from interpolation!

Both of these books made an impact on me because they pointed out that how we take in, process, and store information doesn't work at all like we intuitively think it does. And both books set up the next two by shaking the foundational assumptions I had about how we, as humans, think.

Straight-Up Business Reading

Chip and Dan Heath's Made to Stick: Why Some Ideas Survive and Others Die is a practical manual for communicating information that you want your audience to pay attention to and retain. They boil the components into a six-letter acronym — S.U.C.C.E.S. — and go into each component in detail.

The elements are Simple, Unexpected, Concrete, Credible, Emotional, and Stories, and they provide a nice framework for critiquing how we communicate any idea. I recognize that I regularly struggle with Simple, Concrete, and Stories as elements in my blog posts. But every element is one that can be injected with some discipline and time. I nailed all three of these elements a number of years ago when I found myself on an internal lecture circuit trying to drum up large donors for my company's annual United Way campaign — I was heavily invested in conveying a strong message, and I wound up using an example of my grandfather's battle with Alzheimer's as a way to pull the audience in and ask them to find something they were passionate about and support it. I also wove in various quirky takes on how $10/week would really add up — think the sort of thing you hear again and again from your local NPR station during fundraising drives. In the case of that campaign, we blew our numbers out of the water — had a 500% increase in the number of people who gave at the "leadership level" that year. Now, a lot of things had to come together to make that happen, but, to this day, I'm sure my well-crafted, well-rehearsed, and sincere speech made to at least a dozen different groups of employees (and the fact that I was a fairly low-level employee making the case — I was asking people who were making a lot more money than I was to give at least as much as I was) played a non-trivial role.

And that was years before I read Made to Stick. But, the book helped me reflect on any number of presentations — ones that worked and ones that didn’t.

And, Finally, Wisdom from a Neuroscientist


The last book in this tetralogy is one that I just finished reading — Brain Rules: 12 Principles for Surviving and Thriving at Work, Home, and School, by John Medina. I stumbled across the book as a recommendation from Garr Reynolds of Presentation Zen, so I wasn’t surprised that it had some very practical tips, as well as the “why?” behind them, for communicating effectively. Medina’s premise is that there’s a ton of stuff we don’t yet understand about the brain. BUT, there are also a lot of things we do know about the brain, and many of those lay out pretty clearly that the way we work in business and the way our education system is set up both run counter to how the brain naturally functions.

These “things we do know” are broken down into 12 “rules” — exercise (good for the brain), survival (why and how the brain evolved…and implications), wiring (how the brain works at a highly micro level), attention (there’s NO SUCH THING as multitasking…and other goodies), short-term memory (what makes it there and how), long-term memory (what makes it there, how, and how long it takes to get there), sleep (good for the brain), stress (some kinds are good, some kinds are bad), sensory integration (the more senses involved, the better the memory), vision (the #1 sense), gender (men are from Mars…), exploration (age doesn’t really degrade our ability to learn). Medina ends each chapter (one rule per chapter) with “Ideas” — implications for the real world based on the information presented.

The book goes into very technical detail about how, when, and where electrical charges zip around in our skulls to accomplish different tasks. While that information is not directly applicable, each time he goes there it’s as a setup to more directly useful information. Throughout the book, Medina provides practical thoughts for how to communicate more effectively — helping people pay attention (getting the information you are communicating into working memory) and retain the information over both the short and the long term. Two of my absolute favorite nuggets from the book were:

  • p. 130 (in the chapter on long-term memory) — Medina has the reader do a little memory exercise with the following characters: "3 $ 8 ? A % 9." The fact he drops after the exercise is interesting: "The human brain can hold about seven pieces of information for less than 30 seconds! If something does not happen in that short stretch of time, the information becomes lost." This is about getting information on its way from working memory to long-term memory and how repetition, thinking about the information, and talking about the information all help it on its way. As a communicator (be it through a presentation or through a dashboard of data), this seems like powerful stuff — how often have we all seen someone cut loose with slide after slide of mind-numbing information? The human brain simply cannot take all of that in and retain it without some help!
  • p. 239 (in the chapter on vision) — Medina has a section titled “Toss your PowerPoint presentations.” I groaned. While I get highly annoyed by the rampant misuse of PowerPoint, I’m not a Tufte acolyte to the point that I see the tool itself as evil. In the second paragraph, though, Medina clarifies by providing a two-step prescription: 1) burn your current presentations, and 2) make new ones. Medina’s beef with PowerPoint is that the default slide template is text-based with a six-level hierarchy. This entire chapter is about how a picture really is worth 1,000 words, and Medina pleads with the reader to cut wayyy back on the text in his/her presentations (he has a fascinating explanation of how, when we read, we’re really interpreting each letter as a small picture…and that’s actually not a good thing for retention of information).

There are oodles of other good nuggets in the book, but these are two of the snippets that really resonated with me.

Better to Be Steve Jobs than Bill Gates

I do believe that some people have better communication instincts than others. I’ll never be Steve Jobs when it comes to holding an auditorium in the palm of my hand. But, between reading these books and thinking through my own evolution as a communicator (this blog notwithstanding…but I’ve always said that I write this blog to keep my e-mails shorter and to try out ideas that occur to me during the day — sorry folks…both of you…but this blog is mostly for me!), I’m convinced that effective communication is a trainable skill.

I’ve also noticed that, the more I have to communicate, and the more I work to do so effectively, the easier it seems to be getting. In another 20 years, I might just have it nailed!

Analytics Strategy, Conferences/Community, General, Social Media

Announcing The Analysis Exchange

A few weeks ago I started pinging folks within the digital measurement community asking about the work we do, the challenges we face, and how we got where we are today. The responses I got were all tremendously positive and showed a true commitment to web analytics across vendor, consultant, and end-user practitioner roles. What I learned was, well, exactly what I expected given my decade-plus in the sector: “web analytics” is still a relatively immature industry, one populated by diverse opinions, experiences, and backgrounds.

Those of you who have been following my work know that I have spent a great deal of time working to create solutions for the sector. As a matter of record I was the first to create an online community for web analytics professionals and explicitly point out the need for dedicated analysis resources back in 2004, and the first to publish a web analytics maturity model and change how web analytics practitioners interact with their local community back in 2005. I’ve also written a few books, a few blog posts, and have logged a few miles in the air working with some amazing companies to improve their own use of web analytics.

I offer the preceding paragraph not to brag but rather to establish my credentials as part of setting the stage for what the rest of this post is about. Like many in web analytics — Jim Sterne, Avinash Kaushik, and Bryan Eisenberg all come to mind — I have worked tirelessly at times to evolve and improve the landscape around us. And with the following announcement I hope to have lightning strike a fourth time …

But I digress.

One of the key questions I asked on Twitter was "how did you get started [in web analytics]?" Unsurprisingly, each and every respondent gave some variation on "miraculously, and without premeditation." While people's responses highlighted the enthusiasm we have in the sector, they also highlighted what I see as the single most significant long-term problem we face in web analytics.

We haven’t created an entry path into the system.

As a community of vendors, consultants, practitioners, evangelists, authors, bloggers, Tweeters, socializers, and thought-leaders, we have failed nearly 100% at creating a way for talented, motivated, and educated individuals who are “not us” to gain the real-world experience required to actually participate meaningfully in this wonderful thing that we have all created.

Before the comments about the Web Analytics Association UBC classes or the new certification pour in, consider this: The UBC course offers little or no practical experience with real data and real-world business problems, and the certification is designed, as stated, "for individuals having at least three years of experience in the sector." Both are incredibly valuable, but neither is the type of training the average global citizen, wishing to apply their curiosity, their precision, and their individual talents to the study of web data, needs in order to actually get a good job coming from outside the sector.

And while I have little doubt people have landed jobs based on completion of the UBC course given the resource constraints we face today, as a former hiring manager and consultant to nearly a dozen companies who are constantly looking for experienced web analysts, I can assure you that book-based education is not the first requirement being looked for. Requirement number one is always, and always will be, direct, hands-on experience using digitally collected data to tell a meaningful story about the business.

Today I am incredibly happy to announce my, my partners', and some very nice people's solution to this problem. At 6:30 PM Eastern time at the Web Analytics Wednesday event in Cambridge, Massachusetts, my partner John Lovett shared the details of our newest community effort, The Analysis Exchange.

What is The Analysis Exchange?

The Analysis Exchange is exactly what it sounds like — an exchange of information and analytical outputs — and is functionally a three-partner exchange:

  • In one corner we have small businesses, nonprofits, and non-governmental organizations that rarely if ever make any substantial use of the web analytics data most are actively collecting thanks to the amazing wonderfulness of Google Analytics;
  • In the next corner we have motivated and intelligent individuals, our students, who are looking for hands-on experience with web analytics systems and data that they can put on their resumes when looking for work or looking to advance in their jobs;
  • And at the apex of the pyramid we have our existing community of analytics experts, many of whom have already demonstrated their willingness to contribute to the larger community via Web Analytics Wednesday, the WAA, and other selfless efforts

The Analysis Exchange will bridge the introductions between these three parties using an extremely elegant work-flow. Projects will be scoped to deliver results in weeks, effort from businesses and mentors is designed to be minimal, and we’re working on an entire back-end system to seamlessly connect the dots. And have I already mentioned that it will do so without any money changing hands?

Yeah, The Analysis Exchange is totally, completely, 100 percent free.

John, Aurelie, and I decided early on, despite the fact that we are all consultants who are just as motivated by revenue as any of our peers, that the right model for The Analysis Exchange would be the most frictionless strategy possible. Given our initial target market of nonprofits and non-governmental organizations, most of whom our advisers from the sector warned were somewhat slow to invest in technology and services, “free” offered the least amount of friction possible.

Businesses bring data and questions, mentors bring focus and experience, and students bring a passion to learn. Businesses get analysis and insights, students gain experience for their resume, and mentors have a chance to shape the next wave of digital analysis resources … resources the mentor’s organizations are frequently looking to hire.

More importantly, our mentors will be teaching students and businesses how to produce true analytical insights, not how to make Google Analytics generate reports. Our world is already incredibly data rich, but the best of us are willing to admit that we are still also incredibly information poor. Students will be taught how to actually create analysis — a written document specifically addressing stated business needs — and therein lies the true, long-term value to our community.

Too many reports, not enough insights. This has been the theme of countless posts, a half-dozen great books, and nearly every one of the hundred consulting engagements I have done in the past three years. The Analysis Exchange is a concerted effort to slay the report monkeys and teach the “analysts” of the future to actually produce ANALYSIS!

A few things you might want to know about The Analysis Exchange (in addition to the FAQ we have up on the official web site):

  • Initially we will be limiting organizational participants to nonprofit and non-governmental entities. We are doing this because we believe this approach simultaneously provides the greatest benefit beyond the web analytics community and provides a reasonable initial scope for our efforts. Plus, we’ve partnered with NTEN: the Nonprofit Technology Network, an amazing organization in its own right;
  • Initially we will be hand-selecting mentors wishing to participate in the program. Because we are taking a cautious approach towards the Exchange’s roll-out in an effort to learn as much as possible about the effort as it unfolds, we are going to limit mentor opportunities somewhat. Please do write us if you’re interested in participating, and please don’t be hurt if we put you off … at least for a month or two;
  • With the previous caution in mind, we are definitely open to help from the outside! If you have experience with this type of effort or just have a passion for helping other people please let us know. Just like with Web Analytics Wednesday, we know that when The Analysis Exchange gets cranking we will need lots and lots of help;

Because this post is beginning to approach the length at which I typically tune out myself I will stop here and point readers to three resources to learn more about The Analysis Exchange:

  1. We have a basic, informational web site at http://www.analysis-exchange.com that has a nice video explaining the Exchange model in a little greater detail;
  2. You can email us directly at exchange@analyticsdemystified.com for more information or to let us know if you’re willing to help with Exchange efforts;
  3. You can follow Exchange efforts in Twitter by following @analysisxchange

As you can probably detect from the post I’m pretty excited about this effort. Like I did when I co-founded Web Analytics Wednesday, I have some amazing partners on this project. And like I did when I founded the Yahoo! group, I believe this effort will satisfy an incredible pent-up demand. Hopefully you will take the time to share information about The Analysis Exchange with your own network, and as always I welcome your thoughts, comments, and insights.

Learn more at http://www.analysis-exchange.com

Adobe Analytics, General

Intranets – The Other Website

While most of you reading my posts are focused on your public website, in this post I am going to share how you can leverage your web analytics skills internally at your organization. Company Intranets are oftentimes larger than the public website, and using the tips I will share here, you can gain big visibility internally and become the hero of your HR team!

Why You Should Care About Your Intranet
Companies often spend a LOT of $$$ on building Intranets. Unfortunately, not everyone at the company uses the Intranet. If you can help your internal team show what is working and what is not working on the Intranet, you can help them to save a lot of money. In addition to the altruistic reasons to track what happens on the Intranet, there are the following selfish reasons:

  1. Tagging Intranets is a great way to try new things and get better at web analysis in a safe environment
  2. Intranets often have low traffic volume so it is a great way to help cost-justify increased budgets for web analytics (“Mr. CEO, not only does this money go towards tracking the website, it also allows us to track our entire Intranet!” – Just don’t tell them that tracking the Intranet costs all of $1,000 in server calls!)
  3. Showing people what is happening on the Intranet does wonders for helping people inside your organization understand what the heck you do for the public website!

I have seen situations where a web analytics team has killed themselves trying to get senior executives to see what is taking place on the website and what improvements could be made based upon solid web analysis, only to see the same team get promoted or receive more budget after spending 2-3 weeks showing what takes place on the Intranet (something that they actually use)! It sounds completely illogical, but I guess if you can’t beat them, join them!

Tracking Intranets
So what should you track on Intranets? The following are my best practices learned working with a few large clients. The one caveat to everything below is that you have to be sure to track all of this data in a different report suite than all of your other website data!

Employee ID
Depending upon the security policy of your company, ask if you are able to track down to the Employee ID level. I tend not to do this since it can be a bit creepy, but it is technically possible and you can replace the Omniture Visitor ID with your own unique employee identifier.

Non-Personally Identifiable Employee Info
On each Intranet page, I recommend that you pass Department, Region, Business Unit, Office Location, Employee Band Level (i.e. VP, Manager), etc… to variables. This will allow you to break down all Pages by these data points. I generally pass these to an sProp and an eVar (save some time setting both through this post) and also recommend you put your top five of these into a 5-item Traffic Data Correlation.
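
For illustration only, here is a minimal sketch of what the page code might look like, assuming the standard s_code.js “s” object is already on the page; the variable slots (prop1–prop4, eVar1–eVar4) and the example values are placeholders you would map to your own implementation:

  // Non-PII employee attributes, typically read from a data layer or directory lookup
  s.prop1 = "Human Resources";   s.eVar1 = s.prop1;   // Department
  s.prop2 = "EMEA";              s.eVar2 = s.prop2;   // Region
  s.prop3 = "London";            s.eVar3 = s.prop3;   // Office Location
  s.prop4 = "Manager";           s.eVar4 = s.prop4;   // Employee Band Level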

Pages & Sections
Obviously, you want to pass in a unique page name for every Intranet page like you would any other website. In addition, you should pass the Intranet section to the Site Sections (Channel) variable. As always, I recommend that you enable Pathing on the Channel sProp so you can see how employees are navigating between Intranet sections.
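
As a quick sketch, with placeholder values (your own page naming convention will differ):

  // Unique page name plus the Intranet section in the Site Sections (Channel) variable
  s.pageName = "Intranet:Benefits:401k Overview";
  s.channel  = "Benefits";   // enable Pathing on the Channel sProp in the Admin Console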

Internal Search
Just like a public website, Internal Search is usually important on Intranets. You should track Internal Search on the Intranet just as you would on a public website. You can apply the same principles I mentioned in this Internal Search post. This includes tracking what search terms people are looking for, but the beauty here is that you can see these by Department, Region, etc…
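
As a sketch (the eVar and event numbers here are assumptions, not a standard), the Intranet search results page might set something like:

  // On the Intranet search results page
  s.eVar5  = "parental leave policy";   // search term, ideally read from the query string
  s.events = "event3";                  // Internal Searches Success Event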

Timeparting
Many of my Intranet clients were keen to see when employees were accessing the Intranet, so I recommend you implement the Timeparting Plug-in. This allows you to see what day of the week and time of the day employees access the Intranet. Don’t forget to create a correlation between these sProps and your other ones so you can see when each page/section is accessed most often.
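
If you would rather not use the plug-in, a rough hand-rolled equivalent is easy enough; the sProp numbers below are assumptions:

  // Capture day of week and hour of day at the time of the page view
  var now  = new Date();
  var days = ["Sunday","Monday","Tuesday","Wednesday","Thursday","Friday","Saturday"];
  s.prop6  = days[now.getDay()];     // Day of Week
  s.prop7  = now.getHours() + ":00"; // Hour of Day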

Internal Promotions
Much in the same way that I described Internal Campaigns in the past, Intranets may have promotional areas that try and entice employees to click. You can track these the same way you would a public website.

Intranet KPI’s
The following are the types of KPI’s I have seen used for Intranets:

Page Views/Visit & Average Time Spent/Visit
Depending upon whether your goal is to get employees in and out or to get them to spend more time reading Intranet content, you can use these calculated metrics to see how you are doing.

Page Views (Event)
As I described in this post, I would recommend that you set a Success Event on each page. Why? Well let’s say you want to see how many pages on the Intranet a specific internal e-mail led to. You can open the Campaigns report, find the e-mail and then see how many pages were viewed. You can then use an eVar Subrelation to break this down by page name (as long as you pass Pagename to an eVar) to see the exact pages viewed.
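
A minimal sketch of the page code, assuming event1 is the event you reserved for Page Views and eVar20 is your Page Name eVar (both numbers are assumptions):

  // Fire the Page View Success Event on every page and capture the page name in an eVar
  s.events = s.events ? s.events + ",event1" : "event1";
  s.eVar20 = s.pageName;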

Internal Searches
As you would on a website, you should track and trend the # of Internal Searches taking place on the Intranet.

Logins
If employees have to log into your Intranet, you can capture that as a KPI to see how you are doing at getting them to access the Intranet. This can also be used for segmentation (i.e. show me all users who have not logged into the Intranet in the past 30 days…)

Custom KPI’s
Many times, Intranets are used to get employees to fill out forms, surveys, etc… Each of these key actions should be captured with a Success Event and in the case of Forms, you should capture the Form Name in an eVar so you can break it down appropriately.
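
For example, a form confirmation page might include something along these lines (the event and eVar numbers are assumptions):

  // On the form "thank you" page
  s.events = "event5";                    // Form Complete Success Event
  s.eVar10 = "Benefits Enrollment Form";  // Form Name, so completes can be broken down by form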

Employee Profile Views
As we march down the road of internal social media, it is fun to track how often each employee’s Intranet Profile is viewed. Using new tools like Salesforce.com Chatter, we may be moving to a world where employees get “followers” so you can track how often people are looking at or following other employees. This allows you to see who your employees think are important (which may not always align to the org chart!).

Final Thoughts
As you can see, if you know what you are doing for tracking a public website, tracking an Intranet uses many of the same principles. If you are just getting started in web analytics, feel free to apply the above items on your Intranet as a testing ground before you tackle the public website. If you have some other cool things you have done to track your Intranet, please feel free to leave a comment here…

Analytics Strategy, General

Let The Wild Rumpus Start

I feel liberated. For the first time in my professional career I don’t have to answer to anyone. Sure, I still carry accountability to my new partners Eric and Aurelie, to the Analytics Demystified brand that we will continue to grow and evolve together, and most importantly to myself to produce high caliber work. Yet, there isn’t anyone telling me what to do anymore. Not that I didn’t have autonomy in many of my previous roles…I did. But somehow working in an environment where I’m calling the shots – where the upside is big and the downside threatening – where I have an opportunity to make a difference that’s entirely my own creation – it is invigorating.

Making mischief of one kind and another…
So with this newfound freedom I plan to embark on initiatives and activities that weren’t previously available to me in former roles. And I will challenge the conventional measurement dogma along the way. I mentioned earlier that I intend to be an agent for change in Web analytics. To me that means: questioning the status quo of vendor measurement practices; challenging clients to fully develop their strategic vision for measurement; cultivating talent that instills measurement as a fundamental marketing discipline; and driving the industry to collectively embark on advancement. Simply forging ahead in using analytics and optimization technologies in the way we’ve done so during the past 5 years won’t get this industry to a better understanding of customer behavior and marketing intelligence.

Sailing off through night and day…And in and out of weeks…
Just for context, I’ll let you know that I did not arrive here overnight. I began my career 15 years ago as a marketer and realized that digital mediums could offer faster, better and more effective means of reaching customers. I watched the impact of my digital efforts blossom and realized that web technologies were the way of the future. I embarked on my analyst career back in 1999 by joining two former Forrester analysts at Gomez Advisors where I immersed myself in the online experience. There I learned what it meant to have a well founded digital strategy and consulted with firms on how to formulate one. As Gomez evolved to a performance management company, I continued my consulting and delved into the technical side of what it means to offer faster and more reliable online marketing. It was during this time that I realized that measurement was the basis for truly understanding marketing efforts. This led me to conduct analytics and optimization research at several analyst firms including: the Aberdeen Group, Jupiter Research and most recently Forrester. During my time at each of these companies, my quest remained the same: to help marketers understand how consumers receive, interact and respond to digital marketing – and what to do about it.

To where the wild things are…
That was how I became acquainted with Web Analytics and the industry figures that are indeed the wild things. I’ve commented before that the people of Web Analytics are among the most inviting and hospitable bunch I’ve ever met. I can recall my first conversation with Jim Sterne where I pitched him an idea for eMetrics and his response was “more please, my antenna are tingling…”. Practitioners like Judah Phillips and Jim Hassert were willing to get on the phone and articulate their analytics frustrations along with their successes to help me create research that would resonate with the marketplace. Consultants like Jim Novo and June Li spoke freely about their experiences in education and evangelism of Web Analytics that helped me formulate a perspective on why people cared so much about this industry. And vendors spoke about their technologies with passion and excitement. Every individual that I approached regarding Web Analytics was willing to share a story, some valuable insight or their unique perspective. It was clear that the people that worked within this industry had passion for what they did and I wanted to make myself an indispensable part of that community.

The most wild thing of all…
Among all the characters I met who were involved with Web Analytics one stood out apart from the rest. Eric Peterson seemed to personify Web Analytics. His enthusiasm for analytics and his capacity to evangelize measurement somehow captivated the veterans and newly indoctrinated alike. His passion for Web Analytics was emphatic and his communication tactics resonated. I did and still do appreciate that he takes a stand on his opinions regarding Web Analytics topics, but is still willing to give audience to differing views. He’s also willing to change his opinion if he’s proven wrong. While I’ve accused him of apologizing like Larry David, he will acquiesce when he’s wrong. Most importantly, Eric has made an indelible impact on the Web Analytics industry. Thus, when Eric and Aurelie approached me about joining them as a partner at Analytics Demystified, I couldn’t refuse.

It’s still hot…
So here I am, beginning a new chapter in my career that includes: education, evangelism and inciting change for Web Analytics. I’m here because I truly believe that this space is hot and it’s one that I want to be a part of for years to come. I relish the opportunity to work side by side with Eric and Aurelie because we’re all equally invested in this industry and each offer unique perspectives on where it’s all going. This means that we won’t necessarily agree on everything, but we do share a common view of the big picture. I hope to bring balance to the partnership and the chance to offer my perspective through thought leadership, guidance and evangelism. I’m also looking forward to sharing my experience, my learnings and my viewpoint with you. This industry wouldn’t be here if not for people to move it forward. I welcome conversations about how we can collectively advance measurement technologies and encourage you to reach out and share your views.

And now, let the wild rumpus start!

Analytics Strategy, General

Google Analytics Intelligence Feature is Brilliant!

Long-time blog readers are likely aware that I’m not prone to writing about individual technologies or product features unless I have the opportunity to break the news about something new and cool (or not, as the case is from time to time.) But once in a while a single feature comes along that in my mind is so compelling and cool I need to bend my own rules; Google Analytics’ new “Intelligence” offering is exactly that feature.

Just in case you’ve been living under a rock for the past month and haven’t already heard about “Intelligence” have a quick watch of the following video pulled from the Google Analytics blog:

Pretty awesome, huh? What’s more, now that I’ve had a few weeks to play with the feature and think about it in the context of my published views on the Coming Revolution in Web Analytics, I think that “Intelligence” is one of the most important advances in web analytics since the JavaScript page tag.

While Google is certainly not the first vendor to apply some level of statistical and mathematical rigor to web analytics data, an honor that would likely go to Technology Leaders for their Dynamic Alert product or Yahoo for their use of confidence intervals when exposing demographic data in Yahoo Web Analytics, in my humble opinion Google has done the best possible job making statistical analysis of web analytics data accessible, useful, and valuable.

Some things I really like:

  • An approachable way to determine confidence intervals via their “Alert Sensitivity” slider. While the implementation doesn’t necessarily impart the level of detail some folks would like, the slider mitigates the prevalent concern that “people won’t understand confidence intervals.”
  • Great visual cues for alerts, especially when statistically relevant changes are not obvious based on traffic patterns. Sometimes traffic patterns just look like hills and valleys, even when something important is happening — for example, the next figure shows two alerts at the lowest threshold setting on September 16th that, upon exploration, turned out to be great news (that I might have missed otherwise.)
  • Good visual cues regarding the statistical relevance of the insight being communicated. This is tough since Google is trying to present moderately complex information regarding the underlying calculations and how much emphasis you should be putting on the insight. By showing a relative scale for “significance” I think Google has more or less nailed it.
  • Google Analytics finally starts communicating about web analytics data in terms of “expectations” instead of absolutes. All of us (present company included) have a tendency to get wrapped up in whole numbers, hard counts, and complete data sets. But we also know that Internet-based data collection just isn’t that accurate, and so any push to get us to start thinking in terms of predicted ranges and estimates is a step in the right direction. For example, I love knowing that on a given day Google Analytics “expects” between 311 and 388 people to come to my site from the UK!
  • Lots more, including the ability to pivot the views and look from a “metric-centric” and “dimension-centric” perspective, the ability to aggregate on day, week, and month, and the ability to add your own custom alerts based on changes in traffic patterns. Perhaps ironically this last functionality (“Custom Alerts”) is how we’ve all historically thought about “Intelligence” in reporting, and while useful seems somewhat weak compared to Google’s stats-based implementation.

While awesome in its first instantiation, there are some obvious things that the Great GOOG could improve in the feature. Some ideas include:

  • More dimensions and metrics, although I believe both Nick and Avinash have commented that they are already working on adding intelligence to other data collected.
  • Some way to expose confidence intervals and p-values would be useful (perhaps as a mouse-over) so that the increasing number of analysts with experience in statistics could have that data in their back pocket when they went to present results.
  • Email alerts for the automatically generated insights, for example when “Intelligence” determines that five or more alerts have been generated it would be cool to get an email/SMS/Tweet/Wave notification.
  • The ability to generate alerts against defined segments, so that I could see the same analysis for different audiences that I’m tracking.

Mostly ticky-tack stuff, but again I’m pretty damn impressed with their freshman effort. I suppose I shouldn’t be surprised since evangelist Avinash has been talking about the need for statistics in web analytics for an awfully long time, but given that so many in our industry have balked at bringing more mathematical rigor to our work (including said evangelist, oh well) it’s encouraging to see Google move in this direction.

What do you think? Are you using “Intelligence”? Is it helping you make better decisions? Do you like the implementation as much as I do? I’d love to hear your thoughts and comments.

Adobe Analytics, General

Page Name eVar

In my last post, I described some of the benefits of using a Page View Success Event. In this post I will continue along the same theme by describing the benefits/uses of a Page Name Conversion Variable (eVar). I recommend you read my last post on the Page View Success Event prior to reading this post as the two go hand-in-hand.

Setting a Page Name eVar
Setting the Page Name in an eVar, while somewhat nontraditional, can be used for many different purposes. In this post I will cover just a few, but I am sure those reading this can come up with many more. The implementation of this couldn’t be easier. Simply pass the s.pagename value to an eVar and you are done! The following sections will outline how I use this variable once it is set.
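
One way to make sure this happens on every page is to set it inside s_doPlugins in your s_code.js; eVar20 is an assumed slot in the sketch below:

  // Inside s_code.js
  s.usePlugins = true;
  function s_doPlugins(s) {
    s.eVar20 = s.pageName;   // Page Name eVar, set automatically on every page view
  }
  s.doPlugins = s_doPlugins;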

Campaign Pages
Let’s say that you are running a bunch of online marketing campaigns and you want to see how many pages on the website people coming from each Campaign Tracking Code view. In SiteCatalyst, the main way to figure this out would be to use DataWarehouse, ASI or Discover unless you read my last post and had set a Page View Success Event. But now let’s take it a step further. What if you want to see the pages that visitors from each Campaign Tracking Code viewed on your website? Easy, right? Not so fast. There is really no easy way to see this in SiteCatalyst using out-of-the-box reports. One way to do this would be to use the Get&Persist Plug-in to pass the Campaign Tracking Code to a Traffic Variable (sProp) on each page of the visit and then use a Traffic Data Correlation to correlate this new sProp with the Page Name variable, but that is a lot of work! The other way is to use a Page Name eVar. By default, your Campaign Tracking Code report will store and persist the Campaign Tracking Code for multiple page views (you choose your time frame in the Admin Console), so if you begin to store Page Names in another eVar, you will have an intersection between Page Name and Campaign Tracking Code on each page. That allows you to use a Conversion Variable Subrelations report to see all Pages viewed by visitors coming from each Campaign Tracking Code. You can see this by opening up the Campaign Tracking Code report, selecting the Page View (Event) metric and clicking the icon next to a specific Tracking Code to break it down by the Page Name eVar. Once you have done this, you should see a report like this:

page_evar_code

Channel Pages Tracking
If you roll up your Campaigns to higher-level Marketing Channels using SAINT Classifications, you can use the concept from the Page View Event post to see how many pages are viewed on your site after visitors arrive from each Marketing Channel.

page_evar_channel

You can then break this report down by the Page Name eVar to see the most popular pages for each Marketing Channel:

page_evar_channel2

While this is not as granular as viewing Pathing by Campaign (as I demonstrated in this post), it can give you a high-level view of what pages are popular for each different marketing channel. If you are using the Unified Sources DB VISTA Rule or Channel Manager plug-in, it gets even better as you can see what pages people coming from another website or SEO are viewing on your website by breaking down a particular SEO keyword or external website link by Page Name:

page_evar_channel3

Internal Search Follow-On Pages
If you are properly tracking Internal Search on your website, you should have Internal Search Terms stored in an eVar so you can use this concept to break down Internal Search Terms by this new Page Name eVar (while using the Page View Event) to see what pages visitors view after they search on each specific Internal Search Term:

page_evar_search

What Page Does Success Take Place?
Another side-benefit of setting a Page Name eVar is that you can see on which page a Success Event takes place. For example, if you set a “File Download” Success Event and a file is available on several pages, you can subrelate each file name with the Page Name eVar to see which page is the most popular for downloading each file.
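
As a sketch, a download link could fire the Success Event via a custom link call like the one below; the event and eVar numbers, the function name, and the file-name eVar are all assumptions for illustration:

  // Track a file download with s.tl(); the Page Name eVar persists from the page view
  function trackDownload(link, fileName) {
    s.linkTrackVars   = "events,eVar30";
    s.linkTrackEvents = "event7";
    s.events = "event7";              // File Download Success Event
    s.eVar30 = fileName;              // File Name
    s.tl(link, "o", "File Download"); // custom link call
  }
  // e.g. <a href="annual-report.pdf" onclick="trackDownload(this, 'annual-report.pdf')">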

Conversion Variable QA
Finally, there is a completely different use for the Page Name eVar – Quality Assurance. Oftentimes, you will run into situations where you have eVars that have bad data or no data at all (the dreaded “None” row!). These issues are often hard to troubleshoot. However, if you have a Page Name eVar, your life is much easier.

Let’s say that you have forms on your website and when visitors complete a form, they are required to enter a “Company Size” field which is stored in an eVar. However, there are many cases where you are seeing the Form Company Size eVar with no data. This might mean that IT forgot to make the field required on some of the Forms (would never happen right?). How do you figure out which forms are causing the issue? All you have to do is the following:

  1. Open the eVar report that has data issues with a relevant Success Event metric (Form Company Size and Form Completes in this example)
  2. Find the row that has bad data or no data (“None” row)
  3. Click the breakdown icon to break the report down by the Page Name eVar
  4. The resulting report (see below) will show you a list of Page Names where SiteCatalyst set the Form Complete Success Event, but did not have a corresponding Form Company Size eVar value

page_evar_qa

You can then send this report to your IT team to help them find pages where there may be tagging issues. You could even schedule this as a recurring report to you and IT so you are alerted when similar issues arise in the future, which helps with overall data quality. Keep in mind that this will only work if the eVar you are looking at has Full Subrelations or you add Full Subrelations to the Page Name eVar (see below).

Final Thoughts
As you can see, there are many different uses of this functionality. The following are some final pointers related to this topic:

  1. As previously noted, if you plan to use the Page Name eVar extensively for testing, I would recommend that it have Full Subrelations so you can QA all eVar reports, not just those that already have Full Subrelations.
  2. In one of the rare times I ever tell clients to do this, I would recommend that you set the Page Name eVar to expire at the Page View in the Admin Console. Expiration beyond that will probably add little value and only slow down reporting. There are some special things you need to do here if you use Custom Links, so I would advise you to speak to Omniture Consulting about this.
  3. Consider Classifying the Page Name eVar by Page Type, Page Product Category, etc… to increase the value you get from this eVar.

Analytics Strategy, General

Are You Ready for the Coming Revolution?

Few would argue that the past few years in web analytics have been, well, intense. The emergence of Yahoo Web Analytics, multiple management shake-ups at WebTrends, Adobe’s acquisition of Omniture following Omniture’s acquisition of Visual Sciences, WebSideStory, Offermatica, Instadia, and TouchClarity, and the continued push into the Enterprise from Google Analytics. From where I sit we have seen more changes in the last 24 months than we had in the entire 12 years previous (my tenure in the sector) combined.

When I think about these changes, I find myself coming to the undeniable conclusion that our industry is undergoing a radical transformation. More companies than ever are paying attention to digital measurement, and despite my disbelief in Forrester’s numbers, an increasing number of these companies are forging a smart, focused digital measurement strategy. At the X Change, at Emetrics, and at Web Analytics Wednesday events around the world there is more and more evidence that this wonderful sector I call “home” is really starting to grow up.

And we’re just getting started.

If you pay close attention to the marketing you see from Omniture, WebTrends, Unica, Coremetrics, and the other “for fee” vendors you’ve surely noticed a dramatic change recently. Nobody is talking about web analytics anymore; the entire focus has become one of systems integration, multichannel data analysis, and cross-channel analytics.

All of a sudden web analytics is starting to sound like, gasp, business and customer intelligence.

Eek.

Since it’s late and since this post will be overshadowed by the hype around Google Analytics releasing more “stuff” on Tuesday, I’ll cut right to the chase: I believe that we are (finally) on the cusp of a profound revolution in web analytics and that the availability of third-generation web analytics technologies will finally get digital measurement the seat at the table we’ve been fighting to get for years.

Statistics, people … statistics and modeling, predictive analytics based on web data, true forecasting, and true analytical competition for the online channel. Yahoo’s use of confidence intervals when presenting demographic data and the application of statistical models in Google’s new “Analytics Intelligence” feature are just the beginning. As an industry it’s time to stop fearing math and embrace analytical sciences that have been around for longer than many of us have been alive. It’s time to stop grousing about how bad the data is and actually do something about it.

Do I have your attention? Good.

Thanks to the generosity of the kind folks at SAS I have a nicely formatted white paper that is now available for download titled “The Coming Revolution in Web Analytics.” Just so you can see if you might be interested here is the Executive Summary from the document:

“Forrester Research estimates the market for web analytics will be roughly US $431 million in the U.S. in 2009, growing at a rate of 17% between now and 2014.  Gartner reports that the global market for analytics applications, performance management, and business intelligence solutions was US $8.7 billion in 2008—roughly 20 times the global investment in web analytics.  Among their three top corporate initiatives, most companies are focusing their efforts online, expanding their digital efforts on the Internet to increase the organization’s presence in the least expensive, fastest growing channel.

Today, a majority of companies are dramatically under-invested in analyzing data flowing from digital channels.  Even when business managers have committed money to measurement technology, they usually fail to apply commensurate resources and effort to make the technology work for their business.  Instead, most organizations focus too much on generating reports and too little on producing true insights and recommendations, opting for what is easy, not for what is valuable to the business.

Analytics Demystified believes this situation is exacerbated by the inherent limitations found in first- and second-generation digital measurement and optimization solutions, provided by a host of companies primarily focused on short-term gains in the digital realm, not long-term opportunities for the whole business and their customers.  Historically these companies worked to differentiate themselves from traditional business and customer intelligence, focusing on the needs of digital marketers.  Unfortunately, as the need for whole business analysis increases, many of these vendors are playing catch-up, forced to bolt on data collection and processing technology as an afterthought.

The current state of digital analytics is untenable over time, and Analytics Demystified believes that companies that persist in treating online and offline as “separate and different” will begin to cede ground to competitors who are willing to invest in the creation and use of a strategic, whole-business data asset.  These organizations are using third-generation digital analytics tools to effectively blur the lines between online and offline data—tools that bridge the gap between historical direct marketing and market research techniques and Internet generated data, affording their users unprecedented visibility into insights and opportunities.

This white paper describes the impending revolution in digital analytics, one that has the potential to change both the web analytics and business intelligence fields forever.  We make the case for a new approach towards customer intelligence that leverages all available data, not just that data which is most convenient given the available tools.  We make this case not because we believe there is anything wrong with today’s tools when used appropriately, but because we believe digital analytics should take a greater role in business decision making in the future.”

Since I pride myself on the quality of my readership I sincerely hope that each of you will download this document and  take the time to read it. More importantly I’d love you to share it with your co-workers, friends, and followers on Twitter. I believe we are at a critical juncture in our practice’s history where the skills that have served us all along are not going to serve us for much longer, but I am always willing to admit that I’m wrong and more than anything I love a spirited debate.

Are you ready for the revolution?

Analytics Strategy, General

New Data on the Strategic Use of Web Analytics

Recently Google published the results of a Forrester Research study they had commissioned (PDF) to help the broader market understand the use and adoption of free web analytics solutions.  Google should be applauded for commissioning Forrester to conduct this work, especially given the quality of the research and the level of insights provided.  Without a doubt, free solutions like Google Analytics and Yahoo Web Analytics are having an impact on our industry and driving change in ways few of us ever imagined.

I really did enjoy the Forrester report, primarily because the author (John Lovett) managed to surface totally new data.  When he first told me that over half of Enterprise businesses were using free solutions I have to admit I didn’t believe him.  In a way I still don’t, but perhaps that’s only because I work with a slightly different sample than he presents.  Regardless, John’s report paints a picture of an increasingly challenging market for companies selling web analytics and a new sophistication among end users.

Speaking of sophistication, there are a few points in the report that I question, and since I have pretty good luck getting feedback from readers on big picture stories I figured I’d bring them up here in the blog.  Before I do I want to emphasize that I am not questioning Forrester or John’s work—I am merely trying to explore some data that I find contrary to my own experience in this public forum.  To this end I pose a handful of questions that I would love to discuss either openly in comments or via email.

The first point I question is the observation in Figure 3 that 70% of companies report having a “well-defined analytics strategy.”  Two years ago my own research found that fewer than 10% of companies worldwide had a well-defined strategy for web analytics.  Last year Econsultancy reported that only 18% of the companies in their sample had a strategy for analytics.  To jump from these low numbers to the majority of Enterprises just doesn’t square with my general experience in the industry.

[Figure 3 from the Forrester report]

Remember, the implication of this data point is that 70% of all companies having more than 1,000 employees have a “well-defined analytics strategy.”  According to a 2004 report from the U.S. Census Bureau there were just over 12,000 companies in the U.S. with more than 1,000 employees.  Without assuming any growth between 2004 and 2009, Forrester’s 70% figure would result in over 8,500 companies in the U.S. that have a “well-defined” strategy for web analytics. Does that sound right to you?

Consider that the combined customer count for Omniture, WebTrends, Coremetrics, and Unica combined in the U.S. doesn’t even add up to 8,500 companies.  Even if you use the more conservative 13% who “strongly agree” with Forrester’s statement you end up with over 1,500 U.S. companies.  I may suffer from sample bias, but personally I can barely think of 150 companies that I would identify as having any strategy for web analytics, much less a “well-defined” one.

Most companies I talk to have the beginnings of an over-arching strategy—they’ve realized the need for people and are beginning to reduce their general reliance on click-stream data alone.  But given that I think about this topic from time to time, I think a “well-defined” strategy for web analytics takes into account multiple integrated technologies, appropriate staffing, and well thought-out business and knowledge processes for putting their technology and staff to work.  What does the phrase “well-defined strategy” imply to you?

Similarly, if 60% of companies truly believed that “investments in Web analytics people are more valuable than investments in Web analytics technology” there would be THOUSANDS of practitioners employed in the U.S. alone.  But again, every conference, every meeting, every conference call, and every other data point suggests that the need for people in web analytics is still an emerging need.  Hell, Emetrics in San Jose earlier this year barely drew 200 actual practitioners by my count.  How many web analytics practitioners do you think there are in the United States?

Same problem with the rest of the responses to Figure 3 on web analytics as a “technology we cannot do without” (75%) and the significance of the role web analytics plays in driving decisions (71%).  Perhaps I’m talking to entirely the wrong people, perhaps I’m interpreting these data wrong, and perhaps I’ve gone flat-out crazy, but these responses just don’t match my personal understanding and experience in the web analytics industry.

This issue of data that simply does not make sense, while not universally manifest in the report, manifests elsewhere as well. For example, Figure 8 reports on the percentage of application functionality used, segmented by fee-based and free tools:

[Figure 8 from the Forrester report]

When I look at these responses and see that 63 percent of respondents using fee-based tools and 50 percent of respondents using free tools claim to be effectively using more than half the available functionality, again I find myself scratching my head. As this data appears to speak to the general sophistication of use of analytics I went back and looked at Dennis Mortensen’s quantitative study of how IndexTools was being used around the world.

Dennis reports that fewer than 10% of his customers were using even the most basic “advanced” features in web analytics (report customization) and that fewer than 4% of his customers (on average) are making any “advanced” use of the IndexTools application. While this dataset is somewhat biased towards European companies, who I believe, on average, to be somewhat behind their U.S. counterparts, it does provide an objective view into how web analytics are used that seems to directly contradict the self-reported responses in Forrester’s Figure 8.

Clearly there is a gap between the responses John collected and the current state of the web analytics market.  Since John is a very smart guy I know part of his rebuttal will include the observation that he surveyed people directly responsible for web analytics (see Forrester’s methodology) and that people in general have a tendency to respond optimistically. Trust me, my son is the most handsome little boy ever born and my daughter’s beauty is only matched by that of Aphrodite … same for your kids, right?

Given the difficulty associated with gathering truly objective data regarding the use of web analytics, this type of self-reported data is usually what we have to go on.  While Omniture, WebTrends, Coremetrics, and Unica all have the fundamental capability to report data similar to that provided by Mr. Mortensen, it may not be in their best interests to expose underwhelming adoption and unsophisticated use (if that is what the analysis uncovered.)  Ultimately we’re forced to accept these self-reported responses and  then reconcile them against our own views, which is why I’m asking my readers what they think about the data Forrester is reporting!

Regarding these self-reported attitudinal responses on how web analytics is used strategically, perhaps the truth is found in the companies who “strongly agree” with John’s statements.  If we apply this lens, as opposed to the more optimistic view, we get the following:

  • 17% of companies recognize that web analytics is a technology they cannot live without;
  • Web analytics plays a significant role in driving decisions at 12% of companies;
  • 13% of companies have a well-defined web analytics strategy;
  • 9% of companies recognize that investments in people are more valuable than investments in technology

These numbers start to make a lot more sense to me.  Likely the truth, as with so much in our industry, lies somewhere in between, but I would love to hear what you think about these adjusted numbers.  Do the lower numbers make more sense to you, or do you agree with John’s more optimistic assessment?

Unfortunately if the lower numbers are correct the implication is that despite the incredibly hard work that companies, consultants, and industry thought-leaders around the world have done for years we still have an incredibly long way to go before web analytics is recognized as the valuable business practice that you all know it can be!

Regardless I want to state that I do not disagree at all with the fundamental thesis in this report, that “free” is creating a whole new level of interest in web analytics and that, given proper consideration, free is an excellent alternative to paid solutions.  Lacking clear strategy and resources, too many companies have wasted too much money on paid solutions for free to not be compelling.  Thanks to the dedication of the Google and Yahoo teams, the world now has access to great applications that are in some regards more compelling than fee-based alternatives.

While I may not have said this a few years ago, today I honestly do believe that “free” is a viable and appropriate alternative to fee-based solutions. While not appropriate in every situation, it is irresponsible to suggest that any company not willing to fully engage in web analytics should pay for ongoing services and support. Given advances from Google and the availability of Yahoo Web Analytics, any motivated company large or small now has access to a wealth of data that can be translated into information, insights, and recommendations.

Conversely I agree with John (and Jim, and almost every thought leader I respect) who states that you need to “prioritize your business needs and culture for analytics first and then evaluate the tools.”  This goes back to the fundamental value proposition at Analytics Demystified: It’s not the tools you use but how you use them. If you’re not invested in developing and executing a clearly defined strategy for digital measurement, you may as well be grepping your log files.

I would love your feedback on this post, either directly in comments or via email. Thanks again to the folks at Google for making this awesome research freely available and to John Lovett for shedding light on this incredibly important aspect of our sector.  Remember: we are analysts—our jobs are to ask hard questions and then ask even harder ones!

Adobe Analytics, Analytics Strategy, Conferences/Community, General

Analytics Demystified European Tour

Those of you who live in Europe are likely already aware that my good friend Aurelie Pols has joined me as a partner in Analytics Demystified. Over the next two weeks she and I will be making a series of presentations and announcements at events across Northern Europe. We will be at:

  • The Online Performance Management seminars, hosted by Creuna, in Copenhagen on Thursday, October 8th and in Oslo, Norway on Friday, October 9th. More information about our hosts and registration is available from Creuna.
  • While we’re in Copenhagen we will be having a Web Analytics Wednesday on Wednesday, October 7th. I will be giving a short presentation on testing, and if you’re in Copenhagen please join us at this FREE EVENT sponsored by IIH Nordic and Webtrekk.
  • Over the weekend Aurelie and I will be hanging out in Stockholm, Sweden. If you’re in Stockholm and want to meet-up please either shoot me an email or Twitter me and we’ll make plans!
  • On Monday, October 12th and Tuesday, October 13th Aurelie and I will be joining the excellent Emetrics crew at Emetrics Stockholm. I will be giving the keynote on Tuesday morning and Aurelie and I will both be participating on a series of panels and shorter presentations. Those of you keeping score will note that I have attended EVERY SINGLE Emetrics ever held in the United States but this is my FIRST EVER event in Europe. Yahoo!
  • On Wednesday, October 14th, I will be hanging out in Amsterdam with the Nedstat crew but have a fair amount of downtime during the day. I’m staying near Vondelpark and if you’d like to meet and get a cup of coffee (seriously, I mean coffee, I’m too old for the other stuff) Twitter me and we’ll make plans!

Since I usually do three European cities in three or four days this trip is a lazy walkabout for me (four cities, seven days) but Aurelie and I have planning to do and, of course, we’ll spend a little time enjoying the local culture.

If you live in any of these cities, or if you plan to come to Emetrics, please join us and come say hello!

General

The Acquisition…

So by now, you have heard the news about the Omniture acquisition by Adobe. Some out there have pinged me for my thoughts on the matter. Since my blog is reserved for education vs. opinion, I am inclined not to comment much on the matter, but given the magnitude of the transaction, I thought I might provide a few random thoughts…

10 Things I Think About the Acquisition

  1. I think that Omniture has some great products and even better people. I wish them the best and hope that this acquisition doesn’t impact them in a negative way.
  2. I think that the two companies are a strange match. I understand the potential synergies and know that both companies will do their best to portray the acquisition as having a synergistic effect, but I am a bit skeptical. When you spend years preaching about optimizing websites and conversion, I don’t see how that jibes with a company that makes design-related products. Sure you can track Flash components and Flash microsites better, but you could do that without the need to acquire the company that does the tracking.
  3. I think acquisitions are hard and fail more often than they succeed. Integrating two companies is simply hard work. Many years ago, I was a Lotus Notes expert. Lotus had a thriving e-mail and collaboration tool. People like me ran consultancies around their products. Then IBM bought them and the product died. If you are still using Lotus Notes today, you are one of the few (and maybe proud?). Lotus Notes became an after-thought to IBM as it was a small part of their overall business. I fear that the same thing could happen to Omniture at Adobe.
  4. I think that Omniture acquired too many companies too fast and this may have led to a loss of focus. The Omniture leadership team often spoke about the goal of becoming a company that generated a billion dollars in revenue/year. I think that all of the companies that Omniture bought and the difficulties in integrating them together may have made it more difficult for the company to achieve its goal. I think they had the right vision of creating a cohesive online marketing suite (minus the sorely needed e-mail provider acquisition), but I think a more methodical approach and more up-front integration plans could have made a world of difference.
  5. I think a good question to ask is why Omniture chose to sell now? While they are getting a decent premium, I am sure they could have stayed independent for a while and continued to grow the company. Did the management team feel they had taken it as far as they could? Did they find the prospects of future growth too daunting?
  6. I think that Google Analytics played a silent, but big part in this transaction (the elephant in the room!). I also think that the long term winners of this deal could be Unica and/or Eloqua.
  7. I think that Adobe should focus on three of Omniture’s products: SiteCatalyst, Test&Target (formerly Offermatica) and Insight (formerly Visual Sciences). Without trying to offend anyone, I think these three products are the most valuable and unique to Omniture. If I were in charge, I would focus all Omniture development resources on those products…
  8. I think that five years from now, there is a chance that we may all be viewing websites and display ads that are much more Flash-intensive and interactive and that there will be people running those sites/ads using Omniture data and targeting to get more and more of our money! Those people will look back on this date as the day that things changed for the better. If Google Analytics can drive more advertising revenue for AdWords, maybe Omniture Analytics can drive more product revenue for Adobe…
  9. I think that Adobe would be wise to keep Omniture as a standalone brand since it is very well known in the web analytics space (not to mention that I would have to change my Twitter Name!). I don’t care if they want to cross-sell products, but the last thing Omniture customers need right now is rebranding, bundling, new contract/payment terms, etc… (unless they want to go down the free model which I would be supportive of!). Trim the fat, re-focus the company on a few core products, retain the good folks and I think they will see a profitable subsidiary.
  10. I think that Adobe could do the following to help Omniture customers and be seen as heroes:
    • Use their size and $$$ to find a way to make SiteCatalyst servers more robust, reliable and speedy
    • Find a way to simplify Omniture products so they are on par with newer analytics tools like GA (there is a cool product named Flash with which they could build a state-of-the-art SiteCatalyst interface!)
    • Deliver on the product-to-product integrations for which Omniture customers have been anxiously waiting
    • Find a way to provide more Omniture resources to help customers through Support and/or Account Management (and give Ben a raise!)

I wish both companies the best and hope to continue to be an advocate and champion of the Omniture products…

Adobe Analytics, General

Segment Bounce Rates

In my last post, I discussed a topic which I called Segment Pathing, which allows you to see how Pathing on your site differs by Visitor Type or Campaign Tracking Code. In this post I will build upon this concept with one of the most popular topics in the Web Analytics field: Bounce Rates. While I am not as enthusiastic about Bounce Rates as many others in the field, I do understand their importance and why people like them. However, one of my gripes with the Bounce Rate metric (which I have always defined as Single Access/Entries) is that there is not an easy way in SiteCatalyst to see Bounce Rates for different types of visitors or Campaigns. Unless they have Omniture Discover or are experts at ASI Segments, most of the Omniture clients I worked with were primarily looking at Bounce Rates for the entire population. While this is OK, I think we can do better than that. In this post I will show you how I create Segment Bounce Rates. However, to get the most out of this post, I strongly encourage you to read my prior post on Bounce Rates and my previous post on Segment Pathing first.

Segment Bounce Rates
As I just described, my goal when looking at Bounce Rates is to be able to tell my peers how visitors are bouncing off key pages based upon both the page and the segment. In my previous post, I highlighted two segments that I commonly use: 1) Visitor Type (i.e. Customer vs. Non-Customer) and 2) Campaign Tracking Code (i.e. visitors from Google keyword A vs. Yahoo keyword B). If I can dissect how each segment bounces off pages, I can determine if I need to create different versions of pages for each Visitor Type or Campaign Code or I can use this information to build future A/B Tests using a tool like Test&Target. As I mentioned in my last post, this is a moot point if your organization already has Omniture Discover, but as is always the case in my blogs, my goal is to show you how to do things if you only have access to SiteCatalyst.

Implementing Segment Bounce Rates
The good news is that if you have already followed my instructions from my previous post on Segment Pathing, you are 95% of the way to being done with implementing Segment Bounce Rates! As a quick recap, in my last post I described a process in which you concatenate the Page Name with another Traffic Variable (sProp) that contains a segmentation that you care about (i.e. Visitor Type). Once you have these values concatenated on every page, you enable Pathing so you can see paths or pages by segment. However, when you enable Pathing on this new sProp, you immediately gain access to the two metrics that you need to calculate Bounce Rate: Single Access & Entries. Therefore, without even knowing it, by implementing Segment Pathing, you have also implemented Segment Bounce Rates! All you need to do is to create the Bounce Rate Calculated Metric (which hopefully you already have as a Global Calculated Metric) and you are done.
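
To make the mechanics concrete, here is a hedged sketch of the concatenation inside s_doPlugins; prop15 and the source of the Visitor Type value are assumptions for illustration:

  // Inside s_doPlugins: concatenate the segment value with the page name
  var visitorType = s.eVar1 || "Unknown";    // e.g. "Customer" vs. "Non-Customer"
  s.prop15 = visitorType + ":" + s.pageName; // enable Pathing on prop15 in the Admin Console
  // Bounce Rate calculated metric in SiteCatalyst: Single Access / Entries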

So how do you see the results of your work? All you need to do is to open the new concatenated sProp and add the Bounce Rate metric to the report. In the example shown below, I will use the Campaign Pathing sProp which shows Campaign Tracking Codes concatenated with Page Names. I will add Visits, Single Access, Entries and Bounce Rate to the report:

SegmentBounce_1

As you can see, the Bounce Rate for each Tracking Code/Page Name combination is displayed and you can sort by any metric you wish.

As a best practice, I like to conduct a text search filter to isolate one Page Name so I can see how the Bounce Rates differ for the same page with different Campaign Tracking Codes. In the following example, I filtered on the phrase “:Home Page” and limited my results to see only Home Page Entries and the associated Bounce rates of each Campaign Tracking Code:

SegmentBounce_2

Keep in mind that I am only showing a few simple examples here and that this functionality can be extended to any segment of your choosing. If you want to get really advanced, you could even concatenate multiple items together, such as Visitor Type + Campaign Tracking Code + Page Name. This would allow you to see how different Visitor Types, coming from specific Campaign Tracking Codes, landing on specific Pages, navigate your site or Bounce off pages (i.e. Customer:ggl_1:Home Page). Just don’t go too crazy since there are character limits on sProps and you don’t want to exceed the 500,000 monthly unique limits on sProps.

Final Thoughts
As you can see, you get a “two for the price of one” deal if you do all of the steps in this post and the previous post. If you don’t have access to Omniture Discover and want to see how people navigate through your site or bounce off your site pages by specific segment, I suggest you give this a try and see if it helps you.

General

Am I Ever BeHIND on Posting…

August was a little crazy for me:

  • I changed jobs — left Nationwide to become Director, Measurement and Analytics at Resource Interactive — which is 1000% the “right” move, but meant for a hectic/stressful month
  • Back-to-school time, which was more than just getting our kids ready — my wife ran our two sons’ elementary school’s entire supply sale…and my “I’ll show you a few tricks in Excel to help you stay organized” offer morphed into a full-blown custom ERP system built in MS Access; August was the month when all the supplies arrived (think almost 10,000 no. 2 pencils…) and had to be divvied up; I did no divvying, but there were a number of late-breaking report requests; at last count, the database had over 20 tables (it’s almost a fully denormalized database), over 40 queries, 12 forms, and 20+ reports; AND…it’s now been extended to also handle the production of the school’s student directory; gotta love MS Access!
  • Company, company, company — two visits from friends in Texas, two visits from my parents, a visit from my in-laws, and my mother-in-law moved in for six weeks to convalesce from surgery…all in a 3-week period in August

I’ve got one more good customer data management post in me that needs to get written, at which point I expect to be shifting over to more web analytics-y, social media measurement-y posts going forward.

And…as I played around with Drupal for a couple of projects over the past couple of months, I realized that the theme that I settled on after weeks of experimentation on this blog…is one that was built for WordPress to mimic one of the Drupal default themes! How embarrassing!

Please be patient! My life will settle back down soon (I hope). In the meantime, if you’re going to be in Columbus in the middle of September, consider stopping by this month’s Web Analytics Wednesday on September 16th!

Adobe Analytics, General, Reporting

Custom Search Success Events

I know many Omniture clients who spend much of their time using SiteCatalyst for SEO and SEM tracking. If you are one of these clients, the following will show you a fun little trick you can use to improve your Search reporting by setting custom Search Success Events.

That Darn Instances Metric!
As a Search marketer, you tend to spend a lot of your time in the various Paid and Natural Search Engine reports within SiteCatalyst. While in those reports, you would normally use the out-of-the-box “Searches” metric for most of your reporting. If you stay in the Search reports, life is good, as you can use the Searches metric and any other Success Event to see what success takes place after visitors arrive from a particular Search Engine or Search Keyword. For example, here is a report that shows Searches and Form Completions coming from various Search Engines:

customsearch_1

However, as I blogged about a while back in my Instances post, the Searches metric is really just a renaming of the dreaded SiteCatalyst “Instances” metric. Why is that bad? It means that if you need to see Searches in any other Conversion Variable (eVar) report, you are out of luck. For example, let’s say that your boss wants to see a report that shows Searches and Form Completions (and possibly a Calculated Metric that divides the two) by Site Locale (each country in which you do business). To do this, you would open the Site Locale eVar report and add Form Completions, but guess what…there is no “Searches” metric to add to the report since it only exists in the Search Engine reports! Rats!

Let’s say you are an eternal optimist and you say, darn it, I can solve this! After poring over past blogs, you finally arrive at the perfect answer! I can use Conversion Subrelations to break the Search Engine report down by Site Locale while the Searches metric is in the report! So you go back to the Searches report shown above and realize that all you have to do is use the green magnifying glass icon to break the report down by the Site Locale eVar (which BTW will only work if Site Locale has Full Subrelations enabled). I’m a genius, you think to yourself! Then you wait for the report to load…brimming with anticipation, only to see this…

customsearch_2

Yuck! What’s up with all of the “n/a” values? Foiled again by the darn Instances metric!

Don’t Panic!
Don’t be so hard on yourself; if you got that far, you are ok in my book! Just consider this a well-earned lesson on why you have to be careful around any Instances metric (don’t fall for the same thing with Product Views!). As always, I don’t like to just present problems, since the Omni Man is all about solutions! To solve this enigma, we have to find a way to get around the Instances metric. At a high level, the solution is to set custom Success Events when visitors arrive at your site from a Search Engine. I usually set Natural Search, Paid Search, and Paid + Natural Search metrics. This can be done in several ways, but the easiest way is through a Unified Sources VISTA rule or the JavaScript equivalent known as the Channel Manager plug-in. Regardless of how you implement it, once you have true custom success events set when visitors arrive from a search engine, you can use these success events anywhere within Omniture SiteCatalyst, which means that you can now create the report you were looking for above, as shown here:
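
To give a feel for the JavaScript side of this, here is a rough sketch of the general idea. The event numbers and the referrer/query-string checks are simplified placeholders; the real Channel Manager plug-in (or a Unified Sources VISTA rule) handles search-engine detection and paid vs. natural classification far more robustly:

    // Rough sketch only -- hypothetical event numbers and deliberately crude checks.
    var ref = (document.referrer || "").toLowerCase();
    var isSearch = /google\.|yahoo\.|bing\./.test(ref);             // crude search-engine test
    var isPaid = /[?&](cid|gclid)=/.test(window.location.search);   // crude paid-click test

    if (isSearch) {
      // event20 = Paid Search, event21 = Natural Search, event22 = Paid + Natural Search
      var searchEvents = (isPaid ? "event20" : "event21") + ",event22";
      s.events = s.events ? s.events + "," + searchEvents : searchEvents;
    }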

customsearch_3

The following are some other advantages of using custom success events for Searches:

  1. You can use these metrics in Calculated Metrics (e.g., Shopping Cart Additions/External Natural Search) without having to rely upon the Excel Client
  2. You can create Alerts on Paid or Natural Search metrics
  3. You can add some cool SiteCatalyst Plug-ins or advanced features to the new Custom Search success events that make them even better than the out-of-the-box Searches metric (e.g., avoid back-button duplicate counting by using the getValOnce plug-in or Event Serialization; a quick getValOnce sketch follows below).
  4. You have an easy way to create a metric report for Searches (see below) and add it to a SiteCatalyst Dashboard

customsearch_4
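
For the getValOnce idea in point 3, the canonical usage pattern looks roughly like the snippet below (the cookie name and expiration are illustrative, and this assumes the getValOnce plug-in code is already included in your s_code.js); the same guard can be wrapped around your custom search events:

    // Illustrative only. getValOnce returns an empty string when the value matches
    // what is already stored in the named cookie, so a back-button reload does not
    // record the same value a second time.
    s.campaign = s.getValOnce(s.campaign, "s_campaign", 30); // 30-day de-duplication window

    // The same pattern applied to a hypothetical custom search event:
    // s.events = s.getValOnce("event21", "s_nat_search", 0); // session-length cookie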

The only caveat I will give you is that the new custom Search metrics will probably never tie exactly with the out-of-the-box metrics, but in many cases you can make them more accurate and useful. If SEO/SEM is something that is important to your organization, I suggest you talk to Omniture Consulting and give it a whirl… Let me know if you come up with any other cool uses for this functionality…

Analytics Strategy, Conferences/Community, General

Interview: John Lovett from Forrester Research

Following up my interview with Bill Gassman a few weeks ago I realized that I would be remiss if I didn’t build on Forrester’s recent Web Analytics Wave report with an interview with John Lovett. John, like Bill, totally, totally understands the web analytics industry, and in that understanding is able to clarify the marketplace in a way few others can. Don’t believe me? Check out his response to possibly the worst article about web analytics, ever. Measured, polite, even complimentary … that’s John.

I am personally honored that John accepted my invitation to return to the X Change this year and both lead the huddle on “Industry Standards (or a lack thereof)” and co-lead a huddle on technology with Bill Gassman. If you haven’t met John personally, and if you are able to join us at the X Change, I strongly recommend you make a point of introducing yourself to him.

Finally, before my questions and John’s answers, I wanted to point out how incredibly deft Mr. Lovett really is: in response to a high-and-hard fastball question about “which vendor is really the best,” John knocked the ball clear out of the park with his answer: none of them. I’ll let you read the rest for yourself …


Your recent Wave report really emphasized a lot of conventional wisdom about the web analytics vendors but had some surprises for folks.  What surprised YOU the most about the Wave results?

Well Eric, I like to say that surprises are for birthdays and not for business. So in terms of actual surprises, there weren’t any big bombshells for me. I was however pleased that the vendors demonstrated innovation in a number of areas (like social media measurement) and that despite my attempts to develop extremely challenging criteria, the vendors continue to improve year over year.

One comment people have made to me is that they question the validity of comparing fee and free solutions in a single matrix due to the fundamental differences in their business models.  How would (or do) you respond to that challenge?

That’s preposterous! I respond by saying that it’s negligent not to compare free vs. fee-based solutions. In today’s economic environment, if you’re not watching expenses by understanding the cost-to-benefit ratio of your Web analytics solution, you are acting irresponsibly. Free tools have merit for many organizations as both primary and secondary tools, while fee-based solutions are more appropriate for others based on their capabilities. Organizations must do their due diligence to understand what they need in a Web analytics solution to decide what’s right for them, which is really the insight the Wave attempts to provide.

I asked Bill Gassman from Gartner a variation on this question recently, but do you now or see in the near future a situation where you as a Forrester analyst are advising your clients to actively consider these free solutions in addition to “traditional” web analytics solutions from Omniture, Coremetrics, and Unica?  As a follow-up, how do you see free tools impacting the market in the next 12 to 24 months?

I advocate that a single system for measurement is always the best way to go, yet recognize that this isn’t always feasible. Duality of Web analytics tools is a reality for myriad reasons. Thus, companies need to manage their data dissemination practices to ensure comprehension and mitigate doubt. This is tricky, but certainly possible. I often help clients determine which solution is best suited to meet their needs, and financial implications are always a part of that discussion.

With regard to how free tools will impact the market: we are just witnessing the beginning of the incoming tide on this one. By this I mean that “free” will continue to disrupt the market by placing pressure for improvement on all vendors. Just look at the recent Webtrends product upgrade announcement – the majority of press around it cited a “look out Google Analytics” slant. Why the comparison…they’re worried! Fee-based vendors have even more to fear now that Yahoo! Web Analytics opened up its partner program.

Another comment I hear about the Wave results, and forgive me this, is that they’re lame because they do nothing to differentiate the “market leaders” who appear as a tight cluster.  The evidence cited is that all four vendors issued press releases declaring their “market leadership” which appears technically correct based on the Wave but as the Highlander said, “There can be only one.”  First, how do you respond to this and second, who is the real market leader in web analytics?

Here’s the dirty little secret – the real market leader is the wildly talented Web analytics practitioner. It’s not the tools that differentiate; it’s the craftsman. Any company that believes the Web analytics technology alone will make them incredibly successful is delusional or just plain out of touch. There is no get-rich-quick scheme here. Each of the leading vendors on the Wave offers a highly customized solution that can be tricked out to meet nearly anyone’s individual needs. But this takes a great deal of work. For those organizations that are looking for the far-and-away winner in this technology category, guess what: the tools will only get you so far – you need talented people to really make it happen.

Rumors are that Omniture has a bunch of “800 lb gorillas” hanging in their offices right now.  Clearly they’re proud of their position, but last quarter’s results highlighted that there are clear risks to their business that are beginning to manifest.  What do you think are the greatest risks to Omniture’s business over the next 18 months?

Well, I don’t buy into rumors and sure don’t know where I left my crystal ball. But things are tough all over. As I stated earlier, free solutions are threatening all fee-based vendors and forcing them to work harder. I can tell you that measurement technologies are an imperative for executing on digital marketing endeavors. Solutions like Omniture’s, Webtrends’, Coremetrics’, Unica’s and everyone else’s will continue to play an important role in the evolution of organizations conducting business online. I believe that Web analytics is increasingly becoming an integrated service and expect to see things evolve to easier access to data through new and alternative means. The leading vendors, including Omniture, will play a role in this evolution.

What’s your take on the current hype cycle around “open”?  Omniture bangs the Genesis drum, Coremetrics connects, and now WebTrends appears to have decided that “open” will be the foundation of their future success (or lack thereof) … but some people think that “open” is a check-box requirement, not a competitive differentiator.  What do you think?

Open is not a feature, it’s a philosophy. The ability to get data into and out of a Web analytics solution is the crux of the issue, and leading vendors facilitate this through bi-directional APIs, other import and export functions, and data dissemination capabilities. Webtrends is currently doing this as well as anyone, but “open” also means talking to your customers about development plans, listening to criticism, and demonstrating a willingness to change. These qualities aren’t unique to Webtrends; they’re characteristics that all vendors should exhibit. Webtrends is just marketing around them, and if that’s causing people to want open, then it appears to be working.

As a previous attendee of the X Change, what do you like best about the conference, and what would you like to see us change this year or next?

I appreciate the intimate conversational format of X Change. The huddles really facilitate deep thought, controversial leeway and provocative discussion. As someone who attends a number of conferences, it is refreshing to engage in dialogue with individuals who are passionate about what they do and to initiate a true collaborative thinking environment. As far as change goes, I really hope to be able to guide the huddles that I’m leading toward resolution. Within our industry, all too often we surface problems and issues without identifying solutions. I’ve taken your challenge to heart and hope to walk away with some tangible results from my huddles.


John will be joining Bill Gassman, Gary Angel, June Dershewitz, and over 100 expert users, consultants, and vendors at the 2009 X Change conference in San Francisco on September 9, 10, and 11. Registration is currently underway and we’d love to have you join us! For more information please visit:

http://www.xchangeconference.com

General, Reporting

My Favorite v14.6 New Features

A few weeks ago, with the release of SiteCatalyst v14.6, there were a few interface features added that people like me have been requesting for a long time. While there were many new items released, two of the simpler ones can go a long way toward making the lives of power users easier. Below is a quick description of these two enhancements and why I like them.

Send Link
Have you ever worked hard to create a beautiful report in SiteCatalyst and wanted to share it with others at your company? To do so, you usually have to save it as a Bookmark or to a Dashboard, share that Bookmark or Dashboard, and then tell users how to find it and add it to their list of Bookmarks or Dashboards. Alternatively, you could send it to them in PDF/Excel/CSV format, but then they cannot manipulate it (change the dates, add different metrics, etc…). Well, all of that is a thing of the past, since you can now easily send a link to the exact report you are looking at to one of your peers. The only prerequisite is that they have a log-in to SiteCatalyst and security access to the report suite and variables used in the report. This is a real time-saver, and I think it will be useful in driving SiteCatalyst adoption by getting people into the tool to explore vs. always looking at reports sent via e-mail.

To send a link to a report, simply click the new icon found in the toolbar…

14_6_SendLink

…and you can copy this link and send it to people at your organization. I was told that these links would be good for a year, which should be plenty of time. The way I am most excited to use this feature is in PowerPoint presentations, where you can put a screen shot of a report in a slide and then make the entire screen shot image a hyperlink to the real report. That way, when you are presenting, you can dive right into the live report without having to fumble around to find different reports when you are short on time and/or in front of executives.

My only complaints/enhancement requests of this new feature are as follows:

  • I would like to be able to have this feature for Dashboards as well
  • It would be cool if you could e-mail the link to SiteCatalyst users by picking names from an address book, since they all exist in the Admin Console anyway. Even better if you could set up some groups for people whom you commonly e-mail
  • In the future, it would be interesting if you could send the link to a Publishing List which would show the same report, but for a different report suite to different groups of people (however, this would mean you need to check a box to determine if the link is variable or not like Dashboard reportlets)

Update Dashboard Reportlet
The second new feature I love is the ability to update Dashboard reportlets. Using this feature, you can now make changes to a Dashboard reportlet much more easily than in the past. Previously, to update a Dashboard reportlet, you would have to:

  1. Open the Dashboard
  2. Launch the reportlet into full view
  3. Make your changes
  4. Click to add the new version back to the Dashboard
  5. Update the reportlet settings
  6. Wait for the Dashboard to open
  7. Delete the old version of the reportlet
  8. Move the new version to the correct space (phew!)

Now you can accomplish the same thing by doing the following:

  1. Open the Dashboard
  2. Launch the reportlet into full view
  3. Make your changes
  4. Click the new link (shown below) to update the Dashboard reportlet

14_6_Reportlet

As you can see, this is much easier and much more intuitive for end-users. In addition, you can even change report suites to view the same reportlet for a different data set, update it, and have it saved back to the Dashboard tied to the new report suite! Very exciting for Omniture guys like me!

Well, those are my two favorite enhancements, but I know there were many more made. Let me know if you agree/disagree that these two items are useful, if there are other feature updates that you have found useful, or if you have additional suggestions on how these two can be improved (maybe Omniture Product Management will end up reading these!). Thanks!

Analytics Strategy, Conferences/Community, General

Interview: Bill Gassman (Gartner) on Google Analytics

Bill Gassman from Gartner is one of those guys that just “gets” what we’re trying to do in the web and digital analytics industry. Perhaps because he’s been covering this space for nearly as long as I’ve been around, or perhaps because he has a deep business intelligence background and sees where all this is going. I dunno, but Bill gets it.

Recently Bill, who is incidentally coming to the 2009 X Change and leading a huddle on organizational issues and co-leading a huddle on technology with John Lovett from Forrester, published a short brief on Google Analytics that I thought really hit the nail on the head. Clear, honest, and fully taking the Enterprise into account, Bill’s report clarified a lot about how companies should be thinking about Google’s analytics solution.

Since I could not get permission to republish Bill’s report I did the next best thing — I came up with some questions and put them to the man himself. The following are my questions and Bill’s responses. Incidentally, if you want to follow-up on this interview Bill graciously said he would monitor the comments and respond there (so comment away!)

Or, you could just come to San Francisco on September 9, 10, and 11 and debate the goodness of Google Analytics with Bill in person.

Regarding your recent note on Google Analytics, can you characterize how the companies who are asking you about “free” analytics have changed in the last 12 months?  Is any one thing driving that change, do you think?

Since Google Analytics improved last October, most client inquiries about Web analytics touch on Google Analytics.  That is why I published the note “Is Google Analytics Right for You?”.  (Gartner account required to access)  Marketing departments ask if it is all they need, purchasing agents wonder why they should spend money on commercial tools, and corporate lawyers wonder about Google’s terms and conditions.  The economy and budget constraints trigger the questions, but the major driver to Google is its simplicity.  Many organizations do not have the processes in place to make use of the high-end products or have Web sites that do not need the sophistication they offer.  They perceive Google Analytics as good enough and “free” is a tempting offer.

If Google asked you which three things were most important to add to their functional set to be considered “Enterprise” what would those three things be?

Getting to functional parity with the commercial tools is not enough for Google Analytics to be considered enterprise class.  Google should charge for an enterprise class offering, because a lot is required, as is the accountability that goes with the exchange of money.  They must also provide enterprise class support and address issues with the terms of service policy.  The key function missing is a visitor-centric repository, so users can define complex multi-session segments and export visitor information to campaign and content management tools.  Google would also have to extend personalized service from customers with significant AdWords accounts to those willing to pay for it.  Finally, the terms of use policy has no service level guarantee and users must look for a FAQ to find assurances of privacy.

On the subject of “Enterprise-class” analytics … Google appears obsessed with this designation, regardless of their clear dominance from a deployment standpoint and the gains they’ve made within larger companies.  What do you think is behind their obsession?

Google appears to be fighting an asymmetric war with IBM, Microsoft and others, investing relatively little yet forcing competitors to take notice.  We see this especially for office applications and cloud computing.  Google is looking for products that give them credibility at the enterprise level, and Google Analytics is part of that story.  Other parts of the story include the recent news about the Chrome OS, Google Search Appliance, Google apps (premier edition), Geospatial Solutions, and Google App Engine for cloud computing.  While many large organizations are using some or all of these offerings, with few exceptions, only the Search Appliance has gained strategic status.  Google still brings in 97% of its revenue through advertising.  They might be showing obsession because of where they want to be, but then again, they could be throwing up a smoke screen to keep the competition too busy to attack Google on advertising.

What do you consider the single greatest risk to Google’s analytics business in the next 24 months?

There is no threat to Google’s analytics business, because Web analytics is not their business, yet.  Out of 72 million active Web servers (as reported by Netcraft), about 20,000 organizations pay for Web analytics.  Google gives away Google Analytics so that millions of Web site owners can see the impact of AdWords and buy more Google ads.  If there is a threat, it is Yahoo Web Analytics, who is using a similar tactic to go after Google’s advertising revenue.

Are you now, or do you see in the near future, a situation where as a Gartner analyst you are advising your clients to actively consider free solutions from Google and Yahoo alongside “traditional” web analytics solutions like Omniture, WebTrends, and Coremetrics?

Running two sets of tools on the same Web pages can be a recipe for trouble, because reported numbers will not match, reducing respect and therefore value for both tools.  There are situations however where two tools make sense.  It would be great if all organizations had the leadership, investment, skills and processes to use commercial tools to meet everyone’s needs, but for too many, it has not worked out that way.  When analytic resources are limited, it is pragmatic to focus commercial tools on the high-value parts of the site and let other site stakeholders use free tools. Analysis is a critical part of a customer centric Web strategy.  If some departments are happy with the free tools and a central group cannot support them, it is OK to let chaos reign until the business justification, investment and leadership are available to do things right.

General

SPSS Expertise? Job Opportunity — Full-time/Contract/Flexible

The Austin-based division of a qualitative and quantitative research company is looking for someone with SPSS expertise — they’re pretty flexible as to how the work gets set up, and there is not a requirement that the role be based in Austin. Below are the requirements:

Minimum requirement (per project): 15-20 hours a week

Software Skills (in order of importance):

  • SPSS (v15 to V17) – Intermediate to Expert (4+ years)
    Ability to:

    • Navigate using command code (little reliance on GUI)
    • Develop concise syntax
    • Clearly document processes
    • Working knowledge of data import and export procedures
    • Techniques to manipulate string and numeric data
    • Logic construction and deconstruction
    • Loop and do repeat functions
    • Descriptive statistics
    • Custom tables
    • Macros
  • Text Editor – Intermediate (3+ years)
  • Excel – Intermediate (3+ years)

General Experience (in order of importance):

  • Data management:
    • Processing
    • Troubleshooting: data / logic issues
    • Basic analysis
    • Basic to complex reporting
  • Work environment: small programming teams
  • Online surveys
  • Research / market research (consumer and/or B2B)
  • Technology sector

General Skills

  • Technical orientation
  • Strong attention to detail
  • Strong problem solver

Nice to Have Skills:

  • Python
  • Visual Basic

Ping me for details (the details being “here’s the company and a contact person”) at: tim at gilliganondata dot com.

Adobe Analytics, Analytics Strategy, Conferences/Community, General

Want to Debate Standards?

One of the biggest problems we face in web analytics today is our industry’s lack of standards and common definitions. And while a great number of incredibly bright folks have put a ton of energy into solving these problems, in my humble opinion we are more or less where we started years ago — agreeing politely to disagree. Those of you who have been reading my blog for awhile know that I’m not shy about disagreement — perhaps more than anything my analyst’s mind loves a spirited debate — but I also am somewhat anxious about creating tangible outcomes.

To this end I am incredibly excited about two huddles at X Change 2009, one that was just added! The first is Forrester’s John Lovett’s “Web Analytics Standards (or a Lack Thereof)” in which John will be leading us through the current state of industry standards, proposed definitions and our collective understanding of analytics terminology. The second, and one just added to the X Change, is Jim Hassert’s “When is a Visitor Not a Real Person?” huddle in which Jim will take John’s huddle one step further and drill-down into the often irreconcilable differences found in the seemingly harmless “visitor” metric and dimension.

Last year I was forced to miss a lot of good huddles. This year a team of wild horses couldn’t keep me away from these two.

While I have little doubt that both of these huddles will live up to the spirit of the X Change my hope is that they will go one step further. I would love to see both produce some kind of actionable outcome, something that we can carry forth into our careers and the wider conversation about our industry. Given that some serious talent is already signed up for the X Change — including some of the brightest minds in the practitioner and vendor community — I have little doubt that we have the brain power … now all we need is the resolve to do something and not just push words around on paper.

If you’re a reader of this blog and want to join us at the X Change I’m happy to help you out.  If you act before July 31st I am offering a 15% discount on the registration (a $300 savings!)

Come to the X Change. Agree to do more than “politely disagree” — take a stand, defend your ideas, and help shape tangible and positive outcomes.

Analysis, General, Reporting

Where BI Is Heading (Must Head) to Stay Relevant

I stumbled across a post by Don Campbell (CTO of BI and Performance Management at IBM — he was at Cognos when they got acquired) today that really got my gears turning. His 10 Red Hot BI Trends provide a lot of food for thought for a single post (for one thing, the post only lists eight trends…huh?). It’s worth clicking over to the post for a read, as I’m not going to repeat the content here.

BUT…I can’t help but add in my own drool thoughts on some of his ideas:

  1. Green Computing — not much to add here; this is more about next generation mainframes that run on a less power than the processors of yesteryear
  2. Social Networking — it stands to reason that Web 2.0 has a place in BI, and Campbell starts to explain the wherefore and the why. One gap I’ve never seen a BI tool fill effectively is the ability to embed ad hoc comments and explanations within a report. That’s one of the reasons that Excel sticks around — because an Excel based report has to be “produced” in some fashion, there is an opportunity to review, analyze, and provide an assessment within the report. Enterprise BI tools have a much harder time enabling this — when it’s come up with BI tool vendors, it tends to get treated more as a data problem than a tool problem. In other words, “Sure, if you’ve got data about the reports stored somewhere, you can use our tool to display it.” What Campbell starts to touch on in his post is the potential for incorporating social bookmarking (“this view of this data is interesting and here is why”) and commenting/collaboration to truly start blending BI with knowledge management. The challenge is going to be that reports are becoming increasingly dynamic, and users are getting greater control over what they see and how. With roles-based data access, the data that users see on the same report varies from user to user. That’s going to make it challenging to manage “social” collaboration. Challenging…but something that I hope the enterprise BI vendors are trying to overcome.
  3. Data Visualization — I wouldn’t have a category on this blog dedicated to data visualization if I didn’t think this was important. I can’t help but wonder if Campbell is realizing that Cognos was as guilty as the other major BI players of confusing “demo-y neat” with “effective” when it comes to past BI tool feature development. From his post: “The best visualizations do not necessarily involve the most complex graphics or charts, but rather the best representation of the data.” Amen, brother!!! Effective data visualization is finally starting to get some traction — or, at least, a growing list of vocal advocates (side note: Jon Peltier has started up a Chart Busters category on his blog — worth checking out). What I would like to see: BI vendors taking more responsibility for helping their users present data effectively. Maybe a wizard in report builders that ask questions about the type of data being presented? Maybe a blinking red popup warning (preferably with loud sirens) whenever someone selects the 3D effect for a chart? The challenge with data visualization is that soooooo many analysts: 1) are not inherently wired for effective visualization, and 2) wildly underestimate how important it is.
  4. Mobile — I attended a session on mobile BI almost five years ago at a TDWI conference…and I still don’t see this as being a particularly hot topic. Even Campbell, with his mention of RFIDs, seems to think this is as much about new data sources as it is about reporting and analysis in a handheld environment.
  5. Predictive Analytics — this has been the Holy Grail of BI for years. I don’t have enough exposure to enough companies who have successfully operationalized predictive analytics to speak with too much authority here. But, I’d bet good money that every company that is successful in this area has long since mastered the fundamentals of performance measurement. In other words, predictive analytics is the future, but too many businesses are thinking they can run (predictive analytics) before they crawl (performance measurement / KPIs / effective scorecards).
  6. Composite Applications — this seems like a fancy way to say “user-controlled portals.” This really ties into the social networking (or at least Web 2.0), I think, in that it gives a user the ability to build a custom home page with “widgets” from different data sources that focus on what he/she truly views as important. Taking this a step farther — measuring the usage of those widgets — which ones are turned on, as well as which ones are drilled into — seems like a good way to assess whether what the corporate party line says is important is what line management is really using. There are some intriguing possibilities there as an extension of the “reports on the usage of reports” that gets bandied about any time a company starts coming to terms with report explosion in their BI (or web analytics) environment.
  7. Cloud Computing — I actually had to go and look up the definition of cloud computing a couple of weeks ago after asking a co-worker who used the term if cloud computing and SaaS were the same thing (answer: SaaS is a subset of cloud computing…but probably the most dominant form). This is a must-have for the future of BI — as our lives become increasingly computerized, the days of a locally installed BI client are numbered. I regularly float between three different computers and two Blackberries…and lose patience when what I need to do is tied to only one machine.
  8. Multitouch — think of the zoom in / zoom out capabilities of an iPhone. This, like mobile computing, doesn’t seem so much “hot” to me as somewhat futuristic. The best example of multitouch data exploration that I can think of is John King’s widely-mocked electoral maps on CNN (never did I miss Tim Russert and his handheld whiteboard more than when watching King on election night!). I get the theoretical possibilities…but we’ve got a long ways to go before there is truly a practical application of multitouch.

As I started with, there are a lot of exciting possibilities to consider here. I hope all of these topics are considered “hot” by BI vendors and BI practitioners — making headway on just a few of them would get us off the plateau we’ve been on for the past few years.

Analytics Strategy, General

The Truth About Mobile Analytics

Perhaps the only thing hotter than social media right now is mobile. And with good reason — smartphones like the iPhone and Palm Pre are taking our ability to get information to entirely new levels and ushering in an era of “digital ubiquity” that is clearly without precedent. Unsurprisingly, businesses are responding by actively exploring how they can participate in the mobile opportunity, either by optimizing their sites for small screens or going so far as to build cool, new iPhone applications to support long-standing offline initiatives.

Fortunately most business owners have learned from past mistakes and are showing interest in measuring the effect of their investment in mobile. But measuring mobile isn’t easy — the sheer diversity of technologies involved and the rapid evolution of the industry have created a monstrous landscape of devices, communication protocols, and requirements.

As a result, dozens of companies have sprung up, all claiming a unique ability to measure the mobile opportunity. Unfortunately, some of these companies have decided that relying on hype, hyperbole, and sometimes outright lies is a better sales strategy than building a great product with a unique value proposition. We have seen CEOs bash other CEOs, salespeople obfuscate their identities and try to provide “objective” answers, and antics that can only be described as “juvenile.”

Because the mobile opportunity is so great Analytics Demystified started taking a closer look at measurement earlier this year. I was fortunate enough to be able to rely on the expertise of folks like Michiel Berger and Thomas Pottjegort at Nedstat, the mobile team at NBC, dozens of analytics end-users, and some of the brightest product managers in the analytics sector tasked with integrating mobile into existing digital measurement offerings.

What I found was a series of surprising truths about how mobile analytics is evolving. Nedstat was kind enough to sponsor this research — and clear disclosure: Nedstat has been measuring and integrating mobile data into their web analytics offerings for years — and I am happy to announce the availability of this research in a new white paper titled “The Truth about Mobile Analytics.”

You can download this paper from the Nedstat web site for free (but they do ask for your name, email, and company name):

DOWNLOAD THE TRUTH ABOUT MOBILE ANALYTICS

We are also holding a special webcast on the subject on June 23rd at 10 AM Central European Time (CET) which is unfortunately quite late in the evening for those of us in the U.S. but quite well timed for Nedstat’s customers. I suspect the webcast will either be repeated or rebroadcast at a later date and time.

SIGN UP TO JOIN THE MOBILE ANALYTICS WEBCAST ON JUNE 23

Also, if you’re really into mobile and mobile analytics please consider joining us at the X Change Conference September 9, 10, and 11 in San Francisco. More details will be out next week but our mobile sessions will be led by Greg Dowling from Nokia (a company with some knowledge of mobile I am told.)

I encourage everyone to download the paper and give it a read, regardless of your position on mobile and mobile analytics today. As always I welcome your feedback and commentary.

Analytics Strategy, General

Demystifying Europe …

When I quit my job at Visual Sciences back in May 2007 to form Analytics Demystified I did so because I had a vision of a new type of web analytics consulting group. I very much wanted to build a small practice made up of very senior people capable of solving the really hard problems most companies have after they’ve made the investment in web analytic technology. I wanted to establish a firm that would complement the highly tactical firms that I respected so much — companies like Semphonic, Stratigent, and Europe’s OX2.

After two years I am very proud of the work I’ve done and the clients I’ve worked with. I have had the opportunity to work with some of the best brands, the best companies, and the most visionary management teams who are actively working to do more than simply “run reports” and instead want to actively compete on web analytics. That said, I have come to the realization that there is no way I could satisfy the global need on my own … so I did what every good business owner should do: I went out and got someone smarter, more eloquent, and better looking to be my business partner!

At Emetrics last week in San Jose I was incredibly excited to announce that Aurélie Pols, Europe’s most widely known and well respected web analytics consultant, has joined Analytics Demystified as a Principal Consultant.  Aurélie brings depth and experience in web analytics that is rare anywhere in the world and exceedingly rare in Europe. She was the first consultant to break the “one vendor” stranglehold in Europe that forced firms to work exclusively with a single technology, and she brings a brilliance to the explanation and use of these tools that amazes even me.

Now Aurélie and I will be working together in Europe to “demystify web analytics” and help companies make significantly better use of their technology investment. Between the two of us and our contacts across Europe, Analytics Demystified will now be providing a far greater level of service than was previously possible.

I highly recommend that you read Aurélie’s “Hello, World” blog post and start following her at aurelie.analyticsdemystified.com. If you have any questions about Aurélie’s practice or how Analytics Demystified can help you, regardless of where you’re located, please don’t hesitate to contact us directly.

I hope you will join me in welcoming Aurélie to the Analytics Demystified team.

Adobe Analytics, Conferences/Community, General

Are You Coming to Emetrics?

It’s almost amazing to consider that it has been a full year since the last Emetrics “West” event in California — what with so many changes and little Luca Dechamps Otamendi turning one — but it is again time to gather together and bask in the glory of Mr. Sterne’s excellent event. I am again honored to be presenting to a combined track, this time on Wednesday, May 6th at 11:00 AM, and will be giving an update of my “Competing on Web Analytics” presentation that resonates so well with, well, pretty much everyone who has seen it.

The update is important and stems from a bunch of research I have been doing for the past six months. Given the launch of Yahoo Web Analytics 9.5 today and the recent opening up of the Google Analytics APIs I am busier than ever talking with companies who are trying to find the “right” balance of technology, people, and process.

Also, as I do from time to time I have a really big announcement that I will be making at the beginning of my talk. Last time I quit my job at Visual Sciences to start Analytics Demystified … this time? Come to the talk and be the first to find out!

I hope you’ll drop by and see my talk, again: Wednesday, May 6th at 11:00 AM.

I am also speaking briefly in the “Softer Side of Metrics” panel with Mr. Stephen “Recently Elected to the WAA Board” Hamel and folks from BT Buckets and Firefox on Thursday, May 7th at 11:00 AM. This should be fun since I’ll get to introduce the larger web analytics community to the work I have been doing with Twitalyzer.

Also, don’t forget about the Emetrics edition of Web Analytics Wednesday which is, as always, open to conference attendees and the local community alike. We have something special planned to honor our recently deceased colleague Hosam Elkhodary so I hope you’ll sign up (so we can get a good count) and join us at the Fairmont Hotel in San Jose.

Finally, as always I go to Emetrics to meet with as many people as I possibly can and operate under the “I can sleep when I get home” mentality. If you’ve read my books, read my blog, enjoy Twitalyzer, or just have always wanted to ask me something please feel free to reach out … literally if you see me passing by or by Twittering me at @erictpeterson and setting up a time to meet.

(If you can’t make it to San Jose, the next big analytics event in the U.S. is the X Change Conference September 9, 10, and 11 in San Francisco. I’m a huge fan of (and partner in) the X Change, so I’d love to tell you more about it if you’re interested!)

I hope to see you in San Jose!

General

Blogroll Update+

Blogrolls, blogrolls, blogrolls. I realized over the weekend that the blogroll(s) on my site were wildly out of date — they reflected some great blogs…but not exactly the ones that I really follow and read most consistently these days.

So, I updated that. But, in the process, I decided to re-open a nasty can of worms that I’d only casually eyed in the past, and I added a Favorite Feeds page to the site. There were two reasons this was a dicey place to go:

  • While I’ve got the best intentions for putting up the page — to give people who come to my site an easy way to scan the content I’m most likely reviewing through my feed reader and possibly discover a new blog or two they’d like to follow — the “content ownership” makes for a touchy subject. There is plenty of splogging going on out there, and that’s really not my intent.
  • The logistics of actually posting a dynamically generated list that is easy to read and duly gives credit where credit is due were trickier than they seemed like they ought to be

I think I handled both of these challenges successfully, but please drop a comment if you think I’ve missed something.

Approach to Avoiding Inappropriate Republishing of Content

What I settled on was only posting the post titles and prepending each post with the source in brackets. Clicking on the link takes you to the content on the site where it originated (via feedproxy.google.com, which was entirely unintentional, but may yield some nice benefits down the road — I don’t think this introduces any ethical issues).

Technical Approach for Pulling this Off Using WordPress

I’m sure there are technically more elegant solutions, but here’s the list of how I stitched things together to make the page work:

  1. Created a Yahoo! Pipe that pulls each of these feeds, prepends the source in brackets, and then combines all of the feeds into a single feed sorted from newest to oldest publication date
  2. Ran the pipe through Feedburner — this wasn’t absolutely necessary, but just seemed like a best practice (I subscribe to the feed directly in my feed reader for when time is really short)
  3. Installed both the Exec-PHP WordPress plugin and the WP-RSSImport plugin
  4. To get Exec-PHP to work, and because I do use the WordPress WYSIWYG editor, I created a new user account that has the WYSIWYG editor turned off and used that account to create the new page
  5. To get WP-RSSImport to work, I ran the documentation page through Google to get enough of a translation for me to figure out that I needed to use the following code on the new page I created:
    <?php RSSImport(20, "http://feeds2.feedburner.com/GilliganOnDataFavoriteFeeds", false, false); ?>

It took a number of false starts, but the result seems fairly clean, so I’m going to go with it.

Whatcha’ think?

General

Interview on Social Media and Analytics

I have done hundreds of interviews with all kinds of media in my years in web analytics. Some of these interviews have turned out well, some less well, but rarely do I get to participate in a conversation about analytics that afterwards I think “Phew, that was cool.”

A few weeks ago I got to do exactly that thanks to Brent Leary at CRM Essentials.

If you have a few minutes and want to hear my recent thoughts on a variety of subjects including getting started in analytics, the impact of analytics on social media, and the work I’ve done recently on Twitalyzer, please take the time to listen to this interview.

Brent is a totally engaging interviewer and he pushed the conversation along in unexpected ways. I have been getting tons of good feedback already but, as always, I welcome your thoughts and comments.

Analytics Strategy, General

Is Your Attribution Model Appropriate?

Recently I have spent an awful lot of time thinking about and talking about data accuracy issues in the field of web analytics. The widespread use of cookies as a tracking mechanism and the underlying assumption that “one cookie = one visitor” is a big part of the problem, but cookies are not the only problem. Another problem, one that I actually believe to be more substantial than cookies and visitors, is  the challenge of campaign attribution.

Challenge? What’s hard about campaign attribution? You tag campaigns and web analytics tells you what works, right? You get pretty ROI graphs and click-reports and all that fun stuff? Campaign analytics is easy!

Wrong.

One of the best-kept secrets in online marketing is that most campaign attribution data is completely wrong and the models used to evaluate campaign performance are wholly inappropriate.  The relative nascence of digital marketing practices, combined with conflicting measurement systems and poorly understood interaction between online marketing channels, likely means that hundreds of millions of dollars are wasted annually on marketing efforts that don’t produce their intended results.

Companies are increasingly responding to this observation by re-examining their marketing measurement systems.  Even the most cursory analysis yields a great deal of information about the “campaign attribution problem.”  Thanks to the attention drawn to the issue by Microsoft’s recent “Engagement Mapping” efforts, as well as analysis published by Forrester Research and others, it is clear that the most widely used online campaign attribution model is inherently flawed.

To correct these flaws and begin to improve both the accuracy of measurement and the general understanding of how marketing really works online, Analytics Demystified recommends a new approach to campaign analysis.  Dubbed “Appropriate Attribution”, the approach leverages widely available but infrequently used data to triangulate towards the true value of online marketing efforts.
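
To make the contrast concrete, here is a deliberately simple sketch of how last-click credit differs from a fractional split across touchpoints. This is a generic illustration of the attribution problem, not the specific model described in the paper:

    // Generic illustration only -- not the "Appropriate Attribution" model itself.
    // Given an ordered list of campaign touchpoints for one converting visitor,
    // last-click hands all credit to the final touch, while an even fractional
    // split spreads the same order value across every touch.
    function attributeLastClick(touchpoints, orderValue) {
      var credit = {};
      credit[touchpoints[touchpoints.length - 1]] = orderValue;
      return credit;
    }

    function attributeEvenly(touchpoints, orderValue) {
      var credit = {};
      var share = orderValue / touchpoints.length;
      for (var i = 0; i < touchpoints.length; i++) {
        credit[touchpoints[i]] = (credit[touchpoints[i]] || 0) + share;
      }
      return credit;
    }

    // Example: paid search, then email, then a direct visit that converts.
    var touches = ["paid_search", "email", "direct"];
    attributeLastClick(touches, 100); // { direct: 100 }
    attributeEvenly(touches, 100);    // { paid_search: 33.3..., email: 33.3..., direct: 33.3... }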

Given that the majority of online advertisers have direct response goals, and that most marketers are still generally unsatisfied with the campaign measurement tools at their disposal, Analytics Demystified believes that Appropriate Attribution is the first step towards improving companies’ collective understanding of their digital marketing efforts.

Eventually marketers will have access to robust warehouses of data detailing consumer interaction with online media and advertising, but the adage “you must walk before you can run” is as true in digital marketing as it is in life.  Before business owners and marketers become fully equipped to benefit from complex marketing mix analysis of online and offline channels, they are well advised to address the campaign attribution problem to increase the return on their valuable dollars spent for online marketing efforts.

Thanks to the fine folks at Coremetrics you can read all about Appropriate Attribution and learn how you can start to get a better understanding of your online marketing efforts today.

Download your copy of the Appropriate Attribution paper from Coremetrics today.

General, Presentation

PowerPoint / Presentations / Data Visualization

I wrote a post last week about PowerPoint and how easy it is to use it carelessly — to just open it up and start dumping in a bunch of thoughts and then rearranging the slides. That post wound up being, largely, a big, fat nod to Garr Reynolds / Presentation Zen. Since then, I’ve been getting hit right and left with schtuff that’s had me thinking more broadly about effective communication of information in a business environment:

Put all of those together, and I’ve got a mental convergence of PowerPoint usage, presenting effectively (which goes well beyond “the deck”), and data visualization. These are all components of “effective communication” — the story, the content, how the content is displayed, how the content is talked to. In one of Reynolds’s sets of sample slides, you can clearly see the convergence of data visualization and PowerPoint. And, even he admits that this is a tricky thing to post…because it removes overall context for the content and it removes the presenter. Clearly, there are lots of resources out there that lay out fundamental best practices for effectively communicating in a presentation-style format. Three interrelated challenges, though:

  • The importance of learning these fundamentals is wildly undervalued — it sounds like Abela’s book tries to quantify this value through tangible examples…but it’s a niche book that, I suspect, will not get widely read by the people who would most benefit from reading it
  • “I need to put together a presentation for <tomorrow>/<Friday>/<next week>” — we’re living under enormous time pressure, and it’s incredibly easy to get caught up in “delivering a substantive deliverable” rather than “effectively communicating the information.” When I think about the number of presentations that I’ve developed and delivered over the past 15 years, the percentage that were truly effective, compelling, and engaging is abysmally small. And that’s a waste.
  • Culture/expectations — every company has its own culture and norms. For many companies, the norms regarding presentations are that they are linear, slide-heavy, logically compiled, and mechanically delivered affairs. For recurring meetings, there is often the “template we use every month” whereby the structure is pre-defined, and each subsequent presentation is an update to the skeleton from the prior meeting. Walk into one of those meetings and deliver a truly rich, meaningful presentation…and you’re liable to be shuttled off for a mandatory drug test, followed by a dressing down about “lack of proper preparation” because the slides were not sufficiently text/fact/content-heavy. <sigh>

What’s interesting to me is that I have spent a lot of time and energy boning up on my data visualization skills over the past few years. And, even if it takes me an extra 5-10 minutes in Excel, I never send out something that doesn’t have data viz best practices applied to some extent. As you would expect, applying those best practices is getting easier and faster with repetition and practice. So, can I do the same for presentations? And, again, that’s presentations-the-whole-enchilada, rather than presentations-the-PowerPoint-deck. Can I balance that with cultural norms — gently pushing the envelope rather than making a radical break? Can you? Should you?

General

PowerPoint the Application vs. the Application of PowerPoint

Slightly off-topic for this blog, and a little dated, but worth sharing nonetheless.

During a discussion with a couple of my co-workers today, I made an observation about how my current company, as well as one of the major consulting firms we use, seem to really be in love with PowerPoint as the documentation/presentation/communication/general-purpose tool of choice. This prompted an immediate and emphatic response from one of those co-workers, who insisted that he “loved PowerPoint.” 

The exchange reminded me of the news last year that Katsuaki Watanabe, the President and CEO of Toyota, had decreed that employees stop using PowerPoint for the creation of documents. Garr Reynolds (aka, Presentation Zen…master), had a great take on the news. A couple of the highlights:

  • To be clear, Watanabe did not “ban PowerPoint use,” as was mis-circulated at the time
  • Watanabe did severely discourage the use of PowerPoint as a documentation tool — Reynolds calls these “slideuments” (slides + documents), which is a wickedly apt designation (and the core of this post)
  • “…visuals projected on a screen in support of a live talk are very different from material that is to be printed and read and/or analyzed.”

And, a longer excerpt that is also key:

…there is often no distinction made between documents (slideuments made in PowerPoint) and presentation slides prepared for projection. They are often interchangeable. Sounds efficient, right? And it would be funny if it was not so inefficient, wasteful, and unproductive. The slideuments produced in Japan make understanding and precision harder when printed, and when used for projected slides in a darkened conference room, they are the country’s number one cure for insomnia. 

This was fundamentally the distinction that I was trying to get my co-worker to understand…without much luck. He’s clearly got some PowerPoint chops — he kept pulling up different slides he had done that had intricate builds and snazzy palettes and templates. But, the slides he was most proud of were heavily laden with annotations and text — they were standalone, comprehensive pictorial representations of complex concepts or systems.

Once he let loose with, “The point of PowerPoint is not the retention of the information — it’s the ‘wow’ factor,” I admitted defeat.

The title of this post is really the gist of my thesis here: PowerPoint the application is not the same thing as the application of PowerPoint. All too often, we don’t make that distinction. As Reynolds puts it, “Slideware is not a method, it’s simply a kind of tool.”

Think of a sledgehammer. It’s a tool — an application, if you will. But, it can be applied for vastly different purposes:

  • Used with a wedge to split firewood
  • Used to drive a metal fencepost into the ground
  • Used to prop open a door that keeps blowing shut

These are very different applications of the tool, and you would be clear as to what it was you were trying to accomplish when you hiked it over your shoulder and headed off to the task at hand.

It's not what the software does...
[Cartoon by Hugh MacLeod — see oodles more at gapingvoid.com]

The same holds true for PowerPoint. It has several different distinct possible uses…and it’s worth being clear as to which one you are tackling:

  • A live, in-person demonstration — think simple, minimalist visual backup that supports an engaging presenter without distracting from what he/she is saying; think Steve Jobs (and, if you’re not familiar, check out one of the Presentation Zen posts on that subject)
  • A live, online presentation via a webinar or web conferencing solution — this is a stickier wicket, in a lot of ways; it’s tempting to hedge against technology quirks by distributing the .ppt/.pptx file to all of the attendees via e-mail so they can simply pull up the deck and follow along, but this can be problematic, as the audience can then jump ahead and jump back. Generally, this sort of presentation is “the best alternative we have” when, ideally, you’d be doing a live, in-person demonstration. I would think this means the same minimalist approach described in the prior bullet would apply.
  • Documentation never intended to be presented — slideuments — these really are problematic and should be avoided. 

All too often, there is a blurring of all three of these: a live presentation for some people, while other people are participating remotely, and the “presenter” has distributed the presentation as a handout that has all of the detail that he/she is going to present. That leaves the participants cognitively vacillating between listening to the presenter’s words and reading through the detail in the presentation that is either being projected or is printed in front of them. It’s just not effective. Make the presentation a presentation. If there is supplemental detail or review material, put that in a document and distribute it separately — before, during, or after the presentation. Let the presentation be truly visual and let it support the concepts and information being presented, with an emphasis on the concepts.

Aside: To bridge back to the topic of this site, I’ve even seen PowerPoint used as a poor man’s BI presentation tool: PowerPoint 2007 linked to Excel 2007, which was in turn linked to Access 2007, which was in turn hooked into a SQL Server database. On the one hand…<shudder>. On the other hand, when it came to a portable (once the link to Excel was removed), shareable report, it wasn’t half bad! (Our intent was for it to also be a prototype that we could iterate on quickly as we developed requirements for a true BI tool…but that didn’t pan out for other reasons.)
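For the curious, here is a minimal sketch of what the data end of such a chain could look like. It is not the actual setup described above; sqlite3 stands in for the Access/SQL Server back end, and the orders table and its columns are hypothetical. The query result lands in an Excel file that a deck (or a person) can then link to:

```python
import sqlite3

import pandas as pd

# sqlite3 stands in for the SQL Server/Access back end described above;
# the "orders" table and its columns are hypothetical.
conn = sqlite3.connect("reporting.db")  # swap in your real connection

monthly = pd.read_sql_query(
    """
    SELECT strftime('%Y-%m', order_date) AS month,
           SUM(revenue)                  AS revenue
    FROM orders
    GROUP BY month
    ORDER BY month
    """,
    conn,
)

# Write the summary to an Excel file; the deck pulls from this file, so the
# live data link can be severed before the report is shared.
monthly.to_excel("monthly_revenue.xlsx", index=False)
```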

So, that’s my mini-rant. It’s a problem. A clear problem. But, not one that I intend to solve. If you’re interested in thinking more about the topic, check out:

  • Presentation Zen (obviously)
  • Laura Fitton / Pistachio Consulting — you can just look at her posts that have the presentation tag
  • For the militant/extreme death-to-PowerPoint take, there’s the inimitable Edward Tufte…but he really does go a bit overboard 
General

Wanted: Senior Analyst with Marketing Chops for a 4-6 Month Contract

I got pinged this morning by a colleague who is looking for someone with marketing chops (preferably B2B) and fairly advanced analytic skills for a 4-6 month engagement. The engagement would require heavy travel on the front end, and the main deliverable is a model that the client can use to assess the effectiveness of various marketing programs when it comes to driving service renewals.

I don’t have much more detail than that, but if this piques your interest, drop me a line at tim at gilliganondata dot com and I’ll put you in touch with someone who can provide more information.

Analytics Strategy, Conferences/Community, General

Unique Visitors ONLY Come in One Size

Back in January I published a note about the proposed IAB Audience Reach Measurement Guidelines that generated a fair amount of interest. At the time I applauded the IAB for providing guidance regarding the definition of a “unique user” or “unique visitor” while noting some concerns about how the proposed definition would actually manifest. In summary, the new IAB definition of “unique visitor” needed some basis in underlying data, grounded in secondary research, that can be directly tied to “a person.”  Now that the IAB Audience Reach Measurement Guidelines have been officially published we can use the IAB’s own words:

“… in order to report a Unique User, the measurement organization must utilize in its identification and attribution processes underlying data that is, at least in reasonable proportion, attributed directly to a person” and “In no instance may a census measurement organization report Unique Users purely through algorithms or modeling that is not at least partially traceable to information obtained directly from people, as opposed to browsers, computers, or any other non-human element.” (Section 1.2.4)

The last little bit references, I believe, the IAB’s distinction of four types of unique “countables” — Unique Cookies (Section 1.2.1), Unique Browsers (1.2.2), Unique Devices (1.2.3) and Unique Users or Unique Visitors (1.2.4).  The term “measurement organization” was a little, well, mystifying as was evidenced in my January post, and sadly the final document does little to clarify this term other than to say the “document is principally applicable to Internet Publishers, Ad-serving organizations, Syndicated Measurement Organizations and auditors” on the IAB web site.

This definition is important since in my last post the real conundrum appeared to be that if “measurement organization” included Omniture, WebTrends, Google, Coremetrics, etc. then the IAB was essentially saying that the vendors needed to change the way they reported Unique Visitors, at least for their clients who would be subject to the purview of the IAB and MRC.  What’s more, George Ivey from MRC never got back to my repeated requests for information, despite two members of the IAB working group (Josh Chasin from comScore and Pete Black from BPA Worldwide) openly disagreeing in their interpretation of the definition …

Well, a few weeks back I got a call from Joe Laszlo, an old co-worker of mine at JupiterResearch who is now the IAB’s Director for Analytics, the guy basically responsible for the document.  I always liked Joe and it was nice to hear from him again.  And Joe did clarify for me what a “measurement organization” is … he just didn’t directly clarify the impact on web analytics vendors.

According to Joe (and he will surely correct me publicly if I am misinterpreting our conversation) the “measurement organizations” that should be guided by this new definition of “Unique Users” are publishing organizations who are outwardly reporting their metrics for consideration by advertisers in the open market. Companies like AOL, Weather.com, ESPN, etc.  This is, I think, much more clear than the sentence a few paragraphs up that includes “Syndicated Measurement Organizations and auditors” and puts at least this part of the document in context: Essentially when using numbers coming from census-based systems, the IAB and MRC want publishers to start reporting Unique Visitor counts that have some basis in reality.

Pretty hard to disagree with Joe and the IAB on that point. We all pretty much agree that cookie-based visitor counting is messed up, and I think we can even agree that the degree to which these counts are “messed up” is a function of the target audience, the duration under examination, and the type of site.  For example, we expect cookie-based counts on sites that attract highly technical users on a daily basis to be much more impacted over a 90-day measurement period than, say, sites that attract largely non-technical users on a monthly basis over the same 90-day period.
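To make that concrete, here is a back-of-the-envelope sketch of how cookie churn inflates a census-based count. The visit frequencies and deletion rates are invented purely for illustration; the point is that the same people, with the same deletion behavior, produce very different inflation depending on how often they visit:

```python
# Back-of-the-envelope sketch of how cookie churn inflates "Unique Visitors."
# The visit frequencies and deletion rates below are invented for illustration,
# not measured values.
def unique_cookies_per_person(visits_in_period, deletion_rate_per_visit):
    """Expected cookies one person generates in the measurement period.

    One original cookie, plus a fresh cookie for each deletion during the
    period (approximated as visits * per-visit deletion rate).
    """
    return 1 + visits_in_period * deletion_rate_per_visit

# 90-day measurement period, 5% chance a given visit follows a cookie wipe.
techie = unique_cookies_per_person(visits_in_period=90, deletion_rate_per_visit=0.05)  # daily visitors
casual = unique_cookies_per_person(visits_in_period=3, deletion_rate_per_visit=0.05)   # monthly visitors

print(f"Technical, daily-visit audience:  ~{techie:.2f} cookies reported per person")
print(f"Non-technical, monthly audience:  ~{casual:.2f} cookies reported per person")
# Same people, same deletion behavior: roughly 5.5x inflation vs. 1.15x over 90 days.
```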

So I’ll make one really bold statement right now, the kind that I have a tendency to regret but hey, it’s Monday and I’m feeling pretty good about the coming week:

The IAB is to be applauded for taking such a bold stand on the subject of counting and reporting unique visitors based on what we traditionally consider “web analytic” data.

I said as much in my last post … right after I said that the likelihood of the web analytics vendors following these recommendations was about the same as everyone waking up tomorrow to realize that the financial meltdown was a bad dream and the Dow is still over 14,000 (zero). The team of folks that the IAB brought together, which I understand included both Omniture and WebTrends, should be congratulated for taking a firm stand on one of the most dogged issues plaguing our collective industries (web analytics, online advertising, online publishing, syndicated research, etc.) for at least the past five years.

It is about time that we all agreed that “Unique Visitor” reports coming from census-based technologies frequently have no basis in reality. Further, we should all admit that cookie deletion, cookie blocking, multiple computers, multiple devices, etc. have enough potential to distort the numbers as to render the resulting numbers useless when used to quantify the number of human beings visiting a site or property.

Yes, before you give me grief with your “but they are probably directionally correct” response: I agree with you, they probably are. But fundamentally I believe that advertising buyers are at least as interested in the raw numbers as they are in the direction those numbers are moving. I say “probably are” because if you’re not taking the IAB’s advice and reconciling census-based data with data derived directly from people, well, you’re never sure whether that change in direction is because your audience is changing, technology is changing, or there is a real and substantial increase or decline.

I mentioned above that my conversation with Joe didn’t really clarify the impact on web analytics vendors under the IAB’s new definition. Since I spent a fair amount of time thinking about the IAB guideline’s impact in this regard, I will make another bigger and bolder statement:

Starting immediately, I think the web analytics vendors and any company reporting a cookie-based count that is not in compliance with the IAB’s definition of “Unique Visitor” should stop calling said metric “Unique Visitors (or Users)” and correctly rename the metric “Unique Cookies”.

Yep, I am 100% in favor of using the IAB’s new terminology and being semantically precise whenever possible. The “Unique Visitor” counts in the popular web analytics applications are always actually counting cookies and so we should just go ahead and say that explicitly by calling them “Unique Cookies”. This change would actually give the web analytics vendors a neat opportunity … to battle to be the first to have a real “Unique Visitor” count that is based, as the IAB has suggested, on underlying data that is, at least in reasonable proportion, attributed directly to a person.

How could they do this? Let me count the ways:

  1. Develop a standard practice around the use of log-in and registered user data
  2. Work with third-party partners who are focused on gathering more qualitative data (for example, Voice of Customer vendors like ForeSee Results)
  3. Work with third-party partners who are estimating cookie-deletion rates, or at least have the potential to (for example, Quantcast)
  4. Work with third-party partners who can actually calculate cookie-deletion and multiple-machine use rates with some accuracy (for example, comScore, Google, Yahoo!)

I’m sure there are a few ways I am not thinking of, but these are the big four that have been talked about since 2005. While I expect to get some grief from paying clients about this statement, and I fully expect my suggestion to be widely ignored by the vendor community (no offense taken), I think this change would be a big step towards the recognition that there is only ONE DEFINITION of a “Unique Visitor” and this definition is only tangentially related to the number of cookies being passed around.

Like Soylent Green(TM), “Unique Visitors” are PEOPLE and our industry will go a long way towards maturation when we collectively agree on this fundamental truth.  That is not to say that Unique Cookies is not a valuable count — hell, in the absence of a strategy for reconciling cookies against people-based data, unique cookies are all we have. But I do not believe that after nearly 15 years we are doing the online measurement community any justice by plugging our ears and singing “LA LA LA LA I CANNOT HEAR YOU GO AWAY!!!!!”

Which brings me to my last point …

I was really, really bummed out to read Jodi McDermott’s MediaPost article titled “Unique Visitors Come in Two Shapes and Sizes.” I was bummed because I have always liked Jodi since we worked together at Visual Sciences, because I think she is a brilliant member of our community, and because I knew I was going to end up writing these words … Jodi’s thesis is wrong and does the web analytics community a disservice in attempting to defend a mistake by asking to water down a good definition just because it isn’t “hers” (in quotes since Jodi is a member of a larger committee charged with defining standards within the WAA.)

From Jodi’s article (which I recommend you read, especially the comments, and the emphasis is mine):

Bravo to the IAB for forcing the issue with audience measurement companies to standardize the way that they report uniques, but from a Web analyst’s perspective — and as a member of the WAA Standards committee — I wish they would have not allowed the term “unique visitors” to be redefined in such a way as to allow for multiple definitions in the space. Web analysts and media planners today have a hard enough time trying to figure out which data source to use and which standard to apply when performing their job — but that issue is now compounded even more by multiple definitions of unique visitors. In defense of the IAB, its membership is comprised of some heavy-hitter companies who are not about to change that “tab” in their reporting UI that says “Unique Visitors” on it. But in defense of WAA individual and company members, which include vendors such as Omniture and WebTrends (who were both listed as “Project Participants” on the IAB document, interestingly enough), neither are we. The term will live on in both places.

I think what Jodi has missed here is that the IAB has actually given the world a useful and more accurate definition of “Unique Visitors” than any used in the web analytics industry today. More importantly, given the relative weight, clout, and respect enjoyed by the IAB in the wider world, I don’t think their definition allows for “multiple definitions” … I rather think that over time the IAB expects their member companies, especially those who want to have their numbers audited and publicly used, will consider the IAB definition the definition of “Unique Visitors” and properly consider the term we web analysts widely use today to be “Unique Cookies.”

I’m not sure what Jodi means by “heavy-hitter companies who are not about to change their ‘tab’” since I’m aware of very few companies today that have implemented the IAB recommendation for practical and ongoing use. But I was incredulous when I read the statement regarding using the IAB’s new definition: “in defense of the WAA individual and company members, which include vendors such as Omniture and WebTrends, neither are we. The term will live on in both places.”

Seriously? Rather than start calling our cookie counts “Unique Cookies” and having a rational conversation with our bosses to explain that the technology we use is limited in its ability to discern real people, you prefer to throw down the gauntlet with the IAB and say “screw your definition?” Despite the criticism that has been both wrongly and rightly heaped on the WAA’s “standard” definitions, despite the considerable group that crafted the IAB’s definitions, and considering the fact that the WAA’s definition is wrong, you want to pick a fight?

Two wrongs never make a right, and you’re wrong twice here. Sorry.

I am not on the WAA Standards Committee, I am not on the WAA Board of Directors, and my dues with the WAA are about to lapse so I have no basis for representing the organization. Perhaps I am reading more into Jodi’s post than I should, given my knowledge of her passionate work in the WAA, but I would strongly encourage the current Board of Directors to examine Jodi’s statements in the context of the IAB relationship and the “bigger picture” at play.  Because while Jodi may speak for the WAA Standards Committee and by extension the entire WAA, she certainly does not speak for me.

I will gladly use the term “Unique Cookies” when I am talking about a cookie-based count and reserve the term “Unique Visitors” for those situations where I have some basis for doing so. More importantly I will encourage my clients and vendor friends to consider doing the same. The IAB has given the entire measurement community a reason to take a huge leap forward and gain clarity around one of our most important metrics. To turn our back on this opportunity because it will necessitate change, require additional explanation, or because “we like our definition better” is wrong, wrong, wrong.

Harrrrumph.

I suspect like previous posts on the subject this will generate some conversation. As usual I do not pretend to have all the answers and I welcome your feedback. I am, unfortunately, traveling all day Monday and will have limited ability to approve and respond to comments but I promise to do so as quickly as possible.

Analysis, Analytics Strategy, Excel Tips, General, Presentation, Reporting

The Best Little Book on Data

How’s that for a book title? Would it pique your interest? Would you download it and read it? Do you have friends or co-workers who would be interested in it?

Why am I asking?

Because it doesn’t exist. Yet. Call it a working title for a project I’ve been kicking around in my head for a couple of years. In a lot of ways, this blog has been and continues to be a way for me to jot down and try out ideas to include in the book. This is my first stab at trying to capture a real structure, though.

The Best Little Book on Data

In my mind, the book will be a quick, easy read — as entertaining as a greased pig loose at a black-tie political fundraiser — but will really hammer home some key concepts around how to use data effectively. If I’m lucky, I’ll talk a cartoonist into some pen-and-ink, one-panel chucklers to sprinkle throughout it. I’ll come up with some sort of theme that will tie the chapter titles together — “myths” would be good…except that means every title is basically a negative of the subject; “Commandments” could work…but I’m too inherently politically correct to really be comfortable with biblical overtones; an “…In which our hero…” style (the “hero” being the reader, I guess?). Obviously, I need to work that out.

First cut at the structure:

  • Introduction — who this book is for; in a nutshell, it’s targeted at anyone in business who knows they have a lot of data, who knows they need to be using that data…but who wants some practical tips and concepts as to how to actually go about doing just that.
  • Chapter 1: Start with the Data…If You Want to Guarantee Failure — it’s tempting to think that, to use data effectively, the first thing you should do is go out and query/pull the data that you’re interested in. That’s a great way to get lost in spreadsheets and emerge hours (or days!) later with some charts that are, at best, interesting but not actionable, and, at worst, not even interesting.
  • Chapter 2: Metrics vs. Analysis — providing some real clarity regarding the fundamentally different ways to “use data.” Metrics are for performance measurement and monitoring — they are all about the “what” and are tied to objectives and targets. Analysis is all about the “why” — it’s exploratory and needs to be hypothesis driven. Operational data is a third way, but not really covered in the book, so probably described here just to complete the framework.
  • Chapter 3: Objective Clarity — a deeper dive into setting up metrics/performance measurement, and how to start with being clear as to the objectives for what’s being measured, going from there to identifying metrics (direct measures combined with proxy measures), establishing targets for the metrics (and why, “I can’t set one until I’ve tracked it for a while” is a total copout), and validating the framework
  • Chapter 4: When “The Metric Went Up” Doesn’t Mean a Gosh Darn Thing — another chapter on metrics/performance measurement. A discussion of the temptation to over-interpret time-based performance metrics. If a key metric is higher this month than last month…it doesn’t necessarily mean things are improving. This includes a high-level discussion of “signal vs. noise,” an illustration of how easy it is to get lulled into believing something is “good” or “bad” when it’s really “inconclusive,” and some techniques for avoiding this pitfall (such as using simple, rudimentary control limits to frame trend data; there’s a minimal sketch of that idea just after this list).
  • Chapter 5: Remember the Scientific Method? — a deeper dive on analysis and how it needs to be hypothesis-driven…but with the twist that you should validate that the results will be actionable just by assessing the hypothesis before actually pulling data and conducting the analysis
  • Chapter 6: Data Visualization Matters — largely, a summary/highlights of the stellar work that Stephen Few has done (and, since he built on Tufte’s work, I’m sure there would be some level of homage to him as well). This will include a discussion of how graphic designers tend to not be wired to think about data and analysis, while highly data-oriented people tend to fall short when it comes to visual talent. Yet…to really deliver useful information, these have to come together. And, of course, illustrative before/after examples.
  • Chapter 7: Microsoft Excel…and Why BI Vendors Hate It — the BI industry has tried to equate MS Excel with “spreadmarts” and, by extension, deride any company that is relying heavily on Excel for reporting and/or analysis as being wildly early on the maturity curve when it comes to using data. This chapter will blow some holes in that…while also providing guidance on when/where/how BI tools are needed (I don’t know where data warehousing will fit in — this chapter, a new chapter, or not at all). This chapter would also reference some freely downloadable spreadsheets with examples, macros, and instructions for customizing an Excel implementation to do some of the data visualization work that Excel can do…but doesn’t default to. Hmmm… JT? Miriam? I’m seeing myself snooping for some help from the experts on these!
  • Chapter 8: Your Data is Dirty. Get Over It. — CRM data, ERP data, web analytics data, it doesn’t matter what kind of data. It’s always dirtier than the people who haven’t really drilled down into it assume. It’s really easy to get hung up on this when you start digging into it…and that’s a good way to waste a lot of effort. Which isn’t to say that some understanding of data gaps and shortcomings isn’t important.
  • Chapter 9: Web Analytics — I’m not sure exactly where this fits, but it feels like it would be a mistake to not provide at least a basic overview of web analytics, pitfalls (which really go to not applying the core concepts already covered, but web analytics tools make it easy to forget them), and maybe even providing some thoughts on social media measurement.
  • Chapter 10: A Collection of Data Cliches and Myths — This may actually be more of an appendix, but it’s worth sharing the cliches that are wrong and myths that are worth filing away, I think: “the myth of the step function” (unrealistic expectations), “the myth that people are cows” (might put this in the web analytics section), “if you can’t measure it, don’t do it” (and why that’s just plain silliness)
  • Chapter 11: Bringing It All Together — I assume there will be such a chapter, but I’m going to have to rely on nailing the theme and the overall structure before I know how it will shake out.
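Since Chapter 4 leans on it, here is a minimal sketch of the rudimentary-control-limits idea. The monthly values are invented, and mean plus-or-minus three standard deviations is just one simple way to frame the band; the point is that a single up-or-down month usually lands inside the noise:

```python
# Minimal sketch of "rudimentary control limits": frame a trended metric with
# mean +/- 3 standard deviations so one up-or-down month isn't over-interpreted.
# The monthly values below are invented for illustration.
from statistics import mean, stdev

visits = [12100, 11800, 12400, 11950, 12300, 12050,
          11700, 12500, 12250, 11900, 12150, 12350]

center = mean(visits)
sigma = stdev(visits)
upper, lower = center + 3 * sigma, center - 3 * sigma

latest = 12800  # this month's number
if lower <= latest <= upper:
    print(f"{latest} sits inside {lower:.0f}-{upper:.0f}: probably noise, not a trend.")
else:
    print(f"{latest} falls outside {lower:.0f}-{upper:.0f}: likely a real change worth digging into.")
```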

What do you think? What’s missing? Which of these remind you of anecdotes in your own experience (haven’t you always dreamed of being included in the Acknowledgments section of a book? Even if it’s a free eBook?)? What topic(s) are you most interested in? Back to the questions I opened this post with — would you be interested in reading this book, and do you have friends or co-workers who would be interested? Or, am I just imagining that this would fill a gap that many businesses are struggling with?

Adobe Analytics, Analytics Strategy, General, Social Media

Omniture, Europe, SAS, WebTrends, and Twitter!

You may be wondering “What do those things have in common?” You may also be wondering “Did Eric drop off the face of the Earth?” The answer to the first question is the explanation to the second …

Despite changes in Analytics Demystified’s client portfolio–changes that I believe accurately reflect the current economic climate–we are busier than ever here in Portland, Oregon.  Or rather not in Portland, Oregon as Q1 2009 has me bouncing around the globe to talk about web analytics, something I enjoy tremendously.

World Tour 2009 (Part I) got started a few weeks back at the Omniture Summit in Salt Lake City, Utah. If you haven’t been to an Omniture Summit, assuming you are an Omniture, WebSideStory, Visual Sciences, Instadia, Mercado, Offermatica … am I forgetting anyone?! … customer, I definitely recommend attending if you have the chance. Aside from excellent production and plenty of attention to detail I felt like Omniture did a great job on the content, something they took some criticism for in years past. The break-out sessions I saw paired an Omniture employee with a customer, analyst, or industry leader and in general the result was informative without being overly sales-y.

Perhaps the thing I enjoyed the most was that, despite my occasional open criticism of Omniture and some of their practices, senior management seemed (or at least pretended) to be happy enough to see me.  I had a wonderful conversation with President of Sales Chris Harrington, spent some time with Gail Ennis and John Mellor, and even got to share Swedish Fish with Brett Error (who is now on Twitter as @bretterror). Even Josh James and I had a chance to catch up … but no, I didn’t hug it out with Matt Belkin 😉

The World Tour continues here in Portland, then off to Milan, Madrid, and Washington, D.C. Locally I am excited to get to present at SearchFest 2009, but I have to admit I’m somewhat more excited about my first trip to Milan, Italy for Web Analytics Strategies 2009 and my first return to Madrid in several years. Perhaps most exciting, following a special presentation with MV Consultoria, I will get to meet Rene and Aurelie’s new baby Lucca! After a brief return home (to spend time reading with my five-year-old daughter who has recently adopted her dad’s great love for reading) I fly to D.C. to deliver a keynote presentation at the SAS Global Forum.

And that is just the beginning. You can see the complete schedule under “Consulting” at Analytics Demystified, and I am actively booking conferences and presentations in June and July.

Which brings me to Twitter …

I wouldn’t say I was an early adopter of Twitter, not by a long shot. I actually met co-founder Biz Stone in Rotterdam and admitted “No, I don’t really understand the service …” I was eventually goaded into trying Twitter by Aaron Gray of WebTrends and started seeing the inherent value after getting people to use the #wa hashtag to identify web analytics (and Washington State) related content.

Of course, if you know me, you know I was unlikely to stop there …

After a short beta test with something I called the “Twitter Influence Calculator”, last week I rolled out The Twitalyzer. With tongue in cheek I have described the service as “Google Analytics for Twitter” and by all measures the service has taken off. To date nearly 20,000 unique Twitter users have tried the service, which summarizes your use of Twitter and provides a handful of interesting measures of success (influence, generosity, velocity, clout, and signal-to-noise ratio).

Rather than spend a bunch of time telling you about it I encourage you to check it out at http://twitalyzer.com
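If you are curious what a “signal-to-noise” style measure might look like under the hood, here is a purely hypothetical illustration. It is not the actual Twitalyzer formula; it simply treats tweets that reference something (a link, @mention, hashtag, or retweet) as signal and everything else as noise:

```python
# Purely hypothetical "signal-to-noise" illustration (not the actual Twitalyzer
# formula). Tweets that reference something -- a link, @mention, hashtag, or RT --
# count as signal; everything else counts as noise.
import re

SIGNAL_PATTERN = re.compile(r"https?://\S+|@\w+|#\w+|\bRT\b")

def signal_to_noise(tweets):
    """Return the share of tweets (0.0 to 1.0) containing at least one reference."""
    if not tweets:
        return 0.0
    signal = sum(1 for tweet in tweets if SIGNAL_PATTERN.search(tweet))
    return signal / len(tweets)

sample = [
    "Great write-up on unique visitor counting http://example.com/iab",
    "RT @someanalyst: cookie deletion is still with us #measure",
    "Heading to lunch.",
    "Anyone else at the summit this week? #omtr",
]
print(f"Signal-to-noise: {signal_to_noise(sample):.0%}")  # 3 of 4 tweets carry a reference -> 75%
```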

While I have been incredibly busy between these travels, client work, writing proposals, and messing with Twitter I am of course always happy to hear from readers. Send email, Twitter me (@erictpeterson), or look for me at one of the conferences above!

Adobe Analytics, Analytics Strategy, General

Free webinars on February 11th and 12th

If you are one of the many, many people out there who have been told in no uncertain terms that there is “no travel budget” to attend conferences in the near future, and you’re bummed out about missing out on some great learning opportunities, Analytics Demystified has a great solution! Rather than mope around the office, complaining about missing Ian Ayres at WebTrends Engage or Maroon 5 at the Omniture Summit, why not join Analytics Demystified, Forrester Research, Coremetrics, and Tealeaf for two free webinars next week?

And who doesn’t love “free?”

The first webinar will be held at 10 AM Pacific next Wednesday, February 11th and is sponsored by the nice folks at Coremetrics. The topic is campaign attribution, and while the “official title” of the presentation is “Effectively Managing Your Online Marketing Mix with Advanced Attribution” my personal subtitle for the event is “How LAST-Based Attribution is Wrecking Your Marketing Budget (and What To Do About It!)”

While I love the topic, I’m doubly excited about this webinar since I get to co-present with John Lovett from Forrester Research. The best thing about presenting with John is that he is never shy about his opinion and we frequently get into “heated” debates, even in front of a live audience. We will also be joined by Coremetrics own John Squire, a gentleman also known for his willingness to have his opinion heard.

If you’d like to participate in this webcast with Analytics Demystified, Forrester Research, and Coremetrics please register for this totally free event through Coremetrics.

The second webinar will be held at 9 AM Pacific next Thursday, February 12th and is sponsored by the nice folks at Tealeaf. The topic here is the Web Site Optimization Ecosystem that I first described with Tealeaf and Foresee Results back in 2007. The ecosystem is a great topic when times are tough since most companies have at least two of the technologies we’ll discuss (web analytics, voice of customer, customer experience management, testing, personalization) but have done very little to actually integrate the systems.

I will be joined on the call by Geoff Galat, Tealeaf’s VP of Marketing and Product Strategy. If you’d like to listen to Geoff Galat from Tealeaf and me, please register for this totally free webcast through Tealeaf.

So there you have it. Two great topics, five smart presenters, one best-of-all-prices … FREE.

I hope you’ll be able to join us, and as always, please keep up with Analytics Demystified via our web analytics events page at http://www.analyticsdemystified.com.

Adobe Analytics, Analytics Strategy, Conferences/Community, General

Analytics Demystified speaking engagements

One of the things I love the most about my job is the work I do as a professional speaker and industry evangelist. While the economy is not showing any great signs of improving, the speaking circuit shows no signs of slowing down.  Here is a summary of some of the web analytics events Analytics Demystified will be at between now and the end of the summer, keeping in mind that more events will undoubtedly be added.  If you’d like to have Eric T. Peterson speak at your event please contact us directly.

Online Marketing Summit, San Diego, February 5th

At the Online Marketing Summit I will be giving a presentation titled “Attribution, Influence, and Engagement: The Digital Marketer’s New Nightmare” on Thursday, February 5th. We will also be having a special Web Analytics Wednesday on Thursday event at the Westin Hotel downtown. Learn more about OMS and sign-up to join us at Web Analytics Wednesday.

The MindMeld at the Omniture Summit, Salt Lake City, February 17th

As part of the Omniture Summit, Matt Langie has organized an “X Change”-like, invitation-only event called the “MindMeld(TM)”. Co-hosted by both Jim Sterne (WAA, eMetrics) and John Lovett (Forrester Research), the afternoon event will attempt to debate and develop solutions for Social Media measurement, Mobile and Video, and discuss how we can collectively elevate the role of analytics in the organization.

Given the somewhat tumultuous history I have with this organization’s management team I am honored that Matt invited me and I am looking forward to seeing the pageantry of the Omniture Summit first-hand. If you’d like an invitation to the MindMeld please let me know.

SearchFEST, Portland, March 10th

SearchFEST is a full-day search marketing conference hosted by SEMpdx and the fine folks at Anvil Media. This year’s keynote speaker is the great Danny Sullivan and I am honored to be on an analytics panel with Portent Interactive’s Ian Lurie and Widemile’s Bob Garcia. Hallie Jansen at Anvil was kind enough to give me a discount code so if you’d like to join us at SearchFEST and save a little money, reach out directly and I will hook you up!

Web Analytics Strategies, Milan Italy, March 17th

At the Web Analytics Strategies Conference and Expo I am a keynote speaker and listed as a “special GURU,” which is nice (although it makes me feel kind of old). I believe this is the first Web Analytics Expo in Milan and so I’m doubly excited about giving my presentation on “Competing on Web Analytics.” If you’re a reader of my blog in Italy and can make this event please contact me so that we might meet for coffee! I believe we will also be having a special Web Analytics Wednesday event the evening of the 17th in Milan so watch the Web Analytics Wednesday calendar.

Los Desayunos MVC con Analitica Web, Madrid Spain, March 18th

At this special web analytics event in Madrid, Spain I will be speaking on the “New Measures for Online Marketing” with the delightful Sergio Maldonado.  I’m especially excited to be returning to Madrid since my very good friends Rene and Aurelie will be coming down from Brussels with their young baby Lucca who I have not had the pleasure of meeting yet. If you’re in Madrid, let’s meet for sangria, shall we?

SAS Global Forum, Washington, March 23rd

At the SAS Global Forum I am honored to be giving a keynote presentation on “Competing on Web Analytics.” I’m very excited about presenting at the SAS conference and hope to connect at the conference with web analytics professionals who are also using SAS.  If you’ll be at the Global Forum and would like to meet, please let me know.

WebTrends Engage, Las Vegas, April 7th

Being a Portlander and having gotten my start at WebTrends back in the late 90’s I was already excited to be going to the Engage conference. Then the company announced that Ian Ayres, author of Super Crunchers, will be giving a keynote presentation. I love what the company is doing with their conference web site and it seems like I’m constantly seeing news on Twitter about the event. If you’re heading to Vegas for Engage, make sure to look for me by the blackjack table.

DMA ACCM 2009, New Orleans, May 4th

I’ve never had the pleasure of presenting at the Annual Conference for Catalog and Multichannel Merchants but am very excited to be giving two presentations at ACCM 2009. The first presentation will be more of an intermediate class in web analytics and the second more advanced. Both will be highly interactive and so I’m looking forward to meeting the DMA and ACCM audience.

eMetrics Marketing Optimization Summit, San Jose, May 6th

The big event, the really big show, the grand-daddy of them all … eMetrics. I am still proud to say I have never missed an eMetrics here in the USA, in part because I love the event and in part because of the profound respect I have for the conference’s organizer, Mr. Jim Sterne. I am more or less racing from New Orleans to San Jose but please look for me at eMetrics if you’d like to catch up! Plus, we will definitely be having a blow-out Web Analytics Wednesday at eMetrics this year …

Internet Retailer 2009 Conference and Exhibition, Boston, June 16th

I have always loved the Internet Retailer conference but have rarely been able to attend due to schedule conflicts. Fortunately Kurt Peters got me this year before things started picking up and so I’m happy to be presenting with Mike Fried from Backcountry.com and getting deep into the details that online retailers need in their web analytics efforts.

The X Change Conference, San Francisco, September 9, 10, and 11

I could not be more excited about this year’s X Change conference for a variety of reasons. We have great buzz after last year’s event, I have talked to dozens of past participants who have told me they’re saving their limited conference dollars for the X Change, and I love that Gary and Joel from Semphonic are always looking for ways to make a great event even better. If you’d like to join us at the web analytics industry’s premier event, or if you’d like to talk about possibly leading a huddle, please don’t hesitate to email me directly.

Well, that’s most of it, at least for now. There are a handful of webcasts I’m doing on behalf of companies like Coremetrics (with John Lovett from Forrester) and Tealeaf (on the Web Site Optimization Ecosystem) but I’ll try and cover those in another, shorter post.

General

Barack Obama should not fear cookies!

Just after President Obama was elected back in November I wrote a blog post that had been kicking around in my head for a long time calling for the “legalization” of browser cookies by Federal Government run web sites. The response to the post was great, but now it appears that the first comment from Brent Hieggelke (who was head of marketing at WebTrends for several years) was destined to become ironic.  Brent (who is my neighbor in Portland) waxed philosophical about government and cookies with this comment:

“As someone who 4 years ago spent ALL of New Years Day on the phone with the White House Communications Team because their site was “outed” by CNN and other media as using cookies in a completely innocent manner, I couldn’t agree more.”

Turns out that Jascha Kaykas-Wolff, the new head of marketing at WebTrends, is probably having the exact same conversation thanks to so-called “privacy advocates,” according to this article in InformationWeek. What’s more, the privacy advocates, rather than educating themselves about the real risks associated with the use of browser cookies, are apparently patting themselves on the back for getting the Obama administration to make a simple, cosmetic change at WhiteHouse.gov regarding the use of YouTube video.

Giving himself full credit for the change, Chris Soghoian from CNET’s “surveill@nce st@te” blog says:

“It seems that someone in the White House read my blog post yesterday–as within 12 hours of the story going live, Obama’s Web team rolled out a technical fix that severely limits YouTube’s ability to track most visitors to the White House Web site.”

Congratulations, Chris. Instead of giving the President’s team the latitude to focus on, oh, THE ECONOMY, THE THREAT OF TERRORISM, THE HOUSING CRISIS, UNEMPLOYMENT, and HEALTH CARE you single-handedly managed to force the Administration to waste their time worrying about whether or not Google was getting just a little more of the world’s data.  President Obama, in the midst of rolling out a truly revolutionary use of technology in government in an effort to get more of us personally involved in our communities, our country, and our collective future, was forced by your misguided fear-mongering to stop what his team was doing and address what has otherwise been hailed as a brilliant communication effort.

You sir, are the man.

Seriously people, can we stop worrying about cookies for a little while? Given all the other problems we have as a nation and as a global community, am I alone in thinking that people like Chris and his fellow “privacy advocates” need to find something else to focus their efforts on? Maybe if this community spent more time trying to help the President come up with ideas to put America back to work and less time creating fear, uncertainty, and doubt in the popular media we’d see the kind of change that our President has been talking about.

At this point I’m fairly confident that any person who has any shred of concern about their cookies being scraped, hijacked, poisoned, bombed, or otherwise maliciously used to expose their personal habits or ruin their lives has figured out how to clear or otherwise modify said cookies. Even though I started writing about the profile of the cookie deleter back in 2005, I’m still waiting for someone to give me a good reason to delete said objects that is not A) because you’re a site developer and you need to confirm how cookies are being set, B) you’re a web analytics specialist debugging tracking, C) you gamble a lot online or D) you surf a lot of porn.

If “A” or “B” I understand.  If “C” or “D” … don’t forget to clear your browser history too!

I’m being snarky, I know, and maybe I’m just taking Chris to task since he still has his street-cred-inducing ponytail and I cut mine off. But at this point the hand-wringing about cookies in general, much less the hand-wringing over the mandate set by OMB M-03-22, has become tedious and needs to stop. President Obama is working to change the way government works and I think his staff deserves some latitude when it comes to the Internet. If we want government sites to work for us, we need to let analytic technology work for them. If we want change, we need to be open to change.

Put another way, if you fear Google, don’t use their products. If you fear cookies, delete them. If you fear for your privacy online, don’t go online. Wear a foil hat. Don’t answer the phone. Don’t open the door. Don’t speak.

But please, people, let’s decide to take some personal responsibility on this issue and stop bugging an otherwise busy administration–whichever administration that may be. Regardless of how you feel about Barack Obama, let’s all recognize that we are facing substantially bigger challenges today than we have in recent history, and since the man was fairly elected, he deserves at least a chance to improve the economic conditions in the U.S. without “privacy advocates” forcing his staff to make tedious (and functionally meaningless) changes to the White House web site.

I know I’m going to get slammed for this post, that’s okay. Somebody needs to stand up for cookies and since I already tried “diplomatic” I suppose it’s time to try “direct.” Browser cookies help make it possible for great companies like CNET to provide lots of great content–including Chris’s blog! Browser cookies help justify great technology like Twitter, Facebook, and MySpace. Browser cookies power the Internet and should not be feared, especially not by President Obama.

I look forward to your comments and criticisms of my position.

General

Seven Things You May (or May Not) Know about Me

I got tagged for the “Seven things…” meme by Chris Wilson.

The rules:

  1. Link to your original tagger(s) and list these rules in your post.
  2. Share seven facts about yourself in the post.
  3. Tag seven people at the end of your post by leaving their names and the links to their blogs.
  4. Let them know they’ve been tagged.

So, on to the seven facts:

  1. I cut out of college partway through my last semester to hike 2,142.7 miles from Georgia to Maine on the Appalachian Trail. I proposed to my girlfriend (now wife) when she came out to pick me up at the end of the trail.
  2. I hate talking on the phone and generally won’t answer it at home.
  3. My undergraduate degree is in architecture, although it is officially a Bachelor of Science in Art and Design. Since it is abbreviated BSAD, and due to the financial prospects for those of us who earned it, we pronounced it “Be Sad.” I don’t actually have much artistic talent, and I got out of architecture once I realized how woefully lacking I was on that front. I have an enormous appreciation for people who have visual design skills. 
  4. A play I wrote in college got produced on campus and was later included as an example for the Playwrights in Performance class included in MIT’s Open Courseware project.
  5. I grew up in Sour Lake, Texas, which is where Texaco was founded (originally named Texas Company). The town’s population was at 1,807 and dropping as of the last census before I moved away, but it has been a boomtown twice: first, in the mid-1800s, as a health resort when people (including Sam Houston) came from far and wide to soak in the town’s sulphur springs (long since dried up) that gave it its name, and again in the early 1900s when oil was discovered.
  6. I am a rabid amateur baseball fan — especially focused on the Texas Longhorns and any team or league in which I have a kid participating; if I know I’m going to be able to watch an entire game, I’ll score it if I’ve got a score sheet and a pencil.
  7. My cousin, Chris, authored the first Windows version of NCSA Mosaic (the first real web browser) and later developed the first CSS implementation in Internet Explorer. He’s currently the Platform Architect of the IE Platform team at Microsoft.

And, the seven people I’ll tag to pass this along:

  • Chris Tammen because, like my cousin, he doesn’t post nearly often enough. And, he’s one of the most spontaneously hilarious people I know (for years, his resume included “Able to change many lightbulbs without the aid of a chair or stool” in the “Additional Skills” section, as he is a very tall fellow)
  • Kristin Farwell because I already know that her mother and stepfather are professional skydivers (and she’s got a half-dozen ready tales of record-breaking and noteworthy jumps her stepfather has led or participated in), because she grew up as the daughter of a quintessential free spirit, because she taught me a lot of things about the internet, and because I have two framed photos she took in Austin hanging on the wall of my office…and I suspect her list would turn up more fascinating nuggets
  • Greg Phelps because he’s an English major who has spent the last decade of his career doing hardcore database work. And he knows baseball.
  • Avinash Kaushik because all I know is that he is one of the most brilliant and pragmatic minds in web analytics, that he is extremely personable, that he’s got a degree from The Ohio State University, and that Google probably knows how lucky they are to have him helping drive the evolution of Google Analytics.
  • Amy Bills because she is really good at what she does, but, more importantly, because her perspective on the world tends to be hilariously wry and sardonic
  • Bryan Cristina because he keeps saying he’s going to blog more, and maybe this will give him a nudge to get re-started. And because the filter between what he thinks and what he says is pretty minimal, which makes for entertaining tweets, Yahoo! group responses, and conversations
  • Kevin Sasser because I met him when he was trying to sell me a content management system (and almost succeeded — would have if the product had been a better fit for our needs), and who I have since gotten to know as the author of one of the most entertaining, yet practical, blogs on how to be effective in Sales.

One person I would have tagged, but she was already tagged last year:

  • Connie Bensen — community strategist, mentor, friend, and former .357 pistol packer; when the meme hit her, it was an “8 things” meme, but substantially the same (and more reasonable than the “25 things” that’s making its way around Facebook!)