Industry Analysis, Tag Management, Technical/Implementation

Stop Thinking About Tags, and Start Thinking About Data

Nearly three weeks ago, I attended Tealium’s Digital Velocity conference in San Francisco. I’ve attended this event every year since 2014, and I’ve spent enough time using its Universal Data Hub (the name of the combined UI for AudienceStream, EventStream, and DataAccess, if you get a little confused by the way these products have been marketed – which I do), and attended enough conferences, to know that Tealium considers these products to be a big part of its future and a major part of its product roadmap. But given that the majority of my clients are still heavily focused on tag management and getting the basics under control, I’ve spent far more time in Tealium iQ than any of its other products. So I was a little surprised by the force with which my key takeaway struck me as I left the conference on the last day: tag management as we knew it is dead.

Back in 2016, I wrote about how much the tag management space had changed since Adobe bought Satellite in 2013. It’s been a while since tag management was the sole focus of any of the companies that offer tag management systems. But what struck me at Digital Velocity was that the most successful digital marketing organizations – while considering tag management a prerequisite for their efforts – don’t really use their tools to manage tags at all. I reflected on my own clients, and found that the most successful ones have realized that they’re not managing tags at all – they’re managing data. And that’s why Tealium is in such an advantageous position relative to any of the other companies still selling tag management systems while Google and Adobe give it away for free.

This idea has been kicking around in my head for a while now, and maybe I’m stubborn, but I just couldn’t bring myself to admit it was true. Maybe it’s because I still have clients using Ensighten and Signal – in spite of the fact that neither company seems to have committed many resources to its tag management product lately (they both seem much more heavily invested in identity and privacy these days). Or maybe it’s because I still think of myself as the “tag management guy” at Demystified, and haven’t been able to quite come to grips with how much things have changed. But my experience at Digital Velocity was really the final wake-up call.

What finally dawned on me at Digital Velocity is that Tealium, like many of its early competitors, really doesn’t think of itself as a tag management company anymore, either. They’ve done a much better job of disguising that, though – because they continue to invest heavily in TiQ, and have even added some really great features lately (I’m looking at you, New JavaScript Code Extension). And maybe they haven’t really had to disguise it, either, because of a single decision they made very early on in their history: the decision to emphasize a data layer and tightly couple it with all the core features of their product. In my opinion, that’s the decision by an early tag management vendor that had the greatest impact on the industry as a whole.

Most tag management vendors initially offered nothing more than code repositories outside of a company’s regular IT processes. They eventually layered on some minimal integration with a company’s “data layer” – but really without ever defining what a data layer was or why it was important. They just allowed you to go in and define data elements, write some code that instructed the TMS on how to access that data, and then – in limited cases – gave you the option of pushing some of that data to your different vendor tags.
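
As a concrete sketch (entirely hypothetical, with a made-up element ID), a “data element” in those early tools usually amounted to little more than a named JavaScript accessor that told the TMS where to scrape a value from the page:

    var dataElements = {
      pageName: function () { return document.title; },
      orderTotal: function () {
        // "order-total" is an invented element ID; real pages varied wildly
        var el = document.getElementById("order-total");
        return el ? el.textContent : undefined;
      }
    };
    // The TMS would then map dataElements.orderTotal() into vendor tags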

On the other hand, Tealium told its customers up front that a good data layer was required to be successful with TiQ. They also clearly defined best practices around how that data layer should be structured if you wanted to tap into the power of their tool. And then they started building hundreds of different integrations (i.e. tags) that took advantage of that data layer. If they had stopped there, they would have been able to offer customers a pretty useful tool that made it easier to deploy and manage JavaScript tags. And that would have made Tealium a pretty similar company to all of its early competitors. Fortunately, they realized they had built something far more powerful than that – the backbone of a potentially very powerful customer data platform (or, as someone referred to Tealium’s tag management tool at DV, a “gateway drug” to its other products).
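
To make that concrete: in simplified form, a Tealium-style data layer is just a flat JavaScript object, conventionally named utag_data, defined on the page before Tealium’s utag.js loads. The variable names below are illustrative rather than prescriptive:

    var utag_data = {
      page_name    : "products:auto:detail",
      site_section : "products",
      customer_id  : "12345",       // populated only for authenticated users
      product_id   : ["SKU-987"],   // arrays hold product-level data
      order_total  : "49.99"
    };

Because every tag template in the tool can read from this one object, adding a new vendor becomes a mapping exercise rather than a development project.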

The most interesting thing that I saw during those two days was that there are actual companies for which tag management is only a subset of what they are doing through Tealium. In previous years, Tealium’s own product team has showcased AudienceStream and EventStream. But this year, they had actual customers showing off real-world examples of the ways they have leveraged these products to do some pretty amazing things. Tealium’s customers are doing much more real-time email marketing than you can do through traditional integrations with email service providers. They’re leveraging data collected on a customer’s website to feed integrations with tools like Slack and Twilio to meet customers’ needs in real time. They’re addressing legitimate concerns about the impact all these JavaScript tags have on page-load performance by doing more flexible server-side tagging than is possible through most tools. And they’re able to perform real-time personalization across multiple domains and devices. That’s some really powerful stuff – and way more fun to talk about than “tags.” It’s also the kind of thing every company can start thinking about now, even if it’s something you have to ramp up to first.

In conclusion, Tealium isn’t the only company moving in this direction. I know Adobe, Google, and Salesforce all have marketing tools that offer a ton of value to their customers. Segment offers the ability to do server-side integrations with many different marketing tools. But I’ve been doing tag management (either through actual products or my own code) for nearly 10 years, and I’ve been telling customers how important it is to have a solid data layer for almost as long – at Salesforce, we had a data layer before anyone actually called it that, and it was so robust that we used it to power everything we did. So to have the final confirmation that tag management is the past and that customer data is the future was a pretty cool experience for me. It’s exciting to see what Adobe Launch is doing with its extension community and the integration with the newest Adobe mobile SDKs. And there are all kinds of similar opportunities for other vendors in the space. So my advice to marketers is this: if you’re still thinking in terms of tags, or if you still think of all your third-party vendors as “silos,” make the shift to thinking about data and how to use it to drive your digital marketing efforts.

Photo Credit: Jonathan Poh (Flickr)

Adobe Analytics, Analytics Strategy, Digital Analytics Community, Industry Analysis

Analytics Demystified Case Study with Elsevier

For ten years at Analytics Demystified we have more or less done marketing the same way: by simply being the best at the work we do and letting people come to us.  That strategy has always worked for us, and to this day  continues to bring us incredible clients and opportunities around the world.  Still, when our client at Elsevier said he would like to do a case study … who were we to say no?

Elsevier, in case you haven’t heard of them, is a multi-billion-dollar multinational that has transformed from a traditional publishing company into a modern-day global information analytics business.  It is essentially hundreds of products and companies within a larger organization, and each needs high-quality analytics to help shape business decision making.

After searching for help and discovering that companies say they provide “Adobe consulting services” … without actually having any real-world experience with the type of global challenges facing Elsevier, the company’s Senior Vice President of Shared Platforms and Capabilities found our own Adam Greco.  Adam was exactly what they needed … and I will let the case study tell the rest of the story.

Free PDF download: The Demystified Advantage: How Analytics Demystified Helped Elsevier Build a World Class Analytics Organization

Analytics Strategy, Industry Analysis

A Mobile Analytics Comparative: Insurance Apps – Part 2

Mobile analytics “Events” are the actions that users take when using your mobile apps. These events can be anything from opening the app to swiping through screens, taking a photo of a fender-bender, or submitting a new claim. Events are the lifeblood of any mobile application, and tracking them is essential to any mobile analytics platform.

In Part 1 of this comparative, I looked at the total number of SDKs within the top 5 insurance apps (based on US advertising spend) and posted some observations about what I found. Here in Part 2, I will take a closer look at how mobile analytics events are tracked and the specific events tracked by each of these apps.

 

Out-of-the-box Events for mobile apps. While every analytics tool handles in-app event tracking differently, vendors seem to take one of two approaches when tracking events on mobile apps. The first, which is common among the analytics vendors used by our group of insurance apps, is to provide basic default events and offer the ability to track custom events as well. The second method is to track all events by default and allow users to identify which ones are most valuable to them. Both scenarios have distinct advantages, but neither will suffice unless you take the time to understand what goals you’re trying to accomplish with your mobile app and what type of engagement you wish to encourage.

Adobe Analytics

Adobe Analytics’ mobile SDK offers a number of capabilities out-of-the-box, and its Lifecycle metrics provide valuable context data to help analysts see what’s going on within their apps. Adobe Analytics collects launches, crashes, upgrades, and session information by default. Additionally, Lifecycle metrics provide data on engaged users, days since first/last use, hour of day, day of week, and device, carrier, and operating system info as well. Adobe Analytics also offers a number of default dimensions within its mobile SDK that enable analysts to capture location data, lifetime value, and campaign details. To see a full list of Adobe Analytics’ default mobile metrics and dimensions, click here. Adobe also offers the ability to track custom events, which can be configured as variables that will appear in the traditional Adobe Analytics interface. Also, for Adobe users, the mobile SDK supports other Marketing Cloud solutions like Target, Audience Manager, and Visitor ID, making it a great choice for those looking for a single solution.
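
The mobile SDK calls themselves are native (Java and Objective-C), but the concept maps directly onto Adobe’s JavaScript AppMeasurement library, so here’s a rough web-side sketch of a custom event; event5 and eVar12 are placeholder variable slots, and your own mapping will certainly differ:

    // Assumes the standard AppMeasurement "s" object is already on the page
    s.linkTrackVars   = "events,eVar12";
    s.linkTrackEvents = "event5";
    s.events = "event5";                 // placeholder custom event slot
    s.eVar12 = "auto:claim:submitted";   // placeholder custom dimension
    s.tl(true, "o", "Claim Submitted");  // "o" indicates a custom action call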

Google Analytics (Firebase)

Google Analytics and their mobile-specific platform, Firebase, offer a number of default metrics as well. First Opens, Session Starts, and User Engagement are all provided out-of-the-box. These events, along with in-app purchases, app updates/removals, mobile notification info, and dynamic link data, are all offered as defaults. To view Google Analytics (Firebase’s) full list of default mobile events, click here. Google also offers the ability to track custom events. With Google Firebase, product owners and developers can build and manage apps and configure tracking that can share data to Google Analytics. This makes it possible to keep your data in a single location, so that your mobile data is viewable alongside your traditional web analytics data.
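
In native apps these events are logged through the iOS and Android SDKs, but the JavaScript flavor of the call looks like this sketch; the event name and parameters here are invented for illustration:

    // Firebase web SDK; the native logEvent calls are analogous
    firebase.analytics().logEvent("submit_claim", {
      policy_type: "auto",
      photo_attached: true
    });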

 

Auto Event Tracking. There are a few vendors in the app analytics marketplace that take the approach of auto-tracking events and enabling users to identify key events and label them for analysis within their analytics interfaces.

Mixpanel is one of these solutions and offers Autotrack as a service to capture clicks on links, buttons, and forms as well as other in-app actions. They built a point-and-click editor, which allows product owners to configure tracking by navigating web pages and mobile apps as they would normally. The editor provides valuable contextual data, like how many times a button has been viewed and clicked in the past few days, which helps home in on the most valuable assets. But what’s very cool is that once events are identified in Autotrack, reports will contain historic data on these events going back to when Autotrack was first implemented. Event data can also be augmented with context data called “properties” that adds additional detail to key events.

Heap Analytics is another solution that automatically captures user actions with its base code. Similar to Mixpanel, Heap Analytics captures clicks, taps, gestures, and form fills through its default tracking. Their Event Visualizer allows analysts or product owners to configure tracking by navigating apps and interacting with the interface to record, name, and define events. This solution also works retroactively and enables analysis of events within the Heap Analytics interface. Both of these solutions take much of the pain of traditional tagging and configuration away from analysts and product owners, allowing them to get into data analysis and insights quickly.
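
For events that fall outside what Autotrack and the Event Visualizer capture, both tools also expose a simple manual call. The event and property names below are invented for illustration:

    // Mixpanel: a manually tracked event with "properties" for context
    mixpanel.track("Claim Submitted", { "Policy Type": "auto" });

    // Heap: the equivalent call; auto-captured events need no code at all
    heap.track("Claim Submitted", { policyType: "auto" });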

As with most things in digital analytics, there are multiple ways to get the job done and multiple tools to choose from to accomplish the task of tracking events. I didn’t even mention other popular mobile analytics platforms Flurry and Localytics here, mainly for brevity’s sake, so perhaps I need to write another blog post to call out the differences between vendors. Which tool you decide on has a lot to do with ease of use, whether you want to see your web and mobile data in a common interface, and the level of complexity you’re willing to endure to get robust mobile tracking.

 

What the Insurance Apps are tracking

All of the insurance apps that we evaluated in this comparative are using either Adobe Analytics, Google Analytics, or both on their apps. No instances of Mixpanel or Heap Analytics (or Flurry or Localytics, for that matter) and their auto-tracking features were detected, indicating that each of these companies elected to go the more traditional route for app analytics tracking. That said, there were some similarities and differences in the way that each company was tracking events within their apps.

Opens

Tracking Opens is arguably the most basic of events that can (and should) be tracked within your mobile apps. Each of the insurance apps I evaluated contained Open event tracking, which is not surprising since it’s offered by default from their vendors. Yet what companies do with their Open event data is where things get interesting. For this evaluation, I did not look at how any of these firms are using data, so I won’t pass judgment on their utilization of this basic yet informative data point. Instead, I offer some concepts for thinking about how to analyze your data. Opens are the basis for determining active users, which is a highly regarded KPI in the mobile world. Use Open rates, plus other events, to determine which users are active within your mobile apps. Also, by using contextual data such as time of day and location, you can learn a great deal about when and where your users are opening their apps. Are they primarily at home? On the highway? In cities? Understanding this data can help to tailor content for users when they need it most. By analyzing Open event data and the contextual values that are associated with this event, you can learn a lot about how your customers are using your app.
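
As a minimal sketch of that kind of contextual analysis, here’s how you might bucket a raw export of Open events by hour of day; the event format is assumed for illustration, not any vendor’s actual schema:

    var opens = [
      { userId: "u1", ts: "2018-06-04T07:15:00Z" },
      { userId: "u2", ts: "2018-06-04T18:40:00Z" }
    ];
    var opensByHour = {};
    opens.forEach(function (e) {
      var hour = new Date(e.ts).getUTCHours();
      opensByHour[hour] = (opensByHour[hour] || 0) + 1;
    });
    // opensByHour now shows when users open the app, e.g. { 7: 1, 18: 1 }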

Authentication

The ability to recognize a user and authenticate based on device or customer ID is a unique advantage in the mobile world. Within our insurance app sample, we found that Progressive was actively tracking logged-in users, as were Geico, State Farm, and Allstate via Adobe’s native Marketing Cloud ID. Each of these firms is tapping into one of the great advantages of mobile: its ability to authenticate individual users. The practical applications of authentication include the ability to customize content for known users, target them with offers and promotions, and even use geolocation to push messages within apps. A recent article about mobile analytics in the auto industry cited an eMarketer study that revealed, “…one of Audi’s retailers recently ran a two-month campaign targeted to mobile-first audiences. The dealership was able to link 40% of total car sales to the targeted mobile audience.” This linkage is unheard of in the web world, but it’s entirely possible in mobile.
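
For the Adobe-based apps, tying an authenticated user to the Marketing Cloud ID happens through the Visitor API’s setCustomerIDs call. The sketch below uses the JavaScript flavor with placeholder values (the mobile SDKs expose the same capability natively):

    // Adobe Experience Cloud Visitor API (VisitorAPI.js)
    var visitor = Visitor.getInstance("INSERT-ORG-ID@AdobeOrg");
    visitor.setCustomerIDs({
      policyholder_id: {   // made-up integration code for illustration
        id: "12345",
        authState: Visitor.AuthState.AUTHENTICATED
      }
    });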

Navigation Flows

Navigation flows provide the ability to see how users are navigating your apps to determine usability, utility, and effectiveness. While our short insurance app comparative didn’t dive deep into any of the apps, we did find tracking within the Geico app that captured navigational elements such as previous page name, page name, and section of the app. Presumably, these metrics are used to help determine how users are traversing the app, which provides a great deal of insight into usability. Navigation flows in these apps are especially valuable because even in my short test script, when I attempted to start a new auto insurance quote I was handed off to a responsive website by each of the five insurance providers evaluated. Defining your desired navigational paths, then knowing how many users complete your desired flow versus dropping off before finishing an action like the quote, provides key insight into the way the app is built and utilized. Knowing abandonment rates for key functions might influence some providers to develop more native features and capabilities if they can be justified through navigational flows and analysis.
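
A back-of-the-napkin sketch of that drop-off math, with invented step names and counts:

    var funnel = [
      { step: "quote:start",   users: 1000 },
      { step: "quote:details", users: 620 },
      { step: "quote:handoff", users: 480 }  // handed off to the website
    ];
    funnel.forEach(function (s, i) {
      if (i === 0) return;
      var drop = 1 - s.users / funnel[i - 1].users;
      console.log(s.step + " drop-off: " + (drop * 100).toFixed(1) + "%");
    });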

 

What are you tracking?

Throughout this comparative we learned that companies are using relatively similar event tracking, which is for the most part made up of standard offerings from their vendors. However, we expect that as apps become more critical to business operations and more users rely on apps to complete their transactions, the level of customized tracking will undoubtedly improve. But as with all things in analytics, tracking events and dimensions that support your business goals is the top priority, and this is where you should start and/or focus if you’re responsible for tracking your company’s mobile apps.

If this is your role, it’s important to keep in mind that tracking apps can be a messy process. If your developers and product managers are building apps with the goal of creating immersive experiences, then it’s easy to try to track everything and lose sight of the key goals. Tagging complex or immersive apps can be a technical challenge, which is why you should take the opportunity to identify clear KPIs (such as getting a quote), and also ensure that your UX and design teams are outlining the critical paths within your apps so everyone is on the same page about what you’re trying to accomplish and how you expect users to get there. This approach will ultimately lead to readily available insights (whether you were right or wrong), because you set the metrics by which you will measure success ahead of time. This pragmatic approach to mobile measurement is often overlooked yet essential for measuring what’s critically important and determining if your applications are successful.

Conferences/Community, Industry Analysis

Mobile Analytics Summit Recap

I recently participated in the first-ever Mobile Analytics Summit, which was a fantastic event chock full of great information and insights. The virtual format allowed attendees to tune in based on sessions that were most relevant to them. And if you missed it, there’s an opportunity to go back and catch all the sessions, because the presentations are archived and available. ObservePoint was a gracious host and a well-organized event sponsor; I was honored to participate in the Summit.

Some of the key trends that I observed from the conference included:

 

Mobile Strategies Must Be Holistic Strategies

Let’s face it, mobile is HUGE today. According to Krista Seiden of Google, “half of all searches on Google take place on smartphones globally”. And, “More than half of all web traffic (recorded with Google Analytics) comes from smartphones and tablets.” Krista delivered a compelling presentation on her personal journey Moving from a Web to a Mobile World that highlighted many of the nuances and fundamentals of measuring today’s digital environment.

But, mobile is ubiquitous…it’s in your pocket, it’s on your nightstand, and it’s probably not far from you wherever you are these days. So you better be strategic about it. The mobile experience is dominating half of all time online, which means that there’s still another half of the experience that’s happening elsewhere. This means that your mobile experience must connect to your customers’ desktops, to their telephones, and to their in-store experiences as well. Companies that fail to build integrated experiences are alienating their customers. Accept that mobile is part of the stack that includes acquisition drivers, marketing layers, testing solutions, CRM applications, email and SMS communication tools, and a myriad of other technology solutions that manage customer interactions. So keeping mobile in a silo is a recipe for disaster. If you think about the customer lifecycle for any product or service, there are multiple touchpoints and inevitably multiple channels; accept that you will capture data and interact with customers and prospects via multiple methods. Whether on the app, the website, or in the store, you’re gathering data. You know it and the customer knows it. Yet their expectation is that you’ll remember them regardless of device. So get strategic about it.

 

Find Your Framework

Getting strategic requires mobile developers, product managers, strategists, and analysts to find a method to their madness. This can be accomplished by using a framework for measurement. This was touched on by a number of us speakers at the Mobile Analytics Summit, but Stephen Blake Morse of mParticle put it into context by stating that a customer journey framework is imperative for aligning your measurement efforts with company KPIs and business goals. Stephen provided a resource for Designing a Mobile Strategy Microsite, and also gave a nod to Dave McClure’s Pirate Metrics as frameworks to learn from while developing yours. At Analytics Demystified, we help clients do this too. We help by understanding corporate objectives, developing frameworks, and socializing them with leadership. Once our frameworks have been established and socialized, we empower clients to execute using a Measurement Plan that aligns specific initiatives with measures of success.

In another great presentation, Tim Trefren of Mixpanel advises listeners to Stop Treating Your App Like a Marketing Channel. Tim states that Engagement and Retention are the KEY metrics, and I agree. Acquisition and revenue are relatively clear. Although the tactics are wildly complex, the math is straightforward. Find more customers, make more money. Yet engagement is vague and retention…well, that’s tough. According to TechCrunch, nearly 1 in 4 people abandon mobile apps after one use. Tim referenced Andrew Chen’s research, which revealed that losing 80% of mobile users is normal. Most apps have a retention problem. The average app loses 77% of its Daily Active Users within the first 3 days. By 30 days…90% of active users are lost. After 90 days, the average app has lost 95% of active users. This means that it’s not about getting the downloads and installs, it’s about keeping users engaged right from the get-go. While a framework can’t necessarily save your failing apps, it can be applied as a means to strategically plan, launch, and manage mobile apps throughout their lifecycle.
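
To put Tim’s numbers in concrete terms, here’s the decay expressed as a simple table for a hypothetical cohort of 10,000 installs:

    var installs = 10000; // hypothetical cohort size
    var retained = { "day 3": 0.23, "day 30": 0.10, "day 90": 0.05 };
    Object.keys(retained).forEach(function (day) {
      console.log(day + ": " + Math.round(installs * retained[day]) +
        " users still active");
    });
    // day 3: 2300, day 30: 1000, day 90: 500

Only 500 of 10,000 users remain after three months, which is why engagement and retention deserve at least as much attention as acquisition.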

 

Focus on the App

So now that we’ve established that a holistic strategy is a solid one, and that a framework can help organize your strategy…did I mention that it’s all about the app? We already know that more than half of all web traffic comes from mobile, but what’s even more interesting: across all these mobile devices, 85% of time is spent within apps. This makes the app the king of mobile.

Not only are apps dominating the consumer world, they’re also ruling the workplace. According to my research, a study called Accelerate digital transformation with simplified business apps finds that 69% of employees seek an engaging mobile-first work experience. This experience is enabled through apps! The ability to minimize the number of enterprise systems like CRM, email, and Jira that an enterprise worker must log into every day can be enabled through an app. But what’s more relevant is that these app data streams can be customized, or more accurately curated, to meet each user’s personal requirements. This is facilitated via an app. Yet app deployment in the enterprise still lags consumer applications. The study revealed that 55% of organizations have implemented three mobile apps or fewer – typically email and calendar…but that’s just the tip of the iceberg for enterprise app utilization. Watch for the explosion of a new marketplace of enterprise apps in the next 18 to 24 months.

 

Optimize Your Apps

There are two primary things to consider when working to keep your apps performing in tip-top shape: first, operational diagnostics, and second, testing.

I learned from Stephen Blake Morse that the average mobile app has eighteen third-party SDKs. Eighteen! That’s a lot of data flowing out of each and every app. App bloat is a real thing.

The explosion of growth in data and analytics has led to a bounty of tools and technologies in both analytics and marketing (remember the stack?). As such, apps are getting loaded with data-dispersing agents by the dozen. Within any given app, you’re likely to have acquisition, analytics, optimization, automation, and aggregation solutions. Each of them is collecting and sending data to third-party solutions across the farthest reaches of the cloud. But with every potential benefit you receive from yet another integration comes the potential cost of slowed performance. Additionally, apps that drain battery life, those that make excessive server calls, or those whose libraries rival the size of the Library of Congress can all impact performance and contribute to retention problems. As such, prudent app developers are optimizing apps.

 

Test To Be the Best

Several sessions in the Mobile Analytics Summit delved into the testing world. Sun Sneed of ObservePoint articulated that Mobile is Hard and that the challenges we face are technical, process-oriented, and resource-constrained. Sun offered 7 Ways to Win at Testing Your Mobile App Analytics, which called out device fragmentation, “chatty” apps, testing throughout the dev process, the high cost of defects, and starting with your end goals in mind.

Chetan Prasad of Adobe walked us through his presentation Acquire, Engage, and Optimize to Drive App Addiction, which clearly underscored the themes of using a framework, the retention challenge, and testing everything. Chetan spoke of testing examples that included testing features to achieve 50% more revenue; testing design to increase logins by 10%; and testing content to realize a 110% improvement in click-throughs and an 8% lift in redemption of rewards and offers.

Matt Thomas of ObservePoint also shared his thoughts on How Smart Companies Transform Their Mobile Testing Paradigms. Matt takes a pragmatic approach that begins with requirements gathering after first discussing the goals of the app, or more simply put: what does success look like for this app? Matt hammered home the theme of defining your strategy first, which is the key factor in actually measuring the success of the app. In similar fashion to my own presentation at the Mobile Analytics Summit, Matt talked about the importance of documentation and developing a Solution Design that will be a guide for developers to use as a roadmap. The pivotal point in Matt’s presentation was that change is the only constant, and automated testing is the means by which you must ensure your app’s validity.

 

Mobile Experiences are Leading the Digital First Revolution

Many of the presentations touched upon the User Journey in some form or fashion. What I found to be interesting about this theme is that the absolute most important revelation about digital today is understanding customers as they traverse channels. Mobile is almost certainly part of the experience, but it’s not singular. Digital First competitors today must know their customers, and they must be able to reach them in real time.

Moe Kiss of The Iconic shared her analysis in the presentation How Cross Device Analysis Taught Us the Value of Our Mobile Apps, which included seven steps to determine the value of their app. First and most important was stitching visitors. By analyzing the steps leading to critical actions such as using wish lists, Moe learned that apps were driving some conversion events, but not necessarily sales. She found that cross-device users spent more time online and engaged more frequently throughout the day, but while they browsed on their mobile devices, they ultimately purchased on the desktop. So, while the app was critically important, it wasn’t the key to driving success. Their success came from delivering great user experiences. Each device played a different role, and not every feature was necessary on every platform. Among Moe’s key findings was that users transition between devices, and being open to using the right channel at the right time was the secret to their success.

Chris Slovak of Tealium shared his presentation Identity Resolution is Key to Digital Transformation, which began with Chris setting the stage for experience-driven companies. Uber, Nest, Gatorade, Waze, Venmo, and Amazon are creating customer experiences that are the gold standard of mobile first. Like Moe’s findings, Chris talked about the experiences being more important than the products. Because mobile is measured in events, it carries the potential to humanize the experience and use data to make it personal. Yet to get there, organizations need to connect with their customers via 1:1 relationships across all channels and devices. To do this, companies must operate in real time. But that can only be effective if you know your customer through harvesting customer IDs, which Chris claims must be part of your data layer.

Stephen Blake Morse also talks about this in mParticle’s Customer Data Platform. Whether it’s a social handle, email address, subscriber ID, first- or third-party cookies, iPhone IDFA, IoT device ID, or any other personal identifier…you need these identification keys to get relevant to your customers. Chris, too, talks about a framework incorporating data Capture > Enrichment > Activation as a means to interact with customers in real time. This ability is the linchpin for getting personal with customers who want that and for delivering meaningful experiences.

 

In closing, I want more…

And you should too. The world of Mobile Measurement is pretty exciting. It’s filled with “micro moments” that don’t mean much as stand-alone actions, but in the context of a greater strategy for measuring success, they are essential for defining and shaping experiences. The trends I mentioned here are observations that I took away from the Mobile Analytics Summit, but there are many, many more. I wrote about just a few of my favorite presentations, but I encourage you to check out what’s interesting to you.

And, if we at Analytics Demystified can help in any stage of your mobile measurement pursuits: whether that’s an overarching strategy, requirements, a measurement plan, implementation, or analysis…give us a shout or leave a note in the comments. We look forward to hearing from you!

Industry Analysis

The Downfall of Tesco and the Omniscience of Analytics

Yesterday, an article in the Harvard Business Review provided food for thought for the analytics industry. In Tesco’s Downfall Is a Warning to Data-Driven Retailers, author Michael Schrage ponders how a darling of the “analytics as a competitive advantage” stories, British retailer Tesco, failed so spectacularly – despite a wealth of data and customer insight.

I make no claims to a completely unbiased opinion (I am, after all, in the analytics space). However, from my vantage point, the true warning of Tesco lies in the unrealistic expectation (or, dare I say, hype) that ‘big data’ and predictive analytics can think for us.

It is all too common for companies to expect analytics to give them the answers, rather than providing the supporting material with which to make decisions. Analytics cannot help you if you are not asking the right questions. After all, a compass can tell you where north is, but not that you should be going south. It is the reason we at Analytics Demystified prefer to think about being data informed, not data driven. Being ‘data driven’ removes the human responsibility to ask the right questions of, and take the right actions in response to, your data.

Ultimately, successful business decisions are an elusive combination of art and science. Tesco may have had the greatest analytics capabilities in the world, but without the business sense to critically assess and appropriately act upon the data, even the best capabilities fall short. Therein lies the warning: considering analytics to have some kind of omniscience, rather than treating it as part of your business ‘tool box’, is to set it up to fail.

What do you think? Is Tesco’s downfall a failure of analytics? Leave your thoughts in the comments.

Industry Analysis

It’s not about “Big Data”, it’s about the “RIGHT data”

Unless you’ve been living under a rock, you have heard (and perhaps grown tired) of the buzzword “big data.” But in attempting to chase the “next shiny thing”, companies may focus too much on “big data” rather than the “right data.”

True, “big data” is absolutely “a thing.” There are certainly companies successfully crunching massive volumes of data to reveal actionable consumer insight. But there are (many) more that are buried in data, and wondering why they are endlessly digging when others have struck gold.

Unfortunately, “big data” discussions often lead to:

  1. An assumption that more is better;

  2. A tendency for companies to try to skip the natural maturation of analytics in their organisation, in an attempt to jump straight to “big data science.”

The value of data is in guiding business success, and that does not necessarily require massive volumes of data.

So when is big data of value?

  • When a company has pushed the limits of what they were doing with their existing data;

  • When they have the people, process, governance and infrastructure to collect and analyse volumes of data; and

  • When they have the resources and support to optimise based on that data.

But to succeed at a more foundational level, companies should focus first on whether they have:

  • A culture that fully integrates data and analytics into its planning and decision making;

  • The right data to guide their strategy and tactics. This includes:

    • Data that reveals whether initiatives have been successful, addressing the specific goals of the work;
    • Data that provides insight into progress, including early indicators of the need to “course correct”; and
    • Data that identifies new opportunities.
  • The resources and support to optimise based on findings from current data.

While all businesses should be preparing for increased use and volume of data in the coming years, it is far easier to chase and hoard more and more and more data than it is to derive value from the data that already exists. However, the latter will drive far greater business value in the long term, and set up the right foundation to grow into using big data effectively.

Analysis, General, Industry Analysis

Getting comfortable with data sampling in the growing data world

“Big data” is today’s buzzword, just the latest of many. However, I think analytics professionals will agree: “big data” is not necessarily better than “small data” unless you use it to make better decisions.

At a recent conference, I heard it proposed that “Real data is better than a representative sample.” With all due respect, I disagree. That kind of logic assumes that a “representative sample” is not, in fact, representative.

If the “representative” data would not accurately reflect the complete data set, and its use would lead to different conclusions, then using “real” data is absolutely better. However, that is not because “real” data is somehow superior, but rather because the representative sample itself is not serving its intended purpose.

On the flip side, let’s assume the representative sample does actually represent the complete data set, and would reveal the same results and lead to the same decisions. In this case, what are the benefits of leveraging the sample?

  • Speed – sampling is typically used to speed up the process, since the analytics process doesn’t need to evaluate every collected record.
  • Accuracy – if the sample is representative (the assumption we are making here), using the full or sampled data set should make no difference. Results will be just as accurate (see the sketch after this list).
  • Cost-savings – a smaller, sampled data set requires less effort to clean and analyse than the entire data set.
  • Agility – by gaining time and freeing resources, digital analytics teams can become more agile and responsive to acting wisely on small (sampled) data.
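
To see the accuracy point in action, here’s a tiny simulation: estimate an average order value from a 10% random sample and compare it to the full data set.

    function avg(a) {
      return a.reduce(function (s, x) { return s + x; }, 0) / a.length;
    }
    var orders = [];
    for (var i = 0; i < 100000; i++) orders.push(20 + Math.random() * 60);
    var sample = orders.filter(function () { return Math.random() < 0.1; });
    console.log("full: " + avg(orders).toFixed(2) +
      ", 10% sample: " + avg(sample).toFixed(2));
    // The two estimates track closely on a tenth of the processing

Run it a few times and the two numbers rarely differ by more than a few cents, which is exactly the point: a representative sample preserves the decision while costing a fraction of the effort.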

There is no doubt that technology continues to develop rapidly. Storage and computing power that used to require floors of space now fits into my iPhone 5. However, the volume of data we leverage is growing at the same rate (or faster!). The bigger data gets, and the quicker we demand answers, the more sampling will become an accepted best practice. After all, statisticians and researchers in the scientific community have been satisfied with sampling for decades. Digital too will reach this level of comfort in time, and focus on decisions instead of data volume.

What do you think?

Adobe Analytics, General, Industry Analysis

Our Engagement Metric in use at Philly.com

Those of you who have read my blog for long know that I have written a tremendous amount about measures of visitor engagement online. In addition to numerous blog posts, we have published a 50-page white paper describing how to measure visitor engagement, and every year I give a half-dozen presentations on the subject. Unlike some people who seem to fear new ideas, and others who disapprove of anything they themselves do not create, I have long been a champion for evolving our use of metrics in web analytics to satisfy business needs.

But don’t take my word for it, read about how the nice folks at Philly.com are using a near complete version of my calculation to better understand their audience.

Cool, huh?

The thing I love about this article is that Philly.com is openly talking about their use of my engagement metric.  What’s better is that their sharing prompted another super-great organization (PBS) to comment that they too have been using my engagement metric for years.

Awesome.

I have been honored to work with several companies in the past three years who have implemented my metric and variations thereof, but most treat the metric as a competitive secret. Given that most are in the hard-pressed and hyper-competitive online media world, I understand, but I’m certainly happy to see Philly.com and Chris Meares share their story with the world.

Anyway, check out the article and, if you’re brave, download our white paper on visitor engagement and give it a read. If you are in media and are stuck trying to figure out how to get web analytics to work for you (instead of the other way around) give me a call. I’m more than happy to discuss how our measure of engagement might be able to help your business grow.

Adobe Analytics, Analytics Strategy, General, Industry Analysis

Web Analytics: One Month at a Time in 2009

As we look towards 2009 there are clearly some great challenges and great opportunities facing everyone who has more than a passing interest in web analytics. But regardless of the economic situation, we all need to stay focused on making the most of the people, process, and technology we have in place today, continuing to work towards positive business outcomes.

Towards this end, I would like to invite those of you wondering exactly where to begin and looking for some sense of structure for your digital measurement efforts in 2009 to a free webcast sponsored by Coremetrics and the DMA on Wednesday, December 3rd at 10:00 AM Pacific.

In this free event I will be focusing on helping companies of all sizes at all stages in web analytics maturation take a tactical look at their long-term strategic measurement efforts.  The net/net, I hope, is a “stratactical” (thanks Jennifer!) presentation that has something for everybody, regardless of the tools you’re using or how you’re currently using them.

Again, the webcast is free and open to everyone.  You can register with Coremetrics and the DMA at the Coremetrics web site:

Register Now to Attend this Free Webcast!

Again, the webcast is from 10:00 AM to 11:00 AM Pacific on Wednesday, December 3rd. I hope to see you there!

On a totally unrelated note, I wanted to say “Thanks” to Neil Mason of the Web Analytics Association (and now WebTraffiq) for bringing up my open letter to President-Elect Barack Obama in this week’s ClickZ column.  Neil makes a comparison between Europeans’ views on the use of cookies and the current situation within the Federal Government here in the U.S.

Particularly interesting was this passage:

“The European Parliament passed a directive in 2002 on privacy and electronic communications. Leading up to this directive, there had been a concern in the industry that cookies would effectively be made illegal as a breach of personal privacy. In the end, the European Parliament concluded it wasn’t cookies or Web bugs that infringed privacy but the inappropriate use of these devices.”

Not the cookies themselves but rather the inappropriate use of these devices.  Absolutely.  I would encourage any of you interested in this issue to give Neil’s column a read.

Adobe Analytics, Analytics Strategy, Conferences/Community, General, Industry Analysis, Reporting

My AMA presentation is now online and much more

For those of you who missed my presentation yesterday, “Web Analytics: A Day a Month”, you can now listen to the re-recorded webcast at WebEx thanks to Tableau and the American Marketing Association. I say “re-recorded” since once again I managed to bring a large enough crowd to the webcast to break WebEx. Web analytics is hot!

You can listen to the webcast without having to register (it still requires name and email) until next week, I think, by going to:

amaevents.webex.com

Here are a few other things I should mention, as long as I’m writing:

  • I’m going to be in Boston next week for Judah’s Web Analytics Wednesday event (rescheduled from last month due to me being a weather-wimp) and if you’re in Boston or nearby I’d love to catch up. Please join us in Cambridge!
  • The next few weeks I will be in Chicago (Jan 25th), Seattle (Jan 30th), San Jose (Jan 31st) and New York (Feb 7th) giving the keynote address at OpinionLab’s client conferences. The nice folks at OpinionLab mentioned that they’re opening up the events to non-customers so if you’d like to hear me talk about how quantitative and qualitative data combined provide a much more actionable view of the online visitor, please join us!
  • The nice folks at the Direct Marketing Association who gave away PDF copies of my book Analytics Demystified in exchange for participation in their web analytics survey (written up by the amazing W. David Rhee) are holding a webinar on the research findings on January 23rd. The event is not free, but the research is pretty good, and if you’re in the DMA you should consider joining the call.
  • The nice folks at the Web Analytics Association are also holding a research call, tomorrow (Jan 17th) in fact, on the future of the web analytics industry. I think this event is free, but it might only be free to WAA members (maybe if Richard or Andrea read this they can comment for all to see!). The call is tomorrow morning at 9 AM Pacific, noon Eastern, and you can register to attend at the WAA web site.
  • Anil Batra has apparently jumped on the “bounce rate” bandwagon and is having a “bounce rate survey” that he’d like you to participate in. I haven’t had a chance to take it yet but I really enjoyed Anil’s salary research so I’m sure he’ll do a great job with bounce rate too!
  • I’ll be back in San Diego in late February at Aaron Kahlow’s Online Marketing Summit talking about Key Performance Indicators in a Web 2.0 World.  I really enjoyed OMS last year and am looking forward to getting back to Sea World … er, Aaron’s event!
  • I had nothing to do with that movie on web analytics, despite it being filmed here in the Rose City, and have no idea what Ian is talking about.  Ian should spend less time at the movies and more time reading what experienced practitioners are saying about Gatineau.  <grin>

If I’m forgetting anything please comment below.  I think you’ll really like the webcast — the feedback I got has been excellent so far (despite some people going gossipy about the title of my last post on the subject … cage match indeed!)

Analytics Strategy, General, Industry Analysis

Is Google Analytics the Killer App? No.

For the past two days readers and friends have been writing me asking my opinion of Brandt Dainow’s recent iMediaConnection piece on “Google’s Killer App.” Most of the questions run along these lines:

“Is this guy insane, or did I completely miss the revolution?

http://www.imediaconnection.com/content/15823.asp

I am so impressed with Omniture I have a hard time believing that Google Analytics beats it. I have been in a bit of a bunker trying to keep the magic going here, but can you hook me up with a reality check?”

The reality check is this: Yes, Brandt Dainow has apparently gone completely insane. Too bad too, since I kinda liked some of the stuff he’s written in the past.

Let’s consider some of the bizarre statements he makes in his article:

“Google has killed the web analytics software industry with the release of the new version of Google Analytics. The new version was released just under two months ago and is simply a quantum leap above any other analytics product on the planet.”

This is his opening statement, and I don’t know where to begin. Statements like “killed the web analytics software industry” and “simply a quantum leap above any other analytics product on the planet” are bizarre. Is Dainow paying any attention to the web analytics market? Omniture continues to accelerate, WebTrends has released a great new version of their application, Microsoft is about to release their own free offering, …

And don’t get me wrong: I really do like Google Analytics, and I use it regularly, but there is absolutely no possible justification for saying that Google Analytics is a quantum leap better than other available applications. Google Analytics has some pretty visualizations, a slick UI, and does a good job of integrating with Google’s search marketing products, but a “quantum leap?” I think not.

“Google Analytics version 2 is not revolutionary. It does not extend web analytics software by providing new forms of analysis. Neither does it extend our understanding of websites by offering new approaches. What Google has done is simply take every feature in every product on the market and put them all into one system, and then make it available for free.”

Google has “every feature in every product on the market”? Really? Are you sure? Because I can think of dozens and dozens of useful features that I’ve seen in solutions like ClickTracks, Visual Sciences, Omniture, WebTrends, Coremetrics, Unica, … basically every other solution on the market today that aren’t in the version of Google Analytics I’m using. Features like:

  • Real visitor segmentation (multidimensional, ad hoc, etc.)
  • Custom variables at the visitor, session, and page view level
  • The ability to produce custom reports for automated delivery
  • The ability to define custom metrics and customize reports in the interface
  • The ability to import metadata as an input for analysis
  • Commerce-related reports like browse-to-buy ratios
  • A browser-overlay that can be customized

(This list goes on and on and on, and has been discussed a great deal by folks like Judah Phillips and Phil Kemelor.)

Dainow continues:

“I am surprised by the range of features Google has added. I would have assumed some had been patented by the companies that created them. I can only conclude this is not the case. The range of features Google has borrowed from other products suggests the web analytics software industry managed to do 10 years of research and development without registering even one patent. This must be unique in the history of computing. If Google has stolen patented ideas, then I can only conclude they simply don’t care and will rely on their massive cash reserves to sort it out later.”

I suspect that Google does not own the patents for the browser overlay, path analysis, or the JavaScript page tag. I would not assume that Google believes they have “stolen patented ideas,” but you can be sure that some lawyer, somewhere, probably does. Maybe the companies that own these patents are pissed at Google but are hesitant to sue a company with the financial resources of GOOG?

Dainow then gets a little more personal:

“I say this as someone who, until this month, ran a company that produced web analytics software and directly competed with Google Analytics. No more. There is simply no way my organization can produce the range of features Google offers and make them available for nothing. We will keep the consulting arm going but use Google Analytics as the reporting system.”

This is perhaps both the most confusing and most telling statement in the entire article. His statement is confusing because one would have thought that as the CEO of a web analytics software company Dainow would have had a more refined understanding of the features available in the market today, the patent market, and the overall utility of free software.

His statement is telling because it sounds like ThinkMetrics is about to become a GAAC (Google Analytics Authorized Consulting) partner, in which case the bizarre pro-Google rhetoric in this article begins to make sense.

[UPDATED: Brett Crosby from Google wrote me and said that ThinkMetrics was not currently nor was about to become a GAAC partner. Which really only makes Dainow’s post that much more bizarre!]

While it would at least make sense for Dainow to want to write a bizarre cheerleader piece like this, I still cannot come up with any justification for iMediaConnection to publish something so strangely biased, poorly researched, and obviously wrong. Perhaps they too have decided that the rest of the vendors are dead and thusly unlikely to buy advertising on their site. I know I wouldn’t be sending a check to iMediaConnection anytime soon if I were Tim Kopp at WebTrends or Gail Ennis at Omniture.

Dainow then makes an even more confusing comment:

“I have been converted to Google Analytics version 2 purely by the strength of the product. It is not just the range of features that is impressive, it is the integration and flexibility.”

If by “integration” Dainow means “with Google’s products only”, and by flexibility he means “a total lack of flexibility”, then I suppose I agree. Call me crazy, but I think integration means the ability to pass a variety of data automatically into and out of the application using defined APIs, not just being able to see Google AdWords impressions and costs. And I think flexibility means the ability to collect multiple custom data, to define new data schemas, and to reprocess data if necessary.

I guess we just have different definitions.

Dainow continues to blather on and on (his “Blather Index” is very high in this article!) about the greatness and wonderfulness and amazing beauty of Google Analytics. For example:

“All the tables are clickable so that I can instantly drill down on the elements that stand out. For example, I recently analyzed the performance of a tourist site’s listings in travel directories. I was able to drill down on specific directories and see which pages and descriptions were working and which were not. Within the same directory, I could see some listings that had a bounce rate of 9 percent and others with a bounce rate of 70 percent.”

Well no wonder Brandt’s so in love with Google Analytics: The tables are clickable and he can instantly drill down on elements that stand out! That is certainly a feature not found elsewhere in web analytics …

I’m getting snarky so I’ll wrap this up. Dainow concludes with the following:

“But despite its failings, the overall range and flexibility of Google Analytics, combined with the price (free), leads me to expect the new version to totally dominate the market and drive most competitors out of business. You need an extremely good reason, or three, to continue staying with any other product.

If there is to be any future in web analytics software for any competitor, that company will need to rapidly expand the scope of reporting available and seriously enhance flexibility and drill-down capabilities.

Industry consolidation is sure to follow, and I expect WebTrends to be one of the few companies with the pockets to pursue such a strategy. It is surprising that Microsoft has not produced a product to compete.”

In these three short paragraphs, Dainow demonstrates a near complete lack of understanding of web analytics and the web analytics marketplace. Google Analytics already dominates the market in terms of total domains coded, but dominance isn’t defined by the breadth of your coding; it’s defined by the success your customers have using your application!

I’m not saying that GA customers aren’t able to be successful, but the data suggests that they still have a long way to go before the value of Google Analytics, or any free analytics application (sorry Ian!), can be assumed. Web analytics is hard and “pretty”, “free”, and “Googly” don’t make it any easier. Dedication and commitment make web analytics easier, not free and click-able.

There are hundreds of good reasons for any company to continue to use an alternative to Google Analytics: Dedicated support, “Enterprise-class” (sic) product features, and a company-wide commitment to customer success, not just to gathering all the world’s data, are three that come to mind.

Most of the licensed solutions on the market today have significantly greater reporting, flexibility, and drill-down capabilities than exist in Google Analytics. Visual Sciences, Omniture, WebTrends, ClickTracks, Coremetrics and others have all spent years working on these kinds of issues, and I think their customers largely agree that they’ve done a pretty good job. Visitor segmentation, custom reporting, and data warehouse analysis are all fundamentally important to “real” web analytics and are all basically absent in Google Analytics.

No disrespect to the management team at WebTrends, but don’t you think that Omniture and their $1.16B USD market cap would qualify as “having deep pockets”? Not that Omniture is likely worried at all about the competitive threat described in your article — plus they’re going to save a bundle by not advertising at iMediaConnection!

Regarding Microsoft and Dainow’s desire to sell Bill Gates his discarded web analytics solution … I think Dainow is the only person in the world who didn’t read last week about Microsoft Gatineau!

The comments at iMediaConnection basically all ask Dainow the same question–What planet are you from, dude?–and are best summarized by this comment:

“oh please….. it is lousy — we are now going to move to Omniture because of all the deficiencies in 2.0 — this kind of post must be paid by Google because people who use it for major adspends (Over 1m for us) know what a lousy move this was for us.. hey but I know the bloggers are excited.. while it has a few nice additions the removal of so many key features and the inability to see metrics together that previously were easy to compares are serious detriments.. plus it is not nearly as sophisticated as it once was.. stop drinking the kool-aid ..” (Elxiabeth schachin)

Well put, Elxiabeth.

In summary: I’m sorry to hear that things didn’t work out for Dainow’s company, especially given the great success that almost everyone else in this industry has been having for the last 24 months. And I wish him all the best as a GAAC partner — the world definitely needs more GAAC partners and smart people able to provide technical support for Google’s wonderful and amazing free application. But the kind of biased, self-serving, and poorly researched rhetoric published in Dainow’s piece has no place in the market today, at least in my humble opinion.

What do you think? Is Google Analytics going to destroy the web analytics marketplace? Is GA2 the best web analytics application in the entire universe? Are you calling your licensed vendor today to cancel your contract, and calling your broker to divest your holdings in OMTR and VSCN now that Dainow has made such a compelling case? I’d love to hear what you all think.

Analytics Strategy, General, Industry Analysis

I am heading to Tokyo but a few thoughts before I go …

Thanks to everyone who has been so engaged in the debate over Technorati’s utility as a data source for ranking blogs.  I guess I opened a can of worms with that post, but the debate has been just great!  But on to bigger and better things … I’m just about to board my flight to Tokyo, Japan to give the keynote presentation at Digital Forest’s Marketing ROI Day conference on August 1st.  I’m very excited about this opportunity and about finally meeting my generous hosts at Digital Forest.  If you are reading this and live in or near Tokyo, please come to Marketing ROI Day and meet me in person!

Incidentally, I’ve finally updated my presentation schedule.  You can learn where I’ll be presenting pretty much through the end of the year at this URL:

http://www.analyticsdemystified.com/link_list.asp?l=Presentation

Finally, I think I mentioned that I’m writing for DM News now.  My second article, titled “Hiring Myths for Web Data Talent,” is now available online (and theoretically in print as well!).  In this article I address four key issues that anyone looking to hire experienced talent — regardless of which analytics platform they’ll be using — needs to consider.  I encourage each of you to read the article, but here is the top-line summary of hiring myths:

  1. That an analyst is always the most important “first hire”
  2. That a mathematics background is a must
  3. That web analysts’ salaries can be easily compared to common IT functions
  4. That a good hire guarantees positive return on investment

I welcome your comments and feedback regarding my presentation calendar, the DM News article, and pretty much anything else you’re interested in chatting about.

Adobe Analytics, Analytics Strategy, Industry Analysis, Reporting

On NetRatings and time spent on site

Lost in all of the fuss about NetRatings dropping page views as a metric used to calculate site popularity is the fact that the company actually did a pretty smart thing: they took my advice from February 15th of this year and rolled in a very valuable and useful “sessions” metric. Well, maybe it wasn’t my advice they took, but either way I think it was a great idea to drop page views, since they’ve become increasingly inconsistent, and to instead focus on the one metric that is consistently applied and well defined: sessions.

Unfortunately NetRatings chose to focus their announcement on “total minutes,” saying that time was a better measure of engagement. Personally I’ve never been a very big fan of time spent metrics — I guess I’ve just looked too long and too hard at all the problems associated with how time is collected and recorded in the web analytics realm.

There is a really lively thread at the Web Analytics Forum at Yahoo! Groups on this subject that is definitely worth a read if you’re interested.

And I’ll admit, I don’t have all the details of how panel-based services like Nielsen and comScore track time spent. If they’re actively tracking the user and only counting time when the browser window is active and the mouse is moving, well, that would be a good use of the panel. My suspicion is that, as in web analytics, they’re simply recording the delta between the first and last request for a page in the domain — a strategy that suffers from a litany of well-described problems.

The two I see as most problematic (a quick sketch follows this list) are:

  • Single page visits are either difficult to count or not counted in time spent calculations
  • The amount of time a web page is open is likely only poorly correlated to the visitor’s actual engagement with the page
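
To make the first problem concrete, here is a minimal sketch of the naive “first request to last request” calculation, assuming nothing more than a list of timestamped page requests for a session (the log format, timestamps, and function name here are all hypothetical):

    from datetime import datetime

    # Hypothetical session: each entry is (timestamp, page requested)
    session = [
        ("2007-08-06 09:00:00", "/index.html"),
        ("2007-08-06 09:02:10", "/products.html"),
        ("2007-08-06 09:07:45", "/contact.html"),
    ]

    def naive_time_on_site(page_requests):
        """Delta between the first and last page request in a session."""
        if len(page_requests) < 2:
            # A single-page visit has only one timestamp, so its duration
            # is unknowable -- most tools report zero or drop the visit.
            return None
        fmt = "%Y-%m-%d %H:%M:%S"
        first = datetime.strptime(page_requests[0][0], fmt)
        last = datetime.strptime(page_requests[-1][0], fmt)
        return (last - first).total_seconds()

    print(naive_time_on_site(session))        # 465.0 seconds
    print(naive_time_on_site(session[:1]))    # None -- the single-page visit

Note that the time spent on the last page of any session is invisible to this approach as well, for exactly the same reason the single-page visit is.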

Some have already noted that very popular sites like Google will do poorly in time spent on site because one of their dominant use cases involves only a single page (I search and I go). Conversely, depending on how time spent on site is calculated, the search engines may show inordinately long times spent, based on a search leading to a long browse of a discovered site, leading back to the search results (same session, clock presumably still ticking), leading to the next discovered site, and so on.

I for one use iGoogle in exactly this way: I load the page frequently throughout the day and do nothing more than look at a single page view. In fact, unless Nielsen is either tracking the AJAX interaction with the iGoogle interface or counting single page view sessions, it is likely that my interaction with iGoogle is not counted at all. But let me assure you, I am quite engaged with the content in my Google portal (something that would be well evidenced by the total session count I generate at the site each day).

As I looked back through the plethora of comments on my original post about using sessions to compare sites, I noticed that I had made this statement in response to a comment from Jacques Warren:

  • If you want to compare two or more web sites, use sessions because of the reasons I outlined in my original post.
  • If you’re interested in the number of people coming to one web site (presumably yours), use de-duplicated unique visitors but be mindful of cookie deletion.
  • If you’re interested in the activity of people on your web site, and if you have a “Web 1.0” web site, use page views but be mindful of issues like code coverage, proxies, robots, etc.
  • If you’re interested in the activity of people on your web site, and if you have a “Web 2.0” web site built around RIAs, etc., use some form of event model.

I’ll stand by this. Until I know more about how N/NR and comScore calculate their time spent on site metrics, it’s hard to believe their numbers are any more useful or accurate than those provided by direct measurement systems. That said, I’d welcome a briefing on the subject from either company if they’re reading this and are interested in having me pick apart their methodology (er, spending some time with me).

If companies really need to use time spent on site, they should consider using better key performance indicators for time, such as Percent Low/Medium/High Time Spent on Site categories (something I talk about at length in The Big Book of Key Performance Indicators).  That way N/NR could report on the percent of all tracked sessions that were “30 seconds or less,” “31 seconds to 5 minutes,” and “More than 5 minutes” (as an example), which would give us a more powerful view into the relationship between visitors and the time they spend on site.
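
A back-of-the-envelope version of that bucketed KPI is easy to sketch, assuming only a list of session durations in seconds (the durations below are made up; the bucket boundaries are the example thresholds from the paragraph above):

    from collections import Counter

    # Hypothetical session durations, in seconds (illustrative data only)
    durations = [12, 45, 300, 8, 950, 75, 20, 610, 33, 1800]

    def time_bucket(seconds):
        """Assign a session to one of the example buckets."""
        if seconds <= 30:
            return "30 seconds or less"
        if seconds <= 300:
            return "31 seconds to 5 minutes"
        return "More than 5 minutes"

    counts = Counter(time_bucket(s) for s in durations)
    for bucket, n in sorted(counts.items()):
        print(f"{bucket}: {n / len(durations):.0%} of sessions")

Reported as percentages, the metric is comparable across sites of very different sizes, which is exactly what a syndicated ratings service needs.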
At the end of the day I like that N/NR has provided a consistent and easily compared metric to their customers in “total sessions,” which is what I will inevitably focus on as a measure of site popularity. Having devoted quite a bit of time to describing what I believe to be a solid measure of visitor engagement, it’s difficult for me to think of “time spent on site” (or even “total sessions”) as a good proxy. Time spent, recency, depth of session, session number, etc. are all components of engagement, not direct measures.

What do you think? Is Nielsen right and I’m crazy? Have you been looking closely at your time spent on site metric for years and are delighted that the rest of the world has finally caught up? Or are you like me and spend far too much time browsing from site to site, flipping from task to task, and thusly confounding clocks and counters on every site you visit?

I welcome your comments.

Analytics Strategy, General, Industry Analysis

comScore study sheds new light on risks to cookie-based measurement

Awhile back the folks at comScore called me and asked if I would be surprised to learn that cookies were being deleted at a pretty high rate. Of course I said, “No, because I reported as much in 2005.” Through the course of the conversation, however, it became clear that comScore had the ability to shed new light on our understanding of cookie-based measurement; specifically they had the ability to measure the rate of deletion associated with first-party cookies.

comScore published the results of that study today.

I will fight the temptation to smugly say, “Ah ha! I told you so …” since the comScore data shows that I was both right and wrong when I first wrote about cookie deletion while I was with JupiterResearch. I was right in my assessment that this is happening far more frequently than those of us in the web analytics field want to believe. But I was wrong in my assumption that cookie deletion was largely limited to third-party cookies.

The comScore data reports that over 30 percent of their panel of 400,000 home user computers deleted both first- and third-party cookies. Now, when I talked to Andrew Lipsman and Gian Fulgoni from comScore I repeatedly encouraged them to check and double-check these findings, since their number for first-party cookies, especially, is much, much higher than I think any of us expected to see.

That said, I have no reason to believe that comScore would make this claim frivolously (okay, except for the fact that they provide a competing methodology to cookies) … I have asked comScore for a deeper briefing on their research, but nothing has been scheduled as of this posting. Perhaps at my urging, comScore took their research a step further and surveyed a subset of their panel about their stated behavior towards cookies. In the press release, Dr. Magid Abraham addresses this in the context of the conventional wisdom that assigns greater risk to third- than first-party cookies:

“There is a common perception that third-party cookie deletion rates should be significantly higher than first-party cookie deletion rates,” continued Dr. Abraham. “Because many PC users reset or delete their cookies using security protection programs, conventional wisdom dictates that people are more likely to selectively expunge third-party cookies – which are generally deemed more invasive – while maintaining their first-party cookies. But these findings suggest that selective cookie management is not prevalent, a fact that comScore confirmed via a survey, with only 4 percent of Internet users indicating that they delete third-party but not first-party cookies.”

Yikes. When you look at the tables in the comScore study you can see where the problem is coming from: serial cookie deleters, the 7% of site visitors (measured via the comScore panel) who repeatedly remove their cookies and thusly will appear as a new site visitor with every visit. I addressed the idea of serial deleters in my final JupiterResearch report on “The Crumbling Cookie” and, at the time, speculated that some of the more nefarious activities available through the Internet were to blame.

Still, I never would have put the number as high as 7 percent.
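
To see why even 7 percent matters so much, here is some illustrative arithmetic; the visit frequency is my assumption, not a comScore figure:

    # Illustrative arithmetic: how serial deleters inflate "unique visitors"
    real_visitors = 100
    visits_per_visitor = 10    # assumed visits per measurement period
    serial_deleters = 7        # comScore's 7% serial-deleter figure, per 100

    # Cookie keepers count once; serial deleters get a new cookie per visit
    keepers = real_visitors - serial_deleters
    measured_uniques = keepers + serial_deleters * visits_per_visitor
    print(measured_uniques)    # 163 cookie-based "uniques" for 100 people
    print(f"overcount: {measured_uniques / real_visitors - 1:.0%}")    # 63%

Under those assumptions, 7 percent of visitors produce a 63 percent overcount in “unique visitors,” which is exactly why long-period unique visitor counts deserve this kind of scrutiny.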

It’s interesting to me that cookies are back in the news. It will be more interesting to see how all of this is digested in the coming days, weeks, and months. I wonder if Seth Godin will comment on the comScore study? I mean, I’m not sure that the “echo chamber” argument applies to comScore’s panel of 400,000 measured, identified individuals.

This seems to be a topic ripe for commentary and conversation. What do you think? Is comScore crazy? Is this report flawed? Or are we just fooling ourselves when we believe that “unique visitor” counts are an accurate representation of the number of real human beings coming to our web sites over long periods of time?