On NetRatings and time spent on site
Lost in all of the fuss about NetRatings dropping page views as a metric for calculating site popularity is the fact that the company actually did a pretty smart thing: they took my advice from February 15th of this year and rolled in a very valuable and useful “sessions” metric. Well, maybe it wasn’t my advice they took, but either way I think it was a great idea to drop page views, since they’ve become increasingly inconsistent, and to focus instead on the one metric that is consistently applied and well defined: sessions.
Unfortunately, NetRatings chose to focus their announcement on “total minutes,” arguing that time is a better measure of engagement. Personally, I’ve never been a big fan of time spent metrics — I guess I’ve just looked too long and too hard at all the problems associated with how time is collected and recorded in the web analytics realm.
There is a very active thread at the Web Analytics Forum at Yahoo! Groups on this subject that is definitely worth a read if you’re interested.
And I’ll admit, I don’t have all the details on how panel-based services like Nielsen and comScore track time spent. If they’re actively tracking the user and only counting time when the browser window is active and the mouse is moving, well, that would be a good use of the panel. My suspicion is that, as in web analytics generally, they’re simply recording the delta between the first and last request for a page in the domain — a strategy that suffers from a litany of well-described problems.
The two I see as most problematic are:
- Single-page visits are either difficult to count or not counted at all in time spent calculations
- The amount of time a web page is open is likely only poorly correlated to the visitor’s actual engagement with the page
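To make the first problem concrete, here is a minimal sketch in Python of the delta-based calculation I suspect these services use. The `session_time_spent` function is hypothetical — my own illustration of the method, not anyone’s actual implementation:

```python
from datetime import datetime

def session_time_spent(page_request_times):
    """Naive 'delta' method: time spent = last page request minus
    first page request. Hypothetical function for illustration."""
    if len(page_request_times) < 2:
        # Single-page visit: there is no second timestamp to subtract,
        # so time spent is unknown -- typically reported as zero or
        # dropped from the calculation entirely.
        return 0.0
    return (max(page_request_times) - min(page_request_times)).total_seconds()

# Multi-page visit: three pages requested over five minutes
multi = [datetime(2007, 7, 10, 9, 0, 0),
         datetime(2007, 7, 10, 9, 2, 30),
         datetime(2007, 7, 10, 9, 5, 0)]
print(session_time_spent(multi))   # 300.0 seconds

# Single-page visit (the "I search and I go" case)
single = [datetime(2007, 7, 10, 9, 0, 0)]
print(session_time_spent(single))  # 0.0
```

Note that even in the multi-page case, time spent on the *last* page is never counted, since there is no subsequent request to close the delta.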
Some have already noted that very popular sites like Google will do poorly in time spent on site because one of the dominant use cases involves only a single page (I search and I go). Conversely, depending on how time spent on site is calculated, the search engines may show inordinately long times: a search leads to a long browse on a discovered site, back to the search results (same session, clock presumably still ticking), on to the next discovered site, and so on.
I for one use iGoogle in exactly this way: I load the page frequently throughout the day and do nothing more than look at a single page view. In fact, unless Nielsen is either tracking the AJAX-interaction with the iGoogle interface, or counting single page view sessions, it is likely that my interaction with iGoogle is not counted at all. But let me assure you, I am quite engaged with the content in my Google portal (something that would be well evidenced by the total session count I generate at the site each day.)
As I looked back through the plethora of comments on my original post about using sessions to compare sites, I noticed that I had made this statement in response to a comment from Jacques Warren:
- If you want to compare two or more web sites, use sessions because of the reasons I outlined in my original post.
- If you’re interested in the number of people coming to one web site (presumably yours), use de-duplicated unique visitors but be mindful of cookie deletion.
- If you’re interested in the activity of people on your web site, and if you have a “Web 1.0” web site, use page views but be mindful of issues like code coverage, proxies, robots, etc.
- If you’re interested in the activity of people on your web site, and if you have a “Web 2.0” web site built around RIAs, etc., use some form of event model.
I’ll stand by this. Until I know more about how N/NR and comScore calculate their time spent on site metrics, it’s hard to believe their numbers are any more useful or accurate than those provided by direct measurement systems. That said, I’d welcome a briefing on the subject from either company if they’re reading this and are interested in spending some time with me to pick apart their methodology.
If companies really do need to use time spent on site, they should consider using better key performance indicators for time, such as Percent Low/Medium/High Time Spent on Site categories (something I talk about at length in The Big Book of Key Performance Indicators). That way N/NR could report the percent of all tracked sessions that were “30 seconds or less,” “31 seconds to 5 minutes,” and “more than 5 minutes” (as an example), which would give us a more powerful view into the relationship between visitors and the time they spend on site.
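As a sketch of that idea, here is how the bucketed KPI might be computed, using the example cut-offs above. The function name and the sample session durations are my own, purely for illustration:

```python
def time_spent_distribution(session_seconds):
    """Bucket session durations (in seconds) into the example
    Low/Medium/High categories and return percentages."""
    buckets = {"30 seconds or less": 0,
               "31 seconds to 5 minutes": 0,
               "More than 5 minutes": 0}
    for s in session_seconds:
        if s <= 30:
            buckets["30 seconds or less"] += 1
        elif s <= 300:
            buckets["31 seconds to 5 minutes"] += 1
        else:
            buckets["More than 5 minutes"] += 1
    total = len(session_seconds)
    return {k: round(100.0 * v / total, 1) for k, v in buckets.items()}

# Hypothetical sample of tracked session durations, in seconds
sessions = [5, 12, 45, 90, 240, 600, 1200, 20, 310, 30]
print(time_spent_distribution(sessions))
# {'30 seconds or less': 40.0, '31 seconds to 5 minutes': 30.0,
#  'More than 5 minutes': 30.0}
```

The appeal of the distribution view is that a single outlier (one abandoned browser tab left open overnight) can wildly skew an average time spent, but barely moves the percentages.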
At the end of the day I like that N/NR has provided a consistent and easily compared metric to their customers in “total sessions,” which is what I will inevitably focus on as a measure of site popularity. Having devoted quite a bit of time to describing what I believe to be a solid measure of visitor engagement, it’s difficult for me to think about “time spent on site” (or even “total sessions”) as a good proxy. Time spent, recency, depth of session, session number, etc. are all components of engagement, not direct measures.
What do you think? Is Nielsen right and I’m crazy? Have you been looking closely at your time spent on site metric for years and are delighted that the rest of the world has finally caught up? Or are you like me and spend far too much time browsing from site to site, flipping from task to task, and thusly confounding clocks and counters on every site you visit?
I welcome your comments.