Recap: Web Analytics Wednesday with Foresee Results
Last week was our monthly Web Analytics Wednesday in Columbus. Foresee Results sponsored the event and provided a highly engaging speaker: Kevin Ertell, Foresee’s VP of Retail Strategy and the blogger behind Retail: Shaken Not Stirred.
We had a good crowd — just under 30 people — and we did our usual half-hour of networking before sitting down to order food and cover the evening’s topic.
We had attendees from a wide range of companies: Nationwide Insurance, Resource Interactive, Victoria’s Secret (including Monish Datta…which I mention here solely for quasi-inside SEO joke purposes), DSW, Diaz & Kotsev Business (Web) Consulting, WebTech Analytics, Quest Software (makers of Foglight, actually, which I didn’t realize until I was writing the rest of this post), QStart Labs, Submerged Solutions, Bizresearch, Lightbulb Interactive, JoeMetric, Express, Cardinal Solutions, and various independent consultants. By my count, 30% of the attendees were first-timers, and the remaining attendees were a pretty even split between hard-core regulars and every-few-months dabblers.
Kevin is a great speaker — one of those guys whose use of PowerPoint is primarily to provide images that back up the stories he weaves.
One of the stories was the “tree stump on the conference room table” story, which was about how we get used to having odd, not-particularly-helpful aspects of our web sites that are jarring to first-time and infrequent visitors, but that we never think to address.
You can ping Kevin on Twitter directly for a more complete explanation of that analogy, if you want. If I try to recreate it entirely, I’ll butcher it for sure! I will take a shot at summarizing the four-step process Kevin laid out for going beyond web analytics data to drive site improvement, though, which was the meat of the presentation.
Step One: Ask Your Visitors for Feedback
On-site surveys provide valuable information because they let you ask your visitors questions directly rather than trying to infer what they were trying to do, how successful they were at doing it, and how smooth the process was based strictly on behavioral data. Web analytics = behavioral data. Survey data = attitudinal data. Got it?
Some of the highlights on this step:
- Incentives aren’t needed to get people to take a 15-30 question survey. I think Kevin said they see something like 6-10% of the people who are offered a survey actually accept the offer (not all visitors to a site get offered the survey), and they’re able to build up an adequate sample fairly quickly in most cases
- Typically, Foresee Results offers the survey when visitors arrive on the site but then conducts it on exit
- The wording of the survey questions matters — there are good/valid ways to word questions and bad/invalid ways; there’s oodles of research and expertise on that subject, and it’s worth partnering with someone (a consultant, a company) who really knows the ins and outs to make sure that the data you collect is valid
- The Foresee Results secret sauce is that they ask questions that fall into three broad categories: 1) questions about different aspects of the site (content, functionality, navigation, search, etc.), 2) questions to gauge customer satisfaction (very precisely worded questions backed up by the research behind The American Customer Satisfaction Index — ACSI), and 3) questions to gauge likely future behavior (likelihood to purchase online, likelihood to purchase offline, likelihood of returning to the site, etc.). Foresee Results then uses an analytic model to link these three elements together: the aspect scores act as independent variables driving customer satisfaction, and customer satisfaction, in turn, acts as the independent variable driving the various future behaviors. There’s a rough sketch of that chain just below. It’s a pretty nifty tool that I’ve been learning more about over the past few months. Powerful stuff.
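Foresee’s actual model is proprietary and grounded in the ACSI research, so don’t read the following as their method. But the structure is easy to sketch: here’s a minimal two-stage regression in Python, with entirely fabricated data and hypothetical aspect names, just to make the “aspects drive satisfaction, satisfaction drives behavior” chain concrete.

```python
# Rough illustration only -- Foresee's ACSI-based model is proprietary
# and far more sophisticated. This sketch just shows the structure:
# site-aspect scores drive satisfaction, and satisfaction in turn
# drives likely future behavior. All data below is fabricated.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n = 500  # pretend survey respondents

# 1-10 ratings of four site aspects: content, functionality, nav, search
aspects = rng.integers(1, 11, size=(n, 4)).astype(float)

# Fabricated satisfaction and future-behavior scores for the demo
satisfaction = aspects @ np.array([0.2, 0.1, 0.4, 0.3]) + rng.normal(0, 1, n)
return_likelihood = 0.8 * satisfaction + rng.normal(0, 1, n)

# Stage 1: which site aspects move satisfaction the most?
stage1 = LinearRegression().fit(aspects, satisfaction)
print("aspect -> satisfaction weights:", stage1.coef_.round(2))

# Stage 2: how strongly does satisfaction move future behavior?
stage2 = LinearRegression().fit(satisfaction.reshape(-1, 1), return_likelihood)
print("satisfaction -> return likelihood:", stage2.coef_.round(2)[0])
```

The big stage-one weights point at the aspects worth fixing first, which is exactly the “where are the opportunities” diagnostic this step is after.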
This step, done right, gives you the basic diagnostics: where the most significant opportunities for driving improvements exist with your site.
Step Two: Augment Quantitative with Qualitative
This step augments the quantitative survey data with more qualitative information. The quantitative data helps you slice and segment the responses so that you can review the answers to open-ended questions in a more meaningful way.
Presumably, these qualitative questions are ones you update over time as you identify specific areas to focus on. If, for instance, you found in Step One that navigation was an area where your site scores low and that also has a significant impact on customer satisfaction, then you might gather some qualitative data specifically regarding navigation, and you might break that out between people who came to the site expecting to make a purchase and people who came simply to do comparison shopping.
This sort of analysis will give you insight into the specific friction points on the site — what types of visitors hit them and what sorts of tasks they’re trying to accomplish when they do.
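To make that concrete, here’s a hypothetical sketch in pandas (the column names, scores, and comments are all invented) of pulling the open-ended comments for just the low-navigation-score, purchase-intent segment:

```python
# Hypothetical sketch of Step Two: use the quantitative answers to
# segment the open-ended comments. All columns and values are invented.
import pandas as pd

responses = pd.DataFrame({
    "nav_score":    [3, 9, 2, 8, 4],  # 1-10 navigation rating
    "visit_intent": ["purchase", "browse", "purchase", "browse", "purchase"],
    "comment":      ["Couldn't find the sale section",
                     "Easy to use",
                     "Menus kept collapsing on hover",
                     "Nice layout",
                     "Search results felt random"],
})

# The friction point Step One surfaced: would-be purchasers who
# rated navigation poorly
segment = responses[(responses["nav_score"] <= 5) &
                    (responses["visit_intent"] == "purchase")]
for comment in segment["comment"]:
    print("-", comment)
```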
Step Three: Watch Customers (in a Focused Manner)
This is a step that Kevin pointed out companies sometimes try to put first, which makes it unnecessarily expensive and time-consuming. The key here is to use the information from the first two steps to focus what you are going to observe and how. Various options for watching customers:
- Session replay — exactly what visitors on the site did, and how; in the case of Foresee Results, these replays can be tied directly to specific survey respondents (pretty slick), but Tealeaf and Foglight are tools that provide replay functionality, too
- Eye-tracking — this requires getting people into a lab of some sort, so, obviously, the more focused you can get, the better
- Usability testing — this may include eye-tracking, but it certainly doesn’t have to; obviously, there are benefits to being able to focus the usability testing, whether it’s conducted in a usability lab or even in-store
Now you should have a really good handle on specifically what’s not working. But what if you don’t have any good ideas as to what to do about it? Then…
Step Four: Usability Audit
Work with usability experts to assess the aspects of your site that are underperforming. Arm them with what you have learned in the first three steps!
To me, it seems like you could swap steps three and four in some cases — let a usability expert audit your site and identify likely opportunities to improve the trouble spots.
Driving Continuous Incremental Improvement
By keeping the survey running on an ongoing basis — adjusting questions as needed, but keeping the core questions constant — you can monitor the results of changes to the site as you roll them out. And, of course, your web analytics data — especially on-site conversion data — is one tool for monitoring whether you are driving outcomes that matter.
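This next bit wasn’t part of Kevin’s talk, but as one illustration of that monitoring piece: a two-proportion z-test is a simple way to check whether an incremental change actually moved conversion or whether you’re just looking at noise (the numbers below are made up).

```python
# Made-up before/after conversion counts; a two-proportion z-test is
# one simple way to separate a real lift from random variation.
from statsmodels.stats.proportion import proportions_ztest

conversions = [480, 540]   # converting visits: before, after the change
visits = [20000, 20000]    # total visits in each period

z_stat, p_value = proportions_ztest(conversions, visits)
print(f"2.40% -> 2.70% conversion; z = {z_stat:.2f}, p = {p_value:.4f}")
# A small p-value suggests the lift is unlikely to be random variation.
```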
One point on the incremental changes front: during the Q&A, Kevin talked about how sites that roll out major redesigns invariably see a temporary dip in results while visitors get used to the new site. Incremental changes, on the other hand, can occur without that temporary drop in performance.
Interesting stuff!