White Paper: Testing Secrets of Success
If you’ve been reading my blog for any amount of time, you’ve inevitably heard me comment that I think “web analytics is hard” — not complex, not mysterious … just plain difficult. It’s hard to select vendors, hard to install code, hard to train users, hard to get the “right” reports, hard to get management’s attention, hard to make the case for change … the list goes on and on (and on and on and on!)
Hard, but not impossible.
In the past few years we have definitely started seeing an increase in the number of companies that “get it” — so much so that we’re able to program an entire conference built around the superstars of web analytics. More and more I am talking to, working with, and hearing about companies that have leveraged the “web analytics is hard” mindset to properly set expectations regarding their use of technology, their deployment of people, and their use of business process — companies that really excel at web data analysis and turn that analysis into tangible improvement for both the business and their customers.
As I collect more information about these web analytical competitors, one thing nearly always emerges as a hallmark of their success: some type of structured testing program. Of course this makes perfect sense — without testing, the analyst is never really sure about the impact of their recommendations. So much so that I have often said, “if you’re not testing, you’re not really doing web analytics!”
But the increase in testing has raised an interesting corollary to my “web analytics is hard” manifesto … testing is hard too!
Fortunately we’re all a little bit older and a little bit wiser this time around, and we recognize that testing requires more than just throwing code on the page and clicking the “Optimize” button. Testing is a process that requires people and technology … sound familiar?
I’m bringing this up for two reasons:
- At the X Change conference in San Francisco on September 9, 10, and 11, Matthew Wright from HP will be leading a conversation titled “Testing, Testing, Testing: Building Consensus and Evaluating Results” to discuss the nuances behind testing — things like getting stakeholder approval, planning, and clearly defining measures of success;
- This morning the nice folks at SiteSpect published a white paper I wrote that details ten “best practices” for testing that I think a lot of folks new to testing often forget. Titled Successful Web Site Testing Strategies: Ten Best Practices for Building a World-Class Testing and Optimization Program, this white paper is available at no cost (registration required) and covers nuances like testing teams, stakeholder involvement, test plans, timelines, and, of course, measurement.
If you’re working to become an analytical competitor and join the ranks of the kinds of companies that get invited to lead a conversation at X Change, I highly recommend grabbing the testing white paper, coming to the X Change, or BOTH! Especially if you’re serious about testing, I think you’ll find this free white paper useful as you work to set expectations — and trust me, testing is as much about expectations as it is about execution!