A/B Testing vs. Analytics
I’ve just been reading ‘Web Analytics 2.0’ by Avinash Kaushik, and it largely confirms my opinion of the web analytics world at the moment: in a 445-page book, 17 pages are devoted to A/B and multivariate testing, and the remainder, broadly speaking, to analytics.
It really clued me in when I read this: “Without a doubt, A/B testing is the best technique to start your testing journey. Because of its inherent simplicity, you can get going quickly, and you can energize your organization while having some fun. In addition, the results are easy to communicate. You don’t have to worry about multivariate tests, regressions, or other such (lovely) things” (Web Analytics 2.0 by Avinash Kaushik, Wiley Publishing, Canada, 2010, p198).
Is the analytics world really so self-absorbed that it thinks generating metrics is more important than testing? It seems so.
I can only speak from my own experience, but broadly, as I see it, web analytics falls into two categories:
1) I want to improve my website conversion rate.
2) I am spending $X on online / DM / TV / etc. marketing; which channel is most effective on a cost-per-acquisition basis?
Overwhelmingly, the metrics people seem interested in generating are either for (2), or for (1) with no real understanding of why they are gathering them.
Let’s collect bounce rates for our website!
Erm… because… we… should write a report about that. OK, how about we collect first-touch / last-touch information for our various online marketing strategies so we can see which one gives us the best CPA?
Ah… well, um, we’ll tag them all, and then when visitors arrive at the website we’ll compare the conversion rates of the different incoming sources and rank them.
So, basically an A/B…N test where the traffic source differs but the content served is the same default content each time?
Ah… well, yes.
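The “tag and rank” exercise in the dialogue above is already just such a test. A minimal sketch of the comparison, with entirely hypothetical source names and figures:

```python
# Hypothetical first-touch tallies per marketing source: same page served
# to everyone, only the incoming traffic source differs.
sources = {
    "email":   {"visits": 5_000,  "conversions": 150},
    "ppc":     {"visits": 8_000,  "conversions": 200},
    "display": {"visits": 12_000, "conversions": 180},
}

# Rank sources by conversion rate -- an A/B...N comparison in all but name.
ranked = sorted(
    sources.items(),
    key=lambda kv: kv[1]["conversions"] / kv[1]["visits"],
    reverse=True,
)

for name, s in ranked:
    rate = s["conversions"] / s["visits"]
    print(f"{name}: {rate:.2%}")
```

Nothing here is beyond a spreadsheet; the point is that the moment you rank sources against the same content, you are running a test, whether you call it one or not.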
Analytics should follow testing
I couldn’t be more opposed to the approach in the example above. Analytics metrics should always exist either to track conversions for a test, or to discover new areas of the website that require attention and testing.
Collecting data for its own sake is pointless, and I think that is widely accepted.
So why is so much focus given to collecting metrics?
Surely the core focus should be on testing, with the metrics required for that testing derived and implemented to serve it. An A/B test doesn’t have to literally be two visually distinct pages; it’s still an A/B test when you are comparing the conversion rates of two separate pages.
Calling it a test formalizes the process of hypothesizing, testing, and comparing two sets of data. You can do that in a spreadsheet, in the web analytics tool of your choice, or whatever you prefer.
The key though, is making sure that:
1) The data you gather is statistically significant.
2) You document and accumulate the learnings from your various tests.
3) Over time, you improve the conversion rate of your website.
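Point (1) is the step most often skipped. One standard way to check whether an observed difference between two variants is real or just noise is a two-proportion z-test; a minimal sketch, with hypothetical figures:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: how many standard errors apart are the
    conversion rates of variants A and B?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)   # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical test: variant A converts 120/2000, variant B 160/2000.
z = two_proportion_z(120, 2000, 160, 2000)
# |z| > 1.96 corresponds to significance at the 95% confidence level.
print(f"z = {z:.2f}, significant at 95%: {abs(z) > 1.96}")
```

Until the sample sizes push the result past that threshold, “B is winning” is an opinion, not a learning worth documenting.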
It really amazes me that so little attention is given to the praxis of web analytics testing, and so much to the process of setting up and understanding metrics that have no practical application.