
Monograph

Thursday, January 15, 2004

Web Statistics

Following up on yesterday's post about Deming, I have begun combing the web for examples of people who have applied statistical techniques to web traffic. Most of what I've found is about estimating server loads and capacity planning, typically at a scale far beyond what I'm interested in.

However, over at Useit.com I found a Jakob Nielsen column on applying regression and standard deviation to traffic analysis. Since this is the first thing I'm planning to do in these experiments, it makes encouraging reading.
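As a first experiment along those lines, the idea boils down to fitting a trend line to daily page views and flagging the days that fall well outside the normal scatter around it. Here's a minimal sketch in Python; the traffic numbers are made up for illustration:

```python
import numpy as np

# Fourteen days of hypothetical page-view counts -- made-up numbers,
# with a gentle upward trend and one suspicious spike on day 10.
pageviews = np.array([310, 320, 305, 335, 340, 355, 350,
                      360, 375, 370, 620, 385, 395, 400])
days = np.arange(len(pageviews))

# An ordinary least-squares fit of a straight line gives the trend.
slope, intercept = np.polyfit(days, pageviews, 1)
trend = slope * days + intercept

# The standard deviation of the residuals measures the normal scatter;
# days more than two deviations off the trend deserve a closer look.
residuals = pageviews - trend
sigma = residuals.std()
outliers = days[np.abs(residuals) > 2 * sigma]

print(f"trend: {slope:+.1f} views/day, scatter: {sigma:.1f}")
print("days to investigate:", outliers)
```

Real traffic also has a strong weekly cycle, so in practice you'd want to compare Mondays with Mondays, or strip out the seasonality first, before trusting the deviations.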

Another thing I wish I'd read before last week's job interview is this piece on six sigma standards for the web. While Jakob is comparing apples and oranges when he moans that the web is 100,000 times less efficient than six sigma, there are useful ideas in it.
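For scale, six sigma in the manufacturing sense is usually quoted as 3.4 defects per million opportunities. Here's a back-of-the-envelope version of the comparison, where the 2% conversion rate is my own illustrative assumption rather than a figure from Jakob's column:

```python
# Six sigma quality is conventionally quoted as 3.4 defects
# per million opportunities (DPMO).
SIX_SIGMA_DPMO = 3.4

# Assume a typical e-commerce conversion rate of 2% -- my own
# illustrative figure, not one taken from Nielsen's column.
conversion_rate = 0.02

# Treating every visit that doesn't end in a sale as a "defect":
web_dpmo = (1 - conversion_rate) * 1_000_000

print(f"web 'defects' per million visits: {web_dpmo:,.0f}")
print(f"ratio to six sigma: {web_dpmo / SIX_SIGMA_DPMO:,.0f}x")
```

Counting every lost visitor as a defect puts the ratio in the same ballpark as Jakob's 100,000 figure, which is exactly why the comparison is apples and oranges: a visitor browsing away is not the same kind of failure as a part coming off the line out of spec.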

If he were being a little more honest he'd note that six sigma relates to processes, specifically repetitive ones in industrial environments. Making it as easy to buy something from an e-commerce store as it is to put the round object in the round hole on the assembly line is a worthy goal, but not a useful one.