Guerrilla Analytics is pretty much what it sounds like – it’s about going out, without permission and without sanction, and conducting analytics on publicly available information, purely out of curiosity, for a case study, or for the common advancement of the discipline or technology. In many ways, I admire the work that has been done by the dev community. jQuery is an example of a developer-led open-source technology, a common library that many front-end devs dip into. It’s just awesome because it saves them so much time and effort. Many developers are truly technologists. And the really awesome ones go out and experiment. They actually push the science forward, and frequently, when it comes to many of these[…]

Patrick @glinskiii once identified three large buckets of skills in his “it takes an orchestra” argument for web analytics programs. It feels like years ago (it’s probably only been about a year), and it has since evolved. It goes like this: There are three large skillsets in web analytics. T’s, or Technical Analysts, specialize in the technical side of web analytics. They’re the people who can tell you where to put single quotes versus double quotes in the s.campaign variable of Omniture. S’s, or Strategic Analysts, specialize in the strategic side of web analytics. They’re the people who can tell you the social process necessary to take an insight and translate it into action. A’s, or Analytical Analysts, specialize in extracting[…]

Joseph Carrabis wrote something very relevant to our interests, especially when it comes to planning Web Analytics projects. It’s worth the read. Go check it out. I’ll wait. What’s easily missed on the first scan is the passage: “The purpose of these rules is to tend towards 0 the likelihood that a mistake will be made.” And the two rules, which are the meaty bits, are: “Rule #1 – Eliminate Variables” and “Rule #2 – Remove Ambiguities”. Rule 1 is important. I categorize knowledge into three broad buckets: what I know that I know, what I know that I don’t know, and what I don’t know that I don’t know. It’s the third category that’s the scariest of all. When I[…]

An excellent blog post on Estimating the Effects of Cookie-Deletion is timely and welcome, given the relative degree of contention around the Unique Visitor (UV) definition. The chart above is not gospel, and you should not be running around saying that all websites have 100% human-visitor inflation. That isn’t what Angie is saying. Angie has offered up something valuable: a pretty simple model for estimating UV inflation. What Angie is arguing here is that the effect of cookie deletion on your unique-visitor-to-human estimate will depend heavily on how your website is used and the inherent habits of its audience. Let’s assume that there’s a fanatical group of humans that visits your website. Let’s also assume that within that[…]
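To make that concrete, here is a minimal sketch of the idea in Python (not Angie’s actual model; the segment labels, sizes, and deletion rates below are entirely made up) showing how the audience mix drives the inflation figure:

```python
# Back-of-the-envelope sketch of unique-visitor (UV) inflation from cookie deletion.
# Assumption: a person who clears cookies d times in the reporting period is
# counted as d + 1 "unique visitors" by a cookie-based tool.

# Hypothetical segments: (label, number of humans, cookie deletions per period)
segments = [
    ("fanatical regulars", 1_000, 4),  # visit constantly, purge cookies often
    ("casual visitors",    9_000, 0),  # drop by once or twice, never delete
]

humans = sum(n for _, n, _ in segments)
reported_uvs = sum(n * (d + 1) for _, n, d in segments)

print(f"{humans} humans reported as {reported_uvs} UVs "
      f"({reported_uvs / humans:.0%} of the true human count)")
```

Shift the mix toward the fanatical segment and the inflation balloons; shift it toward the casual visitors and it nearly disappears, which is exactly why a single, universal inflation figure is misleading.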

I’m smitten with Rails. Rails conforms to my worldview in two ways: DRY and MVC. DRY stands for ‘Don’t Repeat Yourself’. It’s a great principle, especially when writing difficult SPSS code. MVC stands for Model, View, Controller – and it’s the dominant way that I organize, present, and modify data. There are other biases built into Rails that I like, but mostly, it’s those two principles. I’m looking at Rails as an important way of solving a number of lingering problems in Web Analytics, and once I learn enough to actually start experimenting and solving them, I’ll share what I find.
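As a rough illustration of why MVC maps so naturally onto analytics work, here is a minimal sketch (plain Python rather than Rails, and all the names are invented) that keeps the data, its presentation, and the logic mediating between them separate:

```python
# Minimal MVC-style separation for analytics data (illustrative only).

class VisitsModel:
    """Model: owns the data and how it is stored and retrieved."""
    def __init__(self, daily_visits):
        self._daily_visits = daily_visits

    def total(self):
        return sum(self._daily_visits)


def render_report(total_visits):
    """View: only concerned with presentation."""
    return f"Total visits this week: {total_visits:,}"


class ReportController:
    """Controller: mediates between the model and the view."""
    def __init__(self, model):
        self.model = model

    def weekly_report(self):
        return render_report(self.model.total())


if __name__ == "__main__":
    controller = ReportController(VisitsModel([1200, 1350, 990, 1410, 1600, 800, 760]))
    print(controller.weekly_report())
```

The payoff is the same one Rails sells: you can change how the visits are stored, or how the report is rendered, without touching the other two pieces.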

This post briefly summarizes four threads of thought and a conclusion around problem orientation. I read “Evaluation of Internet Advertising Research” by Juran Kim and Sally J. McMillan. It’s effectively a social graph exercise. The findings themselves are interesting (and you can read about them through the Web Analytics Association once I publish the review), but the reference to “invisible colleges” was especially fascinating – just coming off of the SLAB Karen Stevenson talk at OCAD. Kim and McMillan make the point that visualizing bibliographic graphs (a social graph) is useful for uncovering these colleges. The second thread has to do with “The Market Valuation of Internet Channel Additions”, by Geyskens, Gielens and Dekimpe. In it, they construct a model[…]
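Back to the first thread for a moment: here is a toy sketch of the bibliographic-graph idea (invented papers and sources, and simple shared-citation linking rather than Kim and McMillan’s actual method). Papers that cite common sources get linked, and the connected clusters that fall out are candidates for those invisible colleges:

```python
# Toy sketch: surface "invisible colleges" as clusters in a bibliographic graph.
# Two papers are linked when they cite a common source; connected components
# of that graph approximate the colleges. The data below is entirely made up.

from collections import defaultdict

citations = {
    "paper_A": {"source_1", "source_2"},
    "paper_B": {"source_2", "source_3"},
    "paper_C": {"source_9"},
    "paper_D": {"source_7", "source_9"},
}

# Build an undirected graph with an edge for every pair sharing a citation.
graph = defaultdict(set)
papers = list(citations)
for i, p in enumerate(papers):
    for q in papers[i + 1:]:
        if citations[p] & citations[q]:
            graph[p].add(q)
            graph[q].add(p)

# Connected components = candidate invisible colleges.
seen, colleges = set(), []
for p in papers:
    if p in seen:
        continue
    stack, college = [p], set()
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            college.add(node)
            stack.extend(graph[node])
    colleges.append(college)

print(colleges)  # e.g. [{'paper_A', 'paper_B'}, {'paper_C', 'paper_D'}]
```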

It’s rare that somebody forces me to really look at something differently – but Karen Stevenson, in an SLAB lecture at OCAD, did. Karen pointed out that the three human variables that matter are transactions, authority, and trust. Transactions among people are easily handled by technology. They’ve long been standardized, and in fact, we’re making incremental improvements in them all the time. Where there’s ambiguity in transactions, you need authority to make decisions. What was really left unsaid, but what I’m concluding, is that since humans are very creative people, they always manage to get themselves into non-standardized problems, and as such, they will always need authority. (Look no further than Judge Judy for daily evidence of that.) As[…]