Organizational Design Patterns for Web Analytics
In my previous post, I argued that Web Analytics is not easy because of complexity, much of it caused by people. Things get lost in translation twice: once when turning data into actionable insight, and again when turning actionable insight into action.
Let’s turn to the solution, something that Jacques Warren, fellow tweeter (and #wa guru), has termed “Organizational Engineering”.
What follows is a laundry list of the elements, considerations, and biases that should feed a successful web analytics organizational design pattern.
1. It all starts with a great web analyst. A few things a great web analyst does or understands:
a) Takes the site map, goes through the site, and understands it.
b) If no site map exists (which is common), the web analyst produces one themselves; in doing so, they come to understand the site.
c) With site map in hand, uses path analysis to understand where the bulk of the traffic goes.
d) A great web analyst DOCUMENTS everything. What was done. Past reports. Past recommendations. Past successes. Past excuses. Indexes. A great web analyst is like an elephant – they never forget.
e) Understands that the tool is not the be-all and end-all.
f) Knows that the measure of success is not the number of reports produced, it’s the amount of actionable insight produced.
g) Is vendor agnostic. Sees the tool for what it really is. A means – not an end.
h) 80/20 rule. Spends 80% of the web analytics budget on quality head count and 20% on technology. If they can’t afford both an enterprise package and quality head count, goes with a free web analytics package until enough value and case studies have been generated to increase the size of the pie.
i) Knows they cannot answer every question with enough certainty to base a decision on a single input alone, every time.
j) Leads with actionability and value: do this, get X% lift, get Y in incremental revenue, at a cost of Z, for A in profit (plus or minus a confidence interval B). A rough sketch of this arithmetic follows this list.
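To make point (j) concrete, here is a minimal back-of-envelope sketch in Python. Every figure and variable name in it is hypothetical; the point is only the shape of the pitch: a recommendation, its cost, its expected payoff, and the uncertainty around that payoff.

```python
# A rough back-of-envelope for the "lead with value" pitch.
# All numbers here are hypothetical, purely for illustration.

baseline_monthly_revenue = 100_000.0   # revenue from the affected pages today
expected_lift = 0.05                   # X: expected relative lift (5%)
lift_margin_of_error = 0.02            # B: plus-or-minus on the lift estimate
cost_of_change = 1_500.0               # Z: cost to implement the recommendation

def projected_profit(lift: float) -> float:
    """Incremental revenue from the lift (Y), minus the cost of acting on it (A)."""
    incremental_revenue = baseline_monthly_revenue * lift
    return incremental_revenue - cost_of_change

low = projected_profit(expected_lift - lift_margin_of_error)
mid = projected_profit(expected_lift)
high = projected_profit(expected_lift + lift_margin_of_error)

print(f"Do this, spend {cost_of_change:,.0f}, "
      f"and expect roughly {mid:,.0f} in monthly profit "
      f"(somewhere between {low:,.0f} and {high:,.0f}).")
```

The specific numbers matter far less than leading the conversation with them: cost, payoff, and a stated range of uncertainty.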
2. Starting Small Versus Going Big
a) The advantages of starting small include: lower risk, higher odds of success, building momentum, the ability to easily try again after a failure, and the ability to learn from failures and re-apply that knowledge.
b) The advantages of going big include: a bigger payoff and much greater credibility if something is executed, and a greater chance of gaining support for a better design pattern from the get-go.
The decision of which way to go depends on the character of both the analyst and the company.
3. Winning Support
a) Gaining the support of both middle and upper management for anything is vital. Not everybody’s support really matters, but every little bit of it helps.
b) Building a bridge to IT. IT typically owns the technical infrastructure of the website. They have your tags, and can break them. You need their support to get your tests and insights implemented.
c) Building a bridge to Finance. Jim Novo is absolutely bang on about this.
d) Building a bridge within Marketing. Web Analytics typically (though not always) lives within Marketing. If a web analyst is there, chances are somebody in that department sees the value of you being there. Make sure that your manager is onside.
e) A web analyst without any support from management will initially fail to get anything executed. If this situation degenerates into a trend, the web analyst will ultimately stop generating actionable insight, suffer from skill decay, and eventually defect to another company or department.
4. Communication Cadence
a) Automate dashboarding. Nobody ever made their company any money with a dashboard. A dashboard says what happened in the past. A dashboard has yet to tell me what is likely to happen in the future.
b) Bi-weekly or monthly spotlights. Use PowerPoint or video (the medium is the message). Remind them what you told them last time, what has been done about it since, what you found this time, and the options for exploiting the opportunity.
c) The web analyst is indispensable when considering any online marketing campaign.
d) Report generation cadence should not be so high as to preclude the analyst from doing their real job: adding value and intelligence, and making the company more money.
5. Prioritization
a) If nobody owns it, own it.
b) If somebody does own it, assist them.
6. Anticipate the satisfaction/dissatisfaction curve
a) The set of business questions that could be asked of an analyst is infinite.
b) The set of business questions that a web analytics tool can answer, at any given time, is finite.
c) Expectations of what the analyst is able to answer will inflate as the analyst delivers successfully.
d) The amount of incremental satisfaction with the tool will decrease as time passes.
e) The argument for supplemental tools that answer the queries the current tool could not will become more compelling over time, so long as those unanswered questions have been documented.
f) The request for more advanced tools should be as proactive as possible, in anticipation of the next set of business questions.
g) The business questions should be compelling enough to the people who matter to justify the increased budget.
h) Eventually, the web analyst will have to learn statistics once the “Hour a Day” questions have been exhausted. A small example of that kind of analysis follows this list.
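As a taste of what that statistics looks like, here is a small sketch in Python: a normal-approximation confidence interval on the difference between two conversion rates, the sort of “is this lift real or is it noise?” question that follows the basic reporting ones. All the counts are made up for illustration.

```python
# Is a variant's conversion rate really better than the control's, or is it noise?
# All counts below are hypothetical, purely for illustration.
from math import sqrt

control_visits, control_conversions = 10_000, 420
variant_visits, variant_conversions = 10_000, 465

p_control = control_conversions / control_visits
p_variant = variant_conversions / variant_visits
diff = p_variant - p_control

# Standard error of the difference between two independent proportions.
se = sqrt(p_control * (1 - p_control) / control_visits
          + p_variant * (1 - p_variant) / variant_visits)

# Approximate 95% confidence interval (normal approximation, z = 1.96).
low, high = diff - 1.96 * se, diff + 1.96 * se

print(f"Observed lift: {diff:.2%} (95% CI roughly {low:.2%} to {high:.2%})")
if low > 0:
    print("The interval excludes zero, so the lift is unlikely to be pure noise.")
else:
    print("The interval includes zero; gather more data before declaring a winner.")
```

Nothing exotic, but it is the difference between reporting a number and telling the business how much to trust it.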
And that’s it.
It’s a start.
If we can get together and agree on the general characteristics of a good organizational design pattern for web analytics, the adherents to that design pattern should be in a better place.