The selection of Key Performance Indicators (KPIs) is, in a very small organization, a relatively rapid and efficient process. Typically, the web analyst or web consultant talks directly with the owner and manager, and an experienced web analyst should be able to select, in real time, a series of KPIs that align with what management is talking about. It can be, literally, a six-hour engagement (when you include education, process, overview, and a few breaks).
However, as the size of an organization, and its number of stakeholders, scales, the intensity of the process scales with it.
In some organizations, the web analyst begins the process with their manager and a number of peers. The first pass, including education, and assuming that the web analyst is able to keep the conversation on track with a manager, can easily gobble up 20 hours. Of course, a presentation deck needs to be produced, including a large number of grounding slides. Part two of the process involves cross-selling those KPIs across multiple managers at the next tier. There are inevitably a few changes. Many will be legitimate; some might not be. Inter-departmental politics, in which somebody, somewhere, needs to object to something, anything, are not just the stuff of office lore. They are real.
The result is that the KPIs, once modified, need to come back down to the departmental level and then be referred back up to the inter-departmental level several weeks later.
The KPIs are then presented to somebody at the CMO or CxO level and either modified or approved. If they're modified, generally speaking, there's little blowback from the inter-departmental layer.
Why should it be so hard?
First, KPIs are infused with business strategy (see the Web Analytics Association definitions document), and as such, they are a point of contention.
Second, web analytics can be very hard to define and to understand, even for experienced practitioners. Biases can be embedded, even hidden, in the selection of KPIs.
Third, as the complexity of an organization scales, the difficulty of getting any sort of consensus grows with it. Suppose you and a friend went to a seafood restaurant. Getting the seafood platter for two, with scallops, lobster, shrimp, and crab, would be just divine. Now suppose you and a friend invite a party of 10 out to a seafood restaurant. The odds that at least one friend will object to going in on the party platter for 10, on the sole grounds that they don't like scallops, are that much higher. KPIs are a lot like a seafood platter: you either accept everything on the platter, or nothing at all.
Are there ways to improve the process?
The first is to reach straight for the business plan, the annual report, or any company-wide strategic document, and to quote it in your KPI document. If you can align specific KPIs from your web strategy right back to those plans, so much the better. If you can align KPIs back to a specific CxO speech, ideally one from the CEO, better still. The more thoroughly it is all referenced and squarely aligned, the stronger your case.
The second is to be intensely specific with your KPI definitions. Again, appeal to a higher power, and reference the WAA definitions. In the world of web analytics, specificity wins.
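One way to force that specificity is to write each KPI down as an unambiguous, even executable, formula rather than a label. The sketch below is illustrative only: the metric name, the session-based meaning of "visits", and the percentage convention are assumptions for the example, not WAA-sanctioned definitions.

```python
def conversion_rate(orders: int, visits: int) -> float:
    """Order conversion rate: completed orders per 100 visits.

    'Visits' here means sessions, not unique visitors. Spelling out
    that choice in the definition is exactly the kind of specificity
    a KPI document needs, so two departments can't compute two
    different numbers from the same name.
    """
    if visits == 0:
        return 0.0  # avoid division by zero on zero-traffic periods
    return 100.0 * orders / visits

print(conversion_rate(125, 5000))  # 2.5
```

A definition this precise leaves nothing to argue about at the inter-departmental layer except whether the KPI itself is the right one, which is the argument you actually want to have.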
The third is the hardest bit of all: 'bringing people along'. Complex organizations have a large number of departments that handle complexity by maintaining the status quo. Traditionally, simplicity and the status quo win. (IBM stood by and watched Dell grow during the eighties and nineties precisely because of the inertia generated by the status quo.) If a single department wants to derail any sort of KPI-accountability effort, they just might be able to. Try to bring people along as much as possible, recognize when you're up against an uncooperative group, and take steps to mitigate those risks.
KPI selection is indeed challenging, but there are ways to get around common stumbling points.