The latest book I’m reading is “The Origin of Wealth”. It’s another HBR-endorsed book, I think, but I really love it. Beinhocker has this way of expressing really complex causes using, at times, very frustrating Grade 12 HBR language. (Four pages spent describing what endogenous means versus exogenous.)
But it’s incredibly accessible.
Much of it is relatively easy for traditional economists to attack. And that’s fine – I think they might be missing the real point of the book: “The Origin of Wealth” is to complexity economics as “A Brief History of Time” is to quantum physics. Both works are essentially “pop literature”. (And, editorially, there’s absolutely nothing wrong with making science, even if it is social science, popularly accessible!)
The passage that got me the most excited last night was a portion about something called “The Beer Game”. It’s an experiment that’s been replicated thousands of times, with relatively similar results. One player plays a beer store owner. One plays the wholesaler. One plays the distributor. One plays the brewery.
At the start of each round, each player places an order for beer. The profit for filling demand at each link in the chain is 50 cents. The penalty for not having enough stock is -1.00 (or something). The point is: if you don’t have stock, customers get pissed, and the relative costs reflect the real world. And, of course, there are little shipping and ordering delays between each tier. Whoever has the most money at the end wins.
Consumer demand is determined by a card that gets flipped over at the end of each round, and the transactions are then executed.
Everybody starts off with 4 bottles of beer.
What they don’t know is that demand starts off at 4 beers for the first few rounds. Then, it jumps once to 8 bottles, and stays there for the rest of the game.
And what do you get when you plot it?
Business cycles might (just might) have nothing to do with exogenous shocks to a system, but rather, might have more to do with humans not being very good at handling long feedback delays.
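You can see the mechanism in a toy version of the game. This sketch is my own simplification, not the game’s official rules: one tier instead of four, a made-up two-round shipping delay, and a naive ordering heuristic that reacts to current stock while ignoring what’s already in transit.

```python
# A minimal, single-tier sketch of Beer Game dynamics.
# All parameters (delay, target stock, ordering rule) are illustrative.
from collections import deque

def simulate(rounds=30, ship_delay=2, target_stock=8):
    stock = 4                                    # everybody starts with 4 bottles
    pipeline = deque([4] * ship_delay)           # orders already in transit
    history = []
    for t in range(rounds):
        demand = 4 if t < 5 else 8               # the one-time jump in demand
        stock += pipeline.popleft()              # a delayed delivery arrives
        stock -= demand                          # fill demand (negative = backlog)
        # naive rule: replace demand plus close the stock gap,
        # ignoring what is already on its way
        order = max(0, demand + (target_stock - stock))
        pipeline.append(order)
        history.append(stock)
    return history

inventory = simulate()
```

Even though demand changes exactly once, the inventory swings between backlog and glut for the rest of the run: the cycle is generated entirely inside the system.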
It’s like when you’re in the shower at the Crown Plaza Hotel in Burlingame. You turn on the shower, and it’s ice cold. You turn the knob. It goes hotter than your tea. You turn the knob once more. It goes really cold. You turn it way further the other way. It goes a little less hot. Figuring you’re safe, you hop in, only to find that it leaves you with third-degree burns.
Everything is a little bit off at the Crown Plaza Hotel in Burlingame.
But what’s really going on is that there’s a huge time delay between you turning that knob, the water heater directing more hot water to your shower, and that water actually arriving. If you’re like most people, you try to adjust that sucker instantly, expecting instant feedback, but in hugely distributed systems, such as the Crown Plaza Hotel in Burlingame, you get all sorts of time delays.
And when you get those delays, you end up getting cycles.
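The shower is the same loop in miniature. Here’s a toy model of it, assuming a made-up three-step lag in the plumbing and a bather who adjusts the knob in proportion to how wrong the water feels right now (every number here is illustrative):

```python
# A toy delayed-feedback loop: the water responds to the knob only after
# a lag, but the bather reacts to the temperature they feel *now*.
from collections import deque

def shower(steps=40, lag=3, gain=0.8, desired=38.0):
    knob = 10.0                                  # knob setting, maps 1:1 to eventual temp
    pending = deque([10.0] * lag)                # settings still "in the pipe"
    temps = []
    for _ in range(steps):
        felt = pending.popleft()                 # temperature arriving now, set `lag` steps ago
        temps.append(felt)
        knob += gain * (desired - felt)          # react to what you feel, not what's coming
        pending.append(knob)
    return temps
```

With no lag this controller settles instantly; with the lag, the corrections arrive out of phase and the temperature swings back and forth past the target, which is exactly the scalding-then-freezing experience.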
The Beer Game demonstrates that time delays, even in a system that’s theoretically in equilibrium, can cause some pretty bad cycles.
Imagine if consumer demand really varied in that game. Imagine factoring in a calendar (a great demand forecaster), and then weather influencing beer consumption (which it does). Layer time delays on top of that, and wow. Weather is usually considered to be outside of a system. It’s exogenous.
But even without random things like weather, you get issues that are endogenous, that are internal, to a system.
What does this mean for web analytics?
While I was reading this, I was reminded of Bucklin and Sismeiro (2003), “A Model of Web Site Browsing Behavior Estimated on Clickstream Data”. Their model suggested that people interact with a website in a manner that can be predicted using a logit model.
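Their actual model is considerably richer, but the core binary-logit idea is simple: the probability that a visitor requests another page is a logistic function of session covariates. A minimal sketch, with made-up coefficients rather than their estimates:

```python
# A minimal sketch of a binary logit for "view another page vs. exit".
# The covariates and coefficients here are invented for illustration,
# not the ones Bucklin and Sismeiro estimated.
import math

def p_continue(pages_so_far, secs_on_last_page, b0=1.5, b1=-0.3, b2=-0.01):
    u = b0 + b1 * pages_so_far + b2 * secs_on_last_page  # latent utility
    return 1.0 / (1.0 + math.exp(-u))                    # logistic link
```

Under these toy coefficients, the probability of continuing falls as the session wears on, which matches the within-site “depth” effects that clickstream models of this kind try to capture.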
I applaud their effort.
It just hits me that when we’re trying to figure out why people interact with a website the way they do, web analytics data records only their actions. Frequently, we’re not even picking up what they’re thinking (YET, but if I get my way and secure 30 grand for a pilot, we will). We’re only getting some representation. (And not a 100% accurate picture at that, but fine.)
Many of the wiggles that we see in the data might be caused by exogenous factors. Lots of kids interrupting lots of people, for instance. However, I’ll suggest that many of the wiggles are caused by phenomena that are simply inherent in the structure of a website.
Thankfully, the “Great Lakes School” of Analytics recognizes that we need to use qualitative instruments to gain deeper understanding of what many of these wiggles are, and, which ones are important.
I’m also suggesting that there might be severe mental “lags” in a user’s clickpath behavior. Gaining insight into those lags might be key. Web analytics might be able to identify such lags, but I think we’re going to need qualitative inputs to dissect and leverage them.
Not all of the wiggles in our data are exogenously generated. I think many of them are a result of the complexity inherent in the websites.