A very smart person remarked that he liked numbers because they didn’t lie. People lie about numbers. Over the next 30 minutes, I demonstrated how two honest people can have two valid interpretations of the numbers, and have their models supported by the same facts. An hour later, during our measurement science biweekly meeting, I invited the team to analyze a 5×5 RM table, and asked a fairly loaded question about it. Diversity in opinion eventually gave way to consensus around a mean. Several honest people had feedback and conflicting models about the way the world really worked. Each version perhaps more probably true than the last. ‘Truth’ is one of those really strange words in analytics. It’s something we[…]
Author: Christopher Berry
“L.A. Law Wikipedia Page Viewed 874 Times Today”, an article from the satirical media giant The Onion, is funny because it’s painful. The article starts off telling a story about irrelevant content. In this case, web analytics about a really old TV show on Wikipedia: “Our L.A. Law page typically gets 915 views on weekdays and 670 on weekends, so we’re about 40 off the pace,” Wikipedia web moderator Ben Stern said of the entry for the Steven Bochco series, which hasn’t aired a new episode since 1994. “Then again, the day isn’t over, and if our metrics are correct, Corbin Bernsen’s IMDB page should be viewed at least 15 more times before midnight. We generally get some runoff from[…]
One of my favourite sites is KillerStartups.com. It’s everything I love about startup culture and innovation. There are hundreds of independent variables that go into explaining why some of these startups are going to thrive, and why most won’t. (It’s more complex than biology because people are involved!) My favourite variable is evident utility. Each startup has two paragraphs to convince me to even click to learn more. Do I see an actual use? Does it do something that somebody else already does in a better way? Cheaper way? Is it generalizable? It’s not the most predictive variable of success though. Twitter is a good example of something I could see no evident utility for. Eventually I saw utility, at[…]
It’s surprising how little time I’ve spent analyzing PowerPoint with the same rigor as social and the web. It’s amazing how that dissociation happens. There’s a set of methods that apply to these mediums over here, and a set of methods that apply to this set over here. And you can go along not even being aware of it. On Thursday, Nadia, Heather and I were remarking how a specific POV looked after Paul had gotten his hands on it. The content was all there. The content was actually the same. It just looked more persuasive. Naturally, writing persuasive content is a cornerstone of marketing, so suddenly PowerPoint becomes an object of curiosity. We enumerated all the things that[…]
We did something very different for last night’s Web Analytics Wednesday Toronto. Out with the invite was a strongly worded request to produce three bullet points on one sheet. The hypothesis was that if you give analysts a platform for sharing some work with others, they will take it. The expected outcome was lower turnout with a higher intensity of participation, and a higher perception of value. Six sheets were presented by: Martin Ostrovsky (Repustate), Brian Cugelman (Alterspark), Kevin Richard, Heather Roxby, Greg Araujo, and myself (Syncapse). They were excellent and sparked very active debate. Fifteen people in total came out, including web analysts (Mark Vernon, @web_analyst), creative (@mimc03), data miners (Gar et al), developers (@chrismendis et al), managers, directors[…]
I’ve had a fairly rough 9 days with a very troublesome model. My original hypotheses are rejected. A piece of the world doesn’t really work the way that I expected. The great news is that I’m forced to look beyond the clean dataset and write new hypotheses. Even failures can be great. However, it doesn’t make for good commercial reading. Instead of having that nice, clean, nugget: Brands that did x realized y. There’s a much messier message: Neither a, b, c, d, e, f, g, h, i, j, k, l, m, n, nor p had a significant impact on y. That messier message works among marketing scientists. Usually a sound of surprise. Then acceptance when they see the[…]
The agenda of the next Toronto Web Analytics Wednesday is pretty obvious. When passionate developers get together, they usually hack. What happens when passionate analysts get together? That’s the question. I’ve put out a pretty basic call – bring 25 copies of a single sheet of paper to the next WAW. Have 3 bullet points and supporting data. Be prepared to distribute it and talk about what you found. It is indeed homework prior to the next one. It’s an opportunity for analysts to move beyond talking about web analytics to sharing what they do. There are loads of open data sets out there, many of them very, very rich. Never before has there been so much opportunity. To that[…]
Shaking up the next Web Analytics Wednesday, at Bar Wellington – second floor. You’re invited to bring 25 copies of a single sheet of paper that contains:

- Three bullet points of analysis, preferably with an actionable recommendation or finding
- Data that supports your analysis
- A reference to the data source
- Your name, company, contact info, website, blog, twitter handle, and so on

It’s not a dashboard. It’s web ANALYTICS wednesday. Why? You’ll leave with something in your hand and feel smarter for the experience. We rarely get a chance to share our craft with other practitioners. Why shouldn’t practitioners have a chance to put up? The agenda: We’ll start at 6:30pm and we’ll get up and move around with our sheets. You’re[…]
One of the most instructive papers on serial innovation comes out of left field from Griffin, Hoffmann, Price and Vojak – the latter three out of the University of Illinois, Urbana-Champaign. (Which will bring a smile to regular readers of this space.) Their paper doesn’t reference Roger Martin, but it confirms much of what he’s described about the saliency phase of opposable thinking. The best piece that I’ve taken away from the paper is the definition of an Interesting Problem. A problem is deemed interesting only if:

- The firm can actually solve it and management will accept the solution. (Feasibility)
- Customers will pay to have that problem solved. (Marketability)
- It will be a big deal for the firm. (Impact)

Prepare to[…]
It’s only towards the tail end of the second year of university that anybody tells you about the Pearson tables. It’s a glorious thing, all alone in there, hidden away in PASW. You run it for a number of variables, and it gives you a beautiful matrix showing the strength and direction of relationships among them. It can be disastrously misleading. Violence can dull sensitivity. Still, it can be used to rapidly validate mental models and rule others out. Here’s how I cope when I’m confronted with a large dataset. I identify what it is that I’m trying to figure out. Then I lay out all the independent variables that I think might go into explaining the variation in something dependent. I[…]
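If it helps to see that workflow outside of PASW, here is a minimal sketch in Python with pandas of the same triage: a Pearson correlation matrix over a handful of candidate independent variables and one dependent variable. The dataset and column names below are simulated placeholders for illustration, not drawn from any real analysis.

```python
# A minimal sketch of the Pearson-table workflow: build a correlation matrix
# over several candidate independent variables and one dependent variable,
# then scan it for the strongest relationships. All data here is simulated
# and all column names are hypothetical placeholders.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n = 200

df = pd.DataFrame({
    "visits": rng.normal(1000, 150, n),
    "time_on_site": rng.normal(90, 20, n),
    "referrals": rng.normal(50, 10, n),
})
# A dependent variable loosely driven by two of the candidates, plus noise.
df["conversions"] = 0.02 * df["visits"] + 0.1 * df["referrals"] + rng.normal(0, 5, n)

# The Pearson matrix: strength and direction of every pairwise relationship.
corr = df.corr(method="pearson")
print(corr.round(2))

# Quick triage: which candidates relate most strongly to the dependent variable?
print(corr["conversions"].drop("conversions").abs().sort_values(ascending=False))
```

The caveat above still applies: a pairwise matrix like this can be disastrously misleading on its own, since it says nothing about confounders, nonlinearity, or outliers. It’s a way to rule mental models in or out quickly, not a conclusion.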