Two trends continue to pose a massive challenge to businesses and analytics practices alike: an exponential increase in the data produced, and a merely linear increase in the number of analysts trained each quarter.

We need both physical technology and social technology to practice analytics at scale.


There are three groupings of physical technologies:

  • First, there’s instrumentation technology that we use to measure and record the world around us.
  • Second, there’s analysis technology that we use to understand the data that’s coming in.
  • Third, there’s presentation technology that we use to communicate a world view, and what to do next.

On the instrumentation technology side, we’ve all had a few challenges with instrumentation as of late: understanding definitions, understanding their impacts, and absorbing the unexpected impact of bugs. Empathy from one technologist to another on this front. Instrumentation is not easy.

On the analysis side, we have SPSS, R, SAS, Datameer, and Python. Amazing technologies, some of which may be used as controllers, some of which are used by analysts to peer into the deepest, most chaotic systems.
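To make that analysis step concrete, here is a minimal sketch in Python; the event records, field names, and metric are all hypothetical, invented purely for illustration:

```python
import statistics

# Hypothetical event records, shaped the way instrumentation might emit them.
events = [
    {"page": "home", "load_ms": 120},
    {"page": "home", "load_ms": 180},
    {"page": "checkout", "load_ms": 450},
    {"page": "checkout", "load_ms": 390},
]

# Group load times by page: the kind of first pass an analyst runs
# before peering deeper into a chaotic system.
by_page = {}
for event in events:
    by_page.setdefault(event["page"], []).append(event["load_ms"])

# Summarize each group with a mean; real work would go much further.
summary = {page: statistics.mean(times) for page, times in by_page.items()}
print(summary)
```

The same pass could be written in R or SPSS; the tool matters less than having one in place at all.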

On the presentation technology side, we have Excel, PowerPoint, Keynote, and certain dashboarding technologies. They have pros and cons. XML or JSON APIs, or some version of them, ought to be the future here. It doesn’t seem like a big problem, but it’s a fairly wicked one, because credibility and authority are bound up in aesthetics.
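As a sketch of what a JSON-first presentation layer could look like, here is a hypothetical metric payload (the metric name, period, and value are invented for illustration):

```python
import json

# Hypothetical dashboard payload: the number plus enough context
# (period, definition) for any front end to render it faithfully.
payload = {
    "metric": "weekly_active_users",
    "period": "2013-W02",
    "value": 48210,
    "definition": "unique logins over trailing 7 days",
}

# Serving JSON instead of a finished slide decouples the data
# from any single presentation tool's aesthetic.
body = json.dumps(payload, sort_keys=True)
print(body)
```

The aesthetic problem remains, because whichever front end consumes the JSON still has to earn credibility, but at least the data is no longer trapped inside one deck.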

Getting these three physical technologies right, and linked together, is essential to practicing analytics at scale.


Social technology at scale

People are incredibly important because they’re the ones who generate insight, and ultimately cause the beneficial change that results in sustainable competitive advantage. It’s not the physical technology of software and hardware. The institutions that cause people to behave in very specific ways are a social technology. And that social technology must be in place to scale.

There are a number of problems that an organization creates for analytics professionals.

For one, most organizations don’t know what they don’t know about analytics. They don’t understand that instrumentation is still young and buggy. That truth isn’t absolute. That sleuthing is part of the role. That it’s not just “pizza and spreadsheets”. That it takes time to put together a series of recommendations that make sense in a given system or context. That not every convenient business case can be generated, or generated quickly. That not everything is recorded by the instrumentation. The first three months of setting up any new analytical institution are entirely about resetting expectations.

There are a number of problems that analytical leadership causes for their organizations.

For one, most analytical leaders don’t know what they don’t know about their organizations. Bad behaviors result. They hive off the data. They clam up indiscriminately. They refuse to engage with regions of the company for extended periods of time without a strategy in place for such cut-offs. They hire the wrong people. They don’t secure enough headcount to be successful. They’re unable to demonstrate their own ROI. They don’t say no often enough to be able to cause their own ROI. They don’t champion their own work. They acquire a siege mentality. They don’t publish, and they don’t share their successes with industry. They churn rapidly.

Not all leadership is bad and not all organizations are poisonous to analytics.

If we accept the premise that organizations want sustainable competitive advantage from analytics, and that analytics leadership wants that same outcome, we can construct a physical technology stack and a social technology stack that achieves that end.


Current thinking on that end:

1. Mediums and media are fragmenting. The most progressive thinking on the subject is toward medium planning (Syncapse, Teehan+Lax), and as a result, analytics leadership that resists new, novel mediums is likely to be viewed as obstructionist. Instrumentation will fragment as a result. This is okay. Derive a medium measurement strategy; it’s what our collective leadership must become good at.

2. Analytics means having an analytical tool to use. SPSS is preferable because of usability. R is preferable in certain environments because it’s free. Firms that actively compete on analytics may require a big data stack to mine very large data sets.

3. Recommendations are communicated in PowerPoint; the business schools have decreed this. Collaborate on problems without a PowerPoint presentation; new thinking from the business schools has decreed that.

4. Every single organization has a C-level that always asks for gopher analyses. The rest of the organization typically pays no cost to support such an analytical seat. These ad-hoc seats are an excellent way to hire new talent out of the universities and are good training opportunities, under supervision. Allow your experienced guns to find the insights, and let your interns and juniors do the gophering. Record the output of the gophering.

5. Nobody in the organization has an incentive to acknowledge the analytics department for insight discovery. The leadership of that department must make sure that there’s a solid internal culture that rewards insight.


Most organizations desire analytics at scale – which means handling both the intelligence and ad hoc sectors of the business simultaneously. The way to get there is by combining social and physical technologies that enable that scale. It will be an ongoing, losing war, but every battle should bring victories.