In this post:

Data Driven Cultures in startups should discover product-solution-market fit more reliably than Ego Driven Cultures

Data Driven Cultures

Carl Anderson (2015), Data Scientist at Warby Parker, defines a data driven culture as one that:

  • Is continuously testing;
  • Has a continuous improvement mindset;
  • Is involved in predictive modeling and model improvement;
  • Chooses among actions using a suite of weighted variables;
  • Has a culture where decision makers take notice of key findings, trust them, and act upon them;
  • Uses data to help inform and influence strategy.

Startups

A startup is defined as an experiment looking for product-solution-market fit. The goal of a startup is to become a business. To do that, it must discover a market: a subset of people who are self-referential in their decisions to buy and who exhibit willingness to pay for a solution. That has to be a solution that the startup is building or is willing to build.

Startups with a headcount of 16 or fewer are peculiar beasts. People wear multiple hats. The founder(s) scramble to meet payroll while pulling together endless figures and projections to address investors’ concerns.

Data is extraordinarily important. These companies have to be optimized to convert the data they’re getting from the market, about their product, into experience, and ultimately into intelligence, so that they can survive and perhaps become a business.

The odds are against them and the risk is high.

The competitive edge

Data driven cultures fostered within startups ought to be able to discover product-solution-market fit more reliably than, say, ego driven cultures, because:

  • Conflicting evidence that goes against the model is more likely to be accepted;
  • The rate of learning is likely to be greater if the team knows that it is supposed to be learning;
  • The time to error correction is likely to be lower because the team has a greater alignment to reality;
  • The quantification and mitigation of risk is likely to be greater.

If and only if:

  • The opening position is viable,
  • The team can execute and build a competitive product.

The opening position is viable

Assume a startup with 72 weeks of runway in the bank from the outset.

Assume an initial idea that:

  1. Has a total addressable market of 1,000 customers
  2. Those customers are not self-referential in their purchase decisions for the product
  3. Has an average willingness to pay of $39.99/month

Consider an idea like “Customs forms for importers of live swine from Costa Rica into Canada”, or “Bingo card generator for Russian-English ESL course instructors”. Something ultra niche. I don’t know if live swine importers have a social network or if Russian ESL teachers all know each other, but assume that they don’t.
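To make the assumed arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The total addressable market, willingness to pay, and runway come from the assumptions above; the monthly burn rate and achievable market share are illustrative guesses, not figures from this post.

```python
# Back-of-the-envelope viability check for the hypothetical niche idea above.
# TAM, willingness to pay, and runway come from the post's assumptions;
# the burn rate and achievable market share are illustrative guesses.

TAM_CUSTOMERS = 1_000          # total addressable market
WTP_PER_MONTH = 39.99          # average willingness to pay, $/month
RUNWAY_WEEKS = 72              # runway in the bank

ASSUMED_MONTHLY_BURN = 60_000  # assumption: salaries, tools, paid media, etc.
ASSUMED_MARKET_SHARE = 0.20    # assumption: an optimistic achievable share

revenue_ceiling = TAM_CUSTOMERS * WTP_PER_MONTH             # revenue at 100% of the TAM
plausible_revenue = revenue_ceiling * ASSUMED_MARKET_SHARE  # revenue at the assumed share

print(f"Revenue ceiling at 100% of TAM: ${revenue_ceiling:,.0f}/month")
print(f"Revenue at {ASSUMED_MARKET_SHARE:.0%} of TAM: ${plausible_revenue:,.0f}/month")
print(f"Assumed burn: ${ASSUMED_MONTHLY_BURN:,.0f}/month")
print(f"Months of runway: {RUNWAY_WEEKS / 4.33:.0f}")
```

Even at 100% penetration of the total addressable market, the idea tops out near $40,000 a month in revenue. A data driven culture can surface that ceiling in an afternoon rather than after 72 weeks of building.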

In a data driven culture, you’d expect the founders to hypothesize their beachhead market and their total addressable market and then fight like hell to count those leads. But maybe it takes several weeks to realize that swine importation is too niche. It may well turn out that the entire sector of managing Canada Customs and Revenue Agency forms for live animal importers is still too narrow. In other words, it may take a full 72 weeks to understand that you started way off in the wrong area.

Finding water is much harder if you’re dropped off in the middle of the Sahara.

This may be especially true of IDEs (Innovation Driven Enterprises), which are defined by taking losses over a longer monetization period in the short run, but are capable of exponential scale should product-solution-market fit be achieved. While there is a lot of literature on IDEs, the connection between the features getting laid down and the market isn’t fully drawn. Typically, that connection has to come from the application of a few lean methods.

In other words, a given culture may have a very high learning rate, but, because it started out in simply terrible terrain, it may have no chance of reaching viability before time runs out, no matter what.

Starting strategy matters.

The team can execute and build a competitive product

Individual talent and collective talent matters.

If the team cannot execute, then it cannot engage in a build-measure-learn cycle.

To hit you over the head with it: the first word in build-measure-learn is build.

Sometimes nodes in a social graph just don’t mesh properly. People have different values, interests, and objectives. They can do unpredictable things when they’re together.

Many founders are genuinely confused as to why they can’t find talent to come work for them. Many others are stung by the rejection when people with perfectly fine jobs turn down the opportunity to come work with them. Rejection in the labour market is much like rejection in the marriage market. You’re not right for everybody. And some founders are ill equipped to attract and retain the right kind of talent.

Many folks at a startup are genuinely confused as to why people can’t work together. And there are all sorts of habits and reasons why people don’t get along, come together, and hash things out.

Not every team can execute.

This is a non-trivial assumption.

If both conditions are met, a viable starting point and a team that can execute, a data driven culture ought to confer advantages in the form of graduating from being a startup to being a business.

There are four reasons to support this idea.

  • Conflicting evidence that goes against the model is more likely to be accepted;
  • The rate of learning is likely to be greater if the team knows that it is supposed to be learning;
  • The time to error correction is likely to be lower because the team has a greater alignment to reality;
  • The quantification and mitigation of risk is likely to be greater.

Conflicting evidence that goes against the model is more likely to be accepted

It sucks to be wrong. It sucks even harder when it’s your judgement that caused people to lose their hope, effort, and sometimes, their jobs. It can be intensely damaging to morale, and even worse for a leader’s ability to lead. Being clearly wrong about things can erode confidence and severely reduce the ability of a team to execute and to build.

It feels so much better to be right.

Conflicting evidence is easy to reject because it feels bad. It’s easy to reject conflicting evidence when you’re on record in the business press that your strategy is working. It’s so much easier to carry on with a scientific theory when so many careers have been built on a set of assumptions, and the conflicting evidence can be explained away as instrumentation error, or as merely a curiosity of nature (Kuhn, 1962). It’s also entirely probable that the search for explanation is only truly triggered by dissatisfaction (see James G. March, 1994, 1998). And, if a model isn’t perceived to be broken, why expend energy trying to fix it?

But you rarely get to a place of being persistently right until you learn from being wrong.

To hold up data driven culture as a progressive method for evidence acceptance in the same post as writing about Kuhn and March may seem hypocritical, but I’d argue that we can strengthen data driven culture by being aware of what Kuhn and March have observed about institutional behaviour.

The alternative is so much worse. To carry on hoping that a tactic or strategy is really right, out of pure fear of humiliation, carries far more risk than simply coming clean.

A data driven culture is unlikely to be forgiving of dead reckoning and the disaster that follows. If logic and reason are trumped by emotion and hope, and it turns out badly for all involved, the judgement is likely to be swift and the consequences brutal. If there was a clear reason, an incorrect assumption or a faulty understanding of the underlying model, it’s far more likely that being wrong will be forgiven.

Evidence that falls well outside the hypothesis under test is far more likely to be greeted with delight by those in a data driven startup culture than a mere confirmation. For example:

“We only got 80 signups as a result of our Valentine Bovine Campaign on YouTube.”

“What was the CPA?”

“$45.00”

“Did you target farmers in rural Canada and the rural United States with interests in pig farming?”

“We did.”

“And why didn’t they convert?”

“Well, almost everyone that converted signed up with a university email address.”

“That makes no sense.”

“I wrote to some of them… they’re graduating students from agricultural programs.”

Instead of being enraged about a $4,500 paid media spend, one could argue that $2,000 was spent acquiring new customers and $2,500 was spent learning that new graduates are into the product. That’s rather idealized, but that’s the spirit.
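A minimal sketch of that framing, using the figures from the dialogue above; the split of signups between the intended segment and the surprise segment is hypothetical, chosen only to illustrate the accounting.

```python
# Sketch of the "spend = acquisition + learning" framing above.
# Total spend, CPA, and signups come from the dialogue; the number of signups
# that fell inside the intended segment is a hypothetical illustration.

TOTAL_SPEND = 4_500.00   # paid media spend
CPA = 45.00              # cost per acquisition
SIGNUPS = 80             # total signups

expected_segment_signups = 44  # hypothetical: signups from the segment the campaign targeted

acquisition_spend = expected_segment_signups * CPA  # spend that bought the customers you aimed for
learning_spend = TOTAL_SPEND - acquisition_spend    # spend that bought the surprise insight

print(f"Spent acquiring the intended segment: ${acquisition_spend:,.0f}")
print(f"Spent learning about the new segment: ${learning_spend:,.0f}")
```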

The rate of learning is likely to be greater if the team knows that it is supposed to be learning

The third step in a Build-Measure-Learn cycle is Learn.

If the team understands that it is supposed to be learning, it will direct less energy towards having arguments and more energy towards finding out.

How often has it happened in a group you’ve worked in that, because somebody was the first to state an idea, and because they were the first to talk, they become intractably invested in that idea? Instead of saying “let’s test both” and moving on to build, measure, and learn, the sides may simply agree to disagree, resulting in stalemate. It’s not terrific.
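As a sketch of what “let’s test both” can look like in practice, here is a minimal two-proportion z-test; the variant numbers are hypothetical and the test is just one reasonable choice among several.

```python
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> tuple[float, float]:
    """Two-proportion z-test: is variant B's conversion rate different from A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided p-value
    return z, p_value

# Hypothetical numbers: idea A converted 40 of 1,000 visitors; idea B converted 62 of 1,000.
z, p = two_proportion_z(40, 1_000, 62, 1_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # a small p suggests the difference is unlikely to be noise
```

Either idea wins or it doesn’t; the argument ends with data instead of with whoever spoke first.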

Very often the startup is designed to validate the ego of the founder or its architect, instead of validating product-solution-market fit. If it’s always a struggle to get results accepted, the rate of learning will be lower simply because more time is forced to elapse.

Every day spent cajoling the founder, architect, or that person, into acceptance is time wasted and runway eroded away into the ocean.

The time to error correction is likely to be lower because the team has a greater alignment to reality

There’s perception of reality, and then there’s reality. Some may believe that the value of pi is really 3.2. But it isn’t. Thankfully, there’s data you can experience about the nature of pi, and you can use that experience to inform your intelligence. It’s great.
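As a toy illustration of experiencing data about pi, here is a standard Monte Carlo estimate; nothing in it comes from this post, and the sample size is arbitrary.

```python
import random

# Monte Carlo estimate of pi: throw random points at the unit square and count
# how many land inside the quarter circle of radius 1.
random.seed(0)
N = 1_000_000
inside = sum(1 for _ in range(N) if random.random() ** 2 + random.random() ** 2 <= 1.0)
print(f"Estimated pi: {4 * inside / N:.4f}")  # converges toward 3.1416, not 3.2
```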

Sometimes a founder is able to project a reality distortion field. Distortion fields are especially important when high hopes fuel bursts of high performance. Enthusiasm is contagious. Over-the-top hero worship is epidemic. Data that conflicts with that distortion field is unlikely to result in a great experience for the “toxic” people bearing the bad news.

A data driven culture is closer to reality.

It may not be without a distortion field. Just as an army marches on its stomach, a startup marches on hope. Fundamentally, the team has to believe that they can persevere against reality, illegitimate competitors, and a punitive market attention curve. It’s closer to reality because it’s more likely to be closer to the data that reality generates.

And its leadership takes notice of key findings, trusts them, and acts upon them.

The quantification and mitigation of risk is likely to be greater

A data driven culture is involved in predictive modeling and model improvement.

How often have you seen an 18-month set of projections in which the company’s new customer acquisition curve stays flat and then hockey-sticks at month 16? Have you seen those? I’ve seen a lot of them.

That’s a risk right there. What happens if the flat line trend continues past month 16? Ah, that’s right, they run out of cash and they crash off the end of the runway. Devastating. And that’s not just something a central Canadian would say.

Wishful thinking isn’t a form of predictive modeling.

A solid predictive model is rooted in predicting risks and mitigating them.
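As a minimal sketch of that kind of quantification, the toy model below simulates the cash position under both the hoped-for hockey stick and the “flat line simply continues” scenario. Every number in it is an illustrative assumption, not a figure from this post.

```python
# Compare cash runway under two customer-acquisition scenarios.
# All figures here are illustrative assumptions.

STARTING_CASH = 900_000      # assumption: cash in the bank
MONTHLY_BURN = 55_000        # assumption: fixed costs per month
REVENUE_PER_CUSTOMER = 40.0  # assumption: $/customer/month

def hockey_stick(month):
    """The hoped-for curve: flat 20 new customers/month, then doubling from month 16."""
    return 20 if month < 16 else 20 * 2 ** (month - 15)

def flat_forever(month):
    """The risk case: the flat line simply continues."""
    return 20

def months_until_broke(new_customers_per_month, horizon=36):
    """Return the month the cash position goes negative, or None if it survives the horizon."""
    cash, customers = STARTING_CASH, 0
    for month in range(1, horizon + 1):
        customers += new_customers_per_month(month)
        cash += customers * REVENUE_PER_CUSTOMER - MONTHLY_BURN
        if cash < 0:
            return month
    return None

for name, scenario in [("hockey stick", hockey_stick), ("flat forever", flat_forever)]:
    broke = months_until_broke(scenario)
    print(name, "-> survives the horizon" if broke is None else f"-> out of cash at month {broke}")
```

With these assumed numbers, the hockey stick scenario scrapes through while the flat scenario runs out of cash around month 20. The point is not the specific figures; it is that the risk gets quantified rather than wished away.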

The very activity of identifying and talking through risk, gain, and the future is a key form of planning. This shouldn’t be taken for granted. Very frequently, people are fearful of expressing doubt, raising risk, or talking openly about future expectations in a group environment. The odds of open collaboration generally diminish as the size of the group increases.

The size of an insider group could, potentially, be greater if the exercise of predictive modeling allows people to express doubt and dissenting ideas that can be rendered palatable through testing.

There’s good reason to believe that, generally, the identification and mitigation of risk is likely to be greater in a data driven culture than in an ego driven one. However, it is possible for risks to be mentally magnified to the extent that they allow too much doubt to accumulate, harming morale.

On the whole, those with a superior ability to predict the future are in a better position to benefit from it, and to discover a market sooner.

Conclusion

In management science, it’s taken as an axiom that data driven decisions generate superior results. For the reasons above, there’s enough reason to suppose that:

Data Driven Cultures in startups should discover product-solution-market fit more reliably than Ego Driven Cultures

Posts in this series include:

The Data Driven Culture

The Data Driven Culture: Strategy

The Data Driven Culture: Startups