What we learned about the data and analytics market

August 1, 2023 | By Jake Fuentes and Jon Brelig
This article is one in a series describing lessons learned from building Cascade. Before reading, we suggest checking out what Cascade did and an overview of lessons learned.

Cascade was an advanced analytics tool that allowed business teams to create automated analyses that typically required SQL or Python. In companies that had a data team, we relieved that team of some of its more tedious work by letting business units do it themselves. For companies that didn’t have a data team, we allowed them to automate work in a way that wasn’t otherwise possible.

Either way, our landing point was with business teams, and we sold the tool as a general-purpose data solution. We knew that analytics was a very custom business, and that the insights created would be very specific to each company. So while we had many use case examples, we were not heavily opinionated about what data should be crunched in Cascade and what shouldn't.

Data tools for business teams

A big reason we sold our product that way was precedent: BI and analytics tools have typically been sold without presupposing the specific analysis being produced. Looker, Tableau, Mode, Metabase and many others are sold as general-purpose data tooling, in contrast with products like Amplitude, which are more use case-specific.

The problem was who we were selling to. I know this sounds obvious, but data teams buy data tools, and business teams buy business solutions. We were selling data tooling to business teams.

We knew we were breaking that rule, on the assumption that business teams needed to expand their own data capabilities. We knew that data teams did not feel the pain of the “analytics breadline” as much as the business teams who had to wait weeks for report adjustments or ad-hoc analysis. For their part, business teams had far more context on what they wanted, and we thought they would jump at the opportunity to take on some of that work themselves. We found a few prospects to back up the conclusion we had already come to.

The problem is that services teams (like data teams) buy tools differently than business teams do. Services teams resemble internal consultants, and they buy tools that help them handle the many use cases that come their way. Business lines generally align purchases with the P&Ls that they manage, and they don't need to handle as much use case variety. We were selling Cascade as analytical infrastructure, steering towards a general-purpose product supporting multiple use cases rather than a highly-opinionated solution geared towards one function. We could craft solutions out of our building blocks, but for each function where we had a compelling use case, we were competing with vertical, purpose-built solutions.

If it were common for companies to organize their business analysts into their own central team (as many do for functions like design, engineering or data), I think our approach could have worked. But with analytical talent scattered across many different functions, we found it challenging to find landing zones within most organizations.

We did find pockets of analytical talent concentrated enough to justify significant spend on tools like Cascade. Consulting, audit and accounting firms were the major players in those pockets. But with Alteryx already deeply embedded in that space, we knew that our product differentiation needed to be high enough to overcome switching costs. That would be a long road.

What’s the value of an insight?

Even if we had figured out how to sell our product to one team or another, we were afflicted by a larger issue that impacts the entire analytics space: what’s the value of an insight?

The value of any analysis we helped teams create is capped at the value of the insights it produces, which is incredibly difficult to quantify. Spending on most data and analysis tooling is justified largely as a table-stakes requirement from executives or the board. Sometimes analysis produces game-changing results for companies, but those events are unpredictable and hardly ever come from non-technical analysts. Analysis does have clear, definable value inside a small market: the market research, consulting and related firms that directly monetize it. But that market is not enough to support the vast array of analytics tools out there.

We were not alone in this conundrum: I would argue that the ambiguous value of an insight has put significant downward pressure on the whole of the BI and business analytics sector. Both Periscope (now Sisense Fusion) and Mode stalled out at about $30m in annual revenue. Looker had deeper hooks into data modeling but plateaued at $100m, and it's now more or less a loss leader for Google Cloud. Newer players like Sigma may prove this all wrong, but I bet they won't. The more that BI tools become disconnected from data infrastructure, the more acute the problem of proving insight ROI becomes.

We were selling infrastructure to business teams without a clear ROI. No wonder it felt like pushing a rope.

The fracturing of BI

For the last two decades, the “business intelligence” space has been slowly trending from monolithic, vertically-integrated tools (Cognos, MicroStrategy) to more lightweight, audience-specific apps that can work on a variety of data stores (Hex, Sigma, Metabase). Not only will this trend continue, but the headless nature of transformation and modeling tools will open up a broad suite of options for how business teams consume data. Traditional BI tools will be abandoned in favor of embedded analytics inside existing tools or purpose-built, use case-specific apps that are friendlier and more relevant to business teams. Data teams and more advanced analytics categories will still require general-purpose tools, but I doubt there will be a next generation of Tableau or Power BI. Instead, we’ll see a large, scattered array of opinionated, functionally-specific tooling that pulls from standard warehouses and data models. Data teams will spend more and more of their time integrating with those tools, and less of their time creating the analysis, which will instead be produced by those apps.

With that thesis in place, it was hard to see a world where tools like Cascade would deliver venture-scale returns. For better or worse, venture-backed businesses cannot merely deliver steady growth and decent margins. The hockey stick requirement can seem tyrannical at times, undervaluing businesses that seem healthy in any other light. But that model is the one we signed up for, and the market we were in did not support it. If we’re right about where the data space is headed, I doubt we’ll be the only ones caught in that spot.