The ROI of Data Teams: Can You Prove It?
A conversation with Chetan Sharma at Eppo
Data teams and the modern data stack have attracted large sums of investment over the past few years. Everyone has taken for granted that data teams are valuable, and that they deserve a multi-layered stack of best-of-breed software. I’ve been wondering: with the significant changes in the macro-environment this year, will data teams (and their software vendors) soon be required to justify their ROI more concretely? And how should they go about doing that?
Che Sharma, the Founder & CEO at Eppo, was the perfect person to discuss this question with. As an early data scientist at Airbnb and later at Webflow, Che was knee-deep in proving the value of data teams to the rest of the business. Most recently he founded an experimentation platform to help data and product teams figure out how to impact business metrics — thereby building a category within the modern data stack that is arguably much closer to revenue than other categories.
In this conversation, we discussed:
Should data teams align themselves with revenue or with cost reduction?
If you led a data team and you were making your case to your CFO that your budget shouldn't be cut, what would you say?
How can data teams prove their value in companies that don’t quite “get it” yet?
You can listen to the podcast or read the lightly edited transcript of the conversation below. Let's dive in!
If you’d like to hear more about topics related to scaling SaaS and other B2B businesses, you can subscribe to my newsletter (for free) here:
As always, I’ll share a few leadership roles at companies I’m excited about:
Eppo (Che’s company!): Founding Marketer
dbt Labs: Sr. Director of Professional Services
Bubble: Director of Product Marketing, Director of Partnerships
LaunchNotes: Director of Engineering
Kingdom Supercultures: Head of People
Guide: Head of Marketing
Allison: Che, I'm so excited to have you on the podcast today to talk about the ROI of data teams and how we can prove it out, especially in this macro environment. Thanks so much for joining us.
Che: Great to be with you, Allison. This topic is near and dear to my heart. As you mentioned in the show notes, I was an early data scientist at Airbnb and then Webflow, both of which have interesting relationships with data teams and ROI. So I'm excited to dig into it.
A: Awesome. To start off, can you tell us a little bit more about your background and what you're doing today?
C: Absolutely. I am the founder and CEO of Eppo. We're a next gen A/B experimentation platform. It came from my feeling that the best way for data teams to have ROI is to heavily invest in experimentation and plug that throughout the org. But before I was at Eppo, I was, as I mentioned before, a data scientist at Airbnb, quite early. I saw a lot of the growth journey play out and worked across the stack, from machine learning for a couple years, then analytics for a couple years, and data tools for a couple more. I saw the data team progress as we went from mostly focusing on reporting and supply growth, to being part of every major pillar of the company, from growth in China to business travel, marketing, and everything else.
After Airbnb, I consulted for a bunch of companies from a wide array of domains. And then most recently I was at Webflow, which is a no-code website builder, a SaaS tool, where again I got to see the data tool ROI journey.
A: You're the perfect person to speak to the subject. Often when people think about ROI, they think about one of two levers, and maybe both: revenue and cost reduction. Do you think data teams should associate themselves with either of those? And if so, why?
C: I think data teams need to associate themselves with the goals of the company at the time. If you look at Airbnb from 2012 to 2017, it wasn't necessarily a revenue-oriented place, but it was extremely focused on growth, on pursuing the land grab. Airbnb understood that there were incredible network effects in an international travel platform. So the most important thing was to become the de facto standard, not just in America, but in Europe, in Asia, Australia, everywhere. I wrote a blog post about this: if you focus too much on revenue, then actually it will distort the initiatives you pursue. In our case at Airbnb, focusing on revenue might have made us invest a lot more in America and Western Europe, which would have taken away from the land grab initiative.
So I think in general, it's good to just focus on the company's priorities. At Webflow, the priority was ARR, so our data team was much closer to revenue. I think this is what leadership exists to do: to outline those priorities so that teams can orient themselves around it.
A: So do you think this literally might mean the data team saying, there are these five OKRs for the company this quarter, and we are contributing to some or all of those in these particular ways?
C: Yeah. 100%.
A: And do you think that data teams should measure themselves numerically based on some kind of impact on those numbers?
C: I don't really think so, because data teams don't actually do things. It's funny to say that, but they don't actually build anything. The actual product-building process is: understand, design, build, and measure. Data teams are really involved in understanding and measuring, but not design and building. Even elsewhere in the org, like marketing, we're about guiding hygienic decision-making. So in that regard, if you are not the one actually making the decision, how do you pin the dollar tag on yourselves?
I think what you need to do is have really tangible demonstrations of how you are improving decision-making. It's hard to put a number on it, but you can list all the places where you played a really big role in guiding a decision.
A: Oh, interesting. I'm trying to think through exactly what that might look like. Is that saying something like, before the data team got involved, this business team was on track to make this particular decision, which would have resulted in X, Y, Z impact on our OKRs, but then we got involved and actually now there's A, B, C impact on our OKRs?
C: Yep. That's exactly it. Concretely, the way this turns out is less of a sharp “before, after” and much more of a, “here's a scope of type of decisions that are happening, unguided by data, that are now going to be guided by data. And we can understand that level of decision making in aggregate.”
I'm the CEO of an experimentation company. In experimentation, there are a bunch of decisions like, would you launch this thing or would you not launch this thing? When you do experimentation and you actually say, let's see what the metrics say about it, and metrics are obviously very much in the data domain, that's a tangible demonstration of how the data team affected a decision.
There are examples in other parts of the organization. For example, customer service: would you route this ticket to this person, or that person? In marketing: would you spend more on this campaign, or that campaign?
There are decisions that are happening all over the place. As you plug data into those places, then those are, to me, tangible demonstrations of guiding decision-making.
A: I'd imagine there would be some people out there who would say, look, it's really hard in most instances to prove that the data team had an impact on decision-making in a way that impacted company OKRs. So these people might say, instead, we should think about data teams as essential infrastructure that just has to exist and should be funded, regardless of circumstances. What would you say to that?
C: I actually don't disagree with it too much. The place where I would temper that a little is that just because you have numbers doesn't mean you changed the decision. Data people often need to make sure that the decisions are actually changing — especially early in their journey, before companies fully embrace data as a core constituent. Our mutual friend Elena Grewal [former leader of data science at Airbnb] and I talk about this: when you're hiring early data scientists, a really important trait that makes a big difference is communication ability. Can you take these results and make them compelling so that people change their decisions?
A: I'm thinking more about the different ways in which you could conceive of the impact of data teams and how you should budget for them, which obviously in this macro context is potentially controversial and a very important thing to think through. Where would you say the consensus is today, about how you should budget for data teams? Perhaps on the one end of the spectrum, it should be hire data folks regardless of circumstance, there should be a fixed budget, or a percent of revenue. And then, on the other end of the spectrum, there might be a need to prove out the impact on certain financial metrics. Where do people's opinions lie?
C: I think right now there is not much of a notion of data teams as direct drivers of revenue. The way the data landscape has progressed, tools like Snowflake and dbt have made it so that you can spin up reporting really easily. As a result, you're seeing that all across the globe, there are reporting teams. They're called data teams, but they're mostly doing core reporting that's going to guide a board deck. Oftentimes that reporting is so macro that it's not driving operational decisions.
That's the state of the world today, but I think with likely a recession happening, things could change fairly readily. A lot of teams have made many investments in purchasing all these data tools, hiring these data teams, and there was a clearly justifiable reporting need. But everything beyond reporting was a little bit more speculative. The thinking was, “PMs need numbers, so let's give them a data person.”
What's going to happen in the future is that a CFO is going to look at a headcount request and ask, "Tell me how money comes out the other end." Data teams need to provide the answer very concretely.
I've seen data teams early on, when they need to win political capital, generate short-term revenue wins. For example, you're spending marketing dollars all over the place. Is there some way you can establish attribution models? That kind of analysis always reveals some channel that's bad. Whenever you do that process, especially early on, there's always low hanging fruit. “Here's the example where we're spending $500K and don't have to.” That is tangible revenue improvement. It pays off a small data team's salary and feels good. That's a decent enough place to get started.
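To make Che's attribution example concrete, here is a minimal sketch of the kind of analysis he describes. The channel names, dollar amounts, and the 3x-median flagging rule are all invented for illustration — real attribution models are considerably more involved:

```python
# Illustrative sketch (hypothetical channels and numbers): a last-touch
# attribution summary that surfaces a channel with outlier cost per conversion.
from collections import defaultdict

# Each row: (channel, spend_dollars, attributed_signups) for one month.
spend_log = [
    ("paid_search", 120_000, 2_400),
    ("paid_social", 90_000, 1_500),
    ("display", 500_000, 250),   # the "low hanging fruit"
    ("affiliates", 40_000, 800),
]

def cost_per_conversion(rows):
    """Aggregate spend and conversions by channel, return $ per conversion."""
    spend, convs = defaultdict(float), defaultdict(int)
    for channel, dollars, signups in rows:
        spend[channel] += dollars
        convs[channel] += signups
    return {ch: spend[ch] / convs[ch] for ch in spend if convs[ch]}

cpc = cost_per_conversion(spend_log)
# Flag channels costing more than 3x the median channel as candidates to cut.
median = sorted(cpc.values())[len(cpc) // 2]
flagged = [ch for ch, cost in cpc.items() if cost > 3 * median]
print(flagged)  # the $500K channel stands out immediately
```

Even a summary this crude is often enough to start the "here's where we're spending $500K and don't have to" conversation.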
The casualty of recession-style, very short-term-focused thinking is that some of the most powerful things to come out of a data team happen when you reveal an entire strategic direction that could be pursued. In the boom times you get more leeway to pursue that; in the lean times, you have to make sure it's part of a portfolio alongside much more direct revenue drivers.
A: All of that makes a lot of sense. I'm thinking about, more broadly, how data teams can advocate to their CFO to retain budget in this type of economic environment. And I imagine one of the frequent inputs into this conversation coming from go-to-market leaders — the people who own revenue at the end of the day. Sometimes I hear rumblings in the SaaS world from GTM leaders that the modern data stack is really just tools for data people. And that data people all have this love fest with each other, and are very excited about their own profession and all these software products that are being built, but the GTM leaders in some of these instances feel that they haven't really benefited from the modern data stack.
So you could imagine a data leader and a sales leader being in a room with the CFO, each advocating for their own software and headcount budget. I wonder how that conversation will go.
Do you think that go-to-market leaders are mistaken about the modern data stack and data teams?
C: The funny thing is, I'm not even necessarily sure they are mistaken. I would say that with the economic boom times over the last several years, and the boom in innovation and the data tool landscape, suddenly people have brought on a bunch of data observability tools, or next gen BI tools, or data catalogs. The data team liked them and had a lot of discretionary spending.
But the key question is: do all of these data tools add up to better decision making? This is again why I was so excited to start an experimentation company. Suppose you snapped your fingers and had a pristine data warehouse that was always up to date and could plug into everything. Suppose you took Snowflake, dbt, Hex, Hightouch, etc., had the whole package, and it all just worked. What would happen next? You have to have that theory of change in your mind. As a data team, you need to say, "we think we are going to affect a bunch of product launch decisions." That's the experimentation play.
Data teams should be thinking about how to model the segmentation of users in order to dramatically improve the ROI of ad spend. They can completely change the way marketing teams deploy dollars.
Once you are affecting decision-making, and your company leaders like that, they’re going to see the value of the other data tools that you want. They’ll see that you need your data pipelines to reliably work, and that you need other underlying investments on top of your data warehouse. That’s when you can have nice conversations about the ROI of more foundational data pieces.
A: I'm wondering how you think these budgeting conversations will unfold from the perspective of where in the P&L a data team should be accounted for. You could make the argument that data team should be a part of the G&A budget, but then again, maybe if you're trying to prove an impact on the business, you should be allocated to the various functional cost categories. What's your take?
C: A data team can kind of start off by doing some sort of cost sharing. With Eppo, for example, the most common setup we've seen is that half the budget comes from the product team and half the budget comes from the data team. This shows that the decision-makers are literally bought into the process.
There may be some amount of money that you would spend on standard reporting that could count as G&A. But when you're affecting decision making and when you need to justify having more than a 3-person data team, you should have a lot more cost sharing.
A: Che, I know there are some companies where the data teams are revered internally. Airbnb, where you used to work, is one of these companies where the data team is considered to be a core part of the broader organization. It's much easier to justify headcount and spend there. It's a default assumption that they're adding a ton of value. But if you are on a data team at a tiny startup where there may be only a few data people, it might be a lot harder to prove why you need that additional headcount or spend. What would your advice be for founders and data leaders at earlier stage companies, in terms of how to get to where Airbnb is?
C: I think it's a great question because I was the fourth data scientist at Airbnb, and when I joined, we did not have a personal brand internally. We did not have this default assumption that we should be hiring data scientists everywhere and pursuing opportunities that were not just contributing to short-term revenue, but were long-term strategic bets. The advice I give is that you have to treat it like a ladder. Step by step. What do you do to reach up to that "data team nirvana," where you're held in high esteem as a crucial part of every decision?
I think the starting point is — and this is why your initial data hires are really important — what is the easiest path to showing wins? Tangible wins that are going to lead to further investment. Those are places where you have sufficient data volume to do something meaningful. In the world of data, sample size is forever a thing you have to care about. And so, where do you have good data volume?
Secondly, based upon your previous experience doing this type of work, where do you believe a few months of investment will lead to tangibly improved decision making or ROI? That could be improving your marketing campaigns. It could be taking the customer service operation and making them more efficient. Pick one or two areas that are going to show value quickly, staff them up, and show value. From there, you build up bit by bit. Once the data team has a brand of, “everything it touches, it is transforming and driving a lot of value,” that helps you get the political capital to pursue things that are a little bit more long-term, to figure out some underlying deep question of the business.
At Airbnb, for instance, after showing a whole bunch of successful experiments improving search ranking — because at the time search ranking was literally a handmade score, literally it was just a column of the database with a number in it and we were ranked by that number — we then moved to machine learning and kept improving revenue.
And then we got the leeway to spend a month investigating seasonal pricing, which we kind of always knew was going to be a big deal. If you look at the rest of the travel industry, they do seasonal pricing. And if you look at our price filters, people obviously use them differently across seasons. That was a lot of months of work to make that case and then ultimately justify a dynamic pricing model. I can't imagine going into the CFO's office and saying, we want to get two of our most senior data people and give them three to six months to go work on a project that may or may not work. You get there by first knocking out a bunch of singles and doubles.
A: So it's a journey. Thinking about the early stages of proving out the brand and value of your data team, what are some other examples of early wins that ultimately the best data teams generated when they were early stage?
C: Webflow is a great example. It's a SaaS business, it was three data scientists at the time when I joined, and the whole company was operating as if every user was basically the same — part of one homogenous pool. But all the early employees knew there was this central divide between people who are website builders for others — contract website builders — and people who were going to build a website themselves, like a business owner. One of the earliest wins we had as a data team was to model that out. Once we did that segmentation, it started having dramatic ramifications for how we spent marketing dollars and how we prioritized products. For example, we saw that if you spend a lot of marketing dollars on a small-business CEO who has never built a website before, it is probably not going to go very far. It's better to spend marketing dollars to get a website builder who's going to make 500,000 websites. We saw that these non-technical CEO types wanted Webflow to look like Squarespace, to be very easy but limited. Whereas the website builders wanted it to look like Photoshop, very powerful and complex.
There was immediate tangible benefit in terms of allocating marketing dollars. But there was also a strategic question that emerged: who are we building this product for? Who do you want to serve?
That was an early win. Another early win related to our affiliate program. There was literally a bug in how it was set up that was wasting $500K a year. There were little wins like that here and there.
At Airbnb, when I first joined, the name of the game was a land-grab growth strategy. The question was, in all the different metros, whether it's Bangkok, Thailand or Paris, France, what do we consider to be a healthy balance of supply and demand? And, where should we be spending dollars on boosting supply in those places? We supplied all the boots on the ground with insights to help guide the resourcing. That was a very early way to demonstrate value.
The other early win that I hadn't mentioned before was in search ranking. We replaced a very manual process with an actual machine learning model that did things like penalizing hosts who never responded and boosting hosts who often accepted people. Those were immediate revenue gains.
A: It's amazing how many times I'm talking with a founder and they're looking for advice on some operational topic. My first tip is, get a data person to actually look into what's going on. It's very difficult to form a churn playbook or create a certain type of go to market strategy if you don't actually understand what's happening. And I can imagine seeing many founders hiring data people earlier and earlier in their startup journey.
C: Yeah. One of the things about Airbnb is that even though the founders were designers and weren't really data people, they did hire Riley as the first data employee. There were enough investors in their ear to make an investment there that they went and did it. The trap I see all the time is that when people start a data team, they do it in a "dip your toe in the water" fashion: let's just bring on a data scientist and do some reporting. And they'll hire someone a little bit junior, or maybe mid-career — someone who knows SQL and has worked at a couple of places.
But when you're first deciding how to invest and start this flywheel, you need someone who can think more strategically, because the core decision is: where are you going to go to notch your first wins? And how do those ladder up to the next ones? To make that judgment, you need to understand that "this business is a marketplace whose goal is margin improvement" or whatever it may be. You have to be able to see a few steps ahead and say, "the best investment we can make right now is marketing ops" or another area. The failure mode is when you're simply building core metrics for the board decks.
A: My last question for you, Che. You started an experimentation company, Eppo, which almost by definition is intended to help data teams prove their impact on priorities, since the nature of experimentation is that there's some output variable that you're trying to impact, and you see new mechanisms for how you can best impact it. I'm sure we can have a whole conversation about best practices for experimentation, but if you had one tip to leave people with today, what would that be?
C: Besides buying Eppo, what I would say is: the reason why experimentation has a great effect is that it takes the metrics that underlie the business — the stuff the board cares about — and plugs them into all of these product investments you're making. If you ran an experiment and it showed a successful result, would your metrics be ones that would actually get attention? If you go to the CFO and say, hey, I just boosted click-through rates 10%, that's not going to get you very far. It's not going to lead to this ladder-up of capabilities and brand. When you're running experiments, make sure you use business metrics. And then, that logically leads you to think about the experiments that will drive those business metrics.
A: Che, thank you so much for joining us today. This was an awesome conversation. I know that people will benefit a lot from it.
C: This has been great, Allison.