One thing many high-growth startups have in common is the growth team — a multi-disciplinary group tasked with moving bottom-line product metrics.
Unlike traditional marketers, who focus on acquisition, growth teams iterate on the product to improve things like retention, referral, and average revenue per user.
Mike Rome leads growth at Eat Street, and in his career has helped acquire millions of users for a number of startups. In this discussion Mike shares how he approaches growth, his methodology for prioritizing experiments, the components of a good growth team and more.
Listen to the audio version.
DI: You’ve talked about the intersection of growth and product. When you say things like “the product is the marketing,” or that a lot of the levers that you can influence from a marketing perspective actually live inside of the product itself, what do you mean?
Mike: One problem with all the data out there now is the noise degrading the signal. What you really want to do is operate above the blizzard. You want to figure out the metrics that actually put you on a path to sustainable growth.
I like Dave McClure’s “Startup Metrics for Pirates” which talks about 5 levers for growth: acquisition, activation, retention, revenue, and referral. We can just run through the definitions really quick.
Acquisition is essentially marketing. How do we get people to this thing that we built? How do we reach the people whose problem we think we’re solving and bring them to this new solution?
Lots of people do this. When they are thinking through growth, they’re thinking about the channels they could potentially tap to acquire customers.
When I’m looking for success in an acquisition channel, the two things that always matter are scale and unit economics.
Scale means “can we find a sufficient number of customers relative to whatever the goal is?” Unit economics is the cost to acquire them relative to the value that they create, which we generally talk about as LTV.
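To make those two checks concrete, here’s a minimal sketch in Python. Every number is invented, and the 3:1 LTV-to-CAC threshold is just a commonly cited rule of thumb used as a placeholder, not something prescribed in this conversation.

```python
# All numbers are invented for illustration; swap in your own channel data.
def channel_check(monthly_new_customers, cac, ltv, monthly_goal, min_ltv_to_cac=3.0):
    """Rough screen for an acquisition channel: enough scale, and lifetime
    value comfortably above the cost to acquire a customer."""
    return {
        "scale_ok": monthly_new_customers >= monthly_goal,
        "ltv_to_cac": round(ltv / cac, 2),
        "economics_ok": ltv / cac >= min_ltv_to_cac,
    }

# Example: a channel delivering 800 customers a month at a $40 CAC and $150 LTV,
# against a goal of 1,000 customers a month.
print(channel_check(monthly_new_customers=800, cac=40, ltv=150, monthly_goal=1000))
# {'scale_ok': False, 'ltv_to_cac': 3.75, 'economics_ok': True}
```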
Like I said, most teams do this. But many companies lean too heavily on acquisition or marketing as an engine for growth.
Activation means once we get people to this thing that we have, how do we get them to do what we want? Depending on the price of what you’re selling, it might be a purchase if it’s an easier sell. If it’s something way more expensive, it might be a sign-up or even watching a video or something like that.
Retention is how you get them to continue to come back and do what you want. It’s probably the most important of them all, because it’s the most indicative of a good product.
Revenue is how you get them to engage in some sort of monetization behavior. And there are ways to do that beyond bringing people to your site and letting them buy what they think they want.
The last one is referral. How do you get them to have such a great experience that they become the marketers for you?
The really big takeaway: of these five levers, only one of them is marketing. All of these other things happen once people get to the product itself.
People don’t spend enough time trying to study, reflect on, and experiment with those product levers, and ultimately they’re a much more ROI-friendly way of growing. If you already have people coming to something and you can use the resources you have in-house to build a better product that raises order values or encourages them to become marketers for you, you get a lot more bang for your buck.
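As a rough illustration of that point, here’s what the five levers might look like laid out as a simple funnel. The counts are invented; the only part marketing directly drives is the first line.

```python
# Invented counts of users reaching each pirate-metrics stage.
funnel = {
    "acquisition": 10_000,  # people reached through marketing
    "activation":   2_500,  # completed the key first action
    "retention":      900,  # came back the following month
    "revenue":        300,  # paid for something
    "referral":        60,  # invited someone else
}

stages = list(funnel)
for prev, cur in zip(stages, stages[1:]):
    print(f"{prev} -> {cur}: {funnel[cur] / funnel[prev]:.0%}")
# Every step after the first happens inside the product itself.
```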
DI: I would imagine that a lot of people who do marketing would say “I stand up referral campaigns. I build funnels to activate.” They might think this sounds the same as what they do now. What are some of the process or competency differences that you think distinguish growth from marketing?
Mike: Sure. I think you’re hitting on this idea of putting growth process before tactics, which is one of the bigger differences I’ve seen.
Early on when I got into tactics, a lot of it was informed by whoever was yelling the loudest in their content marketing on social media. You’d read something and be like, “Oh, that sounds interesting, right? We should go try that.”
The danger in putting tactics before process is that unless you have a way of continually starting and finishing tasks, it’s very easy to get left behind. Everyone who’s listening to this podcast is going to be resource-constrained. If you don’t have a process to thoughtfully prioritize and have conviction around what you’re doing, you’re just going to waste too much time.
Our process at a high level is to start with a brainstorm around those five levers that we talked about — acquisition, activation, retention, referral, revenue. We try to go for unbridled ideation and think of things that you don’t even think will work.
Maybe they’re just marketing channels that are less intuitive. Sometimes that stuff gets really interesting because it’s generally less crowded, less saturated. Once you do that brainstorming to get all the ideas down, I would pair it with some sort of quantitative audit to assess potential impact.
It’s okay if you’re just starting up and you don’t have any internal data to reference to inform which tactics within those levers you might hit first. Secondary research works — there’s stuff out there, maybe from future competitors in the space, or general insights that have been written about either the problems you’re trying to solve or solutions that already exist.
DI: If they do have data, what are you looking for to assess impact — things like bottlenecks? Is that what you mean when you say doing an audit of what they already have? What should they be looking for?
Mike: I think one of the dirty little secrets of trying to grow something is that often the best things are just fixing stuff that’s broken.
A lot of people hear this term growth or growth teams, and it’s a sexy notion. But the reality is a lot of times we’re just finding hiccups in the product and taking things away or just making things work as we thought they would work, across all platforms or all browsers.
Outside of fixing stuff, we’re looking for places where inputs and outputs aren’t equal. How do we spend low effort on high-potential, high-impact, high-confidence tasks?
We use something called ICE scoring to prioritize. ICE is just an acronym for impact, confidence, and effort.
So after brainstorming and doing that audit, you have all these things you want to test or these new product ideas as it relates to activation, retention, etc.
For every single idea, you should start with impact. Ask, “If this thing were to work, what’s the impact? What’s the payoff?”
You can put together a quick and dirty model. It doesn’t have to be perfect, just go through the exercise and figure out what’s important.
Next is confidence. It’s great if you find an idea and you’re like, “Wow. If this works, the impact is huge.” If you don’t have a lot of conviction around the impact though, and there’s not a lot of primary or secondary data to support what you’re saying, it’s probably not the first thing I would do. Confidence matters just as much as impact.
The last thing, the most important thing when you’re starting up, is effort. What’s it going to take to actually test this new idea? Can I do it myself? Can someone on my team do it? Do I need a developer? Do I need a designer? Do I need multiple developers?
The Holy Grail is the low-effort, high-impact, high-confidence ideas. And that’s where you want to start.
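A minimal sketch of how that scoring might look in code, assuming 1-to-10 scores you assign yourself. The idea names and the exact scoring convention below are placeholders; teams combine the three numbers differently (sum, average, or multiply).

```python
# Hypothetical ideas: (name, impact, confidence, effort), higher effort means more work.
ideas = [
    ("simplify signup to one step",         8, 7, 3),
    ("add referral prompt after purchase",  6, 5, 4),
    ("rebuild onboarding as a video tour",  9, 3, 9),
    ("fix broken password reset on mobile", 5, 9, 2),
]

def ice_score(impact, confidence, effort):
    # One convention: average impact and confidence, and reward low effort.
    return (impact + confidence + (10 - effort)) / 3

# Rank ideas so the low-effort, high-impact, high-confidence ones float to the top.
for name, *scores in sorted(ideas, key=lambda i: ice_score(*i[1:]), reverse=True):
    print(f"{ice_score(*scores):.1f}  {name}")
```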
The reason I would weigh effort so much is that in the early days, momentum is everything. If you think about the formula for growth, it’s the number of experiments you can run multiplied by impact multiplied by success rate.
So you have these three inputs and really only one of them is in your control, and that’s the number of experiments you start and finish.
And the reason starting and finishing experiments is so important is because that’s how you learn. Most of the things you’ll do don’t work. You don’t get demoralized. You ask yourself, “Why didn’t this work? What was our hypothesis that was disproven, and how does that re-inform our priorities? How do we move stuff around based on what we’ve learned?”
The more tests you can start and finish, the quicker you learn, and the quicker you learn the higher your success rates. And that’s when you get that flywheel spinning.
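To spell out that arithmetic with invented numbers (this is just the stated formula written down, not a model from the interview):

```python
# Growth output is roughly: experiments finished x average impact per win x success rate.
# Only the first factor is directly under your control; the other two improve as you learn.
def expected_lift(experiments, avg_impact_per_win, success_rate):
    return experiments * avg_impact_per_win * success_rate

# Few experiments and a low hit rate versus more shots and a better hit rate from learning.
print(f"{expected_lift(experiments=6,  avg_impact_per_win=0.04, success_rate=0.2):.1%}")  # 4.8%
print(f"{expected_lift(experiments=20, avg_impact_per_win=0.04, success_rate=0.3):.1%}")  # 24.0%
```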
DI: What would you say to somebody that says, “Hey, this all sounds awesome. I would love to try to implement something like this for my team, but I have folks who come from a different sort of mentality?” How do you try to make something like that happen inside of an organization?
Mike: It’s really important when you have those conversations not to think that you have all of the answers. Starting from a place where your path is the only path is a bad idea.
I recommend this path just candidly because I’ve seen it work for me. But I’m always mindful that our process is a living process. It’s broken. We are changing it all the time and trying to get better.
And so I would ask questions. “Why do you think that’s the best approach?” It’s important to persuade people that you’re just trying to develop a better mechanism for how to move forward.
One thing everyone eventually realizes is that there are a thousand different things they could be doing. So I’ve told people that process is peace of mind. It gives you confidence in what you’re doing and not doing. And looking at it from that lens is really helpful.
The other thing is to realize it’s only part of the equation. There’s still the need for brand marketing. In my job today, I spend half my time on marketing and customer acquisition and the other half on product. But my side is very data driven.
The other part of customer acquisition for our company is brand-focused. I’ve learned they complement each other super well. If you execute on creative, whether it’s TV or radio or out of home, they can really lift up direct response and what my team does. So they’re symbiotic.
DI: It seems like if you were to start on the product side, get some wins, demonstrate the process works, and then ask for a seat at the table around acquisition, that might help.
Mike: I think that’s a great idea. It’s also important to stress that they have their job, which is to get people to the site, and you have your job which is to make sure the product does what you sold.
One of the big mistakes I’ve made and I know is common is that it’s really easy to fake growth. The number-one way to do that is to find channels where you can acquire lots of inexpensive customers without being mindful of the unit economics on the other half of that equation of successful customer acquisition.
It’s equally important to start with the product and to worry about levers and retention to make sure you’re ready for marketing. If you haven’t built something that enough people want and ideally are willing to pay for, it won’t work.
You want to get enough people using the product to see if some of those activation and retention metrics are improving; you just want to know if you’re steadying the foundation. A lot of times the foundation isn’t steady. When people get into customer acquisition, it’s intoxicating to see lots of app downloads or lots of views or even purchases, but those are one-time purchases. If people aren’t coming back, that’s the biggest mistake.
DI: You see that a lot with startups who have investors setting expectations around consistent growth period over period, and now you have to keep plowing increasing amounts of money into it to continue to show what they’re expecting, even if your foundation is shaky underneath.
Mike: Absolutely. You have to acknowledge that even if you have really great investors and backers, your incentives are never perfectly aligned.
If you look at the unit economics of how most investors or funds make money, it’s not because they have lots and lots of wins. It’s usually one big whale that carries the fund.
So they have an incentive to figure out who’s going to boom or bust as fast as possible. They want to figure out if they should spend their time and energy with a business or even put more capital into it. As a result there is a lot of pressure to spend money to drive scale, even if it doesn’t make the most sense for the business.
I’m lucky, in that we’ve got really awesome investors, and we can have these kinds of conversations about what’s responsible. But there’s always pressure to spend faster and you really want to make sure you’re spending in a way that’s sustainable.
DI: We run into that in an innovation context where organizations are used to measuring success the same way they measure the success of the core business. There’s a strong tendency to want to show immediate results and to make it look bigger than it’s ready to be; that’s the whole concern around premature scaling. But it takes time to iterate and find product-market fit.
Mike: One of the best pieces of advice I would give to folks who are part of larger organizations and working on innovation teams is that you have to have buy-in from the top that this is going to be a long and demoralizing road, and if people don’t have the stomach for it, it’s just tough.
It’s very exciting at the beginning. But what really drives success is the people and teams who can persist in the trough of sorrow. It comes down to determination and commitment because of all the startups that I’ve studied or been a part of, there’s always that slog. You have to have that buy-in from the top.
Someone told me the other day that one of their best qualities was they take bad news very well. I think that’s actually an amazing competitive advantage. You’ve got to be excited enough about the problem you’re solving to keep going. If it can’t hold your attention span, or the attention of your organization, it’s really tough.
DI: Getting back to process — what does that look like concretely? Can you think of examples of focusing on things that weren’t acquisition related and were able to generate meaningful upside for the organization?
Mike: Sure. So in the early days before we had good language around this, again we were really resource constrained like most of the people probably listening to this.
We were hitting the database and seeing where users were getting hung up. Or going to a coffee shop and user testing a signup flow with people. Talking to users and digging into the product. You can talk to customer service people and figure out where people are getting frustrated.
You obviously have to have good tracking in place to mine for good inputs in the database, with all the steps mapped out. In our case it was medical fundraisers. So activation was getting people to start fundraisers and make some money for the fundraiser up front.
We had enough events in place within the product to look at all of those steps and just see where it was broken. Those fixes weren’t always great examples of innovative ideas for how we could improve the flow; mostly it was things not working the way we’d communicated or intended them to work.
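As a sketch of what that kind of audit can look like in practice, here’s a tiny funnel drop-off calculation over raw events. The event names and data are hypothetical, not the actual instrumentation described here.

```python
# Hypothetical event log of (user_id, event_name) rows; in practice this would
# come from your analytics tool or database.
events = [
    (1, "visited"), (1, "started_fundraiser"), (1, "added_photo"),
    (2, "visited"), (2, "started_fundraiser"),
    (3, "visited"),
    (4, "visited"), (4, "started_fundraiser"), (4, "added_photo"), (4, "shared"),
]

# The activation steps, in order, for this invented flow.
steps = ["visited", "started_fundraiser", "added_photo", "shared"]

# Count unique users reaching each step, then look at where the drop-off is largest.
users_per_step = {step: len({uid for uid, name in events if name == step}) for step in steps}

for prev, cur in zip(steps, steps[1:]):
    drop = 1 - users_per_step[cur] / users_per_step[prev]
    print(f"{prev} -> {cur}: {drop:.0%} drop-off")
```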
DI: You’ve talked about how the number of experiments you start and finish is the one metric you’re really able to control. Are you running a bunch of experiments at the same time to move one metric? How can organizations move faster and control that one variable?
Mike: It’s a million-dollar question. I think tracking is really tough, and it’s hard to know the optimal set of things you should be doing at once.
I think sometimes people get hung up trying to start and finish an experiment and definitively know what it means.
Take acquisition. In most cases you just need to know if there’s something there, or if it’s a hard pass. Assuming you get some results, and you know there’s scale, that’s probably sufficient. So you don’t always need to have absolute certainty about something — often directional certainty is sufficient.
That’s really important when you’re resource constrained. Even for me — I work for a later stage startup with hundreds of employees and tens of millions of dollars in revenue — I’m still super resource constrained. I don’t always have the luxury of running 5 product experiments at once.
DI: People that think of testing probably think of Amazon’s hundred shades of blue. But unless you have scale, it’s going to take way too long to find wins that way, especially since most experiments fail.
Mike: Yeah. I think early on that’s good advice. If you’re only changing this one thing and it’s very early and you have a small subset of customers, you’re probably not testing ambitious enough things.
Don’t worry about this idea of not knowing exactly what worked. Know your goal: you want to acquire twice as many customers, or you want to improve the signup flow by 50%. You’re not looking for a five percent relative improvement, you’re looking for a 50 percent improvement.
If you adopt that mentality, you have the freedom to change more things. You just have to be content knowing that even though you’re better off, you might not be exactly sure why things are working, and that’s OK sometimes.
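One way to see why smaller teams should test bigger swings: the sample you need grows roughly with the inverse square of the effect you’re trying to detect. A rough sketch using a standard rule-of-thumb formula, not anything from this conversation; tools like Evan Miller’s calculator do this more carefully.

```python
# Back-of-the-envelope sample size per variant for an A/B test on a conversion
# rate, using the common n ~= 16 * p * (1 - p) / delta^2 rule of thumb
# (roughly 80% power at a 5% significance level). Numbers are illustrative.
def sample_size_per_variant(baseline_rate, relative_lift):
    delta = baseline_rate * relative_lift  # absolute difference to detect
    p = baseline_rate + delta / 2          # midpoint of the two rates
    return int(16 * p * (1 - p) / delta ** 2)

base = 0.05  # 5% signup conversion rate
print(sample_size_per_variant(base, 0.05))  # detect a 5% relative lift: ~124,000 users per variant
print(sample_size_per_variant(base, 0.50))  # detect a 50% relative lift: ~1,500 users per variant
```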
DI: I’ve told our team before — lacking causation, I’ll take correlation. I don’t know if this is actually causing it, but it seems to be correlated and that’s good enough for me for now. Let’s deploy and move forward.
There’s also a team member education piece to this, where you’ll run the test higher up in the funnel — on your registration page for example — and if it works and you see a 15 percent lift, they’re thinking that’s going to trickle down to a 15% bottom line bump.
But it usually doesn’t work out that way. Any change that you make changes the way people interact with your product. Their behavior is going to be different.
Mike: Yeah, there are people a lot smarter than me on A/B testing. Evan Miller writes some really thoughtful things around what to pay attention to when you’re testing different experiences.
I think you have to be OK with some of this ambiguity and not knowing, if you want to move forward fast enough.
It’s a delicate balance though. How do you make sure you’re using data to have conviction around what you do next, while operating above the blizzard of data?
DI: For us, clients often have the reverse problem: they don’t have any tracking set up, at least not at the event level. They track sales or registrations, but not all of the events that lead up to that.
Mike: Well I think the larger takeaway here is, when it comes to starting something even if you don’t have data, that’s fine. Go make data. Go talk to potential customers. Don’t sit in a conference room and ideate with a bunch of the executives who aren’t going to be the user of the product.
I don’t think this whole idea of “we don’t have analytics set up” or “we don’t have users” should get in the way of putting in the work. Go talk to people. Even if it’s bad news. The best feedback you can get when you want to build something is figuring out it doesn’t work. Figure out why and improve.
DI: I was having a conversation with our creative director about design thinking. We often do “how might we” exercises that are similar to the unbridled ideation you were talking about.
He said a lot of UX is about pattern recognition, and is dependent on the inputs you have. Someone cold off the street can throw out ideas, and it’s better than nothing. But they’ll be less informed than if you do some research in advance of a workshop. If you know what your customers say and how they use your product, you have much better inputs and the session is more productive.
You mentioned the potential risks of becoming prescriptive and looking at what other people have done and just copying it. At the same time, those are inputs I can bring and say, “Hey, this product, or a completely unrelated industry, solved a similar type of problem by doing x.” Does that make sense at all?
Mike: I think it’s a really powerful point. I think again, especially when you’re starting, don’t get hung up on not having certain inputs. And don’t get hung up on this idea that you need hundreds of pieces of input.
There’s plenty of reading out there around picking up UX patterns. And from a user testing perspective, it takes a small subset of users to figure out what things would satisfy 80 percent of the product needs.
Don’t get hung up on the idea that your inputs aren’t good enough. Just go find any inputs. If you’re curious and you’re bringing inputs to the table that are rooted in actual user feedback, not just from your head but from talking and listening to a potential customer, that’s useful.
Paul Graham talks a lot about how users have lots of the answers. Nine times out of ten, I can’t think of a better use of time than just going and listening to some users and studying them, either using the product or reacting to you talking through a product.
DI: Let’s talk about team. I know it looks different depending on the stage of an organization, but in order to execute on growth, it seems like you need more than just a person with a marketing mind. What skill sets are necessary?
Mike: Curious people are a big deal. You want to find people who are comfortable with bad news. They’ve got a lot of persistence and determination — the whole idea of a rapid rate of learning is the reward and that’s enough. I think all of those are really good things.
In terms of the makeup of an early team, you should have people on the team who are essentially the customer. It’s hard to build something if you’re not the customer and you have to guess at what the customer wants. Be the customer because then you can move a lot faster. You can just think about how you would create value for yourself.
When it comes to actual skill sets, it’s good to have a developer to execute on experiments. And you need some design help. Design matters.
I think technical marketers are important — people who are really into direct response marketing. They’re good because they’re generally detail-oriented, so they can double as product managers.
They also have empathy for developers and designers, and respect for what they’re asking them to do. It’s really easy for someone to say a couple of sentences and create five weeks of work for a developer. Knowing what you’re asking and why that’s hard is important. It’s just good for the morale of the team.
You also need someone with some analytics background. Have someone who is a truth seeker and is going to keep you honest.
If you just find people who are excited to build shiny new things, instead of building things that actually solve a real and deep problem, that’s going to be difficult.
DI: One of the issues with growth teams has been getting access to developers because so many teams are resource constrained. How do you recommend advocating for getting resources when you’re just getting started?
Mike: When it comes to getting developer time, you have to find someone who’s excited about the opportunity. You need to find a developer who gets fired up about the work. You can’t overstate the importance of being excited about the problem. At some point it’s going to get hard. And if you’re not excited about the problem, it’s tough to keep going.
When you’re trying to get buy-in from the top, explaining the value the work could create for the business is a good approach. Just let the facts do the talking. It forces you to be really well researched and prepared. It’s not very hard to show that working on x versus y could have a financial benefit to the business.
Figure out who you’re trying to get buy-in from. What are the things that make them look good? What do they care about? It’s tough, but politics matters. Especially if you’re in a larger company.
DI: One of the tools that we’ve been using a lot more in the last year is a super granular growth model. Trying to visualize and quantify all of those levers, figuring out the six steps during activation here and the three different referral loops inside of a product.
Now, instead of saying let’s spend three months on this feature because it’s next on the roadmap, showing how something will influence a variable and doing a sensitivity analysis is really helpful in making that case.
Mike: The reason detailed models are very helpful is they open your eyes to the road ahead. Getting a sense of awareness before you start and a degree of humility is going to help you. You want to make sure everyone has the stomach for what’s ahead and then is still excited.
I took a look at it. You sent it over to me a little while ago and I was super impressed. I mean, a lot of that stuff was more savvy than some of the stuff I’m using, so hats off.
DI: Everybody gets it from other people. That was all Brian Balfour. It’s incredibly helpful for vetting potential investments as well. A founder says, “I’m raising $1.5M and our plan is to get to 50,000 customers over the next twelve months.”
And I ask, “How do you plan on getting there?”
And they’re like, “Well, there’s this bucket of money and we’re going to do word-of-mouth and we’re going to do paid marketing.”
When you show them the model, they start to realize that getting to 50,000 is going to take a lot more work than they thought. When they talk about doing content marketing, you have to find out how many articles you’re going to have to write to rank. What’s the monthly search volume on all of those and your expected click-through rate, etc. They start to realize they need to really buckle up and plan on hustling.
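To put rough numbers on that kind of gut check, here’s a back-of-the-envelope sketch. Every figure below is an invented assumption to replace with real keyword research.

```python
# Hypothetical content-marketing math of the kind described above.
articles           = 50      # posts you can realistically write and rank
monthly_searches   = 1_000   # average monthly search volume per target keyword
click_through_rate = 0.05    # share of searchers who click through to you
visit_to_customer  = 0.03    # site conversion rate
months             = 12

customers = articles * monthly_searches * click_through_rate * visit_to_customer * months
print(f"{customers:,.0f} customers over {months} months")  # ~900, a long way from 50,000
```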
Mike: I couldn’t agree more.
DI: You’ve talked about process and you’ve talked about team. I know you’re a big believer in culture too. Once you’ve got the team together, what are some recommendations for instilling a culture of growth to make this stuff stick?
Mike: A great shortcut is to find people who fit the mold. There’s certain things you can’t teach. You have to find people who are determined, who are okay being wrong a lot, who are curious and truth seekers.
Setting the precedent up front helps too. Making sure people really understand how difficult the road ahead is going to be and are bought in. I don’t mean to talk about it negatively, because I love it. But the learning curve is just ridiculous. You grind it out for six months and hit some hard times. But when you come up for air, you’ve built this new skill set, and that high rate of learning never stops. There are certain people who just have an appetite for it.
Ask some tough questions in the early days to make sure people are doing this for the right reasons. Get people who aren’t just here because it seems like a cool idea. It needs to be something more — people really need to believe in the idea.
DI: I know you have a reputation for keeping people’s enthusiasm levels high throughout the slog. And you also have a reputation for modeling behavior — getting your hands dirty instead of just barking orders. Were those deliberate choices?
Mike: I’ve always appreciated people that are still in the trenches. That carries a lot of weight for me.
The other thing is that lately I’ve just been trying to get out of the way a little more. We’ve been at it for a while; the team’s been a formal team for several years now. And I think sometimes you need to get out of the way.
I think about all of the best growth ideas that have happened on growth teams I’ve been a part of, and not one of them was my idea. Getting comfortable with that is something I’ve been working on.
The thing I’m always focused on bringing to the table is making sure there is a culture of meritocracy, where the best idea wins and it doesn’t matter who it comes from. Use ICE scores, make sure people have the tools and time to execute, and get out of the way.
Originally published at digintent.com on November 19, 2018.