
Nick Sharma
Welcome to Limited Supply, the place for refreshingly real takes on what D2C is really like. I'm your host, Nick Sharma. Let's start talking about money. Have you ever added up all the conversions attributed to each of your paid channels and realized that the sum is greater than the actual number of conversions on your site? That's likely because most ad platforms tend to over-report and take more credit than they deserve for the conversions they drive. Today, sophisticated marketers are moving away from attribution and toward incrementality testing in order to maximize growth and understand efficiency. To find the incrementality of your marketing tactics, you have to run test-and-control experiments. But these tests are hard to do on your own, and that's why I recommend using Haus. With Haus you have an automated, self-service platform that allows you to configure regional test-and-control experiments to measure incrementality, identify points of diminishing returns, optimally allocate marketing dollars, and maximize your growth. The platform is built by world-class scientists and allows you to test all your marketing channels, both online and offline. You can measure the impact across all your sales channels, including direct-to-consumer, retail, Amazon, etc. And you can calibrate your in-platform reporting for incrementality with Haus. My friend Connor at Hexclad uses Haus, and he shared that when they ran an incrementality test around Meta and Amazon, they realized that a lot of their Meta spend was indirectly driving revenue for Amazon. Those learnings actually changed their acquisition strategy for Amazon going forward. If you're at least a seven-figure brand and you're running on multiple marketing channels, then you can potentially save millions of dollars by starting an incrementality practice and learning what's truly driving your business. Haus uses superior science and makes this type of testing accessible, so I recommend you check them out. Go to haus.io/limited, that's H-A-U-S dot I-O slash limited, to learn more and get a demo.
Unnamed Host
All right, Olivia, I want to start with your background. You have such a fascinating background. Why don't you tell us about it, and then I'll pepper you with some questions along the way.
Olivia
For sure, sounds good. So, starting from the beginning: I came up in the paid media marketing world at an ad agency in Chicago, in ad tech, mostly working on the Allstate business. And I say that only because they were actually pioneering incrementality testing back in 2012.
Unnamed Host
Wow.
Olivia
So I was introduced to it very early on in my career. Then I made the transition in-house at Netflix. I joined there in about 2015, 2016, as they were starting to expand globally, and it was a playground for learning, just working with incredibly smart people. We were running incrementality tests and geo experiments all over the world, and we had a very clear understanding of what paid media and marketing was delivering to the business, in a world where there was tons of organic demand and noise and it was really hard to actually understand that. We were able to figure it out because we had this amazing team of data scientists, and I remember one year we reduced our acquisition costs by like 80, 90% by doing some of these tests. I can share more on the specifics later. But then a few of us went over to Quibi. I don't know if you've heard of it.
Unnamed Host
Yeah, the video app. I remember Chrissy Teigen had a show, Judge Chrissy or something.
Olivia
Yep. It was the failed streaming startup of 2020. Learned a ton and got to see the complete opposite play out: at Netflix, lots of organic demand, really hard to make media incremental; at Quibi, we had so little organic demand that everything we were doing in paid was incremental, and it was just a question of, at what cost are we willing to acquire these customers, in terms of willingness to pay? After Quibi, I went over to Sonos to lead growth, and that's actually where I reconnected with Zach, the founder of Haus. Sonos was heavily reliant on last-click attribution. They had just integrated an MMM, and that was pointing in the complete opposite direction in terms of insights. So I pitched experiments as a way to really start to quantify what marketing was delivering to the business, like I had done at Netflix. I hired Haus, I hired Zach. I was one of the early customers of Haus at Sonos, and it was going so well, and we were making such significant progress in such a short amount of time, that I was like, all right, I need to be a part of this. So I moved over to Haus very early, a couple years ago, and haven't looked back since.
Unnamed Host
That's amazing. And before we get into the rest of this, could you just, at a high level, explain MTA, MMM, and incrementality, just so everybody listening understands the context?
Olivia
Yeah. When you think about incrementality, the difference between incrementality and an MTA or an MMM is the establishment of a counterfactual. Think of randomized controlled trials in healthcare; that's my easiest and simplest comparison point. When you're rolling out a new drug, you give one group of people a placebo, and you give another group of people, who are statistically indistinguishable from that control group, the actual drug. Then you observe the difference in behavior between those two groups to validate the efficacy of that drug. That's incrementality testing: we have a counterfactual to understand what would have happened anyway in the absence of this intervention, whether that is search or video or YouTube. What was that group going to do anyway? That's really, at its core, fundamentally what we mean when we talk about incrementality testing. MTA and MMM are more correlational. They're needed and necessary in the mix, but MTA is more looking at correlations: all right, this user was exposed to this ad and then they did this. Not causal. And in terms of MMM, you're just looking at patterns and relationships versus having that true holdout group.
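The randomized-trial analogy above maps directly to code. Here's a toy simulation (all numbers invented for illustration): a treatment group sees the ad, a statistically identical control group doesn't, and the incremental lift is the difference in conversion rates.

```python
import random

random.seed(0)  # fixed seed so the toy result is reproducible

def conversions(n, base_rate, ad_effect=0.0):
    """Count conversions among n users whose true conversion
    probability is base_rate plus any causal effect of the ad."""
    return sum(random.random() < base_rate + ad_effect for _ in range(n))

n = 100_000
# Control group: the "placebo", no ad exposure.
control = conversions(n, base_rate=0.020)
# Treatment group: same base behavior, plus a small causal ad effect.
treatment = conversions(n, base_rate=0.020, ad_effect=0.005)

control_rate = control / n
treatment_rate = treatment / n
# The counterfactual is the control group: what would have happened anyway.
incremental_rate = treatment_rate - control_rate
print(f"control {control_rate:.3%}, treatment {treatment_rate:.3%}, "
      f"incremental {incremental_rate:.3%}")
```

Attribution, by contrast, would credit the ad with every one of the treatment group's conversions, including the roughly 2% of users who were going to convert anyway.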
Unnamed Host
And is it true that MMM really factors in the last two years of data, so it's not really able to be leveraged properly unless you have two whole years of data? Plus, if you think about Covid factoring into that data, does that mess up the data when you think about MMM?
Olivia
Yeah, you need a lot more data. Because if, for example, you're spending up this Q4 and your business goes up, you need enough variation to understand: is your business going up because it would have happened anyway, or because you're changing your media mix? So you need a lot of historical data, just more looks at what's going on. You need to be able to look at last Q4 and the Q4 before that to actually understand and parse out what media is driving. The more data you can feed it, the better.
Unnamed Host
Amazing. I'm such a huge Sonos fan. I've got to be in the top 2% LTV of Sonos customers. Are there any fun tests that you ran at Sonos or Netflix or Quibi that you can talk about?
Olivia
Yeah, I can. What's your favorite Sonos product?
Unnamed Host
I mean, at home I've got the soundbar and the sub, and in my room I've got the two Sonos Ones. In the office, we've got like 10 Sonos Ones and two subs.
Olivia
When I joined, they were like, you need to just splurge on the sub. I never thought I needed a sub; I wasn't big on it. And it completely changed things. Like, I don't want to go to the movies anymore. No, facts, it changes the whole experience.
Unnamed Host
I started getting noise complaints from downstairs after I got the sub.
Olivia
Yeah, I can see it not working as well in New York. Okay, some experiments that we ran at Netflix. Well, I actually won't share the brand, but I've seen time and time again that brand search and affiliate are kind of the usual suspects when it comes to incrementality. Like, shut it down and see no impact to the business, even though on last-click attribution it looks like your best channel.
Unnamed Host
Right.
Olivia
So those are kind of the obvious ones.
Unnamed Host
That's the trick Honey uses.
Olivia
Yep.
Unnamed Host
Those fucking scammers.
Olivia
Yep, Honey. And it's tough to test. They make it really hard to test.
Unnamed Host
Because they just immediately refresh the page.
Olivia
Yeah. And there's no geo-targeting capability on couponing. We did test this a few times at Haus. You just have to shut it down, say, I don't want this up for a month or two, and then you're able to glean some pretty interesting insights. So ask your incrementality partner if you can do that. Another interesting one: one of the experiments that we ran on Meta at Netflix was broad targeting versus manual targeting, you know, interests and demographics. We had all of these different titles that we were leveraging to drive acquisition. Ozark has a very different audience than The Crown, and we were going in and manually interest-targeting each title to the people we thought would respond. So we ran an incrementality test where we just gave Meta the keys: go broad and algorithmically find the users who are going to sign up as a result of seeing these titles. And that crushed. We rolled it out globally, moving away from interest-based audiences on Meta into just broad targeting. We called it unfettered, and we just let Meta take the keys in terms of finding the users.
Unnamed Host
Have you seen that meme? There's an Instagram account, I think it's called chadvertising and it's like the meme of the guy with the blonde hair just sitting back going broad, ripping all the ads.
Olivia
Yeah. And it's also just in terms of resources: it's such a time savings, and if it works better, or just as well, you can save so much in terms of media buying.
Unnamed Host
Totally. There are probably some cool learnings at Sonos too, because Best Buy is such a big partner for retail, and then there's Amazon and there's the sonos.com site. Did you find any cool learnings or experiments there?
Olivia
Yeah, this is actually one of the big reasons I signed up with Haus: a lot of Sonos business happens in retail, and so we were missing a very large part of the picture with click-based attribution only focusing on DTC. I'd say the best and most interesting tests we ran at Sonos were about understanding the dynamics of where people are buying. Search, for example: even if any given search test didn't show incremental lift in revenue, what we did notice was that it was share-shifting users from retail into DTC, where our margins were better. So if you're looking at profit optimization, you're like, all right, it's not incremental revenue, but we're actually improving our margins by moving users into DTC. That was a huge learning and really, really interesting for the business.
Unnamed Host
Did you find there was a higher multiplier on LTV for customers who were both retail and DTC customers, versus those who were just Best Buy only or D2C only or Amazon only?
Olivia
I do think that once you have users in your ecosystem buying through DTC, you can email them and really get them into the ecosystem. So I do feel like the goal was always to get users into DTC.
Unnamed Host
Yeah, fully agreed. Okay, I want to do some more Incrementality 101. So pretend we're explaining incrementality to somebody who's never heard of it. We have the healthcare example, which I think is amazing. But why would somebody use Haus and not just use, like, Google Analytics or Meta's own incrementality tests? Why should somebody use Haus?
Olivia
Yeah, this is a great question and I'm really glad we're covering it because I think there's a bit of confusion out there in terms of what you need and when. So there are three types of incrementality tests and I'll go through each of them.
Unnamed Host
Okay.
Olivia
The first type is a geo experiment. This is primarily what we're doing at Haus, and the reason is that it has a lot of advantages. Number one, privacy: it's durable in terms of the data. We don't need any user-level data at all, no PII. All we need is sales and revenue by day by geo, which any e-comm brand will have pretty easily. And with that aggregate-level data, we're able to do testing across any channel, online or offline. So I think a key characteristic is that it's been resilient to all the privacy changes. The second big reason you want to go with geo is that you test in the exact same way across every channel, online or offline. Meta's incrementality tool works differently from Google's, and Snap doesn't even have one, as an example. By geo testing, you're running the exact same methodology, standardized, across every channel you test: OTT, direct mail, even out of home. So that's why we work on geo. One of the common complaints about geo is that it's a little noisier than user-level. And so there have been a lot of advancements in the science here, with what we call synthetic control methodology, to squeeze as much precision and power out of these tests as we can. Because historically, people want to do geo testing, they understand it, but it's very expensive, you need to run it for a long time, and you perhaps need a larger holdback group than you would like. That's where a lot of the advancements in the science are making a big difference in terms of adoption. The second type of incrementality test is user-level studies. User-level testing is run by Facebook and Google; this is called conversion lift. You can ask Meta or Google to set that up on your behalf. I think Meta's is a little more available on the front end; with Google, you have to go through reps and stuff.
Yeah, you have to go beg and plead to set it up. But the reason you can't do a user-level test on your own is because in the world of acquisition or prospecting, you don't know who these users are; you haven't acquired them yet. So you need Google and Meta, in terms of their universe, to cleanly allocate a user to treatment versus control for you, because they're unknown to you at that point. Google and Meta have that capability. With the privacy changes, it's been harder for them to do this, to allocate a user into treatment versus control and then follow them through to purchase. So it's been a bit more challenged since iOS 14 and all these privacy changes. For example, I think Google's conversion lift product only works on Android, and then they're modeling iOS.
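The geo experiments described above can be sketched in miniature. This toy version (made-up sales numbers) scales a pool of holdout markets by a ratio learned in the pre-period; production synthetic control methods instead fit a weighted combination of many donor markets, but the counterfactual logic is the same:

```python
# Daily sales by geo: one treated market (ads on during the test) and
# two holdout markets. All numbers invented for illustration.
pre_treated  = [100, 104, 98, 102, 101]                      # pre-period, no ads anywhere
pre_control  = [[51, 53, 49, 50, 52], [49, 50, 51, 52, 48]]
test_treated = [118, 121, 117, 120, 119]                     # test period, ads on in treated geo
test_control = [[52, 50, 51, 53, 49], [50, 51, 49, 50, 52]]

def avg(xs):
    return sum(xs) / len(xs)

# Pool the holdout geos into a single control series per day.
pre_ctrl_daily  = [sum(day) for day in zip(*pre_control)]
test_ctrl_daily = [sum(day) for day in zip(*test_control)]

# Scale factor learned in the pre-period: how the treated market tracks
# the control pool when nothing is being tested.
scale = avg(pre_treated) / avg(pre_ctrl_daily)

# Counterfactual: what the treated geo would have sold, per the controls.
counterfactual = [scale * c for c in test_ctrl_daily]
incremental = sum(t - c for t, c in zip(test_treated, counterfactual))
print(f"estimated incremental sales over the test window: {incremental:.1f}")
```

Here the treated market sells 88 more units than its counterfactual over the test window; that gap, not the platform-reported conversions, is the incremental effect.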
Unnamed Host
Oh, interesting.
Olivia
Yeah, and maybe I should get fact-checked there, but that was my understanding when I last spoke to them about it: they actually can't allocate users cleanly on iOS. They just don't have insight and visibility there.
Unnamed Host
All these companies, it's like they talk about privacy until they have to adhere to it.
Olivia
Yep. Yeah, it's been brutal for Meta and Google in terms of the Apple changes. I think they're bouncing back, and CAPI, the Conversions API, is something that really got Meta back to steady state.
Unnamed Host
Yeah, totally.
Olivia
So that's a user-level test. And this is where my advice is: if you're only spending on Google and Meta, and you're small enough that it hasn't gotten fuzzy yet in terms of who's coming in organically versus through paid, Meta and Google conversion lift studies might get you most of the way there in the beginning. It's just, once you start expanding outside of those two channels, to Pinterest and Reddit and Snap, they don't have conversion lift tools out of the box. And theirs won't be comparable to Meta's, because they'll be using different methodology. So that's when we start seeing customers come into Haus: when they're spending in more than a couple of channels, and it's getting really hard to untangle paid growth versus organic given all the different channels they're spending in, and the contribution of a channel like TV or YouTube, where we know primarily that it's assisting in making your other channels more efficient rather than being a direct acquisition driver. So when you go up funnel and start spending in more channels, you really need geo testing, because you don't have any options in the user-level testing world.
Unnamed Host
Right. Okay, so it sounds like until you're spending on multiple channels, you can actually just leverage Meta and Google's own conversion lift studies. But do those tools factor in multiple points of distribution as well, or just your site?
Olivia
Great question. This is where they fall short: no, they wouldn't factor that in.
Unnamed Host
I feel like people should visualize it: on one side are the channels you're spending on, on the other side are the points of distribution, and in the middle is really where incrementality shines.
Olivia
Yep, that's a great point. No matter what size you are, if Amazon and retail as sales channels are starting to represent a meaningful percentage of the business, then you probably need to be doing geo testing, because you can't feed that Amazon data back to Meta to power conversion lift. So yeah, no matter what your size, if you're primarily offline in terms of retail and Amazon, you should be thinking about geo testing.
Unnamed Host
Yeah. And is there a third?
Olivia
Yeah, yeah. So the third type of incrementality testing is what I call natural experiments at Haus. This was the Honey example: if you can't geo-segment, you need to fall back on on/off testing, where you just shut something off, or you turn something on that you haven't been running, and you observe the business. You have maybe a forecast of what you expect to do, and then your actuals based on whatever change you made. Or take a price change: if you increase prices, you might not run a controlled experiment there, but you can look at the business before and after that price increase and try to glean some insights. It's a lot less precise. There is no counterfactual, so it's more correlational than causal. But I like to say we're doing the best we can here. With a channel like influencer, if you're doing influencer seeding, they can't geo-segment those posts, but you still want to understand how it's working. Maybe you fall back on a time series where you just go on/off.
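The on/off natural experiment boils down to comparing actuals against a naive forecast. A minimal sketch with invented revenue figures, in the spirit of the Honey shutoff example:

```python
# Daily revenue around shutting off a channel that can't be geo-segmented
# (think of the coupon-extension example). All numbers invented.
before = [200, 205, 198, 202, 195, 203, 197]  # channel on
after  = [199, 201, 196, 204, 198, 200, 202]  # channel off

def avg(xs):
    return sum(xs) / len(xs)

# Naive forecast: assume the "off" period would have matched the "on"
# period's average if nothing had changed. There is no holdout group,
# so this is correlational evidence, not a causal estimate.
forecast = avg(before)
observed = avg(after)
change_pct = (observed - forecast) / forecast * 100
print(f"revenue change after shutoff: {change_pct:+.1f}%")
```

A flat result like this one is exactly the "shut it down and see no impact to the business" signal, with the caveat that seasonality or any other concurrent change could be hiding in the comparison.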
Unnamed Host
Oh, I love that. Could you explain the difference between correlation and causation? Because people are gonna think it's casual.
Olivia
I know, I know. We just launched a new product called Causal Attribution, and I'm hearing a lot of talk of, oh, casual attribution. Just chilling, just doing some casual attribution here. Correlation versus causation: correlation is that classic example of brand search. I'm spending on brand search, I have a user coming in, and then they purchase. Well, was that a correlational conversion? Did that ad just happen to hit someone who was coming in anyway, or was it causal? Did that ad actually influence somebody who wasn't going to purchase anyway to make that purchase? We're looking for the causal conversions. A great example is Sonos, with you and your story, where you said you have like 10 or 15 products. If we get an ad in front of you and then you buy the new headphones that just launched, was it the ad that caused you to buy, or were you going to buy no matter what, because you're such a big Sonos fan? We're trying to spend our money on the causal conversions and shift away from just looking at correlational conversions, because there's probably a lot of waste there.
Unnamed Host
Right.
Olivia
And one thing I do like to note is that it kind of sounds bad, like it has a negative connotation: oh, a lot of your spend is non-incremental, so it's probably a discount on your channels. But sometimes folks are undervaluing the media. You can use incrementality to actually add a multiplier, and YouTube and OTT are two obvious examples where I think folks are understating the impact. When you start to look at incrementality and move off of click-based attribution for some of these view-based channels, you actually end up adding a multiplier, where you find out it's like 2x more incremental than your MTA was showing.
Unnamed Host
Yeah, that's so interesting. I feel like that's something I'm hearing a lot more about with YouTube. Actually, Cody was saying this too: when he runs YouTube and tests it with Haus, he sees a really good incrementality factor behind it. But when you look at just the day-to-day analytics of the platform, you would think it's not working at all.
Olivia
Yeah, I love YouTube. I think I'm one of YouTube's best salespeople, because performance marketers think it just can't work, since they're looking at click-based attribution. So there's a ton of opportunity in terms of competition: your competition is not there, because they think it doesn't work. In terms of auction dynamics and competitive pressure, there's a lot of open space for you.
Nick Sharma
Marketing measurement is hard, and honestly it's just getting worse day by day. Reported conversions are often double-counted across platforms, and click-based attribution is misleading and oftentimes incomplete. Today, sophisticated marketers are moving away from attribution and toward incrementality in order to maximize growth and efficiency. This is where Haus comes in. Haus is a fully self-service experimentation platform that allows you to configure regional test-and-control experiments to measure incrementality and identify points of diminishing returns. Brands like Caraway, Jones Road Beauty, Ritual, and Hexclad all use Haus, and it helps them save millions of dollars because they can quickly identify what is actually driving incremental sales versus what's just over-reporting. These types of tests are super hard to set up and can be really tricky to get right. That's why my favorite solution for this is Haus. They have cutting-edge technology, and they have PhD economists and data scientists who have built these solutions before at companies like Amazon and Google. The platform allows you to test all your marketing channels, both online and offline, and measure the impact across your sales channels, whether that's DTC, retail, or Amazon. You can really just upload the sales reports from your retail stores, and it runs all the numbers for you and allows you to calibrate your in-platform reporting for incrementality. With Haus you can finally answer questions like: what is the ideal spend volume in this channel? What is the right mix of upper- and lower-funnel media? Finally add scientific rigor to your marketing and make the most impact with your spend. To learn more, go to haus.io/limited, that's H-A-U-S dot I-O slash limited, to get started.
Unnamed Host
What are some other interesting insights or tactics that you see from experiments at Haus, where you're like, wow, I can't believe more people aren't doing this? Or, this is our secret book of golden nuggets right here.
Olivia
I actually brought a couple.
Unnamed Host
Amazing.
Olivia
I think it's timely. We're in mid September, and so I wanted to talk about this idea of filling the funnel. And I think Cody's.
Unnamed Host
I call it filling the toilet and then flushing the toilet.
Olivia
That's a good one. I don't know if we should put that in our marketing material; I'll think about it. So, filling the funnel: everybody right now is prepping for Black Friday/Cyber Monday. They're trying to figure out upper-funnel reach campaigns, on Meta as an example, and the best way to fill the funnel and maximize reach so that they can have the best Q4 they can. So we looked back at last year's Q4, at all of the experiments that we ran in October, November, December. We have the immediate lift, which is the lift driven during the period of the experiment; most of our experiments are, on average, about three weeks. But then we looked at delayed lift. We have this feature at Haus that we call a post-treatment window, where we stop the test, revert back to national or whatever your business-as-usual is, and just watch and observe the behavior of the treatment markets over time to understand the lagging impact. Hexclad did a great case study on this where there was quite a bit of delayed lift. And it makes sense, right? Higher-price-point, higher-AOV products are not going to convert in the first two weeks and might need a little more time. So we looked at immediate lift versus delayed lift, and we found (let me just pull up my notes here) that delayed lift exceeded immediate lift in 73% of the experiments that we ran in Q4.
Unnamed Host
Wow. And how long was that period?
Olivia
It was like another three weeks.
Unnamed Host
Oh, so not that long.
Olivia
Not super long. We're not looking at months here, just another two to three weeks after. So what does this mean for you? For buyers who are thinking about this right now, you might want to take your efficiency targets down a little bit, understanding that you're going to reap a lot of the benefits come November, when people are actually in market buying. So if you're running these Meta reach tests or YouTube, don't just shut it off because you don't see it working right away. If you're able to spend at a slightly lower efficiency right now, you're going to reap those benefits, because you're going to have more users in market, aware of your product, come November.
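The immediate-versus-delayed-lift accounting works like this. A toy sketch with invented weekly lift numbers (not Haus results):

```python
# Weekly incremental lift from a hypothetical Q4 geo test: three weeks
# in flight, then a three-week post-treatment window after spend reverts
# to business as usual. Numbers invented for illustration.
in_flight   = [40, 55, 60]   # weeks 1-3, ads running in treatment geos
post_window = [70, 65, 50]   # weeks 4-6, ads off, lagging conversions land

immediate = sum(in_flight)
delayed = sum(post_window)
total = immediate + delayed

print(f"immediate lift: {immediate}, delayed lift: {delayed}")
# Stopping measurement when the ads stop would miss the delayed share.
print(f"share of total lift landing after the test ends: {delayed / total:.0%}")
```

With numbers like these, delayed lift exceeds immediate lift, which is the pattern Olivia says showed up in 73% of their Q4 experiments; a buyer who judged the campaign only on its in-flight weeks would understate it badly.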
Unnamed Host
Right? Yeah. That's fascinating.
Olivia
Yeah, this is a big one. And then the other insight from looking back at last year's Q4 is that video campaigns drove 286% more delayed lift.
Unnamed Host
Than search? Video view campaigns?
Olivia
Video, like YouTube and OTT.
Unnamed Host
Oh, okay.
Olivia
Yep. So really, search drove the least delayed lift, video (YouTube, OTT) was by far the most, and Meta reach was somewhere in the middle there.
Unnamed Host
Was there a certain threshold where reach became useful? I feel like normally reach campaigns just target the brokest users, because they know they're not going to click or buy anything.
Olivia
Yeah, reach is not an obvious light-it-up-and-it-works thing. Even Cody and some of our customers at Haus have had to really focus on, all right, what is the right way to buy reach campaigns? It's not just set it and forget it; you have to work at it. Same with YouTube, where you can't just light it up and expect it to work out of the gate. There are tactical improvements and optimizations you need to make in terms of audience targeting. Whereas on Meta it's just, leave it alone, go broad, and that typically works for most folks, I don't think it's the same on YouTube, and certainly not in Meta reach campaigns.
Unnamed Host
Right. Yeah. Normally, I feel like once you hit the million-dollar-a-month spend mark on Meta, then add-to-cart campaigns start to get useful, and landing page view campaigns, or even view content as the conversion event. And then the last thing I think of is reach. But do you find that different conversion events at different levels of spend will drive different incrementality? Like, could you say that for the first one and a half million dollars of spend you can really just focus on the purchase objective, but then for the next two million, to make sure you're not just decreasing efficiency, you should actually run add to cart, because that fuels the purchase campaigns?
Olivia
Yep. We have a lot of customers testing into the right optimization goal for their specific business. Sometimes folks will try to go up funnel, but they'll go to add to cart and it won't actually achieve the objective of reaching more people; they're only expanding that universe a little bit. So that's where it's, test it for your own business. If your goal is really to maximize reach and fill the funnel, maybe add to cart is not high enough up.
Unnamed Host
I see.
Olivia
So a lot of folks, like the team at Caraway, have been testing add to carts versus landing page views versus video views, and I think they've learned a lot. This goes back to the point that you can't just set and forget; you really need to test into the right way to buy upper funnel.
Unnamed Host
Okay, there's always this debate of brand marketing versus performance marketing. This to me seems like the future of brand marketing: actually understanding incrementality insights and then leveraging them to decide where you spend next. Most of the alcohol companies that we work with, the large conglomerates, spend six months running a campaign and six months analyzing whether it worked. By the time they're halfway through analyzing, they have to start creative prep for the next campaign. Do you think the future of brand marketing is a brand marketer running campaigns and then looking at Haus for reporting?
Olivia
Yep. We work with Pernod Ricard, which is Jameson and Absolut. They actually just tested the Sphere for Jameson. Yeah.
Unnamed Host
And they tested that with Haus?
Olivia
Yeah, we have some insights coming there.
Unnamed Host
Okay, so how does that work? I know the inputs can obviously be online sales, but then you can upload a bunch of sales reports into Haus.
Olivia
Yep.
Unnamed Host
So like what did they, what were they uploading and how are they running the experiment?
Olivia
Yep. They just send us their sales data — their sell-in to the retailers.
Unnamed Host
Because it's more distributor data.
Olivia
Yeah, it's more distributor data, because it's sell-in data. We do have a bit of a delay — that's not immediately tied to ad exposure — so our results come through later. It's pretty typical for us to publish results a week after a test ends; with Pernod, we're probably looking at a month or so.
Unnamed Host
And is that because you're waiting for the second reorder from the distributor?
Olivia
Exactly.
Unnamed Host
To look at the time difference.
Olivia
Yep.
Unnamed Host
Wow.
Olivia
Yep. So yeah, we work with Pernod, and what's interesting is exactly what you said: the reason they work with us is because they need results quicker.
Unnamed Host
Right.
Olivia
They need, you know, results in weeks or months, not six months to a year. The future of brand marketing — that's a really good question. We talk about this all the time. My observation is that folks really only want to buy brand marketing insofar as it drives sales. Nobody really cares how the awareness metrics are changing; the finance people, the CFO — they need to see some impact to the bottom line. So I think the future is going to be observing maybe shallower KPIs, like site visits, in real time, but running longer experiments with that post-treatment window to understand how those leading indicators ultimately drive sales.
Unnamed Host
Yeah. Could you speak to — okay, so one thing I ranted about in last week's episode (assuming this episode comes out when it does) was how so many of our clients will complain that their CPAs are too high, or that they can't seem to get past a certain spend threshold in Facebook without seeing diminishing returns. And the biggest thing I've been saying is: we can do everything we possibly need to on the performance side — we can make the best creative, the best offers, test landing pages, formats of creative — but the brand has to be built outside of this performance marketing ecosystem. You know, I think of it as: the brand team puts pixie dust on the floor, and the performance team sweeps it up.
Olivia
Yep.
Unnamed Host
Would you say that's a valid statement? Or another way to think about this: you ever seen that movie Cars?
Olivia
Yeah.
Unnamed Host
When Lightning McQueen is paving the road with the thing behind him — I think of paving the road as brand marketing and the cars driving on it as performance marketing. The better you pave the road, the smoother the car is going to drive, and vice versa. Would you say that's a valid statement?
Olivia
Totally. Totally. We learned this at Quibi too, right? We could have the best performance marketing engine in the world, but if there's no brand and no brand equity, you're going to hit that wall you mentioned in terms of rising costs. I have a bit of a hot take on brand marketing. So many people talk about this on Twitter — but if you're trying to build a brand and maximize reach, why go into TV and all these other channels? Why not just focus on building an amazing brand in the channels that have the most scale, which is TikTok, Meta, YouTube? I think it's interesting that people write off those channels when they think about brand marketing, because they're thinking of it as needing to be completely different from performance.
Unnamed Host
Fully agreed.
Olivia
But that might also be the place where you build a brand too.
Unnamed Host
I couldn't agree more. I mean, two brands I was looking at yesterday that I think do a phenomenal job on platforms like that. Have you ever heard of Waterboy?
Olivia
No.
Unnamed Host
They're like an electrolyte powder. They've almost entirely built an eight-figure business without really running too many ads — most of their content is done organically. And Behave, the candy brand, has been around for a while but just recently has taken off on TikTok and TikTok Shop. And this notion that brand marketing needs to be hiring these big agencies and billboard campaigns and TV campaigns is, I think, so outdated, because all the platforms that are going to give you the most reach are free to use — you just have to figure out how to use them.
Olivia
Totally.
Unnamed Host
We also used to run tests when I worked at Hint. So Hint's where I got introduced to incrementality about seven years ago.
Olivia
Yeah, I was going to ask.
Unnamed Host
So at the time — this was pre Cambridge Analytica — Meta had all those amazing insights. You could upload a customer list and see everything about that customer list that Facebook knew. Then after that, when data started to get a little fuzzy, we added incrementality — plus we were scaling, adding more and more retail doors, and starting to really get big on Amazon. One thing we found was that for every dollar we spent online, we would make $3 across all the other channels: Amazon, retail, marketplace, et cetera. One, do you find that to be very common with consumable products, or just products in general? And two — the second part to that is, is that something that's easy to test? Actually, we'll just start with the first and I'll save the second part.
Olivia
Yeah. The question is, is it common to see that lift play out in Amazon and retail when you're investing in those channels?
Unnamed Host
Yeah. Like, is it easy — with or without Haus, or actually with both — to understand: all right, let's say I'm a supplement brand spending on Meta and getting a 1 ROAS or 0.8 ROAS. I feel like that's fairly normal for supplement brands, because they rely on that subscription revenue. Is it easy to understand: okay, I'm getting a 1 ROAS online, but this is the real ROAS, because you factor in what this spend is doing in Erewhon or Target or wherever it may be?
Olivia
Yeah — this is my biggest shift since joining Haus. I used to think that if you're advertising for DTC and you're clicking through to DTC, most of the effect will happen in DTC. And I was completely wrong. Consumers are going to buy where they want to buy, and they're very well trained — there's a behavior there that's very difficult to influence. I learned through many experiments that a lot of the effect is happening on Amazon and in retail, even if you're doing everything you can to try and drive users into the DTC shop. One interesting learning here — so that's consistent with what I've seen — is that the more view-based a channel (OTT and YouTube are my top examples), the more lift we see on Amazon and in retail. We're going to study this more, but compared to Meta or search, we see more of the lift play out in Amazon and retail for those view-based channels. I'm excited to study this. I've told some folks this lately and they're like, yeah, that makes sense. But again, it goes back to the point that on your day-to-day reporting, it looks like YouTube and connected TV can't work, or aren't working, when in reality they might just be driving a higher proportion of Amazon sales that you're not seeing in attribution.
Unnamed Host
Totally. This instantly reminded me of the world of branded content with publishers, where you go to them and they'll say, yeah, we're going to guarantee 500,000 visitors on this content — or 100,000 visitors — but we're going to charge you $65,000 to write about your product being in Sephora. Did you ever run any experiments around branded content? Or have you seen experiments around branded content? 'Cause I'm curious whether YouTube and OTT can do the same thing as branded content where they're charging $65,000. I've always had this hunch that branded content is just a waste, because I always see the same publishers that sell you 100,000 visitors running shitty traffic on Taboola to get those visitors.
Olivia
I bucket branded content in with affiliate. It's like one flavor: we have the coupon sites, then we have cash back, like Capital One, and then branded content — kind of advertorial on websites like GQ — is the third bucket. Super hard to geotest, and also just hard to test in general, because that stuff lives in perpetuity. You can't just take down the GQ article.
Unnamed Host
Right.
Olivia
So I don't have a lot of data here. I share your skepticism.
Unnamed Host
Yeah. So part two of that question I was going to ask: you can test the impact of media in other channels — can you do the same thing for organic social? Like, could Waterboy, or — you know, who else is big on TikTok? Like Smackin'. I don't know if you've seen Smackin' Sunflower Seeds, or any of these others. Is it easy to test organic impact — organic incremental impact — or would that just be that third way of testing, where you just turn it on or off?
Olivia
Yep. That's the fallback: a time-series on/off. Yeah, you can't geo-segment organic.
Unnamed Host
Right.
Olivia
So yeah, that's a little bit less precise. But we certainly do on/off kinds of bursts there — if you have a new strategy and you really want to go hard into UGC and see how that's working for organic, we do a burst where they post a bunch of UGC content over the course of a few weeks and we observe what's happening.
Unnamed Host
Interesting. Have you run any tests where brands have seeded a bunch of creators or influencers content as kind of that burst and seeing what the impact is?
Olivia
Yep. What we do is they'll put money behind it — put some paid behind it — and we can actually geotest that media.
Unnamed Host
Oh, interesting.
Olivia
So if you want to see what amplifying that seeding looks like, we can absolutely test that with a holdout. We do that on TikTok.
Unnamed Host
And do you see there's a higher impact of creator content versus brand content?
Olivia
Oof. That's a good question. Our influencer testing has been promising.
Unnamed Host
Yeah.
Olivia
Whereas, you know, the affiliate space — very mixed results. The smaller amount of influencer testing we have done has been encouraging.
Unnamed Host
Yeah. Interesting, huh?
Olivia
It's such a difficult area to test, and there's probably so much opportunity here. But I would say, don't shy away just because you can't design a perfect experiment.
Unnamed Host
Right.
Olivia
Like, don't let perfect be the enemy of good. If you want to just do that burst method — where you go out with a bunch of different types of branded content or seeding plays — we can just observe, run a time-series on/off test, and see what's happening.
Unnamed Host
Okay. One phrase that, over the last seven years, either escapes my brain or is in my brain — very rarely does it escape now — is diminishing returns. Yeah. For the first three years after I learned it, I felt so smart saying it. Now it's a common term. But can you just explain what diminishing returns is?
Olivia
Yeah — I'm so glad you brought this up. This is my favorite experiment that we run at Haus, and it's all about marginality. I think Ben from True Classic talked about this: it's not about your average CPIA, or cost per incremental acquisition — it's about your marginal CPIA. Where is your next dollar best spent? Diminishing marginal returns helps you understand where you are on that curve before costs start rising and you're no longer growing at a clip that's efficient for you. To make this more real with an experiment: we'll run a three-cell test. One third of the country gets your business-as-usual (BAU) spend pressure, one third gets BAU plus 25%, and the final third gets BAU plus 50%. At those different levels of spend pressure, we're able to draw your curve and show you how much incremental revenue — or how many incremental new customers — you're getting with that extra 25% or 50% of spend. Then you can do the math on whether your marginal CPIA means that next dollar is best spent in Meta, in YouTube, or in another channel. Those are really fascinating tests to run. FanDuel — we have a public case study about this — wanted to understand how much they should spend on YouTube, a huge channel for them, going into March Madness. They ran low, medium, and high spend pressure in different cells and figured out that between medium and high they were fully maxed out: no additional users, just added cost. So they were able to calibrate the spend level to make sure they weren't overspending relative to what they were getting back.
Unnamed Host
And I know this isn't true, but when I think of what you're saying — dividing it into thirds — I think of, like, the West Coast, the middle of America, and the East Coast. How do you actually divide those?
Olivia
No, it would be the opposite. We would have the right representation of West Coast markets in all three cells, such that all three cells are statistically indistinguishable from one another. So the behavior of the regions in cell one is exactly the same as in cell two and cell three.
Unnamed Host
Got it. Amazing. Okay. I feel like this was a great episode.
Olivia
Yeah.
Unnamed Host
Anything else we should add in there for the people?
Olivia
No, no. I hope this was helpful. Good luck with your Q4.
Unnamed Host
Yep.
Olivia
And make sure you're looking at delayed lift.
Unnamed Host
Yeah. Amazing. Well, thank you for coming on, Olivia.
Olivia
Thanks, Nick.
Nick Sharma
Thanks for listening. We'll be back next time to cut through the noise on CPG, retail, and e-commerce. If you enjoyed this episode, why not share it with a friend? And be sure to subscribe wherever you listen so you don't miss the next one.
Limited Supply: S9 E9 – Incrementality, Data, the Funnel, Oh My! (with Olivia Kory of Haus)
Host: Nik Sharma
Guest: Olivia Kory, Haus
Release Date: September 18, 2024
In this episode, host Nik Sharma delves deep into the nuances of incrementality testing in the Direct-to-Consumer (DTC) marketing landscape, joined by Olivia Kory from Haus. The conversation primarily revolves around the shift from traditional attribution models to more sophisticated incrementality testing to maximize growth and marketing efficiency.
Olivia Kory opens the discussion by likening incrementality testing to randomized control trials in healthcare:
“Think of randomized control trials in healthcare… that's incrementality testing. We have a counterfactual to understand what would have happened anyway in the absence of this intervention.”
[00:00]
Olivia provides a comprehensive explanation of incrementality testing compared to Multi-Touch Attribution (MTA) and Marketing Mix Modeling (MMM).
Incrementality Testing: Establishes a causal relationship by comparing a treatment group with a control group to determine the true impact of a marketing intervention.
MTA and MMM: Primarily correlational, analyzing patterns and relationships without establishing causation.
Olivia emphasizes the importance of creating a counterfactual scenario to measure what would have occurred without the marketing effort:
“MTA and MMM, they are more correlational, but they’re not causal. With incrementality, we’re looking for what’s actually driving the business.”
[07:20]
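The counterfactual logic described here reduces to simple arithmetic once a test finishes: lift is what happened in the treatment group minus what the control group says would have happened anyway. A minimal sketch in Python — all numbers are invented for illustration, not from the episode:

```python
# Hypothetical geo-holdout readout: a treatment group of regions that saw
# the ads, and a statistically similar control group that did not.
treatment_conversions = 12_400   # conversions in exposed regions
control_conversions = 10_100     # the counterfactual: what would have happened anyway
spend = 250_000.0                # ad spend in the treatment regions

# Incremental conversions = observed outcome minus the counterfactual.
incremental = treatment_conversions - control_conversions
cpia = spend / incremental       # cost per incremental acquisition (CPIA)

print(f"Incremental conversions: {incremental}")
print(f"CPIA: ${cpia:.2f}")
```

The point of the subtraction is exactly the RCT analogy from the opening quote: attribution would credit the channel with all 12,400 conversions, while the holdout reveals only 2,300 were actually caused by the spend.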
Olivia shares her extensive background in paid media marketing, highlighting her experiences at companies including Netflix, Sonos, and Quibi.
Nik introduces Haus as a pivotal platform for conducting incrementality testing, enabling brands to configure regional test-and-control experiments, identify points of diminishing returns, and allocate marketing dollars optimally.
Olivia supports this by detailing how Haus's geotesting methodology offers consistency and precision:
“With geotesting, you’re running the exact same methodology standardized across every channel you test… it’s resilient to all the privacy changes.”
[13:07]
Olivia breaks down the three primary types of incrementality tests:
Geographical (GEO) Experiments:
“All three cells are statistically indistinguishable from one another. So, the behavior of the regions in cell one are exactly the same as cell two and cell three.”
[43:48]
User-Level Studies:
“Once you start expanding outside of those two channels… that’s when you go up funnel, you start spending in more channels… you really need geotesting.”
[17:53]
Natural Experiments (On/Off Testing):
“If you want to just do that burst method where you go out with a bunch of different types of branded content… we can run a time series test on off and see what’s happening.”
[41:17]
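The "statistically indistinguishable" cells in the geo design above can be illustrated with a toy stratified assignment. This is a hypothetical sketch — the market names and baseline sales index are invented, and Haus's actual matching method is not described in the episode; production systems typically solve this with an optimizer rather than the simple snake draft shown here:

```python
# Invented markets with a baseline sales index. Ranking markets by baseline
# and dealing them out in snake order (0,1,2,2,1,0,...) gives each cell a
# similar mix of large and small markets, so the cells behave alike.
markets = {
    "Los Angeles": 900, "Chicago": 700, "Dallas": 650,
    "Miami": 550, "Atlanta": 500, "Boston": 450,
    "Seattle": 400, "Phoenix": 350, "Denver": 300,
}
n_cells = 3

ranked = sorted(markets, key=markets.get, reverse=True)
order = list(range(n_cells)) + list(range(n_cells - 1, -1, -1))  # snake: 0,1,2,2,1,0
cells = [[] for _ in range(n_cells)]
for i, market in enumerate(ranked):
    cells[order[i % len(order)]].append(market)

for i, cell in enumerate(cells):
    print(f"cell {i}: {cell} (baseline total {sum(markets[m] for m in cell)})")
```

With balanced cells like these, any post-launch divergence between cells can be read as the effect of the differing spend pressure rather than pre-existing regional differences.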
Olivia shares real-world applications and learnings from her tenure at various companies:
Netflix: Discovered that channels like brand search and affiliate contributed minimal incremental lift despite appearing strong in last-click attribution.
Sonos: Identified that while certain channels didn't drive direct incremental revenue, they shifted customers from retail to DTC, enhancing overall profit margins.
“Even though on last click attribution, it looks like your best channel, incrementality tests showed no impact.”
[09:17]
Q4 Case Study: Analyzed immediate versus delayed lift across various campaigns, finding that delayed lift often exceeded immediate lift, especially for higher price-point products.
“Delayed lift exceeded immediate lift in 73% of the experiments we ran in Q4.”
[26:03]
Video Campaigns: Discovered that video campaigns like YouTube and OTT drove significantly more delayed lift compared to search-based campaigns.
“Video campaigns drove 286% more delayed lift than search campaigns.”
[26:54]
A key highlight of the episode is the exploration of diminishing returns in marketing spend. Olivia explains how understanding the marginal cost per incremental acquisition (CPIA) can guide optimal budget allocation.
“Diminishing marginal returns help you understand where you are on that curve before you start increasing the cost, before you are no longer growing at the clip that is efficient for you.”
[41:49]
She illustrates this with an experiment design involving varying spend levels across different regions to identify the optimal spending point:
“We ran low, medium, and high spend pressure in different cells and found that between medium and high, we were fully maxed out.”
[42:00]
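The three-cell design can be turned into a marginal-CPIA readout with a few lines of arithmetic. All numbers below are invented for illustration; the point is that the cost of each *additional* customer rises as spend pressure increases — that rising marginal cost is the diminishing-returns curve:

```python
# Hypothetical results from a three-cell spend-pressure test:
# business-as-usual (BAU), BAU +25%, and BAU +50%.
cells = [
    {"label": "BAU",      "spend": 100_000, "new_customers": 2_000},
    {"label": "BAU +25%", "spend": 125_000, "new_customers": 2_400},
    {"label": "BAU +50%", "spend": 150_000, "new_customers": 2_550},
]

# Marginal CPIA between adjacent cells: extra spend / extra customers gained.
marginal_cpia = []
for lo, hi in zip(cells, cells[1:]):
    extra_spend = hi["spend"] - lo["spend"]
    extra_customers = hi["new_customers"] - lo["new_customers"]
    marginal_cpia.append(extra_spend / extra_customers)
    print(f"{lo['label']} -> {hi['label']}: "
          f"${marginal_cpia[-1]:,.2f} per incremental customer")
```

Here the same $25,000 increment buys 400 extra customers at the first step but only 150 at the second, so the marginal CPIA more than doubles — the FanDuel-style signal that the channel is approaching saturation and the next dollar may be better spent elsewhere.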
Olivia posits that the future of brand marketing lies in integrating incrementality insights to drive sales effectively. She challenges the traditional notion that brand marketing must rely solely on big agencies and expansive campaigns, advocating for leveraging platforms like TikTok, Meta, and YouTube for scalable brand building.
“People are really only buying brand marketing insofar as it drives sales… the future is going to be observing maybe shallower KPIs like site visits in real time, but running longer experiments with that post-treatment window to understand ultimately how those leading indicators are driving sales.”
[30:25]
Nik and Olivia discuss the symbiotic relationship between brand and performance marketing. Olivia underscores the necessity of building brand equity to support scalable performance marketing efforts, referencing lessons learned from Quibi and other ventures.
“At Quibi, we could have the best performance marketing engine in the world, but if there’s no brand and there’s no brand equity, you’re going to hit that wall in terms of rising costs.”
[33:28]
The conversation also touches upon testing organic social strategies and influencer content. Olivia shares insights on handling organic bursts and seeding creator content, emphasizing the need for flexible testing methodologies when dealing with non-traditional marketing channels.
“If you want to run a burst where you post a bunch of UGC content over a few weeks, we can observe and run a time series test on off and see what’s happening.”
[39:55]
As the episode winds down, Olivia reiterates the importance of delayed lift and encourages listeners to adopt incrementality testing to uncover hidden efficiencies and opportunities within their marketing strategies.
“Make sure you’re looking at delayed lift.”
[44:21]
Nik wraps up by highlighting the critical takeaways about moving beyond traditional attribution models to embrace a more scientifically rigorous approach to marketing measurement.
This episode of Limited Supply serves as an invaluable resource for DTC brands aiming to refine their marketing strategies through advanced incrementality testing, ensuring every marketing dollar is effectively driving growth and enhancing profitability.