
Loading summary
A
The CMO Confidential Podcast is a proud member of the I Hear Everything Podcast network. Looking to launch or scale your podcast? I Hear Everything delivers podcast production, growth,
B
and monetization solutions that transform your words into profit.
A
Ready to give your brand a voice? Then visit iheareverything.com. Welcome to CMO Confidential.
B
The podcast that takes you inside the drama, decisions and choices that go with being the head of marketing. Hosted by five time CMO Mike Linton.
A
Welcome marketers, advertisers and those who love them to Chief Marketing Officer Confidential. CMO Confidential is a program that takes you inside the drama, the decisions and the politics that go with being the head of marketing at any company, in what is one of the most scrutinized jobs in the executive suite. I'm Mike Linton, the former Chief Marketing Officer of Best Buy, eBay, Farmers Insurance and Ancestry.com, here today with my guest Alex Schultz. Today's topic: marketing at Meta, the view from the eye of the storm. Now Alex is the CMO and VP of Analytics at Meta, a position he has held for five years. Prior to that, he had various positions in growth marketing at Meta, and he also founded Paper Planes, a UK-based website, nearly 30 years ago. We'll talk about that later. He is also on the board of Lindblad Expeditions. And full disclosure, we worked together at eBay a number of years ago. Welcome Alex, and it's great to see you.
B
Thank you. It's nice to see you too. And thanks for the support in my early days at Meta.
A
Yes. All right, so let's talk about this role at Meta. What exactly is the CMO VP of Analytics responsible for and how is your group structured?
B
Yeah, I mean, the way I like to think about things is it's kind of like what I used to do with my little websites: you want to get people to visit the websites, and you want to count how many people visited the websites and what they did on them. And that's basically what I do for Meta, and that used to be my hobby. So I have all of the marketing teams globally. So product marketing, experiential marketing, business, consumer, all of our brands. So we have a centralized marketing team, similar to Apple or Microsoft or Google. And then I have about a third of the analytics team report to me directly. So those are the analytics teams that cover, you know, our ads products, our infrastructure, community integrity. So anything run by the COO reports to me directly. The rest of analytics is decentralized, and I actually decentralized it completely when I took over as head of analytics. So there is a head of analytics for Reality Labs, who's awesome. There's also a head of analytics for Facebook app, WhatsApp, Instagram, and there's a head of analytics for Meta Superintelligence Labs. So I act kind of as fairy godmother for those guys. And then the rest of my org is whatever my boss currently doesn't want to do. So currently I'm also running user research, I'm running competitive intelligence, and as of a month ago I run translations, which I've done for the last 14 years. So, like, I'm kind of the COO to the COO in some ways.
A
I want to talk about the decentralization, just so our listeners understand. How did you decide to decentralize, and then how do you manage, as a, you know, overseer of that, and compensate that group to make sure that the analytics are all on the same page?
B
Yeah, I think, like, it's really interesting. For marketing, I felt we needed to centralize it because there are a lot of correct ways to do marketing, but if you do multiple different correct ways, they'll cancel each other out. So there are loads of different positions you can take for any given company, but once you've decided the position, everyone has to pull toward the same position, you know, otherwise it won't work. For analytics, I felt it's the reverse. Right? Data is generally truth, and you can scrutinize each other's work well, and there's usually only one correct answer in the data for any question. And so for that, I felt like actually peer review and being able to look at each other's work meant that we would be able to be correct. And the decentralization meant that the leaders the analytics teams were decentralized to would have to care more about those teams, because they reported to them, because they'd be managing them. And it's worked out that way, actually. Like, the analytics teams are being supported by their leaders, and their leaders are really thinking about what they have to say and believe in it, and we're keeping it factually correct because I act as a bit of a weight in the center to make sure that we have that factual correctness. The one thing I'd say is just don't decentralize analytics too deeply. So you heard me say there's a head of analytics for WhatsApp, there's a head of analytics for Reality Labs. It's not the subdivisions of WhatsApp or Reality Labs that have heads of analytics; they're still aggregated up at the product group or business unit level, not the subdivision level, because at that point I think you start to get the analytics team captured by the local director or whatever, and they're not a strong enough, senior enough analytics leader to push back when people are critical of what the data says.
A
I hear you saying, and I've done similar stuff, that the marketing analytics is really the company interfacing with its customers, while the operating analytics, which is what I hear you describing in WhatsApp and everything else, belongs to business units that have specific analytics needs that can't draft as easily off the company analytics, and so they need their own kind of player or set of groups. Is that a fair way to look at it?
B
Yeah, 100%. Although you could do it centrally. In most companies, analytics exists in this superposition of states where it could be decentralized or it could be centralized. And people centralize it when they're not getting the right answers, because people are grading their own work. And, you know, they decentralize it when people feel they're not getting the support they need from the central team. And so we managed to decentralize it, like, a decade ago and keep it in that state. So you're right, it can draft off the product team.
A
How did you get... you know, the other thing that happens, particularly as you acquire companies, is that sometimes there's not one set of analytical truth, because there's not one agreed-upon set of data. How did you get all the data in the same place, or get everybody to agree, yeah, this is the data we're going to use? You can't just be pulling it from here or there.
B
Yeah, I mean, I think so. I've run into this a lot with companies and people I've advised and worked with and so on. The fact of the matter is there is usually one set of truth: revenue is revenue, free cash flow is free cash flow, EBITDA is EBITDA. You know, there's agreement.
A
It's hard to argue with those. Yeah, right.
B
But also, active users are active users. Like monthly active users: did they log in or did they not log in? Daily active users is the same thing. Time spent, it's the same thing. So I think one of the problems, actually, is that you often get leaders who want to create their own custom vanity metric. So I believe, and look, this might be arrogant, that one of the reasons we've been able to get to a standardized set of metrics is that I can make that decision. Like, for example, Instagram really wants their monthly actives to be 28 days, and the industry standard is 30 days, and for the company we report 30 across the whole company, because I can make that decision. But look, the Instagram team looks at 28 days, and that's fine because there's not that much of a variance, so I don't care about it that much. There are slight differences in how you define viewport views versus impressions. So on Facebook, it's viewport views for a feed impression; on Instagram, it's a slightly different definition of impressions: what percent of pixels need to be on screen, how long do they need to be on screen, all of those kinds of details. So there's some variance. But fundamentally, the core of social media is: do they visit, how often do they visit, how much time do they spend, and how much content do they produce? You know, you can debate lots and lots of different things, but it's actually quite simple. And having me in the center as a pretty strong leader in the company means we debate that stuff less than other companies do, because I'm pretty well empowered as a head of analytics. And so I think a chunk of why you get this debate is that the CEO, board, or senior leadership team doesn't actually empower their head of analytics. And it really is a problem for the company, because then two teams will say monthly actives, two teams will say revenue, and it means different things.
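The 28-day versus 30-day monthly-actives distinction Alex describes comes down to a single window parameter in the metric definition. A minimal sketch (all user data and dates below are hypothetical, purely for illustration) of how that one knob changes the reported number:

```python
from datetime import date, timedelta

def monthly_actives(logins, as_of, window_days=30):
    """Count users with at least one login in the trailing window.

    `logins` maps a user id to a list of login dates; `window_days` is the
    definition knob (28-day vs 30-day monthly actives).
    """
    cutoff = as_of - timedelta(days=window_days)
    return sum(
        1
        for days in logins.values()
        if any(cutoff < d <= as_of for d in days)
    )

# Hypothetical login history for three users
logins = {
    "u1": [date(2024, 3, 1)],   # counts in a 30-day window, not a 28-day one
    "u2": [date(2024, 3, 28)],  # recent, counts in both windows
    "u3": [date(2024, 2, 25)],  # outside both windows
}
as_of = date(2024, 3, 30)

print(monthly_actives(logins, as_of, 30))  # 2
print(monthly_actives(logins, as_of, 28))  # 1
```

The point of the sketch is that both answers are "correct"; what matters, as Alex says, is that one definition is picked for company-wide reporting so two teams saying "monthly actives" mean the same thing.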
A
Right? Yeah. Well, that sounds like a really good foundational set of analytics. So let's flip over: what's it like to do marketing in a business that is always at the center of just about every piece of news in the world? You've got political debates, social debates. You're referenced in almost everything. Tell me what it's like to sit in the center of that, and how do you manage it?
B
Yeah, I mean, it's very stressful. And, you know, people have strong opinions about Meta. It is interesting to me how much people want to blame platforms versus people for what they say. And what I see all the time... it's funny, I was on a podcast a few months ago during the book launch, and the person had a set of very political questions they asked me, and I was like, okay, thanks to those questions, I can tell you your politics. Because, like, you can take the same point of view and look at it from different angles, and both sides have different points of view. Look, in most of the world it's good. So in most of the world we have a really strong brand and we're in a really great place. In the US, and amongst kind of elites in Europe, and throw in Australia and Canada, we're in a much tougher place, and it is stressful to be in the middle of that. But for me, like, I believe very deeply in what we're doing, and so I'm able to actually handle the stress because I believe in what we're doing, I believe in our responses. I feel pretty strongly that this is a good company run by good people who I love and trust. And so that enables me, at an emotional level, to manage through it, which is important, and then hopefully, at an emotional level, to manage my team through it, which is also important.
A
Because there's a lot of emotion around this kind of stuff every day.
B
Yeah. So I think the number one thing is, how do you manage your emotions and your team's emotions through it? And that comes from a place of really believing in what we do and feeling good about it. And then, from a practical point of view, look, the number one thing we can do is stay focused on our products and get past whatever the latest, usually politically motivated, thing is that's going on with us. And so where we can reduce the cycle length, that's the number one focus: let's not have long cycles, and let's get back to talking about product and innovation. That's when we do well. And so from a marketing and comms basis, it's a focus on: let's talk about product innovation, let's talk about people using our products, that always feels good, and let's try to minimize any of the other news cycles in terms of length.
A
Hey, for people that are not as experienced at this, which is probably everybody listening to this: do you have rules of the road to reduce cycle length, or response rules to keep this from exacerbating itself when you get in the middle of it, or not?
B
Well, that's really the comms team. I'm very closely partnered with the comms team, but in our company, comms doesn't report to marketing. You know, I think, number one, tell the truth is the most important thing. And holding your position on what you believe is the truth is actually really, really important. And beyond that, I think you'd really have to talk to the comms team.
A
Got it. Okay, so you're sitting in the center of all the AI stuff as well. How should people out there be thinking about the future of marketing in the era of AI? We have at least 5,000 CMOs listening to the show, so give some tips about how you see the AI revolution, how Meta is playing it, and the advice you would give to our listeners.
B
Yeah, look, I think from an AI revolution perspective, there are two things to look at in my role. One is what Meta is doing in terms of producing AI, using AI, and so on. And the other is, how are we using AI to make ourselves more productive and effective? On the second piece, I think that's where probably the most tips and tricks are for people who are also CMOs and heads of analytics. And the thing I think about for that is that it's very much a threshold technology for any given problem. And what I mean by that is, the AI, typically for any problem, has a precision: how right does it get it? And it has a recall: how many things can it get right? Basically, of all the things you could ask it, what percentage can it get right? And a human being is the same. You ask any given human being, for example: create me a really great ad for Oakley Vanguards for people running in the park. And the human being can go and take photos, or they can use CGI, or they can use Photoshop, or whatever, to create that person in the glasses. And my version of a really great ad will vary from some other CMO's in how I review it, but how correctly they get it, for me to say, yes, run that ad, is the precision. And then how many things I could ask them, like running, cycling, biking, golfing, how many of those they can do, is the recall. And so what happens is the AI just gets better and better and better at a given problem, and it marches up from 0% precision and recall until it gets to that human level of precision and recall. And the moment it beats the human level of precision and recall, that's when you start using the AI to do things versus using the human being on their own. Now, typically what's happening is it's a human using the AI. The AI doesn't just run off on its own and do it, right?
And so the reason I say it's a threshold technology is you just don't use it to do anything until it gets better than a human being doing it on their own. And then suddenly it feels crazy that you would ever have done it without AI being part of the solution. And as such, a very gradual move feels very sudden to people when it crosses that threshold. So, number one, expect it to be a threshold technology and just see how it's developing, because it will come on you suddenly, it'll appear suddenly, even though it's actually been a gradual improvement. Number two, AI doesn't work on its own right now. Creativity currently is not something the AI does very well. Also, you need to review the output of pretty much any AI because like, it hallucinates, it makes mistakes, it's not very self.
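The precision/recall threshold Alex walks through can be made concrete with a toy calculation (every number below is hypothetical): precision is how often the attempted work is acceptable, recall is how much of the possible work gets done right, and under his rule you switch workflows only once the AI-assisted approach beats the human baseline on both axes.

```python
def precision_recall(answered, correct, total_asks):
    """Precision: of the tasks attempted, what fraction were acceptable.
    Recall: of everything you could ask for, what fraction came back right."""
    precision = correct / answered if answered else 0.0
    recall = correct / total_asks
    return precision, recall

def crosses_threshold(ai_pr, human_pr):
    """The threshold rule from the conversation: adopt the AI-assisted
    workflow only once it beats the human baseline on both axes."""
    return ai_pr[0] >= human_pr[0] and ai_pr[1] >= human_pr[1]

# Hypothetical numbers: 100 possible ad briefs (running, cycling, golfing...)
human = precision_recall(answered=60, correct=54, total_asks=100)       # (0.90, 0.54)
ai_early = precision_recall(answered=80, correct=40, total_asks=100)    # (0.50, 0.40)
ai_now = precision_recall(answered=95, correct=88, total_asks=100)      # (~0.93, 0.88)

print(crosses_threshold(ai_early, human))  # False: still below the human baseline
print(crosses_threshold(ai_now, human))    # True: crossed it, so the switch feels sudden
```

The gradual climb from the first call to the second is why, as he says, a slow improvement feels like an overnight change the moment the threshold is crossed.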
A
Sometimes it's wrong too. Sometimes it gets it wrong.
B
Yeah. The other day I asked our analytics agent how many people had more than 5 gigabytes of storage backups for WhatsApp on iPhone in the US and it told me 90 billion.
A
Wow, incorrect.
B
But it did actually say, "and I got this wrong," which I thought was really good, that it said it got it wrong, because that's clever. But you need a human in the loop. Like, you need a human in the loop both for creativity and taste, and also for review. And so the second thing is, it's Ginni Rometty's quote: AI isn't going to take your job, someone using AI is going to take your job. And so the second piece is, we're looking at it from a, like, okay, who is the best person at using AI on our team to do these jobs? And the places where it's working really well are analytics. This analytics agent using Claude Code is just amazing, and it's got me writing queries again for the first time in five years. It is working well at scaling creatives, making our videos and changing backgrounds, just like CGI and Photoshop used to do; it just does it better. Those two areas, it's really showing promise. And then obviously the summarization, composition, you know the joke of, like, turn these five bullet points into an email, turn this email into five bullet points, all of that stuff. So that's where we're seeing it working. And it is scaling people and making people more productive, you know. And then for us, yeah, I'm really excited about next year. Like, we've had a bad year in '25. We had an amazing year in '24. I'm pretty hopeful we'll have a great year again next year. But that's for all of you to see and judge.
A
There you go. Hey, let's go back to the question, which is: you have a huge team, probably using a lot of AI. If you're out there and you do not have a lot of AI usage, or you need help with AI or a consultant, what advice would you give somebody to tell whether these people, or this candidate, are good at AI or not?
B
Yeah, I mean, for candidates versus an agency or whatever... for candidates, we're seeing a barbell distribution, basically. So right now, people coming out of college who've used AI for the last three years, basically the people graduating now, ChatGPT happened in their first year. So the people coming out of college right now, who have used it natively in every space the whole way through college, they're coming in with a totally different way of working, totally different expectations, and they are really doing amazingly. Think about it: people who came in mobile native, people who came in digital native, at each generation it's the same thing. So the first thing I'd say is, if you're hiring people out of college, it's important to get them to tell you what they did with AI, and tell you in detail what they did with AI. But if you get people who have been doing stuff, that's going to be amazing for you. The second place we're seeing it really work, so, barbell distribution, is also very senior, experienced people. So, like, individual-contributor engineers who are VP level, who are the top kernel experts in Linux. Those folks are just such experts in their field, they know the right questions to ask the AI. And we're watching lots of those folks adopt things like Claude Code or Gemini or whatever for coding, and they're just absolutely doubling their productivity if they are those very top-end people. So first, I'd look for that, and I would expect that pattern to hold. The people in the middle of their careers, who are super, super busy, who are not the world experts in their space and also haven't natively used AI because they haven't had time for the last three years, they aren't as good on average as the people at those ends of the barbell. There are exceptions to every rule, right? And then the last thing I'd say is, so, look at those two different classes.
And then the last thing I'd say is this question I asked of like ask them what they did with AI, like really ask them what they did with it and then say, you know, the Five whys. Comment? Yeah, yeah.
A
How?
B
Tell me more details. Oh, my God. What's a really good prompt for that? How did you deal with hallucinations? Like, actually, you know, ask a bunch of those questions to probe whether they're just saying that they used AI or they can throw out examples of like, oh, yeah, no, it hallucinates and it did this crazy thing. And what I actually did was I put it in a loop or, you know, blah, blah, blah.
A
I agree with this. Well, I'm teaching college now, and one of the things you have to think through is I let them use as much AI as possible, but you have to give them homework that doesn't have an answer. If it has an answer, they will just get to the answer. So you have to leave it unstructured. And then they can use AI to get to choices or to get to stuff, but they actually have to think on top of the answer. And I think as a professor, it's actually hard to think through the homework and get it exactly right. And I don't know how you would teach certain things in, like, accounting or finance, where there is an answer.
B
Well, should you? Or should you just know there are tools now? Like, we don't ask people to write out spreadsheets by hand. We don't ask people to be a typesetter before they can do desktop publishing anymore. Like, some things are obsolete, and you should expect that the person will use one of these tools in parallel. I mean, as someone who studied physics, everything but thinking through the proof could be done better by a computer.
A
Exactly. But you got to figure out, can you actually think or, you know, you have to push on the thinking front without the grunt front, I guess.
B
And so the way they did it is they pushed on the thinking front in the exams, and then for all the coursework and classwork, you were just expected to be really good at using computers. They just expected you would. It wasn't like, oh, you're going to figure out the gravitational pull of the Earth in this particular location on paper. No, you're actually supposed to be able to write the damn computer program.
A
There you go. Yeah, I was doing that gravitational pull thing the other day just for fun. Hey, so what things should marketers be doing right now to prepare, not for '26, but even for '27 and beyond?
B
Yeah, I mean, this is the right question. This is the exact right question. Because I think too many people are preparing for where the world is right now.
A
Yeah.
B
And with AI in particular being this threshold technology that crosses thresholds of capability, that is hill-climbing all these different areas, if you don't skate to where the puck is going, you are going to be really left behind. So it's the most important question for everyone to ask. I think, you know, the number one thing you need to be doing is get in the flow of information, because how can you know where the puck is going if you don't actually have the information coming in about the decisions people are making, where people are moving towards, and so on and so forth? So, you know, follow the AI leaders, whether it's Alex Wang on our side, whether it's Sam, whether it's Karpathy, whether it's other accounts like Roon or whatever, but, like, actually get in there and follow the people posting.
A
When you say the information, you're talking about basically the progress of AI around the world. That's what you mean by that?
B
Get in the... yeah. Look, I mean, you can, and you should, definitely be adopting tools internally, using them, drilling yourself. Like, I don't know, I'm not adding anything by telling you to do those things. You're intelligent listeners; you know that. But, like, I actually think the thing lots of people are failing to do is get in the information flow, even internally. You know, as it happens, Alex Wang, who's joined us to lead Meta Superintelligence Labs, I invested in Scale AI, like, I don't know, seven years ago, personally. And, you know, when he came on board, conversations that I'd been having around the Thanksgiving table, or hanging out with friends or whatever, you know, gays of Silicon Valley... yeah, I know a lot of people in that set, right, like Sam and others like that... suddenly became conversations we were having at work that we hadn't actually been having in the rooms I was in. And that was a really interesting and enlightening thing for me, because stuff that I thought was very standard, actually a lot of the folks I worked with didn't know. And so that really opened my eyes. And I did this with my leadership team, I did this with the comms leadership team. And there were things like, you know, whether you're talking about AI 2027, which is obviously a max-doom scenario, or whether you're talking about Machines of Loving Grace from Dario, or you're talking about Sam's Gentle Singularity essay, you know, all of these different things. There are certain texts that are just worth reading, that everyone talks about and everyone reads.
A
I think there's a really big point you're making here, which is that AI is not just the tools. It's the thinking about how the tools are used, and, "philosophy" might be the wrong word, but the philosophy behind the tools.
B
Yeah.
A
And how people are.
B
Yeah, but what's going to be built next? That's the question you're asking: what's the future? And you can't answer it if you're not in the flow of the literature. And by the way, literature does actually mean literature. It means things like Iain Banks. It means things like Heinlein and The Moon Is a Harsh Mistress. So it means books, as well as these essays, as well as what people are tweeting about. You'll get the feel of the philosophy, exactly as you say, and the direction that this industry is going. And, you know, there are maybe 100, 200 people who matter in this industry, and that's it, and I'm not one of them. And if you follow them, you can really learn where they're going, and therefore you can predict where the technology is going.
A
I have not heard of The Moon Is a Harsh Mistress, but I just like the title, so I'm definitely getting that as soon as we're off. Hey, let's talk about... you know, you're sitting in the center of a lot of B2B interactions, too. We have a huge number of B2B listeners. Give us some thoughts on how the B2B market is evolving and what you see coming down the road.
B
Yeah, I mean, this is another one where the book, Click Here, is actually covering it pretty heavily, actually, because, like, well, you.
A
Can open and tell us a little bit about the book, and then we'll go to the question if you want.
B
No, I mean, it's more that the tools are changing a lot, but the principles aren't changing.
A
Okay.
B
Which is one of the absolute core things for me in the book. So, you know, what's happening with B2B marketing? I mean, a lot of it's staying the same. So you need to promote to people in-product, and you need to actually say to them, hey, use my product for this, please use it for this. So whether you're Google, whether you're us, but bluntly, whether you're Booking.com talking to their hotels or Expedia talking to their hotels, like, you need to get people to adopt the new tools. What are the new tools? They're AI. They're agentic AI. They're getting people to communicate over messaging with their clients, their customers. But not just use WhatsApp, use WhatsApp plus AI agents on top of it. You know, not just use Salesforce, but use Salesforce plus digital sales reps on top of it. But even that's very similar to the past, because what's messaging apart from, like, a faster form of database marketing?
A
Yes, but here's the thing. We've had a bunch of guests on, like John Miller and some other folks, you know, who have said, look, the old marketing funnel in B2B is dead. Now it's all about product use, customer service and relationships. Do you buy that?
B
No.
A
Tell me, tell me what you're thinking.
B
I mean, the old funnel isn't there if nobody even knows you exist. So if awareness isn't there?
A
Oh, well, you have to have awareness. Yes. Yeah, yeah, right.
B
You know, and then you've got to actually manage them fully through the funnel, so they've actually got some feelings that make them want to use your thing, and then you need a button. And, like, I just feel everyone reinvents the wheel and says, oh, the funnel's dead. What is it you need to do, how did you put it there?
A
You said customer experience, customer relationships. Not marketing qualified leads.
B
Does this sound like nurturing leads to you? If you're giving people experience and you're building the relationship, doesn't that sound like pretty standard lead nurturing in the middle of the funnel?
A
On CMO Confidential, we try to have all points of view, so we will have a book and a show for this. I want to flip over to the transition you led, I think, from Facebook to Meta, from a marketing standpoint. Just give us the inside story on.
B
That logo right there.
A
Yeah, I see it. It's very tastefully placed.
B
Three dimensions, by the way. Yes. I have logos everywhere. They're always there.
A
Nice.
B
Very nice. Actually, the guys who did this, Zach Steubenvall, is at OpenAI now, and Andrew Sturk is at Anthropic, and they did this logo. Yeah, I mean, the rebrand was a really exciting experience. You know, I came in as CMO and I said, look, it's really difficult to be the company Facebook and the app Facebook. And I know there are companies like that, like Estée Lauder or whatever, but the magnitude of Facebook as an industry-defining thing... like, you say social media, the first thing you think is Facebook.
A
Yeah, you do.
B
And it's so industry-defining that it was overwhelming the rest of the products. Like, Facebook Inc. was not getting any credit for building WhatsApp into what it's become. Facebook Inc. was not getting any credit for Instagram. Yeah. And certainly nothing for the innovation that was happening with Reality Labs, whether it's the glasses or whether it's the headsets. Like, we weren't getting any.
A
You are doing great on product placement, by the way. I mean, the way you have brought all this stuff out physically, it's. Yeah. You might as well hold up the book. There you go. Excellent. Well done. Well done, Alex.
B
But no, like, I think the corporate brand was blocking people from understanding we were doing anything else. And also, it was hugely confusing. So, like, when we'd do a privacy update for WhatsApp, and we said Facebook Inc., blah, blah, blah, people saw the word Facebook and they were like, well, I don't want Facebook involved in my WhatsApp. Reasonably. And it was Facebook Inc. And so, with the confusion, even on podcasts or in the press, when you were talking about Facebook, our partners, journalists, others wouldn't know if we were talking about the company or the app a lot of the time. And it was really confusing. So for me, I felt we needed a distinction between the corporation and the app. I was thinking something more like PepsiCo or BMW Group. I wasn't going the whole way to a full rebrand, so I was like, FB Inc. or... you know what I mean?
A
Yeah, A little holding company thing. Totally. Yeah.
B
Alphabet. And Mark was like, no, I actually... I really agree, and I want to do this. He actually... I was in the shower when I got the call, and my phone was just the other side of the glass. And I saw who was calling, and I was like... So I jumped out of the shower, and he literally was like, how quickly can you rebrand, and can you do it in seven months? Because we were going to do this financial segmentation of Reality Labs.
A
Oh, all right. Yeah. So driven by reporting. Yeah. Nice. Okay.
B
Yeah. And I said yes before talking to anyone on my team. They thought I was insane. But, yeah, so it was. It was really driven by that. Get rid of this confusion between Facebook Corporation and Facebook app and allow the news of all the other stuff we're doing to be actually attributed to the corporation. And that has worked really well.
A
Well, thank you for sharing that story. I want to take a stroll further down memory lane and talk about eBay. You were at eBay. We were at eBay together, well over 20 years ago, I think. Tell me about lessons learned there, things you've taken away, and anything else you want to share.
B
eBay was just an incredible, in my opinion, university for me. I think there are probably as many stories from eBay in the book as there are from Facebook. That's how important it is to me, twenty years on. The number one thing that eBay taught me was incrementality measurement in marketing. eBay was really good at this, whether you're talking TV, doing DMA-based segmentation and regional analysis of the impact of television, which is one of the first measurement things I worked on at eBay, or you're talking about online marketing, direct response, paid search. eBay was very smart about actually testing paid search by turning it on and off, and not just trusting that the last-click data was correct.
A
eBay was at the forefront of all of that. Yeah.
B
And honestly, all of those principles that eBay taught me are valid today when you buy social media. I mean, search hasn't changed. Search is still last-click-wins in terms of how it's measured. And obviously, I do like my colleagues at Google, and I wouldn't have my career without Google. But it is in their favor to optimize for last click winning.
A
That's not just in their favor, it's completely in their favor. It's like you start the match two-nil up.
B
Yeah, absolutely. So, I mean, look, more credit to them. They're doing exactly the right thing strategically. But eBay taught me about incrementality measurement. Do you remember when we turned off all of paid search?
A
Yes, I do.
B
Right. And what happened was the numbers moved differently from what we expected based on our post-click tracking. We had to turn them off because Google was launching Google Checkout, and they had put on buses to take all the people away from Meg Whitman's speech at eBay Live in Boston.
A
I was there. That was like the Tea Party incident. That was so insane.
B
And so we retaliated by turning off our ads.
A
Yeah.
B
And we were such a big bidder on Google. What was crazy is Google learned from this too, because we were a low-price bidder on basically every keyword at Google, and Google saw their revenues drop more than they expected on losing us, because we dropped out of all these auctions. And we saw that Google was actually more impactful than we thought it was, because the numbers went down more than we predicted. And that taught us about incrementality. eBay taught me so many things: brilliant people, talent, organizational structure, rhythm of business, communications, dashboarding. But if I was to take one thing away, it's that eBay taught me how to do incrementality measurement in large-scale online marketing. And that is one of the two core pillars of my book.
A
We'll talk about that in a second. But that was also when frenemies fight. I mean, it's such a huge relationship, and to have that fight break out at an eBay Live event was quite something.
B
Also, all the people involved in paid search on both sides had nothing to do with the fight.
A
Yeah, right.
B
They were just like, we just want to sell you ads and we want to buy ads, but like the rest of our companies have just gone to war.
A
Yes. Any other lessons from eBay? Because eBay had, at the time, a hugely dominant position in tech, and it's moved from that position now. When you look back on it, any observations?
B
I mean, there are so many. Two other lessons I'll throw out. One: right after I joined eBay, the stock dropped. I don't know if it fell by half or by a third, but right when I joined, eBay's stock fell off a cliff. And with the stock falling off a cliff, a bunch of people left, and it was a challenging time. The people who made it through, who stayed through the stock drop and saw eBay through some of the turnaround to get it back onto a good path, were some of the best people I ever worked with. And some of the people who saw the stock drop and hit the doors were not as effective. So it really did say to me: look for people who have been at a company and gone through a dip. If they've gone through a dip, made it through, and been successful, that's a good way to get a signal that they're competent and really effective. It doesn't mean others can't be, but that was a lesson.
A
Well, that's toughness and resilience and the ability to handle tough times, which almost every company has.
B
Certainly every team does. Every team, for sure.
A
Oh, and every leader does too. So, hey, tell us about your book. We've referenced it a couple of times. You held it up; you can hold it up again if you want. It was great prop placement. You were doing great on that. There you go. Tell us about your book, why you wrote it, and what's in it.
B
Yeah, look, I personally think, and maybe that's not fair, but I personally think there isn't a great guide to digital marketing. The basics: how to do paid search, how to do social media, how to think about organic versus display, how to do product-led growth using in-product promotions, what basic infrastructure you need from a martech stack, an agency stack, a team stack. So what I wanted to do was write a good reference book for anyone who needs one. Maybe they're new to the industry. Maybe they're not from the marketing team but from, say, finance, and they want to understand how to talk to marketing about marketing, or how to judge it. Maybe they're a new business group leader who's been in sales or product and now wants to learn the marketing function. Or they're a marketing leader who's been in, I don't know, out-of-home and now wants to transition to do more digital. There are lots of different cases, but it's a guide you can pick up that helps you get from 0 to 1, or I'll say 0.5 to 1, in this field, because people were always asking me for it and the one didn't exist. So that's the fundamental reason I wanted to write it. The reason the company let me write it is the point I made earlier. It's, well, sort of two things. One, I'm the CMO of a company that sells ads.
A
There you go.
B
Our team want me to do thought leadership on ads, which is not unreasonable. And for me, appearing on stage with JLo as a thought leadership moment is not my vibe. But writing a semi-academic text on how to do online marketing, that's very me. So that was one part. The second part is this point around incrementality. I believe it's a strategy credit for us: we want people to do incrementality measurement, because it actually favors us when people look at incrementality measurement over post-click tracking. You and I both know that's the right way to measure things, not just taking credit because someone clicked. You should actually say: I ran the ad, I didn't run the ad, what happened to my results? But it is a battle we're fighting in the marketplace, especially against Google, but also the rest of search in general. So getting out there and talking about incrementality measurement is really important. That's the motivation for me. I want to give something back to the industry. All the money is going to charity; I'm not taking a penny of royalties from it. It really is just a gift to the industry, from my perspective. For the company, it's actually aligned with what we want: for me as the CMO to do thought leadership on ads, and in general to fight the battle for incrementality measurement over post-click tracking.
A
My thing was always: why did they do it? It's one thing to know what they did; it's another to know why. And the why almost always explained the incrementality. So you can't just stop at the what. The why is very valuable.
B
And you can really get to what drove the increment. Like, oh, I didn't know that thing was available. Great. That's a great reason. You know what I mean?
A
Right?
B
You know, for example.
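The incrementality logic Alex keeps returning to (run the ad, don't run the ad, compare what actually happens against what last-click tracking claims) can be sketched as a simple holdout calculation. This is an editor's illustration with invented numbers; nothing below is eBay or Google data.

```python
# Incrementality via an on/off (holdout) test: compare the sales that
# actually disappear when a channel is paused against the sales that
# last-click attribution credits to that channel.

def incremental_lift(sales_on, sales_off):
    """Sales that truly disappear when the channel is paused."""
    return sales_on - sales_off

def credit_ratio(true_lift, last_click_credit):
    """>1 means last click under-credits the channel; <1 means it over-credits."""
    return true_lift / last_click_credit

# Illustrative figures: with ads on, 1,000 sales/day, of which last click
# credits 200 to paid search. With ads off, sales fall to 700, so 300
# sales were truly incremental.
lift = incremental_lift(1000, 700)   # 300
ratio = credit_ratio(lift, 200)      # 1.5
print(lift, ratio)
```

In this made-up example the channel drove 300 incremental sales but last click only credited 200, so last-click attribution understated the channel's impact by 1.5x, the direction of surprise Alex describes when eBay turned paid search off.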
A
Well, this brings us to our traditional last question. It's a two parter. You can take both or one, but you must take at least one. Practical advice we haven't talked about yet or funniest story you can share on the air. You can take both or one, but you must take at least one.
B
Maybe both as one answer.
A
Knock yourself out. We love both.
B
So I love affiliate marketing. It is my first love. It's how I paid for college. It's awesome. At the same time, affiliate marketing has a lot of fraud, and a lot of what isn't necessarily fraud but isn't exactly what you're trying to pay for. So when I was at eBay with you, and I'm not going to talk about the big affiliate fraud case, I'm going to talk about something smaller. Yeah.
A
Because that was a big one.
B
Oh, that was the defining one for the industry, and I do write it up in the book. But when I was at eBay, we changed from confirmed registered users, where you registered and confirmed your email address, to activated confirmed registered users: you registered, you confirmed your email address, and you bid on something, bought something, or listed something. You activated. And we changed all the payouts for that. In general, it was a massively positive move. It changed our landing pages: instead of landing people on the registration form, it changed us to use the search landing page. It put us in a place where we were incentivizing the right behaviors, and it produced great results for eBay. However, in affiliates, we had a class of affiliates called incentive affiliates, who would incentivize people to do things. One had a computer game, and in the game you would get extra lives for clicking on an affiliate link and doing what they asked you to do. So they said: we would like you to sign up for eBay, we would like you to activate, we would like you to place a bid. And their users did that en masse, and we paid out a bunch of money to this affiliate. And by the way, completely inside our terms. This was not in any violation of our terms.
A
Yeah, yeah, yeah.
B
And so they did this. And then what happened was chargebacks spiked, and people kept winning auctions and then saying, well, I didn't want to win it, I was just doing it for the extra life in the game. So they were charging back on their cards, they were not paying, they were doing all these different things. And it caused a big spike in what we considered fraud or bad behavior on the eBay site, because this affiliate had done exactly what they were incentivized to do, completely inside our terms.
A
The unintended consequence of compensation is always there. And there it is.
B
Number one rule of affiliates is affiliates will do exactly what you pay them for.
A
Exactly.
B
So you'd better look very closely at what you're paying them for.
A
Well, my other rule is: if there's a hole in the logic, the marketplace will find it and exploit it to pieces before you even know it's there. So you'd better test it all. Okay, you said you had two things you wanted to say. Is there anything else?
B
No, that was two in one. I think that's a funny story, and the practical advice: know what you're paying them for.
A
All right, so thank you, Alex. I think that's a great way to end the show. And thanks to everyone for listening to CMO Confidential. If you are enjoying the show, please like, share, and subscribe. New shows drop every Tuesday. We have over 150 shows on Spotify, Apple, and YouTube, including Is Your Next Best Customer an AI Bot?, The Unfairness and Disparate Impact of Privacy Policy, and Dissecting Compensation: Understanding, Negotiating, and Managing Pay. Hey, all you marketers, stay safe out there. This is Mike Linton signing off for CMO Confidential.
Guest: Alex Schultz, CMO & VP of Analytics at Meta
Host: Mike Linton
Date: January 13, 2026
In this episode, Mike Linton interviews Alex Schultz, CMO and VP of Analytics at Meta, about what it’s like to be the marketing leader at the center of one of the world’s most scrutinized companies. The conversation covers managing vast marketing budgets, orchestrating global teams, navigating decentralization, responding to constant public scrutiny, the integration of AI in marketing and analytics, the nuts and bolts of leading a major rebrand, and lessons from both Meta and eBay. Alex also shares practical hiring tips for the AI era and delivers memorable insights into affiliate marketing’s “law of unintended consequences.”
[01:44 - 03:19]
[03:39 - 06:31] Centralized Marketing, Decentralized Analytics; Managing Multiple Data Truths After Acquisitions
[09:41 - 12:46] Living Under Constant Scrutiny; Advice for Less Experienced Marketers
[13:23 - 17:46] Two Spheres; AI as a Threshold Technology; Human + AI is the Present
[18:12 - 20:35] Barbell Distribution; Advice for Professors and Managers
[22:22 - 25:57] Get in the Information Flow; Literature & Philosophy Matter
[26:23 - 28:49] Tools Change, Principles Remain; The Marketing Funnel Debate
[29:09 - 32:21] Why the Rebrand?; How it Happened
[32:44 - 37:27] Key Takeaway: Incrementality Measurement; Talent and Resilience
[37:49 - 40:34] Purpose; Why Meta Approved It; Proceeds go to charity, a "gift to the industry"
[41:16 - 43:56] Affiliate Marketing: The Law of Unintended Consequences; Another Rule
This summary reflects the candid, insightful, and often entertaining tone of Mike Linton’s conversation with Alex Schultz—infused with actionable advice, real-world stories, and the kind of executive wisdom only gained “in the eye of the storm.”