
A
Hey. This episode of the Exit 5 podcast is brought to you by Qualified. It's no secret: AI is the hottest topic in marketing right now. And one thing we hear a lot of you marketers talking about is how you can use AI agents to help run your marketing machine. And that's where Qualified comes in with Piper, their AI SDR agent. Piper is the number one AI SDR agent on the market, according to G2. And hundreds of companies you've heard of, like Box, Asana and Brex, they've hired Piper to autonomously grow inbound pipeline. How good does that sound? Qualified customers are seeing a massive business impact with Piper: a 3x increase in meetings booked and a 2x increase in pipeline. The agentic marketing era has arrived. And if you're a B2B marketing leader looking to scale pipeline generation, Piper, the number one AI SDR, is here to help. Go and check it out today at qualified.com/exit5, and you can hire Piper, the number one AI SDR agent, and grow your pipeline too. Who doesn't want more pipeline right now? Come on, qualified.com/exit5, and we'll see you on the podcast. You're listening to B2B Marketing with me, Dave Gerhardt.
B
Exit. All right, so on this episode of the Exit 5 podcast, I speak to Hayley Carpenter. Hayley is the founder of Chirpy, a conversion rate optimization agency, and we discuss everything about conversion rate optimization on this episode. We talk about tools you can use, the way to think through it, how to pitch it internally, all that good stuff. And if you're an in-house B2B marketer, improving the number of conversions you get from your site is always important. It's always something that's top of mind for the team. So this is an episode that you won't want to miss if you've been thinking about it. All right, cool. I'm here with Hayley Carpenter. Hayley, how's it going?
C
Going well.
B
Good, good. Love to hear it. Yeah, I would love if we just give the audience, you know, the 60 second background. Who are you and what are you working on right now?
C
I'm Hayley, as we already said. I own Chirpy. It is a CRO services agency. I've had Chirpy now for about two and a half years, but I've been in the industry for close to a decade. So with that said, I have primarily an agency background, and then I also worked at Optimizely on their services team, managed the team there in North America, and then I went out on my own with Chirpy, and I'm currently in Austin.
B
Very cool. Very cool. Nice. Yeah, I know that Chirpy, you know, typically works with D2C e-commerce. That's the space you play in. But, you know, we were talking about this earlier. I feel like there's a lot that B2B marketers can learn from the D2C and e-commerce space. So that's why I was excited to bring you on and talk about some of the stuff you're working on and what you're seeing, because I think there's a lot that we can learn. So, yeah, with that being said, you know, I kind of like to just get right into the meat of it and not waste too much time. You know, I've been a B2B marketer before and been responsible for revamping the website. And the toughest thing is, like, where do I even start? There's obviously, like, the whole redesign and rebranding part, but from your perspective, like, you're a B2B marketer and you're looking at a website in terms of conversion rate optimization. What's one of the first things that you're looking at?
C
Ooh, good question. I want to take it back a couple of steps and then I'll answer that question. Because for anyone listening to this that hasn't heard of CRO, if I say that acronym, it's not Chief Revenue Officer, for clarification. If you think that, I'm not judging you, but I have met people out in the world who have said that. So CRO is conversion rate optimization. It is what it sounds like: you are optimizing conversion rates on a digital experience, whether that be transactions, forms, MQLs, whatever you have going on. But conversion rate optimization is also so, so much more than what it is simplified to be. It's oversimplified and also overcomplicated often. So I try and land somewhere that's digestible in the middle. But for people who are just getting into CRO, really what people associate with it most often, I would say, is testing, or experimentation, or A/B testing. There are all kinds of words used out there. I would say A/B testing is probably the most common one. But really, if you are like, what is CRO? It breaks down into two buckets. And this is that middle point that I try and reach. Testing, that is certainly one bucket. But the other bucket is research. By that you could mean user research, UX/UI research, again, a number of names for the same thing. Market research, brand research, whatever research. We are collecting data over here. And there are all kinds of what I call methodologies that we use to do the research, with all kinds of tools, and then that feeds into the testing. So all of that to say that the research bucket is often very overlooked, not talked about, not known about. So it's really important to start there, I would say, in my opinion. But I don't think that's just my opinion. I would say any good optimizer would agree that you really need to start with collecting data, getting insights, and then you use those insights to go over to the testing bucket. 
However, I will avoid trying to get too in the weeds here too quickly, because not everyone can test. That doesn't always make sense for every case, for everything that you're trying to solve. So the high level point, to answer your question, is start with data, start with research. And the most common methodology, I would say, that people know about is analytics, like GA4, Heap, Adobe Analytics. There's all kinds of analytics tools. But I will probably at some point in this podcast potentially go off on my tangent about research and how it is also so much more than analytics. But I will stop my answer there.
B
All right. No, that's a good answer. Yeah, no, I'm down to go into it right away. So, you know, the first place to start is research. I think that makes a ton of sense. I do agree that most people think that conversion rate optimization means A/B testing. And, you know, a lot of people talk about, like, oh, it's not just testing whether a button's blue or red. And that's true. But there's also just a level of, like, it's not necessarily just testing at all. There's the other side of it, which is figuring out, you know, what people actually want to hear from you and learning more about your customers before you try and create this website for them. So let's get into this research and data and insights piece. Let's say I am starting there today. What are some things that you're trying to learn and uncover at this step?
C
Oh, love this question. So with research, then, there are two main spectrums that I use to think about the planning of the research. One of those spectrums is quantitative and qualitative data. We want both types in an ideal situation. And I boil that down to: quantitative data is what most people are familiar with. That's numbers. Qualitative data is words; that, I would say, is the underappreciated and underutilized type of research, but we want both. The other spectrum is behavioral and perceptive data. Perceptive is also known as attitudinal data. And that is how people are moving and interacting with things versus what they're thinking and what they're feeling. So we need both of those types of data as well. Another way to think about it is what is happening and why it's happening. Oftentimes quant data gives us what is happening. But then the natural follow up question is, well, why is that happening? Usually you have to go to qualitative data to get those answers. And so with all of that said, if you're just starting in this place of general discovery, or perhaps you have a list of specific questions that you're trying to answer, that dictates really then where we start and what we're trying to answer and find. So for this case, say maybe we're starting in a general place, because a lot of teams come to me, they don't necessarily have specific questions, they don't know too much about research. They're like, let's optimize stuff and you tell us, which is completely fine. And so then you're really looking for low hanging fruit to start out. So, like, where are the big friction points? Where are bugs, if any? Where are things that stand out as huge breaks of best practices? Not that we always want to follow best practices, but it is a good starting point. And then from a, like, perceptive side of things, how do users feel about the experience? What are they thinking and how are they perceiving things? 
And that can look a number of ways if you want to get really tactical. You look at things like messaging, the value proposition, your offers, promotions, the journey as a whole. That's another thing I would say people really don't consider enough: it's not just this one section of a site in singularity. CRO is fully encompassing of a journey. And I would say that's, I would argue, a mistake. That's a harsher word, but it's, you know, it's true that it's a mistake to look at something in a journey in singularity without considering the big picture. So yeah, a couple of directions there. Does that answer the question?
B
Yeah, it does for sure. Yeah. So one that I actually wanted to double click on, oh my God, I, I can't believe I just used that word. But one that I want to double click on a little bit more is where do, like I know we talked about where to start, but like how about the actual place on the website? Like, is it like you don't know where to start? You've been given this task, you know, the goal is to obviously get more people to do the thing you want them to do on their site. So from there is it like let's, let's focus on just the homepage for now or let's focus on the product pages or does it always have to be the whole site? Like how do you, how do you break down the thinking there?
C
This is exactly why I do custom scopes for all my clients that come in. Because yes, in the biggest case of doing CRO, particularly with a website. I want to throw in here that CRO is not just for websites. It's also for mobile apps, it's for SaaS products, it's for any digital experience, really. And then the testing, anywhere you own a code base, in theory you can test. So we'll talk about it most likely this whole time in the context of websites, but just for the audience to know that this is broadly applicable. But as far as where to start, like I was saying, in the biggest case it's the whole funnel, and you can, even should, start with landing pages and ads on the PPC side of it, all the way to the post experience of whatever your conversion is. But it really is then, like, if you're not doing the whole journey, is there some immediate need or immediate major issue that we need to address? Like, has all of the traffic stopped converting from the PDPs? Okay, maybe let's look there. Or maybe your services pages or something. You know, is something broken in the cart? Okay, we should probably start there. From a testing perspective, this question is quite objective, in that, if we go back to that testing comment where I was like, not everyone can do testing. When you start there, you have pre-test calculations, so the starting points are actually based on your real data. And testing isn't a matter of whether or not you want to do it. Of course you have to want to do it to get into it. But it's not just like, oh, we want to test, so we're going to do that now. It's, okay, we're going to run the numbers, and the numbers will tell us if you can test at all. If the answer is yes, then those numbers are going to tell us where we can test, how we can test, how long we can test, what we can test, when we can test. Like, it's going to give us all of the answers, right, that we want, and then we piece that together into a roadmap and priorities from there. 
So then in that case maybe it is just the homepage.
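The "run the numbers" step described here is, in essence, a pre-test sample-size calculation. As a hedged illustration only, not Hayley's actual methodology or tooling, here is how the standard two-proportion power formula answers "can we test at all, and for how long?" The 3% baseline conversion rate, 10% target lift, and 200,000 monthly users below are invented example inputs:

```python
from statistics import NormalDist

def sample_size_per_variant(baseline_cr, rel_lift, alpha=0.05, power=0.8):
    """Approximate visitors needed per variant to detect a relative lift
    with a two-sided two-proportion z-test (a standard pre-test calculation)."""
    p1 = baseline_cr
    p2 = baseline_cr * (1 + rel_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # e.g. 1.96 for alpha=0.05
    z_power = NormalDist().inv_cdf(power)          # e.g. 0.84 for 80% power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_power * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return numerator / (p2 - p1) ** 2

# Hypothetical inputs: 3% baseline conversion, detecting a 10% relative lift
n = sample_size_per_variant(0.03, 0.10)
monthly_users = 200_000
weeks = (2 * n) / (monthly_users / 4.33)  # two variants split the traffic
print(f"~{n:,.0f} users per variant, roughly {weeks:.1f} weeks of runtime")
```

With these made-up numbers, the formula lands in the low tens of thousands of users per variant, which is why a 200,000-monthly-user site can finish the test in a few weeks while a 5,000-monthly-user site effectively cannot.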
B
Right.
C
Simple enough. For companies that have more traffic, it's probably the entire funnel. Companies fall in between there. So maybe we do homepage, services page. Depends on the context. For me, for e-comm, you know, it's homepage, category pages, PDPs. Maybe we stop there. It just depends on the traffic. Okay, but research is much broader and more accessible.
B
Yeah. Okay, cool. So that brings us to a great next question. If you have limited traffic, how would you think about testing? I know there's no, like, one number for everyone. Every company and industry is different. So there's no, at least I don't think there's, one minimum number. But how do you think about testing with limited traffic? Is it, let's get more traffic first? Is there always some kind of testing you should do? Can it be a waste of time? What are your thoughts there?
C
Many thoughts, yeah. For this, for my clients that come in, for people that I look to work with, typically my benchmark is average monthly users of 200,000 or more. Not to say that if you are right below that, that we can't test. Again, it's down to the calculation. So it's proportion of conversions to the traffic. We input those into the calculator. But that's about my threshold. For those that are lower traffic, let's say you run those calculations and the answer is no, you can't test in the traditional sense. I label that A/B testing in the sense that you're talking about, like with Optimizely: controlled hypothesis testing. But you can still test if you're low traffic. It just looks different, and the data is not going to be used and analyzed the same. It's not going to be considered the same. But it is there and it gives us data. I hate to frame it as a less-than option and take away value from what it offers, but technically it is like a second choice to controlled hypothesis testing. But this other type of testing for low traffic clients I would call user testing. So let's say we can't put an A and a B into a regular controlled hypothesis test. We can do something like a preference test and still technically get statistical significance with that. So we could put our A and B into a different platform under different circumstances, and we still have data then to guide us in a decision, which is better than the alternative of nothing. If you're not doing user testing as that, not interim, but, like, low traffic solution, then what do you have? Usually nothing. So then you're back to guessing and having no data to guide you. And so as long as the data is clean, it's accurate, you can trust it, some data is better than no data. 
So in my opinion, it is still worth doing some form of user testing for those lower traffic clients, with that roadmapping type of decision making mindset that you would have in controlled hypothesis testing. But everyone can do research. Everyone can do research in that bucket. And then there are some tools out there that will say you can test with very, very low traffic. I feel like this is so just, like, direct and harsh, but, like, that's a lie. Like, you can't just test with whatever traffic you want. Like, that's not how it works. Yeah, so watch out for that. Also, you know, if people say, like, oh, well, you can't do anything with low traffic from, like, a data gathering perspective and a decision making perspective, that's also not true. Yeah, I feel like there was another key point. I forgot it, but that's most of my answer.
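The preference-test idea above can be made concrete: with a forced A-or-B choice, statistical significance reduces to an exact binomial test against a 50/50 chance split. A minimal sketch, where the 64-of-100 panel result is an invented example rather than anything from the episode:

```python
from math import comb

def binomial_two_sided_p(k, n, p=0.5):
    """Exact two-sided binomial test: total probability of any outcome
    as unlikely as, or less likely than, observing k successes out of n."""
    probs = [comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(n + 1)]
    observed = probs[k]
    return sum(pr for pr in probs if pr <= observed + 1e-12)

# Hypothetical panel: 100 people pick between designs A and B; 64 prefer B
p_value = binomial_two_sided_p(64, 100)
print(f"p = {p_value:.4f}")  # well below 0.05 -> a real preference, not chance
```

A 52-of-100 split, by contrast, yields a p-value far above 0.05, i.e. indistinguishable from coin flipping, which is exactly the guardrail that keeps low-traffic "testing" honest.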
B
Yeah, yeah, no, and I agree. I think a lot of people, there's a spectrum of people that listen to our podcast. Some probably have traffic, you know, at or even much north of 200,000 a month. And then some are significantly less, like even 5,000 a month. But maybe the 5,000 a month that come in are actually, like, you know, super high value enterprise B2B types. So even in that case, like, how should they be thinking about this? Again, is it just, like, you're too low? Or even at that level, is there still something that you should and can be doing, and how would you think about it?
C
Even at that level, in my opinion, there are still things you should be doing, because again, like, these resources are out there, these tools are out there to do this type of testing. And you can do it for micro decisions every day. You can do it for macro decisions. Even if you're not using a standard user testing platform where you're aiming for some number of responses and significance, you can still do other research to inform your choices. Even if it's qualitative, even if you're just talking to a couple of people, if that's all you can do, better than no data. Like I said, make sure it's accurate, reliable. Like, if you're selling something to a very specific ICP, try and find testers and users and people to talk to within that, or real customers. I'm not saying we'll just go then ask Sam off the street what he thinks and Susie off the street and ask 200 people. Like, that's not what I'm saying. That's not the accurate, reliable data we're trying to use. But otherwise, yes, there are always absolutely still things that you can be doing with data, to gather data, to use data to inform what you're doing. A way that people talk about CRO that I love is that it's really like an operating philosophy or a decision making philosophy, not just relative to a website or an app, but really as a business. It's like, we are going to use data as an entire business entity to make our choices and work smarter. And there are so many companies out there that do not operate that way, even though, I think, sometimes people are under the assumption that every business does, given the state of what things are today. But that's not true. So we're very much... I still have this rant all the time.
B
Yeah, no, totally. Yeah. I think that's the toughest part with testing, right? It's like, how do you build the culture of it? It's very easy in the singularity of the website redesign to say that, oh, maybe we're going to run some tests here or if we are changing our landing page or creating a landing page, it's very easy to say, oh, in this one project we're going to test two variations of this thing. But it's like, how do you, how do you continually do it and how do you make it part of like the team operating process or even the process of you reporting to your manager? Like how, how do you show them that? I'm constantly running tests and you know, through each test we're learning and iterating and getting better. So yeah, I guess I, I want to turn it back to you. Like how, how do you see teams create this process or create this culture of, you know, constant testing and experimentation? Like, what does that actually look like on the, on the inside and what can marketers listening to this take away from that?
C
Such a broad topic of conversation. I will give my succinct answer that's hopefully the most broadly applicable. But CRO is a system, so I'll start there. I have a diagram of that system. Like it is visualized and it's tangible. You can see the pieces, you can know the pieces. And that system should be implemented for whatever team we're talking about. Granted, it's not always going to look the exact same, but foundationally, usually like the big pieces are the same. And so with that said, let's assume a company has no CRO going right now. No one's doing it. I or someone is bringing it into a company. Usually I would say it starts in one team. Usually I would say that's marketing. And then once you get it rolling with marketing, then you evangelize your work, it's going to be successful. Because what I tell people is CRO always works when it's done correctly in some amount of time. If someone is out there listening to this and you're like, CRO is bullshit or like, you know, we've done it, it doesn't work, whatever. It's not the system that failed you, it's the humans running it that failed you. So you know, you implement that system, it's going to work. You evangelize those results. You, you get other people involved in a variety of ways. You share out your work in a variety of ways and usually that captures some interest. You can get people's attention that way and then you slowly work it out from there. Usually it migrates over to a product team if one exists. And going back to what I said about CRO is more than just conversion rate optimization. It's a business wide effort really, or should be, it's an operating principle. 
So with that and the research, oftentimes, you know, it's starting in marketing, but you're finding insights that relate to sales, to customer service, to your actual products, whether you're e-comm and that's a tangible product that we're selling, or whether that's a SaaS product with a product team over here. You find things that go everywhere, that apply everywhere. Just further proof that CRO is really a business wide thing. But you can certainly, I'm not gonna lie, you can certainly have a situation where there's nothing you can do. Like, I can be the best consultant, the best CRO practitioner on the entire planet, and sometimes there's still nothing I can do to get a team to adopt CRO, or a broader company to adopt CRO, even if I have one team inside of it. But there's a whole spectrum there. Sometimes it just depends. But also sometimes there's silos in a company, and you ultimately work to break those down a little bit, and that allows you to expand CRO. Sometimes you can't. Sometimes teams already don't have silos and they're pretty good about the cross functional work. It really just depends. But yeah, you start with the system, usually getting it in place with the team, growing it from there.
B
Yeah. Okay, cool. I love that. Yeah. I find one of the best ways, just like overall in marketing, it's not rocket science, but to get buy-in for anything is just the proper reporting of what is going on up to your higher-ups. Right. It's like, if you can tell the story of how this is making some kind of impact on the business, then obviously people are a lot more likely to get bought into it, especially if those results are positive ones. So in this case, how should teams be doing it? Like, what is it, they're sending weekly updates to just their boss, to the whole company? Like, what have you seen as some good reporting and evangelism tactics to get buy-in across this culture of testing in the marketing team?
C
Yes. There is a diagram that I've seen and used. I can't remember where I first saw it, but this really falls under a piece of the system that I group as program management. You see my German shepherds behind me. Program management and governance. And so if you think about program management and governance as the center of a wheel. I have a puppy and she's four and a half months old and she just tantalized the older one. You think about program management and governance as the center of a wheel, and then you think of the spokes out from that. All of the spokes are the different tactics that we're using to try and do the evangelization, to get the buy-in, to build the culture. So that's things like having a Slack channel or a Teams channel. You can use that in a variety of ways. But maybe you're dropping test results, hyping up upcoming tests, sharing out your reports. Usually there's weekly, monthly and quarterly reporting happening for CRO. For lower velocity programs, if you're testing, usually it's on a less frequent cadence. It just depends. But within that, within the timing I just mentioned, there are different types of reports. So there's individual test reports. Those are very tactical and specific. But then there are maybe, like, newsletters that go out in email and Slack, maybe directly to, like, a couple of people. And then there's usually a quarterly business review that is more strategic. And also, from that program perspective, program management is different from project management in a number of ways. There are program level metrics that we want to be reporting on. So that would be in something like a monthly meeting, in a quarterly meeting, where it's not just like, this one test got this lift and here's this data set and this learning. It's more like, okay, within this quarter, here's how many wins we had, here's how many losses. 
Here's where we passed this research and sales did this with it. Higher level things. That's one thing, particularly relative to evangelization and getting buy-in, that's often missed or not known about or not done: that program level reporting. It's more often just the individual test reporting. And then also within that reporting, like, research and sharing that out is also extremely important. And I think research, it bums me out still to this day. A lot of times it's done and then it's just a slide deck that goes to die. It's insights that go to die. Where's the value in that? You might as well just not have done it. That's another piece: you need to put the findings in the newsletters, link to them, call things out, make it exciting, tell the story, link it to your tests, show the picture, present that stuff. People hate presenting, but, like, get in front of people with it, explain it, get them excited. And that doesn't mean just sitting and reading the whole damn thing, like, slide for slide. Like, no one wants that. Right? That's not gonna get people interested. So you really need to, like, have that all encompassing picture. And then another piece of it too is, like, the project management that ties into the program management, but I would say it's a sub thing. It's like, how are you keeping track of all of this? Like, where are the research deliverables living? Do you have an insights repository or knowledge base of everything? Where are your tests living? How are you strategizing? Where's the strategy and the roadmap and the prioritization living? Do you have dashboards? There's all kinds of ways that we can report now, right? And, like, there's no excuse not to. There are more tools than ever before. There's more resources than ever before. So it is a giant puzzle, I will say. But you gotta put it together, right? So, like, if you're not gonna do it, have someone do it. Hire someone.
B
Yeah, yeah, totally. Totally. Yeah. So I know there's no, like, step one, but, you know, I'm a scrappy B2B marketer and I want to take a swing at this. I know that's probably not the right way to phrase it, but it's like, you know, I want to start getting better with looking at data more frequently. I want to start making decisions with it. You know, whether it's, I know with, like, email tools it may be different, website tools might be different. But, like, are there some, like, scrappy wins or scrappy things you can do to just start, like, building this muscle of testing? Yeah, tell me.
A
This episode is brought to you by Walnut. It's 2025. Something has to change based on how people buy today. Why are we pouring all of this effort into marketing just to hand prospects a PDF or push them behind a book-a-demo wall? Come on. Today's buyers, just like you and me, we don't want to wait. We want to explore the product, see how it works and understand its value before booking a meeting with someone. I want you to show me the product. Come on. 70% of the B2B buying journey is already done before a sales rep is even contacted. So your buying experience needs to match. And that's where Walnut comes in. Walnut helps you put your product at the center of your marketing. They make it easy for marketers like you and me to embed interactive demos on the site, drop them in campaigns or personalize them for sales in minutes. No engineers required. Love that. Is that, is that called vibe coding? Once a buyer is interested, they don't just want a one-off walkthrough of your product. They want a place to actually evaluate, compare notes and make a decision that they feel confident in. So Walnut has deal rooms that make it easy for your internal champion to sell your product to their team, with interactive demos at the core. The result is fewer stalled deals, consistent buyer experiences and intent data that shows exactly what features are winning. That's why companies like Adobe, NetApp and more trust Walnut to shorten sales cycles, scale pre-sales and drive millions in new pipeline. Want to see it for yourself? Go check it out: walnut.io. That's walnut.io. And they have a cool offer for you. They will actually build your first demo for free, so you can see how demos and deal rooms actually work. You get to see the product, test it out. They're going to do it for free because we're sending you there from Exit 5. So go to walnut.io today and tell them that you heard about them from Exit 5.
C
Yes, there is a free tool called Microsoft Clarity. Okay, it is, I would say, on the newer side. I don't want to say new new, because it's been out for a minute now, but it's on the newer side of tools. And it is just a snippet that you can go place on the back end of your site. Super easy peasy. They'll give you the directions, you can look it up, but it takes like five seconds. And then what happens is that tool will start collecting data. And what it has in there that I like to direct people to first would be heat maps and session recordings. So that's going to give you behavioral data that shows you where people are clicking, where they're not clicking, where they're scrolling. And then the session recordings, I don't want to say, like, people think they're creepy. I don't think it's creepy. Like, I think it's within bounds. Lots of tools out there do it. But you have that snippet, and then sessions on your website are being recorded, so then you can go in and see real users organically interacting with your website. And they don't know that you're watching, which is a plus, because if they know that you're watching, it influences their behavior, or can. So it's a great place to just be like, what, what are people doing? Like, when they get here, what does that maybe look like? You know, just go watch 10 people hang out on your website and go poke around. Like, are they doing what you expected? Are you like, oh no, like, that's, that's not what I thought was happening? Yeah, like, maybe you just launched a new page or a new flow and you thought people would go one way, and you find out, oh shoot, that's not going how I thought at all. That's a great, you know, great starting point. With heat maps, like, maybe you and your team have been working on a section on the homepage that is maybe halfway down even, and you run a heat map and people are not even reaching that part of the page. What a giant waste of your time. 
So it's just like there are some immediate quick things that you can learn entirely for free. I will say Clarity is not my favorite tool. So if you're willing to put in a small amount of money, for a small fee, I would start with Hotjar, because they have a very nice entry level plan, even a couple upgraded plans that are super reasonable, even for, like, a startup. I would honestly just jump straight to Hotjar.
B
Nice. Okay, cool. Yeah, I was going to ask about Hotjar. Recently I used it on a page that I had just built, and yeah, I watched some recorded sessions and looked at the heat map, and yeah, it's exactly what you said. Like, there were certain things that I had no clue would be confusing to people. There were some things that I was just like, I have no clue why anyone would ever click on that so many times, but it keeps happening over and over, and I need to do something about it. So I definitely agree that that's, like, a really quick and easy scrappy thing. It's also just, like, satisfying. The process is also kind of addicting, because it's like you get to see colors and you get to actually see people in action. It's a really, really cool way to start it. No doubt about it.
C
Yes, I like that you called that out, because heat maps, for people that have never seen them before — they're very, very colorful, and they're very visual, very tangible-seeming. And so as opposed to a table of numbers, which many people don't get excited about at first glance, heat maps are also a good starting point because they kind of draw people in. They're like, ooh, what's that? Ooh, that's interesting. That's kind of fun-looking. And then you start to explain it, and they're like, ooh, ah, oh, that's cool. And so I do like that as a starting point for that reason as well.
B
Yeah, yeah, exactly. Exactly. Okay. Love it. Love it. All right, so I want to talk a bit about what you're seeing in direct to consumer brands that you work with and maybe what you think is missing in B2B. So from a testing standpoint and how they're thinking about what to test in their marketing, what do you think B2B is missing? What do you think B2B can start doing that these brands are doing and you're seeing on your end?
C
I think one of the biggest things is clarity and value proposition — with messaging and information and flows in general. So on the e-comm side, that's typically much more straightforward, or at least easier for brands to do. Because, you know, if you're selling a product — let's say I'm selling a shoe — the value proposition is usually pretty straightforward to come up with. It's, you know, something about your shoes. You have shoes. Everyone knows what shoes are. And the funnel is: okay, I maybe go to the homepage, I go to the category page — women's shoes, for me — I pick a shoe that I like, I go to the particular product page for that shoe, I add it to my cart, I go to the cart, I check out, boom. And that's pretty consistent for e-comm. And on the B2B side — I'm not saying that the messaging and the clarity is difficult in every case, but usually companies struggle much more with that. Especially if you're offering something complicated or something with a lot of jargony words, or if you have 30 services pages that are five layers deep somehow, and there's four calls to action, and you have a demo and you have a Contact Us and you have this, that and the other — it can get a little bit unruly. So piecing all of that together in a way that's extremely clear, effective, concise — there are times that we want to break best practices intentionally, in B2B I would say more often. And then the flows, like I was saying, and making that clear to users — that's a good starting place. And I say that about Peep just because I've worked with one of his companies before. He hammers on this a lot. I don't want to speak for him, but with messaging and clarity and value propositions — and I'm not, like, biased or anything because I've worked with him and think a lot of Peep — but genuinely, that is a really good place to start, and a lot of teams do struggle with it.
And there's a tool like Wynter, for example — W-Y-N-T-E-R — that is specifically for B2B messaging. It is on the more expensive side. So there are other tools as well that are great for message testing or value prop testing or homepage testing or navigation testing, even, you know, card sorting, tree testing, things like that, that you can use to immediately get some things to change — or to test in a controlled hypothesis test, if you're going to do that.
B
Yeah, yeah, yeah. I think you nailed it. Like, I've been on B2B marketing teams before, and there's only so much testing you can do. But if your messaging sucks, if you're not well positioned against the competitors in the market, there's just nothing that's really going to fix that. Like, the problem is way down at the foundation, and that needs to get fixed first. So yeah, that stuff's way more important. And with that, I guess that goes back to how you put conversion rate optimization into two buckets — the testing part and the research part. And this is where the research really is one of the most important components. Because if you're not, you know, doing some talking to customers, or even doing some research internally — like even just talking to senior leadership and understanding how they want to talk about what the company does, and how they see the company being different, now and in the future, from the competitors — there's really no amount of testing that fixes that. You can't out-test it, right?
C
Yeah. And, you know, teams don't want to hear it, but usually they're not talking about products and things and services the way their customers are talking about them. Or the pain points the team thinks exist — really, anything the team is thinking — usually at some juncture, at a number of points, you're not going to be aligning with your actual customers. And you won't know that without the research. And like you were saying, these foundational things — think of it as a pyramid. Measurement is one of them, and metrics and proper strategy around that is part of that foundation, along with messaging. But if the foundation — if that's cracked, and then you're trying to build on top of it, and six months, a year later, that whole thing crumbles at the bottom — that's a bad day. Your whole pyramid is going to fall on top of you. It's not a good time. And I see it all the time. And shoot, there's one other thing. Oh, right — something else I'm gonna note that I also like that Peep talks about. I'm gonna say his name 'cause he's the one I see talk about it the most. But, like, features — you can't really compete on features for very long, if at all. Whatever features or benefits you have, usually they end up getting pretty similar across whoever your competitors are. So then it's like, well, what do you have then? What are you competing on? It's brand, story, value proposition, messaging and all of that instead. So it's so important to focus on — yeah, and the journey, and just everything that we've been talking about.
B
Yeah, totally. No, and there's a lot to it, a lot to unpack, and a lot of things to get right. You know, it's funny — at Drive last year, we had a talk from Kyle Coleman, who at the time was the CMO of Copy.ai. Now I think he's at ClickUp. And he talked about when he first joined Copy.ai — in the, you know, the process of starting the job, he had screenshotted his website messaging and that of his three or four biggest competitors, whatever it was. And they literally all sounded the exact same.
C
Yeah.
B
And they all talked about the same, you know, kind of feature set, or the same, you know, thing that they help people do. So it's pretty crazy from that perspective. Like, the way we think people want to hear about what we do versus what our customers actually think — it's typically totally different. And a lot of times, marketers — you know, I'm guilty of it too — like, I'm so wrapped up in the bubble of our company that I think that everybody sees it the same way, or a similar way, and that just because I say it, people are going to pick up on it. But, you know, really putting yourself back in your customer's shoes is just so hard.
C
Yeah, and that applies to me too. Yeah, it applies to me too. Like, I'm not excluded. I can't work on my own stuff. I hire people to help me because I'm just. We're all too close to our own things. And that's not our fault. It's just the nature of the work.
B
Yeah, yeah, exactly. Awesome. Okay, so we talked about storytelling and messaging, positioning, the value prop. I want to talk about social proof. I'm not saying social proof is, like, this silver bullet of everything marketing-related, but in my opinion, it's kind of close. Like, I feel like if you have good social proof, and if you can weave that into the story properly, it's much harder to lose or to get bad results. So, yeah — let's say you're looking at ads, you know, you're working on someone's ads and helping them with testing, or whether it's their website or emails or whatever you're working on: what role is social proof playing? Like, if you don't see the proper social proof, or don't see any at all, are you, like, yelling and shaking your client — go get some immediately?
C
Yeah, immediately, immediately. I mean, I don't say "always" often, but I might venture to say always here, in that every time social proof doesn't exist somewhere, or it's very minimal, and we do some research, it's called out. People notice. It immediately dings your credibility and kind of the investment that people are willing to give you. I feel like it's just so easy to have nowadays; there's no reason not to. There are all kinds of platforms. Like we've been saying, just go talk to customers. Do a survey, offer an incentive — like, hey, 10 minutes of your time, I'll give you a Starbucks gift card for a cup of coffee. Or, you know, hey, if you give us 10 minutes of your time, we'll enter you into a raffle or whatever for, like, a hundred-dollar Amazon gift card, something like that. And then you can get reviews that way. There are just so many ways — there's no reason not to have it. And I do want to say, too, especially on the e-comm side — which I know isn't as much of a focus here, but it maybe applies for a B2B audience to some extent — do also be strategic about it. Because in an e-comm example, a common strategy or tactic is to have the star reviews for a product, right? Usually there's the star visual at the top of the page — okay, we have 20 reviews, five stars — and then somewhere they put the reviews down towards the bottom of the page. But if there are too few reviews, and there's actually not enough social proof, people call that out every time as well, and that also dings your credibility. So there is a balance of strategy here: okay, what do you have? How much of it do you have? How quickly are you able to get it? Are you getting it, and how are you continuing to do that? So it's not always as simple as, hey, go fling some stuff up there. But, you know, it is extremely important, talking about that foundation and the messaging and everything.
Social proof goes along with that. Oh, I remembered a point that I wanted to make earlier, too, which is that usually brands — especially B2B — are still at that foundation level. So everyone loves to jump to the more advanced things, like personalization, recommendations, things like that. If your messaging isn't clear, if your journey is not clear, if your value is not clear, if you don't have social proof and people don't trust you — don't worry about personalization quite yet. It's not the most important thing, because, you know, if you think about that pyramid, personalization is going to be way up here. If your base is cracked, it doesn't matter. Like, everything is crumbling beneath you. So that's another, I would say, big mistake that I see. People really are just like, let's do personalization — I keep saying that one 'cause it's, like, a really big one. Like, people come to me all the time, and then it's, okay, well, let's look at this, this, this, and this. Oh — foundation's cracked. And that's where we start.
B
Yeah, yeah, yeah, I love that. And I think with social proof — I'm not gonna say the bar has lowered, I just think you can be scrappier with it now. Like, yeah, you don't need to fly to your client's site across the country and videotape them for two hours with drones to get a good one anymore. Like, you could.
C
And.
B
And that stuff doesn't even work that well. Like, if I went on a website and saw that, I would be like, oh wow, you got your happiest customer, highest-production-possible thing — and they're wearing, like, a suit and tie and are, like, 30 years senior to me. Like, that doesn't speak to me at all. It's so much easier now. Even in an industry that is just, you know, more prim and proper, more white-collar, I still think you can be scrappy with it. Something that we did recently with Exit 5 — we're using this tool called Testimonial.to, which allows you to essentially send a link to somebody that says, hey, you know, we'd love it if you can leave us a review, it'll take you a couple minutes, click here. And then they can choose text or video. If they choose text, they can write it up, and we give the prompting questions. And if it's video, same thing — they can just record a video right off their laptop. And that's so much more powerful, you know, for other people to see your customers just talking from a laptop out of their home or out of their office, saying, yeah, Exit 5 is a great tool that's helped me do X, Y and Z, would highly recommend if you're in B2B marketing. Like, that alone is just crazy valuable. It's so raw, the social proof, and it's really just undeniable. Like, if somebody is willing to do that, you can almost guarantee it's going to build some trust in the audience, because they're like, oh — here's 20 people that went on video and talked about this product. It must be decent, right?
C
100%. People want the real. They want the authentic. They want the genuine. They don't want the fabricated and the smoke and mirrors and the crap that your marketing team created and, like, put a veil over. People these days are very attuned to that — maybe more than ever. I mean, that's maybe a big statement, but, like, yeah, people have bullshit detectors, and they're very good now, especially with AI, you know. So you're right, you're right.
B
Okay, I have a couple more questions here. One is: what do you usually see get in the way of building this system, or, you know, really properly experimenting and testing? Like, where do you see marketing teams getting blocked or stuck with this?
C
One big one can be ego. And, going back to what I said — I can be the best consultant in the world, and there's nothing I can do. Usually, if ego is involved, those are the cases where people within the company — whether that be higher-ups or any level, really, that has some level of decision making — are like, no, we know best, or, we've been doing it this way forever and this is how we will continue doing it forever, and we are not open to changes, and this, that and the other. Like, that is one of the biggest blockers, for sure. I can't do anything about that, right? Like, I can talk till I'm blue in the face, usually, and it doesn't make a bit of difference. I'm going to try, but I don't always win. And then another common blocker would be budget, especially for the bigger programs. If you're doing testing, if you're doing CRO well, usually it is somewhat expensive. Like, there is a cost to it. That's just what it is. It requires a lot of resources; it's a very specialized skill. Don't even get me started on the tangent of AI replacing the CRO strategist, or, like, the human component of this to a large extent — like, no. I'll just save that one. But, you know, I don't want to say that to people listening who are like, oh, it's expensive — great, tune out. Because there are things that you can do that are lower cost, for sure. But, you know, budget on the big side is usually a blocker. And then another one is resources. So, like, if you hire someone like myself or my team from the outside to come in, there has to be at least some level of collaboration. Like, there has to at least be one human on the team to manage the relationship, or at least keep an eye out, or, like, give approvals for things. So if the team is really, really strapped, that's a blocker. I would say those are the big ones. Yeah, that's probably the end of my answer there.
B
Okay, cool. Love it. All right. Another thing — you know, in any of your work recently, or anything that you can think of, is there a test result that has really shocked you? Like, you know, something that you've done where you were like, whoa, I did not expect that? And maybe we can pull some learnings out of it.
C
Yes. And this is the fun part about testing: people think they know what's best. They think they know the answers. Especially if the C-suite's dictating, like, hey, we're doing this — and they probably would have done it without a test, but there's a CRO team and they were like, hey, let's test it. So they're testing it, but they're like, no, we know. And then it loses, and they're like, what? This is exactly why we test. So one test that comes to mind that really did shock me — I was like, oh wow, that's a shocker — it was e-comm, but still applicable to lots of companies. In the top navigation, site-wide in the header, their control — like, their existing initial version — had imagery. So, you know, it's like link with an image, link with an image, link with an image. And their team wanted to try removing the images and having text only. I thought that would lose. I thought the design was not as strong. And it got a little over a 7% lift in transactions. I was so surprised. And that was at full statistical significance — I can't remember what the confidence interval was off the top of my head. But I mean, that's significant, in actual bottom-line transactions, from just removing images, which is, like, one of the easiest tests that you could set up. I mean, that's huge.
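The "full statistical significance" behind a lift like that is usually checked with a standard two-proportion z-test. Here's a minimal sketch in Python — the traffic and transaction counts are made-up illustrations (roughly a 7.5% lift), not the numbers from the actual test described above:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-tailed z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))          # two-tailed p-value
    lift = (p_b - p_a) / p_a                            # relative lift of B over A
    return lift, z, p_value

# Hypothetical volumes: 100,000 sessions per arm, 4.0% vs 4.3% conversion.
lift, z, p = two_proportion_z_test(conv_a=4_000, n_a=100_000,
                                   conv_b=4_300, n_b=100_000)
print(f"lift: {lift:.1%}, z: {z:.2f}, p: {p:.4f}")
```

At these hypothetical volumes the 7.5% lift clears the conventional 95% bar comfortably (z ≈ 3.4); with only 10,000 sessions per arm, the exact same lift would not (z ≈ 1.1) — which is why traffic volume dictates how long a test has to run.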
B
Absolutely, yeah. That's awesome. I love that. I love that. And the last thing is: is there an overrated metric, or an overrated way that teams measure things? Yeah — is there an overrated metric in conversion optimization? Like, anything that you think teams shouldn't pay attention to that they do?
C
Not one in particular, but I have a very strong stance on this at a higher level, which I think better answers the question. People who have talked to me or worked with me before will know I have a very strong opinion on this. When you run tests, or really do work of any kind — research, whatever — you're going to have a primary metric, or maybe two, that you're going after. So on the B2B side, let's just say it's form submissions and you're trying to get leads. That would be a primary metric to me. Then you have secondary metrics. Maybe a tertiary distinction — I don't think that's all that necessary; I would just say primary and secondary. Secondary, for me, would be things like bounce rate, exit rate, average session duration, scroll depth, clicks on something — things like that. Engagement-type metrics are very common; I'll use those as an example. So the opposite of that — which I would say is a big mistake, overhyped, however you want to frame it — would be using the vanity metrics, what I call secondary metrics, as primary metrics, and then thinking of the bottom-line metrics as, like, oh, those are nice to have. They're like, yeah, it'll trickle down, we'll see what happens. Or, no, that's secondary because we don't have enough traffic and data volume, or whatever. Like, there are all kinds of reasons people flip those. That's one of the biggest mistakes that you can make, in my opinion, and I'm hardcore about it: the bottom-line metrics need to be primary. That's what we're going after, and the strategy revolves around that. Because if you're gonna go to your C-suite at the end of a quarter and you're like, yeah, we decreased bounce rate by 20% and we increased these clicks by 50% — like, who gives a flying shit? That's not going to justify thousands or hundreds of thousands of dollars in investment in this work, or whatever you're doing, really.
They're like, okay, well, how does that affect the bottom line? That's revenue, that's form submissions, that's MQLs, that's SQLs, that's SAOs, that's ARR, that's lifetime value, that's transactions — like, all the money-related metrics, right? So that's how I would answer that question.
B
Okay, I love that. Yeah, that goes with so many things in B2B and in marketing: when you're reporting on what you're working on, you just have to find a way to tie it back to that. So say the thing is, you got 10 extra form submissions this month on whatever that form is. And out of the 10 submissions, you know that equals five more opportunities for the sales team. And from those five opportunities, one of them is likely to close — let's just say it's a 20% close rate. Then it's like, great, we got an extra 10k in MRR from this test, and that's what you lead with. That's the hook: we ran a test and we got an extra 10k in MRR from it. And then allow them to peel back the onion and say, oh, how did you do that? What happened? Walk me through it. And then if they're interested in knowing those secondary metrics you talked about, maybe you can bring them up there. But most of them are just going to want to know the high-level story. And then it's like, great, let's keep doing it — and let me know how we can invest more in these tests and what you're doing. So, love that.
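That back-of-the-napkin math can be written out as a tiny model. Every input here is an illustrative assumption taken from the example above (including the $10k-MRR deal size implied by those numbers), not a real benchmark:

```python
# Back-of-the-napkin pipeline math: form submissions -> opportunities -> revenue.
# All inputs are illustrative assumptions, including the assumed deal size.
extra_form_submissions = 10
submission_to_opp_rate = 0.5    # 10 submissions -> 5 sales opportunities
close_rate = 0.2                # 20% of opportunities close
mrr_per_deal = 10_000           # assumed average deal size in MRR

opportunities = extra_form_submissions * submission_to_opp_rate
closed_deals = opportunities * close_rate
extra_mrr = closed_deals * mrr_per_deal

print(f"{opportunities:.0f} opps -> {closed_deals:.0f} deal(s) -> ${extra_mrr:,.0f} extra MRR")
```

Swapping in your own funnel conversion rates turns any secondary-metric win (more clicks, more submissions) into the bottom-line number the C-suite actually asks about.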
C
Exactly. And I want to throw in — I know we're almost at time — I want to throw in: if a team's response to what you and I just said is, well, we can't measure those bottom-line things? Not an excuse. Completely unacceptable excuse. That's just it. It's just not acceptable.
B
Yeah, I agree. I agree. I think there's always a way to tie it back. If you don't know how, or, you know, you're saying there's no way — it's just that you haven't figured it out yet. But there's always a way. You've got to be able to talk about what you're doing and how it's affecting the business, especially in this setting where you're literally looking at one thing versus another. I think you should be able to have a clear idea. And if not, it's kind of like, well, why did you do it in the first place if you didn't even know what it was going to impact? Right? So.
C
Exactly. Yeah. Yes.
B
All right, well, I think that's a good note for us to end on. Hayley, this has been awesome. I appreciate all your insights. I know you're very active on LinkedIn, so I'm sure people could find you there and dig deeper. There's so much more on this topic. And you're also inside the Exit 5 community. And one of the reasons I wanted to have you on today is because you're always in the community, delivering a ton of value to the other members, commenting on stuff, and just giving away so much great knowledge. So, again, I appreciate it. And we'll see you around. Okay.
C
Yeah, thanks for having me.
B
All right, thanks, Bye.
A
Hey, thanks for listening to this podcast. If you liked this episode, you know what? I'm not even going to ask you to subscribe and leave a review, because I don't really care about that. I have something better for you. We've built the number one private community for B2B marketers at Exit 5, and you can go and check that out. Instead of leaving a rating or review, go check it out right now on our website, exit5.com. Our mission at Exit 5 is to help you grow your career in B2B marketing, and there's no better place to do that than with us at Exit 5. There are nearly 5,000 members now in our community. People are in there posting every day, asking questions about things like marketing planning, ideas, inspiration — asking questions and getting feedback from your peers, building your own network of marketers who are doing the same thing you are, so you can have a peer group. Or maybe just venting about your boss when you need to get in there and get something off your chest. It's 100% free to join for seven days, so you can go and check it out risk-free, and then there's a small annual fee if you want to become a member for the year. Go check it out, learn more at exit5.com, and I will see you over there in the community. Email, in my humble opinion, is still the greatest marketing channel of all time. It's the only way you can truly own your audience today. But when it comes to building those emails — well, if you've ever tried building an email in an enterprise marketing automation platform, you know just how painful that can be. I won't name names, but templates get too rigid, editing code can break things, and the whole process just takes forever when it shouldn't. That's why we love Knak here at Exit 5. Knak is a no-code email platform that makes it easy to create on-brand, high-performing emails without the bottlenecks. If you're frustrated by clunky email builders, you need Knak.
If you're tired of hoping the email you sent looks good across all devices, just test it in Knak first. And if you're on a big team that's making it hard to collaborate and get approvals on your email, you definitely need Knak. The best part? Everything takes a fraction of the time. You can see Knak in action at knak.com/exit5 — that's K-N-A-K.com/exit5 — or just let them know you heard about Knak from Exit 5. That's us.
Episode Title: Research-First CRO: How Data Leads to Better Marketing Decisions with Haley Carpenter
Host: Dave Gerhardt
Guest: Haley Carpenter, Founder of Chirpy (CRO Agency)
Date: September 15, 2025
In this episode, Dave Gerhardt hosts Haley Carpenter to dive deep into conversion rate optimization (CRO) from a research-driven perspective. Haley unpacks why data—not just experimentation—should be at the core of all CRO efforts. She covers fundamental concepts, best practices for teams of all sizes, tool recommendations, embedding a testing culture, cross-pollination of D2C and B2B tactics, and avoiding common pitfalls around metrics, reporting, and company buy-in.
For more insights from Haley or to join the discussion, connect with her and thousands of marketers in the Exit Five community.