A
Hey, everyone. We know already that many go-to-market teams are flying blind. They're tracking the wrong metrics, they're making decisions based on their gut feelings and experience. And in many cases, what we see is they're really missing anywhere between, you know, like 60 to 80% of the really important activities that create pipeline before it even happens. So in this episode, we're going to be talking about something called the pipeline black box, and also why AI isn't going to be your silver bullet, and how to lay out a plan to rebuild the visibility you need to make the right decisions and move forward fast. Enjoy the show. You're listening to GTM Live, a podcast by Passetto. Hey, guys. Amber, we've got a new voice on the show today. Very excited. It's actually been a few weeks since we've all recorded together. We've had a few episodes recently with some guest speakers and such, and there's been some really exciting stuff going on behind the scenes at Passetto. Our team is growing and our capabilities are also growing along with that. And so Amber, Amber Williams is here. She is the newest member of the Passetto team in her role as head of revenue operations, and now is a co-host with Trevor and me. So we're really excited to have Amber join. Amber, can you give us a background, tell the audience where you came from? Yeah, yeah, for sure.
B
Thanks, Carolyn. So happy to be here with you and Trevor. Obviously a fan of the show, and have been for years, feels like my whole career in RevOps. So super happy to be here. And yeah, so what I've been working on for the past few years is really just getting deeper in terms of, like, the core visibility problem in revenue operations and in go-to-market. So I know we're going to talk about it today, but what I have specialized in is seeing into the pipeline black box, because regardless of, you know, current beliefs and assumptions, more tech in 2025 does not make things easier for go-to-market teams. And so that's what I specialize in, is helping uncover the layers and sort of bring things together to give you the visibility into what's working and what's not. And obviously we've been, you know, sort of helping solve similar problems for customers that you solve at Passetto. So I'm happy to be here and combine efforts.
A
Yeah, I love that. It's definitely a trend that Trevor and I have been seeing since we all created Passetto, just this crisis that is really crappy data architecture. And so we're pumped to be solving that in a more hands-on approach with our customers and with our client base. And yeah, it's good to have you here. It's good to have, you know, another voice on the show, and some folks might also just recognize you because you've been a regular attendee when we've done this thing live. So let's kick off today. The topic for today is really to talk about this concept of revenue as a science. How do we think about the go-to-market engine as this sort of scientific framework where you can have this level of predictability when done right? And so I want to unpack that a little bit more. But to just kick things off, I wanted to talk about a conversation that's really stuck with me. It's a conversation that I had earlier this week with the CEO of a Series B tech company. And the conversation really came down to: yeah, I think my marketing team is underperforming. You know, I don't really trust the stuff that they're doing. It doesn't seem to be working in terms of new logo acquisition. Like, we gotta ramp that up. Like, what would you do? Right. And being in the world of marketing myself, and having been a head of marketing in my previous roles, my first instinct, and I think the instinct that a lot of marketing leaders would have, is to go back into their toolbox, into their own experiences and sort of, like, gut feeling: what do we think is going to work? Well, you got to ramp up paid search. If you're not doing paid search, you got to start investing in that. You got to get on paid social. And I think that there's a lot of validity in, like, doing the right things that should be a part of a strategy.
But I paused for a sec and I'm like, how do we know that your new logo acquisition plateau, or, like, deceleration, is really a marketing problem? We don't, unless you have the data to collect that. And I really think, looking at this systematically across go-to-market, it's probably an overall go-to-market problem, one that is impacted by both marketing and sales and your SDRs or BDRs or whatever. And so I paused and I was like, you know what, I probably wouldn't make any recommendations at that point. My recommendation would be to not make any decisions without the data to actually back that. Right. And so I think this really comes down to the fact that a lot of go-to-market leaders in today's ecosystem are making decisions that way, based on, like, gut feel and not reliable data. And so I think that really comes down to this topic today of revenue as a science, or just this concept of revenue visibility. And so I'm really pumped to get into that. I think it's going to be really applicable to the people that listen to this show. And so, yeah, let's get into it.
B
Awesome. Yeah, I think thinking about it as a science is, like, obviously we have access to so much data. And so I'm excited to unpack more in terms of what does that mean from the revenue science perspective, and how do we really capitalize on all these promises, right, with all this go-to-market technology, and really start to see the impact of what's happening. So, yeah, so in this episode today, we're going to talk about the pipeline black box, what that is. Maybe you've heard it before. Maybe you're like, hmm, I kind of know what that is, but can we unpack that a little bit? So we're going to talk about that, and talk about the 80/20 theory, which is to say, hey, most companies see 20% of what's happening in their pipeline and they think that they have revenue visibility, but they're actually missing 80% of what is driving conversions across their pipeline. They really can't see, you know, even what's not working, which I know is top of mind for a lot of go-to-market leaders right now. And then we're also going to talk about defining revenue science. What does that mean? What does it mean at Passetto to run revenue like a science? And data architecture, this new idea of instrumentation. And, like, what is instrumentation? How is that different from how maybe you've approached data architecture in the past? And how do we really simplify this concept in a way that is going to make sense to the executive team? And then towards the end, we're going to talk about, if we were all to go in-house tomorrow as go-to-market leaders, what would be the first thing that we would focus on and why? So let's get into it.
A
Cool. Yeah, I love that. I think we're gonna get into some good stuff today. But I think to start, you know, we've gotten into this before, but I really want to double back to what is the status quo for go-to-market measurement. Right now, it feels like the vast majority of go-to-market leaders and operators are really still stuck in the status quo. So, Trevor, let's get your take on it: what is the status quo? How are people operating right now?
C
Absolutely. And we see this with just about everyone we work with. There's some version of these things that I'm about to talk about, and I think the headline here is that not a lot has changed in a really long time. We haven't gotten more sophisticated about this as a community for years, really. We've bolted on all these new tools that do these very specific point-solution things, which are great. Yes, you need some of those tools. You should, you know, explore new ways of doing things and all that. But when it comes down to reporting on how did it all go, you still end up with sort of the same old stuff. So this idea of MQL and SQL, which are not really fundamentally wrong, but they're often not the whole story. They're really hyper-focused on a couple different facets of the greater journey. And, you know, we find that if you're still stuck in the idea of MQL, you're also thinking a lot about marketing versus sales. You're thinking about what did marketing hand over. And, you know, it's not that simple. You end up with these arguments about marketing handoff and quality and all those sorts of things, and you forget to think about the other ways that sales becomes aware of prospects that need to be worked. And you lose the ability, or never gain the ability, I guess I should say, to tell that more nuanced story about here's all the things that sales is working, whether or not they were specifically marketing-driven. We want to know some more things about that. So you see these very hyper-focused measurements of MQL/SQL. You of course are measuring website traffic in an anonymous way. You do some things in Google Analytics or whatever, you know, web analytics tools, but you're not often folding that into sort of the greater story of the journey and so on. Of course we've got sales automation now, and everyone's got something like that, and they provide some ways of measuring some things.
But of course all those tools, whether it is sales automation or some other point-specific tool, are built to show how they specifically are tying back to a result, if they're trying to measure results at all. It is very self-serving. It is only a tiny piece of the story. And so you just don't get that holistic view of how that tool or overall solution feeds into the bigger picture. We do have attribution. Some organizations of course have an attribution tool. We find that they are often hyper-focused on the marketing story. They are not being used well to sort of underlie the whole journey. It ends up just being a way to tell some kind of story about how marketing did some marketing things. And it needs to be more than that to be useful to everybody. Yeah, I think it's just that these are all components. They're not necessarily wrong, they are important things, but they rarely get pulled together into a sort of holistic vision of what's really happening. And even when organizations attempt to do that, it often becomes messy and confusing and just ends up being something that nobody can really wrap their minds around. And ultimately nobody pays any real attention to it because they don't get it or they don't trust it or whatever. And so that's definitely this black box, as Amber talked about. We can't really get into the specifics of what's really happening on that journey.
A
I think, too, that's a lot of times why, when you're in this, like, QBR scramble or, you know, preparing for a board meeting and things like that, you end up with some sort of narrative that oftentimes is kind of self-serving or skewed towards what that individual platform's data is telling you, but it skirts around the real business impact. And to some extent, I think I've said this before, but I think sometimes CMOs or marketing leaders are treated like second-class executives, and I think that's why: these tools sort of, like, give them anecdotal data at best, not really business-impact-driven metrics to really communicate their performance and what they're doing as it contributes to new business revenue.
B
Yeah. I think when it comes to the status quo in go-to-market and what we're measuring, it's absolutely stuck five years in the past. Right. So I can say from a RevOps perspective, it's table stakes if you don't know how to come in and measure, you know, how many qualified leads did we create, how many meetings did the sales team have, how many opportunities did they create, how many opportunities did we close, what's our sales cycle. But I think the big disconnect between executive-level leadership and the teams on the ground surfacing these insights is this: executives are asking, what do we do strategically? How are we going to make a decision to double down or cut or pivot? And with the teams in management, right, and even director-level RevOps, there's this disconnect in terms of understanding how much operational resource goes into maintaining the status quo as you have it right now. So I think there's multiple reasons for that. But the status quo is really, really harmful for companies in 2025, because it's really showing you something that is not going to give you those strategic insights. Right. Like, RevOps gets, you know, criticism. I know when I was in-house, like, years ago with RevOps, it's like, how can we be more strategic? How can we be more strategic? Well, we're over here maintaining a lot of reporting that is really required from this, like, top-down understanding of here's what we need to be able to see and here's how we're going to report on meetings created. So much so that as the tech stack gets more complicated and these point solutions become more complicated and we have AI coming in everywhere, it's a lot to keep up with. And so the status quo, I think, is something that we need to be more upfront and realistic about, at least in revenue and GTM operations: this is an operational load. Right.
And so if we're going to really pivot to see what's happening in the business at a deeper layer, to really match the complexity of the market today, we have to, like, back into the solution. Right. Because the status quo reporting around, like, conversion rates is not getting us there, but it takes a lot to try to see it from a different perspective. And teams are really bogged down with just keeping up with the status quo. So that's my perspective on that.
C
Yeah. So much of the status quo measurement is very simplistic volume measurement, too. Amber, you mentioned number of meetings, number of MQLs, whatever. Like, that's interesting to know, and I think you should know those things, you know, some version of those things. But just the sheer fact of a volume number doesn't really tell you a whole lot. You start to layer in conversion rates and it starts to tell you a little bit more, but it's still overly simplistic and not particularly actionable. Like, okay, we got more MQLs or we booked more meetings or whatever. What do you do with that information? You gotta get deeper. If you want to know the why, it can't stop there. You need to know quite a few things under the hood, like how you got there and what happened when you got there and where'd we go next, and so on and so on.
A
Personally, like, my view is that I actually hate the MQL as a metric, and I say that as someone who used to actually be on the other side of the fence, obviously, like, being in-house, where, you know, volume of MQLs generated was like the gold standard for marketing, and unfortunately it still is. But I think what's highly problematic about measuring MQL volume is that MQLs are not, in my opinion, a good indicator of potential pipeline. We're measuring, in many cases, just names at that point, of people that have absolutely no intent, are not in market, you know, might never be in market. And I think the biggest missing piece is understanding, like, the cause and effect between the MQLs you are creating and what happens with them. Like, are they passed to sales? What is sales doing about them? Are they converting? Are they going anywhere? And this is not a blanket statement, because I definitely don't see this across the board, but I think in a lot of cases, for companies that, like, totally over-index on MQL generation or just demand-conversion sort of programs, those MQLs literally go nowhere. I was just on the phone with a company earlier this week where a huge proportion of, like, what they're passing to their SDR teams are these MQLs that reached, you know, a scoring threshold. And now that we can measure it, which they previously couldn't, we can see that that huge proportion of MQLs converts to pipeline at, I think it was like 1.4%, an alarmingly low number. And so if you didn't have the ability to measure the cause and effect between marketing and what's then passed to, you know, your prospecting engine and then passed to pipeline, you would never see that. You wouldn't have this cause and effect, or, like, causal relationship, between each of the things that you're doing across your go-to-market to understand what's actually working and what's not.
And it seems like, as I'm explaining this, it feels so overly simplistic, and it's like, well, why is that so hard to measure? But I guess the reality is that it just is. Like, most companies that we see just don't measure this. They measure MQLs generated over here in marketing, and they measure, like, you know, pipeline generated over here in sales, and then they struggle to determine a way to, like, see who sourced what.
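[Editor's note: the MQL-to-pipeline measurement Carolyn describes really is simple once the data is joined. Here is a minimal, hypothetical sketch in Python; the lead records and field names (`source`, `became_pipeline`) are invented for illustration and are not Passetto's actual data model.]

```python
# Hypothetical lead records exported from a CRM; all field names are invented.
leads = [
    {"email": "a@acme.com", "source": "paid_social", "became_pipeline": False},
    {"email": "b@beta.io", "source": "paid_social", "became_pipeline": True},
    {"email": "c@corp.co", "source": "webinar", "became_pipeline": False},
    {"email": "d@delta.dev", "source": "webinar", "became_pipeline": True},
    {"email": "e@echo.org", "source": "paid_social", "became_pipeline": False},
]

def conversion_by_source(leads):
    """MQL -> pipeline conversion rate per source, plus the drop-off count."""
    stats = {}
    for lead in leads:
        s = stats.setdefault(lead["source"], {"mqls": 0, "pipeline": 0})
        s["mqls"] += 1
        s["pipeline"] += lead["became_pipeline"]  # True counts as 1
    for s in stats.values():
        s["rate"] = s["pipeline"] / s["mqls"]
        s["dropped"] = s["mqls"] - s["pipeline"]  # the leads that went nowhere
    return stats

print(conversion_by_source(leads))
```

The alarmingly low rates, like the 1.4% mentioned above, only surface when the `dropped` column is computed at all; volume-only reporting hides it.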
C
Yeah. To extend that point a little bit further, though, I do agree you want to be able to measure that very immediate cause and effect. Like, we do this thing and here is the measurable outcome. We pass prospects over and they become deals or they don't become deals. At the end of the day, the question you're really asking is, how are we going to get more revenue? It can't be only that, because you need to be able to then walk backwards through the moving parts and understand the individual cause-and-effect components. But in my opinion, it all must sort of roll back up to: and we're actually making more money.
A
Yeah.
C
Even in the pipeline scenario, like, building more pipeline that doesn't go anywhere is also bad. So you can look at those prospects that become pipeline, and you can understand what types of prospects, what's triggering those prospects, and which ones become pipeline more often. But you want to look at that, and then you want to look at, okay, we're building this pipeline, is it closing? Are we winning? And you have to have that greater view of the whole thing to be able to get to, let's really do more of that. And that's something you need to know. And everyone should be thinking about it that way. We're not talking about methodology here specifically, but really every go-to-market team should have some stake in the revenue number, too. I'm not specifically commenting on compensation targets or anything like that, but overall, it can't be, we got this number of leads. It has to be, we had a meaningful impact on reaching the revenue target for the year or whatever. And so teams need to be oriented around that, so they're going after this common goal rather than yelling at the top of their lungs about how great they did at driving leads. It can't be that.
B
Right. Or dragging yourself through the mud because you didn't reach your opportunity-creation goal for the quarter. It's like, absolutely, Trevor. It all speaks to the status quo. Right. Because finance and the overall business performance is being talked about in some rooms, and then you've got the go-to-market teams who are, you know, boots on the ground and in the mud and beholden to these status quo, like, performance metrics that are doing no one a favor when it comes to growing revenue. And there's this big, messy middle black box where you really don't know what's going on. It's like you're taking all your ducks, putting them in a bucket, and saying, look, we got 500 ducks. Like, no, you have 10 or 20 different types of ducks in there. Like, they all have their own unique, like, needs and experiences and outcomes, and they're different. Like, we don't have one go-to-market motion like maybe we had 10 years ago. Like, things were simpler. It's not that simple anymore. And we can't keep treating it like it's just simple and obvious and out of the box when it's not.
C
Yeah, I like the use of ducks as an analogy there. Just as a side note, I have a question.
A
I'm trying to, like, put myself in the perspective of a CEO who's sort of at the 30,000-foot view. Okay? So if my company is using, say, like, HubSpot or Marketo for marketing automation, and, like, theoretically we're capturing everything we need to capture in there. And then my sales team or, like, you know, my BDR team is using Gong, and we're tracking all of our activities in Gong, and then we track the sales cycle. Can't I get what I need from those tools? I know what my marketing team is doing in Marketo. I can track that. I know all of the activities that my sales team is doing. It's in Gong. I can track that. What's the problem here? Why can't I get what I need to know?
C
Yeah, we've sort of skirted around this in all the stuff we've said so far, sort of implied it. There isn't one metric; there is no one magic metric to all of this. So, like, that's one problem: we're not trying to go after this one thing. We are trying to build this sort of greater ecosystem of the story. You know, that's a word I like to use a lot in these kinds of conversations about this stuff: it is a story. We are trying to tell this sort of complex, nuanced story, and it doesn't all live in the same place. It's not one rolled-up number that says yes, we are doing well, or we are not. Like, yeah, I mean, you have that in, are we hitting our revenue target? But that doesn't explain all the stuff underneath it. And so it comes back to that sort of point-solution, self-serving measurement comment that we made earlier. They are incentivized to measure the thing that they do. So in those point solutions, sure, they're going to give you some interpretation of a certain piece of the story, but all that stuff doesn't live in the same system, and it's never going to. I mean, that would be a great dream. And, you know, maybe there was a time, maybe 10 years ago, when you might have believed that Salesforce was going to become that thing, you know, the platform that does it all and all your stuff lives there and whatever, and reality just hasn't shown that to be true. We're not there. I don't think that we're going to be there, because, you know, the minute you get there, someone thinks up something new and it's now out of the platform again. So that's a reality of what it is we all do here. It is this multifaceted thing. You're trying new things all the time. You're using new tools as the technology and sort of approaches, you know, concepts, methodologies change. So you end up cobbling stuff together. Some are doing it in spreadsheets; some have big BI analytics teams that are building big machines internally to do those things.
But yeah, it's just not going to be a couple reports in a Salesforce dashboard or your HubSpot dashboards. That'll be a piece of it. But then there'll be all these other things. And that's what we see.
B
So much good complexity and juiciness here. I think, like, one other thing that I would add on to what Trevor's saying: what I hear when I talk to CEOs is they want to know what happened with that paid social lead that they paid essentially, like, a thousand dollars for, whatever, based on how many they got that month. Like, they want to know what happened. And what they're getting is a volume report. Oh, we had 25 meetings booked for the month. And that's not what they want to know. They want to know, wait, why did this go nowhere? This looked so promising. They're in our ICP. What happened? And so I think where we're really failing is the silo, right? So we have marketing reporting on these metrics, we have sales reporting on their metrics, we have the sales cycle, we have revenue, but there's no thread that ties everything together for Sally Jane, who is this, like, golden nugget who came through paid social. Like, was that the first time she ever heard of us? Did she go to an event? Did she talk to sales and then read a white paper? Like, what happened? Like, why did she never make it into the opportunity pipeline? And that's what CEOs really want to know. Because pipeline is so expensive and everything's so costly now, and so difficult to, like, build pipeline, they really want to know, like, what the heck is happening with these individual leads. And it's frustrating to see that there's just, like, a drop-off and no real explanation. And so I think when it comes to science, that's what you want to be able to do: tell that story, as Trevor said. But there are also many individual stories, right? It's not just one zoomed-out story.
A
I think you said something really important, Amber, which is to bring it all together at the person level. Like, when we're thinking about this, we're looking at all of their signals or their interactions or, you know, the things they did, right? They clicked on this, they visited this site, blah, blah, blah. And then maybe an SDR, you know, had this call or this meeting, and then maybe this person is part of an opportunity. But what ends up happening is that none of this data ever gets joined at the contact level. And I think that's the biggest thing: if you're looking at these things in different systems, you're not connecting it down to the individual person.
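[Editor's note: the contact-level join Carolyn describes can be sketched in a few lines. This is a toy illustration only; the three event sources and their field names are invented, and real systems would need identity resolution beyond a simple email key.]

```python
from collections import defaultdict

# Toy event exports from three hypothetical, siloed systems, each keyed on email.
marketing_events = [
    {"email": "sally@acme.com", "ts": "2025-03-01", "event": "clicked paid-social ad"},
    {"email": "sally@acme.com", "ts": "2025-03-04", "event": "downloaded white paper"},
]
sales_events = [
    {"email": "sally@acme.com", "ts": "2025-03-06", "event": "SDR discovery call"},
]
crm_events = [
    {"email": "sally@acme.com", "ts": "2025-03-10", "event": "opportunity created"},
]

def person_timeline(*sources):
    """Merge events from every system into one chronological timeline per person."""
    timelines = defaultdict(list)
    for source in sources:
        for e in source:
            timelines[e["email"]].append((e["ts"], e["event"]))
    for events in timelines.values():
        events.sort()  # ISO date strings sort chronologically
    return dict(timelines)

print(person_timeline(marketing_events, sales_events, crm_events)["sally@acme.com"])
```

Once events are stitched per person, rolling people up to the account and asking "what was different about the journeys we won" becomes a query instead of a guess.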
C
That is absolutely true. You know, you want to get to that place where you're telling a story about the person. You want to get to a place where you're rolling multiple people up to the company that you're trying to sell to. You know, the assumption here is we're talking B2B. We're talking about multiple people involved in the process to get from a sort of raw target account to a new customer. And there's all these complex things that are happening. And yeah, they're in different systems, they're happening across different people. And the other thing I would call out is what you often see: people try to get a little more sophisticated and not just do volume measurements or whatever, and try to layer on the, air quotes, what drove it, in the context of trying to measure a paid search ad, as Amber put it. But I'm just going to restate it: it's not really that simple. It isn't. The question is not, what did that paid search ad drive? Like, what is it responsible for solely? That is the sort of, like, last-touch or some, you know, miscellaneous ad-specific attribution thing. There's plenty of these models out there: like, okay, we're going to look at just those ads, and then we're going to give it a 30-day lookback window or whatever, whatever, whatever. It starts to get more sophisticated; it feels more sophisticated. But you're still leaving out all the other things that happened. You're still trying to tell the story about, like, yeah, we got this lead, again, air quotes, from a LinkedIn ad. Like, sure, that was a piece of what that person did to engage. But it's rarely the whole story. You don't want to think of it as the whole story. So even just asking that question, like, how many leads did our LinkedIn ad drive, that's not even the whole story. And so, you know, I'm of course very passionate about that. It's dropping off a ton of other information about the bigger picture, that more complex story.
And I think where this becomes really problematic, even in orgs that try to get to that level of sophistication, is everyone's still sort of asking the question, like, we want this one simple answer. Just tell us this thing. A lot of stuff gets lost in translation because you have to dumb it down, you have to simplify, you have to try to fit that more complex story into this box of, the board is asking for this specific measurement, or my CEO is asking this very pointed question. And I think we all struggle to reconcile those things, where there's this need for the executive summary version of it. But when you try to force all that stuff into this box of the executive summary, you lose so much of the nuance, and you rarely succeed in telling a version of the story that's actually useful. So I think that's something that we're always playing around with, trying to continuously refine: how do you do that, and what information comes into play when you're talking about those things? That's probably a whole other episode in itself, because there's just so much there in trying to figure out how to go to these different audiences within the org and tell the right level of detail of story, even if you are sophisticated in pulling the data together.
A
Yeah. I think one of the biggest things when we're talking about, like, revenue visibility overall is, when you have revenue visibility, you have the ability to see not just what is working, like, oh, you know, did Sally Jane convert and become an opportunity, but, like, what about the hundred people that maybe engaged with that paid social ad or something and became a lead and then were passed to your SDR team, and then your SDR called them? But then we can see they made this many attempts, they called them this many times, and then they ended up being disqualified because they just, like, weren't in market or they didn't have budget or whatever the reason is. But you can now start to see the percentage of people from that campaign that made it through to pipeline and then made it through to, you know, like, closed-won revenue, but then this significant number of people that, like, dropped off and went nowhere. Right. So if we're talking about what's working and what's not, sometimes with the traditional status quo approach, you might have some breadcrumbs that show you what might be working, but it sort of, like, hides all of this, like, wastage that just isn't tracked.
B
Yeah. What's different about the deals that we won? Like, they came from, you know, the same place. Like, what was different about their journey? What did we do? Or what did they exhibit that was different about them? Did they move quicker? Did they come in at a different ACV? Like, there's just so much that you can glean that we absolutely do not see with the status quo. And it's causing huge pipeline leak for companies, huge revenue leak. So are we ready to kind of move on, to kind of zoom out and talk about this in a different way? We've been talking about the status quo a lot. One way that companies try to solve for this is to create their own homegrown iteration, sort of just trying to iterate on their existing broken system. And we call it homegrown because you use the resources that you have, or maybe you even bring in a consultant or something to, like, try to help you. But I'd love to know, Trevor, from you specifically, because you've seen so many of these systems across your illustrious career: how does layering new tech on top of a legacy system and process contribute to or compound this problem?
C
Yeah, organizational change is hard. We all know that. And you end up with a lot of bias. I mean, I'm not saying anything people don't already know. That's certainly obvious. What we find is that you rarely cut through it all. And so, you know, what we end up seeing is that even when you try to modernize, try to get more sophisticated, try to build something new, the way that you think about it today creeps back in pretty quickly. You end up having to take this great new idea and make it conform to how you think about things today. So instead of just, like, being willing to break it all temporarily and rebuild and build something better, you end up being dragged into the mud of, yeah, but this new thing also needs to do the old thing, and we need to be able to still measure the same thing that we did before. And it certainly drives a lot of less-than-ideal design decisions and so on. And how do you really solve that? I don't know that there's a good answer other than being willing to fight for the right thing, which is sort of a hard thing to do in practice, of course. Sure, you may have to build some things in parallel. I think we all have had to do that, and we do have to do that, but I think you got to be willing to fight. That's really what I would say to anybody that's out there trying to make this change. Sure, compromise is somewhat inevitable, but you got to be willing to fight pretty hard and explain why this new way needs to be the new way and shouldn't be aligned with the old way. It's just something that rarely happens well. So it's something that we certainly like to reinforce, and, I mean, we get ourselves into situations where it is a hard thing to do and a hard conversation to have and all that. And it would happen in-house as well. I think the in-house folks that are trying to build these things have it worse, because they have their own bias they don't even really realize anymore. We definitely see that.
I think we've all been guilty of it in our in-house roles. I'm not going to pretend that I've necessarily done any better. You just get beaten into this groove of: this is how we do things, this is what people internally are asking me for, therefore I can't break that. And at some point, I think, you have to be willing to break it.
A
Well, I said this in a previous episode, but I think so much of it is kind of twofold, right? Because it comes down to the philosophy that the C-suite or board or whatever really believes in. So in some ways it's really hard to break through a lot of legacy thinking if the CEO, for example, is really married to their way of thinking about things. And then of course it's really hard to change strategically how you do things if you don't have the KPIs to anchor on. So it's like, how do you build in those KPIs if the CEO is unwilling to change, or resistant to it? It becomes this loop that is really, really hard to break.
B
Yeah, I'm seeing more and more CEOs and CROs and growth leaders coming to me saying things like, let's just start from scratch. So I think there's a point where someone, your CEO, or maybe you're a founder-led company and the founder was really involved originally in your system and all these definitions. And that's great, and yes, I'm sure it worked for a period of time, but it's clearly not working right now, or shit wouldn't be hitting the fan for you like this. Right. And so there's a point where they just get so frustrated and they're like, you know what, forget it, let's just start from scratch. What do we need to do now in order to set ourselves up for success? And so I think, too, at a certain point in revenue operations, you have to have that grit to come in and be able to stand on your own and say, this is how this should be done. But there's just a real weak spot right now. Right. The market is crazy. A lot of people are just trying to keep their job, and respect for that. You want to do what you can within the constraints of what you have. And so rocking the boat in RevOps is never fun, especially in this kind of market. And so I absolutely empathize, and shout out to everyone who's going through similar situations. I know, I've been there, and we all have. But yeah, it really has to come to this breaking point. And so that's what I see in conversations with executive leadership, and I know we see it all the time, which is you've got to come to the realization on your own that your homegrown system is not working out and you've got to fix it. And whether that's bringing an expert in-house or hiring someone externally like Passetto to help you see what to do, you basically have to start from scratch. So, yeah, I've been seeing people asking for that more.
C
Yeah, yeah, you touched on something there that I just want to expand on. Inevitably, I think what we see, and I've seen it countless times, is that companies are pretty complacent about fixing these things until it's a crisis. Right. So really just echoing what Amber said: you sort of have to come to this big realization. What I would argue is, yes, that's how it usually goes down. Things become an emergency, then you come to the realization, and that's when you can push it through. Because it's sort of like, well, what else do we have to lose now? Things have gone badly. You see this in all sorts of go-to-market things, really. When the company is in this hyper-growth phase and they're just making money, nobody's worried about anything. Things can be sloppy, things can be inefficient and so on. But that's when you want to fix things. You have the money, you're not in crisis, you should take the time to think about the next phase, rather than waiting until the ship is going down and now we need to get out the buckets and bail. We don't want that. It's terrible, and it causes you to make rash decisions. Even when you are ready to make change, do it while you have the luxury of being able to do it.
A
Yeah, right.
B
What happens is either you're in the crisis and you say, okay, we actually have to fix this now, and how can we see ROI from that as soon as possible versus waiting six months or a year? Or you say, fuck it, let's just go throw some more tech stack at it, and let's layer in, you know, a GTM engineer or some other layer, just throwing paint at the wall, which actually makes the problem so much worse and just further delays you fixing it. And so those are the two paths you can basically go down.
A
I was going to ask, okay, so in this era of AI proliferation, we have AI tools for everything now. Right. And GTM is no exception. So what's to stop me as a CRO or GTM leader or whatever from just bolting on technology that claims AI is going to solve my thing? Like, let AI do the work. Is that a good idea? Can AI solve this for us?
C
Well, in my opinion, you know, AI has not solved the garbage-in, garbage-out problem. That has always been true about data. There are claims out there that it can; I don't believe them even a little bit. It cannot work. AI is very sophisticated math, but it's still math. It still has to work with facts that are real and usable. Right. So at a high level, that's one thing we'd say. Sure, AI has a place here. I certainly don't argue that AI is a solution to every problem. I think it has very, very good uses in very specific things, and the list of those things is broadening as the technology matures. But it's not just: put some kind of AI agent on top of your CRM and magically you know everything you could possibly ever need to know. That's not going to fix your problem if the data about certain parts of the journey, parts of the story, either doesn't exist, is incomplete, or is just wrong. And all of those scenarios exist in most organizations' world of data. Right. So you're just going to get a very sophisticated answer from an AI that is still wrong, that will be missing things and misleading. It can't know whether its data is complete or not; you're feeding it whatever data you've got. And so that's really the big thing. I think there are ways to use AI if you have complete data. Yes, AI can and should be used to say, hey, here are all the real facts, here's a very complete version of the raw story, can you summarize this for me? But even in that case, you're going to do a lot of prompt engineering. Don't just feed it and say, give me something. It's not going to be that simple. And I do think there are more productive ways to bolt AI onto go-to-market now. It is very good at things like hyper-personalization of outbound. That's a great use case. It makes sense.
You're feeding it some information about a person that you're collecting, and it's producing some written words. Those are things where it just makes sense for AI to do a pretty good job, like summarizing an account overall, recent support tickets, all that stuff. There are all sorts of things I could keep listing where I'm saying, yes, absolutely, those are great use cases. But solving this sort of black box problem? In simple terms, it's not a use case. There's so much that has to be true before you could really do that.
A
Yeah, we were just talking to a fairly large company that acknowledges, like, our data is really terrible in terms of what we're capturing and what we're able to measure. And they're wanting sort of a new lens to really inform how they're making investments in marketing specifically. And in one breath they acknowledge, ooh, we have some pretty terrible data, and we probably need some help cleaning up the wiring underneath everything. But in the same breath it's, oh, you know, we're exploring this great attribution platform, and it's great because it's got AI. We really like this idea that we can just ask AI a question and it's going to surface the answers. And I think a lot of people are very quick to think that they need attribution to help them. But I've been in that situation. I've leaned on attribution thinking it's going to give me the answers, only to rip it out a year later, because the real problem was the underlying data that's feeding it. And so I have a really firm stance that these tools, especially the ones that leverage AI, aren't going to fix this. It's like a medication, right? Medication oftentimes fixes the symptom; it doesn't fix the underlying problem. You've got to go deeper, right?
B
If the symptom is, let's book more meetings, then yeah, your automated outbound at scale might book you more meetings. But if the problem is a go-to-market problem, in terms of what are we offering, is there a fit in the market, is there a problem-solution fit, then automated outbound is not going to fix that for you. So you still have to do that.
A
Hard work. But temporarily, it might, yeah. That's a great analogy for sure.
B
Just to kind of wrap up the whole AI conversation, it's a super hot topic right now. And of course everyone wants to explore and really push the limits of how we can leverage AI in our organizations and how we become AI literate. And you absolutely should be doing that. I think one of the most impactful, strategic ways that companies can move forward with this is to really get an understanding of where you are right now. Like Trevor mentioned, what are those use cases in your revenue factory where you can plug in an AI use case and get ROI out of it immediately, versus a use case where it's really misleading? And of course we always have companies out there slinging value propositions around that are just really not true. So to set yourself up for success, you need to be able to see where you fit on this maturity scale with your data: how can you leverage AI and be successful, versus, don't go down that road, you're going to waste tons of time and money, you need to fix X, Y, Z first. I think that's what companies should really be investing in.
C
I agree. It all comes down to being thoughtful about the overall structure and approach of your whole go-to-market machine, your go-to-market factory. You're not just throwing things at the wall hoping they stick. A good go-to-market operator is thinking about all the moving parts together. And as we've been talking about this entire episode, it's also acknowledging that the parts you're thinking about, how you execute as a big picture, are then being measured in that same context, as one big story. It's not just randomly throwing things into a pile and hoping something good comes out of it. You want to be strategic about it, you want to be thoughtful, you want to understand how the parts work together. That's something that your individual functional practitioners, your paid ads person and so on, all these different roles, that's not their job to do. They should have some context into how their work plays into the bigger picture. But someone, or a functional group of people within your org, needs to be able to think about all the moving parts and how they come together. Some of that is a good, solid RevOps function. Some of it is your executive team acknowledging that they need to understand that stuff, and that there's not one answer to everything. So yeah, acknowledge the complexity and put the effort into understanding how it all fits together.
A
I think, being in the weeds with a lot of customers who are feeling this, specifically people in marketing, marketing leaders or demand gen leaders who are trying to improve their decision making using data, understanding where you are on the sliding scale of revenue visibility right now is a really good thing to have when you're trying to build a case for CEO buy-in. A hundred percent, I think you are going to be way more successful. If you're a change agent in your organization and you know something's broken, you're feeling the symptoms of that, I think you're going to be so much more effective than if you position this as one big, huge thing that potentially needs to be fixed. If you can surface, in a quantifiable fashion, here's where we sit relative to complete revenue visibility, we're a three out of, you know, ten or something like that, and here's exactly where our gaps are, it becomes a lot easier to build momentum around that. Because you know where you are, and you sort of have a compass for where you need to go, how long it's going to take you, who needs to be involved, et cetera.
C
Yeah.
A
So, sort of thinking about where we go from here, I have a question as we wrap this all up. If we all went back in-house, whether me as a CMO, or Amber as a head of RevOps, or same with you, Trevor, and the company has reached a pipeline or new-logo plateau, or their growth rate is decelerating, or whatever the scenario, what would be the first thing you would each prioritize? Amber, why don't you start and take it away?
B
Okay, awesome. Yeah. So coming in as head of RevOps, I would really start to unpack what is the true cost of our status quo. So you come in, and you're going to automatically be loaded up with a ton of reporting and day-to-day maintenance and ticket-desk type of stuff, for the most part, especially if you're a team of one or a few. And so I think it's about saying no to those things upfront, and just letting it break, seeing what breaks, and saying, hey, look, we're really responsible for helping grow and expand revenue at this business, and everything that is not tied to that is going to take a backseat. And you have to have really strong conviction about that, because it's so easy for all of your time to get eaten up just maintaining that status quo, as we talked about earlier, and then you have no time to really create the systems and feedback loops that help you be a strategic leader for the business. And so I would say you have to say no to those things, and you have to come to the table with a plan of: here's how much this is costing us in terms of person-hours, missed opportunities, et cetera, and here's why we're going to deprioritize those, until or unless something comes up that is really important to the business, so that we can work on this important visibility that we need and can't see right now. It's the classic example of focusing on what's important versus what's urgent. And a lot of teams, especially operators, just spend their days working on what is urgent.
C
So true.
A
Yeah, yeah, great answer.
C
Sort of piggybacking on Amber's great answer there: be very wary of just dealing with the thing that's screaming the loudest, which is really the point Amber's making, I think, and take the time. We've talked a lot this whole episode about how complicated and interconnected everything really is, and should be. And so even if things are screaming really loudly that they, quote unquote, need to be fixed right now, you probably don't want to do that until you have a really good sense of all the actual moving parts and start to put a plan together on how to fix those visibility gaps. Because until you've done that, you don't really know how important any one thing is. Now, that's the ideal world, where you take the time to step back and think about those things and get everything in order, acknowledging that in the real world you're probably still going to have to fight some random fires. That is the nature of RevOps, and you're never going to get away from it. But the goal should be, as much as possible, to take that step back and make sure that you can tell the whole story, or understand where you can't tell the whole story. And maybe that's really what I'm saying: you're probably not going to be able to tell the whole story. Go figure out which parts you can't figure out, and why, and those should probably be near the top of the list. How do we fill those gaps? How do we make sure that we can tell the whole story, so that later on we can know what really is truly important, in that tactical way where we're saying, okay, yes, this piece of the story is telling us this is important, rather than just guessing, or having some specific function feel a real strong need without a reason other than their own function's need.
A
Yeah, I feel like both yours and Amber's approaches are pretty similar in that way. I think I would honestly do two things simultaneously. On the one hand, if I'm overseeing a marketing team, I would make sure from an ops standpoint that we are really consistent and rigorous in tracking the things we are doing in the market, like making sure we are consistently using UTMs to track channels. I would make sure we were super formulaic and standardized in how we operate campaigns, so that literally everything we're doing in the market can be tracked back in the system. That's one thing I would pursue immediately. On the other hand, I would immediately foster a relationship with finance. It always comes back to the ROI of marketing; marketing is expensive. And so at a high level, I would really want to know our overall cost to produce pipeline and our cost to close new logos, and assess almost immediately how far we are from a reasonable range of what we would consider normal. Are we outside of that? Are we spending way too much, or are we under the benchmark range? I think that would guide some immediate decisions around our effectiveness overall.
C
Yeah, I think you've made a few good points there, Carolyn. We haven't really talked much in this episode about the finance components. We've sort of implied it in some of what we've said, but yeah, to put a finer point on it, that is part of the story. It's not just about what you're executing; it's what you're spending to do it, the overall investment and resources and all that. And of course, in previous episodes we've talked a lot about that piece of the puzzle, but I'm glad Carolyn brought it up a little more specifically here, because it is important. That topic is often overlooked, or lightly touched at best, with the occasional marketing budget review or whatever, but it should be more comprehensive than that.
A
Yeah.
C
The other point that you made, about coming in and making sure everything's trackable, or at least tracked as well as possible: through my journey in the marketing ops and RevOps world over many years, I've seen a lot of cases where it's an afterthought. It's, well, we've just got to get this thing out the door; it's more important that we execute, so let's just worry about the tracking and how it did later. Which, I think we all actually know, means that if you didn't think about it up front, you're probably not going to be able to track a whole lot after the fact. And so it's about building that culture of, no, we're going to take the time. If you are going to do something, if you're going to invest actual money, human resources, whatever, to execute this thing in go-to-market, you shouldn't do it until you have a sense for how you're going to measure what happened. You should be able to answer that question.
B
Which means that you throw less paint at the wall for the next 90 days. It might mean that you splatter a little bit less, but the experiments that you do run are pointed, and you can really speak to what happened.
C
Right. Sort of back to the science concept that we kicked off this episode with. Be scientific to the best of your ability. The scientific method is a thing, and you can apply it, at least in some ways, to this, and you'll definitely come out the other end with a more usable set of information.
A
Yeah.
B
And speaking of usable sets of information, I know we've got to wrap up here soon, but what we really just talked about in the last few minutes is a lot of visibility gaps and understanding AI readiness. So if you're at a point where you need to pinpoint the exact data gaps that are blocking your go-to-market clarity, where you want to eliminate the black hole between the activities that you're running and the revenue that you're seeing on the other side, we have a foundational step that you can take here at Passetto, which is called the Revenue Visibility Diagnostic. And what that looks like is really just ingesting your data and helping you see what we can see in terms of what's working and what's driving performance in your revenue pipelines, and then also helping you see how you can mature. And so if that's something you're interested in, definitely reach out to Carolyn and we can talk to you more about that.
A
Yeah, I love that plug. The thing is, too, it's a really fast, rapid way to get some initial insight, to give you a bit of a compass, directionally, on what our quick wins could be in the near future, but also, longer term, strategically and operationally speaking, what are some things the organization can prioritize. Thanks for dropping that in there. Otherwise, good chat. If you like the show, feel free to drop us a review; we really appreciate that. And give us a follow, say hello, and we'll see you all on the next show.
C
All right, cheers.
A
See y'all. Bye.
Air Date: September 1, 2025
Hosts: Carolyn Dilks (A), Trevor Gibson (C), Amber Williams (B) of Passetto
This episode explores the pervasive "pipeline black box" in B2B SaaS go-to-market (GTM) organizations: Why most teams operate without true visibility into what drives pipeline and revenue, why standard metrics like MQLs and new AI tools fail to solve the core issues, and how to rebuild systems and culture for granular, actionable revenue insights. The hosts candidly cut through common misconceptions, highlight operational and strategic pitfalls, and outline what leaders need to do to turn revenue into a "science" rather than an art based on gut feel or vanity metrics.
Q: “If you went in-house today as head of GTM, marketing, or RevOps, what’s first?”
If you are a CEO, CFO, or revenue leader ready to move beyond GTM guesswork and vanity dashboards, this episode delivers a candid roadmap to rebuilding true revenue clarity—no silver bullets, just modern, actionable steps.