
Loading summary
Jason Packer
Foreign.
Analytics Power Hour Intro/Outro
Analytics topics covered conversationally and sometimes with explicit language.
Michael Helbling
Hi, everybody. Welcome. It's the Analytics Power Hour. This is episode 293. Okay, listen, we draw a hard line on this show. We don't talk about tools.
Jason Packer
We.
Michael Helbling
But we never said anything about tool selection. And let's be honest, we have all been there trying to figure out which vendor to go with after putting in tons of effort into our carefully crafted spreadsheet with all the selection criteria, which somehow every vendor says, yes, they can absolutely do all the stuff on there. It's enough to make a person cynical. And we analysts don't need help with that. So take a pause from reading the cold sales emails from the latest analytics AI SaaS vendor and let's talk about the ins and outs of selecting a tool. But first, let me introduce my co host, Tim Wilson. Or as I like to call you, a Tim Tool Selection Wilson. No. How you doing, Tim?
Tim Wilson
I'm just about ready to select a new podcast recording platform, so.
Michael Helbling
Perfect. That's going to probably trigger a bunch of inbound emails. All right.
Jason Packer
And Moe Kiss.
Michael Helbling
How you going? I know you do a lot of vendor evaluation and selection in your role.
Mo
I certainly do. I'm very pumped to talk about this. I think the only thing you missed is the like, oh, don't worry if we can't do it yet. It's on our roadmap.
Michael Helbling
Oh, yeah, that's our Q1 roadmap. Don't worry.
Tim Wilson
That happened yesterday at a client. They turned to the vendor and they're like, yeah, we can't do that. And the response from the client, which was a very large company, was like, well, is it on your roadmap?
Michael Helbling
And it's on our Q1 roadmap. Yeah.
Mo
Yep.
Michael Helbling
All right. And I'm Michael Helbling, and we wanted to bring on a guest, and we found a great one. Jason Packer is the founder of Quantable Analytics, an analytics consultancy focused on analytics engineering and implementation. He's also the author of the book Google Analytics Alternatives, now in its second edition, and the genius behind the measure-music channel on the Measure Chat Slack group. And now he is our guest. Welcome to the show, Jason.
Jason Packer
Thanks, Michael. Really happy to be here. You know, it's a bucket list item to finally make it on the podcast.
Michael Helbling
Well, it's awesome to have you. All right, so maybe to kick things off, Jason, maybe just walk us through sort of what was behind the idea of writing the book in the first place.
Jason Packer
Yeah. So, you know, I've always been really interested in evaluating software and knowing what's out there, even back to my early days as a UNIX administrator and software developer. I liked looking at all the different tools. And back in the era when the Google Universal Analytics sunset was coming up, there were a lot of people asking these questions. There were a lot of people asking me these questions. And so I thought, well, I may as well start doing this research. Seems like a fun thing to do. I started out thinking, well, maybe I'll write a series of blog posts. And then someone at the Columbus Web Analytics Wednesday at the time said, well, why don't you just write a book, Jason? And that seemed like a good idea to me. And so I did it. And now, a few years later, some things have changed, there are some new tools I wanted to look at, and I thought I would just make the same mistake again. So here we are.
Tim Wilson
Wait, who was it at Web Analytics Wednesday who said that?
Jason Packer
It was Ahmad. Ahmad.
Tim Wilson
Oh, okay, nice.
Jason Packer
Which I think I credit him for in the first book, at least. For the idea.
Tim Wilson
Look, I read it. I didn't memorize the acknowledgments.
Jason Packer
Geez, come on, Tim.
Mo
But it sounds, Jason, like a big part of your process, and of understanding the capabilities of a tool, is really playing with it, right? And I think one of the things that I'm often thinking about is, I see folks trying to evaluate tools without getting their hands dirty. So do you think that's what everyone should be doing, or is that just the thing that's always worked for you?
Jason Packer
Well, I think everybody loves to have an opinion about a tool, and it's very easy to form an opinion, right? You get in there, you see how it looks and how it feels, and that's fine. I mean, I have opinions about that too, but you really have to balance that against really learning what the tool is about. And for me, the way to do that is to use it, and to use it with real data. Not to watch videos about it, not to be walked through a demo by somebody, but to install it on a website. Even if it's just a trivial website, install it and use it. That's how I learn best. That's how I learn most quickly.
Mo
And do you think that using it with real data, the bit that I'm kind of taking away from that is that it helps you understand it, but how do you think it changes the evaluation process itself?
Jason Packer
I think using real data will show you a lot more about where the issues are. For example, if you're working with a vendor and they walk you through it, they're going to show you the highlights, they're going to show you the things that work well. They're going to show you a tool that's completely, perfectly set up. And we all know that's not how it is, right? In the book, everything that I evaluate I used on real websites with real user data. And so, for example, one of those real websites had a terrible bot problem. It was a site that I bought on the secondary market. I didn't make the website, I just bought it. And so it had some real traffic, but it was just littered with bots. The traffic looked really weird, and there were all kinds of strange hits to pages that weren't there. But that led me to learn a lot about how these different tools worked in cases where there are a bunch of 404s or huge amounts of bot traffic. No vendor will ever show you a demo where 90% of the traffic was bots. That'd be crazy. And in some ways it can be challenging to do that, because that might not be your use case. So a lot of what I talk about in the book is use case match. That's the challenge as a tool evaluator: to match the constraints of your use case to the best-fitting tool. And like I said, opinions, everybody's got them. There are ways in which some tools are more technically advanced than others, or faster than others, or whatever. But it's really about matching use case to tool through the lens of those constraints.
Tim Wilson
Back to using the actual data. So the book was kind of digital analytics, product analytics stuff. I would put BI platforms in there, data warehouse platforms, all of those, when it's like, you want to try it with your data. I mean, a really high bar, or a real challenge, seems to be: we want to do a bake-off, or we want to do a proof of concept, we want to try it out. And I've gone through processes where it's like, we're going to do the RFPs, we're going to select some finalists, we're then going to do a bake-off. And that does mean you're fundamentally doing some sort of mini implementation and trying to draw the line of, you know, and that can include getting through some compliance hurdles to say, yeah, we're using our real data. Or do you say, well, we're going to dummy it up, we're going to make an effort so it's kind of like our data, but it's been anonymized to the point that it's not our data, yet it still mimics our data enough that we could actually try it in this platform. It does seem to me like that's what motivates a lot of companies not wanting to go through that process.
Jason Packer
Like, ideally it would be great to use your actual data and to do like you say, a real mini implementation, but that's just not feasible in a lot of cases.
Tim Wilson
I mean, Mo, have you done that?
Mo
Yeah, like, I'm not going to beat around the bush. I do a lot of analysis of different vendors and different tools and that sort of stuff. I would say I definitely lean towards the, we should do multiple POCs. The last major tool selection we did, I think I wanted to do maybe four POCs. And obviously that's a negotiation with the business and capacity and things like that. We ended up agreeing on two. But I think the thing that I found really hard is, often for the folks doing the evaluation and the assessment and those sorts of things, I don't know if the incentives are always there to do multiple POCs. And I find that hard to reconcile, because it's really hard to understand how good a feature is, or a particular capability that you're looking for, without stress testing it. And yeah, I don't know if I just err too far on the POC side, maybe. I think folks internally would probably say I do.
Jason Packer
I think that's really challenging. Right? Because a POC is great, but even before you get to that POC, you want to feel like you've narrowed it down to something that's worth the effort. And for me, part of that can be not even doing a real POC, but doing a toy test. Like, oh, let me do it with my podcast website, let me do it with my personal website or whatever. And that's part of the reason why I'm a big proponent of a free tier, even on enterprise tools. And that can be a challenge, right? Not everybody can offer that. And sometimes, if you're talking about a huge BI platform or something, what would a free tier even mean, if even doing a simple example implementation means putting in 100 hours of work or something? But that ability to get a little bit into the product before you really start talking about committing company resources to it. Because I do love the POC approach, and the more the better, but it can be hard to get those resources, for sure.
Mo
And also, just getting it through security is a really big step. You're basically doing a procurement process for something that you're running a POC on. And yeah, it takes a lot of time and energy. But I mean, I obviously am very biased here because I lean strongly on the side that that's worth it. And that's my lived experience.
Tim Wilson
Well, but Mo, have you run into, because I can see the downside. So say it's only two. You get down to two tools, and you've got multiple people who are all kind of trying it, and they all have kind of different things they most care about. And then you get to the end of that and you're like, all we've done is allowed people to dig their heels in further on their preferred tools, because now they have hard evidence that that other tool doesn't do this thing that I think is really important, and it does this thing. Do you wind up saying, well, we're hoping that we arrive at a clear winner, but even if you do a POC of four tools, there's still not the one clear winner, and you're still in kind of a negotiating phase? And you're also setting up the people who didn't back the ultimate winner to be able to say, see, we did the POC, and I told you we shouldn't have picked that one. Sorry, that's just depressing me.
Mo
No, no, no. I can still remember, a few years ago we were doing a BI tool selection. It must have been like five years ago. And all the data analysts got in a room, and this was the absolute worst way to do it, I would never ever do this, but we were like, how important is this thing to you? And everyone would go to one side of the room or the other side of the room, and almost every time I was on a side of the room on my own. So suffice to say, we did not pick the tool that I wanted. But it is what it is. I think the thing that I find so difficult about data tools in particular, and I know we had Colin on previously from Omni talking about how, especially with BI tools, you're trying to be many things to many different people. What's so challenging about data tools is that data folks have very strong opinions about the things that they do and don't want to work with. But also their opinions are normally representing what is best for them, and not always what is best for the business. And that's human nature, right? You think about what's going to make your own job easier. And so I often come at this with the perspective that a data tool is actually for our stakeholders. So even if it's a little bit trickier, a little bit harder for us in our day to day, is it going to help our stakeholders have a better relationship with data? Because I will trade for that. But I'm not sure that's necessarily a common view.
Tim Wilson
Michael, what's your relationship status with SQL?
Michael Helbling
Oh, I think you know, it's complicated. It keeps gaslighting me with a "syntax error near FROM." Like, I don't know where FROM lives.
Tim Wilson
Well, here's a healthier relationship: Prism by Ask-Y. You ask in plain English, Prism writes the SQL.
Michael Helbling
Ooh, like, revenue by channel, week over week, excluding refunds. And instead of me crafting a 47-line query and a three-line apology, Prism just does it.
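For readers curious what that joked-about request actually involves, here is a minimal sketch in pandas. The table and column names are purely illustrative, not anything from the show or from Prism:

```python
import pandas as pd

# Hypothetical order-level data; column names are illustrative.
orders = pd.DataFrame({
    "order_date": pd.to_datetime([
        "2024-01-01", "2024-01-02", "2024-01-08",
        "2024-01-09", "2024-01-03", "2024-01-10",
    ]),
    "channel": ["paid", "paid", "paid", "paid", "organic", "organic"],
    "revenue": [100.0, 50.0, 120.0, 60.0, 80.0, 90.0],
    "is_refund": [False, False, False, True, False, False],
})

# 1. Exclude refunds.
valid = orders.loc[~orders["is_refund"]].copy()

# 2. Bucket each order into its Monday-based week start.
valid["week_start"] = valid["order_date"] - pd.to_timedelta(
    valid["order_date"].dt.dayofweek, unit="D"
)

# 3. Revenue by channel and week, then week-over-week change per channel.
weekly = (
    valid.groupby(["channel", "week_start"], as_index=False)["revenue"]
    .sum()
    .sort_values(["channel", "week_start"])
)
weekly["wow_change"] = weekly.groupby("channel")["revenue"].diff()
print(weekly)
```

The equivalent SQL is a `GROUP BY` over a week-truncated date plus a `LAG` window function, which is where the 47 lines tend to come from.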
Tim Wilson
That's right. And the best part? It doesn't forget everything the moment you close the tab. Prism's jam? Memory. It remembers your realities, your definitions, your quirks. I mean, not your personality ones, but, you know, your coding quirks.
Michael Helbling
Well, but like, the BigQuery table is the source of truth, and conversion means this, and not whatever gets decided by somebody mid-meeting somewhere.
Tim Wilson
Exactly. So you don't have to re explain your business context like it's a bedtime story for robots.
Michael Helbling
Yeah, I have to admit I'm a little tired of starting every session with "previously, on Analytics." And when Prism generates
Tim Wilson
SQL, you get traceability. You can track changes, see what was created, and follow the logic.
Michael Helbling
I like that, because when somebody asks me where this number came from, I can stop saying, whoa, from the number tree.
Tim Wilson
It's like version control for your analytics brain.
Michael Helbling
I like it. A little bit of accountability, but it's convenient.
Tim Wilson
That's right. So do you want in? Go to Ask-Y AI and join the waitlist. That's ask, dash, the letter Y. And use code APH to go to the top of that waitlist.
Michael Helbling
I like the idea of letting AI write some of the SQL and let
Tim Wilson
your memory do literally anything else.
Jason Packer
No, and I think, you know, everybody also wants to work with the tool that's good for them personally as well. Right? Like this idea of, hey, I want to work with the new tool, I want to work with the cool tool, I want to work with the tool that's good for my career, I want to work with the tool that my LinkedIn posts are going to go viral with. And a lot of times that's not the right fit. So it's really about the whole organization, not just the analyst. But a lot of times the analyst isn't even really the one.
Michael Helbling
Flip it around. And people want to work with a tool they're familiar with. Like, I used this in my last job, so I want to use it here.
Jason Packer
Which was good.
Tim Wilson
When GA4 came out and Universal Analytics got sunset, then it was like, well, nobody's familiar with it, so, reset.
Jason Packer
Yeah, yeah, that's what I was going to say too: a lot of times a tool switch is not the right answer. We all like to think, hey, there's a perfect tool out there that's going to fix my problems, make my personal life better, make my company do better, etc., etc. But there's no perfect tool. The grass, you know, just looks greener. A lot of times the tool you have now just isn't implemented correctly, and the new one you get isn't going to be implemented correctly either. So that can be a real challenge too. Especially if you're like, hey, we're going to do two POCs and put in all these resources, and in the end we're going to say, oh, well, actually, I think the answer is that we stick with what we've got and we just spend a little more time trying to improve our implementation. Nobody wants that answer.
Mo
In your experience, talk me through when there are trade-offs, right? Like we've all said, no tool is going to meet the brief perfectly. How have you approached balancing those trade-offs? What's your thinking, and when you're working with businesses, how do you convince them of the trade-offs they should make versus shouldn't?
Jason Packer
Yeah, it's really difficult, because how I evaluate the tools for the book is a totally different mindset than how I think when I'm talking to an organization. A lot of times I won't even really be talking about the same things, right? In the book I talk about the underlying tracking structure of different tools, the databases the different tools use, how they work with consent, and things like that. When I'm talking to a particular business, I listen for what their real pain points are. Is this an organization that just needs to get off of GA because of compliance issues? Then I focus their selection on solving those pain points as directly as possible, while also trying not to get into the weeds with them about the details of the tools that the people listening to this might find interesting, because they're not going to find that interesting.
Tim Wilson
Well, but I mean, I think you can kind of mix, because part of what you did, and maybe it's worth having you walk through it. What I loved about both editions, because the structure stayed the same, is that the tool-by-tool, blow-by-blow, and it's not a feature-by-feature, but the tool-by-tool
Jason Packer
kind of
Tim Wilson
write-ups are the second half of the book. The first half of the book is, you've got to have kind of a framework of what matters to you. And you admitted throughout, you're like, there is no perfect categorization. But like you just talked about, one of those was the tracking methods. And I could see, for the right company, they would say, well, we've been getting burned by our current tracking method and we have got to find something. And you're like, cool, well, then let's think about the philosophical differences between the tools. If somebody else says, we just need something super cheap, it's like, okay, well, then let's talk about the nature of your digital experience and the different pricing models. If somebody says, we just have to get off of GA because of compliance, we actually love everything about it, it's just that our compliance team has said we have to get off of it. And so I would say, in the example you just gave, that was actually kind of how you approached the book. It's just, where are you going deeper? Understanding what attributes truly matter and then going deeper, right?
Jason Packer
Yeah, I think actually that's fair. One of the things I've talked about is how it's all about constraints, right? And how price is a constraint. Price is a really important thing for organizations; it's not the coolest thing to talk about when it comes to tooling. And similarly, it's just a question of how you're engaging with the decision makers, I guess. So the things in that first half of the book are a long list of the things that I think about. When talking to a particular organization about a tool, I might not be talking to them about all those things, but I'm certainly thinking about a lot of them, and I think it's important to understand them to a certain degree. For example, in the new edition there's a chapter on server-side. Obviously I'm not going to teach someone everything about server-side analytics in a 3,000-word chapter of a book that's not primarily about that. But understanding at least enough that, when a vendor says, oh yeah, we support server-side, it's easy, this is what you do, you're able to interpret what they're saying. To know, oh, well, really, kind of anybody can do server-side. It's not really about the tool, it's more about the deployment. Oh, are you using server-side GTM to deploy that? And if you are, then this, and perhaps the real underlying problem is tracker blockers or something like that. And then your lens for viewing that is different. So that's why I think the first half of the book, the guide part rather than the product evaluations, is the lens through which I look at all product evaluations, and I'm trying to share that viewpoint in the first half. So Tim liked it, at least.
Tim Wilson
Can I ask, and this is probably also a question for multiple people: you said in some of the earlier discussion that you explicitly did not talk to the vendors, even though, especially after the first edition, they knew you were doing the second edition and they're like, come on, just let our sales engineer help you out, you know, once you just understand. And I think you did that to say, I want a level playing field, and I need to finish this book at some point. And if doing 15 POCs is tough, letting their sales teams get their hooks into you would be absolutely impossible. Whereas.
Jason Packer
Yeah.
Tim Wilson
And so, whereas, Mo, I feel like if you're down to a couple, where does the sales piece play in? So I don't know. Maybe, Jason, you can talk through that.
Jason Packer
I mean, yeah, that's sort of an unusual choice that I make in the book. I mean, I definitely have talked to, and I know, a lot of really great people at a lot of these vendors. Especially after the first edition, I've talked to a lot of these people, and there's a lot of.
Tim Wilson
A lot of them told you what you got wrong.
Jason Packer
Not as many as you'd think. Some. Yeah. But, you know, it's important to me that I was really, really fair, more than I was particularly making any value judgments or anything like that. But the not engaging with them is about leveling the playing field to some degree. It also fits in well with how I learn: learning by doing. Again, getting a demo account, or some kind of account where I can use the product, is the fastest way for me to learn, rather than being on sales engineering calls. But that was my case for writing the book. That's different than the typical case of engaging with vendors from a large org that has specific needs. And I think engaging with the vendor reps can be really, really helpful. It also gives you an idea of the culture fit between the product and your organization, which is a real thing, and something that when I started the first edition of the book I didn't expect to be so important, but is, I think, quite important.
Michael Helbling
Do you hear that, Tim? Culture. Very important. I just wanted to reiterate that point really quickly. Sorry, keep going to Mo.
Mo
Just to add to that: I have personally found that engaging with sales, engineering, support, whatever, is a really big part of the process, because I want to make sure that we can learn from their expertise, that we're not facing challenges that are very easily fixed. And I think part of evening the playing field is making sure that you get that with all the companies that you're POCing, that it's not a favorites game. And you're so right, Jason. Such a big part of it is about the culture, or the ways of working, that you then get to explore with that other company. And very transparently, I've talked about our relationship with Snowflake quite a bit. A big part of our success, and I will rail on about implementation for years to come, but a big part of it is that we've had really close relationships with their product teams, with their product managers, their tech leads. We will have calls testing out new features and new functionality, and be able to influence their roadmap. That is a hugely important thing for us when we're doing vendor selection, because we want to make sure that in a year's time we have the kind of relationship where we can push their product if we need to. I think that letting those folks in the room so that we can stress test each other is a big part of the evaluation for me.
Jason Packer
Yeah, I agree with that. And again, it depends, like I was saying, on your organization and why you're buying the thing to start with. The thing that I hate is: you're at a tiny startup and you're talking to an enterprise software provider, and you get to the point where, okay, we're ready to actually talk some real prices, and it's, okay, well, for your data volume, we're starting out at $65,000 a month. And you're like, what are you talking about? That's my entire yearly budget for all of my analytics. So, I love transparency. I make that pretty clear in the book. That's just a great thing, to get people on the same page as quickly as possible, because that's super important. And when you are engaging with the vendors, being transparent with them helps everybody. Nobody wants to seem like a dummy when they're talking to a vendor. But if it's a new tool, I don't know the tool, right? They know the tool, they know their immediate competitors, far better than I will. So I try to be very direct about, hey, the budget is this. And here's my seemingly very stupid question, and when you gave me an answer I didn't understand, I'm going to just ask that stupid question again. Because it's important to everybody that we find the best fit in the most direct way possible.
Mo
And I do think, obviously, I come from a place of absolute tech privilege. I think about that a lot, all the time.
Tim Wilson
We're like, well
Mo
Anyway, I just want to be conscious that other folks have very different budget constraints when it comes to tool selection and things like that. But there are things I will die on a hill for, and one of them is: obviously budget is incredibly important, but if there is a very good tool, and it is not like 10x the other options, but it is a good fit, I personally think that is a fight worth having with the business. Getting support for that extra budget to make the right tool decision, versus being so constrained by it that you make a really, really shitty choice. And again, not everyone's in that position. But that situation you just described, Jason, knowing the prices much earlier in the process is absolutely something that folks should be doing. You can't wait till you've done a POC to start getting an idea of their pricing, because if it is way out of the realm of possibility, you don't want to waste your time and energy on it.
Jason Packer
Yeah. And again, I guess I haven't thrown Google under the bus yet. There's always. Here we go. Here we go. One of the things that Google really made hard is that, with Universal, they made a pretty darn good product and they made it free to just an incredible degree. It was technically free to 10 million hits a month, and in reality it was quite a bit higher than that. And with GA4, of course, there's no hard event limit; the 1 million events per day export limit to BigQuery is probably the thing that people hit first. But still, they're giving away so much for free, and that's really caused people in the industry to think that analytics should be basically free, that the software should be free. And that's very distorting. It makes things really hard for new tools to come out there and get a foothold in the market. And it makes what you're saying, Mo, as far as, hey, we need to understand that even if this tool is a little bit more money, think of the cost to people, the cost in data decisions in the organization. You're really undervaluing analytics. And part of undervaluing analytics started with UA being free, and that's still happening.
Mo
Oh, Jason, I feel like we could sit around and chat for hours, because I think fundamentally one of the biggest mistakes I see is, yes, folks want to work on cool shit to put on their resume or LinkedIn or whatever, but it's also the open source fallacy, or the free fallacy, which is: oh, this is open source, or it's free, it's not going to cost us anything. And I will push pretty heavily on, that does not mean it's free. We need to actually think: we're talking about a solution here that has five full-time engineers supporting it. That is not free. To me that is actually a huge cost to the business. And if we want to do that because we think that's the right decision, that's okay. But that needs to be a line item in our decision as well, not just the on-paper cost of the tool.
Tim Wilson
Well, doesn't that also then extend to: if we're going to have to support it, and we have those five engineers, and one of those engineers leaves, what's the size of the pool of candidates that we're going to have to replace them from? Which is one of those areas where market leaders tend to have a leg up. And it's a legitimate leg up: they've achieved some critical mass. You know, nobody got fired for buying Tableau or Power BI, and part of that is because everybody's kind of been exposed and is familiar. But it's also legitimate to say, well, if I need a Power BI developer, that's a much larger pool to draw from. Right?
Michael Helbling
Yeah. And Mo, excitingly, get ready for a new wave of that with AI, because now people are going to be like, it's free, we can just build it with AI. Which actually is a question for you too, Jason, from your perspective, because obviously we've all been through vendor selection processes, and we touched on how Google Analytics is free. So as people are jumping on the AI bandwagon and seeing how easy it is to prototype things, not necessarily build full-on products yet, but we're moving in that direction, I would say: do you think the build versus buy debate suddenly changes a lot in the future?
Jason Packer
Yeah, I think so. I think that's already happening. It's happened not exactly with AI, but in the simplified realm of these tools. In the book I call them simplified web analytics tools, including things like Plausible and Fathom; Umami is also quite popular. These tools are extremely simple. Most of them are cookieless. They use something like an IP plus user agent hash to create sessions. And there are so many of those tools out there now. Every few months a new one comes out. Some of them are quite good. I mean, it's not a hard thing to prototype, and it's even easier with AI. I could go in, especially if you start with the ones that are open source, and be like, hey, build me an Umami clone. Here's the Umami GitHub repo, have at it. And you could build yourself something like that pretty quickly. And some people are doing that. I think you run into some of the problems that Tim was talking about as far as having expertise with the tool. If it's some internal tool, then the internal people are going to be the only ones that have experience with it. And also, when it comes to some of the more complicated underlying database things, that's, I think, far beyond the complexity of what AI can do a good job with. And things like schema, AI doesn't do a great job with either. It can do okay, but it needs a lot of human handholding. So I think it's ultimately a mistake for most organizations to think they can build their own when there are so many really great platforms already out there. It's maybe fine to think, oh, we're going to extend what we're using. Say we're using PostHog, and PostHog is an open source tool; it's one of the widest tools in the market. They have 34 different apps built into the tool.
Whether you want session recording or feature flags or LLM analytics, they've probably got it. And if there's something you need to add on top of that, then using AI to extend it, I think, makes sense. But trying to build the foundation of your analytics platform, your analytics practice, without the really strong attention to detail that these platforms that have been out there and tested have? It doesn't make sense, whether it's humans building it, or even worse if AI is building it. I think it makes a lot more sense to build on some of the great tools that are already out there.
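[Editor's note: the cookieless sessionization Jason mentions, an IP plus user agent hash, can be sketched in a few lines. The daily-rotating salt, the SHA-256 choice, and the field order below are illustrative assumptions modeled on how simplified tools publicly describe this approach, not any specific vendor's actual implementation.]

```python
import hashlib
import secrets
from datetime import date

# Daily-rotating salt: regenerated once per day, so the same visitor
# gets a different ID tomorrow. This is what makes the scheme both
# cookieless and harder to use for cross-day tracking.
_daily_salt: dict[str, bytes] = {}

def get_daily_salt() -> bytes:
    today = date.today().isoformat()
    if today not in _daily_salt:
        _daily_salt.clear()  # drop yesterday's salt entirely
        _daily_salt[today] = secrets.token_bytes(16)
    return _daily_salt[today]

def visitor_id(ip: str, user_agent: str, site: str) -> str:
    """Stable-for-one-day visitor identifier with no cookie."""
    h = hashlib.sha256()
    h.update(get_daily_salt())
    h.update(site.encode())
    h.update(ip.encode())
    h.update(user_agent.encode())
    return h.hexdigest()[:16]

# Two hits from the same browser on the same day collapse to one visitor;
# a different IP produces a different visitor.
a = visitor_id("203.0.113.7", "Mozilla/5.0 (X11; Linux x86_64)", "example.com")
b = visitor_id("203.0.113.7", "Mozilla/5.0 (X11; Linux x86_64)", "example.com")
c = visitor_id("198.51.100.2", "Mozilla/5.0 (X11; Linux x86_64)", "example.com")
```

The trade-off, as discussed above, is simplicity and privacy in exchange for accuracy: shared IPs (offices, carrier NAT) collapse into one visitor, and nobody can be counted across days.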
Michael Helbling
That gave me a cool weekend idea though.
Jason Packer
Umami clone coming up.
Michael Helbling
Yeah, yeah, just, just because you can.
Tim Wilson
I don't know, but as you brought up PostHog, because that's one I'm not familiar with. You mentioned them, and actually I think I sat and watched you talking to a different vendor and you brought them up a lot, as being kind of like: who was the tool built by and for? PostHog is built by and for the developer. Google Analytics, Universal Analytics, was kind of intended for the more casual user. I mean, initially they tried, and they tried to stick with it: this is easy, this is for the marketer. BI platforms seem similar. If you're in Power BI, you're in the Microsoft stack; you're like, this is built for the enterprise that wants the complete ecosystem and progression of tools. So, which I think you've said here, and you definitely say in the book: what is the philosophy? What's in the DNA of the company that's building it? Who do they feel is the user that has primacy?
Jason Packer
Right.
Tim Wilson
I mean, is that a fair.
Jason Packer
Absolutely. Yeah. You and I have talked about that, right? That product philosophy and outlook is more important than any sort of feature comparison. Probably some people have heard me complain about this before, but feature comparisons are helpful in some ways, yet they're already outdated by the time you've posted your feature comparison list. And the one checklist item that says it does X? There's so much to unpack underneath that, that their version of X might not be what you really think you're getting when you get that feature. And people want more features. I just talked about PostHog; they've got every feature under the sun. They've got session capture, but maybe you already have session capture. You're already running Microsoft Clarity or Hotjar or something like that. So while that looks like a great thing on a feature comparison list, it's not something you need, or not something that's going to help you. It's going to add confusion to the product. The more features you add onto a product, the harder it can be to use. That's just the way these suites work. But it can be really hard, at the same time, to peel back that layer of marketing a little bit and ask: what is really this product's philosophy? Who is this for? When you're an analyst and you get in front of, I don't know, Adobe Analytics Workspace, you're like, oh, okay, I get it. This is for me. This was written by people that have been listening to people like me. This makes sense to me and is useful to my use case. That is clearer.
And a lot of times, once you get in and use the tool, like I was saying before, you can tell. But when you're just looking at the marketing, it can be like, well, both platforms say they have the ability to customize reports, done, they're the same, when they can be wildly different.
Mo
Talk to me about this philosophy piece, because I find this really interesting, the philosophy of the platforms. And I think maybe I was alluding to a similar idea before. But how do people figure that out: the philosophy, who the user is, or the direction they want to take it?
Jason Packer
I think it's not easy, because in a lot of ways I think the vendors themselves don't know a lot of the time. It's a product of the history of the company. It's a product of what the target market of that tool is. And it's a product of the people that built it and who they were listening to when they built it. So let's take, since we're talking about Plausible and some of the simplified tools: they have a clear philosophy of, this is a simple tool. There's not going to be vast ability to customize reporting. Everything, or pretty much everything, is going to be on one screen. We're not going to drown you in configuration options. This is designed to be simple. And part of that philosophy is privacy as well: you're giving up some amount of complexity to make it easier, and perhaps to make it more private too. That's a philosophy. And I think that philosophy, for them, can be pretty clear, right? When you look at their marketing, when you try out a simple product, it can be pretty easy to understand their philosophy when they communicate it well and it's not a sprawling platform with 27 different components or whatever. Versus some of the more complicated tools. I talk about Piano in my comprehensive category, along with tools like Adobe. If you're looking at either of those tools, they offer so many different features and so much functionality, and there's a much more complicated onboarding process, that it can be really hard to understand what the philosophy is until you get much further along in the process. I do think that talking to the vendor and engaging with them before you get too far along can help you understand that, but it can also confuse the process too. So, I mean, I don't know that I have a real great answer.
Tim Wilson
Well, I like that, because you said you have to sort of look at the roots of the tool, where the company came from. And I don't have a million examples, but look at the BI space. You had Tableau, which was one of the second generation of tools, that clearly was like: this shit should be drag and drop, and we should be able to customize it to conform to things that Stephen Few would give a 10 out of 10 to. So they were coming at it saying it's got to be a drag-and-drop WYSIWYG interface that you can highly customize to produce very, very clean visuals. They were a BI tool philosophically forward on the quality of the visualization. Contrast that with Domo, which comes along a number of years later, and I would say Domo was saying: no, no, no, it's all about the ease of connecting to all of your data sources. And they kind of led with the connectors. And now they're competing with each other. So over time, Domo's sales teams are saying we're losing deals because our visualizations are shitty, and Tableau is getting pushback saying we're losing deals because we're not easy to connect to all these different things. I think they are not necessarily permanently handicapped, but I would say that's where both of those tools' various strengths and weaknesses are. Which means if I'm looking at a BI platform and those two are in the consideration set, I may be thinking: is most of my stuff going into a data warehouse, pretty normalized, where I want to hook into it and only occasionally hook into something else? Or are we going to live in a chaotic world where I'm always going to need to hook into a gazillion different messy data sources and do transformation within the tool? I think it does take a lot
of maturity or wisdom or thought to try to map which tool's philosophical or historical underpinnings are most aligned with my company's needs. And then stand up and say: and guess what, that means our visualizations will never be as good as the ideal, because that's a lower priority.
Jason Packer
Yeah. And that's where I talk a lot about understanding fundamentals, and how that can be really helpful for closing what you're talking about: the gap between the marketing you see from the vendor and the reality. Right. And part of closing that gap, and understanding what a tool is really all about, can be understanding the fundamentals of how particular things work. Like, if we're talking about databases: if this product uses MySQL, this product uses Snowflake, this product uses Postgres, and this product uses ClickHouse, knowing just a little bit about the differences between those databases is going to tell you a lot about the products. Say we're comparing Piwik PRO and Matomo. Piwik PRO uses ClickHouse as the database underlying their product, and Matomo uses MySQL. On the surface they're pretty similar products, but they end up working quite differently because of that difference in the underlying database. MySQL is a simpler database. It's easy to self-host, it's easy to see the raw data, and it's not super performant on a lot of more complicated analytical queries. And all those things surface in the products. If you know that background (and it's not like you need to know how to use those databases), just knowing a little bit helps. The same is true for tracking methods: cookies, versus this IP plus user agent method, versus browser fingerprinting or whatever. Just knowing a little allows you to see: oh, the vendor says X; I think what they mean is this. Because as much as the vendor might claim otherwise, there aren't a million new ways to do things. There's a limited number of ways.
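[Editor's note: a toy illustration of the row-store versus column-store distinction Jason is pointing at. MySQL stores whole rows together, while ClickHouse stores each column together, so wide analytical aggregates scan far less data. This is a conceptual sketch in Python with made-up events, not either database's actual storage engine.]

```python
# Row-oriented storage (MySQL-style): every event is a complete record,
# so an aggregate over one field still walks past every other field.
row_store = [
    {"ts": "2026-03-01", "url": "/home",    "country": "AU", "duration_ms": 1200},
    {"ts": "2026-03-01", "url": "/pricing", "country": "US", "duration_ms": 4300},
    {"ts": "2026-03-02", "url": "/home",    "country": "US", "duration_ms": 800},
]

# Column-oriented storage (ClickHouse-style): one contiguous array per
# column, so "average duration" reads exactly one array and nothing else.
col_store = {
    "ts":          ["2026-03-01", "2026-03-01", "2026-03-02"],
    "url":         ["/home", "/pricing", "/home"],
    "country":     ["AU", "US", "US"],
    "duration_ms": [1200, 4300, 800],
}

# Same answer either way; the difference is how much data gets touched.
avg_row = sum(r["duration_ms"] for r in row_store) / len(row_store)

col = col_store["duration_ms"]
avg_col = sum(col) / len(col)

assert avg_row == avg_col == 2100.0
```

At three events the difference is invisible, but at billions of events per column the columnar layout (plus compression of similar values stored together) is what makes interactive analytical queries feasible.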
Michael Helbling
All right, well, before we wrap up, I'm going to give Mo the opportunity to jump in one last time. But we do have to start wrapping up soon.
Tim Wilson
But.
Michael Helbling
Yeah, go ahead, Mo. I know you want to ask one more.
Mo
Jason, we've talked about a lot of different concepts and things you need to think about in this whole tooling decision space. If I'm sitting at my desk, what's your one absolute, the thing that should be most top of mind from everything we've chatted about today? What would be the one thing where, if you just pay attention to this, you'll probably make a slightly better decision?
Jason Packer
Oh, that's a tough question. I might actually say price. Oh.
Tim Wilson
So I'm disappointed.
Jason Packer
It is. It is disappointing. I'd like to say something cool, like the fundamental database schema or something like that. But price is a shortcut to a lot of things; it puts you in the right area. I don't want to limit it to that, but that's where I would go.
Tim Wilson
Price is one input to a total cost of ownership. I mean, that's again, maybe another one. Have you ever come at it that way, Mo, with any of your.
Jason Packer
That's better, you know. Total cost of ownership. Let's just say that. Say I said total cost of ownership; that's what I meant.
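[Editor's note: "total cost of ownership" here is just the sum of every line item, not just the license, tying back to Mo's five-engineer point earlier. A minimal sketch with entirely made-up figures; none of these numbers come from the episode or any real quote.]

```python
# Hypothetical annual cost line items for a "free" open source tool
# vs a paid SaaS tool. All figures are illustrative assumptions.
open_source_tool = {
    "license": 0,
    "hosting": 12_000,
    "engineers_supporting": 5 * 150_000,   # the five-engineer scenario
    "training_and_hiring": 20_000,
}

paid_saas_tool = {
    "license": 60_000,
    "hosting": 0,
    "engineers_supporting": 1 * 150_000,
    "training_and_hiring": 5_000,
}

def total_cost_of_ownership(line_items: dict) -> int:
    """TCO is simply every line item summed, including the people."""
    return sum(line_items.values())

tco_oss = total_cost_of_ownership(open_source_tool)
tco_saas = total_cost_of_ownership(paid_saas_tool)
```

With these invented numbers the "free" tool costs several times the paid one once staffing is counted, which is the point of treating people as a line item rather than comparing license prices alone.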
Mo
That needs to be in version three of your book, because I like that framing. Total cost of ownership sounds way better than just price.
Jason Packer
I think I do use total cost of ownership in there. I don't know, maybe.
Tim Wilson
I mean, I think it makes sense if you're going through the different options philosophically: how much am I going to have to invest in added tooling to work around a limitation in their tracking or something like that? It could be. But yeah.
Michael Helbling
All right, well, we do have to start to wrap up. This has been an awesome conversation, and honestly, it's a good conversation because I think everybody deals with this in some capacity in their analytics career. So, Jason, thank you so much for coming on the show and being our guest today. One thing we like to do is go around the horn and share a last call. It could be any topic, anything at all, just something that might be of interest to our listeners. Jason, you're our guest. Do you have a last call you'd like to share?
Jason Packer
So my last call is something that you already mentioned, Michael, which is Music League. Michael and I, and Mo, I believe your sister, are part of this as well. Music League is a friendly competition where every week there's a theme. This week's theme in the music league that I'm part of is Beatles covers, so everybody picks a Beatles cover that they like, a playlist of, whatever, 20 songs is made automatically from that, and everybody votes on the ones they like. And fun is had. It's not complicated. It's fun to do with your peers, your friend group, your work. We've been doing it on the Measure Slack for, what, three years now or something like that.
Tim Wilson
Is it in the Measure... what, for somebody who's interested, do they have to be in the Measure Slack, and then in the Measure Music channel they can join?
Jason Packer
Yeah, that's where the conversation happens. You don't technically have to be part of that, but anybody can start a music league too. And there's also a free tier, you know.
Tim Wilson
Yeah, but we like to keep stuff around, you know, us. Don't just send people off to go out and...
Michael Helbling
Yeah.
Tim Wilson
Don't let them do their own thing.
Michael Helbling
They've got to be part of the Measure Slack to do this. So join that. Obviously top tier.
Jason Packer
Yeah, top tier.
Michael Helbling
Yeah. And the group is amazing. It's also great; we have tons of cool, fun music-based conversations with all your peers in analytics, and it's a lot of fun. In my own personal experience, it's a great time. And I've got a great idea, Jason, because, you know, we've been growing, and as we grow and get the big Power Hour bump on this, now we can start different levels of leagues. So there could be a premier league with relegation, and then a championship league, like British soccer, you know. Be relegated.
Jason Packer
I'm not sure I would like that.
Michael Helbling
Well, I probably would be relegated too. I don't often score very well, but I have a lot of fun. Anyways, it's also really cool to get a new playlist every couple of weeks or so of songs you might not have ever heard, or genres you're not that into. So it's nice. I like it.
Tim Wilson
So we do occasionally get comments from people who are like, you guys mentioned the Measure Slack... if you literally go to Measure Chat, then you join Measure Chat, and we'll also have it on the show notes page. Because some people are like, you guys keep mentioning it, it's in our outro, and we don't have instructions for how to find it.
Michael Helbling
So listen, if you're committed, you'll find your way in. All right. Now, thank you, yeah, that's awesome. And, Jason, thank you for kind of being the oomph behind that. I know it's a ton of work on the back end to make it work.
Jason Packer
Commissioner.
Michael Helbling
Yeah, the commissioner.
Jason Packer
The.
Michael Helbling
The ska-loving commissioner of the Measure Music channel. All right, Mo, what about you? What's your last call?
Mo
Okay, so my husband has been listening to a podcast for a long time that folks will probably be familiar with. I have noticed it indexes highly to men. I know a lot of men that listen to it. I don't know a lot of women.
Tim Wilson
Yeah, that's the word.
Mo
So, the Pivot podcast. My husband listens to it on loudspeaker around the house, and it really drives me nuts, and I have not been the biggest fan of Scott Galloway. However, I have had my opinion changed very significantly. I am now a listener of Pivot. I have been incredibly impressed with how they've talked about AI and tech over the last few months, but particularly the coverage of the Epstein files is something that really impressed me, and that's why I've become a really big listener. Scott also last month did this resist-and-unsubscribe initiative, which folks might have seen in the media, which was really cool: encouraging folks to basically use our economic power to let tech companies know that we're not happy with how they're supporting the administration. And so, yeah, I just felt like they were using their voice to share their perspective on something in a really meaningful way. And also, just for everyone out there: check in on the women in your life. The last few months have, like, shaken us to the core. So just check in on your wives, your mums, your daughters, all the women.
Michael Helbling
Nice. Yeah.
Jason Packer
Great.
Tim Wilson
All right with that, huh? Yeah, great.
Jason Packer
Yeah.
Michael Helbling
Tim, what's your last call?
Tim Wilson
What have you got? Well, there was this episode of the Rogan... no. So I'm gonna do two; they'll be quick. One: David Epstein, who I'm a big fan of. I like his books, I like his videos, but he did a 15-minute video called Why You Should Fail 15% of the Time. And he talks about desirable difficulties, which is a phrase I don't think I knew, but he kind of breaks down the value of doing hard things the hard way, and specifically what that does for you. Which, I mean, in the world of vibing shit, is something a lot of people are grappling with. It's just a well-done video and he's delightful to listen to. And then, maybe adjacent to that, there was an article in Metadata Weekly by Mark Dupuis, The AI Analyst Hype Cycle. And there were some quotes in it that I thought were gems, like: if AI can only answer questions that have been pre-configured by the data team in a semantic layer, what have we actually built? An expensive natural language interface to existing dashboards. And he kind of makes the case of, where is this all going? It's narrowing down to where what you actually get is maybe not that great. But then he also had: the analysts who thrive will be those who can translate business problems into the right questions, validate AI output, build the context systems that make AI useful, and provide the judgment and recommendations that AI cannot. Which I think a lot of people are saying, but it can be kind of a cheap throwaway thing to say; when I look at what people are then also saying, I did this thing, it often kind of skips those components. So The AI Analyst Hype Cycle by Mark Dupuis is my second one. Michael, what's your last call?
Michael Helbling
Well, we did an episode a while back talking about semantic layers with Cindy Howson from ThoughtSpot, which was awesome. And we also did an episode about AI where I remembered something Mo said, about how letting AIs leverage how queries are being used in the organization is also a way of training the AI. And I read an article recently from Jacob Matson at MotherDuck about rethinking the semantic layer, kind of challenging the idea that a semantic layer is the only way to go. I just thought it was a cool counterpoint. I don't know that I've got a strong opinion one way or the other; I very much respect the conversation we had with Cindy, and I thought it was really powerful. But there's some interesting research and discovery going on around letting the AI consume all your SQL queries and using that to help it understand more of the context behind where and how your data is getting pulled together. So anyway, it's a good read, and good to think through those things. I don't think we've solved it for our industry, so it's early days on all this. Oh, and what's this breaking news? I'm getting word now, straight from our correspondent: there is a book out there that Jason Packer has written called... what's the name of the book again? Hold on, I have it written down. Google Analytics Alternatives. And for listeners of the Analytics Power Hour, he's going to give you a 20% discount. So that's pretty sweet. If you haven't already bought the book, that is the incentive to do so. Discount code APH.
Jason Packer
So there you go.
Michael Helbling
That will put the link to that in the show notes as well. All right, well, Jason, once again, thank you so much for coming on the show. This has been a lot of fun and a really good conversation. Appreciate all the work you've done. It's a labor of love, I'm sure, just to do all this and so very much appreciate it. On behalf of a vendor weary industry. I think you're doing us all a big service. So thank you.
Mo
Thank you.
Jason Packer
You're welcome. Yes, great time.
Michael Helbling
All right, well, we'd love to hear from you too, because you've been listening and you probably have questions or thoughts, so reach out to us. You can do that in the Measure Slack chat group, which we have spoken about on the show, as well as on our LinkedIn page, or via email at contact@analyticshour.io. We also love to get your comments and ratings on whatever podcast platform you listen to, so please feel free to do that as well. And I think I speak for both of my co-hosts...
Tim Wilson
Boy, have you skipped a few things that are a little important. If only our show prep had them. So, one, just know that Michael, you and Jason and I will all be at MeasureCamp New York on the 28th of March. So that is true.
Michael Helbling
We will.
Tim Wilson
But even more important, I didn't expect
Michael Helbling
by now there'd be tickets left, so I was leaving that out because, well, you're too late. You probably can't make it.
Tim Wilson
Well, there's something that's, if you can get a ticket, kind of more important. I mean, more important from an operational perspective: the Marketing Analytics Summit that we'll be at on April 29th. Yeah. Okay, now you're seeing the point. Okay, I did that.
Michael Helbling
Yeah. More breaking news I'm getting. Yeah, we're going to be at the Marketing Analytics Summit, and we need your help. We want your questions. We've got a very cool survey, with an Easter egg at the end that I had no part of; you'll have to take the survey and ask a question to see it. So go to analyticshour.io/listener and submit a question. We'll be recording at the Marketing Analytics Summit on April 29th in Santa Barbara, California, and we hope to see you there. But even if you can't make it, you can still ask a question and we may answer it on the podcast. So please do that if you want to ask us a question, and even if you don't want to, push yourself a little bit and ask one anyway. I highly prefer questions that make Tim feel uncomfortable. So, you know, ask him emotional questions about, you know, the best manager he ever had or something.
Tim Wilson
Yeah, stuff like that. We have not figured out how we're sharing access to all the questions with all the co-hosts beforehand.
Michael Helbling
Oh, yeah, that might be a little tricky part of it. All right, well, before I forget anything else about the show wrap-up, let me just say thanks once again, Jason. And I think I speak for both of my co-hosts, Mo and Tim, when I say: no matter what vendor you need to pick, just keep analyzing.
Analytics Power Hour Intro/Outro
Thanks for listening. Let's keep the conversation going with your comments, suggestions and questions: on Twitter @AnalyticsHour, on the web at analyticshour.io, our LinkedIn group, and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst.
Tim Wilson
Those smart guys wanted to fit in,
Analytics Power Hour Intro/Outro
so they made up a term called analytics.
Tim Wilson
Analytics don't work.
Analytics Power Hour Intro/Outro
Do the analytics say, go for it no matter who's going for it. So if you and I were on the field, the analytics say, go for it. It's the stupidest, laziest, lamest thing I've ever heard for reasoning in competition.
Michael Helbling
All right, well, we do have an editor who we've been talking so fondly about, so we can stop and start as needed. Well, without further ado... actually, before...
Tim Wilson
So just.
Jason Packer
What?
Tim Wilson
Well, just, Mo, are there any... because, I mean, you're kind of often in the midst of vendor selection stuff. Are you comfortable? Will there be anything you can't talk about?
Michael Helbling
You can.
Tim Wilson
You'll self edit for whatever named and unnamed.
Mo
Yeah, so, like, we just signed a new BI tool, which I probably can't name. But I will just say I've been involved in multiple BI tool selections and stuff like that.
Jason Packer
Okay. Yeah.
Tim Wilson
All right.
Jason Packer
All right.
Michael Helbling
Let's start clacking the keyboard and record this thing. I got it.
Tim Wilson
You got it? I think it's on it.
Mo
I was like, I better do it before you start because if I do it halfway through your.
Michael Helbling
That was great timing, actually. Really good.
Mo
Pretty sure I got.
Tim Wilson
Here we go.
Michael Helbling
In 5, 4, 3, 2.
Tim Wilson
Rock Flag. And an instrumental Rock Flag rendition by our guest.
Jason Packer
Oh, my gosh.
Michael Helbling
That's. That's the permanent one at the end of every show now. That's incredible.
Tim Wilson
I don't know why it was showing that it was going to play that one and instead it just played like Transition two. So I'm. It's Rush that.
Episode #293: Tool Selection and the Unhelpfulness of Feature Comparisons
Date: March 17, 2026
Hosts: Michael Helbling, Tim Wilson, Moe Kiss
Guest: Jason Packer (Quantable Analytics, author of Google Analytics Alternatives)
This episode dives into the challenges of analytics tool selection, exposing the limitations of feature-by-feature comparisons and offering insights into how organizations can more effectively evaluate which tools truly suit their unique use cases. Special guest Jason Packer shares firsthand experience from researching, evaluating, and writing about alternative analytics platforms, emphasizing real-world testing and organizational needs over high-level marketing claims.
For analytics professionals navigating a crowded tool landscape, this episode offers practical, hard-won advice against the easy lure of checklists or hype. In the words of the hosts: No matter what vendor you pick, just keep analyzing.