Analytics Power Hour Intro
Analytics topics covered conversationally and sometimes with explicit language.
Michael Helbling
Hey, everybody, welcome. It's the Analytics Power Hour. And this is episode 287. Ho, ho, ho. Holy shit. Another year is basically over. 2025. I mean, it never even had a chance to slow down and decompress, it feels like. I mean, we're just running at breakneck speed, finding out about AI, doing our work, trying to do everything. But regardless, we're going to try to take a look back and maybe a small peek forward. That's the Analytics Power Hour Year in Review episode. And so, with no more ado, it's time to introduce my awesome co-hosts. Mo Kiss, Director of Data Science for Marketing at Canva. How you going?
Mo Kiss
I'm going pretty good. But, yeah, 2025, that was a time it felt fast. Big year.
Michael Helbling
Big year. I agree. Tim Wilson, head of solutions of Facts and Feelings. Do you agree? Hello.
Tim Wilson
Ho, ho, ho. Hello. Ho, ho, ho.
Michael Helbling
Hello.
Val Kroll
Quite a year.
Michael Helbling
Yeah. And Val Kroll, head of delivery at Facts and Feelings. How's your year going? Gone.
Val Kroll
Lots of feelings. There were lots of feelings.
Michael Helbling
Yeah, I agree. And of course, we are missing Julie Hoyer as she enjoys some time off with her new baby, and so we look forward to her coming back next year.
Val Kroll
So her year is going sleep deprived, right?
Michael Helbling
Yeah, yeah, that's right. And of course, as a special treat, we've got Josh Crowhurst, Growth Marketing Director at Manulife, as our special guest this episode. Welcome back, Josh. Hey.
Josh Crowhurst
Yes, great to be here.
Michael Helbling
You know, I don't know, Josh, if our listeners, actually many of them, know the story of how you became involved with the podcast in the first place. So if you don't mind, I'd like to take a second and just tell people how that happened.
Val Kroll
I thought it was going to be like the 2025 and how you stormed away, like, keep it as the Year in Review.
Michael Helbling
Well, I mean, it's part of the Year in Review that Josh finally had to step back from his role with the podcast. So we're actually really glad that you did rejoin for this one last episode for Year in Review, which is our tradition. And, you know, if you're up for it, come back next year. We don't care. But, yeah, that's just being involved. Nicely put, Michael. No, I'm saying it would be fun. It's not pressure. It's up to you. You got a lot going on in life. But no, early 2019, Tim and I were working out how to make the show better, and we thought we needed some help. And so we put out a call for a producer. It was a poorly written job description, one that we did not fully understand, and then Tim fully understood it.
Tim Wilson
Just to be clear.
Michael Helbling
Well, in terms of, like, what it would take to do and what we were looking for and all those things, it was just very much like a shot in the dark. To our surprise and delight, we got a response from Josh Crowhurst. And after chatting with him a few months later, because I forgot about the email and didn't look at it for a while, Josh joined the show as our producer and was with us for, I believe, six years, which is incredible and so amazing. And so now that life has taken Josh in new directions and he's growing, he's obviously stepping into bigger and bigger roles, and it's so cool to see how, you know, your life and career has just flourished. And I like to think maybe. I mean, I don't.
Val Kroll
I don't know.
Tim Wilson
I think, anyways.
Michael Helbling
Oh, no, not me personally, just the Analytics Power Hour generally benefited your career in some way. I'd love to think that, but probably it didn't.
Josh Crowhurst
Absolutely.
Michael Helbling
Anyway, we appreciate it and we're happy that you are able to join us for this episode. Okay. What we do on all these Year in Review episodes: we like to look back at the year that just went past. We did a lot of shows. We did a lot of interesting shows. We like to talk about some of them, highlight some of our favorite episodes, maybe chat about some of the things that happened this year. So who wants to kick us off? What's a. What's an episode that really stands out for you?
Mo Kiss
Well, obviously, we started off our year strong. No show would be complete without Tim Wilson kicking off our year with the announcement of Analytics the Right Way, episode 263. So that was a big. We were all so excited to see that come to life. And it was super fun to be a part of that episode since I had the pleasure of working with Dr. Joe Sutherland. And that was just a really fun, big moment, like diving into all of the big themes of the book. But that was the first one of the year. Was it? It feels like it was, if it was not the first.
Tim Wilson
Second.
Michael Helbling
Second.
Val Kroll
Yeah.
Mo Kiss
Starting strong. That was a good one.
Tim Wilson
I still have FOMO about missing that one.
Mo Kiss
We did fight to four. Always get to be on it.
Tim Wilson
Yeah.
Tim Wilson
Well, as other people have released books this year, I realized what kind of a shit job I've done of ongoing Rolling Thunder promotion of the book. But I was in it for the writing of the book, and I figured it was going to be all downhill. Once he showed up on the Analytics Power Hour as a guest, why would there need to be any other promotion?
Michael Helbling
The old APH bump, we like to call it.
Val Kroll
Yep, clearly dozens. Dozens of books flew off the shelves.
Mo Kiss
I have bought six alone, so I am definitely helping the supplies go out the door.
Val Kroll
Were those some of your stocking stuffers, Mo? For friends and family.
Michael Helbling
Folks.
Val Kroll
It's not too late.
Tim Wilson
But I do.
Val Kroll
If you're listening now, you can try for that special someone in your life, the ebook version.
Mo Kiss
Use code APH Bump for 10% off.
Michael Helbling
Don't say that. Oh, man.
Mo Kiss
Well, I'm glad there's no other episodes to talk about.
Michael Helbling
Yeah, that was really the one.
Val Kroll
Let's talk about that one more.
Michael Helbling
That was the one. All right, listen, I have an episode. Here's the thing, okay? When we do this podcast, I'm like, this is a thing I do with a lot of things. So, like, when I interview people, when I work with people, when I talk to people, I'm always looking for kind of where their passion lies, what sparks, like, kind of what makes their eyes light up. And one of our episodes that I really enjoyed, with a person I'd wanted to get on the show for a long time, was Dan McCarthy, episode 272, about calculated and complex metrics. It was a really fun conversation. And Dan is so smart and so amazing in his role as a professor, studying these companies and the metrics they produce, especially for public reporting, for stock reporting purposes. But what was amazing was the passion he has for these topics through music. And he has a SoundCloud with all these songs on it. And it was sort of after the show was over that he kind of started in on it. But that was sort of where I saw the switch kind of flip into "this is fun," and a little bit of light in his eyes about that kind of thing. And I'm sure obviously he enjoys his other work too, but it was just really cool to connect, in the coolest way possible, with another data nerd about things they loved about their work and about data. Anyway, so that was just a moment that kind of stood out to me as far as being a really educational and fun episode. It was just so cool to watch somebody's eyes light up about things they were passionate about.
Mo Kiss
I learned so much on that episode, and I even probably like a week ago sent it to someone to have a listen. The number of times I get questions about LTV to CAC and why Finance and probably companies are so interested in that specific metric and how it's calculated. I'm just like, here is a show that I prepared earlier. Please peruse at your own leisure. And I just loved how he did such a wonderful job of really getting into, I guess, the different perspectives and the complexities that we sometimes face as data folks in a metric that at its surface might seem really simple and obvious, but actually can really change a business decision or a perspective of a business by how it's calculated and how it's interpreted. And also just to say his SoundCloud, the number of data show and tells that I've opened with one of those songs and people are always like, mo, where do you get these data songs?
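The LTV-to-CAC point above is easy to demonstrate with a toy example. All numbers and definitional choices here are made up for illustration: the same business can look healthy or marginal depending on whether LTV is revenue-based or margin-based and which customers the acquisition spend is spread across.

```python
# Toy LTV:CAC illustration (made-up numbers): the ratio moves a lot depending
# on definitional choices, which is the point made above.

revenue_per_customer = 300.0   # average lifetime revenue per customer
gross_margin = 0.40            # contribution margin on that revenue
marketing_spend = 50_000.0     # acquisition spend for the period
paid_customers = 500           # customers attributable to paid channels
organic_customers = 500        # customers who would have shown up anyway

# Definition A: revenue-based LTV, spend spread across ALL new customers.
ltv_a = revenue_per_customer
cac_a = marketing_spend / (paid_customers + organic_customers)
ratio_a = ltv_a / cac_a        # 300 / 50 = 6.0 -> "we should spend more!"

# Definition B: margin-based LTV, spend attributed to paid customers only.
ltv_b = revenue_per_customer * gross_margin
cac_b = marketing_spend / paid_customers
ratio_b = ltv_b / cac_b        # 120 / 100 = 1.2 -> "barely break-even"

print(ratio_a, ratio_b)
```

Same company, same period, a 6.0 versus a 1.2: which definition finance is using is exactly the kind of thing the episode digs into.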
Michael Helbling
I'm like, I know people.
Mo Kiss
I know people. So yeah, I definitely had that in my top couple of episode list as well.
Val Kroll
Well, that was like my finding him. So I now see more of his stuff and he made the point on that episode and then he kind of continues to make it that when. When companies stop reporting stuff, it's not.
Michael Helbling
Usually for good reasons. Sometimes that's informative.
Val Kroll
Yeah, that in and of itself. And there's some kind of hand waving as to why. And he's like, but another way to look at it would be here's this thing I wrote two years ago that indicated this might be problematic. So, yeah, he was a fun one.
Josh Crowhurst
So on the topic of things that people are passionate about, I think one of the episodes that I absolutely loved, and maybe is a bit in line with something that I'm really passionate about, was number 282, Using and Creating Data to Understand Pop Culture, with Chris Dalla Riva. So for me, this was honestly probably my favorite episode ever because it's just so right up my alley. Like, it's in my backyard. Like, he's talking about looking up writing credits and production credits on songs and tracking that. And this is something that I just do impulsively. I'm always annoying my friends with pointless, surprising facts about songs. Did you know Bruno Mars co-wrote "Forget You"? Or, I don't know, Mark Ronson produced and wrote that song from A Star Is Born. Just, like, shit like that. I'm always looking behind and seeing who's involved in that song. And the idea that there are just people behind the scenes that maybe don't have mainstream name recognition in a lot of cases, but have really shaped what you're hearing on the radio or on Spotify, sometimes for decades. And so, yeah, Chris talks about tracking that and having that in the data set. And I wish I could get my hands on that data because I would absolutely just be poring over it.
Val Kroll
Oh, it's there.
Josh Crowhurst
Looking for the surprising facts.
Val Kroll
You can. It's on the. It's on the show notes page.
Josh Crowhurst
Oh, my God.
Val Kroll
Yeah, we found out, I think.
Mo Kiss
Yeah.
Josh Crowhurst
See you guys.
Mo Kiss
Busy.
Val Kroll
I'm out.
Josh Crowhurst
Yeah. Okay. I'm diving.
Michael Helbling
Josh "Liner Notes" Crowhurst.
Tim Wilson
And Michael, you really enjoyed recording that show. Is that. Is that right, Michael?
Michael Helbling
You know what? Thank you so much, Tim, for bringing up a sore point. I just find it hilarious after 11 years of you basically being like, I don't know anything about pop culture, like, you record that episode instead of me. Like, come on.
Val Kroll
I read his newsletter. No, that was fair.
Michael Helbling
Fair. Anyways. It was.
Mo Kiss
We're gonna have to rename this show Year in Review and Airing of Grievances.
Michael Helbling
That's right, The Festivus Airing of Grievances.
Val Kroll
Which, Chris's book is now out. It was not out when we recorded the episode, but it is now. So also, if you love somebody so much that you want to get them Analytics the Right Way and a second book, a twofer: Uncharted Territory is now available at booksellers near you.
Michael Helbling
Still available by Boxing Day, probably. Yeah.
Val Kroll
I don't know.
Tim Wilson
Do you guys have Boxing Day?
Michael Helbling
No, but it's the day after.
Val Kroll
Yeah.
Michael Helbling
You have one more day, so maybe it'll ship in time. I don't know.
Val Kroll
Everybody has some pretentious neighbor who celebrates Boxing Day so they can explain to you what it is.
Tim Wilson
Leftover food and none of the pressure of Christmas Day.
Val Kroll
Right. Now imagine that coming out of an American who's, like, just explaining how sophisticated they are.
Michael Helbling
Well, I've. Obviously, with these book recommendations, I would think you'd be talking about Jolabokaflod. So maybe that's the holiday. What?
Mo Kiss
Not familiar, Sorry.
Michael Helbling
It's an Icelandic holiday where you read.
Val Kroll
books right before Christmas.
Michael Helbling
So there you go.
Tim Wilson
I was about to say, should I, like, pivot us in a totally different direction and talk about the elephant in the room?
Michael Helbling
Yeah, I mean, what. What.
Tim Wilson
How many episodes do you reckon AI came up in? Oh, damn it. I should have actually been prepared and, like, pulled transcripts or some shit. That would have been a good idea.
Michael Helbling
Yeah.
Val Kroll
Use your librarian thing, Michael.
Michael Helbling
Yeah, well, we don't have every episode uploaded yet, so it's still a work in progress. But thank you, Val, for bringing that up. Because it's an AI project that Tim and I are working on. But I've got to say, Mo, it probably came up in 75% of our episodes.
Tim Wilson
You reckon 75? Everyone put in a guess.
Mo Kiss
I would say maybe higher.
Val Kroll
No, I think I go.
Michael Helbling
I mean, I'm counting between one.
Mo Kiss
Whether it was a topic or it just came up. If it just came up or.
Josh Crowhurst
Yeah, do last calls count? Because that would bump it up.
Mo Kiss
They do in my head. That's why I got to my number.
Michael Helbling
I mean, there's at least 10 episodes that have AI in the title.
Tim Wilson
I'm gonna say 90%.
Michael Helbling
Yeah, it was a lot.
Tim Wilson
Let's leave everyone hanging and we can report back at a future day.
Michael Helbling
That's right. Guess how many jelly beans are in the AI jar.
Val Kroll
So I'd like to go on record that I did not commit to that. It will be reported out at some future date. So I think the likelihood of that happening is.
Mo Kiss
If any of our listeners want to figure it out, sound off in the comments.
Michael Helbling
If only we had a producer who.
Val Kroll
Could go back through.
Michael Helbling
You know, Tim, as we clink champagne glasses on another successful year of the podcast, I think our listeners would agree that you and I almost always agree on things.
Tim Wilson
What?
Tim Wilson
Absolutely not. I spend half or most of my time on this show, I think, just correcting your misguided thinking.
Michael Helbling
Well, agree to disagree. But there is one thing we both agree on. AI is starting to reshape our industry. And I think we both call bullshit on nonsense like vibe analytics.
Val Kroll
Absolutely right.
Michael Helbling
But here's the flip side. Analysts do have to start using AI. Leveraging LLMs to multiply their capabilities isn't just interesting anymore. It's going to be table stakes in 2026.
Val Kroll
Which is why I'm actually excited about our new sponsor, Ask Why. Yes, it's an AI tool, but it's one where analysts can do real work. And critically, Ask Why is smart about data privacy. They do not send your raw data to the LLM.
Michael Helbling
Right. Ask Why builds a semantic layer on top of your data and then uses that to generate SQL that answers your questions or helps you build reports on your own data set. It's currently in beta and it's evolving fast. But you get the upside of AI and the assurance that your data stays secure. You can actually start leveling up into being an AI analyst, starting with Ask Why.
Val Kroll
For a limited time, use the code AI when you join the wait list. And our friends at Ask why will move you right to the top of that list.
Michael Helbling
The site is askY.ai. That's ask, the letter Y, dot AI. So go sign up for the wait list using code APH.
Val Kroll
This isn't vibe analytics. This is the rise of the AI analyst.
Michael Helbling
All right, let's get back to the show. Yeah, it is interesting. Because certainly, I mean, Mo, I think the point you're making is, like, AI was everywhere and always here all year long in 2025, and it seemed to grow in speed and pace throughout the year.
Mo Kiss
Yeah, definitely a topic that came up in the listener survey is people wanting it covered, wanting some topics covered there. So I think that crept into our schedule and informed it.
Val Kroll
And as my other hat, as the fielder of the inbound pitches for show topics, I can certainly say that that percentage was definitely north of 75%. But is it fair to say, and maybe this is my normally optimistic self that you guys are so familiar with, that at the start of the year the ratio of AI hype to AI skepticism, specifically in the world of data and analytics, was north of 90% hype and excitement to the "wait a minute, guys, it's not going to be everything"? And that it's slowly gotten a little bit more in balance, just as the conversation in the zeitgeist around what AI can and can't do has, as people have gotten their hands on it and realized limitations? Or is that just me?
Tim Wilson
I think that's fair.
Michael Helbling
It has come back a little. I still think we're a little out over our skis somewhat in terms of AI. I mean, just AI in general. Like, a lot of people think we're in a bubble by the time this comes out. Hopefully the stock market hasn't crashed or anything.
Val Kroll
But.
Michael Helbling
You know, that's always a thing that, like, people are talking about is like, oh, is this all a bubble? And, you know, like the dot com boom and bust kind of an idea.
Mo Kiss
I think it's like with any trendy thing, it's like, cool to think of all the use cases and all the potential. And then the cool thing is to be like, but you can't do this, can't do that. So, like, I feel like we're in that phase of, like, the LinkedIn. Like, I just get so tired, you know?
Michael Helbling
Like the 50th time you hear "Not Like Us." And you're like, no.
Val Kroll
I did see a thing where somebody.
Mo Kiss
Michael.
Val Kroll
That was good.
Michael Helbling
Michael.
Val Kroll
I did not. I read a piece that was saying that instead of a bubble, think of it as a forest fire, which actually has a lot of bubble tendencies. Well, but it talks about if you go back to the original 2000 Internet bubble, pointing out that the bubble burst, and it's not like you're back where you started. There are players that were sufficiently hardy and actually had a plan, and they were like the big trees that managed to weather it. And they're like, yeah, Google, Apple, Microsoft, they're not going anywhere if the bubble bursts. And then it talked about the ones that are basically just a thin veneer of crap, that those are just going to disappear. But also, when that correction comes, there will be a smarter universe out there, and there will be little shoots that come out of it. I don't know how they referred to it, little green shoots that'll crop up once all that sort of gets cleared out. It seemed like a useful metaphor for the weather that's coming. But I also find it's crazy, like, just having conversations with normies, where the person who doesn't have some responsibility to figure it out, how much there isn't real depth of thought. They're just. I had a friend say, I just use ChatGPT instead of Google Search now. And I was like, I don't have the energy to say, you could just use Google Search and it would be Gemini, like, if you just want plain text results. And that's kind of the extent of what they're doing. Although I could also go on some rants as well.
Michael Helbling
But, you know, it was interesting to me this year when I would go to different events, like conferences or things like that, and see the pace. Because I remember going to MeasureCamp New York in the spring, and of course, everyone was talking about AI this, AI that. And it was all kind of like, wow, look at all this cool stuff. And then literally from then to the fall at MeasureCamp Chicago, I felt like we'd already gone through, like, a maturity curve almost with the way that we're discussing AI and some of its use cases. And it just seemed like we're just blasting through the cycle really fast. It feels like sometimes, some places, there's still quite a bit of hype. But I do think some people are getting their feet on the ground and starting to use it for actual things and starting to understand how to leverage it, or how to think through use cases effectively.
Tim Wilson
So that was literally the thing that has been on my mind when I was looking at the episodes that were my favorite. It's probably recency bias, but they were definitely the ones towards the end of the year. Well, I suppose they weren't all the end of the year, but like the semantic layer episodes. I thought the topics on BI with Colin were really good. And then also loved the one on Bayesian stats with Michael Kaminsky. But part of me wondered, I just felt like there was this return to us discussing, I want to say, quote unquote, the basics. But it's not basics, it's the fundamentals of data stuff. And is the reason we were discussing that because everyone's trying to go so fast on AI? There was this, not reckoning, but like acknowledgement that to do that well. And I don't want to be like, you know, the usual shit of bad data in, bad data out, blah, blah, blah, that sort of crap. But I just felt like I've been giving a lot of thought and energy, and I feel like folks in the industry are, to the quality and how we do things well and how we measure if the output is good. And that of its nature means we have to have more sophisticated conversations about fundamental data concepts. And I felt like there was a return to that. And maybe that's similar to what you're talking about, Michael, where there was kind of a bit of a rush and then people are having more sophisticated discussions, is probably a good summary.
Michael Helbling
Yeah, no, I like that framing because I think that's exactly right. It's sort of like the early thing I saw was like, well, your own expertise drives results in AI all the time. But it's sort of like, okay, if you go down to some brass tacks about how to conduct analysis, how to think about data lineage, how to think about traceability, all the things that we are taught as analysts to be able to compose an analysis correctly, follow it through correctly, and deliver out the other side. Those are all steps we learned as analysts. And so AI is a part of that process now, but we still have to maintain all of those parts along the way. It feels like that. I don't know. And maybe AI will get so good it can do all those steps for us at some point. But I just don't think there's ever going to be, anytime in the near future, an appropriate black-box approach to analysis. Which, don't get me started on the topic of vibe analytics, which is the most stupidest thing I've ever heard of in my life.
Tim Wilson
I think we need to do a spin off episode on that because I disagree on analysis, but probably definitionally or.
Michael Helbling
Semantically, we're probably in agreement. But yeah, we can probably do a whole show on it.
Val Kroll
Well, what about you, Josh?
Josh Crowhurst
Yeah, this is something that I've also noticed, I think, that is kind of related: using AI really drives home to me that you really have to, especially as a manager, you need to have your critical thinking skills switched on. Because things will start to come up produced by, I mean, especially more junior people in their careers, who are, I guess, more AI native, will be using this and might sometimes skip some of the steps in producing an analysis. And they'll come up with something that sounds really logical, but maybe they had a conclusion in mind, they punched it into ChatGPT and worked backwards, arriving at some logic to present an idea that maybe hasn't been fully thought through. So this is something where I think we have to be super aware of it, that there's a lot of, I guess, convincing-sounding bullshit where, if you prod to go one layer deeper, the thinking just isn't there. So coming back to the idea of having the fundamentals, but also just being aware that this is around us all the time, and trying to really focus on: is the logic sound?
Val Kroll
I mean, I think that's when it gets used as, this is something that I don't enjoy doing. AI gets put out there as being, oh, the grunt and tedious work that you do, AI can do that. Now, I think that's an overinflation. How many people are literally sitting there saying, I do monotonous, tedious, repetitive work day in and day out, and no one has come out with a way to streamline that. So this monotonous, tedious work gets conflated with this is work that I don't really enjoy or I have to kind of think about it. I hate summarizing meetings that are all over the place. Oh, look, Zoom will just record and summarize for me. And it's like, well, you may hate doing that, but you're missing what sort of value you should be adding along the way. And I think the same thing goes for analysis. If you're like, well, if you think that the goal is to get a slide deck produced that looks plausible, then you're kind of missing what analysis is. There is stuff that is supposed to be hard and that you are having to think through it with that structure as you go.
Michael Helbling
Yeah, I want to step aside for a quick second and take a quick break with our friend Michael Kaminsky from Recast, the media mix modeling and geo-lift platform helping teams forecast accurately and make better decisions. Michael is sharing bite-sized marketing science lessons over the coming months to help you measure smarter. Over to you, Michael.
Michael Kaminsky
Granger causality might be the worst named concept in analytics. What you need to know is that Granger causality does not demonstrate causality. Just because some variable passes a Granger check does not mean that it causes some other variable. What Granger causality actually shows is predictive ability. Effectively, the check is looking to see if past values of x can predict y better than past values of y alone. As an example, let's imagine we have two time series. One is the time that a rooster crows every morning, and the second is the time of the sunrise. By just eyeballing the data, we can see that the rooster crows consistently a bit before sunrise. Yet a Granger causality test would conclude that rooster crows Granger-cause the sun to come up every morning. The problem is really in the name. It confuses analysts, and especially business stakeholders, who understandably assume that a Granger causality test actually checks for causality. Here's what to remember: Granger causality only tests whether one variable precedes and helps predict another. It says nothing about whether one actually causes the other.
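The rooster-and-sunrise point can be sketched in a few lines. This is a minimal illustration of the comparison at the heart of the test, with simulated data; a real analysis would use something like statsmodels' `grangercausalitytests`, which wraps this comparison in a proper F-test.

```python
import numpy as np

# Simulate the rooster/sunrise example: the rooster reliably crows a bit
# before sunrise, but obviously does not cause it.
rng = np.random.default_rng(0)
n = 200
day = np.arange(n)
sunrise = 6.0 + 0.5 * np.sin(2 * np.pi * day / 365) + rng.normal(0, 0.01, n)
rooster = sunrise - 0.25 + rng.normal(0, 0.01, n)  # crows ~15 min earlier

def ssr(X, y):
    """Sum of squared residuals of the OLS fit y ~ intercept + X."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

y = sunrise[1:]  # today's sunrise time
restricted = ssr(sunrise[:-1], y)                             # lag of y only
full = ssr(np.column_stack([sunrise[:-1], rooster[:-1]]), y)  # ...plus lagged x

# Lagged rooster crows improve the fit, so a Granger test would flag them:
# predictive ability, not causation.
print(full <= restricted)  # True
```

The "test" here is just whether adding the lagged rooster series shrinks the residuals; nothing in that comparison can distinguish prediction from causation, which is exactly the warning above.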
Michael Helbling
Thanks, Michael. And for those who haven't heard, our friends at Recast just launched their new incrementality testing platform, Geolift by Recast. It's a simple, powerful way for marketing and data teams to measure the true impact of their advertising spend. And even better, you can use it completely free for six months. Just visit getrecast.com/geolift to start your trial today. Okay, let's talk about shows we liked, maybe, that didn't always touch or didn't touch fully on AI. What are some topics we liked this year that weren't necessarily in the AI wheelhouse? And Mo, this is coming off of you talking about sort of this fundamentals kind of an idea.
Mo Kiss
One of the ones that I had FOMO for not being on was the ANOVA I Hardly Know Ya with Chelsea.
Michael Helbling
Oh, that was so good.
Mo Kiss
That one was so good. I mean, she's just a joy. But I don't know if you guys remember, one of the things that you guys started with on the episode is that she had a pre-ChatGPT-times Twitter feed poem about ANOVA, which I loved. But she was just so thoughtful in the way that she was describing and getting into, like, all the inner workings and the comparisons with ANCOVA and MANOVA. And, like, she's like, at the end of the day, it's linear regression all the way down. And I thought you guys did a really nice job probing with some really good questions that were very thoughtful, from real-life experiences, that I just thought made that episode really good. I've definitely listened to that one more than once this year. But that was really fun. It was an easy listen, even though it's a complex topic.
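The "linear regression all the way down" line can be demonstrated directly: the one-way ANOVA F statistic is identical to the F you get from regressing the outcome on group dummies. A minimal sketch with simulated data:

```python
import numpy as np

# One-way ANOVA computed two ways: the classic between/within decomposition
# and an OLS regression on group dummies give the same F statistic.
rng = np.random.default_rng(1)
groups = np.repeat([0, 1, 2], 30)               # three groups of 30
y = np.array([5.0, 6.0, 6.5])[groups] + rng.normal(0, 1.0, groups.size)

k, n = 3, y.size
grand = y.mean()
ss_between = sum((groups == g).sum() * (y[groups == g].mean() - grand) ** 2
                 for g in range(k))
ss_within = sum(((y[groups == g] - y[groups == g].mean()) ** 2).sum()
                for g in range(k))
f_anova = (ss_between / (k - 1)) / (ss_within / (n - k))

# Same model as a regression: intercept plus dummies for groups 1 and 2.
X = np.column_stack([np.ones(n), groups == 1, groups == 2]).astype(float)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ beta                                # equals the group means
ss_model = ((fitted - grand) ** 2).sum()
ss_resid = ((y - fitted) ** 2).sum()
f_ols = (ss_model / (k - 1)) / (ss_resid / (n - k))

print(abs(f_anova - f_ols) < 1e-8)  # True: same test, two framings
```

The dummy-coded regression's fitted values are exactly the group means, so its model and residual sums of squares are the ANOVA between- and within-group sums of squares.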
Val Kroll
I'll throw in episode 268, You Get an Insight and You Get an Insight, with Chris Koch, which was, I would say, very not AI, because it was so much about a human being pulling things from different directions. And that wasn't the first. We had Rod Jacka on years ago to talk about what is an insight. So I feel like that's a perpetual question in our industry, and there are certainly a million AI-powered tools that are like, it'll find insights for you. And to me, that episode. Chris is not an analytics person. He is coming from much more of a creative and messaging and branding background, and getting his perspective on the many, many facets and the inherently human nature of trying to get some deeper understanding about something, I thought, was a pretty nice corrective to the AI hype. I really liked how he defined an insight.
Michael Helbling
Another one of my favorite episodes, and Mo, you mentioned this one as well, was the one with Michael Kaminsky about Bayesian statistics. I think throughout my career I've learned things just sort of by arriving at them, not necessarily being officially trained in them or those kinds of things just because of how I started in analytics and how I kind of grew into the field. And it was sort of this really big light bulb moment to sort of realize like, wow, the way that I actually approach this stuff is literally what we talked about in that episode. And sort of for the first time kind of slammed together in my mind, like, made the connection finally, like, oh, that's Bayesian statistics. So it's just so funny. Yeah, like, I know what that is conceptually, like, oh, it's your priors, blah, blah. But like, as a model for actually doing stuff in the real world, I hadn't really said, like, oh, I'm Bayesian in the way that I think about that.
Tim Wilson
It's funny because I think one of my tendencies, and I always say this to my team, is that I oversimplify things. And I think that's just part of my role. Right. Is like, I'm often trying to communicate something really complex to a leadership team. But I think one of the things that I really loved about that episode is, like, in my mind, I think I had maybe perhaps oversimplified what I understood about Bayesian stats, and Michael brought a level of new depth to the topic that really added a lot of value to me personally.
Michael Helbling
Yeah, no, I really liked it, and it was actually super applicable. Like, I was literally sitting down with a client not long after we recorded that, and I was able to walk them through sort of a process they could follow, where they were in a situation where a frequentist approach would not have worked well in that context. And I was like, well, here's some other alternatives. We could actually do something like this. And it actually worked really well. But it's sort of funny, because I probably would have still suggested that, but now I could actually call it what it was, as opposed to sort of being like, I've got an idea. Try this. It probably has a name. I just don't know it. Anyway, it was just really cool, like, to sort of connect the dots on that for me this year.
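For anyone wanting the mechanics being gestured at here, this is a minimal conjugate Beta-Binomial sketch, with made-up numbers rather than any actual client situation: encode a prior belief about a conversion rate, observe a small sample, and read off the posterior. It's exactly the small-sample setting where a frequentist test has little to say.

```python
# Minimal Bayesian updating sketch (Beta-Binomial conjugate pair, made-up
# numbers): a prior belief about a conversion rate plus a tiny sample.

prior_a, prior_b = 2.0, 38.0     # prior roughly centered on a 5% rate
conversions, visitors = 4, 40    # tiny sample observing 10%

# Conjugate update: add observed successes and failures to the prior counts.
post_a = prior_a + conversions
post_b = prior_b + (visitors - conversions)

prior_mean = prior_a / (prior_a + prior_b)   # 0.05
observed = conversions / visitors            # 0.10
post_mean = post_a / (post_a + post_b)       # 0.075

# The posterior mean sits between the prior and the raw observed rate,
# pulled toward the data in proportion to how much data there is.
print(prior_mean, observed, post_mean)
```

With more visitors, the posterior mean drifts toward the observed rate and the prior matters less; with these 40 visitors, the estimate lands at 7.5%, partway between the 5% prior and the 10% sample.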
Mo Kiss
All right, one of the other ones that I'll throw out there. Another recent one that we did was 268, Metrics Layers, Data Dictionaries... Maybe It's All Semantic Layers?, with Cindy Howson. So I have to admit, full, full transparency, when we were in our planning for that, I'm like, is that really a whole episode? I'm like, okay, I'm not on it. But holy shit, yes, it was a whole episode, because it was with Cindy and it was really, really well done. I love it so much.
Michael Helbling
Val, Tim and I will all tell you, we've gone into certain episodes over the years and been like, I don't know about this. And it turns out to be amazing. So a lot of times a little bit of doubt is almost an indicator that something good might happen here.
Tim Wilson
But also, I think the fact is that Cindy herself is such an experienced data practitioner, has such a depth of knowledge about the technologies and the topic we were talking about. I mean, I could talk about semantic layers for hours, which I have done with Cindy from time to time. But I think that episode was really strong. And, yeah, semantic layers is a hot topic at the moment. Lots of folks are building things. You know, there's a dbt product, a Snowflake product, and there's a bunch of similar products that are built into BI tools. And it's a very timely episode as well, given how quickly things are moving in the industry. Or maybe, I don't know, maybe not quickly, because we're, like, trying. But Cindy, I think, was just such a wonderful guest for that specific episode, and it's probably one of my favorites as well.
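To make the semantic layer idea concrete for readers: the core promise is that a metric is defined exactly once, and every consumer renders queries from that shared definition instead of re-deriving it. The toy registry below is purely illustrative; it is not the dbt, Snowflake, or any BI tool API.

```python
# A toy "semantic layer": each metric is defined exactly once, and every
# consumer renders SQL from the shared definition instead of re-deriving it.
# Illustrative only; real products (dbt, Snowflake, BI tools) differ.

METRICS = {
    "revenue": {"expr": "SUM(order_total)", "table": "orders"},
    "orders": {"expr": "COUNT(DISTINCT order_id)", "table": "orders"},
}

def render_query(metric: str, group_by: str) -> str:
    """Build a grouped query from the central metric definition."""
    m = METRICS[metric]
    return (
        f"SELECT {group_by}, {m['expr']} AS {metric} "
        f"FROM {m['table']} GROUP BY {group_by}"
    )

print(render_query("revenue", "order_date"))
```

Because every team renders from the same `METRICS` entry, "revenue" cannot quietly mean two different calculations in two different dashboards, which is the consistency problem the episode digs into.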
Val Kroll
And the fact that she made the point that, one, they're not new, and two, that it's not one monolithic thing. I was like, those were, like, two.
Michael Helbling
That was big.
Val Kroll
Very like, ah, this has gotten the label of, this is the grand new thing, just roll it out. And I was like, it is. The fact that she is very, very politely, really fucking annoyed with the cycle of the latest shiny thing being treated as, like, this is the thing, the answer. So, Josh, you were going to say something?
Josh Crowhurst
Yeah. A recent one that I particularly enjoyed was 281, Analytics: The View from the Corner Office, with Annalee. Yeah, great episode. And I think we had been talking about finding the right guest for this idea for years, maybe. It was a long time. That was one I think we were trying to put together for a long time. So when I saw that on my Spotify feed, I was like, oh, I have to listen to this right away. And it was worth the wait, for sure. And for me, it really resonated, maybe partly due to some perhaps slightly traumatic recent experiences in my previous company where I had exposure to senior leadership. A few of the things that she talked about were really sharp, I thought. Talking about setting a culture of productive curiosity: I love the term, because I did see it firsthand. You'd be in a meeting and the CEO would make an offhand comment, and then people would just spend an inordinate amount of time digging into whatever the ask was. Because, you know, the CEO said it, like, I have to, you know, I have to do this. And they don't think, well, it's just another stakeholder; it might not be something that's worth spending hours or days looking into. So we would come back in the next meeting and the CEO wouldn't necessarily even remember making the comment. So I kind of learned to level set in the meeting before going and saying, hey, we're going to look into this; this is the amount of time we're probably going to spend on it. Just get that out there before leaving the room. But what I liked about what Anna said was she talks about having a level of precision that's necessary and sufficient for the importance of the decision that's being made, and then having the self awareness as a leader to specify that and save the team some of the bandwidth.
So as an analyst, when you're in there, if it's not clear.
Val Kroll
State it.
Josh Crowhurst
And get it out in the open and get the alignment. But I love Anna's perspective of taking ownership as a leader: realizing that what you say, people might just take it and run with it and spend a ton of time, and you didn't necessarily intend it that way. So I loved that framing. And then I just thought, yeah, a really thoughtful perspective on what a data driven culture can look like and how it can be established and driven from the executive level. And then just one last.
Tim Wilson
The specific bit that really sang to me was how much responsibility she took as the leader for that culture, versus assuming that your data scientists are responsible for the data culture alone. That was one that really stood out.
Josh Crowhurst
Yeah, no, it made me... I was like, I want to work there. She has such a great way of framing it and thinking about it and communicating her vision on how data can be used and should be used, and then setting the example. It was just really inspiring, honestly. And then one last thing that resonated, again going back to my PTSD: brief your analysts if you want them to be set up to succeed the first time they're presenting to the CEO. I'll say that maybe didn't happen for me. I might have been pantsed in front of the whole group executive committee as a result of that. So, as a manager of analysts, please do that. That's great advice. Prevent any dramatic pantsings of your team when they're in a room.
Mo Kiss
Poor Josh. Yeah. The thing that also struck me about that conversation was, I don't think she realizes how novel her perspective is. She was like, oh, of course that's what leadership does. And I was like, can you say that in some of your circles? Like, where's the link to your job posting? I think I even said that, Josh. I was like, hopefully your last call is that you're hiring, because this is awesome. But yeah, that was a good one.
Michael Helbling
It was. That was one of my favorites too, Josh, because it was in a way so affirming of a thing I've really come to believe more and more: leadership drives data culture more than the data team does. The only way to really drive a data rich culture, or data informed culture, in a company is if the leadership is doing it. Because even when you take on the role as a data leader in your company, you can't force people to become data driven. They either are or they aren't. But if the CEO is saying it, well, then that makes it a different thing altogether. But yeah, that was a great episode. And it was a long time coming. Every year we'd have that on our list, like, yeah, we gotta find somebody who could do justice to this topic. And as analytics people, we're always thinking, what do they think about when they're sitting as the CEO? What's their perspective on data? Do they care? Do they look at these charts and graphs? That's a question I think our whole audience thinks about. Anyways, Anna was amazing.
Val Kroll
That was... yeah, years ago we had someone who agreed and was ready to come on, and then ghosted us, like, completely. So it was like, oh, that's rough.
Michael Helbling
Oh, I forgot all about that.
Val Kroll
I talked to him, talked to him later. It turned out his company was, like, in the midst of... it was about to get acquired. He was like, yeah, I really needed to go dark. And I'm like, I don't know, that email response of, hey, actually this isn't a great time, wouldn't have, you know, been too problematic. But I don't know. So yeah, I agree.
Michael Helbling
Well, what trends are shaping the next year, Mo?
Mo Kiss
Oh God, I don't know. I've just obviously gone through lots of 2026 planning and thinking about the year ahead, and it sounds so boring, but if I had to boil it down to a couple of key things that I'm really thinking about, it is about consistency: making sure that we have really solid consistency in metric definitions and how metrics are calculated and all those sorts of things. It just sounds boring, but I feel like it's becoming more important than ever. I think the other thing that I'm spending a lot of time thinking about is, I don't know, we are all using AI just for internal efficiency gains, and it just feels shit. Like, if you're using it to write a better email or a Slack message, it just doesn't feel like that is how we can be getting the best from some of these tools. And so I'm thinking a lot more about, specifically, the data products we make and how we can better automate. And I'll give you a specific example, which is going to sound stupid and lame, but this is the exact thing, right? So we used to keep a list of dashboards, you know, your top company dashboards, so when someone onboards you can be like, you want to know about this topic or this topic or this topic, you go here. It's a manual list. It's a pain in the ass. It always ends up outdated, not maintained. And I was like, that is a problem we should be solving with technology.
Michael Helbling
Right?
Mo Kiss
And I think that's probably why I'm so hyper focused on consistency and all the fundamentals. Because if you want to throw technology at, how do we maintain this list without needing someone to go manually update some spreadsheet or whatever it is? How do you understand which of your dashboards are being used, which are high value, which are going to answer the right question? To do that, the data that you're using to build a technological solution has to be very good quality. But yeah, those are just the things that are on my mind going into 2026. Oh, Tim looks triggered. Well.
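Mo's "derive the list from usage instead of maintaining it by hand" idea can be sketched in a few lines. The rows and thresholds below are invented for illustration; real BI tools expose comparable usage logs through their admin APIs.

```python
# Sketch of deriving the "top dashboards" list from usage metadata instead of
# hand-maintaining it. The rows and thresholds are invented for illustration.

from datetime import date, timedelta

usage = [
    {"dashboard": "Exec KPIs", "views_90d": 420, "last_viewed": date(2025, 12, 20)},
    {"dashboard": "Old Campaign Deep Dive", "views_90d": 2, "last_viewed": date(2025, 6, 1)},
]

def active_dashboards(rows, today, min_views=10, max_idle_days=60):
    """Keep dashboards that are used both often and recently."""
    cutoff = today - timedelta(days=max_idle_days)
    return [
        r["dashboard"]
        for r in rows
        if r["views_90d"] >= min_views and r["last_viewed"] >= cutoff
    ]

print(active_dashboards(usage, today=date(2025, 12, 23)))  # ['Exec KPIs']
```

The point Mo makes about data quality applies directly: this only works if the usage log itself is trustworthy and complete.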
Tim Wilson
I mean, I believe in getting back to the fundamentals, that there still is... it is so easy to get caught up in, we're going to keep measure, measure, measure, measure, measure, and the complexity kind of explodes. Mo, you're at a digital native company with a massive amount of data. Even in the last two weeks, I have had an experience with a massive company where their issue was much more around internal alignment on what different teams were trying to accomplish, and not the data. Every time the data people would come in, it was just kind of like puking out charts of stuff, and you could see that that wasn't serving the business. I mean, there were some kind of comical ways in which the data people were very knowledgeable, the visualizations were fine, they could answer questions about the minutiae, and that wasn't remotely what the organization needed. So that's not a direct response, but I think I cringe a little bit at, well, let's look at which dashboards people are looking at and which metrics. That, to me, winds up being what often comes up as, can AI come up with an engineering solution that's just going to tell me the insight? It's kind of like, well, let's just.
Mo Kiss
I don't agree, I don't agree. I think, fundamentally, you and I are very aligned that it's about the business question that you're trying to answer. Right? Like, I would say that that's.
Michael Helbling
I don't know.
Mo Kiss
I'm getting, like, a semi nod. I'm getting a semi. One of the concerns you have is, like, are people leveraging AI to answer a question that could be answered very easily with something that's already built? And then it comes down to, this is more about cost efficiency.
Michael Helbling
Right.
Mo Kiss
I don't want someone continually asking a question every day that's costing us money to run that is sitting on a dashboard that can be easily looked at and interrogated if they just know where it is. It's about discoverability to answer that question. And so there are multiple problems that you're trying to solve. And it might just be another way to answer that business question.
Tim Wilson
I would say I wish it was a trend of 2025, but I think the reason I was having that reaction to answering business questions goes back to, momentarily mounting my soapbox, the definition that if somebody in the business asks this question, it's a business question, and therefore I need to answer it, and how can I answer it efficiently and most effectively? And it becomes a volume play. Whereas if you instead totally shifted... I think there's a crap ton of questions that are kind of fishing, that actually point to a much more fundamental challenge they're trying to solve. And this does come to the AI companies that are like, imagine if you could just sit there with ChatGPT and just ask it questions and it would provide responses. And then the pushback winds up being, ah, but the answers have hallucinations, or, ah, without this engineering it can't provide accurate answers. And to me I'm like, that is not the goal. The end state is people who aren't thinking rigorously about what they're trying to do and are prematurely jumping to the data. I deeply, in my soul, believe that that is heading down a path of just getting more people wandering through more data to have more meaningless arguments to produce more overly lengthy PowerPoint or Canva or Google Slides decks that aren't actually moving the business forward. So it's actually putting fuel on the fire of something that is broken in business. Horribly, horribly, horribly.
Michael Helbling
Activity without outcome, maybe. So, Tim, maybe the AI product you want to see built is the one that forces more rigorous questioning by guiding people through that process. To be like, why are you asking that question? Oh, interesting. Refine that. Okay, you don't really want that analysis, because you wouldn't want to mistake this for this. So maybe you want this analysis? Something like that would be.
Mo Kiss
And then. And then at the end, does it turn into like an intake for like an intake system that goes, you and me.
Tim Wilson
I was. I was telepathically communicating with you. Being like, it kind of sounds like a JIRA intake ticket.
Val Kroll
She's trolling.
Josh Crowhurst
Gaslighting.
Michael Helbling
That was something I was already going to say, which was at the end of episode 279, The Process of Analytics: We Have Thoughts, we were talking about that. And at the end of that episode, I was like, now, because of AI, all these processes are going to take on even more importance. And Tim jumped down my throat and said, they've always been important. And he wasn't wrong. But the reality is, to get to leverage AI, you have to do those precursors. So, to Mo's point, that return to some of the fundamentals is kind of the trend. So Tim was wrong to do that to me on that episode. That's really the point I'm making.
Mo Kiss
Okay, so here's another poll. Another poll. Do we think that AI was mentioned more this year, or Tim's blood pressure raising happened more this year?
Val Kroll
Wait, what was the first option?
Mo Kiss
Mentions of AI versus Tim's blood pressure, Right?
Val Kroll
Yeah. Well, they're deep, they're deeply correlated, and there is causation.
Michael Helbling
Well, at least for Tim's blood pressure, there's medications for that, so. Oh, brother. Well, that's one trend that will probably continue is that Tim and I will tangle up a couple of times. No, it's fine.
Mo Kiss
So I'm probably going to say something fiery again. Also, just to clarify, answering a business question does not mean that we should answer every question raised by the business. Just to caveat the former discussion.
Val Kroll
I'll move, but if you're making it so that they can get to whatever the question is. Yeah.
Michael Helbling
Okay.
Mo Kiss
The next topic that I also think is coming up a lot, which very much ties back to the episode with Anna, is decision velocity. I think that is something that is really, really interesting. And again, like Tim makes the point, I work in a very unique position at a company that's probably not representative of where most data folks are working. But it's very much, how do you use the right level of rigor for the decision that you're trying to make as a business? And sure, maybe there's some AI sprinkle salt on the top of that as well.
Val Kroll
So I think there's a point in getting the business more sophisticated about what the stakes behind the decision are, and therefore whether a little bit of a signal, very quickly, is better and more desirable than getting a complete answer, but way too late. And I think there is starting to be some awareness on the business side that if you are just waiting for the inarguable truth, you'll just be waiting forever. Although I think there is still the tension between the teams: they can't get answers to me fast enough, why can't I just have an AI? This was secondhand, but somebody said their CMO was like, I just want to have the AI just tell me, give me insights while I'm in the shower in the morning. I just want to get up and have it have sifted through the data. And I'm like, okay, we still have a ways to go, because that CMO is.
Michael Helbling
But that's not the same thing as decision velocity, because I guarantee you that CMO is not making decisions effectively and at a good speed; they're going about gathering information incorrectly in the name of decision velocity. One time somebody told me that a CEO is just a decision engine, which I thought was actually a really cool way to think about that. And when you think about executive leadership generally, clearing obstacles for your team and all those things, decision velocity is a huge part of it: not getting yourself bogged down. And there's lots of frameworks for that, like the old Bezos two way door versus one way door decision matrix, that kind of stuff. So there are these things that can help. But I do think you could look at AI as an enabler to help you frame or think about speed to decision. Because one of the things my old boss Mike Gustav used to do was force us to write down decision journals. I don't know if you've ever done this before. Very time consuming and very annoying. And I was always super bad at it. It's probably why I'm not as good a decision maker as I should be. But it helps you then go back and look at previous decisions and what led up to them and go through that. So not everything is data analysis; data informs some decisions to a greater or lesser extent. But to the extent that, oh, if I had used this data, I might have made a better decision, if you're evaluating your decision capabilities, I think AI is really well suited to helping you remember some of those things over time as well. So that could be another way to leverage AI in that context. Maybe. Go faster, be smarter.
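For anyone curious what a decision journal entry might look like in practice, here is one possible minimal structure. The field names and example values are illustrative, not a template Michael or Mike Gustav endorsed.

```python
# One possible shape for a decision-journal entry: capture the decision and
# the reasoning now, then revisit it later. Field names are illustrative.

from dataclasses import asdict, dataclass, field
from datetime import date

@dataclass
class DecisionEntry:
    decided_on: date
    decision: str
    context: str             # what we knew at the time
    expected_outcome: str    # what we thought would happen
    data_used: list = field(default_factory=list)
    review_notes: str = ""   # filled in when the decision is revisited

entry = DecisionEntry(
    decided_on=date(2025, 11, 3),
    decision="Pause the retargeting campaign",
    context="CPA doubled over two weeks",
    expected_outcome="Blended CPA drops within a month",
    data_used=["ad platform CPA report"],
)
print(asdict(entry)["decision"])  # Pause the retargeting campaign
```

Recording `data_used` alongside `expected_outcome` is what makes the later review useful: you can ask whether better data would have changed the call, which is exactly the evaluation loop described above.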
Val Kroll
So I think this is your opportunity, Michael, to make a decision to Bring the show to a close.
Michael Helbling
You know, it's about that time, Tim. It's hard, because I don't want to, for two reasons. One, we've got Josh on the show and I don't want it to end. So that's one part. And then the second part is, it's the end of 2025; this is our last episode of the year. And let's get on with 2026.
Mo Kiss
I am ready for it.
Michael Helbling
Moe's ready. All right, let's shut the door. So we're done. Thank you all. As you've been listening, maybe you have a memory of 2025 you want to share. We would love to hear from you. Or what are you looking forward to in 2026? Same thing, reach out to us. You can comment to us at our LinkedIn page or on the Measure Slack chat group, or via email at contact@analyticshour.io. We'd love to hear from you. And obviously, thank you, Josh. No show would be complete without thanking you for coming back to be one more special guest.
Josh Crowhurst
This is fun. Thanks for having me, guys and girls.
Michael Helbling
It's awesome. It's awesome. It's. We do. We do. I think you're still in our Slack. I don't know if you've just abandoned that Slack at all or you still.
Josh Crowhurst
I kind of peek in Slack, but I do still get the emails. Oh, yeah, I'm seeing those idea suggestions coming through.
Mo Kiss
Some pro.
Michael Helbling
I'll remove you from the email list, I guess.
Mo Kiss
Process breakdown.
Michael Helbling
Keep getting those. Yeah, well, we didn't really have a process for that. So under GDPR, you do have a right to be forgotten, but I don't want it to. All right. And if you listen to the show, leave a rating and review. If you've been listening throughout 2025, go to your favorite platform, give us a review, rate the show; that helps other people discover it. And we've had a lot of audience growth this year, both on our regular channels and on our YouTube channel. So if you're ever on YouTube, subscribe to us there as well. We put every episode up on our YouTube channel, as well as some awesome shorts that the team puts together for each episode. Don't know what we're gonna put together out of this one, but we'll see. And then, of course, for all of my co hosts, I think I speak for everybody when I say 2026 is going to be an amazing year. But no matter what it brings, you know that you can always keep analyzing.
Analytics Power Hour Outro
Thanks for listening. Let's keep the conversation going with your comments, suggestions and questions on Twitter at @analyticshour, on the web at analyticshour.io, our LinkedIn group and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst.
Josh Crowhurst
Those smart guys want to fit in, so they made up a term called analytics. Analytics don't work.
Analytics Power Hour Outro
Do the analytics say go for it no matter who's going for it. So if you and I were on the field, the analytics say go for it. It's the stupidest, laziest, lamest thing I've ever heard. For reasoning in competition.
Michael Helbling
I nearly, Josh, on the last episode, did a "no show would be complete without a huge thank you to Josh." Do I switch? And I switched it at the last second.
Mo Kiss
No show could be... would be complete without... keep analyzing.
Josh Crowhurst
That's not how I.
Val Kroll
I a huge thank you door.
Michael Helbling
Yeah. And yeah, it's like just a hard cut. No show would be complete without demanalyzing anyway.
Val Kroll
Wow.
Josh Crowhurst
Hey, I still need the theme music, so I feel like I can still get the occasional shout out.
Michael Helbling
Yeah, yeah, you're in it.
Val Kroll
Rock flag, and let's raise a glass with Tim and Mo, with Michael, Julie, Val. Five hosts who guide us through the noise and make the numbers tell. For all our power hour friends, for all our power hours, we'll toast the laughs and insights shared in all those power hours. Voice crack! Should have picked a different key on that one.
Michael Helbling
That is awesome.
Josh Crowhurst
That has to be it. That has to be it.
Val Kroll
That has to be it.
Michael Helbling
That's the best one we've ever done.
Release Date: December 23, 2025
Hosts: Michael Helbling, Moe Kiss, Tim Wilson, Val Kroll
Special Guest: Josh Crowhurst
This festive episode of The Analytics Power Hour gathers the regular crew plus special guest (and former producer) Josh Crowhurst for a spirited and reflective "2025 Year in Review." The team looks back at the stand-out podcast episodes of the year, shares personal and professional highlights, and discusses industry trends—most notably, the pervasiveness of AI in analytics. The conversation is as much about the evolution of data practice as it is about memorable bar conversations and camaraderie.
Favorite Episodes & Why They Stood Out
Episode 263: "Analytics the Right Way" with Dr. Joe Sutherland
"We started off our year strong. No show would be complete without Tim Wilson kicking off our year with the announcement of 'Analytics the Right Way'... It was super fun to be a part of that episode." ([04:49])
Episode 272: "Calculated and Complex Metrics" with Dan McCarthy
"It was just so cool to watch somebody’s eyes light up about things they were passionate about." ([07:53])
“The number of times I get questions about LTV to CAC...I’m just like, here is a show that I prepared earlier.” – Moe ([08:32])
Episode 282: "Using and Creating Data to Understand Pop Culture" with Chris Della Riva
“Honestly, probably my favorite episode ever because it’s so right up my alley... Tracking writing credits and production credits—this is something I just do impulsively.” ([11:02])
AI's Domination of 2025 Discourse
“How many episodes do you reckon AI came up in?” ([14:06])
The group estimates at least 75–90% of episodes included AI, noting its movement from hype to practicality.
“AI is starting to reshape our industry. And I think we both call bullshit on nonsense like vibe analytics.” ([15:57])
“At the start of the year, the ratio of AI hype...was like north of 90%...It’s slowly gotten a little more in balance...just as the conversation...gets more mature.” ([18:57])
“Literally from spring to fall...I felt like we’d already gone through a maturity curve almost with the way we’re discussing AI and some of its use cases.” ([21:43])
Concerns About Hype vs. Practical Use
“Like with any trendy thing, it’s cool to think of all the use cases...and then the cool thing is to be like, but you can’t do this, can’t do that.” ([19:23])
“Instead of a bubble, think of it as a forest fire...There are players that were sufficiently hardy and had actually a plan.” ([19:48])
“I felt like there was this return to us discussing, I want to say, quote unquote, the basics...how we do things well and measure if output is good.” ([22:39])
“Your own expertise drives results in AI...all the things that we teach as analysts...AI is a part of that process now, but we still have to maintain all of those parts along the way.” ([24:09])
“There’s a lot of convincing sounding bullshit where if you prod just one layer deeper, the thinking just isn't there.” ([27:01])
“There is stuff that is supposed to be hard...” ([27:01])
“She’s just a joy...You guys did a really nice job probing with some really good questions that were very thoughtful from real life experiences...” ([30:32])
“That was so much about a human being pulling things from different directions...a pretty nice corrective to the AI hype.” ([32:36])
“It was sort of this really big light bulb moment to realize, wow, the way I actually approach this stuff is literally what we talked about.” ([32:36])
“Michael brought a level of new depth to the topic that really added a lot of value to me.” ([33:32])
“Holy shit, yes, it was a whole episode because it was with Cindy and it was really, really well done.” ([34:51])
“I loved that framing...taking ownership as a leader, realizing what you say—people might just take it and run with it and spend a ton of time...” ([37:06])
“That was one that really stood out.” ([39:41])
Emerging Themes
“It is about consistency and making sure that we have really solid consistency in metric definitions...It just sounds boring, but I feel like it's becoming more important than ever.” ([43:08])
“It comes down to cost efficiency...I don’t want someone continually asking a question every day that’s costing us money to run that is sitting on a dashboard that can be easily looked at...” – Moe ([47:01])
“Maybe the AI product you want to see built is the one that forces more rigorous questioning by guiding people through that process.” ([49:45])
“Do we think that AI was mentioned more this year or Tim’s blood pressure raising happened more this year?” – Moe ([51:13])
“How do you use the right level of rigor for the decision that you’re trying to make as a business? And sure, maybe there’s some AI sprinkle salt...” – Moe ([52:12])
Conversational, irreverent, and candid—with healthy doses of skepticism (especially regarding AI hype), humor, and camaraderie. The hosts share personal stories, lighthearted arguments, and a deep respect for the analytic craft, fundamentals, and community.
This Year in Review reflects on a period when AI-talk dominated industry chatter, but the Power Hour team grounds their analysis in the enduring value of fundamentals, rigor, and human insight. Listeners are encouraged to revisit notable episodes, participate in the ongoing discussion, and gear up for changes and challenges in 2026—with the reminder to, as always, “keep analyzing.”
If you want to follow up on any of the episodes mentioned, check the Analytics Power Hour archive. For more discussion, visit their LinkedIn, Measured Chat Slack, or send a note to the team!