
Alan Mosca
The Agile brand.
Greg Kilstrom
Welcome to the B2B Agility Podcast, where we look at the factors that drive success in B2B marketing, with a focus on the people, processes, data, and platforms that make B2B brands stand out and thrive in a competitive marketplace. I'm your host, Greg Kihlström, advisor to Fortune 1000 brands on martech, marketing operations, and CX, best-selling author, and speaker. Now let's get on to the show.
If you could eliminate one of the biggest roadblocks to successful project delivery in your organization, what would it be and why? Agility requires not only the ability to adapt to change, but also the foresight to anticipate it. This means embracing data-driven insights and leveraging technology to navigate the complexities of modern project management. Today we're going to talk about how AI is transforming project management and enabling true agility in industries that haven't traditionally been known for a rapid pace of change. To help me discuss this topic, I'd like to welcome Alan Mosca, co-founder and CTO at nPlan. Alan, welcome to the show.
Alan Mosca
Hey Greg, thanks so much for having me. It's a pleasure to be here.
Greg Kilstrom
Yeah, looking forward to talking about this with you. Before we dive in, though, why don't you give a little background on yourself and your role at nPlan?
Alan Mosca
Yeah, so as you said, I'm co-founder and CTO, so we've been here from day zero. nPlan is, in startup terms, maybe relatively old now. We've been going eight years, which in construction terms is still a baby, but in startup terms, you know, we were pre-GPTs and all of that. We were founded as an AI company, or at the time, as a machine learning company. My personal background is that I used to work as a quant, so I did financial modeling at trading firms. While I did that, I did a part-time PhD in machine learning theory, because it was just starting to blip on everybody's radar, and it became so interesting that I got hooked. I got to the end of that and co-founded nPlan with my co-founder, who told me about some of these problems that are happening in construction, and I thought, this seems like a problem worth solving. So I ditched my career in finance, which surprisingly correlates very well with what we're doing at nPlan in terms of quantifying risk and thinking about different scenarios, both on individual projects and portfolios. So I did manage to transfer over a lot of my learnings from that, which I wasn't expecting at the time.
Greg Kilstrom
Great, great. Yeah, definitely. I don't know a ton about construction, but I have a little experience in the large-scale space. Lots of moving pieces, and I think that's what we'll talk about a bit here. I want to start with how AI is really transforming those large-scale projects and project management. We certainly talk about AI a lot on the show; it seems like everybody talks about AI all the time, everywhere. But can you share a, no pun intended, concrete example of how AI is actually improving project planning, scheduling, or risk management in a large-scale construction project?
Alan Mosca
Yeah, yeah, we have a ton of use cases. One of my favorites, because we started working with them quite a few years ago, is the Transpennine Route Upgrade, usually shortened to TRU, which is £10 billion, so it fluctuates depending on currency, but let's call it between 12 and 15 billion dollars, of upgrade works on a rail route in the north of England. There's electrification, and there's a lot of complexity to it because it's also being done without shutting down the line. So it's 10 years of work, give or take, that needs to happen. It's very expensive, it's very complex, there are new stations being added, and all of this whilst you keep the trains moving. We started working with TRU four years ago, and one of the stories that I actually love talking about, because it's very simple to understand, is TRU's reporting structure. Construction of rail is kind of centralized: there's a Department for Transport in the UK that is ultimately in control via multiple delegate structures, and they have a governance reporting system. Every four weeks the project team at TRU would produce a report for governance, risk reporting, and general project controls: these are the decisions, these are the problems that are coming up, this is what we're going to do about it. Making that report took them six weeks. Yeah, every four weeks. So you end up in this paradox where anybody reading information from that report, even if they read it on the day it's finalized, is reading information that is 10 weeks out of date. Which means you can't really change things in the future; by the time you're reading about a potential problem, it might have already happened. So we put in a lot of automations through our agent, but also, by the nature of the fact that we were able to do a lot of forecasting for them quickly, at speed, they could look at emerging problems.
That got down to roughly two weeks now.
Greg Kilstrom
Wow.
Alan Mosca
And you know, it's still a very intense two weeks. The project controls team at TRU is, I want to say, hundreds of people just working on this type of stuff and the scheduling and the planning. So we've actually helped them: all of a sudden, instead of having two weeks every month where they're working on two reports at the same time, they have two weeks every month where they're free and can actually do their job. Because the reporting is not even their job; it is something that they have to do on top. So I felt that is a really good success story. And they have their own estimates of how much they managed to save, not so much by saving time, but by being able to think about things that they wouldn't have thought about before. There is this thing called a possession of track, which in most circles in the United States is called an occupation. Either way, it basically means that the track is closed for that period of time, usually overnight or on a holiday or a weekend, because you don't want to disrupt commuters. And all the work in planning is preparation to be ready for the moment you start the possession. Because the track possession is planned very meticulously, down to the single minute, the rail owner, which is Network Rail in the UK, has fines that are ridiculous amounts of money, I think it's like £35,000 per minute if you go over an occupation, because you're delaying trains at that point. So everything is planned super meticulously. There's a lot of margin, but you want everything to be ready for that point. They had this exercise that they were doing for an occupation over the Easter period, and we started working with them on that, and they figured out that they weren't going to be ready, and these were all the things that they needed to do. So in the end, they were ready for the occupation.
But if they weren't, it would have probably cost in the eight-to-nine-digit order of magnitude.
Greg Kilstrom
Yeah. Yeah. So in conversations about AI, efficiency, let's do things more quickly, is often the topic. But you bring up a few other really good points: the free mental real estate to be able to think about things, but also the time to plan and mitigate, whether that's mitigating risk or, in this case, tens of millions of pounds of potential fines and other costs. So it is efficiency, but it can be so much more, right?
Alan Mosca
Yeah. In general terms, when I talk about AI, a lot of people are talking about, oh yeah, automation, let's do this thing but quicker by pressing just a button. And that is cool if you want to charge $20 a month, right, via ChatGPT. But it's more about: okay, now that you can do this, what is that enabling you to scale up? So if I'm taking a forecasting procedure that used to take six months and I can do it in 10 minutes, which is our core value proposition, what does that mean? Well, it means that I can do hundreds of them every day, which means I'm now all of a sudden entertaining this entire space of scenarios for different plans. And you end up looking a little bit like Loki, thinking about multiverses and trying to pick what the next thing is. And I'm sorry if you're not a Marvel fan. But that's the value to me; that's how you create the value. Rather than, oh, I'm going to save 50 hours a month of manual work. I'm like, sure, that's cool. But for organizations that spend billions of dollars, 50 hours a month is not registering on any...
Greg Kilstrom
Yeah, it's a blip. Yeah.
Alan Mosca
Yeah, it's a blip.
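That "entire space of scenarios" idea can be made concrete with a toy Monte Carlo sketch. This is purely illustrative, not nPlan's actual model: the activity names, durations, and spreads are invented, and it simply samples uncertain durations for a short sequential chain and looks at the distribution of finish dates instead of a single deterministic one.

```python
import random

random.seed(0)

# Hypothetical sequential activities: (name, expected days, +/- spread)
activities = [("groundworks", 20, 8), ("structure", 40, 12), ("fit-out", 30, 10)]

def one_scenario():
    # The chain is sequential, so the finish date is the sum of sampled durations
    return sum(max(1, round(random.gauss(mu, sd))) for _, mu, sd in activities)

# Thousands of "what-if" runs in well under a second
finishes = sorted(one_scenario() for _ in range(10_000))
p50, p80 = finishes[5_000], finishes[8_000]
print(f"P50 finish: day {p50}, P80 finish: day {p80}")
```

The planned total here is 90 days, but the P80 finish lands noticeably later; scaling the same idea to a schedule with tens of thousands of interdependent activities is where the computation, not the concept, gets hard.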
Greg Kilstrom
Is that kind of a misconception there? What do you think the biggest misconception is that you encounter when you start talking about introducing AI into project management, and how do you address it?
Alan Mosca
Well, the biggest misconception that I find now is that everybody just thinks that when you mention AI, you actually mean ChatGPT. I see this a lot in the thought leadership circles within project controls, which is the wider domain that we operate in, where people are giving webinars: this is how you use AI, and it's just a series of prompts. And I'm like, that's like 0.1% of the things that you can do, right? We've developed an army of different machine learning algorithms, plus our own LLM agents, plus our own LLM models and generative models for plans and everything else. Those are the things that, to me, are exciting, rather than, oh, here's a couple of cool prompts. The problem is that we're still very anchored on AI equals LLM right now, rather than asking what you can really do. Thinking, what's AI in 2027?
Greg Kilstrom
Yeah, yeah. I mean, I think as consumers start using things like ChatGPT, it sort of solidifies that further. Obviously AI has been around for decades, but it's as if it was invented in 2022 or something like that.
Alan Mosca
Yeah.
Greg Kilstrom
Depending on who you ask, yeah. So how do you get beyond that? Is it showing use cases? How do you get beyond that initial kind of mental block?
Alan Mosca
Yeah, you need to get them to see it, however that is. So obviously our commercial team does an enormous amount of demos, webinars, whatever else. But also, we have this MO in our commercial team. If you think about our B2B structure, in a sense we sell to very large enterprises that operate on decade timescales, so that sales process in itself is a century, basically. But also, we set it up so that a pilot for us is never less than six months. Whereas you could say, oh, I'm going to pilot ChatGPT for a month or two and see what I can get out of it, and that's enough to make a decision whether I want to pay even the $200, for us it's more like: okay, you need to run through these loops of planning, forecasting, and reporting a couple of times until you start actually flexing a new muscle that helps you think in what-if terms about the future of your project, rather than, oh, we're behind on this. And that takes a little bit of time. And there are so many different stakeholders, as you can imagine. Every time we onboard a new customer, that's about 50 to 100 users who all have different job roles, so there's a lot of managing that complexity as well. With any pilot shorter than six months, we're just not going to get to the point where our customers see a return within the pilot. So we'll refuse point blank to do a three-month pilot, and we'll always push for at least six months. In most cases it's a year: the first year of usage of the product is called a pilot, and at the end of the year we can actually say, these are the things that we told you at the beginning, the ones that you said weren't going to happen, and these are the things that happened. So you even end up with an "I told you so" kind of moment, which is not fun, right? Because you have to go to someone and say, hey, you ignored me, I was right.
Greg Kilstrom
Right? Clients love that.
Alan Mosca
Yeah, yeah. Clients love being told that their plan isn't good enough, that there's an enormous amount of risk they're not going to hit their planned date, and then "I told you so." Those are the best ways to close a deal quickly, right? So you put all of this together, plus the fact that humans are not built for thinking in probabilistic terms about the future at scale. The primary data that we operate on is construction schedules. A schedule for a megaproject is effectively a Gantt chart that will have between 5,000 and 100,000 activities. If I put this in front of any human being, they're going to, at best, pick 50 things that are worth focusing on. And so you've left, like, 99,950 things unobserved that might blow up in your face. And because it's a Gantt chart for a project, it's a very long sequential schedule where everything's connected together. As soon as one thing goes wrong, everything else pushes to the right. So it kind of ends up in this self-fulfilling prophecy. And that's what we found at the beginning, studying this data: the reason projects are late is that it's impossible to mitigate everything, it's impossible to know what's going to go wrong, and we're doing our best efforts manually, but you really need a scale of computation and thinking that can't be matched even by 100 humans most times. Yeah.
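The "everything pushes to the right" behavior falls directly out of the precedence structure of a schedule. A minimal sketch, using a hypothetical five-activity chain (the names and durations are invented, and this is not nPlan's code): a forward pass computes each activity's finish from its predecessors, and a single slip in one activity shifts every downstream successor.

```python
def schedule(durations, preds):
    """Forward-pass scheduling (in days): an activity starts only when
    all of its predecessors have finished."""
    finish = {}
    def fin(act):
        if act not in finish:
            start = max((fin(p) for p in preds.get(act, [])), default=0)
            finish[act] = start + durations[act]
        return finish[act]
    for act in durations:
        fin(act)
    return finish

durations = {"survey": 5, "design": 10, "procure": 15, "install": 20, "test": 5}
preds = {"design": ["survey"], "procure": ["design"],
         "install": ["procure"], "test": ["install"]}

baseline = schedule(durations, preds)["test"]   # finishes on day 55
durations["procure"] += 7                       # one supplier slips a week...
delayed = schedule(durations, preds)["test"]    # ...and the finish slips to day 62
print(baseline, delayed)
```

With 100,000 activities and parallel branches, the same recomputation is cheap for a machine but impossible to eyeball, which is exactly the gap being described.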
Greg Kilstrom
So in that scenario. I definitely don't work in construction; I work primarily in marketing, and there are plenty of challenges there, but different timescales and complexities, for sure. One of the challenges often run into when trying to introduce automation or other types of AI is just access to the data that feeds into those models. So I wonder what that looks like in large-scale construction. You know what you need to be looking at, but is the data readily available? How does that work from the customer standpoint?
Alan Mosca
The answer is a very unsatisfactory "it depends."
Greg Kilstrom
Sure, sure.
Alan Mosca
So you have a very wide spectrum of organizations. There are those that have everything meticulously organized: here's our SharePoint, here's our schedule repository, and you can have access to them, so they get loaded into the AI. And there will be organizations where everything is on Dave's laptop.
Greg Kilstrom
Yeah, right, right.
Alan Mosca
Dave quit 10 years ago.
Greg Kilstrom
Totally.
Alan Mosca
So yeah, you have those two extremes. In one case it's very easy, obviously; in the other, not so much, but there'll always be something. And because most of the time we're working with the individual project, the individual project has its own data setup. So you have this added layer where you have the central organization that's doing lots of projects, but each individual project is basically its own company most of the time. If you think about, I don't know, an energy company like Vattenfall that's building wind farms, each different wind farm is a different company that has a different project director, a different team, and a different setup, and they decide the tools that they use. So it gets very fractally broken up very quickly. And we rely a lot on the users uploading what they have available for the questions that they need solving. There is one step we do push for all the time, which is fine-tuning our models on an organization's schedule data. If you have been doing projects in the past, you have your schedules: the schedule you started with, then the monthly updates until you finish, and when you finish, you have a version that tells you what actually happened. You can imagine the difference between those two schedules is enormous most of the time. But that allows us to learn, and that's how we do the forecasting. By using that as a fine-tuning dataset for the specific customer, we learn how that specific customer works: how does Meta build data centers compared to Amazon, compared to Google Cloud, compared to Microsoft? And without even looking at the data, it's a very easy guess to make that they do it in very different ways. So we fine-tune our model with the data that they give us, so that it performs better for their specificities.
Yeah, yeah. And it is generally very messy.
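The planned-versus-as-built idea can be sketched very simply. This is illustrative only, with made-up activity types and numbers, and nothing like nPlan's real models: it learns a per-activity-type duration uplift from historical (planned, actual) pairs and applies it when forecasting new planned durations.

```python
from statistics import median

# Hypothetical history from finished projects: (activity_type, planned_days, actual_days)
history = [
    ("piling",    10, 14),
    ("piling",    12, 15),
    ("steelwork", 20, 22),
    ("steelwork", 18, 27),
]

def uplift_factors(history):
    """Learn how much each activity type tends to overrun its plan."""
    by_type = {}
    for kind, planned, actual in history:
        by_type.setdefault(kind, []).append(actual / planned)
    # Median ratio is robust to the occasional catastrophic overrun
    return {kind: median(ratios) for kind, ratios in by_type.items()}

def forecast(kind, planned_days, factors):
    # Fall back to "believe the plan" for activity types never seen before
    return planned_days * factors.get(kind, 1.0)

factors = uplift_factors(history)
print(forecast("piling", 10, factors))  # a 10-day piling plan forecasts to 13.25 days
```

The real models are far richer, but the fine-tuning logic, learning this customer's systematic biases from their own schedule history, is the same shape.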
Greg Kilstrom
Yeah, yeah. So from your perspective, I would imagine that if a customer comes to you and they have all the stuff, it's not on Dave's laptop, it's in a centralized place and it's organized, that takes some leadership and some organization on their end before they even come to the table. For leaders listening out there who are thinking about how to do this, whether they're in construction or not, what kind of leadership does it take to have the kind of culture that is ready for this kind of AI adoption and automation? How should leaders be thinking?
Alan Mosca
Yeah, I mean, it's probably not going to be a surprise, but we work with very risk-averse companies.
Greg Kilstrom
Sure. Yeah.
Alan Mosca
And the more risk-averse you are... you said that most of the work you do is in marketing, right? Marketing is way less risk-averse than construction, and if you push it to the extreme, you've got infrastructure, energy projects, nuclear power plants, defense. There's a lot of risk aversion there when it comes to sharing data. So we've had to put a lot of things in place, like certifications and security clearances. But also, the leadership needs to want to explore opportunities. There needs to be a certain amount of, I use the word hunger maybe a little too liberally, but leadership needs to be hungry for opportunities to improve their organization. And that's usually the spark that ignites everything else. Most of our GTM involves marketing creating awareness with the potential users. And the users are people who are not quite on the ground, but in the project management team. But our buyer is three, four, six layers above. Many times it's the CEO or the CFO who makes the decision to buy, and they only see the results on the balance sheet, effectively, or sometimes in some reports that they get, through third- or fourth-order indirect information. When those decisions happen, that's when we have a successful deployment: when we have the leadership involved, wanting to do this. Conversely, you have the opposite. Chief data officers are usually very good, but every once in a while there'll be a chief data officer who appears out of the blue in the middle of an engagement and says, why are you using our data? Our data is worth $2 trillion, so you have to pay us or you're not getting it. And usually then we walk away, and we have done so many times. But that's because there isn't that appetite for, maybe it's an abused word, but innovation. There isn't the appetite to look at what we can do better and how far we can actually go. And so those organizations are usually the ones that show up in the laggard part of the curve once things start going really, really well. So, like, now they're adopting SharePoint, and they'll look at ChatGPT in a couple of years once everybody else has figured it out, right?
Greg Kilstrom
Yeah, yeah. So then, going a few rungs down the org chart, so to speak: what skills should leaders be making sure their project managers and more mid-level people, whether on the ground or just managing projects, have in order to be ready? Because even if this makes things a lot easier and in some cases quicker, it's still a mindset shift to be thinking about some of this stuff. How should they be preparing for this kind of shift?
Alan Mosca
Yeah, I love this question, because a lot of the time we hear from leadership a version of it, which is: how can I change the organization so that the people who work in my group or my organization are prepared to use AI in the right way? And actually, what I'm seeing is kind of the opposite. What I mean by this is that you have roughly a 50-50 breakdown. You have the fingers-in-the-ears people: I don't want to hear about it, AI is going to take my job, so therefore I'm going to be anti-AI for the rest of my life. Or you fully embrace it.
Greg Kilstrom
Right.
Alan Mosca
It's starting to get quite polarized. There are not a lot of people sitting in the middle.
Greg Kilstrom
I know what you mean.
Alan Mosca
But most projects have both types of people, and so really all that leadership needs to do is give those people, type A and type B, I don't know, not in a personality sense, space. That's really all you need to do. Just let them come up with innovation ideas, try some experiments, let them sometimes make some mistakes, as long as you put safeguards around them so the mistakes aren't too costly, and good stuff will come out of it, because those people are naturally attracted to doing these things. We get job applications from people who work with our customers all the time. I will show up at a client with our customer success team and our relationship management, and typically, out of three or four engagements, there'll be one or two people who send us a CV within six months. Those are the people who are excited; you need to nurture them. That's the only thing that needs to happen. It's not like you need to introduce mandatory training, etc. I think that comes maybe later, but that's compliance sort of stuff. As soon as you make things homework, people are just going to reject it.
Greg Kilstrom
Good point, good point. Yeah, I love it. Well, Alan, thanks so much for, for joining today. One last question for you. I like to ask everybody, what do you do to stay agile in your role and how do you find a way to do it consistently?
Alan Mosca
Yeah, I do a lot of things, but I'll try and keep it small and usable and referenceable. We have had from day zero what a lot of people now call an ambidextrous organization. We have a research team called NERD; the proudest moment of my life was naming it, because it's nPlan's Experimental Research Department.
Greg Kilstrom
Nice.
Alan Mosca
And they publish papers, and they run a thing called Machine Learning Paper Club, which is open for anyone in the world to join, where we talk about a new machine learning paper every week. So obviously I'm a part of that, and I spend a lot of time with them. I also spend a lot of time with other founders, learning about what new things are coming up and what everybody is doing. And, maybe weirdly, I like making myself uncomfortable. Every six months we have an event called AI Day where we try to announce something new and something big. Usually I'll set for myself and the organization a relatively impossible task, so that if we get 50% of the way, that's something we can announce. And that keeps me, I guess, afloat, because AI is definitely the fastest-moving discipline currently, but over the last couple of years, probably the fastest in the history of humanity. There's like...
Greg Kilstrom
Yeah.
Alan Mosca
Something along the lines of a million papers a year being written, right?
Greg Kilstrom
Yeah, I believe.
Alan Mosca
And, I mean, don't quote me on this number, because I haven't done any research on finding the actual number.
Greg Kilstrom
Right, right.
Alan Mosca
But the two largest conferences of the year each have about 50,000 to 100,000 submissions. Just those conferences, right? And this is just the research. Think about all the products that are coming out and the code that is being written and all these things. So it is very easy to get way overwhelmed. So the other bit is: try to stay sane, and try to filter out, very aggressively, the things that you shouldn't be looking at.
Greg Kilstrom
Well, again, I'd like to thank Alan Mosca, co-founder and CTO at nPlan, for joining the show. You can learn more about Alan and nPlan by following the links in the show notes.
Alan Mosca
Thanks so much Greg.
Greg Kilstrom
Thank you.
Thanks again for listening to the B2B Agility Podcast. If you enjoyed the show, please take a minute to subscribe and leave us a rating so that others can find the show more easily. You can access more episodes of the show at www.b2bagility.com. That's b2bagility.com. While you're there, check out my series of best-selling Agile Brand Guides covering a wide variety of marketing technology topics, or you can search for Greg Kihlström on Amazon. Until next time, stay focused and stay agile.
Alan Mosca
The agile brand.
Podcast: B2B Agility with Greg Kihlström™: MarTech, E-Commerce, & Customer Success
Host: Greg Kihlstrom
Guest: Alan Mosca, Co-founder and CTO at nPlan
Episode: #69 – How AI is Transforming Project Management
Date: November 11, 2025
This episode explores how artificial intelligence (AI) is fundamentally transforming project management, particularly in industries known for complex, large-scale projects, such as construction. Host Greg Kihlstrom interviews Alan Mosca, co-founder and CTO of nPlan, to discuss concrete examples, debunk misconceptions, and provide leadership insights for B2B organizations considering the adoption of AI for project management and risk mitigation.
On reporting improvements and freeing up team time:
“Now... they're free and can actually do their job. Because the reporting is not even their job. It is something that they have to do on top.” – Alan Mosca (05:34)
On AI’s real impact:
“If I'm taking a forecasting procedure that used to take six months and I can do it in 10 minutes... what does that mean? Well, it means that I can do hundreds of them every day... and that's how you create the value.” – Alan Mosca (08:17)
On misconceptions of AI:
“The biggest misconception that I find now is that everybody just thinks that when you mention AI, you actually mean ChatGPT...” – Alan Mosca (09:44)
On leadership and culture:
“Leadership needs to be hungry for opportunities to improve their organization. And that's usually the spark that ignites everything else.” – Alan Mosca (19:01)
On managing AI change at mid-levels:
“Just let them come up with innovation ideas, try some experiments, let them sometimes make some mistakes... good stuff will come out of it... You need to nurture them. That's the only thing that needs to happen.” – Alan Mosca (23:48)
Full episode and more show notes available at:
www.b2bagility.com