
A
Hello and welcome to another episode of the Product Thinking Podcast. Joining us today is Tony Ulwick, the founder and CEO of Strategyn and a visionary in the fields of innovation and product development. Known as a pioneer of Jobs to be Done theory and the inventor of Outcome-Driven Innovation (ODI), Tony has transformed how companies understand and meet customer needs. His methodologies, which have been widely adopted by leading companies like Microsoft and Johnson & Johnson, aim to take the guesswork out of product development and enable more predictable success. With a career that began at IBM, where he witnessed firsthand the challenges of product failure, Tony has dedicated himself to reframing innovation as a science. Today he'll share insights on turning failures into learning moments, the journey to creating ODI, and his perspective on the future of innovation and customer-centric product strategy. But before we dive into it with Tony, it's time for Dear Melissa. This is the segment of the show where you can ask me any of your burning product management questions. Go to dearmelissa.com and let me know what they are. Here's this week's question. Dear Melissa, my question relates to the transition many companies, teams, and individuals face these days, going from output to outcome. However, my organization will still do classic projects, and only a couple of departments will be going full product. This seems a bit counterintuitive to me. How can we make the transition and tap into the potential of the product operating model if we know from the start that not everyone will be following along? So this is pretty common: a lot of companies might start with one part of the organization and say, we're going to move this to a product operating model, but other parts of the organization might still be working in projects. This shift is always a learning journey, it usually takes many years, and it's going to be difficult along the way.
What you'll probably observe is that within your team, you'll be able to work in a product operating model, meaning you'll be able to talk to customers, talk about metrics, and try to figure out what's the right thing to build. What's going to be hard is getting leadership aligned. Leadership is usually going to be thinking in a project management style: we do this project and we move on. Why should we fund this work for longer, the way a product operating model requires? What you need to do is get really good at communicating where your efforts are, what you're trying to prove here, why it's the most important thing you could be doing, and how the product will be evolving over time. It's not just one and done; that's why you need to invest in it and actually grow it. What works here is trying to explain to leadership that when we think about products, we don't want to treat them just like projects, because we'll be spinning up many different products that might solve the same problem. Instead, we can lower our footprint of software and our cost if we think about our products as expandable pieces of software that solve the needs of many, so that we get leverage from them. So you may want to start there and align on the concept of what a product is, why you're managing a product, and why you should see it over time. Right? It's not going to go away. It's going to keep solving problems. And you want to make sure that's strategic too: how they see it connected back to the company strategy and how it will help the company grow. When you're in this kind of mode, a lot of it comes down to communication. Unfortunately, it just takes more work to communicate up, to explain why you work the way you do, to explain, hey, we do discovery processes, we're not going to just dive into this, it's not going to be managed by specific hard dates right here, and to get away from some of the misconceptions that come with projects.
Maybe you talk about how your teams are product teams and not project teams. Maybe you get in front of some of those issues ahead of time to make sure that there's clarity there. That's what I would advise. But there will be growing pains along the way; let's just say that as an organization, it's going to be messy for a while. But I've seen many organizations go through this transition. Just a couple of teams will start with product. They'll spin it up, they'll try to see what's going on. You can still make your team, if it's doing product, a premier team and an example of what success looks like. And that's what I would focus on: how do I show that by this way of working we're going to be more successful, we're going to create better products that people love. And then I'm going to tell my success stories to the company so they get really excited about it and more people want to move into a product operating model. The other thing to acknowledge here, too, is that not everything needs to be a product. Right? There are some things that are just not productized. If we're talking about software products that you sell to customers, sure, we want to move those teams into a product operating model, but not everything across an organization needs to be a product. So let's be very clear about what is a good candidate and what is not: which teams should be operating in product and which teams should not. You want to make sure that you're setting a good example for the rest of the organization so that it'll be easier for them to jump on board. And then you can tackle the really hard problems like budgeting, strategy deployment, alignment across organizations, and cross-functional working relationships. I hope that helps, and good luck on that journey. Now it's time to go talk to Tony. Welcome to the podcast, Tony. It's great to have you here.
B
Thanks for the invite. I appreciate it.
A
I have been a long-time follower of yours, Tony, and of your journey before you started Strategyn. Tell me a little bit about your career. What led you to want to go into this type of work?
B
Yeah, it's a good story, because I experienced one of the most humiliating product failures of the decade back then. I was working for IBM at the time, in their PC division, on a product called the PC Junior. It was supposed to change the way people did home computing. It was supposed to beat Apple. It was a huge endeavor; IBM invested about a billion dollars in the project. And instead of it being successful, the very day after it was introduced, the headline in the Wall Street Journal read, the PC Junior is a flop. Of course we didn't believe it, but they were right. It was a flop. It took about a year for us to come to grips with that reality, and it cost IBM a tremendous amount of money. And this was my first experience with a product. I thought products, especially coming from a company like IBM, were of course going to be successful. But as I started studying this, I wondered, how did IBM fail so miserably? And it's not just IBM that fails miserably, obviously; I realized very quickly that many companies have big failures like that. And the funny thing is, it's not just back then in the 1980s; the same thing is still happening today. So this has been an issue for years, and it got me very interested in innovation as a process. I really wondered, how did they know that this was a flop the very day after we introduced it? They were using some criteria to judge the value of the product that we didn't know about. Otherwise we would have built the product to meet that criteria, and it would have been successful. It got me thinking: is it possible to know way in advance how people are going to measure the value of your product? I didn't have the answer, but that was what I set out to find. And I spent my last five or six years at IBM trying various methods of innovation practice.
Back then it was QFD and House of Quality and Voice of the Customer and conjoint analysis, and a few things like that were coming together. But I realized quickly there was no real process for innovation, and I set out to create one.
A
So when you were looking back at all of these things at IBM and how people did innovation, what did you do to come up with what you call Outcome-Driven Innovation? Right, that process.
B
As I was looking at different ways to innovate, I realized there was no process there. But it occurred to me that Levitt gave us a nice hint. He said, people don't want the quarter-inch drill, they want the quarter-inch hole. And then the engineer in me took over and said, hey, I have an idea. If we study the process of creating a quarter-inch hole, we could treat it like we do a manufacturing line. You study the manufacturing process, you try to figure out what you have to measure and control to produce a predictable output. We automate things, we do statistical process control to remove variation from the process, and we do Six Sigma to eliminate defects from the process. That was my early career at IBM. And I thought, if we study the job of creating the quarter-inch hole as a process, we can apply all that same thinking and figure out how to get the job done better, faster, more predictably, with higher throughput. So that was the beginning of it. And I thought, will people be able to tell us how they measure success? It turns out they can, and we've worked on that language over the years. We call them the customer's desired outcomes. We all use them. For example, as you're preparing a meal, you think about things like, maybe you overcooked the meal, and you'd say, I don't want to do that, right? I'd really like to minimize the likelihood of overcooking the meal. Or, this is cooking unevenly; I'd like to minimize the likelihood of cooking the food unevenly, minimize the time it takes to prep the food, minimize the likelihood of creating the wrong portions. There'd be a whole bunch of metrics that you'd consider to try to prepare the meal perfectly. So the thought was, we can go capture those inputs, we can figure out what the needs are in a given market, if you will, and go from there to figure out which are unmet, to what degree, and whether there are segments of people with different unmet needs.
And then build the strategy around creating solutions that get the job done better. So I thought of it in a flash, and I didn't know if it was going to work. That was the next step: to go try it. And it turns out it works quite nicely, and we've been perfecting it ever since.
A
What did you do in early days to test it out and see how it was working in organizations?
B
I practiced at IBM. They were great to work with; I was like an internal consultant, and I did projects around the globe. I spent time in Australia for a month and a half on assignment, applying the approach to a business unit there, and in Japan, France, and the US itself. So I had about two years of experience just getting it ready. Then I decided in 1991 to start my own business, started advertising and getting some clients set up, and I left IBM and started the practice. The very first project I executed on was with Cordis Corporation. They were on their last leg in terms of their angioplasty balloon product line. They had 1% market share, and they needed a win. We tried it out. One thing that was very encouraging: when I was in there interviewing with them to take on this work, there were some nurses in the room next door, and they said, would you come apply this process next door? Just show us how this works. Go talk to those nurses and get these inputs that you're talking about. And I said, okay, what product are we talking about? And they said, the sheath introducer. Now, I didn't know what a sheath introducer was, but it didn't stop me. I went in and started asking them what they were trying to achieve with this sheath introducer, and they told me. Then we started getting into a discussion about what they were trying to achieve at the very first step of using it, and what they were trying to avoid. I went down this path and started writing down these outcome statements. At the end of the session, we'd spent maybe 10 minutes doing this, and I'd collected maybe 10 or 15 outcome statements. And they said, you should hire that guy; he knows your market better than you do. So I got the assignment, and that was the start. The project turned out amazingly well. They introduced 19 new angioplasty balloon products. A year and a half later, they were all number one or two in the market.
Their market share went to over 20%. They also discovered a huge unmet need in the market using the process, which was to minimize the likelihood of restenosis, the recurrence of the blockage. It was off the chart, a very important, very underserved outcome. And they said, we have a product that could probably address that; we're working on it in the lab along with 30 other things. And we said, you should focus your resources and be first to market with that product. Which they did, and they succeeded. That product was the heart stent, and it became a billion-dollar business in less than two years. Their stock went from $8 to $108. It was just a great all-around story, very encouraging. So I could see the process working; I knew how to make it work. Then the question was, can we make it work in all industries, or more industries than just the medical space? I spent the first 10 years, from 1991 until about 2001, pretty much testing this in different companies as a lone practitioner. Then in 2002 I published in the Harvard Business Review a case study that featured the Cordis story. And from then on I got way too busy and had to start a company and learn how to run a management consulting firm.
A
It's always funny how that ends up: too busy, and now I've got to be busier trying to run a company too. But it's a testament to the fact that this works and that it's helping people. So when you talk about Outcome-Driven Innovation, what I really love is how you contrast it with other types of processes, like agile development processes and design thinking. Coming into companies and helping them, what have you observed about how the misconceptions around when these different processes are needed hold companies back? And where do you see ODI fitting in?
B
Yeah, I think the key factor that contributes to confusion is the overlap of the innovation process and the development process. I call the innovation process the front end of innovation, which is really everything before development. The output of that process should be a product concept that, with a high degree of certainty, is going to win in the market before you start developing it. That's the ideal goal, right? You shouldn't start developing the PC Junior and watch it fail at the end. You should know right up front whether it's going to win or lose, and if it's going to lose, don't do it. Separating them out is important. Most companies think of them as one and the same; they think the innovation process starts with the idea and stops at product launch. I'm separating the front end of innovation, which is coming up with the product concept, from developing the product. If you're going to develop the product, you want to make sure it's the product that's going to win in the market. That's what ODI does on the front end. Then Agile kicks in for development, so use Agile techniques to develop. The way I like thinking about it is that ODI makes Agile more agile, because you're not iterating on what the product does while you're creating it; you're only iterating on the product design so that it will perform as nicely as possible. So separating those two out makes great sense. I view ODI plus Agile as the great combination for the front end of innovation and development. Then there are other techniques, like Lean Startup, for example, that ask you to hypothesize the market, the customer needs, and the solution all at once. That's intertwining everything, right? You're trying to solve a very complex equation. What we do instead is first define what market we want to go after, then define the unmet needs in the market, then come up with the solutions that best address those unmet needs. So we solve the equation by setting up constants in the equation.
The market becomes a constant, then the needs become constants, and then you can come up with the solutions that address them. It's like solving simultaneous algebraic equations, right? You fix the constants in the equation; you can't solve for all the variables at once. We discount the Lean Startup methodology because it causes you to iterate incessantly, and you may never get out of that loop. There are other tools, like design thinking, but design thinking really isn't a process. It's a set of tools that can be used throughout the development process; that's how it was initiated. Now, could you also use some of those tools for the innovation process? Yes, you can. And some people have confused the two again and said design thinking is all about innovation and development. Really, it's a great set of tools for the development process, but it's not a great set of tools for the innovation process. It doesn't produce a product concept that you know will win in the market before development begins, but ODI does. That's the difference.
A
What I really like about ODI, too, is what you just said: we start with that market piece. Where I see a lot of people mess up when it comes to innovation, especially people in startups, is they go into markets, or they're trying to solve a problem in an area they don't know a lot about. Even when I was teaching at HBS, I saw a lot of students say, I'm going to go build something to do X in this market. And I was like, do you know anything about that market? And they're like, no, but it's something I want to go after; there's a problem I want to solve there. And they're not really looking at the landscape, or at what they should be tackling, or at whether they really want to tackle that market, without identifying it. I think that's really interesting about ODI, that you start there, because I've seen that common mistake. Why do you suggest starting at the market for most companies, or for most startups?
B
Yeah, we always start at the market because we need to know who the customer is that we're trying to create value for. That's the first question: who are we trying to create value for? And the second question is, what job are they trying to get done? So it could be interventional cardiologists trying to restore blood flow to an artery. We need to know who we're targeting and what job they're trying to get done. And notice I'm saying it like that, too. I'm not saying what job the product gets done; I'm targeting a group of people, and they're trying to get a job done. What we see is that most product concepts, especially in startups, get a tiny piece of the actual customer's job done. You're not going to hit a home run with a feature on a platform, and you're never going to take over the world by pursuing just a feature on a platform. You want to figure out what the entire job is that they're trying to get done, and work over the years to create an end-to-end solution that gets the entire job done. And unless you know what that entire job is, it's going to be very hard to figure out. You'll eventually get there; markets evolve in that way naturally over time anyway, through guesswork. But if you know the end game right up front, if you know the entire job, you can lay out a plan to get the entire job done in the most efficient manner, which is what we propose. And that's the most efficient path to growth.
A
So in the ODI model, you were talking a little bit about how you don't do the actual building yet; that's for the development process, the iteration process, when we look at Lean Startup as well. What are the steps to ODI, and what are the types of activities that people should be doing during those steps?
B
Yeah, that's a great question. So the first step is to define the market: the group of people you're trying to create value for and the job they're trying to get done. The way we do that is we talk to the customer. That's why it's important to know the group of people you're trying to create value for, because you have to go out and talk to them and get from them their view of what they're trying to accomplish. So that becomes your market definition. The second step is to understand the customer's needs. Now, we're going to define needs as the metrics people use to measure success when getting the job done: minimize the likelihood of overcooking the food, or minimize the likelihood of cooking it unevenly. There are typically anywhere from 75 to 125 of these metrics for any given market. And we capture those by, again, talking to customers, the job executors, as we call them. They have experience executing the job, right? Like the interventional cardiologists trying to restore blood flow in an artery, they've tried to do that hundreds of times. So as you're asking them about these need statements, they can tell you, oh, in this stage I need to do this, then I need to do that, and I've got to watch out and avoid that, and I've got to eliminate that defect. They can go through and tell you all the metrics they're using to measure success. We train people to think like this and to capture inputs in that manner. The outcome statements have a very specific structure, syntax, and format, and we've learned over the years that they need to have that format in order to be useful. These statements run from the beginning of the process to the end; they're inputs that come from the customer and run through the organization. They have to inform sales, marketing, development, R&D: one set of statements that everyone can agree to. And most companies can't. We poll all the time.
We ask this question: is there agreement in your organization as to what a customer need is? 90% of product teams say, no, we don't even agree on what a need is, never mind what the needs are and which needs are unmet. We can't even agree on what a need is. We also polled 12 different experts in Voice of the Customer, years back, and we got 12 different definitions of what a need is. So this has been an issue for years. Somebody has to decide what a need is, right? So we look at it like, what should a need be? It should be an instruction, an instruction that comes from a customer that tells you how to get the job done faster, more predictably, or without defects, which is the goal of getting the job done better. Once we lay all this out, we can come up with this particular format and structure and be sure that we're getting inputs that are going to lead to success. So that's the second step, a very important step. The third step, obviously, is that we want to figure out which needs are unmet and to what degree. There may be 10 of those 70 or 100 needs that are really important and poorly satisfied in the market today, and we want to discover which needs those are. So we do quantitative research. We'll put a survey together that goes out to hundreds of people, and we ask them to tell us the importance of these outcomes the last time they were getting the job done, and how satisfied they were with their ability to achieve each outcome using whatever solution they were using. And we ask them what solution that is, too, so we know the answer to that question. With that information, we can start doing our data analysis to figure out: are there any needs that are unmet across the entire market? Are there needs that are unique to segments? The fourth step in the process, as part of that analysis, is what we call outcome-based segmentation. We want to know, are there segments of people with different unmet needs?
And in all the studies we've done over the years, there are always segments of people with different unmet needs. In other words, people don't agree on which needs are unmet. The best way to discover those segments is not by segmenting around demographics or psychographics, but by segmenting around the unmet needs. Now, most companies can't do that, because they don't even agree on what a need is, or what the needs are, or which needs are unmet. But we've fixed all that, and now we can discover segments of people with different unmet needs. For example, when we helped Bosch enter the North American circular saw market, they were trying to compete with DeWalt and Makita. They wanted to come up with a solution, a premium brand, that would get the job done better at the same price point. To make that happen, we had to find opportunities to get the job done better. When we looked at the broad market, everybody in total, it looked like there were no opportunities. But when we segmented the market, we found a third of the population that had 14 unmet needs. They were more finish carpenters; they had to make more angle cuts and blade-height adjustments, and they had 14 unmet needs that nobody else had. So that became the target. Bosch came up with a solution that satisfied those 14 unmet needs, and that was their best-selling circular saw in North America for about 10 years or so. That's why the segmentation aspect is so critical: if you build for the average, you're effectively targeting nobody. Then the last step is to use all the information to come up with the product concept, which we do through ideation. In the case of Bosch, again, when we presented those 14 unmet needs to the engineers, it took them three hours to come up with solutions to address all of them. And as they said, it's not as if we hadn't had these ideas before. We've had all these ideas before, and more; we've had thousands of ideas.
We just didn't know that these were the 14 that were really going to create the most value for the customer. So it's all part of getting teams on the right path, and on the same path, is the way I like thinking about it. They have to head in the right direction, so everyone's creating value for the customer in the most efficient manner, and everyone has to believe it. Right? These are the two key elements of our approach that have to come together for a company to be successful: they've got to be heading in the right direction, and everyone has to be paddling in the same direction. Achieving both of those is the magic of the innovation process right there. If you can do that effectively, you're going to be successful.
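The quantitative step Tony describes, surveying the importance of each outcome and the satisfaction with it, is often formalized as ODI's published opportunity algorithm: Opportunity = Importance + max(Importance − Satisfaction, 0). The sketch below is illustrative only; the exact scoring Strategyn uses isn't detailed in this conversation, and the outcome statements and survey numbers are made up:

```python
def opportunity_score(importance: float, satisfaction: float) -> float:
    """ODI-style opportunity algorithm: Opp = Imp + max(Imp - Sat, 0).
    Inputs are on a 0-10 scale; important-but-unsatisfied outcomes score
    highest, while well-served outcomes fall back to bare importance."""
    return importance + max(importance - satisfaction, 0)

# Hypothetical survey results: statement -> (importance, satisfaction)
outcomes = {
    "minimize the likelihood of overcooking the food": (8.6, 4.1),
    "minimize the time it takes to prep the food": (7.2, 6.8),
    "minimize the likelihood of creating the wrong portions": (5.0, 7.5),
}

# Rank outcomes; scores above roughly 10 are conventionally treated as
# underserved, i.e. candidate unmet needs worth targeting.
for statement, (imp, sat) in sorted(
    outcomes.items(), key=lambda kv: -opportunity_score(*kv[1])
):
    print(f"{opportunity_score(imp, sat):4.1f}  {statement}")
```

Running a score like this per segment, rather than over the whole sample, is what surfaces groups like the Bosch finish carpenters whose unmet needs vanish in the market-wide average.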
A
When you think about these companies that are not agreeing on what a need is or what it means to them, what do you do to try to encourage them to align on what those needs are? Like, how do you start that conversation or go through that work?
B
You know, I love that question, because we've learned recently that the best way to do that is to ask them: how do they define needs? We do individual exercises, we may do workshops; it's fun. Everyone writes down how they think about a need. Is it a specification, a requirement, a pain, a gain, an exciter, a delighter, a value driver? I could go on; there are 30 terms we hear that people use to state what a need is. And then we go into saying, write a need statement right now, and compare. Everyone compares what they've put up there, and they see that there is disagreement, proving to them that they don't agree on this basic thing. You're all here to come up with solutions that address unmet needs, and we can't even agree on what a need is. So that's the first step. And then offering the solution, of course, is the key. Why is this a need? Because people buy products to get a job done, and if we can help them get it done better, they'll buy our product. Let's tie needs to statements that show how to get the job done better. It's that simple, right? The logic's there, and we can get the team to follow that path. And the funny thing is, they don't all have to know how to do ODI and how to talk to customers. They just have to know that once they get that information, that's the information they should use to drive their product decisions.
A
When it comes to needs and outcomes, I feel like everybody in every organization now is saying, we're an outcome-driven organization, we've got to be concentrating on outcomes. And I feel a lot of organizations aren't doing it well, or they're getting lost about what an outcome really means. How do you tie needs back to outcomes? How do you make sure that you are actually writing good outcomes, focusing on the right outcomes, and tying that back to what those unmet needs are?
B
See, I don't like to use the word needs; if I could eliminate it, I would. Outcomes are the needs. People buy products to get a job done; that's the market. They go through steps along the way; that's what we call the job map. And then they use these metrics to judge success when getting the job done; those are their outcomes. You could argue that all of those are needs at some level, but I don't even like to make that argument, because it doesn't matter. The point is, just focus on the outcomes tied to getting the job done better, and that's going to help you create products that will get the job done better. Really, the vocabulary here is so important. We put together a glossary of the terms we like to use that shows how all these pieces fit together. If you stay in that mindset, you can define adjacent markets differently, you can define segments differently, and they all make sense through that lens. And once you get that thinking in your head and make that mindset shift, it's really hard to see it any other way.
A
You talked a little bit too about how you have a very specific way of writing outcome statements. What have you found is required in a good outcome statement? That makes sure it's in the right direction.
B
Yeah. So it contains four pieces, or elements. One is the direction of improvement, which is always minimize. So it's going to be either minimize the time it takes to do something, or minimize the likelihood of some bad thing happening. That's the second part: it's a metric, either time or likelihood. Now, years ago we used many other metrics, but we've realized that these two really are the most useful in terms of describing what customers are trying to achieve, because they're either trying to get something done faster, more predictably, or without defect. The faster is tied to time; the more predictably and without defects are tied to the likelihood of you doing something wrong when executing the job, or the product doing something unpredictable when you're using it. So now we're going to talk about minimizing the time it takes to do something in some context. The third element is the object of control: what are we trying to control in the process? Minimize the likelihood of overcooking the food; we're going to try to control the overcooking aspect. And the last part is a contextual clarifier, if it's needed, so you can talk about what context you're referring to. Interventional cardiologists may want to say, minimize the likelihood of impacting a side vessel when I'm making my way through a tortuous path, or something like that, supplying some context to the statement so you know the situation where they're struggling with that particular outcome. So those are the four pieces. We tested this back in the early 2000s with Microsoft; we worked with them on about 45 projects, and in most of the surveys we would use different variations of the statements to see what happened. So we tested them, and we found that the structure we use today is the structure that gives us the best insights, gives us the best discrimination, and causes the least fatigue.
All the key things that you look for when you're trying to create a good survey. We often say the process has been battle tested. It really has. And we've worked with lots of very smart people from the best companies on earth over the years who've helped us refine the process to make it even more effective.
A
When you're looking at ODI, for companies that maybe are not engaging with you but are trying to learn ODI and bring it in, what are some common mistakes that you see them make, or misconceptions they have about it?
B
I'd say the biggest misconception is they think it's going to be easy, because it sounds easy, but in practice it's not. And you need different people with different skill sets to execute the process. You need someone who can manage an entire project and even know how to frame it: who is our customer? How do we aim ODI at a market? Just thinking through that can be very challenging. You need someone to collect all the outcomes from customers, so you need good qualitative research skills to make that happen. You need people to build surveys, collect data, run segmentation analyses and so on, so you need good quantitative researchers. And then you need people who can pull the story together, read the data, understand what it means and turn it into a strategy. It's not as if any one person in the organization can do all those things. So it comes down to having a team of people that can get it done, knowing the responsibilities, and just being prepared for what it takes to get good at it. You can read the book and see how logically it makes sense, but there are a lot of little nuances in making it happen. You have to go through the learning curve. So that's probably the biggest issue, I'd say.
A
When you're thinking about who should be doing ODI in organizations, too. I see this a lot in agile transformations, where the leaders go, oh, this is a responsibility of the team, I don't need to know this, just go do it, and I'll set these goals up here. Who should be involved in the ODI process from a leveling standpoint? We just went through roles, but is this a VP of product overseeing it, or a C-suite executive? Or is this for directors of product, or who?
B
Yeah, someone who oversees a team of people. So it could be a business unit director who oversees a number of businesses. I've seen some companies create a corporate team of ODI practitioners who get assigned to different business units. I've seen it done both ways effectively. But what you shouldn't do is have all your product managers become experts at doing ODI. That's not right, because they're not necessarily strategists and qualitative researchers and quantitative researchers and data analysts. Asking them to do all of that is really too much to ask. So that's why we say set up a team. It could be under the head of the business unit, or it could be a corporate team that gets used across business units. But think of it like that, as opposed to teaching the product manager, or even all the people on the product team, to do ODI. They don't have to know how to do ODI; they should know what an outcome is. They should know what to do with those needs statements once they see them at the end, because they're going to use them in their workflows to make the same decisions they always make. But now they're going to be focused on the customer's outcomes, not some irrelevant or misleading piece of information. So I think those are some of the important intricacies that come into play.
A
Yeah. To me, a lot of what you're talking about feels like doing good research that informs our strategy of where to go through this process and then narrowing it down. And I've seen in organizations, especially at the director level, people who run a team of product managers working on more innovative types of products or net new things. I've seen them run effective ODI processes, but they've had to bring in other people like you're talking about on the team. They needed what we'd call a product ops analyst type of person to help with the data. They needed the user researchers to go out and do that work, and they're helping to oversee it and run it. Or even the VPs, building a little team underneath them to help figure out how to inform the strategy in product management. I see a real lack, though, of the work that you're talking about: actually going out, doing the sifting, trying to point people in the right direction before we set the objective targets that we want to hit and then just dive into the product work.
B
Yeah. And this is what companies hire us to do, because it is a unique skill set; it's specialized. We've been through a lot of iterations of the process. What we've spent a lot of time on more recently is not making the process better, because it's pretty good, but making it easier to install in a company, if you think of it like that. And with the advent of AI, it's interesting to see the possibilities of, say, synthetic outcome-driven customers that could be created using the data I just described. The front end could just be an interface where you could talk to this customer, who has all this information about the market based on the ODI data that we feed it, plus other information that would be pertinent. So there are a lot of fun possibilities here that we're thinking about working on to help get the job done even better.
A
That's really cool. I like this trend of looking at AI in a lot of the innovation space these days, too. I've heard some pushback from people talking about how AI is going to get rid of the need to do some of the stuff we've been doing around these processes: testing, experimentation. We talk about how the cost to build things is lower and the pace is faster, so people are saying, oh, you don't really need to test products, maybe you don't need to do as much research. You just build something, throw it out there and see if it works. With all the work that you're doing, do you think the progress we're making in software development, with smarter AI features, faster builds and a lower cost to build, is taking away some of the work we need to do there, or do you think it's only making it more possible to do more of it?
B
The inefficient way to develop things is to guess, test and iterate. That's not efficient, even if you can do it fast. Is it cost effective? It could be in software. But not all companies just develop software, and even software companies need services, and sometimes more than that. So it's not a good across-the-board philosophy to have. You're still guessing, right? The way I like thinking about this is: what are the chances of you randomly coming up with a solution that addresses the top 15 unmet needs in the market if you don't know what they are? It's zero. It's just not going to happen. But what are the chances of you creating that solution if you know exactly what those 15 unmet needs are, in priority order? The answer is about 86%, because now you're just relying on your team to use their creativity to come up with solutions that address needs you know exist. And if you can do that, you're going to get the job done better and succeed in the market. So my question would be, why wouldn't you just do it that way? You're going to win. Companies in the medical space that have long product life cycles already know this. A lot of technology firms know this as well. It's in the software space where they just think it's easier to guess, throw it out there, see what happens and get a response. And like I said, it may be cost efficient, but you're still wasting your time, and you can still do it better by having a needs-first, or outcome-driven, approach as opposed to an ideas-first approach.
A
When you're looking at the advancement of technology, and we talked a little bit about the AI stuff, what are you excited about in the field of innovation, and how do you think it's going to change?
B
It's changing rapidly. So, a few mega trends, if you will. We talked about the ability to go capture outcomes. People don't want to capture outcomes; they don't want to do the quantitative research. What they really want is a way to query customers to get answers to questions like: should we go copy this feature that a competitor just put out, because it's so valuable to customers that without it we're going to lose market share? I would love to know the answer to that question right now. Using ODI data, you can answer that question. So we just need to learn how to take that kind of command and turn it into the query that produces the answer. Those are some of the things that we're taking a look at, and I think that's going to bring more predictability to the process. But behind that query, in that database, there has to be good data. You still have to be following good practices. So I think ODI is really well suited for AI application because it's built with a bunch of rules; it's a rules-based discipline, and so it's a natural fit for an AI application.
A
I'm excited to see that evolve over time. Tony, thank you so much for being on the podcast with us. If people want to learn more about you and Strategyn, where can they go?
B
You can head right to our website at strategyn.com, or you could email me if you'd like at ulwick@strategyn.com.
A
Awesome. And we will put those links in our show notes at theproductthinkingpodcast.com. Thank you so much for listening to the Product Thinking Podcast. We'll be back next Wednesday with another amazing guest, and in the meantime, if you have any questions for me, go to dearmelissa.com and let me know what they are. We'll see you next time.
Released: January 22, 2025
Host: Melissa Perri
Guest: Tony Ulwick, founder/CEO of Strategyn, inventor of Outcome-Driven Innovation (ODI), and pioneer of Jobs to be Done (JTBD)
In this episode, Melissa Perri speaks with Tony Ulwick, a leading thinker in innovation and product development, about his journey creating Outcome-Driven Innovation (ODI) and how it enables organizations to conquer the unpredictability of product success. Tony shares the story behind ODI, its practical application, common misconceptions, and its strategic fit with modern methodologies like Agile and AI-driven innovation.
Tony Ulwick lays out not only the reasoning and architecture behind Outcome-Driven Innovation, but candidly addresses the challenges of implementation, the complementarity with agile approaches, and the conceptual clarity required for success. This episode offers practical, actionable insights for any organization looking to reduce risk in product strategy and move toward truly customer-centric growth—especially as AI unlocks new efficiencies without sacrificing foundational research.