
A
The agile brand.
B
Welcome to Season seven of the Agile Brand, where we discuss the trends and topics marketing leaders need to know. Stay curious, stay agile, and join the top enterprise brands and martech platforms as we explore marketing, technology, AI, e-commerce, and whatever's next for the omnichannel customer experience. Together we'll discover what it takes to create an agile brand built for today and tomorrow, and built for customers, employees, and continued business growth. I'm your host, Greg Kihlström, advising Fortune 1000 brands on martech, AI, and marketing operations. The Agile Brand podcast is brought to you by TEKsystems, an industry leader in full-stack technology services, talent services, and real-world application. For more information, go to teksystems.com. To make sure you always get the latest episodes, please hit subscribe on the app you listen to podcasts on, and leave us a rating so others can find us as well. Now onto the show.
A
What if every AI interaction with a customer built upon the last, instead of starting from scratch every single time, or at least having it feel that way? Agility requires not just reacting quickly to customer needs, but learning continuously from every interaction to anticipate the next one. This means our technology, especially our AI, can't operate with amnesia. It needs persistent, shared memory. Today we're going to talk about breaking down the silos between our AI systems.
A
We'll explore a concept that promises to give our AI a persistent memory, allowing different models and platforms to share context and build a truly continuous, intelligent customer experience. To help me discuss this topic, I'd like to welcome David Funck, Chief Technology Officer at Avaya. David, welcome to the show.
C
Thanks Greg, it's great to be here. I really appreciate it.
A
Yeah, looking forward to talking about this. Definitely a topic top of many people's minds, myself included. Before we dive in, though, why don't you give a little background on yourself and your role at Avaya?
C
Sure. So I've been with Avaya about a year and a half now. Before being CTO at Avaya, I was CTO at Aspect and Alvaria, and then moved to a smaller kind of startup at Edify. Edify was acquired by Avaya, and so that's how I came into the business, and then I took over as CTO about half a year ago. So it's been a great ride. I've been developing enterprise software in the contact center space for most of my career, and led the transition to cloud at Aspect. I'm really happy and excited about our new product, Avaya Infinity, which is, I think, a great solution for the contact center. So really happy to be here.
A
Yeah. And I guess to give a little context, and I know you briefly mentioned it, but could you tell us a little bit about Avaya? You know, who's your primary customers and what are some of the challenges that you solve?
C
Sure. So Avaya is a company that has been around for a while. It has a storied past. It grew out of the Bell companies, spun out of Lucent a while back, and has been an industry leader in mostly voice communications and then subsequently contact center. We still are industry leaders in the contact center and in critical communications infrastructure, in really reliable audio and voice. Our customers are the largest of the large: big airlines, large banks, financial services companies, governments. We service the US federal government and other governments across the globe. So we're very much a global company, and we really cater to those large customers that have really specific needs.
B
Yeah, great, great.
A
So, yeah, let's dive in, and I want to start with the strategy and the why of all of this. We're going to talk about Model Context Protocol, or MCP for those familiar with it already. But just starting at the top, it would be hard to escape that AI is everywhere and in every conversation, and so on and so forth. The recent boom started with Gen AI and the ChatGPTs and all that, and agentic AI is certainly taking over a lot of the conversations recently. But what I've certainly seen, and what many who are paying attention are seeing, is what I'd call AI sprawl, or whatever label you want to put on it. Everybody's got their own AI features now. Most have their own agentic component. So we're at this place of, you know, what do you choose? How do you choose? And what I touched on in the beginning is, does my AI hook up with this other platform's AI, and how do all these things work? Because while we may suffer internally in trying to get our work done, the ones who suffer the most are the customers. Right. So given that context, can you talk a little bit about what MCP is and why we should care about it?
C
Sure. Well, you're absolutely right about the AI landscape right now being kind of dizzying. It's changing so fast, and you never know when a new industry leader is going to emerge, and it's really hard to keep up. I was just at a conference a couple weeks ago with one of our partners, Databricks, and that was a real focus of what we talked about there: how quickly things are changing, and making sure that you prepare your company for a rapidly changing future. And I think MCP is an approach that really does that for agentic AI. So MCP stands for Model Context Protocol. This is a protocol that was established by Anthropic, one of the industry leaders in creating AI, and what it does is standardize the way that large language models can interact with the real world. If you think about a large language model, they're trained on a huge, vast array of different language sources. That's why they're called large language models. The model is built up by consuming just about every bit of text that humans have created on the planet. But they're built on things that have happened before, and all they can do is predict the next token, the next word that logically might follow based on what has happened in the past. So you can get a large language model to very accurately predict what the weather in Tallahassee might be based on history, but you can't ask it what that weather is right now unless you use the Model Context Protocol. It gives context to the model, so it gives the model the ability to reach out and ask a question about the real world, or take action in the real world. And this is really critical with agentic AI. Agentic AI means AI that has agency. So instead of a large language model being something that could, you know, suggest help with your homework, which was mind-blowing enough, now we can have large language models telling us what's really going on and then acting on that based on instruction.
So it takes the large language model capabilities completely to the next level. And the other great thing about it is it standardizes the way that those things happen, so it really democratizes that in a large way. You don't need nearly as much technology input to make these kinds of connections. With MCP, you can pick a large language model from any of the ones that are out there and connect them to different APIs, essentially, that are available either in your enterprise back office or on the Internet at large. All of that information then becomes available, and all the things that those APIs can do become available to that large language model to be effectual. I think it's a real game changer.
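To make David's description concrete for readers of the transcript, the tool-calling loop he describes can be sketched in plain Python. This is a conceptual illustration only, not the actual MCP SDK or wire protocol: the registry, the tool name, and the canned weather lookup are all hypothetical stand-ins for what a real MCP server would expose.

```python
# Conceptual sketch: an LLM alone can only predict from training data, but a
# registry of described tools lets it ask the real world a question. This is
# NOT the real MCP SDK -- all names and data here are illustrative.
from typing import Callable, Dict

# A "server" exposes named tools whose descriptions the model can read.
TOOLS: Dict[str, Callable[..., str]] = {}

def tool(name: str):
    """Register a function as a callable tool, the way an MCP server
    advertises its capabilities."""
    def register(fn: Callable[..., str]) -> Callable[..., str]:
        TOOLS[name] = fn
        return fn
    return register

@tool("get_current_weather")
def get_current_weather(city: str) -> str:
    # A real deployment would call a live weather API; a canned answer
    # keeps the sketch self-contained.
    canned = {"Tallahassee": "82F, partly cloudy"}
    return canned.get(city, "unknown")

def handle_tool_call(name: str, **kwargs) -> str:
    """What the protocol layer does when the model emits a tool call."""
    if name not in TOOLS:
        raise KeyError(f"no such tool: {name}")
    return TOOLS[name](**kwargs)

# Having seen the tool description, the model emits a structured call
# instead of guessing from its training data:
print(handle_tool_call("get_current_weather", city="Tallahassee"))
```

The point of the standardization David mentions is exactly this shape: any model that can emit a structured call, against any server that advertises its tools, without bespoke integration code on either side.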
A
Yeah, well, and building on that, to take this from the customer lens: I think AI certainly is able to do a lot of things, and when trained on the right data, do them well. But I think we found ourselves getting back into that phone tree doom loop kind of thing, where, okay, the chat is trained on this one thing, but to get from here to there... In other words, customers are dealing with very fragmented journeys again, in many cases, because they're hopping from AI to AI. How does something like MCP help to connect those journeys from that customer perspective?
C
So those journeys that you're talking about, they feel that way because they, they are very prescriptive. They are established based on what some contact center administrator thought you were going to do. And you might be doing something totally different as a customer.
B
Right.
C
The great thing about agentic AI and a large language model empowered by MCP is that it can adapt to your particular needs, and you can interact with it, and it doesn't follow that prescriptive path. It can react and tailor its response to your specific needs and your subsequent needs. It becomes a dialogue that's happening with the artificial intelligence. Now, that's a tall order for the contact center. Right. And enterprises are very concerned about the large language models hallucinating and things like that. And they're expensive to run, too. Right. So the Avaya Infinity product is designed to help our customers try these things out, maybe by helping human agents in the contact center first. Then, once you've monitored and tested to make sure that the agentic AI is performing well and has the right boundaries around it, you can use the same tools we empower the human agent with and make it available to end users. That kind of power and the flexibility of all the different tools you can give AI are the things that I think are really changing the game, especially in contact centers. Another thing I want to make clear: the contact center is a fantastic information domain for artificial intelligence to really make a difference, because a lot of the things that human agents are doing there are repetitive, relatively straightforward in many ways, and kind of routine. Right. AI is perfectly suited to handling those kinds of routine things. And if you can help a human agent with those routine things through agentic AI, it leaves time for that human agent to do things that only a human can do: to have empathy, to really listen, and to give your end customer a really excellent experience. So that's the vision that we think is unlocked with agentic AI powered by MCP.
A
Yeah. So in that approach, I would say it's elevating the role of the humans that are there, from having to do, to your point, the busy work or the repetitive stuff that is time consuming but doesn't require a lot of strategic thought or critical thinking. Then, when there is a challenge that has no script and no way to automate it, the humans can focus their time there, and everybody gets what they need, right? I mean, is it overly optimistic to say that AI could also play a strategic partnership role as well? So not only doing the repetitive work, but also forming a partnership with the humans to augment some of that higher thinking?
C
I think that's definitely not overly optimistic. We at Avaya have started talking about this concept, and I think my CEO, Patrick Dennis, who's a brilliant thinker, actually coined this term. We're talking about this idea of tandem care, right? Of AI and humans working together to improve the care that customers get in the contact center. That can be taking care of the routine things that I referenced before, but I do believe, like you said, some of the higher-order things can be empowered too. Now, those are the kinds of things that you want to definitely constrain to your back office and to your human agents, helping them out, because you don't want to necessarily expose that to your end customers right off the bat. But if you think about the way that AI is helping businesses across the globe right now, with knowledge workers really being empowered and able to do much, much more by connecting all these things together, the same thing is true for the human agent in the contact center. I definitely agree with you on that. That's a possibility of a really great outcome.
B
If you listen to the Agile Brand, you're here because you want to lead. You want insights to help you build smarter experiences, stronger brands, and customers who keep coming back. That's why you are going to love Simply CX, a brand new podcast from Microsoft hosted by Nicole McKinley, Microsoft's global customer experience lead. Simply CX takes you inside the conversations that are shaping the future of customer experience and engagement. Each episode features industry experts and business leaders from companies like CarMax, TD Bank, and T-Mobile exploring the innovations transforming CX. We're talking omnichannel journeys that actually connect, AI that enhances human connection, and how culture, collaboration, and technology can work together to deliver trust, personalization, and scalable growth. If you lead CX, influence CX, or simply want to understand how the most admired brands stay ahead, Simply CX belongs in your rotation. Hear real stories, practical lessons, and the innovative strategies that will define the next era of customer-centric success. Elevate the way your organization connects with the customers who matter most. Start listening to Simply CX every other Tuesday, wherever you get your podcasts, and follow host Nicole McKinley on LinkedIn to keep the conversation going. Still jumping between tools just to update your website? Framer unifies design, CMS, and publishing on one canvas. No handoff, no hassle, everything you need to design and publish in one place. Framer already built the fastest way to publish beautiful, production-ready websites, and it's now redefining how we design for the web. With the recent launch of Design Pages, a free canvas-based design tool, Framer is more than a site builder, it's a true all-in-one design platform. From social assets to campaign visuals to vectors and icons, all the way to a live site, Framer is where ideas go live, start to finish. Framer is a free, full-featured design tool.
Think unlimited projects, unlimited pages, unlimited collaborators, and all the essentials ready to design, iterate, and publish, all in one tool. Start creating for free at framer.com/design and use code AGILE for a free month of Framer Pro. That's framer.com/design, promo code AGILE. Rules and restrictions may apply.
A
As far as measuring success, I'm sure there are the traditional metrics: average handle time, first contact resolution, time to resolution. There's a million acronyms and names for a lot of these metrics, and I'm sure those don't go away. But are there other things that get unlocked, relating to customer lifetime value and loyalty and other things, when you're able to connect the dots so much better?
C
Well, I guess what I would say to that is the contact center is a great place to put AI out there and really measure how well it performs, because the contact center is a very constrained environment that already has very sophisticated measurement tools. Some of those metrics you mentioned, we can apply those same tools to AI, and you can really measure the effectiveness of AI very specifically with the same set of tools and the same set of analytics capabilities you're already using. We're doing that with Avaya Infinity: all of the capabilities that we use to measure the human agent, we're also using in our same analytics package to measure the effectiveness of AI. The other thing that is very true is that you have a more constrained domain for cost, so you can understand the cost of the AI and really measure the cost effectiveness of both. That's important, because large language models, depending on which ones you use, can be very expensive to run. And this is a place where I think there's going to be a lot of improvement in the next couple years. If you take one of the big, generic large language models, like OpenAI's latest one out there on the Internet, it's designed to answer anything that anybody asks about anything. So it has to be huge, and it's going to be super expensive to run. But an enterprise can create much more focused large language models about a much more specific information domain, train those on that domain, and those are much more cost effective to run. So you can very effectively measure both the effectiveness in terms of customer satisfaction and the cost to deliver that, compare that to the cost to deliver it with a human person, and make your decisions that way. From my perspective, it's all about return on investment for the enterprise context.
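The ROI comparison David describes, measuring AI and human agents with the same metrics and then comparing cost, can be reduced to a back-of-the-envelope calculation. This is a hypothetical sketch; every number below is invented for illustration, and the metric chosen (cost per resolved contact) is one plausible framing among many.

```python
# Back-of-the-envelope ROI comparison: same metric applied to AI and human
# agents, as David describes. All dollar figures and rates are made up.

def cost_per_resolution(cost_per_contact: float, resolution_rate: float) -> float:
    """Cost to achieve one resolved contact, given a per-contact cost and
    the fraction of contacts actually resolved on that channel."""
    return cost_per_contact / resolution_rate

# Hypothetical figures: a human-handled contact costs more but resolves
# more often; an AI-handled contact is cheap but escalates more.
human = cost_per_resolution(cost_per_contact=6.00, resolution_rate=0.90)
ai = cost_per_resolution(cost_per_contact=0.50, resolution_rate=0.75)

print(f"human: ${human:.2f} per resolution")  # $6.67
print(f"ai:    ${ai:.2f} per resolution")     # $0.67
```

The design choice worth noting is dividing by resolution rate rather than comparing raw per-contact cost: an AI channel that deflects cheaply but fails to resolve simply pushes cost back onto the human queue, which the raw number hides.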
B
Yeah, yeah.
A
And so for leaders planning out there, what's the timeline for, for instance, Avaya adopting MCP? And what does that look like as far as implementing it and being able to connect it with other offerings?
C
We're demoing it right now, so you can see demos of it with our Avaya sales engineers. It'll be available in pilot at the end of Q1, and then it'll be available in production for our end customers in calendar Q2 of 2026. Yeah, yeah, nice.
A
And so looking beyond the contact center, I know that's been the focus of our conversation here. If MCP is successful in being able to create this kind of shared brain of AI, where else do you see potential in the enterprise for it to have a great impact in the next few years?
C
So the horizon is really pretty wide open there. I'm seeing it really helping my teams and our product organization. We just used a large language model to document an API of ours, and it was a product manager that did it, not even an engineer. Right. So things get unlocked for just about any knowledge worker when you connect all of the back office tools that you use. Anything that is a piece of software running out there, if it can offer up its capabilities as an MCP server, then you can have the large language model interacting with these disparate tools. In my case, as a development leader, it's code repositories and ticketing systems. You can have the LLM working across both to unlock information about how those two systems relate, which typically is done by a human going back and forth. When you connect two, three, four of those back office systems together, you can get really fantastic results. So I think it really is a wide open horizon, and what intrigues me is to think about what we are going to do as a society with that additional productivity. It gets back to this tandem care idea: we believe that the human and AI working together can create much, much better experiences, much, much better outcomes for everybody. That's my hope. That's kind of the dream. Yeah, yeah, I love that.
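David's code-repository-plus-ticketing example can be sketched in the same tool-exposure style. This is purely illustrative: the ticket IDs, commit data, and function names are hypothetical, and a real setup would expose each system through an actual MCP server rather than in-process dictionaries.

```python
# Sketch of exposing two back-office systems (ticketing and a code repo) as
# tool-style interfaces so one agent can cross-reference them. All data and
# names are hypothetical.

# Toy "ticketing system" tool
TICKETS = {"BUG-101": {"title": "Login fails on retry", "status": "open"}}

def get_ticket(ticket_id: str) -> dict:
    return TICKETS[ticket_id]

# Toy "code repository" tool
COMMITS = [
    {"sha": "a1b2c3", "message": "Fix retry loop in login handler (BUG-101)"},
    {"sha": "d4e5f6", "message": "Update onboarding docs"},
]

def search_commits(keyword: str) -> list:
    return [c for c in COMMITS if keyword in c["message"]]

# The hop a human usually makes by hand -- looking up a ticket, then
# searching the repo for related work -- becomes one orchestrated step
# once both systems are exposed as tools:
def commits_for_ticket(ticket_id: str) -> list:
    get_ticket(ticket_id)  # confirm the ticket exists before searching
    return search_commits(ticket_id)

print(commits_for_ticket("BUG-101"))
```

The value David points at comes from the cross-system step: neither system alone can answer "which commits address this ticket," but an agent holding both tools can.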
A
Well, David, as we wrap up here, a couple last questions for you. Thinking a year ahead, if we did this interview one year from now, what is one thing that we would definitely be talking about?
C
I really think I'll go back to that concept of more purpose-specific, more focused AI models working in more concretely defined areas for cost-effective delivery. There's a question out there whether large language models are really going to get much better. I know the big AI companies are working on general intelligence; who knows if that's going to come, right? What I see happening all the time right now, across a variety of industries, is specific models getting really, really good within very focused constraints, so that they don't hallucinate, getting very, very good at specific tasks. More and more, the big large language model providers have these tools out there, and now all these smaller companies are filling in all these different niches with super effective AI, and it's much more cost effective because the models are smaller. I think we'll see more and more of that: oh, what a great use case, that company's going to be successful. I mean, my team right now is using a software development tool that leverages AI with tremendous effectiveness. We're seeing so much improvement in the throughput of my developers. So we'll see more and more of those kinds of things happening in the next year. That's what we'll be talking about.
B
Yeah, love it.
A
Well, David, thanks so much for joining today. Last question for you before we wrap up: what do you do to stay agile in your role, and how do you find a way to do it consistently?
C
Well, before we wrap up, I want to say thanks again, Greg, for having me on. I really enjoyed our conversation. Yeah, of course. For me, to stay agile, I would say two things. I like to surround myself with really cool and effective people. I am only as good as my team. And I also like to have a good time, and I like my team to have a good time. We spend a lot of time at work, and if we're miserable at work, we're wasting a lot of our life. I would much rather have a good time and laugh. I try to make every meeting as fun as possible, and that keeps me on my toes, and I think it makes my team happier to be working, and I think we're all more productive because of that. So that's what I try to do. Yeah.
B
Love it.
A
Well, again, I'd like to thank David.
B
Funk, Chief Technology Officer at Avaya, for joining the show. You can learn more about David and Avaya by following the links in the show notes. Thanks again for listening to the Agile Brand brought to you by Tech Systems. If you enjoyed the show, please take a minute to subscribe and leave us a rating so that others can find the show as well. You can access more episodes of the show@theagile brand.com that's theagile brand.com and contact me. If you're interested in consulting or advisory services or are looking for a speaker for your next event, go to www.gregkilstrom.com that's G R E G K I N H L S t r o m.com the Agile brand is produced by Missing Link, a Latina owned, strategy driven, creatively fueled production co op. From ideation to creation, they craft human connections through intelligent, engaging and informative content. Until next time, stay curious and stay agile. The agile brand. Before we continue, I wanted to share.
D
A key strategic resource that a majority of the Fortune 500 are already aware of. Finding the best technology, business, and talent solutions is not easy. With business demands and competitive pressures mounting, you need to be able to design, deploy, and optimize your technology to provide leading customer experiences while driving business growth. Those of you that have been listening to this show for a while know that this podcast is brought to you by TEKsystems, a global provider of technology, business, and talent solutions for more than 80% of the Fortune 500. TEKsystems accelerates business transformation for their customers. Whether you're looking to maximize your technology ROI, drive business growth, or elevate customer experiences, TEKsystems enables enterprises to capitalize on change. Learn more at teksystems.com. That's T-E-K systems dot com. Now, let's get back to the show.
Guest: David Funck, CTO, Avaya
Title: Building Persistent Memory of the Customer with AI
Date: December 17, 2025
In this episode, host Greg Kihlström sits down with David Funck, Chief Technology Officer at Avaya, to discuss a pressing challenge in customer experience: how to break AI “amnesia” and create persistent, contextual customer journeys across platforms. Central to this conversation is the Model Context Protocol (MCP), a standard emerging to facilitate shared context between disparate AI systems—transforming how brands build customer lifetime value and operational agility, especially in enterprise contact centers. The discussion spans everything from AI sprawl and agentic AI, to tandem care between humans and AI, key metrics for success, and the practical rollout of interconnected, context-aware AI experiences.
On AI Sprawl:
“Everybody’s got their own AI features now. Most have their own agentic component. So we’re kind of at this place…what do you choose? …The ones who suffer the most…are the customers.”
— Greg Kihlström [04:21]
Defining MCP:
“MCP is a protocol that standardizes the way that large language models can interact with the real world.”
— David Funck [05:44]
MCP as a Game Changer:
“It gives the model the ability to reach out and ask a question about the real world or take action…”
— David Funck [07:23]
Agentic AI & Customer Journeys:
“It becomes then a dialogue…with the artificial intelligence. …if you can help a human agent with those routine things…then it leaves time for that human agent to do things that only a human can do…”
— David Funck [10:00, 11:17]
On Tandem Care:
“We’re talking about this idea of tandem care, right? Of AI and humans working together to improve the care that customers get in the contact center.”
— David Funck [13:06]
Enterprise Application of MCP:
“It really is a wide open horizon …what we are going to do as a society with that additional productivity. Gets back to this tandem care idea…”
— David Funck [20:56]
What’s Next for AI Models:
“Specific models getting really, really good with very focused constraints so that they don’t hallucinate…we’ll see more and more of those kinds of things happening in the next year.”
— David Funck [22:21]
Leadership & Staying Agile:
“I like to surround myself with really cool and effective people…if we’re miserable at work, we’re wasting a lot of our life. I try to make every meeting as fun as possible and that keeps me on my toes…”
— David Funck [24:00]
| Timestamp | Segment Description |
|--------------|---------------------------------------------------------------------|
| 01:07–05:31 | Introducing MCP, AI sprawl, and customer pain points |
| 05:31–08:50 | Technical breakdown of MCP and its implications for AI |
| 08:50–12:11 | How MCP and agentic AI improve customer journeys and contact centers |
| 12:11–14:19 | Human-AI partnership ("tandem care") and elevating agent roles |
| 16:42–19:17 | Measurement, analytics, and ROI in AI-powered contact centers |
| 19:18–20:15 | Avaya's MCP rollout and demonstration timeline |
| 20:15–21:59 | Broader enterprise applications: knowledge worker productivity |
| 21:59–23:43 | Predictions for the next year: focus on specialized AI models |
| 23:53–24:40 | David Funck shares agile leadership philosophy |