
A
Welcome to Risk Never Sleeps, where we meet and get to know the people delivering patient care and protecting patient safety. I'm your host, Ed Gaudet.
B
Hello. Hello, everyone. Welcome back to the AIMed Insights podcast series, recorded live here from San Diego. I'm here with my co-host, Ed Gaudet.
A
Hey, Saul.
B
So good to be with you again, brother.
A
Yeah, I feel like I spent way too much time with you.
B
Too much time.
A
Yeah.
B
But it's a break.
A
I need a break from you, so.
B
Yeah, after this, you'll get one.
C
Okay.
B
Okay.
C
Yeah.
B
Yeah. But it's been great. It's been great.
A
You're home, though. You're here on site.
B
Yeah. So it's pretty good to be here. It's been so great. Day two, and it's about to get over the top.
A
That's right.
B
Because we have the super connector.
A
It's getting hot in here.
B
The brain.
A
That's right.
B
The most outstanding.
A
Right? The wizard of Wonderness.
B
Her name is Sherri Douville.
A
Right.
B
She is the CEO of Medigram. Sherri, welcome.
C
So kind.
A
And chair of the TTIC too, right?
C
That's true, I am. And you are also on the executive team. I'm a co-founder. You bring magic. We're a great team, Ed.
A
Sherri is an uber-connector. I thought I was a connector. Then I met Sherri and I'm like, whoa.
B
I'm impressed.
C
Teach me. We work together. We are a dream team.
A
Teach me, Sensei.
B
I mean, dinner last night. Wow.
A
Yeah, I know. Dinner. She pulled that dinner together. That was.
C
We worked together.
A
I know.
B
We were masterminds.
A
We would have had three people there, actually.
B
And I will say, both of you guys are just like, wow.
A
Really?
B
I thought I was good.
A
Yeah, no, she's.
B
And then I hang out with you guys.
C
We were playing tennis, him and I.
B
Yes, we were.
C
Yeah. We were like, how about this person?
A
Yeah, that's true. We were back and forth. No one texts more than Sherri, by the way.
B
No one texting.
C
Is that just to you?
A
No one texts more. No one texts more. No. And she has multiple text chains. I don't know how you do it. Have you created a digital twin already?
C
I might have a few.
B
That's how she does it.
A
That's how she does.
C
I might have a few agents.
A
She's cloned herself. Yeah. How's Arthur doing?
C
He's doing great. He's the best.
A
He's the best.
C
Isn't he? Very fortunate.
A
It was so great. I'm so glad he came to dinner last night.
C
Oh, thank you so much.
A
So good.
C
To see him. Yeah, he's amazing.
A
What a great community you've created with Anthony. I'm just along for the ride.
C
No, you have absolutely amplified it. You're amazing. You bring so much gravitas and executive presence. Thank you. Leadership, and really bringing a lot. Both you and Saul brought a ton to this meeting, and thank you so much. Everybody has said that, so thank you.
A
So what have you heard that's really interested you these last couple of days? Like, what have you... I can't even talk anymore.
B
I don't know. What have you learned? Yeah, what have you learned?
A
There you go. That's the word I was looking for, learn.
C
You know what I've been learning this whole time? Because I've been working on this module, we brought everybody that you need to build, deploy, implement, and integrate AI. I've been sweating bullets ever since the One Big Beautiful Bill Act got passed. Because you have a lot of friends in every corner of medicine and healthcare, I really know what it means and what's going to happen, not just with OBBBA itself, but also with everything with CMS, with the states. And we've got a lot of hospitals in trouble. Everybody talks about rural hospitals, but we've got lots of hospitals up for sale in lots of states. In my hometown, we've got a county system. Now, we did get what you might call a bailout. We had a Measure A that got passed, so it looks like they're going to get a little bit of breathing room. But we've got up to 60% Medicaid in that county. And so we've been sweating bullets, not for ourselves, but for the population. A lot of us are humanitarians and don't want to see the population at true risk. And we're talking about regular working people. We're not talking about just homeless people on the streets. We're talking about people with regular jobs, with regular families, having regular access to health care. We're not talking about some random rural state. People don't realize that in Silicon Valley, regular working people are on Medicaid. And that's what kills me. So ever since the realization of what's happening with the health system, I've been putting this module together, just feeling the weight of the opportunity that we have to bring the best people in the field to help them understand their obligation to really make AI work for the health system. Not as a bright, shiny innovation object, not as something cool and fun to do, but to make it really work, to improve outcomes not just clinically, but, like Dr. Emby said in our session, to make it work also financially for the health system. Because if you don't make it work financially, it doesn't work for anyone.
A
No margin, no mission.
C
Exactly. Yeah. So just feeling the weight of that has been an incredible learning experience, this whole time working with the team. We had 20 speakers with TTIC rolling into AIMed, and I've been learning from them. They've been so generous with us this whole time. Yeah.
A
And you've been working at this for the last, what, six to nine months, I think. Yeah, it's been incredible.
B
And Sherri, it'd be good to share what TTIC is, for those that don't know.
C
Oh, thank you. Yeah. So TTIC is born out of... I am co-chair of trust for IEEE 2933, which is an ANSI-accredited standard that I worked on with a good friend, a mutual friend of Ed Gaudet's. Mitch Parker is the vice chair of the standard.
A
Shout out to Mitch.
C
Shout out to Mitch.
A
We had Florence Hudson on.
C
Yeah. She was the founder, co-founder with Mitch, of this standard. And so Mitch and I put this TTIC together with Ed and some others, because we needed a multidisciplinary group to be able to take this technical standard to market. That's the TTIC group, because IEEE is really primarily engineers, which is essential to have. It's a full-stack standard: trust, identity, privacy, protection, safety, security. I'm not going to get into the details, because I'm sure Florence probably got into more.
A
That was TIPPSS, remember?
B
Yeah, she went into detail.
C
Yeah. So she got into more of the details: the full stack, back-end databases, networking, interoperability, devices, data infrastructure. It's primarily engineers, medical device engineers, across the full stack and networking. But what we saw was that you also need other people to buy into the standard, to implement the standard, to improve the standard. So what we did was take a multidisciplinary group, including CIOs, CISOs, engineers, and physicians that would have to implement it, and create this group around TTIC, and with Alta, with Ed's help, put TTIC around that, with this core group from the standard. That's how we started TTIC. And then my company, Medigram, actually executes the standard from a technical perspective. So we sit under TTIC, but TTIC itself is a non-commercial entity.
B
Yeah, that's fantastic.
C
Yeah.
B
Just wanted to level set for those that don't know. Yeah. And by the way, folks, we'll link up the interview with Florence if you want to deep-dive it. She goes.
A
The Florence interview.
B
Yeah, she goes into more details.
A
Yeah. And we can put the links for TTIC and Medigram, no doubt, in the show notes.
B
Now we are doing a really fun lightning round. I don't know, have you been on Ed's podcast?
C
Yeah, I was on Risk Never Sleeps, one of the first ones.
A
Yeah. We've added some new questions.
C
We've been.
A
Yeah, new questions. Let's go. If you could go back and change one decision you made in your life, what would it be?
C
Oh, wow. I'm sure you can relate to this as an entrepreneur. So there's certain advisors. I'm not going to name them obviously, but there's probably certain advisors that I would probably pass on. Yeah.
A
Yeah.
C
Okay.
A
That's fair. Yeah. Why are advisors so important to the entrepreneur?
C
Well, because they have experiences. But I think one thing I've been passionate about is knowledge to execution. One of the things that I wanted desperately to get across today is the seven domains that you have to tackle, and the eight layers of policy to practice, and how we make that work with a rational operating model. And so the challenge with advisors is that they have to be really good. And the thing is, there are just not that many people that are really good in the space, because the work is so hard. It doesn't mean they're not good people. It's just that most...
A
Not a good fit.
C
Well, it's not even that. Yeah, not a good fit, but not a good fit because people love to talk. There are so many smart talkers. And me, Ed, I'm not afraid to say what I'm not good at.
A
Yeah.
C
But I hate just talking. And I'm not afraid to talk, but I want to get stuff done. If I cannot get the job done, I don't want to talk about it. I want to get the work done. And I actually get pissed at people that talk and don't get work done. I don't want to hear them talking. If you just want to talk and don't get the work done, I don't want to know you. I want you away from me.
A
I love that.
B
Sherri, tell us how you really feel. You are gangsta.
A
That's a little bit gangster, right?
C
Yeah.
B
I like that.
C
But there are so many people out there that want to be advisors, that want to be innovators, that want to talk. And I want those people to stay away from me, but those people always want to come talk to me about being advisors. And I want to know: how did you turn that into execution results? As soon as I start pressing the heat on them, they go away. And that's how I weed them out.
A
Yeah, that's right. No, it's true. I sum it down: you have people that can think, and people that can do. Then you have people that can critically think and execute. And the people that can critically think and execute together? That's rare.
C
Exactly.
A
Top of the 1% of the 1%.
C
Exactly.
A
And so getting the right advisor, one that can do either of those really well, or both, is so important.
C
Exactly.
A
Because that's what you need as you're bringing something new to market.
C
Right?
A
Exactly. You're birthing something that you've never done before, or that the market's never seen before. And an advisor that can help you think through the things you don't know, and that he or she doesn't know, is so important.
C
You know what I'm talking about? Innovators. We have so many talking innovators.
A
Yeah. And the ones that want to be heard.
B
So true.
A
The ones that want to be heard, like, they talk so much. They want to be. That's terrible. Right?
B
They, like. They're the worst.
C
Yeah. Yeah.
A
They're not even hearing what they're saying, like, how I'm rambling on like this.
C
Right.
B
You're a doer, though. You're a doer.
A
I do. I do a lot of things. Yeah, I like to get things done. Like Sherri, which is why we get along.
B
That's why I love you guys.
A
Yeah.
B
Like, this is good.
A
Yeah.
B
At our table here.
A
I'm here because of Sherry.
B
I'm here because of you.
A
Oh, I know.
B
It's just a domino effect.
A
It is. Yeah.
C
I'm grateful you guys have totally raised the game here, and thank you so much.
A
It's been incredible. Yeah, it's been great.
B
It's our pleasure.
A
I just feel honored to be here, like, to be able to do this. And we had an amazing set of guests that came through, including you, obviously. But, like, the level of guests we had today. Yesterday was great. Today was just another level.
C
When I found out how much work it took for you to get here, I started packing your schedule.
A
I love that.
C
I love that.
A
That's so awesome.
B
Because you crushed it.
A
Because you did. You crushed it. Like, having people like Arlen and Florence and Doug and Josh and Chris and... oh, it's just so good.
C
Yeah, that's me. But I don't ask people to do stuff unless I'm gonna deliver.
A
You delivered? Yeah.
B
You deliver, Sherri.
C
Yeah, that's what I do. Yeah.
B
You're the mail woman. You deliver.
C
Thank you.
A
Oh, I thought you were calling her a male and a woman.
C
Like, I'm not gonna ask Ed to come here from Boston.
B
Yeah.
C
With all of his stuff and operations and then be like, not have a schedule of awesome guests. It's just, like, not how I roll. Yeah.
A
I came here instead of going to CHIME, which I love.
C
We love CHIME. Yeah, we love CHIME. We respect CHIME. So I have to make it worth his time. That's how I roll.
A
I sent the A-team to CHIME, so I don't need CHIME.
B
You got a great team.
C
Congrats on that.
A
What's that?
C
Congrats on that.
A
Oh, thank you.
C
Yeah, do that.
A
Isn't it great to be able to scale?
C
Yeah, it's amazing. I aspire to that.
B
So we were having a really great conversation, actually, at HLTH in Vegas, about the importance of team.
A
Yeah.
B
And what are your thoughts on that?
C
Oh, it's everything. It's everything.
A
She lives and breathes it every day.
B
Yeah.
A
Did you see Kate?
C
I haven't had.
A
She's here. Yeah, she came in. Yeah, she was just there. I just saw her. Yeah, she's been in. She had snow.
C
Yeah, I saw a picture of the snow. The team is everything. So we have an organizational psychologist, shout out to Karen Ja Madsen. So we use years of science about that. I think one of the things I'm grateful for is that I can't survive without a team, because I have extreme strengths, and I also have other things that make it so that I can't live without teamwork.
B
So what's your top tip, in the age of AI or not, to build a great team?
C
Extreme self-awareness. Because unless you have extreme self-awareness about what you contribute to a team and how you hinder a team, you can't facilitate the team. As a leader, you really have to be the person that sets the tone. Every leader has drawbacks and gifts, and you have to use your gifts to propel the individuals as individuals, as well as the team as a whole, and make sure that your drawbacks don't hinder the team. That extreme self-awareness has to come out every day. Yeah. You're never going to be perfect.
B
Great advice.
C
Yeah.
A
Great advice. Yeah.
C
Yeah. Thank you.
A
And you do really well at that, too. You've assembled an amazing group of people.
C
Oh, thank you. I love a perv. Oh, who doesn't?
B
I know, but I just love the organization.
A
He's such a nice guy.
C
What a great.
A
And Anthony Lee, Come on.
C
He's so coachable too.
A
Yeah. Oh, what a.
C
What?
A
Just last night's dinner was so fun.
C
I'm so glad. Well, thank you for spirit.
A
Oh, no, no. Listen, if people aren't there, it's not worth it.
C
Right.
A
Like having that group of folks together. That was amazing. Yeah.
C
TTIC is only two years old, and now we've got 20 people on stage. And I think it's a great team.
B
Exponential progress there.
A
Yeah.
C
Yeah. So thank you. Shout out to Chuck Podesta. He's been our sponsor. Because, really, talk about gravitas: one of the top CIOs in the market. Because it's risky for a top CIO to put.
B
Where's he now?
C
He's at Renown Health.
B
Oh, he's with Renown. Okay.
C
Yeah. Mitch Parker, as we already shouted out. Steve Ramirez. Yeah.
A
Who had a baby.
C
Yeah. Congratulations.
A
He's not here.
C
Yeah, exactly. Or else he'd be here. He's also on the advisory board. So for these guys that have reached the pinnacle of their careers in these very traditional CISO and CIO roles, for them to get behind a new organization, it's a big risk. Right? But thanks to them, TTIC has really been able to do a lot of things that the industry needs, complementing the existing organizations, not competing with any of them. And so we're thankful for that. And hopefully they've gotten a lot out
B
Of it. Or been a tour de
A
Force, I think, in an industry. And it's just getting started, like you said.
C
Right. Yeah. Just two years old. And hopefully we'll be able to amplify these other organizations and help them and the leaders that have co-founded them.
A
Yeah, as well. Going back in time, what would you tell your 20-year-old self?
C
Just to not worry and just have fun. Right? My first mentor was the chair of the physics department. He was just a huge sponsor and, interestingly, a priest at my university. He really pushed me. He always had a lot of faith in me and pushed me to the max in all these different, very hard areas. And I just wish I could have trusted his message more at the time, because he saw really big things for me, because I was, like, the only woman doing all these crazy things then. He was just always pushing. And I wish I could have absorbed his message faster. Dr. Drummond.
B
Is his name Dr. Drummond?
C
Yes, Dr. Drummond.
A
Is he still with us?
C
No, no. But he was a legend. Yeah. Yeah.
A
Santa Clara.
C
Yes. Oh, yeah. Santa Clara.
A
Nice.
B
That's awesome.
C
Yeah.
B
Yeah. And I feel like, for folks listening, that's a good message to take home. Right? What advice would you give to people now, knowing what you know?
C
So, I'm telling myself that same message, because we have a small window of time, again, with this whole thing. I am bringing ethics into medical AI, and I have a small window of time to make sure that really happens for medical AI. What I told the room during my module is that we're in an AI-driven economy. We have B's and T's in this AI-driven economy. And what I told the doctors was: don't be afraid. Just start driving your own movements, using Greg Satell's Cascades. No one's going to stop you. These giant AI companies, they're not against you being ethical. Right? They just don't even know what it means. They're not going to stop you. They just don't care. I hope that's not controversial. It's just not their business.
A
No.
C
You know, it's not their business. It's not their business model. But they're not against it. They just don't have any reason to be doing it.
A
Yeah. We had a guest talk about moral injury today, which is doing something that you know is not in the best interest of the patient, but that the organization is asking you to do.
C
Oh, yeah. Is it because they didn't know any better? A lot of times, I think it's because they just don't know any better.
A
I think it's a combination of that.
C
Yeah.
A
And I think, from an instinct perspective, your instinct will tell you, if you're listening, if you're tapped into it, if you're in sync with it. Oftentimes we know when we make a decision if it feels right; there's no friction. But if there's ever an ounce of hesitation, should I make that decision or not, your gut's gonna tell you, if you're aware, if you're listening, if you're open to it.
C
But see, the problem is, sometimes there's a lot of complexity in this world, of course, which is why we worked really hard to demystify a lot of this complexity. The challenge for a lot of administrators is that they don't really understand all the layers. So they're going to be asking the physicians to do things that they might not understand, because they don't understand the technology, or they don't understand how the technology intersects with the clinical care. And that's what I was saying: the physicians, often having very high IQs, need to step in and step up, because they do have the intellectual capacity to connect the clinical care with the technology. That's what I'm encouraging them to do. A lot of the time, the administrators may not have the time, and they may not have the capacity. And since the physicians often can, they must. That's the point. That's my message.
A
Yeah, yeah. This one was a little more basic. It was more about: are we in the business of keeping people sick or keeping them well?
B
Yeah, it was about diagnostics. Yeah, it was imaging and diagnostics and.
A
Also prescribing the wrong drugs, or too many meds, when you know that it's probably not in the best interest of the patient.
B
About RVUs versus care. Yeah, yeah, yeah.
A
So it was only in that way.
C
Oh, I see.
A
Which is an interesting. It's an interesting opportunity with AI to change everything because it's such a.
C
Like, what's the business model?
A
Yeah, you get an opportunity to change everything with AI. It's that powerful. It's a revolution. It's not like, incremental. Right. And it's bigger than the Internet.
C
Oh, totally.
A
And so what are we going to do if we just apply it to the same old business models for, like, 2x? Is that really what we should be doing with AI?
C
Yeah, I see what you're saying.
B
Right.
A
Or should we be rethinking everything in a new age of AI? Because it's very different.
B
Rethink the things that need to be rethought, because there are some things that are good, really, and there are some things that should be redone.
A
Yeah. Do you generally think the care system is good right now? I think there are good people.
B
Yeah, there's.
A
But do we think the system is good?
B
There's a lot of room for improvement out here.
A
Being diplomatic, I think.
B
I think.
A
I think the care.
B
I think it's terrible.
A
Somebody put up on the screen how we rank as a country.
B
It's sick care, right?
A
Yeah. We're like the worst compared to other countries. And we spend more. And why is that happening?
B
Don't get me started on insurance.
A
Yeah. So if you incrementally apply AI.
B
I'm not gonna bash the whole thing. Like, there's things that work, but it's complex.
C
I think it's complex.
A
It is complex. But if someone doesn't bash it, you get no change. Someone's gotta speak up. I'm not a revolutionary, I'm not an activist, but I think we're at this precipice in time where we can take a technology that is so powerful and apply it to actually change things. Everyone talks about rural hospitals and how they don't have enough money.
C
Right.
A
The technology now is democratized, for the most part. They can do so many things that they couldn't do before if they apply AI even a quarter of the way.
C
Here's what I want to do. I think what we can do, and what we must do, is relieve a lot of the administrative bloat and give time back to doctors. It's not quite revolutionary, but I think...
B
I'm with you on that one.
C
And that's what I'm actually working to do at Medigram, which I didn't get a chance to get into in the module, because I wanted to set some foundation with some infrastructure. There are a lot of extra layers that can and must be automated. It's not about taking away jobs that are active right now; it's about not filling new ones when they don't need to be filled, and automating that. One simple example is making sure that clinicians don't need to take on numbers of patients that are unreasonable. That's one of my goals, one of my personal work goals.
A
The numbers are unreasonable, not the patients.
C
The numbers. Volumes of patients that are unreasonable, so that their workload is a physically reasonable workload. Right. The administrative bloat that we have in the health system has to be addressed, and that's something I'm personally working on, as well as the whole denials thing. Yeah. Those are the two personal things that I'm tackling.
A
Denial of claims.
C
Yeah.
B
Like prior auth.
C
Yes, stuff that I'll be talking to you more about. I can't totally get into it here in public, but it's a good...
B
Excuse for a part two.
A
Yeah. When you're ready to launch, let's go.
B
Yeah, let's get you back on.
C
But we'll be discussing that more this year. So I think there are certain things that create a lot of waste that need to be dealt with. And again, it's complexity, and it's interests. How do you tackle that in ways that make sense and make use of the resources that are actually there?
B
That's great.
C
Yeah.
B
I love it. Sherri, it's been a pleasure to have.
A
You on the show.
C
Thank you.
A
Yeah. Really good to see you.
B
Good to see you, as always.
C
Always. Yeah. Thank you so much for being here and doing so much, too. But don't you love that the people here are actually brilliant people?
A
Oh, it's unbelievable.
B
Substance over show.
A
Yeah, substance over show. This is like no other conference, right?
C
Exactly.
A
The content is amazing. You said this, but, like, I experienced it. It's so good.
B
I was on the fence of coming.
A
Yeah, we talked about this. He's like, "I don't know." I'm like, "No, you don't understand. You live next door to this place."
B
And I said, dude, all right. I trust you.
A
He did. He finally got there.
B
He's like, okay, all right, I'll just do it.
A
I'll just do it. Just do it.
B
And I'm so glad I did. This has been fantastic.
C
I'm so happy.
B
Where can people reach out to you and find out more about the work that you do?
C
Yeah. Please go to LinkedIn.
A
You will find her on LinkedIn.
C
Yeah. Artemis.
A
Yeah.
C
Yeah.
A
Prolific poster.
C
Yeah. Yes. I just find that people.
A
She's a connector. I told you.
C
It's the place where they can find what I'm working on.
B
Awesome.
C
Yeah. Thank you.
B
And Medigram.
C
Yeah, that's easy.
A
Connect there, too.
B
We'll throw that all in the show notes.
A
Yeah.
B
Sherri Douville, CEO of Medigram, with us here at AIMed 25. Thanks for joining us, Sherri.
C
Thank you. What an honor to be here with you.
A
Thanks for listening to Risk Never Sleeps. For the show notes, resources, and more information on how to transform the protection of patient safety, visit us at Censinet.com. That's C-E-N-S-I-N-E-T dot com. I'm your host, Ed Gaudet. And until next time, stay vigilant, because risk never sleeps.
Host: Ed Gaudet
Guest: Sherri Douville, CEO of Medigram
Date: December 23, 2025
In this episode, host Ed Gaudet welcomes Sherri Douville, CEO of Medigram, to discuss why “ethical AI” in healthcare means little without translation into real-world results. From deep dives into the complexity of healthcare systems to the realities of executing on bold visions, Sherri and Ed explore the intersection of technology, policy, team building, and ethics. The conversation highlights the gap between theory and execution in AI, the power and challenges of multidisciplinary teams, and what it really takes to protect patient safety in a digital age.
This episode puts forth a clear message: ethical AI, particularly in medicine, is not meaningful without real, on-the-ground execution that addresses both systemic and frontline needs. Sherri Douville’s candor about the work, the challenges, and the hope for transforming healthcare via technology serves as practical advice for both current leaders and future innovators. The value of diverse, execution-driven teams and the empowerment of clinicians are vital themes threading through the conversation.
[Prepared for listeners and those interested in digital healthcare leadership, AI execution, and genuine change in patient safety.]