
B
Welcome to Risk Never Sleeps, where we meet and get to know the people delivering patient care and protecting patient safety. I'm your host, Ed Gaudet.
A
Welcome to the Risk Never Sleeps podcast, in which we learn about the people that are on the front lines protecting patient safety and delivering patient care. I'm Ed Gaudet, the host of the program, and today I am pleased to be joined by, I think I can consider you a good friend now, Chris Johnson.
C
Perfect. Yes. Good friends.
A
Good friends. And you are the Senior Director of Cybersecurity Compliance Programs for GTIA.
C
Yes, GTIA. We are a global technology industry association. I guess you could say we're one of a kind. We're focused on providing the place to be for the IT pro and vendor community.
A
And if I recall. So, full disclosure, I was on your podcast. Do you want to give that a plug?
C
Quickly, a quick plug for MSP 1337: cybersecurity focus all the time, and raw conversation. I suspect not too far off from what we're going to talk about today.
A
We'll try to get weirder, though, I think. I don't know if that's possible. But I remember the conversation; I remember enjoying it. And I also remember that GTIA was birthed through some other organization. Did I get that right?
C
Sort of. So we used to be known as CompTIA, the Computing Technology Industry Association. We are the same association with a brighter name. Okay, so new logo, new name, same association. We've still been around for 40-plus years. That hasn't changed.
A
Great. And how did you get involved in the industry?
C
That's a funny question. Like many people in IT, you kind of fall into it; I didn't go to school for it, that kind of thing. I decided I could do something in IT better than the company I was working for, so I started my own thing. We all probably have very similar stories if you're in IT at all. Fast forward to the COVID timeline, and I was really burned out from the constant travel, the constant risk evaluations of banks and other companies. And my wife's like, hey, our school is hiring an IT guy. Maybe you should do that. And I'm like, you know, my hands are now soft. I hadn't turned a screwdriver in a very long time. And they hired me. I showed up to a meeting thinking that I was volunteering to help at a school and walked into a room with the entire school board and superintendent, like, hey, we'd like to offer you a job. And I'm like, why? I was so not ready for what they put on the table. And I did that for a little over four years. And right after that I got the reach-out to come to GTIA. It's funny, because all the decisions that you and I both had to make during the COVID window were reasons why I had left what I was doing before COVID happened, like constant travel. And then all of a sudden I would no longer have had an office to go to if I had stayed at the company I'd been at. Right? Those are the things that happen. So it's crazy. Anyways, here I am. I'm now heavily involved in compliance programs. That's my passion; it always has been. And GTIA has really helped me use that as a platform to work with other people.
A
Any particular sectors, or is it cross-industry?
C
For me, in compliance programs, it's predominantly serving the ITSP, which is kind of the new framing of what would be an MSP or an MSSP, to be a little more inclusive, because there are variations in that, and really to help them mature their cybersecurity posture. Right? We talk about left of boom, we talk about right of boom. And I think it's easy sometimes to talk about right of boom: hey, if you have these things in place when bad things happen, you'll be ready to go. And you're like, yeah, but what if I had things in place so that I don't ever have to execute the break-glass scenario, the recovery plan?
A
Yeah, yeah, yeah, yeah.
C
And again, I don't want to downplay the importance of being ready for the right of incident. But I think the model isn't keeping up from a risk standpoint. The preventative tools that used to be limited to doing X, Y, and Z are now actually quite a bit more sophisticated than they were. And so the thing that causes strife in the MSP community particularly, or any group in IT, is the: oh, I have to tune this for it to really be valuable. There's that disconnect between putting the tool in and going, oh yeah, we've got the XDR, MDR, whatever it is, so we're ready for when the bad thing happens, and being like, I know what it was because we have the tool in place to tell us. And I think that's unfortunate, right? But the tools themselves have gotten more sophisticated, and I hate to go down the tool rabbit hole, but man, I remember when we were afraid to let an RMM tool do auto-patching for us. Right? So, I mean, we've come a long way, and I would argue even the bad guys are starting to have a little bit of friction when they target an organization that's done just a little bit of work. It used to feel like you could be attacked through your browser without any effort because there was no encryption in the browser. And if you look around today, most people are hesitant to continue to a website if it says, hey, this site's not secure.
A
Exactly. And so how is AI being applied positively to these tools to enable that?
C
That's not a fair question, I think. I don't think I should answer that question. Well, how would you say it? I mean, here's my problem with answering that: what is AI?
A
Yeah.
C
And what are the things that have always been there? In some cases, companies that have been around three, five years always had, quote, AI functionality in the way their tool was built, you know, the large language models, the way they built them out. But now we want to know, so the product name now says AI in the title. I mean, it used to say Office 365. Now we're supposed to say Microsoft 365 Copilot. That's where I have a problem when you ask, how is AI being used?
A
Let's take AI out of the question. Let's ask it differently. What we've seen in healthcare over the last five years or so has been a shift, from a framework perspective, from identify, protect, detect to respond, recover. Right? So more effort has been on: it's not a matter of if, it's a matter of when, so let's be ready to respond and recover. What I heard you say, I think, was: hey, you know what? We should also go back now to look at identify, protect, detect, because the tools have gotten better.
C
Yeah.
A
So what is it about the tools that have gotten better, modulo AI, that gives you the confidence that maybe we should balance back toward the front end of the problem?
C
You know, I'll give you one on the AI side that I think is extremely powerful. In fact, I just did an episode of MSP 1337 where I interviewed a CISO who does doctoral research around AI, talking about governance of AI. You can take an interesting fork in the road here. One path is the user experience of me engaging with an AI tool, an agent or some sort of prompt that allows me to have a conversation, through either typing or voice, in the context of tools pertaining to the security of an infrastructure, thinking about old-school labels like the firewall, the cloud, the virtual servers, and stuff like that. I think one of the things that's really changed in recent years is how AI can help take large volumes of information. Just as an example, if I'm looking at a threat intelligence feed, unless I have a tool to decipher what's in that feed, it's all gobbledygook, right? I'm not going, oh look, I think that's malware for Dropbox, and I can just see it right there. Maybe if you've looked at the same patterns over and over again, sure, I suppose you could detect some of that. But if you think about how fast it's coming out, too, I don't know that the human eye, or even the way we operate, can keep up with that kind of information. Which goes back to: we're going to worry about respond and recover, because we know that at some point an alarm bell is going to go off and say, this is bad and you need to do something about it. With the sophistication the tools have gotten, and the ability to leverage AI to take our volumes of information and shrink them down to something usable in a reasonable amount of time, you can get into: this might be a false positive, but we're going to go ahead and block it anyway. And then you never experience it on the right of incident, because it never got to become an incident. Right?
I think that's the difference I believe is starting to really happen: when tools are properly configured and the human element is properly trained, you're starting to see that we are more resilient on the left side than we ever have been in the past.
A
Yeah, no, that's a good point. And you mentioned threat modeling at some level with AI. Let's talk about the negative of AI, right? The threat, the risk of AI. It completely changes the traditional model regarding threats. If you assume the traditional threat model focuses on infrastructure and application-layer risk, the AI threat model extends to cover more behavioral vulnerabilities and adversarial inputs. The attack surface is changing in a way that's much more data-oriented and much more integrity-oriented within the CIA construct, right, than it is confidentiality- and availability-oriented. And I'd just love your thoughts on what you're seeing there from that perspective.
C
You know, that's a tough one. I think that in reality it's not new. What AI does is enhance or accelerate the ability to do something, good or bad. Right? That would be how I see it. So forget the triad for a minute. I think everything now is operating at light speed. I'll even give an example with emails. You're not supposed to click on emails with links in them, right? Don't click on the link. If I could beat somebody over the head with this: don't click on the link. Well, at the same time it's like, did you get my email that had the link for the scheduling thing that you didn't click on? Wait, which one am I supposed to do? If we go back in time, we know that in order for us to be successful in using the Internet, we are clicking on things, or it becomes a useless resource. Right? How do you navigate that? That being said, when I think about the sophistication of emails, it's not to say that phishing simulation or security awareness training aren't important things to do. But the reality is, with the sophistication of the threat actors and their usage of AI, the bad email I'm being presented with should be a pretty darn good-looking email, almost to the point of: it's so good, I know that Ed would never send this to me, right? Because Ed always puts typos at the end of his email, and this has no typos. That kind of thing. I think that's a good example of showing that because of AI, we can impersonate people's voices. So now, thinking about integrity: what about anybody using an Echo? I don't want to use the A-word, because I don't want to set anybody's little device off.
But when Amazon made the change that it did to the voice, when it added in that AI intelligence level, it went from having sort of this metallic, not-very-warm sound to it to: you're literally talking to somebody in the room with you, and you look around and they're not there. That's impersonating a voice. Heaven forbid someone decided to drop in a voice that's someone you actually know.
A
Yeah, I think it's happening. I mean, we see it with deepfakes that are getting hired, right? That's a real use case where people are impersonating real people, physically, through a virtual setting, taking jobs, getting hired, creating a group of 15, 20 people that are basically keeping those jobs alive and then sending information outside of the US. It's been documented; it's real. And it's all done through that deepfake capability. And you're right, it's gotten so much better. Our ability to perceive some type of sensory inconsistency is slowly going away, slowly dissipating.
C
Well, think about facial recognition. How do I know if it's accurately recognizing said face as being the face that's actually there? I mean, what's the movie with Tom Cruise? Minority Report. Minority Report is the one that comes to mind, where they're messing with the computers. You see him walking, and you and I know that it's Tom Cruise, right? But their cameras are telling them that Tom Cruise is in all these different locations at the same time because of how they're manipulating the software. And I'm like, this is where we are. We are here. And we don't even need that level of technology anymore. We can just plant that face as being somebody else everywhere we put it.
A
Yeah. So how does that change compliance? Because I feel like the whole world of compliance is being dragged along. What do we need to do as compliance professionals to get ahead of this?
C
So, you think back when the Privacy Rule came into play back in the 90s, it was very logical and thought through. Right? We looked at security in the vein of: if it's sensitive, it should be in a locked room, and we should restrict who has access to those things. Then you fast forward, and the Security Rule came along, largely to address, sadly, the same things that had now converted into a different format. What used to be me using an analog phone to leave a voicemail, now I'm on a VoIP system, and I have a new requirement placed on me. So I'm like, well, let's just go back to analog phones so we don't have to deal with this. Right? So, compliance. I think the interesting thing is that compliance isn't security, and security isn't necessarily compliance. One can hope that if you're complying with something, it's showing evidence of how you're addressing a best practice or a standard. And we know that, good, bad, or otherwise, the reason frameworks exist, the reason the Omnibus Rule came out, is that we were failing with consistency at protecting the things we've been entrusted with. And so I would almost argue that in all cases where frameworks exist, they were never created to be punitive for the people being asked or told to comply. And I think that's also where the problem lies. Those being held accountable, hey, you must comply with fill-in-the-blank, say: well, that's a checkbox. I did X, therefore I'm compliant, therefore I can move on and worry about other things. And I think that misses the whole point as to why these exist. These are baselines; these are a starting point. These are examples of the things you are dealing with, with a path to ensure the integrity of what you're protecting. And I think that's where we have a problem.
Compliance has to evolve in such a way that says tomorrow I'm going to be evaluated on this again and I need to be able to show either a reason why it hasn't improved or improve it.
A
Yeah, that's a great point. I learned about 20 years ago that no company wants to be the most compliant. They're going to do just enough to get by, which is a checkbox, as you said, and that includes adoption of technology. If a technology checks the box for the regulation but doesn't actually solve the problem, they're going to choose it over what actually solves the problem.
C
Well, you remember the firewall rule? That was one of my favorites. Like, yeah, we've got a firewall, it sits over there, see the blinky lights? It's plugged into the wall. I have a firewall, therefore I am compliant with, I think it was at least one or two of the...
A
But like SB 1386 back in 2002.
C
There's another one.
A
Yeah, 2002, was it 2003?
C
It was close. I don't remember the exact date.
A
I think it was July 22, 2003, I think. But I'd have to go back, because my birthday is the day after; that's why I remember. But I remember thinking, wow, that's going to be great. We had a data security company at the time, and we protected data. We wrapped it with policy and controls, so imagine the data traveling with that; it's pretty much impenetrable. But what the market did was go to full-disk encryption. That was how it solved SB 1386, which doesn't really solve the problem. It solves it at rest, but doesn't solve transit. Right, exactly.
C
So, you know, that's a great example of what we're seeing happen across the board. Another example along those lines, like your disk encryption, is thinking about CMMC. It's like, well, if we're dealing with CUI, oh, we'll put that in an enclave.
A
That's right.
C
Okay. But you have things inside that enclave that you're not entirely sure how they work, and they're managing or addressing, from a security standpoint, the assets inside this enclave. How do you know when it communicates outside the enclave? What's...
A
We don't have to care about that.
C
Right. I don't have to worry about that. It makes me think of, you know, you've got a castle and you've got a moat, but there's no drawbridge, because it's been 80, 90, 100 years and the drawbridge no longer goes up and down. It's just the bridge.
A
Or worse, there's a drawbridge, but there's a tunnel that goes under the moat that everyone knows about.
C
Well, I mean, there are so many different ways we could describe this challenge. Right? We see it play out even in modern times: I have all the great locks and cameras, but I don't turn the alarm system on, the door is always open, help yourself to the new TV. Right? And I think this is a big problem in the IT space: we love tools. I can honestly tell you there should be a 12-step program for shiny object syndrome. It is a real thing. I would argue anybody on Adderall or coffee...
A
High amounts of caffeine.
C
Yes. What's interesting about it, and I remember when I first realized that I genuinely had a problem with shiny object syndrome, is it was tied to looking at company websites. If the company website was ugly, I didn't care what product they sold; I'd already moved on to another website. I wanted the mirrored, fun-looking icons, even if it obviously took 45 minutes to load. But that's not the point. That's where we are today. And I don't care what the organization is, it could be a hospital system, it could be a small clinic, fill in the blank. There are two sides to this technology shiny object syndrome. One is the things that are fun and convenient: your IoT devices, the speakers in the lobby, the Sonos devices, things like that. And on the other side, you have the things they're using to manage infrastructure: your firewall company, the Entra ID or fill-in-the-blank for SSO, and all these other things. And if you look really closely, most of them are constantly evaluating what new product they're going to use, because they don't know the one they already have has added those features. It's no longer adequate, so they're looking for something else, which introduces its own risk. Right? Because now you've got to configure the net-new product to the same level or better than what you already had.
A
Why are they looking?
C
Because they are not satisfied with okay. And the cost they're spending, there's always got to be a way to save money. So there's got to be something else out there that addresses the same problem for less than they're spending on it today.
A
And why are they just at an okay state?
C
So I would argue that okay should be the focus. When they think about okay at the implementation stage, or they find out there's a product that does a whole lot more from a feature-set standpoint, they're not satisfied with okay. I'm not advocating that one shouldn't have better-than-okay tools. It's more that they're not addressing the things around the tools, largely the human-centric portion of this. And that's why, to me, okay should be all they're worried about from a tool standpoint, because they always are going to have to address the people standpoint. If they're constantly investing in tools, again, I come back to: they're probably not investing in the people.
A
Yeah, no, I think you're right. And that's where I was going with this. Is that really the problem? It's never about the tool, in my opinion. It's always about the people and the process. And if those things aren't harmonized in a way that drives outcomes, right, the right level of outcomes.
C
Sure.
A
Then it's like driving, you know, a high-end BMW from the rear seat. Right?
C
That takes talent.
A
That's right. It's pretty much impossible, I think.
C
Right.
A
But the point is it's the process and the person.
C
Right.
A
The person decided to be in the back, and that's the process they're trying to use to drive it. And it's not working. So, tools aside, because like you said, there are a lot of tools out there, if you're continually not satisfied, you have to look inward, you have to look at the process. And most people don't think about it.
C
Right.
A
Most people have a problem, so they immediately reach for a tool. Then the next thing they do is take the tool and the people and say, all right, let's tear our process down and make the tool do our process. You've failed already. That's why most CRM deployments that ever get started fail within six or nine months. Because people aren't thinking about it as transformation. They're really not. They use the word transformation, but they're not actually driving transformation as a project, which includes more than just the product or tool you're working toward. It's got to include the people, the process, the policies and procedures, and, oftentimes, the politics. Right? Because that can get in the way of a successful deployment.
C
So, yeah, usually the absence of good governance tends to be defined by bureaucracy and politics.
A
Exactly right.
C
So let me ask you a question, because I think that brings up a really interesting point. You mentioned CRM. We've talked about tools, and we've talked about how it's always easier to talk about tools than it is to talk about people. But what I find interesting is that when you talk about the failure at six, ten months after adopting a technology, I don't know about you, but when I was evaluating tools, it was always like: I've looked at 13, and I think mine should have everything that was in all 13 of those tools. And interestingly enough, if I had approached it from the standpoint of what I need a CRM to do for my company, and then started evaluating or looking for tools that meet that criteria, I probably would have spent a lot less time being disappointed in the tool I adopted, because I would have had realistic expectations. And I see that play out almost all the time. Like, hey, we adopted fill-in-the-blank tool. It doesn't do this, it doesn't do this, it doesn't do this. Yeah, but we can build that in, we can customize it. But the average tool that we would otherwise have been successful with had all the things we needed without customization.
A
Yeah.
C
Off the shelf is hard for so many companies. Like, we found the perfect tool. What was perfect about it? Oh, it was cheap and fast. Yeah, but there's a third one that I know you dropped in your triad, and that was quality. Right?
A
You only get two Boyle's Law.
C
Right. I've never seen high quality and cheap combined either.
A
Right? Yeah. So, regardless of the technology, whether it's AI or the next great thing, quantum, whatever, we're still going to be plagued by this issue of the holy trinity of technology: people, process, and product, or technology. And without those things being harmonized in some way, most projects are doomed to failure or replacement. We're not getting what we expected. Well, what did you expect? Well, you know, we wanted it to work. Well, what does that mean?
C
For it to work without my involvement, right? Yeah. What did your project contingency plan look like? Well, the contingency was that I didn't have to show up, because the project just got done.
A
That's right, exactly.
C
You know, it's interesting. I like to call it the three P's, though people will dispute that it should be T for tools or technology, and I get it, because product isn't necessarily... anyway, we could go down a rabbit hole on that. But my thought process is that so many times we have the right tools and we have the right people, but we don't spend the time to get it implemented right. I had this happen not so long ago. I inherited what had been done; I came in when phase three of a project should have happened. They had put a million dollars of Cisco switching gear all over this campus, and I came in at phase three, when they were supposed to do VLAN reduction and elimination and make sure all of the Cisco switch stacks had the latest firmware. I can tell you that was just shy of a decade ago, and 80% of those switches still don't have any newer firmware than they did when I was trying to bring that project back to life. And there was now no budget for it, because, well, the previous project manager said, yeah, we can do phase three on our own, we'll save a ton of money.
A
Exactly.
C
Saved a ton of money. Oh yeah. Because we didn't do anything.
A
Yeah, exactly. Well, and that's the thing: transformation requires leadership. Failure is a leadership concern. And if you're bringing technology in, regardless of its size, quite frankly, I would posit you're transforming something, or you're not thinking about it correctly. It can't be incremental; it's technology. And so if you're actually going to spend a couple hundred thousand dollars or more on a technology and a project, and the implementation for most of these can be 5x what the license cost is, then you're transforming something. So people...
C
Because you could transform, and it doesn't look very good when you're done.
A
Transforming well is a huge consideration. And most people don't like change. That's why I say it's a leadership concern. Because if you ask people, they're like, well, am I going to be able to hire more people? No, no, actually we're going to be able to replace people with this. Oh, I don't want that.
C
That is the underlying dream, right? Now AI is replacing the white-collar world.
A
Scary as hell. And we're not thinking about it correctly, and we're not being transparent about it. That's why what's going to end up happening is that most of these projects are going to be sabotaged, because people aren't going to let them work. They're not going to let them be deployed; they're not going to let them be adopted in the way that's needed for success. It happens all the time, regardless of AI.
C
Sure.
A
I think it's going to be demonstrably worse because AI is such a threat and people aren't stupid.
C
Well, I think it's interesting that we're seeing this through the lens of: AI replaces all of these jobs. I'm not saying it's not going to. This definitely could be what people are calling the next industrial revolution. But if I look closely at the way the majority of AI is being consumed, that isn't going to happen anytime soon. The other piece that's concerning, and I think this is one people have to look a lot closer at, is: can we financially sustain AI doing those things? Because as the data centers get built out and as the processing continues to climb for everybody consuming tokens for the next caricature of their friends, of course it's expensive.
A
There's a supply and demand issue that's got to be worked out. And to your point about the time it's going to take, there's a drag on the system right now, and it's called people. In some ways that's not a bad thing, because going too fast actually has consequences. But what will end up happening is that at some point there are going to be those among us who are fearless and who figure it out autonomously, I'll just say worst-case scenario, autonomously, with very few people, and then put the larger concerns out of business. So basically that's what's going to happen. It's not going to be like everyone tomorrow cuts 40% of their staff and replaces it with AI. I don't think that's going to happen. I think things are going to happen the way they're happening and people are going to get comfortable. Like you said: oh, it's not going to happen right away, blah, blah.
C
Right.
A
But what's going to happen is the insidious part. There are going to be winners and losers, just like there always are.
C
Sure.
A
And people are going to wake up and go, holy cow, I don't have a business. What just happened? And that's what I worry about.
C
You know, we know that there's going to be a recall on a lot of programming. Right? A lot of software development is going to get recalled, because it didn't get the security due diligence that it needed.
A
Yeah.
C
But I think about things like websites and some of the other lower-end stuff. There are quite a few jobs out there building people websites, and today I can go use a prompt tool or a vibe-coding tool and build a website that's pretty shiny. Shiny enough for people to go and consume from that website. So yeah, I think you're right. I think there's going to be a lot of change happening in a very short period of time. What's the risk? I think the risk for anybody out there, from healthcare to, fill in the blank, is going to be to ignore it, to not think about what I could do to recreate myself for the future. Yeah, exactly. The monkeys are here.
A
Yeah, yeah. All right. We covered a lot of ground.
C
Let's talk about who?
A
Personally, if you weren't doing this job, what would you be doing? What are you most passionate about?
C
Geez, you know, if I wasn't doing this job... what I'm most passionate about is kind of still the same thing. I don't know that I see what I do as a job like I traditionally would have. But if I was doing this not as a job, I would probably work more with the nonprofit sector, on this very same category: cyber hygiene, the little things. It sounds trivial, but you look at some of these organizations, they're nonprofit, they raise money for the homeless or for water projects or fill in the blank, and you go look at their tech structure and you're like, you are just setting yourself up to never be trusted again. Because when somebody's money got taken, that was not your intent, and it didn't even come to your organization, but they used your name to capture that dollar. And I've seen it happen enough times that that's where my passion would be outside of work: helping the smaller businesses, the startups trying to build a new business, recognize the areas that don't cost a lot, or anything at all, to still improve their cyber resilience.
A
Okay, when you're not working, what do you like to do?
C
When I'm not working, I like Legos in the back.
A
So.
C
So Legos. No. So, I mean, the podcast is kind of fun.
A
Yeah.
C
I would say probably running front of house, like sound. I like helping people dial in their studios, helping them with, you know, that tech infrastructure. I don't know how many times I've had someone say, this device doesn't work, I need a new laptop, I need new hardware. And just the way my brain works, a few minutes in, I'm like, oh yeah, reset this and you're good to go. They're like, I've been on forums for two weeks.
A
Is there actually a good service that a random consumer, like, if something went down in my house, who could my wife call if I'm assuming I'm out there to get it fixed?
C
Like, I guess it depends on what it is. Yeah, well, who would we know to call? It's a computer.
A
It's a computer thing. It doesn't matter.
C
Like, so that's a tough one. So I steer clear of the residential tech space just from, you know, bad experiences. You know, some guy who wants to show me his possum collection in his garage. No, but that's.
A
That's.
C
Yeah, I guess that's my point.
A
There's really. That's an opportunity for someone out there, like, if you solve that problem, like, you know, because the.
C
What's that tech service? I forget. Something Squad.
A
Yeah, the Geek Squad. The Geek Squad. Not good.
C
No. I think consumerism of technology is. It's broken. Get a new one. And I think largely that's fair from the cost. From an equipment standpoint, the cost has come so far down.
A
Yeah.
C
But when you get into things like tracing wires, that's always kind of fun.
A
Or just Wi-Fi. Like, my Wi-Fi is awful. Like, how do I fix my Wi-Fi? My cell coverage.
C
Cell coverage is a little bit different because they have control over that. But on the Wi-Fi coverage, usually you're dealing with a product that has a single...
A
Yeah.
C
And regardless of placement, you know, refrigerators and counters are bad, but I would say that Eero and some of the others have kind of solved that problem. Yeah, depends on how fast you want your... Like, I had someone ask me the other day, so I got all this new stuff, I'm doing two-and-a-half-gig throughput, do you think that's good enough for me to have my own media server? And I'm like, dude, I think you could have done it with 100 meg and probably never noticed. But, hey, good for you. Two and a half gigs sounds great. That sounds super fast.
A
That's funny. That's funny. All right. Biggest mistake you've made in your career.
C
The biggest mistake I made in my career? I've probably made more than one. I think the biggest mistake is not remembering that I didn't get here by myself, that there were a lot of people that kept me propped up to get to where I am today. And I think early on I may have gotten further ahead than I probably should have, and I think that really bit me pretty good. It was definitely a humbling experience. It set me back at least five years on my career journey. But nonetheless, we learn; we either fail forward or we fail backward. And I would say that, yes, my failure was not recognizing that I needed help, you know, that I needed to recognize that I needed to slow down. You know, my kids grew up largely with me not being present for a lot of things when they were little. And can I go back and fix it?
A
No.
C
But I can be intentional in the time I have with them now. And so that was part of the reason why I did what I did before COVID: I got to be here, present for more of what they all remember anyways, right? Like, they don't remember that I wasn't there for all the diaper changes.
A
Oh, no, they do. They do. My girls tell me; they remind me. But I could be a good grandparent,
C
you know. And I think those are some of the things that were really big mistakes. I should have recognized sooner that I wasn't meant to be a solopreneur or an entrepreneur in the sense that I need to go start a new business or run my own MSP. It's funny, this is actually funny. My MSP, we sold in 2016. My senior director of services now runs the NOC for another MSP, so he definitely moved into a better fit. My former business partner is now an inside sales engineer for a very, very large security company. I think his salary is more than what we collectively made as an MSP. My director of operations is now the GM of a fairly large MSP. So when I look back, it's like we should have stopped being an MSP as soon as we started being an MSP, because none of us had the mental capacity at the time for running an MSP. And yet fast forward, and I think all of us have matured in such a way that if we were starting an MSP today, boy, it would be a completely different company, that's for sure.
A
Well, well,
C
I will never. I should not be recorded when I say this, but I don't believe that I will ever go back to running an MSP.
A
No. Okay, I got this recorded, so.
C
That's right. I don't believe.
A
I heard you. I heard you. I got it. All right, so you see your 20-year-old self running around the halls of somewhere. What would you tell him? What advice would you give him?
C
Geez. Focus. Read more books. Be intentional. Our time is short. And I think if I was going back to talk to my 20-year-old self: take advantage of the opportunities, don't let them pass you by. Take risks, be brave. Life is short. And I think that my 20-year-old self was all routine. I had the job, I was doing the job. I stayed there for way too long.
A
Yeah. All right, you're on an island. What five records would you bring with you? Geez, these are the fun questions now.
C
Megadeth, Symphony of Destruction.
A
Oh, nice. Megadeth.
C
I would probably bring any album by Pearl Jam.
A
Okay, Ten.
C
That would be a good one. Yeah, Filter. Anything by Filter.
A
Nice Filter. Wow. That's a new one.
C
So 10 albums, that's like... Tool.
A
Are you a Tool fan?
C
Those were good. So then, if I was changing genres a little bit, yeah, you know, I'd go like maybe some Run-DMC. Most of the crossover stuff that they did with, like, Aerosmith and some of the others. Anything by Wu-Tang.
A
Oh, Wu-Tang. Yeah.
C
Like one of my favorite songs of all time is probably Gravel Pit.
A
So try that at home, kids.
C
That's right, that's right. I think at the end of the day I like music across so many different genres. If I had 10 albums that had some different... yeah, I'd be pretty happy.
A
If I said five, but that's okay. Only five, but if they were double albums...
C
Well, a double Pearl Jam album.
A
Oh, 10. I said 10. Yes, you're right.
C
I would say Jimmy Buffett, the Beaches, Bars, Ballads and Boats, would be apt.
A
Right.
C
You're on an island.
A
I mean, that's for sure. I think that's the greatest hits album, though, isn't it?
C
It was redone, though, by Jimmy Buffett. So, yes, he did an entire tour. I don't remember if I got them in the right order. The Beaches, Bars, Boats.
A
Yeah.
C
But. Yeah, and I remember him setting up shop. I think it was in Florida somewhere.
A
Yeah.
C
Because my roommate in college went and basically spent the whole summer listening to Jimmy Buffett every night.
A
Yeah. Son of a Sailor. Great song.
C
Oh, my word. Son of a Sailor. That's a great one. Yeah. Cheeseburger in Paradise. Cheeseburger in Paradise.
A
All right, Any last advice for someone who wants to break into our world coming out of school?
C
Yes. Don't do it. No, I think there's one consideration.
A
MSP.
C
Well, yeah. I mean, I think anybody that has gone to school and is actually considering anything in the cybersecurity realm or risk management space should remember that most of what we're talking about is our human nature of labeling things. I remember when the ITSP space included firewall management in their service offerings, but they did not break it out into a category called cybersecurity. So I would say be open to a job in business and operations; those are going to lend themselves to exposure to what we've talked about today: business risk, what are the things from a cybersecurity standpoint that we should be doing, how do we avoid things on the right side of boom when we can proactively manage on the left. And I think it's going to come down to this: if you are passionate about working with people and you have the desire to do due diligence around people, process, that kind of thing, really just be open to the job, and not be like, oh, I want to be a threat analyst. You're probably going to be really bored. Just going to say that right now. You're probably going to be really bored, but we need people like that, right? So I think it's just being willing to be vulnerable for the opportunity. Because if you go down this path of, like, I'm only applying for jobs that are, you know, level-two threat analyst, or, you know, fill in the blank, you're probably going to be disappointed.
A
Excellent advice. Chris Johnson, thank you so much for your time today. This is Ed Gaudet from the Risk Never Sleeps podcast. If you're on the front lines protecting patient safety and delivering patient care, remember to stay vigilant, because risk never sleeps.
B
Thanks for listening to Risk Never Sleeps. For the show notes, resources, and more information on how to transform the protection of patient safety, visit us at Censinet.com. That's C E N S I N E T dot com. I'm your host, Ed Gaudet. And until next time, stay vigilant, because risk never sleeps.
Title: Compliance Is Not a Checkbox, It’s a Trust Strategy
Host: Ed Gaudet
Guest: Chris Johnson, Senior Director of Cybersecurity Compliance Programs, GTIA
Date: February 26, 2026
In this engaging episode, Ed Gaudet welcomes Chris Johnson of GTIA for an in-depth conversation about the evolving nature of cybersecurity compliance in healthcare and IT. Chris argues that compliance should not be viewed as a mere checkbox activity but as an essential component of building trust and organizational resilience. The discussion covers the impact of AI on cybersecurity, the pitfalls of tool-centric thinking, the enduring importance of human factors and process, and career advice for aspiring professionals.
“Compliance isn’t security. Security isn’t necessarily compliance. One can hope that if you're complying with something, it's hopefully showing evidence… but these [frameworks] are baselines, a starting point.”
— Chris Johnson (15:00)
“…the sophistication of tools have gotten and [with] the ability to leverage AI… we are more resilient on the left side [prevention] than we ever have been in the past.”
— Chris Johnson (08:10)
“The email that I’m being presented with… with sophistication of the threat actors and AI… they can impersonate people's voices.”
— Chris Johnson (11:00)
“We had a data security company… we wrapped [data] with policy and controls… but the market did full-disk encryption. It solves at rest, not in transit.”
— Ed Gaudet (17:10)
“It's never about the tool… it's always about the people and the process. And if those things aren't harmonized… you failed already.”
— Ed Gaudet (21:12)
“Transformation requires leadership. Failure is a leadership concern.”
— Ed Gaudet (26:44)
"The risk… is to ignore it, to not think about what could I do to recreate myself for the future."
— Chris Johnson (30:54)
On Compliance as Trust:
“Those good, bad or otherwise, the reason frameworks exist, the reason the omnibus rule came out—they exist because we were failing with consistency at protecting the things that we have been entrusted with.”
— Chris Johnson (15:00)
On Tool Obsession:
“There should be a 12-step program for shiny object syndrome. It is a real thing.”
— Chris Johnson (18:33)
On Failure to Harmonize:
“Most CRM systems fail within six months or nine months because people aren't thinking about it from transformation.”
— Ed Gaudet (22:21)
On Career Growth:
“Biggest mistake is not remembering that I didn’t get here by myself…there were a lot of people that kept me propped up to get where I am today.”
— Chris Johnson (34:47)
“Be open to a job in business and operations—business risk, process, due diligence. If you go down this path of… ‘level two threat analyst’… you’re probably going to be disappointed.”
— Chris Johnson (40:09)
Chris Johnson closes with practical wisdom: Focus on the basics, invest in people, and view compliance as a strategic, trust-building activity—not a checkbox. Real transformation comes through harmonizing people, process, and technology, guided by leadership and a willingness to adapt.
For more resources and ways to increase your risk awareness and patient safety, visit:
www.censinet.com