
Loading summary
A
Welcome to the To the Point Cybersecurity Podcast. Each week, join Eric Trexler and Rachel Lyon to explore the latest in global cybersecurity news, trending topics and industry transformation initiatives impacting governments, enterprises and our way of life. Now, let's get to the point.
B
Hello, everyone. Welcome to this week's episode of the To the Point podcast. I'm Rachel Lyon, here with my co-host, Eric Trexler. In person.
C
We're in person in Herndon, Virginia. This week, Rachel, we had some technical issues, but we struggled through those and I think we're good to go. And like our best shows, we're looking at each other face to face. It's great.
B
Why do you have good lighting on you though, and not me? Like, why.
C
I will turn the lighting right now,
A
but you know what I mean?
B
I just feel a little left out. Oh, I'm so excited for today's guest. Like, we were talking right before we got on here, man, we've got some good stuff to talk through today. Please welcome to the podcast Rob McDonald. He's senior vice president for platform at Virtru. Welcome, Rob.
A
Yeah, thanks, Eric. Thanks, Rachel. So excited to chat with you today. And yeah, I totally agree. That conversation leading up, juicy topics, fun stuff, and look forward to engaging with you and your audience today.
C
Right. Like, how do we get the microphones working? Oh, they're working. Hold on. Now we have headset issues on Rachel's new laptop because she broke her screen.
A
You know, if those didn't happen, this would not be as much fun, I promise.
B
Absolutely not. And I will say, I didn't break the screen because I was watching TikTok on a work laptop. I'm just gonna put that out there. But TikTok is my favorite topic right now, as you know.
C
So I think it was the last show we recorded, you said you had wasted a significant amount of time on
B
TikTok trying to figure it out and
C
you were swearing it off.
B
Yes, but.
C
But you're back on the TikTok.
B
But as Rob so eloquently explained to me before we got on, with the dopamine, serotonin, all that stuff, like, it's beating me up. The dog and the kitten videos, and literally I'll lose an hour, hour and a half, and I feel empty inside when I turn it off. I feel like it's feeding me a drug. Like, that's bad, right?
A
Yeah, well, I mean, I don't know if it's bad. Like, people get value from it. What people don't understand is just how sophisticated the correlation of all the data points you create is at understanding you. TikTok is not doing anything different than the other big tech companies. They just happen to be a lot more controversial based on where that data goes. But yeah, it is a slippery slope in terms of what you give up. But people do get value out of it, because they know you. They have turned something into a value for you, and it's a slippery slope.
C
Rob, why don't I know Rachel then? I mean, if TikTok knows her so well, how do I get to know my co-host? I can't relate at all. So I was 12, and, Rob, you can analyze me here. I was 12, and I won't go into a whole lot of specifics, but somebody gave me beer and I was losing a little control. So what do you do when you're 12 and you're a little inebriated? You drink coffee. I burned my mouth like you couldn't believe. It was at work. It was an industrial coffee machine. I took one huge gulp. Burned my lips, burned my tongue, burned my whole mouth. I haven't had coffee since. I swore it off for life.
B
That makes sense.
C
Why can't Rachel do that with TikTok? Are you saying that Folgers, or whatever the brand was, hasn't figured me out, but TikTok's figured out Rachel?
A
Actually, Eric, I think, oddly enough, you nailed it. First off, I do think two wrongs made a right there, right? You drink a beer, then you burn yourself. So that worked out for you. But what you said there was, you took that coffee, it burnt your mouth, and you're like, I'm not touching that again. That pain is still being subsidized heavily in industry, right? So you're getting all this value. Your credit card gets stolen, you immediately get refunded. There is still this subsidy that takes some of the bite out of that pain. It is a tsunami, however, where people are starting to realize what equities they're giving up and what value they're giving up. But the pain is still, I would say, manageable through those subsidies. So, you know, TBD on when that threshold tips. But you want to be prepared for that tip. You don't want to get your mouth burnt with hot coffee when that thing tips, right?
B
Absolutely.
C
So help me understand that. And I don't do coffee. Coffee ice cream, coffee with chocolate, I don't do it. I swore coffee off for life. Not a problem for me. I'm good.
B
Rachel swears by frosted coffee. I'm just saying. Shameless plug for the frosted coffee at Chick-fil-A. I just gotta tell you.
A
Sounds delicious.
B
My new obsession.
A
I'm done.
C
There's no dopamine firing and making me want coffee ever. I'm telling you, I will probably never have coffee in my life. You swore off TikTok last week.
B
I know, but it got me.
C
And you're back already for the crazy dog videos?
A
Yes.
C
So obviously no pain, Rob, is what you're saying. Like, Rachel's pain would be the waste of time she's spending there, the degraded work performance or podcast prep, I don't know. But what you're saying is she's not recognizing the pain, or the benefits are probably overcoming any pain that she would recognize.
A
Yeah. I mean, as a human, right, you're adapted. Your wetware is adapted to say, pain, don't do that; reward, do that. When you see these breach incidents and these data exfiltration events, they're massive numbers with no names on them except for a corporation, and you're like, I don't understand that. How does that impact me? However, if you talk to someone that has had that data exploited cross-modality, and they have truly suffered from that, from a reputational perspective or a financial perspective, I guarantee their pain level is different. Especially if that pain for them has resulted in an inability to get some of that back. And by the way, some things you can't get back, right? Some reputational things are really hard to get back. It's like a company losing a brand. How long does it take for you to get a brand back? Maybe never, right?
C
Yeah. Okay, so let me make sure that I understand this. I see Rachel's on board here. What you're basically saying is, if I have a credit card stolen and it doesn't cost me much of anything, there's very low pain. I don't necessarily take additional proactive measures in the future to better enhance the security around my credit card, or to not lose it, or something like that. Now, if my identity is stolen and somebody files a fraudulent tax return with the IRS as a result, it's years and years and years of pain. I may be more inclined to protect that identity going forward because I felt that pain. It was traumatic, it was impactful. There was some meat behind it. Is that kind of what you're describing?
A
Yeah, and by the way, that's exactly right. And I think this pattern exists everywhere, right? Like at the corporate level.
C
I want to take it to a corporate cybersecurity level now.
A
Yep.
C
So I know you were a CIO in the health care space, I believe, before your current role, right? So if you had a ransomware attack that took down the business, and let's say it was a wiper that wiped data, you're going to be more inclined, in your next role or in that role, post recovery, to protect the organization going forward. You're going to be more inclined to protect against any kind of ransomware attack or anything that would allow things like that to come in. You're sensitized to it.
A
Yeah, and let's separate the parties for a moment too, right? Because the technical audience is kind of already sensitized to it. Their mission is aligned and oriented around serving the organization, making sure that the organization is empowered to do what they need to do with the technology. Technology is there to enable the business. Then there's the non-technical audience. I hate to put it as just technical versus non-technical, that's not fair. Let's talk about hospitals for a second. The doctors there, the nurses, mostly the nurses. Yeah, we'll raise the nurses up a little bit. They do the majority of the work in these hospitals. This is the reality. They're there to take care of the patients, right? So technology is a means to do that. Anything that gets in the way is no good. So when you talk to that audience, and you're a CIO or a CISO, your job is to translate risk at any given moment in time. Well, there are 30 other things competing. I'm not saying anything that anyone doesn't know, but until there is no risk...
C
Their priority is that gunshot wound that just came in through the emergency room doors. For them, it's, how do we keep them alive? They're not thinking about cybersecurity, any kind of ransomware attack, any kind of phishing emails. They weren't.
A
They weren't. And what you just said is important, because ransomware in particular has resulted in an inability to take care of patients. That has elevated the position of cyber in these organizations. It has not changed the pressure on the CISOs, or the unnatural, unfair, ridiculous types of leverage being placed on them, but it has given them some opportunity to further those missions. Because now there is clear connective tissue with those operating teams: oh yeah, this is gonna get in the way of me taking care of my patient, my number one mission.
C
Yeah. So if we lose access to patient records while we're prepping for surgery, we may not be able to take Sarah into the operating room and operate, because we can't see what we're doing. She may have a life-threatening illness or problem that we've got to deal with. That's a clear way that they, the clinicians, understand how this is linked together, is what you're saying.
A
I think so. And this is also a good opportunity: even more now that they see that, they can also start to understand how directed and lateral attacks can start affecting IoT devices in hospitals, infusion pumps and things of this nature. So it's more than just, I don't know about this patient. It is, I'm going to damage this patient directly, because the device I have trusted is no longer working the way I thought it would work.
B
And we saw this. I mean, was it last year, with the hospital that was infected by ransomware? In Ireland? No, in the US. And it was kind of the first time they had been able to directly link a death, an infant death, because ransomware shut down the systems. They couldn't use the scan to see that the umbilical cord was wrapped around the baby's neck. The woman delivered, and she was like, you know, if I'd known your systems were down, I would have gone to another hospital. And that's heartbreaking.
A
You know, it's frightening. And ultimately it's the type of pain that we're talking about here. You don't want that type of pain to happen in order to necessitate change. But look, at the end of the day, humans are humans, and sometimes that is what is required to necessitate change, to be able to see that happen in real time. And I don't ever want that to happen. But now that things like that have happened, you hope that that position is elevated in the organization. That's right.
C
But back to your example from a few minutes ago. I think if you were working in that hospital and you knew that story, or if you were part of that family who had that traumatic event, you're really sensitized to it.
B
Yes, yes.
C
If you're working in a hospital two states over and you just have patient after patient coming in, you're down nurses. My son's a pediatrician, so I have some level of understanding of what they go through here. You're down nurses. You can't give the proper care to your patients. Are you sensitized enough to that, or is it too remote? And then the follow-up for you, Rob, would be: as a CIO or a CISO, how do you convey that risk, that sensitivity, to somebody who didn't experience it? Because you've got to be proactive.
A
Yeah, look, let me start with that one first. CISOs and CIOs have to be storytellers. We are humans at the end of the day, and we still exchange information. Our children learn through mimicry and listening to stories. We still do the same as humans. You have to be able to take these stories in industry where you can say, hey, see this institution over here? They look just like us. They are us, by the way. They're no different. And let me tell you what happened to them when they deprioritized this initiative. You can't drive change only through fear. But risk is ultimately a quantification of what is likely to happen. And if you have a story, and you have a series of stories that result in a pattern, and that pattern looks like you, then yes, that is a higher likelihood, and your risk quantification has to go up for that particular incident. And then your ability to deliver that story as a CISO is going to be what gives you the most leverage. CISOs and CIOs have to be good storytellers.
B
Yeah, I like that.
C
I mean, as marketers, we love it, right? It's all about the story.
B
It really is. That's how you learn.
C
I agree with you. In a prior life, I had a storytelling class created, and the instructors, the creators, actually took it back to the beginning of the human race. We grew up through storytelling. And you know, we've had a lot of guests on the show who talk about storytelling. That is how we learn and how we communicate.
B
Yeah.
A
Yes.
C
I love that answer. Because you don't want to wait until it happens to you.
A
No, we wouldn't be here as a civilization if we waited for everything. Just like in any business strategy, you can accept small failures along the way, but you can't accept a bunch of catastrophic failures, because then the ship sinks. So your storytelling has to protect you from the catastrophic failures, and you have to learn from the smaller failures. Whether you're steering a cybersecurity team or you're steering a business, it's no different. You're doing the same effective risk quantification and navigation. Going back to my favorite...
B
I love the privacy topic, Rob.
A
Me, too. Me too.
B
I mean, ish.
C
Let's talk. Let's talk privacy. I don't want to get you on TikTok.
A
No, no.
B
But I guess that's where I'm struggling today. One of the things that I am struggling with today, I mean, aside from the fact that until you're personally affected, it's very difficult for people not to feel like, oh, it'll never happen to me. But it will. But then you have this whole other generation, like a TikTok generation, right? Where their whole life, I mean, talk about PII, that's out there. It's on their TikTok. I know where they live. I've got their birthday. I've got their Social Security number. I've got the dog's name. I've got all the things I need, you know, if I were a bad actor. But then you have GDPR and CCPA and all of these
C
other things and regulatory components.
B
Exactly. And so, if you're willingly putting it out there, what's the point?
C
You know?
B
I mean, I guess that's where I struggle, too. I think it's really critical. But then again, the information's probably already out there, right, Rob?
A
Yeah, it is already out there. But what are you doing about it? Let's talk about it for a second, because we have both made progress and also let down newer generations. That's the reality. So we started in this world of blind trust, where you go to these SaaS providers and you say, here's my data in exchange for a service, and you hope for the best. Hope is the common denominator. GDPR is this move where we say, okay, that's not good enough. So you give up your data, you get a service, and you surrender your control to a legal proxy. That's what you're doing. You have no ability to determine comprehensiveness of action. And by the way, we're still not solving the problem that you have no idea what you're consenting to anyway.
B
Right, Right.
A
So this is what I mean by
B
we've let them down, Rob. I mean, just as an example.
A
That's right.
B
Anyway, I was looking at this thing, and they're like, you know, here's our agreement, you need to sign. It was literally 200 pages long.
A
That's.
B
I was like, there's absolutely no way. I'm just gonna sign this, and I just hope it works out. Yeah, exactly.
C
Who doesn't do that?
A
That agreement is designed to protect that company, not you. It is intentionally designed to be confusing so that you just go along. You know, there are these studies, I think this might have been a Kahneman study, or related to one, where if you see 30 options for your 401(k), you just don't pick anything. But if there are two or three, you'll pick one and invest. It's no different. And if we don't think that's done on purpose, we're crazy. It's done on purpose. And legal proxies are not the answer. We need to move to a state where, if you're getting value from a service, great. Have a clear transport medium from the intent of that agreement to your ability to determine whether there's a violation or not, and take control yourself. What we've done as a society is continue to quantify the value of data. But we're not talking about the thing that is actually the tsunami, which is that you have given up your sovereignty over control. That's what you've given up. So, data aside for a moment, the tsunami that's coming is when everybody realizes, I have literally surrendered complete control. Forget about the value of data. And that's the bridge we have to cross. Next, or now? Not next.
C
And I'm reminded we did a podcast with the New York Times reporter Sheera Frenkel, an author.
B
Yep, yep.
C
And she studied Facebook. And Facebook, now Meta, I believe, has a tremendous amount of information on their users, and they were having difficulty understanding how they protected information, how they should protect your information, what they were going to do. And things changed over time pretty significantly. Yeah. But they still retained all of that data, that PII, essentially, on us.
A
Yeah, yeah. You know what's interesting?
C
Even if you do understand it. Which you won't, because you're not going to read the 200 pages, so you don't.
B
Right.
C
How they handle it could change. And then they send you another 200 pages saying, hey, heads up, we made this one minor change. Read the 200 pages. If you have questions, there's no one to go back to. Rob, to you.
A
Yeah. By the way, what you just said, I'm okay with. As a business, you want to change your mind? Great. You know what? As a data owner and a consumer, I want to change my mind too. But when we decide to change our minds, we have to let the parties involved know in a way that they can consume and take action on. So reading that agreement is not the answer. Assessing the organization and putting a legal team on that organization to hold them accountable is not the whole answer. We need technical controls that map to that agreement, so that you have, like, a beacon when they make a change or when that change takes place. That does not exist. And by the way, we just talked about one company. You just talked about Meta. Your data is in hundreds of organizations, and there are institutions that are collecting it at an aggregate level and understand you at a deeper view across all of those data silos. So take the complexity of that one company times all the others. It's probably not linear. It's probably exponential complexity. The chance of you being able to read and understand all that is backwards. You're always going to be behind. You need to get ahead of it with a technical control so that you can see what they're doing with it.
C
Well, if you even have any time. I mean, I can't even manage all of my subscriptions for television anymore, let alone where the data is, where everything is. And I've got to tell you, I keep picking on Facebook because I can. I'm not a Facebook user. I have an account.
A
Sure.
C
I wouldn't know how to scrub my data if I wanted to just close that account out. I mean, there's nothing there. I wouldn't know how to scrub that data if I tried. And I don't know if they would do it.
A
Yeah, you just landed on it. What you can do today is you can go and say, delete my information, I no longer consent. And then you get notifications back that say this will be processed in X period of time, which they've got to meet. And then it says it's done. And you're like, okay, I hope they did a good job. You have no idea, right? That's not appropriate, in my opinion, and this is why: we're beyond the age where there was no technical answer to this. There are technical controls, there are technologies, that can allow you to say, remove my data, revoke my consent, et cetera, and have it just happen inside the environment. We're choosing to ignore that, because it's easier just to accept your data and treat it like it's of low value, because you're the human. They just treat it like a pool of data. It's easier for these companies to do that, and I think it's not appropriate for us to let them take the easier road, because there are technical answers now. There weren't in the beginning.
C
Okay, so there's a technical answer. Let's assume it's relatively cost-effective and easy. As consumers, the three of us, even with this podcast, aren't going to drive a behavior change. Are you talking about a regulatory change? And then you've got to deal with country by country by country. Or something else? What do you recommend?
A
Yeah. I mean, I think if regulation alone was the answer, then we would not have healthcare companies doing bad things with data, because HIPAA has been around forever. So regulation alone can't get it done. The reality is, I don't have a silver bullet answer for you, to be truthful. We're seeing an expansion in awareness around this surrender of sovereignty today. This has been going on for a long time. This is not new. But I think consumers are becoming increasingly aware of the sovereignty they gave up. So there has to be that rising tide of awareness. Has to be. Until consumers effectively indicate that they're going to choose one brand over another because that brand is positioning that value pillar. That, plus regulation and legal protection, is probably what's going to be required. It can't be just one or the other, in my opinion. But that's kind of where I stand on what it's going to take to get there.
C
Where do we start, Rachel? As a prolific TikTok user? We'll just say frequent TikTok user.
B
I mean, where do we start? I feel like it's almost like Sisyphus, though, Rob. They make it so easy to do these things. And to your point earlier, why would I decide to make my life harder? I've seen the benefits of multi-factor authentication, so now I'm like, oh, okay, I can take that extra step. It's kind of inconvenient, but I see the benefits now. Until you start seeing that kind of alternate reward system, it's hard to want to make that change.
C
But you see the benefits because you understand the risk to some level.
A
Yes. Yeah. Oh, geez. Alternate reward systems. That's a topic right there. I mean, people are in different stages of activities and things they get benefits from. So I think there's always going to be benefit from these social platforms. Don't get me wrong, I'm not asking for the end of social platforms. But, you know, you wouldn't go to a grocery store today...
C
Well, I don't use them, so no consequence to me.
A
You know, it's non-participation for me as well. But that's just because I'm part of a very small sample set of privacy advocates. People use what they want to use. But if you go to a grocery store and there's a product that says "made with toxic pesticides," you wouldn't buy it. We didn't think about that a long time ago. Now we do. So there are projects, like the Digital Standard, that help you assess a company. I do think education is critical. Education is a long, long haul. It's going to be a long time before that education takes hold. So what you end up having, typically, is educational change that results in evolution over time, and then you have these periodic black swan kinds of events that accelerate the mission. And I hate to say it, but as a historian of anything related to humanity, that's kind of what motivates change in a lot of ways. And at the point when that pain overcomes the subsidies, that's a black swan event, and I think that's ultimately where people will demand more change. And they're doing it now, to be honest with you. They're doing it now. The awareness around cybersecurity, the awareness around privacy, consent, informed consent, is at an all-time high. And you have to ask yourself why. I think it's because the pain is higher and the awareness is higher.
C
Yeah. Okay, so I'm with you, and I agree on awareness. I don't know what a black swan event looks like in this space. And Rachel hates when I do this, but I'm going to pick on some examples just because I like to.
A
Sure, sure.
C
And I'm sorry, some of these are potentially customers, and we'll just have to deal with that. So take something like Equifax. Right? A lot of people had information stolen.
B
Yeah.
C
And you might say that was a black swan event there.
A
It was a big event for sure. I agree with that.
C
I think you and I might be more in agreement here. Yeah, it was a big event.
A
Yes.
C
I think... When did that happen, Rachel? Two years ago? Three years?
B
Three years ago.
C
Like 2019? 2020? Covid.
B
Yeah.
C
How great is it that we mark everything by Covid? I know. So let's say it was 2019; I don't have the exact date. My data is still with Equifax. It was stolen.
B
But do you get a choice? I guess that's my question. With Equifax, I don't get a choice
C
if you don't get a choice.
B
Right. Which is so bizarre to me. Okay, why does it just go to them?
C
Right. So, Home Depot. Do you ever go to Home Depot? They had an event. Target? They had an event. Do you still shop there?
A
Yeah.
C
Nobody heard you. The answer was no, Rob, but for different reasons. Let's not go into that.
A
Okay.
C
My point being we have awareness. But what is a black swan event, or what I would call a catastrophic.
B
Right.
C
Event that would cause you to change buying behaviors, right?
A
I don't know. Well, I mean, look, how have we not hit it yet?
C
I mean, Sony. Look at that. People still... I mean, I guess people weren't impacted. Let me be very clear: consumers weren't necessarily impacted directly in a negative manner with Sony.
A
This is really interesting. I mean, you guys bring up some great topics. The Equifax one, these are great. These are really tough topics, because you're wading into territory where the question is, can it even change? Because if it does change, what's the economic impact on our country, or the world? So this is where you see the need, the necessity, for industry to overwhelmingly suppress, protect, subsidize, because it is directly attached to some kind of economic outcome for the country. And some of this is just choice. Like in your case, maybe one institution is one of only two or three processors of a certain type of activity. There's just no alternative. That's not great, but some of it is more complicated than that. The country as a whole is better served if there's not a lot of buying-behavior change, and you'll see some unnatural activities to protect that. But there are ultimately these black swan events that just overcome the ability to address that. So if there's an event large enough, or systemic enough, or close enough to the livelihood of humans, think the energy networks and things of this nature, or distribution networks, where people don't get water, where they don't eat. There's no way you can just magically produce spin in the market when water's not showing up in my tap. But there's a longer-term thing that's happening already, which is the exfiltration of intelligence and intellectual property. It's happening right now. This is leaving the country, and it's resulting in an increasing difficulty to compete in some areas. Now, I feel like we are the producers of a great deal of innovation and we're well ahead of that curve. That's of course true. But it's not like this isn't decaying our livelihood already. So we say this black swan event is not here.
But I think a lot of that is simply the fact that we are, I don't want to say covering up, but we're simply trying to make sure people don't get too excitable about it. And there are a lot of agencies and a lot of people with a mission right now that are actively protecting against this intellectual property exfiltration that is harming our economy. So these events are happening. We just simply are good enough at combating the blowback right now that the average consumer doesn't have to feel it. That's the reality.
C
I agree with you. I just did a quick search on major breaches.
A
Yeah.
C
And it's amazing.
A
Yahoo.
C
Right. I talked about Equifax, Marriott.
A
Yep.
C
Right. We had Adult Friend Finder. Let's not forget Ashley Madison. Remember that? People are still using those services.
B
Sure, sure.
C
Facebook, Target, Skip, OPM, MySpace, LinkedIn. You know, we're talking about a lot of social media types of activities. I probably named four or five.
B
Yeah.
C
I don't think people have curtailed, for the most part. I mean, there are a couple. Eric's out there. But even I use LinkedIn. I don't think people have curtailed their usage of social media platforms, nor have they restricted the type of content they put up. And I think at the corporate level, Rob, to your point, we are incredible innovators, creating a lot of new ideas and things.
A
Yeah.
C
I think at the enterprise level, whether government or private sector, it doesn't matter, the focus is on innovation, much more so than on protecting the information, protecting the data we have. And that is not to take anything away from the hundreds of thousands and millions of cyber defenders and professors and scientists and researchers and doctors and everybody else who's working with data. I just don't know. I don't think it's a black swan event. I think it's a catastrophic event. Call it what you will. There's a different level of awareness depending on who you are, where you are, and what you're doing, and as a result, we're doing a mediocre job. I had a professor once who used "sub-marginal," which I thought was the greatest derogatory comment ever. It was about someone's paper; she said the performance was sub-marginal. And I was like, that's a slam. But I think our performance is sub-marginal in the area of understanding the importance of data, understanding risk, and protecting that data, despite overwhelming evidence that people want to steal it, are stealing it, and will continue to steal it.
A
Yeah, look, I mean, humans have this need to constantly produce more, newer, better. Innovation is just in our blood. It cannot be otherwise, and that's how we've gotten to where we are. So let's applaud that for a moment. Right. I'm thankful for antibiotics. I'm thankful for these innovations that allow us to live a better life. So let's keep that up. But with that comes edge behavior. Right. If I'm pushing the frontier on a mission, I'm going to forget about these other things, and I don't think anyone would advocate for us to change that pattern. The truth is, when you have that type of edge innovation, you have a lot of increased surface area. You've got these tech companies engineering at a blistering pace, resulting in adoption by users who are not thinking about that. That results in you just giving up information. That's the equivalent of pouring buckets of water into the boat while you're trying to pour buckets of water out of the boat. The defenders are up there doing their best to fend it off, and now you're just pouring more water in. That's why you see breaches up and to the right. The technical sophistication is increasing, the surface area is increasing, and the amount of information I can collect on my attack surface is just phenomenally more sophisticated. So we can continue to take this approach, as we should. As you pointed out, these defenders, these protectors, these researchers, what they're doing is phenomenal, amazing. And there's a lot of things that we don't know that they're doing, and I'm thankful that they are. That's the reality. Right. Especially inside some of the agencies. But what we're not doing at the same time, I think, is we're still depending on a third party to do right by individuals who will never be aware. If I'm just a non-technical consumer, I never will be.
I don't want to have to have the technical sophistication to understand all of that. Why would I? That's not my job. Right. We're not doing enough to say, when you agree to that EULA or when you agree to that data sharing agreement, okay, now we're going to marry that with a technical control, so that if you need to change your mind later, you can, on your own, without 30 intermediaries. We do still need to cross that chasm, but.
C
And you can rest assured that your data that we are managing and storing is safe with us. It's a lot of trust taking on that responsibility.
B
A lot of trust there.
A
Yeah, it's complete trust. And by the way, humans work by trust, right? At the end of the day you go into your dry cleaner, you give them the most expensive thing that you were handed down three generations, you say please clean this and don't mess it up. That's a tremendous amount of trust. Humans are never going to change. Right.
C
It doesn't always work out so well. Let's not go there.
A
But you get what I'm saying? You still do it, right? You still do it because humans work on a level of trust. I am not advocating that we stop trusting everybody. That's why, you know, the term zero trust, I think, is misinterpreted in a lot of ways. Right. We still have to trust parties, but you need the ability to independently affirm activity and take technical control of it after the fact. Today you do not have those two things. You don't have even one of them.
B
It's a conundrum.
C
Okay, so I'm giving up on the consumer. I think there are too many Rachels out there who'll be on TikTok because they enjoy it.
A
I'm going to keep fighting for them. But that's okay. That's okay.
C
No problem, Rob. But at the enterprise level, at the corporate level.
B
Yes.
C
Businesses, agencies, organizations, how do they do a better job on, let's call it, a data-centric approach to protecting our PII, our data? To me, it seems we've got experts. In your example, we have doctors, nurses, clinicians, you name it, who aren't experts in protecting data, but they have some level of awareness, and we have experts helping them. How, at the enterprise level, do we take that data-centric approach to security and protect the data of our customers and our employees and everybody else?
A
Yeah, I mean, I don't want to say do a better job. I don't want to use that language here, because I don't want to devalue the efforts that are underway, because I do value them. I think there's a couple of things we do really well, but we need to
C
do a better job.
A
We do need to do a better job, I agree. In some categories, though, once that data is within a location, we actually have a lot of tech and a lot of, call it, policy control to protect it. When it's in my container here, I've got it. I've got your data. It's great. Let's describe it this way, right? You're using 200, 300, 400 SaaS applications inside your enterprise. This is the reality. The only common denominator to all those applications is the data. It's the only common denominator. Right. It's the only thing that moves between them. It's the only thing that's valuable within them. And the data is the proxy to the human that's fulfilling whatever business process that application is meant for. There's a connection. That's the value.
C
The value's trapped also.
A
That's exactly right. So what I'm advocating for is that you understand that common denominator and you start down the road of understanding the data itself and protecting the data itself. Whether you're talking about zero trust or whatever other framework, fundamentally, data has only been looked at as a component of the stack, and we have yet to really focus on it as a predominant priority pillar and a first mover, in terms of: okay, if that's the common denominator, can I move my intent and controls to the data itself, so that when it flows, wherever it flows, I can express my control how I need to? The other category is this: to do business, that data has to leave your organization, full stop. There are no alternatives. So if you only extend these legacy concepts, or you only extend zero trust as a means to contain SaaS applications, basically to put firewalls around SaaS applications, great, that works when the data stays in your environment. But your organization needs to participate with vendors, suppliers, customers, and if we can't move the intent and control to the data when it leaves, your answer is: I did my best to this point, good luck. So everybody's doing great, everybody's working really hard, the initiatives are great. All I'm advocating for is that we're in the era now where a data-centric approach and focus should get more elevated priority as the more future-proofed approach to data protection.
C
And do you think it is these days?
A
Increasingly so. Increasingly so. By the way, these data pillars have made their way into NIST standards now for zero trust, and you see some of the government cybersecurity initiatives call out data pillars. So on that telemetry alone, I would say there is some increased awareness about the priority of that. So I think it is getting better. I think we are in earlier days on adoption, because, look, it's easy for tech organizations to say do this, do that. Okay, so let me put my CIO hat on, my CISO hat on. It's a far different beast to then implement. So I would say from a technical maturity perspective, we're farther along. From an awareness perspective, that this is important, we're farther along. Implementation is early days in a lot of organizations, and that's going to take time, because these CIOs, CISOs, organizations have a difficult job. The majority of them are still in the phase of data discovery and mapping. They don't even know where their data is and what it is. So we've got to have empathy about that.
C
Yeah, I think back to the government CDM program. They broke it into four phases. Initially it was: who's on your network, what's on your network, what are they doing on the network. And then there was this fourth phase they really haven't gotten to, twelve-ish years later, which was: how are they interacting with data? How are they protecting and interacting with it and everything else? And you never get there, because you don't know who's there, what's there and what's happening. Right. And it's hard.
A
Yeah, it's really hard. Let me connect the corporate back to the TikTok, because there's connective tissue here. You said you accepted that 200-page agreement, and you're getting value immediately. Right.
C
I want to be very clear here. That was Rachel Lyon.
A
Rachel, Rachel, Sorry. I will make sure to be explicit. I'll be explicit.
C
I do not have an account.
A
If I find one dance video this week, I'm going to call you out, Eric.
C
Okay, back to you, Rob.
A
What a data-centric approach allows you to do is to defer the risk and liability. Because if you protect up front... you just said it yourself, the environment's so complex. You don't know what actor is where, you don't know what data is where. This is the reality. And by the way, in what world does that change? Are we getting simpler or more complex? It's not like we're going to get to a world, at some point, where everybody completely has their mind wrapped around where everything is. We don't live in that world. My junk drawer in the kitchen would confirm that humans are really bad at that. Right? So if you protect the data up front, what you're doing is you're saying: okay, I may not know what hostile territory it's going to operate in, but I'm going to defer liability because I'm protecting it here. I've got audit telemetry around what's happening to it, and I can change my mind if I see a hostile event. That's why the data-centric pillar is getting so much more elevation.
C
Okay. Although if I'm with Equifax and I want to, I can't even do that. If I'm with Marriott and I want to remove my account because of a breach, the breach has to happen first. My data's gone, right? Today, afterwards, as a consumer.
A
Yeah, that's right, today. And that's what I said: we're earlier in the implementation journey, right? So I think awareness is critical, because you've got to know these technical controls exist, and you've got to have some type of environmental pressure, whether that be regulations, consumers, organizational reputation, increased elevation of trust. By the way, what better pillar of your brand than trust? It's what it's built on, right? So I expect to see more organizations differentiate on this. But Marriott, as an example, they're just really, really early in an implementation phase. So you can't take advantage of that yet. You won't get any value from it because they haven't implemented. But they're on a journey, and let's hope we can help them accelerate there soon.
B
Yes, we need to.
C
My information is everywhere. I'm almost.
B
There's so many questions still, Rob. We could keep talking for hours.
A
Yeah, well, I think we have a lot of answers. I think what we've got to do, as an industry, is desperately be more empathetic towards these protectors and defenders because of where they're at in the implementation journey. The burnout is so high, the job satisfaction is low. Look, we're humans. We're not talking about machines, right? At the end of the day, we've got to stop talking about these implementers as though they are some factory that you can crank the dial up on and produce more widgets. It's just not the case. Management needs to do better, leadership needs to do better. Right. That needs to happen, and then we're going to get there. Along the way, we've got a couple of guides. We're seeing increased awareness around data protection and privacy. Those are our guides along this journey. Now we need to support our heroes along that journey, and not hold them to ridiculous KPIs like "no breach ever; if you get a breach, you're fired." These kinds of silly concepts.
B
Right, right, right.
A
You know, if every CEO only had successful businesses, that'd be an interesting world. But I don't know one CEO that hasn't had failures in their past. So why are we treating our CISOs differently? Right. It doesn't make sense.
B
Exactly. I know.
C
Spot on. It is.
A
It is.
B
Well, I, I know we're coming up on time, Rob. I could, I could keep talking about this stuff forever, especially more TikTok. But we'll leave that for another time.
C
I will work on it.
A
Oh, I certainly appreciate the conversation. I had a blast. I hope it was of value to you and your audience.
C
Thank you for joining us.
B
Absolutely.
C
Thank you.
B
Thank you, Rob. And to all of our listeners, thanks for joining us this week for another awesome episode. And as always, Eric, what do we want them to do? We want them to, oh, smash the subscribe button.
C
I thought you were going to say be aware and protect your data. Better.
B
That too, while you're smashing the subscription button. That's right.
A
That's great.
B
Thanks again. Thanks again, Rob. Thanks our listeners. And until next time, everybody be safe.
A
Yeah. Thank you. Thanks for joining us on the To the Point Cybersecurity podcast, brought to you by Forcepoint. For more information and show notes from today's episode, please visit www.forcepoint.com/govpodcast. And don't forget to subscribe and leave a review on Apple Podcasts or Google Podcasts.
Podcast: To The Point - Cybersecurity
Hosts: Rachael Lyon & Eric Trexler
Guest: Rob McDonald, Senior VP, Platform at Virtru
Date: March 3, 2026
In this episode, Rachael Lyon and Eric Trexler sit down with Rob McDonald to explore the complex interplay between human psychology, societal behaviors, and the evolving landscape of data and privacy protection. The conversation moves from individual experiences with platforms like TikTok to the broader implications of data sovereignty, regulatory frameworks, and the technical—and human—challenges of cybersecurity in both consumer and enterprise contexts. Rob provides nuanced insights into why real change in privacy and security behaviors is so challenging, how pain and awareness drive action, and where both organizations and individuals must adapt as data-centric threats continue to grow.
Rob McDonald on surrendering control:
"What we've done as a society has continued to quantify the value of data. But we're not talking about the thing that is actually the tsunami, which is you have given up your sovereignty over control. That's what you've given up." ([16:50])
Eric Trexler on incidents and behavior:
"I think our performance is sub-marginal in the area of understanding the importance of data, understanding risk and protecting that data despite overwhelming, overwhelming evidence that people want to steal it, are stealing it, and will continue to steal it." ([29:11])
Rob McDonald on the need for technical controls:
"What a data centric approach allows you to do is to defer the risk and liability. Because if you protect up front, you may not know what hostile territory it's going to operate in, but I'm going to defer liability because I'm protecting it here." ([39:25])
Empathy for Cybersecurity Professionals:
"We need to desperately be more empathetic towards these protectors and defenders because of where they're at. The implementation journey, the burnout is so high, the job satisfaction is low." ([41:20])
The episode blends pragmatic optimism with a candid acknowledgment of the persistent challenges facing individuals, enterprises, and policymakers. The conversation is friendly, lively, and peppered with real-world analogies (from coffee to dry cleaners) while remaining highly technical at key junctures. Rob balances the technical with the human, repeatedly stressing the need for empathy, education, and realistic implementations—both at the consumer and organizational level.
Key Takeaway: Sustainable change in data privacy and protection demands a combination of regulatory frameworks, rising consumer awareness, data-centric technical controls, and empathetic leadership—none of which can succeed fully in isolation.
For further information, detailed show notes, and related resources, visit Forcepoint's website.