A
Hello everyone, this is Tom Uren. I'm here with the Grugq for another Between Two Nerds episode. G'day, Grugq. How are you?
B
G'day, Tom. Fine, and yourself?
A
I'm well. This week's episode is brought to you by Trail of Bits. Find them at trailofbits.com. I have a very interesting interview with CEO Dan Guido about how AI is really changing Trail of Bits: the way it works, its enterprise, the whole business. Catch that on the podcast channel. So Grugq, this quite fascinating thing happened a couple of weeks ago that made me think about how people don't really care about security. And for most people, that is absolutely the right thing to do.
B
Can I just stop you there for a second? You are aware of the industry that we work in, right?
A
Yeah. So sorry, sorry to break it to everyone. What we do is just pointless most of the time, as a rule.
B
Thanks a lot, Tom.
A
So the incident was, there's this hedge fund called Hunterbrook, and it's styled as Hunterbrook without the vowels, I think, on Twitter or X or whatever it's called these days. And what they do is deep-dive investigative reports into companies that they think are doing something bad or fraudulent. And the latest one was Ubiquiti.
B
Right.
A
And they'll take a short position in the stock, then they'll publish the report and advertise it. So they've got quite a slick media outlet aspect. They'll say, look at these terrible people, they're breaking the law. And the idea is that the stock will then go down, and then they make bank. That's the business model. First of all, the Ubiquiti thing is really interesting because the particular angle is that their equipment is being used by the Russian military, and...
B
Right.
A
So that's, I know, something that you are interested in. And the takeaway, just the amount of networking equipment used on the modern battlefield, was sort of staggering.
B
Yeah, in a way. So for me it was very useful, because I've been trying to argue in my thesis that this war is the most networked war since probably the Peloponnesian War.
A
I think you could well be right there.
B
Yeah, yeah. But seriously, just the degree of networking is beyond what people can imagine, particularly military people, where they have these very limited ideas. And this helped so much, just because of the ubiquity of the penetration of the devices. Like, squads have one. A three-man element in a basement has a router and an antenna and stuff. It's just everywhere.
A
Yeah. So the fraudulent part, or the bad part, the reputationally damaging part for Ubiquiti, is that the entities in Russia are meant to be sanctioned, so their equipment shouldn't be ending up there. But the report spoke of letters from the Russian military saying, you know, thank you very much. Letters of commendation, great work, etc., etc.
B
So there was one brilliant email which said the minimum order is $100,000. And so they reply back, like, how come $100,000? And it comes back: you want to break sanctions? The minimum order is $100,000. Fair enough.
A
It's got to be worth our time if we're going to take that risk. Allegedly. I'll just sprinkle a few of those around; I don't know where to put them, but I'll put them in there. So we had a quick look at Ubiquiti's stock price, and it didn't seem to have shifted much in the wake of that announcement. And that implies that the market doesn't care about sanctions in this case. So this very strongly reminded me of a very similar incident back in 2016, where a hedge fund again took a short position in, I think it was, St. Jude Medical, and they manufactured medical devices. The hedge fund, Muddy Waters I think it was called, had a security company, and they found all these vulnerabilities in St. Jude Medical's equipment. They produced a report and basically argued that they were deficient or negligent or, you know, somehow bad.
B
There were malpractice lawsuits waiting to happen, that this was a sort of looming threat of badness of some sort. Right.
A
Yeah. And so that kind of thing is, I think it's fascinating because it potentially provides companies with an incentive to improve security. And so all the time I'm looking for what is going to get companies to actually improve security.
B
Ransomware.
A
In that case, the stock apparently did drop, but it hasn't become standard practice, so.
B
Well, that's probably because the financial bros were looking at this and, like, yes, it makes money, but is it ethical? And they just went, you know, I couldn't sleep at night knowing that we had done this. It would just sit badly with me. Yeah.
A
And like those medical devices, they may have been vulnerable to something.
B
Right.
A
But, like, I don't think I've ever come across a story where someone's medical device has been diddled with maliciously because of a vulnerability. Like, it's a theoretical attack vector, but in practice there just aren't the motivated, malicious people who do that. Right?
B
It becomes this, like, TV plot of: what if there was a politician with a pacemaker, and some adversary could remotely access it and kill them? And I mean, I think most people would be like, please, go ahead, we hate our politicians. But also...
A
So there was.
B
I don't think a nation state needs to do that. If they want to kill someone, they have a lot of options, you know, so it doesn't.
A
I think, if I remember rightly, it was Dick Cheney who had his pacemaker modified to remove radio communication. Some very senior US politician. And that seems about right. If you're the Vice President of the US, that probably could be a thing.
B
Sure.
A
That seems like a sensible thing to do.
B
Why not? Right. It's a lot like how Kamala Harris doesn't use AirPods because they're Bluetooth. And that makes sense. If there are any security issues with Bluetooth and you're in that sort of position, you just don't want to be exposed to them. Similarly, the German government has now announced that if you're doing security stuff for the German government, you're not allowed to use AirPods. And what I found funny was that there was an article that came out saying, the Germans are doing this, and it must be related to the fact that some Bluetooth vulnerabilities were announced recently. And that sort of misunderstands the way governments approach security problems. It's just...
A
Right.
B
Yeah.
A
It doesn't matter.
B
Right. Like, it could be no security vulnerabilities found ever, and they'd still say, theoretically, this could be a problem. You're not allowed to use it. Yeah.
A
And there's a pretty straightforward workaround, which is: use wired headphones and, apparently, look cool nowadays.
B
Yeah.
A
I mean, this sort of trail of thinking basically brought me to the point that for the vast majority of people, most of the time, good security in things is not that important, and that even though everyone in the industry thinks we're in a hellhole, it's fine.
B
So welcome to the last episode of Between Two Nerds. That's right.
A
And so I was thinking to myself.
B
I want to push back on you, but I'm kind of struggling, I have to say. So I would push back and say, I agree that it's the majority, but it's not necessarily the vast majority, right? Because I do think there's a very sizable minority that it does matter for. And here I'm thinking about people being stalked, vulnerable populations. There are a large number of people to whom it does matter, but hardly everyone. For most people, it shouldn't matter.
A
Yeah. So the way I've been thinking about it is that for the vulnerability to matter, you need someone who's motivated to take advantage of it. And so, people with a whole lot of cryptocurrency.
B
Yeah.
A
Like, there are people motivated to go after them, and for a while at least the vulnerability has been SIM swapping; that's been a weak point. People have exploited that, and that population is susceptible to the sort of, I guess I'd call it, systemic vulnerability of just life.
B
Yeah.
A
Right. So I think that's an example of a small population where that systemic societal vulnerability really matters. Right?
B
The exposure that, to most people, shouldn't matter does matter to them, simply because they have motivated adversaries: people who put in the effort.
A
Yeah, yeah, exactly. So it's this sort of two-sided thing. And I agree, I think, therefore, for a significant minority of the population that is a problem. But the dynamic is that it's hard for a minority to get the majority to invest a lot in better security when, most of the time, the majority just doesn't care.
B
I mean, here's the thing: I would say that the security trade-off is between friction and freedom. Right? So the more security you have, the higher friction you deal with.
A
Right.
B
The less you have, the more freedom you have to do what you want. The problem is, if you have too much freedom, so does everyone else, and, you know, you're insecure. Whereas if you have too much friction, you stop doing that thing, because it's just too difficult. And so, for a significant minority, they want the degree of friction to be higher, but that friction cost exceeds what anyone else is willing to deal with.
A
Yeah, yeah. So I think we're just, we're seeing.
B
The exact same thing. Yeah, yeah.
A
We're just trapped in this status quo where lots of people get owned and the rest of us don't care enough to fix it. One of the very first things, when I started at Defence: there was an old-school crusty security guy, and he said the opposite of security is not insecurity, it's convenience.
B
Yep.
A
And that's very similar to what you were saying. And it frames it as: well, we want things to be convenient. Yeah.
B
Like, there are two costs that we're trying to impose on people if we want them to be secure. One is the resource investment just to get there. And the second thing is we're going to make things more inconvenient in the process; we're going to add friction. Right? So, for example, if you are on the front lines in Ukraine, you will put a PIN on your phone rather than biometric ID, but then you'll use biometric ID to lock your apps, because that way, if the phone is recovered and somehow they bypass the PIN and do image it, there's still another layer of protection on your individual things. If you're not on the front lines in a war zone, I think there are very, very few people who would benefit from having a PIN plus biometric ID on each of their apps by default. You would stop doing it. You would turn it all off after a day or two. If you started on Monday, by Friday that would be gone. No way that's going to work.
A
Yeah. So this is this kind of train of thought: we are where we are because, as a society, that's more or less what we're happy with. But I was thinking that there is a population of people for whom that pervasive insecurity is just really, really bad news. And that's, like, intelligence professionals. And it's not even all intelligence professionals, it's just a small number of them. And I think it's maybe a sort of curve that's flat for most people, and then there are fewer and fewer people who care more and more about societal insecurity.
B
Yeah. So I guess the example here would be just before the OPM hacks. This was back in 2015, 2016. So just before China hacked OPM, they'd also done a hack against Anthem, which is the medical insurance provider for the US government. They'd also hacked, I think it was Delta, but it might have been United, I don't remember, but it was the airline that has the government contract and does most of the flying for them. And I think Marriott was the hotel as well. But again, I could be wrong. But each of these had been hacked, and all of their databases of customers and stays and flights and all this stuff had been downloaded. And there was this huge: who would hack into a medical insurance provider and just steal their historical customer database? What's the point of that? And it's obviously because that's where the CIA gets their medical insurance. Right? So, yeah, you're absolutely right. There's a small number of people who care that their medical insurance provider was vulnerable to getting hacked. And I don't think those people have enough clout to change it.
A
Right, yeah. And I think it's not even, like, everyday, you know, Mr. or Ms. Josephine Schmo at the CIA, who's just an analyst. It's a small number of operatives who are meant to be doing secret things.
B
You're right.
A
A small number of covert facilities in different places that get outed because of that kind of, I keep on saying, pervasive insecurity. So once upon a time I would have said that if you're one of those people, there are steps that you can take that would protect you. But I think those steps can only protect your communications.
B
Right.
A
It's very difficult to protect your pattern of life if everything else is, you know, effectively an open book, or close to it.
B
Right. So you could take a lot of steps to protect yourself, but it's not going to be enough, because you have to protect every service that you interact with as well.
A
Right, yeah, yeah.
B
And that's just not happening. One of the things I thought you were going to say initially was that a state should care about security even if the population doesn't. See, I'm not sure. I thought the argument you were going to make was that states should care. And I would push back on that, saying they do, when they need to.
A
Right, right.
B
And so I've got examples here. Like, for example, Albania, which very clearly did not care. They got absolutely hammered. They panicked. They called, like, Article 4. They were trying to decide whether to call Article 5. They were absolutely freaking out. And then, two years later, it's now happened enough that they are so good at it, they're hosting resilience seminars and conferences and, you know, like, their...
A
Vulnerability's become an opportunity.
B
Yeah, the Chinese word for systemic nation-state risk is also opportunity. But yeah, the same thing happened with Russia. Right? Russia didn't care about cybersecurity because no one was attacking them. Everyone was pissing outside the tent; they were inside pissing out. But once the war started and all the hacktivists got involved, everything changed. And within, like, three months, it had all been secured. Well, not secure, but it had been significantly improved from what it used to be. So my strong feeling now is: when you read policy papers or talk to people at think tanks, they always talk about how we're very vulnerable, we need to work on this. They have this whole it's-an-extreme-problem framing. And the problem they have is that they're thinking of a point in time. Right? Like, you will be vulnerable for that first bit, and that will suck, but then you get better and it doesn't matter anymore.
A
Right, right. So it's the fatalistic, I'll-deal-with-it-when-it-comes type approach.
B
But that seems to work. Like, we've seen it. The thing is, I think that's an acceptable solution, right?
A
Yep.
B
Because it absolutely does work, and it looks like the consequences are, I mean, they're not great for the few people who particularly suffer. But in general it might be a reasonable trade-off for not having to make those investments now.
A
Right? Yeah, yeah. So when you said you thought I was going to argue that states should care: I guess I think the whole thing is a trade-off. Right? The more you invest in security, the more inconvenient it is. And probably, from an economy-wide point of view, your whole economy is slower, the pace of innovation is slower. And it's not that I think we're anywhere near an ideal place...
B
But Tom thinks everything's perfect, we don't need to do anymore.
A
It's not clear to me that more security would have left us in a better place today.
B
Yeah. So I'm going to agree with you and expand on the reason why: the attacks that do happen are by people who are motivated for some reason, and I'm not sure that any incremental improvement in security would be sufficient to stop them. By which I'm saying: look, if you have a default password set up and no one is looking for it, that doesn't matter. But if someone is looking for you and you have a default password set up, then yeah, that's bad. But if you don't have a default password set up, you're probably going to be vulnerable to something else that they will take the effort to exploit.
A
Right.
B
And so, ransomware is good. That's going to get clipped. The upside of ransomware, the way that ransomware is really a benefit, is that ransomware provides this sort of opportunistic adversary, which in theory should mean that the lowest of low-hanging fruit has an incentive to do the bare minimum. And so ransomware should be creating that incentive. And I think, to a degree, it is.
A
It's providing that incentive that organizations need to improve. Yeah. So it's a real shame that the whole cybersecurity industry missed out on that gig. Making money and improving security, what's not to like?
B
We missed a trick on that one. But I guess the thing would be, it would be nice if there were a way to do ransomware without the ransom part of it.
A
Right, right. Well, I mean, it's the business model to improve security. Like, it actually works.
B
Right.
A
Like, sort of universally, you talk to people and it's like, yeah, it does drive security improvements. And unlike Hunterbrook and Muddy Waters, that we spoke about right at the beginning, it's an enduring phenomenon that's kept on keeping on and probably will keep on keeping on. And there's no good version of that. Like, I guess there's bug bounties, but they're too limited in scope to actually...
B
Right.
A
Really drive huge improvement.
B
Bug bounties are a proactive feature, whereas I think ransomware is passive. Like, the less you do, the more likely you are to have a ransomware event.
A
Right, right. So if you're sponsoring bug bounties, you're probably not in ransomware's target demographic. Perhaps. But I don't think either of those really does anything for that small group of people who should really care, like the intelligence professionals.
B
Right.
A
Or, there must be others.
B
There's vulnerable populations of people. Right. There's again, like, there's.
A
I was going to say billionaires, but they probably have like, compensating controls, which is paying, you know, for 15 bodyguards or whatever.
B
Yeah. I mean, I think there are people who should care who don't, and they're in this sort of vulnerable minority, and they just don't recognize it. So, like, the crypto people have been getting their fingers cut off and stuff. And I'm not sure that there's a technical solution to that.
A
Right, yeah.
B
If someone's going to cut your finger off to get money from you, I don't know if changing your password is necessarily the security solution that will work. I don't know if that's going to do it.
A
Patching your devices.
B
Yeah. So, I mean, the one thing this does remind me of is, I think Salt Typhoon does provide a good example of this. Everyone uses the phone networks, but some people who are exposed are particularly concerned about it. And that's, like, the politicians, the government, civil service, military even, all of that. And so for them, they would want the networks to be secure. Yep. It's a sizable minority, but they happen to be in a position of power. And so they were able, to a degree, to exert some leverage over the telcos and try to get them to do these sort of bare-minimum things, like change the default password, try to do patching in a reasonable timeframe of some sort.
A
Oh, hang on. I think the words I saw the FCC chairman Brendan Carr use were "accelerate patching." It didn't say anything about reasonable time frames. Just accelerated.
B
Well, if you never patch, technically any patching at all is an acceleration, so.
A
That's right, yeah.
B
Even there, there's resistance. Right? Even just "change the default password" has huge resistance. And I know there's this argument that if we introduce friction, it will slow down the rate of innovation. I can see that. Like, yeah, if you have to multi-factor authenticate every time you do anything, that's going to be slow and painful. But changing the default password does not seem like a bridge too far. That seems like a very reasonable step to require.
A
Yeah, yeah. So, I mean, this whole discussion just makes me feel pessimistic, or maybe optimistic, that we are where we are because that's just somehow the balance we've arrived at, given the incentives everyone has.
B
Right.
A
And so it's not that we've accidentally ended up here because people are evil or people made the wrong choices.
B
It's not a moralistic thing. It's, yeah, a systemic problem. Well, it's not even a systemic problem. It's a systemic outcome.
A
And even though some countries have more regulation and more pressure on companies to do better, from a security point of view, it doesn't seem that there's like a night and day difference. Companies still get breached. Some countries might be a bit worse than other countries.
B
Yeah. So it seems like the world is ending, or should end, because of how bad everything is. But it's been like that since, like, the 90s, right? It's always: things are so bad, at any point this sort of cyber apocalypse could hit us. And the fact that it hasn't, I think, doesn't reflect that it's not possible, but rather that no one who could actually do it benefits from doing it. Right, right, right.
A
So we're safe for now because there is no evil mastermind that wants to destroy the planet.
B
And I think it's like, you know, before you get into cyber, you don't really care. It's not a thing that's on your radar, it's not a thing that bothers you. As you go through your career, you see how bad everything is, and you want it to be better, because it could be better. And it doesn't seem like there's a lot that needs to be done to just make it better for everyone. Just a little bit of this, a little bit of care, a little bit of, you know... And you struggle and push and nothing happens. And I think after, like, 20 years, you suddenly go: right, yeah, I don't really care that much about security.
A
So we're the cyber senseis, is what you're saying. We pass through to the other side and don't care anymore.
B
Yeah, exactly, Tom. Exactly, exactly.
A
Thanks a lot, Grugq.
B
Thanks a lot, Tom.
Date: February 9, 2026
Hosts: Tom Uren (A) & the Grugq (B)
This episode of "Between Two Nerds" dives into the uncomfortable reality that society is, and perhaps always will be, fundamentally insecure when it comes to cybersecurity. Tom Uren and the Grugq explore why most people and organizations don't really care about security, why this is the rational outcome of our collective incentives, who actually should care, and why meaningful improvements remain so elusive. The discussion is peppered with industry anecdotes, war stories, and a sobering acknowledgment that security, in the real world, is a tradeoff society is largely willing to lose.
Medical Device Hacking – Hype vs. Reality (05:54–07:01)
Security Practices of High-Risk Individuals (07:01–08:03)
Throughout, Tom and the Grugq are wry, self-deprecating, and candid, combining humor ("Welcome to the last episode of Between Two Nerds") with deep industry insight. Their resignation about the state of security is offset by their clear-eyed, matter-of-fact explanations of why we've ended up this way.
This episode is a must-listen (or must-read) for anyone seeking to understand the deep-rooted reasons behind society’s enduring digital insecurity—and maybe to feel a little less guilty about it.