A
Hey, everyone, and welcome along to Seriously Risky Biz. This is our podcast here, all about cyber security policy and intelligence. And my name is Amberly Jack. Very soon I will bring in Tom Uren, our policy and intelligence editor, and we're going to have a chat about this week's Seriously Risky Business newsletter that Tom has written. You can read that and subscribe over at our website, Risky.Biz. But first, I'd like to thank our sponsor for this week, which is Authentik. And you can find them at goauthentik.io. That's Authentik with a K. G'day, Tom. It's great to see you.
B
G'day, Amberly. How are you?
A
Yeah, really good, thanks. And I want to chat to you, Tom, about the FBI. The FBI, we learned recently, is buying data about Americans, which on the one hand, Tom, kind of feels a whole lot easier than the old fashioned, boring way of getting warrants to get location data from telcos. But, Tom, you're sort of saying here, hang on a minute, lawmakers really need to step in here. What's the story?
B
This feels to me like a continuation of a story I've written about many times, which is that America's data privacy laws, and just the really loose state of data brokers and the ad tech ecosystem, mean that the whole country's awash with data that can be used in very invasive and intrusive ways. And my argument over the years has been that the best way to fix this up is to tighten up data privacy laws. That's not happened. And there have been several stories where local law enforcement or other agencies have been using this data to basically track and surveil people. There's not been a story where it's been shown that the federal government is doing that in any meaningful way.
A
Right.
B
This is still not that story. All we've got is that the FBI director, Kash Patel, was asked about the use of location data, and he basically said, yes, we use data in accordance with applicable laws. Now, the problem is that this is the first time the FBI, or any federal agency, has really said, yes, we're using this data, but it's totally unclear what they're using it for. And, you know, is that a legit use? And by legit, I mean in the sense that most people would be okay with it, that it feels like a fair use of the data, that it's not infringing on civil liberties. Now, the problem with all this data is that it is, like I said, tremendously intrusive. You can do all sorts of things that would otherwise require a warrant. And so it's been described as a backdoor way of, well, not getting a warrant, but getting the same sort of data, which is true. And so I think that's very significant. There have been a number of different privacy bills. They've not really gone anywhere. But I think this moves it from a theoretical risk, that the federal government could do intrusive things with this data, to a much more imminent one. Now, I've written in the past about the intelligence community's use of this data. And that report, which the ODNI released a couple of years ago, did a really good job of explaining why it's a problem. In the intelligence community, people are using that data, but there was no particularly good policy. The people who compiled the report couldn't get a handle on all the different ways it was being used. But again, there was no smoking gun, no "this is being used in a terrible way to invade the rights of American citizens". So we're in a sort of similar situation. But the problem is that the FBI has a very large domestic remit, and it can also compel people. It's got coercive powers. That's the point of the FBI: at times, to arrest people.
And so I think it's a much more worrying situation compared to the intelligence community, which is largely focused overseas. Not entirely, but largely.
A
Yeah.
B
And it doesn't have the same kind of coercive powers over citizens. And so I think this is the, you know, we've gone from a theoretical risk to a real one. It deserves some sort of legislative response.
A
And Tom, what. What would you like to see, I guess, in terms of those responses? Like, is the answer just treating them like getting data from telcos and needing warrants? Is there more oversight needed? Like, what. What's the answer here, do you think?
B
Of the things you mentioned, I think it's all of the above. Warrants, to me, intuitively make sense. The reason you have warrants is you want someone else assessing, in this case the FBI, but really any federal agency: is this a legitimate use of the data? Why do you need it? Are the upsides worth the downsides? And I think it makes sense to have an independent arbiter for that. And that's what a warrant is for. And it makes sense because this data is as intrusive as other forms of data that law enforcement or federal agencies have traditionally required a warrant for. So, you know, if anything's above that line, I think it deserves the same processes. I think there's also a lot of oversight required, because those reports can be, or are, tremendously useful in trying to figure out where the gaps in policy particularly are. It's only when people step back and take a deep look at how things are being used that you find out that they're being used inconsistently, or in ways that don't seem quite right. And so I think it's a process of both of those things. One without the other is not sufficient; both together are necessary.
A
Anytime you get these kinds of arguments about privacy and data, you kind of get people going, you know, "well, if you've done nothing wrong, you've got nothing to worry about." Do you want to run through a couple of the risks that are involved with being able to purchase this data?
B
Yeah. Yeah. So one of the first stories, from several years ago, is that a small Catholic publication bought some data, and I don't think they even spent that much money, and it was Grindr data. They figured out who a priest was by where they were staying at night and which bars or restaurants they were visiting, and they basically outed them. Now, that's how sophisticated you have to be: you have to be able to write on Substack. So it's not a high bar.
A
Yeah.
B
And there have been a number of stories about media publications who've gone out and bought data and done, I guess, kind of theoretical exercises around sensitive locations like abortion clinics. Can we figure out who these people are based on where we see their location data? And you can do that. And that's a couple of journalists spending, I imagine, a couple of weeks. It's not a huge deal. So the barrier to entry is not particularly high. The data is readily available. I think the reason we don't see a whole lot more stories is because there are just very few people motivated to do that. Now, I think the kind of people who are motivated to do that are foreign intelligence services, for example. You know, who are all the people who park at NSA or CIA and leave their phones in the car all day, because you're not allowed to take them into the building, and then drive home? Where do they all live? Where do they go to the gym? What's their pattern of life? I think that's a serious national security risk. I think you can do the exact same thing for military personnel. And there are a number of stories about how military personnel have been tracked from particular places to deployments. So then you can figure out who they are, where they live. I think these are real, serious risks, but so far the US has not really done anything about them. I mean, there was a story last week or this week that the location of the French aircraft carrier was revealed because someone had logged in on Strava and done a run around the deck. There are stories that the US has used this data to try and track ISIS terrorists to different locations. And you can be assured that if the US is doing it, other people are doing it. It's not like building a nuclear weapon or anything like that. The bar is quite low.
You need a bit of smarts, but mostly you just need the motivation or the intent to do it.
A
You'd kind of assume, Tom, that the FBI's use of this is legitimate, but your problem is we don't really know how or why they're using it. So "hey, let's make sure they have the same kind of paperwork and oversight they would for any other data they get" is basically the gist of it here.
B
Yeah, pretty much. What Director Kash Patel went into in that hearing was very limited. It's just: yes, we are using data, it has been very valuable. Of course it's very valuable. I would be surprised if it wasn't valuable. But the question is, you know, should you be using it without some sort of process around it? Because it is potentially so powerful. So, you know, what is it, Spider-Man? With great power comes great responsibility. I think this applies, because it is as intrusive as, if not more intrusive than, anything that you would need to get a warrant for.
A
Moving on, Tom, the FCC wants to make routers proudly red, white and blue. They have brought in some new rules and regulations around importing routers, which effectively ban new ones from being sold in the country without going through a number of hoops. At first glance, Tom, this might seem like a nice security-improving measure, but that doesn't seem to be what's going on here.
B
Well, I mean, at first glance it doesn't even seem like it would improve security. At no glance, really. When I first read the story, I was like, what? And it's kind of framed as a security improvement initiative. So the rationale is that having vulnerable routers everywhere is a bad thing, which it is. So therefore we're going to ban all new ones, and it's consumer grade routers in particular, which they define as ones that are typically residential and that you just install yourself. So: we're going to stop new ones until they go through this special process. And I kind of naively assumed that the special process might be some sort of security validation, and if you don't meet certain standards, perhaps you wouldn't get that exemption to be allowed into the country. But no, it's got nothing to do with that at all. It's all about who owns the company, where it's made, where the parts come from. But I thought the most outstanding, funniest, weirdest part was: what are your plans to bring manufacturing to the States? That's the third question. And none of the questions in the approval form or process relate to security at all. Now, having a reliable supply chain is a thing. Supply chain security is a legitimate security concern. I just don't think it's the first thing I would tackle. If the problem is consumer grade routers, that's not the first thing I would tackle. There are lots of things you can do to make routers more secure that would be more useful from a security point of view. So, the way it's set up, people can buy routers that are already approved. You've got the current stock of routers, which we know are not particularly secure because they get compromised all the time. The new stock comes with some commitment to build in America. Will that improve security?
No. I think it could well be counterproductive, because then you've got management effort focused on how do we shift production to America, not how do we make our routers more secure. And the management of any company has limited bandwidth. If priority number one is to move to America, then priority number one is, by definition, not to make our routers more secure. So I think it actually has the potential to be counterproductive from a strict security perspective.
A
From a security perspective, what would your answer be? How would we go about incentivizing manufacturers to be more secure?
B
In years past, I would have said have a certification scheme or a labeling scheme, where they get tested and they get rated. Now, the Trump administration has been very willing to employ tariffs. So I think there's a legitimate argument that we want to tariff routers with poor security. I think the Emergency Powers Act could reasonably apply: we don't want insecure routers in the country, so we're going to tariff them at 100% or whatever. So, taking things the administration has already done and combining them with some sort of testing program: if you don't meet these standards, well, 200% tariff for you.
A
Yeah.
B
If you're perfect, 10% tariff. I think that would focus manufacturers' minds on getting security improved quick smart. That would really give them the incentive to do it. And then, having a secure supply chain is a thing that is important, but like I said, it's not the first thing you would do. I would try and get the actual equipment itself more secure, and then consider the supply chain second.
A
Now, Tom, we have spoken recently about Donald Trump's cyber strategy, and you have written in this week's newsletter that the Trump administration's approach to using the private sector is becoming clear, and it's less "unleash the private cyber beast" and more "tell us what you know". Tell me about this. What have we seen?
B
Yeah, so there's been this suspicion in the field, many people think, that the Trump administration is very much about unleashing the private sector to just go hack away. That story has appeared a number of times. It's pretty clear that the administration is not going to do that. The strategy talks about unleashing the private sector by creating incentives to identify and disrupt adversary networks. That's a very robust statement, and it doesn't quite match what the National Cyber Director has been saying. He's talked about the private sector, I think he said, "illuminating the battlefield". I described it as being the eyes and ears and supporting government action. So the government really wants to show its chops in terms of offensive cyber action. And I guess, from the intelligence community's point of view, knowing what's going on in the world of cybercrime is not the number one intelligence priority. So if the administration wants to do anything about the bigger problem, its intelligence agencies, which are its traditional eyes and ears, I guess, are not focused on cybercrime per se. But that's something that private sector companies have spent a lot of time and effort on.
A
Yeah, right.
B
They've got threat intel groups, they've got disruption groups. And plus they have a lot of data sources related to their own telemetry that the government just doesn't have. The government has special secret intelligence assets, but they're not focused where the Microsofts or the Googles have their assets. So they're complementary pictures, I guess. And the idea is that by illuminating the battlefield, seeing what Google or Microsoft or Palo Alto sees, the government will be in a position to decide: here's what we can actually do that will make a difference. And that doesn't have to be a cyber thing. That can be a sanctions thing or a diplomatic thing. So there are lots of options, and that's what they're describing. In the piece, I liken it to a different kind of public-private partnership. Now, these have a bad name because they're only moderately successful. They've never gone away, everyone kind of has grand hopes for them, and they're always disappointing, but they're not positively bad. I think mostly that's because incentives are not quite aligned. And in this case, I think incentives are aligned. The private sector wants action taken against criminal groups particularly, but just general badness on the Internet, because badness on the Internet is bad for business. And the government really wants to do stuff. It really wants to demonstrate power and throw its weight around. So I'm mildly optimistic that this will work out. It's not a case where the private sector wants something from the government and the government is reluctant. In terms of information sharing, there's always been a bit of a disconnect, because the government is not that good at sharing secret stuff. And it's also not really focused, like I said, on the stuff that the private sector really cares about. Do they care about what Putin is thinking?
What Xi or Putin is thinking is probably intelligence requirement number two or three, and that's not something the private sector really cares about in any way. So those sorts of interests are not aligned.
A
I have worked with you for a little while now, Tom, and I can say that I get very excited when I see you even mildly optimistic about anything. So it feels like a good day.
B
I think there's a Russian saying: we hoped for the best, but it turned out like always.
A
On that note, Tom, we will leave it there, but thank you so much for joining me yet again. And you can, of course, read and subscribe to Tom's newsletter over at our website, Risky Biz. But, Tom, have a great week. And I'll see you again same time next week.
B
Thanks, Amberly.
Risky Bulletin, March 26, 2026
Host: Amberly Jack
Guest: Tom Uren (Policy and Intelligence Editor)
This episode explores how U.S. law enforcement, particularly the FBI, is leveraging commercially available location data, potentially sidestepping traditional warrant requirements. Host Amberly Jack and guest Tom Uren examine the legislative gaps in data privacy, the invasive potential of brokered data, and risks to civil liberties, and call for updated oversight. The conversation also addresses new FCC rules around router imports and reflects on evolving public-private cyber threat partnerships under the Trump administration.
Conversational, clear, and slightly wry—Tom Uren’s analysis is measured but urgent, and Amberly Jack keeps things focused and engaging. The dialogue maintains expert insight without dry jargon, often seasoning complex issues with memorable analogies.
This episode is a deep-dive into emergent risks at the intersection of data privacy, law enforcement practice, and national security, dissected with clarity and a touch of dry humor. Listeners will come away with a nuanced understanding of why American data privacy law needs updating, where well-intentioned policy misfires, and how government and private sector interests can—sometimes—align for the greater good.