
Isabella Royo
I'm Isabella Royo, intern at Lawfare, with an episode from the Lawfare archive for November 16, 2025. On October 6, the Brookings Institution released a report on how immigration enforcement agencies are scaling up their use of a range of technological tools to monitor immigrants, including facial recognition technologies, social media scanning, autonomous border surveillance towers, and artificial intelligence-powered "Hurricane Scores" intended to assess the likelihood that an immigrant will fail to check in for an agency appointment. For today's archive, I selected an episode from August 9, 2024, in which Eugenia Lostri sat down with Lucas Bantonis to discuss the relationship between law enforcement and tech companies, what that relationship looks like in other countries, how communication between tech companies and law enforcement can be politicized, and more. It's the Lawfare Podcast. I'm Eugenia Lostri, Lawfare's Fellow in Technology Policy and Law, with Senior Privacy Engineer at Netflix and former Army Reserve intelligence officer Lucas Bantonis.
Lucas Bantonis
Like, when the news says it's a tough time out there for data protection, it is, partially because of machine learning, partially because of a vague slide toward populism and authoritarianism in many parts of the world. It doesn't mean we should stop trying.
Isabella Royo
Today we're talking about the relationship between law enforcement and tech companies, what that relationship looks like in the US and other countries, and the different ways in which that communication can be politicized. So Lucas, you sit at a very interesting intersection of law and engineering and technology. Can you maybe start by just describing what it is you do and what your day-to-day looks like?
Lucas Bantonis
Yeah, sure. Thanks, Eugenia. My day-to-day is mostly about being a bridge between engineers, whether they're data or software engineers, and lawyers and other legal professionals. Most of the nexus of data portability and law enforcement response is answering requirements from legal about what to add to subject access requests, or how to manage requests from customers or law enforcement about data. And it's been that way ever since I joined the field about four years ago.
Isabella Royo
So tell us a little bit about how you joined the field. You know, what were the skills that allowed you to do this work? Because you're not a lawyer, right?
Lucas Bantonis
I'm also not an engineer. Yeah, it's kind of one of those things where it happened by accident. A lot of careers start by happy accident. I had some intelligence experience from the military. I had a policy background from school and from writing some papers for academia and for research outfits. I took a lot of classes that focused first on cybersecurity, because I was really interested in cybersecurity policy, and then I started getting interested in emerging tech, and that lent itself pretty well to working in privacy engineering. Honestly, I spun some experience working in storage systems into a project that focused on law enforcement response at Google. And that project ended up being my segue into helping their law enforcement response teams improve the ability, and the protections inherent in their products, to keep customers safe while answering requirements from government and law enforcement. So it was a very happy accident, and something that I think lends itself well to my mix of skills, non-skills, whatever you'd call what happens to be in that bucket.
Isabella Royo
I feel like that's usually how it happens: happy accidents, and being able to seize opportunities when they come your way. At least for those of us who are, I wouldn't say early in the field, but when I started, and I think when you started, there wasn't a path to do this work. You just kind of learned how to do it.
Lucas Bantonis
Yeah, not at all. I mean, part of it is that some of these technologies are emerging, or they're attached to something that governments have trouble understanding. And then you get into this weird territory where even the companies that make the technology don't really understand it super well. So there's nobody who really knows how to grapple with it. I mean, maybe some of the engineers developing the tech, whether it's machine learning or something else, know how to innovate in the field, but they're not really sure how to add protections for it, because that hasn't been invented yet. That's true of cybersecurity, that's true of ML, that's true of a lot of different disciplines.
Isabella Royo
Yeah. And then crucially, there's this bridge function that you described from the get-go, where you may understand the technology but not understand the legal regime around it.
Lucas Bantonis
Oh, absolutely. It's up to first-line engineers and legal professionals to make informed decisions about risk ownership, acceptance, and forecasting, because they either build the tech or they know the law. A lot of times you'll hear privacy engineers specifically talk about knowing a little bit about each of those categories, but not being the first-line expert in any specific one of them. We are, necessarily, privacy experts, but it's challenging to even define privacy in the context of emerging technology. It's really challenging, actually.
Isabella Royo
So one of the reasons why I was excited about our conversation is that I think you bring such an interesting perspective to this. When I think and talk to people about the relationship between tech companies and law enforcement, I tend to hear it more from the law enforcement side, or from researchers who are looking at this and studying it from the outside. It's a little bit less common to hear from someone who's actually inside the big tech companies. At least that's been the case for me.
Lucas Bantonis
So.
Isabella Royo
So can you characterize that relationship between law enforcement and the companies from that unique perspective that you bring?
Lucas Bantonis
Absolutely. So, with it in mind that it's a small field and there are definitely better spokespeople for legal and risk positions: from an engineering standpoint, you always want a system to do what it's supposed to do. You build it, it has a certain failover tolerance, and you generally try your best to provide the best product. And that product has to have, as is required not just by the GDPR but now by the CCPA and a bunch of laws coming out across the planet as well as in the US, some measure of data portability baked in. Customers now expect to have some type of granular control over what data is available to them. The tension arises when there's also this access question, or this access expectation, among law enforcement and governments. I have seen dozens upon dozens of headlines that revolve around different countries' level of expectation that they have access to consumer data via a tech company. It's something that I understand some folks at Lawfare are very interested in, and something that I'm interested in: these tech companies acting as functional intermediaries for this surveillance. It's not that we conduct it ourselves, though there are lots of different positions floating around the government now as to whether or not there's explicit surveillance being conducted by companies. It's more along the lines of: we build our products to the best of our ability, as big tech engineers, to make sure that customers have the most control and context and decision-making we can afford them, the ability to delete data, all those different things that are baked into the GDPR. But also, when we have to comply with government requirements, provided that the government respects, by some established, agreed definition, the rule of law, we're going to answer their request to the most specific degree of our ability.
There are some constraints on that ability, like lack of specificity. So if a bit of legal process comes in and it's not specific enough to target an account, or maybe it's for data that we don't have, there are occasions where any sane tech company would outright reject the request, or push back and say, you need to be more specific. The same goes if the country or government does not reflect what we would consider the rule of law. That's generally a consensus position that isn't really spoken for by any specific tech company, but you'll see a common pattern where one company will trumpet their response above another's, even though they're all functionally doing the same thing. They're establishing a precedent, and it's not a formal legal precedent. No one's actually writing new law on this. Some people are writing papers, hopefully soon myself included in that company. But generally, that's fascinating to me, because it's still in the spirit of honoring multiple different users of that ecosystem: making sure these requests reflect only the degree of accuracy that they provide, that the requests don't result in a legal data production that's over or under the mark, and that they reflect legal reality and the reality of the product, like the availability of data, things like that. It's a world most people don't know. Like you said, they know it exists from the perspective of law enforcement wanting to unlock a phone, but even then they see it through the lens of encryption, and it becomes an encryption debate, some kind of, I wouldn't call it silly, but very ham-fisted debate about whether or not encryption should exist in the first place. This is a little bit different, and sometimes more nuanced.
Isabella Royo
So I'm hoping to get to this later, but since you kind of brought it up, I feel like I need to ask it now. Of course, this is not just about responding to requests from U.S. agencies or U.S. law enforcement. This spans the whole world, because big tech companies have operations everywhere, and I would love to hear a little bit more about what it's like to answer to these different regimes, and what different things you need to keep in mind. But I was also just reminded of the recent testimony to Congress that we saw from Microsoft. They stated, for example, that when there are requests from the Chinese government, because that hearing focused a lot on China and Microsoft's presence there, they may just not respond to those requests. Is that typical? I know there's concern around foreign governments' access to Americans' data.
Lucas Bantonis
It's very typical. You'll see an industry standard where a lot of major tech companies either don't have business operations in China, and each company has their own reasons for that, or, when requests come in from, say, China or Russia, typically and for different reasons, they will just not align to the request. I think maybe in the early 2000s or early 2010s there was a bit of a gray area where a couple of the names in big tech were trying to do a hybrid business model, or comply with CCP censorship, but most folks have gotten out for kind of the same reasons. It's one of the weird ways in which they're all very harmonious. They don't like to answer requests from governments that they strongly believe would misuse the data to target dissidents and things like that.
Isabella Royo
So what about other governments? What's the range, from, yes, we will likely reply to this within the margins of the law, to, we're just not going to have operations in your territory?
Lucas Bantonis
Well, it can vary. Some of the risks that get considered, without mentioning any specific countries, have been things like the degree of fraud and corruption in submitting these requests: how easy is it to submit fake requests, regardless of the subject matter or the type of request? That's a big consideration. It's not always a logistical consideration, whether a given group or geographic region of policing is willing to use the tools that you provide them, or how corrupt a given court system is, but that's all pretty much wrapped up into the overall picture. And then sometimes it's who gets elected into leadership and how they use the information that they glean from tech companies. There are a couple of standouts in the last 10 to 15 years that would fit that bill, whether in Europe or in Asia. But generally, there's no one-size-fits-all way to decide that a country you used to honor requests from no longer follows the rule of law. It's tricky, because a lot of companies then have to contend with their business in that country too. Do they want to catch smoke from a government that may toe the line, but toes the line successfully? It's really challenging. As opposed to governments that are a joy to work with, that have really robust regulatory agencies that flag instances of overproduction. Again, without getting into specifics, it's a big scale. What is the scale? It's very wide.
Isabella Royo
So it's every single country. Every single country that's...
Lucas Bantonis
Not Russia, China, and a handful of others, yeah.
Isabella Royo
So an interesting trend is this requirement in legislation that you, as a company, must have an office and personnel present in the country where you're operating, right? And there have been a lot of concerns that that can be used to apply pressure. It's easier to say, well, we're not going to respond to your request, when everyone who works for you is in another country, far away; there isn't really much the government can do. But when not only do you need to have a presence, but it may be considered an offense to not respond to government requests, there's leverage there for the government. Do you think that's going to affect how we're seeing operations being maintained or not?
Lucas Bantonis
I hate to give the cop-out answer, but again, it depends on the country and it depends on the company. I have some personal anecdotal experience, which unfortunately I can't share, that generally there are creative ways to still base your office there and still do things with your people and your data to be able to protect them successfully. And that is, again, something that is surprisingly common across big tech companies if the risk gets too great. So if not just lawyers, but your security staff or your campus staff say, hey, there is an imminent risk to people in this country, or the person who just got into office is not going to honor their word to keep the campus safe, that is an easy contact-the-State-Department, get-them-out situation. But beyond that, the safest thing is to withdraw; the middle ground is to be able to apply a little bit of reverse pressure, because most of these companies, whatever the company is, are so valuable to have in that country. I can think of cases where certain companies still do business in Russia, or in a country that's toeing the line on authoritarianism, and it can sometimes be a weird bluff, because the country needs the company to provide those services. They're really valuable for the economy, or people can get jobs doing that work. So the companies have more leverage than it seems like they do.
Isabella Royo
Since we're speaking about the international part of this: the UN is in the process of negotiating a new cybercrime convention, and it has been criticized, me among the people who have criticized it, for, among other things, its very broad scope and, honestly, the insufficient protections that it provides for human rights. But the convention would also create some new obligations and some challenges for tech companies. So could you maybe speak a little bit about what these changes would be and how worried you are about them?
Lucas Bantonis
So I am not super well read on the current state of negotiations, so if there's later context that I miss, please feel free to provide it. But generally, one of the most interesting things I saw in the considerations for the treaty negotiations was this concept of traffic data, because it was listed, by the ICO and in some of the language in the draft sessions, as core to cybercrime investigations, to be able to share data between parties. There's similar talk in the EU specifically, rather than the UN, around electronic evidence, about being able to share evidence for terrorist investigations and intelligence investigations. But this idea that ambient traffic data, which to me just sounds like communications metadata, something we've been talking about in the States for 30 years, longer actually, is somehow core is really fascinating, because a lot of times I don't know that having every actor responsible for each endpoint of a major breach share access to the same telemetry and investigations data is actually going to speed up the investigation. Normally there are a couple of key actors, or there's one person who pushed a bad update or something like that, who does most of the lifting, and that's not necessarily because they have access to that telemetry. Is giving it to a bunch of different government agencies going to help? Does that mean I'm a critic of the treaty negotiations? Not really. I just think that concept of traffic data is really fuzzy, and I thought that was interesting. To answer your question directly: do I believe that it would mean meaningful changes for the way tech companies do things for cybersecurity investigations?
I think one of the biggest tensions I see is that tech companies still have a really fraught relationship with sharing information about a vulnerability with the government, because it's loosely assumed that the government, whether it's any government in the world, as you are very well aware from your own expertise, can't be trusted not to actively use it, not to actively exploit something that's later going to get patched. So the way that ties in with traffic data, to me: I'm glad that the UN and some of the drafters are thinking about it, but I don't know that it's going to compel companies to rethink the way they collect that telemetry, or share it for that matter, unless there's something about the negotiations that I'm missing. Please, take it away and tell me something else that you found interesting about the same topic, because I'm happy to respond.
Isabella Royo
No, no, I think that's interesting. The part that I focused more on is the fact that there aren't sufficient protections in place and that the scope is just way too broad. Honestly, literally everything would become a cybercrime, and I think that's dangerous. But, you know, that always poses interesting challenges for everyone who actually needs to operationalize a treaty. But I want to bring us back to the U.S. It's been an interesting year for privacy and surveillance. There's some recent reporting that Schumer is expecting the child online safety bills to clear the chamber soon. So, again, going back to the theme of how this is going to change your job and how big tech companies operate: do you see these bills creating significant change to your day-to-day, or not?
Lucas Bantonis
Child safety is tricky. And while acknowledging that the law enforcement response world necessarily touches child safety in a meaningful way, it's challenging, because a lot of the requirements in international agreements and for domestic law enforcement are pretty ironclad about how you treat that subject and the different issues that crop up on the Internet. I don't know that it would so much change my job, just because where I now work has less data than Google. I think overall the landscape may get tricky in countries that, just like in the cybercrime negotiations, use broad collection apparatuses to go after crimes in a really weirdly fungible way. You know, to say, hey, I'm writing this bill, or I'm going to send this legal process from government X to company Y, to target a crime that involves putting children at risk, or I'm going to do it because it's a cybercrime. To me that's a really broad continuum, and the similarity is that they use very broad government vehicles to target the same thing. In the US, I don't necessarily see that happening, but sometimes child safety bills do include weird provisions.
Isabella Royo
Let's talk about something that is, I think, maybe a little bit more squarely in your realm, which is the renewal of FISA Section 702. I'm sure you followed that closely, so I would like to hear your thoughts on that entire process. How did you experience all of that?
Lucas Bantonis
Yeah, I mean, I watched a couple of the debates and I read the commentary, not just here on Lawfare but elsewhere on the Internet. I talked about it with my coworkers, I talked about it with some legal folks. And generally, 702 renewal always seems pretty cut and dried. It's sort of a necessary element of how the bureau targets suspects in intelligence investigations and law enforcement investigations. The challenge becomes, and it seems like I'm a broken record at this point, when people add weird provisions to the renewal. So the iteration that I saw was the Reforming Intelligence and Securing America Act, and folks from different civil liberties groups argued in some commentary that it would give federal law enforcement a broader ability to compel companies to surrender data, or that there's some type of additional pressure under the current renewal that they can apply. The weird part about the pressure that law enforcement applies to companies is that, you know, they can send a bunch of requests, or they can send national security letters; they can send actual legal process that compels companies to set up broader and broader legal productions. Even after reading the text of the bill, I couldn't find language that expanded that requirement very meaningfully. The most interesting wrinkle, for me, doesn't actually come from domestic surveillance, because for US persons, 702 provides that the intelligence community can't target any US person, including by targeting a foreign person for the purpose of targeting a domestic person.
The challenge comes when those people are abroad, or when there are other weird entanglements with cross-border data sharing, when folks have other meaningful relationships with other data repositories and other regulations in different countries. I think that's where the compulsion becomes stronger and a lot of the protections break down. And that's my biggest concern with the changes to FISA that have gone forward through the renewal. I don't like what you would call backdoor inclusions. They're not great. And coming from the intelligence community myself, I respect the need for constancy, a degree of specificity, and a degree of aggressiveness in every intelligence collection law. But the relationship that intelligence law has with companies, let's, for now, say it's fraught. I don't like the landscape, especially when, as I mentioned before, it gets into silly debates about encryption and things like that. I hesitate to say that's a good trend for the future.
Isabella Royo
It's Sunday, 5:00pm you had a non stop weekend. You're running on empty and so is your fridge. You're in the trenches of the Sunday scaries. You don't have it in you to go to the store, but this is your reminder you don't have to. You can get everything you need delivered through Instacart so that you can get what you really need. More time to do whatever you want. Instacart for one less Sunday.
Lucas Bantonis
Scary.
Isabella Royo
We're here. Taking care of your eyes shouldn't be a hassle. That's why Warby Parker is a one stop shop for all your vision needs. Our prescription glasses and sunglasses are expertly crafted and unexpectedly affordable. Stop by a nearby store or use our app to virtually try on frames and get personalized recommendations. Did we mention we offer eye exams and take vision insurance too? For everything you need to see, head to your nearest Warby Parker store or visit warbyparker.com today. That's warbyparker.com if you're a custodial supervisor at a local high school, you know that cleanliness is key and that the best place to get cleaning supplies is from Grainger. Grainger helps you stay fully stocked on the products you trust, from paper towels and disinfectants to floor scrubbers. Plus you can rely on Granger Granger for easy reordering, so you never run.
Lucas Bantonis
Out of what you need.
Isabella Royo
It's interesting. Not that long ago, not to self-plug, but I will do it regardless, I had this really great conversation with Joseph Cox, who wrote the book Dark Wire, about the FBI building out their own hardened cell phone company so that they could sell it to drug dealers. They managed the entire company, and that's how they had access to the communications. And, you know, while you cannot expect that type of operation all the time, it just seems like maybe a better approach than requiring all these companies to break or weaken their encryption just to be able to see what's going on.
Lucas Bantonis
Yeah, I mean, I've actually had a couple of recent conversations with former law enforcement professionals, and I think the general argument is based on a difference in mission rather than technology. They wake up basically every morning thinking about how they can not only drive up a statistic, like prosecution rate or conviction rate or the number of dangerous people put in jail, but how to break up networks, how to break up gangs, how to break up terrorists. And generally, if they can do it while meaningfully going above and beyond the requirements of the law, they are doing a good job. But that is a corollary to their primary mission, which is just keeping people safe. And I've met tons of people who really do believe in this mission and do a good job at it. Encryption always seems to be this thing where, when I talk about it with them, it's, well, of course people deserve privacy, but I'm trying to catch terrorists. And that's kind of it; that's the end of the discussion most of the time. For encryption advocates, as I'm sure many of the folks that you've spoken to and written about would say, it's the exact opposite. The main goal for driving protections in the end-to-end encryption space is that the people who use Signal, WhatsApp, Telegram, Wickr, whatever, are dissidents, marginalized groups. Sure, we're willing to admit that military and spies use these too. But most of these groups are protected, and if they have that encryption weakened or broken in any significant way, they will be rendered just as vulnerable as the terrorists or the criminals would be to the FBI, and it's an unacceptable trade-off. With regards to the subject of Dark Wire, or that whole managed company:
I think the American government has a long history of creating legally sound snares to catch people, subverting the definition of entrapment in order to break people up and get their comms broken down. I think the value of still asking companies for permission is that the companies can serve, however strong or weak that could be at any given point, as a meaningful intermediary check on government surveillance. Going direct to the customer still feels a little, yeah, I know, I'm not a lawyer, it's legally not entrapment, but it does feel like it's abrogating the process through which you're able to conduct surveillance fair and square. I think the simple definition of surveillance is literally observing someone for the purpose of collecting evidence. And how do you know that what you're collecting is going to go into evidence? How do you even submit the process for a warrant if you're just snarfing up all their comms? There's no evidentiary basis for the collection. Even in intelligence, I mean, there are lots of hair-raising stories and vague or explicit human rights violations in the history of the global intelligence services, but at the same time there's supposed to be a basis for every interrogation, for every collection, for everything that you submit to your agency head, to your lead, to your commander, in order to be able to gather that intelligence. So those plans, those managed-company things, like I said, I'm sure they're airtight. But as a newly minted privacy person going on five years, that stuff just gives me the creeps.
Isabella Royo
I'm failing right now to remember which proposed legislation this was, but not that long ago there was a piece of legislation being proposed about encryption that basically said: we want you to have a way to access the data that is proven not to weaken general protections. So basically, give access, but make the encryption still be good. And it just seems like such an interesting example of that disconnect, the lack of a bridge between policymakers and actual technologists, because sometimes that technology doesn't exist. There isn't a way to do that, if I understand it correctly. So how do you even get around it when the law imposes requirements that you simply cannot meet?
Lucas Bantonis
I hate to be cheeky or precious with this comment, but: just make them work harder. It's proven that the National Security Agency can study Tor endpoints and then identify the computers the traffic came from by looking at those endpoints. Theoretically you could do the same with endpoint devices generally; you could do it with computers, you could do it with phones. Does it always work? No. Is it always in time to stop terrorists? No. But that's the imperfect solution that meaningfully comports with the burden placed on intelligence and law enforcement agencies and cops, anywhere up and down the chain. Encryption exists for a reason. Cops earn their salary, and they do a great job catching bad guys, when they find workarounds that keep the people who want to use encrypted chats safe. People do take sides on that, very pointed sides, and somebody could listen to this and think this guy's ludicrous because it has to be one or the other. But I really do think that making both sides of the community work harder at what they do has a benefit for everyone. Unfortunately, yes, there's also a benefit for criminals and terrorists if you keep strong encryption, but not to the extent that they can no longer be caught.
Isabella Royo
Something that I've been quite interested in, especially looking at some of the recent developments over the last year, is how politicized communication between government agencies and tech companies has become. I think we see that most clearly with social media companies, because the communication between DOJ, FBI, and tech companies about misinformation efforts or how to maintain election security has been criticized very publicly. There's a new report the DOJ inspector general released evaluating the efforts to coordinate information sharing about foreign malign influence threats to U.S. elections, and the three big findings were: there is a lack of policy on how you do this, there are First Amendment implications, and there's a lack of strategy guiding the interactions. I know your field is not necessarily social media, but you do handle a lot of the relationship between the tech company that you work for and law enforcement. So do you think these concerns translate at all to your field? Have you seen any kind of backlash about the way that the two sides work together? Or has it been its own little pocket that is just...
Lucas Bantonis
Not touched? I think they're very related. I hate to be disappointing when I say that whichever topic it is doesn't specifically affect law enforcement response, because folks in government and cops will always need data for investigations; they're always going to want to know about something. The dichotomy I'd like to briefly examine is the assumption that Democrats will go after tech companies, that they will hold them accountable because they're harming consumers, or because an open society requires reining companies in, or because they don't pay enough taxes and they're building a bunch of evil machine learning, while Republicans will not touch them. I think that is false. And even if you agree with me on its face that it's false, a lot of people assume it. If Republicans win in November, they generally will go after tech companies that they don't like or that didn't give them money; they're a little more personal these days. Democrats don't always go after tech companies successfully, nor do they go after them evenly. And depending on how different cases in regulatory agencies proceed, it's never quite so cut and dried. On the general pullback of government from tech, with social media and that whole relationship, I do think it's important to note that search companies don't derank conservatives because they post conservative stuff; it's because the content is generally associated with misinformation, to some 10 to 40 percent degree depending on the site. I just made that up; that's not a real statistic. But overall, the biggest thing that concerns me about that pullback is the failure to keep relationships strong.
So whether it's something that you know very well, like cybersecurity, or something that I used to work in, like machine learning, that lack of understanding is going to produce really silly and ineffective regulation. If they don't know how the products are made, they don't know what the products are capable of. I was a big fan of the non-binding AI Bill of Rights that the White House put out recently, but generally it doesn't have any teeth if the pace of innovation is going to outstrip it in a year or two, or less, knowing machine learning these days. So whether it's social media, cybersecurity, or machine learning, I think that pullback doesn't have as much implication for my small subfield, or for the extent to which the American government is going to be willing to regulate its big tech companies, as for whether it will be able to do so effectively. I really do worry about that breakdown in relationship, let alone if you wanted to have some type of meaningful relationship for defense or for gov tech; all of that interstitial tissue matters. It shouldn't be overly cozy. I think any free-thinking American citizen should be concerned if it gets too cozy, because that also produces bad regulation. So you have to have something in the middle.
Isabella Royo
I want to pull on two threads that you presented in that answer. The first one: you hinted at the differences between Democrats and Republicans in their relationships with big tech. This is an election year, and we just saw the change, Biden stepping down and Kamala Harris taking over as the Democratic nominee. So I was wondering if you could tell us what it looks like if either Trump or Harris actually wins. What can we expect to be different or to remain the same?
Lucas Bantonis
To re-pull on the part of it that I introduced: there's a big expectation that if a Democrat wins, the tech companies are in trouble, that they're not going to be able to do anything. I don't know, at least right now, whether we have a chance of seeing toothy, strong, and sensible national legislation from Congress on either privacy or machine learning, even if Kamala Harris does win, because the companies are in the driver's seat. I don't know how to emphasize that with an exclamation point enough. There are definitely admirable efforts, whether from regulatory bodies or NIST or the like; there's good-faith effort, and there are lots of interesting policies being kicked around in D.C. that are meaningful and include data privacy as one of their precepts, and I love that. But the top four or five AI companies have already scraped practically everything; they're going to run out of data in ten years or something like that. So I don't know how you meaningfully eat at that advantage. Especially since there are other things that a Democratic administration would have to contend with, whether it's immigration, security, two-plus active, really big war zones, buckets of human rights issues, all kinds of things that will pull against the narrative they want to craft, that they're not pulling away from the world, that they're rejecting Republican isolationism. And on top of all that, you're supposed to regulate really fast, really powerful, really wealthy machine learning companies, and also protect user privacy and fight the encryption fight the way Democrats should. That seems like a quagmire, to be honest. I think they're going to have to pick, and if they do win, I wish them luck.
For Republicans, it's also equally interesting. I mentioned that some of the rejection of tech regulation, or the going after specific tech companies, is personal, down to whether they're being deranked. It's more nuanced than that. Republicans overall are pretty famous for engaging meaningfully in culture wars, and certain companies are now conservatively aligned; look at some of the recent backers who came out to give money to the Trump campaign. They are no less jazzed about AI. I framed it as Democrats being the ones who will regulate machine learning, but both political parties love that stuff. Conventionally, I think Democrats have done a better job of safeguarding the worker; many Democrats are pro-union, or they're after worker protections and privacy protections. For Republicans, you'll see a lot of fights that revolve around the culture war, whether it's search ranking, AI and art, AI and UGC, AI on social media, and data privacy for all of those same things, but data privacy in a weird cross-section. They care a lot about data privacy when somebody gets called out for posting a bunch of stuff; nobody wants to get canceled for posting on social media about gun rights or for being part of a conservative right-wing site. So they do care about privacy; it's just a complicated thing to explain. Generally, though, they are very happy to bring the fight home. They are full tilt: Trump and Vance are en route to take the country back 200 years to Andrew Jackson.
They want to focus on domestic issues, and I think that naturally means tech regulation, the billionaires that support them, culture war stuff, gun rights, immigration, misinformation on social media. Whether it's meaningfully different for me as an employee in this space, I couldn't tell you; I can't predict that. I just know that they're very committed to bringing things domestic, and that's kind of what I see for them.
Isabella Royo
And the final thread that you've mentioned several times is machine learning and artificial intelligence. I don't think we can get through a tech podcast nowadays without talking about it. We saw that Meta had to halt its use of AI in Brazil because of the data protection authority's ban. Do you think there is a trend toward preventing artificial intelligence deployment, or not? What's going to happen with all of that? How do you see it from where you sit?
Lucas Bantonis
I admire it when a government believes it can act. It reminds me of the European withdrawal of some of the key, powerful features of the models due to data protection requirements. But nothing is stopping this train. The people, the processes, the tech, the companies that I've engaged with, and I'd say I've been a loose adherent and follower of the ML community for two and a half years now, because honestly, when we were at Fletcher, I read about it, I was covering it, I was excited, I wrote a little paper about it. There is this headlong fascination with birthing superintelligence. There are adherents all throughout Silicon Valley; there are billionaires, there are politicians that just want to see this thing rocket. And I don't know why; it's like they've never seen The Matrix or any movie about this stuff. But they will strip away the capacity of most modern governments to contend with how powerful that crap is, just by continuing to scrape data, generating synthetic data to replace it, signing deals with publishers, and so on. I'm painting a bit of an apocalyptic picture. I think that overall it's really good when governments, whether it's Brazil, the U.S., or Europe, try to meaningfully snarl what I would consider crappy and mildly to moderately disrespectful AI practices in development, because that's how you tease apart a good baseline and restrictions for regulation. Not too dissimilar from cybersecurity, maybe with a little more juice, a little more power. That being said, it's not going to change my job, because if anything, more ML means more data and more data types. So that's all I'll say about that.
Also, it's not going to change that headlong fascination; that's really the only way to describe it. People just cannot get enough of this concept. Even the safety people, even the people who work in safety and protection, are getting fired or removed because they're raising concerns about data, normal things like data training practices, let alone anything scary like whether this thing is sentient. So I don't think it's going to stop anything. It's going to keep going, and it's going to be up to data protection professionals like us to give a crap about putting in the guardrails, because otherwise the development is going to continue unimpeded. And if you were to stop deployment in one country, or stop deployment of a feature, it would just go unimpeded in another. That's how it's going to keep going.
Isabella Royo
Lukas, if there are any last thoughts that you want to leave us with, or something that you wish we'd covered but didn't, you have the floor.
Lucas Bantonis
Yeah. I tend to be a rambler and kind of pessimistic about most tech, despite the fact that it is generally my livelihood. I think that whether it's law enforcement response, data protection, privacy, cybersecurity, or machine learning, there are a lot of really good professionals, with varying degrees of investment in the hype cycle, doing a damn good job every day making sure that users are protected in the best way they are currently able to, making a lot of decisions about frontline protections. Again, in any of the fields we've discussed on this episode, we all apply pressure; we have a responsibility to apply pressure as data protection professionals, or whatever bucket you'd want to group all this stuff into. When the news says it's a tough time out there for data protection, it is, partially because of machine learning, partially because of a vague slide towards populism and authoritarianism in many parts of the world. It doesn't mean we should stop trying. I think it's super important to encourage data protection folks to keep their heads up and not submit to the pessimism and some of the rancor that these trends produce in the industry, because it's important to do our jobs.
Isabella Royo
Okay, that is actually a cheerier note to end on. So Lucas, thank you so much for joining me. This was a great conversation.
Lucas Bantonis
Likewise, thanks for having me.
Isabella Royo
The Lawfare Podcast is produced in cooperation with the Brookings Institution. You can get ad-free versions of this and other Lawfare podcasts by becoming a Lawfare material supporter through our website, lawfaremedia.org/support. You'll also get access to special events and other content available only to our supporters. Please rate and review us wherever you get your podcasts. Look out for our other podcasts, including Rational Security, Chatter, Allies, and The Aftermath, our latest Lawfare Presents podcast series on the government's response to January 6th. Check out our written work at lawfaremedia.org. The podcast is edited by Jen Patja, and our audio engineer for this episode was Noam Osband of Goat Rodeo. Our theme song is from Alibi Music. As always, thank you for listening.
Date: November 16, 2025 (archived episode from August 9, 2024)
Host: Eugenia Daugherty (Lawfare Fellow, Technology Policy and Law)
Guest: Lukas Bundonis (Senior Privacy Engineer, Netflix; former Army Reserve Intelligence Officer)
This episode takes a deep dive into the evolving and complex relationship between law enforcement agencies and big tech companies. Through a candid conversation with Lukas Bundonis—a senior privacy engineer with a military intelligence background—the discussion explores global differences in law enforcement data requests, how political climates and new legislation impact tech/legal operations, the persistent tug-of-war between privacy and surveillance, and the challenges posed by artificial intelligence and machine learning advancements. Importantly, the episode unpacks how tech companies are both intermediaries and gatekeepers, striving to protect user privacy while fulfilling legal obligations.
Bridging tech and law:
Path into the field:
Data access tension:
Rejecting government requests:
Different responses for different countries:
Physical presence & government leverage:
Child safety legislation:
FISA Section 702 renewal:
Politicized regulatory atmosphere:
Election impacts:
On the current state of data protection:
“When the news says it's a tough time out there for data protection, it is partially because of machine learning, partially because of a vague slide towards populism and authoritarianism in many parts of the world. It doesn't mean we should stop trying.”
– Lukas Bundonis (04:40, repeated at 51:58)
On policy-technology disconnect:
“It's an interesting example of that disconnect or the lack of a bridge between policymakers and actual technologists because sometimes that technology doesn't exist.”
– Isabella Royo (35:48)
On the global regulatory arms race:
“Even the people that work in safety and protection, they're getting fired or removed because they're getting concerned about data...It's going to be up to data protection professionals like us to give a crap about putting in the guardrails, because otherwise the development's going to continue unimpeded.”
– Lukas Bundonis (48:45)
This episode offers a nuanced, inside look at how global tech companies contend with governmental pressure, legal obligations, and evolving technological realities like AI. The discussion is frank about the rising challenges—authoritarian clampdowns, regulatory ambiguity, impossible legislative demands on encryption, and the overwhelming pace of AI. Bundonis’s firsthand insights highlight both the persistent risks and admirable determination of privacy professionals to keep defending user rights in an increasingly complex world.
For more in-depth analysis and further reading, visit Lawfare’s website: www.lawfareblog.com