
Podcast Host
The following podcast contains advertising. To access an ad-free version of the Lawfare Podcast, become a material supporter of Lawfare at patreon.com/lawfare. That's patreon.com/lawfare. Also check out Lawfare's other podcast offerings: Rational Security, Chatter, Lawfare No Bull, and the Aftermath.
Marianne Franks
And so, inadvertently, I assume, this bill actually wouldn't apply to deepfakes. That's one. So that seems like a really big problem for under-inclusivity. And it does apply to all kinds of non-consensual visual depictions that are not, in fact, what we would define as NCII according to the rest of the criminal statutes. So those seem like really, really big problems.
Renee DiResta
It's the Lawfare Podcast. I'm Renee DiResta, contributing editor here at Lawfare and associate research professor at Georgetown University's McCourt School of Public Policy. With us are Marianne Franks, President and Legislative and Technology Policy Director of the Cyber Civil Rights Initiative and professor at George Washington Law School; Becca Barnum, Deputy Director of the Free Expression Project at the Center for Democracy and Technology; and Adam Connor, Vice President of Technology Policy at the Center for American Progress.
Becca Barnum
Something I'm concerned about, and I think it's worth paying attention to, is this model getting exported to other kinds of content where there might not be as much unanimous agreement that it deserves to be taken down, and sort of replicating this across wider swathes of content that are even harder than NCII to identify.
Renee DiResta
I'm here today with three distinguished guests to talk about the Take It Down Act. The act is sitting on President Trump's desk and he's expected to sign it, so it's important to understand what it does. This is the bill that penalizes non-consensual intimate imagery at the federal level, which nearly everyone agrees is a good thing. But it also has some provisions requiring that platforms take it down, which even some strong supporters of NCII penalization are concerned could lead to censorship or over-enforcement. Let's start by helping the listener understand the various provisions of the bill. Take It Down is, as with many things in Congress, an acronym, so I'm going to read this: the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act. It generally prohibits the non-consensual online publication of intimate visual depictions of individuals, both authentic and computer generated, so touching on AI, sometimes what's known as deepfake content, and requires certain online platforms to promptly remove such depictions upon receiving notice of their existence. And I guess we've got sort of four key aspects to this bill. We have criminalization of non-consensual intimate imagery, making it a federal crime to knowingly publish or threaten to publish intimate images without the subject's consent, again, both authentic and AI-generated deepfakes. We have platform responsibilities that are now attached to this: a requirement to remove reported non-consensual intimate imagery within 48 hours of notification by the person depicted or a representative, and also, I believe, steps required to eliminate duplicates of the content. There are now criminal penalties, including fines and imprisonment. And then there is an enforcement component where the FTC is tasked with enforcing the act, treating violations of, I believe, that notice and takedown process as deceptive trade practices.
So a whole lot of stuff in there. And Marianne, as you mentioned, these are things that people have tried to implement, I think, at the state level, and now this is the first time we've seen it brought up to the federal level. Maybe let's start with part one, I guess, the criminalization component. I don't know if you want to talk a little bit about what that is doing, the comparison between what this does at the federal level versus the very patchwork approach the states had taken previously.
Marianne Franks
Yeah. So when we started, and I say we, when the Cyber Civil Rights Initiative started calling for legislative reform back in 2013, there were only three states that criminalized image-based sexual abuse. There are now 49. If you're wondering who the holdout is, it's South Carolina. And there are lots of different definitions, lots of different penalties and categorizations. So one of the reasons why we have been so insistent on pushing for a federal criminal law in addition to these state laws is that you really do want a uniform definition of the offense. You want the weight, the deterrence effect, of something being a federal crime. And so we were thrilled, to some extent, to see that this was getting pickup, because the language that is folded into Take It Down is language that was known as a standalone bill called the SHIELD Act for the last few years. Before that it was called a few other things; the first version was called the Intimate Privacy Protection Act. And we were pretty happy with that criminal provision when it comes to authentic depictions. We had also worked with multiple members of Congress on a deepfakes provision, because our view is that it's a slightly different matter and it really needs to be phrased somewhat differently. Roughly speaking, both of those sections have made it into Take It Down, and they're basically good. I do have a couple of reservations. One of them is that there's a massive exception written into the bill that says these criminal penalties don't apply if the person distributing or possessing the image is the person in the image. Now, what I think they were trying to say is that if you have a selfie of yourself and you distribute it, you won't be criminalized. But that's not what the exception says. It basically applies so long as that person is in the image at all; it's not limited to a depiction of themselves alone.
So I think that's a really terrible loophole, and it was something we raised repeatedly with the sponsors, and for some reason it did not get addressed. So I think that's a bad thing. But apart from that, I'd say the criminal provisions are pretty good. They're better than most of the state provisions. They're very narrow, very specific, very clear. So I think that part of it is the win we want to celebrate, because it is something we have needed for over a decade, and it is good that we finally have that criminal provision.
Adam Connor
Marianne, can I just ask a quick question about this? Obviously, definitions matter in law, and I know non-consensual intimate imagery is the term we should be using there. Is there a preferred term for what the law calls a digital forgery, or a deepfake? Is there a preferred term we should be using as we discuss this bill?
Marianne Franks
Thanks for that question. That is something I was quite gratified to see in the bill: they did adopt the language we have suggested, which is sexually explicit digital forgery as opposed to deepfake. I've never really been comfortable with that term, given that it's not a real term and it's named after a user who kind of created this problem. So that's not great. I do think digital forgery gets closer to what we're trying to describe. So we tend to say sexually explicit digital forgery, and/or inauthentic or synthetic non-consensually distributed intimate imagery, which is a bit of a mouthful.
Adam Connor
Thank you.
Renee DiResta
Why do you think they chose to pursue this legislation now? What do you think was the motivation for the momentum we just saw here?
Marianne Franks
This is something that gives me pause, because, as I mentioned, it's been a really long time since we first started calling for this bill. We've had a bill, a better one, I would say, going back to 2016, introduced almost every single year since then. And almost every single year it got fairly close to passage. And then there were objections, sometimes from the ACLU and other civil liberties groups, and sometimes from Republicans. Notably, when we were trying to promote deepfakes legislation, the pushback was mostly coming from Republicans saying, this sounds like you're trying to crack down on misinformation, and we don't want that.
Renee DiResta
Yes, this is one of the reasons why I'm asking. Yes, yes.
Marianne Franks
And so when this momentum started to pick up last year, we were a bit concerned, because we weren't convinced that this was a genuine concern about the underlying issue, which is the sexual exploitation of the people being targeted, but that instead this was becoming politicized in a way that made us uneasy. And I think that is a big part of why it got the steam that it did. Which is not to say that the sponsors and the people who are supporting it and voting for it are necessarily acting in bad faith. I don't think that's true. I think they are trying to do the right thing. But I think the fact that it suddenly became a consensus, notably last year, was a little bit strange. And it's important to note, too, that this bill got into the continuing resolution and got stripped out basically on Elon Musk's orders. So what has changed between December and now is also a question. And the fact that the first lady has decided to take this on as her issue, I think, raises some questions. So we do want to wonder: after having had better versions of this bill that were much more narrow, that did not include that takedown provision, why was there never a chance, realistically, of getting those passed? And yet now that you have that, combined with all of these other things that will at least potentially become a weapon to be used in a political or ideological way, why was this the vehicle that people suddenly got excited about?
Becca Barnum
I think that's exactly right. And I think it's important to situate this bill in the broader tech regulation landscape and the fact that Congress has been struggling for years to come up with narrowly tailored, constitutionally defensible ways to regulate platforms and address very real online harms. What's gratifying to me about this bill is that Congress is being responsive to a very real problem, but the way they've gone about it, we'll talk about more when we get to the takedown section, because it's less relevant for the criminal provisions. But I find it very interesting that this bill got over the finish line where other ones have failed. So, for example, in the last Congress, the bill on everyone's mind was the Kids Online Safety Act, which would require platforms to pretty fundamentally restructure the way they operate, and from CDT's view would have similar, although not exactly the same, effects on people's speech online. And one way to think of how these bills are different: there are very real harms for kids online, and NCII is obviously a very real harm for kids and adults. But the difference between the two, or at least one of them, is that the Take It Down Act, as it relates to the takedown provisions, is probably a lot easier and cheaper to implement than the Kids Online Safety Act and others might be. I do wonder if that's why we saw the tech industry itself throw its weight behind this one rather than some other tech regulation bills that we've seen. Marianne, you've described this as a bittersweet moment, and that really resonated with me. It's sweet because it's a recognition that this is a real harm, and the culmination of well over a decade of civil rights advocates trying to bring attention to this issue. So it's a sweet moment for that to get over the finish line.
But it's a little bitter in that that work is culminating and coming to fruition along with a takedown mechanism that's going to have some pretty serious implications for user speech.
Renee DiResta
And we did have one other guest who I was hoping to have on who was very supportive, so maybe I'll play the role of advocating for it a bit, just to get that viewpoint out. I mean, there are several hundred orgs on this list of civil society orgs that did support it, including, for example, NCMEC and some of the other child safety orgs. I guess one of the obvious questions is whether people are reflexively responding to it in part because Melania Trump picked it up. We can talk about the notification provisions and the concerns about over-enforcement and platforms in a second, but how do we respond to that concern?
Adam Connor
I'll answer that too. But there are two points I think are worth highlighting. One is the context in which the bill made it through. This is work that really picked up steam last Congress, and you have to remember that last Congress the Senate was controlled by Democrats, and Ted Cruz was the ranking member, now chair, of Senate Commerce. So it is something that started from a more bipartisan place. Elon Musk and Donald Trump did strip it out of the CR, but that year they stripped everything out, so it was not that they targeted just this bill. There was a ton of stuff, cancer research and other things, in that CR that they sank; this was a consequence of that, not a decision to take just this out of the bill. I think that's helpful context. I do think that helps explain on some level the ability of this bill to move quickly: it looked like it could become a law in a split Congress, so obviously in a unified Congress, just from a pure dynamics point of view and the simple mechanics, I think that is a piece of it. And the other is that advocates, and I know Becca and Marianne have been working on this issue for years, this is a hugely serious issue. I think bittersweet is certainly a reasonable way to describe it. But these are advocates whose stories, when you hear them, are very moving, and I think it is a very real problem. And Congress sometimes feels bad when it can't solve problems. I think it can be hard for members to hear constantly that every solution they have is mixed, and I think that can sometimes be a very difficult place when they're weighing these kinds of pros and cons.
I will say that part of the way this moved so quickly, to take Becca's example of KOSA, is that it is narrower than KOSA in that sense. It addresses a piece of a problem, not a broad problem, which I think contributes in a less nefarious sense to some of this. Which isn't to say there isn't a danger of it being abused by others. But there are, I think, benign explanations for why this became a law, which we sometimes forget in a world where most things don't become laws because Congress rarely legislates anymore. This has a series of more benign legislative factors that can also help explain it.
Marianne Franks
I just want to add, though, for historical context, that when we presented SHIELD or the Intimate Privacy Protection Act over multiple years, again, much narrower bills, the complaints were that this is too broad, that you shouldn't criminalize this at all, that this is going to ruin free speech. And we're talking about a provision that was just about narrowly prohibiting authentic visual depictions disclosed without consent. There was always pushback from the left and from the right, and it didn't matter that there were compelling stories. These victims have been speaking out, and it was much harder to speak out 10 years ago than it is now. And all the complaints about how, oh no, this is going to be censorship were just taken seriously by people who probably should have known better. So I don't find that to be a compelling explanation for why now, because we had all of those factors in place before, in a much better, more constitutional, much more narrow bill. And suddenly, people who had been criticizing criminalization, criticizing the chilling effects or what have you, are now supporting a bill that is much, much broader than anything we ever proposed in the past, that has a takedown provision with direct and serious implications for the way tech platforms operate, and that is so prone to arbitrary enforcement. I find that very suspicious.
Renee DiResta
Let's talk about the enforcement provisions and help the listener understand what they are. So we have a requirement to remove content within 48 hours upon notification by the victim or a representative of the person depicted, I believe. I hope I got that language correct. Becca, do you want to describe the provision a little?
Becca Barnum
Sure. So within a year of the president signing the bill, covered platforms, which is a definition we should chat a little bit about, will be required to set up a notice and takedown system where either victims themselves or third parties acting on behalf of victims will be able to submit a complaint requesting that images and any known copies be taken down within 48 hours. We talked a little bit about hoping to have some advocates for these provisions on the show, and I'm going to step into that role, because the notice and takedown provision itself is something that CDT, and I on a personal level, really wanted to get to a place where we could actively support, because it's such an important tool for users to have. A lot of platforms have this right now, where you can request removal. We also know that platforms don't always respond in a timely manner, and there's not always great transparency into whether and when they're going to take things down. And so the idea of empowering people right then and there, when they find an image of themselves or someone lets them know, to just say, hey, no, this is me, I'm taking back control over my image and likeness, and I want this taken down now, and having platforms have to respond to that, is extraordinarily powerful. From CDT's view, it gets tricky as it relates to the definitions and the ambiguity. And I think the best way to think about it is this: I don't have any concerns with NCII coming down. None whatsoever. If the provision operates as intended, where it applies to non-consensual imagery and that imagery has to come down, I think most people would agree that's a really good thing and an important tool for people to have.
It's really the effects on everything that's not non consensual imagery that CDT has concerns with. And unfortunately, the way the bill's drafted, it's ripe for sweeping in a lot of speech that the authors really didn't intend to sweep in. And I think it's going to have some pretty negative effects for users.
Adam Connor
I think just one small point there, and Becca, feel free to correct me if I'm wrong. One thing that is interesting about this bill, in the broad context, is that a lot of the rhetoric, particularly from Republicans in Congress, is very anti-big-tech right now, and understandably so. What is interesting about this bill is that it is very broad.
Becca Barnum
Right.
Adam Connor
It does not just target large social media platforms. It targets basically the whole Internet, with a few exceptions, which I think Becca and others can talk about in a second. And it's not that this isn't a problem on major social media platforms; it is, but it's generally a problem they have some existing mechanisms for. So it's interesting that targeting big tech was not necessarily the primary driving motivator here, although certainly many of the big platforms are regulated by it and many ended up supporting the bill. But I think it is interesting, and it's why I think there are speech concerns: it is much broader, sweeping in fairly large parts of the Internet, really all of it, with some limited exceptions.
Renee DiResta
So just to highlight what you said there: several of the big tech platforms did support it. And the comparison a lot of people make is to the Digital Millennium Copyright Act notice and takedown system, which has existed for a long time. Companies do comply, and there's a lot of concern about over-compliance, about overuse of the DMCA to take down speech that should not be taken down, the abuse that goes into the DMCA system. Again, I am not a lawyer nor a DMCA compliance officer; Adam, maybe you want to weigh in on this, since you were at a platform. But there's the extent to which entities will file, and companies will default to taking content down immediately to avoid being held liable. Sometimes things will go back up if the party whose content was removed files a counter-notice. But there is this ability to abuse the DMCA to take down speech and content that you do not like. So that is one of the things that has been raised as an objection to this particular notice and takedown provision.
Adam Connor
Yeah, I'll just say, when we were setting up content moderation at Facebook, there was a bucket of all these really hard problems, and we hired somebody to deal with DMCA because it's the law. Technology has obviously played a bigger role in that over time. I would say two things. I'm not a DMCA expert, but I think there's a wide body of evidence on its shortcomings, and also maybe it's better to have something than nothing; I will defer to others on that. In this context, for this bill, the critical aspect as it relates to implementation is that there are no appeal processes or any ability to contest a takedown. And I think that is part of the concern. And again, Marianne and Becca can correct me if I'm wrong, that's not necessarily a constitutional concern, although maybe it's an additional one; it is more an implementation and speech concern, particularly if your content is swept in incorrectly.
Marianne Franks
Yeah. I'd want to add that I'm not a DMCA expert either, but one thing I do know about the DMCA process is that you have to attest, on penalty of perjury, that you have a good faith belief that what you are alleging is true. Now, there is a good faith requirement mentioned in Take It Down, but there's no perjury attestation. The DMCA says you will be subject to liability if you make knowingly false statements about whether you are in fact entitled or authorized to ask for the removal, or if you knowingly provide misinformation about whether the material is in fact violative. There's nothing like that in the provision here. So literally any complaint that alleges a non-consensual visual depiction would supposedly have to be investigated fully, and one of the problems we have with this is that it seems like it would just unleash a torrent of bad faith complaints. As Becca was saying, if this bill actually did what it thinks it's doing, laser-focused on actual NCII that comes straight down, that would be amazing. But I don't see how you get that from this. And that's not only bad because of what it sweeps in; if that is what platforms are having to deal with on a daily basis, are they ever going to get to the actual NCII complaints, and how are they supposed to sift the bad faith complaints from the genuine ones? So I'm worried about it actually being counterproductive, not just overly broad. And then the other thing worth pointing out is that the definition of what is supposed to come down does not match the criminal definition of what is unlawful. And that seems like a really big problem. So you have this incredibly narrow criminal definition, rightly so, really, based on the statutory language we've been promulgating for some time to comport with the First Amendment. And then in the takedown provision, you just have this general term, non-consensual visual depiction, with none of the exceptions, none of the restrictions; it basically just says something that is sexually explicit and allegedly non-consensual. And that is really, really broad, but also under-inclusive, as I've mentioned in some of the analyses I've done. It's interesting that the bill says this is a deepfakes bill, that it's going to take down deepfakes. The term intimate visual depiction comes from an existing federal civil statute, and that statute refers to authentic images. And so, inadvertently, I assume, this bill actually wouldn't apply to deepfakes. That's one. So that seems like a really big problem for under-inclusivity. And it does apply to all kinds of non-consensual visual depictions that are not, in fact, what we would define as NCII according to the rest of the criminal statute. So those seem like really, really big problems.
Becca Barnum
That all sounds right to me. And part of what we think about here at CDT is the reality of how these bills get implemented. So we look at the statutory text and agree entirely with what has been said about the shortcomings and ambiguities there. But we also know what happens when things like this are implemented, particularly given the statute's requirement that these images come down within 48 hours. There are different approaches to intermediary liability, but there's nothing in the bill that requires platforms to implement these provisions competently. Right? We hope they will; we hope they will take care to investigate and ensure that the things submitted are actually NCII, or even actually include nude imagery, right, and aren't just unflattering pictures I happen to find of myself on the Internet. We hope they will do that. But there's no requirement that platforms do it. So the law is lacking in that way. And then just think about the economic incentives platforms have: if the choice is between risking FTC enforcement or just taking down a piece of content a user posts, that individual piece of content doesn't really make a platform all that much money. Every individual piece of content isn't particularly important to platforms, even if they do want to facilitate speech. So when you implement these broad liability requirements and a really strict deadline, given tough choices and resource constraints on content moderation, we can expect that it might not be implemented in the best way possible, which will really exacerbate the ambiguities in the law.
Adam Connor
I think just two small points. One, to Marianne's original point, right, there's no perjury component or anything like it there. And my understanding, at least from having worked in this space, and others might have a better sense here, is that that attestation is the kind of barrier to reporting NCII or other horrifying or embarrassing imagery; it's often a balancing act, I know, for platforms and others as they try to weigh that. And I think it's different if you lower those barriers voluntarily versus having to have something in law. But I do think, and please correct me if I'm wrong, that's a balanced equation: if you read the bill text, the requirements for identifying content are fairly minimal, and likely will be implemented with checkboxes. You've got to attest to it, you've got to identify the piece of content, which is obviously necessary, and give a brief statement about it, but there's no strong additional detail required. It'll be interesting to see in the implementation whether the FTC gives any sort of guidance. It's certainly not required in the bill in any way, shape, or form, but it's certainly something they could do, and one hopes they might consider it as they implement. And then the other thing I'll say, to Becca's point about the potential abuses: there's a kind of melding of potential abuses as I understand the concerns, and it's worth saying that they come together but are also separate. Will people act in bad faith and just lie on the form? Will platforms bother to check anything, whether the complaints are genuine or bad faith?
Will they even check the content involved? And then, will somebody flagrantly abuse it, as I think has been suggested? Those all kind of meld together, but they are different, separate risk vectors in terms of implementation. And you could see, for instance, some platforms saying, we're small, right, we're just going to take everything down, whereas larger platforms might handle it using existing systems. And what's probably most likely true here is that the kind of content most vulnerable first is content adjacent to NCII, right? So that would be voluntary intimate imagery, or pornography, or things that are legal. And so you could also imagine a company specializing in those things putting more effort into it than other platforms. As I mentioned, the benefit and the downside of this bill is its broad applicability, and that is obviously why Becca and others have concerns.
Renee DiResta
Well, I think for a brief, shining moment in time there was some transparency around DMCA takedowns too, right? There was the Lumen database, so outsiders could go and see to what extent abuse of the DMCA was happening. Recall that this was actually how people saw that X was taking down content in response to requests from the government of India; and then, as soon as that article was written, X stopped contributing to the Lumen database. But I know there have been some concerns about FTC enforcement as the body has come to feel more political with the firings of some of the Democratic-appointed FTC commissioners. So there's the question of who is the best entity to handle enforcement of platform compliance. I'm curious what you all think about that decision: whether it would have been better to have some sort of specialized privacy agency, or, for the child safety component, maybe a child safety agency or some other entity handle that piece of it.
Marianne Franks
It's such a tough one, because it's hard to know which entity would actually have the competence for this, especially if we're talking about the FTC. There are two problems, I think, in this current moment that we would need to worry about with the FTC. One is over-politicization and weaponization for political purposes. The other is the stripping away of any kind of budget or resources to do actual good work. So there's an ineffectuality concern on the one hand, and an overly-effective-but-badly-politicized concern on the other, not only with the purported firings of the Democratic commissioners, but also with the head of the FTC openly saying, I want to go after tech companies that Trump doesn't like.
Becca Barnum
Right.
Marianne Franks
I mean, that's effectively what he's saying. And that big tech, little tech divide is a concerning one for me, because so many of the platforms that are the worst offenders when it comes to distribution networks of intimate imagery are little tech. Right? So this kind of rhetoric about how we need to go after the big shots, or what have you, is very distracting. And on top of all this, you have this complete entanglement that Trump has given us with the appointment of Elon Musk, in whatever weird role he's in. Realistically, if you're X, do you think you're worried about the FTC coming after you, given that Elon Musk is Trump's right hand man at the moment? Right? Who actually has to worry about being investigated by the FTC? I think the implications of this are pretty unnerving. And one of the additional reasons I'm worried about this is this very odd provision. Normally the FTC, when it does its unfair and deceptive practices investigations and exercises its powers, is limited to investigating commercial entities that are operating for profit. That makes sense; it's the Federal Trade Commission, that's their job. But this provision in TAKE IT DOWN says, oh, we are not going to abide by that limitation, we can go after nonprofits as well. And I just have to wonder what that's actually about. Because again, we're a victim- and survivor-centered organization; we are very familiar with the landscape of who the worst-offending distributors are. I can't think of a nonprofit that is on that list. So this seems like a very odd place to do that kind of expansion of jurisdiction.
Renee DiResta
How should we think about the enforcement piece in particular? This is one area where, I would think, we should see some improvement on the false positives front; platforms love to tout their sophisticated AI enforcement here. Adam, I don't know if you want to weigh in on this one. Do we think the 48 hour deadline is going to lead to the proliferation of false positives that people are concerned about? Or do we think it will be better, you know, because the platforms support it? I'm curious where you think we're going to come down on that.
Adam Connor
Yeah. So I think two things. One, just to finish off Marianne's point about the FTC: there is certainly very significant concern with the attempted firing of commissioners, not only at the FTC but in other places, and that implicates all sorts of things. It's really important to note that a critical part of the tech policy congressional landscape, and there are very few successful tech bills that have made it into law, has been that it was a relatively safe, bipartisan agreement point that you could do enforcement through the FTC for a variety of things. And that's just because, as an independent agency, in theory with bipartisan commissioners, it was a place that kind of wasn't the fight. Unfortunately, I think the longer-term impact of these illegal firings of independent commissioners by President Trump is that it's really going to damage the FTC as a place that could enforce privacy laws and the various other laws we have. There are any number of bills out there, some of which these groups would support, some of which we'd oppose, that looked to it as a relatively safe and competent place to administer these things. And that's real lasting damage. This may be the last time we see significant bipartisan support for a bill that adds FTC enforcement powers, just because of the politicization of the agency and of independent agencies writ large, and obviously because of the concerns raised by Chairman Ferguson agreeing with the president, as a condition of his employment, that this is no longer an independent agency and he will not operate it like one. His remarks the other day, that they will sternly continue to prosecute the Facebook case up until the moment Donald Trump tells them to stop, are, I think, a good indication.
More broadly, as you get into the questions of over-enforcement, there are significant concerns about abuse there and other pieces, and this rhetoric also puts the FTC and platforms in a weird place. Because obviously we have seen from Chair Carr, and from the president, both in his first term with his EO attacking speech and in his current attacks on the First Amendment and free speech, a desire to weaponize this to take down speech they don't like. Right? But that gets difficult as you write broader rules. You can imagine in this case, right, people weaponizing TAKE IT DOWN, as folks worry about, to flood platforms with false reports. Those aren't going to be limited to one side of the ideological spectrum if the law ends up targeting content beyond intimate depictions. And you get to a weird place where, if platforms start being overly broad and taking down all that content, and then, oh, a lot of that content is pro-Trump content, for instance, that's a vulnerability now for platforms; that's something the FTC and Trump might be mad about. So platforms, which by the law might not have to care too much about the content, might have to start sorting through it more closely, because they don't want to be yelled at for taking down too much Trump content. Will they tilt one way or another? That's obviously the question. But it adds an extra layer of complication that you don't find in the law itself but that you see in the operating reality platforms are working in in the Trump era. And as it gets implemented, the companies most positioned to comply with this law, I think, are the broad social media platforms that have these existing systems, which is why you see them feeling relatively confident about it.
A lot of them, for instance, Facebook is a good example, right? Facebook doesn't allow intimate imagery under its rules anyway, so that content is coming down regardless of this law. The broader question for the 48 hour requirement is with smaller platforms, or platforms that might not have the resources or might not care. That, I think, is where you'll see some really interesting questions.
Becca Barnum
I guess.
Renee DiResta
While we have a few minutes left, I want to get at the question of whether this is overall net beneficial. It sounds like there's fairly widespread agreement that the victim-centered criminal provisions are on fairly firm ground, that they fill a genuine federal gap. The notice and takedown provisions are where the potential chilling effect, over-enforcement, or First Amendment litigation is likely to land; that's maybe where we'll see some litigation, rulemaking, and court challenges, though not so dire that the whole law collapses. I'm curious what y'all think about that. I've been curious about the balance between the sort of slippery slope concerns I've seen expressed by civil libertarians and some tech policy folks, versus, again, the overwhelming support and the real harms this is attempting to backstop. My personal feeling, my personal bias, is that it is overall net beneficial. So I'm curious where you all come down on that.
Becca Barnum
So this bill is a bit of a heartbreaker for me, because I really was excited for Congress to have the opportunity to weigh in on this issue, which is very worthy of attention and response, and to empower people with a tool they can use to try to stop the proliferation of these images online. What I'm turning to now is trying, to the extent possible, to make this a net benefit. Right? Working with platforms to minimize opportunities for abuse, and working to advocate for interpretations of the law in its current form that will be user-protective and privacy-protective. Obviously, as one of those civil libertarians myself, I have a tendency to focus on the problems, but there are benefits to this bill that I want to see accrue to users and to victims themselves. And I think there is a world in which Congress could have passed a constitutional and privacy-protective takedown mechanism to help victims. I am really looking forward to helping the FTC and the courts interpret this bill in a way that is consistent with that vision. Because stepping back from this bill, it is absolutely a net positive for Congress to be addressing these issues, for victims to be able to respond, and for us to commit to reducing the prevalence of image-based sexual abuse. So I'm excited to make it a net benefit, even if in its current form it might not be one.
Adam Connor
Becca, Marianne, could you also, because I think we skipped over this, just highlight what you think is the strongest unconstitutional aspect in these cases? Because it's not quite the implementation of the enforcement; it's the broader concept, right, of the takedown provision.
Marianne Franks
I mean, yes, I think there are straightforward overbreadth problems under the First Amendment for the takedown provision, because it's incredibly broad, it's vague, and it's, as I mentioned, over- and under-inclusive. It's difficult for any entity to know how to conform its behavior to this; the dictates are just not clear. So I think there is a very key overbreadth challenge. An underbreadth challenge is maybe weaker, but the overbreadth for sure, and the vagueness. So I do think, just straightforwardly, the takedown provision raises some very serious constitutional problems.
Renee DiResta
Do you think the platforms supported it assuming that court challenges would significantly narrow it, and that this was more a matter of optics?
Marianne Franks
I do wonder a little bit about that. I think what we're going to see fairly soon, I mean, for instance, I could imagine the bill gets signed and you immediately see a coalition of tech groups asking for an injunction against the takedown provision. And I think that would be a good thing; I actually hope that's what happens. Whether that's why they supported it, or whether they're just living in the reality that is: look, not only is this a Trump administration dictate, it's also one where, from every other side, people have been saying for a long time, why aren't you doing something about this issue? Right? Who wants to be the platform that says, in this environment, that they don't care about revenge porn or deepfakes? Five years ago they didn't care, and ten years ago they definitely didn't care. Now, I guess, they do, at least up to a point. And now there's political pressure, and this is one win where they can say, well, sure, this is something we should probably have been doing anyway, and the fact that the Trump administration claims it cares about it is a great win; everybody's happy. Maybe that's why they did it. Or they're just confident, as I think many companies would be, that it's not going to hit them, because they know that as long as they keep Trump happy, they don't have to worry about any kind of enforcement action against them, I guess.
Renee DiResta
Let's go with one final question. In a few years, do you think this will have evolved into a model for a really serious global NCII framework, a cautionary tale of over-censorship, or just something that has no real impact?
Marianne Franks
Small question. I guess I want to do a lightning round and link this to your previous question about whether this is overall a net benefit, because I think they're related questions. Right? What is going to be the lasting impact of this bill? When I wrote that this was not just bittersweet, I characterized the takedown provision as a poison pill, and it's precisely because I support the criminal provisions so much. Obviously we do, because we have literally been calling for them for over a decade. But for them to be joined with a provision that I think is very damaging, not only unconstitutional but actually counterproductive for victims. The thing I worry about most, and I don't even know how to express my despair or my disappointment at the idea, is that what victims have been advocating and sacrificing for, including the founder of my organization, Holly Jacobs, is for this issue to be taken seriously, because it's not a political issue. It is just about individual privacy, dignity, anti-exploitation. And the idea that it could be harnessed to this kind of, to me, clearly politicized and I think bad faith provision, or at least one that I'm fairly certain is going to be used that way; that you're going to confuse this issue; that you're going to link these two things; that there are at least some actors, I think, who are using the cover of "we care about this issue" to achieve their political ends. That is the thing that makes me extremely upset on behalf of the survivors who have really fought for this, for the criminalization. So for me, the criminal provisions, if we could fix that terrible loophole about the person in the image disclosing it him- or herself, those were the provisions we've been asking for. Those are good provisions. And I'm glad to see that, maybe.
One thing that has happened over these last few years is that people have come to consensus on that, which I'm still surprised by, because it never seemed to be true in previous years. If we could come to consensus around this, fix the problems in it that are easy to fix, and just get rid of the takedown provision. I am much more supportive of the idea of real Section 230 reform, of real structural changes to how we try to impose accountability on the tech industry, rather than trying to do this: how many hours is it, which kind of image is it, who has to sign off on it, which I think is kind of an exercise in futility, as much as it makes promises to victims that they're going to be taken care of. That's the other piece I really worry about: that we're giving false hope to victims. This isn't a process we know how to make work yet, and yet we're telling victims it's going to fix all of their issues. I think one of the things we're inevitably going to see in the next few years, when this law goes into effect, is that it actually doesn't fix these things. And so, now what?
Becca Barnum
Right. And I think something to keep an eye on moving forward, to the extent the bill survives intact, and I can't predict who might end up challenging it, is this model getting exported to other kinds of content, content where there might not be as much unanimous agreement that it deserves to be taken down, replicating this across wider swaths of content that are even harder than NCII to identify and where the equities aren't as clear about it needing to come down at the request of the people depicted. That's worth considering, because, again, back to the broader tech regulation landscape: everyone wants there to be reform, whether they're on the left or the right, but they have yet to reach agreement on the types of content they're concerned about and the ways they want to go about it. Right? What people define as harm, and what people define as harmful content, really depends on their worldview. And I do worry about this model getting extended beyond NCII to areas where it will be even harder, and more likely, for lawful content to be censored through government action.
Adam Connor
And I'll just say two things. One, I understand why it's so bittersweet for Marianne. I think one of the reasons this passed now is because of effective advocacy, and also because of the proliferation itself. Right? The thing we have heard over and over again is that people understand, with AI now, that this is something they're seeing a lot more of, not just on the authentic imagery side but on the digital forgery side; they're seeing stories in their schools. So it's obviously your effective advocacy, even if the solution here is not necessarily the one you might want. But to the question of why now: it's hitting a point where this is no longer something people look at distantly; increasingly, the broad public understands that this is a real thing, it really happens, and it may need to be dealt with, which I think in part explains the congressional motivation. My general sense is that this law and the takedown provision, if it survives, will help take down NCII and will help take down digital forgeries of intimate imagery. I think it's very clear that that is true, that the kind of little tech places Marianne Franks mentioned that help proliferate this aren't necessarily the big platforms, and I think that is a net good. Whether all the other content that could be affected by it is worth it is, I think, what we'll look back and ask about in the future, and see if we can change. But I do think it's worth saying that it will help some of that come down, and so there is some good in here, which is why people have mixed feelings about it. The other thing I would say is, why was Congress willing to do this?
Obviously, one piece we've heard a lot, and I know from my own time on this issue, is that not all victims of NCII want to take criminal action in court. Sometimes they just want it down; they want it to go away. Whether or not that's possible, or even constitutional, I think that speaks to the reason this provision was in the bill. And it speaks to the reason, quite candidly, that I think members of Congress were willing to roll the dice on something that may not be constitutional: because it feels so powerless to say we can't tell platforms to take it down. Now, obviously there are legal issues and all sorts of abuse issues, but I think that's fundamentally why this happened. And unfortunately, plenty of members of Congress, particularly female ones, have experienced this kind of assault and abuse themselves, and I think that helped drive a very human reaction to it. Obviously now it's in the courts, in the implementation, and with the platforms. That's why it will have a legacy; what survives the courts and the implementation will determine what that legacy is.
Renee DiResta
Thanks, everybody, so much for the really thoughtful and nuanced discussion here. I think it's a really important moment, and I guess now we see if it gets signed before this comes out. But no, I really appreciate all of your expertise, and I hope that this really kind of helps explain to the audience some of the questions that I had. I really benefited from hearing all the nuance that y'all brought to this conversation. So thank you so much for sharing your expertise with us today.
Marianne Franks
Thank you.
Renee DiResta
The Lawfare Podcast is produced in cooperation with the Brookings Institution. You can get ad-free versions of this and other Lawfare podcasts by becoming a Lawfare material supporter at our website, lawfaremedia.org/support. You'll also get access to special events and other content available only to our supporters. Please rate and review us wherever you get your podcasts. Look out for our other podcasts, including Rational Security, Allies, The Aftermath, and Escalation, our latest Lawfare Presents podcast series about the war in Ukraine. Check out our written work at lawfaremedia.org. This podcast is edited by Jen Patya, and our audio engineer for this episode was Hazel Hoffman of Goat Rodeo. Our theme song is from Alibi Music. As always, thank you for listening.
Summary of "Lawfare Daily: Digital Forgeries, Real Felonies: Inside the TAKE IT DOWN Act"
Release Date: May 6, 2025
In this episode of The Lawfare Podcast, hosted by Renee DiResta from the Lawfare Institute, experts delve into the intricacies of the TAKE IT DOWN Act—a significant piece of legislation poised to transform federal responses to non-consensual intimate imagery (NCII). The discussion features Marianne Franks of the Cyber Civil Rights Initiative, Becca Barnum from the Free Expression Project at the Center for Democracy and Technology, and Adam Connor from the Center for American Progress. Together, they explore the bill's provisions, potential implications, and the concerns surrounding its implementation.
The TAKE IT DOWN Act stands for the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act. Its primary objective is to address the proliferation of NCII, including both authentic and AI-generated deepfakes, by imposing federal penalties and enforcing stringent platform responsibilities.
The act's central components, discussed throughout the episode, are the criminalization of NCII, new platform responsibilities in the form of a notice and takedown system, and an enforcement mechanism run through the FTC.
Marianne Franks, representing the Cyber Civil Rights Initiative, underscores the importance of federal legislation in standardizing the definitions and penalties associated with NCII, noting that "there are 49 states now criminalizing image-based sexual abuse" ([02:06]). She praises the bill's criminal provisions but expresses concern over a loophole:
"There's a massive exception that's written into the bill that says these criminal penalties don't apply if it is a... If the person distributing or possessing the image is the person in the image." ([05:46])
Franks emphasizes that while the criminal aspects are robust, the exception could undermine the bill's effectiveness by allowing potential abuse.
Becca Barnum from the Free Expression Project acknowledges the bill's positive strides in recognizing and addressing genuine harms but voices apprehension regarding the notice and takedown provisions. She warns that the broad definitions may inadvertently lead to the censorship of lawful content:
"The problem comes, it gets tricky as it relates to the definitions and the ambiguity." ([19:59])
Barnum fears that ambiguous language could enable over-enforcement, affecting speech that should remain protected under the First Amendment.
Adam Connor of the Center for American Progress provides insights into the legislative process and the FTC's capacity to enforce the act amidst political pressures:
"There was a ton of stuff, cancer research and other things in that CR that they sank." ([20:17])
Connor highlights the challenges the FTC might face in impartially enforcing the act, especially given its politicized environment and resource constraints.
A significant portion of the discussion revolves around the terminology used in the bill. Franks prefers "sexually explicit digital forgery" over the colloquial term "deepfake", arguing that the latter lacks precision and may contribute to confusion:
"I've never really been comfortable with that term, given that this is not really a, you know, that's not a real term." ([08:24])
Franks expresses skepticism about the sudden bipartisan support the bill garnered, attributing it to a politicized environment influenced by high-profile endorsements rather than to constitutional merits alone:
"I think it's a bittersweet moment... But it's a little bitter in that that work is culminating and coming to fruition along with our takedown mechanism." ([13:17])
Connor adds that the bill's passage was influenced by both advocacy efforts and the political landscape shaped by figures like Elon Musk and Donald Trump:
"But I think this is a story both of I think, Marianne, you've described this as like a bittersweet moment that really resonated with me." ([13:17])
Becca Barnum elaborates on the notice and takedown system, recognizing its potential benefits for victims but cautioning against its implementation ambiguities:
"It's the effect on everything that's not non consensual imagery that CDT has concerns with." ([19:59])
The comparison to the Digital Millennium Copyright Act (DMCA) arises, with both guests pointing out risks of overcompliance and misuse:
"There's a lot of concern about overcompliance, about again, overuse of DMCA to take down speech that should not be taken down." ([27:44])
Franks raises substantial First Amendment concerns, emphasizing the provision's overbreadth and vagueness:
"I do think that there is a very key overbreadth challenge." ([45:50])
These challenges question whether the law infringes upon protected speech rights, potentially rendering portions of the act unconstitutional.
The guests debate whether the act will ultimately be a net positive or a cautionary tale. While the criminalization of NCII is largely supported, the notice and takedown provisions introduce complexities that could hinder free expression and lead to unintended censorship.
Becca Branum expresses a commitment to maximizing the bill's benefits while mitigating its drawbacks through advocacy and careful interpretation:
"I think there is a world in which Congress could have passed a constitutional and privacy protective takedown mechanism to help victims." ([44:03])
Conversely, Franks fears that the act in its current form may complicate efforts to combat NCII and offer false hope to victims, potentially exacerbating the problem rather than resolving it.
The TAKE IT DOWN Act represents a pivotal moment in federal efforts to combat non-consensual intimate imagery. While its criminal provisions address long-standing gaps in state laws, the accompanying platform responsibilities and enforcement mechanisms raise significant concerns about potential overreach and constitutional violations. As the act awaits presidential signature, its implementation and subsequent legal challenges will likely shape the future landscape of online privacy, free expression, and victim support in the digital age.
Notable Quotes:
Mary Anne Franks ([05:46]):
"There's a massive exception that's written into the bill that says these criminal penalties don't apply if it is a... If the person distributing or possessing the image is the person in the image."
Becca Branum ([19:59]):
"It's the effect on everything that's not non consensual imagery that CDT has concerns with."
Mary Anne Franks ([45:50]):
"I do think that there is a very key overbreadth challenge."
Becca Branum ([44:03]):
"I think there is a world in which Congress could have passed a constitutional and privacy protective takedown mechanism to help victims."
This summary encapsulates the multifaceted discussion surrounding the TAKE IT DOWN Act, highlighting both its intended benefits and the apprehensions it has sparked among experts. As stakeholders navigate this complex legislation, the balance between protecting individuals and preserving free expression remains at the forefront of the debate.