A
Hey, Joseph here. I hope you are doing well; we are on break for the holidays.
B
So for the podcast we're uploading a couple of our earlier interview podcast episodes. As you know, we've relaunched that series, or rather launched that series, where every week we have an interesting interview with somebody that we've come across in our reporting. We actually did a few of those interviews way back when, several months ago at this point, when it wasn't really a formal series, but they were very interesting conversations. So we're resurfacing a couple of those in the hope you can get something out of them by listening to them. Now, the one that I'm re-upping is my interview with Meredith Whittaker, who's the head of Signal, probably the most important consumer encrypted messaging app around at the moment. We talk about backdoors, and I think some government hacking as well. And I also think this is a pretty good partner interview with the other one I had with Mike Bobbitt a few weeks ago, where he was one of the FBI agents on Operation Trojan Shield, which was this investigation, as you would have heard, where the FBI put its own backdoor into the communications. Well, you can almost hear the other part of that conversation now with this interview. So please take a listen, and maybe...
A
Now if you're hearing it for a...
B
...second time, you can actually hear it again with some additional new context as well.
A
Hello and welcome to a special interview episode of the 404 Media podcast, where we bring you unparalleled access to hidden worlds, both online and IRL. 404 Media is a journalist-founded company and needs your support. To subscribe, go to 404media.co. Subscribers get bonus content every single week, and they also get access to additional episodes where we respond to their best comments. Gain access to that content at 404media.co. I'm your host Joseph, and with me today is Meredith Whittaker, the president of the Signal Foundation. Signal, as many of you will know, is generally considered the gold standard of an easy-to-use, consumer-friendly, end-to-end encrypted messaging app. Signal has a long history. I remember way back, I guess around 2010, although I joined it a little bit later, there was TextSecure and RedPhone, and then they morphed into the Signal app that we all know today. And then of course, beyond the app itself, there's the Signal protocol, which is, you know, I'm butchering it, but basically the encryption itself. I'll say that, and I'm sure I'll get some angry emails, but that is also implemented in WhatsApp and Facebook Messenger, basically making that protocol a linchpin of privacy and end-to-end encryption for billions of people. Right. So here we're going to talk about the current state of privacy, the realities of backdoors, and the threats against end-to-end encrypted messaging services worldwide. Meredith, thank you so much and welcome to the show.
C
Such a treat to be here, Joseph. I am such a huge fan of your work and of 404 Media, so I'm just going to put a plug in right at the jump that folks really should support you. You all are doing so much heavy lifting in an ecosystem that has sadly been hollowed out a bit too much in the last couple of years. So thanks for that work and, again, really happy to be here, of course.
A
Absolutely. So I think just to start, everybody knows Signal, I would say, but they may not know some of the numbers behind it. Can you just give us a sense of, today, how many users are we sort of talking about, and is that number growing?
C
That number is growing. And I will just off the bat be annoying and cagey here and say we don't give exact numbers of our user base. One reason for that is that it's pretty volatile. We will see massive growth in a region in response to a political issue or what have you, and then perhaps some drop-off, and it really expands and contracts a bit, but it is overall growing. We are seeing increased sensitivity to privacy. That's evidenced everywhere from our user growth charts, to the fact that Apple and Meta are spending billions of dollars on advertising really indexing on that one value, to simply the tenor of the public conversation, as things like the AT&T data breach, which you all reported on, continue to make people aware of the very real material dangers of the mass surveillance that most of our core digital infrastructure is conducting on us every day. So, you know, Signal, if you want a little shortcut here: Signal has been downloaded about 200 million times, and I don't have the exact number in front of me, from just the Play Store alone. And of course that's one of three clients, and we're looking at trends that really point up and to the right, and we have, you know, some pretty big plans for the future in terms of how we can ensure that Signal as core infrastructure, both the messenger and the protocol, and the work that we're doing across the stack to enable privacy in an ecosystem that defaults to surveillance, is able to grow and spread and become increasingly foundational to a digital ecosystem, the Internet, whatever metaphor we want to use for the fact that, you know, this now is the nervous system of our social and political and economic life.
A
Yeah, for sure. And I mean, you mentioned plans and, of course, growth. That makes me think of how Signal, over, I mean, not even the years but almost the months, it feels, will release new features somewhat rapidly, or you have recently. I mean, stickers are a massive part of Signal now. We are constantly using them in our group chats. I make my own stickers for in-jokes and all of that sort of thing. Is the idea there that you are trying to introduce, at the end of the day, fun features? Are you introducing those to generate growth? Is that the thinking behind it?
C
Well, we are introducing features that we hear from people across the world, and these are different features depending on the region, the context, even the age group, are core ways that people communicate. So stickers are not hugely popular, for instance, in many contexts in the US, but they're wildly popular, they are core communications tools, in much of East Asia. And so a messenger app without stickers is not very useful to people who have facility with that paradigm, who, you know, think in stickers, who send stickers as part of a conversational rhythm that is very familiar. So part of what we're doing there is trying to put the core function of communication at the center of what we do. People don't pick up Signal to flex the fact that they care about privacy. They pick up Signal to talk to someone they love or like or want to speak to in some other way. It is fundamentally about human communication, which itself is fundamental to being human. So we want to respect the fact that that takes many forms, and stickers, in one case, are very important. So it's, you know, it's to enable growth. I think, you know, often growth comes, when we're talking about communication networks, from the network effect. The network effect really matters. So, you know, communication networks, to be 101 about it, increase in value with the quantity and quality of the nodes added to the network. The more nodes, the more valuable. Put another way, no one buys the first telephone, right? Or put another way, if you switch to Signal but your friends don't, you haven't really switched to Signal. You're not really able to use it, right? So we see growth perpetuated by collective events, or collectives. A very easy one is political volatility, when the distance between physical safety and digital privacy collapses. Ukraine is now one of our biggest markets; we saw it rise from a very small market to a very large market.
Obviously in relation to the Russian invasion. We see growth in groups. That happens in relation to just big tech messing up, the incumbents making a policy change. In early 2021, WhatsApp announced a change of terms of service, where they announced to their users that they were going to be giving data to Facebook. And they had promised for a long time not to do this. This was a broken promise, and people noticed. People do actually care about privacy, and they switched in droves to Signal. We basically DDoSed our own servers briefly. We had to recover from that, and the team did a very good job; bless the team who stayed up nights doing that hard work. And then for over three months we were number one in the App and Play Stores across 70 countries. So, you know, what we want here is to be ready for those precipitating events, to have the features that make Signal seem easy and intuitive across the globe. And those aren't going to be the same features in every jurisdiction. Stories, for instance, which are not hugely popular in much of the West, are hugely popular in South Asia, hugely popular in Brazil and other parts of South America. So again, we're trying to cover a lot of ground, but ultimately it's about making Signal as easy as possible to use, to communicate in the way that you're familiar with and that your friends are familiar with, so that when you need it, it's truly there for you.
A
Yeah, it's funny, I totally forgot Stories were even a thing in Signal. I don't think I've ever used that single feature.
C
You are missing out. You are really missing out.
A
If I see something that looks like Instagram in any way, I'm like, no, no, no, no, no. It's not even a security thing. It's just, like, I don't know, I've never engaged with that sort of content. You see what I mean?
C
Oh, my gosh. Well, I'm just gonna say I really love Stories. For me, you know, like, I love seeing what my friends are doing. I love cute people. I love a cat. Right? You know, no shame in that. And I feel really uncomfortable with Instagram. I feel really uncomfortable with, sort of, you know, that stuff being scraped for AI, or having ads run against it, or sort of showing up in some stranger's feed. The whole thing went from a photo hosting app that was slightly better than Flickr to, you know, a biometric surveillance hellscape. So having a place where I can share, like, cat photos, or, you know, cool photos that I took, where I know they're not going anywhere. They are literally deleted off of, you know, my device and everyone else's. They are encrypted. No AI is run against those. I'm not exposing a stranger's face to facial recognition that could imperil them because they overstayed their visa, and that's run against some Clearview system and the Meta backend that goes to law enforcement. I don't think that is happening now, but it's incredibly technically possible, and it's the exact kind of thing that could happen. So anyway, I love Stories. I think folks should enjoy Stories. And it doesn't come with all of the stress that Instagram and the other surveillance apps come with, but we also give you the option to turn them off permanently and never think about them. So we're different from big tech in that way as well.
A
I think that must have been what I did, because again, I literally have not thought about that in forever. I guess just the last thing on features is that as you introduce more features, you at least theoretically introduce more attack surface, right? That is just the nature of product development. How do you introduce these new features while keeping the app secure?
C
I mean, that's a good question. I think the paradigm is: the more complex the code, the more room there is for bugs to hide. We are audited regularly, we engage in security best practices, we have code review practices. And, you know, we're also developing, unlike almost any other alternative large-scale messenger, in the open, so people can watch even before a feature is pushed. We are developing in the open on our repos. There are people who will go in and see bits of code behind a feature flag that's being iterated on, and begin to hypothesize in our forums, or on Reddit, or wherever they do it, about what we might be building. So there's a lot of scrutiny and a lot of eyes on our code, which is, honestly, a gift, right? We have people looking at it, finding issues, raising concerns. Sometimes it's bad faith, sometimes it's annoying. Whatever. But nonetheless, that's a powerful immune system. So that's not one weird trick. There is no one weird trick. The trick is you never stop asking that question. You never let down the vigilance. You never think you're safe and out of the woods. But I do think we have a lot of practices and a development paradigm in place that puts us in a much better spot than any other competitor. And, I mean, let's be real, if you look at the competitors' apps, every single quarter some product development team has got to ship some crappy feature because, let's say, someone named Mark is obsessed with the newest tech hype, right? So in terms of bloated features, I think Signal is actually incredibly lean and elegant. And it really is that balance between how do we not fall behind the norms, which would mean that we're genuinely not useful for certain forms of digital communication that most messengers would be useful for, while maintaining our integrity as best in class for security and privacy.
D
Pop quiz: what is the worst thing about having a cat? Stinky litter boxes. The good news: Boxie has solved the problem. The "Pro" in Boxie Pro stands for probiotics, which stop the bacteria that cause odors, so you never smell your litter box ever again. Boxie Pro keeps the box continuously odor-free, infinitely. All you have to do is remember to scoop, and when people come over, they won't ask, "What's that smell?" With Boxie Pro, all you do is top off the litter, and you never have to dump out the whole box. Its amazing clumping power makes scooping easier and makes cleaning up after your pet not much of a chore at all. So if you're ready to stop giving your litter box the stink eye, it's time for you to try Boxie Pro. After years of searching for the perfect cat litter, I finally found Boxie, and it's become my go-to for my cat's litter box. So if you're tired of switching litters looking for the one, get Boxie Pro at boxiecat.com. It's the last litter you'll switch to. Enjoy 30% off with the code 404Media at boxiecat.com/404Media. That's B-O-X-I-E-C-A-T dot com, code 404Media.
A
All right, last sort of setup question. Years and years ago, I remember the New York Times published an article based on a subpoena Signal received from the US government, and it showed what I think a lot of listeners already know. It showed that Signal cooperated with the legal demand and returned the user data it had, which was, I'm just looking at the quote now, the time the user's account had been created and the last time it connected to the service. Obviously not much data. Is that still the case with what data Signal will provide to the authorities if compelled to do so? Is it still basically the same as it was?
C
Yes, and we go out of our way to get that as close to nothing as possible. So, yeah, you can look at signal.org/bigbrother and you can look through other subpoenas that we have received and unsealed and posted there. We fight every subpoena that we get. If we are forced to comply, if we lose that fight, we then have to comply. That's how it works. And that's why, to be a truly private communications provider, you cannot have the data, because ultimately you can be forced to comply, or shut down, or what have you. So we do comply, but what we comply with is vanishingly small. And if you go to that site and it hasn't been updated recently, well, I believe it is updated, I'm not actually looking at it right now, but we're in the process of posting some newly unsealed subpoenas there now. So you'll get a number of examples of how those look and just how little data is available there.
A
Sure. All right. That was me spending 15 minutes setting it all up, because I just think that context is important. But let's now zoom out a little bit. Today, in 2024, what do you see as the most pressing threat against end-to-end encrypted messaging? It's a very fast-moving world. We had the EU chat monitoring proposals. We now maybe have another debate around the encrypted comms of the Trump shooter as well. But is it those? What is the most pressing threat that you see against end-to-end encryption at the moment?
C
I don't think in stack ranks.
A
It's not the threat Olympics. Yeah, I guess, yeah.
C
I mean, we're talking about a kind of nexus of threats, and primary among them is, you know, that centralized power tends to constitute itself via information asymmetry. And so, you know, you can look back at 1976, with the US government trying to prevent the publication of Diffie and Hellman's paper on public key cryptography, because they didn't even want the paper out there, right? There has been anxiety post-World War II around the idea that any network for communications that people can access could be off limits to government scrutiny. And that is not going away. I think we need to be on the lookout for threats to the viability of encryption itself: threats to implement a backdoor, or a front door, or bolt on a surveillance service as a mandatory component of any end-to-end encrypted communication service, thus completely nullifying the entire point, and tiring us out with having to argue against rhetorical tricks over and over again, as the European Commission has just done and then withdrawn. So those are legislative threats that we've seen many of in the UK and in Europe. You know, we saw Australia propose some of these, but it seems to be walking them back, with some fairly sane language in some of their latest regulatory and policy proposals. So I think we'll see more of those. But I do get a sense that the arguments are getting through, to some extent, to the policymakers, and to the people who may not have understood the type of threat that their well-meaning legislation was posing. Because I think, in some sense, we're dealing with a scenario where the pretext is incredibly inflammatory. It's very emotionally charged: child abuse, child sexual abuse. We need to prevent this. No one disagrees with that, right? People say, yes, we need to prevent it, we'll do whatever we can. Well, here's what to do.
And a lot of people, I think, just, you know, took the instructions and said, okay, we're going to diligently do this, not looking at the fine print, or not understanding that there were some Trojan horses in there in terms of fundamental rights and liberties.
A
Yeah, well, just on that. Because maybe not everybody listening is actually familiar with what happened in the EU recently, could you just very briefly give us a rundown of what was being proposed, and sort of what your reaction was to it?
C
Yeah, I mean, there's the long version, which I don't think is terribly interesting for the lay listener, but in effect the conceit was that there are issues with child sexual abuse material online, and that law enforcement, governments, well-meaning NGOs, in order to deal with this, need to be able to effectively break end-to-end encryption and scan private communications to find those that may contain such imagery and, we assume, prosecute the culprits. So that's the high-level rationale. And we've gone back and forth for, I think, a little over a year. There's been a proposal to do certain kinds of scanning, and that scanning would break end-to-end encryption, constituting a backdoor. And then the technical community, and ourselves, and others would come out and say, no, that's actually a terrifying cybersecurity risk, and there's no evidence that it actually helps to protect children. And then a new proposal would come and say, like, oh well, we're not going to create a backdoor, we're going to put the scanning in front of the encryption so it doesn't actually hurt encryption. But of course, if you bolt surveillance onto whatever the communications platform is, as a mandatory component of an encrypted system, you have undermined encryption. You've moved the target for hackers from unbreakable math to some crappy vendor-provided software that the government is mandating, which is going to be easy to break. Right? So we've had to kind of go through a cycle of what I would call rebranding the same old thing in an attempt to get it through and get it past the technical community's vocal opposition, the long-standing consensus that, you know, this is math, not magic; you have to recognize the limitations of your desires here. And so that's where we stand now. The latest round was maybe a month ago.
There was a drawdown, effectively, and now the European Commission is reconstituting itself after elections. We have Hungary coming into the presidency, which is rumored to be largely supportive of this. We don't know if it'll be picked up or not. But, you know, again, coming back to the first point, power constitutes itself by information asymmetry. And this has been a long-standing wish of law enforcement, to undermine strong encryption. I don't have much confidence this will ever be put to bed entirely. This is not a disagreement. This is not a misunderstanding that we can educate away. This is a battle for power that we're going to have to contend with on those terms.
A
Yeah, it's been going on for decades at this point, right? And it goes all the way back to the Clinton administration, with the Clipper chip and all that. And then the San Bernardino stuff, and then, yes, the pivot.
C
And before that, in the 80s, Reagan was also discussing some of this. So there's never been a prospect of networks that law enforcement can't break that hasn't freaked out government, in my historical reading. I think there are other threats we could discuss that aren't legislative that might be interesting as well. And one of those that has occupied my time is the move of AI systems into the operating system, and things like Microsoft's Recall in particular, which I would say is a pretty alarming shift in a long-standing paradigm in which the operating system provides a trusted basis on which various applications, not from a single vendor, are able to, let's say, form contracts with a given user. You know, Signal's contract is: we collect none of your data. If you're using Signal on a device that isn't compromised, it is secure; it is not going to be leaked. And thus when we fulfill a subpoena, for instance, you know, it only has vanishingly few bits of information, in contrast to any other application. And that's a kind of incredibly important paradigm. And what we're seeing with things like Microsoft Recall, which was meant to ship with Windows 11, the new Windows operating system, is a violation of that paradigm, where in the name of, you know, feeding an AI system that will do some convenient or inconvenient or completely useless thing for you, in Recall's case it constitutes a kind of eidetic memory of everything you were doing online for the last three months. Like, anyone wants to know that you were doomscrolling at 3:00am, right? Like, you know, which is a given.
A
To be fair, they don't need an AI system to figure that out. But yeah, it would catalog basically everything you're doing on that Windows machine to then provide some sort of efficiency service with its AI tool. But to me it's that I don't want that on my OS. No, basically, no, you don't.
C
I mean, it's just a plaintext honeypot on your OS that includes screenshots of your Signal Desktop messages, if you're using Signal Desktop. That fundamentally violates the contract between Signal and the person using it, which is then being subverted by the operating system manufacturer or the OEM. So I think we need to be a lot cannier about simply accepting that if AI systems are running on the device and not phoning home, they somehow constitute a private system. Because what we're actually seeing is the need for data that these systems have and the functions that these systems are being put to, whether it's Recall and this sort of silly eidetic memory, which is like a desperate search for a market that's not materializing, if you want my read on it, or Google's Gemini, which is going to scan your phone calls for scams but could very easily be repurposed to scan your phone calls for discussion of drugs, or to scan your phone calls for people seeking abortion care, or anything else. We really need to pause before we allow these companies to make those kinds of determinations, to violate our privacy, to make those decisions about us. I think we need to reevaluate that paradigm. And some of my thinking and my work right now, on maybe the scholarly side or the analytic side, is really focused on that.
A
Yeah, yeah. To shift gears slightly. So you mentioned sort of front doors and backdoors, and we've seen authorities launch larger and larger operations against encrypted chat platforms. Listeners will know, I don't shut up about this; it's like my obsession. We had EncroChat, where the French police hacked into more than 30,000 devices that were using EncroChat software. Sky ECC: 70,000 devices; they got half a billion messages. And then Anom, which is the platform that was secretly run by the FBI. To be clear, the majority of their users were alleged serious criminals. That's apparent in the messages that I've seen. And then you can argue, well, should the authorities still have done that? I think that's very much an open question, and a debate that we simply have not had in public. But what do you think, just broadly, about this trend of authorities compromising entire encrypted chat platforms, even if they are in some cases used heavily by criminals? Because that's just such a different way of approaching the encryption problem that law enforcement sees. It's not hacking an end device. It's not doing maybe a side channel, like, well, let's get a wiretap of their ordinary phone calls or whatever. It's just: we're going to compromise the entire platform. I mean, what do you make of that approach?
C
I don't have anything clever to say about it. It's a threat we need to keep in mind, particularly given that, you know, private money can mean that these different startups or for-profit platforms change hands without people knowing, right? And I often think about this in terms of Telegram. Like, would we know? Would we know if the CIA put together an LLC and quietly, like, you know, gained a majority stake in Telegram? You know, I don't think they'd need to, because Telegram is not actually encrypted or secure in any meaningful way. So why not just, you know, look in the window instead of buy the house, maybe. But, you know, I don't have a sort of analysis on that trend, other than I think this is, again, we have closed-source platforms that are making promises that aren't validated, aren't backed up by scrutiny or an analysis of the open code, or something like the Signal protocol, which has been in the open, whose implementation has been in the open for a decade. It's been hammered, it's been examined from all sides, and it continues to stand the test of time. And part of this is perhaps a symptom of the way we have guilelessly approached tech in general, allowing tech companies and tech narratives to shape what we understand about tech, and not demanding democratic scrutiny or expert validation or any real systematic checks that ensure that the marketing and the reality are the same. Now, would that actually make a difference in the context of law enforcement collaborating with, you know, or standing up a fake platform, or taking over a platform and kind of quietly subverting its functionality? Likely not. But, you know, again, maybe there would be other bulwarks in place that would make it more difficult. And, in general, it's not just law enforcement that could do this, right?
You have, you know, we know there are a lot of scam platforms. We can think about the, like, you know, "match your face to a celebrity" apps that are gathering biometric data to train some shady AI to sell to law enforcement, right? Like, there's a lot of pretext and lying and marketing that is considered normal in the tech industry that, you know, if we were producing a food product would not be considered normal, would in fact be criminal in a lot of cases.
A
Yeah.
C
So, quick reflections there, but it's something we should be wary of, and I guess it's a symptom of a larger problem of hype and opacity being the currency of the industry.
A
Yeah. Maybe just one other question before I get to the main one I want to finish on. But you did bring up Telegram, and recently the CEO of Telegram said in an interview that his engineers had allegedly been approached by the FBI, encouraging them to include certain code in the app, which could potentially act as backdoors. Not many specifics were given in the interview, but hey, that's still an interesting detail in and of itself. Have any of your engineers been approached by the FBI or other authorities in that sort of way?
C
I mean, you're asking me to prove a negative. But let's step back. It's a sort of fantastical story, right? I don't know, like, it's not verified.
A
I'll say that it's just his claims.
C
Yeah, like: hi, I'm the FBI, here's some code, could you put the code in your code base? Like, nah, man. Get someone hired at Telegram; you know, we know there are ways that this is done. No, the story itself just seems a little bit like a children's book version of a real threat, right? And again, you know, Telegram is not open source. We don't know; we know it's not secure. We know that their encryption protocol has had, you know, some bad bugs that some people thought looked like a backdoor, which, you know, I believe were remedied. But, you know, nonetheless, I can't say that story is true, and I can't say the story isn't true, but it feels like a kind of mythologized version of a real concern that doesn't quite hold water. If that's the FBI's tradecraft, you know, I think we're all pretty safe.
A
Yeah. And to your knowledge, that hasn't happened to you?
C
Yeah, no. I mean, we develop in the open, right? The FBI can look at our code. This is one of the reasons we don't just willy-nilly accept pull requests that are made by external contributors on our repo. We take a lot of care with what we accept or don't accept, and don't accept much, because we want to make sure that we're scrutinizing things there. We have people scrutinizing our open source code, we have people scrutinizing our encryption system, the Signal protocol, and some of the other metadata encryption that we use. So, no, we have not had that, to my knowledge. And it's a bit of a funny question, because I'm like, I don't know, did a guy meet a guy at a party? I wasn't there. But no, I've never heard of that happening. I've never heard of a scenario like that in Silicon Valley, and I've worked in tech for almost 20 years. You know, you do find out about moles and agents, but it's usually a more sophisticated operation. So, yeah, that's what I can say. And, you know, I think we have structural safeguards in place that don't rely simply on one engineer saying yes or no. We have things in place that check our code, whether a bug is malicious or accidental.
A
Yeah. And I think this just leads to sort of the final main thing I really wanted to ask you about, because while I've been covering law enforcement's approach to end-to-end encrypted communications, especially ANOM and all of my investigations there, I've sort of seen that three options have emerged, or maybe not options, because that sort of implies you can only do one and not the other. But there are sort of these three paths for encrypted comms. The first is one you mentioned earlier, which is sort of the front door, right, which is where apps will give data to authorities under a legal order. Discord does this all the time. Telegram does not really do it because they just generally don't cooperate. Signal does cooperate, but it gives a very small amount of data because it doesn't retain much data.
C
We fight.
A
You fight? Yes, sorry, you give it after, which.
C
Is a form of cooperation. We don't deny that the process has validity because that's a good way to get your service shut down. But we do fight every request and then, yes, we do cooperate with those that we are forced to cooperate with.
A
Yes. So you have that front door, and maybe one way to appease the debate and stop these cycles as well is that more apps could give more data. Of course, that's going to mean undermining your own security. Then you put that to one side. Then you have things like backdoors, which is ANOM, which is the FBI running its own encrypted chat platform and getting all of these messages, these crazy, audacious worldwide operations. And I think there's a lot of room for collateral damage there. And then the third one is targeted hacking, where authorities may think, well, we want to get the encrypted messages of this particular person, let's hack that particular device. Use some sort of modular malware to only get the messages. Do it under a legal order, of course. Also room for abuse there. My question is, which one of those is, well, I suppose not the most attractive, but the least bad to you out of those options?
C
No, no, I reject the framing. You knew I would at this point.
A
I knew you were going to reject the premise. Yes, I absolutely knew you were going to reject the premise of the entire question. But I think that's interesting in of itself. Right, because we have got to this point where the FBI feels it can secretly run its own tech company for organized crime and intercept messages because they see that as a valid option. Right. And I think what I'm trying to get at is that we have this debate and there's the dialogue in the EU and then in the UK and Australia, as you say, in the shadows. The FBI isn't really asking. It's like, well, we're just going to go run our own encrypted chat platform and do all that. So if you reject the premise, why is that exactly?
C
Well, the premise has stayed the same for 30, 40 years, right? We sort of skimmed through that history. But the premise that law enforcement is perilously on the verge of being shut out from the visibility it needs to do its job, that has sustained for as long as this debate has sustained. And of course, 30, 40 years ago, there was no such thing as the Internet. Our letters were not surveilled. And indeed, postal mail has some incredibly strong laws protecting it from such surveillance. To do a phone wiretap, which was itself controversial, you had to jump through a huge number of hoops: you had to get a warrant, you had to prove that you had a reason to do that, right? So they had much, much, much less information than they do today. And in the interim, particularly, you know, looking at the 1990s, we effectively greenlit a business model that endorsed mass surveillance as the economic engine of tech. There are a number of reasons for this, but basically the Clinton administration did two things. One, they put no privacy guardrails on the commercial Internet. So mass surveillance for corporations, everything goes, totally fine. They can do way more than the government could even dream of in the name of profit and commerce. And two, they endorsed advertising as the business model of the Internet. And of course, advertising is: know your customer, know more and more and more about your customers, so you can, you know, differentiate yourself in the market, and that is an incentive to surveil. So in this sort of last 20, 30 years, we have seen mass surveillance over every aspect of our lives become the norm, become accessible to law enforcement. The argument that they are somehow missing data, and that this is a perilous threat to their ability to do their jobs, simply does not check out when you begin to look at the details here.
And so, you know, you can say, like, in order to get the data they're missing, here are three options. But you have already taken as a given a premise that I don't think holds up.
A
Yeah, all I would add to that is, just to give one specific concrete example: during ANOM, there was an amphetamine lab producing tons and tons of speed, run by a very senior drug trafficker, a guy called Rifkin. His name doesn't matter. What's important is that.
C
Was it a Sappler?
A
No, not quite. You know. Yeah, they were on ANOM as well.
C
Yeah, sorry. I don't know if their hiding was really the problem.
A
Right, right, right. But so there's this drug lab and it's producing drugs. And the key thing is that the local police, or rather the national police of Sweden, had no idea this drug lab existed. They had no idea this drug trafficking organization even existed until they got messages from the FBI's backdoored app. So my question is: they were missing crimes. Is that an acceptable trade-off? How do you square that with sort of the rejection of the three options, if you see what I mean? They were missing crimes is what I'm saying.
C
Well, yes. Okay. It's a very compelling story, because it's very difficult to argue against it without somehow being counted in a camp that is going to be, I don't know, sympathetic to drug lab gangsters, whatever's going on there. I guess if we had cameras in all our bathrooms, in all of our homes, in every inch of our lives, we could argue that that would ensure that law enforcement doesn't miss crimes. There is a threshold here where I think we need to recognize that fundamental liberties, the fundamental ability to communicate privately, are pretty imperative to having a functional democratic society and governance structure. Now, I don't know the details of that particular story. I would also ask: where's the sourcing on it? In terms of, you know, had they received calls from neighbors nearby the lab, who may not have been rich people, who reported bad smells? Did they have other indicators that they weren't paying attention to? Were there other issues of data deluge that were perhaps preventing them from finding the needle in the haystack of data? Which is, you know, the largest problem I see when I talk to law enforcement people, when I talk to people familiar with the actual, on-the-ground labor practices of doing law enforcement in this day and age: there's too much data. There aren't enough people to triage it. There aren't systems that can tell the signal from the noise. There's repetitive data. There's simply not a good way to sort through it and make sense of it, so that you know, within it, here is an indicator of a real crime versus a fake crime. So I think what you presented is the kind of narrative that, if I were the FBI's press office, I would be really happy to find a way to tell, because it's erasing a lot of, you know, complexity. Like, you know, did we talk to the Swedish police? Is that their story, or is that the FBI's story?
Sort of.
A
I mean, I spoke to the Swedish police. Yes, I spoke to the Swedish police and I read the messages. But yes, I see what you're getting at, in that, and it's almost to make it more theoretical and speculative now, broadening it out, but there could be these other ways that authorities could investigate or discover these crimes, basically, as you say, the neighbours or whatever. And that also applies much more broadly now that we have many more conversations in private. It's like, yes, well, maybe they have their phone location data or whatever. It's not like there is a crime wave because everybody's using end-to-end encrypted messaging.
C
Yeah. And can we draw back a little on this term "crime"? Right. Like criminality, criminalization. And I don't want to get too abstract, but I do want to recognize that what is and is not a crime changes over time. In authoritarian societies, journalism is a crime. In the US, accessing healthcare in a number of states is a crime. Trans healthcare is a crime. There is a woman living in jail right now in the US, her name is Jessica Burgess, and she is there largely because Facebook turned over messages between her and her daughter to law enforcement that were used to convict her of the crime of obtaining abortion care and dealing with its aftermath in the state of Nebraska, after the Dobbs decision had made that illegal. So, yeah, they found a crime. But what are we actually talking about here, when we talk about the confluence of those types of authoritarian tendencies and the fact that crime is always defined and redefined according to social norms and mores? And obviously, this is not to, like, relativize things like murder. Right. But to be really rooted in our present moment, and then to kind of argue that in order to stop crime we need full visibility on everything: that is, we know, chilling to speech, chilling to dissent, deadly for journalism, perilous for any meaningful social transformation of the type that will be necessary to overcome climate change, of the type that will be necessary to meet many of the inflection points that we are living through now.
A
Yeah. And that's, of course, going to be even more relevant, at least in the US, with the upcoming US presidential election. And maybe we're going to see what, I forgot, yeah, we're going to see what the US is going to look like in, I guess, a few short months. But, Meredith, this conversation was incredibly insightful. Thank you so much for coming on the show. And I really, really do appreciate it.
C
Thank you so much, Joseph. Big fan of your work at 404 Media. And use Signal.
A
I highly recommend that as well. All right, let me just play us out with our outro. As a reminder, 404 Media is journalist founded and supported by subscribers. If you wish to subscribe to 404 Media and directly support our work, please go to 404 Media co. You'll get unlimited access to our articles and an ad free version of this podcast. You'll also get to listen to the subscribers only section where we talk about a bonus story each week. Another way to support us is by leaving a five star rating and review for the podcast. That stuff really helps us out. Unfortunately, algorithms and shit. This has been 404 Media. We will see you again next week.
Date: December 31, 2025
Host: Joseph (404 Media)
Guest: Meredith Whittaker (President, Signal Foundation)
This episode is a replay of 404 Media’s in-depth interview with Meredith Whittaker, president of the Signal Foundation. The discussion focuses on the current state and future of privacy, the perennial debate over backdoors in encrypted messaging, the real-world threats to end-to-end encryption, and the challenges posed by AI integrations and platform security. Throughout, Whittaker articulates Signal’s philosophy, its technical approaches, and her principled stances on the intersections between privacy, crime, and state power.
On User Motivation:
“People don’t pick up Signal to flex that they care about privacy. They pick up Signal to talk to someone they love...”
(Meredith Whittaker, 06:53)
On Legislation:
“If you bolt surveillance as a mandatory component… you have undermined encryption.”
(22:37)
On Law Enforcement Tradeoffs:
“I reject the framing… The premise that law enforcement is perilously on the verge of being shut out… has sustained for as long as this debate has sustained.”
(39:02)
On Criminalization & Liberty:
“What is and is not a crime changes over time in authoritarian societies. Journalism is a crime… in the US, accessing healthcare is a crime in a number of states…”
(45:30)
On Open Source as Immune System:
“There’s a lot of scrutiny and a lot of eyes on our code, which is honestly, it’s a gift… that’s a powerful immune system.”
(13:35)
On Microsoft Recall and AI:
“Recall... constitutes a kind of eidetic memory of everything you were doing online for the last three months…”
(25:17)
This interview distills the challenges and complexities facing end-to-end encrypted platforms like Signal in a world increasingly hostile to robust privacy. Meredith Whittaker articulates Signal’s privacy-first philosophy, its careful approach to feature development and code audit, and its unwavering opposition to government-mandated surveillance—whether via law, hacking, or AI-powered platforms. She challenges listeners to consider not just technological or legislative risks, but also the philosophical underpinnings of privacy, the shifting nature of laws, and the dangers of letting crime control narratives drive mass surveillance.
Recommendation:
If you value privacy and want to understand how state power, business models, and new technologies threaten private communication, this episode—anchored by sharp, quotable commentary from Whittaker—is essential listening.