
Meredith Whitaker
It's on. It is.
Kara Swisher
Hi everyone, from New York Magazine and the Vox Media Podcast Network. This is On with Kara Swisher, and I'm Kara Swisher. My guest today is Meredith Whitaker, president of the Signal Foundation, the nonprofit that runs the Signal messaging app, one that I use all the time because it's pretty much the only one I trust. Signal has been around for a decade and only has 70 to 100 million users a month, which is peanuts compared to WhatsApp's just under 3 billion. But Signal is not a lightweight in the tech world. Its Signal Protocol is considered the gold standard in end-to-end encryption. In fact, it's the tech that WhatsApp and Facebook Messenger and Google use. The difference is that Signal users actually keep all their metadata locked up too, which is why it's become the messaging app of choice for people who are really concerned about privacy: cybersecurity experts, NGO workers, indicted New York City Mayor Eric Adams, Drake, and me too. As I said, Meredith Whitaker has been leading the Signal Foundation since 2022, and she's kind of a perfect person to do the job. More than one reporter has called her Silicon Valley's gadfly, which is also my title, by the way. After starting at Google in 2006, Whitaker quickly moved up the ladder, founding Google Open Research and M-Lab. In 2017, while she was still at Google, she also co-founded the AI Now Institute, very early, at NYU with Kate Crawford, to research the social implications of AI. So basically, Whitaker was a rising star. But then in 2018, after she helped organize walkouts to protest sexual misconduct at the company, citizen surveillance, and Google's military contracts, the company told her to cool it and she left. Whitaker has been a no-holds-barred advocate for data privacy in a world increasingly run by what people call the surveillance economy. I'm excited to talk to her again about that, the increasing consolidation of power and money in tech, especially in AI, where she sees this privacy fight going, and how a nonprofit or even for-profit startups can survive. She's still a firebrand. And our expert question comes from another of those people, Renée DiResta. So this should be good. One more thing: I've been nominated for best host in the Signal Listeners' Choice Awards, which has nothing to do with the Signal messaging app. We're not trying to butter up judges here, but if you agree that I am the best host, and you know you really should, you can follow the link in the show notes. Thank you so much. All right, let's get to it.
Meredith Whitaker
It is on.
Kara Swisher
Meredith, welcome. Thanks for being on on.
Meredith Whitaker
So happy to be here, Kara, and nice to see you again.
Kara Swisher
I know it's been a while since we did an interview. It was 2019, you were still at Google building up the AI Now Institute on the side. Now you're at the Signal Foundation, a nonprofit. Talk a little bit about that shift and what happened there and why you decided to do this.
Meredith Whitaker
Yeah, it feels like centuries in tech and in my life. I mean, for me, this is all part of a single project, really: how do we build technology that is actually beneficial, actually rights-preserving? How do we stop the bad stuff, start the good stuff? And of course, technology is vast and these companies are huge and there are many angles to take. So I was also at the FTC, trying to help there, and at AI Now, trying to produce better research. And now Signal, to me, is just the most gratifying dream job, in a sense, because we are actually building and shipping high-availability tech that is rejecting the pathology and the toxic business model that I've been pushing at, prodding, fighting for almost 20 years in my career.
Kara Swisher
Talk about why you left Google. It had gotten, I don't think hostile is the right word, but not what you wanted as I recall.
Meredith Whitaker
Yeah, I mean it was a combination of hostility from some in management who didn't like honest conversation about ethical business decisions, let's say.
Kara Swisher
Oh that, oh that.
Meredith Whitaker
So yeah, that old thing, not evil.
Kara Swisher
But you know, adjacent.
Meredith Whitaker
Not evil, just, you know, like impolite company. Don't mention the evil. Yeah, so, you know, I had raised a lot of alarms. I'd participated in organizing against some really troubling business decisions, some of the surveillance and defense contracting that were just being made based on bottom line, not based on kind of ethics and duty of care. And I kept that up for a number of years. But at some point I felt that I had hit the end of the road. The pressure from Google to stop, and some of the retaliation that I and a lot of my colleagues were facing, meant that we were just spending more and more time strategizing how to keep our toehold, rather than building, say, the AI Now Institute, which has gone on to do incredible things, or thinking about positive projects in the world. And my adrenals needed a rest, I needed a change of pace. And I'd also been at Google for over 13 years at that point.
Kara Swisher
When you say they didn't want it, that there was pressure, explain that for people who don't understand what it's like within these companies. Google's always been a place where things are debated, since the beginning. Or it was, even if two people really did run the place, or controlled the place.
Meredith Whitaker
Yeah, look, I mean I joined in 2006, which was a wild and free time at Google, and they really did nurture a culture of open conversation and communication. There was that sort of Usenet message board vibe on our internal mailing lists. You'd just go on and on debating the nuances of any given point. So that was sort of the nervous system of Google when I jumped in there. And of course there was a huge amount of money, so there was a lot of room to play around, to fail, to learn things, to start new initiatives. Now, it doesn't mean that decisions were made by consensus. Right. But it means that was the environment that was nurtured, and that attracted a lot of people.
Kara Swisher
Yeah, on every topic. It wasn't just very serious ones. I remember a huge argument over kombucha there at one point.
Meredith Whitaker
Yeah, yeah, yeah.
Kara Swisher
Yelling at the founders about the shitty kombucha.
Meredith Whitaker
There was a famous thread on goji berry pie that went on for like 3,000 posts. Right. So, you know, I learned my poster skills pretty early. But, you know, that muscle still remained. And a lot of people were there because they believed the rhetoric. Right. Like, "don't be evil" is a bit trite, and certainly it's also far down the line.
Kara Swisher
There's a lot to the left of it. You could do a lot of bad...
Meredith Whitaker
...things to the left.
Kara Swisher
Right, yeah.
Meredith Whitaker
And evil to whom? I mean, come on. But nonetheless, in a socially awkward discipline of kind of nerds who do computer science, pointing to "don't be evil" was often just a way to say, like, yo, I'm uncomfortable. Right. So there was this reflex in the company. And as they moved, you know, let's say they moved closer and closer to the red lines they were able to swear off in the beginning, because they were so far away, because the money was coming in. And, we'll solve the problem of what happens when we have to choose between $1 billion and hanging onto our ethics later, right? That, you know, that seemed like a fantasy. And of course they started hitting up against these red lines. In 2009, you know, the kind of request from the Chinese government, they held firm there. And then we went into the mid-2010s and they're signing up to be defense contractors, building AI drone-targeting and surveillance software.
Kara Swisher
So you had started the AI Now Institute on the side. Explain for people what that is. And then we're going to get to signal because it's how you got here is an interesting journey, I think.
Meredith Whitaker
Yeah, no, my path is wild and winding. So I had founded a research group at Google, the nucleus of which was Measurement Lab, this large-scale Internet performance measurement platform, run with a consortium of folks in Google and outside of Google at the Open Technology Institute. So then, you know, I hear about AI. It's machine learning back then, around like 2012, 2013, 2014. And I'm like, oh, what is this? This seems cool. It's like a statistical technique that does some stuff. Oh wait, you're taking really flaky data, like, way more flaky than mine, and moving way higher up the stack, to making decisions about human beings.
Kara Swisher
Yeah, the garbage in, garbage out idea.
Meredith Whitaker
Garbage in, garbage out, and then, like, garbage into a black box that you then treat as a godhead. Right? Like, that's the issue. You're calling this intelligence, but actually you've just sort of masked its provenance, masked its flaws, and are using it to imply that these, you know, massive corporate infrastructures are somehow sentient and alive. And what I'm not saying is that patterns in data that is responsibly collected aren't useful for decision making. Obviously they are. Right. The issue is that there is a toxic business model at the heart of this, that those patterns aren't always responsibly derived, and that we forget, at our peril, that data isn't a synonym for facts.
Kara Swisher
Yeah. I just interviewed Yuval Harari, and he made a salient point, a simple one: that there's a lot of information out there, but not a lot of facts, and that's hard to discern. And of course this higher intelligence isn't going to know the difference, because you put it in there, right? It's only going to know what it knows. So, you got worried, you left. Explain how you got to Signal and what your thought was on why it was important.
Meredith Whitaker
Well, I've actually been a big fan of Signal, involved in the world of Signal, since around the beginning. When you work at the network layer, at the kind of low layer, you're privileged to begin to learn pretty quickly that everything is kind of broken. Right. There are security flaws or privacy issues, like duct tape and sticky tape, and like a handful of core libraries maintained by open source contributors who live on a boat and won't answer emails. You're like, oh, this is the Internet. Wow. And I think I began to be animated by questions of privacy and security pretty early because of that exposure, and because it was the most interesting place, frankly. It was where the fresh air of politics, you know, met the theory of the Internet. And so I had been a fan. I'd known Moxie for a number of years.
Kara Swisher
Who's Moxie?
Meredith Whitaker
Moxie Marlinspike is the founder of Signal, co-author of the Signal Protocol, and really carried Signal as a project on his back, putting huge amounts of time and energy into it to do what is almost impossible, which is create this virtuous, nonprofit, open, high-availability communications tech that is not participating in surveillance, that is not participating in targeting or algorithmic tuning or content curation or any of the other things that we've seen go real, real south with the others right now.
Kara Swisher
Let's talk about Signal Messenger, which is the core product. For a while, you know, a lot of the big concerns around messaging apps were the green-versus-blue-bubble battle or stupid things like that. But there are more important things. What does it do differently, and what doesn't it do, so people can understand the difference between all the different messaging apps?
Meredith Whitaker
Signal's big difference is that we are truly private. We collect as close to no data as possible and we develop open source so that our claims, our code, our privacy guarantees don't have to be taken on trust. You can actually verify them. And because of the reputation we've built up in the community, because the Signal protocol was a massive leap forward in applied cryptography and actually was the thing that enabled large scale private messaging on mobile devices, we get a lot of scrutiny and that promise of many eyes making better, more secure code has really delivered for us.
Kara Swisher
Right. People know what you're doing. And so it's a messaging app like any other, in the sense that you can message back and forth. But what does it do differently, and what doesn't it do?
Meredith Whitaker
Yeah, so it protects your privacy, let's say up and down the stack. We use our own gold standard cryptography that actually others license from us. WhatsApp licenses it, Google licenses it. It is the gold standard.
Kara Swisher
Yes, this is end to end encryption.
Meredith Whitaker
End-to-end encryption. And we created kind of the gold standard there, and it protects what you say. So you and I are texting, Kara. Signal doesn't know what we're saying on Signal. Like, you can put a gun to my head; I can't turn that over. But we go well beyond that too, because of course metadata, this fancy little word for data about you and your friends, is also incredibly revealing. So we don't collect data about your contacts, your groups, your profile photo, when you text someone, who's texting whom. All of that required research and actual design, like building new things, solving problems, because the ecosystem we're in has been built with the assumption that everyone wants to collect all the data all the time and keep it to do whatever. So we actually have to go in and be like, well, we can't use that common library for development, because if we used that, it would collect data. Let's give a concrete example. When we added GIF search to Signal, because everyone likes a reaction GIF, right? Or at least boomers do. We couldn't just use the GIPHY library. That would have taken a couple hours; we would have tested it, and the engineers go home and go to sleep. No, we had to rewrite things from scratch. This was actually a significant architectural change. It took a number of months, and when we implemented it, it meant that we weren't giving any data to GIPHY. They have no idea whatsoever. So when they're acquired by Meta, we don't have to worry.
Kara Swisher
You don't have to worry. Right, exactly. So this is the end-to-end encryption at work. Who's using it now, and where are you seeing growth right now?
Meredith Whitaker
Yeah, I mean, our user growth has been steady, and I think, again, the bloom is off the big tech rose. Right. People do not want to be surveilled. There is a giant demand for privacy. And so Signal is global core infrastructure. We're used by journalists everywhere, human rights workers. We are the core infrastructure in Ukraine for safe communications, for sensitive information. Government, military, we are core communications in governments across the world. Right. Just, you know, we don't want a data breach to expose sensitive information. I think every time there is what we call a big tech screw-up, or a massive data breach, we see spikes in Signal growth. We also see spikes when there's geopolitical volatility. So, you know, when there was the uprising in Iran around women's rights, we saw a massive spike in use, and then we saw the government or the ISP try to block it, and then we stood up proxies to try to get people access anyway. So it's really, you know, it's when people suddenly recognize the power that personal sensitive information can give those who might want to oppress or harm or otherwise hurt their lives.
Kara Swisher
Or sell you things.
Meredith Whitaker
Exactly, or sell you things and then decide what news you get. Decide if you get an ad for a good rate on a home loan or a bad rate. These things that are subtle but also really meaningful.
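A note for the technically curious: the guarantee Whitaker describes above rests on public-key key agreement plus authenticated encryption. Here is a minimal Python sketch of that general Diffie-Hellman-plus-AEAD pattern, using the pyca/cryptography library. It is a toy illustration only, not the actual Signal Protocol, which adds prekeys, ratcheting key rotation, and the metadata protections discussed in the conversation.

```python
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party holds a private key that never leaves their device;
# only public keys are ever exchanged.
alice = X25519PrivateKey.generate()
bob = X25519PrivateKey.generate()

# Both sides compute the same shared secret via Diffie-Hellman.
alice_shared = alice.exchange(bob.public_key())
bob_shared = bob.exchange(alice.public_key())
assert alice_shared == bob_shared

# Derive a symmetric message key from the shared secret.
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"e2ee-demo").derive(alice_shared)

# Alice encrypts; a relay server only ever sees this ciphertext.
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"meet at noon", None)

# Bob, holding the same derived key, decrypts.
assert AESGCM(key).decrypt(nonce, ciphertext, None) == b"meet at noon"
```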
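The GIF-search example above boils down to a familiar pattern: route requests through a proxy you run, so the third-party vendor only ever sees the proxy, never the user. Below is a toy, standard-library-only Python sketch of that general pattern; the upstream URL is hypothetical. This is not Signal's actual implementation, which, per Signal's own engineering write-ups, goes further so that even Signal's servers can't see the search terms.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, quote, urlparse
from urllib.request import urlopen

# Hypothetical vendor endpoint, for illustration only.
UPSTREAM = "https://api.example-gifs.test/search?q="


class GifSearchProxy(BaseHTTPRequestHandler):
    """Forwards GIF searches so the vendor never sees the user."""

    def do_GET(self):
        query = parse_qs(urlparse(self.path).query).get("q", [""])[0]
        # Forward only the search term: no client IP, cookies, or headers
        # reach the vendor; it sees the proxy as the requester.
        with urlopen(UPSTREAM + quote(query)) as upstream:
            body = upstream.read()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), GifSearchProxy).serve_forever()
```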
Kara Swisher
WhatsApp is peddling privacy in the form of encryption as a selling point, but still collects metadata. Talk about this business model for a huge slice of the tech sector at this point: data collection, surveillance capitalism, profits, et cetera.
Meredith Whitaker
I mean, I think of this as really the original sin, right? Like, the Clinton administration knew there were privacy concerns with the Internet business model. They had reports from the NTIA, they had advocacy from human and civil rights organizations. This was a debate that played out over the '90s as the rules of the road for the Internet were being established. And this is why I get itchy when people are like, they could never have known. And I'm like, literally there were reports, before any of this was done, laying out exactly how this would go down, and it went down that way and slightly worse. This wasn't a matter of guileless innocence leading to innovation that got out of control. This was a business model choice, where, you know, the Clinton administration said absolutely no restrictions on commercial surveillance. And they also endorsed advertising as the business model of this Internet. And like, of course, what is advertising if not know-your-customer? We gotta get more and more data, right? So it's an incentive, like a flywheel incentive for surveillance. We want as much data about our customers as possible so we can target them with ads. And what does that incentivize? That incentivizes huge clusters of compute and data storage, so that you can keep this data. That incentivizes things like MapReduce, which is sort of the precursor to a lot of the AI models now. That incentivizes, you know, social media that calibrates for virality and sort of upset and cortisol, and, you know, it's like amygdala activation, basically.
Kara Swisher
I always say enragement equals engagement.
Meredith Whitaker
So, yeah, there you go. Exactly. And why does it equal engagement? Not because, like, we like engagement, but because that means you see more ads, you click on more ads, you contribute more data. You know, the cycle continues. And this business model is super profitable, so that's. That's the norm.
Kara Swisher
Let's talk about finances. You've said there isn't a business model for privacy on the Internet. Now, Signal is not just opposed to surveillance capitalism, as we said; it's a nonprofit funded by donations, including a big chunk from WhatsApp co-founder Brian Acton, who is also a co-founder and board member at Signal. You don't take investments, you don't have advertising, the app is free. But you still need money to pay your engineers and keep your servers running. Talk about how you do that.
Meredith Whitaker
Yeah, well, our costs are about $50 million a year. And every time I say that, I get a couple tech founders, a couple tech execs, come up to me and say, like, congratulations on keeping it lean. Right? So we're doing really well, but what we're doing is big and requires resources, because tech is capital intensive. So right now we are funded by donations. That's our nonprofit status. And as we just sort of touched on, that nonprofit status is not a nice-to-have. It's not like, oh, we like, you know, charitable giving. No, it's a prophylactic against the pressures of a business model that is opposed to our mission, which is private, rights-preserving communication. So we are looking at different models right now for how we grow this. How do we sustain Signal, and how do we make sure that Signal isn't just a lonely pine tree growing in a desert? Right. We need an ecosystem around us. We can't be just the sole example of one that got away from that business model. So I think about things like: how do we set up endowments that can sustain Signal long term? How do we think about tandem structures or hybrid structures, where things that would otherwise be polluted and toxified by exposure to a toxic business model are kept cordoned off? There's some vision in there that we could inject, but the flat fact is, that's the cost.
Kara Swisher
Yeah. And there's nothing you want from your users except use. Right. It's sort of like a free park or something like that. But protecting privacy is also a moving target. There are new systems on the horizon. Quantum computing comes to mind, which would require a complete overhaul of the encryption systems you depend on. You're already preparing for Q-Day, as the Wall Street Journal recently called it. Very dramatic over at the Wall Street Journal. But explain what Q-Day is and what you've been doing to deal with it. Some people have a vague knowledge of quantum computing, but basically, it could decrypt everything very quickly.
Meredith Whitaker
Yeah. It's very, very powerful computing that can factor the products of large primes, which is what we depend on in cryptography, very quickly. Right. And so this would break the premise that kind-of-unbreakable math is the guarantor of current crypto systems. And a future in which we have sufficiently powerful quantum computing, I guess, is what Q-Day is, although I would have thought it was like a Q&A thing.
Kara Swisher
Yeah, Q is a letter we have to stop using.
Meredith Whitaker
Yeah, I'm like, oh, cool.
Kara Swisher
X and Q. Yeah, X and Q.
Meredith Whitaker
I know, we're reducing our literacy as we speak. But there it is: quantum computing is developing, and there's no clear answer to when we will have a quantum computer that can actually do that. But it's catastrophic enough that we can't rest on hope or postpone it. So Signal was the first private messenger, the first messenger, to implement post-quantum-resistant encryption for our Signal Protocol. And the resistance we added protects against the kind of attack we can be worried about now, which is called harvest now, decrypt later. And that just means you collect all the encrypted data. It's garbled bits; it means nothing. But you save it and you save it and you save it, and at a time when these sufficiently powerful quantum computers exist, you then apply them to decrypt it.
Kara Swisher
The harvest-now thing is really interesting. For people who don't understand, it's like stealing all the safes and putting them in a room, and then someday you'll be able to figure out how to open them, essentially.
Meredith Whitaker
Yeah, that's a perfect analogy.
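To make the hybrid idea concrete: Signal's post-quantum upgrade, PQXDH, combines classical X25519 key agreement with a post-quantum KEM (ML-KEM/Kyber). Here is a minimal Python sketch of just that mixing step, using the pyca/cryptography library. The KEM output is a labeled placeholder, since this illustrates the concept only and is not a real Kyber implementation.

```python
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical elliptic-curve agreement: secure today, but a sufficiently
# powerful quantum computer running Shor's algorithm could recover this
# secret from recorded public keys.
ours = X25519PrivateKey.generate()
theirs = X25519PrivateKey.generate()
ecdh_secret = ours.exchange(theirs.public_key())

# Placeholder for a post-quantum KEM shared secret (e.g. ML-KEM/Kyber).
# A real implementation would call a KEM library's encapsulation here;
# os.urandom merely stands in for that output in this sketch.
pq_secret = os.urandom(32)

# Hybrid derivation: mix both secrets through one KDF. An adversary who
# "harvests now" must later break BOTH hard problems to decrypt, so the
# traffic stays protected even if the classical half falls.
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"hybrid-demo",  # hypothetical context label
).derive(ecdh_secret + pq_secret)

print(session_key.hex())
```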
Kara Swisher
Yeah. Yeah. So one other thing, obviously, is reputation. In 2022, federal investigators said they had gotten access to Signal messages that helped them charge leaders of the Oath Keepers over January 6th. It wasn't clear how they got those. And, I'm sorry to say this because I think he's the biggest information sieve on the planet, Elon Musk questioned the app on X, something about known vulnerabilities not being addressed. Any idea what he meant? I mean, I'm not gonna ignore everything that imbecile says, but what kind of impact do reports like that have? I know, I know. Large sigh.
Meredith Whitaker
Yeah, I mean, I don't know what he was talking about. I think, getting to the Oath Keepers point, look, the way that the cops usually get data is someone snitches, someone has it on their phone, they get access to the phone. It's Occam's razor, and it's not that complicated. We're dealing with people, we're dealing with conspiracies, which never really work out that well. But it is a good hook if you wanna scare people about security claims, particularly because, right, like 99.999% of people can't themselves validate these claims, which makes this kind of weaponized information environment really dangerous and really perturbing to us, which is why we're so careful about this. When Elon tweeted that, what I can say for sure, and this is what I posted on Twitter: we have no credible reports of a known vulnerability in Signal. We have a security mailing list that we monitor assiduously, where we haven't heard anything. There are no open critical vulnerabilities that we have heard of. So, you know, it kind of puts us in a position of proving a negative. And so it was this off-the-cuff tweet. It caused a lot of confusion. I was sort of dealing with that for a number of days, you know, not because it was serious, but because it seriously freaked people out. We had human rights groups, we had people calling us, just saying, look, this is life or death for us, right? If Signal is broken, we're going to lose people. We need to know for sure. And what I can say is, we have no evidence that this is true. I will say, since then, Elon has tweeted screenshots of messages on his phone that are Signal screenshots. So, you know, you can put that together. Of course, everyone uses Signal.
Kara Swisher
He's got a lot of secrets to keep, I think.
Meredith Whitaker
Well, I mean anyone who has anything they're dealing with that is confidential or has any stakes generally ends up using Signal.
Kara Swisher
Yep. We'll be back in a minute.
It's fair to say you've faced a lot of headwinds in this battle to maintain tough standards on encryption, and everybody does. Remember Apple's battle with James Comey, of all people? People don't remember it was James Comey. But last year the UK passed the Online Safety Act, and the EU has been debating a child sexual abuse regulation, known as the "chat control" bill. Basically, they're all touted as efforts to protect users, especially children, online, which seems like a good thing, right? But you and other security experts have been pushing back. Talk about these two bills, what they do, and what they would do to your model.
Meredith Whitaker
Yeah, I'll just kind of characterize them with one brush, because ultimately they are aiming for the same thing. TL;DR: in the name of protecting children from abuse and harmful content, they would mandate, or would give a body or agency the power to mandate, scanning everyone's private messages and comparing them against some database of prohibited speech or content. And this isn't possible while preserving end-to-end encryption. Like, that's the mathematical truth. You implement a backdoor, you have broken the mathematical guarantee of encryption. And we have only to point to the fact that the Wall Street Journal just reported that, apparently, the Chinese government, no surprise to anyone, has been hacking interception points, so, backdoors, in US systems. Right, so this is not a game. This is not a hypothetical. This isn't the technical community raising large hyperbolic flags. No, this is the reality. And any backdoor in a network compromises the whole network. So you mandate scanning of Signal in the UK. Of course, communications cross borders and jurisdictions all the time. And then that means Amnesty International, housed in the UK, when they're talking to human rights defenders in Uganda, where being gay is punishable by death, working to get people's information, to get asylum cases going, to get people out and to safety: that conversation is then compromised.
Kara Swisher
I just, in fact, spoke to Hillary Clinton, and she was talking about how they used Signal and WhatsApp to help women get out of Afghanistan after the US military withdrawal. And they needed that secrecy to protect them. And you noted you see a surge in downloads when conflicts arise. But with these backdoors, once anybody gets in, then everyone's able to get in.
Meredith Whitaker
Yeah. A backdoor is not something you can control. Once there's a door, anyone can walk through it. Right? So this is the magical thinking that we talked a lot about when we were pushing back on this bill. You want a golden key, I think, as James Comey said during the Apple showdown; you want a magic wand, you want a secret portal that only you have the spell to open. Well, that doesn't exist. That's a fairy tale. What does exist is a critical vulnerability in the only core systems we have to guarantee confidentiality and cybersecurity of communications. And if you undermine those, if you open that door for everyone, it means that the technology we have for that type of security no longer exists, no longer matters. Right? So it's serious.
Kara Swisher
How much does the recent arrest of Telegram founder and CEO Pavel Durov impact the debate? Clearly, Telegram's known as a cesspool. You know, my son was like, it's for sex and drugs, Mom, just so you know. It's often named in one breath with Signal, because the company talks about privacy and encryption in the larger sense. But for those who don't remember, Durov was arrested in connection with distributing child sex abuse material and drugs, money laundering, and working with organized crime. He's accused of failing to allow authorized law enforcement interception. Basically, he didn't give investigators access on the app. You're not a social media company, but talk about the difference and how that has affected you, because you're adjacent to him, of course.
Meredith Whitaker
Yeah. I mean, I think the discourse is exactly the right way to frame this. The impact of the arrest, the talk and the kind of information hyperbole, the way this sort of became a martyr story, and the lack of really concrete information, which never helps, meant that there were a lot of questions. Right. And I remember being like, wait, what happened? What are the charges? Sort of sorting through the French legal system. But ultimately it doesn't affect us. Right. We're not a social media app. You can't go viral on Signal. You can't broadcast to millions of people. It doesn't have those sort of encounter features. It's a very different thing.
Kara Swisher
And for people who don't understand, these are groups that they create on WhatsApp or on Pavel Durov's platform, Telegram, and...
Meredith Whitaker
Millions and millions of people. And then they have a kind of, like, what's-happening-near-me feature with geolocation. So there's all sorts of things happening there that mean that the legal and regulatory thresholds and duties they face are wildly different from Signal's. They're a social media company. They broadcast things to millions of people. They are constitutively not private and not very secure. We have designed Signal so that we are an interpersonal communications app. We intentionally do not add channels or features where you can go viral. We intentionally steer clear of those kinds of duties, because you can't meet those obligations while being robust in our privacy and security guarantees. Full stop.
Kara Swisher
Right? But there is that idea, that concern, that total encryption doesn't help the good guys; it aids and abets bad actors. I think that's the bigger worry about CSAM, that is, child sexual abuse material. Every week we have a question from an outside expert. Let's listen to this one, and I'd love you to answer it.
Renée DiResta
Hi, Meredith. This is Renée DiResta, author of Invisible Rulers and previously the technical research manager at the Stanford Internet Observatory. My question for you is: as AI makes it easier to generate synthetic nonconsensual intimate imagery and CSAM, how specifically should platforms and governments respond to the production and dissemination of this harmful content? Is it possible to implement effective measures against these abuses without infringing on privacy and free expression?
Kara Swisher
So what are your thoughts on this? One of the things that's important is there are significant and justifiable concerns in this area, right? In certain areas, drugs, child sexual abuse, et cetera. So how do you protect against it?
Meredith Whitaker
Yeah, I mean, absolutely, this is a very serious area. Right. And that's one of the reasons it has been so, let's say, effective in floating a lot of these anti-encryption proposals: because it takes the air out of the room, frankly. A lot of people have experience with this, sadly, and it is extremely difficult and extremely emotionally engaging. So I think we need to take the issue itself seriously first. Right. How do we protect children, full stop? And then begin to look at: what is the slate of options that we have? Where is the problem coming from? Are we funding social services? Are we ensuring that there are infrastructures in place so that when a child reports that something bad is happening, that a priest or a teacher or their uncle is involved in something horrifying, we take care of that child and protect them? Why is Prince Andrew walking around in a country that is fixated on encryption as the culprit here? Right. And this is not saying that platforms don't have responsibility here. But when you look at the facts, when you look at the number of people in different countries' law enforcement who are actually dedicated to reviewing this material and tracing it down, I can't say those numbers publicly, because they were given to me privately, but we're talking about tens of people in one case; we're talking about two people total in another. So a lot of times, the issue is a haystack full of needles and not enough people to categorize the needles. We're talking about resources there. We don't have basic trust and safety tooling available to startups. So there are many places to invest in actually tackling this, both online and off. Right. You know, go after payment processors. A lot of what's happening is sort of sextortion, and that's a node. There are reporting infrastructures for public platforms and social media. There's all sorts of research on this. Attack the business model. All of those are options on the table, to me. What leads to my distrust of a lot of the remedy space is that, with all of that being obvious, with a lack of investment in social services, with the culture we have where children are often not believed, still encryption is manufactured as the cause of this problem, when there's very, very little evidence that private communication has any causal role in this issue.
Kara Swisher
Right. But of course, I think, Durov flouted it. Not helping. Yeah.
Meredith Whitaker
But he also doesn't run a private communications platform. Right. Like, none of that was private. It was just a flex of, like, yeah, we're not going to help. Right. So that's a social media platform just saying no. And I think there was a, you know, how do they say it? A fuck-around-and-find-out moment.
Kara Swisher
Yeah. For him.
Meredith Whitaker
Right. But that's very different from private communications and encryption. And it's weird how there's sort of a transitive property by which encryption becomes the, you know, the problem to be solved in every case, even when the evidence doesn't support that.
Kara Swisher
Well, it's sort of brute force. It's like throwing a hammer at a piano to make music or something. So in the US, we're seeing a lot of states passing bills requiring age verification and restricting social media apps and access for minors: Florida, South Dakota, Oklahoma, to name a few. There's an argument that a lot of these bills are being used as a smokescreen. You've called it surveillance wine in accountability bottles. Talk a little bit about these ideas of restricting young people, and then what you meant by surveillance wine in accountability bottles.
Meredith Whitaker
Yeah, well, look, I don't think restricting young people ever works. As a young person who always figured it out faster than my parents, this is a paradigm where I have very low hopes. And I have even lower hopes when I see the folks at the tip of the spear of this movement, which are, frankly, often large biometric and age verification companies, like Yoti, who are selling the remedy. Right? If we pass a bill that requires age verification to get into websites, none of these platforms are going to do that themselves. They're going to contract with a vendor or a third party who will bolt on age verification software and run that for them, because that's a liability shield, and you don't want to build what you can lease or borrow. And then we get into a situation where age verification is a mass surveillance regime, similar to tracking people's content and habits online. Right. You can't know that someone's a child without also knowing who's an adult, to be clear about that. And so we begin to legislate a tracking and monitoring system that, one, won't really work, based on all the evidence to date, and two, is attacking the problem at the level of restriction, not at the level of platform business models. Right. And this is where we get into surveillance wine in accountability bottles. Which is really, like, you and I lived through this. We recognized that there is something really wrong with the big tech business model. Right. That accountability is needed. And we saw in the mid-2010s that there was a real call for this, and what came out of that were some good ideas, and then, I think, some bad ideas wrapped in accountability. Right? So instead of going after the surveillance-supported advertising and social media business model, cutting off the streams of data, perhaps implementing a very strong federal privacy law in the US that would undermine that model, take a bunch of money off the table but clean up a lot of the global harms, we're looking at bolting more surveillance and monitoring onto the side of it. So it's giving the government and NGOs and whoever else a piece of that monitoring, instead of reducing the monitoring itself. And so I think it's, you know, how do we tune these regulations, and how do we find the political boldness to actually go up against these business models and those who are profiting from them, instead of sort of trying to make our name as someone who went up against them while actually proposing remedies that don't go up against them? And I guess that's the age-old question of how we find real, bold political leaders.
Kara Swisher
Are you worried about the impact of the outcome of these laws? Say, here in the United States, we have this election, a potential autocrat who would love surveillance, although I don't think he'd understand it at this point.
Meredith Whitaker
I am, I am. You know, I would say I am a bit of a political exile, in that I'm concerned with centralized power wherever it exists, whether that's in large tech corporations or in governments. I don't think handing more tools to governments, and then imagining we live in a counterfactual world in which those governments will always be run by benevolent adults, is correct. And I think a lot of the people who've been pushing for accountability often live in that world. And frankly, I think a lot about what the collateral consequences could be of a very bad law in the US that affects the big tech companies that control the world's information and infrastructural resources. Right. You have three companies based in the U.S. with about 70% of the global cloud market. You have five information platforms, social media platforms, of which the biggest four are in U.S. jurisdiction, which at this moment control most of the world's information environment. So that's a lot of power to be homed in one jurisdiction, particularly given the kind of volatility we're seeing, and the way that people in general, as we move through generation after generation who are kind of native to tech and kind of understand these things, are beginning to recognize just how much power and control is housed in these companies. I think that recognition is seeping into the bedrock of popular consciousness. And I want to reduce this toxic business model, I want to create an ecosystem that is way less concentrated, before someone with malicious intent gets their hands on that throttle.
Kara Swisher
We'll be back in a minute.
Kara Swisher
Last time we spoke, AI was all we talked about. Things have changed dramatically, but you were warning back then about the surveillance economy and power consolidation. A Cassandra, I would say. I think you inspired me a lot to start really talking about it and pointing it out over and over and over again. So how are you feeling about AI now, and how is it related to this new AI economy, this idea of surveillance capitalism? Because these systems are going to get... You're saying we should stop it now? Is it even possible, given the consolidation of power in tech?
Meredith Whitaker
Yeah, well, look, I am a consummate optimist. I wouldn't be able to get up and do this if I didn't believe that change was always possible. I think we're in a frothy, hypey moment. And I do see the AI market wobbling. I see the definition of what AI is as sort of wafty right now, and I see a real struggle by the incumbents to try to keep that market going and maintain their advantage. And I can explain a little bit why I see that. Maybe we'll just start with what AI is. This deep learning paradigm, which is, you know, all the transformer models, ChatGPT, all of this, is still deep learning. Right. We haven't moved into some sort of escape velocity for a new form. It's actually pretty old. The algorithms are from the late 1980s. Now, there have been sort of moves that improve them, but nonetheless, it's pretty old. What is new is the massive amounts of data available, and this is data that's collected, created, that people are enticed to deposit, by these large surveillance platforms, and the massive amounts of computational infrastructure, which was basically created in order to support this business model. And then in the early 2010s, they were like: oh, you know what, these old machine learning algorithms, but we're going to call them AI because it's flashier, do new and interesting things, improve their performance, when we match them with our massive amounts of data, when we match them with the computational power we have.
Kara Swisher
Yeah. So right now, for people who don't know, most AI technology is held or financed by one of the big names: Microsoft, Google, Amazon, Apple or Meta. X, less so. Most of the AI chips, the GPUs, are controlled by Nvidia, which you've called a chip monopoly. So what you're essentially saying is they've assembled all the parts. Right? That's really what's happened. They've got the data they didn't have before, they've got the compute power, and it's all in the hands of people who can afford it. There's also the idea they've been pushing for a while that bigger is better. They're always like, we need to be... I've heard it from Mark, I've heard it from all of them: we need to be this big in order to fight China. That's usually the boogeyman they use, which is a boogeyman, let's be clear. We spoke with Mustafa Suleyman a few weeks ago, who said that even $1.3 billion from Microsoft wasn't enough to make Inflection AI successful. So he took the whole shop to his funders. We're seeing valuations in AI that are insane, $157 billion for a startup, OpenAI, and the money coming from just a few sources. You said the market is wobbly, but it doesn't feel wobbly. You know, it feels like it's the consolidation of power again. So I'd love you to talk about what that means.
Meredith Whitaker
You know, I think what is wobbly here is that there isn't a clear market fit for this. Right. We have this round-trip investment. I think it's 70% of Series A money in AI startups that was coming from the big infrastructure providers, and it was often in the form of credits to use their infrastructure. So we're talking about something really muddy there. It's not an organic startup ecosystem. Right. And the path to market still goes through these companies. If you want to monetize this, you're either selling access to a cloud API via Azure, Google Cloud, whatever, or you are monetizing it by integrating it into your massive surveillance platform, à la Meta, and kind of using it to sell ads.
Kara Swisher
Let me rewrite this email for you.
Meredith Whitaker
Exactly. Which, yeah, no, thank you. The email was one word and it was fine.
Kara Swisher
Yeah, that's what I always say.
Meredith Whitaker
And so I think there still hasn't been... We're talking about billions, hundreds of billions, you know, trillions of dollars. We're talking about capital to the moon, capital no one else can reach. And then we have, like, a bot that messes up our email, or Target spending a huge amount of money to develop this chatbot to help employees that was immediately roasted by everyone because it was so wrong, so bad, so annoying. Or, you know, Upwork Research just published a survey that said 77% of the people they interviewed, from executives through rank-and-file employees, said AI made their work messier, not better. Right. So when the rubber meets the road on the actual business model, we're still struggling to figure out: right, but what does this do that's worth hundreds of billions of dollars?
Kara Swisher
Let me push back on that. You could say, I heard that about the early Internet: what do I need this for? You couldn't have imagined an Uber when apps happened. You just couldn't have; nobody could. And eventually, they're making money now, you know, not a ton of money, but a lot of these companies you wouldn't have imagined then. So I think we're sort of in the stupid-assistant phase, but it's not going to stay there necessarily. You know, maybe you think differently. I don't. I think it will improve and become better and show what it's used for.
Meredith Whitaker
I think we're going to see a vast culling of the market, because we simply don't need many, many big, bloated models that are the same, LLMs that are the same and that are very resource intensive. I think we also need to be super careful about how we're measuring "better," and this gets into benchmarking and evaluation. I just published a paper with a couple of co-authors looking at this bigger-is-better paradigm, and you actually see that smaller, more purpose-built models, with better-curated, domain-specific data, often perform better in real life.
Kara Swisher
Radiology or something like that, or certain cancer cells, sure.
Meredith Whitaker
A lot of health applications. So again, I'm not saying throw the baby out with the bathwater, that data isn't useful for anything. But this particular type of massively bloated model, you know, it's not going to stop hallucinating. We are bolting kind of relational databases onto the side of these probabilistic systems, trying to stabilize them so that they're not as embarrassingly wrong on main, like they are in search in Google right now. But nonetheless, that's not a solution to the core problem, which is that they don't have information anchored in facts.
Kara Swisher
So not useful, you're saying. Like the same candy bar with different wrappers, the same shitty candy bar with different wrappers. What do you imagine would be useful? These smaller models, as you noted in your paper, these LLMs that are really specifically useful, for example.
Meredith Whitaker
I mean, that's a bit of a tricky question, because it's hard to answer when the claims being made around AI's utility by the marketing are that it's useful for everything. Right. I think we need to really break it down. Like, what would be useful in education, right? And this is where I'll point to some of the AI Now work looking at industrial policy. It's like, is AI even the thing that's useful there? Or what are the climate costs, what are the opportunity costs? You know, do we need school lunches or a chatbot? Right. And I want the freedom to answer that question before I have to take AI as a given and ask how to make it more useful, because there are places where data analysis is super, super important and useful. Right. But this is not a general everything-tool. This is a product being produced by big tech, which is making more use of the derivatives of this already toxic business model, creating more data that is often faulty or harmful but nonetheless powerful, that affects our lives, and that is being sold as a kind of skeleton key for everything when it isn't actually proven useful, just more for surveillance.
Kara Swisher
Economy, as you said. I spoke with historian and philosopher Yuval Harari recently. As I said, that's his nightmare scenario. He's calling for more regulation in AI and everything to slow it down. Now, you were a senior advisor on AI to the FTC. What needs to happen from your perspective?
Meredith Whitaker
Well, the market could slow down, or the business model could slow down. I think, you know, things like the CrowdStrike outage, which is when CrowdStrike effectively cut corners on quality assurance, on testing, on monitoring for a very, very critical update that affected core infrastructure like healthcare systems, traffic lights, banks, and cost.
Kara Swisher
Cost money.
Meredith Whitaker
Yeah, it cost money for them to do it right, so they didn't. And global infrastructure was offline for days. So the evidence that this sort of concentrated business model is bad is no longer deniable. And I do think that, combined with the danger of this sort of concentration in one jurisdiction, the concerns about sovereignty that you're seeing across the globe will, I believe, impel real investment and real focus on building healthier alternatives, with Signal as a sort of model for the thing we need. I also think the climate costs are just undeniable. Right?
Kara Swisher
Well, you know, they're building nuclear reactors for that. Three Mile Island is reopening.
Meredith Whitaker
I mean, did no one Google Three Mile Island, or did they hallucinate?
Kara Swisher
Well, to be fair, fossil fuels are worse than nuclear. Just by stats.
Meredith Whitaker
I mean, fossil fuels are terrible. But these companies will claim that they are carbon neutral, and then you scratch the surface there.
Kara Swisher
No, they're not. It's nonsense.
Meredith Whitaker
And you see that, you know, they're buying, you know, kind of carbon neutrality certificates and using weird accounting.
Kara Swisher
Right, right.
Meredith Whitaker
Yeah.
Kara Swisher
But let me finish up by asking: the biggest company here, obviously, the one getting the most money and everything else, is OpenAI, which is sort of the quarterback of this whole thing right now.
Meredith Whitaker
Microsoft, as we call them.
Kara Swisher
Yes, yes, that's right. Well, they don't like that. But you've written about how the term OpenAI was always misleading. Obviously, they're ditching their nonprofit status. I talked about this several times with many people who really did believe in the mission, I have to say. And I kept saying, there's too much money. Now, I'm sorry, I don't know who you are, but you're naive at best to think that this amount of money is going to keep this a nonprofit. You obviously probably took a salary hit when you joined Signal; it's a real nonprofit. So talk about that: is it possible to unwind that mindset? You just said we need many more organizations like Signal, but this is an enormous amount of money. How do you disassemble the mindset there, and how do you get it into a mindset more like yours? Because I just don't see this amount of money going any other direction.
Meredith Whitaker
Quickly, one aside: Signal pays very well. Okay? So if you're looking for a job, check out our jobs page. We do try to retain and attract the best talent. Right. And I think in the OpenAI case, it was never about just the models. Right? You need massive compute; it can cost $1 billion for a training run. You need massive amounts of data. And so you're going to have to either convince a big company to give that to you or convince someone with billions and billions of dollars to burn it on that. And once they've burned it on that, what do you do with the model? I'm going to let that hang in the air. What do you do with the model? You can't use it at scale without a path to market that goes through these companies. So when I talk about AI being a product of big tech, that's very, very literal, right? They have the data, they have the compute, they have the access to market. Either Meta is applying it within Facebook for services, this email-prompt thing, or OpenAI is advertising it as a kind of dongle for Azure customers who sign up for that. Or you have something like Mistral, a national champion in France building open-source large models. But how do they actually get to market? How do they make good to their investors on their business plans? They license it to Microsoft, who then licenses it out through their cloud. So when I talk about 70% of the cloud market being controlled by three companies, we also have to fold that into the AI conversation and recognize that, you know, AI has not introduced any new software libraries or new hardware. This is all stuff we've had in the past, that we know about, that exists. This is not novel. What is novel is this massive amount of data and the way it's being used to train these statistical models that can then be applied in one way or another.
Kara Swisher
So what happens then, from your perspective? I mean, obviously it was never going to stay a nonprofit after the money-raising happened, and they need the money, FYI. They can't not have the money to grow, and they've got everybody on their tail at the same time. What is their fate? What happens to a company like that?
Meredith Whitaker
I mean, it seems like there's a lot of interpersonal stuff happening. Like, you know, that's every company's court drama as well. So predicting is tricky. You were around at Google.
Kara Swisher
I was there even before you. It was like, come on, come on.
Meredith Whitaker
Touché, touché. Amazon.
Kara Swisher
Come on. Yeah, the hot mess of Twitter.
Meredith Whitaker
Oh God, yeah. So, hot mess aside, I would say they just kind of slowly become an arm of Microsoft. You know, maybe there's an Alphabet-ization of Microsoft, the same way Google kind of spun out different entities as an antitrust prophylactic. But again, there was never a model to be a nonprofit long term, given their vision, in my view. And I think what you've seen is just a series of whittlings away at that until it no longer exists. So, you know, Microsoft, Amazon, Google, those are the three big clouds. That's the gravitational pull into which model creators are going to ultimately get sucked. You know, Nvidia maybe.
Kara Swisher
Is there a model for a nonprofit in AI?
Meredith Whitaker
I think so. I mean, I think so. But again, we've got to take this term AI back a little bit. It does not simply mean massive, massive scale, you know, bigger-is-better AI. There are many forms of AI that are smaller, more nimble, more actually useful. And I think there could be a model for AI research that is asking questions that are less useful to the big players, to this bigger-is-better paradigm, and perhaps more useful for smaller uses, for things that aren't profitable, like environmental monitoring or something or other. I think there's a model there. I think, again, though, what I'm seeing is a misunderstanding of that fact, and a misunderstanding of just how capital-intensive this bigger-is-better AI is. That has governments around the world, who are anxious about sovereignty concerns and want their own, basically throwing money at AI without understanding that it's going back into the same companies. That's not going to ensure sovereignty. So it's like, oh, great, you have a $500 million European AI fund. Well, let me break it to you slowly: that's half a training run.
Kara Swisher
That's right, that's right.
Meredith Whitaker
So, like, what are you doing?
Kara Swisher
You can't afford it. Yeah. Just regulate them, just regulate them, same as with privacy and everything else. So, last question. We spoke five years ago, light years ago; it seems like a million years ago, a trillion years ago. Looking down the road, what do you think the next five years will bring? What are the most important things for your work, for Signal, for tech in general, if you had to prognosticate?
Meredith Whitaker
Yeah, well, what I am working on, what I'm kind of obsessed with right now, in addition to just building and sustaining Signal, which I love, is: how do we find new models to sustain better tech? Right? Like, once we've cleared the weeds of this toxic model, once we've prepared the ground, how do we grow things that are actually beneficial? How do we create a teeming ecosystem? How do we encourage open tech and democratic governance, which I think is a thing we don't talk about enough, frankly? How do we have a part in deciding what tech is built, who it serves, how we assess it? Some of the scrutiny that Signal receives from its loving and sometimes belligerent community of security researchers and hackers is part of our strength. How do we expand that to more people? How do we shift from a monolithic, five-platforms-control-our-news world to a much more heterogeneous ecosystem that's a little warmer, a little fuzzier, a little RSS-feed-ready, so to speak? Those are problems that aren't new, but I think there's a real new appetite to actually tackle them, because it's getting too obvious when you have Andreessen Horowitz and Y Combinator coming out and saying, we're the champion of little tech. We know that the death knell has rung for big tech, and what we need to do is then define what comes after.
Kara Swisher
Yeah, absolutely. All right, Meredith, thank you so much. I love talking to you; I shouldn't only talk to you every five years. And I really appreciate it, because I think people don't understand this. Please, everyone, use Signal. I use it all the time. It's free, and it's not stealing my stuff. And it's really. It's another moment where I'm like, where is the Signal of AI?
Meredith Whitaker
Amen. Well, I'm Team Kara on that.
Kara Swisher
Okay. All right. On with Kara Swisher is produced by Christian Castro Roussel, Kateri Yocum, Jolie Myers and Megan Burney. Special thanks to Sheena Ozaki, Kate Gallagher and Kaylin Lynch. Our engineers are Rick Kwan and Fernando Arruda, and our theme music is by Trackademics. If you're already following the show, you're not drinking the surveillance wine. By the way, it tastes terrible. If not, back into the accountability bottle for you. Go wherever you listen to podcasts, search for On with Kara Swisher, and hit follow. Thanks for listening to On with Kara Swisher from New York Magazine, the Vox Media Podcast Network, and us. We'll be back on Monday with more.
Unknown Advertiser
Creating highly advanced AI is complicated, especially if you don't have the right storage, a critical but often overlooked catalyst for AI infrastructure. Solidigm is storage optimized for the AI era. Offering bigger, faster, and more energy-efficient solid-state storage, Solidigm delivers the capability to meet capacity, performance, and energy demands across your AI data workloads. AI requires a different approach to storage, and Solidigm is ready for everything the AI era demands. Learn more at storageforai.com. Support for this
Unknown Advertiser
podcast comes from Anthropic. It's not always easy to harness the power and potential of AI. For all the talk around its revolutionary potential, a lot of AI systems feel like they're designed for specific tasks performed by a select few. Well, Claude by Anthropic is AI for everyone. The latest model, Claude 3.5 Sonnet, offers groundbreaking intelligence at an everyday price. Claude Sonnet can generate code, help with writing, and reason through hard problems better than any model before. Discover how Claude can transform your business at anthropic.com/claude.
Podcast Summary: "Signal’s Meredith Whittaker on Surveillance Capitalism, Text Privacy and AI"
On with Kara Swisher
Host: Kara Swisher | Guest: Meredith Whittaker, President of the Signal Foundation
Release Date: October 17, 2024
In this episode of On with Kara Swisher, Kara engages in an in-depth conversation with Meredith Whittaker, the president of the Signal Foundation. Whittaker, renowned for her advocacy in data privacy and her pivotal role in founding the AI Now Institute, discusses the intersection of surveillance capitalism, text privacy, and the burgeoning AI economy.
Whittaker begins by recounting her extensive career in tech, highlighting her tenure at Google from 2006 to 2019. During her time at Google, she founded Google Open Research and the Measurement Lab (MLAB) in 2017. Additionally, she co-founded the AI Now Institute at NYU with Kate Crawford to explore the social implications of AI.
Meredith Whittaker [04:56]: "How do we build technology that is actually beneficial, actually rights-preserving? How do we stop the bad stuff, start the good stuff?"
Whittaker discusses her departure from Google, citing increasing hostility from management due to her advocacy for ethical business practices and data privacy. Feeling the need for change after 13 years and enduring pressure to curb her activism, she transitioned to the Signal Foundation, viewing it as a fulfillment of her lifelong mission to promote data privacy.
Meredith Whittaker [06:06]: "I had raised a lot of alarms. I'd participated in organizing against some really troubling business decisions... my adrenals needed a rest, I needed a change of pace."
Whittaker delves into what sets Signal apart from other messaging apps. Signal employs its open-source Signal Protocol, considered the gold standard in end-to-end encryption, ensuring that not only the content of messages but also metadata remains private. Unlike competitors like WhatsApp, Signal collects minimal user data, making it a preferred choice for privacy-conscious users, journalists, and human rights workers.
Meredith Whittaker [14:20]: "Signal's big difference is that we are truly private. We collect as close to no data as possible and we develop open source so that our claims, our code, our privacy guarantees don't have to be taken on trust."
Operating as a nonprofit, Signal relies primarily on donations, including substantial support from Brian Acton, the co-founder of WhatsApp. Whittaker emphasizes the challenges of sustaining a privacy-focused platform without resorting to invasive business models. The foundation explores various funding strategies, such as endowments, to ensure long-term viability without compromising its mission.
Meredith Whittaker [20:15]: "We are looking at different models right now for how we grow this. How do we sustain Signal, and how do we make sure that Signal isn't just a lonely pine tree growing in a desert."
Addressing future threats to encryption, Whittaker explains Signal's proactive measures against potential quantum computing attacks, a scenario dubbed "QDay." Signal has already implemented post-quantum-resistant encryption in its protocols to safeguard against "harvest now, decrypt later" attacks, where encrypted data is stored and decrypted once quantum capabilities emerge.
Meredith Whittaker [22:54]: "Signal was the first private messenger to implement post-quantum resistance encryption for our Signal protocol."
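The mitigation described here rests on hybrid key agreement: the session key is derived from both a classical shared secret and a post-quantum one, so traffic recorded today stays confidential unless an attacker can eventually break both. Below is a conceptual sketch of only that combining step, with the two input secrets stubbed out as random bytes; it illustrates the principle and is not Signal's actual PQXDH implementation:

```python
# Hybrid key derivation sketch: a session key that depends on BOTH a
# classical and a post-quantum shared secret (the "harvest now, decrypt
# later" defense). Illustrative only; not Signal's PQXDH code.
import hashlib
import hmac
import os


def hkdf_sha256(ikm: bytes, salt: bytes, info: bytes, length: int = 32) -> bytes:
    """RFC 5869 HKDF (extract-then-expand) built from HMAC-SHA256."""
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()  # extract step
    okm, block, counter = b"", b"", 1
    while len(okm) < length:  # expand step
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]


# Stand-ins for real key-agreement outputs (assumptions, not real handshakes):
classical_secret = os.urandom(32)     # e.g. from an elliptic-curve exchange
post_quantum_secret = os.urandom(32)  # e.g. from a lattice-based KEM

# Concatenating both secrets means an attacker must break both primitives
# to recover the derived session key.
session_key = hkdf_sha256(
    ikm=classical_secret + post_quantum_secret,
    salt=b"\x00" * 32,
    info=b"hybrid-session-key-sketch",
)
print(session_key.hex())
```

The design choice worth noting is that the hybrid construction can only add security: if the post-quantum half turns out to be weak, the classical half still protects the key, and vice versa.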
Whittaker responds to criticisms and misinformation regarding Signal's security. Specifically, she addresses Elon Musk's unfounded claims about vulnerabilities within Signal, clarifying that Signal has no credible reports of such issues. She underscores the importance of trust and transparency in maintaining Signal's reputation.
Meredith Whittaker [24:29]: "We have no credible reports of a known vulnerability in Signal. We have a security mailing list that we monitor assiduously where we haven't heard anything."
The conversation shifts to legislative efforts like the UK's Online Safety Act and the EU's Chat Control Bill, which aim to mandate message scanning to protect users, especially children. Whittaker vehemently opposes these measures, arguing that they compromise end-to-end encryption and overall network security by introducing backdoors.
Meredith Whittaker [30:35]: "Any backdoor in a network compromises the whole network. It's a critical vulnerability in the only core systems we have to guarantee confidentiality and cybersecurity of communications."
Whittaker differentiates Signal from platforms like Telegram, whose founder Pavel Durov faced legal issues related to content moderation failures. She highlights that Signal intentionally avoids features that facilitate large-scale broadcasts, thereby mitigating similar risks and legal obligations.
Meredith Whittaker [34:00]: "Signal is an interpersonal communications app. We intentionally do not add channels or features where you can go viral. That's just full stop."
Addressing concerns around child sexual abuse materials (CSAM) and synthetic imagery, Whittaker emphasizes the need for comprehensive solutions beyond encryption. She advocates for robust social services and infrastructure to protect vulnerable individuals, critiquing the tendency to place the burden solely on encryption technologies.
Meredith Whittaker [36:21]: "There are many places to invest in actually tackling this, both online and offline. We need to look at the facts and address the root causes."
Whittaker critiques recent US state legislations requiring age verification for online access. She warns that such measures often lead to increased surveillance and tracking without effectively addressing the underlying issues of platform business models.
Meredith Whittaker [40:27]: "Age verification is a mass surveillance regime that is similar to tracking people's content and habits online. It doesn't work and it attacks the problem at the level of restriction, not business models."
Whittaker explores the consolidation of AI within big tech companies, noting that AI advancements are heavily dependent on vast data and computational resources controlled by a few giants like Microsoft, Google, Amazon, and Nvidia. She argues that this concentration perpetuates surveillance capitalism, where data collection drives profit and power.
Meredith Whittaker [51:56]: "AI is a product of big tech. They have the data, they have the compute, they have the access to market. It's not novel; it's an extension of established surveillance-driven business models."
Looking ahead, Whittaker envisions a future where the AI market undergoes significant culling, favoring smaller, purpose-built models over large, resource-intensive ones. She advocates for nonprofit AI research and development, emphasizing the need for models that prioritize societal benefits over profit.
Meredith Whittaker [56:43]: "There could be a model for AI research that is asking questions less useful to big players and more beneficial to society, like environmental monitoring."
In wrapping up, Whittaker underscores the importance of building an ecosystem that supports open technology and democratic governance. She calls for collective action to challenge toxic business models and foster innovative, privacy-centric alternatives.
Meredith Whittaker [66:01]: "How do we create a teeming ecosystem? How do we encourage open tech and democratic governance? These are the questions we need to answer to move forward."
Kara concludes by urging listeners to support platforms like Signal and stay informed about the critical issues surrounding data privacy and the AI economy.
Notable Quotes:
Meredith Whittaker [14:20]: "Signal's big difference is that we are truly private. We collect as close to no data as possible and we develop open source so that our claims, our code, our privacy guarantees don't have to be taken on trust."
Meredith Whittaker [30:35]: "Any backdoor in a network compromises the whole network. It's a critical vulnerability in the only core systems we have to guarantee confidentiality and cybersecurity of communications."
Meredith Whittaker [51:56]: "AI is a product of big tech. They have the data, they have the compute, they have the access to market. It's not novel; it's an extension of established surveillance-driven business models."
Meredith Whittaker's insights shed light on the pressing challenges of maintaining privacy in the digital age, the ramifications of surveillance capitalism, and the monopolistic tendencies within the AI sector. Her advocacy for nonprofit models and open technology offers a hopeful path toward a more equitable and secure technological future.
For more information, visit the Signal Foundation and support their mission to protect digital privacy.