
Lawfare Host
Nearly every news alert in 2025 has raised questions, some old, some new, about the law and national security. And now you get the chance to ask Lawfare directly. It's time for our annual Ask Us Anything mailbag podcast, an opportunity for you to ask Lawfare this year's most burning questions. You can submit your question by leaving a voicemail at 202-643-846, or by sending a recording of yourself asking your question to ask us anything lawfairmail.com by December 16th.
Bombas Advertiser
Studies show that 100% of everybody in the world wants to curl up indoors and do nothing because it's so darn cold out there. That's why many people are turning to Bombas, whose pillowy plush slippers and warm merino wool socks have been said to be the most comfortable in the history of feet. Bombas products have been found to boost coziness by up to 1 million percent. Okay, enough fake statistics. But could Bombas socks and slippers
Renee Diresta
really be the cure?
Bombas Advertiser
Go to bombas.com/audio and use code AUDIO for 20% off your first purchase. That's B-O-M-B-A-S.com and use code.
Renee Diresta
Audio.
Chief Lead Advertiser
Dear Career Ladder, you've had your moment. You're linear and one-dimensional. Ambition doesn't just go up anymore. It zigs and zags and squiggles. We're CEOs, executives, founders. We're advising companies, launching side hustles, taking breaks, defining our next act, ambition on our terms. The possibilities are endless. Chief. Lead on. Join us at chief.com.
Jimmy Wales
So one of the things we look for when we're looking at sources is like, oh, when they do get something wrong, because everybody does, what do they do about it? You know, how transparent are they about what happened and how they're going to fix it?
Renee Diresta
It's the Lawfare Podcast. I'm Renee DiResta, contributing editor at Lawfare, here with Jimmy Wales, founder of Wikipedia and author of The Seven Rules of Trust.
Jimmy Wales
So the way Wikipedia works is very different from social media, for example, where you can just flood the zone. You know, like you can have 10,000 bots all saying similar things and just flood the zone with it. That's going to get you nowhere.
Renee Diresta
Today, we're talking about what it takes to keep an open, collaborative platform trustworthy in a time of deep distrust and political pressure. So I love the fact that you opened your book with Stephen Colbert's joke about Wikiality, back when Wikipedia was seen as kind of chaotic and maybe unreliable. He tells this joke about elephants and how we can change what elephants are just by editing the page. But Wikipedia has now become the sort of scaffolding of the Internet. It trains AI, it powers search, it shapes how billions of people understand the world. When did you realize that it was becoming that kind of infrastructure?
Jimmy Wales
It's a good question. I mean, it sort of emerged over a period of time. I remember a few specific moments. We had the John Seigenthaler incident. John Seigenthaler Sr. was a very well known journalist who stumbled across his Wikipedia entry, which had a terrible error, got in touch with me, and I, you know, immediately fixed it. But then he wrote a scathing editorial about Wikipedia in USA Today. That was the first time I was like, oh wow, people are actually paying attention to this, like it's actually newsworthy, you know, what Wikipedia says. So that was one of the early incidents. But it's over time, you know, there's all kinds of funny things. I mean, even Stephen Colbert making a joke about Wikipedia was like, oh wow, you're noticing this. Yeah, yeah, crazy.
Renee Diresta
And I remember, because I was... at the time, this would have been, I guess, around 2018 or so, when Susan Wojcicki was sitting on stage at one of the conferences, maybe it was Recode, I think it was one of the major tech conferences. And she says, oh yeah, we're going to use Wikipedia for fact checking, was how she framed it. This was during the days of conspiracy theories, the fake news era, when fake news meant, you know, stories that weren't true, before that term got kind of co-opted. And she kind of announced on stage that Wikipedia was going to be used for fact checking. And I thought this was very interesting. I was studying conspiracy theories quite a bit at the time and writing about them, and I think I wrote about this for Wired. I was just contributing at the time, thinking, this is very interesting, because as I recall, I reached out to somebody at Wikimedia and they were like, nobody told us about that. Do you want to talk a little bit about what that was like?
Jimmy Wales
Yeah, no, I mean, that was interesting. I think Susan mentioned it to me, but yeah, it wasn't like a formal deal or anything. You know, one of the interesting things about Wikipedia is it's freely licensed. So everything in Wikipedia can be copied, modified, and redistributed, including modified versions, commercially or non-commercially, which will be relevant when we talk about AI in a little while. And so it was sort of no surprise that they might want to take some snippets from Wikipedia to put underneath videos or, you know, whatever. But it's kind of cool.
Renee Diresta
It was a very interesting approach, I think, because YouTube rather notoriously recently, you know, wrote a letter to Jim Jordan saying, we don't fact check, we've never fact checked. And Jim Jordan was like, you're never going to fact check. But this became a thing, right, because it was actually this moment back in 2018 when they said, we're going to use crowdsourced information. We're going to point to Wikipedia, and whatever Wikipedia says is going to be the thing that they point to. It was almost a Community Notes model, in a sense, even back in the olden days, when they were relying on this consensus of the crowd to sort of go after things like "the earth is round" and "vaccines don't cause autism," these things that we're now relitigating in 2025. One of the things about that, actually, kind of connecting it maybe to things like Community Notes and consensus, is that Wikipedia has always made its process very visible, right? The disagreement, debate, evidence. Whereas many institutions tend to keep that sausage making behind the scenes. You write about this a lot in your book. What have you learned from making that process so visible? You write a lot about it in the context of transparency to help build trust.
Jimmy Wales
Yeah, no, I mean, I think it's really, really important. It's important for the Wikipedia process, it's important for trust, and it is something that I think I'd like to see happen in more places. So, you know, perhaps unusually for somebody who's quite anti-misinformation and disinformation, I actually wasn't that unhappy when Facebook decided to stop the fact-checking program they were doing. Not because I think, oh, it's fine, post nonsense, that's not my point. But it was not transparent enough, and it was a bit top down, and it was a bit, you know, sort of like these organizations were doing the fact checking, mainly doing a good job, but people didn't trust that process. It didn't feel authentic in certain ways. I mean, I always joke, you know, how on Wikipedia it says, the neutrality of this article has been disputed, or, the following section doesn't cite any sources. And I always say, I wish the New York Times would do this sometimes, you know, just put a little note at the top of an article saying, we had a big fight in the newsroom, we weren't sure we were going to run this. We think it's important enough to run, but just be warned, a couple of the journalists don't think this is firmed up enough yet. Wow, that's kind of cool. Like, that's actually amazing. And now I can read it with that understanding. And there is that old school, traditional journalism, sort of Voice of God thing, you know, it's true because we're telling you it's true, and people can kind of see through that. And, you know, some of the older ways of dealing with problems around that are still valid, like run a correction. That's actually really important. That's one of the things we look for when we're looking at sources: when they do get something wrong, because everybody does, what do they do about it?
You know, how transparent are they about what happened and how they're going to fix it and so forth.
Renee Diresta
It's an interesting point. I think that that question of legitimacy is really very, very closely tied into the transparency piece. Have you ever encountered a moment where you think, oh, that kind of backfired, or have you found it to be just universally a positive?
Jimmy Wales
Well, I mean, one of the funny issues is that it's often very difficult for outside people, journalists, for example, to actually read a discussion page in Wikipedia, because all the voices look completely equal. But without any understanding of who's saying what and how that's going to be perceived by the other people in the conversation, it's actually really hard to interpret. And that's not a criticism of journalists, it's just a hard thing to do. And so what happens is sometimes you'll see a story, a massive controversy breaks out on Wikipedia, and I'm like, no, it's the same two trolls who are always complaining about everything. That one's not a real controversy in Wikipedia. And, you know, recently, on my talk page, I casually mentioned some ideas about how we might use AI to support our work. And one person said, this is making me really angry, I can't believe you're saying this. And I was like, I'm sorry you're angry. But it was just a conversation. And I would say there were mixed feelings about my ideas. Some people thought, oh, good idea; some people thought not so much. But the news headline was, "Community Pushes Back on Jimmy Wales's Plan to Shove AI Down Their Throats," or something. I'm exaggerating, but so that's kind of like, okay, right, we're just trying to have a little conversation here like you would in any organization, sort of around the water cooler, as they used to say. Do people even have water coolers anymore? But yeah, so sometimes it's a little funny, because the transparency just means, well, it's all out there.
Renee Diresta
Right. And sometimes I think it is hard to follow. You have the edit history comments, then you have the talk page comments, and then sometimes things wind up on the talk pages of the editors themselves when there's a dispute. I've followed it a few times as I've looked at contentious articles, both as a person who follows it, as a person who writes about it sometimes, and also as a person who becomes the subject of controversy sometimes, you know, these days. And so it has been interesting to try to figure out where the different sense making, so to speak, happens. Have you followed the emergence of... I don't know if you've spent any time on X Community Notes?
Jimmy Wales
Yeah, yeah, yeah. It's actually the only thing I still like about X.
I wouldn't say I spend much time there; I try to avoid it these days. It's just gotten so toxic, and it's just really not the same. It's not fun. I have a few friends on there and interesting people, so that's still valid. But when I go on, I typically just see a request to do a community note thing, and I do it. I'm like, oh, this is kind of fun and interesting.
Renee Diresta
Yeah, so you're a contributor to Community Notes.
Jimmy Wales
Yeah, yeah. I feel like it's a valid and useful thing to do. And it's interesting because I do try to come at it with a Wikipedian mindset, and so it's often sort of fun for me to say, wow, I massively disagree with this person's comment; however, this community note is not actually very good. I'm one of those people who are always like, take it to the... you know, post your own comment. You're just disagreeing with the opinion, you're not fact checking. This is not a clarification or whatever, you're just debating the person. It's like, yeah, go debate the person, that's fine. So anyway, I do enjoy it, and I actually think it's something I'd like to see them do more with. Because, I mean, one of the long-standing problems of Twitter, of X, is the algorithmic amplification of really toxic people, toxic comments. But also, as I always remind people about Usenet, which was before the World Wide Web, we didn't even need algorithms to be toxic to each other. That's a human thing. And so you take that tendency, you add a layer of algorithmic promotion on top of it, and you get this cesspool that's completely not living up to its purpose, if it has a purpose. So, yeah, I think Community Notes is a good thing.
Renee Diresta
Now, as I was saying, in trust and safety research, the problem with social media is people.
Jimmy Wales
So.
Yeah, when I was young, a teenager, I worked in a grocery store for a while. I was working the night shift stocking shelves, and I got to help open a new store. They'd opened a brand new store, and the week before, we just stocked the entire grocery store beforehand. It was like the platonic essence of a grocery store: the morning it opened, every can, everything was absolutely perfect. And I said, you know, this job would be a lot more pleasing if it weren't for the customers. There you go. So it's like, yeah, absolutely. The hard part about community management is managing the community.
Renee Diresta
Right. So there's kind of like... there's two thoughts I have. One is, I don't know if you've followed some of the research on Community Notes that's very interesting, and maybe this kind of branches between AI versus where I want to take it with some of the debates about sources. But let me stick with the Community Notes piece here. Have you seen any of the interesting research on what's called the Habermas Machine?
Jimmy Wales
I have not.
Renee Diresta
It makes this argument... it's very, very interesting research, done by Google DeepMind, that actually one of the incredible uses of AI is that it can produce a community note, or, you know, sort of a snippet, a distillation, in very neutral language that people actually like quite a bit. Because one of the problems with Community Notes is that oftentimes the notes don't clear, so to speak. The way that a note shows up on a tweet is that the bridging algorithm means that people on the left and people on the right all have to agree that the note is helpful in order for it to show up. That's how the algorithm works. And writing in that neutrality, which I think ties into Wikipedia and neutral point of view, right? How do you express something in kind of a neutral language that people find palatable? This is the thing that has to happen on Community Notes. On Wikipedia, you know, you're sort of expressing a neutral point of view, and you have more language to work with. But on Community Notes, what people are finding is that the machine actually can write in that tone that more people like, and so it clears. And so you're starting to see, to assist with scaling, producing more notes in a neutral tone, and then the voting is what goes to the people. So the consensus process is kind of bumped down to the humans, right? The humans have to do the voting, the humans have to decide a note is needed. Per your point, this isn't just an opinion debate; that consensus still has to happen between right and left. But that's where people are starting to see a use case for AI in this kind of consensus-making process: it helps just write in that tone that people find palatable. I was curious if you'd seen this, or what you think.
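The "bridging" idea described here can be sketched in a few lines: a note only surfaces when raters from different viewpoint clusters all, on average, find it helpful. This is a deliberately simplified toy, not X's actual Community Notes algorithm (which infers viewpoints via matrix factorization over the full rating matrix rather than using explicit cluster labels); the cluster names and the 0.7 threshold are assumptions for illustration.

```python
# Toy illustration of a "bridging" consensus rule: a note clears only
# when raters in *every* viewpoint cluster rate it helpful on average.
# NOTE: X's real algorithm learns latent viewpoints via matrix
# factorization; explicit cluster labels here are a simplification.

def note_clears(ratings, threshold=0.7):
    """ratings: list of (cluster, helpful) pairs, where cluster is a
    viewpoint label (e.g. 'left'/'right') and helpful is a bool."""
    by_cluster = {}
    for cluster, helpful in ratings:
        by_cluster.setdefault(cluster, []).append(helpful)
    if len(by_cluster) < 2:
        return False  # need agreement across at least two clusters
    return all(
        sum(votes) / len(votes) >= threshold
        for votes in by_cluster.values()
    )

# A note that only one cluster likes does not clear:
partisan = [("left", True)] * 9 + [("right", False)] * 9
# A note both clusters mostly rate helpful does:
bridging = [("left", True)] * 8 + [("left", False)] * 2 \
         + [("right", True)] * 8 + [("right", False)] * 2
```

The point of the rule is visible in the two examples: raw popularity is not enough, since the partisan note has just as many "helpful" votes as the bridging one but fails in one cluster.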
Jimmy Wales
I hadn't seen it. It sounds very plausible and very interesting. I mean, I use large language models all the time, for fun and, you know, just... I find it an absolutely fascinating thing, and that sounds likely to me. What I also like about it generally, and this is my view about the future of AI and Wikipedia, large language models and all that, is that having AI help by making suggestions is a much more effective use than trying to take something over from humans. Because the level of nuance required to really make a decision is often quite difficult, and you just really do want that assurance and reassurance. On the other hand, a starting point, like a helpful, you know, oh, here's an idea of something? Yeah, actually quite interesting.
Renee Diresta
So how are you thinking about AI and Wikipedia?
Jimmy Wales
Well, I mean, there are a lot of different elements there. So I would say one of the first things is, we are not envisioning, and we really don't think it's a great idea, to have AI producing content that goes directly to readers, just because the hallucination problem is still quite bad and so on. It just doesn't work, it doesn't feel right to us, it doesn't feel necessary. But I'm interested in how AI might help us do certain things at scale that already happen, but that are a little bit too manual, so to speak. So I was talking to a French Wikipedian this summer. We have an annual conference, Wikimania; we were in Nairobi, which was awesome. Florence, an old school Wikipedian, said, oh, I don't have time to edit Wikipedia as much as I used to, but I have a hobby: I go into French Wikipedia and I find an older page that has dead links, because links go dead after time. And then I see what the link was supporting, and then I go find a new source that supports that, and I add the source. And I'm like, oh, okay, yeah, great. It's just a little thing she does, and it's nice. That's very typical Wikipedia "knitting," we call it, you know, just sort of something to do. But I said, okay, what would you think about this idea: finding a 404 Not Found link, you don't even need AI for that, you just find a broken link. And then an AI reads the page and says, oh, this is what that link was supporting. And then it goes to some usual-suspect sources, it vets them a bit for you, and it makes a suggestion. And she said, oh, that sounds useful. Because the interesting part is making the judgment: does this source support that claim? That's the human piece. What's not interesting is finding a dead link; that's a lot of clicking. And what's not interesting is googling and throwing out like seven sources that she thought would mention the point but didn't.
So it's like, oh wow, that could just speed up the work, and it would be great. So that's a simple little thing, but it's the sort of thing that you can start to think about being able to do now that we have a technology that can do textual analysis in a really kind of human-approachable way. You know, other things would be cross-language comparisons. I've played around with this a little bit myself. I started doing it because I was interested in, you know, controversial topics and how we're dealing with them in different cultures, different languages. But I actually think it's just as valid to think about it for not-so-controversial topics. Although frankly, as a Wikipedian, I can tell you almost anything can be controversial if you scratch the surface. But, you know, my simple idea here is: what about articles about French wine in English Wikipedia versus French Wikipedia? They're probably quite similar; they're both actually quite good on the topic of wine, because I guess there are a lot of wine hobbyists who are interested. But I bet you that if you just ran a script over a lot of them, you would find articles that are markedly different for no good reason. And it would be really interesting to have a little friendly bot that, all it does is raise its hand, like, oh hey, I found these two articles, and they're quite different, even though they're supposed to be about the same subject. And then a human would go, oh, great, let me look into that. That sounds fun, that sounds interesting. Because one of the things that people do, they come to Wikipedia as Wikipedians, and this is the best way to come to Wikipedia, not as a culture warrior for your thing that you're uptight about, but as a Wikipedian. And you're just like, oh, what am I going to do today? I want to help Wikipedia, I want to do something interesting.
And maybe AI can help us find things that are interesting and productive. You know, like, what's a popular article that has a neutrality warning that's been there for too long? If you're looking for a tough challenge, that might be good. So anyway, that's where I think about AI: supporting the community.
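The dead-link workflow described above splits cleanly into a mechanical part (find broken citation links and the claim each supports) and a judgment part (does a replacement source actually support that claim?), which stays with the human. Here is a rough sketch of the mechanical half only, assuming a simplified wikitext format where references look like `<ref>URL</ref>`; the ref syntax, function names, and injected status checker are illustrative assumptions, not Wikipedia's actual citation tooling.

```python
import re

# Find citation URLs in (simplified) wikitext and flag the dead ones,
# pairing each with the sentence it supports, so a human -- or an AI
# suggesting replacement sources -- has the context it needs.
# Assumed format: a sentence followed by <ref>https://...</ref>
REF_PATTERN = re.compile(r"([^.<]*\.)\s*<ref>(https?://\S+?)</ref>")

def find_dead_refs(wikitext, get_status):
    """get_status: callable url -> HTTP status code. Injected so the
    scan can be tested (or rate-limited) without live requests."""
    dead = []
    for claim, url in REF_PATTERN.findall(wikitext):
        if get_status(url) == 404:
            dead.append({"claim": claim.strip(), "url": url})
    return dead

# Example with a stubbed status lookup instead of real HTTP calls:
article = (
    "Chablis is made from Chardonnay grapes.<ref>https://ex.org/live</ref> "
    "The region was replanted after phylloxera.<ref>https://ex.org/gone</ref>"
)
statuses = {"https://ex.org/live": 200, "https://ex.org/gone": 404}
flagged = find_dead_refs(article, statuses.get)
```

The design choice worth noting is the injected `get_status` callable: the boring, automatable step (link checking) is isolated behind an interface, while the output pairs each dead URL with the claim it supported, which is exactly the context the human judgment step needs.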
Lawfare Sponsor Host
I'm a last minute holiday shopper. I often don't do it at all until it's too late. You know the feeling: everything's gone already, you don't have ideas. But here's an idea for you. If you are like me and that's your situation, Aura Frames is the solution, with a gift that feels personal. I love my Aura Frame. But more important than that, the people that I buy Aura Frames for love them. I got two Aura Frames for the Lawfare office. We share them; they're hanging around the office. One of them is just pictures of Lawfare people. One of them is the Lawfare dependents: pets, kids, all the people that we care about. And we upload them to the Aura Frame and we all share them. We all get a kick out of it, and it's super moving to see both of these frames develop over time. You upload unlimited photos and videos, you individually or a group of people, like, say, the Lawfare community. You just download the Aura app and connect it to Wi-Fi. You preload photos before it even ships, and you can just keep adding them from anywhere, anytime. You can personalize the gift, add a message before the frame arrives. You share photos and videos effortlessly, straight from your phone, all year long. The gift box is included; every frame comes packaged in a premium gift box with no price tag. You can't wrap this kind of togetherness, but you can frame it. So for a limited time, save on the perfect gift by visiting auraframes.com to get $35 off Aura's best selling Carver Mat frames, named number one by Wirecutter, by using the promo code LAWFARE at checkout. That's A-U-R-A frames.com, promo code LAWFARE. This deal is exclusive to listeners, and frames sell out fast, so order yours now to get it in time for the holidays. Support the show by mentioning us at checkout. Terms and conditions apply.
DeleteMe makes it quick, easy, and safe to remove your personal data online, at a time when surveillance and data breaches are common enough to make everyone vulnerable. The New York Times' Wirecutter has named DeleteMe their top pick for data removal services, and I'll tell you why. Because data brokers make a profit off of your data, which is a commodity. Anyone on the web can buy your private details, and this can lead to identity theft, phishing attempts, and harassment. But DeleteMe lets you protect your privacy. I do it with DeleteMe, and I think you should, too. I have an active online presence. I do wacky stuff. I dressed up as an inflatable frog the other day, and, you know, I put myself out there. I threw dead sunflowers in front of the Russian embassy. But my privacy is, at the end of the day, still really important to me. I want a separation between my public activity and my private life. I've been a victim of harassment, identity theft, and that sort of thing. It's not pleasant. And if you haven't, you probably know someone who has, too. DeleteMe can help. So take control of your data and keep your private life private by signing up for DeleteMe now, at a special discount for Lawfare listeners. Get 20% off your DeleteMe plan when you go to JoinDeleteMe.com/Lawfare20 and use the promo code Lawfare20 at checkout.
Lawfare Sponsor Host
The only way to get 20% off is to go to JoinDeleteMe.com/Lawfare20 and enter the promo code Lawfare20 at checkout. That's JoinDeleteMe.com/Lawfare20, code Lawfare20.
Aramco Advertiser
Who drives the world forward? The one with the answers, or the one asking the right questions? At Aramco, we start every day by asking how. How can innovation help deliver reliable energy to the world? How can technology help develop new materials to reshape cities? How can collaboration help us overcome the biggest challenges? To get to the answer, we first need to ask the right question. Search "Aramco, powered by how." Aramco is an energy and chemicals company with oil and gas production as its primary business.
American Red Cross Advertiser
Blood donation is now more inclusive. More people are able to donate blood with the American Red Cross through FDA guidelines that eliminate eligibility questions based on sexual orientation. The Red Cross celebrates this historic change and welcomes those who may be newly eligible to donate blood. There's a place for everyone in the mission of the Red Cross. The Red Cross is committed to achieving an inclusive blood donation process that treats all potential donors with equality and respect while maintaining the safety of the blood supply. Join us and help save lives. To learn more and make your appointment to donate blood, visit redcrossblood.org/LGBTQ. That's redcrossblood.org/LGBTQ.
Earnin Advertiser
You know what's faster than your paycheck?
Renee Diresta
Literally everything.
Earnin Advertiser
It's time to get your pay up to speed with EarnIn. You can access your pay as you work, instead of waiting days and weeks for a paycheck. Get up to $150 a day, with a max of $750 between paydays. No interest, no credit checks, and no mandatory fees, because, hey, it's your money. Download the EarnIn app now to get it and join the millions who make any day payday. That's E-A-R-N-I-N. EarnIn is not a bank. Access limits are based on your earnings and risk factors. Available in select states. Expedited transfers available for a fee. Terms and restrictions apply. Visit Earnin.com for full details.
Renee Diresta
I guess this bridges into Grok quite nicely. Have you looked at your Grokipedia bio?
Jimmy Wales
I have not. I have not. I've actually barely had time to look at Grokipedia at all. I have looked at it a little bit. Mainly I've seen news stories about it and so forth, and it doesn't sound great. I really do need to do a deep dive, and I think I've got some time late next week to actually sit for six hours and do a bunch of comparisons. I might actually get AI to help me compare them.
Renee Diresta
Yeah, I can recommend an article by Alexios Mantzarlis over at Indicator. I linked it in my recent analysis. Also, I read my bio. I was, you know, one of the lucky 855,000 to get a page
in the first pass. No, actually, in all seriousness, you know, I study adversarial abuse online. That's my job. I was very intrigued. The first half, I have to say, was very, very good. It really just sort of crawled the web, and all these random profiles that I've had in the past, random interviews I've done with people, where, you know, they start with, what was your childhood like, and you tell these random anecdotes about, oh, my dad taught me how to code, all these random things that a random person would never know. It actually did scrape and aggregate all that stuff. And so the opening is actually quite rich with these random, you know, stories and things. And then it goes off the rails, you know.
Jimmy Wales
Yeah, yeah, yeah.
Renee Diresta
And it has a whole controversy section. And I think that it really is tilted towards controversies. That's kind of what he wants. You know, sort of, when you have a product that's created out of spite, you really can see what it could be versus what it is. And that, I think, is actually the great tragedy of Grokipedia. So Alexios's analysis, which does do a diff between Wikipedia and Grokipedia, kind of finds that about 55% of the articles are essentially cribs, right? They're basically identical. But in the controversial topics is where you see this significant deviation, where there's this significant overemphasis on controversy. And that's where my experience was: it pulled from a congressional report written by Jim Jordan that was just complete bullshit.
Jimmy Wales
And.
Renee Diresta
And then congressional testimonies by nut jobs, you know, the real politically motivated stuff. And even as it used the politically motivated testimonies, it hallucinated facts that were not even in them, right? So it was sort of one degree past even the conspiracy theories; it went one degree further. So I thought, well, this is very interesting. And there's no talk page, there's no talk page to fight with the bot. There's just a box. So I fought with it in the box. By the way, turns out everything you fight with it about in the box becomes public. They do in fact post that, which is good. I was a little bit direct, but that language is out there.
Jimmy Wales
That's all right.
Renee Diresta
Whatever. It is what it is. So it turns out, two and a half weeks later, after I write about this for the Atlantic, it does in fact go and correct the bio, and you can actually see the bot's reasoning as it goes and compares my complaint about the source, saying, this isn't even in here, you're hallucinating. It does in fact go and look, and it does in fact make the correction, right? It does realize that what it is saying is just not true. It winds up writing something very long and winding... I mean, honestly, it's kind of garbage, it doesn't really make any sense, but it does in fact go and edit it. And so it's very interesting, this process of trying to fight with a robot to make a change. But it did, after two and a half weeks, go and make this change. So I wrote about that too, on my Substack. But it's a really interesting experience to look at, again, this process: that initial scrape, that very comprehensive crawl that it does. I thought, oh, you know, if you were to actually say, hey, I want to go write a bio of somebody, right? Particularly when you have these initiatives that Wikipedia does, where it's like, oh, we want to write about women, or people who are not necessarily represented, these various projects. You could see it as, this is an interesting way to gather sources that don't show up on Google, things like this. But then again, since it's motivated by spite.
It very heavily leans into controversies. And you see in Alexios's analysis, and this kind of ties into the other thing I want to talk about, source comparison, that's the thing that really hits as the major difference here: a significant percentage of the sources are things that I think the average person would not consider to be reliable. And by that I don't even mean right wing media or left wing media. I mean Stormfront, I mean Infowars. So we're really way far off in the realm of what people would consider, I think, the fringe of the fringe, LifeSite News, you know, these sorts of things that are way out there. So that's the sort of analysis, when you look at the diffs, that is very interesting. And I'm kind of curious, bringing it back to the sort of source wars here: how you think about that fight over sources that is playing out now. You know, you're winding up getting letters from members of Congress, the sort of refworking that's happening here. How are you guys thinking about holding the line there, or how are you going to engage the community and members of Congress about that war that's happening?
Jimmy Wales
Yeah. So it's super interesting. The first thing to say is, I think it would be completely intellectually irresponsible not to make judgments about sources. That's something you really have to do. And you can't just say, how dare you, it's biased, if you think this social media influencer is not as valid as the New England Journal of Medicine. Like, no, it isn't. It isn't biased at all. That's just paying attention to quality and the facts of reality and so forth. At the same time, in the Wikipedia world, you have to be very thoughtful about how you think about sources. "Deprecate" is a term we use, which doesn't mean a source is banned; it means you just prefer a better source. And that gets really tricky when we are in an era where, quite frankly, there's a large number of new media sources that are quite low quality, and they do tend to be right wing. That's been a big growth area, I would say. One of the things that I've been saying is, if there are right-wing billionaires who are upset about the state of the culture, please fund some high-quality news sources. And high quality, it's not about the political leanings. It really is about: are you just posting populist crazy things that have no basis in truth? Do you do error corrections? How thoughtful is it? That's really important. So that's currently, I think, an issue in the world. It's not an issue for Wikipedia. I think we have to bend over backwards as best we can to say, okay, let's be very careful that we aren't cherry-picking sources, because that's also a very natural, easy human tendency, and I think we're pretty good on that front.
You know, if there's a genuine scientific controversy and it also has some politicized element to it, well, we should ignore the politicized element and just cover the genuine aspects of the controversy. And I think we mostly do that. Obviously you can quibble on this, that, and the other, always. So, yeah, it's really tricky. And then I just have to say, it's not quite what you asked me, but you gave me an excuse for a small rant about this. As you know as well as anybody, there is a real trope on the right that the Biden administration was putting pressure on social media, and potentially Wikipedia, to censor. And when people found out, yes, we had conversations with the Biden administration, it was all like, aha. This was mainly about COVID. And I'm like, no, no, no, hold on a second. We talk to governments all the time. We explain how Wikipedia works. We'll never change anything because the government wants us to change it. Also, we never got any pressure from the Biden administration to change anything. That just wasn't a thing. As they were researching and trying to learn about what was going on in the information ecosystem around COVID and things like this, obviously they talked to us. Very interesting. And yet we have gotten pressure from the US government, but from the Republicans. And I think that's extremely problematic. We had this letter from the interim attorney in D.C. who didn't get the permanent job. So, great. You know, Ed Martin. Yeah, Ed Martin. And frankly, I just said it's really good that we do have fantastic, very calm, professional staff who basically answered the letter in a very matter-of-fact way, because I would have just said, go fuck yourself, and footnoted the First Amendment. Said, none of this is any of your business. It's completely absurd.
But, all right, you've got questions, here are the answers to the questions. And there was this sort of hinting at our nonprofit status, and saying things like, basically, it's come to our attention you let foreigners edit Wikipedia. It's like, yes, we're global. What are you even talking about? But there we are. And so I just think that is unfortunate. And it's actually something I believe quite sincerely and quite genuinely: there are people on the right who are upset and concerned about freedom of expression and various negative things that are potentially happening to freedom of expression. I mean, here I live in the UK, where freedom of expression is not nearly as protected as in the US, and a lot of people have concerns. I think that's great. And if you have those concerns, then you on the right should say, actually, it's outrageous that Ed Martin is sending nasty letters to Wikipedia. That is not the role of government whatsoever. That's my rant.
Renee Diresta
No, well, I got subpoenaed by Jim Jordan, I think around the same time that he was sending letters to Wikipedia. So I'm right there with you.
No, I think that question. They've also, I believe, wanted names of editors and things like that. So we've definitely seen, and I've written about, some of the requests. There's also been actual state censorship from foreign governments that has targeted Wikipedia. I think you, the Wikimedia Foundation, were fined by Russia for refusing to take down certain content about the war in Ukraine. So you've also experienced this from non-US governments too. Do you want to?
Jimmy Wales
Definitely.
Renee Diresta
You have any other examples?
Jimmy Wales
Yeah, there's a few examples. So yes, there have been fines issued in Russia, which we're never going to pay, by the way. That's just not a thing we're going to do. Just, no. We're currently blocked in China; we had an on-off sort of situation in China. We have a standing offer from China that we could be open and accessible in China if only we would let a Chinese university manage the content and make sure it's legal in China. No, sorry, that's not something we're going to do. We were banned in Turkey for about three years, and, I'm very proud of this one, we fought all the way to the Supreme Court in Turkey and won, and now we're unblocked in Turkey, and it was a landmark decision for freedom of expression in Turkey, and that's great. And we have many reasons for fighting. Partly we're just quite ideological about the fundamental human right to participate in discourse and in something like Wikipedia. This isn't hate speech, it isn't threatening people, it isn't any of the borderline things. This is Wikipedia. And even if it's biased in places, which of course it is at some times, in some places, though we try not to be, that's okay. That's also perfectly legitimate as part of discourse and so forth. But I also think of it as a very practical matter. One of the issues is, we could have gotten unblocked in Turkey by just blocking a certain page from being visible in Turkey, just don't send that page to Turkey, and it's all going to be fine. But once you start doing that, then they come out of the woodwork. I think a lot of governments really understand it's kind of all or nothing. You could block Wikipedia, but you're going to block all of Wikipedia, because they're not going to cave in, they don't play ball. And I think that's really, really important.
And I've criticized Elon; I had a little thing with him about this. In Turkey, they decided to take down certain tweets, even though he loves to talk a lot about freedom of expression. And he sort of made the typical kind of reason and excuse, which, by the way, other than him posturing about the issue as flamboyantly as he does, I think is a hard problem. It's not a hard problem for us, right? But for commercial enterprises with a business model that's different from ours, I get it. Are you going to lose your entire access to the Turkish market over one tweet? Okay, that's a tough business decision to make. For us, actually, this is one of the great things about our business model. We're a charity, but our business model is support from small donors, and our donors would be very, very angry at us if we participated in censorship. So it actually would cost us money. So it's like, great, our incentives are aligned with our values. That's really wonderful. So anyway, I think it's a hard decision to make in a lot of the companies. Some are better than others about it. Some just take down whatever without even a fight. Others, I think YouTube typically tries to fight the good fight where they can.
Renee Diresta
How do you think about sort of state-sponsored edit warring? This again maybe ties into Wikipedia as information infrastructure that really heavily influences AI downstream. So not AI as a creation tool for Wikipedia, but Wikipedia as a training data set for AI. One of the things we see is that LLMs and other systems really heavily use sources to produce reality downstream, we might say. And this means that state-sponsored actors in particular are interested in influencing it. This is, of course, also why Congress is interested in what's on Wikipedia. But this question of Russia in particular, right? Let's just say it is very interested in shaping narratives on Wikipedia, knowing that this is going to influence Google snippets, and that AI answer engines are going to draw very heavily from it. If you shape reality on a Wikipedia page, you're going to influence training data downstream; this is going to shape what is communicated. How do you guys think about that? It takes a lot for the community to be on top of all of the pages, right? It is a massive endeavor. There's a colleague of mine at Lawfare who wrote a really excellent article a couple of years back about fighting in one guy's bio, I don't remember whether it was a Ukrainian poet or writer, I'm trying to remember the specifics, but even just trying to erase Ukrainian identity, where they kept trying to go in and take out "Ukrainian" and replace it with "Russian." Right? These sort of small-scale, little information-war kind of dynamics, just trying to subtly change things around the edges to tilt narratives in favor of Russia. This is the sort of thing where the community can get to it at some scale.
But when you have something that's done at large scale, I'm sort of curious how you think about that reality shaping, those types of attacks, given the importance of Wikipedia downstream.
Jimmy Wales
So, there's a few things to say about this. First of all, it's quite difficult to get away with that. The way Wikipedia works is very different from social media, for example, where you can just flood the zone. You can have 10,000 bots all saying similar things and just flood the zone with it. That's going to get you nowhere in Wikipedia. Another element of Wikipedia that I think is quite important is that we have a very strong tradition against voting on content. So we do have straw polls and things like that; we call it a "!vote," a not-vote, just so it's not a vote. Because what really matters is getting the sense of the room, but also evaluating policy-based arguments for what you're saying.
So sheer numbers don't get you very far. In fact, it more than likely just gets you blocked, because people immediately go, oh, there's a hundred random people we've never heard of before all saying the same thing, let's just block them all; they're clearly bots and trolls and what have you. So it's harder than you think. I was once in Russia, before the war, a long time ago when one would still go to Russia, and I was sat at dinner after a conference next to the editor-in-chief of a major magazine. And he said, oh, I can make Wikipedia say whatever I want. I just give $100 each to a few Wikipedians and it's done. And I'm like, okay, let's talk about that. Could you really do that? And the answer is probably not. Because if they started putting in strange things, the other Wikipedians would go, what are you doing? You don't have a source. What is this about? You could enter the discourse, but you could do that for free. I said, but you're the editor-in-chief of a magazine; you can make it say whatever you want. And in fact, the government can probably make it say whatever they want, because they can make you write whatever they say. So it's much, much harder when you've got an open system where the decision-making is based on consensus and on editorial standards and so on and so forth. Now, that's not to say governments can't have some influence and some participation. And maybe that would be okay if they're very transparent about it. Just post on the talk page and say, oh, hi, I'm from the Russian government, we've got this concern, you've not really dealt with this statistic properly. Okay, well, we'll look at it. That's a minor point, but the idea of sort of sneaking into Wikipedia and sock puppeting.
Yeah, I'm sure it happens at small scale, but I don't think it's the main thrust of things. And in fact, one of my moments of pride was when I was at a conference here in London and I met a Ukrainian journalist, and she said, oh, how is Russian Wikipedia talking about the war? And I said, well, I hear good things. But you speak Russian? Yes, fluent in Russian, fluent in Ukrainian. I'm like, well, please go and read it and let me know. So she did. That night she went back to the hotel, read it, and came back. I saw her the next day, and I said, oh, how was it? And she said, it was better than I thought it would be. And she said, I've got a quibble with a few things. I'm like, great, I've got a quibble with a lot of things. That's where we are; I'm okay with that. And, you know, Russian Wikipedia says it's a war. It says Russia invaded Ukraine. It says the things that you might think would be difficult to say in Russia. And by the way, a lot of the Wikipedians have gone to laying low, abandoned their old account, which was tied to their real-life identity, and now they're editing under a new account that's more anonymous, and they're using a VPN and being a little more careful, because it's quite difficult. We've had Wikipedians arrested in various authoritarian places, and they're heroes, they're amazing. So what I'm saying is, I'm not dismissing the question. I'm saying I think we're okay, but we always have to take it seriously. And obviously we do technical things, looking for patterns of IP addresses and things like that, looking for sock puppeting, because that does happen. But I think more often sock puppeting is one person being a troll rather than state-sponsored activity.
Renee Diresta
That's fair. I think that question about reliable sources also comes up in the realm of the engineering of entire domains that are essentially fabricated by the state at this point, just LLM-generated alternate propaganda sites and realities. It makes me think a lot about this question of reliable news ecosystems, particularly when they are outside of the sort of US-language ecosystem, and how we know what is reliable.
Jimmy Wales
Absolutely crucial, I mean, super important and super interesting. Back when we first started to get worried about fake news sites, before Donald Trump decided it means all the news he doesn't like, it was a legitimate phenomenon that existed before LLMs, and it was never a problem for Wikipedia, because the Wikipedians spend their lives debating the quality of sources, and also they're kind of hard to fool. I remember one example headline; it said, Pope endorses Trump. This was Pope Francis. It was a little bit viral on social media, not a massive thing, but it got a lot of retweets or whatever. You can fool random people, but fooling Wikipedians is quite tough, because they would go, oh, hold on, popes don't generally endorse political candidates. That's just not a thing. And by the way, Pope Francis, whatever you may think about the Catholic Church and all that, seemed like a really nice guy and unlikely to be a Trump supporter. Very much not likely. And so you would just go, this probably makes no sense. And then you go look at the website, you click around a bit, and you're like, oh, there are only like four pages on this website. It was just done up for social media. But what you're talking about is a further threat, because back then, making something like that was quite easy, but you could only make one or two pages; you couldn't generate an entire news site. And now you probably can create quite a huge amount of content and make it look a lot more plausible. And I would say you've identified it as well: try to fool English-speaking Wikipedians about English-language media, you're not going to get very far. Try to fool Germans about German-language media, you're not going to get very far. But try to fool any of us about news sites in Thailand.
Oh, it might be a little tricky. I don't know which ones are the best, I don't know the most famous papers, and I don't know whether a given one is new or old. It just means more diligence on our part if somebody wants to use a source like that. But it's part of the world that we have to grapple with.
Renee Diresta
I want to touch on another thing that you said in your book, which ties back right now to this growing crisis of trust in institutions. As you know, we've seen some really remarkable things from the CDC in the last 48 hours. We now have a page on CDC.gov that says vaccines cause autism.
That's incredible.
Jimmy Wales
Wow. Okay. Yeah.
Renee Diresta
So I'm curious how we should be thinking about public agencies trying to build trust. I wrote a book that came out in 2024 that still sort of assumed we would be reforming federal agencies; now I think maybe we're going down to the state level at this point. But how should institutions be thinking about reforming or rebuilding trust? How should they be learning the lessons that you write about in your book? What do you think is the key thing for them to be thinking about?
Jimmy Wales
Giving advice in normal times is different from giving advice right now, because, I mean, we know how many scientists have resigned or left because they just can't abide by it whatsoever. But in normal times, there are definitely things that organizations of all kinds do and can and should do to build trust. One of the things that I think we should insist on as citizens, when we get back to sanity, is the intellectual independence of the scientists at the CDC. And should they also have checks and balances about things like not being too much under the sway of Big Pharma? Fine, yeah, great, that's actually important. It's less important than the anti-vaxxers think, but it's a thing. So all of these sorts of steps say: how do we ensure the integrity of the process and depoliticize it as much as possible? One of the things that I learned in the research for the book that I thought was quite interesting is that in a lot of these partisan cases, these kinds of actions reduce trust not only with the people who disagree with you, but also with the people who agree with you.
I talk about the Washington Post decision not to endorse presidential candidates, which I said I think was a good decision, just very, very badly timed. Doing it just before the election made it seem like implicit support for Trump by Jeff Bezos or whatever, and I don't think that was it. But more broadly, the research shows that political endorsements by news organizations not only lower trust among people who don't agree with the endorsement, they actually lower trust among people who do agree with the endorsement, because they're then concerned: is the news being skewed to support their candidate? Is it a campaigning organization, or is it fact-based? And I think that's doubly true with journalism and news. It's a complicated matter. Personally, I think there's nothing wrong with having a right-leaning or left-leaning paper that has a view. That's fine, you've got an audience. I trust them all a little bit less because of that, but it's fine. But government agencies trying to represent all the citizens have got to be very, very thoughtful about this. And a piece of it would be purpose. What is the purpose of the CDC? The purpose of the CDC is not to fight Big Pharma. It's not to support the president. The purpose of the CDC is to research and disseminate accurate, factual information that's important for our health. So, yeah, I'm not saying anything too radical, I think, but it sounds a bit radical in these times.
Renee Diresta
Looking ahead, what do you think is the next trust challenge that you see coming for Wikipedia, or for the broader knowledge ecosystem?
Jimmy Wales
Yeah, well, a lot of the challenges we've touched on already. We are in a broader information ecosystem that has become more politicized and more fraught with risk for accuracy and so forth. In the past, you would never have questioned for one second statistics coming from the CDC about chickenpox, right? But is that going to become politicized? If it does, are those statistics still valid? Have they fired all the statisticians, and are they stopping collecting the data, so that suddenly the quality of that output is lower? So that's a risk for us. A big risk that I talk about a lot, which is not necessarily new and not necessarily about the current Trump administration, is the decline in local journalism. I think that's a real challenge for Wikipedia. It's easier to write the history of my hometown, Huntsville, Alabama, in 1980 than it is in 2020, because the local newspaper is basically dead and the number of journalists working locally is much smaller. So that's a problem, and I don't have a solution to it. I wish I did. And then for us, for Wikipedia, I think one of the most important things is community health. And by community health, I mean staying curious, staying neutral, staying open. One of the things that I have said in my book, and I've said to Elon Musk, is: if you're putting out the incorrect view that Wikipedia has become "Wokopedia" and has been taken over by radical left-wing activists, that it's a sad shame, that Wikipedia used to be great but has been taken over by left-wing crazies, well, you're doing two things. You're telling kind, thoughtful conservatives that they wouldn't be welcome in Wikipedia, so they're going to stay away. I don't want that. I want them here. I want people here to debate the issues of our time in a thoughtful and kind manner.
And you're also telling those crazy left-wing activists that Wikipedia is their new home, and then we have to deal with them. That's no fun either. It's sort of like, no, actually what we want are people who care more about the values of Wikipedia than about the culture wars. That's really important, and they can come from all sides; there are great people on all sides. So that's something we have to be careful about, that we don't react to all of that by saying, fine. I bristle a bit when people say, and I get why they say it, "reality has a liberal bias." That's a cute saying and all that, but it should not become a mantra for Wikipedia, because it doesn't really foster intellectual openness and a willingness to listen to people on all sides.
Renee Diresta
Is there any appeal, anything we can tell ordinary listeners, or just the average person, that they can do to contribute or help turn the tide on this stuff?
Jimmy Wales
I hope it's fun to come and edit Wikipedia, and I think that's always the best thing people can do. Obviously we're a charity; we do depend on donations, and it's our 25th anniversary. And of course I'm here about my book. This is the German edition, which I'm very excited to try to read; I speak a little bit of German, but not very much. It's coming out in 20 different languages, so I'm excited about that.
Renee Diresta
That's fantastic.
Jimmy Wales
Yeah, awesome.
Renee Diresta
Well, thank you so much. I really enjoyed our conversation.
Jimmy Wales
Yeah. Yeah, fantastic.
Renee Diresta
The Lawfare Podcast is produced in cooperation with the Brookings Institution. You can get ad-free versions of this and other Lawfare podcasts by becoming a Lawfare material supporter at our website, lawfaremedia.org/support. You'll also get access to special events and other content available only to our supporters. Please rate and review us wherever you get your podcasts. Look out for our other podcasts, including Rational Security, Allies, The Aftermath, and Escalation, our latest Lawfare Presents podcast series about the war in Ukraine. Check out our written work at lawfaremedia.org. This podcast is edited by Jen Patja, and our audio engineer for this episode was Kara Shillin of Goat Rodeo. Our theme song is from Alibi Music. As always, thank you for listening.
Earnin Advertiser
You know what's faster than your paycheck?
Renee Diresta
Literally everything.
Earnin Advertiser
It's time to get your pay up to speed with EarnIn. You can access your pay as you work instead of waiting days and weeks for a paycheck. Get up to $150 a day, with a max of $750 between paydays. No interest, no credit checks, and no mandatory fees, because hey, it's your money. Download the EarnIn app now to get it and join millions of people making any day payday. That's EarnIn. EarnIn is not a bank. Access limits are based on your earnings and risk factors. Available in select states. Expedited transfers available for a fee. Terms and restrictions apply. Visit Earnin.com for full details.
The Lawfare Podcast: "Wikipedia, Ref-Working, and the Battle Over Reality"
Date: December 9, 2025
Host: Renee Diresta, Contributing Editor at Lawfare
Guest: Jimmy Wales, Founder of Wikipedia and author of The Seven Rules of Trust
This episode dives deep into how Wikipedia navigates the modern information landscape, the ongoing battles over reality and sources, the challenge of trust and ref-working, and the impact of AI and political pressures on the Internet’s most influential open encyclopedia. Jimmy Wales, Wikipedia’s founder, joins Renee Diresta to discuss the collaborative platform's role as information infrastructure, threats from misinformation, government intervention, and how Wikipedia’s principles translate into trust, neutrality, and openness amid crisis.