Podcast Summary: The Rubin Report – “Wikipedia Conspiracy Goes Deeper Than Anyone Knows”
Guest: Ashley Rindsberg
Host: Dave Rubin
Date: December 26, 2025
Overview
In this episode, Dave Rubin sits down in person with journalist and investigative researcher Ashley Rindsberg to dissect his investigations into Wikipedia’s growing ideological bias, its hidden connections with other major information gatekeepers like Google, and the broader implications for knowledge, free speech, and Western civilization. Rindsberg goes deep into the power structures behind “neutral” online platforms, the parallels between legacy media malfeasance and new digital manipulation, and the complexities of fighting back against what he terms the “knowledge cartel.” The conversation spans historical detail, personal anecdotes, technical mechanisms of bias, and possible futures, both dystopian and hopeful.
Key Discussion Points & Insights
1. Rindsberg’s Background: Media Skeptic Turned Internet Sleuth
- Origin: Rindsberg describes his journey from New York Times devotee to critic. His book documented the Times’ propagation of falsehoods throughout the 20th century (e.g., downplaying the Ukrainian famine, uncritically repeating Nazi propaganda, and misreporting on Castro, the Iraq War, and the 1619 Project).
- Notable Quote:
- “When you go back and look forensically at their coverage on the most important topics of the last century, you uncover some shocking stuff.” (Ashley Rindsberg, 04:29)
- Turning Point: Reading The Rise and Fall of the Third Reich while in Israel, and discovering the Times’ front-page story asserting that Poland had invaded Germany, a Nazi propaganda lie.
- “That was actually what it said… turns out that was a Nazi propaganda ploy… and the New York Times just bought it and printed it for the world.” (Ashley Rindsberg, 09:05)
2. Wikipedia’s Mission Drift & Capture
- 2017 Shift: Rindsberg details a deliberate, explicit mission shift at Wikipedia from “encyclopedia” to “social justice movement powered by DEI,” led by the Wikimedia Foundation.
- “They changed the mission of Wikipedia… to building a social justice movement powered by DEI. Those are their terms.” (Ashley Rindsberg, 02:16, repeated/rephrased at 11:19)
- Leadership: Katherine Maher (then Wikimedia CEO, later NPR CEO), described as a “true believer” in Open Society/Soros ideology, drove this shift with support from PR experts linked to the Clinton Foundation.
- Quote:
- “She also called Wikipedia, while she was running it, a white Western male construct… she thinks that truth is something that we don’t find, but we make up. It’s a product of power dynamics.” (Ashley Rindsberg, 16:43)
3. Wikipedia’s Influence on the Information Ecosystem
- Symbiosis with Google: Wikipedia is prioritized in Google search results—first hits, sidebar panels, and even AI summaries.
- “You’re always going to get that first result dominated by Wikipedia. But we also have a knowledge panel…that’s Wikipedia. AI overview… that’s Wikipedia.” (Ashley Rindsberg, 17:40)
- Financial & Structural Ties: Deep, mostly undisclosed partnership between Google and Wikimedia; large donations to Wikimedia’s endowment (placed in the leftist Tides Foundation).
- “They both benefit from it, but what it becomes is a kind of knowledge cartel.” (Ashley Rindsberg, 19:51)
- The Knowledge Cartel:
- “It’s a monopoly. It’s Google’s monopoly, it’s Wikipedia’s monopoly, which [is] a monopoly on information online.” (Ashley Rindsberg, 20:09)
4. Ideological Bias & Editorial Manipulation
- Research Findings: Empirical analysis detects systematic political bias in the language used about politicians, journalists, and other public figures: negativity toward the right, positivity toward the left.
- Crowdsourcing Myth: Wikipedia’s image as a “neutral, crowdsourced” project is largely a myth. Real power rests with a small, committed, and usually ideologically motivated core of editors; attempts by outsiders to change controversial content are rapidly reverted or suppressed.
- “Unless you are a dedicated, full time, ideologically driven editor… your edit will never stick.” (Ashley Rindsberg, 22:23)
- Organized Editing Campaigns: Describes groups of editors systematically sanitizing articles for Hamas, Hezbollah, CCP, and other state-backed or extremist causes, often with overt affiliations.
- “I did this very big investigation into these pro Hamas editors… three dozen plus. And these guys are relentless… a million edits across 10,000 articles.” (Ashley Rindsberg, 24:02)
5. State & Foreign Influence Operations
- CCP Editors: At least 300 openly pro-CCP activists working round the clock to shape narratives on major issues (COVID, Taiwan, Xi Jinping, etc.), with little meaningful oversight or intervention by Wikimedia.
- Means of Analysis: Direct attribution is difficult due to anonymity, but statistical and behavioral analysis indicates high coordination.
- “You can only look at the patterns and say, this is clearly coordinated, statistically, this is coordinated, and then tell the world that this is happening.” (Ashley Rindsberg, 25:41)
- Policy Recommendations:
- The US government should “investigate foreign influence” and consider restricting official or AI-system reliance on Wikipedia (e.g., an executive order barring the use of Wikipedia in federal AI training).
6. AI Training & the Entrenchment of Bias
- Cascading Influence: Wikipedia isn’t just dominant on Google; it is a primary source of AI model training data (ChatGPT, Claude, Gemini, Siri, Alexa, etc.), amplifying its worldview into new generations of digital assistants and tools.
- “The worldview of AI reflects Wikipedia’s worldview disproportionately.” (Ashley Rindsberg, 21:16)
- “Knowledge Infrastructure” as Leverage:
- “When [the] Wikimedia Foundation tried to shift Wikipedia’s role [to] global knowledge infrastructure, they succeeded.” (Ashley Rindsberg, 17:40)
7. Reddit & Mechanized Propaganda
- Pipeline from Terror Groups: Rindsberg documents the translation and spread of terrorist messaging (Hamas, Hezbollah, Houthis) from Telegram, through Reddit (via coordinated groups), and out into the broader online ecosystem (Discord, Twitter/X, etc.).
- Moderation Double Standard: Reports on how Reddit banned r/The_Donald and other unpopular right-leaning communities while ignoring or protecting pro-terror propaganda at the behest of ideologically aligned “trust & safety” staff.
- “Trust and safety… that’s this kind of unique hinge, this pivot point online. And… one of these things that we saw a little bit of it at Twitter X when Elon took over and fired them all.” (Ashley Rindsberg, 35:14)
8. Bot Networks and Information Distortion
- Bot Prevalence: Bots are now so sophisticated that even top detection tools are, according to Rindsberg’s sources, “snake oil.”
- “Even the best… [bot detection] is basically at this point, snake oil. It doesn’t work.” (Ashley Rindsberg, 43:05)
- Engagement Gaming & the Spiral of Extremism:
- “If you say something crazy, the bots like the hell out of it and retweet the hell out of it. You get engagement and then you just start saying crazier things.” (Dave Rubin, 42:13)
- Real Human Skill:
- “It’s the last thing that’s human about us, our ability to detect the bot.” (Dave Rubin, 43:34)
9. Prospects for Change: Competition & Cultural Detox
- Breakthroughs: Elon Musk’s Grok-powered Grokipedia and the independent project Justapedia directly challenge Wikipedia’s information monopoly, a development Rindsberg views as positive.
- “Over time, Grokipedia will start to feed other AIs, it’ll start to feed Google. There’s another site called Justapedia… actually working.” (Ashley Rindsberg, 27:50)
- Risks: New competitors may still be vulnerable to ideological or algorithmic capture.
- Desire for Multiplicity:
- “Having more than one viewpoint out there is the main thing and if it’s more than two, even better.” (Ashley Rindsberg, 30:01)
- Personal/Cultural Solutions: Rindsberg and Rubin share a Gen X nostalgia, emphasizing how withdrawal from smartphone addiction and rediscovery of real-world connection may be the “divide” that matters most in the future.
Notable Quotes & Memorable Moments
- On Mission Drift:
- “They changed the mission of Wikipedia from building an online encyclopedia that’s reliable to building a social movement, social justice movement powered by DEI. Those are their terms.” (Ashley Rindsberg, 02:16, also 11:19)
- On Wikipedia’s Power:
- “What it becomes is kind of a knowledge cartel. You can’t break it. It’s a monopoly…a monopoly on information online.” (Ashley Rindsberg, 20:09)
- On Editorial Bias/Myth of Crowdsourcing:
- “Unless you are a dedicated, full time, ideologically driven editor… your edit will never stick.” (Ashley Rindsberg, 22:23)
- On Truth & Power:
- “She called the truth a distraction… She thinks that truth is something that we don’t find, but we make up. It’s a product of power dynamics.” (Ashley Rindsberg, 16:43)
- On AI and Wikipedia:
- “The worldview of AI reflects Wikipedia’s worldview disproportionately.” (Ashley Rindsberg, 21:16)
- On Bots:
- “Even the best [bot detection] is basically at this point, snake oil. It doesn’t work. You cannot, even with the best technology, detect a bot… only as a human.” (Ashley Rindsberg, 43:05)
- On Cultural Recovery:
- “The real world, there’s so much better, you know, the texture of it. The difference between putting your kids to bed with a phone in your hand and the phone just not on the same floor… is huge.” (Ashley Rindsberg, 47:59)
- “Zelda didn’t tell you to hate your neighbor for no reason.” (Ashley Rindsberg, 49:13)
Timestamps for Key Segments
- [04:29] — Rindsberg’s background: NYT forensic media criticism
- [09:05] — Discovery: NYT spreads Nazi propaganda on WW2
- [11:19] — Wikipedia’s 2017 mission shift to social justice/DEI
- [16:38] — Katherine Maher’s philosophy: truth as “distraction”
- [17:40] — How Wikipedia shapes Google, AI, and knowledge infrastructure
- [19:51] — The “knowledge cartel” and mutual benefit with Google
- [22:23] — Why outsider edits don’t survive on Wikipedia
- [24:02] — Investigation: coordinated Wikipedia editorial campaigns (Hamas/Hezbollah)
- [25:41] — Foreign/state influence: CCP editing, statistical detection
- [30:01] — Hope for competition: emergence of Grokipedia and Justapedia
- [35:14] — Trust and safety staff as ideological gatekeepers
- [43:05] — Identifying bots: snake oil detection, only humans can discern
- [47:59–49:17] — Real world vs. digital world: why offline is better
Conclusion
This episode unpacks the unseen power structures sustaining Wikipedia’s dominance, how its editorial ideology seeps into search, AI, and culture, and why attempts to restore balance are so fraught. Rindsberg calls for government transparency on foreign influence, a ban on federal reliance on Wikipedia for AI, and greater public awareness that Wikipedia is now “the most hackable system out there for propagandists.” The conversation ends on a hopeful note: the more information platforms and viewpoints, the better, plus an appeal to individual sanity: get offline when you can, and don’t forget what real life is like.
