TRASHFUTURE – UNLOCKED 5000 Columbos feat. Gregk Foley
December 31, 2025
Main Theme
This episode of TRASHFUTURE unpacks how the logic of digital capitalism, technological hype, and the revolving door between politics, tech, and the security state manifest in initiatives like AI-powered law enforcement tools. The discussion weaves through news items dripping with absurdity, the commodification of public sector 'innovation,' and the chilling advance of surveillance capitalism, before concluding with a humorous but trenchant call for a more honorable, even cinematic, model of policing.
Joining the regular crew is Gregk Foley, host of the podcast Bloodwork, to offer sharp commentary on AI, policing, and the failures of the Blairite political legacy.
Key Discussion Points & Insights
Woke AI, the US Approval Process, and the President’s Ramblings (00:15–03:48)
- The episode opens with a satirical riff on political grandstanding about “woke” culture infiltrating AI and regulatory structures in America, taking off from Trump’s muddled complaint about state-level approval processes leading to “woke” data centers.
- "Woke, it's got a regenerating health bar, got shields." – Hussain (00:41)
- "We thought we took down woke, but then its health bar flashed and it got bigger." – Riley (00:45)
- The hosts joke about AI being ‘corrupted’ by exposure to progressive states, poking fun at technophobia and reactionary panic.
The Deadening Repetition of History (03:29–03:53)
- They compare current political hysteria to the patterns of the 20th century, only now "being much more dumb about it."
- "It does sort of feel like we're repeating a lot of things that happened in the 20th century, but we're also being, like, much more dumb about it." – Ewan Blair (03:31)
Nonsensical Christmas Murals, Local Panic, and Symbolic Portals (05:09–10:14)
- Riley highlights a bizarre Christmas mural in Kingston upon Thames featuring melting snowmen and animals, which prompts confusion and over-interpretation in local Facebook groups, where images of animals running across frozen water get read as ‘small boats’ (i.e., references to migration).
- "There's clearly something political about it, but nobody knows what it is. If ever there was a slogan for our times, that's it." – Greg Foley quoting a restaurant manager (08:29)
- The crew mock British cultural paranoias: the 'woke' label attaching itself to anything, and the persistent inability (or unwillingness) to parse reality from symbolism.
Euan Blair, Multiverse, and the NHS: The Revolving Door of Tech-Policy Grift (10:49–22:33)
- Multiverse (founded by Tony Blair’s son, Euan) emerges as a paradigmatic grift: a company that first extracted public funds to 'reform' apprenticeships and is now pivoting to AI training for the NHS by partnering with US data-mining firm Palantir.
- "The stakes are so high... if we don't take the workforce on the journey, you add to the productivity doom loop and people will die." – Euan Blair, as quoted on the show (12:47)
- "Man selling monorail says only monorail can reverse decline of downtown district." – Gregk Foley (13:21)
- The hosts debunk myths about outdated NHS technology (e.g., the supposed evil of fax machines), noting that digitization for its own sake can be counterproductive or insecure.
- "Sometimes that is the best solution to a technological problem, is a fax machine." – Hussain (14:27)
- "There are industries where making things touch-screenified and 'modern' actually makes services worse." – Ewan Blair (paraphrased, 14:34)
- Central critique: The problem isn't a lack of tech, it's the diversion of funding toward private consultants, under the guise of modernization.
Palantir, AI, and Union-Busting (18:01–22:33)
- The Multiverse–Palantir partnership will train NHS staff to use Palantir’s AI decision tools, with government funds, raising questions about NHS worker consent, data security, and union-busting.
- "Palantir is even helping the NHS absorb and like, absorb the impact of resident doctor strikes. You can use it to break strikes." – Riley (19:19)
- The group laments the cyclical nature of influence, nepotism, and public sector grifting by the political establishment and their progeny.
Weaponized GeoGuessr: GREYLARK and the Absurdities of AI Policing (23:54–36:16)
- The featured investigation is Greylark’s “GeoSpy”: a tool that claims to geolocate any image using AI, marketed aggressively to police and security agencies.
- “This is weaponized GeoGuessr.” – Gregk Foley (24:21)
- Hosts challenge the validity and ethics of such tools, relating their skepticism to perennial tech-for-police scams (e.g., ShotSpotter), cop puffery, and the dangers of automating police violence.
- "Arguably, if it was within 20 minutes, maybe not that dangerous... No one lies more than the police." – Hussain (27:34)
Cybertrucks and Toy-Box Security (28:10–29:39)
- Tangent on Las Vegas police getting Andreessen Horowitz-donated Tesla Cybertrucks, lampooning the spectacle of 'innovation' gone absurd.
Dystopian Outcomes, Accountability, and Evidence-Generating Machines (32:14–36:16)
- Serious points about how AI tools automate violence and error in policing, or produce pretexts for them, distorting ‘truth’ and allowing responsibility to be abdicated.
- "It is an AI hallucination in real life... you have lived inside an AI hallucination." – Riley (34:06)
- "Their utility lies precisely in their potential to destabilize the thing they proclaim to secure." – Greg Foley (51:35)
- Foley particularly stresses the move from finding evidence to producing evidence—through whatever means necessary—including AI hallucinations.
5,000 Columbos: A More Honorable Model of Policing (38:13–42:19)
- Gregk Foley dramatically advocates "abolishing" AI-heavy policing in favor of a force made of 5,000 “Columbos”: bumbling, persistent, personable detectives as seen on TV, upholding a 'gentlemanly' game of cat-and-mouse.
- "I don't want 500 MRAPs. I want 5,000 Columbos." – Gregk Foley (38:57)
- This evolves into a comedic vision: massed raincoat detectives, cigar budgets, and an erotically charged pursuit of crime and justice.
GREYLARK’s Founder and Corporate Social Media Shtick (42:21–49:11)
- They mock Greylark’s “brotherly” Boston-born founder Daniel Heinen for his cringey LinkedIn, fake motivational quotes, and culture war pandering.
- "Search that [quote], you will get precisely one Google result. Daniel Heinen's Twitter." – Greg Foley (44:47)
- Point made: These companies blend tech-bro hustle, surveillance capitalism, and online branding into a toxic but lucrative stew.
The Inverse Dynamic: From AI Warfare to the Surveillance of Everyday Life (50:18–54:42)
- Gregk Foley brings in the lesson of Israel’s “Lavender” AI program in Gaza: AI is mainly used to speed up target approval, lower oversight, and shift liability for civilian harm onto “the machine.”
- "From what I could see, the real purpose of the system was to generate thousands of targets that could all be signed off in a few seconds with as little human oversight as possible." – Gregk Foley (50:18)
- The hosts agree: the real 'utility' of these systems is not accuracy but producing 'credibility' for power to act, whatever the social or human cost.
Closing Reflections & Recs (54:54–57:53)
- Gregk Foley plugs Bloodwork, his podcast on the philosophy and history of violence, inviting listeners for collaborations on episodes covering cinema and theory (e.g., Michael Mann’s Heat, Schrader’s “Man in a Room” trilogy, or Tenet).
- "For me, the podcast is the juice." – Hussein (55:51)
- Discussion meanders into movie fandom and the obtuse pleasures of Christopher Nolan’s logic.
Notable Quotes & Memorable Moments
- "It's a future so close, I can taste it. I can smell it." – Greg Foley, on a sea of Columbos, (42:07)
- "Heroes deserve the most AI." – Greg Foley, mocking corporate slogans (47:17)
- "Mom says it's my turn on the surveillance machine." – Hussain, on the 'brotherly' founders (46:24)
- "The real utility of these systems is their ability to produce justifications for violence, not accuracy or truth." – (Summarized from Greg Foley’s remarks, 51:35–52:47)
Overall Tone
Irreverent, incisively skeptical, and laced with dark humor, the episode bounces between mockery and serious worry about technological overreach in policing and policy. The hosts’ energetic banter merges with Gregk Foley’s philosophical edge, maintaining the show’s trademark blend of the absurd and the analytical.
Timestamps for Important Segments
- 00:15–03:48: Satire on “woke” AI and political rambling.
- 10:49–22:33: Deep dive into Euan Blair, Multiverse, NHS tech grift, and Palantir.
- 23:54–36:16: GREYLARK’s “GeoSpy”—AI spying for cops, skepticism, risks.
- 38:13–42:19: The case for 5,000 Columbos: honor in policing.
- 50:18–54:42: Israel’s Lavender AI, shifting liability, the danger of techno-justifications.
- 54:54–57:53: Bloodwork plug, closing meta-commentary on cinema and tech.
For Listeners New and Old
This episode sharply exposes how tawdry, redundant, and dangerous our "technologized" future can be, punctuated by surreal moments, deep-dive analysis, and raucous suggestions for a world with less AI and more Columbo. If you want to understand why advanced policing tools aren’t just absurd but profoundly political, and why you should root for the detective with the cigar, this is one for the books.
