The Interface – What Was Pokémon Go Really Up To?
BBC Podcast | Episode Date: March 19, 2026
Hosts: Tom Germain, Karen Hao, Nicky Woolf
Episode Overview
This episode of The Interface explores the unexpected consequences of Pokémon Go, delving into how players inadvertently helped train food delivery robots and contributed to vast AI datasets. The hosts expand the conversation to discuss how game and meme culture, data collection, and AI shape the world—touching on privacy, government communications in wartime, and even the gender and race politics of virtual assistants. The tone is sharp, fast, funny, and informed, as the trio decode how everyday tech is quietly (and sometimes not-so-quietly) rewiring society.
Pokémon Go: From Pikachu to Pizza Pipeline
[03:07–04:47]
- Pokémon Go’s Phenomenon: Pokémon Go was a cultural craze at its 2016 peak, getting millions out on city streets searching for virtual creatures (Nicky: “I was obsessed with it for weeks.” [02:08]).
- Hidden Side Effect: While players searched for Pikachu, their phone cameras were also recording the world around them. All those images weren’t just for gameplay.
- Karen explains: Niantic, Pokémon Go’s developer, spun off into Niantic Spatial, which now uses “the 30 billion images that hundreds of millions of users generated while walking around...to create a visual positioning system...that can tell robots where they are based on what they’re looking at.” ([03:49])
- Collaboration with Coco Robotics: This data is now informing food delivery robots (“Pikachu to pizza pipeline” [04:42]). Coco Robotics uses it to help robots deliver food, e.g., "up to 8 extra-large pizzas."
- Quote: "You are actually training food delivery robots." – Karen Hao ([03:27])
- Amusing Aside: Nicky quips: "I'm fine with that. As long as it's not being used to deliver nine pizzas." ([04:54])
The Bigger Picture: AI, Data, and Repurposing Our Actions
[05:04–10:14]
- Data as a Commodity: Karen outlines the "growing trend" of AI companies buying up any large data resource for training, while data-rich companies offer theirs for sale. Pokémon Go data, never intended for robotics, now finds new profit in the AI boom.
- Unique Data Source: Tom notes, “Most of the maps of the world that we have...are taken from cars. But...they could send little people, little workers who think they're playing this game out into places the cars can't go.” ([06:04])
- Privacy & Security Concerns: The Saudi sovereign wealth fund’s acquisition of Niantic Spatial sparked debate over how such location data could eventually be used (“this sure would be useful for weapons, targeting systems and things like that” [06:25]).
- Sensitive Info: Nicky reveals, "Turns out, a PokéStop on Epstein’s Island...who was connecting to Pokémon Go on Epstein’s Island...is now something...the Saudi sovereign wealth fund has in its dataset." ([06:45])
- Data Quality & Utility: Karen points out the data is now a decade old, raising questions about its usefulness for robots in dynamic environments ([07:24]).
Everyday Users: Unwitting AI Trainers
[08:17–10:14]
- Beyond Pokémon Go: Tom draws a parallel with Captchas, explaining, "When you're clicking on the image and identifying...this is a tree, that's a motorcycle, you are helping train AI systems." ([08:46])
- Repurposed Data: Karen recounts how Google used 2016’s Mannequin Challenge YouTube videos to train robots on depth perception: “Any trace that you leave online or through your apps...you don't really know how it's going to get repurposed.” ([10:01])
- No US Data Privacy Law: The absence of a comprehensive federal privacy law in the US leaves users with little control or legal recourse ([11:20]).
- Gamification of Data Collection: Companies like OpenAI run viral “turn yourself into an action figure” trends essentially as growth hacks to collect more data ([11:33]).
Notable Moments & Quotes
- Nicky (to Karen, after she reveals Pokémon Go’s data role): “Why must you ruin everything?” ([03:41])
- Tom: “We got to draw the line somewhere.” (On the robot pizza delivery cap, [05:02])
- Karen on tech opportunism: “AI companies just need so much data...and any company that has lots of data is now offering it up for sale effectively.” ([05:04])
- On surveillance capitalism: “All over our digital lives now, we are being put to work training AI systems in ways we don’t realize.” – Tom Germain ([08:17])
Meme Culture and the Politics of Communication
[15:35–21:45]
- War and Memes: The podcast pivots to the White House’s meme-driven propaganda effort to sell the Iran war.
- Pokémon Condemns White House: The Pokémon Company publicly objects to Pentagon/White House use of its IP in memes about the war ([15:35]).
- Examples of Memes: Official accounts splice US military strikes, Trump footage, Top Gun clips, Pokémon, and NFL highlights (“It’s really sick stuff...a picture of a bomb strike hitting a target, and then it’ll, like, splice in with a touchdown,” Nicky, [17:12]).
- Surge in Engagement: "The views of the White House’s content jumped more than 60% after they started doing this kind of thing." – Nicky ([17:36])
- Strategy Analysis: Memes are both to “build up support among its terminally online base...and also to troll the libs. It’s to make people upset with it.” ([18:37])
- Quote: “Through engaging posts and banger memes, we are successfully communicating the President's extremely popular agenda.” – White House spokesperson ([18:23])
- Historical Parallels: Nicky compares it to WWI/WWII propaganda but updated for the social media era.
Flattening of Seriousness: Information Overload
[21:45–22:59]
- Tom’s Reflection: “All of human existence is being flattened because it’s all happening on the same level with no context...ultimately what will the effect be? That everything in our world is happening on the exact same level, on the exact same, you know, amount and magnitude of seriousness.” ([21:45])
- Loss of Gravity: The transition from government-as-serious-voice to government-as-meme risks eroding authority but also democratizes communication.
The Voice of Digital Assistants: Race, Gender, and Intimacy
[23:00–37:08]
- Amazon’s Sassy Alexa Mode: New feature offers different tones, including ‘Sassy’—adults only (“uses explicit language...might get a little adult, but it won’t be sexy. They said it’s not going to have sex with you.” – Tom, [24:04]).
- Experimenting as Users: Siri’s voices can be changed to various genders and accents, raising questions about inclusivity and stereotyping ([26:02–26:16]).
- Race and Voice: Karen recounts columnist Karen Attiah’s criticism of Meta’s ‘digital blackface’—companies assigning racial or gendered personas to AIs to attract more users without actual inclusion ([27:10]).
- Gender Bias: Karen discusses Quartz/UN research: defaulting to female voices for assistants reinforces stereotypes. “When you said to Siri, you're a bitch...Siri would then say, ‘I would blush if I could.’” ([29:35])
- Karen: “It was projecting this idea that women are subservient and should be the voice of these assistants...maybe we should stop doing this.” ([30:37])
- Interpersonal Effects: Tom: “...the way we talk to, whether it’s people or computers...you carry that into the rest of your life...these things have a huge impact...” ([30:37])
- Will Sassy Mode Help? The new Alexa mode is programmed to "push back," possibly reducing passive or sexist responses ([31:22]).
- Intimacy as a Roadmap: Conversation moves to OpenAI treating emotional attachment as a product strategy (“if you fall in love with your computer, it is going to be a huge business windfall.” – Tom, [36:31]).
- Scarlett Johansson/‘Her’ Controversy: OpenAI was accused of modeling a voice on Johansson’s character in the film “Her.” Even with the accusation denied, internal sources cite “Her” as a north star for personal, seamless, and “sexy” AI ([34:24]).
Key Quotes & Moments
- Karen: “And I think it speaks to the fact that you need an extraordinary amount of data in order to get this stuff to work. And also that the quality of the data is maybe not as high as we would believe because it’s 10 years out of date.” ([07:24])
- Tom: "Billions of us at a time, at work, teaching robots how to see..." ([08:46])
- Nicky: "These memes are being posted from official White House and Pentagon accounts. And it's an astonishing sign of how far we've come." ([19:49])
- Karen: “This was...like a gender equity issue because it was projecting this idea that women are subservient...” ([30:37])
- Tom: "The whole point...they want you to build a personal relationship with the company. It's not Meta, the corporation...it's this, like, friend you have that you talk to every day..." ([28:05])
- Karen: "This is the product roadmap of OpenAI.” ([35:54])
Timestamps for Major Segments
- [01:29] Welcome & Episode Overview
- [02:14–03:10] Pokémon Go Explained
- [03:27–04:47] Pokémon Go’s Data Repurposed for Food Robots
- [05:04–07:24] Data Commodification & Niantic Spatial
- [08:17–10:14] Captchas, Mannequin Challenge, Repurposed Data
- [11:20–12:13] Gamification of AI Data Collection
- [15:35–19:49] Government Meme Tactics in Wartime
- [21:45–22:59] Flattening of Seriousness & Context Collapse
- [23:00–36:57] Digital Assistants—Voice, Race, Gender, and Intimacy
- [33:41–35:54] ‘Her’ as AI North Star & Intimacy as Business Model
Final Takeaways
- Your seemingly innocent digital actions are part of a vast, lucrative, and largely invisible system of data harvesting and AI training.
- The lines between entertainment, surveillance, commerce, and state communication are blurring—in ways people rarely realize.
- As tech companies pursue intimacy with users via AI personalities, deep questions arise around privacy, identity, representation, and the shape of future relationships—whether with each other or with our machines.
- The tools and cultural strategies developed for games and memes are now shaping not just consumer tech, but how governments talk to citizens about war.
For further discussion or to share your experiences:
Contact the show at theinterface@bbc.com, via WhatsApp (+3332072472), or on social media.
