Podcast Summary: My Momma Told Me – "Robots are Racist" (with Brett Gray)
Date: February 24, 2026
Hosts: Langston Kerman, David Gborie
Guest: Brett Gray
Episode Overview
In this engaging and humorous episode, comedians Langston Kerman and David Gborie invite actor and singer Brett Gray ("I'm a Virgo", "MJ the Musical", "Barbershop") to dive into the conspiracy theory: "Robots are Racist." The trio explores the nature and definition of robots and the biases inherent in AI, and debates whether advanced machines can develop and perpetuate racism. Their conversation flows from comedic banter and generational pop culture differences to real-world examples and research on racial bias in technology.
Key Discussion Points and Insights
1. Generational Gaps and Black Cultural Canon
- Early Banter: The episode kicks off with lighthearted teasing about phone cases, Gen Z habits, and a generational gap around movies like "Cool Runnings" and “Johnny Tsunami”.
- Brett admits never seeing "Cool Runnings":
“What year did that come out?” (07:24, Brett Gray)
“29 is old enough to have seen Cool Runnings.” (07:28, David Gborie)
2. The Main Conspiracy: Are Robots Racist?
- Launching the Topic:
“You said, my mama told me robots are racist.” (10:56, Langston Kerman)
- Defining ‘Robot’: Brett clarifies he's thinking about human-like, intelligent robots, not just Roombas or manufacturing arms. He references robots like in "Ex Machina" and emerging humanoid AI.
“When I think robots, I think like, have you guys ever seen Ex Machina?” (11:52, Brett Gray)
- Potential for Racism: Brett and the hosts discuss how AI’s ability to detect and classify people by race could easily lead to bias, especially once AI is given real power (e.g., as hiring managers).
- Brett:
"If the robots are like now hiring managers, I do think that it could be racist... Because it can go in your history, detect who you are... and decide based on a lot of different factors." (13:47, Brett Gray)
- Institutional Power: They debate if robots can be “racist” if they become decision-makers or if their bias is more broadly anti-human, especially if AI becomes an entirely new “race.”
- Human vs. Robot Survival:
- Discussion veers into whether robots would depend on humans for maintenance and upgrades, or if they could evolve past human need entirely.
3. Real-Life Examples of Bias in Robotics and Technology
- Motion Sensor Racism: David shares his experience that automatic hand dryers and faucets work less effectively for darker skin, sparking both playful and serious discussion:
“Motion sensor robots are racist as a dark skinned individual... Paper towel dispensers. The darker your skin is, the less they work.” (33:03, David Gborie)
- Langston adds:
"Even my experience... I ain't that far from the shit that they should be able to manufacture it for." (33:07, Langston Kerman)
- Study Cited: Langston presents findings from a legitimate scientific study by Georgia Tech, UW, and Johns Hopkins:
“A study... confirms that robots have an active bias against people's race and sex.” (35:01, Langston Kerman)
- Robots selected white and Asian men as ‘doctors’ more often, labeled Black men ‘criminals’ 10% more often, and labeled Latino men ‘janitors’ 10% more often—mirroring existing social biases. (36:51, Langston Kerman)
4. Where AI Gets Its Biases
- Training Data: They consider how AI, trained on biased human data (e.g., the internet), will inevitably reproduce and even amplify those same biases.
“You can't... Our internet... is far more consumed with bias than it is with truth. And in that way it's training [robots]..." (38:13, Langston Kerman)
5. Can AI Institutionalize Racism Further?
- Scope of Decision-Making: Brett raises concerns about AI entering institutional roles (e.g., deciding court cases), and Langston points out this may already be happening in subtler ways due to reliance on AI-powered search and summaries.
- Examples: Google’s AI Overviews as a source of misinformation:
“The AI overview is often wrong... it skips over a lot of information or sometimes picks the wrong one because it's just there are way more articles saying this wrong thing..." (44:22, Langston Kerman)
- ChatGPT 'Coded Blackness': Jokes about AI using slang when tailored to Black users, highlighting how simplistic or misguided the modeling can be.
“I've asked it to write me an email and it'll be like, yo.” (45:52, Brett Gray)
6. Pop Culture, Conspiracies, and Blackness
- Voicemail: Ghostwriter as Slave Ghost?
- Caller shares a rumor that PBS’s “Ghostwriter” is actually a runaway slave killed by dogs—a “dark” premise for a children’s show.
- Hosts and Brett largely dismiss this theory, comparing it to “white people bullshit” and after-the-fact symbolic retconning.
“That is white people sort of attempting to add a weight to a thing that they weren't actually writing about.” (55:35, Langston Kerman)
- Imaginative Theories:
- The gang riff on Barney being imaginary ("Yeah, I guess. Every time"), and reflect playfully on PBS kids canon—“Arthur,” “Dragon Tales,” “Reading Rainbow.”
- DMX's meme version of "Reading Rainbow" song makes an appearance:
“Suck my dick.” (60:43, David Gborie)
Notable Quotes & Memorable Moments
- Langston on the mission:
"We dive deep into the pockets of black conspiracy theories, and we finally work to prove absolutely nothing..." (04:02, Langston Kerman)
- On the futility of arguing with tech:
“The transfer of control out of your daily life to these things is, like, kind of the scariest transition in my lifetime...” (25:37, David Gborie)
- On skepticism toward technological progress:
"We weren't unimaginative... But it was like, oh, there's a day where I wake up and the Google is in charge of how we find out information." (25:59, Langston Kerman)
Important Timestamps
- [10:56] – Introduction of the “robots are racist” conspiracy
- [13:47] – Brett’s explanation of how AI could be racist in hiring
- [33:03] – David’s personal experience with racial bias in hand dryers
- [35:01] – Langston introduces academic research on robot bias
- [36:51] – Data/statistics from study: AI classifying races into stereotypes
- [44:22] – Google AI Overview and the spread of misinformation
- [45:52] – ChatGPT’s attempts at Black ‘voice’
- [53:11] – Voicemail about “Ghostwriter” as a slave ghost
- [55:35] – Hosts dismiss after-the-fact attempts to ascribe hidden racial history
- [60:43] – Humorous DMX “Reading Rainbow” moment
Tone and Language
The episode is consistently playful, irreverent, and rooted in Black pop culture. The hosts blend silliness with sharp socio-technical analysis, and Brett Gray proves an affable, quick-witted guest. Their style is profane, deeply self-aware, and often self-deprecating, betraying a real affection for both the subject and each other.
Closing & Shout Outs
- Plugs: Brett Gray’s upcoming Amazon Prime project "Barbershop", and his socials.
- Shout-out: Live shows for Langston and David (especially in New Orleans), Patreon, and listener engagement.
- Final Reflection:
"We cannot let white people continue to use slavery as a weapon against us. We gotta draw a line somewhere." (58:56, Langston Kerman)
Summary Takeaway
Robots may inherit the world, but for now, they're still running on our (decidedly flawed) data—and thus, our biases. The hosts urge listeners to stay skeptical, laugh at the absurdity, and remember that racial bias isn't just built into old systems—it's being coded into new ones, too. As always, they invite listeners to call in with their own theories, share the ridiculous, and, if they must, threaten to take the mics away.
