The Tech Policy Press Podcast
Through to Thriving: Protecting Our Privacy with Chris Gilliard (November 15, 2025)
Episode Overview
In this episode of the "Through to Thriving" series, host Anika Collier Navaroli explores the present and possible futures of privacy with guest Chris Gilliard, co-director of the Critical Internet Studies Institute and author of the forthcoming book Luxury Surveillance. The conversation covers the challenges of maintaining privacy in an always-connected world, the ideological resistance to visibility, the historical and racialized dimensions of surveillance, the concept of "luxury surveillance," and hopes for a future where technology serves rather than surveils us. The episode also critically examines tech platforms, digital redlining, GenAI, and what true privacy could look like.
Key Discussion Points & Insights
1. The Tension of Being Private in Public (Online)
- Maintaining Privacy as a Public Figure
- Chris Gilliard is known for his online presence and thought leadership, yet intentionally keeps his personal life and image private.
- "I do my best [to remain private]." (02:01, Chris)
- Ideological Resistance
- Chris resists feeding personal data into “the machine”—a pushback against the widespread expectation to overshare online.
- "I try, whenever possible, to not feed the machine." (03:17, Chris)
- Negotiated with the Washington Post to exclude photos of his face.
- Anika notes her own struggle: "One of the things in my life that I've hated the most about the past couple of years is how visible I have had to be…" (02:38, Anika)
Memorable Quote
"It is difficult, you know, because I think that there's an imperative driven by tech companies... that we're supposed to give away everything online." (02:11, Chris)
2. Pandemic, Home Invasion & the Normalization of Surveillance
- Zoom and Remote Work/Home Intrusion
- The pandemic normalized visual and spatial intrusion, exacerbating privacy erosion.
- "We were being conditioned to accept the idea that strangers had a right to peer into our houses." (05:34, Chris)
- The disparity of visible backgrounds magnified existing class differences.
- "The disparity that showed at that time... was just something that I think... is completely under-investigated." (07:42, Anika)
- Chris described pushing back on video call norms to set an empowering precedent: "If I can say no, I should say no because it gives other people the option..." (08:28, Chris)
Notable Moment
"There was a lot of work done to normalize [home video access], I think, to our detriment." (07:04, Chris)
3. On Being 'Very Online' Without Social Media
- Chris uses minimal social media (mostly Bluesky) and left Twitter the day Musk acquired it.
- He gets most of his information via RSS, Apple News, and Google Alerts.
- "If being online means being on platforms, I guess I'm not really. But I do spend a good chunk of my day...reading and researching articles." (10:07, Chris)
4. Leaving Twitter: Platform Migration and Digital Influence
- Left Twitter ("the bird app") immediately after Musk's acquisition, having foreseen rising toxicity and diminishing utility for progressive and left voices.
- Wrote "Digital Migration is Nothing New for Black Folks" (WIRED) reflecting experiences of Black users shifting platforms.
- Importance of platform dynamics in who gets heard/verified and whose work is amplified.
Notable Reflection
"The deal that they used to offer is, 'We'll give you this service... to pay for it, you let us surveil you a little bit.' That deal was a lie." (49:07, Chris)
5. Defining Privacy & Its Roots
- Privacy Definition (via scholar Saba Singh):
- "The right and agency a person or community should have to reasonably avoid or evade scrutiny, be it of their body, data, or property." (20:54, Chris, quoting)
- Personal Roots
- Grew up Black in Detroit—felt the direct impact of race-based surveillance and exclusion. The audible "click" of car locks as a child symbolized both surveillance and alienation.
- References Simone Browne's Dark Matters on anti-Blackness and surveillance.
- Challenges with institutional filtering at colleges shaped his focus on digital exclusion and equity.
Memorable Quote
"It feels like a sign that I'm the one who's not supposed to be there, even though it's my neighborhood... it feels alienating. It feels insulting." (26:11, Chris)
6. Digital Redlining
- Chris popularized "digital redlining" to describe digital barriers (e.g., filtered internet access) that reinforce racial and class inequity, echoing the practice of housing redlining.
- Filtering and denied connectivity trace existing lines of exclusion, reinforce them, and create new ones.
- "The implications... were drawn along not only class and race lines... they also kind of mirrored the exact geographies that we tend to see." (30:57, Chris)
7. Luxury Surveillance
- Core Argument: "An ankle monitor and an Apple Watch are essentially the same technology."
- Surveillance devices are now marketed as aspirational luxury goods, normalizing constant monitoring.
- "There's a segment of devices often chosen by people who have the ability to say yes to surveillance... they don't understand them as surveillance devices." (33:03, Chris)
- Devices like Meta Ray-Ban glasses, video doorbells, and wearable AI exemplify “luxury surveillance”—invasive tech disguised as lifestyle improvements.
Notable Exchange
Chris: "I argue that an ankle monitor and an Apple Watch are essentially the same technology." (32:41, Chris)
Anika: "Oh my God, you're right. I'm wearing an Apple Watch... I literally told my best friend, and her response was, I didn't think we were allowed to do that." (32:41–32:53, Anika)
8. Abolitionist Perspective on Tech
- Chris identifies as a tech abolitionist:
- "Many of these technologies should not exist and we should reject them in any and all ways possible and available to us." (35:18, Chris)
- The mainstream defense that "some legit use cases exist" ignores the overwhelming, proven social harms.
- Drawing the line: "You can use your AK as a doorstop... that's not what it's for." (36:51, Chris)
9. Critical Views on GenAI & ChatGPT
- Chris refuses to use ChatGPT or other GenAI tools; sees them as tools of further surveillance and false promises.
- Highlights historical struggles for literacy among Black Americans, and finds the idea of automated "reading and writing" deeply troubling.
Memorable Quote
"Billionaires and soon-to-be trillionaires... have come along and said, oh, we have a thing that's going to make it so that you don't have to read anything." (45:44, Chris)
10. Disparities & the Normalization of Surveillance
- GenAI and data-driven "optimization" technologies are sold as beneficial but enable unprecedented surveillance; marginalized groups are most exposed, but everyone is eventually affected.
- “There’s a lot of investment in making sure that we’re not able to [say no]... the negative effects of these things are not isolated to the most vulnerable... but they eventually come for everyone.” (38:38, Chris)
11. Imagining Better Futures
- Drawing from Ruha Benjamin, Chris advocates using imagination to envision technology that benefits us rather than extracting information and value from us.
- Most tech "futures" imagined by companies are worse because they serve capitalist interests; positive tech futures require systemic change.
- Hope: As more people become aware and critical, collective pushback will increase.
- “We can rewrite the ways that things work and it is possible... it’s actually super dark right now, but... the tech barons are really overplaying their hand. It's very clear where their alliances are.” (54:47, 56:11, Chris)
Notable Quotes & Timestamps
- "I try, whenever possible, to not feed the machine." (03:17, Chris)
- "We were being conditioned to accept the idea that strangers had a right to peer into our houses." (05:34, Chris)
- "If I can say no, I should say no because it gives other people the option..." (08:28, Chris)
- "There's a segment of devices often chosen by people who have the ability to say yes to surveillance... they don't understand them as surveillance devices." (33:03, Chris)
- "Many of these technologies should not exist and we should reject them in any and all ways possible and available to us." (35:18, Chris)
- "We can rewrite the ways that things work and it is possible." (57:50, Chris)
Important Timestamps
- 00:00 – 01:08: Introduction to the series and episode
- 02:05 – 04:05: Personal privacy practices and negotiation with media
- 05:28 – 08:28: Pandemic norms, home/work privacy invasion, and pushback
- 10:00 – 12:50: Chris’s online life and media consumption habits
- 14:19 – 16:59: Leaving Twitter; digital migration for Black users
- 20:16 – 21:13: Defining privacy: "the right to avoid scrutiny"
- 22:25 – 27:37: The click of the car lock: Blackness, surveillance, and belonging
- 28:09 – 32:17: Origins and definition of digital redlining
- 32:27 – 36:58: Luxury surveillance: Apple Watch, video glasses, video doorbells
- 35:18 – 38:49: Abolitionist approach; rejecting surveillance tech
- 43:15 – 46:50: Literacy, GenAI, and the future of knowledge
- 47:32 – 53:08: Disparities in privacy; normalization and imposition of surveillance
- 54:47 – 57:50: Hope, imagination, and the pushback for a better tech future
Final Thoughts
Through a personal, political, and historical lens, Chris Gilliard and Anika Collier Navaroli dissect privacy as both a right and a necessity in the digital age. The episode offers actionable reflections for tech users, scholars, and policymakers: refusing to treat surveillance as normal, questioning the legitimacy and utility of so-called innovations, and collectively imagining a future where privacy is possible for all. Gilliard's abolitionist stance and call to rethink technology's purpose stand out as a radical and necessary provocation.
“We can rewrite the ways that things work and it is possible.” (57:50, Chris)
