Episode Overview
Title: How to Become an Algorithmic Problem
Host: Justin Hendricks (Tech Policy Press)
Guest: Jose Maréchal, Professor of Political Science at California Lutheran University
Date: February 22, 2026
This episode explores the themes from Jose Maréchal’s book, You Must Become an Algorithmic Problem. The conversation investigates how algorithmic systems, especially recommendation engines and large language models, shape individual and collective behavior, and what is lost for liberal democracy when optimization governs human experience. Drawing from political philosophy and empirical research, Maréchal argues for renegotiating our "algorithmic contract" and expanding the rights and responsibilities of digital citizens.
Key Discussion Points & Insights
Maréchal's Intellectual Journey
[02:28 – 05:41]
- Maréchal recounts his path from early optimism about technology to academic skepticism, influenced by years of teaching "Technology and Politics".
- His previous book, Facebook Democracy (2012), examined the commodification of intimate relationships through Facebook groups and found that political discourse online shifted from deliberation to seeking solidarity with like-minded individuals.
- Quote:
"It started very early. I've been teaching a class called Technology and Politics for, gosh, almost 20 years... I have a mentality of being an early adopter and somebody like right now I'm torn between my disdain for tech oligarchs and my fascination with AI as a tool." — Jose Maréchal [02:28]
- He argues social media has damaged the discourse environment by blurring the boundaries between solidarity-seeking and deliberative discourse.
The Outlier: Statistical and Political Dimensions
[06:03 – 11:07]
- The concept of the outlier is central to Maréchal's new book, both from a statistical and societal perspective.
- In statistics, outliers disrupt predictive models; in democratic societies, outliers represent valuable difference and novelty.
- As algorithms scale, outliers become easier to isolate and ignore, undermining novelty and unpredictable human behavior.
- Platforms like Netflix sort users into "clusters," promoting sameness and narrowing exposure to new ideas, a narrowing Maréchal argues erodes democratic and human potential.
- Quote:
"Are we habituating ourselves towards sort of those things that we've already told the algorithm we like and foregoing those kind of flights of fancy or those engagements with serendipity that... are necessary, I think, in not just democratic life, but to live a fully rounded human life." — Jose Maréchal [10:39]
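The statistical half of this point can be made concrete. A minimal sketch, not from the episode, of how a routine data-cleaning step flags and silently drops "outlier" users before a model ever sees them; the scores and the z-score threshold here are invented for illustration:

```python
# Illustrative sketch: the standard z-score rule for excluding outliers,
# the statistical move Maréchal worries about when applied to people.
from statistics import mean, stdev

def drop_outliers(engagement_scores, z_threshold=2.0):
    """Keep only scores within z_threshold standard deviations of the mean."""
    mu = mean(engagement_scores)
    sigma = stdev(engagement_scores)
    return [x for x in engagement_scores
            if abs(x - mu) / sigma <= z_threshold]

scores = [10, 12, 11, 9, 13, 10, 48]  # one atypical user
kept = drop_outliers(scores)
# The atypical score (48) is excluded from the model's view entirely.
```

The point of the sketch: the exclusion is automatic and invisible, which is exactly why, at scale, the atypical user is "easier to isolate and ignore."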
The Algorithmic Contract & The Loss of Serendipity
[11:07 – 15:57]
- Maréchal introduces the "algorithmic contract," drawing from political theory's social contract.
- We willingly accept curated experiences to relieve the anxiety of information abundance, but the contract comes at the cost of reduced exposure to difference and serendipity.
- The narrowing of digital experience diminishes democratic capacity, rational discourse, and empathy.
- Quote:
"If we go too far into that kind of rabbit hole of ourselves... we miss out on all the other possibilities of life... it inhibits our ability to do the things that liberal subjects need to do in a democratic society." — Jose Maréchal [13:29]
Optimization, Factory Farmed Citizens, and Cultural Homogenization
[15:57 – 20:39]
- Maréchal introduces the analogy of "factory farmed citizens," likening optimization's harms to the negative externalities of factory farming.
- The optimization culture locks in personal preferences, discourages novelty, and impacts creative fields like art, music, and even policymaking.
- AI-based recommendation and generation tools exacerbate this, producing modal, average, predictable responses and suppressing pluralism.
- Quote:
“What are the externalities of having a media and cultural environment that pretty much gives you everything you want?... If great work is out there and it doesn't get picked up in the algorithm, then does that make it even harder for people to pick up on it?” — Jose Maréchal [17:47]
Ambient Stochastic Terror: Anxiety as a Feature
[20:39 – 24:07]
- The platforms’ tendency to inject violence or anxiety into content feeds creates a constant background of stress: "ambient stochastic terror."
- Platforms both relieve and amplify anxiety, keeping users in a state of ontological enclosure, predictable and profitable for advertisers.
- Quote:
"The very thing that we think is giving us relief is also creating and amplifying the anxiety.” — Jose Maréchal [21:23]
“Scared subjects are better consumers. If you don't leave your house, right. You buy a lot of things online..." — Jose Maréchal [23:15]
Erosion of Democratic Norms and Human Relationships
[24:07 – 31:17]
- The “augmented state of nature” driven by AI and surveillance erodes trust, discourages community, and undermines democratic habits.
- Replacement of interpersonal relationships with synthetic interactions (e.g., students bypassing professors for AI tutors, coders relying on AI rather than Stack Exchange) reduces opportunities for mentorship, empathy, and community building.
- Liberal democracy assumes rights and freedoms are exercised in community; isolation and suspicion among citizens are antithetical to this vision.
- References to liberal thinkers (Mill, Rawls, Tocqueville) and modern scholars (Jennifer Forstall, Alexander Lefebvre) highlight the necessity of communal relationships for liberal values.
- Quote:
“For us to fully enjoy freedoms, we need to be in community with one another, because that's where our views of the world get vetted.” — Jose Maréchal [25:58]
Can Government or Policy Respond?
[31:17 – 36:06]
- Hendricks expresses concern that government adoption of AI is rushed, with minimal consideration for rights or ethics.
- Maréchal sees hope in targeted, deliberative uses of AI (like supporting low-resource languages) that can expand participation and pluralism.
- He frames policy divides as “politics of contraction versus politics of expansion” and emphasizes that engagement, not optimism, is essential.
- Quote:
“…the politics of contraction versus the politics of expansion. And maybe it's always been that way, but I do think maybe this is a new iteration of it.” — Jose Maréchal [34:49]
Toward Algorithmic Citizenship and Fuzziness
[36:06 – 41:53]
- Policy should guarantee not merely privacy, but the right not to have one’s potential limited by algorithms—a “right to serendipity” and a “right to digital potentiality”.
- Boolean (binary) vs. fuzzy (probabilistic) citizenship: Resisting rigid categorization fosters humility and openness, which are vital to democratic deliberation.
- Encourages tools and systems that foster serendipitous discovery, novelty, and an unbounded future.
- Quote:
“Instead of thinking of ourselves as members as ontologically enclosed, we think of ourselves as like yeah, well I prefer this. So I'm 66% of this, but I'm also 65% of that.” — Jose Maréchal [37:13]
“It’s important to maintain a sense of a not predetermined future. Right. That one of the hallmarks... [of] democracy is... when outcomes are predetermined, it’s not democracy." — Jose Maréchal [39:29]
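The Boolean-versus-fuzzy contrast can be sketched directly. A minimal, invented example (the category labels, membership degrees, and cutoff are not from the episode) of the two ways of modeling a citizen:

```python
# Illustrative sketch of Maréchal's contrast between Boolean and fuzzy membership.

# Boolean citizenship: each label is all-or-nothing.
boolean_profile = {"progressive": True, "rural": False, "religious": False}

# Fuzzy citizenship: degrees of membership that need not sum to one,
# echoing "I'm 66% of this, but I'm also 65% of that."
fuzzy_profile = {"progressive": 0.66, "rural": 0.65, "religious": 0.40}

def fits(profile, label, cutoff=0.5):
    """A fuzzy profile only 'fits' a label relative to an arbitrary cutoff."""
    return profile[label] >= cutoff

# Under fuzzy membership the same person belongs, partially, to several
# categories at once, so no single cluster fully predicts them.
overlapping = [label for label in fuzzy_profile if fits(fuzzy_profile, label)]
```

The design point: in the Boolean profile every classification is final, while in the fuzzy profile any hard boundary is an explicit, contestable choice of cutoff, which is the humility and openness the passage describes.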
How to Be an Algorithmic Problem
[41:53 – 45:13]
- Becoming an algorithmic problem means refusing to be easy to predict, challenging your own views, seeking discomfort, and actively engaging with difference.
- Encourages political and digital selfhood that is idiosyncratic, unpredictable, and willing to engage in real coalition-building and argumentation, not just comfort of sameness.
- Quote:
"Being an algorithmic problem is a commitment to anachronism, to idiosyncrasy, to like, you know, like putting yourself in situations where you are uncomfortable and engage with content and material that might not be the norm in the groups that you are... inhabiting." — Jose Maréchal [44:35]
Notable Quotes
- On algorithmic contracts: “In order to resolve the anxiety and boredom of everyday life... we submit to these frameworks that help curate our life for us.” — Jose Maréchal [12:25]
- On cultural optimization: "Netflix is actually trialing out, like make content that sort of fits this particular cluster, right? So artists know that if they want to get picked up by Spotify, they have to change the beginning of their songs..." — Jose Maréchal [17:21]
- On anxiety and surveillance: “Scared subjects are better consumers. If you don't leave your house, right. You buy a lot of things online...” — Jose Maréchal [23:15]
- On democratic affordances: “Somebody posting to Stack Exchange is a democratic affordance. It's a liberal affordance. You're trying to help other people because you believe in the well being, the value of the well being and dignity of others.” — Jose Maréchal [28:05]
Timestamps for Important Segments
- Maréchal’s work and Facebook democracy: [02:28 – 05:41]
- Outlier concept in statistical and social terms: [06:03 – 11:07]
- Algorithmic contract & its impact: [11:07 – 15:57]
- Cultural/Artistic homogenization & optimization: [15:57 – 20:39]
- Ambient stochastic terror & anxiety: [20:39 – 24:07]
- Implications for democracy & relationships: [24:07 – 31:17]
- Government and AI policy challenges: [31:17 – 36:06]
- Policy proposals for AI citizenship & rights: [36:06 – 41:53]
- How to be an algorithmic problem: [41:53 – 45:13]
Conclusion
Jose Maréchal’s core argument is that resisting algorithmic predictability is not just a matter of individual authenticity but a democratic imperative. By renegotiating how we live with algorithms, demanding plurality, serendipity, and digital rights that preserve human potential, we can protect both democracy and the fullness of our lives.
This episode is essential listening for anyone concerned with technology’s impact on democracy, culture, and individual agency in the algorithmic age.
