Sean Carroll's Mindscape Ep. 330
Guest: Petter Törnberg | The Dynamics of (Mis)Information
Date: September 29, 2025
Episode Overview
In this engaging episode, host Sean Carroll welcomes Petter Törnberg, professor of computational social science, to discuss the dynamics of information and misinformation in society, with special focus on social media. The conversation blends ideas from physics, complexity science, social modeling (particularly agent-based models), and the real-world consequences of digital interaction platforms. Together, they explore why social media platforms often generate polarization, echo chambers, and attention inequalities—even without malevolent algorithms—and whether there are feasible interventions or sources of optimism.
Key Discussion Points & Insights
1. Complexity Science in the Social World
- Complex vs. Complicated Systems:
Petter starts by distinguishing between complicated systems (like machines that can be disassembled and understood in parts) and complex systems (like ant colonies, where collective behavior emerges from the simple interactions of many agents) ([05:49]).
- Changing Social Epistemologies:
He argues that society’s metaphor has shifted—from the planned, "machine epistemology" of Fordism to an "organic" metaphor where outcomes emerge from many decentralized interactions ([08:55]). This shift also changes the forms and visibility of power in society, with platforms and algorithms subtly guiding social outcomes.
"There are still forms of power that are shaping society. They're just much less visible...by shaping how we interact by things like algorithms."
— Petter Törnberg ([10:43])
2. Structural Power and Emergence
- Even emergent, “bottom-up” equilibria encode power structures set by the initial conditions and “rules of the game”. The outcomes aren't neutral; changing the rules can produce very different, perhaps less harmful, emergent phenomena ([11:59]).
"You could have produced other rules that would produce other outcomes."
— Petter Törnberg ([12:18])
- The digital society’s networked structure is associated with power-law rather than normal distributions; a few individuals acquire vast influence while most remain sidelined ([14:02]).
3. The Schelling Segregation Model and Social Media Analogs
- Classic Model Recap:
Schelling’s early model illustrated how individual preferences, even without any top-down prejudice, create cascading segregation ([15:38]).
- Digital Update:
Petter transferred this model to online spaces (imagine forums or subreddits as group nodes), finding that even stronger segregation dynamics emerge ([17:25]).
- Filtering Algorithms:
Somewhat counterintuitively, filter bubbles (algorithms showing users more of what they like) may reduce overall segregation by stabilizing group compositions ([21:32]).
- Debate on Explanations:
While Schelling’s model shows "innocent" mechanisms causing segregation, real-world polarization often results from overt structural forces—both explanations have validity and sometimes overlap ([23:33]).
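Schelling's result is easy to reproduce. The following is a minimal illustrative sketch of the classic grid version (not the specific online variants discussed in the episode); all parameter names and values here are assumptions chosen for the demonstration:

```python
import random

def schelling(width=20, height=20, vacancy=0.1, threshold=0.3,
              steps=20_000, seed=42):
    """Minimal 2-D Schelling model on a torus (illustrative sketch).

    Two agent types share a grid with some vacant cells. An agent is
    unhappy when the share of like-typed neighbors (8-neighborhood)
    falls below `threshold`; unhappy agents jump to a random vacancy.
    Returns the mean like-neighbor share before and after the dynamics.
    """
    rng = random.Random(seed)
    n_vacant = int(width * height * vacancy)
    n_agents = width * height - n_vacant
    cells = ([None] * n_vacant + [0] * (n_agents // 2)
             + [1] * (n_agents - n_agents // 2))
    rng.shuffle(cells)
    grid = {(x, y): cells[y * width + x]
            for y in range(height) for x in range(width)}

    def like_share(pos):
        x, y = pos
        nbrs = [grid[((x + dx) % width, (y + dy) % height)]
                for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                if (dx, dy) != (0, 0)]
        occupied = [t for t in nbrs if t is not None]
        if not occupied:
            return 1.0
        return sum(t == grid[pos] for t in occupied) / len(occupied)

    def mean_like():
        occ = [p for p, t in grid.items() if t is not None]
        return sum(like_share(p) for p in occ) / len(occ)

    before = mean_like()
    positions = list(grid)
    for _ in range(steps):
        p = rng.choice(positions)
        if grid[p] is None or like_share(p) >= threshold:
            continue
        q = rng.choice([s for s, t in grid.items() if t is None])
        grid[q], grid[p] = grid[p], None  # relocate the unhappy agent
    return before, mean_like()
```

Even though each agent only demands that 30% of its neighbors match its own type, the population typically drifts from a well-mixed state to a noticeably segregated one: the "innocent" individual preference produces a collective outcome no individual asked for.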
4. Echo Chambers, Political Discourse & Social Media Structures
- Different platforms exhibit various levels of echo chamber effects; smaller online communities (like subreddits) tend to be highly segregated, while platforms like Twitter historically had more cross-ideological interaction ([25:23]).
- Benefits & Harms:
Social media can provide community—even lifesaving support—for marginalized groups, but also for hate groups and extremists ([27:42]). The formation of collective identity and radicalization can be traced in longitudinal data from hate forums like Stormfront ([30:35]).
- Identity Feedback:
Interaction shifts individuals’ self-narratives (“I” to “we”), highlighting the recursive process of identity radicalization online ([31:56]).
"When they first come in, they use ‘I’ and ‘my’...but then over time they start saying ‘we’ or ‘SF’ for Stormfront."
— Petter Törnberg ([31:39])
5. Modeling Social Systems: From Rules to Language Models
- Agent-Based Modeling Primer:
In agent-based models, simple behavioral rules at the agent level can yield surprising, often unintended, systemic outcomes ([37:43]).
- Upgrading Agents with LLMs:
Traditional rule-based agents are limited; LLMs (large language models) now allow the simulation of richer, more realistic behaviors ([39:08]).
- LLM-driven Social Media Simulation:
Petter and collaborators created a simulated social network of 500 LLM agents, each with a customized persona derived from real US data, placed in a simplified social media environment ([45:26]).
- Agents could post, share, and follow others, with minimal platform constraints.
- The researchers expected to struggle to produce “toxic” outcomes without complex manipulation.
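The episode does not walk through the paper's actual code. As a purely hypothetical sketch of what such a simulation's skeleton might look like, the stub `decide` below stands in for prompting a real language model with an agent's persona and feed; every name, parameter, and probability here is illustrative:

```python
import random

def decide(persona, feed, rng):
    """Stand-in for an LLM call. A real simulation would prompt a
    language model with the agent's persona and feed text; this stub
    just leans toward re-sharing posts from the agent's own side."""
    same_side = [p for p in feed if p["side"] == persona["side"]]
    if same_side and rng.random() < 0.7:
        return dict(rng.choice(same_side))      # share a copy of a post
    return {"author": persona["name"], "side": persona["side"]}  # new post

def simulate(n_agents=50, rounds=20, seed=1):
    """Skeleton agent-based social media simulation: every round, each
    agent sees a small feed and either posts or shares."""
    rng = random.Random(seed)
    agents = [{"name": i, "side": rng.choice(["D", "R"])}
              for i in range(n_agents)]
    posts = []
    for _ in range(rounds):
        for agent in agents:
            feed = rng.sample(posts, min(len(posts), 5))
            posts.append(decide(agent, feed, rng))
    return posts
```

The structural point survives even in this toy: the platform supplies only a feed and two verbs (post, share), and all interesting dynamics come from how the agent policy interacts with the content it is shown.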
6. Emergent Pathologies of Social Media: Findings from Simulation
- Three Negative Outcomes Emerged Spontaneously ([48:17]):
- Echo Chambers: Democrats and Republicans quickly segregated.
- Attention Inequality: A few users dominated the attention (power law distribution).
- Polarization/"Social Media Prism": More extreme users captured more reach and engagement.
"To our surprise, we didn't actually need to do anything more than provide this bare bone platform. And we got these three features that are widely considered the problematic aspects of social media."
— Petter Törnberg ([46:35])
- Mechanisms:
Preferential attachment (the rich get richer) and emotionally driven engagement mechanics: sharing tends to amplify strong emotions, which feeds back into the network structure ([49:17]).
- Anthropomorphism Caveat:
LLMs imitate human rhetorical tendencies; while not conscious, they robustly operationalize learned “preferences” (including political partisanship) given a persona ([51:21]).
- Structural Feedback:
Network formation and content-sharing are recursively linked, locking in echo chambers and polarization regardless of individual intent.
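The preferential-attachment mechanism behind the attention inequality is standard and easy to demonstrate. A minimal sketch (a generic "rich get richer" toy model, not the paper's implementation):

```python
import random

def preferential_attachment(n_authors=2000, seed=3):
    """Toy 'rich get richer' process: each newcomer grants one unit of
    attention to an existing author, chosen with probability
    proportional to the attention that author already holds."""
    rng = random.Random(seed)
    attention = [1, 1]   # two seed authors, one unit of attention each
    tickets = [0, 1]     # one lottery ticket per unit of attention
    for newcomer in range(2, n_authors):
        winner = rng.choice(tickets)      # attention-proportional draw
        attention[winner] += 1
        tickets.append(winner)
        attention.append(1)               # newcomer starts with one unit
        tickets.append(newcomer)
    return attention
```

Sorting the result shows a heavy tail: a handful of early or lucky authors end up with a disproportionate share of all attention, echoing the power-law attention distribution the simulated platform produced.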
7. Testing Interventions: Can Social Media Be “Fixed”?
- Tried Interventions:
Petter’s team tried various proposed “fixes”, such as:
  - Showing users only constructive/bridging content
  - Hiding partisan identifiers
  - Sorting posts chronologically instead of by popularity
Results were not promising—most interventions barely dented the problematic outcomes, and some made things worse (e.g., chronological feeds exacerbated the "prism" effect) ([63:49]).
"This kind of emergent phenomena seems to be very kind of rigorous to perturbations... It's a very robust emergent phenomena."
— Petter Törnberg ([63:55])
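The ranking interventions are simple to state as code. A hypothetical sketch of the two feed policies being contrasted (the post fields are illustrative, not taken from the study):

```python
def rank_by_engagement(posts):
    """Popularity feed: most-shared first, recency breaking ties."""
    return sorted(posts, key=lambda p: (p["shares"], p["time"]),
                  reverse=True)

def rank_chronological(posts):
    """Reverse-chronological feed: newest first, ignoring engagement."""
    return sorted(posts, key=lambda p: p["time"], reverse=True)
```

The striking finding is that swapping one policy for the other barely matters: even removing engagement ranking entirely, as the chronological feed does, failed to undo the emergent polarization, and by the team's account made the "prism" effect worse.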
8. Information Quality, Misinformation, and Political Incentives
- Social media, by stripping away traditional editorial gatekeeping and incentivizing attention, creates optimal conditions for misinformation to flourish ([68:00]).
- Petter's empirical research with actual social media data finds that right-wing populist movements, in particular, have exploited these dynamics, using misinformation as a deliberate political weapon ([70:00]).
9. Is Social Media All Bad? Room for Optimism
- While much of the emergent behavior is problematic, Petter notes the historic and personal positives: online connection has been liberating and supportive for many, especially marginalized users ([71:00]).
- Real progress, he suggests, demands fundamental redesign—not just surface tweaks to algorithms and feeds ([72:00]).
Notable Quotes & Moments
- On "Physics Envy" in Social Sciences:
"Social science is hard. People are messy, there's a lot of variables going on...But there are contexts in which physics-like reasoning can be helpful or even interesting."
— Sean Carroll ([00:31])
- On the Shift from Fordism to Networks:
"We've moved from that kind of machine epistemology into an era...defined by complex systems where we're talking about society as swarms or self-organizing."
— Petter Törnberg ([09:20])
- On Echo Chambers:
"There are suggestions that there is quite a lot of communities that are relatively segregated...Most subreddits, if they're political, they tend to be towards one side or the other."
— Petter Törnberg ([25:23])
- On Extreme Content's Success:
"...the more polarized, more extreme users tend to have more attention on the platform."
— Petter Törnberg ([48:17])
- Chilling Emergence:
"We didn't actually need to do anything more than provide this bare bone platform. And we got these three features that are widely considered the problematic aspects of social media."
— Petter Törnberg ([46:35])
- On Intervention Futility:
"Unfortunately, none of these solutions really fix the problems that we're observing and some of them actually make matters worse."
— Petter Törnberg ([63:49])
- Cautious Hope:
"We could create structures, we could create platforms and spaces that would actually be beneficial for us. It's just we might need to rethink it in more fundamental ways than just these cosmetic changes."
— Petter Törnberg ([72:00])
Timestamps – Selected Segment Highlights
- [05:49] Complex vs. complicated systems, Fordism, and metaphors for understanding society.
- [15:38] Recap and update of the Schelling segregation model for the digital world.
- [25:23] Echo chambers in real-world platforms; positive and negative social media impacts.
- [31:39] Formation of collective identities in online hate communities.
- [37:43] Explanation of agent-based modeling and the integration of LLMs.
- [46:35] Spontaneous emergence of echo chambers, attention inequality, and polarization in LLM models.
- [63:49] Testing and failure of platform intervention strategies.
- [68:00] Social media’s structural incentives for misinformation, especially in politics.
- [71:00] Reflections on social media’s positive origins and the necessity for deeper change.
Concluding Note
The episode offers a sobering yet nuanced perspective: the pathologies of social media are deeply rooted in structural dynamics akin to physical laws of networks and complexity—not just the consequence of bad actors or tweakable algorithms. However, the same networked structures that create problems have also enabled unprecedented connection and support. Meaningful progress hinges on reimagining information ecosystems at a fundamental level.
