Transcript
A (0:00)
I'm Marissa Wong, intern at Lawfare, with an episode from the Lawfare archive for April 5, 2026. On March 25, the jury in a landmark lawsuit found social media companies Meta and YouTube liable for harm caused to a user's mental health by their addictive algorithms and design features. The bellwether case could open social media companies to more lawsuits over their algorithms and their effects on users' well-being. For today's archive, I chose an episode from March 14, 2025, in which Renee DiResta sat down with Glen Weyl, Jacob Mchangama, and Ravi Iyer to unpack how social media algorithms shape user interaction through processes designed to be invisible and opaque to users. The group also discussed how new decentralized platforms are attempting to design pro-social media platforms for a better future. It's the Lawfare Podcast. I'm Renee DiResta, contributing editor at Lawfare and associate research professor at Georgetown's McCourt School of Public Policy. I'm with Glen Weyl, economist and author at Microsoft Research; Jacob Mchangama, executive director of the Future of Free Speech Project at Vanderbilt University; and Ravi Iyer, managing director of the USC Marshall School Neely Center.
B (1:38)
I just think no matter what our goals are, the design of the overall information ecosystem and what gets surfaced is critical.
A (1:51)
Today we're talking about design versus moderation. The way that social media platforms are built influences everything from what we see, to what is amplified, to what is even created in the first place, as users respond to incentives, nudges, and affordances. These processes are often invisible or opaque, though new decentralized platforms are changing that. So we're going to talk about designing pro-social media for the future and the potential for an online world without censors. I want to bring you all in right now, thinking about the difference between moderation as policing, a failed end state, versus design: design as a proactive way to cultivate behaviors, to subtly shift norms, to guide users in particular directions, not necessarily through top-down rule enforcement, but rather by determining the affordances of a system and what the system lets us do. One of the reasons I'm excited for this conversation with you all specifically is that when I read your work, you all have such deep thinking about the specifics of ways that system design can produce better social media. I know that Glen and Jacob, you two recently released a paper titled "Prosocial Media," and I'd love to start with that. I think the term pro-social media is wonderful. I'd like to ask you to define what that means and tell us a little bit about your work.
B (3:09)
Yeah, so I think the key idea that motivated the term pro-social media is that, obviously, social media are doing something social; they're using social information to serve content. But that doesn't necessarily mean they're achieving the goals that many people had in creating social media, which were to strengthen connections across people, help communities be stronger, and reinforce the social fabric they're built on. Social media could in theory be like sustainable agriculture, which strengthens the soil at the same time as it harvests from it, or it could be like clear-cutting agriculture. And I think many people believe that social media has actually been undermining the social fabric as it's been harnessing it, and we want to try to make it more sustainable, more regenerative, so to speak.
