A
Good morning. I'm Justin Hendrix, editor of Tech Policy Press. We publish news, analysis, and perspectives on issues at the intersection of tech and democracy. In 2025, Tech Policy Press hosted nine fellows located around the world, from India to Spain to Brazil to the US and beyond. These individuals brought a variety of experiences and expertise to their work. One of those fellows was Anika Collier Navaroli. Anika is an award-winning writer, lawyer, and researcher focused on the intersections of tech, media policy, and human rights. She rose to particular prominence during the course of the congressional investigation into the attack on the U.S. Capitol on January 6, 2021. The House Select Committee that investigated those events spoke to hundreds of witnesses, including social media executives, with insight into the role that platforms played in propagating the false claims that motivated violence that day and in connecting and facilitating the organization of people who sought to overturn the election. Anika testified as a whistleblower about her experiences as a senior content policy expert at Twitter. She provided information about the platform's handling of extremism and disinformation and the phenomenon she described as coded incitement to violence. And Anika shed light on the internal deliberations about the decision to suspend President Donald Trump from Twitter following the attack. This year, Anika was named an assistant professor of professional practice at Columbia University in its Graduate School of Journalism, where she teaches and conducts research into the impacts of policy, regulation, and governance of emerging tech. For her fellowship at Tech Policy Press, Anika conducted a series of discussions intended to help us imagine possible futures for tech and tech policy, for democracy, and for society beyond the moment we are in. She called the series Through to Thriving.
For the tenth and final episode, Anika and I caught up about the project and reflected on the themes we heard emerge from the interviewees over the course of the year. Hey Anika.
B
Hey Justin. How you doing?
A
I'm doing okay. How are you? You were back in the news this week.
B
That is the truth, is it not?
A
So if any listeners don't know what I'm talking about: there was a piece published in the Washington Post on December 15 about tech whistleblowers and about their experience after they blow the whistle. And you were one of a number of individuals referenced in this piece, along with other folks who have been on this podcast or participated in Tech Policy Press in the past, including Yaël Eisenstat and others. I don't know. How did it feel to see this piece come out?
B
I was a little nervous about it, to be honest. I haven't really talked about whistleblowing in quite some time, which has been intentional. And so to kind of go back and sit with it and talk a little bit about what the process has been like, especially the process within the tech policy community and the reception within the tech policy community. It's not something I've really talked about publicly. And so I think it felt good in many ways and therapeutic in many ways, kind of like this podcast, to talk about a lot of these different issues and things that I, you know, I've been dealing with and working through in kind of a silent manner.
A
Well, I wanted to bring it up because in many ways, it felt like a kind of full circle to me in terms of your fellowship and the project you've had here. You know, when you applied to this fellowship, I think you were looking for a way to do exactly what you just mentioned, which is to process a lot of what you personally have been through in recent years in your career, but also to allow others to talk about what's happening in the field. Multiple fields, I should say. By the end of your various discussions, you've talked to people in trust and safety and journalism and technology and policy and in art. Above and beyond, I think, what we expected you to do for this in terms of kind of where you set out at the beginning of the year. You know, what were you hoping to do with this series?
B
Yeah, well, first off, I just got to give a shout-out to Gabby, who used to work at Tech Policy Press, because it was her idea when we were going through the interview process to turn this into a podcast, because my idea was like, let me talk to a bunch of people and figure out some stuff. Gabby was like, that sounds like a podcast. And I was like, oh, I actually hadn't thought about that before, you know, but I really was in just kind of a place of trying to figure out, I think, like so many people, how do we move forward, right? Like, okay, we got here. Here doesn't seem to be going that great. And so how do we get from this place that we're currently at to, again, the place of maybe thriving? Right? Because it felt like we were just so hamstrung by ourselves and by our ideas. And so this podcast kind of became my way of working through that, you know, personally and, you know, very publicly. And I'm really grateful for the opportunity to have been able to do that because I, you know, I have this body of work that I could look at and kind of reference in time and space and say, like, oh, this is exactly what I was thinking at this moment. And these are, you know, the things that we were laughing about and the things that we were thinking about. And I think, you know, we'll get into the themes and all of that. But I think it was really wonderful to just touch base with people on a human level this year and go through, you know, so many of these conversations in that way.
A
Well, this is the tenth episode, I suppose, in the series.
B
We're gonna kind of wrap it up a little bit.
A
Absolutely. Just at the end of the year, bringing it in under the wire. I'm so pleased to have had the opportunity to edit each of these throughout the year. And I think we're going to walk a little bit through some of the themes and some of the big ideas and maybe hear a little bit from some of the people who participated along the way to more or less underscore some of the ideas that you explored. But one of the things I know that you were keen to do is, you know, just get the lay of the land from folks, where we're at at the moment, you know, how people are feeling. So tell me a little bit about what folks told you in that regard.
B
Well, I think one of the really important things that came out of this sort of conversation, everybody, you know, we asked, was kind of like, what's up? You know, how are you feeling? Like, what's going on right now? And Jerrel Peterson, who has been, you know, a fantastic person in the trust and safety world and in my life, who's currently the director of content policy at Spotify, really put it, I think, in an interesting sort of way: it's complicated.
A
I always start my day with gratitude. I'm grateful to have a job. I'm grateful to work on things that are impactful, that mean a lot to me. There are a lot of really strong trust and safety professionals who are out there who do not have a job right now or who are worried about their career. So gratitude first. But there's a lot of polarization among people, and it's not unique to one or two countries. Everywhere, there's a lot of tension and fear about global economies, about technology, about climate disasters. Right. We are seeing one play out in front of us right now, too, where there's a lot of loss of life. There's all kinds of wars, and trust and safety folks have to sit at the center. We can't ignore it because these conversations are happening online, in content and in interactions. And so exciting, fun, all of those things. I still love it, but it is still complicated. It has not gotten any easier. It's complicated. I think we heard that from a bunch of different people.
B
Right, right. It is not in any way easy, I think, is what we continue to hear. Right. This work is not something that anyone kind of goes into lightly, and not something that anyone leaves lightly either. Right. And I think that that is something that I really wanted to talk about. And I think, you know, you and I, Justin, have talked about this a lot. But, you know, there was one question that I kept asking people in this series, and I don't know if you want to tell the story about the question that kept coming up over and over again, that I really felt was meaningful.
A
Oh, absolutely. And that question was, how do you feel about that? Or how do you feel? Which I have to admit is not a question I've often asked people on the Tech Policy Press podcast. Although I will just point out that I started this interview.
B
Oh, really?
A
With that question.
B
That's funny. That's really funny. I didn't realize that you had done that. But yeah, this is something that I thought about. And, you know, I was talking at TrustCon, and one of the things that they were asking was, you know, speaking publicly about tech policy, how do you do it? And I was like, well, one of the things is, you kind of gotta find your niche, right? There's no need for a brand new tech policy podcast. Like, there's enough out there in the space. And so for me, it felt like, well, what can I add to the space? And, you know, I think I talked with Vaishnavi at one point about how, you know, the nibblings in my life call me Auntie Internet because, you know, it's like, if you get in trouble or there's anything on the Internet, come and ask me. But I'm also Auntie Feelings, right? And those are the places that I really thrive. And so I think this podcast allowed me to really sit in that space. And in many ways it's like a therapy session, right? It was like, I mean, how are you? How does that make you feel? It's like the most therapist question that you can ask, right? And so I think being able to ask people that really allowed me to dig a little bit deeper and get into these sort of conversations. And, you know, even for myself, I think, in my first conversation that I had with Ellen Pao, who is the director and founder of Project Include and former interim CEO of Reddit. When I spoke to her in the very first episode of the series, I talked to her a little bit about what I was thinking and how I kind of arrived at the series as well. The policy conversations in the US are stunted, and there is a sense of unraveling that is happening elsewhere and around the world. But one thing I think is really missing is the sort of affirmative vision for what we actually want out of technology.
You know, how do we create a more just, equitable, sustainable, or democratic future? So I've also been wondering, how do we get through this moment so that we can see beyond it? How do we put our focus and energy, and what do we put our focus and energy into right now, that is going to fortify us for what is to come? And how do we not just survive, but somehow have the audacity to find a way to thrive, a way to dream, a way to create, a way to develop alternative futures?
A
So you're trying to kind of connect to the humanity of some of these people, trying to kind of get at the reality that not only are they intellectual participants in this field, but they're people. We're all people.
B
We're all people, and we're all humans, and we all have feelings. And yes, we might have, you know, done some, like, really amazing stuff that was in the Washington Post or whatever, you know, in Tech Policy Press or someplace around these publications that we, you know, read. But, like, we're humans. And I think that that was something that really came out throughout the various folks and throughout the various conversations that we had. And we mentioned talking to Ellen, but, you know, speaking with her for the very first podcast about Community was really special.
A
Right.
B
And she spoke about really, like, what it meant and what the need was for building intentional community and what that means for folks. A feeling of trust, a feeling of, I think, reciprocity. Not a transactional reciprocity, but a knowledge that if I needed help or if I just needed to go someplace where I'm not going to be judged, that safety element, but somebody who, like, shares the same values. Not all the values necessarily, but just, I know this is a good person.
A
So one of my favorite conversations that you hosted is with someone I know reasonably well because he is on the board of Tech Policy Press and has been a friend to Tech Policy Press since it got started, which is Desmond Patton at the University of Pennsylvania. I could not possibly list all of his titles and affiliations, but one of the things he's working on is a research agenda about joy. And by the way, I'll say Desmond kind of started thinking about joy after he spent a lot of time thinking.
B
About grief, not joy. Like the opposite of joy. Like gun violence amongst youth. Right. One of the most terrible things. And so much of what I wanted to talk to him about was sort of the transition that he's been able to go through, especially for those of us in tech policy who spent our time looking at the worst things on the Internet and some of, like, the really bad shit, right? Like, what does it mean to then pursue joy? What does it mean to have this be, like, a center of your life, and how does that kind of change how you are seen within the tech policy industry? And I think Desmond had some really interesting things to say about that.
A
People now associate me with joy. Instead of talking to me about my research on gun violence and AI, they're talking to me about joy, and I love it. They want.
B
I was gonna say, how does that make you feel? It makes me feel wonderful because I have shifted my identity.
A
Like, I've shifted, and it's positive. And people want to process joy with me.
B
They want to share joy with me.
A
People, like, are emailing me articles that they read in the New York Times about joy. They're looking for joy in unexpected places, and they want to process it with me. Everything has been now focused on joy. I have shifted my life just by being intentional, being reflexive, and also being vulnerable. And I was like, damn, if we.
B
Can all do this. Like, this is cool.
A
And so that's been the beauty of this. So one of the other things that I think you tried to do in this series is to talk to people who have operated both in and around and outside of industry, in civil society, but people who have worked in big tech firms and have that experience and have perspectives on the work that goes on inside those companies, the shortcomings of those companies, but also the opportunities to try to make change within them and from outside them. One of those people was Vaishnavi, who I think taught us a few different things.
B
Yeah. I'll just say I think it was really important for me to talk to people across a broad spectrum of tech policy, especially as someone who's had a career that's been, I don't know, kind of all over the place. Right. I've been in civil society, I've been in academia, I've been in industry, and I just really recognize the importance of having all of us in the room talking together in order to get to a place of change. Right. Because I think in so many ways we're all headed in the same direction. And I think Vaishnavi made such an incredible comment about what it really means to be working together and how it is important that we work together, because, like, we all hold different pieces of knowledge. Right. And how important it is especially to have folks who have worked inside of the tech industry being involved in these discussions as well. Vaishnavi, of course, is the founder of Vys. I think we still fundamentally have a significant asymmetry of expertise when it comes to how technology works. I think most of the folks who are doing great work around product and policy development, engineering, data science, research, they sit within private organizations. They do not sit within civil society. They do not sit within government. And yet it's civil society and government that play the role of checks and balances in the system. But how can you truly effectively regulate something if you don't understand how it works? I think Alice Hunsberger, who's the head of trust and safety at Musubi, also said something kind of similar in that vein, thinking about what it means to work inside of technology companies. And so I think my biggest advice is to remember that the company as a whole, capital letters, The Company, is not the individuals who work there. And that back channel conversations and just meeting each other as humans are things that actually can often drive change in a way that the official meetings can't. And that goes both ways.
If you're a trust and safety person and you're working on a policy, reach out and ask people's advice and be clear about what you personally can and cannot promise and can or can't deliver. But there's tons of people out there who are just desperate to have some hand in helping platforms be better and so take advantage of that as well. I think one of the reasons I found the conversation with Alice and Jerrel to be so special is because, very honestly, there's always been a little bit of a hostility towards folks who work inside of the tech industry and just a little bit of, I don't know, holding at arm's length. And that's something that I felt when I went from advocacy into working in industry. And it's something that I wanted to talk a little bit about breaking that down and what that can look like, especially working between civil society, especially working between advocacy and recognizing the need for all of us, again, to be in the room together.
A
And of course, this is a difficult thing. We could probably spend an entire other series on this question of, you know, what's possible to do from inside technology companies versus what is, you know, needed in terms of putting accountability measures and regulation on them from the outside in order to make sure that accountability is there. So much of the best intentions of folks who might work in civic integrity or trust and safety are not matched by the folks who are working in the C-suite or the incentives that are being set by boards.
B
Well, of course, I know so much about that; otherwise I wouldn't have been in the Washington Post this week, as you mentioned. Right. So much of my own experience is, what does it mean to be inside of a company, or be someplace inside of tech policy, wanting to do good work, but maybe facing, you know, leaders or powers that be that don't necessarily want that same sort of change. And I think one of the folks that really helped me think through how we can imagine different things in these spaces is artist Mimi Onuoha, who I've known for quite some time. Right. And her thinking about the sort of connection between art and technology and how we are, as she says it, different parts of the same body really was inspiring for me to hear. I think we need to recognize ourselves as different parts of the same body. The arm does something different than the leg, but you need both, you use both when you walk. If we know that, maybe it can inform our interaction together so that it isn't reductive or simplistic. We can understand that we use different languages, we might be speaking to different audiences, but we are sharing a similar message. To me, that message, when you say, what do we hope for the future? Our hopes and our critiques should be intertwined.
A
Hope for the future. Many people seem to keep coming back to this. You know, where do they find hope? Where do they see silver linings? Where do they see opportunity to build a different type of world, an alternative future? That was a common theme in this series, a purpose in this series.
B
Yes. And I'm glad that it came through, because if it didn't, then I think we would have missed the big purpose of it. But, yes, this was. I mean, you know, we talked about origin stories and how people got to where they got with the work that they're doing. But really what this was about was, what do we do in the future? How do we envision the future? Can we envision a future that is different than what we currently have? And I think one of the things that was so interesting was from Timnit Gebru, who is the founder of DAIR, the Distributed AI Research Institute. I mean, she talked a little bit about how we kind of see tech as being inevitable, right. And this sort of advancement, that this one thing comes after the next thing, that comes after the next thing. But in reality we're just living in the imagination of folks who have the most funding and who are deciding what is actually happening, and that we don't necessarily have to do that. And it's kind of a callback to, you know, Donna Haraway's feminist critique of the "view from nowhere," the way in which we learn, especially at least in the western world, about, you know, science and tech as being from no one's point of view. It's like, you know, it's just the truth out there, or the next evolution, the next inevitable evolution. Like, obviously there were horses and then steam engines and then cars and then self-driving cars and then flying cars. You know, this is the natural progression of things. And first we have to really understand that there is no natural progression of things. Literally all of it is about who is getting the resources to execute on what imagination.
A
I should point out that one part of your future that's been realized this year is that you've taken a new job. You're at Columbia.
B
Sure have.
A
You're in the journalism school, you're teaching again. One of the conversations you hosted this year was about journalism.
B
It sure was. Which is close to my heart, clearly and obviously, right. I think I'm the biggest proponent of the crossover between journalism and technology policy, which of course is why Tech Policy Press is so special to my heart, because it is that exact crossover. But, you know, I wanted to talk to reporters in the field who are currently working on technology policy, like Naomi Nix at the Washington Post, and see, what does it mean at this moment to be doing tech policy? And how can we win back the trust that journalism and other institutions of knowledge have lost? The mission feels as urgent as ever. While trust in media, and particularly, I think, mainstream media, has declined, that feels sad and discouraging, but it also feels like a challenge: how do we explain how we work? How do we continue to try to win the trust of the American people, of our readers? And how do we do it in a way where there's a business model that allows us to exist and to do that work? That's a really tough challenge, but it feels important, in a moment when people are believing known falsehoods, when conspiracies online have created communities for people and are affecting politics.
A
In that same episode, you also spoke to one of my favorite researchers out there, whose mind comes at a lot of questions with a very unique perspective: Jasmine McNealy.
B
Yes, Jasmine. Shout out to the University of Florida, where I went to undergrad, where she works. Jasmine and Naomi. It was such a wonderful episode to be able to sit down and talk to folks about tech policy and journalism through a variety of different ways. And I really loved what Jasmine had to say when I specifically asked, you know, what is some of the advice that you have for the next generation of folks who want to be in this field and who want to pursue this work that so many of us are doing?
A
I would say read. Read a lot.
B
Read a lot.
A
Read a lot of different things.
B
Read history, read sociology, and read nonfiction. But then do not be afraid to write for yourself. Please write for yourself. I'm begging you to write for yourself. And I'm not dismissing the visual journalists at all, and the photojournalists; do that for yourself as well. But be willing to be vulnerable in writing for yourself and accepting feedback.
A
I think it's important to say that not everybody you interviewed was willing to paint a rosy picture of the future in some far off place. And one of the people who every day is in my feed pointing out some hypocrisy of a tech firm, some danger of a new technology, something that we should be concerned about with regard to the intersection of tech and society is Chris Gilliard. And I appreciated, you know, his interview in particular.
B
Same. Right. So Chris is somebody who I actually had never spoken to before, and I think it was really wonderful being able to meet him and have this conversation with him. And, you know, he's the co-director of the Critical Internet Studies Institute, and he talked about a phrase that he's had for a while, one that I know I've seen printed on T-shirts or printed on stickers, right. It's interesting because when he says it, it sounds like there's no hope for the future. But in reality, he comes from such a hopeful place.
A
I mean, I have a phrase that I say often, right. Which is like, every future imagined by a tech company is worse than the previous iteration.
B
Yes, yes. That is on T-shirts, on stickers, everything. Yes.
A
Yeah, I think I need to do another T-shirt drop, right?
B
Yes, I think you do too.
A
And the reason I say that is because the imagination of the tech company is driven by capitalism and the need to, like, extract maximum value from us. In order for these things to exist in a way that didn't do that and actually benefited us, you know, we kind of have to rewrite a bunch of the ways that things work. I think that's possible. You know, I think it is. It's, like, actually super dark right now, you know. But, you know, I was just talking to a collection of college students, and, you know, I think in a lot of ways, like, the tech barons are really overplaying their hand. And what I mean by that is that it's very clear where their alliances are.
B
Like, they.
A
There's not really kind of any pretense anymore whose side they're on, whether they believe in things like, you know, racial justice or equity or even democracy. Like, I don't have that question anymore.
B
Yeah, you know. Yeah, I know.
A
Like, I personally never did, but, you know, some people did. Yeah, yeah, it's very obvious where they are.
B
Yeah.
A
You know, history says that that usually doesn't work out well for.
B
Yeah.
A
For that crew of people, you know. And so I think it's possible to have some of these things in ways that don't feed directly into authoritarianism. Another thing that I think you got at in this series is where we can make progress on activism, where we can act, you know, where folks can get engaged. One of the people that talked to you about that was Nora Benavidez.
B
Yeah. I think that sort of question of what can we do was one that I really wanted to hit on. Right. Because, again, as I mentioned, in so many moments we feel stuck. We felt like there's no way to move forward. We can't do regulation. You know, platform accountability is so hard. So what can we do? And I think I really appreciate Nora for giving us this kind of long-held civil rights wisdom of not waiting for hope. Right. And being able to act even in a space when hope feels so grim and so far away. The great line many of my colleagues talk about in the civil rights space is: we can't wait for hope. We have to take action. And I think in this environment, there's a sense of, well, let's wait, like you and me, let's have this chat and somehow hope that a feeling comes to us and then take action. And it's like, no, just take an action. Just show up at a protest. Just go to the city council meeting. Send the email to your representative. Do the thing, whatever it is. Start small, but start. Rip that band-aid off, because action will produce hope, not the other way around.
A
A lot of people are probably looking at 2026 thinking it's going to be a very, very difficult year. A lot of political ferment, both in the US and abroad. A lot of headlines, I'm sure, to expect around artificial intelligence and other threats to privacy and to democracy that emanate from our concerns around technology. You're moving into this new role; you're teaching. Is there anything you'll take from this series in terms of, you know, how you'll talk to your students, how you'll think about how you lace considerations around the future we're trying to build into the work that you're doing?
B
I take so much of the series with me. Right. I think that, as I said, it's been a very, like, public therapeutic session of thinking and also finding joy for myself. Right. I think one of the things that would have been so easy for me to do is lose hope after the journey that I went through and to lose all sort of desire to want to continue to move forward. And instead, what I found is that I believe in the next generation, and I believe in these students who care and who ask questions and who come at this from a place that makes me hopeful, genuinely. They're starting from a place that it took me years to get to. Right. And that's where they're coming to this work from. And so, you know, I'm really excited. I'm going to start teaching a new course next semester, this upcoming semester, specifically about content policy at a technology company, the stuff that I used to do. And so, you know, being able to kind of make meaning in a way around a lot of this work that I did. I remember somebody asked me once, while I was in the middle of the, you know, whistleblowing situation, what would it mean to put a bow on it? Right. And I had to say, well, I don't think I've done that yet. Right. I think I was still in the middle, and I was still trying to figure out what it would mean to kind of wrap this up and put a bow on it. And I think I'm almost at that place, right, of being able to say, like, okay, that's over and done. And now I'm at the place of, how do I take the lessons that I learned and give them to somebody who can do something with it? Because when I tell you it's not going to be me, Justin. This is no longer my job. You know, I am grateful to all of the gods that, you know, what the President of the United States says on a social media platform is not my job anymore. And, you know, figuring out what is and is not misinformation is not my job anymore.
And I send, you know, my heartfelt prayers and thoughts to those people whose job it is and what my job now is, is to take the lessons that I've learned and to impart them to the next generation and say, like, hey, here's how we messed up, here's what we learned. Take this and run with it. And hopefully, you know, they can do better than what we did.
A
I'm sure your students will benefit from your perspective and your experience, as we have benefited from your interview approach and style and the relationships that you brought to this. Very grateful that you were able to join us as a Tech Policy Press fellow this year, Anika, and I hope that we'll continue to work together in 2026 and beyond.
B
Oh, same, Justin. You can't get rid of me. You've never been able to get rid of me since the first time I came onto your podcast and wrote for you. And I plan to continue to stay around, and I am, again, grateful for this opportunity to be a fellow and to be able to put together this limited series. And shout out to the fellows that come next and the amazing work that Tech Policy Press is going to continue to do.
A
Anika, Happy New Year.
B
Same to you, Justin.
A
Before we close the episode, I just want to give a special shout-out to all of this year's Tech Policy Press fellows, including Anika, Ariana Aboulafia, Dia, Eryk Salvaggio, Jasmine Mithani, Laís Martins, Megan Kirkwood, Prateek Waghre, and William Burns. They each contributed so much to our coverage and analysis this year, and we wish them the best in the year ahead. And a special shout-out to the Tech Policy Press team for coordinating and editing the fellows throughout the year, including Ben Lennett, Ramsha Jahangir, Cristiano Lima-Strong, and Prithvi Iyer. Prithvi just left Tech Policy Press earlier this month to join the Penn Center on Media, Technology and Democracy, and we wish him the best. That's it for this episode. I hope you'll send your feedback. Write to me at justin@techpolicy.press. Thanks again to Anika, thanks to my co-founder Bryan Jones, and thank you for listening.
B
Tech Policy Press.
Podcast: The Tech Policy Press Podcast
Episode Title: Through to Thriving: Insights from the Field
Date: December 21, 2025
Host: Justin Hendrix
Guest/Fellow: Anika Collier Navaroli, with featured voices from throughout her fellowship series
This special episode marks the culmination of Anika Collier Navaroli’s “Through to Thriving” series, a fellowship project with Tech Policy Press exploring the current state and imagined futures of technology policy at the intersection of tech, democracy, and society. The host, Justin Hendrix, and Anika reflect on this year-long series of conversations with leading voices in trust & safety, journalism, academia, advocacy, and the arts. The episode focuses on processing recent years’ turmoil in tech policy, fostering space for human connection, and seeking paths not just to survival, but to hope, agency, and thriving futures.
Key themes:
The Necessity of Collaborating Across Borders and Breaking Down Hostility: Conversations unpack the persistent skepticism and mistrust between those inside industry and those in advocacy or civil society, noting the importance of collaboration.
Advice from Inside the Industry
On Emotional Processing and Community
On the Limits of Hope Alone
On Joy as Activism
On Tech Policy Cross-Sector Cooperation
On Possibilities Beyond the Status Quo
This episode serves as a capstone—both a reflective look back on the traumas and lessons of recent years and a heartfelt call to action and imagination for what tech policy, civil society, and democracy can become. Listeners are encouraged to engage as whole people, act before hope arrives, and rewrite tech policy’s future with a more humane and equitable vision.