A
The following podcast contains explicit language.
B
Hello and welcome to the Optimism edition of Slate Money, your guide to the business and finance news of the week. I'm Felix Salmon of Fusion, and I have been persuaded by my special guest this week to be optimistic. I have. I was hanging out with an optimist last night. I am now hanging out in Slate Money Studios with an optimist. So I feel like on this Thanksgiving edition of Slate Money we are going to try our hardest, in the midst of a brutal year, to have a sliver of hope and optimism. We will not give in to paranoia. That's the alternative title of the episode. I am joined as ever by Cathy O'Neil, the author of Weapons of Math Destruction.
C
Hello.
B
By Jordan Weissmann, the Moneybox columnist at Slate.
A
I kind of wanted to call this the No Surrender edition, as a Bruce tribute.
B
And now, as long term listeners of Slate Money might remember, we have a little bit of a tradition here at Slate Money for the Thanksgiving edition to do a kind of philanthropy theme. We had Rob Reich and Jesse Eisinger on last year, or maybe even the year before. I lose count of these things. And this is the week of Giving Tuesday where we all like try and give lots of money away. We're going to talk about what we can and should do philanthropically, especially in the age of Trump. And we have probably the best person to talk about all of this, the one and only Laura Arnold.
D
Hello.
B
Introduce yourself: who are you, and what is your connection to philanthropy?
D
Well, I am Laura Arnold. Thank you so much for having me. I am a philanthropist, as Felix notes. I am the co-founder of the Laura and John Arnold Foundation, along with my husband John. We are headquartered in Houston, Texas, hence my cowboy boots this morning, which Felix admired.
A
They're excellent.
D
We have offices in Washington, D.C. and New York. And we are in the business of changing the country, saving the world, making people's lives better. We do that through systemic change. We try to identify systems that we believe are not functioning optimally for society. And we think through solutions to those problems and how to fix those systems in ways that yield benefits to everyone. And that work has led us to invest in everything from criminal justice to research integrity, education, healthcare, democracy, public finance, and pension systems, so many avenues that we believe can yield enormous benefit to society if we just think through better alternatives as to how to spend public policy dollars.
B
And even before the election you had a whole anti-corruption arm of your foundation.
D
Yes. I mean, I don't know that I would call it an anti-corruption arm. One of the main vehicles through which we hope to achieve policy change is data: the propagation of good data, the aggregation of data so that we can identify trends, and transparency, understanding whether or not people are aware of issues that are occurring in their communities. And so the corruption work is really an outgrowth of that: to shed light on practices that people may not realize exist, that aren't in the public limelight, and that we believe should be.
B
So, Cathy, let's start by nerding out. We are going to talk about Trump and philanthropy, but let's start off with this.
C
So, I mean, I guess one of the things I know the Arnold Foundation for is the work with Anne Milgram and pretrial detention.
D
Yes.
C
Could you talk a little bit about how that works?
D
Sure. I'll start by giving you a little bit of background on how we got into that work. As I mentioned, we are interested in fixing systems. We're interested in understanding what is at the root of a system malfunction. We quickly gravitated to criminal justice because it's an area that, first, is under-resourced. Second, frankly, at the time when we started, the work was very much not in the public limelight, and we believed that there was enormous opportunity and, more importantly, massive, just massive injustice. We took probably about a year to think through every aspect of the criminal justice system, from arrest to reentry to recidivism to workforce training. There are so many options within criminal justice. So we started unpacking those and trying to understand where we believed the greatest opportunity was.
C
So what do you think the single biggest problem with the justice system is?
D
I don't know that there's one single biggest problem. I think, to embrace that ambiguity: injustice. I mean, honestly, inequity, inefficiency. That's more than one biggest problem.
C
Is that code for racism or classism?
D
I think certainly it manifests as racism in many, many instances. So we very quickly, when we started looking at the data, honed in on something that we thought was incredibly unjust and that really needed to be reformed. And that was the pretrial justice phase. And by pretrial I mean the time period between the time somebody is arrested, for whatever reason, and the time that person is tried. Now, as a lawyer myself, I'm a lawyer by background, you know, you always have these idealistic concepts of how the justice system works. And one of the main things that you learn even in middle school is that in our criminal justice system, you are innocent until proven guilty. That is, you know, a bedrock principle of how we think about our criminal justice system. But in practice, if you are arrested, you go before a judge, and that judge, typically, in most jurisdictions in this country, will take a look at you and will say, okay, you know, Cathy O'Neil, you've got blue hair, and, I don't know, bail is set at $10,000. Bang. And if today is, whatever it is, November 22nd, your trial date is April 24th. And then we have to think about what happens to Cathy between today, November 22nd, and April 24th. That's a very long time. And in my mind, as a lawyer, I think she's innocent until proven guilty. That should be the dispositive factor. And as a judge, if I'm the judge, all I should be thinking about is two things. First, is Cathy going to show up on April 24th? And second, is Cathy going to do anything bad? Is Cathy going to do anything violent? Is Cathy going to commit a crime between now and then? Those should be the operative factors that decide what I do with Cathy as a judge between today and that date. But instead, what happens in many, many jurisdictions, not all, is that the judge refers to a fixed bail schedule, looks at you, makes some value judgments as to whether she likes blue hair, right, whether or not she thinks that you look threatening, and comes up with a bail amount that, if you happen to have the money, you probably would make, and you would leave and show up, or not, on April 24th. So we started looking at this system, and then we started looking at the results of the system. And we saw that this kind of discretionary and almost automatic judgment by judges results in massive numbers of individuals who are in jail only because they could not make bail. Not because they are not likely to appear on April 24th, not because they are potentially violent or potentially have a likelihood of committing a criminal act, but only because they couldn't pay. And sometimes they can't. Sometimes they can't pay the, whatever it is, $10,000 bail amount. Sometimes they can't pay the 10% that is required by the bail bondsman. So we're talking about very, very poor people.
C
Yeah.
D
So.
C
And like, the story of criminal justice, as I learned researching my book, is really a story of doubling and tripling of biases. So you have the bias of the bail, as you said: people who can't afford $10,000, or even $1,000, just get stuck in places like Rikers for months and months. But then you have the issue that actually, as a white educated woman, not only could I pay that bail, but I probably wouldn't be in that situation in the first place, and I'd have a lawyer with me who would, you know, talk about how great I am. So there's all sorts of effects going on at the same time, but you're right.
D
No, absolutely, absolutely. But if you look at the headline figures on who is in jail at any given time, the figure is, on any given day, around 700,000, maybe a little bit more, people who are in jails awaiting trial. The vast majority of those people are nonviolent. A majority of those people are minorities. Right. They are nonviolent offenders who are only there because they can't make bail. So to answer your question, I'm sorry.
C
I was just going to ask you, like, so how are you addressing this with data, number one and number two, like, how do you get access to that data? Because that's data I don't have access to.
D
Yes. So to address this question, we looked at this data. We looked at the headline figures, which were upwards of 700,000 people in jail on any given day. The vast majority of them are nonviolent offenders. Many, many, many of them are minorities. So this qualifies as a suboptimal outcome. Right. So this meets the criterion of something that we would like to change as a system. So we started thinking through how we could add value to this exercise and how we could change that policy. What we decided to do was work on a risk assessment system that was data-based, that would serve as a tool for judges to have some sort of data point to refer to in terms of that person's likelihood of failure to appear and likelihood of recidivism, which is really what the judge should care about. So we started thinking through how we can move the system more towards caring about what we should care about, which is recidivism and failure to appear, and less on ability to pay, which is what the system is based on now and which, as Cathy noted, has all sorts of biases and racial implications. So we went about this work. We found some very talented data scientists within the criminal justice space. We leveraged our own relationships with the Department of Justice and with local and state criminal justice systems to have access to data on a confidential basis and start looking at cases retroactively, cases that are already closed, to see if we could isolate and determine which factors were most predictive on those cases that already occurred. So we know whether the person showed up, we know whether the person recidivated. Could we isolate some factors that were most predictive of failure to appear and recidivism? And could we construct some sort of risk assessment that would be a pilot, that would start the conversation towards risk assessments in the future?
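The retrospective exercise described here can be sketched in a few lines of code. This is purely illustrative and not the actual PSA methodology: the case records, the factor names, the predictiveness threshold, and the one-point-per-factor scoring rule are all invented for the example. The idea is simply to measure, on closed cases, how much each binary factor separates failure-to-appear (FTA) outcomes, then fold the strongest factors into a crude points-based score.

```python
# Hypothetical sketch: on closed cases, measure how predictive each
# factor is of failure to appear (FTA), then build a simple
# points-based risk score from the clearly predictive factors.
from collections import defaultdict

closed_cases = [
    # each record: binary risk factors plus the known outcome (invented data)
    {"prior_fta": 1, "pending_charge": 1, "under_23": 0, "fta": 1},
    {"prior_fta": 1, "pending_charge": 0, "under_23": 1, "fta": 1},
    {"prior_fta": 0, "pending_charge": 1, "under_23": 0, "fta": 0},
    {"prior_fta": 0, "pending_charge": 0, "under_23": 0, "fta": 0},
    {"prior_fta": 0, "pending_charge": 0, "under_23": 1, "fta": 0},
    {"prior_fta": 1, "pending_charge": 1, "under_23": 1, "fta": 1},
]

def fta_rate_gap(cases, factor):
    """FTA rate among cases with the factor minus the rate among cases without it."""
    groups = defaultdict(list)
    for c in cases:
        groups[c[factor]].append(c["fta"])
    with_f, without_f = groups[1], groups[0]
    return sum(with_f) / len(with_f) - sum(without_f) / len(without_f)

factors = ["prior_fta", "pending_charge", "under_23"]
gaps = {f: fta_rate_gap(closed_cases, f) for f in factors}

# keep only clearly predictive factors, one point each (a crude, invented rule)
predictive = [f for f in factors if gaps[f] > 0.25]

def risk_score(case):
    return sum(case[f] for f in predictive)

print(gaps)
print(risk_score({"prior_fta": 1, "pending_charge": 0, "under_23": 1}))
```

A real tool would of course be fit on hundreds of thousands of cases, validated out of sample, and scored on a calibrated scale, but the shape of the exercise, closed cases in, predictive factors out, is the same.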
A
So one thing we talk about on the show, and Cathy especially talks about, is how you're essentially building a model, and how models sometimes accidentally encode unintended consequences, especially if you don't have a good way to measure whether or not something's working and revise it. So I guess my question when I hear about a project like this, which has, you know, amazing intentions, is: how do you know long term if it's working, and how do you revise it if it's not? How do you calibrate it going forward?
D
This is something we think about all the time, Jordan. And we're very conscious of the perils of many of the things that you note in your book, Cathy: first of all, of overly relying on an algorithm, right, with no human involvement. We are extremely conscious of latent racial biases or latent injustices that we might be fostering with these very algorithms. And the way that we're addressing those problems in this public safety assessment tool is really by revisiting the data. We're now at the pilot phase, and when we have data to disclose, we plan to make all of the data available. Our algorithms are available, our equations are available, in terms of where the pilot projects are now. But our hope is that we all, as data scientists, as philanthropists, as reporters, as concerned citizens, can look at these algorithms and look at these tools and collaborate to make them more fair. But then, when I think about whether or not it is as transparent right now as we need it to be? No, because we don't have the data yet from the jurisdictions where we've started to work. But then I think about what's the alternative. I know the system now doesn't work. I absolutely know that judges are biased. Not all judges, but many judges manifest biases in ways that they're not even aware of. I know that minorities are disproportionately incarcerated because of their inability to pay. I know that most of them are nonviolent. I know it isn't working. So we need to start thinking about alternatives that lead us to a better path.
C
I have a few suggestions. One is, you know, you mentioned that your formulas are available. I didn't know that, but that's great. That's a good step.
D
Yes.
C
Is there also an audit available? Like, do we know what the actual results are? That's the first thing. The second thing is: do we have any sort of high-level meta-analysis of the jurisdictions in which a tool like yours is being used versus not used? So we can say: the system we have now sucks, we know that, and when we have this tool in place, it's actually a better, more fair system. That would give us a lot of, you know, ground truth. And it's something that, honestly, I was told I couldn't study. When I was thinking of working with the White House on this, on an Arnold Foundation-funded project, I was like, I want to look at whether recidivism risk algorithms actually help or hurt given systems. And they're like, that's off the table. So I guess that brings up the larger thing, which is: we want this to get better. Everyone in this room, probably all our listeners, wants this to get better. The thing that we have, and this is more true for foundation work than it is for typical government work, although the Department of Justice isn't a well-behaved entity either, the problem is the lack of accountability. Like, I hope that you make these audits available, but what kind of power do we have as the public to require that?
D
Yeah, I mean, I think all of what you brought up are very good points. Just a small correction: I'm not aware of exactly what Arnold Foundation work you were involved in at the White House. We have not historically worked on recidivism and sentencing and sort of the back end of the system, if you will. So we certainly recognize and agree that there are many, many flaws in, and improvements to be made on, those algorithms going forward. You know, a frustration that we have right now, as we've developed the tool, to be honest with you, is that the regulatory structure under which we can access this data is remarkably complicated. And frankly, you know, it presents some obstacles for us to disclose the data. And there are some good reasons for that, and some reasons that are probably, you know, not as good. Right. There are justifiable privacy concerns as to disclosing data on a personalized basis about people who have been arrested. There are extraordinary concerns in terms of, you know, who accesses that data and for what purpose. So much of what we're doing now is trying to work through those issues in a very deliberate way with the jurisdictions in which we work, so that we can create, for example, data hubs where we can anonymize the data and make it available to people like you, people who have the ability to make suggestions and add value to what we're doing.
C
I understand, and I totally agree that a lot of this data is extremely personal and we wouldn't want it out there. But I do think that you could publish audits with like, here's the methodology of this audit without exposing the data.
D
And we will absolutely do that. I mean, we are in the business of making people's lives better, and we want nothing more than collaboration from anybody. So I hope that you personally will audit our algorithms. We would love to have you involved in any capacity that you're interested in. We have a number of specific projects on the public safety assessment to look at discrete issues in how it's evolving and how it's changing as it's being implemented. We have a group looking at the potential for racial biases and, you know, discriminatory outcomes. We have another team looking at the algorithm itself to understand whether we're weighting the factors in the right way, whether the nine factors that we actually use are, in fact, the factors that we should be using. I mean, all of this, as you know, is an iterative process.
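The kind of audit being discussed here can, as Cathy suggests, be published as aggregate metrics without releasing any individual-level records. A minimal illustrative sketch, with an invented score threshold, invented group labels, and invented records, comparing false positive rates for a "high risk" flag across two groups:

```python
# Minimal sketch of an aggregate fairness audit: among people who did
# NOT recidivate, what fraction of each group was flagged high risk?
# Only the per-group rates, never the underlying records, get published.
HIGH_RISK = 2  # hypothetical score at or above which someone is flagged

records = [
    # (group, risk_score, actually_recidivated) -- invented data
    ("A", 2, False), ("A", 1, False), ("A", 3, True), ("A", 2, False),
    ("B", 2, False), ("B", 0, False), ("B", 1, False), ("B", 3, True),
]

def false_positive_rate(rows):
    """Share of non-recidivists who were nonetheless flagged high risk."""
    negatives = [r for r in rows if not r[2]]
    flagged = [r for r in negatives if r[1] >= HIGH_RISK]
    return len(flagged) / len(negatives)

audit = {}
for group in sorted({r[0] for r in records}):
    rows = [r for r in records if r[0] == group]
    audit[group] = round(false_positive_rate(rows), 3)

print(audit)  # only these aggregate rates leave the building
```

A gap between the two groups' rates, as in this toy data, is exactly the kind of disparity an audit team would then dig into, and publishing the methodology alongside the aggregates lets outsiders check the work without touching the raw data.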
C
Can I make one more comment? Felix, I know you want to end the topic, but I just want to make one more comment, because there is a lot of racial bias in a lot of these things, and we won't be surprised if we find it. But I'm starting this new campaign with respect to racial bias in algorithms, which is: when we find it, to use it as an opportunity to understand causality and intervention. So if we find that black people are much less likely to show up at their court date, we might say, what is keeping them from showing up at the court date? Do they not have a ride? You know what I mean? We could actually think of that as an opportunity, to be an optimist for this episode, to, like, close those disparities.
D
Absolutely, absolutely. And we already do some of that. I think one of the very important ancillary benefits of this work is that it's forced jurisdictions to take a look at their data, understand what they have and have not been collecting, and ask precisely those questions. So, for example, we have some pilots where we send people text messages saying, hey, your court date is April 24th. You know, we'll send that text message on April 21st, because sometimes somebody who has three jobs and three kids and is living paycheck to paycheck doesn't have an iPhone to, you know, click on her Outlook Calendar app and put April 24th in her calendar. So we are extremely sensitive to everything that you mentioned, and we are working on each of those things.
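The reminder pilot just described is mechanically very simple, which is part of its appeal. A toy sketch of the scheduling step, where the three-day lead time and the message wording are assumptions for illustration, not details of any actual pilot:

```python
# Toy sketch of a court-date reminder: given a hearing date, compute
# when the text should go out and what it should say.
from datetime import date, timedelta

REMINDER_LEAD_DAYS = 3  # e.g. an April 21st text for an April 24th hearing

def reminder(name, court_date):
    """Return (send date, message text) for one person's hearing."""
    send_on = court_date - timedelta(days=REMINDER_LEAD_DAYS)
    text = f"Hi {name}, reminder: your court date is {court_date:%B %d}."
    return send_on, text

send_on, text = reminder("Cathy", date(2016, 4, 24))
print(send_on)
print(text)
```

In a real deployment the dates would come from the court's case-management system and the messages would go through an SMS gateway, but the intervention itself is just this: a date subtraction and a friendly sentence.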
B
So Cathy's raised a bunch of really interesting issues here, and I totally want to pick up that ball and run with it in the next segment. Cathy, you were talking a bit about accountability. And this is a fascinating question for me, Laura, especially as regards your foundation. We had Rob Reich on here, who's written a lot about both the upside and the downside of the fact that foundations are basically completely unaccountable to anyone. Cathy's asking for certain levels of accountability. You're saying, well, yeah, we believe in transparency and we are going to do this, but that's an entirely voluntary thing that you're doing, sort of because you think it's the right thing to do and not because you have to do it. And at the same time, the fact that you don't have public accountability, and you can do things that might be unpopular in certain jurisdictions and just do them anyway, whether people like them or not, is part of the driving force behind your foundation. You're like: because we are individuals and we don't need broad public buy-in, we get to do kind of the stuff that other people might not be willing to touch.
D
I understand the spirit of your question. I vehemently disagree with your characterization. We do not go into any jurisdiction where we're not asked to enter. We have a 200-jurisdiction waiting list on our public safety assessment work. Even in the work that you rightly note is unpopular, perhaps, in the big scheme of the national debate, for example, our pensions work, we've never entered a jurisdiction where we have not been asked to enter. We don't view our role as sort of top-down, you know, "we know more than the state of Alabama as to how it should run its prisons, for example, or how it should run its pension system." We view ourselves as a resource to suggest policy ideas that we think are an alternative to the status quo, and a superior one.
B
Let's talk a little bit about what happened in Baltimore, since this got a lot of headlines.
D
Yes.
B
Is that you went in and you basically paid for a plane, correct me if I'm wrong here, to sort of fly around in the sky above Baltimore and take photographs every second, and create a record of basically where everyone and everything is.
C
Persistent Surveillance System, I think it's called.
D
Yes, it's called persistent surveillance. The company is called Persistent Surveillance Systems. So let's take a step back, and I just want to make the note that our work in criminal justice is very much focused on figuring out what works and what doesn't work. And that manifests in any number of ways. So we want to understand what is effective and not effective in policing, what yields good results and what doesn't yield optimal results. In a world of limited resources, we have to make that, you know, that value judgment. Right. And I don't want police departments to make that judgment without data. So we learned of this technology actually through a podcast. I think it was Radiolab. I think Radiolab did an interview with the head of Persistent Surveillance Systems, and we thought, this is sort of interesting. He had enormous success in Juarez, Mexico, in addressing a real crime crisis in that area, and the police authorities in Juarez found the technology enormously useful. So we said to him, if you find a jurisdiction in the US that wants to try it out, wants to see whether or not it yields good results, we'll pay for it. Persistent Surveillance Systems had, I think, one or two jurisdictions where they were considering operating in the United States. They reached out to Baltimore, and Baltimore was interested. Baltimore at the time had, and still has, a rising rate of murder, something like a 70% increase in nonfatal shootings, a murder clearance rate of something like 30%. I mean, abysmal, right? It was in a state of crisis. It was looking for alternatives to try to solve the crime realities it was facing. So Baltimore was interested, and we said we'd fund the pilot, and that's what we did. And as you rightly note, the exercise was a plane: a plane with some cameras on it flies over the entire city of Baltimore during the daytime and takes one photo per second.
And the idea of the technology, which, by the way, has not been proven, I don't know if this works. This could work. This could not work. This could be worth the money, or not. That was the point of the pilot: to understand whether or not this added value. But the hope is that when a crime occurs, and we're talking about violent crimes and shootings, we're not talking about a pickpocket. The city of Baltimore is much too busy to use this technology for a pickpocket, and by the way, the technology is too expensive to use for that purpose. But when a crime occurs, the photo technology, in conjunction with street cameras, which already exist throughout the city, can pinpoint where the shooting occurred, where the shooter came from, and where the shooter went. And the hope is that that is just another tool to help the police solve some of these violent crimes.
B
So obviously, this was, you know, this raised the hackles of certain civil liberties types, and it was also done without, as far as I could make out, anyone in Baltimore really, outside a small part of the police department knowing about it.
D
Right.
B
And I think now especially, you know, since November 8th, the concerns around civil liberties and public security, and the intersection between those two, are very much front of mind. So how much were you thinking about this from a civil liberties point of view? And do you think you would do it again now, in the age of Trump, in that kind of without-anyone-knowing-about-it way?
D
Well, first of all, we had no role in, or idea as to, who did or did not know about it when it was implemented. We have hundreds and hundreds of grants. We don't micromanage grantees, so we were not involved in communications as to whether or not to keep it secret, if you will, or to not disclose it to the community at a given time. That was a decision that was made by the police chief of Baltimore without, really, our input. And we don't really view our role as, you know, micromanaging a grantee in the first place.
C
I mean, you could keep Persistent Surveillance Systems accountable, since the, you know.
D
Yeah, but I mean, I don't know that they're not accountable, because ultimately this is a pilot. Right. So I think the rationale that the police commissioner was adopting was: let me see if this works. Let me see if this is something that I want to devote police department dollars to. Because I'm not going to pay for it for time immemorial; I'm paying for the pilot. As a philanthropist, I want to know if this is a tool that is useful to you, law enforcement, in light of your critical moment in crime fighting. Is this something that you would want to use, given your finite budget? You'll have to allocate funds away from something else in order to pay for this. So it wasn't a question of it being kept secret for time immemorial. It was a question of: will it work? And if it works, then we have the discussion. But the police commissioner, I assume, was thinking, let me see if it works before I have the fight.
A
So, I don't know, this isn't so much a question as a comment, but I'm thinking about how you said that you never enter a jurisdiction, you never go to a city or state, unless they want you there. Right. But one of the issues with philanthropy is that sometimes the presence of money can make a bad idea look like a better idea, or even a good idea. It's like, well, we might not have done this if we were spending our own money on it, or we would have had to think through the political implications more carefully. But the money's there, so we might as well take it and try it. And I think Newark Public Schools, with Zuckerberg, is sort of maybe the ultimate example of that we've had recently. But I guess that strikes me as a little bit of a concern. I'm wondering if you have a response to that. Is that something you guys think about, that the presence of money could just, you know, influence things?
B
But that's the whole point, right? You get people to try things that they wouldn't otherwise try.
A
But if you're saying you're only going somewhere where they want you.
D
That's right. Well, we're only going somewhere where they have an appetite, where they've identified that there's an issue that they need to address, where they've identified that the current approach to that issue is suboptimal and maybe even detrimental, where they're open to different approaches, and where they've committed that if those approaches prove to be successful, they'll implement those approaches into their policy landscape. So, Jordan, your concern is that we go into a city and say, hey, free money, we think that you should do, you know, this, that and the other with your jails, and, you know, it'd be nice to have this training program, for example, and we'll pay for it. We never do that. What we would do is say: are you satisfied with your recidivism rates? Are you satisfied with your homelessness programs? Would you be willing to try an alternative? Let me cap your downside. So this is all the pay-for-performance work, for example. Let me cap your downside and let you try something that you may be too risk-averse to try. And if this intervention proves to be more successful and better for people than what you currently use, you commit going forward to shift resources to that intervention.
C
Yeah. So I'm going to jump in on this issue, which is, I feel like one of the things I talk about in my book a lot, and we talk about on this show, is this idea that when you have these massive new kinds of technology, which is what we're talking about with surveillance, the people who pay for it, the people who own it, get to define success. And obviously you're defining success as, like, we identify murders, or something like that. But there are also costs which are not being measured. And that's obviously what we're worried about when we talk about how nobody was notified. Nobody in the public was asked: do you want to be surveilled from an airplane above you? That didn't happen. So the costs were not measured, but the potential benefits were measured. I mean, I'm being strong in my statement. But the point is, that is exactly what I worry about, and I think most people worry about, when we talk about surveillance and technology: who defines success, and to what extent are you ignoring costs, and whose costs?
D
Well, so I'll give you two answers. First, I do define success in that way, but I'm not the ultimate arbiter of success. Again, in the Baltimore example, the technology will need to be adopted by the police department, will need to be approved by the mayor, if in fact it's going to be used, and in conjunction with that there will be a public debate. So people can say, I don't want to be surveilled. Communities that are the victims of high rates of crime, as actually was the case in Baltimore, can say, no, actually, I want this surveillance, because I am concerned about my neighborhood. And we should have that discussion. I think the discussion is constructive. We have said time and time again that we encourage that discussion. It is not my role to make that value judgment for any community. I view my role as providing the option: this is what it can do. Do you want to make that trade-off? Make that trade-off. If you don't want to make that trade-off, that's great.
B
You're married to a commodities trader who understands the value of options. The whole point about options is they only go in one direction. You have this baseline of no surveillance, and then you add an option of more surveillance. And the only way that that can ever move is in the direction of more surveillance. So, like, once you've got the option, it's going to wind up getting implemented somewhere. And we've definitely seen that with the growth of the NSA: as the amount of surveillance that the government can do on its population has grown exponentially, given the amount of data that it's had access to, the amount of surveillance that it has done has grown exponentially. The ability to surveil seems to lead inevitably to more surveillance.
D
And I take that point, but I view this as a community decision that represents a value judgment on a trade-off of resources. Do you want to hire more police officers, or do you want to buy this technology? Or there are many technologies that the police department may be using now that maybe it chooses to abandon in favor of this technology. So I guess I don't know that I agree with the sort of meta discussion of where this could lead. I think that, from a civil liberties perspective, we have to debate this. I mean, I say this as one of the, you know, top funders of the ACLU nationally; I am the co-chair of its centennial campaign. I am extremely committed to these issues. I want the debate. I want the ACLU to challenge this, because I want to know whether or not this is something.
C
Well, agree on them.
D
I was gonna say, let's have the discussion, but let's have the discussion based on data. Let's understand whether or not this tool works and whether or not we're willing to make that trade-off. And maybe the answer is no. And that's great. If the answer is no, it's no.
A
It's interesting, because you say it needs to be a community decision, and that inherently raises the question of who in the community is deciding. And that gets to really deep issues, especially in a city: who votes, who doesn't vote, which part of the community is actually represented at the government level? And then we talk about making the decision based on data. Especially after this last election, we've seen the extent to which any democratic debate really happens based on data, and I think my faith in that has been shaken, too.
C
Like, all data is biased, right? In other words, and I'm gonna be really harsh because that's my mood right now: basically you're giving money to one side of that data conversation, and you're letting them create evidence that they're right. I'm not saying that they're wrong. I'm just saying it is not a fair fight.
D
What data would you like to see on what you perceive to be the other side?
C
The, like, degradation of your quality of life as you are surveilled. Which, by the way, is hard to document; that's not easy data to collect or measure.
A
Well, yeah, I was gonna say that strikes at a really important point, which is that there are some aspects of community life that I don't think you can even measure. And that applies to many, many things.
D
So I'll give you that point. Let's say that there is enormous degradation in quality of life because you were surveilled. So have that discussion: you mobilize people who are like-minded, have that debate at a local level, and say, no, we don't want this in our community, which is what happened in other jurisdictions in which this technology was considered. And I think that's a very healthy discussion. I don't have a view as to whether or not Baltimore should adopt this technology, but I do think it's important to maximize the data we can gather as to whether or not it's worth doing.
B
I hinted at this already, but I want to ask you directly: we are in a very different world now, a world which, it's incredibly important to remember, is not normal. And there is this imperative, certainly in our journalistic world but I think on society as a whole, not to normalize some very extreme and problematic behavior coming out of the White House. In that context, how does your philanthropy, and how should philanthropy in general, change? How have you reacted to the election result, and how has that changed things? You've been working with government very closely. You've been invited in by government. You've been happy to sign confidentiality agreements around government data and that kind of thing. What happens when the government starts being something that you have to be worried about rather than a potential ally?
D
Yeah, well, first, I don't think it's a surprise to anybody who knows me that I had my moment after the election where, from a personal perspective, you reflect on the reality of your country. But I don't have the luxury of dwelling on that as a philanthropist. We are in the business of making the country better. We're in the business of maximizing opportunity and minimizing injustice. We need to move forward even in light of individuals who we find distasteful, dishonorable, deplorable. Right? We need to move forward. This is not the first time, and it won't be the last, that we will need to build coalitions with people we don't like and people we don't agree with. This is what we did with the Coalition for Public Safety: we partnered with the Koch brothers, the Ford Foundation, and MacArthur to create a bipartisan coalition to move forward issues that we thought needed to be moved forward, where we could coalesce around reform and solutions.
B
So you continue to believe that basically nothing about the way you conduct your philanthropy has really changed, and that it shouldn't change?
D
No. I don't think that channeling resources toward, to exaggerate, a fight against the apocalypse is the way that we need to be viewing philanthropy. Much of what we do, all of what we do, is creating opportunities through bipartisan reform. And I do believe that we still have those opportunities.
B
So this is where I want to invite Jordan in, because we've been getting a bunch of emails from listeners about what they've been doing in the wake of the election. I can certainly speak personally: I have sort of reconfigured my own giving. And I feel that personal philanthropy and institutional philanthropy are two incredibly different things, and I think this is entirely reconcilable. But on a personal level, Jordan, at least from our anecdotal email sample, people are really trying to react to the election on a philanthropic level.
A
Yeah, well, I guess I can throw out a few numbers. I also have a question, kind of pinging off your last comment. We got 32 emails from our listeners, which was lovely. And maybe not surprisingly, because we name-checked them, the most popular organizations were the ACLU and Planned Parenthood. One thing that I did notice was that the Southern Poverty Law Center was another very popular one.
D
All of which are grantees, either personally or through the foundation.
A
The Brennan Center for Justice came up. And one thing I noticed among the responses we were getting from listeners is that it was less about traditional charities, things like giving money or food to the poor, and more about organizations aimed at helping civil society: dealing with civil rights, with things like abortion rights, or protecting minorities. And I think there is a deep concern about that right now. So I guess my question is: there's been a big trend in philanthropy in general, and you're part of this, to focus in part on issues where the outcomes can be very much quantified, where you can measure the number of lives saved and things like that. Do you think there needs to be any shift in emphasis toward less quantifiable things, like building up civil society and institutions, and making sure that those can survive in the age of, I guess, he whom we can now name: Trump?
D
This is an issue that I care a great deal about, Jordan. It is not squarely within our mission at the foundation. But, of course, my philanthropy and John's philanthropy is not limited to what we do at the foundation. As I said, we're a top funder of the ACLU, a top funder of Planned Parenthood, the Center for Reproductive Rights, the Brennan Center, all of these issues. There is absolutely a role for philanthropists who have a civic conscience to think through the civil liberties implications of the moment in which we live. With respect to what that means as a concrete matter in terms of the fight, the way that I approach it, at least, is to bolster those organizations in the same way that, frankly, we've bolstered them historically, so that they stand ready to challenge when and if regulations or decisions come down the pipeline that we think are detrimental to civil society. So long as we all invest in the checks-and-balances system, which is how I view the ACLU, by the way, as a check on power, a check on these incremental exercises of power that may or may not be favorable, so long as we have a strong network of checks and balances, that provides comfort to me that civil liberties are being represented.
A
I mean, you're more tuned into the philanthropy world than any of us here. Do you think there is a sense among other philanthropists that this is more important now than it was three weeks ago? Do you think these kinds of organizations need more support than they did?
D
I think some organizations need more support than they did. There's always the question of initiating legislation versus doing the blocking-and-tackling work. So, for example, in reproductive rights, most of the work continues to be done in the states, right? The blocking and tackling of regulations that continue to come up in legislatures and need to be struck down or blocked from implementation. I don't know that a Trump administration necessarily is going to impact that work; that work has been ongoing for years, and it needs to continue. I think philanthropists, and individuals just as citizens, are all galvanized by concerns about what might be to come. And I guess my broader point is that we can either be paralyzed by those fears and channel resources prospectively in preparation for some attack, or we can maintain the strength of those institutions, continue to support them in the ways that they need to be supported, and get to work on saving people's lives, because there's so much work to do and so much opportunity.
C
In that light, I would like to propose two technological projects that I would love to see happen. Because everybody who can afford it is giving money to the ACLU, and that's wonderful, or to the other different places, or they're investing in the free press and things like that. But there's actually more energy there than just donations. People want to be part of a movement; they want to be organizers, not just donors. Right? So here's an idea I was thinking about: networks that work through apps on phones that would help connect women who need abortions with women who are willing to pay for the abortion, or offer their house for a night or three to women who need to come to a different state, or pay for the travel, or pick people up at the airport. I know so many women in my social group in New York City who are like, I would give money for that, I would let them stay in my guest room, blah, blah, blah. But we don't have a way of connecting. So that's something we could do with crowdsourcing, with an app. That's just one idea. Another idea is for people who are being deported, because we worry about that: five activists should get arrested for each person deported. Like, make a fuss, get arrested, bring press along. That could also be done with an app, with people who are aware of what the movements are.
B
Which, I have to say, in terms of the presumption of innocence that Laura was talking about earlier: there was a man, I can't remember who it was, who walked into a meeting with Donald Trump with a set of proposals, which got photographed as he was walking in. I saw that thing, and the first thing on the list was that all undocumented aliens who even get arrested automatically get deported. It is a complete presumption of guilt. If you're guilty of something, you get deported; but even if you're merely arrested, even if you're completely innocent, we'll deport you anyway.
D
Yeah, that's.
A
Which is kind of Kris Kobach's world we're gonna be living in.
B
Terrifying. But let's, let's wrap this up because I feel like we need to get onto a numbers round. Cathy, do you have a number?
C
I do: 25%. It turns out that 25% of people admit to self-censoring online to avoid intimidation and harassment. That's according to a new report by Data & Society.
B
That seems very low to me.
C
Well, 50% of people say that they have been harassed or intimidated online. So 25% said it had such a big effect on them that they actually stopped saying certain things, which is really scary.
B
Well, I count myself part of that 25%. There are certainly things which I don't say online because I'm like, it's just not worth it.
C
Right?
D
Yeah. My number is $3,500. That is the amount of bail that a gentleman named Gilbert Cruz couldn't pay when he was charged with a misdemeanor. He spent nine weeks in jail, his life was destroyed. He lost his job, his car was repossessed, he lost his housing, and his case was dismissed. He was innocent until proven guilty, but he spent over two months in jail. And that is emblematic of what happens throughout the country.
B
That was in Texas, that was in.
D
Harris County, in Houston, Texas.
A
So Felix had me share what was going to be my number before the numbers round; it was going to be the 32 emails. So instead I'm just going to come back to that and read a couple of my favorite emails from listeners. My favorite personally came from Carol, who, explaining what she did in reaction to the election, said: "I gave money to Planned Parenthood in honor of Mike Pence and had the donation certificate sent to the Governor's Mansion in Indiana."
D
That's great.
A
That was one thing. Another was a quote that one of our listeners sent, from George Kirstein: "Apart from the ballot box, philanthropy presents the one opportunity the individual has to express his meaningful choice over the direction in which our society will progress." And I bring that quote up because in one of our last episodes, I talked a little bit about how the ideas of philanthropy and the welfare state are sometimes set in opposition to each other: we can have less of a welfare state if we have more philanthropy, or vice versa. I think we have to think of them as complementary. When we call for more giving and for building up the community, I hope no one out there thinks that we're in some way suggesting we shouldn't also be fighting to maintain what the government does for the poor and the needy.
B
My number is 122 billion, which is the number of pounds that the UK government is now going to have to pay for Brexit.
C
Oh, my God.
B
That's about 150 billion dollars. This came out in this week's budget, and it's way higher than even the worst-case estimates that people had been expecting. This is the new UK budget: the deficits between now and 2020, as projected, have risen by 122 billion pounds, basically just thanks to Brexit.
C
That's. We've come a long way from that bus.
B
Yeah.
C
Yeah.
B
We are going to save £350 million a week. It's really... you're not going to save...
C
Damn bus.
A
Oh, man. Actually, you know, I have one other number I realized I need to do: a correction. Last week I had a brain fart and said there are only about 600 million people in India. There are closer to 1.2 billion. I apologize.
C
By a factor of two.
A
Yeah, I don't.
B
He was only mentioning the women.
A
Yeah, I was. And so, well, that is important, because I was saying that about half the country is supposedly unbanked, which would mean that a population the size of the United States is unbanked there. It's more like a population double the size of the United States that would be unbanked if, in fact, 50% don't have any kind of financial services available to them.
B
So, yes, I think that's it. There was so many other things I wanted to talk about and we couldn't fit in. But that just means that Laura's going to have to come back.
D
Absolutely. It would be my pleasure. It was an honor to be here with you. Happy Thanksgiving.
B
Happy Thanksgiving to all of you.
A
Thanks for coming.
B
Write your checks on Giving Tuesday to any or all good causes that you can find. I love this, what do you call it, institution; I think it's a sort of national institution now. And we will continue to interrogate these issues on Slate Money, which you can listen to by subscribing to our show. Subscribe to all of the Panoply shows; they're found at itunes.com/panoply. I do need to thank Veralyn Williams and Mary Wilson, who produced the show this week, and also the executive producers Steve Lickteig and Andy Bowers. But most of all, thank you very much to Laura Arnold, who has come all the way from Houston, Texas.
D
Yes, sir.
B
To be with us today. Thank you. And we will talk to you next week on Slate Money.
D
Sam.
Release Date: November 26, 2016
Host: Felix Salmon (Fusion)
Panelists: Cathy O’Neil (author, Weapons of Math Destruction), Jordan Weissman (Slate), Laura Arnold (Co-founder, Laura and John Arnold Foundation)
This special "Optimism Edition" of Slate Money takes place around Thanksgiving and focuses on philanthropy: what it can—and should—be doing to improve society, especially in the wake of Donald Trump's election. The episode features an in-depth conversation with Laura Arnold, a leading philanthropist, about systemic change, criminal justice reform, data-driven interventions, surveillance and civil liberties, and the challenged role of philanthropy in politically uncertain times. Throughout, the hosts bring skepticism, tough questions, and moments of genuine hope.
"We are, we are in the business of making people's lives better, and we want nothing more than collaboration from anybody." – Laura Arnold (17:21)
"It is not my role to make that value judgment for any community. I view my role as providing the option." – Laura Arnold (30:50)
"We need to move forward even in light of individuals who we find distasteful, dishonorable, deplorable." – Laura Arnold (37:05)
| Segment | Timestamp |
|------------------------------------------------------|---------------|
| Panel introductions, theme of optimism | 00:08 – 01:56 |
| Laura Arnold lays out foundation strategy | 02:01 – 03:17 |
| Criminal justice reform discussion begins | 04:07 |
| Problems with pretrial detention and bail | 05:12 – 09:45 |
| Data, algorithms, and transparency | 11:56 – 18:09 |
| Foundation accountability debate | 19:34 – 21:49 |
| Baltimore surveillance pilot controversy | 21:49 – 35:10 |
| Civil liberties and philanthropic choices | 25:16 – 35:10 |
| Philanthropy post-election, activist energy | 35:57 – 44:12 |
| Suggestions for activism apps | 44:12 – 46:11 |
| Listener feedback & philanthropy as activism | 39:25 – 44:12, 48:05 – 48:59 |
| Numbers round (bail, censorship, Brexit cost) | 46:24 – 49:43 |
The conversation is earnest, probing, and occasionally contentious, with Felix mediating between Laura Arnold’s optimism, Jordan’s skepticism, and Cathy’s technical criticality. The language is direct and informal—full of expert knowledge but grounded in accessible examples and real listener concerns.
This episode offers an illuminating look at how philanthropy aims to change failing systems, grapples with the unintended effects of data and technology, and must navigate legitimacy and democracy in tense political times. You'll hear real debates on accountability, surveillance, racial justice, the limitations and possibilities of giving, and how individual and institutional actions interlock to form civil society.
"We need to move forward even in light of individuals who we find distasteful, dishonorable, deplorable. Right. We need to move forward." – Laura Arnold (37:05)
This encapsulates the episode’s underlying message: that optimism isn't naïve, but a form of determined engagement—even (and especially) in anxious times.