A (14:58)
Trust and safety is such an interesting field because it is, in part, so old and at the same time so young. The term "trust and safety" literally dates back to around 2000, and I want to say eBay was the company where it was first coined. It makes sense that eBay would be one of the earlier companies worrying about this: people didn't want to buy a product on eBay and then never have it shipped to them. That's a trust issue. The sale of illegal goods is something eBay had to be concerned about. That's a safety issue. So it feels natural that eBay would be an early thinker in this space. And I do think they hit a lot of the right buckets. Like Danielle and Ari said, you got it mostly right from my personal experience: from the outside it probably feels very reactionary, but usually what's happening is that there are people internally who care about a particular issue and can't get attention within the company until something external happens. Once the external thing happens, that's when the company actually takes action. A lot of the issues we've seen have their precursor in a couple of employees saying, "Hey, this is a problem, we should deal with it," and then advocating for that internally until finally there's an opportune moment in the public when they can say, "Hey everyone, remember that thing we've been yelling about for a year? Here it is, getting us negative press. Can we please do something about it now?" Like I said, it feels very reactionary from the outside, and it's easy to be cynical and say the companies only do something when they're getting negative press or when advertisers are complaining.
And I can definitely understand that sentiment. But there is some foresight that goes into it, and there are people proactively thinking about it. I like to point to the first public example of trust and safety or integrity thinking, which was actually Larry Page and Sergey Brin in the PageRank paper from 1997 or 1998. They talk about how, in the future, ranking number one on a search engine is going to be really, really valuable. So if you're running a search engine, you really have to worry about bad actors who will try to figure out how to manipulate the system to get to that number one spot, because it carries a lot of value from a marketing standpoint. They were concerned about it from a marketing standpoint, which covers a lot of the activity, though certainly not all of it. So they're talking about that in 1998. Fast forward to 2005, when one of the first search engine integrity issues comes about, at least one of the first political ones: the "miserable failure" Google bomb. A bunch of bloggers came together, took the phrase "miserable failure" as their anchor text, and linked it to George W. Bush's biography on the whitehouse.gov website. So if you Googled "miserable failure," the number one link was George W. Bush's biography on the White House site, which by some definitions is one of the first examples of a political influence operation being run online. Page and Brin had predicted that kind of thing to a certain extent. I think another space to really consider is ads and ad integrity, because while Section 230 broadly protected the platforms from liability around the organic content being uploaded to their platforms and distributed, they were not protected around ads.
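The Google bomb the speaker describes works because PageRank-style ranking flows value along links: many pages pointing at one target concentrate rank on it. As a rough illustration only (the graph, damping factor, and page names below are my own assumptions, not anything from the conversation), here is a minimal power-iteration sketch of the idea:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline of rank...
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # ...and passes the rest along its outgoing links.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# The "Google bomb" pattern: many pages all linking to one target.
graph = {
    "blog1": ["target"],
    "blog2": ["target"],
    "blog3": ["target"],
    "target": [],
}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))
```

Running this, the heavily linked page ends up with the highest rank, which is exactly the lever the bloggers pulled (anchor text then determines which query that rank shows up for).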
And especially around the world there is tons of regulation on what kinds of ads are and aren't allowed. So ads have been another place where trust and safety has been a long-standing thing; we just didn't call it trust and safety, we didn't give it the professional name. It was the Ads Compliance team, and it was their job to make sure there were no firearm sales in the UK, no prescription drug sales in the EU, and so on. There are tons of laws around the world that the platforms have to comply with, and so they did build up that infrastructure quite early. But then, as Danielle said, from the 2011-2012 era onwards, the thing that really gets attention is societal-level impacts. There was the Arab Spring, a mix of really positive stories and really negative stories, around 2010 to 2012. There was ISIS and terror groups using the platforms, which takes you into 2014. Then you're into Macedonia, Cambridge Analytica, and the Russian IRA operation in 2016, and then you're off to the races, where from the outside it really feels like the platforms are on the back foot: one new exploit, new vulnerability, or new set of bad actors using the platform to do something bad after another, with the platforms responding as they go. As for my personal story at Facebook, starting in 2016 and ending in 2019: my integrity time started in late 2017, early 2018, when I moved onto integrity projects. Coming out of Macedonia, the IRA scandal, and Cambridge Analytica, there was kind of a beautiful moment inside Facebook when they said, you know what, everyone, we're going to take this really seriously.
Every product team was going to spin up its own version of integrity. So my first integrity team was the Pages Integrity team, which was based off the Pages org. The thinking was: the IRA just used the Pages product to reach over 100 million Americans, so we need to seriously consider what bad outcomes might be happening on Pages. And the company really took it seriously. It was honestly kind of amazing to watch as every different product team spun up integrity teams that worried about any negative impacts coming from their products and took that seriously. And it was great, I think, for a while. I don't know that trust and safety is still there, but we are entering a new era for sure.