Joseph Cox
Hello, and welcome to the 404 Media podcast, where we bring you unparalleled access to hidden worlds, both online and IRL. 404 Media is a journalist-founded company and needs your support. To subscribe and get access to bonus content every single week, go to 404media.co. Subscribers also get access to additional episodes where we respond to the best comments. Gain access to that content at 404media.co. I'm your host, Joseph, and with me are two of the other 404 Media co-founders, the first being Emanuel Maiberg.
Emanuel Maiberg
Hello.
Joseph Cox
And then Jason Koebler. Hello.
Jason Koebler
What's up?
Joseph Cox
All right, I'll give a quick shout out to Jason's interview podcast episode that just went up. People should definitely check that out if they haven't already. Jason, do you just want to give us the very brief overview of what it's about? Because it's pretty crazy what this guy did and why he did it.
Jason Koebler
Yeah, yeah. It's a YouTube filmmaker who mapped the only unmapped city in America on Google Maps. As in, there's one place called North Oaks, Minnesota, that's not on Street View, and it's not on Street View for very specific and weird reasons. Reasons that we get into in the podcast. Basically, he took a drone and flew it around town and encountered some issues. But I found it to be super interesting. I had no idea that this existed. Actually, honestly, he emailed me because he is making an episode about Flock, and so he interviewed me for that, and I was checking out his channel and I was like, oh, this is nuts. I had no idea this was happening. So we did a bit of a tradesies where I'm going on his YouTube channel and he's on ours. But yeah, please check it out.

This episode is sponsored by led. If you find yourself using AI personally or for work, you might also find yourself wondering where all your AI conversations go. How are they stored? When are they actually deleted? As we enter the Quit GPT era, it's a good time to check up on your AI privacy while searching for alternatives outside of the Silicon Valley surveillance machine. LED was built on the same encryption used by Proton and Signal. You get the power of flagship open source AI models hosted in Finland, all without the techbro surveillance exposure. LED employs a multi-layered approach to privacy, from end-to-end encryption to German-owned and controlled data centers with zero exposure to the US Cloud Act. That's true data sovereignty. Visit E-L-L-Y-D-E-E dot AI today. It's free to try, and their mission plan starts at only $5 a month. That's led.
Joseph Cox
All right, Emmanuel, do you want to take us through this first story?
Emanuel Maiberg
Yeah, I think we actually have a flurry of stories that happened mostly over the weekend, for reasons which we'll get into. But the first one of them is from Joe, and the headline is, I Watched Some Six Hours of DOGE Bro Testimony. Here's What They Had to Say for Themselves. Joe, what are these videos? Where did you find them?
Joseph Cox
So these are videos of depositions from two members of DOGE, Justin Fox, I believe, and Nate Cavanaugh. And they were part of a sort of DOGE operation, campaign, mission, however you want to describe it, to cut a ton of government grants at the NEH. And they went about and they did this, and they were very instrumental in cutting hundreds of millions of dollars worth of grants. And because of that, they are being sued. And I just want to get the organizations correct: they're being sued by the Modern Language Association, the American Council of Learned Societies, and the American Historical Association, who are suing the NEH and a number of other parties, including these two DOGE members. So these video depositions were recorded as part of that and then also just put on YouTube, which I feel is somewhat rare. You know, like, we have the, was it the Bill Clinton Epstein one recently? I have never really seen depositions before. Maybe that's why I was so captivated. And I watched six hours of Fox's depositions specifically. Very interesting, or horrifying, videos, and I spent a lot of time going through them, as the headline suggests.
Emanuel Maiberg
Yeah, I've seen some depositions. I'm sure Jason has as well. Bill Gates had a famous one during the monopoly trial, I think it was. And of course Epstein has been in the news a lot; I've seen some of that also. I just want to say, it feels like ancient history, but not that long ago we were writing all about these young DOGE guys trying to make cuts to the government. And this is going to be very interesting for many reasons, but it's the first time we really hear from them directly at length, for six hours, as you say, and not in staged settings. I think Trump had some public meetings with them that were filmed, but this is in-depth questioning about what they were doing. So what are some of the moments that you pulled out of there in your six hours of viewing?
Joseph Cox
Yeah, it's mostly the clips that listeners or readers may have already seen, because the way I saw it was that somebody on Bluesky posted a link to the YouTube channel of one of these organizations that uploaded the videos. And then I saw a quick little clip where Fox was unable or unwilling to define DEI. He was asked repeatedly, how do you understand DEI? The reason being is that that was the reason, or the justification, given for severing a ton of these grants. Right. And he kept pointing back to the executive order, which said, we're trying to get DEI out of the government or whatever. But even when the attorney pushed him to be like, yes, but what's your understanding of DEI? he wouldn't do it. I don't want to say he couldn't do it; it's more unable or unwilling is sort of the term I'm using. But he's incredibly evasive, obviously. I'm sure we'll play a little bit so people can hear it.
Interviewer/Lawyer
How do you interpret DEI?
Justin Fox (deposition clip)
The EO explicitly laid out the details. I don't remember it off the top of my head.
Interviewer/Lawyer
It's okay. I'm asking for your understanding of it.
Justin Fox (deposition clip)
Yeah, my understanding was exactly what was written in the EO.
Interviewer/Lawyer
So can you.
Justin Fox (deposition clip)
I don't remember what was in the.
Interviewer/Lawyer
So right now, do you have an understanding of what DEI is?
Justin Fox (deposition clip)
Yeah.
Interviewer/Lawyer
Okay, so what's your understanding as you sit here today in this deposition?
Justin Fox (deposition clip)
Well, it was exactly what was written in the EO. And so anytime that we would look at a grant through the lens of complying with an executive order, we would just refer back to the EO and assess if this grant had relation to it.
Interviewer/Lawyer
Okay, but I guess I'm stepping back from your methodology strictly in terminating the grants. Do you have an understanding as you sit here today of what DEI means?
Justin Fox (deposition clip)
Yeah.
Interviewer/Lawyer
Okay, so what's your understanding of what it means?
Justin Fox (deposition clip)
Well, I. It is exactly what was written in the EO.
Interviewer/Lawyer
Okay, so why is a documentary about Holocaust survivors DEI?
Justin Fox (deposition clip)
The fact that it is the gender-based story, that it's inherently discriminatory to focus on this specific group.
Interviewer/Lawyer
It's inherently discriminatory to focus on what specific group?
Justin Fox (deposition clip)
The gender-based. So, females during the Holocaust.
Joseph Cox
And then the other parts, I mean, he gets into very specific examples. There was one about a documentary, a grant for a documentary about Black civil rights. And Fox says something to the effect of, well, we cut this grant because it wasn't for the benefit of humankind. Which is obviously an absolutely insane thing to say. He does walk that back. And they actually read back to him, live in the deposition, what he just said. He's like, well, that's not what I meant, blah, blah, blah. But he did say it. And then there was another example of cutting funding for a Holocaust documentary called My Underground Mother. So even though it was six hours of footage and of depositions, it was kind of the same examples over and over and over and over again, as you might imagine, because he just wasn't answering the questions. And I know that people may think, well, what's the value of it? He's being evasive. But as you said, this is sort of the first time, or one of the first times, we've seen or heard them speak for themselves. And even though they're obviously being coached, there is a DOJ lawyer right next to these people; I don't know if you'd say representing them, technically, but definitely assisting them in this deposition. So, of course the answers are going to be coached. They are going to be massaged. But even then, I still think this is incredibly illuminating. Those were the examples that jumped out; beyond that, it was just repeating those over and over again, essentially. But very interesting stuff.
Emanuel Maiberg
I think whoever the lawyer is who was asking him questions actually did a pretty good job, because it seems to me like the entire point of that line of questioning about the definition of DEI was to show that it's a ridiculous category. And what Fox repeatedly says is that his definition of DEI is whatever the executive order from Trump states. And that's clearly coaching, as you said, because I think that kind of makes it legal for him, because he's like, I was just following the executive order, and I'm following that to the letter, and that's it. But then when he's asked to define DEI, it essentially, and this is me interpreting a little bit, comes down to any activity that highlights any particular group. And then obviously it becomes ridiculous, because if there's any initiative for women, if there's any documentary that focuses on women, or on certain minorities, that is DEI, and it becomes impossible to do anything.
Joseph Cox
And he acknowledged that when they used ChatGPT, they were searching for, you know, Black, homosexual, LGBTQ, but they didn't search for white or Caucasian. And he does acknowledge that. He actually says, well, we could have done that. Yeah, but you didn't. So, you know, that's the difference there. But, yeah, you're right. The line of questioning was just so persistent, I mean, again, it's over six hours, and done in different ways, where it just showed how sort of ridiculous their position was.
Jason Koebler
Yeah.
Emanuel Maiberg
And our previous coverage showed that it was ridiculous in practice because you would just have these total blanket activities or actions where it essentially looked like they did word searches on studies, and if it included the name of any minority or DEI or gender, it just got removed. And you can look back at all that coverage if you want to see how silly this was in practice. Okay, so you published this story, I think it was Friday, right?
Joseph Cox
Yeah, I believe so. Because for context, I watched. The reason I watched six hours was because I was on the plane a lot. So I had a lot of time to watch these and at the gym. And then when I got back on the Friday. Yeah, yeah.
Emanuel Maiberg
So, perfect flight activity. You land, you publish the post on Friday. And then, well, actually, we also published several videos that kind of highlight some of the things that we're talking about. And all this stuff went extremely viral, both your post and the videos. And I think it's when a lot of people first learned that this was even happening, and saw what this guy was saying. It got so much attention that some stuff happened because of it. So this leads us to your next headline, which is DOGE Deposition Videos Taken Down After Judge's Order and Widespread Mockery. So what happened?
Joseph Cox
Yeah, you're right in that we were one of the first outlets to clip the depositions and then post those. And then it was very funny seeing the right wing, it was mostly right wing, I think there were other political leanings as well, but basically X.com grifter accounts, lifting our clip. And it's like, that's the 404 Media font. Like, that's 100% ours. It's nothing to beef about; it was just funny, and funny to see how that ecosystem works. But we clip those, they go viral, we do that article, as you say, on, I believe, Friday night. So very soon after we publish, the government then issues a filing in this lawsuit, and they ask the judge: we need you to intervene and get the other party to stop the spread of these videos, to get them taken down, because these may cause harassment and reputational harm. That is the argument from the government, that seeing DOGE people saying things in their own words is going to cause reputational damage. I'll leave it to others to decide; well, maybe that's a consequence of saying things. Further on in the filing, they do get a bit more concrete. Or rather, just before that, it does actually cite our article specifically, the I Watched Six Hours one, and our videos, and there was a Huffington Post video as well. The government then later on in the filing says that Fox specifically, who was the focus of much of the videos, has allegedly faced death threats because this stuff went massively viral. I'm not necessarily doubting that; I'm sure that people made those threats. I would say that we haven't seen what those threats are or how concrete they were. But that is the government's argument in this lawsuit. So they do that filing. The judge, I believe, looks like they're going to agree. And there's then an emergency filing from the other parties, these language associations and organizations bringing the case, and they're like, well, look, actually there's a massive First Amendment issue here.
These videos should be public. They were never under a protective order, so you shouldn't order their removal. The judge disagreed and late on Friday night ordered that they be removed. And then sure enough, you go to the organization's YouTube channel and the hours and hours of video spread across maybe not dozens, but at least a dozen videos. They've gone completely. They've been wiped. So a pretty wild series of events to go from something that is massively, massively viral. Like, it's not just us covering it, it's basically everyone to a judge saying, you must remove those from YouTube.
Jason Koebler
Can we discuss how crazy this is? I mean, I think that occasionally judges will seal things if someone's safety is at risk, like, sometimes. But I feel like the bar for that is usually pretty high. I'm not a lawyer, but as I understand it, that bar is pretty high. We see court records all the time that have incredibly sensitive information in there, like details about people getting harassed, like addresses, phone numbers. Sometimes these things are redacted, but quite often they're not. And those are not filed under seal, for the most part. And here we have a situation where these government employees, in a highly publicized case of great public interest, dealing with millions and millions, probably billions, of dollars of government grants, and cuts that have led to people dying all over the world and in the United States because of the types of things that they did. And we are protecting them because people are being mean to them online. Like, I don't know, maybe I'm missing something.
Emanuel Maiberg
But also, these are public servants, right? This is about stuff that they did for the government. It's not as if it's a private company, or a family matter. These are taxpayer-funded activities.
Jason Koebler
Yeah. And I don't know if it's one of those things where the judge ordered it taken down while they deliberate whether to put it back up, as sort of an emergency measure. But kind of regardless, I think it's crazy. And I think that most people do, considering how viral it went and all that sort of thing. But it is not normal, I guess. This is not supposed to happen, I don't think.
Joseph Cox
Yeah. Incredibly unusual, I would say. Yeah.
Emanuel Maiberg
The only information in there that is damaging to them is the fact that people really don't like what they did. Right? They're not talking about their home address or anything like that. Okay, so that happens. And then that immediately leads us to another headline here: The Removed DOGE Deposition Videos Have Already Been Backed Up Across the Internet. Very predictable. So where do these videos live now?
Joseph Cox
Yeah, so this was on Saturday, the day after the judge ordered the removal from YouTube. And then seemingly the organizations went along with that, because obviously it's a legal order. And then Jason was actually keeping an eye on the Data Hoarders subreddit, which is a very fascinating place; there's always very interesting people doing very interesting things there. And I think when Jason flagged it, it was more people discussing backing it up, and maybe that was Friday night, I can't quite remember. But come Saturday, somebody sent me a link to the Internet Archive, and someone had uploaded all of the videos there. And I went through them and double-checked: yes, there's the Fox one; yes, there's the Cavanaugh one. There were actually two depositions from another two NEH officials as well, who were a bit more senior and sort of managing what these DOGE people were doing, so their depositions were up there as well. So, the Internet Archive, obviously a very useful resource for preserving this sort of thing. The last thing I'll say, just before I throw it to Jason to talk about the torrent as well: crucially, the judge's order wasn't an order against YouTube. It wasn't an order against a platform. It was an order against the specific organizations in this lawsuit, saying, you have to take steps to claw back these videos. And the most obvious way they would do that is to remove them from their own YouTube channel. But I don't think they really have any power to remove our Instagram posts, or the Internet Archive stuff, and I don't really expect them to be expected to go do that as well. I'm not familiar with this judge or their understanding of the Internet, but it's just not how it works. Stuff is going to be out there, and people very quickly archived it. While I was writing this, Jason, you were editing it and you pointed to the torrent, right?
Jason Koebler
Yeah. And so I had seen, not just that people were talking about backing it up, but there was someone on there who was like, I have the files and I'm going to make a torrent, but I don't know how. So he was, like, trying to learn how to do it. But, yeah, I mean, it's the Streisand effect, which I think is real. I think there have been some studies suggesting the Streisand effect is actually overstated to some degree: there's an initial spike in interest, and people kind of find something when the government deletes it or when a company deletes it, but in the long term it becomes harder to find. But I think that in this case, because it's torrented now, it's censorship-proof. It's decentralized. It cannot be deleted. I mean, I think it's still up on the Internet Archive, and I hope that it stays up on the Internet Archive, but that's still a centralized place, whereas with this torrent there's now tons of different seeders, and, you know, torrents are undefeated in that way. It will live forever somewhere. I don't know. I downloaded copies of it. I'm not deleting them. We haven't uploaded them, but they're useful to have if we report on it in the future, that sort of thing. So, yeah, it's good to have multiple backups, I guess.
Joseph Cox
Yeah. I mean, on Saturday, before it was clear they were already on the Internet Archive, I was thinking, if we get these videos, do we upload them to our site or something? Kind of like what we did with some of the Epstein documents, when Jason bought them off PACER and then we just uploaded them so other people could access them. I was thinking, do we do the same here? But we didn't need to, frankly, because other people had already archived them. All right, thank you for asking me those questions, Emanuel. We'll leave that there. When we come back after the break, we're going to talk about Jason's story about African intelligence. A very, very interesting trip, I'll say that. We'll be right back.
Jason Koebler
Every now and then you get a piece of clothing that makes you rethink your entire closet. For me, it came in the form of the softest sweater I've ever worn, made out of alpaca fiber by Paka in Peru. Paka makes outdoor and lifestyle apparel from alpaca fiber, one of the world's most sustainable natural fibers. Their bestselling hoodie is softer than cashmere, warmer than wool, and breathable. The hoodie is built for real life: thermal regulating, odor resistant, durable, and made to last. Each one is made start to finish in Peru and features an Inca ID that's hand-woven by artisans, honoring generations of knowledge and traditions and connecting you closer to where your clothing comes from. Over 250,000 people have already picked up the Paka hoodie. I'm going to be honest, I love my Paka hoodie and I wore it all winter. But it's getting hot in LA now and I've been wearing some of Paka's other great clothes. For example, I've got their men's tee on right now, and you can't see it, obviously, but I've been wearing their essential boxer briefs as well. Both are warm but breathable, which feels kind of like a magic trick. If you've been thinking about leveling up your clothing game, this is your sign to do it now. To grab your Paka hoodie or other Paka clothing, go to go.pakaapparel.com/404media. That's go.pakaapparel.com/404media, and use code 404media.

If you've ever thought about learning a new language and immediately felt overwhelmed, you're not alone. Most people start and then drop off pretty quickly; as many as 90% of people eventually give up. That's happened with me in the past, until I tried Babbel. That's what I like about it: Babbel actually fits into my life. Instead of long, intimidating sessions, you get bite-sized sessions that take about 10 minutes, so you can do one in the morning, on a break, or even while you're just winding down. I've been using it to work on my Spanish, and what stands out is how practical it is. You're not just memorizing random words, you're learning how to actually have conversations, things you'd use in real life like ordering food, asking for directions, or just talking to someone naturally. And Babbel adapts to how you learn best. You can listen, speak out loud, read, or even use their podcast to pick up language while you're doing something else. Everything is designed by over 200 language experts, so it feels structured but not overwhelming. It's really about small steps and progress you can actually notice. Just like I've noticed when I travel: I'm having a lot easier time getting around and actually talking to people. Babbel has sold over 25 million subscriptions worldwide, offers 14 different languages, and every course comes with a 14-day money back guarantee. Make fast, lasting progress with Babbel, the science-backed language learning platform that actually works. Here's a special limited time deal for our listeners: right now, get up to 60% off your Babbel subscription at babbel.com/404. Get up to 60% off at babbel.com/404, spelled B-A-B-B-E-L dot com slash 404. Rules and restrictions may apply.
Interviewer/Lawyer
This episode is sponsored by BetterHelp. March 8th was International Women's Day, and so for the whole month we're taking a beat to celebrate women and everything that they carry all year long, at work, in relationships, in families, and in the many roles that they hold every day. You know, last year my mother fell, and suddenly a woman who had supported a family for her entire life ended up needing a lot of her own support. And we rallied to her side, but we couldn't do everything. And part of what really helped her, and part of what she needed, was therapy. The kind of therapy that you can get with BetterHelp. BetterHelp therapists work according to a strict code of conduct and they are fully licensed in the US. And BetterHelp also has a therapist match commitment: it does the initial matching for you so you can focus on your own therapy goals. And if you aren't happy with your initial match, you can switch to a different therapist at any time. Your emotional well-being matters, so find support and feel lighter in therapy. Sign up and get 10% off at betterhelp.com/404media. That's B-E-T-T-E-R-H-E-L-P dot com slash 404media.
Joseph Cox
All right. And we are back. Jason, this is one you wrote. The headline is AI Is African Intelligence: The Workers Who Train AI Are Fighting Back. So you mentioned on the podcast recently that you took a trip to Kenya. I think at the time you were talking more about the conference, which is sort of the reason you went; you were giving a talk and that sort of thing. But as is detailed in this piece, you also did a fair bit of reporting while you were there as well, talking to various people and going to different events. What events related to data labeling did you go to while you were in the area?
Jason Koebler
Yeah, so I was aware of this guy named Michael Jeffrey Asia, who wrote a report for the Data Labelers Association, which is his organization and a few other people's organizations, about his time as a data labeler. He was on the podcast for an interview episode a few weeks back. But I felt like there was more to the story than just his story, because it's a whole organization of thousands of people who are the very low-paid labor behind AI training. And that's very broadly defined. A lot of them have worked for Sama, which is this company that has worked with Meta, and actually continues to work with Meta. They said that they've stopped, but they're doing their smart glasses data labeling stuff, which came out of, I believe, a Swedish newspaper or a Czech newspaper; it's in the article, I'm sorry. But there was a really good article about data labelers in Kenya who were looking at all of this highly sensitive video footage from Meta smart glasses. So data labeling is a very broad category of jobs that I would argue is quite related to content moderation. Content moderators are people who look at violent content, highly contentious political content, sexual content for different social media companies, and determine whether or not it violates the rules of a given platform. And as we've reported over the last few years, social media platforms have largely stopped giving a shit. And so as social media companies have taken a step back from content moderation, the jobs there have become a bit more scarce. And data labeling is a similar type of job. It can include everything from looking at a bunch of pictures and saying what is happening in the pictures, to drawing squares over the faces of people in footage or images to help train facial recognition systems, to describing what's happening in porn, for example.
So that's something that this guy, Michael Jeffrey Asia was doing and his job was like eight hours a day he watched porn for some platform. He didn't know which platform it was because the way that it works is like you work through a subcontractor. And he was categorizing like, what was going on in any given scene so that the platform could, like, categorize it for search. And then also, I don't know, sometimes they like, you can jump to different parts of a video that's like, oh, now they're doing this, now they're doing that, blah, blah, blah. So he was doing that, and then after that he had a second shift with a different job where he was an AI chatbot, like an AI sex bot, essentially. So he was training AI companionship bots that were telling users they were talking to AI, but he was the one who was actually chatting and he was given.
Joseph Cox
Is he even training AI in that case? Because it's almost like, I mean, yes, with the Meta smart glasses one, they look at images and they're training it that way. With the porn one, it's like, what do you see? You categorize it as a position or whatever. You're training on the data. I mean, I'm sure training is going on in this one, but it almost just sounds like it's all smoke and mirrors. He's just being, quote unquote, the AI, essentially.
Jason Koebler
Yeah, I mean, that's a very interesting question. And we've seen different models. We've seen companies that just straight up lie and say that they're doing AI, because AI sounds high tech and whatever, but then you look under the hood and it's a bunch of human beings in Kenya or India or Pakistan who are just pretending to be AI. But then I think that the business models of a lot of these companies are to start with that human labor, slowly automate it over time, and eventually turn it over to the AI. And so it's hard to say, because we actually don't know which platform he was working for. We know the name of the subcontractor he was working for, but there are so many different AI companionship bots out there. And because of the way that this industry works, he basically just sits down at his computer, or at a terminal, and a window pops up, and he's just given instructions to sext with these people. That's essentially what he was doing. And what he was saying was that he was given a persona. And so that persona could be: you're a straight man talking to a woman, you're a woman talking to a man, you are a lesbian, like a teenage lesbian. He was like, I had to do all this stuff and take on these personas, and I had to switch between the personas all the time. And basically his whole thing is that this ruined my life in many ways. I was paid very little. It was quite traumatizing, because I felt pulled between all these different personas. I felt like I had to do things and say things that I didn't want to say. Because he's like, I'm a Kenyan man and I'm being asked to be a college student in the United States.
Joseph Cox
Like, it's.
Jason Koebler
It was weird for him. And he was also looking at porn all day before that. And he was like, I basically became desensitized. I had trouble having sex with my wife. I had PTSD, I had insomnia, because I was working a zillion hours a day staring at a computer. And he did all of this because his son had lymphatic cancer and he needed a job. And this is one of the biggest sectors of tech jobs in Kenya. And so I talked to a lot of people in Kenya, just because I was at this conference. A lot of them worked in tech, a lot of them worked in journalism. I talked to various Uber drivers, I talked to servers and bartenders. And all of them knew what data labeling was, first of all. And many of them had done it themselves. It's kind of like DoorDashing or Uber driving here; it's largely gig work that you can just pick up and do on the side. And then some people make entire jobs out of it, where they're doing it all day, every day.
Joseph Cox
And so it's basically ingrained in, maybe the culture is the wrong word, but like, data...
Jason Koebler
Labor in the economy, for sure. It's a big part of the economy. You leave the Nairobi airport, you get a cab, and you immediately drive past the headquarters of Sama, which is the biggest company that does the subcontracting. It's huge, like a huge campus on the side of the highway. But anyways, I know that was a long wind-up, but basically, after over a year of doing this, Michael was like, fuck this. This is terrible. I hate this. We need to fight for better rights. And so he and some colleagues formed the Data Labelers Association, which is, I mean, it's not formally a labor union, as in they haven't been recognized by the companies and they aren't doing collective bargaining and things like that. But right now they're growing their power to basically push back against these companies and push for better working conditions. And so the event that they have,
Joseph Cox
How are they doing that exactly, the pushback?
Jason Koebler
Well, right now they're just signing people up, as in, they're just like: are you a data labeler? Do you feel like you're mistreated? Here is what's happening, more or less. It doesn't have to be this way, because people who do data labeling in the United States, there's not that many of them because they have to be paid minimum wage and things like that, but they're paid better wages, they have benefits, and a lot of them have mental health support. A lot of people in other countries have better labor protections there. And so right now they're doing a lot of educating people about how they are being taken advantage of and how this is an extractive industry. And then they've also worked with a lot of lawyers in Kenya, because Kenya has laws that should prevent some of this stuff from happening. So there's a lawsuit against Sama right now, there's a lawsuit against OpenAI, there's a lawsuit against Meta, about how these people are treated, about the fact that a lot of them don't have mental health support, about the fact that a lot of them are very poorly paid and don't have benefits, and all this sort of thing. And so I think it's a bit of a two-pronged approach, where they're trying to get as many data labelers as possible to say, I support collective action, I want to make this a better job for people and myself. And then there's the legal aspect of it, where, you know, I spoke to one of the lawyers that is suing Meta, and she told me, we have laws that should protect against this, it's just a matter of getting them enforced. And some of these lawsuits have been winding their way through Kenyan court for years at this point. It's just a matter of getting an injunction, or getting a result where these companies will be required to treat the workers better.
I think one of the kind of scary things, and this is not to discourage them at all, because what they're doing is great, is that a lot of these big companies will go to the Kenyan government and say, if you regulate us, we're just going to leave the country, we're going to go work in another country. And so that's the kind of thing that they're holding over the entire country at this point. It's not a matter of, oh, the workers are pushing back. It's that the government has largely, as I understand it, according to Mercy Mutemi, who's the lawyer I spoke to, kind of looked the other way, because the Kenyan government sees this as a chance to work with American big tech and to gain access to these jobs. Because these jobs do exist, even though they suck. They are jobs. And so they're like, oh, we don't want to lose these jobs, and if we regulate these companies, Mark Zuckerberg is just going to go to, like, Uganda or something instead, or
Joseph Cox
Southeast Asia or something like that. Yeah, right. Two things I would just mention. You mentioned the sort of similarity to content moderation, and then we have these AI jobs. There's almost one in between, which is the translation jobs. I remember back at Motherboard, we got a leak showing that workers were listening to Skype calls to aid Microsoft in improving the translation engine behind that. And, you know, is translation AI? I don't know, maybe people use ChatGPT for translation, all that sort of thing. But that almost seems like a bridge between the content moderation stuff and the AI stuff. And then I would also just say that there's sort of a spectrum of projects, right? There's the really sensitive stuff, like the porn stuff, and the Meta glasses especially. I'd probably put listening to smart assistant audio in there as well. Then you have maybe training images for Flock license plate reader cameras: as we reported recently, they're using overseas workers to train those algorithms. You then also have some stuff which straight up involves the military. There was this recent article from the Bureau of Investigative Journalism in London, and the headline, I mean, I'll just read it and you'll get the picture: "Gig workers in Africa have been helping the US military. They had no idea." And that was about Appen, which is this other huge consulting and contracting firm. Right. So, yeah, we had waves of coverage of the content moderation stuff, and we did a lot of that, and then other technology websites did it. And now there's the wave of the AI coverage as well.
Jason Koebler
Yeah, so there's a few things. One, you're absolutely right: a lot of it is translation. A lot of it is actually content moderation for AI chatbots.
Joseph Cox
Right.
Jason Koebler
Like, a lot of them do content moderation for ChatGPT. There was a Time magazine article about some Sama workers, maybe a year and a half, two years ago, where they were grading ChatGPT on the responses that it was giving to people. And then also they were looking at, like, if someone was trying to make a bomb on ChatGPT or something. They were testing the effectiveness of the guardrails, more or less. And so that really bridges the gap as well. And then there's an article that I mentioned in the interview that I did with Michael. It was on a Substack written by a Kenyan guy, and it was like, I don't write like ChatGPT. ChatGPT writes like me. And it's very interesting, because a lot of Kenyans, at least according to this article, and Michael said the same, when they post on LinkedIn or when they email people, they are getting told that they're using ChatGPT to write their things. And a lot of them say that they are not using ChatGPT. What's happening is that the people who are training ChatGPT how to write, who are tweaking the outputs, it's Kenyan English. And English is one of the two official languages of Kenya; Swahili is the other. But it's like, basically, we have now trained this robot to write like us, and now, when we write the way we were taught to write in school, we're getting accused of using AI. And we're not using AI, and that's leading to bad outcomes for us, because some of them are like, I'm a writer, and now I'm being accused of using AI just because this robot writes like me. And I thought that was super interesting. And then, yeah, the title of the article is like, "AI is African Intelligence." I thought that was a really powerful quote from Michael too, where he was like, and everyone knows this. They should know this.
But AI is not magic. There are just zillions of human hours that go into, not just, of course, all of the training data that comes in, all the stuff that's sucked into these tools, but also managing the outputs of it, tweaking the outputs of it, and making sure that it all works. The people who are doing that, not all of them are African, but a lot of them are. And so they're like, this is our labor. We're getting paid $200 a month to do this and OpenAI is worth a trillion dollars or whatever. They feel like that's not fair, and I think it's very hard to argue with that.
Joseph Cox
Yeah, it makes complete sense. I'm sure we'll keep an eye on that. But for now, if you're listening to the free version of the podcast, I'll now play us out. If you are a paying 404 Media subscriber, we're going to talk a little bit about jobs and AI, and, you know, maybe there's some inaccuracies in what is being reported, or some stuff that's being missed out. You can subscribe and gain access to that content at 404media.co. As a reminder, 404 Media is journalist-founded and supported by subscribers. If you do wish to subscribe to 404 Media and directly support our work, please go to 404media.co. You'll get unlimited access to our articles and an ad-free version of this podcast. You also get to listen to the subscribers-only section, where we talk about a bonus story each week. This podcast is made in partnership with Kaleidoscope and Alyssa Midcalf. Another way to support us is by leaving a five-star rating and review for the podcast. That stuff really helps us out. Here is some of a very long one, from all mart: "Important, relevant reporting. I'm basically the opposite of a tech enthusiast, but this is one of my favorite podcasts. The reporters make it really clear why tech stories matter and how tech and tech billionaires are impacting our lives." Thank you so, so much. This has been 404 Media. We'll see you again next week.
This week’s 404 Media Podcast dives into two major stories: the DOGE depositions, public videos that went viral and then became the subject of a takedown request, and Jason’s reporting from Kenya on the data labelers doing the hidden human work behind AI.
Throughout, the hosts bring trademark 404 Media clarity and skepticism, pulling stories from the digital shadows into the light.
Segment 1: The DOGE depositions (starting ~03:25)
- Availability and rarity: the depositions were public, uploaded to YouTube, and "somewhat rare" (05:00).
- Nature of the depositions: focused on in-depth questioning, unlike previous public appearances (05:21).
- Evasion over the definition of DEI.
- Specific examples of questionable justifications.
- Use of ChatGPT in the review process.
- Evasiveness and coaching.
Segment 2: AI workers in Kenya (starting ~28:28)
| Timestamp | Segment/Topic |
|-----------|---------------|
| 03:25 | Introduction to DOGE depositions story |
| 05:21 | DOGE depositions: nature and public access |
| 07:19 | Evasive DEI definition exchange (detailed Q&A sample) |
| 08:34 | Specific grant cuts (Black civil rights, Holocaust documentary) |
| 11:34 | Use of ChatGPT in discrimination |
| 13:32 | Virality and right-wing appropriation of video clips |
| 13:32-16:32 | DOJ/judicial takedown request and response |
| 17:43 | Discussion of public servant status and free speech implications |
| 19:11 | Backup and preservation on Internet Archive/torrents |
| 21:31 | Streisand Effect reflections |
| 28:28 | Transition to Jason’s reporting in Kenya, introduction to AI workers story |
| 29:09 | What is data labeling? |
| 32:36 | Portrait of Michael Jeffrey Asia, traumatic labor (sexbot/user persona work) |
| 34:55 | The toll of the work: PTSD, family troubles |
| 36:59 | Birth of Data Labelers Association: organizing pushback |
| 37:02 | Legal efforts and corporate resistance |
| 42:04 | Linguistic ironies: “I don’t write like ChatGPT. ChatGPT writes like me.” |
| 44:58 | “AI is African Intelligence”: thesis and call for recognition |
The episode showcases 404 Media’s commitment to uncovering the seldom-seen machinery behind both government policy and cutting-edge technology, reminding listeners that what disappears from platforms often continues to live on—and that the real “intelligence” in AI may come from those farthest from the spotlight.