Transcript
Jason Koebler (0:04)
Hello and welcome to the 404 Media Podcast, where we bring you unparalleled access to hidden worlds, both online and IRL. 404 Media is a journalist-founded and owned company and needs your support. To find our work and subscribe, go to 404media.co. Subscribers get access to bonus podcast segments and early access to interview episodes like this one. I'm Jason Koebler, and this week we have a special episode, something a little bit different. I recently went to Kenya for a journalism and AI conference that I've talked about briefly on this pod before. And while I was there, I really wanted to meet with Michael Jeffrey Asia, who is the Secretary General of the Data Labelers Association. Data labeling is a huge job in Kenya. A lot of people there are talking about it all the time. It's seen as tech work because it is tech work. For people who don't know, data labelers are the people who train AI and who also work on ensuring its outputs are accurate. In some cases, data labelers are themselves pretending to be AI in order to eventually train AI tools. A lot of times data labelers don't know exactly what they're working on or who they're working for, because the work usually goes through a platform or a subcontractor or a combination of both. Although, as you'll see in this conversation, a lot of data labelers are able to figure out who exactly they're working for. But basically, they can be presented with a backend where they're asked to perform tasks or correct outputs or answer questions or label video, photos, things like that. And in some cases, as you'll see, their answers are actually presented in real time as AI to the end user. And so there have been all these stories where a quote-unquote "AI messenger" is actually just some data labeler responding and pretending to be AI, even though the data labelers themselves might not know that that's happening. Data labeling is notoriously brutal and underpaid work.
Workers sometimes earn as little as a few dollars per day. They work under algorithmic management, where they have these really strict quotas. And then, because they're sometimes trying to train AI about what not to do or what not to show, they're often shown graphic, violent, or sexual content for hours at a time. This has led to a lot of cases of PTSD. There are actually a couple of lawsuits in Kenya going on right now about this. You know, Meta has been sued. There's a company called Sama that Michael worked for that has had a lot of complaints against it. And data labeling is really similar to content moderation jobs. A lot of people who work in data labeling also do content moderation, or they switch back and forth between the industries. It's such a big thing in Kenya at the moment that, driving down the highway going into Nairobi, you see all of these huge office complexes where people do data labeling, like big Sama offices and things like that. And I actually mentioned data labeling to the driver who took me to meet Michael for this interview, and she told me that she was also a data labeler, and so are a lot of her friends. I wanted to talk to Michael because he's the author of a report called the Emotional Labor Behind AI Intimacy, which was put out a few months ago by the Data Workers Inquiry Project. In this report, Michael describes working endless hours pretending to be an AI sex bot, or a bunch of different AI sex bots, more or less, and the toll that took on his mental health and his marriage. He's going to talk about that in the interview, but I want to read a passage from it that was really affecting for me. It goes, quote: I had to assume fabricated identities, memorizing false backstories and reading through previous chats. Sometimes I would be assigned a conversation that had been ongoing for several days and had to continue it smoothly so the user wouldn't realize the person responding had changed.
I played the part, stepping into carefully crafted personas designed to connect with unsuspecting customers on a quote-unquote personal level, often through sexual or intimate conversations. When I logged into my work dashboard, I had access to multiple fake profiles of varying genders, typically three to five different personas I could operate simultaneously. Sometimes I had to operate male and female personas on the same day, depending on what the platform's users were seeking. One day I might be Jessica, a 24-year-old lesbian college student from California, and Joe, a 30-year-old gay man from Florida. Another day it could be Maria, a 28-year-old heterosexual nurse, or a nameless woman artist. I felt like I was losing myself in the role. It started as any other job, responding with empathy and willfully pretending to care. But over time, it became harder to separate the act from reality. The lines blurred. I began questioning if I was acting or if I was truly becoming the persona I was forced to embody. I was losing touch with who I really was, a feeling that has never left me. Michael has since become a critical part of the Data Labelers Association, a group that is fighting to organize people who do data labeling work in Kenya, and I guess internationally, actually, and which is advocating for better working conditions, higher pay, and more protection for data labelers. I met Michael at a coworking space in Nairobi, in a very tiny room, so I'm not on camera for this. But here's my conversation with Michael.
