Ed Zitron (13:29)
Can Sora, OpenAI's video creation tool, replace actors or animators? No, not at all. But it still fills the air full of tension, because you can immediately see who is pre-registered to replace everyone that works for them. AI is apparently replacing workers, but nobody seems able to prove it at scale. Yet every few weeks a story runs where everybody tries to pretend that AI is replacing workers, with some sort of poorly sourced and incomprehensible study, never actually saying somebody's job got replaced by AI, because it isn't happening at scale, and because if you provide real-world examples, people can actually check if they're true. Now, I want to be clear: some people have lost jobs to AI, just not really white-collar workers, software engineers, or really any of the career paths that the mainstream media and AI investors would have you believe. Brian Merchant has done excellent work covering how LLMs have devoured the work of translators, using cheap, almost-good automation to lower already stagnant wages in a field that had already been hurting before the advent of generative AI, with some having to abandon the field and others pushed into bankruptcy. I've heard the same for art directors, SEO experts, and copy editors, and Christopher Mims of the Wall Street Journal covered these last year. These fields all have something in common: shitty bosses with little regard for their customers, who have been eagerly waiting for the opportunity to slash labor. To quote Merchant, the drumbeat, marketing, and pop culture of powerful AI encourages and permits management to replace or degrade jobs they might not otherwise have, across the board. The people being replaced by AI are the victims of lazy, incompetent cost-cutters who don't care if they ship poorly translated text. To quote Merchant again, AI hype has created the cover necessary to justify slashing rates and accepting just-good-enough automation output for video games and media products. 
Yet the jobs crisis facing translators speaks to the larger flaws of the large language model era, and why other careers aren't seeing this kind of disruption. Generative AI creates outputs, and by extension defines all labor as some kind of output created from a request. In the case of translation, it's possible for a company to get by with a shitty version, because many customers see translation as "what do these words say?", even though one worker told Brian Merchant that translation is about conveying meaning. Nevertheless, translation work has already started to condense to a world where humans at times clean up machine-generated text, and the same worker warned that the same might come for other industries. Yet the problem is that translation is a heavily output-driven industry, one where idiot bosses can say "oh yeah, that's fine" because they ran an output back through Google Translate and it seemed fine in their native tongue. The problems of a poor translation are obvious, but customers of translation are, it seems, often capable of getting by with a shitty product. The problem is that most jobs are not output-driven at all, and what we're buying from a human being is a person's ability to think and do. Every CEO talking about replacing workers with AI is an example of the real problem: that most companies are run by people who don't understand or experience the problems they're solving, don't do any real work, don't face any real problems, and thus can never be trusted to solve them. In "The Era of the Business Idiot," a piece I wrote a while ago, I talked about how this was the result of letting management consultants and neoliberal free-market sociopaths take over everything, leaving us with companies run by people who don't know how the companies make money, just that they must always make more, without fail. 
And when you're a big stupid asshole, every job that you see is condensed to its outputs, and not the stuff that leads up to the output, or the small nuances and conscious decisions that make an output good as opposed to simply acceptable, or even bad. What does a software engineer do? They write code, right? What does a writer do? They write words, right? What does a hairdresser do? They cut hair. Yeah, that's of course not actually the case. As I'll get into later in the series, a software engineer does far more than just code, and when they write code, they're not just saying "what would solve this problem?" with a big smile on their face. They're taking into account their years of experience, what code does, what code could do, and all the things that might break as a result, and all of the things that you can't really tell from just looking at the code, like whether there's a reason things are made in a particular way. And a good coder doesn't just hammer at the keyboard with the aim of doing a particular task. They factor in questions like: how does this functionality fit into the code that's already there? Or, if someone has to update this code in the future, how do I make it easy for them to understand what I've written and make changes without breaking a bunch of other stuff? A writer doesn't just write words. They jostle ideas and emotions and thoughts and facts and feelings into a condensed piece of text. They sit up late at night typing thousands and thousands of words, and it drives them insane. It's often quite emotive, or at the very least driven or inspired by a given emotion, which is something that an AI simply can't replicate in a way that's authentic or believable. And a hairdresser doesn't just cut hair. 
They cut your hair, which may be wiry, dry, oily, long, short, healthy, unhealthy, on a scalp with particular issues, at a time of year when perhaps you want to change length, at a time that fits you, and in the way you like it, which may be impossible to actually write down, but they get it just right. And they make conversation, making you feel at ease while they snip and clip away at your tresses, with you never having to think for a second: fuck, does this person know what they're doing? Are they going to listen to me? This is the true nature of labor that executives fail to comprehend at scale: that the things we do are not units of work, but extrapolations of experience, emotion, and context that cannot be condensed into written meaning or bunches of training material. Business idiots see our labor as the results of a smart manager saying "do this," rather than human ingenuity interpreting both a request and the shit the manager didn't say. Now, what does a CEO do? Well, I did look, and a Harvard study said that they spend 25% of their time on people and relationships, 25% on functional and business unit reviews, 16% on organization and culture, and 21% on strategy, with a few percent here and there for things like professional development. Hmm. That's who runs the vast majority of companies: people that describe their work predominantly as looking at stuff, talking to people, thinking about what we do next, and going to lunch. The most highly paid jobs in the world are impossible to describe, their labor described in a mishmash of LinkedInspiration. Yet everybody else's labor is an output that can be automated. As a result, large language models must seem like magic to these dickheads. When you see everything as an outcome, an outcome you may or may not understand, and whose process you definitely don't understand, let alone care about, you kind of already see your workers as LLMs. 
You create a stratification of the workforce that goes beyond the normal organizational chart, with senior executives, those closer to the class level of CEO, acting as those who have risen above the doldrums of doing things to the level of "decision making," a fuzzy term that can mean everything from making nuanced decisions with input from multiple different subject matter experts to, as ServiceNow's Bill McDermott did in 2022, and I quote, making it clear to everybody in a boardroom of other executives that everything they do must be "AI, AI, AI, AI, AI." And that's five of those. The same extends to some members of the business and tech media, who have for the most part gotten by without having to think too hard about the actual things companies are saying. Look, I realize this sounds a little mean, and it's not a unilateral statement, and I must be clear: it doesn't mean that these people know nothing, just that it's been possible to scoot through the world without thinking too hard about whether or not something is true just because an executive said it. When Salesforce said back in 2024 that its Einstein Trust Layer and AI would be transformational for jobs, the media dutifully wrote it down and published it without a second thought. It fully trusted Marc Benioff when he said that Agentforce agents would replace human workers, and then again when he said that AI agents were doing 30 to 50% of all the work in Salesforce itself, even though that's an unproven and nakedly ridiculous statement. Salesforce's CFO, by the way, said earlier this year that AI wouldn't boost sales growth in 2025. One would think this would change how Salesforce was covered, or how seriously one takes Marc Benioff. But it hasn't, because nobody's really paying attention. In fact, nobody seems to want to do their job in this case.