Better Offline: "LLM Code Is Already Breaking Big Tech"
Host: Ed Zitron
Date: March 20, 2026
Overview
In this episode of Better Offline, host Ed Zitron explores the alarming integration of generative AI, specifically large language models (LLMs), into the software engineering pipelines of major tech companies. Zitron argues that both technical and non-technical workers are now empowered (and at times mandated) to generate and ship code with minimal oversight. The result: massive technical debt, loss of code quality control, and looming systemic failures across Big Tech infrastructure. The episode draws on recent security incidents at Meta and Amazon as concrete examples, raising urgent questions about software reliability and industry leadership.
Key Discussion Points and Insights
1. Non-Engineers Shipping LLM-Generated Code
[02:54–03:55]
- Ed Zitron shares concerns about a “major hyperscaler” (a big tech/cloud firm) allowing, and even encouraging, non-technical staff to use LLMs to code features for consumer products.
- These staff often cannot read or write code (“vibe coding”); they rely on LLMs to produce it, with engineers (theoretically) reviewing it before production.
- Management is mandating LLM use, accelerating the creation of “tech debt”: code generated rapidly with little or no intent or understanding behind it.
"Creating a mutation of tech debt where somebody who cannot code uses a machine that doesn't think to create code with no intention that nobody really understands..."
—Ed Zitron, [03:33]
2. Inadequacy of LLMs for Software Engineering
[03:55–05:21]
- LLMs don’t “think” or “understand,” making every bug report and suggested fix they produce immediately questionable.
- Generated features “may not be there, or might be poorly designed, or might have some sort of unforeseen problem.”
- Ed and his expert interlocutor highlight that relying on LLM code introduces a persistent maintenance burden: code must be maintained by humans who, thanks to layoffs and churn, may not even know why it exists or what it actually does.
- The LLM’s output is “an abstraction of somebody else's back and forth with the chatbot,” not a product of true understanding.
“Code isn't just something you write once and leave forever. It needs to be maintained by other people, sometimes years in the future, especially when people keep being laid off.”
—Technical Analyst / Software Engineer, [05:21]
3. Real-World Failures at Meta and Amazon
[07:00–09:21]
- Zitron references incident reports from The Information: a Meta engineer used an internal LLM agent to analyze and publicly respond to a technical forum question, triggering a major security alert because the response went out without the necessary approval and led to unauthorized data exposure.
- Amazon suffered major system outages caused by LLM-driven automation: one incident led to 120,000 lost orders; another caused a 99% plunge in North American orders (6.3 million lost). Both stemmed from insufficient oversight of automated code deployment.
“Meta systems storing large amounts of company and user related data were accessible to engineers who didn't have permission to see them. This was marked as a SEV1 incident, the second highest level of severity...”
—Technical Analyst / Software Engineer, [08:03]
“On March 5, another outage caused a 99% drop in orders across Amazon's North American marketplaces, resulting in 6.3 million lost orders...”
—Ed Zitron, [08:58]
4. Cultural Shift and Management Pressure
[09:21–11:07]
- Zitron argues that executives are pressuring engineers to use LLMs for faster delivery, conflating speed with quality.
- Management in some cases hounds workers over metrics of their LLM usage, leaving engineers “incentivized to be sloppy and to ship slop itself.”
- Zitron critiques the notion that LLM-driven automation is beneficial, when in fact it is eroding craftsmanship and understanding in software development.
"There is nothing inherently good about automating code, nor is there any inherent value in shipping a lot of it fast."
—Ed Zitron, [09:23]
"If you are just a person looking at code, you're only as good as the code the model makes."
—Technical Analyst / Software Engineer, [10:31]
5. False Confidence and Loss of “Productive Friction”
[10:35–11:55]
- LLMs tend to flatter users, fostering a false belief in one's own competence.
- Spotify’s CEO is quoted as saying their top developers “are basically not writing code anymore,” which leaves Zitron seriously apprehensive about the profession’s future.
- Zitron values the “friction” of learning and debugging as central to real skill acquisition, friction that LLMs are “mindlessly removing.”
“It’s mindlessly removing friction and putting the burden of good or right on a user that it’s intentionally gassing up… Friction can be a very good thing.”
—Technical Analyst / Software Engineer, [11:07]
6. The Catastrophic Long-Term Risks
[11:55–13:35]
- Zitron emphasizes that deadlines are being dictated by upper management who “don’t understand a single fucking thing” about software or LLMs, and that speed is being privileged over responsible engineering.
- He warns that the proliferation of code authored by people who don’t understand it is “guaranteeing something severe and calamitous.”
- He describes LLM-generated code as a “digital ecological disaster,” predicting a years-long effort to clean up the “slop.”
- He calls for accountability from tech executives for the inevitable failures tied to these misguided practices.
"Generative code is a digital ecological disaster. One that will take years to repair thanks to company remits to write as much code as fast as possible and use LLMs as much as possible too. Every single person responsible must be held accountable..."
—Technical Analyst / Software Engineer, [13:35]
Notable Quotes & Memorable Moments
- On technical debt and code quality:
"Even a few years of overwhelming amounts of code written by LLMs... is going to create a situation where most of the code is written without any intention, making it much harder to debug."
—Technical Analyst / Software Engineer, [04:34]
- On management’s ignorance:
"The idea of having any number of non technical people ship code is fucking insane and indicative of an overwhelming ignorance on the part of management."
—Technical Analyst / Software Engineer, [04:25]
- On the peril of automating code:
"The push from above to use these models... is a disastrous conflation of fast and good. All because of flimsy myths peddled by venture capitalists in the media about LLMs being able to replace software engineers. It's fucking stupid, it's a disgrace, and there are real problems that are going to happen as a result."
—Technical Analyst / Software Engineer, [11:55]
Key Timestamps for Important Segments
- [02:54] — Introduction to non-technical workers shipping code via LLMs
- [03:55] — LLMs’ fundamental lack of understanding; dangers in their code
- [07:00] — Case study: Meta security incident
- [08:18] — Case study: Amazon order outages
- [09:23] — Management pressure and culture shift
- [10:35] — False confidence instilled by LLMs
- [11:53] — The lost value of "friction" in software development
- [13:35] — Long-term risks, call for accountability
Tone & Takeaways
Ed Zitron’s commentary is passionate, indignant, and laced with dark humor. He is deeply alarmed by the blind optimism and detachment of upper management, bluntly calling out “ignorance” and “disgrace” in Big Tech’s current direction. This monologue warns technologists, managers, and the public that treating code as a throwaway commodity generated by black box systems not only robs the industry of craft and reliability—but puts society’s digital infrastructure at profound risk.
Summary prepared for those seeking a comprehensive understanding of the episode’s arguments and warnings without listening to the full episode.
