
Today on the AI Daily Brief, why electricity is AI's biggest problem, and before that, in the headlines, another restructuring of Meta's AI efforts. The AI Daily Brief is a daily podcast and video about the most important news and discussions in AI. All right, friends, quick announcements before we dive in. First of all, thank you to today's sponsors: AssemblyAI, Robots and Pencils, Blitzy, and Superintelligent. To get an ad-free version of the show, go to patreon.com/aidailybrief, and to learn about sponsoring the show, send us a note at sponsors@aidailybrief.ai.

Welcome back to the AI Daily Brief Headlines edition, all the daily AI news you need in around five minutes. Meta is cutting 600 roles in their AI division in an attempt, apparently, to move faster. Axios reports the downsizing will impact a significant chunk of the several-thousand-strong AI team. Chief AI Officer Alexandr Wang wrote in a memo to staff, "By reducing the size of our team, fewer conversations will be required to make a decision, and each person will be load-bearing and have more scope and impact." Staff reportedly learned on Wednesday morning whether they were among the cuts. Affected employees were encouraged to apply for other jobs at Meta, and Wang expects most of them to find another position internally where they could apply their AI skills. He wrote, "This is a talented group of individuals, and we need their skills in other parts of the company now."

By some accounts, this is now the fifth restructuring of Meta's AI division this year. Meta disputes that characterization, instead viewing this as a continuous effort to nail down the right organizational structure. Notably, one of those reorgs involved rolling up existing AI teams into the newly constituted Superintelligence Labs. From there, the expensive superstar hires that took place over the summer were concentrated into a subdivision known as TBD Labs.
Reporting states that the layoffs won't impact TBD Labs, instead thinning out the FAIR research lab, the AI product team, and the AI infrastructure unit. Meta also continues to actively recruit more elite AI researchers for TBD Labs even as they're doing these other layoffs. Bringing together multiple divisions and figuring out how to make this all work in harmony is no easy task. Just ask Google, who took over a year to do this between their various AI organizations as well. Still, to some, this feels like a recipe for even more discontent in Meta's AI organization. Earlier reporting had suggested that bringing existing AI workers into the Superintelligence Labs was an effort to quell morale issues. There was a suggestion that existing workers didn't want to be left out of the company's new initiative and stuck working on legacy projects like updates to Llama 4. The good news for the folks impacted is that this is an absolute hiring bonanza for every other AI company, and they are jumping on it very quickly. Look, ultimately, when it comes to Meta, it's all going to be about the next AI products they actually put out. If Llama 5 is a banger, no one will care about all this turmoil. And if it's not, well, that'll be a whole different discussion.

Next up, some interesting M&A news. Adobe is apparently considering, or has considered, a multibillion-dollar acquisition to keep up with the AI boom. The Information reports that Adobe had discussed acquiring AI video and avatar generation company Synthesia for $3 billion. That would be a markup from Synthesia's $2.1 billion valuation from a fundraising round back in January. Adobe's venture arm invested an undisclosed amount into Synthesia earlier this year to form a strategic partnership, so this would be taking that to the next level. While Adobe has been steadily adding AI features over recent years, they're viewed as one of the software companies most at risk of disruption during the AI boom.
Their stock is down almost 20% so far this year in an otherwise hot market for tech companies. A splashy acquisition could be a way for them to shake up the narrative. Synthesia's technology also fits into Adobe's latest AI venture, a platform called AI Foundry that offers businesses a way to build custom image and video models based on their branding and IP. Some, though, think the market's antipathy towards Adobe is overblown. François Chollet, one of the founders of the ARC-AGI Prize, wrote, "Adobe is caught in a similar narrative as Google was six months ago. Everybody, and I mean everybody, currently believes that Adobe is dead in the water because of gen AI disruption. As a result, Adobe has been aggressively selling off for two years. There's just one small problem: the narrative is dead wrong. Adobe is not Stack Overflow or Chegg. No one is canceling their Photoshop subscription because they started using gen AI, which would be completely obvious if investors had any familiarity with Adobe products and their customers and use cases." Ultimately, he says, it's likely gen AI will actually turn into a tailwind, letting Adobe ship better features and deepen its moat. Gen AI, he argues, is a commodity tech now and tends to help established players. So there you go. Not financial advice on this show, but François is basically saying buy Adobe, man.

Next up, a couple of product updates. The first big upgrade is coming for OpenAI's Sora app, specifically focused on expanding the use of the cameo feature. OpenAI's head of Sora, Bill Peebles, writes, "Character cameos are coming in the next few days. You'll be able to cameo your dog, guinea pig, favorite stuffed toy, and pretty much anything else you want." He also noted that people will be able to create cameos of the characters that they've generated from their Sora videos, which opens up a lot of really interesting creative possibilities.
Additionally, they are adding basic video editing to the app, including stitching together multiple clips. They say they're continuing to work on overmoderation, and that they have a bunch of experiments around making the social experience much better, hinting at channels that could be specific to a university, company, or sports club. Meanwhile, Sora seems to be following the exact patterns of the Internet before it. Justine Moore writes, "The number one video on the Sora app right now, by a wide margin, is this obese cat steamrolling a house." I'm beginning to wonder if the killer use case of AI video is just creating more cat videos until the entire Internet is completely overrun by them. Something to consider.

Meanwhile, for those of you who are interested in ChatGPT's Atlas browser, product lead Adam Fry shared a group of short-term fixes that are coming over the coming weeks, in what looks like basically a Notion list shared directly to Twitter, which I kind of love as a communication style. Their post-launch fixes include multi-profile support, tab groups, a model picker in the Ask ChatGPT sidebar, the ability to use projects in the ChatGPT sidebar, an opt-in ad blocker, and a whole bunch of other upgrades. Now, ultimately, whether any of those change the fundamental question of whether the use case matters to you, I'm not so sure. But this is certainly not a product that they've just put out there and are now going to forget.

Lastly today, one more interesting story surrounding OpenAI. The company is forging ahead with vertical agents as rumors emerge of a secret project to train their models on investment banking. Bloomberg reports that OpenAI has engaged a team of more than 100 former investment bankers to help teach their AI to build financial models. The project is codenamed Mercury.
According to documents viewed by reporters, participants are being paid $150 an hour to write prompts and provide feedback on financial models for a range of transaction types, including restructurings and IPOs, with the goal of producing an agent that can complete entry-level tasks typically performed by a junior banker. The work is also reportedly focused on training on industry formatting norms, things like italicizing percentages and using correct margins. Now, of course, this type of reinforcement learning project has become a major focus of several AI labs as they work on products for the coming year. Mira Murati's Thinking Machines Lab has been heavily focused on using the technique, and data labeling startups like Mercor are increasingly recruiting specialists to guide reinforcement learning. The goal is to produce models that have specialist knowledge and reasoning capabilities related to particular areas of professional work. And apparently OpenAI sees investment banking as low-hanging fruit. Junior investment bankers notoriously work insane hours in a highly compensated field, so a specialist agent that could handle some of the load could be an easy sell. This could also be part of OpenAI's plan to sell vertical agents that could replace human workers in specialist fields. Now, whether this is still how OpenAI is thinking about it is not exactly clear. These reports on specialist agents came from back in May, and as we all know, five and a half or six months of AI time is about 30 years in regular time. In any case, for now, that's going to do it for today's headlines. Next up, the main episode.

If you're building anything with voice AI, you need to know about AssemblyAI. They've built the best speech-to-text and speech understanding models in the industry, the quiet infrastructure behind products like Granola, Dovetail, Ashby, and Cluely. Now, as I've said before, voice is one of the most important modalities of AI.
It's the most natural human interface, and I think it's a key part of where the next wave of innovation is going to happen. AssemblyAI's models lead the field in accuracy and quality, so you can actually trust the data your product is built on. And their speech understanding models help you go beyond transcription, uncovering insights, identifying speakers, and surfacing key moments automatically. It's developer-first: no contracts, pay only for what you use, and it scales effortlessly. Go to assemblyai.com/brief, grab $50 in free credits, and start building your voice AI product today.

AI changes fast. You need a partner built for the long game. Robots and Pencils works side by side with organizations to turn AI ambition into real human impact. As an AWS certified partner, they modernize infrastructure, design cloud-native systems, and apply AI to create business value. And their partnerships don't end at launch; as AI changes, Robots and Pencils stays by your side so you keep pace. The difference is close partnership that builds value and compounds over time. Plus, with delivery centers across the US, Canada, Europe, and Latin America, clients get local expertise and global scale for AI that delivers progress, not promises. Visit robotsandpencils.com/aidailybrief.

This episode is brought to you by Blitzy, the enterprise autonomous software development platform with infinite code context. Blitzy uses thousands of specialized AI agents that think for hours to understand enterprise-scale codebases with millions of lines of code. Enterprise engineering leaders start every development sprint with the Blitzy platform, bringing in their development requirements. The Blitzy platform provides a plan, then generates and pre-compiles code for each task. Blitzy delivers 80%-plus of the development work autonomously while providing a guide for the final 20% of human development work required to complete the sprint.
Public companies are achieving a 5x engineering velocity increase when incorporating Blitzy as their pre-IDE development tool, pairing it with their coding copilot of choice. To bring an AI-native SDLC into their org, Blitzy is providing a limited-time 30-day free proof of concept for qualifying enterprises. The team will provide a 5x velocity increase on a real development project in your org. Visit blitzy.com and press Book Demo to learn how Blitzy transforms your SDLC from AI-assisted to AI-native. That's blitzy.com.

Today's episode is brought to you by Superintelligent. Now, for those of you who are new here and maybe don't know, Superintelligent is actually my company. We started it because every single company we talk to, all the enterprises out there, are trying to figure out what AI can do for them. But most of the advice is super generic, not specific to your company. So what we do is map your AI and agent opportunities by deploying voice agents to interview your teams about how work works now and how your people would like it to work in the future. The result is an AI action map with high-potential ROI use cases and specific change management needs. Basically everything you need to go actually deliver AI value. Go to besuper.ai to learn more.

Welcome back to the AI Daily Brief. There has been arguably no bigger conversation in and around AI over the last couple of months than the massive infrastructure buildout that is both happening now and being presaged in these big, crazy deals that have the market talking about whether this is all just one big bubble. Now, we've extensively covered the nature of the deals themselves, as well as what types of factors would actually make this a bubble or not. But there is another part of this story which is extremely important, which is whether or not we're actually going to have the physical and energy capacity to even bring all of this infrastructure online.
It is increasingly clear that this is a challenge not only in a technical sense, but also in a political sense. The conversation around this took a big leg up in the summer. You might have seen some version of this chart floating around that showed the absolutely static electricity generation capacity of the United States. Between 1999 and now, China has more than 5x'd its electricity generation, giving it one of its biggest advantages when it comes to the AI buildout. In July, investor Chamath Palihapitiya wrote, "The big problem with this graph is that as AI gets reduced to computation power, it further gets reduced to electricity to power the data centers that house the computation. The US is still ahead in model sophistication and quality, but we are way behind on electricity generation, which could catch up with us. Need to pay close attention to this and make sure we incentivize every form of electricity generation, storage, transmission, and distribution."

But before we can understand what the impact of AI and data centers on the US electricity system is going to be, we need to ground ourselves in where we are right now. And the TLDR is that about 70% of transmission lines and transformers are over 25 years old, with many installed all the way back in the 1960s and 70s, now nearing their end of life. As the Smart Electric Power Alliance writes, grid reliability has been in decline since the mid-2010s, due in part to this aging infrastructure. Now, they also point out that even before we get into the world of AI data center construction, the grid that we have wasn't designed for even our existing modern consumption patterns. Whereas demand grew slowly in the past, we now have sustained, around-the-clock high consumption. Electric cars, digitalization, all of these things are driving load growth that was putting pressure on the system even before we got to AI. The Department of Energy currently projects that peak demand could jump by as much as 38% by 2030.
And exacerbating that problem, plants that produce 104 gigawatts of power are slated to be retired in that same period. That cuts across coal, gas, and nuclear. And right now, only 22 gigawatts of new firm capacity is planned. The DOE predicts that if those plants are retired too quickly, without reliable alternatives being brought online, we could see up to 800 hours of blackouts per year by 2030, which could be even more depending on extreme weather patterns. There are also some unique regional strains. Texas's ERCOT broke 10 new peak demand records in 2023 due to both rapid growth and record heat, and other regional centers have even more data infrastructure than the national average.

And this is the landscape that the data center buildout is coming into. As per a McKinsey report and Goldman Sachs data, $6.7 trillion of capital expenditure will be deployed in data center infrastructure through 2030. Data centers are anticipated to add somewhere between 116 gigawatts and 243 gigawatts in demand to US grids by 2030. The mid-range estimate is triple the 55 gigawatts that they demanded in 2023. Right now, data centers don't necessarily consume as much power as you might think given how much discourse they generate, but that sort of shift would make a huge difference. Total US power generation capacity in 2023 was around 1,200 gigawatts, so you're potentially going from data centers demanding less than 5% of US capacity all the way up to double digits. Bain, for example, projects that data centers will represent around 9% of electricity consumption by 2030. And one of the big challenges for utility companies with all of this is to figure out how much of this is going to be real. Last week, CNBC published a piece called "Utilities Grapple with a Multibillion-Dollar Question: How Much AI Data Center Power Demand Is Real?"
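The capacity math here can be sketched in a few lines. This is a rough back-of-envelope check using the approximate figures quoted above, not a rigorous accounting of capacity versus consumption:

```python
# Rough check of the data center share of US generation capacity,
# using the approximate figures quoted in the episode.

US_CAPACITY_GW = 1200             # approximate total US generation capacity, 2023
DC_DEMAND_2023_GW = 55            # approximate data center demand in 2023
DC_ADDED_BY_2030_GW = (116, 243)  # projected range of added data center demand by 2030

share_2023 = DC_DEMAND_2023_GW / US_CAPACITY_GW
print(f"2023 share: {share_2023:.1%}")  # just under 5%

for added in DC_ADDED_BY_2030_GW:
    share_2030 = (DC_DEMAND_2023_GW + added) / US_CAPACITY_GW
    print(f"2030 share with +{added} GW added: {share_2030:.1%}")
```

Run against those figures, the 2023 share lands a little under 5%, and the 2030 projections push it well into double digits, which is the jump driving the rest of this discussion.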
Willie Phillips, former chair of the Federal Energy Regulatory Commission, said there is a question about whether all of these projections are real. "There are some regions who have projected huge increases, and they have readjusted those back," GridUnity CEO Brian Fitzsimmons said. "We're starting to see similar projects that look to have exactly the same footprint being requested in different regions across the country." Current FERC Chairman David Rosner made the point that the difference of a few percentage points in electricity load forecasts can, quote, impact billions of dollars in investment and customer bills. Put simply, he said, we cannot efficiently plan the electric generation and transmission needed to serve new customers if we don't forecast how much energy they will need as accurately as possible.

And importantly, this conversation has gone from one for policy wonks and energy infrastructure professionals to one that is starting to hit the mainstream. Just last week, USA Today published a piece called "Is AI Making My Electric Bill Higher?" That piece shared a September analysis by J.D. Power, which found that between 2020 and 2025, household utility costs spiked by 41%. Now, according to Bankrate, overall consumer prices are around 24% higher than they were in 2020, meaning that the utility cost spike is outpacing even the rest of what people already consider painful inflation. The Center for American Progress put out a report last month that found that more than 100 gas and electric companies have raised or proposed rate increases for this year or for 2026. In total, U.S. citizens in more than 40 states face higher utility bills going into next year. Now, to their credit, USA Today points out that AI isn't the only thing going on here; in addition to dealing with this aging infrastructure, we're also dealing with climate change-related issues.
But still, it's very clear that AI is on people's minds, said Todd Snitchler, the president of the Electric Power Supply Association: "In the span of, I'll call it, 24 months, data centers went from something no one talked about to something everyone's talking about." A Bloomberg exposé from the end of September found something similar. The piece was called "AI Data Centers Are Sending Power Bills Soaring." They argue that wholesale electricity costs as much as 267% more than it did five years ago in areas near data centers, and that those increases are being passed on to customers.

And part of the issue here is simply the way that the business of power generation is designed. As CNN Business points out, large buyers of electricity typically pay lower rates because the distribution infrastructure is less complex; power needs to be piped to one location rather than hundreds or thousands of homes. Pricing models haven't been updated to take into account the surge in data center growth. Basically, one of the core issues is that there currently isn't really a good mechanism to charge the data centers more for being the ones adding demand to the grid, meaning that the cost of the buildout, even though it's not something that's being requested by consumers, gets shared and passed on to those consumers.

And so when I argue that electricity is potentially AI's biggest problem, it's not just because it's going to be a constraint on the ability to get compute online, but because of the political implications. Robert Reich recently tweeted, "A nationwide backlash to AI data centers is brewing, and for good reason. While AI enriches big tech CEOs and props up the stock market, data centers are sucking up communities' water and power. When the AI bubble bursts, the rest of us will be stuck holding the bag." Meanwhile, in seeming proof of horseshoe theory, which holds that the far left and the far right eventually come together in the same position, ZeroHedge tweeted:
"The data centers and AI giants are making billions as they drain the power grid dry and get indirect consumer subsidies in the form of 100% bill increases." Speaking more for the average person, investor and entrepreneur Nick Huber writes, "AI is going to go down as a disaster of colossal scale. My electricity bill in Athens, Georgia is up 60% since 2023. Six increases in the last 24 months just approved. 20-plus data centers under construction in the region. Quality of life is dropping for 99% of people."

And increasingly, this is not just a battle that's happening on social media, but one that's starting to come out into the real world. Pima County, Arizona recently blocked Amazon's Project Blue data center. In that region, residents saw a 14% rate increase this year. And while the local electricity supplier said it had nothing to do with Project Blue, but was rather from an infrastructure investment that had already been made in 2024, AI was the easy scapegoat. Last month in Indiana, Google withdrew a proposal for a 468-acre data center project in advance of a planned vote by the Indianapolis City-County Council, which was expected to deny their application. In Wisconsin, Microsoft canceled their Project Nova data center plans due to community opposition. In a statement, Microsoft said, "Based on the community feedback we heard, we have chosen not to move forward with this site. We remain committed to investing in southeast Wisconsin and look forward to working with the Village of Caledonia and Racine County leaders to identify a site that aligns with community priorities and our long-term development goals."

Earlier this year, Data Center Watch published a report claiming that $64 billion worth of American data center projects had been impacted or were in some way threatened based on community and grassroots opposition. And this was six months before all of these big deals were starting to be announced. So what are the possible answers here?
Well, one is that in some cases, tech companies are effectively just building their own power plants. A piece last week in the Wall Street Journal was all about the new tech strategy of bring-your-own-power. The piece reads, "Most tech titans would be happy to trade their DIY sourcing for the ability to plug into the electric grid. But supply chain snarls and permitting challenges are complicating everything, and the US isn't building transmission infrastructure or power plants fast enough to meet the sudden surge in demand for electricity."

You're also starting to see bills show up in local legislatures. Last week, More Perfect Union tweeted, "In New Jersey, a bill has been introduced to make sure data centers pay for electricity they use. This power surcharge would go towards modernizing the state's electric grid." In August, Oregon passed a similar bill that effectively required data centers to pay for the strain that they put on the grid, without those costs having to get passed on to the consumer. Said Gartner analyst Bob Johnson, "The homeowners shouldn't have to pay for data centers, but that's not built into the pricing structure."

Now, this strikes me as something that would be likely to have a lot of bipartisan consensus. To the extent that there is a random arbitrage of the quirks of how the current system works, whereby individual companies that are meaningfully increasing demand don't actually have to pay for the buildout to serve that demand, instead socializing the cost to others, that seems like a completely untenable situation that is absolutely doomed to create hostility and animosity. I tend to think, though, that even beyond finding ways to close those loopholes and have companies be more on the hook for the actual cost of the electricity infrastructure buildout, we're likely to see solutions go even farther.
Palihapitiya again writes, "The simple solution for hyperscalers is as follows. Option A: agree to a higher base rate with the utilities so that you can guarantee people in the local geography won't see increased electricity rates. Option B: agree to pay for residential solar and storage for local citizens so they won't see increased electricity rates. Either way, if the hyperscalers don't use their gobs of free cash flow to cushion the inflation of electricity rates, you should expect to see a lot more pushback."

I would expand this even farther. One of the things that makes AI unique in the historical patterns of creative destruction is that there is actual creation happening on the front side, before the destruction. What I mean by that is that what typically happens when a new technology paradigm comes in is that the first thing we see is the destruction: the jobs that get displaced and automated away, and the economic fallout that comes from that. In the case of AI, because of the need for this massive infrastructure buildout, the rapid modernization of the entire US electrical grid, plus the construction of new plants on both the electricity and the data center side, there is an absolute boatload of new jobs and new professions to be built simply surrounding that. It seems like an incredible opportunity for the companies who are on the ground doing that buildout to find ways for this to be the best thing that ever happened to the communities that they're in. I think that right now, those companies and everyone else that flows downstream from them need to be rapidly increasing their attention to and their consideration of the communities that surround and are going to deal with the externalities of that buildout. And frankly, they should not be thinking about it simply as PR and crisis comms, but as an actual chance to be incredibly meaningful and value-additive in the short term.
Even as the AI future that we're all excited about gets built out, to do anything less than that is just an unconscionable lack of imagination, and I can guarantee it will cost more in the long run than just about anything that they could do to engage with communities and get them on board in the short run. Now, obviously, this show is more about the practical than the macro, but I do still want to make sure that you guys have a broad-based understanding of everything that's happening surrounding this industry. Hopefully this gave you a little bit of a better sense of what's happening in and around electricity and the data center buildout. For now, that's going to do it for the AI Daily Brief. Appreciate you listening or watching, as always. And until next time, peace.
Episode: Why Electricity is AI's Biggest Problem
Host: Nathaniel Whittemore (NLW)
Date: October 24, 2025
In this episode, Nathaniel Whittemore (“NLW”) explores a critical yet often overlooked challenge facing the expansion of artificial intelligence: the growing crisis in electricity supply and infrastructure. NLW examines how the unprecedented demands of AI development, particularly through data centers, are colliding with aging electrical grids, slow capacity growth, and rising consumer costs. The episode also touches on structural industry responses, burgeoning political backlash, and the potential need for new business and policy approaches.
NLW’s approach is direct, informed, nuanced, and occasionally wry—he balances technical analysis with concern for broader social and political impacts. The lesson: AI innovation can only proceed at speed if the electricity and community challenges are met with imagination, investment, and genuine partnership, or else a backlash is inevitable.
This episode delivers a comprehensive breakdown of why electricity and grid infrastructure may be the limiting factor for the AI revolution—not just technically, but socially and politically as well. If you're invested in AI’s future—whether as a technologist, policymaker, or community member—understanding this intersection is increasingly critical.