
B
More or less.
C
Is it no or is it yes?
B
We'll debate the tech that's best when we get more. More or less.
C
Dave and Brit, plus Ham and Jess,
B
put it all right to the test. More or less.
C
Why?
B
Hello, Morins. Welcome to More or Less: Jessica, Dave and Brit edition.
C
Hey, Jess, I like that you introduced the Morins in your first line instead of the listeners. It's hello, listeners too, right?
B
No, I'm saying hello to you. Okay. No, it's good to be back.
A
Here we are.
B
The listeners are here for me to say hello to you. For better or worse. That's what they're here for.
C
I think what you're trying to point out is that there is no other Lessin on the podcast.
B
No other Lessin. The other Lessin is traveling, which doesn't matter. It matters somewhat. But, guys, it did leave me an opportunity to rejigger my entire closet to a) give more space for me, but also b) get more space from Sam. And viewers of this podcast on YouTube will notice my finger is taped, because this went somewhat awry. You know, all suburban weekends begin with: do I need to go to the ER? No, I'll go to Home Depot instead. So that's basically what I've been doing. I think that's good. Yeah.
C
Time out. So, first question. You removed clothes from Sam while he was gone to get more space, AKA his.
B
I don't think my marriage would survive that. No. What I did is, he has a small portion of the closet. I decided to swap his small portion with another small portion, so that all my things are together and all his things are together. Continuity.
C
You want continuity of your space?
B
Yes.
C
Okay.
B
In order to make the place I relocated him to sort of hospitable, I decided to add more shelving to it so that I can argue he's getting
A
the old more-shelving trick.
B
Correct.
C
And that's why you went to Home Depot.
B
And that's why I went to Home Depot because I needed more shelves.
A
I respect this, Jesse.
C
Me too. I'm a fan of a good DIY project, Jess.
B
So right now, we purchased the right melamine, but they would not cut it for me. So I have to go to the lumberyard sometime between now and Saturday. This is even more advanced.
A
My respect is going up for you by the sentence right now.
C
Is your AI agent teaching you what to do and walking you through every step?
B
There's been no AI agent. I even used a real measuring tape. Not my iPhone Measure. I used measuring tape.
A
I love everything about this.
B
The finger's healing. And next week on the podcast we can ask Sam how he feels about his.
C
When Dave goes out of town,
A
the best thing ever.
C
I actually use that opportunity to just get rid of a few things that he won't remember.
B
I have discarded no clothing. I have discarded no clothing.
A
Yeah, you don't want to get rid of Sam's clothing because Sam's been wearing the same thing for 20 years and that could be catastrophic.
C
Sam had a major style upgrade. I saw some photos recently. He's popping collars. He's all about the button-ups these days.
A
He's like looking.
B
Did you guys see the double collar? He double-collared. The world was confused. The world was confused.
A
Well, anyway, he pulled a Sam. He pulled a Sam Altman.
B
He did, yes. You know the reference.
A
Yeah, that's a deep cut.
B
He pulled a Sam Altman.
C
I don't get why Sam Altman has double collars.
A
No, it's an old, it's a very old like 20 years ago WWDC story.
B
Yeah, you. That's a deep cut.
A
Gotta be around. Let's get into this. Let's get into the current things.
B
No, we will. We've got this trial, the trial of the whatever, raging ahead. OpenAI, Elon. News here by the second.
A
Is it really though?
B
Yeah.
A
Okay. All right, well, what else are we going to talk about? I won't jump in. Tell us what else we're going to talk about.
B
Well, we're going to jump into that. We also need to talk about the Met Gala. Not from a fashion point of view, because that is not why you're tuned into this pod, obviously, but for the anti-tech sentiment.
A
Sometimes you guys have good fashion.
C
I think Jess and I make good comments about fashion.
A
You do, but that's not why people log into this pod.
C
No, it's fine.
B
So we really need to talk about the knives being out for all tech people, because I think it's important. I'm not trying to dismiss this. I think this is very important. Also, you know, guys, the enemy of my enemy is my cloud computing provider, because we have seen perhaps the strangest AI compute deal to date, which is Anthropic and Colossus. So SpaceX, Elon Musk, have given over their first Colossus project.
C
100%.
B
100% Colossus.
C
100%.
B
Dave, did you wake up this morning?
A
Oh no, I've been on calls all day.
C
Dave's very busy right now, I can attest.
B
And it is fascinating and clearly all these other dynamics, including this trial I think help explain how this happened.
A
Yeah, no question.
B
And we should talk about compute, because we also published an interesting chart in The Information that I think is my most-commented-on tweet ever in history. So compute, compute, compute. But let's start with the Met Gala. Brit, Dave, what are your thoughts?
C
I saw a lot of venture capitalists that I know there and I was like, what's happening?
A
And I'm just confused. What is the Met Gala for?
C
Well, Lauren Sánchez Bezos and Jeff Bezos are not, in general, sponsors.
A
I don't really care about the Met.
B
It's actually to support the Costume Institute of the Met. It's not even to support the core Met, guys. It is specifically to support the Costume Institute. But it's become both a fashion event and the socialite gathering of the year. And for years and years there has been a parade of tech CEOs sponsoring tables, getting involved. TikTok sponsored, Instagram sponsored. Right?
A
Yeah, but that's because those were the ones that were relevant. I always viewed it as: those that were relevant were involved, and there was some semblance of editorial and taste, et cetera, involved. So, you know, seeing our dear friend Adam Mosseri doing it, with Instagram directly in the middle of the cultural conversation, right? That made sense to me. It seems to me like there's a lot of just random, I don't know. Like, did the editorial piece come off? Is that not.
B
Is it really? It's a fundraiser, right?
A
And like is it just money now?
B
I think it's always sort of been just money, but with a sort of Anna Wintour curation layer on top. But the big news this year was that among the honorary co-hosts were the Bezoses, and specifically Lauren. And this led, it seemed, to protests externally and, behind the scenes, a lot of consternation internally, people refusing to come. This drumbeat of Tech Lash, you know, is important to talk about.
A
That's not Tech Lash, Jess. That's like.
C
I think it's Tech Lash. I agree with Jess, that's.
A
I think it's Wealth Lash.
B
Well, I think it's all of it, Dave. And this is actually what I wanted to talk to you guys about, because I think we're seeing a crescendo and a convergence in a way that is probably quite consequential for these tech companies. So some of it's directly tied to layoffs and AI. And so you see this commentary comparing the estimated price of Lauren's dress to how many Amazon workers' jobs could be saved, or something. Obviously ridiculous at that level, but it gets at an issue in the public perception of tech in this moment that I think is very significant. So you've got, sort of, AI and job displacement. Then you just have what to me seems like more standard hate-the-rich kind of things that you could lob against all of them.
C
But it, I would argue, is on the rise, especially with things like gas prices and everything else eating the economy right now, where I feel like the bifurcation of the wealthy and unwealthy is continuing to spread.
B
I think this is, I mean, it's a hard conversation to have at such a high level. But I just can't imagine, I mean, God forbid, we're seeing the violence against Sam Altman. I just think that this is boiling over in kind of a big way. And it's gotta be something that these tech companies are taking kind of serious looks at. But at the same time.
A
You mean big tech companies?
B
Yes, big tech as opposed to startups or what? What do you.
A
Yeah, I mean, the scale that we're dealing with. I mean, you know, Jeff Bezos is a hundred-billionaire, right? Like, this is a totally different scale.
B
Could be a trillionaire. Guys.
A
Yeah, we actually don't do justice to this by talking about it all as one thing. The scales are all very different, as we try to point out on this pod. Amazon and Meta and Apple and, you know, the Mag 7, Mag 8, whatever. The big, big tech companies are very different animals than all the startups. Even the mid-tier tech stuff is just so much smaller, and so.
B
So, Dave, what do you think, though? I think we're at a point where, especially with AI, it could all get washed up in the anti-tech sentiment. I mean, the other major bit of news so far this week. I'm gonna get your take, but I'm gonna give you one more data point first. This is what I do. The headlines: we are now seeing the Trump administration seriously considering essentially taking a Biden approach to AI regulation, which was much more hands-on, which contemplated reviewing models and training ahead of time. Now, you guys will remember that the Biden administration's stance on that was what pushed Marc Andreessen and Ben Horowitz, along with crypto, into the arms of Trump. The Trump position has been: we want to put a moratorium on AI regulation. And now, in the snap of a finger, we're seeing contemplated, to be clear, not made into executive-order territory yet, a whole set of AI regulations that would really represent a sea change in the administration's approach. I think there are many reasons potentially behind that, but one of which has to be that they're taking the temperature on the broader sentiment around AI and what they believe the public wants from a regulatory point of view. I mean, the technology itself is probably also shaping the change in that approach. But Dave, what's your take on this?
A
I mean, I'm mostly hearing, at least in the back channels that I'm involved in, that that is fake news, that there's not really. You know, you look at Sacks and these folks that are working on this stuff in the White House. This is not how they view things. They view it as: this is American capitalism. Like, we should, of course.
B
But Sacks has a different job now. He's still informally an advisor through PCAST, but he is not the AI czar anymore.
C
There's no AI czar, right? Like, who is the AI czar?
B
I think Sriram would say he's the AI czar. I don't know.
A
I think that there's a lot of games being played that are unrelated to what the White House may or may not think, and that, you know, Anthropic's out there FUDing all kinds of things, OpenAI is too. I mean, there's just a lot of narrative games going on that are self-interested. And, you know, I worry about this because America, like, what am I seeing on the OpenClaw front? I'm going in reverse order on your things, Jess. Like, no, no, America has absolutely.
B
There really is no order. I just try. There's no order.
A
Jess has absolutely no. Or, I'm sorry, America has absolutely no take on open-source AI models. Okay? Like, we don't have them. Meanwhile, China is printing them on printing presses. Every single founder, every company I know, everyone I am talking to about OpenClaw, is trying to figure out how to use Chinese models to make their AI pipelines more efficient, by the day. Because, you know what? I can now run. Like, I have this amazing little Dell box on my desk now that runs 120-billion-parameter models with a Blackwell, and it's actually pretty good. Like, it runs OpenClaw really, really well.
B
And wait, I have to pause. You're using a Dell box instead of an Apple box? And your name is Dave Morin. Morin, comma, Dave.
C
It was, it was. It was a gift. It was a gift.
A
Apple doesn't make a. Look at this. This thing is the size of a Mac Mini and it runs an Nvidia Blackwell. Like, a GB10. It has 128 gigs of RAM. You can run pretty awesome inference loads on it. It's pretty. It's pretty amazing.
B
Oh, I wish Sam were here to see this. How times have changed. I can't wait to tell him.
A
I mean, it's the thing we've been talking about on the pod for like two years. I now can run pretty powerful inference with pretty smart models on my desk. And, you know, this thing costs $5,000. It's not that out of reach now. The point is, I think you've got a big battleground playing out, right? You've got Chinese models and open source coming really, really strong. The American answer to that has been: we are going to let American capitalism create Anthropic and OpenAI, and maybe Grok. Looks like now Grok is out the door, and maybe Colossus is really the only thing that comes of it. And Colossus will run Anthropic's workloads, and that's great for xAI.
B
No, xAI gets Colossus 2. There are two Colossuses now, so they're not giving up on xAI.
A
Anthropic gets the hand-me-downs. You know, Anthropic's like the little brother getting the hand-me-downs.
B
Yeah, I think Colossus 1 is fine. But yes, I don't think it's like,
A
Okay, I don't know. I don't have much more to say on that. But I want to come back around to your first point, which is around this inequality question. My new take on this is that there are around 50 million or so, I don't know, if you go on Wikipedia or you look at any number, there are around 50 million software engineers in the world, and yet there are 6 billion people that use the Internet, that use software, in the world. And so up until this point in human history, only 50 million people knew how to create software and therefore create software wealth. And what is coming with OpenClaw and Claude Code and Codex, these are tools that are only used today by single-digit millions of people, right? There are like 4 million people using Codex, some similar number on Claude Code, probably about the same. And yet anyone can create software now. And I know I'm kind of a broken record on this on the pod, but I think there's a likelihood that the inequality that has been driven by software entrepreneurship up until this point is at kind of a local maximum. And because today anyone in the world can create software at a pace that was impossible six months ago, you can speak English to create software and therefore create software-style wealth, I think this is a totally new pathway for wealth, and we might actually see a reduction in inequality because of AI over the next 20 years. And that, to me, is the optimistic take.
B
Yeah, I follow the logic of that take. I just don't see it happening. I mean, maybe. I also think anyone who says they know what's going to happen doesn't; the future is unclear. But what I worry about is that these tools, I mean, you were just talking about open source, are going to democratize the tools, but not the economics. That is inherently deflationary. And also, markets tend towards, like, tech has a sort of consolidating power as well as a decentralizing power. Look at how many search engines there were; look at how many search engines there are now. Look at how many social networks there were; look at how many social networks there are now. I completely follow that the AI revolution is different and is fundamentally unleashing very powerful tools far wider than some of these other things. But I think technology does end up consolidating the economics, the control, and other things as well. And that's what we've seen in other waves. And the reality, I guess, like,
A
I think that I'm not focused on anything other than the software engineering piece. Right? The idea that only a small. I mean, we're talking.
B
Look, you used to be able to make a lot of money, relatively, by being a journalist, and now everyone's a content creator, but journalists overall, except at the extremes, are not very well paid. I mean, that's a bad analogy. I don't know that there were ever any super, super rich journalists. But just because you've democratized access to the medium, it doesn't mean you've democratized access to the economics.
A
Yeah, but I think the economics were completely, they were absolutely unattainable unless you could speak these esoteric software languages. And you had to also be able to speak the language of capital and raise huge amounts of capital to put enough engineers in a room to build a software system that could capture any economics. Right? And so I guess that's the thing here, is that the doomer narrative. I guess I refuse to let the doomer narrative take over our podcast, because I believe that the doomer narrative comes from a specific.
B
I'm not a doomer. I.
A
You know, it's a specific subset of Silicon Valley that, like, I think there's an optimistic take here too, which is that now everyone can make software.
B
And now I just, I think there's a widening. It's hard for me to see anything but widening inequality from this vantage point, knowing that that's such a slim lens because so much is unclear. But Brit, what. What's your take on this?
C
Well, I continue to believe that what we've seen is true. And if you use the analogy that, Jess, you and I went through, which is journalism to the creator economy. Still going through it.
B
Still going through. That still here.
A
Well, no, guys, it was media companies to the creator economy.
C
Media companies to the creator economy. Fine. Jess and I have both run media companies, and I still believe that there are great content people in this world, despite the fact that Vox is now getting acquired by Rupert Murdoch. We can talk about that later.
B
Not by Rupert Murdoch.
A
Did that happen today?
B
Guys, I thought there was a Rupert. Which was it? James Murdoch? Kennedy?
C
A different Murdoch.
B
You've got the wrong Murdoch.
C
We'll go back and talk about the Vox news. I clearly read a headline and not the story.
B
Honestly, this Murdoch believes in climate change and actually is devoted to.
C
Okay, so we're happy. Great.
B
We are so excited about this. I hope the deal gets.
C
There's still a high level of quality that can get paid and have a great job in this industry, and I think the same will be true for software engineers. At the same time, there's now a new trillion-dollar economy for everyone else that wasn't a content creator before, that didn't know how to write an article or make a video, and now can, and does, and has profited off of that. And so, to your point, I think you're both right.
B
Yes, but to what level of profiting?
C
Right? Like, I'm not saying they're gonna have venture-backed companies, but they're going to make maybe hundreds, thousands, tens of thousands, millions of dollars, some of them, on a spectrum. Right? And so I think that's net positive and optimistic for people across the world to think about, people who never knew how to write software, to turn an idea into a company or an app.
A
They've been extracted from by people who could create software. Like, I think this is really important to just point out.
B
Every other week we have this debate. I want to have the same debate, but in the context of public sentiment towards all of this right now. Right? So take away what tech people are saying, Dave, and the doomerism and all of that, with its multiple.
A
Right, like the doomer. The doomer agenda has become popular culture.
B
Okay, but what do we do about that? Or what do you guys do about that? Because that's a reality today.
C
Once people start realizing, once the everyday person in Texas that I grew up with is like, whoa, I made a thousand dollars from this random AI thing I created. Which I don't think is gonna happen today, but maybe a year from now. And that sentiment's gonna spread, and people are gonna be like, whoa. And by the way, Dave and I did an hour-long meeting with our head of school this week. Every time we do conversations with normies, like the head of school or whatever, they're like, what? I can do all this stuff? But all the parents hate it, and everyone's disputing how we should use AI and how we should do all these things. But also, I could run my school more efficiently.
B
So you think the path out is education?
A
Yes.
C
I'm just saying, like people have not even started using the most powerful parts of AI yet. And once they have and once they realize how much more efficient they can make their business or they can start a new business or they can make their life, I think the sentiment will naturally change.
A
Yeah. Not just education. We actually have to have better products, right? Like I keep telling everyone: it's 1984 and the computer was just invented. It's 1997 and websites are a thing. It's 2007 and mobile apps are now a thing, right? And at the beginning of all of these things, nobody had the slightest clue what all of those new objects were going to be. It turns out there were going to be billions of computers made, there were going to be billions of websites, there were going to be billions of apps. And we're right there, right now, on AI agents, and nobody has the slightest clue. Even though everybody in town keeps talking about how the future is already here, everybody's already like, AGI is already happening. It's like, dude, no one is using this stuff yet. Single-digit millions of people are using this stuff. And the only thing between here and there, like, more people using this, is going to be better user experiences, better design, better products. We haven't cracked it yet. ChatGPT itself still isn't beyond 800 million people, and it's declining in growth right now. And so we have to make better products in order to get this vision to the rest of the world.
B
ChatGPT is declining in growth? It's actually losing users on MAU, or it's growing more slowly?
A
MAU looks like it's growing slower on all the charts I've looked at.
B
But growing slower isn't losing users. It's just growing more slowly.
C
It's still growing slowly. I don't know, declining growth.
B
I don't think it's losing users. I think it's growing, mostly. Look, I think this will be a very interesting conversation to come back to in 24 months, because, again, everything you guys are saying makes sense and I can chart the logic in it. I also think people aren't going to buy it. And remember, layoffs are a big reason.
A
You mean like the public sentiment?
B
Yeah, yeah, the public sentiment. And I think layoffs. Look, Coinbase just had layoffs. Meta has had recent layoffs, and came out and said, we can't confirm that there won't be more, and we don't actually know the optimal size of this company. That is corporate messaging that is telling people to batten down the hatches for more layoffs, right?
A
It's all post-pandemic. Like, you know, Jess, you and I are in similar circles. Everybody is doing this because they overhired in the pandemic. AI is a convenient.
B
Now they're doing it because they're spending $200 billion on compute as well.
C
I think because they also, they can do things more efficiently. For real.
B
Wait, they can do things more efficiently? I mean, I think it's both, right? I mean, Dave, think of the things you could do with your agents, right? It doesn't mean you're gonna stop hiring people, but it does mean that there are certain people.
A
I have to, like, hire more people. That's what I don't understand about this whole narrative. Everyone I know working in this stuff is more busy than they've ever been. What drives my whole passion around this is that everyone I know at the heart of the AI conversation and at the heart of AI innovation is working harder and more than they've ever worked in their entire life, to the point where they're not sleeping well, their health isn't as good. Like, this stuff is not making us more efficient.
C
Well, I also think there's like a dopamine hit this thing gives you. And they are obsessed with it. Yeah.
B
Look, the token-maxing population is not the population we're talking about right now. Right?
A
And, but some version of that's coming. Like, everybody else is also going to experience this, right? Like tokens are going to come and become part of your job.
B
They're going to experience something. But I think it's really hard to know exactly what, to use a cheesy word, the diffusion of all of this is going to look like. I think it is so early that it's not easy to extrapolate, and this is why you see wildly different projections.
A
And I'm open to that. Maybe I'm totally disconnected. Yeah, maybe I have no idea. To the listener: hit me on Twitter if we sound completely and totally disconnected from what's actually going on.
B
People will tell us in the YouTube comments if, and how, we're totally disconnected. But, you know, I've talked about starting to get some policy proposals around how to handle this out there. And I think, you know, when OpenAI bought TPPN, that, you know, fill-in-the-blank acquisition of two months ago, changing the public perception of AI was one of the stated reasons behind it.
A
Yeah, tell better stories. These guys have only used the doomer kind of message. We've talked about it at length on this pod. It's like a Christian theological doom: we are creating a God, you should fear it. They've used this one version of a message. It's actually ancient spiritual technology. They used literally Christian theology to sell this thing to the world and scare everyone. And now they're stuck. They're like, ah, AI is less popular than ICE. What do I do? I have to tell a much better story, right? I have to give people the optimistic take. And so you're.
B
If people are going to feel like it's coming for their jobs or their kids' jobs, it's going to be very hard, as narrative-driven as the world is, to come up with a narrative that's going to change that. Now, Dave, you're right. In the past, the product itself has closed the gap, right? Google ate all the world's information until it was super convenient, and then it was Google and you could never live without it, right? Or name your privacy concern; that went away with some, you know, new convenience. But it's just, it's crescendoing.
A
I feel like I'm the voice of Sam here on this, or something like that. It's, you know, we're the product. No, the policy thing. You just mentioned policy, right? Policy is so far behind. We can't count on policy to help with any of this. If we're sort of staying up at this level, like, what is the conversation amongst all of the citizens of the United States and the world, policy is not going to help, right? The only thing that's going to help is better products. And I guess that's kind of the breakdown in ideologies, right? Some people believe you need to slow it down and stop it, and be Bernie or whatever, and policy. Some people are technology maximalists, and they think if we can make better technology, then maybe that will solve the problem. And that's the great debate.
C
But don't you think speed is a big, or the biggest, part of this? Because the search era and the mobile era, those were both radical changes, even the PC era, for people, for their jobs, for how they lived, for how they worked. But they happened over so many years that I think people organically adapted, and companies adapted. And right now we're in this crunch period where everyone's racing at the speed of light to get to the next model. And so everyone needs to keep up, and every company needs to keep up, and we're losing jobs faster, and everyone needs to pivot faster. And there's this amount of stress and anxiety.
A
Who is everyone?
C
Like, companies across the country, schools, literally everyone. I mean, you can't say this isn't happening across the country.
B
It is.
C
The literal negative sentiment is because people are afraid of what AI means for them, for their kids, for their jobs, for all of this stuff, and how it's going to take over their lives. And I wonder if it was slowed down, and all of this was happening over a five- or ten-year time horizon. But I think that's what the government's trying to do.
A
No, they're not, Brit. Nobody's trying to do anything.
B
Guys, DeepSeek is raising like a gargantuan amount of money.
C
They're from China. Our government is like, we should put the brakes on some of this stuff. No, I know, but I just think the human sentiment of change is not meant to move this fast. Even high-tech, or sorry, high-growth tech startups are freaking out, like, oh, we have to change, we have to do layoffs.
A
Humans don't like change in general.
B
I think it's possible. I don't. I'm going to try this out; I don't know if I believe it. I think it's possible it's not actually moving this fast.
A
That's what I'm saying. It's not.
B
The narrative, including the role of the press broadly and the role of commentators and of everyone, is, you're right, Dave. They're piling into the doomer narrative, or they're piling into the diffusion-is-here-tomorrow narrative. Right? And what's happening. But it's a hard line, because you do need to. My experience is you have to somewhat evangelize this stuff to get experimentation going.
A
Of course, of course.
B
And so. But then it's like, you've been AI-pilled. It's like, no, I haven't been AI-pilled. But I did, like, you gotta learn about this.
A
Yeah, totally. But look at the numbers. Just look at the numbers. That's what I'm trying to do: in every conversation I'm in, I'm just grounding people in the numbers. There are only 4 million people using Codex. That's like no one. Literally no one is using AI engineering tools, no one, in the grand context of the world. It's gotta be the same number, right? If you just look at the numbers, there are around 10 million software engineers that build iOS and Android apps. So the population of developers in the world is somewhere in the ballpark of 10 million. And Codex's last number was 4 million. My guess is that Claude Code's is the same-ish, you know, single-digit millions. And so that's awesome. I love this stuff. I'm using it every day. It's so much fun. It's a new paintbrush that you can paint amazing new paintings with, right? And it's so fun, and it's a great new technology. But the numbers are so tiny. So tiny. And I get it, the revenue numbers are enormous, but the numbers are still so small in terms of how far this is diffused into society. Like, it's so cool.
B
Well, I want to transition this to the compute conversation, just some new developments there, because this is the other side of the coin. But Dave, I also think it's interesting to point out that you're making this point that so few people are using it that we don't know, while you and everyone you know are going crazy with this stuff. It's comments like that that have people extrapolating.
A
Well, that's what I think. We've well articulated it on the pod, that there's no good business in AI. I think people are sick of us talking about it this way on the pod. I don't think we all
B
think there's no good business in the AI business.
A
Well, I mean, I think we've repeatedly talked about this: the AI business is a terrible business. Why is that? Every technology cycle has an infrastructure investment cycle. If you just look at the level of investment at the beginning of the web era, the mobile era, the cloud era, there's always this huge investment in infrastructure, right? In this case it's been in compute, which is a great thing to switch to next. But it's usually many years after the infrastructure investment, once the application era starts, that all of the value gets created, and we're just not even close to being there yet. That's what I'm trying to point out: these AI harnesses like Claude Code, like openclaw, like Codex, they're very small. But what they look like to me is the future of what the applications on top of the AI infrastructure are going to look like in 10 years, and nobody's started building them yet. It's all pretty new, we're figuring this out, and there's a huge amount of business to be made around how you build stuff to use the AI infrastructure better and make it more usable to people. Because it's still very hard to use AI inference. In your job, there has to be a tool for you to use the inference to do the job you're doing. That requires user interface and design and well-thought-out tools, and a lot of that is all really new. Most people just use AI to get information today; they don't use the inference to do their job. That's going to be the next 20 years, and that takes a long time, because a tool needs to be made for everybody's job, or their new job, and then they need to learn how to use it. And humans don't metabolize things very fast. So I buy your take that it's much slower than people think and the narrative is way out in
B
front. Which is also... So we don't have to belabor compute, which I know is not everyone's favorite topic, but two important things happened. One, let me just share this chart. The Information had some reporting about Anthropic, which has been behind in compute capacity.
A
Yeah, it's clear. They made a massive strategic mistake.
B
They made a massive strategic mistake. They seem to have remedied it pretty quickly.
C
I think they've remedied it really quickly
B
and at great expense.
A
I don't know about that. What are the actual numbers?
C
Yeah, I mean, they've got Google and now Colossus, right?
A
Yeah, but Brit, this is probably costing them a fortune. Like this is a massive mistake.
C
I mean 200 billion, whoever is the
A
CFO or did the planning is like, I don't know, it's just Krishna Rao,
B
smart guy, they made a mistake, look.
A
Yeah, but this was a serious mistake.
B
It's a serious mistake. It will cost them. I think Dario said at their conference today that their revenue's up 80x from where they thought it would be. They're doing just fine. Okay, so here is an interesting thing. The Information broke the news about the size of Anthropic's latest Google Cloud commitment, at 200 billion. And then, from the earnings we just had, we charted the backlogs in deals for Microsoft, Oracle, Google and Amazon: what percentage are OpenAI's commitment and Anthropic's commitment? Together they're about half. On one hand this is not surprising; who else are the really big customers of compute at this moment? But it's also really striking how much of the growth happening in these multi-trillion-dollar hyperscalers is coming from the spending
C
from the private markets, basically from venture-backed companies.
A
Yeah, but am I reading this right? That OpenAI was just so much smarter? Like the scale of what they were doing was so far beyond Anthropic.
C
And why hasn't OpenAI invested in Google?
B
Well, so my other take is, these guys just view themselves as too competitive. But when they put that aside... OpenAI said, yeah, but how is Anthropic
C
not competitive to Google?
B
OpenAI thinks it's competing with everyone. But the other thing, by the way (and Dave, I'll circle back to your question), is that this chart just shows the money invested.
C
Right.
B
So this is for those who believe that this is a circular economy. Microsoft is investing in Anthropic, and it's going right back up there. We should have put this in one chart, team; we have it in other graphics. Right back up here. So, Dave. How do I stop sharing? Okay. Yeah, Anthropic screwed up on compute, and I think they will fix it at great cost. But you know, after striking the Colossus deal they instantly rolled back some of the limits, they instantly upped the capacity. I actually think they resolved it pretty quickly. And I don't think you can fault companies for having even more exponential growth than the exponential growth they expected. They were obviously too cautious on compute, and that was obviously a mistake, but it's going to be nowhere near existential for them. And I think it's hilarious, because Dario and Elon Musk are kind of the inverse of each other as humans. They really are not similar on any metric except their hatred of Sam Altman right now. They're very different. Dario is starting to be more outspoken, especially around his sort of doomer narratives. It's a good point, Dave, about how to describe it. But he has articulated very different principles around safety, and very different principles around capital, than Musk. While he pushes some very bold narratives on certain topics, he's sort of an understated human being in many ways, minus pushing those narratives. He's built a culture at Anthropic where absolutely no one leaves; xAI is a revolving door of people. They're just very different types of leaders. And they've had very different approaches to regulation: Elon has been like, back off; Dario's been full court press. So I would just say there have not been many common threads in how they operate their companies.
What they've articulated is important, you know, from having followed this space pretty closely. But now we have this megadeal where SpaceX is selling, or renting (I don't know what you'd call it), compute to Anthropic. Elon is tweeting up a storm about all the time he spent with the Anthropic team and what great humans they are and how aligned. You know, the elephant in the room is just OpenAI, and Musk wanting to stick it to them every way possible as this trial goes on. And with the SpaceX IPO, showing any customer demand for your compute matters. And by the way, I don't think it's a coincidence that Anthropic said it would be potentially interested in orbital data centers, which of course is the driving narrative of the SpaceX IPO, and which no one in the industry at all believes is actually a near-term thing. So it seems like they both got some extra juice out of this agreement. But I mean, it makes a lot of sense, and I think it's a very smart move from Elon's perspective, and probably a prescient one on his part to have gone full steam ahead in building all these data centers knowing that this opportunity would present itself at some point. But it is kind of wild. They're very odd bedfellows, is my take. I don't know. What do you guys think?
C
I can't take Elon seriously anymore. And I feel like he's taking the Trump pill of being best friends with someone one day and not the next. And so I don't actually trust that any of these deals are gonna go through, or that he's on one side or the other. These are all characters, as we've talked about before, masquerading. And one month is very different from the next. And I think you're even seeing the same thing with models. We've been so hot on Anthropic for the last few months, but ChatGPT's new model is really awesome, and now there are a lot of people starting to use that again.
B
Did you guys get invited to this party? I didn't get invited to the party. There was a 5/5 party or something, on May 5th.
C
Yeah, yeah.
B
Companies now throw parties for model releases.
C
Oh, please. God.
B
This seems very something. Okay, sorry, Brit, I'm digressing. So you like the new ChatGPT model?
C
Yes, and I still use all of them. And I'm still like, I just think we're so early in this and you know, Sam would say they're all just gonna go to the same place and be commoditized, but I actually think we're gonna be flip flopping back and forth for the next many months and we'll see who wins out. But, you know, everyone's just getting their compute infrastructure, everything in place and trying to build as fast as they can. Meanwhile, Google has 3 billion users with like registered accounts. So I'm still not, I'm still like long Google here.
B
This is a cycle where Google goes slightly quiet, and then it's I/O, and we'll see.
C
Yeah, it's two weeks from now. Google's gonna come out with a bang, and
B
we just sort of don't know. Have you guys been following the trial?
C
No. Can you fill us in?
B
Well, basically, recent days have been Brockman's turn on the stand. As we've talked about, Greg diaried the entire founding of OpenAI: his heart's wants, desires, financial goals, everything. And that is all in evidence, so he's been asked a lot of questions about it. I mean, look, Musk's lawyers are pit bulls, and I think they're actually very good at what they do. I was catching up with my lawyer yesterday and asked, what do you think's gonna happen in the trial? And he was like, Musk has great lawyers. So I don't know. But basically it was a lot of going back to the earliest days of OpenAI with Elon, with Greg, with Sam. It was revealed that Altman, because it was a nonprofit back then, paid Greg $10 million out of his own family office and kept it from Elon. So all these alliances were being formed beneath the surface, and it just seemed like a complete tinderbox ready to explode, which it did. It also came out that Brockman's stake is worth about $30 billion, which is a point he was grilled on: well, why haven't you donated it? All these questions. And it made me realize, I do think Altman, who's very shrewd, was really prescient in not having equity in OpenAI. And again, he can get paid in a lot of different ways. But I was actually the first to report that he didn't, way back when, in all these funding rounds, and it was kind of peculiar to me. But the sense was that he really felt it would be better for everyone, and that it could come back to bite him to have personally made so much money on a particular technology. So we'll see. Mira Murati has been testifying, or her deposition is being played, and so questions about Sam's leadership are coming out. You know, a little more TMZ than substance in some regards.
A
Yeah, I guess that's been my read: everybody thought this was going to be... I don't know which side thought the other was going to be the pariah in the eyes of the public, but it's all kind of a thud, in my view. To go back to the beginning of the episode, Jess, to me this just looks like wealthy people bantering. And it's kind of unfortunate; it doesn't seem to serve the public interest. Maybe I'm misreading it, but it looks to me like kind of a dud trial. But I guess we'll see.
B
Well, it's not.
C
I don't know.
B
I mean, again, I think ultimately Musk has to show that OpenAI violated a contract or, you know, committed fraud. And I do think these lawyers are smart, and all of these things do potentially weigh toward that, but it's a little bit circuitous. And, you know, the jury obviously has a lot of power in this case too. And so you're just trying to convince
C
jurors. Polymarket gives Elon a 41% chance of winning the case.
A
Has it changed at all? I haven't looked at it.
B
It's been pretty stable.
C
It started at like, 56%, but.
A
So he's losing.
C
Yeah, but you have to invert these things, right? Because the betting pool is on the flip side. So, yes.
A
So wait, the title is "Will Elon Musk win his case against Sam Altman," and it dropped.
B
Yeah, Polymarket's betting against him winning.
A
It started out at 55 and we're at 41 right now. It's been all downhill since it started, really.
C
But there's been some bobbling in the last week for sure.
B
OpenAI also revealed in a filing that Musk had approached Brockman about a potential settlement two days before the trial. That doesn't really help his case, although I think that was more body language.
A
Wait, so sorry, what happened?
B
So OpenAI put out a filing saying that a couple days before the trial, at the Breakthrough Prizes (well, they didn't put the venue in the filing, but that's where it was), Musk had approached Brockman about potentially figuring out a solution to this, or something like that. I think the reason they issued a filing is they wanted to be able to ask Brockman about it, so they had to submit it as, I don't know, testimony of some nature. And apparently, according to this OpenAI filing, when Brockman rebuffed him, Musk said, you know, in a couple weeks you two are going to be the most hated people on the planet.
A
And that hasn't happened. So it kind of goes to show how disconnected all of this is from reality. I don't know that anyone cares about this, even people in town. It's like, ah, interesting TMZ stuff.
B
Well, I know. I mean, again, not everyone has to care an equal amount about everything. But there are things at stake here. OpenAI is important. OpenAI's IPO is something that's going to be widely watched, and a lot of people are going to have a clear stake. So I don't think we have to over-index toward it for drama's sake. Although please, someone make the movie. It is happening.
C
Side news, guys: you can actually bet on Polymarket on the number of times Elon Musk will tweet in a given week, and for this week, between May 1 and 8, there's $6 million that's gone into it. The current 48% bracket is 160 to 179 tweets, but the options go all the way up to 500-plus. It's a hilarious Polymarket, on this point.
A
This is maybe a fun pop culture thing. One of the things I've been hearing a lot from smart people I know who are doing interesting things with openclaw is that people are running Polymarket trading operations using several openclaw agents. And the best market to trade is the Twitter-mentions one: whether or not something's going to be mentioned on Twitter.
B
So a market like that is very easily manipulated. Didn't Brian Armstrong...
A
I think that's why it's so good. That's why it's such a good market.
B
It has, like, Cameo-esque vibes. You're just getting people to say things. I don't know.
C
I cameo.
B
We had a Polymarket story today in The Information about the slow start for their US business, so worth checking out for people interested in such topics. What else, Morins? Dave, you want to give us the openclaw update? You're in meetings all day; you're not even looking at your Information news feeds.
A
You must be swamped.
B
What's wrong with you?
C
His claw's delivering the news nightly. That's fine.
B
Are you getting, like, MCP then? You want an MCP deal like I got? We got you.
A
We don't do that. We're way more in depth; we're way more into CLIs at openclaw. Yeah.
B
Can you explain that for me? What's a CLI?
C
Command line interface.
A
Command line interface.
B
Oh yeah, yeah. But you can still MCP. Those are two different things.
C
Similar.
A
The main openclaw update is that we've had a lot of bugs and stability work for the last few weeks. A lot has happened, and a lot of people are using this thing at very large scale. So we've been very focused on stabilizing the core, reducing what is openclaw to only what matters, and putting everything that's sort of third-party, things that a lot of people use, into a plugin architecture, so that the central core is safe, secure, fast, and as bug-free as possible. It's definitely taken us longer than we expected, and there was a rough week last week, but we're at a good place now. I think things are speeding up, people are happy, we're maybe turning the corner. It's been a lot of work, though. We're doing a release almost every two days right now, sometimes multiple times a day.
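The core-plus-plugins split Dave describes can be sketched as a minimal registry: a small, stable core that only knows how to route to whatever handlers are registered. This is a hypothetical illustration of the general pattern, not openclaw's actual architecture; every name in it is made up.

```python
# Minimal core-plus-plugins sketch: the core stays tiny and stable, while
# third-party integrations register themselves as plugins at runtime.
# All names here are hypothetical, not openclaw's real API.
from dataclasses import dataclass, field
from typing import Callable, Dict


@dataclass
class Core:
    """Tiny 'core': it only routes events to registered plugin handlers."""
    plugins: Dict[str, Callable[[str], str]] = field(default_factory=dict)

    def register(self, name: str, handler: Callable[[str], str]) -> None:
        self.plugins[name] = handler

    def dispatch(self, name: str, payload: str) -> str:
        if name not in self.plugins:
            # The core degrades gracefully when a plugin is missing.
            return f"no plugin for {name!r}"
        return self.plugins[name](payload)


core = Core()
core.register("echo", lambda msg: f"echo: {msg}")
print(core.dispatch("echo", "hi"))       # echo: hi
print(core.dispatch("missing", "hi"))    # no plugin for 'missing'
```

The point of the split is that a buggy or slow integration lives behind `register`, so it can break or be removed without touching the dispatch core.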
B
Dave, is OpenAI helping with this too or this is really just the open source network as it's been growing separate from them. I know they have some involvement.
A
It's the open source network. A couple of people from the OpenAI engineering team have been contributing as open source maintainers. Obviously Peter is the largest contributor, and he technically works at OpenAI. And then the other big update, I guess, is the foundation. I've already mentioned it publicly several times, but the 501(c)(3) is set up. We actually have capital coming in the door now, and we've begun onboarding a full-time team to work on the open source core at the foundation, which is great. That's a really important piece of doing a good job on this project given how large the scale is at this point. So we've been working through that. And then on the conversation with OpenAI, we're still figuring out what level they're going to support us at. We burn an enormous amount of tokens on this project. We haven't disclosed publicly how big it is, but the number is very large. So the help of token providers of various kinds is very important to us. We have ongoing conversations with many, many different partners right now, and we're going to get a blog post out in the next couple weeks talking through a lot of that stuff. So I'll save it for that.
B
Did you see the Meta news in The Information? A little bit more about their personal agent plans. I think this was out there to
A
some degree, but yeah, I think it's been out there. You know, at openclaw we've been aware for some time that they have a fork of openclaw, and we expect they'll do something pretty big. So that's cool; it's good to hear they're doing that. And yeah, it's great. You know, we've got... it says end
C
of June I think. Right.
A
We see the same thing. And in China you've got huge deployments: ByteDance, Baidu, Alibaba, they all have forks, and they're all deploying at large scale. So there's a lot of this going on all over the world.
B
My big AI news of the week is I connected my granola to my Claude code.
C
Oh, you're late to that, Jess. I did that six months ago. You know what's cool about it? You can do some damage with that.
B
Tell me. Give me two examples of things I can do now that I've made this connection.
C
Okay. One thing that's obvious: there's a partnerships business that you run at The Information, and probably, like, hot leads in a CRM of some sort. I don't know what software you're using for your CRM, but literally today, within 30 minutes, I imported all my granola notes for the last year and a half, two years, and used the anecdotes in all of them to populate a new CRM for me on the fundraising side of things. And I can now rank... I had Claude, like, using anecdote as
B
like a technical term. Or you're saying you added notes like, we've had a sales meeting. You're pitching me on, you know, okay,
C
you had a sales meeting. Okay. No. One of your sellers talked to Sony a year ago, and out of the side of their mouth they were like, yeah, we really don't have budget this year, but probably mid-'26 is when we're gonna look at these opportunities again. You forgot about it because they weren't on your leads list. Guess what pops up in your CRM now with a flame icon: Sony. Because it's mid-2026, you need to prioritize that, and you have the exact anecdote from those notes.
B
We pay Salesforce a shit ton of money to do that.
C
Well, you don't need to anymore, Salesforce. All of this is happening now. You can custom-create it. You can feed in everyone's granolas every time you have a new meeting with Sony; now you're enriching the data over and over again. Just telling you, it's good.
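The flame-icon idea Brit describes, surfacing a lead once its "revisit later" date arrives, boils down to scanning notes for follow-up dates. A toy sketch, with invented note fields and made-up data standing in for real meeting notes:

```python
# Sketch of the lead-flagging idea: scan meeting notes for "circle back
# later" anecdotes and surface companies whose follow-up date has arrived.
# The note structure and data are invented for illustration.
from datetime import date

notes = [
    {"company": "Sony", "anecdote": "No budget this year; revisit mid-2026",
     "follow_up": date(2026, 6, 1)},
    {"company": "Acme", "anecdote": "Signed last quarter",
     "follow_up": None},
]


def due_leads(notes, today):
    """Return companies whose follow-up date has passed, flame-icon style."""
    return [n["company"] for n in notes
            if n["follow_up"] is not None and n["follow_up"] <= today]


print(due_leads(notes, date(2026, 6, 15)))  # ['Sony']
```

In practice an LLM would do the messy step of extracting the `follow_up` date and anecdote from free-form transcripts; the flagging logic itself stays this simple.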
B
I'm interested in what my team thinks about sharing granolas.
C
They don't have to share all granolas. You can auto save granolas to certain folders. So all of my fundraising meetings go to a folder and that folder can be shared.
A
There are a lot of different ways to do this. Shout out to Gary Tan; he's been doing a bunch of really interesting innovation around this with his GStack and GBrain projects. Jess, you should check out GBrain, and I can take this offline with you. But there are a bunch of cool ways you can embed all of your transcripts, and whatever other data you need for workflows like what Brit was talking about, into vector databases, and then have your claw or your Claude Code be able to recall things, like memory, in a way that's really powerful.
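The embed-and-recall pattern Dave mentions can be illustrated without a real vector database: turn each transcript snippet into a vector, then return the most similar snippet for a query. Here a bag-of-words count stands in for a real embedding model and a plain list stands in for the database; it's a sketch of the idea, not a production setup.

```python
# Toy sketch of transcript memory: "embed" snippets, then recall the most
# similar one for a query via cosine similarity. A real setup would use an
# embedding model and a vector database; bag-of-words stands in for both.
from collections import Counter
from math import sqrt


def embed(text):
    """Bag-of-words 'embedding': word -> count."""
    return Counter(text.lower().split())


def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


memory = [
    "Sony said budget returns mid 2026",
    "Team offsite scheduled for March",
]
index = [(snippet, embed(snippet)) for snippet in memory]


def recall(query):
    """Return the stored snippet most similar to the query."""
    q = embed(query)
    return max(index, key=lambda item: cosine(q, item[1]))[0]


print(recall("when does sony have budget"))  # Sony said budget returns mid 2026
```

Swapping `embed` for a real embedding model and `index` for a vector store changes the quality, not the shape, of this recall loop.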
B
I keep telling it to remember things because Sam taught me that.
A
Yeah. Yeah.
B
I like asking it. What I did today.
A
Yeah, no, that. That's one of the best use cases, for what it's worth. Like, synthesize my.
B
What should I do tomorrow based on what I did today? Like what?
C
You know how everyone's getting their agent to send them a daily briefing in the morning with the top news and the weather and all the things? Well, someone connected this to a receipt machine to make a physical paper they can leave for their kid in the morning on the kitchen table: remember, it's cloudy today, pack your jacket; here are the three things happening this afternoon; remember, your math test is tomorrow.
B
I'm also in favor of talking to children. I also believe that's a good thing.
C
But let's say you're traveling and, you know, you don't want to be the nagging mom on text message reminding your nanny to then tell your kids.
B
Yeah, it's a novelty. Okay, that's good. This is good: my AI homework. Brit, did you see The Devil Wears Prada 2? Lots of journalism shout-outs.
C
I couldn't. I invited eight women to go with me, and everyone flaked. I feel like we live in flake culture now.
B
Well, Brit, we are ships passing in the night because last Thursday, I had a dinner date with one of our friends who stood me up. But she had a good reason.
A
Look at that.
C
This.
A
What is with you guys and all your flake friends?
C
Flake culture. Flake culture. Everyone has a different opportunity.
B
She's an avid POD listener, so there's no. No resentment over this.
A
But I know which one it is.
B
I had a babysitter, and I thought, what am I going to do with this free time? It was the only night I had a babysitter for, like, 10 days. And because we had recorded the pod and you said you were going to go, and I had been seeing all the promo but had no idea when this movie hit theaters, I booked myself a San Mateo Cinemark movie date. I think this is the only time I've actually seen a movie alone in my life, which is odd; I can't think of another one. I had the best time. The theme of the demise of journalism and the layoffs across the industry is the backdrop to this whole thing. Yes, it's corny, yes, it's silly, but it was a delight. I loved every minute.
C
I'm thinking of doing this for Mother's Day solo.
B
You should. I had a great time. I really enjoyed it. It's a nice two hours. And these movie theaters are really nice now. You've probably been more recently than I have; I don't even know when I last saw a movie.
C
Oh, I go to movies a lot. No, yeah, they're great. They have recliners now. Yeah, it's great.
B
It was a great experience, the San Mateo Cinemark, and I have to thank you for the idea. I would have just gone home and missed out on my evening out, so highly recommend. Dave, what's on your list? What's on your pop culture viewing? Step away from the claw.
A
Ella Langley. She's the first country artist since Taylor Swift to cross over. Her song Choosing Texas is number one on the country charts and the Hot 100.
C
She's a big deal.
A
She's great. You know, she wasn't the best that we saw at Stagecoach a couple of weeks ago. She was amazing, but she's clearly young and doesn't have the stage presence of, say, a Lainey Wilson or a Post Malone. She's a rookie. But I think it's pretty rad that she's got this crossover thing happening, which is cool. And she's a great musician, a super great musician. I was also just so wildly impressed with Teddy Swims a couple weeks ago. I've always liked his music, but he was unbelievable in person.
B
Is Taylor Swift doing the Toy Story song?
A
Yeah.
B
What's going on?
C
Oh, yeah. Her team messed up. They accidentally leaked an Easter egg, then retracted it. But it's obvious she has a song, and now she has a credit on the Toy Story movie. It's coming out June 19th or 20th, something like that.
B
Oh, so that countdown was early.
A
Yeah.
C
Yeah.
B
But now we know. I bet everyone's thinking about that. Well, now we're talking about the movie; I wouldn't have otherwise. What Toy Story are we on? Toy Story 5?
C
Five. And Frozen 3 is happening later this year, too.
B
You know, I have watched and enjoyed the first two Frozens.
C
But you know what's not going well? In the spirit of May the Fourth be with you: Star Wars. Star Wars is not going well.
B
What do you mean not going well? It's Star Wars. What are you talking about?
C
Like, all the newer movies are just not landing. The new characters aren't really sticking with people. So the Disney people are having to revive it all.
A
Yeah, there's a rumor that they might split the three sequels off into their own timeline or something like that, because it's been going so badly. The Mandalorian movie bombed at the box office, and so there's potentially this new storyline they have to generate, because everybody just wants Luke Skywalker and the original characters. They don't want all these Reys and all these new things.
B
You know, there's a limit to how long you can stretch a franchise, but if Star Wars can't make it work, I don't know what the prospects are for anyone else.
A
Yeah, I mean, it's kind of a bummer. I'll probably still go see it with the kids. I thought maybe it was going to be good, but it looks like it's bombing pretty badly.
B
There you go. Okay, dear friends. Well, Morins, it's always a pleasure chatting exactly the same topics with you every week, having exactly the same argument. It's like a Twilight Zone: what's going to happen with AI job displacement? Just kidding. We're moving the ball forward, people. We're grateful for all you listeners out there. We do want to hear from you, especially, I think, on calibrating the tech sentiment and what to do about it. So with that, we will say farewell for now. Thanks for listening and watching, and we'll see you back here next week for another episode of More or Less.
C
Bye bye guys.
A
See you later.
C
If you enjoyed this show, please leave us a virtual high five by rating it and reviewing it on Apple Podcasts, Spotify, YouTube, or wherever you get your podcasts. Find more information about each episode in the show notes, and follow us on social media by searching for More or Less, Dave Morin, and Lessin. And as for me, I'm Brit. See you guys next time.
Episode: Why the Met Gala Hates Tech, Elon vs OpenAI Drama, and the Rise of Chinese AI Models
Hosts: Dave Morin, Jessica Lessin, Brit Morin, and (in spirit) Sam Lessin
Date: May 8, 2026
In this lively episode, the More or Less crew dives deep into the growing anti-tech sentiment bubbling up at the 2026 Met Gala, the high-profile legal clash between Elon Musk and OpenAI, and the explosive developments around Chinese open-source AI models. The hosts take on provocative topics: public tech perception, AI’s threat to jobs, venture capital’s compute spending binge, and the shifting landscape of global AI competition. Original insights, friendly debate, and plenty of industry anecdotes fill the hour—plus quick hits on pop culture, product tips, and some fun at each other’s expense.
“No one is using this stuff yet… single digit millions of people are using this stuff. The only thing between here and there is going to be better user experiences, better design, better products.”
— Dave Morin (21:20)
“Some of it’s directly tied to AI and job displacement, then you just have what to me seems like more standard, like hate-the-rich things.”
— Jess Lessin (07:44)
“I now can run pretty powerful inference with pretty smart models on my desk… This thing costs $5,000. It’s not that out of reach.”
— Dave Morin (13:00)
“I can't take Elon seriously anymore. And I feel like he's taking the Trump pill—he's best friends with someone one day and not the next.”
— Brit Morin (39:09)
“Every technology cycle has an infrastructure investment cycle... It’s usually many years after the infrastructure investment that all the value gets created and we’re just not even close to being there yet.”
— Dave Morin (31:35)
The More or Less crew offers an honest, sometimes cheeky, and always informed roundtable on tech culture’s biggest stories. They urge listeners to look past the headlines: AI transformation is real but still early, wealth resentment is more complex than just “techlash,” and the next decade’s breakthroughs are yet to surface. Meanwhile, the compute race and legal gamesmanship suggest this is a 20-year marathon, not a sprint.
Call to Action: The hosts want to hear from listeners about their own take on public tech sentiment and the actual impact unfolding in their worlds.