
A
You know how Google convinced all of the newspapers that they would just make money on the volume and they should give up?
B
I'm familiar with that trade. Yep. Didn't take it.
A
This is exactly the same story once over again with retailers where the chatbot was like, oh, don't worry, just let us front end you and you'll have no brand. But don't worry because you'll make it up on volume.
C
It's like, no, don't fall for it.
B
But everyone falls for it.
A
You know why they'll fall for it? Because in most cases, the people running these companies, like, want to be there for three years and get a short term revenue shot in the arm and then they're going to take their stock options and leave and leave the companies in ruins.
B
More or less? Easy no, or is it yes? We'll debate the tech that's best when we get More or Less. Dave and Brit,
C
Plus Sam and Jess put it all
B
right to the test. More or less. Why? Hello, friends. I missed you. Welcome to More or less. I'm back.
C
How was Switzerland?
B
It was delightful, Dave.
C
I love that place.
B
Thanks for asking. I did. I was expecting. First I thought Davos would be a huge boondoggle and so I never went because I thought it'd be a waste of time and money. Then the scales tipped and I thought it was worth my time. But I. I had a lot of fun. The weather was warm.
C
Oh, no.
B
Running into people on the streets, it felt like South by Southwest.
C
Is the snow as bad as the western United States?
B
Yes.
C
Oh, that's terrible news. Well, although, did you see that there's snow in Russia that's covering up the third story of buildings?
A
Yeah. Although I can't tell if that's AI or not. Some of them look pretty AI to me.
C
Oh, that's a good point.
A
But like, some of the, like snow drifts are like amazing. Like this really feels like AI, but it's cool.
C
I'm like, no, all of the snow went to one place on Earth.
A
Well, also the Northeast. Like my. The place I grew up skiing was Jay Peak. Jay Peak is having like a record season.
C
Should we go to Jay Peak and ski some pow?
A
I think we might should.
B
No, we're not going to Vermont.
A
I don't. The problem is I have no reason to be on the East Coast. If I did, I would totally go just for the novelty of it.
C
Let's go do some New York meetings and go skiing.
A
I mean, to be honest, it would be a pretty. I mean, I almost. It's almost worth doing for the Instagram posts.
C
Yeah, let's do it.
B
Okay. Do it for the gram. Fine.
A
It's like, what we're doing for the gram is we're gonna ski Jay Peak. It's actually a great mountain. I haven't skied it since I was 10, but it's a good one.
B
Okay, we need to tell our dear. I've decided, guys, we don't have listeners anymore. We just have viewers. If you're not experiencing this in video,
C
I don't know about that.
B
I decided this because my favorite podcaster of the moment, Andy Roddick, also decides this. He doesn't talk about his listeners.
C
Andy Rod, tennis player.
B
He has the best tennis podcast there is. It's called Serve.
C
I didn't know that.
B
Racket's podcast is number two. I'm sorry, I have to give props.
C
Oh, wait, so you own this podcast?
B
No.
C
Got it.
B
And he's a great podcaster, and he doesn't refer to his listeners. He just refers to his viewers. So.
C
Viewers? Is that out of habit? Because he's used to having viewers and he just doesn't know listeners.
B
No, nothing is a habit. It's probably out of, like, someone who has a full podcast operation behind them, as opposed to us clowns. So I'm just gonna take his. His things. He would also say Sam Watson operation. He would also say subscribe.
A
I've been told that actually YouTube is the growth channel.
B
Of course it is. TITV is crushing it on YouTube. Guys, did you know that YouTube is big? There are people on it. But what I was going to say to our viewers is, we're down a Brit. She had a work commitment, so I will do my best to keep this podcast feminine.
A
Dave, what is she doing for work?
C
I can't talk about it.
A
Really?
C
Really.
B
So fun.
A
Okay, all right. Boring.
B
She has some work she's selling offline, too. I don't know.
C
No.
B
Okay, I want to start by everyone talking about a highlight from their week, and I'm going to share a photo. Let's talk about mine. Okay, here we go.
A
Go for it. You start.
B
I'm going to share this photo, and I want you guys to tell me what it is and if in the process, I might, like, share all my text messages. Unpublished drafts, like, technology is not my. You know.
A
That's fine. You already gave it all to Clodbot, or the Molt Bot. What the fuck are they calling it? Moltbot. Moltbot.
B
What is this? Dear viewers.
C
Whoa. What is that?
B
What is this? So for you people, you listeners, what,
C
are they doing ayahuasca trips at Davos now?
A
No. You know what they figured out, Dave? They figured out that it's cheaper to just skip the ayahuasca and just lie on the floor. That way there's no cost of goods sold.
B
Okay, so what I'm showing is a picture of. Well, I will give it away, but basically a picture of.
C
Those look like expensive blankets.
B
What looks like a sleepover in a cathedral. It looks like a sleepover in a cathedral, wouldn't you say?
C
It does look like a sleepover in a cathedral.
A
Okay, Is it a. Is it a sit in protest in a cathedral? Protesting some. Like, something?
B
Yeah, that would have been consistent with the vibe at this thing, to be honest. But no. I went to a Grace Cathedral sound bath.
C
Oh, interesting. I've heard those are good. Actually, that's the most un-Jessica thing I've ever heard.
B
It is, A, the most un-Jessica thing. B, it is the most San Francisco thing.
C
We need to get you down to Esalen.
B
I went at the suggestion of dear friends. Obviously, I was totally game for this experience. I've done one sound bath in my life, which I really enjoyed. For those of you who don't know what a sound bath is, there are gongs and music while you lie on the floor.
A
It's an incredible racket.
B
It's an incredible racket.
C
At least they're using that cathedral for what it's meant to be used for. Finally.
B
Well, they have the mind body. I learned all about the mind body programming of Grace Cathedral. You can do yoga, you can do meditation.
C
Esalen was actually part of helping start it, like, 50 years ago.
B
I enjoyed it very much. 90 minutes of cathedral sound bathing. And also, I left and I was like, this is so a Groupon of, like, 10 years ago. Like, this is exactly the thing.
C
Groupon.
B
Remember? Like, you know?
C
Of course. How do you even find out about such things these days?
B
You know, I have friends who are wiser and more spiritually tuned than I, who also have the foresight to get tickets. So anyway, it was a great experience. This podcast is brought to you by the Grace Cathedral Mind and Body program. Once a month, go sound bath.
C
Go get a sound bath in a real cathedral.
B
Just make sure if you're kind of bony like I am, you need more than a yoga mat on a cathedral floor.
C
Although that's its own special kind of altered state.
B
Yeah, it was not comfortable, but it was spiritually lovely. Okay, what about you guys?
C
I've been deep down the Moltbot, Clodbot rabbit hole.
A
Shocker.
B
It's been renamed now. Is it no longer Clodbot?
C
Yes, it's been renamed for obvious reasons.
B
What's the new name, by the way?
C
Moltbot. Which is. That's what a lobster does when it grows and sheds its shell. So Clodbot was named after a lobster, so.
A
No, Clodbot was named to sound like Claude.
B
Yeah.
A
And let's not kid ourselves. It's good marketing.
C
Indeed. I mean, it started as an open source hobby project and it went way bigger than I think Peter ever thought that it would. And he's been having a pretty intense week. I've been talking to him a bunch, and it's just been pretty awesome to watch this happen. And I think it showed people a future of how to think about personal AI assistants that nobody had actually done yet. There'd been a lot of talk about it, but the idea that you could just give it the computer. I know we've talked about it on the pod before.
B
We were two weeks ahead of this madness.
A
We were two weeks ahead of this one. And we were two weeks ahead on everyone losing their minds about the security implications. Like, two weeks ago, our podcast was how this is going to end the Internet. And lo and behold, two weeks later, the world caught up. Once again, we just didn't trade the insight.
C
Sorry, Sam, I've heard you giving advice to people that they need to at least write down or talk about on podcasts what their insights are. So that's a good way to build your reputation and your resume.
A
So there you go.
C
Once again, we are right.
A
We're right. Duh.
B
Okay. Keeping this focused, you each get one to two examples of an amazing Moltbot win of the week.
A
What have you actually done with it? I actually have a little Mac Mini running with it right now, and I actually do something with it that's, like, not that cool, but fine. What are you using it for, Dave?
C
One of my big wins was I've always wanted to build an iPhone app for our internal CRM that we have at Offline. We have like a super custom CRM that has a lot of different data from a lot of different sources. And we've got great web interfaces, but no iOS interface. And so this weekend, during halftime of my son's basketball game, I booted up a new iOS project and pretty much had the entire thing built in, like, two hours of talking to this thing throughout the day. And now it's on the phones of people at my office and they love it. And I was looking at the token count that it. Or not the token count, but the cost of the tokens that it took to do the work. And it was around $150. And I was just like, wow. You could say that $150 a day on using this tool is very expensive, or you could compare it to the cost of actually hiring an engineer to do this work and all of the back and forth that it would take, which would have been in the tens of thousands of dollars and probably months of work. So a huge win for me on that front.
B
Do you pick which models you wanted to use or it's defaulted?
C
Yeah, you can. I've really experimented with different ones. The coolest thing about it is that it can use any CLI. And actually, I think the biggest insight of Moltbot is that it doesn't use this, like, MCP stuff that was kind of the du jour thing of the last year. It actually uses.
B
That's what Cowork uses.
C
Yeah. And the way that Peter ended up, not necessarily stumbling, but really making this happen was that he was making a lot of different CLIs for tools that he wanted to use with his agentic coding setup. And he actually uses Codex mostly. So he was building all these different CLIs, and then he realized that if you could orchestrate across all of these CLIs, the agent would just use the command line interface, run its help command, and discover the entire set of tools that the command line interface has, and then just start using it. So the powerful thing about your question, Jess, is that you can install any tool that you want. You can actually use it to use Claude Code, or you can have it split its work across Claude Code and Codex. You can have it do all kinds of different orchestration across all the different tools, depending on what use case you're trying to accomplish. And it's really good at it, which I think is the cool part.
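The discovery trick Dave describes — an agent reading a CLI's help output to learn what subcommands exist — can be sketched as a small parser. Everything here is illustrative: the `snowcli` tool and its help text are invented, and real `--help` formats vary more than this simple regex allows.

```python
import re

def discover_subcommands(help_text: str) -> list[str]:
    """Parse a CLI's --help output and return the subcommand names.

    Assumes the common convention of a 'Commands:' section with one
    indented 'name  description' entry per line (a simplification).
    """
    commands = []
    in_section = False
    for line in help_text.splitlines():
        if re.match(r"^\s*Commands?:", line):
            in_section = True
            continue
        if in_section:
            m = re.match(r"^\s{2,}(\w[\w-]*)\s{2,}", line)
            if m:
                commands.append(m.group(1))
            elif line.strip():
                in_section = False  # section ended at first unindented line
    return commands

# Invented help text for a hypothetical 'snowcli' tool.
SAMPLE_HELP = """\
Usage: snowcli [OPTIONS] COMMAND [ARGS]...

Commands:
  report    Fetch the latest snow report
  resorts   List tracked resorts
  forecast  Summarize next week's forecast
"""

print(discover_subcommands(SAMPLE_HELP))  # -> ['report', 'resorts', 'forecast']
```

In a real agent loop, the agent would run `snowcli --help` via a subprocess instead of reading a hardcoded string, then pick a subcommand and repeat.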
B
Okay, Sam, what's your Moltbot win?
A
I initially, weeks ago, installed it on an actual computer and then immediately uninstalled it. But I dug around. I have a whole pile of Mac Minis because, whenever I did a crypto transaction, I would buy a Mac Mini and then never use it again.
C
I have a growing pile of them
A
Yeah. I had one. Every transaction required its own Mac Mini. So I have a whole stack of them that I've wiped. And so I have one set up. Basically, the only thing I actually do with it that's somewhat useful is I set up a Telegram bot for myself and then gave it my Twitter and a few other things. I have it set up where I can just go say, hey, what's going on in the world? And it goes and reads the New York Times, the Wall Street Journal, it reads a custom set of Twitter people I follow, etc., writes me a digest, and then posts on the Internet for other people. And the digests are pretty good. So I'm using it just for little shims like that. My next little project, maybe I'll do it this afternoon, is just making it go around LinkedIn and post, comment on things. Just because, like, who cares, you know? I'm happy to give it access to my LinkedIn. I don't care about LinkedIn. And it's funny, my two real insights. One, I can't believe that ngrok didn't make this, right? Like, this is basically just, to me. And like, I've built things like this before.
C
Wait, what's ngrok?
A
ngrok is basically this little tunneling service that everyone uses, where you can just put your Mac Mini on the Internet incredibly easily and tunnel into it, whatever. It's mostly used for development, but I've used it forever for little scripting projects, right? Where you want to be able to hit a web server easily and maintain an IP and, you know, an address for it, etc. But it's like this. And slapping an LLM on top of your desktop is like. I'm just surprised. ngrok's awesome. I always wanted to buy it because it's just a great tool. But I was like, huh, this is really just an ngrok extension. So that was point one. And the second point is, I really do think it's just the end of the Internet, right? Because the fact that you can do this and just automate the browser so easily. It's not that you couldn't do it before, it was just harder. I can be like, hey, go generate funny images and post them on Instagram for me. Hey, go around and comment. Here's my kind of archive of points about the world, just comment on all of my friends' posts, or people I think are interesting, to build distribution and hijack their distribution. I can just say, hey, go read all these websites and tell me what they say, and I'm never going to visit them again. All this stuff was doable before, but it's so unbelievably easy for literally everyone to do it that it's kind of the final straw where I think the Internet just goes away, right? Like, there's no more web. Like, Jess, I bet.
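The pattern Sam is pointing at — a tiny web server on your own machine that a tunnel like ngrok then exposes to the Internet — can be sketched with the Python standard library. The handler and the digest text are invented; in practice you'd run a tunnel command such as `ngrok http <port>` alongside it to get a public address (treat the exact command shape as an assumption about your ngrok setup).

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class DigestHandler(BaseHTTPRequestHandler):
    """Serves a placeholder 'daily digest' at every path."""

    def do_GET(self):
        body = b"morning digest placeholder"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Port 0 asks the OS for any free port; a tunnel would point at it.
server = HTTPServer(("127.0.0.1", 0), DigestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Hit our own server once to show it works, then shut it down.
url = f"http://127.0.0.1:{server.server_address[1]}/"
digest = urllib.request.urlopen(url).read().decode()
server.shutdown()
print(digest)  # -> morning digest placeholder
```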
B
What do you mean no more web browser? What does it mean? The Internet goes away.
A
You're just not going to use, like, the desktop. Like, websites will be gone, because it'll all just be 100% bot traffic, right? And then the economics of it break, and whatever else. We'll see how long theinformation.com exists, right? As opposed to it being, you know,
B
I knew we never worked too hard on the design for a reason.
C
I mean, it's interesting, Sam, because one of the use cases that I built last weekend was a Jackson Hole snow report. I had it go get all of these esoteric local snow reports that have no API, right? So it just opens the browser, gets the report, goes and gets stuff from the OpenSnow API, and then I have it set up as effectively a cron job or a repeating task that, well,
A
Literally, there's a cron interface in the bot. It's, like, totally easy to implement.
C
Totally. And then after that I was like, oh, I used to have this old idea that I built during the Facebook platform years called SnowTick, where I assigned ticker symbols, like stock symbols, to all of the ski resorts in the country. And so I had it do that, and then I gave it my 10 favorite resorts. And now every week it will go and get the data. It'll scrape the data from the websites where it's hard to get, by just opening the browser and doing it. And then it gives me a synthesized point of view as to where it thinks the snow is going to be good next week. And doing that, you're right, it
B
would have been. We need one of these world models that everyone's talking about in AI now, maybe, but like
C
It would have been easy, maybe, to do that. But I didn't want to go do the work. I don't know, just a lot of muckety-muck data grabbing you'd have had to do in the old days. And I've got this tool now, and it runs every weekend for me, and it's no problem. And you can do that a thousand different ways now.
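The "repeating task" Dave describes is just cron-style scheduling. In classic crontab syntax a weekly job might look like `0 6 * * 6 snow-report` (Saturdays at 06:00; the `snow-report` command is hypothetical). The next-run arithmetic behind a schedule like that can be sketched in a few lines:

```python
import datetime

def next_weekly_run(now: datetime.datetime, weekday: int, hour: int) -> datetime.datetime:
    """Next occurrence of the given weekday (0=Monday) at the given hour.

    This is the arithmetic a cron-style scheduler performs for a
    once-a-week job such as a weekend snow report.
    """
    days_ahead = (weekday - now.weekday()) % 7
    candidate = (now + datetime.timedelta(days=days_ahead)).replace(
        hour=hour, minute=0, second=0, microsecond=0
    )
    if candidate <= now:  # today's slot already passed -> next week
        candidate += datetime.timedelta(days=7)
    return candidate

# A Wednesday afternoon; the next Saturday-06:00 run lands three days later.
now = datetime.datetime(2026, 1, 28, 12, 0)
print(next_weekly_run(now, weekday=5, hour=6))  # -> 2026-01-31 06:00:00
```

A scheduler loop then just sleeps until the returned time, runs the scrape, and recomputes.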
A
So all the nerds are really, really incensed by this argument, because, like, Selenium has existed forever. Guys, you're missing the point, right? There's nothing new here that you couldn't do with a little technical know-how before. But when you make the barrier to doing this stuff truly zero, which is what's happening right now, the Internet can't survive.
C
Is it going to go away, or does it take another form? I feel like that's why I'm so excited about it, Sam. It feels to me like, I don't know, HyperCard, or like the beginning of the Internet, where it's just like, wow, there's this new primitive to build on top of now. Like, people might build skills instead of websites now. I've seen a bunch of people go super viral with the skills that they're building.
B
So I'm gonna pause here, because this brings us to. We've gotta move. We need to have the conversation everyone else is gonna be having in two weeks, since we're two weeks ahead. So it's six months from now. First of all, what is this thing by then? Is it part of Microsoft? Is it a company with a $10 billion valuation? What is it?
A
It's nothing. It's nothing. Because there's gonna be a thousand versions of it. This is like a MySpace. It's cool, it's fun, but it's not an important platform. It's just a thing that will be copied a thousand times.
B
Okay, Dave, what's your answer to that? Agree. Disagree.
C
I think that will happen. I think there are probably a thousand copies up and running already. I think this is probably the most powerful idea in AI in, I don't know, three years. But that's what it is.
B
The idea being an agent that just controls things for you.
C
Give the agent A computer.
B
Give the agent a computer.
A
To be clear, that's not a new idea. It is just a good implementation of it. Yeah.
C
And I also think the way he implemented skills, which we were just talking about, really matters. It's extremely easy to create a new one and then share it with the community, and then everybody else can download it. And so there's this self-fulfilling prophecy going on where people are creating new things to add onto it. There's something like 30,000 projects that have happened in the last seven days.
B
You know, has he raised any money? Will he?
C
No. You know, he started it as an open source project.
B
Yeah.
C
But, you know, I think that probably every single investor on the planet is trying to write him a check. And I bet, except for me, there's a lot of people, you know, trying to get him to come do it, and, you know, big companies and all those kinds of things. So I don't know, I think any of these things could be an outcome.
A
I think it's a stupid investment for anyone to make. But I will say, as an individual developer, I think it's a great example of what you should be doing, right? Which is doing cool shit that's a calling card, with little concern for what the business is. And then maybe it's something and maybe it's not. This is the thing I've been pushing a lot of people on recently: you just have to be doing memorable things right now. Right? Like, I'd much rather see you do a memorable viral project of almost any sort than pitch me some weird SaaS thing.
C
Yeah, I think that's right.
A
It's like, be memorable in this moment. That's what matters. It's a moment.
C
Although being memorable is an interesting question, Sam. Like, can you be memorable by just trying to be memorable? Like this goes back to one of my favorite Rick Rubinisms, which is like,
B
no, you have to be exceptional to be memorable.
C
Don't try to make music for the charts. Make music that you love. Right, Totally. And I think there's a lot of people out there trying to make music for the charts.
A
And I agree, but that's not memorable. Like, you know, my favorite memorable thing recently was, God, what was the name of that guy on Twitter who's so great, who scripted the meter maids?
C
I haven't seen it.
A
It was so good. It was like he basically built this thing scripting the government's website where you could track where all the meter maids were and, like, where people were getting tickets in real time.
C
And it was.
A
It was awesome. And it's just a great example of the Internet being fun and doing memorable, pranky stuff. Right? Like, I've been talking to a lot of college students recently, and they so annoy me, because they come in with these, like, super professional pitches, and I'm like, guys, do pranks on campus. Do fun, like, experiments. Toys. Play. That's the whole point of the moment we're in again. And, like, what made the Internet great. I want to see more pranks. I want to see more hilarious things being done with technology in unexpected ways. We're back to a hacker era, right? And that's great.
B
Yeah. So what else have you guys come across that has this ethos?
C
It's funny because I also saw a whole bunch of people yesterday talking about that they want to go back to the Apple way of launching things, which is to keep things secret as long as possible.
B
Oh, Apple keeps absolutely nothing secret. They try.
C
Well, I understand, Jess, but the idea is that you actually are careful about how you launch something rather than always doing it in public. Because the other thing that has become culturally normative is to always be building everything in public. And there was, like, a big conversation over the last few days that that's also not always the right strategy. And so I would posit that Sam's right. If you're doing things to have fun, do as many things to have fun. Make lots of projects, do fun things. But if you're getting serious about something, try to be serious about it long enough that by the time you show it to the public, it, you know, goes off pretty well.
B
Do you think this is what. So obviously hardware is suited to that in a way that software isn't. I don't know if you guys talked in my absence about The Information scoop that Apple is developing an AI pin à la Humane. Okay. We don't have to.
A
I think we probably made fun of it.
B
Okay. Yeah, I think you did, too.
A
At least I did.
B
I mean, it's funny. OpenAI is kind of sort of taking this advice with their hardware launch, except they're talking about literally everything else except the physical, like, which device it is, right? And which one's going to be first. And so they're trying to get some of that aha magic out of the ultimate reveal. Could backfire, though, if the device isn't great. We'll see. Okay. Anything more? I want to wrap this AI thing. I do want to talk about tech's reaction to the ICE protests, because it is a big topic this week as well. But what else do we have to say about AI at the end of January 2026?
A
It's kind of, what? It's just, like, not that interesting. Keeping on.
C
Yeah, it's just like it's keeping on, keeping on.
B
You know. I will tell you guys. I feel like since you guys are cut from the same cloth, I occasionally have to invoke other parts of the ecosystem. And I hosted a panel on AI infrastructure at Davos with the CEO of CoreWeave, the head of G42, the CFO of OpenAI, and the COO of BlackRock.
C
That's pretty amazing. Fancy people, take us inside. Jess.
B
Guys, it was a great panel.
C
What was the big question?
B
Well, so the purpose of the. I hope no one from the World
A
Economic Forum is listening, but they probably will now.
B
You know, the purpose is high level dialogue, which is great, but I'm a reporter, I have questions. They need to be answered.
C
So what is high level dialogue? Like, what's the difference between high level and low level?
B
High level would be how big is the AI opportunity and what are the risks?
C
Okay.
B
Jessica-level would be: CoreWeave, you only support Nvidia. TPUs are on the rise. How do you think about the future of your infrastructure stack? Right. Those would be very different questions. And what I enjoyed about it is I think we legitimately covered it all. There were some high level themes that were interesting, but the speakers were really candid about the more specific questions about their strategies. And when I asked Sarah Friar, who's the CFO of OpenAI, I asked her to respond to criticism Google was throwing their way. Demis, the head of DeepMind, had said he was surprised OpenAI was moving into advertising so quickly because Google was not, which was not actually true and also kind of just throwing some shade. So she responded to that, and she talked about OpenAI's many business model plans, including IP licensing from discoveries made, which created a big news cycle. And to be clear, this is not like us as individuals have to give OpenAI a share of any IP we have from using it. But OpenAI striking partnerships around drug discovery and with research institutions or whatever. So I thought that was an interesting part of the panel. But admittedly, everyone on that stage had been AI red-pilled. And that was actually their word, not mine. And so most of the questions I asked came back with: you only ask that question if you don't believe in the demand, and the demand is infinite, and so that question is therefore invalid. That was pretty much the pattern. Like, when I asked Sarah about these so-called circular deals, where customers are, you know, taking an investment from your supplier to go buy more of their product, AKA half the financings in AI, she said, well, you only think that way if you kind of underappreciate the demand. But the other thing I asked.
So I asked the CoreWeave CEO. And this is a little wonky even for readers of More or Less. But CoreWeave had a very rocky year in the stock markets last year, because it came out as an early AI IPO but with a debt structure that's very unusual for a tech company. And most of the year the market kind of whipsawed it, and it was crushed. But then, as, you know, sort of one of the most important neoclouds, it's recovered. But I asked the CEO, I said, what happens when all the training gets way, way more efficient, and so you don't need all this infrastructure, right? And he had this quote, which I'm going to read because it's really stuck with me. He said: it's not the infrastructure that's endless, it's the voracious appetite for intelligence that is limitless. That's the part we have to focus on. And every single step function in the foreseeable future, certainly within the horizons we are working on, is going to do nothing more than accelerate the business. Do you guys have a reaction to that? I have a reaction to that.
C
I mean, it sounds like they're saying the same thing: that the demand is so much bigger than any of you can see. Like, we're sitting here looking at our P&Ls, and the demand is just shockingly larger than all of you normies can see. And that will be our destiny. Is that what they're both saying, in different ways?
B
Yeah. And the thing I ran out of time for, and also just didn't think of on the spot, is: I can think of things where there's limitless demand and they're a total commodity with no margin and no business, right? So those are two different things. You can accept and embrace limitless demand and never-ending efficiency gains, but it doesn't automatically follow that that has a healthy profit margin. In fact, you could look at what the Internet did to media and entertainment and say, actually, here's an example of technology where the liquidity for time spent watching videos has gone way up, because it's just so frictionless to consume this thing. And there goes the business model. But also, he could be right. I actually think that's what's so exciting about this moment. But it is a bit concerning to me that the leaders in this infrastructure buildout really see one side of the coin.
A
Well, they're religious. I mean, basically, this is the thing about this whole argument. It's either right or wrong. Spidey sense and history say it's wrong, right? And when you hear prognostications like this, something's about to go very badly. But it's possible they're right. It's not impossible. And everyone's playing two things. One is the asymmetry card, which is like, eh, if we're wrong, we're still fine, it doesn't really matter. But if we're right, it's important, and everyone wants to be important at an important moment and do something important. And then B is, and this is the scary thing: look, the world is ruled by narratives. I think value is stored in narratives. I'm very pro-narrative in a lot of ways. And here's the irony of the whole thing: even if it's wrong, if the narrative is strong enough and growing, and people choose to store belief in the narrative, it will keep going for a long time, and they will make a lot of money in the interim, right? And so for me, I'll tell you a different, related, if you'll humor me, thing going on this week, which is my kind of thing of the week: silver and gold. Silver and gold are. Yes, let's talk
B
about silver and gold.
A
Silver and gold are crushing it. They're up like 2x since January 1st. The asset to own is silver and gold. And hilariously, you look at that as someone who believes in, like, bitcoin and the culture, and you're like, oh my God, all that money is supposed to be in bitcoin, not in silver and gold. But the silver and gold story, which is kind of like bitcoin minus the technology risk, right, is ironically the hottest thing in the world. And why is there limitless demand for metals that are useless? Narrative. Right. And so is it worth buying silver and gold? Well, it has nothing to do with their actual value. It's: do you believe the story? And the story is effectively short everything else in technology and AI.
C
Right.
A
And so, I don't know. Obviously, if you lead one of these things and you're red-pilled, there's two ways to be red-pilled. One is you're actually red-pilled, right? And you actually believe some of the incredible bullshit that you're hearing. Two is you're not quite red-pilled, but the reality is, just as we've been talking about for a long time, the cost-benefit of going all in narratively on it is in favor of going all in narratively on it. And candidly, it doesn't even matter, because it doesn't have to be right. It just has to be what people continue to believe.
B
Yeah, I agree with all that. And, this will surprise no one, but I think it's important to point out that that's what's happening.
C
Isn't that also what's going on with copper? It's like anything downstream from the AI narrative is also becoming a narrative.
A
For what it's worth, I actually almost bought a bunch of copper recently because that one actually makes more sense to me.
B
Well, copper's part of data. I mean it's, it's much closer to.
C
That's what I'm saying. Like you need a lot of copper for not just the data centers but the electricity. And the story is that we're going
A
to need in the next few years more copper than has been mined in the last 10,000 years of humanity, right? Or something. And therefore you buy
C
copper, which is impossible.
B
Yeah, people are buying. They're getting, like, copper traders on staff at these AI companies. Meanwhile, you know, we've got tech earnings this week, which I know this pod never likes talking about. Just kidding. Meta is upping its capex guidance.
C
That's pretty gnarly, those big numbers.
B
We're just gonna go a little bit bigger. Just a little bigger.
C
But is that Sam's point? Which is like, they're just gonna go to infinity. They're like, we're on the other side, we're going all in on this narrative.
B
Also, guidance. I mean, you can argue that it's better to come in under that than over. I mean, there's a lot of... I guess I should double check. I just saw a tweet that they were upping it. I don't know, maybe I should actually
A
You can't believe anything on Twitter. It's just a bunch of Clawdbots.
B
Yeah, I know, I'm just going to have to. I mean, that's a problem. But okay, well, let's swerve slightly here. So obviously, with all the things going on in Minneapolis and ICE, this nation is divided and in shock and in turmoil and working through a lot. It's having some interesting reverb in tech. What are you guys seeing?
A
Well, I'll say, that was my Clawdbot when it summarized the news yesterday. Basically, the TLDR on the whole thing, the final vibe check, was: the only story is Minneapolis. Until that's resolved, nothing else matters. So Clawdbot believes it.
C
This seems bad.
A
My take on it is... did you see, I mean, obviously the initial thing was no one in tech is responding, right? And people are calling them out on it, et cetera. I thought Andrew Ross Sorkin's take on it was interesting. I think he had a pretty sophisticated take, and then I'll extend it with what I would respond. His take was: look, we don't need to comment about immigration, da da da, the whole nine yards. The thing business executives can comment on and should be talking about is just training. You all train people. Clearly there's a training gap in, like, the people that are running ICE or on the ground, et cetera, and we should fix that. And I thought that was a fairly business oriented, benign, fine, approachable take. Here was my response: I was like, look, if you're a business executive, you understand training, sure. But you also understand recruiting. And actually the bigger thing is recruiting and
B
contracts, those are the languages you speak.
A
Yeah. The bigger thing that I think is missing in this conversation, my take on it is, again, not commenting on the immigration part. Again, in general I am in favor of, like, our borders being secure
B
and like Sam, that's not what's happening here just to be.
A
But I'm just saying I am in favor of those things. But I do think there is a structural recruiting problem, probably, for ICE, which is who chooses to apply to and be in ICE, and where they come from, and how they think about what their role is. Think about, like, take an OpenAI. In that case, you get a bunch of researchers with a mission, right? Especially early, it's like, oh, we get to work on the coolest things, and we're saving the universe with, quote unquote, safety, and the AGI race is important, and we're gonna get comped a certain way. You attract a certain set of talent, right, that cares about a certain set of things. If you told me that ICE has a recruitment problem, which is, the message they have, the comp structure, the appeal of who wants to do that job attracts a set of people where, let's put it this way, you need to do a lot of training and screening. I believe that. And I think business executives can understand how that would play out. Right. And so the question for me is, how do you take ICE, or the mission, and inject the right people into the culture and the community where they're doing their job, so they're not ending up in really bad places and situations and making bad choices?
B
So, Dave, if you were running Apple, Google, Meta, what would you have done?
C
Whoa, that's a tough question. I guess, quickly, to respond to Sam's comment, I would extend that to not just the questions of training and these other things that you mentioned, but we see this in Silicon Valley all the time. When you try to scale a company too fast, the quality bar goes down a lot. Right. And my understanding is that we're trying to hire an enormous number of people into ICE very quickly. Right. And every time that happens, you see, like, enormous numbers of problems organizationally and with how people behave. Right. And you can only grow human organizations so fast without there being a huge degradation in the quality of how they behave.
B
True. I don't know enough to know if that is the cause of what's happening, but what about now?
C
So I don't know what I would have done, Jess. I mean, I'm not a person running one of these huge organizations because here's what they did.
B
Right? So, I mean, first you had these people who had been afraid to oppose the Trump administration publicly just do a little bit more of it. And Jeff Dean at Google, one of their OG AI leads, founder of Google Brain and so forth, was the first to come out, before the most recent tragedy, and do this. And everyone's like, oh my God, someone from inside a tech company has publicly criticized the administration. That was a big deal. Then you see the Paul Grahams and the other people who are known for being opinionated on Twitter sharing their opinions. But by the way, these people weren't tweeting about politics for months and months and months. There was really a chill over that. And then ultimately you saw the CEOs, like Sam Altman, Dario, Tim Cook, sort of condemn what's happening but praise Trump. So basically, to literally say: what's happening has gone too far, but Trump is a strong leader and will do the right thing.
A
Yeah, it's a little dystopian.
B
It's a little wild.
C
Yeah. I mean, Jess, just go read 1984. Go read any dystopian novel about descent into authoritarian nightmares. And the main thing is that fear drives behavior, right? I do think that the real underlying thing going on is fear, and people are afraid to speak because of what's happening. And that's how this works, right? You have a situation where there are actions being taken that appear to be anti American, anti democracy. People's rights aren't being respected in a variety of different ways. That's pretty clear. And then people are afraid to speak up, because they're like, what if it comes to my town? What if it comes to my business? I think there's a lot of that going on in America right now.
B
Do you think in the last 72 hours, anything has shifted in the Valley
C
on this topic, on which I guess I don't even know what topic we're
B
talking about on that fear. Are people a little less fearful or not? I mean, I actually.
A
Well, it's all about the Overton window.
C
No, I think there's power in numbers. Yeah, exactly. It's like, once everyone acts, then everyone acts. Right. But nobody's going to be the tall poppy. Right.
A
Well, some people are. The brave people have to be first. But I think the flip side to all of that is, like... well, Sam,
C
I would argue the richest people would be first. Right. The people with the most resources, the most insulated people. Yes.
B
But I don't know. Trump is actually targeting those people very intensely. So they're being very quiet.
C
Yeah. But they have the resources to withstand it. Right. Like, I think that's the thing is like, not everybody in the, quote, valley is created equal.
A
But I do think there's this other interesting problem, which is, like, let's pretend you're like, look, we should have secure borders and laws should be enforced. Right. Great, now I need to hire an organization to do that, and everyone's going to hate that. Lots of people are going to be really upset about that. That's the problem statement you have. Who are you going to get to do that that is good? Right. It's not a desirable job. And so there's this interesting question of degrees of freedom, and how do you even do hard things when good people don't want to do hard things, or they come with baggage or perspectives that are challenging? And so, again, I'm not excusing any of it. I'm just asking more of an intellectual
C
question, which is... I mean, Sam, to that point, in my hometown there was a big debate about this, and one of the main things they wanted to do is unmask the people. And one of the reasons these people are wearing masks is that they can't do the job without their personal lives being completely destroyed. You know, so it's like, how do you enforce laws?
B
I guess I'm more... we're not going to veer this into political territory, but I'm more interested in.
C
But isn't this. I thought we were.
B
No, but I'm veering it into business leaders reacting to political territory.
C
But isn't that the story, Jess? Is that all of the business leaders in Silicon Valley, starting with. I feel like Brian Armstrong at Coinbase, have tried to remove politics from the, you know, from the like.
B
And so my question is, has that shifted?
C
milieu of the internal culture of the business. It's like, you can have whatever politics that you want.
B
Absolutely.
C
And this is a free country. If you want to be Republican, you want to be Democrat, just keep it out of the workplace and like, let's focus on the, on the job that we have to do. That seems to be the message that's been pretty consistent the last couple years.
B
It absolutely has. It absolutely has. And it has served a lot of purposes, and I think it remains the prevailing playbook for leading one of these big tech companies. But you're going to see, underneath, and we're seeing this at Google and everywhere, right, the employees who don't want you to do deals with ICE. Right? And Palantir. And I just don't know. I mean, no one knows, but I'm sort of wondering where we are. It's just a pendulum, too.
C
And is it reasonable to say this seems to have broken out? Like, this seems to have broken out.
B
Yeah, that's what I'm trying to say.
A
That's certainly what my bot thinks.
C
Well, here's a side thing on this, and maybe I'm a broken record on this, but I don't use social media except Twitter. And when all of this started happening, it was maybe a day or two later, suddenly there were people around me that were like, oh my God, America's falling apart. It was really interesting, because when you're not exposed to Meta's products and TikTok in particular, none of the message that's going around is getting into your brain. And so it's a really interesting thing to watch everybody else. I had a bunch of group chats exploding, and people around me were talking about this, and there was this anxious, America-is-so-screwed feeling. And I'm having a great weekend. It's beautiful weather, everything seems fine here in my local life. And it's really interesting to see how much of an effect the Internet machine has on people. It's not that this isn't a bad thing that's happening, but the way in which people are responding to it is clearly also being influenced by the medium. And when you're not in it, you can see it very transparently.
A
The medium is the message, perhaps.
C
Well, it's not everybody's chosen medium. It is like 80% of the world's chosen medium. But when you're not in it, it's very obvious how much the temperature is being raised by the medium.
A
Look, can I say something else that's very anti intellectual and that I'll get yelled at for? That's part of my job on this podcast. I don't really know what's going on in Minneapolis. Right. And I'd say that's interesting. I think part of the reason for that is, you know, when deepfakes first came out, in the Obama era, people made these really expensive deepfakes where Obama would be saying, we have started a nuclear war. And we were like, oh no, in the future we'll have deepfake Obamas starting nuclear wars, and you will believe it, right? I think we have the opposite thing going on, which is there's so much information and so many deepfakes and so many God-knows-whats that you just don't believe anything. And getting to what actually is going on is just so expensive.
B
But it is possible to figure it out, Sam. You can't just say that because it's hard, truth doesn't exist.
A
Yeah, no. I definitely am not saying truth doesn't exist. What I'm saying is that getting truth is actually getting more and more expensive for every single person, which then creates this really interesting dynamic where even, like, me, I kind of think I have opinions about this stuff.
C
It's actually kind of true, but I
A
don't, like, fully trust my own opinions about it, because there's too much noise. There's not enough... there are too many angles, too many deepfakes, and it's too expensive. Fundamentally, it's too expensive in my life, which is a shitty thing to say, but it's true, for me to have a deeply informed opinion about this. I can have a structural opinion.
C
It seems like there are three levels of resolution to this, Sam. There's: okay, do I agree with generally what the current administration is doing and all of their actions? That's one level of resolution. Then the one you're talking about, which is kind of the middle one: what's really going on in Minnesota? What's going on on both sides? Are there organized groups involved? Nobody really knows what's going on, and I don't know where there is reporting that explains any of that. There's a lot of stuff all over the Internet, but almost everything I read, I'm like, I don't know if I should believe that or not. And then the third level of resolution is, you can go watch. After I had all these people telling me what was going on, I went and watched the video of the man being shot, and it's very clear. The New York Times did a great job going frame by frame through what happened, and you can very clearly see this should not have happened. It's clear as day when you're at that level of resolution, right? When you're talking about five people tackling a person, and one person has a gun, and the guy's on his back, and a dude just turns around, pulls out his gun, and shoots him. It's way too quick to the trigger. There's no question that what happened is wrong, but it's wrong on the super micro level: this individual who actually shot him definitely made the wrong decision. Right. And that's easy to see at that level of resolution. But I think you're absolutely right, Sam, that nobody has the slightest clue what's going on at the Minnesota level, and it's extremely expensive to figure out. I feel like we're all pretty good information gatherers.
I don't have an opinion, nor do I understand what's going on over there, because I can't tell from all of the information, and I don't know what information to read that's real. I mean, maybe, Jess, you have a... like, how should we...
B
No, no. But just in the interest of being mildly confrontational: I think, Dave, your reaction is very different from Sam's reaction. Right. Sam's like, I don't know what's going on. And Sam makes a smart point, and I'm actually somewhere in between you two. Right. It's really hard to figure out exactly what's going on, and I lead a busy life and there's only so much time I'm going to spend on that. Right, Dave? I also watched the Times breakdown, and I thought that was the most useful thing, for sure. And I just... maybe it's naive, but I don't believe that the typical, the average American is going to spend as much time as you have on this, Dave. I wish they would, because everyone would be better off for it. But it's also not going to happen, right? So instead we're going to be in this world where most people will sort of default to what fits with their existing patchwork of opinions and be somewhat vague about what they think. Right? And that's just what we're left with. I mean, that is the reality.
A
It's like, look, I mean, again, any shooting is a complete tragedy, right? And again, I haven't gone as deep, but yes, if someone just randomly made a very bad decision and shot someone, I'm like, okay, well, that probably is a training problem, that probably is a recruitment problem. There are things to address about that. But, you know, the question for me is, okay, so how does that spiral into everything else? That's, I think, the place where there's just so much information warfare, and it's so expensive to have an informed opinion.
B
Yeah, that's true. So you can just trust The Information. But we don't cover this, so you can't.
C
Why don't you cover things like this, Jess? Like, it's actually an interesting question.
B
I'm not an expert. I'm not an expert, and our newsroom's not an expert, and it's not.
C
But your processes could produce information that is valuable.
B
I mean, maybe, but.
C
And your customer wants it.
A
No, I don't know that the customers do want it.
C
I'm one of your customers. I want it.
B
But this is... you shouldn't want it from us, Dave. I think, you know, firing some of your customers is an important thing, right? You should expect us to be first and accurate and authoritative on matters relating to the tech business, and to the degree that Tim Cook or Altman are responding, or Google cancels a major contract, or something like that. But the work it takes, and the experience of the journalists that you want covering some of these other dynamics, that's not us, right?
A
Well, I think the broader problem, beyond the information itself, is that really no one's willing to pay for this at scale. And that is, I think, a vital societal problem. Right? You know, if you wanted to, you could get information from super, super trusted people and da da da da. But the business model just isn't there for the average consumer to afford or pay what it actually costs to get trusted opinions on this anymore. Right? And I think that is the structural problem with democracy, and a lot of the place we're at now. It's just extremely difficult to make the business work in a profitable way, which is why the New York Times becomes a gaming company, and why there are all these supplemental problems in the media industry. If you're like, hey, here are the 20 journalists who are the people you can trust on this, well, they do an incredible amount of work to get it right, but you have to pay them a lot, because it's really expensive to get it right. No one pays.
B
I really struggle. All of this is something our newsroom really struggles with. These big national moments, presidential elections or whatever, when every lens, keyboard, camera, phone is focused on one issue. Right. And there are many that you just can't ignore, to the point that it seems like you're out of tune with the broader dynamics that are affecting the things we cover. But there's just very little upside in covering them as well, to be honest.
C
I think that's right. And I think it's actually back to the earlier point about fear. I think there's fear about the powers that be, but I also think there's fear about the social aspect of it. Right. That if you speak up, you're going to annoy the people next to you. There's this thing where we're all so divided, nobody wants to talk about it. They'd prefer to just not talk about it. I think that's a real thing that's going on right now. I know I certainly feel that way. I used to be incredibly politically engaged, but I've really felt like I've got better uses for my mind, you know?
B
And I kind of hope... what I had a glimmer of this week is, I just wondered if maybe that was changing a little bit. Right. Like, for example, someone who came out very
A
supportive of one battle in the war
B
ICE was Keith Rabois at Khosla Ventures. His boss, Vinod.
C
Yeah, that was cool.
B
Came out against him now, respectfully, actually, and he didn't actually.
C
Yeah, it was cool.
B
Just tweet that Keith's wrong or Keith's an idiot. Another partner had responded to Keith in a thoughtful way, and Vinod shared that and added his endorsement. Right.
C
If anything, that's cool because it's modeling of how we should be with each other. Right.
A
Vinod is generally pretty good at this stuff, I gotta say. Vinod grows in my estimation every year. But I think it's
C
good modeling because we should be able to disagree like that and, like, have a productive, even, like, intense discussion about something. But often that's not what's happening anymore.
A
I thought it was funny, on Keith, in totally unrelated news, that Jack, who works at Slow, got intel and outed him for doing pushups off his knees at Barry's Bootcamp.
B
Oh, no. He did what?
A
And it was like... that was a whole thing, because I think Keith was like, no, it's like my fourth Barry's of the day. But I was like, wow. Jack, that's a really bold associate move.
B
How did I miss that?
A
To go out calling out Keith for knee pushups at Barry's. Did
B
he approve that before he did that? That's like shots fired.
C
No way.
A
You think we have any internal controls on these things?
B
Shots fired?
A
No, it was just wild. There's even a backstory to it. But I gotta tell you, you know, we talk about Keith scandals. That was the Keith scandal in my life, for sure.
B
It's like he secretly only spends half of his time in Miami. Good Lord. Okay. I don't know where we took that conversation, other than it's something, everybody.
A
That's the type of hard-hitting reporting, knee pushups, that, you know, The Information can cover. You know, guys, we just...
B
I have been trying to hire for a weekend section a reporter to really dig into the crazy cultural things happening in this moment in San Francisco. And Eli Rosenberg just joined us this week. So get ready, buckle up.
C
Anything new you're hearing about?
B
I mean, we're hearing about everything. I should have a great story this weekend. I think it will run on Friday when this pod comes out. But.
A
Yeah.
B
Shit getting weird.
C
How weird? I mean, weirder than it.
B
Well, like... everything from shooting up peptides to the...
C
That's not. That's. That's old news.
B
I know that's old news. It was in the New York Times. We've got new news.
A
What's the new news? What? What is not happening in our house?
C
Yeah. What are we not doing? Yeah, Jess, come on. You can't keep. You can't keep this secret.
B
What is not happening in this house is the harder question, actually. This is a test bed of wild and crazy.
A
Dave, you need to know about this.
C
All right, I'm in.
A
The thing that's happening in our house, that I bought off of Instagram targeted ads, so bad...
B
Guys, I can't leave again. I can't go to Europe.
A
It's so good. It's a mini wiffle ball pitching machine.
C
Oh, that sounds fun.
A
Awesome. So you basically like, no one needs this.
B
No one needs this.
A
Everyone needs this. Everyone needs this. It's really well made, it's portable, it's massively overpriced, but you can plug it in and it pitches mini wiffle balls at you.
C
Is it like one of those tennis machines?
A
No, this is like some bros slapped a bucket on some spinning wheels, and they sell you the mini wiffle balls. But it's great, and it is the best. I now have the kids just whacking wiffle balls all day.
B
How many injuries have been sustained, Sam? What are we up to on the injury count?
A
I mean, they're just wiffle balls.
C
It's boys, Jess. Come on.
A
I did immediately hit a line drive into our four year old. But like that was his fault, right?
B
Like second shot.
A
He was standing at shortstop. And then, you know, it's been great. There've only been minor injuries.
C
That sounds awesome.
A
Better than a Mac Mini and about the same price.
B
I'm trying to think what my latest purchase is. A sweater. I bought a sweater.
A
I gotta say, Meta just reported earnings, and, you know, when you know the capex is going up, you kind of know the next line is that they had a good quarter, right? Because they're not going to announce more capex unless they're crushing it. And they did, and the stock's way up. But I gotta tell you, my barometer is how many things I've bought on Instagram in the last week, and so long as that keeps going up, I'm a very happy Meta shareholder. And it keeps going up. Jess, you don't even know. I bought some really expensive stuff off Instagram the other day. What about you?
B
Oh, it's perfect. But I don't buy off Instagram. I can give you some other websites.
A
In this case it was, like, a link to a website, and the website had another thing. It was great. But I gotta tell you, I could not be more bullish on the things I will buy on Instagram because of AI-driven targeting and commerce. It's, like, wild.
C
You've been consistent on that, Sam. If there's one thing you've been consistent
A
on. That's the thing that BlackRock should be excited about: my ability to buy infinite junk that's well targeted to me.
C
So we've got Minnesota and infinite junk.
B
So I had... and then we'll bring this to a close. But obviously AI commerce was a hot topic at Davos, just because of OpenAI. I mean, I love when I say that and everyone else is like, no, Greenland was a hot topic; Jessica, for you, AI and commerce was. So I'm willing to accept that I'm dialed a little bit differently. But, you know, with OpenAI's ad news and shopping news, I asked Andy Jassy about it, the CEO of Amazon, and he was sort of like, maybe we'll do deals with these people. And I was like, why do you have to do deals with them? You're Amazon. And he was like, oh, we'll do deals with some of them. But he was like, first of all, it's really hard. All the other pieces, the infrastructure, the free shipping. You know, he was talking his book. But then I was talking to some of the leaders of the payment companies that are starting to do these integrations with chatbot commerce, and they were saying that retailers are actually really nervous about this, because retailers don't want you just to pick that laser-focused thing. They want to merchandise to you.
A
No shit.
B
They want to get you on their site to add the sweater or add the boots or add the bat to the wiffle ball thing.
C
So shouldn't AI be very good at this?
A
No, no.
B
But you're in a different UX. You're in a command-line UX.
A
You don't... there's no brand loyalty. It's just, find me the cheapest, best whatever. This is my big... you know what's going to happen if AI-driven commerce takes off, which I'm actually quite skeptical of.
B
But you're halfway there with Instagram, Sam, you're. You would take that AI feed, Sam,
C
you're literally buying things from AI driven commerce. Instagram is a. Oh, that form is different.
A
I'm saying, let's call it chatbot oriented. Here's the thing. You know how Google convinced all of the newspapers that they would just make money on the volume and they should give up?
B
I'm familiar with that trade. Yep. Didn't take it.
A
You should be familiar. That's one of the worst trades in history, right? This is exactly the same story all over again with retailers, where the chatbot is like, oh, don't worry, just let us front-end you and you'll have no brand. But don't worry, because you'll make it up on volume.
B
It's like, no, to be clear, Amazon kind of did this already.
A
Yeah, but this is worse, right? That's what I'm saying.
B
Yeah, no, I agree.
A
It's like... if Shopify is the godsend of the retail world from a platform perspective, right, where you have a brand and all that type of stuff, then the idea that these retailers would be so stupid as to let themselves be front-ended by ChatGPT or something, and just disappear into the ether of capitalism efficiency, I mean, God bless. But it's going to be the exact same story all over again if they fall for it.
B
Yeah. Okay, well, here's to not falling for the same stories.
C
Don't fall for it, but everyone falls for it.
A
You know why they'll fall for it? Because the people... it's a principal agent problem. Because in most cases, the people running these companies want to be there for three years, get a short-term revenue shot in the arm, be lauded for their ChatGPT integration, and then take their stock options and leave, and leave the companies in ruins. It's the same in the media industry. It's a total principal agent problem.
B
I was going to say, you're describing the media business. I remember the first time I met... I don't even... this is past the limitations of worrying if I'm supposed to say it. The first time I met Steve Burke, who was the CEO of NBC, I asked him a question about five years away, and he looked at me like, why would I think about something that long term? And I was like, oh, yeah.
A
It's fascinating when you think about the layer consolidation. But that is, I think, one of the biggest untold truths of capitalism. Right? People talk about how capitalism is pretty bad at long-term planning. Part of it is structural and things like that, but part of it is honestly just the principal agent problem. No one's incentivized for long-term success, and so they'll fold on anything for a short-term hit. Right. Which is the bull case on why you will shop inside ChatGPT in some form, and why all these companies will get completely eviscerated. But they know that, and they don't care.
B
Yeah. Okay, well, let's try to not make the same mistakes, folks. Let's make new ones.
A
No, they will, they will. It's fine.
B
And thrive and prosper to our viewers. Thank you. If you're not yet viewing this pod, you missed some photos, you missed some ice cream. I don't know how you did it.
A
I, I managed to. I've been wearing a towel this entire time.
B
Are you joking?
A
Are you still wearing a towel? I took a shower.
B
You took a shower? Wow. Okay. And with that, we wish you a wonderful week. We're almost heading into Super Bowl week in San Francisco, which will be very exciting, fun events to the gills. We've got a Ferrari launch. Sam, are you going to that, by the way? Did you RSVP?
A
I don't know. I have something scheduled. And I also don't really care about Ferraris. But maybe. I mean, if it was a Miata launch.
B
What if I told you it was a Miata launch?
A
I'd be there. Oh, my God. If there was like a new Miata, I would be there in a second.
C
Just. I'll go with you to the Ferrari launch.
B
Okay. Dave, you can be my plus one.
C
All right, cool.
A
I move.
B
It's going to be great. I don't. We're not missing it.
A
I moved a dinner for it.
B
Well, we're gonna. We're gonna go to some concerts, some book parties. It's gonna be great. And we'll share it all with you dear listeners of More or Less. So thank you for your viewership. I said listeners. And we'll see you back here again next week for another episode of More or Less.
A
Bye.
C
See you guys later.
B
If you enjoyed this show, please leave us a virtual high five by rating it and reviewing it on Apple Podcasts, Spotify, YouTube, or wherever you get your podcasts. Find more information about each episode in the show notes and follow us on social media by searching for @moreorless, @davemorin, @lessin, and @jlessin. And as for me, I'm @brit. See you guys next time.
Hosts: Dave Morin, Jessica Lessin, Sam Lessin
Date: January 30, 2026
In this lively episode, the More or Less crew bounces between the revolutionary impact of agentic AI (AI that can act for users, like ClawdBot/Moltbot), the broken incentives shaping how businesses react to AI disruption (especially in tech and commerce), and the challenges facing both society and the tech industry amidst major political, cultural, and economic upheaval.
Expect a wide-ranging and fast-paced discussion. Throughout, the hosts balance humor, skepticism, inside baseball, and candid reflection on technology's place in society.
“When you make the barrier to doing this stuff truly zero… usually the Internet can’t survive.”
— Sam Lessin, [15:04]
“Websites will be gone because… it’ll all just be 100% bot traffic, right? And like then the economics of it break…”
— Sam Lessin, [13:22]
“Doing cool shit that are calling cards with like little concern for what the business is... be memorable in this moment.”
— Sam Lessin, [17:28, 17:58]
“You can accept and embrace limitless demand and never-ending efficiency gains but it doesn’t automatically follow that that has a healthy profit margin.”
— Jessica Lessin, [25:11]
“World is ruled by narratives. Narratives... Even if it’s wrong, if the narrative is strong enough … it will keep going and they will make a lot of money in the income for them.”
— Sam Lessin, [28:19]
"This is exactly the same story once over again with retailers where the chatbot was like, oh, don’t worry, just let us front end you and you’ll have no brand. But don’t worry because you’ll make it up on volume."
— Sam Lessin, [53:11]
“It’s people talking about how capitalism is pretty bad at long-term planning... part of it is just the principal-agent problem. Like no one’s incentivized for long-term success and so they’ll fold on anything for a short-term hit.”
— Sam Lessin, [54:40]
Conversational, candid, skeptical, and sprinkled with both deep industry insight and irreverent in-jokes. The episode strikes a balance between high-level tech analysis, practical experimentation, and sociopolitical observation, with the hosts’ long history and close rapport providing color and context to every segment.
For more about the hosts or the latest episodes, visit the More or Less website or follow @moreorlesspod.
This summary captures all non-ad content and major themes, organized for easy reference.