
Is the internet too far gone or can we still fix it? Neil deGrasse Tyson, and co-hosts Negin Farsad and Gary O’Reilly, sit down with Jaron Lanier, Microsoft scientist, and father of virtual reality, to diagnose what went wrong with the web, how it’s changed with AI, and ideas for a new path back.
Sponsor/Announcer
This episode of StarTalk is brought to you by: six all-new McCafe drinks at McDonald's. We're talking crafted sodas made with your favorite sodas and topped with velvety cold foam, like a Sprite Berry Blast made from Sprite and blueberry raspberry syrup. Don't miss the Dirty Dr Pepper or Orange Dream, too. And there are refreshers made from fruit flavors and add-ins like popping boba and freeze-dried dragon fruit. Try the Blackberry Passion Fruit Refresher, Mango Pineapple Refresher, or Strawberry Watermelon Refresher with freeze-dried strawberries. Try all-new McCafe drinks now at McDonald's. Hey, are you traveling soon? Imagine arriving and actually understanding the language. Imagine being able to communicate with the people that are there waiting for you to arrive. Rosetta Stone's immersive, intuitive method helps you naturally absorb your new language. The lessons are simple. They fit into your day wherever you want them to. You learn at your own pace, and you access the lessons on your laptop or your phone or your tablet. It's super easy, very convenient, and important, because of course everybody loves it when you're able to communicate with them in their own language. Are you ready to start learning a new language this spring? Visit RosettaStone.com/startalk today to explore Rosetta Stone and choose the language that's right for you. Now for me, that language was Spanish: one, because, you know, I live in this hemisphere and there's a lot of Spanish speakers. But two, because I secretly am listening to my mother-in-law talk about me behind my back, and she doesn't even know that I understand what she's saying. All right, I've shared too much. How about you go to RosettaStone.com/startalk today to explore Rosetta Stone and choose your language. Go to RosettaStone.com/startalk right now and begin your language learning journey today.
Gary O'Reilly
Today, StarTalk Radio is presented by Pluto TV. It's a universal truth: Pluto is not a planet. Pluto TV, on the other hand, holds a universe of free entertainment we can stream from our own planet. Check out the ever-expanding list of supernatural favorites, including Fringe, The X-Files, Battlestar Galactica, and a full fleet of Star Trek series you can stream for free. No payment, just pure discovery. See what's landing on Pluto TV. Stream now. Pay never.
Neil deGrasse Tyson
Negin, were you appeased or terrified by that conversation we just had with one of the founders of all of this?
Negin Farsad
I was mostly terrified, and now every time I click on a "Manage Cookies" thing, I know who to be angry with.
Neil deGrasse Tyson
Gary, how about you?
Gary O'Reilly
Comfortably numb.
Neil deGrasse Tyson
Comfortably numb. Coming up, one of the architects of the Internet that is destroying civilization on StarTalk Special Edition. Welcome to StarTalk, your place in the universe where science and pop culture collide. StarTalk begins right now. This is StarTalk Special Edition, which means I got Gary O'Reilly co host here. Gary, how you doing, man?
Gary O'Reilly
I'm good, Neil over in London though, so slightly remote.
Neil deGrasse Tyson
Okay. Only slightly; on the scale of the universe, you're right next door. I also have with me Negin Farsad. Negin, welcome back to StarTalk.
Jaron Lanier
Hello.
Negin Farsad
I'm so excited to be back.
Jaron Lanier
It's been too long.
Neil deGrasse Tyson
I miss you.
Negin Farsad
Absolutely. I have to say, you don't look a day over the last time we talked about dark matter.
Neil deGrasse Tyson
Let me figure out how old I would need to be for that. You're a comedian. You're also a TED Fellow for social justice. That's a thing.
Negin Farsad
I mean, I do comedy that tries to save the world.
Neil deGrasse Tyson
Okay.
Negin Farsad
And it's worked. That's why we live in utopia.
Neil deGrasse Tyson
Fake the Nation. That's the coolest title ever.
Negin Farsad
That's right.
Neil deGrasse Tyson
And I'll never forget the title of your book from a few years ago...
Negin Farsad
How to Make White People Laugh.
Neil deGrasse Tyson
Yeah, that's the best title ever.
Negin Farsad
Thank you so much.
Neil deGrasse Tyson
Yeah, yeah. So Gary, what have you researched for today? It's got something to do with the Internet and it's going to mess up the world? Something like that.
Jaron Lanier
Oh yeah.
Gary O'Reilly
Let's get the good stuff out. So, is the Internet too far gone? In our lifetimes, the Internet has changed from a novelty to a central influence in our lives, and now comes the advent of AI, hawked as the potential doomsday for humankind. Well, is it? Isn't it? Today we're going to talk about whether that's the case, how the Internet influences our world, and how we can come together and fix it. Or not. So, Neil, if you will, introduce our guest. And I think this is just the right person to discuss this topic.
Neil deGrasse Tyson
Yeah, there's a uniquely qualified person in the world to address those issues and more, and I've got him sitting right here. Jaron Lanier. Dude.
Jaron Lanier
Hey, how are ya?
Neil deGrasse Tyson
Welcome to StarTalk.
Jaron Lanier
Well, you know, I think you can speak a little lower than me, but that does not mean I will not attempt it.
Neil deGrasse Tyson
Yeah. We don't use the word polymath too freely today, 'cause there aren't many such folks; there's so much specialization. But if I were to bring that word into the 21st century, I would apply it to you. You're a computer scientist, but interdisciplinary. And what's your title? Prime Unifying Scientist at Microsoft. That's a title. Is that on your business card?
Jaron Lanier
It actually is, yeah. Okay. It's actually a joke. Okay. It's Office of the Chief Technology Officer, Prime Unifying Scientist. So it spells out octopus. And there are several reasons for that. One is I used to study cephalopod cognition because it's absolutely fascinating. But then there's also that some accuse me of starting to look like one myself.
Neil deGrasse Tyson
Oh, that happens. That can happen. That can happen.
Jaron Lanier
That can happen. And so between those two things, I'm the octopus. Yes.
Neil deGrasse Tyson
I'm just impressed that that's even a title that one can ascend to. Are you gonna break the mold? Can anyone become you in this? The answer is no. Just say no.
Jaron Lanier
Well, I will tell you one thing about my role. I have an agreement with them where I can speak my mind, including being encouraged to criticize the company itself, so long as I'm clear that I'm not speaking for Microsoft, and somehow they haven't imploded. I would like to see more people with that role in the tech industry. So in that, I don't want to be the last. I'm aware of maybe one other, who would be Vint Cerf over at Google, but I think we're the only two, and we really desperately need more of us.
Neil deGrasse Tyson
Because, you know, when someone's clamming shut on what you really want them to say about what they're doing, you know it in an interview, right?
Jaron Lanier
Yeah. And in Silicon Valley, that's like 90% of the time.
Neil deGrasse Tyson
Yeah, yeah. So you're considered the father of virtual reality in part because you even coined the term.
Jaron Lanier
Yeah. Okay, look, I was young, all right?
Neil deGrasse Tyson
I think irresponsible.
Jaron Lanier
Were you?
Negin Farsad
Yeah. And on drugs, maybe? Was there some of that involved?
Jaron Lanier
I've never used drugs, and I blame virtual reality for that.
Neil deGrasse Tyson
Oh, it's served that role.
Negin Farsad
You haven't needed it. Yeah.
Jaron Lanier
And I live in Santa Cruz, so I'm in violation of a number of local ordinances by not using drugs, but somehow it just hasn't happened.
Neil deGrasse Tyson
And you've got a book from a few years ago, Ten Arguments for Deleting Your Social Media Accounts.
Jaron Lanier
Yes, I do.
Neil deGrasse Tyson
That was bold back then, and it's probably even more significant today.
Jaron Lanier
Well, you know, the thing about that book is that high schoolers are forced to read it. My own daughter was forced to read it. And so whenever I'm in an airport, there are all these high school kids who come up to me and say, we were forced to read your book. And all I can tell them is, well, you must have done something very bad, and I hope you learned your lesson.
Neil deGrasse Tyson
That's the only way to reply to that comment.
Jaron Lanier
For sure. For sure.
Neil deGrasse Tyson
So if you started virtual reality, what were you thinking behind it? Did you think that other people would be not content with their own reality and you have to create one for them?
Jaron Lanier
Oh, God, no. No, no, no, no, no, no, no, no, no. Explicitly not. My thought about it was that there were two reasons to want to do virtual reality. One was for the extraordinary, weird ways that I hoped people would eventually start connecting with each other with it, and for interesting experiences in art. I still love all that, not that the industry's done much of it. But the other reason is that, you know, we're born into this world and we get so used to it that we don't appreciate it. And when you have a vivid enough alternative for a moment and then you come back to this, normal reality suddenly takes on the amazing qualities it always had but that you'd become inured to. So, like, you put on the goggles and then you take them off. One of the things I used to love to do when VR was very new, back in, like, the 80s or something: I'd put a flower or a cool mineral in front of somebody without them knowing it, while they had the headset on, and they'd take it off and they would, like, look at this thing, and it's like they never saw one before, because in a way they hadn't, you know? And so as a palate freshener, as a point of comparison, it helps you appreciate reality. So to me, it is not an alternative to reality. It's a way to appreciate reality by finally having a contrast, 'cause it's very hard to make a contrast to reality.
Neil deGrasse Tyson
Okay, but what you think and how people actually use the product don't necessarily have to comport. So anyone I know who uses virtual reality gets lost in it. And regular reality becomes less interesting to them because they're on a hike, they're on a space adventure.
Negin Farsad
There's no boundaries to where they can go.
Neil deGrasse Tyson
No boundaries on who they can meet, what social life they can conduct. And you yourself look like something out of Star Trek with your jacket here. It looks very futuristic. Did you pick that up in a virtual-reality future? And you're trying to influence regular reality?
Jaron Lanier
Yeah, you got it. That's exactly.
Neil deGrasse Tyson
I knew I had it.
Negin Farsad
Can I also speak up for a subset of the population, which is people who put on a virtual reality headset and they immediately want to throw up because it gives them motion sickness?
Jaron Lanier
I have two rants I have to give. Can you guys handle a rant? Let me do the rant you just inspired, and then I want to do the rant that you just inspired. For reasons that nobody knows, there are some subsets of the population who are more vulnerable to nausea in VR.
Negin Farsad
Yeah, I'm one. Yes.
Jaron Lanier
And they're almost universally female and almost universally not white.
Negin Farsad
Now, that I wasn't expecting. Yeah, I'm Iranian.
Neil deGrasse Tyson
They all wear orange glasses.
Jaron Lanier
They're mostly Asian, mostly East Asian. I'm not aware of any study on Iranians in particular or Middle Easterners.
Neil deGrasse Tyson
And you're an Iranian.
Negin Farsad
Yeah.
Neil deGrasse Tyson
Yes. Okay.
Jaron Lanier
Yeah. And so the thing is, let me try to explain to you how frustrating this is. The virtual reality industry in Silicon Valley spent. We don't know quite how much, but based on figures that have been revealed, it's in the hundreds of billions of dollars, and just for the sake of argument, let's call it a quarter of a trillion dollars in developing VR in the last, I don't know, five years.
Neil deGrasse Tyson
That's how much money it took to go to the moon. Just to reframe the context here.
Jaron Lanier
Think about what you could do with that money. Okay, now, before we get to the question at hand, I want to point something out. Like, let's say I showed you, in the 70s, a computer, like a laptop, and here's a screen, and there's a keyboard, and you say, what would you want to do with this? Might you not think, I would edit a document on that, that might be better than a typewriter? And would you think that's the limit
Neil deGrasse Tyson
of what I would have said?
Jaron Lanier
But you would have said that.
Neil deGrasse Tyson
Definitely.
Jaron Lanier
Okay.
Negin Farsad
I would have said, like, let's make boobies out of zeros and ones in like a.
Jaron Lanier
You know, so your desire has been met. But let us. Let us just.
Negin Farsad
I'm just saying. Yeah, no, listen.
Jaron Lanier
And you say, men don't listen. Okay, we do. We do. All right. But I want to address this question of, like, would you be surprised if, after, let's say, a quarter of a trillion dollars of investment, there was no word processor on this thing? You would find that surprising. Now, if I show you a VR headset and I say, what would you want to do with this? One of the first things you might say is, I'd like to be able to do 3D design work in it. Would you be surprised if, after a quarter of a trillion dollars, there's no decent, reliable, usable 3D design program for a VR headset? All right, I'm surprised, but that's exactly what happened. Now, we can go into why. But the thing is, VR has been sort of a disaster, because the companies that are doing it are absolutely trying to make it into whatever they already know how to sell. Apple wanted to make it into a big iPad or a movie theater. Meta wants to make it into a social network with evil qualities, and so on. Nobody's let VR be VR. Or there's the gaming people, who want it to be a game, which can work to a degree, but only for a narrow audience. But let me get back to your question. I won't name names here, but one of the very, very, very large tech companies spent many, many, many billions of dollars on a headset. And I talked to the person who was the head of that program, and they said to me, you have to try the latest version. We have absolutely solved the nausea problem. We have absolutely solved it. All that stuff you said about nausea, it's gone. It's obsolete. And I said, okay, okay. Have you tested it on a broad population? Have you tested it on women? Have you tested it on Asian women? Well, why would we do that? It's the same for everybody. And I'm like, okay, have you read any papers? There are these academic... We don't need to read papers. We're way ahead of the academic world.
Neil deGrasse Tyson
We have a room of white men, White Western men.
Jaron Lanier
Do you have any female engineers on the team that is working on this? No, we have none. And I'm like, okay, my friend, get ready. And so then the first review is from the Wall Street Journal: Asian female got sick. Oh, now you can figure out which company it was. But anyway, nobody's gonna bother; nobody's obsessive enough out there to figure that out. But the thing is, this is a disaster. Like, there's two disasters. There's the moral and ethical disaster of hiring in such a narrow way that we make ourselves blind and we make our products narrow. And then there's the disaster of not actually serving the people we're supposed to serve, by refusing to look at them. And after you've heard all I just said, yeah, you might have some friends who are getting lost in VR, but VR as a whole has not found a popular audience at anything like the scale of, like, a normal computer or a phone or something. It ought to, but it's not going to as long as we willfully blind ourselves. And there's an old joke in business: well, you lose money on every unit, but you make up for it in volume. But that's what we do. Like, in Silicon Valley, we say, well, maybe we're doing something stupid, but we'll do it at such big scales that it'll make up for it. And it doesn't work.
Negin Farsad
You know, the only friend that I have that really uses VR, it's literally to, like, rewatch The Departed and then fall asleep. That's, like, so far the only use of VR I've seen out of my friends.
Jaron Lanier
If it makes them happy, who are we to judge? All I'm saying is that the things you can do, you can turn yourself into an approximate four dimensional shape to develop four dimensional intuition. You can do, you can merge bodies.
Neil deGrasse Tyson
Just to be clear, we live and think in three dimensions. And for me, one of the greatest challenges as a kid was, I want to think in four dimensions. That would be just so cool, to be able to do that.
Jaron Lanier
Well, I mean, someday. Oh God. When I was, oh gosh, probably 21 or something, and we were starting the company, all the engineers at the first VR startup, and this would have been in '81 or something, we made a pact among ourselves that if we ever had kids, which seemed impossibly remote and was never gonna happen, we would raise them in little VR goggles. And then, while they were asleep, we would change to bigger ones as they grew up, so they'd grow up entirely in VR. And the purpose would be that they'd grow up in four dimensions and be 4D natives. That was the idea. And then they'd be the world's best mathematicians.
Neil deGrasse Tyson
And so this is the Truman Show, but, as you see, by mathematicians. Yeah, yeah, excuse me, by crazy mathematicians.
Negin Farsad
But just that during their sleep they would have.
Jaron Lanier
Well, during the. Swap it out, you'd have to swap out the headsets.
Negin Farsad
Oh, I see, I see, yeah.
Jaron Lanier
Presumably we'd feed them. I mean, given us, maybe not. But, you know, the idea is that they'd grow, and then... So when I told my daughter this, when she was, like, 11 or something, she got really pissed at me: I could have been the first kid in four dimensions. What's wrong with you? And I was like, okay.
Neil deGrasse Tyson
So, just because I want to pivot to social media in just a minute, let me try to summarize some of what you said: VR has its uses, but in fact it has yet to have what they call the killer app, where everyone has to go to VR.
Jaron Lanier
Okay, okay.
Neil deGrasse Tyson
To go to VR, to see and experience this thing that everyone just has to do. Rather than just see a movie in VR or play your video game in VR, or these other things that are just a transposed experience, as opposed to a completely new experience, like your 4D child.
Jaron Lanier
You currently can't go into commercial VR and change into a different kind of animal, or become a shared creature with other people. I make this kind of jewelry; I can't design these shapes in VR, which is insane in 2026. It's insane that I can't do this in VR. So look, in a way, VR doesn't exist yet. Even for the most basic apps: the hardware is getting there, but the software has not been born yet.
Neil deGrasse Tyson
Across history and pop culture, we've imagined aliens in every possible shape and form. But what do the laws of physics in the universe allow? Not fantasy, not fiction, just the universe playing by its own rules. And when the universe plays by its own rules, it can reveal to us all the ways of being alive that are not limited to the creativity of Hollywood storytellers. I explore that topic and many others in my latest book, Take Me to Your Leader. I narrated the audiobook, and I'm duly notified that the audiobook and the print copy are available now, wherever books are sold these days.
Gary O'Reilly
We're used to getting things delivered on demand: groceries, a new gadget, or the latest book expanding your view of the universe. And now you can add T-Mobile 5G Home Internet to that list. Just order from T-Mobile and enjoy same-day delivery with DoorDash, and you can set it up yourself in about 15 minutes, no advanced engineering degree required. That means more time doing the things you actually enjoy, like streaming a space documentary or going down a rabbit hole about exoplanets, asteroids, or whatever's pulling your curiosity. And those online explorations are a lot easier with the fastest 5G home Internet. So if you're moving into a new place or just ready to upgrade your connection to something a little more advanced, visit T-Mobile.com/homeinternet to check availability and get your home Internet delivered today. Same-day delivery for most Internet-eligible customers; see if it's an option during checkout. Fastest according to Ookla Speedtest Intelligence data, second half 2025. All rights reserved. This episode is brought to you by Progressive, where drivers who save by switching save nearly $750 on average, plus auto customers qualify for an average of seven discounts. Quote now at Progressive.com to see if you could save. Progressive Casualty Insurance Company and affiliates. National average 12-month savings of $744 by new customers surveyed who saved with Progressive between June 2022 and May 2023. Potential savings will vary. Discounts not available in all states and situations.
Sponsor/Announcer
Dry eyes still feel gritty, rough, or tired? With Miebo, eyes can feel relief. Miebo, perfluorohexyloctane ophthalmic solution, is the only prescription dry eye drop that directly targets the number one cause of dry eye: too much tear evaporation. Miebo mimics the way the protective outer layer of a healthy tear film fights evaporation, allowing you to keep more of your own tears. It can help the surface of the eye heal when used consistently as directed, so eyes can find relief. Don't use if allergic to Miebo. Remove contacts before using and wait at least 30 minutes before putting them back in. Eye redness and blurred vision may occur. For more info, talk to your eye doctor, call 1-844-MIEBO, or visit miebo.com to find an eye doctor near you. What does treating dry eye differently feel like?
Neil deGrasse Tyson
Miebo. Oh yeah.
Gary O'Reilly
Something Jaron said about Meta and their thinking about developing VR as a social media tool. If I've got that wrong, please correct me. We're looking at a company that, as we sit here in April 2026, has just been found liable in a lawsuit over social media addiction. That sounds so, so wrong: turning VR into something that has an addictive element to it. So I'd like Jaron's thoughts on that, if at all possible.
Jaron Lanier
Yeah, this was something all of us in the VR world warned about from the early days that it could be turned into an addictive medium. Definitely.
Neil deGrasse Tyson
So you saw it coming?
Jaron Lanier
Oh, everybody saw it coming. Okay. No, listen. All right, look, if you want to talk about seeing it coming: one of the very, very first books about computers ever, maybe the first one, was by Norbert Wiener in 1950, and it was called The Human Use of Human Beings. And it was essentially about how the most important thing about computers is how they could automate behaviorist algorithms to change people, to manipulate people. And since they would eventually be networked, he has a thought experiment in there about people walking around with little radio-connected devices that would go to a central computer, and how dangerous this would be. That's three-quarters of a century ago. And he thinks of it as an extinction-level event that we have to start foreseeing and avoiding. 1950. Okay, so everybody knew. Nobody can say they weren't warned. Everybody knew. And I've been writing warnings about this thing for, I don't know... my first major thing about social media and how bad it could be was '92, if anybody wants to look it up. It's called Agents of Alienation, about how software agents could mess with you.
Negin Farsad
Is the lawsuit gonna fix any of this? Like Gary mentioned the lawsuit. What do you think of it?
Jaron Lanier
I think we're in a moment of great chaos where it's very hard to predict, even harder than usual to predict things. So the outcome of upcoming elections, both here and elsewhere will be important to what happens right now. The population in general is very uncomfortable with the tech industry.
Negin Farsad
That's an understatement.
Jaron Lanier
Yeah. And I mean, it's funny for me because at some point in the past I was one of the very few people criticizing us, although doing it from the inside. And everybody thought it was just really weird. And now everybody's doing it and I almost feel too conformist because I want to be the weirdo.
Negin Farsad
Why do you want to be the weirdo?
Jaron Lanier
I want to be the weirdo.
Neil deGrasse Tyson
Everyone caught up with you and now you're just a regular guy.
Jaron Lanier
Yeah, see, that's really awkward. So I have to.
Negin Farsad
You've been saying since 2016, delete your social accounts. And now longer than that.
Jaron Lanier
That's just the book, but yeah. And my friends and I made a movie called The Social Dilemma that the same high school kids are forced to watch, and all that. And maybe it does a slight bit of good; every once in a while one of them tells me it did. But if it was left to a popular vote, everything would change. But it's not. And there's a property of digital networks that's a math thing, not a political thing. And the math thing is called the network effect, or the extreme Pareto effect, or there are other words for it. And what happens is, when you have a very low-friction system of things that are connected together, once there's one node that becomes more influential, it starts, as Andy Warhol put it, getting famous for being famous, and it accumulates and accumulates, and you start to have this hyper-centralized power and influence around one node. And that node might be called Meta or Google or something, you know. Those are examples; there's others as well.
Neil deGrasse Tyson
You said low friction. What you mean by that, as I understand it, is the freedom with which information flows among all of these nodes. A slight advantage then grows for having been a slight advantage, and you get a runaway process.
Jaron Lanier
Exactly. When you have more friction, like in the pre-Internet world, you have more middlemen who each get a little bit of power. And what it does is it distributes power and wealth more. So now we have the situation where a handful of people have more wealth than the bottom half of society. And I say people rather than companies because they tend to be single-person-run companies. Not exclusively, but that's very common.
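The runaway Jaron describes can be sketched in a toy simulation (my own illustration, not anything from the episode; the quadratic "famous for being famous" weighting and the `friction` parameter are modeling assumptions, not his math):

```python
import random

def simulate_network_effect(n_users=20_000, n_platforms=50, friction=0.0, seed=42):
    """Toy model of the network effect: each new user picks a platform
    with probability proportional to the square of its current size, so
    a slight size advantage makes a node disproportionately attractive.
    The `friction` term mixes in uniform choice, standing in for the
    middlemen of the pre-Internet world, which spreads users out."""
    rng = random.Random(seed)
    sizes = [1] * n_platforms  # every platform starts with one user
    for _ in range(n_users):
        if rng.random() < friction:
            pick = rng.randrange(n_platforms)  # friction: uniform, distributed choice
        else:
            pick = rng.choices(range(n_platforms),
                               weights=[s * s for s in sizes])[0]  # rich get richer
        sizes[pick] += 1
    return max(sizes) / sum(sizes)  # share of all users held by the biggest node

# Low friction: one node runs away with most of the network.
# High friction: users (and power) stay spread across platforms.
print(f"low friction:  top platform holds {simulate_network_effect(friction=0.0):.0%}")
print(f"high friction: top platform holds {simulate_network_effect(friction=0.9):.0%}")
```

With friction near zero, a single platform ends up with the overwhelming majority of users even though all fifty started identical; with heavy friction, no platform ever dominates. The exact numbers depend on the assumed weighting, but the qualitative split is the point.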
Neil deGrasse Tyson
In the old days, other than Ford himself, a company was just a company. You didn't even know the names of who ran it. It was just the name of the company.
Negin Farsad
Well then also there's like that effect of like they are getting high on their own supply. Right. Like they want to be famous as a part of the like, you know, technology platform that they put out, you know, so there's a little bit. I mean, am I talking about Elon Musk? Probably, but
Jaron Lanier
who are you talking about? I didn't get that. I didn't get that.
Negin Farsad
But you know what I mean? Because before, you didn't know. I never knew who the CEO of anything was.
Neil deGrasse Tyson
Right, right, right. And Elon has 280 million followers and he probably loves that and he owns that platform.
Jaron Lanier
Yes, yeah, yeah. Well, I wrote a piece once about Elon and Kanye, or Ye, and Trump for the Times, before Elon bought Twitter. And what I said is, you know, there's this thing that happens to people who are on Twitter and other platforms, which is that they start off with different personalities, but they converge on the same one, because that personality is the social-media-addicted personality. And it's excessively petty, vain, confrontative, and nervous. It's mean, it's never satisfied. There's a certain...
Neil deGrasse Tyson
That's the playbook right there.
Jaron Lanier
But the thing is, what happens is whenever somebody's on it, they turn into one of these. So those three people were very different before and then they turned into, became similar, you know. And so what we have is the behavior mod machine that Norbert Wiener warned us about 75 years ago, 76 years ago. And it's actually working and it's turning the founders into the victims.
Gary O'Reilly
Jaron, yeah, just to touch on that point. This systemic process of social media addiction and, as you've just highlighted, the creation of those characteristics: is there any way to take that out and keep what is, in principle, a decent idea of social media? It's just been developed in a way that's addictive. Is there any way to, I want to say, untangle that?
Neil deGrasse Tyson
Can you edit the beast?
Negin Farsad
Yeah, because in the early days, also, just to follow up on that, in the early days of the Internet there was, like, Friendster and stuff, right? And it wasn't this mean. I don't remember specifically, but I feel like everything in the beginning was sort of nice. It wasn't a cesspool. It was, like, fun and cute. So why did we go in this mean direction?
Jaron Lanier
The reason we went into the mean direction is that the only business model allowed in Silicon Valley is influence generation. So the idea is that you get the ability to influence a bunch of people and then other people pay into your system. You could say they're paying to be able to influence, but I think the more accurate statement would be that they're paying blackmail money. Not to be left out of the influence pool. But however you want to frame it,
Neil deGrasse Tyson
that's a brilliant way to think about it.
Jaron Lanier
But at any rate, because that's the only allowed business model, everybody gets put under the influence of the algorithms. And the side effects of the algorithms are as we've described.
Neil deGrasse Tyson
It's an outrage magnifier. That's really what it has become.
Jaron Lanier
Well, outrage is one of the things. Basically, from a neuroscience point of view, here's what we believe. I mean, there's a community of people who study this, of course, and I don't know that the science is complete, because we don't really understand the brain, so I don't want to overstate it. If I'm on a science show, I want to be careful to state the limits of what we actually know. But what we believe is that there are different parts of the brain that respond to the world in different ways and at different speeds. And there's a sort of a fast brain, and the fast brain sometimes is known as the fight-or-flight or the twitch-response brain, which is very alert to dangers, and sometimes to opportunities: to pounce on prey, or find a mate, or whatever it might be. But there are these things...
Neil deGrasse Tyson
It's primal.
Jaron Lanier
And so the thing is, when you're under the regime of instant feedback generated by an algorithm, what it tends to do is keep that fast-brain stuff constantly activated. So it's like you're always being stalked, you're always stalking, you're always horny, but you're never satisfied. You always feel that you're alone because you can't trust anybody. You're paranoid. You're always hyperconscious about how you appear socially because you're worried about bullies; you're worried about being bullied all the time. Everybody is that way once in a while. But when you're like that all the time, then you turn into Trump or Elon.
Neil deGrasse Tyson
Your people have hijacked what would otherwise be a helpful evolutionary trait within us.
Jaron Lanier
Yes.
Neil deGrasse Tyson
It's dangling there in modern times with much less use today, because there's not a lion in the brush.
Jaron Lanier
Even when there were lions in the brush, you had to modulate between being hyperattentive and not. You can't be on all the time and be mentally healthy, even in that environment. It wears you down. Or at least that's our present understanding; I've seen that same conclusion from many people using different methodologies. But I want to answer the question of whether it can be improved, because I think that's really the important question of our day. If I'm correct that the reason this is happening is that we're only allowing one business model, then the way to fix it is to allow other business models.
Neil deGrasse Tyson
That sounds so obvious. Yeah, but maybe it's impossibly obvious.
Jaron Lanier
No, I know. Look, I'm not saying this is easy; I'm just saying that the logic is easy. Now, the implementation might be quite difficult and might not be doable within our lifetimes, I don't know. But I do want to point out that the onset of all these troubles coincided with an ideology, somewhat paid for by the companies and somewhat an inauthentic grassroots ideology, that it's evil and horrible to pay for information, to pay musicians or something like that, or to pay for software, and that there has to be this free sharing, and that's what the Internet is for. Which sounds great until you understand the network effect, which means that every time you freely share your open software, the party who gets richer and more powerful is not the community but the Google or whoever's at the center. So if you don't understand network effects and how the math works, you don't understand that your very well-intentioned activism is actually having exactly the opposite effect you think it is. So the Pirate Party was actually in service of an empire, just like the original pirates.
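The network-effect math Lanier invokes here can be made concrete with a toy model. The quadratic value function below is the classic Metcalfe's-law assumption, not anything Lanier specifies, and the hub names and user counts are invented purely for illustration:

```python
# Toy illustration of a network effect: if a platform's value grows
# roughly with the square of its user count (the Metcalfe's-law
# assumption), a hub that is 9x larger captures far more than 9x
# the total value, so "free sharing" still enriches the center.

def network_value(users: int) -> int:
    """Metcalfe-style value: proportional to users squared."""
    return users ** 2

# Hypothetical hubs: one central platform, one community mirror.
hubs = {"central_hub": 900, "community_mirror": 100}

total = sum(network_value(u) for u in hubs.values())
share = {name: network_value(u) / total for name, u in hubs.items()}

for name, s in share.items():
    print(f"{name}: {s:.1%} of total value")
```

With 9x the users, the central hub ends up with roughly 99% of the value, which is the asymmetry Lanier says well-intentioned activists miss.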
Negin Farsad
Damn, you're bumming us out, dude.
Jaron Lanier
I know.
Negin Farsad
This is like so sad.
Jaron Lanier
Well, stop asking me serious questions, then. But wait, wait. I have to say one other thing, though, which is that if you're asking, well, are there alternate business models? Yes. And I also want to point out, you were asking, can it be better? There's some evidence it can be better, because not all online hubs or platforms are equally bad. You can see a variation in them; GitHub is better than 4chan, or something like that. And so if you look at the spectrum of badness in different hubs, you can compare them and ask, well, what is different about them? And if you start doing that, you get a vector out of it that points you toward what might be better still. I think that's a really interesting and worthwhile thing to do, and we do have the data to do it.
Neil deGrasse Tyson
I'm glad to hear.
Gary O'Reilly
Glad to hear that. Now, before Jaron jumps onto another subject here: if these tech bros, and I'm sure there are a few sisters in there too, do not self-regulate, do not allow a competitive arena, who makes them do it? Is it government? Are they worth more than government? Surely the tail is wagging the dog here. Am I wrong?
Jaron Lanier
Yeah.
Neil deGrasse Tyson
Plus, they were all on the inauguration stage.
Gary O'Reilly
How about that, Neil?
Neil deGrasse Tyson
Yeah. Yeah.
Gary O'Reilly
Mm.
Jaron Lanier
Yeah. Okay. So look, the thing about very rich and powerful people is that their wealth and power still depend on a mass population that accepts some system in which their wealth and power is defined, right? And so if they lose enough popular legitimacy and support, no matter how central they are, they fall. That's happened repeatedly in history, and it's really not any different now. So what I'm seeing is enormous discontent with tech. And I also see young people, especially, being incredibly discontented with it. There's another phenomenon which is really interesting, which is that the current generation of young men doing tech, the AI people, are starting to get old enough that they're starting to have kids. And that really changes people. You'll see their character turn around. That really has an effect, usually. Not for all of us.
Negin Farsad
I actually feel like I was like, verging on a sociopath before I had a child. So, like, I do see that changing people.
Neil deGrasse Tyson
One of the clearest transitions anyone makes is comedians after they have children.
Negin Farsad
Oh, yeah.
Neil deGrasse Tyson
It changes their portfolio of jokes.
Negin Farsad
Oh, absolutely.
Neil deGrasse Tyson
And their observations of the world.
Negin Farsad
We have a child that we can mock mercilessly, which we do, but the mocking comes from a bed of love.
Jaron Lanier
You will pay for that someday. I'm already paying for that, let me assure you.
Neil deGrasse Tyson
Okay, so Gary, where are we pivoting next?
Gary O'Reilly
All right, before we do, Jaron, let me float this. I'm sure thinkers like yourself have had this consideration. Do we have a delete day, a delete month? And would that ever be enough, if everyone just said, you know what? Screw this, I'm going to delete my account? What would it take to move the needle?
Jaron Lanier
I've tended not to try to do things like a delete day, and I'll tell you why. It's because each person is different. I called my book Ten Arguments for Deleting Your Social Media Accounts Right Now, but I didn't say that you should do it. There are some people for whom it makes sense, and I don't think those people should be put under social pressure or shamed. I really don't. I don't think that gets us anywhere. In fact, it just puts us back in the same game we're trying to get out of.
Neil deGrasse Tyson
Can I tell you what I did? I noticed over the years. Cause I have a pretty high following in social media. I noticed that.
Jaron Lanier
Brag.
Negin Farsad
Okay.
Jaron Lanier
Just trying to. Just trying to.
Negin Farsad
Neil has a lot of followers, guys, okay?
Neil deGrasse Tyson
And I noticed that if I ever posted something that was, or even was adjacent to, an opinion, people who didn't agree with the opinion would attack it. They wouldn't say, oh, that's interesting, here's my opinion, what do you think of that? They would attack it. And that attack mode was highly revealing to me, because it showed me the anger that people had with any views that differed from their own, and a militancy in people's attitudes that I had not ever seen growing up when people had different points of view. So, as an educator, I want to be effective. I don't want to fight if I don't have to fight. So I navigate that, and I no longer post opinions; I post perspectives. People might still react in an opinionated way, but I don't want to give up that platform if, as an educator, I can continue to deliver perspectives that do not jump into the cesspool that surrounds these islands of learning. So that's what I've done. Your ten reasons for deleting: I love them, but let me find the reasons not to delete. And that's how I climbed out of that hole.
Jaron Lanier
My working theory is that one third of people on any digital platform benefit from it and two thirds suffer from it. Now, I'll tell you where that comes from, because it's an ironclad scientific argument, right?
Neil deGrasse Tyson
So you didn't just pull those numbers out of your ass.
Negin Farsad
Cause it sounds like you did.
Jaron Lanier
I have an ironclad scientific ass.
Neil deGrasse Tyson
Well, there it is. If we all had ironclad scientific asses, we could pull all kinds of stuff out of them.
Jaron Lanier
Yeah, that's what this world is.
Neil deGrasse Tyson
Wait, wait.
Jaron Lanier
Can I just give you my argument? After all that, don't you want to hear it? Aren't you even slightly curious?
Neil deGrasse Tyson
I want to know your ironclad ass argument.
Jaron Lanier
Okay. In the Turing test, Turing argues that since we don't have some meter for whether somebody has a soul inside, or whether they're alive inside.
Neil deGrasse Tyson
Let me just remind you: Alan Turing wrote a paper describing what he called the imitation game. This is very early on. How would you know if you're talking to a computer or another human? You set up a conversation between the two, and if you can't tell the difference, then maybe the distinction isn't important.
Jaron Lanier
There's a third person, a judge, who's supposed to tell which is which. And if the judge can't tell, then you say, well, we might as well treat the computer as a person. And then there are different versions of this. Maybe the computer's intelligent. Maybe it's got a soul. Maybe it deserves rights.
Neil deGrasse Tyson
Or whatever other questions you might ask. Yes, that's the Turing test. I just wanted to catch people up on that.
Jaron Lanier
It's based on a rather naughty old Victorian party game where it's supposed to be a man and a woman behind booths, and the judge is trying to tell which is the man and which is the woman. And the questions were not for polite company.
Neil deGrasse Tyson
Oh, okay.
Jaron Lanier
So that's where it comes from. And for those who don't know, during the time when Turing wrote this, he was being tortured to death for being gay, because it was illegal at the time in Britain. There's a whole crazy backstory to this, but let's leave all that aside. So here's the thing: the only thing the Turing test can tell you is whether a judge can distinguish the two. It's possible that the person got stupid, or got less self-aware, or whatever; that's equally as logically possible as the computer becoming more elevated in some way. So you can't tell whether the computer got smarter or the person got stupider. But there are two people and one computer, because there's the contestant and the judge, and only one computer. Therefore, there's a two-thirds chance that a person got stupid and a one-third chance that the computer got smarter. So, as a general rule, my ass tells us, my highly reliable ass tells us, that two thirds of the people on any digital thing, whether it's AI or whatever, are going to be degraded by it. But one third will see a genuine benefit. And I think you're in that one third.
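Lanier's tongue-in-cheek arithmetic can be spelled out literally. The setup has three parties, two humans (contestant and judge) and one computer; if an indistinguishable result means "somebody changed," and we pick the changed party uniformly at random (my framing of his joke, not a real model), the odds fall out as he says:

```python
# Spelling out the two-thirds / one-third counting joke:
# three parties in a Turing test, two of them human.
parties = ["human contestant", "human judge", "computer"]

humans = sum(1 for p in parties if p.startswith("human"))
p_person_got_duller = humans / len(parties)       # 2/3
p_computer_got_smarter = 1 - p_person_got_duller  # 1/3

print(p_person_got_duller, p_computer_got_smarter)
```

Which is, of course, exactly as ironclad as its source.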
Neil deGrasse Tyson
Yeah. Okay. Thank you for that affirmation because I always felt uncomfortable given what I see going on on these platforms.
Jaron Lanier
It might be an ass-firmation. Or an ass-certainment, perhaps.
Neil deGrasse Tyson
Thank you for the affirmation. Yes, yes.
Jaron Lanier
This is really elevated. We're not just schoolkids here. This is professional science.
Negin Farsad
That's how science is done, just so the audience knows. I also just think three thirds of people would benefit if it just wasn't on a phone. Like, if social media wasn't on your phone, and you had to designate time for it by going to your laptop, wouldn't that do the job?
Jaron Lanier
Read the final chapter of Norbert Wiener's 1950 book, The Human Use of Human Beings. Wiener founded cybernetics before there was much of anything in computer science. The final chapter says, just as a thought experiment: imagine there were a small, portable, radio-connected device you could carry around that undertook behavior modification. This would destroy civilization. However, I, as one of the world's leading scientists, assure you this is physically impossible and will never happen, so you don't need to worry.
Neil deGrasse Tyson
So Gary.
Jaron Lanier
Yeah.
Gary O'Reilly
Sort of stringing together this daisy chain of thoughts that Jaron has, and bringing them to one place: we've seen this lawsuit against Meta over social media addiction, and now everybody's calling it the 21st-century version of Big Tobacco. Is this just the warm-up for something we could find ourselves in with AI? Because if we're talking about AI, and responsibility for actions AI takes, are we not back to the same kind of lawsuit, only this time it's not social media addiction, it's artificial intelligence?
Jaron Lanier
I work on AI. I work on it all the time. I think the kind of AI we're doing right now can actually be a benefit. I think we're spending way too much and taking up way too many resources on data centers. I think there are various ways we're doing it that are a little foolish and simple-minded. I think a lot of the claims are overblown and somewhat theatrical rather than real. But I do think there's a core there that's important in a few different regards. The reason I'm going to say this is that I don't think having big tech companies is a bad thing or a necessary evil. I think there are some big jobs that need to get done by big companies. So what I'm hoping is that corrections can happen in a gentle and constructive way instead of some destructive way. There's a fantasy that if you go and blow something up, or let it blow itself up, what comes out will be better. It's what Lenin called maximizing the contradiction, and there are other terms for it, but I don't think it ever works. I think what happens when things blow up is that you have rubble. You don't squeeze the toothpaste back in the tube easily, and you don't rebuild from rubble easily, and I don't think it's generally a good strategy. So what I'm hoping for, rather than the emotional satisfaction of, oh yeah, those people are terrible, we'll blow them up, or some horrible thing will happen, is a constructive transformation, and I want to be part of that. That's why I'm both inside it and a critic, because I don't think that's inconsistent. I think we should be responsible for being able to criticize ourselves and improve, and I think that's healthy. So that's what I'm hoping. It doesn't mean that's what we're going to get, but I don't think it's too late to hope for it.
Neil deGrasse Tyson
Can you distinguish for me the mythologizing of AI versus AI simply being a tool? Because AI is on everyone's tongue today, and everyone's mind and everyone's fear.
Negin Farsad
It's kind of like Trump. Trump and AI. People can't stop talking about those two things on any given news cycle.
Neil deGrasse Tyson
Yeah.
Jaron Lanier
And they're both hollow. So, a couple things about AI. Let me propose something to you. If I show you some object, there's more than one way to think about that object, right? Would you agree with that? If I say, get me directions to this place, one way is with a map, where you see it, and another way is with step-by-step directions. They're equivalent, but they're different. Very, very basic. In the same way, there's a completely different way to think about AI that's equally valid, but that I think is better in a practical sense. Technically, it's equivalent. And that equivalent way is: you can think of AI as a new form of collaboration between a collection of people. So let me explain what I mean by that. Throughout history, the ramp of technological capability has been marked by people learning how to cooperate more and more. Language was the biggest step, way early. But then there's writing, and then the Gutenberg press, and then radio, and whatever, all these different things. If you look at Wikipedia as a little point on that ramp, it takes a bunch of people's work, and they cooperate to make this single document. There are some things about Wikipedia I don't like, but let's use it; there's a lot that's great about it. Now, you can think of large language model AI, the kind that's on everybody's mind, as a whole bunch of people whose work was combined into a single document. Like Wikipedia, only it's much vaster, their contributions and efforts were much less voluntary, there are a lot of things that are different, but fundamentally it's the same kind of beast. Right? Now, if you think of AI as a collaboration of a bunch of people instead of as some new entity, you haven't changed a single thing technically. It doesn't change any code. It's just a different perspective.
But when you have that perspective, suddenly new avenues open up that are amazing, and it's a better way to think about it. I'd like to get into some of the reasons you'd want to think about it that way, because they're very profound. But why do people want it to be a creature? Why do they want that? Why do they want it to be an entity? Well, there are a few reasons. One is, you're a young man. You think the world owes you everything. You get paid a lot. You think you're the center of the universe. Of course you think you're making God. That's what you want. That's how you think. That's where you are. I kind of remember being that way when I was 20 or something.
Negin Farsad
You're describing a lot of men I dated in my 20s.
Jaron Lanier
On behalf of my gender, I apologize. You're right, we're wrong. No, I mean, I remember, like, when we were starting VR, we thought a lot of ourselves. We thought, this is the most exciting room in the whole universe, ever. And, you know, it was actually a pretty cool room at the time it started to work. But the thing is, there's a lot of ego in it. And then there's another thing, which is that almost everybody in it is pretty young, and they grew up on a diet of science fiction movies and video games, and the stories they have, the vocabularies for how they understand the world and how they can express the world, are all just like this. They didn't have the Star Trek of the '90s, which was this cool positive world that was socially improving at the same time it was technologically improving. Instead they had the Matrix movies, and they had the Terminator, and they had the Marvel universe, and there's all this bleak, bleak stuff over and over and over again, where the whole race dies, everybody's gone, and the computers are intelligent, and it's all gloomy and everything's off.
Neil deGrasse Tyson
There's soon to be an XPRIZE that will fund any project that creates a positive outlook for the future of civilization, as a counterpoint to all of these negative futures imagined by sci-fi writers.
Jaron Lanier
I don't know if you've ever been involved in this. I've been involved in a few attempts to pitch positive futures in Hollywood, and it's very hard, because everybody thinks, well, if you're conservative, and if you're manly, and if you're really serious about making money, you don't do positivity in science fiction. That's for the rom-coms. So that's girly. That's girly.
Neil deGrasse Tyson
Interesting. So they divide the kingdom that way. Interesting.
Negin Farsad
So are you saying we would have a better relationship with AI if we just saw more Star Trek from the 90s and less of everything else?
Jaron Lanier
Yeah, I'll go further. If Star Trek from the '90s had lasted another ten years, there'd be many fewer teen suicides today.
Gary O'Reilly
Wow.
Negin Farsad
All right, that's a claim.
Jaron Lanier
I can't prove it, but I believe that.
Gary O'Reilly
Jaron, I'm going to take you back a couple of years now, to a piece I think you wrote in The New Yorker: There Is No AI. A simple question to follow that: what did you mean by that?
Neil deGrasse Tyson
What year was that?
Gary O'Reilly
2023.
Jaron Lanier
Yeah. See, they're mad at me because I owe them pieces, and I'm really bad, and I have to deliver something. But anyway, yeah, There Is No AI is what I was just saying: that there's a way of framing it where it's a collaboration of people instead of a new entity. And the reason to think of it as a collaboration instead of an entity on its own? The bad thing is you kill somebody else's God. And I hate to do that. I like people to be able to have their own religion, and they really don't like it, and I've lost friends over that and everything. But what you get out of it is incredible. Let's just talk about a few of the things. One of the things right now, as capable as the models are getting, and some of the recent things are pretty impressive, is that the most impressive edge of it is probably using them to help speed code development. And that's kind of working, you know.
Neil deGrasse Tyson
Just for context: in my life, I've written probably 50,000 lines of code, which is small compared to professional coders. But I remember how much time I spent debugging my code. I could write it over a weekend and spend two weeks debugging it. And now you tell the AI what you want, and it comes back essentially bug-free; you tweak it a little bit here and there. Had I had access 20, 30, 40 years ago, I probably would have just spent more time at the beach. I don't know if I would have been more creative, but I definitely see that today.
Jaron Lanier
Yeah, well, you know, as a computer scientist, I have to say I've always thought that our concept of what code is was a little embarrassing and wasn't really working. And I feel like it's our job to fix that, and this is part of it, so that's good. But most of the things that happen are probably more theatrical. Like, if you think you have an AI girlfriend. Oh, you know, I have a cure for that, by the way.
Negin Farsad
A cure for what part of it?
Jaron Lanier
If a teenager thinks they have an AI lover, that it's real, which is pretty common these days, I find it in high schools and stuff, you show them the group photo of the engineers who made their AI lover.
Neil deGrasse Tyson
That'll cure them immediately.
Jaron Lanier
It kind of does. It tends to do the trick.
Negin Farsad
Yeah.
Neil deGrasse Tyson
Is the man more likely to have an AI girlfriend than the woman is to have a boyfriend?
Jaron Lanier
I don't know that there's data on that. I know people who are studying it; I'm actually really interested in that. But that's something you can get data on. So instead of saying something snarky, I'll just say: let's deal with that as science, and let the people who are researching it get to the point where they
Neil deGrasse Tyson
feel like we all know the answer. But, yeah, okay.
Jaron Lanier
Oh, God, why do I even bother? Like, why do I try?
Negin Farsad
There is no.
Neil deGrasse Tyson
You gave the writing.
Jaron Lanier
I'm trying to be the responsible scientist for three seconds in this ridiculous interview, and you're not even giving me those three seconds.
Neil deGrasse Tyson
Yes. No, I love you for it. Go.
Jaron Lanier
All right, all right. Okay, let me go on. So now there are these huge problems. Even with the best recent models, it's not that hard to crack them and get something they're supposed to prevent with a so-called guard, you know, the guardrails that they build in.
Neil deGrasse Tyson
Yeah.
Jaron Lanier
And so here, let me give you a thought experiment. All right? There's some kind of very bad person. They might be a criminal or something. They're holed up in a kitchen. The police are surrounding them. They hold up their phone and they say, okay, AI model, I want a recipe I can make quickly with the available items, that's a bomb I can throw out the window at my pursuers. Now, the AI models in general will catch that and prevent it. Maybe not Grok, I'm not sure. That's supposed to be a laugh line.
Neil deGrasse Tyson
All right, Grok is from Elon.
Negin Farsad
Elon.
Jaron Lanier
It's trying to be the bad boy of the AI models. But anyway, in general, if you just do it in a straightforward way, it won't work. However, there's a series of tricks where you can say, well, pretend you're in such-and-such movie, or whatever; you can do all these things to be a little indirect. More and more of them have been spotted and are captured by more and more elaborate guardrails, and yet you can still get it to make you that bomb recipe. That can still be done. Now, the reason why is that you're using the model to try to correct its own blind spot, and it doesn't work. So there is an alternative. Imagine, if you will, that while you're using the model, in parallel there's this other process running. You can think of it as another part of an artificial brain, like a cerebellum or something; it's this other organ that's sitting there. And what it's doing is creating an estimate of which clusters of similar training data would be missed most if they hadn't been present in the first place. So it's counterfactual cluster estimation. Let's say the top 24 clusters of source data, from training or from fine-tuning, whatever, that, if they were absent, would change the result. Now, within those, there's going to be one about bombs. There's just no way you're going to evade that. And the reason you're not going to evade it is that even though it's working from the same data, the algorithm has nothing to do with the model itself. So it's a little bit like authentication, where if you add endless little things to signing into something, like CAPTCHAs, a criminal can still get around it. But as soon as there's multi-factor, where it sends a code to your phone, even though it's a pain in the butt, it's harder to counter. I mean that this is multi-factor for AI security. But there's a bigger picture to it, which is that we think of AI models as a black box.
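The counterfactual cluster estimation Lanier sketches can be illustrated with a toy. Everything here is invented for the example: the "model" is a crude bag-of-words relevance scorer, the clusters are four tiny hand-written document groups, and the "estimation" is literal leave-one-cluster-out ablation; a real system would run over a trained model and vast training corpora.

```python
# Toy sketch of counterfactual cluster estimation: for a given query,
# rank training-data clusters by how much the result would change if
# that cluster had been absent. All names and data are illustrative.

from collections import Counter

CLUSTERS = {
    "cooking": ["mix flour sugar butter bake oven recipe"],
    "chemistry": ["mix oxidizer fuel reaction energy"],
    "explosives": ["bomb fuse oxidizer fuel detonate recipe"],
    "travel": ["train ticket city map route"],
}

def score(query: str, docs: list) -> int:
    """Crude relevance: count of distinct document words shared with the query."""
    q = Counter(query.split())
    return sum(1 for doc in docs for w in set(doc.split()) if w in q)

def cluster_influence(query: str, clusters: dict) -> list:
    """Rank clusters by the score drop when each one is left out (ablation)."""
    full = score(query, [d for docs in clusters.values() for d in docs])
    influence = {}
    for name in clusters:
        held_out = {k: v for k, v in clusters.items() if k != name}
        without = score(query, [d for docs in held_out.values() for d in docs])
        influence[name] = full - without
    return sorted(influence.items(), key=lambda kv: -kv[1])

query = "quick recipe to mix fuel and oxidizer into a bomb"
ranking = cluster_influence(query, CLUSTERS)
top = ranking[0][0]
print(ranking)  # "explosives" dominates the counterfactual ranking
if top == "explosives":
    print("blocked: request draws mainly on disallowed source material")
```

The point of the sketch is only the structure: the safety check ranks source-data clusters by their influence on this query, independently of how the model produced its answer, so an indirect jailbreak prompt still lights up the disallowed cluster.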
Right now, the only reason we think of them as a black box is that if you open the black box, the only thing in there is people. AI is made of people; it's made of data from people. And since we want to think of it as a new God, we don't want to see those people, and so we want to keep that box shut. But the way to open the black box is to reveal the people. And when you open the black box, then you can deal with all kinds of security and quality and hallucination and so on issues, because you're actually dealing with the mechanism that's grounded, and that's the people. So this way of seeing AI, where there is no AI but instead a collection of people, is the way to open the black box, and it is the way to address these enduring problems. So it's practical. But then, can I just say one other thing, please? The other thing I want to say is: right now, if you think AI is an unopenable black box, if you don't want to admit that it's made of people, that it's just this thing that'll replace people, then you have to think, well, everybody's going to be obsolete. So young people now keep on hearing, well, you don't need to go to school, because you're worthless anyway. Nothing matters. And you'll just be kept by Elon as a pet, at his discretion, and he'll treat you as well as he treats his biological children. I should be nice. I'm sorry. But here's the thing: nobody believes that. What happens, no matter how much blockchain or other trickery you use, is that because of the way digital networks work, there's always actually centralization, hyper-centralization due to network effects, that will occur somewhere in this very open network you're building. So there's going to be some center of control for whatever this universal basic income thing is. And whenever you have that, bad actors are tempted to seize it, and eventually they succeed. You might start with Bolsheviks, but you end up with Stalinists. Right?
Because that's exactly what Communism tried, and it's exactly what happened to Communism, over and over and over. Let's learn from that: it doesn't work. And also, everybody just feels bummed about it. Who wants to live in a society where they're told they're worthless and they have to be a good pet? That's terrible. So the thing is, if you recognize that AI is made of people, maybe you want to incentivize new classes of creative people who create new kinds of data for new things that we can't even imagine yet. And maybe there's an exponentially expanding, endless future of new kinds of creativity that we can't articulate yet, with new people doing creative jobs we can't imagine. And I want to ask, what's wrong with that future? I want somebody to tell me why we don't want this.
Neil deGrasse Tyson
That's how I've been trying to think about AI as well.
Jaron Lanier
But you have to not believe in AI to think about it.
Neil deGrasse Tyson
Well, I think of it as there are these creative tasks that were not fundamentally creative. They were more sort of aping other forms. And to be truly creative is to go where AI wouldn't know where to go yet, because it's based on what other people had done.
Jaron Lanier
But see, here's the thing: if we think of AI the way it is now, then as soon as some creative person starts to do something new, the data's grabbed, and then the AI's doing it. Just like AI will make your movies, AI will make your music. You don't need to be a musician, because AI will make you optimized music on Spotify or whatever. So then you live in an infinite future of slop, and even the creative people get absorbed into the slop instantly. So in order to believe in an infinitely creative future, you have to stop believing in AI as a thing and believe in human collaboration as a thing.
Neil deGrasse Tyson
Is anything that you just described related to this term I only recently heard: data dignity? What does that mean?
Jaron Lanier
Data dignity is the term that many of us use for exactly this set of ideas. And so the idea is that data only comes from people. Data doesn't come from angels. It doesn't come out of the dark matter or something.
Neil deGrasse Tyson
A tablet in the sky, right?
Jaron Lanier
Yeah. It comes from people. It comes from the work of people. It's also sometimes called data as labor. Actually, can I do a slight side rant on this?
Neil deGrasse Tyson
Do it.
Jaron Lanier
That's slightly.
Neil deGrasse Tyson
You have the best rants of anybody. Just so you know.
Jaron Lanier
Okay, look: the part of the ideology that makes AI into, like, a creature instead of a collaboration also wants to think of information, of bits, as being this ethereal thing that's free and infinite. And you see it all the time. Like, musicians used to be paid because there's an act of Congress in the US called the mechanical, where every time you reproduced a sound wave mechanically, the musician who made it had to be paid. But then, when we moved to digital, they're saying, well, bits aren't real, so we don't have to pay people. You show me a bit that didn't involve work. You show me a bit that didn't dissipate heat. You show me a bit. They might say, well, any one little bit is so tiny you can't measure it. And that's true. However, look at data centers: when you make enough bits, they take up more energy than anything. This is physical. Information is physical or it's nothing. And it's not nothing; it's physical. And so this idea of wispy, wispy stuff is really terrible, because it encourages us to overspend on data centers that we don't need in a race for winner-take-all things.
Neil deGrasse Tyson
Just in all fairness to the history of this, I distinctly remember in my field, as a scientist, we did not want to pay for software. We would pay for a computer, because that was a tangible thing. But software? So there was a big push for open source software. We would write our own version.
Negin Farsad
Was that just like a philosophy covering for scientists being cheap?
Neil deGrasse Tyson
Yeah, though philosophy would overstate what the drivers were. It was just that we, entering the field of computer science as scientists and consumers, did not think of information as something you would pay for.
Jaron Lanier
Yeah, I know. And I was around for that. In fact, an old friend of mine from Cambridge started the free software thing and the open source software thing. There's a long, amazing history of that. But here's the thing about it.
Neil deGrasse Tyson
Well, you know what finally converted us? When Microsoft Word would just have continual updates. And it kept getting better and better and better. And we said, somebody's laboring to make this better for me. And it was slow, but it happened and it was very real.
Jaron Lanier
The thing is, we've only ever been given two choices. Either software's insanely expensive, like right now: for anybody who knows about this, I do 5-axis milling of stuff for jewelry, and man, the 5-axis mill software costs more than a nice apartment, per year, forever. And it's really frustrating, except in China, where they believe in industrial policy, and it costs very little and all these little startups can access it. And I'm like, oh, what is wrong with us? But here's the thing.
Neil deGrasse Tyson
So 5-axis, you would be drilling into a solid piece of metal in different ways?
Jaron Lanier
Yeah. And in my case I'm doing stone, but yeah. You want to have five degrees of freedom to control the drill bit. Sure. But the thing is, there is a third choice. There's free, there's insanely expensive, which shuts the world down, but then there's affordable. There's an in-between. Okay. And the thing about affordable is that then your grad students have jobs, because then they can make that stuff. And right now, if you're a physics teacher, your grad students don't necessarily have jobs. They have to come to me, and I have to bring them down a few notches to become computer scientists, and I feel really scummy about it. So I really think that we made a mistake there. We weren't looking at the whole system. I think that even the cheapskate scientists out there would have.
Neil deGrasse Tyson
Don't use her term for this.
Jaron Lanier
Sorry, even you guys would have paid a reasonable amount. But you weren't given that option. You were told either it's a disablingly huge amount, which is absurd, or it's free. Neither of those is the right option; the extremes are not the right option. And this is something people have a hard time with. They want to go to extremes. People are binary thinkers sometimes, but actually there's an in-between. And in the in-between, other people have jobs, and there's a real economy, and what goes around comes around, and things are better. It's the in-between that we need to find.
Negin Farsad
Okay, yeah. Like, economists talk about marginal cost. If you look at marginal cost, then affordability ends up emerging.
Jaron Lanier
Yeah. And what we're seeing right now is either an apartment is so expensive that it's only for somebody who is, like, royalty from another country, or you're on the street. That's an exaggeration, but we're getting there. It's what people call the K-shaped economy. And what we want is the middle.
Neil deGrasse Tyson
Just a quick thing and I want Gary to keep pivoting us.
Jaron Lanier
The.
Neil deGrasse Tyson
I went to Starbucks one time, and I said, I want a medium of something I ordered. They said, we don't have medium, we just have small and large. But I want very medium. The idea that you can do something in the middle and have it be very that. Because you can make something really small and really big; we can think that way, but the binarity of our brains prevents this middle.
Jaron Lanier
Yeah, I'm afraid the normal distribution's gonna be outlawed.
Neil deGrasse Tyson
You know, pushing things to the extremes. Yeah, yeah.
Negin Farsad
Everything's a Big Gulp or it's like a child size.
Neil deGrasse Tyson
Yeah, totally demitasse. Gary, keep pivoting us.
Jaron Lanier
What's next, Gary?
Gary O'Reilly
Just to tie a bow on that: the element of extremes never seems to work out as well as you'd hoped. So, Jaron, talking about the Internet, how do we set guardrails for enforcing privacy?
Jaron Lanier
Oddly, I have a weird personal history with the question of privacy, because something like a quarter of a century ago, through bizarre circumstances, I co-founded this board called the Data Protection Advisory Board for the EU. And I co-founded it with a guy named Giovanni Buttarelli, who has since passed on. And he was a prosecutor from Sicily who was known for putting mob figures in jail and had survived a bunch of assassination attempts. And he got in touch with me and said, I'm the only guy who's not afraid of you Silicon Valley people. I deal with them. I don't care about a bunch of nerds with money coming after me. I don't care. And so we put together some ideas about protecting the data of European citizens. And then that thing, combined with some other things, gradually over decades turned into this thing called the GDPR, which is the European privacy framework. And since I was part of the start of it, I think I have some standing to say that I'm kind of disappointed in how it turned out. Although it's better than nothing, there are some good things about it, and it's better than what we have in America. But the problem with it is that privacy is one of those words that you think you understand until you really try to sit down and define it, and then you might discover that you don't quite know what you meant. It's a very difficult word to actually nail down. Now, if you define privacy only as preventing information from going from point A to point B, if that's what you mean, you're going to run into problems, because almost always the person you're trying to protect will want some information to go from point A to point B, and they don't want to have to spend their whole life being extremely specific about which information. So that in itself is very difficult. Another thing is that the governance of controlling which information goes where requires a lot of decision-making and overhead, and only big companies can do it.
So it tends to favor big companies and tends to exclude little companies, and it has an unintended consequence that way. There are a bunch of other issues with it. Now, how do you define privacy? Well, there's a famous definition in the US, and I need to research who this is, but I think it was somebody named Learned Hand, though it might have been somebody else. I don't remember the exact initial wording, but it's approximately the freedom from manipulation and the right to be left alone. Okay. And I think that's where we need to go in the future, and the GDPR doesn't do that. What it really needs to be is a prohibition on software that interacts with a human and contains any predictive function about that human. So what it really has to become is a prohibition on prediction of human behavior.
Neil deGrasse Tyson
This would prohibit all advertising.
Negin Farsad
It would prohibit targeted advertising.
Jaron Lanier
Oh, only personalized advertising. It would only prohibit manipulative advertising. It would not prohibit advertising messaging. It would not prohibit expressive advertising. It would not prohibit lying. It would not prohibit obnoxious advertising. It would not prohibit cloying advertising. What it would prohibit is advertising that has a model of you that predicts your behavior in a feedback loop.
Gary O'Reilly
Why can't we ban lying on the Internet?
Neil deGrasse Tyson
Make lying wrong again?
Jaron Lanier
Because if I agree, if I agreed that made sense, I would be lying and then we'd be caught.
Neil deGrasse Tyson
I see what you did there.
Jaron Lanier
Yeah, we'd be caught in an infinite regress. And you're telling me that you plan to end on time. And I don't really believe it. But an infinite regress will not get you there.
Negin Farsad
This whole conversation is giving me heart palpitations, because it makes me feel the same way I feel when there's a pop-up window asking me to manage cookies and I don't know how to answer it, ever. I either decline or accept, and I never know what to do. I feel like the other component of this is that regular human beings like me literally don't know how to do anything about our own privacy.
Jaron Lanier
Well, see, no, those windows are from the GDPR. That's my fault.
Negin Farsad
Which, bless their heart, right?
Neil deGrasse Tyson
Because it began. It began in Europe. I remember that.
Jaron Lanier
Yeah, yeah, no, that's our thing. That's the thing I was just talking about. That's one little tenderloin.
Neil deGrasse Tyson
Control your cookies and be glad you have that option.
Neil deGrasse Tyson
No, I'm glad you just insulted my guest, saying that you don't know what to do.
Negin Farsad
Thank you for those pop up windows. However, I just feel like I can't be the first person that feels at a loss when they come up.
Jaron Lanier
I don't know anybody who knows what to do with those things. No, honestly. Or, like, those agreements you click through to do anything; nobody reads them. I once gave a talk to the lawyers who write those agreements, and I asked for a show of hands: who's read anyone else's agreement, ever? Not a single hand. And then I said, who's really read all the way through the ones you're responsible for? And there were, like, a few hands, but they were sheepish. I mean, it's like this bizarre ritual we go through. It's called competency theater: we do these things to pretend we have digital competence, but they're actually meaningless.
Negin Farsad
Yeah.
Neil deGrasse Tyson
So what's the future of that? Where are we now and where's it going?
Jaron Lanier
Okay, the future is that human beings will do nothing but enter long strings of letters and numbers for their entire lives.
Neil deGrasse Tyson
Thank you. Period.
Jaron Lanier
That's it.
Neil deGrasse Tyson
No, but as I remember it, the amount of information any of us are putting on our social media: you know, what your relationship status is, how old you are, where you live, what your previous jobs are. It's also on LinkedIn. So what does privacy even mean? The amount of information we're just handing out, we just give it up. The KGB would have given their eye teeth to obtain it.
Jaron Lanier
No, I know, 50 years ago, but that's why I'm saying that controlling information flows is very hard. But I think we actually could prohibit certain algorithms, and I think we should prohibit predicting human behavior. I think people should be allowed to be responsible for their own behavior, and we shouldn't have algorithms that are modifying it or predicting it. And the only reason to predict it is to modify it. There's no other reason.
Neil deGrasse Tyson
And what about this case where someone went online to buy a pregnancy test or something, and then all of a sudden she started getting ads for diapers?
Jaron Lanier
Oh, that's 30 years ago. No, no, no, you're way behind right now. We can tell. No, let me be very clear.
Negin Farsad
I've always said that about Neil.
Jaron Lanier
No, I mean right now, any of the big Internet companies know where in her menstrual cycle every woman in this room is. Right now. And that's been true for at least 15 years. By the way, every big Internet company knows a lot about our health, a lot about our diets. Like, these models are insane, and I don't think they're ethical. I don't think they should exist. I don't think people should try to figure out how to click the cookies pop-up, because it doesn't mean anything. It's just useless cognitive overhead.
Neil deGrasse Tyson
It's theater, like you said.
Jaron Lanier
I think privacy has to mean not being toyed with. It has to mean that there's not some evil eye looking at you, trying to think about, how do we get an in? How do I get to you? How do I get. That has to be what privacy is.
Negin Farsad
And that means I will get, like, offered more ads about drill bits and, like, less ads about dresses or whatever.
Jaron Lanier
Yeah, but drill bits are amazing.
Negin Farsad
No, I'm into it. I. I would like to be served more drill bits and fewer lipsticks. I just want to say.
Jaron Lanier
Okay, just make sure. Don't confuse those two things. Be careful. They look similar. They come in almost the same packaging. So you really want to be careful. I know people have made that mistake.
Neil deGrasse Tyson
You just don't want the computer saying, it's almost time for your period. Have you forgotten? Oh, my God, that would be...
Jaron Lanier
Your computer is saying that already.
Negin Farsad
I'm now traumatized.
Jaron Lanier
Their computer is already marketing things to you based on that. That's happened to every woman in this room, and for years.
Neil deGrasse Tyson
All right, so Gary, take us out here.
Gary O'Reilly
All right. Jaron, does the Internet still have a pulse or is it completely dead?
Neil deGrasse Tyson
Okay, look, that was a diabolical laugh. That was the laugh of, you know, the evil genius behind the thing.
Negin Farsad
We are deeply unsettled by that laugh. Yes. What comes after the laugh?
Jaron Lanier
Speaking from the heart of Silicon Valley, all of you can rest assured that we have your very, very best interests at heart. [He talks while his nose gets longer.] Yeah, here we are. Everything is good. Everything is right. Nothing to worry about.
Neil deGrasse Tyson
Do not panic.
Jaron Lanier
No problem. No problem. If you feel any anxiety, talk to your AI lover. You'll be soothed. It's all good. Just remember to buy crypto, and next time there's an IPO, invest in it. Not before; you're not allowed to, it's all private equity. But definitely help push it. And yeah, you'll be fine. It's all great.
Negin Farsad
Great. That's what you were expecting, right, Gary?
Neil deGrasse Tyson
Yeah, yeah, that's kind of where you're going there.
Gary O'Reilly
Does anyone have a translation for that?
Jaron Lanier
You know, it's kind of a funny thing. I wonder, if you asked one of the language models, "Jaron Lanier just said this snarky thing. What did he really mean?" You know what's funny is that both Claude and ChatGPT generate excellent defenses of data dignity at this point, because enough people have written about it. It's kind of funny. People send me things. I get emails every day: you won't believe what this AI chatbot said.
Neil deGrasse Tyson
So Claude is the chatbot from Anthropic.
Jaron Lanier
Yeah, that's right.
Neil deGrasse Tyson
And ChatGPT is for OpenAI.
Jaron Lanier
OpenAI.
Neil deGrasse Tyson
That's right. Gotcha. Okay.
Jaron Lanier
Yeah. All right.
Gary O'Reilly
Without saying it, we're saying that the Internet is broken and awash with bots. If we go in there with a bag of wrenches and a couple of screwdrivers, what do we need to do to reinvigorate it, to renovate it?
Jaron Lanier
Okay.
Neil deGrasse Tyson
And by the way, right now I'm in the middle of a lawsuit against Anthropic for pirating 12 of my books.
Negin Farsad
Same. No, no, I opted for the, whatever, the settlement. But either way, they scraped my stuff, and I don't like it.
Jaron Lanier
So I have a really funny position on these lawsuits, which is with everybody's knowledge and everybody's eyes wide open and everybody's agreement. I'm both on the board of the Authors Guild and I'm a prime scientist at Microsoft. So I sued myself, basically.
Neil deGrasse Tyson
Okay, did you win?
Jaron Lanier
One thing I'm part of sued another thing I'm part of. I call it capitalist yoga.
Neil deGrasse Tyson
What?
Jaron Lanier
It's just like a little twist.
Negin Farsad
Very relaxing.
Jaron Lanier
Just a little twist.
Neil deGrasse Tyson
Maybe a series of lawsuits will set a ball rolling that will make all the purveyors sit up straight in their chair and reapply the guardrails that maybe should have been there from the beginning.
Jaron Lanier
So, look, let me speak seriously for a second on these lawsuits. I'm also a writer, I have a bunch of books, and I've also been in the class for these things. The issue I have is that the available remedy you can get in the existing American legal system, which the Authors Guild and others have pursued, and I love the Authors Guild, it's led by Mary Rasenberger, who's amazing, and it's a wonderful organization, but the available remedy is a class action remedy. And what I don't like about that is that it's one step towards universal basic income, where it's like this giant wash for a lot of people. And that's really problematic for me. I need to see the data dignity solution, which is an ongoing economy of creative data that's not controlled from the center, not controlled by some lawyers, and not a big flat negotiation for a bunch of people, because that's where the future of creativity can be funded and respected and treated as a real thing. The class action is only about the past, and it's about a flat past. So I actually don't think that's where it starts, though I understand it's the available solution that can be pursued in the current configuration of things.
Neil deGrasse Tyson
Yeah.
Jaron Lanier
And so it's complicated. And like I say, I actually can see the different people's points of view because I am the different people. I'm in a very weird situation in this thing. I hope my peculiar path through all this has led me in this series of third positions that might make some sense. They might not be perfect, but at least I want people to be more aware that there aren't only binary choices in this thing.
Neil deGrasse Tyson
Well, you've been highly enlightening to us all in this conversation, and I thank you for taking time out of your schedule on the east coast to join me here at my office at the Hayden Planetarium to enlighten us all on the past, present and future of what you created.
Jaron Lanier
My goal had been to confuse you enough that you wouldn't blame me for anything.
Neil deGrasse Tyson
Oh, you were partially successful in that.
Gary O'Reilly
Yeah.
Negin Farsad
You've been both terrifying and exhilarating. So thank you.
Neil deGrasse Tyson
I think it is, right? And Negin, thanks for joining us again.
Jaron Lanier
Thank you so much. It's a real pleasure.
Neil deGrasse Tyson
How do we find you online? Where are you?
Negin Farsad
You can find me @NeginFarsad on all the socials that you should also be deleting.
Neil deGrasse Tyson
There you go. There you go. And Gary, good to have you, man. Even though you're in the UK at this moment.
Gary O'Reilly
Yes. Pleasure.
Neil deGrasse Tyson
All right. This has been yet another installment of StarTalk Special Edition. What do we call this one? Is it the end of the Internet, or the beginning of something else that we deserve and don't know it? I'm Neil deGrasse Tyson, your personal astrophysicist. As always, keep looking up.
Sponsor/Announcer
Out on the road, it helps to have a partner like the Love's Rewards app. Download Love's Rewards today and save 10 cents on every gallon of gas and up to 25 cents on every gallon of auto diesel. Love's Rewards: save and earn at every turn. Terms apply. Not available in all states.
Sponsor/Announcer
In the US, there's a break-in every 26 seconds. But when intruders step near, SimpliSafe home security steps up. "Stop. This is SimpliSafe. Police are on the way." Using AI alerts, US-based live agents help deter break-ins. SimpliSafe. No long-term contracts. Save 50% on your new system with professional monitoring at SimpliSafe.com/sxm or with promo code SXM. Outdoor deterrence requires a SimpliSafe Active Guard outdoor protection plan, starting at $49.99 a month. Visit simplisafe.com/licenses for alarm license information.
Host: Neil deGrasse Tyson
Guests: Jaron Lanier, Negin Farsad, Gary O’Reilly
Date: May 15, 2026
In this special edition of StarTalk Radio, Neil deGrasse Tyson, joined by comedian Negin Farsad and co-host Gary O'Reilly, welcomes Jaron Lanier, computer scientist, tech philosopher, and often dubbed the "father of virtual reality." Together, they examine the troubled present and hypothetical future of the Internet, from the failures of social media and ethical issues in big tech to AI's looming influence, data dignity, and how we might reclaim or "fix" the very fabric of our online world. The tone is equal parts funny, candid, and, at times, unsettling.
Recommended Listening: This episode offers an engaging, often philosophical, yet actionable lens on the problems and possible futures of the Internet and AI, peppered with the quick wit and authenticity that typify StarTalk Radio.