A (66:05)
Professorial. All right, so there used to be this way that we would talk about technology's impact, where we would have some notion of a machine. Some sort of abstraction, some sort of techno-cultural-economic abstraction that we didn't create but that sort of exists, and it's dictating the way our lives unfold. So there are these particular paths I'm drawing here that come out of the machine, and we don't realize it, but our lives are kind of being dictated by this amorphous abstraction. This is the way commentators used to think about technology. Here we would have people like Jacques Ellul or Lewis Mumford. They would talk about these big abstractions that we didn't realize existed but that controlled the way our lives were actually unfolding, the way our societies were functioning. Then this transitioned at some point, right? And this was one of the ideas from this conversation that really caught my attention: at some point in the 2000s, so maybe I'll just label this "the 2000s" over here, we went away from this way of thinking about technology as part of these big hidden systems that dictated our lives, and we fragmented these impacts. This is what we talked about in the conversation. We fragmented these impacts into a lot of little things. So maybe we have mental health as a separate thing: oh, we're going to measure the mental health impact of a smartphone on a particular group and say, oh, there's a negative impact there. Or we're going to look at the environmental impact of an AI data center: oh, it uses a lot of power, a lot of water, and maybe that's going to be a problem. Or maybe there's a civics issue, misinformation on social media that's going to create problems with cohesion around certain topics, like public health or something like that.
So in the 2000s, we fragmented our understanding of technology's impact into all these separate little impacts. And this lacked a big kind of oomph. Now we were saying, oh, there's this and that and this, and this one affects me but that one doesn't. It's an environment where no one's super happy with technology, but it's also not an environment that's necessarily going to inspire a lot of major changes, because there are too many little things, some of which affect you and some of which don't. That was the state of affairs. I'd put the turning point toward this more fragmented, very narrow understanding of techno issues right around Nicholas Carr in the early 2000s, when he was writing The Shallows, which took Internet technology and said, let's look at its impact on the brain: oh, there's a specific neurological or cognitive issue here; using this technology, studies show, makes it harder to do this type of thinking. And now we were off to the races, seeing technology through these relatively narrow frames. Then Kingsnorth comes along. And I think the way to imagine Kingsnorth is that he's going back. He's saying, no, no, no, I want to understand technology again as part of one of these big systems, which he calls the Machine. One of these big systems like we used to talk about with the big tech polemicists from the early 20th century, one that really affects how our lives unfold. So we're back to that view again. And I think this is partially why he's beginning to catch our attention, because this is a bigger vision, right? There's this big system that's controlling so much about our lives. But here is where he differs from the treatment we saw earlier in the 20th century.
His complaints about the Machine are much more about its impact on us as humans, the humanistic impact. If you read, say, Lewis Mumford talking about the megamachine, it's about how we implicitly organized labor and machines so that the pharaohs could build the pyramids; those are often arguments about power and control. Ellul is more about the colonization of an artistic approach to the world by a scientific worldview, sucking the inspiration out of the way we approach things and letting economic forces have much more power. And these are really influenced by Marxian thinking. But Kingsnorth is saying, no, no, no, here's the thing about the Machine he's talking about: it makes your life impoverished. It makes your life as a human worse. It's a fiction about what we should be doing as humans that leads you to have a worse life. And because of that, he says, I have a response. If the Machine is removing your humanity, you can fight back against the Machine to make your individual life better. And you do this, as Harper talked about, by setting limits. He of course did this to an extreme, right? He has a cabin, a little farmhouse in County Galway in Ireland somewhere. And he plants crops. And he's an Orthodox Christian, and he used to be a Wiccan. And there are trees, and, you know, this is expertly drawn, as everyone can see. And he's really happy. He doesn't use smartphones. He homeschools his kids. Right? He's built his own life out of this by setting a lot of limits. He said, by setting limits, I'm able to create a more human life. This is why I think this caught on. It was this twofold thing.
We used to have, you know, these big-think guys, and they were all guys at the time, writing these big-think books that were really cool theory, but in the end it was like, capitalism is bad, and there wasn't much you could do about it. Then we fragmented into this world of thinking about anti-tech as just all these little narrow things, and you would care about some and not others. It's not something that motivates a lot of action, right? You're like, oh, maybe I should use my phone less or whatever. Then we get to Kingsnorth, and he gives a different approach here. No, no, let's go back to the idea of these big abstract techno-cultural-economic systems that are controlling our lives. We never signed up for them, we never even really chose them, but they're all focused on growth and the breaking down of barriers and the denial of our corporality and humanity. And he says, once you recognize that, and that it's making our lives worse, you can fight back by setting limits. Limits actually allow your humanity to be expressed. What technology do I want to use? How do I want to live? Maybe I don't want to use AI at all. Why do I have a smartphone? It gets rid of this sort of fatalistic attitude that everyone has to use every tool, what else are you going to do? And he says, no, no, no, human flourishing is about setting these limits. Acknowledging, as Harper summarized Kingsnorth's view, that we humans are biologically situated beings who grow up, seek wisdom, appreciate creativity, and eventually die. That is the reality of the human experience. And so we don't want to deny that and feel like we can live forever or get rid of biology. We can actually embrace constraints and try to flourish, and be in nature and touch grass, and not be about growth at all. Limits.
Anyway, there are a lot of points captured here, but I think this gets to the core of why this particular book was hitting. It gave us a much more sweeping, ambitious view of technology, like we used to have, but then connected it to what individuals could do: this idea that human flourishing could be returned if you're willing to set limits, at least to take a stand, to fight back against the Machine. And in that way, I think it fixed flaws of both of the prior models. It's also just different from what we've been seeing. So that's why it caught attention. Not all the coverage is positive. We talked about it on the show; there are all sorts of issues, this or that, hypocrisies or politics that people don't like. But man, it got people thinking, because that's a big swing, this picture right here, right? You have this really big philosophical, theoretical idea, and you have these really exciting solutions based on limits. I think this caught people's attention, and it should catch yours as well. As you remember, Harper ended our conversation by saying, yeah, read the book. It's going to shake things up. He's not giving you a prescription, but he's giving you a vision, one where you take control of your life and, through limits, allow your humanity to be expressed in a world where no-limit growth technology is trying to take that all away from you. Take it or leave it. At least it's an exciting book, so I can see why it is so popular. Interesting point, Jesse. I don't know if we talked about this. I guess I talked about it briefly in the interview, but Paul Kingsnorth left his book tour. He was like, enough of this. I don't want to travel all over the place. This is really exhausting, and I want to go back to my farm. And he just stopped his book tour and went back home.