Transcript
Jordy (0:02)
I wrote about Netflix. There was a funny, very brief interaction between Ben Thompson of Stratechery and Netflix co-CEO Greg Peters on last Thursday's Stratechery interview. And they go back. The only mention of AI in the whole hour-long interview is two little exchanges. Ben Thompson asks, is AI slop going to save you if it overwhelms the UGC platforms? Basically, you become the refuge, the place where things are actual, real. And Greg Peters just says, I think it's credible. I don't know if that's the reality, so I can't say with certainty that's where we're going to land. But it's a credible possibility. So he's saying, maybe that's a bull case. It is interesting. Netflix has been trading down over the last couple of months, but in general it's up; I think it's up 4x since the launch of ChatGPT. But every CEO needs to contend with the AI question: how will AI change their platform? And AI has already been changing Hollywood. I was reflecting on the Avengers, the whole CGI process for Thanos; I just remember seeing it, maybe even in one of the first ones. He has this very distinct, large chin. Josh Brolin is the actor who plays Thanos.
Tyler (1:15)
Is he a mogger?
Jordy (1:16)
He is a mogger. He has this huge chin.
Tyler (1:18)
He's actually kind of like the OG.
Jordy (1:20)
I don't know. It looks like chin implants. It's kind of crazy. But it has these cracks in it; Thanos has this very distinct look. Normally the way the VFX pipeline works is you put these black dots all over your face, and then you wear a helmet that has a camera pointing at your face. It tracks all the points, so when the actor driving the performance capture smiles, it sees that, and that facial movement is transferred. They're recording the lines, acting it out, giving their facial performance, and all the little subtleties of how their eyebrows move get transferred to the CGI character. It can look a little flat, though. So what they did here is they still have all the points on the face, but then they interpolate from the sparse points on the face into a higher-res model. Yes, don't read that, don't read that, don't read that. But it is a good point. All of those are tracking markers. And then the question is, you have a much higher-resolution CGI model; if you just transfer 50 points or 20 points, you're not getting all the detail of what a human face actually looks like and the way it moves. So Digital Domain, which was one of the many VFX studios that worked on the Marvel series, built a straight-up machine learning pipeline. They used AI. It wasn't a diffusion model, it wasn't an LLM, but they used a machine learning model to translate from the low resolution, just a few dots, into a much higher-resolution mesh. That then became the performance of Thanos on screen. And I don't know if you remember 2018, the movies. Obviously you didn't see any of these movies, but I don't remember AI back then.
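[Editor's note: to make the sparse-to-dense step described above concrete, here is a minimal sketch of that kind of model: a regression from a handful of tracked marker positions to the coefficients of a dense face-mesh basis. Everything here is a stand-in; the dimensions, the synthetic training data, and the choice of PCA plus ridge regression are illustrative assumptions, not Digital Domain's actual pipeline.]

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge

# Hypothetical dimensions: ~150 helmet-cam markers driving a 5,000-vertex face mesh.
N_MARKERS, N_VERTICES, N_FRAMES = 150, 5_000, 500

rng = np.random.default_rng(0)

# Stand-in training data: frames where both the sparse markers and a dense
# scan of the same expression are known (in reality, paired capture sessions).
markers = rng.normal(size=(N_FRAMES, N_MARKERS * 3))  # flattened (x, y, z) per marker
dense = rng.normal(size=(N_FRAMES, N_VERTICES * 3))   # flattened (x, y, z) per vertex

# Compress the dense meshes into a low-dimensional expression basis.
pca = PCA(n_components=100)
coeffs = pca.fit_transform(dense)

# Learn a mapping from marker positions to expression coefficients.
model = Ridge(alpha=1.0).fit(markers, coeffs)

# Runtime: one new frame of sparse markers drives the full high-res mesh.
new_frame = rng.normal(size=(1, N_MARKERS * 3))
mesh = pca.inverse_transform(model.predict(new_frame)).reshape(N_VERTICES, 3)
print(mesh.shape)  # (5000, 3)
```

[Production systems presumably use learned non-linear models trained on high-resolution scans of the actor, but the shape of the problem is the same: a few dozen markers in, tens of thousands of vertices out.]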
