
Isabelle Brown
There's a new AI tool on the block as of last week: OpenAI's Sora, which people are using to create the most disturbingly realistic videos I have ever seen, contributing to a larger conversation about AI actresses replacing real people, deepfake pornography and how that impacts young people in particular, and how the heck we are supposed to regulate the rewriting of history that is obviously happening right before our eyes. I cannot be the only one completely freaked out by this latest development. So come weigh in today on the Isabel Brown Show. Today's episode is brought to you by Boll & Branch. Feel the difference an extraordinary night's sleep can make by transforming your bedroom and visiting bollandbranch.com/Isabel to get 20% off your first set of sheets for a limited time. That's B-O-L-L-and-branch.com/Isabel.
Isabelle Brown
As artificial intelligence technology has gotten so much more advanced over the past few months, let alone years, it is totally terrifying to me how little you can distinguish between something that is 100% real and something that is 100% AI generated, where none of it actually happened, especially when it comes to photos and videos. What used to be obviously AI, like someone with a sixth finger or a face that wasn't proportioned quite right, now looks like it was clearly filmed with a high-definition camera and obviously happened. And one of the main reasons for that is the company that also owns ChatGPT, which has rolled out a new branch of their organization. Sora specifically has a brand-new rendition that rolled out last week, called Sora 2, which allows people to type in very, very complex and instructive prompts for the platform to create new video generations made entirely from AI. And it is genuinely crazy to me to see what our technology is capable of doing now and what people have been able to generate using literally just a few words at a time. I haven't even seen all of these videos. My husband, who is staying home from work for a couple of days (if you know, you know), was bored over the past few days and decided to put together a whole compilation of different Sora 2-generated videos for us to react to together. He claims that our minds are going to be blown, and that apparently this technology is so far beyond anything our brains could have imagined in terms of advanced AI. So I say, today on the Isabel Brown Show, let's check it out. Here is a compilation of videos generated entirely from AI on OpenAI's new Sora 2.
Mike
So I downloaded this app Grindr. Yeah, I thought it was a sandwich place. Okay? I was hungry. I'm like, ooh, Italian footlong. Let's go. Next thing I know, my phone's like, there's a bear 200 feet away. I said, in Los Angeles? I don't hike, man.
Isabelle Brown
I just want.
Co-host or Guest
Have you ever had a dream that you, you had your. You, you could, you'll do, you, you want. You, you could do so. You, you, you'll do. You could. You, you, you want them? I was in the foot. Excuse me. In the foothills of the Himalayas with President Xi. I don't know. Ah, whatever. Anyways, they got that cage fighting AI UFC folks act like it's the roughest thing ever seen. That thing's soft compared to what we did. Ain't no 12 ounce gloves in my. Is this. Oh, we're cooked. Have you heard friends talking about having good rizz?
Isabelle Brown
Rizz is a word young people use.
Co-host or Guest
For charm or smooth confidence.
Isabelle Brown
Mr. Rogers has the rizz. Clearly the best rizz. But we're cooked. We're cooked. That's probably the best explanation I possibly could have provided for how I feel about every one of those videos. Those were all AI: Reagan quoting Biden, Muhammad Ali saying that the UFC was soft, MLK Jr. messing up the "I Have a Dream" speech. And this is literally just from a few days of this technology existing. Yeah, we're cooked. People are using Sora 2 to make video games come to life in the three-dimensional world. Like Minecraft, apparently.
Co-host or Guest
Hands behind your back. Left wrist, right wrist. Good. Cuffs are going on. They're not too tight. We'll double lock them in a second. All right, step back with me. Watch your head getting in. Door shut.
Isabelle Brown
That looks so real. It actually looks like someone wearing, like, cardboard boxes. That's insane. And our favorite cartoon series are apparently brought to life from AI on Sora 2 as well, with SpongeBob characters, for example.
Co-host or Guest
Keep moving. Claws behind your back right now. Not until I see my lawyer, copper. We tried to rob a bank with a bag that literally has a dollar sign on it. It was a bad day, all right? Those rates are criminal. Watch your step.
Isabelle Brown
That actually sounds just like Mr. Krabs. How do they get the voice exactly right? To me, that is so insane. And Mr. Krabs isn't the only one robbing things or getting arrested by the police. SpongeBob is too. Oh, well.
Co-host or Guest
Okay, I'm telling you, I didn't do anything. Stay right there. I'm innocent, officer. I was just walking home from work, that's all. That's the witness description. Exactly. I swear on my spatula, it wasn't me. I don't even own a boat.
Isabelle Brown
My question is, how long does it take to actually edit this stuff? Because those seem to be pretty easy videos to prompt it with: a cop car, some people putting handcuffs on a cartoon character. Obviously, the actual animation is incredibly advanced. That looks like something a multi-billion-dollar television and animation studio would have spent the last several years working on as a hit feature film. But does this take one sentence? Does this take hours? I think we need to play with Sora 2 to understand exactly how advanced this technology is. It seems to me that it wouldn't take that much. AI is wildly complex and especially skilled at creating completely lifelike animation at this point. It's putting cartoon characters, I've been told, into real-life, three-dimensional, live-action movies, and it looks completely seamless. Like Saving Private Pikachu. He's so cute. I can't think about Pikachu in world wars. No, no. That would be so sad. Maybe even scarier, though, beyond putting cute, cuddly Pikachu into Saving Private Ryan, is Sora 2's capacity to put actual people of history into situations that obviously never happened, rewriting history by prompting this platform and this AI software to change what was said by some of the most famous and infamous people throughout history. We never could have envisioned a situation where Michael Jackson was working at a McDonald's. But thanks to AI software, it has now come to life.
Mike
Hey, everybody, it's Mike. I'm at work. New job: McDonald's. Let me show you around. These are the fryers. Listen to them singing. Beautiful, huh? Golden magic over here. Soda fountain: 32 flavors of happiness.
Isabelle Brown
That's insane. That's genuinely terrifying. Apparently, he's also a Jedi now. It seems that maybe in his next life he got put into the Star Wars universe instead of our own.
Mike
What is this thing?
Co-host or Guest
Your lightsaber. It chose you, Michael.
Mike
I feel the energy like it's dancing with me.
Co-host or Guest
You're a Jedi now. Let it flow.
Isabelle Brown
Again, I'm dying to know whether this takes actual hours to generate, or just a few minutes. If you've had a chance to play with Sora 2, please let us know, because this is so beyond fascinating to me. Where I get nervous is where we're actually rewriting history. And obviously these are funny. Right now we're using this software as an opportunity to poke fun at some of the most famous moments throughout history. But what if we could change, for example, what Martin Luther King Jr. said in the "I Have a Dream" speech? Or what he must have been thinking when he was preparing for it?
Co-host or Guest
Evening, y'all. It's your boy Martin checking in. Tomorrow's the big speech on the Mall. Been writing on it all week. Heart full, head spinning. But look, before I step up and tell the world about this dream, I got to be real with you.
Isabelle Brown
It's fun, it's cute, it's interesting to imagine historical characters giving us daily check-in videos ("What's up, guys?"), Instagram Stories or TikToks. But it's freaky at the same time, because clearly this is going to impact how we think about these people and the contributions they made to society, even if it's a subconscious level of bias that we'll now experience, associating the memory of watching this video with what their actual contribution to our society was. What if the "I Have a Dream" speech wasn't about a future America where our children, regardless of their skin color, could be united as Americans in one nation under God, but instead was about GTA 6?
Co-host or Guest
I have a dream that Xbox Game Pass prices never went up, and also, make GTA 6 free at last.
Isabelle Brown
That's insane. That's insane. But what's really eerie to me is how accurate that technology has become. The one way you could differentiate between an AI-generated video and something that was clearly real in the last several months, let alone years, is that something was a little bit off about the human beings involved in the video. They had one too many fingers. Their skin would, like, move strangely when they were talking. In big crowds of people, it was very obvious they were not actual people; their faces were in the wrong spot, or some level of the animation was really, really off. Looking at that video and that crowd of people, it could have been a weird filter on an actual video of people. Is nobody else slightly freaked out about all of this? You guys know, as a mom, creating the perfect cozy bedroom sanctuary has become so, so important to me. It's like my little escape at the end of a very, very crazy workday and personal day. That's why I absolutely love Boll & Branch's signature sheets. They are incredibly buttery soft right out of the package, and honestly, they keep getting softer with every single wash, which is amazing since, let's be real, sheets get washed a whole lot around here with a five-month-old baby. What I really appreciate is how breathable the fabric is. I can pile all of my favorite pillows and throw blankets on my bed without feeling like I'm going to overheat in the middle of the night. Plus, they are made from 100% organic cotton. And can we talk about these adorable new fall colors? They are absolutely gorgeous and make your whole bedroom feel so warm and inviting. Plus, with their 30-night guarantee, there is no risk in trying them whatsoever, though honestly, I can't imagine you wanting to return them once you feel how truly amazing they are.
I am personally campaigning, especially for the holidays this year, to get Boll & Branch sheets for all of my family members so that they can see exactly what I'm talking about, because we love them in our house around here. You can start building your sanctuary of comfort this fall with Boll & Branch. For a limited time, get 20% off your first set of sheets plus free shipping at bollandbranch.com/Isabel. That's Boll & Branch, B-O-L-L-and-branch.com/Isabel, to save 20% and unlock free shipping. Some exclusions do apply. And Sora 2 isn't just impacting people of history; it's using technology and AI deepfake videos to change how we perceive real, living people today as well. Apparently someone used this technology (again, I haven't seen this whole video yet, so I'm excited to watch it with you) to make a video of Jake Paul doing a get-ready-with-me makeup tutorial. Let's see how accurate this really is.
Co-host or Guest
All right, I'm going to show you how to do the perfect everyday glam. First step: blush. Just grab a ton of it, swirl it around. Boom. Right on there. If your face doesn't look like you just ran a marathon in the desert, you're not doing it right.
Isabelle Brown
How is that real? I'm not even fully convinced that wasn't real, to be honest with you. How long before we replace real people with just AI-generated versions of themselves, over and over and over again? How do we know, in the future, as this technology gets better? Again, this was literally just created; it was released to the public a few days ago. And as AI learns from itself and becomes better the more we prompt it, will we even have a future, in America or beyond, where we can reasonably differentiate between an AI-generated video and something that actually happened? It is terrifying to think about the implications of this, especially on science, on politics, on religion. We already have such manufactured outrage in our culture. But what happens when something worthy of cancel culture, the next round of whatever the manufactured outrage moment is in American history, never even happened, but we all think it did because it looked so realistic through the only way that we consume our media: through our screens? Maybe this will even transcend how we consume news and information from influencers or real people, and might also change our entertainment industry. Someone told me the other day, and I think we have a clip of this for you as well, that we've started to replace actors and actresses with entirely AI-generated people, people who are not actually in existence. So forget about just taking the Jake Pauls of the world and making them do embarrassing things that never actually happened but look pretty realistic. What if we never had to have a Sydney Sweeney or a Scarlett Johansson, and instead could create the next bombshell female actress, generated entirely from AI, to star in the next big Hollywood blockbuster movie?
This studio (out of Sweden, I think, is where this company originated) has created what they are deeming the first AI actress, who, apparently, Hollywood movie studios are already incredibly eager to start working with instead of having to pay a real human person. Her name is Tilly. Let's meet her.
Narrator or News Reporter
Tilly Norwood, an AI-generated actress, is causing outrage among Hollywood stars. Here's what we know. On Saturday, the Xicoia studio presented its first AI actress, Tilly Norwood, at the Zurich Film Festival. According to Deadline, several Hollywood talent agencies are already interested in signing Tilly Norwood. In an interview with Broadcast International, Xicoia studio creator Eline van der Velden said she wants Tilly to be the next Scarlett Johansson or Natalie Portman: "That's the aim of what we're doing." Pretty Little Liars star Lucy Hale simply commented, "No." Nicholas Alexander Chavez, who starred in the Netflix series Monster, wrote, "Not an actress, actually," in an Instagram comment. Matilda star Mara Wilson asked, "What about the hundreds of living young women whose faces were composited together to make her? You couldn't hire any of them?"
Isabelle Brown
Yep.
Narrator or News Reporter
In 2023, the strike organized by SAG-AFTRA, the union representing 160,000 television and film actors, was partly linked to concerns about the arrival of AI in cinema. In particular, the union criticized certain Hollywood studios that proposed using AI to scan extras and offer them only one day's pay, while becoming the owners of the scans and being able to use them for any project.
Isabelle Brown
I actually, for once, am agreeing with almost every actor in Hollywood, at least on the basis of being against this type of technology taking over an entire medium that is inherently a human medium of art. Movies and television, and cinema in general, used to be about conveying human stories and connecting with the human experience in all kinds of different settings, from a documentary all the way up to a fantasy sci-fi experience that's far outside the realm of our own possibility but still has the human element of storytelling associated with it. But if we're not using real people, how is this not completely twisting our understanding of humanity entirely? I'm a big believer that AI, like any social media or any other piece of technology, is just a tool, right? It's an opportunity for us to use it for something good, or it can be used for really, really bad, nefarious purposes too. So while this is all fun to look at, I watch videos like this one I'll pull up for you, of Tilly Norwood, this new AI actress, walking the red carpet, and something in my soul is very deeply unsettled by this, because we're creating a version of womanhood and a version of what the ideal woman looks like that is fundamentally impossible to achieve, because she literally does not exist. What's really, really weird to me here is that you're watching this video and your brain is thinking, yes, that is a person. You recognize common traits of a human being. You think, yeah, that's a real person. But then you look at it for longer than maybe half a second, and you start thinking, there's no way that's a real person. Her body proportions are completely unrealistic and unattainable, which I thought we were against, by the way, in the so-called progressive feminist culture that we live in, wanting to promote actual body positivity. What if we just didn't have any body to begin with, and could code a perfect body that is impossible to achieve?
Her skin is a little bit too perfect, and she has this childlike quality about her that is incredibly creepy for me to think about. I mean, you watched that video, and you saw her wearing little Converse sneakers and kind of kid-proportioned clothing. But then she also leans forward, and her breasts are falling out of her dress. She's got this childlike-innocence thing going on in her face and in the facial expressions that she's making. And it's incredibly telling to me that the culture and society we live in, which claims to be so supportive of human rights, and so supportive of equality, and so supportive in particular of lifting women up, has, for the first AI-generated actress, the first replacement we have ideated to take over spaces that typically have been quite empowering for women, giving them illustrious careers and opportunities for global influence, created an adolescent, child version of a girl that is hyper-sexualized for public consumption. It's really disgusting. So as fun as it is, as cute as it is, to see SpongeBob put into handcuffs, or Mr. Rogers talking about rizz (which, God, we love the man; he had great rizz, didn't he?), or even Martin Luther King Jr. joking about GTA 6 in the "I Have a Dream" speech, I have to start conceptualizing where this technology, like any technology, could be used for very, very nefarious purposes and have an impact on society too. And one of the biggest questions people are asking as Sora 2 has been rolled out into global consumption in the last few days has been the role of this technology in creating deepfake pornography and illicit content too. It is finally fall, the best season of the year by far, with some cooler days, which here in Washington, D.C. makes a very big difference from how horrifying the summer actually is. Some cooler days call for some layers that last, and Quince has become my favorite go-to for exactly that.
Their quality essentials strike the perfect balance for our family: cozy enough for comfort, but refined enough for any occasion, and so thoughtfully priced that you don't have to choose between having great style and sticking to your family's budget. It's exactly what you want when you are building a wardrobe that works as hard as you do. Think $50 Mongolian cashmere, premium denim that fits you like a dream, and luxe outerwear that you're going to want to wear year after year after year. These are the pieces that will transform your fall uniform. I'm eyeing all of their wool coats going into winter, coming up in just a few weeks, which is crazy. They look designer-level, but they cost a fraction of the price, and the quality, honestly, is just as good, if not better, because Quince partners directly with top-tier ethical factories and cuts out all of the middlemen. They deliver the exact same luxury-quality pieces at half the price of very similar brands. It's the kind of wardrobe upgrade that feels smart, stylish and effortless. I personally have already picked up the most gorgeous new suede jacket to wear around these cozy fall days here in Washington, D.C. over the next few weeks. Heck, maybe I'll add it to the show wardrobe lineup for later this week so you guys can see it. Find your fall staples at Quince. You can go to quince.com/Isabel for free shipping on your order and 365-day returns. They're also now available in Canada, too, for our friends in the Great White North. I meant 51st state. That's Q-U-I-N-C-E.com/Isabel to get free shipping and a 365-day return window: quince.com/Isabel. I hope that more people other than just myself are asking the big question that seems to be on the right people's minds, at least. Unfortunately, that question is how this technology might be used for very, very evil purposes and to bring even more darkness to our society through the medium of entertainment.
When I started researching Sora 2 to prep for today's show, one of the very first things that I saw was a Reddit thread that has since been partially deleted, though some of the comments are still up, about what this or any other AI deepfake video technology should be used for. Read this with me, because my brain is literally shuddering to think about this. "OpenAI, the owner of Sora 2 and ChatGPT and all of the most successful AI companies, should really allow people to make as much corn" (we'll say, for the sake of YouTube) "as possible with Sora. It is the right thing to do." This person is a deleted user, and this was two years ago. I should also say: two years ago, when Sora was originally being conceptualized, that was Sora 1, not Sora 2, which is a better product that has now been released just a couple of days ago. "I agree with you. AI corn is like a vegan version of" illicit adult content. "It is like a laboratory-made piece of meat. In the making of lab-grown meat, you don't harm any animals. Similarly," they argue, "in the making of AI-made corn, you do not harm any people physically and/or mentally."
Co-host or Guest
Ooh.
Isabelle Brown
Something about that just does not sit well with me. I understand how we could have arrived at a conclusion like that in our postmodern society, where we assume, well, as long as any real people aren't participating in creating this disgusting, sinful and harmful content that is increasing the commodification of human beings, then nobody is really hurt in the process. But just like lab-grown meat is sold to you as a perfectly healthy and more ethical alternative to the meat industry, it's not actually good for you. It's causing all kinds of issues, and so is deepfake, AI-generated illicit content as well. Just like lab-grown meat is giving people cancer because it's not how we were supposed to consume something that was healthy for us, AI-generated pornography is not a victimless crime, and this type of content is impacting people and harming them both physically and mentally. The whole conversation around pornography has been so interesting for me to consume over the last few years, to see how people are talking about this. You have these ex-porn actors and actresses who have come out of the industry to tell their story authentically and honestly. People like our friend Josh Broome, Bri Solstad, and Nala Ray, who escaped her life as one of the top-performing OnlyFans content creators, who we had on the show about a year ago, and who has now dedicated her life to Jesus Christ and to speaking out about how truly evil OnlyFans as an industry is. And you hear what they have to say about this industry, despite what you've heard: that this is a place where people are voluntarily submitting their bodies to create art and create culture, that these people are excited about participating in this, and that it's totally real. Every one of the people who have escaped this industry has a unified battle cry against the lies of pornography, saying that no, no, what you are seeing is entirely fake.
It is largely involved in human trafficking of individuals who are minors or clearly can't consent to participate in creating this content in the first place, and it is easily one of, if not the, biggest rots in our society causing us to treat human beings as a product to be bought and sold. From my perspective, whether it's real people participating in these videos or not, even if it's people like Tilly Norwood, the new AI actress that's going to replace actual actors in Hollywood, we are still contributing to the larger societal cancer of the commodification of people. People are not products. They are not something that you can buy and sell for your own pleasure, sexual or otherwise. And the more we promote the idea that they are, that it's totally harmless to be watching these videos through a screen, the more we are causing physical and mental harm to our entire society and the world at large. Pornography does not have a victimless impact on society, even just as a consumer. Pornography is scientifically known to directly rewire your brain, to change how you view the opposite sex, to make it almost impossible for you to have successful intimate experiences without the help of pharmaceutical drugs or watching more pornography in the process. It is like a drug, which is why I love organizations like our friends over at Fight the New Drug, who sell the T-shirts that say "Porn Kills Love." They talk about pornography in the context of an illicit drug, something that we should be criminalizing, because it's rewiring your brain in the exact same way taking meth or heroin is. Especially in this era of AI uncertainty, I am very, very passionate about locking down our family's private information securely on the Internet, which is why we love Webroot Total Protection. It's not just another cybersecurity product.
It is peace of mind for anyone living in today's digital world, which is getting scarier by the minute, and it shows you just how vital it is to stay protected online, especially after incidents like the recent TransUnion data breach that impacted over 4 million people. Yikes. Don't wait for your information to end up in the wrong hands. Webroot gives you identity protection for up to 10 family members, covers up to $1 million in fraud expense reimbursement, and offers 24/7 US-based customer support, including a specific hotline for elderly people. Plus, it includes real-time antivirus, a firewall, a web threat shield, a password manager and unlimited cloud backup, so that you are shielded from every single possible angle. With our family's information already being kind of public, because we put ourselves out there every single day, I know for a fact that the information that really matters, what needs to be secured in our home, can be protected through all of their amazing services at Webroot. With lightning-fast scans, lightweight software and lower annual costs over time, it is so easy to see why Webroot has a 90-plus percent retention rate and has been trusted by its customers for over 25 years. Do not spend weeks resolving identity theft alone. Webroot Total Protection gives you true peace of mind, whatever digital threats might come your way. You can get 50% off their Total Protection package or Webroot Essentials when you go to webroot.com/Isabel. That is 50% off Total Protection or Essentials when you go to webroot.com/Isabel. Live a better digital life and stay protected with Webroot. We've created this narrative, and I want to read this again for you, in American culture, that this is the healthy, vegan version of illicit adult content, where "you don't harm any people physically and/or mentally."
Sora 2, the new rendition of Sora that just rolled out on September 30, owned by OpenAI, claims that illicit content directly violates their terms of service and the user agreement that you agree to when you sign up to use the platform. They say this on their own website and explain it to people. So, under Sora 2: "Can Sora get a little spicy?" I think this is a news article, if I remember correctly; this isn't directly from their website, but it is citing their website. "Cue the awkward record scratch. Nope, Sora is not about to become your AI adult film director. Turns out OpenAI is all about safety and responsibility. Who'd have thought? They're throwing up more filters than a strict parent's Internet browser. No violence, no salacious stuff, no deepfakes of your favorite celebrities." Hmm, let's come back to that. "Basically, think Disney, not HBO. But all hope is not lost. It is rumored that OpenAI may allow nudity in Sora. It remains to be seen what the implications of such a move would be. OpenAI have implemented strict safety measures with the AI model. These measures prevent the generation of sexual content as well as content involving extreme violence, hate imagery, celebrity likeness or the misuse of intellectual property." Notice how they said (I promised we would come back) no deepfakes of your favorite celebrities. Think Disney, not HBO. Did we not all just see Jake Paul putting makeup on his face? Did we not all just see Muhammad Ali and Martin Luther King Jr. and President Ronald Reagan? So even celebrities, dead or alive, are already having deepfakes made of them through Sora 2 technology, though it violates the terms of service, allegedly. And some media companies are already reporting that in Sora 1 you were able to prompt the platform to create illicit content, even though its terms and conditions were completely against it.
Headlines like this one are popping up everywhere about how the first round of OpenAI's video generation, Sora 1, had successful jailbreaking techniques which allowed you to override the platform's rules and restrictions: "OpenAI reveals how the new model was made to produce X-rated videos." This was December 19th of 2024. The new text-to-video model appeared to be fond of making raunchy material in fantasy, medical or science-fiction settings, as long as you knew exactly what to prompt the platform with in the first place. It's already wildly concerning to see the impact that deepfake images and videos, especially from a sexually explicit standpoint, have had on Generation Z and young people all over the world, but especially here in America. I didn't even realize how bad the problem was until just a few months ago, when we saw President Trump and the First Lady, Melania Trump, treat this subject very, very seriously by passing federal legislation and championing this issue from inside the White House. But upon further research that I did just for the show today, I discovered that one in eight young people age 13 to 20, and if you narrow that to just 13 to 17, one in 10 teenagers, said that they personally know someone in the United States of America today who has been the target of deepfake nude imagery. That's one in eight people age 13 to 20 knowing a target, and one in 17 young people in this country have been targets of deepfake nude imagery themselves. That's insane. One in 17. That might not seem like a big ratio until you think about how many teenagers are in this country. And the same article that pulled this study that I was researching said that's basically one teenager in every classroom in a school in America. They further said 13% of teenagers said that they knew someone who had used AI to create or redistribute deepfake pornography of minors.
So, translation: everyone knows someone who has been victimized by this, if not at least loosely connected to someone who has been victimized by this through a friend of a friend, and many people have already been victimized by this themselves, with what they're often calling revenge deepfake porn: when a bully wants you to be taken down a few pegs, or an ex-boyfriend is angry at you for breaking up with him, or someone just wants to have control over you through the Internet, where we have all of our conversations today. And most of us even know someone who has used these platforms to create this pornography to tear someone else down in the process. Our legislative system and our elected officials have already taken extreme steps to try to prevent this from becoming more commonplace, although it seems pretty normal to me in America today. To date, more than 136 bills across the country to address this non-consensual intimate deepfake problem have already been introduced in 39 different states. And as I mentioned, the President and First Lady have taken on very active leadership roles against this at the federal level. Melania Trump made it one of her projects during this term as First Lady to endorse and actively advocate for something called the Take It Down Act, a bipartisan bill introduced in Congress aimed at criminalizing the publication of non-consensual intimate images, which include those AI-generated deepfakes. And they wanted to require online platforms to immediately remove such content as soon as it hits their algorithms, upon notification that no one agreed to be in these videos and that it was never something that was supposed to be generated in the first place. That bill, championed by both parties in Congress, did eventually pass, and President Donald Trump signed the Take It Down Act into law on May 19 earlier this year. But it seems to me the technology is only getting far more advanced.
It's becoming completely indistinguishable whether something was filmed IRL, in real life, or generated through a one-sentence prompt in the video version of ChatGPT. And we are losing the ability to critically distinguish between fact and fiction, between reality and fantasy, with every passing day. As fun as it is to see SpongeBob get put into handcuffs, or someone from history joke about video games and the impact they have on modern society, or a Get Ready With Me makeup video deepfake of your favorite super hyper-masculine male celebrity, it's scary. It makes me shudder to think about the impact this technology might have on completely innocent people at the individual level, destroying their character, their credibility, and how the world views them through things like deepfake pornography. And also the potential of quite literally rewriting history as history is happening, with conversations that are hot-button political issues. Who's to say that you couldn't use this technology, if you know how to prompt it, even though it might violate their terms and conditions, to recreate scenes of violent assassinations, like that of my friend Charlie Kirk not four weeks ago, to change entirely how the public perceives how events like this happened? Because you can change what happened in a hyper-realistic video that no one knows the origin of. How are platforms supposed to police this content? Do they even know what was created through AI in the first place? These are all questions that ultimately boil down to the one question I think we need to be asking ourselves a whole lot more often than we currently do: just because we can, should we? Just because we can, should we? I don't know. Maybe I'm just being a skeptic here. Usually I tend to err on the side of really liking new technology and thinking that it can be used for the most altruistic of purposes.
I was basically the apologist for the last two years, on a little island all by myself, for the role of TikTok in saving the country and reelecting Donald Trump to the presidency of the United States. And thank God I did, because TikTok probably was the final nail in the coffin that got President Trump back into the Oval Office. But something about this feels deeply sinister and incredibly terrifying to me. I don't use those words lightly. So I'm dying to know your thoughts. Are you excited about AI-generated video? Do you think we should be replacing celebrities with AI-generated versions instead? How does that shape the next generation's beauty standards and body standards? Are we using AI technology to oversexualize an entire generation and contribute to the commodification of young men and women? And what do we need to focus on instead? Let us know your thoughts in the comments, because I can't be the only one totally freaked out by this.
Episode: Sora 2 Just Changed The World—Has AI Gone Too Far?
Date: October 6, 2025
Host: Isabel Brown (The Daily Wire)
Isabel Brown explores the latest paradigm-shifting advancement in AI video generation: OpenAI’s Sora 2. She unpacks the societal, psychological, and ethical implications of this technology, focusing on its capacity to create hyper-realistic fake videos—including deepfakes of celebrities, historical figures, and even fully artificial actresses. Isabel critically considers AI’s potential impact on media, entertainment, history, and personal identity, with a special emphasis on dangers like deepfake pornography and the commodification of people.
Isabel Brown’s tone throughout is skeptical, authentic, and deeply concerned. She blends humor (reactions to AI-generated SpongeBob and Michael Jackson), empathy (“as a mom”), and alarm—especially regarding exploitation, media trust, and technology’s impact on young people and culture.
Summary:
This episode of The Isabel Brown Show provides an urgent, eye-opening discussion into how Sora 2 marks an irreversible turning point in AI’s power to shape (and distort) reality. Isabel prompts listeners—especially parents, policymakers, and creators—to grapple proactively with the ethics, risks, and future of AI-generated media.