Transcript
Elise Hu (0:07)
You're listening to TED Talks Daily, where we bring you new ideas and conversations to spark your curiosity every day. I'm your host, Elise Hu. You're looking at a grainy photograph of soldiers who've been taken hostage. Is this photo real or fake? Until recently, this question wouldn't have been difficult to answer, but today it may be the first thing we need to ask. In this talk, digital forensic scientist Hany Farid warns of the fast-approaching dangers of generative AI in forever changing our understanding of truth and facts, and says that when it comes to our engagement with technology, we're at a pivotal fork in the road. It all comes down to what choice we will make. Stick around after his talk for a brief Q&A between Hany and Latif Nasser, the co-host of Radiolab and a guest curator of TED2025, and tune into this very feed later today for a special conversation between Hany Farid and me, where we dig into some of the deeper ideas from his talk.
Georgie Frost (1:14)
Want to go deeper on the business topics that matter? This is the so what from BCG and I'm your host Georgie Frost. BCG experts explore the most pressing topics on the minds of the world's top business leaders. Listen wherever you get your podcasts.
Georgie Frost (1:33)
This message is brought to you by Apple Card. Each Apple product, like the iPhone 16, is thoughtfully designed by skilled designers. The titanium Apple Card is no different. It's laser-etched, has no numbers, and it earns you daily cash on everything you buy, including 3% back on everything at Apple. Apply for Apple Card on your iPhone in minutes, subject to credit approval. Apple Card is issued by Goldman Sachs Bank USA, Salt Lake City Branch. Terms and more at applecard.com. This episode is sponsored by Dell, introducing the new Dell AI PC, powered by the Intel Core Ultra processor. It's not just an AI computer, it's a computer built for AI. That means it's built to help do your busy work for you, so you can fast-forward through editing images, designing presentations, generating code, debugging code, running lots of apps without lag, creating live translations and captions, summarizing meeting notes, extending battery life, enhancing security, finding that file you are looking for, managing your schedule, meeting your deadlines, and responding to Jim's long emails, leaving all the time in the world for more you time and for the things you actually want to do. No offense, Jim. Get a new Dell AI PC starting at $699.99 at dell.com. How those ahead stay ahead.
Hany Farid (2:51)
You are a senior military officer and you've just received a chilling message on social media. Four of your soldiers have been taken, and if demands are not met in the next 10 minutes, they will be executed. All you have to go on is this grainy photo, and you don't have the time to figure out if four of your soldiers are in fact missing. What's your first move? If I may be so bold, your first move is to contact somebody like me and my team. I am by training an applied mathematician and computer scientist, and I know that seems like a very strange first call at a moment like this, but I've spent the last 30 years developing technologies to analyze and authenticate digital images and digital videos. Along the way, we've worked with journalists, we've worked with courts, we've worked with governments on a range of cases: from a damning photo of a cheating spouse to gut-wrenching images of child abuse to photographic evidence in a capital murder case, and of course, things that we just can't talk about. It used to be that a case would come across my desk once a month, and then it was once a week. Now it's almost every day. And the reason for this escalation is a combination of things. One, generative AI: we now have the ability to create images that are almost indistinguishable from reality. Two, social media: it dominates the world, is largely unregulated, and actively promotes and amplifies lies and conspiracies over the truth. And collectively, this means that it is becoming harder and harder to believe anything that we read, see or hear online. I contend that we are in a global war for truth, with profound consequences for individuals, for institutions, for societies and for democracies. And I'd like to spend a little time talking today about what my team and I are doing to try to return some of that trust to our online world and, in turn, our offline world. For 200 years, it seemed reasonable to trust photographs.
But even in the mid-1800s, it turns out, the Victorians had a sense of humor. They manipulated images. Or you could alter history: if you fell out of favor with Stalin, for example, you might be airbrushed out of the history books. But then, at the turn of the millennium, with the rise of digital cameras and photo-editing software, it became easier and easier to manipulate reality. And now, with generative AI, anybody can create any image of anything anywhere at the touch of a button, from four soldiers tied up in a basement to a giraffe trying on a turtleneck sweater. It's not all fun and games, of course, because generative AI is being used to supercharge past threats and create entirely new ones: the creation of nudes of real women and children used to humiliate or extort them; fake videos of doctors promoting bogus cures for serious illnesses; a Fortune 500 company losing tens of millions of dollars because an AI impersonator of their CEO infiltrated a video call. Those threats are real, they are here, and we are all vulnerable. It's useful to understand how generative AI works, starting with billions of images, each with a descriptive caption. Each image is degraded until nothing but visual noise is left, a random array of pixels. And then the AI model learns how to reverse that process, by essentially turning that noise back into the original image. And when this process is done, not once, not twice, but billions of times on a diverse set of images, the machine has learned how to convert noise into an image that is semantically consistent with anything you type. And it's incredible, but it is decidedly not how a natural photograph is taken, which is the result of converting light that strikes an electronic sensor into a digital representation. And so one of the first things we like to look at is whether the residual noise in an image looks more like a natural image or an AI-generated image. Those star-like patterns are a telltale sign of generative AI.
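[Editor's note: the noise-residual test described above can be sketched in a few lines. This is a simplified illustration, not the speaker's actual forensic pipeline: the 3x3 box blur stands in for a proper denoising filter, and real tools use far more careful statistics. The idea is that periodic generator artifacts show up as bright, off-center "star-like" peaks in the centered Fourier spectrum of the residual, while camera sensor noise looks comparatively flat.]

```python
import numpy as np

def noise_residual_spectrum(image):
    """Estimate a grayscale image's noise residual and return the
    log-magnitude of its 2D Fourier transform.

    The residual is approximated as the image minus a locally-smoothed
    copy. Strong peaks away from the center of the returned spectrum
    hint at periodic artifacts typical of generated images.
    """
    img = np.asarray(image, dtype=np.float64)
    # Crude denoiser: 3x3 box blur built from shifted views of a
    # padded copy (a stand-in for a real denoising filter).
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    blurred = sum(
        padded[dy:dy + h, dx:dx + w]
        for dy in range(3) for dx in range(3)
    ) / 9.0
    residual = img - blurred
    # Centered magnitude spectrum; log1p compresses the dynamic range.
    return np.log1p(np.fft.fftshift(np.abs(np.fft.fft2(residual))))
```

For example, a pure horizontal sinusoid (a stand-in for a periodic artifact) concentrates its residual energy in one bright off-center peak, whereas a flat image yields an empty spectrum.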
Now, for the mathematicians and the physicists in the audience, that is the magnitude of the Fourier transform of the noise residual. For everybody else, that detail doesn't matter, but you definitely should have taken more math in college. Professors can't help themselves. But no forensic technique is perfect, and so you don't stop after one thing. You keep going. So let's go on to our next one: vanishing points. If you image parallel lines in the physical world, they will converge to a single point, what's called a vanishing point. Good intuition for that is railroad tracks. Railroad tracks are obviously parallel; they narrow as they recede away from me and intersect at a single vanishing point. This is a phenomenon that artists have known for centuries. But here's the thing: AI doesn't know this, because AI is fundamentally, as I just described, a statistical process. It doesn't understand the physical world, the geometry and the physics. So if we can find physical and geometric anomalies, we can find evidence of manipulation or generation. Evidence number two. All right, what else can we learn? Surprisingly, shadows have a lot in common with vanishing points. And again, this is a physical phenomenon that you expect in natural images. And because AI fundamentally doesn't model the physics and the geometry of the world, it tends to violate them, and so we now have a very good indication that this image is not authentic. The most important thing I want you to take away from this is that while it may not be easy, it is possible to distinguish what is real from what is fake. I think this image is a bit of a metaphor for how a lot of us feel. We feel like hostages. We don't know what to trust anymore. We don't know what is real, what is fake. But we don't have to be hostages. We don't have to succumb to the worst human instincts that pollute our online communities. We have agency, and we can effect change.
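[Editor's note: the vanishing-point check described above lends itself to a small worked example. In homogeneous coordinates, the line through two image points is their cross product, and the intersection of two lines is again a cross product. This is a sketch of the geometry only, not the speaker's actual tooling, and the rail coordinates are made-up pixel values for illustration.]

```python
import numpy as np

def line_through(p, q):
    """Homogeneous line through two image points (a cross product)."""
    return np.cross([p[0], p[1], 1.0], [q[0], q[1], 1.0])

def vanishing_point(line_a, line_b):
    """Intersection of two homogeneous lines as (x, y) pixel
    coordinates, or None if the lines are parallel in the image
    (their intersection lies at infinity)."""
    v = np.cross(line_a, line_b)
    if abs(v[2]) < 1e-9:
        return None
    return (v[0] / v[2], v[1] / v[2])

# Two rails of a railroad track, traced as point pairs in pixel
# coordinates (hypothetical numbers): both rails should meet at
# one vanishing point if the scene geometry is consistent.
left_rail = line_through((100, 800), (460, 300))
right_rail = line_through((900, 800), (540, 300))
vp = vanishing_point(left_rail, right_rail)
```

In an authentic photo, every family of scene-parallel lines yields a consistent vanishing point; when traced lines in a suspect image refuse to meet where the scene says they must, that is geometric evidence of manipulation or generation.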
Now, I can't turn you all into digital forensics experts in 10 minutes, but I can leave you with a few thoughts. 1. Take comfort in knowing that the tools that I've described, and that my team and I are developing, are being made available to journalists, to institutions and to the courts to help them tell what's real from what's fake, which in turn helps you. 2. There is an international standard for so-called content credentials that can authenticate content at the point of creation. As these credentials start to roll out, they will help you, the consumer, figure out what is real and what is fake online. And while they won't solve all of our problems, they will absolutely be part of a larger solution. 3. Please understand that social media is not a place to get news and information. It is a place that Silicon Valley created to steal your time and your attention by delivering you the equivalent of junk food. And like any bad habit, you should quit. And if you can't quit, at least do not let this be your primary source of information, because it is simply too riddled with lies and conspiracies, and now AI slop, to be even close to reliable. 4. Understand that when you share false or misleading information, intentionally or not, you are part of the problem. Don't be part of the problem. There are serious, smart, hardworking journalists and fact-checkers out there who work every day (I talk to them every day) to sort out the lies from the truth. Take a breath before you share information, and don't deceive your friends and your families and your colleagues and further pollute the online information ecosystem. We're at a fork in the road. One path: we can keep doing what we've been doing for 20 years, allowing technology to rip us apart as a society, sowing distrust, hate and intolerance. Or we can change paths. We can find a new way to leverage the power of technology to work for us and with us, and not against us. That choice is entirely ours. Thank you.
