Transcript
Podcast Host (0:00)
This podcast is brought to you by eHarmony, the dating app to find someone you can be yourself with. What makes eHarmony so special? You know, really, the profiles and conversations are different on eHarmony, and that's what makes it great. eHarmony's compatibility quiz brings out everyone's personality on their profile and highlights similarities on your discovery page, so it's even easier to start a conversation that actually goes somewhere. So what are you waiting for? Get who gets you on eHarmony. Sign up today.
Meditation Guide (0:27)
This is a mini meditation guided by Bombas. Repeat after me: I'm comfy. I'm cozy. I have zero blisters on my toes. Blisters. And that's because I wear Bombas, the softest socks, underwear, and t-shirts that give back. One purchased equals one donated. Now go to bombas.com/acast and use code ACAST for 20% off your first purchase. That's B-O-M-B-A-S dot com slash acast, and use code ACAST at checkout.
Podcast Host (0:57)
Thumbtack presents the ins and outs of caring for your home. Out: procrastination, putting it off, kicking the can down the road. In: plans and guides that make it easy to get home projects done. Out: carpet in the bathroom. Like, why? In: knowing what to do, when to do it, and who to hire. Start caring for your home with confidence. Download Thumbtack today.
Elise Hu (1:22)
You're listening to TED Talks Daily, where we bring you new ideas to spark your curiosity every day. I'm your host, Elise Hu. Scientists are getting closer and closer to giving the powers of telepathy to humans, thanks to brain-computer interfaces. In his 2024 talk, researcher Chin-Teng Lin shows us an EEG headset that can turn silent speech into words with rather remarkable accuracy. Check it out.
Chin-Teng Lin (2:03)
How often are you frustrated by the time it takes to accurately get the things in your mind into a computer? It is even worse for people like me, whose first language is not based on letters. I live and work in Australia, but I am originally from Taiwan. I moved to Sydney eight years ago and now run a university research center there.

Most of us use a keyboard every day to get the things in our mind into the computer. We have to learn to type, and the fact that you have to learn to do something shows how unnatural it is. The finger-driven touch screen has been around for 60 years. It's convenient, but it is also slow. There are other ways to control computers, such as joysticks or gestures, but they are not very useful for capturing the words in your mind. And it is words that are critical to communication for human beings. This problem is about to be over, because of AI. Today I will show you how AI can turn the speech in your mind into words on a screen.

Getting from the brain to the computer efficiently is a real bottleneck for any computer application. It has been my passion for 25 years. Many of you, or most of you, have heard of the brain-computer interface, BCI. I have been working on BCI for direct communication between a brain and a machine since 2004. I developed a series of EEG headsets that do this, but they are not new. What is new is an interface that works in a natural way, based on how our brain works naturally. Imagine reading the words while someone is thinking: translating the brain signals into words. Today you will see this in action, and with no implant. We are using AI to decode the brain signals on the top of your head and identify the biomarkers of speaking. That means that you can send the words in your mind into the computer with wearable technology. It's exciting, and I believe it will open up the bottleneck of how we engage with computers.

We are making exciting progress in decoding EEG to text. It's natural. We have had very promising results in decoding EEG when someone is speaking aloud. The frontier we are working on now is to decode EEG when the speech is not spoken aloud: the words that flow in your mind when you are listening to others, when you are talking to yourself, or when you are thinking. We are well on the way to making it a reality. I am going to invite two of my team, Charles and Daniel, to show it to us. This is the world premiere for us. We are getting around 50 percent accuracy in decoding the brain signals into words when someone is speaking silently.

Here is how it will work. We have a collection of words that we have trained our technology with. They are combined into sentences. Charles will select one sentence, and Daniel will read that sentence word by word, silently, producing the brain signals that our sensors will pick up. Our technology will decode the brain signals into words. We pick up the brain signals with sensors, then amplify and filter them to reduce the noise and get the right biomarkers. We use AI for the task: deep learning to decode the brain signals into the intended words, and then a large language model to match the decoded words and make up for the mistakes in the EEG decoding. All of this is going on in the AI, but for the user the interaction is natural: through thought, and in natural language. We are very excited about the advances that we are making in understanding words and sentences.
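The pipeline just described has three stages: filter the raw EEG to reduce noise, decode each word-length segment with a deep network trained on a closed vocabulary, then let a language model patch over decoding mistakes. Below is a minimal, illustrative sketch of that flow in Python. The toy vocabulary, the uniform bigram table, and the stubbed random decoder are all assumptions for demonstration; this is not the speaker's actual system.

```python
# Three-stage EEG-to-text sketch: band-pass filtering, a stand-in
# neural decoder over a closed vocabulary, and a language-model prior
# fused in to compensate for decoding errors.
import numpy as np
from scipy.signal import butter, filtfilt

VOCAB = ["i", "want", "a", "car", "hamburger"]  # toy closed vocabulary

def bandpass(eeg, lo=1.0, hi=40.0, fs=250.0):
    """Butterworth band-pass, a common EEG noise-reduction step."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=-1)

def decoder_scores(epoch):
    """Stand-in for the deep-learning decoder: a probability over
    VOCAB for one word-length epoch. Random here, trained in reality."""
    logits = np.random.randn(len(VOCAB))
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Toy bigram "language model": P(next word | previous word).
BIGRAM = np.full((len(VOCAB), len(VOCAB)), 1.0 / len(VOCAB))

def decode_sentence(epochs, lm_weight=0.5):
    """Greedily fuse decoder evidence with the language-model prior."""
    words, prev = [], None
    for epoch in epochs:
        log_p = np.log(decoder_scores(bandpass(epoch)))
        if prev is not None:
            log_p = log_p + lm_weight * np.log(BIGRAM[prev])
        prev = int(np.argmax(log_p))
        words.append(VOCAB[prev])
    return " ".join(words)

# Example: three fake 8-channel epochs of 250 samples (1 s at 250 Hz).
print(decode_sentence(np.random.randn(3, 8, 250)))
```

The lm_weight term is where the talk's point about the language model lives: the noisier the EEG decoder, the more the word-sequence prior has to carry.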
Another thing that is very natural to people is looking at something that has their attention. Imagine if you could select an item just by looking at it, not by picking it off the shelf or punching a code into the vending machine. Two years ago, in a project about hands-free control of robots, we were very excited about robot control via visual identification of flickers. We are now beyond that: we do not need any flicker. The AI is making it natural.

Daniel is going to look at the photos and select an item in his mind. If it is working as it should, you will see the selected item pop up on screen. We use photos for this because they are very controllable. To show that this is not all just built into my presentation, Charles will pick one item for Daniel to select in his mind. Please, Charles. It's a car. So, Daniel selected the car in his mind... "Hamburger" is incorrect. It's unlucky that the 30 percent error rate caught up with us again. Let's invite Charles and Daniel to show it again.

When Daniel selects an item in his mind, his brain recognizes and identifies the object, and that triggers his EEG. Our technology decodes those triggers. We are working our way through the technical challenges. We will work on overcoming the interference issue; that's why I asked for the phones to be turned off. Different people have different neural signatures, which are important to decoding accuracy. One reason I brought Daniel along here is that he gives off great neural signatures, as far as our technology is concerned. There are still cables here as well; it is not yet very portable. Probably the biggest barrier to people using this would be: how do I turn it off? Any of you will have had times when you were happy that the people you were with didn't know what you were really thinking. There are serious privacy and ethics issues that will have to be dealt with.

I am very passionate about how important this technology can be. One exciting point is linking the brain-computer interface to wearable computers. You already have a computer on your head; the brain will be the natural interface. It is not only about commanding and controlling a computer. The natural BCI also provides another way for people to communicate with people. For example, it allows people who are not able to speak to communicate with others, or it can be used when privacy or silence is required. If your idea of "natural" is a lovely forest, you could wonder how natural this could be. My answer is: natural language is the natural thought process that you are already using, and there are no unnatural implants in your body. I am challenging you to think about what you regard as natural communication: turning the speech in your mind into words. There is a standard way to finish up when talking with people: you say, "Just think about it." I hope you are as excited as we are about the prospect of a future in which, when you just think about something, the words in your mind appear on screen. Thank you.
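The selection demo works by decoding which candidate item evoked the attention response in the viewer's EEG. One generic way to build such a decoder is nearest-template classification: average the calibration epochs recorded while each item was attended, then match a new epoch against those templates. The sketch below assumes that setup; the item list, array shapes, and classifier choice are illustrative, not the decoder shown on stage.

```python
# Nearest-template item selection from EEG epochs: build one average
# "template" response per candidate item from calibration data, then
# pick the item whose template best correlates with a fresh epoch.
import numpy as np

ITEMS = ["car", "hamburger", "apple"]  # hypothetical candidate items

def make_templates(calib_epochs, labels, n_items):
    """Average the flattened epochs recorded for each attended item."""
    flat = calib_epochs.reshape(len(calib_epochs), -1)
    return np.stack([flat[labels == k].mean(axis=0) for k in range(n_items)])

def select_item(epoch, templates):
    """Return the item whose template correlates best with the epoch."""
    x = epoch.ravel()
    scores = [np.corrcoef(x, t)[0, 1] for t in templates]
    return ITEMS[int(np.argmax(scores))]

# Fake calibration set: 30 epochs, 8 channels x 250 samples each.
rng = np.random.default_rng(0)
calib = rng.standard_normal((30, 8, 250))
labels = rng.integers(0, len(ITEMS), size=30)
templates = make_templates(calib, labels, len(ITEMS))
print(select_item(rng.standard_normal((8, 250)), templates))
```

On random data the prediction is arbitrary; in practice, accuracy depends on how cleanly each person's neural signature separates the candidate responses, which is why the speaker's point about individual neural signatures matters.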
