
A
Today we're going to go over all the latest updates to the Meta Ray-Ban Smart Glasses: everything that has come out so far in 2025, plus two really big updates we've been waiting for that were just released this April. Intro: this is The Blind Life. Welcome back to the channel. The fit for today is the Karate Kid-inspired "Sweep the Cane. No Mercy," a very popular design. If you'd like to check out this design or other VIP t-shirt designs, you can find information in the video description or on my website, theblindlife.net. Now let's jump into the latest updates to the Meta Ray-Ban smart glasses.

Version 12 of the software was released in January, and it had some minor but important updates. They mostly concerned how well the Meta Ray-Bans respond to the wake word, which is "Hey M-E-T-A" (I'm not going to say it because mine are listening right now), and how quickly they get a response back to your question, like "What's in front of me?" or "What does this say?" Not only are they more responsive to the wake word, but you no longer have to structure your questions the way we used to. For example, I don't always have to say "look" in order for it to take a picture. Previously, if I wanted to know what was on my desk, I would have to say, "Hey, look and tell me what's on my desk," and the "look" keyword is what caused it to take a picture. Now I can just say, "What's on my desk?" and it will recognize what I want and take a picture. That's definitely going to make the Meta Ray-Ban smart glasses much easier to learn for a lot of people.

In March, the glasses here in the US got an update making them compatible with Audible from Amazon.
This means you'll be able to stream all your Audible audiobooks through the headset and control playback with touch gestures and your voice.

Now, April seems to be the big month for updates. The software was updated to version 14, and a lot of really cool features were added. First, improved control of Spotify using voice commands: you'll be able to sort by artist and shuffle playlists, all by voice. For example, you could say, "Shuffle my workout playlist." Next, real-time weather and air quality updates. I thought you could do this all along, but maybe you couldn't; now we'll be able to ask Meta about the weather, the UV index, the air quality, things like that. Hey, Meta, what's the air quality today?
B
The current air quality index is 41, which is good.
A
Thank you, John Cena, for telling me the current air quality for Kentucky. Also in April, they updated the AI voices; they're now more lifelike and just more pleasing. A quick note about the celebrity voices, which I covered in a previous video: unfortunately, they're currently only available in the US and the UK. I know a lot of my viewers in the UK were upset when I made those earlier videos because a lot of this wasn't available to them, but now it is. Super cool. Congratulations, UK.

And speaking of other countries, the connected apps, your music apps, are now available in a lot more countries: the US, Canada, the UK, Australia, France, Italy, Spain, Ireland, Austria, Belgium, Sweden, Denmark, Finland, Germany, and Norway. That's pretty awesome. Congratulations, everybody; you can now listen to your music through your sunglasses.

And probably the biggest update of April, in my opinion, actually of the year so far, is the full public release of Live AI and live translation. Up until April, these were only available through early access, but now they're slowly being rolled out to everyone, and Live AI in particular is going to be tremendously helpful for us. So what is Live AI? Previously, whenever we wanted information, we had to prompt the glasses: give the wake command, then say something like "What is this in front of me?" or "What does this say?" They would take a picture, process it, and then come back with a response. With Live AI, instead of going through all of that, the glasses just turn on the video, so they're actively watching all the time using the camera, and you can have a conversation with them, so to speak. You just ask questions and get responses back very, very quickly. This is going to have a huge impact for us in our community and make the glasses much more helpful. Live translation is similar.
Once you activate it, the glasses constantly listen for a different language and, almost in real time, translate it into your language through the speakers. So if you live in another country, travel a lot, or work with people who don't speak the same language you do, it's going to be extremely helpful. Real quickly, let me demonstrate Live AI, because that's what I'm most excited about. Hey, Meta. Enable Live AI.
B
Live AI session with video starting now.
A
So now I can just ask it questions. What's in front of me?
B
In front of you is a keyboard, a phone, a smartwatch, and a mouse.
A
What color is the keyboard?
B
The keyboard is white.
A
What am I holding in my right hand?
B
You are holding a red iPhone.
A
Red iPhone. Perfect. What is that over there?
B
It's a microphone.
A
What am I pointing at?
B
You're pointing at the camera.
A
Pointing at the camera. I am? You're right. How many cameras do you see?
B
I see three cameras.
A
Three cameras. Perfect. All right, thanks, Meta. Turn off Live AI.
B
Live AI session ending now.
A
There you go. So, like I said, Live AI is incredibly powerful, and it's going to be really helpful for us in a lot of different situations. I'm thinking grocery shopping; I'm thinking going through mail much more quickly. But the tricky part of a Live AI session is that it's always listening to you. So if you start to have a conversation with somebody else, the glasses are going to think you're talking to them, and they might respond. If you're in a super noisy area and people are talking to you, the glasses might try to talk to you. It's probably better to just end the session if that happens. You can give a command like "Stop session" or "Stop Live AI," or you can press and hold the touchpad on the right temple of the glasses.

One thing I'm super excited about for my next trip (I travel a lot) is using Live AI in the airport. Picture walking down the terminal trying to find my gate. Instead of having to stop, say "Look and see what's in front of me," and stand there in the middle of the hallway for a couple of seconds while it takes a picture, I can just keep walking and say, "All right, what does that one say?" "That says B13." "Okay, we're looking for B25, keep on walking. What about that one?" "That's B16." "All right, we're getting closer." It's just going to speed up the process so much. "Is this the men's room?" "No, that's the ladies' room." "Okay, which one's the men's room?" "The men's room is on the right." There are so many situations where this is going to be helpful for us. And, like I said, I'm excited that it's being rolled out to everyone who has access to Meta AI.

How do you get these updates? Great question. It's super simple. Just make sure the Meta View app is fully updated on your phone, then go into the app and make sure the glasses themselves have been updated to the latest version.
For features like live translation, you'll also have to go in and download the language pack you want to translate. Supported languages for translation are English, Spanish, French, and Italian.

All right, guys, those were the updates so far for 2025, including the big ones we've been waiting on for several months now. If you have any questions, let me know in the comments down below. Once again, don't forget to check out the merch available through the website. All my merch is available on both Amazon and Spring, so check that out through the links below or at theblindlife.net. While you're down there, be sure to like the video and subscribe if you found this content helpful, and hit the bell notification, because I put out videos like this every single week.

I want to give a huge shout-out to my channel members. Thank you to all the members of The Blind Life YouTube channel; without your support, a lot of this would not be possible. We truly appreciate it. If you'd like to learn how you can become a channel member, give back to the channel, and take part in all the extra perks that come with that, including joining us for our monthly members-only Zoom calls, just click the Join button located right below the video. You can also visit the membership page on my YouTube channel to learn more. A huge thank-you to the members once again, and a huge thank-you to you guys for watching. As always, this is Sam with The Blind Life. I will see you next time.
Host: Sam Seavey
Date: May 3, 2025
In this episode, Sam Seavey covers all the latest updates to the Meta Ray-Ban Smart Glasses that have come out in 2025, focusing on the significant improvements released in April—especially the much-anticipated Live AI and live translation features. Sam also demonstrates how these updates enhance everyday experiences for blind and visually impaired users.
Version 12 Release (January): Improved wake-word responsiveness and faster answers; the "look" trigger word is no longer required to take a picture.
Audible Compatibility (March): Stream Audible audiobooks through the glasses (US) and control playback with touch gestures or voice.
Version 14 Software Rollout (April): The big update bundling the features below.
Enhanced Spotify Controls: Sort by artist and shuffle playlists using voice commands (e.g., "Shuffle my workout playlist").
Real-Time Weather & Air Quality: Ask Meta for the weather, UV index, or air quality index on demand.
Upgraded AI & Celebrity Voices:
AI voices sound more lifelike.
Celebrity voices now available in the UK (previously US only).
“Congratulations UK. And speaking of other countries, now the connected apps, your music apps are available in a lot more countries…” (03:30)
Expanded International Support: Connected music apps now available in the US, Canada, the UK, Australia, France, Italy, Spain, Ireland, Austria, Belgium, Sweden, Denmark, Finland, Germany, and Norway.
Live AI (April public rollout):
The standout update of the year, now becoming available to all users.
No more need to take an explicit picture or use trigger words—continuous video stream allows for conversational queries and rapid answers.
Example use for the blind/VI community: Easier navigation, faster reading of printed material, identifying objects on the fly.
“Instead of going through all of that, the glasses just turn on the video and so they are actively watching all the time… you can just have a conversation with it…” (04:05)
Live AI Demonstration (05:40–06:16):
[Sam:] “Hey, Meta. Enable Live AI.” (05:40)
[Meta:] “Live AI session with video starting now.” (05:43)
[Sam:] “What’s in front of me?” (05:46)
[Meta:] “In front of you is a keyboard, a phone, a smartwatch, and a mouse.” (05:46)
[Sam:] “What color is the keyboard?”
[Meta:] “The keyboard is white.” (05:52)
[Sam:] “What am I holding in my right hand?”
[Meta:] “You are holding a red iPhone.” (05:55)
[Sam:] “What is that over there?”
[Meta:] “It’s a microphone.” (06:02)
[Sam:] “What am I pointing at?”
[Meta:] “You’re pointing at the camera.” (06:04)
[Sam:] “How many cameras do you see?”
[Meta:] “I see three cameras.” (06:10)
[Sam:] “All right, thanks, Meta. Turn off live AI.” (06:12)
Sam’s commentary:
“It’s incredibly powerful and it’s going to be really helpful for us in a lot of different situations. I’m thinking like, grocery shopping… going through mail… much more quickly.” (06:19)
Pro tip: Be mindful in noisy areas—the AI may respond to other people talking; best to end session in such environments.
Travel Use Case:
Sam expresses excitement about using Live AI in airports: reading gate signs on the move instead of stopping to take a picture each time.
Live Translation: Near-real-time translation of speech into your language through the speakers; supported languages are English, Spanish, French, and Italian.
How to Get the Updates:
Ensure the Meta View app on your phone and the glasses' firmware are both up to date.
For live translation, download your chosen language packs through the app.
Sam wraps up by encouraging users to update their glasses and try out the new features, especially Live AI and translation. He points out that these AI-driven updates are not only cool but fundamentally shift day-to-day independence and accessibility for blind and visually impaired users. He invites the audience to connect through comments and check out The Blind Life’s merch and YouTube membership for further engagement.