The Blind Life Podcast – Ep 41
Meta Ray-Ban Smart Glasses 2025: Updates You Need to Know About! Live AI Is Here!!
Host: Sam Seavey
Date: May 3, 2025
Episode Overview
In this episode, Sam Seavey covers all the latest updates to the Meta Ray-Ban Smart Glasses that have come out in 2025, focusing on the significant improvements released in April—especially the much-anticipated Live AI and live translation features. Sam also demonstrates how these updates enhance everyday experiences for blind and visually impaired users.
Key Discussion Points & Insights
1. Early 2025 Updates (January–March)
- Version 12 Release (January):
- Minor yet important refinements, particularly around how the glasses respond to the wake word (“Hey Meta”) and how quickly they return answers to user questions.
- Improved natural language understanding—users no longer need to structure commands as formally (e.g., no need for trigger word “look” before asking a question).
- “Now I can just say ‘what’s on my desk’ and it will recognize what I want it to do and it’ll take a picture.” (01:00)
- Audible Compatibility (March):
- US users can now stream Audible audiobooks directly via the glasses, controlling playback with touch gestures or voice.
2. April 2025 Big Updates
- Version 14 Software Rollout:
- “April seems to be the big month for updates. The software is updated to version 14 and we have a lot of really cool features being added in.” (01:45)
- Enhanced Spotify Controls:
- Users can now sort music by artist, shuffle playlists, and manage playback using only voice commands.
- Example: “A voice command you could give would be to shuffle my workout playlist…” (02:10)
- Real-Time Weather & Air Quality:
- Ask for instant updates on weather, UV index, and air quality.
- Demo:
- “Hey, Meta, what’s the air quality today?” (02:48)
- [Meta AI voice:] “The current air quality index is 41, which is good.” (02:56)
- “Thank you, John Cena, telling me what the current air quality is for Kentucky.” (03:01)
- Upgraded AI & Celebrity Voices:
- AI voices sound more lifelike.
- Celebrity voices now available in the UK (previously US only).
- “Congratulations UK. And speaking of other countries, now the connected apps, your music apps are available in a lot more countries…” (03:30)
- Expanded International Support:
- Music app integration now supported in the US, Canada, UK, Australia, France, Italy, Spain, Ireland, Austria, Belgium, Sweden, Denmark, Finland, Germany, and Norway.
3. The Biggest Change: Live AI & Live Translation
- Live AI (April public rollout):
- The standout update of the year, now becoming available to all users.
- No more need to take an explicit picture or use trigger words: a continuous video stream allows for conversational queries and rapid answers.
- Example uses for the blind/VI community: easier navigation, faster reading of printed material, identifying objects on the fly.
- “Instead of going through all of that, the glasses just turn on the video and so they are actively watching all the time… you can just have a conversation with it…” (04:05)
- Live AI Demonstration (05:40–06:16):
- [Sam:] “Hey, Meta. Enable Live AI.” (05:40)
- [Meta:] “Live AI session with video starting now.” (05:43)
- [Sam:] “What’s in front of me?” (05:46)
- [Meta:] “In front of you is a keyboard, a phone, a smartwatch, and a mouse.” (05:46)
- [Sam:] “What color is the keyboard?”
- [Meta:] “The keyboard is white.” (05:52)
- [Sam:] “What am I holding in my right hand?”
- [Meta:] “You are holding a red iPhone.” (05:55)
- [Sam:] “What is that over there?”
- [Meta:] “It’s a microphone.” (06:02)
- [Sam:] “What am I pointing at?”
- [Meta:] “You’re pointing at the camera.” (06:04)
- [Sam:] “How many cameras do you see?”
- [Meta:] “I see three cameras.” (06:10)
- [Sam:] “All right, thanks, Meta. Turn off Live AI.” (06:12)
- Sam’s commentary: “It’s incredibly powerful and it’s going to be really helpful for us in a lot of different situations. I’m thinking like, grocery shopping… going through mail… much more quickly.” (06:19)
- Pro tip: Be mindful in noisy areas; the AI may respond to other people talking, so it’s best to end the session in such environments.
- Travel Use Case: Sam expresses excitement about using Live AI in airports for rapid navigation.
- “Walking down the terminal, trying to find my gate… now I can just be walking and say, all right, what does that one say? … Okay, we’re looking for B25.” (06:45)
- “Is this the men’s room? No, that’s the ladies room. Okay, which one’s the men’s room? The men’s room is on the right.” (07:00)
- Real-world accessibility improvements for independent wayfinding.
- Live Translation:
- Glasses can now listen continuously for a foreign language and provide almost real-time translation via speakers.
- “If you do live in another country, if you travel a lot, if you work with people who don’t speak the same language that you do, it’s going to be extremely helpful.” (04:50)
- Supported languages: English, Spanish, French, Italian.
4. How to Get the Updates
- Ensure the Meta View app on your phone and the glasses’ firmware are both up to date.
- For live translation, download your chosen language packs through the app.
- “Just make sure that the Meta View app is fully updated on your phone and then go into the app and make sure that the glasses themselves have been updated to the latest version...” (07:35)
Notable Quotes & Memorable Moments
- “Now I can just say ‘what’s on my desk’ and it will recognize what I want it to do and it’ll take a picture. So that’s definitely going to make it much easier to learn how to use the Meta Ray Ban smart glasses for a lot of people.” (01:00)
- “Congratulations UK. And speaking of other countries, now the connected apps, your music apps are available in a lot more countries…” (03:30)
- “Probably the biggest update in my opinion of April, actually of the year so far, is the full public release of live AI and live translations.” (03:45)
- “The glasses just turn on the video and… you can just have a conversation with it, so to speak. You can just ask it questions and it will give the response back very, very quickly.” (04:05)
- “It’s going to be really helpful for us in a lot of different situations. I’m thinking like, grocery shopping, I’m thinking, like, going through mail much more quickly.” (06:19)
- “So many situations where this is going to be helpful for us.” (07:09)
Timestamps for Key Segments
- Version 12 improvements: 00:45–01:30
- Audible support: 01:30–01:45
- April/Version 14 feature overview: 01:45–03:45
- Live AI and translation deep dive: 03:45–07:35
- Live AI demonstration: 05:40–06:16
- Practical airport navigation scenario: 06:40–07:10
- How to update & supported languages: 07:35–08:00
Conclusion
Sam wraps up by encouraging users to update their glasses and try out the new features, especially Live AI and translation. He points out that these AI-driven updates are not only cool but fundamentally improve day-to-day independence and accessibility for blind and visually impaired users. He invites the audience to connect through comments and check out The Blind Life’s merch and YouTube membership for further engagement.
