Meta continues to improve its Ray-Ban smart glasses with software updates, and the latest release adds three new capabilities, making them some of the smartest and trendiest glasses available. With the update released on December 16, users who signed up for the Early Access program can now use Live AI, Live Translation, and Shazam integration, which lets them identify a song playing in the background directly from the smart glasses.
Of the three, Live AI is the biggest upgrade. It brings video AI capabilities to the Ray-Ban Meta glasses, letting them respond in real time to what you see and answer related questions, working similarly to Google’s Project Astra, which is powered by Gemini 2.0.
These features were first demonstrated at Meta Connect 2024 in September, and they are now being rolled out to early adopters living in Canada or the United States.
A recent update also added support for streaming music through platforms like Spotify and Amazon Music, as well as integration with Be My Eyes.
Similarly, Ray-Ban Meta glasses now support real-time translation between English and Spanish, French, or Italian. When the other person speaks one of those three languages, you’ll hear their words in English, and if they’re also wearing Ray-Ban Meta glasses, they’ll hear your English response in their own language. This should be a big help for travelers facing a language barrier.
Last but not least, Ray-Ban Meta glasses can now recognize music playing in the background. All users have to do is ask, “Hey Meta, what’s this song?” and Meta will return an answer within a few seconds. This is a genuinely useful feature, especially for those who like to discover local music while traveling: since the glasses stay on your face, it’s more convenient than pulling out your phone and asking Shazam the same question.