Personally, I prefer the look of the slightly narrower Headliner frames, but the second-generation glasses still look a lot like traditional Wayfarers. I’ve never been a fan of transition lenses for my regular glasses, but I’ve started using them on smart glasses. As Meta improved the cameras and made its AI assistant more useful, I found more reasons to wear the glasses indoors.
If you’re paying $300 or more for a pair, you might as well wear them everywhere. It also helps that the transition lenses on the second-generation Ray-Ban Meta glasses are slightly darker than those on my first-generation Wayfarers. Skipping the standard clear lenses comes at a cost, though: frames with polarized lenses start at $409, transition lenses start at $459, and prescription lenses can cost considerably more.
Like the current Oakley Meta HSTN glasses, the second-generation Ray-Bans also get longer battery life and a better camera. Meta says the battery lasts up to eight hours on a charge with “normal use”; I managed about five and a half hours of continuous music playback. That’s a marked improvement over the battery in my original pair, which started showing its age after two years. The glasses now also support 3K video recording, though the 12MP wide-angle camera takes the same 3,024 x 4,032 pixel photos as previous models.
The video quality has improved significantly, but I still think it’s probably overkill for most people who mainly share their clips on social media. It does make the glasses more attractive to developers, though, and judging by the turnout at Connect, I suspect Meta considers them an important part of its user base. I also hope Meta adds the ability to record hyperlapse and slow-motion video, as I think those can be more interesting for everyday moments than regular POV footage.
Meta AI and what’s next
Two years ago, I was fairly skeptical of Meta’s AI assistant. Since then, Meta has continued to add new features, and of these, I like the glasses’ translation capabilities the most. On a recent trip to Argentina, I used live translation to follow a walking tour of the famous Recoleta Cemetery. It wasn’t perfect (the feature is intended more for back-and-forth conversations than extended monologues), but it allowed me to take a tour I otherwise would have had to skip. (One caveat: using live translation for long stretches drains the battery considerably.)
Meta AI can provide context and translations in other scenarios as well. While testing the second-generation Ray-Bans in Germany, I repeatedly found myself asking Meta AI to translate signs and instructions, and it was able to summarize long stretches of German text for me on the spot.
As I wrote in my review of the Oakley Meta HSTN glasses, I haven’t found Live AI, which lets you interact with the assistant in real time and ask questions about your surroundings, to be very useful. It still feels more like a novelty, though it’s a fun demo for friends who have never tried “AI glasses.” There are also some genuinely interesting accessibility use cases that take advantage of the glasses’ camera and AI capabilities. Features such as support for ‘Detailed Answers’ and ‘Be My Eyes’ show how smart glasses can be particularly effective for people who are blind or visually impaired.
One AI-powered feature I haven’t been able to try yet is Conversation Focus, which can raise the volume of the person you’re talking to and reduce background noise. Meta revealed the feature at Connect but didn’t say when it would be available. If it works as promised, I imagine it will be useful in plenty of scenarios.
I’m also particularly intrigued by Meta’s announcement at Connect that third-party developers can finally create their own integrations for its smart glasses. A handful of partners, including Twitch and Disney, are already looking for ways to take advantage of the glasses’ camera and AI capabilities. So far, Meta AI’s multimodal tools have shown promise, but I haven’t found many ways to use them in my daily life.
That may change once app makers have access to the platform. Disney plans to integrate the smart glasses into its parks, allowing guests to get real-time information about rides, attractions and other services as they walk around. The golf app 18Birdies has shown off an integration that can give you stats and other information while you’re on the course.
Should you buy them? And what about privacy?
When the Ray-Ban Meta glasses came out two years ago, this was an easy question to answer. If you liked the idea of smart glasses with a good camera and open-ear speakers, they were an easy recommendation.
Now it’s a bit more complicated. Meta continues to update its first-generation Ray-Ban glasses with key new features such as Conversation Focus, new camera modes and third-party app integrations. So if you already own a pair, you won’t lose much by not upgrading. (And with a starting price of $299, the first-gen glasses are still solid if you want a more affordable option.)
There are other options to consider, too. The upcoming Oakley Meta Vanguard glasses feature further hardware upgrades and other unique features that will appeal to athletes and anyone who spends a lot of time outdoors. And at the top of the lineup are the $799 Ray-Ban Meta Display glasses, which combine AR elements with the existing feature set in an exciting way.
I also have the same privacy concerns I had when I tried on Meta’s first pair of Ray-Ban glasses in 2021. I’m aware that Meta already collects an extraordinary amount of data about us through its apps, but the glasses seem to offer much more personal and potentially invasive access to our lives.
Meta has also made notable changes to its eyewear privacy policy in recent months. US users can no longer opt out of having their voice recordings stored in the cloud, though they can still delete them manually in the Meta AI app.
The company says it won’t use the content of photos and videos you capture to train its AI models or to serve ads. However, images of your surroundings that are processed by the glasses’ multimodal features, such as Live AI, can be used for training (these images aren’t saved to your camera roll). Meta’s privacy policy also states that the company uses audio recordings of voice commands for training. It should go without saying that anyone using Meta’s glasses needs to be very careful about sharing their interactions in the Meta AI app, as a number of users appear to have inadvertently shared deeply personal conversations with the world.
If this makes you uncomfortable, I’m not here to convince you otherwise! We are still grappling with the long-term privacy implications of generative AI, not to mention generative AI on wearable devices equipped with cameras. At the same time, as someone who has used Meta’s smart glasses for over four years, I can say that Meta has managed to turn what once looked like a gimmick into a really useful accessory.
