Google Unveils Android XR Glasses with Live Translation

After months of speculation, Google finally revealed its prototype Android XR smart glasses. The live demo at Google I/O wasn’t perfect, but the real-time translation feature wowed the crowd.

Right after the keynote, I explored the Shoreline Amphitheater to try the glasses myself.

As with my brief time with Project Moohan—Google and Samsung's XR headset—I got only about five minutes. These weren't sleek designer frames. They were the same Samsung-built prototypes shown on stage.

At first glance, they looked like regular black glasses, similar to Meta’s Ray-Bans. One side was slightly bulkier, but the tech was neatly tucked in.

A screen is embedded in the lens. When the glasses powered on, a small floating box appeared, showing the time and weather at the top of my view.

A button on the right stem snapped a photo. The image briefly enlarged in my vision, creating a more immersive experience than Meta’s screen-less Ray-Bans.

Google says the glasses will support messaging, calls, and translation. I didn’t test those, but a Google rep demonstrated the navigation feature—and it was impressive.

Navigation cues appeared subtly at the top of my view, like “turn right in 500 feet.” I didn’t need to look down at a phone or watch.

For more detail, I could glance down to see a mini map that moved with my head. In a city like New York, this would let me walk normally, check directions at a glance, and view the full route when stopped. That’s a big leap forward.

The screen quality was decent, though I didn’t test it in direct sunlight. The demo took place in a small, controlled room.

It’s still early. Google is working with several brands, and developers will get access later this year.

Meanwhile, the Project Moohan headset is expected to launch soon. Samsung will release the final version, which could help attract third-party support and user feedback.

Gemini, Google’s AI assistant, worked just as well on the glasses as it did on Project Moohan. I asked for the weather and got an audio forecast. I also had it analyze a painting and find book reviews and purchase options.

Having Gemini in the glasses feels powerful. It’s not just the voice—it’s how it integrates with the screen and the broader Google ecosystem.

Samsung’s prototype didn’t win me over on design. But I’m optimistic about future versions from brands like Warby Parker, Xreal, and Gentle Monster. With more variety, one of them might strike the right balance between style and function.

I’ve used Meta Ray-Bans to capture moments like walking my dog or riding theme park attractions. I also liked the original Snapchat Spectacles, though their novelty wore off quickly. Both had limited features.

Android XR already feels more capable—even after just five minutes.

Gemini in glasses doesn’t feel like science fiction. It’s here, and it’s promising. Now I’m eager to see how Meta responds—and what Apple might bring next.
