It's becoming a familiar feeling: that moment of delight when information, a video, a photo, navigation directions, or even a live translation floats before my eyes. I've seen it with the Meta Ray-Ban Display and Meta's Orion glasses. This is the new state of smart glasses, and I was excited to finally see Android XR join the party.
This week marks a crucial turning point in Google's Android XR journey, which began somewhat inauspiciously with the Samsung Galaxy XR headset.
A careful first look
We started with the prototype Android XR monocular frames, which stood out only because they were a bit bulkier than traditional glasses. To accommodate my eyesight, corrective inserts had been pre-installed directly behind the lenses; they looked like typical clear lenses.
Google uses waveguide technology for both the single-display and dual-display versions. Essentially, tiny projectors embedded in the edges of the frames beam an image into the lens, and waveguides then direct it to your eyes, creating the illusion of a floating screen or floating images.
One of the tricks is that bright images, like the time and temperature, can float before your eyes without ever obscuring your vision. Instead, you simply shift focus from near (to see the AR imagery) to far (to see what's actually in front of you).
This, by the way, contrasts with the high-resolution micro-displays used by most mixed reality headsets, such as the Vision Pro and Galaxy XR. With those, you never see through a lens; instead, the entire world, both real and augmented, is presented to you on stereo micro-screens.
Google has kept both prototypes thin, light and comfortable, leaving most of the computing duties to a connected Google Pixel 10 (it plans to make them work with iPhones, too). This seems to be the preferred strategy for these types of smart glasses that you wear all the time, and frankly, it makes sense. Why try to recreate the processing power of a phone in your glasses when you will almost certainly have your phone with you? It’s a marriage that, in my short experience, works.
Android XR
Google calls these frames "AI Glasses," based on the always-available assistance Google Gemini can provide. There will be models without a display that listen to your voice and answer through built-in speakers. During my demos, I found that having Gemini close at hand through audio interactions, even with the display off, can be genuinely useful.
However, there is something very appealing about lenses with integrated displays.
Even though the monocular display shows images to only one eye, your brain quickly adjusts and interprets the visuals as if they were shown to both eyes, even when they sit slightly left of center.
My first experience was viewing the weather and temperature; I could focus on the readout or look past it to ignore it. You can, of course, also turn the display off.
In some ways, the resulting experience was very similar to the one I had with the Meta Ray-Ban Display a few months ago.
The frames come with cameras that allow you to show Gemini your world or share it with others.
I summoned Gemini by pressing and holding the stem of the glasses and asked it to find music that matched the mood of the room. I also asked it to play a David Bowie Christmas song. A small YouTube playback widget floated before my eyes, cued up to the Bowie/Bing Crosby version of "Little Drummer Boy," which I could hear through the speakers built into the glasses. Google executives told me they didn't need to write any special code to get it to appear in this format.
Another time, I looked at a shelf full of groceries and asked Gemini to suggest a meal based on the available ingredients. It spotted cans of tomatoes, so naturally it suggested a tomato sauce. I took every opportunity to interrupt, redirect, or question Gemini, and it handled everything with ease and patience, like someone used to dealing with rude customers.
Taking a photo is easy, but the glasses can also tap Gemini's models and add AI enhancements with Nano Banana Pro.
I looked at a nearby windowsill and asked Gemini to fill the space with teddy bears. Like most requests, this one traveled from the glasses to the phone to Google's cloud, where Nano Banana Pro quickly did its job.
Within seconds, I saw a photorealistic image of teddy bears resting sweetly on the windowsill. The imagery was consistently sharp and clear, yet it never completely obstructed my view.
A member of the Google team called the glasses via Google Meet; I answered and could see their video feed. Then I shared my own view with them.
One of the most surprising demonstrations came when a Chinese speaker entered the room and the glasses automatically recognized his language and translated it on the spot. I could hear his words translated into English through the speakers, and I could also read them before my eyes. The speed and apparent accuracy were astonishing.
Of course, the glasses can also serve as an excellent heads-up navigation system. I asked Gemini to find a nearby museum, and when we settled on the Museum of Illusions (optical, not magical), I asked it for turn-by-turn directions. When I looked up, I could see where to turn; when I looked down, I could see my position on the map and the direction I was heading.
Google has partnered with Uber to bring this experience indoors. They showed me how the system could help me navigate an airport using Uber's navigation data.
Double vision
I then tried the dual-display prototypes. They didn't seem any bigger or heavier than the monocular versions, but they offered a completely different viewing experience.
First, you get a wider field of view, and since it has two screens (one for each eye), you get instant stereo vision.
This gives you 3D views of cities on maps that shift depending on your viewing angle. The frames can convert any image or video to 3D, although I found some of the results a little odd; I'll always prefer spatial content that was actually filmed in 3D.
It's handy in Google Maps, where the two displays render interior imagery in 3D when you enter a location.
Xreal's Project Aura
Google’s Android XR team gave me a quick, hands-on demo of Project Aura.
Xreal is the leader in display glasses that essentially create 200-inch virtual screens from any USB-C connected device (phones, laptops, gaming systems), but Project Aura is a different beast. Like the Samsung Galaxy XR, it's a standalone Android XR device, an in-your-face computer in the form of lightweight glasses. At least, that's the promise.
Like the Xreal One, the glasses use Sony micro-displays that project images into your eyes through thick prisms. These glasses, too, had my prescription lenses pre-installed so I could see clearly during the demo.
Unlike the AI Glasses prototypes, they offer a clear and relatively wide 70-degree field of view. Like the Samsung Galaxy XR, Aura uses gesture control, but since there are no cameras tracking your eyes, I had to deliberately point at and pinch on-screen elements. And because this is an Android XR system, the control metaphors, menus, and interface elements are, for better or worse, identical to what I found on the Samsung Galaxy XR.
We used it to quickly connect to a nearby computer and enjoy a big-screen productivity experience.
My favorite part was a tabletop gaming demo. It was a sort of Dungeons & Dragons-style card game in which I could use one or both hands to move and resize the rather detailed 3D game board. When I turned my hand over, half a dozen cards fanned out in front of me, and I could pick up any virtual card and examine it. There were little game pieces on the board that I could pick up and inspect. The whole time, the compute unit hung from my belt.
Unlike the AI Glasses prototypes, Project Aura is still considered an "episodic" device, one you wear occasionally, generally at home, at the office, or maybe on the flight home.
The Galaxy XR is growing
The original Android XR device, the Galaxy XR, is also getting a minor update this cycle.
I first tried Likenesses on the Galaxy XR, Google's answer to Personas on the Vision Pro. Here, though, you capture your image with your phone, which creates an animated copy of your face and shoulders.
I joined a Google Meet call with a Google representative and could see his eerily realistic likeness, which tracked his movements, including his hands. We could also share a Canva screen and collaborate.
Google told me that even people who aren't using a headset like the Galaxy XR will still see your Likeness when you join a call.
Where do we go from here?
While all of these demos were exciting, we're still at least a few months away from commercial availability of Project Aura, and the same goes for the Android XR AI display glasses.
Google hasn’t revealed much about battery life or how close it is to the useful (and relatively affordable) AI Display Frames with its partners Warby Parker and Gentle Monster.
But from what I have seen, we are closer than I previously thought. Additionally, the ecosystem of wearable and connected devices, such as flagship phones, Pixel watches, computers and app stores, appears to be converging.
I think we're nearly done with goggle-style headsets that are too expensive for episodic use. The era of AI-powered AR glasses is arriving, and I'm more than ready for it.
