Last week, at Google’s Hudson River office, I had the opportunity to don a pair of Android XR glasses and hold a conversation with Gemini while walking around the room. Unlike the stylish Warby Parker and Gentle Monster models showcased at Google I/O in May, these were part of a developer kit destined for Android developers around the globe. The demonstrations, which included visual assistance and gyroscopic navigation, unfolded swiftly and impressively. At one point I asked for a fruit salad recipe using the pasta on the shelf; Gemini recognized what was actually in front of me and recommended a classic tomato sauce instead, a small but telling display of its reasoning and the glasses’ multimodal capabilities.
As my briefing concluded, I transitioned seamlessly from the Android XR glasses to Samsung’s Galaxy XR headset and an upcoming model from Xreal known as Project Aura. This fluidity between devices, many of which will pair with Android phones and smartwatches for added functionality, is one of Google’s ambitious goals for 2026. From my perspective, that future can’t arrive soon enough.
Google’s vision for AI glasses is two-fold
Google’s strategy for AI glasses encompasses two distinct forms: one that focuses solely on audio and camera features, akin to Meta’s Ray-Bans, and another that incorporates a display for visual cues and interactive interfaces, similar to Meta’s Ray-Ban Display. This competitive landscape is certainly heating up, but Google possesses a significant advantage even before launch—a robust software ecosystem. Developer Preview 3 of the Android XR SDK, complete with APIs, is set to debut this week.
This ecosystem isn’t limited to Google’s own popular applications like Gmail, Meet, and YouTube, the way Meta’s is anchored to Messenger, Instagram, and WhatsApp. It also draws on a wealth of existing third-party Android apps, home screen widgets, and hardware products that are expected to transition smoothly to the Android XR operating system. I got a glimpse of this integration when I requested an Uber ride from the Google office to a highly rated pizzeria in Staten Island. The glasses not only displayed a navigation route to my pickup location but also projected the driver’s information as I approached. This functionality, derived directly from the native Uber app for Android, illustrates how readily developers could build for the wearable platform.
Another noteworthy aspect of my demo was how Gemini provided contextual information the moment I donned the glasses. Rather than having to inquire about my location, the weather, or the various objects around me, the Android XR experience commenced with a summary of relevant information and an invitation for follow-up questions. This thoughtful approach enhances the natural flow of interaction with the assistant.
Galaxy XR gets better, but I’m more drawn to this headset
During my exploration, I also revisited the Samsung Galaxy XR headset, this time equipped with new features: PC Connect, which syncs with a Windows PC or laptop for a more immersive viewing experience; travel mode, for improved stability while in motion; and Likeness, a digital avatar generator reminiscent of Apple’s Spatial Personas. As a Windows user, I found PC Connect particularly engaging, since it let me project a much larger virtual screen while playing Stray. The wireless controller inputs felt responsive, and the image held a steady refresh rate throughout.
However, what truly captured my attention was the more portable and comfortable pair of Xreal glasses, dubbed Project Aura. First announced at Google I/O, these wired wearables convinced me on first use that comfortable face computers are closer than we might think. Project Aura offers a generous 70-degree field of view, aided by Xreal’s lens tinting, which dims the surroundings to make the display appear brighter. Running the same Android XR platform as the Galaxy XR headset, it supports pinch and swipe gestures, simultaneous viewing of multiple floating windows (including via PC Connect), and access to the Android apps and services already on users’ phones.
The pressing question surrounding Project Aura remains its price. Xreal’s current extended reality glasses sell for a few hundred dollars, and with Project Aura’s added computing power and features, one might anticipate a launch price closer to the four-figure mark. While Google and Xreal have not yet disclosed an official release date, indications suggest these glasses could arrive late next year.
My experience with Google’s Android XR demos reinforces the notion that competition in the wearable computing sector is intensifying, driven by practical hardware and software integration rather than mere speculative concepts. The core strength of Google’s approach lies not only in Gemini’s capabilities but also in its ability to leverage the established Android ecosystem—a prospect that should resonate positively with developers. Although my demos encountered a few hiccups, as is common with beta products, the vision of effortlessly transitioning between diverse devices—from bulky developer kits to the sleek Project Aura—underscores Google’s commitment to flexibility. What I witnessed suggests that the company’s aspirations for seamless, multifunctional smart glasses by 2026 are not just marketing rhetoric, but a rapidly approaching reality that could transform our interaction with information and the digital realm.