Last week, Google demonstrated Android XR glasses at its Hudson River office, showing off features such as visual assistance and gyroscopic navigation. The glasses are part of a developer kit aimed at Android developers, and Google plans to integrate them with Android phones and smartwatches by 2026. Its AI-glasses strategy covers two device types: one built around audio and camera features, and another that adds a display for visual cues.

Developer Preview 3 of the Android XR SDK is set to launch soon and will support a wide range of existing third-party Android apps. In the demo, the glasses displayed navigation routes and driver information for Uber rides, and Gemini, Google's assistant, began providing contextual information as soon as the glasses were put on.

On the hardware side, the Samsung Galaxy XR headset is gaining new features such as PC Connect and a travel mode, while Xreal's Project Aura glasses offer a 70-degree field of view and access to Android apps. Project Aura's anticipated price could be around ,000, with a potential launch late next year.
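To make the "existing apps run on XR" claim concrete, here is a minimal sketch of how a phone app's Compose UI might be hosted in a spatial panel using the Jetpack Compose for XR library (androidx.xr.compose). The package paths, the RideStatusScreen composable, and the panel dimensions are illustrative assumptions, not code from Google or Uber, and exact signatures may change between developer previews.

```kotlin
// Hypothetical sketch: reusing an existing phone UI inside a spatial panel on Android XR.
// Package paths and signatures follow the Jetpack Compose for XR developer preview as
// currently documented; treat them as assumptions subject to change.
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.width

@Composable
fun RideStatusScreen() {
    // Stand-in for a screen an existing phone app already ships unchanged.
    Text("Driver arriving in 3 minutes")
}

@Composable
fun XrRideStatus() {
    // On an XR device, the same composable is wrapped in a spatial panel
    // rather than rewritten for the headset or glasses form factor.
    Subspace {
        SpatialPanel(
            modifier = SubspaceModifier
                .width(1024.dp)
                .height(640.dp)
        ) {
            RideStatusScreen()
        }
    }
}
```

The point of the sketch is that the phone-sized composable stays untouched; only a thin XR-specific wrapper decides how it is placed in space, which is how a broad set of third-party apps could carry over with little or no modification.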