visual assistance

AppWizard
December 8, 2025
Last week, Google demonstrated its Android XR glasses at its Hudson River office, showcasing features such as visual assistance and gyroscopic navigation. The glasses are part of a developer kit for Android developers, and Google aims to integrate them with Android phones and smartwatches by 2026. Its strategy for AI glasses covers two device types: one focused on audio and camera features, and another that adds a display for visual cues. Developer Preview 3 of the Android XR SDK is set to launch soon, supporting a wide range of existing third-party Android apps. The glasses can display navigation routes and driver information for Uber rides, and Gemini, Google's assistant, begins providing contextual information as soon as the glasses are worn. Meanwhile, the Samsung Galaxy XR headset gains new features such as PC Connect and travel mode, while Xreal's Project Aura glasses offer a 70-degree field of view and access to Android apps. Project Aura's anticipated price could be around ,000, with a potential launch late next year.
AppWizard
November 4, 2025
Samsung's Galaxy AI now supports 22 languages, including Gujarati and Filipino, enhancing user experience with features like Live Translate, Interpreter, Chat Assist, and Note Assist. The Galaxy S25 series has introduced Gemini Live, allowing real-time visual assistance through Google's AI. Samsung has confirmed that core AI features will remain free of charge, despite earlier speculation about potential subscription fees.
Winsage
April 6, 2025
Microsoft introduced Copilot Vision during an event celebrating its 50th anniversary. The feature lets users point their camera at objects for real-time identification by AI, integrating OpenAI's GPT models for enhanced memory, search, personalization, and visual capabilities. Currently available in the Windows desktop app, Copilot Vision can recognize open applications without continuous monitoring, and it adapts its responses to the specific application in use, for example providing contextually relevant guidance in Blender 3D and visually indicating tools in Clipchamp. More advanced features are anticipated, though no specific timeline has been provided.
Winsage
October 2, 2024
Microsoft has introduced updates to its Copilot platform, including tools like Copilot Voice, Think Deeper, and Copilot Vision, which will initially be available only to select testing groups through Copilot Labs. Copilot Labs is designed for experimental features and collects user feedback for product enhancement. Copilot Vision enables Copilot in Microsoft Edge to visually interpret on-screen content and provide real-time voice assistance, with privacy measures in place, while Think Deeper produces more detailed responses to complex inquiries. Access to Copilot Labs is limited to paying Copilot Pro subscribers, whereas Google Labs offers a no-cost alternative for experimenting with AI features.