Google Unveils New Tools and Libraries for Building AI Glasses Apps

With the unveiling of the Android XR SDK Developer Preview 3, Google has expanded its tooling for building AI Glasses experiences. The release introduces two new libraries, Jetpack Projected and Jetpack Compose Glimmer, and extends ARCore for Jetpack XR with motion tracking and geospatial capabilities tailored for AI Glasses.

Enhancing App Interactivity

The newly introduced libraries let developers extend existing mobile applications to interact with AI Glasses. Apps can use the glasses’ built-in speakers, camera, and microphone, and present information on the glasses’ display when one is available.

There are many scenarios where an app might want to use AI Glasses hardware. For example, a video conferencing app could add a UI control that lets the user switch their video stream from the phone’s camera to the glasses’ camera, offering a first-person point of view.

The first library, Jetpack Projected, facilitates the projection of an app’s XR experience from a host device, such as an Android phone, to AI Glasses using audio and/or video. This library enables applications to verify the availability of a display on the target device and to wait for it to become accessible. In alignment with the standard Android permission model, applications must request permission at runtime before accessing device hardware.
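As a rough illustration, the sketch below uses the standard Android runtime-permission APIs; the display-availability step is a hypothetical stand-in, since Jetpack Projected is still in preview and its exact API surface may differ.

    import android.Manifest
    import androidx.activity.ComponentActivity
    import androidx.activity.result.contract.ActivityResultContracts

    class GlassesAwareActivity : ComponentActivity() {

        // Standard Android runtime-permission flow; required before touching
        // the glasses' camera or microphone.
        private val requestCameraPermission =
            registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
                if (granted) startProjectionWhenDisplayAvailable()
            }

        fun connectToGlasses() {
            requestCameraPermission.launch(Manifest.permission.CAMERA)
        }

        private fun startProjectionWhenDisplayAvailable() {
            // Hypothetical placeholder: Jetpack Projected's actual API for
            // checking and awaiting display availability is not shown here.
        }
    }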

Accessing AI Glasses hardware can be achieved from both an AI Glasses activity and a standard app, provided a valid projected context is obtained. The integration of audio support is straightforward, as the AI Glasses audio device functions as a typical Bluetooth audio device.
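Because the glasses surface as an ordinary Bluetooth audio device, routing call audio to them can rely on the platform’s standard AudioManager APIs rather than anything glasses-specific. A minimal sketch, assuming Android 12 (API 31) or later:

    import android.content.Context
    import android.media.AudioDeviceInfo
    import android.media.AudioManager

    // Route communication audio to a connected Bluetooth headset-style device,
    // such as a pair of AI glasses (standard platform API, Android 12+).
    fun routeAudioToGlasses(context: Context): Boolean {
        val audioManager = context.getSystemService(AudioManager::class.java)
        val glasses = audioManager.availableCommunicationDevices.firstOrNull {
            it.type == AudioDeviceInfo.TYPE_BLUETOOTH_SCO ||
                it.type == AudioDeviceInfo.TYPE_BLE_HEADSET
        }
        return glasses?.let { audioManager.setCommunicationDevice(it) } ?: false
    }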

Capturing photos or videos through the glasses’ camera is more involved, however. It requires instantiating several classes to check hardware availability, configure the capture, and bind the camera to the activity lifecycle so that it operates in sync with the activity state.
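The Projected camera classes themselves are still in preview, but the pattern described here, checking availability, configuring the capture, and binding it to a lifecycle, mirrors CameraX’s lifecycle binding, shown below purely as an illustration of that pattern:

    import android.content.Context
    import androidx.camera.core.CameraSelector
    import androidx.camera.core.Preview
    import androidx.camera.lifecycle.ProcessCameraProvider
    import androidx.core.content.ContextCompat
    import androidx.lifecycle.LifecycleOwner

    // CameraX-style lifecycle binding: the camera starts and stops
    // automatically with the bound activity or fragment.
    fun bindCamera(context: Context, lifecycleOwner: LifecycleOwner) {
        val providerFuture = ProcessCameraProvider.getInstance(context)
        providerFuture.addListener({
            val provider = providerFuture.get()
            val preview = Preview.Builder().build()
            provider.bindToLifecycle(
                lifecycleOwner,
                CameraSelector.DEFAULT_BACK_CAMERA,
                preview
            )
        }, ContextCompat.getMainExecutor(context))
    }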

Creating Augmented Experiences

On the other hand, Jetpack Compose Glimmer offers a suite of UI components and a visual language designed for crafting augmented experiences on AI Glasses equipped with a display. This new visual language employs optical see-through technology to merge visuals with the surrounding environment, prioritizing clarity, legibility, and minimal distraction. Supported components include:

  • Text
  • Icons
  • Title chips
  • Cards
  • Lists
  • Buttons

All components are built upon the foundational concept of a surface, which developers can utilize to create custom components. Glimmer components can be tailored using modifiers to adjust layout, appearance, and behavior, and can be layered along the z-axis to impart a sense of depth through shadow effects.
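To give a feel for how these pieces fit together, here is a hypothetical sketch in Compose; GlimmerSurface, TitleChip, and the depth modifier are illustrative stand-ins rather than confirmed Glimmer APIs, and the Glimmer imports are omitted since the package is still in preview.

    import androidx.compose.foundation.layout.Column
    import androidx.compose.foundation.layout.padding
    import androidx.compose.runtime.Composable
    import androidx.compose.ui.Modifier
    import androidx.compose.ui.unit.dp

    // Hypothetical sketch: GlimmerSurface, TitleChip, Text, and depth() stand in
    // for Glimmer's surface primitive, title chip, text component, and z-axis
    // layering; the actual component names may differ.
    @Composable
    fun NavigationGlance() {
        GlimmerSurface(
            modifier = Modifier
                .padding(8.dp)
                .depth(level = 1) // hypothetical: raises the surface, adding a shadow
        ) {
            Column {
                TitleChip(text = "Navigation") // hypothetical title chip component
                Text(text = "Turn left in 200 m")
            }
        }
    }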

Additionally, Google has introduced an AI Glasses emulator within Android Studio, enabling developers to preview UI designs and simulate user interactions, including touchpad and voice input.

Expanding ARCore Capabilities

In conjunction with these advancements, Google has broadened the capabilities of ARCore for Jetpack XR, a comprehensive set of APIs designed to facilitate the creation of augmented experiences. The latest iteration allows for the retrieval of planar data, anchoring content to fixed locations in space, and more. Notably, the update introduces support for motion tracking, enabling the glasses to respond dynamically to user movements, as well as geospatial pose, which allows content to be anchored to locations mapped by Google Street View.
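A sketch of what retrieving planes and anchoring content to one of them could look like follows; the names are modeled on the ARCore for Jetpack XR preview documentation, but as this is a developer preview, the exact signatures may change.

    import androidx.xr.arcore.Anchor
    import androidx.xr.arcore.AnchorCreateSuccess
    import androidx.xr.arcore.Plane
    import androidx.xr.runtime.Session

    // Collect detected planes and anchor content to the first one found.
    // API names follow the ARCore for Jetpack XR preview and may change.
    suspend fun anchorToFirstPlane(session: Session) {
        Plane.subscribe(session).collect { planes ->
            val plane = planes.firstOrNull() ?: return@collect
            val result = Anchor.create(session, plane.state.value.centerPose)
            if (result is AnchorCreateSuccess) {
                // result.anchor now tracks a fixed location in space.
            }
        }
    }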

Developers eager to explore these new features can access the Android XR SDK Developer Preview 3 in Android Studio Canary, after upgrading to the latest emulator version (36.4.3 Canary or later).

About the Author

Sergio De Simone


