Android XR apps will have camera permissions similar to Android phones

In December of last year, Google unveiled Android XR, its innovative extended reality operating system designed for virtual and mixed reality headsets. Shortly thereafter, Samsung introduced its first prototype headset, known as Project Moohan. The commercial launch of Android XR is slated for 2025, featuring a headset developed by Samsung alongside a pair of smart glasses crafted by Google’s DeepMind subsidiary. While Google has kept many details under wraps, insights from developers suggest that Android XR apps will have camera permissions strikingly similar to those found on Android smartphones.

Android XR app developers can request permission to access the headset’s cameras

Developer Antony Vitillo, also known as Skarred Ghost, asked Google about camera access on Android XR headsets. A company representative confirmed that, as with any Android application, developers will be able to use existing camera frames, with user consent, for XR experiences. That includes access to the “main world-facing camera system” and the “main selfie-camera system”, which points at the user’s face. This mirrors Android smartphones, where developers can tap into both the rear and front cameras. Notably, competing platforms such as Meta’s Quest 3 and Apple’s Vision Pro do not currently let third-party developers access their cameras directly.
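Google’s comments imply the standard Android runtime-permission flow will carry over to headsets. As a rough sketch, assuming nothing XR-specific beyond the ordinary CAMERA permission, a headset app would declare and request access the same way a phone app does:

```kotlin
// Sketch only: assumes Android XR reuses the ordinary CAMERA runtime
// permission, as Google's statement implies. Declared in the manifest first:
//   <uses-permission android:name="android.permission.CAMERA" />
import android.Manifest
import android.content.pm.PackageManager
import android.widget.Toast
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity
import androidx.core.content.ContextCompat

class XrCameraActivity : AppCompatActivity() {

    // Standard Activity Result API: launching this shows the system consent dialog.
    private val requestCamera =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            if (granted) startCameraFeed() else showCameraRationale()
        }

    fun ensureCameraAccess() {
        val alreadyGranted = ContextCompat.checkSelfPermission(
            this, Manifest.permission.CAMERA
        ) == PackageManager.PERMISSION_GRANTED

        if (alreadyGranted) startCameraFeed()
        else requestCamera.launch(Manifest.permission.CAMERA)
    }

    private fun startCameraFeed() {
        // With consent granted, camera frames (world-facing or selfie, per
        // Google's description) can be read through the usual camera APIs.
    }

    private fun showCameraRationale() {
        Toast.makeText(this, "Camera access is needed for passthrough features",
            Toast.LENGTH_LONG).show()
    }
}
```

The permission classes and the Activity Result API above are existing, standard Android components; whether headsets add any XR-specific permission on top of them is not something Google has confirmed.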

Android XR will give devs access to your living room feed

According to sources, Google plans to grant developers access to the living room feed via the cameras on the Android XR headset. This feature will enable devices like Samsung’s Project Moohan to analyze the user’s surroundings, allowing mixed-reality games and applications to adapt based on the context of the environment. Developers will be able to leverage this feed to fine-tune their applications, enhancing the overall mixed-reality experience.

The Android Developers website indicates that developers can request access to “Scene Understanding”, which includes functionalities such as “Light estimation”. This allows for the projection of camera passthrough feeds onto mesh surfaces and enables ray casting against identifiable elements within the environment. Furthermore, developers can seek advanced tracking capabilities through the rear camera of the Android XR headsets, facilitating the detection of “hand joint poses and angular and linear velocities”. This feature aims to create a “mesh representation of the user’s hands”, potentially enhancing immersion and enjoyment in hand-tracked VR gaming experiences.
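Google has not published the final API surface for hand tracking, so the following is purely a hypothetical sketch: `HandTrackingSession`, `HandJoint`, and `HandJointPose` are illustrative stand-ins, not real Android XR types. It only shows the shape of the data the documentation describes, per-joint poses plus angular and linear velocities:

```kotlin
// Hypothetical sketch: these types are illustrative stand-ins, not published
// Android XR APIs. They model "hand joint poses and angular and linear
// velocities" as described on the Android Developers website.
data class Vector3(val x: Float, val y: Float, val z: Float)

data class HandJointPose(
    val position: Vector3,        // joint position in world space
    val linearVelocity: Vector3,  // "linear velocities" from the docs
    val angularVelocity: Vector3, // "angular velocities" from the docs
)

enum class HandJoint { WRIST, THUMB_TIP, INDEX_TIP /* ... more joints ... */ }

class HandTrackingSession {
    // In a real SDK this map would be fed by the headset's tracking cameras.
    private val latest = mutableMapOf<HandJoint, HandJointPose>()

    fun update(joint: HandJoint, pose: HandJointPose) { latest[joint] = pose }

    // Example use: detect a fast index-finger motion from fingertip velocity,
    // the kind of signal a hand-tracked VR game might build gestures from.
    fun isFastIndexMotion(threshold: Float = 1.5f): Boolean {
        val tip = latest[HandJoint.INDEX_TIP] ?: return false
        val v = tip.linearVelocity
        return (v.x * v.x + v.y * v.y + v.z * v.z) > threshold * threshold
    }
}
```

The real SDK would presumably also expose the “mesh representation of the user’s hands” mentioned above; this sketch stops at the joint-level data the article quotes.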

While Android XR-based devices will offer “basic” hand tracking functionalities by default—encompassing actions like “pinching, poking, aiming, and gripping”—Google is expected to unveil additional details about its XR platform in the coming weeks.
