AppFunctions

AppWizard
March 30, 2026
Google has introduced early beta features for Android that push the platform toward an "agent-first" operating system built around task completion. The centerpiece is AppFunctions, a Jetpack API that lets developers expose self-describing capabilities from their applications so AI agents can invoke them directly, with tasks executing on-device to protect user privacy and keep performance high. Conceptually, AppFunctions resemble the capabilities a backend declares via a cloud-hosted MCP server, but they run locally on the device. Alongside this, Google introduced a UI automation platform that lets agents carry out complex tasks in apps without any integration work from developers; through the Gemini Assistant, users can complete tasks such as placing a pizza order or coordinating a rideshare. Privacy and user control are emphasized: interactions are designed to execute on-device, and sensitive tasks require explicit user confirmation. These features are currently in early beta, available exclusively on the Galaxy S26 series, with broader deployment planned for Android 17.
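To make the model concrete, the sketch below shows roughly what exposing such a capability could look like on the developer side. It assumes the annotation-based surface from the androidx.appfunctions Jetpack preview (an @AppFunction annotation and an AppFunctionContext parameter); the class, function, and parameter names here are purely illustrative, and the API may change before a stable release.

```kotlin
import androidx.appfunctions.AppFunction
import androidx.appfunctions.AppFunctionContext

// Illustrative example only: a notes app exposing a "create note" capability
// that an on-device agent such as Gemini could discover and invoke locally.
class NoteAppFunctions {

    // The @AppFunction annotation (assumed from the Jetpack preview) marks this
    // method as a self-describing capability; its schema is derived from the
    // signature, so the calling agent knows which parameters to supply.
    @AppFunction
    fun createNote(
        appFunctionContext: AppFunctionContext,
        title: String,
        content: String,
    ): String {
        // A real app would persist the note in its own storage and return its ID.
        // Everything runs in the app's own process, on the device.
        val noteId = "note-" + System.currentTimeMillis()
        return noteId
    }
}
```

Because the function body runs inside the app itself, the data involved never has to leave the device, which is what the announcement means by on-device execution.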
AppWizard
February 26, 2026
Google is bringing Android apps in line with user expectations for artificial intelligence, much as Microsoft has been doing with Windows 11. Developers have received a preview of the initiative, whose centerpiece is a new feature called AppFunctions: it lets Android apps expose public interfaces for specific functionality so AI agents and system-level services can interact with them. AppFunctions play a role analogous to the Model Context Protocol (MCP) used for cloud-based AI interconnectivity, and will be accessible through Google's Jetpack library and platform APIs, with interactions handled locally on the device. The feature is still in early development; the first examples ship in the upcoming Gemini release for the Samsung Galaxy S26 series and other Samsung devices running One UI 8.5 or higher, where Gemini uses AppFunctions to work with Calendar, Notes, and Tasks. Google is also launching an early preview via a beta feature in the Gemini app, available on the Galaxy S26 series and select Pixel 10 devices, that lets users delegate tasks to AI agents by double-pressing the power button; the initial rollout focuses on food delivery, grocery, and rideshare apps in the US and Korea. AppFunctions are expected to land in Android 17, with a stable release anticipated around mid-year.
AppWizard
February 26, 2026
Google has introduced early-stage developer capabilities for Android aimed at connecting applications with intelligent agents and personalized assistants, specifically Google Gemini, while prioritizing privacy and security. The key feature is AppFunctions, introduced with Android 16, which lets applications expose specific capabilities to agent apps so tasks can be executed directly on the device. Developers define app functions for AI assistants to cover use cases such as task management, media creation, cross-app workflows, and calendar scheduling. A practical example is the Samsung Gallery app: a user can ask Gemini for specific photos, and Gemini invokes the appropriate function to retrieve them. Google is also advancing a UI automation framework that lets AI agents carry out generic tasks across applications with minimal coding from developers. Further expansion of these capabilities is planned for Android 17, with Google collaborating with select app developers to refine the experience.
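Under the same assumptions as the sketch above (the annotation surface from the androidx.appfunctions preview), a Gallery-style retrieval function might look roughly like the following; the photo model, parameters, and function name are invented for illustration and are not taken from Samsung's actual implementation.

```kotlin
import androidx.appfunctions.AppFunction
import androidx.appfunctions.AppFunctionContext

// Hypothetical photo description returned to the agent. How structured results
// are declared in the real library may differ from this plain data class.
data class PhotoResult(
    val uri: String,
    val takenAtMillis: Long,
    val description: String,
)

class GalleryAppFunctions {

    // Illustrative "find photos" capability: Gemini could translate a request
    // like "show me last week's beach photos" into a call with a query string
    // and a date range, and the app resolves it against its on-device library.
    @AppFunction
    fun findPhotos(
        appFunctionContext: AppFunctionContext,
        query: String,
        afterMillis: Long,
        beforeMillis: Long,
    ): List<PhotoResult> {
        // A real gallery app would search its local index here; this stub
        // returns an empty list to keep the example self-contained.
        return emptyList()
    }
}
```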