Microsoft’s Recall feature, unveiled in May 2024 as a headline addition to Copilot+ PCs, promised Windows 11 users an AI-driven “photographic memory.” The idea was to tackle information overload by using the advanced neural processing units in those PCs to turn vague searches into precise results. The original design, however, raised serious privacy and security concerns, drawing a wave of criticism from experts who labeled it a “privacy nightmare.”
In light of that backlash, Microsoft halted the planned preview of Recall and sent the entire feature back to its developers for a comprehensive redesign. The past four months have been spent addressing those concerns, culminating in a recent blog post by David Weston, Microsoft’s VP of Enterprise and OS Security. The post is notable for its transparency, offering a detailed account of the extensive changes to Recall’s security architecture.
Recall will work only on Copilot+ PCs running Windows 11
Microsoft has clarified that Recall will exclusively be available on Copilot+ PCs, which must adhere to the secured-core standard. The feature will only activate if Windows verifies that the system drive is encrypted and that a Trusted Platform Module (TPM version 2.0) is enabled. The TPM serves as the foundation for a secure platform, managing the keys necessary for data encryption and decryption.
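For readers who want to verify those prerequisites on their own hardware, the minimal sketch below shells out to two built-in Windows command-line tools, manage-bde and tpmtool, and looks for the relevant status strings. The exact output text can vary by Windows build, and this is only an illustration of the checks an administrator might run, not how Windows itself gates the feature.

```python
import subprocess

def run(cmd: list[str]) -> str:
    """Run a Windows command and return its output (needs an elevated prompt)."""
    return subprocess.run(cmd, capture_output=True, text=True).stdout

# BitLocker status for the system drive: manage-bde reports
# "Protection Status: Protection On" when the volume is encrypted and protected.
bitlocker = run(["manage-bde", "-status", "C:"])
drive_encrypted = "Protection On" in bitlocker

# TPM details: tpmtool lists whether a TPM is present and its spec version.
tpm = run(["tpmtool", "getdeviceinformation"])
tpm_ok = "TPM Present: True" in tpm and "2.0" in tpm

print(f"System drive encrypted: {drive_encrypted}")
print(f"TPM 2.0 available:      {tpm_ok}")
```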
Additionally, Recall will leverage several core security features inherent to Windows 11, such as Virtualization-Based Security, Hypervisor-enforced Code Integrity, and Kernel DMA Protection. It will also utilize Measured Boot and System Guard Secure Launch to prevent Recall from functioning if the machine has not booted securely, thereby protecting against early boot attacks. While security researchers may still discover ways to test Recall on incompatible hardware, such attempts are expected to be significantly more challenging than in the earlier leaked versions.
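The state of Virtualization-Based Security and HVCI is exposed through the Win32_DeviceGuard WMI class, which the hedged sketch below queries via PowerShell. The numeric codes follow Microsoft’s documentation for that class; again, this is a convenience check for curious admins, not the mechanism Recall uses internally.

```python
import json
import subprocess

# Win32_DeviceGuard (namespace root\Microsoft\Windows\DeviceGuard) reports the state
# of Virtualization-Based Security and the security services running under it.
cmd = (
    "Get-CimInstance -Namespace root\\Microsoft\\Windows\\DeviceGuard "
    "-ClassName Win32_DeviceGuard | "
    "Select-Object VirtualizationBasedSecurityStatus, SecurityServicesRunning | "
    "ConvertTo-Json"
)
result = subprocess.run(
    ["powershell", "-NoProfile", "-Command", cmd],
    capture_output=True, text=True, check=True,
)
info = json.loads(result.stdout)

# Per Microsoft's documentation: status 2 means VBS is enabled and running,
# and a value of 2 in SecurityServicesRunning means HVCI (memory integrity) is active.
services = info.get("SecurityServicesRunning") or []
if isinstance(services, int):          # a single value may not serialize as a list
    services = [services]
print(f"VBS running:  {info.get('VirtualizationBasedSecurityStatus') == 2}")
print(f"HVCI running: {2 in services}")
```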
Recall will be opt-in only
Addressing one of critics’ primary concerns, Microsoft has confirmed that Recall will be strictly opt-in. During setup of a Copilot+ PC, users will see a clear choice to enable or disable the saving of snapshots. If they do not actively opt in, the feature stays off and no snapshots are captured or stored.
Moreover, users of OEM and retail versions of Windows 11 (Home and Pro) will have the option to completely remove Recall via the Optional Features settings. In contrast, for Windows 11 Enterprise users, Recall will not be included in the standard installation; administrators must deploy it separately and enable it through Group Policy or other management tools. Even then, users will need to authenticate using Windows Hello biometrics on supported hardware to activate the feature.
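As a sketch of what that removal and policy control can look like in practice, the commands below (wrapped in Python for consistency with the other examples) query and disable the optional feature, then set the snapshot-saving policy in the registry. The feature name “Recall” and the DisableAIDataAnalysis policy value reflect publicly documented names at the time of writing but should be treated as assumptions that may change, and everything here requires an elevated prompt.

```python
import subprocess

# Check whether the Recall optional feature is installed, then remove it.
subprocess.run(["dism", "/online", "/get-featureinfo", "/featurename:Recall"], check=False)
subprocess.run(["dism", "/online", "/disable-feature", "/featurename:Recall"], check=False)

# The "turn off saving snapshots" policy maps to the DisableAIDataAnalysis value
# under the WindowsAI policy key; setting it to 1 keeps Recall from capturing anything.
subprocess.run([
    "reg", "add", r"HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsAI",
    "/v", "DisableAIDataAnalysis", "/t", "REG_DWORD", "/d", "1", "/f",
], check=False)
```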
New privacy settings add extra control over personal data
To enhance user control over personal data, Microsoft has introduced an icon in the system tray that alerts users each time a Recall snapshot is saved, along with the option to pause the feature. Certain types of content will be excluded from Recall snapshots by default, including any browsing conducted in private sessions across supported browsers such as Edge, Chrome, Firefox, and Opera. Users will also have the ability to filter out specific applications and websites.
Recall is designed to automatically filter out sensitive information, including passwords, credit card numbers, and national ID numbers. The underlying library for this feature mirrors the one utilized by enterprises that subscribe to Microsoft’s Purview information protection product. If the analysis phase determines that a snapshot contains sensitive data or originates from a filtered app or website, the entire snapshot will be discarded, ensuring that its contents are not saved to the Recall database. Additional configuration tools will allow users to retroactively delete snapshots from specific time ranges or applications.
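Microsoft hasn’t published the internals of that filtering pipeline, but the toy Python sketch below illustrates the overall decision the company describes: a snapshot coming from an excluded app or site, or containing text that matches a sensitive-data pattern, is discarded in its entirety. The app names, site list, and regexes here are invented placeholders; the real feature relies on the Purview classification library rather than hand-written patterns.

```python
import re

# Illustrative stand-ins; the real exclusion lists are user-configurable in Recall's settings.
EXCLUDED_APPS = {"KeePass.exe", "1Password.exe"}
EXCLUDED_SITES = {"mybank.example.com"}

# Rough patterns for the kinds of content Recall says it filters; Microsoft's
# actual detection uses a classification library, not simple regexes.
SENSITIVE_PATTERNS = [
    re.compile(r"\b(?:\d[ -]?){13,16}\b"),          # possible payment card number
    re.compile(r"\bpassword\s*[:=]\s*\S+", re.I),   # credentials visible as text
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),           # US SSN-style national ID
]

def should_store(app: str, url: str | None, ocr_text: str) -> bool:
    """Return True only if an entire snapshot is safe to write to the store.
    Any single hit discards the whole snapshot, mirroring the all-or-nothing rule."""
    if app in EXCLUDED_APPS:
        return False
    if url and any(site in url for site in EXCLUDED_SITES):
        return False
    return not any(p.search(ocr_text) for p in SENSITIVE_PATTERNS)

print(should_store("notepad.exe", None, "meeting notes for Tuesday"))       # True
print(should_store("chrome.exe", "https://mybank.example.com/login", ""))   # False
print(should_store("notepad.exe", None, "password: hunter2"))               # False
```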
Recall’s security architecture leverages core Windows features
The initial announcement of Recall raised alarms about potential vulnerabilities to both local and remote attacks. The revamped architecture adds multiple layers of protection to address those risks. Setting up Recall requires biometric authentication tied to the user’s account, ensuring that only the authenticated user can perform Recall searches and other operations.
Furthermore, snapshot data and the vector database used for searching stored snapshots are encrypted. Accessing these databases also requires biometric verification, and all operations occur within a secure environment known as a Virtualization-based Security Enclave (VBS Enclave). This design prevents unauthorized users from accessing decryption keys, thereby safeguarding the contents of the database. The services managing snapshots and the associated database are isolated, making them less susceptible to interference from other processes, including malware. Additional safeguards, such as rate-limiting and anti-hammering measures, are in place to thwart brute-force attacks.
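To make the shape of that design concrete, here is a deliberately simplified Python model that assumes nothing about Microsoft’s actual implementation: snapshots are persisted only as ciphertext, the key is exercised only through an authenticated call (standing in for the Windows Hello check and the enclave-guarded keys), and repeated failures trigger a lockout in the spirit of the rate-limiting and anti-hammering measures. It uses the third-party cryptography package.

```python
import time
from cryptography.fernet import Fernet

class SnapshotVault:
    """Toy model of an encrypted snapshot store: data is handled only in
    encrypted form, and decryption happens solely through a gate that demands
    authentication and rate-limits failed attempts."""

    MAX_FAILURES = 3
    LOCKOUT_SECONDS = 30

    def __init__(self) -> None:
        self._key = Fernet.generate_key()     # in Recall, key material stays inside the VBS Enclave
        self._fernet = Fernet(self._key)
        self._store: list[bytes] = []
        self._failures = 0
        self._locked_until = 0.0

    def add_snapshot(self, blob: bytes) -> None:
        self._store.append(self._fernet.encrypt(blob))   # only ciphertext is persisted

    def search(self, authenticated: bool) -> list[bytes]:
        # Anti-hammering: refuse all requests during a lockout window.
        if time.monotonic() < self._locked_until:
            raise PermissionError("too many failed attempts, try again later")
        if not authenticated:                 # stand-in for a Windows Hello prompt
            self._failures += 1
            if self._failures >= self.MAX_FAILURES:
                self._locked_until = time.monotonic() + self.LOCKOUT_SECONDS
            raise PermissionError("authentication required")
        self._failures = 0
        return [self._fernet.decrypt(c) for c in self._store]

vault = SnapshotVault()
vault.add_snapshot(b"screenshot bytes + extracted text")
print(vault.search(authenticated=True))
```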
Microsoft conducted security reviews
To bolster confidence in Recall’s security, Microsoft has conducted multiple internal reviews of the new architecture. The Microsoft Offensive Research and Security Engineering (MORSE) team has performed red-team testing, and an independent security vendor has been engaged to conduct a thorough security design review and penetration test. Additionally, Microsoft has completed a Responsible AI (RAI) Impact Assessment to evaluate risks, harms, and mitigations across its six RAI principles: Fairness, Reliability & Safety, Privacy & Security, Inclusiveness, Transparency, and Accountability. The company has also committed to offering bug bounties for verified reports of serious security issues.
Will it satisfy critics?
The early missteps surrounding Recall have understandably left security experts skeptical. But the new announcement provides substantial detail, and the Insider testing phase set to begin in October will offer a valuable opportunity for further feedback. That input will be crucial in shaping Microsoft’s AI initiatives, and the company’s leadership, including CEO Satya Nadella, will likely be watching closely.