Microsoft has undertaken a significant revision of its Recall feature for Copilot+ PCs, emphasizing the security of this self-surveillance system. The company describes Recall as a tool designed to help users “instantly and securely find what you’ve seen on your PC.”
How Recall Works
While users may not always remember their activities on their PCs, Microsoft’s Copilot AI is equipped to retain that information. Recall operates by capturing snapshots of the Windows desktop at regular intervals, documenting application usage, and storing this data for future retrieval. Users can access this visual activity log through text searches or by navigating a timeline of their activities.
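To make that mechanism concrete, here is a minimal Python sketch of the capture-and-retrieve loop: periodic snapshots tagged with a timestamp and foreground app, searchable by text or browsable as a timeline. The names (Snapshot, SnapshotStore, extracted_text) are hypothetical stand-ins for illustration, not Microsoft's implementation or APIs.

```python
import time
from dataclasses import dataclass, field

# Illustrative stand-in for Recall's snapshot pipeline; not Microsoft's code.

@dataclass
class Snapshot:
    taken_at: float          # epoch seconds when the desktop was captured
    app_name: str            # foreground application at capture time
    extracted_text: str      # text an indexer would pull out of the image

@dataclass
class SnapshotStore:
    snapshots: list[Snapshot] = field(default_factory=list)

    def add(self, snapshot: Snapshot) -> None:
        self.snapshots.append(snapshot)

    def search(self, query: str) -> list[Snapshot]:
        # Text search over everything the user has "seen"
        q = query.lower()
        return [s for s in self.snapshots if q in s.extracted_text.lower()]

    def timeline(self, start: float, end: float) -> list[Snapshot]:
        # Browse activity chronologically within a time window
        return sorted(
            (s for s in self.snapshots if start <= s.taken_at <= end),
            key=lambda s: s.taken_at,
        )

if __name__ == "__main__":
    store = SnapshotStore()
    store.add(Snapshot(time.time(), "Edge", "Flight confirmation BA-117 to Lisbon"))
    store.add(Snapshot(time.time(), "Excel", "Q3 budget draft"))
    print([s.app_name for s in store.search("lisbon")])   # -> ['Edge']
```

A production system would persist this index on disk and extract text from the screenshots themselves; the in-memory list simply keeps the example self-contained.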
Upon its initial announcement at Microsoft Build 2024 in May, Recall faced considerable criticism, with security experts labeling it a potential privacy nightmare. Security researcher Kevin Beaumont likened it to a keylogger, while author Charlie Stross raised concerns about its implications for legal discovery, as it could inadvertently record sensitive information like banking details and personal communications.
In response to the backlash, Microsoft postponed the rollout of Recall in June to reassess its approach. By August, the company announced that it had made sufficient adjustments and planned to release the feature to Windows Insiders in October.
Security Features and User Control
David Weston, Microsoft’s VP of enterprise and OS security, reassured users in a recent blog post that Recall is designed with security and privacy as top priorities. He emphasized that Recall is an opt-in feature, meaning users have the choice to enable it. Furthermore, users can completely remove Recall through the optional features settings in Windows.
For those who enable Recall, snapshots and the vector database that indexes them are encrypted, with the encryption keys protected by the PC’s Trusted Platform Module (TPM). Access to this data requires the user’s Windows Hello Enhanced Sign-in Security identity, which is tied to biometric authentication such as a fingerprint or facial recognition. Authorization is also time-limited, so a later session must re-authenticate, reducing the risk of unauthorized access to the data.
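The pattern Weston describes, data encrypted at rest under a hardware-protected key with read access gated by a time-limited, re-authenticated session, can be sketched roughly as follows. This is an illustration only: a Fernet key and a stubbed biometric check stand in for the TPM and Windows Hello, and SnapshotVault and its methods are invented for the example.

```python
import time
from cryptography.fernet import Fernet  # pip install cryptography

class SnapshotVault:
    SESSION_TTL = 300  # seconds before re-authentication is required

    def __init__(self) -> None:
        self._key = Fernet.generate_key()   # stand-in for a TPM-protected key
        self._fernet = Fernet(self._key)
        self._session_expires = 0.0

    def _authenticate(self) -> bool:
        # Stand-in for a Windows Hello prompt (fingerprint / face).
        return True

    def unlock(self) -> None:
        if not self._authenticate():
            raise PermissionError("biometric authentication failed")
        self._session_expires = time.time() + self.SESSION_TTL

    def store(self, snapshot: bytes) -> bytes:
        # Snapshots are encrypted before they touch disk.
        return self._fernet.encrypt(snapshot)

    def read(self, token: bytes) -> bytes:
        if time.time() > self._session_expires:
            # Time-limited authorization: a later session must re-authenticate.
            raise PermissionError("session expired; call unlock() again")
        return self._fernet.decrypt(token)

vault = SnapshotVault()
blob = vault.store(b"screen text: account balance $1,234")
vault.unlock()
print(vault.read(blob))
```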
Weston reiterated that “Recall is always opt-in,” clarifying that snapshots are only taken with user consent. The data is stored locally on the device and is not shared with Microsoft or third parties. Users maintain control over their data, with the ability to delete snapshots or pause the feature at any time. Any future options for data sharing will require explicit user consent.
Privacy Safeguards
Despite its name, Recall does not retain certain types of information. Private browsing sessions in supported browsers, including Edge, Chrome, and Firefox, are not recorded. Users can also designate specific apps and websites to be excluded from Recall’s monitoring. To further enhance privacy, a sensitive content filter is active by default, preventing the capture of passwords, national ID numbers, and credit card information.
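A rough sketch of how such a capture filter might behave is below. The excluded apps and sites, and the regular expressions for sensitive content, are placeholder assumptions for the example rather than the rules Recall actually applies.

```python
import re

EXCLUDED_APPS = {"KeePass", "Signal"}            # hypothetical user exclusions
EXCLUDED_SITES = {"mybank.example.com"}

# Simplistic patterns for the kinds of data a sensitive-content filter
# is meant to keep out of snapshots.
SENSITIVE_PATTERNS = [
    re.compile(r"\b(?:\d[ -]?){13,16}\b"),        # credit-card-like numbers
    re.compile(r"password\s*[:=]\s*\S+", re.I),   # visible password fields
]

def should_capture(app: str, site: str | None, private_browsing: bool) -> bool:
    if private_browsing:                 # private sessions are never recorded
        return False
    if app in EXCLUDED_APPS:
        return False
    if site is not None and site in EXCLUDED_SITES:
        return False
    return True

def looks_sensitive(extracted_text: str) -> bool:
    # If the filter trips, the snapshot is dropped rather than stored.
    return any(p.search(extracted_text) for p in SENSITIVE_PATTERNS)

print(should_capture("Chrome", "mybank.example.com", private_browsing=False))  # False
print(looks_sensitive("card 4111 1111 1111 1111"))                             # True
```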
Users control how long Recall retains content and how much disk space snapshots may occupy, and they can delete stored records by time range, app, or website, as sketched below. The data that is saved is accessed through an AI agent for search and retrieval.
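As a rough illustration of those retention controls, the following sketch ages out old snapshots, trims storage to a disk budget, and deletes records by app or website. All names and limits here are assumptions for the example, not Recall's actual settings.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class StoredSnapshot:
    taken_at: datetime
    app_name: str
    site: str | None
    size_bytes: int

def enforce_policy(snaps: list[StoredSnapshot],
                   max_age: timedelta,
                   max_total_bytes: int) -> list[StoredSnapshot]:
    now = datetime.now()
    # Drop anything older than the user's retention window.
    kept = [s for s in snaps if now - s.taken_at <= max_age]
    # Then drop the oldest snapshots until the disk-space budget is met.
    kept.sort(key=lambda s: s.taken_at)
    while kept and sum(s.size_bytes for s in kept) > max_total_bytes:
        kept.pop(0)
    return kept

def delete_matching(snaps: list[StoredSnapshot],
                    app: str | None = None,
                    site: str | None = None) -> list[StoredSnapshot]:
    # Per-app / per-website deletion, as described in the user controls.
    return [s for s in snaps
            if not ((app and s.app_name == app) or (site and s.site == site))]
```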
Weston concluded by stating, “Recall’s secure design and implementation provides a robust set of controls against known threats.” Microsoft remains committed to harnessing the power of AI while ensuring that security and privacy are upheld against even the most sophisticated attacks.