Microsoft Recall Can Still Potentially Screenshot Your Sensitive Information

Earlier this year, Microsoft faced significant challenges with its Recall feature, designed to provide users with an auto-screenshotting “photographic memory” for Windows 11 Copilot+ PCs. The rollout was postponed due to serious security vulnerabilities that raised concerns among users and experts alike.

Now, as Recall makes its return, albeit with some limitations, users are discovering that it may not offer the level of security they expect, particularly when it comes to sensitive financial information. The feature, which began reaching select testers last month, became widely available on December 6 for all users with a Copilot+ PC participating in the Windows Insider beta program. Initially introduced in May, Recall was met with scrutiny after security researchers highlighted the ease with which AI transcript logs could be accessed, prompting Microsoft to implement encryption and a Windows Hello login requirement for access.

Security Features and Limitations

The new iteration of Recall aims to enhance user protection by automatically detecting and ceasing the screenshotting of sensitive information, such as bank details. However, reports from Avram Piltch at Tom’s Hardware indicate that the filters designed to prevent the capture of sensitive data may not be functioning as effectively as intended. For instance, the feature reportedly continued to screenshot a Notepad document containing credit card information and even captured a fabricated loan application PDF.

Despite these shortcomings, Piltch noted that the filter did successfully block screenshots on two payment sites he tested. In a blog post dated November 22, Microsoft encouraged users to specify websites that Recall should avoid capturing, emphasizing the importance of user feedback in refining the feature. The company stated, “If you find sensitive information that should be filtered out for your context, language, or geography, please let us know through Feedback Hub.”

Gizmodo reached out to Microsoft for further clarification but has yet to receive a response. As the company navigates the complexities of AI recognition, it must account for various scenarios where users input sensitive information. While all screenshots are secured behind a Windows Hello login, the potential for unauthorized access remains a concern.

Recall is currently an opt-in feature, disabled by default for users who load it on a PC enrolled in the Insider channel. As a beta product, some rough edges are to be expected. In its November 22 blog post, Microsoft warned users that screenshots would not be saved if they installed Build 26120.2415 after activating the Windows beta build.

As someone who has been experimenting with Recall, I've found that the true value of this feature can only be assessed over time. Users need to accumulate a substantial number of screenshots, and let their own memory of that period fade, before the utility of such a tool becomes apparent. It is also worth noting that Copilot+ PCs do not ship with the AI models pre-installed; they must be downloaded after signing up for the Insider build.

As highlighted by The Verge, the notion that one’s work, conversations, and online activities are continuously recorded can be unsettling. Perhaps more surprising than the ongoing issues requiring resolution is the fact that Microsoft initially intended to launch Recall six months ago without the extensive refinements now deemed necessary.
