Microsoft’s AI Secretly Copying All Your Private Messages

Microsoft is bringing back its AI-driven Recall feature, which records nearly everything a user does on their computer by continuously taking screenshots in the background. The relaunch is exclusive to Copilot+ PCs, a line of Windows 11 machines built with hardware optimized for AI workloads. Privacy concerns, however, remain well founded.

Initially introduced last May, Recall was swiftly retracted following significant public backlash. Security experts raised alarms when it was discovered that the screenshots were stored in an unencrypted database, leaving sensitive user data vulnerable to potential cyberattacks. Since its hasty withdrawal, Recall has undergone a period of testing through Microsoft’s Insider program, albeit under scrutiny for persistent security risks.

Enhancements and Ongoing Concerns

In December, an investigation by Tom’s Hardware revealed that Recall often captured sensitive information, including credit card numbers and Social Security numbers, despite a setting designed to filter out such data. For this latest iteration, Microsoft has made several adjustments intended to make Recall safer. The screenshot database is now encrypted, and saving screenshots is opt-in rather than enabled by default, as it was before. Users can also pause Recall at any time.

While these updates are commendable, they do not mitigate the fundamental invasiveness of Recall. As noted by Ars Technica, the tool poses significant risks not only to its users but also to anyone they communicate with, as messages may be captured and processed by the AI without the knowledge of the other party involved. This capability raises serious concerns about privacy, as it could inadvertently collect a wealth of sensitive material, including personal photos, passwords, and medical information.

Security researcher Kevin Beaumont has expressed caution regarding Recall’s implications. He highlights that, while the technology is impressive from a technical standpoint, it is fraught with privacy pitfalls. Beaumont’s testing revealed that the feature’s sensitive information filter remains unreliable, and the encrypted database is only safeguarded by a basic four-digit PIN. Alarmingly, he discovered that Recall effectively indexed everything it stored, including private messages and images that were meant to be ephemeral.

In a telling example, Beaumont recounted sending a self-deleting message containing a photo of a well-known individual. Recall captured the screenshot and indexed the image under that person’s name, meaning that if the recipient had Recall enabled, the photo would have been cataloged by name and could later be exported, despite being intended to disappear.

Beaumont’s advice serves as a stark reminder of the potential risks associated with Recall: “If you’re discussing something sensitive with someone using a Windows PC, it’s prudent to first check whether they have Recall enabled.”

More on Microsoft: Microsoft’s Huge Plans for Mass AI Data Centers Now Rapidly Falling Apart
