Microsoft has announced significant security upgrades for its Recall AI feature, which creates a record of user activity on their PCs. Following concerns raised by security researchers about the potential for hackers to exploit the tool, Microsoft has implemented several measures to enhance its security.
In a recent interview with Bloomberg, David Weston, Microsoft's vice president for enterprise and operating system security, said the company heard the critiques “loud and clear” and set about devising layers of security safeguards for Recall designed to thwart even the world’s most sophisticated hackers.
How Microsoft is changing Recall
In the upcoming version of Recall, users will have more control over the data collected: they can exclude specific apps or websites, sensitive content filtering will be enabled by default, and private browsing sessions will no longer be saved. Additionally, Recall can be activated only through biometric authentication, and collected data will be stored in an isolated environment.
Microsoft emphasizes that Recall is designed to help users navigate their PC’s history more efficiently. However, the company has taken steps to address security vulnerabilities. Sensitive data will be encrypted and stored locally on the user’s machine, requiring biometric authentication for access. Recall will also have a built-in timeout feature to prevent unauthorized access.
The revised version of Recall will be available in beta form next month. While it will be available on Copilot+ PCs, businesses will need to opt in to use it on their machines.