In response to security concerns, Microsoft is detailing how it has overhauled its controversial AI-powered Recall feature, which captures screenshots of nearly everything you see or do on a computer. Recall was originally supposed to debut with Copilot Plus PCs in June, but Microsoft has spent the past few months reworking the security behind it to make it an opt-in experience that you can now fully remove from Windows if you want.
“I’m actually really excited about how nerdy we got on the security architecture,” says David Weston, vice president of enterprise and OS security at Microsoft, in an interview with The Verge. “I’m excited because I think the security community is going to get how much we’ve pushed [into Recall].”
One of Microsoft’s first big changes is that the company isn’t forcing people to use Recall if they don’t want to. “There is no more on by default experience at all — you have to opt into this,” says Weston. “That’s obviously super important for people who just don’t want this, and we totally get that.”
A Recall uninstall option initially appeared on Copilot Plus PCs earlier this month, and Microsoft said at the time that it was a bug. It turns out that you will indeed be able to fully uninstall Recall. “If you choose to uninstall this, we remove the bits from your machine,” says Weston. That includes the AI models that Microsoft is using to power Recall.
Security researchers initially found that the Recall database — which stores snapshots of your computer taken every few seconds — wasn’t encrypted, meaning malware could potentially have accessed it. Everything sensitive in Recall, including its database of screenshots, is now fully encrypted. Microsoft is also leaning on Windows Hello to protect against malware tampering.
The encryption in Recall is now bound to the Trusted Platform Module (TPM) that Microsoft requires for Windows 11, so the keys are stored in the TPM and the only way to get access is to authenticate through Windows Hello. The only time Recall data is even passed to the UI is when the user wants to use the feature and authenticates via their face, fingerprint, or PIN.
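The gating described above — decryption keys sealed in the TPM and released only after a Windows Hello check — can be illustrated with a minimal sketch. All names here are hypothetical, the "TPM" is simulated in software, and the toy cipher stands in for real encryption; this models the access pattern, not Microsoft's implementation:

```python
import hashlib
import os

class SimulatedTPM:
    """Stand-in for the hardware TPM: the sealed key is only released
    to a session that has passed a Windows Hello-style check."""
    def __init__(self):
        self._sealed_key = os.urandom(32)   # never exposed without auth

    def unseal(self, session_authenticated: bool) -> bytes:
        if not session_authenticated:
            raise PermissionError("Windows Hello authentication required")
        return self._sealed_key

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher for illustration only -- not real cryptography.
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

tpm = SimulatedTPM()
snapshot = b"raw screenshot bytes"
encrypted = xor_stream(tpm.unseal(session_authenticated=True), snapshot)

# Without proof of presence, the key never leaves the "TPM":
try:
    tpm.unseal(session_authenticated=False)
    denied = False
except PermissionError:
    denied = True
```

The key point the sketch captures is that the ciphertext on disk is useless on its own: decryption requires the key, and the key requires authentication.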
“To turn it on to begin with, you actually have to be present as a user,” says Weston. That means you have to use a fingerprint or your face to set up Recall before being able to use the PIN support. This is all designed to prevent malware from accessing Recall data in the background, as Microsoft requires a proof of presence through Windows Hello.
“We’ve moved all of the screenshot processing, all of the sensitive processes into a virtualization-based security enclave, so we actually put it all in a virtual machine,” explains Weston. That means there’s a UI app layer that has no access to raw screenshots or the Recall database, but when a Windows user wants to interact with Recall and search, it will generate the Windows Hello prompt, query the virtual machine, and return the data into the app’s memory. Once the user closes the Recall app, what’s in memory is destroyed.
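That separation — raw data locked inside the enclave, the UI layer holding only authenticated query results that are destroyed on close — can be sketched as follows. The class and method names are hypothetical; this models the data flow the article describes, not the actual Windows code:

```python
class RecallEnclave:
    """Stand-in for the VBS enclave: the only layer that sees raw snapshots."""
    def __init__(self, snapshots):
        self._snapshots = list(snapshots)    # raw data never leaves directly

    def query(self, hello_ok: bool, term: str):
        # Every query is gated on Windows Hello-style proof of presence.
        if not hello_ok:
            raise PermissionError("Windows Hello proof of presence required")
        return [s for s in self._snapshots if term in s]


class RecallApp:
    """UI layer: holds only query results, and only while the app is open."""
    def __init__(self, enclave: RecallEnclave):
        self._enclave = enclave
        self._results = []

    def search(self, term: str):
        self._results = self._enclave.query(hello_ok=True, term=term)
        return list(self._results)

    def close(self):
        self._results.clear()    # per the article: in-memory data is destroyed


app = RecallApp(RecallEnclave(["email draft", "bank statement", "meeting notes"]))
hits = app.search("bank")
app.close()                      # results no longer live in app memory
```

The design choice worth noting is that the UI process never holds the database, only transient answers to specific, authenticated questions.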
“The app outside the virtualization-based enclave is running in an anti-malware protected process, which would basically require a malicious kernel driver to even access,” says Weston. Microsoft is detailing its Recall security model and exactly how its VBS enclave works in a blog post today. It all looks a lot more secure than what Microsoft had planned to ship and even hints at how the company might secure Windows apps in the future.
So, how did Microsoft nearly ship Recall in June without this level of security in the first place? I’m still not super clear on that, and Microsoft isn’t giving much away. Weston confirms that Recall was reviewed as part of the company’s Secure Future Initiative, introduced last year, but as a preview product, it apparently operated under different restrictions. “The plan was always to follow Microsoft basics, like encryption. But we also heard from people who were like, ‘We’re really concerned about this.’” As a result, the company decided to fast-track some of the additional security work it was already planning for Recall, so that security concerns wouldn’t be a factor in whether someone wanted to use the feature.
“It’s not just about Recall, in my opinion we now have one of the strongest platforms for doing sensitive data processing on the edge and you can imagine there are lots of other things we can do with that,” hints Weston. “I think it made a lot of sense to pull forward some of the investments we were going to make and then make Recall the premier platform for that.”
Recall will also now only operate on a Copilot Plus PC, stopping people from sideloading it onto other Windows machines like we saw ahead of its planned debut in June. Recall will verify that a Copilot Plus PC has BitLocker, virtualization-based security, Measured Boot, System Guard Secure Launch, and kernel DMA protection enabled.
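The platform checks above amount to an allowlist of required protections. A minimal sketch, with hypothetical names (the real verification happens inside Windows, not in application code):

```python
# Hypothetical labels for the protections the article lists as required.
REQUIRED_PROTECTIONS = {
    "bitlocker",
    "virtualization_based_security",
    "measured_boot",
    "system_guard_secure_launch",
    "kernel_dma_protection",
}

def recall_can_run(enabled_protections: set) -> bool:
    """Recall refuses to run unless every required protection is enabled."""
    return REQUIRED_PROTECTIONS <= enabled_protections

full_stack = recall_can_run(REQUIRED_PROTECTIONS)          # all present
partial = recall_can_run({"bitlocker", "measured_boot"})   # missing several
```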
Microsoft has also conducted a number of reviews on the upgraded Recall security. The Microsoft Offensive Research Security Engineering (MORSE) team has “conducted months of design reviews and penetration testing on Recall,” and a third-party security vendor “was engaged to perform an independent security design review” and testing, too.
Now that Microsoft has had more time to work on Recall, there are some additional changes to the settings that provide even more control over how the AI-powered tool works. You’ll now be able to filter out specific apps from Recall, alongside the ability to block a custom list of websites from appearing in the database. Sensitive content filtering, which allows Recall to filter out things like passwords and credit card numbers, will also block health and financial websites from being stored. Microsoft is also adding the ability to delete a time range, all content from an app or website, or everything stored in Recall’s database.
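The new controls boil down to filtering at capture time and deletion by app, website, or time range. A rough model, with all names hypothetical and the storage simplified to tuples:

```python
from datetime import datetime

class RecallStore:
    """Illustrative model of the new Recall controls, not the real feature."""
    def __init__(self, blocked_apps=(), blocked_sites=()):
        self.blocked_apps = set(blocked_apps)
        self.blocked_sites = set(blocked_sites)
        self.snapshots = []      # (timestamp, app, site) tuples

    def capture(self, when, app, site=None):
        # Snapshots from filtered apps or blocked websites are never stored.
        if app in self.blocked_apps or site in self.blocked_sites:
            return False
        self.snapshots.append((when, app, site))
        return True

    def delete_range(self, start, end):
        # Delete everything captured inside a time range.
        self.snapshots = [s for s in self.snapshots
                          if not (start <= s[0] <= end)]

    def delete_app(self, app):
        # Delete all content from a given app.
        self.snapshots = [s for s in self.snapshots if s[1] != app]

store = RecallStore(blocked_apps={"Signal"}, blocked_sites={"mybank.example"})
store.capture(datetime(2024, 10, 1, 9), "Edge", "news.example")    # stored
store.capture(datetime(2024, 10, 1, 10), "Signal")                 # filtered
store.capture(datetime(2024, 10, 1, 11), "Edge", "mybank.example") # filtered
```

Filtering at capture time, rather than at display time, means excluded content never lands in the encrypted database at all.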
Microsoft says it remains on track to preview Recall with Windows Insiders on Copilot Plus PCs in October, meaning Recall won’t be shipping on these new laptops and PCs until it has been further tested by the Windows community.