Key Takeaways

  • The Recall feature in Copilot+ automatically takes screenshots and stores the OCR'd text from them in an SQLite database for later reference.
  • Cybersecurity expert Kevin Beaumont warns that this feature could pose a significant privacy risk if accessed by hackers.
  • Microsoft may face repercussions if the Recall feature is shipped as-is.

When Microsoft announced Copilot+, a lot of its proposed features caused some buzz on the internet. However, out of all the new AI-powered tools we were shown, the Recall feature generated the most conversation. Recall automatically takes screenshots as you use your computer and draws on them when you ask Copilot what you were doing in the past; you can think of it as your PC's memory of what you did.

People were concerned that Recall might be a huge privacy risk; after all, if someone gains access to these logs, they can see everything the user has done in the past few months. Microsoft was quick to alleviate user fears by stating all of the data was encrypted, but a cybersecurity expert has found plain-text logs that they believe hackers could remotely access.


A cybersecurity expert states that Recall "sets cybersecurity back a decade"


As spotted by The Verge, Kevin Beaumont took to X to explain how insecure he believes Recall is. In the thread, he states that he dug through Recall and found that its logging system is "just an SQLite database" and that the feature's existence "sets cybersecurity back a decade." He advised everyone to disable the Recall feature once Copilot+ arrives to prevent anyone from stealing their data.

Kevin also wrote an extensive blog post on Double Pulsar explaining his findings. It's well worth a read if you want the full picture of how insecure the Recall feature is, but here are a few highlights:

Every few seconds, screenshots are taken. These are automatically OCR’d by Azure AI, running on your device, and written into an SQLite database in the user’s folder.

This database file has a record of everything you’ve ever viewed on your PC in plain text. OCR is a process of looking at an image and extracting the letters.

Q. Have you exfiltrated your own Recall database?

A. Yes. I have automated exfiltration, and made a website where you can upload a database and instantly search it.

I am deliberately holding back technical details until Microsoft ship the feature as I want to give them time to do something.

I actually have a whole bunch of things to show and think the wider cyber community will have so much fun with this when generally available... but I also think that’s really sad, as real world harm will ensue.
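
To give a sense of what "just an SQLite database" of plain-text OCR records means in practice, here is a minimal, hypothetical sketch of how trivially such a file could be searched by anyone who obtains a copy of it. The file name, table name, and column names below are placeholders invented for illustration; they are not the real Recall schema, which Kevin is deliberately withholding until Microsoft ships the feature.

    import sqlite3

    # Hypothetical file name and schema -- placeholders for illustration,
    # not the actual Recall database layout.
    DB_PATH = "copied_recall.db"

    def search_ocr_text(term):
        """Return every (timestamp, text) row whose OCR'd text contains the term."""
        conn = sqlite3.connect(DB_PATH)
        try:
            return conn.execute(
                "SELECT captured_at, ocr_text FROM captures WHERE ocr_text LIKE ?",
                ("%" + term + "%",),
            ).fetchall()
        finally:
            conn.close()

    # Anyone holding a copy of the file could mine it for sensitive strings.
    for captured_at, text in search_ocr_text("password"):
        print(captured_at, text[:80])

The specifics don't matter; the point is that a plain SQLite file offers no protection of its own, so whoever can read the file can read the history it contains.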

Microsoft may have a huge privacy issue on its hands


If what Kevin says is true, Microsoft could be in a lot of hot water if it ships Recall as-is. A database this insecure would be a prime target for malicious actors looking for ways to steal its data. Kevin states in his blog post that "data can be accessed remotely," which could open the door to hackers; alternatively, we may see a new wave of malware that simply locates the database and copies it off people's computers. Are Kevin's claims baseless, or will Microsoft have to take drastic measures to prevent one of the biggest cybersecurity flaws in recent history? We'll have to wait and see.