June 16, 2024

Microsoft Recall AI Privacy Concerns Exposed by Simple Python Script

With multi-modal generative AI being one of the hottest topics of the last few years, it was natural that Microsoft would want to integrate it into as many nooks and crannies of Windows as feasible. However, shortly after its announcement of the Recall feature for Copilot+ systems on May 20th, a simple Python script published on GitHub showed that Microsoft made one pretty crucial mistake: The data Recall stores is not secure at all.

Some Background on Microsoft Recall

Microsoft Recall is a feature in the company’s new Copilot+ PCs that come with an ARM version of Windows 11. The chips in these computers are designed to deliver as much performance as possible at the lowest possible power draw, making them capable of integrating AI into the user’s workflow without requiring remote servers to handle requests.

A man recalling a memory with a bright lightbulb overhead
Image credit: SDXL

Recall is supposed to add to the Windows experience by taking periodic screenshots of your system while you’re using it, then processing all the information through a computer vision AI model that interprets the elements present on your screen. This information can later be used to help you sift through your session and look back at things you did earlier.

As long as the content on your screen keeps changing, Recall takes and saves a screenshot every five seconds; otherwise, it waits for a change before capturing again.

What Happened?

On June 7th, GitHub user Alexander Hagenah (xaitax) published a Python script called TotalRecall that runs locally and sifts through all the data that is supposed to be stored securely.

A piece of a Python script that exploits Microsoft's Recall feature

We took a look at the script itself, and in only 164 lines of code, all it had to do was find an SQLite .db file sitting inside Recall’s CoreAIPlatform.00 folder on the target system.
After it locates the file, the script simply opens and reads it. That’s it. There’s no hacker magic. Most of the code is spent defining and checking storage paths for the script to extract the files correctly. There is no exploit here because all the files Recall stores are just out in the open and unencrypted.
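To illustrate just how little “hacking” is involved, here is a minimal sketch in the same spirit as TotalRecall (the function name is ours, and the database path is taken as a parameter because it varies per machine). Python’s standard library opens the database directly, since nothing about it is encrypted:

```python
import sqlite3

def list_recall_tables(db_path):
    """Open Recall's unencrypted SQLite database and list its tables.

    No exploit is needed: the stock sqlite3 module reads the file
    directly, because no password or key protects it.
    """
    con = sqlite3.connect(db_path)
    try:
        rows = con.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
        ).fetchall()
        return [name for (name,) in rows]
    finally:
        con.close()
```

Pointed at the .db file, this returns every table name in the database; dumping the actual rows is just one more SELECT statement away.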

It turns out that despite Microsoft touting an entire section about “Built-in security” in their overview of Recall’s privacy, all it takes to crack open the Recall database is to navigate to a folder and type a few SQL commands. All the images are also stored in a subfolder called “ImageStore” in plain view.
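The same goes for the screenshots: because ImageStore is just an ordinary folder, enumerating its contents takes only a couple of lines (the directory layout here follows the article’s description; the function name is ours):

```python
from pathlib import Path

def list_imagestore(recall_dir):
    """Enumerate the screenshot files Recall keeps in plain view.

    The files are ordinary, unencrypted images sitting in a
    subfolder named "ImageStore" under Recall's data directory.
    """
    store = Path(recall_dir) / "ImageStore"
    return sorted(p.name for p in store.iterdir() if p.is_file())
```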

A Cause for Worry?

The prospect that there are unencrypted screenshots of things you’ve done (including passwords you may have typed in plaintext) stored without any protection in your file system is scary, but the situation isn’t as catastrophic as it may sound.

A red alert strobe light
Image credit: SDXL

The concerns over privacy are still valid. Giving users such easy access to Recall’s files without some barriers makes it very easy for skilled social engineers to convince less tech-savvy people to compress the CoreAIPlatform.00 folder into a ZIP file without a second thought and hand all of that data over.

Malicious applications can extract information from UserActivity events through the Recall system API without tripping any alarms. It might even be possible to perform these kinds of data captures without elevated privileges, meaning users would not even be notified when it happens.

However, there’s one silver lining here: Accessing all of this data remotely is still extremely difficult to do without the user’s knowledge or consent. For all the faults in this system, Recall data is still stored locally on Copilot+ machines, which ship with some of the most secure default settings of any Windows-powered system.

All compatible devices use Windows Device Encryption or BitLocker by default and come with Enhanced Sign-in Security through device PINs and biometric data. To be fair, the latter is still fairly easy to bypass with simple password-resetting techniques that have worked for over a decade.

This still doesn’t let Microsoft off the hook, but it at least explains why the company was confident that it wouldn’t be the worst thing in the world to provide an unencrypted SQLite database and store all the images taken by Recall in local folders without additional encryption.

How to Fix This

If you’re using a Copilot+ system and have concerns about Recall’s effect on your system’s security, you can disable Recall by going to your Settings menu and clicking through Privacy & security -> Recall & snapshots and disabling the Save snapshots option. If you don’t see this option, then it’s very likely you don’t have Recall on your system.
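For managed machines, Recall can reportedly also be turned off through Group Policy rather than the Settings app. To the best of our knowledge, the policy maps to the registry value below (treat the exact key and value names as an assumption, as Microsoft may change them):

```
Key:   HKCU\Software\Policies\Microsoft\Windows\WindowsAI
Value: DisableAIDataAnalysis (REG_DWORD) = 1
```

Setting this value to 1 disables the saving of snapshots for the affected user account.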

As for what Microsoft should do: Recall’s data should live in a space that is sandboxed away from access by locally run applications and scripts. If a simple Python script can slurp up the data and sort it into folders of its own making, the same script could just as easily upload all of that data to a server farm somewhere. Access should at least be gated behind a User Account Control challenge, if not fully encrypted into a binary blob, as almost every other critical component of the system is.

Image Credit: SDXL, all screenshots by Miguel Leiva-Gomez


Miguel Leiva-Gomez

Miguel has been a business growth and technology expert for more than a decade and has written software for even longer. From his little castle in Romania, he presents cold and analytical perspectives to things that affect the tech world.
