Microsoft's Recall Feature Draws Criticism From Privacy Advocates

Published: 23/11/2024   Category: security




Despite Microsoft's reassurances, multiple security researchers describe the technology as problematic for users and their organizations.



Microsoft's plan to introduce an artificial intelligence-powered Recall feature in its Copilot+ PC lineup has evoked considerable privacy concerns. But the extent to which those concerns are fully justified remains an open question.
Recall is technology that Microsoft has described as enabling users to easily find and remember whatever they might have seen on their PC. It works by taking periodic snapshots of a user's screen, analyzing those images, and storing them in a way that lets the user search for things they might have seen in apps, websites, documents, and images using natural language.
As Microsoft explains it, "With Recall, you can access virtually what you have seen or done on your PC in a way that feels like having photographic memory."
Copilot+ PCs will organize information based on relationships and associations unique to each user, according to the company: "This helps you remember things you may have forgotten so you can find what you're looking for quickly and intuitively by simply using the cues you remember."
Default configurations of Copilot+ PCs will ship with enough storage to hold up to three months' worth of snapshots, with the option to increase that allocation.
In introducing the technology, Microsoft pointed to several measures the company says it has implemented to protect user privacy and security. Recall will store all the data it captures only locally on the user's Copilot+ PC, in fully encrypted fashion. It won't save audio or continuous video, and users will have the ability to disable the feature entirely. They also can pause it temporarily, filter out apps and websites they might not want saved as snapshots, and delete Recall data at any time.
Microsoft will give enterprise admins the ability to automatically disable Recall via Group Policy or mobile device management (MDM) policy. Doing so will ensure that individual users in an enterprise setting cannot save screenshots and that all saved screenshots on a user's device are deleted, according to Microsoft.
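For administrators who want to script that control rather than wait for a Group Policy rollout, a minimal sketch is below. The registry path and the DisableAIDataAnalysis value name are assumptions drawn from the WindowsAI policy area and may differ in shipping builds; verify them against Microsoft's current policy documentation before relying on this.

```python
# Minimal sketch: set a machine-wide policy intended to disable Recall snapshot saving.
# ASSUMPTION: the policy lives under the WindowsAI policy key with a DWORD named
# DisableAIDataAnalysis. Requires Windows and administrative rights.
import winreg

POLICY_KEY = r"SOFTWARE\Policies\Microsoft\Windows\WindowsAI"   # assumed path
POLICY_VALUE = "DisableAIDataAnalysis"                          # assumed value name


def disable_recall_snapshots() -> None:
    """Create the policy key if needed and set the assumed disable flag to 1."""
    with winreg.CreateKeyEx(
        winreg.HKEY_LOCAL_MACHINE, POLICY_KEY, 0, winreg.KEY_SET_VALUE
    ) as key:
        winreg.SetValueEx(key, POLICY_VALUE, 0, winreg.REG_DWORD, 1)


if __name__ == "__main__":
    disable_recall_snapshots()
    print("Recall snapshot-saving policy set to disabled (assumed policy name).")
```

In managed fleets the same setting would normally be delivered through Group Policy or an MDM configuration profile rather than a script; the sketch only illustrates what the underlying switch might look like.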
"You are always in control with privacy you can trust," Microsoft said.
No Recall data will ever go back to Microsoft, and none of the accumulated data will be used for AI training purposes, according to the company.
Such reassurances, however, have done little to assuage an outpouring of concern from several quarters, including the UK's Information Commissioner's Office (ICO), about the potential privacy and security risks associated with Recall. The company's own admission that Recall will happily take and save screenshots of sensitive information, such as passwords and financial account numbers, without doing any content moderation has fueled those concerns.
Security researcher Kevin Beaumont encapsulated the issues in a blog post this week that described Recall as a new security nightmare for users. His biggest concern, which many others have expressed as well, is that the Recall database on a user's machine will be a goldmine for attackers, containing passwords, bank account information, Social Security numbers, and other sensitive data.
"With Recall, as a malicious hacker you will be able to take the handily indexed database and screenshots as soon as you access a system, including [three] months' history by default," Beaumont wrote. Information stealers will have access to data in the clipboard, as well as everything else a user did in the preceding three months. "If you have malware running on your PC for only minutes, you have a big problem in your life now rather than just changing some passwords," he stated.
In addition to Recall data being a big target for attackers, there's also some concern over what kind of access, if any, Microsoft will have to it. Microsoft's assurances that Recall will remain strictly on a user's device have done little to alleviate those concerns. The ICO has asked for more transparency from Microsoft regarding Recall.
"Industry must consider data protection from the outset and rigorously assess and mitigate risks to people's rights and freedoms before bringing products to market," the ICO said in a statement.
Gal Ringel, co-founder and CEO at Mine, describes the Recall feature as an affront to user privacy and an assault on best practices for both security and privacy.
"Beyond its particularly invasive nature, the fact that there are no restrictions in place to censor or conceal sensitive data, such as credit card numbers, personally identifiable information, or company trade secrets, is a major slip-up in product design that presents risks far beyond cybercriminals," he says.
As a tech giant, Microsoft has resources that most enterprises lack for processing and storing large amounts of unstructured data safely and efficiently, Ringel says.
Collecting thousands, if not millions, of screenshots that could contain data protected under various global data privacy regulations is like playing with fire, he notes, and he suggests that Microsoft make the feature opt-in rather than enabling it by default.
Recall's continuous screenshot capture could expose sensitive data if a device is compromised, says Stephen Kowski, field CTO at SlashNext. Even though Microsoft has built in encryption and other security measures to mitigate the risk of unauthorized access to the locally stored Recall data, organizations should consider their own risk profiles when using the technology, he says.
"Microsoft is heading in the right direction with its controls, such as the ability to pause Recall, exclude certain apps, and use encryption, which provides important user protections," Kowski says. "However, to enhance privacy further, Microsoft could consider additional safeguards, like automatic identification and redaction of sensitive data in screenshots, more granular exclusion options, and clear user consent flows."
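Microsoft has announced no such redaction capability, so purely as an illustration of the kind of safeguard Kowski describes, a naive sketch of pattern-based redaction over text extracted from a snapshot might look like the following; the patterns and function names are hypothetical.

```python
# Illustrative only: a naive pattern-based redactor for text extracted from a
# snapshot (e.g., by OCR), sketching the kind of safeguard Kowski describes.
# The patterns are simplistic and would both miss and over-match real-world data.
import re

# Hypothetical patterns for a few common sensitive-data formats.
SENSITIVE_PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
}


def redact(text: str, mask: str = "[REDACTED]") -> str:
    """Replace anything matching a sensitive pattern before the text is indexed."""
    for pattern in SENSITIVE_PATTERNS.values():
        text = pattern.sub(mask, text)
    return text


if __name__ == "__main__":
    sample = "Card 4111 1111 1111 1111, SSN 123-45-6789, mail jane@example.com"
    print(redact(sample))
```

Regex matching of this sort is a weak classifier on its own; a production safeguard would need to run before snapshots are indexed and use far more robust detection, which is one reason critics such as Ringel see the current absence of any filtering as a design flaw.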
In one sense, Recall's functionality is not very different from that offered by the myriad user and entity behavior analytics (UEBA) tools that many organizations use to monitor for endpoint security threats. UEBA tools can also capture, and potentially expose, sensitive data about users and their behavior.
"The big problem with Recall is that it adds additional exposure to endpoints," says Johannes Ullrich, dean of research at the SANS Institute. UEBA data collection, by contrast, is built specifically with security in mind, he says.
"Recall, on the other hand, adds an additional prize an attacker may win when attacking the endpoint," Ullrich says. "It provides a database of past activity an attacker would otherwise not have access to."
Microsoft did not respond specifically to a Dark Reading request for comment on the mounting privacy concerns. A spokesman instead pointed to the company's blog post on the privacy and control mechanisms that Microsoft said it has implemented around the technology.
