Microsoft Research chief scientist has no issue with Windows Recall
As tool emerges to probe OS feature's SQLite-based store of user activities
Asked to explore the data privacy issues arising from Microsoft Recall, the Windows maker's poorly received self-surveillance tool, Jaime Teevan, chief scientist and technical fellow at Microsoft Research, brushed aside concerns.
Teevan was speaking on Wednesday with Erik Brynjolfsson, director of the Stanford Digital Economy Lab, at the US university's Institute for Human-Centered Artificial Intelligence's fifth anniversary conference.
Brynjolfsson said when Recall was announced, there was "kind of a backlash against all the privacy challenges around that. So, talk about both the pluses and minuses of using all that data and some of the risks that creates and also some of the opportunities."
This was clearly a popular topic.
Of course we are rethinking what data means and how we use it, how we value it, how it gets used
"Yeah, and so it's a great question, Erik," said Teevan. "This has come up throughout the morning as well – the importance of data. And this AI revolution that we're in right now is really changing the way we understand data."
She continued, "Microsoft generally helps large enterprises manage their data, create data, share data, and that data is really something that makes the business of work different in the context of generative AI.
"And as individuals too, we have important data, the data that we interact with all the time, and there's an opportunity to start thinking about how to do that and to start thinking about what it means to be able to capture and use that. But of course we are rethinking what data means and how we use it, how we value it, how it gets used."
The Register noted when Recall was introduced at Microsoft Build last month that the software – which builds an archive of screenshots taken every few seconds and logs user activities, so that past actions can be recalled – presents a significant privacy risk. As recently described by author Charlie Stross, it is "the product nobody wanted" and "an utter privacy shit-show."
- Analysts join the call for Microsoft to recall Recall
- Windows 11's Recall feature is on by default on Copilot+ PCs
- Giving Windows total recall of everything a user does is a privacy minefield
- Was there no one at Microsoft who looked at Recall and said: This really, really sucks
Undaunted by Teevan's unwillingness to acknowledge why Recall struck a nerve, Brynjolfsson probed further.
"Is it stored locally?" he asked. "So suppose I activate Recall, and I don't know if I can, but when you have something like that available, I would be worried about all my personal files going up into the cloud, Microsoft, or whatever. Do you have it kept locally?"
Erik Brynjolfsson and Jaime Teevan on stage at the Stanford HAI conference this week
Teevan responded, "Yeah, yeah, so this is a foundational thing that we as a company care a lot about is actually the protection of data. So Recall is a feature which captures information. It's a local Windows functionality, nothing goes into the cloud, everything's stored locally."
And that was that, as if continuously recording one's computing activities in a series of screenshots and activity logs has no security or privacy implications so long as the data is kept local and protected by Microsoft Account credentials. That's not much of a reassurance in light of the release of security researcher Alex Hagenah's tool Total Recall, code that can extract and display data from Recall's unencrypted SQLite database, in which the operating system "feature" stores snapshots of user activity.
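For the curious, here is a minimal Python sketch of what such a tool boils down to once the unencrypted database file is in hand: no exploit required, just the standard library. The path, table, and column names below are placeholders for illustration, not Recall's actual layout.

```python
import sqlite3
from pathlib import Path

# Hypothetical location for a Recall-style SQLite store; the real path,
# filename, and schema are assumptions made purely for illustration.
DB_PATH = Path.home() / "AppData" / "Local" / "Recall" / "snapshots.db"


def dump_recent_snapshots(db_path: Path, limit: int = 20) -> None:
    """Print the most recent activity records from an unencrypted SQLite file."""
    # Open read-only so the database is left untouched.
    conn = sqlite3.connect(f"file:{db_path.as_posix()}?mode=ro", uri=True)
    try:
        rows = conn.execute(
            # 'snapshots', 'captured_at', 'window_title', and 'ocr_text' are
            # placeholder names, not Recall's actual schema.
            "SELECT captured_at, window_title, ocr_text "
            "FROM snapshots ORDER BY captured_at DESC LIMIT ?",
            (limit,),
        )
        for captured_at, window_title, ocr_text in rows:
            print(f"[{captured_at}] {window_title}: {(ocr_text or '')[:80]!r}")
    finally:
        conn.close()


if __name__ == "__main__":
    dump_recent_snapshots(DB_PATH)
```

The point is not the specific schema but that plain SQLite with no encryption at rest offers nothing to slow down anyone, or any malware, able to read the file.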
Meanwhile, security researchers and analysts continue to pile on, calling for Recall – due to be released later this month – to be forgotten.
As Stross argues, Windows PCs with Recall will be targeted by lawyers during discovery proceedings because they will provide access not just to email messages but also to conversations in any messaging or collaboration app, and possibly to spoken conversations if speech-to-text data gets captured by Redmond's activity logger. It's also handy for a system intruder to use to snoop on what their victim has been up to lately, personally and for work.
"It's a shit-show for any organization that handles medical records or has a duty of legal confidentiality; indeed, for any business that has to comply with GDPR (how does Recall handle the Right to be Forgotten? In a word: badly), or HIPAA in the US," he wrote in his post.
"This misfeature contravenes privacy law throughout the EU (and in the UK), and in healthcare organizations everywhere which has a medical right to privacy."
Referring to Recall's ability to avoid capturing DRM'd content, the sci-fi scribe continued: "About the only people whose privacy it doesn't infringe are the Hollywood studios and Netflix, which tells you something about the state of things." ®
Also at Stanford: To solve AI energy crisis, 'rethink the entire stack from electrons to algorithms,' says physics prof.