Google's Privacy Budget, a plan to reduce the amount of information available in Chrome as a defense against browser fingerprinting, runs the risk of performing poorly, of breaking websites, and of creating a new tracking mechanism.
These are the concerns of Mozilla CTO Eric Rescorla, who recently published his analysis [PDF] of Google's Privacy Budget. Mozilla's recent testing of sponsored search suggestions in Firefox may have complicated its stance as a defender of privacy, but rival browser maker Brave raised some of the same issues in November 2019 and updated its objections last week.
Privacy Budget, in conjunction with adjacent privacy improvements like Gnatcatcher (IP address protection), aims to enhance the privacy of web users by reducing the browser fingerprinting surface – the number of data points available. With fewer details to construct an identifier, there's a greater chance the identifier will not be unique.
"Fundamentally, we want to limit how much information about individual users is exposed to sites so that in total it is insufficient to identify and track users across the web, except for possibly as part of large, heterogeneous groups," wrote Google engineering manager Brad Lassey in the project's GitHub repo.
Online marketers very much don't want that and have made Google's Privacy Sandbox a competition issue – they worry that if the Chocolate Factory limits the availability of web data, then Google will have an informational advantage due to its ability to get data through its ad ecosystem and Google Accounts.
And so they've managed to get the UK's Competition and Markets Authority to extract a series of commitments from Google that its changes will not harm competition.
But even assuming Google somehow manages to reconcile its privacy push with the pushback from the ad tech set, it may all be for naught. According to Mozilla's Rescorla, there are several significant issues that threaten Google's Privacy Budget.
First, the calculations are difficult and likely to vary, he contends, because not all values measured have the same informational content. For example, knowing someone uses Chrome doesn't have much value because so many people do, but knowing someone uses Tor (or Firefox) puts them in a much smaller set of people.
A related issue is that some values are correlated, like screen height and width, so a site need only read one to infer the other. The proposed budget calculation doesn't currently account for this.
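The intuition behind Rescorla's first objection can be sketched in a few lines. The identifying information in a fingerprinting value is its surprisal, measured in bits, which depends on how rare the value is. The market-share figures below are illustrative assumptions, not measurements:

```python
import math

def surprisal_bits(share):
    """Bits of identifying information revealed by a value,
    given the fraction of users who share that value."""
    return -math.log2(share)

# Illustrative (assumed) population shares, not real measurements.
chrome_share = 0.65    # common value: knowing "Chrome" narrows little
tor_share = 0.0005     # rare value: knowing "Tor" narrows a lot

print(f"Chrome: {surprisal_bits(chrome_share):.2f} bits")  # ~0.62 bits
print(f"Tor:    {surprisal_bits(tor_share):.2f} bits")     # ~10.97 bits

# Correlated values add less than the sum of their parts: if screen
# height is almost fully determined by screen width (say, a 16:9 panel),
# reading both yields barely more information than reading one, so
# charging the budget for each independently overstates the exposure.
```

A flat per-API charge that ignores both effects will over-count some reads and under-count others, which is why Rescorla argues the arithmetic is harder than it looks.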
Then there's the problem of multiple reads of the same fingerprinting surface. For example, Rescorla observes, Google's proposal doesn't explain how to deal with an API like window.screen, which would be called on each page load in order to size the content a site displays. If each load counts toward the Privacy Budget, the limit would quickly be reached. And if only the initial load is counted, there are still issues with values that may not be entirely static.
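The accounting dilemma can be illustrated with a toy budget tracker. This is a hypothetical sketch, not Google's actual design: charging every read exhausts the budget after a few navigations, while charging each surface only once misses values that change between reads.

```python
# Toy budget tracker (hypothetical, not Google's design) contrasting
# per-read accounting with charge-once-per-surface accounting.

class PrivacyBudget:
    def __init__(self, limit_bits):
        self.limit = limit_bits
        self.spent = 0.0
        self.charged = set()  # surfaces already charged once

    def read(self, surface, cost_bits, per_read=True):
        """Charge the budget for reading a surface; raise once exhausted."""
        if per_read or surface not in self.charged:
            if self.spent + cost_bits > self.limit:
                raise PermissionError(f"budget exceeded reading {surface}")
            self.spent += cost_bits
            self.charged.add(surface)

budget = PrivacyBudget(limit_bits=10)
# Per-read accounting: a layout that queries screen size on every page
# load burns through the budget after just a few navigations.
for _ in range(3):
    budget.read("window.screen", cost_bits=3)
print(budget.spent)  # 9.0 of 10 bits gone; the next read would fail
```

With `per_read=False`, repeated reads of `window.screen` cost nothing after the first, but then a value that changes (a window moved to a second monitor, say) leaks fresh bits the budget never sees.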
In other words, a given Privacy Budget total could mean a significantly different fingerprinting surface across different sites and users.
Then there's the matter of what happens once a website has exceeded its Privacy Budget.
"Enforcement is likely to lead to surprising and disruptive site breakage because sites will exceed the budget and then be unable to make API calls which are essential to site function," said Rescorla in a blog post. "This will be exacerbated because the order in which the budget is used is nondeterministic and depends on factors such as the network performance of various sites, so some users will experience breakage and others will not."
Finally, Rescorla cites the possibility that Google's Privacy Budget could itself become a fingerprinting surface, a concern raised by Brave's researchers, because the limitations placed on cookies recently have made fingerprinting more appealing to those intent on ducking privacy defenses.
"The Privacy Budget idea is intended to prevent this kind of cross-site tracking but because the budget itself is state that can be both manipulated and read but is not partitioned, it can be used for cross-site tracking," Rescorla said in his analysis.
He argues that rather than trying to contain fingerprinting data, Google would be better off simply limiting fingerprinting surface when new web APIs are developed while gradually removing existing fingerprinting surfaces and monitoring for abusive patterns.
In a statement emailed to The Register, a Google spokesperson said Privacy Budget is still being developed and improvements based on this sort of feedback should be expected.
"Our ultimate goal is to build a solution that restricts fingerprinting effectively without compromising key website functionality or introducing new forms of tracking," said Google's spokesperson. "We appreciate Mozilla’s engagement throughout this process as we all work to build a more private web without third party cookies and other forms of invasive tracking. This is our collaborative process working as intended." ®