Let there be ambient light sensing, without fear of data theft
Six years on, web devs finally settle on sensor privacy defenses
Six years after web security and privacy concerns surfaced about ambient light sensors in mobile phones and notebooks, browser boffins have finally implemented defenses.
The W3C, everyone's favorite web standards body, began formulating an Ambient Light Events API specification back in 2012 to define how web browsers should handle data and events from ambient light sensors (ALS). Section 4 of the draft spec, "Security and privacy considerations," was blank. It was a more carefree time.
Come 2015, the spec evolved to acknowledge the possibility that ALS might allow data correlation and device fingerprinting, to the detriment of people's privacy. And it suggested that browser makers might consider event rate limiting as a potential mitigation.
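Rate limiting of this kind amounts to dropping sensor readings that arrive faster than some floor interval. A minimal sketch, assuming a hypothetical 200 ms interval and a simplified (timestamp, illuminance) reading shape that is not taken from the spec:

```javascript
// Sketch of the rate-limiting mitigation the 2015 draft floated:
// readings arriving sooner than a minimum interval are simply dropped.
// The 200 ms interval and reading shape are illustrative assumptions.
function makeRateLimiter(minIntervalMs) {
  let lastDeliveredMs = -Infinity;
  return function deliver(timestampMs, illuminance) {
    if (timestampMs - lastDeliveredMs < minIntervalMs) {
      return null; // too soon: suppress this reading
    }
    lastDeliveredMs = timestampMs;
    return illuminance;
  };
}

const limiter = makeRateLimiter(200);
console.log(limiter(0, 120));   // delivered: 120
console.log(limiter(100, 300)); // suppressed: null
console.log(limiter(250, 300)); // delivered: 300
```

Coarsening the timing this way starves a side channel of bandwidth without cutting off legitimate, slow-changing uses like dimming a page in a dark room.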
By 2016, it became clear that allowing web code to interact with device light sensors entailed privacy and security risks beyond fingerprinting. Dr Lukasz Olejnik, an independent privacy researcher and consultant, explored the possibilities in a 2016 blog post.
Olejnik cited a number of ways in which ambient light sensor readings might be abused, including data leakage, profiling, behavioral analysis, and various forms of cross-device communication.
"The attack we devised was a side-channel leak, conceptually very simple, taking advantage of the optical properties of human skin and its reflective properties," Olejnik explained in his paper.
"Skin reflectance only accounts for 4-7 percent of the emitted light, but modern display screens emit light with significant luminance. We exploited these facts of nature to craft an attack that reasoned about the website content via information encoded in the light level and conveyed via the user's skin, back to the browsing context tracking the light sensor readings."
It was this technique that enabled proof-of-concept attacks like stealing web browsing history through inferences made from CSS changes, and stealing cross-origin resources, such as images or the contents of iframes.
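The decoding side of such an attack can be sketched very simply: the attacking page flashes content light or dark, the reflection off the user's skin nudges the ALS reading up or down, and a script recovers one bit per flash by thresholding. The threshold and the sample readings below are hypothetical, chosen only to illustrate the idea:

```javascript
// Sketch of the side channel's decoding step. Each element of
// luxReadings is one ALS sample taken during one screen flash;
// readings above the threshold decode as 1, below as 0.
// Threshold and readings are illustrative, not measured values.
function decodeBits(luxReadings, thresholdLux) {
  return luxReadings.map((lux) => (lux > thresholdLux ? 1 : 0));
}

// e.g. a dim room where a white screen adds roughly 10 lux:
const readings = [3, 12, 11, 2, 13];
console.log(decodeBits(readings, 7).join('')); // "01101"
```

The leak is slow, on the order of bits per second, but history entries and small cross-origin secrets do not need much bandwidth to exfiltrate.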
Browser vendors responded in various ways. In May 2018, with the release of Firefox 60, Mozilla moved access to the W3C proximity and ambient light APIs behind flags, and applied further limitations in subsequent Firefox releases.
Google took what Olejnik described in his paper as a "more nuanced" approach, limiting the precision of sensor data.
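Precision limiting means rounding the raw illuminance to a coarse step before script ever sees it, so the tiny screen-reflection differences the attack depends on vanish. A minimal sketch, with the 50 lux step an assumption for illustration rather than a value taken from any browser's source:

```javascript
// Sketch of precision limiting: quantize raw illuminance to a coarse
// step before exposing it to web content. The step size is assumed.
function coarsenIlluminance(rawLux, stepLux = 50) {
  return Math.round(rawLux / stepLux) * stepLux;
}

console.log(coarsenIlluminance(3));   // 0
console.log(coarsenIlluminance(12));  // 0
console.log(coarsenIlluminance(137)); // 150
```

Note that the 3 lux and 12 lux readings from the earlier example now quantize to the same value, collapsing the one-bit-per-flash signal.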
But those working on the W3C specification, and on the browsers implementing it, recognized that such privacy protections should be formalized to increase the likelihood that the API would be widely adopted and used.
So they voted to make the imprecision of ALS data normative (standard for browsers) and to require the camera access permission as part of the ALS spec.
Those changes finally landed in the ALS spec this week. As a result, Google and perhaps other browser makers may choose to make the ALS API available by default rather than hiding it behind a flag or ignoring it entirely. ®