Boffins have found a side channel to observe the choices netizens make when viewing interactive streaming videos.
At least at one point in history people worried about such things: three decades ago, the availability of home video rental data was deemed, in America at least, to be enough of a privacy invasion to prompt the Video Privacy Protection Act of 1988.
Now researchers at the Indian Institute of Technology Madras have found that the specific choices viewers make while watching interactive videos can be determined from network traffic, opening up the possibility that ISPs may seek to sell such data as yet another signal for ad targeting, or that authorities might demand it to assess political attitudes.
Netflix last year presented an interactive movie, Black Mirror: Bandersnatch, in which viewers can make choices along the way that affect the path of the story.
When a viewer picks one of two narrative paths at a branch point in the story, that choice is sent back to Netflix so the appropriate video segment can be played. And it turns out to be possible to discern which branch each viewer took via network packet analysis.
In a paper just released through pre-print service arXiv, "White Mirror: Leaking Sensitive Information from Interactive Netflix Movies using Encrypted Traffic Analysis," a handful of the institute's computer scientists show that story choices – sent from the viewer's browser to Netflix via a JSON file – can be inferred despite the encryption of network traffic.
"Our experiments revealed that the packets carrying the encrypted type-1 and type-2 JSON files can be distinguished from other packets by their SSL record lengths which are visible even from encrypted traffic," explain Gargi Mitra, Prasanna Karthik Vairam, Patanjali SLPSK, Nitin Chandrachoodan, and Kamakoti V in their paper.
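The fingerprinting the researchers describe can be sketched roughly like this: an eavesdropper who has pre-measured the record sizes produced by each choice payload only needs to match observed lengths against those fingerprints. The record lengths below are invented for illustration; the paper's actual measurements differ.

```python
# Hypothetical sketch: infer a viewer's Bandersnatch choice from encrypted
# record lengths alone, assuming each choice's JSON payload produces a
# distinctive SSL record size. All byte counts here are made up.

# Pre-measured record lengths (bytes) for two choice payloads, as an
# attacker might have fingerprinted them in advance. Invented values.
CHOICE_FINGERPRINTS = {
    "Sugar Puffs": 1422,
    "Frosties": 1431,
}

TOLERANCE = 4  # allow a few bytes of jitter between captures


def infer_choice(record_length: int):
    """Map an observed encrypted record length to a story choice, or None."""
    for choice, expected in CHOICE_FINGERPRINTS.items():
        if abs(record_length - expected) <= TOLERANCE:
            return choice
    return None


# A captured sequence of record lengths: mostly bulk video traffic, with
# one record matching a known choice fingerprint.
captured = [16384, 16384, 1431, 16384]
choices = [c for length in captured if (c := infer_choice(length))]
print(choices)  # ['Frosties']
```

The point of the sketch is that no decryption is needed: the ciphertext's length alone narrows the payload down to one branch.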
Using a data set of 100 viewers, the researchers claim they successfully determined the viewers' choices 96 per cent of the time.
A data set of decisions in an interactive narrative may sound inconsequential in an era of social media oversharing, but the researchers nonetheless suggest the information could have commercial or political applications.
"Interestingly, the choices made and the path followed can potentially reveal viewer information that ranges from benign (e.g., their food and music preferences) to sensitive (e.g., their affinity to violence and political inclination)," they explain.
The Register asked Netflix if it has any concerns about these findings. We've not heard back.
The boffins from Madras suggest a straightforward mitigation: altering the JSON files sent from the viewer's browser so they're equally long and indistinguishable from one another. ®
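For illustration, that padding idea might look something like the following sketch. The field names and the fixed size are assumptions for the example, not Netflix's actual scheme.

```python
import json

# Hypothetical sketch of the mitigation: pad every choice JSON to a fixed
# serialized size, so encrypted record lengths no longer reveal which
# branch was taken. "pad" and FIXED_SIZE are invented for illustration.

FIXED_SIZE = 512  # target serialized length in bytes (assumption)


def pad_payload(payload: dict) -> bytes:
    """Serialize a choice payload and pad it to exactly FIXED_SIZE bytes."""
    raw = json.dumps(payload).encode()
    if raw and len(raw) > FIXED_SIZE:
        raise ValueError("payload exceeds fixed size")
    # Account for the overhead of the padding field itself, then fill the
    # remainder with filler characters so every payload comes out equal.
    overhead = len(json.dumps({**payload, "pad": ""}).encode())
    return json.dumps({**payload, "pad": "x" * (FIXED_SIZE - overhead)}).encode()


a = pad_payload({"choice": "Sugar Puffs"})
b = pad_payload({"choice": "Frosties"})
print(len(a), len(b))  # 512 512 -- indistinguishable by length
```

With every choice payload the same size, the SSL record lengths the researchers exploited carry no information about the viewer's decision.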