In November, exercise-tracking app Strava published a “heatmap” of user activity which it cheerily boasted comprised a billion activities, three trillion lat-long points, 13 trillion rasterized pixels and 10 TB of input data.
It took a while, but late last week someone wondered: how many Strava users are members of the military or national security groups, and are uploading their activity? The answer is “plenty - and they've revealed where they work, where they live, when they were sent to a new outpost, and where to ambush them when they least expect it.”
Ever since Nathan Ruser, an international security student at the Australian National University, observed that Strava's data included the exercise routes of military and natsec personnel, locating military installations in Strava's heatmap has become a social media sensation.
Strava released their global heatmap. 13 trillion GPS points from their users (turning off data sharing is an option). https://t.co/hA6jcxfBQI … It looks very pretty, but not amazing for Op-Sec. US Bases are clearly identifiable and mappable pic.twitter.com/rBgGnOzasq— Nathan Ruser (@Nrg8000) January 27, 2018
For example, in Australia, it's now possible to see where people exercise at the secretive Pine Gap signals intelligence station in the deep desert:
Observers have also noted that Strava hasn't revealed much more than was already visible on Google Earth. For example, here's Pine Gap again, this time from Google:
Google's got a much clearer image of Pine Gap
Strava's explanation of how it made the Heatmap says it excluded data that users asked to be kept private. The service allows users to create multiple "privacy zones" with a radius of up to 1km. When users enter such zones, their digital tracks disappear, making it harder to figure out where they live or work.
Data revealing the location of sensitive facilities, or the habits of military personnel, would therefore have been excluded if users had employed Strava's privacy settings.
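Strava hasn't published how its privacy-zone filtering works internally; as a rough illustration only, a radius-based zone could be implemented as a great-circle distance check like the sketch below (the function names, coordinates and data layout are hypothetical, not Strava's):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/long points."""
    r = 6_371_000  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def filter_track(points, zones):
    """Drop every GPS point that falls inside any privacy zone.

    points: list of (lat, lon) tuples
    zones:  list of (lat, lon, radius_m) tuples
    """
    return [
        p for p in points
        if all(haversine_m(p[0], p[1], z[0], z[1]) >= z[2] for z in zones)
    ]

# Hypothetical example: a 1km zone around a "home" location swallows the
# point logged at the doorstep but keeps one ~11km away.
track = [(51.5000, -0.1000), (51.6000, -0.1000)]
zones = [(51.5000, -0.1000, 1000)]
print(filter_track(track, zones))  # only the distant point survives
```

The limitation Ruser and Haugli point to is visible even in this toy version: points just outside the radius still reveal where the track starts and stops, and the zone only exists if the user opted in.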
However, as Ruser later tweeted, the location of bases isn't the only concern: the ability to establish “pattern of life” information also makes the Heatmap a serious source of risk – mainly because people weren't keeping their information private.
If soldiers use the app like normal people do, by turning it on tracking when they go to do exercise, it could be especially dangerous. This particular track looks like it logs a regular jogging route. I shouldn't be able to establish any Pattern of life info from this far away pic.twitter.com/Rf5mpAKme2— Nathan Ruser (@Nrg8000) January 27, 2018
The Daily Beast's Adam Rawnsley noticed the app can even reveal troop movements, if new Strava users pop up in an area around a military base:
Pretty faint but data from the Strava exercise app shows like China has deployed joggers to its disputed Woody Island in the South China Sea, in addition to fighter jets and HQ-9 SAMs pic.twitter.com/HG6zkb8tcw— Adam Rawnsley (@arawnsley) January 27, 2018
It is also, by the way, possible to extract people's names, profile pictures, and heart rates from Strava's backend:
We are just now all seeing how much you can learn from this data now that it is publicly accessible, but all of that *already was possible* if one had access to the data. https://t.co/rB2fto3w6H— Dino A. Dai Zovi (@dinodaizovi) January 29, 2018
Beyond the military frenzy, however, El Reg agrees with observations that the heat map is sufficiently detailed to pose a risk to individuals. Infosec bod Brian Haugli noticed that the heatmap reaches all the way to your door:
You can see individuals that are using Strava by zooming it to houses that have a short line. Strava gives the ability to set up privacy zones, but it's not on by default. pic.twitter.com/azqZFXiVQZ— Brian (@BrianHaugli) January 28, 2018
Even if individuals had set up the area around their homes as privacy zones, which Haugli noted is not the default, the dataset still contains a level of personally identifying information that shouldn't have been published by Strava, according to European privacy researcher Lukasz Olejnik.
Olejnik said that, at the least, someone should have conducted a privacy impact assessment before pressing “publish” on the dataset.
He told The Register in an email: “This highlights the challenges of location data anonymisation, and how mass datasets reveal unexpected patterns. Organisations should carefully consider consequences on multiple levels prior to publishing private data.
“That said, making a privacy impact assessment of this kind of a project would be quite an adventure.”
Olejnik also tweeted that Europe's General Data Protection Regulation (GDPR) considers location to be sensitive information, meaning publication should be handled with care. ®