Where do troops at Incirlik Air Base in Turkey like to jog? Around the nuclear weapons storage sites, says a heat map of fitness paths from the data tracking app Strava. Using the native GPS data from users’ smartphones and watches, Strava Labs produced a global heat map, and with a click of a button and a minute on Google, anyone can find where on a military base people likely jog. So what?
When it comes to Incirlik, the Strava jogging data just adds to existing knowledge. Satellite views show B-52s stationed outside special bunkers, a quick search of “nuclear weapons Turkey” pulls up a ton of stories about the storage at the base in Incirlik (I’m partial to this one), and the knowledge that people on base like to run laps around available paths probably isn’t news to anyone in the area. As analysts and uniformed personnel debated on Twitter, what Strava’s heat map adds isn’t much compared to what a half-interested observer probably already knows. But that’s just the heat map itself.
The bigger, specific question is what else Strava knows that isn’t on the heat map. And more broadly, the bigger danger is what happens when the technologies vital for everyday life record that information and share it widely. Strava accounts are linked to Facebook, Google, or email, and depending on the sign-up method, simply by making the account a user gives Strava the same data about their connections already siloed away in a social network. A click or two later, and the user can choose how to share their location with the app. Buried under settings is a “privacy” section. By default, anyone can view a Strava user’s profile, people logged into the app can follow a user, and followers can download user data. An Enhanced Privacy setting masks some activity, but users have to individually toggle each of several categories of activity to hide it from the app. If a user wants to keep a certain location hidden on the app, like their home or office, they have to go through the desktop client to set a hidden zone with a radius of three-quarters of a mile around a set location.
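In principle, a hidden zone of that kind is just a distance filter: before an activity is shared, any GPS point within the radius of the chosen location gets dropped. A minimal sketch of the idea, with illustrative coordinates and function names (this is not Strava’s actual implementation):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in miles."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def apply_hidden_zone(track, center, radius_miles=0.75):
    """Drop GPS points that fall within radius_miles of center."""
    return [
        (lat, lon) for lat, lon in track
        if haversine_miles(lat, lon, center[0], center[1]) > radius_miles
    ]

# A short run that starts at "home" and heads east (made-up coordinates).
home = (37.0020, 35.4259)
track = [(37.0020, 35.4259), (37.0020, 35.4500), (37.0020, 35.5000)]
print(apply_hidden_zone(track, home))  # the starting point disappears
```

Note what the filter does not do: the dropped points still reach the company’s servers; they are only hidden from other users.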
For someone who wants to just sign on, run, and share their run with their buddies, that’s a lot of work to make sure that, say, the daily jog around a Patriot missile installation is secure. And someone who wants to make sure that the steps they take while on duty count for their fitness tracker might not take the extra time to dig through the settings and hide their location data from other users. And even if they do, the information is still fed to the app and the company itself, which collects it and decides how that data gets used. The heat map doesn’t identify any specific users. It instead gives viewers something else: patterns of behavior around places.
A quick note on this: while the high-traffic areas show up brightest (this is the nature of a heat map, after all), it’s the thin lines around sensitive locations, detailing paths walked by maybe one person doing security, that are probably most interesting to anyone plotting something nefarious. That’s true universally: the paths of any group of armed people running security with Strava on can be found on the map. While it might be of special interest to those tracking the activities of the Pentagon, it’s relevant to anyone tracking any military. And a caveat: this is just the data of people who use the Strava app, so while it documents some activity, the absence of anything on the Strava map only means no Strava users recorded data there, not that nothing happened there.
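The aggregation behind a map like this is simple in principle: GPS points are binned into grid cells, and each cell’s count drives its brightness. A toy sketch with made-up data (not Strava’s actual pipeline) shows why the faint lines survive — a single patrol still registers, just dimly:

```python
from collections import Counter

def heat_grid(points, cell_size=0.001):
    """Bin (lat, lon) points into grid cells; counts become brightness."""
    counts = Counter()
    for lat, lon in points:
        cell = (round(lat / cell_size), round(lon / cell_size))
        counts[cell] += 1
    return counts

# Fifty laps on a popular loop versus one lone perimeter patrol.
popular_loop = [(37.0010, 35.4200)] * 50
lone_patrol = [(37.0050, 35.4300)]
grid = heat_grid(popular_loop + lone_patrol)
# The busy cell dominates, but the single-user path is still on the map -
# those faint traces are exactly the lines worth worrying about.
print(grid)
```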
Strava is hardly the first app to capture ephemeral behavior and turn it into a public, geolocated document. When the messaging and video tool Snapchat introduced a world map option, allowing any user to view the location of any user or post set to public, commanders had to remind those serving to change their privacy settings, to keep locations private from acquaintances. When Pokémon Go took a map built for the augmented reality game Ingress and turned it into a massive phenomenon, quiet but mildly notable sites (like, say, a Commanding General’s house) became the target of casual interlopers.
We are living in a future where the default is to be connected, and the ways people are connected increasingly ask users to offer seemingly insignificant information in exchange for daily utility. To some extent, this is vital: what good is a jogging app if it can’t tell where the user went jogging? What is unclear, though, is what else the app knows, how securely that data is stored, and whether someone with access to the full dataset could de-anonymize it.
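One reason de-anonymization is hard to rule out: even with names stripped, a user’s activities tend to start and end at the same few places, and the most common start point is usually a home or workplace. A toy sketch of that inference, with hypothetical tracks (illustration only, not a claim about Strava’s dataset):

```python
from collections import Counter

def likely_home_cell(tracks, cell_size=0.001):
    """Guess a user's home as the grid cell where activities most often start."""
    starts = Counter(
        (round(t[0][0] / cell_size), round(t[0][1] / cell_size))
        for t in tracks if t
    )
    cell, _ = starts.most_common(1)[0]
    return cell

# Three "anonymous" runs that all begin at the same address.
tracks = [
    [(37.0020, 35.4259), (37.0030, 35.4300)],
    [(37.0020, 35.4259), (37.0010, 35.4200)],
    [(37.0020, 35.4259), (37.0040, 35.4350)],
]
print(likely_home_cell(tracks))  # the shared starting cell falls out immediately
```

Pair that cell with any public record tied to the address and the “anonymous” track has a name attached.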
And then, there is the simple fact of the data itself: while the location of nuclear storage at Incirlik may be public, and the bunkers may be visible from space, those loops are still restricted areas, and cell phones, especially the kind that can record the path someone took on a jog, are restricted in those areas. If there is a risk here, it is the modern liability of information shared online combined with the same danger that has plagued every endeavor since the dawn of time: human error.
Kelsey Atherton blogs about military technology for C4ISRNET, Fifth Domain, Defense News, and Military Times. He previously wrote for Popular Science, and also created, solicited, and edited content for a group blog on political science fiction and international security.