
Londoners are unknowingly having their physical data collected. Here’s why that’s a problem


Two recent developments in London’s shared spaces – location tracking and facial recognition – call into question established boundaries between public and private.

Wi-fi on the Underground was initially derided for ruining the tranquillity of a signal-less Tube, but a huge number of Londoners now rely on it.

What most commuters probably haven’t realised is that Transport for London (TfL) has been harvesting location data from their devices. Because phones with wi-fi switched on periodically broadcast identifying signals while searching for networks, this is true of any device capable of connecting to wi-fi, not just those that actually do connect. Following a series of trials, the data collection is set to be rolled out across the network from the 8th of July.
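To see why merely having wi-fi switched on is enough, consider that phones constantly emit ‘probe requests’ – small packets containing the device’s hardware (MAC) address – as they look for known networks. The sketch below is a generic illustration of passive probe-request logging, not a description of TfL’s actual system; the scapy library and the monitor-mode interface name are assumptions.

```python
# Minimal sketch: passively logging wi-fi probe requests.
# Assumes the scapy library and a wireless card in monitor
# mode (the interface name "wlan0mon" is hypothetical).
from scapy.all import sniff
from scapy.layers.dot11 import Dot11ProbeReq

def log_probe(pkt):
    # Probe requests carry the sender's MAC address, visible
    # to any nearby receiver whether or not the phone ever
    # joins a network.
    if pkt.haslayer(Dot11ProbeReq):
        print(f"Device seen: {pkt.addr2}")

sniff(iface="wlan0mon", prn=log_probe, store=False)
```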

TfL has at least been relatively transparent about the data it has been collecting. User data is encrypted, and the network will not monitor what websites you visit, only your location on the Underground.

This means they can produce highly accurate maps of Tube journeys, and all kinds of sexy data visualisations to go with them. TfL has promised an information campaign regarding the changes, and opting out is as simple as switching off your wi-fi.

Who benefits from surveillance data?

This kind of data yields obvious benefits for the transport network. While it’s already possible to work out where commuters tap in and out, there can be dozens of routes between any two destinations. Wi-fi location data will allow TfL to resolve capacity issues more effectively, and it is precise enough to show how people move through individual stations.

Of course, there is a financial driver behind this. TfL is set to lose funding from the Department for Transport and is looking to plug a hole of over half a billion pounds in its budget. Aside from a more efficient service, data from tube users will allow for the terrifying prospect of dynamic advertising on the Underground, with far higher revenues to be made from figuring out visual hotspots.

This throws up questions about how we want to design our cities, and how we value public space. The utilitarian in us begs for more, faster, less crowded transit – but is that worth more Deliveroo adverts following us around the Circle Line? This data could eventually be linked to our wider digital footprint, and the prospect of personalised adverts perennially hovering in the background of our lives is a grim one.

Facial recognition technology in London is a problem

Other developments in people-tracking make this all seem comparatively benign. Last week the Metropolitan Police trialled facial recognition software in East London, stopping and ID’ing individuals who matched police databases of known suspects. The technology is far from perfect – freedom of information requests by Big Brother Watch, an anti-surveillance campaign group, found a 96% misidentification rate in recent trials. Nevertheless, the police seem set on expanding its use across London.

This is a problem. Law enforcement agencies are yet to figure out how to use such an invasive technology in public spaces, often parking vans with obnoxiously large cameras and swarms of officers outside tube stations and shopping centres.

In a video circulated widely on social media last week, a man who refused to show his face to a police camera was forced to do so, and then fined £90 after an ensuing argument. If facial recognition technology is to become the norm on the streets of London, the Met are (at the very least) going to have to work on their PR.


This comes with a suite of other tech nominally designed to help the police do their job. A recent report by the human rights group Liberty notes the growth of so-called ‘big data policing’, which extends to creating predictive models of where crime is likely to happen by crunching large volumes of data.

Because these algorithms learn from historical policing data, they can reproduce significant in-built racial bias. There are also suggestions that the approach simply doesn’t work – successful policing comes from building community trust rather than microtargeted prevention and punishment.

Facial recognition technology is already seeing significant pushback. This week a case is being brought to the High Court in Cardiff, arguing that weak regulation of the technology breaches human rights.

In the US, libertarian inclinations seem to be winning out over the economic power of Silicon Valley. Amazon’s shareholders look prepared to block the company from selling its facial recognition software to government agencies, and San Francisco has recently moved to ban the practice entirely.

The next wave of data scandals will be even more personal

What both of these cases – location and visual tracking – point towards is an increasing encroachment by tech, government and the private sector on the public realm.

While TfL has promised that data from our phones will be encrypted and depersonalised, in theory it is still possible for someone with the right ‘keys’ to access the locations of individuals.
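To make that concrete: depersonalisation typically means replacing a device’s identifier with a salted hash before storage. The minimal sketch below assumes a generic salted-hash scheme (the salt value, station names and function names are hypothetical, not TfL’s actual design); it shows why this defeats outsiders but not whoever holds the salt, who can recompute the hash for a known device and pull its journeys back out of the dataset.

```python
# Minimal sketch of salted-hash "pseudonymisation" and why it is
# reversible for the key-holder. All names and values are hypothetical.
import hashlib

SECRET_SALT = b"held-by-the-operator"  # the 'key' in question

def pseudonymise(mac: str) -> str:
    # Outsiders without the salt cannot link this hash to a device.
    return hashlib.sha256(SECRET_SALT + mac.encode()).hexdigest()

# Stored records are keyed by hash, not by raw MAC address.
stored = {pseudonymise("aa:bb:cc:dd:ee:ff"): ["Oxford Circus", "Bank"]}

# But anyone holding the salt can re-identify a known device:
target = pseudonymise("aa:bb:cc:dd:ee:ff")
print(stored.get(target))  # -> ['Oxford Circus', 'Bank']
```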

At present, there is scant legislation protecting individuals from having their physical data unknowingly collected by the government or corporations.

The data scandals of the previous decade have centred around information that we at least opted to put on the web – our emails, internet history and contact details – even if this data ultimately ended up in the wrong places.

In the last five years, whether we’ve opted in or not, our online presence increasingly extends to the physical – our biometric data, location and appearance. The next wave of data scandals will be even more personal.

Who should govern our data? The RSA's Tech and Society programme is currently exploring AI & ethics, data rights and disinformation.

