I don’t like being watched, especially when I don’t know who the watchers are, and the rapid growth in cameras in streets, shops, trains, buses, public spaces and private buildings has troubled me for many years.
Until recently the watchers were, at least, human — delightfully limited in their perceptual capacity and ability to identify or discriminate, tired and distracted and even bored with their job as they sat behind banks of monitors.
And of course most of the material was never even watched, instead being held on a hard drive on a neglected server in a dank office awaiting some incident that would require it to be reviewed, but more likely to be overwritten to make disk space for today’s similarly neglected footage.
This is no longer the case. More and more of the cameras that capture us as we go about our lives — like the one in this rail carriage watching me as I write — are not intended for human viewing at all, but are input sensors for an increasingly sophisticated surveillance machine. It used to be misleading to say that a camera was ‘watching’ you, as it was just an image capture device — at some point a human being needed to do the actual watching.
Now the cameras are at the service of always-alert neural networks programmed to capture features and pass the data they extract for further processing inside an increasingly capable system. And while these millions of systems are largely separate for now, the trend is to connect them together to provide citywide, countrywide and perhaps even global capabilities.
Now the machines really are watching us, and it is their understanding of the world, in the form of visualisations and alerts, that is fed to the human operators who serve the commercial or state interests behind the mechanism.
And they are getting more and more ambitious.
First, they came for the numberplates, with the carefully named ‘ANPR’ or automatic number plate recognition systems installed on gantries or in vans parked inconspicuously at the side of the road. The name seems designed to reduce suspicion — if they’d called it ‘carspotter’ or ‘numbersink’ we might have protested more loudly, but these systems are now unstoppable and seen as a ‘vital tool’ for crimefighting.
Now, with more sophisticated algorithms and hundreds of billions of dollars of investment from governments and large technology firms who have improved the hardware, refined the cameras and created and trained the neural networks, the machines are looking at our faces, our clothes and even the way we walk — our gait — and labelling us rapidly and efficiently.
It’s all for our own good, of course. In China you can pay for goods by simply looking at the Dragonfly camera: if your face is in the national ID database and you’ve linked your Alipay account, you’re good to go. In the UK we prefer to scan faces in crowds, looking for suspicious characters and those with outstanding arrest warrants. For the US it’s about making airports a smoother experience, dispensing with cumbersome passports and boarding cards in favour of total passenger surveillance.
San Francisco has moved to forbid the use of facial recognition in public — a move that perhaps illustrates that the engineers who work so hard to develop these technologies are less keen on living the observed life their work has made feasible — but this seems an attempt to swim against an irresistible riptide.
The dogs of machine vision have been unleashed upon the world, and they are simply far too useful to far too many organisations, with good and bad intent, to be easily stopped.
And we can expect the legal frameworks that surround this sort of surveillance to expand to make it harder and harder to sidestep: recently in London someone who covered their face to avoid being caught in a trial of facial recognition technology was fined. Existing laws passed with anti-Muslim intent to stop women covering their faces in public may also be found very helpful when it comes to requiring everyone to make their faces visible to the cameras.
Just as the whole legal framework surrounding copyright was enhanced to protect rightsholders by criminalising the act of bypassing digital rights management technologies, thereby removing many of the exceptions carefully crafted into the underlying copyright law, so we can expect new laws to facilitate widespread use of facial recognition and criminal sanctions for resistance or any attempt to undermine it.
When Socrates argued that ‘the unexamined life is not worth living’ he can’t have imagined that the examiners would one day be machines acting on behalf of commerce and the state. And despite the occasional act of gesture politics from those cities that once declared themselves ‘Nuclear Free Zones’, I don’t see us turning away from the general deployment of ML-backed observation of all of us, all of the time. After all, it is just so useful for all sorts of purposes.
If we want our smart cities and smooth urban transport and cleaner air and all the other things, then being watched over by machines of growing intelligence is part of the price. For Socrates the unexamined life was one where we had no inner knowledge. It seems that instead we are to become the subjects of interest for non-human algorithms seeking to categorise, commercialise and constrain us in the service of the invisible powers of capital and the state.
Originally published at http://www.astickadogandaboxwithsomethinginit.com.