San Francisco's facial recognition ban.

The Board of Supervisors for the City and County of San Francisco is a long, long, long way down the pecking order when it comes to legislation. Unlike the US Congress, where Members can pass motions that amount to comment rather than law, the Board of Supervisors can pass law - but that law may well have no practical effect. So exactly what is the newly announced law and how much weight will it carry?

The concern in San Francisco, as in a number of other US towns and cities, is that biometric data, of which facial recognition is one example, is subject to mission creep: that is, it is collected for one purpose but then used for others. Some say that the use of biometric data has reduced the take-up of essential services, such as shelters for the homeless, because the data is used to track who uses them. Those who are undocumented, some but not all of whom are illegal immigrants, are said to fear discovery and therefore stay away.
But such services, along with shopping centres, social media companies and other non-government users, are not affected by the ban, and there is a limited exemption for law enforcement agencies. High on the list of places using the data is Las Vegas, which is, in essence, a private surveillance state, with extensive data capture at and around casinos and exchange of information between them. The extent to which that data might be available to the police and, even, the revenue services is not known, but - and this is where the situation becomes seriously complex - that data will also include money laundering compliance data, which provides a considerable amount of personal information about any visitor who exchanges cash for chips and vice versa.
That data, once in the hands of federal law enforcement, can be cross-referenced by US Customs and Border Protection, which has long been a heavy user of facial recognition systems (and even gait and behavioural analysis), and, inevitably, by Immigration and Customs Enforcement.
Critics of the expansion of the technology point to China which says that it will soon have a database of everyone in the country to aid it in matters of national and state security.
All of this is before the question of the reliability of the technology is considered. It is known that facial recognition is not reliable. Try using the auto-gates at London's Heathrow after a long, tiring flight and see how often they turn you away - the system has improved, but initially some users reported repeated, even 100%, failure rates over a significant period and number of trips. The abandoned IRIS system, those same users report, had a 100% success rate. IRIS, like Hong Kong's frequent visitor system, relied on enrolment and reliable capture of data. The newer system, in use across major points of entry in the UK and Australia, for example, relies on an algorithmic analysis of the photograph submitted with passport applications - and on a photographic view of the user at the point of entry. Everything from the angle at which a person stands to the way hair lies can cause rejection. It is the use of identity card photographs as base data that is also causing concern for some in the USA, where the Federal Bureau of Investigation has access to driving licence photographs to help with identification. The big question that exercises minds is whether this should be delegated to technology or reserved for human intervention.
That is a debate that has long been lost: automated searches of fingerprint databases (a functionally similar search to that of databases of faces) and DNA records have been a core tool for law enforcement agencies around the world. There is numberplate recognition tied to facial recognition in the UK, and the roots of that go back to the 1980s and the need to identify suspected terrorists, ideally before they commit to any action.
It has been said that access to facial recognition databases should be permitted only with a warrant, but that horse has bolted, too, unless the database is in private hands, in which case it might be argued that there is a similarity with private security camera footage. Indeed, the inquiries into the USA's only major terrorist attack, that of 11th September, 2001, revealed that it was a lack of co-operation between agencies that kept back data that might (emphasise "might") have led to increased surveillance of at least some of those later convicted of involvement. The sharing of data, and free access to it, has become an objective of governments in the battle against terrorism and serious organised crime, including people smuggling and trafficking.
There is no doubt that people are uneasy about such data being freely available, but many of them are the very people who argue against control of internet companies which publish, albeit with the consent, even assistance, of their users, faces and other material which law enforcement now routinely accesses when it is published in public. Facebook, some years ago, made a futile (and somewhat disingenuous) attempt to combat the publishing of images of people other than the account holder - but only by saying people could object and have their image removed or fuzzed out. That was doomed to fail because photos are distributed, often many times, before the victim finds out about them. The real question is whether those people gave their consent to their image being published, and the time to ask that is at the time of publication.
San Francisco's law (which is subject to a confirmatory vote next week, but which is expected to pass easily) is a statement, but if it falls foul of federal law it will have no effect. Moreover, unless a federal judge certifies that it is a constitutional matter, there can be no question of the law ending up in the Supreme Court.
It is, therefore, an interesting debating point. The city's own law enforcement officers will be bound by it, arguably hampered by it, but in the real world it is unlikely to have a significant effect on much more than shoplifting, traffic offences and local crimes, including those of violence. It may be overturned once a rape victim finds that her assailant escapes because the only identification evidence is on a traffic camera or a bus security video.