While in countries like China automated face recognition systems are already routinely used by law enforcement, elsewhere attempts to employ this technology still encounter resistance. This is the case in the United Kingdom, where a trial system deployed by the London Metropolitan Police proved to be particularly inaccurate. According to data obtained from the police through a request under the British Freedom of Information Act, between 2016 and 2018, 96% of the people flagged by the system's cameras as potential criminals were ordinary citizens - passers-by often unaware that they were being filmed.
The system under test
Since 2016, the London Metropolitan Police has periodically deployed a network of closed-circuit television cameras connected to a facial recognition system in certain areas of the capital, designed to compare the faces of people caught on camera against a database of individuals wanted by law enforcement. If the system detects a match, it sends a notification to an operator, who can decide whether to intervene and stop the suspect; if the face does not match any image in the database, the corresponding footage is deleted 30 days after recording.
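The match-then-notify workflow and the 30-day retention rule described above can be sketched roughly as follows. This is a minimal illustration only: the function names, the similarity threshold, and the purge mechanism are assumptions for the sake of the example, not details of the Met's actual system.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

RETENTION = timedelta(days=30)   # per the article: unmatched footage is kept 30 days
THRESHOLD = 0.8                  # hypothetical similarity cutoff, not a known Met value

@dataclass
class Capture:
    face_id: str
    recorded_at: datetime

def similarity(face_id: str, watchlist_entry: str) -> float:
    """Stand-in for a real face-embedding comparison (hypothetical)."""
    return 1.0 if face_id == watchlist_entry else 0.0

def process(capture: Capture, watchlist: list[str],
            stored: list[Capture], alerts: list[str]) -> None:
    # If any watchlist entry matches above the threshold, alert an operator;
    # per the article, a human still decides whether to intervene.
    if any(similarity(capture.face_id, w) >= THRESHOLD for w in watchlist):
        alerts.append(capture.face_id)
    else:
        # Otherwise retain the footage, to be purged after the retention period.
        stored.append(capture)

def purge(stored: list[Capture], now: datetime) -> list[Capture]:
    # Delete any retained footage older than 30 days.
    return [c for c in stored if now - c.recorded_at < RETENTION]
```

Note that in this sketch, as in the system the article describes, a match only raises an alert; the decision to act on it remains with a human operator.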
Criticism of facial recognition
Needless to say, the procedure has drawn considerable criticism. On one side are privacy and civil rights advocates, who have long denounced the dystopian nature of the technology; on the other are those who simply point out that - despite the strategic placement of signs informing the public of the surveillance in progress - passers-by are at times not even aware of being filmed by a system designed to recognize them.
The newly released figures on the system's performance do not play in favor of those who wanted it in action. The Metropolitan Police has stated that it is aware of the system's imperfections and that, for this reason, the decision to intervene on a flagged individual is always made by a human being. For critics, however, human oversight is not enough to make the use of such a system acceptable, partly because some of the software's errors have led to more serious incidents. According to the Big Brother Watch association, in one episode a 14-year-old black boy in school uniform was stopped by police and released only after being forced to provide his fingerprints, while in other cases passers-by wearing scarves or coats were stopped simply because the system could not recognize them, and were forced to show their identity cards.