Discussions of Platform Urbanism eventually end with some allusion to the Tom Cruise vehicle Minority Report, often as a way to discuss the dangers of predictive policing. The work of the hero-cop is to stop crimes before they happen by using the skills of ‘precogs’, seers who lie in warm baths waiting for visions of desperate villainy to possess them. Cruise then jumps into action and arrests the criminal before the deed is done. Until the system breaks down.
The part of the film that is often overlooked involves a chase scene in a shopping mall, where Cruise is trying to hide the ‘precog’ he has kidnapped from the Justice Department goons who are hunting them. The escape is dogged, however, by smart billboards and advertisements that recognise the cop and call out his name, enticing him with the latest geegaws and commodities. (He later has to undergo eye surgery to avoid detection.) This tense dramatic scene reminds us of the close relationship between police technology and the advertisement. Both are forms of all-seeing pursuit – a hunt, almost. The criminal and the consumer are one and the same in the eyes of the smart software.
Face recognition software has become one of the most pervasive methods of algorithmic tracing. It originated in the 1960s, and like fingerprinting in the Victorian era, it consists of systematically dividing the face into characteristics and forms that can then be scanned, catalogued, and re-assembled when necessary. In the 1970s, this was based upon 23 measurements, but developments in machine learning have made it extremely sophisticated: while still based on only 80 nodal points in the human face, it can now read thousands of variables between those nodes. Improvements in 3D modelling offer further enhancements.
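As a rough sketch of how such a nodal-point scheme works – the landmark coordinates below are invented for illustration, not any vendor's actual pipeline – each face is reduced to a handful of labelled points, and the distances between every pair of points form a signature that can be catalogued and compared:

```python
from itertools import combinations
from math import dist

# Hypothetical landmark coordinates (x, y) for two faces;
# real systems detect around 80 nodal points automatically.
face_a = {"eye_l": (30, 40), "eye_r": (70, 40), "nose": (50, 60), "chin": (50, 95)}
face_b = {"eye_l": (31, 41), "eye_r": (69, 40), "nose": (50, 62), "chin": (50, 94)}

def signature(face):
    # Feature vector: the distance between every pair of nodal points.
    keys = sorted(face)
    return [dist(face[a], face[b]) for a, b in combinations(keys, 2)]

def difference(f1, f2):
    # Mean absolute difference between two signatures; lower means more alike.
    s1, s2 = signature(f1), signature(f2)
    return sum(abs(x - y) for x, y in zip(s1, s2)) / len(s1)

print(difference(face_a, face_b))  # a small value: the two faces closely match
```

The point of reducing a face to a vector of distances is that the comparison becomes pure arithmetic – exactly what lets the same signature be scanned against millions of catalogued records.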
This research was funded and encouraged by the state, through institutions like DARPA, part of the US Defence Department, and the Army Research Laboratory (ARL), as well as by the police. The US Army is still a major developer of the technology, particularly of methods for identifying people of interest at a distance. Much of this funding is funnelled into start-ups, which are also developing the technology for more mainstream functions.
In 2016 Yahoo applied for a patent for a "smart" billboard. The billboard would collect data through sensors, cameras and microphones embedded within the urban fabric – all without the permission of the passer-by. Not only could the data be gathered and later sold to help craft highly targeted ads for future billboards, it could also be processed in real time, giving the advertiser the ability to alter the advertisement dynamically depending on the audience's makeup and behaviour.
The billboard could collect biometric data on passers-by to 'determine whether the audience corresponds to a target demographic' and 'identify specific individuals in the target audience'. Microphones could capture conversations that reveal the audience's reaction to the ads, and proximity sensors could show how close people get to the billboards. Eye-tracking sensors could determine whether passers-by are looking at the ads, and for how long. Image recognition techniques and mobile data could be used to form a more focused profile of the audience.
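How such a billboard might decide what to show can be sketched in a few lines – the attribute names and campaigns below are invented for illustration, not drawn from the patent: detected audience attributes are matched against each campaign's target demographic, and the best-scoring creative wins.

```python
# Hypothetical sketch of real-time ad selection: each detected person is a
# dict of sensed attributes, each campaign declares a target demographic.
def pick_ad(audience, campaigns):
    """Return the campaign whose target best matches the current crowd."""
    def score(campaign):
        target = campaign["target"]
        return sum(1 for person in audience
                   if all(person.get(k) == v for k, v in target.items()))
    return max(campaigns, key=score)

audience = [
    {"age_band": "18-25", "gazing": True},
    {"age_band": "18-25", "gazing": False},
    {"age_band": "40-60", "gazing": True},
]
campaigns = [
    {"name": "sneakers", "target": {"age_band": "18-25"}},
    {"name": "pension plan", "target": {"age_band": "40-60"}},
]
print(pick_ad(audience, campaigns)["name"])  # "sneakers": two matches vs one
```

Trivial as the matching is, it is enough to make the ethical point: the decision of who sees what is taken continuously, silently, and on the basis of biometric attributes the passer-by never agreed to surrender.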

Piccadilly Circus, London
Smart billboards were launched in Piccadilly Circus in London in 2017. The boards were integrated with sensors that would pick up changes in the weather, the colour of cars in the vicinity, news and sports reports, and social media updates.
Elsewhere, advertising giant M&C Saatchi is currently testing advertising billboards with hidden Microsoft Kinect cameras that read viewers’ emotions and react according to whether a person’s facial expression is happy, sad or neutral. Emotion recognition software of this kind has become increasingly popular.
Of course, advertisers are meant to behave in such ways, and we, as consumers, can – and should – ignore their temptations. However, the billboard is constantly sorting, dividing up, and measuring each face. The algorithm is primed with a set of preconditioned parameters for what makes a buyer, and will target them as they walk past.
The same technology that can discern age, sex, race, emotion and class has many uses. The recent Russian app FindFace allows you to search an uploaded portrait against the 200 million users of the social network VKontakte. The developers recently inked a contract with the Moscow city administration to add their software to the 150,000 CCTV cameras around the city – again without citizens' permission or knowledge. Your social media timeline now helps the state track you across the city. This is not something anyone expects in the terms and conditions of use.
Furthermore, we should not assume that these techniques actually work. MIT recently road-tested three of the most prominent commercial face recognition systems, from Microsoft, IBM and Megvii. It found that they could correctly identify the gender of 99% of the white men they viewed, but accuracy fell away as they attempted to identify other demographic groups, with error rates rising to almost 35% for darker-skinned women.
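The arithmetic behind such an audit is simple to reproduce – the figures below are invented to echo the kind of disparity reported, not the study's actual data – and it shows how a respectable aggregate accuracy can hide a dramatic failure for one group:

```python
# Invented audit results: (correct, total) gender classifications per group.
audit = {
    "lighter-skinned men":   (99, 100),
    "lighter-skinned women": (93, 100),
    "darker-skinned men":    (88, 100),
    "darker-skinned women":  (65, 100),
}

# Aggregate accuracy across all four groups combined.
overall = sum(c for c, _ in audit.values()) / sum(t for _, t in audit.values())
print(f"overall: {overall:.0%}")  # 86% - looks respectable in aggregate

# Per-group accuracy exposes the disparity the headline number hides.
for group, (c, t) in audit.items():
    print(f"{group}: {c / t:.0%}")
```

This is why auditors insist on disaggregated reporting: a vendor quoting a single accuracy figure is, in effect, averaging away exactly the people the system fails.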
Despite claims of neutrality, code is always embedded with prejudices, blind spots, bias and worse. The systems that organise these sortings and separations are hidden within an algorithmic black box that allows only certain people to control, change or understand it. Yet it is being integrated into our urban environment as if it possessed superhuman omniscience.
The Apple iPhone X, launched in 2017, uses face recognition software to unlock the device. We have become increasingly habituated to using our faces as digital identifiers, which raises questions about the right to privacy and data protection. Face ID technology depends on the safe storage of, and access to, the vast database that holds these records. A study by the Dutch consumer organisation Consumentenbond found that just over 40% of the devices it tested, mostly running Android, could be unlocked using a photograph of the device’s owner. At the other end of the problem, Face ID was not compatible with face masks during the Covid-19 pandemic.
In January 2020 the European Union proposed a moratorium on the use of facial recognition software in public spaces, following complaints by a large number of human rights groups who saw its ubiquitous use as a violation of privacy. The proposal was limited to five years, while 'a sound methodology for assessing the impacts of this technology and possible risk management measures could be identified and developed'. However, the draft paper was shelved under extreme pressure from the technology and policing sectors. It is unlikely to be revived any time soon, partly as a consequence of the coronavirus pandemic.
In the aftermath of the pandemic, facial recognition software has become essential to many states’ attempts to track and trace the virus. This may be the turning point for the total integration of eye software across the surfaces of the city. The questions will be: who owns the software? And what control do we have over the data that it harvests?