THE BENEFITS OF FRT
FRT uses cameras to scan the faces of anybody who walks past them, generating distinct biometric patterns. These patterns can then be matched against other images gathered from the internet and social media accounts. The aim of FRT is not only to identify unknown individuals but also to locate them quickly. It was these benefits that police in Florida, Georgia and Texas cited when defending their partnership with Clearview AI, a facial recognition technology company that has since become embroiled in multiple lawsuits.
WHEN FRT DOES NOT WORK
FRT poses a significant risk to an individual’s right to equality before the law. Article 26 of the International Covenant on Civil and Political Rights states that ‘[a]ll persons are equal before the law and are entitled without any discrimination to the equal protection of the law’. In the UK specifically, public authorities must comply with the Public Sector Equality Duty under the Equality Act 2010: public authorities such as the police must refrain from discriminatory behaviour and actively consider how their policies and actions could inadvertently disadvantage members of the public.
FRT undermines the right to equality and equal treatment because, as reported by Forbes, it is almost 100 times more likely to misidentify Black and Asian individuals than white men. This statistic illustrates a fundamental deficiency in current FRT. A 2012 study of the three most commonly used FRT algorithms found that all of them performed at least 5% worse on Black people. In one particularly alarming deployment, 92% of the identifications FRT made were false. Whilst a problem in and of itself, this unreliability is even more dangerous given the context and environment within which FRT is deployed: we live in a society where institutional racism, as evidenced in the Stephen Lawrence Inquiry, and racially motivated police violence are real, pervasive problems.
WHEN FRT DOES WORK
Even when FRT works, it still threatens an individual’s right to privacy. Article 8 of the European Convention on Human Rights (ECHR) provides that ‘everyone has the right to respect for his private and family life, his home and his correspondence’. Whilst undeniably broad, this provision does not represent a free-for-all. As noted by Amnesty International, FRT inevitably involves extensive monitoring and storage of an individual’s private data - without the consent of that individual, and often without any suspicion of misconduct - which, at its core, is indiscriminate mass surveillance.
There is a reason why the maxim from George Orwell’s 1984 - ‘Big Brother is watching you’ - has, almost singularly, embedded itself as a defining phrase of the twenty-first century. The dystopian and authoritarian society depicted in 1984 was characterised, among other things, by mass surveillance. No longer a dystopia, mass surveillance has become our collective reality. Indeed, Georgetown Law’s Center on Privacy and Technology estimated that half of US adults are already included in police FRT databases. Although our right to privacy may be limited where ‘necessary in a democratic society’, it is hard to imagine a scenario in which indiscriminate mass surveillance of this kind could constitute a legitimate and proportionate interference. At least, not if we continue to claim to be a country which adheres to democratic ideals.
THE FUTURE FOR FRT
R (Bridges) v Chief Constable of South Wales Police & Information Commissioner [2020] does not sound the death knell for FRT. In response to the Court of Appeal’s decision, a spokesperson for the Home Office stated that ‘the government is committed to empowering the police to use new technologies like facial recognition safely, within a strict legal framework’. IBM, Amazon and Microsoft are currently refusing to sell their FRT in response to the widespread backlash. However, it is worth noting that only IBM has committed to permanently ending such sales.
Campaigners at Liberty - the civil rights group responsible for bringing the legal challenge in R (Bridges) in conjunction with Ed Bridges - warn that FRT should never be extensively deployed, as this could mean the end of anonymity as we know it. Whilst employing robust parameters following a thorough review of the technology is a good starting point, it is not enough. FRT is just one manifestation of a society increasingly under watch; those in charge should not dismiss the possibility that any deployment of FRT may pose an unjustified interference with our rights.
Bethany is a law student with a keen interest in human rights, especially as relating to women and intersectionality. Having just completed her law degree at the University of Bristol, she will begin the LPC in September before embarking on a training contract in late 2021.