Facial recognition is currently employed in many contexts, for instance in law enforcement and migration control. In addition, the mobile and social media industries use biometric data to improve their services or to profile users. However, this widespread use of facial recognition raises human rights concerns. Indeed, European bodies and institutions such as the European Union Agency for Fundamental Rights, the Council of Europe, and the European Parliament are already tracking activities associated with the use of facial recognition and identifying issues.
FACIAL RECOGNITION AND DISCRIMINATION CONCERNS
Facial recognition is an AI-based technology used for the verification, identification, or categorisation of individuals on the basis of biometric data. In practice, it can be defined as the automatic processing of digital images containing the faces of individuals for their identification, authentication, verification, or categorisation.
Facial recognition has the potential to lead to discrimination against certain populations, such as ethnic minorities. This occurs because the facial images used to train facial recognition algorithms consist mainly of white men, while women and other ethnic groups are underrepresented. In a recent resolution, the European Parliament underlined that algorithmic bias is the main problem related to this technology. Indeed, according to the European Parliament, such distortions in facial recognition could create discrimination concerns. In particular, facial recognition technologies could misidentify members of minority ethnic groups or LGBT people due to the non-homogeneity of the underlying data.
Concerning the European human rights system, it should be pointed out that article 21 of the Charter of Fundamental Rights of the European Union (CFR, also known as the Nice Charter) prohibits discrimination on any ground, including sex, age, religion, and ethnic origin, while article 14 of the European Convention on Human Rights (ECHR) enshrines the right to non-discrimination.
RIGHT TO RESPECT FOR PRIVATE LIFE AND DATA PROTECTION
To function, facial recognition software needs to collect and analyse biometric data to identify subjects, and this process could constitute an interference with the rights to respect for private life and to data protection, as provided in articles 7 and 8 CFR. Similarly, article 8 ECHR guarantees the right to respect for private and family life.
The European Court of Human Rights (ECtHR) has already ruled on some facial recognition cases in which it found that the right to respect for private life had been violated. In Gaughran v. United Kingdom, it held that the use of facial recognition tools on photos captured during a person’s arrest and later stored in a police database interfered with the right to respect for private life laid down in article 8 ECHR. The Court also stressed that retaining an arrested person’s photograph for an indefinite time breaches the same right.
This ruling followed Beghal v. United Kingdom, a case concerning interference with the right to respect for private life in the public context, in particular the use of facial recognition technology in airports and ports by immigration or anti-terrorism officers. In Beghal, the ECtHR concluded that such practices were consistent with article 8, paragraph 2 ECHR insofar as the measures were in accordance with the law of the country and pursued only public security and national defence purposes.
Moreover, other ECtHR precedents on video surveillance in the public and private sectors could be applied by extension to facial recognition. These cases involve infringements of the right to respect for private life in the contexts of law enforcement, border controls, workplace surveillance, and monitoring at universities. Concerning the public sphere, Peck v. United Kingdom refers to a police intelligence device used to identify criminals on the basis of photos stored in a database. Here, the ECtHR found that this practice conflicted with the right to respect for private life, as stated in article 8 ECHR.
Regarding the private sector, in López Ribalda and Others v. Spain and Antović and Mirković v. Montenegro, the ECtHR reviewed cases in which employers resorted to video surveillance systems to monitor their employees. On the one hand, in López Ribalda, the ECtHR ruled that this type of control in the workplace breaches the right to respect for private life insofar as it is necessary at least to notify employees about the use of surveillance technologies. On the other hand, in Antović and Mirković, the ECtHR established that this type of video surveillance should be time-limited.
Finally, the protection of the right to respect for private life against new technologies also appears in instruments adopted by the European Union. For instance, Regulation (EU) 2016/679 (GDPR) sets out a series of provisions limiting the use of facial recognition. In this regard, article 9 clarifies that the processing of special categories of personal data must be limited, while article 22 establishes that data subjects have “the right not to be subject to a decision based solely on automated processing, including profiling”.
FACIAL RECOGNITION AND FREEDOM OF EXPRESSION, OPINION, ASSEMBLY AND ASSOCIATION
Facial recognition may also come into conflict with further fundamental rights, such as the freedom of expression and information, established under articles 11.1 CFR and 10 ECHR, and the freedom of assembly and association, set out in articles 12.1 CFR and 11.2 ECHR. In particular, processing images obtained by video cameras in public spaces could interfere with individuals’ freedom of expression and opinion and could affect their freedom of assembly and association.
In fact, during social protests, facial recognition systems are extensively used by public authorities and police officers through video surveillance systems. In this regard, it should be noted that the ECtHR is still silent on this subject. Only the European Parliamentary Research Service has raised the mass-surveillance issue, in its paper “Regulating Facial Recognition” (EPRS Paper).
This EPRS Paper built on the work of the United Nations High Commissioner for Human Rights published in “The impact of new technologies on the promotion and protection of human rights in the context of assemblies, including peaceful protests” (UN Report). According to the UN Report, the domestic legal framework regulating mass-surveillance tools must be based on the principles of necessity and proportionality. Therefore, the indiscriminate and untargeted mass surveillance of individuals should not be allowed in the context of assemblies and peaceful protests.
In conclusion, facial recognition technologies nowadays pose significant human rights challenges. European institutions are increasingly aware of the risks linked to the use of such tools for surveillance purposes. Hence, human rights concerns relating to facial recognition devices cannot be ignored, and in the next few years the field of fundamental rights in Europe should evolve towards the regulation of these issues. Certainly, the ECtHR will play a key role in this development, but the simultaneous engagement of other European institutions and human rights watchdogs will also be very important.
Manon Eleonora Lagana is a PhD Candidate in Information Technology Law, European Data Protection Law and Human Rights at the University of Valencia, Spain. Previously, she earned an LLM in International and European Studies at the University of Valencia, Spain, as well as an LLM in Law and a Juris Doctor degree at the University of Pisa, Italy.