In recent years, India has increased its reliance on facial recognition technologies (FRTs) in policing. This aligns with a broader international shift towards the use of artificial intelligence (AI). According to a report by the Carnegie Endowment, at least 56 countries are currently using FRTs in policing. As AI is increasingly deployed, the potential threat of heightened policing of historically ostracised communities has also grown.
A significant issue with the deployment of such technology is that it is often presented as a progressive form of policing that ensures efficiency within the criminal justice system. In the case of India, however, the technology was implemented without sufficient legislative discussion or feedback from potentially affected stakeholders. Moreover, because these policies exclude affected individuals, they go against the democratic ethos on which the law itself may have been established.
The rapid implementation of such technology deepens fears of a surveillance state, especially among minority communities that have been on the receiving end of extrajudicial police action and police brutality in the past.
DISPROPORTIONATE SURVEILLANCE & INCREASED POLICING
AI is not impartial. The technology relies on machine learning models trained on data sets that reflect past trends in arrests. It is therefore not error free, and issues such as mischaracterisation based on generalised traits are common. Police forces have historically acted in discriminatory ways against socio-economically weak minorities, which means the data sets available to current AI systems are skewed against individuals from certain ethnicities and geographical regions, as well as against descriptions based on gender and religion. When such technology is operated in regions densely populated by these minorities, using data sets that have not been corrected or reformed, the likelihood of individuals being wrongly singled out leaves space for exploitation.
The Vidhi Centre for Legal Policy, in a report, highlighted that the highest density of police stations in Delhi was in parts of the city where the representation of Muslims (a minority community in India) was high. This meant that the most policing and data collection was happening in these regions. The problem with this distribution lies in the data sets the technology gathers, which are disproportionately drawn from socio-economically weaker communities because the police were biased actors in the past, before reforms and checks were implemented.
Since the technology operates without a publicly accessible regulatory framework and is not open to public feedback, it also compounds the difficulty of locating responsibility when things go wrong. Possible solutions include explainable logs within the technology, i.e., a record of why the AI took a particular decision, as illustrated in the sketch below. Additionally, the use of diverse data sets can mitigate bias. Lastly, there must be sufficient checks and balances that allow decisions taken by the technology to be questioned and appealed against.
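To make the idea of an explainable log concrete, the following is a minimal, purely illustrative sketch (in Python) of what one auditable record per face-match decision might contain; the field names, thresholds, and identifiers are hypothetical and do not describe any system actually deployed in India.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class MatchDecisionLog:
    """One auditable record per face-match decision (hypothetical schema)."""
    camera_id: str
    match_score: float      # similarity score produced by the model
    threshold: float        # score above which the system flags a match
    model_version: str      # which model / training data produced the score
    reviewed_by: str | None = None  # human officer who confirmed or rejected
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    def flagged(self) -> bool:
        """A match is only flagged when the score clears the threshold."""
        return self.match_score >= self.threshold


# Example: a record that can later be questioned or appealed against
entry = MatchDecisionLog(
    camera_id="CAM-042",
    match_score=0.81,
    threshold=0.90,
    model_version="frt-model-2024-03",
)
print(entry.flagged())  # False: below the threshold, so no alert should be raised
```

The point of such a record is not technical sophistication but accountability: because the score, the threshold, the model version, and the reviewing officer are all written down at the moment of the decision, the decision can later be audited, questioned, and appealed against.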
The impending implementation of FRTs in India poses a threat to citizens, particularly in how policing practices will develop in the country and how their efficiency will be measured. It also raises the question of who will be held accountable if the determination of crime is made by technology rather than by individual people. This stems from the lack of deliberation and discussion in the public domain and the absence of a regulatory framework. It is essential to create checks and balances, especially for technology with such far-reaching effects on how the lives of citizens will be governed. The legislative push for this must come from a position that treats the citizen as the supreme stakeholder.
Jibran Khan is a fifth-year law student at Dr. Ram Manohar Lohiya National Law University, Lucknow. They are interested in Policy and Law, Arbitration and Constitutional Law.