Can Artificial Intelligence (AI) Programs Strengthen The Justice Process For Sexual Assault Victims?

A significant percentage of sexual assault survivors have little faith in adversarial legal systems. Poor treatment combined with low conviction rates has painted the legal process as hostile territory for many victims. According to a study by Janice Du Mont and Terri L. Myhr, only 5% of assaults are formally reported to the police, and only 11% of those reported cases lead to a conviction. These low rates were driven largely by victims’ concerns that police and courts would question their credibility. Susan Estrich’s book “Real Rape” notes that a victim’s perceived credibility becomes the primary determinant of whether charges will be filed. Factors like behaviour, occupation, physical appearance and whether the victim had been drinking all come into play. Even after giving a clear recollection of events and cooperating with the police, a victim can easily have the door slammed in their face.

Access-to-justice advocates argue that frontier technologies, specifically AI-powered chatbots, can bridge harmful gaps in a daunting and often ineffective justice process. They contend that trauma-sensitive AI applications can increase victims’ legal knowledge and trust. In an area where the law so commonly fails, could cognitive technologies be the necessary remedy?

POTENTIAL AI SOLUTIONS 

In Montreal, the chatbot “Botler AI” generates free legal information and guidance for sexual assault victims who have little knowledge of their rights. Its AI system uses deep learning trained on data from over 300,000 US and Canadian criminal court documents and complaints related to sexual harassment. This data allows the software to assess whether the user has standing under the criminal code and which specific laws may have been broken. Finally, the system generates an incident report that the user can give to the relevant local authorities if they so wish.
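To make the idea concrete, the matching step of such a tool can be pictured as ranking a complainant’s description against summaries of offence categories. The sketch below is purely illustrative and is not Botler AI’s actual system (which uses deep learning over hundreds of thousands of real documents); the offence labels and summaries here are hypothetical placeholders, and the similarity measure is a simple bag-of-words cosine score.

```python
# Toy illustration of intake-to-offence matching. NOT Botler AI's method:
# labels, summaries and the scoring approach are all simplified assumptions.
import math
from collections import Counter

# Hypothetical offence categories with short keyword summaries
OFFENCE_SUMMARIES = {
    "sexual assault": "unwanted sexual touching contact without consent force",
    "criminal harassment": "repeated following watching threatening conduct fear safety",
    "voyeurism": "secret observation recording private place nudity",
}

def vectorise(text: str) -> Counter:
    """Turn text into a bag-of-words frequency vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two word-frequency vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def classify(description: str) -> list[tuple[str, float]]:
    """Rank offence categories by similarity to the user's description."""
    v = vectorise(description)
    scores = {label: cosine(v, vectorise(summary))
              for label, summary in OFFENCE_SUMMARIES.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

ranked = classify("he kept following me and threatening me, I fear for my safety")
# "criminal harassment" ranks first for this description
```

A production system would replace the keyword matching with a model trained on real legal corpora, but the overall shape, taking a free-text account in and returning candidate legal categories out, is the same.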

AI systems like this are encrypted, available around the clock and programmed to give caring, non-judgemental responses. Additionally, the encrypted report provided by a service like Botler AI preserves the victim’s privacy. Botler AI aims to be an alternative resource for complainants to get informed and share their experiences comfortably, without fear of judgment. It empowers its users with a confidence grounded in tangible legal doctrine.

In Thailand, police Lieutenant Colonel Mekhiyanont began developing a similar chatbot, SisBot, as an alternative way to help victims. The chatbot demystifies the justice process and helps victims obtain the vital information they need to navigate the legal system. SisBot also advises victims on how best to preserve evidence of the incident, and it is capable of assisting formal police investigations.

In America, advances in AI-powered forensic DNA technology and robotics are improving the viability of rape kits. For example, criminalists in the Oakland Police Department have used AI to develop a speedier and more efficient way to distinguish the victim’s genetic material from the attacker’s. New technologies are also being developed to reduce the rape kit backlog. New robotic equipment has allowed states like Ohio to process nearly 14,000 backlogged rape kits, helping to identify over 300 serial rapists.

DRAWBACKS

Regardless of this untapped potential, legal technology is shrouded in as much skepticism as optimism. Apps should not be considered the elixir for entrenched societal injustices. The promises of AI-assisted justice may still fall short if our prejudiced notions surrounding sexual violence remain unchallenged. For example, new rape kit technologies have produced positive results in America. However, a study conducted by Michigan State University found that rape kit backlogs were considerably shaped by gender, race and socio-economic bias. Women of lower economic status, sex workers and those from marginalised backgrounds were all less likely to have their kits tested, under the biased assumption that they were less credible.

Can a bot erase the rape culture that allows for such failings? How does an algorithm answer for the social structures that deem women careless and “asking for it”? Can a bot understand the intersectional experiences of BIPOC and/or queer women? How can data output be trusted in a society of pervasive sex-based biases? Data is also only as good as the humans who collect and input it, and those humans carry biases, whether explicit or unconscious. The algorithms used to assess that data are not free from sin either: algorithmic bias built on systemic assumptions produces skewed data and unfair outcomes, especially for marginalised victims. These considerations must be recognised when implementing AI-driven justice. AI cannot function as judge, jury and executioner. It is a foundation, a starting ground for investigations that must be adjudicated by humans.

As legal tools, these technologies increase access to information for victims. However, the dialogue surrounding legal technology and access to justice must not assume that equality has a single solution. Algorithms cannot replace pressing political, socio-cultural and institutional reforms. Issues of jurisdiction, liability and privacy must be resolved to establish technology as a legitimate path in the justice process. Practical barriers, such as funding and the training of legal practitioners, also prevent the full implementation of AI-driven justice.

Beyond these cautionary notes, where the justice system fails, AI can thrive. The legal system can be a complicated and intimidating maze for many survivors, and artificial intelligence offers a new and essential alternative to our legal norms that may present real solutions. Earl Warren once remarked, “It is the spirit and not the form of law that keeps justice alive.” By listening to survivors and providing credible procedural advice, cognitive legal technologies address victims’ needs and illustrate the steps they can take to seek appropriate justice. Trauma-sensitive AI can create a new method for justice, one where the voices of sexual assault survivors are paramount.

Mayowa is a final-year Honours Politics student with a certificate in Computing. As an aspiring technology lawyer, she is passionate about internet governance and digital innovation, and believes that technology should be used to create an equitable future. When not writing, she can be found compiling JavaScript.