AI in Criminal Justice: How it Can Become Biased

Written by Ruby Gardner

Potential of AI

It was Dave Waters who said, “the potential benefits of artificial intelligence are huge, so are the dangers.”

Uncritical praise of this phenomenon lets it seep into our systems of power, where it can become one of the ultimate ‘dangers’ of our time. This is not to say that merging these two worlds will only bring out the worst in humanity.

AI has led us to groundbreaking discoveries about our universe and this place we call home, such as the use of machine learning to chart ‘unmarked galaxies, stars, and black holes.’

My name is Ruby Gardner. I am a student at the American International School of Budapest, and I have an inquisitive mind for problem-solving.

This summer I took part in the Inspirit AI summer course, through which I learned the fundamental components of Python, machine learning techniques, and how to apply them to global issues.

Part of problem-solving is understanding the complexities of the mechanism--I learned a lot about trial and error and just how technical the AI process can be. This gave me enormous respect for the work done in this field--and for how far it has come.

More recently, I was accepted into the Inspirit AI ambassadorship program, which is why I am writing to you today. With this platform, I want to use my writing to take you on a journey into the criminal justice system--more specifically, AI in criminal justice.

What is Artificial Intelligence?

In simple terms, Artificial Intelligence is the problem-solving process filtered through machines rather than through natural systems of intelligence. The field of AI gained official recognition in 1956 at a workshop held at Dartmouth College.

Ever since, it has shaped how humans think about the world. In the early days of its development, AI was commonly equated with game theory and programming, as it was used to write checkers-playing programs. Now AI is applied to far more complicated human systems.

AI is used in the medical field, space exploration, manufacturing, security and surveillance, and even in the criminal justice system.

Criminal Justice System

Within the United States, the criminal justice system is set up to enforce the law, with the main objective of reducing crime. This seems simple and straightforward, but the system very much resembles an iceberg.

What you see as an outsider is only a sliver of the harm beneath the surface. Authorities’ lack of accountability, ‘unconstitutional overcriminalization’, racial dehumanization, and government funds not going toward the right supportive organizations are all reasons why this system is crumbling.

Merging Artificial Intelligence and Law

Then, we introduce AI in criminal justice.

Police departments use machine learning strategies, such as facial recognition, to retrieve information about criminal suspects. What researchers are finding is that these facial recognition systems hold an unbelievable amount of bias around skin tone and gender.

In the context of criminal detection, the facial recognition error rate is up to 34% higher for darker-skinned females than for lighter-skinned males.
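To make that disparity concrete, here is a minimal Python sketch of the kind of audit researchers run: measuring error rates separately for each demographic group. The match results below are hypothetical, invented only to mirror the gap described above.

```python
# A minimal fairness-audit sketch: compare facial-recognition error
# rates across demographic groups. All data here is hypothetical.

def error_rate(predictions, labels):
    """Fraction of predictions that disagree with the true labels."""
    wrong = sum(p != y for p, y in zip(predictions, labels))
    return wrong / len(labels)

# Hypothetical match results (1 = correct identification, 0 = error)
results_by_group = {
    "lighter-skinned males":  ([1] * 99 + [0] * 1,  [1] * 100),
    "darker-skinned females": ([1] * 66 + [0] * 34, [1] * 100),
}

for group, (preds, labels) in results_by_group.items():
    print(f"{group}: {error_rate(preds, labels):.0%} error rate")
```

An audit like this does not explain why the gap exists--it only makes the gap visible, which is the first step.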

Minority faces are not fed into the machine learning algorithm nearly as often, because the cheapest, most readily available training data consists largely of Caucasian male faces; as a result, the algorithm never properly learns to recognize minority features.
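One simple way to catch this problem before training is to audit the composition of the dataset itself. The group labels and counts below are hypothetical, chosen only to illustrate the kind of imbalance described above.

```python
from collections import Counter

# Hypothetical training-set composition for a face dataset.
faces = (["lighter male"] * 800 + ["lighter female"] * 100
         + ["darker male"] * 70 + ["darker female"] * 30)

counts = Counter(faces)
for group, n in counts.items():
    print(f"{group}: {n / len(faces):.0%} of training data")
```

If one group makes up 80% of the data and another only 3%, the model will inevitably be far better at recognizing the first group than the second.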

Another dangerous application of machine learning algorithms is the risk assessment tool, which is supposed to accurately predict criminal behavior for suspects already in the criminal justice system.

This tool reduces the details of a suspect's profile to a single number that estimates the probability of recidivism. The problem is that the crime data processed to produce this number is historical, and historical data carries the biases of the context it was collected in.
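To show how a whole person gets reduced to one number, here is a toy sketch of a risk score. The features, weights, and logistic form are all invented for illustration; real risk-assessment tools use proprietary models trained on historical records, but the basic idea--profile in, single probability out--is the same.

```python
import math

def risk_score(prior_arrests, age, weights=(0.35, -0.04), bias=-1.0):
    """Toy logistic score in (0, 1): an estimated probability of recidivism.
    Weights are hypothetical, not taken from any real tool."""
    z = bias + weights[0] * prior_arrests + weights[1] * age
    return 1 / (1 + math.exp(-z))

# A hypothetical 25-year-old with 3 prior arrests:
print(f"risk score: {risk_score(3, 25):.2f}")
```

Note what the score leaves out: everything about the person that was never encoded as a feature simply does not exist as far as the model is concerned.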

Machines do exactly what we tell them to do, and if we are not careful about what we tell them to process, they will end up finding ‘statistical correlations’ rather than ‘causations’.

For example, if an algorithm picks up that being Black is correlated with high recidivism, it has no in-depth reasoning or understanding of why that correlation exists. It is simply a correlational scoring mechanism that does not take any other factors into account.

Imagine being handed a number that predicts your recidivism rate based on machine-learned correlations with your physical features, rather than on your actual disposition and the severity of your criminal record.

Solutions

A solution for this is simple.

We have to provide diverse, detailed visual training data for the AI to perform at an accurate level. Machine learning is like taking a test.

If I do not study an array of subject material before taking the test, I will not perform well on the topics I did not practice. It all starts with the human: if we are biased, then the machine is biased.
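One concrete way to make the machine "study" every group is to rebalance the training set. The sketch below uses simple oversampling (duplicating examples from underrepresented groups); the dataset and group labels are hypothetical, and real projects would also collect genuinely new data rather than only duplicating what they have.

```python
import random

def oversample(dataset):
    """Duplicate examples from smaller groups until every group
    is the same size as the largest one.
    dataset: list of (example, group_label) pairs."""
    by_group = {}
    for example, group in dataset:
        by_group.setdefault(group, []).append(example)
    target = max(len(v) for v in by_group.values())
    balanced = []
    for group, examples in by_group.items():
        extra = random.choices(examples, k=target - len(examples))
        balanced.extend((e, group) for e in examples + extra)
    return balanced

# Hypothetical imbalanced dataset: 5 examples of group A, 2 of group B.
data = [("face_img", "A")] * 5 + [("face_img", "B")] * 2
balanced = oversample(data)
```

Oversampling is a patch, not a cure--duplicated faces add no new variety--but it illustrates the principle: the model should see every group often enough to learn its features.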

Conclusion

Overall, the criminal justice system has to mend its ways and become fairer and more just by tackling these seemingly minor details within AI that cause colossal damage to people's lives. We can utilize AI for our benefit as long as we learn to use it correctly.

Ethics becomes the most important consideration when implementing these modern techniques in our historical systems. As we can see, AI in criminal justice can ruin lives if humans do not stay in control of the reins.

Works Cited

Daftry, Shreyansh. “How NASA Uses AI and Machine Learning for Space Exploration.” Hyperight, 17 Aug. 2020, hyperight.com/how-nasa-uses-ai-and-machine-learning-for-space-exploration/#:~:text=AI%20has%20proven%20its%20great,navigation%2C%20monitoring%20and%20system%20control. Accessed 20 Oct. 2021.

Hao, Karen. “AI Is Sending People to Jail--and Getting It Wrong.” MIT Technology Review, 21 Jan. 2019, www.technologyreview.com/2019/01/21/137783/algorithms-criminal-justice-ai/. Accessed 20 Oct. 2021.

Najibi, Alex. “Racial Discrimination in Face Recognition Technology.” Science in the News, Harvard University, 24 Oct. 2020, sitn.hms.harvard.edu/flash/2020/racial-discrimination-in-face-recognition-technology/. Accessed 20 Oct. 2021.

Neily, Clark. “America's Criminal Justice System Is Rotten to the Core.” Cato Institute, 7 June 2020, www.cato.org/blog/americas-criminal-justice-system-rotten-core. Accessed 20 Oct. 2021.

Simonite, Tom. “The Best Algorithms Struggle to Recognize Black Faces Equally.” Wired, 22 July 2019, www.wired.com/story/best-algorithms-struggle-recognize-black-faces-equally/. Accessed 20 Oct. 2021.
