By: Sehseh K. Sanan

Procedural Due Process is embedded in the Constitution and demands that a person’s life, liberty, and property are not denied without “notice, opportunity to be heard, and a decision by a neutral decisionmaker.”[1]

But what is a neutral decisionmaker? Typically, we would think of a judge or a jury: someone who is not a party to the case, who listens to the facts, applies the law, and arrives at a decision. But what if the decisionmaker were a computer? Specifically, a computer deciding how long someone should be in jail.

Since approximately 2016, courts around the world have begun using algorithms to help determine sentences.[2] Before algorithms, judges in the U.S. had to rely on either sentencing guidelines or mandatory sentencing.[3] The Federal Sentencing Guidelines are provisions published by the federal government that tie punishment to factors such as “the subjective guilt of the defendant and to the harm caused by his acts.”[4] Judges are not mandated to follow the Guidelines, but they are required to explain why they are departing from the federal suggestions.[5] Under mandatory sentencing, judges are required to impose minimum sentences based on the charges the prosecutor brings.[6] This is most frequently seen in drug and drug-related cases.[7] What both of these systems have in common is that they try to assess the risk an offender poses to society and how long the offender should be incarcerated to right the wrong.

Figure 1: Demographic differences in federal sentencing.[8]

Artificial intelligence and algorithms claim to do the same thing as the earlier systems: assess risk and attempt to deter the offender.[9] These algorithms take into consideration factors such as “(1) the likelihood that the defendant will re-offend before trial (‘recidivism risk’) and (2) the likelihood the defendant will fail to appear at trial.” While this may seem beneficial from a utilitarian perspective, the rise of algorithmic risk assessment has had devastating and long-lasting effects.[10] Darnell Gates, for example, who served time from 2013 to 2018 for running a car into a house and violently threatening a domestic partner, was deemed high risk and did not even know it.[11] Further, because these algorithms are built by human beings, the builders’ innate biases are built into them.[12] It is no secret that people of color, particularly Black men, are more likely to receive longer sentences.[13]
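To make concrete how such a tool might turn a handful of inputs into a “risk” label, here is a minimal, hypothetical sketch in Python. Every feature, weight, and cutoff below is an assumption invented for illustration; it is not the logic of any tool actually used by a court, only a picture of how a weighted score gets translated into the label a judge sees.

# Hypothetical illustration only: a toy "risk score" built from a weighted
# combination of factors. The feature names, weights, and cutoffs are invented
# for this sketch and do not come from any tool deployed in courts.
from dataclasses import dataclass
import math

@dataclass
class Defendant:
    prior_arrests: int             # number of prior arrests
    age: int                       # age in years
    prior_failures_to_appear: int  # missed court dates in the past

def risk_score(d: Defendant) -> float:
    """Combine the factors into a single score between 0 and 1."""
    # Linear combination of the inputs, squashed into (0, 1) with a logistic curve.
    z = (0.40 * d.prior_arrests
         - 0.05 * (d.age - 18)
         + 0.60 * d.prior_failures_to_appear
         - 1.0)
    return 1.0 / (1.0 + math.exp(-z))

def risk_band(score: float) -> str:
    """Map the numeric score onto the label a judge would actually see."""
    if score < 0.33:
        return "low"
    if score < 0.66:
        return "medium"
    return "high"

# Example: a 24-year-old with three prior arrests and one missed court date.
print(risk_band(risk_score(Defendant(prior_arrests=3, age=24, prior_failures_to_appear=1))))
# prints: medium

Even in this toy version, the inputs do the real work: a feature like prior arrests reflects where and how policing happens, which is exactly how a formula that never mentions race can still reproduce the disparities described above.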

So, where do we go from here? Mandatory sentences are not the answer, as they impose higher sentences even on first-time offenders.[14] The Guidelines also carry racial biases.[15] Personally, if the goal of criminal justice and prisons is to reform offenders, then I think the criminal justice system should be reshaped to match that goal. Prisoners should receive education and a support system that will push them not to reoffend. We as a society could focus on a combination of rehabilitative and restorative justice.[16] Such a combination would let the support of community and society ease the troubles and obstacles that offenders face once they leave prison.[17] Maybe this could lead to a better society. But either way, we should definitely not let computers and algorithms decide.


[1] Procedural Due Process, Legal Information Institute, Cornell Law School, https://www.law.cornell.edu/wex/procedural_due_process (last visited Feb. 9, 2020).

[2] Danielle Kehl, Priscilla Guo & Samuel Kessler, Algorithms in the Criminal Justice System: Assessing the Use of Risk Assessments in Sentencing, Responsive Communities (July 2017), https://cyber.harvard.edu/publications/2017/07/Algorithms.

[3] Federal Sentencing Guidelines, Legal Information Institute, Cornell Law School, https://www.law.cornell.edu/wex/federal_sentencing_guidelines (last visited Feb. 9, 2020); Mandatory Minimums and Sentencing Reform, Criminal Justice Policy Foundation, https://www.cjpf.org/mandatory-minimums (last visited Feb. 9, 2020).

[4] Federal Sentencing Guidelines, Legal Information Institute, Cornell Law School, https://www.law.cornell.edu/wex/federal_sentencing_guidelines (last visited Feb. 9, 2020).

[5] Id.

[6] Mandatory Minimums and Sentencing Reform, Criminal Justice Policy Foundation, https://www.cjpf.org/mandatory-minimums (last visited Feb. 9, 2020).

[7] Id.

[8] Demographic Differences in Sentencing, U.S. Sentencing Comm’n, https://www.ussc.gov/sites/default/files/pdf/research-and-publications/research-publications/2017/20171114_Demographics.pdf (last visited Feb. 9, 2020).

[9] Algorithms in the Criminal Justice System: Pre-Trial Assessment Tools, Electronic Privacy Information Center, https://epic.org/algorithmic-transparency/crim-justice/ (last visited Feb. 9, 2020).

[10] Cade Metz & Adam Satariano, An Algorithm That Grants Freedom, or Takes It Away, N.Y. Times (Feb. 6, 2020), https://www.nytimes.com/2020/02/06/technology/predictive-algorithms-crime.html (last visited Feb. 6, 2020).

[11] Id.

[12] Id.

[13] Demographic Differences in Sentencing, U.S. Sentencing Comm’n, https://www.ussc.gov/sites/default/files/pdf/research-and-publications/research-publications/2017/20171114_Demographics.pdf (last visited Feb. 9, 2020).

[14] Mandatory Minimums and Sentencing Reform, Criminal Justice Policy Foundation, https://www.cjpf.org/mandatory-minimums (last visited Feb. 9, 2020).

[15] Racial, Ethnic, and Gender Disparities in Federal Sentencing Today, U.S. Sentencing Comm’n, https://www.ussc.gov/sites/default/files/pdf/research-and-publications/research-projects-and-surveys/miscellaneous/15-year-study/chap4.pdf (last visited Feb. 9, 2020).

[16] David Best, Pathways to Recovery and Desistance: The Role of the Social Contagion of Hope (2019).

[17] Id.