    The Autonomy Issue

    High Risk

    The Automation of Pre-Trial Detention
    In the United States, “innocent until proven guilty” remains an illusory promise. Racial bias determines bail decisions, and it doesn’t get any better when you replace racist judges with algorithm-driven Risk Assessment Tools – no matter how “objective” or “neutral” their advocates claim them to be.

    The right-hand side of the Dorchester Municipal Courtroom in Boston, MA is partially cordoned off with a glass box that sits roughly diagonal to the judge’s bench. Really, this glass box is a cage. In the chamber behind the cage are rows of long wooden benches that seat observers, family, law students, and those defendants who weren’t held on bail or who could afford to pay it. In the cage wait the defendants who were held on bail they couldn’t pay. Even before they are deemed officially ‘guilty,’ poor folks – almost entirely black and brown people in this particular courtroom – are on display like prisoners. Those whose fate is the glass cage and not the wooden benches are shackled at the hands and feet. They are unshowered, having to make their court appearance in whatever clothes they were arrested in. And they are spoken to through the glass wall, straining to hear what the public defender is advising and what the judge is pronouncing.

    I am here doing Court Watch for the Massachusetts Bail Fund (MBF), which organizes primarily to pay bail for people who can’t afford it, but also functions as a bit of a guerrilla research operation. MBF volunteers have been observing court hearings to see how a new initiative announced by District Attorney Marian Ryan is playing out in practice. In January 2018, Ryan instructed prosecutors to stop requesting bail for “non-violent minor offenses,” which is vague and, according to lawyer and MBF founder Atara Rich-Shea, “not a legal category that can be predicted or enforced.” Although this policy may seem positive, given its ambiguity it’s unlikely to have an impact on the number of people held in pre-trial detention. Additionally, the announcement came within months of a bill passed in the Massachusetts House that mandates the use of Risk Assessment Instruments (RAIs) when setting bail. So we come to court to make sense of how these “reforms” play out in actual hearings.

    I am also here because growing up poor means I know the significance of even a $50 bail fee. I am here because as an activist I know how a mass arrest can drain a political organization’s funds. I am here because I spent two years in pre-trial detention centers working with juvenile boys who were about to be tried as adults. And I am here because a close friend of mine took his life in a jail cell after being detained for possession of marijuana; my grief coaxes me to interrogate. My grief coaxes me to fight.

    In contrast to those in the cage, the wooden-bench defendants sit in open space. They have had the opportunity to shower and put on fresh clothes. They can approach the judge’s bench and shake their lawyer’s hand. 

    Of course, both the caged defendants and the uncaged defendants are, once entered into the criminal punishment system, never actually free. Their name in a police blotter, their wrists in chained cuffs, their face attached to a criminal tracking number – these things rob them of autonomy before any conviction is made. There is a reason, after all, many of us impacted by and working against policing and the courts refer to the “criminal punishment system,” rather than the euphemistic and illusory “justice system.” 

    But the difference between the experiences of those behind the cage and those in front of it is still a relevant one: primarily this difference is about money, but it is also about who the court does and doesn’t deem a ‘risk.’

    The question of risk has been a guiding principle of the pre-trial stage of the criminal legal process since the mid-20th century, codified in distinct ways in the Bail Reform Act of 1966 and the Bail Reform Act of 1984, which authorized federal preventive detention. The 1966 law instructed judges to assess whether or not a defendant was a “flight risk” – that is, whether or not the defendant would try to flee the jurisdiction before their trial in order to avoid prosecution. The 1984 act supplemented rather than overturned the 1966 act by additionally requiring judges to determine if the release of the arrested person would be a risk to “the safety of the community.” These decisions are often driven by racial bias, as a Princeton study concluded last fall, resulting in the detention, at disproportionate rates, of poor people of color.

    In an effort to make assessment less “biased,” tools and instruments have been created to make these calls using empirical data: prior convictions, residency, employment, whether or not the defendant has a working phone, mental health, and so on. The instruments are created by both public and private entities and take a variety of forms, most of them including a round-up of historical records that claim to be detached from race and gender. For example, the Laura and John Arnold Foundation’s “Public Safety Assessment” (PSA) tool tracks “a defendant’s age, current charge, and key aspects of a person’s criminal history,” but “does not take into account race, gender, employment status, level of education, or history of substance use.” According to advocates, RAIs allow “objective” data to help determine whether an arrestee will appear at their given court date, and claim to do so in a way that is “race- and gender-neutral.”

    Couched in the language of progressive reform, these instruments are presented as beneficial to the “nonviolent” offenders, the “good” offenders, in contrast to the “bad” ones, the deserving ones. But this reifies logics of white supremacy and capitalism that maintain the state as a legitimate authority on what qualifies as either innocent or deviant. Many RAI advocates fail to realize that racism is always already woven into the very fabric of the criminal legal system.

    Using algorithms may even exacerbate racial bias, given that these tools have been shown to do the opposite of what they are supposedly designed to do. Take, for example, a ProPublica study of computer software used to predict recidivism among criminal defendants in Broward County, Florida. The study found that the software produces outcomes biased against black defendants: even when controlling for other relevant factors, black people were 77 percent more likely than whites to be deemed at risk of committing a future violent crime, and 45 percent more likely to be deemed at risk of committing any crime at all. Even when race isn’t explicitly a factor used by data machines, it is an implicit one, since the location of a defendant’s home, their work history, and their mental health can’t be properly understood unless we consider the history and stubborn persistence of racism in our culture.

    Data ultimately reports not on individuals, but on individuals’ relationship to a racist system. And although seeking to create better, less racially coded data may seem like a solution, it is important to understand how data has been used historically, and is still used today, as a punitive measure against people of color and the poor. Poor communities and communities of color are disproportionately objects of carceral surveillance and control. As a result, these communities have trails of data ready to be leveraged against them, irrespective of whether they’ve formally interacted with the criminal legal system. According to Virginia Eubanks in her book Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, low-income communities, both white and non-white, have always been subject to “moralistic and punitive poverty management strategies,” and digitizing those strategies is a continuation of an age-old practice, not a departure from it. Eubanks traces data’s insidious relationship to poor people and people of color across history, from the nineteenth-century poorhouse (a progenitor of today’s prison) and the eugenics movement, through the scientific charity movement that attempted to sort the deserving from the undeserving poor, through the New Deal’s punitive underbelly, all the way to today’s bleak state of data-driven welfare programs. In her research, Eubanks finds again and again that the algorithmic collection of information by the state is never in service of better aiding marginalized communities, but rather in the service of better policing them.

    From national borders to hourly-wage work, real freedom is elusive to us all, but those who are disproportionately targeted by police and the prison system demonstrate, to exaggerated degrees, what Judith Butler argues is the “public dimension” of the body. In Undoing Gender, she writes, “Although we struggle for rights over our own bodies, the very bodies for which we struggle are not quite ever only our own... [C]onstituted as a social phenomenon in the public sphere, my body is mine and is not mine.” Under white supremacy and capitalism, the criminalized body is never not a site for public consumption, and thus is never actually free. RAI-generated data won’t undermine this – it will further reveal it. And when the introduction of this supposedly objective data into bail decisions results in a false sense of successful reform, it is a pyrrhic victory. 

    In the Dorchester courtroom, I hear explanations from people behind the glass that actually reinforce why they were held on bail in the first place. An affective mix of nervous and pleading narratives is translated by the public defenders trying to explain context to the judge: this person is homeless; this person is in need of mental health care that they can’t afford. And yet, it is these same conditions that initially kept them held. RAIs don’t fix a broken system; they maintain a system that functions exactly as it’s designed to: as an arm of the State that preserves social control over poor people and people of color. And data – “objective” as it may be – will never absolve it of that.
