Of Risk Assessment and the Problems of Bias and Justice

In his Scatterplot post “algorithmic decisionmaking replaces your biases with someone else’s biases,” sociologist Dan Hirschman has written a good summary (with links) of recent discussions of the problems with COMPAS, a “risk assessment” tool that is used to decide whether people are released from prison or jail.  He calls particular attention to the story, from Rebecca Wexler’s article “Code of Silence” in Washington Monthly, of an inmate who figured out that he was being kept in prison by a heavily weighted subjective item asking “Does this person appear to have notable disciplinary issues?” The item was scored over and above the person’s actual disciplinary record and was weighted heavily enough to be the deciding factor in whether an inmate was released (i.e. changing the total score from 8 to 1 on a 10-point scale).  Other examples are provided of seemingly “objective” scales that actually give heavy weight to subjective opinions.

A study published by ProPublica found that COMPAS had a higher rate for Blacks than Whites of falsely predicting that a person would re-offend, and a higher rate for Whites than Blacks of falsely predicting that a person would not re-offend. A methodological comment published by the Brookings Institution argues that, given the different underlying rates of re-offense, this way of measuring bias builds in a statistical tendency toward finding exactly this result. It also raises the question of whether the people who were not released because they failed the risk assessment were prevented from re-offending by the treatment or confinement.
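The base-rate point can be made concrete with a small, purely hypothetical calculation (the numbers below are illustrative, not COMPAS data). If a tool is equally “accurate” for two groups in the sense that the same share of people it flags actually re-offend (equal positive predictive value) and it catches the same share of eventual re-offenders in each group (equal true positive rate), then arithmetic alone forces the group with the higher underlying re-offense rate to have a higher false positive rate:

```python
# A minimal sketch of the base-rate argument. All numbers are hypothetical.
# With equal PPV and TPR across groups, different base rates imply different
# false positive rates -- the pattern ProPublica reported.

def false_positive_rate(prevalence: float, ppv: float, tpr: float) -> float:
    """FPR implied by a fixed positive predictive value and true positive rate.

    Derivation: TP = TPR * N * prevalence, FP = TP * (1 - ppv) / ppv,
    negatives = N * (1 - prevalence), so FPR = FP / negatives.
    """
    return (prevalence / (1 - prevalence)) * ((1 - ppv) / ppv) * tpr

PPV = 0.6   # of those flagged high risk, 60% re-offend (same in both groups)
TPR = 0.7   # the tool flags 70% of eventual re-offenders (same in both groups)

for group, prevalence in [("group A", 0.5), ("group B", 0.3)]:  # hypothetical base rates
    fpr = false_positive_rate(prevalence, PPV, TPR)
    print(f"{group}: base rate {prevalence:.0%} -> false positive rate {fpr:.1%}")

# group A: base rate 50% -> false positive rate 46.7%
# group B: base rate 30% -> false positive rate 20.0%
```

Nothing in this toy example requires the tool to “see” race at all; the disparity in error rates falls out of the differing base rates.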

But there is another, deeper, problem with COMPAS and many other risk assessment tools: they inherently embody the consequences of structural discrimination. Even if subjective judgments are avoided, and net of whatever illegal acts they have actually committed, Black and Latino people in more intensively policed areas are more likely to have been caught and arrested. They are more likely to have been caught and arrested in the past, and they are more likely to be caught and arrested in the future. The social and economic indicators similarly embody inherent bias. People whose parents were low income or uneducated are less likely to have accumulated education and employment credentials and the opportunity to purchase a home. People who experience racial discrimination in the housing and employment markets are less likely to have a respectable place to live in a good neighborhood and a good job. People who come from heavily policed areas are going to have a higher percentage of relatives and friends who have been in trouble with the law. People who have experienced discrimination all their lives are going to have a more suspicious and reserved or hostile demeanor that makes them seem to have the wrong attitudes from the point of view of a social worker or prosecutor or judge. Someone without a place to live is more likely to get into trouble for vagrancy.

The instruments are identifying the correlates of trouble, not the causes. They take discriminatory hierarchies as inputs and return worse ones as outputs. These instruments could pass 100% of the statistical tests for prediction and still reproduce bias. But the problem is worse than that, because even without risk assessment instruments, the same built-in structural biases compound the consequences of disadvantage. The system is built on the idea that it should treat you worse every time it encounters you. If you were warned and released last time, you should be ticketed this time. If you were ticketed last time, you should be arrested this time. If you were given probation last time, you should get jail this time. And so forth. But as long as different places are policed differently, the execution of this seemingly logical rule will inevitably take discrimination and disadvantage as an input and generate more discrimination and disadvantage as an output.
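To see how the escalation logic interacts with uneven policing, consider a toy simulation (everything here is hypothetical: the catch probabilities, the number of offenses, and the sanction ladder). Two groups of people behave identically, but one lives in a more heavily policed area and so is caught more often, climbing the warning-to-jail ladder faster:

```python
# A toy simulation of the escalation logic described above. All numbers are hypothetical.
# Identical behavior + different policing intensity -> very different sanction outcomes.
import random

SANCTIONS = ["warning", "ticket", "arrest", "probation", "jail"]

def simulate_person(p_caught: float, offenses: int = 10, rng=random) -> str:
    """Return the most severe sanction reached after a fixed number of offenses."""
    level = -1  # no contact with the system yet
    for _ in range(offenses):
        if rng.random() < p_caught:               # detection depends on policing, not behavior
            level = min(level + 1, len(SANCTIONS) - 1)
    return SANCTIONS[level] if level >= 0 else "none"

random.seed(0)
for area, p_caught in [("lightly policed", 0.1), ("heavily policed", 0.4)]:
    outcomes = [simulate_person(p_caught) for _ in range(10_000)]
    jailed = outcomes.count("jail") / len(outcomes)
    print(f"{area}: share reaching 'jail' = {jailed:.1%}")
```

In a run like this, almost no one in the lightly policed area ever reaches the top of the ladder, while a large share in the heavily policed area does, even though the simulated offending behavior is exactly the same.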

A logically related issue arises with fines and court costs. In the US these are flat fees per offense, regardless of the offender’s wealth. A fine of $200 that is annoying but affordable for an average person is barely spare change for a wealthy person and impossibly unaffordable for a low-income person, so the effective penalty is much higher for a low-income person, who may instead have to spend time in jail, than for an affluent person. As reported in an article in The Atlantic, fines in many Scandinavian countries are instead calculated as a percentage of income; for example, “Finland’s system for calculating fines is relatively simple: It starts with an estimate of the amount of spending money a Finn has for one day, and then divides that by two—the resulting number is considered a reasonable amount of spending money to deprive the offender of. Then, based on the severity of the crime, the system has rules for how many days the offender must go without that money.”
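The arithmetic of a day-fine is simple enough to sketch. The version below follows only the description quoted above; the income figures, the 30-day month, and the 12-day offense are my own hypothetical inputs, and the real Finnish rules include deductions and minimums not modeled here:

```python
# A rough sketch of the day-fine arithmetic described in the quoted passage.
# All specific numbers are hypothetical illustrations, not actual Finnish rules.

def day_fine(monthly_net_income: float, days: int) -> float:
    """Fine = number of days (set by offense severity) x half of one day's spending money."""
    daily_spending_money = monthly_net_income / 30   # crude estimate of one day's money
    one_day_amount = daily_spending_money / 2        # "divides that by two" per the article
    return days * one_day_amount

# The same 12-day offense costs very different absolute amounts,
# but roughly the same share of each person's income.
for monthly_income in (1_500, 4_000, 50_000):        # hypothetical offenders
    fine = day_fine(monthly_income, days=12)
    print(f"income {monthly_income:>7,}/mo -> fine {fine:>9,.2f}")
```

The point of the example is the contrast with a flat fee: the sanction scales with means, so the “bite” of the punishment is comparable across offenders rather than falling hardest on the poorest.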

It would be a big change in the US just to recognize that charging the same monetary fine for the same offense is not “equal treatment” but, in fact, unequal treatment that punishes lower-income people more than higher-income people. It would be an even bigger change in the US to recognize that seemingly neutral algorithms collect all the inequalities and discrimination in society and systematize them. The only way to treat people justly and fairly is to recognize the differences in their circumstances. But this, in turn, requires deciding which differences deserve compensation and which do not. I see no impersonal or mathematical way to avoid value judgments in this. But bringing the issue out into the open can at least allow us to talk about it.

EDIT: In response to social media dialogues, I want to be clear that I think there is a positive place for risk assessment tools and other systematic assessments IF they are open to inspection. There is a problem when all judgments are made subjectively, and I do think “objective” assessment tools can help this problem. But only if the content of these tools is open to inspection, reflection and critique. It seems contrary to all principles of justice to allow a secret proprietary tool to be used in an unaccountable way.

 
