ONE OF 8 BILLION

In the third of a special five-part series, Julia Dressel, with special guest Tarra Simmons, discusses how Recidiviz is confronting racism and bias in criminal justice data.

Show Notes

SUMMARY
In the third of a special five-part series, Julia Dressel, with special guest Tarra Simmons, discusses how Recidiviz is confronting racism and bias in criminal justice data.

GUESTS
Julia Dressel, software engineer at Recidiviz

Tarra Simmons, Washington State legislator who spent time in prison

HIGHLIGHTS
From Julia Dressel:
  • The massive growth of the criminal justice system in the United States makes it particularly difficult to break the cycles that have disproportionately ensnared people of color.
  • The racism that has been ingrained in our criminal justice system is perpetuated when we use statistics from the past to predict outcomes in the future.
  • We need to build a new system that is not burdened by mistakes of the past, but rather uses new data to show what works and how we can shrink the prison population safely and equitably.

From Tarra Simmons:
  • Tarra was inspired to pursue a law degree by the lawyers she met with while in prison, but once she was out she had to overcome multiple barriers to achieve that dream.
  • Prisons in many cases perpetuate violence, inflicting trauma on people who have been abused themselves, so understanding their stories is important as we look for a better way forward.

LINKS

What is ONE OF 8 BILLION?

This is ONE OF 8 BILLION, a podcast about all of us.

We all have a story, don’t we? We’ve all had successes and failures, joy and disappointment, love and sadness. And yet, we’ve all made it to here… to right now! Our stories are one amongst eight billion others… eight billion other stories, each of them unique, each of them grand in their own way, and each of them a window into the humanity that connects us all. ONE OF 8 BILLION tells the life stories of people from around the world. More at 1of8b.com

ONE OF 8 BILLION is supported by TEN7, a technology studio whose mission is to Make Things That Matter. Online at ten7.com

Hey everyone!

Welcome to the TEN7 Podcast and the
third episode of a special five-part

series called “Meeting the Moment: Using
Data to Reimagine Criminal Justice”

I’m your host Ivan Stegic.

This series is a partnership with
Recidiviz, a nonprofit organization that

is using data-driven tools to help guide
change in the criminal justice system.

Recidiviz experts are helping us explore
a wide variety of angles related to

criminal justice and our prison system.

We’re also looking for signs of
hope, that maybe our country is

finally ready to meet this challenge
and find a new path forward.

Our mission at TEN7 is to “Make Things
That Matter,” so this series fits with

our values and our desire to do our
part to make the world a better place.

In the first episode of the series, we
learned that mass incarceration is a

uniquely American problem, particularly
targeting people of color and women.

In our second episode, we examined how
data might help states start to reverse

our reliance on mass incarceration,
if we can improve the ways that data

is gathered, shared and interpreted.

In this episode, we’re going to examine
the issue of bias in data and technology,

and what we can do to prevent that
bias from perpetuating the inequality

in our criminal justice system.

But first, I want to continue with
the story of Tarra Simmons, a recently

elected state legislator in Washington
who spent time in prison and who has

a unique perspective on the challenges
that face people with criminal records.

We are leading off each episode of
this series with Tarra’s voice as

a reminder that criminal justice
is not about numbers, it’s about

human lives and it’s about hope.

Here is Tarra Simmons, continuing
her story and discussing how she

first got the idea to become a lawyer
while she was serving time in prison.

When I met the lawyers when I was
in prison they were coming in, they

were actually mostly law students
and they were coming in to help the

women with their family law cases.

So, when you go to prison, oftentimes
you’re losing custody of your kids

and they were helping the women kind
of advocate for their parental rights

and so I met them through that.

They told me that I would be a good
lawyer because I was catching on

quickly and I was a good advocate
and things like that, and I was

helping other women in the prison.

I asked them, Do you think I could
because of my criminal record?

They didn’t know for sure but they did
give me the name of a law professor to

call when I got out and so I wrote his
name down and I was out for probably

eight months before I contacted him,
and I asked him what he thought

about my chances, and I went through my
criminal record with him and he said,

All of your criminal record was tied
to trauma and substance use disorder,

and you’re addressing those and so I
think you would have a good chance.

So he became a friend and mentor of
mine and then I met other friends and

mentors along the way including people
who had criminal history and had done

time in prison who became lawyers
and collectively they really helped

me gain entrance into law school.

And so practicing law and holding
a law degree are kind

of two different things, right?

You have to first get that certificate
that you can hang up on a wall, you

have to graduate, but then you have
to sit for the State Bar exam, right?

My understanding is you graduated
and then you weren’t allowed

to sit for the State Bar exam.

Yes, it was devastating.

It completely blindsided me because of the
fact that two years before I went through

that process, my dear friend, Professor
Shon Hopwood, at Georgetown University in

DC, he had robbed five banks at gunpoint
and served 12 years in federal prison.

[laughing] What?

Yeah.

And the Washington State Bar
Association had allowed him to become

an attorney here in Washington State.

So, for me I really thought
I had done everything.

I had been honest and open, and I had
helped so many people and I had over

100 letters of recommendation from not
only sitting judges, to prosecutors, to

people in my recovery community who
I’d helped, to supervisors who had been

supervising me over the course of my law
school career in different internships

and externships and I had so much to look
forward to and I had done everything.

I had kept up with the obligations
to keep my nursing license.

I had done all of these random
drug tests for five years.

I had done everything.

So it was really blindsiding to
me that they voted to not allow

me to sit for the bar exam.

It was really, really devastating.

So you couldn’t even sit for the exam.

Even if you had taken the exam,
they could’ve denied you after

that, but you couldn’t even have
an opportunity to take the exam.

My understanding is that you then
went to the State Supreme Court

in Washington to fight this.

Yes, and the person who ended up
representing me in the State Supreme

Court was none other than Shon
Hopwood [laughing] who had become

a lawyer in Washington State after
robbing five banks at gunpoint and

serving 12 years in federal prison.

So the irony of that alone [laughing]
was kind of amusing, and of course

he did amazing advocacy on my behalf.

Something happened that
day that never happens.

Usually it takes the Supreme Court
four or five months at least to issue

an opinion, and we left the courthouse
that day on November 16, 2017, and we

thought we might hear back in April and
came home and within hours the Supreme

Court issued a unanimous decision
allowing me to take the bar exam.

So it was pretty special.

That’s really amazing.

And it set some sort of precedent
in the law for others as well.

Right?

Yes.

I think now we finally have a
Supreme Court case that others can

use when they’re advocating for
their right to become an attorney.

What did you do that evening when
you found out about that result?

Oh my gosh.

It was just hours and hours
of talking with all of the

people who had supported me.

So many people supported me, and
so really, I spent the rest of that

night with my family at the dining
room table just fielding phone calls

from reporters and from friends and
supporters, and just letting them know

how much we appreciated their support.

We’ll hear more from Tarra in our next
episode, but before we begin to talk

about the issues related to bias in
technology, Tarra did offer some thoughts

on the ways that our criminal justice
system tends to draw on the pain of the

past and perpetuate it for the future.

I would say that there really is nobody
that struggles with being patient more

than I do, because I literally have
thousands of family members reaching

out to me and sharing their grief
and their trauma and their sadness,

because their loved one is incarcerated.

Hearing those stories over and over
again, the pain of those individuals,

but also survivors of crime too who
have a lot of untreated trauma and

pain, and I absolutely care about their
issues as well, and their healing.

I just don’t see them
as being tied together.

I think that the way forward
is continuing to break down the

victim-offender binary issue.

I think that that one is the
political challenge that we

really need to lean into.

I’m really grateful to be working
with a lot of survivors of crime

right now who are advocating for
transformational justice and for a

new way of holding people accountable
for harm and recognizing that

prisons are another form of violence.

Prisons are state-sanctioned violence,
violent for previously sexually abused

women that have to go to prison and be
ripped away from their kids and be strip

searched and be dehumanized, and to give
birth shackled to a bed, to the way we

are treating people, not just women, but
the men that are in prison and the stories

that they share around their parents
putting cigarette butts out on them,

and the pain of these individuals.

They’ve already had so much trauma
and then we put them in these

dehumanizing conditions where
they’re put in segregation and

locked in a cell for 23 hours a day.

We absolutely have to change this,
and I think as long as we continue to

highlight these stories, as long as we
continue to work around breaking down

what is violence and recognizing the
criminal justice system is actually

not helping, and showing the research
and the data about recidivism, and how

hard it is to get out of the cycle, how
hard it is to get a job, how hard it

is to get a place to live, and how that
in itself is creating more recidivism.

Tarra’s story illustrates how trauma can
be passed from generation to generation,

and how our criminal justice system
has, unfortunately, helped continue

these cycles rather than break them.

If we acknowledge that our criminal
justice system has roots in racism… and

if all of our data about criminal justice
reflects this historical bias… how do we

develop new technologies and programs to
create a different system going forward?

That’s the question that our
next guest is hoping to answer.

Julia Dressel is a software engineer
at Recidiviz, and she joins us now.

So, my role at Recidiviz is a software
engineer and I work on a number of

teams at Recidiviz but the core of
my work is on the infrastructure,

or data analytics infrastructure
that we’re building at Recidiviz.

And how did you manage
to get to Recidiviz?

What was the way you got there?

This story starts actually many years
ago, back when I was an undergrad in

college and I am one of the few software
engineers who was also a gender studies

major when I was back in college.

And studying both gender studies and
computer science simultaneously got

me super interested in situations
where there was technology that was

reinforcing any type of systemic bias or
discrimination that we see in our society.

So, when I was in school I ended up doing
some research on some of the algorithmic

risk assessment tools that are used in
the criminal justice system and this

is where I first started learning about
bias built into these tools, but also

started learning more about the problem
of mass incarceration as a whole, and

then a couple years ago I pretty randomly
came across Recidiviz and was so excited

because this was an organization that was
working exactly on a problem that I was

really passionate about and also it was
working in a way where me as a software

engineer could really contribute to what
they were getting done at Recidiviz.

Why are you so passionate about it?

Obviously there’s a left brain,
right brain thing going on here with

gender studies and computer science,
but what makes you so passionate?

It’s a huge, huge problem that we have
particularly in the United States.

We incarcerate, I think, around 25%
of the world’s prisoners even though

we only have around 4% of the world’s
population, and it’s this massive, massive

system that has grown so dramatically
over the past couple decades and impacts

particularly people of color in this
country, but impacts millions and

millions of people’s lives, not only the
people that are in the system but anyone

related to or in the community of anyone
who has gotten trapped in the system.

And so, I have found this pull to do
anything that I can to make the system

smaller, make it more fair and get
people out of the cycles that they’ve

fallen into through their interactions
in the criminal justice system.

I’d like to get some definitions out
there, so that we’re comfortable with

talking about words and our listeners are
able to understand these words as well.

Let’s start out by describing the context.

What’s the technical definition
of bias in technology?

Bias in general, not even just
specific to technology,

is when there’s a disproportionate weight
in favor of or against a certain thing.

So, when we’re thinking about bias
in technology, what we’re usually

talking about is discrimination against
a group of people that is happening

because of some certain tool, because
of how the tool’s being used usually.

And so it’s a fact that there are
technological tools where the use of that

tool is having a disproportionate negative
impact on a certain group of people.

And where do we see this typically
manifested in technology?

It actually comes up in a lot of
different types of technologies when

you’re thinking about credit scores,
algorithmic use of predictive analytics,

trying to determine which type of
student is going to be successful

if they’re admitted to a college.

We’ve seen algorithmic tools
used for college admissions.

We see algorithmic tools used a lot in
determining who qualifies for a bank loan.

It usually manifests in technology
that’s trying to make a prediction

about a certain individual, and
the way that prediction is being

made is having a disproportionate
negative impact on a group of people.

So that’s absolutely something we should
care about because we might actually

be affected by that bias and not even
know about it, and it might be unfair.

Absolutely.

So why is it said to be built-in?

That’s one of the things that we kind
of glean from the title and from the

context of this episode: we’re talking
about the built-in biases of technology.

It’s said to be built-in because no
matter how the given tool is used,

no matter the intent of the person
using it or the system using it, it’s

going to have this disparate impact.

And disparate impact is defined
as when there is disproportionate

negative impact on a certain
protected class or group of people.

And so, we say that the bias is built into
a technology when the kinds of decisions

that it’s making, and the decision criteria
that have been coded into how the tool makes

a prediction or a decision, are always going
to, on average, impact one group of

people more negatively than others.

And so, it depends on the technology,
and we can get into certain technologies,

but basically why it’s built-in is that
no matter what the application of a

given tool is, it’s going to
have a disproportionate impact.
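
To make “disparate impact” a little more concrete, here is a rough sketch in
Python of how you might measure it. The decisions and group labels are invented
for illustration; the idea is simply to compare how often a tool reaches a
negative decision for each group.

    # Hypothetical illustration of measuring disparate impact.
    # The decisions and group labels below are made up for the example.
    decisions = [
        # (group, negative_decision) -- True means the tool flagged or denied the person
        ("group_a", True), ("group_a", False), ("group_a", True), ("group_a", True),
        ("group_b", False), ("group_b", False), ("group_b", True), ("group_b", False),
    ]

    def negative_rate(group):
        outcomes = [negative for g, negative in decisions if g == group]
        return sum(outcomes) / len(outcomes)

    rate_a = negative_rate("group_a")  # 0.75 in this toy data
    rate_b = negative_rate("group_b")  # 0.25 in this toy data

    # A much higher negative-decision rate for one group is the kind of
    # disproportionate impact the term "disparate impact" describes.
    print(f"group_a: {rate_a:.2f}, group_b: {rate_b:.2f}, ratio: {rate_a / rate_b:.1f}x")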

So, all these words we keep hearing,
machine learning, artificial

intelligence, predictive analytics.

Let’s start with machine learning, or ML.

What is that?

An analogy that I really like to
use to describe machine learning,

because it’s one of those words that’s
thrown around a lot, is this: let’s say

that we are trying to teach a machine
to identify a picture of an apple.

So we want to be able to show a photo
to a machine and say Is this an apple?

And for the machine to say, Yep, that’s
an apple or no that’s not an apple.

And how you train a machine learning
model is you always have to give

it data to learn patterns from.

So, let’s say we’ve got thousands
of different pictures of

fruits and vegetables and we’ve
labeled what each of them are.

So, we’ve labeled either this is an
apple or this is not an apple and

the machine learns, Okay, this is
generally what an apple looks like.

It’s kind of round, red,
maybe has a leaf on the top.

And then you show this machine a
new picture that it hasn’t seen

before and you say Is this an apple?

And because you’ve loaded all of
this data to teach the machine the

patterns of what an apple usually
looks like, it can make a guess,

educated guess based on the patterns
that it has learned whether or not the

thing you’re showing it is an apple.

So, it’s basically how a machine learns
patterns that exist in a given data set

and then uses what it’s learned about
those patterns to make predictions about

a new thing that it hasn’t seen before.
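
As a rough sketch of the pattern Julia describes, here is a toy “apple or not
apple” model in Python. Real systems learn from photos; this example substitutes
made-up roundness/redness/leaf numbers and uses scikit-learn as one possible
library, so it is an illustration rather than anything Recidiviz builds.

    # Toy machine learning: learn "apple vs. not apple" from labeled examples,
    # then ask about something the model has never seen before.
    from sklearn.tree import DecisionTreeClassifier

    # Each row: [roundness (0-1), redness (0-1), has_leaf (0 or 1)] -- invented features
    training_features = [
        [0.9, 0.9, 1],  # apple
        [0.8, 0.7, 0],  # apple
        [0.9, 0.1, 0],  # lime
        [0.2, 0.9, 0],  # chili pepper
        [0.3, 0.1, 1],  # zucchini
    ]
    training_labels = ["apple", "apple", "not apple", "not apple", "not apple"]

    model = DecisionTreeClassifier().fit(training_features, training_labels)

    # A new piece of fruit the model was never shown during training.
    new_fruit = [[0.85, 0.8, 1]]
    print(model.predict(new_fruit))  # expected: ['apple'], based on the learned patterns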

Is that the artificial intelligence
part or is that something else?

Artificial intelligence
uses machine learning a lot.

Artificial intelligence we usually
define as when you’re asking a machine

to do a human-like task, and so,
looking at a picture and saying Is

this an apple, you know, maybe there’s
a person out there who needs to be

able to do that, but a more human-like
task would be, say, picking apples.

So if you have a robot and you want a
robot to be able to go out to a tree

and pick an apple, it needs to know if
something is a leaf or something is an

apple, and so, artificial intelligence
is when you load potentially a robot or

computer with machine learning so that it
knows, Okay, this is what an apple looks

like, and then you tell it, Okay, go pick
all the things that look like an apple.

So these things are not
mutually exclusive, they’re

building on top of each other.

Okay.

And then there’s this other
term, predictive analytics.

What does that mean?

Predictive analytics is when you are
using trends of what have happened,

usually historically, in order to predict
what is likely to happen in the future.

So, we can stay on this apple
analogy if it’s helpful. [laughing]

Yes, let’s do it.

[laughing]

Yeah, so, let’s say we own this huge
orchard and we’re trying to predict,

we’re trying to figure out how many
apples we’re going to be able to sell

next year, so predictive analytics would
be having accurate historical data on

This is the number of apples we usually
sell, given the number of trees we have

and we have figured out what the trends
are historically with this orchard, and

we use those trends, maybe we incorporate
weather trends, seasonal trends, etc.

to try to model and predict, Okay, given
what the weather’s supposed to be like in

the next season this is how many apples
we expect to produce in the next year.

So, it’s using trends of historical
information to predict what’s

likely to happen in the future.
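
In code, the orchard example might look something like this rough sketch: fit a
trend to past harvests and extrapolate one year forward. The yearly numbers are
invented, and a real model would fold in weather, tree counts, and so on.

    # Toy predictive analytics: use historical apple sales to project next year.
    import numpy as np

    years = np.array([2017, 2018, 2019, 2020, 2021])
    apples_sold = np.array([9800, 10150, 10600, 10900, 11300])  # made-up history

    # Learn the historical trend (a simple straight line here).
    slope, intercept = np.polyfit(years, apples_sold, deg=1)

    # Use that trend to predict what is likely to happen in the future.
    next_year = 2022
    forecast = slope * next_year + intercept
    print(f"Projected apples sold in {next_year}: {forecast:.0f}")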

And I noticed you used the word
accurate when you described the

data from the past that you’re going
to use to do predictive analytics,

and I would guess that data about
which crops were successful and what

the data for the apples look like,
should be pretty unbiased, right?

Unless there’s someone who is technically
or artificially leaning towards giving

the red apples more prevalence in the
data somehow over the green apples, right?

Yeah, so that’s usually a situation
where you’re probably hopefully

not going to have a lot of bias in
the data collection but you could.

Say some person doesn’t like green apples
at all, and so they’re going through

and they’re counting all the apples
and they are like, You know what, I’m

not going to count these green apples.

I don’t think green
apples count as apples.

So that could be how a personal bias in
what counts as x versus y could influence

the accuracy of the data that’s collected.

Okay, I think I understood that and
that’s why I wanted to bring that in

because I was thinking about what you
described as bias earlier and how that

might relate to this apple analogy.

So there’s this other phrase,
historical descriptive analytics.

What is that?

So, historical descriptive
analytics, if we just take out

historical for a second, descriptive
analytics is describing what exists.

So, can you count the number of apples
that were produced in October of 2020?

That would be one descriptive data point.

So, there’s nothing fancy going
on with descriptive statistics

basically, you’re just counting, you
are categorizing things and you are

counting what is in each category.

So you might get a little fancy with
like, Oh, this is the historical trend

that has happened or these are the
different rates of green apples to

red apples, or something like that.

But descriptive statistics is what we
just think about as basic math that

describes what has happened and when
we think about historical descriptive

statistics or descriptive analytics
it’s being able to have a dataset on

something that has happened over a
given amount of time and pull numbers,

pull metrics out of that data set.
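
A rough sketch of descriptive analytics on the same kind of made-up orchard
data, using pandas as one convenient way to do the counting; nothing is being
predicted here, only counted and categorized.

    # Descriptive analytics: just describe what happened, no predictions.
    import pandas as pd

    harvest = pd.DataFrame({
        "month":   ["2020-09", "2020-09", "2020-10", "2020-10", "2020-10"],
        "variety": ["red", "green", "red", "red", "green"],
        "count":   [120, 80, 200, 150, 60],
    })

    # How many apples were produced in October of 2020?
    october_total = harvest.loc[harvest["month"] == "2020-10", "count"].sum()

    # The rate of green apples to red apples across the whole data set.
    by_variety = harvest.groupby("variety")["count"].sum()

    print("October 2020 apples:", october_total)  # 410
    print(by_variety)                             # green 140, red 470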

Got it.

Okay, so, now that we got the definitions
all out of the way and we understand this

wonderful apple analogy, let’s talk about
bias being an issue in technology that’s

used in the criminal justice system.

You mentioned the stats and we’ve
talked about it in different episodes.

The fact that the U.S. has about 4%
of the world’s population but about
25% of the incarcerated population.

Tell me about the bias in the technology
that’s used in the criminal justice

system and why it’s important to
understand it in the context of the
U.S. criminal justice system.

What’s really important before you start
talking about any kind of bias with

technology that’s used in the criminal
justice system is an understanding of the

very, very specific racial context of the
United States criminal justice system.

And there’s some pretty alarming stats
that I can go through to kind of paint

this broad picture, but basically a
Black adult is almost six times more

likely to be incarcerated than a white
adult in the United States, and Hispanic

adults are around three times more likely
than non-Hispanic white adults to be

incarcerated, and another really alarming
one is that Black and Hispanic people

in our country are around 29% of the U.S.
population but actually make up around 57%

of the prison population
in the United States.

Wow, that is a huge disparity there.

Huge.

So, that’s really important because
that shows that we’ve got this system

that historically has not treated people
of different races equally, right?

So we have this system that has
over-policed people of color,

predominantly Black people, and then
over-sentenced and over-incarcerated them.

When I say over, I mean disproportionately
we are incarcerating Black people and

other people of color in our country.

So that’s a really important context
to think about when you are trying

to build any tool that’s going to use
historical data about the criminal

justice system to make any prediction
about what’s going to happen next.

If we have accurate historical data about
who has come into the criminal justice

system you could see those trends.

You could look at those trends and say
Okay, wow, we have these groups of people

who have been consistently incarcerated at
higher rates than other groups of people.

It's actually an accurate picture
of what the system is doing,

which is over-incarcerating
certain groups of people.

So given this context tell me
about how the bias is affecting it.

I think, not to overuse this
apple analogy, [laughing] but I

feel there’s actually something
here that could be helpful.

So, back to machine learning.

We’re trying to say Machine, please be
able to identify an apple from an image.

And it learned what an apple looks
like based on this huge data set that

said apple, not apple, apple, not
apple. So, let’s say that there are

tons of images in this data set that
are tomatoes but were labeled as

apples, and it doesn’t really matter
why those got the label apple. This

algorithm has learned, Okay, this thing
that looks kind of like an apple,

they’re telling me it’s an apple, great.

This is also an apple.

So then we show a picture of a tomato
to this machine learning algorithm

and it’s going to say, Oh, you’ve been
telling me for decades that this is

an apple so here this is an apple.
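
The tomato problem can be shown with the same kind of toy classifier as before.
This is an invented example, not Recidiviz code: when the training labels
themselves are biased, the model faithfully learns and repeats that bias.

    # Biased labels in, biased predictions out: tomatoes were labeled "apple"
    # in the training data, so the model learns to call tomatoes apples too.
    from sklearn.tree import DecisionTreeClassifier

    # Each row: [roundness (0-1), redness (0-1), grows_on_tree (0 or 1)]
    features = [
        [0.9, 0.9, 1],  # apple, labeled apple
        [0.8, 0.8, 1],  # apple, labeled apple
        [0.9, 0.9, 0],  # tomato, *mislabeled* as apple
        [0.8, 0.9, 0],  # tomato, *mislabeled* as apple
        [0.2, 0.1, 0],  # zucchini, labeled not apple
    ]
    labels = ["apple", "apple", "apple", "apple", "not apple"]

    model = DecisionTreeClassifier().fit(features, labels)

    # A brand-new tomato: round, red, does not grow on a tree.
    print(model.predict([[0.85, 0.9, 0]]))  # ['apple'] -- the historical mislabeling is repeated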

And where that analogy kind of goes is
that when you use historical data about

the criminal justice system to build any
type of machine learning algorithm it

learns what categories of people fall
back in the system over and over again.

It learns the patterns that we’re
giving it that say, This is what our

system looks like and it learns those
categories of people that have been

incarcerated at higher and higher rates.

When you ask a machine learning
algorithm that was trained on historical

United States criminal justice
data, Do you think this person will

commit another crime in the future?

So, you’re asking it a question within
the context of the criminal justice

system, it’s going to say, Oh, yeah,
this person looks a lot like all

of the other people that have been
stuck in the system over and over

again so I think, yes, this person
will probably end up back in prison.

When in actual fact that person may
have been mislabeled or mischaracterized

because the data is biased.

It’s basically that we in this
incarceration system have incarcerated

certain categories of people more
than others and so if you have

somebody that falls into that category
it’s going to say, Yes, this person is

likely to fall back in the system.

So that just creates this
self-perpetuating, defeating disparity

machine where we can’t mitigate any of
the existing discrepancies that are in

the system if we’re relying on machines
that are using historical data to describe

those discrepancies in order to make
predictions about what’s going to happen.

So how do we fix that?

Is it even possible?

Usually tools, like predictive tools
that are used in the criminal justice

system, are asking a question of risk.

They’re usually risk assessment tools.

So there is a person that’s
pre-trial, and so the judge is deciding

whether or not to let them out of jail
before their trial, and there’s an

algorithmic tool that is saying, Oh
this person is high risk of committing

another crime before their trial, or high
risk of not showing up to their trial.

At the root of that question we are
using an algorithmic tool to predict

somebody’s likelihood of risk based
on their proximity to categories that

have been historically criminalized.

And then we’re punishing that person, not
for the crime that they have committed

or for their actual threat to public
safety, but we’re punishing them or

making a decision about them based on
what we think they might do based on

how much they look like other people who
haven’t had a chance to even get out of

prison and succeed in public life before.
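
In spirit, the pretrial tools being described boil down to something like the
sketch below. The scoring rule, weights, and threshold are hypothetical, not any
real instrument; the point is that the inputs are largely proxies for how heavily
someone has been policed and how much support they have, and the output is a
decision about predicted future behavior.

    # Hypothetical pretrial "risk assessment": just the shape of the idea.
    def risk_score(prior_arrests, unstable_housing, unemployed):
        score = 0
        score += 2 * prior_arrests             # reflects past policing as much as past behavior
        score += 3 if unstable_housing else 0  # a material need treated as a risk factor
        score += 1 if unemployed else 0
        return score

    def recommendation(score, threshold=5):
        # The decision hinges on predicted future behavior, not the current charge.
        return "detain pretrial" if score >= threshold else "release pretrial"

    person = {"prior_arrests": 2, "unstable_housing": True, "unemployed": False}
    score = risk_score(**person)
    print(score, recommendation(score))  # 7 -> detain pretrial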

So we’re making predictions
based on flawed data.

Exactly.

How do you then go about
making a prediction that’s

unbiased based on biased data?

One key to addressing this is
to shift the kind of questions

that we’re trying to ask.

So, recognizing that we shouldn’t be
making decisions based on a prediction

of what we think someone might do,
and instead we should be reacting to

somebody’s material circumstances or what
their actual needs are in the moment.

Pretrial is a really great example of
this where instead of keeping somebody in

jail because they don’t have housing and
therefore at risk of recidivism, because

that is a risk factor, not having housing.

Instead of saying, Oh, we’re going to
punish you for this fact that you have

a need that happens to be a factor that
makes you more likely to recidivate in

the future, instead asking a question
of, Okay, what is this person's need

and how can we support that person
so that those needs are getting met?

So it’s looking at a mental health
situation and saying, Okay, let’s get this

person treatment and resources, instead
of Let’s further punish this person

for the fact that we have historically
criminalized poor mental health.

That’s one direction to take it, to
shift the questions that we’re asking

of technology and not assume that
we can get some type of perfect and

unbiased prediction of the future,
but instead shifting what we’re trying

to do in those certain contexts.

So, as an organization you have to be
keenly aware of the bias that you are

potentially introducing in your software.

What’s your role when you work
with the state to show them how

you’re addressing this bias?

The role, and what we’re really building
at Recidiviz, is an ability to show states

a mirror of what’s currently happening,
and they’ve been super, super receptive

to this, and we’ve been able to show
them, Here are the racial disparities

that are currently existing in your
system, and until the work that we’re

doing, it was really hard to have an
accurate understanding of how, historically,

a trend has changed over time, or to be
able to ask a question of, This month

what is currently happening in my system,
and where are the racial disparities

that are happening in the system?

So you’re building a system to
start taking data and to monitor

the data in real-time and to reflect
that data back to the states.

And then presumably you’re then
able to have a better chance of

predicting what is likely to happen
if they make a policy change.

Exactly.

So, we’ve got this one project that we’re
working on where we’re looking at policy

proposals in a given state legislature
system and using accurate historical

descriptive analytics of how people have
moved through that system historically.

We are predicting what the projected
impact is of passing that given policy

and what we can do is determine how that
policy will impact the racial disparities

of a given state’s corrections system.

So, as we start to make changes to the
system that currently exists, we can say,

This is going to have a positive impact
on the racial disparities in the system,

or This is a dangerous change to make
because it’s actually going to deepen

the racial disparities in that system.
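
A heavily simplified sketch of that kind of projection, with every number
invented: apply a proposed policy’s assumed effect to historical admissions for
each group and compare the disparity before and after.

    # Toy policy projection: all figures are invented for illustration.
    # Suppose a bill diverts 40% of technical-violation returns away from prison.
    yearly_admissions = {
        # group: (new-sentence admissions, technical-violation returns)
        "group_a": (600, 400),
        "group_b": (500, 100),
    }
    policy_reduction = 0.40  # assumed effect of the proposal

    def project(admissions, reduction):
        return {
            group: new_sentences + tech_returns * (1 - reduction)
            for group, (new_sentences, tech_returns) in admissions.items()
        }

    before = {group: sum(counts) for group, counts in yearly_admissions.items()}
    after = project(yearly_admissions, policy_reduction)

    print("before:", before, "ratio:", round(before["group_a"] / before["group_b"], 2))
    print("after: ", after, "ratio:", round(after["group_a"] / after["group_b"], 2))
    # Here the proposal shrinks total admissions and also narrows the gap between
    # groups -- the kind of question worth answering before the bill passes.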

So that’s actually a risk here, right?

The risk that you might be trying to fix
things but that you might be introducing

additional bias into the system.

So, how do we avoid that?

The first step in avoiding that is having
a very accurate understanding of what

has gone on and what’s currently going
on, and so if a change is having

a negative impact, we need to be able
to know immediately that it is having

an undesired impact, and we need to
pivot or change what we’re doing.

So this includes super intentional
but also slow rollouts of some

of the tools that we built,
and so, there’s a good example.

There’s a certain tool we’re building
where we’re rolling it out district

by district and we need to make
sure that we’re getting it right

before it expands any further.

What does that timeframe look like?

Are we talking about years to do a
rollout or is it shorter than that?

How quickly is the feedback
cycle happening for you?

That’s a great question.

What’s great is that we have very
effectively shortened the feedback loop

of the whole criminal justice system, or
in the systems in the states that we’re

working with because we’re collecting
data in real-time from the State and

producing analytics on what is currently
happening and how changes have impacted

what is currently happening in that State.

So it depends on the product, but some
products have a slow rollout maybe over

a couple months and then we can look back
to those months and say, Okay, cool, this

is having the positive impact that we
wanted, let’s expand a little bit farther.

What does a negative impact
do to your rollout and how

do you address those issues?

We actually have implemented what we
call backstop metrics, which means we have

metrics that we’re keeping track of
and we say, Okay, this is what’s really

important and if anything starts to go
towards this danger zone we need to take

a pause and evaluate why that’s happening,
potentially, pull what we’re implementing

and rethink it and re-approach it later.
And this is pretty rare for Silicon

Valley, I think for the tech sphere;
there’s usually not a high desirability

to take a pause in what you’re building.
There’s a lot of the classic Facebook

phrase of Move fast and break things.
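
A rough sketch of what a “backstop metric” check could look like; the metric
names, values, and thresholds here are hypothetical, but the idea is to define
the danger zone up front and pause the rollout the moment anything crosses it.

    # Hypothetical backstop metrics: pause the rollout if a watched metric
    # drifts into a predefined danger zone. All numbers are made up.
    BACKSTOPS = {
        "revocation_rate": 0.15,        # maximum acceptable value
        "racial_disparity_ratio": 1.5,  # maximum acceptable value
    }

    def breached_backstops(current_metrics):
        return [
            name for name, limit in BACKSTOPS.items()
            if current_metrics.get(name, 0) > limit
        ]

    this_month = {"revocation_rate": 0.12, "racial_disparity_ratio": 1.8}
    breached = breached_backstops(this_month)

    if breached:
        # In practice: pause, evaluate why it is happening, rethink, re-approach later.
        print("Pause the rollout, backstops breached:", breached)
    else:
        print("Within backstops, continue the gradual rollout.")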

And we’re working in a context
that is really important and

affects a lot of people's lives.

So we have to be very intentional and very
careful about everything that we build so

that we are only having a positive
and decarceral impact on the system.

Do you think that we’re going to get to
a point where we are able to shift the

discussion and get good clean data and,
in the future, sort of eliminate the

bias that we are experiencing right now?

There’s such a tendency to say,
Okay, well how can we get to a

place where we’ve got clean data?

How can we get to a place where we’ve
got data that doesn’t have a bias?

And what you’re really asking is,
How can we get to a criminal justice

system that doesn’t exhibit drastic
racial bias at every step of the way?

Right?

You’re not going to get an accurate
picture of the criminal justice

system that is “clean” or “free of
bias” if you don’t have a system

that is equitable, that is much
smaller than what we have currently

or that is treating people equally.

So let’s solve that big problem
and get rid of bias is what

you’re saying, and we do that by
changing the system fundamentally?

Yeah, and where I feel like Recidiviz
really comes into play there is, we’ve got

this huge system of mass incarceration,
it’s been built over decades, and there

is this bipartisan support, thank God,
to shift what that system looks like,

to decarcerate, to make it smaller,
and as we start to chip away at this

massive, massive system, we need to
make sure that every change that we

make is going in the right direction.

And so, having an ability to say,
Okay, we’re about to make this change.

Do we think it’s going to be good?

Yes?

Projected impact.

Positive.

Great, and then a year later or a
month later whatever kind of the

feedback loop is on that particular
change, being able to evaluate,

Did that have an expected impact?

Yes.

Amazing.

Let’s replicate that in other states
or let’s make even a more drastic

change in that direction, etc.

So, being able to monitor and have
accurate awareness of what is currently

going on, how big the system is,
what’s happening in the system, and

then being able to say, These changes
are working, these changes are not, in

driving our impact all in one direction
is really what we’re trying to do.

I kind of want to close by asking
you what you’re hopeful about and

why you are still working on this?

It’s a huge problem.

What makes me hopeful is that every
single decarceral change has a very

human impact, and when we started
Recidiviz a couple years ago we kept

saying, If we get one person out
of prison that will be a success.

And so, even impacting one person's
life in a positive way is a success

and everything on top of that is just
even more and more success for this

organization and for the personal
impact we want to have on this problem.

Recidiviz exists explicitly so that we
can ensure that the future of our criminal

justice system does not look like its
past, and as we make changes to the system

we need to have the assurance that we’re
making things better at every single

step and not worse, and so I’m hopeful
and very proud to work at Recidiviz

and I’m hopeful in the impact that we
are having on the system as a whole.

Our series will continue next week
with more from Tarra Simmons, and

we’ll also speak with Serena Chang,
a product manager at Recidiviz.

Serena will help us consider how
technology might help change the

culture of our criminal justice system
and perhaps lead the way toward more

effective, human-centered reform.

Here’s some of what Serena had to say:

We’ve learned a lot of things that
we never would’ve known if we had

not talked to clients on supervision.

So, we’re often trying to build tools
for the parole and probation officers and

talking to the clients helps us know, Are
these things actually going to work or

will they create these unintended impacts?

One client that we were talking
to said, The best thing that my

parole officer could do for me is
just acknowledge that I’m a human,

not just a number and a crime.

So that impacts a lot of design
decisions, like we probably should be

surfacing the client's name and not their
department of corrections ID or number.

There are lots of simple things like
that, that we can completely overlook

if we are not actually talking to
the people impacted by the system.

Join us next time for the fourth episode
of our series, Meeting the Moment: Using

Data to Reimagine Criminal Justice.

We hope you’ll subscribe.

You can find out more
online at ten7.com/moment.

Thank you for listening.