Computer Says Maybe

Everything is happening so fast. And a lot of it’s bad. What can research and science organizations do when issues are complex, fast-moving, and super important?

More like this: Independent Researchers in a Platform Era w/ Brandi Geurkink

Building knowledge is more important than ever in times like these. This week, we have three guests. Megan Price from the Human Rights Data Analysis Group (HRDAG) shares how statistics and data science can be used to get justice. Janet Haven and Charlton McIlwain from Data & Society explore the role research institutions can play in bridging research knowledge and policy prescription.

Further reading & resources:
**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**

Post Production by Sarah Myles | Pre Production by Georgia Iacovou

What is Computer Says Maybe?

Technology is changing fast. And it's changing our world even faster. Host Alix Dunn interviews visionaries, researchers, and technologists working in the public interest to help you keep up. Step outside the hype and explore the possibilities, problems, and politics of technology. We publish weekly.

Alix: [00:00:00] Hey there. Welcome to Computer Says Maybe. This is your host, Alix Dunn, and in this episode we are going to be exploring how hard it is to do good science when the world is changing super fast and it's really important for us to know what's going on. It's kind of a continuation of our conversation last week with Brandi Geurkink from the Coalition for Independent Tech Research, but it digs in a little bit more on the challenges researchers face when they're trying to sort of find out what's going on.

And we are talking to two different organizations in this episode. The first is HRDAG, and we're gonna be talking with Megan Price, the executive director. And then the second organization is Data and Society, and we're gonna be talking with Janet Haven and Charlton McIlwain.

And what ties the two organizations together is they are both studying extremely complex phenomena in the world, and they are trying to use, like, good methods to do it, which is difficult, which [00:01:00] we're gonna get into. So let's kick off with Megan Price from HRDAG.

Megan: My name is Megan Price, and I am the executive director of the Human Rights Data Analysis Group. We call ourselves HRDAG.

Alix: Most people, when they think about human rights, they think about laws. They think about maybe activists and advocates and advocacy. They think about pictures, they think about stories, they think about courthouses.

I don't know. Oftentimes they don't think about data science. I don't know, like that doesn't, doesn't often come to mind. Um, do you wanna just give us, like, the top line of what HRDAG does, and then we'll talk a little bit about some of those early examples where you all kind of burst onto the scene doing this new thing.

Megan: Yeah, you're totally right. It is not often what first comes to mind, and it's one of the things that I love. It's right there in our name, the Human Rights Data Analysis Group. It is the thing that we do. We first got our start in the early nineties, when my colleague and co-founder Dr. Patrick [00:02:00] Ball was working in various Latin American countries, and he was really doing accompaniment for some of those more classic human rights activities that you might think of.

And it was through meeting those advocates and having those conversations with folks that the reflection back from the advocates was: what we really need is some help organizing and preserving and securing our data. And so that really was kind of the start of that idea.

And then HRDAG's history really comes out of partnering with transitional justice mechanisms, so partnering with truth commissions, with branches of the United Nations, and using data and statistics to answer really fundamental questions about conflict, questions like: how many people were killed in this conflict? How many people went missing, over what period of time? What were the patterns of violence? And it turns out that as soon as you ask that patterns question, that becomes a statistics question. Because [00:03:00] even though the question seems really straightforward, and it is, it's a good starting point, there's a lot of complexity to getting the answer to that question right, and to answering it in a way that withstands the kind of adversarial environment, either, you know, a courtroom environment, which can be very challenging, or just public opinion.

Alix: Yeah. Super interesting. So basically Patrick was, like, seeing data science, statistics, data as a form of evidence that was kind of supplementing traditional practices. And I mean, I feel like in the early nineties, I don't know, there's that joke that a data scientist is a statistician that lives in San Francisco. I feel like in the early nineties, though, people didn't even really know what statistics could offer, I would imagine. Like, I mean, that must have been really weird to be the kind of American who has this mathy thing going on. Any reflections on, I mean, I'd love to talk to Patrick about this, but, like, any reflections on what that was like? Like, how long did it take for people to be like, we need the data scientist, like, we need the [00:04:00] statistician, bring him in?

Megan: I mean, that's actually one of my favorite sort of ways to think about these projects. Because, yeah, it was definitely decades before we would enter a room and somebody would say, oh, thank God, the statistician is here. I mean, yeah, that's not, that's not a reaction most people have. Um, it did happen actually, maybe not...

Alix: one that people have generally now, but carry on.

Right.

Megan: Um, but no, I would say that at the very beginning, what people did recognize, and just didn't necessarily name as, like, we need a statistician to do this, was the need to remember the stories that they had been told, or the documentation that they had been able to collect. And especially in the early nineties, recognizing that if you have sheets of paper with handwriting on them, they're really vulnerable.

And so looking for ways to just make sure that that information isn't lost, and then having conversations with, you know, with Patrick and with other members of his team about what's the way that you don't lose that.

Alix: Mm-hmm. It's also, I think, even [00:05:00] like talking to people that work in those kinds of environments, it's really hard to wrap your head around what it's like to be in a country when something has just happened, for a long period of time, and you're trying to, like, get organized and, like, preserve, and also probably maybe psychologically wanting to, like, move on. It's such a strange time to, like, try and do something laser-focused that will work in a court environment. And I feel like having someone that can systematically be like, let's scan those documents, let's digitize those things, let's not presume that that filing cabinet that's, like, shoved in the corner of that thing is, like, a good place. I just don't think we think very much about information in moments of transition and trauma. And I imagine that was, like, a really super intense position to be in.

Megan: Yeah, absolutely. And that's why we really try to frame that, and think about it, in our own heads, we're thinking about it as data and statistics and, you know, sort of analyses that we could do later. But when we're talking to our partners, exactly, [00:06:00] because it is so chaotic and traumatizing to be in those moments, and there are other, bigger priorities, right? And we absolutely recognize that. We always try to frame it in the context of: remember and honor the thing that you know now. Because there is going to be an after. There is going to be a later. We want to position ourselves, from positions of strength, for those moments of accountability.

But yeah, you don't know what they're gonna look like. You don't know exactly what the question is gonna be. And so for right now, it's just: try to remember what you know right now. In some instances, especially in the cases of community members and family members who've been killed or disappeared, what we have found over and over again is a really inherently natural and organic process, that people wanna write down the names of the dead, and that they will just start recording that information. And so then giving them a way to preserve that and leverage it later feels really important.

Alix: Yeah.

Reminds me of the [00:07:00] Brian Lehrer radio show. I don't know if you've ever listened to his radio show in New York. It's, like, New York Public Radio, and he had a whole episode where he just read off the people who had died from COVID. Oh, wow. As, like, a, yeah. Uh, 'cause I think that in every, like, huge trauma, especially when there's been a significant loss of life, I mean, I guess it's another New York example, but like September 11th, you know? Oh yeah. Reading the names. It really, um, it matters. But I think it's interesting, 'cause I actually hadn't thought about this until this very second, but, like, when you reach a volume of death, I mean, it's that Stalin quote: one death is a tragedy, a million deaths is a statistic. How did you, how do you guys navigate that in your, I don't know, like, psychologically in your own work, like not flattening it into just numbers, given it's like the bigger the crisis, actually, the more you should be thinking about the underlying humans involved.

Megan: Absolutely.

One of the ways that we do that as a team is by partnering with these groups that are never going to forget, and will never let us forget, the individuals. And the other thing, just to kind of circle [00:08:00] back to something that you were saying about the statistics sort of amplifying the anecdotal stories: that really is the role we try to play.

So we do want the data and the statistics to be one piece of that puzzle, in a way that's really complementary to, and affirms and strengthens, those individual stories. And I think the other piece, 'cause we are very cognizant of, and we think hard about, that Stalin quote, the other thing that statistics can offer is your own context. Because a lot of these really horrible experiences are incredibly isolating. And so to go through them and to feel like, well, I'm, I'm the only one who experienced that, or I'm all alone in experiencing this trauma, and everyone's trauma is unique. But if you then can place your story within this bigger, what sounds like a distant, data set or statistic, it still kind of provides you that community: I am [00:09:00] not alone. I'm not the only person that's happened to.

Alix: Yeah, I can imagine that's an interplay. And I think, too, the idea that it amplifies the underlying story, not, like, smushes it together with lots of other stories, I feel like is really important. Okay. Well, most people at this point might be like, okay, I need an example. When I'm explaining HRDAG to some person who, like, doesn't know anything about the human rights movement, I will very nerdily try and describe what you all did. I'm gonna try this, and this is good practice to make sure that I'm advocating for what your organization does effectively. But essentially: in a court of law, trying to use data science to help a court determine whether or not something was genocide, based on the pattern of the deaths. Is that broadly right? Is that what happened? And, like, take us back. Like, give us the juice. What happened?

Megan: Yeah. Yeah. That was, that was a very good, a very good synopsis. Hold on. Uh, so yeah, so this work started in Guatemala, again in the nineties and the early two-thousands, with work with the Guatemalan Truth Commission.

It predates me, but Patrick and other members of the HRDAG [00:10:00] team had done some statistical analysis with the Guatemalan Truth Commission to find that more than 200,000 people went missing or were killed during the internal armed conflict in Guatemala. And that was published as one of the Truth Commission's findings. And there was a technical appendix sort of outlining all of it. And as part of that work, the Truth Commissioners had raised the question of: can this analysis help answer the question of whether this level of violence was targeted in a way that would be consistent with genocide? And so that was the start of that investigation and that question.

And then it became particularly salient in the mid-two-thousands, when General Efraín Ríos Montt, who was the former de facto president of Guatemala during the early eighties, was accused of committing acts of genocide. And so the particular statistical analysis that we presented in the case against him was a finding that members of the Mayan [00:11:00] population had a risk between five and eight times greater of being killed by the Guatemalan army than their non-Mayan neighbors in the same geographic region. And so that was one piece of the puzzle, but that was a statistical finding of a pattern of violence that was consistent with targeted violence, and that was consistent with acts of genocide, basically.

Alix: It is extremely unlikely that that outcome would've happened by chance.

Megan: Right. And also it deflated the defense's argument that it was an armed conflict, there was lots of violence, and it was indiscriminate violence. And if that had been the case, then those relative risks would've been different. We would not have seen that much larger risk to Mayans as compared to their non-Mayan neighbors.

Alix: So did Patrick have to go testify?

Megan: He did. He testified against Ríos Montt, and that's so intense. Yes. Um, it's not quite as intense as when he [00:12:00] testified against Milošević, 'cause Milošević represented himself. And so Milošević did the cross-examining.

Alix: So Milošević cross-examined...

Megan: Yes. Patrick. Patrick.

Alix: Yes. Oh my God.

Yeah. Wow. That would give you nightmares? Yes. Yeah. Although he's pretty steely. But that's really, wow. Oh my God. Okay, so then what happened? Did the court, of all of the evidence that was presented, did they say how that factor weighed in their determination?

Megan: Yeah, and in fact in their finding of a guilty verdict, 'cause they did come back and find Ríos Montt guilty of committing acts of genocide, they specifically referenced the statistical analysis and said that it affirmed the stories that the victims were also testifying about.

And so that's, that's always our, our goal. Now, unfortunately, I have to provide the wah-wah coda to this story. Oh no. Which is that that verdict only stood for 10 days, because the constitutional court overturned it on a legal technicality. And then there was a second [00:13:00] court case, which Patrick was going to testify in. I actually would have to go back and check to find out if he got to testify a second time before Ríos Montt died, before the end of the trial.

Alix: Oh, okay. I mean, 10 days is a long time. He was found guilty. It did happen. Yeah. Yeah. And it sounds like they weren't overturning based on that piece of evidence. It was probably, it was like, strange, yeah, legal machinations. Yeah. That's so cool. Okay, so HRDAG, was it a thing then yet? Yeah. Yeah. Okay. So it was like Patrick and, like, a small team.

Megan: Yeah. At the time that he testified in Ríos Montt's trial, the analysis for the trial was done when we were still at Benetech. So we were, oh yeah, uh, you know, a medium-sized team at another tech nonprofit. And then kind of in between the two trials was when we spun out, and then it was just Patrick and I for a little while, and then we've grown the team back up again.

Alix: That's so cool. Okay, so it's, like, one thing to go to other places where these, like, [00:14:00] punctuated, very intense, you know, a million people, like, the volume of death in this is very, I dunno, it's a distinct experience. And it's also, I would imagine, kind of, um, I don't know, like, you're the white Americans gallivanting to a place and being like, this is really bad, uh, we're gonna help. I don't know, maybe you were doing this kind of work before, but I, I seem to remember this, like, turn to, like, looking at issues in the US and trying to apply some of these same methodologies to support accountability movements in the US. How did that start? Was that a fair characterization, or were you guys doing that before too?

Megan: No, it is a fair characterization. So I can, I can kind of tell the story a couple of different ways. So it's true that for the first probably almost 20 years of HRDAG's work, it was exclusively outside the United States. And we always, as a team, were interested in doing work in the United States. I mean, all of the work we do, we think of as holding institutions accountable for violating human rights. [00:15:00] And that is a thing that also happens in the United States. Because of the way that we work, in partnership with other organizations, we just didn't see the right opportunity for our style of project and, and the right thing to work on. And then there became these moments in the United States where people started using the phrase state violence, not just immediately dismissing it as hyperbole, which, again, is, like, the phrase that we have always used inside and outside the US to talk about these kinds of instances.

So there's two kind of threads. One thread, if I can indulge in another Guatemala story: we actually have a whole other project that we did in Guatemala, which was what brought me to the team and is the first project I worked on, which was analyzing the bureaucratic records of the National Police Force in Guatemala, who also participated in this armed conflict. It's this amazing warehouse full of paper, [00:16:00] and the analysis of that, in addition to being substantively incredibly important, also has this great, statistically nerdy component, because it's this building full of piles of paper, millions and millions and millions of pieces of paper. And the question that was posed to the team was, well, how can we start analyzing it as quickly as possible? Because it's very vulnerable. It's paper. It could disintegrate, it could catch fire, we could just lose access to it. And so what we did was we partnered with volunteers from the American Statistical Association to design a really, really cool multi-stage probabilistic sample, to draw a representative sample of documents based on the physical structures of the building.

So it was that kind of work, looking at the bureaucratic records of the police in Guatemala, that I think has a direct relationship to the work we're doing in the US now, because a lot of our projects in the US now also involve accessing and making sense of internal bureaucratic [00:17:00] records kept by various police forces in the US.

But the way that we actually got there, the way that we got to those projects that make use of that data, started with a question that partners raised about so-called predictive policing. It really was partners reaching out to us and saying: one of these predictive policing vendors published their approach in a peer-reviewed journal, could you kind of help us just understand it and unpack it and critique it? And so Kristian Lum and William Isaac wrote this really great analysis of predictive policing, demonstrating all the ways that it's actually perfectly named, because it's not predicting crime, it's predicting the behavior of police.

Um, and if that's...

Alix: It's a banger of a piece of work. We will link to it in the show notes. Awesome.

Megan: Thank you. Um, and so that really provided us kind of an entry point into doing more work in the US. And then the other piece that brought all of [00:18:00] that to us was, outside of the US, the work that we've done historically is using a specific class of statistical tools called Multiple Systems Estimation, or capture-recapture, and it's a way of estimating from multiple lists of named victims, or named individuals, to a total population. So it's a way of estimating not only how many incidents have been documented, but what's missing from all of the lists. And so we're very familiar with that approach; we've used it in a lot of different contexts. And then in 2014, the Bureau of Justice Statistics used that exact same method to estimate what was missing from two federal data sets attempting to quantify deaths in custody in the US. So there are two federal data sets that are tasked with recording information about individuals who die while in police custody in the US. But because of [00:19:00] the structure of the US system, there are more than 18,000 different police jurisdictions within the US, and it's all voluntary for them to report data to these federal data aggregations.

So everybody sort of knows that those federal lists are incomplete. So the Bureau of Justice Statistics put out this report saying, well, we did this two-system, multiple systems estimation; we think the true total number of people who die in custody is this larger number, 'cause we're missing some amount of them.

And we saw that, and we kind of said, well, we know a thing or two about that particular approach. And in particular, if you only have two lists, you have to make some pretty strong assumptions about those lists. And so we just wrote a little memo. This was, again, work that Kristian Lum did with Patrick, just wrote a little memo kind of saying, well, what about a sensitivity analysis of those assumptions: what are some ways that those assumptions could vary, and how would that affect this estimate?

So that memo is the other thing that kind of opened the door to groups in the [00:20:00] US who were doing this work, kind of saying, we have questions about state violence in the US. And we were like, we are very interested in working on questions about state violence in the US.
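For readers who want to see the two-list idea concretely, here is a minimal sketch, with entirely hypothetical list sizes; this is not HRDAG's actual code, and the dependence factor below is a crude stand-in for the formal sensitivity analysis the memo describes:

```python
# Sketch of Chapman's variant of the two-list (Lincoln-Petersen)
# capture-recapture estimator. All numbers here are hypothetical.

def chapman_estimate(n1: int, n2: int, overlap: int) -> float:
    """Estimate the total population from two overlapping lists.

    n1, n2  -- number of named individuals on each list
    overlap -- number of individuals appearing on both lists

    Assumes the two lists record individuals independently; that is
    the 'pretty strong assumption' a sensitivity analysis would vary.
    """
    return (n1 + 1) * (n2 + 1) / (overlap + 1) - 1

# Crude sensitivity check: rescale the observed overlap by an assumed
# dependence factor (<1 = negative dependence, 1 = independence,
# >1 = positive dependence) and watch the population estimate move.
if __name__ == "__main__":
    n1, n2, overlap = 1200, 900, 300  # hypothetical list sizes
    for dependence in (0.5, 1.0, 2.0):
        adjusted = max(1, round(overlap * dependence))
        print(f"dependence={dependence}: N-hat ~ {chapman_estimate(n1, n2, adjusted):,.0f}")
```

Even in this toy, the point of the memo is visible: with only two lists, the total swings widely as the independence assumption is relaxed, which is why the number of lists and the assumptions about them matter so much.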

Alix: That's super interesting. I mean, I feel like those are the kinds of moves that make strategic sense, but then don't necessarily always translate into funders understanding, or, like, having the same funder. This is maybe an aside, but I'm curious, how did it go? Like, was it motivated by American philanthropy being more interested in some of these questions, or was it, like, really irritatingly difficult to get them to pay attention to it and give you funding for it?

Megan: It was, it was really irritatingly difficult. Yeah. And then, like, it was one of those things that, like, strategically then ended up being really advantageous, because, because you were ahead, right? 'Cause then the pendulum swung, and then that was, like, what everybody wanted to talk about and to do. But there was a long period of time where people were, were really, they couldn't follow those threads.

And I think also, like, I hadn't practiced explaining them very well yet. And so, like, there was a long time when people really wanted to bifurcate, and talk about [00:21:00] our work as international and US, like, as separate. And we were like, no, this is a method applied...

Alix: to multiple contexts, right? Yeah.

Yeah. American exceptionalism, it runs very, very deep. It's really, it's really hard to escape. Um, that's super interesting. Okay. Well, then, that was when racial violence and state violence was, like, you know, part of our history, it's who we are as a country, um, and then maybe less historically unprecedented. Then you get this election, and then reelection, of he who must not be named. Um, I imagine that changed how you were positioning yourselves. Do you wanna talk a little bit about how you conceptualize you all as a science organization trying to kind of apply scientific rigor to questions of political import, and, like, talk a little bit about maybe how that changed when there was a, an authoritarian in office?

Megan: Yeah, absolutely. So we do, we always describe ourselves as a non-partisan organization, and we [00:22:00] conduct human rights research, and we use tools from various scientific fields to understand patterns of violence. I mean, that is at the core of what we do. And so in some ways, the shift that we're experiencing in the US, I hate to say it, but it looks familiar, because we have worked in so many other countries, and we have so many partners who have gone through these kinds of experiences. And so in a lot of ways, the work that we were already doing, and the way that we were already framing it, remain the same. Because we recognize that there's going to be some later point. There's going to be an after this, there's going to be some kind of transition, and we're going to need ways to tell ourselves about what happened during this time period, and ways to [00:23:00] tell ourselves that are grounded in rigorous data collection and data analysis, conclusions that can stand up to adversarial environments, which is very much an environment that we find ourselves in right now.

Alix: Yeah. But I feel like there's an assumption that if you have some, like, structural mechanisms for accountability, that adding evidence within those is somehow going to leverage those institutions of accountability for the purposes of actually dealing with what's happening. It feels like, in the last seven years, I'm just gonna pull that number, in the US there's been an unmooring of knowledge production from institutions of accountability that could earnestly engage with fact, earnestly engage with, like, findings. A court system that feels increasingly partisan and uninterested in, like, non-partisan science. Um, so, I mean, I don't know, how does it feel to be a knowledge-producing, [00:24:00] like, epistemologically driven organization in an environment where, like, knowing stuff is, like, not seen as important anymore, or something, right? As currency?

Megan: Yeah, yeah, yeah. Real challenging. Um, you know, we have this shorthand on our team, when we're talking about taking on a project and sort of deciding priorities within our queue, of asking: well, in this situation, does the truth matter? Which, that's really just the way we frame an internal conversation around, is the thing that we're bringing to this project gonna make a difference, or be useful?

And. You know, we try to just be really humble and pragmatic about that because sometimes people are having arguments that aren't about facts, and so bringing better facts isn't the thing that's gonna convince people. But yeah, so we've always kind of framed our conversations that way, and it's. Just gotten progressively more depressing to kind of ask ourselves that.

And you know, [00:25:00] we still remain very devoted to facts and the truth. And you're pretty cool. Yeah. You know, one of the, one of the places that we get to. Just keep answering that question with like, yes, it does matter and it does make a difference is in these kind of hyperlocal contexts and I think to kind of keep bringing it back to our, our partners, like that's the other reason why we keep working with the model that we do because facts and the truth very much matter to our partners and they help us sort of identify.

Those leverage points. You know, that may be just re not just, but like that may be reaching out to community because maybe the community needs a better, deeper understanding outside of official channels of describing an event, maybe in a local context. Like there is an opportunity to, you know, change a policy or change a law.

And so really focusing. In those very specific contexts where facts still matter.

Alix: All right, we're gonna leave Megan there for a minute and jump over to another conversation I [00:26:00] recorded a little while ago, during Climate Week in New York, with Janet and Charlton from Data and Society. Megan was just touching on this sort of organizational gap between doing the work of uncovering the truth, but then kind of having to translate that truth into something that actually makes a difference in the world, so translating it into justice or meaningful policy prescriptions. That's something that Data and Society spends a lot of energy on: trying to find a way to have research actually be useful for changing what comes next in the world. So we dug into how Data and Society does that, because obviously research is very different than policy, and it requires kind of a different way of thinking. But here's Janet and Charlton to say a little bit more about the role of research in what's happening right now.

Janet: I do think it is a particular moment in time where, as a society, we are in the process of turning away from the public interest, where we're turning away from public institutions, from protecting public institutions, [00:27:00] where we're turning away from the creation of new public knowledge. And we are seeing, you know, a corporatization of government that's happening in a way that is absolutely driven by a tech oligarchy, but I think is being embraced by other parts of society, including parts of the government that should have a duty of care for the public interest. And I think we are seeing a historic turning away from that.

Charlton: I was working with a group of researchers, and working with folks at NSF, around this question of: how do we better engage, involve people at the community level, and particularly people in underserved communities, around technology development, deployment, et cetera. And really, at that point, it was so optimistic, with people asking questions about how do we really flip these scenarios in some way to get communities [00:28:00] more buy-in, to enact processes that would help diminish the risk of harm when certain technologies are built and used in and around those communities. And all of that work, which was deeply institutional and systemic, in terms of what research gets funded, what types of accountability for people doing research around technology in particular communities, all of that has functionally disappeared at this point. The people who championed it, and the questions, and the ability, the funding, et cetera, to really pursue those. And, you know, corporations have stepped in a bit with, hey, we're concerned about this, but I'm always deeply distrustful.

Janet: I think there's a methodological foreclosure, you know, which is that in this environment, where I think the tech companies do wield a lot of power in terms of the questions that get [00:29:00] asked, they also wield a lot of power in terms of the methodologies that are used, and they both are going to be foregrounding quant analytics, right? And sort of looking at the world through the lens of: what do we know about the impact of, you know, ChatGPT in the world, through the lens of the data that comes back from ChatGPT. Which, of course, is like the snake eating its tail, right? Like, we're in this logical loop of, we only know what that enclosure of that platform tells us. So that's a really gigantic loss. Instead of looking to grounded social science, grounded ethnographies, that really look at not only the primary impacts of these tools, in terms of how people are using them, what it means in their lives, but the secondary and tertiary impacts on communities, on, you know, the larger economy, all of those questions, we really lose that. And I think that's [00:30:00] incredibly dangerous when it is coupled with, you know, going back to the challenges that we're seeing with our government right now, the loss of public data. As our founder, danah boyd, would say, we start to lose the 'how do we know what we know.' We lose that kind of public base of knowledge about the world around us in really dangerous ways.

Alix: I think it's also coupled with a time when journalism is collapsing, which is, I think, where you get some of the, maybe not full ethnographies that are multi-year and longitudinal and, like, telling lots and lots of stories, but you still get stories. And it feels like that's going away too, and it feels replaced by this very superficial, sort of quantitative, quick-hit 'this is what's happening' coming from industry.

Janet: I feel like there has been a generation of tech journalists that have, you know, kind of come into play and come on the public scene, particularly over the past five years, who are doing amazing work and who [00:31:00] are just, you know, holding the line in ways that I feel so grateful for. Karen Hao's book, Empire of AI, is the sort of most obvious case. A hundred breaking stories in a single book. Yeah. Yeah. I mean, it's just, it is an amazing, amazing book that tells us so much, not just about the story of OpenAI, but about the political economy and the social impacts of AI as it is, you know, being thrust upon the world right now. And I think, you know, I mean, Karen is maybe the best-known journalist because of that book, but I think there are a lot of journalists that are doing great work. Khari Johnson at CalMatters, Kashmir Hill at the New York Times. You know, I mean, I feel like we're seeing a generation of great tech journalists that are pushing back on the hype and the narratives that come out of the tech companies. And I think that's just an incredible, incredible value, you know, to sort of [00:32:00] maintaining a real dialogue in the public sphere, or in public spheres. And I see it as a real companion to the kind of work that Data and Society does. I mean, it means we and our peer organizations, and peer researchers in academia, are not the only game in town in terms of where to get really solid information and sensemaking about the world right now.

Alix: When you were describing the beginning of Data and Society, it was like an elite nerd club, and I feel like a lot has changed in 11 years. But Janet, do you wanna talk a little bit about how it came about as an organization, so we can get into how it's changed?

Janet: Yeah, absolutely. So Data and Society was founded by danah boyd. danah is still an advisor to the organization. She is truly a visionary. And what she saw 12 or 13 years ago, when she was working at Microsoft Research in the Social Media Collective, was that the study of the [00:33:00] societal impacts of data-centric technologies, of automated technologies, was just not happening at any scale, in any organized way. And her insight was: this is gonna really matter. We really need to understand this to be able to integrate these technologies into our world; we really need to be able to understand it to govern these technologies in smart ways.

So I joined the organization about a year and a half after it was founded. I had known danah for years. I came from the Open Society Foundations, which is a, a philanthropic organization, and I had been there for almost a decade and a half and had worked broadly across issues of technology, human rights, accountability, and governance. So I really had felt that lack of an evidence base, and a lack of attention to how these technologies, that when I was at the Open [00:34:00] Society Foundations we were sort of funding, or trying to figure out, you know, how to incorporate into a whole set of issues around the world, how they actually were impacting people on the ground.

And so I had actually come to danah when I was still at the Open Society Foundations, because we were grappling with this question, which now seems very quaint, of the idea of algorithmic, we called it algorithmic manipulation of the public sphere. danah had just started Data and Society, and so I went to her and said, like, what do you think about this? Like, can you, can you tell us about this? And that actually was one of the questions that started Data and Society's sort of long engagement with algorithmic accountability, with the study of mis- and disinformation, and other things. I mean, that was work they were already doing. And so, you know, it was, it was a really, really interesting conversation that came about.

And so she started Data and Society, primarily in the early days, when Charlton was there in the very early days, as a space for a [00:35:00] disparate crowd of researchers, just a few people who were starting to ask these kinds of questions, and to ask them through the lens of not just sort of the hype and fear cycles that tech companies were putting out, but through the lens of power, through the lens of inequality, and in a way that really started with people and their experiences with these technologies. So, you, you were there. What, tell us about it.

Charlton: Yeah, I mean, that sounds pretty much right. And for me, I remember precisely that kind of entry point. Like, for me, number one, I had not come from a long history and background of studying technology. I had studied electoral politics and then sort of migrated towards questions around technology that were interesting to me. Um, and so Data and Society, and meeting danah, was one of those moments of: oh, you are thinking [00:36:00] about this in the way that I am. And I would've thought other people would, but come to find out, that was not sort of the lens. And so people were coming with the heavy 'let's talk about the technology,' let's talk about the material artifacts, and a lot less about the people. My concern, before all of this, was about people in sort of situations of power, access, politics, and so blending those was important for me, and Data and Society was a place I started to see that work happening.

Alix: I feel like it's also interesting to think about you coming from academia and academic inquiry, and you coming from philanthropy, which has sort of an opinion, potentially, about the world that it wants to see. That obviously sort of comes together in Data and Society as a strategy of thinking about research that serves some purpose beyond just, let's ask interesting questions as academics and write papers and talk to each other about them. I mean, I would love to hear a little bit about how you think about the mix of opinions about the world, or sort of [00:37:00] thoughts about how the world should be, combining with trying to, you know, use scientific methods and inquiry and really sort of find evidence, and understand evidence, about what's happening in the world. How do those two things come together for the organization?

Charlton: I'll start again with a story, because it really has shaped my perspective on this, from a long time ago to this point. I wrote a book called Race Appeal, uh, back in around 2009 or so, 2009, 2010. And I remember when it was written, it won a couple of awards, and I remember a sort of legendary figure in the field of political science saying: this is a great book, but it had no soul. Burn. Oh my God. Can you imagine the, the sinking feeling to read those words, especially with the upside of 'this was a great and powerful book,' and then the drop in it. The ultimate, ugh, God, it's like, whew.

Alix: Yeah.

Charlton: But almost immediately, I knew what she meant by that. And it was that [00:38:00] it was a book that was heavy on numbers and a particular kind of data, but was not foregrounding the kind of why. Like, why does it matter that this is a group of people that I'm concerned about, focused on, and that makes a difference in some way? And there was a lack of that kind of connection with the people, and not just the problem. Right? And so that was a point at which I sort of consciously decided that I was gonna do my scholarship in a very different way. Is there something more than just saying, do I have access to a technology or not? What is the deeper set of concerns? What is that sort of thing that gets me closer to people, and really understanding that dimension? And that's what I think that sort of reshuffling of those two things did for me and my work.

Alix: I love that frame of connecting with the people, not just the problem. 'Cause I feel like that's [00:39:00] such a good encapsulation of what happens when people try and sort of flatten the process and make it feel sciency: they actually remove the most essential part of the equation, which is people. And I feel like that's just a really powerful way of framing that. Janet, what do you think? I mean, you've had to, like, when building out, like, a policy team, for example, obviously you have a team of researchers who have, you know, you know, they're trained researchers, they have rigorous methodologies to uncover evidence about the world around them. But obviously with a policy team, you want to participate in that conversation in a, in a particular way. So do you wanna talk a little bit about sort of how the organization has managed that, and maybe it's not a tension? I don't know.

Janet: I mean, we decided to build a policy team within Data and Society because what we saw happening was this sort of hope, in many places where people are doing academic research and wanted to have impact, that they would put it out, it would get picked up, and it would become part of the policy conversation. And that [00:40:00] sometimes happens, but usually you need to have some kind of translational function that is packaging that research in a way that policy makers can grasp, that probably has a lot less nuance than the original research that is done, that is communicated in, to some extent, a, you know, policy-ready way. Like, what do you take away from it to actually apply to a regulatory environment, or to legislation, or something like that?

And so what we decided was that we really needed to have that function in-house, really close to the researchers, so that we didn't lose as much in translation, rather than pursuing maybe a more traditional route, which would be, like, partnering with a think tank or policy organization, you know, to kind of be like, well, set up briefings on the Hill for us, or something like that. But I think, you know, it's been sort of a two-way street between our research team and our policy team, to really learn how to [00:41:00] translate academic work into policy language that can be understood. And I think the best work that we've done in the policy space is where we have these really tight collaborations between our policy team and our expert researchers.

One thing that I would say is, like, you know, we're doing sort of empirical, grounded research, so one of the outcomes that we had to accept is that not all of that is gonna generate policy prescriptions, and that's okay. I mean, it all has value, but it may have value in different ways, right? Like, a policy outcome is only one way that research can be impactful in the world. It can also be impactful in the world in shaping academic fields, in shaping new kinds of inquiry. It can be impactful in the world in telling stories and narratives, and, you know, in bringing people to the table who were not previously given space, so that essentially new communities are recognized as, you know, sort of legitimate voices in a public [00:42:00] debate. So there's a lot of ways; policy is not the only way.

There are a bunch of factors, right? So one is that it's never one piece of research. It's never, like, a one-to-one translation. I would say every good piece of policy work that we've ever put out is standing on the shoulders of citation giants, right? And that's a big part of our policy work, is that we have a deep citation practice. That's one important thing, is that research is additive. It's cumulative to get to policy recommendations. I think it's also, like, opportunistic. Like, sometimes you just land at the right moment, where you have a piece of research coming out that happens to be just, like, meeting the moment.

For instance, right now one of our teams is doing research on mental health and chatbots. And that has turned into a huge issue in policy circles, right? Like, how do we protect people who are both leaning on [00:43:00] chatbots for mental health support and may also be being harmed, through their health, by chatbots? How do we build protections? Um...

Alix: You have statewide bans right now coming out. Is it Illinois? Yes. That basically said: ban. We legally can't actually offer a chatbot for the purpose of therapy.

Janet: So that's about apps that claim to be therapists. Many more people are just using, are using, general-purpose models, general-purpose LLMs.

Alix: But this is an attempt at defining a policy that supports conditions

Janet: Correct.

Alix: Within which a model's not being used in ways that you would

Janet: Exactly. So our, our team that's working on that put out, not a policy brief, but an op-ed, about essentially what their research is saying about this moment, and the kinds of guardrails that need to be built around these technologies, given what they're seeing in terms of how people are using these tools to foster better mental health.

Alix: That's a really good example. 'Cause I can immediately imagine, once you build evidence around where it's causing harm, in what conditions it's causing harm, I can very easily imagine that translating [00:44:00] into more prescriptive assertions that X is not appropriate, or potentially considering policy related to Y. So I feel like that's a, yeah, I mean, it gives you a chance to say, here's what we're seeing. It may not be the 'do this right now,' but it's informative. It can be informative of a direction in governance, or a particular set of protections that maybe hadn't been considered.

Charlton: Most of the academics that I talk to, whether they're in technology fields or otherwise, often say the reason that they do what they do is to try to make a difference in the world in some meaningful way. Academia doesn't really lend itself to that all the time, in terms of the structures, the incentives, et cetera, that are driving what's important for an academic researcher, a tenure or tenure-track faculty member, or what have you. So to have a kind of separate space that makes that possible, I think, is what makes, you know, a lot of traditional academic researchers excited about a place like Data and [00:45:00] Society, and it also gives fuel to their work, in thinking that what they research, what they come up with, the knowledge they produce, can really have an impact in some real way.

Alix: One thing I think a lot about is the asymmetry of resources between the people causing the problems and the people researching the problems they're creating. There's that pie chart in a paper that Abe put together a couple of years ago, um, showing, within academic conferences about machine learning and computer science, the proportion of papers coming from academia versus industry, and how much that's changed. And basically, it's flipped. So it's gone from, like, 25% industry, 75% civil society or academia, to now 25% civil society and academia, 75% industry. So essentially, academia is now, in some ways, not necessarily co-opted, but it's, it's a piece of infrastructure for industry, um, in some ways, because they have the money and the resources. So then you've got these little institutions like Data and Society that are kind of trying to, like, keep up and be like, we need independent research and [00:46:00] independent things being produced to, like, help us understand what's happening. How do you think about those resource asymmetries, especially with, you know, all the funding cuts happening, and kind of the overall environment? It feels like one of scarcity outside of industry, and one of, like, opulence and control inside industry.

It feels like one of scarcity outside of industry and one of like opulence and control inside industry.

Janet: I think for us, we have to think about how we put our resources in the places that we think are gonna have the most impact. And I, I would say, I think that's why Data and Society is structured in a way so that we have all of these functions under one roof. Because I think that's the way that we can have the greatest impact with research: so that we have, you know, research foundations, we have policy blueprints, we have this function of creating counter-narratives, and then we have this network power angle. Trying to bring all of that under one roof is what I think creates more bang for the buck, that it gets us further than having a kind of disparate network.

I mean, I am a, [00:47:00] I am a believer in institutions. I think one thing that, you know, danah did that was so brilliant, among the many brilliant things she did, was to start an organization, you know, to plant the flag: that this isn't just an idea or an academic field or a journal, it is an organization that deserves public support, and it's a set of ideas that deserve public support, because it is work that's happening in the public interest. So, you know, I mean, I think, like, this is the, you know, I've been in philanthropy, like, I've been on the nonprofit side. I mean, like, the asymmetry of resources is a story as old as time.

That's not something we're gonna solve. So we have to think about, you know, how do we, how do we direct our energy and build our institutions in a way that they have the most power and the most impact?

Alix: All right. To close the episode off, we're gonna have the rest of the conversation with Megan. She's gonna say a little bit more [00:48:00] about what she wants from you, the listeners, especially to be thinking about in this kind of strange time, where our access to knowledge, or even just, like, knowing what's going on in the world, is under such threat. And so I'll let her take it from here. Uh, if you wanna hear more from Janet and Charlton and get a full picture of the history of Data and Society, we had a really great conversation, and we are publishing the whole thing on YouTube, so we'll link to that in the show notes. Um, but let's let Megan close this out.

The overall operating environment of data and information is changing so fast, because you don't have a federal US government that wants any of that to be available, is what it feels like. It feels like they're trying to turn the lights off in the room or something. I imagine that's a problem for your work. Do you wanna describe a little bit, I mean, I also imagine you guys have views on what it looks like to build data infrastructure as a society that, like, helps us understand what's going on, not just for specific rights violations, but for just, like, insight into the world [00:49:00] we live in. I don't know. What are your thoughts on everything sort of disappearing from the internet at an incredibly fast pace?

Megan: As a small organization, you know, we're very hyper-focused on how do we maintain, and not lose, and not lose access to, the data for our projects and for our partners, our little slice of that particular data pie. And so, you know, we, and by 'we' here I really mean Patrick, you know, spend just a lot of time, and a lot of angst, you know, just setting up a lot of data backup mechanisms, you know, to just make sure that we don't lose anything, and in multiple places, and multiple jurisdictions, and outside the US, and all of those sort of good best practices.

In terms of, yeah, like, the loss of information from the internet, and just from data infrastructures generally, right, like all the federal statistical agencies that are pulling down data: there are fortunately a lot of civil [00:50:00] society actors who are really on that, and who are really working, you know, both together and independently, to capture what can be captured. And I think that that was really a lesson that we learned from the first administration, was the importance of having these backup systems, and the importance of having library folks and archivist folks who know how to do this, and who have processes in place, um, and who really jumped into action.

Alix: So you guys, a couple weeks ago, wrote this letter, not just about the kind of data environments that are being, I don't know, what do we even call it, like, burned, but beyond that, like, the overall, I think you used the word tyranny in the title of the letter, if I remember right. Do you wanna describe what you were thinking when you decided you guys should write this? And then I'd love to hear a little bit more about the letter itself. When did you guys decide to do it?

Megan: Yeah, absolutely. So it's something that we've been talking about internally since the spring, because that was really how it started, was [00:51:00] that we wanted to be really cognizant of the work that we have always done, and the mission that we have always had, and the very rapidly changing environment in which we're now doing it. And so we wanted to have an internal conversation just to affirm to each other that this is, this is still the most important thing for us to be doing, and this is what we're committed to. And also to be really transparent and explicit with each other about potential risks that were now posed by doing that work, in a, in a way that was different from, you know, last year or several years ago, and giving the whole team an opportunity to talk about the way they might experience, or were thinking about, those risks. You know, we wanted to be really aware of the fact that Patrick and I are a couple of middle-aged, cishet white people, and not everybody on our team is. And so we're all gonna experience [00:52:00] and perceive risks differently. And so we didn't want to just have our voices saying, you know, this is, this is our mission and this is what we're gonna do, and full steam ahead. Very gratifyingly, that essentially was the conclusion that came out of that conversation. But we wanted to really open up the space and have a really involved internal reflection, um, where we talked about that.

And the starting point for that, which is the starting point for all of our work, was, you know, we turned to the Universal Declaration of Human Rights. I mean, that, that has always been the foundation of our work. It's a really beautiful document that I really strongly recommend people read. It's not that long. It's super accessible. And so that kind of brings us into what, what we ended up writing. What came out of that internal reflection and conversation was really highlighting some specific articles from the UDHR, and pinpointing: this is the explicit right that is enumerated in this really important document, that is directly relevant to our work, that is the reason [00:53:00] why we do our work, and, and why we think it's important that we keep focus on, on this approach.

Alix: Yeah, I think it's really cool, 'cause I feel like I've seen a lot of organizations, particularly ones that, like, think of themselves as, like, doing the tech part of a human rights thing, when the kind of authoritarianism temperature of the water starts to boil, they're like, we're just the computer people. Um, I feel like it's really nice to see an organization that's like, we're not just the computer people. Like, we actually, like, believe in these broader mechanisms. And also, we don't wanna live in a world where human rights accountability is something that's, like, I don't know, like, a passe, impossible, infrastructurally destroyed thing. And I just, I don't know, I really appreciated you all taking that stance. I'm kind of wondering, I don't know. Were you worried about it?

Megan: I wouldn't say that we were worried about it. I mean, I wasn't worried about it. I think all of us felt a little uncomfortable, because HRDAG's [00:54:00] posture is always as the data nerds in the background. I mean, we joke, but it's really true, that our happy place is, like, a footnote in a technical appendix. Um, we really think that's where our work does its best work. And so it was more the discomfort of coming out and saying publicly more things about kind of our work, and the way we felt about it, and how important it felt to us. Not because we don't believe those things, or because we thought those things didn't need to be said, but just because we are not usually a very public-facing organization. And so that was something else that, you know, that we had as an internal conversation: can we all sit with this discomfort, and does this moment feel important enough that it calls for it?

I think that really was the, the turning point, and the conclusion for us, was watching the attacks on academic institutions, and on science, and attacks on civil society. [00:55:00] And then, really specifically, you know, to talk about kind of the risks that we thought about pertaining to our own work: there was an executive order that specifically named the International Criminal Court, and then there were some sanctions that came as a result of that against some prosecutors for the ICC. And we've worked on projects with the ICC. And so we felt: that's important work that needs doing, and, you know, we need clarification that that is within our rights, to do that work. And then there was, I mean, there have been so many executive orders, but the, the two that really felt close to our work were that one, and then there was another one, I forget exactly the title, but it said something about unleashing the power of the US police to basically do their jobs. And it really specifically mentioned and pinpointed, you know, defending police against criticisms of their work, which is what we and our partners feel is necessary to hold institutions accountable. And so we really [00:56:00] felt like it was important to say: we're gonna keep doing this work, we're gonna name the importance of it.

Alix: I also feel like, in this era, there's a lot of different foundations positioning themselves differently. I'm wondering if there are any call-outs you wanna make for people that have done a good job, like whether there are any program officers or foundations where you're like, those people, they're naming the thing, and yada yada. Is there anyone that's been, like, you think, doing a good job within philanthropy?

Megan: I mean, so I think, you know, all of us are probably aware that the MacArthur Foundation has been, and John Palfrey in particular has been, doing a lot of writing on this and really getting out in front. I think that they could do more and they could say more, and I think that that's gonna be my general position on philanthropy across the board.

Alix: Yeah. Take, take risks. Take risks, please. Yeah.

Megan: I mean, if not now, when, right? Like, this is, this is the moment. Um, but it has, it has been nice to see things like what's been coming out of the MacArthur Foundation, and to a certain extent the Ford Foundation, coming out and saying things. And then there's a larger group that sort of has signed on to some letters and things like that, that have come out and, and tried to have more of a solidarity-with-many position. And so I think, you know, we're seeing, we're seeing some of that. There are definitely others, where I have heard from individuals that sort of, their, their position is, like, just keep your head down. And I think that is not a position that is gonna be good for any of us in this moment. But also, like, I understand how you get there.

Yeah.

Alix: What do you want people to do?

Megan: So, a few things. What we want people to do is to have their own internal reflection, ideally, you know, drawing on the Universal Declaration of Human Rights. Pick the articles that speak to you and to your work, and that really identify the, the foundation for the work that you do. And ideally, you know, make your own statement, because then all of us can kind of be in this together, saying out loud: we've, we've always done this work, we've always [00:58:00] come from this place, but here it is laid out in black and white, in really firm language, the reason why it matters. And then to, you know, continue to be in conversation with each other, so that we can feel more solidarity. Because I think one of the really common lies under authoritarian regimes is that all of us are alone, and we're all sort of suffering and feeling afraid in isolation. And so having more of these conversations, and coming together to talk to each other about the risks that we're perceiving, the risks that we're experiencing, and the work that we're doing, I think is as important as doing the work. And then that, of course, is the third thing: do the work. You know, go out there and keep, keep doing your important work and your mission-driven work.

Alix: All right, well, next week we have a special episode from MozFest coming out. We were there in full force, and took some time to put something together to kind of give you a wide range of what we learned [00:59:00] and who we talked to there. And then we are also gonna be releasing some fuller-form interviews that we did on site in Barcelona.

Thank you to Georgia Iacovou and Sarah Myles for producing this episode, and we'll see you soon.