1
00:00:02,200 --> 00:00:05,120
Speaker 1: Can you imagine having to make a decision about who

2
00:00:05,160 --> 00:00:08,760
Speaker 1: gets access to something that could save their lives, and

3
00:00:08,960 --> 00:00:13,000
Speaker 1: as a result, who doesn't get access. Would you use

4
00:00:13,039 --> 00:00:16,680
Speaker 1: a lottery system, an algorithm? Would you make a call

5
00:00:16,800 --> 00:00:20,360
Speaker 1: on a first come, first served basis? Well, in the

6
00:00:20,440 --> 00:00:24,599
Speaker 1: nineteen sixties in Seattle, a committee, almost like a jury

7
00:00:24,680 --> 00:00:27,319
Speaker 1: of local citizens, was asked to do just that.

8
00:00:28,360 --> 00:00:31,600
Speaker 2: When first invited to serve on the committee, I was

9
00:00:31,720 --> 00:00:35,120
Speaker 2: very uncomfortable, feeling that I was taking the place of God.

10
00:00:36,640 --> 00:00:39,920
Speaker 1: We're about to hear from Rick Mizelle, a medical historian

11
00:00:40,040 --> 00:00:43,519
Speaker 1: at the University of Houston. A few years back, Rick

12
00:00:43,680 --> 00:00:46,360
Speaker 1: was doing some research when he came across this story.

13
00:00:47,720 --> 00:00:50,239
Speaker 3: I'm a glutton for punishment. I like topics that

14
00:00:50,320 --> 00:00:53,760
Speaker 3: are difficult to research and find.

15
00:00:54,720 --> 00:00:58,120
Speaker 1: He found this one article about the committee in Seattle

16
00:00:58,280 --> 00:00:59,360
Speaker 1: that caught his attention.

17
00:01:00,200 --> 00:01:03,400
Speaker 3: I was curious as to why I didn't know that

18
00:01:03,560 --> 00:01:08,360
Speaker 3: much about it. For whatever reason, historians have steered clear of

19
00:01:08,520 --> 00:01:12,000
Speaker 3: a bit of this conversation. It was really a fascinating

20
00:01:12,480 --> 00:01:15,640
Speaker 3: but difficult scenario to engage.

21
00:01:15,440 --> 00:01:18,600
Speaker 1: So Rick kept digging. He found out the story started

22
00:01:18,640 --> 00:01:22,360
Speaker 1: with the grand opening of a medical clinic. The treatment

23
00:01:22,360 --> 00:01:26,679
Speaker 1: the clinic provided was highly specialized and time consuming. So

24
00:01:26,880 --> 00:01:29,640
Speaker 1: time consuming that patients had to go in twice a

25
00:01:29,680 --> 00:01:32,600
Speaker 1: week and get hooked up to a machine overnight.

26
00:01:33,120 --> 00:01:37,440
Speaker 2: All night while they read or talk, or work or sleep.

27
00:01:38,040 --> 00:01:41,960
Speaker 2: The entire blood content of each patient is being circulated

28
00:01:42,000 --> 00:01:46,800
Speaker 2: through an artificial kidney and cleaned and pumped back into

29
00:01:46,800 --> 00:01:47,600
Speaker 2: the body again.

30
00:01:49,480 --> 00:01:52,840
Speaker 1: These clips are from an NBC documentary called Who Shall Live,

31
00:01:53,280 --> 00:01:57,440
Speaker 1: filmed a few years after the Seattle Artificial Kidney Center opened.

32
00:01:57,880 --> 00:02:00,920
Speaker 1: It aired in nineteen sixty five on national television and

33
00:02:00,960 --> 00:02:05,880
Speaker 1: created quite a stir. The Kidney Center was revolutionary. It

34
00:02:06,000 --> 00:02:08,240
Speaker 1: was the first place in the whole world to offer

35
00:02:08,320 --> 00:02:11,920
Speaker 1: long term kidney dialysis, a brand new type of life

36
00:02:12,000 --> 00:02:17,280
Speaker 1: saving treatment. There was just one problem.

37
00:02:17,520 --> 00:02:20,519
Speaker 2: The cold, hard fact of the matter is, there are

38
00:02:20,680 --> 00:02:24,320
Speaker 2: just so many places available on the kidney machine, and

39
00:02:24,400 --> 00:02:28,120
Speaker 2: there are more applicants than places. Somebody has got to

40
00:02:28,160 --> 00:02:31,760
Speaker 2: be left out, and somebody has got to decide who

41
00:02:31,760 --> 00:02:33,440
Speaker 2: shall live and who shall die.

42
00:02:34,880 --> 00:02:40,280
Speaker 1: Yikes, who shall live and who shall die? When Rick

43
00:02:40,320 --> 00:02:43,680
Speaker 1: saw who the people were who would determine that, he

44
00:02:43,840 --> 00:02:48,560
Speaker 1: was shocked. The Kidney Center put seven seemingly random people

45
00:02:48,600 --> 00:02:52,399
Speaker 1: in charge. They would later come to be known as

46
00:02:52,440 --> 00:02:55,840
Speaker 1: the God Squad, the ones to determine the fates of

47
00:02:55,960 --> 00:02:59,160
Speaker 1: thousands of their neighbors. It was up to them to

48
00:02:59,240 --> 00:03:01,040
Speaker 1: decide who would be saved.

49
00:03:01,919 --> 00:03:04,560
Speaker 3: I thought it was pretty unbelievable that they would have

50
00:03:05,040 --> 00:03:08,640
Speaker 3: lay people and community people making this decision.

51
00:03:10,000 --> 00:03:13,440
Speaker 1: This whole thing, this attempt to figure out who should

52
00:03:13,480 --> 00:03:18,640
Speaker 1: have access, became so controversial, such a pivotal point that

53
00:03:18,720 --> 00:03:20,840
Speaker 1: it would become a wake up call for the need

54
00:03:20,919 --> 00:03:25,040
Speaker 1: for a more transparent system. This is a story that

55
00:03:25,120 --> 00:03:28,000
Speaker 1: paved the way for what is now known as bioethics.

56
00:03:29,840 --> 00:03:33,840
Speaker 1: I'm your host, Lauren Aurora Hutchinson. I'm the director of

57
00:03:33,880 --> 00:03:38,200
Speaker 1: the Ideas Lab at the Johns Hopkins Berman Institute of Bioethics.

58
00:03:38,800 --> 00:03:41,520
Speaker 1: In this season of Playing God, we went behind the

59
00:03:41,560 --> 00:03:44,880
Speaker 1: scenes to discover how some of the most significant medical

60
00:03:44,880 --> 00:03:49,960
Speaker 1: innovations impacted people's lives and continue to. Whether it's saving

61
00:03:50,040 --> 00:03:53,800
Speaker 1: lives or creating babies, a new technology was usually waiting

62
00:03:53,840 --> 00:03:57,760
Speaker 1: in the wings, along with a multitude of ethical questions.

63
00:03:59,360 --> 00:04:01,800
Speaker 1: We looked at where we draw the line, should we

64
00:04:01,880 --> 00:04:04,920
Speaker 1: draw the line, what's right and what's wrong when it

65
00:04:04,960 --> 00:04:08,840
Speaker 1: comes to our bodies. And we turned to bioethicists to

66
00:04:08,960 --> 00:04:13,880
Speaker 1: answer these questions. But in this bonus prequel episode, we're

67
00:04:13,880 --> 00:04:18,040
Speaker 1: doing something different. We're going back in time to immerse

68
00:04:18,120 --> 00:04:22,240
Speaker 1: you in one of the most important foundational stories of

69
00:04:22,360 --> 00:04:27,919
Speaker 1: modern bioethics from Pushkin Industries and the Johns Hopkins Berman

70
00:04:28,000 --> 00:04:38,120
Speaker 1: Institute of Bioethics. This is Playing God. By the nineteen fifties,

71
00:04:38,200 --> 00:04:41,400
Speaker 1: if someone had kidney disease, they could be surgically connected

72
00:04:41,440 --> 00:04:44,880
Speaker 1: to a machine called an artificial kidney, also known as

73
00:04:44,880 --> 00:04:49,960
Speaker 1: the dialysis machine. Dialysis at the time worked well for

74
00:04:50,040 --> 00:04:52,880
Speaker 1: anyone whose kidneys needed help for just a short while,

75
00:04:53,640 --> 00:04:57,279
Speaker 1: but people whose kidneys had failed needed ongoing dialysis for

76
00:04:57,360 --> 00:05:02,640
Speaker 1: life or they would die. Connecting to a dialysis machine

77
00:05:02,680 --> 00:05:05,480
Speaker 1: did a lot of damage to blood vessels, so there

78
00:05:05,520 --> 00:05:09,080
Speaker 1: were only so many sessions a patient could do. In

79
00:05:09,200 --> 00:05:13,960
Speaker 1: nineteen sixty, a young Seattle nephrologist named Belding Scribner decided

80
00:05:13,960 --> 00:05:17,719
Speaker 1: to do something about it. He designed a little U

81
00:05:17,800 --> 00:05:21,599
Speaker 1: shaped piece of hollow Teflon called a shunt. It could

82
00:05:21,600 --> 00:05:24,560
Speaker 1: be left in a patient's arm or leg permanently to

83
00:05:24,680 --> 00:05:28,200
Speaker 1: use again and again to connect to a dialysis machine.

84
00:05:28,320 --> 00:05:30,960
Speaker 1: This meant chronic kidney disease would no longer be a

85
00:05:31,000 --> 00:05:34,719
Speaker 1: death sentence. I just want to pause here for a moment,

86
00:05:34,880 --> 00:05:38,120
Speaker 1: because even with the Scribner shunt, it wasn't possible to

87
00:05:38,200 --> 00:05:42,600
Speaker 1: treat everyone. So who should be granted access when there

88
00:05:42,640 --> 00:05:46,800
Speaker 1: isn't enough of something life saving to go around? This

89
00:05:46,920 --> 00:05:49,800
Speaker 1: question around the allocation of resources is one of the

90
00:05:49,800 --> 00:05:54,000
Speaker 1: most central questions in bioethics that's still being asked about

91
00:05:54,080 --> 00:05:58,159
Speaker 1: all sorts of things today. What's the best way, or rather,

92
00:05:58,600 --> 00:06:02,160
Speaker 1: what's the least bad way to resolve this kind of dilemma?

93
00:06:02,520 --> 00:06:03,120
Speaker 4: Well, here's what

94
00:06:03,080 --> 00:06:07,240
Speaker 1: happened in this case. In nineteen sixty two, the Seattle

95
00:06:07,360 --> 00:06:12,159
Speaker 1: Artificial Kidney Center opened at Swedish Hospital. Initially, the center

96
00:06:12,440 --> 00:06:15,520
Speaker 1: had just three machines and could only treat up to

97
00:06:15,640 --> 00:06:19,640
Speaker 1: nine patients. Each person selected would need to continue to

98
00:06:19,720 --> 00:06:22,880
Speaker 1: be treated for the rest of their life. At the time,

99
00:06:23,080 --> 00:06:26,039
Speaker 1: chronic kidney disease killed tens of thousands of people in

100
00:06:26,080 --> 00:06:29,960
Speaker 1: the US each year. Regular dialysis was their only shot

101
00:06:30,040 --> 00:06:34,000
Speaker 1: at staying alive. So how would the center choose which

102
00:06:34,080 --> 00:06:37,800
Speaker 1: patients would get a second chance at life? To begin,

103
00:06:38,360 --> 00:06:41,680
Speaker 1: Belding and the hospital set up an initial screening process

104
00:06:41,839 --> 00:06:46,120
Speaker 1: to whittle down the thousands of patients to hundreds, and

105
00:06:46,160 --> 00:06:48,479
Speaker 1: in order to even be considered for a spot in

106
00:06:48,520 --> 00:06:52,760
Speaker 1: the first place, each candidate needed a referral from their doctor.

107
00:06:53,040 --> 00:06:54,320
Speaker 1: Rick says, we

108
00:06:54,400 --> 00:06:58,760
Speaker 3: don't know if they accepted referrals from black physicians, and

109
00:06:58,839 --> 00:07:01,920
Speaker 3: there were not many black physicians in the nineteen sixties,

110
00:07:02,279 --> 00:07:05,360
Speaker 3: which is part of what I argue is problematic about

111
00:07:05,400 --> 00:07:08,880
Speaker 3: the committee. You know, Seattle is still a city that

112
00:07:09,040 --> 00:07:14,120
Speaker 3: is highly racialized, highly segregated. It's not Alabama or Mississippi,

113
00:07:14,160 --> 00:07:17,600
Speaker 3: but there was still segregation in hospitals, which

114
00:07:17,360 --> 00:07:20,680
Speaker 1: of course had implications as to who would get referrals

115
00:07:20,720 --> 00:07:24,080
Speaker 1: to even be on the list. The center then had

116
00:07:24,120 --> 00:07:29,760
Speaker 1: specific criteria. Patients had to be between fifteen and forty

117
00:07:29,800 --> 00:07:34,040
Speaker 1: five years old. The hospital advised that children might not

118
00:07:34,080 --> 00:07:38,120
Speaker 1: be able to handle ongoing long term dialysis both mentally

119
00:07:38,280 --> 00:07:42,400
Speaker 1: and physically. Those of the right age then had to

120
00:07:42,480 --> 00:07:45,120
Speaker 1: show they could pay for three years of the treatment

121
00:07:45,240 --> 00:07:49,920
Speaker 1: up front: thirty thousand dollars, the equivalent of three hundred thousand

122
00:07:49,960 --> 00:07:53,480
Speaker 1: dollars today, and they had to be able to access

123
00:07:53,480 --> 00:07:55,120
Speaker 1: the center twice a week.

124
00:07:55,760 --> 00:07:59,760
Speaker 3: Perhaps most problematic is they could not have underlying conditions,

125
00:08:00,240 --> 00:08:04,400
Speaker 3: so diabetes, hypertension, all of those things would disqualify you

126
00:08:04,480 --> 00:08:07,000
Speaker 3: from the possibility of chronic dialysis.

127
00:08:07,720 --> 00:08:11,200
Speaker 1: The medical advisory committee also interviewed the candidates to get

128
00:08:11,200 --> 00:08:13,200
Speaker 1: a sense of their psychological health.

129
00:08:13,880 --> 00:08:17,720
Speaker 3: So people who were emotionally unstable, who were poor, who

130
00:08:17,720 --> 00:08:21,160
Speaker 3: did not have certain kinds of jobs, who were unmarried,

131
00:08:21,200 --> 00:08:25,280
Speaker 3: who did not go to church were largely considered inherently

132
00:08:25,320 --> 00:08:30,280
Speaker 3: biologically flawed by the medical committee. Those were the individuals

133
00:08:30,320 --> 00:08:33,880
Speaker 3: who were not emotionally stable enough to deal with long

134
00:08:33,960 --> 00:08:35,240
Speaker 3: term dialysis.

135
00:08:36,600 --> 00:08:40,840
Speaker 1: The treatment center evaluated about fifty candidates for each available slot.

136
00:08:41,520 --> 00:08:44,000
Speaker 1: They would then whittle that number down and hand the

137
00:08:44,120 --> 00:08:47,280
Speaker 1: final decision over to the God Squad. The God Squad

138
00:08:47,360 --> 00:08:50,640
Speaker 1: would then have to choose one person out of about

139
00:08:50,679 --> 00:08:54,120
Speaker 1: four candidates per slot. The people on the God Squad

140
00:08:54,160 --> 00:08:57,960
Speaker 1: were not experts in kidney disease or dialysis. Only two

141
00:08:58,000 --> 00:09:00,240
Speaker 1: of them were medical professionals.

142
00:09:00,880 --> 00:09:05,360
Speaker 2: I am a banker, I am a surgeon, I am

143
00:09:05,360 --> 00:09:09,680
Speaker 2: a lawyer, I am a physician, I am a labor leader,

144
00:09:10,920 --> 00:09:14,080
Speaker 2: I am a housewife, I am a clergyman.

145
00:09:15,360 --> 00:09:18,720
Speaker 1: Belding Scribner and his colleagues decided it was unfair to

146
00:09:18,760 --> 00:09:22,920
Speaker 1: burden physicians with making the final call. Their reasoning: since

147
00:09:23,000 --> 00:09:25,400
Speaker 1: all of the candidates on their list would benefit from

148
00:09:25,440 --> 00:09:29,640
Speaker 1: the treatment and were deemed good candidates, the choice of

149
00:09:29,679 --> 00:09:31,840
Speaker 1: who to save was now really more of a social

150
00:09:31,880 --> 00:09:35,440
Speaker 1: one than a medical one. I can't help noticing where

151
00:09:35,480 --> 00:09:38,080
Speaker 1: they drew the line between this being a social decision

152
00:09:38,200 --> 00:09:41,080
Speaker 1: rather than a medical one, because to me, it seems

153
00:09:41,120 --> 00:09:45,560
Speaker 1: like so many of these factors were actually social anyway.

154
00:09:46,200 --> 00:09:48,560
Speaker 1: It was at this point they decided to turn it

155
00:09:48,640 --> 00:09:52,880
Speaker 1: over to the ordinary people of Seattle. It was their

156
00:09:52,960 --> 00:09:56,200
Speaker 1: job to evaluate these patients and determine who should live

157
00:09:56,480 --> 00:09:57,400
Speaker 1: and who should die.

158
00:09:58,480 --> 00:10:01,439
Speaker 3: There was a woman who was up for evaluation and

159
00:10:02,440 --> 00:10:04,840
Speaker 3: her husband and sons said that she was no longer cleaning

160
00:10:04,880 --> 00:10:07,960
Speaker 3: the house. That was part of what they brought up

161
00:10:08,000 --> 00:10:11,640
Speaker 3: to evaluate her as to her worthiness of dialysis.

162
00:10:12,880 --> 00:10:21,040
Speaker 1: We'll be right back after the break. The God Squad's

163
00:10:21,080 --> 00:10:25,320
Speaker 1: official name was the Admissions and Policies Committee of the

164
00:10:25,360 --> 00:10:28,920
Speaker 1: Seattle Artificial Kidney Center at Swedish Hospital.

165
00:10:29,640 --> 00:10:33,600
Speaker 3: They, of course, say that it represents a cross section

166
00:10:34,160 --> 00:10:38,479
Speaker 3: of the Seattle population. It was mostly men, mostly middle class.

167
00:10:38,240 --> 00:10:42,440
Speaker 1: Administrators from the kidney center hand selected the God

168
00:10:42,480 --> 00:10:47,360
Speaker 1: Squad members. That's right, hand selected. The hospital gave the

169
00:10:47,360 --> 00:10:51,960
Speaker 1: committee an information packet on each candidate, with their medical records,

170
00:10:52,000 --> 00:10:57,720
Speaker 1: psychological evaluations, financial information, even letters of reference. One of

171
00:10:57,720 --> 00:11:00,560
Speaker 1: the doctors who briefed the committee later said, we told

172
00:11:00,559 --> 00:11:03,400
Speaker 1: them frankly that there were no guidelines. They were on

173
00:11:03,480 --> 00:11:07,280
Speaker 1: their own. We really dumped it on them. The committee

174
00:11:07,320 --> 00:11:10,760
Speaker 1: decided to review every piece of biographical information they could

175
00:11:10,760 --> 00:11:14,079
Speaker 1: get their hands on. They also decided to enlist other

176
00:11:14,120 --> 00:11:17,640
Speaker 1: people to help them, a social worker and a psychologist.

177
00:11:19,120 --> 00:11:21,480
Speaker 1: What they were looking to determine was what they called

178
00:11:21,480 --> 00:11:23,360
Speaker 1: a person's social worth.

179
00:11:24,160 --> 00:11:27,679
Speaker 3: One of the criteria that the Patient Advisory Committee often

180
00:11:27,720 --> 00:11:29,959
Speaker 3: considered was the common good.

181
00:11:30,720 --> 00:11:33,400
Speaker 1: Rick says, if you read through the committee's records, you

182
00:11:33,440 --> 00:11:36,679
Speaker 1: can piece together what the members thought made someone worthy.

183
00:11:37,480 --> 00:11:40,120
Speaker 3: Being a white collar worker was better than being a

184
00:11:40,120 --> 00:11:43,479
Speaker 3: blue collar worker. A woman who was a known prostitute

185
00:11:43,960 --> 00:11:48,000
Speaker 3: was rejected in favor of a woman who was a mother of four.

186
00:11:48,679 --> 00:11:51,480
Speaker 3: Another one that sticks out is a young man who

187
00:11:51,600 --> 00:11:53,880
Speaker 3: was considered to be, and this is the term that

188
00:11:53,960 --> 00:11:57,760
Speaker 3: they use, a ne'er-do-well, a playboy, and

189
00:11:57,840 --> 00:12:01,040
Speaker 3: so he does not have the right temperament or morality,

190
00:12:01,360 --> 00:12:03,160
Speaker 3: then he's not worthy of dialysis.

191
00:12:04,400 --> 00:12:07,360
Speaker 1: The committee members were kept anonymous, and their work happened

192
00:12:07,360 --> 00:12:11,400
Speaker 1: behind closed doors, But in nineteen sixty two, a prominent

193
00:12:11,480 --> 00:12:15,920
Speaker 1: reporter named Shana Alexander revealed their inner workings to the

194
00:12:15,960 --> 00:12:20,240
Speaker 1: world in a Life magazine article. Amazingly, all of the

195
00:12:20,280 --> 00:12:23,200
Speaker 1: committee members agreed to be interviewed as long as they

196
00:12:23,200 --> 00:12:27,280
Speaker 1: were not identified. The committee even re-enacted one of

197
00:12:27,320 --> 00:12:30,600
Speaker 1: their first deliberations so Shana could hear how they sounded

198
00:12:30,640 --> 00:12:35,000
Speaker 1: in action. Their conversations made it clear that to committee members,

199
00:12:35,080 --> 00:12:38,320
Speaker 1: what made someone worthy of saving was a matter of

200
00:12:38,360 --> 00:12:42,679
Speaker 1: personal opinion. We had voice actors read from the article.

201
00:12:44,640 --> 00:12:46,360
Speaker 5: If we are still looking for the men with the

202
00:12:46,440 --> 00:12:49,880
Speaker 5: highest potential of service to society, I think we must

203
00:12:50,000 --> 00:12:52,559
Speaker 5: consider that the chemist and the accountant have the finest

204
00:12:52,679 --> 00:12:55,160
Speaker 5: educational backgrounds of all five candidates.

205
00:12:55,480 --> 00:12:57,520
Speaker 6: How do the rest of you feel about number three,

206
00:12:58,080 --> 00:13:01,520
Speaker 6: the small businessman with three children. I'm impressed that this

207
00:13:01,640 --> 00:13:04,560
Speaker 6: doctor took special pains to mention this man as active

208
00:13:04,559 --> 00:13:07,080
Speaker 6: in church work. This is an indication to me of

209
00:13:07,200 --> 00:13:08,760
Speaker 6: character and moral strength.

210
00:13:09,480 --> 00:13:11,920
Speaker 7: For the children's sake, we've got to reckon with the

211
00:13:11,960 --> 00:13:15,640
Speaker 7: surviving parents' opportunity to remarry, and a woman with three

212
00:13:15,760 --> 00:13:18,280
Speaker 7: children has a better chance to find a new husband

213
00:13:18,400 --> 00:13:20,880
Speaker 7: than a very young widow with six children.

214
00:13:21,280 --> 00:13:22,960
Speaker 6: How can we possibly be sure of that?

215
00:13:25,679 --> 00:13:29,120
Speaker 1: Shana's article, not surprisingly, caused outrage.

216
00:13:29,800 --> 00:13:32,520
Speaker 3: Lawyers at the time, you know, argue that it was

217
00:13:32,600 --> 00:13:36,599
Speaker 3: really just a way for physicians to avoid the responsibility

218
00:13:36,640 --> 00:13:40,080
Speaker 3: of making a difficult decision that they did not want

219
00:13:40,120 --> 00:13:42,280
Speaker 3: to make and that nobody wants to make.

220
00:13:43,200 --> 00:13:46,439
Speaker 1: Mostly, Rick says, people pointed out the obvious flaws with

221
00:13:46,520 --> 00:13:48,280
Speaker 1: a metric like worthiness.

222
00:13:48,640 --> 00:13:51,200
Speaker 3: Someone who is, you know, an activist in the civil

223
00:13:51,280 --> 00:13:54,200
Speaker 3: rights movement. That's a social good, but it might not

224
00:13:54,320 --> 00:13:57,640
Speaker 3: fit within the ideals of what it is that they

225
00:13:57,720 --> 00:13:59,880
Speaker 3: think of as a social good. So you could have a

226
00:14:00,240 --> 00:14:03,600
Speaker 3: respected business person who was still unethical in a number

227
00:14:03,640 --> 00:14:04,920
Speaker 3: of different ways.

228
00:14:05,760 --> 00:14:08,880
Speaker 1: In the end, the committee selected the first group of patients,

229
00:14:09,360 --> 00:14:14,520
Speaker 1: among them a physicist, an engineer, a car salesman, an

230
00:14:14,600 --> 00:14:20,239
Speaker 1: aircraft worker, and an oil company executive. By most accounts,

231
00:14:20,440 --> 00:14:25,160
Speaker 1: the God Squad kept meeting until nineteen seventy two. That year,

232
00:14:25,520 --> 00:14:30,720
Speaker 1: Congress passed legislation making dialysis available to everyone whose kidneys

233
00:14:30,720 --> 00:14:34,680
Speaker 1: had failed, but the committee lived on in the public imagination.

234
00:14:36,560 --> 00:14:39,840
Speaker 1: Many people didn't get the life saving treatment they needed

235
00:14:39,880 --> 00:14:44,240
Speaker 1: because they were deemed less worthy. The God Squad were

236
00:14:44,320 --> 00:14:46,640
Speaker 1: people who just had to make up the rules as

237
00:14:46,640 --> 00:14:50,080
Speaker 1: they went along. There was no template yet for best

238
00:14:50,080 --> 00:14:53,960
Speaker 1: practices or ethical guidance in making these kinds of decisions.

239
00:14:59,040 --> 00:15:01,200
Speaker 4: They were starting from scratch, you know, and I think

240
00:15:01,240 --> 00:15:04,120
Speaker 4: that we have a much more robust literature. You know,

241
00:15:04,160 --> 00:15:07,400
Speaker 4: we have a history of bioethical analysis to lean on now,

242
00:15:07,880 --> 00:15:10,840
Speaker 4: and of course we're still improving over time in how

243
00:15:10,840 --> 00:15:11,920
Speaker 4: we think about these things.

244
00:15:12,680 --> 00:15:16,320
Speaker 1: This is Kate Butler. She's a clinical nephrologist based in Seattle,

245
00:15:17,040 --> 00:15:19,440
Speaker 1: and she says, what is key is to design a

246
00:15:19,480 --> 00:15:23,600
Speaker 1: system that's fair. But of course fairness can be understood

247
00:15:23,640 --> 00:15:24,800
Speaker 1: in lots of different ways.

248
00:15:26,080 --> 00:15:28,800
Speaker 4: Do we want to make the very best use of

249
00:15:28,880 --> 00:15:31,360
Speaker 4: resources in terms of saving the most lives in terms

250
00:15:31,360 --> 00:15:34,600
Speaker 4: of having the most life years lived? Do we want

251
00:15:34,680 --> 00:15:38,280
Speaker 4: to consider quality of life years lived? And if so,

252
00:15:38,720 --> 00:15:42,480
Speaker 4: who decides on quality? And or do we want to

253
00:15:42,480 --> 00:15:46,320
Speaker 4: make sure that we're allocating resources in a way that

254
00:15:46,320 --> 00:15:49,720
Speaker 4: feels equitable to us? And again, who is that us?

255
00:15:49,920 --> 00:15:53,120
Speaker 4: Who's making the decision about whether the system is equitable.

256
00:15:54,160 --> 00:15:58,080
Speaker 1: Kate told us that nowadays systems are based on ethical foundations.

257
00:15:58,640 --> 00:16:01,200
Speaker 1: For example, one way of doing things would be to

258
00:16:01,280 --> 00:16:04,440
Speaker 1: prioritize recipients who we expect to live the longest after

259
00:16:04,480 --> 00:16:09,600
Speaker 1: a transplant, which would be a utilitarian approach. Or you

260
00:16:09,640 --> 00:16:12,320
Speaker 1: could use a lottery so that everyone on the list

261
00:16:12,360 --> 00:16:15,200
Speaker 1: gets an equal chance of a transplant, which would be

262
00:16:15,240 --> 00:16:18,880
Speaker 1: based on the principle of equality. I mean, which one

263
00:16:18,920 --> 00:16:21,920
Speaker 1: do you think would be most fair? Kate gave an

264
00:16:21,920 --> 00:16:25,360
Speaker 1: example: the national waitlist for kidneys, which is a

265
00:16:25,400 --> 00:16:27,920
Speaker 1: modified version of waiting until your number is

266
00:16:27,880 --> 00:16:32,520
Speaker 4: called, and that process has been worked out over decades

267
00:16:32,880 --> 00:16:37,880
Speaker 4: as a collaboration between clinicians, bioethicists, and the community by way

268
00:16:37,880 --> 00:16:38,880
Speaker 4: of community forums.

269
00:16:39,680 --> 00:16:42,760
Speaker 1: There's an organization that monitors the system to see if

270
00:16:42,800 --> 00:16:45,720
Speaker 1: it's working the way it's supposed to. It's called the

271
00:16:45,800 --> 00:16:50,520
Speaker 1: United Network for Organ Sharing. In twenty fourteen, they discovered

272
00:16:50,520 --> 00:16:54,400
Speaker 1: a flaw. The waiting list wasn't accounting for some groups

273
00:16:54,440 --> 00:16:57,960
Speaker 1: of people, mainly people of color, having a harder time

274
00:16:58,040 --> 00:17:01,640
Speaker 1: getting on the list in the first place. In bioethics,

275
00:17:01,760 --> 00:17:05,600
Speaker 1: equity is a key principle. It's important to account for

276
00:17:05,640 --> 00:17:10,120
Speaker 1: disadvantage or underrepresentation. So they made a change.

277
00:17:10,359 --> 00:17:13,760
Speaker 4: There was an intentional effort to change the waitlist criteria

278
00:17:13,840 --> 00:17:16,679
Speaker 4: to give you retroactive time for time since you started

279
00:17:16,760 --> 00:17:21,480
Speaker 4: on dialysis, so people would get points for the time

280
00:17:21,480 --> 00:17:24,320
Speaker 4: spent on the waitlist or how long they had been

281
00:17:24,359 --> 00:17:25,920
Speaker 4: on dialysis, whichever is longer.

282
00:17:26,600 --> 00:17:31,080
Speaker 1: The change was apparent within months. The system still isn't perfect,

283
00:17:31,520 --> 00:17:33,960
Speaker 1: but Kate says it's an example of how the field

284
00:17:33,960 --> 00:17:37,160
Speaker 1: of bioethics has evolved since the time of the God Squad.

285
00:17:37,200 --> 00:17:43,080
Speaker 4: There's more to medicine than just clinical analysis of individual

286
00:17:43,119 --> 00:17:48,840
Speaker 4: cases, that considering the bioethical implications of these decisions was

287
00:17:49,160 --> 00:17:52,399
Speaker 4: necessary and important. I think that's why people refer to

288
00:17:52,440 --> 00:17:56,320
Speaker 4: this example as the birth of bioethics today.

289
00:17:56,480 --> 00:17:59,960
Speaker 1: It's part of the process to consider ethics in medical advances.

290
00:18:01,080 --> 00:18:04,520
Speaker 4: Any situation in which you have resource scarcity for something

291
00:18:04,560 --> 00:18:08,520
Speaker 4: so consequential as healthcare, there's going to be tragedy, right?

292
00:18:08,640 --> 00:18:10,320
Speaker 4: There's going to be someone who doesn't get the care

293
00:18:10,480 --> 00:18:12,400
Speaker 4: you want for them. We're not going to be able

294
00:18:12,400 --> 00:18:18,600
Speaker 4: to design a perfect system.

295
00:18:18,680 --> 00:18:21,720
Speaker 1: As we have heard from this series, the landscape is

296
00:18:21,840 --> 00:18:26,320
Speaker 1: ever shifting. Every time there's a new medical innovation, there's

297
00:18:26,359 --> 00:18:31,000
Speaker 1: a whole new set of ethical questions. If you've enjoyed

298
00:18:31,080 --> 00:18:34,320
Speaker 1: Playing God, then we're going to have plenty more stories

299
00:18:34,359 --> 00:18:36,960
Speaker 1: like this coming out of the Ideas Lab at the

300
00:18:37,040 --> 00:18:42,560
Speaker 1: Johns Hopkins Berman Institute of Bioethics. Playing God is a

301
00:18:42,600 --> 00:18:46,359
Speaker 1: co-production of Pushkin Industries and the Johns Hopkins Berman

302
00:18:46,440 --> 00:18:50,879
Speaker 1: Institute of Bioethics. Special thanks to our guests in this episode,

303
00:18:51,160 --> 00:18:55,520
Speaker 1: Rick Mizelle and Kate Butler. Emily Vaughan is our lead producer.

304
00:18:56,440 --> 00:19:02,120
Speaker 1: Production support from Sophie Crane and Lucy Sullivan. Our editors

305
00:19:02,240 --> 00:19:06,520
Speaker 1: are Karen Shakerdge and Kate Parkinson-Morgan. Theme music and

306
00:19:06,600 --> 00:19:11,960
Speaker 1: mixing by Echo Mountain, engineering support from Sarah Bruguiere and

307
00:19:12,000 --> 00:19:17,760
Speaker 1: Amanda Kay Wang. Show art by Sean Carney, fact checking by

308
00:19:17,840 --> 00:19:22,760
Speaker 1: David Jar and Arthur Gompertz. Our executive producer is Justine

309
00:19:22,840 --> 00:19:26,760
Speaker 1: Lang at the Johns Hopkins Berman Institute of Bioethics. Our

310
00:19:26,800 --> 00:19:31,119
Speaker 1: executive producers are Jeffrey Kahn and Anna Mastroianni, working with

311
00:19:31,200 --> 00:19:35,720
Speaker 1: Amelia Hood and with support from Susan Snead, Aaron Henkin,

312
00:19:36,080 --> 00:19:42,280
Speaker 1: Abigail Brickler, Kim Bikermer, Anna Oakes, and Jamie Smith. Funding

313
00:19:42,359 --> 00:19:46,920
Speaker 1: provided by the Greenwall Foundation. Special thanks to voice

314
00:19:46,960 --> 00:19:51,960
Speaker 1: coach Vicky Merrick. This is our last episode, so we'd

315
00:19:52,000 --> 00:19:54,320
Speaker 1: like to thank some of the many people at Pushkin

316
00:19:54,440 --> 00:19:59,920
Speaker 1: who've supported this show throughout the season, including Jacob Weisberg,

317
00:20:00,840 --> 00:20:08,719
Speaker 1: Heather Fain, John Schnaars, LeeTal Molad, Gretta Cohn, Carly Migliori,

318
00:20:10,040 --> 00:20:18,840
Speaker 1: Jasmine Perez, Eric Sandler, Jordan McMill, Isabella Navarez, Nicole op

319
00:20:18,920 --> 00:20:28,080
Speaker 1: Den Bosch, Maya Kanig, Jake Flanagan, Owen Miller, David Glover,

320
00:20:29,280 --> 00:20:35,480
Speaker 1: Nina Lawrence, Mia Lobel, and Ian Petsa. To learn more

321
00:20:35,520 --> 00:20:39,440
Speaker 1: about Bioethics and the issues presented in this series, please

322
00:20:39,560 --> 00:20:45,080
Speaker 1: visit bioethics dot jhu dot edu forward slash Playing God.

323
00:20:46,240 --> 00:20:50,240
Speaker 1: I'm Lauren Aurora Hutchinson. Thanks for listening to Playing God.

324
00:21:00,480 --> 00:21:04,240
Speaker 1: If you're interested in learning more about these stories and discussions,

325
00:21:04,520 --> 00:21:08,119
Speaker 1: visit the Berman Institute's guide to the podcast at Bioethics

326
00:21:08,160 --> 00:21:12,359
Speaker 1: dot jhu dot edu, slash Playing God, or

327
00:21:12,359 --> 00:21:15,200
Speaker 1: find us on social media at Berman Institute.