1
00:00:00,160 --> 00:00:03,440
Speaker 1: A quick note before we start the show. This episode

2
00:00:03,480 --> 00:00:07,920
Speaker 1: is about depression and treatment for depression. It contains references

3
00:00:07,920 --> 00:00:11,640
Speaker 1: to suicide. If you or someone you know is struggling

4
00:00:11,680 --> 00:00:14,360
Speaker 1: with thoughts of suicide, we have links to hotlines and

5
00:00:14,400 --> 00:00:15,920
Speaker 1: resources in our show notes.

6
00:00:17,600 --> 00:00:23,239
Speaker 2: I am completely dependent upon electricity as medicine and there

7
00:00:23,280 --> 00:00:26,000
Speaker 2: will never be a point in my life where I

8
00:00:26,040 --> 00:00:30,440
Speaker 2: can quote go off the grid because I can never

9
00:00:30,640 --> 00:00:34,360
Speaker 2: be without electricity for my own survival.

10
00:00:35,680 --> 00:00:39,800
Speaker 1: Brandy Ellis calls herself a cyborg. She's had an electrical

11
00:00:39,840 --> 00:00:43,360
Speaker 1: implant in her brain for twelve years. Before she got

12
00:00:43,360 --> 00:00:47,120
Speaker 1: the implant, she often struggled with depression. She says it

13
00:00:47,159 --> 00:00:51,280
Speaker 1: has made her feel more like herself. Brandy was first

14
00:00:51,280 --> 00:00:55,160
Speaker 1: diagnosed with depression when she was twenty. At that time,

15
00:00:55,320 --> 00:00:58,440
Speaker 1: she slept twenty hours a day, and when she was

16
00:00:58,520 --> 00:01:02,560
Speaker 1: awake, she often cried for no reason. So she saw

17
00:01:02,600 --> 00:01:05,720
Speaker 1: a therapist and went on medication, and she felt better,

18
00:01:06,080 --> 00:01:09,720
Speaker 1: like herself again. She consulted with her doctor and decided

19
00:01:09,760 --> 00:01:13,400
Speaker 1: to stop taking medication, but a few years later her

20
00:01:13,440 --> 00:01:17,319
Speaker 1: symptoms came back. Starting back up with medication and therapy

21
00:01:17,440 --> 00:01:21,039
Speaker 1: pulled her out of it. The depression lifted. This cycle

22
00:01:21,120 --> 00:01:23,679
Speaker 1: went on for over a decade, but when she was

23
00:01:23,720 --> 00:01:27,600
Speaker 1: thirty two, the depression came back, and this time it

24
00:01:27,640 --> 00:01:28,320
Speaker 1: felt different.

25
00:01:30,080 --> 00:01:34,880
Speaker 2: I wasn't sleepy, I was anxious, and I had insomnia,

26
00:01:35,720 --> 00:01:40,240
Speaker 2: and I was less weepy and more cranky.

27
00:01:41,360 --> 00:01:44,160
Speaker 1: She had assumed her depression would resolve like it had

28
00:01:44,280 --> 00:01:47,319
Speaker 1: every other time, once she found the right combination of

29
00:01:47,360 --> 00:01:51,200
Speaker 1: medication and therapy, but for four years nothing helped.

30
00:01:52,280 --> 00:02:00,440
Speaker 2: I tried over two dozen different medications of every conceivable type,

31
00:02:01,200 --> 00:02:04,040
Speaker 2: and that is not counting all the various dosages of

32
00:02:04,120 --> 00:02:09,160
Speaker 2: those medications and all the various combinations of multiple medications.

33
00:02:09,600 --> 00:02:12,960
Speaker 1: Meanwhile, she couldn't hold a job. She moved back home

34
00:02:13,000 --> 00:02:16,840
Speaker 1: to Delray Beach, Florida, where her parents supported her financially,

35
00:02:17,840 --> 00:02:21,040
Speaker 1: but not everyone in her life was so supportive. Some

36
00:02:21,120 --> 00:02:24,160
Speaker 1: relatives told her that her depression was her own fault,

37
00:02:24,800 --> 00:02:27,760
Speaker 1: that she was just being lazy. For the first time

38
00:02:27,760 --> 00:02:31,519
Speaker 1: in her life, Brandy had thoughts of suicide.

39
00:02:32,080 --> 00:02:36,920
Speaker 2: I'd never understood what it meant to be suicidal in

40
00:02:36,960 --> 00:02:40,320
Speaker 2: any of my previous depressive episodes, and I refer to

41
00:02:40,360 --> 00:02:43,120
Speaker 2: it now as my brain tried to kill me and

42
00:02:43,240 --> 00:02:44,959
Speaker 2: I was trying to survive.

43
00:02:45,960 --> 00:02:49,600
Speaker 1: As a last resort, Brandy's doctors suggested she try something

44
00:02:49,639 --> 00:02:54,000
Speaker 1: called electroconvulsive therapy. It's a procedure where small amounts of

45
00:02:54,000 --> 00:02:58,080
Speaker 1: electricity are sent through the brain to relieve mental health symptoms.

46
00:02:58,600 --> 00:03:01,760
Speaker 1: It helped, but the relief only lasted for a couple

47
00:03:01,800 --> 00:03:05,440
Speaker 1: of weeks. Brandy's doctors said that in order to keep

48
00:03:05,480 --> 00:03:08,079
Speaker 1: her symptoms at bay, she'd have to keep coming back

49
00:03:08,120 --> 00:03:11,360
Speaker 1: in for treatment every month indefinitely.

50
00:03:11,480 --> 00:03:16,040
Speaker 2: And that was just something that I could not do forever.

51
00:03:16,600 --> 00:03:20,120
Speaker 2: So it did look like I was going to be

52
00:03:20,280 --> 00:03:24,200
Speaker 2: suffering like this for the rest of my life, and

53
00:03:24,280 --> 00:03:28,799
Speaker 2: I didn't know how long I could survive that kind

54
00:03:28,880 --> 00:03:29,840
Speaker 2: of life.

55
00:03:30,280 --> 00:03:33,680
Speaker 1: Brandy's doctors said there was nothing left to try, but

56
00:03:33,800 --> 00:03:37,240
Speaker 1: she wasn't ready to give up. She scoured the internet

57
00:03:37,320 --> 00:03:41,080
Speaker 1: for other treatment options and joined an online mental health

58
00:03:41,080 --> 00:03:44,760
Speaker 1: support group. She read about a clinical trial that seemed promising.

59
00:03:45,560 --> 00:03:49,320
Speaker 1: Researchers at Emory University were studying a type of treatment

60
00:03:49,440 --> 00:03:53,560
Speaker 1: called deep brain stimulation. She said she didn't have high

61
00:03:53,560 --> 00:03:55,800
Speaker 1: hopes that it could help her, but she thought it

62
00:03:55,880 --> 00:03:57,400
Speaker 1: might be worth doing anyway.

63
00:03:58,040 --> 00:03:59,480
Speaker 2: I felt like I was going to do the brain

64
00:03:59,520 --> 00:04:04,440
Speaker 2: surgery and then it would not work, and I would

65
00:04:04,480 --> 00:04:11,160
Speaker 2: still have this terminal depression and I would eventually die

66
00:04:11,520 --> 00:04:14,240
Speaker 2: rather than continue to suffer. But I thought that I

67
00:04:14,320 --> 00:04:18,200
Speaker 2: could be a data point that helped people who came

68
00:04:18,279 --> 00:04:18,760
Speaker 2: after me.

69
00:04:20,800 --> 00:04:23,919
Speaker 1: I'm Lauren Arora Hutchinson. I'm the director of the

70
00:04:24,000 --> 00:04:28,320
Speaker 1: Ideas Lab at the Johns Hopkins Berman Institute of Bioethics.

71
00:04:29,040 --> 00:04:33,200
Speaker 1: On today's show, we're talking about the revolutionary technology of

72
00:04:33,279 --> 00:04:39,240
Speaker 1: deep brain stimulation or DBS. DBS is a highly effective

73
00:04:39,279 --> 00:04:44,160
Speaker 1: treatment for neurological conditions in the rare instances that patients

74
00:04:44,200 --> 00:04:48,599
Speaker 1: aren't responding to other types of treatment. But computer based

75
00:04:48,640 --> 00:04:52,279
Speaker 1: implants can change more about the brain than the disease

76
00:04:52,320 --> 00:04:56,920
Speaker 1: they're meant to treat, which raises all kinds of ethical dilemmas.

77
00:04:58,760 --> 00:05:01,240
Speaker 1: So when it comes to the possibility of altering

78
00:05:01,279 --> 00:05:05,960
Speaker 1: our personalities by implanting electronics, where do we draw the line?

79
00:05:07,120 --> 00:05:10,400
Speaker 1: Are we giving computers too much control over who we

80
00:05:10,440 --> 00:05:13,640
Speaker 1: are when we allow them to alter fundamental human traits

81
00:05:13,720 --> 00:05:18,400
Speaker 1: like our emotions? And if it's okay to change our moods,

82
00:05:18,600 --> 00:05:24,920
Speaker 1: what about other things like our intelligence? From Pushkin Industries

83
00:05:24,960 --> 00:05:28,920
Speaker 1: and the Johns Hopkins Berman Institute of Bioethics, this is

84
00:05:28,960 --> 00:05:29,920
Speaker 1: Playing God.

85
00:05:36,040 --> 00:05:40,479
Speaker 3: This is a neurosurgical procedure that involves the implantation of

86
00:05:40,680 --> 00:05:46,039
Speaker 3: thin electrodes that are, I'd say, like the thickness of spaghetti.

87
00:05:47,120 --> 00:05:50,479
Speaker 1: This is Patricio Riva Posse. He was one of the

88
00:05:50,520 --> 00:05:55,000
Speaker 1: researchers conducting the study that Brandy came across. These days,

89
00:05:55,040 --> 00:05:58,320
Speaker 1: he's the director of the Treatment Resistant Depression Clinic at

90
00:05:58,360 --> 00:06:02,320
Speaker 1: Emory. He spends a lot of his time working on deep brain

91
00:06:02,440 --> 00:06:04,880
Speaker 1: stimulation and explaining how it works.

92
00:06:05,680 --> 00:06:09,440
Speaker 3: The electrode is implanted with a cannula that keeps it,

93
00:06:09,560 --> 00:06:11,800
Speaker 3: you know, kind of rigid until the point where you

94
00:06:12,200 --> 00:06:13,000
Speaker 3: hit the target.

95
00:06:13,600 --> 00:06:17,000
Speaker 1: The electrodes, he says, are powered by a device that's similar

96
00:06:17,040 --> 00:06:20,880
Speaker 1: to a pacemaker, which gets inserted under the patient's skin

97
00:06:21,120 --> 00:06:25,279
Speaker 1: near the collar bone. Before researchers brought DBS to people

98
00:06:25,320 --> 00:06:28,680
Speaker 1: with depression, they were already using it for conditions like

99
00:06:28,760 --> 00:06:29,880
Speaker 1: Parkinson's disease.

100
00:06:30,800 --> 00:06:36,000
Speaker 3: This idea came from that application into saying, well, what

101
00:06:36,040 --> 00:06:40,880
Speaker 3: if you can deliver small amounts of electricity to specific

102
00:06:40,920 --> 00:06:46,280
Speaker 3: areas of the brain to stimulate circuits that are related

103
00:06:46,480 --> 00:06:51,720
Speaker 3: to mood and emotions, toward the circuits that control those

104
00:06:51,760 --> 00:06:56,480
Speaker 3: behaviors and those thoughts. In Parkinson's disease, the usual targets

105
00:06:56,600 --> 00:07:00,240
Speaker 3: that are chosen are the ones that are involved in

106
00:07:00,600 --> 00:07:05,039
Speaker 3: treating the tremor or the rigidity, and in depression, the

107
00:07:05,120 --> 00:07:11,600
Speaker 3: circuits are quite different. Different areas were chosen to stimulate,

108
00:07:12,000 --> 00:07:15,840
Speaker 3: and it seems to be that patients with depression have

109
00:07:16,040 --> 00:07:21,040
Speaker 3: an increase in the activity in that area. And the

110
00:07:21,080 --> 00:07:23,560
Speaker 3: thought was, well, what if you implant an electrode there

111
00:07:23,600 --> 00:07:25,840
Speaker 3: and you deliver a lot of electricity there. What you

112
00:07:25,920 --> 00:07:29,400
Speaker 3: do is you put that area in an inhibited mode.

113
00:07:30,040 --> 00:07:31,080
Speaker 3: So you turn it down.

114
00:07:32,360 --> 00:07:35,240
Speaker 1: Patricio says that by the time Brandy came across his

115
00:07:35,320 --> 00:07:39,360
Speaker 1: clinical trial, there was already some published research showing that

116
00:07:39,440 --> 00:07:43,240
Speaker 1: this theory held promise. A small study showed that out

117
00:07:43,240 --> 00:07:47,559
Speaker 1: of six initial patients, DBS helped four of them. After

118
00:07:47,600 --> 00:07:50,360
Speaker 1: six months of treatment, they were no longer depressed.

119
00:07:51,360 --> 00:07:55,560
Speaker 3: The satisfying aspect of deep brain stimulation and other implantable

120
00:07:55,960 --> 00:07:59,280
Speaker 3: devices is that when patients get well, they stay well.

121
00:08:00,080 --> 00:08:03,320
Speaker 3: Continuous delivery of small amounts of electricity, you know,

122
00:08:03,920 --> 00:08:06,800
Speaker 3: as simple as it may seem, it seems to be

123
00:08:07,200 --> 00:08:10,320
Speaker 3: a treatment that is sustained over time. That, I think,

124
00:08:10,560 --> 00:08:16,240
Speaker 3: is what is so hopeful about these implantable neuromodulation techniques.

125
00:08:17,120 --> 00:08:20,280
Speaker 1: He says the procedure comes with risks too, like brain

126
00:08:20,280 --> 00:08:24,200
Speaker 1: hemorrhage or infection. It is brain surgery after all.

127
00:08:25,840 --> 00:08:29,000
Speaker 3: And then you have negative outcomes of what happens if

128
00:08:29,240 --> 00:08:34,160
Speaker 3: treatment doesn't work, right? These are patients that are very,

129
00:08:34,280 --> 00:08:39,680
Speaker 3: very ill with depression for years. They have a much

130
00:08:39,720 --> 00:08:46,120
Speaker 3: higher risk of suicide. And unfortunately, across the trials there

131
00:08:46,160 --> 00:08:49,400
Speaker 3: have been patients who have died of suicide. I have

132
00:08:49,520 --> 00:08:55,800
Speaker 3: not seen patients who, because of the DBS, attempted suicide.

133
00:08:56,120 --> 00:08:59,640
Speaker 3: I think that we attribute the suicide, of course, as

134
00:08:59,679 --> 00:09:04,720
Speaker 3: the ultimate negative outcome, but as the failure of the treatment,

135
00:09:05,200 --> 00:09:07,760
Speaker 3: of treating the illness that we want to treat. Right?

136
00:09:08,000 --> 00:09:10,920
Speaker 3: As if you were having a clinical trial that treats

137
00:09:11,000 --> 00:09:15,160
Speaker 3: advanced cancer, if patients die of that cancer, you don't

138
00:09:15,160 --> 00:09:16,800
Speaker 3: say that it's because of the treatment.

139
00:09:17,520 --> 00:09:20,080
Speaker 1: All these things were explained to Brandy when she first

140
00:09:20,080 --> 00:09:23,640
Speaker 1: contacted Patricio and the other researchers back in twenty eleven,

141
00:09:24,440 --> 00:09:27,160
Speaker 1: and none of it scared her off. In fact, almost

142
00:09:27,160 --> 00:09:27,959
Speaker 1: the opposite.

143
00:09:28,600 --> 00:09:32,480
Speaker 2: It sounds terrible, but I felt like the people who

144
00:09:33,000 --> 00:09:36,480
Speaker 2: thought that I wasn't trying hard enough to get well

145
00:09:37,679 --> 00:09:40,760
Speaker 2: would change their mind if they knew that I had

146
00:09:40,840 --> 00:09:43,800
Speaker 2: had brain surgery trying to get well.

147
00:09:44,960 --> 00:09:48,800
Speaker 1: Not everyone was convinced. To some of her family members,

148
00:09:49,080 --> 00:09:52,960
Speaker 1: DBS didn't sound like medicine. It sounded like a shortcut,

149
00:09:53,240 --> 00:09:56,400
Speaker 1: just turning her emotions over to the control of a computer.

150
00:09:57,520 --> 00:09:58,360
Speaker 1: After a few

151
00:09:58,160 --> 00:10:02,040
Speaker 1: appointments to determine whether she was a good candidate, Brandy

152
00:10:02,120 --> 00:10:05,800
Speaker 1: was accepted to the clinical trial for DBS. She moved

153
00:10:05,800 --> 00:10:08,240
Speaker 1: to Atlanta, where she'd need to live for at least

154
00:10:08,280 --> 00:10:08,680
Speaker 1: a year.

155
00:10:10,600 --> 00:10:15,640
Speaker 2: The surgery itself was exhausting. It was very long. So

156
00:10:16,360 --> 00:10:19,839
Speaker 2: it starts way before dawn, and you go in and

157
00:10:20,440 --> 00:10:24,520
Speaker 2: they give you a nice IV to keep you calm,

158
00:10:24,840 --> 00:10:29,520
Speaker 2: and they start drilling a frame into your skull.

159
00:10:29,640 --> 00:10:33,120
Speaker 1: The frame is temporary. It's just to stabilize the patient's

160
00:10:33,120 --> 00:10:36,400
Speaker 1: head during the operation, because for the first part of

161
00:10:36,440 --> 00:10:38,960
Speaker 1: the surgery Brandy would be wide awake.

162
00:10:41,440 --> 00:10:44,600
Speaker 2: Then you go into the operating room and they bolt

163
00:10:44,679 --> 00:10:47,720
Speaker 2: that frame to the operating table, and this way you

164
00:10:48,040 --> 00:10:52,880
Speaker 2: absolutely cannot move your brain, your head, any part of

165
00:10:52,920 --> 00:10:54,240
Speaker 2: your upper body.

166
00:10:55,160 --> 00:10:59,360
Speaker 1: The operating room was crowded with neurosurgeons and the research team.

167
00:11:00,000 --> 00:11:03,600
Speaker 1: Surgeons started stimulating different points in the target region of

168
00:11:03,640 --> 00:11:07,480
Speaker 1: Brandy's brain, trying to pinpoint where to place her implant.

169
00:11:08,120 --> 00:11:10,560
Speaker 1: Each time, they asked her if she felt anything, and

170
00:11:10,640 --> 00:11:12,239
Speaker 1: at one point she did.

171
00:11:13,520 --> 00:11:16,800
Speaker 2: It felt like gravity decreased a little bit, like

172
00:11:16,840 --> 00:11:20,120
Speaker 2: my mass was less, you know.

173
00:11:20,880 --> 00:11:24,160
Speaker 2: I felt lighter, like more air came into my body.

174
00:11:25,080 --> 00:11:28,840
Speaker 1: The surgeons put Brandy under anesthesia, placed the implant in

175
00:11:28,880 --> 00:11:32,760
Speaker 1: that spot, and sewed her back up. The surgeons gave

176
00:11:32,800 --> 00:11:35,480
Speaker 1: Brandy's body a few weeks to heal before turning on

177
00:11:35,520 --> 00:11:39,000
Speaker 1: her device. By chance, that date happened to be the

178
00:11:39,040 --> 00:11:41,199
Speaker 1: eleventh of November twenty eleven.

179
00:11:45,840 --> 00:11:49,600
Speaker 2: I thought eleven eleven eleven was an amazing day to

180
00:11:49,640 --> 00:11:50,599
Speaker 2: become a cyborg.

181
00:11:51,559 --> 00:11:54,560
Speaker 1: Cyborg, of course, is a science fiction term, not a

182
00:11:54,559 --> 00:11:57,960
Speaker 1: medical one, but she likes the way it sounds. When

183
00:11:58,000 --> 00:12:01,600
Speaker 1: the implant was initially turned on, Brandy didn't notice much

184
00:12:01,600 --> 00:12:04,800
Speaker 1: of a change, but about two months later, she had

185
00:12:04,880 --> 00:12:07,160
Speaker 1: what she calls her first good day.

186
00:12:07,880 --> 00:12:11,400
Speaker 2: I was still very very sick, still very very depressed,

187
00:12:12,240 --> 00:12:15,680
Speaker 2: but I could get out of bed and brush my

188
00:12:15,760 --> 00:12:21,920
Speaker 2: teeth and maybe leave the house, and I didn't hate

189
00:12:22,000 --> 00:12:22,800
Speaker 2: my existence.

190
00:12:23,760 --> 00:12:28,640
Speaker 1: Slowly, Brandy started having more good days. Six months after

191
00:12:28,679 --> 00:12:33,000
Speaker 1: her device was switched on, she felt significantly better. By

192
00:12:33,080 --> 00:12:35,840
Speaker 1: the time she'd had it on for eleven months, she

193
00:12:35,960 --> 00:12:39,480
Speaker 1: no longer met the diagnostic criteria for clinical depression.

194
00:12:40,600 --> 00:12:45,520
Speaker 2: I did my best to be very mindful about adding

195
00:12:45,640 --> 00:12:50,040
Speaker 2: one thing back at a time, you know, getting a relationship.

196
00:12:50,440 --> 00:12:53,480
Speaker 2: I met my partner then, and I started to be

197
00:12:53,600 --> 00:12:59,440
Speaker 2: able to support myself and go on trips with friends

198
00:12:59,679 --> 00:13:01,760
Speaker 2: and build a life again.

199
00:13:03,000 --> 00:13:07,320
Speaker 1: Brandy's surgery was twelve years ago now, and she's still thriving.

200
00:13:07,920 --> 00:13:08,640
Speaker 1: In that time,

201
00:13:09,240 --> 00:13:13,600
Speaker 1: she's only felt depressed once. Out of nowhere, about two

202
00:13:13,679 --> 00:13:17,720
Speaker 1: years after the surgery, she'd started feeling weepy and exhausted,

203
00:13:18,080 --> 00:13:21,400
Speaker 1: so she went to see her research team. They discovered

204
00:13:21,400 --> 00:13:25,080
Speaker 1: that her device had had a minor electronic glitch and

205
00:13:25,160 --> 00:13:26,600
Speaker 1: it had turned itself off.

206
00:13:29,720 --> 00:13:33,600
Speaker 2: And then realizing that that was why, that was when the light switched.

207
00:13:33,960 --> 00:13:38,760
Speaker 2: You know, that was when I realized that this absolutely

208
00:13:39,400 --> 00:13:44,760
Speaker 2: was what was responsible for my recovery, this implant.

209
00:13:45,640 --> 00:13:48,960
Speaker 2: This entire bonus life that I have, this sort of

210
00:13:49,080 --> 00:13:52,720
Speaker 2: extra life from a video game that I got from

211
00:13:53,000 --> 00:13:56,120
Speaker 2: this device is the only reason that I am still

212
00:13:56,120 --> 00:14:01,800
Speaker 2: alive today.

213
00:14:02,640 --> 00:14:05,839
Speaker 1: After hearing Brandy's story, I wanted to know more about

214
00:14:05,840 --> 00:14:10,760
Speaker 1: the ethical issues surrounding DBS, like how valid are the

215
00:14:10,800 --> 00:14:14,640
Speaker 1: concerns that DBS outsources too much of a person's self

216
00:14:14,760 --> 00:14:18,640
Speaker 1: to a computer. So I talked to someone who's thinking

217
00:14:18,679 --> 00:14:22,440
Speaker 1: critically about the current and future applications of this technology

218
00:14:22,880 --> 00:14:27,000
Speaker 1: and the thorny ethical questions that surround it. That's after

219
00:14:27,040 --> 00:14:41,680
Speaker 1: the break. Karen Rommelfanger is a neuroscientist and ethics scholar.

220
00:14:42,240 --> 00:14:46,320
Speaker 1: She's the founding director of the Institute of Neuroethics. She

221
00:14:46,480 --> 00:14:50,720
Speaker 1: explores the potential trajectories of new technologies like DBS and

222
00:14:50,800 --> 00:14:55,280
Speaker 1: the various ethical, medical, and legal ramifications.

223
00:14:55,120 --> 00:14:59,160
Speaker 4: On a day-to-day basis, I'm exploring the ethical, legal,

224
00:14:59,200 --> 00:15:03,000
Speaker 4: and social implications of new neuroscience. So I am thinking

225
00:15:03,040 --> 00:15:07,280
Speaker 4: about and trying to systematically address questions about how neuroscience

226
00:15:07,360 --> 00:15:11,320
Speaker 4: might challenge our notions of identity, the kinds of world

227
00:15:11,600 --> 00:15:14,680
Speaker 4: we want to live in, our day-to-day life

228
00:15:15,160 --> 00:15:15,800
Speaker 4: as we know it.

229
00:15:16,560 --> 00:15:19,680
Speaker 1: Karen also happens to know Brandy Ellis, who we heard

230
00:15:19,720 --> 00:15:22,920
Speaker 1: from earlier. They both get a lot of invites to

231
00:15:22,960 --> 00:15:27,560
Speaker 1: speak to medical students and bioethicists about DBS and sometimes

232
00:15:27,680 --> 00:15:30,680
Speaker 1: end up on the same panel. I called Karen to

233
00:15:30,680 --> 00:15:33,840
Speaker 1: get a better understanding of where the technology is today,

234
00:15:34,120 --> 00:15:36,960
Speaker 1: where it's going, and what the ethical concerns are.

235
00:15:38,040 --> 00:15:43,240
Speaker 4: While there are remarkable clinical effects tied to deep brain stimulation,

236
00:15:43,840 --> 00:15:48,360
Speaker 4: there were some puzzling ethical tensions that came up around

237
00:15:48,640 --> 00:15:52,480
Speaker 4: deep brain stimulation, and that was for Parkinson's patients. There

238
00:15:52,560 --> 00:15:55,040
Speaker 4: were a small number and still are a small number

239
00:15:55,080 --> 00:16:01,000
Speaker 4: of patients who experienced identity changes, or

240
00:16:01,120 --> 00:16:05,240
Speaker 4: perceived changes in personality and behaviors that they tied to

241
00:16:05,280 --> 00:16:10,280
Speaker 4: the technology. So there were questions about who was actually

242
00:16:10,440 --> 00:16:12,400
Speaker 4: the narrator of their life at that point. Was it

243
00:16:12,400 --> 00:16:14,640
Speaker 4: the technology or was it the person? Was it some

244
00:16:14,760 --> 00:16:18,960
Speaker 4: kind of combination? And the goal of the intervention at

245
00:16:18,960 --> 00:16:23,560
Speaker 4: that time was to alleviate movement problems, and now we

246
00:16:23,600 --> 00:16:28,000
Speaker 4: also are exploring brain technologies for intractable depression, mood disorders

247
00:16:28,040 --> 00:16:30,840
Speaker 4: that can't be treated by anything else. In this case,

248
00:16:31,640 --> 00:16:35,440
Speaker 4: some of those fundamental features, one might argue those

249
00:16:35,480 --> 00:16:37,960
Speaker 4: are personality features, or ways that someone had learned to

250
00:16:37,960 --> 00:16:41,240
Speaker 4: interact with the world for a long time with terrible

251
00:16:41,600 --> 00:16:44,960
Speaker 4: suffering with depression, that maybe now we're actually using the

252
00:16:44,960 --> 00:16:47,720
Speaker 4: deep brain stimulation to change those fundamental features of that

253
00:16:47,760 --> 00:16:50,160
Speaker 4: person's life. So it's not a side effect; it's actually

254
00:16:50,600 --> 00:16:54,160
Speaker 4: part of the treatment. So how do you manage some

255
00:16:54,280 --> 00:16:58,000
Speaker 4: of the tensions in that space around ensuring that the

256
00:16:58,000 --> 00:16:59,880
Speaker 4: person who is getting the treatment is aware

257
00:17:00,040 --> 00:17:02,440
Speaker 4: of the types of experiences they might have

258
00:17:02,480 --> 00:17:06,840
Speaker 4: and that their relationship with that

259
00:17:06,920 --> 00:17:08,600
Speaker 4: technology might change over time.

260
00:17:09,119 --> 00:17:12,440
Speaker 1: So would you say that deep brain stimulation could change

261
00:17:12,440 --> 00:17:13,000
Speaker 1: who we are?

262
00:17:15,440 --> 00:17:21,240
Speaker 4: It's a question that I explore. So I believe that

263
00:17:21,760 --> 00:17:25,880
Speaker 4: all technologies that we create are really a social mirror

264
00:17:26,080 --> 00:17:29,520
Speaker 4: for our fears or aspirations for the kind of world

265
00:17:29,560 --> 00:17:32,679
Speaker 4: we want to live in. So in that way, you

266
00:17:32,760 --> 00:17:36,639
Speaker 4: might ask if these kinds of technologies are tools for

267
00:17:36,800 --> 00:17:41,800
Speaker 4: us to better become ourselves, are they collaborators for becoming ourselves?

268
00:17:41,880 --> 00:17:44,800
Speaker 4: Or in some cases, are some of these tools identities

269
00:17:44,800 --> 00:17:47,320
Speaker 4: in their own right? So these are the types of

270
00:17:47,400 --> 00:17:49,040
Speaker 4: questions I'm exploring.

271
00:17:49,240 --> 00:17:53,280
Speaker 1: And you're involved in the ethical and legal approaches to

272
00:17:53,320 --> 00:17:57,200
Speaker 1: these technologies. So who gets to decide, like, even if

273
00:17:57,240 --> 00:18:01,120
Speaker 1: it matters whether they change who we are, who gets

274
00:18:01,119 --> 00:18:01,639
Speaker 1: a say in that?

275
00:18:02,920 --> 00:18:06,600
Speaker 4: Thus far, the conversation hasn't really involved a lot of

276
00:18:06,600 --> 00:18:11,120
Speaker 4: the lived experience of the people who are actually using these technologies,

277
00:18:11,200 --> 00:18:13,480
Speaker 4: or might use them in the future, and in a way,

278
00:18:14,400 --> 00:18:17,119
Speaker 4: we are all patients in waiting who may one day

279
00:18:17,200 --> 00:18:21,399
Speaker 4: need these technologies for a variety of reasons. So a

280
00:18:21,400 --> 00:18:25,160
Speaker 4: lot of these ethics conversations, and rightly so, they might

281
00:18:25,240 --> 00:18:30,800
Speaker 4: emerge from scholars in the university. They're people who have

282
00:18:31,359 --> 00:18:34,640
Speaker 4: close interactions with the research and the evolutions of it,

283
00:18:35,240 --> 00:18:40,040
Speaker 4: and we start to see clinical researchers in those conversations

284
00:18:40,080 --> 00:18:42,120
Speaker 4: as well, and then you start to see them kind

285
00:18:42,119 --> 00:18:45,639
Speaker 4: of enter into the policy maker space. But there hasn't

286
00:18:45,640 --> 00:18:48,000
Speaker 4: been a lot of room for people like Brandy Ellis,

287
00:18:48,200 --> 00:18:51,520
Speaker 4: and that's a problem, and we don't have a good

288
00:18:51,520 --> 00:18:54,320
Speaker 4: systematic approach for that, and so I think it's important

289
00:18:54,359 --> 00:18:58,359
Speaker 4: to have formal platforms to give patients and those who

290
00:18:58,440 --> 00:19:01,640
Speaker 4: have lived experience with the devices a voice. But right

291
00:19:01,680 --> 00:19:06,040
Speaker 4: now it's largely dictated by the people doing the research

292
00:19:06,080 --> 00:19:07,520
Speaker 4: and the people who fund that research.

293
00:19:10,119 --> 00:19:12,960
Speaker 1: Could you talk a bit about health equity and who

294
00:19:13,040 --> 00:19:15,280
Speaker 1: so far gets access to these treatments?

295
00:19:15,760 --> 00:19:22,000
Speaker 4: Deep brain stimulation requires a lot of expertise and specialized materials.

296
00:19:22,280 --> 00:19:25,800
Speaker 4: You're not seeing this readily available and accessible most places.

297
00:19:25,840 --> 00:19:28,359
Speaker 4: Even in the US, I wouldn't say it's easy to

298
00:19:28,400 --> 00:19:29,800
Speaker 4: get it. I mean, I think you'd need to go

299
00:19:29,840 --> 00:19:33,159
Speaker 4: to a specialty center to get it, for example, and

300
00:19:33,200 --> 00:19:36,480
Speaker 4: then places like low- and middle-

301
00:19:36,520 --> 00:19:39,400
Speaker 4: income countries might not have access to expertise or materials.

302
00:19:39,720 --> 00:19:43,480
Speaker 1: And so what does consent mean in this context with

303
00:19:43,560 --> 00:19:45,439
Speaker 1: this kind of technology?

304
00:19:46,200 --> 00:19:50,400
Speaker 4: That's a good question, and it's something that ethicists think

305
00:19:50,440 --> 00:19:55,200
Speaker 4: a lot about, especially with brain disorders and where cognitive

306
00:19:55,240 --> 00:20:03,679
Speaker 4: capacity might be different due to disease or disability. So

307
00:20:03,720 --> 00:20:06,879
Speaker 4: it's important for us to note that consent is but

308
00:20:07,040 --> 00:20:14,919
Speaker 4: one instrument of ensuring a patient's dignity, their agency in

309
00:20:15,000 --> 00:20:20,199
Speaker 4: their care, and their right to health. So consent is

310
00:20:20,240 --> 00:20:25,280
Speaker 4: not a perfect tool, and we should recognize that consent

311
00:20:25,359 --> 00:20:28,560
Speaker 4: is also not a moment. It's an ongoing exercise;

312
00:20:28,640 --> 00:20:30,480
Speaker 4: it shouldn't end when you sign a

313
00:20:30,480 --> 00:20:33,080
Speaker 4: piece of paper. There should be an ongoing dialogue between

314
00:20:33,119 --> 00:20:38,000
Speaker 4: the researcher and the participant in the study. I've actually

315
00:20:38,280 --> 00:20:41,240
Speaker 4: listened to other patients talk about the way that they

316
00:20:41,280 --> 00:20:45,359
Speaker 4: were evaluating the kind of risk, which was just, you know,

317
00:20:45,400 --> 00:20:48,680
Speaker 4: I already feel like I've got nothing left to live

318
00:20:48,720 --> 00:20:51,359
Speaker 4: for and if this can actually help me want to

319
00:20:51,359 --> 00:20:54,880
Speaker 4: live again, then maybe, but maybe I don't even care

320
00:20:55,320 --> 00:21:00,439
Speaker 4: at that point. So the task is to not have

321
00:21:00,560 --> 00:21:03,480
Speaker 4: the patient decide on their own. So they should also

322
00:21:03,520 --> 00:21:06,040
Speaker 4: have a family member involved who can help them deliberate.

323
00:21:06,080 --> 00:21:08,840
Speaker 4: So there should be someone else present to help. In

324
00:21:08,920 --> 00:21:12,520
Speaker 4: the case of some of these studies, a family member is

325
00:21:12,560 --> 00:21:15,359
Speaker 4: required to also sign onto the study and be involved,

326
00:21:15,560 --> 00:21:18,959
Speaker 4: because it's very involved for these participants. They

327
00:21:19,000 --> 00:21:21,480
Speaker 4: have to come back for many visits; it's time consuming.

328
00:21:23,440 --> 00:21:26,520
Speaker 4: They need another perspective to also track day to day

329
00:21:27,440 --> 00:21:28,800
Speaker 4: how that patient is progressing.

330
00:21:29,520 --> 00:21:33,680
Speaker 1: And I'm curious. So we're talking about consent to the procedure,

331
00:21:33,720 --> 00:21:35,960
Speaker 1: but what about consent when it comes to turning an

332
00:21:36,000 --> 00:21:36,840
Speaker 1: implant off?

333
00:21:37,720 --> 00:21:42,200
Speaker 4: Yeah, there was a case where an individual had a

334
00:21:42,240 --> 00:21:46,400
Speaker 4: deep brain stimulator put in for Parkinson's disease, and

335
00:21:46,640 --> 00:21:49,719
Speaker 4: they were one of the few cases that developed adverse

336
00:21:49,760 --> 00:21:54,840
Speaker 4: effects of mania, and while the stimulator was on, they

337
00:21:54,880 --> 00:22:02,720
Speaker 4: ended up gambling and ruining their marriage. And the clinical team

338
00:22:02,760 --> 00:22:06,600
Speaker 4: had to decide, you know, what should this person do? The patient

339
00:22:06,640 --> 00:22:09,000
Speaker 4: has two choices: this patient can keep the

340
00:22:09,160 --> 00:22:14,600
Speaker 4: stimulator on. Their motor symptoms are fairly resolved, but they're

341
00:22:15,560 --> 00:22:17,760
Speaker 4: living in such a way that basically they should be

342
00:22:17,840 --> 00:22:25,359
Speaker 4: institutionalized because of their reckless behavior related to the mania.

343
00:22:26,440 --> 00:22:28,679
Speaker 4: But if they had the device turned off, then the

344
00:22:28,760 --> 00:22:33,159
Speaker 4: patient would be confined to a bed because they wouldn't

345
00:22:33,200 --> 00:22:36,880
Speaker 4: be able to move around. So what should we do here?

346
00:22:37,520 --> 00:22:39,679
Speaker 4: But the first step was the team needed to decide

347
00:22:39,760 --> 00:22:42,480
Speaker 4: were they going to ask the patient if he wanted

348
00:22:42,480 --> 00:22:45,600
Speaker 4: the stimulator on or off when he was on or

349
00:22:45,640 --> 00:22:50,520
Speaker 4: when he was off. So it was a very tricky

350
00:22:50,840 --> 00:22:55,040
Speaker 4: situation and they ended up asking the patient while he

351
00:22:55,200 --> 00:22:58,840
Speaker 4: was off, thinking that this was his more authentic state

352
00:22:59,080 --> 00:23:02,000
Speaker 4: of who he was. And when they asked the patient

353
00:23:02,040 --> 00:23:04,199
Speaker 4: what he wanted to do with the stimulator off, he

354
00:23:04,240 --> 00:23:07,480
Speaker 4: said he'd rather have it on, so he wanted it

355
00:23:07,600 --> 00:23:07,919
Speaker 4: left on.

356
00:23:08,480 --> 00:23:11,760
Speaker 1: And how common is it that deep brain stimulation has

357
00:23:11,840 --> 00:23:14,920
Speaker 1: led to negative consequences in people's behavior?

358
00:23:15,960 --> 00:23:21,399
Speaker 4: It's actually not that common. So there are a handful

359
00:23:21,440 --> 00:23:25,119
Speaker 4: of cases that have been documented that ethicists and scholars

360
00:23:25,160 --> 00:23:28,560
Speaker 4: have really focused on and written tons of papers about. In fact,

361
00:23:28,560 --> 00:23:31,680
Speaker 4: new ethicists cut their teeth on this type of case study,

362
00:23:32,400 --> 00:23:36,399
Speaker 4: but in reality, there's not that many cases. But still

363
00:23:37,080 --> 00:23:40,240
Speaker 4: it's worth paying attention to because that's still one person's

364
00:23:40,240 --> 00:23:42,200
Speaker 4: life that's dramatically changed in a way they didn't want,

365
00:23:42,200 --> 00:23:43,920
Speaker 4: and you don't want that to happen again.

366
00:23:44,280 --> 00:23:49,359
Speaker 1: Yeah, for sure. So I'm just curious with these devices,

367
00:23:49,680 --> 00:23:52,440
Speaker 1: when would you say that they cross over from being

368
00:23:52,520 --> 00:23:55,639
Speaker 1: therapeutic to actually providing some kind of enhancement?

369
00:23:56,200 --> 00:23:59,920
Speaker 4: The therapy enhancement line has always been a blurry one.

370
00:24:00,760 --> 00:24:04,600
Speaker 4: This really does tie to the kind of knee-jerk

371
00:24:04,680 --> 00:24:08,720
Speaker 4: reaction to the notion of who has the right

372
00:24:09,040 --> 00:24:13,200
Speaker 4: to change the human condition and society. Who has the

373
00:24:13,280 --> 00:24:16,720
Speaker 4: right to play God? This is the name of the series.

374
00:24:17,320 --> 00:24:20,760
Speaker 4: Sometimes it's not the playing God part that people are

375
00:24:20,760 --> 00:24:25,600
Speaker 4: worried about, it's the playing part. So are you creating

376
00:24:25,680 --> 00:24:29,320
Speaker 4: new circumstances that are irresponsible? Are you allowing humans to

377
00:24:29,359 --> 00:24:32,359
Speaker 4: go beyond their swim lanes when they shouldn't? Are you

378
00:24:34,200 --> 00:24:41,000
Speaker 4: overriding what is given? But we also know that many

379
00:24:41,040 --> 00:24:44,000
Speaker 4: people are born with certain disabilities, and in those cases

380
00:24:44,000 --> 00:24:46,399
Speaker 4: we don't typically say it's playing God to try to

381
00:24:46,480 --> 00:24:50,360
Speaker 4: cure them. So in that case, it's not just playing God,

382
00:24:50,359 --> 00:24:52,119
Speaker 4: but how do we play God in the right way?

383
00:24:52,520 --> 00:24:55,080
Speaker 1: Well, thank you so much, Karen. This has been really

384
00:24:55,720 --> 00:24:59,159
Speaker 1: really fascinating to hear all about the important work that

385
00:24:59,200 --> 00:25:04,120
Speaker 1: you're doing today. Brandy Ellis is grateful that she went

386
00:25:04,200 --> 00:25:07,520
Speaker 1: through with the DBS implant. Now that she's back to

387
00:25:07,560 --> 00:25:11,439
Speaker 1: feeling like herself, she seeks out opportunities to speak with

388
00:25:11,520 --> 00:25:15,359
Speaker 1: physicians and researchers to help them understand what that dark

389
00:25:15,440 --> 00:25:18,560
Speaker 1: period of depression and suicidal thinking was like for her.

390
00:25:19,680 --> 00:25:23,000
Speaker 1: She feels that implant essentially saved her life and wants

391
00:25:23,000 --> 00:25:25,160
Speaker 1: them to know how much it helped her.

392
00:25:26,680 --> 00:25:31,920
Speaker 2: I want to make it clear that the DBS didn't

393
00:25:32,760 --> 00:25:35,840
Speaker 2: change me. I am not a different person because of

394
00:25:35,840 --> 00:25:43,280
Speaker 2: this implant. Depression changed me. Those years of suffering the

395
00:25:43,359 --> 00:25:47,760
Speaker 2: depression altered my personality, every aspect of my life. The

396
00:25:47,840 --> 00:25:51,920
Speaker 2: DBS did not change me. It restored me.

397
00:25:57,240 --> 00:26:01,560
Speaker 1: Next time on Playing God: an emerging category of drugs can

398
00:26:01,640 --> 00:26:08,200
Speaker 1: cure debilitating and even fatal diseases, diseases that were previously untreatable.

399
00:26:09,040 --> 00:26:13,000
Speaker 1: But often these so called miracle drugs can cost a fortune,

400
00:26:13,640 --> 00:26:16,480
Speaker 1: as one mother learned when her child was diagnosed with

401
00:26:16,560 --> 00:26:17,600
Speaker 1: a fatal disease.

402
00:26:19,440 --> 00:26:21,480
Speaker 2: We were talking about him living.

403
00:26:21,920 --> 00:26:26,200
Speaker 4: I mean, you'll pay anything. Like, I would say that quickly, like.

404
00:26:26,240 --> 00:26:27,720
Speaker 2: Absolutely, I would pay anything.

405
00:26:27,880 --> 00:26:30,280
Speaker 4: But then how can I pay anything? Like?

406
00:26:31,480 --> 00:26:34,320
Speaker 2: How do I pay one hundred and twenty five thousand

407
00:26:34,040 --> 00:26:37,800
Speaker 2: dollars a dose? Just get that out three times a year.

408
00:26:38,040 --> 00:26:39,000
Speaker 2: That's impossible.

409
00:26:40,320 --> 00:26:44,919
Speaker 1: Join us next week for more Playing God. Thank you

410
00:26:44,960 --> 00:26:49,760
Speaker 1: to our guests Brandy Ellis, Karen Rommelfanger, and Patricio Riva

411
00:26:49,960 --> 00:26:54,320
Speaker 1: Posse. Playing God is a co-production of Pushkin Industries

412
00:26:54,440 --> 00:26:59,520
Speaker 1: and the Johns Hopkins Berman Institute of Bioethics. Emily Bourne

413
00:26:59,600 --> 00:27:03,719
Speaker 1: is our producer. This episode was also produced by Sophie

414
00:27:03,760 --> 00:27:07,919
Speaker 1: Crane and Lucy Sullivan. Our editors are Karen Chakerjee and

415
00:27:08,040 --> 00:27:13,200
Speaker 1: Kate Parkinson-Morgan. Mixing by Samir Sengupta. Theme music by

416
00:27:13,200 --> 00:27:18,480
Speaker 1: Echo Mountain. Engineering support from Sarah Bruguer and Amanda Kaiwang.

417
00:27:19,520 --> 00:27:23,840
Speaker 1: Show art by Sean Carney. Fact checking by David Jar

418
00:27:24,200 --> 00:27:29,360
Speaker 1: and Arthur Gompertz. Our executive producer is Justine Lang. At

419
00:27:29,400 --> 00:27:33,399
Speaker 1: the Johns Hopkins Berman Institute of Bioethics, our executive producers

420
00:27:33,480 --> 00:27:37,440
Speaker 1: are Jeffrey Kahn and Anna Mastroianni, working with Amelia Hood.

421
00:27:37,960 --> 00:27:42,720
Speaker 1: Funding provided by the Greenwall Foundation. I'm Lauren Arora Hutchinson.

422
00:27:42,880 --> 00:27:52,320
Speaker 1: Come back next week for more Playing God. If you

423
00:27:52,400 --> 00:27:55,119
Speaker 1: have enjoyed hearing about these stories and want to know

424
00:27:55,240 --> 00:27:58,720
Speaker 1: more about the history of bioethics, we have been creating

425
00:27:58,760 --> 00:28:01,840
Speaker 1: something very special for you. We have an oral history

426
00:28:01,880 --> 00:28:05,600
Speaker 1: collection with the founding figures of modern bioethics in America.

427
00:28:06,280 --> 00:28:09,680
Speaker 1: The collection is called Moral Histories, and in it you'll

428
00:28:09,720 --> 00:28:12,239
Speaker 1: hear from people who are in the room as some

429
00:28:12,280 --> 00:28:15,199
Speaker 1: of the most significant decisions were made about how to

430
00:28:15,280 --> 00:28:19,240
Speaker 1: manage new technological developments in science and medicine. Go to

431
00:28:19,320 --> 00:28:23,840
Speaker 1: Bioethics dot Jhu dot edu forward slash Moral Histories to

432
00:28:23,920 --> 00:28:24,480
Speaker 1: learn more.