WEBVTT - I Would've Let You Die, Too

0:00:03.160 --> 0:00:05.280
<v Speaker 1>You know, it was nine years ago when somebody laid

0:00:05.320 --> 0:00:07.920
<v Speaker 1>this all out for me and said, this is what's

0:00:07.960 --> 0:00:12.320
<v Speaker 1>going to happen. The pain, the trauma, the hell you're

0:00:12.320 --> 0:00:15.120
<v Speaker 1>going to put your family through. Oh and by the way,

0:00:15.320 --> 0:00:18.360
<v Speaker 1>you know, you're going to look like this and people

0:00:18.400 --> 0:00:21.360
<v Speaker 1>will stare at you and people will be like, what

0:00:21.400 --> 0:00:25.080
<v Speaker 1>the hell is going on with that person? I would

0:00:25.079 --> 0:00:28.400
<v Speaker 1>have said, oh don't no, no, no, don't let me wake

0:00:28.520 --> 0:00:30.840
<v Speaker 1>up to this. Oh no, no, no, no, there's no way.

0:00:32.640 --> 0:00:37.320
<v Speaker 2>Almost a decade ago, Andrea Rubin was in a horrible accident.

0:00:37.920 --> 0:00:41.239
<v Speaker 2>Her doctors thought they could probably save her life, but

0:00:41.400 --> 0:00:44.400
<v Speaker 2>for two months, Andrea wasn't able to tell them if

0:00:44.479 --> 0:00:48.559
<v Speaker 2>that's what she wanted. She was in a coma and

0:00:48.600 --> 0:00:52.560
<v Speaker 2>then heavily sedated for several weeks. So other people stepped

0:00:52.600 --> 0:00:56.640
<v Speaker 2>in to make life changing decisions for her, and it

0:00:56.760 --> 0:00:57.480
<v Speaker 2>got messy.

0:00:58.240 --> 0:01:02.880
<v Speaker 1>Everybody's interests were on my behalf, but they were completely different.

0:01:03.840 --> 0:01:07.520
<v Speaker 2>What happened to Andrea was one of those rare freak accidents.

0:01:08.160 --> 0:01:11.160
<v Speaker 2>Her car caught fire while she was trapped inside. A

0:01:11.240 --> 0:01:14.959
<v Speaker 2>crew of firefighters rescued her, but by then the majority

0:01:14.959 --> 0:01:18.520
<v Speaker 2>of her body was covered with pretty severe burns, the

0:01:18.600 --> 0:01:21.160
<v Speaker 2>kind that go deeper than the skin and damage your

0:01:21.240 --> 0:01:26.080
<v Speaker 2>muscles and bones. Paramedics airlifted her to a nearby hospital

0:01:26.200 --> 0:01:30.559
<v Speaker 2>in Cleveland, Ohio, where she lived. The doctors weren't clear

0:01:30.600 --> 0:01:33.160
<v Speaker 2>at first if they'd be able to save her. Her

0:01:33.160 --> 0:01:37.080
<v Speaker 2>burns were that bad. But one thing was immediately clear.

0:01:37.440 --> 0:01:40.600
<v Speaker 2>Her face was so swollen that it was blocking her airway.

0:01:41.600 --> 0:01:45.120
<v Speaker 2>To give Andrea even a chance of surviving, doctors wanted

0:01:45.160 --> 0:01:47.800
<v Speaker 2>to put her on a ventilator, a machine that helps

0:01:47.800 --> 0:01:51.680
<v Speaker 2>you breathe. The ventilator would give the forty nine year

0:01:51.680 --> 0:01:55.240
<v Speaker 2>old Andrea a shot at staying alive, but when she

0:01:55.360 --> 0:01:58.440
<v Speaker 2>woke up, her life would look pretty different.

0:01:59.400 --> 0:02:02.360
<v Speaker 3>She suffered burns over a significant portion of her face,

0:02:02.880 --> 0:02:05.920
<v Speaker 3>and so she was going to lose her nose and

0:02:05.960 --> 0:02:06.600
<v Speaker 3>her ears.

0:02:07.240 --> 0:02:10.160
<v Speaker 2>This is Monica Garrick. She's a co director of the

0:02:10.200 --> 0:02:14.240
<v Speaker 2>Center for Biomedical Ethics at MetroHealth System, the hospital

0:02:14.280 --> 0:02:15.480
<v Speaker 2>where Andrea was treated.

0:02:16.400 --> 0:02:20.280
<v Speaker 3>She had significant disfiguration of her lips and her eyelids.

0:02:20.840 --> 0:02:24.320
<v Speaker 3>Her vision was going to be questionable. She was going

0:02:24.360 --> 0:02:26.760
<v Speaker 3>to be living without at least one of her arms

0:02:26.760 --> 0:02:27.880
<v Speaker 3>from the elbow down.

0:02:28.200 --> 0:02:31.880
<v Speaker 2>Even though the ventilator would keep her breathing, Andrea would

0:02:31.880 --> 0:02:34.519
<v Speaker 2>need many surgeries to have a second chance at life

0:02:34.639 --> 0:02:38.519
<v Speaker 2>outside of the hospital room. Because burns are so painful,

0:02:38.720 --> 0:02:42.520
<v Speaker 2>and because she was being operated on so frequently, doctors

0:02:42.520 --> 0:02:46.920
<v Speaker 2>put Andrea into a medically induced coma so she couldn't consent,

0:02:47.600 --> 0:02:53.200
<v Speaker 2>not to the ventilator or the surgeries. So the hospital

0:02:53.200 --> 0:02:55.880
<v Speaker 2>called her seventy nine year old father who lived nearby.

0:02:56.760 --> 0:02:59.480
<v Speaker 2>He arrived at her bedside and he told doctors to

0:02:59.520 --> 0:03:02.840
<v Speaker 2>go ahead with the ventilator and the surgeries to try

0:03:02.880 --> 0:03:06.560
<v Speaker 2>to save his daughter's life. So doctors put Andrea on

0:03:06.639 --> 0:03:10.520
<v Speaker 2>the ventilator, and that's when things got complicated.

0:03:11.240 --> 0:03:14.160
<v Speaker 3>She had these friends who were adamant she would not

0:03:14.280 --> 0:03:15.800
<v Speaker 3>want treatment to be continued.

0:03:16.600 --> 0:03:19.480
<v Speaker 2>These friends showed up at the hospital burn unit too,

0:03:20.200 --> 0:03:23.079
<v Speaker 2>and just down the hall from where Andrea lay unconscious,

0:03:23.200 --> 0:03:26.200
<v Speaker 2>the friends argued with the medical team.

0:03:25.880 --> 0:03:27.960
<v Speaker 3>They were, I mean they were, you know,

0:03:28.160 --> 0:03:31.680
<v Speaker 3>just adamant that this, being burned and with the

0:03:31.800 --> 0:03:35.400
<v Speaker 3>kind of scarring she was going to face, was not

0:03:35.720 --> 0:03:36.960
<v Speaker 3>going to be okay with her.

0:03:37.960 --> 0:03:40.640
<v Speaker 2>Andrea would not want to live without an arm, they said.

0:03:41.760 --> 0:03:44.400
<v Speaker 2>She wouldn't want to wake up without a nose or ears,

0:03:44.560 --> 0:03:48.240
<v Speaker 2>and no longer recognize her face. Her father may want

0:03:48.240 --> 0:03:51.360
<v Speaker 2>her to live, but if Andrea could speak, she would say,

0:03:51.520 --> 0:03:56.520
<v Speaker 2>let me die. This made doctors pause. Even if they

0:03:56.520 --> 0:03:59.760
<v Speaker 2>could save her, the surgeries would affect her quality of life,

0:04:00.000 --> 0:04:02.640
<v Speaker 2>and here were her close friends saying she wouldn't want

0:04:02.680 --> 0:04:05.480
<v Speaker 2>to live that life. This was no longer just a

0:04:05.480 --> 0:04:11.440
<v Speaker 2>tricky medical decision. It was an ethical problem. So the

0:04:11.480 --> 0:04:13.800
<v Speaker 2>doctors asked Monica and her team for help.

0:04:14.360 --> 0:04:16.640
<v Speaker 3>The team was like, look, these friends are so adamant.

0:04:16.920 --> 0:04:21.080
<v Speaker 3>They're so adamant. What do we do? They were very

0:04:21.120 --> 0:04:24.040
<v Speaker 3>convincing that she would not want to live this way.

0:04:24.560 --> 0:04:29.279
<v Speaker 3>In fact, one of them grabbed our ethics fellow, shook

0:04:29.320 --> 0:04:33.600
<v Speaker 3>her and begged her to get the team to stop

0:04:33.640 --> 0:04:37.120
<v Speaker 3>treating Andrea and stop torturing her. Legally, we know what

0:04:37.160 --> 0:04:39.680
<v Speaker 3>we're supposed to do, but ethically it gets more complicated.

0:04:40.720 --> 0:04:44.520
<v Speaker 2>While Andrea's case is severe, her situation is actually more

0:04:44.560 --> 0:04:48.440
<v Speaker 2>common in hospitals than you might expect. At some point,

0:04:48.480 --> 0:04:51.599
<v Speaker 2>you or someone you love will face a life changing

0:04:51.640 --> 0:04:54.720
<v Speaker 2>medical decision. It might not be clear what is the

0:04:54.839 --> 0:04:58.800
<v Speaker 2>right decision. And if you can't consent because you're unconscious

0:04:58.880 --> 0:05:02.800
<v Speaker 2>or sedated, how should doctors and loved ones decide for you?

0:05:04.880 --> 0:05:08.680
<v Speaker 2>I'm your host, Lauren Arora Hutchinson. I'm the director of

0:05:08.720 --> 0:05:12.720
<v Speaker 2>the Ideas Lab at the Johns Hopkins Berman Institute of Bioethics.

0:05:13.320 --> 0:05:16.279
<v Speaker 2>I've spent years working on stories about the ways in

0:05:16.320 --> 0:05:19.760
<v Speaker 2>which medicine and science show up in people's everyday lives.

0:05:20.320 --> 0:05:23.240
<v Speaker 2>In this series, I'm going behind the scenes to discover

0:05:23.480 --> 0:05:27.560
<v Speaker 2>how some of the most significant medical innovations have impacted

0:05:27.600 --> 0:05:31.440
<v Speaker 2>people's lives and continue to. Whether it's saving lives or

0:05:31.480 --> 0:05:36.480
<v Speaker 2>creating babies, new technologies are often accompanied by new ethical questions.

0:05:37.160 --> 0:05:40.160
<v Speaker 2>Just because we can do something, does it mean we should?

0:05:40.880 --> 0:05:44.440
<v Speaker 2>And who gets to make those kinds of decisions? When

0:05:44.440 --> 0:05:48.640
<v Speaker 2>does it seem like playing God? In each episode, you'll

0:05:48.680 --> 0:05:54.440
<v Speaker 2>hear directly from patients, leading bioethicists, scientists, and physicians as

0:05:54.480 --> 0:05:58.239
<v Speaker 2>they grapple with these kinds of questions. On today's show,

0:05:58.520 --> 0:06:02.760
<v Speaker 2>The Ventilator, it saves lives, but it also forces us

0:06:02.760 --> 0:06:06.159
<v Speaker 2>to ask who should make life and death decisions for

0:06:06.279 --> 0:06:09.839
<v Speaker 2>someone who can't tell us what they want. From Pushkin

0:06:09.920 --> 0:06:13.920
<v Speaker 2>Industries and the Johns Hopkins Berman Institute of Bioethics, this

0:06:14.400 --> 0:06:21.280
<v Speaker 2>is Playing God. So who should make decisions on behalf

0:06:21.320 --> 0:06:25.200
<v Speaker 2>of a patient who is heavily sedated or unconscious, and

0:06:25.279 --> 0:06:28.560
<v Speaker 2>how do they make the right call? We'll return to

0:06:28.600 --> 0:06:31.479
<v Speaker 2>Andrea's story and how decisions were made in her case

0:06:31.520 --> 0:06:35.440
<v Speaker 2>a little later, but first I wanted to understand how

0:06:35.560 --> 0:06:39.400
<v Speaker 2>experts even begin to address the ethical questions in cases

0:06:39.480 --> 0:06:43.320
<v Speaker 2>like this. To find out, I asked Jeffrey Kahn. He's

0:06:43.360 --> 0:06:46.920
<v Speaker 2>the director of the Johns Hopkins Berman Institute of Bioethics.

0:06:47.680 --> 0:06:50.880
<v Speaker 2>Jeff is extremely well regarded in his field, and I'm

0:06:50.920 --> 0:06:55.119
<v Speaker 2>not just saying that because he's my boss. So, Jeff,

0:06:55.160 --> 0:06:59.039
<v Speaker 2>from a bioethics perspective, where do we start answering these

0:06:59.080 --> 0:07:00.200
<v Speaker 2>types of questions?

0:07:00.600 --> 0:07:03.320
<v Speaker 4>Well, first, I want to say thanks Lauren for taking

0:07:03.320 --> 0:07:05.600
<v Speaker 4>the time to talk with me about these important questions,

0:07:05.600 --> 0:07:08.839
<v Speaker 4>and not just because we work together. So first, I

0:07:08.839 --> 0:07:11.640
<v Speaker 4>think we have to answer a question, which is what's

0:07:11.800 --> 0:07:16.119
<v Speaker 4>the end goal here? And the end goal really should

0:07:16.200 --> 0:07:20.560
<v Speaker 4>be preserving the autonomy of the individual patient, that is,

0:07:20.600 --> 0:07:24.080
<v Speaker 4>the control that they have over themselves and their bodies.

0:07:24.560 --> 0:07:27.360
<v Speaker 2>So in the case here with Andrea, it's about making

0:07:27.400 --> 0:07:29.920
<v Speaker 2>sure that the decision that gets made is actually the

0:07:29.960 --> 0:07:32.960
<v Speaker 2>one that Andrea would want?

0:07:33.040 --> 0:07:37.000
<v Speaker 4>Exactly right. So it seems really obvious to us now in twenty

0:07:37.000 --> 0:07:39.880
<v Speaker 4>twenty three talking to each other that we would ask

0:07:39.920 --> 0:07:43.920
<v Speaker 4>the person first and foremost. But that wasn't always the

0:07:43.960 --> 0:07:48.200
<v Speaker 4>way it was. The system used to be much more paternalistic.

0:07:48.560 --> 0:07:52.480
<v Speaker 4>That is, the doctor knew best and patients just went

0:07:52.520 --> 0:07:57.200
<v Speaker 4>along with whatever the doctor recommended. And so it was

0:07:57.240 --> 0:08:00.160
<v Speaker 4>a sea change, really a big difference in the way

0:08:00.400 --> 0:08:05.120
<v Speaker 4>the medical profession practices, and maybe more importantly what patients

0:08:05.160 --> 0:08:10.440
<v Speaker 4>expected: that their decisions mattered, actually mattered more than the

0:08:10.480 --> 0:08:11.520
<v Speaker 4>doctor's recommendation.

0:08:12.280 --> 0:08:14.760
<v Speaker 2>So when did this change happen and what was it

0:08:14.760 --> 0:08:17.000
<v Speaker 2>that made the change happen when it did? Why did

0:08:17.080 --> 0:08:19.040
<v Speaker 2>doctors start considering patients' wishes?

0:08:19.360 --> 0:08:23.640
<v Speaker 4>It really began in the nineteen sixties. American society was

0:08:23.680 --> 0:08:28.360
<v Speaker 4>going through some pretty dramatic changes in the nineteen sixties

0:08:28.360 --> 0:08:32.120
<v Speaker 4>into the early nineteen seventies. We had the Vietnam War raging,

0:08:32.200 --> 0:08:35.559
<v Speaker 4>We had Watergate and the scandal that ensued. We had

0:08:36.040 --> 0:08:39.200
<v Speaker 4>civil rights finally starting to take hold in a lot

0:08:39.280 --> 0:08:44.040
<v Speaker 4>of political turmoil around that, and along with identification and

0:08:44.200 --> 0:08:48.559
<v Speaker 4>finally implementation I guess of civil rights came a recognition

0:08:48.960 --> 0:08:52.400
<v Speaker 4>that patients also had rights. Rights to decide for themselves

0:08:52.400 --> 0:08:55.520
<v Speaker 4>and to make decisions about what should happen to their body.

0:08:56.000 --> 0:08:59.559
<v Speaker 4>And along with that came some technologies which called into

0:08:59.600 --> 0:09:01.680
<v Speaker 4>question how we would actually be able to

0:09:01.720 --> 0:09:04.920
<v Speaker 4>allow patients to take control of decisions about their bodies.

0:09:05.679 --> 0:09:08.400
<v Speaker 2>So, can you tell us about what kind of technologies

0:09:08.520 --> 0:09:11.240
<v Speaker 2>they were and what ethical questions came with them?

0:09:12.200 --> 0:09:15.080
<v Speaker 4>Well, one example of a technology that came along around

0:09:15.120 --> 0:09:17.880
<v Speaker 4>the same time and challenged some of these ideas around

0:09:17.880 --> 0:09:22.520
<v Speaker 4>patient autonomy was the ventilator. So a machine that allowed

0:09:22.840 --> 0:09:26.000
<v Speaker 4>doctors to save the lives of people who were critically

0:09:26.040 --> 0:09:29.480
<v Speaker 4>injured, who before that technology was invented would have died,

0:09:29.679 --> 0:09:32.560
<v Speaker 4>like Andrea. The challenge is, if somebody is connected to

0:09:32.600 --> 0:09:36.560
<v Speaker 4>a ventilator, they can't most of the time respond to questions,

0:09:36.559 --> 0:09:39.839
<v Speaker 4>certainly not by speaking verbally, and most of the time

0:09:39.880 --> 0:09:44.000
<v Speaker 4>they're unconscious, making it all but impossible to understand what

0:09:44.040 --> 0:09:46.920
<v Speaker 4>their wishes might be when it comes to whether they

0:09:46.920 --> 0:09:50.400
<v Speaker 4>should be kept alive when the quality of their life

0:09:50.440 --> 0:09:53.680
<v Speaker 4>after they may or may not recover is so uncertain.

0:09:54.080 --> 0:09:57.640
<v Speaker 4>And so a technology that allowed people to be kept

0:09:57.720 --> 0:10:02.119
<v Speaker 4>alive in a way that just wasn't possible before undermined

0:10:02.200 --> 0:10:06.920
<v Speaker 4>or made very difficult the idea of also respecting their autonomy.

0:10:07.320 --> 0:10:11.880
<v Speaker 2>So I guess doctors were again playing God in a sense.

0:10:11.679 --> 0:10:15.400
<v Speaker 4>And they were in a position to play God, exactly.

0:10:15.520 --> 0:10:19.880
<v Speaker 4>And the ethics question is, well, should they be the

0:10:19.920 --> 0:10:23.320
<v Speaker 4>ones who get to decide or how best to figure

0:10:23.320 --> 0:10:26.480
<v Speaker 4>out what would be in the patient's interest, what the

0:10:26.520 --> 0:10:29.959
<v Speaker 4>patient's desires would be? Who gets to decide?

0:10:30.360 --> 0:10:34.080
<v Speaker 2>Okay, so how do we get from that to now?

0:10:34.400 --> 0:10:38.000
<v Speaker 2>And how should these decisions get made?

0:10:38.200 --> 0:10:40.560
<v Speaker 4>The idea of trying to make sure that people who

0:10:40.600 --> 0:10:43.280
<v Speaker 4>could not answer the question about what they would want

0:10:43.320 --> 0:10:48.240
<v Speaker 4>for themselves led to something called an advance directive, so

0:10:48.880 --> 0:10:52.400
<v Speaker 4>a document that articulates what people would or would not

0:10:53.120 --> 0:10:56.560
<v Speaker 4>like if they found themselves in a situation like being

0:10:56.960 --> 0:11:00.800
<v Speaker 4>maintained on a ventilator, usually by checking boxes, but also

0:11:00.840 --> 0:11:04.720
<v Speaker 4>really importantly to identify somebody who can speak on their behalf.

0:11:05.280 --> 0:11:07.920
<v Speaker 2>And then what happens if someone hasn't left anything behind

0:11:08.559 --> 0:11:11.240
<v Speaker 2>like an advance directive, so like what happened with Andrea?

0:11:12.920 --> 0:11:17.880
<v Speaker 4>Unfortunately that's a very common occurrence. Then the question is

0:11:18.040 --> 0:11:22.080
<v Speaker 4>and should be who can speak with knowledge about their wishes?

0:11:22.280 --> 0:11:23.640
<v Speaker 4>What would they want?

0:11:24.520 --> 0:11:24.640
<v Speaker 4>Not

0:11:24.840 --> 0:11:27.280
<v Speaker 4>What do we think is best for them, but rather

0:11:27.400 --> 0:11:30.160
<v Speaker 4>what would they want? And if we don't know the

0:11:30.160 --> 0:11:33.480
<v Speaker 4>answer to that question, then we have to ask a

0:11:33.480 --> 0:11:35.760
<v Speaker 4>different question, which is what do we think would be

0:11:35.760 --> 0:11:38.120
<v Speaker 4>best for them? If we don't know what they would want.

0:11:39.000 --> 0:11:42.640
<v Speaker 4>The problem occurs when members of the family don't seem

0:11:42.679 --> 0:11:45.720
<v Speaker 4>to know what the patient would want, or other people

0:11:45.760 --> 0:11:48.600
<v Speaker 4>show up and say, I know this person better than

0:11:48.640 --> 0:11:51.920
<v Speaker 4>the members of their family, and I can speak with

0:11:52.080 --> 0:11:55.400
<v Speaker 4>knowledge about what they would want in a way that's better,

0:11:55.520 --> 0:11:59.600
<v Speaker 4>more informed than the people who are related to them.

0:11:59.800 --> 0:12:02.360
<v Speaker 2>So in Andrea's case, there was a lot of disagreement

0:12:02.400 --> 0:12:05.080
<v Speaker 2>as we heard. So what happens in those cases?

0:12:06.080 --> 0:12:09.440
<v Speaker 4>Then there is a process. It's required as a matter

0:12:09.480 --> 0:12:13.559
<v Speaker 4>of accreditation for hospitals that an ethics committee exists at

0:12:13.559 --> 0:12:16.160
<v Speaker 4>the hospital and that there be a process for something

0:12:16.200 --> 0:12:20.000
<v Speaker 4>called an ethics consultation, an ethics consult. So people like

0:12:20.480 --> 0:12:24.280
<v Speaker 4>Monica in our story, who was a clinical ethics expert,

0:12:25.160 --> 0:12:31.079
<v Speaker 4>social workers, psychologists, sometimes psychiatrists, members of the legal department

0:12:31.120 --> 0:12:33.880
<v Speaker 4>in the hospital. All those people are sitting around at

0:12:33.960 --> 0:12:37.400
<v Speaker 4>table in a conference room, being presented with a case

0:12:37.679 --> 0:12:40.559
<v Speaker 4>and then trying to help advise how to proceed.

0:12:41.160 --> 0:12:43.439
<v Speaker 2>So what would be used as evidence in those kind

0:12:43.440 --> 0:12:47.120
<v Speaker 2>of consultations? Say, if Andrea's friends had had something written

0:12:47.160 --> 0:12:50.080
<v Speaker 2>down that Andrea had said, would that help?

0:12:50.960 --> 0:12:54.400
<v Speaker 4>I think it would help inform the process. You have

0:12:54.440 --> 0:12:57.160
<v Speaker 4>to hope in a case like Andrea's, if there was

0:12:57.280 --> 0:13:00.760
<v Speaker 4>concrete evidence that the friends could bring that it would

0:13:00.760 --> 0:13:05.040
<v Speaker 4>be used to inform the conversation, hopefully inform the father's

0:13:05.160 --> 0:13:08.760
<v Speaker 4>decision making. So it wouldn't change who gets to decide,

0:13:08.760 --> 0:13:12.720
<v Speaker 4>but hopefully it would change the information that the person

0:13:12.720 --> 0:13:15.720
<v Speaker 4>who gets to decide would use to make the decision

0:13:15.720 --> 0:13:17.040
<v Speaker 4>on behalf of the patient.

0:13:17.960 --> 0:13:23.160
<v Speaker 2>Okay, thanks Jeff. After the break, we'll find out how

0:13:23.240 --> 0:13:27.000
<v Speaker 2>well Andrea thinks this system worked for her. How did

0:13:27.000 --> 0:13:30.400
<v Speaker 2>the ethics team, her loved ones, and doctors decide what

0:13:30.520 --> 0:13:33.720
<v Speaker 2>to do, and did she feel like they made the

0:13:33.760 --> 0:13:34.200
<v Speaker 2>right call?

0:13:34.840 --> 0:13:39.320
<v Speaker 1>So I think it's a very difficult question to answer.

0:13:40.559 --> 0:13:49.880
<v Speaker 2>Wow, let's go back to the hospital burn unit. Andrea

0:13:50.000 --> 0:13:54.520
<v Speaker 2>Rubin is covered in life threatening burns and is unconscious.

0:13:55.280 --> 0:13:59.280
<v Speaker 2>Doctors say the ventilator and surgery are her only shot

0:13:59.360 --> 0:14:04.360
<v Speaker 2>at staying alive. Andrea's dad is ready to move forward, but

0:14:04.480 --> 0:14:08.280
<v Speaker 2>Andrea's friends are insistent that Andrea wouldn't want to be

0:14:08.400 --> 0:14:10.720
<v Speaker 2>kept alive under these circumstances.

0:14:12.160 --> 0:14:14.839
<v Speaker 3>They were so adamant. In fact, I had a nurse

0:14:14.880 --> 0:14:21.040
<v Speaker 3>tell me years later in tears, years later thinking back

0:14:21.240 --> 0:14:26.080
<v Speaker 3>about the stress that they were feeling, because the friends

0:14:26.080 --> 0:14:32.200
<v Speaker 3>were so adamant that they were sending Andrea into a

0:14:32.240 --> 0:14:34.520
<v Speaker 3>life she would not want to live.

0:14:34.760 --> 0:14:40.280
<v Speaker 2>Again, Monica Garrick, the hospital bioethicist. According to Monica, Andrea's friends

0:14:40.320 --> 0:14:44.040
<v Speaker 2>couldn't override the dad's authority to make the call. But

0:14:44.160 --> 0:14:47.360
<v Speaker 2>if the friends had any evidence of Andrea's wishes, if

0:14:47.400 --> 0:14:50.120
<v Speaker 2>there was a text from Andrea, if they could remember

0:14:50.120 --> 0:14:53.680
<v Speaker 2>the details of a conversation, Monica could take that information

0:14:53.760 --> 0:14:54.680
<v Speaker 2>to Andrea's dad.

0:14:55.080 --> 0:14:57.880
<v Speaker 3>You know, we might have talked to her father and said, look,

0:14:58.040 --> 0:15:00.000
<v Speaker 3>this is what we're being told, and this is

0:15:00.080 --> 0:15:02.880
<v Speaker 3>inconsistent with the fact that you're willing to consent. Can you

0:15:03.120 --> 0:15:05.520
<v Speaker 3>talk to us about why you're consenting on her behalf?

0:15:06.200 --> 0:15:09.680
<v Speaker 2>But her friends didn't have any concrete evidence of Andrea's wishes,

0:15:10.280 --> 0:15:13.880
<v Speaker 2>so Andrea was kept alive. She had nineteen surgeries while

0:15:13.920 --> 0:15:17.320
<v Speaker 2>she was sedated, and around thirty nine more over the

0:15:17.360 --> 0:15:22.280
<v Speaker 2>next five years. Andrea spent seven weeks in an induced

0:15:22.320 --> 0:15:25.320
<v Speaker 2>coma and another month and a half in and out

0:15:25.360 --> 0:15:30.000
<v Speaker 2>of consciousness, and slowly she began piecing together the story

0:15:30.120 --> 0:15:33.400
<v Speaker 2>of what happened. She learned about what her friends did.

0:15:34.000 --> 0:15:38.840
<v Speaker 1>They were fighting with my dad saying, you know, obviously

0:15:38.880 --> 0:15:42.960
<v Speaker 1>you don't know your daughter. They really felt my father

0:15:44.520 --> 0:15:49.480
<v Speaker 1>didn't understand that for a girl who's forty nine years old,

0:15:50.320 --> 0:15:52.800
<v Speaker 1>yeah she can walk, but does she want

0:15:52.800 --> 0:15:54.800
<v Speaker 1>to walk through this life looking like this?

0:15:55.480 --> 0:15:58.240
<v Speaker 2>She also learned what surgeons had done to save her.

0:15:58.920 --> 0:16:04.040
<v Speaker 1>They made a makeshift eyelid, which basically looks

0:16:04.080 --> 0:16:06.440
<v Speaker 1>like I've been punched in the face right, so I

0:16:06.440 --> 0:16:09.840
<v Speaker 1>have no vision in that eye. I lost three quarters

0:16:10.000 --> 0:16:14.760
<v Speaker 1>of my nose. I had burns on my face, so

0:16:14.760 --> 0:16:17.440
<v Speaker 1>I don't look anything like I used to do. A

0:16:17.440 --> 0:16:19.560
<v Speaker 1>lot of scar tissue, so I can't really smile.

0:16:20.560 --> 0:16:23.280
<v Speaker 2>The most she can manage now is a slight upcurl

0:16:23.320 --> 0:16:26.920
<v Speaker 2>of her lips. Also, the burns on her scalp were

0:16:27.000 --> 0:16:30.760
<v Speaker 2>so bad she can no longer grow her hair. Like

0:16:30.800 --> 0:16:35.200
<v Speaker 2>a lot of people, hair was pretty important to Andrea

0:16:35.480 --> 0:16:39.000
<v Speaker 2>even before the accident. She felt pretty insecure about it.

0:16:39.360 --> 0:16:42.080
<v Speaker 1>I could never get my hair like long enough the

0:16:42.120 --> 0:16:44.080
<v Speaker 1>way I wanted it, and I'm like once I discovered

0:16:44.120 --> 0:16:47.080
<v Speaker 1>hair extensions, it was game over. I'm like, oh my god,

0:16:47.520 --> 0:16:51.400
<v Speaker 1>I have long, full hair. Finally, I wasn't really like

0:16:52.200 --> 0:16:54.840
<v Speaker 1>throwing on all the makeup and you know, having to

0:16:54.880 --> 0:16:57.080
<v Speaker 1>look perfect. I took very good care of myself. I

0:16:57.080 --> 0:16:59.480
<v Speaker 1>always worked out. I cared about what I looked like.

0:16:59.560 --> 0:17:02.400
<v Speaker 1>But I wasn't all about it, you know, except when

0:17:02.440 --> 0:17:03.600
<v Speaker 1>it came to the hair extensions.

0:17:03.600 --> 0:17:04.719
<v Speaker 1>I was all about it.

0:17:05.560 --> 0:17:09.760
<v Speaker 2>These days, Andrea wears a medical wig. It's long, straight

0:17:09.840 --> 0:17:13.280
<v Speaker 2>and honey blonde, just like her hair was before the accident.

0:17:14.359 --> 0:17:17.600
<v Speaker 2>Of course, it's not exactly the same, and that's part

0:17:17.600 --> 0:17:19.880
<v Speaker 2>of what her friends were worried about all those years ago,

0:17:20.720 --> 0:17:23.119
<v Speaker 2>why they thought Andrea might not want to stay alive.

0:17:23.840 --> 0:17:27.240
<v Speaker 1>What my friends were suggesting was extreme, but then again,

0:17:27.280 --> 0:17:28.840
<v Speaker 1>what happened to me was very extreme.

0:17:29.359 --> 0:17:31.760
<v Speaker 2>The three friends who spoke up for her at the hospital,

0:17:31.920 --> 0:17:35.960
<v Speaker 2>all women, are still her closest friends today, and she

0:17:36.040 --> 0:17:39.000
<v Speaker 2>says there are no hard feelings. They've talked about why

0:17:39.040 --> 0:17:41.439
<v Speaker 2>they were so adamant that she shouldn't be kept alive.

0:17:42.240 --> 0:17:45.160
<v Speaker 1>They're like, we just didn't. We didn't want to put

0:17:45.160 --> 0:17:47.359
<v Speaker 1>you through this. We didn't think you'd be happy. We

0:17:47.359 --> 0:17:51.680
<v Speaker 1>were speaking for you, as we thought you would have

0:17:51.720 --> 0:17:52.640
<v Speaker 1>spoken for yourself.

0:17:53.480 --> 0:17:56.000
<v Speaker 2>They aren't afraid to talk about what happened. They can

0:17:56.040 --> 0:17:57.120
<v Speaker 2>even laugh about it.

0:17:57.960 --> 0:17:58.159
<v Speaker 1>You know.

0:17:58.320 --> 0:18:01.159
<v Speaker 1>It's like I always joke, Oh, I just would have

0:18:01.200 --> 0:18:02.480
<v Speaker 1>you know, I would have killed you too.

0:18:06.080 --> 0:18:08.760
<v Speaker 2>She also understands why her dad made the call to

0:18:08.840 --> 0:18:09.520
<v Speaker 2>keep her alive.

0:18:10.440 --> 0:18:15.800
<v Speaker 1>I am pretty sure it was a difficult call for him.

0:18:16.480 --> 0:18:19.159
<v Speaker 1>But I think at the end of the day, when

0:18:19.240 --> 0:18:23.880
<v Speaker 1>doctors said, hey, we probably can save her, I guess

0:18:23.680 --> 0:18:24.960
<v Speaker 1>what's a parent going to do?

0:18:26.080 --> 0:18:28.800
<v Speaker 2>But were her friends right when they assumed the surgeries

0:18:28.840 --> 0:18:31.800
<v Speaker 2>would compromise her quality of life so much that she

0:18:31.840 --> 0:18:33.720
<v Speaker 2>wouldn't have wanted to survive?

0:18:34.359 --> 0:18:36.239
<v Speaker 1>You know. I go back and forth with this. At

0:18:36.240 --> 0:18:38.800
<v Speaker 1>the time, I think they were right. Of course, Now

0:18:38.800 --> 0:18:43.760
<v Speaker 1>in hindsight, I'm so happy I'm alive. A lot has

0:18:43.840 --> 0:18:47.560
<v Speaker 1>changed on the outside, but my life really is still

0:18:47.880 --> 0:18:48.920
<v Speaker 1>pretty remarkable.

0:18:51.920 --> 0:18:54.720
<v Speaker 2>Andrea is happy she's alive, which is pretty much the

0:18:54.760 --> 0:18:57.800
<v Speaker 2>best case scenario for someone who didn't have their wishes

0:18:57.840 --> 0:19:02.200
<v Speaker 2>spelled out in advance. And again we're not picking on Andrea.

0:19:02.760 --> 0:19:05.200
<v Speaker 2>Most people don't have any of this stuff worked out.

0:19:05.760 --> 0:19:08.440
<v Speaker 1>I tell everybody, get your stuff together, because I didn't

0:19:08.440 --> 0:19:13.720
<v Speaker 1>have anything together. Nothing. And everybody had a long, hard

0:19:13.760 --> 0:19:14.880
<v Speaker 1>journey because of it.

0:19:15.680 --> 0:19:18.960
<v Speaker 2>And if you're thinking, great, I'll just write an advanced directive.

0:19:19.119 --> 0:19:22.960
<v Speaker 2>Problem solved. It's still hard to predict how you'd feel

0:19:22.960 --> 0:19:26.760
<v Speaker 2>in every situation. You might not understand all the options

0:19:26.800 --> 0:19:30.320
<v Speaker 2>available to you, especially if you're dealing with a life

0:19:30.320 --> 0:19:37.400
<v Speaker 2>saving medical technology like the ventilator. The ventilator was an issue

0:19:37.400 --> 0:19:39.360
<v Speaker 2>in another case Monica got involved in.

0:19:40.760 --> 0:19:45.399
<v Speaker 3>Our service got consulted by a surgeon who said, I

0:19:45.440 --> 0:19:49.080
<v Speaker 3>don't have an issue now, but I want you involved

0:19:49.480 --> 0:19:51.800
<v Speaker 3>now because this is a little complicated.

0:19:52.480 --> 0:19:54.840
<v Speaker 2>The surgeon told her he had a patient who might

0:19:54.880 --> 0:19:56.760
<v Speaker 2>need to be on a ventilator for just a day

0:19:56.840 --> 0:19:58.960
<v Speaker 2>or two before he was well enough to breathe on

0:19:59.000 --> 0:20:03.000
<v Speaker 2>his own. The problem: the patient had been very explicit

0:20:03.119 --> 0:20:05.840
<v Speaker 2>about his desire to never be put on a ventilator.

0:20:06.359 --> 0:20:11.280
<v Speaker 3>He had written it on the backs of manila envelopes,

0:20:11.800 --> 0:20:14.800
<v Speaker 3>in notebooks and like sort of scrap pieces of paper.

0:20:15.600 --> 0:20:18.159
<v Speaker 3>So I don't know what had happened in his life,

0:20:18.200 --> 0:20:21.720
<v Speaker 3>but he was very sure he did not ever want

0:20:21.760 --> 0:20:22.920
<v Speaker 3>to be on a breathing machine.

0:20:23.560 --> 0:20:26.000
<v Speaker 2>It was up to his daughter to decide what to do,

0:20:26.640 --> 0:20:30.159
<v Speaker 2>and she knew his stance, but it wasn't clear to

0:20:30.240 --> 0:20:32.840
<v Speaker 2>her if he'd understood that being on a ventilator can

0:20:32.880 --> 0:20:36.920
<v Speaker 2>be temporary. Did he really mean never or just not forever?

0:20:38.520 --> 0:20:41.720
<v Speaker 2>The daughter reached out to the hospital's bioethics consult team.

0:20:42.200 --> 0:20:44.760
<v Speaker 2>She wanted help from experts like Monica.

0:20:45.280 --> 0:20:47.119
<v Speaker 3>And so, what do I do if you're telling me

0:20:47.160 --> 0:20:50.960
<v Speaker 3>it's temporary and that it'll save his life and that

0:20:51.119 --> 0:20:55.320
<v Speaker 3>without it he could die? But he said never? What

0:20:55.480 --> 0:20:56.080
<v Speaker 3>do I do?

0:20:57.119 --> 0:21:01.560
<v Speaker 2>Remember Jeffrey Kahn, our bioethicist from earlier? As he pointed out,

0:21:01.600 --> 0:21:04.919
<v Speaker 2>we can't always predict what a patient would decide for themselves.

0:21:05.600 --> 0:21:08.120
<v Speaker 2>In those cases, we have to decide what would be

0:21:08.119 --> 0:21:11.760
<v Speaker 2>best for them, what would be in the patient's best interest.

0:21:12.760 --> 0:21:15.199
<v Speaker 2>Monica and the patient's daughter got the input of the

0:21:15.240 --> 0:21:18.280
<v Speaker 2>medical team and decided that it would be best for

0:21:18.359 --> 0:21:22.720
<v Speaker 2>him to be put temporarily on a ventilator. Luckily, the

0:21:22.840 --> 0:21:25.760
<v Speaker 2>daughter didn't have to act on that decision. Her dad

0:21:25.800 --> 0:21:27.480
<v Speaker 2>didn't need to be put on a ventilator.

0:21:27.640 --> 0:21:30.960
<v Speaker 3>He got better and he came to a few days later,

0:21:31.760 --> 0:21:34.239
<v Speaker 3>and when he woke up, the first thing out of

0:21:34.240 --> 0:21:37.520
<v Speaker 3>her mouth to him was, Dad, you know, this is

0:21:37.560 --> 0:21:40.520
<v Speaker 3>what happened, and they almost had to do this to you.

0:21:41.040 --> 0:21:43.119
<v Speaker 3>But you've said repeatedly and you wrote it down that

0:21:43.160 --> 0:21:45.359
<v Speaker 3>you never wanted to be on a ventilator. What should

0:21:45.400 --> 0:21:47.240
<v Speaker 3>I do in the future if this ever happens again?

0:21:47.400 --> 0:21:49.120
<v Speaker 3>And he said, well, by all means, say yes.

0:21:50.160 --> 0:21:53.800
<v Speaker 2>The idea of acting in the patient's best interests accounts

0:21:53.800 --> 0:21:57.800
<v Speaker 2>for some of the gray areas in advance directives, because

0:21:57.840 --> 0:22:00.679
<v Speaker 2>even if you have an advance directive, it's hard to

0:22:00.760 --> 0:22:03.200
<v Speaker 2>know if you'll feel the same way when you're in

0:22:03.240 --> 0:22:04.760
<v Speaker 2>a life or death situation.

0:22:05.520 --> 0:22:08.639
<v Speaker 3>People when they get close to death, if they've said

0:22:08.760 --> 0:22:11.520
<v Speaker 3>I don't want any interventions, sometimes say well, now give

0:22:11.560 --> 0:22:14.040
<v Speaker 3>them to me. I've changed my mind. And some people

0:22:14.040 --> 0:22:16.399
<v Speaker 3>who get close to death who have said, you know,

0:22:16.480 --> 0:22:20.080
<v Speaker 3>give me every intervention get close and say no, no, no,

0:22:20.240 --> 0:22:22.959
<v Speaker 3>I don't want them. Now I'm ready to go. So

0:22:23.600 --> 0:22:29.359
<v Speaker 3>all this decision stuff at the bedside gets really... it's

0:22:29.720 --> 0:22:34.119
<v Speaker 3>extremely complicated. You have the patient who sometimes is unreliable,

0:22:34.160 --> 0:22:36.320
<v Speaker 3>not because they're not good people or not because they

0:22:36.320 --> 0:22:38.639
<v Speaker 3>haven't thought about things, but because we change our minds

0:22:38.680 --> 0:22:39.960
<v Speaker 3>all the time about everything.

0:22:40.640 --> 0:22:44.679
<v Speaker 2>Andrea understands this as well as anyone I know.

0:22:44.720 --> 0:22:47.720
<v Speaker 1>If it were nine years ago and somebody said, Okay,

0:22:47.720 --> 0:22:50.000
<v Speaker 1>here's what's going to happen. You know you're going to

0:22:50.040 --> 0:22:54.560
<v Speaker 1>be completely disfigured, blah blah blah. Could you do it?

0:22:54.720 --> 0:22:57.440
<v Speaker 1>My answer would be, heck no, don't even think about it.

0:22:57.520 --> 0:22:59.720
<v Speaker 1>Save your time and energy. Let me go.

0:23:00.600 --> 0:23:05.240
<v Speaker 2>Back then, physical appearance was pretty important to Andrea. It

0:23:05.280 --> 0:23:08.600
<v Speaker 2>played a big role in her quality of life. Today,

0:23:09.080 --> 0:23:13.159
<v Speaker 2>Andrea looks completely different, but her quality of life is

0:23:13.240 --> 0:23:17.080
<v Speaker 2>better than she could have imagined. She's glad to be alive.

0:23:18.640 --> 0:23:21.879
<v Speaker 2>Monica is trying to help Andrea make a plan in

0:23:21.920 --> 0:23:25.119
<v Speaker 2>case she ever ends up in another life or death situation.

0:23:26.400 --> 0:23:29.639
<v Speaker 2>She's encouraging Andrea to prepare an advance directive.

0:23:31.240 --> 0:23:34.200
<v Speaker 3>I had a conversation with her about what happens if she suffers,

0:23:34.359 --> 0:23:39.000
<v Speaker 3>you know, a really devastating C-spine injury and ends up with quadriplegia.

0:23:39.440 --> 0:23:41.800
<v Speaker 3>Then what? And she's told me that she wouldn't

0:23:41.840 --> 0:23:44.480
<v Speaker 3>want to continue to live like that, and I'm like,

0:23:44.520 --> 0:23:46.320
<v Speaker 3>but you said you wouldn't want to continue to live

0:23:46.400 --> 0:23:51.040
<v Speaker 3>like this and she said, right, but that's really my limit.

0:23:51.160 --> 0:23:52.760
<v Speaker 3>And I said, well, how do you know that that's

0:23:52.760 --> 0:23:55.040
<v Speaker 3>really your limit? When you thought this was your limit,

0:23:55.400 --> 0:23:58.560
<v Speaker 3>and then this, and she said, well, you're right, Monica,

0:23:58.600 --> 0:24:00.919
<v Speaker 3>I don't know. And I said, so, then what

0:24:00.920 --> 0:24:03.040
<v Speaker 3>are we supposed to do? And you know, she basically

0:24:03.040 --> 0:24:05.159
<v Speaker 3>admits she doesn't really know what we're supposed to do,

0:24:05.400 --> 0:24:07.280
<v Speaker 3>right, if something happens to her again.

0:24:08.040 --> 0:24:11.120
<v Speaker 1>I've gone through so much. I just have this fighting spirit.

0:24:11.200 --> 0:24:13.800
<v Speaker 1>But oh heck no, no, no no, I don't have any

0:24:13.800 --> 0:24:16.960
<v Speaker 1>more fight left in me. But you know, when it

0:24:17.000 --> 0:24:19.000
<v Speaker 1>all is said and done, I want to live.

0:24:19.960 --> 0:24:24.600
<v Speaker 3>These aren't the thoughts of a flaky person, right? Andrea

0:24:24.640 --> 0:24:30.639
<v Speaker 3>is extremely articulate. She's very smart, she's very thoughtful. I

0:24:30.680 --> 0:24:34.000
<v Speaker 3>think she's just typical of normal human beings.

0:24:35.720 --> 0:24:39.879
<v Speaker 2>Normal human beings constantly change their minds. Our values and

0:24:39.960 --> 0:24:43.840
<v Speaker 2>ideas about things like appearance and quality of life. They

0:24:43.880 --> 0:24:47.639
<v Speaker 2>shift over time. How we feel about things depends so

0:24:47.840 --> 0:24:50.960
<v Speaker 2>much on the context. But the more you think through

0:24:50.960 --> 0:24:53.800
<v Speaker 2>your wishes and share them with others when things are okay,

0:24:54.760 --> 0:24:57.680
<v Speaker 2>the easier it is for everyone to respect your wishes when

0:24:57.720 --> 0:24:59.360
<v Speaker 2>things are dire.

0:24:59.320 --> 0:25:02.080
<v Speaker 1>I get to live, to do, I get to try to

0:25:02.119 --> 0:25:05.879
<v Speaker 1>make a difference. I have some happiness, less pain, no

0:25:05.960 --> 0:25:08.680
<v Speaker 1>more hospital. It's like, yeah, the last thing I want

0:25:08.680 --> 0:25:11.600
<v Speaker 1>to do is sit here and start talking about the

0:25:11.600 --> 0:25:13.560
<v Speaker 1>worst thing in the world. But it is something I

0:25:13.600 --> 0:25:16.000
<v Speaker 1>need to start really focusing on again.

0:25:17.200 --> 0:25:21.240
<v Speaker 2>Of course, we can't plan ahead for every scenario. In

0:25:21.320 --> 0:25:24.080
<v Speaker 2>some cases, a loved one may have to step in

0:25:24.160 --> 0:25:27.199
<v Speaker 2>and make a call, and when they do, there's a

0:25:27.240 --> 0:25:30.840
<v Speaker 2>ton of pressure and stress to get things right. For

0:25:30.960 --> 0:25:35.040
<v Speaker 2>some people, that's the most important thing. But for Monica,

0:25:35.440 --> 0:25:38.640
<v Speaker 2>what's more important than being right is how you get

0:25:38.680 --> 0:25:39.400
<v Speaker 2>to your decision.

0:25:39.960 --> 0:25:43.320
<v Speaker 3>Personally, I've told my loved ones, look, you know, do

0:25:43.400 --> 0:25:45.159
<v Speaker 3>what you think is best in that moment, and that

0:25:45.280 --> 0:25:47.159
<v Speaker 3>is okay with me. Like, just know that I support

0:25:47.200 --> 0:25:49.359
<v Speaker 3>whatever decision you made. And I had a nurse tell me.

0:25:49.400 --> 0:25:51.159
<v Speaker 3>She was a retired nurse, and she said, Monica, I

0:25:51.240 --> 0:25:54.800
<v Speaker 3>always told loved ones, family members, if you make the

0:25:54.920 --> 0:26:01.040
<v Speaker 3>decision from love, you cannot make a bad decision. You know,

0:26:01.440 --> 0:26:05.720
<v Speaker 3>actual love, not selfishness, but real love for the person.

0:26:05.880 --> 0:26:08.320
<v Speaker 3>You can't make a bad decision. And I think that

0:26:08.320 --> 0:26:08.880
<v Speaker 3>that's right.

0:26:11.720 --> 0:26:15.560
<v Speaker 6>Coming up this season on Playing God: I just remember

0:26:15.800 --> 0:26:19.200
<v Speaker 6>laying there and watching the lights above me as we're

0:26:19.240 --> 0:26:22.280
<v Speaker 6>walking down the hallway, and the first thing I said was,

0:26:22.320 --> 0:26:24.800
<v Speaker 6>do I have a uterus, and the nurse who was

0:26:24.800 --> 0:26:27.600
<v Speaker 6>pushing me looked down and they smiled and they're like,

0:26:27.640 --> 0:26:28.720
<v Speaker 6>you have a uterus.

0:26:29.240 --> 0:26:33.600
<v Speaker 1>So there were questions about who was actually the narrator

0:26:33.600 --> 0:26:35.680
<v Speaker 1>of the life at that point. Was it the technology

0:26:35.760 --> 0:26:38.080
<v Speaker 1>or was it the person? Was it some kind of combination?

0:26:38.800 --> 0:26:44.440
<v Speaker 5>I am completely dependent upon electricity as medicine, and there

0:26:44.480 --> 0:26:47.159
<v Speaker 5>will never be a point in my life where I

0:26:47.200 --> 0:26:51.560
<v Speaker 5>can quote go off the grid because I can never

0:26:51.760 --> 0:26:55.520
<v Speaker 5>be without electricity for my own survival.

0:26:56.119 --> 0:26:57.879
<v Speaker 4>You sort of have to ask yourself, what would I

0:26:57.920 --> 0:27:01.160
<v Speaker 4>do as a parent? Wouldn't I do anything I possibly could?

0:27:01.200 --> 0:27:03.320
<v Speaker 4>How can you not try everything when you're trying to

0:27:03.320 --> 0:27:04.359
<v Speaker 4>save the life of your child?

0:27:05.800 --> 0:27:09.520
<v Speaker 2>Many thanks to our guests Andrea Rubin and Monica Gerrick.

0:27:10.720 --> 0:27:14.240
<v Speaker 2>Playing God is a co-production of Pushkin Industries and

0:27:14.280 --> 0:27:19.119
<v Speaker 2>the Johns Hopkins Berman Institute of Bioethics. Emily Bourne is

0:27:19.160 --> 0:27:23.120
<v Speaker 2>our lead producer. This episode was also produced by Sophie

0:27:23.160 --> 0:27:27.320
<v Speaker 2>Crane and Lucy Sullivan. Our editors are Karen Schakerjee and

0:27:27.440 --> 0:27:33.560
<v Speaker 2>Kate Parkinson Morgan. Music and mixing by Echo Mountain. Engineering

0:27:33.600 --> 0:27:38.560
<v Speaker 2>support from Sarah Bruguer and Amanda Kaiwan. Show art by

0:27:38.560 --> 0:27:43.600
<v Speaker 2>Sean Krney. Fact checking by David Jar and Arthur Gompertz.

0:27:44.359 --> 0:27:48.640
<v Speaker 2>Our executive producer is Justine Lang. At the Johns Hopkins

0:27:48.680 --> 0:27:52.600
<v Speaker 2>Berman Institute of Bioethics, our executive producers are Jeffrey Kahn

0:27:52.720 --> 0:27:57.200
<v Speaker 2>and Anna Mastriani, working with Amelia Hood. Funding provided

0:27:57.280 --> 0:28:02.280
<v Speaker 2>by the Greenwall Foundation. Special thanks to Tammy Coffee. I'm

0:28:02.359 --> 0:28:05.280
<v Speaker 2>Lauren Arora Hutchinson. Come back next week for more

0:28:05.359 --> 0:28:16.760
<v Speaker 2>Playing God. If you're interested in learning more about these

0:28:16.800 --> 0:28:20.080
<v Speaker 2>stories and discussions, visit the Berman Institute's guide to the

0:28:20.080 --> 0:28:24.919
<v Speaker 2>podcast at Bioethics dot JHU dot edu, slash Playing God,

0:28:25.720 --> 0:28:28.760
<v Speaker 2>or find us on social media at Berman Institute