Bruce Holsinger, welcome back to Writer's Voice. Thank you. It's a pleasure to be with you again. So Culpability, this is just a terrific book. As I told you before we began, I couldn't put it down, literally. It is about the ethical dilemmas that AI poses for us. It begins with a horrifyingly ordinary tragedy, a family car crash that involves a fully autonomous vehicle. Why did you use a car accident with an autonomous vehicle as a lens for exploring the moral dilemmas of AI, of which there are legion? Yeah, so I began thinking about this as not connected to AI at all. It really was just a family getting through a car accident. I was thinking after my last novel, which was about climate change, it was called The Displacements, and that was a family going through this massive national catastrophe involving the world's first Category 6 hurricane, as the novel imagined it. In this case, I was thinking, okay, well, a family going through another kind of calamity, a smaller-scale one, a kind of everyday accident on a highway. And I was thinking, in the driver's seat is the oldest child in this family. He's 17. His name is Charlie. And I just thought, okay, what if everybody in that family in some way felt themselves responsible for this accident, so that we're all culpable in some ways? And then it was really only after I was in the middle of a first draft that I was thinking of introducing this element of a self-driving or autonomous car. And then when I finished the first draft, it was the fall of 2022. And suddenly, ChatGPT exploded into our consciousness. Everyone in the world was talking about AI, and I finally understood what this novel was really about. It's not so much a novel about AI; it's about a family going through a really hard incident in their lives. But what the introduction of artificial intelligence did is it kind of thickened that story, made it more contemporary, made it more urgent in some ways.
And so the AI kind of layered itself in as I was going along. Well, you say it's a typical family, but it has at least one member who is not typical at all. The mother, Lorelei Shaw, is a recipient of a MacArthur Genius Award, she's a philosopher of ethics, and she's the author of a book within the book, Silicon Souls: On the Culpability of Artificial Minds. So tell us about Lorelei Shaw. Yeah, so Lorelei is a world-leading figure in the field of ethical artificial intelligence, and the book begins with her words from that book within the book whose title you just gave to your listeners. Writing her character, I suppose, as I was going through the process of writing the novel, she became a kind of obsession for me in some ways, just as artificial intelligence is an obsession for her. We hear the story mostly through Noah's first-person voice. He's her husband, he's the dad of their kids. My last three novels, I guess my last four novels, have all been from multiple points of view, mostly in third person, and that's really the way I usually write. This time I'm focusing in on one perspective and letting him tell that story. He's not an unreliable narrator, but he is a bit of a clueless narrator, especially when it comes to Lorelei. She's unfathomable to him in many ways, the way her mind works, what she does with her days, her job. There's almost a way that he purposefully doesn't let himself know what she's up to, and that plays very strongly into the plot later in the book. But once I understood what the novel was about, I created Lorelei to embody some of the anxieties about AI that our society is grappling with. And she kind of crystallizes all those questions in her personality, in her mind, in her writing. And she crystallizes them by focusing on the ethical issue.
I mean, I think there are other issues with AI that we can think about, not least of which is our outsourcing of our creativity and our intelligence to it. I have even noticed a perceptible loss of creativity when I've used AI, and so I try to use it as judiciously as I can. You really focus in on the moral issue here, and it's one that is built around a very classic ethical problem, the trolley problem, which appears throughout the book, both explicitly and metaphorically. What is the trolley problem, and why did you choose to anchor the story in that classic moral dilemma? The trolley problem was developed by a British philosopher named Philippa Foot, and there are different forms of it, but the classic one is: let's say you're standing at a switch alongside the track of a trolley coming along, and if the trolley keeps going on its current track, it will kill three people who are bound to the track a little further down. If you pull the lever and put the trolley on another track, it will kill one person who is bound to that track further down. The idea is that if you do nothing, three people will die. If you do something, one person will die. What do you do? Do you pull that lever so that you're responsible for one person's death, or do you throw up your hands and think of yourself as helpless? You didn't do anything, and three people died. It wasn't originally designed with self-driving cars in mind, of course; it way predated them. But it does make us think about the morality of ourselves in relationship to machines and how we intervene in the everyday kinds of dilemmas that machines are involved in. And so you think of AI systems, the trillions of algorithmic decisions that a really complicated system like a self-driving car might be making.
The trolley problem is much too simple to really capture the everyday dilemmas that those systems are dealing with, but it does illustrate the kinds of moral problems that people in that world, people who are programming algorithms, people who are inventing algorithms and thinking about the technologies of these kinds of machines, have to think about every day. So I thought it was a nice way to illustrate the dilemmas that Lorelei is involved in and the central dilemma that defines what happens to the family in the opening chapter of the book. Just to generalize it, are you saying that autonomous vehicles have to make the kinds of ethical decisions that are implied in the trolley problem? I mean, could we see them making choices between killing people outside their area of concern versus the people who are, you know, their owners, so to speak, or applying other kinds of ethical calculus, like sacrificing the owner because there's only one owner and there may be ten people in the road? Is that what you're saying? No, not exactly, because I think choices and decisions are very human kinds of words. And in the technology of self-driving cars, as I understand it, and believe me, I'm no expert here, but from the research that I did and the people I interviewed: it's machine learning. The machines are learning from their own mistakes, and the algorithms are designed to make them better and better, to make smarter and smarter responses to the programs that are controlling them, right? And to make the process of being in one and being around one safer. And that represents the kind of dilemma that these machines embody.
You also bring up some other dilemmas, things that are very pressing, I think, for us, as we see the incredible development of drone warfare along with the increasing sophistication of AI, you know, the whole notion of war machines, basically, that will be autonomous, autonomous warfare done by robots, where these ethical questions really do come up. How do you think about the issue of drones or other kinds of autonomous weapons? Well, I think it's terrifying. I mean, what else could I say? We're at this point where these kinds of drone swarms that I imagine in the novel, and it only comes up a couple of times, I don't want to give spoilers about where it's involved in the plot, you know, those are just over the horizon if they're not already on the horizon. And, you know, there are agencies working on them in many different parts of different governments. And one of the dilemmas in the book is, well, what if these drone swarms are more accurate in distinguishing combatants from civilians? Couldn't they be more moral than a human fighting force? Aren't there aspects of these where the morality isn't black and white, where they're not just pure evil? And I suppose that's one of the things that Lorelei wrestles with. That's one of the things the industry is wrestling with. The morality of AI is not clear cut, and maybe warfare will be one of the places that bears that out. Well, one of the issues is that we don't really know. You speak in the novel of the black box of AI. And I was reading an article in, I forget which national newspaper, yesterday about the fact that we really don't know why AI does, if you can call it that, make decisions, or how it will make decisions. In other words, AI is a black box to us. It could be a completely alien intelligence. Could it develop a completely alien ethical system? Well, of course. Yeah. And I think one thing that's alien about it is its indifference.
You know, for all that we talk about the ethics of AI, the morality of AI, the systems themselves, and you say "it," but it's "them," and it's them by the hundreds of thousands now. And they're all different. They're all learning in different ways from text, from image, from video, from our language, from the way we move in the world. They're all learning different things about us all the time, but they are completely indifferent to our fates. And to me, that seems one of their superpowers. It's often not recognized as such, but one of the most unsettling things about artificial intelligence systems is their lack of a moral framework. Even if one is programmed into the algorithms, there will always be what experts in the field call the alignment problem, where they may elude our requests of them. They may misunderstand them. They may solve a problem in a way that we hadn't understood they could. And so that's one of the ways that I think people who work in these systems imagine them getting out of control. But Culpability, it's really not a novel that's just about AI. It's also about a family, a family facing an everyday tragedy and how to get through it. So I really wanted the artificial intelligence to be secondary, just a dilemma that they were wrestling with as they go about their lives, as they're thinking about how they relate to each other, how the kids relate to each other, how the parents relate to their kids, and all the kind of everyday dilemmas that families are facing. Yes, but you actually made these questions about AI personal. I mean, this is what your novel so brilliantly does. You make it something that forces the reader to think, okay, well, what would I do in that situation, or how do I respond? One of the novel's most memorable refrains is, a family is like an algorithm. I wanted to ask you, what did you mean by that, and what is the import of it? Well, it's not my position that a family is like an algorithm.
It comes from the mouth of Lorelei, one of the characters. In the prologue, Noah is remembering this moment in a conversation when he asked, we're so different, how do we do it? What's our secret sauce? How do we make our marriage work, our family work? And Lorelei says, well, a family is like an algorithm. There are inputs and outputs, and as long as everything's going along smoothly, the machine of the family is going to work as it should. And of course, Noah is very skeptical of that. He's not dismissive, but he's like, oh, I guess that's her way of seeing the world. And the story told in Culpability, I think, defeats that idea. And so maybe one of Lorelei's arcs is she has to learn that things are messy. A family is messier than that, and it's not an algorithm. So that's one of her, I guess you could call it, learning curves in some ways. And another theme that runs through the novel is parenthood as both love and terror for the safety of one's children. And of course, the desire for safety is what's behind the notion of autonomous vehicles, you know, the vehicle at the center of this novel. And yet, I'm totally terrified by the idea of an autonomous vehicle for myself or my children and grandchildren. So how did this theme of parenthood as both love and terror shape the book's moral landscape? Yeah, that's a great question. Well, that's how I think a lot of parents experience it. You know, on the one hand, you have this boundless love for your kids, you would do anything for them, you would do anything to help them get through our world. On the other, you know, every time they leave the house, and every time they're away from you, you're worried about getting a call, you know, maybe especially as they get older. And that gets really scary. You know, to go back to your point about autonomous vehicles, let me ask you this. When's the last time you were in an Uber? Have you been in an Uber before? Never.
Never. Okay. Oh, so you haven't even been in an Uber or Lyft. Yet my son is an Uber driver. Oh, okay. Well, you should ask him this. Do you think he looks at his phone while he's driving to get his next fare? I actually don't think he does. That's interesting. He would be the only Uber driver that I would ever hear about who doesn't. Because every time I've been in an Uber or Lyft, you know, the driver has, you know, spent time on their phone to get their next fare. And I think that, you know, the job of a self-driving taxi, like a Waymo in San Francisco or Los Angeles, is to get you to your destination safely, not to find its next fare. And there have been studies of passenger hours in autonomous vehicles versus human-driven vehicles. And the safety incidents, there's just no comparison. So I, you know, I think autonomy is going to be more and more a part of how cars are driven, how trucks are driven. And I know that, you know, the inhuman intelligence makes it scary, but I don't know if it makes it less safe. Right. And I'm not convinced that it does either. And the only reason I think that about my son is that he has really lit into me when I've even moved my hand towards my phone while I'm driving. Ah, good for him. So I think he's pretty careful about that. But yes, I take your point. I want to go back, though, to this issue of the indifference of AI, because I do have to say that it was the AI implications, maybe because I'm thinking about them so much, that really gripped me. And Lorelei, who's such a compelling character, asks whether we can train machines to be good in the same ways we train ourselves. And of course, we train ourselves with the idea of consequences. I mean, that's supposed to be parenting 101, consequences, not abuse, but consequences. And yet, if these machines are indifferent, you do pose this issue. How do we train them to be good? I don't know. I mean, it's not my field. I don't really know how.
I don't know the technicalities. But Lorelei has devoted her life to this. She's got a PhD in ethical philosophy, but she also has a PhD in computational engineering. And so she has figured out all these ways to try to make them respond to our ethical systems. And you brought up that metaphor of the black box, which people in AI use to describe the unfathomability of these systems. And that's part of it. Even Lorelei doesn't quite understand how all these systems work, the systems she herself is working with. So that's another thing that makes them very uncanny, I think, for us to think about. Yeah. And there's also an interesting issue of how we make our own moral choices, you know, without giving anything away. The drive to create things for the good of humanity leading to quite the opposite is definitely a theme that is explored to a certain degree in this book. How did you wrestle with that? And also, may I ask what may be some of your outside influences in thinking about it, or sources, when you did the research for this book? Yeah, so I read a lot of scholarship in the ethics of artificial intelligence. I interviewed people in the field. I had, you know, a lot of exchanges with people who are at the edge of those kinds of questions, and I was also just absorbing a lot of the philosophical language about algorithmic injustice, about machine learning and some of its pitfalls, some of its moral pitfalls, and also just trying to bring it back to the level of story. So it's not just about the tech; it's about how it affects people and how people reflect on it. And maybe this is another way of answering the questions you're asking: if we think of artificial intelligence as holding up a moral mirror to humanity, how are we looking in that mirror, and what do we see when the figure in the mirror looks back at us? Are we, you know, inflecting our moral choices through these machines, in spite of these machines, with the help of these machines?
And what is that going to look like in the years ahead? You know, how is it reshaping us as a species? And it's, you know, a subset of technology more generally, you know, people being glued to their phones and so on. But I think when it comes to AI, the questions are very unsettling in the kinds of dilemmas they confront us with, about autonomy, about interpersonal relationships, about therapy, and so on; you know, the list goes on. Yeah, could you say a little bit more about those? And what are some of the greatest fears that you may have? I'm not so much worried about the robots coming for us. When it comes to AI, I'm much more worried about disinformation, deepfakes, and so on. I'm also really worried about the environmental devastation that the massive numbers of data centers are leading to. You know, there's one recent book I read by Karen Hao, H-A-O, called Empire of AI. It's about OpenAI. It's about data centers in Chile. And by one estimate, this data center that's about to open, that's being developed by Google AI in a village in Chile, is going to demand as much water as is used by the village in a whole year. So where are they going to get their fresh water? So it's those kinds of dilemmas, the way they're intertwining themselves into all aspects of our lives, that I think can be really unsettling. And of course, the other issue is that they are creatures of us, and therefore they reflect our own biases and prejudices and lack of concern. I mean, one could see an AI developed by ecologists as being perhaps quite different from one developed by profit-driven tech billionaires. Right. Or a chatbot like the one the middle daughter Alice talks to in the novel. She develops a chatbot named Blair, and Blair becomes her best friend. And then Alice has these very dark interactions with Blair over the course of the book.
I think we see nine or ten of them that I wrote. And you're never inside Alice's point of view, except in those exchanges. And I wrote that part of the book for a few different reasons. One is those exchanges give a real spine of suspense to the book about what happened in the car that day. But it's also just a demonstration of one of the big dilemmas of AI right now, where increasing numbers of young people are having these interactions with these chatbots. Again, these inhuman chatbots, but they humanize them and turn them into human beings to serve their own needs. And that's a really scary aspect of the implementation of these technologies right now. I read yesterday, and don't quote me on the actual percent, but it could have been close to 37% of young people, and I don't remember what the range of ages was, have actually had a romantic relationship with a chatbot. Yes. Yeah. Or if not a romantic relationship, a personal relationship, I think that was the statistic, that they have somehow interacted with it as if it's a human being, or a therapist, or a companion, or a friend, something like that. And so what does that do to our relationships with actual human beings? Well, who knows? I mean, I would be interested in hearing what you think, and your own experience with it. In the novel, again, Culpability, I was really trying to think about what it does to Alice. Alice is a kid, she's a middle child, her other siblings are very charismatic, and they have friends to burn. But Alice is struggling, and Blair, her chatbot, is her only friend. And that would worry her parents, but, well, they don't even know about it. I think that's one of the scary things: it gives kids, gives young people, yet another thing to keep secret from their parents, these friendships that often their parents aren't even aware of. Yes.
I mean, I think, clearly, it's what comes unexpectedly from our actual friends that forces us to deal with making moral choices and growing. And I'm not convinced that a chatbot is able to do that. The chatbot in the book is a little bit more confrontational, but I know that ChatGPT, for example, has been created in order to, you know, I mean, the word that comes to mind is suck up to you, you know, to flatter you. Yeah, they're sycophants, right? I think that sycophancy is one thing that was trained into the early models. And so it's got this kind of fake emotional content where it's saying, that's a wonderful question. You know, let me explore that with you. It's such a smart way of putting it, that kind of language. And that's creepy too. Absolutely. So my final question is, Lorelei says, and this really kind of brought me up short, she says, we shouldn't make machines to be good for us, but to help us be better ourselves. So say more about that. How, I mean, could machines help us be better ourselves? And I'm not just talking about the obvious; they could perhaps correctly diagnose certain conditions from symptoms better than doctors, because they may be aware of more. Although I'm not even convinced of that, but I know that there are some, Dr. Eric Topol is a big booster of AI in diagnosis. But I think she's talking about the moral universe there. Yeah. I guess the idea is she doesn't, and again, this is Lorelei, this is not me. She's a character, not a real person. And I'm always reminding people of that. This is someone who has a viewpoint within a novel, which is a fictional creation. And she doesn't want, and a lot of people in the AI ethics space feel this way, she doesn't want machines to be a kind of moral prosthetic. In other words, she doesn't want us to give machines the responsibility to make the world a better place. She wants us to do that.
And she wants the machines to be our allies in some ways, to the extent possible. And it's a model of morality that resonates a little bit with the title of the novel, Culpability, that this is a general age of responsibility that we're in. And it's up to us, as she says in some of the closing words of the book, it's going to be up to us whether they make us better. We shouldn't assign that only to them; these machines are products of our creation. We have to destroy them if that's what's necessary. We can't think of ourselves as not their equals. And that'll be a crucial part of, you know, fighting the moral battles ahead. The responsibility is ours, or the culpability will be. Exactly, exactly. That's a nice way of putting it. Well, it's just a terrific novel. And I love the way fiction allows the reader to explore something both intellectually and emotionally. That is the real gift of it. Thank you, Bruce Holsinger, for writing this wonderful book and speaking with us today. Thank you so much. It's been a pleasure. Elizabeth George, welcome to Writer's Voice. Thank you. It's great to be with you. I have to ask you first about the title of this novel, A Slowly Dying Cause. It was a totally, you know, as they call it, propulsive read. But I couldn't figure out the title. Sure. If you think about the way the book ends, there's a moment where Lynley is having a conversation with Daidre Trahair, the woman that he is in love with. And in the conversation, he is reminded of a line from King Lear, which is, I have full cause of weeping. And so the whole book is about grief, and letting go of grief, and the various ways that grief appears in one's life, and the people who are able to let go of it, and the people who are not able to let go of it.
So the slowly dying cause is the ability to let go of the cause of the grief, and in doing so, letting go of the grief itself. And now you should stop me if I get into dangerous territory of any kind of spoilers. Okay. This is an Inspector Lynley novel, and actually, Inspector Lynley comes in fairly late in the book, interestingly enough. Barbara Havers, his sidekick and, you know, partner, basically, she comes in a little bit earlier. Why so late? Because this is a very carefully constructed book. The tricky bit in doing a novel like this, especially when it does not take place in London, and this is actually my third book set in Cornwall, the difficult part is to figure out how I'm going to involve my detectives from the Metropolitan Police in a crime that is not within the greater metropolis of London. And so in this particular case, what I decided to do was to start with the detective inspector who was in a previous book, Careless in Red. The detective inspector, Beatrice Hannaford, is the person who is assigned the case. She belongs to an MIT team, and that's a major incident team. And the major incident teams actually are the people who solve crimes. And so the issue that I had is, if Lynley's going to be involved in this crime, and if Havers is going to be involved, how am I going to involve them? And the involvement came from the need for Lynley to go to Cornwall, because the family home, or as they call it, the family pile, has a problem with its roof, which has been leaking for quite some time into the gallery. And so he has to go down and deal with this damage, because it's going to cost a small fortune. He lives in a Grade II listed property, which means that if it's going to be repaired, it has to be repaired with materials that come from that same period. And so this is a huge problem, a huge project. He goes, and then in the meantime, Barbara has to go.
Well, Barbara doesn't have to go, but Lynley tells her she's going to go, because she has been essentially thrown out of work by their superior officer, Isabelle Ardery. So once that happens, that gets them down to Cornwall. Havers is also sort of involved early on, tangentially, because Daidre Trahair, who is also from Cornwall, has had an incident that involves her family, her birth family. And she's called Barbara just to get information initially. And then ultimately, that brings Barbara and Lynley to Cornwall, and the involvement of Daidre Trahair's family ultimately gets Lynley involved in this crime as well. So it's pretty complicated, as it usually is, to get my detectives involved in the situation that's ongoing. Yeah, very intricately plotted. It seemed to me that there were all these disparate threads in the novel that then progressively get pulled together. But before we go further, for those of our listeners who may not be familiar with the aristocratic cop, Tommy Lynley, and Barbara Havers, who is a totally working-class girl, talk about these two characters that you have followed for so many years. Sure, sure. Well, Lynley is from the aristocratic class. He's sort of like what they would call upper-class out of sight, if you were looking at the essay called U and Non-U, which was written by one of the Mitford sisters. It's quite amusing to read. But anyway, he's an aristo, and he has inherited a title. The title is the Earl of Asherton, and that title has been in his family for about 250 years, or going on 300 years. Now, he is somebody who was never really particularly interested in being an aristocrat. Although he does his duty and he recognizes what his duty is, he also wanted to live a life that he sees as perhaps more useful than being somebody who is responsible for this huge property, which indeed he is. So he has gone to both Eton and Oxford and has gone into police work from there.
He has a sister, Judith, who is also in this book, and his mother, who's widowed, and she is also in this book. And he has an estate manager who takes care of the property in his absence as well. Then Barbara Havers is his polar opposite. She's from a working-class family. She went to a comprehensive, which is like what we would think of as a high school, but did not further her education after that. She has gone into police work after that. So she's probably had a longer period of time in police work than Lynley himself has. She really has a chip on her shoulder, first of all, as a woman in police work, and secondly, as a working-class woman who's assigned to work with this aristocrat. And she has managed to cope with this throughout by sort of making light of the fact that they are so opposite. But the fact that they're opposite is what has made them, for me, such interesting characters to deal with. Yeah, say more about that, because they're quite fond of each other. Yeah. And that was one of the things that I wanted to do in this book, in the series. Well, first of all, I wanted to write a series. I set off to do that right from the very beginning. What I wanted to explore is not only the class differences, but the idea that a man and woman could work together, despite their differences, and also, despite their differences, could come to love each other dearly, but not to have that be a sexual love. Because the very idea that, you know, Lynley and Havers would end up in bed together is, for people who've read the books, just ridiculous. I mean, that would never happen. And neither of them would ever want it to happen. So they're not, you know, they're not attracted to each other, but they have each other's back to the extreme and would go to the death for each other. And that was what I wanted this relationship to be like. Elizabeth George, let's talk about the main plot.
We have a story of a relationship between an older man and a much younger woman, who was a girl, and he still a teenager, when he first meets her. Talk about the broad outlines of this plot just to orient us, and what were the issues that you were trying to frame in this novel? I wanted to do a dual narrative where we are seeing the background of the man who dies, and that's not giving anything away because he dies in the first scene of the book. So it's okay to say, I hope. But he is what's called a tin streamer. And a tin streamer is a person who has been mining tin for generations in this particular spot in Cornwall. And the particular spot where he's mining tin is exactly the place where somebody was mining tin until just very recently. And to mine tin, you have to break apart a stone called cassiterite, because tin does not exist on its own other than inside this stone. And so it has to be not only mined, you have to mine for that stone, but then you have to break apart the stone and from that extract the tin. So this particular man, Michael, is telling his story, and he's telling it for two reasons. One is to kind of work out what is going on in his life at the moment that he's telling the story. And so he's looking back on how he came to be in the position he is in right now with this woman who is 23 years younger than he is, with his suspicions about what's really going on with her and another character, and with the idea hanging over his head that there is a mining company called Cornwall Eco Mining that wants his property because they want lithium. And lithium is huge, of course, because it has everything to do with batteries. And they have come up with a way to extract lithium that doesn't harm the environment. Really? Yeah. I wondered about that, because a lot of the characters were as skeptical as I was of that. Oh, yeah. This is brand new.
And this is part of the serendipity of writing these novels: I find out things while I'm working on the book, or when I'm in Cornwall, in the case of this book, doing research, and something comes to my attention that is totally unexpected. And what came to my attention was the fact that there was this company who had come up with a way to extract lithium from brine, from saltwater, which is how it's found. Right now, it's coming out of South America, but it is hugely harmful to the environment, because the way they do it is they have these huge, football-field-sized ponds, and the ponds have to evaporate. And in the evaporation process, what's left is the lithium. But this company has come up with a way to remove the water, the saltwater, and not ocean water, because it has to be thermal water. So it has to be hot water. This hot water exists beneath the granite in the substratum. And they've come up with a way to bore down to the thermal pools, to pump the water out of the pools, take it to a processing plant, extract the lithium, and put the water back into the place where they got it. So it's this amazing process. And that's what is sort of the background of this book: there's this company who wants the land that belongs to Michael, the narrator. And they want it because they want to build their processing plant on his land. And the reason that they want it there and not someplace else is that he has buildings that already exist, and they can build on the footprint of those buildings. Otherwise, they would have to go through a huge process to get permission to build anywhere in Cornwall. So it's kind of this complicated political situation that I think is fairly clear to the readers. It sounds wildly complicated when I explain it, but that's really what's going on. And so the narrator, Michael, is sorting out his life to figure out, okay, how do we get where we are right now? It has to do with the lithium thing.
It has to do with his young wife. It has to do with his family. He's divorced. He's left his wife for this woman 23 years younger than he is. And he has two very angry children that he's left behind. His daughter is probably beside herself with anger. And it's really, really affected her entire life and her ability to trust anybody, especially to trust any man. And it has, obviously, affected his ex-wife as well. So that's sort of the background of what's really going on. One of the most interesting things for me was, you know, I think this is the first detective novel I've read where the guy who's the murder victim is a character that is telling his story throughout. And I was thinking along the way that he was an unreliable narrator. Great. It turns out not so much. Well, not so much, except he is totally, I mean, don't you think he's pretty blind to the wonderfulness of his young wife? Of course. No, that's not to say that he was filled with delusions. But I was expecting one thing and got something totally different. Oh, okay. Yeah, I had a lot of fun with him, and especially with creating his voice, you know, because it really needed to be the voice of a guy who believes totally the wrong thing and is just, you know, passionately in love. And, you know, talk about love is blind. He really sort of exemplifies that. And Elizabeth George, I just have to ask you what your process is with so many characters and really the need to keep the suspense up. This is a long book. How do you keep things straight? And how do you develop the characters? It seems to me you have to kind of drop the development in, drop things in, and pull them together, as I mentioned before. It really takes a very organized mind, I would think.
Well, part of it is the process of creating characters in advance, so that I know enough about them that when they make an appearance in the novel, I know who these people are, and I know what their life experiences are, how they react to their life experiences, and what their psychopathology is. So I know that going into it, but I don't always know exactly how they're going to react to certain situations. So, in a certain respect, the characters actually tell me how they're going to react, what they're going to do as a result of something that has happened to them, or something that they've discovered. But additionally, keeping it all straight is a matter of being super organized, because the books are complicated, and it's impossible to hold all of that in my mind as I continue to write the book. And so what I do is at the end of each chapter, which is really like a day in the life of all of these people, I take each character on a sheet of paper, like Lynley, Havers, Michael, you know, whatever it is, and then under each character, for each day, I have exactly what happened. So I would write, you know, Lynley finally sees Daidre on this particular day, or Lynley and Havers arrive at Howenstow, just what exactly happened on that day. And in particular, if there was anything that I really need to remember about that day. Then I take colored pencils, I have, I think, four different colors, and the colors represent different aspects of a crime novel. So orange might be a clue, pink might be triggers another scene, red might be, you know, something else. And so next to each item that I've written down, if there is anything, for example, that's a clue, then that will get that color in a little star next to it. If there's something that's going to trigger another scene, that will get the other color next to it.
Now, some things, of course, that are written down, as far as what happened on this day, are not going to trigger another scene; they're going to be something that deals with this particular character in their life on this day. It's not necessarily going to trigger another scene, but it's going to just more fully develop the character. Having said that, I will tell you that I had to do five drafts of this book to get everything right, because what would happen is I'd start out, I would be writing, and realize, oh my God, I can't do it this way, I'm going to have to start again. So I did that five times. And the book that I'm working on right now, I'm on the third draft, and I know that I have to start again and do a fourth draft. So that's just how it works sometimes. It's never just a straight-through process without doing any more drafts. I wish it were, but it's not. And is this one also a Lynley and Havers novel? Yes. Yeah, it is. Yeah. And this one that I'm working on now takes place in Sussex, in and around the town of Hastings. Well, I hope to be able to read that one and talk to you about that next time. One of the things that I love about your books is the way you build the environment, the detail of the place. I feel like I'm a fly on the wall. And I think it's probably one reason why these are long, because you really create an immersive environment for the reader. Nothing is skipped. Well, I'm sure some things are skipped. I bet you probably got rid of a lot of your darlings too, like every author does when they edit. But it really is immersive. So I wanted to actually ask you, what attracted you to this genre of fiction? You know, I sense a delight in you in painting this kind of immersive environment. So I guess I'm asking you about the emotion that you bring. You know, why did you start with this? And what is the emotion you bring to writing these kinds of novels? Now, are you talking about the crime novel itself?
Are you talking about the crime novel and the setting of the crime novel? Well, I'm talking about both. So what first drew you, what was so interesting to you about crime novels? And then what pushed you to create the kind of crime novel that goes into such detail about the character, but more than that, such detail about the ambience? Sure. To answer the question about why the crime novel, it's a real simple answer: because the crime novel provides the writer, me, with a through line. And that is the crime and its investigation. And the crime has to be solved by the end of the novel. That's the through line. And so what I realized early on is that it's like a skeleton. And on that skeleton, the writer can hang as much as the writer wants to hang. So on that skeleton, that through line heading toward the solution, you can explore character, you can explore theme, you can explore place, you can have subplots. So the crime novel can be one of two things. It can be a rich experience, sort of what I call the tapestry novel, or it can be just the crime and the characters marching toward the solution to the crime. It's purely up to the writer. Now, as a reader, I loved reading books that were larger than an Agatha Christie. No disrespect to Agatha Christie, she was brilliant. But Agatha Christie basically wrote mysteries. And the challenge for the reader was to figure out whodunit before it was revealed by Hercule Poirot or Miss Marple. So I wasn't interested in doing that kind of book. I was interested in doing a book where there was a lot going on, and where, for example, the crime fighters had significant others in their lives. They weren't individuals who just existed absent of family. So I wanted to look at that as well. And doing that meant I was able to provide a richer experience for the reader. Choosing England as a setting was really simple. I love England.
I've always loved England. It is one of the most fascinating places. I mean, one could go to England, one could live in England, and never see everything that's actually there. It is such a fascinating place. And, you know, it's existed, well, it's always existed, but England itself is a place that was occupied by the Romans. It was there during the Ice Age, you know; it's just got this amazing amount of teeming history going on there. And how do you imagine the place, though, back to this, which was so interesting to me. Do you see a place in such detail, or does it reveal itself slowly as you're writing? Because again, it's so immersive. Thank you. You know, the places that I write about are almost always real places. And so, for example, the tin streaming place where Michael has not only the tin streaming going on, but where he makes his jewelry as well, that was a real place. Now, what I added to that place was this cottage to which he brings his beloved, you know, sort of thinking that she's picturing the cottage in the country, and who wouldn't want a cottage in the country, especially in Cornwall? Only it's not quite what she's expecting it to be. But the place itself, all of that was there. And so I just added to it. And that's generally what happens: I will choose a real place and then I will add to that place. Sometimes, for example, if I know that a major crime is going to occur, I may change the name of the village. For example, when I used Bude in Cornwall, in the very north of Cornwall, and I was writing a book about the surfing community in Cornwall, I changed the name because of what was going to happen there. And so I just changed it to Casvelyn. But everything that I said about Casvelyn was actually there in Bude. I would love to pretend I have this great imagination.
This is why I could never write science fiction or fantasy, because I just don't have that kind of imagination where I can just dream up these places. So what I do is I let England become, for me, this place that suggests story. And when I go to England, that's what I'm looking for, places that suggest story. And when I saw the tin streaming place in Cornwall, well, that immediately suggested story to me, and I knew I'd be able to use it. Well, that's great. But I have to disagree with you. I think you have an incredibly fecund imagination, because it's not just about description. It's about all those little details of things that are happening in that place, or the things that people are doing, other characters may be doing, just the things that actually flesh out a real scene. That I think is just brilliant. And it's part of what makes it such a delight. And there's also a lot of lightness and humor in this book. Oh, there has to be, you know, I always have humor. Frequently, it's through the person of Barbara Havers, because she has such great attitude. She's so easy for me to write. And I would say that in all my books, no matter how dreadful the scenario might be, there will be moments of lightness. And in this one, of course, there's Barbara's attitude, but then there's also the fact that, you know, these are aristocrats, and none of them know how to cook. And they're in this position of having to cook their meals. And, you know, the roof is falling in and there's all kinds of stuff going on. And so that provides the lightness that I think crime novels also need, so that you're not always in this kind of doom and gloom situation. Well, Elizabeth George, I love talking with you about this novel, A Slowly Dying Cause. I look forward to talking with you about the next one, if that is in the cards. And thank you so much. Sure. Thank you so much, Francesca. It was really great.