Chasing Leviathan

On this episode of Chasing Leviathan, PJ and Dr. Carl Öhman discuss the implications of data privacy, especially after we die. Together they review the fragility of digital information, the ethical considerations surrounding data ownership, and the legal landscape of data privacy. Dr. Öhman also points to the cultural significance of digital remains, and the collective nature of data privacy in a technologically driven society.

Make sure to check out Dr. Öhman's book: The Afterlife of Data: What Happens to Your Information When You Die and Why You Should Care 👉 https://www.amazon.com/dp/0226828220

Check out our blog on www.candidgoatproductions.com

Who thinks that they can subdue Leviathan? Strength resides in its neck; dismay goes before it. When it rises up, the mighty are terrified. Nothing on earth is its equal. It is without fear. It looks down on all who are haughty; it is king over all who are proud. 

These words inspired PJ Wehry to create Chasing Leviathan. Chasing Leviathan was born out of two ideals: that truth is worth pursuing but will never be subjugated, and the discipline of listening is one of the most important habits anyone can develop. 

Every episode is a dialogue, a journey into the depths of a meaningful question explored through the lens of personal experience or professional expertise.

PJ (00:00.992)
Hello and welcome to Chasing Leviathan. I'm your host, PJ Wehry, and I'm here today with Dr. Carl Öhman, Assistant Professor in the Department of Government at Uppsala University. And we're here today to talk about his book, The Afterlife of Data: What Happens to Your Information When You Die and Why You Should Care. Dr. Öhman, wonderful to have you on today.

Carl (00:24.449)
Well, thank you so much.

PJ (00:26.57)
So tell me, Dr. Öhman, and I think there are some immediate answers that pop up in people's minds, but why did you write this book? Why is this book important?

Carl (00:38.227)
So, I mean, I could answer that question by saying I had already done the research for years, and, you know, once you've piled up a bunch of articles, you realize, I can actually put these together into a coherent narrative. That's one answer of why I wrote the book. But the other part of your question, why is it an important book? I think the answer to that question is that everybody's talking about tech all the time, you know, like AI,

the internet, digital stuff. It's all we tend to talk about these days. And I found that this issue of yes, but what happens to all of these data when we die and when the companies that own our data die? That is a completely neglected perspective.

from the debate. Nevertheless, this is, according to me at least, going to be one of the big battlegrounds for the dominion of the internet for the next hundred years or so. So I wrote this book to show the people who hear about my research topic and go, wow, that's such an interesting, quirky detail, what's gonna happen to our data when we die? And this book is kind of telling them: no, it's not a quirky detail. Like,

This is at the very center of the battles for the Internet's future.

PJ (01:59.062)
What's the current state of, well actually before we do that, what are some of the repercussions if we just kind of give this right away? If it's just like, just let the data live, like who cares once you're dead? What are some of the ways that it can be used or misused?

Carl (02:14.744)
Hmm. Yeah, I get this question sometimes. Like, okay, but why should I care? Like, I'm gonna be dead anyway, so like, what does it matter? And I tend to answer, well, maybe you don't care about what's gonna happen to you if you're not gonna be around, but you probably care about the people around you, right? And the thing is, your data is...

kind of like a network. I commonly compare it to DNA. So biological data and social data are similar in a lot of ways. And one of which is that if somebody has your DNA, or say they don't have your DNA, but they want to know your biological identity. Now let's say that they have your parents' DNA, or your grandparents and your siblings. Now they can draw a lot of inferences about you and who you are, on a biological level at least, from that information. The same goes

for digital social information: I may not have a data portfolio on you, but if I own your deceased parents' data, maybe your grandparents', I can build a network around you and draw a lot of inferences about who you are. So what I'm saying is basically that if you care about the privacy of those close to you, you should also care about what happens to your data post-mortem.

You should also, I think, care about the society that you're leaving behind. Your data may not be very valuable on a societal scale, but consider that you are one of two billion people who are gonna die in the next three decades alone. So you're part of a much bigger narrative.

And whoever ends up controlling your data will also control a piece of history. To illustrate this, I commonly point to the MeToo movement: hundreds of millions of women testifying about sexual harassment across the world. Of course that's going to be studied by future historians who are going to want to find out what the development of 21st-century feminism was like.

Carl (04:25.563)
one person in the future is going to control that entire narrative. He's going to control every single tweet. And that person is Elon Musk, by the way. And I think that illustrates why, you know, if you don't do anything, what you're doing is essentially helping Musk and Zuckerberg build a monopoly over our digital past. And that is a very dangerous society to leave behind.

PJ (05:00.202)
You know, it's funny, I obviously thought about this before talking to you, but just to hear you say it out loud, that the future of the narrative is in Musk and Zuckerberg's hands. I mean, even if you just said the future of the narrative is in two guys' hands, I'd be like, that's problematic. Forget how you feel about Musk or Zuckerberg, right? It's like, this is so much power.

Carl (05:14.884)
Yeah.

Carl (05:19.931)
Hmm. Hmm. Yeah.

Yeah.

PJ (05:30.71)
And do you think part of the reason, because you kind of mentioned that, you know, it gets treated as a nice little side thing when people deal with this question, do you think that's because, I think it's just recently that the price of data has overtaken the price of oil, like in the last 10 years, do you think part of this is people aren't seeing what's coming? We haven't had as many people die yet with the data, so they don't realize we're about to get an avalanche, because all this technology is pretty recent.

Carl (05:46.171)
Mm-hmm. Mm-hmm. Mm-hmm.

PJ (06:00.02)
Is that part of why the lack of understanding is there?

Carl (06:03.949)
Yeah, I've got to comment on your "data is the new oil" connection, because I actually had a chapter called "Data Is the New Oil" that I erased kind of last thing. I took it out and thought, I'm probably going to use it for some other paper. But this is a bit of a parenthesis. My idea was that, well, data is literally the new oil, in the sense that, what is oil? Well, it's accumulated fossils, right? So it's dead creatures,

PJ (06:32.202)
Mmm.

Carl (06:33.786)
compiled over time, and I'm like, well, that's data. All data are digital human remains in a sense, or at least all social data, because it's an accumulated past, so to speak. But with that being said, yes, I think that none of the big tech companies think of themselves as stewards of history. They think of the emergence of the dead on the internet as an anomaly.

And what I'm trying to show in the book is that it's actually the other way around. It's the fact that the living are populating the internet that is the anomaly. Because the internet essentially is one giant archive. And archives are something that we built originally for the dead. So what has happened in the 21st century, what's special about the digital era, is that the living have now moved into the archives. And we surround ourselves with these archives

all the time. We carry our phones around, our cars are connected, our house alarms are connected, our fridges are connected. So we're literally living inside the archive.

And now we're beginning to realize that we're sharing this archive with the dead. And I'm like, yeah, that's what an archive is. It's a storage for the past and for the dead. So yes, I think you're right, people are only now beginning to wake up to this reality because it's becoming an economic reality. Like, okay, shit, Facebook now has what, a couple hundred million, probably, dead users on their servers, and they're waking up to:

those users aren't clicking on anything, they're not generating any traffic themselves, at least. Nevertheless, they take up server space, so what do we do with them?

PJ (08:31.028)
Yeah, and you know, you'd mentioned the archive side of things. Would another good analogy be when you go to the library? You expect most of the books to be by dead people, right? Like we understand that. The internet is so new, it's kind of like the invention of the printing press in some ways, there's this huge flood of books, and all these people are living, and all of a sudden it's like,

Carl (08:46.505)
Yeah. Yeah.

Carl (08:57.472)
Yeah.

PJ (09:00.038)
A hundred years later, I mean a couple hundred years from now, most of the information on the internet is going to be predominantly from dead people.

Carl (09:04.671)
Yeah.

Carl (09:09.345)
Yeah, and it's also a double movement, in the sense that we're producing more information that we're putting on the web.

But we're also uploading increasing amounts of previous information. You know, people are uploading information about their ancestors, old family albums, old texts, historical information. Increasingly, museums are digitizing their archives. National archives are digitizing their collections and putting them online. So it's also sort of growing backwards, and even those who lived

like hundreds of years ago are now online in a weird sense.

PJ (09:54.538)
And you've mentioned a little bit you've drawn connections between, for instance, fossils as kind of this archive, like that's compressed information in some ways that we're using and then something similar is going on with this archive on the internet.

Carl (10:05.051)
Mm

PJ (10:13.974)
How close is that equivalence? What are the main differences between what's happening in the physical world with these kinds of archives and the digital world? What are the key differences and how does that affect what we're talking about here?

Carl (10:30.427)
So I'd be happy to unpack the analogy between biological information and digital information and there's a lot to dig into there. But I think we can save that for maybe the next question or a couple of questions down. But I think you're touching upon something interesting in the analogies and disanalogies between physical archives and historical ways of preserving information and

the newer ways, digitized ways of preserving information. One of the key differences, which I think is really important, is the fragility of digital information. We tend to think about what we put online or what we put...

on our computers as kind of immortalized. And you see this language frequently. This entire, what I call the digital afterlife industry, is all about becoming immortal through the internet. It's a virtual space, cyberspace, which is somehow beyond the material world. And nothing could be further from the truth. In fact, even though it's cheap to store things online,

a single click of a button and it's all gone. Whereas if you have a library full of books, you can't simply click a button and it's all gone. You have to physically go and collect the books that you don't want anymore, you have to load them into a truck or something, you have to drive that truck somewhere, and you need a literally huge furnace to burn them.

And that's a lot of labor. And that labor prevents you from making hasty decisions in terms of what to destroy and what to preserve. We still have clay tablets from two or three thousand years ago. And for digital information to be preserved for

Carl (12:38.481)
such periods of time is almost unthinkable. Software changes, you know, the hardware changes. If you have a file on your local machine and you try to open it 15 years later, it's far from certain that you can even access that file. And that's 15 years. And then consider that clay tablet. That has existed for 3,000 years at least.

Now, why is this important? Well, it is important because the companies that are storing these data are also fragile, and we tend to forget that. Our entire system, well, the economic system for one, but also legally, how we regulate the internet, kind of assumes that the tech giants, Facebook, Google,

X, etc., are going to be around forever, but they're probably not. And the question is, what's going to happen to the data the day that they face bankruptcy? One pretty plausible scenario is that the data is just destroyed. That's certainly not impossible, and it would be a disaster from a historical point of view. That would be literally what archivists call

a digital dark age. The biggest archive of human behavior ever assembled in the history of our species. Basically the most detailed view of history that you could ever assemble. A gold mine for future generations to understand their past. All just completely gone, blown up in the air. And I'd love to talk... Yeah. Yeah. And some listeners are probably thinking, like,

PJ (14:18.356)
It'd be like the burning of the library of Alexandria. Yeah. Sorry. Go ahead. Yeah.

Carl (14:25.054)
Come on, what I put on Facebook is not the burning of the library of Alexandria. You know, it's pictures of funny cats and disinformation or whatnot. But we mustn't forget, even if you would say that what we put on social media is trash. Okay, sure, we'll call it trash, but guess what?

PJ (14:30.08)
Yeah

Carl (14:47.644)
archaeologists consider trash piles and sewers some of the most interesting sites for historical insights. Because it's not about the monuments and the aristocrats. It's about like what did ordinary people do? What were the masses doing? What were they thinking about? What kind of lives were they living? So yes, it may be mundane, but it says a lot about our time and our society and common people, quote unquote.

Now, let's say that one of these tech giants would fail and die, go out of business, and they wouldn't destroy the data. I think that is actually the even more interesting option because what's going to happen? Well, what's going to happen is essentially that a bankruptcy or an insolvency lawyer is going to come in.

and administer the auctioning of all the valuable assets. Now what are the valuable assets if X or Facebook or LinkedIn or whatever goes down? Well, it's the data. And that data can essentially be sold to anyone, to the highest bidder. So you're reacting to Musk and Zuckerberg monopolizing history. Okay, but...

make it Russia or China, or, you know, VKontakte, the Russian social media site, could easily just acquire all of that data. That's basically what happened to MySpace. That data has been passed around numerous times. So if you were on MySpace, your data is probably being used to draw inferences about your consumer behavior without your knowledge.

PJ (16:37.578)
And I'm suddenly grateful that I didn't do a lot on MySpace. You know, it's interesting, you might be the first person I've really talked to who doesn't hesitate at all to say X instead of saying Twitter. And I think that just really points to the, and I know that other people don't hesitate to say that, you're just the first person I've talked to. Like I still go, Twitter, I mean X, you know, like, what do you call it? X-ing, I don't know, you know, it's tweeting.

Carl (16:51.995)
Mmm.

Carl (17:03.953)
Yeah, yeah.

PJ (17:07.508)
That goes right to your point. You're like, Twitter's huge, it will always be around. And then in three, six months, all of a sudden it's something totally different. It's owned by a totally different person who has complete control over, well, at least I don't know what complete control is, I think we're gonna get into the legal side of it, I imagine. Right, like you said, because it's all new, it seems like they'll be around forever. And then all of a sudden you realize they're very, very fragile.

Carl (17:16.202)
Yeah.

Carl (17:34.933)
Yeah.

Yeah. And believe me, as a researcher, that contrast was stark. Twitter, when it was still Twitter, was very open-minded and supportive of researchers. In 2010, they donated their entire archive of tweets to the Library of Congress. And they were like, this needs to be researched, the public needs to gain access to their own digital public sphere.

At least retrospectively, at least as a historical asset. They had a really cool researcher API that meant researchers could gather the data they needed in a really neat way, a specific academic access. And as soon as Musk took over, he just scrapped that completely. He is not interested in extending any goodwill to academics.

PJ (18:39.104)
There are so many sarcastic comments about free speech that I want to make there, but we'll just let that lie. I thought he was doing this to save free speech. But anyways, okay, we'll just leave it there. What is the current state of this discussion, then, at the academic level, but also at the legal level? How is this handled?

Carl (18:42.524)
Yeah, yeah, yeah.

Carl (19:00.519)
Mm-hmm. Mm.

Carl (19:06.135)
So the legal level is a bit of a car crash. My expertise would be within Europe, so: European data privacy is governed by a European law called the GDPR, the General Data Protection Regulation, which is a very strong data protection law, probably the strongest, at least at a federal level, in the world.

PJ (19:34.099)
It's definitely better than the United States. Yes.

Carl (19:36.204)
Well, yeah. I don't know if you've heard this, but people in the business say the approach to technology is: in the US, it's innovate, don't regulate. In China, it's innovate, then regulate. In Europe, it's regulate, don't innovate.

PJ (20:01.106)
Hahaha!

I've never heard that one. That's good. Yeah, sorry, go ahead.

Carl (20:06.01)
It has some truth to it. Anyway, it's a data protection legislation that actually has teeth. But here's the interesting thing: it explicitly states that all of these rights cease at the moment of death. If you're dead, no data rights. Individual member countries could add clauses onto the GDPR that enshrine rights for the dead. And some have, in a limited sense,

but as a general matter, no. And that goes for most jurisdictions in the world that I'm aware of. If you're dead, at best you have a very, very weak sense of data privacy rights, which basically means that your bereaved family has some rights, legally speaking, to decide what's going to happen to your data.

But as a general matter, no: you're dead, you have no privacy rights. Which means that these companies can basically do whatever they want with your data. If they want to feed that into an AI and create an avatar that simulates your personality, yeah, they're free to do so. You have given it to them; they're literally the sole owners of that data. Of course they could do that. Which is really scary when you think about

how willingly people put their biological information online as well. I think Ancestry has like 26 million people's DNA on their servers. Okay. And people do that because they want to find out who their relatives are, which I'm sure is cool. But have you considered what happens the day Ancestry goes bust?

and they're selling out all of that DNA to the highest bidders. And, you know, I don't want to get super speculative, and, I should really stress, it's not a speculative book. It's a book that's really about what's happening now.

Carl (22:16.046)
Even so, just think about the fact that someone owns your DNA. In the future, let's say that they sell this to a biotech company that's like, you want to buy a clone of your great-grandfather? Yeah, we can do that. I don't want to live in that world.

PJ (22:36.564)
Yeah, when you mentioned the social and biological equivalence, I immediately thought of my mom, who was adopted and had never sought out her biological family. And this actually ends well, but you could immediately see it, someone actually used it to contact my mom.

Carl (22:48.096)
Mm

Carl (23:01.559)
Mmm, yeah.

PJ (23:02.388)
Like, she did not reach out. Her sister bought her the test. And then because of that, someone from her biological family reached out to her. Now, that's fine. They've actually gotten to know each other; it's been a really good thing. But when you think about it, witness protection, abusive family members. And then all of a sudden, you know, it doesn't even have to be you, and this is a large part of your point. It's like,

Carl (23:08.882)
Yeah.

Carl (23:12.338)
Yeah.

Carl (23:20.487)
Mm-hmm.

Carl (23:24.517)
Yeah.

Carl (23:31.333)
Yeah.

PJ (23:32.32)
I'm related to this person who has a different name. I wonder if they're near so-and-so, right? If you have an abusive father or an abusive uncle or aunt, you know, any of that stuff, they can use that. They don't even need your DNA. And that's not even after you're dead. So if that goes public after you're dead, then it's your kids, or your spouse, or, you know, whatever. Anyways.

Carl (23:34.62)
Yeah.

Carl (23:39.035)
Yeah.

Carl (23:48.349)
Yeah.

Carl (23:54.429)
Hmm.

Carl (24:00.607)
And I think this really, there's a philosophical angle here, which is that these are not just details.

This is about how we think about data and how we think about data privacy. Because our current paradigm of data privacy is that it's an individual right. It's my data, I have the right to decide what happens to it. And the GDPR is full of that. It's all about the right of the individual,

or the right to self-determination. And all of that's good, I'm not opposed to that, but it's only half the story. Because for me, data privacy is primarily a collective issue.

We can't single out individuals and say, you decide for yourself what's gonna happen to your data, because it's all networked. It's all intertwined, such that your decision about what happens to your data is also a decision about what happens to many, many other people's data. And I have a concrete example, apart from the speculative ones that we've discussed here. I write about

this case in Germany a couple of years ago: a 13-year-old girl died tragically. Her parents wanted access to her Facebook conversations, or actually to her entire Facebook account. And Facebook wouldn't give it to them. They were like, no, we've signed a deal where we're not gonna share her data with anyone, in life or in death.

Carl (25:43.453)
And they sued Facebook and went back and forth through German courts until eventually the Supreme Court in Germany was like, okay, we compare the status of her Facebook profile, or Facebook data, with physical letters, and you inherit physical letters, such that the family should inherit the account. And they opened the account.

And the research community was kind of like, hang on a minute, it's not only her data in these conversations. If you have a Messenger conversation with somebody, it reveals a whole lot about them too. So this is an example, I think Facebook has since provided a much better solution for this, but it's one of those examples of: if you inherit the chat log, you also get a lot of information about that other person.

PJ (26:20.629)
Right.

PJ (26:43.36)
Yeah, yeah. You've said data privacy quite a bit. Is that equivalent to data ownership or is there some important differences there? Because as we talk about even like inheritance and stuff like that, it seems like we're talking about possession as well.

Carl (26:59.403)
Mm-hmm.

Carl (27:03.56)
Yeah, there are two sides of this coin as well. Yes, we are talking about data ownership, but we're not only talking about data ownership. So, I was trained by a philosopher called Luciano Floridi. He's the father of the philosophy and ethics of information. I think he was named the most quoted, or the most cited, living philosopher.

PJ (27:07.807)
Yeah.

Carl (27:32.185)
So obviously, huge impact on this field.

But Luciano has this really cool way of thinking about the status of data. Normally we think about it as a possession, as something that we own, which would be to liken it to your car. And if somebody does something to your car, it can harm you in certain ways. It may be destruction of property. If someone steals it from you, it's theft, obviously. But...

Many, many times that kind of ownership lens is useful, and it provides valuable insights into how we should think about data, but it's not the whole story. And Luciano proposes that we should also think about data as something that we are. So rather than a car, our data is kind of like your hand. So what someone does to your data when they steal it is rather to be likened to

like abduction, or somebody illicitly touching or violating your physical integrity. And this may strike some people as counterintuitive: well, my data is not something I am, it's something that I own. But let me then suggest this analogy, which may make it a bit easier to comprehend.

Think about what you are. If you try to define what a person is, what are you as a person? What distinguishes you from every other thing in the universe? Okay, you can take two approaches, roughly. You can either say, which is very common, and I think the leading approach in philosophy, that what defines you is a narrative. So you are unique because you have a unique story that has only happened to you.

Carl (29:26.426)
What's a story? What's a narrative? Well, it's a sequence of information. So if you're saying, I am my story, personhood consists of narrative, well, then you're saying personhood consists of information. Another way of approaching this would be: well, I'm not only a narrative, I'm also a physical thing. I am a body. I'm a biological being. That's what defines me.

Okay, what is a biological being? What distinguishes your biological being from any other biological being? Well, if you zoom into your cells, you will find DNA. What's DNA? It's proteins coded...

as a chain of information. So however you want to define it, what you boil down to, the smallest common denominator, is that a person is her information. And when you think about it this way, you open up an entirely new ethical language for talking about what

is data privacy. So for Luciano, it's essentially about our integrity and our right to self-determination, of defining who we are, of not having others interfere with our literal being as persons. And I've developed that perspective through a series of articles and also in the book. And I've taken it one step further in saying that, okay, if our data is kind of like our

digital body, shouldn't then our data after death be akin to a biological corpse?

PJ (31:07.828)
Yeah, no, it's a great, I almost said great question. Wow. Great answer, it was a great, great answer. I definitely think, too, when you talk about that violation, I think people feel that with blackmail, there's that violation of personhood. And so, like, there's the

Carl (31:08.017)
So long answer to your question, but there's a lot to unpack there. Wait, it was. Thank you.

Carl (31:30.426)
Hmm, yeah.

PJ (31:35.488)
Theft of your car is one thing, and it's gonna affect your capacity, but the theft of personal letters, we can feel that's a different kind of violation. And just as I was listening to you talking about this, a big difference, and this is part of the complication, especially with the internet,

Carl (31:41.126)
Yeah.

PJ (31:59.454)
is that you can't copy a car, right? If you copy a car, it's just a different car. But information, someone can leave the information with you and take it with them. And on the internet, it's fragile, but it's very reproducible. How does that affect the discussion? The fact that it is so rapidly reproduced, I mean, one of the biggest things we talk about on the internet is something going viral.

Carl (32:02.158)
Hmm. Yeah.

Yeah.

Carl (32:23.204)
Mm-hmm. Yeah, I mean, in multiple different ways. Of course, part of it is that we rarely have an accurate image of what data we are actually sharing. So whenever I bring my topic up, people are like, but...

doesn't just erasing your Facebook account do that for you? Because they think that their digital footprint consists of what they consciously put on social media. And most people, or a lot of people at least these days, don't put that much stuff on social media. And I'm like, yeah, well, sure, that helps.

But you mustn't think that your data is just the data that you consciously put on there. If you're scrolling through a newspaper article and you stop at a paragraph, that's registered. All of your search data is registered. As soon as you take a step, if you own an iPhone, your iPhone registers that step.

Like every couple of minutes your phone is sending geo data. And you're like, well, who cares about a company knowing my whereabouts? It's not like anyone's gonna actually look at that. Well, first of all, that's wrong. That may certainly happen. But also, you know, there's what we talked about before, putting that data in connection with other people's data. So in...

PJ (33:52.074)
You

Carl (34:04.754)
my household, there are at least two phones, and a third phone coming frequently. And I'm sure you could tell, just from our geo data, that it's a married couple and probably one of that couple's siblings coming to babysit every now and then. And that's super simple, it's only geo data. That's all it is. And still you can tell:

Are they married? Are they cheating on one another? Who's staying there, and for how long? You don't really see this in everyday life unless you're in exceptional circumstances. I had this a couple of years ago: I was working at a mountain station in rural Norway.

And there are only a few guests coming there. I mean, it's a very, very remote place. And I realized that my People You May Know suggestions that came up on Facebook were the guests of the hotel. And I'm like, since the only people I'm close to are these people, that's what I'm getting. Like Facebook knows that I've probably met them.

And it's only because our phones were in a relative proximity.

PJ (35:36.672)
Yeah, you could literally walk up to someone you'd never met before and be like, hey, is your name this? Like that's, that's creepy.

Carl (35:41.025)
Yeah, yeah. But that's what I mean with, you mentioned data being so easy to copy. And I really want to stress that not only is it easy to copy, it's also so sticky. Everything we do, as soon as you move, generates information. And that's a very stark contrast to how things used to be. It used to be that

if something was really, really, really important, then we could spend resources on recording it. This person is so important that we're going to record what they did. We're even going to spend money on taking a photo of them. Whereas nowadays, it's almost the complete reverse. This is a point that Luciano makes in one of his books, that recording has now become the default mode of society. Everything is recorded

PJ (36:24.042)
Right.

Carl (36:38.4)
all the time. So history is not written through the question, is this important enough to preserve? It's rather written through selective destruction: are these data insignificant enough for us to destroy?

PJ (36:55.434)
Yeah. And there's the algorithm side of it too. Like there's the sticky side, and then there's the instant networking of it, right? This idea that, yes, you're here at this mountain resort and these other people are too, but the fact that it's immediately connected is also part of the discussion. Yeah. I had a journalist on from California, his name's Cyrus Farivar. And we talked about the work he did on license plate readers there.

Carl (37:03.989)
Mmm.

Carl (37:14.721)
Mm.

PJ (37:25.374)
So the police were using license plate readers. And this is what people just don't get. This blew my mind, but this is just the kind of stuff, this isn't even what you're putting on there. They weren't deleting the information as they got it, but they were constantly checking for warrants attached to license plates. And they kept all the license plates. So they'd drive down the highway and it would keep all this. Are you familiar with this story? So all the police officers

Carl (37:42.229)
Hmm. Hmm.

Carl (37:48.415)
No, no, no, no. It sounds crazy.

PJ (37:53.13)
had this, and if they didn't have to do anything else, it would record every single license plate they went by. It could do several hundred a second. And so it went to the police. They did not delete it. They had no policy in place for it. And it was considered to fall under the Freedom of Information Act. So Cyrus asked for it. They gave him six months of every single license plate and where it had been seen by police officers

Carl (37:53.817)
huh.

Carl (38:00.801)
Mmm.

Carl (38:08.569)
Mmm.

Carl (38:15.256)
Mm.

PJ (38:22.686)
in the entire state. And so he went to, yeah, he went to one of the people in the state government, and he's like, this is important. The guy's like, okay. So he showed him. He's like, hey, what's your license plate? He did it on the spot with him there, clicked on it, and it showed up on the map. He's like, so you come here to the Capitol building, you go here for work, and what's this place? The guy's like, that's my home.

Carl (38:25.465)
my god.

Carl (38:47.342)
Yeah.

Carl (38:51.374)
Yeah.

PJ (38:51.498)
This was publicly available if you requested it. They just handed it out. And so I don't know if they're deleting that now. I hope they are. But that's not even information that we know we're giving. Like you could turn off your location, presumably, right? But when everything's being recorded, the stuff with drones and stuff now, I mean, the fact that that was publicly given out, by the way, like if people knew that they could do that,

Carl (38:55.033)
Yeah.

Carl (39:01.784)
Yeah.

Carl (39:07.054)
Yeah.

PJ (39:19.922)
Stalkers and stuff like that. That's a whole other discussion, but the amount of information out there is astonishing. Similar with Meta, especially, with people going in with the Oculus headsets. I was talking to someone who was talking about them registering eye movements now, and they can use that to identify people, but they can also keep all of it, like where your eyes are looking the entire time. Which, anyways.

Carl (39:21.293)
Yeah. Yeah.

Carl (39:32.675)
Hmm.

Carl (39:36.163)
Hmm.

Carl (39:41.87)
Yeah.

It's exactly my point too: you're leaving behind a digital shadow that is just far, far larger than you may think. And it's impossible for a single individual to fully take control over their digital legacy. And you could argue, yeah, but it's anonymized. Like no one can actually zoom into that data point that is you and look at your eye movements.

That may be true with some pretty hefty modifications. Even so, like even if it's part of a collective legacy, we collectively should have a say in how it's used. Because it's our society, you know? It's about what we're leaving behind. And currently we're not thinking about the fact that when we use our devices, we're also building

an empire for these individuals and essentially helping them monopolize the past. And if we care at all about our descendants, we ought to worry about that.

PJ (40:58.536)
Even the people, you know, who will survive you. I mean, you don't have to have a huge generational view to be like, I love the people next to me, and some of them are going to be alive after I die. Hopefully, right?

Carl (41:02.693)
Yeah.

Carl (41:10.511)
Yeah. Yeah.

PJ (41:15.196)
I'm now feeling really weird about recording this conversation. But anyways, I'm sure that's probably not the first time someone has said this to you, but yes. So you use this model of, perhaps we should treat our data more like what we do with our corpse after we die. So can you spell that out in terms of a solution to this?

Carl (41:20.326)
Ha!

At least we're doing it willingly and consciously.

Carl (41:37.03)
Mm

PJ (41:43.99)
At least from what I was understanding from you, that's kind of your solution. What would that look like for our data? how does that affect, going back to your discussion about Facebook messaging, how does it affect other people's privacy who are, for instance, messaging back and forth? Let's say I passed away suddenly, how would that affect the people I'd been messaging with?

Carl (42:09.635)
So what this perspective does, rather than, I hesitate to call it a solution, because it's not like, here's how we're gonna solve this entire issue. That's part of the message of the book, that there's no solution, there's no way we can, you know, there's no fix to it. Because it's not a detail, it goes all the way to the root of

the economic and technical infrastructure of the internet. So like we can't solve this issue unless we solve the larger issues of data privacy, of the economic architecture of data management. So I just wanted to say that for the record, but yes, I think that thinking about

our digital remains as a kind of informational corpse is a useful perspective ethically, because it allows us to answer this question of why we should care. I mean, in what sense should we care? And the argument, simplified, is saying that, okay, if...

PJ (43:14.378)
Hmm. Yeah.

Carl (43:22.234)
if our data posthumously is to be likened to a digital corpse, it should have a similar ethical status as a human corpse. What is the ethical status of a human corpse? Well, there are a lot of harms that do not necessarily apply to a corpse. It can't be physically harmed. It can't feel pain and so on.

Nevertheless, there are many things that you can do to a corpse that we would consider gravely unethical. Why? Because the corpse is a thing, but it's not only a thing, it's a very specific category of things. It's still a human thing. Which means that even though it's dead, the corpse should be considered an end in itself, in the Kantian sense of that word.

So this lets me mobilize the concept of human dignity as the prime ethical concept that should guide our thinking over digital remains. And then you could, of course, answer, well, human dignity, like that could entail anything. Just look at different societies and how differently they look upon dignity. And I'm like, yeah, sure. Like there are very different cultural ways of expressing.

dignity. In some cultures, complete erasure, like a kind of digital cremation, would maybe be more dignified. In other cultures, that would be a crime towards the dead, erasing their memory. All right, sure, sure enough. But the ethical concept as such

is a very useful lens. Yes, there must be cultural expressions of that concept, but the concept in and of itself, I maintain, is very useful.

Carl (45:16.769)
And its most important usage, at least as far as my research goes, is when we couple it with the economic angle, with the political economy of these data. Actually, my first article on this topic is titled The Political Economy of Death in the Age of Information,

where I couple the concept of digital, sorry, of human dignity with the industry of commercializing digital remains. And I show in that article and also in the book how the dead in the internet economy are essentially turned into a commodity. They are reduced to...

not ends in themselves, but a means to another end, the end of profits. So they become kind of like a factory of attention. There are plenty of companies who have digital memorials: they have pictures and stuff of the dead person, and you can post things onto it, kind of like a deposit of their digital traces.

Facebook now has a memorial service where the profile is, upon request, turned into a memorial site, such that family members and friends can post stuff on it. Essentially what's happening there is that the company is taking the dead person's data and letting it drive the clicks of other users, because, you know, that's what drives the traffic. If you have

even more sophisticated technologies added on top of that. A number of companies by now, and for the past 10, almost 15 years, have been offering services where you upload all of your data, or data from someone whose data you have access to. And

Carl (47:25.324)
you use these data to train an AI, and that AI simulates the personality of the person. There's been a lot of ethical discussion about, like, can you do that? What's the ethics of doing that? And my contribution to that debate, with the human dignity angle, is saying,

Well, yes, but it's not only the technology that does this. We must not get stuck on looking at the technological systems. We must also look at the economic systems within which these technologies operate. Why? Well, because these are designed products. They're products that are built with a very specific purpose in mind. And that purpose is making money.

So it's not just any version of you, it's not just any simulation of your personality. It's a simulation of the version of you that is the most likely to generate clicks and drive interaction. So that's why I'm using this human dignity perspective because that's the only concept as far as I'm concerned that makes all of the ethics make sense.

PJ (48:35.854)
And so if I understand, and I just want to clarify here and make sure I'm on the same track, it's not that you're necessarily offering a solution. You're trying to raise the level of discussion. Like you can have disagreements about human dignity, but there's a big difference between talking about someone's corpse in terms of human dignity and talking about someone's car in terms of property law. And so, yeah.

Carl (48:55.106)
Yes. Yeah. Part of it also, I should probably have disclosed, is that we have legal systems in place for how to deal with the human dignity of corpses. That is something that we've settled and discussed for thousands of years. So one of the solutions that I point to in the book is archaeological museums.

They very commonly have human remains in their collections. They must run into the same issues, right? Of how can we commercialize, display biological human remains? So what I propose is seeking inspiration from the ethics of archaeology. And I think that that's a framework that makes a lot of sense. If we look at how that industry is regulated,

It's basically regulated by a common code of ethics, like a code of professional conduct within the museum industry: here are the ethical principles of what you can and cannot do with a human corpse in an exhibition. I think something similar could and should be done for the digital sphere as well. And it's

already starting to happen. Like ChatGPT-4, for instance, or sorry, OpenAI has as one of their ethical, I can't remember what they call it, rules.

Whatever they actually call it. One of their, yeah, ethical commitments is that you cannot let the bot impersonate a dead person without their consent. If, however, and there's an if, you display it publicly. Now for private use, like if you have an old spouse or something, you can just

PJ (50:43.836)
ethical commitments or something. Yeah.

Carl (51:07.62)
feed their data into the bot and turn them into an AI, which is a very creepy scenario if you imagine an abusive boyfriend or something who could create their own AI version of their ex-girlfriend.

PJ (51:24.31)
Yeah. And I think there's actually a Black Mirror episode exactly about that. Yes. There's something here too. I want to say, I'm really disappointed that when I asked for solutions, you're not providing me with, you know, the answer to the internet for now and all time. So sorry for putting you on the spot. I realize the way I said that, it was like asking for the solution for everything. No.

Carl (51:28.786)
Yeah, yeah, yeah.

Carl (51:51.014)
Yeah, just a parenthesis: when I talked to my editor about this, I came to the last chapter, and every chapter is, it's of course a more sophisticated argument, but even so it's kind of like the academic version of a drunken college student being like, it's about the system, man.

PJ (52:12.914)
Yeah

Yeah.

Carl (52:16.172)
You gotta realize, it's about the system. And I'm like, okay, I said this in all the chapters, like,

PJ (52:19.346)
Carl (52:22.292)
it's not about the ethical virtues of individual companies or individual industry leaders. It's about the framework in which they operate. And then I come to the last chapter and I'm like, okay, so I'm supposed to provide some kind of solution here. And I said to my editor, I can't write the last chapter as, and here's how we abolish capitalism. And she was like, yeah, although,

PJ (52:48.16)
Yeah.

Carl (52:52.563)
If you pull it off, that would make it a very popular book. And I was like, yeah, that's a big, big if.

PJ (53:00.19)
Yeah, yeah. One thing, and I think you've kind of hinted at it, but I also feel like beyond just raising the level of legal and philosophical discussion, there's a real angle here for a kind of common culture. What you are talking about here would be very valuable to infiltrate the moral

Carl (53:21.599)
Mmm.

PJ (53:28.466)
imagination and the moral feeling of the average everyday person. Because if people started thinking about their own data privacy like this, but especially other people's, because there are a lot of people who just wouldn't even think twice about kind of messing with someone else's data, right? And to instead think of it, especially with someone who's passed, as like, this is defiling a corpse.

Carl (53:39.179)
Mmm.

Carl (53:47.168)
Yeah.

PJ (53:55.816)
That would at least give people more pause. And I think that's a worthy solution besides just, you know, legal penalties. Like we understand there's legal penalties, but there's also cultural penalties if you're messing with a corpse, right? I mean, I don't want to get too graphic there, but that's kind of been ingrained in us ever since we've been growing up. And I think there's something valuable about the popular level. I mean, that's something like Black Mirror can do.

Carl (53:59.157)
Mmm.

Carl (54:06.195)
Hmm.

Yeah.

Carl (54:16.265)
Hmm.

PJ (54:22.55)
But even with what you're saying, there might be an educational component for the average common person. Do you see that?

Carl (54:30.498)
Yeah, no, I think that's a very good point. I actually hadn't fully considered it on an individual level, the way that you just put it. I think it's a very interesting and strong point in that there's something that sociologists call the privacy paradox, which is that in surveys, people always fill out that they care a lot about their privacy, but they don't act accordingly.

Like no one actually moves to preserve their privacy. But they say that they care a lot about it. And I think it would be interesting to see if like, is there a posthumous privacy paradox as well? In the sense that I think you're right in that it's deep in your gut that like messing with a dead person's dignity is just wrong. And I...

do think that, at a certain level at least, people may start caring more when they realize that this is also being done to literally defenseless data subjects. Like people who literally cannot remove their data or protest in any single way. I do write about an adjacent topic in the epilogue of the book,

where I write that, ironically, this may be one of the issues that are gonna cause us to wake up to how the big five are essentially dominating the public sphere. And I'm sure if you ask most people, they'd go, yeah, that's a very bad thing. And nevertheless, that's where they put their data.

But in the epilogue to the book I write that historically at least, having your ancestors buried in the soil of a certain land is like the strongest claim to that soil. Like whenever you have...

Carl (56:44.241)
disputes or conflicts over land, the strongest claim to that land is always, yes, but my ancestors are in the soil. And you can take pretty much any country, any dispute over soil; what it ultimately comes down to is, it's my fatherland. Why is it the fatherland? Because my fathers are buried here. Now

imagine something similar with big tech: in a couple of decades, it's not only going to be individual loved ones who lie buried there, but entire nations and cultures are going to have their collective past buried in these platforms as well. And yes, that may be a kind

of sales tactic from big tech, in that, you know, I may not care about Facebook's future, but if Facebook is where my entire culture and nation stores its past, yes, I do care about its longevity. And you can see this already now. Like when X threatened to remove all the inactive accounts, people went

They're like, no, this is like the last memory I have of my departed father. Like, you cannot do this. And twice they had to retract that decision. And I still don't think they've figured out what to do. But with that sense of caring for the longevity also comes a sense of entitlement. A sense of ownership that, well, this is mine,

to some degree, in the same sense that the soil where my ancestors lie buried is mine. So ironically, what may actually make us wake up to this very strange domination of our public sphere may be the dead, in that the dead become our allies for the democratization of the web.

PJ (58:56.618)
Dr. Öhman, it has been absolutely awesome having you on today. Fascinating topic, a real joy to talk to you. Thank you.

Carl (59:04.21)
Thank you so much.