Back in America

A man's house catches fire. He tells police he ran through the flames saving his belongings. Then detectives pull the data from the pacemaker in his chest. His own heartbeat tells a different story.

Andrew Guthrie Ferguson is a professor of law at George Washington University Law School, a former public defender, and the author of Your Data Will Be Used Against You: Policing in the Age of Self-Surveillance. In this conversation, he walks us through the criminal cases, the legal gaps, and the surveillance infrastructure that most Americans don't know they've already built around themselves. We talk about Google search histories used as confessions, smart home cameras that become prosecution witnesses, Palantir's expanding role in immigration enforcement, and what happens when the definition of "criminal" shifts but the data trail stays the same.

Ferguson proposes something he calls the tyrant test: design your privacy protections by assuming the worst possible leader will have access to your data. He argues it's not a thought experiment. It's the logic the country was founded on.

Book: Your Data Will Be Used Against You (NYU Press)
https://nyupress.org/9781479838295/your-data-will-be-used-against-you/
Guest: Andrew Guthrie Ferguson, Professor of Law, George Washington University Law School


From the conversation:

"Everything you buy that is a smart device is a surveillance device. And what we've done is sort of build around us this network of smart devices that is revealing who we are and what we do." -- Your Heartbeat Can Convict You

"Your smart pacemaker, almost anything you create with data can be used against you in a court of law." -- Your Heartbeat Can Convict You


You might also like:

Wrongfully Convicted: Darryl Burton Spent 24 Years in Prison for a ...

Who should get the vaccine first? We didn’t know so we asked a phil...

A Pastor Joined the FBI. Then His Kids Came Out.

What is Back in America?

Interviews from a multicultural perspective that question the way we understand America

If you're tired of arguing with strangers on the Internet, try talking with one of them in real life.

Welcome to Back in America, the podcast.

My guest today is Andrew Guthrie Ferguson, professor of law at George Washington University, a former public defender, and the author of Your Data Will Be Used Against You. So, professor, before we get into anything else, I want you to tell my listeners a story from chapter three of your book: a man named Ross Compton, a house fire, and a pacemaker. What happened?

He had a pacemaker, a pacemaker that helped him live and helped him live the life he wanted to live. He was also going to his doctor so his doctor could track his health and make sure that his heart condition was being treated the way it was supposed to be. Detectives, police detectives, went to the doctor's office to obtain the heartbeat data to use as evidence against the defendant in court. Now, the detectives weren't doing anything wrong. They had a case. Apparently, there was a faked fire.

He claimed it was an accidental fire, so he got insurance money. It turned out it was arson. And the detectives were going to use his heartbeat to disprove his story of what happened. And it puts the issues in the book in wonderful contrast. I think we can all agree that we want this kind of innovation, this digital world around us. At the same time, we might want to pause about the fact that our most intimate data, the stuff that we honestly cannot live without, it's our heart, could then be used against us in a court of law.

Right now, the law, the rules, the practice of how we sort of use data has not been set out. And the takeaway basically is: your smart pacemaker, almost anything you create with data can be used against you in a court of law. That's what the book's all about. Yeah, and I want to come back to that. But just the story of this fire. So the house burned. The guy said, oh my God, I did everything I could to save my precious belongings.

I was running all over the place. And the insurance company started to be suspicious and questioned that aspect of the story. So they figured out he had a pacemaker, that it was a smart pacemaker, and they obtained a warrant. Right. And the data that had helped him live turned against him. You start your book by saying, in a world where everything is data, everything is evidence. Is that what that story illustrates? Yes. And think about all the other smart devices in your life.

But it's not just your smart pacemaker. He probably also had a cell phone in his pocket. Maybe he had a smart watch on his wrist. All of these things are actually revealing data. And if you think about all the digital connections in your life, the things you do every day, from communicating through digital means, email, text, searching the internet for questions and answers to problems you have: your car is tracking you, your phone is tracking you, your watch is tracking you.

All of these digital connections are evidence. Everything you buy that is a smart device is a surveillance device. And what we've done is sort of build around us this network of smart devices that is revealing who we are and what we do. And the vulnerability is that sometimes that data can be used against us for bad reasons. Maybe you walked out the door with a no kings protest sign because you were upset with your government. Sometimes it can be used for good reasons.

Maybe you want detectives to be able to solve otherwise unsolvable cases because there happens to be a camera in their home. And that tension is the core tension in the book that I try to wrestle with. Let me take you back to your earlier days. So you were a public defender before becoming a professor. You stood next to people facing the full power of the state. And in your book, you write that at its core, policing is about power and social control. So when did you first see a piece of digital evidence

change the balance of power in the courtroom you were standing in? So the easy case, again, my days as a public defender are now 16 years behind me, I've been teaching for that long, was cell phones, right? People were actually recording themselves doing the dumbest things. And many of my younger clients thought it was really cool to take photos of the guns or drugs or things they weren't supposed to have. And of course, that would be used as evidence.

But those digital trails have only expanded as we have all sort of embraced the digital world. And today, part of the reason I actually wrote this book is, as someone who teaches criminal law and criminal procedure, I recognized that the future trials, the trials that my students are going to enter into as prosecutors or defense lawyers, are primarily going to be digitally focused. Primarily, the evidence that the jury is going to hear is going to be like, well, can you prove it through cell site data? Was there a video of the scene?

Can you use facial recognition to make sure this human identification is actually a match? And I want to just sort of spell out for those students, or future lawyers, how criminal law will change in the face of digital forensic evidence and how we probably need to start seeing criminal practice that way and through that lens, as opposed to the way I practiced, which probably wasn't all that different from the way people had practiced for decades before, which was almost without computers. We did have computers, but we didn't really use them in court. It was lots of paper descriptions and blown-up photos and none of the digital world we have today.

Right. So you're going to take us through another example. One of my favorite lines in the book: your home is your castle, or it was until the castle got Wi-Fi and started sharing data with Google and Amazon. And there is a case early on involving a Google Nest camera and a murder in a bedroom. What happened in that bedroom, in that house? And tell us about what our homes have actually become. All right. So this is a great case showing why law enforcement sees that our smart devices

are incredibly powerful evidence. In this particular case, there was a domestic killing. A husband kills his wife. We know what happened because it was mostly all captured on camera. You can watch the husband get a cold drink of water before loading the gun and then going up. And you can hear the shots fired, and the scream, and then him leaving. Now, in a normal world without cameras, we obviously have a dead body and we have circumstantial evidence, but you wouldn't have the clear sense of this individual's mental state

or their direct connection. It's not much to argue about when you see a person loading a gun and going up toward the bedroom where the body of their wife is shot. And so it is terrific evidence, obviously. And what it means is that any of those smart devices, from your smart refrigerator that tells you when the milk runs out and orders it for you, to your Alexas or Echoes that are listening to everything in your house, to camera systems. I always joke with my students about all the students who have

cameras to watch their pets at home, like cat cameras and things like that. All of that, of course, is recording inside the house. It is a great example of self-surveillance. The police didn't request a camera in your house. In fact, you would probably react violently if there were a mandated camera put into your house by the government. We did it. We put it in our houses.

And now when there's a crime or something that happens, the police can get access to it. And that's what happened in this case. And it can happen with any of those devices that we have bought to turn our smart homes into surveillance homes. So there is another instance in your book, which is quite fascinating, honestly. It's where you reproduce the Google search history of a man whose wife has just disappeared. So I don't want to spoil the sequence here, but can you walk us through that search history? Starting at four fifty five in the morning.

And I believe it's page eighty-six. It is sometimes said that we reveal more to Google than we would to our closest friend or loved one. Right. The questions we ask, the embarrassing questions we ask. In some ways it is a reflection of our own interests, concerns, worries, anxiety, curiosity at the moment. And sometimes it is amazingly damning evidence of guilt. So there's a case out of Boston where a woman went missing.

The main suspect was the husband, but they never found the body. They weren't sure exactly whether the husband was involved. They suspected him, and then they asked for his Google searches. So here are the Google searches, beginning at four fifty five a.m. It says, how long before the body starts to smell? Four fifty eight. How to stop a body from decomposing. Five twenty.

How to embalm a body? Five forty seven. Ten ways to dispose of a dead body if you really need to. Six twenty five. How long for someone to be missing to inherit? Six thirty four. Can you throw away body parts? Nine twenty nine.

How does formaldehyde, what does formaldehyde do? Nine thirty four. How long does DNA last? Nine fifty nine. Can identification be made on partial remains? Eleven thirty four. Dismemberment. And the best ways to dispose of a body.

Eleven forty four. How to clean blood from wooden floor? Eleven fifty six. Luminol to detect blood? One oh eight p.m. What happens when you put body parts in ammonia? One twenty one. Is it better to throw crime scene clothes away or wash them?

Then it gets worse. The next day. This is the second day. Twelve forty five p.m. Hacksaw. Best tool to dismember. One ten p.m. Can you be charged with murder without a body?

One fourteen p.m. Can you identify a body with broken teeth? January third. The next day. One oh two p.m. What happens to a hair on a dead body? One fourteen p.m. What is the rate of decomposition of a body found in a plastic bag compared to a surface in the woods?

One twenty. Final one. Can baking soda make a body smell good? Now, if you are a police detective and you see this series of questions, and you have a missing person and a husband who is claiming innocence but has these searches, I mean, it is the best evidence one could imagine. It literally shows a guilty mind at work, asking Google for answers that are almost as incriminating as one can imagine. So most listeners will say, well, good. I mean, they caught a killer, right?

But then you write: every perceived political enemy is vulnerable to digital exposure. So if a search history can convict a murderer, what happens when someone Googles abortion services in a state where that is now a crime? And I think that's the tension, right? So there is probably some woman right now in a state that has largely criminalized abortion, Texas, Idaho, you name it, who is Googling first signs of pregnancy, or abortion services not in Texas. And that search, which again is revealing a question that goes to intimate autonomy and bodily choice, is as open as the Google search of the gentleman who likely murdered his wife. And the difficulty is twofold.

One is the current law. It isn't even clear that police need a warrant. So if you don't need a warrant to get this information from Google, then police can literally just troll through, or ask Google for, anyone who has searched for abortion services, without any predicate crime or interest, and simply have a mechanism of surveillance. And so question one should be, should we at least have some kind of warrant, or higher-standard warrant, to protect this kind of information? And the second question is, well, in a state that has criminalized abortion, is a warrant really all that protective? If it becomes a crime, a full crime, to obtain an abortion or conspire to obtain an abortion, then police could use a warrant to obtain that information. There wouldn't be any protection there, because the law says the police can use warrants to go after criminal activity. So the hard question is that this data is revealing in ways that you might think should be protected or might think should not be protected.

And we don't have a good way, like legally, we don't have good rules to differentiate those two. And essentially our default has been if you create the data, it's available for law enforcement. And I think that that is a problem as part of what I'm trying to wrestle with in the book.

Just for those of us who are not lawyers, can a search history be used against you? Is it proof, or is it the beginning of proof? What is it? Under current law, the first question is how did the police obtain it? So if they obtained it with a warrant, they're pretty well protected. If they obtained it without a warrant, there's an open question about whether they needed to get a reverse keyword search warrant to obtain it. And there's actually some current litigation about whether that is too general a search. Because obviously, in order to obtain the first-signs-of-pregnancy search, you're actually searching all of Google, or at least all of a geographic area of Google. And so it's a huge general search that's not particular to a person or an individual or even a crime.

You're just sort of searching, probably my search history as well, to get that information. So there's a question of, do those reverse search warrants, are they too generalized, too unparticularized, do they kind of run afoul of Fourth Amendment values? And the second question is, well, assuming that the police obtained a warrant and got that information, can statements that you make to an algorithm owned by a huge public company like Google be used against you? And the answer is yes, on the theory that you have willingly shared this information. It would be no different in the law's eyes than if you were standing on your lawn yelling out, how do I dispose of a dead body? That sort of ability to verbalize something that is largely public, or not private, the police can obtain and use against you. And so in many situations, your words or acts can be used against you as evidence in a court of law.

I want to move on to something more timely, or very timely even. You write in your book about how Palantir works with the LAPD and how the company manages and integrates multiple different data streams for its customers, and you describe how the police became dependent on the technologists running those systems. And since you finished writing, the Palantir story has just exploded. They now have a 30-million-dollar contract with ICE for something called ImmigrationOS. They have a tool called ELIT that pulls Medicaid records to map where people live and assign a confidence score to their address. And Palantir is also the company that was the conduit for Anthropic's AI on unclassified Pentagon networks, until that relationship blew up because the Department of War wanted the AI system available for all purposes.

And Anthropic pushed back on mass surveillance and autonomous weapons. You coined the phrase surveillance as a service on page three. When a single company is the data backbone for immigration enforcement, the pipeline for military AI, and the subject of a fight over whether guardrails should even exist, what does your framework tell us about where we are? Yeah, I mean, unfortunately, I feel like my book is all too timely. I wasn't obviously forecasting what would happen when I wrote it. But I think many of the risks identified have come true, which is great for the book, not so great for the world. But a couple of realities. So Palantir is a data company that takes whatever information it has.

It can be business or government data, and Palantir allows the owner of that data to manipulate it. So the work they were doing in Los Angeles: they had essentially a system where they were trying to create social network connections, basically connections that might not otherwise be seen among people who had been arrested or were under the eyes of Los Angeles police. So it would speed up investigations. It was an early form of person-based predictive policing, this idea that if you had enough information about the people you thought were involved in criminal activity, and you kept collecting new information, you'd be able to potentially predict who might be involved in criminal actions, or areas that might need more law enforcement support. And so Palantir was in the kind of local policing business. They have since moved on from that to the federal policing business, in part because there's more money there.

And I think they couldn't find that many other contracts and departments as big as L.A. that were willing to pay for it. But in some ways, all they were doing was taking different streams of information, whether it was cell phone information, housing information, criminal information, arrests and so on, and making it usable. Right. Because in many ways, all this data had originally been siloed by different sort of bespoke technologies and wasn't talking to each other. So what Palantir did was create a more powerful investigative tool by allowing the different data streams to talk to one another. They're now applying that to the federal government, primarily in their contracts with DHS and ICE. They began some early contracts actually in the first Trump administration. These have just been expanded. And as you said, they are doing a couple of things.

They are identifying areas, geographic places, based on addresses or automated license plate readers or other information they have, where people might be who are undocumented, allowing ICE to get there quickly. They're also able to track all of the people who are being targeted by DHS and CBP in a more effective way. Essentially, the data that people have given the government when they come here to seek asylum or seek immigration status is obviously saved by the federal government. Now it's being used, and the analytics behind it are being run by Palantir, to let ICE do what they were planning to do, just a lot more effectively and more efficiently, because now they have geographic information, social network information connecting the dots, and the ability to identify people when they see them, who are, again, targeted by ICE.

What do you make of the Palantir-Anthropic situation? Anthropic says, we don't want you, or we want a contract that will make it clear that you will never use our technology for mass surveillance of Americans or autonomous weapons. And the government replies, well, it's not you dictating the law. It's us. And it's sort of what you want, right? You want more regulation around how data can be used. And yet, you know, Anthropic doesn't really trust the government, and they say, we don't even know what this technology can do. So they want more guardrails.

I mean, there is also a tension here, and I'm not sure what you think of it. So I appreciate Anthropic's position in saying that they don't want to support mass surveillance. And I think that is obviously a positive. Interestingly enough, Palantir has said something similar. They have said they won't do it if it's unlawful. The hard part is, what is lawful? We don't actually have clear rules about what the US government can do to surveil people. Right now, ICE is purchasing private location data in order to track people.

They can do that because there's kind of a loophole in the Fourth Amendment and current legislation: you can just purchase private data. Just as any company selling coffee or sneakers can buy it, well, so the government can buy it. And so lawful isn't doing much work when we don't have laws limiting whether or not mass surveillance of Americans is permissible. Generally we think, no, that sounds like why we have the Fourth Amendment. But when you get down to it, it's actually harder. Or talk about NSA work. What is the military doing at all with mass surveillance of US citizens? So I take it as a great principle.

I'm glad that they have stepped back to try to put some guardrails on how AI can be used. But I honestly don't know enough. I'm not sure anyone really knows enough about what the fears were that they saw and foresaw. Everything that's happening with ICE and the concerns we see with ICE and even actually the concerns we see with Anthropic, like could happen in local law enforcement tomorrow if local law enforcement wanted to adopt these technologies. There's nothing stopping them from doing it. It's just we haven't seen local law enforcement saying we really want to build a complete surveillance state. We can do that. There aren't laws that say we can't.

We just haven't seen them wanting to do it. But ICE is an interesting window into what could happen if you empowered federal law enforcement to use all the technologies described in my book in the worst possible way. It really is a pretty clear move toward an authoritarian surveillance world. And there aren't countervailing laws that would push back on that. Remind us of the Fourth Amendment. So the Fourth Amendment comes from the Bill of Rights of the United States Constitution. It protects against unreasonable searches and seizures. It mentions the protection of our persons, our homes, our papers, our effects, our things. It generally requires a probable cause warrant to invade sort of our most protected spaces.

But the current understanding of the Fourth Amendment is that a search, the moment when the Fourth Amendment would come in to protect you, only happens when there's a violation of a quote-unquote reasonable expectation of privacy. And you might ask yourself, well, what is a reasonable expectation of privacy in a world where I can walk down the street and there are 10,000 cameras in my city, and my license plate is being read by an automated license plate reader, and my cell phone is giving up my signal of where I am, and almost everything I touch is connected to some third-party service that I kind of consented to and gave up that information, and the government can get access to it? So what does that mean in the modern world? And that's a great question. There are not simple answers. There are particular cases that have evolved, but the law is pretty old-fashioned and analog. And we're just slowly seeing the Supreme Court wrestle with this idea that digital might be different.

And so it's a hard question to see what the Fourth Amendment means today. I have some ideas. I'm a law professor. I read about that kind of thing, but it's not an easy answer. You have a concept in your book that I think is one of the most interesting and important ideas in it. You call it the Tyrant Test. You write: assume power will be misused.

Assume that the Tyrant will gain access to your data and proceed accordingly. So tell us where that idea came from, and why do you think it's not a thought experiment anymore? When I wrote it, it was more of a thought experiment than where we are right now. I wrote a law review article called Surveillance and the Tyrant Test that basically said one way to approach this new risk that we all now face, this exposure, this vulnerability to the data that we give up in this world, is to try to recognize that we need to act. But how do we act? And I say the way to begin is to assume the worst. Assume that the Tyrant will be reading your most embarrassing Google search, and then proceed appropriately.

What would you do to protect against that world? And there's no one answer. You have to have legislative responses like heightened protections of certain kinds of intimate data and communications data and other kinds of constitutionally protected data that needs to happen by law, by legislative rules. You need to have a judiciary that responds to sort of, again, update the Fourth Amendment to meet the digital age. Again, a lot of the cases I teach are like from the 1960s, 70s, 80s, analog world, analog technology. They still are governing things today and we need to adapt to a new world. You would probably want to have citizens involved in the sense of like localized bodies and groups of people to sort of push back and check.

You want to have rights and remedies. The American legal system, like America itself, was founded on the Tyrant Test, right? We were worried about a centralized concentration of power. We divided it among the three branches of government. We divided it among the states and the federal government. We had grand juries and juries to respond when the Tyrant abuses their power. We have legal rights, and the Bill of Rights, and remedies to go to court. And it takes all of that to do this kind of work, to respond to the danger of consolidated data power.

And I think what you can see now, and you see it in other countries, is that it's very easy to take all of our digital trails and centralize them in a way where the Tyrant, whoever is in charge, can misuse them in ways that are detrimental to liberty, autonomy, freedom, dissent, just being weird and different. And we've seen that story before in other countries. And I think that we need to take that lesson seriously now and move forward with that all-of-the-above approach as we consider what to do about the data that exists in our world today. Are you optimistic that anybody wants to do anything, at least in this administration, about enforcing more regulation when it comes to private data? I think that part of the reason I wrote the book is to get people to see that this cuts both ways, right? Donald Trump's concern is that his data was used against him. He saw what he saw as the weaponization of data. Like Jim Comey, former FBI director, career U.S. attorney his whole life, straight arrow: his data is being used against him.

And so whoever is in power is likely to misuse that power against the other. I think there might be a bipartisan recognition that the political winds will shift and you don't want your data exposed. And one of the strange things about this world of consumer data is that it applies across the board. If you're talking about police surveillance technologies, they primarily target the poor, communities of color, and places where police have generally done a lot of their work, part of that social control idea. But if you're talking about consumer data, well, if you're a sitting senator, your Google searches are probably kind of embarrassing, too. At least your kids' are. The data reveals who's meeting with you.

If you're a Supreme Court justice and you're traveling and getting boondoggle trips across the world, guess what? Your data might actually reveal what lobbyists you were hanging out with, and some pretty embarrassing things. And I think that some people might say, that's great, we should expose that. I think the recognition is that there are pockets of action and activity and association that we probably don't want revealed without a really, really good reason. And that's the balance I don't think we have currently. Well, on that note, there is a list that stopped me cold in your book, where you list every single kind of digital record that a judge could authorize the police to obtain with a warrant. And I would like you to read some of it for me. Right.

It starts with a cancer diagnosis and it ends with a prayer, and it's on page 140 of your book. My personal biggest takeaway from writing this book was that there is nothing too private that can't be obtained with a warrant. Your smart bed, your period app, whatever it is, your digital diary of your darkest secrets: with a warrant, it's all obtainable. So I begin this chapter about privacy problems with these kinds of digital things, all of which are available with a warrant. A cancer diagnosis, an ovulation cycle or sperm count, a suicide note, an act of sexual intercourse, a paternity test, an admission of infidelity, an office affair, a stroke, a family secret, a confession, a visit to a psychiatrist, a prescription, a poem about loss, an email to your dying mother, a video of you in your favorite spot in the park, an expression of gender identity, a bank statement, a bankruptcy statement, a group text chat with college friends, a mean text about your coworker or boss, a photo of an ill-advised inebriated act, a video of your bedroom, an experiment with illegal drugs, a prayer. And it could go on. There is nothing too secret that cannot be obtained.

And I think if you step back and you say, why is it that we would always privilege the ability to prosecute over that sense of privacy? I'm not sure we'd have a good answer. Usually the fact pattern, the issue, comes up in a way where the prosecutor has a good reason for going into the bedroom or whatever it is. But if you step back and say, is that really the balance we want? That if you create it, it can be obtained and used against you in court? I'm not sure that, as a society, we would say it should be a free-for-all, that the default is once created, anyone gets to use it for a criminal prosecution. So for anyone listening to that who is maybe starting to freak out, what would be the most, or the single most, important thing they could do?

Well, I think the single most important thing is to buy my book and read it and share it with your friends. No, I think the point is that we need to create this as a consciousness. We need to recognize that this idea of privacy matters, that we should be able to live in a world where we can have the innovations of smart pacemakers and camera systems and not worry that we are really creating the evidence to be used against us. That's the better world, right? We don't limit consumer innovation. At the same time, we don't expose ourselves to a petty tyrant like your sheriff, or a real tyrant in the federal government. And you can carve out areas where that data is protected. Now, that isn't going to be an individual choice.

And in the book, I'm not trying to condemn people who own a Ring doorbell camera or who post everything on social media. I understand that's part of how we live. I kind of want to let people have those things and yet have laws where those things won't be used against them. That, to me, would be a better answer than either banning the technology, or not having the technology, or the current status quo, which is essentially: if you create it, police will come.

Did you tell your kids anything about that? Did you tell them, hey, maybe you shouldn't install a Ring camera in your house, or watch out for your Google searches?

One of the interesting things happened just this last winter. I have a 17-year-old and a 13-year-old, and they listen to Spotify, the music service. Every year you get this Spotify Wrapped, which is basically a description of your listening preferences over the last year. And what was fascinating to them, and to me, was the granular nature of it. Everything you listened to at every moment had been recorded, to the minute, to the second. Right.

And this wasn't happening in the abstract; you could see it. You could see exactly how many times you had played a Taylor Swift song or any particular song. And the kids could see: wait a minute, that means the other things I'm doing online don't just disappear into the void. They're actually being recorded. It was an interesting teaching moment at home. And look, I need Google Maps to get to my kids' soccer games. I don't have a doorbell camera, but I understand why people do.

And the point here is not so much an attack on the technology as a hope that people will come together to do something about the laws that could regulate it, limit it, or at least have the debate. We haven't had the debate yet about what the balance should be. Maybe the debate will conclude that we should just give it all to the police. But we haven't even had that debate.

All right. Is there a book or a film that helps you personally make sense of the world that you are describing?

I read a lot of articles, so that's probably not a book or a film. I always cite Minority Report, and there are other newer movies like it, because it can be kind of dystopian science fiction. But I actually think the real world is where I get my inspiration. You see videos of how police are using real-time crime centers. Just go check out how these centralized police data centers work, with thousands of cameras that they can click through from place to place to place. It's not like Hollywood, but it is really, really interesting to watch: they can go from camera to camera, from the drone camera that flies to the scene, to the police body camera running after the suspect, to the city camera, to the homeowner's camera.

People can upload their film of what they've seen, and you see how policing will change in the future. And it's not Hollywood. It's real. That kind of thing always inspires me to see what's coming next and how we need to react to those changes.

Thank you. So we are at the end of this episode, and as always on Back in America, I like to close with one last question: what is America to you?

It is still a place of opportunity and inclusion, a country based on and built from ideas and ideals.

It involves participation and deliberation and fairness and equality. I think the most central example of how small-d democracy can work is the jury, right? This idea that 12 ordinary citizens, without any other power, come in and decide a case against the legislative body that decided something is a law, against the executive that's trying to prosecute, against the court that is overseeing it. And they get to decide. It's a reaffirmation that when we come together as citizens, and we have a task to do, and we have rules and we follow those rules, we can get to the right result. I see that as my hope for America: that we will find anew that centered sense of purpose and coming together, and that we can build from it a future that feels less divisive, more inclusive, and more hopeful about the same ideals that have been around since the Constitution was enacted.

Andrew Guthrie Ferguson, thank you so much. Thank you.