Miranda Warnings

Technology can both help and hurt in the cause of justice. In a fascinating conversation, Vivian Wesson shares how artificial intelligence and facial recognition are being used in both criminal and civil law.   

What is Miranda Warnings?

Join NYSBA’s 118th President David Miranda each week as he interviews some of the biggest names in law and politics. Each week he discusses all things legal – and some that are not. You have the right to remain listening.

Dave Miranda:
Hi, I'm Dave Miranda, general counsel and past president of the New York State Bar Association. Welcome to Miranda Warnings. You have the right to remain listening. This week on Miranda Warnings, we're going to talk about artificial intelligence and facial recognition technology. We're very pleased to have with us today Vivian Wesson. Welcome, Vivian.

Vivian Wesson:
Thank you.

Dave Miranda:
Vivian Wesson is general counsel to the Board of Pensions of the Presbyterian Church. She also serves as chair of the New York State Bar Association Committee on Attorney Professionalism. She's a delegate to the Bar Association's House of Delegates and a member of NYSBA's new working group on facial recognition technology. Vivian, we're going to talk a little bit about artificial intelligence and facial recognition technology. I know you recently wrote an article in Bloomberg Law calling for federal regulation of facial recognition technology. Just for starters, maybe you could tell us a little bit about what facial recognition technology is and how it's used.

Vivian Wesson:
Absolutely. And thank you again, David, for this opportunity to speak with you and your listeners. I am a big fan and advocate of Miranda Warnings programming, so thank you very much for allowing me an additional platform to discuss facial recognition technology. Think about the ability to have a digital signature of oneself, no different than a fingerprint — how the complexion and tone of your features, of your face, identify you. Although I've been, and I'm sure you from time to time have been, accused of having a doppelganger. I'll be walking down the street and someone will say, "Oh, hey, Carmen." Oh, you're not Carmen. Well, facial recognition technology uses machine learning to make that comparison, not just relying on the naked eye. And it sprang out of computer vision research back in the 1960s, when they were trying to develop something that would allow more rapid recognition of someone, for benign and hopefully benevolent purposes.

What it really does is take a set of facial data — images, hundreds of thousands of them in some of the databases that have been created — and ask the computer to run an analysis of the probability that I look like one of the images stored in the database. So herein lies the problem: who's putting the images in the database? These databases mainly consist of people who look like David Miranda versus people who look like me. The flaw soon became apparent that we were not being inclusive enough in these representative sets, so the computer was not giving us an accurate prediction of the correct person, and the inaccuracies were more glaring when it came to people of color, and particularly black women.
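To make the matching step concrete, here is a minimal, hypothetical sketch of the kind of comparison being described: every enrolled face image is reduced to a numeric embedding, and a probe image is scored against each entry in the gallery database. The names, vectors, and similarity function below are illustrative assumptions, not any vendor's actual system.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, ranging from -1 to 1."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(probe: np.ndarray, gallery: dict) -> tuple:
    """Score the probe embedding against every enrolled identity and
    return the closest identity along with its similarity score."""
    scores = {name: cosine_similarity(probe, emb) for name, emb in gallery.items()}
    best = max(scores, key=scores.get)
    return best, scores[best]

# Toy 4-dimensional embeddings; real systems use embeddings with hundreds of
# dimensions produced by a neural network trained on large sets of face images.
gallery = {
    "person_a": np.array([0.9, 0.1, 0.3, 0.2]),
    "person_b": np.array([0.2, 0.8, 0.5, 0.1]),
}
probe = np.array([0.85, 0.15, 0.35, 0.25])

print(best_match(probe, gallery))  # -> ('person_a', ~0.99)
```

If the model producing those embeddings was trained on an unrepresentative image set, its scores are simply less reliable for the faces it has seen least, which is the accuracy gap Wesson describes.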

Dave Miranda:
First, to your point about where the database is coming from — what's the source of the visual representations that go into it?

Vivian Wesson:
So there are a multitude of sources that a technology company could use. There are open-source databases of faces. Google has a platform for which you can use a sample data set. There are some government entities that created these sample sets, and the idea was to really spur more commercial adoption and the reconfiguration of what the technology would look like. And then there are companies that simply go through the web, scrape images off, and put those into a database.

Dave Miranda:
And I know you've called for some regulation of this use, but before we get to the regulation, why don't you share with us some of the ways that you're seeing that the technology is being misused?

Vivian Wesson:
I would share with you, David, four primary areas of concern when it comes to facial recognition technology. One is when it's being used and it operates exactly as intended — and that really is the MSG case that we can talk about later in your program. Another is the case where it does not act as intended, because there are inherent algorithmic biases within the technology and no one really had a clear understanding of the training set that produced these unforeseen results. Then you have simple misuse and abuse of the system, where someone could manipulate an image with Photoshop techniques such that it will create a match within the facial recognition technology. And lastly, there's the dystopian sort of chilling effect of the use some police made of it after George Floyd's murder, to surveil protestors. If you can pinpoint and identify folks who are exercising their right to peaceful assembly, that has a chilling effect on their ability to exercise their constitutional rights.

Dave Miranda:
Well, I'd like to pick apart each of those. First, the proper use where you said there could be a problem — and we are going to talk about the MSG case — where technically the manner in which it's being used is fulfilling its purpose. We'll talk a little bit about that later. But then you said there's a situation where you're using the technology and there's an unforeseen result. What kind of problem comes from the unforeseen result of the use of this technology?

Vivian Wesson:
So in the early days — and I love that I'm saying "early days" when we're talking less than a decade ago — some of the nascent programs that were released did work as anticipated, such as using facial recognition technology to unlock your iPhone, or using it as some biometric form of encryption. Well, what happens if the system is not working as anticipated, and anyone who has very similar features to you can unlock your device and use it in some nefarious way? Which has happened.

Dave Miranda:
So that's when I have my cell phone and, if it sees your face, it can get in without using your password. Is that it?

Vivian Wesson:
Yeah. So someone can mimic your features and it recognizes them — it's not working exactly as you expected it to, because it should be able to pick up all the nuances. Let me give you a better commercial example. When HP went to release its first face-tracking webcam technology, the idea was that it would track what was happening with your face and provide the necessary amount of light so the webcam would follow you as you moved. And it was the boom at the time — everyone needed to have it, and we never knew how much we were going to need webcam technology during the pandemic. But the problem was that it was not recognizing black faces, so it would not come on at all. Folks who were expecting this AI-capable webcam couldn't figure out how to get the bug out of the system when it was first launched. Google had a similar problem when it first launched its photo AI facial recognition technology.

That was for the sole purpose of letting you sort — here are pictures of grandma, here are pictures of Timmy at the beach — so you could create album collections of people after the software went through and sorted all of your files. That sounds like a wonderful and beneficial application, except what it started to do was sort through people's photo albums and label black people as gorillas, because it did not have a proper training set from which it could distinguish what a black face was, especially when it was juxtaposed with a white one.

Dave Miranda:
And then you talked about another problem where there was the creation of images — and that's perhaps more of an artificial intelligence use, right?

Vivian Wesson:
So think about how fun it is for the younger generation — they'll do all of these enhancements where you can change certain features and Photoshop yourself to make yourself look more appealing when you're on Instagram or doing your TikTok video. Again, it seems rather benign, except what if you used that from a law enforcement perspective to alter the features on the face of someone you suspect might be the criminal, or a suspect in a particular investigation? You run that through the facial recognition technology. Well, it's not actually looking at an image of someone; it's looking at a doctored image of someone. And if it gives you a false reading, that to me seems an abuse of a technology that really was intended to give you an appropriate match for an unaltered image.

Dave Miranda:
So there would be a lineup of potential suspects, and then the photos would be manipulated somehow — to what? To make the person look more like the suspect, or make them look more guilty, give them beady eyes or something?

Vivian Wesson:
No, more like this: I suspect that it is this person, but if I ran their image — let's say the image that I took in a lineup — through the facial recognition system and it did not spit back that person, well, if I shaved the nose down a little bit, maybe altered the cheeks, then it'll come back and say that person is a match to what I saw in the surveillance video, and now I have my suspect. That's not what you're supposed to do. That is an absolute abuse of how the technology is intended to work. If it came back that that person in fact is only a 70% match, it should be treated as an investigative tool as opposed to a dispositive solution to finding your suspect.
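A rough way to see why a 70% match should be a lead rather than a conclusion: the score only says how close the (possibly doctored) probe image is to an enrolled photo, so any sensible triage policy keeps human investigation in the loop. The sketch below is a hypothetical policy; the 70% and 90% thresholds are assumptions drawn from the figures mentioned in this conversation, not an established standard.

```python
def triage_match(score: float,
                 lead_threshold: float = 0.70,
                 strong_threshold: float = 0.90) -> str:
    """Hypothetical policy: no score is an identification by itself;
    lower scores are, at most, leads that need independent corroboration."""
    if score >= strong_threshold:
        return "strong candidate - corroborate with independent evidence"
    if score >= lead_threshold:
        return "investigative lead only - not an identification"
    return "no actionable match"

print(triage_match(0.70))  # -> investigative lead only - not an identification
print(triage_match(0.95))  # -> strong candidate - corroborate with independent evidence
```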

Dave Miranda:
Right, and it seems like that would be a terrible thing to do. It would seem as though, especially when we're dealing with criminal matters, that would be something that's prohibited — it's putting your thumb on the scale. I would assume that if the photos were manipulated, that's something a defense attorney certainly would be entitled to know and could, I would think, make a lot of noise about if there was a prosecution.

Vivian Wesson:
Let's put it this way: what if you did not enter the doctored photo into evidence, and did not indicate that it was what led you to believe this was your particular suspect from the surveillance video? Then it never sees the light of day, and the jury never even considers it. And wait a minute — if you are a juror or you are the defense attorney, that could give you pause as to the efficacy of the actual identification, if you are relying on the computer to do it. If the computer is doing its job as intended, and you put an unaltered image in there and it came back with something that was not in the 90th percentile of accuracy, then you have to use that as just one step in your investigation. But if you were in utter reliance on it and then you do something to manipulate it, the playing field is not level.

Dave Miranda:
And I think that leads to the point you were making in the article you recently wrote in Bloomberg Law, where you're saying there needs to be some federal regulation of this. Talk a little bit about the kind of regulations that you would expect to see. I know you said there was some federal legislation proposed pre-pandemic that took a backseat once we went into the pandemic, but obviously in the last three years the technology has advanced substantially, and I would suspect that the need for regulation is even stronger now than it was three years ago. So what kind of regulations do you think we should be seeing?

Vivian Wesson:
So right now you are seeing a scattering — a spattering, I would even say — of states and cities that have just tried to deal with what they're calling biometric identification. There's the contingent that says, I'm okay if I have to be fingerprinted for certain activities; a fingerprint is 100% reliable. Okay. If I want to be admitted to the New York bar, I must submit to putting my fingerprints on file. I know the purpose for which they are using this information, why they're collecting this biometric data, and the further purposes for which they can use it. The problem lies with facial recognition technology. These are cameras in department stores and at entertainment venues that are just capturing your image without your knowledge, certainly without your consent, and certainly without your understanding of the uses to which it will be put.

When I go to the airport, I understand that I am agreeing explicitly that, if I want to fly, part of the bargain is you get to capture my image and check it against maybe a no-fly list or some list of dangerous characters, or use it as part of some other type of police-level investigation. You submit to that; it's one of the exchanges of getting on a commercial airline. But do you do the same thing if you go into Macy's department store? Is that the same contract? Are you put on the same type of notice and giving the same type of consent? That's what the federal legislation is really leading to: you can't just take it without someone's knowledge. You need actual consent, not implied consent, even if the bargain is that there's a big red sign in the front of your local Bloomingdale's.

If you're coming into our store, be aware: we use FRT to scan folks, and that is part of our theft mitigation process. You can agree to that bargain. Or you can say, wow, I just lifted a purse from here last week, so I'm going to exit stage left. You get to make choices. Right now there are no choices, and the commercial market is ramping up briskly, using this technology again for a lot of risk mitigation — a beneficial purpose, making sure that someone is not coming in to commit a bad act. Okay, that's all well and good, but you have to tell us that, and we should, as part of the public good, have some agreement to it. And there should be hefty fines associated with anyone who infringes on that potential right to privacy and freedom from surveillance.

Dave Miranda:
Well, let's dig into that a little bit, because there are cameras everywhere. You've talked about stores, but they're in the streets, they're on lampposts. Oftentimes they're used not for anything nefarious; oftentimes they're used to help stop crime or to catch criminals. And so I would think that there would be a difference between somebody or some entity recording what's going on in a public space — and a store certainly is a public space, or a street is a public space — for the purposes of, as you suggested, crime prevention, and actually taking the visuals that they record and then using them for some other purpose. So I would think that it's not only the location of the recording and whether there's an expectation of privacy in that location, but also whether there's some knowledge and expectation of how the visual would be used.

Most people — with the exception of your example of someone who lifts pocketbooks — don't have a problem with the fact that there might be some recording for crime prevention. In fact, it might make them feel safer. But if it's being used for other purposes that they're not aware of, maybe some other commercial purposes, that would be an issue. So how do we filter between good, proper, acceptable uses and uses that might be unacceptable?

Vivian Wesson:
I think you hit the nail on the head when you talk about the expectation of privacy. If I am walking down a Center City street in Philadelphia, I have no expectation that someone could not possibly capture my image for whatever reason. The problem is when the technology lends itself to potential abuse — where you're not collecting this biometric data for the sole purpose of theft or crime mitigation, but instead selling it to the highest bidder, and people can use it for stalking. People could use it potentially to commit identity theft and other manner of fraud. When you balance the potential bad uses against the benign ones, there's a whole host of them. I think regulation can point out and capture when we're trying to do something that is beneficial for society, which is crime mitigation — and hopefully prevention, or the apprehension of people who commit crimes.

That makes sense to me. The problem is the completely unregulated state we're in now, where at most New York City has a you-must-post-a-notice sort of requirement. It still does not have any strictures around what else you could be doing with that data. Therein, for me, lies the problem.

Dave Miranda:
What about taking someone's image and manipulating it so it looks like they're saying or doing something that they didn't do — is there any regulation against that? Certainly you can see it in movies, where with CGI it looks like people are doing all sorts of things, and that's obviously perfectly harmless and entertaining. But we've gotten to the point where you could make it seem as though someone was in a different location, saying something that perhaps they never said and doing something that they never did, and it looks very incriminating. Do we have any laws that might prevent that, or is that something we also need to regulate?

Vivian Wesson:
What you're talking about is the deepfake space. There is now the creation of deepfakes using AI — not necessarily FRT. This is just complete manipulation of images, or Photoshopping images into certain things, doing a voiceover for someone and having their image move even though they're not saying what the deepfaked image is saying. And again, it's wholly unregulated. Computers are creating these lifelike, brand-new people and images — The New York Times recently ran an article about that as well — and it's disturbing that someone can be out there creating this artificial population of people who are doing different things or, as you say, putting you potentially in some compromising position. Unregulated, absolutely. There is nothing to stop someone from creating an image of David Miranda and having him meet with the leader of North Korea, the two of you sitting down for a chitchat.

For me it also goes back to the cautionary tale that everyone believes what they see. And if you're creating these false images, and especially making them so very realistic utilizing this technology, when do we put parameters around when that's a good thing and when you should not do it? When you think about it, we're just going to have to go back to old-fashioned law. If someone is stealing my image without my consent, that's copyright. Copyright will not allow you to do certain things and take my image without my consent. You're not allowed to reproduce it, because this face is my own.

We'll have to go back to intellectual property rights, period. Those things didn't change just because we now have fancier tools that allow you to do different things. And good old-fashioned slander and libel still apply. Even if you're creating this manipulated image of me potentially saying and/or doing something that I think is slanderous to my character, I'm pretty much still going to have a right of action. But is there anything particular about using AI to do it? No.

Dave Miranda:
I want to talk a little bit about this MSG case that's been in the news. It's certainly struck a chord, I think, amongst the public, and particularly amongst attorneys, because attorneys were the ones specifically targeted here. This is a situation where MSG — which is shorthand for Madison Square Garden and its various affiliates — is using facial recognition software to prevent lawyers at law firms that have litigation against MSG or its affiliates from entering MSG venues. And it's been highly publicized because unsuspecting lawyers have shown up to go to a basketball game, or to go to Radio City with their children, and through this facial recognition technology they've been precluded from entering at the door. So let's talk a little bit about what MSG did and where we are with that issue.

Vivian Wesson:
You want to know what I find ironic about MSG's use of the technology? They are following the New York City law to the letter. They actually sent notices to all the law firms that litigate against them and put them on notice, which was all they were required to do — to say, I will be using this technology to ban anyone who is a lawyer at your firm. I don't care if they're litigating against me or not, as long as they are associated with your firm. And it was only 90 firms, but you did it. You put all 90 of those firms on notice to say, if you come to any of my — what I'll call quasi-public — venues, I reserve the right to escort you to the door. Never mind that you're not there for anything other than an entertainment purpose.

This is a ban that we're going to institute, and we're going to use this technology in order to enforce it. Let's think about the old-school method of doing that. How many people do you think you could put on the door at a Knicks game to make sure that they single out the David Mirandas of the world? I can almost guarantee you that it would probably be 40% accurate, maybe — it's very difficult. But now you've decided to weaponize facial recognition technology against an entire class — and you created a class by doing this — of lawyers who happen to be litigating against one of your enterprises, or representing someone who has a lawsuit against you. That's it. That's the bar for wanting to exclude you from being able to come into my facility.

And I have so many levels of issues with just doing that. But that's the first level of harm we talked about, where you're using a technology that is essentially infallible for this purpose, because you went to those law firm websites and scraped all of those images. That's what you used to train your AI system. So it was guaranteed that you're going to find all the lawyers who happen to have headshots on the law firm sites, and you're going to be able to ban all of them. The technology worked flawlessly; it worked exactly how they intended it to work.

And is that what we want? Is that a use a private-sector actor should have and be able to weaponize against any class? Let's take lawyers out of this. What about someone who criticizes you in the New York Post or the New York Times — who says something bad about you as an individual? Are they going to go on the ban list? Where does it stop, if we do not at this point draw lines in the sand and create regulation to stop it? That's the slippery slope. Lawyers love to talk about getting on that slope, but you can see where the slide can start.

Dave Miranda:
Yeah, let's talk about it, because the example that you gave — the more traditional example — has been codified. For critics, New York Civil Rights Law Section 40-b went into place decades ago, before facial recognition technology. When theaters wanted to preclude a critic who gave a bad review, they'd say, you know what, you gave us a bad review, you can't come in anymore. And the law codified that that's impermissible. That law came up in the MSG case, where the judge, at least at the trial level — I know it's on appeal now — said that 40-b would prohibit the wrongful refusal of admission to places of public entertainment where there are performances or music, but wouldn't necessarily prevent someone from being excluded from a basketball game, because that doesn't come under 40-b. So it seems like, at least in the past, there was regulation that would prevent someone from somewhat arbitrarily precluding someone from entering their venue.

Vivian Wesson:
The argument I've been having is with folks who are opposed to my point of view on this, or who find it not as disquieting as I do, and who think that on balance private property rights should always trump privacy rights. So you're now putting at odds what a private property owner has the right to do versus what my potential First Amendment or just overall privacy rights might be.

Dave Miranda:
But we need to draw a distinction between private property meaning, like, my house, and private property meaning a public forum where the public is invited to buy tickets and come in. If I want to preclude someone from coming into my house, I probably should be able to do that — it's my private property — for whatever reason, even just that I don't like the person.

Vivian Wesson:
Right, exactly.

Dave Miranda:
But for a public forum, if you hold yourself out as a public place of entertainment of some sort where people are buying tickets, that would seem different. Isn't there a difference between personal property and a public space?

Vivian Wesson:
So when you were mentioning the civil rights law — at the trial level, when they started trying to work out which laws could speak to what's happening in the MSG context — there was uncertainty as to whether someone's buying a ticket for a Knicks game entitles them to be there. What you're talking about is only a limited license right. That's all a ticket really gives you: a limited license, for that period of time, to be on that particular personal property of someone else. It is still private commercial property. It is not publicly held by anyone; it's just owned by owner X. Well, can owner X set what the limitations are? We've already prescribed that there are certain things owner X cannot do, right? He cannot say, you are not coming in because we do not like black women to show up.

As soon as you hit Title VIII, you are done. If you're going to hit any of those defined classes, you're going to have a problem. It doesn't matter that it's your private institution; someone's going to be able to make some claim against you. But in this instance, he created a class by saying, all lawyers who are doing this particular thing that we find offensive. Well, what's to stop any other commercial property owner? That's why I don't want to limit it to just what's happening at MSG, but to what happens when any private owner of a space — even one that could be considered part of the public good — sets these types of limitations on access. When do we say we need to put some guardrails around this? Let's circle this particular usage such that it is direct, intentional, transparent, and known to whoever it might target or impact.

Dave Miranda:
So in your view, if an attorney is suing MSG, does MSG have the right to preclude that attorney, and every attorney in their law firm, from going to a Knicks game or to the music hall?

Vivian Wesson:
I come out on the side of the right of privacy — the right to my own personal time, the right for me to be a lawyer as well as, separately, a parent or a spouse, a sister or a brother. I get to separate those things, and they shouldn't travel with me when I go to an entertainment venue. You've now created something that classifies me as the one thing, even if the one thing is not what I am doing when I show up at a Knicks or a Rangers game. That's not something the owner of some — we'll call it semi-private or semi-public — space should be able to do.

Dave Miranda:
And we can get into this a little bit — I know you're familiar with it. When you're involved in litigation, both sides have attorneys and they're potentially going to court, there are formal procedures for discovery, and it's not necessarily permissible for an attorney to go undercover, so to speak, and try to elicit information about the case from people who might have it. And I think that MSG, in this particular instance, is using that as an umbrella to cover everything. Now, if you have litigation over a slip and fall, perhaps, or a contract matter against MSG, there's not going to be any information that another attorney in the law firm is even potentially able to gather by going to a Knicks game.

Vivian Wesson:
So in the recent case, rest assured that the mother who was traveling to see the Rockettes' Radio City Christmas show was not there arming her two children with subpoenas for anyone at Rockefeller Center. That's not what was happening. This is not a laser-focused sort of battle against a litigating class; it's way too overbroad an approach to get at — or potentially chastise, or retaliate against — those who may bring lawsuits against you. That's not what we intend as a useful means of leveraging this type of technology, when it can be used in a weaponized fashion.

Dave Miranda:
It would seem as though the intent was to have a chilling effect — that if you're an attorney at a law firm that's involved in litigation with MSG, then I guess you're at war with them and you're going to have to suffer. Which is, I'm sure, something that the task force you're involved in with the bar association is going to look at: whether this would have a chilling effect on attorneys taking a particular case. Now, obviously attorneys take an oath to represent their clients diligently, and certainly many attorneys have been subject to more harmful things than not going to a Knicks game. But whether that's appropriate or not is, I think, something the task force will discuss. We don't want to have anyone deprived of their rights. You do have a right to bring litigation if you're aggrieved, and if you can't find an attorney to do it, that would in effect be a denial of someone's rights.

Vivian Wesson:
A denial of justice, absolutely. I'm also troubled by the implications from the perspective of attorney professionalism. We focus a lot on what we need to do under the Rules of Professional Conduct, and one of them is being able to diligently represent your client. When considering taking on a case, you also run your conflicts — do a conflicts check within your firm. Well, now you're potentially abridging my ability to effectively run my conflicts check, because my conflicts check isn't, well, maybe I need to root out someone who has season tickets to the Knicks before I can take your case. That's insane. That is not part of what we should ever have to do in order to decide to take on a new representation.

Dave Miranda:
Right. Well, Vivian Wesson, thank you very much for being with us here on Miranda Warnings and talking about artificial intelligence and facial recognition technology, and some of the ways we might be able to use it for good and regulate it. Thank you so much for being with us.

Vivian Wesson:
Thank you very much. I'm really happy to be able to spend some time with you, David. Appreciate it.

Dave Miranda:
Yeah, it's always great talking to you. We have a somewhat lighthearted feature on Miranda Warnings called Music, Book or Movie, where you can share with us something artistic that's important to you.

Vivian Wesson:
Thanks. I am going to give a hundred percent thumbs-up to a book that was given to me called God, Improv, and the Art of Living. One of the things I have said repeatedly is that the most powerful and instructive thing for my practice of law was participating in an improvisational comedy group while I was in undergrad. We toured around Southern California doing what was effectively Whose Line Is It Anyway?, and it taught you how to be completely flexible, to put the words "yes, and" in your vocabulary, and to be able to react in the moment. And I remember when I took the job working at this religious organization, the Presbyterian Church, it was like, so how do you square this improv skill with being on the Christian side of life? I was like, God's doing nothing but improv when you think about it.

When He decides — yes, Sodom and Gomorrah, I'm going to turn you into salt — I'm like, salt? But that's what you came up with, God. There is no magic blueprint to it. Some people think that your whole world is diagrammed and mapped out for you, and it's simply not true. And I loved being able to read about the interconnection between my faith and my improv skills, and how that really is about the art of living.

Dave Miranda:
Well, that sounds great. I'm going to say "yes, and" I'm going to take a look.

Vivian Wesson:
That's what I want to hear.

Dave Miranda:
Thank you very much, Vivian. You stay well. This has been Miranda Warnings, a New York State Bar Association podcast. You have the right to subscribe, rate, and review.