Feminism NOW

AI seems to be everywhere these days, and it’s often marketed to us as a way to boost our efficiency. But feminists have long known that technological advancements can also create new avenues for the exploitation of women, and AI is no exception. 

Our theme this season is “Protecting women in a hostile world.” In this episode, NOW National President Christian F. Nunes is joined by Andrea Powell, co-founding director of Alecto AI, a survivor-founded tech company that combats online image abuse, to discuss deepfakes, image-based sexual abuse, and how the law and technology itself can help keep women safe. 

Links
Alecto AI
Andrea Powell’s book, Believe Me: The True Story of How a Trafficked Teen and Her Advocate Changed the Justice System and Found True Freedom
Andrea Powell’s Congressional testimony - “Take It Down: Ending Big Tech’s Complicity In Revenge Porn”
Demand for deepfake pornography is exploding. We aren’t ready for this assault on consent.

Guest: Andrea Powell is a twenty-year expert in global policy and direct services for survivors of sexual exploitation, human trafficking, sexual assault, and image-based sexual violence. Her work has appeared in the New York Times, Washington Post, NBC, PBS, Ms. Magazine, and more. Andrea’s social justice memoir, Believe Me, can be found online and in indie bookstores, and won the 2024 New York Book Award.

Take Action NOW: In celebration of Women’s History Month, read more about the National Organization for Women and the Equal Rights Amendment here

Listen to new episodes of Feminism NOW released every other Wednesday. To find out more about the National Organization for Women, visit our website.

Socials:
Alecto AI:
Instagram: https://www.instagram.com/alectoai 
LinkedIn: https://www.linkedin.com/company/alectoai/ 

Andrea Powell:
Instagram: https://www.instagram.com/ankachristine 
LinkedIn: https://www.linkedin.com/in/andrea-powell-bb53255/

Creators and Guests

Host
Christian Nunes
Guest
Andrea Powell
Producer
Bethany Brookshire
Editor
Ismael Balderas-Wong
Producer
Susanna Cassisa

What is Feminism NOW?

Passionate about modern feminist issues? Want to learn more about how today's political, academic, and cultural leaders strive for a future of universal equality and justice?

Join NOW in a podcast dedicated to intersectional feminist discussions in American society with leaders in entertainment, sports, politics, and science. From conversations on constitutional equality to economic justice and reproductive rights, listeners will find new ways to learn, engage, and get empowered.

Listen for new episodes released every other Wednesday.

Bethany Brookshire [00:00:06]:
This is just a note: today's show discusses online image abuse, particularly sexual abuse. Please take care of yourself as you listen. Hello, everyone, and welcome to Feminism NOW, the podcast from the National Organization for Women. I'm senior producer Bethany Brookshire. When the Internet first became popular, parents often tried to comfort their kids when they got nasty comments online. It's just the Internet, they'd say. It's not real life. But it's been a long time since then. Life on the Internet is just as real as anything you can touch with your hands, and its impacts can be just as devastating. Of course, anyone can get scammed or be exposed to disinformation. But women are at extra risk on the Internet with the rise of online image abuse, particularly images involving sexual abuse. While people might think of AI and think about silly images on the Internet, the truth is darker. A 2019 report showed that 96% of deepfake material is pornographic in nature, and much of it uses images of women without their knowledge or consent. This material can ruin lives. Today, NOW's national president, Christian F. Nunes, speaks with Andrea Powell, one of the founders of Alecto AI, about the dangers of the digital space for women and how the technology that harms can also be used to help people stay safe. And while we're here, we would love to hear your thoughts. Is this your first time hearing about this issue? Can you tell the difference between AI and reality? Contact us at feminismnow@now.org. And now, let's get to the interview.

Christian F. Nunes [00:01:41]:
Hello, everyone. Welcome back to Feminism NOW. We are happy to have you with us. I'm Christian F. Nunes, the national president of the National Organization for Women. Our theme this season is Protecting Women in a Hostile World. And we know what we're seeing every day: the world women live in now is much more hostile, not just in real life, but also online. Many of us do our jobs online. Our social lives are online. We know we need to prevent hacking and viruses, but the Internet can be much more dangerous than that. That's why I am so pleased today to be speaking with Andrea Powell. She is the co-founder of Alecto AI, a tool that helps fight deepfakes and digital abuse. She's also the executive director of Karana Rising, which provides advocacy and care for survivors of sex trafficking. And she is the author of the 2024 book Believe Me: The True Story of How a Trafficked Teen and Her Advocate Changed the Justice System and Found True Freedom. Oh, my goodness. Andrea is just full of knowledge and experience, and a delight to speak with, so thank you for coming on to Feminism NOW.

Andrea Powell [00:03:01]:
Oh, thank you so much, Christian, for having me. I'm delighted to be here.

Christian F. Nunes [00:03:06]:
So this conversation this season is really important, because we know that women and girls have been experiencing exacerbated amounts of abuse online. And that has become so out of control because it's unregulated and there's no accountability. But what we don't often talk about is the trauma, the anxiety, the depression, the harm that it causes. So I want you to inform our listeners a little more about this issue. Can you start off by telling us what a deepfake is, and also what online nonconsensual image-based sexual abuse is?

Andrea Powell [00:03:47]:
Sure, absolutely. So what is image-based sexual abuse? First of all, it is the stripping away of consent from the individual who is either actually depicted or synthetically depicted in an intimate setting. We're using the word intimate and not sexual in that context, because what you and I might think of as sexual or intimate might be different from what someone, say, in a more faith-based community in a different part of the United States, or someone in a different part of the world, might think. So rather than having that lengthy debate, it's really about consent. Image-based sexual abuse impacts, just here in the United States, 1 in 12 Americans, and the vast majority of individuals who experience this are from vulnerable populations or are women and girls. Which really, honestly, isn't a surprise, because all forms of sexual violence disproportionately impact women and girls, but it does impact young men, boys, and older men as well. I wanted to then move into what deepfake abuse is, which a lot of your listeners will have heard of as "deepfake porn." But in reality, this isn't porn. Porn is a consenting adult entertainment industry. Deepfake abuse is where you're taking someone's face and putting it on another body, or you're taking someone's image and what's called "nudifying" them. There are hundreds of apps, free and accessible, that anyone can download. They can take a picture of a woman or a girl, make it look as if she's completely nude, and then put that online, share it, store it, email it, text it, use it as a threat or coercion. So deepfake abuse is really a synthetic approach to sexual violence that's happening online. And while people often think of celebrities like Taylor Swift or Billie Eilish, the reality is it's happening to younger and younger girls across this country and around the world, including girls as young as 11 and 12 in junior high. It is rapidly expanding, with over 9,000 websites surfacing on Google alone explicitly designed for this type of abuse.

Christian F. Nunes [00:06:05]:
I want to thank you, first, for naming the difference between those, but also for speaking out about the "deepfake porn" terminology, because I always hate hearing "deepfake porn." Like you said, porn is consensual between adults, and when someone's image is stolen or manipulated, that is not consent; it's done without their consent. So thank you for clarifying that.

Andrea Powell [00:06:27]:
Sure, yeah.

Christian F. Nunes [00:06:28]:
But I want the listeners to know what we're really talking about here. When we're talking about numbers, what are we talking about? We bring this up because people think it's not as prevalent as it really is. So can you tell the listeners a little more about how often this is occurring and why it's such a big problem?

Andrea Powell [00:06:47]:
People often ask why online sexual violence is increasing or happening at all. The answer is really simple: it's an easy way to sexually abuse individuals online. You can sit there with your computer, make a deepfake video of your classmate, and then share it with all of her friends and family, post her location. So at Alecto AI, our goal is to reverse the conversation from why did he do it to fully focusing on consent. Because oftentimes you also hear terms like revenge porn, right? But you don't revenge-murder people; you don't revenge-rob people. So this is not revenge. There's nothing anyone could do that would justify the sexual abuse. And just to add to that, Christian: when we think about the impact of deepfake abuse, even though the word fake is being used, and now it's increasingly called sexual digital forgery, though I think that's a mouthful for our non-legal listeners, when we think of the word fake, we think it's not real. And it also kind of connotes that nothing really bad is happening, because it's not real. But that's not true. I work daily at Alecto with survivors of deepfake abuse, and years afterwards they're still wondering, has someone seen that video? Does that person think they've seen me naked? Even if they get the content down, it's never really over; it's a continuous threat of violence and humiliation. And really, I often think of deepfake abuse, and image-based sexual abuse in general, as a digital "get back in the kitchen" for women and girls who dare to rise.

Christian F. Nunes [00:08:35]:
I think oftentimes when you talk about it being "fake," people become desensitized to the word fake and feel like it doesn't do as much harm. But the reality is that these people who are surviving deepfakes, who are victims of this, are being traumatized over and over and over again. And what that trauma does, and I'm speaking from a mental health background because I'm a trauma therapist, is make it hard to heal. There are so many layers to this, right, Andrea? It's not as simple as just saying, well, you know...

Andrea Powell [00:09:11]:
It's fake.

Christian F. Nunes [00:09:11]:
You know, it wasn't really you. People out there on social media and online don't know that. And we know that people have also experienced real-life stalking, harassment, and sexual violence because of deepfakes.

Andrea Powell [00:09:26]:
Exactly. And that's the thing. Some of the young women I've worked with have said they're just going about their day, sitting with their mom after church, having lunch, and a guy will approach them and say, oh, you're a porn star.

Christian F. Nunes [00:09:39]:
Mm.

Andrea Powell [00:09:39]:
Or a girl who's experienced that, who's still in high school, and suddenly she's kicked off the cheerleading squad.

Christian F. Nunes [00:09:46]:
Right.

Andrea Powell [00:09:47]:
Because she's considered amoral, or doesn't have the ethics that the cheerleading squad wants her to have. And that might sound like a small thing, but it's not a small thing, because it's not just taking away rights. It's destroying dreams and opportunities. Most individuals who go through this go offline. But where do we apply for jobs? Where do we interact with our friends? And it's self-expression: for content creators, for example, people whose actual job is to create content online, that's now greatly diminished, if not completely over. Oftentimes people think, oh, you know, it's just this situation where you can get over it. But if you're having your name placed on that video, or you're having your home address or the names of family members put on those images on those websites, first of all, not to get into the intent, but why is that happening? It's to humiliate, threaten, and terrorize that individual. One woman who's in our community, her ex-husband for a very, very long time pretended to be her, had fake porn sites in her name with her information, and was profiting off of that. Which, Christian, is sex trafficking.

Christian F. Nunes [00:10:59]:
It is. Absolutely.

Andrea Powell [00:11:01]:
Just because it's online doesn't diminish the harm. She lost her job; she has small children. What do you think it's like for her little boy to have other boys come up to him and say, your mom's a porn star?

Christian F. Nunes [00:11:12]:
Right.

Andrea Powell [00:11:13]:
So it impacts everyone around that individual. And I just think even those who are creating this content don't see the harm, especially these young boys who are finding free nudify apps and finding these websites online. Some of those nudify apps are listed as safe for four-year-olds. It's just right there in front of these kids. You're kind of handing a boy a digitally loaded gun. I do think this impacts everyone involved in a very negative way.

Christian F. Nunes [00:11:42]:
You know, and just to continue what you're saying, we're talking about adolescents who haven't yet come to a full understanding of impact or empathy, or who don't really understand the long-term damage that can come from these things, or are still learning them. They think it's just funny, you know. And part of it is that we've created a place where we excuse and we minimize, because what's really going to happen is the boys are going to say, well, he didn't know, while everyone automatically assumes the worst about the girl or the woman. And I think that's one of my biggest issues: we're always looking to victim-blame anytime there's anything involving sexual violence, and then we create an opportunity for justification for perpetrators, especially if they're male, when they do these things.

Andrea Powell [00:12:28]:
Oh, for sure. And you know, we're also afflicted by this concept of himpathy, which my friend, the philosopher Kate Manne, coined: the misguided sympathy for male perpetrators of sexual violence.

Christian F. Nunes [00:12:40]:
I like that.

Andrea Powell [00:12:41]:
Himpathy. She coined it in relation to the Kavanaugh hearings, and the Washington Post loved it. So we have a big problem with people desperately trying to believe and have empathy for those who create this violence while not believing survivors. Which is, well, the title of my book, Believe Me. But I don't mean believe me in the sweet sense of, that would be nice. I mean it literally. When you don't believe survivors, when law enforcement doesn't believe survivors, when tech platforms brush you off, that disbelief actually could take someone's life.

Christian F. Nunes [00:13:16]:
Yeah. And it's also a form of erasure. It's erasing their value, their worth, their existence in some ways. And we don't think about these things. So one of the things I wanted to talk about is: how quickly can an Internet-based sexual image or a deepfake actually go viral?

Andrea Powell [00:13:35]:
It can happen in just a day. You know, a high school boy creates this content of three or four girls in his class, and then they share it with their friends, who share it with some of their older brothers who are in college, and then one boy puts it on a porn site. For many of the survivors I've worked with, in a matter of weeks there are thousands of URLs. And what that means for the victims is that you have to create a digital rape kit of yourself to find all of that content. Then you have to go and lobby these platforms, and sometimes there are these things called zombie sites. Increasingly these sites are unmanned; the content's up there, but there's no one to talk to to get it down, especially when you're talking about adult content. And that's where Alecto comes in. Our goal, and what we're capable of doing, is that you can create a profile in Alecto with your face, only yours; you can't search anybody else's, and we don't own it. You get a token of your face, and you can find your images online and click which ones you want removed. We can then work with the platforms to get that content down. Now, we're not fully at scale yet. Right now we've got about 10,000 users, and we think we need to serve about a billion, so we've got work to do. But that being said, we want to make it so the victim, and I'm using that word strategically in this context, doesn't have to create their own digital rape kit and keep going back to the crime scene, watching their body be degraded sexually over and over and over again. My founder, Breeze Liu, and I worked tirelessly for months; we could not get the help we needed in the United States. We connected with my partners at Speezo, which is survivors in technology. So it's both about how quickly it can go viral, but also how quickly it can ruin a life. I think we have to stop thinking about online versus offline sexual violence. Sexual violence is sexual violence.

Christian F. Nunes [00:15:33]:
Absolutely. I absolutely agree that sexual violence is sexual violence. So, listeners, we're going to take a short break for some action now, and we'll be right back.

Christian F. Nunes [00:15:47]:
Listeners, did you know that March is Women's History Month? To help you celebrate, here are a few facts from our own history. NOW was founded in 1966, and at our second national conference, we adopted passage of the Equal Rights Amendment as a goal, an amendment that President Biden noted in January is the law of the land. Women's History Month shows us that the road to equality is a long one, and it is bumpy. We may feel hopeless as our rights are removed, but we cannot stop fighting. We cannot stop creating new history, a new history in which we achieve equality for all genders. For your Action NOW, head to now.org to read up on the Equal Rights Amendment and learn how you can help support equality for all. And now back to our show. We're back to this wonderful conversation with Andrea Powell. One of the things we talked about earlier was the fact that the vast majority, I believe 99%, of deepfake victims are women and girls. But there are also still deepfakes that target other groups, like LGBTQIA individuals, and sometimes young boys and men experience this too. I'm curious: when we see this happen to men and young boys, are they receiving the same justice as women and girls? What are you seeing?

Andrea Powell [00:17:30]:
So I will be honest up front that I have not met very many men and boys directly impacted specifically by deepfake abuse. Image-based sexual abuse, yes. But there's often fluidity across these abuses; it's not like the abuser says, you know what, I only do spy cams, or I only do deepfake abuse. They're kind of doing whatever. And for boys, I think one of the reasons I haven't met a lot of them, particularly boys who don't identify as LGBTQ, is the shame. And I share this even though it's a dark thing. Two years ago, I was working on a landscape report for my prior organization through the Reclaim Coalition to analyze specific trends around image-based sexual abuse. What is it? Where is it happening? All of that. Through that work, I found about 40 public cases of teenagers and young adults who had taken their lives as a result of image-based sexual abuse. I think the shame associated with this, and with all forms of sexual violence, really makes it hard for young men and boys to come forward.

Christian F. Nunes [00:18:33]:
Right.

Andrea Powell [00:18:34]:
You brought up the LGBTQ community, though. Just like with all other forms of sexual violence, those experiencing image-based sexual abuse or deepfake abuse who are in the LGBTQ community often have a much harder time getting support from law enforcement, being believed, and getting access to justice.

Christian F. Nunes [00:18:53]:
Yeah, you know, and I agree with you about the shame of it, because as I mentioned, my background is as a trauma therapist. I mean, everyone experiences shame, so I'm not going to say only men are experiencing shame. But I also notice that it digs into their sense of their own personal identity. And I'm not saying it's worse at all than what women experience, but I have noticed that it seems to be internalized more sometimes.

Andrea Powell [00:19:20]:
Yeah, I think in so many cases, particularly when you're talking about boys, they've really been indoctrinated into this tough-guy image, you know, and I see some of that changing. And speaking of that, and this is a little bit of a shift, but I think it's worth noting: when we think of deepfake abuse, we're often thinking of the bad actor, right? That one kid, that one creepy old dude. We're thinking of these disparate people. But deepfake abuse, with the 9,000 websites, the thousands and thousands of images, 99% of it sexual abuse of women and girls, is a culture. It is a culture of misogyny and hate against women. And what we're seeing is not just that bad actor, but the complicity of tech companies, of developers, of places that store the images, like Apple. And I'm not here to point out particular tech platforms, but all of this plays a role. The financiers. And then think about the people who download it, share it, do other things with it, dox the victim. One young woman who's an advocate in our space was on a TV show talking about her experience, and afterwards all of these online users spent hours finding her abuse content and re-uploading it as a joke, because she dared to be an advocate. Oh, and by the way, if you're an advocate or a public figure in any way, you are considered public domain, and you are not protected from this, because we currently don't have a federal law that criminalizes image-based sexual abuse or deepfake abuse. We have one piece of legislation, the Take It Down Act, that would change that, and I think it's very close to getting through the Senate again and hopefully getting through the House. But we need survivors to have options: civil options with the DEFIANCE Act, criminal options. We need better regulators and better regulatory bodies. Many countries, like Australia, South Africa, and South Korea, have regulatory bodies that really regulate tech companies, regulate policy, and regulate the ways in which this is addressed. But in the US, not only are most of the companies here, the vast majority of the demand for deepfake abuse is here. It's a Wild West up in here.

Christian F. Nunes [00:21:41]:
It is so true. It just boggles my mind when we say the reason we're not making these rules and regulations is that we want to make sure people have free speech. One, I didn't consent. Two, it's creating violence and harm. We've got to stop going to this free-speech argument; it's nonsense to me, and that's not how free speech was designed to be used. And I really like what you said earlier: we can't look at this as online abuse versus real abuse. It's all abuse. It's all real. And we have to stop trying to compartmentalize abuse in ways that make it easy for people to become desensitized to it.

Andrea Powell [00:22:23]:
And I think that's so vital for folks to understand. A lot of your listeners are on our team already and following this. But I urge you, when you hear people mocking even celebrities when it's deepfake abuse, say something, because just because you're Taylor Swift or Amber Heard or Rihanna or whoever, you didn't sign up for that. Taylor Swift didn't sign up for her name to be synonymous with deepfake porn. And the bodies used in the Taylor Swift 4chan challenge a year and a half ago were the bodies of sex trafficking victims; I was a victim advocate for 26 of those individuals last year. And the harm: losing jobs, the mental health toll, the toll on personal relationships, the lack of trust. One young woman said, I can't date, because on every date I go on, if he Googles my name, the first things he sees are things that aren't me, right? No one wants to go near me, or the people who do think I'm a sex worker. And to that point, even sex workers, and especially sex workers, are vulnerable to this crime. They too should not experience this. We have to be agnostic. You know, I make a joke, but I'm serious: even if Donald Trump were in a deepfake abuse scandal, I would still say no, he has a right not to experience this.

Christian F. Nunes [00:23:48]:
Right. Absolutely.

Andrea Powell [00:23:49]:
It shouldn't be political. It's even more than sexual violence, Christian. This is about our human rights.

Christian F. Nunes [00:23:56]:
Yes.

Andrea Powell [00:23:57]:
And where does all this lead? I think we need so much deeper empathy and education. You know, regulation is important, laws are important, but laws are only as good as those who implement them.

Christian F. Nunes [00:24:11]:
I would be remiss if I didn't also mention that when this happens to women of color and girls of color, the disbelief is even greater. There is a huge disbelief, because they're already hypersexualized just by existing. We can't forget that, depending on your intersectionality and your identities, these experiences can sometimes be so much harder for you.

Andrea Powell [00:24:34]:
For sure. Well, people are talking about how AI is revolutionizing society. I'd like to counter that and say we need a revolution of compassion.

Christian F. Nunes [00:24:44]:
Absolutely.

Andrea Powell [00:24:45]:
And we need digital bystanders. When you see something that doesn't look like consent, say your friend sends you a nude of his girlfriend, I'm pretty confident she didn't consent to that. If you take that and re-share it, you have just perpetuated sexual violence. The better response is to report it, or to say, hey man, that's not okay. Or if you see something where she doesn't look conscious in that image, whatever it is, or you know it's a deepfake, report it, and make that a part of the ethos of our online digital compassion.

Christian F. Nunes [00:25:20]:
We definitely have to move away from being bystanders to being defenders, you know, because right now we have so many silent bystanders, and/or henchmen who, like you said, engage in it. But we definitely have to move to a place of being defenders. Like you're saying, that's the only way we can fight against this and stop it. Before we go, once again, this season is about protecting women in hostile environments. Can you give us a couple of takeaways, things listeners could do to help make sure that they are also contributing to protecting women and girls?

Andrea Powell [00:25:52]:
I'm going to speed through three things really quick. One, if you have experienced, or you're concerned that you've experienced, any type of image-based sexual abuse, including deepfake abuse, please reach out to the Alecto Foundation at alectoai.com. There aren't that many of us yet; you will find me. You can also find me on LinkedIn and other places. We will respond. We will engage. Two, if you know someone who's been through this and they're suffering, tell them about this podcast, make sure they know about Alecto, and make sure they know they're not alone and that you believe them. Say those words: I believe you. It's very powerful. It's very healing. And three, if you see something that you think looks like deepfake abuse or image-based sexual abuse, report it to the platforms. If it looks like it involves a child, report it to the National Center for Missing & Exploited Children. You can also go to StopNCII.org. There are resources, but if you come to us, I will guide you in the right direction. And if you're listening to this, you're now part of the solution. So thank you.

Christian F. Nunes [00:27:02]:
You are part of the solution. That is a perfect way to end. Thank you so much, Andrea. I appreciate you. You know I'm always here to support, because like you said, we all have to be a part of this. And I thank you for joining us.

Bethany Brookshire [00:27:18]:
Thank you so much for joining us on the podcast this week as we discussed digital safety in a hostile world. If you think this topic is important, we would love it if you shared it with your friends. Please like and subscribe to the show, and send us your thoughts and questions at feminismnow@now.org. Head to now.org, that's N-O-W dot O-R-G, to read up on NOW's core issues and our approach to advancing women's equality. Together we can make a difference. I'm Bethany Brookshire. Thanks for listening, and we'll see you soon.