Civil Discourse

Nia and Aughie discuss the cases heard before the U.S. Supreme Court in February 2023 regarding the responsibility of Google and Twitter for the content published on their platforms.

What is Civil Discourse?

This podcast uses government documents to illuminate the workings of the American government, and offer context around the effects of government agencies in your everyday life.

Welcome to Civil Discourse. This podcast will use government documents to illuminate the workings of the American Government and offer context around the effects of government agencies in your everyday life. Now your hosts, Nia Rodgers, Public Affairs Librarian and Dr. John Aughenbaugh, Political Science Professor.

N. Rodgers: Hey, Aughie.

J. Aughenbaugh: Good morning, Nia. How are you?

N. Rodgers: I'm good. How are you?

J. Aughenbaugh: I'm good because once again, listeners, we're going to record a podcast episode about one of my favorite government institutions; the United States Supreme Court.

N. Rodgers: The SCOTUS.

J. Aughenbaugh: Because even when they don't really try, they end up in the news.

N. Rodgers: You know what? But this is not even an in-the-news episode. It is an episode we are going to release, weirdly, in the middle of other things, because that's how we do stuff sometimes, and because it is timely. But it is timely in the sense that they've just had the discussions. It is not timely in the sense that we will ever know what their opinion is, because they're never going to release any ever again.

J. Aughenbaugh: Let's pause that. The week that we are recording, the Supreme Court shocked pretty much everybody who follows the court when just in the middle of the week, they went ahead and announced three more decided cases.

N. Rodgers: Really?

J. Aughenbaugh: Yes.

N. Rodgers: They had had people shaking their fingers at them because they had been waiting so long.

J. Aughenbaugh: Yes. Then they just went ahead and dropped three on us.

N. Rodgers: Boom. Bye. They must have been listening to the podcast.

J. Aughenbaugh: Of course.

N. Rodgers: I think you and I should take credit for that. But this week, or rather, in February of 2023, our lovely justices, our nine not-so-tech-savvy justices, are hearing a couple of pretty nuanced cases about social media. Can you walk us through the cases?

J. Aughenbaugh: Yes. The first case is entitled Gonzalez v. Google. For those of you who don't know, YouTube is actually owned by Google. That becomes important in just a few moments. What was at issue in Gonzalez v. Google is whether Section 230 of the Communications Decency Act of 1996 effectively provides legal immunity, meaning somebody cannot be sued in court if computer services make targeted recommendations of information provided by another information content provider.

N. Rodgers: How that translates is the Recommended button in YouTube, the algorithm in YouTube that, when you watch a video, picks the next video it shows you. There's an algorithm that goes into what it thinks you're going to want to watch, because it is extremely important to YouTube that your eyeballs stay on their website. That's how they make money. They make money by you watching the ads and by selling the ad space under those videos, so they need you to stay on the site, so they will encourage you to watch the next video by giving you something that they think you want to watch.

J. Aughenbaugh: Yeah. The longer you stay on, the greater the number of viewers that they then can go ahead and say to advertisers, hey, we had 20 million Americans at this time of day and they were watching this. Don't you want to place ads in-between the videos?

N. Rodgers: Exactly, and that's where you get the idea of influencers, people whose shows get watched or people whose YouTube videos get watched in the millions: PewDiePie and Phil Day, and a variety of people. There's pretty serious competition to get to that level. You're trying to get to that level of people, which is fine, except that this case talks about that recommendation and how that recommendation can lead people down a path.

J. Aughenbaugh: Doing really bad stuff, and that's what this case is about.

N. Rodgers: Because what happens is the algorithm says, here, Aughie, you watched this film noir video. Maybe you would like this video about film noir. Then you watch that, and then it's like, okay, well, maybe you would like this video about villains. Well, maybe you would like this video about; and it takes you down a path. It's not like Aughie watches The Maltese Falcon and the next thing that comes up is an ISIS video. That's not how that works. There's a much more subtle way that the algorithm works. If Aughie didn't watch the next video, if he was like, I don't care how film noir is made, then it wouldn't offer that again.

J. Aughenbaugh: Or I don't care about the real life cases that led to; yes

N. Rodgers: To crime. Then he would not get those. Once he skips those recommendations two or three times, then the algorithm is like, this guy doesn't actually like true crime, he likes crime in film, and it would go back to giving him different offerings. For the justices, in case they were wondering, that is basically how YouTube works.
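The skip-and-decay behavior Nia describes can be sketched in a few lines of code. To be clear, this is a hypothetical toy model, not YouTube's actual system, which is proprietary and vastly more complex; the topic names, weights, boost, and threshold values here are all invented for illustration:

```python
# Toy sketch of skip-based recommendation decay (NOT YouTube's real algorithm).
# Repeatedly skipping a topic's suggestions pushes that topic out of the mix.

SKIP_DECAY = 0.5       # halve a topic's weight each time its suggestion is skipped
WATCH_BOOST = 1.25     # modestly boost a topic each time a suggestion is watched
DROP_THRESHOLD = 0.15  # stop recommending topics whose weight falls below this

def recommend(weights):
    """Suggest the topic with the highest remaining weight, or None."""
    viable = {t: w for t, w in weights.items() if w >= DROP_THRESHOLD}
    return max(viable, key=viable.get) if viable else None

def update(weights, topic, watched):
    """Adjust a topic's weight based on whether the viewer took the suggestion."""
    weights[topic] *= WATCH_BOOST if watched else SKIP_DECAY
    return weights

# A viewer who likes film noir but keeps skipping true-crime suggestions:
weights = {"film noir": 1.0, "true crime": 0.9}
for _ in range(3):  # three skipped true-crime suggestions
    weights = update(weights, "true crime", watched=False)

print(recommend(weights))  # -> film noir
```

The point of the sketch is the feedback loop: every watch strengthens a topic and every skip weakens it, so the recommendation mix drifts toward whatever keeps the viewer watching, which is exactly the dynamic at issue in Gonzalez v. Google.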

J. Aughenbaugh: Yes. What this case is about is that the family of Nohemi Gonzalez, a 23-year-old woman from California who was shot dead during a 2015 rampage by Islamist militants in Paris.

N. Rodgers: That attack killed 130 people.

J. Aughenbaugh: A hundred and thirty people, yes.

N. Rodgers: Truly bad guys.

J. Aughenbaugh: Yes. In 2016, her mother and stepfather and other relatives accused YouTube of providing ''material support to the Islamic State'' because of the recommendations of the group's videos, the Islamic State's videos, to certain users based on, as Nia just explained, the algorithmic predictions about their interests. The Gonzalez family was basically saying the recommendations helped spread the Islamic State's message and recruit jihadist fighters.

N. Rodgers: Which is true. But Google's argument is that we don't control the behaviors of individuals who watch these videos.

J. Aughenbaugh: Yes. Now, the Gonzalez family sued under a law called the US Anti-Terrorism Act, which lets Americans recover damages related to ''an act of international terrorism''. Unfortunately for the Gonzalez family, in 2021 the Ninth Circuit Court of Appeals dismissed the case. The reason the Ninth Circuit Court of Appeals dismissed the case was its interpretation of Section 230 of the Communications Decency Act, which most federal courts in this country have long interpreted as providing pretty much universal legal immunity for the companies. The logic is, these companies are merely the platforms. They are merely publishers, and they can't be held responsible for the content that other people, other groups, put on the platform.

N. Rodgers: Can I argue with you just for a second?

J. Aughenbaugh: Well, hold on before you get to the argument.

N. Rodgers: No, I wanted to argue with something you just said.

J. Aughenbaugh: Go ahead.

N. Rodgers: They're not publishers, right?

J. Aughenbaugh: No. The argument is this; it's much like a newspaper gets published. The newspaper can't be held liable for content provided by opinion editorial writers. They are just merely publishing. You can't go ahead and sue the publisher, you can only go ahead and sue the writer. Okay?

N. Rodgers: Okay.

J. Aughenbaugh: If what they wrote causes you to then engage in criminal behavior.

N. Rodgers: If you got tax advice from an opinion column in the New York Times.

J. Aughenbaugh: Or the Washington Post. For instance, one of my favorite columns that I read in the Post is from Michelle Singletary. It's a money matters column. I can't sue the Washington Post if I follow Singletary's advice and it then leads the IRS to go ahead and say that I was trying to avoid paying my taxes.

N. Rodgers: Although wisely, she puts disclaimers on the bottom of her column, that says, you should seek specific financial advice for your situation.

J. Aughenbaugh: You know the column that I'm talking about, because you've read it too. But they are, in the language of the First Amendment, merely publishers. They're not telling you to watch this, they're not telling you how to respond to this. They're just merely providing the what? The platform.

N. Rodgers: The video to start with.

J. Aughenbaugh: No.

N. Rodgers: No. The platform. Somebody else made the video. They are providing the conduit through which.

J. Aughenbaugh: Yes, that's all they are. The reason why, by the way, listeners, if you want to know why Congress passed this law: in the early 90s, there were Internet sites that were getting sued because of content that other people provided on their sites that led to stalking and criminal violence, etc. Many Internet companies made this argument, rather persuasively, to Congress: if you don't give us legal immunity, you're basically going to stop this new technology in its tracks. Because none of us are going to want to go ahead and provide these social media platforms if we're going to spend all of our time in court defending ourselves over content we didn't write, we didn't create, we didn't film. We're just providing the platform.

N. Rodgers: We're as innocent as newly fallen snowflakes. Except the argument to that is, you actively make money from your algorithmic provisions. Like, they make money when you click on the next video. It is in their financial interest to entice, encourage you to look at the next video. That's the counter-argument to that. Isn't that the basis of the lawsuit? That Google is saying free speech?

J. Aughenbaugh: No, but the Gonzalez family isn't even focusing on the money. The Gonzalez family is focusing on the first thing that you explained in this episode. You're not merely providing the platform. You are actually directing people to content. Your hands aren't clean. You're actually engaged in.

N. Rodgers: I see. The algorithm itself is the problem.

J. Aughenbaugh: That the algorithm itself is the problem, because you're making recommendations that are leading people who are already, shall we say, predisposed to want to engage in this terrorist behavior. Your hands aren't clean.

N. Rodgers: Google doesn't want to change the algorithm because people will use it less, because their recommendations are not being tailored to them.

J. Aughenbaugh: That's right. Yes.

N. Rodgers: Because people are like, don't show me videos I don't care about, only show me stuff I want to look at.

J. Aughenbaugh: Yes.

N. Rodgers: I guess that's how people are. And because there are, what, a billion videos on YouTube? I don't know what the real number is, but it's probably uncountable at this point. Google's like, we have to use an algorithm, and they're like, yes, but your algorithm is making people criminal, like it's encouraging criminal behavior.

J. Aughenbaugh: Your hands are not clean.

N. Rodgers: Which is true anyway.

J. Aughenbaugh: But in the language of criminal law, you're involved in the conspiracy because these individuals wouldn't have found these videos if not for your algorithm.

N. Rodgers: Without your help. I see.

J. Aughenbaugh: Now, in part, Google and a whole bunch of amicus curiae briefs submitted by other technology companies basically make this argument: not only would it hurt the profitability of these Internet companies, it could also threaten other, shall we say, services that you find on the Internet, like search engines, job listings, product reviews, and displays of new songs and other entertainment that are not even remotely controversial. Because again, think about this. Let's just say, for instance, you provide a product review on a site. Give me a product review site. What is it? Yelp.

N. Rodgers: Amazon or Yelp.

J. Aughenbaugh: Let's say your product review is so well written and catches so much attention that now, all of a sudden, a whole bunch of people are no longer using that particular business. That discourages businesses from advertising their services on the Internet.

N. Rodgers: Well, from a tiny little podcast point of view, if you want people to find your podcast, you want it to be searchable in the podcast providers, Apple and Transistor and all those people. You want people to be able to search Google Play and find your podcast. What happens is you get that recommendation at the bottom that says, do you like this podcast? You might also like. And it would kill that, which means it would kill small entrepreneurs with tiny little podcasts who want to get known. I can see, with Google, this is tough, because there are both sides. This is going to be tough for the justices.

J. Aughenbaugh: Well, it is going to be tough. Moreover, I think the justices are very much aware of this. You have individuals, you have politicians from across the ideological spectrum, who have suggested that Congress should revise Section 230 of the Communications Decency Act. I mean, think about this, Nia: our last two presidents have publicly stated that Section 230 of this law should be revised.

N. Rodgers: I mean, we can't get Joe Biden and Donald Trump to agree that air is breathable.

J. Aughenbaugh: Or think about this. Interest groups representing civil rights, gun control. Women's rights groups have all told the justices that the platforms amplify extremism and hate speech.

N. Rodgers: Well, and Gamergate. Well, anyway that's a whole separate issue.

J. Aughenbaugh: Twenty six states submitted an amicus curiae brief that said social media forums do not just publish user content anymore, they actively exploit it. I mean, you're talking about a wide array of groups who normally.

N. Rodgers: Wouldn't agree on anything.

J. Aughenbaugh: They wouldn't sit down at the table if there is a free buffet of food. They'd be like.

N. Rodgers: There's a catch and I'm not doing it.

J. Aughenbaugh: I'm waiting until this group gets out of line before I'll actually sidle up to the buffet.

J. Aughenbaugh: One law professor at Michigan State, Adam Candeub, said it perfectly: Section 230 is basically Internet companies' get-out-of-jail-free card, because they have legal immunity.

N. Rodgers: But if you don't hold them legally responsible, at what point will they change the algorithm so that it doesn't encourage dangerous behavior? If I don't give you a financial incentive to change your behavior as a company, you will not change it. That's how companies work.

J. Aughenbaugh: At the time we recorded this podcast, the Supreme Court had already heard oral arguments in this case, Nia. By the way, the oral arguments just in the Gonzalez case were nearly three hours in length. Now, listeners, for those of you who have not listened to our previous Summer of SCOTUS podcast, you may not know this, but typically oral argument in any case in front of the Supreme Court is limited to an hour. The justices and the attorneys went back and forth for nearly three hours just on this one case.

N. Rodgers: Well, it's complicated.

J. Aughenbaugh: It is complicated.

N. Rodgers: This is a complicated question, because part of what you believe in, if you believe in American capitalism, is this idea of companies being able to grow and expand and do all things like that, and you want reasonable regulation, but you also want them to be able to do those things. Think about how many people Google employs. Think about all of that stuff; all of those things play into that question on the one side. But on the other side, you also have this: at which point is big tech evil tech, and you have to rein it in and regulate it in some way, so that you say to people, it's unacceptable for your

J. Aughenbaugh: Company to engage in this practice that can lead to this potential harm. Think about every time the American economy shifts from one dominant industry to another: farming to industrialization, industrialization to a service, or postmodern, economy. Now we're into technology. Think about how big a share technology has of the American GDP every year. Some estimates are that it's well over 60-65 percent.

N. Rodgers: Yeah.

J. Aughenbaugh: Mind you, this has all occurred. I know some of our younger listeners are just like.

N. Rodgers: My whole lifetime. No. Well, your whole lifetime, but not ours.

J. Aughenbaugh: Yeah. Well, there are easily two or three generations of Americans who are just like, technology? What technology? I still use a rotary phone. Computers?

N. Rodgers: That's the work of the devil right there.

J. Aughenbaugh: It was pretty clear, reading the transcripts of the oral arguments, the justices are really wary. They're cautious. They're circumspect about whether it should be the court that gets involved, and whether this is something that Congress, the legislative branch, should do. Notwithstanding the fact that many of the justices were openly sympathetic to the Gonzalez family.

N. Rodgers: Of course, because they lost their daughter. But that's not what's in question here. What's in question here is not, did bad guys hurt your daughter? Yes, they did. They murdered your daughter, and those people should be held to account for an entirely different crime, the crime of murder. That's not in question here. The question here is much more complicated when you get into this idea that if you rely on technology and you want it to work, there are certain ways that it works, and the negative side of that can be that it leads people into this darker place. One of the things, I know it's not directly connected, but hear me out on this. One of the things that the Virginia lottery has on the bottom of every ticket is, please gamble responsibly. Then it has a phone number. If you're a person who struggles with gambling responsibly, I would argue that by the time you've bought the ticket, the deed is already done. But they're trying to regulate the path down which you could find yourself gambling away your rent money or whatever. That's the thing. It's a hard thing to come up with: how do you regulate? Some people can get pleasure from this and do fine and not be homicidal maniacs, because they don't turn into those folks. They don't go down that path. Then other people do, and part of that is what you were saying, your predisposition. It's complicated. I feel a little bit sorry for the justices, because that's a tough question.

J. Aughenbaugh: Now, the following day was the second case. The second case is Twitter v. Taamneh. What happened here was an ISIS-linked attacker killed Nawras Alassaf and 30 other people in an Istanbul nightclub in 2017. Alassaf's family sued Twitter, Facebook, and Google, alleging that the companies contributed to the growth of ISIS and that they could have taken more aggressive enforcement action to combat pro-ISIS content on their social media platforms. Now, this case is less about Section 230 of the Communications Decency Act. Instead, Alassaf's family brought the lawsuit under the Justice Against Sponsors of Terrorism Act, JASTA. This is a law, enacted in 2016, that allows civil suits against entities that aid and abet terrorism. Alassaf's family is basically saying Twitter et al. aided and abetted terrorism.

N. Rodgers: Which, I mean, technically? But wow, there again is nuance. What we're trying to get across, I think, with both of these cases is: it's not as easy as, Twitter should have stepped on ISIS's neck when they were making pro-ISIS tweets, join us and help save the world or whatever it was.

J. Aughenbaugh: Prohibited the use of their platform for these tweets. But Congress passed JASTA to allow families of the victims of 9/11 to sue Saudi Arabia. When Congress passed the law, they actually overrode President Obama's veto of the law.

N. Rodgers: He vetoed what, two things in his entire presidency, or maybe four? He didn't have very many vetoes, but this was one of them, with him saying, no, this is a terrible idea.

J. Aughenbaugh: Yeah, and this was the only time that Congress successfully overrode one of his vetoes.

N. Rodgers: Why did he say it was a bad idea?

J. Aughenbaugh: He said, and this is a direct quote, "enactment of JASTA could encourage foreign governments to act reciprocally and allow their domestic courts to exercise jurisdiction over the United States or US officials, including our men and women in uniform for allegedly causing injuries overseas via United States support to third parties."

N. Rodgers: If you're living in a big glass house.

J. Aughenbaugh: Yes.

N. Rodgers: You should probably be really careful about picking up a rock. Yes, after 9/11, people wanted to punish Saudi Arabia because, was it 15 of the 19 attackers were Saudi citizens?

J. Aughenbaugh: Yes.

N. Rodgers: But when you do that, you have to be careful, because Americans do bad stuff overseas all the time. Do you really want citizens in other countries to be able to sue the United States over the actions of their citizens? Just be careful about that. That's a big can of worms, and the president was like, this is not a good idea, and Congress overrode him anyway.

J. Aughenbaugh: Now, in this case, Twitter lost when the Ninth Circuit Court of Appeals expanded the scope of the law's liability. This is a case where Twitter lost, so Twitter is bringing the appeal to the Supreme Court. There are a whole bunch of former State Department lawyers and former federal judges who filed, again, another friend of the court brief with the United States Supreme Court, and they basically said: Supreme Court, you need to overturn what the Ninth Circuit Court of Appeals ruled. Because if you don't, there are a number of countries in the world that already have in place laws in their own country that say, if the United States or any other country allows their citizens or groups from their nations to be sued in their courts, Americans, including American government officials, can be sued in their courts. Nations like, for instance, Great Britain and Pakistan. Now, those nations are ''our allies right now,'' but how long are they going to be our allies if we're suing their citizens or their government officials in our courts and forcing them to pay damages? They're not going to be our allies very much longer, because that's how foreign relations work. If you're willing to give us a pass, we're willing to give you a pass. But if you're not willing to give us a pass, we're going after you guys.

N. Rodgers: It's not a door you want to open.

J. Aughenbaugh: Yes. Again, you could tell during the oral arguments, and here I read the oral arguments, the justices were just like, okay, first of all, what does aid and abet truly mean? Because again, that's a common phrase in criminal statutes. But this is not a criminal statute; this is a civil lawsuit that allows US citizens to sue people from other countries in federal court for civil damages. Are Twitter, Facebook, and Google, when they have their algorithms that make recommendations, truly aiding and abetting? Or are they just money-seeking corporations that are trying to get a whole bunch of people to their social media platforms? Now you're talking about intent.

N. Rodgers: Exactly.

J. Aughenbaugh: Yes.

N. Rodgers: JASTA needs to go.

J. Aughenbaugh: You really don't like that law.

N. Rodgers: I really don't like JASTA, and here's why. I don't want America to be sued by randos in other countries who have been harmed by American policies or American stuff, and I don't think holding the whole government responsible works. They want to sue Saudi Arabia; they should sue the families of the 19 people who committed the atrocities. We know who those 19 people are. I don't know. I don't like the idea, because that is such a dangerous way to engage in international relations. I don't know, that just seems like a mistake to me.

J. Aughenbaugh: But then let's throw in an intervening variable. What about all those anti-terrorism scholars and groups that say, if you don't hold terrorist groups and those that support them accountable, then you'll never change the behavior?

N. Rodgers: I know, that's the argument about Iran and Hezbollah: that you should sue Iran for Hezbollah's atrocities, because then it will make Iran stop supporting them. I understand that intellectually. I just think, and again, it comes back to nuance, it's very complicated.

J. Aughenbaugh: It is extremely complicated.

N. Rodgers: All of this is very complicated, and when you pull on one thread, you lose the arm of your sweater on the other side. Like, it's all jacked up because.

J. Aughenbaugh: Your favorite coat all of a sudden no longer fits you.

N. Rodgers: Exactly.

J. Aughenbaugh: Etc. Because I was reading some of the amicus curiae briefs, and the one from the Anti-Defamation League was very explicit. It just came out and said social media is a ''dangerous tool for terrorists if the platforms are not regulated by the federal government.''

N. Rodgers: Can I say something here that is not going to be popular? But I believe it to be true. These atrocities are terrible. We're talking about 160, 168 lives lost in these two atrocities.

J. Aughenbaugh: Yes.

N. Rodgers: But I put it to you that a significantly bigger fish to fry with Twitter is stopping teenage girls from bullying each other into anorexia, suicide, all the other things, and that's happening on a much larger scale than the ISIS attacks. It's easy to get angry at ISIS, and it's easy to get angry at terrorism, but domestic terrorism, which is what bullying is, it's a form of that, is a significantly larger problem for a much larger group of people, and we're not addressing that. There's a part of me that's like, if you want to get mad at Twitter, let me give you reasons to get mad at Twitter. This may be a reason, but there are bigger reasons to be mad.

J. Aughenbaugh: What you're pointing out, Nia, is the Internet is effectively at roughly 30-plus years, 30, 32, 33 years. Now we have enough time to see the various policy issues that have arisen with this new technology and economy. Now we have to come to grips with how to respond to those negative externalities.

N. Rodgers: Because it started out as the Wild West of the Internet, where anybody posted anything anywhere; there was all kinds of stuff. And now we're starting to go, that's not good for people's mental, physical, and social health. But how do you walk that line between that and Aughie's favorite thing in the world, which is free speech? Aughie wants to be able to drop the F-bomb in his classes when it is necessary to make a point. Saying to him, you can't do that, stifles his creative endeavor.

J. Aughenbaugh: My ability to go ahead and occasionally reach young minds who have tuned me out five minutes earlier.

N. Rodgers: Then they go, whoa, did he just say, I'm back to paying attention.

J. Aughenbaugh: Yeah, I need to pay attention. The Google case is, again, different than the Twitter case; both of them deal with social media. But the Google case really does highlight free speech versus protecting the collective. And you can't ignore the fact that these Internet companies have made a ton of money. But I find it rather remarkable when you get both liberals and conservatives, for different reasons, who now all of a sudden are like.

N. Rodgers: This is a terrible idea.

J. Aughenbaugh: This is a terrible idea.

N. Rodgers: We got to fix this Internet thing. The olds have finally decided, hey, we got to fix this Internet thing.

J. Aughenbaugh: But even with the Twitter case, the Twitter case actually highlights a tension between two different public policies. You want to give the victims of terrorism an opportunity to hold terrorists accountable. On the other hand, do you really want to hamstring US foreign policy initiatives, because we're afraid that if we send our men and women overseas to help an ally or to defuse a situation, they may be held accountable in another nation's courts?

N. Rodgers: Right, and endangered potentially. Also, aiding and abetting. I'm with the justices that aiding and abetting, along with arbitrary and capricious, are phrases that are very hard to define. Aiding and abetting implies that Twitter supports ISIS. I think Twitter's argument here is, no, we don't support anything that doesn't make us money. Like, what they should say is, we're purely capitalist. We will support you if you make us money. It doesn't matter what part of the political spectrum you are on. One of the things that is often brought out about the Internet is that it suppresses conservative voices.

J. Aughenbaugh: Yes.

N. Rodgers: That's a common refrain and a common argument that you hear. And a part of me wants to say, yeah, because there's no money in conservative voices.

J. Aughenbaugh: Yeah.

N. Rodgers: The money is in people who are on fire.

J. Aughenbaugh: Yes.

N. Rodgers: Right. Moderates don't make it anywhere, because moderates don't make any money. If you're like us and you're just middle of the road and you could go either way on the topics, nobody is going to sell to us.

J. Aughenbaugh: We're not the target audience of much of the Internet.

N. Rodgers: Exactly. For much of the Internet, the target audience is angry people, either on one side or the other. They're the ones who are bringing the money into the Internet. The moderates who are standing in the middle going, oh, it could really go either way, nobody cares about them. You're not even remotely interesting to us, because you're not making money. And just to put this out there, that's why thumbnails have ridiculous titles in them. YouTube videos have ridiculous titles because it inflames people's emotions, and they are more likely to watch.

J. Aughenbaugh: Yes.

N. Rodgers: If every video on YouTube was titled moderate discussion of X topic, nobody would ever look at anything on YouTube.

J. Aughenbaugh: Then the next question is Nia.

N. Rodgers: Nobody would ever read Twitter if Twitter was all moderate people going, well, it's a nuanced argument.

J. Aughenbaugh: To get there, let's just say, for instance, a critical mass in Congress does arise to address Section 230 of the Communications Decency Act, and a critical mass arises to perhaps tighten up the language of JASTA. Then it begs the question: should we be concerned about what we wish for?

N. Rodgers: Yes. That's the other part of that is, okay, well, we're going to regulate it out of existence or we're going to regulate it in such a way that nobody gets to say anything on the Internet.

J. Aughenbaugh: Yes.

N. Rodgers: That hurts anybody's feelings. So nobody says anything.

J. Aughenbaugh: Yes. Or, before you gain access to social media platforms.

N. Rodgers: You have to have a battery of psychological tests.

J. Aughenbaugh: Yes. Or you have to go ahead and demonstrate that, as far as local law enforcement is concerned, you're not going to engage in X behavior after watching these videos, right?

N. Rodgers: Right. Or the reason that they use an algorithm is because they don't have enough people.

J. Aughenbaugh: Yes.

N. Rodgers: To okay videos. Can you imagine how much the Internet would slow down if every single thing that went onto the Internet had to be vetted.

J. Aughenbaugh: Who's doing the vetting?

N. Rodgers: Nothing would ever get published.

J. Aughenbaugh: Yes, and it begs the question: who does the vetting?

N. Rodgers: Then who watches the watchers? Exactly.

J. Aughenbaugh: Because that's the conservative critique, for instance, of Twitter and other social media platforms. They are populated with a whole bunch of well-paid, very educated liberals.

N. Rodgers: Who hate us, and don't want our ideas to get out there.

J. Aughenbaugh: Well, that's right.

N. Rodgers: That's a common refrain on the conservative side.

J. Aughenbaugh: Then I've also read a number of law review articles that have begun to question the extent to which the federal government, particularly in the last two presidential administrations, has tried to influence the content regimes of social media platforms.

N. Rodgers: Well, now you've got Elon Musk, who is raising his own tweets to the top of the feed because he can, because he owns the company.

J. Aughenbaugh: But now you're getting the release of hundreds of pages of Twitter documents demonstrating that at least the Biden administration, if not the Trump administration before it, tried to influence what Twitter would allow in regards to comments about the federal government's COVID-19 response.

N. Rodgers: Yes.

J. Aughenbaugh: Back to your very recent comment. Who's watching the watchers?

N. Rodgers: If we could trust that that upswelling in Congress would actually translate to oversight, it would be fine. But that's not what they're going to do. What they're going to do is change the law and then proceed to not do oversight, because that's been their modus operandi here for quite a while. She said with some irritation.

J. Aughenbaugh: Well, quite a bit of irritation. In this case, I'm irritated, right? I mean, think about this, Nia. I know we're going to wrap this up pretty quickly, but at one point the Biden administration's Department of Homeland Security contemplated creating a disinformation board as part of Homeland Security. Remember, we talked about it off the recording, and I was just like, really?

N. Rodgers: Well, and be careful naming something a "disinformation board," because that implies that you're going to put out disinformation.

J. Aughenbaugh: Instead of reviewing what you believe is disinformation on social media.

N. Rodgers: Wait, if you're trying to quell people telling falsehoods, then you'd have to quell people ever talking to each other.

J. Aughenbaugh: Sure.

N. Rodgers: Because as long as there has been.

J. Aughenbaugh: Two people.

N. Rodgers: Two people, one of them has told a falsehood. Does this dress make me look fat? No, honey, it doesn't. That goes back to Adam's fall.

J. Aughenbaugh: Adam and Eve, the story in the Bible.

N. Rodgers: Exactly. Does this leaf make me look fat? No.

J. Aughenbaugh: Take a bite of this apple.

N. Rodgers: You're going to love it. Exactly. That's part of it, but also, I think something that we have to keep in mind is that this tension is always going to be there between how we regulate and have oversight while still allowing people the freedom to voice themselves and to be heard. You don't kill an idea like ISIS by putting it in the dark. You kill an idea like ISIS by showing that it doesn't work, that it doesn't win hearts and minds, that it doesn't do what it says it's going to do. That's it.

J. Aughenbaugh: Or the cost of ISIS.

N. Rodgers: Is too high yeah.

J. Aughenbaugh: Is unacceptable. You don't do that by putting it in the dark, I would argue.

N. Rodgers: The Roberts Court has it right. It's not up to the courts.

J. Aughenbaugh: Yes.

N. Rodgers: This should be the legislative, this should be the people's branch speaking for the people and saying, we're going to have better regulation of these things that we now perceive as the common good. We perceive the Internet as the common good. There are people who in the United States now believe they have a right to the Internet.

J. Aughenbaugh: Yes, they do.

N. Rodgers: They cannot be persuaded that they do not, that the Internet is a service that you must purchase. Oh no, it is now the way life works. People will say, oh, you have to apply for a job online, you connect with people online, you do all these things online. If it is a common good, then it is up to the legislative branch to oversee it and make sure that it's fairly distributed and fairly used.

J. Aughenbaugh: Yeah. Because what you've got on one side is a collective good. On the other side is protecting the collective. If we're actually concerned about, for instance, the mental health of our young people.

N. Rodgers: Which we should be and we're going to talk about that in a future episode.

J. Aughenbaugh: Okay, then the people's elected representatives need to step in.

N. Rodgers: Well, it's been interesting, Aughie, thank you so much.

J. Aughenbaugh: Yeah, it's complicated. Again, I know listeners are probably tired of me saying this is complicated, but just reading the oral argument transcripts really highlighted how difficult these cases truly are.

N. Rodgers: And how the justices are also trying to say, we're not tech giants, we're justices. We're going to limit our scope here.

J. Aughenbaugh: Yes.

N. Rodgers: Thanks Aughie, I appreciate it.

J. Aughenbaugh: Thank you, Nia.

You've been listening to Civil Discourse, brought to you by VCU Libraries. Opinions expressed are solely the speaker's own and do not reflect the views or opinions of VCU or VCU Libraries. Special thanks to the Workshop for technical assistance. Music by Isaak Hopson. Find more information at guides.library.vcu.edu/discourse. As always, no documents were harmed in the making of this podcast.