In an uncertain world where AI technology is disconnecting us more and more each day, we are dreaming of a world where technology connects us. The premise of this podcast is simple: let's talk to people and imagine the most hopeful and idealistic futures for technologies that connect and do good in the world.
So we're on the line today with Casey Fiesler. Casey is an associate professor at the University of Colorado Boulder in the Department of Information Science and researches and teaches in the areas of technology ethics, internet law and policy, and online communities.
She also is a social media superstar. Casey, I think this is the third time that I've interviewed you for some sort of podcast project. So welcome back to my world of podcasting. It's great to have you.
Casey Fiesler (04:22.957)
Well, thank you, always a pleasure.
AI-tocracy (04:26.113)
And today, what we wanted to do and why we're talking today is because of various things that are going on in funding and academia today, but also because of a recent grant that you had canceled that we wanted to just tell the story of and broadcast some of what is really going on for academics and for groups who are doing really important research. And so we were thinking we could just dive right in and ask you, what's going on? What happened?
Casey Fiesler (05:00.109)
Sure. Well, coincidentally, given the general topic of your podcast, this grant that I had was on AI education. It actually was part of an explicit NSF call that came out, I think, early last year around AI workforce development,
through a Dear Colleague Letter; it was called the EducateAI Initiative. And there were a lot of things that were focused on K-12 and undergraduate AI education. I had a grant that was very small in the scheme of things, which was focused on putting together
interdisciplinary teams of undergraduate students in order to create accurate, fun educational content about AI for social media. And in addition to that, to do some research, sort of needs assessment for AI education for young people. Part of the motivation behind this was that
a lot of young people, especially teenagers, do a lot of informal learning on social media. And this becomes really important when, you know, in the United States at least, a lot of K-12 schools, high schools, still don't have any CS classes. And so it's not necessarily clear where students would learn about
AI from a sort of computing perspective. And so, you know, if they're doing a lot of informal learning on social media, how do we make sure that there's good content?
AI-tocracy (07:01.323)
For informal learning, I guess, what are the potential dangers of that? In terms of motivating the work, what is at risk, or what is maybe the problem statement that you were trying to solve?
Casey Fiesler (07:13.517)
Yeah, I mean, you know, there's a lot of hype around AI on social media. I think there's a lot of overstatement about AI and its capabilities in sort of both directions, you know, whether you're talking about how AI is this completely evil thing that's going to destroy the world or AI is going to change everything tomorrow.
You know, neither of these things is entirely accurate. There's a lot of, I would say, sort of myth-making around AI on social media, and not a ton of what I would call direct educational content. There's a lot of how do you use AI, less about how does it work and less about its limitations.
AI-tocracy (08:13.519)
And I think we're living into some of the hype cycle. We had Emily Bender and Alex Hanna on last week, well, still this week, it's still launch week. And we were talking a lot in that conversation.
Casey Fiesler (08:23.287)
Yeah.
AI-tocracy (08:28.921)
What I'm really interested in, as maybe one of the first generations that grew up on the internet, is how we play into that hype for folks who are growing up, for teens and youth. And it sounds like your work was starting to unpack that, or the plan was for that work to start unpacking that and help support youth in getting the information that they need. What happened?
Casey Fiesler (08:52.237)
So about three weeks ago, I got an email near the end of the day on a Friday informing me that this grant had been terminated by NSF. Actually, first, I got a text message from our department chair that said, Casey, I'm so sorry. I just saw the email. Let me know if you want to talk. I didn't know what it was about.
AI-tocracy (09:20.111)
What a way to find out.
Casey Fiesler (09:22.317)
Yeah, I was actually traveling. I was visiting UC Irvine, and I got that text message when I was on my way to dinner with some folks. So at least I had, you know, some other professors to talk it over with over a glass of wine at dinner. But ironically, I had literally just been telling someone about this grant, like
AI-tocracy (09:41.327)
Right?
Casey Fiesler (09:49.998)
literally 15 minutes before I got this email. And the email came from my university grants office. I was not contacted by NSF at all. They sent a list of grants to the university that had been terminated. Mine was one of 11 that were terminated on that particular day.
AI-tocracy (10:13.519)
When we're thinking about grants, for folks who are maybe not aware of how academic grants work or how much money is involved: this is the NSF, so the National Science Foundation, and these grants, these 11 grants, were probably worth a lot. Like that's a lot of funding, I imagine.
Casey Fiesler (10:37.183)
Yeah, I mean, if I had to guess... So my grant was a type of NSF grant called an EAGER, which is an acronym. The first word is exploratory, but I can't remember what it stands for. Yeah, exactly. They're sort of intended to be
AI-tocracy (10:49.647)
And then there's four other letters that work in there
Casey Fiesler (10:57.963)
maybe a little bit outside the box, or high risk, high reward, but smaller, you know, kinds of things. So this grant was only about $250,000 total, including overhead, and it was for two years. And just to give you a sense of the kinds of things that, you know, an NSF grant might pay for in this case, so as I mentioned,
so this was about creating social media content, doing a sort of needs assessment, but also a big component of it was the idea of putting together these groups of students with different types of skills and knowledge. So the idea was that there would be communication and media production students who would be really interested in content creation and thinking about how to communicate things, and also information science and computer science students who know about AI. And the idea was,
The comm students will learn about AI, the computer science students will learn how to communicate, the world will learn about AI, right? And so a big chunk of the money was actually going to go to stipends for the students who were participating in this program. Another component of the money would have gone to a PhD research assistant to help run the program.
And honestly, between those two things, that was the biggest chunk. There was a little bit of travel money as needed, a little bit of equipment money if we wanted to get a camera, and a little bit of summer salary for me too. But it really wasn't very much. And I'd spent
almost none of it. I got the grant partway through fall semester, so it was too late to put a PhD student on it for that year. And so what I'd actually been working on was some of the background stuff, thinking about what kinds of content, and I'd been developing a survey for K-12 teachers. And then for the past...
Casey Fiesler (13:11.351)
Well, at that point, for the past five weeks, I had been running a small pilot program with a small group of information science majors. We had been meeting once a week. We'd been brainstorming. They'd been drafting content. We were sort of trying this out to see how it worked. I hadn't even paid them yet because they were being paid at the end of the semester. And so when the grant got taken away, I actually couldn't use it to pay these students.
And so thankfully, CMCI, the college, stepped in and paid their stipends. But I can't use any of the money for anything else. So I basically get a pay cut, because I'm not getting summer salary even though I've been working on this grant for six months. I don't have the money for stipends so that students can do it next year, even though I know there's been some bridge funding proposed in various places that I could apply for. But part of the problem as well is that even if I had the money for stipends for students, without being able to have a research assistant to help me run the thing, I would basically have to do it like I was teaching another class, you know. And so, yeah.
AI-tocracy (14:33.337)
So it sounds like the money predominantly, and I just wanna highlight this point, I'm just underlining this point, the money was going to students, right? Like the money was supporting students to do important work in the world. So I guess my question, well, one more kind of fundamental context question, is for folks, again, who don't really know how grants
Casey Fiesler (14:37.686)
Yeah, yeah, yeah.
Yes.
AI-tocracy (15:00.885)
work, and especially the timeline of grants. Really I'm getting at how much work goes into creating a grant, and the research that goes into creating a grant. Could you just very quickly go over the process of creating this grant, maybe even the timeline of how grants work?
Casey Fiesler (15:15.883)
Yeah, so you have to write a proposal. And writing a proposal for a National Science Foundation grant is certainly no small amount of work. I will say, for this particular type of grant, an EAGER, it goes through a slightly different type of review process at the NSF, so it typically doesn't take as long.
But it still did take a pretty long time this time. I think I submitted the proposal in March of 2024, and then I heard in August of 2024 that I got it. So right as the new semester was starting. And, you know, I mentioned I'd been working on it, and summer salary and that kind of thing. The way that works, well, okay, a
bit of context: most research faculty are paid on a nine-month salary, since we're not teaching over the summer. And so one component of this is the idea that if you are doing funded research projects, some amount of time over the summer sort of represents your labor
on this grant, in the same way that paying a research assistant during a semester represents their labor. But of course, it's not like the professor doesn't work on the grant all year and only works on it for those two weeks. It's just sort of an accounting kind of thing. So just to explain what I was talking about there.
AI-tocracy (17:02.999)
Yeah, yeah, no, that's helpful. And I think the question that I keep asking myself is why these grants? And so I'm sure it's speculation, but why do you think that your grant was targeted at this time to be defunded?
Casey Fiesler (17:18.081)
Yeah, so I can only speculate, but I feel pretty confident in this educated guess. So when I first got the notice about the termination, I was shocked because it was this grant. I have another National Science Foundation grant, a CAREER grant. And it is about
AI-tocracy (17:36.335)
Mm.
Casey Fiesler (17:48.27)
It's about technology ethics. And I wasn't actively worried about any of my grants at all, but I really thought the one that was AI workforce development, which is what it was framed as, I really thought that one was safe. I mean, I've had grants in the past that a hundred percent would have been gone right away. Like, you know, my former student, Brianna Dym,
the NSF grant that funded her was 100% broadening participation. It was, how do we get more women and queer people and people with disabilities into computer science? That one I would have expected to be gone right away, right? But AI workforce development? And so I was baffled. And my first theory was, I went and looked at the Dear Colleague Letter for the program, and I noticed that one of the ways that it was framed was
inclusive education. And I was like, that's it, they got rid of every single thing that was funded under that program. So I contacted the program officer and he confirmed for me that that was not the case, which is great, I'm very glad. But then, you know, the news reports started to come out about this set of terminations. It all happened on that Friday.
AI-tocracy (19:01.743)
Mm-hmm.
Casey Fiesler (19:16.269)
And there had been an explicit message from DOGE about...
AI-tocracy (19:24.055)
Still can't believe that we have to say that with a straight face on, but yeah.
Casey Fiesler (19:28.607)
Yeah, or a statement; like, I saw it through DOGE, but it might have come from the White House, I'm not sure. But anyway, there was an explicit government statement that research around misinformation, disinformation, and malinformation had been determined to no longer be, you know, an NSF priority because it was
an affront to free speech. And I was like, well then why did my... and then I went and looked at the public abstract for my grant. And so I talked about the motivation for the grant earlier, right? The public abstract for this grant has a sentence that says, you know, something along the lines of:
a lot of informal learning for young people takes place on social media, which could potentially be a problem with respect to AI education because there are a lot of myths and misinformation about AI on social media. So that is my educated guess as to the reason.
AI-tocracy (20:45.295)
Do you have a sense of why? So misinformation seems, from my academic perspective, like something that is bad, right? Misinformation and disinformation seem like things that we should take off the internet. So what's going on? Why would that be something that people would want to get rid of or target? Because it seems different than the DEI grants being targeted. That seems like a different
category.
Casey Fiesler (21:20.917)
Yeah, I mean, but it's still an ideological thing. I mean, as I'm sure you know, for many years there have been complaints that social media, for example, is biased against conservatives. And
one of the ways that this comes out, or this perception comes out, is around combating misinformation on social media. So for example, Oliver Haimson's paper on disproportionate content removals, which is a great paper by the way, everyone should read it. They did a survey of people about having content removed from social media.
And they looked for what demographics of people had content removed disproportionately compared to everyone else. And there were three groups, and it was black users, transgender users, and conservative users. But the difference was that when digging into the reason why people had content removed, black and trans people would talk about what seemed like inappropriate removals.
you know, I was talking about my experience with racism and it got flagged as racism, or, you know, people were harassing me and I clapped back at them and it got reported for bullying. Whereas the conservative users in this survey talked about how they had been flagged for things like misinformation, and they were disputing the
policy itself, as opposed to disputing that they'd done something wrong, if that makes sense. And so there is a difference between, you know, having content removed for no reason versus there actually being a reason. And one of the things is that people disagree about what constitutes misinformation sometimes.
AI-tocracy (23:22.713)
Mm-hmm. Mm-hmm.
AI-tocracy (23:44.623)
Sure, sure.
Casey Fiesler (23:44.95)
I mean, there's certainly... you know, disinformation is intentional. So in theory, the people who are sharing disinformation know that it's incorrect, but misinformation can be, you know, people who are duped or gullible or subject to confirmation bias, these kinds of things. But there is definitely a widespread perception in some circles that misinformation is...
that combating misinformation is a form of intentional censorship. And so that's the deal. There's this perception that anything that is trying to keep people from spreading misinformation is just a form of censorship.
AI-tocracy (24:17.871)
Mm-hmm, mm-hmm, mm-hmm.
AI-tocracy (24:31.983)
It seems like, from where I'm sitting, canceling academic grants specifically on misinformation and disinformation is also a form of censorship. So taking what's happening with conservatives kind of out of it, the conservatives feeling like their content is being removed unfairly, it does seem like, in your case with the grant,
Casey Fiesler (24:44.973)
Mm.
AI-tocracy (25:00.899)
that it's kind of an equal and opposite reaction to that, is that fair to say?
Casey Fiesler (25:06.605)
It certainly seems logical. You know, and I'll say, like,
Casey Fiesler (25:15.725)
But based on the way that I described this, my grant, which is not actually about misinformation but has that little keyword, appears to be collateral damage, I would say. That said, I want to be very clear that the cancellation of the grants that were actually about misinformation is perhaps even more unjust, because this is extremely important
AI-tocracy (25:25.593)
Mm-hmm. Mm-hmm.
Casey Fiesler (25:42.102)
research. I mean, misinformation and disinformation are a huge problem, especially given what's happening with generative AI. I mean, just yesterday I saw this video on TikTok that was clearly a deepfake, and when I say clearly a deepfake, I mean that it was obvious from the content that it was a deepfake, but it was a politician's wife giving a speech. If I had not
been paying attention, it would have looked very real. There were clearly people in the comments who thought it was real, and there were also a lot of people calling it out as obviously AI, you know, and scams and frauds. I mean, it's just a huge problem. And even if you think back to Biden's executive order on AI, one of the components of that was,
well, you know, the way executive orders are supposed to work in theory is that they're just marching orders for different areas of government, so whatever's happening now is not what they usually are. But that AI executive order is no longer in place because President Trump immediately got rid of it. One of the things in it was, hey, we need to figure out how to do watermarking. Like, NIST or whoever it was, go figure out how to do watermarking.
That is an incredibly hard technical problem. There are a lot of really smart people working on how to do things like labels and watermarks for deepfakes and generative AI. I don't know this for sure, but my assumption is that that is exactly the kind of research that got caught up in these cuts.
AI-tocracy (27:30.542)
One thing that we were talking about before going live today, and that I've been thinking a lot about, is that even President Trump is very clear: let's go AI, let's maybe deregulate AI, but AI and innovation, we're gonna bring it back to America, including education. We're gonna educate people so that they're better computer scientists and they can work with AI better. And yet you also have
the downstream impact of executive orders and DOGE and cuts for efficiency and grants, which are immediately impacting students in the US, their ability to innovate, or at least know about the capacities of AI so they can innovate. Does that resonate with you and how you're seeing this? And if so, can you say more?
Casey Fiesler (28:21.101)
Yeah, and I would think about this two different ways, actually. So the first one is sort of, you know, taking me as an example. Literally the week, like five days, after my grant got terminated, President Trump put out an executive order about the importance of AI education in K-12, the importance of AI education for young people. I mean, the irony, I cannot.
AI-tocracy (28:40.857)
Yeah, yeah, I saw that.
Casey Fiesler (28:49.037)
However, I have theories about that executive order, but I suspect also that the reasons that I think it's important for people to learn about AI are probably different than the reasons that the White House thinks that people should learn about AI. The word limitations did not appear anywhere in that executive order, right?
AI-tocracy (29:16.442)
Or ethics, presumably.
Casey Fiesler (29:17.759)
Yeah, exactly, exactly. And you know, despite sort of my typical work in the space, the work that I was doing for that grant was not explicitly about ethics. I mean, it was gonna be also about ethics, and most of my social media content on AI education is about ethics. But it was also very basic, like,
here's what this is, here's how it works, but also how to use it responsibly, and what are the limitations, and how do we think about things like environmental impact, and also why does it get things wrong? That sort of thing. So there's that component of it, just the irony of that. The other sort of broader thing is that
if we look beyond the particular grants that are getting terminated, my assumption is that AI education is not being targeted; AI education is getting caught up in other kinds of things. So there's my weird example, and my assumption is that it's also getting caught up in DEI cuts, because until very recently, the National Science Foundation cared a lot about broadening participation. And so, you know, there are a lot of broadening participation initiatives about
getting historically excluded groups education around AI. So my assumption is that those got caught up in it as well.
AI-tocracy (30:47.66)
When you say broadening participation, that is actually a new term for me. Can you say more about what that is, maybe the brief history around that, and maybe who's using that language?
Casey Fiesler (30:54.765)
So yeah, broadening participation in computing, that's a big term in computer science education specifically. I'm not sure if that's the term that's used in other STEM fields, maybe. But basically what it means is, how do we increase the diversity of people who are studying computing, and therefore, you know, of people working in the tech industry in this space.
I mean, the arguments against DEI, I mean, also, I think people don't know what DEI is when they're arguing against it, but they're silly, because they're assuming that things are equal. You know, it's the assumption that
there are no problems out there. But my God, it's still... I mean, even for me, you know, as a woman studying computer science over many years, maybe it's a little bit better today than it used to be. But if you ask
women in the tech industry what it's like, or, you know, women studying computer science, you hear awful things all the time. I mean, the stereotypes and the harassment. There's just all kinds of issues. And this is true for a lot of historically excluded groups, and this is why they're historically excluded. So, you know, there's been a
AI-tocracy (32:43.375)
Right.
Casey Fiesler (32:47.969)
there's been a lot of work in this space, and it becomes really important. I mean, it's not just about gender or race, you know, also things like socioeconomic status. There are a lot of programs that are intended to sort of help level the playing field a bit, and they've been pretty impactful.
But even then, you know, the percentage of women graduating with computer science degrees in the United States is considerably less today than it was in 1984. I don't know how deep you want me to go into this, but I can give you the short anecdote for why that is.
AI-tocracy (33:30.255)
Wow.
AI-tocracy (33:38.031)
No, go for it. That's kind of a shocking statistic for me.
Casey Fiesler (33:45.869)
So, yeah, so in the 80s... you know, the computer as a person, being a computer, that traditionally used to be women's work. And so, you know, by the 80s, there were actually a fair number of women studying computer science. There's this great book, Unlocking the Clubhouse, that was based on some research done at Carnegie Mellon in the 90s. And so the theory that comes out of that book, in part, was that in the 80s, with the rise of the personal computer and the way that it was marketed, personal computers were marketed to children partly as toys for boys. And if you look at, like, I made some TikToks about this and found an old Apple commercial, anyway, it's very clear how these were being marketed, right? Like, this is a cool thing for your son. And so what ended up happening was that by the 90s, when students went to college, the boys had been using computers for years and the girls had not. And so they were already behind, and there was this sense of, this isn't for you, this is a thing for boys. And that was already having such an impact by the 90s that women studying computer science were dropping out at much higher rates, and this just kept getting worse for a while.
So, you know, in the 80s, I think it was at like 34%. It got down to, I don't know, 19% or something, and now it's up a little bit, but we're still, I think last time I looked, in the low-20% kind of area. And, you know, people pretend like there's nothing happening here, but
Casey Fiesler (36:04.525)
you still see it. You know, walk into particular areas at a tech company and there'll be like one woman working there. You get to certain levels of computing classes and there'll be a couple of women students in the room. These kinds of things matter.
And again, it's not just gender; it was the same kinds of things for other historically excluded groups as well. And there's been a lot of research in this area that actually has been quite impactful. You know, like I said, the percentages have started to sort of creep back up. We're also seeing a lot of work on different types of informal learning. And it's also not just about, you know, how do we get
middle school girls to care more about computer science. It also becomes, how do we support people through the entire pipeline? And so, yeah.
AI-tocracy (37:01.743)
When we're thinking about the DEI grants and those being defunded, what do you see as the impact of that on some of the sexist histories, or present realities, in computer science education, if you see a connection between the two?
Casey Fiesler (37:08.397)
Hmm
Casey Fiesler (37:19.725)
Yeah, I mean, I just really don't want to see us move backwards. And honestly, this is what all of the, you know, hits on DEI are doing: it's moving us backwards. And actually, here's a statistic that I saw. I bookmarked this and haven't dug into it yet, but I saw a LinkedIn post
where someone had looked at, and I'm guessing here, there have been a lot of open forms for people who've had grants terminated to fill out, I've done three of these. So I guess someone looked at the data from those, and surprise, surprise, the PIs of the grants that are being terminated are disproportionately people from underrepresented groups.
AI-tocracy (38:16.559)
Do you think that's because of their identity or do you think it's because those folks were more likely to study things that were in those topics?
Casey Fiesler (38:27.117)
I think it's probably the latter. Not that it really matters, if you think about... Actually, think about something like AI ethics. Start naming the big names, the people who immediately come to mind for this.
AI-tocracy (38:35.801)
Sure.
AI-tocracy (38:44.943)
For example.
Casey Fiesler (38:55.341)
If you look at my bookshelf and start looking at tech ethics books, look at the number of women of color compared to everyone else. And there are good reasons for that. But if suddenly you cut all of the AI ethics research, it disproportionately impacts these folks. And what it results in is,
you know, more white men having research funding compared to other people. And that's basically what this analysis saw. And so cutting this kind of research also sets back particular people. And this becomes... I mean, the other thing about all of this, all of this is very upsetting, but thinking back again to the
importance, for example, of misinformation research: it's not just the immediate material impact on these particular PIs and this particular money. That's a big deal. But it also means that, especially, early career researchers
aren't going to be doing this kind of research as much. Because if you need to get tenure someday, it's very hard to justify prioritizing research that is unfundable. And this is both because it can be hard to do research without funding and because, traditionally, research funding is part of the evaluation criteria for getting tenure.
And I actually just saw a message from someone working at NSF that was sort of encouraging academic institutions to remember that. This message was specifically talking about CAREER grants. And basically they were saying, with all these cuts, because we also might just see
AI-tocracy (41:03.375)
Hmm.
Casey Fiesler (41:14.231)
cuts across NSF in terms of there being less funding, like that is the expectation of what's gonna happen, this is gonna impact everything, not just DEI, not just misinformation, but everything. If you have less NSF money, that means fewer people get CAREER grants, obviously. And if some academic units are using, were you able to get a CAREER grant, as a proxy for, are you a good enough researcher to get tenure, then suddenly no one gets tenure anymore, which obviously is untenable.
But if people are worried about particular types of research, research on marginalized groups, research on climate change, research on disinformation, we're gonna see less of that research. And this isn't people being selfish or whatever, it's just survival. Yeah.
AI-tocracy (42:07.311)
Well, I mean, you have to go where the incentives are, right? And if the incentive structure is so skewed towards not looking at that... like, for me, you know, I wanna make a living at some point. But I do wanna go back to something that you were talking about, which is this relationship between the federal government, granting institutions run by the federal government, and academia. So for myself, I am
scared of staying in academia. And so, as someone who has studied AI and was going to start a job in academia studying AI, I'm like, well, no, I don't want to do that anymore, right? I either want to move to Europe to go study AI, or I want to do what I'm doing now, which is more like starting my own businesses and doing consulting and all of that stuff. Because I feel like a future in academia is very uncertain
and a scary prospect, that there's something changing fundamentally in academia because of the funding structure. And so I'm wondering, and this is a big question, how you're thinking about it, as someone who's further on in your career as an associate professor, this war on academia, as we can call it. And is that even the right term to be calling it?
Casey Fiesler (43:26.785)
Well, and actually, if we go back to your previous question about the irony of the current administration thinking that AI is so important and all that, this is actually the downstream effect there, too. If you destroy academia in the United States, we do not want to rely on just big tech to do foundational research.
I mean, look at like, why do we have Google? We have Google because of a National Science Foundation grant.
AI-tocracy (44:07.076)
Can you unpack that for folks who aren't aware of that story? Just very briefly.
Casey Fiesler (44:10.445)
So yeah, I mean, the short version is that an NSF grant was funding some of the work of the founders of Google when they were students at Stanford. And this is frequently how things work: basic scientific research turns into much bigger things, to the point where people forget sort of the roots of it.
You know, what you just expressed, that you might be scared to go into academia, that you might want to go to another country: huge. The brain drain that the United States is going to see because of what's happening right now is massive. And it's not just people like you potentially deciding not to go into academia. So that's one thing. Or wanting to go into academia, but there are no jobs because universities have no money.
So there's that. There's also people like me who have been messaged by people outside the country like three times in the past month saying like, hey, we're hiring. Why don't you come to insert country here?
AI-tocracy (45:28.367)
Yep, yep. Or companies, I imagine. If there is a goal or a plan to move academics into industry, I imagine that's something that folks are thinking about.
Casey Fiesler (45:32.715)
Yeah.
Casey Fiesler (45:38.678)
Yeah, that too. I will say, you know, there are countries that are being incredibly smart. Like, I think it's France that's saying, we're going to hire. And I've also seen, you know, two people, not in our field, but just on social media or whatever, who just took jobs at universities outside the US. The University of Toronto iSchool is about to be a banger.
AI-tocracy (46:04.791)
I have looked into that. I mean, there are more positions going up over there too. I've been looking at those.
And that's really smart of them, right? I mean, it makes total sense. Like, if things keep going as they're going, Canada, Australia, especially English-speaking countries, are about to have the best universities in the world. The other component is international students not wanting to come to the United States, which of course they don't. Look at what's happening to international students right now. So now really talented
students across the world, who might have come and gotten degrees here and potentially stayed here and gone to work for these big tech companies, won't come. So even if you think that all the work should happen at the big tech companies... I mean, this is the thing that I guess Elon Musk and Donald Trump are arguing about, right? H-1B visas and these sorts of things. So there are multiple levels of scientific brain drain in the United States that are likely to happen.
And some of it comes down to immigration policy, but a lot of it comes down to like, how are we treating our universities?
AI-tocracy (47:19.521)
So where would you like to see this going? And where do you practically see all of this funding situation going, in the next year or maybe for the remainder of the administration?
Casey Fiesler (47:38.965)
I mean, you know, how pessimistic do I want to be really? Right. Like, I mean, yeah, I mean.
AI-tocracy (47:41.615)
Well, that's your call. That's your call. I'm on the pessimistic side of things, so maybe take the optimistic perspective. But be honest.
Casey Fiesler (47:50.786)
I mean, the cuts, okay, on the optimistic side: maybe these legal challenges continue to work, right? A lot of universities have been appealing terminations. Maybe my grant even comes back, who knows, right? There have also been, you know, legal challenges to grants being cut
in general. So maybe the cuts turn out to be illegal and, you know, things come back. And the overhead caps are also being challenged; maybe those also turn out not to happen. That, you know, would be a very best case scenario right now. That is possible. The pessimistic side is, even if that happens,
a thing that could happen next is that when Congress does a new budget that has NSF money in it, NSF funding gets cut by half. I've heard that thrown around; I don't know how likely that really is. That then impacts not just things that people think are bad, like DEI or whatever. It impacts everything.
And not just NSF, NIH too. And now we're talking about medical research and that kind of thing. Vote in the midterm elections, y'all. Vote in the midterm elections. Also state elections, you know, where...
AI-tocracy (49:31.392)
Really that should be the takeaway from every single episode.
Casey Fiesler (49:44.248)
Colorado will have a new governor election soon. That's gonna be very important, because with what's happening with the federal government, states suddenly... I mean, I'm glad our attorney general, Phil Weiser, who is someone that I quite like, is running for governor, but also he just keeps suing the Trump administration, and he's very unpopular with them right now. This is the energy that I want, right?
AI-tocracy (50:15.599)
Well, so in that worst case scenario, I'm paying close attention to the relationship between academia and industry and then the government. So I guess you have three, you can have a triangle there. If there is an absence of money coming from the government, do you see industry stepping in in some way, or trying to hire researchers out of academia?
Casey Fiesler (50:40.525)
So there are various ways in which this could happen. There's industry stepping up and doing the research and hiring people out of academia, and that kind of thing. And that's good to some extent, but we also have weird structures of power in tech, especially around AI, and I don't like the idea of all of the research that's happening being so product focused.
But you do get innovation that way, at least. The other thing is, will tech companies step up to fund academics? So like Google, for example, Google PhD fellowships, Google faculty research grants, Jed has one. And also a lot of tech companies,
give money to nonprofits who are doing this kind of work. For example, my friend Ruthe Farmer runs a nonprofit called the Last Mile Education Fund that supports low-income students who are trying to finish a degree in computing. She's gotten a lot of money from Microsoft, Google, these kinds of places. But you might also have heard that the
Boulder-based National Center for Women & Information Technology has lost... you know, they're funded in part by grants and in part by donations and these kinds of things, and they lost a lot of NSF money. And when you're thinking about something like a broadening participation initiative, given the sort of fear around DEI right now, it might be that tech companies don't want to be seen giving money to DEI initiatives.
AI-tocracy (52:34.511)
Well, and most of them have also changed the language around anything related to DEI or have cut it out completely. At least you can't search it on their FAQ or anything like that. So it's a broader issue there.
Casey Fiesler (52:43.532)
Yeah.
Yeah, I mean, and so then we come to foundations, right? Like MacArthur and McGovern, and, you know, they're going to save us all. Like, I hope they're going to step up. But there's only so much, right? I mean, obviously, the money that NSF has put into research is far, far more than what McGovern has at its disposal.
AI-tocracy (53:16.207)
For reference, the Jed that Casey was talking about is Dr. Jed Brubaker, who is also faculty over here at CU Boulder and happens to be my PhD advisor. Just for reference. As we move towards closing, I want to also highlight that you are a leader and a fountain of wisdom for youth, or I shouldn't say youth, for college folks who are thinking about doing a PhD or grad school, or even
advancing their own careers as researchers, as undergraduates. How are you thinking about teaching them in this moment, or mentoring folks who are in academia and early in their careers in this moment?
Casey Fiesler (54:02.509)
So I have a YouTube channel where I have a lot of videos with advice for PhD students, and I do live streams on YouTube, usually a few a year, like when we're in PhD admissions season. And I did one at the end of March or beginning of April, and I was like, oh my god, everyone's going to be completely freaking out about this. So right now I'm talking about PhD students. One of the side effects
Casey Fiesler (54:31.369)
of the funding cuts: so as I mentioned before, a huge component of what NSF funding pays for is often research assistantships. So if universities don't have that funding, that means they can admit fewer PhD students, because PhD students have to be funded.
If grants aren't helping to pay for them, then the money has to come from the university, and, you know, there's only so much. So PhD students were having offers rescinded this past cycle. And some programs admitted fewer students than they usually do, or didn't admit any students, these kinds of things. And this is a valid concern. Now, I was telling people before, maybe it's not so bad, because some of this appears to be
sort of preemptive, like it's programs being risk-averse. But as we see more actual funding cuts, that might end up not being the case. Like it might not be preemptive. It might be based on actual funding cuts. So we're gonna have this pipeline problem too. And then we also, of course, have the problem for folks who are at your stage in your career, like where are the jobs?
Because if universities are concerned about... well, there's two things. One, if universities are just concerned about running the university, because suddenly overhead is cut considerably, for example, they're going to institute hiring freezes. Also, postdoc positions are going to get cut, or, you know, anything that's soft money: soft-money research scientists, anything that comes from grants, those jobs won't be there either.
AI-tocracy (56:23.085)
Yeah, well, and you also have a double jeopardy in industry too, where you have a lot of jobs being cut under the premise of, AI can do that job anyway. So you have people who have spent years in computer science and doing AI development, and those jobs, I've heard from lots of colleagues who are more on the technical side, they're just disappearing, and a lot of folks are just getting laid off, even before companies really know what they're doing with it. It's just, you know, overhead is always the most expensive thing
for a company. So that's an interesting tension; at some point some bubble's gonna have to break. I don't know what that is or what that looks like.
Casey Fiesler (56:59.213)
Yeah, and I don't want to suggest, don't go into research anymore, don't go into academia, don't apply for PhD programs. The work still needs to happen, right? We need to keep going. It might be a little bit harder, and we might have to be a little more creative.
You know, you might have to consider other types of jobs, be willing to sort of move between things, you know, maybe do slightly different types of research or frame research differently. But also...
My expectation is that this is not forever. And I do think that some of this damage can be undone. Vote in the midterms.
AI-tocracy (57:52.08)
And we have seen this historically before. There have been administrations, don't quote me on this fully, but I think even under the Clinton administration there were some cuts. So this is unprecedented in some ways, but in other ways there have been hits to funding before, and there have been some bounce-backs.
Casey Fiesler (58:16.459)
Yeah, it'll survive.
AI-tocracy (58:18.413)
Yeah, we'll see, we'll see. Otherwise, I guess everyone will just move to Europe. We'll see. Here's how I want to end. So Ezra Klein does this thing where he always asks for, like, three books or one book that people are thinking of right now. I don't just wanna copy him, so I'm wondering if there's any sort of song that you've been listening to or that you're into
right now, which may be a more difficult question than books, I haven't decided yet. So if you want to go the book route, because I know you're an avid reader, you can go there instead.
Casey Fiesler (58:54.477)
Um, I did just read the new Hunger Games book. I just finished it two days ago. It's actually very good. Uh, yeah, so I don't know if this is... I gotta tell you, the first song that popped into my head, in the context of this conversation, because you asked me,
AI-tocracy (58:58.767)
How was it? Yeah
Casey Fiesler (59:14.347)
was Rät by Penelope Scott. And I'm just going to leave it at that, and people can look it up and find out why that seems relevant to this conversation.
AI-tocracy (59:16.143)
That's great.
AI-tocracy (59:25.047)
Yeah, I'll make sure to put it in the show notes for the podcast episode. And any last comments or thoughts before we close out?
Casey Fiesler (59:35.957)
Just, you know, again, the work is important. You know, what we're doing, whether it's AI ethics research or education or, you know, misinformation, or, in the context of this podcast, things around deepfakes, DEI, et cetera: keep doing the work. And I do have hope that good people are going to continue to
save us.
AI-tocracy (01:00:06.959)
Well, and you're one of them. So I would like to thank Dr. Casey Fiesler for being on the show today. And we live-streamed this, which is very exciting. In the future, we'll hopefully live-stream on YouTube, and on Twitch if there's any AI-related interest in Twitch streaming, which we'll talk to our constituents and followers about.
But besides that, catch us at least weekly, every week, on your favorite podcatchers. And thank you for joining us today for AI-tocracy Live.
AI-tocracy (01:00:43.735)
And we got the music that really gets you pumped up.