Who thinks that they can subdue Leviathan? Strength resides in its neck; dismay goes before it. It is without fear. It looks down on all who are haughty; it is king over all who are proud. These words inspired PJ Wehry to create Chasing Leviathan. Chasing Leviathan was born out of two ideals: that truth is worth pursuing but will never be subjugated, and the discipline of listening is one of the most important habits anyone can develop. Every episode is a dialogue, a journey into the depths of a meaningful question explored through the lens of personal experience or professional expertise.
PJ (00:03.596)
Hello and welcome to Chasing Leviathan. I am your host, PJ Wehry, and I'm here today with Dr. Alex Reid, the chair of the Department of Media Study at the State University of New York at Buffalo. It's wonderful to have you here today.
Alex Reid (00:18.64)
Glad to be here.
PJ (00:19.884)
And we're talking about his book, Rhetorics of the Digital Non-Humanities. I only have the Kindle version here with me, so I can't show a beautiful version of the cover, but it's a fascinating book. And just to kind of start us off, what led you to write this? Why this book?
Alex Reid (00:37.904)
Ah, well, sure. I mean, this is a subject that I've been studying since I went to graduate school in the early nineties. You know, that was the moment that the internet kind of came alive, right? So that's what started my interest in studying the relationship between technology and communication and rhetorical practice, which is what I would say is sort of the
backbone of my research, my entire career. And so this particular book is looking at the increasing role that digital media technologies play in shaping our capacities to communicate, to produce knowledge, to share that knowledge with one another, to
come to make decisions, to come together about issues that we need to address. And I mean, certainly in the last 10 years, from smartphones to social media and now to generative AI, things have changed a lot for people around the world, really.
PJ (01:53.132)
It's interesting, and I don't know if my audience is getting tired of me saying this, but it just keeps coming up with each guest. My day job is as a digital marketer. And so just this morning I was talking to my wife, we make websites, and that's been pretty solid for us for several years. And now with generative AI, people are creating those kinds of prompts, like, make me a website that does this. And we're like, okay, no need to panic yet.
But we definitely need to keep an eye on what kind of market share that's going to take, right?
Alex Reid (02:26.032)
Right, no, absolutely. I mean, this is a concern. My department I'm in now is largely, I mean, we do humanities and arts. A lot of our students are filmmakers or game designers, or those are the things they're aspiring to be. So generative AI is a huge issue. I mean, we've had the strike with the screenwriters and things like that. So we are all aware of those kinds of things.
PJ (02:52.268)
And for our audience, one, it's not an uncontroversial definition, like, what is rhetoric. That gets thrown around a lot. So if you don't mind, kind of articulate what you mean by rhetoric, and then what the discipline of digital rhetoric is, to kind of center us. I mean, even that title, Rhetorics of the Digital Non-Humanities, I think if we start off just saying that and not defining it, I think some people are like...
I don't know, where are we? Where are we on the map of, like, the world, you know?
Alex Reid (03:20.656)
No, absolutely. I mean, the title is a little bit inside baseball, I would say. Yeah, so rhetoric, I guess the oldest definition, we go back to Aristotle, is about, you know, the available means of persuasion, understanding the ways that you can persuade an audience to take any particular action in language.
PJ (03:30.156)
I've never heard that term. I like that. Sorry, go ahead.
Alex Reid (03:54.416)
And so understanding the art of it, right? So there's an art, a practice of rhetoric that's about persuading people, whether you're, you know, in digital marketing, or you're a lawyer, or a teacher, or a politician, or just trying to get somebody to go out on a date with you. It's about persuasion. That's the history of it. I mean, I think that a more contemporary definition would look
more broadly at the ways that we use language and other media to produce knowledge, to work together, and to persuade, but that persuasion isn't necessarily the central focus that it was in the classical sense. As for digital rhetoric, that's a term that was coined in the late eighties by a scholar named Richard Lanham.
And, you know, it came up with the PC revolution, which started in the late seventies, and then personal computers coming into workplaces and into classrooms in the eighties. And so I think scholars in my field who were teaching writing and thinking about rhetoric were starting to pay more attention to the role of computers. And so that's
kind of where that term comes from. So it's really looking specifically at rhetorical practices as they relate to digital media.
PJ (05:25.484)
And that obviously leads us into, you know, the question of what are non-humans?
Alex Reid (05:32.4)
Ah, yes. Okay, so, well, I mean, that's everything that's not us. When you look at scholarly interest in the non-human, and I also talk about the post-human, which is kind of related to that, non-humans, you know, could be animals. So there are people that certainly talk about the role of, you know,
PJ (05:37.228)
Yeah.
Alex Reid (06:02.288)
animals in rhetoric and things of that nature. My interests are in technologies as actors, you know, as participants, and not just as the mute extensions of our individual will or something along those lines, that they actually have a part to play.
PJ (06:25.324)
So kind of reassembling them as, maybe agents might be too strong of a word, or is that what you'd be looking for, versus tools?
Alex Reid (06:34.96)
Yeah, I mean, the question of agency comes up quite a bit in my book and in my work, about, you know, where do our thoughts come from? Like, how do they arise? How do we put them into language? How do we decide what we want to do and how we go about it? And so I always think about
our tools, you can think about it historically, as just a way that we have tried to solidify and regularize the way that we go about communicating with one another. So, you know, anything from grammar in language to genres like a memo or an email, those are things that we don't have to think about how to do because we know how to do them,
and we use them in certain contexts and not others. You probably don't send your wife memos, that kind of thing. So understanding technologies as ways that we have encased or tried to regularize our processes, our human processes, and make them easier for us. But at the same time, we then pick them up, because other people make our tools
for various reasons and then we use them and try and figure out how to use them. And so they come to have their own tendencies, right, for how we use a cell phone versus how we use a pencil. And so how does that work and shape the ways that we decide to communicate?
PJ (08:20.748)
And even as you talk about the difference between a pencil and a cell phone, and you talk about user populations, this is what I wanna ask you: are we looking to completely get rid of the ideal persuader, or are we looking to decentralize that and allow it to exist along different capacities? Like, there's an ideal,
and it definitely feels like it's not just per technology. It wouldn't be like a cell phone or a pencil, but more like, the person who's ideal on Twitter would be different from the person who's ideal on Instagram. Those are very different. But are you looking to decentralize, or are you looking to kind of kill off the idea of that ideal rhetorician?
Alex Reid (09:07.736)
Right.
Alex Reid (09:13.68)
I'm generally moving away from the idea of ideals or some kind of idealism. I mean, just to think about a microphone, we're both using microphones right now, but we use microphones differently from the way, you know, Mick Jagger uses a microphone, or, I don't know, pick whoever. I mean, I'm really dating myself with my references. I mean, that's older than me.
PJ (09:41.772)
I know who Mick Jagger is. He's the guy from the song Moves Like Jagger, right? That's the... Sorry, can't resist.
Alex Reid (09:43.44)
Yeah, that's right. You got it. So yeah, I do think that there's not necessarily an ideal, and what I would briefly draw out from that example is just, well, if you're standing on the stage in a rock arena with a microphone in your hand, you're in a very different
rhetorical situation than we are, right? And so the reasons why you're using a microphone are different. And you are interested in different capacities of that microphone, perhaps, than we are interested in. And so my interest is in part for people to consider, study, and try to describe the ways in which technologies can provide us with different
capacities for rhetorical action, for communicating, as they get situated in different, larger kinds of contexts.
Does that make sense?
PJ (10:58.22)
Yes, yeah, yeah. Sorry, I started going somewhere else and I was like, well, that's not a good question, and then I forgot the original question. Oh, as you're talking about that, it seems like things used to be slower. You talk about faster networks, and a lot of what you're talking about here is the confluence of social media, mobile tech, faster networks, the internet of things.
And you also talk about your basis in materialism, which I'm sure accounts for a lot of this movement away from ideals. But do you think some of this movement away from ideals, or away from the value of an ideal rhetorician, also has to do with mobile technology and faster networks? Because when the discourse is more solidified, it's easier to be more rigid in thinking about it. Does that make sense?
Alex Reid (11:51.312)
Yeah, no, I think that the idea that you could be the ideal rhetorician in a more permanent, long-standing way would rely upon technologies having a little bit more endurance, a little bit more permanence, than they do now. So, arguably, not to get too partisan on these matters, but I think we're going to say that...
You know, Donald Trump gets elected in part because of his capacity with Twitter. Is that going to work for him now that it's X? Is it going to have a different role? Are the presidents of the future, you know, if we go back to the thirties, it's Churchill and Roosevelt on the radio, or it's
Mussolini on the radio, and now we don't have radio presidents, right? We had Nixon and Kennedy famously doing the first televised debate, and Nixon, you know, the story about how he didn't want to wear makeup and how that affected his performance. So these kinds of things, a televised president, a Twitter president, a radio president. Yeah, there's no ideal that goes across those things.
PJ (13:14.7)
Right, right. If you don't mind, and it seems important as kind of the foundation here, can you talk a little bit about this new materialism? It obviously kind of extends throughout, but you're not arguing for that exactly, so how does it kind of ground this?
Alex Reid (13:30.896)
Yeah, so new materialism is a philosophical concept that arises in the early 90s. So it's, you know, arguably, it's partly a result of technological change of, you know, the information revolution. It's based upon French philosophy that occurs after
1968, when there were these massive strikes in Europe, in France, and the idea was that this would create massive political change, which it didn't. And then some philosophers changed their way of thinking, people like Gilles Deleuze, for example, a French philosopher. And a lot of new materialism comes out of Deleuze's work, and Foucault, other French philosophers
of that period. And really, it's new because the old materialism is Marxist and the new materialism is post-Marxist, I guess. I mean, it's not like a complete rejection of Marxism, but it's a spreading out, trying to understand the ways in which non-humans and materiality
shape our history in ways that go beyond the kind of narratives that Marxism provides, which, you know, focus on who owns the means of production, right? I mean, that's kind of the central tenet of Marxist philosophy.
PJ (15:14.092)
Is that similar, when you say post-Marxist, is that similar to, or how would you distinguish it from, post-structuralism?
Alex Reid (15:22.)
Ah, well, I mean, they are certainly related in that, I mean, historically they're related. They both come out in like the 60s and 70s. And they share a rejection of grand narratives, right? I mean, Marx gives us this narrative of the proletariat revolution that leads to, you know, kind of workers paradise. And of course, we have lots of other narratives in our society. And...
post-structuralism is certainly a rejection of grand narratives, or metanarratives, as someone like Jean-François Lyotard, another of these French philosophers in the 70s, would have talked about. So there is really this kind of movement, and post-modernism, like all of these posts. I mean, in some ways, one story of it is to think about how the French
PJ (16:10.444)
Right, right.
Alex Reid (16:20.154)
respond to the German occupation during World War II, because we're talking about philosophers who were young people during that occupation, right? They were alive. And so that's certainly a part of where post-structuralism comes from. Another version of it, as one of my professors used to say, is that it's the French crisis in German thought. So it's like,
PJ (16:48.172)
Right.
Alex Reid (16:49.518)
It's Marx and it's Nietzsche and Freud, okay, he's Austrian. So, you know, responding to these 19th century philosophers, Heidegger as well, and then coming up with something new after the German occupation of France. So it's like, maybe those things weren't great ideas, or they were incomplete, and we want to build on them.
PJ (17:12.652)
Thank you. That was honestly more for me personally, because I don't know if I've ever really encountered post-Marxism, but I've studied Lyotard, or at least read, I shouldn't say studied, I've read Lyotard and Foucault, and so I was like, all right, what's the connecting bridge there? That's helpful, thank you.
Kind of central to your work, and I'd be remiss if I didn't get to this, you kind of start off with a very technical definition of rhetorical capacity. What is that and why is it important? Why do we need to understand that?
Alex Reid (17:45.936)
Yeah, so I mean, I think part of it is that the notion of capacities is broadly available in philosophy. So there's a longer story there. But rhetorical capacity specifically is about how the tools that we have give us different opportunities for action. So that instead of
saying, as a human being I have innate qualities that allow me to think and to speak and so forth, we say that those are capacities. And by capacities, rather than qualities, what we're talking about is something that arises from our interaction with something else.
Right? So a knife is sharp and the knife has the capacity to cut, but it only has the capacity to cut certain things, right? I mean, I can cut a tomato with my knife. We can go into the Ginsu knife ad, you know, can it cut this or that or whatever, but there are certain things it can't cut, right? So it's got certain capacities, but it can't do any of those things if my hand isn't on it, right, making it cut.
So the two of us have to come together to create the capacity to cut. So to think about rhetoric is something that is not inside of us, but rather is something that arises from our interactions with others, with non -humans.
PJ (19:32.492)
And you have your author, your text, and your audience, in some ways. You can think of it in that way. But if you talk about it from the knife perspective, the capacity to cut comes from me plus the knife. And I think the thing that most people often miss is that you need the right thing to be cut. You could write a great speech, and if you don't have the opportunity to give that speech, it doesn't make a difference. And by the same token,
it's very easy to think, I have a knife, I can cut things now, and it's like, well, here's a tree. Now, good luck. I need a saw, right? The knife will break before you get through that, you know? So, yes, and that's where the interaction of non-humans is obviously important. How does that work with, and I...
What I really appreciate is that when you put these phrases together they sound inside baseball, you know, but as I sit and think through them they make sense to me. Sometimes with academic terms you're like, that doesn't mean anything, or, that's not what I thought that meant at all. But I understand why you're using these terms. When you talk about distributed deliberation, how does that fit? And obviously you're kind of just
slowly working through what these rhetorical capacities are. So I'd love to follow the structure of your book here, but what is distributed deliberation and how does that affect us on a daily level?
Alex Reid (21:08.784)
Sure. So, my concept of distributed deliberation comes from a larger concept of distributed cognition, which comes out of... Boy, I was trying to remember this guy's name. I should have looked it up in my book. But there's a guy who wrote a book called Cognition in the Wild. And I can't remember... Why can't I remember his name right now? But it's a book that was written in the 80s.
PJ (21:35.724)
Edwin Hutchins.
Alex Reid (21:37.688)
Yes, thank you. And he's in the field of psychology, right? So, I mean, what he was looking at is how, like, for example, if you're trying to bring an aircraft carrier into San Diego, like, it's not just the captain that's piloting that big old boat, right? Like, it requires a lot of people and a lot of technologies all working together to perform that task. And we understand that, right? It's a normal thing.
So the idea is that cognitive acts are distributed around people and tools in order to be enacted. And so I think about deliberation, just how we decide what we want to do, and how do we do that? Well, it's distributed. Like if you want to buy a car, well, you probably do a Google search to find out what's the best car, what are the good prices, right?
So we rely on Google to help us make a decision. That's an easy example, I think, of this. And one of the things I talk about is, well, what's going on in that relationship with Google, which has people, but there's also no one at Google that knows what's going to show up when you type something into the search. Not exactly. But the purpose of Google, of course, of Google Search, is
for you to click on something, right? And that the purpose of Facebook is for you to click on something and the purpose of Twitter is for you to click on something. And really the purpose of most social internet sites is for you to click on something and ultimately perhaps to click on the buy button. But I mean, you know, but you're sharing information, you're giving up stuff. And so trying to understand.
how we now make decisions that we're not making, if we were ever making decisions. How we make decisions today relies upon our interaction with other humans, but through the lens, through the mediation, of these technologies, and they are participants in our decision-making, right? Because you get shown stuff, I get shown stuff on the internet, and I don't know why necessarily, right?
PJ (23:58.42)
Yeah.
Alex Reid (23:59.856)
Like we all get these things, like, why is this in my YouTube feed? Why is it recommending that I watch that? What's it thinking about? Why do these ads show up for me? And things like that, right? And we don't necessarily know why certain posts get promoted and others get hidden. There's lots of stuff on YouTube that, you know, very few people have access to, right? Because it's not promoted in the way that the algorithms function. So algorithms are decision makers
that no one really takes a lot of responsibility for. But they certainly shape our conclusions about the world. Go do a Google search for climate change and see what you come up with, I mean, you're gonna get different kinds of stuff.
PJ (24:43.596)
Yeah. Or a Bing search or a DuckDuckGo search. Like, you will get very different results. There's that kind of classic meme going around, like when you Google for something versus when you Bing for something, the responses are, oh, okay, that's quite different. Personally, this has been interesting for me. I spend too much time, and this is everyone's complaint, on YouTube Shorts, right? I don't do TikTok because I'm...
Alex Reid (24:48.528)
Yes, yes.
PJ (25:12.332)
you know, 35. And I know people my age do TikTok, but there's still that small modicum of, well, restraint really gives me way too much credit, probably just pride, we'll go with the sin, not the virtue. But about every six months I'll get a flood of absolute junk. And I think it's gotten better, but I'll get an absolute flood
PJ (25:42.048)
of junk in my feed about every six months, especially when it was first starting, because certain creators had figured out how to hack the algorithm. And I would get just the worst videos, just the dumbest stuff. And I was like, why is this happening all of a sudden? And I'd find out, okay, they're doing an update in like a week, like, oh, that's a loophole. It'd be lots of bots sharing something or, you know, which are these
PJ (26:10.73)
non-human, you know, these non-human actors.
Alex Reid (26:14.128)
Right, and if you're making websites, you're probably thinking about search engine optimization, right? And those rules are always changing for good reason, right? Because people game the system. And so, which is kind of what they're supposed to do, I guess, to some extent. But there is an interesting kind of recursivity there, right? Because the rules change and...
people try to figure out what they are, and as soon as those loopholes get closed, the rules change again, right? But you've already been shifted in a particular direction, right, as a creator of websites or web content.
PJ (26:54.348)
And that actually leads into my next question. You're talking a little about distributed cognition, talking about aircraft carriers, right? There's a lot of brains doing a lot of different things, working towards this purpose. What's the difference between what goes on with carriers and what goes on with this kind of distributed deliberation with clicks and algorithms? There's a feedback loop.
Like your carrier's not getting rebuilt on the fly, out on the waves, as you're taking it into San Diego, right?
Alex Reid (27:31.76)
Well, yes and no, I mean, it's probably undergoing continual maintenance. But I think it's not the ship of Theseus, right? It's too complex for that. But right, I do think that, the thing about an aircraft carrier, right, is that you've got this massive military hierarchy that's part of organizing it, and the way that people act and respond. And if you're at a university, you've got,
PJ (27:39.308)
Hahaha!
Alex Reid (28:01.84)
Like I am, you've got a looser set of rules that dictate how that space operates. So we can look at different kinds of social and cultural situations and think about the ways that language and behaviors and technologies and other materials and histories and cultures, or whatever, it's all a lot of complex stuff, that's why it's difficult to figure out,
come together to shape behaviors. But I think about user populations, you know, like Facebook users or whatever. If you're just the typical user, we all have the same generic technical capacities granted by being a user, right? We all sign the same terms of service that we don't read. But we have,
you know, we can post certain kinds of media that's a certain size, and we can like, and we can share. We have these things, right, that shape us as users, that grant us certain capacities rather than others. And so those things help to organize our participation. And then you have a little bit of a hierarchy, right? Because you could, you know, create a group and become a group moderator. There's a little bit of that kind of stuff that goes on
on Facebook or other social media. And so all of those things can work to regularize behaviors, like we do in normal life, where you say something, you get slapped, and you say, well, maybe I won't say that again. The same thing happens online, right? It's just not a physical slap. It probably happens more commonly, right? Because that's the web for you.
PJ (29:52.268)
Yeah, yeah. And I do think getting disliked produces a different set of reactions than getting slapped. Like, that's a whole different thing, which is why we end up with a very different kind of discourse online, I think. But that's a different topic. And actually this is great, because we talked about the difference here between talking in person...
Alex Reid (30:04.684)
Yeah.
Alex Reid (30:10.224)
Yes, I'd say so.
PJ (30:21.612)
Yeah, I made a motion in my hands like we were talking in person. That's a very different discussion. Okay, that's a whole rabbit hole. But your next kind of rhetorical capacity you talk about is synthetic attention. Would you say that's a new development? I mean, I don't think we've really had anything technologically that way. Or is there something you'd compare it to in the past?
Alex Reid (30:50.352)
Well, I think you could compare it to reading a book, right? I mean, that's something obviously people need to learn how to do. And there's a whole set of practices, historically and today, right? About where you sit in a coffee shop and read a book, or you read before you go to bed, whatever it is, right? We create these habits and practices
PJ (30:55.404)
Okay.
Alex Reid (31:19.254)
around those things. And it does require a certain kind of cognitive disposition, right? Which is why many of my colleagues come to the familiar complaint: kids these days, they don't read, right? We assign them books and they don't do the reading. And how do we handle that, or what do we do about that? And so, because maybe the students don't have those practices or whatever, we live in a different time. We don't
PJ (31:33.676)
Right.
Alex Reid (31:48.56)
live in the 19th century anymore. So I do think that there are, but definitely what we have with contemporary technologies is a way that we have pretty crudely monetized attention. The idea of an attention economy is a relatively new concept. Maybe it goes back to like Nielsen ratings.
or something along those lines, right? But I mean, it's like our devices, like in that chapter, I talk a lot about smartphones and the way that if we think about, again, like the capacity to attend to something is not something that is integral to me, but is something that is a distributed capacity, like my capacity to pay attention to something in a certain situation, like all those things have to come together and I've been trained to be attentive.
as a scholar and some of those things, right? So the smartphone is designed to organize the way that we attend to digital media, right? And we have a little bit of control over that, because we can pick our apps and we can, you know, control our notifications and things along those lines. But, and we can talk about the technical aspects of how...
a smartphone actually works, like how it draws up information and how it gets triggered and things like that. And we can also think more critically about why smartphones work the way they do. Why do we want this thing in our pocket that's constantly reminding us, or demanding, that we do something? But we do.
So it's, so for me, you know, it's thinking about, well, maybe we do need something. Like if we do want to be connected to this synthetic world of media and information, we need some tool to help us do that, right? Because we don't, we can't just like eat a hard drive and learn what's on it, right? I mean, we have to have some way of accessing it. So, so these are tools, right? To do that.
Alex Reid (34:12.88)
And they sort of shape the things that we pay attention to. So yeah, that's kind of where, you know, so my book is in some ways about negotiating those relationships, like realizing that they are relationships and then figuring out what do we want out of them.
PJ (34:39.916)
Yeah, actually, I realized I kind of jumped the gun here. Can you for our audience explain what synthetic attention is and how that fits in with what you call close hyper machines?
Alex Reid (34:50.864)
Oh, yeah, sure. So, yeah, more inside baseball. Yeah, so synthetic attention is just my way of highlighting that attention is not just something that we have, but something that is created, that's built with technologies. You know, like a chalkboard.
Just think about a whole classroom or an auditorium, right? I mean, it's designed. You go to a theater, you sit in the dark in rows, right? That's all designed to make you pay attention to the screen, right? So these things are all synthesized, right? So that's part of it. And then the close hyper machine, it draws upon a conversation that comes out of the work of another scholar, Katherine Hayles,
who talks about different kinds of reading practices. Close reading is a term from literary studies. It's about reading closely, looking very carefully at the particular words and the choices and the syntax and those elements, rather than just trying to get the ideas, right? And so we have close reading, which requires people. We have
machine reading, right, which is the reading that machines do. And then Hayles talks about what she calls hyper reading, which is kind of what, you know, quote unquote kids these days do, right? Where they've got multiple screens up and windows and different kinds of things calling their attention, and they're shifting back and forth between different kinds of material. And so I talk about the smartphone as
an integration of those things, right? Because it's close to us. It's intimate. It's a personal device. I don't know that we feel that degree of intimacy about many devices. I mean, maybe, you know, the stereotype is guys and their cars, right, that they feel intimate about. But the phone is a really intimate device, right? We don't really share our phones with other people and stuff like that. So it's a personal, intimate thing. It's hyper
Alex Reid (37:08.272)
because, I mean, it's spasmodic, right? Who knows when that thing's gonna buzz and what it's gonna say. And I think that's in a way what's exciting and addictive about social media. It's less like smoking a cigarette than it is like, you know, the one-armed bandit in Vegas, right? You don't know what's gonna come up, but both are addictive in different ways, right? So, yeah, those kinds of things.
PJ (37:20.78)
Mm.
Alex Reid (37:35.856)
And then the machine, right? I mean, obviously, part of what the smartphone does is it takes technical, you know, binary digital data and transforms it. It reads it and then turns it into something that is attractive and engaging to us, that is informational and communicative to us, right? So we look at our phones, we hold these things on our bodies. They keep buzzing.
I always used to get these phantom buzzes, like, is that my phone buzzing, or is it something else, or is it just happy to see me? I mean, we have these visceral responses to these technologies that then take us down a media rabbit hole, and we never know when it's going to happen. Right? Exactly. And
PJ (38:12.524)
Yeah.
Alex Reid (38:32.1)
that we kind of live our lives around having our attention synthesized, created for these purposes. And that's the way that we decide to interact with the massive mediascape that we are now a part of. And why do we do that? Well, that's a good question. Should we do something else? Maybe. What should it be? I don't know. But, you know, how do we even have that conversation?
PJ (38:59.66)
Right. And just saying the old ways are better has almost never worked. I can't think of really... An example does not come to mind where just saying the old ways are better works. So, yeah.
Alex Reid (39:12.688)
No, you can't. I mean, for good or bad, you know, we don't have a time machine. The future isn't the past. We can learn things from the past, but we can't live there. We already did.
PJ (39:29.164)
Yeah, for instance, even if we want to recover thinking from before the modern period, we would never be pre-modern. We will always be, we're just redefining what postmodern is, right? We can't just ignore that this stuff happened and that people have said it, right? And so even if you want to change, if you wanted to recover things like reading, it's going to have to be,
Alex Reid (39:44.88)
Yeah, right.
PJ (39:57.898)
Say, for instance, you wanted to recover reading, and again, not to say that that's what needs to happen, but past habits of reading never had to deal with a cell phone world. So it is going to be a different thing, a different setup, a different set of habits. Like, I put my phone on Do Not Disturb all the time, because otherwise I was finding myself like,
bing, bing, you know, especially since I'm the system administrator for a lot of websites, and oh my gosh, the amount of just, yeah, anyways. Even as you were talking about that, I think of, I'm sure you've experienced this, where you're in a group and everyone has the default notification sound and everyone reaches for their phones at once, right? Your attention has been created, right?
Alex Reid (40:33.2)
Oh.
PJ (40:56.818)
It's really, really strange. This reminds me a lot of an earlier episode where I had a UX professional who had been kind of leading a charge for the craftsman side of things, talking about unethical design. You know, for a while we used to have pages on Google, and now everything has become infinite scrolling, because it grabs attention better.
Hiding things, I mean, this is where we get into, that's a different discussion where you could argue about the ethics of it. In the middle of that I found out why my grandma had been charged a subscription instead of a one-time purchase, because they had hidden that it was actually a subscription in really small print, right? And that's truly unethical, right? There's all these different layers to this sort of discussion, and so then you have things like you're talking about, like,
Alex Reid (41:45.744)
Mmm.
PJ (41:56.62)
they start shaping the capacities that we have, where it's like, no, no, no. This stuff can start, well, I don't know if it has yet, but I think it might start coming down from Congress: if it's going to be bought, it has to be in a certain size font, right? You have to have, you know, just stuff like that. Go ahead.
Alex Reid (42:15.504)
Yeah, no, I mean, certainly we can talk about laws that can be created. And I know that right now there's a lot of stuff going on about generative AI and kids, right, and youth, and we often start off by talking about, well, what about the children? It's a useful rhetorical place, right, that people will attend to as a way of talking about these things. Because we always think, well, we're adults. We can
control ourselves or something, right? I don't know what adults those people are dealing with. Not me, they're not dealing with me.
PJ (42:45.918)
Right.
PJ (42:49.676)
Oh.
PJ (42:53.268)
Oh man. No, I know. Yeah. My wife got a slice of cheesecake and I know that I can't eat it. And I was like, you know, I'm just gonna have a couple of bites. And then I was up all night and I'm just like, it's been years since I've done this, why did I do it this time? You know, it's like, oh yeah. But the kids are the ones that need to be protected. Yeah. No.
Alex Reid (43:09.264)
Yeah.
Alex Reid (43:17.36)
Yes, the kids need to be protected, for sure. I mean, the laws, and like, you know, ten tips to reduce your phone use, or here's how you should set up the settings on your phone, all those privacy settings and all those things. Those are all good things, right, that we need to do. And, you know, they're part of the way that the institutions we have available to us can help us to address these issues.
But at the same time, they're all kind of based on a basic conception of humanness as rational animals who, given a fair shake, will make good decisions. I mean, that's what democracy is about.
And the faith in democracy is the faith in the capacity of citizens to make rational choices. Right? I mean, I don't know. Yeah, it could be. But if we start thinking about humans in a different way, which is more like the post-human kind of way of thinking about it, you know, it's...
PJ (44:22.764)
Yeah. You know, when you put it like that, it's a little depressing, but okay. Yeah.
Alex Reid (44:43.088)
we think about rationality and thought and decision-making as things that are environmental, that they're the actions of populations that are both human and non-human, that there are these assemblages and networks and things along those lines, then you start to think, not that laws won't work, but that laws need to be
framed on a different expectation for what humans are, which I think is, you know, we're not there, you know, culturally.
PJ (45:18.7)
Yeah, I was about to ask. It sounds to me like we're tying back to your earlier discussion of moving away from ideals, or at least ideal actors, for sure, right? Because you have a different idea of democracy, like a new type of democracy to match the new materialism. Where right now we're looking for the ideal citizen, the rational, logical one, and it's like,
Alex Reid (45:30.896)
Right.
PJ (45:49.26)
Have you met the citizen?
Alex Reid (45:52.3)
Yeah, yeah. And I mean, to be clear, I mean, I put myself in that category. I'm not trying to say, if only you, you know, got a college degree. It's not about that, you know, it's, it's about kind of, it's about our ontology, it's about our being, right? It's about who we are, what we are, and trying to understand that better than we, than we currently do. When we think about,
PJ (46:03.372)
Yeah.
Alex Reid (46:19.952)
a different democracy, well, there's this other French philosopher, his name is Bruno Latour, that I talk a fair amount about in the book. And he talks about this idea of a democracy of things, and what he calls a non-modern constitution, that has to do with recognizing the role that inanimate non-human actors
play in our society. And I think probably the easiest way to think about that is when we talk about representation, right? Like in democracy, we have people that represent us, right? So that's one form of representation. Another form of representation is what we have in language, right? Language represents things. So science, like, science is a representation of the natural world.
Right. But to be able to create those representations, you need a lab, you need all these non-humans, and to be able to write, and all these things. And that's really where Latour's work came out of. He was kind of like a sociologist of science, of science practices and stuff. That's kind of where he started. So when we think about a democracy of things, and how the world gets represented to us in order to be able to deliberate and do the things that democratic societies do,
then we have to realize that it's not just the humans that are involved. And so, I mean, we make rules and laws about the non-humans, but we don't necessarily hear them and recognize the role that they play. And so we can think about climate change as an example of that. So that's kind of tangential to the subject matter of my book, but it's, I think, an obvious example.
PJ (48:12.364)
Well, I mean, what I see, at least from the government, and this is my not greatly informed opinion, is that it constantly feels like they're playing catch-up to technology. And I think you mentioned something about being generative in the field, like trying to shape things forward instead of just, oh, no, we don't want that. And so even as we talk about things like, and honestly, how much...
Well, we've always had kind of those laws about technology. We've had laws about radio waves. We've had laws about television, but it's always been reactive and it hasn't been so explosive. But with the advent of things like AI and those sorts of things, part of what it sounds to me, and I want to just make sure I'm tracking with you, is that you want a recognition of purposes and goals and kind of like a...
A shaping of those things so that we're not reacting after the damage is done. Is that one way to talk about it?
Alex Reid (49:13.23)
Yeah, yes, I think so. I think we do need to imagine new forms of community, new ways of being as individuals, that recognize our interdependence with a natural world, with a cultural, material world that we arise from, and that that is the context in which we will continue to live, or not.
And I think it orients us differently than the way that we are currently oriented. And, you know, I think a lot has to do with what people believe about the world, right? I mean, if you believe in, like, a Judeo-Christian God, then maybe you feel like you know what it's all about.
So maybe you're not concerned. I don't know. But at least you know what the purpose is. Like, you have an idea, you believe you know where everything's supposed to go and end up, right? And so maybe that directs your decision-making. But if you don't have that, or if you live in a society where people have a lot of ideas about what that answer is, even though they all believe in a God, but they believe different things about it, which is certainly...
PJ (50:22.284)
Okay, yeah.
Alex Reid (50:43.408)
where we are globally, then maybe that isn't a workable solution for a collective. So we have to figure out how to do something differently, that thinks about us, thinks about humans, in a different way, because those older ideas no longer suit us, no longer provide us with the capacities that we need to face the challenges of
emerging technologies and climate change and these sort of global conditions that, I mean, I would argue that our predecessors in the historical moments in which these religions emerged, they were not situated to be able to address these questions in the way that we need to address them now.
PJ (51:30.38)
I want to be respectful of your time and I want to be respectful of what you're trying to accomplish with this book. So I won't ask you for a prescription of where we should head because that's not your goal here. But if you could, for our audience, for the listener, as they go throughout this week, how should they think about what's a fundamental shift that they can kind of meditate on throughout the week that would help them better understand?
the way that rhetoric works among the digital non-humanities.
Alex Reid (52:08.048)
I would say to pay attention to the way that you use social media, because most of the time we don't. And again, I include myself, right, that it's a distraction or something like this. To pay attention to what we click on and why we do that and what decision we made based upon something like, why did we buy that or why did we...
Why did we like that, or something? It's not something you could do all week long, but maybe you could try, at the end of the day, to think about something that you did and try to figure out why you did it. Or, as you're going about using your device, just stop for a moment and think, wow, you know, 15 years ago I didn't have one of these. What would I have been doing? Why am I doing this now?
And yeah, just start to think about and look around and see all the other people, like why are those people doing what they're doing? And just to be able to be more attentive to it. And maybe you'll have an inspiration, you'll have an insight and you can share it with somebody. And it's a conversation. So that would be a starting point.
PJ (53:25.964)
Yeah, with something like this, I think just creating conversation is the first step. And that's obviously the goal of your book. Dr. Reid, wonderful to have you on today. It's been a real pleasure.
Alex Reid (53:35.888)
Thanks so much.