Authors join peers, scholars, and friends in conversation. Topics include environment, humanities, race, social justice, cultural studies, art, literature and literary criticism, media studies, sociology, anthropology, grief and loss, mental health, and more.
Refusal haunts the entire history of technology.
Brian Merchant:When we refuse a technology, that's part of this legacy.
Sarah Sharma:There's an overwhelming sense that today we cannot refuse. And you show historically we've always been refusing.
Thomas Dekeyser:Hello everyone, I'm Thomas Dekeyser. I'm a lecturer in Human Geography at the University of Southampton. I'm also the author of the book A Long History of Refusing the Machine, which is published with University of Minnesota Press. I'm happy to be joined today by Sarah Sharma and Brian Merchant, both thinkers who I greatly admire and who have been influential in shaping contemporary debates about technological politics, as well as debates about the possibilities of refusal. Today we're going to be talking about the book, but also more generally about long histories of resistance to technology, the Luddites potentially, refusal of AI technologies, and the politics of technology more generally.
Thomas Dekeyser:I'll start with a brief kind of recap summary of the book. So my book, Techno Negative, traces the long histories of technological refusal. I start in ancient Greece with some of the first machine breakers and end with ultra leftist groups bombing computer companies in 1980s France. When we look through such archives, what we find is that more often than not, when new technologies are being developed and distributed, there are pockets of refusal that aim to undo those technologies. Refusal haunts the entire history of technology.
Thomas Dekeyser:So the book investigates the shifting terrain of this refusal throughout history, following its various political trajectories, philosophical commitments, and material practices. And what I noticed going through these histories is that the play between innovation and refusal is actually rarely limited to the kind of practical and economic concerns that we often tend to think about when we think about technology, such as will it make our lives easier, will it make us more productive and so on. And instead, it often plays out on the level of kind of shaping and reshaping who and what counts as human. So the book traces how, within various contexts, technology is tied up with this ongoing definition and redefinition of what it means to be human, and how this has long legitimized various forms of exclusion and violence. Through this focus, I reveal the need to move away from debates that understand technological power as largely a question of harms, biases and rights, although those are of course important, and instead I begin to foreground the politics of these kinds of ontological determinations.
Thomas Dekeyser:So the question that the book ends with is this: in our moment of the ever closer alignment of Big Tech and Silicon Valley and elsewhere with authoritarian politics, how are we to resist? So the aim of the book is to offer some historical and conceptual tools for resisting these kinds of technological environments that are increasingly saturating our everyday working and social lives. So that was a brief summary of the book. What I want to do now is give a chance to both Sarah and Brian to introduce themselves before then leading us into a conversation about the themes at the heart of the book.
Sarah Sharma:I'm Sarah Sharma. I'm Professor of Media Theory at the University of Toronto at the Institute for Communication, Culture, Information and Technology, where I'm also the director. I'm the author of Feminism Against Big Tech. First, I should say that I feel it's a companion piece to Thomas's book from a feminist perspective, so I'm very excited about their emergence in the world together. But one of my arguments in the book is that it appears as though our big tech giants have a media theory, and that the way they conceive of technology is the same as the way they conceive of difference.
Sarah Sharma:In other words, they have the same conception of technology as a utility and they conceive of others and people in the world as utility. And they create a conception of technology as tools that we all sort of fall for, but they don't think of technologies as tools. They in fact think of technologies as environments. And one of my concerns has been that the feminist turn to technology sort of lacks a media theory. I think like Thomas, some of these things are hard to say.
Sarah Sharma:Some of these things feel difficult to say about the worlds in which we work within.
Brian Merchant:Hello, yes, and I am Brian Merchant. I am an author and writer who covers AI, tech, and the labor space. I'm a journalist by training and the author of the book Blood in the Machine, about the Luddites, who I imagine we'll be discussing today. Perhaps the most famous techno negativists, Thomas. We can debate whether or not that's the case.
Certainly the one bound most ferociously with the epithet that equates to that in popular culture. I am looking forward to discussing this fine book and any number of other topics that it may inspire us to get onto.
Sarah Sharma:I can start. I got to read an earlier copy of your book. And one of the first things, and it's the thing I just said actually, was how difficult and brave I found the book to be. It's brilliant. It's page turning.
Sarah Sharma:It does this counter history of refusal that allows, I think, so many other scholars to do the work they do. I always love it when this happens. It's like, thank you for doing this so we can do this. It is really difficult to argue against the technoreformist position, and I wanted to ask you how it feels to do that. I mean, we can intellectualize this.
We can talk about theories and histories, but I actually wanna talk about you as a scholar, a writer. Are you against something like reformism?
Thomas Dekeyser:This is a great question. I think you're absolutely right. It is a difficult position to inhabit, and one of the main impulses behind the book is precisely my dissatisfaction with that position, in that I feel like it often falls short in terms of actually articulating alternative futures, or even offering a significant counterpoint to our contemporary moment, which seems to be increasingly authoritarian. Within that, reformism comes up short for various reasons. But that said, it does remain a difficult task to even try and think something that exists outside of reformism.
Thomas Dekeyser:Some of the figures that I draw upon in the book have stood out to me precisely because they embody that position that I couldn't quite articulate myself, precisely because it seems like we're at a moment where reformism is taken as the foundation for pretty much all articulation of technological politics on the left. And so it's either reformism or accelerationism. I've long been drawn to a more kind of abolitionist position. So I think that's where I find some comfort in taking on a position that isn't a reformist one: looking at other fields of struggle, whether it's around race, gender, and so on, and seeing that within those struggles people have productively and successfully taken up positions that are other than reformist.
So I think I agree there's a kind of difficulty in taking on that position, but I find comfort in companion thinkers in other parts of politics. And I find it interesting to bring that into the sphere of technological politics and to think about what a more abolitionist position might look like. It's also something that I read in your work, the manifesto that you wrote around broken machines. I take inspiration from that because you're one of the few people who also seems to inhabit a similar sentiment, at the very least, one that is trying to exceed or circumvent the problems that come with more reformist positions.
Sarah Sharma:Yeah. And I find myself to be generous. I would say to people that take up this question of, like, well, what then instead? Or what can we do? And I think of what you've offered.
Sarah Sharma:First of all, you actually locate the counter histories. I mean, you actually have the figures of refusal. You've proven it historically. And I think for me, I'm not doing a history, but I can only say I want it to be an orientation. Like, that abolition refusal, at least in my world of media studies, should be an orientation.
Sarah Sharma:I was thinking too about how when you talk about techno anticide, or technological anticide, in the history of colonialism, I mean, the proof is there. You offer a paradigm and framework for actually understanding the violence in the position of reformism when you bring up the history of colonialism as a history of technology, and a history of refusal as well.
Brian Merchant:Yeah. I mean, I think that it does lead to an interesting question. What this alternative politics would entail, I think, is a big open question right now for people seeking to practice this. And, you know, calling it abolition, which I've seen floated in a lot of spaces, is often also fraught, for reasons sort of gestured at by what you were just talking about, Sarah. So I'm wondering how you conceive of it. It's one thing to refuse, and I understand the drive to try to condemn or to separate from reformism, but things do get thorny.
And when we're calling this abolition, or something like abolition, that immediately conjures a whole new set of politics that need to be interrogated. So I'm wondering, you know, how you're thinking about that. Like, are you thinking about this as sort of a technological abolition or AI abolition? Or how are you navigating this space? As you said, like, what then? Right?
Or what does come next?
Thomas Dekeyser:I think that's a very interesting and big question. The way I articulate the position of abolitionism, or the techno abolitionist position that I'm beginning to sketch out in the book, and it's only really in the conclusion that I come to it specifically, is in terms of a very specific kind of ontological and political problem that I've traced throughout the history of technological refusal. That is the problem of humanism, which comes back again and again, and which has a variety of very violent implications whereby, as Sarah mentioned, in colonial contexts technology is used as a discursive tool for rendering certain people as either lesser human or less than human. That's one case in which a humanist politics is revealed to be profoundly violent in various ways. And so the abolitionist position that I'm articulating in the book, towards the end, is an abolition of a humanist position on technology.
So it's less an articulation of a specific material practice of, say, we need to go and destroy our technological environments. In fact, I would be very wary of that position, because it assumes that we can easily separate ourselves from those systems to begin with, which we cannot. And I think the reason why I'm drawn to a critique of humanism is that it's a way of beginning to undermine those kinds of imaginations: that either technology can simply be mastered by the human in ways that are desirable to us, or, on the other hand, that we can just rid ourselves of technology, as if technology was ever separate from us. So the abolitionist position is, for me, one which starts from a critique of the humanist politics of technology. At the very end, I go back to some of the techno negative figures that I talk about in the book and begin to map out some positions that I think are abolitionist.
So I'm looking at an inhumanist politics of technology, a non human politics of technology. I'm trying to map various ways of thinking technology beyond a humanism that I think engenders all kinds of violence. I want to be precise in terms of what I mean by abolitionism: it has to do with an abolition of a humanist position. And what comes after that, I'm hoping other people will continue to build upon, but I'll also hopefully continue to explore it in my own work.
Brian Merchant:Yeah. Since you kind of mentioned this, this was one thing that I found very interesting, which was an argument in the introduction about how you're structuring your thinking around what techno negativism is. And I will quote you back to yourself. I know authors love it when people do that. You're defining techno negativism.
You're saying this is not the cruel optimism described by Lauren Berlant, where one holds an attachment to what is ultimately averse to your flourishing, but its inversion: a disbelief in what holds the conditions of one's ability to live on. Data, the applications, the algorithms, the cloud storage, the mobile devices and the wearables we might disdain are the same ones that, in empowering much of our personal, social and working lives, constitute us. I thought that this was a really interesting claim to make on a lot of levels. So one question I had is that you're ceding quite a bit to the tech companies and their operators. Sure, we've entered into contracts with them, how we use their platforms and services and generate the kind of data you're talking about, and what we get and give up in return.
But we've entered into those contracts a lot of times against our will, or a lot of users aren't even knowledgeable about them. I think some would argue against the idea that the data these companies have gathered and the infrastructures they've built constitute us, and that they may in fact be rejected without rejecting that which holds the conditions of our ability to live on. To me, alternative arrangements, with perhaps differently constituted, more democratically, more humanistically managed infrastructure and services, are kind of imaginable. So, not to get so deep into the weeds early on, but I was really curious about this claim that this stuff constitutes us. It seemed to me to be a powerful and very interesting claim.
Sarah Sharma:Can we play a game? Because I had this exact quote out to ask you a similar question from a completely different framework. So I have to do this now. And I was going to tell you this really grabbed me, this quote about our everyday devices that organize the way we move in the world. And I was going to suggest that this is really part of a recursive media theory, where you recognize that these are the technologies that structure our relations with one another, how we move, our proximity.
So I hear Brian coming at this from more of a political economic standpoint: we enter these contracts with these tech companies and we can reject them, we can say no on paper. And then I see this world of data, the applications, the algorithms, the cloud storage, the mobile devices, the wearables that empower personal, social, and working lives as sort of the very things that propel us in the world, that make us who we are. And they're so mundane and innocuous that we don't recognize them as really powerful technologies.
Sarah Sharma:I was thinking about, yes, and this structure is what it means to be properly human. It's also why it's difficult for people to think of something like refusal or negativity because this is the condition of their possibility. And it would make me think that something like abolition refusal also needs to recognize its technological parameters of possibility. We don't refuse in a non media environment. We refuse in a media environment.
Sarah Sharma:So my question for you, Thomas, is who is right, Brian or I?
Thomas Dekeyser:You're both right. I'm glad you picked up on that quote, because I think it sets the central problem that I'm trying to work through in the book, which is precisely the challenge of refusal at a time when these technological environments, if they don't actually constitute us, seem like they constitute us, to the point that it's really hard to imagine our social relationships, for example, without advanced technological mediation. It's really hard to imagine our everyday working lives without reliance on internet infrastructures. So I think that sets up a difficult challenge for the imagination. For me, that is not a giving in to technological power, but a recognition of the conditions that it has created for us.
For me, it opens up the challenge of refusal, the very difficulty of it. Not as a way of saying that refusal is impossible, but that we have to start from our own enmeshedness, our embeddedness within these technological infrastructures. For me, reformism doesn't get us out of the violent dimensions of these technological environments, and part of the difficulty is thinking outside of structures that I do think constitute us, even if against our own will. For me, it isn't a way of setting up the impossibility of refusal, but the very stakes of refusal.
Brian Merchant:Yeah, it's an interesting place to jump off and explore what follows from that, as some of the arguments in your book do. I am thinking of a lot of the folks I've encountered in some of my recent writing and reporting: groups that are engaging in a techno negativism and, in some cases, what they are calling Luddism. Especially the student groups who have begun organizing events and chapters and movements to disavow the phone, getting rid of the phone, getting rid of the iPads. They hold up the iPad babies as having grown up in tandem with all these extractive and attention dominating technologies that are owned by the big tech companies, and they feel victimized in culture and in their upbringings. And so there is a practice now among, like, I'm thinking of the Luddite Club in New York City, the LAMP Club. There's abstinence.
And, you know, some of these have varying degrees of grassroots resonance. Some of them are just very much high school kids or college kids who have said, this is what we're gonna do. We're just gonna get rid of the phone. We're going to pull the plug. So that, to me, is kind of an option on the table.
And interestingly, I'll just add, the effect of talking to all these groups and these young people who are trying to refuse technology in a society that has integrated all these norms that you're talking about, all of the, you know, wearables and data ingestion and so forth, has had the social effect of expanding the possible for me personally. I've been looking at getting a dumb phone or a light phone, if not just smashing my iPhone outright. This is a long winded way of saying these kids have decided that they are not defined by the data, that it does not constitute them, that they can leave it behind, and we'll see how effectively and for how long. But does that option still exist in your framework and in your conception, to say, well, this is me and this is not?
Thomas Dekeyser:Yeah, I think that's precisely what I want to make room for, and I do this in the book historically: those moments when people have decided that those are possibilities, in both individual and collective forms of refusal. What I'm setting up is the profound difficulty of doing that. The people I've spoken to who are involved with those kinds of practices today, one of the things that they will say again and again is the difficulty of it, whether it's in the form of, let's say, a detox movement, which will always already imply a return to, you know, not detox, but the intoxication of whatever digital addictions they have. So I think you're absolutely right. I think that's precisely the room that I'm trying to make by foregrounding the difficulty of it and the need to take it seriously as something that is increasingly harder.
So when I make the claim in the introduction, I'm specifically talking about our contemporary moment. And if we compare that to some of the historical examples that I discuss in the book, whether it's the Luddites or religious communities in medieval times, it is quite different when we think about the possibilities of refusal in those moments versus our contemporary moment. And I think that's what I'm trying to do in the introduction: to set up the contemporary difficulty of it. If we're thinking about the Luddites, and Brian, you've written about this in depth, there's a particular set of machines that they're rejecting for a particular set of reasons. In our contemporary moment, even if we're just looking at computation, it's so omnipresent that it is incredibly hard to think outside of anything but, okay, we can just keep what we have and do it slightly better, reform it in slightly more humane ways, for example.
I guess that's where I'm trying to go with this: to try and set up the contemporary moment and the difficulty of it in order to then push through it, hopefully.
Brian Merchant:Yeah. And I think what's so valuable about the book, which is such a rich and invaluable text for so many reasons, is the way that it historicizes this rejection. And that's another really key point I think you make in the introduction: this is missing from our histories. This is missing from our culture. This is missing from the narrative in general. All of these refusals, whether organized or not, have been absent from history. I love the story of Archimedes in ancient Greece, and just tracing through that there is this through line that has been ignored.
My impulse before reading this book would have been to say, well, that's because the dictates of capital are driving the erasure of the narratives of refusal from history, because that's what capital wants. It doesn't want resistance or any criticality. But it precedes even that. I think that knowing that refusal is human, knowing that there is, in some cases, nobility and mythology around refusal, and accentuating this fact, will do wonders for building up this sense that when we refuse a technology, it's not just for material purposes, though it often is, but part of this legacy just as much as innovation. It's such a hollow, single point cultural conception of technological development that we have, where we just sort of hail the innovators who end up making a lot of money from a technology and omit the refusal. So I did want to say thank you for filling out this history and giving it such depth and breadth.
I think that it'll be inspiring and useful and interesting to a lot of people.
Sarah Sharma:I second that thank you and say it again. It's interesting to me, and I think this is one of your arguments too, because of how you do your histories: there's an overwhelming sense that today we cannot refuse. And, you know, you show historically we've always been refusing.
Sarah Sharma:And there's the idea that today technology is so pervasive in every domain that we can't step outside it in any way. But I also think you hold on to an idea that we live in the world of the tech whether or not we are using it. And I think that's central to your history. It's something I think you and I share in our work. I like the way you do this, because you bring up the individuals that are doing this. But at the same time, Brian's bringing up something like abstinence, or people getting rid of their phones.
These are great for digital detox. These are great for small groups. But does it change the world they live in? Technological power is a form of power. If technological power is as strong as economic power, social power, and other forms of power, if we recognize technology as being not a tool but a type of power, then I think it changes the game for what refusal looks like in this context.
Thomas Dekeyser:Yeah, Sarah, what you're describing, I think that's absolutely right. We need to think of technology beyond the material infrastructures and devices, and even the code, the algorithms and so on, and instead think about it as a set of environments that are produced technologically, moving beyond just the devices that we use and how we use them. The sort of computational mind that digital technologies have helped produce far exceeds the question of whatever particular biases exist within certain data sets and how that reproduces, let's say, certain forms of violence. Even though that's important, I think what big tech specifically has managed to do is shape more than just the very immediate material ways that, let's say, we relate to each other, or even how the economy operates.
And this is why I'm interested in the question of ontology in the book. I propose thinking of technological power, at least one important part of it, as something that shapes and reshapes how we think about what is human, what counts as human and what doesn't count as human. So if we're thinking about technological power as exceeding just the particular devices that we use, a big part of that is the way that technology makes philosophical interventions. We see this historically and we see this today, whether it's in terms of AI reshaping what it means to be human, or machines being thought of as human. These are all negotiations that are taking place, and they're profoundly political negotiations.
This is what I'm tracing throughout history as well: the political struggle over what it means to be human. And I think our contemporary moment is just a new iteration of that, and a new acceleration of it.
Sarah Sharma:I think it's great that it took us thirty two minutes to use the word AI, to say AI. So I want to congratulate us on that too. It's also funny for me now that we're doing it. Let's do it a little bit. It is interesting how the word human is invoked now, and it sounds different than it did last year.
Sarah Sharma:Like, I'm not even talking about posthumanism or transhumanism. I'm talking about the way that our tech companies, our institutions, our universities use the word now. And when they say it, it creeps me out, because I'm like, wait, what were we doing before exactly? It just doesn't sound the same. I'm not making a liberal humanist critique or anything.
I'm not critiquing where the human begins and ends. I'm not talking about the Cyborg Manifesto. I'm literally talking about the way the word sounds now.
Brian Merchant:Yeah. No. I agree. It risks becoming commoditized, even, where firms are using AI, or rather the human vis a vis AI, as a selling point, as some positioning of the human as a service provider in opposition to the AI generated sludge that's taking over everything else. Maybe this is one of the manifestations of the reformism that you're talking about: it's a very neoliberal brand of reformism to say, well, we'll use human made in our marketing materials.
And I wrote a little article about this for The Atlantic last year, and the companies were just like, Dove was like, we will never use AI. We will always feature a human. Human service guaranteed, or, you know, made by humans. There's even a little logo some online boutique businesses are using: made by humans. And I think that also sheds light on the enormity of what refusal can entail now, now that AI, which is as much an ideological fixture as it is a technological one, is everywhere, is used in everything.
You know, the joke of the last couple years has been that everybody needs an AI strategy now, whether you're a shoe manufacturer or the Department of Education. AI is embedded everywhere, and therefore the opportunity for refusal or rejection is arguably more omnipresent than it ever has been before.
Thomas Dekeyser:Yeah, it's a good question. First, I want to go back to the human. I think you're both right: the mere term, the human, and how it's being invoked in order to suggest a sort of counterbalance to whatever AI is doing, is incredibly interesting to me. It assumes that the humans who were previously enrolled in processes that are now automated were doing things by default in ethically positive, or at the very least ethically neutral, ways, which is obviously not the case. We need to be very wary of setting up the binary between the human and the machine, for philosophical reasons but also for profoundly political reasons. The machine isn't necessarily the evil that stands as the counter position to the good human.
There's plenty of humans, and we just need to look at history to see this, who've done all sorts of violent things, horrendous things, without machines helping them. So that's one thing I wanted to say on that. The second thing is that centering of the human as the positive balance to the machine, AI led or otherwise. I think there is a question of who counts as human within that. I think there's a purposeful ignoring of the humans that are rendered less than human in, at the very least, even just the training of data sets, the labelling. What happens to certain humans? Who are those who are just considered the surplus of all this? Who are the populations that are made to live in really dehumanising environments, dehumanising workplaces, and so on?
I think when we're thinking about that, the sort of value that gets put on the human, I'm always immediately asking: which human? And particularly when it comes to AI, what kinds of humans are rendered less than human, even just at the level of the training, the maintenance, the production and the waste that comes with artificial intelligence? I'm really happy with the comment around the human, because I think it really gets at the crux of what I'm trying to do in the book and leverages it to contemporary moments where we see it play out in very interesting ways.
Sarah Sharma:Yeah, I want to praise you for that too. I think it's a really difficult position to parse out, and you're entering into sort of a minefield too when you're talking about entering into feminist, postcolonial, critical race, and transhumanist approaches. Difficult. I noticed that too when you were dealing with the Cyborg Manifesto, right, which is formative for the way feminist approaches to technology and the body expand. And I wanted to ask you, did you feel tentative about making that argument?
And then I wanted to ask you, what other sorts of threads did you feel tentative about?
Thomas Dekeyser:You're right. And I think there's a difficulty to it. The way that I try to deal with it in the book is through precision, or at least I'm hoping for conceptual precision. In many ways, it's not a rejection of many fields of thought.
It's a way of hopefully revealing elements that have been forgotten or overlooked. And I think particularly when it comes to positions taken within histories of feminist thought and the like, it's even more significant to be as precise as possible in terms of what it is that one is trying to push forward. My pushing of certain strands of feminist thought, and of certain kinds of thought within Black studies particularly, is where I found it, tricky isn't the right word, but where I felt most inclined to be absolutely as precise as I possibly can be. I'm never really interested in an immediate rejection of any field of thought. I try, at least in my work, to always start from a generous position and work with what people are trying to do in order to then maybe go my own way.
Sarah Sharma:If it's okay, I want to read a section of yours that's somewhere on page 70 in the earlier drafts. And I think you do this, and I wanna just show everybody how careful you are with this. Quote, without a confrontation with the negative ontological labor of technology, we will fail to acknowledge and work through the specific technological dimensions of past colonial expansion. We also run the risk of overlooking, and thereby failing to understand and undermine, the contemporary violence of technological ontocide, where racialized populations continue to be much more likely to be treated as machines, waste, or data. From this viewpoint, and as a conclusion we'll return to, neither a liberal humanist theory of man nor a posthumanist theory of subjectivity is currently capable of sufficiently working through and against the full scale of the ontological violence of colonialism.
Sarah Sharma:I just thought it was, like, one of the most powerful parts of your book, and this is that careful element I was thinking of. I thought it was fantastic.
Brian Merchant:What about something that's less careful? I'm wondering, and this is maybe not your most targeted audience or whatnot, but these days I go into rooms and I talk about AI, and I talk about refusal a lot of the time, especially from the standpoint of labor organizing and what it would mean to refuse in a contract or to oppose the introduction of a technology in a workplace, since a lot of workplaces are trying to do the top-down AI implementation thing. So what I hear is, oh, you know, how do we oppose this? How do we fight back? How do we bargain?
Brian Merchant:And I know this is not your concern with philosophical inquiry and humanism in the broader picture, but I'm wondering also if in that inquiry, it has surfaced any sort of recurring themes or thoughts or ideas that you might pass on to people who say, okay, you've surveyed the long and beautiful history of people and individuals and organizations, you know, refusing or being negative about technology. What can that mean for us as we're sitting down trying to negotiate a labor contract with a boss that wants to use AI to automate X number of jobs or so on?
Thomas Dekeyser:Again, you're coming in with the big questions. I mean, ultimately, that's the question the book ends with more than directly responds to. I think the sort of philosophical arguments I make are a way into that without necessarily wanting to specify what exactly that should look like in the present. For me, that's a conscious choice. I think what I'm trying to offer are more conceptual tools that people can work with.
Thomas Dekeyser:That said, if I were to write this book in a different way, let's say, write another history of technological refusal, which I'm not going to do, but if I were to write another one with an eye specifically on pulling out what we might learn in terms of, let's say, practical organising, there's a few things I can think of. One, and labour organising is already moving in that direction, is the necessity of collectivity. If we're looking at that history, the power that comes with collective agency, I think, is a crucial one. I don't think that's a particularly surprising one. But at the very least, it pushes against a very individualising notion of technopolitics, which might be, I'm going to just stop using my phone and hope that that undermines our current technological regimes.
Thomas Dekeyser:I think by being in the place of labour, you've already partially asked the question of what it might look like. And maybe a second thing I would say is that it's about more than just using certain technologies or refusing certain technologies. It's about something deeper: in order to refuse, we have to rethink more than which devices we use. It's an epistemological question. It's a question of how we relate to others.
Thomas Dekeyser:It's a deeper set of questions than just what do we use, what do we decide not to use. And I think that makes it really difficult because it means we have to alter our own subjectivities significantly if we are going to pose a serious threat to our contemporary sort of technological powers. And that makes it really difficult, I think. But I do think it's a necessary task because otherwise we're just going to keep reproducing the same problems.
Brian Merchant:Yeah, that's why I was interested to bring up the new Luddite clubs and people who are not just rejecting the technology outright, but also building clubs and groups that can then socialize that rejection in some way. Thank you for that.
Thomas Dekeyser:If that's okay, I want to ask Sarah actually how you would respond to that question, given your new book coming out as a feminist critique of big tech. I don't know if part of that is thinking through feminist ways of resisting, or feminist modes of doing and thinking and making technology. I don't know if that's a part of it, but I'm also curious as to how you would respond to that, potentially.
Sarah Sharma:Yeah, I feel like I wrote it in a way so I couldn't get asked that question Brian just asked, because I know that everybody wants to ask that question, but it still comes up. Even in the book, I feel like I present it as a clearing so that we can get to that part.
Brian Merchant:That's what all you academics do. You all are like, we make the theory. I'll leave—
Sarah Sharma:You know what? We're paving the way for Brian to exist. That's what we're doing. Don't you think, Thomas? We are the condition for your possibility, Brian.
Brian Merchant:That puts a lot of pressure on me.
Sarah Sharma:No, I'm not exactly kidding. It's like a division of labor. This is one thing I love about talking to you two: we're doing different things. We can't each do that thing, and it's necessary.
Sarah Sharma:I think I'm taking up, like, where do we locate gender in the politics of big tech? Right? And there's no obvious space or place; it's not where we're looking. One of my arguments is that feminist positions on technology spend so much time on the issue of reformation and repair because of the condition of our labor. We have to deal with this patriarchal technologic, and we spend so much time in it that we never get to the part of imagining otherwise.
Sarah Sharma:And I don't think just imagining otherwise is an effect of politics either. Like, I'm not talking about other-world making and something that's just gonna arise, but we never get to the part of devising a feminist technologic because we're trying to deal with the patriarchal technologic. Part of doing that for me is that I really think we need a feminist philosophy of technology, and I don't think reformation is a feminist philosophy of technology. I think that's a way of trying to repair and ameliorate the pain and violence of living under the patriarchal technologic. I define the patriarchal technologic as, like, homo- and heteronormative economies of exchange, a success ethos that revolves around productivity, time management, the nuclear family, technologies of delegation over assistance.
Sarah Sharma:And so I'm interrogating these dominant structures and frameworks. When I mentioned this part about the apps and the mobile devices that we hold close to our bodies, I'm suggesting these are patriarchal technologies because they're technologies of delegation that depend on the utility of others. They only work in a relationship where you delegate and depend upon others' utility. And so what would a world of feminist technologics look like? And it's not about, like, would life be better on another planet? It's part of the disdain and discomfort, and what I call an impasse, in thinking about technology.
Sarah Sharma:We're kind of confined in this patriarchal technologic. So when somebody asks me what to do, which they always do, it's always the last question after a talk, when I've already stated at the beginning that I do not know what to do. I am usually tired, and I don't know what to say, and I hopelessly fail in my response to that question.
Thomas Dekeyser:Yeah, I can echo that. Part of the reason why I set the book up in the way I do, around specifically this kind of philosophical problem, is precisely that. I think the question of what is to be done is a crucial question. But before we get to that question, I like the way you're putting it, Sarah, in terms of a clearing. I think that is entirely important work, and maybe just writing an academic book isn't precisely the way of getting at what is to be done. It might be through practise, it might be through organising.
Thomas Dekeyser:So I think sometimes it can be unfair to ask of someone who wrote a book what is to be done. I understand the impulse, and I think it's an important one, because people want answers to a difficult question like that. But I think that's work to be done by myself as well as others. I wouldn't want to offer a clear answer to that, like, let's set up an abolitionist tech club. I mean, maybe that's the answer, but I think that's collective work.
Sarah Sharma:But we also have to see this kind of writing and thinking as collective work. One of my arguments too is that I believe techno-negativity on the ground, labor organizers, need feminist theory. They need a feminist approach. They need to think not just about women and others in terms of identity, but about an orientation towards an alternative technologic. I think it's amazing that there's an emergence of family abolition, police abolition.
Sarah Sharma:There's a whole bunch of refusal, Blood in the Machine. These things are emerging at a similar time. There is a structure of feeling around the topic, and this is a division of labor. We can't do this all at once. But one of my arguments in the book is that the Uber Eats driver needs to be able to identify with Wages Against Housework.
Sarah Sharma:They need to be able to see themselves as part of this socially reproductive order, not just as an entrepreneurial hustling male. They need to be able to see themselves within the frameworks of feminist thought and action. But that's, like, an impasse. You know, labor organizing is based on theory. But which theory?
Sarah Sharma:I think that's another way to think about this as well.
Brian Merchant:I have very much enjoyed putting you both on the spot here. Great answers and great thoughts.
Sarah Sharma:You know, Brian, I've never done it, but when somebody asks me that question in a talk, I could just be like, what a patriarchal question. What is to be done?
Brian Merchant:You could, and you—
Sarah Sharma:But I have never, and I will never.
Brian Merchant:Well, I was asking Thomas anyway.
Sarah Sharma:Yes, anyways.
Thomas Dekeyser:Sorry for passing it on to you. I'm just always curious, precisely because I tend to deflect the question, how other people respond to it. And it's a question I often get. I made a film that came out a few years ago called Machines in Flames, which focuses on a specific group that I also discuss in the book, called CLODO, a French acronym for the Committee for Liquidation or Subversion of Computers. It's a group that set fire to early computer companies like IBM, Philips Data Systems, and others. Whenever I screen this film, at the end people ask, what can we learn for our contemporary moment?
Thomas Dekeyser:And I'm like, I don't know, ask CLODO. We don't know who they are. But I think it's a really interesting impulse. It comes in part from the urgency that we all feel. I think the question always comes from a good place, which is: this is a particularly dire situation that we are in. What kind of tools can we come up with in order to break out of it?
Thomas Dekeyser:It's an important question, but it's a difficult one to answer without immediately reproducing certain kinds of problems.
Brian Merchant:Yeah, all that. And it's a beautiful film.
Sarah Sharma:Yes. We screened it in Toronto two years ago. It was fantastic.
Brian Merchant:You could see echoes of the film in the book as well. It has a similarly poetic approach to refusal, which I appreciate.
Thomas Dekeyser:I wanted to finish by saying thanks again to Sarah and Brian for engaging with the book, for joining me, and for having a conversation about these themes that, each in our own ways, we think about a lot in our work and in our lives. So I appreciate that. And I think I already said this at the beginning, but you're important inspirations for the work that I do, whether it's my writing, films, and so on. So it's been great to have you on here and to be able to have this conversation.
Sarah Sharma:No, it's wonderful to talk to both of you. Thank you.
Brian Merchant:Yeah, and likewise. Thank you for writing this book and giving us more fodder for thought, for establishing that historical lineage and an opportunity to talk about and think about refusal in a more robust way, and for this conversation, which was good fun. Thanks for letting me yap it up.
Narrator:This has been a University of Minnesota Press production. The book Techno-Negative: A Long History of Refusing the Machine by Thomas Dekeyser is available from University of Minnesota Press. Thank you for listening.