Machines & Meaning

We explore Alasdair MacIntyre's concept of narrative fragmentation and whether large language models (LLMs) contribute to it through their underlying architecture.

What is Machines & Meaning?

Machines & Meaning examines artificial intelligence through the lens of different philosophers to understand how AI technology shapes human experience. Created for curious, thoughtful people who want to move beyond simplistic "AI is good" or "AI is bad" narratives, each episode takes a key concept from a philosopher and uses it to examine a specific aspect of AI technology and its impact on human life. While the show assumes listeners are familiar with current AI developments, it doesn't require technical knowledge. The series aims to help listeners develop a deeper understanding of how these technologies are changing how we think, behave, and relate to one another by bringing philosophical insights into conversation with modern AI developments.

Angel Evan:

Ever wonder what happens to our stories, our narratives, when they get processed by AI? Today, we are diving into this question through the lens of philosopher Alasdair MacIntyre. We'll explore what narratives mean in MacIntyre's view, peek under the hood of large language models, and finally ask ourselves, are these AI systems changing how we record and understand our own stories? Hello, and welcome to Machines and Meaning. I'm your host, Angel Evan.

Angel Evan:

Does the underlying architecture that powers large language models, or LLMs as they are commonly called, contribute to the creation of narrative fragmentation? That's our topic today. We're going to start with an overview of Alasdair MacIntyre, a contemporary Scottish American philosopher heavily influenced by Aristotle and Thomas Aquinas. Next, we'll explore MacIntyre's theory of narrative fragmentation, something he believes is caused by what he calls modernity, which is simply the condition of being modern. Finally, we'll take a look at two specific pieces of functionality in LLMs to understand whether they contribute to MacIntyre's argument for narrative fragmentation, using a real LLM query about the Apollo Moon landing in 1969.

Angel Evan:

Let's start with some background on Alasdair MacIntyre. For historical context, Alasdair Chalmers MacIntyre was born in 1929 and is 95 years old at the time of this podcast. He is a philosopher who has contributed to moral and political philosophy. In terms of key life events, he began teaching in the UK in 1951 and moved to the US around 1969. Over a 30-year timeframe, MacIntyre held multiple posts at various schools, including Boston University, Notre Dame, Duke, Wellesley, and Yale, to name but a few.

Angel Evan:

He's been a professor, a professor emeritus, and a research fellow. He retired from active teaching in 2010, but remains a senior distinguished research fellow at Notre Dame. And apparently, he even still maintains an office there. His greatest influence is surely Aristotle; it's impossible to talk about MacIntyre without discussing him.

Angel Evan:

But he's also clearly influenced by Thomas Aquinas, the 13th-century Italian priest who is often credited with blending philosophy and theology, and who was himself influenced by Aristotle. In terms of historical impact, MacIntyre makes a case for why his approach works compared to other normative ethical theories from philosophers like Bentham, Kant, or Rawls. These quote, unquote Enlightenment thinkers, MacIntyre believes, are generally misguided, and they lead him to realize that he needs to return to Aristotle's roots, not just borrow from Aristotle directly. So MacIntyre rails against the overestimation of reason by these Enlightenment thinkers and instead tries to revitalize the tradition of Aristotelian ethics, a field of study devoted to answering the question of how to live a good life.

Angel Evan:

MacIntyre has an opinion on many topics, including the idea of tradition, the ethical failures of encyclopedic practices. Yes, believe it or not, MacIntyre has a problem with encyclopedias, and also the concept of telos, a Greek word meaning a person's ultimate object or aim. But there are two topics in particular we'll be examining today. The first is narrative. The second, fragmentation.

Angel Evan:

These are topics MacIntyre makes a summative argument for in three of his books. The first, After Virtue. The second, Three Rival Versions of Moral Enquiry. And the third, Ethics in the Conflicts of Modernity. In chapter 15 of his book After Virtue: A Study in Moral Theory, MacIntyre states that man is essentially, in his actions and practice, as well as in his fictions, a storytelling animal.

Angel Evan:

He says that narratives help children learn about the world's ways and that without narratives, there's no way we can understand any society, including our own. So for MacIntyre, narratives are crucial to understanding ourselves and essential in helping our society to function. But what about this idea of fragmentation? The concept of fragmentation is a theme throughout much of MacIntyre's work. In MacIntyre's view, the self is fragmented, which happens when the multiplication of desires leads to a multiplication of choices.

Angel Evan:

We can have fragmented traditions, fragmented narratives, and even fragmented morality. The root of this fragmentation is imposed by modernity, which MacIntyre believes partitions each human life into various segments, each with its own norms and modes of behavior. And so for MacIntyre, we've lost the ability to see our lives as one unified narrative. And instead, we're left with these disconnected pieces that don't quite tell a complete story. Interestingly, fragmentation is also how LLMs input and output information.

Angel Evan:

The transformer architecture, the underlying architecture of LLMs, comprises many interconnected components, but there are two worth examining as potentially causal in narrative fragmentation. The first is the encoder-decoder function. The second is the attention mechanism. Let's break them down using an interpretive understanding of this technology based on the paper "Attention Is All You Need." The encoder takes the input, like a sentence, and converts it into a numerical representation that captures its meaning.
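The encoding step just described can be sketched in miniature. The vocabulary and embedding vectors below are invented purely for illustration; a real model learns embeddings over tens of thousands of tokens, not whole words:

```python
import numpy as np

# Toy vocabulary mapping each word to an integer ID (invented for illustration).
vocab = {"when": 0, "did": 1, "astronauts": 2, "first": 3,
         "land": 4, "on": 5, "the": 6, "moon": 7}

# A small embedding table: one vector of "meaning" per token ID.
# Real models learn these vectors; here they are just random numbers.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), 4))  # 4 dimensions for readability

def encode(sentence):
    """Convert a sentence into a sequence of numerical vectors."""
    token_ids = [vocab[word] for word in sentence.lower().split()]
    return embeddings[token_ids]

encoded = encode("when did astronauts first land on the moon")
print(encoded.shape)  # one 4-dimensional vector per word: (8, 4)
```

The point is simply that the sentence stops being words and becomes a grid of numbers, which is the only form the rest of the architecture ever sees.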

Angel Evan:

It's like converting words into a numerical language that the computer can understand. The decoder reads this numerical representation and generates the output one piece at a time. As it produces each new piece, it looks at what it has already generated to help decide what should come next, similar to how we form sentences one word at a time while considering what we've already said. The attention mechanism matches a query against a reference called a key, and that determines weights. It then combines values using those weights to produce an output.
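The query-key-value step just described can be written out directly. This is a bare, single-query sketch of scaled dot-product attention with made-up numbers, not the multi-head machinery of a real model:

```python
import numpy as np

def attention(query, keys, values):
    """Scaled dot-product attention: score the query against every key,
    turn the scores into weights with softmax, and blend the values."""
    d = keys.shape[-1]
    scores = query @ keys.T / np.sqrt(d)   # how well the query matches each key
    exp = np.exp(scores - scores.max())
    weights = exp / exp.sum()              # weights are positive and sum to 1
    return weights @ values, weights       # weighted blend of the values

# Five tokens with 8-dimensional keys and values (made-up numbers).
keys = np.eye(5, 8) * 3.0
values = np.arange(40, dtype=float).reshape(5, 8)
query = keys[2]  # a query that matches the third key

output, weights = attention(query, keys, values)
print(weights.round(3))  # the third token dominates; the rest contribute little
```

Because the weights must sum to one, giving more weight to one token necessarily takes weight away from the others, which is exactly the prioritization we'll be examining.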

Angel Evan:

The weights control how much each value contributes to the final result. And the key innovation here is that the function focuses on the most relevant information for each specific task. But by doing so, it omits seemingly unimportant details, resulting in, you guessed it, fragmentation. Let's look at a simple question using the LLM Claude 3.5 Sonnet from Anthropic. I type in the question, when did astronauts first land on the moon?

Angel Evan:

And here's the response I get from the LLM.

AI Voice:

The first moon landing occurred on July 20, 1969, when Apollo 11 astronauts Neil Armstrong and Buzz Aldrin landed the lunar module Eagle while Michael Collins orbited above in the command module. Armstrong became the first human to step on the lunar surface at 10:56 PM EDT, famously declaring, "That's one small step for man, one giant leap for mankind."

Angel Evan:

MacIntyre would consider the LLM's response to be a form of narrative. And on the surface, that seems pretty straightforward. But let's examine the response more closely through this MacIntyrean lens of narrative fragmentation. First, there is no mention of President Kennedy's 1961 speech setting the goal to reach the Moon by the end of the decade, an essential part of his narrative. There is no mention of the astronauts being from the United States, an essential part of that country's narrative, especially in the context of the space race between the US and the Soviet Union.

Angel Evan:

And there's no mention of the more extensive Apollo program, which was a years-long culmination of technological development gained from earlier missions, an important part of the narrative for the thousands of people who contributed to that program. In MacIntyre's eyes, these omitted elements are precisely the kinds of identities society needs to understand itself. Their omission results in a disunified narrative that lacks any sense of shared cultural experience. But why? What happened?

Angel Evan:

Well, the attention mechanism likely gave the highest weights to the word "when," since it's the query word about time, and to time-related words in a relevant sentence, like "1969" and "astronauts," to connect the context to that query. However, it gave low weight to the prepositions and entirely omitted details about the cultural narrative. The result is that we get pieces of the story, but not the whole story, and perhaps not even the most interesting parts of that story. In fairness, alternative LLMs might have preserved narrative unity by imposing their own rules of relevance, but they certainly wouldn't be consistent, which leads to additional variance and, by extension, additional fragmentation. Regardless, this automated prioritization of information mirrors MacIntyre's critique of how modernity creates fragmented human experiences that fail to preserve their inherent unity.

Angel Evan:

MacIntyre says that we're all bearers of a particular social identity. We are someone's son or daughter, someone else's cousin or uncle. We are a citizen of this or that city, a member of this or that guild or profession. We belong to this clan, that tribe, this nation. These social identities are recorded through narratives, which become fragmented when technology decides which parts about them are, quote, unquote, important and which parts are not.

Angel Evan:

Are these AI systems changing how we record and understand our own stories? I would say they are, and I think MacIntyre would agree. In his work, MacIntyre argues that modernity partitions each human life into a variety of segments. And it's the distinctiveness of each segment, and not the unity of the life, that we are taught to think and to feel. By design, large language models transform narratives into segments, which are prioritized based on an opaque set of parameters that determine what is important, and in the process, may dismantle a narrative's unity, even if their intent is simply to summarize a series of events.

Angel Evan:

This is an important point. So let me say this again in a slightly different way. In an attempt to summarize historical facts, LLMs have to make choices about what to omit. But what you have to ask yourself is this. What information is lost in the summarization process?

Angel Evan:

And does the benefit of the summarization outweigh the value of the information that is lost in the process? It's almost a trade-off between convenience and context. Sometimes we become so enamored with, and perhaps distracted by, technology that we forget to ask ourselves a larger question of how it changes us, or, through a MacIntyrean lens, how it fragments us. LLMs change how we receive information and understand ourselves and our narratives through the means by which they process and serve that information. In this way, the very tools we use to tell our stories may perpetuate the threat of fragmentation that MacIntyre warns us about.

Angel Evan:

And while the final effects are yet unknown, one thing is already certain. LLMs are rewriting our stories and how we come to know ourselves through them.