Some Future Day

Is Twitter (X) dying?

Kurt Wagner is a tech reporter for Bloomberg who has been covering X, formerly known as Twitter, for over a decade. During this time, Kurt fostered special relationships with senior-ranking staffers, ultimately laying the groundwork for his incredible new book about Twitter, "Battle for the Bird: Jack Dorsey, Elon Musk, and the $44 Billion Fight for Twitter's Soul."

In this episode, our wide-ranging discussion surrounding social media's impact on society, technology's transformation of legacy media, and the new face of Twitter provides us with an important peek into the future.

Sign up for the Some Future Day Newsletter here: https://marcbeckman.substack.com/

Episode Links:
Kurt's Book: https://www.amazon.com/Battle-Bird-Dorsey-Billion-Twitters/dp/1668017350

To join the conversation follow Marc Beckman here:
YouTube
LinkedIn
Twitter
Instagram

What is Some Future Day?

Some Future Day evaluates technology at the intersection of culture & law. 
 
Join Marc Beckman and his esteemed guests for insider knowledge surrounding how you can use new technologies to positively impact your life, career, and family.  Marc Beckman is Senior Fellow of Emerging Technologies and an Adjunct Professor at NYU, CEO of DMA United, and a member of the New York State Bar Association’s Task Force on Cryptocurrency and Digital Assets.     

Marc Beckman: All right, Kurt Wagner, how are you?
Kurt Wagner: I'm great. Thank you for having me.
Marc Beckman: I'm so excited to see you. It's such a pleasure to have you on Some Future Day. Honestly, I think this is a really important topic today. The work that you've done in building this book, Battle for the Bird, is incredible. I understand that it took you several years really [00:02:00] to compile this information.
How many years did it take you actually to write the book?
Kurt Wagner: So, you know, I started the book, I actually went back and looked, I was pitching the book about two years ago, April of 2022. Um, so I guess, you know, from pitching the book to publishing the book, a little under two years. But really I've been covering Twitter, now X, of course, for, you know, a decade. So even though it's been about two years of, you know, book-specific work, I was building off of years and years of sourcing and stories and notes that I'd already taken.
So I feel like it's a several-year process, even though I guess, you know, technically it was maybe 18 months from start to finish in terms of getting the book out.
Marc Beckman: Incredible commitment in your lifetime, and, you know, it's pretty cool that you have developed all of these, um, insider types of relationships over at Twitter/X. But your vantage point with regards to social media, I would imagine, has to be, um, you know, [00:03:00] looking at things through a wider lens.
If you're going to look at what Twitter and X stands for, you must have to have, um, a wider lens as it relates to social media in general.
Kurt Wagner: Yeah, I mean, it's one company that I cover, but I also cover, you know, Meta and all the apps there, Facebook, Instagram, WhatsApp. You know, I've covered Snapchat, uh, I've covered TikTok, Pinterest, like all of these sort of consumer social services, not to mention dozens of startups that have come and gone over the years.
So, you know, to understand where X fits, you sort of have to understand the broader picture, the sandbox where they're playing. But then on top of that, you know, I'd like to say this about Facebook and X in particular: what happens in the world is literally happening on these platforms, right?
So politics, culture, music, business. Like, you can't just know sort of the ins and outs of the business. You really have to be tuned into, like, pretty much what's happening in the world, because that's what people are talking [00:04:00] about, and that's what drives policies at these companies, that's what drives content issues at these companies. So, you know, it feels like one of those beats that keeps me very, uh, widely read, if you will, because I sort of have to know what's going on everywhere.
Marc Beckman: So is social media important?
Kurt Wagner: Yeah, I mean, I think that it is. I've been getting my news through social media for a really long time. So for me as a news consumer, it's sort of been like the first stop. Twitter, historically, has always been the first stop for me. Um, it's where I'm entertained, right? So I'm, you know, not afraid to admit, I go to TikTok.
I get a lot of entertainment from TikTok and Instagram these days. Um, you know, I stay connected with people that way. I've built a whole community, especially on sort of old Twitter. Um, I say old Twitter, that's sort of pre-Elon Twitter. I had a much bigger community there, both professionally and personally, that I would tap into.
So for me, like, social media is sort of this first front door into the internet in a lot of ways. It's like the [00:05:00] places I go first in order to then jump from there to a specific news website or to a specific video or whatever it may be. I understand, you know, that's not how everyone uses these things, right?
And I think there's been a change in perception around social media ever since the 2016 election, quite frankly. I think for a long time it was like, hey, these are great. Look, we're all staying connected. You know, what a great tool this is. And I think the 2016 election was the first moment where we all really said, oh, wow, there's a real negative downside to these companies and these platforms if you're not paying attention. And, um, you know, I think it's sort of gone downhill from there for a lot of people. So that's just my view; I tend to use these things a lot and, like I said, view them in that way. But I know for other people, the social media industry has really, you know, lost some relevance over the last couple years.
Marc Beckman: So that's really interesting, because on the one side, you're saying it's an amazing source for breaking news, um, gathering [00:06:00] current events, as well as being entertained. But the 2016 election in particular has been transformative in nature. Why do you use the 2016 election as such a demarcation in the transformation of social media, generally speaking?
Kurt Wagner: Yeah, I mean, I think there were two kind of big things that came out of the 2016 election in the U.S. The first was this realization that, um, these networks can be manipulated and abused, right? And, and in this case, it was specific to, you know, Russia, uh, trying to sway opinion of American voters without anybody knowing about it.
And that's just sort of one example, but it sort of opened the door to, oh, well, you know, how else are these networks sort of being manipulated? How else are these networks being abused? How are people being influenced by these networks without necessarily knowing where the content's coming from? Right?
So I think that was sort of one huge moment of that election. I think the other is just simply having, you know, President Donald Trump in [00:07:00] charge in the U.S., and the platform that he had on Twitter in particular, but also Facebook, to, you know, be, in my opinion, very divisive, right, and very, uh, confrontational. I think it was sort of the start of a, you know, cultural divide in the United States. Um, and so I think, okay, on one hand, it's like these platforms have been exposed as, you know, easy to manipulate, and on the other hand, it just created this culture war that then, I think, has been exacerbated by these platforms, right?
Because now you have people who are taking very strong stances on one side or the other. And it just feels like the kind of discussion on these platforms has gotten much more polarizing since 2016. So for those two reasons, I sort of use that as like the moment we took a really hard turn when it comes to social networks.
Marc Beckman: So, you know, it's pretty significant, because it's almost 10 years since that 2016 moment in [00:08:00] time. And I wonder what the, uh, impact has been on our culture and on American society's desire to read long-form content, to do the hard work to understand the news, right? Like, if I'm taking in the news with Twitter, I get the headline, I click the link, and then I go deeper. But often I'm not, and I'm sure a lot of people feel this way too.
I don't have the time, or I don't want to read the long-form content off of my phone. I prefer to read it off of my computer or actually with a physical newspaper. So how has the delivery of news since 2016 transformed our society?
Kurt Wagner: I think this was happening, I think the distribution of news in this way was happening pre-2016, to be clear, but it just got more and more polarized starting about then. But, I mean, it's snackable, uh, you know, headlines, right? I'm guilty of this too. You scroll through, you read the headline.
Okay, great. You sort of make an assumption about what maybe the rest of the story is, and you move on, and [00:09:00] you're doing that constantly throughout the day, uh, especially somewhere like Twitter, but also on Facebook or Instagram or wherever you might get your news. And so, I think what it has done is it has expanded probably the number of headlines that you see, right?
So you're now getting a wider swath of information than you may have before, but you're not going nearly as deep, presumably, because you're not clicking in and reading. And, you know, part of this is a media industry problem too; a lot of stuff, including at Bloomberg, is behind a paywall, right?
So oftentimes if I do click, hey, I really want to read this, oh, I hit a paywall. And, you know, you're not going to pay for 20 different news subscriptions. Most people are not. And so then you end up going back, and you sort of get trained to think, like, oh, I'm not even going to click, because I'm running into these paywalls all the time.
So this way of news distribution has pros and cons. It's much faster, which is great. Um, like I said, you do get exposed to more ideas, which is great, but you're not going as deep. And I also think, you know, one thing [00:10:00] I've certainly noticed, and maybe this is how it always was, but I just didn't notice it as much, is people have made up their mind on a lot of these stories or headlines before they ever read them or learn anything more about them.
Right. And so you feel like, as a journalist, sometimes you're presenting this, you know, thoughtful, uh, summary of the facts, and yet some people are going to agree with you from the get-go. It doesn't matter what you write; they're going to say, yes, this story is good or bad, or whatever.
And there are just as many people who feel the exact opposite. They're never going to take it seriously, and they see the headline and immediately assume, you know, it's clickbait, or it's fake news, or you're anti, you know, Donald Trump or someone else. And so I just think it's a really tough industry right now, because I don't feel that there's a lot of room for nuance, um, both because of these platforms, but also just because of the way that people have sort of dug their heels in on various sides of the political spectrum.
Marc Beckman: Kurt, it's kind of interesting. So you're hitting on a topic that I haven't [00:11:00] discussed yet on the show, um, but I talk about often with friends and colleagues. It's this, um, psychology idea of groupthink. And I think this pertains in particular to, um, not just Gen Z, but Alpha and Millennials as it relates to social media, whereby, um, a person thinks that she or he might have a particular idea on a topic, and then will jump into social media to validate whether or not they're thinking the right thing, to not just see what their friends are saying about that topic, but also what the media is doing. So they come across someone like you, a properly trained journalist who's doing, you know, tons of work to create these stories, to get verifiable information, facts, um, sources, and they're not clicking deeper, because maybe they don't want to pay for it, but they see the headline, the headline could verify what they're thinking, and then that social consciousness seeps [00:12:00] into every part of our society.
Kurt Wagner: Yeah, I mean, there's a term for that on social media, right? Which is this filter bubble, right? And this idea that you are following people, whether intentionally or unintentionally, who probably reinforce your beliefs and ideals and share similar views that you have, right? And so, you know, I'm not a psychologist, so I don't want to talk out of turn here, but that makes total sense to me, right?
You would go on, you would see someone who says something, you'd say, yeah, I agree with that. That sounds smart. That aligns with my views. You follow that person. And now, as a result, you get more of that, right? And you can see how, maybe starting in the middle of this spectrum, you could very easily go one way or the other just simply through that process of sort of coming across someone who agrees with you, following them, engaging with them.
And now these algorithms say, oh, okay, let me show you more stuff like that, right? And you can see people going further and further in each direction. You almost have to go out of [00:13:00] your way to follow people or engage with stuff that is maybe of an alternate viewpoint in order to get that take. And, you know, there's been a lot of research around this.
So again, I don't have it memorized, so I don't want to speak out of turn here, but you may have heard that term. And I think there's some validity there, which is, you know, these networks are trained to give you what you want to see, to make it so that you come back and spend more time.
And oftentimes that means, you know, showing you stuff that's either shocking, um, or something that aligns with your beliefs and makes you feel kind of validated.
Marc Beckman: Wow, that's pretty wild. You know, admittedly, I haven't heard of the term of art filter bubble, um, but I guess it totally makes sense, particularly for where we are in this day and age and with, you know, the next generation of consumers. But that also must have a greater effect on the polarization of our society, right, Kurt?
Because [00:14:00] now, if I'm only hearing, you know, the news that I prefer to hear, and I'm hearing from voices who I, you know, prefer to listen to, I would have trouble, uh, deducing what would be the truth.
Kurt Wagner: Yeah, and I think that's part of the kind of problem with the system and why it can be kind of viewed as broken in some ways, especially when it comes to news, right? Is that you tend to, again, get reinforced with something. Maybe the original path you go down or the original belief you have is inaccurate.
Well, you could be reinforced by more people who believe something inaccurate, right? And so now suddenly this idea of, maybe, a conspiracy theory, for example, gains a lot more steam, because before, you know, before the internet, maybe you'd be sitting at home being like, man, I have this theory.
It's really crazy. But you weren't constantly, you know, running into other people who necessarily shared that and were reinforcing that. I just think it's much easier [00:15:00] in 2024, uh, to go out online, and not just on social networks, but just in general, and find people, you know, communities, right, that sort of feel and believe the things that you believe, right?
In some cases, that's very harmless. You're into gardening, you're into cooking, you're into whatever. You know, you go find those people and you feel validated and supported. In some cases, you know, QAnon or Pizza Gate or whatever it is, you go and you can similarly find, you know, communities and people who are reinforcing those beliefs, but that can just be a much more dangerous thing than, you know, maybe it used to be.
Marc Beckman: It's also complicated, because I think the traditional rules and guidelines of journalism are completely thrown out the window when it applies to content creation and social media. So, you know, where do you stand as it relates to, you know, verifiable sources and information? Like, what's the impact now when, you know, you mentioned QAnon, for example? Like, if I'm in that community and I'm listening to everything that QAnon is saying and taking [00:16:00] everything at face value without doing the legwork, what happens to our community?
Why aren't people holding, uh, a higher standard, uh, as it relates to verified sources and verified information?
Kurt Wagner: Sure. Well, and to be clear, you know, the idea that people were getting misinformation has existed for a long time, right? So I don't want to say that this is a brand-new idea, just simply that it's become easier, I think, to be exposed to this type of stuff, or easier to reinforce those ideas. But, you know, there is some legwork now in finding what is true, uh, versus what is an opinion, versus what might be misinformation, in some cases intentional.
And I think that there's a lot of people who, with news, you know, they scroll because they want to feel like they're sort of educated at a headline level, and it's not worth the effort to go and say, hey, I'm going to sit down and, like, read this real deep dive. I think part of it is also probably on the media, right?
Which is, like, [00:17:00] we as the media, it's our job to condense these sometimes very complicated or nuanced subjects into something that people can easily digest and read. And so, you know, that's inevitably going to leave certain details out. You can't condense, uh, the war in the Middle East into a 600-word news article, right?
And expect people to walk away and feel completely educated about what's going on. And so part of that is the role of the media. But part of that also is, like, people need to be interested and willing to go kind of do this on their own a little bit, and I'm not sure, you know, what percentage of people are into that, or, you know, have the time to go do that. And so what you do instead is you read the headlines, and you watch the snippet of TV news or whatever, and that becomes the basis for your entire view on a particular subject.
And again, I don't think that's totally new. I just think we see it so much more because of social [00:18:00] media. You're exposed to the various viewpoints so much more because of social media that it feels more dramatic now than it probably used to be.
Marc Beckman: Yeah, it definitely feels more dramatic to me, I'll admit it. Um, so you highlight 2016 as this transformative moment for social media, generally speaking. Do you think that, um, we'll see another transformative moment for social media in the imminent future, maybe within the next five years or so?
Kurt Wagner: I think so. One thing that, uh, we're paying close attention to is this idea of generative AI, right? Where, um, you're now not just seeing content from your friends or family or celebrities or whoever, but you're seeing content that has been completely created with AI, right? A video that's been totally fabricated, audio that sounds like a celebrity but is really fake.
It's been doctored or whatever. And so you can imagine a huge moment, um, whether it's around the [00:19:00] election or something else, where basically the world gets duped, right? People think, oh no, uh, you know, here's a clip of Donald Trump saying something, or Joe Biden saying something, and it truly changes, maybe, the person. Maybe it changes a vote, maybe it changes the outcome of an election.
Maybe it leads to a war, right, or close to a war. It kind of cuts through this idea of trust, of being able to say, well, at least I could verify with my eyes that, you know, I'm watching this on video, and I therefore know that it happened. That's kind of out the window, or if not yet, it will be very soon. And so I don't know what that moment's going to be, but I could very well imagine a major moment on a social network like this that, again, pivots the entire industry into another sort of, you know, pre and post, right?
So for me, it was pre-2016 election, post-2016 election. Maybe it's going to be pre, you [00:20:00] know, that doctored video that led to a missile strike, and post that doctored video. And so, you know, you hope that does not happen. But the AI technology is just getting so good that it makes me wonder if we're going to see a moment like that at some point in the near future.
Marc Beckman: So you won't be surprised if, you know, something like this happens.
Kurt Wagner: No, I don't think so. I mean, we're already seeing, you know, doctored audio of President Joe Biden. We're seeing, you may recall, there were, uh, deepfake nude photos of Taylor Swift that went viral online recently. And, you know, for the most part, people scramble to try and immediately say, this is fake, don't believe it, you know, whatever.
But this stuff moves so fast, and there are people who, whether it's fake or not, will see it and think it's real, because, hey, I've seen it. And I think that's where you're just never going to be able to clean up a mess once it happens. I just don't know how big of a mess we're ultimately going to come across.
Marc Beckman: So we're going to get to this point, I think, culturally, [00:21:00] where most people should not trust the sources and should not trust the information, right? It's, um, almost as if we're going to the next extreme because of artificial intelligence. People could come to you, to Kurt Wagner's feed, and say, we know he does good work, therefore we can trust, you know, the news that he's reporting. But the reality is, when you're starting to get into this world where things can be fabricated and look so real, look so legitimate, but are totally made out of [00:22:00] a software program, it does create this uncertainty where people are no longer going to be able to instantly say, hey, I've seen it with my eyes, therefore it's true.
I think that is where people are going to have to pause and figure out: where is this coming from? Has it been validated? Et cetera.
Do you think politics and government will always have, um, such a heavy presence in social media? It doesn't seem like it was like that at the outset, but a lot of our conversation keeps going right back to it, right? The political landscape, the elections, the, you know, geopolitics. Do you think it's always going to be there now?
Is this, like, the new medium for politics to play itself out?
Kurt Wagner: Well, I think that just demonstrates how important these networks became and have become, right? That this is the place where discussion around politics and culture sort of happens. Um, you know, it's hard to imagine 20 years from now. Maybe these networks won't exist. Maybe we'll all be, you know, doing something in our VR or AR glasses or headsets or whatever, right?
So I don't want to say something definitive, necessarily, forever, [00:23:00] but it's hard for me to imagine politics being removed from the equation so long as we're continuing to get our news from these places, right? Because politicians, they want to be where readers are. They want to try and influence, you know, opinion and voter sentiment.
So they go to the places where people are discussing these things, and I think that is why we've seen such a huge impact on, uh, you know, X, or what was Twitter, on Facebook, on Instagram, TikTok, right? And TikTok, we haven't even talked about TikTok, right? Um, but that's, like, such a great example of blending social networking and politics, right?
It is a network where, for the most part, people are scrolling through and watching dance videos and comedians and skits, um, and yet it's at the center of a huge geopolitical struggle between the US and China, right? And in some ways it's almost, how the heck did we get here? And on the other hand, it's like, this just, again, goes to prove how instrumental these [00:24:00] networks are.
And to go back to maybe your question at the very beginning, you were sort of like, is social media still important? I think that sort of speaks for itself at this point, right? These networks keep showing up at the most important stage we have, and that sort of answers the question for us.
Marc Beckman: Absolutely. But just, since you brought up TikTok, and I don't want to spend too much time on TikTok today, but, you know, I'd be remiss to not mention: we have a situation where there is this geopolitical battle, right? We're concerned about China owning and operating, uh, TikTok and, you know, accessing Americans' data and beyond.
I know, like, we had on our show a guest of mine who was the chief technology officer of New York City, and the New York City mayor's office has banned TikTok from every employee's, um, phone. They're not able to use it on government-issued phones. I don't know if you're aware of that, or government-issued devices.
It's pretty compelling. But yet both presidential candidates, [00:25:00] Trump, I guess I should include RFK, but Trump and Biden are embracing TikTok publicly, like, using it and talking in favor of it. So, is it, like, just too strong of a drug at this point? Like, they realize perhaps that it's not such a good thing for America, but yet, because it reaches, you know, that younger demographic in such a powerful way, they'll do it anyway.
Kurt Wagner: I think so. And, I mean, it again proves the importance of reaching young people where they are, right? Because TikTok is, of course, predominantly popular among young people, young potential voters, um, which is why you're seeing someone like Joe Biden use it. There is, and I gotta be somewhat careful here, but, like, there's a level of hypocrisy, right, and this is not just unique to this situation, I think it's something we see across political parties and government in a lot of different things.
But here is one example, right? Joe Biden has said, I'm [00:26:00] going to sign this bill that will ultimately, you know, force TikTok to either be divested from its Chinese parent or be banned, warning about the data security threats, warning about the influence of, maybe, the Chinese government using the algorithm to feed people a particular narrative or sway political opinion.
And yet he's using it to campaign at the same time, right? And you see this tension between, you know, here's what I'm saying: I'm warning you that this thing is bad, and is bad for national security, so much so that we might, you know, force the app stores to remove it. And at the same time, it's so valuable, because we know people are using it, that I'm going to, you know, hope to raise some campaign dollars there.
There's just this, like, really crazy tension, uh, happening around TikTok, but, you know, you could find several examples of this, I think, with tech and politics right now. Um, it's hard for me to sort of wrap my head around. You know, as an observer, it can be a little bit [00:27:00] frustrating, because it feels hypocritical to me.
Um, but, you know, I think that's unfortunately the political system in which we kind of operate right now.
Marc Beckman: It is. It's a sign of the times, for sure, I agree. Um, it's interesting to take the concept of political influence a little further, to a recent Supreme Court case surrounding both the Biden administration's and the Trump administration's use of social media to effectively coerce the American public.
I'm sure you're familiar with this case. So at what point do, um, do we need to say, like, it's a little bit too much, the government shouldn't be involved? We are maybe perhaps limiting freedom of speech or other, you know, constitutional rights when the government is kind of stepping on the scales a little bit with social media.
Kurt Wagner: I think it's a real pendulum situation. It's a constant sort of back and forth, right? I think if you look historically, I'll use Twitter as the example, because that's sort of what, you know, we're [00:28:00] talking about today, or I guess what's now X. Um, you know, Twitter used to be, believe it or not, what its old CEO called the free speech wing of the free speech party.
Like, there was this idea that you could post basically anything to Twitter so long as it wasn't child pornography or, like, ISIS support videos or something, and you were kind of fair game, right? And the pendulum started to swing because people started to leave Twitter. They were saying, I'm sick of being harassed.
You know, they'd have high profile celebrities who'd say, I'm getting bullied off of Twitter and now I'm leaving. And the company made a business decision to sort of start to clean itself up. And, you know, it was like, hey, we're going to clean up hate speech. We're going to clean up racism because we're seeing that it's bad for business to have that stuff on there.
Then I think 2020 happened. COVID, the post-election period with Donald Trump, you know, basically disputing the election results. And you saw the pendulum move very dramatically in the other direction, [00:29:00] right? Well, now, if you're saying something that the government believes is inaccurate about COVID, you're going to be suspended.
If you're disputing the election results, you're going to be suspended or taken down. And you saw people say, whoa, whoa, whoa. You know, the pendulum went from the free speech wing of the free speech party to, I can't, you know, dispute the accuracy or the, you know, helpfulness of masking, right?
And so now you're seeing it start to move the other direction, because Elon Musk then came in and bought the company. And he's now saying, hey, we've gotten way too censorship-heavy. We're going to move it the other way, right? So I kind of bring this up, it's a little long-winded, I apologize, but the point being that this is a constant ebb and flow that sort of reflects, I think, what's going on in the world, what's going on with the business, what the consumer sort of feels in that moment.
And I think, like, Elon is kind of at risk of taking it the other direction right now, back to where we were, you know, maybe in 2014 or 2015, where it's like people are being bullied off of this app. People feel [00:30:00] like they can't go on X without being exposed to racist content or pornography or whatever it is.
And maybe it will, you know, again swing back the other way. So I just feel like this is what it's going to be. I don't know. I think this cycle is hard to, um, to fix, because free speech means something different to every person who you talk to. And so they're never going to find that perfect medium, but I think they're sort of constantly going back and forth, trying to do that.
Marc Beckman : Yeah, it's interesting as you're talking about this pendulum and the flow, um, to both extremes. I was thinking in my mind, like, wouldn't it be nice, I think most people are, uh, hoping for some kind of central, centrist, moderate positioning. Um, I just don't think we're going to get there quickly.
But when you talk about Elon, like, dragging the pendulum back towards this open speech and bringing back bullying and all of this, so many people think, like, Elon Musk is dangerous as a result of it. Like, would you categorize Elon Musk as [00:31:00] dangerous?
Kurt Wagner: You know, I think there's this feeling very similar to President Trump for me, where the influence that two people in particular have over culture and the way that people view a particular issue is so strong, right? I mean, it's hard to quantify. I think Elon has like 160 million followers or something, which is obviously a huge audience, but it's just hard to quantify how much he can sway public opinion and perception around a particular topic, right? And we're seeing him try to, you know, do this around topics like immigration, um, DEI, like diversity and inclusion efforts. He sort of has picked a few particular topics that he feels strongly about, and he's hammering them with his, you know, audience on X, um, if not daily, weekly.
And so I think where it can become dangerous is when you have someone who maybe [00:32:00] takes a stance that leads to a dangerous outcome, right? We saw this with President Trump and the election results, in my opinion, right? He's sitting here saying, the election's rigged, the election's rigged, you know, you need to come fight for me.
And what happens? We see an insurrection at the U.S. Capitol, right? And I think that is where you can see, when this message is hammered home to over 160 million people on a daily or weekly basis, you can see a real-world outcome of that. And so, you know, do I think that Elon is sitting around intentionally trying to, you know, harm people?
I certainly hope not. But do I think that sometimes the rhetoric that can come from, from not just those two, I use those two as an example, but anyone with that level of power and influence, is their rhetoric going to have an impact on the real world? And I think the answer is absolutely. And so it depends on the stances that they choose to take and whether or not those can lead to, you know, harms in the real world.
Marc Beckman : Yeah, I think [00:33:00] it's interesting to evaluate your comments and then, you know, go back to what we were talking about a minute ago, where we're living in this time where people are okay with soundbites, where they'll, um, look at a headline and say, well, that must be the truth. So if you couple those things together, perhaps the, um, impact on the general sentiment of the masses can really shift in a significant way. Most people are not going to do the legwork, to do the research, and come up with some level of deductive thinking.
Kurt Wagner: Right. And I think, again, when you admire a person, right? I mean, there's a lot of people who admire Elon for his business and technical success with SpaceX and Tesla. And so, you know, you follow him, perhaps you started following him years ago for that, right? Hey, this is a guy who's built some incredible technologies.
And then, you know, now you're getting fed anti-immigration content, or now you're seeing how DEI is, you know, the root [00:34:00] cause for X, Y, and Z. And so sometimes, you know, people maybe, uh, sign up to follow someone they admire for one thing. And then when that person's, you know, um, interests or mission or whatever it might be pivots, well now, you know, suddenly there's a whole group of people who are being exposed to other ideas.
Now, I'm not saying you shouldn't be exposed to other ideas by any stretch. I'm just simply saying that there can be things that, you know, can be more dangerous than others in my opinion, or certainly more divisive than others. And I think that that's what we're seeing right now with Elon. I think he's choosing to talk about a lot of things that can be incredibly divisive.
And I think we saw that with President Trump a lot as well when he was, when he was in charge and using X on a daily basis.
Marc Beckman : So do you think the idea of saying that Elon Musk is a true free speech advocate is a stretch?
Kurt Wagner: I do. I think there's too many examples of him, you know, saying, hey, I want free speech, but when that free speech sort of rubs up against another one of my goals or [00:35:00] things that I care about, you know, I'm willing to compromise, right?
Marc Beckman : His personal, you're talking about his personal or professional goals?
Kurt Wagner: That's right, that's right. And so, you know, yeah, I was gonna say we can go back to the book for a second because I have a couple examples in here, but like, you know, there's one, this was, this was a week after he took over, right?
So we've been talking about bringing free speech back, free speech absolutist, all of this stuff, and a week after he takes over, you know, there were users, kind of regular people on X, who were out there, um, harassing or criticizing advertisers on X, right? They were saying, how can you advertise, how can you give money to this guy, blah, blah, blah.
They're basically, they're peppering these advertisers with negativity. And this was driving Elon crazy, right? Cause he's like, hey, this is impacting our business. Now all of a sudden, I'm relying on these advertisers for revenue, and I have people harassing them. We're gonna suspend these harassers, right?
And it's, to me it was such a clear example of, I'm supportive of free speech until, right? [00:36:00] Because in this case, it was like, until that free speech might start to hurt the business that I'm running. And um, I think, you know, that's just one example. There's many examples like that, I believe, sort of in Elon's history and in his short time running X, to sort of convince me that while I think he talks a big game about free speech, I think there are limits to what he believes should be allowed.
And I think sometimes the, you know, that line can be drawn just wherever it's most convenient for him or the companies he's running.
Marc Beckman : So, you know, personally, Kurt, I happen to be like a major free speech advocate. I feel like everything needs to be out in the marketplace. But admittedly, there are some people that have accounts on X that just are mind-blowing to me. Like, why would he allow for Iran's Supreme Leader Khamenei to have an account on X?
Like, I just don't even understand the need. You know, it's always, um, hate speech that's coming from Khamenei. It's always, you know, wild, radicalized, anti-America, [00:37:00] anti-Jewish speech. So, you know, what about the idea of limiting, uh, personalities on X that are just, you know, built up to be hateful?
Kurt Wagner: Yeah, I mean, he's made a pretty conscious effort to, you know, bring back people who are very controversial or who have previously been suspended or banned from X for breaking the rules or whatever, right? And part of me thinks this is, you know, a bit of the sales pitch from Elon, right?
Is that, hey, um, you know, he wants Twitter to be edgy. He knows that the edgier X, excuse me, the edgier it is, the more, um, controversial it is, maybe the more attention it's going to get, right? And I'm sort of now putting myself in his shoes. I don't know exactly why he would want the Supreme Leader of Iran to have a megaphone on Twitter, but at the same time that's [00:38:00] not out of line with how he has handled other accounts on Twitter, or how he has framed his belief of what Twitter is, right?
And I think, look, if he wanted to come out and say, literally, we will let anything go, you know, short of, you would certainly hope, short of child pornography or anything like that, if he said any type of speech beyond that is fair game, okay, that's one thing. But that's not really what he's doing.
He's saying, hey, it's free speech, but, and he's making all these other concessions, but then he allows, you know, some very controversial folks on there. So that's where, for me, it becomes hard to take the free speech thing seriously, because it feels like there's all these examples on both sides. They're like, okay, well, why is he taking this down, but he says he loves free speech, and he's leaving this up, but he says he wants to protect, you know, sort of the privacy of the conversation or the safety of the conversation.
He really can't, you know, it's hard to do both those things. And I'm not sure he's doing a great job.
Marc Beckman : So I guess, um, one question as it relates to that is, shouldn't the market just regulate [00:39:00] itself? Like, he's providing the platform, and if everybody goes in there, it should be the marketplace's decision and obligation to go and verify sources and verify, uh, the news, and decide whether or not they're tuning into, you know, what might be total crap or what might be the most important information ever.
Kurt Wagner: Well, and this is, this is really the argument for why these rules at these platforms exist at all, right? This is a, these are businesses. And so, they come up with these rules. They say, hey, we're going to limit what you can say around hate speech or around racism or sexism or whatever. In some cases there are legal reasons, right?
Like, that's why they, they take down, uh, terrorist content, or they take down child pornography, or other things. Like, there are legal reasons that you want that stuff off of your platform. But when it comes to speech, they've made decisions around what you can and cannot say because they are trying to attract the largest possible audience.
Or at least that's how it's been historically, right? We talked about the pendulum swinging with Twitter before [00:40:00] Elon got there. Well, they're swinging that pendulum because they're trying to get people to show up and people aren't going to show up if they feel like everything they see on there is offensive.
And so, you know, to your point, it lets the market dictate, right? Like, people are willing to vote with their time. Does this network adhere, you know, to what I'm comfortable with in terms of speech? Yes or no. Am I going to spend time there? Yes or no. And so part of me thinks, that's what we're dealing with.
That's still what we're dealing with, right? But Elon has said, and this goes back years now at this point, I don't care about the business of X, it's just about free speech, right? But then he makes decisions as if he's running a business, which I think is a totally reasonable thing to do.
I just wish he would say that like, Hey, I'm making these rules to try and maximize the business potential of this network. Um, he sort of says the opposite all the time, which is where I think, again, you know, you get this sort of rub, right? It's hard to have a big business with complete and total free speech because people aren't just going to want to show up to something like [00:41:00] that.
Marc Beckman : So when you talk about the big business piece, I mean, Musk made a $44 billion gamble here on X. Do you think it's even feasible for this company to reach profitability and, um, you know, even get some return on that investment?
Kurt Wagner: $44 billion, it's hard to imagine right now. I mean, historically, there has been a time when Twitter was worth well north of $44 billion, at least on paper. So, is there a path to, you know, doubling, tripling this business? Sure. What's difficult to imagine is that happening under the current ownership and the current business structure.
Because X made, you know, about 90 percent of its revenue from, uh, advertising, and it still makes the majority of its revenue from advertising. The problem is that you now have an owner [00:42:00] who is, you know, um, tweeting things that are quite controversial, you know, allowing or not allowing certain speech based on his own personal beliefs, and most importantly, going out.
I don't know if you remember, but in December, he went on stage at a conference in New York, the DealBook Summit, and told advertisers to go F themselves, right? And so,
Marc Beckman : It was amazing.
Kurt Wagner: Yeah, it's like one of the craziest things I've...
Marc Beckman : I thought he was on drugs when, honestly, I thought he was on drugs when he did that. I was like truly shocked. I know that he's come out and advocated for, like, ketamine treatments lately, but I did not understand those comments at all.
Kurt Wagner: It was wild. I've never seen anything like that, um, from someone of that level of, sort of, success and power. My point being, though, is that it's hard to imagine a bridge being built between X and the advertising community when you have Elon doing the things that he's been doing since he took over.
And so when you paint it like [00:43:00] that, you say, okay, well, if ads aren't going to be what doubles or triples this business, what is? What's the other business opportunity here? And they haven't come up with anything yet. So that's where I'm like, is it possible? Of course it's possible.
We've seen networks grow beyond this size and scale before, but is it likely with the current setup? I would say absolutely not.
Marc Beckman : But when you talk about the business model, I mean, as you're very well aware, he, Musk, has been, um, publicly stating that this is going to transform into the everything app, um, where, you know, this new endeavor will have X compete with FaceTime and dating apps and YouTube and, you know, banking and LinkedIn.
And, you know, obviously he launched Grok artificial intelligence. So do you think, Kurt, that they're making progress towards becoming the everything app? Do you think that we'll see some kind of cryptocurrency exchange built into X at some point, or is this all blue sky?
Kurt Wagner: I think we're seeing progress in the sense that they are [00:44:00] continuing to move forward with this plan. I'm skeptical that this is what is going to ultimately save Twitter, for a couple reasons. Um, one, a lot of the things that they're trying are things that we've seen people try before without success, right?
And a good example, actually, is that they're launching this app for connected TVs. And it's the idea that you could open your TV and watch videos from X from your living room, I guess. You can sit on your couch and scroll through. Does anybody want that, is my question. Like, is anyone sitting here saying, I need another place to watch videos from X, and I want it to be on my television?
And I would say probably not. You know, I'm happy to be proven wrong, but it seems to me like that's not something people are clamoring for. Let's also talk about the banking. Like, this would require an incredible change in consumer [00:45:00] behavior, right? People would have to say, I want to start storing my money through my social network, not my bank, not, you know, not Chase, but I want X to hold my money and give me, you know, a, uh, percentage yield or whatever on my savings account. Is that going to happen? Is that what people are looking for? And I'm not sure the answer is yes there, either. So my point being is that they're sort of throwing everything at the wall.
Let's be the everything app, as you point out. Um, but there are a lot of other businesses that do those things really well already. And I'm not sure that people are looking for a single app to do all of this stuff. And so I'm skeptical that this is the plan to go to now.
For me, what Twitter always did best, better than anyone, was news, right? That's what it was for. Now, maybe that business was always going to be limited, because you can only grow so big as a news-focused service, but at least that was [00:46:00] their thing.
That was what they were best at. That's what differentiated them from everyone else. I worry that now, if they're just getting into video and banking and, um, you know, payments, they're just going to do the same things that a bunch of other businesses already do better. And I'm not sure that that's the strategy to really differentiate.
Marc Beckman : But look, just to play devil's advocate, with all due respect, uh, for a second, like, there is some proof of concept with those two pillars that we just touched on. As it relates to, like, scrolling and watching on television, I know for sure that Gen Z and Gen Alpha, uh, as bizarre as it is to me, are engaging with YouTube's original content on television now.
Um, and, you know, I also realize that Elon Musk and his team have been advocating for exclusive content to be, uh, published in long form now, from sports and beyond, on X. And then we know that there's also proof of concept with platforms like Venmo, payment transfer [00:47:00] systems and whatnot.
So maybe the next generation will find it convenient, uh, to have everything in this everything-app, uh, format. But, you know, everything kind of goes back to, like, Elon Musk. Like, is the brand X or is the brand Elon Musk? It's funny, like, obviously there must be a branding problem. My background is advertising, you might remember, and even during the course of this conversation, you and I several times have been, you know, vacillating between, should we call it Twitter?
Should we call it X? And, um, you know, clearly there's a branding problem. But the advertising perspective also goes back to, like, what Elon is saying personally, and people hate Elon Musk, but yet, um, there's this ability also to have, like, you know, Khamenei next to Coca-Cola or Khamenei next to Disney.
Like where does that come into play? So there's so many like moving parts as it relates to this social media platform in particular that are truly complicated. And I think he needs to unlock the financial [00:48:00] value that perhaps a banking system or an original content platform can bring to the platform because he's just losing so much advertising revenue.
And I know I'm throwing a lot out there, but
Kurt Wagner: No, no. I mean, so I'll maybe address the YouTube comment first, only because I agree with you, right? There is an example of this working, which is YouTube. There's a major difference between X and YouTube, which is that YouTube has the largest video library in the entire world, including a very advanced way for creators of video to make money, to build a life or a business using that platform.
X doesn't have any of that stuff, right? X has historically not been a place for video. It does not have a setup for, essentially, enticing video creators to come over and say, here, let me give you all my hard work for free, because I know I'm going to get paid from an advertiser in return.
It just doesn't exist there. So while I agree this does work for [00:49:00] YouTube, I'm not convinced it's going to work for X, because this is not what X does. And I haven't seen anything from them that has convinced me that they will be able to do that. Right? They've announced a few premium, um, video partners, including Don Lemon, and that one already fell apart, right? We, that...
Marc Beckman : ...up one day before they even got to the starting gate.
Kurt Wagner: ...they even got to the starting gate. So, you know, I realize that I'm being a little bit negative here on X, but that kind of goes back to what I was saying, which is that they are trying to do a bunch of things that other businesses have already mastered or are doing really well, and they're at a huge, huge disadvantage because they're starting from zero.
Video being, in my opinion, one of those examples. Payments, you mentioned Venmo. You're right. Like there is a need for me to be able to send my friends or family money very easily. But guess what? There's already a half dozen very easy places I could do that. Do I need to do that on [00:50:00] the social network that I use?
And I'd say the answer is no. So that's sort of, that's my feeling, right? Not that these types of businesses can't exist or that they can't be incredibly successful. It's that, is this the best use of X's time and resources to go after? And I'm not sure that it is.
Marc Beckman : That's interesting. A lot of it is, like, cult of personality with Elon Musk, and it goes both ways. And like I said, um, you know, there's a big part of the planet that loves Elon and a big part that hates Elon. In fact, just this morning, I heard this Australian senator, um, call for the imprisonment of Elon Musk, that Elon should be jailed.
I'll read you the quote, because I realize this is, you know, essentially breaking news and you might not have heard it. Her name is Jacqui Lambie, and she called for Musk to be imprisoned. She said, quote, Elon Musk has no social conscience or conscience whatsoever. Quite frankly, the bloke should be jailed.
But quite frankly, the [00:51:00] power that man has because of that platform that he's on, it's got to stop. It has absolutely got to stop. And it's, like, remarkable to me how he's such a polarizing personality, and that this person, Jacqui Lambie, admittedly, I don't know if she's an advocate or what her background is,
I do understand she's an Australian senator, though, that she's, like, publicly calling for, you know, Elon to be imprisoned. It's wild.
Kurt Wagner: Yeah. I mean, I don't know all the specifics of that either, but I would say, like, you know, politicians in the U.S. certainly love to say the most headline-grabby, most attention-grabby thing, and it sounds like that, uh, you know, maybe transcends to Australian politics as well. But, you know, let's set aside the logistics of should he or should he not be put in jail, which I think is maybe a conversation for a different podcast. To your point, it just shows you, you're right.
This is an [00:52:00] incredibly polarizing person. He's become more and more polarizing in the last two years since he took over X, um, more so than he ever was just as the, you know, CEO of Tesla and SpaceX. And I think what happened is that, you know, for a long time, his sort of missions, um, were viewed as, you know, good or well-intentioned, right?
Oh, he's trying to save the environment by coming up with electric vehicles. Oh, he's trying to explore space, you know, find out where the, the future of, of human civilization is going to live. And I think there were a lot of people who thought this is pretty exciting. You know, this is a guy who's willing to sacrifice time and money and attention for the good of humanity, right?
That was sort of the view of Elon maybe five years ago. And I think what happened with X is that he has been public in a way that most people never saw him before. They knew who Elon was, maybe as the CEO of Tesla, but that was sort of it. You know, they didn't listen to everything he said. They didn't see [00:53:00] everything that he said or believed, because it wasn't broadcast on X to 160 million people every single day.
And I think owning X has sort of changed that perception because we're hearing from him more often. I think we're seeing, you know, the way he engages and, and who he interacts with more often in ways that we didn't before. And I think that's just rubbed a lot of people the wrong way. People who probably would have supported him five years ago, who are now saying, wow, he's a very different person than I thought he was.
And I think his ownership of X has sort of exposed that to a lot of people.
Marc Beckman : So, Kurt, it's interesting because in your book, Battle for the Bird, you talk about the, um, pre-Elon Twitter culture, uh, seeming really galvanized, like a great level of morale, and a place that people loved to work. I'm wondering if this concept has translated into the corporate culture at X.
Like, have you seen, have we witnessed a, diminished morale since Elon's, uh, you know, at the helm?
Kurt Wagner: There was a huge [00:54:00] diminished morale early on, right? Because there were all these people who were hanging over from Twitter 1.0 into X, and I'd say that first year, there were, you know, people who stayed for various reasons, financial, uh, you know, visa reasons for some people who were maybe in the country on a work visa. Uh, you know, there are people who thought, well, maybe this could work, or maybe this is the kind of person I want to work for.
And then they find out six months in, okay, Elon's not that person for me, right? And so I think that first year was really tough when it came to morale. He was also fresh off of layoffs, um, you know, killing a bunch of projects, killing a bunch of teams. And that obviously, you know, sucks when you're working at a company and your team gets decimated, or this thing you've been working on for years gets thrown out the window.
That's tough. We're now about 18 months into, um, Twitter 2.0, I would say, um, which is obviously now X. And I think that the people who remain, [00:55:00] in a lot of cases it's because they want to work for Elon, right? They believe that Elon is changing the world. They believe that Elon is changing speech, um, or could change speech and the freedom of speech.
So I think we're sort of at the stage where the people who are left are, you know, for the most part, pretty loyal to him, pretty loyal to the mission. Um, they know what they've signed up for, right? Whereas I think when he took over, there's people who sort of were like, this is not what I signed up for, but I need a paycheck for the next six months, for whatever reason, I'm going to stick this out.
So, you know, as with everything, over time, I think you see the old guard sort of fully leave, uh, and you see the new guard come in, and those are the people who say, you know, I'm signing up for a reason. I'm signing up for this company and this CEO because I believe in him. And, um, I think we're more there than we were a year ago.
Marc Beckman : Oh, that's interesting. So, like, when you talk about this lofty aspiration of using X to change the world, and [00:56:00] now this new, uh, regime comes along and follows behind Elon to do so, my mind goes directly to the Twitter Files. And it's kind of funny, because when those were breaking, um, they really caught my attention.
I was like, wow, these are some terrible things that have been exposed as it relates to the way the company was operating, and outside third parties and all. And then I was thinking, like, as of today, it doesn't even seem like people care anymore. It seems like that's just old news, and it hasn't changed the way that society is functioning or operating.
So I'm wondering, like, if you could take a second, maybe first, to just explain to the audience what the Twitter Files are, I know there's a lot there, from a top-line level. And then, to your point, have they played a role in changing the world at all?
Kurt Wagner: Yeah, so the Twitter Files was this idea, shortly after Elon took over, to essentially expose the prior ownership, or the [00:57:00] prior leadership at Twitter, for what he felt were some egregious, you know, free speech or, um, content violations, right? And so he brought a couple of journalists in. He basically said, you have free rein through all of our systems, right?
You can go through old emails, you can go through old Slack messages, um, you can see all the internal messy discussion around a particular, you know, decision that the company had made, um, be it, you know, for content rules. And, like, let's expose this, right? I think he said something like, sunshine's the best disinfectant, right?
His idea being that we are going to make the old group look bad and expose all the bad things they did, um, to clean the slate and, um, you know, move forward into this new sort of, uh, Twitter 2.0, or what is now X. So there was a ton of attention, as you may remember, around sort of the first, second, third installments of the Twitter Files.
Um, the very first one, I believe, was about, uh, the Hunter Biden laptop [00:58:00] story. So, uh, if folks remember, in the run-up to 2020, um, right before the election, there was a story about, um, you know, a laptop that had been owned by Hunter Biden and some files that were on the laptop, and Twitter had suppressed the links to that story because they basically thought it was a hack.
They thought, um, you know, they had a rule around sharing hacked materials, they thought all this material had been obtained through a hack, and so they said, like, you can't even share the link to this story. And, you know, President Trump and conservatives were up in arms, they were pissed, right? Because they're like, this is negative information about my political rival right before the election, and yet here Twitter is basically forbidding people to see it.
This is negative information about my political rival right before the election. And yet here, Twitter is basically forbidding people to see it. Now, I'm sort of kind of getting, you know, this is why I'm a little reluctant to go into the Twitter files only because it's like you go way down a rabbit hole on one particular issue.
But this was an important one. It's also talked about in the book. Twitter made a mistake, and they, you know, within about 24 hours, I think maybe 36 hours, [00:59:00] they basically said, hey, this isn't hacked. We messed up. You can read the article right here. I would argue that the decisions they made about trying to ban the article probably drew way more attention to it than it would have gotten if they hadn't tried to do that.
So I'm not sure that anyone was like, Oh, I didn't get to learn this because of Twitter. But the point being is that they were putting their thumb on the scale, uh, for a very important story right before a very important election. And so, that was Twitter Files number one, and I bring that up because the idea was to expose the decision making process behind these decisions.
And um, there's now been tons of Twitter Files, you know, sort of, um, series that have come out over the years, and I think you're right, like, they have not landed in the way that I assume Elon thought they would. I think the first couple raised a bunch of, um, you know, questions and raised some eyebrows.
But for me, the biggest thing was, and I sort of mentioned this in the book, is I was like, For the most part, the Twitter files that I've seen have simply exposed how [01:00:00] complicated and messy these decisions are internally, right? Because with the Hunter Biden one, for example, we had internal Slack messages and emails of people saying, should we be doing this?
What are we doing? Like, can we even defend this decision, right? You have employees in there basically saying. What are we doing? Was this the right call? Yes or no? And so to me, again, it sort of just shows like these are really complicated decisions. In that case, Twitter made a mistake. They admitted the mistake and apologized.
I don't think that anyone necessarily learned more about that situation from the Twitter Files than they could have maybe anticipated, and so maybe that's why it didn't get as much attention as, as people thought that it would. But, um, I do think, you know, again, the idea, to go full circle, was: let's expose the old group, make them look bad, and move forward and, and sort of set a clean slate. And, you know, I guess you could say, um, that that is [01:01:00] done and, and he's sort of, he's sort of accomplished that.
Marc Beckman : Yeah, it's kinda, it's, it's really interesting, you know, again, I keep going back in my mind to the concept of trust and verify, um, and, and the verification issue, like, just has to pull the rug out from under, uh, the veracity of what's being reported on Twitter, on X, on social media in general. I mean, we didn't get into Spaces at all, Kurt.
Like, it's one thing to see it in writing, right, vis a vis a tweet, but, you know, I'll tune in, you know, there's a very big, I think the biggest personality as it relates to Spaces and reporting the news is a guy named Mario Nawfal. And, um, there are major issues with regards to the teams that surround him and, you know, help report breaking news, whether it's geopolitical or domestic or beyond. It's a, it's a major, major issue.
So, you know, I wonder what the impact will ultimately be, um, when, when advertisers are not able to track spoken word, but start to realize, like, a lot of what's [01:02:00] being said in these Spaces is totally inaccurate as well.
Kurt Wagner: Well, and that's someone that Elon has been on Spaces with a lot too, right? And so as a result, he's gotten an even more elevated, um, platform, because Elon grants that to him, right? He says, like, I'll come, you know, speak with you on Spaces and bring my 160 million followers with me. And so, you know, to go back to our question, is Elon dangerous, right?
That's an example, um, of, you know, him using this power to not just say, here are the issues I care about and I'm going to hammer you with those issues, but to elevate other people around him as well, right? Who, who may or may not have, you know, um, helpful things to say. And I won't speak to anyone in particular because, you know, I don't want to get into, like, the specifics of all these people.
I don't know them all that well. My point simply is that even being in Elon's orbit [01:03:00] can be enough of a, you know, boost to the rhetoric that you want to share.
Marc Beckman : For sure. For sure. So Kurt, is X dying?
Kurt Wagner: Ah, boy, I get this question a lot. I think it is losing relevance, in my opinion. Um, I'm not sure if that's the same as dying. I mean, the business we know is much smaller than it was before. Uh, the valuation of the company is much smaller than it was before. So from a business standpoint, I, I would say X is in a, um, an unhealthy situation.
The bigger concern for me is the relevance, right? Because if X maintains relevance, people are still going to be there. Advertisers are still going to ultimately show up. Newsmakers, celebrities, politicians will show up. I'm not sure that X is as relevant today as it was two years ago. And I think if I look forward, you know, two more years from now, I only see it going toward the less relevant
side of the spectrum. I'm actually interested what you, you know, everyone has their own perspective on this and opinion. This is just, this [01:04:00] is just my take on it. I'm curious what you think, but my worry is that the less relevant it becomes, the harder it is to build a business. And at a certain point, you know, obviously those things start to snowball, but I'm curious what, what you think.
Marc Beckman : Yeah, I mean, it's kind of interesting. Like, from my vantage point, I use every social media platform and look at everything, and I see a lot of, um, surplus space coming through the existing landscape of social media. I think there's a lot, um, where communities just aren't engaging, aren't really looking.
I think there are a couple of, like, big tent poles, but I think X has a place right now. I think it's an interesting community. I think it is a place, um, where ideas and thoughts can, um, travel across the globe. And, and I still think it's very powerful. I read somewhere, Kurt, for example, that it has the, uh, heaviest traffic.
You might have written this, actually, when it comes to, um, being, like, the number one place [01:05:00] for people to find their news. I, I really believe that, um, X is the number one place for people to go into social media and find their news. So, you know, clearly there's a certain need for it. But I guess we'll, we'll see what it transforms into.
Kurt Wagner: Yeah, I mean, I still use it, right? I'm, I'm still there. I'm not there as often, I will admit, like I used to be. But first thing in the morning, last thing at night, I would check it, you know, what felt like hourly throughout the day. Like, it was, it was the place I went. Uh, I would say my, my usage, it's hard to say.
I wish, actually, maybe I should go check my, my app usage, although I don't know if it has the historical stuff, but I would guess my, my time is probably cut in half. Um, you know, I'm one person, but I am also, like, that group of journalists who, who lived and died with, with Twitter for a long time and really cared about my following and my engagement and all that stuff.
And I just don't as much anymore. And I don't think I'm alone in that. Um, [01:06:00] so, you know, we'll see. I, I, I hope it doesn't go away, in the sense that, like, I've gotten a lot of value historically from it. I do agree with you. I think, like, it could fill a very important role, which is, like, the role of news distributor on the internet.
I just don't think it's doing a great job right now of doing that, and, um, we'll see. We'll see if that improves.
Marc Beckman : Are you following, uh, Khomeini?
Kurt Wagner: No, I'm not. I'm not. I do follow Elon, obviously, so I get all my Elon stuff.
Marc Beckman : So, Kurt, um, you've given me a tremendous amount of your time and, and more importantly, your insight and knowledge today, and I, I truly appreciate it. Um, congratulations again on your awesome book, Battle for the Bird. Um, I would highly recommend that everybody locks into this. It's really fantastic.
With every guest I have, we have a tradition: we end the show, um, with, uh, kind of looking into the future. I start a sentence, and then the guest ends it, and we incorporate the name of the show, Some Future Day, into that sentence. [01:07:00] So, if, um, if you're game, I'd love to, I'd love to share this concept with you and see where you end up.
Kurt Wagner: Okay, let's do it. I'll, I'll, uh, yeah, let's go.
Marc Beckman : All right, we're going to be very generalized here, but it's, you know, it's on point. In some future day, X will be,
Kurt Wagner: Bankrupt.
Marc Beckman : wow, in some future day, X will be bankrupt.
Kurt Wagner: I, I know. That's, again, I was like, I know, that felt very dramatic, but I do worry about, I do worry about the business. I do worry about the business.
Marc Beckman : Amazing. Well, Kurt, thank you so much. Um, it really has been a pleasure speaking with you today. Thank you for joining me on Some Future Day.
Kurt Wagner: Thank you for giving me the chance to come on, and I hope people, you know, give the book a shot. Um, a lot of love and effort went into that thing, so I hope people enjoy it.
[01:08:00]