Oxide and Friends

On December 20th, former Tweep Ian Brown joined a Twitter Space about Twitter's architecture. He was surprised when the head Tweep himself joined... and more surprised when Elon responded to his questions about a proposed rewrite by calling Ian a jackass! The Oxide Friends talk about architecture, rewrites, hubris, fear, curiosity, and safety.

Show Notes

Break it down with Ian Brown

We've been hosting a live show weekly on Mondays at 5p for about an hour, and recording them all; here is the recording from December 26th, 2022.

In addition to Bryan Cantrill and Adam Leventhal, our special guest was Ian Brown.

Creators & Guests

Host
Adam Leventhal
Host
Bryan Cantrill

What is Oxide and Friends?

Oxide hosts a weekly Discord show where we discuss a wide range of topics: computer history, startups, Oxide hardware bringup, and other topics du jour. These are the recordings in podcast form.
Join us live (usually Mondays at 5pm PT) https://discord.gg/gcQxNHAKCB
Subscribe to our calendar: https://sesh.fyi/api/calendar/v2/iMdFbuFRupMwuTiwvXswNU.ics

Speaker 1:

Ian, first of all, thank you so much for joining us. We know that your book tour is already booked solid. We know that you're

Speaker 2:

a whirlwind media tour of 2 stops. The Ian and Jay thing on Friday, and I'm I'm wrapping it up with this today. So

Speaker 1:

It's wrapping it up.

Speaker 2:

What's up?

Speaker 1:

I, you know, I assumed I was gonna see you on Conan, and I don't even know... are late night talk shows even a thing? I mean, I guess I'm not making a Johnny Carson reference, which is good, but I feel

Speaker 2:

We're we're all pretty out of it, I think, in that loop. It's, it's really hard to know what what the kids are doing these days. So

Speaker 1:

Exactly. Well, the kids are all on Discord is what the kids are doing these days. Yeah.

Speaker 2:

Alright, man. This is this is red.

Speaker 1:

So, I I Adam, you also joined this space, but you and I both joined the space with Elon, I think, after the fireworks had happened.

Speaker 3:

Yes. So I saw Ian, I saw your tweets, and as happens to me often with tweets, I had no idea what you were talking about. So it took some digging, found this space where, I guess, all the fireworks had detonated, and I heard Elon pontificating on his, you know, knowledge of the inner workings of Twitter, his new acquisition.

Speaker 2:

Yeah. Yeah.

Speaker 1:

How did you discover this space?

Speaker 2:

Yeah. Great. No. It's great. Great question.

Speaker 2:

So, yeah, I've been, like... I think like many of us, and since I see many of us here on Discord and other places, I've been trying to wean myself off of the bird site. Right? It's just not a good place to be.

Speaker 1:

No. It's not.

Speaker 2:

And, and, you know, it was it was probably, like, what, 8:30 at night or something like that. I don't know. 9 o'clock. I I can't quite remember exactly what time it was. Probably closer to 8:30 or so.

Speaker 2:

And, I just... we have, you know, two young girls, an 8 year old and a 4 year old. So I'd just gotten them all sort of settled for the night. And then, you know, for, like, the younger one, I sort of sit outside in the living room outside their rooms, just sorta, you know, hanging out, being a dad. Right? But I got, you know, time to kill.

Speaker 2:

So, you know, fire up the phone. And, to my credit, I did hit Mastodon first and was just sort of, like, cruising through there, you know, and seeing what's going on. And then, you know, muscle memory. Right? You know, you run out of content on one thing and then just like that bluebird icon, hit that just to see what's going on.

Speaker 2:

You know,

Speaker 3:

it's a safe space. You don't have to explain yourself. Yeah.

Speaker 2:

No. No, I feel like, you know, it's just it's it's a struggle, right?

Speaker 1:

It's a struggle. It's one day at a time. One day at a time. It's

Speaker 2:

a process.

Speaker 1:

It's a

Speaker 2:

process. Absolutely. So, and, you know, like, the way spaces work is that if there's a couple of people that you follow in a space, you get those little purple sort of, you know, like, punch-the-monkey ads at the top. Right? So

Speaker 3:

Yeah.

Speaker 2:

So that thing, like, rolls in. I was like, oh, a space that... like, there's a number of folks. I'm like, oh, they're usually a good indicator of something interesting. And, so I... you know?

Speaker 4:

Oh, okay.

Speaker 1:

Can we just pause just for a moment? Because this is one of these features that Twitter Spaces had... has, I guess, but we no longer use it. So... has, had...

Speaker 3:

We just speak about it in the past tense. Yes.

Speaker 2:

We can see

Speaker 1:

in the past tense. But this whole idea of, like, oh, like, Bryan is in the space. Like, you may wanna go into the space. And it's like, well, Bryan may be in the space because he's an idiot or he's got nothing better to do with his life at this moment. And you should definitely not... and I finally found the preference to be like, I don't wanna advertise this, because who knows who is here.

Speaker 1:

I was in a very bad crypto space, not like there's any other kind. And folks that follow me start coming in. They're like, oh, this must be interesting. Bryan's here.

Speaker 2:

No. Absolutely.

Speaker 1:

What is this? What do you what's going on in here?

Speaker 2:

No. It's it's the worst. You're like, oh my god. This room is full of Nazis. And all of a sudden, it's like every, like, like your best friends from high school are in there.

Speaker 2:

You're like, oh shit. Now do they think I'm a Nazi? Do they think I'm in the crypto ads? Yeah. Right.

Speaker 2:

It's it's the worst. Yeah.

Speaker 3:

It's like everything I click on, I don't wanna be part of the permanent record people.

Speaker 2:

Right. No. It's terrifying. It's it's such a it's such an uncomfortable piece of design. It really is.

Speaker 1:

Right.

Speaker 2:

But but it works. I mean, like, I'm, you know, I'm I'm bored. It's late. I got nothing going on. And, Yeah.

Speaker 2:

I you get into the interstitial screen first. You haven't quite joined yet. Right? So you see, like, the topic, etcetera. And it's, it's the dish.

Speaker 2:

I don't quite know how to pronounce the guy's last name. Hotz, or Hots, whatever. George Hotz's face. And, like, the title was, like, something something Twitter architecture. I can't remember.

Speaker 2:

But that's what gets your hackles up. Because you're like, there's nothing... this is a very disingenuous space, and when I tune into it, it's going to make me angry. But, you know, at this point... I think I speak for at least myself, I won't speak for any other ex-Tweeps, but it just drives you crazy to hear these yahoos just talking trash about stuff that... like, it's also... like, don't get me wrong.

Speaker 2:

Many things about Twitter's infrastructure deserve to be criticized. Like, we used to criticize these all the time as ex-Tweeps as well. But, like, there's a different motivation. So anyway, I jumped into that space and was sort of listening, you know, like, rage listening, I guess. And, you know, there was just... I can't remember what it was.

Speaker 2:

I think it was something about, like, the scale

Speaker 1:

Oh. Of the I think

Speaker 2:

the service art

Speaker 1:

Now maybe a good segue to actually play the clip. So, Adam, you wanna you wanna actually play because I in in the way I kind of imagine this is, you know, you you're kind of, like, you're tuned in,

Speaker 2:

Oh. But Cool.

Speaker 1:

You're kind of

Speaker 2:

Elon wasn't Elon wasn't there at this point. This is the interesting

Speaker 1:

Oh, my gosh. Sorry. Go ahead. Yeah. Yeah.

Speaker 1:

No.

Speaker 2:

No. Yeah. So... okay. Yeah. So there was some rambling discussion of... it was George and a bunch of folks who had, yeah, no idea how Twitter worked or what the history of the engineering team was.

Speaker 2:

But they they were off on some crazy tangent. I was just like, this is this is ridiculous. Like, this is not this is not this is not true at all. I should I should just, you know, let me ask for the mic and see if I can just sort of correct this. Right?

Speaker 2:

And so, like, they gave me the mic and then the topic had moved on. So I was just sort of sitting there with the mic, and I think I had my hand raised at one point, like, to politely, like, join in in this conversation. And, but, yeah... I just sort of, like, tuned out of it and I was just sort of hanging out. But I still had the mic. I still had my hand up.

Speaker 2:

And then, then Musk joined. I think I put my hand down. It's like, why the fuck is my hand up? Like, this is not how Twitter Spaces work anyway. It's just, like, there's no rules here anyway.

Speaker 2:

So I just put my hand down. And I was just... honestly, I was just listening. I had no intentions of jumping in at this point, because whatever original thing I was gonna speak about had long since entered the dev null of this sort of space of just Twitter slander. But, yeah, maybe that's where I could pass it over to you in terms of,

Speaker 3:

Well, also, just additional context: the person whose space this was, this George guy, is an ex-Tweep, but he joined... Yeah.

Speaker 4:

Like 10 months.

Speaker 3:

He had joined on the claim that he was gonna fix up search lickety-split, but actually it turned out to be hard, because it turns out this stuff is hard. So, if folks in the space or in the audience can give me some sort of indication they can hear... I'm gonna play where Ian came in. Is that right, Bryan?

Speaker 1:

Yeah. Good. Go ahead and play where where it will play you. You wanna play the bit that prompted you to come in. So start from the top of

Speaker 4:

that clip.

Speaker 3:

Okay. Well, I got, I got the top of that YouTube video you sent over, so

Speaker 1:

I'll start it from there.

Speaker 5:

Hi, Fran.

Speaker 2:

No. No. No. I mean, like, what

Speaker 1:

do you mean by Are you good? No. No. Is there

Speaker 2:

some Come on, buddy. Come on.

Speaker 5:

Who who are you?

Speaker 2:

What do you mean who am I? I don't know. You gave me the fucking mic. I ain't got no

Speaker 5:

I I I'm I'm doing the mic, and let's let's keep it let's keep it civil in my space.

Speaker 1:

Alright. So, Adam, let me give you... so that is actually the beginning of the full recording, because that recording goes all the way to the end of the space, but you've kind of picked Ian up mid

Speaker 2:

That was not the trigger. Yeah. Not the trigger. Right? You

Speaker 1:

you you kind of

Speaker 2:

Not the trigger.

Speaker 1:

Picked Ian up mid fight-or-flight reaction. And so the original tweets... you have the original tweet that's got the... that's easy for

Speaker 4:

you to

Speaker 1:

play. Yeah. Yeah. Because that recording actually is interesting because it takes you all the way to the end, and we get into a lot of other stuff. Because one question I wanted to ask you while Adam's pulling this up is: after they booted you, did you stick around? Did you listen to the rest of it?

Speaker 2:

I I did not. Well, I think I went back in immediately and tried to get like, ask for the mic, which which like, I mean, we can get into this as we work work through this sequence here, but, like, then, like, you know, like, they weren't giving me the mic. That was pretty clear. So I I think I took, like, a screenshot and and, like, said something, you know, in the heat of the moment on Twitter about George or something. But then then, no.

Speaker 2:

I was like, well, you know, at this point, also, my, like, my older daughter was like, why are you should

Speaker 1:

why are you Why is that a yo?

Speaker 2:

Why are you using those words that we're not supposed to... like, yeah. So as a parent, I had to sort of dial it off then. Yeah. I'd... oh, yeah. No.

Speaker 2:

I did, I I I did not, I did not listen in really after that, I guess, after

Speaker 1:

I turned it back to you. Some clips from from later as well here.

Speaker 4:

Okay. Because you're gonna play mine. Right?

Speaker 1:

Yeah. We're gonna play yours, Theo. Yeah. Exactly. So because it gets

Speaker 3:

So don't we

Speaker 1:

get to

Speaker 3:

don't we get to the start of that, the tweet, Bryan? Yeah.

Speaker 1:

Yeah. Go to... so play it from the start of the tweet, and we'll get to kinda what set Ian off.

Speaker 2:

Yeah. Before we start I think

Speaker 6:

that's where we're

Speaker 5:

gonna have have less features. I think that that we'll just need to do a a total rewrite of the the whole thing. You know?

Speaker 2:

Wait. Sir, seriously, a a total rewrite, that's your that's your prediction for Velocity? Yeah. Well, when you say

Speaker 4:

a total rewrite, do you mean starting with the skeleton or a bunch of engineers sit down with a whiteboard and say, what is Twitter?

Speaker 2:

Revolution or reform? I mean, I just mean,

Speaker 5:

like, literally, like, this this like, you you need to try to amend this the, like, the crazy stack that exists or, rewrite it.

Speaker 2:

But, when you when you say when you say crazy stack, what do you mean? Like, break it down.

Speaker 5:

Have you seen... have you seen George's, like, diagrams?

Speaker 2:

No. No. No. I mean, like, what

Speaker 1:

do you mean by Are you gonna No. No. No.

Speaker 4:

It was Can you hear me?

Speaker 2:

Come on, bud. Come on.

Speaker 5:

Who who are you?

Speaker 1:

What do

Speaker 2:

you mean who am I? I don't know. You gave me the fucking mic. I ain't got no I didn't

Speaker 5:

get no I I I'm I'm doing the mic, and let's let's keep it let's keep it civil in my space. I mean

Speaker 2:

yeah. Like, what no. No. Man, you're in charge of the servers and the programming. Whatever.

Speaker 2:

Like, what is the stack? We we lost the stack. In my space, please. Take take it take it take me from top to bottom. What does the stack look like right now?

Speaker 2:

What's so crazy about it? What's so abnormal about this stack versus every other large scale system on the planet, buddy? Come on. K. Alright.

Speaker 2:

So first

Speaker 5:

off Amazing. Wow. You're a jackass.

Speaker 2:

Okay. Okay. Okay. Okay. Alright.

Speaker 2:

No. No. No. I got I got no credibility here, buddy. I got

Speaker 5:

no idea. That's a First off first off, let's keep my space simple.

Speaker 4:

Like, I'll remove him from from speakers. Features.

Speaker 3:

Okay. Well So Welcome to the jackass.

Speaker 1:

Welcome to the yeah. Exactly. So

Speaker 6:

Can I can I say one thing, by the way, Ian, just about the language and parenthood thing? I played this clip for my 9 year old, and she asked me for your autograph.

Speaker 5:

So, like, just

Speaker 2:

So I mean, we

Speaker 1:

So alright. So first of all, let's answer Elon's question: who are you? Because you're actually not...

Speaker 2:

press 1.

Speaker 1:

One of the things that I think was frustrating to a lot of people listening to this is, you were being treated like... and you're being kind of self-effacing.

Speaker 5:

You're like, yeah. Who the fuck am I?

Speaker 1:

Like, I'm just some idiot off the street that you gave the mic to. But, actually, you're not, and you have got a lot of deep experience at Twitter. So do you wanna describe a little bit of kind of the the basis for your expertise over this?

Speaker 2:

Sure. So I'm a very specific idiot, I think, is what we would say here. So, yeah, I worked at Twitter from 2013 up until about a year ago, in 2021. And for the bulk of that time, I managed a bunch of teams: most notably, I guess, our operating systems team, our JVM team, and then... this is hard to sort of sum up in a single sentence, but, like, this sort of chargeback, service registry, metering sort of system team. And then, you know, at the very beginning, a few others.

Speaker 2:

Yeah. I spent a

Speaker 1:

lot of

Speaker 2:

time working for a lot of people who built this thing, and/or built it again, or refactored it, or... yeah. I mean, I won't claim to be an expert, though. I would say I certainly have an emotional investment, let's put it that way, in sort of the Twitter legacy, I guess.

Speaker 1:

Well, and I think that, you know, periodically you get this, where someone approaches a system that they don't really understand, and they say, we need to rewrite it all. And, you know, once you've had any kind of length of time in engineering in a professional capacity, you've dealt with this. And, you know, to me, the question that you were trying to get to, which is the question that we should all be trying to get to when we're contemplating this very radical step, is: what problem are we trying to solve?

Speaker 2:

Exactly. Listen. Look. I yeah. That that to me was so I'm just sitting there.

Speaker 2:

Right? Minding my own business, casually on mute, whatever, with no real intention, honestly. Like, I wasn't out to, like, start a fight with Musk or really anybody. But, I mean, yeah. As an engineering manager, like, you can't... like, somebody says, hey.

Speaker 2:

We should probably rewrite this from scratch. Like, you're legally obligated at that point to, like, stop the conversation and just say, like, as you point out, what are we trying to do here? What are we

Speaker 1:

trying to do? It's, like, there can be there can be legitimate reasons to do this. Like, I'm not saying we shouldn't do it. No. Totally.

Speaker 1:

Don't. You wanna be sure that you're not rewriting it just for its own sake. And I did love... and I don't mean to... for me, I found this kinda so upsetting that I, in similar situations, would trip over my words. And I could almost feel you being like, where the fuck do I start here?

Speaker 2:

Oh, well, you hear my like, I my entry into that was was gibberish. I mean, you like, if you step back and squinted, you sort of got a sense of what I was trying to say. But, like, prediction of velocity is just it's nonsense. But it, I think

Speaker 1:

I I loved your brain core dumping on that sentence in particular because what Musk is saying is in order to deliver features faster, we're gonna rewrite everything from scratch. And it's literally, like, I just divided by 0 in my brain. I mean, it's like Yeah. That you you

Speaker 2:

it doesn't work. Oh, I mean, it was just like... it absolutely reflects, like, shit, I'm fucking saying something. And then you're just like, that wasn't really exactly how I wanted to say that. And then, yeah, then I was like, alright.

Speaker 2:

Well, said my thing. Let's, you know, like, back on mute. Like, it wasn't I was not ready to to get into what happened next, but it it, you know, I think as we heard in that that, that, that clip, it was this it was the, you know, the crazy stack part. Right? It was it was it was just this, like, I just I just got

Speaker 3:

Just disparaging. Right? It's disparaging of of of all of the complexity that that folks at Twitter have faced, the nontrivial first of its kind in a lot of different ways. And to just say it's crazy is so disparaging and so dismissive of this hard work and these hard problems. And to think that one could just conjure it up from whole cloth in some sort of timely fashion is lunacy.

Speaker 1:

Wait. It's so nontechnical. I mean, it's like... Yeah. It'd be one thing to be like, look, you know, our write latency is just off the charts for, you know, this aspect or these kinds... I mean, it'd be one thing to kind of, like, disparage it with technical grounding, but this is just, like, disparaging it because it's otherworldly. I don't know what it is.

Speaker 1:

I don't understand it. How could this possibly be this complicated? It's like, well, do you actually wanna learn? Like, we can explain to you why this is. Yeah. Which is not to say that, like, everything should be the way it is, of course.

Speaker 1:

But it Right. It's a it's very hard as a starting point for a conversation.

Speaker 2:

No, absolutely. And just, yeah, to go back to Twitter again: like, you could do Twitter in a lot of different ways. You don't have to do it the way Twitter ended up building it. Right?

Speaker 2:

But, you know, there are so many, like, decisions that went into this. There's just so much evolution. Right? And there's so many reasons for the way it actually is. And that lack of curiosity.

Speaker 2:

Right? Something that just... Yes. And it's even worse than that. It's just a disingenuous sort of, like, approach to this, where it's like, we are coming here to look for fault so that we can hit the sort of point we've been trying to make from the beginning, which is, like, Twitter is a disaster and a train wreck.

Speaker 1:

Which is the thing

Speaker 3:

is It's such a

Speaker 2:

not totally untrue in many Right. Over the years. That's not a false statement, but like, like there's a way to get there that brings everybody along for it. And this is not that sort of approach.

Speaker 3:

It's such a great point. I love it so much, because I think it is a trap that maybe we've all fallen into. I certainly have multiple times, and then seen people fall into it, which is: people show up, they look at this code that was written a year or 2 or 5 or 10 ago, or a lifetime ago...

Speaker 3:

Mess. Right? So this is a mess. Why would anyone have done it this way? And even, you know, maybe being the author of the code... and I've been in this situation, looking back 5 years and saying, you know what?

Speaker 3:

That was bad. But it was also the first version, and we didn't know what we were doing. And, yes, in hindsight, it all looks like garbage. But everything looks like garbage in hindsight. There is the rare piece of software that really holds up.

Speaker 3:

And it is such a, you know, tough position when people come in saying, well, everything is crap.

Speaker 1:

I think it's Yeah. Sorry. You can go ahead. No. I

Speaker 2:

was gonna... I mean, like, there is this practice of, like... any large organization gets into this, like, learned helplessness kind of mode, where if you don't go back and challenge the fundamental assumptions that you had when you were building this 3 years ago or, you know, 5 years ago, like, you will end up with stuff that doesn't necessarily make sense for present day. But just let me say this, like, on behalf of, like, Twitter engineers that I've worked with over the years: so many of them were really great about going back, revisiting those choices, revisiting the architecture, the granularity of a given section of that service graph, etcetera. So, like, I mean, just... this is why that "crazy," that lack... I mean, it just hit... you know, I was in full

Speaker 1:

Okay. So you're hitting on a couple of really important points here. One is this kind of ship of Theseus phenomenon, where it's like, there have been rewrites happening, surely, at Twitter. Certainly, most large systems have been rewritten many times as individual components are rethought. And some of those components will remain and are really hard to rethink, but a lot do get rewritten.

Speaker 1:

There are there are rewrites within this larger system all the time, which is all the more reason why this idea of a total rewrite of, like, we are going to rewrite everything at Twitter just doesn't parse. It doesn't make sense.

Speaker 2:

No. Well, I mean, I think, like, you have to bring in also, like, the, you know, Spolsky's classic, you know, Netscape, whatever, 4.0 sort of rewrite thing, where, like, just the act of burning it all to the ground and starting from scratch... Like, you're gonna have to relive, like, 10 years of experience to get to a point where you have something close to the stability, scalability. All those corner cases are destroyed. Right?

Speaker 2:

You just you've lost that, like, that sort of bug driven evolution of how that quote unquote stack, you know, ended up in that current state. It's just it it's just it was such a it's madness.

Speaker 1:

And that's assuming you don't succumb to second system syndrome, which... Yeah. Where everyone is like, oh, thank god, we're rewriting the stack. Okay. Great.

Speaker 1:

If we're gonna rewrite the stack, like, these are the 50 things that I have learned that we definitely need to incorporate into the rewritten stack. And and, like, a lot of that can be really important wisdom, but you can also easily end up with something that is so burdened

Speaker 2:

that So complex.

Speaker 1:

Yeah. Exactly. I mean... and, actually, Adam, may I do a brief reading from The Mythical Man-Month? By all means. So this is Fred Brooks' Mythical Man-Month, which has aged in some regards; it's certainly very dated in other regards.

Speaker 1:

It's very gendered in the way it speaks of everything. But there are certain things that are kinda timeless, that were pretty clear early on in software development. And one of these is, now quoting from Brooks: the general tendency is to over-design the second system, using all the ideas and frills that were cautiously sidetracked on the first one. The result, as Ovid says, is a, quote, big pile, unquote. I'm like, didn't Ovid live, like, in, like, 40 BC?

Speaker 1:

But apparently, Ovid said big pile.

Speaker 2:

That's yeah. That's that's some that's some old Latin slang there. No. I was like,

Speaker 1:

it's like, even... like, it was a giant pile of garbage. Ovid was like, this thing is a giant pile of garbage that we can't rewrite. For example, consider the IBM 709 architecture, later embodied in the 7090. This is an upgrade, a second system for the very successful and clean 704. The operation set is so rich and profuse that only about half of it was regularly used.

Speaker 1:

And I think that the the the way that the second system effect manifests itself can vary. But I think it it shows the the the more abstract idea of this peril of a rewrite. Like, a rewrite is is a radical operation. It can be done successfully, but, you have to proceed cautiously and with a lot of curiosity about the past, and it's also gonna take a while. That's the other thing that he

Speaker 2:

Yeah. Yeah. No. Totally. This is like the velocity the velocity aspect.

Speaker 2:

I mean, it was so comical in the sense that, like, if you wanna hurry up and ship a bunch of features, starting over from zero is probably not gonna get you to that. Like, you're not gonna get Spaces 3 out the door anytime soon. It's just...

Speaker 1:

Yeah. And so then there are people... So the other thing that's kind of interesting, and this is throughout this entire space... And so, George... I don't know if you... I mean, I guess you haven't followed George. George is an interesting character.

Speaker 1:

So he's... definitely, I don't agree with certainly some of his perspectives, but he's indisputably, like, a real technologist who cuts code and...

Speaker 4:

On the security side of things in particular, he's insane, like, one of the top in the world when it comes to, like, cracking, like, consumer hardware and software. He was one of the pioneers of jailbreaking, both on the PlayStation and iOS, like, early

Speaker 1:

days. Right. Total pioneer. He was the... I think he was the first one, Theo, right, to jailbreak iOS? Certainly a very early iOS jailbreak. Yeah.

Speaker 4:

I know saurik and a few of my other friends in the safety and security scene hold him in very high regard, which surprised me, because some of the stuff he says outside of security is so absurd that I had no idea.

Speaker 1:

That's right. I wish he knew

Speaker 4:

what he was doing, but he absolutely does in that space.

Speaker 1:

Yeah. That's right. And and Theo, I think you're capturing exact exactly. It's like he is he there's some great technical gravitas there, but there's also, like, he definitely says some things sort of like, like, okay. Woah.

Speaker 1:

Don't agree with that one. But the thing that is interesting to me in this conversation, especially in your exchange with Musk: George is actually trying to get it on the rails and to ask Musk some questions that Musk never answers. I love his "revolution or reform." I don't know if you caught that, but in there, he says, like, are you talking revolution or reform? Asking that of Musk.

Speaker 1:

And, because... I think, and Theo, I'd be interested in your take on this, having, you know, been there for that entire space. But I think George's answer is like, what you actually need is reform. You gotta pay down a bunch of this technical debt.

Speaker 1:

Over and over again in that space, he's like, you keep talking about new features; we need to pay down technical debt. Which I think is probably a position that, in just about any engineering organization, is probably the case. There are very few that don't need to pay down technical debt. But I thought that was... and that came up kinda over and over again. But even in that particular exchange, I think George is trying to be like, wait a minute.

Speaker 1:

What are you talking about by a total rewrite?

Speaker 2:

Yeah. No. I mean, you know, I've listened to that a few times, obviously. And I think... yeah, no. It was, like, just... if the moderator or George had, like, you know, done, like, the question-leading sort of thing that he was trying to do, then, like, it probably wouldn't have blown up.

Speaker 2:

But it, you know, I think it was

Speaker 1:

it definitely alright. So then

Speaker 4:

I think it's a little unfair to George to assume that he could, in any way, get that back on the rails. Like, he considering the state of the call, I was impressed with how well he held it together.

Speaker 1:

I was impressed too. I mean, I I was impressed because he was willing to ask Musk some tough questions and then not go in to save him when he couldn't answer.

Speaker 4:

He also stood up for me at one point, which I really didn't expect. Like, when I was grilling Elon on the ad stuff, and Elon said to kick me, George specifically said, no, Theo knows what he's talking about, and then kept me in the space.

Speaker 1:

So that's a good segue. So, Ian, I don't know if you've heard this, but so later, maybe, like, 15 minutes after you'd been booted,

Speaker 4:

I think it was less than that. It was long enough that he was still pissed because of talking to you. I

Speaker 1:

Yeah. You know, it was a while, though. The kind of discussion had passed and it went on to a new discussion. And then... Adam, do you have the clip of when Theo was volunteering his expertise on ad targeting?

Speaker 3:

Yeah. Give me just one second. Sure.

Speaker 2:

It's pinned

Speaker 4:

in the quote tweet on my profile.

Speaker 1:

So... Theo, because you had been in the space also from the beginning, I think. Right?

Speaker 4:

Yeah. I joined pretty early.

Speaker 1:

Right.

Speaker 4:

I don't know how I saw it. I think that dev Agrawal sent me it, pretty early, so I hopped in. And then he said that he was, like, taking requests for people who had pushed back on stuff, and I immediately jumped up and he added me. We've continued chatting.

Speaker 2:

Here we go. Go ahead.

Speaker 4:

Ad sales and a lot of, like, purchasers expect... ads isn't necessarily, I'm gonna give you these ads and I trust you to serve them to the right people. It's usually almost the opposite, where it's, I wanna target this group of people in this region who went to a Jiffy Lube in the last 72 days. That level of micro-targeting is something a lot of ad buyers now expect, and, like, it's a huge part of how CPMs can get so high on platforms like YouTube. Regardless of whether they're targeted or not, if your goal is CPMs, you should have a very good idea that someone's gonna click on an ad before you show it to them, and you can do this with good ML. But sometimes these advertisers will serve entirely different versions of

Speaker 5:

the same please. He's just talking nonsense. Thanks. So, the the I've actually talked to the advertisers. I talked to some major advertisers just today, this morning, and they're looking for, as you would expect, return on investment.

Speaker 5:

So that if they spend a certain amount of money, are they gonna get a return in excess of the amount of money spent? Quite rational, not arbitrary or random. So, and I won't say who it is, but there's a major advertiser that spent $60,000,000 on Twitter this year. And they were like, we calculated the ROI... actually, like, another company helped them calculate the ROI, and Twitter was the lowest.

Speaker 5:

And this is why they're having difficulty, spending additional money on Twitter

Speaker 1:

because they're

Speaker 5:

literally looking at the lowest ROI. I'm like, yeah. Okay.

Speaker 4:

Yeah. Talking absolute nonsense as he goes on to explain that Twitter can't actually make money from ads.

Speaker 1:

Absolute nonsense. So, if folks missed it, Elon came in and spoke over Theo. When you were speaking, he spoke over you and said, turn this guy off, please, he's just talking nonsense. Which is like...

Speaker 2:

I mean, it was kind of

Speaker 1:

Like... again, with Ian's, I had the same thing when listening to your interchange with him, where it's like, my brain is short-circuiting, because, I mean, it's definitely not nonsense. It's... what's the opposite of nonsense again? Like, total sense. So, Theo, was that surprising? I mean, you'd seen... you'd listened to this exchange with Ian some 15 minutes earlier.

Speaker 1:

Were you surprised when he... and then later, as you point out, George is, like, actually, I'm not gonna boot him, because he's made contributions to the space. But that was surprising to me, that he was disagreeing with something that feels so basic from an advertiser perspective?

Speaker 4:

I'd say semi-surprised. It's hard for me to remember how I felt in the moment, because I've just overanalyzed the shit out of it and thought through, like, how I feel about it now. And what I feel now is that Elon's, like, major success with Tesla is something where, every step along the way, everybody said, you can't just make a car company, that's not how this works. And it turned out he was, like, kind of right.

Speaker 4:

And now he assumes he can do that in a lot of spaces where you just can't do that. Like, you can't reinvent ads by making a really good algorithm. You have to, like, compete in the ad space as it stands and works. And it was surprising to me how little he was willing to try and understand a new space as though, like, his ability to brute force into 1 carries over to others. The thing that surprised me the most, though, was George standing up for me.

Speaker 4:

I did not expect that in the slightest, and we've ended up chatting, like, a good bit after. And I have a a decent bit of respect for him. And I think that my perspective on all this in the end is somewhere, like, between everybody's, and I'm just disappointed in Elon as a business, like, person. Like, he's just failing as a person who has to make a company profitable because he's been able to ride off a fucking, like, investment forever, and that's not an option anymore for Twitter, and he's kinda fucked.

Speaker 1:

Yeah. It is very disappointing in that regard. And the whole thing is very disappointing, and I think also very revealing. So the point you raised is a good one. And, you know, we've talked about this before: you have to be crazy to start a company in some regard.

Speaker 1:

And we want people to be crazy enough to start a company, but then no crazier. Like, be normal and remain curious and grounded and detail oriented. And some people can pull that off, and maybe he was for some period of time. But where we are now... because I think the common theme, Theo, across what you're saying and, you know, what your experience is, is just this total lack of curiosity about, like, don't you wanna actually understand how this stuff works? Because there was a lot of opportunity to make Twitter better, I feel, when he was coming in, had he had that curiosity.

Speaker 4:

I think making Twitter more profitable is it needs to be the first focus. And he came in with a little bit of that with the layoffs, but hasn't maintained it since. Like, he he did the the big scary optics thing, but then none of the, like, annoying detail oriented work to actually make money appear, like working on the ad network. Like or to just so much of the value of Twitter is going to have to be through ads and through video ads in particular if they wanna lean into video. And they just haven't put the effort into acquiring, bucketing, identifying targets, and doing the things you need to to actually serve those ads.

Speaker 4:

And the the way he explained advertising and, like, how his relations with the advertisers are going, it just it screams, we don't have anything to offer here. And that's that's a death sentence in this space.

Speaker 1:

Totally. And, again, totally incurious about that perspective, totally incurious, Ian, about your perspective, and totally incurious... you know, kind of looking for confirmation.

Speaker 2:

I mean, totally incurious about everybody who was actually there working on this stuff when he took this over. I mean, just the way in which the layoffs occurred. It was, like, arbitrary at best and, like, vindictive in very specific points at worst. And I think that if you are trying to, like, run a business or, you know, grow that business to the point where you can actually recoup that, for lack of a better word, investment... It's just, like, just stop.

Speaker 2:

Like, talk to folks. Like, nobody I worked with and had contact with had anything close to a rational, reasonable conversation that wasn't hostile, and/or it was pretty clear what was gonna happen. Again, it's just madness.

Speaker 1:

Okay. So, Ian, this is another thing I wanted to ask you, because, certainly, one of the things... having worked in a fear-based environment, not that this is that surprising, but, like, fear completely undermines creativity and risk taking. I mean, obviously. Yeah.

Speaker 2:

I think just, you know, my favorite management quote from Frozen is, you know, people make bad decisions when they're scared or stressed or upset. Right? You know, it's just not a great environment for creativity, or even just good decision making. Just,

Speaker 4:

I'd put a lot of this on a spectrum. I call it overthinking versus underthinking, where fear and insecurity and those things result in overthinking to the point where you make a bad decision, and, like, too much hubris and confidence results in underthinking, leading you to a bad decision. The goal, especially, like, to go back to the founder point of being just crazy enough, is having just enough hubris to, like, think through decisions, but not so much that you overthink and end up, like, overcommitting or undercommitting when you need to just get the thing done.

Speaker 3:

Totally. What do you get when you mix hubris and fear? I guess nothing but bad decisions.

Speaker 4:

Absolute fucking chaos.

Speaker 1:

Absolute chaos. And that, I think, is the other thing that would make it very difficult to be there. Because I try to think of myself, like, alright: if I'm an engineer at Twitter and I have decided... I mean, realistically, I've decided because I have to. My circumstances dictate that I need to stay, because I gotta assume that anyone who's got the capacity to leave would have left.

Speaker 1:

How do I get excited about these elements? When you talk about this kind of total rewrite, in some regards it's so abstract that it's actually hard to get concretely excited about it. And I don't think that he views that kind of intrinsic motivation as important to software engineering. And we know that intrinsic motivation is, like, the whole game, at least from my experience.

Speaker 1:

You know, I mean, your experience is as someone who's had to lead teams. Yeah. But, like, your entire... all of your true intellectual property, the intellectual engine of your endeavor, walks out of the office every night. Or, I guess we no longer work from offices, so I don't know... it goes to bed every night.

Speaker 1:

And in order to harness that, you have to stoke their intrinsic motivation. And when you make things chaotic, when you make people afraid... I mean, I think worse than overthinking, you just run the risk that they just shut down completely.

Speaker 2:

Yeah. No. I mean, I couldn't I couldn't agree more. You need people who are, you know, absolutely, like, stoked about what they're doing. I mean, like, it's it's absolutely great and awesome to work for that paycheck because you need a paycheck.

Speaker 2:

And that's like, the extrinsic motivations are are important too. You gotta meet those needs. Right? You gotta give people that, like, people gotta eat. Although, I no.

Speaker 2:

But,

Speaker 1:

like, what extrinsic motivation? Because it'd be one thing if you'd be like, alright, look: I'm gonna, like, re-comp everybody. And if you stick around with me at Twitter for the next, you know, 4 years, you're gonna have, you know, this kind of ownership, or this... like, I'm gonna really incentivize you to stay.

Speaker 1:

I'm not seeing... is any of that happening? I don't think any of that's happening.

Speaker 2:

No. But as you point out, there are people there, right, who, like, need their health insurance to continue. Right? They they can't find a job for either visa reasons, you know, in a hurry, or or many absolutely valid reasons that they're still there. But, you know, maybe they have a payout.

Speaker 2:

Right? They, you know, were acqui-hired or something, and they're still waiting to get that, like, you know, 2-year or whatever payout. And that's great. Like, get that money. Do what you need to do, you know, to live your life and, you know, to survive.

Speaker 2:

But in terms of the quality of work that you're gonna be doing, in terms of, like, the output being what Twitter, at this extreme sort of point, needs... I just don't see how you get that. It just doesn't seem possible.

Speaker 1:

But it just feels like... and I think that you mentioned this earlier as well... we are kind of passing beyond an event horizon where this thing is not gonna come back. It feels like this thing's primary value is gonna be as an object lesson as to what not to do in terms of running a business. And so, on that note, one of the things that also drives me nuts is there are some deliberate, I feel, misreads of history that Musk has. And, Adam, do you have the Mac OS clip? Not sure.

Speaker 3:

Oh, no. I don't have the one picked up. Sorry.

Speaker 1:

Do you

Speaker 3:

have the index on that?

Speaker 1:

No worries. Yeah. That was at, like, 20 minutes and 10 seconds or something like that. Actually, maybe Into

Speaker 3:

that video you said? Yeah.

Speaker 1:

The video, the the the full length one.

Speaker 3:

Yeah. Yeah.

Speaker 1:

So the the

Speaker 2:

Was he off on, like... was there some sort of NeXTSTEP thing? Of course.

Speaker 3:

Exactly. Oh my goodness.

Speaker 2:

Right. It's good. And it

Speaker 1:

is... so, you know, I feel that, like, part of the danger of, you know, misbehavior from someone like Musk is that people who lionize Musk will appeal to that kind of implicit authority when they wanna do some... and we've already heard this, like, anecdotally, where Musk is emboldening people in Silicon Valley to, I mean, to make... honestly, to behave badly. I'm not sure what I would look at in what he's done at Twitter. I don't think there's anything there that is laudable, honestly. But what's funny is... and, of course, people have done this with Jobs as well; there are many others in history where people kinda look to these folks and kinda pick and choose what they wanna learn from them.

Speaker 1:

And it was funny to hear Musk use Jobs as a model, and in particular, Jobs' return to Apple. Adam, do you have that one?

Speaker 3:

I think, yeah, I think I've got that.

Speaker 5:

Like, so I look back at old school Apple, you know, when they had their old operating system, when then Jobs came along with, you know, OS X. You know? It was a new stack. It wasn't like, let's, like, you know, fool around with the old Apple OS. It was like... he just took the NeXT OS and adapted that to Apple and said, yeah.

Speaker 5:

Your old software just doesn't work anymore. Too bad.

Speaker 4:

But an operating system doesn't run its

Speaker 2:

That's... but no. The old software also still worked. Right? There was Carbon and the... It's not even right.

Speaker 2:

That is just

Speaker 3:

such a misread of everything that has happened. Yeah, absolutely. It's just like, Apple bought NeXT. The compatibility was paramount. People, developers, were treated with respect.

Speaker 3:

I imagine even internally, probably engineers at Apple were occasionally treated with respect.

Speaker 1:

And I'm sorry, also, are we just gonna erase the actual history of NeXT Computer? I mean... NeXT. Yes. That...

Speaker 4:

let me

Speaker 2:

because it's like like, Jobs did

Speaker 1:

come along with a rewrite. I mean, he... No. He had started NeXT in, what, '85?

Speaker 4:

Thanks for the reminder that you guys have been coding longer than I've been alive.

Speaker 6:

Yeah. That's what I meant.

Speaker 2:

So I worked at a bank in the early stages of my software development career. And the entire... it's the derivatives and risk part of the bank. The entire trading floor was running on NeXTSTEP machines. It was just like, we had Objective-C programmers all over the place.

Speaker 1:

Was that at O'Connor? Where was that?

Speaker 2:

No. This was... so this was at Bank of America, but let me unwind this. It was not really Bank of America. Right? It was one of the original Chicago options sort of trading houses that came out of, like, the derivatives revolution, like, post Black-Scholes, like, '79, '80, whatever.

Speaker 2:

It's like, the Ritchie brothers built this whole thing, sold it to NationsBank. NationsBank bought Bank of America. So, technically, it was Bank of America, but it was still, like, this hardcore, you know, OG sort of Chicago derivatives and options sort of, you know, trading world. But, like, full-on NeXT... sorry. Slight digression here.

Speaker 2:

Full-on NeXT shop. Apple's bank was Bank of America. That's where Apple banked their stuff. So when Mac OS X, OS ten, came out, I guess, Apple came and gave a pitch to, like, this software group that we were part of and was like, you know, hey.

Speaker 2:

We got, like, this new set of, like, machines coming out, like, this new OS. Like, it's pretty much compatible with all your NeXT-based infrastructure and software here. Like, what do you guys think? Should we roll with this or what? And every NeXTSTEP developer was like, fuck yes.

Speaker 2:

This is gonna be the greatest thing ever. And, you know, in classic large bank fashion, we went with Java and something else. But it was one of those... it's just an amazing sort of moment where you just zigged when you could have zagged and had, really, an amazing platform that was fast, and not a total rewrite but a port of their software over to the new OS. Anyway, sorry. Memory road trip there.

Speaker 2:

Back to you guys in a lot.

Speaker 1:

Well, but but I think it

Speaker 2:

I mean, but it's

Speaker 1:

but it was a long and torturous path to get to Jobs' return to Apple in '97, when NeXT was basically on the brink of bankruptcy, or insolvency. And, you know, to kinda portray that as inspiration for whatever the hell Musk is gonna do at Twitter... it's like, sorry, you're not Steve Jobs, dude. You're Gil Amelio. So, I mean, you are presiding over the death of Twitter.

Speaker 1:

You're not actually doing the rewrite. The, the Yeah.

Speaker 2:

No. It seems like that's... it seems like end times.

Speaker 1:

Well, and I just think, again, it's funny to hear that used as kind of the casus belli there, because the other thing is, like, the rewrite... if you are to rewrite this, and even, indeed, paying down technical debt, all this stuff takes time. And I think from an executive perspective, having folks pay down technical debt is something that's really challenging to do, because it doesn't necessarily have an artifact that is gonna be user visible. It's technical debt. It's, like, actually, we're in a much better position for the features we wanna go add 6 months from now, a year from now, 2 years from now, but it hasn't actually meaningfully changed the artifact. And Musk seems like the opposite of the kind of executive who would be able to get behind that.

Speaker 2:

Yeah. No. Again, like, you don't ship any features while you rewrite the whole stack. But, again, it comes back to, like, the motivation here. It doesn't seem like anything flows from, like, a rational sort of plan to make Twitter successful.

Speaker 2:

It just it it feels like this very emotional sort of attack on stuff that he seems to have some serious serious issues with. It it just feels so personal.

Speaker 1:

Well, it so the thing is that

Speaker 4:

Can I try something absurd? Can I try to do my best, despite my disdain for Elon, how he's running the business and everything, to theorize the scenario in which the rewrite might make sense, based on what we've seen?

Speaker 1:

I'll just check that.

Speaker 4:

Of course. Okay. So let's say I'm the CEO of Twitter. I look at the bank. We have 2 years left before we're out of money.

Speaker 4:

We just lost half of our major advertisers. We're effectively going belly up. We have 4 proposals for new things we could build that could possibly get us profitable. And we write out the spec for all 4. We look at these different things, and we can't decide, like, which of these makes the most sense.

Speaker 4:

Each of them will take a year to make with the current state of the code base and the current state of the infra, everything that we're building on top of. So we have 4 years of things to test and 2 years to do it in. Or we could take the massive risk of rewriting a big portion, spinning up parallel infra, being in, like, fourth-system hell, with the possibility that we can actually test all 4 of these things before we're out of cash, and maybe, just maybe, increase the chance that this business is able to survive. But the core being: they need to find a thing that makes Twitter more money, not a new fucking vanity metric to put on your timeline.

Speaker 2:

So I think a reasonable question to ask, but again, like, what is it about the stack? I'm sorry... what is it about the stack that limits them from actually moving forward with this? I think that's where I lose the plot in Musk's sort of assertion that the stack is crazy, or the hypothetical that, like, you're blocked from doing x, y, or z. I don't...

Speaker 4:

One example did come up during the call that I thought was interesting. The the fact that the view count DB is entirely separate and hosted in Google Cloud when the rest of their data storage was entirely unrelated. And trying to build a bridge between those two things was a significant challenge for them.

Speaker 2:

Don't get me started on the GCP, AWS stuff. So Twitter's made some bad decisions. No question. My personal opinion... I don't necessarily speak for anybody else at Twitter. But, like, going to the public cloud, shifting a lot of the analytics stuff off prem, is insane.

Speaker 2:

It's just a dumb idea. Understand. Absolutely. Move it back. Just move it back.

Speaker 2:

I mean, like... I think, like, that split brain thing, totally legit. Again, like, before Musk, there was a lot to criticize about Twitter and how the company was run, and the decisions from the business perspective.

Speaker 1:

I think that that split brain, by the way, likely came from the expediency, Theo, that you're talking about, because it sounds like... I mean, Twitter as a business has had its struggles for a period of time. And at some point, just from listening in on the space, someone wanted to ramp up analytics infrastructure, and they lacked available compute resources on prem. So they spun it up in the cloud. You know, not unreasonable, but the system that you have as a result... you've tried to do something to simplify and to expedite yourself, but you have a net result that is much more complicated. And I feel that's what you're referring to in terms of that fourth-system problem.

Speaker 2:

Yeah. And alright. Steve, go ahead, but, I do wanna jump back in on Yeah.

Speaker 4:

Go ahead.

Speaker 2:

On the GCP. Yeah.

Speaker 1:

Do it.

Speaker 2:

So, you know, having had a ringside seat to a lot of the back and forth with GCP, like... I'm skeptical of the claim that the capacity wasn't there. There was a bunch of

Speaker 5:

Yeah.

Speaker 2:

large company kind of things that go down, where somebody... like, the cloud was pushed at Twitter by folks on the board and sort of VPs who'd come in and go, hey, obviously we should be in the cloud. And time and time again, and this is maybe relevant to some of the work that the Oxide folks are doing... time and time again, like, the decision point came down to, like, it's just way too fucking cheap for Twitter to run, in particular, its compute resources on prem. Right?

Speaker 2:

Just there's just no way that the the economics would would make sense. The but there was always this continuous sort of drive. Right? And and this also goes back in a in a healthy way, I think, to the, like, the overall, like, keep questioning your sort of, you know, your your learned assumptions, your learned helplessness. Right?

Speaker 2:

Like, reevaluating whether or not Twitter should be in the cloud. Ultimately, I think when we made the decision to go to GCP, it really felt more like a high-level sort of decision that sort of ignored some of the physics of the cost of running on GCP versus running on Twitter's own infrastructure. And also, I think, a lot of maybe narratives that sort of built up around a certain executive's desire to close the GCP deal. That executive is now, unsurprisingly, at GCP. Exactly.

Speaker 2:

Of course. Right. I know. Yeah. Yep.

Speaker 4:

So it

Speaker 2:

like, the I have a lot of

Speaker 4:

personal bias against GCP, and just, like, all the interactions I've had with people there, both execs and individuals, have been, like, some of the most, like... I've never felt like I was losing IQ points faster.

Speaker 2:

Well, I will say, like, they have really great people. I mean, they're building really great stuff. I'll just leave it at that. I'll also

Speaker 4:

say that on the Firebase team, as much as I hate the Firebase product, like, I know a lot of good engineers over there.

Speaker 2:

Yeah. I just wanted to, just so I'm clear, you know, there are a lot of great engineers all over the place. So it's just,

Speaker 4:

Yeah. To go back to the, like, how did Twitter get here and how does Twitter get out, I wanna start with a couple premises just to see what the bounds of our agreement and disagreement are. And a lot of this is gonna be colored by my, like, ignorance as a younger dev. Like, I'm building on top of the backs of what you guys were making when I was in diapers, and it's essential.

Speaker 4:

I was even talking to Bryan earlier about how important a bunch of his talks are for, like, getting me into the idea of speaking as part of software. And with that all said, I cannot find where this quote came from. It was said by Luke Lafreniere on the WAN Show by Linus Tech Tips. I'm a good friend of Luke's. I asked him to find the source for me because I want it so badly.

Speaker 4:

The quote is: every 3 years, any given piece of software gets roughly 3 times easier to make. How do we feel about that?

Speaker 1:

I mean, I think that that is intriguing. I think the answer varies for different kinds of software. And for me, it feels like there are certainly

Speaker 4:

I'd say kernels get 3 times harder.

Speaker 1:

Yeah. Well, the other thing, I mean, honestly, to get technical about that: Rust has made that lowest-level system software easier, and it's not a factor of 3, it's a much bigger factor, I think, ultimately. But it also is not over 3 years; it's over a much longer period of time.

Speaker 1:

And I think that software has gotten easier and easier to write, or to rewrite, which I think is part of the reason why, Adam, you mentioned earlier about going back to your own software that you'd written some years ago, and someone's like, why would you do it this way? It's like, well, because this other stuff didn't exist. That's why. So I think there's some real truth to that, which is part of the reason why you do wanna kind of constantly be reevaluating: why did we do it this way?

Speaker 1:

And if we were to do it again, would we make the same decisions? And then I think the other key bit is: if we were to make a different decision, would it merely have allowed us to deliver that same artifact faster and better, or is it gonna yield some real quantum jump in the quality that we deliver right now? Adam, do you remember my argument about krtld with Mike back in the day?

Speaker 3:

No. I don't.

Speaker 1:

So krtld, the kernel runtime linker. Mike was complaining about krtld, and he's like, there's no way we're gonna have the current kernel runtime linker as it is 10 years from now. This is not software that we should be keeping, because the kernel runtime linker is filthy, and it's got lots of, like, strange hacks in it. But the kernel runtime linker

Speaker 3:

that was in a fit of pique, where Mike had been trying to add something or whatever and finding that it was 10 times harder than it needed to be.

Speaker 1:

For sure. And it can be really frustrating. The problem is that the kernel runtime linker also has, like, a bounded surface area. It does actually work. And it's not something that you really use in the hot path.

Speaker 1:

You kind of load a kernel module and, like, that's it. And I remember having this bet with him: the kernel runtime linker is not gonna be rewritten within 10 years. It's just not on a trajectory to be rewritten within 10 years, not because the existing software isn't grotty. It is.

Speaker 1:

It's that the rewrite wouldn't yield anything. Like, krtld will be rewritten when it is a side effect of doing something much larger. So if we were to, for example, ultimately rewrite the entire operating system in Rust, then yes, obviously, krtld is not coming along for that ride. But it makes no sense to rewrite it simply because it may be easier to write that kind of software today, because the software works. And, Theo, I think that would be my ultimate counterargument: I totally agree, but I wouldn't necessarily use it as a great So I'm

Speaker 2:

just creating.

Speaker 4:

I'm just doing premises with that. Like, we haven't gotten to the argument yet. Like, I asked one question.

Speaker 3:

So sorry. Okay.

Speaker 1:

Yes. Next premise.

Speaker 4:

No worries. So premise one is: software gets significantly easier to write over years. Like, I've heard the 3 times in 3 years number, and that gut feel is right for me in the web dev world right now. Things are getting significantly better, significantly faster than I ever would have expected, and it's holding. So with that premise largely agreed upon, the next one is that when decisions are made, a lot of the value in that decision comes from the understanding of the decision.

Speaker 4:

Okay. Like, even a decision, right or wrong, whatever, a lot of the value in a given software decision comes from the fact that the people who made it know why they made it.

Speaker 1:

Yeah. No. I'm definitely with you on that. And the basis of the decision is an important part of the act, the deliberation behind it. Yeah.

Speaker 1:

No. I'm totally with you on that. This is why I'm a big believer in well-documented code and having documents to support why decisions are made. Yes. Totally with you.

Speaker 4:

Awesome. One last one. A work environment that has more dimensions of complexity is going to have engineers more hesitant to make sweeping changes. So the more debt, uncertainty, legacy code, whatever the hell we wanna call all these things in a given code base, the less likely engineers are to propose and be down for sweeping changes, be it tech or product.

Speaker 1:

Yeah. Absolutely. With you on that one for sure. And especially as you get implicit interdependencies. You know, occasionally you'll have someone who doesn't believe that, and then they will break the system rather badly. And then they will be like, actually, okay,

Speaker 1:

I get it now. So, yes, I agree with that as well.

Speaker 4:

Yeah. I think these are the premises that Elon is working within, and the additional understandings that would tell you not to act on them are developments he just hasn't made personally. And the result is, if you operate within those axioms and ignore the rest, rewrite, rewrite, rewrite. That's what everything screams at you. I'm not saying it's the right thing.

Speaker 4:

I'm saying that's roughly how he got there.

Speaker 1:

Yeah. I think that's giving him far too much credit. I mean, on the one hand, he should be so lucky to have someone giving him such a sound intellectual basis for what is, to me, just a very emotional kind of reaction. But I think, if we could

Speaker 4:

Okay, that's fair. I can't assign this to him. Like, I'm saying, as I sat down to try and think why any smart person would ever say rewrite here, what are the things they're thinking or the things they're not thinking? Those are the things that they have to

Speaker 1:

be thinking.

Speaker 6:

I mean, this is the thing where you need to go back and reexamine the priors. First of all, is Elon

Speaker 1:

Musk a smart person?

Speaker 6:

And I'm asking that seriously. I mean, I think the answer is no.

Speaker 4:

Is or was?

Speaker 6:

Either. I mean, I think that

Speaker 1:

he he comes

Speaker 4:

So something that came up earlier that I wanted to jump on was the idea that Elon isn't curious, because that was one of the few things that stood out about him for me earlier on. Yeah. Like, when I first saw him doing interviews, in particular, I think there was a Joe Rogan one way back where he was going very in-depth on, like, the physics of the car door and the engineering challenges they had in building the door for the Tesla. And it sounded like he was genuinely excited and really curious about how those things work. It was a fundamentally different tone than I'm seeing in him now, and I do feel like that curiosity died.

Speaker 4:

And with it, a lot of the, like, question asking that you would expect when someone doesn't know what they're talking about. And as a result, when you go from asking questions about things you don't know to making statements about things you don't know, you go from curious to stupid very fast. This is

Speaker 1:

a very funny

Speaker 6:

He's been surrounded by people for decades now who have been telling him what an amazing genius he is, and that everything he says is so smart and precious. Like, if you were constantly told that you were the smartest person in the room, you'd begin to believe it. And everything else that didn't originate, you know, fully formed from the head of Elon Musk is just balderdash and nonsense. And that's why he can just arrogantly dismiss everything all day. You know, shut this person up.

Speaker 6:

He's talking nonsense. It's like, well, yeah, because, you know, that didn't come from him. And if he truly believes that he is the smartest person around, then he has no incentive to be curious about anything, really.

Speaker 1:

Well, I think that this is exactly it. And I think you're making a very good point that people absolutely do change over time. Kara Swisher has made the same argument, because she points out, you know, I did some very in-depth interviews with Elon years ago, and this is not the Elon that I recognize at all. And, Dan, to your point, it's like when you get surrounded with sycophants, and, by the way, a fear-based culture will generate sycophants. Right?

Speaker 1:

A fear-based culture will generate people around you telling you what you want to hear. And I thought it was interesting, it's an interesting kind of study in how to navigate this, to listen to George in that space try to, like, get him back on the rails without being like, hey, pal, go fuck yourself, you're wrong. I mean, he's trying to not lose his own credibility with Musk while trying to steer him, and it's very hard to do with someone who has been hearing over and over and over again what a genius he is.

Speaker 1:

We did Musk a disservice societally, and certainly did the people who were the true founders of Tesla and the engineers at SpaceX and so on. We did both Musk and those folks a disservice by pretending that he had done all this stuff. Because of how he speaks. And, Adam, I don't know if you've got the clip from 57 minutes, if that's one you can get to. Yep.

Speaker 3:

Teed up right now.

Speaker 1:

Yeah. Yeah. This is a good one.

Speaker 5:

Well, it's I mean, I feel like sort

Speaker 2:

of a

Speaker 5:

lot of things I should say to preface with, like, this is what I think someone told me, as opposed to, like, this is the god's honest truth. So that's an important preface for anything I say. What I say may not be true, or at least it's my recollection of what I think someone told me. So, you know, take anything I say with a grain of salt.

Speaker 1:

Oh my god. And just be like Wow. And you're just like

Speaker 3:

That's quite a disclaimer. Right? Will you walk around with a tattoo like that? I mean, it's really like a big asterisk on almost any statement.

Speaker 1:

Asterisk. Like, listen, I should tell you that, as the pilot of the aircraft, I don't actually know where we are necessarily. It is my recollection of what someone told me where we are.

Speaker 1:

You're just like, oh my god. Like, can we get someone who's flying the plane who knows where we are? How about that?

Speaker 2:

Fully postmodern. The whole thing is divested of authorial intent. It's all about the subjective interpretation of each reader or listener.

Speaker 1:

It really is. And I think it's mesmerizing, and good for Musk for recording this for posterity, because this is gonna be Enron-esque in the degree to which I think it's ultimately taken apart. And I think it's actually valuable to get inside his head to a certain degree, and there are these levels at which he knows that, like, actually, all I know is what people have told me. And speaking for myself personally, and actually, Ian, I mean, I'd like your perspective on this as someone who's led large teams: I really feel that I have to get into the details to really understand something.

Speaker 1:

And it's so hard to get into the details when you are relying on what someone is relaying to you. Like, I feel that I need to see the source code at some level, you know, and that can be really hard. That doesn't scale, obviously. So, I mean, how do you hit that balance of having kind of a large team and staying detail-oriented enough to know that what you're saying has basis in fact?

Speaker 2:

Yeah. I mean, I think we have a couple different perspectives on this, to be honest. So, like, you know, I think you have to be able to build trust in the right people. And I think you're always vulnerable to maybe what's happened with Musk, in that you trust the wrong people, or you have people who are, you know, afraid of you. But, you know, I think you have to be able to build a culture, because you can't necessarily be in the weeds on every single thing.

Speaker 2:

You have to have built a culture and a team where people are comfortable sharing the things that they don't know with the other folks on the team, admitting that they don't know something. I mean, yeah, you just can't be in the weeds on everything. And also, like, you can't manage things that you've never worked with hands on, and yet you have to be able to do that if you're gonna, you know, grow an org or grow a team or grow a company. Right?

Speaker 2:

And I'm sure you're experiencing some of this at Oxide right now. Right? I mean, you just have to be able to, you know, trust folks, but also create the right environment and mechanics where that trust can be safely given.

Speaker 4:

I think there's a certain level of, I've referred to this thing called reset theory. I even talked about it with George at the end of the space, where I think when people hit a certain threshold of success in a given area, they assume that in areas nearby it they can carry some of that success over. So, like, if I get really good at web dev, clearly I can go pick up Rust and contribute to the Linux kernel. Like, if I'm an 8 out of 10 web dev, I default to a 5 out of 10 Rust and Linux dev. Right?

Speaker 4:

No. That's not how any of this works. I basically am starting from scratch.

Speaker 1:

Yeah.

Speaker 4:

Like, there's some amount of patterns and paradigms you can carry over, but that's a slight bump of the floor, not a guaranteed raise of the ceiling. And people, once they hit a certain threshold of success, often feel as though they're allowed to go to other places, and that just their presence there means that they will be successful. But really, you're learning how to skateboard again.

Speaker 2:

Theo, we are absolutely on the same page here, other than I got old before I ever made a switch like that. But what I would say, Theo, to your point, I am plus 1,000 on it, and maybe to tie this back into, like, the perception of what's happened to Musk, or how he's changed: yeah, you need to have that confidence to go in and say, hey, I'm smart enough to come in here and learn a new domain and maybe have an impact.

Speaker 2:

But you have to still maintain that curiosity. Right? That beginner's mindset. The ability to go into that new space and acknowledge that you don't have the expertise, but you're bringing other things to the table. And it feels like Musk has crossed some sort of threshold where it is confidence sans curiosity, and that feels like it creates this toxic space.

Speaker 2:

I mean, there's a bunch of other shit going on there too, I think. But, like, that feels like maybe a fundamental sort of missing link. I mean, it

Speaker 1:

would totally. And then you couple that with, and I do think this is where the real challenge is. I think it's interesting for you to take on the question of, like, what would be the intellectual basis for a rewrite? And I think you're on the right track there, of these are the kinds of reasons why a rewrite would make sense. The thing that I just cannot get my head around is that trust is so important when you are doing anything bold, and trust is so easy to violate.

Speaker 1:

Trust is so easy to destroy. It's really hard to create. And I just see action after action after action that destroys that trust. Like, that to me is one of these fundamental lessons of software development in the large: part of the reason it's challenging is because we have to build trust across a large organization, and that is really, really tricky. And it's tough to maintain.

Speaker 1:

It's not easy. It never gets simple. And it just feels like there's an unwillingness there to focus on that trust.

Speaker 4:

If I've learned anything as, like, an eng lead and now an exec, it's that the best thing you can do for engineering trust is, when an engineering decision maker or leader says that something should be rewritten, smiling and waving as they do it, even if you know it shouldn't be, just because of the amount of trust and buy-in you get for that. And I do think a rewrite, given that the engineers who are there are interested and excited about it, could be a way to get a lot of buy-in.

Speaker 1:

That's an interesting idea. To me, again, it's the total rewrite. I'm like, tell me more. What are we rewriting exactly, and why? And, you know, there may be, you know, I

Speaker 4:

By the sounds of it, they're the ones telling him. If he knows so little about the rewrite, chances are other people are the ones championing it. I would imagine that there's a decent pocket of engineers at Twitter who are excited about this.

Speaker 2:

I mean, I don't I look. I here's what I

Speaker 4:

could just be wrong about that. If you have insight, please share.

Speaker 2:

Look. I mean, I have fewer and fewer contacts within Twitter at this point. But, like, look, I don't buy it. I mean, I feel like there were so many people, like, this is gonna be a really crappy example, but Twitter used to do these hack weeks.

Speaker 2:

Right? Like, once a quarter maybe, then, yeah, like, twice a year, once a year. Like, you give people a fucking week, and they can build crazy features, maybe not fully production ready.

Speaker 2:

And, you know, like, the stack didn't ever seem to be a problem for, like, innovation. I think Twitter's inability to ship stuff came from, like, the friction was not on the technical side. Right? It was on the product, business, and leadership side. Right?

Speaker 2:

There's a bunch of stuff that slowed things down. But, you know, everybody complains about certain sharp edges in the code, right, or pieces that need to be rewritten or refactored or evolved. But, like, the general gestalt of Twitter over the years was not like, oh, man, gotta go back and rewrite some code and this stuff is gonna take me 3 years. Okay.

Speaker 2:

So, you know,

Speaker 1:

I think you have hit, and this is a good point to kind of wrap it up and bring it home on, because I think you've hit on a really, really important point here. In your experience watching Twitter over the years, as you say, with these hack weeks, the stack wasn't the issue. It's tempting to blame the stack, but the thing that impedes innovation is much more likely to be organizational: leadership, product, the business, rather than the stack. And when it is some aspect of the stack, when it's clear that this thing is preventing us from really innovating, it's honestly pretty easy to get consensus on it and do it, because it's pretty clear that it's a technical thing that is preventing progress rather than organizational malaise. And in software engineering, I think it is much more common for organizational malaise to be what's preventing innovation than past technical decisions.

Speaker 4:

I think a lot of the people who would be in the way of things like that are gone. And as such, what's left is the, like, structure of how things once were. And, again, I'm not on any side here. I'm just trying to steelman the other end.

Speaker 2:

Yeah. I mean, I fully buy in to Conway's Law. It's legit. We've all seen it over the years. It's the real physics of how things work.

Speaker 2:

But, I mean, the nice thing about Twitter is that we reorged every, like, 3 minutes. So the perfect hedge against Conway's Law is just to continuously destroy the org. Now, again, yeah, I mean, along with all the execs who were the problem, and to be fair, Elon retained a couple of the execs who were the problem, I think that Elon also nuked and paved all the people who knew how this stack worked, how the code worked.

Speaker 2:

And what you end up with is a bunch of folks, sorry, not many folks, because there are still many great people there, and I'm not casting a blanket net or aspersion across the entire Twitter group there, but a lot of those folks are relatively new. They're like, I got hired 3 months ago.

Speaker 2:

I can't find a new job right now. And if you're dealing with a brand new tech stack and there's nobody around to walk you through the hows, the reasons, and the whys, it's gonna seem like hostile territory. And it's just sad to see that they're in this place and they can't move forward. But I don't know.

Speaker 4:

My only pushback there is the use of "seem like." Like, the software is software. It can't hurt you. And seeming like it's hostile and being hostile are the same. Yeah.

Speaker 2:

They're actually You're you're

Speaker 3:

speaking like someone who's different than software.

Speaker 2:

Yeah. Yeah. So Exactly.

Speaker 4:

Yes. Absolutely. And the point I'm making is that the distinction, in this sense, is to me unfair. Like, if it seems hostile and there isn't a way around that, because the people who would have been around to help you through it aren't there, then there's no meaningful distinction between seeming hostile and being hostile.

Speaker 3:

See, and I'll push back on that, Theo, because that feeling of hostility is actually born of confusion and, in some cases, lack of curiosity. And the knee-jerk reaction can be: I could build a simpler thing, better, faster, cheaper, more comprehensible. When in fact I just don't understand the system well enough, and the reaction needs to be to understand rather than to go to hubris.

Speaker 1:

Well, and it's interesting, like, the role of Pacific in ZFS, Adam. Where, in kind of the history of ZFS, we were suffering, Jeff Bonwick was suffering, with UFS, VxFS, desperately wanting to rewrite it. But really feeling like there was that, what, year period, two-year period before they went to a complete rewrite, where they really did try to reform, I would say, to take George's revolution-or-reform framing. And I think that whatever that was, the 18 months that they were trying to do that, ended up being very important to understand all the dimensions of the problem.

Speaker 1:

I mean, even still, ZFS was a total rewrite and took years and years. It took a very long time, and it's hard to imagine a rewrite that would have more tailwind than that one had, from a technical perspective and even an organizational perspective, although it was organizationally controversial at the time. I don't know. But we

Speaker 3:

I really agree, Bryan. And I would also say, though, that in some ways ZFS was relatively simple, is relatively simple. It runs on one system. You don't have these kinds of emergent properties. I mean, I guess you do a little bit in ZFS, but it's so much more comprehensible as opposed to this large distributed system.

Speaker 3:

And I think that's in particular, Theo, where one of your postulates kind of falls apart, which is that I totally believe there are some elements of web dev that may get 3 times easier in 3 years. I don't know that that holds for some of these distributed systems, especially some of these one-of-a-kind distributed systems.

Speaker 4:

Yes. But for others, it does. Like, I have been absolutely amazed watching, like, infrastructure fall in favor of serverless, watching crazy database distribution fall in favor of Vitess. Like, these open solutions and concepts have made it, at the very least for me, significantly easier to do infra. And a big part of why I'm now, like, a "web dev" in quotes is because I was able to move out of back end for long enough to do it, because the things I needed from servers became so much simpler to do.

Speaker 2:

Yeah. I mean, but by all means, like, Twitter should be, and honestly has been, iterating on those APIs and interfaces. Like, Mesos and Aurora is further along the spectrum of serverless, towards a pure FaaS solution. It sits between, let's say, Docker and

Speaker 1:

no matter whatever. But yeah. But the VM management Yeah.

Speaker 2:

Yeah. Whatever, yeah. Right. So, like, absolutely, there's innovation happening at these large legacy companies all the time. And it's so easy to say, and I'm not saying you're saying this, but I think what I see in the public space, Theo, is that people are like, oh, this company has been around for whatever.

Speaker 2:

It's just, it's old, it's legacy. But, like, you know, to Bryan's earlier point, it's the ship of Theseus. Like, things are changing, evolving, mutating all the time. You know, like, you look at what Yao and her team were doing in terms of metrics, observability, performance engineering.

Speaker 2:

There's just so much innovation happening. And there's this one-story lens that it all collapses into when you get into a lot of the Musk, and George's, to be fair, sort of reads of the Twitter infrastructure now. Interesting.

Speaker 4:

Almost entirely agree, but for the sake of steelmanning, the best I can do here is that Twitter's goals have shifted. And while previously things like performance mattered, both on the server and infra sides for cost scaling, but also for the user experience side, those wins matter a lot less when you're not able to make money to survive. And it's possible that through the shifting of the company, both in team and priorities, that this has changed. And to go back to the earlier point about the hostility thing, I would still defend the point, going back to the earlier example of an engineer who's been at Twitter for 3 months.

Speaker 4:

They're working on a code base that's existed for 12 years, and the people who made all the decisions aren't there. That, to me, is about as hostile an environment as a developer can be in. And I do still think that

Speaker 2:

I agree.

Speaker 4:

There's a lot of reason to consider rewriting in that scenario, even just to get to the point of full understanding faster. Even if you learn through the process, oh, shit. This is too hard to rewrite. Maybe I better understand and appreciate the decisions that were made. The process has a lot of value.

Speaker 2:

Look. I I mean, in a sense, you're right. In a sense, it is easier to understand something if you rebuild it from the ground up. Right? Like, absolutely.

Speaker 2:

But, like, you have to be able to come in. As a software professional, you have to be able to come into this strange land and figure out why it is shaped this way, why it looks this way. Otherwise, every time you rebuild a team, or the team shifts to, you know, 60% new hires and 40% old hands, you're rewriting. And that's a cycle of just no productivity, like, zero velocity.

Speaker 2:

Yeah. You gotta be able to read and learn. I think that's the

Speaker 1:

And be curious. Right? Ask questions. And, I mean, it's kind of an interesting idea: should you go rewrite it in the situation where you've inherited this lost civilization?

Speaker 1:

You don't know how it works. No one is around to ask, because it kinda descended from the ancients. Is that grounds to go rewrite this thing from scratch and kinda learn some of those first principles? I think that's an interesting idea. I also think you need to do that carefully, and you need to make sure, and maybe this is just me having been overly burned in lives past, but I think you want to understand the problems that are being solved.

Speaker 1:

And those problems may have shifted too, because often you get into these things and, like, actually, the problem that this software spent a lot of time solving is not a problem anymore, or we've got another problem that is much more important. So I think it's important to revisit in that regard.

Speaker 6:

It's also useful to ask the question: what does it mean to rewrite it? And I think this goes back to this question of, are you talking about revolution or reform? You know, an engineer who gets into a code base that is unfamiliar may sit there and be like, oh my god, I don't understand how any of it works, and it's extraordinarily complicated, and there's all this stuff happening asynchronously, and, you know, what should I do? Well, maybe you sit down and you rewrite one function, or you write some unit tests, or something along those lines.

Speaker 6:

Like, in some sense, you're rewriting that code, but that is qualitatively different than being like, alright, well, you know, the basis of the system, the government, clearly sucks. So

Speaker 2:

let's kill all

Speaker 6:

the lawyers and and get all the. Right?

Speaker 2:

Yeah. And Dan, if I may, like, I think there's also this really amazing thing: as you bring new people onto the team and they encounter this hostile environment, they ask questions. Right? Like, hey.

Speaker 2:

What does this thing do? And, you know, you're like, oh, it does this. And, well, why do we need to do that? Or why couldn't we do it this way? And it's a two-way street.

Speaker 2:

Right? It's not just like you have to figure out what we're doing. It's also like, oh, shit, you're right. That is not

Speaker 2:

The right way to do it right now. Absolutely.

Speaker 6:

There is totally value to having new folks come in with a new perspective and be like, alright, do we really still need to do this? It's like, well, no, maybe we don't. Maybe we can rip this code out and simplify things. That's great. That in some sense is also a rewrite.

Speaker 2:

The basic error, I feel, that we just have to acknowledge is this: if Musk had come in and said, look, I bought this company that I didn't wanna buy, and I'm insanely over-leveraged, and it sucks, but for the business to survive we have to get to this number in revenue and we have to get to this level in bottom-line cost. Start from there, with the people still there, just for a few months, a year. Experiment.

Speaker 2:

Yes. Experiment. Figure it out together, and then not everybody goes along for that ride. That sucks. But, like,

Speaker 2:

Yeah. But what transpired instead was, like, get everybody who could possibly help get me out of this hole I dug for myself off this ship. Get them away from me and leave me with people who are just gonna tell me what I wanna hear. And now here we are on Discord as opposed to the lovely

Speaker 6:

That's right. This is one of the things: this is Elon Musk destroying the myth of Elon Musk and doing so in a very public way. In some sense, he's doing himself the worst favor.

Speaker 3:

Absolutely. I agree with that.

Speaker 6:

His mythos has been allowed to stand for way too long, and here he is, just being like, the emperor truly has no clothes, and now we see that.

Speaker 1:

The emperor has no clothes and is running up Broadway covered in his own fecal matter, and it is actually incredibly helpful and clarifying. And, yes, he's doing a great service in that regard. Well, Ian, thank you very much. Theo, thank you very much. It was great to have you both here, as both of you have the distinction:

Speaker 1:

Ian, you are a jackass according to the world's wealthiest man. Theo, you just talk nonsense according to the world's wealthiest man. So

Speaker 4:

He liked my take on video ads.

Speaker 2:

I'll give him credit there.

Speaker 4:

He seems to vaguely understand video some amount.

Speaker 3:

Alright. I'm big stripio.

Speaker 1:

But I think, and I mean, to his credit, he was willing to kinda take questions from folks. And it was great that you both were willing to speak truth to power and offer your perspective, grounded as it is in your own experience. And I would like to believe that we can get to a more productive spot, by people taking some of the productive things we can learn from all this disaster. And one thing we can learn is that Ovid had it nailed in, whatever it was, 40 BC. So it is, it

Speaker 3:

is a big pile. Oh,

Speaker 2:

this. Exactly.

Speaker 1:

Alright. On that note, Ian, Theo, thanks again. Dan, thanks for joining us. Adam, thank you very much, and we will see you

Speaker 3:

Yeah. Thanks as always, Bryan.

Speaker 1:

This is fun. Not next week, because, Adam, you're with the fam next week, but 2 weeks from now we are gonna do predictions for 2023. It's gonna be a banger. So bring your predictions, and it'll be a good one. Alright.

Speaker 1:

Thanks, everybody.

Speaker 3:

Happy New Year. See you.

Speaker 2:

Happy holidays. Later.