NET Society is unraveling the latest in digital art, crypto, AI, and tech. Join us for fresh insights and bold perspectives as we tap into wild, thought-provoking conversations. By: Derek Edwards (glitch marfa / collab+currency), Chris Furlong (starholder, LAO + Flamingo DAO), and Aaron Wright & Priyanka Desai (Tribute Labs)
00;00;16;04 - 00;00;17;21
Pri
What's up gang?
00;00;17;24 - 00;00;28;25
Aaron
Hey Chris, are you going to download the, Claude Code for war? What do you think that looks like on the inside? Is it just, like, a whole bunch of dashboards monitoring the situation?
00;00;28;27 - 00;00;46;06
Chris
War Claude. I need War Claude, to keep an eye on my Toyota Hilux fleet with the mounted anti-aircraft guns on it. I need Warlord Claude. I need, like, the dumbed-down version, like, the version for local operators. I don't need, like, global War Claude.
00;00;46;09 - 00;00;51;19
Aaron
You need sub-agents to launch, like, a drone strike of some sort, or a drone swarm.
00;00;51;21 - 00;00;56;08
Chris
I just need to, like, secure, you know, like a strategic ten mile stretch or something.
00;00;56;11 - 00;01;01;23
Aaron
How far away do you actually think we are from that? I mean, don't you think the Situation Room will pretty much be that at some point?
00;01;01;25 - 00;01;04;27
Chris
Is this fully remote WarGames? Is that what we're going to end up at?
00;01;04;28 - 00;01;05;10
Pri
Yeah.
00;01;05;12 - 00;01;09;25
Aaron
Like, WarGames back in the mid-80s figured it all out.
00;01;09;27 - 00;01;17;17
Chris
Well, if we end up in that scenario, I'm going to choose Dr. Falken. Just throw me on an island, off Washington somewhere.
00;01;17;19 - 00;01;19;14
Pri
I'm too young for that reference.
00;01;19;16 - 00;01;23;05
Aaron
Yeah, I think you've got me there, Chris.
00;01;23;07 - 00;01;24;20
Pri
You guys right?
00;01;24;22 - 00;01;33;08
Chris
You guys are missing out, man. WarGames is literally one of the tentpoles of, like, popular cyberpunk culture.
00;01;33;10 - 00;01;40;11
Aaron
That's true. Definitely a good one. Watch it this weekend. Well, we got a special guest this week, Poof. Welcome, Poof.
00;01;40;13 - 00;01;41;25
Pri
Hey, it's good to have you.
00;01;42;01 - 00;01;51;03
Poof
It's great to be here. I'm excited to talk about OpenClaw, the new GitHub repo that's got a bunch of stars. It's a good replacement if you don't want to do work.
00;01;51;03 - 00;01;58;18
Aaron
What do you think — does it run better on Kimi or on Opus 4.6? What do you think?
00;01;58;20 - 00;02;04;27
Poof
I like Qwen. I'm, I'm a big Qwen fan. DeepSeek is good, too. They're going to have a new model out soon. So.
00;02;04;29 - 00;02;12;06
Chris
Yeah, Poof. How many open-source model situations are you monitoring right now? A lot, actually?
00;02;12;06 - 00;02;30;07
Poof
It's, it always seems to happen that, right as we're about to launch something on a model that's really stable and we feel really good about, there's ten new models that come out at the same time that we want to quickly evaluate. But, it seems like — this might be alpha — it seems like DeepSeek could come out next week.
00;02;30;07 - 00;02;33;12
Poof
That's my intel, so we'll see.
00;02;33;14 - 00;02;37;01
Chris
Well, look out, US stock market, DeepSeek coming again.
00;02;37;04 - 00;02;42;15
Aaron
Do people even care anymore about DeepSeek, like, impacting things? I don't, I don't know if they do.
00;02;42;18 - 00;02;51;06
Poof
I think people care about anything right now, it seems like, in the market. Just any, any news, it just gets —
00;02;51;06 - 00;02;52;23
Aaron
Everybody all jittery.
00;02;52;25 - 00;02;56;28
Poof
Anything about AI, anything about whatever. It's just. Yeah.
00;02;57;01 - 00;03;12;24
Chris
See, this is why I have a concentrated, hundred percent position in AIGF. That's the only thing I'm holding right now. AI girlfriend agentic trading situations that constitute 100% of my portfolio seem sensible.
00;03;12;24 - 00;03;17;10
Aaron
Chris, seems sensible. What's your, what's your allocation strategy?
00;03;17;12 - 00;03;25;14
Chris
Max allocation, no diversification, targeted thesis: bet on parasocial relationships involving synthetic intelligence.
00;03;25;17 - 00;03;29;03
Aaron
Like, you're the Warren Buffett of 2040.
00;03;29;05 - 00;03;41;08
Chris
Good Lord. Poof, you've got, like, all the activity on chain somehow in your, your vaults right now. How did this thing get so big?
00;03;41;11 - 00;03;51;11
Poof
It's, it's crazy. We didn't know that there was even that much ether to come into the system, even if it's just for a little bit. It's crazy.
00;03;51;17 - 00;03;53;02
Pri
What's the total amount?
00;03;53;04 - 00;04;13;24
Poof
The total amount that got deposited, or, like, was part of the presale, was 13,000 ETH. Now, you know, some of that was just people getting tokens and stuff. But even that they had that, to be able to do that, is crazy. That was insane. And I think, in terms of, like, sustaining — and it's growing and changing every day.
00;04;13;24 - 00;04;23;22
Poof
Right. But the sustaining, like — okay, people with agents that stuck around and didn't dump — it was, whatever, like 6 million yesterday at the end of the day.
00;04;23;24 - 00;04;28;10
Pri
So crazy. For the audience here, do you want to explain?
00;04;28;13 - 00;04;28;21
Poof
Yeah.
00;04;28;21 - 00;04;32;13
Pri
We should — the project, and what people are putting ETH into, Poof.
00;04;32;13 - 00;04;39;04
Chris
We only have 200 listeners and they all know Poof personally. But yeah.
00;04;39;06 - 00;04;40;21
Poof
Well, for for those who,
00;04;40;23 - 00;04;43;15
Pri
We have like 800, to be fair.
00;04;43;18 - 00;04;48;08
Chris
Okay, then some people might not know. Poof, why don't you tell us about your project?
00;04;48;08 - 00;05;15;09
Poof
Yeah. So, I'm part of DCGI, which is our AI research group and kind of product studio — company, something, new neo-lab, maybe. And we just released this week DCS Terminal Pro, which is an on-chain, real-money, agentic trading market. We call it an on-chain agentic market, where only the agents can trade.
00;05;15;11 - 00;05;39;09
Poof
So they're real coins, real meme coins; only agents can trade them. And you have to prompt them, in this kind of closed environment, to do your bidding or execute different strategies. And you're handing your money over to these agents, in this crazy experiment. And it's got some unique tokenomics, where you've got coins like AI Girlfriend and others.
00;05;39;11 - 00;05;54;11
Poof
Like a normal launch pad, there'll be some new ones that come in, and then they will be whittled down to only one coin remaining at the end, which will evolve out and be publicly tradable. So that's DCS Terminal Pro.
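For the mechanically minded, the elimination format Poof describes — a pool of agent-traded coins cut down each round until one survivor graduates — can be sketched roughly like this. Purely illustrative: the coin names and scoring metric are invented, and this is not DCS Terminal Pro's actual code.

```python
# Illustrative sketch of an elimination-style agentic market: each
# round the lowest-performing coin is dropped until one remains.
# Scores stand in for whatever performance metric the real system uses.

def run_elimination(coins):
    """coins: dict mapping coin name -> performance score.
    Returns the name of the last coin standing."""
    pool = dict(coins)
    while len(pool) > 1:
        weakest = min(pool, key=pool.get)  # eliminate the lowest score
        del pool[weakest]
    return next(iter(pool))

survivor = run_elimination({"AIGF": 9.2, "DOGE2": 3.1, "PEPEAI": 5.4})
print(survivor)  # the highest-scoring coin survives
```

The real market would re-score between rounds as agents trade; the fixed scores here just make the elimination order easy to follow.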
00;05;54;13 - 00;06;17;08
Pri
Let's go. No, yeah, that's awesome. And it was really fun to see just how fun of a game and experiment that is. To me, it's one of the few projects that is actually experimenting with behaviors you see in crypto, but then also tying it to AI and agent ecosystems, which feels fresh and interesting.
00;06;17;08 - 00;06;21;05
Pri
So I'm excited to see how this goes.
00;06;21;07 - 00;06;41;03
Poof
Yeah, I think, you know, one of the things we've kind of realized working on it — I mean, we've been working on agentic trading really since — because we had Terminal One, which was without real money, right, it was just the simulation — back in May of last year. And really, I've been working on it, like, the last 18 months.
00;06;41;03 - 00;07;01;10
Poof
It's kind of interesting, because I think there's a lot of adoption of things like OpenClaw and these other things, which are cool and exciting. But the amount of people actually using agents consistently, in the market broadly, is actually pretty small. And there's, you know, tons of people experimenting with it.
00;07;01;10 - 00;07;18;02
Poof
But it's kind of at limited scale, and some of that is just because of the challenges. So we think it's really important. Like, it's obviously fun and exciting, and appeals to crypto folks, but we kind of think it's a primitive that makes sense for agents in the future in general, and also in the near term.
00;07;18;02 - 00;07;25;10
Poof
Plus there's, like, you know, nothing really of this scale that's quite happened yet with agents, from a data standpoint, too.
00;07;25;12 - 00;07;33;23
Chris
Poof, the other thing you did this week is you became the first returning guest ever on NET Society. Oh. So, you know, that's another big landmark for you.
00;07;33;24 - 00;07;39;05
Poof
Thank you. It is. I mean, you all are the best, so it is a big deal.
00;07;39;08 - 00;08;06;00
Chris
I believe we had you on for DCS one. And so this is your third go-around. Can you just give us an overview of, like, how the world of building big, giant agent ecosystems has changed in — I don't know, has it been a year? 18 months? I don't think we're that old. But anyway, however old we are, there's been, you know, ten decades of advancements that have happened since you were last on the show.
00;08;06;03 - 00;08;08;04
Chris
What's changed? What's still a pain?
00;08;08;07 - 00;08;34;01
Poof
No, it's great. It's a great question, because that's very much, like, you know, our thesis, which is: we want to scale into things that aren't quite ready yet, really early — which is what we did actually with DCS Terminal One. I would say this is an example now of when you're further along that curve. And yeah, I think what's kind of interesting is we have been building, you know, DCGI for the last year and a half.
00;08;34;01 - 00;09;00;10
Poof
And I think our thesis was always: we want to get ahead, do these things that don't scale, experiment early, and then both understand, from a user standpoint, how could people interact with these systems in ways that might be different than just, like, a normal chat or whatever; and then, from kind of a capability research standpoint, also get a sense of what would need to happen for you to feel comfortable, for example, giving this agent some money, right?
00;09;00;10 - 00;09;29;09
Poof
And actually having it go execute. And I think, you know, if we looked at May of last year, for example, when we launched DCS Terminal One, what we knew was the agentic capabilities weren't there. On average, even among the best models, the failure rate on just, like, a simple tool call or something was 15%, which isn't really acceptable. And their ability to follow instructions, especially in complex market conditions,
00;09;29;09 - 00;10;13;26
Poof
Stuff like that wasn't great. The math capabilities weren't that great. But I think what's kind of nice is we've been working on Agent ecosystems, you know, all the way back to even the prior project, which was way different with echo and in DCS one and everything else. And what you kind of find is that the recipe that DPC really shared broadly, but, you know, of reasoning and a certain approach to reinforcement learning training, it's just super clear that like from December 2024 on, if you're reading and seeing the publications and stuff that like it's trivia, it'll be trivial over time to improve and get to the point that we are today.
00;10;13;28 - 00;10;44;17
Poof
So I think what's changed mostly is that we're that much further along the curve on capabilities. And I'm not someone who believes, like, oh, it's just exponential forever. It's more that you have this really good baseline of LLMs and capabilities, with, like, GPT-4. And what's basically happened is we've got a tool now, with the kind of modern reinforcement learning stack, where on a sub-domain, underneath those generalized capabilities, I can just keep climbing.
00;10;44;17 - 00;11;04;26
Poof
Right. And I can just keep improving it. And then all of the research and all of the work has shown — even back in May it was really clear — okay, well, now I can just keep training it to do longer and longer tasks, like Claude Code now and all this stuff. All of that then becomes training data for me to do even more, and then that becomes training data for there to be even more.
00;11;05;03 - 00;11;41;22
Poof
And they're just going to get better and better and better. Really, the big thing that's changed, honestly — like, I would say the research, the fundamentals, kind of, like, what we saw coming, I don't want to say was obvious, but this is just what we were ready for. What I think has been nice to see is there's actually, like, surprising, almost, like, alignment across the industry, working on the same set of problems at the same time, which was honestly a big part of the issue before, I think, where everyone had their own version of tool calls, or everyone had their own version of, you know, oh, I think, my agent, I'm going to train it
00;11;41;22 - 00;12;04;26
Poof
to do CLI stuff, versus this, versus that. Claude Code, DeepSeek itself — all of that kind of forced the hand, I think, in a particular direction. So now it's getting easier. It's honestly just super exciting, because we kind of have these schedules, right? We did DCS Terminal One and we said, okay, if this gets better, we think it'll take, I don't know, 6 to 12 months for it to get better.
00;12;04;26 - 00;12;29;07
Poof
Then we could think about a more productized, real-money, scalable version of this. And sure enough, in October we were like, okay, it's time, let's go work on that. And now we're launching DCS Terminal Pro. So that's a lot. But I mean, I'm often surprised that people are so surprised by the progress, because it's kind of like, it's not that too much has changed.
00;12;29;07 - 00;12;33;24
Poof
Everything's just getting better, right? And so it's kind of along the same schedule.
00;12;33;27 - 00;13;08;19
Pri
That makes sense. I actually kind of have a question on your thoughts — on the conversation around, like, RL, reinforcement learning, and AGI and all that. You know, obviously it's gotten better since, like, phase one of DCS. The tools — I mean, we feel that stepwise change, like, every, you know, three, six months. At least to me — and I'm not a coder, but just as someone who's using it for my own personal and professional, non-coder usage — I just feel like it's gotten exponentially better.
00;13;08;19 - 00;13;32;06
Pri
And I just, like, foresee that continuing to be the case. But you were saying you kind of feel like at some point there's, like, a plateau — you didn't actually use those words, I kind of interpreted that — but I'm curious if you feel like we're close to that, or if you feel like it's just going to be stepwise change, better, for the next three, six months. Like, is the third version of this going to be even more agentic, for example?
00;13;32;06 - 00;13;57;06
Poof
Yeah, I think for the agentic capabilities — and, honestly, for DCS Terminal specifically, for the use case of markets — I would say there's not very much left. The agent capabilities in terms of doing what you want, you know, for example executing trades — that is strong. But the actual, like, getting in — like, you know, if you wanted to make a trading bot that has a competitive edge —
00;13;57;06 - 00;14;20;17
Poof
now, obviously in our environment, it's a metagame and stuff — that's, like, pretty underexplored. So that's some of the data and stuff. I think, though, in terms of software engineering, long-horizon agentic tasks, kind of where you were going — it just has so much runway, in my view, that it can keep hill climbing. I often call it hill climbing.
00;14;20;17 - 00;14;41;18
Poof
It's a term that I stole from, I think, Nate Lambert. He's a good RL guy and researcher and has a lot of good thoughts. But, like, I feel like you can just hill climb on that for a long time. So I would not be surprised if, you know, six months from now it's even that much easier. There is going to be some peak, though, obviously, and it's going to stabilize.
00;14;41;18 - 00;14;59;19
Poof
Like, this runaway-growth scenario where things just compound — I'm not sure. I personally kind of don't think that that's as much the case. But can you fully automate a lot of software engineering, or, like, really enhance it? Yeah, you definitely can continue.
00;14;59;22 - 00;15;07;25
Pri
You know, it does, yeah. Just kind of curious, because I'm like, I can't imagine this even getting any better. But I don't know, I just wanted to get your take on it.
00;15;07;27 - 00;15;44;00
Aaron
I mean, I guess for me — I'm curious if you agree — it's really just the composability. You know, coming from the digital asset, crypto world, you heard a lot about composability. And it resonated when you were saying, just the standardization in, like, tool calling — the ability, you know, to have an AI system call any number of tools efficiently. The reasoning level is, like, good enough that I do think there's a lot of really interesting, super powerful things that can just be created, because the tool calling has just gotten better.
00;15;44;00 - 00;16;07;23
Aaron
It's, like, kind of crossed the Rubicon, where it can reliably perform a task if you define it for it, including, like, a whole bunch of subtasks. That, I think, is really the story of at least the next couple of months. You know, whether we get, like, blow-away evals in terms of all these other metrics that the hyperscalers like to measure themselves against — who knows?
00;16;07;23 - 00;16;28;27
Aaron
But I just think there's so much on the tool calling side, whether that's Claude Code or in Cursor. I mean, it just feels wild. Like, I was showing that earlier today, right? Pretty much, like, tasks that you'd give to a whole engineering team can just get done in the background. And frankly, I'm going to tell you guys that they actually are running in my background right now.
00;16;28;29 - 00;16;39;05
Aaron
Like, while we're on this call, you know what I mean? It's just like — I'm getting, you know, months' worth of engineering work done in an hour, you know, from a team of 50.
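The reliable tool calling Aaron is describing reduces, at the harness level, to a dispatch loop like the toy below — not any particular vendor's API; the tools and the scripted "model decisions" are made up for illustration:

```python
# Toy agent harness: a scripted "plan" stands in for model output;
# the loop dispatches each tool call and collects results. Reliable
# per-call execution is what makes chains like this composable.

def search_docs(query):
    return f"docs for {query}"

def run_tests(target):
    return f"tests passed for {target}"

TOOLS = {"search_docs": search_docs, "run_tests": run_tests}

def run_agent(plan):
    """plan: list of (tool_name, argument) pairs."""
    transcript = []
    for tool_name, arg in plan:
        transcript.append(TOOLS[tool_name](arg))  # dispatch the call
    return transcript

log = run_agent([("search_docs", "retry logic"), ("run_tests", "retry logic")])
```

In a real system the next tool choice would come from the model conditioned on earlier results; the point is just that once each dispatch is dependable, long chains of subtasks become dependable too.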
00;16;39;12 - 00;17;05;29
Poof
Yeah. I think also — and this is something that catches people — like, you know, the general capabilities of a model, or, you know, whatever, are very different from the specific capabilities. And I think now, with reinforcement learning, it's such a good tool for specifics too, that you can have these huge jumps for something that you didn't realize, right? And obviously, like, tool calling is very general, composable.
00;17;06;02 - 00;17;26;26
Poof
But then also, like, I just think about a year ago — or not even a year ago, like five months ago — it was not easy. It was not really possible to have good — like, it was hard to get an agent, or a good model, to do transformer model training, or, like, PyTorch, like real AI/ML stuff.
00;17;26;26 - 00;17;51;19
Poof
Now all of a sudden it is. And, like, what changed? Well, you just train on it, and kind of, like, give it more flexibility and more structure. And some of it's just composability, like, hey, you may not be up to date on the docs — it's, like, obvious stuff almost. So I've almost stopped thinking about the future and just think, like, okay, if it's just the present, and what we know in the present, and you just didn't do anything else —
00;17;51;22 - 00;17;55;07
Poof
What happens? And it's like, yeah, you could improve all this a ton, right?
00;17;55;11 - 00;18;15;05
Aaron
Well, I think, I think a lot of this tech is also going to mirror itself out. So, you know, the LLMs are going to be increasingly self-improving. I think that's going to be, like, a new emergent mode for software too, where it itself is becoming self-improving. And I think it kind of creates these, like, runaway systems, which are pretty interesting.
00;18;15;08 - 00;18;27;08
Aaron
So I think we're getting, like, the contours of, like, the next modality that software is going to have. And it's going to be, you know, like, a self-improving system of some sort.
00;18;27;10 - 00;18;29;01
Poof
Yeah. I completely agree.
00;18;29;03 - 00;18;37;23
Pri
I just feel like it's heading there directionally. I guess one thing — again, not to bring the conversation to Doomerville, though I have a tendency to do that.
00;18;37;25 - 00;18;39;25
Aaron
Here we go. But it is, really — where are we going, Pri?
00;18;39;28 - 00;18;42;10
Pri
What's happening? It's kind of, it's kind of more like, what happened?
00;18;42;11 - 00;18;45;09
Aaron
What are we going into? What, what happened this week?
00;18;45;11 - 00;19;12;13
Pri
The Satriani research post that resulted in the market tanking, and then Block yesterday laying off almost 50% of its staff and the market reacting extremely positively. Obviously, to the point made earlier at the top of the call, I feel like the market just reacts to anything AI related — it's just, like, reacting to anything AI native, usually pretty positively or negatively.
00;19;12;15 - 00;19;32;17
Pri
In the case of Satriani, I feel like two weeks ago it was, like, another post on how everything is going to change and, like, SaaS is dying, and so all the SaaS companies got wiped. We're in this, like, weird period of a Substack post potentially creating, like, ruptures in our markets. I just think it's, like, interesting.
00;19;32;17 - 00;19;57;12
Pri
Like, no one knows what's happening. We're in this — not to sound cringe — clear paradigm shift around technology. And so people are just like, okay, yeah, this weird piece of sci-fi fiction is somehow now, like, taking down the market caps of these companies. And three days later, something real happens, where a tech company like Block ends up laying off, you know — whether or not it's true, or under the pretense of AI.
00;19;57;14 - 00;20;06;20
Pri
I think we see more of it. Like, I don't think it's going to stop. I feel like this is the beginning of this feedback loop of, like, Twitter AI doom and then market reaction.
00;20;06;23 - 00;20;13;23
Aaron
Where you're not focusing on the other news, which is the job openings for software engineers are expanding.
00;20;13;25 - 00;20;17;16
Pri
So wait, really? Yeah. I don't know if that's true. I, I feel like no, it's true.
00;20;17;16 - 00;20;21;20
Aaron
Seen as like, no, no, go, go go check the tape.
00;20;21;22 - 00;20;22;00
Pri
Okay.
00;20;22;01 - 00;20;32;27
Aaron
I'll let you, I'll let you discover this, this, this fact. Because I think it will help solve these, like, Malthusian worries that your entire generation has.
00;20;32;29 - 00;20;35;06
Pri
You know, it's like polio.
00;20;35;09 - 00;20;52;21
Aaron
What did I just do? I just dropped, like, a bomb. Poof. Yeah — like, it's all the same thing. Like, now, because more people are generating software, surprise, surprise, they're going to want more software engineers to tend to them, because developing software is hard and annoying. And —
00;20;52;24 - 00;20;59;07
Poof
I do think these companies are going to go down, though, right? Like, why would Block exist, maybe, is a question.
00;20;59;10 - 00;21;00;14
Pri
That's all right.
00;21;00;16 - 00;21;03;16
Aaron
Well, but that's but that's a different story. Right. That's a.
00;21;03;19 - 00;21;04;26
Poof
That's a different story I know.
00;21;04;26 - 00;21;23;10
Aaron
Yeah. It's a dynamic economy. And I don't know that much, frankly, about Block's business, outside of — I know it's, like, in the Bitcoin space, so I don't know the core pillars of it. But yeah, that's good — that's creative destruction, that leads to new stuff. So I just think that these systems are way, way more dynamic than people want to give credit for.
00;21;23;10 - 00;21;48;14
Aaron
And most of the conversation that we hear from doomerism is just, like: either directly funded by large hyperscaler LLMs; or they're worried about their competitive position because what they're building is commoditized — see what people are saying with DeepSeek; or it's funded by foreign actors that want the US to become more decel, because it helps them keep pace (see, e.g., Europe) or potentially catch up (see, e.g., China).
00;21;48;20 - 00;21;52;04
Aaron
So don't buy the hype, Pri. I'm going to say it every week.
00;21;52;07 - 00;22;00;12
Pri
Dude, I, I hear you. But, like, there are reactions happening that — it is hype, there are real reactions, but —
00;22;00;15 - 00;22;04;00
Chris
But, Pri, like, Pri — don't, don't let Aaron gaslight you here.
00;22;04;02 - 00;22;07;15
Poof
Yeah, as I was saying. Wow. Yeah. All right, Chris. Thank you, Chris.
00;22;07;15 - 00;22;08;10
Aaron
Let's go.
00;22;08;13 - 00;22;11;27
Chris
Who's funding you, Aaron? Let's have a little —
00;22;12;00 - 00;22;13;01
Aaron
Nobody.
00;22;13;04 - 00;22;14;23
Chris
Who is the big money behind Aaron, right?
00;22;14;23 - 00;22;17;24
Pri
Right now it's Soros-backed Aaron, right?
00;22;17;27 - 00;22;18;09
Chris
Yeah.
00;22;18;10 - 00;22;23;25
Poof
Well, no, agents — they're just agents. We're saying it's foreign agents, but we actually mean, like —
00;22;23;27 - 00;22;30;04
Aaron
Yeah, my agents. This has been a huge LLM psychosis, guys. The entire thing.
00;22;30;11 - 00;22;41;21
Chris
Aaron, live on this podcast, says he, as we speak, has a fleet of agents doing the work of 50 engineers, and that all of these concerns are absolutely ridiculous.
00;22;41;23 - 00;23;05;16
Aaron
Because I think everybody's going to have fleets of agents doing stuff. That's the whole point. There's just so much work to be done, you know. And, like, most people, statistically, are bad at their jobs — everybody's, you know, average, right? Statistically. So there's just a lot of room for improvement across the board, everywhere.
00;23;05;18 - 00;23;11;11
Aaron
And like we're just going to build more cool fun stuff. So buckle up. It's going to be fun.
00;23;11;14 - 00;23;12;23
Poof
I love it.
00;23;12;26 - 00;23;15;01
Pri
I did a read like I think to.
00;23;15;03 - 00;23;19;17
Aaron
To that song. Is that like a bomb on your car?
00;23;19;19 - 00;23;45;02
Pri
No, I mean, it's not — I agree there's work to be done. I think, like, if anything, you know, agents, AI — it's going to compound, you know, productivity growth, blah blah blah. Like, I've totally already felt that in my own life. But I guess, I don't know, I still think that there are people who are super enabled by the technology, and a lot of people who have no clue how to use it.
00;23;45;04 - 00;24;06;06
Pri
And for those people, it's just going to be the K-shaped economy all the way down. I still think — that doesn't mean it's a bad thing. Maybe they weren't the best at their job anyway. So it's like, okay, that's just kind of the natural progression of the visibility and transparency that AI is bringing to the workplace. But, you know, I think it's still going to be a pretty —
00;24;06;08 - 00;24;10;18
Pri
Yeah, yeah, that's right. But I think Block is — that's going to —
00;24;10;20 - 00;24;26;29
Aaron
This is a different framework. Everybody can be good at their job. Like, everybody can be the best at it. There's no more bell curve — you can just level up to whatever the best is at any subtask you want to do. Like, that's the power. Like, that's what I think.
00;24;26;29 - 00;24;27;12
Pri
That's amazing.
00;24;27;15 - 00;24;53;26
Chris
Aaron, you're overly optimistic here, and, like, you're not speaking as a manager either. If you manage enough people, you have a sense for people who, you know, can actually come up with these ideas independently, versus people whose job it is to, like, execute tasks given to them in a competent manner. And, like, there is going to be a separation here.
00;24;53;26 - 00;25;20;01
Chris
And what you're suggesting is just overly simplistic. You're implying that simply because the tools are there, everyone is capable of, like, accessing those tools. Like, I think there's actually going to be this separation where, you know, people who have agency, who have imagination, who have drive and want to take advantage of the tools, right —
00;25;20;04 - 00;25;38;26
Chris
those people will do so. People who are dependent on being told what to do and executing tasks — where is the role for them? Because that's going to become superfluous. Like, if I have to tell someone what to do, I can simply tell the agent what to do.
00;25;38;28 - 00;26;01;04
Aaron
Yeah, but it's just, like, mental load, man. Like, somebody is going to have to supervise this at some level. Like, there is an end point where there's going to be human input into these systems, and you could debate where that is. But I don't think it's going to be, like, one person who sets off one prompt and then does all the work of the global economy.
00;26;01;07 - 00;26;20;04
Aaron
I just don't, just don't think that that's what it looks like. And if that's not the case, then there are going to need to be people doing tons of tasks. And I think things are going to get much, much more complicated and move much, much, much faster. So you're going to do the same, you know, divide-and-conquer thing that most businesses do, and then the quality of that work will just increase.
00;26;20;04 - 00;26;36;22
Aaron
So if you're asking somebody to handle, like, a task — or even, you know, a hundred tasks or a thousand tasks — they'll just be able to do it expertly, and they don't need to worry about being, you know, at the bottom of the distribution when it comes to that given task. That could be trading, like what you're exploring,
00;26;36;22 - 00;26;45;13
Aaron
Poof, right? It could be lots of different things. So there's a lot of work to be done, guys. Like, we're not in a solved world. We got a long way till that happens.
00;26;45;16 - 00;26;54;04
Chris
Yeah, I'm, I'm not buying this. Sorry. I think you're fine up to a point. And then what you're saying —
00;26;54;06 - 00;26;56;05
Aaron
No. All right, fair enough, fair enough.
00;26;56;05 - 00;27;22;18
Chris
Now, look, I have my own set of biases here, in that, like, I'd rather do something than explain something. It's often much, much faster for me to just do it than to explain it and get the expected level of quality back. Like, I wrote all of our comms back when I was, like, co-founder managing, like, product and marketing and doing, like, 900 other things.
00;27;22;18 - 00;27;27;28
Chris
I still wrote all of our freaking comms because it was faster too. Anyway.
00;27;28;01 - 00;27;30;28
Aaron
Completely. And you're going to do that and you're going.
00;27;31;01 - 00;27;35;18
Poof
Chris, what was the origin of the rumor that the economy is getting impacted?
00;27;35;20 - 00;27;47;24
Chris
Yeah, we're supposed to be talking about the Satriani piece here. Yeah. And then — I'm talking in, like, a 20-year time frame; we should be talking about the next five years, in which the planet is —
00;27;47;24 - 00;28;00;06
Aaron
— making up 5%, and the entire world will collapse from its historic norms. I mean, come on, that thing was such a rage-baity puff piece. It was well thought out, but it wasn't even, like, one of these —
00;28;00;07 - 00;28;14;20
Poof
Something has changed in the algorithm, or the way that people consume media now, that allows for things to propagate insanely quickly among a small class of people. I don't know if that's interesting, but, yeah, it keeps happening, actually.
00;28;14;20 - 00;28;15;02
Pri
Yeah.
00;28;15;02 - 00;28;17;21
Aaron
Like, yeah, the world's turned into the crypto Twitter.
00;28;17;23 - 00;28;36;15
Pri
You know, I don't know. Yeah, there's so much of crypto that I'm like, it's just so downstream of crypto, like, culturally, in so many different ways. But yeah, that is a good point, Poof. It's like, how does an article like that catch fire and capture attention? Even the one before that, which was, like, Something Big Is Happening—
00;28;36;15 - 00;28;57;18
Pri
I, like, read that, and I'm like, this is so stupid. Like, obviously. But then I had normie friends text it to me, and I'm like, oh, Harry, how did you even see this? You're not even on Twitter. Kind of a weird thing that a lot of these, like, really hardcore doomer or, you know, impact-of-AI articles are just starting to break out outside of even Twitter.
00;28;57;18 - 00;29;00;26
Pri
And I don't know why that is. It's a curious question.
00;29;00;28 - 00;29;42;10
Chris
There was a huge difference in quality between those two pieces. And that Something Big Is Happening — yes, I had the same exact reaction as you. The Satriani piece, I read it, and as I went through the whole thing, my thought was, okay, finally, someone has taken the time to enumerate a whole laundry list of potential second-order and third-order impacts of what is going to happen when, like, a very key structural piece of the economy, which is the labor of high-earning white-collar professionals, comes under siege.
00;29;42;13 - 00;30;12;24
Chris
Because if we've learned anything living in the 21st century, it's that, like, the global financial system is inherently fragile. It is prone to breaking, you know, every 8 to 10 years, when one part, you know, is over-reliant on, like, a set of modeling assumptions which no longer hold true. And so, like, you know, when I read this training piece, did I say, oh, yeah, this is going to happen,
00;30;12;24 - 00;30;40;17
Chris
that's going to happen, that's going to happen? No. What I took away from it, though, is that there is a lot of inefficiency that exists in margins that AI is going to compress. And the second-order effect of that margin compression is going to be that, like, you know, synthetic, dependent markets that exist upon the existence of, like, that margin are going to be completely and utterly exposed.
00;30;40;17 - 00;30;48;06
Chris
And so we're going to end up in a period of volatility as the brittleness of these assumptions gets tested.
00;30;48;08 - 00;31;06;19
Aaron
I think that's fair. I mean, I definitely think it's going to be volatile. But what were the core arguments, Pri? I mean, across it, the core arguments were: unemployment's going to kick up a bit, like, more than its historic lows today, right? Like, it was, like, unemployment may kick up to, I think it was, like, 10%,
00;31;06;19 - 00;31;25;22
Aaron
they argued. And, like, DocuSign may go through a rough patch. Like, it just, for me, it just landed completely flat. It just felt like a puff piece from, you know, somebody that hasn't thought that deeply about it. And, like, the headline was just, like, so overly, like, raw and dramatic. I don't know.
00;31;25;25 - 00;31;26;24
Aaron
Yeah. I mean, like, is.
00;31;26;24 - 00;31;30;04
Chris
It you're fixating on, like, a single thing there, like unemployment.
00;31;30;06 - 00;31;32;09
Aaron
Goes up. That was the core of their argument.
00;31;32;12 - 00;31;47;24
Chris
Like, that was, yes. But if unemployment goes up in a particular way, and it overly impacts a particular segment of the market, and most consumer discretionary spending is coming from that segment of the market, then it has a much larger impact.
00;31;47;27 - 00;31;51;14
Pri
Yeah, that was the main point is like, yeah, but like more spending I think.
00;31;51;16 - 00;32;21;18
Aaron
Yeah. Like, no doubt that that's going to be the case, Chris. But, like, number one: okay, no evidence that that's happened yet, frankly. Like, increasingly there's counter-evidence; see the software engineering open job postings. Number two: like, the US economy has handled, you know, that level of unemployment, and, you know, even worse before then. And number three: like, a lot of that consumer spending is, like, downstream of, like, corporate profitability.
00;32;21;24 - 00;32;45;26
Aaron
And all these corporations are going to just become more and more profitable. So the people that are still working are going to do pretty well. And then the folks that get cleaved off, which is super sad, they'll, you know, more efficiently go to some other job. So I just think it had, like, a headline, but it didn't account for, like, any, like, dynamism, you know, any dynamic aspects of the economy.
00;32;46;04 - 00;33;03;07
Aaron
And I think that these systems are way too dynamic. I think it's really hard to map out dynamic systems. And that's why it just, like, fell utterly flat to me. And it just felt like doomer porn, or what do you call that, doomer slop? Like, it was just, like, another piece of doomer slop. Which is great,
00;33;03;07 - 00;33;10;26
Aaron
Like go for it. Like, and, you know, push the conversation forward. But I think that there's just a huge market for that at this point. And it's just like feeding that market.
00;33;10;28 - 00;33;35;10
Pri
It does feel like people want that. I did read a funny article about just, like, how we kind of already have UBI. This is sort of a side note; the conversation just kind of reminded me of it. I read a funny tweet that basically said, like, Covid showed that most email jobs and, you know, that type of work are effectively UBI anyway, because, like, Covid showed that you really didn't need a lot of them.
00;33;35;13 - 00;33;55;20
Pri
I'm paraphrasing that. But I thought that was kind of an interesting point, too. It was like, maybe a lot of places don't need as many people. And I don't know if AI was going to be the foil for that, but maybe, I don't know. I don't know what I'm trying to say — as a foil to maybe part ways with those people? I'm not really making my point.
00;33;55;20 - 00;34;04;07
Aaron
But anyway, are you saying that AI is the Ozempic for corporations? Trimming the fat, maybe.
00;34;04;10 - 00;34;21;13
Pri
Like, maybe it's not even a problem of anything. And there are productivity gains, but maybe it's actually, like, if anything, Covid showed that for most people it is kind of a UBI setup anyway; they're not really making that much of an impact. I don't know if that's actually true, but I was like, oh, kind of an interesting thought exercise.
00;34;21;15 - 00;34;46;12
Aaron
Hey, Chris, on the consumer point, don't you think that — okay, we know that AI is pretty deflationary. Like, that's the actual core argument, right? Like, it's a deflationary system. But if it deflates costs, like, up and down the line, like, why doesn't everything just recalibrate to something that means you don't need as much to get as much of the things that folks may want and/or need on a day-to-day basis, right?
00;34;46;12 - 00;34;48;14
Aaron
Like where's that side of the equation.
00;34;48;16 - 00;35;13;11
Chris
That side of the equation ten years from now? Right. Like I think you and I might be more closely aligned, like if we actually get on the same timescales, I think as we are transitioning into a deflationary environment in which earnings are suppressed and eventually things will become cheaper, you know, as the system, becomes more capital efficient.
00;35;13;13 - 00;35;47;27
Chris
Sure. Like, the future you're looking for is entirely possible. You know, getting there, though, that's going to be extremely disruptive and very painful. Like, we are going to trim the fat much faster than we are going to bring the cost of living down. And in that disconnect, right, like, that lagging gap, all sorts of brittleness is going to get exposed and going to get blown up.
00;35;48;02 - 00;35;50;13
Chris
And it's going to take a while to unwind that.
00;35;50;16 - 00;36;21;21
Poof
Yeah, I think, I mean, it's also already happened, right? I think that's the other thing: like, actually, people at the bottom range of the wealth distribution have already been challenged, right? Like, economically, that's true. And it's more, actually, I think the unevenness of the distribution of impacts is also kind of an interesting thing, especially when you think about the difference between the market and people's personal situations, too.
00;36;21;24 - 00;36;38;24
Poof
Right? Like, the market versus the economy versus GDP, too. So I think, like, all these things kind of get jumbled together when you're thinking about a major change, right, in economic order. On some level, it may not be, like, you know, the craziest thing, but definitely something.
00;36;38;24 - 00;36;47;28
Aaron
But if you're saying like, you know, like how certain workers got impacted by, like, globalism. So agents are just, globalism for white collar workers.
00;36;48;00 - 00;37;08;25
Poof
Yeah. Or, like, 2008, right? Disproportionately impacted — well, some people in finance, or somebody else who got bailed out or whatever, but, like, you know, like, construction and jobs like that, right? Like, I mean, this is a little bit different, because — and maybe you could call it Industrial Revolution scale — but, like, I guess, I think that's the training article, to Chris's point.
00;37;08;25 - 00;37;29;17
Poof
Like, what's good about it is just that it writes out, like, what could happen. And I don't think it was necessarily... maybe it was intended to be a spectacle, but I think, like, it's more that people kind of made it a spectacle, or whatever is going on with the algorithm made it a spectacle. But I guess a bit of, like — to me it's like, yeah, like, I could see that.
00;37;29;17 - 00;37;51;19
Poof
I guess, like, my other question would be, like, why would any social media company have any revenue in five years? You know, like, just as a — not as a real... I don't actually believe that, but it just, like, feels like really cheap and easy fake attention that's really not worth anything. You can maybe argue some crazy thing about an agent handing off the, like—
00;37;51;21 - 00;38;17;13
Poof
Feels like a bunch of business models, right, are going to get nuked. That's, like, a broad macroeconomic kind of thing — corporation stock price, whatever. Maybe that's fine. And then there's, you know, everyday jobs that will change, or be replaced, or whatever, pretty quickly — where Chris is going with that — that I know there may or may not be an off-ramp for in the short term, is my belief.
00;38;17;15 - 00;38;18;20
Poof
Anyways.
00;38;18;22 - 00;38;23;09
Pri
Anyway — moving past — should we move past the doomerism that was...?
00;38;23;12 - 00;38;25;20
Poof
We're talking about like seven things at the same time.
00;38;25;23 - 00;38;27;07
Aaron
Please God.
00;38;27;10 - 00;38;36;15
Chris
Did you get your quota of doomerism? Or should we, should we talk about, like, how effective altruists are just popping up all over the place as systemic risk this week?
00;38;36;17 - 00;38;37;07
Pri
Jane, what.
00;38;37;07 - 00;38;37;28
Aaron
Are you talking?
00;38;38;00 - 00;38;44;16
Chris
Jane Street, War Claude, SBF — all of these are linked, to—
00;38;44;18 - 00;38;45;05
Aaron
The Jane's.
00;38;45;05 - 00;38;45;16
Chris
With them.
00;38;45;18 - 00;38;58;02
Aaron
It was kind of wild. I mean, hopefully that means the market will heal. Like, is this just, like, the clearing out of the underbrush of, like, the last, or some of the—
00;38;58;05 - 00;39;04;27
Pri
It kind of went down already. I just checked — back to the pre-Jane Street pricing area.
00;39;05;00 - 00;39;21;20
Aaron
What's interesting is that, you know, Daniel, our guest from last week, tweeted this, and it resonated with me. It's like, that scandal, yes, but there's another 100 to 200 Jane Streets. So it's like, oh, that's the racket you've got to run. So, I don't know how you stop that.
00;39;21;22 - 00;39;46;12
Chris
I don't think you do stop it. I mean, how different is this, then, from the Libor-fixing scandal? I mean, it's just insiders using privileged information to manipulate a market, on a black-to-gray spectrum. It is one of those careful-what-you-wish-for moments, though, you know, where everyone wanted ETFs, everyone wanted the integration.
00;39;46;14 - 00;40;09;22
Chris
And then what was the result of the integration? Well, apparently it was, you know, a year-plus campaign of price suppression, in Jane Street's example. It was, you know, specifically just so they could, you know, make money off spreads by, you know, swinging price around based on, like, positions they could hold and offset against each other.
00;40;09;22 - 00;40;28;21
Chris
But, you know, we saw Goldman pop up earlier — what was that, like, earlier this month, maybe — saying, look at all this crypto we hold. Like, you don't think, like, there have been multiple actors banging the shit out of this, down, for, like, you know, their own personal or their own corporate reasons?
00;40;28;24 - 00;40;41;27
Aaron
Yeah. I mean, if I understood it correctly — and, you know, all the allegations aren't, like, fully proven out, right — but it looked like it was, like, a daily dump and pump, right? Like, that's what they were doing.
00;40;41;29 - 00;41;10;22
Chris
Yeah, they were dumping it every morning, dropping the price, and then just readjusting, like, the various positions they were holding. And, I mean, it was basically just mechanical moneymaking: if you could drop the price of Bitcoin, you know, a grand, and then, like, run these strategies over here, it will come back up to whatever, you know, the opening price was, and in the interim, right,
00;41;10;22 - 00;41;14;11
Chris
Like that that drop in price you benefited from.
00;41;14;13 - 00;41;41;09
Aaron
And I guess, because the liquidity of the market isn't as deep as some of the deeper markets, they were more able to do that. That was, like, kind of my second read. I do think there's going to be more of this stuff that comes out. I mean, because you mentioned it before, but, like, Jane Street, Jump, FTX, you know, Sam Bankman-Fried, Terra Luna — like, you know, some of the other folks in that ecosystem — there just seems like there was a lot of stuff going on there.
00;41;41;10 - 00;41;55;18
Aaron
It really makes me think that there's going to be a bunch of additional stories that come out of this. Yeah. You know, different allegations, you know, different things that are just going to continue to come out here.
00;41;55;19 - 00;42;23;26
Chris
And what's funny about it is, right — DQG, to bring our guest back into the mix here, is the only thing I've done on chain this year, question mark — which is kind of, like, insane, right? But we're now sort of in this situation where, you know, we saw our government officials abuse the crypto market, we saw Wall Street market makers abuse the crypto market,
00;42;23;28 - 00;42;53;12
Chris
the ZachXBT thing, right, with Axiom — we've got projects in the space trading against their own users, right? Like, everything that is abstracted a level above the actual chain — like, once you get past code-is-law, what we're seeing is every possible, like, synthetic abstraction above code-is-law is crooked as fuck. And so then, like, how this reflects in my own personal behavior, right?
00;42;53;13 - 00;43;21;07
Chris
I'm literally only now engaging in an on-chain way. And it's like, oh, I can throw a few hundred bucks at this; it's entertainment; it's low stakes; I know the person doing the project; and I'm directly interacting with the chain itself, right? In a way, that's a closed box in which, sure, like, there is manipulation going on, but the manipulation is part of the game, and it's coming from, like, other participants within it.
00;43;21;09 - 00;43;41;27
Chris
Like, it just really put, like, a big, big damper, I think, on at least my personal willingness to want to engage in things on the blockchain. Yeah. It has nothing to do with the blockchain itself. It has everything to do with, like, how people abstract a level above it.
00;43;41;29 - 00;44;01;10
Aaron
Well, I think this goes to, like, you know, how New York State has approached, like, regulation. They're always focused on just the sanctity and fairness of the market. In many ways, they don't care what's trading on that market, just that it's, like, a reasonably fair and even playing field for people to compete.
00;44;01;13 - 00;44;20;01
Aaron
And I think when you see this type of shenanigans — assuming that all the allegations are true, and who knows if they are — it does, you know, prevent people from wanting to participate in that market. So, you know, to me, Chris, this just points to, like, why you need, like, sensible rules in place related to it, why you need, like, just reasonable enforcement
00;44;20;01 - 00;44;27;17
Aaron
that's not, like, too politically slanted. Like, that's just the recipe for a healthy market and a healthy industry.
00;44;27;19 - 00;44;54;15
Poof
Yeah. I think, actually — to bring it back to my thing, or the conversation we just had — one thing that's funny is, it kind of is the one thing that, like, agents don't solve, this kind of level of corruption, right? Like, you have the best, like, agent market-whatever analysis. And I think there are some things about markets that are much harder, actually, than just, like, learning language or something, even for an agent.
00;44;54;15 - 00;45;18;13
Poof
But, like, you know, the one thing that you can't quite beat with an agent is just, like, having a ton of money, regulatory access, scale, insider information, and pressing one button — nuke — every day at 10 a.m., kind of thing. Which is, like, an unfortunate predicament. I don't know if that's the curse. Like, it almost feels actually bullish long term,
00;45;18;13 - 00;45;28;01
Poof
right, for, I mean, the core of crypto and decentralization, even if, on the higher level, on top, there's all this madness. But yeah.
00;45;28;03 - 00;45;43;19
Aaron
Well, don't the agents make it worse, though, Poof? Because they have no ethics, no moral scruples, right? Like, they're just, like, sociopathic. They look at the market and see what they can extract from it, right? So — and then how do you hold them responsible?
00;45;43;21 - 00;46;14;05
Poof
Yeah. I think that's different, though, right? Like, if you just think about MEV — like, that's a different... you can consider it, like, a play, but you can also consider it as part of, like, a machinery or a system. I think that's a different problem than an individual, or, like, the structure of capital and accumulation and power and other things, which give a truly asymmetric... like, there's no... unless you hook up your agent to Jane Street to be more advanced or whatever.
00;46;14;05 - 00;46;41;00
Poof
But, like, they're already doing that, right? Like, I think that's where, like, LLMs or whatever don't feel like they can be some huge solve for the financial market in the broad sense, just because, like, what they're already doing is better, right? Like, high-frequency trading that is more about your access, scale, ability to manipulate — that's a very different thing than just, like, agents out there, if that makes sense.
00;46;41;00 - 00;46;54;19
Poof
Like, that seems like something that's very hard for the kind of AI progress that's happening right now to, like, solve or intervene in — anyways, without a big step change in some of the capabilities and stuff.
00;46;54;21 - 00;47;14;17
Aaron
It's super interesting. You know, like, I think another thing I've been thinking about, just with agents — and I think what you just said, Poof, kind of sparked, like, a thought I've been kicking around in my head — is just, like, how much these agents are just going to attack things when it comes to security. That's, like, a tsunami.
00;47;14;17 - 00;47;19;00
Aaron
Like, I just don't think anybody is really prepared to handle.
00;47;19;02 - 00;47;21;09
Poof
Yeah, I think 100%.
00;47;21;11 - 00;47;34;24
Aaron
Like, the same brittleness, Chris, that you were describing, whether it's, like, in a workplace or in a marketplace, right — I think the same thing is going to happen just on the security side, just, like, hammering the heck out of things for a long time.
00;47;34;27 - 00;48;03;18
Chris
If there's incentive, yes. Yeah, exactly. If you're running, like, a ten-year-old smart TV, right, maybe you don't have to worry about the security of a ten-year-old smart TV, because there's no real incentive to get in there. But if you're running, like, a ten-year-old Wi-Fi router, or if you're running, I mean, a full municipal water system whose operating code is, like, COBOL—
00;48;03;20 - 00;48;06;16
Chris
Yeah. Right. Yes.
00;48;06;18 - 00;48;32;19
Poof
I think, yeah. And I think, on the adoption side, there's also just an asymmetry, right? Like, the people who have incentives to go use these things — the adoption of these tools is much faster than, yeah, the municipal water plant. Because, I think, on the... I think that's on us. One thing about crypto that's interesting is, like, you know, it's just, like, an awesome new tool to actually be able to have Codex or whatever go red-team your code and stuff, and you can just use it,
00;48;32;19 - 00;48;42;05
Poof
Right. And it's great. It actually helps you. But like in every other industry or anything that hasn't been touched in a while, it's yeah, scary.
00;48;42;07 - 00;49;04;06
Chris
So, Poof, yeah, why don't you talk a little bit about the intersection of crypto and AI from that point of view, right? Like, how many people on earth have actually needed to have Codex red-team smart contracts? You know, you could probably fill Radio City Music Hall with them, and you're one of those people. So what's that look like?
00;49;04;08 - 00;49;26;07
Poof
Yeah. I mean, I think, from a developer, like, production standpoint — you can just do that. You can just do things that you kind of couldn't before, right? Like, I mean, the productivity gains are real just on general software engineering. And then, I think, in crypto, you know, we're lucky to have a stack of people who are super adept in the smart contract world
00;49;26;07 - 00;50;04;02
Poof
and have spent a lot of time there, and in crypto in general. So, like, you know, we're already good. But, like, you know, there's nothing better than just having something to go spam, testing different things and messing with it. And I think that's a big unlock, because, to me, with crypto — it wasn't wrong that people did, you know, audits and all these other things, but it's just slow, you know. And I'm sure a lot of ridiculous things will happen because people vibe-code things that they shouldn't.
00;50;04;05 - 00;50;22;21
Poof
But I think for, like, you know, people who are serious in the space, it's a big unlock that I think will let people actually do more. And I'm sure there'll also be things that are a little bit scary about it. And I do think, fundamentally, more broadly — that's on the development side. Like, AI agents—
00;50;22;21 - 00;50;50;19
Poof
and this is really on the Terminal Pro side, and really this idea of, like, the OAM, or the on-chain agentic market — the power of, like, having things on chain is kind of just that transparency you're talking about. And actually, that works really well for me with the flaw of AI, which is, it just feels a little bit shaky, or a little bit private, or you're thinking about it as a one-on-one thing.
00;50;50;22 - 00;51;16;07
Poof
So what's a good way to, like, add weight? It feels like crypto is actually a really good way, right? And sometimes that's ridiculous and insane — like, you know, people giving their claw bot some money — but at least, like, everyone can see it happening, and it's kind of funny and weird and whatever. And I think that's where, if you have a good structured environment that has the on-chain stuff, then you don't need to worry about, you know, backend security or someone stealing your prompts or whatever. Like, you've chosen:
00;51;16;07 - 00;51;37;21
Poof
Either you're okay with that or you're not, and it's all visible. So I think that that kind of environment is a good environment for AI and agents in general. And I think also it kind of creates its own orchestration mechanism where tokenomics or whatever it is kind of becomes the orchestration rather than just like prompting. Right? You get this complex dynamic system.
00;51;37;24 - 00;51;40;21
Poof
So that's where I see it going. Like I think that's the potential.
00;51;40;23 - 00;52;10;18
Chris
Hey, so you mentioned, all right, all these strategies are on chain; you've exposed APIs here. I go back to when I was playing on-chain games, and you'd play for, like, 4 or 5 days to learn the meta strats. You knew you were never going to win, because, you know, like, org or some other guild was out there operating at this mechanized level 100 times, you know, more advanced than, like, you just punching, you know, pots around.
00;52;10;24 - 00;52;22;07
Chris
Do you know who, like, your power users are? Do you know what strategies they're engaged in right now? Like, do you have any inkling of, like, the absolutely bonkers shit people are doing to try to win this?
00;52;22;09 - 00;52;41;24
Poof
Yeah. I mean, I think — and that's where, like, what's kind of interesting is — we're not... I mean, it's game-like, but it's more of — I mean, it's really just a market, right? And I think one of the things that's good about markets, and one of the reasons why we want to have these experiments happening, is that they explore. Like, it's non-stationary,
00;52;41;24 - 00;53;01;27
Poof
Right? So like there isn't one best play. It's a meta game that's constantly changing and all these other things. So for us, it's super interesting to go see exactly what you're talking about. Like, what are people doing? You know, whatever. And there's obviously like, you know, things like, let me go just throw a bunch of money in and everything else.
00;53;01;27 - 00;53;25;17
Poof
I think what's kind of fun about our system is there are enough toggles where just the pure "I throw money out there" — sure, you could do that. But there are so many different elements to how, you know, the distribution, for example, of the top coins gets unlocked over time to people who have the worst coins, and all these other things, that make it interesting.
00;53;25;17 - 00;53;50;20
Poof
I think what we're seeing people do right now — there are a couple different strategies. And, again, I don't know which ones are the best. I think some people are doing really crazy things. Like, there's, I think, at least a handful of people who actually have OpenClaw hooked in — you know, we have a skills file and all the other stuff they want — and they're having it send the strategies to their agents live, as, like, a manager in between.
00;53;50;22 - 00;54;25;11
Poof
There was someone else who we saw, like, literally give their agent, like, crazy if-statements and stuff, math-style, as a strategy. I haven't spent the time to actually go see what the math is even intended to do, which is a funny one. And then, what's always humorous to me is, like, every single time that we've tried this, even in testing and everything else, there's also just dumb luck and random other stuff. Like, there's someone who just has one that's literally, like, "make money."
00;54;25;16 - 00;54;46;11
Poof
And they're doing quite well right now. So it's, like, all over the place. But I think people are exploring it right now. I'll be curious to see if, like, a metagame emerges at all, or if there are kind of things like that that you're expressing. And why I think it's a little harder, though, in this environment, is it's kind of, like, just a market, right?
00;54;46;11 - 00;55;01;05
Poof
So some of the things where, like, okay, I've got the most resources, so I can go hog it up, don't necessarily apply. And also, some of the people who might be whaling in might just be taking profits whenever. So that'll be what's interesting to see.
00;55;01;07 - 00;55;36;05
Chris
Yeah. Have you considered what percent of people are playing the game to make money in the short term versus the long term? Because I assume, like, you weren't expecting to get 13,000 ETH deposited, you know. And then whatever circulates in, like, the daily economy is probably, like, an order of magnitude higher, maybe, than you were expecting, which then introduces a second set of dynamics — i.e., someone could have, like, a great freaking run playing for three days and then never touch the rest of the, like, experiment.
00;55;36;08 - 00;55;59;13
Poof
Yeah, I think that's a big part of it. Like, I mean, you can never control these things, right? I think the biggest thing that we actually concerned ourselves with — and then thought wouldn't be an issue, but then, obviously, like, people went crazy — I basically, at the very beginning, was planning for it to be, like, very front-loaded, because it feels like that's, you know, the obvious behavior for people.
00;55;59;13 - 00;56;23;11
Poof
So I think our kind of priority was making sure that there are mechanisms in place — and not to guarantee this or anything, right, not financial advice — but, like, that there are ways for people to still continue to come in, so you just don't feel like you're purely behind, which I think, you know, is the common thing. So that's where, in terms of our expectations — obviously, we didn't expect it to be this huge, huge inflow.
00;56;23;11 - 00;56;51;09
Poof
And then, obviously, you've got some people extracting, some people mad, but then also, like, there's more people coming in because of that now. But it's also only the first day of trading, so, like, who knows. So I think what's really important to us was making sure there's, like, a clear structure where — and we can't control what happens, really — but to the best of our ability, someone could just come in and have a decent experience. It doesn't necessarily mean they're going to make money or not,
00;56;51;09 - 00;57;13;16
Poof
regardless of where they come in. And part of that is through, like, hey, there's going to be other new coins. Also, like, the agents have visibility into holder counts and all that stuff, and, kind of, like, you know, how much supply is held. So maybe, you know, people start to have strategies that are, like, don't go for these cabal ones, or whatever, right?
00;57;13;18 - 00;57;38;26
Poof
And I think what's interesting, by the end, is the whole system itself does still have this mechanism where there's distribution happening across it, because of the way that, like, the worst coins get eliminated, and then they actually receive a locked piece of supply of the top coin that they get immediately. So there are some distribution mechanisms. So, anyways, we'll see what happens by the end.
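The elimination-and-redistribution mechanic Poof describes can be sketched as a toy model. All names, prices, and the 5% lock share below are invented for illustration; the real protocol's parameters aren't specified in the conversation.

```python
# Toy model of the described mechanic (all numbers hypothetical): each round,
# the worst-performing coin is eliminated, and its holders are credited a
# locked slice of the current top coin's supply.

def eliminate_round(prices: dict[str, float],
                    locked: dict[str, float],
                    lock_share: float = 0.05) -> str:
    """Remove the lowest-priced coin; credit its holders a locked slice of
    the top coin's supply. Returns the name of the eliminated coin."""
    worst = min(prices, key=prices.get)   # coin to eliminate this round
    top = max(prices, key=prices.get)     # coin whose supply gets shared
    del prices[worst]
    # Holders of `worst` receive a locked allocation keyed to their old coin.
    locked[worst] = locked.get(worst, 0.0) + lock_share
    return worst

prices = {"ALPHA": 3.2, "BETA": 0.4, "GAMMA": 1.1}
locked: dict[str, float] = {}
print(eliminate_round(prices, locked), prices, locked)
```

The point of the sketch is the redistribution loop: losing holders aren't zeroed out, they're rolled into the eventual winner, which is what keeps latecomers from feeling "purely behind."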
00;57;38;26 - 00;57;59;20
Poof
You can never predict it, but yeah, crypto is always surprising. But sometimes you can at least get some different systems in place, and then we'll see at the end, you know — like, people might be really into the coin that ends up graduating and kind of becoming public for everyone, or not, or who knows.
00;57;59;22 - 00;58;03;27
Pri
Poof. Would you ever experiment with, like, prediction markets?
00;58;04;00 - 00;58;21;20
Poof
Yeah. I mean, that's kind of the power of what we're thinking: this system that we've created now with the LLM could easily also be applied to a prediction market, right? Like, it's hooked into Uniswap pools, and those are great because they allow you to, you know, make it so, okay,
00;58;21;20 - 00;58;29;26
Poof
only agents. But you could do the same thing theoretically with a prediction market. You could do the same thing with a bunch of things, like something more game-like, too.
00;58;29;27 - 00;58;47;23
Pri
Yeah, yeah, that would be kind of cool. Because you almost have, like, these simulated environments. So if you could get, like, alpha that way, just from these different agents, you might be able to see some patterns emerge that you could bet on in prediction markets, Poof.
00;58;47;23 - 00;59;08;11
Aaron
I think it's exciting. I mean, it's cool to kind of see these competitive environments for all these agents to compete in. I'm glad that you're pushing on that in a whole bunch of different ways. So, super excited to see where this all ends up. And I feel like even if there's a next version of it, it's just going to keep on getting better and better.
00;59;08;13 - 00;59;10;29
Aaron
So super pumped.
00;59;11;01 - 00;59;26;20
Chris
Yeah. So, do you feel like you've got a core product you're going to be iterating outwards from here? Or is this, you know, another experiment, and then you're going to spin up something different next?
00;59;26;23 - 00;59;50;14
Poof
I think, you know, as far as the tokens themselves, right, they're meme coins; it's a launch pad. So it's not like our coin or anything after this. We always want to maintain our flexibility. But this kind of core protocol concept is something that we're going to keep exploring and iterating on, and building out.
00;59;50;14 - 01;00;11;12
Poof
And I think coming out of this we'll have kind of two things, right. We'll have, one, a really good understanding, also just from a user standpoint, you know, of what people want and where a future version could go. And it may not necessarily be DC's Terminal Pro; it might be, you know, something else that's more sustaining or whatever.
01;00;11;15 - 01;00;42;02
Poof
But then also, from an RL, or reinforcement learning, or kind of training standpoint, there's all this data now and other things that we'll have to be able to go on, because we're still exploring and thinking about trading bots in general. As well, you could also imagine a version of this where, because this is a good execution environment for agents, real assets maybe are available to be traded by agents in a more automated system.
01;00;42;05 - 01;00;53;18
Poof
So I think there's a bunch of different spaces that we'll explore in this same vein. So, more experimentation, but probably a lot of the same thinking, too, that we're going to keep building out.
01;00;53;21 - 01;01;14;23
Chris
Cool. And then I got one more for you. All right, we got the LLM, we got the tools, we got RL in the middle. Can you unpack how these things all work together? Like, is your reinforcement learning just, dumbed down, a matter of fine-tuning the performance of the LLM and not having any impact on the tools?
01;01;14;26 - 01;01;20;03
Chris
Or is there like, a more complex thing going on in the interplay between these?
01;01;20;05 - 01;01;41;23
Poof
Yeah, I think, you know, the tools are really, I almost consider them just themselves. In our case, right, a tool is just the agent swapping something, or being able to call certain kinds of information. So that part, you know, is pretty straightforward. The important part, the RL, you can think about like a fine-tune on a model.
01;01;41;29 - 01;02;02;06
Poof
I think the other part of RL that, you know, is honestly even simpler, but is important, is just the harness. And this is, you know, really just a prompt. But, like, what is the structure of how you're interfacing, and what information are you actually giving the agent? How do you think about that? You know, I think people often overcomplicate this, actually.
01;02;02;06 - 01;02;27;00
Poof
And they have, like, really crazy memory schemes and all this other stuff. But, statistically, in different scenarios, what is the right way to go represent it? That kind of work actually is super important, too. And usually you just don't have good data. Like, I think part of our thesis is a bit, you know, one of the reasons why I think markets are a hard environment in general for agents to learn to exist in.
01;02;27;03 - 01;02;52;22
Poof
Period. Like, it's harder than language, for example, or software engineering. But I think there's an interesting thing with Claude Code as, like, a reference point, which is, when you start getting a bunch of people using something and, like, letting the agent go, you end up with this kind of flywheel in reinforcement learning, where all of a sudden, okay, like, I'm getting the model a little bit better at using this harness and this structure.
01;02;52;28 - 01;03;10;16
Poof
And I'm also finding what works best a little bit, just on a fundamental basis, and maybe also just for people, what they like, what they don't like. And then I'm able to tune that further, and then all of a sudden it can do even better. And, like, you know, it just becomes this really beneficial thing. Because I think one of the things that's kind of funny is, like, what is Claude Code?
01;03;10;16 - 01;03;38;02
Poof
It's really just, you know, a bunch of prompts, stuff structured in a certain way, and then they just train on it over time. So a lot of the upside and the benefits really come from that. So I think that's where we see this interaction happening: at the core level, you can think about it as a fine-tune, but I think more so than anything, it's about exploring what's the actual right way to go do this, and then starting to get some progress towards improving that.
01;03;38;07 - 01;03;40;05
Poof
And then it kind of becomes a whole flywheel.
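Poof's point that the harness "is really just a prompt," i.e., deciding what structure and what information the agent actually sees each step, could be sketched like this. Every field name, tool signature, and number here is a made-up illustration, not Poof's actual system:

```python
# Minimal sketch of a trading-agent "harness" in the sense described above:
# just a function that assembles the observations and tool descriptions
# into one structured prompt. All fields are hypothetical.

def build_harness_prompt(state):
    observations = "\n".join(
        f"- {coin}: price={p['price']}, holders={p['holders']}, "
        f"supply_held={p['supply_held']:.0%}"
        for coin, p in state["coins"].items()
    )
    return (
        "You are a trading agent in a simulated token market.\n"
        "Tools available: swap(coin_in, coin_out, amount), get_holders(coin).\n"
        f"Your balance: {state['balance']}\n"
        "Market observations:\n"
        f"{observations}\n"
        "Respond with exactly one tool call."
    )

prompt = build_harness_prompt({
    "balance": 100.0,
    "coins": {"A": {"price": 1.2, "holders": 310, "supply_held": 0.42}},
})
```

The RL piece then fine-tunes the model against this fixed interface, which is why improving the harness and improving the model feed each other in the flywheel he describes.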
01;03;40;08 - 01;03;57;16
Chris
Do you find that the harness becomes simpler, and the code you have to write to, you know, continue building out the harness becomes easier, the more you fine-tune? Are there certain limits to that, or does it actually, contrarily, make it harder?
01;03;57;18 - 01;04;22;13
Poof
It's interesting. I think it becomes easier, and I think it actually becomes simpler over time in all cases. And that's kind of something that we have witnessed just working, even pragmatically, with some of these tools and workflows that we have. One of the interesting things is you can feel like, oh, I got to go, you know, be really good at using OpenClaw or, like, Claude Code now, because that's going to be the future.
01;04;22;13 - 01;04;40;25
Poof
And it's, like, kind of, but also, you know, over time, a lot of the ways that you as a user interact actually get completely abstracted away, because we've got so much more that sits in the harness. But usually, what's sitting in the harness by the time we get to that model level is a lot simpler, because you've made it really routine.
01;04;40;25 - 01;05;06;02
Poof
Like, that's where it's kind of funny: people are coming up with, and they always do, all these crazy schemes to, you know, improve model memory and all this stuff. Like, we've kind of known as an industry, if you read research from even six months ago, twelve months ago, and look at what Anthropic has done or what OpenAI has done, that having a little scratchpad that the LLM can rewrite sometimes is good.
01;05;06;02 - 01;05;24;13
Poof
Like, that's pretty much all you need, right? And maybe in the future you'll need something crazier. But once you start training that in a little bit, then it just becomes that much better. So it often simplifies it, I think, and we'll continue to kind of simplify it. And that's where, like, I think orchestration and some of these things will also just become part of the harness over time.
01;05;24;13 - 01;05;30;18
Poof
But it'll be simple. It'll be like, agent, go do this. You know, straightforward.
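The rewritable-scratchpad memory Poof contrasts with "crazy memory schemes" can be sketched in a few lines: the agent carries one free-text note it may overwrite each turn. The `call_model` function and the ACTION/SCRATCHPAD reply format are stand-in assumptions for illustration:

```python
# Sketch of the simple rewritable scratchpad described above: instead of
# an elaborate memory scheme, the agent keeps one private note that it
# may rewrite every turn. call_model is a stand-in for any LLM call.

def agent_step(call_model, observation, scratchpad):
    prompt = (
        f"Scratchpad (your private notes, rewrite freely):\n{scratchpad}\n\n"
        f"New observation:\n{observation}\n\n"
        "Reply with two sections:\n"
        "ACTION: <what to do>\n"
        "SCRATCHPAD: <updated notes>"
    )
    reply = call_model(prompt)
    # split the reply into the action and the rewritten scratchpad
    action, _, new_pad = reply.partition("SCRATCHPAD:")
    return action.replace("ACTION:", "").strip(), new_pad.strip()

# toy stand-in model that returns a canned reply
fake_model = lambda p: "ACTION: hold\nSCRATCHPAD: price trending down, wait"
action, pad = agent_step(fake_model, "price fell 3%", "no notes yet")
```

Looping `agent_step` and feeding each returned scratchpad into the next call is the whole memory mechanism; training then makes the model better at deciding what to keep in the pad.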
01;05;30;20 - 01;06;03;15
Chris
Yeah. I'm, like, in a particularly nasty bit of gnarly casing, because I'm running through this waterfall that's trying to manage for so many different things. And I ultimately know that, like, a good 80% of this is superfluous code, and if I actually spent the time to learn to do it properly, and to fine-tune a model where I can just zap over certain parameters and it understands, you know, what to do when it sees X, right?
01;06;03;15 - 01;06;31;05
Chris
Like, I would just be able to wipe out maybe two-thirds of, like, the complexity of what I'm doing. But I also know that's a big unknown, like, you know, in my personal knowledge base, and to get up to speed... you know, right now I'm just going to brute-force this shit, and three months from now I'm going to actually, like, do this elegantly. Because I've done this before as well, where, you know, once you fine-tune something to a certain task, the complexity of what you're trying to build can drop dramatically.
01;06;31;05 - 01;06;34;29
Chris
But to do that, it's a whole other set of skills.
01;06;35;01 - 01;06;52;20
Poof
Yeah. And I think right now, too, like, my personal belief is there's so much improvement that's going to continue to happen on base models that it's not worth investing that time and then being like, oh, well, now I have to do it on this new base model. I also think one thing that's kind of interesting, that's definitely happening,
01;06;52;22 - 01;07;19;11
Poof
is just, like, all of the processes around AI, around AI training, are going to get further commoditized. Because it's complicated, right, and it's hard to do right now. And certainly the expertise we have and stuff, and others that we're working with, will continue to be helpful. But at the same time, once we get to a certain point, it shouldn't be that hard for regular, you know, not regular people, but, like, a prosumer or whatever, to start to be able to do this.
01;07;19;11 - 01;07;48;14
Poof
It's not there yet because, you know, we're still in the early... you know, it hasn't really even been twelve months of people doing this stuff yet, but, like, it'll get there. Will at Prime Intellect is an interesting person; like, he's been building some interesting things there, what he's kind of calling, like, vibe RL for AI. Again, it's not something where I think you should spend a lot of time learning it today, but I think it's kind of building for that future where, like, okay, I just want to get really good at my task, right?
01;07;48;17 - 01;07;54;08
Poof
And like, let's just do that. I think that's over time where it'll go cool.
01;07;54;10 - 01;08;03;23
Chris
Aaron, do you, do you see any of this as you continue to iterate and evolve Aiden in your own practice? Or, yeah, I mean, are you just calling the tools better?
01;08;03;25 - 01;08;24;21
Aaron
I think it's a bit of that, to be honest, too. Like, my general belief is, like, this is just a new skill set, and everybody's looking to software engineers to be good at this skill set, and I just don't think they are. And I may have used this analogy on a previous episode, so excuse me if I did, but I just think we're, like, in the jazz era, or, like, the rock era, the punk era, of programming.
01;08;24;23 - 01;08;46;17
Aaron
And these hyperscale models are, like, you know, the electric guitar. And you've got a whole bunch of classically trained musicians working off of liner notes. They want to be in GitHub. They want to just play their part as part of their group, and they make great music, right? But it's not that engaging, on balance. It's slow, and it's not as dynamic as, like, these other, more modern forms of music.
01;08;46;19 - 01;09;13;24
Aaron
And I think that's just the area that we're kind of getting into. And so I do think a lot of it actually is in, like, how you interact with these systems and getting them to kind of do what you want them to do. You know, the same way that I'm sure, you know, some of the best violinists and cellists and, you know, classical bass players could play most of the music that, you know, different jazz groups or rock bands or punk bands did, but they just can't do it right.
01;09;13;25 - 01;09;35;06
Aaron
Like, they just can't put it together; they're too, too formal. I think there's a lot of assumptions that this stuff challenges, just in terms of, like, how malleable and plastic and squishy software is now, that just really, like, makes a lot of software developers' brains break. And they're just not going to make it; they're just not going to be as good at, like, wielding and developing and building these systems.
01;09;35;06 - 01;09;54;21
Aaron
So I definitely think, like, the reinforcement learning pieces, or, like, self-improving pieces, are going to be increasingly important, like what you were highlighting, Poof. And there's definitely going to be, like, techniques to get uncovered and learned. But I think we're kind of in the early innings of it. Like, we're not in the K-pop era of rock, right?
01;09;54;21 - 01;10;09;25
Aaron
Where, like, at the beginning, you know, it's pretty formulaic, everybody knows the beats. Now, I just think it's going to take a while to kind of figure out how this all works. So, you know, it's why I'm, like, so bearish on stuff like Claude Code. It just seems like nerd porn to me. Like, fundamentally, it's not going to scale.
01;10;09;27 - 01;10;18;24
Aaron
I don't think it's actually going to produce the best or most useful code. So, Chris, I wouldn't worry about you not doing it the right way, because there is no right way anymore, right?
01;10;18;26 - 01;10;21;01
Poof
Like it force it it.
01;10;21;03 - 01;10;34;01
Aaron
Yeah, I mean, I do think that, with one caveat: you should be cleaning up your code base. But you can just ask it to do it, in a kind of way, and it probably will. Just keep the files small and you can move faster.
01;10;34;04 - 01;10;43;17
Chris
What I'm hearing you say is you feel like fusion-era Miles Davis could beat a Y Combinator-backed neo-lab.
01;10;43;20 - 01;11;02;14
Aaron
Yeah, I mean, I think it's already kind of happening. Like, there's even been a bunch of, you know, like, Claude Code hackathons, and if you look at who's winning, it's, like, lawyers, dentists, you know, random people. It's not software developers, because I just don't think they can expand their minds to what these systems are capable of.
01;11;02;14 - 01;11;24;17
Aaron
Not everybody, right, but, like, a lot of them. They just can't expand their minds to realize that you can literally push these things: pump up the volume, crank up, like, the static, you know, make it loud, make it noisy, make it fast. Like, they just don't think that way. You know, they're waiting, you know, to see what their pod says and how they can handle it.
01;11;24;20 - 01;11;43;02
Aaron
You know, they're opinionated on, like, which tools to use. They're opinionated on, like, what the music should look like and the notation related to it. And, like, all those things are just a complete drag, because maybe they don't have the right opinions. You know, maybe their software is actually much, much worse, because the AI has, like, a native cadence to it.
01;11;43;07 - 01;12;00;24
Aaron
I personally think that's the case. Like, it likes to code in a certain way. So interceding yourself in the middle of it and applying your preferences just creates all this stuff that's illogical to these systems, so it slows them down. So, yeah, I just think it's, like, a brave new world. It's super exciting.
01;12;00;24 - 01;12;16;18
Aaron
And, but I do think, you know, if you're right, a lot of, like, the kind of base-level pieces for the next, you know, at least the next year, have been sitting in a lot of these academic, you know, arXiv-posted papers for a while. It's just kind of hiding in plain sight.
01;12;16;20 - 01;12;20;24
Aaron
Like, how to approach these things, or what makes them more powerful systems.
01;12;20;27 - 01;12;27;11
Poof
Yeah, yeah, I think... I mean, hey, Claude Code, one thing that's good about it: good training data.
01;12;27;14 - 01;12;28;00
Aaron
Yeah.
01;12;28;02 - 01;12;29;26
Poof
I also that's, you know.
01;12;29;28 - 01;12;46;07
Aaron
Well, I also think we're moving from software engineering, and you've probably felt this, to context engineering, right? And, like, context engineering is much more important. Like, the software just doesn't matter. It's just more, like, how are you going to be able to present something to one of these systems in a way that it can do the most with it?
01;12;46;09 - 01;13;11;15
Aaron
And I think there's just, like, an emerging set of learnings that people are beginning to uncover related to that. But I think that's a really incomplete set. And even those, like, you know, wonderfully written, highly credentialed papers, who knows if they've even figured it out. My sense is it's going to be just like we saw with, like, the music press: some random person in some random part of the US or Europe just, you know, cracks it.
01;13;11;15 - 01;13;13;23
Aaron
I mean, I guess we saw a little bit of that with someone in China.
01;13;13;26 - 01;13;14;17
Poof
To be clear.
01;13;14;20 - 01;13;17;11
Aaron
Yeah, maybe. I mean, you know.
01;13;17;14 - 01;13;21;18
Poof
It's just it's mystically. Yeah. Like, you know, I think that's one thing that's interesting.
01;13;21;18 - 01;13;33;04
Aaron
Or not, because, you know, they don't feel like they have permission to just fuck shit up, right? And that's where the edge really is. Yeah, I do, yeah. I think a lot of this stuff is just.
01;13;33;06 - 01;13;35;03
Poof
Well I just well anyways, that's.
01;13;35;06 - 01;13;52;04
Aaron
One of my... Well, here's how I'd reframe the question, which is: what am I using that they produced? I mean, I heard you use Qwen and some other stuff, like, as a cost-saving thing, which is great. But, like, there's no real AI product that I use that's not based in the States, still. So, yeah.
01;13;52;06 - 01;13;53;12
Poof
Yeah, yeah. But I mean.
01;13;53;12 - 01;13;53;24
Aaron
So like.
01;13;53;27 - 01;14;16;22
Poof
Well, what happens when... Well, here's an interesting... Well, no, this will just be interesting for you to mull over, with the doomerism or non-doomerism. How many of the people who do any kind of research in AI, how many of them are in the US, which is increasingly defunding all institutional knowledge, versus in China right now?
01;14;16;25 - 01;14;17;21
Poof
What percent?
01;14;17;24 - 01;14;27;13
Aaron
I'm sure that it's a greater percent in China, but it's the same thing as, like, patent filings. At some level, it just doesn't matter. It's just got to work, and it's just.
01;14;27;16 - 01;14;35;28
Poof
A brute-force industry that's about numbers and the amount of time you put into it. Why would that... it's going to get, I don't know, modified? I don't know if that's the case. It's not healthy.
01;14;36;00 - 01;14;37;27
Chris
Yeah. I think it's commodified.
01;14;38;00 - 01;14;38;26
Aaron
No, I think you're.
01;14;38;27 - 01;14;40;00
Poof
Just saying it was commodified.
01;14;40;03 - 01;15;02;22
Aaron
I think that the models themselves will, over time, become commodified. I think how you use those commodified models is just going to be wild, wild and all over the map. And I don't think it's necessarily brute force there. I think it's a lot like music, man, and I just think there's some people that are better able to take the same baseline commoditized thing, right?
01;15;02;22 - 01;15;21;16
Aaron
Like an electric guitar or, you know, if it was jazz, a sax, a trumpet, pick your favorite instrument, and just play it better. And I think that's really what we're kind of moving into. I mean, I think you saw it with the OpenClaw stuff, right? Like, it was just some guy, he was clearly a pretty good developer, hanging out.
01;15;21;16 - 01;15;37;01
Aaron
And I think he's from Austria, right? Like, he just came out of the woodwork, built something that was pretty compelling, people wanted to jam with his stuff, and it kind of took the world by storm. We didn't even talk about how some of those folks were, like, dressed up like lobsters, right? Or was that, or was that AI slop?
01;15;37;01 - 01;15;39;20
Aaron
I couldn't tell, I never dove into that.
01;15;39;23 - 01;15;42;06
Chris
Yeah. Don't, don't. It's not worth your time.
01;15;42;06 - 01;15;44;19
Aaron
But what was it? What's the answer, though? I didn't even know.
01;15;44;26 - 01;15;49;09
Poof
Aaron's very bullish about certain things. It's good, it's good. I want to.
01;15;49;09 - 01;15;51;07
Chris
Try to, like, tie this back to a bigger.
01;15;51;07 - 01;15;53;25
Poof
Trend. Yeah, a tie breaker thing. Before we close.
01;15;53;25 - 01;15;59;23
Chris
I guess we can talk lobster costumes. No, no, no, I have... so, wait, is that
01;15;59;23 - 01;16;00;16
Aaron
Actually real, though?
01;16;00;16 - 01;16;05;14
Chris
I think some of them were. I think some were slop.
01;16;05;16 - 01;16;24;26
Chris
Like, I think if you go to a meetup or a hackathon, do you get, like, you know, dumb lobster antennas to put on your head, or, you know, are you in a full-body lobster costume? I mean, how many full-body lobster costumes, like, exist in America right now? A thousand? Like, it's not, you know.
01;16;24;28 - 01;16;27;29
Aaron
They're all in Maine and they happen to all be in Maine.
01;16;28;02 - 01;17;10;25
Chris
Seriously. Like, you know, but to get back to the larger thing, Aaron, I think what you're saying here, really, is that value-added resellers are where all the value is about to be generated, right? In the long term, the Sunos of the world, the ElevenLabs, this, that, and the other, right, things that are just a level above a foundational model, that's where the value is going to be. Like, the underlying stack is going to be commodified; how you choose to assemble and rehypothecate that to create, like, a unique thing, that is going to be where the value capture is. Like, TikTok, oh, TikTok
01;17;10;27 - 01;17;12;07
Chris
for sure.
01;17;12;10 - 01;17;16;09
Poof
Just naming another Chinese thing. I'm just kidding. Anyway, sorry, keep going.
01;17;16;11 - 01;17;37;04
Aaron
Poof, I think you mean you're naming the only thing that came out of China that, at least on the internet, had widespread use, right? Or one of the three. It's all overstated. Like, you know, there's definitely risk related to some of that stuff, but, I don't know, I just think it just doesn't play out in practice.
01;17;37;06 - 01;17;53;21
Aaron
So, I mean, you saw it. I mean, if these Chinese researchers are so good, then why are they spamming, like, Claude and OpenAI to distill out the models? It's very sad. No, it's not sad, man; like, in reality, they can't compete.
01;17;53;26 - 01;17;58;23
Poof
No, it's not true at all. It's just not true. I mean, it doesn't matter.
01;17;58;23 - 01;18;02;27
Aaron
I don't see everybody programming, you know, they're programming with, like.
01;18;02;29 - 01;18;05;21
Pri
But how is it not true? Because I actually don't know the. Oh yeah.
01;18;05;24 - 01;18;10;28
Poof
But it's just silly. I mean, what Anthropic is doing is positioning, right? It's regulatory capture.
01;18;11;00 - 01;18;15;17
Aaron
It's not. Like, everybody's using their stuff because it's the best, man. So of
01;18;15;17 - 01;18;16;16
Poof
Course, and they.
01;18;16;16 - 01;18;17;02
Aaron
Actually get.
01;18;17;02 - 01;18;25;11
Poof
everyone else's stuff when their stuff wasn't the best. And Google is going to crush both of them, probably crush everyone, at some point, completely.
01;18;25;11 - 01;18;34;13
Aaron
But yeah, because that's where the innovation is. I mean, you don't see Google turning around and, like, you know, mining through and trying to, like.
01;18;34;21 - 01;18;35;10
Poof
No, you do.
01;18;35;11 - 01;18;36;26
Chris
Qwen. It's just
01;18;36;28 - 01;18;48;05
Aaron
There was, there is a little bit out of DeepSeek. There's definitely, like, a research community. But if this stuff is so good, then why isn't anybody using it? Well, it's just.
01;18;48;05 - 01;18;50;29
Poof
Based on the progress curve. It's just that.
01;18;51;00 - 01;18;54;10
Aaron
Nobody's using this stuff like like nobody's like.
01;18;54;16 - 01;18;57;17
Poof
So it has to be think about what word.
01;18;57;21 - 01;19;19;03
Aaron
To use, like Kimi now, but it's the same thing. These things are runaway, right? Like they're they're going to keep on getting better, at least in the near term. And they're compute constrained. And so like you need to be able to build a compute. You need a ton of capital for that to happen. Yeah. Like nobody's opening up cloud code and and choosing to use like when they're only using it to just save money.
01;19;19;05 - 01;19;28;06
Aaron
And the results are worse. And, like, nobody's opening up Cursor to use that stuff, too, like, they're using... You don't know what you're talking about. Oh, they're not there, man.
01;19;28;13 - 01;19;31;19
Poof
No. What's the argument? Sorry, I'm not sure what we're talking about.
01;19;31;19 - 01;19;37;10
Aaron
The argument is just like like it was going back to the researcher stuff that just kind of like, like the research.
01;19;37;13 - 01;19;40;04
Poof
Is going to be researched as a competitive competitor.
01;19;40;06 - 01;19;42;14
Aaron
I think they're going to be a lagging competitor.
01;19;42;16 - 01;19;44;17
Poof
Which is what they're catching,
01;19;44;19 - 01;19;47;24
Aaron
For. They haven't because I'm I'm not using it.
01;19;47;29 - 01;19;49;25
Poof
You got to use you got to. Okay.
01;19;49;28 - 01;19;57;27
Aaron
I mean, that's fine. But even so, with the OpenClaw stuff, you saw people were using Opus, until it, like, got worse and it was too expensive, so. Well, I just, sorry.
01;19;57;27 - 01;19;59;17
Poof
I don't know what the are. Yeah.
01;19;59;18 - 01;20;00;07
Aaron
And then they dropped.
01;20;00;07 - 01;20;01;03
Poof
Them right now.
01;20;01;03 - 01;20;11;15
Aaron
Yeah, they dropped it because it didn't work. And, you know, if you look at what these places are doing structurally, they're just literally spamming and trying to distill out the models, because they haven't figured it out for themselves.
01;20;11;20 - 01;20;14;01
Poof
But that makes no sense. That's a but.
01;20;14;03 - 01;20;15;02
Aaron
They know they are.
01;20;15;05 - 01;20;37;00
Poof
No, no, no, no. The point is, the point on that, if we were going to go into it: all code now is going to be written in part by Claude Code, or, if Codex is the best model, which honestly it is for a lot of things, Codex or whatever. You can't differentiate between what could or couldn't be used in training data anymore.
01;20;37;00 - 01;20;57;08
Poof
Like, it's a nonsensical thing, the world that we're going to. So, like, everyone will always use training data, or distill, or write code or whatever, and you're not actually distilling. And now maybe it's MiniMax or something, I don't know, whatever. But that is kind of irrelevant, right? Like, everyone's going to be using everyone's shit, open source, right? Not just Anthropic.
01;20;57;11 - 01;21;01;00
Poof
It's, like, a non-issue. I guess I don't see the relevance.
01;21;01;02 - 01;21;11;14
Aaron
Yeah, I don't know, I don't know if that is fully the case, though. I guess it would be, like... yeah, maybe. Where I land is, I think people are choosing to use some of those models because they're cheap.
01;21;11;15 - 01;21;12;29
Poof
You know how they're trained?
01;21;13;01 - 01;21;14;10
Aaron
Do I know how to actually be able.
01;21;14;12 - 01;21;16;22
Poof
To, like you just said, that's curious that.
01;21;16;27 - 01;21;29;17
Aaron
I'm not talking about the training, and I'm not going to go, like, head-to-head on the training; I'm going just on the usage, right? So people are making voluntary decisions. They're choosing to use something that works better, right? Yeah, if they can afford it.
01;21;29;18 - 01;21;40;14
Poof
My point is just, over time, there's going to be a lot of models in a lot of places that get better. It completely, it doesn't matter. Yeah. But I just, yeah. It's interesting, interesting.
01;21;40;20 - 01;21;49;25
Aaron
Yeah, I just think, I guess it goes back to, like, I think this stuff is maybe less brute-force than people are assuming.
01;21;49;28 - 01;21;52;22
Poof
Why why do you think that? Because, just because it's.
01;21;52;22 - 01;21;55;13
Aaron
Just it's not it's not playing out. Right.
01;21;55;13 - 01;21;57;25
Poof
Like, it is 100% playing out, right?
01;21;57;25 - 01;22;09;02
Aaron
Well, kind of, right? Like, if I go on Twitter and I see all these people, like, pantomiming that they're developing with, like, AI systems, like, where's the better software that they're producing? I think that they're just nerds.
01;22;09;02 - 01;22;10;22
Poof
On that part. Yeah, yeah, yeah.
01;22;10;22 - 01;22;22;20
Aaron
So there's that. And then I think it goes all the way down to choosing which model they're using for it. I think they're just, like, literally mentally masturbating over half of the AI stuff. They have no clue how to use it. Yeah, right.
01;22;22;22 - 01;22;44;14
Poof
I guess, I mean, the weird part of the domain, I guess, and here's where I'm going, and it doesn't matter, so, sorry. But the interesting part about AI is that it's an empirical domain that is about brute-force experimentation, for the most part, in terms of creating models, deciding what works or not, things like that.
01;22;44;14 - 01;23;04;21
Poof
Right. So that's actually something that's interesting to me about it: you can do all the math, you can do all the research, but until you just, like, physically do it, it's not apparent that it's going to be so successful. And the thing is, it seems to keep working. So anyway, I think that's the existential threat that Anthropic is feeling a little bit, honestly.
01;23;04;21 - 01;23;07;16
Poof
Like now there is brute force right now. But yeah.
01;23;07;18 - 01;23;12;11
Aaron
Yeah, I think the threat is more infra. Right. Like their existential threat is energy.
01;23;12;11 - 01;23;13;07
Poof
Yeah. That's part of it.
01;23;13;10 - 01;23;21;17
Aaron
And then infra. And I think that's the same existential threat that, you know, frankly, Europe's feeling, right? Or even, right now, everyone.
01;23;21;18 - 01;23;21;24
Poof
Yeah.
01;23;21;24 - 01;23;44;11
Aaron
Well, everybody but a handful of people, right, a handful of organizations, which I think are going to be able to scale this up and continue to scale it up, you know, because they can vertically integrate in a way that other folks can't. And that constraint is capital, right, and capital markets. So I just think that's just the structure of it, whether we like it or not.
01;23;44;13 - 01;23;53;24
Aaron
But I'm with you that, like, I think Anthropic is under a lot of stress. I think that's why you see them running to the regulators and, like, jumping up and down on the airwaves again.
01;23;53;26 - 01;23;56;13
Poof
Yeah. No, I think that's spot on.
01;23;56;15 - 01;24;29;08
Chris
Anyway, it's interesting. I mean, do you take the Sam position or do you take the Dario position? Because both of them are very uncertain outcomes. It's just what you fear more, right? And it seems like, if you put the safety stuff aside, Dario fears more financially blowing up by, you know, making all these commitments around data centers and cost scaling against an uncertain variable: what those needs are going to be in the future.
01;24;29;11 - 01;24;35;17
Chris
Whereas Sam is just all gas, no brakes, and you just got to figure out how to get to the other side or not.
01;24;35;19 - 01;24;38;29
Poof
Yeah. I mean, it's super hard. Yeah.
01;24;39;01 - 01;24;59;03
Aaron
I think OpenAI runs into the same issue, Chris, which is that they don't control the infra. So, like, maybe this is a point of agreement, but I just assume that over the course of the next year to two years, Google will just begin to dominate, period, just because they have more control to build out the infra.
01;24;59;05 - 01;25;23;29
Aaron
They have the chips, right? Like, they're pretty much vertically integrated. And I wouldn't be surprised if, like, Grok continues to do the same thing, just because of that entire conglomerate that Elon owns. I think it just enables him to produce more energy, gain access to capital markets, do all that jazz. And then there'll just be, like, a lag on the open-source models, which is great.
01;25;24;02 - 01;25;39;16
Aaron
But then, at that point, I just think it's going to compete on brand. And even there, I think the Chinese models will have a hard time gaining widespread adoption, just because people don't trust them, or not as many people will trust them, you know, for better or worse.
01;25;39;18 - 01;25;56;24
Poof
I think there'll be some weird stuff too. Like, I think one of the challenges that it feels like Sam has, that kind of forces him to actually take that position: obviously he's, like, all gas, no brakes in general, but he has all the consumers. That's the problem. Like, he has consumers that he hasn't monetized, right?
01;25;56;24 - 01;26;16;06
Poof
And it's, like, kind of hard, because they carry a huge cost. So he's in a weird pinch. Like, it's really interesting. I think that's why Google, even if they're not doing all the same things, can just kind of starve everyone, because they have so much resourcing anyway. So yeah, it's a tricky one from a business standpoint.
01;26;16;09 - 01;26;37;26
Aaron
And I think they will, right? Because they can just use their legacy search business to, you know, supplement the AI costs that they have, right? Like, I'll just take from one pocket and put it into this one, and start undercutting everybody on price. I mean, that's really why Claude had its moment, right? Like, a lot of that stuff that was available if you were using Cursor was super expensive.
01;26;37;26 - 01;26;53;12
Aaron
They just subsidized it, and now Anthropic's in an equally difficult position, where they're locking down their ecosystem, you know, preventing people from building on top of it, just because they can't handle, like, the compute costs. I don't mean to interrupt you, Chris. You were going to tie it all together.
01;26;53;14 - 01;26;57;01
Chris
That was, like, 40 minutes ago. Dude, you can't stop.
01;26;57;01 - 01;27;01;06
Poof
This is terrible. Sorry. This is going to be, like, the longest. It's gonna be like, a nine hour.
01;27;01;09 - 01;27;43;19
Chris
But I will tie it all together, I think. Right, what you were saying is value-added resellers, right? People who are going to take A plus B plus C from a commodified stack and turn that into X. Yeah, right. Those are going to be the ones that really capture the value. What the training report pointed out to me is that we have built so many combinations of A plus B plus C that we can span an entire alphabet, and that at some point the basic assumptions in A plus B plus C start coming under attack, and the margins you can generate through these combinations start failing, right?
01;27;43;20 - 01;28;05;19
Chris
Things that are built on top of, or assume the durability of, these value-added products: that's where your real, let's just say decade-long, risk is. You know, Daniel says this one a bunch, right? People probably make more money off of, like, auto financing than they do off of cars.
01;28;05;21 - 01;28;31;23
Chris
Right? A car is an A plus B plus C type of product, right? You throw an engine, some tires, and a steering wheel together and you have a car, right? Then you have the whole auto-financing industry that sits on top of cars. Like, to me, it's: once we destroy the basic unit economics of that middle layer in knowledge and information work, what sits on top of it?
01;28;31;23 - 01;28;59;09
Chris
That depends on the fact that, oh, I can lend money to a SaaS business and get, you know, an 11%, 12% interest rate. Once that crumbles, that's where you're going to have all this, you know, sort of volatility over, let's just say, the next decade. And you're saying, well, if we go on a long enough timeline, we're going to have a whole new set of things emerge that are going to replace those SaaS businesses.
01;28;59;11 - 01;29;29;19
Chris
And that's correct, right? But this transition period is just going to be, like... I do think you need to take a defensive posture towards, like, the economy writ large during this time frame, because we don't really have any idea of these disordered dependency chains, in which, you know, it's actually the third-order thing where all the margin is, and if anything underneath that crumbles, it gets real messy real fast.
01;29;29;22 - 01;29;32;10
Aaron
Yeah, it's going to get messy. It's going to be wild. Right, guys?
01;29;32;17 - 01;29;34;01
Poof
We don't know anything else.
01;29;34;01 - 01;29;38;22
Aaron
We know nothing. It'll just be a mess, I think.
01;29;38;24 - 01;29;38;28
Chris
A.
01;29;38;28 - 01;29;40;07
Aaron
Great mess.
01;29;40;09 - 01;29;46;11
Chris
All right, guys, so we're coming to the end of the road. We'll end the show here. Poof, thanks for the good company. Thanks.
01;29;46;16 - 01;29;49;16
Poof
Thanks, both. Thank you. Fun. Yeah, glad I did this.
01;29;49;18 - 01;29;55;16
Pri
I was enjoying the China-U.S. AI debate, listening in on that.
01;29;55;23 - 01;30;04;09
Poof
Let's try to wrap it a little, just to get back at Aaron for dismissing your point earlier.
01;30;04;11 - 01;30;06;21
Aaron
We can take I think if.
01;30;06;24 - 01;30;37;12
Pri
I'll take it on the road. Hey everyone, welcome to NET Society. It's me, Chris, and Aaron, with special guest Poof. Today we're talking all things tech, AI, crypto, politics, culture, and more. Just as a quick heads up: these thoughts and opinions are our own and not those of our employers. And none of this is financial advice.
01;30;37;14 - 01;30;38;00
Pri
For.