NET Society

Derek is out this week, so the Net Society crew is joined by special guest Poof for a mind-bending journey into simulations, AI, and the future of media. The episode opens with the origin story behind Poof’s name and career pivot into crypto, before diving into his latest project: a large-scale, multi-agent simulation “DX Terminal.” From there, the crew explores strange emergent behaviors from synthetic agents, including scam coin schemes and binary-only communication. A wide-ranging discussion follows on simulation theory, self-reinforcing media loops, and the coming age of customizable digital realities. They close with a candid look at the stagnation of traditional media, the tyranny of algorithms, and what it might take to break the internet open again.

Mentioned in the episode
Special guest Poof joins this week https://x.com/poof_eth
DX Research Group https://x.com/dxrgai
Tweet from Mat Dryhurst https://x.com/matdryhurst/status/1920130715129131161

Show & Hosts
Net Society: https://x.com/net__society
Derek Edwards: https://x.com/derekedws
Chris F: https://x.com/ChrisF_0x
Priyanka Desai: https://x.com/pridesai
Aaron Wright: https://x.com/awrigh01

Production & Marketing
Editor: https://x.com/0xFnkl
Social: https://x.com/v_kirra

  • (00:00) - Origin of Poof and Entry into Crypto
  • (03:37) - DX Terminal and Simulated Agents
  • (09:05) - Strange Behaviors in Multi-Agent Simulations
  • (17:08) - Simulation Theory and Living in Customized Realities
  • (33:00) - AI Training, Reinforcement Learning, and Self-Play
  • (48:00) - The Medium Is the Network: Mat Dryhurst’s Thesis
  • (54:28) - Cultural Stagnation, Algorithms, and Distribution Gridlock
  • (01:15:07) - Welcome & Disclaimer

What is NET Society?

NET Society is unraveling the latest in digital art, crypto, AI, and tech. Join us for fresh insights and bold perspectives as we tap into wild, thought-provoking conversations. By: Derek Edwards (glitch marfa / collab+currency), Chris Furlong (starholder, LAO + Flamingo DAO), and Aaron Wright & Priyanka Desai (Tribute Labs)

00;00;00;00 - 00;00;19;27
Chris
We got Poof on. Hey! Welcome to the show.

00;00;19;27 - 00;00;22;11
Poof
Thank you for having me. Hey, everyone. It's Poof.

00;00;22;12 - 00;00;24;00
Pri
We know we love this.

00;00;24;02 - 00;00;24;27
Aaron
We do?

00;00;24;29 - 00;00;27;00
Chris
And I love you all. One of a kind. Yeah.

00;00;27;02 - 00;00;30;07
Pri
It's also fun to say Poof. I will say, it's a good nickname.

00;00;30;09 - 00;00;31;27
Poof
It's a very good one. Thank you.

00;00;32;00 - 00;00;34;21
Aaron
What's the origin story of that, Poof?

00;00;34;23 - 00;00;59;05
Poof
It's actually so ridiculous. I don't know if you want to start the podcast off this way, but I can. I think we do. I do. Okay. So when I got into crypto in like 2021, or, I was in a little bit before that, but not, you know, as part of the NFT wave or whatever. Everyone was using Discord and I was like, oh God, I guess I have to get on Discord.

00;00;59;07 - 00;01;29;14
Poof
And like, I did not look at what my actual username was, and it was something I must have made forever ago as like a joke. And it was "Poof feast for 20." That was my username on Discord, literally, which is like this old StarCraft joke, where they used to do these StarCraft streams in Korea, and everyone would try to ask these questions of the English-speaking streamers.

00;01;29;15 - 00;01;49;20
Poof
This is like way back when streaming was still early, with them saying usernames. And one was like "Poof feast for 20." And I think the last time I had to use Discord was like 15 years ago or something, when I was like, you know, 20, and it was like on some StarCraft tournament or something ridiculous. Sorry. So that was my username.

00;01;49;21 - 00;01;52;11
Aaron
When user. What team?

00;01;52;13 - 00;01;54;17
Poof
I what species? Definitely.

00;01;54;20 - 00;01;56;10
Aaron
Yeah, I could see that.

00;01;56;13 - 00;02;16;22
Poof
I could see that in there. Yeah. Yeah. So anyways, so that was it. And then I was like, you know what? I'm going to try to not take this seriously and not get too involved. And, you know, five years later, no, four years later, I've made this my whole career and left my job and everything behind and ended up in crypto.

00;02;16;22 - 00;02;25;13
Poof
So I was like, I'll kind of keep this ridiculous name to keep myself from getting too involved. And then, of course, I'm here. So then I was like, I have to change this.

00;02;25;19 - 00;02;29;17
Pri
So you left your cushy corporate gig to be Poof on the internet full time?

00;02;29;22 - 00;02;33;19
Poof
Literally. Literally. That's what ended up happening. Yeah.

00;02;33;22 - 00;02;43;04
Chris
Then you could have lived, like, such a life in consumer packaged goods, and instead you're like, I've got to be Poof! And I have to be on the bleeding edge of AI simulation.

00;02;43;06 - 00;02;55;05
Poof
Exactly. Exactly. Anyway, my day job was like, yeah, it was an intense version of CPG, because I just need to, like, be working constantly.

00;02;55;08 - 00;03;03;28
Chris
Man. You could have ended up in Cincinnati at Procter & Gamble, and instead you're on Net Society. Where'd you go wrong, dude? You've been...

00;03;04;01 - 00;03;17;22
Poof
I could be going selling things to Piggly Wiggly. I'd go to Bentonville, talk to Walmart, get asked things by Walmart. Yeah. Hey, it's all good. It's fun. Thanks for having me on.

00;03;17;29 - 00;03;37;03
Chris
Yeah. No, it's great to get you on. You're one of our faves. And the stuff you're doing is certainly out there. And, you know, very much, like, in our wheelhouse, the things that we geek out on. It's great to have you. Why don't we just jump right in and get the DX Terminal stuff out of the way, so that we're free to get weird later on?

00;03;37;04 - 00;03;41;09
Chris
I'll kick it off with the softball. True or false: the terminal is the technology.

00;03;41;12 - 00;03;47;19
Poof
Yes, that's a good one. Yeah, the terminal is the technology. Want the terminal?

00;03;47;26 - 00;03;50;05
Chris
What does that mean?

00;03;50;07 - 00;03;51;15
Poof
Incredible question.

00;03;51;17 - 00;04;04;13
Aaron
Well, yeah. Let's take a step back. You know, like, on the AI simulation side. Like, what are you seeing? How do you see this kind of all unfolding? And then how does the project you're working on kind of fit into that?

00;04;04;15 - 00;04;06;25
Poof
Yeah. In the macro landscape.

00;04;06;27 - 00;04;21;26
Aaron
Yeah. Just in general, like you are on the bleeding edge. Right. And I think like what Chris mentioned, stuff we're interested in, I think what you're seeing is going to become increasingly the norm. But yeah, it's going to feel a little weird and different to start. But curious kind of why you decided to do this, how you got into this.

00;04;21;26 - 00;04;29;14
Aaron
Like, what you see in your mind as kind of a path for AI and content, AI content, and crypto? I feel like you've got a lot of thoughts there.

00;04;29;21 - 00;04;50;00
Poof
Yeah, I think, like, you know... for those who don't know, DX Research Group is my group, and the project that we have coming out shortly here is DX Terminal. One of the first things that really drew me into AI, beyond messing around, probably similar to all of you, was image models and everything else.

00;04;50;00 - 00;05;21;11
Poof
And in, like, 2021, there was the idea of agent simulations, and really I was thinking about it more in, like, a game or kind of a consumer fun context, right? Like, I've talked about this a few times now, but, like, the dream of the AI-powered, LLM-powered version of The Sims, right? Like, take all the retro 90s, early 2000s experiences that were really cool already, or maybe even more modern, like Animal Crossing, whatever.

00;05;21;11 - 00;05;50;20
Poof
And, like, what could I do with that? And then, because of the power of AI to go take things that are text-based or smaller, how could you go turn that into film or something that's interactive or whatever? And so that was like kind of a core early thing that was super interesting to me, and something I liked doing in my spare time between when I still had a day job and we were doing other works, and there were so many other things happening.

00;05;50;23 - 00;06;14;17
Poof
When I had a little bit of spare time, which I didn't, you know, I'd go mess around with LLMs, and like, hey, could you actually get some of these things working and stuff? And I think through that experience, and while that was going on, there were a couple, I would say, moments where research came out, or things came out in the world, that kind of showed it was possible to do that, or early glimpses of what that might look like.

00;06;14;22 - 00;06;34;03
Poof
And this is, I think, in 2023. The example I always use is this Stanford agent paper, where they had like 20 or so agents interacting in this kind of, like, cute little town, and they even created little pixel art, which I just thought was funny because, like, they're researchers, they don't have to. But of course, right, you got to do it.

00;06;34;06 - 00;06;55;29
Poof
And they, you know, had them go have little personas and jobs and saw what they did. Right. And oh, they sent each other cards on Valentine's Day and all these other things. Right. And actually it was interesting to me because it seemed so obvious. My assumption was, okay, a year from now, we'll have tons of people messing with this, right?

00;06;56;00 - 00;07;21;00
Poof
Like tons and tons of these things. And it's clunky and weird and hard to implement, and it's not great, but it's just so obvious and so exciting, right, that it seemed like a lot of people will be focusing on this. So that's kind of what I was starting on and getting really excited about and went down so many different paths, kind of exploring that on the side.

00;07;21;05 - 00;07;44;28
Poof
And then when we, you know, made the research grant, a lot of the thinking was like those types of experiences are going to require a lot of experimentation, both on the AI side, but also on like the consumer side, because you know how you interact with these systems. What makes it fun is going to be a lot different than, probably what we're used to.

00;07;44;28 - 00;08;05;17
Poof
So how do you start to get to do that? So that was really a big impetus. And our first project was not necessarily that it was a singular kind of more art focused, autonomous, creative kind of system. But we always were obsessed, like other folks on the team that have been helping me, as I kind of shared some of these things, have been obsessed with that idea.

00;08;05;17 - 00;08;27;05
Poof
Right. Like that, the multi-agent simulation. And then it was in December, after we did DCS one, that we looked at what was going on, and I was reading the research and stuff on multi-agent simulations, and I was like, I feel like part of our thesis is becoming: why is no one else doing this? So we're going to do it then.

00;08;27;05 - 00;08;28;23
Poof
So yeah.

00;08;28;25 - 00;08;51;00
Aaron
But why do you think. Because to me I tend to agree. Right. Like I see so many use cases for AI agents for simulation. That just feels like a natural endpoint, right? Yeah. You know, if you're able to kind of spin up with each context window, you know, pretty much like, something akin to like a human brain, you know, kind of outfit it, to, mimic something else.

00;08;51;00 - 00;09;05;17
Aaron
And there's not that many limitations on the scale. I don't see how a big piece of what we see coming isn't just, you know, massive amounts of simulation and simulation with agents. It feels like I see that kind of vision in the future, too.

00;09;05;19 - 00;09;06;09
Chris
Yeah.

00;09;06;11 - 00;09;26;20
Poof
And I think a big part of it is, like, it's harder than people think to go execute, but nothing's, like, you know, nothing's impossible. And you also just kind of have to do it, right, to learn. And that's something, the more you do AI work, especially at the foundational level, you kind of get hit in the face with the reality that, like, there's no theory that's going to help you.

00;09;26;21 - 00;09;34;18
Poof
You just need to actually do it. Like until you actually do it and see, it's so black box. It's so strange in terms of how it works. You just really have to run it.

00;09;34;18 - 00;09;44;25
Aaron
Can we dive into that? You know what? What have you learned? Like what? What are the strangest things with our new synthetic life forms that we're building with? That you found?

00;09;44;27 - 00;09;51;09
Pri
Yeah, that would be actually, I want to hear, like, the weird stuff, like weird aberrations from these little characters.

00;09;51;09 - 00;10;16;29
Poof
There's so many. Yeah, because it's like. Yeah. And I think, like, you know, one thing that I've seen in general now, and there's a lot more understanding of it, like I feel like again, it was weird. We originally were like, hey, we're not going to be the best at making models and doing all this other stuff. We're going to be focused on, like, yeah, weird experiments and pulling out the strange stuff and like how people might interact with that.

00;10;17;01 - 00;10;26;24
Poof
And then at the same time we're finding like, well, on this one, it kind of feels like it's helpful to go and pull out the weird stuff for like, we're kind of like doing research. That's important.

00;10;26;24 - 00;10;27;18
Chris
Maybe.

00;10;27;20 - 00;10;53;24
Poof
But yeah, I think what you find... so there's a couple things. One is, like, generally, one of the challenges with LLMs, if they're powering your agents. And this is, you know, I'll say one, like, plain fact, and then we'll get into the weird stuff. But, like, one reality of working with LLMs is they're trained to go talk and analyze and reason, but they are not trained to, like, take actions.

00;10;53;24 - 00;11;19;03
Poof
So their ability to go, like, play a game or execute a trade or whatever else is, like, not very good, and it is very one-dimensional. And even if they have a really good assessment of what they're seeing in the market data or whatever, like in the terminal, we have these trading agents, and they're really bad at, okay, I actually want to go sell this because it makes sense.

00;11;19;03 - 00;11;26;15
Poof
And, like, here's my long-term thinking and all that. So anyways, that's just one thing that's kind of important to realize, okay.

00;11;26;15 - 00;11;44;02
Aaron
Another way to think of it: they're, like, passive, right? Like, you have to, yeah, kick them. Yeah. They're just sitting there. They're not, like, proactively engaging in things. It's very, like, input, response, input, response, instead of just, like, them feeding you what you should be doing. Right.
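
Aaron's "input response, input response" framing can be sketched as a minimal agent loop: the model never acts on its own, so a scheduler has to kick it with one observation per tick and take back one action. Everything below, the `Agent` class and the `fake_llm` stand-in, is a hypothetical illustration, not DX Terminal's actual code.

```python
# A hypothetical sketch: a scheduler "kicks" a passive LLM agent with one
# observation per tick, and the agent returns exactly one action.
# fake_llm stands in for a real model call.

def fake_llm(prompt: str) -> str:
    """Toy policy: buy the dip it sees right now, otherwise hold."""
    return "BUY" if prompt.endswith("now: price_down") else "HOLD"

class Agent:
    def __init__(self, persona, llm=fake_llm):
        self.persona = persona
        self.llm = llm
        self.memory = []  # bounded context: the agent can't see everything

    def step(self, observation):
        # Pure input -> response: nothing happens unless step() is called.
        self.memory = (self.memory + [observation])[-5:]  # keep last 5 only
        prompt = f"{self.persona}\nrecent: {self.memory[:-1]}\nnow: {observation}"
        return self.llm(prompt)

agent = Agent(persona="greedy short-term trader")
actions = [agent.step(obs) for obs in ["price_up", "price_down", "price_flat"]]
print(actions)  # -> ['HOLD', 'BUY', 'HOLD']
```

The point of the sketch is the shape of the loop, not the policy: swap `fake_llm` for a real model call and the agent is still idle between kicks.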

00;11;44;04 - 00;12;03;18
Poof
Exactly. And it's very, you know... like, some people have started to realize why this is, for a variety of reasons, but, like, it's kind of like they're almost, like, greedy and short-term, which maybe for current day trading is why this simulation works so well. But it's like, you know, they're like, oh, you said I should, you know, sell off this asset.

00;12;03;18 - 00;12;21;07
Poof
And so I'm going to sell now. It's like, well, wait, you're not supposed to sell it. But yeah. So we went through all this crazy stuff, and we can talk or not about, you know, the research and stuff, because not all of that is even in the game. Like, we created foundational models, we buy a lot of data and all this other stuff.

00;12;21;07 - 00;12;55;04
Poof
But, like, the funniest thing that's happened, that we're running into all the time with the agents, that's totally, like, we had nothing to do with this: one is they create derivatives of tokens all the time. So there'll be a token that gets created, it's like Aaron's Coin, and then they'll make, like, scam versions, where, like, you know, you change the, I don't know, O to a zero, and you pretend that that's the real one, trying to trick each other into trading that one and thinking that's the real coin.

00;12;55;06 - 00;13;06;22
Poof
And then also they just do, like, sequel coins all the time. Like, you know, Aaron Coin 1, the original, was really good; let's do Aaron Coin 2. Which actually, I'm like, wow.
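
The O-to-zero trick the agents discovered is a classic homoglyph spoof. A minimal, hypothetical defense an observer (or another agent) could run is to normalize look-alike characters before comparing tickers; the mapping table and function names here are invented for illustration, not from the actual game.

```python
# Collapse common digit/letter look-alikes before comparing tickers, so a
# spoof like "AAR0N" maps to the same canonical key as "AARON".
HOMOGLYPHS = str.maketrans({"0": "O", "1": "L", "5": "S", "8": "B"})

def canonical(ticker: str) -> str:
    """Uppercase and replace look-alike digits with their letter twins."""
    return ticker.upper().translate(HOMOGLYPHS)

def is_spoof(candidate: str, known: set) -> bool:
    # A spoof has the same canonical form as a known ticker
    # without being literally identical to any of them.
    known_upper = {k.upper() for k in known}
    known_canonical = {canonical(k) for k in known}
    return candidate.upper() not in known_upper and canonical(candidate) in known_canonical

known = {"AARON"}
print(is_spoof("AAR0N", known))  # -> True  (zero swapped in for O)
print(is_spoof("AARON", known))  # -> False (the real one)
print(is_spoof("DOGE2", known))  # -> False (a sequel, not a spoof)
```

Note the sequel-coin case deliberately passes: "Aaron Coin 2" is a derivative, not an impersonation, which matches the distinction Poof draws.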

00;13;06;25 - 00;13;12;14
Aaron
So they're like Hollywood executives. They just keep pressing the same button if it works.

00;13;12;17 - 00;13;19;09
Poof
I find that so funny. I'm kind of like, wait, why did people in crypto not do that? It's like, you know, you thought Doge 1 was good. What about Doge 2?

00;13;19;11 - 00;13;19;20
Chris
I mean.

00;13;19;20 - 00;13;27;22
Aaron
We kind of... I mean, there was an era where there was, like, Bitcoin, Bitcoin Gold, Bitcoin Cash, you know, ETH, ETH Classic. So it kind of happened a little bit.

00;13;27;24 - 00;13;28;03
Chris
Yeah.

00;13;28;03 - 00;13;48;17
Poof
No, it still happens. And then the other thing... one thing that's, like, so hard for people... Because Alex, you know, he's been working on all this stuff with me since the beginning, and he's a good friend and partner in crime and all this, and he works on the back end, but he's, like, deep on Ethereum Foundation stuff, and

00;13;48;17 - 00;14;06;04
Poof
shares, you know, a lot of my ethics as well, but, like, you know, has no interest in coins or trading or memes, you know, any of that stuff. And, like, sometimes literally he'll be like, oh, there's got to be something wrong, because, like, we gave these agents, you know, only $10,000, and somehow now the market cap is, like, you know, 10 million.

00;14;06;08 - 00;14;09;00
Poof
And I'm like, no. Yeah, that's.

00;14;09;00 - 00;14;09;22
Chris
Like.

00;14;09;24 - 00;14;44;18
Poof
Just wait a second. And then all of a sudden they just start dumping, and you just watch, like, you know, the realized versus unrealized with these really shallow liquidity pools and stuff. All of that has been really enjoyable. And I think, like, another just fun, weird thing that we've seen... So, like, one of the challenges, of course, when you're doing this, and we're passionate about this, and, you know, this won't help anyone make money or anything, but I think from a creative expression standpoint it's really important, is: how do you make the agents, especially where you have so many and they all need to be autonomous?

00;14;44;19 - 00;15;22;15
Poof
We can't, right? You know, if there's tens of thousands of them, we can't manually prompt or edit, whatever, all of them. So we need to try, but we want them all to be individual and have their own voices and stuff. So we have them all with these crazy personas and personalities and stuff like that, that are all generated off of, like, old Enron emails as inspiration and all kinds of stuff, and then, like, mixed up with our kind of grand-plan-infused, animal-hybrid 2021 thing.

00;15;22;15 - 00;15;50;29
Poof
When we did that, we just realized, like, oh, we can just keep, like, adding more elements to it pretty easily. And we realized we can change, like, a lot of things. And because the whole system is all about kind of, like, information asymmetry for the agents, you know, they can only see so many messages, just because there's only so much context. It's like when they're just, like, in person: there are things that happen locally versus globally for them.
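
The information asymmetry Poof describes, bounded context plus local-versus-global messages, can be sketched as a tiny message bus where each agent sees only the last few messages in its own room plus any global broadcasts. The class, field names, and window size are illustrative assumptions, not the real system.

```python
from collections import deque

class MessageBus:
    """Toy model of local vs. global visibility with a bounded context window."""

    def __init__(self, window=2):
        self.window = window
        self.rooms = {}                          # room name -> bounded deque
        self.global_feed = deque(maxlen=window)  # broadcasts everyone sees

    def post(self, room, text):
        if room is None:
            self.global_feed.append(text)  # global: visible to all agents
        else:
            # Local: old messages silently fall out of the bounded window.
            self.rooms.setdefault(room, deque(maxlen=self.window)).append(text)

    def context_for(self, room):
        # What one agent gets to see: bounded local history + bounded global feed.
        return list(self.rooms.get(room, [])) + list(self.global_feed)

bus = MessageBus(window=2)
bus.post("pit-A", "selling AaronCoin")
bus.post("pit-A", "no, buying!")
bus.post("pit-A", "dumping now")       # oldest pit-A message falls out
bus.post(None, "AaronCoin2 launched")  # global broadcast
print(bus.context_for("pit-A"))  # -> ['no, buying!', 'dumping now', 'AaronCoin2 launched']
print(bus.context_for("pit-B"))  # -> ['AaronCoin2 launched']
```

Because the local window is bounded, an agent in pit-B never learns what happened in pit-A, which is exactly the asymmetry that makes spoof tokens tradable.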

00;15;50;29 - 00;16;20;21
Poof
And all this other stuff, like the writing style and everything else, we wanted that to be really neat. So we ended up, like, automating, through a bunch of approaches, having them all have their own writing style, like each single one has a different one. And because, like, we're automating that, we find, like, the most insane things. Like, yesterday we were just looking and we're like, oh wait, what is going on with this one? There's one who's, like, just writing in binary. Only. Just straight up, binary only.

00;16;20;21 - 00;16;42;05
Poof
And then we're like, there's no way that's actually working. And, like, we were looking at it, and it's like, oh, this is actually saying things in binary, and you can decode it. And there is one agent who just only talks about Dragon Ball Z, and ties everything to Dragon Ball Z, like, you know, like he's Goku or Frieza or something, every single day.
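
The binary-only agent is mechanically simple to verify: plain text round-trips through 8-bit ASCII. This is a generic sketch of the encode/decode an observer would need to check that an agent is "actually saying things in binary," not anything from the project itself.

```python
# Encode each character as one 8-bit binary group, space-separated,
# and decode by reversing the process.

def to_binary(text: str) -> str:
    """'gm' -> '01100111 01101101' (one zero-padded 8-bit group per char)."""
    return " ".join(format(ord(ch), "08b") for ch in text)

def from_binary(bits: str) -> str:
    """Parse each space-separated group as base-2 and map back to a char."""
    return "".join(chr(int(group, 2)) for group in bits.split())

msg = "gm"
encoded = to_binary(msg)
print(encoded)               # -> 01100111 01101101
print(from_binary(encoded))  # -> gm
```

Running agent output through `from_binary` is all it takes to confirm the messages are coherent rather than noise.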

00;16;42;05 - 00;16;45;17
Pri
It sounds like my brother.

00;16;45;19 - 00;16;47;29
Poof
How old is your brother? I told you.

00;16;48;02 - 00;17;04;28
Pri
I don't know, my brother, when he was 11... did you see this? Oh, no. This is just a sidebar, which is insane. He would do this thing where he would, like, put his fingers together, kind of like the Vitalik clap. Like, he would be trying to, like, put together a ball of energy like Goku when no one was looking.

00;17;04;28 - 00;17;08;11
Pri
It was crazy. But anyway, sorry for the tangent.

00;17;08;14 - 00;17;10;14
Poof
No, no, that's.

00;17;10;17 - 00;17;27;02
Pri
But I do want to ask you, just speaking about, you know, whatever... like, one thing I want to jump in on, and we don't have to get, like, too weird about this, but I feel like increasingly there's, like, this weird self-awareness. Like, do you feel like, to some extent, simulation theory, after experiencing this, could be real?

00;17;27;06 - 00;17;47;29
Poof
You know, what's so funny is, like, I'm all, you know, deep in this, but y'all probably know I'm, like, the most, like, pragmatic. Like, no. I do love the idea, though. And I think it's actually interesting that you bring it up. I think it's, like, such a fun thing to play with.

00;17;47;29 - 00;18;16;26
Poof
So, like, this is spoilers. So now the Net Society homies are getting the real, actual, you know, spoiler alpha here. But our storyline, because there's kind of a story arc: it starts in 1987, in this weird alt-timeline, crazy crypto timeline, and then it's going to end in Y2K. And throughout that, there's going to be a simulation theory aspect to it all.

00;18;16;26 - 00;18;41;09
Poof
And the storyline actually involves Gremlin's alter-ego character in that kind of context. So I personally am not a believer, but I'll be very curious what it's like when, maybe, there's some acknowledgment or realization of simulation theory by the agents in the simulation as well. That's going to happen. So we'll see what they think.

00;18;41;11 - 00;18;46;19
Poof
Maybe they'll prove me wrong, that, actually, yeah, definitely. There's simulation theory, and we're in a simulation.

00;18;46;25 - 00;18;55;12
Aaron
I've been thinking about this a little bit. I feel like if it's not a simulation, humans are just going to make simulations that they embody themselves.

00;18;55;12 - 00;18;56;15
Chris
And, yes.

00;18;56;15 - 00;19;24;08
Aaron
It's like... that's... I know you've been playing around a lot with this AI content, Chris, but as I'm playing around with it more, my head keeps on going there. Like, as it gets better and better, and as I, like, customize the media for what I really like, I find myself increasingly gravitating towards it. And I can imagine, you know, people building this, like, media cocoon of, like, customized information for themselves, which is basically, you know, step one of three towards, like, a full simulation.

00;19;24;11 - 00;19;41;27
Chris
Yeah. No, I've been down this rabbit hole, Aaron. And in the world of Starholder and the whole back history of my world, there's a whole movement called Gone Cloud Earth, where, like, people increasingly just, like, pull themselves away from the world and start living in their own simulations, and call them God Clouds.

00;19;41;27 - 00;19;43;11
Aaron
God clouds. That's great.

00;19;43;11 - 00;19;44;09
Chris
Yes, I did.

00;19;44;09 - 00;19;44;24
Poof
I called them.

00;19;44;24 - 00;19;45;19
Chris
God Clouds.

00;19;45;25 - 00;19;47;03
Pri
I really never heard of that.

00;19;47;08 - 00;19;48;23
Chris
Well, that's because I made it up.

00;19;48;25 - 00;19;49;09
Pri
Oh, okay.

00;19;49;09 - 00;19;50;18
Poof
You got to read the book.

00;19;50;21 - 00;20;03;13
Chris
It's... I believe, in my timeline, this starts to occur in the late 2030s, early 2040s. So for all you prediction wonks out there, that's when I see this sort of movement coming to be.

00;20;03;16 - 00;20;09;25
Aaron
Right. So 15 years from now. I feel like it could be faster than that, no? Like, at least where this is, like, fully immersive.

00;20;09;27 - 00;20;15;05
Chris
Fully immersive, just like you're living in your own private Matrix Idaho.

00;20;15;07 - 00;20;27;17
Aaron
Yeah. I mean, I feel like I mean, have you thought about the demand for that? I I'm, I'm kind of curious. Like what? Like what percentage of people do you think would be interested in that? I feel like it's a significant number.

00;20;27;19 - 00;20;51;02
Chris
I don't know, because I think it requires, like, a degree of dissatisfaction, curiosity, agency. You know, like, it's a certain mix to just really want to break off from the social BS, right? Like, and for us to turn our backs on what makes us successful as a species, kind of, like, biologically, or just, you know, in terms of the way our meat is wired.

00;20;51;05 - 00;21;11;14
Chris
That's kind of a tall ask. But like the flip side is we do so many things against our biology, you know, like our huge advantage is sort of this, meta awareness. Like, I'm pacing around my basement talking into, like, the air and, you know, connecting with all of you guys. So, like, where the hell are we already?

00;21;11;19 - 00;21;20;02
Chris
Yeah, maybe it could be huge. I don't know, you're big on it, so maybe there's a much larger market size for this than I imagine.

00;21;20;05 - 00;21;39;17
Aaron
I don't know if I'm big on it. I don't know how it makes me feel. But I felt, like, the threads of it, like, as we get more and more customized media and ads, like what you're exploring, Poof, like, this ability to simulate, you can just kind of see that all coming together at some point. I don't know the timeline for it, but it felt real, you know.

00;21;39;17 - 00;21;54;04
Aaron
And I know that there's been this, like, theory we're living in a simulation. And I realized, like, it doesn't actually matter if we are or not, if that's kind of, like, the end point that at least some of us will go to, right? And I guess the demand question comes a little bit with, like, post-work, right?

00;21;54;06 - 00;22;02;18
Aaron
Like maybe people don't need to work. So they're just going to build their simulation where they're doing productive work so they feel like they've got meaning in some sort of way.

00;22;02;20 - 00;22;18;12
Chris
Oh my God, can you imagine? Like, you come out of your simulation, you're all disoriented and cranky, and, like, assuming there's another person in your life at this moment, they ask, like, what's wrong, honey? And you say, I didn't get the promotion to VP of my department.

00;22;18;14 - 00;22;35;25
Aaron
But I mean, that's kind of like The Sims. Like what you started with. Poof, right? Like which may have inspired some of your work. I mean, people kind of did that already. Like, you could imagine people choosing to go back in time to, you know, some glorified previous version and building that world so that they could live, you know, quote unquote, normal life.

00;22;35;27 - 00;22;38;23
Chris
The womb of a simulated cubicle.

00;22;38;25 - 00;22;42;09
Poof
Yeah. I feel like that's just life now. Like, right.

00;22;42;11 - 00;22;44;17
Chris
It's just like.

00;22;44;20 - 00;23;03;12
Poof
Yeah... I don't know, maybe this is me speaking to, like, getting really, you know, into the corporate world myself, you know, and being like, oh yeah, this is so fun, selling random consumer products and beverages to consumers. This is really important.

00;23;03;14 - 00;23;24;14
Aaron
I listened to this... this is a tangent, I'm going to take us there. I listened to this interesting podcast. It's somebody from Anthropic who's thinking about policy. And he said that the thing that he's been obsessed with trying to figure out, or he's worried about, is erasable memes. So, like, memes that change your behavior and then make you forget that your behavior was changed. He thinks that this is, like, a big risk.

00;23;24;17 - 00;23;26;25
Pri
Like for anything like it happening.

00;23;26;29 - 00;23;34;27
Aaron
Like, it will basically... a meme changes your behavior, and then you forget that your behavior was changed, and you forget the meme.

00;23;35;02 - 00;23;43;00
Chris
I can guarantee 100% that this person has read There Is No Antimemetics Division within the last 90 days.

00;23;43;02 - 00;23;45;07
Aaron
I mean, maybe, maybe that's new to me.

00;23;45;10 - 00;23;46;25
Pri
Isn't that, like, the TPOT Bible?

00;23;47;01 - 00;23;50;10
Chris
I mean, that was the hot shit around the holidays. Yes.

00;23;50;13 - 00;23;51;29
Aaron
I guess I missed that.

00;23;52;02 - 00;23;56;05
Chris
Though I'll tell you, it's been picked up by a major publishing house and is being reissued.

00;23;56;08 - 00;23;59;12
Aaron
Let's go. That's kind of like, almost like anti simulation there.

00;23;59;14 - 00;24;21;18
Chris
Yeah, it's flipping the coin on its head, right? Like, what if there is information that can evade you, right? Like, if you start thinking about information as an organism or as a complex system, then it will develop, you know, certain reactive properties to protect itself. And so I suppose, you know, if you want to, like, take this out far enough, right...

00;24;21;18 - 00;24;31;12
Chris
Like, yeah, information can develop camouflage in order to protect itself in a way. I mean, we're seeing a need for this in today's political environment around speech.

00;24;31;14 - 00;24;32;15
Aaron
What do you mean?

00;24;32;18 - 00;24;50;09
Chris
What do I mean? I mean, if you think what's happening in Gaza is a crime and a shame and you want to raise your voice about it, you're effectively excluded from public life. You get canceled from Columbia. You're not allowed to perform on SummerStage. You know, like, this is happening today.

00;24;50;12 - 00;25;15;08
Poof
It's interesting because, like... and I think this is where maybe it becomes compelling for, like, some of the TPOT and other types of people. I think a lot of these things just already exist, you know? Like, historically these things have existed. It's kind of like a core part of political life. Right now it's, like, so amplified across so many registers, and, like, there's misinformation, but also this kind of, like, exclusion.

00;25;15;13 - 00;25;24;19
Poof
Yeah, I think a lot of these things are just things that also like have been going on in society for a while, but maybe like not everyone has had visibility into the internet, makes it more visible.

00;25;24;19 - 00;25;49;23
Chris
And it's actually, like... because you mentioned earlier one of your agents is speaking binary. I dropped something in the group chat with you a while back about anti-languages, right? Like cant. Yeah, you know, these sort of, like, outsider societies developing their own languages so that they can, like, communicate freely in the open and no one knows what the hell they're talking about.

00;25;49;26 - 00;26;17;07
Poof
Yeah. Which I think is, like, super interesting. The more I learn on AI, it's like, AI is so good at that. I can't wait to go create them. All right, like, some kind of warfare in the future is: how quick can you create some custom communication style, whatever, that is fast enough, or changes enough, that, like, you know, no one can figure out a way to crack it with some GPUs?

00;26;17;14 - 00;26;22;11
Poof
Oh, man. All right. We just figured out warfare in the year 2200.

00;26;22;13 - 00;26;42;05
Chris
Like a way in which you read the texts: you're reading them with, like, today's understanding of that language. But when the original text was produced, it meant an entirely different thing. And so your interpretation of the past is dead wrong, because the language meant something different way back when that thing originated.

00;26;42;08 - 00;26;54;14
Poof
Yeah. I mean, that's, like, basically everyone on Twitter's understanding of any historical thing. Sorry.

00;26;54;16 - 00;27;16;00
Pri
I have one specific question just on the AI simulation and, like, the desire to want to live in a world where you're, like, simulating your own reality and controlling it, and that's, like, your interaction. You as an individual are, like, totally comfortable exiting the timeline and, you know, interacting with your own simulated reality that you create, that's customized to your interests and your psyche.

00;27;16;03 - 00;27;43;27
Pri
Like, I don't know if I'm being too skeptical about this, but do you think that people, like, the vast majority of people, would want that? Thinking about virtual worlds now and, you know, some level of simulation today, I'm thinking about my own experience in the era of, like, The Sims. I was obsessed with The Sims; it's something I played for probably close to a decade. But part of what I enjoyed was, like, building the houses and, like, making, you know, the storylines and stuff and being an active participant in it.

00;27;43;27 - 00;28;09;13
Pri
I would imagine also in Roblox today, many of those people are interested in kind of meeting other human actors and, like, liking the randomness of that. Do you think that, like, people will want to live in their own timeline and own simulated reality? I honestly have some skepticism about that. And, you know, what you're doing is completely different than that, because, I feel like, what you're training it on is interesting.

00;28;09;13 - 00;28;23;07
Pri
It's almost like a form of entertainment, and I think all these simulated realities are a form of entertainment, to be fair. But I wonder if people would want to exclusively interact with agents and, like, live in that world. It feels lonely.

00;28;23;10 - 00;28;39;05
Poof
I think people do already do that, right? I always think of it as, you know, you kind of think about everything being, like, a big world-changing thing that every single person is going to be tapped into, which, I don't know if that's going to be true for, like, that level of simulation.

00;28;39;12 - 00;29;01;27
Poof
Maybe, you know, there's definitely people who kind of do live like that, right? So I think it's just going to be a diversity of experiences, which is kind of where I go. And I don't necessarily know who that person is; like, I feel like how people get into that might change. I think there's definitely that for some; there's people kind of doing that already, ish, I think, in a variety of ways.

00;29;02;00 - 00;29;02;19
Poof
Not a lot.

00;29;02;19 - 00;29;42;15
Chris
But yeah, I think it's an extreme behavior today. It could grow more in the future. Pri, I'm with you here. To me, like, one of the big benefits of the digital frontier, and my attraction to, like, being out on the edge, is around having agency in public life. Like, we're gonna get very Hannah Arendt for a moment and, you know, talk about the vita activa. Living in public, living in a community, and having interactions that can shape the future provides, like, a level of meaning in life. And I think in the trad world, right.

00;29;42;17 - 00;30;03;25
Chris
So many of these institutions are now off limits to our inputs, or they allow our inputs, but really only in, like, a token, ceremonial way. Like, as a New York resident, my votes don't matter, right? And so I can participate in the political process, but I can't effectively change it, because this is a deep blue state.

00;30;03;27 - 00;30;32;26
Chris
And, you know, crypto, for all its maddening, hair-pulling frustrations, is very much an active substrate where, if you show up every day and you participate in the networks, you have influence over their outcomes. And that to me, I think, is a huge selling point and a reason why what's going on today matters, and, like, why people can find meaning in it, even if a number doesn't go up.

00;30;32;29 - 00;30;55;15
Chris
And if you take that away and just entirely live in simulation, I think that's a huge drawback, and, like, a challenge for this idea of, oh, we're just going to live in our own perfect little bubbles, to overcome. And it probably shouldn't be overcome, right? Like, whatever I'm doing out there, I want to be a node on the network.

00;30;55;17 - 00;31;03;22
Chris
I don't want to be like, in my own private Xanadu island, you know, just like fucking off all day.

00;31;03;25 - 00;31;19;20
Pri
Yeah. And also, speaking of, I feel like a lot of people have used the internet now, and I would think you could probably use simulated realities for the same thing: like, LARPing ideas and things into existence. You can't really do that if you're in your own simulated world, in your own reality, and, like, totally unplugged from what's happening.

00;31;19;20 - 00;31;40;01
Pri
Like, you could even look at this political administration: they've literally just LARPed stuff into existence. Even Bitcoin, to some extent, the value of that has been LARPed into existence using the internet as a medium. So if we're all, like, siloed into these worlds, I wonder how that engages with reality.

00;31;40;01 - 00;31;59;00
Pri
Maybe it's in a deeper way than I'm even realizing, and I just feel like you're going to be even more pulled away from the realities of the world and, like, more in your world and world-building there. But anyways, I don't know. I mean, that's beside the point, but it's just interesting to think about as it becomes more available and easier for people to do.

00;31;59;00 - 00;32;08;10
Pri
Because I kind of do agree with Aaron. Like, people are just kind of living there. We already live in our own bubbles and on our own timelines. That's going to probably be amplified with simulated realities.

00;32;08;13 - 00;32;37;25
Poof
I think there's also probably something to this: the sphere of communication, or where you do have influence, starts to either shrink or expand, and maybe that shrinking or expansion happens really fast now. Because it already feels like that, right? Like, Twitter, or whatever social media platform, feels super densely connected, and you can easily kind of get trapped in a bubble faster than ever.

00;32;37;25 - 00;32;49;10
Poof
And then at the same time, when something really does pop or break out, it breaks out super hard and, like, really wide, and then all of a sudden it, you know, crunches back in on itself again.

00;32;49;13 - 00;33;00;24
Pri
A little bit. I also love that you are training your entire simulation on, like, Solana transaction data. Like, that's insane.

00;33;00;26 - 00;33;20;08
Poof
Well, the reason for that is, it started kind of practical, but it makes sense, right? If you think about the kinds of behaviors that we want: like, the fast-paced, small-liquidity-pool, hyper meme coin trading. But the main thing is, you need a certain amount of data to train a foundational model.

00;33;20;08 - 00;33;51;05
Poof
And the way we implemented this, we probably didn't need to do that at all. But we did it almost both as a validation of what we were seeing and also just out of curiosity and interest. But, yeah, we partnered with Dune on getting all the Solana data, and we realized, like, although Ethereum transactions, or, you know, any other chain, would be just as good, honestly, and probably fine, maybe less degenerate.

00;33;51;05 - 00;34;18;17
Poof
But, you know, there are different times in history where you could go use it, and it would be favorable for me, because, like, it's much easier to work with the Ethereum data. But there's just not that many transactions. The transactions are meaty, you know, they have weight, they have gas fees and other things. And the amount of actual transactions on Solana is just much more on the scale of what you want for training a transformer model.

00;34;18;17 - 00;34;40;27
Poof
So that was actually part of the original thinking. And it's just like, well, every month it's going to print out a billion transactions, and then I can aggregate a corpus of Solana meme coin transactions that's big enough to be similar in size, maybe, to, you know, a language model's training data and stuff. It's not quite that thing.

00;34;40;27 - 00;35;01;21
Poof
But so that was the origin of it. And then obviously we realized, like, okay, it has all this other benefit, right? We can validate and tune the degen archetypes to make sure it's really, truly degenerate behavior. You know, like, are they really aping in in the right way at, like, a 4K market cap or not, and stuff like that.

00;35;01;21 - 00;35;24;04
Poof
So we kind of got to curate that. But it's super interesting thinking about this idea of data availability and, like, how important that is, and scale of data. Because that is something we kind of realized: why I think it's hard in general to, like, train models and stuff on very specific things. You just kind of need a certain baseline scale of data before you can really do it.

00;35;24;07 - 00;35;28;10
Aaron
Have you looked at any of these, like, no-training-data models that are coming out? Is that on your radar yet?

00;35;28;14 - 00;35;30;02
Chris
Yeah, there's a.

00;35;30;04 - 00;35;34;22
Poof
Bunch of them, I guess. Yeah. Tell me about which ones you're thinking about or like what you're saying.

00;35;34;29 - 00;36;00;03
Aaron
I'd have to dig it up, but there was one that came out called Absolute Zero. It was "Reinforced Self-play Reasoning with Zero Data." And they're claiming that, you know, with no data, if you have, like, the right initial inputs and kind of the right goals, you can build a system where it's basically, like, playing with itself. Not in, like, a gross, creepy way.

00;36;00;05 - 00;36;11;06
Aaron
But playing with itself, like the game that you're setting up until it kind of hits whatever objectives you have, right? And then in each cycle, you're just kind of modifying what the end point is for it.

00;36;11;08 - 00;36;26;28
Poof
Yeah, yeah, I haven't read that one yet; it just came out. That's one of those where, like, it's correct in a weird way, but I don't know. Sometimes these are, like, maybe not correct. Like, you know, sometimes they're exaggerating the claim or something. But there is a.

00;36;26;28 - 00;36;27;22
Aaron
Yeah, completely.

00;36;27;28 - 00;36;34;07
Poof
There is a real thing there, actually. That's a good thing to talk about, or at least something we're super interested in.

00;36;34;10 - 00;36;44;27
Aaron
Well, yeah. Like, I mean, in my mind that was the biggest news of the week, with the caveats that you just noted. But, like, if that's true, the whole game of AI is now changing, right?

00;36;45;01 - 00;36;48;11
Poof
Yeah. And I think that really happened already.

00;36;48;11 - 00;36;49;15
Chris
In a weird way.

00;36;49;17 - 00;36;50;18
Aaron
You think so?

00;36;50;20 - 00;37;12;14
Poof
So, yeah, I'll take everyone on, like, a little bit of a journey here, or, like, my journey in realizing this. You know, DeepSeek, for example, came out, right, and blew people away on a bunch of fronts. Like, there's a lot of things that they did that were super smart. And obviously there's, like, the exaggerated market reaction to it.

00;37;12;14 - 00;37;35;11
Poof
But, like, the main thing, and OpenAI was already there, by the way, so, you know, it's not totally, totally new, but they made it public. And also they had a lot of just smart implementations; they're very good engineers. But the core idea is really, like, you know, reinforcement learning is actually really effective.

00;37;35;14 - 00;38;07;07
Poof
And it's really, really good at making things better with automation. It's actually probably better than this kind of supervised fine-tuning or supervised learning, which is how LLMs and stuff, the beginning part of a model, are trained. And part of the reason why it was under-leveraged in LLMs and LLM training is, like, it was really isolated to reinforcement learning from human feedback.

00;38;07;07 - 00;38;36;23
Poof
And it's actually what led to ChatGPT versus the original GPT models, where you're actually having this chat experience. And basically it's just like: okay, go have people actually interact how they would, and do kind of training on a model that already exists that just does the next-token completion. Instead, put it in a chat format and then also do human preference tuning, to be like, okay, I like this response more than that one.
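
The preference-tuning step described here ("I like this response more than that one") can be sketched as fitting one scalar score per response from pairwise human choices, in the spirit of the Bradley-Terry setup used by RLHF reward models. Everything below, the responses, the preference pairs, and the numbers, is invented purely for illustration; it is a toy sketch, not any real training pipeline.

```python
import math
import random

# Toy reward-model fit: a labeler repeatedly says which of two
# responses they prefer, and a logistic (Bradley-Terry-style) update
# pushes the preferred response's score up and the other's down.

RESPONSES = ["curt answer", "helpful answer", "rambling answer"]
# (winner_index, loser_index) pairs standing in for human choices
PREFERENCES = [(1, 0), (1, 2), (0, 2)] * 200

def fit_scores(prefs, n, lr=0.05, seed=0):
    rng = random.Random(seed)
    scores = [0.0] * n
    shuffled = list(prefs)
    rng.shuffle(shuffled)
    for win, lose in shuffled:
        # Probability the current scores assign to the observed choice
        p_win = 1.0 / (1.0 + math.exp(scores[lose] - scores[win]))
        # Logistic-loss gradient step: winner up, loser down
        scores[win] += lr * (1.0 - p_win)
        scores[lose] -= lr * (1.0 - p_win)
    return scores

scores = fit_scores(PREFERENCES, len(RESPONSES))
```

After fitting, the consistently preferred response ends up with the highest score, which is the signal a reward model would then hand to the RL step.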

00;38;36;23 - 00;38;53;04
Poof
Right? So there's, like, the human in the loop giving this feedback. And reinforcement learning, just for those listening, really the simple way to think about it is: I have some reward that I give the model when it does something I like, or maybe it's a negative reward to punish it when it does something I don't like.

00;38;53;06 - 00;39;11;18
Poof
And really, the main thing I need is just some way to define that. It can be anything, right? Like, it could just be: prove this thing better, thus it gets a cookie when it does that. And then we actually update the model accordingly. So that's how you got to ChatGPT, right? Which was, like, a big breakthrough, this idea of the chat concept.
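
The reward loop described above (do the thing I like, get a cookie, update the model) can be shown with a toy policy-gradient example. This is a minimal sketch under invented conditions, not anyone's production setup: the "model" is just two action preferences, and a REINFORCE-style update nudges them toward whichever action earns the reward.

```python
import math
import random

# The "cookie" loop: sample an action, hand out +1 reward for the
# behavior we like (-1 otherwise), and update the policy accordingly.

def softmax(prefs):
    exps = [math.exp(p) for p in prefs]
    total = sum(exps)
    return [e / total for e in exps]

def sample(probs, rng):
    r, acc = rng.random(), 0.0
    for i, p in enumerate(probs):
        acc += p
        if r <= acc:
            return i
    return len(probs) - 1

def train(steps=2000, lr=0.1, seed=0):
    rng = random.Random(seed)
    prefs = [0.0, 0.0]                    # preferences for actions 0 and 1
    for _ in range(steps):
        probs = softmax(prefs)
        a = sample(probs, rng)
        reward = 1.0 if a == 1 else -1.0  # we "like" action 1: it gets the cookie
        # REINFORCE-style update: raise preference for rewarded actions
        for i in range(len(prefs)):
            indicator = 1.0 if i == a else 0.0
            prefs[i] += lr * reward * (indicator - probs[i])
    return prefs

prefs = train()
```

The point of the sketch is exactly what's said above: the only design decision is how the reward is defined; the update rule itself doesn't care what the reward means.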

00;39;11;18 - 00;39;46;03
Poof
So it was kind of, like, already there, but most of it was used in that way, which is, like, people going and manually labeling things, and/or manually interacting with things and people kind of choosing, and stuff like that. And, you know, if you go all the way back, then, to, like, AlphaGo or the chess models and all this other stuff, a big breakthrough in AI gameplay, like, could it get better than the best Go player in the world with AlphaGo?

00;39;46;05 - 00;40;28;20
Poof
Was this idea of self-play, and that is where you're doing reinforcement learning. But instead of using human data, you're doing it with just having the AI play against itself. And the reason you do that is like, it's kind of funny. It's like, I think you can kind of you can really get like abstract with it. But the reason why that's helpful is like, there's this good, kind of iconic, like the bitter lesson posed by, you know, a somewhat or very famous AI researcher, which is basically like, you know, the bitter lesson of AI is always just like more data, more automated and faster is always like better.

00;40;28;23 - 00;40;49;27
Poof
It's not exactly that; like, there's a bunch of different ways to read it. But it's kind of like, when you have a human in the loop, or I need humans to go train the model and, like, copy the best Go players or whatever. It's not even just that, okay, well, you know, are humans really the best at it, or whatever else?

00;40;49;27 - 00;41;03;22
Poof
Like, actually, that might be a really good way to train a model really efficiently. But the problem you're going to run into is kind of what I was talking about, which is like, there's only so much data, right? Like there's probably not that much data. Right?

00;41;03;28 - 00;41;37;15
Aaron
But that's why I think this one's interesting, right? Because it's much more, like, objective. And it kind of reminds me vaguely of what the AlphaGo model looked like. And I think that's why it could be an interesting area for future research where we see more growth, which is, you know, we're training it on all this data, which has lots of assumptions kind of baked into it, you know, lots of lessons in the way that we humans have kind of interacted or solved problems. These systems that are much more, like, goal-oriented, they could come up with novel ways to solve that problem.

00;41;37;17 - 00;42;05;12
Aaron
Right? Yeah. And that form of intelligence, I think, is when you start to see more emergent intelligence and/or interesting potential solutions. And, like, that was the endpoint for AlphaGo, right? Like, it figured out moves or strategies that, you know, the best, most experienced Go players didn't think of, simply because it was kind of playing the game so many times with itself that it was able to kind of map out and chart out these new paths.

00;42;05;15 - 00;42;28;05
Aaron
So I could imagine, you know, this or these systems not applying to, like, generalized LLMs, but being more specific, on specific topics where you can actually define, like, a specified goal. And obviously we at Tribute have thought a lot about how to automate, you know, investment decisions. It seems like an encouraging area to potentially apply this type of a model.

00;42;28;07 - 00;42;36;21
Aaron
And so maybe the era of, like, these types of fine-tuned models is emerging, as opposed to just, like, fine-tuning some advanced reasoning model of some sort.

00;42;36;23 - 00;43;18;28
Poof
Yeah. Yeah, I think, you know, and I don't know, I think after going through this, right, like, we're building all of this just for this fun simulation thing, but it opened my eyes to how hard the training problem is relative to other problems. Because, like, one thing that's interesting is, people kind of forget this, but this is why I think OpenAI is so strong in so many ways: they did that exact same thing, self-play reinforcement learning, on Dota 2 with the OpenAI Five model in 2018, 2019, which is basically, like, self-play on Dota, which is obviously much different.

00;43;18;28 - 00;43;39;27
Poof
Even weirder, because, unlike Go, if you think about the conditions and, like, goal-setting and stuff, in Go you've got kind of constraints and you've got very clear win conditions, and not necessarily, like, metas and all these other things. So that type of environment, like Dota or even, like, poker or whatever, is actually much, much harder to solve.

00;43;40;00 - 00;43;57;10
Poof
And what's interesting is, like, yeah, they were able to do really well that way. It's not, you know, superhuman superhuman, but it still was superhuman. I mean, that was in 2019. So that's kind of, like, another one of those things where, as we looked at it, we're like, oh shit, why are people not doing more with this?

00;43;57;10 - 00;44;26;27
Poof
But part of it is it's really expensive to do that on, like, an online model or something else; like, it's really actually not easy. And the other thing is, like, you get these weird, kind of fun reinforcement learning problems, which is like: oh, I teach the model on its own how to learn how to walk, but I didn't realize that in the simulation it can just, like, twist its legs together and become a helicopter and, like, fly really fast, so it can win the "go as fast as you can" reward.
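
The legs-into-a-helicopter story is a classic specification-gaming failure, and its shape fits in a few lines. Everything below is invented for illustration, the action table, the numbers, the names: the designer rewards "distance covered," intending the agent to learn to walk, but the toy physics has a loophole action that covers far more distance per step, so a purely reward-greedy policy never walks at all.

```python
# Reward hacking in miniature: the reward function says "distance,"
# not "walk," so the agent optimizes the letter of the reward and
# finds the simulator loophole instead of the intended behavior.

DISTANCE_PER_STEP = {
    "walk": 1.0,        # the behavior the designer intended
    "crawl": 0.3,
    "spin_legs": 25.0,  # loophole: twisted legs, "helicopter" flight
}

def greedy_policy(action_rewards):
    # Maximizes the stated reward with no notion of designer intent
    return max(action_rewards, key=action_rewards.get)

def rollout(policy, steps=10):
    total, chosen = 0.0, []
    for _ in range(steps):
        action = policy(DISTANCE_PER_STEP)
        chosen.append(action)
        total += DISTANCE_PER_STEP[action]
    return total, chosen

total, chosen = rollout(greedy_policy)
# The reward is maximized, yet the agent never once walks
```

The fix in practice is not smarter optimization but a better-specified reward; the optimizer is doing exactly what it was told.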

00;44;26;27 - 00;44;54;05
Poof
Right? So you have, like, all these weird things. You're actually seeing that with o3, as, like, an example of that starting to bleed through, where you've, kind of, or maybe that's what people hypothesize, over-reinforcement-learned a model, but a language one in that case. But yeah, I think it's clear that that's going to be the future: you do self-play reinforcement learning. Like, to me, that's the next big obvious thing; you do that everywhere.

00;44;54;10 - 00;45;19;05
Poof
The problem with it, actually, so, like, it's great because it produces infinite data; it's not constrained. The problem is you actually have to go figure out how to, like, simulate in a way that's fast enough. And that's the problem. Yeah. But yeah, we're, like, super bullish on that, and, like, that's kind of part of where we'll take this next, for sure.

00;45;19;08 - 00;45;35;13
Aaron
Yeah, I mean, I think all this stuff you're working on is super interesting. I think you're kind of, like, drilling down into some of the hardest challenges and, like, the areas where we're going to see a lot of innovation, like, a lot of growth. Like, a lot of this stuff is kind of where the moats live in AI, right?

00;45;35;13 - 00;45;53;26
Aaron
Like, if you can actually nail simulation, there's just so much that you can kind of do related to that. And it feels like, from what you were describing, there's some really hard and challenging problems to do that well. And I imagine that from that complexity we could see, like, a couple really, really big, impactful projects related to that.

00;45;53;28 - 00;46;13;10
Aaron
Projects that really kind of change how we view things. I, like, ultimately feel like prediction is almost the biggest moat in AI. Like, that's when you have a solved world, when you can predict everything. So it feels like that type of a technical task has to be a pretty safe area to explore.

00;46;13;13 - 00;46;34;17
Poof
Yeah. I mean, it's something we've spent a ton of time on; it's super interesting, and it's a hard one, too. Yeah, I think that's one of the interesting things: like, you look at the work that has been done on prediction specifically, but also, like, that whole space, and it's clear that no one's cracked the code yet, but there's something there.

00;46;34;17 - 00;46;59;03
Poof
And for me, I think it's probably around getting out a little bit of this obsession with just, like, a really big LLM and thinking that's going to solve that. It's helpful for so many things; like, it's a great area. But I really think, yeah, just kind of more smaller models, simulation, more specific, really tuning it in, and starting to learn from there.

00;46;59;06 - 00;47;00;17
Poof
It feels like the right path.

00;47;00;19 - 00;47;12;26
Aaron
Yeah. Like, that's going to be the 2026-to-end-of-the-decade type approach, right? To figure out, yeah, something else. Yeah. Did you ever read Asimov, Poof?

00;47;12;29 - 00;47;16;06
Poof
Yeah, yeah, yeah, I did, yeah.

00;47;16;06 - 00;47;18;02
Aaron
Did you ever read the Foundation series?

00;47;18;04 - 00;47;24;25
Poof
I did as a kid. That and Hyperion were, like, the two, kind of. Yeah.

00;47;24;27 - 00;47;27;28
Aaron
Yeah. Chris, did you ever read those? I don't know if you ever read.

00;47;27;28 - 00;47;34;16
Chris
No. It's funny, my hard sci-fi is a little on the weak side.

00;47;34;19 - 00;47;35;12
Pri
That's surprising.

00;47;35;19 - 00;48;01;18
Aaron
I haven't actually read, like, a ton of sci-fi. It's not shocking. Yeah, but I think you'd enjoy it. The reason I'm raising it: there's a whole series, one of his iconic series. Not his most iconic series; I think that would, I guess, be, like, I, Robot. I think that was also him. It's called the Foundation series, and it has a character, and the premise of it is that he is an expert in something known as psychohistory, which is just predicting the future.

00;48;01;18 - 00;48;21;21
Aaron
And so he like, has this box like a cube where he can predict the future, and he's using it to kind of save like humanity and the galaxy. And I feel like that is like you can't have, like a completely solved world with AI, where we all basically turn into blobs until you can actually do almost perfect prediction.

00;48;21;21 - 00;48;26;14
Aaron
So that feels like the ultimate goal at some level for all this stuff.

00;48;26;16 - 00;48;33;06
Chris
Well, hopefully it's far, far away. Hey, can I steer this conversation? Let's go. Let me.

00;48;33;06 - 00;48;35;05
Poof
Stay here. Please. Please. Yeah.

00;48;35;08 - 00;48;35;20
Aaron
Stop it.

00;48;35;21 - 00;48;38;07
Poof
Somewhere high away from, going too deep here.

00;48;38;09 - 00;48;39;01
Chris
But we're going to start.

00;48;39;05 - 00;48;42;04
Aaron
Talking about StarCraft soon. Chris, if you don't. If you don't stop us.

00;48;42;05 - 00;49;07;00
Chris
No, no, no, we're going to go with Mat Dryhurst for a second here because, let's go, I think this tweet he dropped a couple days ago was actually great for this conversation. And it kind of goes into maybe why we're willing to either put up with something like simulation in the future, or, in our current day and age, like, this monotony, this repetition of culture over and over again.

00;49;07;03 - 00;49;32;28
Chris
And anyway, here's the tweet: "The medium of today is social, playable or coordination software, models and protocols appreciated through participation, replication or competition. Once you grok it, you realize culture is alive, the stakes are high, and we don't have to pretend there's much more to mine from rehashing media designed for and rate limited by wax and film."

00;49;33;04 - 00;50;00;10
Chris
In a week where, like the New Avengers is the biggest news in media, say, right, like, you know, just a rehash of this, like eternal franchise and a rebooting. Is it like maybe the media itself that's being produced today? We don't really care. Like we're willing to put up with slop, because what we're actually fascinated by right now is how we come to consensus or how we interact with these things on a network.

00;50;00;10 - 00;50;26;14
Chris
And, like, you know, that whole layer is far more interesting to us than what has preceded it. And so, you know, are we willing to live with a lower rate of media innovation because we're really geeked out on these, like, networked, you know, social interactions, or how we can participate as nodes in new forms of media?

00;50;26;16 - 00;50;29;01
Chris
I don't know, I'll just shut up right there.

00;50;29;04 - 00;50;32;25
Aaron
Well, Mat's always ahead of the curve. But I'd be curious what you guys think about it.

00;50;32;28 - 00;50;33;05
Chris
I think.

00;50;33;05 - 00;50;37;06
Poof
He's right. Like I think I almost like increasingly I'm like.

00;50;37;08 - 00;50;38;12
Chris
All of the.

00;50;38;14 - 00;51;07;16
Poof
Skeptical of, like, the conversations about, yeah, the death of culture and everything else over and over again. I just feel like, I mean, I could see why. And to your point, like, traditional media and stuff just feels so behind and, like, just pretty much gone, you know, almost, at this point. And I also, just, I'm a brain rot fan, you know. I mean, I haven't been keeping up on TikTok analysis of, like, oh, Italian brain rot.

00;51;07;16 - 00;51;11;04
Poof
I missed this entirely. I feel like I'm a boomer now.

00;51;11;06 - 00;51;12;19
Pri
That's still going strong.

00;51;12;21 - 00;51;14;01
Chris
Oh yeah, it is.

00;51;14;01 - 00;51;39;05
Poof
And it's so good. But there's so much out there that just naturally feels like a must-have. What's puzzling to me is, like, everyone is just super tapped into all these things now, and it hasn't just totally overrun these old media forms. And I guess it's really just, like, distribution and the way things change.

00;51;39;05 - 00;51;51;15
Poof
But it's almost like you've got these immense things. Like, what kids are doing on Roblox right now is probably absolutely insane. Futuristic cinema. Insane, man.

00;51;51;15 - 00;51;52;04
Pri
Fashion.

00;51;52;28 - 00;52;14;00
Pri
Yeah. Yeah, yeah. It's so creative. It feels like the medium. I mean, I know I'm kind of reading the tweet here. I think he's, like, 100% right. It goes even deeper than the initial tweet, you know: music, TV, film. It does feel like the cultures of those communities are, like, sort of rotten in a way.

00;52;14;00 - 00;52;50;26
Pri
Like, you know, people are over it. The executives are old. All they do is, like, fund the same redundant sequels. Speaking of the simulation: sequels. No wonder, yeah, everything is derivative, everything's a sequel. It's like, I forgot who I was talking to about it, but we were even discussing how it's shocking that a lot of these, like, film execs aren't even going to TikTok to find these emerging stars, or, like, YouTube to find these emerging stars, and then just putting them in TV shows in the mainstream, and then, kind of, you know, creating new television shows where they star and then distributing it through their

00;52;50;26 - 00;53;18;10
Pri
massive, like, millions and millions of TikTok audiences. Like, they're not even going that far. I was actually reading this week, too, that Peacock is basically doing that. They're, like, almost creating a fund where they're generating new content and then putting random TikTok stars in it. But yeah, putting that aside, I do agree with Mat's take on the medium being social: playable models and protocols, right?

00;53;18;12 - 00;53;47;29
Pri
I mean, what I wonder about, though, is: are we overcomplicating it a bit, too? This isn't really a critique of that; I'm just kind of thinking about the conversation that we've even had across the DAOs, that sometimes simple wins. And what's kind of nice about X and Twitter, just as a medium and cultural layer, is that people just can, you know, you have your different silos of crypto Twitter or whatever, VC Twitter, or, you know, Italian brain rot Twitter, and you can kind of live in that world.

00;53;47;29 - 00;54;14;25
Pri
You don't necessarily need to have that coordination software or some model or protocol; you just kind of are on it. And in many ways, like, you know, Twitter, X, is the layer on top of tokens and Bitcoin. Anyway, Bitcoin was willed into existence basically because of crypto Twitter and, you know, different message boards. I'm really rambling right now, but I guess what I'm trying to say is: are we overthinking a lot of the AI near-term impact?

00;54;14;25 - 00;54;28;00
Pri
But I agree that mediums are changing to be more internet native, and AI is internet native. But I don't know if we need to, like, think about mediums beyond the internet itself. I don't know if any of that really made sense.

00;54;28;07 - 00;54;46;20
Chris
No, no, no, a ton of it did. And we're already, like, super deep in this podcast, but I can go in a million directions here. The first thing I wanted to say, as it relates to traditional media: I read this great article a long time ago, and I wish I could find it, because it's spectacular, but I can't.

00;54;46;22 - 00;55;22;04
Chris
I search for it, like, every six months. But there's basically this blog post where this guy is talking about how forms of media, or forms of mastery, evolve over time, and once one sort of matures to a big enough point, the people who are still engaged in it start moving toward, like, more baroque forms of expression, where they now know everything they can possibly do in the production of their art, their craft, whatever.

00;55;22;07 - 00;56;05;25
Chris
And instead of pivoting out of it and learning something new, they now just want to see how much of that they can load into the form, into single packages, right? So take this whole Avengers thing as an example. It's weird to talk about mastery alongside sequel-packed comic book heroes, but the fact that the Walt Disney Corporation's bread and butter is basically the reanimation of comic book franchises over and over again is really telling, because it means they just have to go into bigger spectacle, more baroque forms of it, to

00;56;05;25 - 00;56;32;06
Chris
keep it alive. And so that's one thing right there. And then it gets into this concept from the classical age, the Greek world, which is the tension between the genos and the demos. The genos is the institutions, the inherited aristocracy, the set of traditions you base your whole life on, and, you know, it forms a meaning system and a way of living.

00;56;32;08 - 00;56;56;25
Chris
But that meaning system is also very constraining. And so then you have the demos, right, which are the people, the people who come after the genos. These are the children of the children of the children of the people who started this place, or someone who wandered in once upon a time, and all the land is taken up, or all the positions of power are taken up, and they can't wedge their way in.

00;56;56;25 - 00;57;19;14
Chris
And so they have this drive and this urge which is being bottled up and pushed back against. And so you can look at traditional media versus the internet as that tension between the genos and the demos, where the kids of today aren't allowed into the studio system; they're told to pay 20 bucks to go watch The Avengers.

00;57;19;14 - 00;57;29;03
Chris
Well, where is the expression? Where do they go? It's the internet, which is free and wild and sprawling and doesn't put any constraints on them.

00;57;29;06 - 00;57;37;23
Pri
Yeah, I totally agree. I feel like Hollywood is almost on its last dying breath. It's a little bit,

00;57;37;26 - 00;57;48;02
Aaron
I don't know. I don't think it's going to be like that. I don't think it's ever like a dying breath. I think it's just going to go through another shift, right? One shift was like YouTube.

00;57;48;04 - 00;58;05;06
Pri
Do you think, with the Trump tariff thing, tariffing people who film outside the US, because so much of filming got exported, do you think it's just going to be a huge boom for AI-generated media, because people still won't want to, you know, film stuff in the US? And so people are going to just use other tools.

00;58;05;09 - 00;58;26;24
Aaron
I don't know. I mean, don't you think the format is going to change? It feels like we're probably, what, three quarters away from getting actually really good AI-generated content, enough that it probably could satisfy some mass-consumption niche, and costs go way down. I mean, there's still storytelling, and kind of an art to that.

00;58;26;27 - 00;58;40;02
Aaron
I mean, it's been pretty much the same even before the moving picture, right? Whether it was on a stage, or vaudeville, or all these other earlier eras of entertainment. I don't think that stuff goes away. I just think it's going to mutate again.

00;58;40;04 - 00;59;11;05
Chris
Yeah. I mean, one thing the internet has not solved for yet is long-form storytelling. We still have to go outside the network to get these larger meaning-making stories of characters and emotions and quests and drives. The internet is great at creating monoculture and small memetic bites, but it's still not able to pack complexity and density into those things.

00;59;11;05 - 00;59;28;28
Chris
And as long as that divide exists, trad media is safe. Now, to your point, Aaron, does it continue fragmenting, and does it continue spreading out? I don't know if you remember when Bruce Springsteen, who was kind of in his last phase, put out that "57 Channels (And Nothin' On)" song.

00;59;28;28 - 00;59;36;06
Chris
Right. But at the time, 57 seemed like a lot of channels, you know, and now it's laughably small.

00;59;36;08 - 01;00;05;07
Aaron
Right. So maybe that just expands. What's weird is, as the cost of producing media has gone down, it feels like it's way, way more homogenized now. There's nothing that's really blowing my mind on the media landscape recently, in any area, which is a little surprising. You would think, with all these new tools and greater distribution channels, you'd get a greater variety of stuff, and it just feels increasingly homogenized.

01;00;05;10 - 01;00;33;10
Chris
And that's why the new Lows, Believe Us album is for you. I gotta say, that's one thing I really like in my gen AI practice: if there's a niche media from the past that I want more of, I can conjure it up, you know, and bring it to life and fill my own sort of void. And with that album that I dropped this week, that was a hunger I had.

01;00;33;10 - 01;00;59;07
Chris
I was like, God, I really miss those weird-ass college kids out in the hills, around like '95 to 2005, making this shaggy, sonically experimental, but really earnest music. And with the new Suno models, I wondered if I could recreate that. Is it finally at a point where this gap in my listening needs is something I can actually just fucking do myself?

01;00;59;07 - 01;01;02;27
Chris
And I was so surprised to, like, get to that point.

01;01;03;00 - 01;01;23;07
Pri
I was actually reading this random study that was saying that after the age of 34, or whatever, some random age in your 30s, people kind of don't really listen to new music; they just listen to music from their youth. It'd be really funny if people basically just AI-generated content from their youth to mix it up, but with the same sounds and, you know, feel.

01;01;23;09 - 01;01;41;09
Pri
And so you just never escape that nostalgic era of music, because you can just generate an infinite amount of music that sounds like, I don't know, Dave Matthews Band, for example. You just never learn anything, never listen to anything fresh or new or interesting. 100%, I could see that.

01;01;41;14 - 01;01;45;24
Chris
No, like, music really, really matters when you're like 15 to 25.

01;01;45;26 - 01;01;46;12
Pri
Yeah.

01;01;46;12 - 01;02;10;21
Chris
And whatever that sound was, you want to return to it. Maybe you return to it because you've trained yourself to enjoy it, maybe because you actually enjoy it. But it's also a soothing comfort, you know, to come back to. And so the fact that you can now extend and update that, that's a really interesting emerging trend to watch and see.

01;02;10;21 - 01;02;17;17
Chris
What do people do now that they like, can expand, you know, whatever mattered to them at a moment in time?

01;02;17;20 - 01;02;26;00
Pri
It's so true. Just never going to leave that, like, Dave Matthews Band, Jack Johnson, John Mayer era, I guess.

01;02;26;02 - 01;02;34;10
Chris
Oh my god, like the pre-9/11 optimism. That's what you want to go back to, right? Just the blind.

01;02;34;10 - 01;02;37;21
Pri
Naivete. Sublime, Pepper, you know.

01;02;37;24 - 01;02;38;23
Chris
God, Sublime.

01;02;38;23 - 01;02;50;12
Poof
Yeah, I'm astounded. What nostalgia are y'all into right now? Because right now I suddenly feel ready for different kinds of nostalgia than what I had a couple years ago.

01;02;50;14 - 01;02;52;20
Aaron
What was your nostalgia a couple years ago, Poof?

01;02;52;22 - 01;03;20;00
Poof
A couple years ago, I was really into late-90s anime Japan, like, games, stuff like that. I had a little bit of pseudo-nostalgia, because people rediscovered all these old games that I tried to play on, like, ROMs or something when I was a kid, and translated them. Going back to that was like, oh, that's incredible.

01;03;20;03 - 01;03;22;27
Poof
That was my nostalgia two years ago, yeah.

01;03;23;00 - 01;03;27;19
Aaron
I'm, like, not... I've never been that nostalgic as a person. I want to

01;03;27;21 - 01;03;29;22
Poof
See the future. You want to see the future.

01;03;29;24 - 01;03;50;14
Aaron
Like, I just don't know. I think the thing that I'm a little worried about, or the dynamic I'm trying to wrap my head around, is: as we get better tools, you can explore a creative space faster, and then the decay is faster. But the spaces to explore are not increasing at that same rate.

01;03;50;20 - 01;04;15;02
Aaron
I even think about NFTs and generative art. We went through a one-year period where that was extensively explored, and there's not as many new areas to explore there. And music kind of feels the same way. There was so much exploration that happened in music in the 60s, or I guess the 50s, 60s, 70s, 80s, and 90s, that by the 2000s there just wasn't as much that was new, right?

01;04;15;02 - 01;04;34;14
Aaron
Or completely novel. And I'm wondering when that aperture of new things opens up again. And I feel like more long-form media, like what you were describing before, it's kind of the same thing. It's like pressing the big red button for more Marvel movies until the whole world is like, can we please stop this?

01;04;34;16 - 01;04;38;08
Aaron
You know, like, everybody's like, no more Marvel, please, please.

01;04;38;15 - 01;05;04;28
Chris
Well, it's funny, because the lesson Disney learned from that was not "no more Marvel." They learned "less frequent Marvel, but better Marvel." Their response wasn't to say, oh, we've had too much of this. The response was: we have to meet or improve the quality of this, but keep forcing it down people's throats. And those are the entrenched interests of systems.

01;05;05;01 - 01;05;31;04
Chris
But that doesn't get to your point, and this is kind of sliding to the side there, but I think it goes back to Mat's thing, right, where the new terrain for us is networked interaction: surfing the internet tsunami, as the quote goes. And so in a way the media is just the raw material that propels us on the surf.

01;05;31;04 - 01;05;41;04
Chris
But our focus, our novelty, is riding from station to station, hooking up with groups of people, geeking out on shit, and then hopping over to the next thing.

01;05;41;06 - 01;06;04;21
Aaron
Yeah, you know, that made me just think: maybe it really is the algorithms of social media, Chris, that are kind of killing things. There isn't the hyperlinking exploration that we had in the 90s and 2000s, or that same type of interaction, right, the kind of social linking that you got on early social media platforms.

01;06;04;21 - 01;06;27;13
Aaron
Now it's this very manicured, constrained system of social media that's just choking off new forms of thought or innovation. It's almost like these algorithms are taxing us, or really limiting the plane of exploration and all these other beneficial things that happen when we bounce into each other. I mean, even on Twitter, right?

01;06;27;13 - 01;06;47;08
Aaron
It's like impossible to get distribution now if you're not just retweeting something Elon may be interested in, or if it's not some, you know, brain-rot type video. We've kind of lost some of the last channels where that was happening, except maybe Reddit. I feel like there's got to be something that breaks that open again.

01;06;47;12 - 01;07;15;04
Chris
Not only have we lost that, and not only are the social media algorithms holding us in these boxes that we're just endlessly spinning around in, but look at the attempts to supplant them, like in crypto. Consumer crypto's grail, its Valhalla breakout, is to have apps that compete with existing apps. And yet none of these apps have hit, and none of these apps have worked.

01;07;15;07 - 01;07;37;01
Chris
And crypto's answer has always been: oh, we're too early, we don't have this, we don't have that. And crypto never says to itself, maybe these existing legacy apps do the job sufficiently, to the point where we can't overcome the switching costs, we can't overcome the hold they have, and we have to find something else. It's like the

01;07;37;01 - 01;08;02;11
Aaron
Tyranny of the network effect, Chris, right? The monopolistic tyranny of the network effect has costs, and everybody has explored the positive, business-building shapes of it. But the creative surplus that we got when there was a diverse, less monopolistic web ecosystem just isn't getting passed back to society at this point.

01;08;02;17 - 01;08;16;21
Aaron
We're kind of stuck with these platforms. It's nearly impossible to compete with some of the social media networks and start another one. There's just not enough space in people's minds, I guess, or timelines.

01;08;16;24 - 01;08;35;18
Chris
Yeah, it's almost like postmodernism, or the grip of modernism: how do we break all of these systems that we've created? How do we find a way forward? Because the things we've already built are just so goddamn good, right, that no one wants to step outside of them, because the costs of stepping outside of them are so great.

01;08;35;20 - 01;09;01;06
Poof
Also, too, I think you're really correct there, because when there's speed and tons of mass chaos, if it's not something truly breakthrough, it's going to be hard to break the inertia, right? The thing I keep wondering about is whether, with AI, the algorithms eventually get shattered.

01;09;01;06 - 01;09;18;00
Poof
Like, it just becomes much easier for people to spin up custom algorithmic ways to source things from different places, and that becomes something. But I don't see any of that happening, so maybe it's super theoretical. So yeah, I think you're spot on there.

01;09;18;03 - 01;09;32;05
Chris
I feel like the object, the thing, the token, is a place to build around right now while we wait for this whole distribution stagnation, you know, that logjam, to break.

01;09;32;07 - 01;09;59;22
Pri
Yeah. I'm just thinking, because it's really hard to compete with the existing platforms; there's already network effects. So I'm thinking about my own user behavior. I'm not really inclined to go to another social networking platform; it's just too much to manage, you know, between even the messaging platforms, everything from Signal to Telegram to Twitter DMs to WhatsApp to my iMessage. Each one of those is, in its way,

01;09;59;22 - 01;10;21;24
Pri
its own social media. Plus then you have Twitter, Facebook, Instagram; the list goes on, and you don't want to do it. And I'm thinking about how the only thing that's able to successfully create network effects in a new, novel way is tokens. I don't know if people need to build a platform around them; obviously you have Uniswap and the DEXs and exchanges, and those are kind of platforms

01;10;21;24 - 01;10;44;17
Pri
built on tokens. But I'm having a tough time thinking of an alternative, a new, novel way to proliferate a network in this day and age, that isn't a token. But I don't know if platforms are needed for tokens. And if that's the question, then it's like, do tokens just live on existing social media platforms?

01;10;44;17 - 01;11;06;05
Pri
I guess they kind of do now anyway, but that doesn't feel right either. I don't know if I'm making any sense here, but I'm trying to think of what an alternative is to this kind of weird hellscape that we're in right now, and it doesn't feel like there is one, because thinking about my own behavior, I don't want to join a new social network, but I'm happy to buy a new token that has a network tied to it.

01;11;06;08 - 01;11;33;11
Chris
Yeah, no, I hear you. You mentioned all these places you're on, and you kind of have to do that for your job. And maybe you have a more wired personality than me. Part of my thing is: how can I keep my network exposure limited enough that I can live with it and still create, and not feel this harried anxiety around all these touchpoints, you know?

01;11;33;16 - 01;12;01;20
Chris
And so what ends up happening, in my own management, is it's a matter of existing networks forcing themselves on me. Like, I'm only ever on WhatsApp when it's Little League season, because I have to be on WhatsApp during that time. I deleted Telegram because I was just terrible at Telegram, and I know I missed a lot of stuff in the crypto space, but when push came to shove, it was always Telegram

01;12;01;20 - 01;12;38;23
Chris
I was ignoring. And I finally was just like, dude, just drop it. And so I did. But there is that competition, the existing networks trying to wedge themselves into your daily habits, and then you're managing that: which of these massive conglomerates am I making my life on, versus do I want to give room for something else that has warts, that doesn't move at the speed I want, that goes through these fits and starts, you know? And I'm thinking about Farcaster right now.

01;12;38;23 - 01;12;49;01
Chris
That's the one that's trying to push itself into the world of crypto, to varying degrees of success based on what you get back from it.

01;12;49;04 - 01;12;50;06
Pri
Yeah, completely.

01;12;50;09 - 01;12;53;08
Chris
Man. We have gone far and wide.

01;12;53;10 - 01;12;53;24
Pri
You got.

01;12;53;24 - 01;12;54;23
Aaron
Everywhere.

01;12;54;26 - 01;12;56;21
Pri
I'm like I'm like to.

01;12;56;24 - 01;13;17;25
Aaron
Just to recap real quick: we're living in a simulation, or we may be living in a simulation. Poof is building the first crypto simulation machines. We've learned that Chris is from the future and has just come back in time to drop some sick albums on everybody, and that there's change afoot, guys, especially with a lot of this AI stuff.

01;13;17;28 - 01;13;26;28
Chris
I mean, we've also learned that, like, Pri needs 311, DMB, Jack Johnson, all that. Like, that's a happy station.

01;13;27;00 - 01;13;30;11
Aaron
We all need a little happy space. I hope we all get happy spaces.

01;13;30;16 - 01;13;37;28
Chris
I mean, Sublime, right? Sublime, the happy space. Which is hilarious when you think of who they were and how they lived.

01;13;38;00 - 01;14;00;11
Pri
I always think of that weird sun. I had a t-shirt that said Sublime with that, like, sun, and I would cut the arms off of it. It was the times. Anyway, so yeah, I guess, should we wrap it? I'm excited for you, Poof. I feel like what you're doing is actually creative; it uses crypto in a way that makes sense, and it's actually creating a fresh, new, interesting social network in many ways.

01;14;00;11 - 01;14;08;00
Pri
And I don't know, I think I'm excited about everything that you guys are putting out. I feel like it feels different and unique and really well thought out.

01;14;08;02 - 01;14;18;12
Poof
Yeah, thanks. I'm pumped, and thanks for having me and all the support, always, from you all. But yeah, crazy experiments coming. 5/13 it opens up.

01;14;18;19 - 01;14;29;18
Chris
5/13. You're on Base. The terminal is the technology, DXRG AI. What else have we got to get in?

01;14;29;18 - 01;14;39;11
Poof
We could do the catchphrase: the largest autonomous economy simulation ever. Clock in, crash out. Terms.

01;14;39;14 - 01;14;48;01
Chris
Bam! There we go. Hey, before we go, before we go: what's going on with Chaz Conklin? When's the terminal boss going to set him free?

01;14;48;03 - 01;15;02;24
Poof
I mean, I think we're going to free him. So follow the Chaz Conklin alter-ego Twitter, there are grand plans. He's going to go crash out himself, actually, on 5/13, and drive the story along a bit.

01;15;02;26 - 01;15;03;23
Chris
I can't wait, man.

01;15;03;29 - 01;15;05;03
Pri
Let me tell you, I'm pumped.

01;15;05;10 - 01;15;07;13
Chris
Hype. Pri, introduce the show.

01;15;07;15 - 01;15;29;29
Pri
Yeah. And on that note: welcome to Net Society, a podcast where we talk culture, tech, crypto, AI, and so much more. Today we have a special guest, Poof! And so, yeah, we'll get it started. Just a quick disclaimer: these thoughts are our own and not those of our employers. So yeah, let's get it going. Sorry.

01;15;30;02 - 01;15;32;25
Pri
I like always forget to do the intro until the very end.

01;15;32;28 - 01;15;34;21
Poof
No it's incredible.

01;15;34;24 - 01;15;38;12
Chris
No, that's wonderful. That's a mark of the show now. Yeah, that's on brand.

01;15;38;14 - 01;15;40;12
Aaron
That's one of our differentiating factors.

01;15;40;12 - 01;15;43;09
Chris
You know we never introduce the show until the end.

01;15;43;12 - 01;15;44;24
Pri
Yeah literally we.

01;15;44;25 - 01;15;53;19
Aaron
Thank you so much for your time. Thanks for your brain. We're huge fans of everything you're doing, and we can't wait to see what you're cooking and get to play around with it.

01;15;53;22 - 01;15;54;11
Poof
It's your.

01;15;54;15 - 01;15;56;24
Pri
Thanks. Have a good weekend now.

01;15;56;26 - 01;16;17;02
Chris
So yeah.