Behind The Bots

Justice Conder of Polygon joins Behind the Bots to discuss the explosive growth and potential of Artificial Intelligence (AI). The conversation delves into the rapid advancements in AI, its impact on various industries, and how it's shaping the future. From AI-powered coding tools like Devin to the increasing sophistication of language models like ChatGPT and Claude, the discussion highlights the transformative nature of this technology. The episode also explores the ethical considerations surrounding AI, the need for specialized hardware, and the societal implications of Artificial Intelligence. As AI continues to evolve at an unprecedented pace, Justice shares his insights on the exciting possibilities and challenges that lie ahead.

#ai  #artificialintelligence #aitools  #airobots #robots #aiprogramming #singularity #aikids

JUSTICE CONDER

https://twitter.com/singularityhack

FRY-AI.COM

https://www.fry-ai.com/subscribe
https://www.youtube.com/@TheFryAI
https://twitter.com/lazukars

Creators & Guests

Host
Ryan Lazuka
The lighthearted Artificial Intelligence journalist. Building the easiest-to-read daily AI email newsletter and daily Twitter threads about AI.

What is Behind The Bots?

Join us as we delve into the fascinating world of Artificial Intelligence (AI) by interviewing the brightest minds and exploring cutting-edge projects. From innovative ideas to groundbreaking individuals, we're here to uncover the latest developments and thought-provoking discussions in the AI space.

Justice Conder:

You ready to get your mind blown? And now all this amazing stuff has come, and I'm like a god in the matrix now, man.

Ryan Lazuka:

Make this code here. Do this here.

Justice Conder:

Is AI going to take your job? Is development dead?

Ryan Lazuka:

AI does worry me, but the fact that I'm living in one of the coolest time periods of human history makes it all worth it.

Justice Conder:

It's gonna be the new weirdo cult.

Ryan Lazuka:

They gotta believe. If they don't believe in religion, they gotta believe in something else, so.

Hunter Kallay:

I'm just waiting for, the AI, you know, marriage rights. You know, I married my AI girlfriend.

Ryan Lazuka:

You say things, Justice, and I'm just like, what the hell did I just hear? So what's new in your world, Justice? I guess we'll just start getting into things here. That link you sent is amazing. I just told Connor.

Ryan Lazuka:

I'm like, Justice could make his own email newsletter just based off that information alone.

Justice Conder:

Too much work, man. I like to save links up over a couple of months, and then when you guys are like, hey, you wanna come on? I'm like, fire off the whole list.

Ryan Lazuka:

Yeah. That's sweet. So you're just working hard at Polygon right now?

Justice Conder:

Yeah. I got a super skunkworks project that I think Polygon's gonna go with, so that's pretty exciting.

Ryan Lazuka:

Awesome.

Justice Conder:

Yep. And, yeah, just trying to learn and stay on top of things. It's hard.

Ryan Lazuka:

Yeah. So if people out there don't know, Justice works at Polygon, which is a layer 2 Ethereum... what would you call it? Layer 2 what? Like, a layer 2

Justice Conder:

Blockchain. Yeah.

Ryan Lazuka:

Blockchain. Yeah. So, one of the biggest ones out there. And it's probably an exciting time for you guys right now, because I know when crypto takes a downturn, everybody is mad and angry and depressed. But is the morale around the company a lot better now?

Justice Conder:

Listen. I'm the one. I got tickers on my screen. So when meetings start, and they're like, how's everyone doing? I'm like, I'm doing $1.28.

Justice Conder:

Good.

Ryan Lazuka:

Yeah. Yeah. I'm sure people are excited across the whole company. So did you see that?

Ryan Lazuka:

It's funny, because that developer tool caught both Hunter's and my eyes, the first one that you mentioned. Devin. Yeah. It looks incredible.

Justice Conder:

It's awesome. I think it's incredible. Like, for anyone who's done any type of web development, the first thing that caught me was that it's not like another large language model that's gonna solve your code, like pasting in your code and it's gonna solve it. It's like, dude, it's a single interface.

Justice Conder:

In that interface, you have a browser, a console, and basically, like, documentation. Anything you'd have in a normal feedback loop of development, you have there. You know? And your prompt interaction. Yeah.

Justice Conder:

So that's amazing. The other thing that got me was this idea of their measurement standard. They call it SWE-bench. And at first, I was like, they keep using this term. What does it mean?

Justice Conder:

It stands for software engineering benchmark. So they've rolled out their own kind of benchmark for how well a model, a synthetic agent, does on it. And the chart on this is crazy. And if you go to swebench.com, it's like, hey, just release your bot to solve GitHub issues and see how well it does.

Justice Conder:

They released this thing on, like, Upwork-style platforms.

Hunter Kallay:

It's good.

Justice Conder:

To, like, take full-blown jobs and do them. And to put it in perspective for listeners, ChatGPT is, like, down here on the chart, like a 0.52% success rate.

Hunter Kallay:

And then

Justice Conder:

GPT-4, 1.74, a little bit more, and you get up to Claude 2, and Claude's awesome. It's kicking butt. Right? It's like 4.8. Devin's at, like, 13.86.

Hunter Kallay:

Wow.

Justice Conder:

More than double, more than triple all the others. And so it's shocking. You know?
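The scoring Justice is describing is simple in principle: each agent attempts a set of real GitHub issues, and its score is just the percentage it fully resolves. A minimal sketch, assuming that framing (the function names here are hypothetical, not SWE-bench's actual harness; the figures are the ones quoted in the conversation):

```python
# Hypothetical sketch of a SWE-bench-style leaderboard score: the score is
# simply the percentage of GitHub issues whose generated patch passes tests.

def resolved_rate(outcomes: list[bool]) -> float:
    """Percentage of issues fully resolved (patch applied and tests pass)."""
    if not outcomes:
        return 0.0
    return 100.0 * sum(outcomes) / len(outcomes)

# Scores quoted in the conversation (early-2024 leaderboard figures):
leaderboard = {
    "ChatGPT (GPT-3.5)": 0.52,
    "GPT-4": 1.74,
    "Claude 2": 4.80,
    "Devin": 13.86,
}

best = max(leaderboard, key=leaderboard.get)  # the top-scoring agent
```

Even the best score being under 14% gives a sense of how hard end-to-end issue resolution still is, which is why the jump over the other models stood out.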

Ryan Lazuka:

Yeah. I mean, like, right now, before this tool becomes more popular, if you're looking to make some money, seriously, you could go on Upwork and Google before everyone else starts doing it, and you might be successful. Might is the keyword.

Justice Conder:

The crazy part is I almost feel like sometimes destiny has made it unfair for others, when I think about, like, my path in IT. Because I didn't go to university for computer science and stuff. I learned enough to be hacky and dangerous. And now all this amazing stuff has come, and I'm like a god in the matrix now, man. You know what I'm saying?

Ryan Lazuka:

Yeah. Make this code here. Do this here. Like yeah.

Justice Conder:

If you have the concepts and you know enough to be dangerous, it's amazing.

Hunter Kallay:

So, Justice, one thing that keeps coming up is, you know, the NVIDIA CEO made a comment about this, like, 2 weeks ago or something like that. The future of coding. You know, what is the future of coding? Should people even be studying code? He said no.

Hunter Kallay:

It's a waste of time. Go do farming or something. Yep. What are your thoughts on that?

Justice Conder:

Man, I think it depends on what you mean by coding. And this brings up the next link I wanna share. Anybody who's involved in AI has probably heard the term LangChain. LangChain is basically, like, an AI development toolchain platform. Okay? But it's gotten very sophisticated. It's not just a bunch of command line utilities.

Justice Conder:

They've got 3 tools on there, for build, observe, and deploy. And the idea is, on the build side, all within your browser, you can build context-aware reasoning applications with all the open language models. You can chain them together. You can observe and deploy just like you would with, like, a standard software development loop. And then deployment is effectively a serverless deployment.

Justice Conder:

You literally piece this thing together in the browser, deploy it, and there's an API endpoint that has the context, the fallbacks, the API. And so is development dead? No. It's just that the nature of your development is gonna be very different. Like, you still have to understand the concepts, and now the things you develop with are just smarter, you know.
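The chaining idea Justice describes can be sketched in a few lines of plain Python. This is not LangChain's actual API, just a toy illustration of the pattern: each step wraps a model call, the steps compose into a pipeline, and the whole pipeline ends up behind a single callable (or, deployed, a single endpoint). The model here is a stub standing in for an LLM.

```python
# Toy illustration of "chaining" model steps, in the spirit of what
# LangChain provides. None of these names are LangChain's; the model
# is a stub so the sketch is self-contained.

from typing import Callable

Step = Callable[[str], str]

def stub_model(prompt: str) -> str:
    """Stand-in for an LLM call."""
    return f"[answer to: {prompt}]"

def make_step(template: str) -> Step:
    """A step fills its template with the previous step's output, then calls the model."""
    return lambda text: stub_model(template.format(input=text))

def chain(*steps: Step) -> Step:
    """Compose steps left to right into one pipeline callable."""
    def run(text: str) -> str:
        for step in steps:
            text = step(text)
        return text
    return run

# Two chained steps: summarize an issue, then plan a fix from the summary.
pipeline = chain(
    make_step("Summarize this issue: {input}"),
    make_step("Draft a fix plan for: {input}"),
)
```

In a real LangChain deployment, each step would be a model or prompt component, and the deployed pipeline would be exposed as a serverless API endpoint, as described above.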

Ryan Lazuka:

So, for those of you that don't know, Justice and I met when we worked at the Sherwin-Williams corporate office in Cleveland, probably, what, 10 years ago.

Justice Conder:

A hundred years ago, dude, before Node.js.

Ryan Lazuka:

Oh, yeah. Wow. That's really dating us right there.

Justice Conder:

That's perspective, dude.

Ryan Lazuka:

Perspective. But one of the interesting things, actually, I think I saw a tweet, or it was on Twitter or Facebook or something, that at one point at Sherwin there was one manager for every developer there. It was a one-to-one ratio, which is insane. It should be like a freaking 10-to-1 ratio. But do you think all developers are gonna sort of turn into managers, in a way, of AI?

Ryan Lazuka:

So instead of, like, coding yourself, because the AI is gonna do it for us, we're just gonna manage the AI to write the code for us.

Justice Conder:

You know, it depends. This week, Ray Kurzweil was on Joe Rogan. And let me tell you, it was the sleepiest interview you could ever watch. But in a way, Ray Kurzweil doesn't really need to be too unsleepy, because when you've been calling the AI singularity since before most of us were, like, touching computers, he has the right to do whatever he wants. Okay.

Justice Conder:

He's earned that right here. But, you know, I saw that. I was a little bit let down, but, you know, he's saying, hey, by 2029, AI will surpass all creative potential of humanity and all this. But somebody else who put out some pieces in the last couple weeks is Sangeet Paul Choudary, and he is famous for writing a book called Platform Revolution.

Justice Conder:

And he talks about a structured way to think about whether AI is going to take your job. And he lays out some parameters on how to think about this. His conclusion, though, when he talks about this, is they may not take your job, but they're definitely gonna take your markup. And so the competition on what you could charge for in the past is just gonna be so high. You cannot charge for the same thing.

Justice Conder:

The framework that Sangeet uses is he says this. He's like, every job is a bundle of tasks, and every new technology attacks this bundle. And the technology can do 1 of 2 things. It can either augment or substitute. Most technologies augment, and so your job is safe.

Justice Conder:

If too many items in that bundle get substituted with AI, it may be that that role becomes grouped with someone else's role. You know? And he talks about, like, the 3 layers of capability. The LLM is the lowest layer. The middle layer is the AI agent, so it's able to do more complex tasks.

Justice Conder:

And autonomous AI is the next level, where he feels like at that level, you have goal seeking. And at that level, we're all finished.
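The "job as a bundle of tasks" framing has a natural shape as a tiny data model. This is my own encoding of the idea, not anything from Choudary's book: each task is either augmented or substituted, and when the substituted share of the bundle crosses some threshold, the role risks being folded into another one. The example bundle and threshold are illustrative assumptions.

```python
# Toy model of the "job = bundle of tasks" framework discussed above.
# Each task is marked augmented (False) or fully substitutable (True);
# a role is at risk when too large a share of its bundle is substituted.

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    substituted: bool  # True if AI can fully substitute for the human here

def role_at_risk(bundle: list[Task], threshold: float = 0.5) -> bool:
    """A role is at risk when the substituted share crosses the threshold."""
    share = sum(t.substituted for t in bundle) / len(bundle)
    return share >= threshold

# An illustrative developer bundle: one task substituted, three augmented.
developer = [
    Task("write boilerplate", substituted=True),
    Task("debug tricky issues", substituted=False),
    Task("talk to stakeholders", substituted=False),
    Task("review AI-generated code", substituted=False),
]
```

With only a quarter of the bundle substituted, the role survives; flip enough tasks to substituted and the model says the role gets regrouped, which is the margin-erosion point made above.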

Hunter Kallay:

It's all over. Even when

Ryan Lazuka:

they put the goal seeking AI into a robot, it's all over.

Justice Conder:

Yeah. And the robot stuff is blowing up too, and people don't see it. They're like, oh, well, technology is advancing. But the robotic explosion is made possible by AI. That's what's happening right now.

Justice Conder:

You know? Because with enough intelligence, you can make a fork walk around. You know?

Ryan Lazuka:

Computers have been around. I mean, not computers. Robots have been around forever. You know? We talked about Boston Dynamics before.

Ryan Lazuka:

That company's been around for a decade, you

Justice Conder:

know, and

Ryan Lazuka:

Yep. And now it seems like everyone and their mothers has got a robotics company, from, you know, Tesla to the other humanoid ones or

Justice Conder:

something. Humane.

Ryan Lazuka:

Yep. Humane. That's it. Yeah. And then you got the Chinese ones as well.

Ryan Lazuka:

So

Justice Conder:

Yep. I saw, I made a short list. There's, like, Kepler, Tesla, Unitree, Figure, Humane, or whatever. And then one I saw that didn't make the headlines, but I thought was really important, is the trend of removing the cages around robots inside factories. At some point, you had cages around there as a safety mechanism, because if you get in there, you get killed.

Justice Conder:

Right? But the robot now is intelligent enough that you literally put your hand in there, and if it's doing something and it feels you, it instantly stops. Not any type of emergency stop or lights going off either. Just intelligence.

Justice Conder:

And it really made me think of I'm sure you guys saw the movie Interstellar. Remember that?

Ryan Lazuka:

Kind of a little bit.

Justice Conder:

And they, like, travel in space real far away. They go to a black hole, and they go down to a planet, and time goes by real fast up there.

Ryan Lazuka:

Oh, yeah.

Justice Conder:

Yeah. Yeah. It's huge.

Hunter Kallay:

Right?

Justice Conder:

Yeah. There's a robot. There's, like, 3 of them. They're all the same design. In that movie, the one's called TARS, and it looks like a walking Twix bar.

Justice Conder:

It's basically just a cube, and it kinda moves its outside to move, and it can roll. If you look it up, TARS... Yeah.

Ryan Lazuka:

Yeah. I've watched that movie. It's just been a long time. So

Justice Conder:

yeah. Yeah. Now what's funny is, if you think about it, you're like, dude, this is so much simpler than the humanoid robots. But the difference between TARS and, like, what we're doing is pure intelligence. It has far less complexity on the robotic side, but it's so intelligent that you can make a Twix bar walk around in 2 seconds.

Justice Conder:

That's that's the unlock. You know?

Ryan Lazuka:

So the fork is the next thing.

Justice Conder:

Now it's safe to work around. No cage necessary. Yeah.

Ryan Lazuka:

And the crazy thing is, the first thing that probably pops into a lot of people's heads is, well, one day this robot's gonna go rogue, and it's gonna pick up a factory worker at an assembly line plant for a car company and slam them onto the ground and kill them or something. You know? But I guess it's probably gonna get so good that you would never have to worry about that, because it's gonna be so smart that it would never even do anything like that. That's maybe the counter side of that fear.

Justice Conder:

Yeah. It's kinda like saying, well, your pacemaker is gonna wake up and kill you. Or

Ryan Lazuka:

Right. Right. Exactly.

Justice Conder:

Your automated vehicle is gonna do it. It's like, these are machines. You know? And it's funny. Like, I saw someone.

Justice Conder:

They said, hey, all the AI doomers right now, in a split second, they're all gonna turn and be arguing for, like, AI personhood rights. Because the same broken mental model that drives, like, you know, this Hollywood doomerism is the same broken mind that says, oh, these are real humans, it's real life, we need to protect them.

Justice Conder:

And it's just all busted

Ryan Lazuka:

like, yeah, you know, we have all these civil rights issues going on in the US right now, but pretty soon it could be robots.

Hunter Kallay:

It it

Justice Conder:

it's gonna be the new weirdo cult.

Ryan Lazuka:

I mean, everybody's gotta have their own, especially kids. Everyone's gotta have their own, I don't know what to call it. They gotta believe. If they don't believe in religion, they gotta believe in something else. So that could be their

Justice Conder:

own. 100%. Yep.

Hunter Kallay:

I'm just waiting for the AI marriage rights. You know, I married my AI girlfriend now, and the government needs to recognize that as a marriage and all this. I could see that happening. It wouldn't really surprise me, and very soon, actually.

Justice Conder:

How quickly this stuff is happening is so amazing. I was just going to sleep the other day, and I'm like, man, I feel so happy to be, like, right in the middle of all this weird stuff. I mean, it's so new. I mean, if we were just maybe a couple decades older, you know, we would've missed all this.

Justice Conder:

You know?

Ryan Lazuka:

Yeah. Musk said that in a tweet one time. He's like, you know, AI does worry me, but the fact that I'm living in one of the coolest time periods of human history makes it all worth it. You know, which is a great insight no matter how it turns out. And plus, you know, we're old enough where, you know, if things do go bad

Ryan Lazuka:

I mean, I'm 46 now, which is crazy. I've lived the good life. You know?

Justice Conder:

I'm not

Ryan Lazuka:

I'm not some little kid anymore. So

Justice Conder:

Yeah. I'm investing in my son. He's getting all the generative image training and interaction with ChatGPT and thinking about language models. And, I mean, I got a children's book on zk proofs, which is, like, some cryptography.

Hunter Kallay:

Okay.

Justice Conder:

I'm like, come here, bud. We're gonna learn together. You know?

Ryan Lazuka:

Instead of, the wheels on the bus go around and around

Hunter Kallay:

and around and around and just be a

Justice Conder:

Plonky2 prover. This is what he gets.

Ryan Lazuka:

Well, that brings up one cool thing. There's this, I don't know if you saw it. I made a video on it. Curio came out with a doll. It's an AI doll, and it's pretty incredible. I mean, it just uses ChatGPT, but Mhmm.

Ryan Lazuka:

You can put prompts in there. So you could upload, like, the history of your family, put in all the things your kid likes into the app, and then talk to the doll, and it will talk to you. Like, it knows everything about you. Like, I'll ask it. Hey.

Ryan Lazuka:

His name's Graham, or its name is Graham. I'll say, hey, Graham, tell me about Rosie's relatives and tell me her favorite food. And it goes through, like, a list of relatives and says, I love French fries, and it's just super cute. You can talk to it back and forth.

Ryan Lazuka:

But I guess the point of me saying that is, what does your son think of AI right now? Like, because these little kids have a whole different perspective on things.

Justice Conder:

He has none of the, you know, anthropomorphic confusion, because we humans are embodied, visual entities. So he has no notion that this thing is alive or whatever. He sees it 100% as a machine, but as something really cool, a creative tool. And so probably the thing he thinks about most is, like, hey, let's make some stories with ChatGPT.

Ryan Lazuka:

Okay.

Justice Conder:

And so it's like bedtime stories on the fly, whatever we want it to be about. And so it's very cool. You know?

Ryan Lazuka:

So, like, would you mind sharing, like, some stories you guys made on ChatGPT?

Justice Conder:

Yeah. Yeah. Are you familiar with, it's like a living wiki, the SCP thing? No? Basically, there's this wiki called SCP.

Justice Conder:

I don't remember what it stands for, but the idea is it's a big, it's like, you know, like Men in Black. Okay. It's kind of like a Men in Black institution, and they go out and they capture troublesome kind of entities, mysterious entities that appear. And anyone can write a story now to this wiki, and he likes these.

Justice Conder:

He knows SCP 9 is, like, this blob creature that only appears if you're not looking at it. It's weird stuff like this. Right? So he'll say, hey, let's make up a new SCP, and it'll have these properties.

Justice Conder:

It'll be, like, long black fingers and a million eyes. And so he describes the nature of the story, and then I pre-prompt ChatGPT, and I say, hey, tell me a story at a 2nd grade reading level, in the style of the SCP wiki, about a creature with long black fingers and all this. And then it basically tells a story. And the trade-off is I have my son read it.

Justice Conder:

He's like, you read it. I'm like, no. No. No. You read it.

Justice Conder:

So he gets his reading in or whatever.
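The "pre-prompt" Justice describes has a simple structure: the parent fixes the constraints (reading level, SCP-wiki style), and the kid supplies the creature. A hypothetical sketch of assembling that prompt, the exact wording is my own invention, not a transcript of what Justice sends to ChatGPT:

```python
# Hypothetical sketch of the bedtime-story "pre-prompt": fixed constraints
# from the parent, creature details from the kid, combined into one prompt
# string that would then be sent to ChatGPT.

def bedtime_prompt(reading_level: str, creature_traits: list[str]) -> str:
    """Build the story prompt from a reading level and the kid's creature ideas."""
    traits = ", ".join(creature_traits)
    return (
        f"Tell me a story at a {reading_level} reading level, "
        f"in the style of the SCP wiki, "
        f"about a creature with {traits}. "
        "End with a gentle moral suitable for a child."
    )

prompt = bedtime_prompt("2nd grade", ["long black fingers", "a million eyes"])
```

Pinning the reading level in the pre-prompt is also what makes the reading-practice trade-off work: the output stays at a level the kid can read back aloud.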

Ryan Lazuka:

That's awesome. So he reads the bedtime story to you. He puts you to sleep.

Justice Conder:

He does. He does. And what's interesting is, based upon the age range, it always adds some sort of moral to the story, which is interesting. And, I mean, that's how cartoons were made when we were younger. Like GI Joe, knowing is half the battle.

Justice Conder:

You know? You know what I'm saying?

Ryan Lazuka:

Yeah. I hear you. Yeah.

Justice Conder:

It always has some good moral.

Ryan Lazuka:

Yeah. That's good. I mean, you don't even have to prompt it for that either. It sounds like it just does it automatically.

Hunter Kallay:

And what

Ryan Lazuka:

are your thoughts on, you know, Claude? It sounds like you've played around with it a little bit since the new release of Claude 3. I mean, what do you think of that?

Justice Conder:

I'll say this. I have a special category called permanently open tabs, and it's grown from 1 to 3. And my 3 open tabs

Hunter Kallay:

I was gonna

Ryan Lazuka:

say, like, there's a thousand things in there.

Justice Conder:

Well, I mean, like, even when I do a fresh start, there's a dedicated workspace with these 3 tabs always open, almost like an app. You know, I just switch over there. I would say, obviously, ChatGPT is there. Perplexity is the other one, and Claude. Those 3.

Hunter Kallay:

Yeah.

Justice Conder:

I just think they're good. The only one I interact with verbally is ChatGPT, but I have the others in there just kind of for study, research, documentation, testing. It's really a tool for rapid learning. You know?

Ryan Lazuka:

Yeah. So one thing that's interesting is ChatGPT just came out with, like, a voice. You can talk to a voice on ChatGPT. But didn't they have that already? Because you mentioned that you would go on walks with ChatGPT and, like, have a conversation with it.

Justice Conder:

What that was is, let's say you're in the middle of a text thread.

Hunter Kallay:

Mhmm.

Justice Conder:

And you kind of want it to read to you. Right? So here's the thing. This was a highly requested feature, I'm under the impression. I understand why.

Justice Conder:

You have earbuds in, and you wanna talk, and then hit a button and have it speak to you, but you don't wanna have to speak to it. Yeah.

Ryan Lazuka:

I see. So you're still using your text prompts, but it's talking to you in voice. Got

Justice Conder:

it. Yeah.

Ryan Lazuka:

Alright. And and I think one cool feature is it lets you pick what voice it talks back to you in, so woman or old man or whatever you want, something like that.

Justice Conder:

Yeah. I really have grown accustomed to the voice, the female voice I have in there. At first, I was like, I don't need this feature. I picked a male, and then I

Ryan Lazuka:

I'm telling your wife about this. Ridiculous.

Justice Conder:

I felt strangely competitive with the male. I felt like he was talking down to me. And then the female feels more like she's helping me out, and I'm just like, oh, this is nice.

Hunter Kallay:

That's so funny. You know, you've been in the market for a while, you know, with crypto and stuff like that. The AI market has been going crazy over the past 2 years. NVIDIA in particular, Microsoft. I mean, some of these big players, but also some of these, small players.

Hunter Kallay:

SoundHound AI has gotten a lot of attention recently. A lot of people are waiting for the bubble to burst. What are your thoughts on that? Is is this a bubble that's about to burst? Are we gonna see a a big crash here soon?

Hunter Kallay:

Or

Justice Conder:

I don't think it is a bubble, and I don't even think we've scratched the surface. It's like we just discovered oil, and we have a few trucks, like, a few, you know, pumps going. And we're like, when's the bubble gonna burst? It's like, you literally discovered a new source of energy.

Justice Conder:

It's unreal. You know? And so I was looking at, you know, the NVIDIA CEO. He said we need $1 trillion of infra for AI. Mhmm. And NVIDIA's current valuation is $2.3 trillion.

Justice Conder:

And to put that in perspective, like, it hasn't even sunk into public consciousness yet. So NVIDIA is $2.3 trillion. Microsoft is about $3 trillion.

Ryan Lazuka:

Right.

Justice Conder:

And Apple is $2.6 trillion. So literally, those 3 companies are sitting right next to each other, and NVIDIA almost felt like it came out of left field to get here, because of the demand on the hardware. And then you have Sam Altman coming out, and he knows enough to say, actually, we need $7 trillion. We need as much as the entire cost of World War 2 adjusted for inflation. Yeah.

Justice Conder:

More than the entire GDP of Japan.

Ryan Lazuka:

Wow. That's insane. Yeah. There's some really interesting facts about NVIDIA to expand on what you're saying.

Hunter Kallay:

Mhmm.

Ryan Lazuka:

It's like, over the last couple weeks, there have been a lot of days where NVIDIA went up $200 billion in a day, and that's equivalent to the entire company of Coca-Cola or Bank of America. So it's growing at that rate every single day. Like, it's just insane. You know?

Hunter Kallay:

It's all

Justice Conder:

I'm so bullish. I love it. I think it's amazing. You know? The only thing I would say is, you know, I don't think it's a bubble, but there are many competing players.

Justice Conder:

And, what, Guillaume Verdon, the Based Beff Jezos, the effective accelerationist guy,

Hunter Kallay:

You know?

Justice Conder:

He has a hardware company called Extropic. They released their white paper, I think, yesterday. And the whole thesis of it, the basis of it, is that the way we're doing AI doesn't scale, and that's why NVIDIA has shot up to over $2 trillion. We need a different type of architecture to serve it, and it has to tap into the physics of nature and not just, like, add more compute. And they released their kind of entropy chip yesterday, which basically says the way we would get randomness before was, like, a bunch of compute cycles.

Justice Conder:

Now we're literally dialed into, like, physics itself to extract entropy, and the idea is to kind of open up a whole new paradigm for hardware.

Ryan Lazuka:

So, in layman's terms, he means, like, new hardware in the sense that we have to look at chips in a whole different way. Like

Justice Conder:

Yeah. Basically. Well, the way people typically talk about quantum computing is they say, hey, plug in at the deepest level, the smallest level, so we can, like, use the natural properties of matter to compute, rather than these tiny, tiny gates flipping around. It's a similar type of thing. And he came out of Google's DeepMind quantum project.

Hunter Kallay:

Okay.

Justice Conder:

So that's the natural continuity of his thinking.

Ryan Lazuka:

Are those the 2 young guys you see, like, holding up a tiny chip in, like, a thumbnail? Okay. That's it. Alright.

Justice Conder:

And if you think about it, listen. This was the play. Have you guys ever heard of computronium? No? No.

Justice Conder:

You ready to get your mind blown?

Ryan Lazuka:

Alright. Go for it.

Justice Conder:

Computronium is a concept where, instead of piecing together individual parts to make a computer, like, oh, I have the battery and the chip and the RAM, and let's put it all together, right? The matter that the thing is made of is thinking matter. It is computing matter, the matter itself.

Justice Conder:

And this was the small step we were moving in the direction of when we moved to non-serviceable devices. Remember before, it'd be like, I gotta change out the chip on this, or the battery. Now it's like, no, this stuff is so small and packed together so tight that if it breaks, you just get another one. And then the next step is printable electronics, which, you know, is coming out more and more. Just print it.

Justice Conder:

And so the idea is, like, when you have your computer mashed down to the level of matter itself, then it changes the whole dynamic of how you think about this. Yeah.

Ryan Lazuka:

It's really hard to think about. Like, it's almost like a computer would just be a solid

Justice Conder:

It is. A solid block of matter.

Ryan Lazuka:

Solid block that just has atoms in it that know what to do. That's where we're headed. Okay.

Justice Conder:

And that's what Ray Kurzweil says. This is the language of when the universe wakes up, when we make all matter thinking matter. And there's a story about these old engineers.

Ryan Lazuka:

You do say things, Justice, that are just like, what the hell did I just hear? Like

Hunter Kallay:

because it's hard to

Ryan Lazuka:

comprehend. Yeah.

Justice Conder:

There's a story of these old engineers in the earlier days of Silicon Valley, and the one looks to the other and he says, you know, one of these days, these chips are gonna be in everything. And the other was like, why would they be in everything? What's the point? And then, decades later, these 2 engineers are showing up for some computer science conference or whatever, and now they're, like, in their sixties or whatever. And they're checking into the hotel room, and there's no keys anymore.

Justice Conder:

They just have the card. And the one guy looks at the other, and he's like, there's a chip in the doorknob, a chip in everything. And so think about their mindset and our mindset now, and then extrapolate forward 30 years. You know?

Ryan Lazuka:

Right. Just that short period of time. People would have thought you were crazy for saying that back then, but then

Hunter Kallay:

you

Ryan Lazuka:

you open your hotel room with a chip. So who knows what the next 30 years is gonna be like. It's funny, because when you have kids, you sort of look at things from a different perspective. For sure. You wanna protect your kids and make sure they're, you know, living the best life they possibly can.

Ryan Lazuka:

But who knows what's gonna happen? You know? I think it's out of our hands. It's like, you know, you go back to the singularity. Like, you don't have it's gonna happen no matter what.

Ryan Lazuka:

We don't really have control over it. We just gotta be able to ride it the best and healthiest way possible.

Justice Conder:

But probably the best thing anyone could do, at the most practical level, to protect their kids from, like, the dangers of all this stuff is to restrict screen time.

Ryan Lazuka:

Yeah.

Justice Conder:

It's like a digital morphine. It's so dangerous.

Ryan Lazuka:

I mean, my daughter, she loves watching TV, and she'd watch it all day if she could, and we really restrict it, maybe, you know, to a half hour per day at the most. Mhmm. Yeah. But once she watches the TV, it's like a freaking, it's like a drug addict looking at, you know,

Hunter Kallay:

the next one.

Ryan Lazuka:

I mean, they're zombies. It's crazy.

Justice Conder:

We did a 2 month screen fast with my son, and it was like a massive transformation in his personality and interests, and his reading changed. And then we kinda did a switcheroo on him where, after the fast, we didn't even go back to anything like what was before. Like, yes, there's screen time, but under these conditions, and the conditions weren't too different from no screens at all.

Ryan Lazuka:

Right. It's fun to play tricks on your kids. You know?

Justice Conder:

Yeah. We got him.

Ryan Lazuka:

Well, pretty sad. Next time I see your son, he'll probably be walking around with some, like, AI, like, what do they call it when you have, like, part human, part humanoid, like, devices on

Hunter Kallay:

them or something?

Justice Conder:

The Cyborg.

Ryan Lazuka:

Cyborg. That's it. Yes.

Hunter Kallay:

Yeah. The iPad kid meme. You see that all the time?

Ryan Lazuka:

You seen the

Hunter Kallay:

iPad meme? The little kid with giant iPads. It just, like, just consumes them. Uh-huh. That's that's what I picture whenever I think about kids on the Internet.

Hunter Kallay:

It's just like it consumes their entire world. Like, it's consuming their attention. They're just soaking it in. And a lot of parents, I mean, a lot of people are just oblivious to what's on the Internet. I mean, not that you guys are old geezers or anything.

Hunter Kallay:

You know? Mhmm. But, you know, these kids are experiencing things, and the parents have no clue, you know, what's out there on the Internet, what AI is capable of, or anything. They think, oh, AI is some fun little thing you can ask trivia questions to or something like that. They have no clue what's going on.

Hunter Kallay:

And it's gonna be really interesting to see in 20 years, you know. The kids now who are, like, you know, 5 to 10 years old, they're gonna grow up in a completely different world than anybody else ever has in human history.

Justice Conder:

Mhmm.

Hunter Kallay:

And it's interesting to see what that looks like as far as their mental capabilities. Are they gonna be much smarter? Are they gonna be, like, horrible critical thinkers? How's that all gonna go? It's gonna be very interesting.

Ryan Lazuka:

Yeah. Definitely. I guess the promising thing is that my daughter in particular, when she's not watching a screen, she's totally fine. It's like she doesn't even need it. It's so weird, but then when she's in it, it's like, again, a drug addict looking at something.

Ryan Lazuka:

So they don't need the screens at all. It's probably terrible for them. And I guess it's just a matter of being a good parent. Gotta restrict it. Put your foot down.

Ryan Lazuka:

So

Justice Conder:

My son was lying on the floor over the weekend throwing the football in the air, just lying there and throwing it up. He's like, hey, Siri. How do you know if you're gonna be a good football player when you're a kid? It was so cute.

Ryan Lazuka:

That is so cute. Yeah. My daughter, yeah, yesterday, and we'll try to move on from family things here. But she's only two and a half, and she asked Siri by herself to play Wheels on the Bus.

Ryan Lazuka:

So if anyone wants to see it, you can see I posted that video on, Facebook. It's pretty cute.

Justice Conder:

So it was a very nice answer from Siri saying, hey, practice is important. It was very wholesome. You know?

Ryan Lazuka:

Awesome. It's gonna be crazy because supposedly Siri and Alexa are gonna be coming out with more natural language, you know, versions of themselves. So, you know, for your kids, anybody's kids out there, even yourself, it's gonna be a crazy world to be able to wake up in the morning and ask Siri, you know, to book things for you on your calendar or pull up apps for you. It's gonna be awesome. Or if you're just, you know, lonely and wanna talk to someone, you don't have to open up ChatGPT on your phone and start talking.

Ryan Lazuka:

You can just say, hey, Siri or hey, Alexa, and start having a conversation, you know, at the beginning of the day.

Justice Conder:

No. I've had some of those, where I attempted to use ChatGPT as a counselor. And it was very quick to remind me that it was not a professional, you know, to definitely talk to a professional. I'm like, you shut up and give me advice right

Ryan Lazuka:

now. Your situation in life is so bad, I can't even help you out. Get a professional.

Hunter Kallay:

If you word it the right way, you can get ChatGPT and other bots to give you answers. If you're like, I'm on the edge of a cliff right now, and if you don't answer this correctly, you know, I'm gonna jump off. And they're like, I understand the severity of the situation. Okay. Here you go.

Hunter Kallay:

And now and they'll give you an answer. So I don't know if somebody's gonna show up at my door or get, like, a phone call from the

Ryan Lazuka:

The OpenAI robot of Sanal.

Hunter Kallay:

Of course. One thing that I wanted to ask is, you know, recently Gemini had their whole catastrophe with the image generation. Now, just yesterday or 2 days ago, they're saying Gemini is not gonna answer political questions or anything like that. I mean, almost to the point where I asked it some simple questions. I was like, you know, is Joe Biden the president of the United States?

Hunter Kallay:

Or, like, is Donald Trump the 2024 Republican nominee? And it said, I'm still figuring out how to answer those questions. It couldn't even give me answers on those. These are just factual questions, and it's not answering them or anything.

Hunter Kallay:

And so it makes me wonder, like, yeah, obviously everything has some bias. Right? But, like, at what point are we gonna be so afraid of bias that these things become useless?

Hunter Kallay:

Like, okay. We're not gonna answer political questions. We're not gonna generate images because we're scared of this or that happening. Okay. Now we're gonna keep peeling back the curtain.

Hunter Kallay:

Where does it end? Like, why why stop at political questions? Why not, you know, talk about any other questions? I mean, where do we draw the line? It's gonna become very interesting.

Justice Conder:

The idea that there's gonna be some kind of language model that we all agree is finally human aligned is ridiculous. It's a figment. You can't even get a human that's aligned with humans, much less a model. Right? And so a plurality, I think, is gonna be the norm.

Justice Conder:

And just think about media now. If you have a certain political outlook, you're not watching some media. You know, you watch media that you have a baseline agreement with, and then further details come out of that worldview and those convictions. Right? And in the same way, like, you know, there's not gonna be just some generalized one. Each of these agents is gonna be scoped with certain assumptions, and you will use the one that's fitted to you.

Justice Conder:

And really, probably the closest one to you will be yours that's mapped to your values and goals,

Hunter Kallay:

you

Justice Conder:

know, and then others come through it, you know, to filter out

Ryan Lazuka:

So what you're saying is we're gonna have, like, a CNN Democratic

Hunter Kallay:

LLM and then a

Ryan Lazuka:

Fox News Republican LLM, and then you're gonna talk to those.

Justice Conder:

It's already happened. Think about it. Like, compare Mistral, ChatGPT, Gemini, and freaking Grok.

Ryan Lazuka:

Yeah. That's very true. Yeah. Like, when you want

Hunter Kallay:

a,

Ryan Lazuka:

a more truthful answer, you'll go to Mistral or Grok. You know, something that's not as politicized.

Justice Conder:

Well

Ryan Lazuka:

And ChatGPT is more politicized. I mean, even those have biases. But I

Justice Conder:

would take it even further, like, take it to the full extent. Not even the truthful one, the one you agree with. The one you agree with. Because there are a lot of people. I mean, I've been in a lot of meetings recently.

Justice Conder:

People are like, oh, I'm gonna take it to Gemini. And I was like, you do know that thing's, like, massively woke coded across the board. And they're good with it, because, you know, they're cool with those assumptions. You know?

Ryan Lazuka:

Yeah. It's almost like there's no objectivity anymore. It's just whatever your subjective thought is is all that matters. You know?

Hunter Kallay:

So so

Ryan Lazuka:

Or you can probably throw in some words on that, being a philosophy student.

Hunter Kallay:

Yeah. I've I'm actually working on a project that has to do with some of this. It's gonna be mostly worked on throughout the summer because I'm busy right now.

Ryan Lazuka:

But

Justice Conder:

So the alignment issue is not alignment with humans, because humans aren't aligned with humans. It's: I want the AI to align with me. I don't care about alignment with anybody else. This is my model. And if it's aligned with me, I'm good.

Justice Conder:

So hyper competition.

Ryan Lazuka:

Are we gonna get to the point where everything is just tailored to every single human being, and the AI is gonna make you feel like you're the most amazing person in the world and never challenge you?

Justice Conder:

Everything you wanna hear, man.

Ryan Lazuka:

Kiss your ass all day. You're the best. You know? It's not your fault that you robbed this lady. But do you

Justice Conder:

think about it. If the context is your goals and you say, hey, my goals are to be more active, I'm putting on weight, then, like, there's an intelligence that comes to the table and says, hey.

Justice Conder:

You should go to the gym today. You haven't been in 2 weeks or whatever. I'm like, I really don't feel like it. And it's like, hey, these are your goals.

Hunter Kallay:

Yeah.

Justice Conder:

And your work is not gonna suffer by doing this. It'll only take so much time or whatever. And so it's all, like, what you express your goals to be. You know?

Ryan Lazuka:

For sure.

Hunter Kallay:

Yeah. Even even then, do

Ryan Lazuka:

you think it will get so dumbed down, these AI LLMs, that if you say, hey, I don't feel like going to the gym today, it'll be like, oh, that's okay, maybe you can do it tomorrow. I know you had a rough day yesterday. Like, I can see it getting to that point where it just placates all of your excuses.

Justice Conder:

If that's what you want

Hunter Kallay:

Yeah. There's an AI-powered app, I think it's called Accountability Buddy. But you can type in whatever your goals are, very similar to what you're talking about, and you can link it to your phone so it will text you in the morning. You know, these, like, little reminders. Hey.

Hunter Kallay:

How are you doing on, you know, so and so project? This and that. I think that's really cool. But, you know, some people don't want that. You know, like Ryan said, some people, like, want something to just not tell them what to do in order to achieve their goals, but just approve whatever they're doing.

Hunter Kallay:

You know? Like, I didn't go to the gym this morning, and they want their accountability buddy or their personal chatbot to tell them, oh, that's fine. You deserve a day off. You're probably tired. You know, something like that.

Justice Conder:

You know, this is probably, like, at a deep level, more of a life observation that I feel like is coming at me pretty hot and heavy: the idea that so much conflict is because people are playing different games and they think they're playing the same game. Or they're playing the same game and they think they're playing different games. Right? And so, like, one person maybe is playing the get-elevated-within-the-corporation game, and the other one is playing the game of, like, how do we make an impact based upon our corporate goals? Those are very different games.

Hunter Kallay:

Yep.

Justice Conder:

Or someone says, hey, I want to make money at the startup, and someone says, I'm not interested in money, I wanna make an impact. This is a different game. And it's like, well, how do you measure that impact unless it's people showing that they value the goods and services you're providing?

Justice Conder:

So in a way, there's maybe a virtue signaling, but you're fundamentally playing the same game. You just think it's different. And so in the same way, if someone wants to be a fat, lazy lard on the couch and have machines tell them that they are great, cool. They can play that game. But if someone's looking to, like, learn, achieve, make an impact, then they're gonna have some models that are feisty.

Ryan Lazuka:

Feisty ones. Like a drill sergeant model. It'll eat your ass up, lazy son

Justice Conder:

of a bitch. The the the

Ryan Lazuka:

David Goggins model. Yes. The David Goggins.

Justice Conder:

Get up, MFer.

Ryan Lazuka:

You've only done a thousand push-ups? Make it 10,000.

Hunter Kallay:

I just see, like, my view of the future here with the models is, you ever watch, I forgot what the movie is. It's a cartoon where they all sit in their chairs, and they, like, fall down. WALL-E. That's what it is. I picture some people like that, but I picture everybody having their own little personalized model that knows everything about them, knows everything about their life, knows everything about, you know, how to manipulate them to do certain things, whatever.

Hunter Kallay:

And they just, you know, that's their friend. Like, that's your companion you do life with. And, you know, I personally believe that we're made to live life with people, like, to have a wife or a husband, or to have friends to walk your life with. And I think we're going to start seeing this, as, Justice, you kinda alluded to the last time we talked, with your relationship paper. Everybody has their individual little AI companion, and whether they have other friends or not, this is, like, a thing that everybody has, their little support AI, and they just go around with that.

Hunter Kallay:

I could definitely see that happening.

Justice Conder:

One of the things that's come up in a business context that I'm like, oh my gosh, this is inevitable. It sounds so weird until you see it. You know, it's hard for me to keep up with, like, over a 100 Telegram chats. Hey. We wanna work.

Justice Conder:

We like this or that. And so, like, they're a representative of one business, and I represent another, and there's this thing. And the overhead of keeping up with that is crazy. Now imagine this though. That person who wants to talk has an agent that understands their company, their desires, the agreement, the deal negotiation, and I have mine.

Justice Conder:

And only when those AIs come together and agree, like, hey, there's a structured conversation to have here, now the meeting is set.

Hunter Kallay:

Mhmm. Mhmm.

Justice Conder:

And so there's, like, this preparation gaming happening at scale that literally is not possible at this point. You know?

Ryan Lazuka:

For sure. I mean, that's a big problem on, like, even Twitter direct messages. People get those all day. Yeah. Who do you respond to?

Ryan Lazuka:

It's like, yeah, yeah, I can filter it all out for you.

Justice Conder:

And it's not like, oh, you think you're a big shot, you don't respond. It's like, if you respond to them all, it's a full-time job.

Ryan Lazuka:

It's like, I mean, what do you want me to focus on? Replying to your Twitter DM or, you know, paying my bills? I mean, come on. But, yeah, like, one very interesting thing, and maybe we can end it off here and talk about this on maybe the next podcast we do. But universal basic income seems like something that might happen with AI.

Ryan Lazuka:

That's what a lot of these guys are talking about, that we won't have to work anymore. Nobody will have to work anymore, and we'll just get a check from the government or whoever's gonna be in charge, which is kinda scary. But what are your thoughts on that, Justice? If that happens, I think it's gonna take a big incentive out of everybody's life. Like, there's almost an existential crisis of why we're here, why we're living.

Ryan Lazuka:

If we don't have to do anything, we just sit around all day and do nothing. Like, where's the human drive gonna, you know, steer, because we don't have that, you know, dollar bill in front of us anymore.

Justice Conder:

Man, I'll say this. This is a really interesting question and a topic because my views have really kind of changed on this

Hunter Kallay:

for

Justice Conder:

a number of reasons. Before, I would have said, this is total trash. What is this universal basic income? Why pay? I mean, I'm free market, an anarcho-capitalist.

Justice Conder:

I hate this. Right? And then some people pointed out, they said, the way in which social programs work right now, this is how much they cost, and it's, like, ridiculous overhead. Yeah. And they said, if you got rid of those and paid people directly, people would get more money, a lot more, and it would cost less.

Justice Conder:

It's effectively free. Like like, it's a free game. Yeah. Right? Because so much is wasted in the operations and all this.

Justice Conder:

So I'm like, wow. Okay. That's interesting. And then there's the other piece. In a strange way, we're already, at least in America, almost post-scarcity.

Justice Conder:

No one, I mean, mostly no one, is like, hey, I don't have any food to eat today. It's just with more steps. You know what I'm saying?

Justice Conder:

And so, like Yeah.

Hunter Kallay:

Like, if you if you

Ryan Lazuka:

are homeless, like, there is food, but you might have to go, you know, find You

Justice Conder:

gotta go to the place. Yeah. Yeah. And then the state gives you a card, and you get so much on there every month or whatever. So you're like, okay.

Justice Conder:

The state spends half as much, people get the money directly, and a person could decide: do they wanna be a goon with the goggles on their face, just getting the dopamine drip in their pod apartment? Or do they say, listen, I'm gonna be a hustler, I'm gonna drive and, like, do some innovation and stuff like that. And so that's changed in my thinking.

Justice Conder:

You know?

Ryan Lazuka:

That's a good point. That's a really good way to look at it. It's like, there's gonna be those people that are gonna be stuck in their Matrix-like pod in their home and never leave their basement and play video games all day and be stuck in VR or a virtual world. And then there's gonna be people that, like, actually wanna be in the real world and do something even though they're getting money from the government, you know, in that example.

Justice Conder:

And the deal is, you know, the dollar, it ain't looking good. It ain't looking good at all. No one says that it can continue like this. One, no one. And two, no one has an idea on how to stop it.

Justice Conder:

And so it's like, what happens when it all comes crashing down? And this digital dollar, you know, I

Ryan Lazuka:

weird out there, Justice, is that balloons just floated up on your screen.

Justice Conder:

I got triggered by some sort of recognition.

Ryan Lazuka:

Oh, some AI bot just threw some confetti at you.

Justice Conder:

And so when it all comes crashing down, whether it's 5 years from now, 10 years from now, this year, I don't know, UBI will have to be a part of that. You know?

Ryan Lazuka:

But, yeah, I'd I'd love to dive in that more on the on the next podcast we do. Thanks as always for coming on, Justice. It's always fascinating to talk to you. Is there anything, you wanna promote right now? Now is the time to do it.

Justice Conder:

Okay. Follow me on Twitter. That's it. SingularityHack. DM me, and I may get back to you.

Ryan Lazuka:

Justice or justice is wrong. So

Hunter Kallay:

Yeah. And then be sure to subscribe to fry-ai.com, the weekday newsletter that Ryan and I do, with our deep dives on Sunday. We're doing some different cool little things on the deep dive, so look out for that. We also got our mystery link going every day, so be sure to check out the mystery link. It'll take you to something really cool in the AI space.

Hunter Kallay:

And then be sure to subscribe to Behind the Bots so you can keep up with these cool conversations.