TBPN

This is our full interview with Alex Karp, recorded live on TBPN.

We discuss why AI could eliminate large numbers of white-collar jobs and trigger political backlash against the tech industry; how Palantir’s hybrid model of software, deployment teams, and institutional knowledge allows companies to transform operations in months rather than years; and what the United States must do to stay competitive in the AI era, from rebuilding domestic manufacturing and expanding vocational education to preparing for a world where AI development shapes geopolitics, national security, and the future of work.

TBPN is a live tech talk show hosted by John Coogan and Jordi Hays, streaming weekdays from 11 AM–2 PM PT on X and YouTube, with full episodes posted to podcast platforms immediately after.

Described by The New York Times as “Silicon Valley’s newest obsession,” TBPN has recently featured Mark Zuckerberg, Sam Altman, Mark Cuban, and Satya Nadella. 

Sign up for TBPN’s daily newsletter at TBPN.com

TBPN.com is made possible by:
Ramp - https://Ramp.com
AppLovin - https://axon.ai
Cisco - https://www.cisco.com
Cognition - https://cognition.ai
Console - https://console.com
CrowdStrike - https://crowdstrike.com
ElevenLabs - https://elevenlabs.io
Figma - https://figma.com
Fin - https://fin.ai
Gemini - https://gemini.google.com
Graphite - https://graphite.com
Gusto - https://gusto.com/tbpn
Kalshi - https://kalshi.com
Labelbox - https://labelbox.com
Lambda - https://lambda.ai
Linear - https://linear.app
MongoDB - https://mongodb.com
NYSE - https://nyse.com
Okta - https://www.okta.com
Phantom - https://phantom.com/cash
Plaid - https://plaid.com
Public - https://public.com
Railway - https://railway.com
Restream - https://restream.io
Sentry - https://sentry.io
Shopify - https://shopify.com
Turbopuffer - https://turbopuffer.com
Vanta - https://vanta.com
Vibe - https://vibe.co

Follow TBPN:
https://TBPN.com
https://x.com/tbpn
https://open.spotify.com/show/2L6WMqY3GUPCGBD0dX6p00?si=674252d53acf4231
https://podcasts.apple.com/us/podcast/technology-brothers/id1772360235
https://www.youtube.com/@TBPNLive

What is TBPN?

TBPN is a live tech talk show hosted by John Coogan and Jordi Hays, streaming weekdays from 11 AM–2 PM PT on X and YouTube, with full episodes posted to Spotify immediately after airing.

Described by The New York Times as “Silicon Valley’s newest obsession,” TBPN has interviewed Mark Zuckerberg, Sam Altman, Mark Cuban, and Satya Nadella. Diet TBPN delivers the best moments from each episode in under 30 minutes.

Speaker 1:

Well, we are joined by Alex Karp. Welcome to the show. Thank you so

Speaker 2:

much. Good to see you.

Speaker 1:

For taking the time. You guys

Speaker 3:

would with or without hat?

Speaker 1:

Whatever is comfortable for you.

Speaker 3:

Can do hat.

Speaker 4:

Cold out here.

Speaker 3:

Is it

Speaker 2:

hat on.

Speaker 1:

It's cold. This is an average day

Speaker 4:

for you.

Speaker 1:

You're always out skiing.

Speaker 2:

You're you're cold. Done it. Yeah. We need to get we're close to getting skis. Look.

Speaker 3:

We could we let you know we were supposed to do something physical.

Speaker 1:

It's starting to stick.

Speaker 3:

Yeah. It's starting to stick. Oh, I like this vibe. Now I feel like a newscaster.

Speaker 4:

Feel like yeah. This is good. You're, like, eight feet tall. People don't realize that. Yeah.

Speaker 4:

Not standing up because it would be,

Speaker 3:

like he's, like, counteract that

Speaker 4:

for all

Speaker 3:

of us.

Speaker 4:

He's, like, he's, like, eight foot.

Speaker 1:

Last time we talked to you, I think you were doing four minutes on the dead hang. What are you up to now?

Speaker 3:

5:05. 5:05.

Speaker 1:

5:05. What about in the cold, though?

Speaker 4:

That's gonna No. No.

Speaker 3:

Sorry. We need to have a special minute for that. Yeah. Yeah. 5:05.

Speaker 3:

For those of you who haven't done a dead hang, first of all, go do it. Why is it important? Well, there are very few things that are proxy indicators that are accurate for health.

Speaker 2:

Yeah.

Speaker 3:

It's like dead hang, farmer's walk at body weight, and VO2 max are the three that count. Okay. I don't think anyone really

Speaker 1:

one-rep max bench press. That's all

Speaker 4:

I focus on.

Speaker 3:

Okay. Well, you know, it's like I feel

Speaker 1:

like I feel like as long as I have a really impressive bench press, I'll I'll live Yeah. Yeah. A short but glorious life.

Speaker 3:

Yeah. I don't know. Did they

Speaker 1:

say something about that? Yeah. Who wants to live a long Yeah. Glorious life when you can live a glorious

Speaker 3:

short life? Yeah. I think that's what they tell you before they give you a bad salary in something

Speaker 4:

like that. I was like, actually,

Speaker 3:

I think I think it's like, yeah. It's or it's like, yeah. My social life is so great. I only have a bot, but I enjoy it. Yeah.

Speaker 3:

A kind of logic. No. But no. Dead hang is important. Dead hang is crucial.

Speaker 3:

Okay. And you really need to go work on it, especially anyone watching

Speaker 1:

Okay.

Speaker 3:

Your podcast Yes. Is likely to outperform.

Speaker 1:

Yeah.

Speaker 3:

You wanna have some you know, you wanna be able to do something with that outperformance.

Speaker 4:

Like, dead hang.

Speaker 3:

Like, yeah. Well, the dead hang may be a proxy indicator for other things you could do with your outperformance. Yes.

Speaker 1:

You know? But And, you know,

Speaker 3:

not everyone is yeah. Not everyone is, like, six foot nine.

Speaker 4:

You're like, you may not need a dead hang, but the rest of us Well,

Speaker 1:

the I I can cheat because most of the pull up

Speaker 3:

bars, I can't you can't yeah. That's right.

Speaker 1:

Yeah. And I can hold forever.

Speaker 3:

Yeah. You can forever. That's right. That's right.

Speaker 1:

But but when you're not dead hanging, what should people be doing with the new coding agents? How important is it to learn to code? How important is it to

Speaker 3:

Look. Everybody's worried about, like, their future, but there are basically two ways to know you have a future. One, you have some vocational training. Okay. Or two, you're neurodivergent.

Speaker 3:

And and I when I say neurodivergent, I mean broadly defined.

Speaker 4:

Like,

Speaker 3:

you guys are sitting here. You coulda had a corporate tool job. Yeah. You coulda been, like, I don't wanna pick on Goldman, but, like, just say, you know, like a job.

Speaker 1:

I applied there. They turned me down.

Speaker 4:

But yeah. See, I I did work at whatever. You coulda had a job where you're, like, You try to fail. It's like, yeah.

Speaker 3:

No. Actually, maybe they didn't know the right way to test. Yeah. Like, it's like yeah. They were like like, you know hey.

Speaker 3:

Whatever. I'm not picking on one or the other. I'm just saying, you know, they're like, you're probably here you think you're here because actually, you probably wouldn't have been able to do that shit, because it's the same thing as sit down in class and learn some bullshit. Mhmm.

Speaker 3:

Like, and you just regurgitate it. Mhmm. Like, that's not a valuable thing.

Speaker 4:

Mhmm.

Speaker 3:

If you actually have insights into anything and you have real technical expertise. Yeah. Like, you know, you can look at a company, but you actually can look at it because you know something about how these things work or something about how clients work. You know, then all the other stuff that used to be precious, like being able to do low-end coding Mhmm. Being able to do low-end lawyering, being able to do low-end reading and writing.

Speaker 3:

I mean, god, this is like I feel like Odin came down and was like, I'm gonna make the world just right for dyslexics.

Speaker 4:

It's like it's like, yeah. I've Odin has come down

Speaker 3:

from Valhalla and said, you know what, Karp? You suffered so much as a kid. Yeah. I'm just gonna remake the whole world so everyone else could suffer.

Speaker 3:

I don't want that. And it's like now but it's really an inversion. Like, everybody with, like, the normal-shaped skills are the dyslexics now. Because, like

Speaker 4:

Yeah.

Speaker 3:

Meaning, the thing they can do that used to be valuable is not so valuable. The thing they need to learn to do is, like, be more of an artist, look at things from a different direction, be able to build something unique. Mhmm. I think and you see this on the battlefield, like, one of the most underappreciated things about fighting a war I mean, there are basic core things civilizations do, like build technology for war Mhmm. is every society fights differently.

Speaker 3:

Every component of the society fights differently. And when, like, Americans act as allies, we do not even approach these problems in the same way. What makes America more lethal than any other country currently is, like, a combination of, obviously, the technology, which we're super interested in and believe is playing a huge part. But it's, like, twenty years of, like, operators figuring out what worked, not what worked in a manual.

Speaker 4:

Mhmm.

Speaker 3:

Like, what worked in reality. Also, selection. If you look at the selection of people, like, you meet, like, tier one operators Mhmm. They don't look anything like what people would think. Yeah.

Speaker 3:

It's not like the movies. They're like these, like, these they're like this big and like Yeah. You know, it's like because we have Yeah. We have we have we have specialized ways of doing that. All of that is crazy valuable.

Speaker 3:

As a proxy indicator, people who are getting their news from you are just likely to massively outperform. Mhmm. People who are getting their news from something that's a regurgitation of you gotta vote for one party or the other. Sure. And then the real problem we have in society is not your listeners or Palantir's customers or our partners.

Speaker 3:

It's like, well, what happens to everyone else? Are they gonna lynch us? Mhmm. Because, like, that's the real problem. Like, these products, like, what we're building, like, our agents mean that, like I mean, the most powerful people in the Democratic Party are highly educated female voters.

Speaker 3:

And these technologies, like, they love I mean, like, yeah, I actually get along with all these people in private, there's a public dispute. But, like, largely, I've talked to Dario over and over, and it's like, yeah. You love one company because they're not pro Trump. That company's taking your job. How are you gonna feel about that company when you find out you have no job?

Speaker 3:

Mhmm. What what what do you think the the Republican Party's gonna do to products that do not support our military? What do you think the Democratic Party's gonna do to proxy even if you're voting for them that are taking away the jobs of every one of your constituents

Speaker 4:

Mhmm.

Speaker 3:

And saying, oh, people are gonna love you so much, and you're gonna be poor. Mhmm. By the way, we love you so much. We're gonna give you a little handout once a month.

Speaker 1:

Yeah. Yeah. So apply that to the SaaSpocalypse narrative, the enterprise SaaS. There's an idea that the first jobs that will be taken might be the enterprise software products that exist, haven't really innovated, have lock-in, and they're gonna be replatformed. Yeah.

Speaker 3:

The thing that these technologies do is they also make it harder to lie, and this is partly what's happening in the political realm. If something's not creating value or something's not working or there's corruption, you can't lie about it. Mhmm. And nobody believes that all software companies actually create value. Mhmm.

Speaker 3:

I mean, the famous thing that we all learned and that we rejected, the thing people taught you, is, like, your software company is supposed to give the client the feeling they're getting laid while they're getting fucked.

Speaker 1:

Yeah.

Speaker 3:

Now, if that's how your products actually are, you are gonna get fucked. Mhmm. Like and this is gonna happen so quickly. Mhmm. And the simple test for people who are looking at this is, does this product or in our case, we were never pure software.

Speaker 3:

We're actually like a hybrid of like

Speaker 1:

Yep. Humans Yep.

Speaker 3:

FDEs, augmented humans, so AI FDEs

Speaker 2:

Yeah.

Speaker 3:

And then orchestration, and then essentially what we would call primitives, like

Speaker 1:

Yeah.

Speaker 3:

Taking the tribal knowledge of the institution, coding it into logic, and then using that to be extended in LLMs. Okay. But we don't have to explain that to our clients. Yeah. You know?

Speaker 1:

So there was always this narrative: Palantir is a consulting firm. Is it all just people? Is there any real software? I feel like that narrative went away. Oh, no.

Speaker 3:

No. They they they couldn't invest in us because we were a services company. Exactly.

Speaker 4:

And now now it's like and now But

Speaker 1:

but but is is is actually having that service

Speaker 4:

Oh, no. Of course.

Speaker 3:

Better in the future. No. No. It's not better.

Speaker 1:

It's Underrated. It's crucial. It's crucial. Yeah.

Speaker 3:

Yeah. Like, all these places that made fun of us Yes. They're running around and trying to get FDEs. Yes. Getting an FDE is like yeah.

Speaker 3:

It's like, yeah, it's not as easy as it sounds Yes. Because you have to know how to manage it, where to put the person, how to extract value. Yeah. And then you need all these products that augment the FDE. Yeah.

Speaker 3:

What are those products? Ontology. Yep. Foundry. Yep.

Speaker 3:

FDE AI things that we've built. Yeah. What we the being able

Speaker 1:

So so the value of the business is not a monolithic code base that never changes. It is the people. It is the deployment. It is the relationships. Is that how you're thinking about the business these days?

Speaker 3:

Well, actually, the way I I'm telling you, like, when you walk around here Yeah. They only care they don't care about any of that. What they care about is you transform my business in Yep. In three months. Yep.

Speaker 3:

It would have taken three years, i.e., it would never have happened. Yep. But that's what they care about. Yep. Now there's a question of how do you do that?

Speaker 3:

And that is the concatenation. It's artistry. It's like Mhmm. Select client. Select where you would start. Select ways in which and innovate in ways they would not expect.

Speaker 3:

Mhmm. Innovate in places they do not understand you should innovate. Learn to manage these very complex by the way, it's not just culturally complex. It's tribal knowledge. And much of that tribal knowledge is in rules that they have to apply because there are all sorts of rules about manufacturing, hospitals, war, rules that are applied that they're not saying they apply Mhmm.

Speaker 3:

Laws, like, all sorts of regulatory things on top of all that. All that has to happen very rapidly. So you would need and, like, without going into details, like, I'm in the middle of, like, every single one of these discussions in almost every breakdown. It's like people do not understand how institutions work.

Speaker 1:

Yep.

Speaker 3:

They don't understand how the software would work. They don't understand how the LLM would work. They don't have the product that would actually work in that environment. And they still, at the end of the day, are not saying, we're gonna charge on value.

Speaker 1:

Okay. How do institutions work? Why is it that we get these genius models that are 160 IQ, that can solve incredible math, and they're not just, like, everywhere all the time? What is slow

Speaker 3:

The simple version is they're one sixty against a test.

Speaker 1:

Yeah.

Speaker 3:

But the test isn't the thing it's a concatenation. The simple math would be, it's one sixty on one test Mhmm. But you've gotta pass differentiated tests over a long period of time. So it's a thousand tests. Yeah. So de facto, by the fiftieth step, it's zero IQ.

Speaker 3:

Mhmm. But then there's also there's also yeah. I mean, it's like it's it's it's insane. Yeah. Yeah.

Speaker 3:

Like, I love when I hear about how all this is gonna replace everything, and then I get our clients, and they're like, could we have more? We don't even have the capacity. Yeah. It's a surreal thing.

Speaker 1:

Like Yeah.

Speaker 3:

Would you guys like to be FDEs? Because we need some help. If you're in the audience completely seriously and you're aligned, broadly speaking, with America is a great country, you don't have to agree about anything else. Yeah. And you're out there and you're technical or just smart, apply.

Speaker 3:

We need you. Okay.

Speaker 1:

America is a great country. Put aside Democrat. Put aside Republican. Is democracy the correct formulation to decide the future of AI? Should the American people be voting to decide, or should it be handled by private companies?

Speaker 3:

No. America well, it depends. Like like yeah. Great. So in the war fighting context Mhmm.

Speaker 3:

The Department of War Yes. Has to be the arbiter of what gets deployed.

Speaker 1:

Now But as a citizen, I vote for the Department of War.

Speaker 3:

Correct? Exactly. Yeah. Okay. No.

Speaker 3:

Okay. But I'm just saying so I wanna split Yes. Domestic and foreign. Because, like, we in this country have God-given rights, literally given to us by a higher being. There's a right of free expression,

Speaker 4:

which Mhmm.

Speaker 3:

We're exercising all the time. Yeah. And it's very important to us. There's a second amendment, which I exercise. I I shoot very well.

Speaker 3:

Yeah. I would encourage you guys and other people listening to avail yourselves of the Second Amendment. Yes. It is there to protect ourselves in case the First Amendment fails. Mhmm.

Speaker 3:

That's the reason it's there. Mhmm. There's a Fourth Amendment, which is essentially we have a right to privacy. Privacy. Okay.

Speaker 3:

We have those rights. Mhmm. Adversaries trying to kill us in Iran do not have those rights.

Speaker 2:

Sure.

Speaker 3:

And I don't believe I've never believed in extending our rights to foreign countries that are adversarial to us.

Speaker 4:

Mhmm.

Speaker 3:

I don't even really believe I don't like you know, in Germany

Speaker 4:

Mhmm.

Speaker 3:

Where I lived half my life, they don't have a First Amendment. Mhmm. They don't believe it. And by the way, they've never believed in the First Amendment. They have other rights.

Speaker 3:

That's great. I'm not gonna dispute that, but I want our rights here. In this country, if you're gonna tell the American people you're building what is clearly a dangerous technology, it's dangerous because it it will likely take your job Mhmm. Especially if you're white collar. So Mhmm.

Speaker 3:

If you're voting, you know, you're a highly educated person.

Speaker 2:

Have you flipped on that in the last, like, six months or so? Because I think the last time we talked, your general mindset was like high agency, highly productive people will be able to continue to leverage the tools to deliver

Speaker 3:

value within organizations. Yeah. I think if you're neurodivergent and high agency and you're highly educated, that's great. But if you're not neurodivergent and you're, like, lawyer number 14,506, that's a problem. Mhmm.

Speaker 3:

Okay. But let me get to this, and they're linked, but it's okay. On domestic stuff, we have rights that are not subject to majority rule. Like, if the majority votes against us having Fourth Amendment rights, I want that litigated at the Supreme Court Mhmm.

Speaker 3:

Because our constitution is not about the majority. It's actually about the rights of the minority. Mhmm. And it's our right. I bet you the three of us have opinions that are very much in the minority.

Speaker 3:

Sure. That we wanna be able to say, at least in the privacy of our own home. Yeah. Right? And so there are real issues.

Speaker 3:

I'm super sympathetic with restrictions around the use of these products in a domestic context, even though it's funny. People out there, every conspiracy theorist thinks, yeah, it's insane. Conspiracy theorists, you may hate this, but there's one person protecting your right to be a conspiracy theorist who actually has a seat at the table, and that person is me. You may not wanna hear that truth, but it is fucking true. Yeah.

Speaker 3:

And maybe do a little more reading before you pontificate on your absurd and, obviously, ill formed and many times stupid opinions. Mhmm. Okay. So because, like, you're attacking the person who's protecting you, idiot. It's, like, fucking so stupid.

Speaker 3:

Do use one of the bots to correct your opinion. It's like, I'm being attacked online now. Yeah. It's like, doctor Karp is anti-progressive. My whole life, I'm only telling you the truth.

Speaker 3:

These things are gonna take your job. Okay. But in the war fighting context, the primary justification for these products has to be that there are two relevant powers now, us and China. Mhmm. This is a have, have-not world.

Speaker 3:

Mhmm. It's going to be either us or them basically deciding the world order. Because, like, these other countries maybe India will get involved, maybe the Arab and non-Arab Middle East. But currently, on the trajectory we're on now, there are two places where these things are being developed and deployed. It's us or them.

Speaker 3:

Yep. And I'm not particularly you know, I'm not I'm not out to hurt China. I'm just out to I think we should win. Yeah. I'm not trying to hurt them.

Speaker 3:

And in that context, you can't say we're not gonna do x, y, and z. I mean, I can give you examples, but, like Sure. There are datasets that are publicly available in The US market that I don't think should be used against you and me Mhmm. In a law enforcement context, much less with the help of, say, AI agents and ontology.

Speaker 1:

Yeah.

Speaker 3:

But if you don't use it on the battlefield obviously, Iran's gonna use them. You don't think they can go online and buy those products?

Speaker 4:

Yeah.

Speaker 3:

And by the way, without going into somewhat classified data, those things are, in combination with other things, lethal. Mhmm. Like, a lot of people who wanna hurt America on the battlefield end up dead

Speaker 4:

Mhmm.

Speaker 3:

Because of our ability to aggregate and then figure out what's going on in the battlefield before they can figure out what we're doing. Mhmm. And so, like, I'm very much in favor of it for moral reasons, but I'm also in favor of it because, like, I don't know how else you explain this to the American people. We're gonna take your job.

Speaker 3:

We're gonna take away you're gonna eviscerate your ability to have money and power. But we're not gonna defend you on the battlefield? It just seems like yeah. Well, you know what's actually gonna happen? Nobody believes me in tech, but there's gonna be a movement in this country that gets very strong very quickly to nationalize these things.

Speaker 3:

First, it's gonna be take away our money. The billionaires are evil. You may not have heard that. Yes. Super evil.

Speaker 3:

Yes. And if you take away their money, it'll help poor people. Yes. That's really important to understand. Yeah.

Speaker 3:

Making rich people miserable is the only way to help poor people. Yeah. That's obviously true. Yeah. And once you've learned that, the next thing you're gonna learn is we have to nationalize the technology.

Speaker 2:

They're gonna quote you on that.

Speaker 4:

Yeah. Well, it's like They

Speaker 2:

quoted you on

Speaker 1:

on So, I mean, it sounds like you're closer to Dario on, you know, potentially 50% early-stage white-collar job loss. Like, you're aware that there's a risk at least Everyone assigns a probability. You're aware of it. Well, what do you see as the solution?

Speaker 3:

Well, first, we just have to I mean, the obvious thing is, okay, we can't have any migration here. Like, how are we gonna create more jobs? Like, the problem, in fairness, not that people wanna be in the business of being fair to policy leaders, is that we are dealing with technologies that will determine the policy decisions. Mhmm.

Speaker 3:

So you can't just pretend they're not happening. Mhmm. Like, step one is, like, it's going to be hard but possible to make this society work, given that transforming it requires these technologies. Like Mhmm. I really like the people here.

Speaker 3:

They're not here because they like me. Like, maybe they're here for my jokes. Yeah. High quality in

Speaker 4:

some cases. But they it's like

Speaker 3:

a long trip, and we're here and I'm the only one who likes this weather. It's great weather.

Speaker 4:

I brought it for you.

Speaker 3:

It's but they're here because they've seen their business being transformed. Yeah. And this is happening in America more than anywhere else. So Yeah. We have to win those battles.

Speaker 3:

Yeah. But the costs are gonna be very high.

Speaker 4:

Mhmm.

Speaker 3:

And so you have to work back from, okay, the costs are gonna be very high. We can't put oil on the fire. Mhmm. It's like, well, getting jobs for all Americans is gonna be hard and maybe for people who become Americans. But it's like you have to have different policies around migration.

Speaker 3:

Mhmm. You have to have different policies around how we train people. Like, currently, if you're a young kid in high school and you're neurodivergent, they're literally chaining you to your chair and feeding you medication so you can have skills that are not valuable. Yeah. Like, it's so it's like and then we'll probably over time have to have, like, a discussion of, yeah.

Speaker 3:

If you go into this career, you're not gonna have a job. Mhmm. Like, a really honest discussion about that. These are the places where you will likely have a job. Mhmm.

Speaker 3:

And yeah. Help me What

Speaker 2:

you know, it seems like we, as a country, will probably head down a more European path where it becomes very, very difficult or near impossible to let people go. Do you think that's correct?

Speaker 3:

So you mean Germany,

Speaker 2:

it's much harder to lay someone off. I mean, that does impact the growth of companies. Mhmm. But I think many Germans would argue that's probably

Speaker 3:

Yeah. You know, Germany's I mean, I'll answer your question. Germany's an interesting place. I did this thing in German where I basically told the truth, which you're not allowed to do in Germany. It's like, you know, it's a really bad situation and the economy sucks.

Speaker 3:

The migration thing's a complete disaster, and the energy situation compounds everything. And I got thousands of people literally saying, thank god someone told the truth. And there are a lot of people like you guys, young people building things, that feel hampered and are correct to feel hampered. I think the American version, if we're not careful, is not gonna be the German version. I think it's gonna be hang the rich.

Speaker 3:

It's like, I think it's gonna be not protect everybody else. It's gonna be like, oh, look, this is too dangerous, and we're gonna hang the rich, but not really help the poor. And in fairness to the German version, like, you know, German health insurance, all that stuff, it works. Yeah.

Speaker 3:

Like, I was poor in Germany for, like, a decade. Yeah. And, like, I had the best life. Like, being poor in Germany is better than being rich here in some ways.

Speaker 1:

So basically, policies that lift the floor, reeducation, training.

Speaker 3:

So if you wanna do what we could do here Yeah. The things we could adopt from Germany are Germany has three kinds of high schools.

Speaker 1:

Yeah.

Speaker 3:

Two are vocational. Sure. One is academic.

Speaker 1:

Better education. Better programmatic Yeah.

Speaker 3:

Vocational. Vocational also has a bad, like, a weird vibe here. Like, vocational training in Germany is very technical. Like, the people building the cars at BMW Mhmm. Or, in the French version, Airbus, like, very complicated jobs.

Speaker 3:

They didn't go to college. True. They went to a very, very high end high school. Yeah. And they come out without any debt.

Speaker 3:

Yeah. And that stuff is really valuable. Yeah. So if you wanna Amazing. You have to completely transform our educational system and go very young into, like, training people to do things.

Speaker 3:

You also need to change our testing system Mhmm. Like, for different forms of intelligence. All of our tests are built around things that were valuable in the industrial revolution. Sure. Sure.

Speaker 3:

Sure. It's like, you wanna pull out all the dyslexics, all the neurodivergents. Everybody who can't sit or needs to build or wants to build has to go into a separate slot of, like, yeah, we should have gotten you before you got turned down at Goldman. And like I said, it's like, that's a waste of

Speaker 2:

your time. You could be building something important. What else would be a part of the good outcome? Well The

Speaker 3:

the the most important part of the good outcome is what we show our adversaries, you can't fuck with us. And we're the best we have the best military in the world. I hope I believe we're doing that right now. On the good outcome side, we yeah. We go around.

Speaker 3:

And then on the commercial side, we go to all this high infrastructure, you know, hospitals, manufacturing, all these things, complicated infrastructure, and we AI-enhance all of them so the products are legitimately the best. And we rebuild manufacturing in this country. Like, a big problem for us, including on the battlefield, is our manufacturing just is not up to where we have to be. Yep. And that, by the way, requires reskilling, upskilling humans, and we're doing this all over the place.

Speaker 3:

I mean, the people writing a lot of the scripts for the targeting these people, they're, like, high school grads, college grads, the people building batteries and all these things using our products. These are high school, college grads. There's a lot of opportunity there. But, you know, one of the things I told the Germans, and I would say to us, is I was like, you know, Germany, you have to call it a crisis.

Speaker 4:

Mhmm.

Speaker 3:

We do need, like this is a crisis moment. America tomorrow is not gonna look like it looks today at all, or we're gonna have radicalism on the right or left. Mhmm. The problem the danger is if we don't do these reforms, you are gonna get the pitchforks. Mhmm.

Speaker 3:

Because then the only solution people are gonna have is, well, you know, let's go after the unlikable rich people in tech, especially AI tech. But then what can work is, yeah, close the borders, keep them closed, start doing huge vocational efforts, change how we test aptitude, like, so we have an accurate diagnostic of where you could be slotted. Be ruthless and, like, you know, find out new ways to test and do ruthless testing and slotting. And then also go around to universities and I mean, you know how when you wanna smoke a cigarette, it's like, this cigarette may be harmful to you. Maybe we should be putting that on universities. This university is harmful for your future.

Speaker 3:

You know, I'm libertarian. You wanna go to university

Speaker 2:

Student debt may be harmful to

Speaker 4:

your future financial health. For you and your personal life. Explain to someone why you gotta have a million dollars in debt. Yeah. A million dollars in debt.

Speaker 4:

I mean, maybe if you're six, nine, and you you can get away with that, but the rest of us have to provide.

Speaker 1:

That's funny. Help me square this idea. You were talking earlier today about people misunderstanding your business. Yeah. What's it

Speaker 2:

like to read about your business?

Speaker 3:

Oh, I mean, first of all, part of it I hate. But the part I love is, it's like, you are valuable. Your value is pretty directly convergent with people's inability to understand what you're doing.

Speaker 1:

Yeah.

Speaker 3:

So it's like all these technologies are potentially commodifying everything. Yeah. Okay. So if you are a business that is, you know, not services, not product, but both, but also works on tribal knowledge and data, and every single thing you build is individual, then yeah, that's a crazy valuable business.

Speaker 1:

In a lot of industries, if the broader business community doesn't understand your business, you might get a short report, but you'll have way less competition because people aren't copying you.

Speaker 4:

Well, they don't understand

Speaker 3:

the playbook. It's not possible to copy certain things, and we neglect this. Like Yeah. You know, you even see it culturally, like, luxury products dominated by the French, watches dominated by the Swiss, currently certain kinds of war fighting dominated by America.

Speaker 1:

Yeah. And

Speaker 3:

it's like, it's very hard for people to eviscerate these cultural advantages, and our products augment that, which, you know, augments the differentiated, the specific

Speaker 4:

Mhmm.

Speaker 3:

over the generalizable. And that's where literally all the value is gonna go. And this is gonna be like a waterfall. And that's the problem with a lot of software companies. It's like product A, B, C.

Speaker 3:

But then when you read it, it's like, one of the more depressing things that you guys probably confront, but it's a huge market opportunity for you. It's like, where are the experts? You know, you hope and pray. Like, I'll tell you the funniest thing about my life now, and people internally know.

Speaker 3:

Almost every day, I'm like, wait a minute. I'm the adult in the room here.

Speaker 4:

It's like everywhere I go, it's like, wait

Speaker 3:

a minute. And it's like, and it's big. So it's surreal when you read about these things. It used to really frustrate me, but now I kinda just think, well, I can't believe we're still viewed as crazy. It's like everything we're doing is the only thing that's working. I mean, I don't wanna, like, spend a lot of time on our actually baller numbers from last year, but it's like, you know, clearly, our shit works. Clearly, nothing else is working at that level.

Speaker 3:

And you would think they would take, like, I don't know, ten minutes and think, okay. Well, the thing I believed and thought would work didn't work at all. Yeah. But this thing I thought was insane has, like, a Rule of 127 Mhmm. When, like, 40 is considered, like Yeah.

Speaker 3:

But they don't. And yeah, it is sometimes frustrating, honestly. Though the hard part, actually, I kind of view as a feature. Internally, we get these bright-eyed kids.

Speaker 3:

Mhmm. So it's so funny. I mean, we get the best people in the world. Mhmm. But you know, just like I was probably at 21, they're very romantic.

Speaker 3:

It's like, but why does the adult not understand this?

Speaker 4:

It's like, but the adult

Speaker 3:

expert tells me. It's like, I don't know when you guys had that huge moment where you realized that, like Yeah. The adults are, like, you know, on crack or something. Like, it's Okay.

Speaker 1:

Yeah. Last question. Would you rather have $10,000,000 or access to ChatGPT in 2012? It's a viral question. It's going viral right now.

Speaker 3:

I have to choose one or the other? Yes. I mean, okay. I'm just, I don't think he needs

Speaker 2:

I don't think I

Speaker 4:

don't think he needs it. Can I

Speaker 3:

I have my social life in grad school? There we

Speaker 4:

go. Okay. That's your pick. How about

Speaker 1:

a new pick? We just have to

Speaker 4:

ask you this. You know, it's like, great, I gotta take something I valued. Oh, okay.

Speaker 1:

Yeah. The most valuable thing: social life in grad school. Well, thank you so much for taking

Speaker 3:

the time. Thank you.

Speaker 1:

It's fantastic to

Speaker 2:

have you.

Speaker 1:

We'll talk to you soon.