TBPN

Diet TBPN delivers the best of today’s TBPN episode in 30 minutes. TBPN is a live tech talk show hosted by John Coogan and Jordi Hays, streaming weekdays 11–2 PT on X and YouTube, with each episode posted to podcast platforms right after.

Described by The New York Times as “Silicon Valley’s newest obsession,” the show has recently featured Mark Zuckerberg, Sam Altman, Mark Cuban, and Satya Nadella.

TBPN is made possible by:

Ramp - https://Ramp.com

AppLovin - https://axon.ai

Cisco - https://www.cisco.com

Cognition - https://cognition.ai

Console - https://console.com

CrowdStrike - https://crowdstrike.com

ElevenLabs - https://elevenlabs.io

Figma - https://figma.com

Fin - https://fin.ai

Gemini - https://gemini.google.com

Graphite - https://graphite.com

Gusto - https://gusto.com/tbpn

Kalshi - https://kalshi.com

Labelbox - https://labelbox.com

Lambda - https://lambda.ai

Linear - https://linear.app

MongoDB - https://mongodb.com

NYSE - https://nyse.com

Okta - https://www.okta.com

Phantom - https://phantom.com/cash

Plaid - https://plaid.com

Public - https://public.com

Railway - https://railway.com

Restream - https://restream.io

Sentry - https://sentry.io

Shopify - https://shopify.com/tbpn

Turbopuffer - https://turbopuffer.com

Vanta - https://vanta.com

Vibe - https://vibe.co


Follow TBPN: 
https://TBPN.com
https://x.com/tbpn
https://open.spotify.com/show/2L6WMqY3GUPCGBD0dX6p00?si=674252d53acf4231
https://podcasts.apple.com/us/podcast/technology-brothers/id1772360235
https://www.youtube.com/@TBPNLive

What is TBPN?

TBPN is a live tech talk show hosted by John Coogan and Jordi Hays, streaming weekdays from 11–2 PT on X and YouTube, with full episodes posted to Spotify immediately after airing.

Described by The New York Times as “Silicon Valley’s newest obsession,” TBPN has interviewed Mark Zuckerberg, Sam Altman, Mark Cuban, and Satya Nadella. Diet TBPN delivers the best moments from each episode in under 30 minutes.

Speaker 1:

Our president here at TBPN, Dylan Abruscato, headed to the TBPN newsletter, which you can sign up for at tbpn.com, and wrote a fantastic essay summarizing a trend that we've been discussing with him around how AI is changing meme making. And I found it very interesting. I'm glad that he wrote this piece. And so we'll read through this and then discuss it, debate it, and see where we can take it further. And then obviously

Speaker 2:

Dylan's from Long Island. Yes. New York. So John is gonna be

Speaker 1:

I'm gonna do it in a Dylan Abruscato impression

Speaker 2:

of accents.

Speaker 1:

Memes are changing. That became abundantly clear during the Oscars a few weeks ago when Conan tried to create a new Leonardo DiCaprio, Leonardo

Speaker 2:

DiCaprio meme. Accent. That was

Speaker 1:

That's just like a UFC announcer. ...to go alongside the classic Leo memes. In doing so, especially by using TFW, that feeling when, and the blocky white font that defined early Internet memes, he inadvertently demonstrated that the meme templates millennials grew up with have become increasingly stale, even cringe. It's a good point. Instead, AI-generated videos are the new meme template that every network and studio should be focusing their launches on. Look at what's happening with the Harry Potter reboot.

Speaker 1:

When the trailer first dropped, the reaction to the new Snape, played by Ghanaian. Ghanaian. Sorry. Ghanaian. He's from Ghana. English actor Paapa Essiedu, was predictably and unfortunately negative.

Speaker 1:

According to the LA Times, he received death threats since being cast in the new role. But after a few incredibly viral and well-produced AI videos, one an original Snape versus Black Snape MMA match, another an AI-generated rap video, and another Dripwarts: The School of Drip, the narrative has started to shift. Have you seen any of these? The quote, unquote original Snape versus Black Snape MMA match, because I have not seen this one, and I think it is illustrative of what Dylan is talking about here. Snape v.

Speaker 1:

Snape in the UFC ring. Here we go. Really photorealistic. Does this does this are there any red flags here as a UFC enjoyer? Does this feel like

Speaker 2:

proper UFC actual video quality.

Speaker 1:

It's so bulky. The video quality is insane.

Speaker 2:

Wait. But old Snape won.

Speaker 1:

In the fight? Yeah. Okay. I think it just I okay. Wait.

Speaker 1:

How how do you how do you know that? It just sort of like makes the characters more entertaining, more fun, shows you that this is just creativity at the end of the day. This is just you should not be so up in arms about something that's a movie. Like, it's entertainment. And here's some more entertainment.

Speaker 1:

And so you're you're adding entertainment to the discussion and people are enjoying that. There's another AI generated rap video about the new Snape, which we can pull up a little bit of here.

Speaker 2:

AI meme videos are inherently viral and driving real awareness in a way traditional memes no longer can, not just because they're novel and more entertaining, but because a single AI clip can travel further and compound harder than traditional meme formats in social feeds that now heavily favor video. This suggests Yeah. It's interesting. On X, it's still very easy for an image to go viral. But if you think about, you know, Instagram, YouTube, a standalone image just can no longer actually get that escape velocity.

Speaker 1:

I mean, what about dripped out pope? Remember that?

Speaker 2:

Yeah. A little bit. But but people are just spending so much time in the in the short form feeds.

Speaker 1:

And Yeah.

Speaker 2:

Those can go in there, but there's certainly... This suggests a new playbook for marketers, especially in entertainment. If you're about to drop a trailer for a new movie or show, you need to be thinking about your rage bait character, the one people will latch onto, remix with AI, and build around. Conan tried to force a Leo meme down our throats at the Oscars. Didn't see that because I was sleeping. But this might have worked twelve years ago.

Speaker 2:

That playbook is over. Today, enraged fans and communities will, if you're successful, take your characters or moments and turn them into something much bigger, entire cinematic universes. I'm I'm just very impressed by the the overall quality of those outputs.

Speaker 1:

The Oscar selfie, I remember this. This I think became the most liked image on Twitter at the time in 2014 briefly. This is the canonical clout bomb. If you're a fan of Bradley Cooper, you like it. If you're a fan of Meryl Streep, you like it.

Speaker 1:

If you're a fan of Brad Pitt, you like it. And so you're you're amplifying all of the... the ultimate collab post. And this has become a format that's been used time and time again. But now the future is is AI. Let's pull up the the Dripwarts School of Drip video.

Speaker 1:

I wanna watch this one. Let's see if we can play this. That's Harry Potter. Are you really Harry Potter my g? Type shit.

Speaker 1:

Type shit. Type shit. Type shit. None of that. None of that, bro.

Speaker 1:

We're all here on the Maybach Express for one reason and one reason only, and that's to go to Dripwarts, the School of Drip. The Maybach pulling the train is pretty good. Pretty good. So yes, very effective. I was reflecting on this and thinking about how it's not just AI videos that are unlocked as the new meme format.

Speaker 1:

Like twenty years ago, video editing was extremely difficult. Like, you had to do it on a desktop. You had to have a piece of software that probably cost a lot of money. It was not widely accessible. And so these image macros, image memes. I was talking to Brandon about this.

Speaker 1:

Like, Good Guy Greg was one of these, or, like, Insanity Wolf. And it would just be, like, one image of a duck, and the duck would be on sort of, like, a solid colored background. And that would be the template, and then somebody would put white block text, Impact font, on the top and the bottom. And that was like the image meme. And that was accessible in the sense that it could be, like, generated in MS Paint. It was free to generate it, basically.

Speaker 1:

Then we got video editing, CapCut, Instagram Reels, an editor called Edits. And all of a sudden, it became easy for someone to take a vibe reel and put different text over it. I send you a bunch of these, where I'll find some crazy vibe reel, I'll just recontextualize it with a new you're laughing

Speaker 2:

at this

Speaker 1:

new caption, basically. And so the classic one is like those four jets and the new Top Gun. And it's like when you and the boys all drive somewhere in separate cars or something like that. It's an example. But now you can generate full AI videos that can express the joke of the meme.

Speaker 1:

And I think the next version of this is, like, software as a meme, SaaM, something like that. And we've been experimenting with this with the simulators. There's TBPN simulator, Jeremy Gaffan simulator. There are more simulators coming. And all of a sudden, the idea of building a video game, becoming a video game studio, was, like, an impossible challenge.

Speaker 1:

It would be months and months of time, maybe millions of dollars to get anything reasonable. So you had to be commercial about it. You could not do it as a comedy bit. But now you can. But increasingly, it's going to be more and more just like a few prompts on your phone to get the piece of software that is that meme.

Speaker 1:

And you can think about the Jmail suite from Riley Walz as another software-as-a-meme moment Yeah. Where he's making a commentary on the Jeffrey Epstein saga and all of that, but he's instantiating the humor, the commentary, in a piece of software that actually works. So there is a whole bunch of hacking news going on. We're in a very weird week in terms of the news cycle because it's spring break. And so a lot of executives of big tech companies are like, don't launch while my kids are out of school and we're going on vacation.

Speaker 1:

I actually think this is my real theory. So we're in a little bit of a slow news week. And you can see that the Journal is covering announcements that happened last week. They're talking about Sora. They're talking about Disney.

Speaker 1:

They're talking about things that are more reflective. In Stratechery, Ben Thompson has sort of a fifty-year retrospective on Apple. It's not driven by a news item. Like, it's not like Apple launched a new product this week. So Ben Thompson is taking a step back and reflecting.

Speaker 1:

It's a great piece. But there are a ton of crazy hacks, starting with Axios. There's an active supply chain attack on Axios, one of npm's most depended-on packages. So if you have been vibe coding, Axios is a package that helps with HTTP requests, so it gets sucked into all sorts of different projects. And if you upgraded it to the latest version, you basically got a virus with that.

Speaker 1:

And if that's running in the cloud where it's building, that's probably bad because it could steal API keys or SSH keys. It could do a lot of things. It could wreak havoc on your system. Also, if you built this piece of software and you included the contaminated Axios installer or package locally, it could potentially weasel its way out of your local environment and get onto your desktop. It's a virus, so be careful out there, and I'm sure people will be responding.

Speaker 1:

The recommendation from Pharos, who sort of broke the news over at Socket Security, is that if you use Axios, pin your version immediately and audit your lock files. Do not upgrade. Socket analysis confirmed that this was malware. plain-crypto-js is an obfuscated dropper loader that de-obfuscates embedded payloads and operational strings at runtime, dynamically loads fs, os, and execSync to evade static analysis, executes decoded shell commands, stages and copies payload files into OS temp and Windows ProgramData directories, and deletes and renames artifacts post-execution to destroy forensic evidence. So very risky.
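
The "pin your version and audit your lock files" advice can be sketched in a few lines of Node. This is a minimal illustration, not Socket's tooling; the package.json contents and the helper names (semverIsPinned, findFloatingDeps) are made up for the example. It flags any dependency declared with a floating semver range that could silently resolve to a newer, possibly poisoned release.

```javascript
// Minimal sketch: flag dependencies whose declared version is a floating
// range rather than an exact pin. An exact pin like "1.13.5" can only ever
// install that one version; "^1.13.5" will happily pull in a later 1.x.
const semverIsPinned = (range) => /^\d+\.\d+\.\d+$/.test(range.trim());

function findFloatingDeps(pkg) {
  const deps = { ...pkg.dependencies, ...pkg.devDependencies };
  return Object.entries(deps)
    .filter(([, range]) => !semverIsPinned(range))
    .map(([name, range]) => `${name}@${range}`);
}

// Hypothetical package.json: the axios entry floats, the other is pinned.
const pkg = {
  dependencies: { axios: "^1.13.5" },
  devDependencies: { "some-other-lib": "1.0.0" },
};
console.log(findFloatingDeps(pkg)); // -> [ 'axios@^1.13.5' ]
```

In practice you would run something like `npm ls axios` to see which resolved version is actually in your lock file and pin it there; the point of the sketch is just that caret and tilde ranges are the mechanism by which a poisoned release walks into a build unasked.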

Speaker 2:

I would say, like, if you have installed this, you should just, like, freak out. And and if you break your computer, that's, like, the first thing you should do. Just, like, try to slam

Speaker 1:

it Yeah. Over Take the computer, throw it in the lake.

Speaker 2:

That's how you should start.

Speaker 1:

I concur. I mean, practical I mean, there is going to be some sort of like power law response here where of the people that are victims of the attack, they will go after the most vulnerable with the highest like ransomware potential. And I think we're seeing that with one company. I believe Mercor was targeted. But I don't know if that's

Speaker 2:

been... Yeah. My understanding is... Yeah. The crazy thing is you have you have this, like, Claude Code leak

Speaker 1:

that That was completely separate. And nothing even Even

Speaker 2:

though even though I do believe they use Axios in Claude Code. Okay. Saw something on that. Sure. Sure. Sure.

Speaker 2:

And you have the Mercor leak Yep. Which is

Speaker 1:

Well, it's not a leak. It's a ransom

Speaker 2:

It's a ransomware.

Speaker 1:

Yeah. Someone stole some data.

Speaker 2:

Yeah. They stole a bunch of data, and now they're trying to, you know, get bids on it. Mhmm. We'll we'll get to that in a little bit. Okay.

Speaker 2:

And then there's there's this Axios supply chain attack. Yeah. Anish had a little bit more context. He said a tiny piece of code called Axios runs inside almost every app on your phone and every website you visit. Developers download it 100,000,000 times a week.

Speaker 2:

A few hours ago, someone poisoned it with malware that hands an attacker full control of your computer. If you've never heard of Axios, that's normal. It does one boring but important job. It lets apps talk to the Internet. When a website pulls up your feed or an online checkout processes your card, Axios is probably doing the work underneath.

Speaker 2:

Over 173,000 other code packages plug into it. It's everywhere. The attacker stole a lead developer's login for NPM. Think of it as an app store, but for code that programmers use. Once inside, they swapped the developer's email to an anonymous ProtonMail account and uploaded the poisoned version by hand.

Speaker 2:

That jumped past every security check the project normally runs before new code goes live. And this was not a rush job. The attackers staged the malware at least eighteen hours before pulling the trigger. They built separate versions for Windows, Mac, and Linux. They poisoned both the current version and an older one within thirty-nine minutes of each other, casting the widest net possible. Once the malware ran on a machine, it deleted itself to cover its tracks.

Speaker 2:

The trick was smart. They never touched a single line of code inside Axios itself. Instead, they tucked in a fake add-on called plain-crypto-js, built to pass as a well-known, trusted library. It copied the real library's description and author info so nothing looked off at a glance. When a developer installed Axios, this fake package quietly ran the malware on its own.

Speaker 2:

When a smaller package called ua-parser-js got hijacked back in 2021, with about 8,000,000 weekly downloads, the security world treated it like a four-alarm fire. Mhmm. Axios has 100,000,000, over 12x the exposure, with 173,000 packages depending on it. Socket, the security firm that flagged this, caught it in about six minutes. That's fast, but six minutes is still plenty of time for automated systems at companies everywhere to pull and install the bad version before anyone can react.

Speaker 2:

If you or your team run Axios, freak TF out. No. Lock your version to 1.14.0, change every password, API key, and access token on any machine that installed the compromised update, and check your network logs for connections to sfrclak.com or the IP address 142.112.06.73.

Speaker 1:

Andrej Karpathy said: new supply chain attack, this time for the npm package Axios, the most popular HTTP client library with 300,000,000 weekly downloads. That's a lot. Scanning his system, Andrej Karpathy says he found a usage, imported from Google Workspace CLI, from a few days ago when he was experimenting with Gmail and gcalcli. The installed version luckily resolved to the previous version, the unaffected 1.13.5, but the project dependency is not pinned, meaning that if he had done this earlier today, the code would have resolved, everything would have updated, and he would have been pwned. It is possible to personally defend against these to some extent with local settings, e.

Speaker 1:

g., release age constraints or containers, etcetera. But I think ultimately the defaults of package management projects, pip, npm, etcetera, have to change so that a single injection, usually luckily fairly temporary in nature due to security scanning, does not spread through users at random and at scale via unpinned dependencies. So very, very crazy, crazy story.
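
Karpathy's point about unpinned defaults comes down to how caret ranges resolve. A toy version of the rule (real semver has more cases, e.g. 0.x carets are treated more strictly; this simplified check ignores them):

```javascript
// Simplified caret-range check: "^1.13.5" accepts any 1.x version at or
// above 1.13.5, which is exactly how a freshly poisoned 1.14.x gets pulled
// in by default. (Real semver handles 0.x majors differently; omitted here.)
function caretSatisfies(range, version) {
  if (!range.startsWith("^")) return range === version; // exact pin
  const [bMaj, bMin, bPat] = range.slice(1).split(".").map(Number);
  const [maj, min, pat] = version.split(".").map(Number);
  if (maj !== bMaj) return false;      // majors never float
  if (min !== bMin) return min > bMin; // any later minor is accepted
  return pat >= bPat;                  // or a later patch of the same minor
}

console.log(caretSatisfies("^1.13.5", "1.14.0")); // true: poisoned release accepted
console.log(caretSatisfies("^1.13.5", "2.0.0"));  // false
console.log(caretSatisfies("1.13.5", "1.14.0"));  // false: exact pin blocks it
```

This is also why release-age constraints help: even an accepted range can be gated on how long a version has been published, so a compromise caught within minutes never reaches your machine.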

Speaker 2:

I just think it's bullish for cybersecurity overall. Like, I think every cybersecurity company will probably do well. People are on edge already. Yep. And everyone's... Even though this type of attack has happened for years Yeah.

Speaker 2:

Long before, like, the popularity of vibe coding, it just feels like there's a bunch of new solutions that are needed. The kind of incumbent cybersecurity players will do well. They're gonna release a lot of new products. I think the question that I have is, like, why seven minutes? Right?

Speaker 2:

Like Yeah. If

Speaker 1:

Why not check it before it's merged in in the first place?

Speaker 2:

Yeah. Yeah. Or or just like, you know, these are machines. So theoretically, they can be constantly monitoring versus like

Speaker 1:

Yeah. I don't know. And and the question is, I I I we're gonna be digging into this story more over the next few days. But I'm I'm interested to know, like, it's found in seven minutes. When is it actually rolled back?

Speaker 1:

If you look at 300,000,000 weekly downloads, like, clearly, there are people that were downloading it at that moment in time. At all seven of those minutes, there's probably, like, thousands of downloads, if not tens of thousands. Like, how quickly was it rolled back? So is it only if you're in that seven minutes? Or was it discovered in seven minutes, and then it took them another twenty minutes to roll it back and stop serving the contaminated package?
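
The guess above (thousands of downloads per minute, if not tens of thousands) checks out with quick back-of-envelope arithmetic, assuming downloads arrive at a roughly uniform rate over the week (real traffic is spikier):

```javascript
// 300,000,000 downloads per week, spread uniformly: how many land in a
// seven-minute exposure window?
const weekly = 300_000_000;
const minutesPerWeek = 7 * 24 * 60;        // 10080 minutes in a week
const perMinute = weekly / minutesPerWeek; // ~29,762 downloads per minute
console.log(Math.round(perMinute));        // 29762
console.log(Math.round(perMinute * 7));    // 208333 in a seven-minute window
```

So even a seven-minute window covers on the order of two hundred thousand installs before any rollback.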

Speaker 1:

Understanding the scope of this, because it's very clear that, as Andrej Karpathy explained, like, he was actively using it every single day and yet was not caught in that seven-minute window. And so he was clean. And understanding the scope and scale of the impact is very much determined by just how broad and how many installs happened during the contamination. Anyway, Will Brown has a good take. He says, I hope someone at Axios is reporting on this.

Speaker 1:

And I completely agree. It's going to be confusing when they do.

Speaker 2:

Last night, Claude Code source code was leaked via a .map file in the npm registry. There's just a link to

Speaker 1:

Wait. Someone just actually do not click a link. If somebody ever says, hey, I got some really great source code here. Just click this link. Probably don't click it.

Speaker 1:

Let other people screenshot it. There's plenty of meta analysis over here. Seems messy, seems unfortunate. Heart goes out to the folks who are dealing with the situation. At the same time, Codex is open source.

Speaker 1:

It's not the end of the world, but it did reveal a bunch of things about the roadmap and also the April Fools joke. That is the worst part. We'd love a secret surprise April Fools joke. I love a good joke, and nothing spoils a joke like hearing about it a day early.

Speaker 1:

Much more importantly, there are lots of other critiques of the way Claude Code is implemented. But the bad news about the leak? Don't think

Speaker 2:

this hurts their business at all because people are using Claude Code to make other products and then also having to take basically a fork of Claude Code, maintain that, try to be shipping features against it, which is, again, I think it seems to not be legal at all to just fork the code base just because it's out there.

Speaker 1:

Oh, yeah. You can't just... whatever the verbiage is.

Speaker 2:

Converting it into other languages. And maybe there's some argument there. But still, I don't think, like, this hurts their actual business at all.

Speaker 1:

You can read it to understand some of the secrets, what's special. But at the end of the day, all of these tools, especially something like Claude Code that's so new, it's more of, like, the process.

Speaker 2:

It's more bad for the overall brand of vibe coding.

Speaker 1:

Totally, totally.

Speaker 2:

Yeah, it's rough. The irony here is that every time Anthropic has released any feature related to cybersecurity, all the big cyber companies have been selling off tens of billions of dollars.

Speaker 1:

Yeah, yeah, yeah. The question of like, yeah, does this build trust in using Viya

Speaker 2:

Yeah, overall, it hurts some trust. Again, very obviously going to get through this.

Speaker 1:

Yes. So the how it started, how it's going is, of course, landing like a ton of bricks. The last thirty days, 100% of the contributions to Claude Code were written by Claude Code. And the how it's going is that it leaked the source code, which is not what you want to have happen. This is like you didn't get to watch the Super Bowl.

Speaker 1:

You have it DVR ed at home. Do you want spoilers? Should we review the April Fools joke? Or should we leave it unspoiled? It's cool.

Speaker 1:

It's very cool. You've already read it?

Speaker 2:

I read through it. It's not to my

Speaker 1:

Were you belly laughing?

Speaker 2:

I don't think we're getting a knee slapper out of it. But it's very cool. And I think it'll be cute.

Speaker 1:

Okay. Well, then we can move on. Tucci summed it up here. You understand what just happened to Anthropic? Someone on their team ran a production build of Claude Code.

Speaker 1:

The compiler generated a .map file, which is literally a blueprint that reverses the entire code base back to its original source. And then they published it straight to npm for the whole world to download. And it really does show you how fast the npm downloads happen. There are people that are downloading it every single minute. And so even if it's only up there for a minute, someone's going to get it.
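
To make the .map mechanism concrete: a source map is just JSON, and its optional sourcesContent field can carry the original files verbatim. A minimal sketch (the file names and contents here are invented, not from the actual leak):

```javascript
// A (fake) source map like the one a bundler emits next to minified output.
// If sourcesContent is populated, "recovering" the source is just reading
// the JSON back out; no reverse engineering required.
const sourceMap = {
  version: 3,
  file: "cli.min.js",
  sources: ["src/roadmap.ts"],
  sourcesContent: ["export const plan = 'not public yet';\n"],
  mappings: "AAAA",
};

function recoverSources(map) {
  return Object.fromEntries(
    map.sources.map((name, i) => [name, map.sourcesContent?.[i] ?? null])
  );
}

console.log(recoverSources(sourceMap));
```

That is the whole leak mechanism: the build published the map alongside the package, so anyone who downloaded it could dump the sources this way. The usual mitigation is to emit maps without sourcesContent, or to keep them out of the published artifact entirely.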

Speaker 1:

And then all they need to do is send it to somebody, zip it, and post a link on X, and it goes viral. It's like locking every door in your house, installing cameras, hiring armed guards, then accidentally uploading your floor plans to Google Maps. Does that matter? No. That's a bad analogy.

Speaker 1:

I don't like that analogy. Floor plans are not why I lock every door in my house. I install cameras. I hire armed guards.

Speaker 2:

Aren't floor plans public on, like, Zillow? Oftentimes. Let's go over to Lisan al Gaib.

Speaker 1:

Yes. Yes.

Speaker 2:

Yes. A few takeaways from the Claude code leak. Anthropic is actively using Mythos for development.

Speaker 1:

Okay.

Speaker 2:

They are already at Capybara v8. We learned last week that capybaras are

Speaker 1:

Extremely deadly. But

Speaker 2:

can be deadly in the right context. Capybara still has issues.

Speaker 1:

The foreshadowing is crazy. The foreshadowing is crazy. We were talking about how the Faustian bargain that is getting a Capybara as a pet seems so cute, but it can bite you.

Speaker 2:

Capybara has a 1,000,000-token context window and Cool. Fast

Speaker 1:

Numbat is another interesting code name, tagged with "at model launch, remove this section when we launch Numbat." Fennec seems to fit the fennec fox. The fennec fox is very cute, but also not a domesticated animal. How about we get some golden retriever code names?

Speaker 1:

How about big fluffy poodle? That's a good code name for your animal themed AI model. Anyway, let me tell you about Console. Console builds AI agents that automate 70% of IT, HR, and finance support, giving employees instant resolution for access requests and password resets. And let me also tell you about Lambda.

Speaker 1:

Lambda is the superintelligence cloud, building AI supercomputers for training and inference that scale from one GPU to hundreds of thousands.

Speaker 2:

Says, hot take: Anthropic leaked Claude Code intentionally to get a nerdosphere code review it

Speaker 1:

would

Speaker 2:

have never gotten if they had just open sourced it.

Speaker 1:

Oh, that's actually true. Way more attention.

Speaker 2:

Leak your entire feature roadmap. You don't do I mean, it's it's it's funny, and I'm sure they'll make the most of this.

Speaker 1:

This is four d chess right here.

Speaker 2:

But The

Speaker 1:

four

Speaker 2:

d I'm not seeing the four d chess.

Speaker 1:

I'm seeing the four d chess now. I'm convinced. This is I mean, we're in completely uncharted territory for marketing stunts and prereleases and sneaky footage that goes viral and maybe was planted and you don't know and it's like some leaked account. Like, I don't know. I think everything's I think the gloves are off.

Speaker 1:

Everything's on the table. This could be an April Fools' joke. This could be a stunt to drive attention to an open source move. Although, Tyler, you said that Dario is not a fan of open source at all. Right?

Speaker 1:

He's, like, against... Yeah. Dario is against it unilaterally?

Speaker 2:

He he doesn't want to do open source.

Speaker 1:

I feel... Isn't there some steel man there where if you open source, like, Opus 2 or something that's, like, really old, it's entirely commoditized in the research community. So all of those secrets that went into, like, making Opus 2 good, those have been commoditized. They've been discussed at house parties in SF. The researchers have moved from one place to another, so everyone knows these.

Speaker 1:

They've implemented it. They're available as open source. But by open sourcing your model, you can share with more of like the up and coming academic community. Like, if if I'm a if I'm a computer scientist

Speaker 2:

Yeah. But all the research is already commoditized and

Speaker 1:

Yeah. I guess you could just use the other ones. It doesn't really have a benefit. Maybe

Speaker 2:

Yeah. So has anyone has anyone at Anthropic Yeah. It's a point. Has anyone at Anthropic commented on this at all? I haven't seen anything.

Speaker 2:

I haven't seen anyone.

Speaker 1:

There is news out of Google. A Google paper warns crypto of quantum risk ahead of its 2029 timeline. So we've heard about the risk of quantum computing affecting the cryptocurrency industry, crypto projects broadly. There is some new research out of Google that provides some more perspective. So Google researchers have warned that future quantum computers may be able to break some of the cryptography protecting Bitcoin and other digital assets with fewer resources than previously thought, adding urgency to the debate over how the industry should prepare.

Speaker 1:

The researchers did not indicate such a machine exists today, but said new work suggests the computing power needed to carry out that kind of attack may be lower than earlier estimates had suggested. In a Google Research blog post (this is from Bloomberg) the researchers said that a future quantum computer could break elliptic curve cryptography, a form of public key encryption used across much of the market. Their latest estimate points to a 20-fold reduction in the quantum computing hardware needed to break what's known as ECDLP 256, a mathematical problem that helps secure crypto wallets and transactions. That does not mean Bitcoin and Ethereum are suddenly exposed, but the researchers, in the white paper dated Monday, said the clearest defense is a shift towards post-quantum cryptography, or PQC. I'm sure this will be a hot topic over the next few months.
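
For a sense of what the ECDLP is: given a curve point G and Q = k*G, recover k. A toy version over a classroom curve (y^2 = x^3 + 2x + 2 over F_17, a standard textbook example, nothing like the 256-bit curves Bitcoin uses) makes the problem concrete; at real sizes the same brute force is classically infeasible, and Shor-style quantum algorithms are what would change that:

```javascript
// Toy elliptic-curve discrete log: brute-force k such that Q = k*G.
// Curve: y^2 = x^3 + 2x + 2 over F_17 (group order 19). Trivial at this
// size; hopeless classically at 256 bits, which is the whole point.
const p = 17n, a = 2n;
const mod = (x) => ((x % p) + p) % p;
const inv = (x) => { // modular inverse via Fermat's little theorem
  let r = 1n, b = mod(x), e = p - 2n;
  while (e > 0n) { if (e & 1n) r = mod(r * b); b = mod(b * b); e >>= 1n; }
  return r;
};
function add(P, Q) { // point addition; null is the point at infinity
  if (!P) return Q;
  if (!Q) return P;
  const [x1, y1] = P, [x2, y2] = Q;
  if (x1 === x2 && mod(y1 + y2) === 0n) return null;
  const m = x1 === x2
    ? mod((3n * x1 * x1 + a) * inv(2n * y1)) // tangent slope (doubling)
    : mod((y2 - y1) * inv(x2 - x1));         // chord slope
  const x3 = mod(m * m - x1 - x2);
  return [x3, mod(m * (x1 - x3) - y1)];
}

const G = [5n, 1n]; // a generator of the group
const k = 7n;       // the secret ("private key")
let Q = null;
for (let i = 0n; i < k; i++) Q = add(Q, G); // public key Q = k*G

// Solving the ECDLP here is just trying multiples of G until one matches:
let found = null;
for (let i = 1n, R = G; i <= 19n; i++, R = add(R, G)) {
  if (R && R[0] === Q[0] && R[1] === Q[1]) { found = i; break; }
}
console.log(found); // 7n
```

Post-quantum cryptography replaces this hard problem with ones (lattices, hashes) for which no quantum speedup of Shor's kind is known.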

Speaker 1:

A newer form of security designed to withstand attacks from powerful machines. They also urge the crypto industry to cut avoidable risks. In the meantime, we urge all vulnerable cryptocurrency communities to join the migration to PQC without delay. Google cast the paper as a warning meant to give the industry time to act, not as a prediction of imminent collapse. Last week, the tech giant introduced a timeline to fully migrate its own security systems to post quantum cryptography by 2029.

Speaker 1:

Concerns have swirled for years. In January, Coinbase established an independent advisory board to study what quantum computing could mean for the blockchain. That same month, Christopher Wood, global head of equity strategy at Jefferies, removed a 10% allocation to Bitcoin from his model portfolio, citing fears that the advent of quantum computing could undermine the token. The time left before such machines arrive still appears longer than the time needed to move public blockchains to post-quantum cryptography.

Speaker 2:

One concern

Speaker 1:

Yeah.

Speaker 2:

That people in the community have had, that I've seen talked about, is this idea that if you did have a computer powerful enough to crack these encryptions, unless you were, like, Google Mhmm. And you already had billions and billions and billions of dollars of cash flow, you wouldn't exactly stand up and say, like, hey, I have cracked Bitcoin. Because the incentive for Sure. A certain team would just be to go around and find these wallets that maybe didn't have any activity for a long time and just start cracking those individually. Because if you just stood up and said, hey, I have a quantum computer that destroys Bitcoin Yeah. The price would go down, and then you, the hacker, wouldn't get any benefit from it.

Speaker 1:

What are quantum stocks doing on this news?

Speaker 2:

Probably ripping.

Speaker 1:

They rip on everything. So Nick Carter was talking about this. He said, many are wondering what Google saw that caused them to revise their post-quantum cryptography transition deadline to 2029 this week. It was this, and it's from research.google, which we will go through. Max of EC says Google is basically saying: we've cut the quantum resources needed to break Bitcoin's encryption by 20x.

Speaker 1:

We can now break it. We can prove it. We're just not gonna tell you how. We've slowed down research to give crypto a chance. You have until 2029 to figure out a solution.

Speaker 1:

Good luck. Elon chimed in and said, on the plus side, if you forgot your password, the password to your wallet, it will be accessible in the future.

Speaker 2:

Also to everyone else.

Speaker 1:

So the chance that NASA lands on the moon, we were tracking this yesterday. The missions are starting to happen. Before 2028 on Kalshi is now at 14%. Before 2027 is at 4.7%. So they are racing.

Speaker 1:

Of course, this Artemis two mission is not boots on the ground on the moon. It is rocketing around the moon. We'll have more about this tomorrow.

Speaker 2:

They're just going to check it out.

Speaker 1:

They're gonna be gone for ten days. They're gonna be in space for ten days. Britta Grill was doing some deep dives on the technology, the streaming technology, what we really care about here, that will be on board. Something like 20 cameras, 4K live streams, laser beams to make sure it's low latency. Super... Should be a lot of fun.

Speaker 1:

Super chats would be good. We gotta get a chat going. I'm sure there might actually be, because they usually stream on YouTube, and so I wouldn't be surprised if

Speaker 2:

Is it gonna be a twenty-four-seven, like, perpetual stream that's always on? Yeah. Even when the astronauts are asleep? Yeah. Taking a little nap?

Speaker 1:

Yeah. Yeah.

Speaker 2:

Okay. All the conspiracy theorists are gonna be sitting there watching it very closely and then pausing it. There was a glitch there. Did you see that glitch?

Speaker 1:

That was VFX. That was AI. No. This is my benchmark. I will believe that it's real if I see an astronaut put three fingers in front of their face.

Speaker 1:

Yep. Because this is the one thing that the AI can't do right now. If you're ever on a Zoom call with someone you suspect of being fake, a scammer who said, hey, let's get on Zoom, let's talk about some financial investment opportunity. And it looks like the person you think it is, but you suspect that it might not be.

Speaker 1:

And they will be able to show you, Look, look at the fingers. The fingers are perfect. It's fine. It's fine. That's because this part is not AI.

Speaker 1:

Just the face is AI. This is the deepfake stuff that's happening. So what you have to do is you have to ask them to hold up three fingers. They'll be like, yeah, three fingers. This is fine.

Speaker 1:

Right? I satisfied the task. You gotta say, no, put the three fingers in front of your face. Because if you put the three fingers in front of your face, the AI gets confused and it breaks the deepfake that's happening underneath.

Speaker 2:

Elon Colossus shares: Elon has spent a decade trying to control an AI lab. He tried to absorb DeepMind into Tesla in 2014 and OpenAI in 2018. When that failed, an intern spoke up. It did not. Interesting.

Speaker 2:

Okay.

Speaker 1:

Let's read through this.

Speaker 2:

He also tried to control xAI to some degree.

Speaker 1:

Doesn't he control xAI?

Speaker 2:

But at what cost? Right? No. All seven co-founders.

Speaker 1:

Oh, true. True. True. Yeah. That's what you're referring to.

Speaker 1:

Got it.

Speaker 2:

Anyways, from the book: pushing back against Musk's obsession with the race against Google and DeepMind, Brockman added, it doesn't matter who wins if everyone dies. Musk responded the next morning at 3:52 AM. He confronted Brockman with a proposal that recalled Pichai's pitch: OpenAI should spin into Tesla. Initially, OpenAI's team could accelerate Tesla's development of autonomous vehicles.

Speaker 2:

Next, it could use the profits from self-driving cars to fund its AGI moonshot. Tesla is the only path that could even hope to hold a candle to Google, Musk declared. Even then, the probability of being a counterweight to Google is small. It just isn't zero. At an all-hands meeting on the top floor of a converted truck factory that housed OpenAI, Musk announced to the employees that he was quitting the lab, scornfully adding that

Speaker 1:

I need Raptors. I need a new Ford Raptor potentially every day. We've got to put this lab above

Speaker 2:

This truck is amazing. Scornfully adding that OpenAI would have to sprint faster to stay relevant.

Speaker 1:

No.

Speaker 2:

I guess they did.

Speaker 1:

I guess they did.

Speaker 2:

Hoping to lure away some researchers, he declared there was a much better chance of building AGI at a strong business like Tesla. Yeah. Showing courage, or perhaps just youthful innocence, an intern asked Musk if speed might be reckless from a safety perspective. Besides, wasn't developing AI at a for-profit company like Tesla the same as creating it at a for-profit company like Google? Isn't this going back to what you said you didn't wanna do?

Speaker 2:

The intern demanded. You're a jackass, Musk retorted. Then he stormed out of the meeting.

Speaker 1:

That intern?

Speaker 2:

Tyler Cosgrove.

Speaker 1:

No. That intern was Steve Jobs. Just kidding.

Speaker 2:

It's been an honor.

Speaker 1:

See you tomorrow. Goodbye. Boeing Flashback. Bye.