[00:00:00] Jacob Haimes: Welcome to Muckraker, where we dig through the latest happenings around so-called AI. In each episode, we highlight recent events, contextualize the most important ones, and try to separate muck from meaning. I'm your host, Jacob Haimes, and joining me is my co-host, Igor Krawczuk. [00:00:17] Igor Krawczuk: Thanks, Jacob. This week we're talking about porn. [00:00:21] Jacob Haimes: I mean, yeah, I guess so. More specifically, recently-ish, Sam Altman tweeted... well, it started with, "We made ChatGPT pretty restrictive to make sure we were being careful with mental health issues." Of course, I don't agree with that statement, but let's carry on. Later on it says, "Now that we have been able to mitigate the serious mental health issues and have new tools, we are going to be able to safely relax the restrictions in most cases." And then it continued on a little more, but essentially what this means is: since, in the eyes of Sam Altman, this issue has seemingly been resolved (which is definitely not the case according to many of the mental health experts he has worked with), now the gloves get to come off. Now we get to really start doing just the dumbest shit, really morally not-okay stuff. So let's start serving porn. Fuck it. Why not? Right? [00:01:40] Igor Krawczuk: And for the insurance nerds out there, the terms to Google, if you want fancy jargon for this, are "risk compensation" and "moral hazard": the idea that having some safeguard lets you go faster, like running a red light because you're wearing a helmet. And we can all agree that's a splendid idea if you're running a billion-dollar company. But... [00:02:02] Jacob Haimes: Hmm. [00:02:03] Igor Krawczuk: It makes sense for the audience, and it makes sense in a particular way that we're going to open up now, because I would argue this is just one more sign of a vibe shift we had already talked about in previous episodes. Actually, quite a sizable shift, big enough to justify calling it a new phase of the bubble. [00:02:25] Jacob Haimes: Yeah. I mean, really it comes back to: this is pretty disappointing, right? To the public, given how it's been presented. We were told we're going to cure cancer, we're going to solve anti-aging, we're going to have 90% of code written by AI. And that just isn't what we're getting. Instead, we're getting no cures. We're getting more expensive APIs. We're getting continuously sycophantic systems that are optimizing for our engagement. And on top of that, porn. Of course porn. Why not porn? [00:03:25] Igor Krawczuk: Yeah, one correction: it wasn't that the API got more expensive. The thing I was referring to was users complaining that using the AI agents is getting more expensive. Even if the API cost per token is technically getting cheaper, you need more and more tokens to... [00:03:43] Jacob Haimes: Gotcha. Yeah, yeah. [00:03:44] Igor Krawczuk: ...do shit with them.
It gets more and more expensive. And yeah, this vibe shift is the disappointment that comes together and explains the desperation on the side of these companies: they promised all of this stuff, they had a big run getting lots and lots of cash from investors, and what they promised isn't arriving. So what's... [00:04:12] Jacob Haimes: No. [00:04:12] Igor Krawczuk: ...left for them? [00:04:14] Jacob Haimes: I mean, what's left for them is one of the greatest plays there is in terms of making money, turning a profit, and that's to get your customer base addicted to your product. It works extremely well if you want to turn a profit. So why not create porn systems that hijack the dopamine systems in our brains, and create the sort of relationships that make people feel like they need to discuss their personal lives with the chatbot, which is just an interface for a company. Then you can take it away, and people will be very upset, and you can say: oh, well, in that case we'll bring it back to you as a legacy model that costs 10 times more. And they'll gladly pay, because that's their friend they're essentially bringing back to life. And that's not an exaggeration; that's really what has already happened. So they're just going all in on this. And I guess from the shareholder standpoint it's a good move, because addiction makes money. For the people... [00:05:36] Igor Krawczuk: From that standpoint, and from the investor standpoint: they still need to raise more money, right? They're not profitable. They need to raise more money. [00:05:42] Jacob Haimes: Yeah. [00:05:42] Igor Krawczuk: So what they have to do is show engagement. They have to show user growth, which means getting those users addicted so they keep clicking that "next" button, keep talking to the waifu voice. And they have to show that they can plausibly monetize these people, which means users staying engaged even if you show ads, even if the product becomes less useful. Or they can sell to them directly, which is another experiment they ran: shops putting their offerings, and the option for a customer to buy something, directly into the chat. [00:06:23] Jacob Haimes: Mm-hmm. [00:06:23] Igor Krawczuk: This is what we mean by the vibe shift, and the phase we're coming from. People were kind of hyped about AI, and things were good for the companies, so they didn't need to do the squeeze. Now people are getting a bit mad about AI, opinion is shifting, so they have to rush out and normalize the shitty stuff while there's still enough goodwill to burn. [00:06:50] Jacob Haimes: Yeah. And this isn't just anecdotes, right? This isn't, for example, the video I saw the other day of someone calling out the big tech bro people as libertarian, psychopathic man-babies. It isn't just that video, although it did go viral, and I saw it because that's a sentiment people agree with, right? [00:07:19] Igor Krawczuk: I... [00:07:20] Jacob Haimes: Um... [00:07:20] Igor Krawczuk: I agree with that sentiment, and I like it... [00:07:22] Jacob Haimes: Oh yeah. [00:07:22] Igor Krawczuk: ...as a more or less libertarian tech bro myself, just of a different type. I fully agree with that sentiment.
[00:07:29] Jacob Haimes: I think calling yourself a libertarian tech bro is very misleading. [00:07:36] Igor Krawczuk: It is technically correct, which is the best type of correct. [00:07:39] Jacob Haimes: Uh... [00:07:39] Igor Krawczuk: Look up a definition of "libertarian". I am technically a libertarian. [00:07:45] Jacob Haimes: Yeah, okay. But it's not just these anecdotes, right? It's not just the memes coming out making fun of the tech bros, and this realization that they're willing to do whatever they want, that they see themselves as ushering in a new phase of reality while the rest of us are just pissed about it. There are also tons of surveys and research reports on this kind of thing. Pew Research, which is a very well-known, well-established survey group, has a couple that came out in 2025, and I think the stats are pretty telling. On October 25th, 2025, there was a report (all of these will be linked in the show notes, obviously) whose headline stat is that over two times as many people are more concerned than excited about AI: 34% versus 16%. Another 42% say they're equally concerned and excited. But still, that's a lot of people saying: actually, I'm not sure about AI, I don't know if I buy into the hype. [00:09:09] Igor Krawczuk: Jacob, didn't we talk about the Stanford AI Index, and how that thing claims people are super hyped about AI, that they're mainly optimistic and looking forward to using it? [00:09:21] Jacob Haimes: Yeah. So one criticism of surveys is that maybe you can find the right populace to get the right answer. I personally have high confidence in Pew Research, because this is what they do. But the other thing to consider is that the surveys the Stanford AI Index typically cites are more global, and the global population is in general slightly more pro-AI than the US; the US is overall very AI-negative compared to the rest of the world. That being said, there are still negative trends in the global view too, and the October report I just mentioned was actually a global survey. Looking specifically at the US, there was a report published on April 3rd, 2025, and the headline stat there is that US adults are more than twice as likely as AI experts, meaning people researching and developing AI in industry, to say they believe AI is going to impact the US negatively over the course of the next 20 years. [00:10:52] Igor Krawczuk: Are you telling me that people whose job continuity depends on AI are more favorable towards AI than the general public? [00:11:02] Jacob Haimes: I know, it's shocking. But yes, I am. The thing is, it's not just by a little bit, which I think is telling, given how much emphasis the media machine puts on hyping up AI.
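As a quick sanity check on that "over two times" framing, using only the figures quoted above:

```python
# Checking the ratio behind "over two times as many people are more
# concerned than excited" (34% vs. 16%, from the October 2025 Pew report).
concerned, excited = 0.34, 0.16
print(f"ratio: {concerned / excited:.2f}x")  # ~2.1x, so the claim holds
```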
There's also a growing sentiment that it's not going to help us. And that's backed up by the last one I want to cite from Pew Research, published in September 2025: US adults think that increased use of AI will make people meaningfully worse across the board. Specifically, by a three-to-one ratio they think it will worsen people's ability to think creatively; by ten-to-one, that it will reduce people's ability to form meaningful relationships with other people; by two-to-one, their ability to make difficult decisions; and by three-to-two, their ability to solve problems. So of the things surveyed here, there was not one where people thought, oh, AI will actually make us better at that. They thought AI will make people worse at everything that was surveyed. And that's, again, pretty telling. [00:12:21] Igor Krawczuk: And if I was an AI booster, which, you know, is a crowd we've shat on before, but if I was a booster... [00:12:28] Jacob Haimes: Well, you are a libertarian, apparently. [00:12:32] Igor Krawczuk: ...I might say: ah, these are laypeople, they don't understand the technology. Right? [00:12:38] Jacob Haimes: Right? [00:12:39] Igor Krawczuk: So is that a relevant counterargument here? [00:12:44] Jacob Haimes: I mean, I think it's... [00:12:46] Igor Krawczuk: It's very easy: no. It's about the vibe, and the vibe is objectively pointing in this direction. It doesn't matter whether it's correct; it's a vibe. [00:12:55] Jacob Haimes: That's true. Yeah. I mean, that's not really the point here, right? The point is that the populace does not feel good about AI. The term "clanker" is a thing because people dislike the idea of AI, and people legitimately feel threatened by it, as they should. [00:13:22] Igor Krawczuk: For people who don't know what "clanker" refers to: it's a slur for robots and AI that started being pushed on the internet by, basically, AI haters, and it's slowly catching on in some bubbles, with the hilarious consequence that some people have started to call out the dehumanizing effect of using a slur for agents. [00:13:47] Jacob Haimes: Wait, they're upset that "clanker" dehumanizes the bots? But they're not humans. They're not. [00:13:56] Igor Krawczuk: Yes. That's what you, as an AI bigot, would say. [00:14:00] Jacob Haimes: Ugh, okay, whatever. [00:14:02] Igor Krawczuk: We will link it in the show notes. It is worth sneering at. [00:14:06] Jacob Haimes: But it's not just Pew Research, because it is important to have different sources as well. The University of Toronto and Melbourne Business School also have recent surveys looking at similar questions, and the vibe shift, to me, is clear: people are upset. And then, going back to anecdotes, because anecdotes are fun: the techno-optimist mainstream content creators, like Hank Green, Kurzgesagt, even John Oliver a couple months ago, have made pieces, video-essay kinds of things, talking about how shit and how annoying all this AI stuff is. So this is not just something that
a nerd who's obsessed with this stuff and lives and breathes it, like me, is talking about. This is mainstream now, and that means it's an entirely different animal than it was a couple of years ago. And I guess... sorry, go ahead. [00:15:24] Igor Krawczuk: No, no. What... [00:15:25] Jacob Haimes: Oh, I just wanted to say, on the note of personifying these systems: I think it's important to flag that I keep saying "people are upset about AI," blah, blah, blah. But a very important thing to address is that it isn't really about AI. AI is where the blame has been placed, but AI is just the tool of choice right now. The real problem is the people deploying these systems against us. That's the thing to take away here: people are upset, and currently AI is an easy scapegoat, but we need to take the blinders off and realize that it's not AI that's the problem, it's the people behind it. [00:16:20] Igor Krawczuk: So that's your only quibble, as a person with an agenda behind the podcast: the negative vibes are obviously good for you, you would just want people to transfer them onto the CEOs and the investors behind it. [00:16:37] Jacob Haimes: Yes. And I think that's really important, because we will continue to be led astray if they can dangle that carrot of: oh, you can destroy this thing that you don't like. Then we'll do whatever they want in order to destroy that thing, and we'll just be stuck in the same rat race. That's not the solution here. [00:17:07] Igor Krawczuk: Yeah, there's something to be said about the historical awareness you're hinting at. I think that's important. But I think it's also important to understand why, specifically, people are getting mad, at least at the AI systems. And I would say it's... [00:17:25] Jacob Haimes: Like, why now, basically, is what you're saying. [00:17:28] Igor Krawczuk: Yes. Why not before? Because before, AI wasn't working, and now... it's still not working. All the stuff we're talking about now, the services are new, but the addictiveness, the shittiness, all of it has been around for a while. I would argue it's two things that have hit the mainstream consciousness and are starting to get people to react. And, as usual, it's the economy, stupid. [00:17:58] Jacob Haimes: The economy. The economy is stupid. I agree. [00:18:02] Igor Krawczuk: No, "it's the economy, stupid" is a political term. Why do people lose elections? It's always the economy, stupid. Why do politicians right now have low favorability ratings? Because the economy ain't going that well and inflation is high. [00:18:18] Jacob Haimes: Sure. [00:18:18] Igor Krawczuk: It's a meme in politics. [00:18:22] Jacob Haimes: I also think the economy is stupid. I'd like to go on the record saying that. [00:18:26] Igor Krawczuk: I don't think the economy is stupid. But, you know, I am a libertarian, apparently. [00:18:32] Jacob Haimes: Apparently. [00:18:32] Igor Krawczuk: Again, back to the point: why is it the economy? Because there are two things affecting people. One, AI is ruining the economy for everyday people. And two, AI is breaking your shit.
[00:18:47] Jacob Haimes: Okay, what do you mean? So, "AI is ruining the economy for everyday people": I get that one. That's the idea that... [00:18:59] Igor Krawczuk: Elaborate for the audience, if you get it. [00:19:00] Jacob Haimes: Yeah. So I think, and correct me if I'm wrong, if I'm picking up something different here: currently, when we look at the investment that's happening, especially in the US, and the speculation, it's all tied to AI. It's vastly centered on AI, and that comes at the expense of other industries. And so what's happening... sorry, go ahead. [00:19:35] Igor Krawczuk: No. So, what's happening? [00:19:37] Jacob Haimes: What's happening is that people are losing their jobs and being paid less, because all of the money is going towards AI stuff, which is not where everybody works. And that's a problem not just in the long term, in that the US has tied the value of the dollar to AI's success, which, if you know anything about my stance here, doesn't bode well. It's also having an immediate negative impact on a lot of people. [00:20:16] Igor Krawczuk: Yeah, you got it in one. And to make this concrete with numbers: there was a study by Harvard economists estimating that if you deduct data-center spending, GDP growth in the first half of 2025 would have been 0.1%. So, barely avoiding a recession, more or less zero growth. The only investment and growth happened because of the data-center buildup and spending. [00:20:53] Jacob Haimes: Mm. [00:20:54] Igor Krawczuk: So... [00:20:55] Jacob Haimes: But then what's the second one you said? "AI is breaking my shit"? What do you mean by that? [00:21:03] Igor Krawczuk: So, the aforementioned data-center buildup not only hides the recession, which makes it popular with politicians but not so popular with most people, because not everyone works in construction, not everybody works on building data centers, and people still feel the effect of the lack of other investment. If there had been more investment in, say, shops, people would be able to get a job as a barista; if there had been more investment in normal real estate, then your normal plumber and whatever craftsperson would be employed. Instead, this is all highly specialized and concentrated. But once a data center is built, it needs power, and it needs power in a very particular pattern. When it's an AI workload and it's business hours, those GPUs light up and they're just on; then people go to lunch and they go off; then it peaks up again. And it's always all of them at once. Normally a grid has many different loads, and each one has a little bit of a peak now and then, but on average the total is kind of flat, so it puts much less strain on the grid around it, on the grid quality, meaning the stability of the frequency, than a load that sends shockwaves of electricity through the grid.
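To make that contrast concrete, here's a toy simulation of the two load patterns Igor is describing. Every number in it is invented for illustration; it isn't based on real grid data.

```python
# Toy model of Igor's point: many diversified loads average out to a
# flat-ish total, while a synchronized AI campus produces huge swings
# that the surrounding grid has to absorb. All numbers are invented.
import random

HOURS = list(range(24))

def diversified_total(n_consumers=50):
    """Many small loads, each peaking at its own random hour."""
    total = [0.0] * 24
    for _ in range(n_consumers):
        peak = random.randrange(24)
        for h in HOURS:
            total[h] += 2.0 if h == peak else 1.0  # MW, hypothetical
    return total

def ai_campus():
    """Every GPU ramps with business hours, all at once."""
    return [100.0 if 9 <= h < 12 or 13 <= h < 18 else 5.0 for h in HOURS]

def worst_ramp(load):
    """Largest hour-to-hour swing the grid must compensate for."""
    return max(abs(load[h] - load[h - 1]) for h in HOURS[1:])

print(f"diversified worst ramp: {worst_ramp(diversified_total()):.1f} MW")
print(f"AI campus worst ramp:   {worst_ramp(ai_campus()):.1f} MW")
```

The interesting output is the worst hour-to-hour ramp: it's the sudden swings, not the total consumption, that the surrounding equipment has to absorb.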
[00:22:33] Jacob Haimes: Okay, so getting back out of nerd land: what you're saying is that the way data centers use power is very on-off. A surge of power is needed, then it sits; another surge of power is needed, then it sits. And that pattern is bad for appliances and bad for the infrastructure we have. Is that correct? [00:23:04] Igor Krawczuk: It switches on and off as well, but the point is it's always massive surge, massive drop, massive surge, massive drop, instead of smooth gradients. That sends shockwaves through the grid, which then need to be compensated by other appliances, by the power supply of your fridge, of your laptop, of whatever. And that puts stress on those appliances. There's a very real possibility that your fridge will fail a year earlier than it otherwise would have, if you live next to a place with this type of load pattern. And that's independent of the more immediate problem that your electricity price rises, and your water price rises, because the data center is guzzling it all up. [00:23:55] Jacob Haimes: Yeah. And to be fair, there are good arguments and bad arguments here. The water argument is, I would say, shakier, though there's still some salience to it. But the energy part is certain: it is using energy, it is using energy in that way, and the amount of energy it's using is going to drive energy prices up, and already has. [00:24:28] Igor Krawczuk: I mean, that's why they're also investing in new power plants, nuclear power plants, putting old nuclear power plants back online. The Grok data center was running off mobile gas turbines, with horrible efficiency ratings, for weeks; I'm not sure they're actually off those at this point. So power is a big bottleneck. And you can tell the message has reached the owners, the "okay, this is going to go badly if we don't at least make a plausible case that it will be fine in the long run," because they're now pushing this idea of Starcloud: data centers in space. [00:25:17] Jacob Haimes: Hmm. [00:25:18] Igor Krawczuk: On this, we'll link my little Monte Carlo estimation of how stupid this idea is, and I'm happy to be corrected if people want to email in and tell me I'm wrong. But in my estimate you get a distribution between 10 million and 50 million dollars. So it's really uncertain, but the lower margin is about 10 million per GPU node that you shoot up into low Earth orbit. [00:25:51] Jacob Haimes: Okay. [00:25:52] Igor Krawczuk: Ten million per DGX, which on Earth costs, I think at this point, 250K to 500K. And that's for a single node, and the clusters you need have hundreds and thousands of these things. [00:26:12] Jacob Haimes: And anyone who knows any amount of heat transfer knows that, you know... [00:26:20] Igor Krawczuk: I mean, that's part of why it's so expensive: the way you have to handle heat transfer up there is to radiate it all into space. [00:26:29] Jacob Haimes: Yeah. [00:26:29] Igor Krawczuk: Which means your satellite, which could be super small if you didn't have to do the heat thing, now needs to be quite sizable, because you need to get that heat away. You need conductors, you need radiators, radiation shielding, a bunch of other stuff.
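Igor's actual Monte Carlo is linked in the show notes. For the curious, here is a minimal sketch of how an estimate like that is usually structured. Every parameter range below is a placeholder assumption, except the Earth-side node price he quotes; the point is the shape of the exercise, not the numbers.

```python
# Minimal Monte Carlo sketch for "what does one GPU node in low Earth
# orbit cost?". Parameter ranges are illustrative assumptions, not
# Igor's actual figures (except the Earth-side DGX-class node price
# quoted in the episode).
import random

def sample_cost_usd():
    launch_per_kg = random.uniform(1_500, 10_000)   # launch cost, USD/kg (assumed)
    node_mass_kg = random.uniform(2_000, 6_000)     # node + radiators + shielding + power (assumed)
    hardware = random.uniform(250_000, 500_000)     # DGX-class node on Earth (from the episode)
    hardening = random.uniform(2.0, 10.0)           # space-qualification multiplier (assumed)
    return launch_per_kg * node_mass_kg + hardware * hardening

samples = sorted(sample_cost_usd() for _ in range(100_000))
for label, idx in [("p10", 10_000), ("p50", 50_000), ("p90", 90_000)]:
    print(f"{label}: ${samples[idx]:,.0f}")
```

The spread between p10 and p90 is the real lesson: with this many uncertain multipliers, the honest answer is a wide distribution, which is why Igor gives a 10-to-50-million range rather than a single figure.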
But we have been here before. Not exactly in this way; like, we haven't been sending stuff to... [00:26:50] Jacob Haimes: We've been shooting data centers into space before? [00:26:55] Igor Krawczuk: As far as I know, not. But who knows what the US will declassify in a few decades. [00:27:00] Jacob Haimes: That's... yeah, okay. But you mean we've been in the vibe shift before. [00:27:08] Igor Krawczuk: We've been in the AI vibe shift before, even. [00:27:13] Jacob Haimes: Okay. So, you know, I am younger than you, and you were alive in the seventies, right? [00:27:20] Igor Krawczuk: Basically, yeah. I am ancient, and I have seen the seventies of every century. [00:27:26] Jacob Haimes: That makes sense. So can you tell us a little bit about those seventies? [00:27:31] Igor Krawczuk: So, the 1970s, to be precise, were a time of great hope and splendor for AI, for what people nowadays call good old-fashioned AI. What was hot back then was the radical technology of expert systems, which is basically what is now called a business rule engine. The other thing, which fell out of favor a bit but is still used in some places, is now called a knowledge graph; back then it was called ontologies. And the thing that made all of this "AI" was the idea of doing reasoning via heuristics: you engineer your knowledge graph so that it has all of the concepts you need in your business domain (what is an account, what is a transaction, whatever you want to think about), and you use that to encode rules at a more abstract level, so that, in theory, a search algorithm can piece together the best combination of these rules to answer your question, to generate an answer for a given input. This is still in use, for example, in databases: the SQL query planner is a thing you give a question about some data, and it figures out how to fetch that data efficiently for you. And when they started, they made very quick improvements, some real breakthroughs. They figured out OCR, and suddenly you could have automated letter scanning. They figured out how to take an airline's complicated booking rules, put them into an expert system, and suddenly your clerk could just click, click, click and pull up the price of a ticket with all of the discounts, all the different rules they have. It was actually very cool. And then, you know, they got millions in investment; some governments invested billions into special machines and special research programs to boost all of this. And we're going to link a pamphlet I found of hopeful claims from that era, along the lines of: it has been difficult to fully deliver on the big promises, but because progress has been so rapid in the last five years, surely in five to ten years they will have figured it out, and all of the promise will have been delivered.
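To make "expert system" concrete before the other shoe drops: at its core, a business rule engine is forward chaining, firing hand-written rules until no new facts appear. A toy sketch; the airline-style rules and facts are invented, not taken from any real system.

```python
# Toy forward-chaining rule engine: the core mechanism of 1970s-80s
# expert systems and of today's business rule engines. The rules and
# facts are invented examples.
facts = {"member": True, "ticket_class": "economy", "miles": 60_000}

# Each rule: (name, condition over current facts, fact it derives).
rules = [
    ("gold_status", lambda f: f.get("miles", 0) >= 50_000, ("status", "gold")),
    ("lounge_access", lambda f: f.get("status") == "gold", ("lounge", True)),
    ("upgrade_offer",
     lambda f: f.get("lounge") and f.get("ticket_class") == "economy",
     ("upgrade", True)),
]

# Keep applying rules until a full pass derives nothing new: the engine
# "pieces together" a chain of rules nobody wrote out explicitly.
changed = True
while changed:
    changed = False
    for name, condition, (key, value) in rules:
        if condition(facts) and facts.get(key) != value:
            facts[key] = value
            changed = True
            print(f"fired {name}: {key} = {value!r}")

print(facts)
```

The engine chains 60,000 miles into gold status, gold status into lounge access, and lounge access into an upgrade offer: exactly the kind of multi-step inference that looked like intelligence in 1975 and looks like a query planner today.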
[00:29:59] Jacob Haimes: That's not something I've heard before. [00:30:03] Igor Krawczuk: Would you care to guess what happened five to ten years later? [00:30:07] Jacob Haimes: Well, that was the first AI winter. [00:30:10] Igor Krawczuk: Exactly. They didn't figure it out. What happened was that the initial wins were the easy wins. So you have a rapid rise, and then it gets more and more difficult; you could call it a logarithmic dynamic, where you have diminishing returns the more you push. And there was some generalization in there. This was beyond just hard-coding some stuff; there was a bit of dynamic decision-making the machine was doing for you. But it wasn't enough to go between domains, to go between tasks. You always had to have some dude talking to an expert and encoding everything, tweaking the heuristics, managing the system, making sure it kept working. So instead of having one system that could do anything, you had to adapt it to every case and come up with little hacks to give it new skills in each new domain. [00:31:08] Jacob Haimes: So can you close the loop for me here? How is that what we're seeing now? [00:31:15] Igor Krawczuk: Well, a couple of weeks ago, Anthropic announced: hey, Claude has skills now. Which are handmade, or maybe LLM-generated but still human-made and curated, snippets of scripts and instructions that tell the LLM: here's how to solve this task. And you have to make them one by one, for every task you want. Before that it was tools, it was MCP; they're all part of the same idea. The reason they're doing this is that no matter how hard you try, the model is just not going to get some stuff. So instead they brute-force it and make it for you. And if you look at the curves of all of the benchmarks, all of the evaluations, all of the metrics: they're all slowing down. They're following what you could call a logarithmic dynamic. Adding more and more pre-training data used to be the be-all and end-all, and then it wasn't; we talked about that in our rambling episode. Then reasoning was supposed to be the be-all and end-all for the boosters, and now there are papers coming out showing there are trade-offs: the more the model rambles, the more likely it is to hallucinate. So that's slowing down too, and not cutting it. All of the generalist routes are cut off. What is left is pre-training data curation and hard-coding individual skills.
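For a sense of what "hard-coding individual skills" amounts to in practice, here is a hypothetical sketch of the pattern. This is not Anthropic's actual file format, and the skill names and contents are invented; the point is the shape: one hand-curated recipe per task, injected into the model's context when it matches.

```python
# Hypothetical sketch of the "skills" pattern: hand-curated, per-task
# instructions injected into an LLM's context. Not Anthropic's real
# format; skill names and contents are invented for illustration.
SKILLS = {
    "pdf_tables": (
        "To extract tables from a PDF, run scripts/pdf_tables.py on the "
        "file, then check column counts before summarizing."
    ),
    "expense_report": (
        "Fill in templates/expenses.xlsx and flag any line item over "
        "$500 for manual review."
    ),
    # ...one entry per task, written and maintained by humans.
}

def build_prompt(task: str, request: str) -> str:
    """Prepend the matching recipe; outside the curated list, the model
    is back to unguided guessing, which is Igor's point."""
    skill = SKILLS.get(task)
    if skill is None:
        return request
    return f"Instructions for this task:\n{skill}\n\nRequest: {request}"

print(build_prompt("pdf_tables", "Summarize revenue from report.pdf"))
```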
[00:32:43] Jacob Haimes: Well, to be fair, there are sometimes new things that can be done that then unlock a little more improvement. But the point I think you're trying to make here is that what was presented initially was that all we needed to do was scale; we didn't need any new breakthroughs whatsoever. And then it turned out that wasn't the case. At NeurIPS last year, Ilya Sutskever gave his talk for his ten-year paper and said: oh, well, now we're going to scale inference time. And now we're again seeing that that got us a little bit of the way, but it doesn't actually just solve the problem either. So there may be other things that unlock further increases (one of them is clearly just spending a shit-ton of money on running the system over and over again), but that's not what was presented initially, right? [00:33:57] Igor Krawczuk: I mean, it sort of was presented. What was presented originally was: not only can scale-up get us there, it's also going to get us there with a reasonable amount of scale-up. [00:34:12] Jacob Haimes: Yeah. [00:34:12] Igor Krawczuk: And now we're at Sam Altman talking about 1.4 trillion dollars in data-center spend, $300 billion commitments, and it's still unusable for anything that is non-boilerplate if you don't do massive amounts of handholding. As soon as you go outside of a domain it was trained in, or where it has a skill, or where you fine-tuned it, it breaks down. Fine-tuning is having a massive resurgence, because if you have a dataset it's much cheaper to take Gemma and fine-tune it than to try to wrangle a system out of the foundation models, because they're not adapted to your use case. And the bigger takeaway is: there was an AI winter, but we still use knowledge graphs, we still use all of this stuff. It's just technology now. It just couldn't sustain the bubble; it couldn't land the bubble without a drop. [00:35:18] Jacob Haimes: So you and I just did, I think, 15 minutes, 15 minutes plus or minus 15%, on bubbles and stuff. We definitely got into the weeds, so we're going to put that in the Patreon extended cut; if you're interested, go check it out. But for those of us not listening to that, can I get the short version, Igor? [00:35:46] Igor Krawczuk: The very, very short version is: people used to say it can't be a bubble because there's no debt financing and there's actual growth. And there are reasons this is predicted by a theory, by a guy called Didier Sornette, who studied shit like crypto bubbles and other investment bubbles. I find it very interesting, but the TL;DR is that bubbles happen when there's enough uncertainty, and plausible narratives in both directions, that the price can either drift or follow convenient narratives. So if you're looking for the next investment because tech has kind of slowed down and you really need that tech investment, that could be one reason why the price drifts upwards; there's also a game-theoretical reason for that drift. And that's why I would argue we are now in an economic bubble: the valuations, like we said, need to be extremely high to justify the current investment. We are in that bubble regime, and just because it hasn't popped yet doesn't mean it's not going to pop. [00:37:00] Jacob Haimes: Yeah. And then I guess the last button on that is that the bubble is shaking. People are shorting Nvidia, people are shorting other AI companies; not all of them are public, but the ones that can be shorted are being shorted. And there were a couple of recent drops, maybe not mini, they're still big... [00:37:28] Igor Krawczuk: One drop translated into something like 800 billion in market cap being wiped out. [00:37:32] Jacob Haimes: Right. So we are seeing this bubble shaking. And then the question is: what can we do about this, and about all of this? [00:37:43] Igor Krawczuk: The first thing I would say is: we need to pop the bubble as soon as possible. That's also something Cory Doctorow has been saying. If we pop the bubble now, it's going to suck; like I said earlier, the US has tied itself to the AI boom a lot, with the dollar, so it's going to suck in any case. But it's a choice between it just sucking, or it being a repeat of 2008. [00:38:10] Jacob Haimes: Okay, but how would we do that, though?
[00:38:13] Igor Krawczuk: One of the ways you can pop a bubble is by cutting off the influx of capital: by creating a shock, by making a supply-chain crunch and seeing if the companies can sustain slightly worse times. [00:38:29] Jacob Haimes: Hmm. [00:38:30] Igor Krawczuk: And one thing they really need a lot of is influx of cash and of compute. Which means: if people stop data centers from coming online, and stop those deals going through that enable the money flow Nvidia and OpenAI have been stirring up with their circular deals, then there's a chance it will be a shock to the whole system. [00:39:02] Jacob Haimes: Okay. And one way to do that, which has been working already (people are already doing this, and it's been working, and that's great) is protesting data centers. Not just in the US, where it's happening quite a bit, but also in Ireland and other places in the EU, people are saying: wait a second, I don't want this data center here, I don't want you to put that there. And Data Center Watch, a more recent reporting group looking specifically at this issue, has said that $64 billion of data-center projects have been blocked or delayed by this kind of grassroots, "not in my backyard," so to speak, data-center protesting. So if that's working, keep doing it. [00:40:03] Igor Krawczuk: It has been working so well that Meta is now running pro-data-center ads about rural towns, talking about the job creation they will bring, and of course they're being called out on it already. The other thing worth mentioning is calling representatives. Not just protesting, but calling whoever you can, be it your political representative or your financial advisor, to not invest in AI "opportunities." As soon as OpenAI goes public, they're going to try to get people to invest in them via 401(k)s, and you should not do that. Probably. Actually, I need to be careful here: you can do that if you want to gamble on it, but if you want to stop the bubble, then you should not do that. Financial-advice-wise, you might make a shit-ton of money if you can ride the bubble, but it's going to be super risky. So, just to not get sued for giving bad financial advice, I should qualify that statement. [00:41:03] Jacob Haimes: Well, I mean, also: we're not financial advisors, we're not legal advisors, we're not healthcare providers or advisors. Do whatever the fuck you want. [00:41:16] Igor Krawczuk: I mean, we are stating opinions, and I don't want to hide behind the disclaimer. Investing in OpenAI is just not necessarily a good investment. [00:41:28] Jacob Haimes: Sure. [00:41:29] Igor Krawczuk: Yeah. [00:41:31] Jacob Haimes: Um... [00:41:31] Igor Krawczuk: The other thing you can do is call it out, because a lot of this rides on perception. [00:41:40] Jacob Haimes: Yeah. [00:41:40] Igor Krawczuk: If you want to be a bit more reasonable, you can point out the normalness of the technology. You don't have to be a hater, just call things out. Though by now it's more complicated than that, because even that is bad for the hype.
If you just talk honestly about actual experiences of using this technology, and what it actually takes to make it work, without the hype, you're doing good already. And if you're a creative person, a comedian, somebody who's funnier than me: make fun of the cringe, the bullshit, and the out-of-proportion claims. [00:42:18] Jacob Haimes: Yeah. [00:42:18] Igor Krawczuk: Just mock the hell out of the whole idea of AGI, of AI as a thing. Find the stuff that is truly ridiculous about it and highlight it, because this has actual power in destroying the hype and the FOMO nature of this thing. And we're going to put some links in the show notes that make the case for mockery as an actual tool of activism. [00:42:54] Jacob Haimes: Yeah. So, bringing this back to where we started: they promised us cures for cancer, right? They promised us utopia. And instead, what we've gotten is the cyberpunk setting, where you literally and figuratively jack into the mainframe and become a button-pushing monkey getting your dopamine hits from their porn system. And the fact that this is happening now, when people are beginning to sour on AI, when people are beginning to realize that this is a grift and that there isn't as much value in it as was being shilled, and when these companies need investment and need to demonstrate that they can make money from their products, shows, at least to me, that these guys are trying to grab as much as they can on the way out, right? [00:44:21] Igor Krawczuk: They're trying to grab as much as they can before the bubble fully pops, or they're trying to reinflate it so they can keep the grift going for a bit longer. [00:44:35] Jacob Haimes: Yeah. And that is all the muck we have for today. If you liked this episode, please share it. And if there's one thing you do (I think I've mentioned this before, but from a machine-learning perspective it's really, really valuable): leave comments and reviews with actual words in them on whatever podcast platform you're on. The fact that there is a written review from an audience member is weighted very heavily in whether or not a platform will show the show to other people. So even if you hate us... well, if you hate it, you're probably not still listening at this point, but if you are and you hate it, do that too. Alright, well, we'll see you next time.