WEBVTT - The Coming Generative AIpocalypse

0:00:02.800 --> 0:00:08.319
<v Speaker 1>Cool Zone Media. Hello, and welcome to Better Offline. I'm your

0:00:08.320 --> 0:00:22.759
<v Speaker 1>host, Ed Zitron. In the last episode, I walked

0:00:22.760 --> 0:00:25.040
<v Speaker 1>you through how generative AI has hit its ceiling and how

0:00:25.040 --> 0:00:27.800
<v Speaker 1>big tech has burned billions of dollars chasing a technology

0:00:28.000 --> 0:00:30.000
<v Speaker 1>that really isn't going to give them the returns they

0:00:30.040 --> 0:00:33.560
<v Speaker 1>got from the smartphone and cloud computing revolutions, not that

0:00:33.560 --> 0:00:34.080
<v Speaker 1>that's going.

0:00:34.040 --> 0:00:35.440
<v Speaker 2>To stop them.

0:00:35.680 --> 0:00:37.800
<v Speaker 1>In this episode, I'm going to walk you through the

0:00:37.840 --> 0:00:41.159
<v Speaker 1>potentially massive problems they have ahead, the ones they've created

0:00:41.280 --> 0:00:44.320
<v Speaker 1>by building these massive supercomputers, the ones necessary to capture

0:00:44.320 --> 0:00:47.440
<v Speaker 1>this imaginary demand of this kind of bullshit AI revolution,

0:00:47.920 --> 0:00:50.080
<v Speaker 1>and I'm going to try and guess what might happen next.

0:00:50.720 --> 0:00:53.640
<v Speaker 1>All right, So, the AI boom helped the S and

0:00:53.680 --> 0:00:56.360
<v Speaker 1>P five hundred hit record levels in twenty twenty four,

0:00:56.600 --> 0:00:59.240
<v Speaker 1>largely thanks to chip giant Nvidia, a company that makes

0:00:59.280 --> 0:01:01.800
<v Speaker 1>both the GPUs necessary to train and run generative AI

0:01:02.080 --> 0:01:04.520
<v Speaker 1>and the software architecture behind them, as well as some

0:01:04.600 --> 0:01:08.440
<v Speaker 1>other server parts too. Part of Nvidia's remarkable growth has

0:01:08.440 --> 0:01:11.640
<v Speaker 1>been its ability to capitalize on its CUDA architecture, the software

0:01:11.680 --> 0:01:15.160
<v Speaker 1>layer that lets you do complex computing on GPUs rather

0:01:15.200 --> 0:01:17.679
<v Speaker 1>than simply using them to render video games or complex

0:01:17.720 --> 0:01:20.080
<v Speaker 1>three D images. Someone's going to email and say that

0:01:20.080 --> 0:01:20.880
<v Speaker 1>that's partially wrong.

0:01:21.000 --> 0:01:22.120
<v Speaker 2>Just bear with me.

0:01:23.360 --> 0:01:25.840
<v Speaker 1>And of course, one of Nvidia's other things they do

0:01:25.959 --> 0:01:28.520
<v Speaker 1>is continually create new GPUs to sell for tens of

0:01:28.560 --> 0:01:30.679
<v Speaker 1>thousands of dollars to tech companies that want to burn

0:01:30.760 --> 0:01:34.080
<v Speaker 1>billions of dollars on generative AI, which has led Nvidia's stock

0:01:34.120 --> 0:01:36.000
<v Speaker 1>to pop more than one hundred and seventy nine percent

0:01:36.040 --> 0:01:39.080
<v Speaker 1>over the last year. Back in May, Nvidia's CEO

0:01:39.120 --> 0:01:42.120
<v Speaker 1>and professional carnival barker Jensen Huang said that the company

0:01:42.160 --> 0:01:44.120
<v Speaker 1>was now, and I quote, on a one year rhythm

0:01:44.120 --> 0:01:48.120
<v Speaker 1>in AI GPU production with its latest Blackwell GPUs, specifically the

0:01:48.120 --> 0:01:50.760
<v Speaker 1>B100, B200, and GB200,

0:01:50.960 --> 0:01:53.360
<v Speaker 1>used for generative AI, supposedly due at the end of

0:01:53.400 --> 0:01:56.360
<v Speaker 1>twenty twenty four, except they're now delayed until at least

0:01:56.400 --> 0:02:00.000
<v Speaker 1>March twenty twenty five. Now, before we go any further,

0:02:00.200 --> 0:02:02.400
<v Speaker 1>it's worth noting that when I say GPU, I don't

0:02:02.440 --> 0:02:04.360
<v Speaker 1>mean the one you'd find in a gaming PC, but

0:02:04.400 --> 0:02:06.760
<v Speaker 1>a much much larger chip put on a specialized server

0:02:07.000 --> 0:02:10.239
<v Speaker 1>and actually a specialized board as well, with multiple other GPUs,

0:02:10.320 --> 0:02:13.560
<v Speaker 1>all integrated with special casing, cooling, and networking infrastructure,

0:02:13.880 --> 0:02:15.760
<v Speaker 1>all of which is sold by Nvidia. By the way,

0:02:15.840 --> 0:02:18.280
<v Speaker 1>you can use other things, but you might as well buy it

0:02:18.280 --> 0:02:21.840
<v Speaker 1>in one place, right? That's their product strategy. In simple terms,

0:02:21.880 --> 0:02:23.640
<v Speaker 1>these are the things necessary to make sure all these

0:02:23.720 --> 0:02:26.520
<v Speaker 1>chips work together efficiently and also stop them from overheating

0:02:26.800 --> 0:02:29.880
<v Speaker 1>because they get extremely hot, and generative AI makes them

0:02:29.960 --> 0:02:31.680
<v Speaker 1>run at full speed all the time.

0:02:32.560 --> 0:02:33.519
<v Speaker 2>Now, the initial

0:02:33.240 --> 0:02:35.200
<v Speaker 1>Delay of the new Blackwell chips is caused by a

0:02:35.240 --> 0:02:38.720
<v Speaker 1>now fixed design flaw in production. But as I've previously suggested,

0:02:38.720 --> 0:02:41.760
<v Speaker 1>the problem isn't just creating the chips, it's making sure

0:02:41.800 --> 0:02:44.560
<v Speaker 1>they actually work at scale for the jobs they've

0:02:44.400 --> 0:02:45.000
<v Speaker 2>Been bought for.

0:02:46.840 --> 0:02:52.560
<v Speaker 1>Now what if that wasn't possible? A few weeks ago,

0:02:52.639 --> 0:02:55.200
<v Speaker 1>The Information reported Nvidia is currently grappling with the

0:02:55.200 --> 0:02:58.560
<v Speaker 1>oldest problem in computing, how to cool the fucking things.

0:02:59.080 --> 0:03:01.640
<v Speaker 1>According to their report, Nvidia has been asking suppliers

0:03:01.680 --> 0:03:04.800
<v Speaker 1>to change the design of its three thousand pound seventy

0:03:04.840 --> 0:03:08.440
<v Speaker 1>two GPU server racks several times to overcome these heating problems,

0:03:08.600 --> 0:03:11.880
<v Speaker 1>which The Information calls, quote, the most complicated design that

0:03:11.960 --> 0:03:14.639
<v Speaker 1>Nvidia had ever come up with. According to the report,

0:03:14.680 --> 0:03:17.200
<v Speaker 1>A few months after revealing the racks, engineers found that

0:03:17.240 --> 0:03:21.120
<v Speaker 1>they didn't work properly even when they used the smaller

0:03:21.200 --> 0:03:24.079
<v Speaker 1>thirty six chip racks, and they've been kind of scrambling

0:03:24.080 --> 0:03:25.000
<v Speaker 1>to fix it ever since.

0:03:25.760 --> 0:03:27.560
<v Speaker 2>While one can dazzle investors with

0:03:27.480 --> 0:03:31.640
<v Speaker 1>Buzzwords, charts and the term superintelligence, the laws of physics

0:03:31.680 --> 0:03:34.240
<v Speaker 1>are a little bit more of a harsh mistress. And

0:03:34.280 --> 0:03:36.560
<v Speaker 1>if Nvidia is struggling mere months before the first

0:03:36.560 --> 0:03:39.200
<v Speaker 1>installations begin, though I've heard some are going out now,

0:03:39.680 --> 0:03:43.000
<v Speaker 1>it's kind of unclear how they practically plan to launch

0:03:43.040 --> 0:03:46.320
<v Speaker 1>this next generation of chips, let alone doing another one

0:03:46.320 --> 0:03:49.520
<v Speaker 1>in a year. The Information reports that these changes have

0:03:49.560 --> 0:03:52.520
<v Speaker 1>been made late in production, which is scaring customers that

0:03:52.560 --> 0:03:55.200
<v Speaker 1>desperately need them now so that their models can continue

0:03:55.240 --> 0:03:57.280
<v Speaker 1>to do something that they will work out later.

0:03:57.800 --> 0:03:59.320
<v Speaker 2>To quote The Information again.

0:04:00.000 --> 0:04:02.440
<v Speaker 1>Executives at large cloud providers that have ordered the new

0:04:02.520 --> 0:04:05.760
<v Speaker 1>chips said they're concerned that such last minute difficulties might

0:04:05.760 --> 0:04:07.480
<v Speaker 1>push back the timeline for when they can get their

0:04:07.480 --> 0:04:11.000
<v Speaker 1>GPU clusters up and running next year. The fact that

0:04:11.000 --> 0:04:14.520
<v Speaker 1>Nvidia is having such significant difficulties with thermal performance

0:04:14.160 --> 0:04:15.320
<v Speaker 2>Is very, very bad.

0:04:15.760 --> 0:04:18.800
<v Speaker 1>These chips are very expensive, thirty to seventy grand I

0:04:18.880 --> 0:04:21.400
<v Speaker 1>hear and will be running, as I've mentioned, at full speed,

0:04:21.600 --> 0:04:24.360
<v Speaker 1>generating an incredible amount of heat that must be dissipated

0:04:24.640 --> 0:04:27.200
<v Speaker 1>while sat next to anywhere from thirty five to seventy

0:04:27.240 --> 0:04:30.040
<v Speaker 1>one other chips, which will in turn be densely packed

0:04:30.120 --> 0:04:32.440
<v Speaker 1>so that you can cram more servers into a data center.

0:04:33.000 --> 0:04:36.280
<v Speaker 1>New, more powerful chips require entirely new methods to rack them,

0:04:36.320 --> 0:04:38.839
<v Speaker 1>mount them, operate them, and cool them, and all of

0:04:38.880 --> 0:04:44.520
<v Speaker 1>these parts must operate in sync, as overheating GPUs will die.

0:04:44.600 --> 0:04:46.599
<v Speaker 1>And while these units are big, some of their internal

0:04:46.600 --> 0:04:49.600
<v Speaker 1>components are nanometers in size, and unless properly cooled, their

0:04:49.600 --> 0:04:51.680
<v Speaker 1>circuits will start to crumble when roasted by a guy

0:04:51.720 --> 0:04:54.440
<v Speaker 1>typing Garfield with a gun and a bra into ChatGPT,

0:04:54.839 --> 0:04:58.360
<v Speaker 1>which I have never done, of course. Remember, Blackwell is

0:04:58.360 --> 0:05:01.200
<v Speaker 1>supposed to be Nvidia's big thing. It's meant to

0:05:01.240 --> 0:05:04.000
<v Speaker 1>represent a major leap forward in performance. If Nvidia

0:05:04.040 --> 0:05:06.720
<v Speaker 1>doesn't solve its cooling problem, and solve it well, its

0:05:06.760 --> 0:05:10.400
<v Speaker 1>customers will undoubtedly encounter thermal throttling, where the chip reduces

0:05:10.440 --> 0:05:13.240
<v Speaker 1>its speed in order to avoid causing any permanent damage.

0:05:13.520 --> 0:05:17.080
<v Speaker 1>It could eliminate some or all of the performance gains

0:05:17.200 --> 0:05:20.560
<v Speaker 1>obtained from the new architecture and new manufacturing process, despite

0:05:20.600 --> 0:05:23.440
<v Speaker 1>costing much, much more, both for the chips themselves and

0:05:23.480 --> 0:05:26.799
<v Speaker 1>the housing for them. Nvidia's problem isn't just bringing

0:05:26.880 --> 0:05:29.680
<v Speaker 1>these thermal performance issues under control, but both keeping them

0:05:29.720 --> 0:05:32.360
<v Speaker 1>under control and being able to educate their customers on

0:05:32.400 --> 0:05:35.400
<v Speaker 1>how to do so. Nvidia has, according to The Information,

0:05:35.720 --> 0:05:39.000
<v Speaker 1>repeatedly tried to influence its customers' server integrations to follow

0:05:39.040 --> 0:05:42.240
<v Speaker 1>its designs because it thinks it will lead to better performance.

0:05:42.279 --> 0:05:43.839
<v Speaker 1>But in this case one has to worry if

0:05:43.920 --> 0:05:47.760
<v Speaker 1>Nvidia's Blackwell chips can be reliably cooled. Maybe they

0:05:47.800 --> 0:05:50.360
<v Speaker 1>can. Nvidia's worked out crazy things in the past,

0:05:50.720 --> 0:05:52.479
<v Speaker 1>and while Nvidia might be able to fix this

0:05:52.520 --> 0:05:54.919
<v Speaker 1>problem in isolation within its racks, it remains to be

0:05:54.920 --> 0:05:56.599
<v Speaker 1>seen how this works at scale as they ship and

0:05:56.600 --> 0:05:59.760
<v Speaker 1>integrate hundreds of thousands of Blackwell GPUs starting in the

0:05:59.760 --> 0:06:03.719
<v Speaker 1>first half of twenty twenty five. Things also get a

0:06:03.720 --> 0:06:06.000
<v Speaker 1>little worse when you realize how these chips are being

0:06:06.040 --> 0:06:09.760
<v Speaker 1>installed in these giant supercomputer data centers, where tens of

0:06:09.800 --> 0:06:12.160
<v Speaker 1>thousands, or as many as one hundred thousand of them

0:06:12.160 --> 0:06:14.960
<v Speaker 1>in the case of Elon Musk's Colossus data center, which

0:06:15.000 --> 0:06:18.200
<v Speaker 1>is just very funny that that's what it's for.

0:06:18.279 --> 0:06:20.120
<v Speaker 2>It's for Grok on a dying social network.

0:06:20.600 --> 0:06:23.160
<v Speaker 1>But yeah, these GPUs, they're running in concert to power generative

0:06:23.200 --> 0:06:23.800
<v Speaker 1>AI models.

0:06:23.839 --> 0:06:25.039
<v Speaker 2>That's all they're really for.

0:06:25.760 --> 0:06:27.760
<v Speaker 1>And The Wall Street Journal reported a few weeks ago

0:06:27.800 --> 0:06:31.440
<v Speaker 1>that building these vast data centers creates entirely new engineering challenges,

0:06:31.560 --> 0:06:33.720
<v Speaker 1>with one expert saying that big tech companies could be

0:06:33.800 --> 0:06:36.800
<v Speaker 1>spending as much as half of their capital expenditures on

0:06:36.880 --> 0:06:39.520
<v Speaker 1>replacing parts that have broken down, in large part because

0:06:39.520 --> 0:06:41.880
<v Speaker 1>these clusters are running on GPUs that are running at

0:06:41.880 --> 0:06:44.240
<v Speaker 1>full speed, that all need to be cooled, and need

0:06:44.279 --> 0:06:44.880
<v Speaker 1>to be cooled

0:06:44.880 --> 0:06:45.159
<v Speaker 2>Well.

0:06:46.040 --> 0:06:49.159
<v Speaker 1>Remember the capital expenditures on generative AI and the associated

0:06:49.200 --> 0:06:52.120
<v Speaker 1>infrastructure have gone over two hundred billion dollars in the

0:06:52.160 --> 0:06:55.520
<v Speaker 1>last year. If half of that's dedicated to replacing broken shit,

0:06:55.560 --> 0:06:58.719
<v Speaker 1>what happens when there's no path to profitability? Nvidia, in any

0:06:58.760 --> 0:07:02.000
<v Speaker 1>case, is fine. They'll work it out well enough to

0:07:02.040 --> 0:07:04.320
<v Speaker 1>keep flogging these things. They've already made billions of dollars

0:07:04.400 --> 0:07:06.919
<v Speaker 1>selling Blackwells. They're sold out for a year, in fact,

0:07:07.000 --> 0:07:09.960
<v Speaker 1>and they'll continue to do so for now. But any

0:07:10.000 --> 0:07:12.679
<v Speaker 1>manufacturing and cooling issues are going to be very costly,

0:07:13.320 --> 0:07:15.480
<v Speaker 1>and even then, at some point somebody has to ask

0:07:15.480 --> 0:07:18.720
<v Speaker 1>the question, why do we need all these GPUs if

0:07:18.760 --> 0:07:22.560
<v Speaker 1>we've reached peak AI. Despite the remarkable power of these chips,

0:07:22.640 --> 0:07:25.600
<v Speaker 1>Nvidia's entire enterprise GPU business model centers around the

0:07:25.640 --> 0:07:28.200
<v Speaker 1>idea that throwing more power at these problems will finally

0:07:28.200 --> 0:07:29.280
<v Speaker 1>create some solutions.

0:07:30.400 --> 0:07:31.880
<v Speaker 2>What if that isn't the case, though?

0:07:32.000 --> 0:07:37.720
<v Speaker 1>So as ever, I'm kind of pissed off. I'm

0:07:37.760 --> 0:07:40.080
<v Speaker 1>pissed off reading this stuff. And it's not just because

0:07:40.080 --> 0:07:42.600
<v Speaker 1>I was saying this months ago. It's because all of

0:07:42.600 --> 0:07:44.560
<v Speaker 1>this money could have gone literally anywhere else. They could

0:07:44.560 --> 0:07:47.000
<v Speaker 1>have done something else, they could have I don't know,

0:07:47.480 --> 0:07:49.520
<v Speaker 1>I'm not running a big tech company, but I would

0:07:49.560 --> 0:07:52.160
<v Speaker 1>have put it into climate stuff or new battery technology.

0:07:52.680 --> 0:07:55.040
<v Speaker 1>Everyone's saying, oh, we'll make our own chips now, Yes,

0:07:55.080 --> 0:07:57.440
<v Speaker 1>see you in a few years, dickhead. Well, you go

0:07:57.520 --> 0:08:00.000
<v Speaker 1>to TSMC and ask if they've got any spare slots.

0:08:00.120 --> 0:08:02.760
<v Speaker 1>I'm sure they've got plenty. I'm sure that you can

0:08:02.840 --> 0:08:05.200
<v Speaker 1>just go and build a chip tomorrow. Sam Altman talking

0:08:05.200 --> 0:08:09.880
<v Speaker 1>about building his own goddamn chips. What an absolute farce

0:08:09.920 --> 0:08:10.280
<v Speaker 2>This is.

0:08:10.520 --> 0:08:13.960
<v Speaker 1>This fucking farce. All of this money burned in search

0:08:14.000 --> 0:08:16.360
<v Speaker 1>of a technology that only kind of works on something,

0:08:16.520 --> 0:08:20.200
<v Speaker 1>sometimes. It's just kind of a joke. The tech industry

0:08:20.240 --> 0:08:23.040
<v Speaker 1>is overleveraged. They've doubled, they've tripled, they've quadrupled down

0:08:23.040 --> 0:08:25.680
<v Speaker 1>on generative AI now and this technology does not do

0:08:25.880 --> 0:08:27.720
<v Speaker 1>much more than it did a few months ago, and

0:08:27.760 --> 0:08:29.400
<v Speaker 1>I don't think it's going to do very much more

0:08:29.440 --> 0:08:32.760
<v Speaker 1>in a few months. Every single big tech company's poured

0:08:32.760 --> 0:08:35.520
<v Speaker 1>tens of billions of dollars into building out these massive

0:08:35.559 --> 0:08:39.719
<v Speaker 1>superclusters with the intent of capturing AI demand. Yet they

0:08:39.760 --> 0:08:42.280
<v Speaker 1>never seemed to consider whether they were actually building things

0:08:42.280 --> 0:08:45.040
<v Speaker 1>that people wanted or would pay for, or that would

0:08:45.040 --> 0:08:47.960
<v Speaker 1>make them money, or that would help humanity in

0:08:48.000 --> 0:08:50.760
<v Speaker 1>any way, or really anything to do with an outcome

0:08:50.800 --> 0:08:53.960
<v Speaker 1>other than line go up. While some have claimed that

0:08:54.000 --> 0:08:58.120
<v Speaker 1>agents are the next frontier of AI. Agents? Frontier? Jesus. Nope,

0:08:58.240 --> 0:09:00.439
<v Speaker 1>keep 'em. The reality is that agents may be the

0:09:00.559 --> 0:09:04.199
<v Speaker 1>last generative AI product. Multiple large language models and integrations

0:09:04.240 --> 0:09:06.520
<v Speaker 1>bouncing off of each other in an attempt to simulate what

0:09:06.559 --> 0:09:08.480
<v Speaker 1>a human might do at a cost that won't be

0:09:08.480 --> 0:09:12.040
<v Speaker 1>sustainable for the majority of businesses. While Anthropic's demo of

0:09:12.080 --> 0:09:15.160
<v Speaker 1>its alleged model that can control a few browser windows

0:09:15.160 --> 0:09:17.840
<v Speaker 1>with a prompt might have seemed impressive to credulous people,

0:09:18.520 --> 0:09:22.160
<v Speaker 1>specifically meaning, in this case, you. And these were controlled demos

0:09:22.160 --> 0:09:25.440
<v Speaker 1>which Anthropic admitted were slow and make lots of mistakes. Hey,

0:09:25.559 --> 0:09:28.079
<v Speaker 1>almost like it's hallucinating. I sure hope they fixed that

0:09:28.200 --> 0:09:30.360
<v Speaker 1>unfixable problem I've been talking about for months.

0:09:30.440 --> 0:09:31.080
<v Speaker 2>Jesus Christ.

0:09:32.360 --> 0:09:36.120
<v Speaker 1>Even if it does, Anthropic has now successfully replaced an

0:09:36.240 --> 0:09:40.079
<v Speaker 1>entry-level data worker position at an indeterminate cost, and

0:09:40.280 --> 0:09:43.320
<v Speaker 1>likely an unprofitable one too. And in many organizations those

0:09:43.400 --> 0:09:45.840
<v Speaker 1>jobs have already been outsourced, or automated, or staffed with

0:09:45.920 --> 0:09:48.320
<v Speaker 1>cheap contractors. So the people who are going to lose

0:09:48.360 --> 0:09:50.520
<v Speaker 1>out here are people who are already getting shat on

0:09:50.559 --> 0:09:51.120
<v Speaker 1>by life.

0:09:51.320 --> 0:09:52.160
<v Speaker 2>It really sucks.

0:09:52.679 --> 0:09:55.880
<v Speaker 1>And the obscenity of this mass delusion, it's just nauseating.

0:09:56.080 --> 0:09:58.160
<v Speaker 1>It's a monument to bad decision making and the herd

0:09:58.160 --> 0:10:01.440
<v Speaker 1>mentality of tech's most powerful people, as well as an

0:10:01.480 --> 0:10:04.160
<v Speaker 1>outright attempt to manipulate the media into believing something was

0:10:04.200 --> 0:10:07.280
<v Speaker 1>possible that never was, and the media bought it, hook

0:10:07.360 --> 0:10:08.240
<v Speaker 1>line and sinker.

0:10:09.120 --> 0:10:09.920
<v Speaker 2>You all bought it.

0:10:10.920 --> 0:10:13.480
<v Speaker 1>Hundreds of billions of dollars have been wasted building giant

0:10:13.559 --> 0:10:16.240
<v Speaker 1>data centers to crunch numbers for software that has no

0:10:16.320 --> 0:10:18.640
<v Speaker 1>real product-market fit, all while trying to hammer it

0:10:18.640 --> 0:10:21.760
<v Speaker 1>into various shapes to make it pretend that it's alive, conscious,

0:10:21.840 --> 0:10:25.360
<v Speaker 1>or even fucking useful. There is no path, from what

0:10:25.440 --> 0:10:27.880
<v Speaker 1>I can see, to turning generative AI from what it

0:10:27.920 --> 0:10:31.520
<v Speaker 1>is today into anything resembling a sustainable business. And the

0:10:31.559 --> 0:10:34.760
<v Speaker 1>only path that big tech appeared to have seen or

0:10:34.840 --> 0:10:37.240
<v Speaker 1>thought about was to throw as much money, power and

0:10:37.320 --> 0:10:40.320
<v Speaker 1>data at the problem. An avenue that appeared to be

0:10:40.440 --> 0:10:42.880
<v Speaker 1>and definitely appears to be now a complete dead end,

0:10:43.640 --> 0:10:47.120
<v Speaker 1>and yet still nothing's really come out of this movement.

0:10:47.960 --> 0:10:50.640
<v Speaker 1>I've used a handful of AI products that I found useful,

0:10:50.720 --> 0:10:54.320
<v Speaker 1>an AI-powered journal, for example, but these are not products

0:10:54.320 --> 0:10:57.560
<v Speaker 1>that one associates with revolutions. They're useful tools that would

0:10:57.600 --> 0:11:00.000
<v Speaker 1>have been a welcome surprise if they didn't require burning

0:11:00.120 --> 0:11:03.080
<v Speaker 1>billions of dollars, blowing past emissions targets, and stealing the

0:11:03.080 --> 0:11:06.640
<v Speaker 1>creative works of millions of people to train them. It's

0:11:06.679 --> 0:11:10.360
<v Speaker 1>just sickening. It's sickening because this did not have to happen.

0:11:10.520 --> 0:11:13.320
<v Speaker 1>Had Sundar Pichai seen what Microsoft did with Open

0:11:13.360 --> 0:11:16.520
<v Speaker 1>AI and said, not for me, mate. That's definitely not

0:11:16.559 --> 0:11:19.440
<v Speaker 1>how he talks. But nevertheless, none of this would have continued.

0:11:19.559 --> 0:11:21.360
<v Speaker 1>You think Microsoft was going to hold this shit up

0:11:21.360 --> 0:11:24.880
<v Speaker 1>on their own. God, no, they have no juice. Satya

0:11:24.920 --> 0:11:27.640
<v Speaker 1>Nadella, he's just a, he's a cult leader, something

0:11:27.679 --> 0:11:30.400
<v Speaker 1>I'm going to get into in a later episode. And

0:11:30.440 --> 0:11:33.160
<v Speaker 1>it's just it's all very sad. But if you want

0:11:33.160 --> 0:11:35.520
<v Speaker 1>to cheer yourself up, maybe you could listen to one

0:11:35.520 --> 0:11:38.200
<v Speaker 1>of the following advertisements. And this is a short episode,

0:11:38.240 --> 0:11:39.920
<v Speaker 1>so if they don't put one after this, you might

0:11:39.960 --> 0:11:43.560
<v Speaker 1>just have a short gap between me talking. That's up

0:11:43.600 --> 0:11:50.800
<v Speaker 1>to Matt. Matt's a genius. That's my producer, by the way, Matt Osowski. Anyway, ads,

0:11:59.160 --> 0:12:04.079
<v Speaker 1>and we're back. Look, I truly don't know what happens next,

0:12:04.160 --> 0:12:05.920
<v Speaker 1>but I'm going to walk you through what I'm thinking.

0:12:06.640 --> 0:12:08.480
<v Speaker 2>If we're truly at the diminishing

0:12:08.040 --> 0:12:10.720
<v Speaker 1>Return stage of transformer based models, it's going to be

0:12:10.760 --> 0:12:14.800
<v Speaker 1>extremely difficult to justify buying further iterations of Nvidia GPUs

0:12:14.840 --> 0:12:19.199
<v Speaker 1>past Blackwell, and also I have doubts they're going to

0:12:19.240 --> 0:12:21.719
<v Speaker 1>be able to make a new one every year. The

0:12:21.920 --> 0:12:24.679
<v Speaker 1>entire generative AI movement lives and dies by the idea

0:12:24.720 --> 0:12:27.079
<v Speaker 1>that more compute power and more training data makes these

0:12:27.120 --> 0:12:29.720
<v Speaker 1>things better. And if that's no longer the case, there's

0:12:29.960 --> 0:12:32.120
<v Speaker 1>not really a reason to keep buying bigger and better.

0:12:32.640 --> 0:12:36.760
<v Speaker 1>What's the point even now? What exactly happens when Microsoft

0:12:36.880 --> 0:12:39.600
<v Speaker 1>or Google has racks worth of Blackwell GPUs? What do

0:12:39.600 --> 0:12:43.640
<v Speaker 1>they do when models aren't getting better faster? I guess, what does faster even mean?

0:12:43.760 --> 0:12:46.480
<v Speaker 1>What does better even mean? Better has yet to mean

0:12:46.520 --> 0:12:49.800
<v Speaker 1>a new product. Better does not seem to be

0:12:50.200 --> 0:12:54.680
<v Speaker 1>measurable by humans, just by bullshit benchmarks. This also makes

0:12:54.720 --> 0:12:58.120
<v Speaker 1>the lives of OpenAI and Anthropic a little more difficult.

0:12:58.440 --> 0:13:00.760
<v Speaker 1>Sam Altman has grown rich and powerful by lying about

0:13:00.800 --> 0:13:04.480
<v Speaker 1>how GPT will somehow lead to AGI, some conscious computer.

0:13:04.920 --> 0:13:09.080
<v Speaker 1>But at this point, what exactly is OpenAI meant

0:13:09.080 --> 0:13:11.720
<v Speaker 1>to do? What are they doing? The only way

0:13:12.160 --> 0:13:14.480
<v Speaker 1>this company has ever been able to develop new models

0:13:14.559 --> 0:13:17.079
<v Speaker 1>is by throwing masses of compute and training data at them.

0:13:17.400 --> 0:13:19.839
<v Speaker 1>And there are only other choices to start stapling its

0:13:20.040 --> 0:13:23.120
<v Speaker 1>their reasoning model onto the side of their large language model,

0:13:23.240 --> 0:13:29.200
<v Speaker 1>at which point something happens something everyone It's happening something

0:13:29.400 --> 0:13:31.880
<v Speaker 1>that is so good that literally nobody working for open

0:13:31.920 --> 0:13:33.719
<v Speaker 1>aire in the media appears to be able to tell

0:13:33.720 --> 0:13:34.200
<v Speaker 1>you what it is.

0:13:35.160 --> 0:13:36.679
<v Speaker 2>Yeah.

0:13:36.760 --> 0:13:39.680
<v Speaker 1>Putting that aside, open ai is also a terrible business

0:13:40.160 --> 0:13:42.520
<v Speaker 1>that has to burn five billion dollars to make three

0:13:42.520 --> 0:13:45.040
<v Speaker 1>point seven billion dollars, and there's no proof that they're

0:13:45.080 --> 0:13:47.840
<v Speaker 1>able to bring down their costs. The constant thing I

0:13:47.840 --> 0:13:50.720
<v Speaker 1>hear from VC's and AI fantasist is that the chips

0:13:50.720 --> 0:13:53.600
<v Speaker 1>will bring down the cost of inference. Yet I don't

0:13:53.640 --> 0:13:57.400
<v Speaker 1>see that happening. I've yet to see that happening. There's

0:13:57.440 --> 0:14:00.040
<v Speaker 1>cerebrus dinner plate sized chips, but you know what, I

0:14:00.160 --> 0:14:02.520
<v Speaker 1>just don't think that shit's going to scale. And at

0:14:02.559 --> 0:14:05.400
<v Speaker 1>that point you have to wonder do they have like

0:14:05.640 --> 0:14:08.880
<v Speaker 1>where is this going? I know you're listening to an

0:14:08.920 --> 0:14:12.160
<v Speaker 1>episode where I tell you, but still even reading this

0:14:12.240 --> 0:14:14.800
<v Speaker 1>stuff all day, you just look at it too hard

0:14:15.040 --> 0:14:18.920
<v Speaker 1>and you start feeling a little crazy. I've read pretty

0:14:18.960 --> 0:14:21.680
<v Speaker 1>much every article on open ai at this point. I've

0:14:21.680 --> 0:14:24.880
<v Speaker 1>read Microsoft's earnings multiple quarters straight. Same with the rest

0:14:24.880 --> 0:14:27.000
<v Speaker 1>of them. I've read all their blog posts, I've read

0:14:27.000 --> 0:14:30.359
<v Speaker 1>all their shit. They don't have any idea. I'm confident

0:14:30.440 --> 0:14:33.480
<v Speaker 1>that they don't have any idea. It's terrifying and this

0:14:33.600 --> 0:14:36.880
<v Speaker 1>is just this dismal situation where the only other there's

0:14:36.920 --> 0:14:40.000
<v Speaker 1>really only a few options. If I'm honest, you stop now.

0:14:40.720 --> 0:14:45.160
<v Speaker 1>That is the biggest one. You just stop. Someone has

0:14:45.200 --> 0:14:48.200
<v Speaker 1>to admit, Microsoft, Google, They'll never do this. They need

0:14:48.240 --> 0:14:50.640
<v Speaker 1>to go this is not going to scale. We need

0:14:50.680 --> 0:14:52.560
<v Speaker 1>to pair this back or we need to stop it.

0:14:53.240 --> 0:14:55.800
<v Speaker 1>And it will hurt their stock, it will hurt their media.

0:14:56.040 --> 0:14:58.480
<v Speaker 1>But it's the right thing to do for the environment alone,

0:14:58.760 --> 0:15:03.360
<v Speaker 1>but as a sustainable busines. But assuming they don't do that,

0:15:03.880 --> 0:15:08.040
<v Speaker 1>we have another problem, which is the only other option

0:15:08.160 --> 0:15:10.880
<v Speaker 1>they can have is to keep flooring the gas. It

0:15:10.920 --> 0:15:13.800
<v Speaker 1>costs one hundred million dollars to train GPT four to zero,

0:15:14.000 --> 0:15:17.000
<v Speaker 1>and anthropic CEO Dario ami Day estimated a few months

0:15:17.040 --> 0:15:20.000
<v Speaker 1>ago that training future models would cost one billion, if

0:15:20.040 --> 0:15:23.080
<v Speaker 1>not ten billion dollars, with one researcher claiming that training

0:15:23.160 --> 0:15:26.760
<v Speaker 1>open AI's next model, GPT five will cost around a

0:15:26.760 --> 0:15:29.320
<v Speaker 1>billion dollars outside of a miracle.

0:15:29.320 --> 0:15:32.200
<v Speaker 2>We're about to enter an era of desperation in generative AI.

0:15:32.640 --> 0:15:34.760
<v Speaker 1>We're two years in and we have no killer apps,

0:15:34.760 --> 0:15:38.080
<v Speaker 1>no industry defining products other than chat GPT, a product

0:15:38.120 --> 0:15:40.520
<v Speaker 1>that burns billions of dollars, and nobody

0:15:40.200 --> 0:15:41.920
<v Speaker 2>Can really describe. Every episode,

0:15:41.960 --> 0:15:44.600
<v Speaker 1>I'm like, email me, tell me what ChatGPT is,

0:15:44.800 --> 0:15:48.880
<v Speaker 1>and I'll get some smartass who sends me three paragraphs. No, no, no, no, no,

0:15:49.200 --> 0:15:51.680
<v Speaker 1>that's not a description, that's an essay. Mate, you actually

0:15:51.680 --> 0:15:53.320
<v Speaker 1>need to tell me what the hell this is? And

0:15:53.400 --> 0:15:55.720
<v Speaker 1>if you can't, you've proven my point in my extremely

0:15:55.760 --> 0:16:00.280
<v Speaker 1>thin and rigged game. Anyway, neither Microsoft, nor Meta, nor Google,

0:16:00.320 --> 0:16:02.000
<v Speaker 1>nor Amazon seems to be able to come up with

0:16:02.080 --> 0:16:05.280
<v Speaker 1>any profitable use cases, let alone one their users actually like,

0:16:05.560 --> 0:16:07.560
<v Speaker 1>Nor have any of the people that have raised billions

0:16:07.560 --> 0:16:09.720
<v Speaker 1>of dollars in venture capital for anything with AI taped

0:16:09.800 --> 0:16:13.240
<v Speaker 1>to the side. And investor interest in AI is already cooling,

0:16:13.280 --> 0:16:17.280
<v Speaker 1>according to The Information. It's kind of unclear how far

0:16:17.400 --> 0:16:20.040
<v Speaker 1>this farce goes from here, if only because it isn't

0:16:20.080 --> 0:16:22.480
<v Speaker 1>obvious what it is that anybody gets by investing in

0:16:22.520 --> 0:16:25.560
<v Speaker 1>future rounds of OpenAI, Anthropic, or any of these other

0:16:25.600 --> 0:16:29.000
<v Speaker 1>generative AI companies. At some point, they must make money,

0:16:29.080 --> 0:16:31.440
<v Speaker 1>and the entire dream has been built around the idea

0:16:31.480 --> 0:16:33.200
<v Speaker 1>that all of these GPUs and all of this money

0:16:33.400 --> 0:16:38.960
<v Speaker 1>would eventually spit out something, something revolutionary, something world changing.

0:16:40.200 --> 0:16:43.280
<v Speaker 1>What we have is this clunky, ugly, messy, environmentally

0:16:43.280 --> 0:16:45.080
<v Speaker 1>destructive and mediocre.

0:16:44.560 --> 0:16:45.280
<v Speaker 2>Piece of shit.

0:16:46.240 --> 0:16:48.760
<v Speaker 1>Generative AI was a reckless pursuit, one that shows a

0:16:48.760 --> 0:16:50.960
<v Speaker 1>total lack of creativity and sense in the minds of

0:16:51.000 --> 0:16:53.200
<v Speaker 1>big tech and venture capital, one where there was never

0:16:53.320 --> 0:16:55.920
<v Speaker 1>anything really impressive other than the amount of money it

0:16:55.920 --> 0:16:57.800
<v Speaker 1>could burn, and the amount of times Sam Altman could

0:16:57.800 --> 0:17:00.800
<v Speaker 1>say something stupid and get quoted for it. I'll be

0:17:00.800 --> 0:17:03.920
<v Speaker 1>honest with you, I don't really know what happens here.

0:17:04.400 --> 0:17:06.840
<v Speaker 1>The future was always one that demanded big tech spend

0:17:06.840 --> 0:17:08.600
<v Speaker 1>more to make even bigger models that would at some

0:17:08.600 --> 0:17:11.840
<v Speaker 1>point become useful, and it isn't happening. In pursuit of

0:17:11.880 --> 0:17:14.359
<v Speaker 1>doing so, big tech invested hundreds of billions of dollars

0:17:14.400 --> 0:17:17.920
<v Speaker 1>into infrastructure specifically to follow one goal and put AI

0:17:18.040 --> 0:17:20.280
<v Speaker 1>front and center of their businesses, claiming it was the

0:17:20.280 --> 0:17:24.280
<v Speaker 1>future without ever checking if it was, if it actually

0:17:24.320 --> 0:17:28.600
<v Speaker 1>did anything useful. And really, as I've said in

0:17:28.600 --> 0:17:30.440
<v Speaker 1>the Rot-Com Bubble a few months ago, they don't

0:17:30.520 --> 0:17:33.679
<v Speaker 1>have anything else. They don't have another growth market, and

0:17:33.720 --> 0:17:35.800
<v Speaker 1>maybe that's why they're shoving all the cash into this,

0:17:36.119 --> 0:17:37.000
<v Speaker 1>because they don't.

0:17:36.760 --> 0:17:38.159
<v Speaker 2>Have it anywhere. Else to put it.

0:17:38.359 --> 0:17:40.760
<v Speaker 1>They don't want to admit they've got nothing else. They

0:17:40.760 --> 0:17:43.080
<v Speaker 1>don't want to admit that their businesses are crumbling under

0:17:43.480 --> 0:17:47.520
<v Speaker 1>scrutiny from antitrust. They don't want to admit their services

0:17:47.560 --> 0:17:50.680
<v Speaker 1>are kind of deteriorating, especially in the case of Meta.

0:17:50.800 --> 0:17:53.760
<v Speaker 1>They don't have anything else. The revenue isn't coming, the

0:17:53.800 --> 0:17:58.040
<v Speaker 1>products aren't coming. Orion, OpenAI's next model, will underwhelm,

0:17:58.119 --> 0:18:00.119
<v Speaker 1>as will its competitors' models, and at some point, I think

0:18:00.200 --> 0:18:02.560
<v Speaker 1>somebody is going to blink at one of the hyperscalers

0:18:02.560 --> 0:18:06.399
<v Speaker 1>and the AI era will end. Almost every single generative AI

0:18:06.520 --> 0:18:09.040
<v Speaker 1>company that you've heard of is deeply unprofitable, and there

0:18:09.040 --> 0:18:11.639
<v Speaker 1>are very few innovations coming to save them from the

0:18:11.680 --> 0:18:14.960
<v Speaker 1>atrophy of the foundation model system, which, by the way,

0:18:15.400 --> 0:18:17.760
<v Speaker 1>means the main large language models that they're training in

0:18:17.800 --> 0:18:21.960
<v Speaker 1>all of these cases. Now, I've tried to keep my

0:18:22.000 --> 0:18:24.400
<v Speaker 1>emotions in check, but you might be able

0:18:24.400 --> 0:18:26.760
<v Speaker 1>to tell that this pisses me off a little. I

0:18:26.840 --> 0:18:30.040
<v Speaker 1>just feel sad and exhausted about it all. I feel

0:18:30.160 --> 0:18:32.000
<v Speaker 1>drained as I look at how many times I've tried

0:18:32.000 --> 0:18:34.399
<v Speaker 1>to warn people. I'm pissed off at the many members

0:18:34.400 --> 0:18:36.000
<v Speaker 1>of the media that failed to push back against the

0:18:36.040 --> 0:18:38.760
<v Speaker 1>over promises and outright lies of people like Sam Altman

0:18:38.840 --> 0:18:41.600
<v Speaker 1>and Dario Amodei. Wario Amodei is what I'm

0:18:41.640 --> 0:18:44.159
<v Speaker 1>calling him now. It's not particularly funny or accurate, but

0:18:44.160 --> 0:18:46.320
<v Speaker 1>it's funny to say for me. And I'm just full

0:18:46.359 --> 0:18:48.679
<v Speaker 1>of dread as I consider the economic ramifications of this

0:18:48.720 --> 0:18:51.280
<v Speaker 1>industry collapsing, as well as the damage it's already done

0:18:51.400 --> 0:18:54.720
<v Speaker 1>to our fucking environment. Once the AI bubble pops, there

0:18:54.760 --> 0:18:57.720
<v Speaker 1>are no hyper growth markets left, which will in turn

0:18:57.800 --> 0:18:59.679
<v Speaker 1>lead to a blood bath in big tech stocks as

0:18:59.680 --> 0:19:02.000
<v Speaker 1>they reveal that they're out of big ideas to convince

0:19:02.040 --> 0:19:05.000
<v Speaker 1>the street that they're going to grow forever. There are

0:19:05.040 --> 0:19:07.360
<v Speaker 1>some that will boast about being right here, and yes,

0:19:07.400 --> 0:19:10.840
<v Speaker 1>there is some satisfaction in being so, of course, of

0:19:10.880 --> 0:19:12.960
<v Speaker 1>course you take the moment, you do the victory lap.

0:19:13.040 --> 0:19:16.159
<v Speaker 1>Whatever I said a thing, and I was right. I

0:19:16.160 --> 0:19:18.560
<v Speaker 1>don't know if I'm really doing the Joker kick

0:19:18.640 --> 0:19:20.720
<v Speaker 1>down the stairs here. I don't know if I'm really

0:19:21.280 --> 0:19:24.560
<v Speaker 1>happy about being right here. I think generative AI is

0:19:24.600 --> 0:19:26.439
<v Speaker 1>a huge waste of money. I think the damage it's

0:19:26.480 --> 0:19:29.800
<v Speaker 1>doing to the environment is disgusting. I think the stealing

0:19:30.000 --> 0:19:33.280
<v Speaker 1>required to make the training datasets is disgusting. I

0:19:33.280 --> 0:19:35.880
<v Speaker 1>think all of that ending is good, and I think

0:19:35.920 --> 0:19:37.920
<v Speaker 1>it needs to, just to be clear where I sit.

0:19:38.920 --> 0:19:43.360
<v Speaker 1>But you know what, the other problem is that when

0:19:43.400 --> 0:19:45.400
<v Speaker 1>this all falls apart, it's going to be rough. It's

0:19:45.440 --> 0:19:47.240
<v Speaker 1>going to be rough for people in the tech industry,

0:19:47.440 --> 0:20:03.440
<v Speaker 1>and it's going to be rough for the economy. So

0:20:03.440 --> 0:20:05.240
<v Speaker 1>I'm gonna start wrapping up here. I'm gonna read you

0:20:05.280 --> 0:20:07.320
<v Speaker 1>a quote from Bubble Trouble, a piece I wrote in April.

0:20:08.600 --> 0:20:13.040
<v Speaker 1>How do you solve all of these incredibly difficult problems?

0:20:13.440 --> 0:20:15.640
<v Speaker 1>What does open ai or Anthropic do when they run

0:20:15.640 --> 0:20:18.240
<v Speaker 1>out of data and the synthetic data doesn't fill the gap,

0:20:18.320 --> 0:20:21.239
<v Speaker 1>or worse, massively degrades the quality of their outputs. What

0:20:21.280 --> 0:20:24.120
<v Speaker 1>does Sam Altman do if GPT five, like GPT four,

0:20:24.400 --> 0:20:27.480
<v Speaker 1>doesn't significantly improve its performance and he can't find enough

0:20:27.480 --> 0:20:30.040
<v Speaker 1>compute to make the next step? What do OpenAI

0:20:30.119 --> 0:20:32.480
<v Speaker 1>and Anthropic do when they realize they'll likely never turn

0:20:32.480 --> 0:20:35.560
<v Speaker 1>a profit? What do Microsoft, Amazon, or Google do

0:20:35.600 --> 0:20:37.320
<v Speaker 1>if demand never really takes off and they're left with

0:20:37.359 --> 0:20:40.760
<v Speaker 1>billions of dollars of underutilized data centers. What does Nvidia

0:20:40.840 --> 0:20:42.600
<v Speaker 1>do if the demand for its chips drops off a

0:20:42.600 --> 0:20:45.680
<v Speaker 1>cliff as a result? I don't know why more people

0:20:45.720 --> 0:20:48.720
<v Speaker 1>aren't screaming from the rooftops about how unsustainable the AI

0:20:48.760 --> 0:20:51.600
<v Speaker 1>boom is and how impossible some of the challenges are

0:20:51.600 --> 0:20:54.960
<v Speaker 1>that it faces. There is no way to create enough

0:20:55.000 --> 0:20:57.840
<v Speaker 1>training data for these models, and little that we've seen

0:20:57.920 --> 0:21:00.760
<v Speaker 1>so far suggests that generative AI will make anybody but

0:21:00.880 --> 0:21:04.159
<v Speaker 1>Nvidia money. We're reaching the point where physics, things like

0:21:04.160 --> 0:21:06.439
<v Speaker 1>heat and electricity are getting in the way of progressing

0:21:06.560 --> 0:21:09.800
<v Speaker 1>much further, and it's hard to stomach investing more considering

0:21:09.840 --> 0:21:12.040
<v Speaker 1>where we're at right now. Once you cut through the noise,

0:21:13.080 --> 0:21:16.959
<v Speaker 1>it just seems kind of fairly goddamn mediocre. There's no

0:21:17.000 --> 0:21:20.840
<v Speaker 1>iPhone moment coming, I'm afraid. So, all right, not gonna

0:21:20.880 --> 0:21:22.280
<v Speaker 1>be too smug, but reading that

0:21:22.320 --> 0:21:24.320
<v Speaker 2>back? Pretty accurate. Pretty good for April.

0:21:25.520 --> 0:21:28.800
<v Speaker 1>I was right then, I'm right now. Generative AI isn't

0:21:28.800 --> 0:21:31.640
<v Speaker 1>a revolution. It's an evolution of a tech industry overtaken

0:21:31.640 --> 0:21:34.320
<v Speaker 1>by growth-hungry management consultant freaks that neither know

0:21:34.400 --> 0:21:37.080
<v Speaker 1>the problems that real people face nor how to fix them.

0:21:37.560 --> 0:21:38.280
<v Speaker 2>It's a waste.

0:21:38.440 --> 0:21:41.359
<v Speaker 1>It's a sickening waste, a monument to the corrupting force

0:21:41.400 --> 0:21:43.520
<v Speaker 1>of growth and a sign that people in power and

0:21:43.600 --> 0:21:45.960
<v Speaker 1>tech no longer work for you, the customer, but for

0:21:46.040 --> 0:21:49.679
<v Speaker 1>venture capitalists and for the markets. I also want to

0:21:49.760 --> 0:21:52.040
<v Speaker 1>be clear that none of these companies seem to have

0:21:52.080 --> 0:21:55.000
<v Speaker 1>ever had a plan. They believed that if they threw

0:21:55.119 --> 0:21:58.000
<v Speaker 1>enough GPUs together, they would turn generative AI, a

0:21:58.000 --> 0:22:02.199
<v Speaker 1>probabilistic model for generating stuff, either into a sentient computer

0:22:02.280 --> 0:22:04.600
<v Speaker 1>in open AI's case, or into some sort of useful

0:22:04.600 --> 0:22:08.520
<v Speaker 1>product in Microsoft's case. Neither of them are right, And

0:22:08.600 --> 0:22:11.000
<v Speaker 1>I get it. I get that some of you are like, well,

0:22:11.280 --> 0:22:13.200
<v Speaker 1>this is all a big plan, so they can proliferate

0:22:13.280 --> 0:22:15.480
<v Speaker 1>data centers. This is all a big plan so they can

0:22:15.480 --> 0:22:17.919
<v Speaker 1>steal everything and ship it back to us, so they

0:22:17.960 --> 0:22:22.080
<v Speaker 1>can do this to us. It's much easier and more

0:22:22.080 --> 0:22:24.760
<v Speaker 1>comfortable to look at the world as a series of conspiracies,

0:22:24.800 --> 0:22:28.520
<v Speaker 1>these grand strategies, and that all of these people are powerful,

0:22:28.800 --> 0:22:31.240
<v Speaker 1>which they are, but powerful in a way that is

0:22:31.320 --> 0:22:34.040
<v Speaker 1>kind of the dread hand controlling everything.

0:22:35.080 --> 0:22:35.960
<v Speaker 2>But honestly, it's.

0:22:35.880 --> 0:22:39.240
<v Speaker 1>Far scarier to see reality for what it is. Extremely

0:22:39.359 --> 0:22:41.400
<v Speaker 1>rich and powerful people that are willing to bet insanely

0:22:41.480 --> 0:22:43.680
<v Speaker 1>large amounts of money on what amounts to a few PDFs

0:22:43.680 --> 0:22:48.040
<v Speaker 1>and their fucking gut. These people have no plan. This

0:22:48.200 --> 0:22:50.960
<v Speaker 1>is not big tech's big plan, or their excuse to

0:22:50.960 --> 0:22:53.399
<v Speaker 1>build more data centers. It's the death throes of twenty

0:22:53.480 --> 0:22:56.120
<v Speaker 1>years of growth-at-all-costs thinking, because throwing a bunch

0:22:56.160 --> 0:22:58.480
<v Speaker 1>of money at more servers and more engineers always seemed

0:22:58.520 --> 0:22:59.120
<v Speaker 1>to create more

0:22:59.040 --> 0:22:59.840
<v Speaker 2>growth in the past.

0:23:00.480 --> 0:23:02.720
<v Speaker 1>In practice, this means that the people in charge and

0:23:02.760 --> 0:23:05.040
<v Speaker 1>the strategies they employ are born not of an interest

0:23:05.040 --> 0:23:07.280
<v Speaker 1>in improving the lives of you their customer, but in

0:23:07.280 --> 0:23:10.600
<v Speaker 1>increasing revenue growth, which means products they create aren't really

0:23:10.640 --> 0:23:14.000
<v Speaker 1>about solving any problem other than what will make somebody

0:23:14.040 --> 0:23:17.360
<v Speaker 1>give me more money, which doesn't necessarily mean provide them

0:23:17.359 --> 0:23:21.439
<v Speaker 1>with a service. Generative AI is the perfect monster of

0:23:21.440 --> 0:23:24.240
<v Speaker 1>the Rot Economy. It's a technology that lacks any real purpose,

0:23:24.280 --> 0:23:26.879
<v Speaker 1>sold as if it could do literally anything, one without

0:23:26.880 --> 0:23:29.359
<v Speaker 1>a real business model or a killer app, proliferated because

0:23:29.359 --> 0:23:34.000
<v Speaker 1>big tech no longer innovates. They clone and they monopolize. Yes,

0:23:34.240 --> 0:23:37.119
<v Speaker 1>this much money can be this stupid, and yes they

0:23:37.160 --> 0:23:39.760
<v Speaker 1>will burn billions more in pursuit of a non specific

0:23:39.840 --> 0:23:42.280
<v Speaker 1>dream that involves charging you more money and trapping you

0:23:42.320 --> 0:23:46.080
<v Speaker 1>in their ecosystem. But they're out of ideas. This was

0:23:46.119 --> 0:23:48.439
<v Speaker 1>their big thing. If they had anything else, they wouldn't

0:23:48.440 --> 0:23:48.840
<v Speaker 1>have done this.

0:23:49.400 --> 0:23:51.680
<v Speaker 2>They don't. They're kind of screwed.

0:23:52.200 --> 0:23:54.480
<v Speaker 1>I don't know how long it will take for you

0:23:54.600 --> 0:23:57.400
<v Speaker 1>to see how much they're screwed. Could be a year,

0:23:57.680 --> 0:24:01.000
<v Speaker 1>could be two quarters. I honestly don't know. But it

0:24:01.080 --> 0:24:04.800
<v Speaker 1>cannot go on like this. It cannot because at some point,

0:24:04.840 --> 0:24:08.320
<v Speaker 1>burning billions of dollars every month is not going to work.

0:24:08.480 --> 0:24:11.080
<v Speaker 1>At some point, the markets will react, or at some point,

0:24:11.520 --> 0:24:14.480
<v Speaker 1>I don't know, CFO of Microsoft Amy Hood might say, hey,

0:24:14.520 --> 0:24:17.200
<v Speaker 1>we're burning billions of dollars to lose billions of dollars.

0:24:17.520 --> 0:24:19.960
<v Speaker 1>We don't have the customers, these products aren't useful enough,

0:24:20.680 --> 0:24:22.480
<v Speaker 1>or maybe she keeps driving to hell.

0:24:23.200 --> 0:24:23.760
<v Speaker 2>I don't know.

0:24:24.400 --> 0:24:26.399
<v Speaker 1>But the longer it takes, the worse it's going to

0:24:26.480 --> 0:24:29.760
<v Speaker 1>be for the industry and for the markets. I'm not

0:24:29.760 --> 0:24:31.960
<v Speaker 1>trying to be a doomsayer, just like I wasn't

0:24:32.000 --> 0:24:34.399
<v Speaker 1>trying to be one in March. I believe all of

0:24:34.400 --> 0:24:37.560
<v Speaker 1>this is going nowhere, and that at some point, Google, Microsoft, Meta,

0:24:38.240 --> 0:24:39.520
<v Speaker 1>one of them is going to blink and they're going

0:24:39.520 --> 0:24:42.240
<v Speaker 1>to pull back on the capex. And before then, you're

0:24:42.240 --> 0:24:44.440
<v Speaker 1>going to see a lot of desperate stories about how

0:24:44.480 --> 0:24:47.240
<v Speaker 1>AI gains can be found outside of training new models

0:24:47.240 --> 0:24:49.960
<v Speaker 1>to try and keep the party going despite reality, flicking

0:24:50.000 --> 0:24:51.720
<v Speaker 1>the lights on and off and threatening to call the

0:24:51.760 --> 0:24:54.600
<v Speaker 1>police on them. Really, though, you're going to see so

0:24:54.680 --> 0:24:56.960
<v Speaker 1>much of that. You're going to see people in the

0:24:57.040 --> 0:25:01.479
<v Speaker 1>media who should know better still pumping, still pumping that bag.

0:25:01.840 --> 0:25:03.879
<v Speaker 1>Because it gets back to the conspiracy theory I was

0:25:03.920 --> 0:25:07.199
<v Speaker 1>talking about before. It gets back there because it's so

0:25:07.320 --> 0:25:10.800
<v Speaker 1>much easier to think, Ah, they'll work it out, they're smart.

0:25:11.200 --> 0:25:14.399
<v Speaker 1>I trust them. I trust these people. They're going to

0:25:14.480 --> 0:25:16.920
<v Speaker 1>work it out and make generative AI make so much money.

0:25:16.960 --> 0:25:19.040
<v Speaker 1>It's going to be so profitable. Oh, it's going to

0:25:19.080 --> 0:25:21.880
<v Speaker 1>be so good. Because when you don't accept that premise,

0:25:21.960 --> 0:25:24.960
<v Speaker 1>everything you write about it feels like writing about a

0:25:25.040 --> 0:25:29.440
<v Speaker 1>dying person. Oh, Anthropic claims it can write in your

0:25:29.800 --> 0:25:32.480
<v Speaker 1>writing style now. Great, where's Anthropic going to be in

0:25:32.520 --> 0:25:35.720
<v Speaker 1>two years? The toilet, probably. Going to need a pretty

0:25:35.720 --> 0:25:39.040
<v Speaker 1>big toilet for all those GPUs. Oh, OpenAI, they

0:25:39.080 --> 0:25:41.560
<v Speaker 1>have this idea. Yeah, their ideas have fucking sucked so

0:25:41.680 --> 0:25:44.919
<v Speaker 1>far. And the irony is, large language models as an

0:25:44.960 --> 0:25:50.240
<v Speaker 1>idea aren't a bad one. The concept is interesting: a word calculator, sexy autocorrect,

0:25:50.280 --> 0:25:52.360
<v Speaker 1>whatever you call it. There are meaningful things you could

0:25:52.359 --> 0:25:55.560
<v Speaker 1>build with that, but not at this cost, not like this,

0:25:56.280 --> 0:25:59.720
<v Speaker 1>not this much money. Not stealing from artists, not stealing

0:26:00.359 --> 0:26:03.600
<v Speaker 1>from writers. I don't know if mine have been used,

0:26:03.640 --> 0:26:06.600
<v Speaker 1>but if they have, I got the legal resources to fight,

0:26:06.600 --> 0:26:08.960
<v Speaker 1>and I fucking will. I'm sick of this shit. I'm

0:26:09.000 --> 0:26:11.240
<v Speaker 1>sick of what I see being done to contractors.

0:26:11.320 --> 0:26:13.840
<v Speaker 1>I'm sick of what I see happening to art directors

0:26:13.840 --> 0:26:17.640
<v Speaker 1>who lose their jobs to some shithead using DALL-E 3.

0:26:18.680 --> 0:26:23.640
<v Speaker 1>It's grotesque. It's enough to turn your stomach. And as

0:26:23.640 --> 0:26:25.960
<v Speaker 1>we come towards the end of the year, you may

0:26:25.960 --> 0:26:27.280
<v Speaker 1>have, at the beginning of the show, been like,

0:26:27.320 --> 0:26:29.520
<v Speaker 1>why is Ed so, why is he so annoyed? I

0:26:29.520 --> 0:26:32.800
<v Speaker 1>think I've done a good job explaining it, but I

0:26:32.840 --> 0:26:36.600
<v Speaker 1>want to end on a good note and look, I

0:26:37.000 --> 0:26:39.480
<v Speaker 1>fear for the future for many reasons, but I always

0:26:39.480 --> 0:26:41.480
<v Speaker 1>have hope because I believe that there are still good

0:26:41.480 --> 0:26:44.080
<v Speaker 1>people in the tech industry, and I love technology. I'm

0:26:44.080 --> 0:26:46.239
<v Speaker 1>going to say every episode that I can, because I

0:26:46.280 --> 0:26:48.359
<v Speaker 1>really do. I love my computer. I love the people

0:26:48.359 --> 0:26:49.919
<v Speaker 1>I find through it. I love the friends and the

0:26:49.920 --> 0:26:52.439
<v Speaker 1>family and loved ones I've found through it, and I know

0:26:52.480 --> 0:26:53.919
<v Speaker 1>a lot of you do too. And I want to

0:26:54.000 --> 0:26:56.080
<v Speaker 1>make that clear because none of this is a problem

0:26:56.119 --> 0:26:59.440
<v Speaker 1>with tech as a whole. The people who have taken

0:26:59.480 --> 0:27:03.600
<v Speaker 1>over tech are the problem, the management consultant elite, the

0:27:03.680 --> 0:27:06.159
<v Speaker 1>business people who just want to see growth at all costs.

0:27:06.880 --> 0:27:07.760
<v Speaker 2>But I do have hope.

0:27:07.760 --> 0:27:09.520
<v Speaker 1>I do have hope because of Blue Sky, which is

0:27:09.520 --> 0:27:11.560
<v Speaker 1>a social network that I've posted on for over a

0:27:11.640 --> 0:27:13.720
<v Speaker 1>year that's finally taking root.

0:27:13.760 --> 0:27:14.640
<v Speaker 2>And this isn't an ad.

0:27:14.920 --> 0:27:17.920
<v Speaker 1>This is just somewhere I spend eleven hours a day posting,

0:27:18.359 --> 0:27:19.439
<v Speaker 1>but it feels different.

0:27:19.480 --> 0:27:20.360
<v Speaker 2>It's growing rapidly.

0:27:20.359 --> 0:27:23.400
<v Speaker 1>It's competing with Threads and with Twitter, and they're doing

0:27:23.440 --> 0:27:25.920
<v Speaker 1>so on an honest product and an open protocol. I'm

0:27:25.920 --> 0:27:29.159
<v Speaker 1>not saying it's the solution for everything. I'm just saying

0:27:29.920 --> 0:27:34.800
<v Speaker 1>solutions are possible. People are building them, people are trying.

0:27:35.520 --> 0:27:38.600
<v Speaker 1>It doesn't have to be like this. The rot economy

0:27:38.680 --> 0:27:42.520
<v Speaker 1>does not have to control the destiny of tech, and

0:27:42.600 --> 0:27:44.240
<v Speaker 1>I think better things can happen.

0:27:44.440 --> 0:27:46.800
<v Speaker 2>It's going to take a while. Things are going to

0:27:46.880 --> 0:27:47.320
<v Speaker 2>blow up.

0:27:47.680 --> 0:27:51.920
<v Speaker 1>I'm scared for this industry for what happens after Generative

0:27:51.960 --> 0:27:55.040
<v Speaker 1>AI collapses, and not just for the initial problem, but

0:27:55.080 --> 0:27:58.399
<v Speaker 1>when they don't have anything else the next quarter. But

0:27:58.440 --> 0:28:01.040
<v Speaker 1>there are other ideas out there. There are other ideas

0:28:01.040 --> 0:28:03.520
<v Speaker 1>of the future that aren't just born out of this shitty,

0:28:03.600 --> 0:28:07.439
<v Speaker 1>scuzzy billionaire mindset from shitheels like Sundar Pichai and

0:28:07.480 --> 0:28:10.560
<v Speaker 1>Sam Altman and Prabhakar Raghavan. And they can and they

0:28:10.600 --> 0:28:13.679
<v Speaker 1>will grow out of the ruins that these people create.

0:28:14.600 --> 0:28:18.200
<v Speaker 1>I want you to have hope. Hope isn't just about

0:28:18.680 --> 0:28:22.400
<v Speaker 1>blindly being optimistic. You can feel hopeful while still being

0:28:22.440 --> 0:28:25.680
<v Speaker 1>pissed off at everything. Hope is about getting up every

0:28:25.760 --> 0:28:28.280
<v Speaker 1>day and being willing to say what you think better

0:28:28.320 --> 0:28:32.760
<v Speaker 1>looks like, what you think a better world could be.

0:28:33.240 --> 0:28:35.639
<v Speaker 1>You don't have to do it at scale. You don't

0:28:35.680 --> 0:28:37.880
<v Speaker 1>have to be an influencer. You don't have to write,

0:28:37.920 --> 0:28:39.680
<v Speaker 1>you don't have to have a tech podcast. You just

0:28:39.760 --> 0:28:42.200
<v Speaker 1>have to talk to your friends. You have to talk

0:28:42.240 --> 0:28:44.040
<v Speaker 1>to the people you know. You have to fight within

0:28:44.080 --> 0:28:48.479
<v Speaker 1>the institutions you are in to find the joy and

0:28:48.520 --> 0:28:51.560
<v Speaker 1>to push out things like generative AI. We can have

0:28:51.720 --> 0:28:54.959
<v Speaker 1>better and you are the beginning of fixing these problems.

0:28:55.560 --> 0:28:58.640
<v Speaker 1>And I actually believe that with enough pushback, these big

0:28:58.680 --> 0:29:01.360
<v Speaker 1>tech companies can change too. I'm not relying on them.

0:29:01.520 --> 0:29:04.440
<v Speaker 1>I hate that I have to use their products, but

0:29:04.560 --> 0:29:08.400
<v Speaker 1>you don't. Perhaps you can find alternatives. I always say

0:29:08.400 --> 0:29:10.240
<v Speaker 1>it's a great day to join blue Sky or use

0:29:10.280 --> 0:29:13.920
<v Speaker 1>Signal. Proton Mail is apparently great. I'm very much stuck in

0:29:13.960 --> 0:29:17.920
<v Speaker 1>the Gmail ecosystem, which sucks. A few decades in there.

0:29:17.920 --> 0:29:19.280
<v Speaker 2>It's tough. Is it decades now?

0:29:19.400 --> 0:29:22.440
<v Speaker 1>Either way, it's hard to escape the Rot Economy whole,

0:29:22.960 --> 0:29:25.120
<v Speaker 1>but there are bits of yourself you can remove from it.

0:29:25.960 --> 0:29:30.360
<v Speaker 2>Please do not give up hope. Please stay angry.

0:29:30.480 --> 0:29:33.320
<v Speaker 1>If you are angry, not telling you to be angry.

0:29:33.440 --> 0:29:39.360
<v Speaker 1>Your indignation is necessary. Your righteous indignation is necessary to

0:29:39.400 --> 0:29:43.640
<v Speaker 1>tell these people to go fucking pound sand. Your little changes,

0:29:43.720 --> 0:29:46.240
<v Speaker 1>Those discussions you have with colleagues and friends and loved

0:29:46.280 --> 0:29:50.320
<v Speaker 1>ones about these problems. That knowledge is powerful, and I

0:29:50.440 --> 0:29:54.240
<v Speaker 1>encourage you to talk about these things. The tech industry

0:29:54.240 --> 0:29:57.200
<v Speaker 1>has grown big and strong, telling you you're too stupid

0:29:57.200 --> 0:30:00.000
<v Speaker 1>to understand what they're doing. And unless I'm very much

0:30:00.080 --> 0:30:02.760
<v Speaker 1>mistaken, based on the readers and listeners I talk to,

0:30:03.240 --> 0:30:06.720
<v Speaker 1>you all get this. I just explained GPUs and server architecture,

0:30:06.760 --> 0:30:09.280
<v Speaker 1>and shit, I think you mostly got it right. I'm

0:30:09.320 --> 0:30:12.200
<v Speaker 1>assuming you did. You keep listening, and that's because this

0:30:12.240 --> 0:30:15.200
<v Speaker 1>stuff isn't that complex. They want you to believe it

0:30:15.240 --> 0:30:17.480
<v Speaker 1>is so that they can stay powerful. But you're a

0:30:17.480 --> 0:30:19.880
<v Speaker 1>lot goddamn smarter than you give yourself credit for. You're

0:30:19.880 --> 0:30:23.520
<v Speaker 1>a lot goddamn stronger than you think you are, too. Don't let

0:30:23.560 --> 0:30:26.920
<v Speaker 1>these people make you feel oppressed. Don't let them make

0:30:27.000 --> 0:30:27.800
<v Speaker 1>you feel weak.

0:30:28.160 --> 0:30:28.640
<v Speaker 2>You are not.

0:30:29.280 --> 0:30:32.080
<v Speaker 1>You are so much more valuable to them than they'll

0:30:32.120 --> 0:30:34.040
<v Speaker 1>ever be to you.

0:30:34.040 --> 0:30:34.920
<v Speaker 2>You have been conned.

0:30:35.120 --> 0:30:39.280
<v Speaker 1>You are the victim, and you will succeed long term.

0:30:39.800 --> 0:30:41.800
<v Speaker 1>Spread the good ideas. Talk about what a good world

0:30:41.880 --> 0:30:52.360
<v Speaker 1>looks like, and thank you for listening. Thank you for

0:30:52.400 --> 0:30:55.080
<v Speaker 1>listening to Better Offline. The editor and composer of the

0:30:55.080 --> 0:30:58.160
<v Speaker 1>Better Offline theme song is Matt Osowski. You can check out

0:30:58.200 --> 0:31:01.200
<v Speaker 1>more of his music and audio projects at mattosowski dot

0:31:01.240 --> 0:31:03.200
<v Speaker 1>com. M A T T O

0:31:03.400 --> 0:31:06.920
<v Speaker 2>S O W S K I dot com.

0:31:06.960 --> 0:31:09.600
<v Speaker 1>You can email me at EZ at betteroffline dot com,

0:31:09.680 --> 0:31:12.320
<v Speaker 1>or visit Better Offline dot com to find more podcast links.

0:31:12.320 --> 0:31:13.480
<v Speaker 2>And of course my newsletter.

0:31:13.880 --> 0:31:16.400
<v Speaker 1>I also really recommend you go to chat dot wheres

0:31:16.400 --> 0:31:18.880
<v Speaker 1>youred dot at to visit the Discord, and go to

0:31:18.920 --> 0:31:21.360
<v Speaker 1>r slash Better Offline to check out our subreddit.

0:31:22.120 --> 0:31:25.400
<v Speaker 2>Thank you so much for listening. Better Offline is a

0:31:25.400 --> 0:31:26.760
<v Speaker 2>production of cool Zone Media.

0:31:26.920 --> 0:31:29.760
<v Speaker 1>For more from cool Zone Media, visit our website cool

0:31:29.840 --> 0:31:33.200
<v Speaker 1>zonemedia dot com, or check us out on the iHeartRadio app,

0:31:33.240 --> 0:31:35.920
<v Speaker 1>Apple Podcasts, or wherever you get your podcasts.