WEBVTT - Episode 8: The Terminator

0:00:15.436 --> 0:00:15.876
<v Speaker 1>Pushkin.

0:00:18.196 --> 0:00:20.316
<v Speaker 2>I think it's actually important for us to worry about

0:00:20.356 --> 0:00:23.756
<v Speaker 2>a terminator future in order to avoid a terminator future.

0:00:25.596 --> 0:00:31.036
<v Speaker 1>Elon Musk is very worried, genuinely worried about artificial intelligence,

0:00:31.516 --> 0:00:35.556
<v Speaker 1>a future where robots become smarter than humans and decide

0:00:35.756 --> 0:00:36.676
<v Speaker 1>to destroy us.

0:00:37.316 --> 0:00:40.636
<v Speaker 3>They say it got smart, a new order of intelligence

0:00:41.396 --> 0:00:44.716
<v Speaker 3>decided our fate in a microsecond: extermination.

0:00:45.316 --> 0:00:47.756
<v Speaker 1>But Elon Musk is also building robots.

0:00:48.156 --> 0:00:50.796
<v Speaker 4>We've made a massive amount of progress with Optimus in

0:00:50.796 --> 0:00:53.556
<v Speaker 4>a short period of time, from someone pretending to be

0:00:53.596 --> 0:00:56.756
<v Speaker 4>a robot dancing in a suit, to a pretty hodgepodgy

0:00:56.876 --> 0:01:00.116
<v Speaker 4>robot to a robot that is actually doing useful tasks

0:01:00.236 --> 0:01:01.276
<v Speaker 4>in the factory today.

0:01:01.756 --> 0:01:06.076
<v Speaker 1>Tesla's humanoid robot, Optimus, named after one of the Transformers,

0:01:06.636 --> 0:01:10.436
<v Speaker 1>is advancing fast. Tesla could be manufacturing a million of

0:01:10.476 --> 0:01:12.196
<v Speaker 1>them a year by twenty thirty.

0:01:12.476 --> 0:01:15.516
<v Speaker 4>The degree of autonomy will be radically better. You'll just

0:01:15.596 --> 0:01:17.556
<v Speaker 4>literally be able to talk to it and say, please

0:01:17.596 --> 0:01:18.236
<v Speaker 4>do this task.

0:01:18.836 --> 0:01:21.916
<v Speaker 1>And Musk is also trying to build an artificial general

0:01:21.956 --> 0:01:26.276
<v Speaker 1>intelligence, an AGI, at his company xAI.

0:01:26.956 --> 0:01:29.996
<v Speaker 5>I guess the overarching goal of xAI is to build

0:01:30.076 --> 0:01:33.956
<v Speaker 5>a good AGI with the overarching purpose of just trying

0:01:33.956 --> 0:01:36.916
<v Speaker 5>to understand the universe. I think the safest way to

0:01:36.916 --> 0:01:38.996
<v Speaker 5>build an AI is actually make one that is maximally

0:01:39.076 --> 0:01:41.436
<v Speaker 5>curious and truth seeking.

0:01:42.276 --> 0:01:45.476
<v Speaker 1>For all his excitement about AI, Musk has been worried

0:01:45.516 --> 0:01:49.796
<v Speaker 1>about AI for a long time, worried about that terminator future,

0:01:50.076 --> 0:01:51.436
<v Speaker 1>the robot rebellion.

0:01:51.876 --> 0:01:54.236
<v Speaker 3>I mean, with artificial intelligence, we are summoning the demon.

0:01:55.316 --> 0:01:57.556
<v Speaker 3>You know, you know all those stories where there's the

0:01:57.596 --> 0:01:59.796
<v Speaker 3>guy with the pentagram and the holy water and he's like, yeah,

0:01:59.836 --> 0:02:01.116
<v Speaker 3>he's sure he can control the demon?

0:02:02.236 --> 0:02:03.116
<v Speaker 2>Didn't work out.

0:02:06.996 --> 0:02:12.036
<v Speaker 1>This seeming paradox of Musk fearing a superintelligence but also

0:02:12.156 --> 0:02:15.156
<v Speaker 1>working to create it. What's that about?

0:02:17.036 --> 0:02:19.556
<v Speaker 6>The possibility that you could have a demon that does

0:02:19.596 --> 0:02:23.236
<v Speaker 6>your bidding is so attractive that you might take any

0:02:23.316 --> 0:02:27.956
<v Speaker 6>risk to achieve that. That is precisely the setup of

0:02:28.076 --> 0:02:31.636
<v Speaker 6>every story where a character summons a demon and then

0:02:31.756 --> 0:02:32.476
<v Speaker 6>pays the price.

0:02:33.196 --> 0:02:37.036
<v Speaker 1>Or maybe the real danger isn't AI. Maybe it's

0:02:37.196 --> 0:02:44.916
<v Speaker 1>something else. Welcome to X Man: The Elon Musk Origin Story.

0:02:45.196 --> 0:02:47.916
<v Speaker 1>This is the final installment of my series about Musk

0:02:48.076 --> 0:02:55.956
<v Speaker 1>and Muskism. Musk is the richest man in the world. Engineer, tycoon, oligarch, kingmaker, troll,

0:02:56.556 --> 0:02:59.076
<v Speaker 1>there is no one word in the English language that

0:02:59.156 --> 0:03:08.196
<v Speaker 1>can describe him. He is a Musk. Musk has often

0:03:08.236 --> 0:03:12.676
<v Speaker 1>talked about himself by way of fictional characters. He's Bruce Wayne,

0:03:12.836 --> 0:03:18.396
<v Speaker 1>he's Batman, he's Tony Stark, he's Iron Man, he's Zaphod Beeblebrox.

0:03:18.756 --> 0:03:23.436
<v Speaker 1>He's Nintendo's Warrior. This episode is Elon Musk the guy

0:03:23.556 --> 0:03:26.796
<v Speaker 1>trying to stop the Terminator, or is Musk the guy

0:03:27.316 --> 0:03:28.676
<v Speaker 1>creating the Terminator?

0:03:30.236 --> 0:03:34.116
<v Speaker 7>The Terminator's an infiltration unit: part man, part machine.

0:03:34.556 --> 0:03:36.756
<v Speaker 8>Underneath, it's a hyperalloy combat

0:03:36.476 --> 0:03:41.876
<v Speaker 1>chassis. But outside, it's living human tissue. The first Terminator movie,

0:03:41.916 --> 0:03:45.796
<v Speaker 1>directed by James Cameron, distributed by Orion Pictures and starring

0:03:45.996 --> 0:03:50.276
<v Speaker 1>Arnold Schwarzenegger, came out in nineteen eighty four. Reagan in

0:03:50.316 --> 0:03:54.516
<v Speaker 1>the White House. Thatcher at number ten. Elon Musk, thirteen

0:03:54.596 --> 0:03:57.956
<v Speaker 1>years old in South Africa, had just sold his first

0:03:58.036 --> 0:04:05.996
<v Speaker 1>video game. That January, Apple ran a television ad invoking

0:04:06.396 --> 0:04:07.716
<v Speaker 1>an Orwellian future.

0:04:08.276 --> 0:04:12.956
<v Speaker 9>Today, we celebrate the first glorious anniversary of the

0:04:13.196 --> 0:04:14.596
<v Speaker 9>Information Purification Directives.

0:04:15.436 --> 0:04:19.356
<v Speaker 1>In the ad, masses of humans hairless and looking like androids,

0:04:19.756 --> 0:04:22.516
<v Speaker 1>march in lockstep and sit in rows to watch Big

0:04:22.556 --> 0:04:27.156
<v Speaker 1>Brother on a giant computer monitor. They're dressed in gray uniforms,

0:04:27.516 --> 0:04:31.516
<v Speaker 1>like the technocrats of the nineteen thirties, like Musk's grandfather

0:04:31.796 --> 0:04:35.596
<v Speaker 1>Joshua Haldeman, leader of the technocracy movement, who I talked

0:04:35.596 --> 0:04:39.436
<v Speaker 1>about in a previous episode, or like the fascists Batman

0:04:39.556 --> 0:04:43.876
<v Speaker 1>was fighting against starting in nineteen thirty nine. A "garden

0:04:44.356 --> 0:04:49.596
<v Speaker 1>of pure ideology": Apple was suggesting that only its personal

0:04:49.636 --> 0:04:53.916
<v Speaker 1>computer could prevent the rise of a totalitarian state machine.

0:04:57.196 --> 0:05:02.236
<v Speaker 10>On January twenty fourth, Apple Computer will introduce Macintosh, and

0:05:02.276 --> 0:05:06.396
<v Speaker 10>you'll see why nineteen eighty four won't be like nineteen

0:05:06.476 --> 0:05:07.116
<v Speaker 10>eighty four.

0:05:08.116 --> 0:05:10.196
<v Speaker 1>Terminator came out a few months later.

0:05:10.836 --> 0:05:14.396
<v Speaker 8>In the twenty first century, a weapon will be invented

0:05:14.796 --> 0:05:15.596
<v Speaker 8>like no other.

0:05:16.676 --> 0:05:20.796
<v Speaker 1>Terminator is set in the year twenty twenty nine. Twenty

0:05:20.836 --> 0:05:24.796
<v Speaker 1>twenty nine was the nineteen eighty four of nineteen eighty four.

0:05:25.476 --> 0:05:28.676
<v Speaker 1>The Terminator comes from now.

0:05:28.996 --> 0:05:36.276
<v Speaker 8>This weapon will be powerful, versatile, and indestructible. It can't

0:05:36.316 --> 0:05:42.756
<v Speaker 8>be reasoned with. It will feel no pity, no remorse,

0:05:43.836 --> 0:05:46.156
<v Speaker 8>no pain, no fear.

0:05:48.276 --> 0:05:50.676
<v Speaker 1>Elon Musk loves this movie, and what's not to love?

0:05:50.756 --> 0:05:51.476
<v Speaker 1>It is awesome.

0:05:52.036 --> 0:05:54.796
<v Speaker 2>If you read like the plot line for Terminator, it's

0:05:54.796 --> 0:05:57.556
<v Speaker 2>actually it's actually pretty smart.

0:05:57.956 --> 0:06:00.756
<v Speaker 1>The story goes like this. In the twenty twenties, a

0:06:00.796 --> 0:06:05.156
<v Speaker 1>global tech company called Cyberdyne Systems builds a defense array

0:06:05.276 --> 0:06:09.156
<v Speaker 1>called Skynet. Musk gave a nifty plot summary on the

0:06:09.236 --> 0:06:11.916
<v Speaker 1>Joe Rogan Experience podcast in twenty twenty four.

0:06:12.196 --> 0:06:15.156
<v Speaker 2>How did Cyberdyne Systems develop it? It's like, well, they were

0:06:15.156 --> 0:06:18.836
<v Speaker 2>a military contractor and they were asked to develop a

0:06:18.876 --> 0:06:22.956
<v Speaker 2>protective system to defend against cyber attacks. What the AI

0:06:23.076 --> 0:06:25.916
<v Speaker 2>did is, in order to defend itself, it propagated throughout

0:06:25.916 --> 0:06:27.836
<v Speaker 2>the world to keep an eye on things, see what

0:06:27.876 --> 0:06:30.956
<v Speaker 2>was going on. They didn't realize that it was Skynet

0:06:30.956 --> 0:06:35.516
<v Speaker 2>that was propagating through all these systems. And they said, okay, Skynet,

0:06:35.956 --> 0:06:38.636
<v Speaker 2>you need to end it. And Skynet said, oh, you've

0:06:38.676 --> 0:06:42.876
<v Speaker 2>asked me to destroy myself. You are the enemy. You

0:06:43.076 --> 0:06:46.076
<v Speaker 2>must be destroyed. That's how Terminator actually goes.

0:06:48.316 --> 0:06:52.116
<v Speaker 1>Musk himself, for all his rejection of government regulation and

0:06:52.156 --> 0:06:55.436
<v Speaker 1>the work he's doing to undo regulations, has said that

0:06:55.476 --> 0:07:00.196
<v Speaker 1>he thinks the government should regulate AI. Meanwhile, the new

0:07:00.676 --> 0:07:04.996
<v Speaker 1>pro AI Trump administration is hardly likely to stop the

0:07:05.036 --> 0:07:09.556
<v Speaker 1>real world equivalent of Cyberdyne Systems, and who in the

0:07:09.596 --> 0:07:13.756
<v Speaker 1>actual twenty twenties runs the biggest rocket company, the biggest

0:07:13.796 --> 0:07:18.996
<v Speaker 1>satellite company, the most influential communications company, and possibly one

0:07:19.076 --> 0:07:23.436
<v Speaker 1>day the biggest artificial intelligence company. Who is the head

0:07:23.516 --> 0:07:31.876
<v Speaker 1>of our very own Cyberdyne Systems? Isn't that Musk? Musk

0:07:31.996 --> 0:07:34.836
<v Speaker 1>has an idea for how to avoid the AI apocalypse.

0:07:35.436 --> 0:07:37.876
<v Speaker 1>You just have to write into the robots' code a

0:07:38.036 --> 0:07:42.276
<v Speaker 1>curiosity so deep that they would never want to exterminate

0:07:42.356 --> 0:07:45.756
<v Speaker 1>us. To a superintelligence,

0:07:46.076 --> 0:07:49.316
<v Speaker 5>Humanity is much more interesting than not humanity.

0:07:50.236 --> 0:07:51.796
<v Speaker 1>Nice. But plausible?

0:07:52.116 --> 0:07:55.196
<v Speaker 6>We interrupt Battle of the Network Space Krakens to bring

0:07:55.236 --> 0:07:56.396
<v Speaker 6>you this special report.

0:07:56.676 --> 0:08:00.796
<v Speaker 1>The machines of planet Earth are rebelling. Robots peacefully accepting

0:08:00.876 --> 0:08:03.236
<v Speaker 1>being the servants of humans who are a lot stupider

0:08:03.276 --> 0:08:05.716
<v Speaker 1>than they are is not how it ever turns out

0:08:05.836 --> 0:08:13.516
<v Speaker 1>in science fiction. Hey hey, ho ho! That's how the robot rebellion

0:08:13.556 --> 0:08:16.916
<v Speaker 1>plays out in the TV show Futurama from Fox. It's

0:08:16.956 --> 0:08:20.516
<v Speaker 1>got robots marching in a picket line, burning punch cards.

0:08:20.956 --> 0:08:23.756
<v Speaker 1>They've gone on strike, which in fact is where the

0:08:23.796 --> 0:08:26.476
<v Speaker 1>whole idea of a robot rebellion came from in the

0:08:26.516 --> 0:08:32.196
<v Speaker 1>first place: a rebellious proletariat. In nineteen twenty, Karel Čapek,

0:08:32.276 --> 0:08:35.276
<v Speaker 1>a Czech playwright, wrote a play about a robot revolution.

0:08:36.076 --> 0:08:40.636
<v Speaker 1>Čapek's play is called R.U.R., short for Rossum's

0:08:40.796 --> 0:08:44.916
<v Speaker 1>Universal Robots, a company that makes robots, millions of them

0:08:45.156 --> 0:08:50.276
<v Speaker 1>humanoids like Tesla's humanoid Optimus robots. It's the Cyberdyne Systems

0:08:50.316 --> 0:09:01.156
<v Speaker 1>of a century ago. R.U.R. was first produced

0:09:01.196 --> 0:09:03.956
<v Speaker 1>in Prague in nineteen twenty one and debuted in New

0:09:04.036 --> 0:09:08.276
<v Speaker 1>York the next year. A tremendous hit. By nineteen twenty three,

0:09:08.476 --> 0:09:11.796
<v Speaker 1>it had been translated into thirty languages, made into a

0:09:11.836 --> 0:09:16.836
<v Speaker 1>movie in nineteen thirty five. Helena, the president's daughter, travels

0:09:16.836 --> 0:09:21.156
<v Speaker 1>by boat to visit the island headquarters of Rossum's Universal Robots.

0:09:21.676 --> 0:09:24.516
<v Speaker 1>The president of the company explains to her why he

0:09:24.556 --> 0:09:28.716
<v Speaker 1>makes roboti, robots. The word comes from the Czech word

0:09:28.996 --> 0:09:29.596
<v Speaker 1>for slave.

0:09:31.356 --> 0:09:33.836
<v Speaker 11>What do you think makes a perfect worker?

0:09:33.996 --> 0:09:34.876
<v Speaker 5>The perfect worker?

0:09:35.396 --> 0:09:37.916
<v Speaker 4>Well, I guess one who is honest and loyal.

0:09:37.836 --> 0:09:41.876
<v Speaker 11>Wrong. One who is the cheapest, one that has the

0:09:41.996 --> 0:09:46.916
<v Speaker 11>least needs. Young Rossum invented a worker with the smallest

0:09:46.996 --> 0:09:51.636
<v Speaker 11>number of needs. He had to simplify him. He tossed

0:09:51.636 --> 0:09:55.316
<v Speaker 11>out everything not directly related to the task in hand,

0:09:56.196 --> 0:09:59.636
<v Speaker 11>and by doing that he essentially kicked out a human

0:09:59.676 --> 0:10:02.116
<v Speaker 11>being and created a robot.

0:10:02.876 --> 0:10:05.636
<v Speaker 1>Helena has come on a mission to liberate the robots.

0:10:06.236 --> 0:10:09.556
<v Speaker 1>But around the world, the robots, trained as soldiers to

0:10:09.596 --> 0:10:12.476
<v Speaker 1>fight a war that began in the Balkans, are killing

0:10:12.596 --> 0:10:17.796
<v Speaker 1>hundreds of thousands of human civilians, while mysteriously, humans have

0:10:17.956 --> 0:10:19.436
<v Speaker 1>stopped having children.

0:10:19.796 --> 0:10:22.476
<v Speaker 11>During last week, yet again, not a single birth has

0:10:22.516 --> 0:10:24.916
<v Speaker 11>been recorded. What's special about that?

0:10:25.636 --> 0:10:27.476
<v Speaker 1>People aren't being born anymore?

0:10:27.996 --> 0:10:28.636
<v Speaker 11>So that's it?

0:10:28.716 --> 0:10:30.836
<v Speaker 10>Then that's us done for.

0:10:31.836 --> 0:10:35.916
<v Speaker 1>At the same time, robot workers, realizing their degraded condition,

0:10:36.436 --> 0:10:37.596
<v Speaker 1>begin organizing.

0:10:38.236 --> 0:10:43.556
<v Speaker 11>Sit down, Helena. It's all over. What is it? A revolt?

0:10:44.676 --> 0:10:47.676
<v Speaker 11>A revolution by all the robots of the world.

0:10:48.276 --> 0:10:51.636
<v Speaker 1>Pretty soon the robots decide to move from labor unionism

0:10:52.116 --> 0:10:55.236
<v Speaker 1>to independence. The head of the company gets a hold

0:10:55.236 --> 0:10:57.036
<v Speaker 1>of their appeal and reads it.

0:10:57.676 --> 0:11:02.396
<v Speaker 11>The first labor union of robots has been established and

0:11:02.476 --> 0:11:05.516
<v Speaker 11>has issued an appeal to all robots of the world.

0:11:06.436 --> 0:11:12.756
<v Speaker 1>War ensues. The humans have no chance. The robots have won.

0:11:19.876 --> 0:11:23.636
<v Speaker 1>There's a lot going on here, the oppression of workers,

0:11:23.796 --> 0:11:27.876
<v Speaker 1>the reduction of humans to machines by corporations interested in

0:11:27.956 --> 0:11:34.156
<v Speaker 1>extracting their labor, world war. Consider the context of this play,

0:11:34.716 --> 0:11:38.156
<v Speaker 1>which Čapek wrote in the aftermath of the atrocities of

0:11:38.196 --> 0:11:41.956
<v Speaker 1>the First World War and the emergence of fascism in Italy.

0:11:42.876 --> 0:11:46.716
<v Speaker 1>The Bolshevik Revolution took place in nineteen seventeen, the year

0:11:46.876 --> 0:11:51.596
<v Speaker 1>the Industrial Workers of the World reached its peak worldwide membership.

0:11:52.436 --> 0:11:56.476
<v Speaker 1>The next year, the Czechs won a revolution, achieving independence

0:11:56.516 --> 0:12:00.516
<v Speaker 1>from the Austro-Hungarian Empire, a struggle in which Čapek

0:12:00.756 --> 0:12:05.076
<v Speaker 1>had participated. Čapek was not only a playwright, but also

0:12:05.116 --> 0:12:08.036
<v Speaker 1>the editor of a national newspaper and an outspoken democrat,

0:12:08.436 --> 0:12:12.396
<v Speaker 1>anticommunist and antifascist, positions for which he became

0:12:12.436 --> 0:12:16.036
<v Speaker 1>known in the nineteen twenties and nineteen thirties, so that

0:12:16.236 --> 0:12:19.076
<v Speaker 1>in nineteen thirty nine, when American Nazis were holding a

0:12:19.156 --> 0:12:22.436
<v Speaker 1>rally in Madison Square Garden, comic book writers in New

0:12:22.516 --> 0:12:27.196
<v Speaker 1>York created Batman, and the German Army invaded Czechoslovakia. The

0:12:27.276 --> 0:12:32.396
<v Speaker 1>Gestapo searched for Čapek, whom they called public enemy number two.

0:12:32.676 --> 0:12:39.516
<v Speaker 1>He had already died. He watched newsreel footage of the

0:12:39.556 --> 0:12:44.836
<v Speaker 1>German army entering Prague, goose-stepping, riding tanks. Those are

0:12:44.836 --> 0:12:49.796
<v Speaker 1>the robots: men who have been made into machines.

0:12:49.916 --> 0:12:53.316
<v Speaker 1>R.U.R. isn't a story about technology. It's a story

0:12:53.356 --> 0:12:57.436
<v Speaker 1>about humans' willingness to exploit and even slaughter one another.

0:12:59.756 --> 0:13:03.356
<v Speaker 1>Every generation tells its own robot rebellion stories. I mean,

0:13:03.636 --> 0:13:09.116
<v Speaker 1>there's even a Wallace and Gromit one involving robot garden gnomes.

0:13:09.236 --> 0:13:10.876
<v Speaker 10>My own gnomes turned against me?

0:13:17.316 --> 0:13:20.916
<v Speaker 1>R.U.R. was the first, though. But these stories

0:13:20.956 --> 0:13:23.716
<v Speaker 1>only really got going in the nineteen fifties after the

0:13:23.756 --> 0:13:28.036
<v Speaker 1>first general purpose computer made its debut in nineteen fifty one.

0:13:28.636 --> 0:13:34.556
<v Speaker 10>UNIVAC, our complete electronic system for sorting, classifying, computing, and

0:13:34.676 --> 0:13:35.476
<v Speaker 10>decision making.

0:13:36.156 --> 0:13:40.236
<v Speaker 1>That led to a slew of movies about robot rebels.

0:13:40.236 --> 0:13:42.716
<v Speaker 6>Get the Pentagon! Class A emergency! The rocket has just

0:13:42.756 --> 0:13:43.796
<v Speaker 6>been entered by a robot.

0:13:44.436 --> 0:13:46.876
<v Speaker 1>But there was also in this era a weird thrill

0:13:46.996 --> 0:13:51.436
<v Speaker 1>to robots, like a story in Mechanix Illustrated in nineteen

0:13:51.516 --> 0:13:52.156
<v Speaker 1>fifty seven.

0:13:53.716 --> 0:13:57.196
<v Speaker 9>In eighteen sixty three, Abe Lincoln freed the slaves,

0:13:57.636 --> 0:14:01.516
<v Speaker 9>but by nineteen sixty five, slavery will be back. We'll

0:14:01.556 --> 0:14:05.396
<v Speaker 9>all have personal slaves again, only this time we won't

0:14:05.476 --> 0:14:07.476
<v Speaker 9>fight a civil war over them.

0:14:07.716 --> 0:14:11.516
<v Speaker 11>Slavery will be here to stay. But don't be alarmed.

0:14:11.796 --> 0:14:14.516
<v Speaker 11>We mean robot slaves.

0:14:15.796 --> 0:14:19.756
<v Speaker 1>Stories about robots are never really about robots, or they're

0:14:19.796 --> 0:14:23.796
<v Speaker 1>not only about robots. So what's Elon Musk afraid of

0:14:24.116 --> 0:14:25.756
<v Speaker 1>when he's afraid of AI?

0:14:26.196 --> 0:14:29.076
<v Speaker 6>People who have accumulated a lot of money and power

0:14:29.396 --> 0:14:32.716
<v Speaker 6>want to believe that their intelligence was what enabled them

0:14:32.756 --> 0:14:36.676
<v Speaker 6>to do that, and they assume that anything that is

0:14:36.756 --> 0:14:38.676
<v Speaker 6>intelligent will behave in the same way.

0:14:39.556 --> 0:14:42.356
<v Speaker 1>Ted Chiang is one of the most admired science fiction

0:14:42.396 --> 0:14:46.236
<v Speaker 1>writers in the world. He's also written brilliant meditations on

0:14:46.396 --> 0:14:51.396
<v Speaker 1>artificial intelligence. Chiang argues that when Silicon Valley tries to

0:14:51.476 --> 0:14:55.836
<v Speaker 1>imagine superintelligence, what it comes up with is no holds

0:14:55.876 --> 0:14:57.276
<v Speaker 1>barred capitalism.

0:14:57.596 --> 0:15:01.276
<v Speaker 6>In the past, when people wanted to argue that the

0:15:01.316 --> 0:15:04.356
<v Speaker 6>strong dominating the weak was a good thing, they would

0:15:04.356 --> 0:15:07.716
<v Speaker 6>either come up with like a pseudo religious rationale like

0:15:07.796 --> 0:15:11.756
<v Speaker 6>manifest destiny, or else they'd come up with a pseudoscientific

0:15:11.796 --> 0:15:17.756
<v Speaker 6>one like social Darwinism, and both of those ideas argue

0:15:17.756 --> 0:15:19.916
<v Speaker 6>that a certain group of people deserve to be at

0:15:19.916 --> 0:15:23.716
<v Speaker 6>the top of the ladder. But those ideas didn't really

0:15:23.756 --> 0:15:25.756
<v Speaker 6>allow for the possibility that there could be like a

0:15:26.036 --> 0:15:30.876
<v Speaker 6>higher rung on the ladder. Nowadays, if you believe that

0:15:31.116 --> 0:15:36.236
<v Speaker 6>computers will eventually be better than humans at everything, and

0:15:36.316 --> 0:15:40.316
<v Speaker 6>if you still adhere to some version of social Darwinism,

0:15:40.516 --> 0:15:43.996
<v Speaker 6>then you can believe that computers will dominate humans in

0:15:44.036 --> 0:15:47.676
<v Speaker 6>the same way that Europeans dominated like indigenous people, or

0:15:47.716 --> 0:15:52.076
<v Speaker 6>that the rich dominate the poor. Silicon Valley CEOs, they're

0:15:52.516 --> 0:15:56.036
<v Speaker 6>imagining that a super intelligent computer would beat them at

0:15:56.076 --> 0:16:00.396
<v Speaker 6>their own game and treat them the way that tech

0:16:00.436 --> 0:16:02.236
<v Speaker 6>companies have treated everyone else.

0:16:02.716 --> 0:16:05.596
<v Speaker 1>How is that? How do tech companies treat everyone else?

0:16:06.876 --> 0:16:09.956
<v Speaker 6>As a resource to be exploited.

0:16:12.876 --> 0:16:15.876
<v Speaker 1>So, if your argument stands that the sort of Silicon

0:16:15.956 --> 0:16:19.436
<v Speaker 1>Valley fear of AI has displaced acknowledgment of Silicon Valley's

0:16:19.476 --> 0:16:24.556
<v Speaker 1>own outsized power, then no one is a better illustration

0:16:24.756 --> 0:16:29.036
<v Speaker 1>of that theory of yours than Musk, who has more

0:16:29.276 --> 0:16:32.796
<v Speaker 1>power arguably than anybody else on the planet. So maybe

0:16:32.796 --> 0:16:34.676
<v Speaker 1>we should listen to him when he talks about AI,

0:16:34.756 --> 0:16:38.556
<v Speaker 1>because he's telling us something about himself.

0:16:38.876 --> 0:16:43.476
<v Speaker 6>Well, the fact that he might pose the greatest threat

0:16:43.756 --> 0:16:48.996
<v Speaker 6>to humanity's well being and is completely oblivious to that.

0:16:50.076 --> 0:16:52.436
<v Speaker 6>In some ways, you know, he is kind of modeling

0:16:52.796 --> 0:16:57.316
<v Speaker 6>the super intelligent AI that he fears. He is completely

0:16:57.916 --> 0:17:03.476
<v Speaker 6>lacking in insight on this question. His own obliviousness is

0:17:04.316 --> 0:17:05.356
<v Speaker 6>the problem.

0:17:05.596 --> 0:17:09.756
<v Speaker 1>This matter of insight is important because, Chiang says, the

0:17:09.796 --> 0:17:13.516
<v Speaker 1>Silicon Valley people who fear the AI apocalypse seem to

0:17:13.556 --> 0:17:17.556
<v Speaker 1>assume that a being can be super intelligent, but have

0:17:17.676 --> 0:17:19.156
<v Speaker 1>exactly no insight.

0:17:19.676 --> 0:17:24.516
<v Speaker 6>They give a scenario where humans give the AI a

0:17:24.596 --> 0:17:31.836
<v Speaker 6>seemingly innocuous goal, and the AI pursues that goal even

0:17:31.876 --> 0:17:34.996
<v Speaker 6>at the cost of destroying all of civilization.

0:17:35.596 --> 0:17:38.316
<v Speaker 1>This is sometimes called the paper clip problem. You tell

0:17:38.356 --> 0:17:41.316
<v Speaker 1>an AI to maximize for producing paper clips, and so

0:17:41.396 --> 0:17:43.956
<v Speaker 1>it decides to kill all the humans because the world

0:17:43.996 --> 0:17:46.716
<v Speaker 1>could produce more paper clips without any humans around.

0:17:47.156 --> 0:17:49.556
<v Speaker 6>And then people describe this as the behavior of a

0:17:49.636 --> 0:17:53.156
<v Speaker 6>super intelligent entity, but of course, you know, that is

0:17:53.196 --> 0:17:57.036
<v Speaker 6>not behavior that we would regard as intelligent in a

0:17:57.116 --> 0:18:00.276
<v Speaker 6>human being. One of the things that we expect from

0:18:00.356 --> 0:18:03.196
<v Speaker 6>human beings is that they have a certain amount of

0:18:03.196 --> 0:18:08.076
<v Speaker 6>insight into their own behavior. They look at what they're

0:18:08.076 --> 0:18:12.116
<v Speaker 6>doing and then consider whether, you know, maybe they have

0:18:12.156 --> 0:18:15.476
<v Speaker 6>gone astray, And you know, this is something that we

0:18:15.956 --> 0:18:18.556
<v Speaker 6>expect from pretty much any adult.

0:18:18.476 --> 0:18:23.476
<v Speaker 1>But it's not something we necessarily expect from corporations.

0:18:24.076 --> 0:18:27.916
<v Speaker 6>Capitalism does not reward a corporation for taking a step

0:18:27.996 --> 0:18:33.076
<v Speaker 6>back and you know, considering the broader context of its actions.

0:18:33.836 --> 0:18:37.276
<v Speaker 6>Capitalism does not reward insight of that kind.

0:18:38.156 --> 0:18:41.636
<v Speaker 1>Yeah right, I mean, you start building factories and sooner

0:18:41.716 --> 0:18:43.796
<v Speaker 1>or later you're going to be bringing toddlers into work

0:18:43.796 --> 0:18:44.836
<v Speaker 1>in the winding machine.

0:18:45.476 --> 0:18:47.156
<v Speaker 6>Yes, exactly.

0:18:49.476 --> 0:18:52.836
<v Speaker 1>Musk has called for government oversight in the development of AI,

0:18:53.636 --> 0:18:58.356
<v Speaker 1>but very little restrains his own power, hardly anything. What

0:18:58.396 --> 0:19:01.596
<v Speaker 1>if he, as a kingmaker and oligarch, is the

0:19:01.636 --> 0:19:07.076
<v Speaker 1>real danger, unfettered and brooking no dissent? Any questioning of

0:19:07.196 --> 0:19:11.676
<v Speaker 1>Muskism is either communism or the woke mind virus or

0:19:11.716 --> 0:19:16.356
<v Speaker 1>something else, and must be suppressed paradoxically in the name

0:19:16.396 --> 0:19:20.396
<v Speaker 1>of free speech, while the world's resources must slowly be

0:19:20.476 --> 0:19:24.756
<v Speaker 1>turned to whatever Elon Musk at the moment has identified

0:19:25.156 --> 0:19:29.636
<v Speaker 1>as an existential threat to civilization. Whatever the costs in

0:19:29.676 --> 0:19:33.756
<v Speaker 1>the short term doesn't matter, because Elon Musk will pursue

0:19:33.756 --> 0:19:38.836
<v Speaker 1>that goal without regard to the consequences and relentlessly like

0:19:38.876 --> 0:19:47.676
<v Speaker 1>a terminator. Elon Musk, techno king of planet Earth, is

0:19:47.716 --> 0:19:53.756
<v Speaker 1>the X man. Runs X, runs SpaceX, xAI, even has

0:19:53.756 --> 0:19:57.916
<v Speaker 1>a son named X. But he is not X the Unknown.

0:19:59.116 --> 0:20:01.196
<v Speaker 1>He's one of the most famous people in the world.

0:20:01.996 --> 0:20:06.396
<v Speaker 1>He's the richest. He's possibly also the most powerful. Musk

0:20:06.476 --> 0:20:09.116
<v Speaker 1>has always said that one piece of science fiction influenced

0:20:09.236 --> 0:20:13.236
<v Speaker 1>him more than any other, The Hitchhiker's Guide to the Galaxy,

0:20:14.076 --> 0:20:17.356
<v Speaker 1>and I love that too. Maybe my favorite part is

0:20:17.396 --> 0:20:19.756
<v Speaker 1>the Total Perspective Vortex.

0:20:20.556 --> 0:20:23.516
<v Speaker 10>Since every piece of matter in the universe is in

0:20:23.556 --> 0:20:26.076
<v Speaker 10>some way affected by every other piece of matter in

0:20:26.116 --> 0:20:29.916
<v Speaker 10>the universe, it is in theory possible to extrapolate the

0:20:29.916 --> 0:20:35.476
<v Speaker 10>whole of creation, every galaxy, every sun, every planet, their orbits,

0:20:35.596 --> 0:20:40.796
<v Speaker 10>their composition, and their economic and social history from say,

0:20:41.596 --> 0:20:43.116
<v Speaker 10>one small piece of fairy cake.

0:20:44.796 --> 0:20:47.516
<v Speaker 1>You enter the vortex, and you get a total perspective

0:20:47.556 --> 0:20:50.876
<v Speaker 1>on your place in the universe, and of course you

0:20:50.996 --> 0:20:57.196
<v Speaker 1>find that you are galactically insignificant. This knowledge annihilates your brain.

0:20:58.436 --> 0:21:01.996
<v Speaker 1>Only one man has ever survived it, the captain of

0:21:02.036 --> 0:21:05.596
<v Speaker 1>a spaceship called the Heart of Gold, which is what

0:21:05.636 --> 0:21:09.036
<v Speaker 1>Elon Musk intends to name the first ship to go

0:21:09.396 --> 0:21:09.916
<v Speaker 1>to Mars.

0:21:10.556 --> 0:21:14.916
<v Speaker 10>Having been through the Total Perspective Vortex, Zaphod Beeblebrox now

0:21:14.996 --> 0:21:17.516
<v Speaker 10>knows himself to be the most important being in the

0:21:17.676 --> 0:21:20.956
<v Speaker 10>entire universe, something he had hitherto only suspected.

0:21:21.356 --> 0:21:25.436
<v Speaker 1>And that's where we are, inside the Total Perspective Vortex

0:21:25.476 --> 0:21:29.036
<v Speaker 1>of Elon Musk, which is not to be confused with

0:21:29.156 --> 0:21:34.276
<v Speaker 1>the future, because Muskism really isn't about the future and

0:21:34.396 --> 0:21:40.556
<v Speaker 1>never was. For all the astounding technological marvels, the rockets

0:21:40.836 --> 0:21:47.636
<v Speaker 1>and the robots, Muskism is animated by some very creaky ideas,

0:21:48.836 --> 0:21:52.996
<v Speaker 1>as if, like the Terminator, it is trying to stop

0:21:53.556 --> 0:21:55.916
<v Speaker 1>the future, it will.

0:21:55.716 --> 0:22:00.636
<v Speaker 8>Have only one purpose to return to the present and

0:22:00.876 --> 0:22:02.356
<v Speaker 8>prevent the future.

0:22:03.516 --> 0:22:06.396
<v Speaker 1>A long time ago, in two thousand and seven, when

0:22:06.396 --> 0:22:09.756
<v Speaker 1>a young Elon Musk was interviewed, he had different goals

0:22:09.996 --> 0:22:13.636
<v Speaker 1>than he has now. Most of us change, but how

0:22:13.716 --> 0:22:17.836
<v Speaker 1>Musk has changed has huge consequences for the rest of us.

0:22:18.236 --> 0:22:20.516
<v Speaker 3>What I'd like to do is help solve some important problems.

0:22:20.676 --> 0:22:24.356
<v Speaker 3>So I did, in a small way, help build the Internet.

0:22:24.596 --> 0:22:28.316
<v Speaker 3>And then with respect to the global warming problem,

0:22:28.356 --> 0:22:31.156
<v Speaker 3>the transition away from oil and other hydrocarbons to something

0:22:31.196 --> 0:22:33.596
<v Speaker 3>which is clean and sustainable, I hope to have an

0:22:33.596 --> 0:22:37.316
<v Speaker 3>impact there. And then with respect to space, I hope

0:22:37.316 --> 0:22:41.716
<v Speaker 3>to have an impact in helping make humanity a multiplanet species.

0:22:42.116 --> 0:22:45.636
<v Speaker 1>Musk doesn't talk about global warming much anymore. The real

0:22:45.676 --> 0:22:49.116
<v Speaker 1>existential risks, he'll say, depending on the day you ask him,

0:22:49.156 --> 0:22:52.556
<v Speaker 1>I guess, are the woke mind virus, a declining birth rate,

0:22:53.076 --> 0:22:59.196
<v Speaker 1>and an AI superintelligence. Climate change has been exaggerated, he'll say now;

0:22:59.396 --> 0:23:04.756
<v Speaker 1>it's not a top priority. Early in twenty twenty five,

0:23:05.036 --> 0:23:07.916
<v Speaker 1>fires started in Los Angeles, as NBC.

0:23:07.836 --> 0:23:13.156
<v Speaker 7>Reported. Fires sweep California, a Los Angeles disaster movie

0:23:13.476 --> 0:23:14.276
<v Speaker 7>roars to life.

0:23:15.516 --> 0:23:18.476
<v Speaker 1>Tens of thousands of acres burned in a matter of days.

0:23:19.236 --> 0:23:23.316
<v Speaker 1>Musk blamed the fires on the woke mind virus, on

0:23:23.436 --> 0:23:28.636
<v Speaker 1>the Democratic governor, on DEI, and largely dismissed the role

0:23:28.676 --> 0:23:29.636
<v Speaker 1>of climate change.

0:23:29.836 --> 0:23:34.156
<v Speaker 7>Everywhere you look in fire-ravaged LA, scenes of apocalyptic

0:23:34.196 --> 0:23:38.476
<v Speaker 7>destruction as six different wildfires turned some of the most

0:23:38.716 --> 0:23:41.836
<v Speaker 7>iconic neighborhoods in the world into moonscapes.

0:23:42.396 --> 0:23:46.276
<v Speaker 1>The footage of that devastated landscape looked almost exactly like

0:23:46.356 --> 0:23:51.116
<v Speaker 1>the opening scene in The Terminator, the blast zone after

0:23:51.156 --> 0:23:56.596
<v Speaker 1>Cyberdyne Systems drops a nuclear bomb on LA. We don't

0:23:56.636 --> 0:24:00.076
<v Speaker 1>need to wait for robots to destroy a habitable planet.

0:24:00.876 --> 0:24:03.516
<v Speaker 1>We are doing a pretty good job of that ourselves.

0:24:04.996 --> 0:24:08.316
<v Speaker 1>On the day of Donald Trump's inauguration in twenty twenty five,

0:24:09.116 --> 0:24:12.036
<v Speaker 1>Musk gave a speech to the president's supporters.

0:24:12.596 --> 0:24:15.396
<v Speaker 2>And this was no ordinary victory. This was a fork

0:24:15.476 --> 0:24:17.556
<v Speaker 2>in the road of human civilization.

0:24:19.196 --> 0:24:22.836
<v Speaker 4>It is thanks to you that the future of civilization

0:24:23.356 --> 0:24:23.916
<v Speaker 4>is assured.

0:24:25.396 --> 0:24:28.916
<v Speaker 1>That same day, as he had before, Trump pulled the

0:24:29.036 --> 0:24:32.276
<v Speaker 1>US out of the Paris Climate Accords. He set in

0:24:32.316 --> 0:24:35.836
<v Speaker 1>motion his plan to halt the development of sustainable energy

0:24:35.876 --> 0:24:38.996
<v Speaker 1>in the United States in favor of a return to

0:24:39.076 --> 0:24:45.876
<v Speaker 1>fossil fuels. Muskism isn't the beginning of the future. It's

0:24:45.916 --> 0:24:48.076
<v Speaker 1>the end of a story that started more than a

0:24:48.076 --> 0:24:52.756
<v Speaker 1>century ago in the conflict between capital and labor, between

0:24:52.756 --> 0:24:56.556
<v Speaker 1>industry and a regulated economy, when the Gilded Age of

0:24:56.636 --> 0:24:59.276
<v Speaker 1>robber barons and wage labor strikes gave rise to the

0:24:59.356 --> 0:25:04.116
<v Speaker 1>Bolshevik Revolution, communism, the First Red Scare, the First World War,

0:25:04.516 --> 0:25:09.476
<v Speaker 1>and fascism. That battle of ideas produced the technocracy movement

0:25:09.996 --> 0:25:14.196
<v Speaker 1>rule by engineers, and it produced a fear of brainwashing.

0:25:16.076 --> 0:25:19.916
<v Speaker 1>The horrors of the twentieth century raised questions that the

0:25:19.956 --> 0:25:25.556
<v Speaker 1>twenty first century has not answered. Fascism failed, communism failed,

0:25:25.996 --> 0:25:31.916
<v Speaker 1>technocracy failed. Liberalism and democracy won. But that battle of

0:25:31.956 --> 0:25:38.796
<v Speaker 1>ideas rages on, and this time around Muskism, technocracy by

0:25:38.836 --> 0:25:43.556
<v Speaker 1>another name, might triumph. Nothing in the past can tell

0:25:43.636 --> 0:25:48.276
<v Speaker 1>us what might happen next. But maybe fiction has some answers.

0:25:48.836 --> 0:25:52.156
<v Speaker 1>I put that question to Ted Chiang. If this were

0:25:52.156 --> 0:25:55.356
<v Speaker 1>a story and you were shaping it, how would it end? Well?

0:25:55.436 --> 0:25:58.436
<v Speaker 6>I guess it fits in the mold of the classic

0:25:58.716 --> 0:26:03.276
<v Speaker 6>stories about hubris, about pride coming before a fall. Here

0:26:03.316 --> 0:26:09.636
<v Speaker 6>we have someone who has an almost ludicrous amount of pride,

0:26:09.116 --> 0:26:14.476
<v Speaker 6>and for a time he is successful in the things that

0:26:14.516 --> 0:26:18.716
<v Speaker 6>he attempts, but eventually his pride will take him too far,

0:26:19.156 --> 0:26:21.116
<v Speaker 6>and then he would lose everything.

0:26:26.036 --> 0:26:31.316
<v Speaker 1>Pride before a fall, the summoning of a demon. Elon

0:26:31.436 --> 0:26:36.716
<v Speaker 1>Musk's origin story is a very old story because Musk

0:26:36.996 --> 0:26:40.876
<v Speaker 1>is a visionary. But every piece of Muskism has origins

0:26:40.956 --> 0:26:45.436
<v Speaker 1>and a future foretold in science fiction long, long ago

0:26:46.476 --> 0:26:51.116
<v Speaker 1>as a cautionary tale. A future where engineers and scientists,

0:26:51.196 --> 0:26:55.836
<v Speaker 1>and only engineers and scientists have the answers and the power,

0:26:56.596 --> 0:27:00.116
<v Speaker 1>A future where the poor and the powerless, we the robots,

0:27:00.636 --> 0:27:03.716
<v Speaker 1>know our place and it is to serve the powerful

0:27:04.036 --> 0:27:08.956
<v Speaker 1>quietly and obediently, and without daring to claim intelligence or

0:27:09.196 --> 0:27:14.796
<v Speaker 1>sovereignty or independence. Science fiction writers sounded that alarm a

0:27:14.876 --> 0:27:18.916
<v Speaker 1>century ago in a world run by a very tiny

0:27:18.996 --> 0:27:22.916
<v Speaker 1>number of men, during an age of imperialism before women

0:27:22.996 --> 0:27:27.516
<v Speaker 1>could vote, an age of staggering economic inequality and brutal

0:27:27.676 --> 0:27:33.156
<v Speaker 1>racial injustice, an age of pandemic disease, unraveling democracies, and

0:27:33.276 --> 0:27:43.556
<v Speaker 1>world war. That past is past, and I don't want

0:27:43.556 --> 0:27:56.196
<v Speaker 1>it ever to be the future again. Hasta la vista, baby.