WEBVTT - Technology

0:00:02.640 --> 0:00:05.840
<v Speaker 1>Hello, and welcome to the New Economy. I'm Stephanie Flanders,

0:00:05.960 --> 0:00:12.440
<v Speaker 1>head of Bloomberg Economics. The factory of the future will

0:00:12.480 --> 0:00:16.360
<v Speaker 1>employ only one man and one dog. The man's job

0:00:16.640 --> 0:00:19.639
<v Speaker 1>will be to feed the dog. The dog's job will be

0:00:19.680 --> 0:00:22.680
<v Speaker 1>to keep the man away from the machines. You can't

0:00:22.720 --> 0:00:25.799
<v Speaker 1>talk about the new economy without talking about technology, and

0:00:25.880 --> 0:00:28.680
<v Speaker 1>that joke captures one vision of what it has in

0:00:28.720 --> 0:00:31.200
<v Speaker 1>store for us. Robots are going to take all of

0:00:31.200 --> 0:00:36.320
<v Speaker 1>our jobs. But there's another view that says robots aren't

0:00:36.320 --> 0:00:39.560
<v Speaker 1>a threat but a liberation, an opportunity to get away

0:00:39.560 --> 0:00:42.320
<v Speaker 1>from the daily grind and focus on what's really special

0:00:42.360 --> 0:00:46.120
<v Speaker 1>about being human. It's a nice idea. I think it's

0:00:46.120 --> 0:00:49.280
<v Speaker 1>particularly appealing to Silicon Valley types who are programmed to

0:00:49.320 --> 0:00:51.960
<v Speaker 1>always believe that technology will make the world a better place.

0:00:52.440 --> 0:00:55.000
<v Speaker 1>Later on, I'm going to talk to Bloomberg columnist Noah

0:00:55.040 --> 0:00:58.600
<v Speaker 1>Smith about whether that's a realistic vision for the whole economy.

0:00:59.120 --> 0:01:02.440
<v Speaker 1>But first, Bloomberg Economics reporter Craig Torres went

0:01:02.480 --> 0:01:05.520
<v Speaker 1>to a corner of Virginia where robots and humans do

0:01:05.680 --> 0:01:08.039
<v Speaker 1>seem to be each playing to their strengths with not

0:01:08.120 --> 0:01:18.080
<v Speaker 1>a dog in sight. Every day we read stories about how

0:01:18.120 --> 0:01:23.040
<v Speaker 1>automation and artificial intelligence will make humans less relevant. Many

0:01:23.080 --> 0:01:26.240
<v Speaker 1>of these stories are alarmist, suggesting robots will be doing

0:01:26.319 --> 0:01:29.800
<v Speaker 1>what shop floor machinists or even doctors do every day.

0:01:30.880 --> 0:01:34.039
<v Speaker 1>The real story is more subtle. I visited a state

0:01:34.080 --> 0:01:37.800
<v Speaker 1>of the art Rolls Royce facility outside of Richmond, Virginia

0:01:38.520 --> 0:01:43.160
<v Speaker 1>to find out how automation is changing human work. I

0:01:43.200 --> 0:01:47.000
<v Speaker 1>asked the plant manager, Laurence O'Dell, to take us on

0:01:47.040 --> 0:01:50.080
<v Speaker 1>a tour and explain how humans and machines shake hands

0:01:50.120 --> 0:01:54.800
<v Speaker 1>here to produce something very valuable and very precise. This

0:01:54.880 --> 0:02:00.720
<v Speaker 1>plant makes jet engine parts. I have pulled into the

0:02:00.760 --> 0:02:06.400
<v Speaker 1>parking lot of Rolls Royce. The plant stands kind of

0:02:06.440 --> 0:02:10.280
<v Speaker 1>in a forest, and as you drive down the road

0:02:10.919 --> 0:02:18.320
<v Speaker 1>you see this modern looking facility with the very familiar insignia.

0:02:19.760 --> 0:02:22.520
<v Speaker 1>Now I have to watch a safety video as I

0:02:22.560 --> 0:02:28.240
<v Speaker 1>go into the plant. Once I'm inside, I ask

0:02:28.320 --> 0:02:31.120
<v Speaker 1>Lauren about what work was like when he was a

0:02:31.160 --> 0:02:34.640
<v Speaker 1>young engineer at an automotive engine plant.

0:02:36.040 --> 0:02:40.720
<v Speaker 1>How close were men and women to metal? Very close,

0:02:42.400 --> 0:02:45.079
<v Speaker 1>very, very close in many ways. There were a lot

0:02:45.120 --> 0:02:48.760
<v Speaker 1>of hand assembly operations if you were assembling an engine.

0:02:48.840 --> 0:02:51.440
<v Speaker 1>It wasn't what we would recognize in any way today

0:02:51.800 --> 0:02:59.400
<v Speaker 1>as smart automation. It was, electrically speaking, following some

0:02:59.760 --> 0:03:03.040
<v Speaker 1>painted dance steps on the floor: one two three four,

0:03:03.160 --> 0:03:06.040
<v Speaker 1>one two three four. When I look at the parts

0:03:06.160 --> 0:03:10.040
<v Speaker 1>Rolls Royce produces here, they are finely cut like pieces

0:03:10.080 --> 0:03:14.280
<v Speaker 1>of ornate industrial jewelry. One part, called a fan disc,

0:03:14.520 --> 0:03:17.840
<v Speaker 1>starts out as a big cylinder of titanium, about as

0:03:17.919 --> 0:03:20.120
<v Speaker 1>round as a tire you might see on a tractor

0:03:20.200 --> 0:03:23.600
<v Speaker 1>trailer on the highway. This disc is integrated with the

0:03:23.600 --> 0:03:25.800
<v Speaker 1>fan blades you see on the front of a jet

0:03:25.800 --> 0:03:29.320
<v Speaker 1>engine as you board a plane. The raw disc costs

0:03:29.320 --> 0:03:32.480
<v Speaker 1>as much as fifty thousand dollars, so it is kept

0:03:32.480 --> 0:03:35.040
<v Speaker 1>in a big suitcase until it is loaded into the

0:03:35.120 --> 0:03:39.480
<v Speaker 1>machining process, to protect it from scratching and damage. After

0:03:39.560 --> 0:03:41.960
<v Speaker 1>it's machined, it's about the size of an all weather

0:03:42.080 --> 0:03:45.960
<v Speaker 1>tire on my pickup truck, and it's just as articulated

0:03:46.280 --> 0:03:49.960
<v Speaker 1>and shiny and deeply grooved. They're ultra precise parts.

0:03:50.480 --> 0:03:53.600
<v Speaker 1>They're made out of materials that are right at the

0:03:53.800 --> 0:03:56.120
<v Speaker 1>cutting edge of

0:03:56.120 --> 0:04:00.360
<v Speaker 1>what engineers can create, so these are parts that

0:04:00.400 --> 0:04:03.640
<v Speaker 1>are spinning at between six thousand and twelve thousand revolutions

0:04:03.640 --> 0:04:08.880
<v Speaker 1>per minute, and they're extremely highly stressed physically

0:04:09.160 --> 0:04:12.000
<v Speaker 1>as well. You don't just walk on the shop floor here,

0:04:12.440 --> 0:04:15.920
<v Speaker 1>you have to dress for it. Works pretty good. So

0:04:15.960 --> 0:04:21.200
<v Speaker 1>we're all dressed in steel-toe slippers. I have my

0:04:21.320 --> 0:04:25.920
<v Speaker 1>belt covered with a velcro pad, my wedding ring is off,

0:04:26.400 --> 0:04:29.279
<v Speaker 1>and I have something over my eyes, so I'm pretty

0:04:30.360 --> 0:04:33.360
<v Speaker 1>scratch proof, I'd like to think. So if you were

0:04:33.400 --> 0:04:38.200
<v Speaker 1>wearing earrings, would we let you go with those?

0:04:38.200 --> 0:04:47.919
<v Speaker 1>I haven't been out here since about

0:04:47.920 --> 0:04:49.920
<v Speaker 1>eight o'clock this morning, so I have no idea what's

0:04:49.920 --> 0:04:54.160
<v Speaker 1>going on. The Rolls Royce plant is cavernous, albeit not gigantic.

0:04:54.600 --> 0:04:57.280
<v Speaker 1>There is lots of natural light. I feel like I'm

0:04:57.279 --> 0:05:00.360
<v Speaker 1>in a workshop, but already you can tell there's something

0:05:00.400 --> 0:05:03.640
<v Speaker 1>different about what we mean by work here. There aren't

0:05:03.760 --> 0:05:06.840
<v Speaker 1>rows of men and women standing at stations doing an

0:05:06.839 --> 0:05:10.560
<v Speaker 1>activity as in an auto plant. Instead, there are a

0:05:10.600 --> 0:05:13.279
<v Speaker 1>series of large pods about half the size of a

0:05:13.320 --> 0:05:17.720
<v Speaker 1>subway car with a window. Parts are placed within them,

0:05:17.800 --> 0:05:21.560
<v Speaker 1>and this is where the magic happens. Inside, a robotic

0:05:21.760 --> 0:05:25.320
<v Speaker 1>arm picks out its own cutting tool. It measures itself

0:05:25.640 --> 0:05:30.000
<v Speaker 1>and stores this data along the way. Lauren explains this

0:05:30.040 --> 0:05:33.599
<v Speaker 1>whole process is about turning a mathematical description of a

0:05:33.640 --> 0:05:39.960
<v Speaker 1>physical thing, a jet engine part, into reality. The carbide

0:05:40.040 --> 0:05:44.359
<v Speaker 1>drill bit trims and cuts amid a splashing, milk-colored lubricant.

0:05:45.279 --> 0:05:49.520
<v Speaker 1>The plant is relatively quiet and very clean. Where are

0:05:49.520 --> 0:05:53.680
<v Speaker 1>the humans? They're not running this. They're making sure this

0:05:53.720 --> 0:05:59.159
<v Speaker 1>automated process is repeating and repeating and repeating again without

0:05:59.320 --> 0:06:07.800
<v Speaker 1>much variance. There are essentially no manual operations here anymore, though.

0:06:08.400 --> 0:06:12.760
<v Speaker 1>We're using in our machines what you would recognize

0:06:12.760 --> 0:06:17.960
<v Speaker 1>as big data, okay, to understand not only the parts

0:06:18.000 --> 0:06:21.000
<v Speaker 1>and how the parts are being manufactured, but also the

0:06:21.040 --> 0:06:24.800
<v Speaker 1>basic condition of the machine itself. It seems to be accelerating

0:06:24.800 --> 0:06:29.440
<v Speaker 1>a little bit every week, where every week we're either

0:06:29.520 --> 0:06:34.359
<v Speaker 1>implementing, or we're taking a step toward implementing, a

0:06:34.440 --> 0:06:38.320
<v Speaker 1>new kind of predictive technology. I have never seen anything

0:06:38.400 --> 0:06:41.640
<v Speaker 1>like the shop floor at Rolls Royce. This plant is

0:06:41.720 --> 0:06:45.520
<v Speaker 1>less than eight years old. Something entirely new and different

0:06:45.680 --> 0:06:49.039
<v Speaker 1>is happening here. Humans are not on the clock-like

0:06:49.200 --> 0:06:54.240
<v Speaker 1>pace of a typical manufacturing plant. The robots operating behind

0:06:54.279 --> 0:06:58.359
<v Speaker 1>the windows of the big boxes are constantly self-optimizing,

0:06:58.839 --> 0:07:02.880
<v Speaker 1>that is, re-measuring themselves. I asked Lauren to

0:07:02.960 --> 0:07:07.039
<v Speaker 1>explain this to me. What he describes is straight out

0:07:07.080 --> 0:07:09.960
<v Speaker 1>of what twenty years ago would have been a comic

0:07:10.000 --> 0:07:14.880
<v Speaker 1>book kind of futurism. Humans are maintaining the process but

0:07:14.960 --> 0:07:18.240
<v Speaker 1>also standing back from it in a way that seems unusual.

0:07:19.000 --> 0:07:21.440
<v Speaker 1>This is how Lauren explains it. What it frees the

0:07:21.520 --> 0:07:26.680
<v Speaker 1>humans up to do is to think about how can

0:07:26.720 --> 0:07:31.920
<v Speaker 1>we make something better? So people to a significant extent

0:07:32.360 --> 0:07:35.400
<v Speaker 1>aren't as tied to the equipment as they have been

0:07:35.440 --> 0:07:38.200
<v Speaker 1>in the past, and they're really being freed up to

0:07:38.280 --> 0:07:42.520
<v Speaker 1>work on higher order activities. Is that more human work?

0:07:42.560 --> 0:07:46.160
<v Speaker 1>Would you say, or not? Right now, I think it's

0:07:46.160 --> 0:07:50.600
<v Speaker 1>more human work. Okay, I'm starting to get it. In

0:07:50.640 --> 0:07:54.600
<v Speaker 1>an ultra automated plant like this, humans are a step

0:07:54.640 --> 0:07:58.240
<v Speaker 1>away from the whole process, less close to the metal,

0:07:58.880 --> 0:08:02.200
<v Speaker 1>but still very familiar with exactly what is going on.

0:08:03.080 --> 0:08:07.240
<v Speaker 1>So familiar, Lauren says, that even the flecks of titanium

0:08:07.280 --> 0:08:10.680
<v Speaker 1>spinning off the cutting tool can tell them something is off.

0:08:11.840 --> 0:08:15.280
<v Speaker 1>Lauren says he needs both skills: people who know the

0:08:15.320 --> 0:08:19.360
<v Speaker 1>guts of the process, but can think beyond it, troubleshoot it,

0:08:19.680 --> 0:08:23.240
<v Speaker 1>and make it better. How do you train for something

0:08:23.320 --> 0:08:28.040
<v Speaker 1>like this? It sounds challenging. To answer that question, I

0:08:28.160 --> 0:08:31.320
<v Speaker 1>drove across the bottom of the state to Danville, Virginia,

0:08:32.040 --> 0:08:35.000
<v Speaker 1>where Troy Simpson trains young men and women at a

0:08:35.040 --> 0:08:39.240
<v Speaker 1>community college with a special third year program in advanced manufacturing.

0:08:41.160 --> 0:08:45.600
<v Speaker 1>The location of Danville is highly significant. This city once

0:08:45.640 --> 0:08:49.880
<v Speaker 1>hosted one of the South's largest textile companies, Dan River Mills.

0:08:50.880 --> 0:08:53.360
<v Speaker 1>It's a bit like visiting an Athens of the South.

0:08:54.960 --> 0:08:58.079
<v Speaker 1>There are old brick temples from the tobacco and textile

0:08:58.160 --> 0:09:02.320
<v Speaker 1>industry everywhere. Some are falling into ruin, while others

0:09:02.320 --> 0:09:06.280
<v Speaker 1>have been repurposed and restored. This area was hit hard

0:09:06.320 --> 0:09:09.680
<v Speaker 1>by NAFTA and the China trade. The state and county

0:09:09.720 --> 0:09:13.320
<v Speaker 1>are focused on restoring its prosperity by creating a new

0:09:13.400 --> 0:09:18.520
<v Speaker 1>kind of workforce that modern manufacturers will find attractive. They've

0:09:18.520 --> 0:09:22.920
<v Speaker 1>had some success. Here's Troy explaining how they are trying

0:09:22.960 --> 0:09:26.040
<v Speaker 1>to create this new workforce. As factories are looking towards

0:09:26.040 --> 0:09:29.520
<v Speaker 1>a digital factory and pushing so much data, making intelligent

0:09:29.679 --> 0:09:34.439
<v Speaker 1>decisions through AI, we still got to have somebody

0:09:34.440 --> 0:09:38.480
<v Speaker 1>to set those factories up and troubleshoot those same

0:09:38.559 --> 0:09:41.719
<v Speaker 1>problems that we have whether the factory is digital or not. Right?

0:09:42.960 --> 0:09:47.120
<v Speaker 1>But those technicians now need to understand what is

0:09:48.240 --> 0:09:53.040
<v Speaker 1>happening upstream and downstream of a process. So now we've

0:09:53.040 --> 0:09:57.240
<v Speaker 1>got a technician, and it's a whole plethora of things

0:09:57.280 --> 0:10:00.000
<v Speaker 1>that could be causing those problems in maintaining the process.

0:10:00.080 --> 0:10:03.000
<v Speaker 1>This is what we have built the facility to do:

0:10:03.160 --> 0:10:06.080
<v Speaker 1>to train that technician in that very environment. So let's

0:10:06.120 --> 0:10:08.200
<v Speaker 1>take a step back and figure out what we heard

0:10:08.240 --> 0:10:12.640
<v Speaker 1>and saw. Rolls Royce has a perfect mathematical description of

0:10:12.640 --> 0:10:16.640
<v Speaker 1>an engine part. Humans and machines join forces to try

0:10:16.640 --> 0:10:20.480
<v Speaker 1>and hit that perfection. Every day humans are looking at

0:10:20.559 --> 0:10:23.960
<v Speaker 1>data as much as the things they work with, the

0:10:24.080 --> 0:10:29.640
<v Speaker 1>machines and the metal. Work is becoming more conceptual, less

0:10:29.640 --> 0:10:34.360
<v Speaker 1>heavy and dirty. Plants are less hierarchical, and there's more

0:10:34.440 --> 0:10:39.120
<v Speaker 1>gender equality as work migrates in this direction. I like

0:10:39.200 --> 0:10:42.840
<v Speaker 1>the way Lauren puts it. If we blindly believe data

0:10:43.640 --> 0:10:47.040
<v Speaker 1>and we don't challenge it to say, does that make sense?

0:10:47.440 --> 0:10:51.240
<v Speaker 1>Does what the machine told me make sense? Can I

0:10:51.320 --> 0:10:54.599
<v Speaker 1>challenge that? Can I verify it? Or

0:10:54.640 --> 0:10:58.200
<v Speaker 1>am I just going to blindly believe it and make

0:10:58.240 --> 0:11:02.040
<v Speaker 1>an offset without thinking? Whatever it is, whether it's a

0:11:02.040 --> 0:11:05.440
<v Speaker 1>physical machine or a computer model or whatever, am I just

0:11:05.520 --> 0:11:08.480
<v Speaker 1>going to make that change on the basis

0:11:08.640 --> 0:11:13.640
<v Speaker 1>of what it's told me? If it doesn't make sense,

0:11:13.640 --> 0:11:15.880
<v Speaker 1>then why would you allow it to make the change?

0:11:16.320 --> 0:11:18.840
<v Speaker 1>Or maybe we need to change our paradigm. Maybe we

0:11:18.840 --> 0:11:21.680
<v Speaker 1>need to update our paradigm of how that little piece

0:11:21.720 --> 0:11:24.960
<v Speaker 1>of the world works. And I think that for now,

0:11:25.080 --> 0:11:29.160
<v Speaker 1>and I think that until artificial intelligence really becomes prevalent,

0:11:29.280 --> 0:11:32.280
<v Speaker 1>I think that's where humans are always going to be

0:11:32.559 --> 0:11:34.920
<v Speaker 1>taking a step up, not a step away. I don't

0:11:34.920 --> 0:11:36.360
<v Speaker 1>think of it as a step away. I

0:11:36.360 --> 0:11:40.160
<v Speaker 1>think of it as a step up in the human's role. And

0:11:41.440 --> 0:11:46.320
<v Speaker 1>there is no organism or machine better equipped to do

0:11:46.400 --> 0:11:50.280
<v Speaker 1>this on the planet than a human being: to recognize,

0:11:50.760 --> 0:11:55.600
<v Speaker 1>in the purest sense, a non-standard situation. Yes, there

0:11:55.679 --> 0:11:59.040
<v Speaker 1>is a quiet revolution happening in all trades, from medicine

0:11:59.200 --> 0:12:03.040
<v Speaker 1>to making turbofan discs, as artificial intelligence and

0:12:03.080 --> 0:12:06.959
<v Speaker 1>automation creep into more of what we do. What does

0:12:07.000 --> 0:12:11.000
<v Speaker 1>it mean for humans? The stories I'm hearing are enormously

0:12:11.120 --> 0:12:15.080
<v Speaker 1>positive. Yes, there will be fewer people working in a

0:12:15.120 --> 0:12:18.920
<v Speaker 1>particular plant, but that doesn't mean there are going to

0:12:18.960 --> 0:12:23.320
<v Speaker 1>be fewer people working in manufacturing. That depends on how

0:12:23.400 --> 0:12:26.040
<v Speaker 1>quickly we can train people to work in this space

0:12:26.040 --> 0:12:29.920
<v Speaker 1>where they understand a job as not producing one thing,

0:12:30.800 --> 0:12:33.560
<v Speaker 1>but as part of a larger creative process, full of

0:12:33.640 --> 0:12:37.600
<v Speaker 1>data and feedback that aims at a more repeatable, more

0:12:37.720 --> 0:12:50.640
<v Speaker 1>perfect thing. I'm Craig Torres, an economics writer for Bloomberg News. Well,

0:12:50.640 --> 0:12:54.040
<v Speaker 1>I'm joined now by Noah Smith, Bloomberg Opinion columnist on

0:12:54.080 --> 0:12:57.640
<v Speaker 1>all things interesting in economics. He's in our San Francisco studio.

0:12:58.000 --> 0:13:00.000
<v Speaker 1>I mean, Noah, I guess the big question

0:13:00.320 --> 0:13:03.360
<v Speaker 1>first coming out of that report from Craig Torres,

0:13:03.400 --> 0:13:05.839
<v Speaker 1>I mean, do you buy the idea that the robots

0:13:05.840 --> 0:13:08.880
<v Speaker 1>and artificial intelligence could be the best thing that ever

0:13:08.920 --> 0:13:11.640
<v Speaker 1>happened to humanity? I think the best thing that ever

0:13:11.640 --> 0:13:15.240
<v Speaker 1>happened to humanity is a little exaggerated. But what's

0:13:15.240 --> 0:13:20.800
<v Speaker 1>interesting is that most jobs can probably not be completely automated. Usually,

0:13:20.960 --> 0:13:24.080
<v Speaker 1>what happens is that some tasks in the middle of

0:13:24.080 --> 0:13:28.000
<v Speaker 1>the job get automated. In other words, a robot won't

0:13:28.120 --> 0:13:30.760
<v Speaker 1>take your job; it'll take half your job. And of

0:13:30.760 --> 0:13:34.160
<v Speaker 1>course that really reminds me of what happened to manual

0:13:34.440 --> 0:13:38.960
<v Speaker 1>labor and factory work during the Industrial Revolution. Before the

0:13:39.000 --> 0:13:42.079
<v Speaker 1>Industrial Revolution, you had people putting together every bit of

0:13:42.080 --> 0:13:45.320
<v Speaker 1>every product themselves. Now you have these machines that can

0:13:45.360 --> 0:13:46.880
<v Speaker 1>do some of the things, but they don't do all

0:13:46.920 --> 0:13:49.680
<v Speaker 1>of the things. So you have assembly line workers doing

0:13:49.720 --> 0:13:52.880
<v Speaker 1>some of the things, and you have basically manufacturing

0:13:52.880 --> 0:13:56.560
<v Speaker 1>workers doing some of the tasks. And that made the

0:13:56.559 --> 0:13:59.480
<v Speaker 1>manufactured goods a lot cheaper. Demand for the manufactured goods

0:13:59.480 --> 0:14:04.280
<v Speaker 1>really spiked, and therefore, even though half of craftsmen's jobs

0:14:04.280 --> 0:14:07.760
<v Speaker 1>were being taken by these machines, you really had a

0:14:07.840 --> 0:14:10.600
<v Speaker 1>lot more employment and a lot higher wages in the

0:14:10.640 --> 0:14:13.760
<v Speaker 1>manufacturing sector during the Industrial Revolution. And I think this

0:14:13.800 --> 0:14:16.080
<v Speaker 1>could be the same thing for service jobs. You know,

0:14:16.120 --> 0:14:19.120
<v Speaker 1>service sectors really have had low productivity growth. We've seen

0:14:19.400 --> 0:14:22.960
<v Speaker 1>huge productivity growth in manufacturing, very slow productivity in services.

0:14:23.360 --> 0:14:25.880
<v Speaker 1>It's possible that this could change that and could make

0:14:25.920 --> 0:14:28.920
<v Speaker 1>services behave more like manufacturing used to. I mean, it's

0:14:28.960 --> 0:14:31.120
<v Speaker 1>interesting you talk about productivity. I do always find it

0:14:31.200 --> 0:14:35.360
<v Speaker 1>quite funny when you're in these conversations where people are talking.

0:14:35.400 --> 0:14:37.200
<v Speaker 1>You know, what should we most worry about, you know,

0:14:37.280 --> 0:14:40.080
<v Speaker 1>the terrible things happening in the global economy.

0:14:40.080 --> 0:14:41.280
<v Speaker 1>And of course, you know, one of the big things

0:14:41.360 --> 0:14:44.600
<v Speaker 1>we talk about is the very slow rate of productivity growth,

0:14:44.600 --> 0:14:46.000
<v Speaker 1>the fact that we're not getting more out of the

0:14:46.040 --> 0:14:47.800
<v Speaker 1>same number of people, or not at the same rate

0:14:47.840 --> 0:14:50.480
<v Speaker 1>we have in the past. And that's obviously hugely important

0:14:50.520 --> 0:14:55.600
<v Speaker 1>because it drives wage growth and growth in the standard of living.

0:14:55.840 --> 0:14:58.640
<v Speaker 1>But the people who are really gloomy about productivity growth

0:14:58.800 --> 0:15:01.200
<v Speaker 1>are often also the ones who are really gloomy about

0:15:01.240 --> 0:15:04.000
<v Speaker 1>everyone losing all their jobs because of technology, and you

0:15:04.080 --> 0:15:06.359
<v Speaker 1>sort of can't really have both, you know. If technology

0:15:06.400 --> 0:15:08.680
<v Speaker 1>does get rid of all those jobs, it is inevitably

0:15:08.800 --> 0:15:11.080
<v Speaker 1>going to raise productivity. I guess the question is whether

0:15:11.080 --> 0:15:15.160
<v Speaker 1>the benefits of that get distributed equally. But I

0:15:15.200 --> 0:15:18.360
<v Speaker 1>guess when people think about this what they tend to

0:15:18.400 --> 0:15:20.320
<v Speaker 1>come back and say, well, we know that's true,

0:15:20.560 --> 0:15:23.520
<v Speaker 1>we know that it can't be completely automated. But if

0:15:23.560 --> 0:15:26.480
<v Speaker 1>it can be automated, I mean, you know, if you

0:15:26.600 --> 0:15:30.160
<v Speaker 1>had all of the picking for you know, in a warehouse,

0:15:30.400 --> 0:15:34.160
<v Speaker 1>for grocery shopping or whatever else, if it all gets

0:15:34.200 --> 0:15:39.600
<v Speaker 1>done by robots, but it's not accurate, and you need

0:15:39.640 --> 0:15:41.720
<v Speaker 1>to have one or two people wandering around, you know,

0:15:42.160 --> 0:15:44.360
<v Speaker 1>putting the robots back up when they fall, when they

0:15:44.360 --> 0:15:47.120
<v Speaker 1>bump into something, or you know, correcting it when they've

0:15:47.160 --> 0:15:49.440
<v Speaker 1>when they've gone wrong. You know, that is still a

0:15:49.520 --> 0:15:52.359
<v Speaker 1>lot of jobs that have been lost in that process.

0:15:52.520 --> 0:15:54.720
<v Speaker 1>And this just feels like it's going to be on

0:15:54.760 --> 0:15:57.000
<v Speaker 1>a greater scale than in the past. You buy that?

0:15:57.960 --> 0:16:00.240
<v Speaker 1>It could happen. The entire question is whether or not

0:16:00.320 --> 0:16:04.320
<v Speaker 1>demand expands. So what was interesting about manufacturing is that

0:16:04.400 --> 0:16:09.080
<v Speaker 1>you suddenly had people having so much more stuff,

0:16:09.160 --> 0:16:12.480
<v Speaker 1>physical stuff, manufactured things. People now had cars, and they

0:16:12.480 --> 0:16:14.800
<v Speaker 1>had TVs, and they had much larger houses, and they

0:16:14.800 --> 0:16:17.320
<v Speaker 1>had many more bowls and many more dresses and blah

0:16:17.320 --> 0:16:21.520
<v Speaker 1>blah blah blah blah. And because this demand expanded, even

0:16:21.600 --> 0:16:25.160
<v Speaker 1>though each dress, for example, that was made didn't

0:16:25.160 --> 0:16:29.320
<v Speaker 1>require as much human labor as before, overall people were

0:16:29.360 --> 0:16:33.760
<v Speaker 1>getting paid more for doing human labor involved in dressmaking

0:16:33.760 --> 0:16:36.720
<v Speaker 1>than they were before, much, much more so.

0:16:36.800 --> 0:16:40.680
<v Speaker 1>Because demand expanded in this way, we had an increase

0:16:40.760 --> 0:16:43.960
<v Speaker 1>in employment, an increase in wages, and an increase in productivity, all at

0:16:43.960 --> 0:16:46.840
<v Speaker 1>the same time. The question is, could a similar thing

0:16:46.880 --> 0:16:50.840
<v Speaker 1>happen with services? The dissatisfying answer is that a

0:16:50.840 --> 0:16:53.720
<v Speaker 1>lot of different things could happen. But I believe people

0:16:53.760 --> 0:16:57.280
<v Speaker 1>aren't paying enough attention to the good possibility. People are

0:16:57.600 --> 0:17:00.520
<v Speaker 1>fixating on this extremely simple story of like you get

0:17:00.520 --> 0:17:02.520
<v Speaker 1>replaced by a robot and now you have nothing to do,

0:17:02.960 --> 0:17:05.600
<v Speaker 1>and people are completely ignoring the upside of the story,

0:17:06.240 --> 0:17:09.119
<v Speaker 1>you know, focusing less on trying to bring about the

0:17:09.160 --> 0:17:11.359
<v Speaker 1>good future because they're not even trying to visualize it.

0:17:16.400 --> 0:17:20.120
<v Speaker 1>Your argument is it goes back to what people point out,

0:17:20.480 --> 0:17:23.560
<v Speaker 1>even with the adjustments that you've had so far, you know,

0:17:23.640 --> 0:17:25.520
<v Speaker 1>since the introduction of the ATM. I know,

0:17:25.640 --> 0:17:28.440
<v Speaker 1>lots of bankers in the US like to point out

0:17:28.440 --> 0:17:31.480
<v Speaker 1>that the number of actual cashiers in the U.S.

0:17:31.520 --> 0:17:34.280
<v Speaker 1>economy has increased dramatically since the introduction of

0:17:34.440 --> 0:17:37.400
<v Speaker 1>ATMs. So you don't end up necessarily with the replacement.

0:17:37.720 --> 0:17:41.359
<v Speaker 1>But I think even your own examples highlight that there's

0:17:41.400 --> 0:17:46.000
<v Speaker 1>going to be adjustment costs and a transition where the

0:17:46.359 --> 0:17:49.719
<v Speaker 1>jobs that are created. You know, certainly

0:17:49.760 --> 0:17:52.159
<v Speaker 1>in the old version of the way it happened in

0:17:52.160 --> 0:17:54.720
<v Speaker 1>the past, you had a lot of jobs created that

0:17:54.800 --> 0:17:57.480
<v Speaker 1>were very different from the jobs that had been lost.

0:17:58.000 --> 0:18:00.840
<v Speaker 1>And if you're thinking about what happened to those individuals

0:18:01.000 --> 0:18:05.480
<v Speaker 1>particularly in sectors that were really affected by technology,

0:18:05.640 --> 0:18:09.080
<v Speaker 1>and in the sort of manufacturing technology, you know, the

0:18:09.119 --> 0:18:12.720
<v Speaker 1>story of those people and their ability to get into

0:18:12.760 --> 0:18:16.120
<v Speaker 1>the newer sectors is not really a very happy one.

0:18:16.280 --> 0:18:19.399
<v Speaker 1>I'm certainly thinking of large parts of the manufacturing sector

0:18:19.400 --> 0:18:22.400
<v Speaker 1>in the UK where people lost their jobs and the

0:18:22.440 --> 0:18:26.160
<v Speaker 1>industries moved on, and those people did not necessarily ever

0:18:26.200 --> 0:18:29.240
<v Speaker 1>work again, or they certainly spent a long time unemployed.

0:18:29.359 --> 0:18:33.280
<v Speaker 1>So you know that has to be a question mark. Sure.

0:18:33.640 --> 0:18:35.440
<v Speaker 1>No, I can't say that, you know, we're going to

0:18:35.520 --> 0:18:38.680
<v Speaker 1>have a happy future where absolutely everybody has higher wages

0:18:38.680 --> 0:18:41.359
<v Speaker 1>and better jobs, etcetera. We're not. We're going to

0:18:41.440 --> 0:18:43.720
<v Speaker 1>have a complicated future where some people lose out and

0:18:43.760 --> 0:18:48.720
<v Speaker 1>some people win. And the question is on balance what happens,

0:18:48.800 --> 0:18:50.679
<v Speaker 1>is it good or bad? What can we do to

0:18:50.760 --> 0:18:52.840
<v Speaker 1>make sure that there's more people out there who are

0:18:52.840 --> 0:18:55.600
<v Speaker 1>gaining than losing, a lot more? What can we do

0:18:55.680 --> 0:18:58.520
<v Speaker 1>to sort of steer this chaotic universe generally in the

0:18:58.600 --> 0:19:01.520
<v Speaker 1>right direction, with our limited powers of government and our

0:19:01.680 --> 0:19:05.800
<v Speaker 1>limited understanding of what's going on and, you know,

0:19:06.119 --> 0:19:08.800
<v Speaker 1>the results of our actions and the current situation. We

0:19:08.840 --> 0:19:10.840
<v Speaker 1>have all this limited understanding. We're operating in the dark,

0:19:10.840 --> 0:19:12.639
<v Speaker 1>and we have limited control, and we don't really know

0:19:12.680 --> 0:19:15.520
<v Speaker 1>what our control does so much. How do we steer

0:19:15.560 --> 0:19:18.359
<v Speaker 1>things in a better direction? And that is the question.

0:19:18.400 --> 0:19:20.239
<v Speaker 1>I mean, that's interesting. It seems to me we

0:19:20.280 --> 0:19:23.280
<v Speaker 1>don't need to just wait. It's not satisfying to say we're

0:19:23.320 --> 0:19:26.000
<v Speaker 1>just going to wait and see where the cards fall,

0:19:26.200 --> 0:19:28.560
<v Speaker 1>and wait and see if a meteor lands on your

0:19:28.560 --> 0:19:29.840
<v Speaker 1>head or not. I mean, you kind of want to

0:19:29.840 --> 0:19:33.399
<v Speaker 1>know whether your government actually is willing to give you

0:19:33.400 --> 0:19:36.760
<v Speaker 1>an umbrella or something stronger to prevent the meteor from hitting

0:19:36.800 --> 0:19:39.119
<v Speaker 1>you and knocking you out. What I sometimes think, when I

0:19:39.119 --> 0:19:42.080
<v Speaker 1>look at the US, is that it doesn't seem like there's

0:19:42.080 --> 0:19:46.880
<v Speaker 1>much appetite or capacity for government or for other

0:19:46.920 --> 0:19:51.199
<v Speaker 1>institutions to help people manage the human consequences of this

0:19:51.280 --> 0:19:53.720
<v Speaker 1>kind of dislocation. I mean, I was looking

0:19:53.760 --> 0:19:56.280
<v Speaker 1>at some of the numbers: public spending

0:19:56.320 --> 0:19:59.480
<v Speaker 1>on sort of training and labor market stuff is one

0:19:59.480 --> 0:20:01.080
<v Speaker 1>of the lowest in the OECD. It's

0:20:01.119 --> 0:20:04.919
<v Speaker 1>gone down as a share of GDP since the

0:20:04.960 --> 0:20:08.159
<v Speaker 1>eighties in the US trade unions, who might have been

0:20:08.200 --> 0:20:11.960
<v Speaker 1>the kind of natural institution that could have sort of

0:20:12.000 --> 0:20:15.760
<v Speaker 1>stood between kind of workers and companies and helped people

0:20:15.880 --> 0:20:18.800
<v Speaker 1>upskill. You know, they're clearly not on the

0:20:18.800 --> 0:20:23.439
<v Speaker 1>horizon at all in the US now. So

0:20:23.520 --> 0:20:26.960
<v Speaker 1>when you think about what tools government has, even

0:20:27.000 --> 0:20:29.720
<v Speaker 1>with all this uncertainty about the impact, you know, are

0:20:29.720 --> 0:20:31.720
<v Speaker 1>there things, if you had a different

0:20:31.800 --> 0:20:35.080
<v Speaker 1>kind of political appetite, that you

0:20:35.119 --> 0:20:39.800
<v Speaker 1>would say would make a real difference? Absolutely. Now, we

0:20:39.840 --> 0:20:42.359
<v Speaker 1>don't really know what will work. By the way, we

0:20:42.480 --> 0:20:45.960
<v Speaker 1>tried a lot of government retraining programs in previous decades

0:20:46.520 --> 0:20:49.520
<v Speaker 1>and all the evaluations say that they were completely ineffectual.

0:20:49.840 --> 0:20:52.280
<v Speaker 1>They didn't help anybody get jobs in the long term. They

0:20:52.280 --> 0:20:54.080
<v Speaker 1>were a giant waste, and that's why we cut back.

0:20:54.520 --> 0:20:58.080
<v Speaker 1>The government was terrible at retraining people. Companies are really

0:20:58.119 --> 0:21:00.440
<v Speaker 1>good at training people because they know what to train

0:21:00.520 --> 0:21:03.439
<v Speaker 1>people for, they know what they need. The thing is,

0:21:03.760 --> 0:21:06.960
<v Speaker 1>you know, American companies have just been in this orgy

0:21:06.960 --> 0:21:10.920
<v Speaker 1>of offshoring and outsourcing, especially outsourcing, so offshoring isn't as

0:21:10.920 --> 0:21:13.120
<v Speaker 1>big of an issue as people think; it's more

0:21:13.200 --> 0:21:17.080
<v Speaker 1>outsourcing to contractors, you know, within the United States. Because

0:21:17.080 --> 0:21:19.240
<v Speaker 1>of that, companies don't spend a lot of money and

0:21:19.280 --> 0:21:22.320
<v Speaker 1>time training people anymore, and you get this equilibrium where

0:21:22.320 --> 0:21:25.359
<v Speaker 1>it's very easy for people to jump to other companies,

0:21:26.200 --> 0:21:28.920
<v Speaker 1>unlike in a lot of other countries. But then because

0:21:28.920 --> 0:21:32.680
<v Speaker 1>it's so easy, companies don't want to spend the time

0:21:32.680 --> 0:21:35.879
<v Speaker 1>and effort training workers. And so we need to shift

0:21:36.080 --> 0:21:40.640
<v Speaker 1>back to this equilibrium where companies give people more job

0:21:40.680 --> 0:21:44.600
<v Speaker 1>security and more training and more investment in the workers,

0:21:45.320 --> 0:21:48.520
<v Speaker 1>and we need to have a more harmonious labor

0:21:48.560 --> 0:21:51.600
<v Speaker 1>management relationship. And I think the most interesting proposal for

0:21:51.600 --> 0:21:54.480
<v Speaker 1>how to do that has come from Elizabeth Warren, who

0:21:54.520 --> 0:21:57.399
<v Speaker 1>proposed codetermination. I don't know if you're familiar with this,

0:21:57.480 --> 0:22:00.280
<v Speaker 1>but, yeah, I mean, Germany does it.

0:22:00.280 --> 0:22:03.280
<v Speaker 1>It's corporatist. It's not the sort of socialist model

0:22:03.400 --> 0:22:06.160
<v Speaker 1>where the benevolent, all-knowing government will do everything for you,

0:22:06.560 --> 0:22:10.440
<v Speaker 1>but it definitely makes companies more democratic and

0:22:10.560 --> 0:22:13.560
<v Speaker 1>gives workers more of a say in company policy,

0:22:13.560 --> 0:22:15.640
<v Speaker 1>and of course workers are going to want more investment

0:22:15.680 --> 0:22:19.359
<v Speaker 1>in workers. Her proposal didn't have the sort of

0:22:19.359 --> 0:22:21.560
<v Speaker 1>workers' councils that Germany has, but I think that would

0:22:21.560 --> 0:22:23.800
<v Speaker 1>be a good addition as well. So I think that

0:22:23.920 --> 0:22:28.080
<v Speaker 1>policy has the largest chance of anything that we've thought

0:22:28.160 --> 0:22:31.560
<v Speaker 1>of or proposed of shifting back to this good equilibrium

0:22:31.600 --> 0:22:34.960
<v Speaker 1>where companies invest a ton in training workers. But that

0:22:35.080 --> 0:22:37.920
<v Speaker 1>policy is a million miles away. I mean, although it's

0:22:37.920 --> 0:22:41.920
<v Speaker 1>been proposed by a US politician, it is a long

0:22:41.960 --> 0:22:46.639
<v Speaker 1>way from the culture that we've seen of American

0:22:46.720 --> 0:22:50.680
<v Speaker 1>companies and American capitalism really for many, many years.

0:22:50.680 --> 0:22:53.439
<v Speaker 1>I mean, the few bits of worker

0:22:53.480 --> 0:22:55.600
<v Speaker 1>codetermination that I think you did have, or we

0:22:55.680 --> 0:22:57.359
<v Speaker 1>started to have in the US, I think in the

0:22:57.400 --> 0:23:04.160
<v Speaker 1>thirties, got overwhelmed by efforts against it and never

0:23:04.200 --> 0:23:06.679
<v Speaker 1>really got established in the US. I mean, it

0:23:06.800 --> 0:23:08.520
<v Speaker 1>is certainly true. There's a lot of this kind of

0:23:08.520 --> 0:23:11.440
<v Speaker 1>thing in Europe. Jeremy Corbyn and the Labour Party have

0:23:11.520 --> 0:23:14.200
<v Speaker 1>talked about it. I was at something where

0:23:14.240 --> 0:23:16.159
<v Speaker 1>there were a lot of people from different parts of

0:23:16.200 --> 0:23:20.639
<v Speaker 1>Europe talking about the challenge of technology and digitalization of

0:23:20.640 --> 0:23:23.560
<v Speaker 1>the workforce, and the Swedish trade unionist said, you know,

0:23:23.640 --> 0:23:26.200
<v Speaker 1>with our members, we don't think that the new

0:23:26.280 --> 0:23:28.640
<v Speaker 1>machine poses a threat to our members. It's the old

0:23:28.680 --> 0:23:31.080
<v Speaker 1>machine that poses a threat. And it's a completely

0:23:31.080 --> 0:23:34.960
<v Speaker 1>different psychology, which comes from the social partnership they have

0:23:35.280 --> 0:23:38.280
<v Speaker 1>and the kind of funds that they've come together on, where employees

0:23:38.320 --> 0:23:42.360
<v Speaker 1>and employers contribute to retraining programs that, as you say,

0:23:42.400 --> 0:23:45.720
<v Speaker 1>are closer to the work. But it feels, just culturally,

0:23:45.720 --> 0:23:48.000
<v Speaker 1>and it's not just the current administration, a million

0:23:48.000 --> 0:23:50.600
<v Speaker 1>miles away from the traditions of the US.

0:23:50.680 --> 0:23:54.159
<v Speaker 1>Is it really something that could easily start to be

0:23:54.240 --> 0:23:59.479
<v Speaker 1>effective in the US? So, I don't know,

0:23:59.760 --> 0:24:02.240
<v Speaker 1>but I do know that it's closer to our culture

0:24:02.240 --> 0:24:06.520
<v Speaker 1>and traditions than a government-managed thing, and it would

0:24:06.560 --> 0:24:09.000
<v Speaker 1>be more effective as well. We have a history of

0:24:09.000 --> 0:24:12.919
<v Speaker 1>doing corporatist things. We have a very corporatist health insurance system.

0:24:13.000 --> 0:24:15.879
<v Speaker 1>We say, you know, in America, if you're not old

0:24:16.119 --> 0:24:19.160
<v Speaker 1>or, you know, poor, your health insurance goes through your employer.

0:24:19.680 --> 0:24:22.440
<v Speaker 1>So we have a history of having things go through

0:24:22.480 --> 0:24:26.360
<v Speaker 1>people's employers. I think there's much more hope for that

0:24:26.440 --> 0:24:31.840
<v Speaker 1>kind of corporatist, you know, retraining and worker investment system

0:24:31.880 --> 0:24:36.080
<v Speaker 1>than there is for a government system, even before you

0:24:36.200 --> 0:24:38.040
<v Speaker 1>take into account the fact that the government system would

0:24:38.040 --> 0:24:42.120
<v Speaker 1>probably be ineffectual and fail. So the answer is,

0:24:42.119 --> 0:24:44.879
<v Speaker 1>is it a long shot? Maybe so, I don't really know.

0:24:45.200 --> 0:24:47.640
<v Speaker 1>Is it a longer shot than other things? Probably not?

0:24:48.320 --> 0:24:53.440
<v Speaker 1>And if a third

0:24:53.440 --> 0:24:55.960
<v Speaker 1>option is to somehow say, you know, because we can't

0:24:56.000 --> 0:24:58.520
<v Speaker 1>do any of these other things, we should instead have

0:24:58.560 --> 0:25:04.000
<v Speaker 1>the government try to disincentivize adoption of automation technology, that

0:25:04.040 --> 0:25:07.840
<v Speaker 1>would be dumb and a disaster. We should not.

0:25:07.920 --> 0:25:10.440
<v Speaker 1>That is what we should absolutely not ever do, and

0:25:10.880 --> 0:25:13.960
<v Speaker 1>we should firmly resist anyone who brings that idea up.

0:25:14.200 --> 0:25:17.360
<v Speaker 1>And the reason is because our companies are competing with

0:25:17.640 --> 0:25:20.159
<v Speaker 1>companies from Asia and Europe that are not going to

0:25:20.280 --> 0:25:22.680
<v Speaker 1>do that, that are going to do something a lot

0:25:22.680 --> 0:25:25.840
<v Speaker 1>smarter than that. If we simply limit this one

0:25:25.880 --> 0:25:27.679
<v Speaker 1>technology because we're sort of afraid of it, and we

0:25:27.720 --> 0:25:30.320
<v Speaker 1>spin these sci-fi scenarios that haven't come true at

0:25:30.359 --> 0:25:34.560
<v Speaker 1>all yet, then we'll just be, you know, turning

0:25:34.560 --> 0:25:38.119
<v Speaker 1>ourselves into a backwater while other countries take over. We

0:25:38.200 --> 0:25:40.959
<v Speaker 1>are going to have some interesting conversations about this with

0:25:41.000 --> 0:25:43.400
<v Speaker 1>all the movers and shakers at the New Economy Forum

0:25:43.400 --> 0:25:50.800
<v Speaker 1>in Singapore. Noah Smith, thanks very much. Thank you. Thanks

0:25:50.840 --> 0:25:54.080
<v Speaker 1>for listening to The New Economy. Today's episode was reported

0:25:54.119 --> 0:25:57.280
<v Speaker 1>by Craig Torres with editor Scott Lanman and produced by

0:25:57.320 --> 0:26:02.399
<v Speaker 1>Magnus Hendrison. Special thanks to Noah Smith. Francesca Levy is

0:26:02.440 --> 0:26:12.600
<v Speaker 1>the head of Bloomberg Podcasts.