WEBVTT - The Economist Who Believes AI Will Be Great for the Middle Class

0:00:02.480 --> 0:00:10.680
<v Speaker 1>Bloomberg Audio Studios, Podcasts, Radio News.

0:00:17.920 --> 0:00:21.840
<v Speaker 2>Hello and welcome to another episode of the Odd Lots Podcast.

0:00:21.920 --> 0:00:24.240
<v Speaker 1>I'm Joe Weisenthal and I'm Tracy Alloway.

0:00:24.520 --> 0:00:27.680
<v Speaker 2>Tracy, I feel like AI is a great thing for

0:00:28.000 --> 0:00:30.280
<v Speaker 2>anyone who wants to have an opinion on anything. It's

0:00:30.280 --> 0:00:34.080
<v Speaker 2>like this blank canvas out there in which any idea

0:00:34.159 --> 0:00:37.760
<v Speaker 2>you have fits. It's just a great moment for pontificators

0:00:37.800 --> 0:00:38.280
<v Speaker 2>in general.

0:00:38.400 --> 0:00:40.680
<v Speaker 1>Well, not only can you hang a bunch of different

0:00:40.680 --> 0:00:43.640
<v Speaker 1>opinions on it, but it can generate those opinions for you.

0:00:45.080 --> 0:00:46.120
<v Speaker 3>Yeah, you can, that's right.

0:00:46.159 --> 0:00:48.560
<v Speaker 2>Like you can just go to ChatGPT and say

0:00:48.800 --> 0:00:51.360
<v Speaker 2>which jobs are going to be lost thanks to you,

0:00:51.400 --> 0:00:54.960
<v Speaker 2>and it'll like spew some answer forward based on the

0:00:55.040 --> 0:00:58.440
<v Speaker 2>collective wisdom of trillions of words that people have typed

0:00:58.440 --> 0:00:59.160
<v Speaker 2>over the years.

0:00:59.360 --> 0:01:03.400
<v Speaker 1>I do think, though, if we're talking about one opinion,

0:01:03.600 --> 0:01:06.920
<v Speaker 1>in particular, the dominant opinion at this point in time,

0:01:07.200 --> 0:01:10.360
<v Speaker 1>it does feel like there's a lot of nervousness about

0:01:10.400 --> 0:01:13.319
<v Speaker 1>this new technology and what exactly it means for the economy,

0:01:13.400 --> 0:01:16.520
<v Speaker 1>what specifically it means for jobs. And so you see

0:01:16.520 --> 0:01:19.520
<v Speaker 1>all these headlines that AI is going to lead to

0:01:19.560 --> 0:01:22.560
<v Speaker 1>a bunch of job losses, that it's going to basically

0:01:22.600 --> 0:01:26.280
<v Speaker 1>be a new technological revolution that plays out very similarly

0:01:26.520 --> 0:01:30.200
<v Speaker 1>to the computer revolution that led to the destruction of

0:01:30.240 --> 0:01:33.039
<v Speaker 1>a bunch of sort of middle office jobs, or the

0:01:33.120 --> 0:01:38.160
<v Speaker 1>industrial revolution that led to a loss of skilled artisan jobs.

0:01:38.520 --> 0:01:42.520
<v Speaker 1>And we've seen some hints of that, to be fair,

0:01:42.680 --> 0:01:45.560
<v Speaker 1>So I'm thinking back to last year. I think it

0:01:45.680 --> 0:01:50.120
<v Speaker 1>was in the summer, maybe in June, and the Challenger

0:01:50.240 --> 0:01:53.280
<v Speaker 1>Jobs Report came out and for the first time ever,

0:01:53.320 --> 0:01:57.000
<v Speaker 1>they included a line about job losses stemming from AI.

0:01:57.840 --> 0:02:00.520
<v Speaker 2>Yeah, although I'm just going to say right here then,

0:02:00.560 --> 0:02:03.559
<v Speaker 2>I think when a company lays off workers and says

0:02:03.560 --> 0:02:06.400
<v Speaker 2>it's due to AI, I still have this assumption that

0:02:06.480 --> 0:02:08.760
<v Speaker 2>it's like, we're doing badly, so we're gonna put a

0:02:08.760 --> 0:02:11.320
<v Speaker 2>positive spin on it by making it seem as though

0:02:11.320 --> 0:02:14.880
<v Speaker 2>our layoffs are the result of some internal productivity breakthrough

0:02:14.919 --> 0:02:17.280
<v Speaker 2>that we're getting from a chatbot. So, like, I don't

0:02:17.400 --> 0:02:19.840
<v Speaker 2>quite believe it, but I think.

0:02:19.639 --> 0:02:23.280
<v Speaker 1>That's totally fair. That's totally fair. But I think clearly

0:02:23.320 --> 0:02:25.600
<v Speaker 1>this is something people are paying attention to. You are

0:02:25.760 --> 0:02:28.040
<v Speaker 1>starting to see some of the economic reports sort of

0:02:28.240 --> 0:02:31.760
<v Speaker 1>break this down. At least the Challenger report if not

0:02:32.000 --> 0:02:34.840
<v Speaker 1>like the BLS and things like that. So there is

0:02:35.000 --> 0:02:39.120
<v Speaker 1>this idea hovering over the economy at the moment, which is, Okay,

0:02:39.200 --> 0:02:42.440
<v Speaker 1>maybe AI will be great for productivity, we'll get that boost,

0:02:42.800 --> 0:02:44.720
<v Speaker 1>but what does it mean for jobs? Right?

0:02:44.880 --> 0:02:48.480
<v Speaker 2>Basically everyone in any realm loses their job and is

0:02:48.480 --> 0:02:51.160
<v Speaker 2>on the UBI drip, and only Sam Altman is the

0:02:51.240 --> 0:02:54.040
<v Speaker 2>last person who is employed. Is that it? But like, I

0:02:54.040 --> 0:02:56.120
<v Speaker 2>don't know, I get freaked out. Like, it's pretty good.

0:02:56.280 --> 0:03:00.000
<v Speaker 2>Like, I use AI chatbots almost all

0:03:00.080 --> 0:03:01.640
<v Speaker 2>the time in my work, and it's like, well, maybe

0:03:01.639 --> 0:03:04.360
<v Speaker 2>it could one day be a better host than myself

0:03:04.680 --> 0:03:08.400
<v Speaker 2>for a podcast. It seems possible to me. I am anxious.

0:03:08.520 --> 0:03:11.920
<v Speaker 2>Of course, people also like to project onto their perceived

0:03:11.919 --> 0:03:14.920
<v Speaker 2>ideological enemies. That's like, oh, all you English majors are

0:03:14.919 --> 0:03:17.040
<v Speaker 2>going to lose your jobs hahaha, And then the English

0:03:17.040 --> 0:03:19.800
<v Speaker 2>majors all go, all you coders are going to lose

0:03:19.880 --> 0:03:22.120
<v Speaker 2>your jobs, and you're going to need English majors there.

0:03:22.160 --> 0:03:23.359
<v Speaker 3>It's just an endless thing.

0:03:23.400 --> 0:03:25.400
<v Speaker 2>And actually I think I tune most of it out

0:03:25.440 --> 0:03:28.639
<v Speaker 2>because it's so ambiguous in my view where this technology

0:03:28.680 --> 0:03:30.919
<v Speaker 2>is going that there are very few people I even

0:03:30.960 --> 0:03:34.480
<v Speaker 2>want to hear from on the topic, because I think

0:03:34.480 --> 0:03:37.520
<v Speaker 2>it's just so there's so much extreme uncertainty.

0:03:37.040 --> 0:03:41.280
<v Speaker 1>Still extreme uncertainty. As you mentioned, people kind of harness

0:03:41.440 --> 0:03:46.280
<v Speaker 1>it to further their own biases or arguments. But you're right,

0:03:46.400 --> 0:03:48.360
<v Speaker 1>there are people who are good on this topic, and

0:03:48.360 --> 0:03:49.600
<v Speaker 1>we're about to speak to one of them.

0:03:50.080 --> 0:03:50.960
<v Speaker 3>That's exactly right.

0:03:51.000 --> 0:03:54.800
<v Speaker 2>So last month, there was this really interesting headline that

0:03:54.880 --> 0:03:58.640
<v Speaker 2>I saw in Noema magazine, and it sort of

0:03:58.680 --> 0:04:02.240
<v Speaker 2>felt like this sort of like provocative, maybe clickbaity type

0:04:02.280 --> 0:04:05.840
<v Speaker 2>headline that said AI could actually help rebuild the middle class,

0:04:05.920 --> 0:04:08.280
<v Speaker 2>which is very counterintuitive, very the opposite of what we're

0:04:08.280 --> 0:04:12.160
<v Speaker 2>talking about. But then I noticed who the author of

0:04:12.200 --> 0:04:16.040
<v Speaker 2>the piece was, and it's someone whose work is very

0:04:16.080 --> 0:04:19.839
<v Speaker 2>strongly associated with forces in the past and forces in

0:04:19.920 --> 0:04:23.520
<v Speaker 2>technology that have been destructive to the middle class and

0:04:23.560 --> 0:04:27.120
<v Speaker 2>have caused great labor market upheaval. And so if someone

0:04:27.160 --> 0:04:30.479
<v Speaker 2>who has sort of been watching this exact topic, the

0:04:30.600 --> 0:04:35.320
<v Speaker 2>intersection of labor market upheaval and technological change, is saying,

0:04:35.600 --> 0:04:38.160
<v Speaker 2>actually this could be good, and this person has a

0:04:38.200 --> 0:04:40.800
<v Speaker 2>track record in this area, I'm like, Okay, this is

0:04:40.839 --> 0:04:43.400
<v Speaker 2>an argument maybe I'll pay more attention to than the

0:04:43.440 --> 0:04:44.880
<v Speaker 2>random person doing a Twitter thread.

0:04:45.240 --> 0:04:48.320
<v Speaker 1>I'm into it. As you mentioned, we're speaking to someone

0:04:48.320 --> 0:04:51.480
<v Speaker 1>who is an expert on this particular topic and specifically

0:04:51.560 --> 0:04:54.520
<v Speaker 1>has written a lot and researched a lot about

0:04:54.520 --> 0:04:58.280
<v Speaker 1>previous labor market shocks, including the China Shock. So competition

0:04:58.400 --> 0:05:02.000
<v Speaker 1>from China in the realms of manufacturing in the sort

0:05:02.000 --> 0:05:05.239
<v Speaker 1>of nineteen nineties, early two thousands. So I'm very excited

0:05:05.240 --> 0:05:08.520
<v Speaker 1>for this conversation. I am interested to hear an argument

0:05:08.600 --> 0:05:11.200
<v Speaker 1>that's not just AI is terrible and it's going to

0:05:11.240 --> 0:05:12.320
<v Speaker 1>take all of our jobs.

0:05:12.560 --> 0:05:15.480
<v Speaker 2>Absolutely well, I'm really excited. We do, in fact have

0:05:15.600 --> 0:05:17.719
<v Speaker 2>the perfect guest. We are going to be speaking with,

0:05:17.800 --> 0:05:21.240
<v Speaker 2>David Autor. He's a professor of economics at MIT and

0:05:21.320 --> 0:05:25.320
<v Speaker 2>co-director of the MIT Shaping the Future of Work Initiative,

0:05:25.880 --> 0:05:29.560
<v Speaker 2>and he's really known for his work on the China

0:05:29.640 --> 0:05:34.080
<v Speaker 2>Shock and the devastating impact that China's boom in tradeable goods,

0:05:34.320 --> 0:05:38.520
<v Speaker 2>particularly after its accession to the WTO, had on various

0:05:38.520 --> 0:05:42.159
<v Speaker 2>communities within the United States that were sort of dependent

0:05:42.400 --> 0:05:46.400
<v Speaker 2>on a sort of regional manufacturing. So, David, thank you

0:05:46.480 --> 0:05:48.159
<v Speaker 2>so much for coming on Odd Lots.

0:05:48.800 --> 0:05:50.599
<v Speaker 4>Thank you so much, Joe and Tracy, for inviting me.

0:05:50.680 --> 0:05:51.960
<v Speaker 4>I'll try not to be clickbaity.

0:05:52.800 --> 0:05:54.320
<v Speaker 3>That's okay. It's okay.

0:05:54.320 --> 0:05:56.880
<v Speaker 2>It's okay to be clickbait if it delivers. And the

0:05:56.920 --> 0:05:58.839
<v Speaker 2>other thing about this article, by the way, is that

0:05:58.920 --> 0:06:01.640
<v Speaker 2>it wasn't like a one-paragraph thought piece. Like, this is

0:06:01.760 --> 0:06:04.960
<v Speaker 2>clearly some serious work which we obviously appreciated and made

0:06:05.000 --> 0:06:07.599
<v Speaker 2>me take it seriously. But before we get into this

0:06:07.839 --> 0:06:11.359
<v Speaker 2>or even the China shock or general work or AI

0:06:11.400 --> 0:06:14.760
<v Speaker 2>in general, what is your like, what do you tell

0:06:14.839 --> 0:06:17.040
<v Speaker 2>us like what has been the thrust of your career

0:06:17.120 --> 0:06:17.680
<v Speaker 2>over time?

0:06:17.839 --> 0:06:17.919
<v Speaker 1>Like?

0:06:18.000 --> 0:06:21.120
<v Speaker 2>What is what is sort of the main interest of

0:06:21.160 --> 0:06:25.680
<v Speaker 2>yours that spans from the effects of globalization to now AI,

0:06:25.960 --> 0:06:26.480
<v Speaker 2>et cetera.

0:06:26.920 --> 0:06:30.640
<v Speaker 4>My focus has always been on forces that shape opportunity,

0:06:30.839 --> 0:06:35.000
<v Speaker 4>particularly for workers without four-year college degrees, the majority

0:06:35.000 --> 0:06:37.040
<v Speaker 4>of workers in the United States and of course elsewhere,

0:06:37.600 --> 0:06:41.200
<v Speaker 4>who have been so buffeted by computerization, by globalization,

0:06:41.640 --> 0:06:44.680
<v Speaker 4>by changes in institutions, including deunionization, the fall of the

0:06:44.680 --> 0:06:47.320
<v Speaker 4>minimum wage in the United States. And so that is

0:06:47.360 --> 0:06:50.440
<v Speaker 4>the common focus of my work, and that has included,

0:06:50.560 --> 0:06:52.680
<v Speaker 4>you know, a lot of work on technological change, computerization,

0:06:53.200 --> 0:06:55.880
<v Speaker 4>the China trade shock, and many other angles to that.

0:06:55.880 --> 0:06:57.480
<v Speaker 4>But that kind of unifies that's you know, I think

0:06:57.480 --> 0:06:59.279
<v Speaker 4>the labor market is the most important thing in the world.

0:06:59.640 --> 0:07:01.359
<v Speaker 4>I think that's where people derive most of their income,

0:07:01.400 --> 0:07:05.120
<v Speaker 4>spend most of their time, derive identity from, and so

0:07:05.480 --> 0:07:08.840
<v Speaker 4>things that affect the quality of jobs, the opportunities that

0:07:08.880 --> 0:07:12.760
<v Speaker 4>people have are just quintessentially important and are going to

0:07:12.800 --> 0:07:15.520
<v Speaker 4>shape the structure of their lives, you know, more than

0:07:15.520 --> 0:07:18.200
<v Speaker 4>the quality of entertainment, more than the ease of transportation,

0:07:18.360 --> 0:07:20.680
<v Speaker 4>more than you know, what fashion is available. This is

0:07:20.680 --> 0:07:21.560
<v Speaker 4>really a big deal.

0:07:21.840 --> 0:07:24.800
<v Speaker 1>So in the spirit of this discussion, I asked

0:07:24.880 --> 0:07:30.480
<v Speaker 1>ChatGPT to poke intellectual and logical holes in this article.

0:07:30.920 --> 0:07:33.720
<v Speaker 1>So let's just start there. Number one. No, I'm joking.

0:07:33.760 --> 0:07:36.080
<v Speaker 1>I did actually do that, and some of them, some

0:07:36.160 --> 0:07:38.400
<v Speaker 1>of them are quite good, and I will get to

0:07:38.440 --> 0:07:41.240
<v Speaker 1>them later. But maybe just to begin with, could you

0:07:41.280 --> 0:07:45.560
<v Speaker 1>talk about the current discourse on AI and why there

0:07:45.680 --> 0:07:50.200
<v Speaker 1>seems to be this distrust of new technology. What is

0:07:50.240 --> 0:07:52.880
<v Speaker 1>that predicated on. I mean I kind of referred to

0:07:52.920 --> 0:07:56.160
<v Speaker 1>it in the intro, but there is past history, obviously

0:07:56.240 --> 0:08:00.160
<v Speaker 1>with major technological advances and booms that have led to

0:08:00.240 --> 0:08:03.480
<v Speaker 1>certain outcomes in the labor market. How does that inform

0:08:03.560 --> 0:08:04.520
<v Speaker 1>the current discussion?

0:08:04.800 --> 0:08:08.360
<v Speaker 4>Sure, so people are understandably very concerned about all of

0:08:08.400 --> 0:08:11.440
<v Speaker 4>these technological forces because they are disruptive and they create

0:08:11.480 --> 0:08:13.920
<v Speaker 4>winners and losers. There's no sense in which everyone is

0:08:13.960 --> 0:08:17.280
<v Speaker 4>better off because of a new technology. So you mentioned

0:08:17.680 --> 0:08:20.360
<v Speaker 4>the industrial era, and the Luddites rose up against the

0:08:20.360 --> 0:08:25.040
<v Speaker 4>introduction of power looms and smashed them, and they're often derided historically,

0:08:25.080 --> 0:08:29.400
<v Speaker 4>but they were right. The Industrial Revolution, the mechanization of

0:08:29.520 --> 0:08:33.120
<v Speaker 4>weaving wiped out the careers of artisans and made their

0:08:33.160 --> 0:08:37.000
<v Speaker 4>work untenable, and wages didn't rise for decades into

0:08:37.000 --> 0:08:40.160
<v Speaker 4>the Industrial Revolution. So that was very displacing. Ultimately it

0:08:40.240 --> 0:08:42.160
<v Speaker 4>raised living standards, but it took a long time, and

0:08:42.200 --> 0:08:46.199
<v Speaker 4>the beneficiaries were not workers. The computer revolution has raised productivity,

0:08:46.520 --> 0:08:50.160
<v Speaker 4>but it's been very unequal and polarizing. It's automated a

0:08:50.200 --> 0:08:53.160
<v Speaker 4>lot of middle skill, middle class work in factories and

0:08:53.160 --> 0:08:56.199
<v Speaker 4>in offices. It's been great for professionals, but for many

0:08:56.200 --> 0:08:58.560
<v Speaker 4>other people it's just meant that because they can no

0:08:58.600 --> 0:09:00.800
<v Speaker 4>longer do those middle skill jobs, they're often found in

0:09:01.040 --> 0:09:05.040
<v Speaker 4>food service, cleaning, security, entertainment, recreation, and those are valuable,

0:09:05.440 --> 0:09:09.400
<v Speaker 4>laudable activities, but they don't pay well, because they don't

0:09:09.520 --> 0:09:13.319
<v Speaker 4>use specialized expertise and training, so most people can do

0:09:13.400 --> 0:09:16.080
<v Speaker 4>that work almost right away, so it tends to be

0:09:16.160 --> 0:09:19.480
<v Speaker 4>low paid. So I think there's many reasons to take

0:09:19.520 --> 0:09:22.560
<v Speaker 4>this very very seriously and think carefully about what the

0:09:22.600 --> 0:09:23.600
<v Speaker 4>implications are.

0:09:23.920 --> 0:09:26.200
<v Speaker 2>Before we even get to AI, talk to us more

0:09:26.200 --> 0:09:30.000
<v Speaker 2>about the computer revolution, because, like I said, I saw

0:09:30.040 --> 0:09:31.680
<v Speaker 2>your piece and I'm like, uh, and I first thought

0:09:31.800 --> 0:09:34.280
<v Speaker 2>China Shock and your work on that. It's like, okay,

0:09:34.320 --> 0:09:36.599
<v Speaker 2>this is interesting, but actually, just like I feel like

0:09:36.640 --> 0:09:40.239
<v Speaker 2>there actually has not been a lot of general conversation

0:09:40.360 --> 0:09:44.640
<v Speaker 2>about the sort of unequalizing effects of the computer revolution,

0:09:44.760 --> 0:09:47.720
<v Speaker 2>like how did that happen? What does the research show

0:09:47.720 --> 0:09:50.720
<v Speaker 2>about the timing of the introduction of the computers, And

0:09:50.760 --> 0:09:53.240
<v Speaker 2>then this sort of, like, I don't know, maybe barbelling

0:09:53.320 --> 0:09:55.679
<v Speaker 2>or fragmentation of what happened to workers.

0:09:56.360 --> 0:09:59.560
<v Speaker 4>So you know, this really begins in the nineteen eighties

0:09:59.600 --> 0:10:04.480
<v Speaker 4>and it continues over at least thirty

0:10:04.480 --> 0:10:07.120
<v Speaker 4>five years. And you know, a very simple way to

0:10:07.360 --> 0:10:09.600
<v Speaker 4>boil it down is to say, look, what are computers useful for?

0:10:10.000 --> 0:10:12.840
<v Speaker 4>They're useful for following rules and procedures, right, they don't

0:10:13.000 --> 0:10:16.800
<v Speaker 4>think, they're not creative. They're not problem solvers. They don't improvise,

0:10:17.280 --> 0:10:21.200
<v Speaker 4>they follow codified rules and procedures. But that describes a

0:10:21.200 --> 0:10:23.840
<v Speaker 4>lot of middle skill work. Right, whether you're in an

0:10:23.840 --> 0:10:26.520
<v Speaker 4>office or you're doing repetitive assembly work. The ability to

0:10:26.840 --> 0:10:30.720
<v Speaker 4>accurately carry out codified procedures is a valuable skill. It

0:10:30.800 --> 0:10:34.000
<v Speaker 4>often requires literacy and numeracy and training, and so

0:10:34.160 --> 0:10:37.360
<v Speaker 4>the ability to automate that was a really big deal,

0:10:37.960 --> 0:10:41.640
<v Speaker 4>and that had the effect of displacing many people who

0:10:41.640 --> 0:10:43.880
<v Speaker 4>were doing what I would call these mass expertise jobs

0:10:43.880 --> 0:10:46.880
<v Speaker 4>where they were following codified procedures. Right, it takes education

0:10:47.400 --> 0:10:51.000
<v Speaker 4>to be a typist or a bookkeeper or someone who

0:10:51.040 --> 0:10:54.760
<v Speaker 4>does filing and organization, keeps track of accounts. And so

0:10:54.840 --> 0:10:56.920
<v Speaker 4>the fact is that a lot of that work takes real skill.

0:10:57.040 --> 0:10:59.600
<v Speaker 4>To do high quality work on an assembly line, you have

0:10:59.600 --> 0:11:01.760
<v Speaker 4>to understand the tools, you have to understand the product

0:11:01.760 --> 0:11:03.680
<v Speaker 4>and so on. So the fact that that work could

0:11:03.679 --> 0:11:07.559
<v Speaker 4>be automated was not unambiguously good. It was good for

0:11:07.840 --> 0:11:09.920
<v Speaker 4>you know, it was good for productivity, it was good

0:11:09.920 --> 0:11:11.800
<v Speaker 4>for consumers, it was good for firms. But for the

0:11:11.840 --> 0:11:15.000
<v Speaker 4>workers who had invested their careers in those activities, that

0:11:15.120 --> 0:11:17.719
<v Speaker 4>was definitely a negative. And on the other hand, if

0:11:17.720 --> 0:11:20.360
<v Speaker 4>you were a professional or you know, a manager or

0:11:20.400 --> 0:11:26.119
<v Speaker 4>a designer, researcher, doctor, having access to information and quick calculation,

0:11:26.720 --> 0:11:29.240
<v Speaker 4>that's not your main job. Those are just inputs into

0:11:29.240 --> 0:11:32.600
<v Speaker 4>your decision making. So computers were very complementary to people

0:11:32.640 --> 0:11:34.720
<v Speaker 4>who are decision makers, which is really the bulk of

0:11:34.720 --> 0:11:38.280
<v Speaker 4>the professions making high stakes decisions about you know, important

0:11:38.280 --> 0:11:40.280
<v Speaker 4>one off cases. You know how to care for a

0:11:40.280 --> 0:11:43.040
<v Speaker 4>cancer patient, or you know how to design a building,

0:11:43.320 --> 0:11:46.600
<v Speaker 4>or how to do a marketing plan. Right, computerization is extremely

0:11:46.600 --> 0:11:48.960
<v Speaker 4>helpful for that. It doesn't displace your main job, it just

0:11:48.960 --> 0:11:51.200
<v Speaker 4>makes you more efficient at it. But for people who

0:11:51.520 --> 0:11:53.920
<v Speaker 4>did not have the opportunity to get degrees and move

0:11:54.000 --> 0:11:57.160
<v Speaker 4>upward into that work, what remained was a lot of

0:11:57.520 --> 0:12:00.560
<v Speaker 4>work that's very hard to automate. I mentioned these,

0:12:00.600 --> 0:12:02.960
<v Speaker 4>a lot of these hands on manual jobs, so you know,

0:12:03.040 --> 0:12:07.160
<v Speaker 4>food service and cleaning, it could be transportation. And many

0:12:07.240 --> 0:12:09.680
<v Speaker 4>of those jobs are open

0:12:09.800 --> 0:12:13.880
<v Speaker 4>to many, many people. They don't require much training or experience,

0:12:14.520 --> 0:12:17.200
<v Speaker 4>and you don't get a great deal better at them

0:12:17.240 --> 0:12:20.360
<v Speaker 4>over time. And so because of that, because they're non

0:12:20.360 --> 0:12:22.680
<v Speaker 4>expert work, they tend to be low paid in all

0:12:22.800 --> 0:12:25.240
<v Speaker 4>industrialized countries. Now, I want to be clear that not

0:12:25.320 --> 0:12:27.600
<v Speaker 4>all hands on work is low paid or low skilled

0:12:27.600 --> 0:12:29.679
<v Speaker 4>in any sense. Right, if you're a plumber, electrician, you're

0:12:29.679 --> 0:12:31.800
<v Speaker 4>working in the skilled trades, right, if you do skilled repair.

0:12:32.200 --> 0:12:35.040
<v Speaker 4>There are many, many skilled hands on jobs, but the

0:12:35.160 --> 0:12:37.719
<v Speaker 4>ones that have grown so much as the middle has

0:12:37.760 --> 0:12:40.120
<v Speaker 4>hollowed out, have been much more of these personal service

0:12:40.160 --> 0:12:56.559
<v Speaker 4>occupations that have low training and expertise requirements.

0:12:58.760 --> 0:13:01.280
<v Speaker 1>The thing this reminds me of, and I cannot, for

0:13:01.320 --> 0:13:04.000
<v Speaker 1>the life of me remember which guest this was, but

0:13:04.040 --> 0:13:07.720
<v Speaker 1>a previous Odd Lots guest described this as, remember the

0:13:07.760 --> 0:13:11.120
<v Speaker 1>scene from The Producers where Matthew Broderick is like an

0:13:11.160 --> 0:13:14.480
<v Speaker 1>actuary or something working in an office and they're all

0:13:14.520 --> 0:13:15.520
<v Speaker 1>toiling away.

0:13:15.400 --> 0:13:17.439
<v Speaker 2>That was Stewart Butterfield.

0:13:19.120 --> 0:13:22.280
<v Speaker 1>That's right, and then all of those people eventually get

0:13:22.320 --> 0:13:25.400
<v Speaker 1>replaced by an Excel spreadsheet, right, Like that's the function

0:13:25.800 --> 0:13:28.960
<v Speaker 1>that became Excel. So, David, I want to kind of

0:13:28.960 --> 0:13:30.520
<v Speaker 1>press you on this point because I think it's a

0:13:30.559 --> 0:13:33.679
<v Speaker 1>really interesting one, and I think it's essential to understanding

0:13:33.679 --> 0:13:38.520
<v Speaker 1>your overall argument. But you make the distinction between information

0:13:39.360 --> 0:13:43.000
<v Speaker 1>and decision making. So the idea that people can have

0:13:43.080 --> 0:13:46.400
<v Speaker 1>access to a lot of information. In fact, plenty of

0:13:46.400 --> 0:13:49.679
<v Speaker 1>people would argue that people are drowning in information at

0:13:49.679 --> 0:13:49.920
<v Speaker 1>the moment.

0:13:50.520 --> 0:13:50.880
<v Speaker 4>Information.

0:13:51.040 --> 0:13:54.920
<v Speaker 1>Yeah, but they're not necessarily using that to make the

0:13:54.960 --> 0:13:57.959
<v Speaker 1>best decisions. Decision making is sort of a separate skill.

0:13:58.000 --> 0:13:59.959
<v Speaker 1>Can you talk a little bit more about that aspect

0:14:00.040 --> 0:14:00.680
<v Speaker 1>of your argument?

0:14:01.160 --> 0:14:03.280
<v Speaker 4>Absolutely. So, I want to draw a

0:14:03.320 --> 0:14:06.280
<v Speaker 4>sharp line between AI and traditional computing, which is what

0:14:06.280 --> 0:14:09.720
<v Speaker 4>we've been discussing, because they're quite different. But before I

0:14:09.720 --> 0:14:11.360
<v Speaker 4>do that, let me kind of make a kind of

0:14:11.360 --> 0:14:13.400
<v Speaker 4>a meta argument that I think is useful for our discussion.

0:14:13.840 --> 0:14:16.319
<v Speaker 4>So the concern we should be having is not about

0:14:16.320 --> 0:14:19.200
<v Speaker 4>the quantity of jobs. We are not running out of jobs.

0:14:19.400 --> 0:14:21.600
<v Speaker 4>And in fact, you know, all the Western world right

0:14:21.640 --> 0:14:24.960
<v Speaker 4>now is in full or overemployment, and even during the

0:14:24.960 --> 0:14:27.360
<v Speaker 4>whole computer revolution and so on, we didn't run out of jobs.

0:14:27.400 --> 0:14:29.480
<v Speaker 4>It's not the quantity that matters. In fact, we're all

0:14:29.520 --> 0:14:32.640
<v Speaker 4>facing a demographic crunch. It's the quality. Right? A world

0:14:32.640 --> 0:14:34.840
<v Speaker 4>in which everyone's waiting tables is very different from the

0:14:34.880 --> 0:14:37.880
<v Speaker 4>world in which everyone is doing medical care. And so

0:14:38.240 --> 0:14:41.600
<v Speaker 4>what matters is not simply whether there is work, but

0:14:41.680 --> 0:14:45.640
<v Speaker 4>whether it's expert work that requires real skills. If it's

0:14:45.680 --> 0:14:49.320
<v Speaker 4>non expert work, work that anyone can do with no

0:14:49.400 --> 0:14:52.640
<v Speaker 4>training or certification, unfortunately it will be low paid. On

0:14:52.640 --> 0:14:56.080
<v Speaker 4>the other hand, if it's work that requires specialized knowledge

0:14:56.080 --> 0:14:59.160
<v Speaker 4>and that is made more productive by the use of tools,

0:14:59.160 --> 0:15:01.640
<v Speaker 4>and computers are tools and AI is a tool, then

0:15:02.120 --> 0:15:04.920
<v Speaker 4>that's good for labor, that's good for earnings, that's good

0:15:04.920 --> 0:15:06.680
<v Speaker 4>for the quality of careers. And so we should be

0:15:06.760 --> 0:15:09.360
<v Speaker 4>thinking about expertise. Just to give you, like a very

0:15:09.840 --> 0:15:12.680
<v Speaker 4>stylized example, you know, think of the job of crossing

0:15:12.720 --> 0:15:16.560
<v Speaker 4>guard and air traffic controller. These are basically the same job.

0:15:17.000 --> 0:15:20.120
<v Speaker 4>The job is to prevent things from crashing into other things, right,

0:15:20.280 --> 0:15:23.560
<v Speaker 4>airplanes from crashing into airplanes, cars from crashing into children

0:15:23.800 --> 0:15:26.440
<v Speaker 4>on their way to school. But air traffic controllers in the United

0:15:26.440 --> 0:15:28.000
<v Speaker 4>States are paid, you know, four and a half times

0:15:28.000 --> 0:15:31.040
<v Speaker 4>as much as crossing guards, and the reason is expertise.

0:15:31.320 --> 0:15:34.280
<v Speaker 4>Almost anyone can become a crossing guard in the United

0:15:34.280 --> 0:15:37.280
<v Speaker 4>States with no training or certification, whereas to become an air

0:15:37.280 --> 0:15:41.680
<v Speaker 4>traffic controller requires years of school and thousands of hours

0:15:41.680 --> 0:15:44.720
<v Speaker 4>of practice. And so even though those jobs do the

0:15:44.760 --> 0:15:48.480
<v Speaker 4>same thing, because of the difference in skill requirements, they

0:15:48.520 --> 0:15:51.480
<v Speaker 4>pay very different wage levels, and so we want to

0:15:51.520 --> 0:15:56.320
<v Speaker 4>have jobs where expertise is valuable, not just where physical

0:15:56.360 --> 0:15:59.880
<v Speaker 4>presence is the primary requirement. So that's what we should

0:15:59.880 --> 0:16:02.280
<v Speaker 4>be thinking about. Having said that, let me talk about

0:16:02.440 --> 0:16:06.120
<v Speaker 4>how AI relates to that. So, you know, traditional computerization,

0:16:06.200 --> 0:16:10.000
<v Speaker 4>as we've been talking about, is really about automating well

0:16:10.040 --> 0:16:13.000
<v Speaker 4>understood procedures and rules, right, what we call formal knowledge.

0:16:13.040 --> 0:16:14.960
<v Speaker 4>You know how to do math, how to reproduce a

0:16:14.960 --> 0:16:18.240
<v Speaker 4>document or check for spelling errors. And it's very limited

0:16:18.280 --> 0:16:23.640
<v Speaker 4>because it cannot do what people do fairly effortlessly, which

0:16:23.720 --> 0:16:26.560
<v Speaker 4>is learn from kind of tacit knowledge. Tacit knowledge is

0:16:26.960 --> 0:16:29.480
<v Speaker 4>all the things that you implicitly understand that you infer

0:16:29.520 --> 0:16:32.080
<v Speaker 4>from your environment, but you never formalize. Right. So, you

0:16:32.120 --> 0:16:34.560
<v Speaker 4>know how to ride a bicycle, but you couldn't explain

0:16:34.640 --> 0:16:37.160
<v Speaker 4>how it's done. Right. You couldn't sit up and explain

0:16:37.240 --> 0:16:39.840
<v Speaker 4>that you know the gyroscopic physics of a bicycle. You

0:16:39.920 --> 0:16:41.680
<v Speaker 4>know how to make a funny joke, but you don't

0:16:41.680 --> 0:16:43.840
<v Speaker 4>know the rules for making a funny joke. You know

0:16:43.880 --> 0:16:46.120
<v Speaker 4>how to recognize the face of someone after you haven't

0:16:46.160 --> 0:16:48.920
<v Speaker 4>seen them for thirty years, right, But that's actually a

0:16:48.920 --> 0:16:52.320
<v Speaker 4>hard problem, and we do it, but we do it

0:16:52.400 --> 0:16:55.000
<v Speaker 4>based on some tacit understanding. And this has always been

0:16:55.040 --> 0:16:58.520
<v Speaker 4>a barrier to computerization because we couldn't code up the

0:16:58.560 --> 0:17:01.480
<v Speaker 4>things that we understood only tacitly. We had to understand

0:17:01.480 --> 0:17:07.240
<v Speaker 4>them explicitly, formally. So AI overcomes that barrier. AI essentially

0:17:07.640 --> 0:17:11.359
<v Speaker 4>infers tacit information from large bodies of data. It learns

0:17:11.359 --> 0:17:14.760
<v Speaker 4>the associations between you know, words and phrases and sentences

0:17:14.800 --> 0:17:18.520
<v Speaker 4>between pictures and words. It can look at a scan

0:17:18.800 --> 0:17:22.520
<v Speaker 4>of a patient's lungs and make predictions or you know,

0:17:22.640 --> 0:17:25.600
<v Speaker 4>guesses about whether that patient has an edema or other

0:17:26.119 --> 0:17:28.919
<v Speaker 4>medical disorders. It does that not because someone has written

0:17:28.920 --> 0:17:31.919
<v Speaker 4>a program that says, these things tell you whether you have,

0:17:32.000 --> 0:17:34.640
<v Speaker 4>you know, a lung issue. It's because it learns from

0:17:34.640 --> 0:17:37.399
<v Speaker 4>the patterns in the data it's trained on, and so that

0:17:37.440 --> 0:17:40.680
<v Speaker 4>gives it a really different set of capabilities. It gives

0:17:40.720 --> 0:17:43.560
<v Speaker 4>it the ability to do what a lot of us do,

0:17:44.000 --> 0:17:45.679
<v Speaker 4>or at least to supplement a lot of what we do,

0:17:45.720 --> 0:17:49.840
<v Speaker 4>which is to sort of make decisions based on lots

0:17:49.840 --> 0:17:53.360
<v Speaker 4>and lots of inputs and educated guesses. Right, So let's

0:17:53.359 --> 0:17:55.879
<v Speaker 4>say you know, you're a medical doctor, right, when you

0:17:55.920 --> 0:17:59.320
<v Speaker 4>see a patient, you're not simply essentially reading from your

0:17:59.320 --> 0:18:02.720
<v Speaker 4>textbook in your mind about what to do. You understand

0:18:02.960 --> 0:18:06.639
<v Speaker 4>bodily systems, you understand the biology and so on, but

0:18:06.720 --> 0:18:08.959
<v Speaker 4>then you've had lots and lots of experience. So when

0:18:09.000 --> 0:18:11.439
<v Speaker 4>you see an individual patient, you're going to make a

0:18:11.440 --> 0:18:14.639
<v Speaker 4>decision based on a kind of translation from this formal

0:18:14.640 --> 0:18:17.679
<v Speaker 4>body of knowledge plus all the experience you've had to

0:18:17.720 --> 0:18:19.879
<v Speaker 4>make a good judgment. And the stakes are really high

0:18:20.000 --> 0:18:22.359
<v Speaker 4>because obviously if there was just a simple rule book

0:18:22.359 --> 0:18:25.119
<v Speaker 4>for it, you wouldn't need a doctor. You need a

0:18:25.160 --> 0:18:27.360
<v Speaker 4>person who can make a judgment about how to care

0:18:27.400 --> 0:18:29.280
<v Speaker 4>for this patient and their individual needs.

0:18:29.440 --> 0:18:29.960
<v Speaker 3>It's so funny.

0:18:30.000 --> 0:18:32.679
<v Speaker 2>I was just talking to Tracy in a different context,

0:18:32.680 --> 0:18:34.760
<v Speaker 2>and I was like, I was talking about the TV

0:18:34.840 --> 0:18:38.160
<v Speaker 2>show House, which I'm really into, and like, you know,

0:18:38.200 --> 0:18:41.119
<v Speaker 2>even though it's probably hyper dramatized, this idea of like

0:18:41.240 --> 0:18:44.360
<v Speaker 2>how still today like doctors don't really know a lot

0:18:44.400 --> 0:18:46.520
<v Speaker 2>and they have to like they debate, well, what's actually

0:18:46.640 --> 0:18:48.840
<v Speaker 2>going on here? And of course the show has some

0:18:49.000 --> 0:18:52.760
<v Speaker 2>very entertaining depictions of those debates among doctors about

0:18:52.920 --> 0:18:55.280
<v Speaker 2>what's really going wrong with the patient and what's the

0:18:55.440 --> 0:18:58.320
<v Speaker 2>proper treatment. So I guess you can go from there

0:18:58.359 --> 0:19:01.439
<v Speaker 2>and just say, well, House was like the most brilliant

0:19:01.440 --> 0:19:03.800
<v Speaker 2>and he had seen thousands of patients over the course

0:19:03.840 --> 0:19:06.119
<v Speaker 2>of several seasons of that show, and so he had

0:19:06.160 --> 0:19:10.560
<v Speaker 2>the best like intuitions. But basically it sounds like, thanks

0:19:10.600 --> 0:19:16.000
<v Speaker 2>to AI, someone can harness those same intuitions without having

0:19:16.080 --> 0:19:19.040
<v Speaker 2>seen thousands of patients before, like doctor House did.

0:19:19.680 --> 0:19:21.879
<v Speaker 4>I think that's a nice way to put it, is

0:19:21.920 --> 0:19:25.639
<v Speaker 4>that what AI can do is provide kind of guidance

0:19:25.760 --> 0:19:28.439
<v Speaker 4>and guardrails for decision making. So what do I mean

0:19:28.440 --> 0:19:30.720
<v Speaker 4>by guidance and guardrails. By guidance, I mean, you know,

0:19:30.840 --> 0:19:34.960
<v Speaker 4>had you considered this set of possibilities, these potential diagnoses,

0:19:35.040 --> 0:19:37.760
<v Speaker 4>guardrails are like, you know, don't prescribe these two drugs together.

0:19:37.800 --> 0:19:42.200
<v Speaker 4>They negatively interact. And in decision making work, having that

0:19:42.280 --> 0:19:46.040
<v Speaker 4>kind of access to support, to a form of expertise,

0:19:46.119 --> 0:19:48.000
<v Speaker 4>not that you should one hundred percent rely upon it,

0:19:48.320 --> 0:19:51.760
<v Speaker 4>but that you can supplement your own judgment is potentially

0:19:52.000 --> 0:19:54.480
<v Speaker 4>very useful. So you know, let me give you a

0:19:54.600 --> 0:19:57.680
<v Speaker 4>concrete example, sticking with medicine. So the job of nurse

0:19:57.720 --> 0:20:01.399
<v Speaker 4>practitioner is pretty prominent right now. There are several hundred thousand in

0:20:01.400 --> 0:20:03.400
<v Speaker 4>the United States. They make quite a good living, about

0:20:03.400 --> 0:20:04.960
<v Speaker 4>one hundred and thirty two thousand dollars a year at

0:20:05.000 --> 0:20:08.000
<v Speaker 4>the median. And they barely existed twenty years ago. And

0:20:08.200 --> 0:20:11.640
<v Speaker 4>nurse practitioners are nurses with an additional master's degree who

0:20:11.680 --> 0:20:16.520
<v Speaker 4>could do diagnosing, prescribing, treating, things that were only done

0:20:16.880 --> 0:20:21.920
<v Speaker 4>by medical doctors decades earlier. And this new occupation has

0:20:21.920 --> 0:20:24.600
<v Speaker 4>come into existence, and it's terrific for patients in the

0:20:24.680 --> 0:20:26.600
<v Speaker 4>way that it saves them time, it saves the

0:20:26.640 --> 0:20:29.600
<v Speaker 4>healthcare system money, it creates a good job, and it

0:20:29.640 --> 0:20:31.840
<v Speaker 4>does a very important task. Now, this is not a

0:20:31.880 --> 0:20:36.600
<v Speaker 4>technological creation. It's actually the result of nurses recognizing they were

0:20:36.680 --> 0:20:41.080
<v Speaker 4>underused, fighting for a larger role, developing a training

0:20:41.080 --> 0:20:44.480
<v Speaker 4>and certification program, and eventually, over the dead body

0:20:44.520 --> 0:20:48.359
<v Speaker 4>of the American Medical Association, effectively carving out this new role.

0:20:48.920 --> 0:20:51.680
<v Speaker 4>So it's not because of technology. However, at this point,

0:20:51.760 --> 0:20:55.840
<v Speaker 4>nurse practitioners are heavily supported by technology. Right, So electronic

0:20:55.960 --> 0:20:58.919
<v Speaker 4>medical records, right, provide all the information that you

0:20:58.920 --> 0:21:01.240
<v Speaker 4>would need or some of the information you would need

0:21:01.280 --> 0:21:05.080
<v Speaker 4>for good decision making, as do extensive diagnostic tests, as

0:21:05.119 --> 0:21:08.560
<v Speaker 4>does software that looks for drug interactions, among other things.

0:21:08.920 --> 0:21:11.560
<v Speaker 4>And it's easy to imagine that as we roll the

0:21:11.560 --> 0:21:15.880
<v Speaker 4>clock forward, the set of tools that will support decision

0:21:15.880 --> 0:21:19.560
<v Speaker 4>making by nurse practitioners will improve dramatically. And as it

0:21:19.600 --> 0:21:22.359
<v Speaker 4>does so, it will allow them to do more of

0:21:22.400 --> 0:21:26.720
<v Speaker 4>the tasks that are currently kind of controlled by more

0:21:26.720 --> 0:21:29.680
<v Speaker 4>expensive professionals. And why is that a good thing? You

0:21:29.760 --> 0:21:30.919
<v Speaker 4>might say. Well, it's not a good thing if you're a

0:21:31.000 --> 0:21:33.800
<v Speaker 4>doctor, necessarily. But we live in a world in which

0:21:34.160 --> 0:21:38.119
<v Speaker 4>a lot of the bottlenecks are expensive decision makers, people

0:21:38.160 --> 0:21:42.199
<v Speaker 4>who are the MBAs and the lawyers and the medical

0:21:42.240 --> 0:21:45.360
<v Speaker 4>doctors and the architects and the engineers, and they all

0:21:45.400 --> 0:21:47.560
<v Speaker 4>do valid work and they deserve what they earn, and

0:21:47.600 --> 0:21:49.639
<v Speaker 4>I'm not disputing that, but it would be great to

0:21:49.640 --> 0:21:51.760
<v Speaker 4>be able to create more people who could do that

0:21:51.880 --> 0:21:55.800
<v Speaker 4>work without them being quite so expensive. And the advantage

0:21:55.800 --> 0:21:59.080
<v Speaker 4>of that is, if AI can enable more people

0:21:59.119 --> 0:22:02.679
<v Speaker 4>to do good decision making work, it actually can open

0:22:02.800 --> 0:22:05.919
<v Speaker 4>up opportunity for people who are not the elite. Right,

0:22:06.000 --> 0:22:07.879
<v Speaker 4>we have tons and tons of healthcare that needs to

0:22:07.880 --> 0:22:10.400
<v Speaker 4>be done right. It doesn't all need to be done

0:22:10.400 --> 0:22:13.359
<v Speaker 4>by medical doctors, or we have lots of software coding

0:22:13.400 --> 0:22:15.199
<v Speaker 4>that needs to be done. It doesn't all need to

0:22:15.200 --> 0:22:19.159
<v Speaker 4>be done by people from top universities with Bachelors of

0:22:19.160 --> 0:22:22.879
<v Speaker 4>Science degrees in computer science. We have tons of design

0:22:22.920 --> 0:22:25.760
<v Speaker 4>that needs to be done, tons of care, tons of

0:22:25.840 --> 0:22:31.040
<v Speaker 4>legal work. Right? So the potential for AI is

0:22:31.119 --> 0:22:34.960
<v Speaker 4>to enable people who have training and judgment to go

0:22:35.200 --> 0:22:39.120
<v Speaker 4>further with those skills. So it's not to make them unnecessary,

0:22:39.680 --> 0:22:44.879
<v Speaker 4>but simply to extend their range by supporting decision making. So,

0:22:45.080 --> 0:22:49.480
<v Speaker 4>just to give you another super concrete analogy, take YouTube. Right.

0:22:49.520 --> 0:22:53.399
<v Speaker 4>So YouTube is used all the time by people in

0:22:53.440 --> 0:22:56.200
<v Speaker 4>the trades among other groups to try to figure out

0:22:56.240 --> 0:22:59.560
<v Speaker 4>how to do a specific repair or diagnose the problem

0:22:59.600 --> 0:23:01.960
<v Speaker 4>that they haven't seen before. Now I'm gonna say, well,

0:23:02.000 --> 0:23:05.040
<v Speaker 4>who is YouTube really for? Well, it's not for the

0:23:05.080 --> 0:23:07.280
<v Speaker 4>frontier experts. They already know how to do these things,

0:23:07.880 --> 0:23:11.000
<v Speaker 4>nor is it necessarily for the rank amateur. Right. You

0:23:11.000 --> 0:23:12.680
<v Speaker 4>don't want to go to YouTube and say, well, how

0:23:12.680 --> 0:23:16.439
<v Speaker 4>do I install and wire in a brand new central

0:23:16.440 --> 0:23:19.120
<v Speaker 4>house air conditioning? I've never done anything like that before. Right,

0:23:19.119 --> 0:23:21.040
<v Speaker 4>If you went to YouTube for that, you would quickly

0:23:21.080 --> 0:23:23.720
<v Speaker 4>get yourself into trouble because if you don't have some

0:23:23.800 --> 0:23:27.280
<v Speaker 4>foundational skills, that could be a problem. On the other hand,

0:23:27.880 --> 0:23:30.240
<v Speaker 4>if you were handy and you had some experience with

0:23:30.240 --> 0:23:32.920
<v Speaker 4>electrical work, some experience with plumbing, some experience with carpentry,

0:23:33.080 --> 0:23:36.159
<v Speaker 4>but you've never done an AC installation before, well, now

0:23:36.600 --> 0:23:38.680
<v Speaker 4>you could go to YouTube and that would get you further.

0:23:38.760 --> 0:23:41.040
<v Speaker 4>So you could think of YouTube as kind of like

0:23:41.040 --> 0:23:44.120
<v Speaker 4>a mini AI that provides guidance and guardrails.

0:23:44.280 --> 0:23:48.040
<v Speaker 2>I feel like Tracy has watched many YouTubes in the

0:23:48.119 --> 0:23:50.200
<v Speaker 2>last year to fix her Connecticut house.

0:23:50.400 --> 0:23:54.160
<v Speaker 1>This example hits home so hard, and I'll give you

0:23:54.200 --> 0:23:57.239
<v Speaker 1>a specific anecdote, which is my husband and I are

0:23:57.280 --> 0:23:59.680
<v Speaker 1>currently building a shed and we're trying to put a

0:24:00.200 --> 0:24:02.440
<v Speaker 1>roof on it. And we thought like, okay, we put

0:24:02.440 --> 0:24:04.560
<v Speaker 1>the plywood on the roof, and then we get some

0:24:04.760 --> 0:24:08.240
<v Speaker 1>joist tape. We put the joist tape down over the edges,

0:24:08.320 --> 0:24:11.399
<v Speaker 1>and then we put on the shingles, and we watched many,

0:24:11.440 --> 0:24:13.920
<v Speaker 1>many YouTube videos on how to do this. It turns

0:24:13.920 --> 0:24:16.720
<v Speaker 1>out that you can't use joist tape when it's less

0:24:16.760 --> 0:24:21.560
<v Speaker 1>than fifty degrees fahrenheit outside, which it was, which, of course,

0:24:21.640 --> 0:24:24.359
<v Speaker 1>none of the YouTube videos that are filmed down in

0:24:24.400 --> 0:24:27.720
<v Speaker 1>Florida or wherever actually mentioned. And then secondly, it turns

0:24:27.760 --> 0:24:30.200
<v Speaker 1>out that the ability of the joist tape to actually

0:24:30.240 --> 0:24:34.600
<v Speaker 1>adhere to the plywood varies enormously depending on what plywood

0:24:34.760 --> 0:24:37.560
<v Speaker 1>you're using. So there are all these subtleties and nuances

0:24:37.600 --> 0:24:41.359
<v Speaker 1>that you don't necessarily get from a ten minute YouTube video.

0:24:41.560 --> 0:24:44.720
<v Speaker 1>Maybe that's not that surprising. But on this note, so

0:24:44.960 --> 0:24:48.119
<v Speaker 1>you mentioned training, and you've spoken a lot at this

0:24:48.200 --> 0:24:51.400
<v Speaker 1>point about the idea of AI being able to provide

0:24:51.760 --> 0:24:57.520
<v Speaker 1>guardrails and context around decision making that maybe yeah, can

0:24:57.640 --> 0:25:01.399
<v Speaker 1>resolve the bottleneck of expensive decision makers, as you put it,

0:25:01.480 --> 0:25:04.919
<v Speaker 1>by creating more of them or allowing more people to

0:25:05.160 --> 0:25:09.480
<v Speaker 1>tap that function. I guess my big question is how

0:25:09.640 --> 0:25:13.040
<v Speaker 1>much of this is just going to be Well, we

0:25:13.119 --> 0:25:16.479
<v Speaker 1>add a new layer of training that people have to

0:25:16.520 --> 0:25:18.720
<v Speaker 1>do so you can use AI, but you still have

0:25:18.800 --> 0:25:21.399
<v Speaker 1>to know how to use AI. You still have to

0:25:21.520 --> 0:25:24.879
<v Speaker 1>understand the result that it's spitting out and interpret that.

0:25:24.960 --> 0:25:27.959
<v Speaker 1>You still have to know how to actually apply and

0:25:28.359 --> 0:25:32.320
<v Speaker 1>use that result. Are we basically just replacing one skill

0:25:32.400 --> 0:25:33.680
<v Speaker 1>set with another.

0:25:34.280 --> 0:25:37.720
<v Speaker 4>It's a good question. We want it to require skills,

0:25:37.840 --> 0:25:40.640
<v Speaker 4>right? If everyone is expert, no one is expert. Right,

0:25:41.200 --> 0:25:45.399
<v Speaker 4>it's important. The question is whether it can aid the

0:25:45.440 --> 0:25:48.520
<v Speaker 4>acquisition of expertise or whether it just gets in the way,

0:25:48.560 --> 0:25:51.360
<v Speaker 4>another thing you have to certify on. We now have

0:25:51.480 --> 0:25:54.760
<v Speaker 4>you know, a bunch of evidence on AI in specific

0:25:54.800 --> 0:25:57.320
<v Speaker 4>applications and where it works well and where it doesn't.

0:25:57.480 --> 0:26:00.320
<v Speaker 4>So for example, you know, some students of mine, Shakked

0:26:00.320 --> 0:26:02.520
<v Speaker 4>Noy and Whitney Zhang, published a paper in Science last

0:26:02.600 --> 0:26:05.600
<v Speaker 4>year where they gave ChatGPT 3.5

0:26:05.640 --> 0:26:09.200
<v Speaker 4>to people who were doing advertising writing and marketing plans.

0:26:09.240 --> 0:26:11.119
<v Speaker 4>And these were people who were college graduates who do

0:26:11.160 --> 0:26:13.639
<v Speaker 4>this for a living. And one group just used the

0:26:13.640 --> 0:26:16.600
<v Speaker 4>standard tools, so basically the Internet, word processors. Another

0:26:16.640 --> 0:26:18.720
<v Speaker 4>one actually used the chatbot and this was early enough

0:26:18.760 --> 0:26:21.480
<v Speaker 4>that most people didn't already have it, and there were

0:26:21.520 --> 0:26:23.960
<v Speaker 4>a couple of really nice results. So, first thing, it

0:26:24.000 --> 0:26:26.399
<v Speaker 4>saved everybody time. It cut the time it took people

0:26:26.480 --> 0:26:28.880
<v Speaker 4>to do this work from about thirty minutes to about eighteen.

0:26:29.280 --> 0:26:32.359
<v Speaker 4>The second is it improved the quality on average. So

0:26:32.960 --> 0:26:35.719
<v Speaker 4>the output of the people using this tool was judged

0:26:35.760 --> 0:26:38.320
<v Speaker 4>by other college graduates who were not confederates in

0:26:38.359 --> 0:26:43.920
<v Speaker 4>the experiment to be more precise, more concise, and more accurate,

0:26:44.520 --> 0:26:46.840
<v Speaker 4>so it improved the quality of work and saved time. But

0:26:46.880 --> 0:26:49.320
<v Speaker 4>then the most exciting result was if you looked at

0:26:49.320 --> 0:26:53.520
<v Speaker 4>the quality range of the work people did, it basically

0:26:53.600 --> 0:26:56.840
<v Speaker 4>made the least capable writers using ChatGPT about

0:26:56.880 --> 0:26:59.080
<v Speaker 4>as good as the median writers not using it. So

0:26:59.119 --> 0:27:01.760
<v Speaker 4>it kind of leveled up the bottom. And we've seen

0:27:01.760 --> 0:27:05.040
<v Speaker 4>this in other places as well, folks doing customer support.

0:27:05.600 --> 0:27:07.119
<v Speaker 4>The example I'm thinking of is a kind of an

0:27:07.440 --> 0:27:11.520
<v Speaker 4>enterprise software product where the customers chat in through a chat window,

0:27:12.160 --> 0:27:16.000
<v Speaker 4>and then the company installed a tool that suggests responses

0:27:16.040 --> 0:27:18.199
<v Speaker 4>to the customer's chat. You don't have to use them,

0:27:18.359 --> 0:27:21.119
<v Speaker 4>but it will also not just suggest technical responses, but

0:27:21.200 --> 0:27:24.240
<v Speaker 4>polite responses and so on to keep the customer from

0:27:24.440 --> 0:27:29.240
<v Speaker 4>getting overheated. And the result is that it speeds the

0:27:29.320 --> 0:27:32.040
<v Speaker 4>rate at which people learn. So it used to take

0:27:32.080 --> 0:27:36.280
<v Speaker 4>people ten months to reach peak capacity. Now it takes

0:27:36.320 --> 0:27:40.480
<v Speaker 4>them about three months. They're somewhat faster when that's done,

0:27:40.720 --> 0:27:43.280
<v Speaker 4>so it's not that it eliminates the training or learning.

0:27:43.400 --> 0:27:47.080
<v Speaker 4>Everyone starts off bad at this job, but they get faster.

0:27:47.200 --> 0:27:50.040
<v Speaker 4>They converge towards expert level more quickly with this tool.

0:27:50.119 --> 0:27:54.320
<v Speaker 4>And also, really interestingly, people quit a lot less. And

0:27:54.400 --> 0:27:57.120
<v Speaker 4>the reason is, you know, customer service work is actually

0:27:57.520 --> 0:28:00.560
<v Speaker 4>really difficult. It's very heavy emotional labor, and you have

0:28:00.640 --> 0:28:03.080
<v Speaker 4>to take a lot of incoming abuse actually from customers.

0:28:03.119 --> 0:28:06.760
<v Speaker 4>It's hard to keep your cool. And the sentiment analysis

0:28:06.760 --> 0:28:09.240
<v Speaker 4>of this tool, of the chats that occurred through it,

0:28:09.280 --> 0:28:12.240
<v Speaker 4>is that it basically reduced the level of hostility from

0:28:12.320 --> 0:28:15.080
<v Speaker 4>customers to workers and from workers to customers. So it

0:28:15.080 --> 0:28:17.760
<v Speaker 4>actually did a lot of the emotional labor. So it

0:28:17.800 --> 0:28:20.960
<v Speaker 4>didn't eliminate the need for skills in doing this work,

0:28:20.960 --> 0:28:25.800
<v Speaker 4>but it enabled people to become more efficient, more rapidly,

0:28:26.240 --> 0:28:29.720
<v Speaker 4>with less stress. And so that's the good scenario. There's

0:28:29.760 --> 0:28:32.520
<v Speaker 4>a lot of work that needs to be done. And

0:28:32.840 --> 0:28:36.480
<v Speaker 4>right now, what are the most expensive things? The things

0:28:36.520 --> 0:28:38.600
<v Speaker 4>that are growing more and more costly all the time

0:28:38.720 --> 0:28:44.120
<v Speaker 4>are education, healthcare, legal services. Why is that? Why are

0:28:44.120 --> 0:28:47.160
<v Speaker 4>those things getting so expensive? Well, during the industrial era,

0:28:47.200 --> 0:28:54.280
<v Speaker 4>we got really efficient at manufacturing goods. Right, so TVs, automobiles, coffeemakers, right,

0:28:54.320 --> 0:28:59.280
<v Speaker 4>mobile phones, these things are actually remarkably good and relatively cheap. Why? Well,

0:28:59.440 --> 0:29:03.000
<v Speaker 4>we've automated them and the labor content is relatively low.

0:29:03.320 --> 0:29:08.080
<v Speaker 4>On the other hand, healthcare, education, law, Right, we've not

0:29:08.120 --> 0:29:11.080
<v Speaker 4>gotten any more efficient at those things, and they require

0:29:11.120 --> 0:29:13.880
<v Speaker 4>people who've gotten more and more expensive over time because

0:29:13.920 --> 0:29:15.880
<v Speaker 4>as we've automated the other work, the people who are

0:29:15.880 --> 0:29:19.440
<v Speaker 4>the degreed professionals have become the bottleneck. So that

0:29:19.680 --> 0:29:22.760
<v Speaker 4>slows the growth of productivity. It makes the cost of

0:29:22.800 --> 0:29:26.040
<v Speaker 4>living higher for the typical person. Right, the typical person is

0:29:26.040 --> 0:29:28.320
<v Speaker 4>not a lawyer, it's not a professor, it's not a doctor.

0:29:28.320 --> 0:29:31.000
<v Speaker 4>But they're paying for all those things. So if we

0:29:31.000 --> 0:29:34.640
<v Speaker 4>could enable more people without as much training, and I

0:29:34.640 --> 0:29:37.440
<v Speaker 4>don't mean no judgment, I mean some training. If we

0:29:37.440 --> 0:29:39.320
<v Speaker 4>could allow paralegals to do more legal work, if we

0:29:39.360 --> 0:29:42.280
<v Speaker 4>could allow nurse practitioners to do a larger range of

0:29:42.320 --> 0:29:45.000
<v Speaker 4>medical tasks. If we could enable people who are doing

0:29:45.560 --> 0:29:48.760
<v Speaker 4>working as contractors also to do more design. Right, if

0:29:48.760 --> 0:29:51.800
<v Speaker 4>we're enabling people who don't have computer science degrees to

0:29:51.800 --> 0:29:54.400
<v Speaker 4>do more software development, not only would that reduce the

0:29:54.560 --> 0:29:56.960
<v Speaker 4>cost of these expensive services, but it would improve the

0:29:57.040 --> 0:29:59.680
<v Speaker 4>quality of work that people could do. It would allow them

0:29:59.680 --> 0:30:04.160
<v Speaker 4>to take some expertise and make it go further. So

0:30:04.240 --> 0:30:05.320
<v Speaker 4>that's the good scenario.

0:30:05.680 --> 0:30:08.440
<v Speaker 1>Joe, I like the idea of using AI to reduce

0:30:08.600 --> 0:30:11.880
<v Speaker 1>emotional labor. I wonder if I can start automating some

0:30:12.400 --> 0:30:17.840
<v Speaker 1>responses on Twitter to toxic Bitcoin maximalists. That's interesting, Tracy.

0:30:17.960 --> 0:30:21.760
<v Speaker 2>The block button is right there. So there's so many

0:30:21.840 --> 0:30:24.160
<v Speaker 2>different questions now that I have in my mind. But

0:30:24.400 --> 0:30:27.040
<v Speaker 2>you know, look, we're only near the beginning. I mean

0:30:27.160 --> 0:30:29.560
<v Speaker 2>ChatGPT, which is sort of what's brought us all

0:30:29.600 --> 0:30:33.720
<v Speaker 2>into consciousness, was unveiled to the public in late twenty

0:30:33.800 --> 0:30:37.200
<v Speaker 2>twenty two, so not even two years into that. And this

0:30:37.360 --> 0:30:39.480
<v Speaker 2>sort of breakthrough that enabled it is just a few

0:30:39.520 --> 0:30:43.400
<v Speaker 2>years older than that. The concern would be, well, yes,

0:30:43.480 --> 0:30:48.080
<v Speaker 2>at this point some training plus AI enables many people

0:30:48.160 --> 0:30:51.120
<v Speaker 2>to become much more productive and have this sort of

0:30:51.160 --> 0:30:54.640
<v Speaker 2>output that was previously associated with people with years of experience.

0:30:55.280 --> 0:30:58.960
<v Speaker 2>Like the fear would be that in multiple generations down

0:30:58.960 --> 0:31:03.520
<v Speaker 2>the road, you don't even need that initial training first.

0:31:03.720 --> 0:31:06.520
<v Speaker 4>I fully agree. We're just at the beginning. The tools

0:31:06.560 --> 0:31:08.080
<v Speaker 4>are only so good, they're going to get much better.

0:31:08.360 --> 0:31:10.920
<v Speaker 4>Our understanding of how to use them is also very primitive.

0:31:10.960 --> 0:31:13.480
<v Speaker 4>We often don't know how to interact well with AI.

0:31:13.560 --> 0:31:15.520
<v Speaker 4>In fact, I could give you examples of cases where

0:31:15.520 --> 0:31:17.760
<v Speaker 4>it goes pretty badly, even though the tool is good.

0:31:18.320 --> 0:31:20.360
<v Speaker 4>So I think there are sort of two concerns built

0:31:20.360 --> 0:31:22.400
<v Speaker 4>into what you said. One is that basically, for now

0:31:22.520 --> 0:31:24.760
<v Speaker 4>it's a helper, and then eventually it's just your replacement.

0:31:25.400 --> 0:31:27.960
<v Speaker 4>And the other is that even if it just makes

0:31:28.000 --> 0:31:30.720
<v Speaker 4>everyone more efficient, eventually, well, we just saturate the world

0:31:30.760 --> 0:31:33.960
<v Speaker 4>with whatever that thing is, and then it's super cheap. Right. So,

0:31:34.000 --> 0:31:36.560
<v Speaker 4>there's only so many PowerPoint presentations the world can tolerate,

0:31:37.240 --> 0:31:39.240
<v Speaker 4>and if you get really fast at making them, eventually

0:31:39.240 --> 0:31:40.320
<v Speaker 4>people will pay you to stop.

0:31:41.360 --> 0:31:44.040
<v Speaker 2>Yes, we're there now, maybe, but anyway, keep going.

0:31:45.680 --> 0:31:48.160
<v Speaker 4>So I think that that will occur in some cases.

0:31:48.240 --> 0:31:50.320
<v Speaker 4>There's no question that in some cases the tool will

0:31:50.360 --> 0:31:53.120
<v Speaker 4>initially be a supplement and eventually be a replacement. Right.

0:31:53.240 --> 0:31:55.440
<v Speaker 4>So maybe air traffic controllers would be an example like that,

0:31:55.520 --> 0:31:58.240
<v Speaker 4>right where eventually almost all the air traffic control will

0:31:58.280 --> 0:32:01.200
<v Speaker 4>be done by machines. But I don't think every job

0:32:01.280 --> 0:32:03.640
<v Speaker 4>is like that. I don't think that's the case in medicine.

0:32:03.640 --> 0:32:06.960
<v Speaker 4>Medicine will be a hands on occupation for a very

0:32:07.000 --> 0:32:10.400
<v Speaker 4>long time. So will law, where there's a lot of

0:32:10.520 --> 0:32:13.600
<v Speaker 4>high stakes decision making, so will design. So I don't

0:32:13.760 --> 0:32:17.200
<v Speaker 4>think that we're going to automate everything away. I know

0:32:17.240 --> 0:32:19.800
<v Speaker 4>people think that, and I think it's a valid concern.

0:32:19.880 --> 0:32:22.400
<v Speaker 4>I don't think that's the most likely scenario. But I

0:32:22.400 --> 0:32:25.240
<v Speaker 4>also want to stress something that's said too little in

0:32:25.280 --> 0:32:28.000
<v Speaker 4>these discussions, which is, when you think about what you

0:32:28.080 --> 0:32:30.320
<v Speaker 4>can do with a new tool, most people think, well,

0:32:30.320 --> 0:32:32.240
<v Speaker 4>what can I automate? What is the thing that I'm

0:32:32.280 --> 0:32:34.440
<v Speaker 4>doing now that I could now have the machine do

0:32:34.600 --> 0:32:36.680
<v Speaker 4>for me? And that's important, and we do a lot

0:32:36.680 --> 0:32:40.560
<v Speaker 4>of automation, but automation is not the primary source of

0:32:40.640 --> 0:32:44.600
<v Speaker 4>how innovation improves our lives. Right. Many of the things

0:32:44.640 --> 0:32:48.560
<v Speaker 4>that we do with new tools is create new capabilities

0:32:48.960 --> 0:32:52.080
<v Speaker 4>that we didn't previously have. Right. So, airplanes did not

0:32:52.240 --> 0:32:54.680
<v Speaker 4>automate the way we used to fly. We just didn't

0:32:54.760 --> 0:32:58.120
<v Speaker 4>fly before we had airplanes. Right. The scanning electron microscope

0:32:58.280 --> 0:33:02.120
<v Speaker 4>didn't automate the way we used to look at subatomic particles.

0:33:02.440 --> 0:33:05.680
<v Speaker 4>We simply couldn't see them without that microscope. Right. So

0:33:06.120 --> 0:33:09.200
<v Speaker 4>think of the thought experiment of automating everything in ancient Greece,

0:33:09.560 --> 0:33:12.440
<v Speaker 4>you know, two thousand years ago. Even if you automated

0:33:12.520 --> 0:33:15.760
<v Speaker 4>everything in ancient Greece, it wouldn't be modern America, right,

0:33:16.080 --> 0:33:19.520
<v Speaker 4>It wouldn't have electricity, it wouldn't have computers, it wouldn't

0:33:19.560 --> 0:33:23.640
<v Speaker 4>have airplanes, it wouldn't have penicillin, it wouldn't have a

0:33:23.760 --> 0:33:27.560
<v Speaker 4>million tools and technologies that we take for granted. So the

0:33:27.640 --> 0:33:31.400
<v Speaker 4>most important applications of technology are to enable capabilities that

0:33:31.480 --> 0:33:34.080
<v Speaker 4>didn't previously exist, and I think AI will do that

0:33:34.160 --> 0:33:36.440
<v Speaker 4>as well. So you know, we couldn't be having this

0:33:36.520 --> 0:33:39.840
<v Speaker 4>conversation were it not for our computers. Right. If someone

0:33:39.840 --> 0:33:41.720
<v Speaker 4>took my computer away from me, I couldn't even do

0:33:41.800 --> 0:33:44.320
<v Speaker 4>my job, right. It's just my job wouldn't exist in

0:33:44.360 --> 0:33:46.800
<v Speaker 4>its current form. And so what we do with new

0:33:46.800 --> 0:33:51.160
<v Speaker 4>technology is create new capabilities, and then human expertise is often

0:33:51.240 --> 0:33:54.640
<v Speaker 4>needed to support those capabilities. Right, we didn't have pilots

0:33:54.680 --> 0:33:58.240
<v Speaker 4>before we had airplanes, and we didn't have pediatric oncologists

0:33:58.240 --> 0:34:01.080
<v Speaker 4>before we had all kinds of tools and knowledge to

0:34:01.280 --> 0:34:05.080
<v Speaker 4>treat cancer in children. And so as we

0:34:05.120 --> 0:34:10.040
<v Speaker 4>instantiate these new capabilities, we often require new human skills

0:34:10.040 --> 0:34:12.799
<v Speaker 4>and expertise that are valuable. And so much of what

0:34:12.840 --> 0:34:15.800
<v Speaker 4>we do with these tools is to change our lives

0:34:15.840 --> 0:34:18.839
<v Speaker 4>by pushing out the possibility set, rather than simply just

0:34:19.280 --> 0:34:21.640
<v Speaker 4>automating the things that we already do, and I think

0:34:21.640 --> 0:34:24.239
<v Speaker 4>AI will also be really important for that.

0:34:39.560 --> 0:34:42.480
<v Speaker 1>One thing I wanted to ask you is you are

0:34:42.800 --> 0:34:45.959
<v Speaker 1>very very clear in your piece that this is more

0:34:46.040 --> 0:34:51.000
<v Speaker 1>of an informed thesis than an actual forecast. And here

0:34:51.480 --> 0:34:53.880
<v Speaker 1>I am actually leaning on ChatGPT when I asked

0:34:53.880 --> 0:34:56.960
<v Speaker 1>it to poke holes in your argument. One of the

0:34:56.960 --> 0:34:59.720
<v Speaker 1>ones it spat out had to do with this exact question.

0:35:00.600 --> 0:35:05.440
<v Speaker 1>Are there specific measures or policies that we could be

0:35:05.480 --> 0:35:09.600
<v Speaker 1>doing right now to make the probability of this outcome

0:35:10.280 --> 0:35:15.160
<v Speaker 1>better rather than the sort of like destructive AI doomerism

0:35:15.200 --> 0:35:16.840
<v Speaker 1>outcome that everyone is worried about.

0:35:17.200 --> 0:35:20.280
<v Speaker 4>Yeah, so I appreciate your saying that. The future should

0:35:20.320 --> 0:35:23.040
<v Speaker 4>not be treated as a forecasting or prediction exercise. It

0:35:23.040 --> 0:35:26.000
<v Speaker 4>should be treated as a design problem. Because the future

0:35:26.080 --> 0:35:27.640
<v Speaker 4>is not like the weather that we just wait and

0:35:27.640 --> 0:35:29.960
<v Speaker 4>see what happens. Right, we're making our own weather. We

0:35:30.000 --> 0:35:32.439
<v Speaker 4>have enormous control over the future in which we live

0:35:32.760 --> 0:35:36.320
<v Speaker 4>and it depends on the investments and structures that we create today,

0:35:36.360 --> 0:35:39.719
<v Speaker 4>whether that's democracies, whether that's you know, education, whether that's

0:35:39.760 --> 0:35:42.239
<v Speaker 4>how we use tools and science, whether we use

0:35:42.239 --> 0:35:45.279
<v Speaker 4>fissionable material to make bombs or to make energy. Right,

0:35:45.280 --> 0:35:47.920
<v Speaker 4>we have lots and lots of agency here. So in

0:35:48.000 --> 0:35:51.200
<v Speaker 4>terms of using AI well. So first of all, let

0:35:51.239 --> 0:35:52.880
<v Speaker 4>me say what would be a metric: how would we

0:35:52.960 --> 0:35:55.759
<v Speaker 4>know we were using AI well? Because it's not like

0:35:55.800 --> 0:35:57.680
<v Speaker 4>carbon dioxide, where you know, we say, oh, we know

0:35:57.760 --> 0:35:59.799
<v Speaker 4>we're reducing carbon dioxide, you can just measure it, right,

0:35:59.800 --> 0:36:02.480
<v Speaker 4>How would we know we're using AI well? I would

0:36:02.520 --> 0:36:05.640
<v Speaker 4>say we know we're using it well when we see

0:36:05.880 --> 0:36:09.920
<v Speaker 4>people who don't have four-year college degrees doing work

0:36:09.920 --> 0:36:12.560
<v Speaker 4>that we would think of as expert decision making work,

0:36:12.600 --> 0:36:16.600
<v Speaker 4>whether that's coding, whether that's you know, medical vocational work,

0:36:16.840 --> 0:36:20.760
<v Speaker 4>whether that's design and contracting, or even whether it allows

0:36:20.840 --> 0:36:23.680
<v Speaker 4>skilled repair people to work on a broader range of

0:36:23.800 --> 0:36:26.719
<v Speaker 4>products or tools or engines or whatever. So that's my

0:36:26.920 --> 0:36:31.279
<v Speaker 4>metric of success, that it opens up new job opportunities

0:36:31.280 --> 0:36:34.839
<v Speaker 4>to people who are not at the absolute elite of

0:36:34.880 --> 0:36:37.759
<v Speaker 4>a field. How do we get there? So I think

0:36:37.760 --> 0:36:41.240
<v Speaker 4>that's a super central question. And I think most thoughts

0:36:41.239 --> 0:36:44.320
<v Speaker 4>about you know, policies about AI are about regulating, controlling,

0:36:44.400 --> 0:36:46.800
<v Speaker 4>and some of that has to happen, and I feel

0:36:46.840 --> 0:36:51.000
<v Speaker 4>reasonably confident that it will. This is much more about investing, right,

0:36:51.360 --> 0:36:53.440
<v Speaker 4>So say, look, you know, in the United States, for example,

0:36:53.880 --> 0:36:57.759
<v Speaker 4>about twenty percent of GDP, two in ten dollars, goes

0:36:57.760 --> 0:37:01.200
<v Speaker 4>to education and healthcare. More than half that money is

0:37:01.239 --> 0:37:04.279
<v Speaker 4>public money, so in fact, we have a lot of

0:37:04.280 --> 0:37:08.200
<v Speaker 4>control over how education and healthcare are delivered. So healthcare would

0:37:08.200 --> 0:37:09.920
<v Speaker 4>be the best place to start to say, all right,

0:37:10.040 --> 0:37:13.400
<v Speaker 4>let's redesign the tools or invest in the tools in

0:37:13.440 --> 0:37:16.520
<v Speaker 4>a way that enables more people to deliver this work.

0:37:16.560 --> 0:37:19.080
<v Speaker 4>And not only would that make better jobs, it would

0:37:19.080 --> 0:37:23.040
<v Speaker 4>also improve access to healthcare and potentially lower those costs. We

0:37:23.040 --> 0:37:25.640
<v Speaker 4>could do the same in education. How can we make education,

0:37:26.280 --> 0:37:29.360
<v Speaker 4>you know, make better use of teachers, provide better services

0:37:29.400 --> 0:37:33.320
<v Speaker 4>to students, and also make education more accessible, immersive, engaging

0:37:33.360 --> 0:37:35.600
<v Speaker 4>for adults. Right, we have lots of adults who need

0:37:35.640 --> 0:37:38.120
<v Speaker 4>to learn, and traditional classrooms are really not the best

0:37:38.120 --> 0:37:39.960
<v Speaker 4>place to do that. So I do think you have

0:37:40.040 --> 0:37:43.399
<v Speaker 4>to think about these moonshots and governments can invest in them.

0:37:43.680 --> 0:37:45.759
<v Speaker 4>That doesn't mean the government has to run them, but

0:37:45.880 --> 0:37:50.160
<v Speaker 4>you know, governments often fund basic science, governments fund education.

0:37:50.640 --> 0:37:52.520
<v Speaker 4>Most health innovation in the United States is paid for

0:37:52.600 --> 0:37:54.919
<v Speaker 4>by the National Institutes of Health, which is much much

0:37:54.960 --> 0:37:58.560
<v Speaker 4>larger than the National Science Foundation, for example. So I

0:37:58.600 --> 0:38:01.200
<v Speaker 4>think the biggest chance is to look for

0:38:01.200 --> 0:38:06.719
<v Speaker 4>those opportunities and then design with the intention of creating

0:38:07.120 --> 0:38:09.560
<v Speaker 4>a more effective way to structure work that uses the

0:38:09.600 --> 0:38:12.680
<v Speaker 4>tools and uses human skills better. And let me say,

0:38:12.719 --> 0:38:14.480
<v Speaker 4>you might say, well, you know, why doesn't this apply

0:38:14.520 --> 0:38:16.960
<v Speaker 4>equally well to the last era? So first of all,

0:38:16.960 --> 0:38:19.279
<v Speaker 4>we didn't design, and probably we should have done more.

0:38:19.320 --> 0:38:23.319
<v Speaker 4>But essentially, computers are good at following rules and so

0:38:23.840 --> 0:38:27.360
<v Speaker 4>they could replicate a lot of work that was just that,

0:38:27.520 --> 0:38:31.200
<v Speaker 4>but they weren't good at supplementing skills at enabling people

0:38:31.280 --> 0:38:35.480
<v Speaker 4>to do these high stakes decision making tasks. So it's important. Unusually,

0:38:35.520 --> 0:38:38.520
<v Speaker 4>AI is almost the inverse of traditional computing. Right if

0:38:38.560 --> 0:38:40.480
<v Speaker 4>I told you I have the most advanced technology in

0:38:40.480 --> 0:38:42.399
<v Speaker 4>the world, but you know, it really can't do math

0:38:42.400 --> 0:38:45.000
<v Speaker 4>and it's not reliable with facts and figures, you would say, well,

0:38:45.040 --> 0:38:46.799
<v Speaker 4>what kind of technology is that? And I would say, well,

0:38:46.800 --> 0:38:50.279
<v Speaker 4>that's artificial intelligence. It is really quite the opposite. So

0:38:50.360 --> 0:38:53.200
<v Speaker 4>I think it has quite different capabilities. And in some

0:38:53.239 --> 0:38:57.400
<v Speaker 4>sense you could say traditional computing was really complementary to

0:38:57.840 --> 0:39:00.600
<v Speaker 4>you know, the most elite professionals. And it's quite possible

0:39:00.640 --> 0:39:03.080
<v Speaker 4>that AI will enable more people to compete with them,

0:39:03.520 --> 0:39:07.719
<v Speaker 4>and that's a really good thing because that improves the

0:39:07.800 --> 0:39:10.040
<v Speaker 4>quality of services and improves the quality of jobs for

0:39:10.120 --> 0:39:12.280
<v Speaker 4>people who were not at that leading edge.

0:39:12.760 --> 0:39:14.839
<v Speaker 2>This is I think the key thing because in your

0:39:14.960 --> 0:39:17.640
<v Speaker 2>piece and in other testimony you've given, you've talked about

0:39:17.640 --> 0:39:20.640
<v Speaker 2>this idea of collective decision making, And when I think

0:39:20.640 --> 0:39:24.480
<v Speaker 2>about modern American society or modern society in general, I

0:39:24.520 --> 0:39:28.640
<v Speaker 2>don't necessarily think that collective decision making is something we're

0:39:28.680 --> 0:39:30.239
<v Speaker 2>particularly strong on.

0:39:30.400 --> 0:39:30.879
<v Speaker 2>So if the

0:39:30.840 --> 0:39:34.879
<v Speaker 2>future depends on making good collective decisions, then that makes

0:39:34.920 --> 0:39:38.160
<v Speaker 2>me anxious. But you know, you talk about investment, but

0:39:38.280 --> 0:39:41.960
<v Speaker 2>it sounds like the other element here. And you mentioned

0:39:42.000 --> 0:39:44.920
<v Speaker 2>that the rise of the nurse practitioner had to happen

0:39:45.040 --> 0:39:47.960
<v Speaker 2>over the kicking and screaming of the American Medical Association,

0:39:48.360 --> 0:39:52.040
<v Speaker 2>which represents the top strata of healthcare professionals, the

0:39:52.280 --> 0:39:55.360
<v Speaker 2>elite doctors. How much of this is going to be

0:39:56.000 --> 0:40:00.880
<v Speaker 2>a political fight ultimately in which the doctors and the

0:40:01.000 --> 0:40:07.040
<v Speaker 2>lawyers and the podcasters collectively resist other people who are

0:40:07.120 --> 0:40:09.440
<v Speaker 2>using these tools to do our jobs. And how much

0:40:09.520 --> 0:40:12.319
<v Speaker 2>is that really like where the collective fight is going

0:40:12.360 --> 0:40:12.760
<v Speaker 2>to happen.

0:40:13.520 --> 0:40:15.360
<v Speaker 4>Yeah, if we have to take on the podcasters.

0:40:15.400 --> 0:40:18.360
<v Speaker 2>I think we're doomed, but yeah, yeah, we're going to

0:40:18.400 --> 0:40:20.000
<v Speaker 2>fight this kicking and screaming for sure.

0:40:20.040 --> 0:40:23.120
<v Speaker 4>The AMA is one thing, yeah, but the podcasters, that's

0:40:23.160 --> 0:40:26.040
<v Speaker 4>a whole different army. Some of that will absolutely be

0:40:26.120 --> 0:40:29.440
<v Speaker 4>turf warfare, right, the professions. We think, oh, you know,

0:40:29.560 --> 0:40:31.840
<v Speaker 4>you know, oil companies and so on don't like competition

0:40:31.920 --> 0:40:33.839
<v Speaker 4>and they're always trying to rig the market. But in fact,

0:40:33.880 --> 0:40:36.680
<v Speaker 4>the professions rig the markets as well, right? What

0:40:36.880 --> 0:40:39.440
<v Speaker 4>a profession is, actually what it means is an occupation

0:40:39.480 --> 0:40:41.719
<v Speaker 4>that gets to certify its own members and decide who's

0:40:41.760 --> 0:40:45.280
<v Speaker 4>in and who's out, right. And so it's the medical

0:40:45.400 --> 0:40:49.040
<v Speaker 4>profession that creates training standards and certification standards. It's universities

0:40:49.040 --> 0:40:52.360
<v Speaker 4>that decide what skills enable you to have a PhD

0:40:52.880 --> 0:40:57.160
<v Speaker 4>and therefore become a professor. So it absolutely is going

0:40:57.239 --> 0:40:59.960
<v Speaker 4>to be a challenge. Like lawyers will try very hard

0:41:00.080 --> 0:41:02.440
<v Speaker 4>to say, well, that can't be a legal document unless a

0:41:02.480 --> 0:41:05.799
<v Speaker 4>lawyer has signed it, someone with a JD who has

0:41:05.880 --> 0:41:09.239
<v Speaker 4>passed the bar. So that will be a source of

0:41:09.280 --> 0:41:12.480
<v Speaker 4>resistance for sure. On the other hand, if there's a

0:41:12.560 --> 0:41:14.839
<v Speaker 4>really good competing alternative, if you can say, look, these

0:41:14.920 --> 0:41:18.240
<v Speaker 4>nurse practitioners can do a lot of this diagnostic work.

0:41:18.440 --> 0:41:20.120
<v Speaker 4>You know, they work well with doctors, but they can

0:41:20.160 --> 0:41:22.359
<v Speaker 4>do some things that doctors would be more expensive doing,

0:41:22.440 --> 0:41:24.920
<v Speaker 4>and you can make that case. Or a paralegal using

0:41:24.960 --> 0:41:28.239
<v Speaker 4>the software can create a lot of routine documents, or

0:41:28.360 --> 0:41:32.919
<v Speaker 4>a software developer using GitHub Copilot can go pretty far.

0:41:33.520 --> 0:41:36.640
<v Speaker 4>Then that creates a lot of economic pressure that tends,

0:41:36.680 --> 0:41:40.560
<v Speaker 4>over long periods of time, to erode these guilds. So

0:41:40.719 --> 0:41:43.400
<v Speaker 4>I think that they will not go quietly into this

0:41:43.480 --> 0:41:47.759
<v Speaker 4>dark night. But if the models are successful, it does

0:41:47.880 --> 0:41:52.200
<v Speaker 4>create a strong incentive for that eventually to become adopted.

0:41:52.840 --> 0:41:56.880
<v Speaker 1>I think part of the concern around AI has to

0:41:56.920 --> 0:42:01.839
<v Speaker 1>do also with how any productivity gains are actually distributed

0:42:01.880 --> 0:42:05.280
<v Speaker 1>and whether or not people are compensated for doing more.

0:42:06.080 --> 0:42:09.280
<v Speaker 1>And I asked ChatGPT, obviously, to provide a summary

0:42:09.360 --> 0:42:12.080
<v Speaker 1>of Das Kapital before I came on here. Now, I

0:42:12.120 --> 0:42:14.440
<v Speaker 1>do think there is this concern about, okay, in an

0:42:14.480 --> 0:42:17.840
<v Speaker 1>ideal scenario, we're all more efficient in terms of our labor,

0:42:18.200 --> 0:42:22.440
<v Speaker 1>and maybe some types of work are even better to perform.

0:42:22.520 --> 0:42:26.359
<v Speaker 1>Maybe we reduce that emotional labor. But aside from that

0:42:26.400 --> 0:42:31.200
<v Speaker 1>particular benefit, how do we distribute the additional productivity gains?

0:42:31.360 --> 0:42:34.960
<v Speaker 1>Is there any evidence or any reason to believe that

0:42:35.120 --> 0:42:39.440
<v Speaker 1>these benefits are going to go to labor, to actual

0:42:39.480 --> 0:42:42.360
<v Speaker 1>workers and individuals versus to companies and capital.

0:42:42.719 --> 0:42:44.960
<v Speaker 4>Yeah. Good. So let me give you two answers to that question.

0:42:45.280 --> 0:42:49.440
<v Speaker 4>One is it really does depend on institutions, not just

0:42:49.680 --> 0:42:51.520
<v Speaker 4>on decentralized labor markets.

0:42:51.560 --> 0:42:51.680
<v Speaker 1>Right.

0:42:51.719 --> 0:42:55.799
<v Speaker 4>So if you compare the US versus Germany versus Scandinavia, right,

0:42:55.880 --> 0:42:58.520
<v Speaker 4>we have so much in common. We have the same technologies,

0:42:58.719 --> 0:43:00.680
<v Speaker 4>we have the same aging population, we have the same

0:43:00.760 --> 0:43:03.280
<v Speaker 4>rising education levels, we have the same China as a competitor,

0:43:03.400 --> 0:43:05.960
<v Speaker 4>we have lots of immigration. And yet these countries bake

0:43:06.080 --> 0:43:08.200
<v Speaker 4>very different cakes with the same ingredients. Right. The US

0:43:08.280 --> 0:43:11.239
<v Speaker 4>is kind of cowboy capitalism, very high levels of inequality and

0:43:11.320 --> 0:43:14.040
<v Speaker 4>disparity and not so much sharing with workers. And if

0:43:14.080 --> 0:43:17.400
<v Speaker 4>you look at Scandinavia or Germany, it's much more cuddly capitalism. Right,

0:43:17.400 --> 0:43:20.800
<v Speaker 4>it's not nearly as unequal. And that's really a question

0:43:20.920 --> 0:43:25.560
<v Speaker 4>of tax regulation, it's a question of the role of

0:43:25.680 --> 0:43:28.759
<v Speaker 4>labor unions and labor voice, and it's a question of

0:43:28.760 --> 0:43:30.960
<v Speaker 4>social norms. And so I guess we should not take

0:43:30.960 --> 0:43:33.400
<v Speaker 4>it as inevitable that the outcomes we have are the

0:43:33.400 --> 0:43:35.960
<v Speaker 4>only ones the market could tolerate. But at the same time,

0:43:35.960 --> 0:43:40.000
<v Speaker 4>we should recognize that without those sort of countervailing forces,

0:43:40.360 --> 0:43:43.279
<v Speaker 4>the outcomes can look pretty bad. So I do think

0:43:43.360 --> 0:43:47.720
<v Speaker 4>you know, I'm happy about the rise of collective bargaining

0:43:47.800 --> 0:43:49.799
<v Speaker 4>again in the United States, although it's from a very

0:43:49.800 --> 0:43:53.160
<v Speaker 4>low level. I'm happy that more states are passing minimum

0:43:53.200 --> 0:43:57.600
<v Speaker 4>wage regulations. I'm happy that the Biden administration is trying

0:43:57.640 --> 0:44:00.319
<v Speaker 4>to sort of beef up the Occupational Safety and Health

0:44:00.320 --> 0:44:04.400
<v Speaker 4>Administration and the Equal Employment Opportunity Commission and so on.

0:44:04.480 --> 0:44:06.759
<v Speaker 4>So I think those things matter a great deal. So

0:44:07.160 --> 0:44:09.200
<v Speaker 4>one should not take it for granted that just because

0:44:09.239 --> 0:44:12.560
<v Speaker 4>productivity rises, workers benefit. In many countries that's true, but

0:44:12.600 --> 0:44:13.840
<v Speaker 4>not so much in the United States.

0:44:13.840 --> 0:44:16.320
<v Speaker 2>But I want to press you right here on this point,

0:44:16.360 --> 0:44:20.239
<v Speaker 2>because why doesn't this undermine much of the argument? If

0:44:20.280 --> 0:44:23.280
<v Speaker 2>these different countries, whether it's Sweden and Germany or the US,

0:44:23.680 --> 0:44:27.160
<v Speaker 2>can have very different sort of distributional outcomes with the

0:44:27.239 --> 0:44:31.280
<v Speaker 2>same cake ingredients, with roughly similar technology and labor markets,

0:44:31.800 --> 0:44:36.239
<v Speaker 2>why then take the assumption that it's the technology that

0:44:36.360 --> 0:44:40.800
<v Speaker 2>has the distributional impact rather than just those policies themselves.

0:44:41.440 --> 0:44:44.239
<v Speaker 4>Okay, this is an excellent question. So I think the

0:44:44.239 --> 0:44:48.000
<v Speaker 4>technology provides headwinds and tailwinds with which policy can work.

0:44:48.160 --> 0:44:50.800
<v Speaker 4>So all of these countries I mentioned have become more unequal.

0:44:51.040 --> 0:44:51.879
<v Speaker 4>Okay, all of

0:44:51.800 --> 0:44:54.200
<v Speaker 4>these countries have seen a decline in middle-skill work.

0:44:54.360 --> 0:44:57.439
<v Speaker 4>All of these countries have seen the mean wage rise

0:44:57.480 --> 0:45:00.480
<v Speaker 4>relative to the median, meaning the upper wages have risen

0:45:00.680 --> 0:45:03.080
<v Speaker 4>more than the center. But the degree to which countries

0:45:03.080 --> 0:45:06.080
<v Speaker 4>have pushed back against that is a function of their institutions.

0:45:06.440 --> 0:45:09.120
<v Speaker 4>In the prior era, prior to computerization, all of these

0:45:09.120 --> 0:45:11.840
<v Speaker 4>countries saw their middle classes grow together along with the

0:45:11.920 --> 0:45:15.279
<v Speaker 4>upper class and lower class, and so the industrial era

0:45:15.480 --> 0:45:19.759
<v Speaker 4>prior to computerization, was very friendly, sort of intrinsically, towards the

0:45:19.760 --> 0:45:22.719
<v Speaker 4>middle class. The computer era was much much less so,

0:45:23.400 --> 0:45:26.640
<v Speaker 4>and then policy helped ameliorate those impacts, and much less

0:45:26.640 --> 0:45:28.440
<v Speaker 4>so in the United States. So I do think that

0:45:28.560 --> 0:45:32.120
<v Speaker 4>technology plays a role. I just think we should simultaneously believe

0:45:32.320 --> 0:45:35.600
<v Speaker 4>that these underlying forces of technology and globalization create strong

0:45:35.640 --> 0:45:38.080
<v Speaker 4>pressures in one way or another, and then policy can

0:45:38.120 --> 0:45:41.920
<v Speaker 4>shape how those pressures play out. It won't undo them,

0:45:41.960 --> 0:45:44.799
<v Speaker 4>but it can channel them more or less effectively. So

0:45:44.840 --> 0:45:47.200
<v Speaker 4>you're asking both the right questions, and I think the

0:45:47.280 --> 0:45:50.280
<v Speaker 4>answer is both are true, but we should think it's

0:45:50.360 --> 0:45:53.960
<v Speaker 4>not one or the other. And in some periods those

0:45:54.000 --> 0:45:56.680
<v Speaker 4>forces are very favorable and policy has to do less

0:45:56.680 --> 0:46:00.520
<v Speaker 4>hard work, and other periods they're relatively unfavorable. Policy, if

0:46:00.520 --> 0:46:03.040
<v Speaker 4>it's working well, has to do more work. The other

0:46:03.120 --> 0:46:04.319
<v Speaker 4>point I want to make, and this is why I'm

0:46:04.360 --> 0:46:08.160
<v Speaker 4>so focused on expertise, is expert work is intrinsically well paid.

0:46:08.600 --> 0:46:12.560
<v Speaker 4>It's scarce, and it's necessary. And that's why if we

0:46:12.640 --> 0:46:14.239
<v Speaker 4>live in a world where all the work can be

0:46:14.280 --> 0:46:19.279
<v Speaker 4>done by machines, we're completely dependent upon redistribution, right, on the

0:46:19.320 --> 0:46:22.200
<v Speaker 4>people who own the machines to share with everyone else.

0:46:22.239 --> 0:46:26.080
<v Speaker 4>And I'm not so optimistic about people's excitement about sharing

0:46:26.080 --> 0:46:28.000
<v Speaker 4>with everyone else. And even when people say, oh, we'll

0:46:28.000 --> 0:46:31.720
<v Speaker 4>have universal basic income, they really mean universal basic income

0:46:32.200 --> 0:46:34.279
<v Speaker 4>within the borders of the United States. They don't mean

0:46:34.360 --> 0:46:36.600
<v Speaker 4>universal basic income for the rest of the world. Right,

0:46:36.680 --> 0:46:40.200
<v Speaker 4>So people's notion of sharing is very limited. So I

0:46:40.480 --> 0:46:44.440
<v Speaker 4>do think it's extremely important that labor remains valuable, and

0:46:44.480 --> 0:46:48.640
<v Speaker 4>that's actually an achievement of the industrialized world that so

0:46:48.719 --> 0:46:52.440
<v Speaker 4>many people can make a good, reasonable standard of living based

0:46:52.520 --> 0:46:55.759
<v Speaker 4>on their skills, and so technologies and tools that make

0:46:55.840 --> 0:46:59.680
<v Speaker 4>human expertise more valuable by allowing it to go further are

0:46:59.719 --> 0:47:04.880
<v Speaker 4>really favorable towards income distribution. Technologies that just automate away work,

0:47:05.400 --> 0:47:08.600
<v Speaker 4>even though they raise productivity, are not favorable to the

0:47:08.640 --> 0:47:11.560
<v Speaker 4>income distribution, because the gains go to ownership of capital,

0:47:11.560 --> 0:47:15.120
<v Speaker 4>and ownership of capital is intrinsically more centralized than ownership

0:47:15.160 --> 0:47:18.719
<v Speaker 4>of labor, because in a country that doesn't have slavery

0:47:18.920 --> 0:47:23.480
<v Speaker 4>and doesn't have labor coercion, everyone owns one worker, themselves,

0:47:23.920 --> 0:47:28.799
<v Speaker 4>and so that inherently creates some tendency towards equality when

0:47:29.080 --> 0:47:30.120
<v Speaker 4>labor is valuable.

0:47:30.880 --> 0:47:35.160
<v Speaker 2>The efforts of the Biden administration to reindustrialize the US

0:47:35.239 --> 0:47:36.880
<v Speaker 2>and sort of counter some of the effects of the

0:47:36.960 --> 0:47:38.920
<v Speaker 2>last twenty years that you wrote about, do you have

0:47:38.960 --> 0:47:41.640
<v Speaker 2>any optimism that those trends can be reversed. I know

0:47:41.680 --> 0:47:44.000
<v Speaker 2>this is a very simple, straightforward question that you're going

0:47:44.040 --> 0:47:46.480
<v Speaker 2>to answer in about thirty seconds, so good luck.

0:47:46.560 --> 0:47:49.680
<v Speaker 4>I don't think they can be completely reversed, but you

0:47:49.760 --> 0:47:52.600
<v Speaker 4>can stem the tide, right, So it's not that the

0:47:52.680 --> 0:47:56.440
<v Speaker 4>US has stabilized. The US continues to lose industrial capacity, right,

0:47:56.480 --> 0:47:59.560
<v Speaker 4>whether it's in semiconductors, whether it's automobiles, whether it's aircraft,

0:47:59.600 --> 0:48:04.080
<v Speaker 4>and so on. So I think reinvesting can

0:48:04.440 --> 0:48:07.480
<v Speaker 4>help solidify those sectors, and I think it's very important

0:48:07.520 --> 0:48:09.280
<v Speaker 4>to do so, because now they're not just a question

0:48:09.320 --> 0:48:12.680
<v Speaker 4>of jobs. It really is about leadership of the key

0:48:13.120 --> 0:48:18.000
<v Speaker 4>profit and idea generating activities in the modern world, and

0:48:18.040 --> 0:48:22.520
<v Speaker 4>we don't want to lose a leadership place in those activities.

0:48:23.440 --> 0:48:27.680
<v Speaker 2>Good concise answer to what probably could be multiple future episodes,

0:48:27.960 --> 0:48:30.959
<v Speaker 2>David Autor, thank you so much for coming on Odd Lots.

0:48:31.040 --> 0:48:33.640
<v Speaker 2>That really was a fascinating conversation. We probably could get

0:48:33.719 --> 0:48:37.359
<v Speaker 2>multiple episodes out of this conversation with you, but really

0:48:37.360 --> 0:48:38.200
<v Speaker 2>appreciate your time.

0:48:38.360 --> 0:48:40.120
<v Speaker 4>Thank you very much. Nice to speak with both of you.

0:48:40.120 --> 0:48:54.880
<v Speaker 3>Have a good day. Tracy, I'm convinced. I think everything

0:48:54.920 --> 0:48:55.399
<v Speaker 3>will be fine.

0:48:55.640 --> 0:48:57.240
<v Speaker 2>I'm no longer worried.

0:48:57.360 --> 0:48:59.479
<v Speaker 1>Well, first of all, I would say it was nice

0:48:59.520 --> 0:49:03.640
<v Speaker 1>to hear a slightly more optimistic argument from David. There

0:49:03.680 --> 0:49:06.400
<v Speaker 1>were a lot of quotable sentences in there. So I

0:49:06.480 --> 0:49:09.680
<v Speaker 1>like the idea that everyone's their own individual capitalist in

0:49:09.719 --> 0:49:12.400
<v Speaker 1>the sense that we each have one worker to direct

0:49:12.600 --> 0:49:15.560
<v Speaker 1>and get the most money out of. So that's how

0:49:15.560 --> 0:49:19.759
<v Speaker 1>I'm going to start thinking. Cuddly capitalism, which, as our

0:49:19.840 --> 0:49:23.080
<v Speaker 1>producer Kale observes, is a much more appealing name than

0:49:23.120 --> 0:49:26.280
<v Speaker 1>the Swedish model. I like that. What I would say

0:49:26.800 --> 0:49:31.640
<v Speaker 1>is again putting on my cynical journalist hat, and I

0:49:31.640 --> 0:49:34.000
<v Speaker 1>guess I don't have an opinion because I am a

0:49:34.080 --> 0:49:37.840
<v Speaker 1>journalist whose expertise is about to be automated away. But

0:49:38.400 --> 0:49:42.200
<v Speaker 1>my non-consensus take here, or my sort of hot

0:49:42.280 --> 0:49:45.319
<v Speaker 1>take here, is that I agree with David that we

0:49:45.440 --> 0:49:48.359
<v Speaker 1>are going to get more jobs out of AI, and

0:49:48.400 --> 0:49:52.239
<v Speaker 1>probably more than a lot of people currently anticipate. I

0:49:52.280 --> 0:49:57.000
<v Speaker 1>guess I'm less convinced about how useful those jobs are

0:49:57.040 --> 0:49:59.160
<v Speaker 1>going to be. So going back to his point about

0:49:59.160 --> 0:50:02.480
<v Speaker 1>how do we measure how well we're using AI, I

0:50:02.520 --> 0:50:05.160
<v Speaker 1>have a feeling that a lot of it is going

0:50:05.200 --> 0:50:07.799
<v Speaker 1>to end up basically creating a whole new layer of

0:50:08.280 --> 0:50:10.840
<v Speaker 1>BS jobs that don't actually do much. So there's going

0:50:10.920 --> 0:50:14.520
<v Speaker 1>to be all these decision making bodies attached to AI.

0:50:14.960 --> 0:50:19.520
<v Speaker 1>There's going to be big discussions about how you implement AI: fairness,

0:50:19.800 --> 0:50:22.839
<v Speaker 1>litigating its results, and things like that. I guess I'm

0:50:22.840 --> 0:50:25.560
<v Speaker 1>a little bit pessimistic about the ability of AI to

0:50:25.719 --> 0:50:29.640
<v Speaker 1>generate additional bureaucracy in addition to additional productivity.

0:50:30.120 --> 0:50:32.440
<v Speaker 2>The other term that was great was when you said

0:50:32.480 --> 0:50:34.960
<v Speaker 2>the future is not like the weather. Yeah, but also

0:50:35.320 --> 0:50:38.160
<v Speaker 2>like I am worried about any notion that to achieve

0:50:38.239 --> 0:50:41.719
<v Speaker 2>the good outcome, the good equilibrium, we have to make good,

0:50:41.840 --> 0:50:45.680
<v Speaker 2>correct collective decisions because I have almost zero confidence in

0:50:45.960 --> 0:50:48.640
<v Speaker 2>whether it's just the US specifically or globally, to make

0:50:49.040 --> 0:50:52.600
<v Speaker 2>collective decisions. I do think like going after like these

0:50:52.840 --> 0:50:56.520
<v Speaker 2>guilds like the American Medical Association, which for all of

0:50:56.560 --> 0:50:58.960
<v Speaker 2>the rise of nurse practitioners, it doesn't seem like we're

0:50:59.000 --> 0:51:01.560
<v Speaker 2>doing great on like bending the cost of healthcare down

0:51:01.680 --> 0:51:05.040
<v Speaker 2>or really having healthcare capacity. That's going to be

0:51:05.040 --> 0:51:07.000
<v Speaker 2>like really tough. And those fights are going to be

0:51:07.080 --> 0:51:10.200
<v Speaker 2>really intense, whether it's with lawyers, whether it's with doctors,

0:51:10.239 --> 0:51:13.319
<v Speaker 2>whether it's with teachers, whether it's with podcasters, whether it's

0:51:13.320 --> 0:51:16.680
<v Speaker 2>professional architects, et cetera. Like, those fights are going to

0:51:16.719 --> 0:51:22.520
<v Speaker 2>be extremely intense. But like the basic intuition sounds very

0:51:22.560 --> 0:51:24.759
<v Speaker 2>compelling to me. The other thing is like you know

0:51:24.800 --> 0:51:27.440
<v Speaker 2>this idea of like oh, yeah, some training plus AI,

0:51:27.480 --> 0:51:29.440
<v Speaker 2>like I am worried, like maybe you just won't need

0:51:29.440 --> 0:51:31.920
<v Speaker 2>the training, and maybe it's just AI from the start.

0:51:32.040 --> 0:51:33.160
<v Speaker 3>So I don't know.

0:51:33.640 --> 0:51:35.759
<v Speaker 1>Well, in some respects, I think that would almost be

0:51:36.320 --> 0:51:40.719
<v Speaker 1>a better outcome in terms of democratizing AI. But yeah,

0:51:40.719 --> 0:51:44.120
<v Speaker 1>there are so many questions, uncertainty, as you mentioned in

0:51:44.160 --> 0:51:47.279
<v Speaker 1>the intro, lots of different takes at the moment. I

0:51:47.320 --> 0:51:49.880
<v Speaker 1>guess we'll see how it plays out and whether or

0:51:49.920 --> 0:51:52.200
<v Speaker 1>not you and I have jobs in ten years' time.

0:51:52.239 --> 0:51:54.879
<v Speaker 2>We'll see. Well, we'll have David back when we're just like.

0:51:55.000 --> 0:51:58.440
<v Speaker 1>When we're automated voices. Yeah, exactly, all right, shall we

0:51:58.520 --> 0:51:59.239
<v Speaker 1>leave it there for now?

0:51:59.320 --> 0:52:00.000
<v Speaker 3>Let's leave it there.

0:52:00.120 --> 0:52:03.160
<v Speaker 1>Okay. This has been another episode of the Odd Lots podcast.

0:52:03.239 --> 0:52:06.000
<v Speaker 1>I'm Tracy Alloway. You can follow me at Tracy Alloway.

0:52:06.120 --> 0:52:08.800
<v Speaker 2>And I'm Joe Wisenthal. You can follow me at the Stalwart.

0:52:09.040 --> 0:52:12.400
<v Speaker 2>Follow our guest David Autor. He's at David Autor. Follow

0:52:12.440 --> 0:52:16.200
<v Speaker 2>our producers Carmen Rodriguez at Carman Armann, Dashiell Bennett at

0:52:16.320 --> 0:52:19.000
<v Speaker 2>Dashbot, and Kale Brooks at Kale Brooks. And thank you

0:52:19.040 --> 0:52:22.400
<v Speaker 2>to our producer Moses Andam. For more Odd Lots content, go

0:52:22.440 --> 0:52:24.680
<v Speaker 2>to Bloomberg dot com slash odd Lots, where we have

0:52:24.719 --> 0:52:27.160
<v Speaker 2>a blog, we post transcripts, and we have a weekly

0:52:27.239 --> 0:52:30.279
<v Speaker 2>newsletter and you can chat with fellow listeners twenty four

0:52:30.280 --> 0:52:33.400
<v Speaker 2>seven in the Discord, discord dot gg slash odd lots.

0:52:33.400 --> 0:52:36.160
<v Speaker 2>There's even an AI room in there where people are

0:52:36.160 --> 0:52:38.960
<v Speaker 2>talking about all these things. So I imagine there will be

0:52:39.000 --> 0:52:40.479
<v Speaker 2>some conversation about this there.

0:52:40.719 --> 0:52:43.200
<v Speaker 1>And if you enjoy odd Lots, if you like it

0:52:43.239 --> 0:52:46.240
<v Speaker 1>when we do deep dives into AI, how it works,

0:52:46.280 --> 0:52:48.800
<v Speaker 1>what it means for the economy and society, then please

0:52:48.920 --> 0:52:52.239
<v Speaker 1>leave us a positive review on your favorite podcast platform.

0:52:52.440 --> 0:52:55.279
<v Speaker 1>And remember, if you are a Bloomberg subscriber, you can

0:52:55.320 --> 0:52:58.879
<v Speaker 1>listen to all of our episodes absolutely ad free. All

0:52:58.920 --> 0:53:01.520
<v Speaker 1>you need to do is connect your Bloomberg subscription

0:53:01.680 --> 0:53:04.080
<v Speaker 1>to Apple Podcasts. Thanks for listening.