1 00:00:00,320 --> 00:00:02,920 Speaker 1: So like literally a week and a half ago, hadn't 2 00:00:02,960 --> 00:00:05,440 Speaker 1: logged on to Zoom for maybe maybe two weeks. Right, 3 00:00:05,480 --> 00:00:08,039 Speaker 1: it was around Thanksgiving, and then I was traveling and 4 00:00:08,200 --> 00:00:11,879 Speaker 1: I had the Zoom meeting, and I open it, I 5 00:00:11,920 --> 00:00:14,040 Speaker 1: click on the link, you know, a minute before it 6 00:00:14,080 --> 00:00:17,439 Speaker 1: starts because I'm early, I am professional, And of course 7 00:00:17,440 --> 00:00:20,400 Speaker 1: it starts updating itself because it needed a software update. 8 00:00:20,440 --> 00:00:22,919 Speaker 1: And I'm frantically texting everybody who I'm supposed to be 9 00:00:22,960 --> 00:00:26,239 Speaker 1: meeting with, like blaming Zoom as if it's Zoom's fault. Oh, 10 00:00:26,280 --> 00:00:28,440 Speaker 1: of course, now Zoom decides to do a software update. 11 00:00:28,480 --> 00:00:33,080 Speaker 2: Blah blah blah. No, I knew in my heart that this. 12 00:00:33,120 --> 00:00:35,199 Speaker 1: Was a me issue, right, You knew you had a 13 00:00:35,280 --> 00:00:37,599 Speaker 1: Zoom meeting, log on ten minutes early because it might 14 00:00:37,640 --> 00:00:41,080 Speaker 1: need a software update. But that's like how young and 15 00:00:41,120 --> 00:00:44,800 Speaker 1: old I am simultaneously, right, old enough that I'm trying to 16 00:00:44,800 --> 00:00:47,479 Speaker 1: blame Zoom for a thing that's clearly my issue, and 17 00:00:47,560 --> 00:00:49,919 Speaker 1: young enough that, like, you know, we all had roommates 18 00:00:50,040 --> 00:00:52,960 Speaker 1: who did dial-up porn because that's what the 19 00:00:53,040 --> 00:00:57,880 Speaker 1: nineties were like. But as we think about things more 20 00:00:57,880 --> 00:00:59,720 Speaker 1: and more, as society, if you want to use 21 00:00:59,720 --> 00:01:02,279 Speaker 1: the word, progresses, we've all thought about AI. As 22 00:01:02,320 --> 00:01:05,520 Speaker 1: an actor, of course, it's terrifying, the idea that robots 23 00:01:05,560 --> 00:01:07,679 Speaker 1: and computers could take all of our work. But it's 24 00:01:07,720 --> 00:01:10,480 Speaker 1: not just actors. No matter what you do for a living, 25 00:01:10,520 --> 00:01:13,679 Speaker 1: the humanity of your work, the interpersonal skills, the taking 26 00:01:13,720 --> 00:01:16,240 Speaker 1: pride in your job, even if that means you're in 27 00:01:16,280 --> 00:01:18,720 Speaker 1: a cubicle all day, whatever it is that you're doing. 28 00:01:19,120 --> 00:01:22,320 Speaker 1: The fear that maybe technology could replace us is like 29 00:01:22,440 --> 00:01:23,199 Speaker 1: a real thing. 30 00:01:23,520 --> 00:01:24,360 Speaker 2: It's not just creepy. 31 00:01:24,360 --> 00:01:27,120 Speaker 1: It feels like it's getting out of control, and it 32 00:01:27,160 --> 00:01:29,400 Speaker 1: feels like the heart of the humanity that we all 33 00:01:29,440 --> 00:01:32,360 Speaker 1: have is being questioned right now. So I wanted to have 34 00:01:32,560 --> 00:01:35,280 Speaker 1: a very interesting guest on who could help us sort 35 00:01:35,319 --> 00:01:38,440 Speaker 1: through all of that, and that is journalist Jacob Goldstein, 36 00:01:38,600 --> 00:01:41,520 Speaker 1: who tells me that we've been here before, and. 37 00:01:41,480 --> 00:01:43,040 Speaker 3: It started in the cloth business. 
38 00:01:43,280 --> 00:01:47,120 Speaker 4: So these original Luddites were like the first 39 00:01:47,200 --> 00:01:50,760 Speaker 4: people really to be impacted in a significant way by 40 00:01:50,760 --> 00:01:53,320 Speaker 4: the Industrial Revolution, the first people to face this thing 41 00:01:53,360 --> 00:01:55,080 Speaker 4: that we are talking about that we are afraid of 42 00:01:55,440 --> 00:01:57,960 Speaker 4: essentially losing their jobs to technological change. 43 00:01:58,120 --> 00:02:00,360 Speaker 1: In my conversation with Jacob, we're going to go all 44 00:02:00,400 --> 00:02:03,680 Speaker 1: the way back to the eighteen hundreds, to the original Luddites, 45 00:02:04,120 --> 00:02:05,920 Speaker 1: and I want to hear about their story. And we're 46 00:02:05,960 --> 00:02:09,480 Speaker 1: going to talk about how technology changes, how those changes 47 00:02:09,520 --> 00:02:13,520 Speaker 1: reshape our work, and then frankly, who historically gets protected 48 00:02:13,760 --> 00:02:14,920 Speaker 1: and who gets left behind? 49 00:02:16,440 --> 00:02:21,200 Speaker 2: Here we go again again again. 50 00:02:21,960 --> 00:02:24,399 Speaker 1: Okay, Hey, I'm Kal Penn and this is Here We 51 00:02:24,440 --> 00:02:28,200 Speaker 1: Go Again, a show that takes today's trends and headlines and. 52 00:02:28,160 --> 00:02:30,840 Speaker 2: Asks why does history keep repeating itself? 53 00:02:31,160 --> 00:02:32,359 Speaker 3: Here we go? 54 00:02:47,840 --> 00:02:51,280 Speaker 1: Hey, good morning, good morning, how you doing? Thanks for 55 00:02:51,360 --> 00:02:53,160 Speaker 1: being here. Today I'm speaking with, Hi. 56 00:02:53,200 --> 00:02:54,119 Speaker 3: I'm Jacob Goldstein. 57 00:02:54,320 --> 00:02:58,120 Speaker 1: He was a previous host on the NPR podcast Planet Money, 58 00:02:58,160 --> 00:03:01,160 Speaker 1: so maybe you recognize his voice from that. He's 59 00:03:01,200 --> 00:03:04,399 Speaker 1: the current host of the podcast What's Your Problem and 60 00:03:04,600 --> 00:03:07,680 Speaker 1: Business History, and he's the author of the book Money, 61 00:03:07,800 --> 00:03:10,800 Speaker 1: The True Story of a Made Up Thing. Before we 62 00:03:10,840 --> 00:03:13,240 Speaker 1: fully dive in, I was very excited to talk to 63 00:03:13,240 --> 00:03:16,120 Speaker 1: you about all these topics for a couple of reasons. 64 00:03:16,160 --> 00:03:19,720 Speaker 1: I generally know very little about technology. I'm forty eight, 65 00:03:19,800 --> 00:03:22,839 Speaker 1: so as my college friend who works in tech 66 00:03:22,840 --> 00:03:25,520 Speaker 1: always reminds me, like, we're that generation that grew up 67 00:03:25,600 --> 00:03:30,240 Speaker 1: fully analog, but we were still kids when the digital wireless 68 00:03:30,240 --> 00:03:32,640 Speaker 1: world kind of came around, and so there's that 69 00:03:32,760 --> 00:03:34,760 Speaker 1: rarity of knowing how to use both. But I'm now 70 00:03:34,840 --> 00:03:37,800 Speaker 1: getting old enough that, like, I just want a rotary phone, 71 00:03:37,800 --> 00:03:42,680 Speaker 1: like when I was nine. So since you are smart 72 00:03:42,720 --> 00:03:45,480 Speaker 1: and thoughtful and very knowledgeable about all of this, just 73 00:03:45,720 --> 00:03:48,040 Speaker 1: the nerd part of me was really excited to talk 74 00:03:48,080 --> 00:03:49,840 Speaker 1: to you. 
When did you know you wanted to be 75 00:03:49,880 --> 00:03:51,960 Speaker 1: a journalist and focus on the things that you do. 76 00:03:52,720 --> 00:03:55,320 Speaker 4: So I became a journalist in my late twenties. I 77 00:03:55,360 --> 00:04:00,120 Speaker 4: was a newspaper reporter, and it was around the time 78 00:04:00,280 --> 00:04:04,520 Speaker 4: of the two thousand and eight financial crisis. I was 79 00:04:04,560 --> 00:04:08,280 Speaker 4: a reporter covering healthcare at the Wall Street Journal, and 80 00:04:09,160 --> 00:04:12,600 Speaker 4: like everybody else, I wanted to understand what was going on. 81 00:04:13,520 --> 00:04:15,960 Speaker 4: It was confusing to me, you know, I knew about healthcare, 82 00:04:15,960 --> 00:04:17,320 Speaker 4: but even though I was at the journal, I wasn't 83 00:04:17,320 --> 00:04:18,160 Speaker 4: covering finance. 84 00:04:18,240 --> 00:04:20,080 Speaker 3: And I have an aunt who. 85 00:04:19,920 --> 00:04:22,560 Speaker 4: Was an MBA who is like my go to kind 86 00:04:22,560 --> 00:04:25,200 Speaker 4: of business and money person when I don't understand things, 87 00:04:25,720 --> 00:04:28,160 Speaker 4: and you know, there had been this collapse obviously of 88 00:04:28,160 --> 00:04:31,440 Speaker 4: the stock market and of real estate prices, and I 89 00:04:31,480 --> 00:04:34,360 Speaker 4: asked her, like it seemed like people were talking about 90 00:04:34,360 --> 00:04:37,240 Speaker 4: like a trillion dollars disappeared from the stock market, right, 91 00:04:37,640 --> 00:04:40,880 Speaker 4: So I had this maybe dumb, maybe not question, where 92 00:04:40,920 --> 00:04:41,719 Speaker 4: did the money go? 93 00:04:41,800 --> 00:04:43,599 Speaker 2: Yeah, that's not dumb, that would be my question. 94 00:04:44,320 --> 00:04:47,440 Speaker 3: And she said it wasn't there in the first place. 95 00:04:47,920 --> 00:04:51,240 Speaker 3: Money is fiction. Oh yeah, And I was like, ooh fiction. 96 00:04:51,600 --> 00:04:53,880 Speaker 3: I had been an English major, I knew about fiction. 97 00:04:54,000 --> 00:04:56,680 Speaker 3: I was like, this is interesting to me. And that 98 00:04:56,760 --> 00:04:57,560 Speaker 3: was sort of the gateway. 99 00:04:57,560 --> 00:04:59,479 Speaker 4: And so not long after that, I went to work 100 00:04:59,520 --> 00:05:02,760 Speaker 4: for this podcast called Planet Money. It's like covering the 101 00:05:02,800 --> 00:05:05,840 Speaker 4: economy but in a very sort of narrative storytelling way, 102 00:05:06,600 --> 00:05:08,479 Speaker 4: and through that I got into kind of the history 103 00:05:08,480 --> 00:05:10,719 Speaker 4: of money and the history of economics and the stuff 104 00:05:10,720 --> 00:05:11,160 Speaker 4: we're going to. 105 00:05:11,080 --> 00:05:11,719 Speaker 3: Talk about today. 106 00:05:12,040 --> 00:05:12,480 Speaker 2: Amazing. 107 00:05:12,760 --> 00:05:14,479 Speaker 1: I had an AHA moment like that when I was 108 00:05:14,520 --> 00:05:17,960 Speaker 1: hosting a show just before COVID terrible title called This 109 00:05:18,040 --> 00:05:21,320 Speaker 1: Giant Beast that is the Global Economy for Amazon Prime. 
110 00:05:21,360 --> 00:05:23,240 Speaker 1: And I had a great time working on it, but 111 00:05:23,240 --> 00:05:26,120 Speaker 1: that was, you know, I was the host through which 112 00:05:26,360 --> 00:05:28,480 Speaker 1: the audience gets to experience things, so I knew very 113 00:05:28,480 --> 00:05:31,080 Speaker 1: little going in and my mind was blown. It sounds 114 00:05:31,080 --> 00:05:34,479 Speaker 1: like not dissimilar to your two thousand and eight phone call. Yeah, 115 00:05:34,800 --> 00:05:37,479 Speaker 1: as a journalist, I'm going to jump ahead a little 116 00:05:37,520 --> 00:05:39,880 Speaker 1: bit to just sort of what people are thinking about today. 117 00:05:39,880 --> 00:05:43,080 Speaker 1: And I'll preface it by saying, as an actor and 118 00:05:43,120 --> 00:05:47,000 Speaker 1: a writer, we just came off of two very protracted 119 00:05:47,120 --> 00:05:50,279 Speaker 1: labor disputes where we were forced to go on strike 120 00:05:50,320 --> 00:05:52,960 Speaker 1: by these big media companies in large part because of 121 00:05:53,839 --> 00:05:57,880 Speaker 1: AI proposals and provisions. As a journalist, are you at 122 00:05:57,880 --> 00:06:01,159 Speaker 1: the place where you're thinking through, or are you 123 00:06:01,279 --> 00:06:04,800 Speaker 1: afraid that technology is going to replace people in your 124 00:06:04,800 --> 00:06:06,160 Speaker 1: profession or has it already? 125 00:06:07,040 --> 00:06:07,280 Speaker 3: Yes? 126 00:06:07,320 --> 00:06:13,080 Speaker 4: And yes, I mean if you look at employment in 127 00:06:13,400 --> 00:06:16,720 Speaker 4: newsrooms over the past twenty years, which is to say 128 00:06:16,800 --> 00:06:19,000 Speaker 4: my career, it has gone down and down and down 129 00:06:19,520 --> 00:06:24,640 Speaker 4: because of technological change. Right, It's not exactly AI doing reporters' jobs. 130 00:06:24,640 --> 00:06:29,880 Speaker 4: It's people substituting away from newspapers. But yes, looking forward, 131 00:06:30,680 --> 00:06:33,720 Speaker 4: I am worried that AI will be able to do 132 00:06:33,760 --> 00:06:37,159 Speaker 4: what I do. And fortunately I'm not that young and 133 00:06:37,200 --> 00:06:39,200 Speaker 4: I don't have to make it that much longer. But 134 00:06:39,400 --> 00:06:45,080 Speaker 4: like I would caution someone starting their career against trying 135 00:06:45,120 --> 00:06:48,120 Speaker 4: to be a journalist in the traditional sense, right, I mean, 136 00:06:48,560 --> 00:06:53,480 Speaker 4: perhaps the interesting question is, like what is the combination, right, Like, 137 00:06:54,120 --> 00:06:56,680 Speaker 4: will AI take our jobs is a valid question, But 138 00:06:56,920 --> 00:06:59,320 Speaker 4: in the meantime, how can we use AI to do 139 00:06:59,360 --> 00:07:03,080 Speaker 4: our jobs better is perhaps a more practical question. 140 00:07:04,160 --> 00:07:06,800 Speaker 2: And how do you feel like you're able to use 141 00:07:06,839 --> 00:07:08,320 Speaker 2: AI to make your job better? 142 00:07:09,080 --> 00:07:12,120 Speaker 4: I mean it helps me with research, is the short answer. 143 00:07:12,160 --> 00:07:14,800 Speaker 4: And I will say there's an interesting cultural thing, certainly 144 00:07:14,880 --> 00:07:18,840 Speaker 4: among journalists and writers. I suspect it's not as much 145 00:07:18,840 --> 00:07:22,480 Speaker 4: in other fields. 
But because large language models are writing, 146 00:07:22,960 --> 00:07:27,000 Speaker 4: they feel like a direct threat to you know, journalists 147 00:07:27,040 --> 00:07:28,760 Speaker 4: in a way that they perhaps do not to people 148 00:07:28,800 --> 00:07:30,960 Speaker 4: in other fields. So I don't know, there's a feeling 149 00:07:30,960 --> 00:07:34,000 Speaker 4: among journalists that using AI is cheating. 150 00:07:34,600 --> 00:07:34,760 Speaker 2: Right. 151 00:07:34,800 --> 00:07:39,640 Speaker 4: People will sort of abashedly admit to using AI, and 152 00:07:39,680 --> 00:07:42,360 Speaker 4: I think that's unhealthy, right. I think using AI is 153 00:07:42,400 --> 00:07:44,240 Speaker 4: like using the Internet, which by the way, is like 154 00:07:44,280 --> 00:07:46,520 Speaker 4: you know, Google searches are driven by AI, of course. 155 00:07:47,120 --> 00:07:49,520 Speaker 4: So for example, I host this show called Business History, 156 00:07:50,080 --> 00:07:52,240 Speaker 4: and just this morning I took a script that one 157 00:07:52,240 --> 00:07:54,120 Speaker 4: of my colleagues had written that was like a great narrative, 158 00:07:54,120 --> 00:07:55,880 Speaker 4: but I felt like it could use some big ideas, 159 00:07:56,200 --> 00:07:58,920 Speaker 4: and I put it into a large language model and said, like, 160 00:07:59,120 --> 00:08:01,040 Speaker 4: what are some big ideas that you think 161 00:08:01,080 --> 00:08:03,520 Speaker 4: sort of emerge from this, what are some themes? And 162 00:08:03,520 --> 00:08:05,880 Speaker 4: it's not like I copied and pasted the answer. It 163 00:08:05,920 --> 00:08:08,880 Speaker 4: suggested some ideas, and I went and researched those ideas, right, 164 00:08:08,920 --> 00:08:09,840 Speaker 4: So that's an example. 165 00:08:10,000 --> 00:08:14,480 Speaker 1: So when you research the ideas after the prompt, are 166 00:08:14,480 --> 00:08:16,880 Speaker 1: you researching that outside of the AI models? And here's 167 00:08:16,880 --> 00:08:19,800 Speaker 1: basically what I'm getting at is like, how as a journalist, 168 00:08:19,840 --> 00:08:22,880 Speaker 1: how do you make sure that the sources themselves are accurate? 169 00:08:23,000 --> 00:08:25,600 Speaker 1: Knowing that, however the AI has learned that, it may not be, 170 00:08:25,680 --> 00:08:28,400 Speaker 1: whatever the inputs were may not be a real thing. 171 00:08:28,960 --> 00:08:31,480 Speaker 1: So how do you fact check that when you take 172 00:08:31,480 --> 00:08:32,640 Speaker 1: the next step in research? 173 00:08:33,240 --> 00:08:35,360 Speaker 4: I mean that part is the same as always, right, 174 00:08:35,520 --> 00:08:38,840 Speaker 4: Like I never take the answer from the 175 00:08:38,880 --> 00:08:42,040 Speaker 4: AI as a reliable answer. I go and I look 176 00:08:42,080 --> 00:08:44,040 Speaker 4: it up elsewhere, and I look at the source and 177 00:08:44,080 --> 00:08:48,800 Speaker 4: I evaluate the credibility of the source material. And you know, 178 00:08:49,120 --> 00:08:53,679 Speaker 4: certainly AIs hallucinate some, but like overall, it's useful, and 179 00:08:53,720 --> 00:08:55,240 Speaker 4: like once in a while it's like, oh, that doesn't 180 00:08:55,280 --> 00:08:57,440 Speaker 4: make sense, but quite often it's like, oh, that's a 181 00:08:57,440 --> 00:08:59,839 Speaker 4: good idea. 
And like just this morning there was like 182 00:09:00,520 --> 00:09:04,400 Speaker 4: a particular monetary policy shift in Germany in the nineteen forties 183 00:09:04,440 --> 00:09:06,720 Speaker 4: that it pointed me to that was in fact real 184 00:09:06,800 --> 00:09:09,800 Speaker 4: and interesting in the history of the Volkswagen. 185 00:09:09,120 --> 00:09:10,800 Speaker 3: Beetle, which is the story we're working on. 186 00:09:10,920 --> 00:09:13,320 Speaker 1: But okay, I want to go back a little bit 187 00:09:13,360 --> 00:09:16,920 Speaker 1: to history because the same friend who I mentioned, the 188 00:09:17,320 --> 00:09:19,960 Speaker 1: tech savvy college friend, often likes to tell me that 189 00:09:20,000 --> 00:09:20,679 Speaker 1: I'm a Luddite. 190 00:09:21,160 --> 00:09:22,720 Speaker 2: And I don't think he's necessarily wrong. 191 00:09:22,840 --> 00:09:25,160 Speaker 1: The way that we throw around the word today, technology 192 00:09:25,200 --> 00:09:28,200 Speaker 1: doesn't necessarily get me as geeked as some of my friends, 193 00:09:28,240 --> 00:09:32,240 Speaker 1: although I'll obviously have my moments. And so just 194 00:09:32,240 --> 00:09:34,400 Speaker 1: in looking back, right, I get that people have always 195 00:09:34,440 --> 00:09:37,680 Speaker 1: had this fear that newer technology is going to replace 196 00:09:37,760 --> 00:09:40,520 Speaker 1: their jobs. So what you've talked about is that we've 197 00:09:40,520 --> 00:09:43,959 Speaker 1: been here before, and all this palpable fear started back 198 00:09:43,960 --> 00:09:46,360 Speaker 1: in the eighteen hundreds, and so can you take us to 199 00:09:46,440 --> 00:09:49,439 Speaker 1: the eighteen hundreds and tell us what happened, Like, take 200 00:09:49,520 --> 00:09:51,160 Speaker 1: us back to that moment in history for somebody who 201 00:09:51,160 --> 00:09:52,040 Speaker 1: doesn't know anything. 202 00:09:51,800 --> 00:09:52,800 Speaker 2: About the Luddites. 203 00:09:53,600 --> 00:09:59,400 Speaker 4: Yeah, so the original Luddites were quite different, just to start, than 204 00:09:59,440 --> 00:10:01,320 Speaker 4: the word we have, the word Luddite, today. I mean, 205 00:10:01,320 --> 00:10:03,679 Speaker 4: it's just like somebody who doesn't like technology because they 206 00:10:03,679 --> 00:10:06,520 Speaker 4: don't like it. Right, So, the original Luddites were cloth 207 00:10:06,559 --> 00:10:10,800 Speaker 4: workers in England in the first part of the eighteen hundreds, 208 00:10:10,840 --> 00:10:14,720 Speaker 4: like eighteen eleven, eighteen twelve. Around that time, and for 209 00:10:15,200 --> 00:10:20,520 Speaker 4: essentially all of human history, there was very little technological change. 210 00:10:20,600 --> 00:10:22,080 Speaker 4: Right, this world we live in where you just 211 00:10:22,200 --> 00:10:26,320 Speaker 4: assume that technology changes generation after generation, that things get 212 00:10:26,360 --> 00:10:28,640 Speaker 4: more efficient. That was not the nature of the world 213 00:10:28,840 --> 00:10:33,400 Speaker 4: until the Industrial Revolution, which started in the second part 214 00:10:33,440 --> 00:10:35,760 Speaker 4: of the seventeen hundreds in England, and it started in 215 00:10:35,800 --> 00:10:36,520 Speaker 4: the cloth. 216 00:10:36,240 --> 00:10:37,080 Speaker 3: business, right. 
217 00:10:37,120 --> 00:10:41,880 Speaker 4: So these original Luddites were like the first people really 218 00:10:41,960 --> 00:10:45,679 Speaker 4: to be impacted in a significant way by the Industrial Revolution, 219 00:10:45,720 --> 00:10:47,600 Speaker 4: the first people to face this thing that we are 220 00:10:47,600 --> 00:10:50,120 Speaker 4: talking about that we are afraid of, essentially losing their 221 00:10:50,200 --> 00:10:53,679 Speaker 4: jobs to technological change. The cloth business was actually a 222 00:10:53,760 --> 00:10:56,800 Speaker 4: huge business for England at the time, and the. 223 00:10:56,800 --> 00:10:59,839 Speaker 3: Luddites were skilled artisans. Right. 224 00:11:00,120 --> 00:11:04,319 Speaker 4: We think of, you know, factory work as terrible in 225 00:11:04,360 --> 00:11:07,319 Speaker 4: the eighteen hundreds, and it was, but before the Industrial Revolution, 226 00:11:07,400 --> 00:11:09,920 Speaker 4: it was like something you did kind of in your home. 227 00:11:10,040 --> 00:11:12,640 Speaker 4: It was farmed out and so you know, there were 228 00:11:12,679 --> 00:11:16,359 Speaker 4: all these steps to making fabric, and different people specialized 229 00:11:16,360 --> 00:11:17,320 Speaker 4: in different parts of it. 230 00:11:17,640 --> 00:11:20,079 Speaker 3: So the croppers would take a rough piece of. 231 00:11:20,040 --> 00:11:23,840 Speaker 4: Fabric and they would have these giant shears, like kind 232 00:11:23,840 --> 00:11:26,559 Speaker 4: of giant metal scissors, and they would crop that. I 233 00:11:26,600 --> 00:11:29,800 Speaker 4: guess the nap, I don't know about fabric. They'd crop 234 00:11:29,920 --> 00:11:30,840 Speaker 4: something off. 235 00:11:30,720 --> 00:11:32,560 Speaker 3: The wool to like make it smooth and nice. 236 00:11:32,600 --> 00:11:36,920 Speaker 4: Okay, So that was their job, and for the time 237 00:11:37,240 --> 00:11:40,320 Speaker 4: it was a pretty good job. Like they worked for themselves. 238 00:11:40,640 --> 00:11:43,760 Speaker 4: They set their own hours, very poor by the standards 239 00:11:43,760 --> 00:11:46,640 Speaker 4: of today, which was an important point, but in relative 240 00:11:46,679 --> 00:11:49,040 Speaker 4: terms at the time they were doing well. And then 241 00:11:49,080 --> 00:11:52,880 Speaker 4: along came the Industrial Revolution, which started out as machines 242 00:11:52,920 --> 00:11:56,880 Speaker 4: to make cloth. Right, So somebody invented a thing to 243 00:11:56,960 --> 00:12:01,960 Speaker 4: spin a raw fiber into thread, somebody invented a loom, and 244 00:12:02,000 --> 00:12:05,040 Speaker 4: then somebody invented a shearing frame, right, a machine to crop, 245 00:12:05,320 --> 00:12:07,200 Speaker 4: a machine to do what the croppers had done. Right, 246 00:12:07,360 --> 00:12:10,040 Speaker 4: And so this is the thing we are talking about. 247 00:12:10,080 --> 00:12:14,040 Speaker 4: This is AI taking our job, but for the croppers, 248 00:12:14,080 --> 00:12:15,960 Speaker 4: but for the skilled clothmakers. 249 00:12:16,120 --> 00:12:19,600 Speaker 1: I'm curious about all those inventions. Do you 250 00:12:19,600 --> 00:12:21,439 Speaker 1: know the range of time? Like did this all happen 251 00:12:21,480 --> 00:12:23,000 Speaker 1: in a year or was it over a period of 252 00:12:23,040 --> 00:12:23,839 Speaker 1: like fifty years? 253 00:12:24,320 --> 00:12:27,120 Speaker 3: Decades? The order of magnitude is decades. 
254 00:12:26,760 --> 00:12:31,040 Speaker 2: So slower than, slower than right now. Well maybe, maybe not. 255 00:12:31,520 --> 00:12:33,720 Speaker 4: I think it didn't feel that way if you were 256 00:12:33,760 --> 00:12:36,839 Speaker 4: a cropper, right, like, there wasn't the spread of information 257 00:12:37,040 --> 00:12:39,440 Speaker 4: like there is today. They didn't necessarily, you know, if 258 00:12:39,480 --> 00:12:42,440 Speaker 4: you were a cropper in eighteen hundred, you didn't know 259 00:12:42,760 --> 00:12:45,440 Speaker 4: that somebody was going to invent a shearing frame and 260 00:12:45,440 --> 00:12:46,480 Speaker 4: that it was going to show up. 261 00:12:46,559 --> 00:12:46,760 Speaker 1: You know. 262 00:12:46,920 --> 00:12:49,120 Speaker 3: I think it came as a surprise. 263 00:12:49,360 --> 00:12:52,120 Speaker 2: And once it was there, there's a guy named Ned Ludd. 264 00:12:52,760 --> 00:12:57,040 Speaker 3: Yes, so what starts happening? Does one curse on this show? 265 00:12:57,280 --> 00:12:58,840 Speaker 2: Oh? Yes, please feel free. 266 00:12:58,920 --> 00:13:01,520 Speaker 4: So these people who have these good jobs, who have 267 00:13:01,600 --> 00:13:04,400 Speaker 4: these skills, see the machines coming and taking their jobs, 268 00:13:04,400 --> 00:13:08,800 Speaker 4: and they essentially think like fuck this, like no, like 269 00:13:08,840 --> 00:13:11,080 Speaker 4: we're not gonna do it this way. And so what 270 00:13:11,120 --> 00:13:13,559 Speaker 4: they start doing is going in the middle of the 271 00:13:13,679 --> 00:13:18,520 Speaker 4: night and attacking these new machines like literally physically, like breaking. 272 00:13:18,080 --> 00:13:19,280 Speaker 3: Them with sledgehammers. 273 00:13:19,400 --> 00:13:21,439 Speaker 4: It's like, oh, you're gonna do this with the machine. No, 274 00:13:21,800 --> 00:13:23,719 Speaker 4: I'm going to break your machine with a sledgehammer. And 275 00:13:23,760 --> 00:13:25,559 Speaker 4: then you can come back to me and I'll keep cropping. 276 00:13:26,440 --> 00:13:27,400 Speaker 3: And in I. 277 00:13:27,400 --> 00:13:30,959 Speaker 4: Believe it's eighteen eleven, it starts to get more organized. 278 00:13:31,000 --> 00:13:33,120 Speaker 4: So these have been just kind of random, ad hoc 279 00:13:33,320 --> 00:13:35,520 Speaker 4: kind of what we would call today, like maybe mob. 280 00:13:35,679 --> 00:13:37,599 Speaker 3: Sure it might be a word people would use today. 281 00:13:37,800 --> 00:13:40,360 Speaker 4: But in eighteen eleven it starts to feel more organized, 282 00:13:40,440 --> 00:13:44,160 Speaker 4: and there start to be these letters from this self 283 00:13:44,240 --> 00:13:50,719 Speaker 4: titled general, General Ned Ludd, And he's actually holed up 284 00:13:50,840 --> 00:13:55,080 Speaker 4: in Sherwood Forest, like Robin Hood, who has kind of 285 00:13:55,120 --> 00:13:56,439 Speaker 4: similar vibes. 286 00:13:56,160 --> 00:13:59,480 Speaker 3: Right, and these letters are. 287 00:13:59,360 --> 00:14:03,320 Speaker 4: Taking on the tones of civil war, frankly, right. He 288 00:14:03,360 --> 00:14:07,120 Speaker 4: has this title General Ned Ludd and at one point 289 00:14:07,160 --> 00:14:09,720 Speaker 4: they mention his Army of Redressers, right, like a 290 00:14:09,760 --> 00:14:12,360 Speaker 4: redress of grievances. The main thing you need to know 291 00:14:12,360 --> 00:14:16,400 Speaker 4: about Ned Ludd: there was no Ned Ludd, he was. 
292 00:14:16,600 --> 00:14:17,319 Speaker 4: He was a myth. 293 00:14:17,400 --> 00:14:19,160 Speaker 3: He was a myth. He was like Robin Hood. He was 294 00:14:19,200 --> 00:14:19,760 Speaker 3: like Robin Hood. 295 00:14:19,800 --> 00:14:22,160 Speaker 4: I mean, there may actually have been a guy named 296 00:14:22,160 --> 00:14:24,440 Speaker 4: Ned Ludd decades earlier who was like a framebreaker, but 297 00:14:24,520 --> 00:14:28,000 Speaker 4: like this General Ned Ludd, this guy leading the revolt, 298 00:14:28,520 --> 00:14:32,080 Speaker 4: he was an invention, which is kind of genius, right, like, yes, 299 00:14:32,800 --> 00:14:35,600 Speaker 4: it's a genius way for a people with no political power, 300 00:14:35,600 --> 00:14:37,520 Speaker 4: for a group of people with no political power, to. 301 00:14:38,280 --> 00:14:42,840 Speaker 3: Create a movement. Right, you invent a figurehead, a mythical 302 00:14:42,920 --> 00:14:46,160 Speaker 3: general holed up in the forest who does not exist. 303 00:14:46,360 --> 00:14:51,200 Speaker 4: And so it's this idea that the workers are not 304 00:14:51,360 --> 00:14:56,920 Speaker 4: just randomly breaking machines, they're organizing to fight back. And 305 00:14:57,960 --> 00:15:00,600 Speaker 4: one thing that's interesting, there's this historian who has called 306 00:15:00,600 --> 00:15:05,000 Speaker 4: what they were doing collective bargaining by riot, because there 307 00:15:05,000 --> 00:15:07,320 Speaker 4: were no unions, Like they couldn't even vote, right, there 308 00:15:07,400 --> 00:15:11,360 Speaker 4: wasn't a mass suffrage in England at the time. There 309 00:15:11,360 --> 00:15:13,560 Speaker 4: were certainly no unions. They basically didn't have power in 310 00:15:13,600 --> 00:15:16,320 Speaker 4: any organized way. So they were seizing power and it's 311 00:15:16,320 --> 00:15:17,880 Speaker 4: this collective kind of. 312 00:15:17,800 --> 00:15:18,480 Speaker 3: Ad hoc way. 313 00:15:18,880 --> 00:15:24,640 Speaker 4: And these attacks get more systematic. There's this one, particularly 314 00:15:24,920 --> 00:15:29,520 Speaker 4: dramatic one where these guys, they all mass at this 315 00:15:29,680 --> 00:15:33,280 Speaker 4: bar and they're going to attack this mill. 316 00:15:33,320 --> 00:15:34,920 Speaker 3: But the mill owner knows. 317 00:15:34,720 --> 00:15:37,200 Speaker 4: Like, the Luddites, you know, have been attacking around 318 00:15:37,240 --> 00:15:41,400 Speaker 4: this region. He's been preparing to defend himself. So the 319 00:15:41,440 --> 00:15:44,480 Speaker 4: owner is actually like sleeping there. He's hired some people 320 00:15:44,920 --> 00:15:48,120 Speaker 4: with rifles to defend the factory. He actually has like 321 00:15:49,000 --> 00:15:51,000 Speaker 4: a vat of, I think, sulfuric acid. 322 00:15:51,240 --> 00:15:52,600 Speaker 3: He's going to like pour down on them. 323 00:15:52,640 --> 00:15:55,680 Speaker 4: Yes, it's very medieval, right, it feels very like medieval castle. 324 00:15:56,160 --> 00:15:58,800 Speaker 4: He's made this factory like a fortress, and so the 325 00:15:58,880 --> 00:16:02,400 Speaker 4: Luddites march on it. It is very, it's like 326 00:16:02,480 --> 00:16:06,480 Speaker 4: proto civil war, right. This is an armed organized attack 327 00:16:06,600 --> 00:16:09,400 Speaker 4: on a heavily defended factory. And they get in and 328 00:16:09,440 --> 00:16:13,280 Speaker 4: there's like actually an exchange of gunfire, and ultimately they retreat. 
329 00:16:13,680 --> 00:16:16,200 Speaker 4: So there's two Luddites dead. They have failed in their attack. 330 00:16:16,840 --> 00:16:21,200 Speaker 4: And around this time Parliament, you know, the British government 331 00:16:22,040 --> 00:16:25,040 Speaker 4: realizes that this is getting out of hand, and they 332 00:16:25,040 --> 00:16:31,440 Speaker 4: pass a law making attacking machines punishable by death. 333 00:16:31,520 --> 00:16:33,120 Speaker 2: Amazing, I am amazed. 334 00:16:33,120 --> 00:16:36,200 Speaker 1: I was almost ninety nine percent confident that this would 335 00:16:36,240 --> 00:16:37,440 Speaker 1: not go in the favor. 336 00:16:37,160 --> 00:16:37,720 Speaker 2: Of the people. 337 00:16:39,640 --> 00:16:42,680 Speaker 1: Tell me something about British history that I didn't already know. Okay, 338 00:16:42,720 --> 00:16:43,160 Speaker 1: go ahead. 339 00:16:44,200 --> 00:16:48,560 Speaker 4: So after this attack where two Luddites are killed, there's 340 00:16:48,640 --> 00:16:52,120 Speaker 4: like this round up. Essentially, the government fights back. A 341 00:16:52,160 --> 00:16:55,160 Speaker 4: bunch of Luddites are arrested, thrown in jail, they're tried, 342 00:16:55,480 --> 00:16:58,520 Speaker 4: and several of them are in fact given death sentences, 343 00:16:59,120 --> 00:17:02,280 Speaker 4: and they are publicly hanged. They actually make the gallows 344 00:17:02,840 --> 00:17:04,800 Speaker 4: twice as high as usual so that you can see 345 00:17:04,800 --> 00:17:07,639 Speaker 4: it from even farther away. You know, there's like, you know, 346 00:17:07,920 --> 00:17:13,119 Speaker 4: a crowd of people witnessing the hanging. And this basically 347 00:17:13,960 --> 00:17:18,360 Speaker 4: defeats the Luddites, right, Like, this is basically the Luddites lose. Right, 348 00:17:18,600 --> 00:17:21,000 Speaker 4: the government rounds them up and kills them, and they 349 00:17:21,000 --> 00:17:25,160 Speaker 4: stop attacking the machines, and there are no more rich croppers, 350 00:17:25,320 --> 00:17:27,080 Speaker 4: or they were never rich, but there are no more 351 00:17:27,119 --> 00:17:30,320 Speaker 4: like relatively well off croppers after this, Right, It's just 352 00:17:30,400 --> 00:17:33,520 Speaker 4: the machines are, in fact, a better, cheaper way to 353 00:17:33,560 --> 00:17:37,080 Speaker 4: make cloth. The Luddites don't have any political power and 354 00:17:37,119 --> 00:17:38,200 Speaker 4: they are out of luck. 355 00:17:45,240 --> 00:17:47,480 Speaker 1: That all makes me wonder, like if you put the 356 00:17:47,560 --> 00:17:53,080 Speaker 1: Luddite story next to conversations that we're having about AI today, 357 00:17:53,160 --> 00:17:55,679 Speaker 1: in what ways do you think that they're similar and 358 00:17:55,720 --> 00:17:58,119 Speaker 1: how are they different? And the one stat that I 359 00:17:58,119 --> 00:18:00,000 Speaker 1: remember when we were researching for this that came 360 00:18:00,080 --> 00:18:03,920 Speaker 1: up is like Goldman said that by twenty thirty, AI 361 00:18:04,080 --> 00:18:07,800 Speaker 1: could replace the equivalent of three hundred million full time jobs. 362 00:18:08,440 --> 00:18:12,399 Speaker 1: Forbes said it would replace two million manufacturing jobs by. 363 00:18:12,240 --> 00:18:14,280 Speaker 2: the end of next year alone. 
364 00:18:15,040 --> 00:18:17,880 Speaker 1: And so when I hear that Luddite story especially, 365 00:18:17,920 --> 00:18:22,600 Speaker 1: it's got everything, It's got public policy, it has a clear, 366 00:18:22,680 --> 00:18:25,720 Speaker 1: very simple explanation of who really wins in the short 367 00:18:25,800 --> 00:18:28,879 Speaker 1: term in capitalism, the power dynamics, all of that. So 368 00:18:28,960 --> 00:18:31,159 Speaker 1: then that just makes me wonder if you put that 369 00:18:31,240 --> 00:18:33,680 Speaker 1: story next to today's AI conversations, like, how do you 370 00:18:33,720 --> 00:18:35,119 Speaker 1: see them being similar or different? 371 00:18:36,200 --> 00:18:41,800 Speaker 4: Yeah, I mean, certainly workers being potentially replaced by machines 372 00:18:42,280 --> 00:18:47,439 Speaker 4: is in fact similar, like, that in some places, you know, 373 00:18:48,720 --> 00:18:49,760 Speaker 4: call centers. 374 00:18:50,280 --> 00:18:52,680 Speaker 3: It's already happening, clearly. I mean. 375 00:18:52,800 --> 00:18:55,600 Speaker 4: One thing that I think is an important difference is 376 00:18:56,720 --> 00:19:00,960 Speaker 4: political economy, right. Like, it's easy for people to say, oh, 377 00:19:01,119 --> 00:19:04,280 Speaker 4: ordinary people have no power in today's economy, it's all 378 00:19:04,320 --> 00:19:06,560 Speaker 4: the rich people. And like, certainly rich people have a 379 00:19:06,600 --> 00:19:10,200 Speaker 4: lot of power, but relative to the Luddites, ordinary people 380 00:19:10,280 --> 00:19:13,400 Speaker 4: do have more power today, right, Like unions were illegal, 381 00:19:13,440 --> 00:19:17,480 Speaker 4: the Luddites literally could not vote, right, And so it 382 00:19:17,560 --> 00:19:21,840 Speaker 4: will be interesting to see who is losing jobs to 383 00:19:21,960 --> 00:19:29,040 Speaker 4: AI and when and how politics and the government respond, right, 384 00:19:29,240 --> 00:19:33,960 Speaker 4: And it's an interesting moment now because for a long time, 385 00:19:34,000 --> 00:19:39,959 Speaker 4: technological change threatened lower skilled workers, right, like twentieth century 386 00:19:39,960 --> 00:19:42,200 Speaker 4: automation hollowed out. 387 00:19:42,000 --> 00:19:43,520 Speaker 3: The middle to a significant degree. 388 00:19:43,560 --> 00:19:45,440 Speaker 4: Right, There was this phrase, the hollowing out of the middle, 389 00:19:45,480 --> 00:19:48,719 Speaker 4: where like physical labor actually for a while did okay, 390 00:19:49,520 --> 00:19:52,359 Speaker 4: and if you were sort of highly educated you did okay, 391 00:19:52,400 --> 00:19:53,760 Speaker 4: but if you were kind of in the middle of 392 00:19:53,800 --> 00:19:57,200 Speaker 4: the distribution, it was bad for you. Now it's kind 393 00:19:57,200 --> 00:20:01,439 Speaker 4: of across the board, and strikingly, people like lawyers and 394 00:20:01,520 --> 00:20:05,879 Speaker 4: journalists are threatened. And so those are people who traditionally 395 00:20:05,880 --> 00:20:10,600 Speaker 4: have had more political influence, perhaps, right, And so I 396 00:20:10,640 --> 00:20:14,479 Speaker 4: think how will the government respond is super interesting and 397 00:20:14,520 --> 00:20:15,400 Speaker 4: super unclear. 398 00:20:15,600 --> 00:20:18,720 Speaker 1: Yeah, do you have a sense of which other types 399 00:20:18,760 --> 00:20:22,280 Speaker 1: of jobs are projected to be affected? 
I mean the 400 00:20:22,320 --> 00:20:25,800 Speaker 1: manufacturing jobs obviously, that's well documented and also follows a 401 00:20:25,800 --> 00:20:27,800 Speaker 1: pattern of technological change throughout history. 402 00:20:28,280 --> 00:20:29,040 Speaker 2: I think you're right, the. 403 00:20:29,359 --> 00:20:32,000 Speaker 1: doctor lawyer thing is relatively new, jarring for a whole 404 00:20:32,000 --> 00:20:36,200 Speaker 1: different economic class of people. What are the ones we're 405 00:20:36,200 --> 00:20:37,280 Speaker 1: not thinking about? 406 00:20:38,520 --> 00:20:41,159 Speaker 3: I don't know. I mean, I'm wary of making predictions. 407 00:20:42,520 --> 00:20:45,239 Speaker 4: Like, AI is insane right now, Like it is like, 408 00:20:45,320 --> 00:20:49,240 Speaker 4: I don't, you know, there is one thread of the discourse, 409 00:20:49,520 --> 00:20:51,240 Speaker 4: and maybe this has died down, I don't know, that 410 00:20:51,359 --> 00:20:54,920 Speaker 4: is like, Oh, it's just hype from AI companies, when 411 00:20:55,000 --> 00:20:57,240 Speaker 4: AI companies are like, AI is going to be crazy, 412 00:20:57,320 --> 00:21:01,160 Speaker 4: Like I don't think that is true. It is interesting 413 00:21:01,200 --> 00:21:03,719 Speaker 4: to think about where are the bottlenecks, right, I do 414 00:21:03,800 --> 00:21:07,720 Speaker 4: think there might be bottlenecks in adoption, right, Like in 415 00:21:07,760 --> 00:21:09,320 Speaker 4: a superficial. 416 00:21:08,600 --> 00:21:11,680 Speaker 3: way, AI looks really good, but when you actually try 417 00:21:11,720 --> 00:21:15,480 Speaker 3: and get it to do stuff, it's kind of a 418 00:21:15,480 --> 00:21:16,440 Speaker 3: pain in the ass. 419 00:21:16,520 --> 00:21:22,920 Speaker 4: And like there's this interesting guy, Dwarkesh Patel, I believe, 420 00:21:22,960 --> 00:21:26,639 Speaker 4: who writes about AI and he interviews a lot 421 00:21:26,640 --> 00:21:28,200 Speaker 4: of the really smart AI people, and he made the 422 00:21:28,240 --> 00:21:30,240 Speaker 4: point that he's been trying to use it for his 423 00:21:30,280 --> 00:21:33,719 Speaker 4: own work, but that it's not good at like learning incrementally, 424 00:21:33,920 --> 00:21:35,600 Speaker 4: Like it can do an okay job, It can do 425 00:21:35,640 --> 00:21:38,240 Speaker 4: a five out of ten job, But a human being 426 00:21:38,280 --> 00:21:40,320 Speaker 4: that starts out a five out of ten at your company, 427 00:21:40,320 --> 00:21:41,720 Speaker 4: you can kind of get them up to eight out 428 00:21:41,720 --> 00:21:44,199 Speaker 4: of ten, and getting the AI to eight out of 429 00:21:44,200 --> 00:21:46,800 Speaker 4: ten, getting it to learn on a particular task, 430 00:21:46,840 --> 00:21:50,720 Speaker 4: actually still doesn't work. And like getting an AI to 431 00:21:50,800 --> 00:21:54,439 Speaker 4: actually do a job for you is still hard. And 432 00:21:54,480 --> 00:21:57,760 Speaker 4: so I don't know, I really don't know how it's 433 00:21:57,800 --> 00:21:59,800 Speaker 4: going to roll out, and I don't know what the 434 00:21:59,800 --> 00:22:01,120 Speaker 4: political response will be. 435 00:22:01,240 --> 00:22:04,959 Speaker 1: Do you feel like it's too early to speculate what 436 00:22:05,000 --> 00:22:10,080 Speaker 1: the differences are compared to previous technological changes or advances? 
437 00:22:12,240 --> 00:22:15,960 Speaker 4: I mean, I think one thing that is important to 438 00:22:16,119 --> 00:22:23,159 Speaker 4: remember is, at least so far, every time people have 439 00:22:23,320 --> 00:22:27,880 Speaker 4: lost jobs to technology, new jobs that people couldn't. 440 00:22:27,520 --> 00:22:30,560 Speaker 3: imagine before emerged, right, So. 441 00:22:30,680 --> 00:22:34,639 Speaker 4: like some number of people are displaced, some number of 442 00:22:34,680 --> 00:22:37,240 Speaker 4: people are worse off in the short. 443 00:22:36,960 --> 00:22:40,160 Speaker 3: to medium and sometimes long term, but always so far 444 00:22:40,359 --> 00:22:41,800 Speaker 3: there have been more new jobs. 445 00:22:42,000 --> 00:22:45,880 Speaker 4: People's wants and willingness to pay for things is insatiable, 446 00:22:46,000 --> 00:22:49,119 Speaker 4: and I suppose this may be the end of that, 447 00:22:49,160 --> 00:22:52,120 Speaker 4: but I wouldn't bet against that, you know. I mean, 448 00:22:53,320 --> 00:22:56,480 Speaker 4: I feel a little silly that making podcasts is my job, right, 449 00:22:56,520 --> 00:22:56,840 Speaker 4: trust me? 450 00:22:57,000 --> 00:22:58,879 Speaker 2: It is the insecurity I have about my jobs. 451 00:22:59,280 --> 00:23:03,439 Speaker 4: Yes, I understated, and that's because you know, we have 452 00:23:03,480 --> 00:23:06,000 Speaker 4: all these machines that make really cheap clothes and cheap 453 00:23:06,040 --> 00:23:08,159 Speaker 4: food and all of the basic things. You know, Like 454 00:23:08,520 --> 00:23:11,600 Speaker 4: if you just look at farming, right, like, these 455 00:23:11,720 --> 00:23:15,440 Speaker 4: are rough numbers. In eighteen hundred, ninety 456 00:23:15,440 --> 00:23:18,440 Speaker 4: percent of Americans were farmers. In nineteen hundred, fifty percent. 457 00:23:18,200 --> 00:23:20,320 Speaker 3: of Americans were farmers. Ish, maybe forty. 458 00:23:20,400 --> 00:23:23,480 Speaker 4: In two thousand, two percent of Americans were farmers, right, maybe 459 00:23:23,480 --> 00:23:26,040 Speaker 4: the greatest displacement of labor you can think of because 460 00:23:26,040 --> 00:23:29,680 Speaker 4: of the reaper and the tractor. Right, But like everybody 461 00:23:29,800 --> 00:23:31,440 Speaker 4: went to working factories and then they went to work 462 00:23:31,480 --> 00:23:34,399 Speaker 4: as you know, personal trainers and bank tellers and other things. 463 00:23:35,240 --> 00:23:40,159 Speaker 1: It also makes me wonder like the implications, to 464 00:23:40,160 --> 00:23:43,119 Speaker 1: your point, about like maybe there is no next 465 00:23:43,920 --> 00:23:46,320 Speaker 1: widespread job thing that people have, Like what do you 466 00:23:46,359 --> 00:23:51,199 Speaker 1: retrain for when it's just a computer taking everything? And 467 00:23:51,520 --> 00:23:53,800 Speaker 1: I wonder, like in terms of the TBD for what 468 00:23:53,880 --> 00:23:57,480 Speaker 1: government does, if it's even government that does this. Like, 469 00:23:57,520 --> 00:23:59,760 Speaker 1: just taking the US. We live in a country 470 00:23:59,800 --> 00:24:04,480 Speaker 1: that can barely agree on the federal level that we 471 00:24:04,520 --> 00:24:09,040 Speaker 1: should have just the basic social safety net in place, right, 472 00:24:09,080 --> 00:24:13,920 Speaker 1: compared to other industrialized, civilized countries. 
And so then 473 00:24:14,160 --> 00:24:17,439 Speaker 1: jump to, you know, well, what about universal basic income 474 00:24:17,520 --> 00:24:20,240 Speaker 1: and people who are really touting universal basic income? And 475 00:24:20,280 --> 00:24:22,919 Speaker 1: if you look at today's political climate, in what fucking 476 00:24:23,000 --> 00:24:27,440 Speaker 1: reality are you going to get these two generally right 477 00:24:27,480 --> 00:24:29,960 Speaker 1: of center, if you're looking at global standards, two political 478 00:24:30,000 --> 00:24:32,560 Speaker 1: parties to agree that UBI is going to be a 479 00:24:32,600 --> 00:24:34,000 Speaker 1: thing that we can do. And I don't mean to 480 00:24:34,000 --> 00:24:36,880 Speaker 1: be a cynic about it, but it concerns me greatly 481 00:24:37,040 --> 00:24:40,480 Speaker 1: that our politics in the democracy are designed to move slowly. 482 00:24:40,760 --> 00:24:41,359 Speaker 2: That's a given. 483 00:24:42,080 --> 00:24:45,520 Speaker 1: But if the technological displacement of jobs is moving faster 484 00:24:45,760 --> 00:24:47,680 Speaker 1: than our public policy, what happens? 485 00:24:49,080 --> 00:24:53,240 Speaker 4: I mean, that's definitely a possible bad outcome. I mean, 486 00:24:53,280 --> 00:24:55,160 Speaker 4: the short answer to what happens. So what you're imagining 487 00:24:55,240 --> 00:24:57,840 Speaker 4: is like lots of people lose their jobs because of 488 00:24:58,040 --> 00:25:00,520 Speaker 4: technological change, and the government doesn't do very much to help 489 00:25:00,520 --> 00:25:01,640 Speaker 4: you because we have. 490 00:25:01,480 --> 00:25:04,280 Speaker 1: Rich people benefiting from all this, and they just don't want 491 00:25:04,280 --> 00:25:05,200 Speaker 1: to give up their money. 492 00:25:05,520 --> 00:25:08,639 Speaker 4: It's possible, I mean, I mean, obviously it's the case 493 00:25:08,680 --> 00:25:11,720 Speaker 4: that you know, Northern European countries have much more robust 494 00:25:11,720 --> 00:25:14,000 Speaker 4: safety nets than we do. It's also the case that 495 00:25:14,040 --> 00:25:17,840 Speaker 4: like Medicare and Social Security are politically totally untouchable and 496 00:25:17,880 --> 00:25:23,520 Speaker 4: have broad support, right, and so like maybe there's like incrementalism, Right, 497 00:25:23,600 --> 00:25:27,360 Speaker 4: maybe you can get Social Security and Medicare ten years earlier. Right, 498 00:25:27,440 --> 00:25:30,600 Speaker 4: Like that seems like a plausible outcome. It is, in 499 00:25:30,600 --> 00:25:34,240 Speaker 4: fact, interesting that the people talking the most about universal 500 00:25:34,280 --> 00:25:38,320 Speaker 4: basic income are AI people and tech people who actually 501 00:25:38,359 --> 00:25:41,000 Speaker 4: think this might happen. Right, you know, there is this 502 00:25:41,280 --> 00:25:45,480 Speaker 4: essay that John Maynard Keynes, you know, of Keynesian economics 503 00:25:45,520 --> 00:25:51,520 Speaker 4: fame, wrote in nineteen thirty, I think, called Economic Possibilities 504 00:25:51,560 --> 00:25:53,880 Speaker 4: for Our Grandchildren, and I think that was actually where 505 00:25:53,920 --> 00:25:58,560 Speaker 4: the term technological unemployment was first used. 
Right, so, you 506 00:25:58,600 --> 00:26:02,240 Speaker 4: know, nineteen thirty, right, the world's going into the Depression, but 507 00:26:02,320 --> 00:26:07,200 Speaker 4: he's thinking generations ahead, and he's thinking about technological change 508 00:26:07,800 --> 00:26:10,919 Speaker 4: and robots taking our jobs, as we would sort of 509 00:26:10,920 --> 00:26:14,320 Speaker 4: colloquially say today, and he's imagining, like, well, what if 510 00:26:14,320 --> 00:26:16,639 Speaker 4: our grandchildren are just working fifteen hours a week. He's like, 511 00:26:16,680 --> 00:26:19,000 Speaker 4: first of all, it's hard not to work at all, right, 512 00:26:19,040 --> 00:26:21,359 Speaker 4: we're sort of wired to want to do something, but 513 00:26:21,440 --> 00:26:24,200 Speaker 4: maybe you don't have to work that much. And people 514 00:26:24,240 --> 00:26:26,200 Speaker 4: sort of look at fifteen hours a week now 515 00:26:26,240 --> 00:26:28,679 Speaker 4: and laugh like, oh haha, how charmingly wrong he is. 516 00:26:29,160 --> 00:26:32,280 Speaker 4: But I was looking as I was thinking about this interview, 517 00:26:32,320 --> 00:26:34,080 Speaker 4: and if you look at hours a week, it was 518 00:26:34,200 --> 00:26:38,520 Speaker 4: like sixty hours a week in nineteen hundred and fifty 519 00:26:38,560 --> 00:26:41,119 Speaker 4: hours a week in nineteen thirty when he wrote that, 520 00:26:41,320 --> 00:26:43,960 Speaker 4: we sort of plateaued at forty in this country, but 521 00:26:44,040 --> 00:26:46,520 Speaker 4: in the Netherlands they're like down to thirty two hours 522 00:26:46,560 --> 00:26:50,359 Speaker 4: a week now, like all these days off, right, 523 00:26:50,400 --> 00:26:52,480 Speaker 4: with a lot of Mondays off thrown in 524 00:26:52,440 --> 00:26:55,960 Speaker 4: the mix. And so I don't know, like, yes, 525 00:26:56,040 --> 00:26:58,320 Speaker 4: there are many, if we think of possibility space, there 526 00:26:58,320 --> 00:27:03,000 Speaker 4: are definitely bad outcomes. I'm not entirely pessimistic, like, look, 527 00:27:03,040 --> 00:27:06,479 Speaker 4: it is the case that there is a progress aspect 528 00:27:06,520 --> 00:27:08,200 Speaker 4: to this, right, Like, yes, a lot of things could 529 00:27:08,240 --> 00:27:09,600 Speaker 4: be bad, and a lot of things can go wrong, 530 00:27:09,600 --> 00:27:11,840 Speaker 4: and maybe we're not in the best possible timeline. Certainly 531 00:27:11,840 --> 00:27:13,119 Speaker 4: we're not in the best possible timeline. 532 00:27:13,119 --> 00:27:13,520 Speaker 2: But like. 533 00:27:15,280 --> 00:27:19,560 Speaker 4: Having machines do stuff instead of people, it does mean 534 00:27:19,600 --> 00:27:21,679 Speaker 4: there is more abundance, right, And so there is this 535 00:27:21,760 --> 00:27:24,359 Speaker 4: problem that you're pointing to of like sharing the abundance, 536 00:27:24,440 --> 00:27:26,760 Speaker 4: making sure that it doesn't all go to six people, right, 537 00:27:26,800 --> 00:27:29,680 Speaker 4: which is a real thing to worry about. 538 00:27:29,760 --> 00:27:32,760 Speaker 3: But there will be more to go around, I guess 539 00:27:32,760 --> 00:27:33,160 Speaker 3: if it's. 
540 00:27:33,080 --> 00:27:36,440 Speaker 1: Structured, right, yes, And you know I am not, I'm 541 00:27:36,480 --> 00:27:38,679 Speaker 1: not such a Luddite that I don't acknowledge that we, 542 00:27:39,520 --> 00:27:44,000 Speaker 1: you know, the advances in medical research and science and 543 00:27:44,119 --> 00:27:47,080 Speaker 1: exploration and astronomy, and all of those things are 544 00:27:47,160 --> 00:27:48,760 Speaker 1: going to be incredible. I was just talking with a 545 00:27:49,480 --> 00:27:52,800 Speaker 1: doctor who I ran into last night, who was touting 546 00:27:52,840 --> 00:27:56,600 Speaker 1: how exciting it is in her field for AI research 547 00:27:56,640 --> 00:27:59,159 Speaker 1: in medicine, and I have the exact opposite viewpoint as 548 00:27:59,160 --> 00:27:59,560 Speaker 1: an artist. 549 00:27:59,680 --> 00:28:02,000 Speaker 4: Right, let me ask you a question. I've been curious, 550 00:28:02,600 --> 00:28:04,200 Speaker 4: do you ever use AI for work? 551 00:28:04,760 --> 00:28:08,120 Speaker 1: I have tried because I was curious, and thank god, 552 00:28:08,240 --> 00:28:11,800 Speaker 1: AI does a piss poor job at writing jokes. I'm 553 00:28:11,840 --> 00:28:14,880 Speaker 1: sure one day it will learn. Part of that, obviously, 554 00:28:14,960 --> 00:28:17,520 Speaker 1: is that comedy is so subjective. 555 00:28:18,480 --> 00:28:19,280 Speaker 2: But no. 556 00:28:19,560 --> 00:28:22,000 Speaker 1: In fact, what I've used it for and what I 557 00:28:22,040 --> 00:28:25,480 Speaker 1: find to be helpful is that as a writer and actor, 558 00:28:25,600 --> 00:28:30,359 Speaker 1: my brain is usually very scattered. Like you know, people 559 00:28:30,400 --> 00:28:32,600 Speaker 1: like me think of the ten things that you're not 560 00:28:32,640 --> 00:28:35,040 Speaker 1: supposed to say out loud, but like that's constantly in 561 00:28:35,080 --> 00:28:38,160 Speaker 1: our brain, and so you always have a filter 562 00:28:38,440 --> 00:28:40,560 Speaker 1: on your brain depending on the setting you're in. Like 563 00:28:40,560 --> 00:28:43,000 Speaker 1: if I'm on stage doing a stand up routine, that 564 00:28:43,040 --> 00:28:45,160 Speaker 1: filter is off and generally it works and sometimes it 565 00:28:45,200 --> 00:28:47,280 Speaker 1: will get you in trouble. But in normal day to 566 00:28:47,360 --> 00:28:49,600 Speaker 1: day interactions, there's like this weird filter you have to 567 00:28:49,600 --> 00:28:54,120 Speaker 1: put on. And so when I'm writing what by 568 00:28:54,160 --> 00:28:57,120 Speaker 1: normal people's standards would be like a professional document or 569 00:28:57,120 --> 00:29:00,960 Speaker 1: a professional email or a professional text, my mind has 570 00:29:01,080 --> 00:29:04,000 Speaker 1: nine paragraphs of things to describe what I want to say, 571 00:29:04,360 --> 00:29:07,160 Speaker 1: and I just need two sentences. So I have found 572 00:29:07,200 --> 00:29:10,240 Speaker 1: that something like that is a helpful tool, but I 573 00:29:10,280 --> 00:29:13,240 Speaker 1: have not found that in my actual professional life that 574 00:29:13,400 --> 00:29:17,680 Speaker 1: there's anything that I have that I've benefited from. 575 00:29:17,840 --> 00:29:19,360 Speaker 1: That said, I know it's all coming. 576 00:29:19,680 --> 00:29:22,320 Speaker 3: So you're saying it's good for taking all the creativity 577 00:29:22,360 --> 00:29:23,080 Speaker 3: out of your. 
578 00:29:23,000 --> 00:29:23,600 Speaker 2: Life so far. 579 00:29:23,720 --> 00:29:26,200 Speaker 3: Yeah, anti, yeah, yeah, yeah. 580 00:29:25,640 --> 00:29:26,720 Speaker 2: That's been my experience. 581 00:29:26,720 --> 00:29:28,840 Speaker 1: But again, the, you know, I read about all of 582 00:29:28,880 --> 00:29:32,960 Speaker 1: the things that AI companies are working on to replace actors, 583 00:29:33,000 --> 00:29:35,720 Speaker 1: to clone our voice, our performances, coming up with 584 00:29:35,840 --> 00:29:40,840 Speaker 1: just completely new actors slash characters or personalities. So those 585 00:29:40,840 --> 00:29:42,719 Speaker 1: are the things that I don't, you know, that I 586 00:29:42,760 --> 00:29:46,040 Speaker 1: hope won't take my job. Like you said this earlier, 587 00:29:46,080 --> 00:29:48,320 Speaker 1: I am of a certain age where like I'd love 588 00:29:48,320 --> 00:29:51,120 Speaker 1: a good twenty to twenty five more years in my career. 589 00:29:51,440 --> 00:29:53,960 Speaker 1: But man, I'm glad I'm not eighteen going to drama school. 590 00:29:54,160 --> 00:29:56,720 Speaker 1: It's a whole different ballgame. 591 00:29:57,520 --> 00:29:59,520 Speaker 4: I mean, I feel like if you're eighteen and going 592 00:29:59,560 --> 00:30:02,520 Speaker 4: to drama school, you've got to think of using the tools, right, 593 00:30:02,720 --> 00:30:07,840 Speaker 4: Like, that's a classic technology thing, is you don't want 594 00:30:07,880 --> 00:30:11,080 Speaker 4: to be competing against the tool, the technology. You want 595 00:30:11,080 --> 00:30:13,520 Speaker 4: to be using the technology. And you've seen that. I 596 00:30:13,520 --> 00:30:15,560 Speaker 4: mean to get back to the sort of historical arc, 597 00:30:15,680 --> 00:30:18,320 Speaker 4: like that was a thing that happened in factories, right, 598 00:30:18,400 --> 00:30:21,560 Speaker 4: Like US factories in many instances are high tech. Right, 599 00:30:21,600 --> 00:30:25,640 Speaker 4: There's like CNC machining, that's computer numerical control machining. 600 00:30:25,720 --> 00:30:28,960 Speaker 3: Like they're sort of like tech jobs for kind of. 601 00:30:29,040 --> 00:30:33,560 Speaker 4: skilled workers, right, And even like when I worked at NPR, 602 00:30:33,880 --> 00:30:36,560 Speaker 4: Like there were the engineers in the studio who were, 603 00:30:36,640 --> 00:30:38,480 Speaker 4: you know, running the board, but then there was the 604 00:30:38,520 --> 00:30:41,880 Speaker 4: guy who was the manager of those guys, who was 605 00:30:41,920 --> 00:30:44,400 Speaker 4: like very much like an engineer at heart, and he loved 606 00:30:44,400 --> 00:30:46,280 Speaker 4: doing the stuff, but he was also like building the 607 00:30:46,280 --> 00:30:49,360 Speaker 4: systems and like running the servers and like he was 608 00:30:49,400 --> 00:30:51,680 Speaker 4: the guy who figured out how the reporters could set up, 609 00:30:51,760 --> 00:30:54,560 Speaker 4: could record ourselves, right, And so that's the guy you 610 00:30:54,600 --> 00:30:56,280 Speaker 4: want to be, right, You want to be the guy 611 00:30:56,920 --> 00:31:01,320 Speaker 4: using the tools. And like I haven't given up on that, Like, 612 00:31:01,360 --> 00:31:03,480 Speaker 4: in the long run, I assume AI will be better 613 00:31:03,520 --> 00:31:06,400 Speaker 4: at doing everything that I do than I am, but 614 00:31:06,480 --> 00:31:09,440 Speaker 4: hopefully that'll take a while. 
There's some number of people 615 00:31:09,440 --> 00:31:12,160 Speaker 4: who are like used to hearing me and like I 616 00:31:12,200 --> 00:31:15,160 Speaker 4: can sort of use AI to maybe work faster or 617 00:31:15,200 --> 00:31:17,800 Speaker 4: be smarter, right to do to figure out in five 618 00:31:17,840 --> 00:31:19,600 Speaker 4: minutes what it would have taken me an hour of 619 00:31:19,640 --> 00:31:22,360 Speaker 4: Google searching to figure out, and obviously still vet it, 620 00:31:22,640 --> 00:31:25,640 Speaker 4: but like, for the medium term, that's what I'm banking on, 621 00:31:26,280 --> 00:31:29,080 Speaker 4: and I do think that's like a healthier relationship to 622 00:31:29,520 --> 00:31:30,560 Speaker 4: technology in general. 623 00:31:37,440 --> 00:31:40,600 Speaker 1: I was invited to an AI exhibit recently, an AI 624 00:31:40,760 --> 00:31:44,320 Speaker 1: art exhibit, I'm putting art in air quotes, by a 625 00:31:44,360 --> 00:31:47,000 Speaker 1: friend who is a tech bro, and he's like, hey, 626 00:31:47,040 --> 00:31:49,160 Speaker 1: I've been working for the last eighteen months on these 627 00:31:49,880 --> 00:31:52,240 Speaker 1: art pieces, these AI art pieces. Will you come to 628 00:31:52,280 --> 00:31:54,480 Speaker 1: the gallery opening? And I'm like, I love you. There 629 00:31:54,520 --> 00:31:57,080 Speaker 1: is nothing that I would rather do less on a 630 00:31:57,120 --> 00:32:00,840 Speaker 1: Tuesday night than come and see the product of you 631 00:32:01,040 --> 00:32:04,640 Speaker 1: getting high in front of your laptop and pressing buttons. 632 00:32:05,320 --> 00:32:08,200 Speaker 3: No, sorry, man. If I wanted that, I'd stay home. 633 00:32:08,320 --> 00:32:12,800 Speaker 2: I would stay home and do it myself. But yeah, 634 00:32:12,880 --> 00:32:14,200 Speaker 2: I'm not a fan of that. 635 00:32:14,360 --> 00:32:16,160 Speaker 1: I don't want to get too off track, and I 636 00:32:16,680 --> 00:32:19,320 Speaker 1: want I want to ask a comparison question about the 637 00:32:19,360 --> 00:32:21,840 Speaker 1: Luddites and today. In the Luddite era, you write 638 00:32:21,880 --> 00:32:25,520 Speaker 1: that productivity went way up, but wages for regular workers barely 639 00:32:25,560 --> 00:32:28,080 Speaker 1: moved for decades. That's something that we're obviously used to. 640 00:32:28,160 --> 00:32:31,080 Speaker 1: You see that time and time again. You also talked 641 00:32:31,080 --> 00:32:34,680 Speaker 1: about how the generations after the Luddites benefited from the 642 00:32:34,680 --> 00:32:36,640 Speaker 1: machines themselves that replaced them. 643 00:32:36,920 --> 00:32:38,720 Speaker 2: So I'm curious, like, what are the big. 644 00:32:38,600 --> 00:32:41,840 Speaker 1: Lessons from that period about who benefits from new technology? 645 00:32:42,200 --> 00:32:45,400 Speaker 1: What are the people in the generation that 646 00:32:45,440 --> 00:32:48,400 Speaker 1: is directly impacted by machines supposed to do? I know 647 00:32:48,400 --> 00:32:50,560 Speaker 1: we talked about that a little bit, but all that's 648 00:32:50,640 --> 00:32:54,520 Speaker 1: leading to right now with AI, knowing how much diversity 649 00:32:54,520 --> 00:32:58,440 Speaker 1: of opinion there is, how much panic and excitement there is, 650 00:32:58,480 --> 00:32:59,680 Speaker 1: what should our outlook be? 651 00:33:00,720 --> 00:33:01,560 Speaker 3: That's super hard.
652 00:33:02,720 --> 00:33:06,600 Speaker 4: I mean, you know, the I think the closest thing 653 00:33:06,680 --> 00:33:11,000 Speaker 4: in recent times that comes to my mind is workers 654 00:33:11,000 --> 00:33:14,760 Speaker 4: who've lost their job to foreign competition. Right Like, in 655 00:33:14,800 --> 00:33:17,479 Speaker 4: the kind of early part of the aughts, there was 656 00:33:17,560 --> 00:33:20,160 Speaker 4: what's come to be called the China Shock, which was 657 00:33:20,200 --> 00:33:22,600 Speaker 4: you know, China entered the World Trade Organization in like 658 00:33:22,640 --> 00:33:24,920 Speaker 4: two thousand. It's funny, not that long ago, right, China 659 00:33:25,160 --> 00:33:26,760 Speaker 4: used to be a super poor country, was not a 660 00:33:26,840 --> 00:33:31,080 Speaker 4: major competitor, and then bam, they entered the WTO. And 661 00:33:31,520 --> 00:33:35,240 Speaker 4: there were places in the United States that were making 662 00:33:35,280 --> 00:33:39,920 Speaker 4: things that competed with Chinese imports, like clothes, and they 663 00:33:40,000 --> 00:33:41,680 Speaker 4: got obliterated. 664 00:33:42,640 --> 00:33:46,720 Speaker 3: The United States as a whole did well. And I 665 00:33:46,880 --> 00:33:49,760 Speaker 3: mean ordinary people, right, Like, it is the case that like. 666 00:33:50,680 --> 00:33:54,000 Speaker 4: For working people, when clothes get cheaper, that is good, right. 667 00:33:54,040 --> 00:33:56,200 Speaker 4: It means they have more money in their pocket, more 668 00:33:56,240 --> 00:33:57,840 Speaker 4: money to spend on basic things. And I think it's 669 00:33:57,880 --> 00:34:01,240 Speaker 4: easy to overlook that part. But the people and the 670 00:34:01,240 --> 00:34:04,720 Speaker 4: towns that were competing against China were worse off, right, 671 00:34:04,760 --> 00:34:07,400 Speaker 4: So overall everybody was better off, And it's important to 672 00:34:07,400 --> 00:34:09,280 Speaker 4: say that, Like, it wasn't just rich. 673 00:34:09,160 --> 00:34:10,200 Speaker 3: People who got better off. 674 00:34:10,239 --> 00:34:12,640 Speaker 4: It wasn't just the owners of capital, right, And I 675 00:34:12,680 --> 00:34:15,520 Speaker 4: think that is likely to be true here as well. Right, 676 00:34:15,520 --> 00:34:18,439 Speaker 4: it is a competitive world, and I do think that 677 00:34:18,480 --> 00:34:21,640 Speaker 4: there is a universe where overall people are better off 678 00:34:21,640 --> 00:34:24,799 Speaker 4: from AI. The really hard question is how do you 679 00:34:24,840 --> 00:34:28,800 Speaker 4: help people who are clearly losing their jobs because of AI, Right, Like, well, 680 00:34:29,000 --> 00:34:31,920 Speaker 4: give them money seems like a good part of an 681 00:34:31,920 --> 00:34:34,120 Speaker 4: answer to me. Will it happen politically? I don't know, 682 00:34:34,160 --> 00:34:35,640 Speaker 4: for the reasons you said, but if I were waving 683 00:34:35,680 --> 00:34:39,279 Speaker 4: a wand, give them money. In a way, though, it 684 00:34:39,360 --> 00:34:41,160 Speaker 4: seems like, I don't want to say the easy part, 685 00:34:41,200 --> 00:34:42,800 Speaker 4: because it would be great if it happened, politically it 686 00:34:42,840 --> 00:34:45,840 Speaker 4: would be hard. But there is this more complicated thing 687 00:34:47,200 --> 00:34:49,640 Speaker 4: for everyone, right, and Keynes talked about it one hundred 688 00:34:49,680 --> 00:34:52,440 Speaker 4: years ago.
It's like work is meaning for a lot 689 00:34:52,480 --> 00:34:59,960 Speaker 4: of people, right, Work is purpose, it is just an organizing 690 00:35:00,239 --> 00:35:02,440 Speaker 4: principle in life. And so if we project forward, I 691 00:35:02,440 --> 00:35:03,880 Speaker 4: don't know what's going to happen. I don't think all 692 00:35:03,920 --> 00:35:05,680 Speaker 4: jobs are going to be gone in five years. Maybe 693 00:35:05,680 --> 00:35:07,560 Speaker 4: I'll be wrong, but I doubt that. But if in 694 00:35:07,600 --> 00:35:12,040 Speaker 4: fact lots of people are losing their jobs because of AI, yes, 695 00:35:12,200 --> 00:35:14,160 Speaker 4: please give them money. Of course there will be more 696 00:35:14,160 --> 00:35:16,160 Speaker 4: money to go around. There should be money for those people. 697 00:35:16,520 --> 00:35:19,279 Speaker 4: But like, what do we do about everything else? How 698 00:35:19,280 --> 00:35:22,479 Speaker 4: do we like help them find meaning in their life? 699 00:35:22,520 --> 00:35:22,719 Speaker 2: Like that? 700 00:35:23,280 --> 00:35:26,520 Speaker 4: It feels handwavy to me, but like might be my 701 00:35:26,600 --> 00:35:30,160 Speaker 4: own problem in three years, right, Like yeah, my identity 702 00:35:30,200 --> 00:35:32,800 Speaker 4: is my work to some significant degree, and like a 703 00:35:32,840 --> 00:35:34,359 Speaker 4: technology might put me out of a job. You know, 704 00:35:34,400 --> 00:35:36,120 Speaker 4: it's easier for me to talk about now that it's me. 705 00:35:36,160 --> 00:35:38,160 Speaker 4: I don't just sound like some asshole talking about other 706 00:35:38,160 --> 00:35:41,080 Speaker 4: people in a condescending way, like this is me, Like truly, 707 00:35:41,800 --> 00:35:43,120 Speaker 4: AI could put me out of a job before I 708 00:35:43,120 --> 00:35:45,120 Speaker 4: want to be out of a job, and like it 709 00:35:45,160 --> 00:35:47,640 Speaker 4: will be hard in many ways if that happens. Not 710 00:35:47,800 --> 00:35:48,680 Speaker 4: just the money. 711 00:35:49,120 --> 00:35:52,000 Speaker 1: I worry about that not just in the interim, but when 712 00:35:52,000 --> 00:35:54,880 Speaker 1: we're dead, you know, the next 713 00:35:55,000 --> 00:35:56,480 Speaker 1: eighty years. 714 00:35:56,400 --> 00:35:57,040 Speaker 2: Down the line. 715 00:35:57,640 --> 00:36:01,520 Speaker 1: Yeah, if you have new classes of people that 716 00:36:01,600 --> 00:36:05,560 Speaker 1: are so completely diametrically, like they don't mix. You've got 717 00:36:05,600 --> 00:36:09,520 Speaker 1: this AI job class of folks who are working. And 718 00:36:09,560 --> 00:36:13,799 Speaker 1: then, even if there's success at something like UBI, people 719 00:36:13,880 --> 00:36:17,000 Speaker 1: who have been pushed out, who are then economically locked, 720 00:36:17,600 --> 00:36:20,759 Speaker 1: whose children are economically locked in a particular scenario, does 721 00:36:20,800 --> 00:36:25,080 Speaker 1: that breed a type of class resentment that's even more 722 00:36:25,120 --> 00:36:27,759 Speaker 1: extreme than what we see today? You know, you see 723 00:36:28,320 --> 00:36:31,920 Speaker 1: scapegoating of immigrants. The fear that I have about how 724 00:36:31,960 --> 00:36:35,640 Speaker 1: that could be exacerbated without the right public policy
once 725 00:36:35,719 --> 00:36:38,480 Speaker 1: the people who have lived it have died off is 726 00:36:38,520 --> 00:36:39,320 Speaker 1: also very scary. 727 00:36:39,360 --> 00:36:40,080 Speaker 2: I don't have to worry. 728 00:36:39,920 --> 00:36:42,280 Speaker 1: About that because I'll be dead, but it is something 729 00:36:42,320 --> 00:36:46,360 Speaker 1: that I think, you know, that worries me. My question 730 00:36:46,400 --> 00:36:49,399 Speaker 1: for you is literally the exact opposite, because I hate 731 00:36:49,440 --> 00:36:52,640 Speaker 1: wrapping up on a doomsday scenario. I think it's very 732 00:36:52,680 --> 00:36:56,239 Speaker 1: easy to do. I'm not a rage-baiter. When you 733 00:36:56,280 --> 00:37:00,480 Speaker 1: imagine the next ten or twenty years, what are kind 734 00:37:00,480 --> 00:37:04,560 Speaker 1: of the most realistic ways that technology might change our 735 00:37:04,680 --> 00:37:08,239 Speaker 1: jobs that don't fit into either a doomsday scenario or 736 00:37:08,320 --> 00:37:09,480 Speaker 1: a utopian story? 737 00:37:11,280 --> 00:37:13,760 Speaker 3: Yeah, that's nice. So like, kind of the middle 738 00:37:13,800 --> 00:37:15,800 Speaker 3: of the probability distribution. 739 00:37:15,480 --> 00:37:18,120 Speaker 1: Right, Like, if I asked AI to 740 00:37:18,120 --> 00:37:20,000 Speaker 1: take the emotion out of that question, that's what it 741 00:37:20,000 --> 00:37:20,719 Speaker 1: would have given me. 742 00:37:21,000 --> 00:37:21,920 Speaker 3: Yeah. 743 00:37:22,080 --> 00:37:25,720 Speaker 4: No, it's interesting to think about, right, because it's more subtle, 744 00:37:25,840 --> 00:37:28,000 Speaker 4: So people, it doesn't make so much of a story. Okay, 745 00:37:28,040 --> 00:37:31,279 Speaker 4: so let's think about that. So one thing, one thing 746 00:37:31,320 --> 00:37:36,480 Speaker 4: in that story is the capability for AI really to 747 00:37:36,600 --> 00:37:38,560 Speaker 4: actually do people's jobs. 748 00:37:39,320 --> 00:37:41,320 Speaker 3: It emerges rather slowly. 749 00:37:41,120 --> 00:37:45,360 Speaker 4: Right, It is constrained for reasons partly technological but partly 750 00:37:45,400 --> 00:37:50,080 Speaker 4: also institutional, Like companies have all this sort of tacit information, right, 751 00:37:50,160 --> 00:37:52,600 Speaker 4: Like people at companies know how to do all these things, 752 00:37:52,600 --> 00:37:55,200 Speaker 4: but they never wrote it down right, and so AI 753 00:37:55,320 --> 00:37:57,440 Speaker 4: can't just know that like, oh you got to call 754 00:37:57,560 --> 00:38:00,719 Speaker 4: John in accounts receivable when this happens, like that kind 755 00:38:00,719 --> 00:38:04,239 Speaker 4: of thing. So it's slow, right, That's one important thing, 756 00:38:04,440 --> 00:38:06,879 Speaker 4: is like if you can see the transition coming, if 757 00:38:06,920 --> 00:38:12,360 Speaker 4: you know that it's not tomorrow, like you know, kids, 758 00:38:12,400 --> 00:38:14,560 Speaker 4: don't become a journalist if you're twenty maybe, or if 759 00:38:14,560 --> 00:38:16,280 Speaker 4: you do, learn how to do it with AI, because 760 00:38:16,360 --> 00:38:17,080 Speaker 4: it's going to change. 761 00:38:17,120 --> 00:38:18,560 Speaker 3: So it's slow, right.
762 00:38:19,000 --> 00:38:25,200 Speaker 4: Two, to the extent it happens, it will increase output, right, like, 763 00:38:25,280 --> 00:38:27,759 Speaker 4: it will make us more productive and more efficient, and 764 00:38:27,880 --> 00:38:30,000 Speaker 4: like at a basic level that is. 765 00:38:29,960 --> 00:38:32,200 Speaker 3: Good and I think it is underappreciated. Right. 766 00:38:32,239 --> 00:38:35,400 Speaker 4: It will increase material abundance. It will help us whatever 767 00:38:35,480 --> 00:38:38,359 Speaker 4: you want, more clean energy, certainly we can get better 768 00:38:38,360 --> 00:38:40,960 Speaker 4: at doing clean energy, get better at doing battery storage 769 00:38:41,000 --> 00:38:44,080 Speaker 4: of clean energy, better medicine, Like there are happy things 770 00:38:44,080 --> 00:38:47,239 Speaker 4: that will happen. The economic policy in the government is 771 00:38:47,280 --> 00:38:49,719 Speaker 4: a complicated one. I mean, if you want more redistribution, 772 00:38:49,960 --> 00:38:52,400 Speaker 4: like when there is more wealth, you can generally have 773 00:38:52,480 --> 00:38:56,040 Speaker 4: more redistribution, right. If people are having crazy 774 00:38:56,120 --> 00:38:58,600 Speaker 4: capital gains windfalls, perhaps you could raise the capital gains 775 00:38:58,600 --> 00:39:00,799 Speaker 4: tax and use that to help the people who are 776 00:39:00,800 --> 00:39:02,319 Speaker 4: being put out of work, or help people go on 777 00:39:02,360 --> 00:39:04,080 Speaker 4: Medicare at fifty five instead. 778 00:39:03,760 --> 00:39:06,000 Speaker 3: Of sixty five. Right, That would be a very popular 779 00:39:06,080 --> 00:39:07,000 Speaker 3: political program. 780 00:39:07,440 --> 00:39:10,600 Speaker 4: Still, you will have people who are losing their jobs 781 00:39:10,680 --> 00:39:15,399 Speaker 4: because of AI, who, because of the political realities, may 782 00:39:15,440 --> 00:39:18,040 Speaker 4: well be worse off financially, who even if they are 783 00:39:18,120 --> 00:39:21,480 Speaker 4: better off financially, lose a sense of meaning in their life. 784 00:39:21,640 --> 00:39:21,960 Speaker 3: Probably. 785 00:39:22,000 --> 00:39:24,680 Speaker 4: You know, often jobs and job types are clustered, so 786 00:39:24,719 --> 00:39:27,080 Speaker 4: it won't be just an individual losing their job. You 787 00:39:27,080 --> 00:39:32,520 Speaker 4: will have communities impacted, you know, communities having this economic. 788 00:39:33,920 --> 00:39:36,600 Speaker 3: Disaster. Perhaps if it's bad, right, like like with the. 789 00:39:36,600 --> 00:39:39,040 Speaker 4: China shock, and so, you know, you might see this 790 00:39:39,120 --> 00:39:41,359 Speaker 4: kind of heterogeneous outcome. And I don't want to make 791 00:39:41,400 --> 00:39:43,120 Speaker 4: it as simple as rich and poor. I don't want 792 00:39:43,160 --> 00:39:45,279 Speaker 4: to make it as simple as like the trillionaires get 793 00:39:45,320 --> 00:39:48,680 Speaker 4: more trillions and everybody else is screwed. I think, in fact, 794 00:39:48,719 --> 00:39:52,160 Speaker 4: a likely outcome is more subtle than that and more varied. 795 00:39:52,239 --> 00:39:55,160 Speaker 4: And there are some people who are working class, middle 796 00:39:55,160 --> 00:39:57,239 Speaker 4: class who learn how to use AI who get more 797 00:39:57,239 --> 00:39:59,480 Speaker 4: money because they're being more productive.
798 00:40:00,120 --> 00:40:01,560 Speaker 3: Others who are totally screwed. 799 00:40:02,280 --> 00:40:05,600 Speaker 1: Then final question for you, because we opened with this: 800 00:40:06,000 --> 00:40:08,319 Speaker 1: when you look at this long history of people who 801 00:40:08,680 --> 00:40:11,279 Speaker 1: have been afraid that machines would take their jobs, and 802 00:40:11,760 --> 00:40:14,759 Speaker 1: many cases where they did, what still worries you when 803 00:40:14,800 --> 00:40:17,360 Speaker 1: you look ahead? And what gives you the most hope? 804 00:40:17,840 --> 00:40:20,319 Speaker 4: So I guess what still worries me when I look 805 00:40:20,360 --> 00:40:24,000 Speaker 4: ahead is the rate of change of AI, because when 806 00:40:24,040 --> 00:40:27,480 Speaker 4: you look at these instances and the alternatives, right, so, 807 00:40:27,640 --> 00:40:30,400 Speaker 4: like the Luddites are this very dramatic story where these 808 00:40:30,400 --> 00:40:32,480 Speaker 4: people who had a decent life lost their jobs to 809 00:40:32,520 --> 00:40:36,240 Speaker 4: machines and their lives got worse. On a much vaster 810 00:40:36,400 --> 00:40:40,480 Speaker 4: scale was the automation of farm work, right, that was 811 00:40:40,560 --> 00:40:43,760 Speaker 4: like almost everybody worked on a farm, and then almost 812 00:40:43,760 --> 00:40:46,440 Speaker 4: nobody works on a farm now. But there wasn't really 813 00:40:46,840 --> 00:40:49,479 Speaker 4: I mean, you know, farmers organized politically in various ways, 814 00:40:49,480 --> 00:40:51,360 Speaker 4: and they wanted different policies. But you never had a 815 00:40:51,440 --> 00:40:53,880 Speaker 4: kind of Luddite moment for farmers, in part because 816 00:40:53,880 --> 00:40:57,560 Speaker 4: it was gradual, in part because you had industrialization alongside it, 817 00:40:57,600 --> 00:40:59,840 Speaker 4: so people could leave the farms and go work in 818 00:40:59,840 --> 00:41:04,360 Speaker 4: a factory. So it's if it's really sudden, if a 819 00:41:04,440 --> 00:41:06,799 Speaker 4: huge amount of people lose their jobs really fast, like 820 00:41:07,320 --> 00:41:12,359 Speaker 4: that just feels super politically dangerous and unstable, right, Like 821 00:41:12,640 --> 00:41:15,279 Speaker 4: it could be violent, it could be bad. I mean, 822 00:41:15,400 --> 00:41:17,960 Speaker 4: the hopeful thing fundamentally to me is that people have 823 00:41:18,000 --> 00:41:20,520 Speaker 4: been worried about technological unemployment for two hundred years, and 824 00:41:20,600 --> 00:41:22,880 Speaker 4: indeed pockets of people have really suffered from it for 825 00:41:22,880 --> 00:41:27,560 Speaker 4: long periods of time. But like today, the employment 826 00:41:27,560 --> 00:41:30,440 Speaker 4: to population ratio, the share of working age people. 827 00:41:30,440 --> 00:41:32,240 Speaker 3: With jobs is like as high. 828 00:41:32,040 --> 00:41:34,880 Speaker 4: As it has ever been, right, much higher than it 829 00:41:34,960 --> 00:41:39,120 Speaker 4: was fifty years ago, when far fewer women were working, 830 00:41:39,640 --> 00:41:44,680 Speaker 4: despite incredible amounts of innovation, technological change, you know, technological 831 00:41:44,760 --> 00:41:47,800 Speaker 4: job loss. So like, I think we really do underrate 832 00:41:47,880 --> 00:41:49,759 Speaker 4: the extent to which we are good at coming up 833 00:41:49,760 --> 00:41:52,839 Speaker 4: with new jobs.
Jobs we cannot imagine today, And like 834 00:41:53,000 --> 00:41:55,120 Speaker 4: I actually think that'll happen in the long run. It'll 835 00:41:55,120 --> 00:41:57,719 Speaker 4: be maybe embodied things, you know, like maybe I'll go, 836 00:41:58,600 --> 00:42:00,120 Speaker 4: I don't know, be a meditation 837 00:41:59,840 --> 00:42:01,920 Speaker 3: instructor or something. And yes, you could have an AI be 838 00:42:01,960 --> 00:42:02,960 Speaker 3: a meditation instructor. 839 00:42:02,960 --> 00:42:05,680 Speaker 4: But there are some things I think people, who will 840 00:42:05,719 --> 00:42:08,560 Speaker 4: be richer overall, will pay for a human being to do. 841 00:42:09,080 --> 00:42:10,560 Speaker 3: I just don't know what those things will. 842 00:42:10,440 --> 00:42:13,520 Speaker 1: Be, So forget podcasting. We will have other jobs that 843 00:42:13,600 --> 00:42:15,440 Speaker 1: we just can't even think of right now. 844 00:42:16,520 --> 00:42:19,279 Speaker 4: I believe that we, you and I, I don't know, 845 00:42:19,360 --> 00:42:22,120 Speaker 4: but people will have jobs, our grandkids. 846 00:42:22,680 --> 00:42:26,120 Speaker 3: Yeah, I mean it could happen to me. Yeah, it 847 00:42:26,160 --> 00:42:27,520 Speaker 3: will be super interesting. 848 00:42:27,920 --> 00:42:30,040 Speaker 2: Yeah, for sure. What a time to be alive to 849 00:42:30,040 --> 00:42:30,560 Speaker 2: witness it. 850 00:42:30,600 --> 00:42:33,000 Speaker 3: You know, what a fucking time to be alive. 851 00:42:34,719 --> 00:42:37,839 Speaker 1: That was journalist Jacob Goldstein. To hear more of what's 852 00:42:37,840 --> 00:42:41,280 Speaker 1: in Jacob's brain, listen to his podcast What's Your Problem 853 00:42:41,480 --> 00:42:45,319 Speaker 1: and Business History, and read his book Money, The True 854 00:42:45,320 --> 00:42:50,839 Speaker 1: Story of a Made Up Thing. Here We Go Again 855 00:42:50,880 --> 00:42:54,120 Speaker 1: is a production of iHeart Podcasts and Snafu Media in 856 00:42:54,200 --> 00:42:58,040 Speaker 1: association with New Metric Media. Our executive producers are me, 857 00:42:58,320 --> 00:43:02,840 Speaker 1: Kal Penn, Ed Helms, Mike Falbo, Alissa Martino, Andy Kim, Pat Kelly, 858 00:43:02,960 --> 00:43:06,480 Speaker 1: Chris Kelly, and Dylan Fagan. Meghan Tan is our producer 859 00:43:06,480 --> 00:43:09,359 Speaker 1: and writer. Dave Shumka is our producer and editor. Our 860 00:43:09,400 --> 00:43:14,080 Speaker 1: consulting producer is Romin Borsolino. Tory Smith is our associate producer. 861 00:43:14,400 --> 00:43:18,320 Speaker 1: Theme music by Chris Kelly, logo by Matt Gosson. Legal 862 00:43:18,360 --> 00:43:22,480 Speaker 1: review from Daniel Welsh, Caroline Johnson, and Meghan Halson. Special 863 00:43:22,480 --> 00:43:26,400 Speaker 1: thanks to Glenn Bassner, Isaac Dunham, Adam Horn, Lane Klein, 864 00:43:26,560 --> 00:43:30,640 Speaker 1: and everyone at iHeart Podcasts, but especially Will Pearson, Carrie 865 00:43:30,640 --> 00:43:36,160 Speaker 1: Lieberman and Nikki Etour. Thanks for listening. Everybody, tell your 866 00:43:36,160 --> 00:43:38,800 Speaker 1: friends, write a review. All of this helps. I appreciate 867 00:43:38,800 --> 00:43:41,320 Speaker 1: you listening, and until we go again, I'm Kal Penn