1 00:00:04,600 --> 00:00:08,320 Speaker 1: Sleepwalkers is a production of iHeartRadio and Unusual Productions. 2 00:00:12,960 --> 00:00:18,400 Speaker 1: AI will make phenomenal companies and tycoons faster, and it 3 00:00:18,480 --> 00:00:22,919 Speaker 1: will also displace jobs faster than computers and the Internet. 4 00:00:23,079 --> 00:00:27,440 Speaker 1: It's already happening. That's Kai-Fu Lee speaking, the former 5 00:00:27,480 --> 00:00:30,560 Speaker 1: head of Google China and the so-called oracle of AI. 6 00:00:30,800 --> 00:00:34,240 Speaker 1: I think there are at least two issues involved. One 7 00:00:34,600 --> 00:00:39,159 Speaker 1: is how to do income redistribution, and that is a 8 00:00:39,280 --> 00:00:42,760 Speaker 1: very complex issue. I'm not an expert, but one way 9 00:00:42,840 --> 00:00:47,200 Speaker 1: or another, the ultra rich who did extremely well based 10 00:00:47,200 --> 00:00:50,440 Speaker 1: on AI or other reasons, I think, somehow need to 11 00:00:50,520 --> 00:00:54,840 Speaker 1: help the people who are underprivileged or even victimized 12 00:00:54,880 --> 00:00:58,480 Speaker 1: by technology. The exact mechanism I don't know, but if 13 00:00:58,520 --> 00:01:01,000 Speaker 1: we don't do it, redistribution is going to be a 14 00:01:01,040 --> 00:01:05,400 Speaker 1: serious matter for our social stability. It's not actually an 15 00:01:05,520 --> 00:01:11,360 Speaker 1: underprivileged minority; it will become an underprivileged majority. The benefits 16 00:01:11,360 --> 00:01:14,720 Speaker 1: of the AI revolution will not be evenly distributed, and 17 00:01:14,800 --> 00:01:19,559 Speaker 1: according to Kai-Fu, automation will replace forty percent of jobs worldwide 18 00:01:19,840 --> 00:01:25,080 Speaker 1: in the next fifteen years. The second part is how 19 00:01:25,120 --> 00:01:28,800 Speaker 1: do we help people whose jobs have been displaced find 20 00:01:28,840 --> 00:01:32,440 Speaker 1: a new beginning? We ask the question: what can AI 21 00:01:32,480 --> 00:01:36,959 Speaker 1: and automation not do? That is the central question this episode: 22 00:01:37,440 --> 00:01:40,440 Speaker 1: as AI and automation displace more and more jobs, what 23 00:01:40,520 --> 00:01:43,160 Speaker 1: will be left for us to do, and who will 24 00:01:43,200 --> 00:01:46,280 Speaker 1: be qualified to do it? Today we'll explore the automated 25 00:01:46,280 --> 00:01:49,520 Speaker 1: economy and the changes it will bring. I'm Oz Woloshyn. 26 00:01:49,880 --> 00:02:05,680 Speaker 1: Welcome to Sleepwalkers. So, Karah, when I hear Kai-Fu talking 27 00:02:05,680 --> 00:02:08,800 Speaker 1: about jobs being lost to AI, my mind goes immediately 28 00:02:08,840 --> 00:02:12,840 Speaker 1: to driverless cars and self-driving cars replacing taxis, 29 00:02:13,040 --> 00:02:15,639 Speaker 1: long-distance trucking, that kind of thing. Yeah, but there's 30 00:02:15,680 --> 00:02:20,080 Speaker 1: also, you know, agriculture, like combine harvesters, like robots that 31 00:02:20,120 --> 00:02:24,040 Speaker 1: are picking fruit. Washington State actually announced that next 32 00:02:24,080 --> 00:02:26,840 Speaker 1: season they're going to be rolling out these vacuum harvesters 33 00:02:26,840 --> 00:02:30,520 Speaker 1: that use AI to identify and pick only ripe apples. Wow, 34 00:02:30,560 --> 00:02:32,960 Speaker 1: so not only picking the fruit, but also being smart 35 00:02:32,960 --> 00:02:36,040 Speaker 1: about which fruit it picks.
That's right, the ripe stuff, 36 00:02:36,080 --> 00:02:40,640 Speaker 1: the good stuff. And there's actually this raspberry-picking robot in 37 00:02:40,639 --> 00:02:44,760 Speaker 1: the UK that was funded by some British supermarkets, and 38 00:02:45,440 --> 00:02:48,960 Speaker 1: those robots can pick twenty five thousand berries a day 39 00:02:49,080 --> 00:02:52,600 Speaker 1: versus a human's fifteen thousand in an eight-hour day. 40 00:02:52,880 --> 00:02:55,880 Speaker 1: And also remember this: an eight-hour day for a human being 41 00:02:56,000 --> 00:02:58,240 Speaker 1: is a long day. A robot doesn't 42 00:02:58,240 --> 00:02:59,600 Speaker 1: know what a long day is, nor does it know 43 00:02:59,639 --> 00:03:01,440 Speaker 1: what a short day is. And it can work into 44 00:03:01,440 --> 00:03:03,800 Speaker 1: the night. Right, and when we force ourselves into comparison 45 00:03:03,840 --> 00:03:07,960 Speaker 1: with these robots, that kind of creates very unrealistic expectations 46 00:03:08,000 --> 00:03:11,560 Speaker 1: for what workers can do. Interestingly, it's not just jobs that 47 00:03:11,600 --> 00:03:15,279 Speaker 1: require mechanical skills that Kai-Fu thinks will be lost to automation. 48 00:03:15,960 --> 00:03:19,160 Speaker 1: And AI actually doesn't distinguish between white collar and blue 49 00:03:19,160 --> 00:03:22,280 Speaker 1: collar jobs. So any job that has a routine element, 50 00:03:22,639 --> 00:03:26,160 Speaker 1: whether it's underwriting loans or telemarketing or research, you know, 51 00:03:26,240 --> 00:03:28,480 Speaker 1: that's a lot of work. The first AI podcast 52 00:03:28,560 --> 00:03:31,919 Speaker 1: may not be too far off. It actually reminds 53 00:03:31,919 --> 00:03:34,200 Speaker 1: me of the episode we did about AI and creativity: 54 00:03:34,480 --> 00:03:37,880 Speaker 1: algorithms that can write poetry and music and screenplays are 55 00:03:37,920 --> 00:03:41,440 Speaker 1: already here. This is not some robot apocalypse in the 56 00:03:41,440 --> 00:03:44,720 Speaker 1: distant future. Job displacement is with us. Julian, you got 57 00:03:44,760 --> 00:03:46,760 Speaker 1: in touch with somebody who's seeing this play out in 58 00:03:46,800 --> 00:03:50,080 Speaker 1: real time. Yeah, I did. His name's Wally Kankowski and 59 00:03:50,240 --> 00:03:54,880 Speaker 1: he lives in Florida. All around the city, whatever direction 60 00:03:54,880 --> 00:03:59,000 Speaker 1: we're gonna go, we know where every McDonald's pretty 61 00:03:59,080 --> 00:04:01,440 Speaker 1: much is on the way to a job. Well, a lot 62 00:04:01,480 --> 00:04:04,200 Speaker 1: of the people know us because we go in there 63 00:04:04,240 --> 00:04:07,320 Speaker 1: all the time. A lot of them know me because 64 00:04:07,440 --> 00:04:15,080 Speaker 1: not too many people get a medium coffee with twelve creams. Yeah, 65 00:04:15,240 --> 00:04:17,520 Speaker 1: that's Wally, taking a huge number of creams in 66 00:04:17,600 --> 00:04:20,320 Speaker 1: his coffee. Wally owns a pool screen repair 67 00:04:20,400 --> 00:04:23,920 Speaker 1: business in Orlando, Florida. His job takes him all around town, 68 00:04:23,960 --> 00:04:27,200 Speaker 1: but every morning starts the same way, at a McDonald's. And 69 00:04:27,320 --> 00:04:31,120 Speaker 1: recently Wally has seen a change. They just started 70 00:04:31,160 --> 00:04:34,560 Speaker 1: to show up, probably about a year or so ago.
71 00:04:34,760 --> 00:04:38,479 Speaker 1: That way, when we go to a counter, people are 72 00:04:38,600 --> 00:04:42,360 Speaker 1: getting mad because they want you to go use 73 00:04:42,520 --> 00:04:44,719 Speaker 1: the kiosk. And I'm walking up to the counter 74 00:04:44,880 --> 00:04:48,040 Speaker 1: wanting to get my coffee and get on with our day. 75 00:04:48,320 --> 00:04:50,080 Speaker 1: They're like, oh, you got to use the kiosk, 76 00:04:50,080 --> 00:04:52,200 Speaker 1: and then they want me to hit the screen. The 77 00:04:52,279 --> 00:04:57,000 Speaker 1: screen says, go to this thing, go to beverage. Okay, 78 00:04:57,000 --> 00:05:00,919 Speaker 1: what kind of beverage? Well, okay, go to coffee. But 79 00:05:01,000 --> 00:05:03,280 Speaker 1: what do you want? Iced coffee? This? That? And then, 80 00:05:03,360 --> 00:05:06,760 Speaker 1: instead of me saying twelve creams and she hears me, 81 00:05:07,120 --> 00:05:09,479 Speaker 1: now I get to hit the machine like twelve times, 82 00:05:09,520 --> 00:05:11,360 Speaker 1: tap, tap, tap, tap, 83 00:05:11,520 --> 00:05:14,400 Speaker 1: twelve times to get it, because that's how many times 84 00:05:14,440 --> 00:05:16,520 Speaker 1: I get to hit it to get to twelve. The 85 00:05:16,560 --> 00:05:19,680 Speaker 1: thing is knocking someone out of a job. We've all 86 00:05:19,720 --> 00:05:22,560 Speaker 1: been, like Wally, stuck at a self-checkout or yelling at 87 00:05:22,560 --> 00:05:25,719 Speaker 1: an automated phone menu that refuses to understand what we're saying. 88 00:05:26,520 --> 00:05:29,880 Speaker 1: But those interactions are not just frustrating for us. They're 89 00:05:29,880 --> 00:05:33,800 Speaker 1: real-world examples of jobs being displaced by technology, and 90 00:05:33,920 --> 00:05:36,320 Speaker 1: they don't only affect the people whose jobs are threatened. 91 00:05:36,800 --> 00:05:40,240 Speaker 1: We're in a lot of different McDonald's, and I probably 92 00:05:40,279 --> 00:05:44,680 Speaker 1: recognize every single person in there. Some people I've known 93 00:05:44,800 --> 00:05:48,240 Speaker 1: probably ten, fifteen years, and they know who I am. 94 00:05:48,560 --> 00:05:51,719 Speaker 1: You know, they're friendly enough to make you feel a 95 00:05:51,720 --> 00:05:55,120 Speaker 1: little special there, that way. I guess we might be 96 00:05:55,200 --> 00:05:58,240 Speaker 1: walking through a store and then I'll see those people 97 00:05:58,279 --> 00:05:59,960 Speaker 1: and I'll go over to them and say, yeah, you're from 98 00:06:00,160 --> 00:06:02,279 Speaker 1: McDonald's and that, and then they'll be like, yeah, I 99 00:06:02,360 --> 00:06:05,120 Speaker 1: know who you are. Then you actually get to meet 100 00:06:05,160 --> 00:06:08,559 Speaker 1: and greet someone and make a conversation for a minute 101 00:06:08,640 --> 00:06:12,560 Speaker 1: or two, that way. Why would they take away human contact, when you're 102 00:06:12,680 --> 00:06:15,320 Speaker 1: talking to a person for a second and getting my 103 00:06:15,480 --> 00:06:19,359 Speaker 1: food and paying them in another two seconds? There shouldn't 104 00:06:19,360 --> 00:06:23,840 Speaker 1: have been nothing wrong with that process. So, Julian, how 105 00:06:23,839 --> 00:06:25,640 Speaker 1: did this come about? What made you want to include 106 00:06:25,640 --> 00:06:27,919 Speaker 1: Wally's story in the podcast?
Well, for one thing, I 107 00:06:27,960 --> 00:06:31,080 Speaker 1: love Wally, but these are also familiar stories, right? I mean, 108 00:06:31,080 --> 00:06:33,040 Speaker 1: Wally's been able to see this one play 109 00:06:33,080 --> 00:06:35,880 Speaker 1: out over time, where you can see how just changing 110 00:06:35,920 --> 00:06:38,240 Speaker 1: one part of one task, the way he orders a 111 00:06:38,279 --> 00:06:41,159 Speaker 1: coffee, has actually had this ripple effect that also follows 112 00:06:41,200 --> 00:06:43,480 Speaker 1: him around as he goes about his day. Yeah. I 113 00:06:43,560 --> 00:06:46,039 Speaker 1: was especially struck by Wally's story because it's easy to 114 00:06:46,080 --> 00:06:50,520 Speaker 1: talk about automation and job displacement as these big abstract ideas, 115 00:06:50,920 --> 00:06:53,560 Speaker 1: but here's somebody who's actually felt it. Even though it's 116 00:06:53,600 --> 00:06:56,640 Speaker 1: not his job that's been lost, it's something that affects 117 00:06:56,640 --> 00:06:58,760 Speaker 1: the whole community. You know, I don't mean to be 118 00:06:59,279 --> 00:07:03,479 Speaker 1: super nostalgic, but a lot of great movies and 119 00:07:03,600 --> 00:07:08,000 Speaker 1: great young adult novels have, you know, the teenage girl 120 00:07:08,080 --> 00:07:11,560 Speaker 1: who's angsty and, you know, works at the fryer, and 121 00:07:12,000 --> 00:07:13,440 Speaker 1: you know, now it's just like you're gonna have like 122 00:07:13,480 --> 00:07:17,200 Speaker 1: an angsty data scientist, you know, mulling over the express 123 00:07:17,280 --> 00:07:21,720 Speaker 1: checkout, crouched over the screen. Well, those golden arches, 124 00:07:21,920 --> 00:07:25,360 Speaker 1: they are a very enduring symbol of America. And earlier 125 00:07:25,360 --> 00:07:28,720 Speaker 1: this year, McDonald's acquired an AI company for three hundred 126 00:07:28,840 --> 00:07:32,120 Speaker 1: million dollars. It was their biggest acquisition in twenty years, 127 00:07:32,240 --> 00:07:35,440 Speaker 1: and it's all about predicting what people might order before 128 00:07:35,480 --> 00:07:37,720 Speaker 1: they even arrive at the store. So even the days 129 00:07:37,720 --> 00:07:40,680 Speaker 1: of kiosks may be numbered; maybe we'll be nostalgic about them 130 00:07:40,680 --> 00:07:44,040 Speaker 1: in twenty years. But nonetheless, this AI acquisition could ultimately 131 00:07:44,080 --> 00:07:47,640 Speaker 1: lead to a better customer experience. And it's important to 132 00:07:47,640 --> 00:07:50,560 Speaker 1: remember that the AI revolution doesn't need to be just 133 00:07:50,680 --> 00:07:54,200 Speaker 1: about displacing jobs. It can also be about augmenting us 134 00:07:54,200 --> 00:07:58,240 Speaker 1: and our experience. One person working on human-machine partnership 135 00:07:58,600 --> 00:08:02,200 Speaker 1: is Gill Pratt, CEO of the Toyota Research Institute. 136 00:08:03,000 --> 00:08:07,400 Speaker 1: Many of our colleagues at other companies are really focused 137 00:08:07,400 --> 00:08:10,920 Speaker 1: on building only the self-driving car, where you replace 138 00:08:11,240 --> 00:08:13,920 Speaker 1: the driver with an AI system.
But we also have 139 00:08:14,000 --> 00:08:17,120 Speaker 1: this other track of building something that we call the Guardian, 140 00:08:17,560 --> 00:08:20,440 Speaker 1: which is meant to safeguard a human being when they drive, 141 00:08:20,480 --> 00:08:24,440 Speaker 1: to avoid accidents and to avoid crashes. I think the 142 00:08:24,520 --> 00:08:28,560 Speaker 1: Guardian approach has been at odds with the market because of money. The 143 00:08:28,640 --> 00:08:33,080 Speaker 1: economic desire to replace the driver in a taxi is 144 00:08:33,200 --> 00:08:35,560 Speaker 1: very large, and a lot of companies are sort of 145 00:08:35,600 --> 00:08:39,439 Speaker 1: going after this attractive idea of automating out the human 146 00:08:39,480 --> 00:08:42,920 Speaker 1: being from driving taxis. But you know, Toyota is first 147 00:08:42,960 --> 00:08:46,680 Speaker 1: and foremost a car company, which means that we have 148 00:08:46,840 --> 00:08:49,080 Speaker 1: this business of making cars. We also want to make 149 00:08:49,120 --> 00:08:51,800 Speaker 1: cars a lot more safe, and we also want to 150 00:08:51,800 --> 00:08:55,240 Speaker 1: make them a lot more fun. Gill makes an important 151 00:08:55,280 --> 00:08:59,600 Speaker 1: point: today, our innovation is driven by the market. Companies 152 00:08:59,600 --> 00:09:02,199 Speaker 1: like Uber are incentivized to keep their valuations high by 153 00:09:02,200 --> 00:09:04,160 Speaker 1: promising their investors that they will be able to do 154 00:09:04,280 --> 00:09:08,880 Speaker 1: better business in future by replacing human drivers. Toyota is 155 00:09:08,880 --> 00:09:11,800 Speaker 1: actually an investor in Uber, but its primary business is 156 00:09:11,880 --> 00:09:15,600 Speaker 1: car manufacturing, so their focus is on enhancing the abilities 157 00:09:15,640 --> 00:09:19,240 Speaker 1: of human drivers rather than replacing them, making driving more 158 00:09:19,280 --> 00:09:23,400 Speaker 1: fun. And Gill's humanistic approach to technology is also being 159 00:09:23,400 --> 00:09:26,640 Speaker 1: applied to other problems at the Toyota Research Institute. We 160 00:09:26,720 --> 00:09:31,760 Speaker 1: want to allow people to age in place with dignity, 161 00:09:31,840 --> 00:09:36,040 Speaker 1: and in particular, we want to help them by amplifying 162 00:09:36,120 --> 00:09:39,280 Speaker 1: their abilities to make up for what was lost, rather 163 00:09:39,320 --> 00:09:43,160 Speaker 1: than replacing their abilities and making them feel as if 164 00:09:43,320 --> 00:09:47,880 Speaker 1: they're elderly. It's a subtle difference, and it's very easy 165 00:09:47,920 --> 00:09:51,600 Speaker 1: to get it wrong. It's very easy to build a 166 00:09:51,679 --> 00:09:56,480 Speaker 1: technology that is ostensibly going to help someone, but 167 00:09:56,520 --> 00:09:59,040 Speaker 1: what it's really doing is offloading work from them 168 00:09:59,120 --> 00:10:01,680 Speaker 1: and making them feel they can't do it, and therefore 169 00:10:01,720 --> 00:10:03,480 Speaker 1: they're old and they should just sit in a chair. 170 00:10:04,240 --> 00:10:08,280 Speaker 1: It's much harder to figure out a way, particularly in 171 00:10:08,320 --> 00:10:12,720 Speaker 1: the robotics field, to continue to engage the person so 172 00:10:12,760 --> 00:10:15,839 Speaker 1: that they feel like they can do it themselves.
And 173 00:10:16,080 --> 00:10:17,800 Speaker 1: that's a little bit of a difference in how we 174 00:10:17,920 --> 00:10:21,240 Speaker 1: try to do things. There's one that we've recently started 175 00:10:21,280 --> 00:10:25,120 Speaker 1: to show, which is a machine called the Buddy, and 176 00:10:25,360 --> 00:10:28,320 Speaker 1: this idea is one where older people have a lot 177 00:10:28,320 --> 00:10:31,800 Speaker 1: of difficulty reaching down low to pick up things from 178 00:10:31,840 --> 00:10:36,160 Speaker 1: the ground and difficulty moving heavy things, and so we're 179 00:10:36,160 --> 00:10:39,040 Speaker 1: working on a machine that still has the human in 180 00:10:39,080 --> 00:10:42,280 Speaker 1: the loop, but makes it much easier for them to 181 00:10:42,600 --> 00:10:46,600 Speaker 1: do that task. But Gill understands that no matter how 182 00:10:46,679 --> 00:10:49,319 Speaker 1: much robotics may be able to help solve the practical 183 00:10:49,400 --> 00:10:52,400 Speaker 1: challenges of life as an older person, it can never 184 00:10:52,440 --> 00:10:56,199 Speaker 1: replace a human care provider. Just to be very, very clear, 185 00:10:56,679 --> 00:10:59,800 Speaker 1: we don't want to replace people as companions. We think 186 00:11:00,000 --> 00:11:03,400 Speaker 1: that what human beings want most of all in a 187 00:11:03,520 --> 00:11:10,720 Speaker 1: companion is another human being. This brings us 188 00:11:10,760 --> 00:11:12,559 Speaker 1: back to what Kai-Fu was saying right at the top 189 00:11:12,600 --> 00:11:16,079 Speaker 1: of the episode: what can AI and automation not do? 190 00:11:16,559 --> 00:11:19,600 Speaker 1: So yeah, Gill acknowledges that no matter how much progress 191 00:11:19,640 --> 00:11:22,000 Speaker 1: is made in the field of robotics to help elderly people, 192 00:11:22,240 --> 00:11:25,640 Speaker 1: nothing's going to make up for human contact. I actually 193 00:11:25,760 --> 00:11:28,000 Speaker 1: was able to talk to Sherry Turkle, who's a professor 194 00:11:28,040 --> 00:11:30,720 Speaker 1: at MIT who talks a lot about human 195 00:11:30,840 --> 00:11:34,600 Speaker 1: beings and their relationship with technology, and she talks about 196 00:11:34,600 --> 00:11:37,800 Speaker 1: this fluffy seal robot called Paro, which is used in 197 00:11:37,880 --> 00:11:42,240 Speaker 1: nursing homes to soothe Alzheimer's patients. And it can simulate 198 00:11:42,280 --> 00:11:44,839 Speaker 1: this, like, affectionate little animal, and it can be really 199 00:11:44,840 --> 00:11:47,000 Speaker 1: effective at drawing people out of their shells when they're 200 00:11:47,000 --> 00:11:50,200 Speaker 1: otherwise hard to reach or feeling disoriented. On the other hand, 201 00:11:50,200 --> 00:11:53,200 Speaker 1: and this is Sherry's argument, it becomes really easy for 202 00:11:53,240 --> 00:11:55,800 Speaker 1: family members to be like, well, you know, my grandpa 203 00:11:55,840 --> 00:11:57,679 Speaker 1: has this, you know, seal at home, I don't need 204 00:11:57,720 --> 00:11:59,600 Speaker 1: to go visit him all the time. And I know 205 00:11:59,640 --> 00:12:01,840 Speaker 1: that sounds extreme, but it's more the idea 206 00:12:01,840 --> 00:12:05,400 Speaker 1: that we're using these robots to make us 207 00:12:05,440 --> 00:12:08,760 Speaker 1: feel better about calming people who we could otherwise have 208 00:12:09,000 --> 00:12:12,479 Speaker 1: strong relationships with.
Yeah, and I think it also normalizes 209 00:12:12,559 --> 00:12:16,440 Speaker 1: the idea of interacting with robots or technology instead of 210 00:12:16,480 --> 00:12:19,000 Speaker 1: real people. And that's painful. That's what Wally was 211 00:12:19,080 --> 00:12:21,800 Speaker 1: really talking about. Yes, it's frustrating to have to use 212 00:12:21,840 --> 00:12:24,400 Speaker 1: the kiosk when you want twelve creams with your coffee, 213 00:12:24,600 --> 00:12:28,480 Speaker 1: but more importantly, it erodes community bonds. It's no wonder 214 00:12:28,520 --> 00:12:31,760 Speaker 1: that a company like McDonald's is spending a ton of 215 00:12:31,800 --> 00:12:35,520 Speaker 1: money on this. It makes them more efficient and profitable 216 00:12:35,559 --> 00:12:37,520 Speaker 1: if they don't have to pay people. Yeah, and it's 217 00:12:37,559 --> 00:12:39,600 Speaker 1: hard to turn back the clocks. You know, Donald Trump 218 00:12:39,640 --> 00:12:42,160 Speaker 1: talks about bringing back the coal jobs, but jobs that 219 00:12:42,200 --> 00:12:45,160 Speaker 1: have been lost are very hard to recreate. It does 220 00:12:45,160 --> 00:12:47,319 Speaker 1: make me think about Kai-Fu's comment at the top of 221 00:12:47,360 --> 00:12:51,680 Speaker 1: the episode about the underprivileged majority. Yuval Noah Harari, who's 222 00:12:51,720 --> 00:12:54,000 Speaker 1: coming to join us later in the series, talks about 223 00:12:54,040 --> 00:12:57,640 Speaker 1: a useless class. When we come back, we look at 224 00:12:57,640 --> 00:12:59,520 Speaker 1: what this means for the people at the sharp end, 225 00:12:59,800 --> 00:13:02,640 Speaker 1: the people losing their jobs to automation, and at some 226 00:13:02,720 --> 00:13:12,800 Speaker 1: of the proposed solutions. According to an Oxfam International report 227 00:13:12,840 --> 00:13:16,360 Speaker 1: published earlier this year, the twenty-six richest billionaires in 228 00:13:16,400 --> 00:13:19,200 Speaker 1: the world have as much wealth as the poorest three 229 00:13:19,200 --> 00:13:22,800 Speaker 1: point eight billion people, and many of those billionaires made 230 00:13:22,840 --> 00:13:26,880 Speaker 1: their fortunes from technology. Jeff Bezos is the world's richest 231 00:13:26,920 --> 00:13:30,640 Speaker 1: person thanks to Amazon. Meanwhile, Amazon is investing hundreds of 232 00:13:30,640 --> 00:13:34,040 Speaker 1: millions of dollars in automating their supply chain, in other words, 233 00:13:34,240 --> 00:13:36,320 Speaker 1: attempting to cut out the labor force who made the 234 00:13:36,320 --> 00:13:39,520 Speaker 1: business possible. It's a bit like Uber's investment in self 235 00:13:39,600 --> 00:13:42,680 Speaker 1: driving technology. So what jobs might be safe from the 236 00:13:42,720 --> 00:13:46,800 Speaker 1: relentless march towards automation? I asked Kai-Fu Lee. My 237 00:13:47,000 --> 00:13:50,679 Speaker 1: general feeling is that these will be the human interaction jobs, 238 00:13:50,880 --> 00:13:54,520 Speaker 1: the compassion and empathetic jobs, the jobs where we expect 239 00:13:54,640 --> 00:13:57,520 Speaker 1: a human and refuse to work with a robot.
That 240 00:13:57,559 --> 00:14:01,160 Speaker 1: would doubly ensure these jobs are safe: one, AI 241 00:14:01,240 --> 00:14:04,280 Speaker 1: can't do them now, and two, even if AI got better, 242 00:14:04,520 --> 00:14:08,120 Speaker 1: customers won't accept it. Then those jobs will become the 243 00:14:08,240 --> 00:14:11,840 Speaker 1: right areas to retrain people to move into, so jobs 244 00:14:11,880 --> 00:14:18,160 Speaker 1: like nurses, nannies, elderly care, high-end jobs like psychiatrists 245 00:14:18,200 --> 00:14:21,600 Speaker 1: and doctors, because in the future it will be different. AI 246 00:14:21,680 --> 00:14:24,600 Speaker 1: can do the analytical part, but the doctor will still 247 00:14:24,640 --> 00:14:27,560 Speaker 1: need to provide the warmth and the human contact that 248 00:14:27,800 --> 00:14:33,120 Speaker 1: the patient expects during the worst period of vulnerability. While 249 00:14:33,280 --> 00:14:36,960 Speaker 1: we may move more towards ordering from kiosks and help menus, 250 00:14:37,080 --> 00:14:40,240 Speaker 1: or not even needing to order at all, Kai-Fu agrees 251 00:14:40,320 --> 00:14:42,880 Speaker 1: with Gill: we'll still need the human touch in a 252 00:14:42,960 --> 00:14:45,840 Speaker 1: range of industries, many of them centered around care and 253 00:14:45,960 --> 00:14:49,720 Speaker 1: human services. And it's striking to hear these two pioneers 254 00:14:49,760 --> 00:14:54,280 Speaker 1: of new technology, Kai-Fu in AI and Gill in robotics, agree, 255 00:14:54,680 --> 00:14:57,600 Speaker 1: both arguing that automation might increase the value of what 256 00:14:57,760 --> 00:15:01,080 Speaker 1: is uniquely human. Gill turns to history to back up 257 00:15:01,120 --> 00:15:03,800 Speaker 1: his argument. He looks at how our understanding of our 258 00:15:03,840 --> 00:15:08,680 Speaker 1: own value as humans shifted during the Industrial Revolution, away 259 00:15:08,680 --> 00:15:11,560 Speaker 1: from the ability of our bodies towards the ability of 260 00:15:11,560 --> 00:15:14,600 Speaker 1: our minds. You know, if you go back in history 261 00:15:14,600 --> 00:15:17,200 Speaker 1: and you say, how did people earn a living back 262 00:15:17,200 --> 00:15:20,120 Speaker 1: in the days of mechanical work? There weren't, you know, 263 00:15:20,280 --> 00:15:23,600 Speaker 1: steam engines, no use of gasoline or oil or anything 264 00:15:23,640 --> 00:15:27,200 Speaker 1: like that. And the answer was that the economic capital 265 00:15:27,520 --> 00:15:29,680 Speaker 1: that a human being would have just by being born 266 00:15:30,120 --> 00:15:34,040 Speaker 1: was primarily mechanical. So our muscles made us worthwhile at 267 00:15:34,080 --> 00:15:37,920 Speaker 1: a minimum level, and machines effectively took over most of 268 00:15:37,960 --> 00:15:40,760 Speaker 1: the mechanical work that we do, and so we now 269 00:15:40,880 --> 00:15:43,280 Speaker 1: are valued mostly for what we can do with our minds. 270 00:15:44,200 --> 00:15:48,360 Speaker 1: Assuming that this next stage of AI occurs, where most 271 00:15:48,440 --> 00:15:52,560 Speaker 1: of the mental labor that is done is displaced, what 272 00:15:52,680 --> 00:15:55,320 Speaker 1: I think we need to think about now is, what 273 00:15:55,360 --> 00:15:57,720 Speaker 1: will we do then?
And we need to think about 274 00:15:57,760 --> 00:16:00,640 Speaker 1: it even if this next stage of AI doesn't come 275 00:16:00,640 --> 00:16:03,880 Speaker 1: for a while. Because we went from mechanical to mental, 276 00:16:04,520 --> 00:16:09,200 Speaker 1: is there something next? Is there something next? That is 277 00:16:09,240 --> 00:16:13,080 Speaker 1: the trillion-dollar question. According to Gill, the Industrial Revolution led 278 00:16:13,160 --> 00:16:15,440 Speaker 1: us to place more value on the mind than the muscle. 279 00:16:16,160 --> 00:16:19,160 Speaker 1: Now that AI can increasingly perform mental labor, where will 280 00:16:19,280 --> 00:16:21,880 Speaker 1: we find a new source of value? And could it 281 00:16:21,920 --> 00:16:25,680 Speaker 1: be, like Kai-Fu hinted at as well, some emotional connection? 282 00:16:26,280 --> 00:16:29,800 Speaker 1: When I read a story to my son, it matters 283 00:16:29,840 --> 00:16:32,200 Speaker 1: a whole lot to him. When I read a story 284 00:16:32,240 --> 00:16:35,080 Speaker 1: to my mother, it's very much the same thing. So 285 00:16:35,640 --> 00:16:38,960 Speaker 1: could we actually decide to increase the value that we 286 00:16:39,080 --> 00:16:42,440 Speaker 1: pay for social work? There's many, many different jobs that 287 00:16:42,640 --> 00:16:45,120 Speaker 1: really should be paid much, much higher than they are now, 288 00:16:45,320 --> 00:16:49,160 Speaker 1: jobs of teaching and helping and so forth. And so I'm 289 00:16:49,200 --> 00:16:51,880 Speaker 1: an optimist that we can find an answer, but I 290 00:16:51,920 --> 00:16:54,680 Speaker 1: think we need to realize the difficulty in order to 291 00:16:54,720 --> 00:16:58,800 Speaker 1: move towards that answer. The difficulty is huge because, as 292 00:16:58,840 --> 00:17:02,400 Speaker 1: of now, except at the luxury end, the market does not reward 293 00:17:02,440 --> 00:17:05,520 Speaker 1: the kind of human contact that Kai-Fu and Gill allude to. 294 00:17:06,119 --> 00:17:08,480 Speaker 1: And while we, like Wally, may wish for our food 295 00:17:08,560 --> 00:17:11,359 Speaker 1: orders not to be automated, how much more would we 296 00:17:11,400 --> 00:17:14,200 Speaker 1: pay for human contact? How much more could we afford 297 00:17:14,280 --> 00:17:17,159 Speaker 1: to pay? Part of the problem is that automation is 298 00:17:17,200 --> 00:17:21,440 Speaker 1: exacerbating the gap between rich and poor. Technology companies can 299 00:17:21,440 --> 00:17:24,879 Speaker 1: increasingly create wealth without needing to pay the wages of 300 00:17:24,920 --> 00:17:28,520 Speaker 1: additional employees. That's the secret behind that word you hear 301 00:17:28,560 --> 00:17:32,480 Speaker 1: so often: scale. Which is why Kai-Fu Lee proposes 302 00:17:32,520 --> 00:17:36,600 Speaker 1: a radical solution. If we start to redistribute the income, 303 00:17:36,800 --> 00:17:39,200 Speaker 1: that is, taking away the power of the ultra rich, 304 00:17:39,720 --> 00:17:43,040 Speaker 1: if we start to give the people who are stripped 305 00:17:43,080 --> 00:17:45,800 Speaker 1: of their current jobs a new job that has not 306 00:17:45,960 --> 00:17:50,120 Speaker 1: only income but also meaning, I think people would 307 00:17:50,160 --> 00:17:52,600 Speaker 1: be more fulfilled; their children at least would have a 308 00:17:52,680 --> 00:17:55,920 Speaker 1: chance. Just to pause:
Kai-Fu Lee is a hugely 309 00:17:55,960 --> 00:18:00,040 Speaker 1: successful international investor arguing that we need to overturn 310 00:18:00,240 --> 00:18:04,000 Speaker 1: one of the most fundamental assumptions of American society: that 311 00:18:04,080 --> 00:18:06,719 Speaker 1: the market should be allowed to set the price. And 312 00:18:06,880 --> 00:18:10,080 Speaker 1: Kai-Fu is not alone. Others in Silicon Valley are calling 313 00:18:10,119 --> 00:18:13,480 Speaker 1: for a so-called universal basic income, a stipend 314 00:18:13,560 --> 00:18:17,200 Speaker 1: paid to all citizens to acknowledge an increasingly broken relationship 315 00:18:17,240 --> 00:18:21,440 Speaker 1: between labor and value. Today, we're nowhere close on either 316 00:18:21,480 --> 00:18:24,760 Speaker 1: of those ideas, but a growing chorus of insider voices 317 00:18:24,880 --> 00:18:29,280 Speaker 1: is acknowledging that automation will bring further disruption to society, 318 00:18:29,440 --> 00:18:32,520 Speaker 1: and others have even greater fears. You may remember Ian 319 00:18:32,560 --> 00:18:35,800 Speaker 1: Bremmer from our episode on China and surveillance. He's a 320 00:18:35,800 --> 00:18:39,040 Speaker 1: political scientist and the author of Us vs. Them: The 321 00:18:39,160 --> 00:18:45,560 Speaker 1: Failure of Globalism. I am less worried about just jobs 322 00:18:45,600 --> 00:18:50,920 Speaker 1: going away than I am about technology facilitating the creation 323 00:18:51,440 --> 00:18:55,760 Speaker 1: of completely different types of human beings. What happens when 324 00:18:56,200 --> 00:19:00,680 Speaker 1: you have the ability to actually provide completely different 325 00:19:00,720 --> 00:19:04,639 Speaker 1: sets of cognitive skills to human beings that have access 326 00:19:04,720 --> 00:19:09,000 Speaker 1: to certain types of new technology? Ian's fear is that 327 00:19:09,040 --> 00:19:13,280 Speaker 1: as technology improves, the rich won't simply reproduce their privilege through 328 00:19:13,400 --> 00:19:17,640 Speaker 1: elite universities and professional networks; they may start to upgrade 329 00:19:17,680 --> 00:19:21,560 Speaker 1: their very hardware, making social mobility even harder for those 330 00:19:21,600 --> 00:19:26,639 Speaker 1: who can't afford the same modifications. Better memory retention, better 331 00:19:26,720 --> 00:19:31,600 Speaker 1: pattern recognition, more ability to link to real-time information 332 00:19:31,680 --> 00:19:34,679 Speaker 1: and the global net. I mean, ability not to sleep 333 00:19:34,760 --> 00:19:37,680 Speaker 1: for longer periods of time, all of this sort of thing, right? 334 00:19:38,600 --> 00:19:41,720 Speaker 1: The danger is that, I don't care how much money, 335 00:19:41,720 --> 00:19:44,120 Speaker 1: how much wealth is in society, when you start creating 336 00:19:44,240 --> 00:19:48,000 Speaker 1: that kind of differentiation, everything we know about human history 337 00:19:48,680 --> 00:19:51,959 Speaker 1: is that that doesn't end well. Those other people that 338 00:19:52,000 --> 00:19:56,600 Speaker 1: aren't as capable get treated like animals or worse.
And 339 00:19:56,840 --> 00:19:59,959 Speaker 1: I am very deeply worried that the speed of 340 00:20:00,040 --> 00:20:04,480 Speaker 1: technological transformation, coupled with the speed of this new industrial revolution, 341 00:20:04,840 --> 00:20:08,200 Speaker 1: makes it much more likely that large numbers of people 342 00:20:08,280 --> 00:20:11,639 Speaker 1: in our own societies, not in other countries, but like 343 00:20:11,920 --> 00:20:15,800 Speaker 1: right here, are suddenly not going to have that capacity, 344 00:20:15,880 --> 00:20:19,120 Speaker 1: and we're going to treat them as different types of humans, 345 00:20:19,359 --> 00:20:22,760 Speaker 1: maybe not even as humans at all. This is the 346 00:20:22,760 --> 00:20:25,920 Speaker 1: truly dystopian future that we all fear, Karah: this concept 347 00:20:25,920 --> 00:20:29,399 Speaker 1: of a two-track humanity facilitated by technology, where some 348 00:20:29,440 --> 00:20:32,520 Speaker 1: people have value and others don't. Yeah, you know, this 349 00:20:32,600 --> 00:20:35,480 Speaker 1: is the dark version of transhumanism, which we're going 350 00:20:35,520 --> 00:20:37,879 Speaker 1: to talk about later in the series. But you know, 351 00:20:37,920 --> 00:20:41,919 Speaker 1: it's not some sci-fi fantasy. Our favorite pre-super 352 00:20:42,040 --> 00:20:45,960 Speaker 1: villain, Elon Musk, founded Neuralink, which aims to create brain 353 00:20:46,080 --> 00:20:50,080 Speaker 1: computer interfaces. Like, why do we need that? Well, I 354 00:20:50,080 --> 00:20:53,280 Speaker 1: guess because in today's economy, being smart is seen as 355 00:20:53,320 --> 00:20:56,160 Speaker 1: the most important differentiating factor. But we're not talking about 356 00:20:56,160 --> 00:20:59,160 Speaker 1: being an intellectual; we're talking about being cognitively enhanced 357 00:20:59,160 --> 00:21:03,280 Speaker 1: by a computer or by technology. And Elon Musk isn't 358 00:21:03,280 --> 00:21:05,480 Speaker 1: the only person who's noticed how important it is to 359 00:21:05,520 --> 00:21:08,800 Speaker 1: be cognitively enhanced, shall we say. Last year, the World 360 00:21:08,840 --> 00:21:12,199 Speaker 1: Bank announced a program called the Famine Action Mechanism to 361 00:21:12,240 --> 00:21:15,639 Speaker 1: get relief to famine-hit areas faster, and they explicitly 362 00:21:15,720 --> 00:21:17,879 Speaker 1: said one of the reasons they're doing this is 363 00:21:17,920 --> 00:21:20,600 Speaker 1: that people who are malnourished in the womb may have 364 00:21:20,640 --> 00:21:23,480 Speaker 1: cognitive issues later in life and thus be unable to 365 00:21:23,480 --> 00:21:25,399 Speaker 1: compete in the new economy. You know, I found it 366 00:21:25,440 --> 00:21:29,600 Speaker 1: really interesting that this program is actually powered by AI. 367 00:21:29,680 --> 00:21:33,440 Speaker 1: It draws on data like social media, food prices, rainfall, 368 00:21:33,800 --> 00:21:37,480 Speaker 1: and then automatically assigns funds so that money gets where 369 00:21:37,480 --> 00:21:40,040 Speaker 1: it's needed before it's too late.
It's a textbook case 370 00:21:40,040 --> 00:21:41,800 Speaker 1: of what AI can do and we can't, which is 371 00:21:41,840 --> 00:21:44,680 Speaker 1: to notice these patterns and correlations between different types of 372 00:21:44,760 --> 00:21:47,280 Speaker 1: data sets which are so big as to be impossible 373 00:21:47,280 --> 00:21:50,640 Speaker 1: for us to compute. And as so often in Sleepwalkers, 374 00:21:50,760 --> 00:21:53,440 Speaker 1: it's an example of technology being a double-edged sword. 375 00:21:54,280 --> 00:21:55,880 Speaker 1: On the one hand, it may be widening the gap 376 00:21:55,960 --> 00:21:58,080 Speaker 1: between rich and poor, but on the other hand, it 377 00:21:58,119 --> 00:22:01,199 Speaker 1: can potentially feed the world. When we come back, we 378 00:22:01,240 --> 00:22:04,720 Speaker 1: explore other ways AI and robotics can revolutionize food production. 379 00:22:12,640 --> 00:22:15,359 Speaker 1: We've looked at how AI and robotics could exacerbate the 380 00:22:15,400 --> 00:22:18,800 Speaker 1: gulf between rich and poor, and how this new industrial 381 00:22:18,840 --> 00:22:22,439 Speaker 1: revolution could put a new value on human connection. But 382 00:22:22,720 --> 00:22:27,120 Speaker 1: could we use automation to actually decrease global inequality? One 383 00:22:27,240 --> 00:22:31,280 Speaker 1: key factor is access to quality nutrition, and roboticist George 384 00:22:31,320 --> 00:22:33,960 Speaker 1: Kantor gave a talk last year at South by Southwest 385 00:22:34,040 --> 00:22:38,000 Speaker 1: called "AI Will Help Feed a Growing Planet." I wanted 386 00:22:38,040 --> 00:22:40,439 Speaker 1: to learn more, so I called him for a conversation 387 00:22:40,560 --> 00:22:43,440 Speaker 1: from his office at the Robotics Institute of Carnegie Mellon. 388 00:22:44,840 --> 00:22:47,879 Speaker 1: A lot of people, when they think about robots and 389 00:22:47,920 --> 00:22:51,960 Speaker 1: technology being used to assist agriculture, think about robots driving 390 00:22:52,000 --> 00:22:54,920 Speaker 1: around and picking grapes or plowing fields and things like that. 391 00:22:55,359 --> 00:22:58,639 Speaker 1: But despite being a robotics expert, George is currently focusing 392 00:22:58,640 --> 00:23:01,879 Speaker 1: on crop genetics. The way plant breeding works, you have 393 00:23:01,960 --> 00:23:05,960 Speaker 1: a bunch of parents. A plant breeder very carefully uses 394 00:23:06,000 --> 00:23:08,760 Speaker 1: all his or her experience to figure out which parents 395 00:23:08,760 --> 00:23:12,600 Speaker 1: will make the best potential children. They make those crosses. 396 00:23:13,000 --> 00:23:15,840 Speaker 1: They then do these field trials where they grow the 397 00:23:15,920 --> 00:23:19,360 Speaker 1: child varieties and they measure them and see how they do, 398 00:23:19,600 --> 00:23:21,600 Speaker 1: and then the winners go back in the pool and 399 00:23:21,640 --> 00:23:24,320 Speaker 1: the losers they weed out. One of the crops we 400 00:23:24,359 --> 00:23:27,080 Speaker 1: work with is sorghum. It's grown all over the world. 401 00:23:27,119 --> 00:23:30,320 Speaker 1: There are like forty different varieties of it.
In particular, the 402 00:23:30,400 --> 00:23:33,919 Speaker 1: grain sorghum variety is a staple crop in places like 403 00:23:33,960 --> 00:23:37,160 Speaker 1: Sub-Saharan Africa and India, parts of the world where the 404 00:23:37,280 --> 00:23:39,520 Speaker 1: population is growing more rapidly than the rest of the 405 00:23:39,640 --> 00:23:42,879 Speaker 1: planet's population, and the predictions for the impact of global 406 00:23:42,880 --> 00:23:45,960 Speaker 1: warming are pretty high. George uses technology to make 407 00:23:45,960 --> 00:23:48,840 Speaker 1: the work of human plant breeders dramatically more efficient, 408 00:23:49,080 --> 00:23:52,760 Speaker 1: but this work is completely invisible to consumers. So we 409 00:23:52,840 --> 00:23:54,720 Speaker 1: have built a robot that goes out to a breeding 410 00:23:54,720 --> 00:23:57,879 Speaker 1: experiment where a breeder has grown a thousand different varieties 411 00:23:57,920 --> 00:24:00,160 Speaker 1: of sorghum. Our robot goes through and takes all these 412 00:24:00,160 --> 00:24:03,000 Speaker 1: detailed measurements about how the plants are growing throughout the year, 413 00:24:03,480 --> 00:24:05,640 Speaker 1: and then the breeder can use those measurements to make 414 00:24:05,640 --> 00:24:09,000 Speaker 1: better decisions. The end user of this process I'm describing 415 00:24:09,320 --> 00:24:12,080 Speaker 1: won't see any technology at all. They will get a 416 00:24:12,160 --> 00:24:14,560 Speaker 1: seed that looks just like the seed they get now, 417 00:24:14,680 --> 00:24:16,639 Speaker 1: except it will be a little bit better because the 418 00:24:16,680 --> 00:24:21,000 Speaker 1: breeder improved it using our robots. These invisible changes to 419 00:24:21,040 --> 00:24:25,240 Speaker 1: the food production system can have huge consequences. Better seeds 420 00:24:25,320 --> 00:24:27,960 Speaker 1: mean better yields and could ultimately lead to a better 421 00:24:28,000 --> 00:24:31,000 Speaker 1: nourished world. But George isn't only thinking about how to 422 00:24:31,040 --> 00:24:34,840 Speaker 1: make hardier, better plants. He's also thinking about another problem: 423 00:24:34,880 --> 00:24:38,400 Speaker 1: how will we efficiently feed a global population that increasingly lives 424 00:24:38,440 --> 00:24:42,160 Speaker 1: in cities and not on the farm? Imagine every building 425 00:24:42,200 --> 00:24:45,000 Speaker 1: in a city had a little greenhouse hanging off the 426 00:24:45,040 --> 00:24:48,000 Speaker 1: side of it, or a little growing room in the basement. 427 00:24:48,160 --> 00:24:50,680 Speaker 1: Now you've got these indoor growing systems that tend 428 00:24:50,720 --> 00:24:53,520 Speaker 1: to, like, generate more heat than they need, so one 429 00:24:53,560 --> 00:24:56,160 Speaker 1: of their big problems is venting off the heat. Well, 430 00:24:56,280 --> 00:24:58,320 Speaker 1: buildings have to pay a lot of money to heat 431 00:24:58,359 --> 00:25:00,000 Speaker 1: the buildings. So if you had this sort of 432 00:25:00,000 --> 00:25:02,720 Speaker 1: symbiotic relationship between the people in the building and 433 00:25:02,720 --> 00:25:04,720 Speaker 1: the plants in the building, they can exchange heat, and 434 00:25:04,720 --> 00:25:07,520 Speaker 1: they can exchange atmosphere and all kinds of things.
If 435 00:25:07,560 --> 00:25:09,320 Speaker 1: you take that idea and you scale it up to 436 00:25:09,400 --> 00:25:11,560 Speaker 1: like a city scale, where now you have dozens or 437 00:25:11,640 --> 00:25:14,840 Speaker 1: hundreds of buildings that all have these different energy needs 438 00:25:14,840 --> 00:25:18,240 Speaker 1: and different agricultural needs, and they're all sort of sharing, 439 00:25:18,320 --> 00:25:22,080 Speaker 1: and you have some sort of overarching AI that controls what 440 00:25:22,280 --> 00:25:25,040 Speaker 1: energy gets moved where, you can imagine that there 441 00:25:25,040 --> 00:25:29,240 Speaker 1: are big efficiencies that can be gained. George is outlining a 442 00:25:29,359 --> 00:25:32,080 Speaker 1: vision where robotics and AI help us tackle one of 443 00:25:32,080 --> 00:25:36,560 Speaker 1: the world's most enduring sources of inequality: food access. And 444 00:25:36,600 --> 00:25:40,080 Speaker 1: doing so could also make agriculture more energy efficient and 445 00:25:40,119 --> 00:25:43,440 Speaker 1: thus begin to address another huge problem that will disproportionately 446 00:25:43,480 --> 00:25:48,800 Speaker 1: affect the world's poorest people: climate change. So yes, automation 447 00:25:48,840 --> 00:25:51,880 Speaker 1: will take jobs away, but it can also potentially raise 448 00:25:51,960 --> 00:25:54,680 Speaker 1: quality of life and the quality of the global environment. 449 00:25:55,520 --> 00:25:57,960 Speaker 1: And as far as George is concerned, the type of 450 00:25:58,040 --> 00:26:01,320 Speaker 1: labor being replaced is not exactly the work that maximizes 451 00:26:01,440 --> 00:26:05,080 Speaker 1: human potential. We call them dull, dirty, dangerous: jobs 452 00:26:05,119 --> 00:26:08,320 Speaker 1: that people don't want, or that are dangerous to do, or 453 00:26:08,359 --> 00:26:11,199 Speaker 1: that people are getting injured in. When I go visit the 454 00:26:11,359 --> 00:26:15,000 Speaker 1: grape industry in California and I see the laborers, they're 455 00:26:15,040 --> 00:26:18,080 Speaker 1: out there, they're stooped over under trees, they're doing this 456 00:26:18,280 --> 00:26:23,919 Speaker 1: extremely backbreaking labor. There are high incidences of repetitive stress injuries, 457 00:26:24,440 --> 00:26:27,080 Speaker 1: and so it's just not a very pleasant environment to 458 00:26:27,119 --> 00:26:31,360 Speaker 1: be working in. When automation comes into an industry, it 459 00:26:31,440 --> 00:26:35,240 Speaker 1: takes away some jobs that were there, but it creates 460 00:26:35,680 --> 00:26:39,600 Speaker 1: other opportunities. So for example, most orchards, you know, they'll 461 00:26:39,640 --> 00:26:42,480 Speaker 1: have sort of a year-round staff of maybe a 462 00:26:42,520 --> 00:26:46,159 Speaker 1: dozen people, and then at certain busy times of the 463 00:26:46,240 --> 00:26:48,960 Speaker 1: year they'll bring in maybe a hundred laborers to come 464 00:26:48,960 --> 00:26:52,040 Speaker 1: in and help with the harvest. I think everybody would 465 00:26:52,040 --> 00:26:55,200 Speaker 1: be better off if that orchard had a year-round 466 00:26:55,200 --> 00:26:58,760 Speaker 1: staff of twenty people who were productive all year long 467 00:26:59,160 --> 00:27:01,760 Speaker 1: and were able to use technology to even out these 468 00:27:01,760 --> 00:27:04,440 Speaker 1: bumps in the labor demand.
And so those people, those 469 00:27:04,480 --> 00:27:06,399 Speaker 1: twenty people, are going to need to be higher skilled, 470 00:27:06,440 --> 00:27:08,360 Speaker 1: but they're also going to get paid more, and they're 471 00:27:08,400 --> 00:27:12,159 Speaker 1: also going to have more comfortable jobs, and overall they 472 00:27:12,200 --> 00:27:14,359 Speaker 1: will produce more per person than they would in the 473 00:27:14,359 --> 00:27:19,120 Speaker 1: other system. Of course, the lingering question is what happens 474 00:27:19,160 --> 00:27:21,400 Speaker 1: to the eighty people who no longer have a job, 475 00:27:21,880 --> 00:27:23,880 Speaker 1: and who gets to enjoy the fruits of this more 476 00:27:23,880 --> 00:27:27,880 Speaker 1: efficient system. Technology has improved lives all around the world 477 00:27:27,960 --> 00:27:31,400 Speaker 1: and lifted millions out of poverty, but it has also 478 00:27:31,520 --> 00:27:35,439 Speaker 1: dramatically enriched an extremely small number of people. We mentioned 479 00:27:35,440 --> 00:27:38,199 Speaker 1: Elon Musk's Neuralink earlier, and he's not alone in 480 00:27:38,200 --> 00:27:42,640 Speaker 1: the Silicon Valley elite investing in transhumanist technologies. That should 481 00:27:42,640 --> 00:27:46,840 Speaker 1: give us pause, remembering what Ian Bremmer said about cognitive differentiation. 482 00:27:47,600 --> 00:27:50,119 Speaker 1: So there's much to fear, and there are no obvious 483 00:27:50,200 --> 00:27:53,879 Speaker 1: solutions in sight. And yet people like Kai-Fu Lee 484 00:27:53,880 --> 00:27:57,960 Speaker 1: and Gill Pratt, people who are leading the field, remain optimistic. 485 00:27:58,400 --> 00:28:02,160 Speaker 1: I wanted to know why. There is a strong belief 486 00:28:02,720 --> 00:28:05,880 Speaker 1: that thought leaders should do the best they can do 487 00:28:06,520 --> 00:28:11,280 Speaker 1: to project a possible future and strive towards it and 488 00:28:11,440 --> 00:28:15,680 Speaker 1: encourage other people to help make that a reality. Because 489 00:28:15,960 --> 00:28:19,760 Speaker 1: whether we point at a future that is a utopia 490 00:28:19,880 --> 00:28:24,439 Speaker 1: or a dystopia, if everybody believes in it, then it becomes 491 00:28:24,480 --> 00:28:28,760 Speaker 1: a self-fulfilling prophecy. So I'd like to be part 492 00:28:28,800 --> 00:28:32,800 Speaker 1: of that force which points towards more of a utopian direction, 493 00:28:33,240 --> 00:28:37,000 Speaker 1: even though I fully understand and recognize the possibility and 494 00:28:37,119 --> 00:28:42,080 Speaker 1: risks of the negative ending. We too want to believe 495 00:28:42,120 --> 00:28:45,560 Speaker 1: in that utopian direction, harnessing automation to help feed the 496 00:28:45,560 --> 00:28:49,880 Speaker 1: world without stripping ourselves of community interaction, because man cannot 497 00:28:49,880 --> 00:28:52,200 Speaker 1: live on bread alone, and we need to make sure 498 00:28:52,240 --> 00:28:57,000 Speaker 1: to balance gains in efficiency with preserving the fabric of our society. 499 00:28:57,400 --> 00:28:59,760 Speaker 1: In the next episode, we travel from the farmyard 500 00:28:59,840 --> 00:29:02,560 Speaker 1: to the battlefield.
We meet some of the people pioneering 501 00:29:02,600 --> 00:29:05,240 Speaker 1: the use of AI and robotics to wage different kinds 502 00:29:05,240 --> 00:29:08,200 Speaker 1: of wars, and we speak with Arati Prabhakar, the former 503 00:29:08,240 --> 00:29:11,200 Speaker 1: head of DARPA, the agency that created the Internet, about 504 00:29:11,200 --> 00:29:15,560 Speaker 1: how technology is revolutionizing the military. I'm Oz Woloshyn. See 505 00:29:15,600 --> 00:29:30,480 Speaker 1: you next time. Sleepwalkers is a production of iHeart 506 00:29:30,600 --> 00:29:34,640 Speaker 1: Radio and Unusual Productions. For the latest AI news, live 507 00:29:34,680 --> 00:29:38,040 Speaker 1: interviews, and behind-the-scenes footage, find us on Instagram 508 00:29:38,080 --> 00:29:44,080 Speaker 1: at Sleepwalkers Podcast or at sleepwalkerspodcast.com. Sleepwalkers 509 00:29:44,160 --> 00:29:46,720 Speaker 1: is hosted by me, Oz Woloshyn, and co-hosted by 510 00:29:46,720 --> 00:29:49,680 Speaker 1: me, Karah Preiss. We're produced by Julian Weller with help 511 00:29:49,720 --> 00:29:53,320 Speaker 1: from Jacopo Penzo and Taylor Chicoine. Mixing by Tristan McNeil 512 00:29:53,440 --> 00:29:56,880 Speaker 1: and Julian Weller. Our story editor is Matthew Riddle. Recording 513 00:29:56,920 --> 00:30:00,960 Speaker 1: assistance this episode from Walter Kowski. Sleepwalkers is executive 514 00:30:00,960 --> 00:30:04,920 Speaker 1: produced by me, Oz Woloshyn, and Mangesh Hattikudur. For more 515 00:30:04,960 --> 00:30:07,520 Speaker 1: podcasts from iHeartRadio, visit the iHeartRadio app, 516 00:30:07,600 --> 00:30:10,560 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.