1 00:00:02,360 --> 00:00:09,640 Speaker 1: Bloomberg Audio Studios, podcasts, radio news. I'm Stephen Carroll and 2 00:00:09,760 --> 00:00:12,080 Speaker 1: this is Here's Why, where we take one news story 3 00:00:12,119 --> 00:00:14,000 Speaker 1: and explain it in just a few minutes with our 4 00:00:14,040 --> 00:00:15,480 Speaker 1: experts here at Bloomberg. 5 00:00:20,720 --> 00:00:23,360 Speaker 2: Every one of our jobs, at every stage of our careers, 6 00:00:23,400 --> 00:00:26,120 Speaker 2: at every company, in every country is going to get 7 00:00:26,120 --> 00:00:29,760 Speaker 2: affected by how this technology is going to change work, 8 00:00:30,040 --> 00:00:31,920 Speaker 2: and so we all need to lean into it. 9 00:00:32,120 --> 00:00:35,479 Speaker 3: No technology in the history of technologies has ever 10 00:00:35,920 --> 00:00:40,120 Speaker 3: reduced jobs on net, but it does change who is 11 00:00:40,159 --> 00:00:42,680 Speaker 3: working and who's not, and which skills are demanded. 12 00:00:42,960 --> 00:00:46,479 Speaker 4: Be aware of AI, adapt to AI, but also think 13 00:00:46,520 --> 00:00:50,680 Speaker 4: about these problems of labor displacement. You know, I think 14 00:00:51,120 --> 00:00:54,120 Speaker 4: pure market capitalism can be pretty brutal here. 15 00:00:54,440 --> 00:00:57,240 Speaker 1: There's a growing body of research trying to predict how 16 00:00:57,440 --> 00:01:00,920 Speaker 1: artificial intelligence will change the labor market. Some say 17 00:01:00,960 --> 00:01:03,720 Speaker 1: it's going to mean hundreds of millions of jobs are lost. 18 00:01:03,960 --> 00:01:07,240 Speaker 1: Others believe the technology will create far more roles than 19 00:01:07,240 --> 00:01:11,039 Speaker 1: it ultimately destroys.
The change is happening quickly, but a 20 00:01:11,080 --> 00:01:13,920 Speaker 1: report by the Economic Innovation Group think tank in the 21 00:01:14,000 --> 00:01:17,720 Speaker 1: US recently concluded that the effects of AI aren't showing 22 00:01:17,800 --> 00:01:21,640 Speaker 1: up in labor market data, at least for now. So 23 00:01:21,920 --> 00:01:28,520 Speaker 1: here's why AI isn't taking your job yet. Bloomberg Opinion 24 00:01:28,480 --> 00:01:31,520 Speaker 1: columnist Parmy Olson joins me now for more. Parmy, 25 00:01:31,640 --> 00:01:34,040 Speaker 1: let's start with the potential that AI has to shake 26 00:01:34,160 --> 00:01:35,039 Speaker 1: up the labor market 27 00:01:35,080 --> 00:01:35,280 Speaker 1: first. 28 00:01:35,360 --> 00:01:37,240 Speaker 1: Who is thought to be most exposed? 29 00:01:37,880 --> 00:01:41,240 Speaker 5: Well, I think because of the nature of the new 30 00:01:41,440 --> 00:01:45,200 Speaker 5: generative AI tools that companies are so excited about, the 31 00:01:45,319 --> 00:01:50,440 Speaker 5: likes of ChatGPT, they are very good at generating text, 32 00:01:50,640 --> 00:01:53,800 Speaker 5: at processing lots of information. So the people who are 33 00:01:53,840 --> 00:01:56,720 Speaker 5: most exposed are the ones who do that kind of work, 34 00:01:56,880 --> 00:02:00,080 Speaker 5: the so-called knowledge workers. So people who work in 35 00:02:00,120 --> 00:02:04,280 Speaker 5: the legal field, in media, in research, anything that involves 36 00:02:04,320 --> 00:02:08,839 Speaker 5: any kind of information processing or generating information is exposed. 37 00:02:08,880 --> 00:02:11,560 Speaker 5: And I mean, if you kind of look at 38 00:02:11,600 --> 00:02:14,960 Speaker 5: what that means in context, that actually means people who 39 00:02:14,960 --> 00:02:18,799 Speaker 5: are highly educated.
So one study showed that twenty seven 40 00:02:18,880 --> 00:02:22,440 Speaker 5: percent of workers with a bachelor's degree are going to 41 00:02:22,480 --> 00:02:25,320 Speaker 5: work in a job that's highly exposed to AI and 42 00:02:25,400 --> 00:02:29,960 Speaker 5: potential disruption, whereas only three percent of people who don't 43 00:02:30,000 --> 00:02:33,480 Speaker 5: even have a high school diploma are affected. So that's 44 00:02:33,520 --> 00:02:36,880 Speaker 5: a big gulf in terms of who's exposed, and for 45 00:02:37,000 --> 00:02:39,640 Speaker 5: many people that will seem quite ironic. You spend all 46 00:02:39,680 --> 00:02:43,200 Speaker 5: this money, you know, educating yourself, only to find yourself 47 00:02:43,240 --> 00:02:45,399 Speaker 5: actually quite exposed to this new technology. 48 00:02:45,880 --> 00:02:48,440 Speaker 1: The potential for change is absolutely huge, and the conversation 49 00:02:48,600 --> 00:02:51,600 Speaker 1: is happening everywhere. But what is the current data telling 50 00:02:51,680 --> 00:02:54,799 Speaker 1: us about how AI is changing the jobs market? 51 00:02:54,520 --> 00:02:58,600 Speaker 5: So, I think it's quite hard to kind of 52 00:02:58,600 --> 00:03:00,800 Speaker 5: get a clear picture of that yet, because there are 53 00:03:00,840 --> 00:03:04,679 Speaker 5: so many other economic factors that are also affecting the 54 00:03:04,760 --> 00:03:07,880 Speaker 5: job market. You know, there's inflation, there's high interest rates, 55 00:03:08,160 --> 00:03:12,240 Speaker 5: there's all the geopolitical stuff that's happening. And so I think, really, 56 00:03:12,320 --> 00:03:15,400 Speaker 5: right now, what people can agree on is that AI 57 00:03:15,600 --> 00:03:18,880 Speaker 5: is perhaps more of an accelerant than an actual cause 58 00:03:19,400 --> 00:03:23,120 Speaker 5: of job disruption, of people losing their jobs.
There have been 59 00:03:23,120 --> 00:03:26,440 Speaker 5: some studies that show there is a direct link between 60 00:03:26,800 --> 00:03:31,239 Speaker 5: generative AI and layoffs, often because companies are attributing the 61 00:03:31,280 --> 00:03:34,240 Speaker 5: layoffs to their greater use of generative AI. 62 00:03:34,280 --> 00:03:37,160 Speaker 5: There was one study that showed in the first seven 63 00:03:37,200 --> 00:03:41,360 Speaker 5: months of twenty twenty four, ten thousand layoffs were linked 64 00:03:41,440 --> 00:03:44,280 Speaker 5: to generative AI, and in terms of the top 65 00:03:44,400 --> 00:03:48,040 Speaker 5: causes of workforce cuts, AI is up there in the 66 00:03:48,160 --> 00:03:50,840 Speaker 5: top five. And the other effect we're seeing is of 67 00:03:50,920 --> 00:03:54,440 Speaker 5: course a hiring freeze on entry level workers in kind 68 00:03:54,480 --> 00:03:58,440 Speaker 5: of repetitive roles, particularly in software and data management. So I 69 00:03:58,480 --> 00:04:01,080 Speaker 5: think we're also in this kind of wait-and-see 70 00:04:01,160 --> 00:04:04,880 Speaker 5: moment where companies are still experimenting with generative AI to 71 00:04:04,920 --> 00:04:07,640 Speaker 5: see how they can use it to become more productive, 72 00:04:07,920 --> 00:04:11,800 Speaker 5: potentially cut down on costs, cut down on labor costs. 73 00:04:12,200 --> 00:04:15,440 Speaker 5: And so they're not necessarily cutting jobs, they're just not 74 00:04:15,760 --> 00:04:19,760 Speaker 5: hiring for new roles until they can see if AI actually 75 00:04:19,800 --> 00:04:21,840 Speaker 5: can be used instead of a human. 76 00:04:21,839 --> 00:04:25,200 Speaker 1: Do we see much difference between countries or industries when it comes 77 00:04:25,200 --> 00:04:26,679 Speaker 1: to the effect that it's having? 78 00:04:27,040 --> 00:04:30,960 Speaker 5: The greater difference is actually just between industries.
As I mentioned, 79 00:04:31,560 --> 00:04:35,000 Speaker 5: the kind of knowledge work areas like finance and law 80 00:04:35,120 --> 00:04:37,920 Speaker 5: and media and IT, those are definitely industries that are 81 00:04:38,400 --> 00:04:41,720 Speaker 5: very much affected, even creative industries, who would have thought that, 82 00:04:41,839 --> 00:04:44,480 Speaker 5: you know, a few years ago. Whereas other industries which 83 00:04:44,480 --> 00:04:47,360 Speaker 5: involve more human-to-human interaction or a physical 84 00:04:47,400 --> 00:04:52,279 Speaker 5: trade like construction or, you know, caregiving, those are a 85 00:04:52,320 --> 00:04:55,200 Speaker 5: bit more insulated. But 86 00:04:55,240 --> 00:04:57,280 Speaker 5: when you talk about countries, in much the same way 87 00:04:57,320 --> 00:04:59,880 Speaker 5: that people with degrees tend to 88 00:05:00,040 --> 00:05:04,440 Speaker 5: be more exposed, so too are more advanced economies, which rely 89 00:05:04,760 --> 00:05:09,080 Speaker 5: on knowledge workers and the knowledge economy. Countries in Europe 90 00:05:09,279 --> 00:05:12,080 Speaker 5: and North America are perhaps more exposed than those in 91 00:05:12,160 --> 00:05:17,080 Speaker 5: developing economies, which rely more on manufacturing and on services. 92 00:05:17,600 --> 00:05:20,400 Speaker 5: So it's not country by country, but certainly by region, 93 00:05:20,400 --> 00:05:21,880 Speaker 5: I think there is also that difference. 94 00:05:22,160 --> 00:05:24,680 Speaker 1: You've been writing in particular about how young workers are 95 00:05:24,680 --> 00:05:26,839 Speaker 1: being affected by this as well. What sort of things 96 00:05:26,839 --> 00:05:27,640 Speaker 1: have you heard from them?
97 00:05:27,720 --> 00:05:30,599 Speaker 5: I think it's always tough for anyone entering the job market, 98 00:05:30,640 --> 00:05:33,400 Speaker 5: for any young person entering it. I mean, I remember when 99 00:05:33,440 --> 00:05:35,960 Speaker 5: I was first looking for jobs twenty years ago. It 100 00:05:36,000 --> 00:05:37,920 Speaker 5: was really really hard and you had to trial all sorts 101 00:05:37,920 --> 00:05:40,960 Speaker 5: of creative ways to put yourself out there. I almost 102 00:05:40,960 --> 00:05:44,160 Speaker 5: feel like it is harder now for this current generation, 103 00:05:44,920 --> 00:05:48,680 Speaker 5: and that is because companies are experimenting with AI, and 104 00:05:48,880 --> 00:05:52,279 Speaker 5: the most exposed jobs aren't just the knowledge worker jobs, 105 00:05:52,279 --> 00:05:54,680 Speaker 5: they're the entry level jobs. And the reason for that 106 00:05:54,800 --> 00:05:59,040 Speaker 5: is really simple. Companies are being advised to treat AI 107 00:05:59,279 --> 00:06:03,880 Speaker 5: chatbots like an intern or a research assistant, and 108 00:06:03,920 --> 00:06:07,200 Speaker 5: that's exactly what they're doing. And they're doing that at 109 00:06:07,240 --> 00:06:11,960 Speaker 5: the expense, in many cases, of real human interns or 110 00:06:12,000 --> 00:06:15,719 Speaker 5: real human research assistants, because actually these tools do that 111 00:06:15,839 --> 00:06:18,520 Speaker 5: work very very well. That's where we're seeing some of 112 00:06:18,520 --> 00:06:21,279 Speaker 5: the hiring freezes on the entry level work. And when 113 00:06:21,320 --> 00:06:25,320 Speaker 5: young people do get jobs at these companies, in 114 00:06:25,400 --> 00:06:28,719 Speaker 5: sort of white collar jobs, they're expected to use AI 115 00:06:29,000 --> 00:06:31,560 Speaker 5: on the job.
And that's a whole other kind of 116 00:06:31,760 --> 00:06:38,359 Speaker 5: story, because when you're using AI to do those first tasks, 117 00:06:39,200 --> 00:06:44,000 Speaker 5: you're producing more, more quickly, than perhaps someone in your 118 00:06:44,080 --> 00:06:46,760 Speaker 5: role would have done two years prior. But are you 119 00:06:46,880 --> 00:06:50,719 Speaker 5: being trained? Are you really learning the business fundamentals, the 120 00:06:50,760 --> 00:06:54,400 Speaker 5: industry fundamentals? And I'll give you one example. I spoke 121 00:06:54,440 --> 00:06:57,799 Speaker 5: to one young man who was working at a fintech company. 122 00:06:57,839 --> 00:07:01,040 Speaker 5: He had an accountancy degree. He was doing due diligence 123 00:07:01,080 --> 00:07:03,919 Speaker 5: on companies. And you know, three years ago, someone in 124 00:07:03,960 --> 00:07:07,080 Speaker 5: his role would have been reading Moody's reports and Companies 125 00:07:07,120 --> 00:07:10,800 Speaker 5: House reports and analyst reports and just looking for those 126 00:07:10,880 --> 00:07:14,400 Speaker 5: red flags to do the due diligence. What he was 127 00:07:14,400 --> 00:07:17,200 Speaker 5: doing was taking all that text and putting it into 128 00:07:17,280 --> 00:07:20,640 Speaker 5: ChatGPT, and it was doing that analysis for him. 129 00:07:20,800 --> 00:07:23,640 Speaker 5: The great thing for his employer was he was producing 130 00:07:23,680 --> 00:07:26,560 Speaker 5: something much more quickly than he would have done. The 131 00:07:26,600 --> 00:07:29,679 Speaker 5: bad thing for him is he's not gaining that skill, 132 00:07:30,240 --> 00:07:35,200 Speaker 5: kind of those neural connections, to 133 00:07:35,320 --> 00:07:38,080 Speaker 5: read those reports and kind of know what to look for.
134 00:07:38,360 --> 00:07:40,680 Speaker 5: Young workers are in this weird situation now where they're 135 00:07:40,680 --> 00:07:43,200 Speaker 5: actually not only being asked to do work more quickly, 136 00:07:43,240 --> 00:07:47,760 Speaker 5: but kind of do more higher level, strategic work, because 137 00:07:47,800 --> 00:07:50,160 Speaker 5: the AI is doing all this so-called grunt work. 138 00:07:50,600 --> 00:07:55,280 Speaker 5: But I question whether you can make that leap without 139 00:07:55,480 --> 00:07:58,800 Speaker 5: doing that grunt work first, and what this next generation 140 00:07:58,960 --> 00:08:02,160 Speaker 5: of office workers and professionals, what kind of grounding they'll 141 00:08:02,240 --> 00:08:04,840 Speaker 5: have when they're letting AI do so much of the 142 00:08:04,920 --> 00:08:06,120 Speaker 5: cognitive work for them. 143 00:08:06,440 --> 00:08:08,280 Speaker 1: That's a really interesting thing to think about as the 144 00:08:08,320 --> 00:08:11,520 Speaker 1: technology gets rolled out into more areas as well. I 145 00:08:11,520 --> 00:08:13,320 Speaker 1: do want to think about situating this in 146 00:08:13,440 --> 00:08:16,040 Speaker 1: historical context as well. I mean, this isn't the first 147 00:08:16,080 --> 00:08:18,800 Speaker 1: major technological shift that we've seen in the workplace, far 148 00:08:18,840 --> 00:08:22,280 Speaker 1: from it. What have we learned from past pivotal 149 00:08:22,320 --> 00:08:24,400 Speaker 1: moments where tech has gotten involved in the workforce 150 00:08:24,400 --> 00:08:26,120 Speaker 1: that might give us an idea of how this is 151 00:08:26,160 --> 00:08:27,400 Speaker 1: all going to pan out?
152 00:08:27,240 --> 00:08:30,120 Speaker 5: Well, in the past, these kinds of transitions actually happened 153 00:08:30,200 --> 00:08:33,320 Speaker 5: very slowly, which is odd to say, because I think 154 00:08:33,400 --> 00:08:37,640 Speaker 5: right now this current transition with AI is happening very quickly, 155 00:08:38,160 --> 00:08:40,160 Speaker 5: and I think it's going to happen more quickly than 156 00:08:40,280 --> 00:08:43,559 Speaker 5: previous transitions have. So, just for example, in the nineteen eighties, 157 00:08:44,040 --> 00:08:47,640 Speaker 5: when companies were previously doing everything on printed paper 158 00:08:47,679 --> 00:08:51,319 Speaker 5: and faxes and telephone calls and handwritten notes, all of 159 00:08:51,360 --> 00:08:54,000 Speaker 5: a sudden they were using computers. That was a huge, 160 00:08:54,360 --> 00:08:58,240 Speaker 5: very painful, very complicated transition, to suddenly put everything in 161 00:08:58,360 --> 00:08:59,880 Speaker 5: digital form on a computer. 162 00:09:00,760 --> 00:09:01,199 Speaker 5: Sort of the 163 00:09:01,120 --> 00:09:03,400 Speaker 5: same, I suppose you could say, with the move to cloud. 164 00:09:03,679 --> 00:09:05,800 Speaker 5: And so that takes time, because a company has all 165 00:09:05,800 --> 00:09:08,079 Speaker 5: these different processes and ways of working and you have 166 00:09:08,160 --> 00:09:11,760 Speaker 5: to transition that to this new environment. And I think 167 00:09:11,800 --> 00:09:14,839 Speaker 5: with AI we'll see something similar.
You know, it's funny, 168 00:09:14,840 --> 00:09:18,800 Speaker 5: because right now AI companies are just absolutely falling over 169 00:09:18,840 --> 00:09:24,160 Speaker 5: themselves to make the biggest model, the smartest AI, and 170 00:09:24,240 --> 00:09:27,440 Speaker 5: what they've failed a little bit at doing is really 171 00:09:27,480 --> 00:09:32,280 Speaker 5: talking to their customers, their enterprise customers, and just helping 172 00:09:32,280 --> 00:09:35,240 Speaker 5: them figure out how to use these tools in their 173 00:09:35,280 --> 00:09:38,480 Speaker 5: own businesses and in their own processes. And so there's 174 00:09:38,520 --> 00:09:41,360 Speaker 5: been this difficulty. I think businesses have really struggled: this 175 00:09:41,400 --> 00:09:43,920 Speaker 5: is this incredible technology, how do we actually use it? 176 00:09:44,400 --> 00:09:48,440 Speaker 5: So even if AI didn't progress at all anymore, I 177 00:09:48,480 --> 00:09:50,679 Speaker 5: think we'd still have another five to ten years ahead of 178 00:09:50,760 --> 00:09:53,319 Speaker 5: us for businesses just to figure out how to exploit, 179 00:09:53,360 --> 00:09:56,880 Speaker 5: how to capitalize on this technology. So I think we're 180 00:09:56,880 --> 00:09:59,200 Speaker 5: in the early days of that, and probably in 181 00:09:59,240 --> 00:10:01,959 Speaker 5: the next three to five years or so, we'll see 182 00:10:02,000 --> 00:10:04,760 Speaker 5: a bit more of an acceleration as companies really start 183 00:10:04,760 --> 00:10:06,880 Speaker 5: to figure out how to use these tools and, I 184 00:10:06,960 --> 00:10:08,640 Speaker 5: hate to say it, replace some of their 185 00:10:08,640 --> 00:10:11,400 Speaker 5: own workers with some of these tools, and maybe even 186 00:10:11,440 --> 00:10:13,920 Speaker 5: create new types of jobs that involve AI.
187 00:10:14,400 --> 00:10:17,160 Speaker 1: So much more to come on this topic, but for now, 188 00:10:17,240 --> 00:10:20,400 Speaker 1: Parmy Olson, Bloomberg Opinion columnist, thank you. And you can 189 00:10:20,440 --> 00:10:23,960 Speaker 1: read more from Parmy at Bloomberg dot com forward slash Opinion, 190 00:10:24,640 --> 00:10:27,080 Speaker 1: and for more explanations like this from our team of 191 00:10:27,120 --> 00:10:29,840 Speaker 1: three thousand journalists and analysts around the world, go to 192 00:10:29,840 --> 00:10:35,000 Speaker 1: Bloomberg dot com slash Explainers. I'm Stephen Carroll. This is 193 00:10:35,040 --> 00:10:37,680 Speaker 1: Here's Why. I'll be back next week with more. Thanks 194 00:10:37,720 --> 00:10:38,200 Speaker 1: for listening.