1 00:00:00,680 --> 00:00:01,560 Speaker 1: LinkedIn News. 2 00:00:08,760 --> 00:00:11,680 Speaker 2: From LinkedIn News and iHeart Podcasts, this is Let's 3 00:00:11,720 --> 00:00:14,320 Speaker 2: Talk Offline. I'm Gianna Prudente. 4 00:00:14,240 --> 00:00:17,880 Speaker 3: And I'm Jamé Jackson Gadsden. It's the holiday season, y'all. 5 00:00:17,960 --> 00:00:20,440 Speaker 3: And with that, we're doing something a little bit different 6 00:00:20,440 --> 00:00:23,599 Speaker 3: this week. Now what's different, you might ask? Well, first, 7 00:00:24,079 --> 00:00:27,160 Speaker 3: did you know that LinkedIn has a whole podcast network 8 00:00:27,200 --> 00:00:30,960 Speaker 3: filled with other podcasts that talk all about work-related topics? 9 00:00:31,240 --> 00:00:31,680 Speaker 4: Mmmm? 10 00:00:32,159 --> 00:00:32,640 Speaker 5: Now you know. 11 00:00:33,280 --> 00:00:35,760 Speaker 3: So with that in mind, over the next two weeks, 12 00:00:35,880 --> 00:00:38,360 Speaker 3: Gianna and I want to share some of our favorite 13 00:00:38,400 --> 00:00:41,519 Speaker 3: episodes from other shows in the LinkedIn Podcast Network. 14 00:00:41,840 --> 00:00:44,559 Speaker 2: Ugh, I'm so excited. You know, we have some pretty 15 00:00:44,600 --> 00:00:48,360 Speaker 2: amazing colleagues who also give really great advice on their shows, 16 00:00:48,560 --> 00:00:51,320 Speaker 2: and we've learned a lot from the conversations they've had 17 00:00:51,360 --> 00:00:54,000 Speaker 2: on their podcasts. And this week we want to share 18 00:00:54,000 --> 00:00:55,680 Speaker 2: an episode from Get Hired. 19 00:00:55,720 --> 00:00:59,600 Speaker 3: Yep, one of our work besties. LinkedIn's Andrew Seaman spoke 20 00:00:59,640 --> 00:01:03,040 Speaker 3: to Joseph Fuller, Professor of Management Practice at Harvard 21 00:01:03,120 --> 00:01:06,360 Speaker 3: Business School, about how AI is changing the future of 22 00:01:06,440 --> 00:01:07,280 Speaker 3: work and hiring.
23 00:01:07,680 --> 00:01:10,640 Speaker 2: AI is becoming a more prevalent part of all of 24 00:01:10,640 --> 00:01:13,679 Speaker 2: our lives, especially when it comes to work, and we 25 00:01:13,800 --> 00:01:17,440 Speaker 2: loved this conversation because Andrew and Joseph really helped dispel 26 00:01:17,520 --> 00:01:19,560 Speaker 2: some of the fears we might have when it comes 27 00:01:19,600 --> 00:01:23,720 Speaker 2: to embracing new technology, and understanding AI will only help 28 00:01:23,800 --> 00:01:26,840 Speaker 2: us stay competitive in the job market. So we hope 29 00:01:26,880 --> 00:01:29,039 Speaker 2: you enjoy this conversation from Get Hired. 30 00:01:37,000 --> 00:01:40,240 Speaker 6: From the way people apply for jobs and hiring managers 31 00:01:40,319 --> 00:01:44,399 Speaker 6: screen applicants to the actual kind of roles available, AI 32 00:01:44,520 --> 00:01:49,480 Speaker 6: is reshaping every aspect of getting hired today. AI-powered 33 00:01:49,480 --> 00:01:53,320 Speaker 6: tools hold enormous promise for job seekers, but they also 34 00:01:53,400 --> 00:01:57,360 Speaker 6: pose some potential challenges. So how should you be thinking 35 00:01:57,400 --> 00:02:01,120 Speaker 6: about your career in the context of AI? Well, whether 36 00:02:01,160 --> 00:02:03,960 Speaker 6: you're just starting out, looking to pivot, or trying to 37 00:02:04,000 --> 00:02:06,880 Speaker 6: climb the ladder, we're getting into all of that on 38 00:02:06,960 --> 00:02:11,920 Speaker 6: today's show. From LinkedIn News, this is Get Hired, a 39 00:02:12,000 --> 00:02:14,960 Speaker 6: podcast for the ups and downs and the ever-changing 40 00:02:15,040 --> 00:02:18,920 Speaker 6: landscape of our professional lives.
I'm Andrew Seaman, LinkedIn 41 00:02:19,040 --> 00:02:22,799 Speaker 6: Senior Managing Editor for Jobs and Career Development, bringing you 42 00:02:22,880 --> 00:02:26,240 Speaker 6: conversations with people who, like me, want to see you 43 00:02:26,280 --> 00:02:31,120 Speaker 6: succeed at work, at home, and everywhere in between. Joining 44 00:02:31,120 --> 00:02:34,720 Speaker 6: me today is Joseph Fuller. He's a Professor of Management 45 00:02:34,760 --> 00:02:37,640 Speaker 6: Practice at Harvard Business School and the co-leader of 46 00:02:37,680 --> 00:02:41,560 Speaker 6: the Managing the Future of Work initiative. Professor Fuller is 47 00:02:41,600 --> 00:02:44,080 Speaker 6: an expert on what's known as the skills gap in 48 00:02:44,120 --> 00:02:47,240 Speaker 6: the US labor force, which is the space between the 49 00:02:47,320 --> 00:02:50,320 Speaker 6: current skills of the US workforce and the skills needed 50 00:02:50,360 --> 00:02:54,840 Speaker 6: to get work done. He's written extensively about policy solutions 51 00:02:54,880 --> 00:02:58,320 Speaker 6: to address it. We met up at the Walmart Opportunity 52 00:02:58,360 --> 00:03:01,640 Speaker 6: Summit in Washington, D.C., to discuss how AI is 53 00:03:01,800 --> 00:03:05,399 Speaker 6: changing the nature of work and hiring. I kicked off 54 00:03:05,400 --> 00:03:08,400 Speaker 6: our conversation by asking why it's so important to be 55 00:03:08,480 --> 00:03:11,360 Speaker 6: thinking about the role of new technologies in the future 56 00:03:11,400 --> 00:03:12,680 Speaker 6: of work right now. 57 00:03:12,919 --> 00:03:13,600 Speaker 4: Can you tell us a 58 00:03:13,600 --> 00:03:16,519 Speaker 7: little bit about what you think of today's gathering and 59 00:03:16,840 --> 00:03:19,000 Speaker 7: sort of what you hope people will get out of it? 60 00:03:19,919 --> 00:03:20,119 Speaker 5: Well.
61 00:03:20,160 --> 00:03:22,480 Speaker 1: I think today's gathering actually is a bit of a 62 00:03:22,560 --> 00:03:28,119 Speaker 1: recognition that the way large companies particularly have been approaching 63 00:03:28,840 --> 00:03:33,360 Speaker 1: challenges in the labor market needs a refresh and a rethink. People 64 00:03:33,360 --> 00:03:37,080 Speaker 1: are executing, as best as they know how, the old 65 00:03:37,120 --> 00:03:41,000 Speaker 1: playbook, and with the old playbook, they're not moving the ball 66 00:03:41,320 --> 00:03:43,400 Speaker 1: at the rate they feel they need to in terms 67 00:03:43,480 --> 00:03:47,920 Speaker 1: of cultivating the right skills base, having a more agile workforce. 68 00:03:49,080 --> 00:03:52,280 Speaker 1: And there are some bedrock assumptions on which a lot of 69 00:03:53,360 --> 00:03:57,240 Speaker 1: hiring and talent sourcing has been based: that the 70 00:03:57,280 --> 00:04:00,840 Speaker 1: American K-through-twelve system will consistently create large numbers 71 00:04:00,880 --> 00:04:06,320 Speaker 1: of people who are work-ready, that the post-secondary sector is 72 00:04:06,840 --> 00:04:08,400 Speaker 1: going to create people with 73 00:04:08,560 --> 00:04:09,880 Speaker 5: relevant job skills. 74 00:04:10,080 --> 00:04:13,760 Speaker 1: It does, but forty-four percent of college graduates end 75 00:04:13,840 --> 00:04:17,160 Speaker 1: up underemployed when they graduate. So there are a lot of 76 00:04:17,200 --> 00:04:22,120 Speaker 1: warning signs. And I think when you get this many 77 00:04:22,240 --> 00:04:28,159 Speaker 1: very prominent companies sending very senior people to something like this, 78 00:04:29,400 --> 00:04:33,640 Speaker 1: it's more than just an act of contributing to the commons 79 00:04:33,680 --> 00:04:35,760 Speaker 1: and trying to do the right thing. It's an expression 80 00:04:35,800 --> 00:04:36,799 Speaker 1: of business necessity.
81 00:04:37,120 --> 00:04:40,520 Speaker 7: Well, what do you think about this moment overall, especially 82 00:04:40,520 --> 00:04:43,719 Speaker 7: with the arrival of AI? Because we're seeing the companies 83 00:04:43,760 --> 00:04:48,240 Speaker 7: talk about skills, but with the specter of this huge 84 00:04:48,279 --> 00:04:49,400 Speaker 7: technological shift, 85 00:04:49,440 --> 00:04:50,840 Speaker 5: so how do you 86 00:04:50,760 --> 00:04:54,120 Speaker 7: think people in the workforce who feel maybe like cogs 87 00:04:54,120 --> 00:04:56,440 Speaker 7: in a wheel, how should they view this moment? 88 00:04:57,920 --> 00:05:00,920 Speaker 5: Well, in terms of AI, AI is really the 89 00:05:00,880 --> 00:05:04,320 Speaker 1: culmination of an arc of technological development we've seen over 90 00:05:04,320 --> 00:05:08,640 Speaker 1: the last twenty years. As many people understand, AI of 91 00:05:08,680 --> 00:05:12,640 Speaker 1: different forms has existed for a long time. Generative AI, though, 92 00:05:12,720 --> 00:05:17,680 Speaker 1: was this capstone development, and it's going to be different 93 00:05:17,800 --> 00:05:22,680 Speaker 1: than previous technologies insomuch as it has a couple of features. 94 00:05:22,800 --> 00:05:26,240 Speaker 1: One is it's an augmentative technology, by which we mean 95 00:05:27,080 --> 00:05:32,359 Speaker 1: it allows people to do elements of their job very well. 96 00:05:33,000 --> 00:05:36,200 Speaker 1: It's not an automation technology, where just one for one 97 00:05:36,279 --> 00:05:39,159 Speaker 1: you install the software, in this case, and the job 98 00:05:39,200 --> 00:05:44,000 Speaker 1: goes away. It also is asymmetrically oriented toward knowledge workers 99 00:05:44,040 --> 00:05:49,640 Speaker 1: and higher-wage workers.
Most technological revolutions have more or 100 00:05:49,720 --> 00:05:53,520 Speaker 1: less addressed middle-skills workers, lower-wage workers, the bottom 101 00:05:53,600 --> 00:05:57,080 Speaker 1: end of the white-collar distribution. So this is going 102 00:05:57,120 --> 00:06:00,760 Speaker 1: to very much affect different populations, to both make them 103 00:06:00,839 --> 00:06:04,520 Speaker 1: more productive but also put greater pressures on 104 00:06:04,600 --> 00:06:10,520 Speaker 1: their employability. We're now at an age where, in multiple instances, 105 00:06:11,440 --> 00:06:15,360 Speaker 1: the half-life of a technology is about equal to 106 00:06:15,520 --> 00:06:16,640 Speaker 1: the time it takes 107 00:06:16,440 --> 00:06:17,800 Speaker 5: to master the technology. 108 00:06:18,440 --> 00:06:22,520 Speaker 1: That's just crazy. That's so off the map of 109 00:06:22,560 --> 00:06:25,480 Speaker 1: the known world. And that gets us a little bit to 110 00:06:25,480 --> 00:06:30,560 Speaker 1: the question about individuals and AI. Our data suggests that 111 00:06:30,600 --> 00:06:32,760 Speaker 1: people are very curious about AI; many of them are 112 00:06:32,839 --> 00:06:34,039 Speaker 1: very hopeful about AI. 113 00:06:34,880 --> 00:06:35,440 Speaker 5: So I think for 114 00:06:36,279 --> 00:06:40,080 Speaker 1: individuals right now, the first thing to understand is this 115 00:06:40,120 --> 00:06:45,400 Speaker 1: is here to stay. It's designed to be navigable by 116 00:06:45,480 --> 00:06:48,880 Speaker 1: a human being who can type, and pretty soon it'll do 117 00:06:48,960 --> 00:06:49,839 Speaker 1: audio recognition. 118 00:06:50,200 --> 00:06:52,120 Speaker 5: So playing with
119 00:06:52,160 --> 00:06:59,000 Speaker 1: it, even just the openly available, no-monthly-fee version, understanding 120 00:06:59,279 --> 00:07:01,000 Speaker 1: that it's going to show up in your work 121 00:07:01,080 --> 00:07:05,480 Speaker 1: sooner or later. So it might be a little intimidating, 122 00:07:05,720 --> 00:07:07,880 Speaker 1: but it's time to stick your toes 123 00:07:07,640 --> 00:07:09,159 Speaker 5: in the water in your work. 124 00:07:09,200 --> 00:07:11,720 Speaker 7: You talk to a lot of companies, and the sense 125 00:07:11,760 --> 00:07:13,240 Speaker 7: I get from a lot of them is that they're 126 00:07:13,320 --> 00:07:15,480 Speaker 7: kind of in the same boat, where they're like, we 127 00:07:15,520 --> 00:07:18,520 Speaker 7: want to use this technology, and obviously they are, yes, 128 00:07:18,680 --> 00:07:20,480 Speaker 7: but at the same time they're still trying to be 129 00:07:20,560 --> 00:07:23,640 Speaker 7: like, how, though? Like they're still trying to figure out 130 00:07:23,720 --> 00:07:26,400 Speaker 7: exactly how it will be beneficial to them, right? 131 00:07:26,480 --> 00:07:29,120 Speaker 1: Yes. And what I'd add to that is, because it's 132 00:07:29,120 --> 00:07:35,000 Speaker 1: an augmentative technology, it's much harder to adopt than an 133 00:07:35,040 --> 00:07:41,160 Speaker 1: automation technology. A lot of companies are essentially saying, 134 00:07:41,960 --> 00:07:43,560 Speaker 5: wow, using this is complicated. 135 00:07:43,920 --> 00:07:49,360 Speaker 1: Right. When companies started moving from horsepower to electrical power, 136 00:07:49,400 --> 00:07:53,640 Speaker 1: that was complicated too. And this technology, fundamentally, is 137 00:07:53,720 --> 00:07:57,640 Speaker 1: the most important technological development since controllable power.
138 00:07:58,240 --> 00:08:01,440 Speaker 1: So what you have to do as a company to 139 00:08:01,480 --> 00:08:05,120 Speaker 1: deploy it is you have to not just introduce it into 140 00:08:05,160 --> 00:08:08,480 Speaker 1: your existing process; you have to re-engineer your process 141 00:08:08,640 --> 00:08:11,080 Speaker 1: around what it can do. That means you're going to 142 00:08:11,160 --> 00:08:15,680 Speaker 1: change job descriptions, metrics, the process flow. And so a 143 00:08:15,760 --> 00:08:19,720 Speaker 1: lot of companies are actually having negative margin impact right 144 00:08:19,760 --> 00:08:23,360 Speaker 1: now, because they're paying to spread AI through the workforce, 145 00:08:23,600 --> 00:08:26,400 Speaker 1: but they haven't had the confidence or the knowledge yet of 146 00:08:26,480 --> 00:08:30,160 Speaker 1: what costs it could take out to offset those costs. 147 00:08:30,240 --> 00:08:33,079 Speaker 1: So most adoption curves in new technologies like those 148 00:08:33,080 --> 00:08:37,480 Speaker 1: are shaped like an S. Early adopters could be hobbyists 149 00:08:37,520 --> 00:08:40,920 Speaker 1: even. Then the economics start to get more favorable, scale 150 00:08:40,920 --> 00:08:43,680 Speaker 1: economies, and you get that ramp-up, and then you 151 00:08:43,720 --> 00:08:47,960 Speaker 1: get the late adopters that have no need for it. 152 00:08:48,640 --> 00:08:53,800 Speaker 1: Generative AI, it's actually J-shaped. It's cash negative, margin negative 153 00:08:53,920 --> 00:08:56,160 Speaker 1: right now. For a lot of companies, the question is 154 00:08:56,280 --> 00:08:59,559 Speaker 1: how soon can they make the associated changes with the 155 00:08:59,600 --> 00:09:05,160 Speaker 1: way that they do processes
today, adopt AI, displace those costs, 156 00:09:05,200 --> 00:09:07,360 Speaker 1: and then bounce out the other end. But when they 157 00:09:07,400 --> 00:09:10,120 Speaker 1: bounce out, it's going to be with a very steep slope. 158 00:09:10,400 --> 00:09:14,520 Speaker 7: What strikes me is how dynamic it is, like, especially 159 00:09:14,559 --> 00:09:18,240 Speaker 7: in customer service. Yes, examples where, you know, the high 160 00:09:18,280 --> 00:09:21,480 Speaker 7: performers, they don't benefit from AI, but the low performers 161 00:09:21,480 --> 00:09:22,920 Speaker 7: in a call center, they benefit. 162 00:09:23,040 --> 00:09:26,880 Speaker 1: You're citing that MIT and Stanford research, which is terrific. 163 00:09:27,679 --> 00:09:30,880 Speaker 1: It turns out it's a great leveler. We did research 164 00:09:30,960 --> 00:09:34,320 Speaker 1: at Harvard Business School on this as well, looking at 165 00:09:34,520 --> 00:09:38,160 Speaker 1: analysts in a prominent consulting firm, and what you saw 166 00:09:38,280 --> 00:09:43,319 Speaker 1: there, which was startling, is that in the existing performance 167 00:09:43,360 --> 00:09:48,160 Speaker 1: management system, a seventy-fifth percentile performer was viewed as 168 00:09:48,360 --> 00:09:51,679 Speaker 1: forty percent more productive than a twenty-fifth percentile performer. 169 00:09:52,360 --> 00:09:57,480 Speaker 1: That gap closed to about fifteen percent if both groups 170 00:09:57,760 --> 00:10:00,480 Speaker 1: had access to AI. And I think this gets back 171 00:10:00,520 --> 00:10:03,240 Speaker 1: to this: how the individual should think about it. A 172 00:10:03,280 --> 00:10:05,120 Speaker 1: lot of people are going to hear fears from new 173 00:10:05,160 --> 00:10:10,559 Speaker 1: technology, AI, Arnold Schwarzenegger, oh my gosh, and Matthew Broderick, 174 00:10:10,679 --> 00:10:14,000 Speaker 1: you know, in whatever that movie was called.
And in fact, 175 00:10:14,040 --> 00:10:17,160 Speaker 1: for many people, what it's going to do is help 176 00:10:17,200 --> 00:10:20,320 Speaker 1: them do elements of their job which maybe don't come 177 00:10:20,360 --> 00:10:25,160 Speaker 1: easily to them a lot better, with more confidence, better performance, 178 00:10:25,520 --> 00:10:28,320 Speaker 1: which is going to enhance their standing with their employer, 179 00:10:28,360 --> 00:10:31,319 Speaker 1: with their boss. And the more comfortable you are 180 00:10:31,360 --> 00:10:33,240 Speaker 1: with it, the more you're going to find it's going 181 00:10:33,320 --> 00:10:38,400 Speaker 1: to free up time, especially for those urgent, unimportant things 182 00:10:38,720 --> 00:10:41,320 Speaker 1: that tend to wreck your calendar. A lot of that 183 00:10:41,760 --> 00:10:44,680 Speaker 1: kind of routine transaction AI will be very good 184 00:10:44,679 --> 00:10:48,200 Speaker 1: at, and you can escape that and focus on the 185 00:10:48,320 --> 00:10:50,319 Speaker 1: higher-value-added activities 186 00:10:49,840 --> 00:10:50,320 Speaker 5: in your role. 187 00:10:54,320 --> 00:11:07,440 Speaker 6: We'll be right back with Joseph Fuller. And we're back 188 00:11:07,480 --> 00:11:11,560 Speaker 6: with Joseph Fuller, Professor of Management Practice at Harvard Business School. 189 00:11:14,360 --> 00:11:16,680 Speaker 7: I think sort of also what we're talking about here 190 00:11:16,760 --> 00:11:20,240 Speaker 7: is for individuals to think of themselves not necessarily as, 191 00:11:20,320 --> 00:11:23,520 Speaker 7: like, you know, a teacher or, you know, an accountant, but 192 00:11:24,080 --> 00:11:26,920 Speaker 7: you have these skills and they can be transferred into 193 00:11:26,920 --> 00:11:30,560 Speaker 7: different professions.
How should people sort of view that? Because 194 00:11:30,559 --> 00:11:32,040 Speaker 7: they might say, listen, I went to school to be 195 00:11:32,080 --> 00:11:34,000 Speaker 7: a teacher, or I went to school to be an accountant, 196 00:11:34,480 --> 00:11:37,600 Speaker 7: and they may resist that. 197 00:11:37,960 --> 00:11:42,839 Speaker 1: Well, there are certain jobs, a technical writer would be 198 00:11:42,840 --> 00:11:45,400 Speaker 1: a good example, where it's going to be very hard 199 00:11:45,440 --> 00:11:49,960 Speaker 1: to imagine a future where a lot of those man-hours 200 00:11:50,000 --> 00:11:53,760 Speaker 1: are not displaced. But if you look, for example, at what 201 00:11:53,840 --> 00:11:58,959 Speaker 1: Khan Academy is doing with Khanmigo and AI for teachers, 202 00:11:59,080 --> 00:12:04,080 Speaker 1: it's doing everything from creating an environment where they're trying 203 00:12:04,080 --> 00:12:07,160 Speaker 1: to teach the student how to use AI, but prevent 204 00:12:07,240 --> 00:12:11,280 Speaker 1: the student from over-relying on AI, but also doing 205 00:12:11,320 --> 00:12:16,080 Speaker 1: diagnostics for what Jimmy or Johnny actually understands or doesn't, 206 00:12:16,559 --> 00:12:20,320 Speaker 1: but also giving feedback to the teacher: we looked at 207 00:12:20,360 --> 00:12:26,040 Speaker 1: your twenty students in American history, and they are very 208 00:12:26,080 --> 00:12:30,880 Speaker 1: confused about the Louisiana Purchase, or all over the map 209 00:12:30,960 --> 00:12:34,240 Speaker 1: on causes of the Civil War, or, frankly, none of 210 00:12:34,160 --> 00:12:35,440 Speaker 5: them can write a topic sentence. 211 00:12:36,280 --> 00:12:40,839 Speaker 1: So there's the opportunity to make people, even in white-collar 212 00:12:40,880 --> 00:12:44,640 Speaker 1: trades like that, more productive, and improve focus on areas of improvement and 213 00:12:45,280 --> 00:12:49,160 Speaker 1: user satisfaction.
I won't call students in high school customers, 214 00:12:49,200 --> 00:12:52,560 Speaker 1: but there are an awful lot of opportunities to let people do the 215 00:12:52,600 --> 00:12:55,400 Speaker 1: part of the job they really enjoy, and it's the 216 00:12:55,600 --> 00:12:59,520 Speaker 1: animating reason they pursued the profession in the first place. 217 00:13:00,000 --> 00:13:03,800 Speaker 5: We have ninety minimum-two-thousand- 218 00:13:03,400 --> 00:13:05,680 Speaker 1: word papers to grade in the next two weeks. 219 00:13:06,320 --> 00:13:07,400 Speaker 5: I love my students. 220 00:13:07,440 --> 00:13:09,120 Speaker 1: I think a lot of these will be 221 00:13:09,120 --> 00:13:13,880 Speaker 1: really good papers, but I'm not looking forward to reading, 222 00:13:14,320 --> 00:13:16,640 Speaker 1: you know, one hundred and eighty thousand words. That's the 223 00:13:16,720 --> 00:13:17,840 Speaker 1: length of Anna Karenina. 224 00:13:18,520 --> 00:13:22,000 Speaker 7: It sounds like there's so much potential for so many benefits, 225 00:13:22,480 --> 00:13:25,840 Speaker 7: but it sounds like there might be definitely growing pains 226 00:13:25,880 --> 00:13:27,720 Speaker 7: on all sides to get there, right? 227 00:13:27,800 --> 00:13:32,760 Speaker 1: Yes. And I think smart C-suites, smart boards of 228 00:13:32,800 --> 00:13:34,959 Speaker 1: directors, are going to understand this is going to be 229 00:13:34,960 --> 00:13:40,840 Speaker 1: a multi-year program. I think heads of institutions or 230 00:13:41,080 --> 00:13:45,840 Speaker 1: public servants that have service delivery roles, like school committees 231 00:13:45,920 --> 00:13:49,000 Speaker 1: and heads of school districts, are going to have to 232 00:13:49,120 --> 00:13:52,760 Speaker 1: understand that this is not going to be entirely easy.
233 00:13:53,640 --> 00:13:59,040 Speaker 1: But there's incredible potential there. Just in the education sector alone, 234 00:14:00,080 --> 00:14:03,320 Speaker 1: the opportunity to level the playing field. Let's go back to 235 00:14:03,360 --> 00:14:06,880 Speaker 1: what you mentioned about customer service reps, or I talked 236 00:14:06,920 --> 00:14:11,679 Speaker 1: about consulting analysts. The exact same thing's going to happen for learners, 237 00:14:12,080 --> 00:14:17,120 Speaker 1: for teachers, new teachers, and the ability to get big, 238 00:14:17,240 --> 00:14:22,200 Speaker 1: big improvements in productivity and job satisfaction and user satisfaction, 239 00:14:22,840 --> 00:14:27,080 Speaker 1: in kind of a win-cubed type model, it's going 240 00:14:27,160 --> 00:14:32,880 Speaker 1: to be remarkable. I'm really an optimist about AI relative 241 00:14:33,000 --> 00:14:37,400 Speaker 1: to legitimate uses. We have to fear AI in the 242 00:14:37,440 --> 00:14:39,080 Speaker 1: hands of bad actors. 243 00:14:39,560 --> 00:14:42,800 Speaker 7: What would you say, you know, from your experience, for 244 00:14:42,960 --> 00:14:45,560 Speaker 7: job seekers that are navigating the current kind of weird 245 00:14:45,960 --> 00:14:48,640 Speaker 7: labor market, I think, for most people, what would you 246 00:14:48,680 --> 00:14:52,080 Speaker 7: say your best advice to them is, navigating that landscape? 247 00:14:52,120 --> 00:14:54,640 Speaker 1: We don't use the word weird at Harvard Business School; 248 00:14:54,680 --> 00:14:58,360 Speaker 1: we use a technical term: goofy. The current labor market 249 00:14:58,600 --> 00:15:02,200 Speaker 1: is manifesting a couple of phenomena. The first is that 250 00:15:02,320 --> 00:15:05,560 Speaker 1: we are a post-industrial economy. It is an economy 251 00:15:05,600 --> 00:15:10,720 Speaker 1: that requires digital literacy, not super digital competence.
It doesn't 252 00:15:10,760 --> 00:15:13,320 Speaker 1: mean that you need to be able to code in 253 00:15:13,400 --> 00:15:17,640 Speaker 1: Python or something, or explain how a touchscreen works, but 254 00:15:17,720 --> 00:15:21,920 Speaker 1: you have to be comfortable with digital devices, and someone 255 00:15:21,960 --> 00:15:26,760 Speaker 1: who isn't needs to find a way to address that. Also, 256 00:15:27,280 --> 00:15:31,400 Speaker 1: it's very, very important to be able to demonstrate or 257 00:15:31,560 --> 00:15:35,840 Speaker 1: refine your social skills. A lot of younger people, unfortunately, 258 00:15:35,880 --> 00:15:39,000 Speaker 1: because of COVID, because of the growth of social media, their 259 00:15:39,040 --> 00:15:43,480 Speaker 1: amount of social interaction is lower than previous generations', even 260 00:15:43,520 --> 00:15:45,640 Speaker 1: their older brothers and sisters and cousins. 261 00:15:46,360 --> 00:15:48,400 Speaker 5: So how do you do those things? 262 00:15:48,400 --> 00:15:50,320 Speaker 1: Because it sounds like I'm saying, well, if you don't 263 00:15:50,320 --> 00:15:53,360 Speaker 1: have it by now... Well, first of all, you can 264 00:15:53,400 --> 00:15:56,440 Speaker 1: try to gain some experiences, and the amount of free 265 00:15:56,520 --> 00:16:01,280 Speaker 1: material that already exists, and the types of abilities that 266 00:16:01,360 --> 00:16:06,440 Speaker 1: are going to come soon to address issues like that, 267 00:16:06,480 --> 00:16:12,520 Speaker 1: are pretty remarkable. ChatGPT already has the ability to, with 268 00:16:12,680 --> 00:16:17,280 Speaker 1: a fairly brief tape of you speaking, replicate your voice 269 00:16:17,520 --> 00:16:20,480 Speaker 1: with one hundred percent fidelity.
So if you want to 270 00:16:20,600 --> 00:16:24,560 Speaker 1: hear how you sound in a job interview, you'll be 271 00:16:24,600 --> 00:16:27,560 Speaker 1: able to do that and actually have a conversation with 272 00:16:27,640 --> 00:16:31,560 Speaker 1: yourself. At Harvard Business School, we 273 00:16:31,600 --> 00:16:35,360 Speaker 1: studied, a few years ago, where do business people go, 274 00:16:36,200 --> 00:16:40,040 Speaker 1: as a first source, for an explanation of a business 275 00:16:40,040 --> 00:16:43,160 Speaker 1: concept they don't feel they understand? For many, many years 276 00:16:43,440 --> 00:16:46,320 Speaker 1: it actually was the Harvard Business Review, so we're very 277 00:16:46,320 --> 00:16:48,880 Speaker 1: proud of that. It isn't the Harvard Business Review anymore. 278 00:16:48,920 --> 00:16:49,560 Speaker 5: It's YouTube. 279 00:16:50,360 --> 00:16:52,720 Speaker 1: So use the resources on YouTube if you 280 00:16:52,720 --> 00:16:54,200 Speaker 1: want to learn about something. 281 00:16:54,000 --> 00:16:56,320 Speaker 5: They're awesome. Yeah. 282 00:16:55,480 --> 00:17:01,120 Speaker 1: Yeah. So be honest about your portfolio of skills, try 283 00:17:01,160 --> 00:17:05,000 Speaker 1: to augment them to the degree you can, and look, using 284 00:17:05,080 --> 00:17:10,600 Speaker 1: resources like LinkedIn, at what are the skills that people 285 00:17:10,600 --> 00:17:14,760 Speaker 1: that have the job you aspire to talk about. What 286 00:17:14,840 --> 00:17:16,920 Speaker 1: do they talk about in terms of what they did? 287 00:17:17,040 --> 00:17:20,520 Speaker 1: Look at their previous jobs.
Build a little bit of 288 00:17:20,600 --> 00:17:24,639 Speaker 1: a portrait of what you think your desired industry and 289 00:17:24,680 --> 00:17:28,120 Speaker 1: desired entry-level position is seeking, and then be objective 290 00:17:28,160 --> 00:17:31,520 Speaker 1: about contrasting what you've got versus what they're looking for, 291 00:17:31,840 --> 00:17:34,200 Speaker 1: and see if you can't backfill a couple of spots, 292 00:17:34,359 --> 00:17:38,359 Speaker 1: you know, attributes or skills or experiences, if you're seeing 293 00:17:38,359 --> 00:17:41,399 Speaker 1: a gap that you think is impeding you from realizing 294 00:17:41,440 --> 00:17:42,720 Speaker 1: that ambition. 295 00:17:43,280 --> 00:17:45,320 Speaker 7: That's super helpful. Thank you so much. 296 00:17:45,480 --> 00:17:46,040 Speaker 5: You bet. 297 00:17:50,040 --> 00:17:53,920 Speaker 6: That was Joseph Fuller, Professor of Management Practice at Harvard 298 00:17:53,960 --> 00:17:58,240 Speaker 6: Business School. If you're leaving today's conversation with a new 299 00:17:58,320 --> 00:18:01,199 Speaker 6: learning to apply to your job search or career, I'd like 300 00:18:01,280 --> 00:18:03,479 Speaker 6: to invite you to write about it in a review 301 00:18:03,520 --> 00:18:07,119 Speaker 6: on Apple Podcasts. Our team really enjoys reading what you 302 00:18:07,240 --> 00:18:11,000 Speaker 6: learn from our shows. Plus, it helps other people discover 303 00:18:11,119 --> 00:18:15,840 Speaker 6: our community. Speaking of community, remember that we're always here, 304 00:18:16,000 --> 00:18:18,720 Speaker 6: backing you up and cheering you on. Connect with me, 305 00:18:18,920 --> 00:18:21,919 Speaker 6: Andrew Seaman, and the Get Hired community on LinkedIn to 306 00:18:22,000 --> 00:18:26,080 Speaker 6: continue the conversation.
In fact, subscribe to my weekly newsletter 307 00:18:26,119 --> 00:18:29,160 Speaker 6: that's called, you guessed it, Get Hired, to get even 308 00:18:29,240 --> 00:18:30,440 Speaker 6: more information 309 00:18:30,119 --> 00:18:33,159 Speaker 4: delivered to you every week. You can find those links 310 00:18:33,200 --> 00:18:36,159 Speaker 4: in the show notes. And of course, don't forget to 311 00:18:36,160 --> 00:18:38,800 Speaker 4: click the follow or subscribe button to get our podcast 312 00:18:38,880 --> 00:18:42,280 Speaker 4: delivered to you every Wednesday, because we'll be continuing these 313 00:18:42,320 --> 00:18:46,040 Speaker 4: conversations on the next episode, right here, wherever you like 314 00:18:46,119 --> 00:18:49,919 Speaker 4: to listen. Get Hired is a production of LinkedIn News. 315 00:18:50,400 --> 00:18:53,800 Speaker 4: This episode was produced by Grace Rubin. Assaf Gidron engineered 316 00:18:53,840 --> 00:18:57,160 Speaker 4: our show. Joe DiGiorgi mixed our show. Dave Pond 317 00:18:57,200 --> 00:19:01,320 Speaker 4: is head of news production. Enrique Montalvo is our executive producer. 318 00:19:01,560 --> 00:19:04,479 Speaker 4: Courtney Coupe is the head of Original Programming for LinkedIn. 319 00:19:05,040 --> 00:19:07,840 Speaker 4: Dan Roth is the editor in chief of LinkedIn, and 320 00:19:07,880 --> 00:19:11,399 Speaker 4: I'm Andrew Seaman. Until next time, stay well and best 321 00:19:11,440 --> 00:19:11,800 Speaker 4: of luck.