Speaker 1: Welcome to Before Breakfast, a production of iHeartRadio. Good morning, this is Laura. Welcome to the Before Breakfast podcast. Today's episode is going to be a longer one, part of this series where I interview fascinating people about how they take their days from great to awesome and any advice they have for the rest of us. So today I am delighted to welcome Alexandra Levit to Before Breakfast. Alexandra is the founder and CEO of Inspiration at Work, which is a futurist consulting business. She is also a speaker and the author of They Don't Teach Corporate in College and the new book Make School Work. So Alexandra, welcome to the show.

Speaker 2: Thanks, Laura. So good to be back and chatting with you after a little bit of time has passed.

Speaker 1: Yeah, it's so good to reconnect with people. I'm loving that I'm getting to share these conversations with everyone. Why don't you tell our listeners a little bit about yourself?
Speaker 2: Well, these days I am a workforce futurist, and what this essentially means is that I work with organizations and governments to ascertain what has the greatest potential for disruption in the world of work, and then devise scenarios and plans for getting through those disruptions in the most productive way possible. How I became connected with the future of education was through an organization called GPS Education Partners. They are shaping the world of work by training students at a younger age to participate meaningfully in the workforce through work-based learning experiences. And you and I have probably talked about this before, but since I got into this field about twenty-five years ago, there's been a pretty persistent problem that has kept me up at night, niggled in my brain. And that's that there is a gap between the jobs that are available for people to take and the talent that wants to take those jobs.
And so throughout the course of my career, I've been on the hunt for a solution that would effectively pair talent that is available for work with the jobs that organizations need to fill. And so by pipelining students at a younger age into a specific career, or at least giving them good exposure to it, I think we start to mitigate the problem of not having the right people for the right positions.

Speaker 1: Well, what are we doing wrong with education that leads to this mismatch, this gap?

Speaker 2: Well, Laura, it goes all the way back to my first book, which I know we talked about a long time ago, They Don't Teach Corporate in College. And that more or less sums up at least part of the problem, which is that the mandates of education don't really match up to what students need to know in the real world. And this has only become more relevant in the last twenty-five years, because things are changing faster.
So when you think about an established curriculum in high school or college, it is static for a long period of time, and often by the time a student gets out of a degree program, whether it's a secondary degree or a post-secondary degree, that material has become obsolete, or at the very least needs to be substantially updated or assimilated into the reality of business today. So I think what we've still been doing is relying on education, the traditional model of education, to adequately prepare young people for the workforce. And by the way, this is any workforce. It's not just a traditional white-collar workforce; it's any kind of job. Students don't have an intuitive, natural sense of how those work because they've never had the experience before. So we are not providing students with enough concrete opportunities to experience the work world, to experience the different career paths that they might be interested in, both to determine what they do like and also what they don't like.
That's just as valuable an experience too, because if you, for example, are interested in engineering, you might, as a high school student, have absolutely no idea what that means in practicality. And if you don't have an experience like work-based learning, you may go to a four-year college, you may get a degree in engineering, and you may realize somewhere down the line, either in an internship or a post-graduation work situation, that it's nothing like what you expected and you don't like it at all. And then you've just spent four years of a very expensive education to not do something that has anything to do with that degree. So I view it as something that needs to be at least investigated at an earlier stage in a student's development.

Speaker 1: So what would that look like? I mean, is it more internships, more courses that look like work? I'm very curious what it would look like.

Speaker 2: Yeah, this is a great question, and I think we can look to other countries as an example.
Here in the US we are not as good at things like apprenticeships as many European-based organizations and governments, for example. They've got much more robust public- and private-sector partnerships over there, where it's almost expected that you do some kind of work-based learning at the high school level, and sometimes even earlier. So what we are trying to do, and I say we, it's not really me, it's organizations like GPS Education Partners, who I worked with on this book, is establish a six-part framework for if you are interested in launching a work-based learning program, which is basically what it sounds like. It's learning by working in a real-world situation, such as on a factory floor, and being able to work with other stakeholders and other constituents in your immediate purview. And obviously this is going to vary depending on where you are or what your role is. So let's say you are an employer who wants to stay in a given geographic area, but has no idea where to get talent and is struggling to get talent.
Maybe you're a mid-sized company and you don't know how to get talent to compete with the bigger companies that have name recognition. So you want to start a work-based learning program, but you don't know how to get this going. And so you might partner with what we call a convener and intermediary like GPS Education Partners, and you might partner with policymakers. You might partner with school districts, get everyone together on the same page, and just follow this blueprint for how to get a work-based learning pilot off the ground. And the most important thing about this is that it doesn't have to be done alone. If you're an employer, obviously you might not have a frame of reference for setting something like this up. But going to people who do know how to do it and do know how to get all the parts working together, I think that's a very, very critical piece of this, and so that is what we're trying to do with our book.
We're trying to literally tell people, this is what you do, step by step, if you're interested in getting this set up. And what it ends up as really does depend, again, on the individual situation. But in an ideal world, students would be spending part of their school day in a real-world employment situation, learning, for example, a new piece of manufacturing technology that's leveraging robotics to do work and production more efficiently. That's just an example. The student comes in, the student is working with employees who are a bit older or have a bit more experience, so they're learning not only these hard technical skills, but also the skills that are necessary interpersonally to assimilate into the work world, which we talk about so often. And then they are using their academic education to support what they are learning in the work environment.
So this might mean they are taking some classes on site at an education center that are related to, using our existing example, manufacturing, getting a manufacturing certification, so that their academic education does actually line up with the work-based learning experience. And what I also love about work-based learning is that by the time you get to the end of your experience, you have a much better idea of whether this is a career path that you would like to pursue, and if so, what kind of education you need in order to get ahead in that career path. And I wish, for example, that my son, who is the person I was speaking of when I was talking about the engineering example, had done something like this, so that going to a four-year college was a sensible, informed decision. Whereas right now it's kind of a crapshoot. We really have no idea what's going to happen when he gets that degree, or if he's even going to like that degree.
Whereas if you participate in work-based learning prior to going to college, you can ascertain: is a four-year college degree the next appropriate step for me? And maybe you can even have your employer, if you're going to continue working with that organization, help you financially with that degree, because the employer recognizes the value in you having that additional education. So essentially, it just means that post-secondary education makes more sense for the individual, rather than just pipelining every student into a four-year degree program, which is the model that we in the US have had for quite some time.

Speaker 1: Absolutely. Well, we're going to take a quick ad break, and then we're going to come back and talk more about how people can learn more while on the job for the jobs of the future. Well, I am back talking with Alexandra Levit, who is a futurist and also the author of They Don't Teach Corporate in College and the new book Make School Work. We've been talking about work-based learning, but you know, that's great for young people.
A lot of the people listening to this are adults, and they may have student children who would likely benefit from this. But a lot of people are concerned about, you know, "I need to keep learning as well." We keep hearing that the jobs of the future may be different than the jobs of today, or whatever the catchphrases are. So how should people who are already in the workforce now be thinking about the skills they might need in the future?

Speaker 2: One of my favorite topics. And you know, I get asked every single day about AI's impact on the workforce and its impact on skills, and what we will need to do in order to, as humans, meaningfully contribute to our organizations going forward. So I think this gets to the heart of your question, because AI is one of those things where most people in the workforce today are starting with a blank slate. Nobody really has these skills. If you do have skills, maybe they're pretty rudimentary, like you know how to type something into ChatGPT or Claude and get some kind of sensible answer.
But that is really only the starting point for leveraging, let's say, generative AI effectively. And so what I'm recommending everybody do is, first of all, have a mindset of learning agility: understanding that your learning is not over just because you're no longer in formal education, and that having a gainful employment situation for the foreseeable future means you have to be continuously revisiting what skills you have and what skills you need. And in terms of where to start with this, I always recommend that people take a look around and ask: what are the parts of my current job that are being automated, and what do I need to learn to still have a productive role in whatever process or workflow I'm involved in? So it really does involve individual employees being mini futurists. You have to kind of look around and say, okay, this is the area that is most vulnerable to automation, and what additional or adjacent skills can I develop in order to continue to participate? And I'll give you a very clean example of this.
I'm a workplace columnist for several different outlets. When I think about column writing, and I write on the workforce, of course, I think about what AI is capable of doing in that process today and what it is going to be capable of doing at some point in the near future. And from a column-writing perspective, the writing itself is actually likely to be done by AI. It's already happening. And if I just rely on that writing skill, I could be out of a job within a couple of years. So instead, I have to look at the entire process of the production of a column and say, ah, there are things like fact checking, interviewing sensitive sources, editing. Those are all things that, let's say, the Wall Street Journal, for example, is not going to allow to be done completely by AI. For some of those things, there's got to continue to be a human in the loop.
And if I want to keep writing columns, I need to think about how I can acquire the editing skill, how I can acquire the fact-checking skill, and how I can enhance the in-person or phone or video interviewing that I do with people, so that sensitive sources can continue to feel comfortable, because there are always going to be sources that are not going to be jazzed about revealing things to an AI. And I think that's just how it's probably going to be. I could be wrong, but I am staying a step ahead of the situation in which I do not have a role anymore because the writing itself is being done by AI. And everybody who is in a job, regardless of what that job is, needs to be doing that same type of thinking: what can be automated, or what's on the chopping block to be automated, and how can I acquire those additional skills? And in terms of where you acquire the skills, we are lucky in the sense that there is more skill-acquisition opportunity out there in the public domain than there ever was previously.
Speaker 1: Well, why don't you give us some examples of where people should go to look? Let's say they've identified a skill that they would like to have, and then they basically need to design their own curriculum because their company isn't doing this for them. What should they do?

Speaker 2: So I think looking online is just a great way to start. There are actually so many really prestigious universities that are offering free versions of certain classes. This is especially true in technology, but it's also relevant in other fields as well. There are so many online repositories of learning content, like Coursera and Udemy (I say "oo-DEM-ee," some other people say "YOU-de-mee"), and there are dozens of those where you can go and access different types of learning opportunities. LinkedIn Learning is another really, really good one. I have taken several courses through that platform that my colleagues have produced, to learn more about what they do and get a little bit deeper on certain topics.
You can also ask. I mean, there's the old-fashioned approach of just asking to be mentored by someone who might be in your field and might be a little bit further ahead on, let's say, that AI skill development and how it applies specifically to your workflow or business process. And that's what I recommend. Again, it doesn't matter what the level of the person is, because level doesn't necessarily correlate with knowledge. You could have young people who are really, really good at AI; you could have senior people who are really, really good at AI. It is a bit of a misconception that all young people are good at technology. I don't think that's necessarily true; it just depends on the person. And so look for that person within your purview who has a generosity of spirit and who also seems to know what they're doing, and shadow them and learn how they are prompting the technology to get it to work for them, for what they're trying to do. I have learned so much just in passing in conversations with people: "Oh my god, you're using AI to do that? Really? How do you do that?"
And then you literally just learned it on the job right there. And so...

Speaker 1: What are you using it for in your life currently?

Speaker 2: Oh, that's a really good question. So, Laura, I don't use it for any writing. And I'll tell you kind of a funny story. A few months ago, a client said to me that they thought a white paper that I had written was generated by AI. And I asked them, how did they come up with that? And they said, well, I put it into one of the AI detectors, you know, and it said it's eighty-five percent; it sounds like something that's already out there. But what the client didn't do was click on the source that the AI was generating this information from, and find that the source that the AI had been trained on was my own content.

Speaker 1: Absolutely, so it was like, you know, you and...

Speaker 2: I have been in our spaces for a while, and so we've generated a lot of content. And AI is only as smart as the content that it's trained on.
So therefore it can't necessarily be relied upon (a) to be unique in its thinking and (b) to put things together in a way that makes sense. If I look at the AI version of my content, sometimes, if I dig a little bit below the surface, I can see that it doesn't quite make sense. And this is why humans still need to be in the loop when evaluating AI content, because people who are new to an industry or new to a role might not have the institutional knowledge or skills to recognize that what they're reading is not one hundred percent accurate or relevant. So I definitely use it for things like background research. I do want to give the caveat that with research, you have to be very careful about the data you're getting, and make sure that you are still looking at the sources the AI is citing to ensure that those are reputable sources, because a lot of times AI doesn't know the difference. So this goes back to having some degree of knowledge in your field. But for summarizing and, I would say, collating a bunch of information that is in a lot of different sources,
I think AI is 332 00:19:30,440 --> 00:19:33,160 Speaker 2: really, really good for that. I think sometimes it's 333 00:19:33,160 --> 00:19:35,119 Speaker 2: a good thought starter: if you just don't even know 334 00:19:35,160 --> 00:19:38,280 Speaker 2: where to begin on a topic or on a project, 335 00:19:38,560 --> 00:19:41,280 Speaker 2: it can give you some starting ideas. It can absolutely 336 00:19:41,280 --> 00:19:46,400 Speaker 2: hold you accountable. I am now experimenting with using AI 337 00:19:46,520 --> 00:19:50,960 Speaker 2: for personal coaching, for career coaching, and I am recognizing 338 00:19:51,080 --> 00:19:54,239 Speaker 2: how sycophantic it is, which, for those who might not 339 00:19:54,359 --> 00:19:57,040 Speaker 2: have heard that term, really means that AI is 340 00:19:57,080 --> 00:20:00,760 Speaker 2: basically blowing smoke up your butt and validating you kind 341 00:20:00,760 --> 00:20:04,800 Speaker 2: of to an extreme extent. It'll say, "And it's a great point, Alexandra," 342 00:20:06,160 --> 00:20:10,960 Speaker 2: that kind of thing. Very funny. So it does that, and you 343 00:20:11,080 --> 00:20:13,480 Speaker 2: have to really be aware. And this 344 00:20:13,600 --> 00:20:16,520 Speaker 2: goes to AI skill acquisition, right? You have to understand 345 00:20:16,520 --> 00:20:19,600 Speaker 2: how to prompt it so that it's giving you a really 346 00:20:19,760 --> 00:20:23,440 Speaker 2: unbiased view of what you're working on, not just telling 347 00:20:23,440 --> 00:20:26,159 Speaker 2: you everything that you think and do is great. But 348 00:20:26,200 --> 00:20:28,919 Speaker 2: that's a skill, just like any other skill, understanding how 349 00:20:28,920 --> 00:20:31,199 Speaker 2: to prompt AI effectively. So these are the things that 350 00:20:31,240 --> 00:20:34,360 Speaker 2: I'm noodling on and working with.
I still think 351 00:20:34,359 --> 00:20:37,200 Speaker 2: it's highly useful, but I don't think it's as 352 00:20:37,280 --> 00:20:40,959 Speaker 2: good as we sometimes give it credit for. Will it 353 00:20:41,000 --> 00:20:43,920 Speaker 2: get that good? It might, but it still isn't going 354 00:20:43,960 --> 00:20:47,080 Speaker 2: to have the ability to think truly independently, and so 355 00:20:47,160 --> 00:20:51,919 Speaker 2: we have to continue as content providers to put really, 356 00:20:51,960 --> 00:20:54,879 Speaker 2: really good, strong content into the world so that the 357 00:20:54,960 --> 00:20:59,680 Speaker 2: AI in turn can be more accurate and effective in 358 00:20:59,720 --> 00:21:03,439 Speaker 2: what it's recommending. Because without humans, AI can't think on 359 00:21:03,520 --> 00:21:06,560 Speaker 2: its own, and we're probably a while from it being 360 00:21:06,600 --> 00:21:07,040 Speaker 2: able to do so. 361 00:21:07,080 --> 00:21:09,280 Speaker 1: So we're a ways from there still. All right, well, we're 362 00:21:09,280 --> 00:21:10,800 Speaker 1: going to take one more quick ad break and then 363 00:21:10,800 --> 00:21:19,879 Speaker 1: I'll be back with more from Alexandra Lovitt. Well, I 364 00:21:19,880 --> 00:21:22,760 Speaker 1: am back talking with Alexandra Lovitt, who is an expert 365 00:21:22,800 --> 00:21:26,560 Speaker 1: on all things related to the future of school 366 00:21:26,640 --> 00:21:29,240 Speaker 1: and work and how we learn these days. So I 367 00:21:29,280 --> 00:21:31,920 Speaker 1: want to pivot, though, to your own personal productivity. We've 368 00:21:31,920 --> 00:21:34,200 Speaker 1: been talking a little bit about what you use AI for, 369 00:21:34,359 --> 00:21:37,760 Speaker 1: but I'm curious about any routines you have.
We love 370 00:21:37,800 --> 00:21:41,520 Speaker 1: to hear about people's productivity routines, morning routines, anything else 371 00:21:41,520 --> 00:21:43,840 Speaker 1: you're doing that you think makes you more productive. 372 00:21:44,280 --> 00:21:47,320 Speaker 2: Well, I know this is your sweet spot, so 373 00:21:47,400 --> 00:21:49,800 Speaker 2: I would love some advice from you. But what I 374 00:21:49,920 --> 00:21:53,080 Speaker 2: do is I try to get started as early as 375 00:21:53,080 --> 00:21:56,680 Speaker 2: possible in the day on something 376 00:21:56,720 --> 00:22:01,000 Speaker 2: that is concrete, so not just checking emails and answering 377 00:22:01,040 --> 00:22:06,639 Speaker 2: messages or scrolling LinkedIn, but the project that I am 378 00:22:06,680 --> 00:22:08,919 Speaker 2: working on that day. I try to get a 379 00:22:09,000 --> 00:22:12,280 Speaker 2: start by nine o'clock in the morning, so at least 380 00:22:12,320 --> 00:22:15,280 Speaker 2: I've got some momentum for the rest of the day, 381 00:22:15,600 --> 00:22:18,600 Speaker 2: because I find that once I get going, I am 382 00:22:18,600 --> 00:22:22,960 Speaker 2: a lot more productive. If I'm putting things off, then 383 00:22:23,160 --> 00:22:25,159 Speaker 2: it gets later in the day, I 384 00:22:25,240 --> 00:22:29,440 Speaker 2: procrastinate even more, and my energy level dips. So getting started early, 385 00:22:29,480 --> 00:22:32,240 Speaker 2: getting some momentum, and then everything else that has to 386 00:22:32,240 --> 00:22:34,680 Speaker 2: be done that day on that project seems a little 387 00:22:34,680 --> 00:22:36,639 Speaker 2: bit easier. It doesn't seem like as much of a 388 00:22:36,720 --> 00:22:41,280 Speaker 2: hurdle to get past.
Also, and it's kind of a 389 00:22:41,280 --> 00:22:44,520 Speaker 2: similar point, I break down very large projects into 390 00:22:44,560 --> 00:22:50,320 Speaker 2: their component parts and try to assign a component part 391 00:22:50,440 --> 00:22:53,199 Speaker 2: per week, and I do this for every single project 392 00:22:53,240 --> 00:22:56,320 Speaker 2: that I'm working on, so that I'm not facing this 393 00:22:56,440 --> 00:22:59,000 Speaker 2: kind of amorphous goal of "I have to write a book 394 00:22:59,000 --> 00:23:01,520 Speaker 2: in the next six months." Instead, during the week of 395 00:23:01,560 --> 00:23:05,760 Speaker 2: February twenty third or twenty fourth, I am going to 396 00:23:06,320 --> 00:23:10,479 Speaker 2: research and write this particular chapter. And I've got a 397 00:23:10,520 --> 00:23:13,879 Speaker 2: workflow that goes week by week, not day by day yet, 398 00:23:13,960 --> 00:23:16,280 Speaker 2: but week by week, what needs to be accomplished. And 399 00:23:16,280 --> 00:23:19,280 Speaker 2: I find that that keeps me on track with multiple 400 00:23:19,320 --> 00:23:21,320 Speaker 2: balls that are in the air that have to be 401 00:23:21,680 --> 00:23:25,800 Speaker 2: juggled effectively. Let's see, what else do I do? I 402 00:23:25,880 --> 00:23:29,439 Speaker 2: try to collaborate whenever possible. Although a lot of my 403 00:23:29,480 --> 00:23:31,960 Speaker 2: work is solo, I find that I get a lot 404 00:23:31,960 --> 00:23:35,000 Speaker 2: of energy and enthusiasm out of working with other people, 405 00:23:35,040 --> 00:23:38,199 Speaker 2: and so wherever possible, I try to integrate some of 406 00:23:38,240 --> 00:23:40,959 Speaker 2: that, because that helps with my motivation and my momentum. 407 00:23:41,440 --> 00:23:44,280 Speaker 2: So those are just some topline things.
I mean, it's 408 00:23:44,280 --> 00:23:48,400 Speaker 2: important as a solopreneur, and a lot of your listeners 409 00:23:48,440 --> 00:23:51,600 Speaker 2: are going to be entering gig-like work over the 410 00:23:51,600 --> 00:23:53,560 Speaker 2: next couple of years, just the way the world's moving, 411 00:23:53,640 --> 00:23:56,240 Speaker 2: and I think it's harder to motivate yourself and to 412 00:23:56,320 --> 00:23:58,280 Speaker 2: keep yourself on a schedule than if you have a 413 00:23:58,320 --> 00:24:01,040 Speaker 2: boss who's looking over your shoulder or an organization that 414 00:24:01,080 --> 00:24:04,959 Speaker 2: has very concrete goals. And therefore, I think self-monitoring 415 00:24:05,000 --> 00:24:07,720 Speaker 2: of productivity, which is what you talk about, is so critically 416 00:24:07,760 --> 00:24:12,360 Speaker 2: important, because you're not going to have that external reinforcement. Yeah. 417 00:24:12,400 --> 00:24:14,639 Speaker 1: Absolutely. Well, I love the idea of doing a little 418 00:24:14,640 --> 00:24:16,560 Speaker 1: bit on a big project every week just so you 419 00:24:16,680 --> 00:24:20,359 Speaker 1: keep touching it. So, Alexandra, I always ask my guests, 420 00:24:20,400 --> 00:24:22,800 Speaker 1: what is something you have done recently to take a 421 00:24:22,880 --> 00:24:24,840 Speaker 1: day from great to awesome? 422 00:24:26,840 --> 00:24:30,760 Speaker 2: So I like the days, and I've done this a 423 00:24:30,800 --> 00:24:34,440 Speaker 2: few times now, where I have devoted an entire day 424 00:24:34,760 --> 00:24:37,959 Speaker 2: to getting to know new people in my field. So 425 00:24:38,000 --> 00:24:41,600 Speaker 2: you could call that networking. You could call that informational interviewing 426 00:24:41,720 --> 00:24:45,080 Speaker 2: or skill acquisition, depending on the person. 427 00:24:46,400 --> 00:24:49,840 Speaker 2: It could serve any number of purposes.
But my 428 00:24:50,040 --> 00:24:54,640 Speaker 2: best days are when I make an interpersonal connection. Because 429 00:24:54,680 --> 00:24:57,240 Speaker 2: I do work a lot on my own, as I said, 430 00:24:58,240 --> 00:24:59,879 Speaker 2: I find that I'm in an echo chamber of my 431 00:25:00,040 --> 00:25:02,760 Speaker 2: own ideas a lot of the time, and I just 432 00:25:02,880 --> 00:25:05,920 Speaker 2: love to talk with people about things that we mutually 433 00:25:05,960 --> 00:25:10,439 Speaker 2: find exciting and invigorating, to get a new idea in the 434 00:25:10,480 --> 00:25:13,560 Speaker 2: pipeline, a new way of thinking about things. So 435 00:25:13,880 --> 00:25:16,000 Speaker 2: I've done this now a few times. But that's how 436 00:25:16,119 --> 00:25:18,720 Speaker 2: I can take my day from just being good, 437 00:25:19,160 --> 00:25:21,640 Speaker 2: writing what I normally know, to letting me think about 438 00:25:21,640 --> 00:25:24,119 Speaker 2: this in a whole new way and also feeling like 439 00:25:24,200 --> 00:25:26,600 Speaker 2: I'm motivated just through the process of collaboration. 440 00:25:27,520 --> 00:25:29,920 Speaker 1: Yeah, it's the best. It's the best to bounce ideas 441 00:25:29,960 --> 00:25:33,879 Speaker 1: off of intelligent people. So, Alexandra, where can people find you?
442 00:25:34,240 --> 00:25:37,439 Speaker 2: Well, I'm at Alexandralevitt dot com, and if you 443 00:25:37,440 --> 00:25:40,440 Speaker 2: want to learn more about work-based learning, what 444 00:25:40,480 --> 00:25:42,320 Speaker 2: that's like, or how you might get a program off 445 00:25:42,320 --> 00:25:46,160 Speaker 2: the ground, we have make schoolwork dot org. Feel 446 00:25:46,160 --> 00:25:47,639 Speaker 2: free to get in touch and just let us know 447 00:25:47,680 --> 00:25:49,920 Speaker 2: what you think of these ideas and what you think 448 00:25:49,960 --> 00:25:52,080 Speaker 2: you might be doing with them in your community, and 449 00:25:52,119 --> 00:25:54,280 Speaker 2: we would love to hear about it and perhaps help. 450 00:25:55,200 --> 00:25:58,159 Speaker 1: Awesome. Well, Alexandra, thank you so much for joining us. 451 00:25:58,200 --> 00:26:00,960 Speaker 1: Thank you to everyone for listening. If you have feedback 452 00:26:01,000 --> 00:26:04,000 Speaker 1: about this or any other episode, you can always reach 453 00:26:04,080 --> 00:26:07,760 Speaker 1: me at Laura at Laura vandercam dot com. In the meantime, 454 00:26:08,040 --> 00:26:11,359 Speaker 1: this is Laura. Thanks for listening, and here's to making 455 00:26:11,400 --> 00:26:21,320 Speaker 1: the most of our time. Thanks for listening to Before Breakfast. 456 00:26:21,880 --> 00:26:25,640 Speaker 1: If you've got questions, ideas, or feedback, you can reach 457 00:26:25,680 --> 00:26:35,360 Speaker 1: me at Laura at Laura vandercam dot com. Before Breakfast 458 00:26:35,400 --> 00:26:39,680 Speaker 1: is a production of iHeartMedia. For more podcasts from iHeartMedia, 459 00:26:39,720 --> 00:26:43,760 Speaker 1: please visit the iHeartRadio app, Apple Podcasts, or wherever you 460 00:26:43,840 --> 00:26:45,080 Speaker 1: listen to your favorite shows.