Speaker 1: What happens to our ability to think critically when we start outsourcing it to AI all the time? My guest today, Professor Scott Anthony, is one of the world's leading experts on innovation and disruption, and he believes that how we use AI today could shape the future of our judgment, our creativity, and even our careers. Scott is a professor at the Tuck School of Business. He's also the former managing partner at Innosight, the firm co-founded by the late Clayton Christensen, and the author of several books, including his latest one, Epic Disruptions. In this conversation, he shares how AI is transforming the way he teaches, consults and thinks, and why he draws a very firm line on where he will never let AI do the work. By the end of this episode, you'll know exactly how to leverage AI as a teammate without letting it hollow out your critical thinking skills, and why protecting your judgment is more important than ever. So if you've ever wondered, am I thinking less deeply than I used to? This episode might just change the way you work. Welcome to How I Work, a show about habits, rituals and strategies for optimizing your day. I'm your host, Doctor Amantha Imber.

Speaker 1: I want to start by talking about change, because obviously a lot of your work in the last couple of decades has involved massive change and helping organizations navigate that, and I dare say in both our careers, AI is the biggest disruptor that we have both seen. Is that fair to say? Would you say the Internet was larger, Scott, or are we going with AI?

Speaker 2: Yeah, time of course will tell on this, but I suspect time will ultimately judge AI as the biggest disruptor in our lifetime. And who knows what will come after this, but at least of what we have seen thus far, just the pervasive impact of it, really changing the way that knowledge is created and transmitted and processed. That has so much ubiquity, it's hard for that not to have cascading disruptive effects.
Speaker 1: So I want to know, then, what are the biggest changes that it's made to the way you work as a business professor?

Speaker 2: I am in the maybe somewhat advantaged position compared to some of my colleagues. I've only been teaching for three years, so there's not that much I have to unlearn, as opposed to people who might be my age but have been teaching for longer than three years. But whatever. But really, you're seeing a change in just about everything: changing the way that you're doing research, changing the way you're putting together courses, changing the way that you're running your classes, changing the way that you're assessing students. And we can go deeper in each of these areas, but I think the headline is AI is, to a degree, changing everything that's happening in universities, which really isn't a surprise. I mean, what is a university? It's the creation and dissemination of knowledge, of words. And what is AI? Generative AI is very good at putting together words in unique ways. So those two things come together in a kind of predictable sort of way.

Speaker 1: I do want to dig into learning, and we will go there, and also teaching as well. But first, one of the things, when we last spoke, and that was not a recorded conversation, one of the things that intrigued me most is your latest book, Epic Disruptions. Your contract with Harvard, who's your publisher, had what I thought was quite a unique AI clause. Can you tell me about what they stipulated for your book?

Speaker 2: Absolutely, and in preparation for this, I got the actual language from it, which I hope doesn't get me in any trouble with Harvard, but I'll explain the why behind it, which I think will help. The exact clause is: authors shall not generate or use content for the work in collaboration with or by artificial intelligence tools/applications without the prior written consent of the publisher.
So I called it the "thou shalt not use AI" clause, again, without coordinating with the publisher. And when you first see that, you're like, oh, that sounds kind of draconian. But what's behind that? Well, it's still unclear, if work is generated by AI, what that means for copyright. And if you're a publisher and you care about copyright, that's a really big deal. So if you're reporting something to be original work but it's actually plagiarized or not original work, that could be a real problem. So of course there are ways that you can and should and do use AI during the writing process, but the idea here is don't outsource it, and if you do, it's not "please tell us", it's you must tell us.

Speaker 1: I found that interesting because I'm working on my fifth book at the moment with Penguin Random House, and there was no AI clause with them, which, after having that discussion with you, made me think, hmm, maybe it's in there now, since I signed contracts. Maybe they're making all their authors do that. But I was intrigued. And so then in your book writing process, how did you use AI, if at all?

Speaker 2: So, the research and writing of the chapters, and just a brief overview of the book: Epic Disruptions, eleven innovations that shaped our modern world. This is a history book. It starts with gunpowder, it goes up to the Apple iPhone, with then a conclusion looking at disruptions that are in process today. But there's a lot of research. So the actual research of the chapters and the writing of the chapters I did independently. What I used AI for was to brainstorm: what are really the epic disruptions that should make the short list? I had some thoughts I parried with OpenAI's ChatGPT to come up with some additional thoughts, kicked around ideas and so on; titling, what's the right way to title; and trying to pull out what are some of the themes that I'm seeing. Is it three? Is it five?
Speaker 2: Those kinds of things that are not in the writing, but really in some of the things that wrap around it, I found AI to be incredibly helpful. Not always perfect; at the end, you know, you have to own it yourself. But if you're trying to expand your thinking, it's a really useful tool.

Speaker 1: I know you're obviously an early adopter of gen AI, and I imagine your approach to prompting and getting the best out of it has evolved. Tell me how you use it as a brainstorming partner.

Speaker 2: A lot of it is context engineering, you know, making sure that you put in place the right parameters so that you are getting a legitimately different perspective on things. So whether that's taking on a different role, or it's saying let's be a skeptic or let's be a supporter or whatever, trying to make sure that you set the context where you're going to get something that is what you're looking for, which might be something that's as novel as possible or something that's as fine-tuned as possible, whatever it is. The second is multiple tools. My favorites at the moment are Claude and ChatGPT, so I'll play them off each other. I'll be simultaneously brainstorming a topic, and I'll be like, oh, the other one said this, what do you think about that? And at some point they converge, because all the foundational stuff will get you to the same place, but they actually will start sometimes in very different places. And of course we know, if we use these systems, sometimes the exact same prompts will go in different directions, just depending on whatever is the magic that's happening underneath the hood. But those two things: really pushing for the personas and getting the context right, and then using different models. Those are the two things that I found to be the most helpful.
Speaker 1: I will quite often, when I'm working on quite a big project where I want to get different views, have the same prompt in ChatGPT, Claude and Gemini, and then finish with: ask me, you know, any questions so you can be ninety five percent confident in doing a great job. And it's quite interesting the different questions that I'll get back when it's a task that could go in several different directions. So I do love that as a way, and kind of sparring them against each other I've found really useful. One of the things that I wonder, as I think a lot of people, and particularly knowledge workers, you know, whose value is in their thinking, use AI a lot, is, firstly, what's going to happen to our ability to think critically? And then what's going to happen to our ability to maintain good judgment? And I want to know, for you, with the thinking critically, how do you think about that for yourself? And do you worry, am I going to lose the ability to think critically if I kind of cross that line where I've suddenly outsourced just a little bit too much to AI?

Speaker 2: Yeah, it is something that I worry about a lot. I worry even more about it for my students, and even more for my children. So you've got three generations there: you've got me, nineteen seventy five; my kids, who were born between two thousand and five and twenty sixteen; and then the students, who generally are about thirty years old, so the average year of birth would be about, whatever that is, nineteen ninety five. It really is about being very clear about the lines. If I'm trying to do something that does represent original thinking, I just can't outsource that. I can't ask AI for a first draft of something. I tried, you know, when it first came out. Of course I'm going to play around with it. So I'm like, okay, I've got an idea for a new class, please put together a curriculum for me.
And, A, the output was not particularly good, and, B, my brain wasn't in it. I hadn't thought about it, I hadn't processed it, I hadn't struggled, et cetera. So now, when I'm working on a new course, certainly I'll say, okay, for this bit of it I need to think about some additional readings, or I want to think about some of the principles I want to teach here, so I'll get help to sharpen. But ownership of the integration, it has to be by me, or else that's just a skill that's going to lapse. So that, to me, is the most important thing: drawing the line and being really clear about where you draw that line. There are other tasks that are less critical where I'm very happy to outsource more. So as an example, I was facilitating a panel discussion a couple of months ago, and that was pretty much churn and burn. You know, here are my panelists, make me smarter about who they are, give me a few good questions, that kind of task. Okay, that's great raw material. I can then turn that into a finished product. But I was totally fine with the first draft being done by AI. But when you really need to turn on the critical thinking and reasoning, first draft by AI, I think the research shows, is really, really dangerous.

Speaker 1: It's interesting because, it would have been about a year ago now, I had an executive from Google on the show, and they were talking about how their default was to use AI, or Gemini, for the first draft of everything, as a place to start. And I kind of get that, because the blank page in any kind of work can be a little bit confronting. No one, particularly writers like us, likes a blank page. But then how do you decide? Because I imagine sometimes it is a fine line. The example that you've given is quite stark.
But are there examples where you're kind of on the fence about whether to use AI to help do the thinking versus you sit and struggle with it and do the hard, unpleasant deep work yourself?

Speaker 2: So, you know, last term I had sixty five students in one of my classes, and there's qualitative feedback that comes from the sixty five students, and the easy thing to do is to dump it all into ChatGPT and say, tell me the three themes that you see here. And I did that, except I said, I want the three themes that you see, I want the biggest outlier that you see that is negative and that is positive, and I want something random that you can throw back at me. And then I went and did the same thing over at Claude. So again I tried to have it go through multiple models, so I didn't fall into the easy trap of saying, okay, this is set to be warm, so it's going to tell me the very nice things my students had to say, and I'll just be all done and move on. So I tried to make sure that I balanced the big efficiency gains that I got from not going through every word of everything while also being provoked. And then I was so interested in the feedback, I did in the end go back and read all of it. So it didn't end up saving any time at all, but it was fun to look at it from various angles. So, you know, that is the thing: again, back to the conversation we had about the different ways that you can do context engineering and set perspectives. If you do put some things into your prompts that will force you to critically reason, so that what you're getting back is not just the answer, but is an answer with outliers and, as you said, questions for consideration, those I think are great things that can help make sure we don't lose that. It is, again, I think, probably the most important thing we're going to have to struggle with over the next decade.
And I worry, in a parallel to what we saw in the book, that many of the early adopters of AI are going to say, what have we done? They'll say, at first, this is great, we've got all these efficiency gains, and then X number of years in the future they will have hollowed out their ability to actually do things, and they'll say, we got something wrong when we did this: short term gain but long term pain.

Speaker 1: Yeah, I really do worry about people's ability to think critically. I mean, as humans, we're programmed to go down the path of least resistance, and that is to outsource our thinking to AI. And I worry, like, I feel it in myself, that it takes willpower to go, okay, I need to sit here and I need to sit in discomfort, and I need to do the thinking. And I do choose to do that because integrity is a value of mine. But I think it's, you know, it's also, like, I can see how for a lot of people who are known for their thinking and the quality of their thinking, it's really easy to go down that path. And, you know, I kind of look at what LinkedIn has become, where, you know, thinkers that I used to follow, I'm kind of like, that is not you. You have not generated that. Why am I even on LinkedIn? How do you feel now about that as a platform? Like, you're very active on it, but what is your view of what's going on there and how you're approaching it yourself, as someone who is, you know, a great thinker?

Speaker 2: Well, first of all, thanks for calling me a great thinker; some days are better than others. But, you know, a couple of years ago I was participating in an offsite with a bunch of senior leaders from a large organization, and they said, we want some stimuli, and we want stimuli not from you, but from younger voices.
So I asked my son Charlie, who was seventeen years old at the time, to come in and share a perspective about AI, and he said, look, everyone in my school is using it all the time, and the thing I really worry about is that we're going to end up like those really obese humans in the movie WALL-E that have completely lost their capacity to think about anything. Another metaphor we were talking about is, you know, back in the day when we were in high school, you could go to the bathroom and you could buy the answers to the test. You know, someone at the stall next to you would have the answer key or whatever. And people didn't do that because it was just so blatantly, obviously cheating. But now, when you go to a large language model and you ask for a first draft of something, it doesn't feel like cheating. So, you know, it just feels different. And you talked about acting with integrity, but you could also argue that people are using the best tools available to be able to get the best answers they can. So, you know, back to what we are thinking about on campus: how do you actually go and design educational experiences that encourage people to learn, that help them sit with the discomfort you talked about, teach them to be comfortable in that discomfort, and then go and assess them in a way that says, yes, use the best tools that you can, but make sure we're also developing critically as human beings. It's an incredibly hard topic.

Speaker 1: That's a really good segue into your work in the business school, the Tuck School of Business, one of the top business schools in America. Like, how has your approach to teaching, what I'm sure are incredibly smart minds, but who do have access to AI and who theoretically can outsource their thinking, how do you design a course and then assess these students in this new world that we're in?

Speaker 2: With great care. That's the short answer.
I have some advantage in that the courses that I teach are generally not "find and apply a mathematical formula" or "use an off-the-shelf framework", but more around critical thinking, really, and how do you deal with uncertainty, and how do you go and act when the data is telling you not to, and hear the voice that doesn't speak. And that stuff isn't regurgitation of frameworks that you find in books. It's going and puzzling through things together, very discussion-oriented courses and so on. So along some dimensions, I think it is easier in that context to do what I do, which is tell students, please use AI to go through the case material or to come up with first answers to questions. But we're going to be in a room, we're going to put ourselves in the shoes of a participant or a case protagonist, we're going to talk about it, and the laptops are going to be closed and the cell phones are going to be off. So, you know, whatever you've done to prepare, that's fine, but fifty percent of your grade then is the quality of the discussion that we have in class. So it's pushing on that in-class experience and discussion. That's point number one. Point number two, I'm thinking, and this is still emergent, about how you assess learning in different ways. So I started a couple of terms ago giving people an option for the final assignment. The old final assignment was: take everything we learned in class, write a paper about some company, some leader, apply learning from class. The new option is: take everything we learned in class, find a leader, give me two hundred words about why you picked this leader, then design an AI tool to help them learn something that you learned in class. And the most critical thing I learned about that is the "and show me your work" part. So the output, I kind of care about. What I really care about is the exact prompts you used, the back and forth you had with the large language models, the critical thinking you displayed, blah blah blah blah blah, because I learned from the first run that sometimes you get these very, very elegant outputs that had a lot of mess that went into them. I'm like, this looks like you didn't do much work, but you actually did a ton. And then sometimes you see these things where the output looked super sophisticated, but the students did basically no work and had no understanding. So if you don't see the work, you don't understand that. And then finally, and it's a sample size of one, but I ran an experiment last term where I did a thirty minute interview with a student at the end of the term and said, we're going to try this out, just to see what it's like: pick a situation, let's talk. So, you know, just trying to figure out different ways to assess whether people really have learned, in some cases leaning into AI, in some cases going about as old school as you'll go. And this is an example of something that every educator is thinking about right now.

Speaker 1: Now, I do want to talk about consulting and that context. You worked in the world of management consulting for, I want to say, about twenty years, Scott?

Speaker 2: Nailed it.

Speaker 1: Yeah, fantastic. And during that time you were managing partner of Innosight, which is the firm that the late Clayton Christensen started, and that is how we connected when you were there, you know, several years ago. So I want to understand, like, if you were still working in consulting, how do you think you would be using AI for client work?

Speaker 2: I want to give you a historical example first, then I'll come and answer it. So, you know, fourteen forty, Gutenberg and team come up with the printing press. We know the basic story, the world changes. The thing that I just want to reflect on is: who is Gutenberg's first actual customer? Well,
Well, 396 00:19:25,119 --> 00:19:26,800 Speaker 2: it's the Catholic Church and it makes a lot of 397 00:19:26,880 --> 00:19:28,600 Speaker 2: sense dominant organization. 398 00:19:28,760 --> 00:19:29,359 Speaker 3: It's time. 399 00:19:29,600 --> 00:19:31,679 Speaker 2: Not that many Bibles could be created if you have 400 00:19:31,720 --> 00:19:34,560 Speaker 2: to hand inscribe them. The printing press makes it much 401 00:19:34,680 --> 00:19:37,399 Speaker 2: easier for the Catholic Church to disseminate the Bible that 402 00:19:37,440 --> 00:19:40,439 Speaker 2: looks good in the beginning. Martin Luther picks up the 403 00:19:40,520 --> 00:19:43,920 Speaker 2: very same technology and the Reformation happens, and the Catholic 404 00:19:43,960 --> 00:19:46,920 Speaker 2: Church might ultimately regret the fact that it helped the 405 00:19:47,000 --> 00:19:50,440 Speaker 2: fund and innovation that yes, helped u spread the Bible, 406 00:19:50,640 --> 00:19:53,800 Speaker 2: but also kind of tore the church apart. And I 407 00:19:53,840 --> 00:19:56,000 Speaker 2: tell this story because I think you're going to see 408 00:19:56,000 --> 00:19:58,680 Speaker 2: a very similar story for the major consulting companies and 409 00:19:58,760 --> 00:20:01,840 Speaker 2: artificial intelligence. Right now, it's a great time for a 410 00:20:01,880 --> 00:20:03,720 Speaker 2: lot of the consultancies. But I think you're going to 411 00:20:03,720 --> 00:20:06,840 Speaker 2: see two things. I think one, clients are going to say, 412 00:20:07,119 --> 00:20:09,880 Speaker 2: we actually don't need a consultancy. There's a lot more 413 00:20:09,920 --> 00:20:12,920 Speaker 2: that we're now capable of doing ourselves. A number two, 414 00:20:13,320 --> 00:20:16,240 Speaker 2: we have hollowed out the consultancies will say a lot 415 00:20:16,240 --> 00:20:19,200 Speaker 2: of our capabilities. So we've got a generation of leaders 416 00:20:19,680 --> 00:20:22,320 Speaker 2: that can go and provide eye to eye advice to 417 00:20:22,320 --> 00:20:24,520 Speaker 2: top senior leaders and have great judgment and so on. 418 00:20:25,040 --> 00:20:27,480 Speaker 2: But the efficiency gains that we got mean that we 419 00:20:27,520 --> 00:20:30,520 Speaker 2: don't have another generation that comes after it. I think 420 00:20:30,560 --> 00:20:32,480 Speaker 2: we're going to see both those things happen over the 421 00:20:32,480 --> 00:20:33,960 Speaker 2: course of the next five to ten years. 422 00:20:34,040 --> 00:20:36,400 Speaker 3: If I were in the field now, I. 423 00:20:36,320 --> 00:20:38,280 Speaker 2: Would be doing all the things that I think smart 424 00:20:38,280 --> 00:20:41,000 Speaker 2: people are doing, which is show up to client meetings, 425 00:20:41,040 --> 00:20:44,920 Speaker 2: being even smarter, have teams work even faster, have multiple 426 00:20:44,920 --> 00:20:48,159 Speaker 2: possibilities being considered in ways that you couldn't do before, 427 00:20:48,680 --> 00:20:51,320 Speaker 2: really live up to the full possibility what you can do. 428 00:20:51,880 --> 00:20:54,240 Speaker 2: I was at a small company, but now a small 429 00:20:54,280 --> 00:20:56,480 Speaker 2: company can look like a big company because you have 430 00:20:56,480 --> 00:20:58,959 Speaker 2: all these tools you can take advantage of. That's if 431 00:20:59,000 --> 00:21:01,320 Speaker 2: I was still at that company. 
The thing I also would be thinking really hard about is: is this the right model? You know, Innosight, like many players in consultancy, had a traditional pyramid model, and people are beginning to say, well, maybe it's a diamond, where you've got, you know, a few people. Maybe it's something that is an inverted pyramid, where actually you have more people at the top who are the good-judgment people that can advise senior clients, and they don't need to have the people who are doing all the work at the bottom. I don't know the answer to that, but I certainly would be thinking really hard about the new models that are going to be emerging.

Speaker 1: If you were still working with junior consultants, how would you be teaching them to think critically?

Speaker 2: AI essentially is your new teammate, your new intern, whatever metaphor you want to use. It's got amazing superpowers. Sometimes it does some really weird things, like make stuff up or dramatically underperform on things you think it's going to be good at. The way you figure these things out is by working very closely with it, never ever outsourcing to it, and recognizing life is better in teams, and research shows life is actually sociologically better with AI, as long as you don't do too much with it and outsource too much to it. But make sure that you are pushing and using it to augment versus replace, because that will be a very short-sighted and destructive path. I mentioned Bob Johansen before. In his book Leaders Make the Future, the third edition, he said, you know, one of the most important things that a futurist has to do is be very precise on language, because the right language will pull you somewhere, the wrong language will push you away from it. And he writes that in his fifty years doing this,
artificial intelligence is the worst name for a new technology he has ever seen, because artificial makes it sound like it's a replacement, and what you really want is augmented, where it is something that allows you to stretch and push and develop further. And that is the thing I would really be pushing among the young talent at the organization: don't stop thinking. Once you stop thinking, it's kind of over.

Speaker 1: If you're finding this conversation with Scott Anthony as helpful as I am, stay with us, because in the second half Scott reveals why he has started thinking about his life in five year cycles, and also his advice for improving your ability to think critically in the age of AI.

Speaker 1: If you're looking for more tips to improve the way you work and live, I write a short weekly newsletter that contains tactics I've discovered that have helped me personally. You can sign up for that at Amantha dot com. That's Amantha dot com.

Speaker 1: How would you be approaching recruitment? Because I feel like I'm seeing a lot of companies really struggle with this, particularly in those early stages where you're not having those in-depth interviews face to face. Because, I mean, even in virtual interviews these days, you know, I hear all sorts of stories about an AI going on in the background and putting out answers to the questions that people are being asked. What would you do to test the ability of someone to think in this day and age?

Speaker 2: What I would wish for is a way to be in a physical location with someone and have a conversation with them. I think just the ability to actually have a conversation with somebody, and not ask how many taxi cabs there are in New York City or whatever, but just see the way that they approach issues. The approach I followed when I was interviewing at Innosight is I would give people: here are ten disruptions that are in process right now, or ten companies that have been called disruptive.
Pick any one of them and come up with a point of view: are they actually disruptive? And if you were caught in a lift with the CEO, what would you say to them? And I did that for two reasons. One, I wanted to see what they did with it, and then two, once we're in a room together, I can push them, and they've had the benefit of being able to think about it for some period of time. And we'd have fun, you know. Sometimes people would go and produce a PowerPoint deck. I didn't ask them to do that, but sometimes people would. Sometimes I'd get a Microsoft Word memo. Sometimes people would just talk. But it was a really good way just to get into a conversation where you could see how someone thinks. I don't know a better way to do it, you know. I mean, you could say psychometric tests or whatever, perhaps, but, you know, as we know, anything can be gamed these days. It's not easy.

Speaker 1: We talked about critical thinking. I want to talk about good judgment. How do you help your students develop the ability to have good judgment? I feel like, and maybe this is a flawed assumption, but the longer you do something, the more experience you develop, and generally your judgment becomes better and better, because it's the collection of all the different experiences that you've had. But how do you think about the ability to have good judgment and to improve that ability?

Speaker 2: I would like to think that experience makes you better at having judgment, but I think at least some research and data would suggest you get some real blind spots, as you're used to defaulting to certain things, and then the situation changes.
And with that, what I really try to do is essentially give students heuristics and say, what you're going to do, or what you're going to get, are rules of thumb, mental models, ways to think through things that essentially can help you. So it's not that you as an individual have judgment; it's that you've got a tool that you can use, and what you have to figure out is what is the right tool to use in a given context. And this is the thing that I learned from Clay Christensen. You know, he used to joke that his wife Christine would say the famous disruptive-line diagram that's in all of his stuff was etched onto his glasses. Everywhere he looked, he saw disruption. I wear contact lenses, so the models are like fused onto my eyes, so, you know, everywhere I look, I see it. And that's sometimes maybe a tiny bit of a curse, but what it really is is a blessing. It allows you to see things that you wouldn't otherwise see. And whenever someone asked Clay, what's your opinion or your viewpoint about X or Y, Clay would say, I've got no viewpoints or opinions; the model has a viewpoint. And he was such a good social scientist that when that didn't happen, he'd say, oh great, we found an anomaly, something that didn't fit. Therefore the model must be wrong in some way. Let's upgrade it. The vision of our school is to develop wise, decisive leaders who better the world through business. A wise leader, in my view, is one who has a whole range of tools that they can use and knows when to use them and how to use them.

Speaker 1: I love that. And I guess the follow-on question from that: when you yourself are looking at creating a new model or a framework with which to view the world, what role does AI play in helping augment your thinking, or does it not play a role there?
Speaker 2: I have an idea that's been bouncing around the back of my head for a few years now, and it is: what does it really take to be a leader who can successfully navigate today's world of radical uncertainty? I've branded the idea adaptive capacity. And the idea basically is you need three things. You need metacognition, which means you think about how you think and have lots of models at your disposal. You need equanimity, the ability to pause so you don't get overwhelmed with all these models that you have in your head. And a paradox mindset that allows you to take perceived tensions and turn them into possibilities. As I've been advancing my thinking about this, AI, number one, has been a research partner. So, using some of the emerging research capabilities to say, that idea about there not being anything like this, is that true? What else do you see out there? What research do you see, blah blah blah blah blah. So, a research partner. Number two, an experiment designer. So I have been trying to think about ways to actually go and test this, validate it, et cetera. So this will be the sort of thing where I will idly go and brainstorm: I was thinking about this, what do you think, et cetera, et cetera. I have a thought about doing synthetic research. I've not done that, but I thought it would be really cool if you were to actually use AI to create a simulation to go and test hypotheses. That is beyond my current capabilities, but something that is on the list of things to continue to push. Borrowing from Slow Productivity by Cal Newport, I work on a five year timeline for some of these things. So I want to crack this in the next, now, four years, since I set that a year ago. So I'm okay with not having synthetic research yet, but that's an area that I'm watching to see if it's something that you actually can do with any materiality.

Speaker 1: Tell me more about this five year cycle. I'm curious.
606 00:29:23,080 --> 00:29:24,280 Speaker 1: I did not know that about you. 607 00:29:24,760 --> 00:29:27,760 Speaker 2: So Cal Newport, who I know you're a fan of. 608 00:29:28,160 --> 00:29:30,720 Speaker 2: His book Slow Productivity was one of the best 609 00:29:30,720 --> 00:29:33,240 Speaker 2: books I read a couple of years ago, and he 610 00:29:33,320 --> 00:29:35,720 Speaker 2: had this idea, you know, slow down, slow down. Don't 611 00:29:35,720 --> 00:29:38,960 Speaker 2: think necessarily about what you're doing over the next week, month, 612 00:29:39,040 --> 00:29:42,240 Speaker 2: or quarter, but really think about what are your aspirations 613 00:29:42,280 --> 00:29:44,480 Speaker 2: over a five year time period. As he said, when 614 00:29:44,520 --> 00:29:47,880 Speaker 2: you relax the timeline, you're able to just handle things 615 00:29:47,920 --> 00:29:50,400 Speaker 2: moving at different paces. So when I did this a 616 00:29:50,520 --> 00:29:53,840 Speaker 2: year ago, I said there were three things that I would like 617 00:29:53,880 --> 00:29:56,840 Speaker 2: to do professionally over the next five years. One, I 618 00:29:56,880 --> 00:30:00,440 Speaker 2: want to crack this adaptive capacity idea. Two, I want 619 00:30:00,480 --> 00:30:02,280 Speaker 2: to do something I've never done before, which is have 620 00:30:02,400 --> 00:30:05,560 Speaker 2: my name on a peer reviewed academic article, not the 621 00:30:05,600 --> 00:30:08,440 Speaker 2: lead author, but I can be the tenth author. That's fine. 622 00:30:08,800 --> 00:30:11,320 Speaker 2: And number three, I'd really like to win a teaching award, 623 00:30:11,480 --> 00:30:14,640 Speaker 2: just to validate and demonstrate that I've improved my ability 624 00:30:14,680 --> 00:30:15,600 Speaker 2: to teach in class. 625 00:30:15,840 --> 00:30:18,400 Speaker 3: So those are the three things. I'm saying this on, 626 00:30:19,200 --> 00:30:20,840 Speaker 3: am I allowed to say the date? It is 627 00:30:21,160 --> 00:30:22,720 Speaker 3: July fourteen, twenty 628 00:30:22,400 --> 00:30:24,360 Speaker 2: twenty five, so we can time capsule and check in 629 00:30:24,400 --> 00:30:26,240 Speaker 2: on this four years from now. 630 00:30:26,400 --> 00:30:29,440 Speaker 1: Oh wow, I love that. What are some other habits 631 00:30:29,920 --> 00:30:32,400 Speaker 1: or rituals or things that you've been playing around 632 00:30:32,440 --> 00:30:35,360 Speaker 1: with around how you do work? Like, I love the 633 00:30:35,360 --> 00:30:37,240 Speaker 1: five year thing, and I always feel like you're someone 634 00:30:37,240 --> 00:30:42,400 Speaker 1: who's experimenting with different ways of working and augmenting your productivity. 635 00:30:42,600 --> 00:30:45,560 Speaker 3: I'll call this one Can I Watch. And Can I Watch 636 00:30:45,800 --> 00:30:45,920 Speaker 1: is? 637 00:30:46,320 --> 00:30:49,320 Speaker 2: I started teaching three years ago, and the school that 638 00:30:49,520 --> 00:30:51,960 Speaker 2: I teach at is blessed with a number of really, 639 00:30:52,000 --> 00:30:55,560 Speaker 2: really world class teachers.
So I've made a regular habit 640 00:30:55,640 --> 00:30:58,280 Speaker 2: whenever I'm in a teaching cycle, which tends to be 641 00:30:58,400 --> 00:31:01,479 Speaker 2: about a five week cycle, I try to go and 642 00:31:01,520 --> 00:31:04,840 Speaker 2: observe at least one and hopefully two other faculty, usually 643 00:31:04,920 --> 00:31:07,560 Speaker 2: ones that my students recommend, you know, just who's really 644 00:31:07,640 --> 00:31:10,280 Speaker 2: great at running a class. And in those Can I 645 00:31:10,320 --> 00:31:12,880 Speaker 2: Watch moments, I don't care at all about the content. 646 00:31:12,920 --> 00:31:14,840 Speaker 2: I'm not going there to learn, you know, what this 647 00:31:14,920 --> 00:31:16,480 Speaker 2: teacher is teaching about this topic. 648 00:31:17,040 --> 00:31:19,960 Speaker 3: I'm just absorbing the way that they're running 649 00:31:19,680 --> 00:31:22,680 Speaker 2: the room, what the students are doing, who's locked in, 650 00:31:22,720 --> 00:31:26,120 Speaker 2: who's not locked in, how they're using their physicality, blah 651 00:31:26,160 --> 00:31:27,840 Speaker 2: blah blah blah blah, and just trying to get a 652 00:31:27,880 --> 00:31:31,480 Speaker 2: sense as to what it takes to really, really command 653 00:31:31,480 --> 00:31:35,000 Speaker 2: a room of sixty plus very smart MBA students who 654 00:31:35,480 --> 00:31:37,720 Speaker 2: again are blessed with a lot of really great teachers. 655 00:31:38,120 --> 00:31:40,680 Speaker 2: And I'm sure some of the things they're teaching are 656 00:31:40,680 --> 00:31:43,640 Speaker 2: getting into my unconscious as well. But that idea of 657 00:31:43,720 --> 00:31:48,600 Speaker 2: being very purposeful in getting stimuli that's different from normal stimuli, 658 00:31:48,680 --> 00:31:49,920 Speaker 2: that has been something that has 659 00:31:49,840 --> 00:31:51,760 Speaker 3: been one of the more rewarding 660 00:31:51,440 --> 00:31:53,720 Speaker 2: routines I've had in the last few years. And I've 661 00:31:53,720 --> 00:31:55,400 Speaker 2: made some new friends from doing it too, which is 662 00:31:55,440 --> 00:31:56,680 Speaker 2: always nice. 663 00:31:57,080 --> 00:31:58,760 Speaker 1: I love that. Tell me, what are a couple of 664 00:31:58,800 --> 00:32:02,240 Speaker 1: things that you've learned from your Can I Watch sessions? 665 00:32:02,720 --> 00:32:06,320 Speaker 2: One is Joseph Gerakos, who teaches in the accounting unit 666 00:32:06,400 --> 00:32:08,840 Speaker 2: at Tuck. I sat in one of his sessions, and 667 00:32:08,880 --> 00:32:11,280 Speaker 2: you know, I thought, and Joseph, if you're watching this, 668 00:32:11,360 --> 00:32:13,280 Speaker 2: I hope you take it in good spirit, I thought 669 00:32:13,280 --> 00:32:14,920 Speaker 2: it was going to be the most boring session ever. 670 00:32:14,960 --> 00:32:16,640 Speaker 2: I mean, he'd won a teaching award, but I'm like, 671 00:32:16,640 --> 00:32:17,280 Speaker 2: it's accounting, 672 00:32:17,360 --> 00:32:18,600 Speaker 3: you know. Even though my 673 00:32:18,560 --> 00:32:21,520 Speaker 2: grandfather was an accounting teacher, he was not known for 674 00:32:21,600 --> 00:32:24,560 Speaker 2: being dynamic. But the thing that Joseph really showed me 675 00:32:24,920 --> 00:32:29,400 Speaker 2: is you can find real moments of inspiration by letting 676 00:32:29,440 --> 00:32:32,920 Speaker 2: students really go and dwell on seemingly small things.
And 677 00:32:33,040 --> 00:32:35,720 Speaker 2: what he was really great at doing is opening his 678 00:32:35,840 --> 00:32:40,000 Speaker 2: session with what seemed like a really simple question, but 679 00:32:40,160 --> 00:32:43,360 Speaker 2: one that just expanded into an hour of really good 680 00:32:43,440 --> 00:32:46,080 Speaker 2: discussion where he also could teach about how working capital 681 00:32:46,080 --> 00:32:48,719 Speaker 2: works and all that. So the thing I learned from 682 00:32:48,840 --> 00:32:52,760 Speaker 2: Joseph is the power of a very, very well thought 683 00:32:52,800 --> 00:32:56,960 Speaker 2: through opening question when you're in a good class discussion. 684 00:32:57,480 --> 00:32:59,760 Speaker 2: Then I sat in a class that Paul Argenti, who 685 00:32:59,760 --> 00:33:03,240 Speaker 2: teaches in communication and a couple other fields at Tuck, 686 00:33:03,680 --> 00:33:07,280 Speaker 2: taught a couple months ago, and I noticed he just had 687 00:33:07,280 --> 00:33:11,640 Speaker 2: this theme where he was talking about corporate activism and 688 00:33:11,760 --> 00:33:14,600 Speaker 2: he used pretty evocative language that I won't repeat to 689 00:33:14,720 --> 00:33:17,600 Speaker 2: not get him in trouble, but he just really got 690 00:33:17,600 --> 00:33:20,080 Speaker 2: the room. It's like all the devices went down, 691 00:33:20,120 --> 00:33:22,480 Speaker 2: everyone was on him, and I asked, how rehearsed was that? 692 00:33:22,560 --> 00:33:24,880 Speaker 2: He said, I've been doing this for thirty years. It's 693 00:33:24,920 --> 00:33:28,160 Speaker 2: just like, not quite a scripted performance, but you know, 694 00:33:28,280 --> 00:33:30,680 Speaker 2: just really a way to pull people in. So this 695 00:33:30,840 --> 00:33:33,400 Speaker 2: idea that you are, to some degree, for better or worse, 696 00:33:33,400 --> 00:33:35,800 Speaker 2: when you're teaching a class, you're a performer, and you 697 00:33:35,920 --> 00:33:37,840 Speaker 2: want to think about how you're going to use what 698 00:33:37,960 --> 00:33:40,400 Speaker 2: great performers do, is the thing I picked up from him. 699 00:33:40,880 --> 00:33:44,680 Speaker 2: So great questions, and thinking about, of course, teaching, but 700 00:33:44,800 --> 00:33:47,160 Speaker 2: with a little bit of performance mixed in. Those are a 701 00:33:47,200 --> 00:33:48,320 Speaker 2: couple of things I've picked up. 702 00:33:48,400 --> 00:33:50,880 Speaker 1: I love that. Well, Scott, it's always 703 00:33:50,920 --> 00:33:54,200 Speaker 1: just such a joy to talk, whether it's on mic 704 00:33:54,440 --> 00:33:57,120 Speaker 1: or off mic. I always feel like I learn so 705 00:33:57,280 --> 00:33:59,800 Speaker 1: many new things, and I just love your way of thinking. 706 00:34:00,480 --> 00:34:03,880 Speaker 1: Thank you so much for extending your workday on the 707 00:34:03,920 --> 00:34:05,920 Speaker 1: East Coast to have this chat with 708 00:34:05,800 --> 00:34:06,520 Speaker 3: me. Amantha, 709 00:34:06,560 --> 00:34:08,640 Speaker 2: it is always a pleasure, and thank you for starting 710 00:34:08,640 --> 00:34:11,080 Speaker 2: your workday early in Melbourne to have this chat with 711 00:34:11,120 --> 00:34:11,799 Speaker 2: me. Very much 712 00:34:11,800 --> 00:34:13,000 Speaker 3: appreciate it. As always. 713 00:34:14,719 --> 00:34:18,520 Speaker 1: I hope you enjoyed this chat with Scott Anthony.
For me, 714 00:34:18,760 --> 00:34:22,480 Speaker 1: the biggest takeaway is this: AI is an incredible partner, 715 00:34:22,640 --> 00:34:26,920 Speaker 1: but it should never replace the discomfort of thinking for yourself. 716 00:34:27,120 --> 00:34:29,160 Speaker 1: And if this episode gave you a new way to 717 00:34:29,200 --> 00:34:31,160 Speaker 1: think about your own work, please share it with a 718 00:34:31,239 --> 00:34:33,960 Speaker 1: friend or a colleague, and make sure to follow How 719 00:34:34,000 --> 00:34:36,799 Speaker 1: I Work so you don't miss future conversations like this. 720 00:34:37,520 --> 00:34:39,640 Speaker 1: Thank you so much for listening, and here's to working 721 00:34:39,719 --> 00:34:44,480 Speaker 1: smarter with our brains fully switched on. If you liked 722 00:34:44,560 --> 00:34:47,800 Speaker 1: today's show, make sure you hit follow on your podcast 723 00:34:47,840 --> 00:34:51,239 Speaker 1: app to be alerted when new episodes drop. How I 724 00:34:51,320 --> 00:34:54,480 Speaker 1: Work was recorded on the traditional land of the Wurundjeri people, 725 00:34:54,600 --> 00:34:55,640 Speaker 1: part of the Kulin Nation.