1 00:00:01,080 --> 00:00:05,560 Speaker 1: AI can feel like the ultimate shortcut, no more blank pages, 2 00:00:05,800 --> 00:00:10,040 Speaker 1: no more stuck moments. But what happens when we hand 3 00:00:10,080 --> 00:00:14,040 Speaker 1: over too much of the thinking? That's what I explored 4 00:00:14,120 --> 00:00:18,720 Speaker 1: with Scott Anthony, business professor and author of Epic Disruptions. 5 00:00:19,239 --> 00:00:22,320 Speaker 1: Scott has been an early adopter of Gen AI, using 6 00:00:22,320 --> 00:00:25,239 Speaker 1: it as a sparring partner to test ideas, but is 7 00:00:25,280 --> 00:00:30,120 Speaker 1: also clear about the danger. When he once asked AI 8 00:00:30,240 --> 00:00:34,000 Speaker 1: to design a university course for him, the output wasn't 9 00:00:34,080 --> 00:00:37,560 Speaker 1: just bad, it left his brain out of the process, 10 00:00:37,800 --> 00:00:40,360 Speaker 1: and that is where he draws the line. By the 11 00:00:40,479 --> 00:00:42,600 Speaker 1: end of this quick win, you'll hear how to use 12 00:00:42,640 --> 00:00:47,480 Speaker 1: AI to expand your perspective without outsourcing the critical work 13 00:00:47,520 --> 00:00:57,880 Speaker 1: that only you can do. Welcome to How I Work, 14 00:00:58,120 --> 00:01:02,240 Speaker 1: a show about habits, rituals, and strategies for optimizing your day. 15 00:01:02,880 --> 00:01:10,240 Speaker 1: I'm your host, Dr Amantha Imber. I know you're obviously 16 00:01:10,240 --> 00:01:13,920 Speaker 1: an early adopter of Gen AI, and I imagine your approach 17 00:01:14,040 --> 00:01:16,679 Speaker 1: to prompting and getting the best out of it has evolved. 18 00:01:17,080 --> 00:01:20,560 Speaker 1: Tell me how you use it as a brainstorming partner.
19 00:01:20,800 --> 00:01:23,560 Speaker 2: A lot of it is context engineering, you know, making 20 00:01:23,600 --> 00:01:26,440 Speaker 2: sure that you put in place the right parameters so 21 00:01:26,600 --> 00:01:30,319 Speaker 2: that you are getting a legitimately different perspective on things. 22 00:01:30,680 --> 00:01:34,119 Speaker 2: So whether that's taking on a different role, or it's saying, 23 00:01:34,200 --> 00:01:37,040 Speaker 2: let's be a skeptic or let's be a supporter or whatever, 24 00:01:37,480 --> 00:01:40,600 Speaker 2: trying to make sure that you set the context 25 00:01:40,600 --> 00:01:42,520 Speaker 2: where you're going to get something that is what you're 26 00:01:42,560 --> 00:01:45,600 Speaker 2: looking for, which might be something that's as novel as 27 00:01:45,600 --> 00:01:48,680 Speaker 2: possible or something that's as fine-tuned as possible, whatever 28 00:01:48,720 --> 00:01:52,280 Speaker 2: it is. The second is multiple tools. My favorites at 29 00:01:52,320 --> 00:01:55,440 Speaker 2: the moment are Claude and ChatGPT. So I'll play 30 00:01:55,440 --> 00:01:59,040 Speaker 2: them off each other. I'll be simultaneously brainstorming a topic. 31 00:01:59,240 --> 00:02:00,760 Speaker 2: I'll be like, oh, the other one said this, 32 00:02:00,880 --> 00:02:03,080 Speaker 2: what do you think about that? And at some point 33 00:02:03,120 --> 00:02:06,080 Speaker 2: they converge because all the foundational stuff will get you 34 00:02:06,120 --> 00:02:08,920 Speaker 2: to the same place, but they actually will start sometimes 35 00:02:08,919 --> 00:02:11,120 Speaker 2: in very different places. And of course we know if 36 00:02:11,160 --> 00:02:14,560 Speaker 2: we use these systems, sometimes the exact same prompt will 37 00:02:14,600 --> 00:02:17,720 Speaker 2: go in different directions, just depending on whatever is the 38 00:02:17,720 --> 00:02:20,760 Speaker 2: magic that's happening underneath the hood.
But those two things 39 00:02:20,760 --> 00:02:23,320 Speaker 2: are really trying to say, let's push for the personas 40 00:02:23,320 --> 00:02:26,040 Speaker 2: and get the context right, and then use different models. 41 00:02:26,080 --> 00:02:27,560 Speaker 2: Those are the two things that I've found to be 42 00:02:27,600 --> 00:02:28,359 Speaker 2: the most helpful. 43 00:02:28,520 --> 00:02:32,000 Speaker 1: I will quite often, when I'm working on quite a 44 00:02:32,320 --> 00:02:35,079 Speaker 1: big project where I want to get different views, 45 00:02:35,120 --> 00:02:39,440 Speaker 1: have the same prompt in ChatGPT, Claude 46 00:02:39,480 --> 00:02:42,240 Speaker 1: and Gemini, and then finish with, ask me 47 00:02:42,320 --> 00:02:44,360 Speaker 1: any questions so you can be ninety-five percent confident 48 00:02:44,320 --> 00:02:47,600 Speaker 1: in doing a great job. And it's quite interesting the 49 00:02:47,639 --> 00:02:50,280 Speaker 1: different questions that I'll get back when it's a task 50 00:02:50,360 --> 00:02:54,560 Speaker 1: that could go in several different directions. So I do 51 00:02:54,639 --> 00:02:57,440 Speaker 1: love that as a way, and kind of sparring them 52 00:02:57,440 --> 00:03:00,600 Speaker 1: against each other I've found really useful. One of the 53 00:03:00,639 --> 00:03:03,240 Speaker 1: things that I wonder is, I think a lot of 54 00:03:03,280 --> 00:03:06,600 Speaker 1: people, particularly knowledge workers whose value 55 00:03:06,680 --> 00:03:11,480 Speaker 1: is in their thinking, use AI a lot. And I wonder, firstly, 56 00:03:11,520 --> 00:03:14,440 Speaker 1: what's going to happen to our ability to think critically, 57 00:03:15,000 --> 00:03:17,760 Speaker 1: and then what's going to happen to our ability to 58 00:03:18,720 --> 00:03:21,840 Speaker 1: maintain good judgment?
And I want to know for you, 59 00:03:22,639 --> 00:03:25,240 Speaker 1: with thinking critically, how do you think about that 60 00:03:25,600 --> 00:03:28,240 Speaker 1: for yourself? And do you worry, am I going to 61 00:03:28,280 --> 00:03:31,080 Speaker 1: lose the ability to think critically if I kind of 62 00:03:31,120 --> 00:03:33,640 Speaker 1: cross that line where I've suddenly outsourced just a little 63 00:03:33,680 --> 00:03:34,720 Speaker 1: bit too much to AI? 64 00:03:34,960 --> 00:03:37,160 Speaker 2: Yeah, it is something that I worry about a lot. 65 00:03:37,240 --> 00:03:40,120 Speaker 2: I worry even more about it for my students and even 66 00:03:40,120 --> 00:03:43,080 Speaker 2: more for my children. So you've got three 67 00:03:43,120 --> 00:03:46,280 Speaker 2: generations there: you've got me, born nineteen seventy-five; my kids, 68 00:03:46,280 --> 00:03:49,480 Speaker 2: who were born between two thousand and five and twenty sixteen; 69 00:03:49,960 --> 00:03:52,840 Speaker 2: and then the students, who generally are about thirty years old, 70 00:03:52,880 --> 00:03:55,160 Speaker 2: so the average date of birth would be about, whatever 71 00:03:55,200 --> 00:03:58,240 Speaker 2: that is, nineteen ninety-five. It really is about being very 72 00:03:58,240 --> 00:04:00,680 Speaker 2: clear about the lines. If I'm trying to do something that 73 00:04:00,800 --> 00:04:05,160 Speaker 2: does represent original thinking, I just can't outsource that. I 74 00:04:05,200 --> 00:04:08,400 Speaker 2: can't ask AI for a first draft of something. I tried, 75 00:04:08,880 --> 00:04:10,720 Speaker 2: you know, when it first came out. Of course, I'm 76 00:04:10,760 --> 00:04:12,280 Speaker 2: going to play around with it. So I'm like, okay, 77 00:04:12,280 --> 00:04:14,480 Speaker 2: I've got an idea for a new class, please 78 00:04:14,520 --> 00:04:17,560 Speaker 2: design a curriculum for me.
And, A, the output was 79 00:04:17,600 --> 00:04:21,000 Speaker 2: not particularly good, and, B, my brain wasn't in it. 80 00:04:21,200 --> 00:04:23,039 Speaker 2: I hadn't thought about it, I hadn't processed it, I 81 00:04:23,040 --> 00:04:25,640 Speaker 2: hadn't struggled, et cetera. So now when I'm working on 82 00:04:25,680 --> 00:04:28,640 Speaker 2: a new course, certainly I'll say, okay, this bit of 83 00:04:28,680 --> 00:04:31,200 Speaker 2: it, I need to think about some additional readings, or 84 00:04:31,279 --> 00:04:33,320 Speaker 2: I want to think about some of the principles I 85 00:04:33,360 --> 00:04:36,240 Speaker 2: want to teach here, so I'll get help to sharpen. 86 00:04:36,839 --> 00:04:40,039 Speaker 2: But ownership of the integration has to be by 87 00:04:40,080 --> 00:04:42,600 Speaker 2: me, or else that's just a skill that's going to lapse. 88 00:04:43,000 --> 00:04:45,440 Speaker 2: So that, to me, is the most important thing: drawing 89 00:04:45,480 --> 00:04:48,080 Speaker 2: the line and being really clear about where you draw 90 00:04:48,120 --> 00:04:51,240 Speaker 2: that line. There are other tasks that are less critical 91 00:04:51,279 --> 00:04:54,240 Speaker 2: where I'm very happy to outsource more. So, as an example, 92 00:04:54,760 --> 00:04:57,559 Speaker 2: I was facilitating a panel discussion a couple of months ago 93 00:04:57,960 --> 00:05:00,120 Speaker 2: and that was pretty much churn and burn. You know, 94 00:05:00,240 --> 00:05:03,719 Speaker 2: here are my panelists. Make me smarter about who they are, 95 00:05:03,839 --> 00:05:05,200 Speaker 2: give me a few good questions. 96 00:05:05,240 --> 00:05:05,520 Speaker 1: Task. 97 00:05:05,800 --> 00:05:08,159 Speaker 2: Okay, that's great for material. I can then turn that 98 00:05:08,200 --> 00:05:10,800 Speaker 2: into a finished product. But I was totally fine with 99 00:05:10,880 --> 00:05:13,520 Speaker 2: the first draft being done by AI.
But when you really 100 00:05:13,560 --> 00:05:15,800 Speaker 2: need to turn on the critical thinking and reasoning, first 101 00:05:15,880 --> 00:05:18,640 Speaker 2: draft by AI, I think the research shows, is really, 102 00:05:18,640 --> 00:05:19,480 Speaker 2: really dangerous. 103 00:05:19,920 --> 00:05:23,000 Speaker 1: What I took from Scott is that the line matters. 104 00:05:23,600 --> 00:05:26,520 Speaker 1: Use AI to brainstorm, to challenge, to widen the lens, 105 00:05:26,720 --> 00:05:30,360 Speaker 1: but keep ownership of the hard thinking, because once you 106 00:05:30,480 --> 00:05:34,279 Speaker 1: stop struggling with the blank page, you risk losing the 107 00:05:34,440 --> 00:05:38,200 Speaker 1: very skill that makes your ideas valuable. So next time 108 00:05:38,240 --> 00:05:43,120 Speaker 1: you're tempted to let AI draft that strategy, just pause. Instead, 109 00:05:43,480 --> 00:05:46,479 Speaker 1: ask it to play the skeptic, or the supporter, or 110 00:05:46,480 --> 00:05:51,120 Speaker 1: the competitor. Use its responses to sharpen your own. And 111 00:05:51,200 --> 00:05:53,680 Speaker 1: if you're keen to hear the rest of my conversation 112 00:05:53,920 --> 00:05:56,080 Speaker 1: with Scott, you can find a link to that in 113 00:05:56,120 --> 00:05:58,760 Speaker 1: the show notes. If you like today's show, make sure 114 00:05:58,839 --> 00:06:02,039 Speaker 1: you follow on your podcast app to be alerted when 115 00:06:02,120 --> 00:06:05,080 Speaker 1: new episodes drop. How I Work was recorded on the 116 00:06:05,080 --> 00:06:08,240 Speaker 1: traditional land of the Wurundjeri people, part of the Kulin Nation.