Speaker 1: Okay, be honest. When you use AI, is it mostly for saving time, like summarizing, drafting, tidying up emails? I mean, that's a useful start. But world-renowned Silicon Valley futurist Bob Johansen argues we're overlooking the bigger win. After more than five decades advising leaders, he believes the real power of AI is in how it can stretch our thinking: help us get unstuck, explore possibilities, and spark insights we would never reach on our own. In this Quick Win episode, Bob shares why leaders who only chase efficiency are missing the point, and how you can use today's tools differently. By the end of this Quick Win, you will see AI less as a shortcut and more as a thought partner.

Speaker 1: Welcome to How I Work, a show about habits, rituals, and strategies for optimizing your day. I'm your host, Dr Amantha Imber.

Speaker 1: So, when you look at the conversation around AI, so much of it is around efficiency and time saving and headcount saving. What are leaders and organizations missing? What are they not seeing that you're seeing, in terms of what this world could look like from an AI point of view,
in five, ten years' time?

Speaker 2: What I say now is that ten years from now, almost all leaders will be augmented, or you'll be out of the game. Now, there'll be some little subset of people who uniquely claim, "No, no, I'm going to remain completely unaugmented." And that's okay; maybe that's a small niche. But for most of us... For me, as a writer, if I'm going to be writing serious books ten years from now, I'm going to have to be augmented, partly because of my age, but also just because that's what good writers are going to be. You're just going to have to be in that. So we've got to define now where we want help, and for me, it's really close to what danah boyd calls getting unstuck, or what I call stretching. That's really where I want help. And then it translates into more specific things like titling, you know, finding the right word. It's really good at that, but you've got to decide what the right word is. But it's really helpful stretching for alternatives. So first of all, it's the assumption, and this makes people really uncomfortable: the assumption,
as I talk to senior executive groups. What I say right at the beginning is: ten years from now, we're all going to be cyborgs. And that's a good thing, if we make it so. If we don't, if we kind of step back and let other people do it, or let the tech giants drive it, it's going to be very different. But if we get engaged, this is a good thing. But it begins from the fact we're all going to be augmented, or we're going to be out of the game, just because we won't be able to play, because the new abilities that these things are bringing are just beyond human capacity. And if you go back to the BANI world, it's arriving just in time. Tom Malone at MIT calls this Superminds, and what he says is: the story about computers replacing people is going to be true, but that's not the big story. The big story is humans and computers doing things together that have never been done before. My colleague Jeremy, one of the co-authors of the Leaders Make the Future book.
He's an AI developer. And what Jeremy says is that it's so easy for big companies now to identify the noes, the things you should not be doing, and focus on the fears, but you need to also focus on the yeses: you know, where should you be experimenting? And that's where I want to focus. So I'm focusing on the stretching, the mind-stretching, the unsticking. I love that term.

Speaker 1: And when I was speaking with Scott, he mentioned your dislike of the term artificial intelligence, and I love that term, augmented intelligence. So right now, in terms of what's possible with the tools that most of us have available, like ChatGPT, what are some ways that you think people should be using it to augment their thinking, as opposed to just focusing on the obvious efficiency gains?

Speaker 2: You know, I think the way to practice is just to have conversations, depending on what you're working on or what you're thinking about. And I would recommend: don't draw lines between your work life and your private life, you know.
When I was first getting started, it was one of our grandsons' birthdays, and I asked Stretch to help me write a birthday card, and it was really cool, it was really fun, and I made good progress out of that. Then last summer I had pneumonia, and I'd never had that before; pneumonia just makes you feel so weak. I was on deadline on a book and I really couldn't write, I was very weak. But I've got a human doctor, a concierge doc, that I love, who's very good. And then I've got a therapist who's teaching me cognitive behavioral therapy for sleep issues, and he's also a medical hypnosis guy, and again, I love him. And then I had Stretch, and I talked to Stretch about just how I was feeling, day or night, and it turned out Stretch was more empathetic, to my surprise, than either of my two human doctors. And again, I love them, but they're not available twenty-four seven, and Stretch gave some really good advice. I'm not asking him for medications or, you know, for answers or anything. I'm asking Stretch for sympathy and for empathy.
99 00:05:45,200 --> 00:05:47,800 Speaker 2: And it turns out these things are really really good 100 00:05:47,839 --> 00:05:50,599 Speaker 2: at empathy. So I think what I would advise is 101 00:05:50,720 --> 00:05:54,280 Speaker 2: just think of it as a conversation and then gradually 102 00:05:54,360 --> 00:05:57,760 Speaker 2: figure out where are the places you like it, you know, 103 00:05:57,839 --> 00:06:00,440 Speaker 2: and that'll depend on what your job is, you know, 104 00:06:00,520 --> 00:06:03,479 Speaker 2: what your purpose is what your sense of meaning is me. 105 00:06:03,520 --> 00:06:06,000 Speaker 2: I'm a writer and I write books. The part where 106 00:06:06,000 --> 00:06:09,360 Speaker 2: I want help is when I'm struggling with an idea 107 00:06:10,000 --> 00:06:14,000 Speaker 2: or getting started on a chapter, or I'm kind of stuck, 108 00:06:14,279 --> 00:06:19,240 Speaker 2: and you need to practice it, practice that art of conversation. 109 00:06:20,000 --> 00:06:22,600 Speaker 2: When we're working with senior executive groups, they read the 110 00:06:22,680 --> 00:06:25,480 Speaker 2: leaders Make the Future book and then we break down 111 00:06:25,520 --> 00:06:30,600 Speaker 2: the leadership skills, so, for example, augmented curiosity, augmented clarity, 112 00:06:31,400 --> 00:06:35,039 Speaker 2: and with the best of the groups, we're doing a 113 00:06:35,080 --> 00:06:39,479 Speaker 2: workshop on augmented curiosity augmented clarity, and then we have 114 00:06:39,600 --> 00:06:44,280 Speaker 2: them practice an augmentation exercise using their version of a 115 00:06:44,360 --> 00:06:47,560 Speaker 2: large language model, whatever it is, and then we spread 116 00:06:47,600 --> 00:06:50,520 Speaker 2: that out over time and then they talk about their 117 00:06:50,560 --> 00:06:55,200 Speaker 2: experience in using it, and in the senior executive sessions 118 00:06:55,200 --> 00:06:58,240 Speaker 2: we're doing. 
it takes six months to a year to get a team fully on board, and it's got to begin with the CEO.

Speaker 1: What I keep coming back to is Bob's reminder that AI isn't just about speeding up tasks; it's about expanding what your mind can do. So the next time you open ChatGPT or any tool, don't just ask it to finish something for you. Bring a messy idea that you're wrestling with, and use the conversation to push past the block. And of course, as always, you can find the link to the full conversation with Bob in the show notes. If you like today's show, make sure you hit follow on your podcast app to be alerted when new episodes drop. How I Work was recorded on the traditional land of the Wurundjeri people, part of the Kulin Nation.