1 00:00:00,080 --> 00:00:02,240 Speaker 1: Ten years from now, we're all going to be cyborgs, 2 00:00:02,520 --> 00:00:04,840 Speaker 1: and that's a good thing, if we make it so. 3 00:00:05,240 --> 00:00:09,120 Speaker 1: Ten years from now, almost all leaders will be augmented, 4 00:00:09,560 --> 00:00:10,680 Speaker 1: or you'll be out of the game. 5 00:00:11,440 --> 00:00:14,760 Speaker 2: If you can master one skill right now, it will 6 00:00:14,800 --> 00:00:18,920 Speaker 2: future-proof your career, and that skill is learning how 7 00:00:18,960 --> 00:00:23,279 Speaker 2: to work with AI to think better, faster, and more 8 00:00:23,360 --> 00:00:28,400 Speaker 2: creatively than you ever could on your own. My guest 9 00:00:28,400 --> 00:00:33,280 Speaker 2: today, futurist Bob Johansen, says that the real opportunity with 10 00:00:33,360 --> 00:00:36,560 Speaker 2: AI isn't in shaving minutes off your to-do list. 11 00:00:37,080 --> 00:00:38,240 Speaker 3: It's in using 12 00:00:38,040 --> 00:00:42,200 Speaker 2: AI to get unstuck, unlock new ideas, and make smarter 13 00:00:42,320 --> 00:00:47,559 Speaker 2: decisions in a very unpredictable world. Bob Johansen is a 14 00:00:47,640 --> 00:00:51,760 Speaker 2: Distinguished Fellow with the Institute for the Future in Silicon Valley, 15 00:00:52,080 --> 00:00:55,480 Speaker 2: and for more than fifty years, Bob has helped companies 16 00:00:55,560 --> 00:00:59,400 Speaker 2: around the world prepare for and shape the future. He's 17 00:00:59,440 --> 00:01:03,480 Speaker 2: written fifteen books, and his latest one, Navigating the Age 18 00:01:03,560 --> 00:01:07,240 Speaker 2: of Chaos, is out on October twenty-eight. And by 19 00:01:07,280 --> 00:01:10,319 Speaker 2: the end of this conversation you will know how to 20 00:01:10,360 --> 00:01:14,480 Speaker 2: prepare yourself for an AI-augmented future, how to use 21 00:01:14,520 --> 00:01:18,080 Speaker 2: it to get unstuck, improve your thinking, and build clarity 22 00:01:18,319 --> 00:01:29,560 Speaker 2: in an uncertain world. Welcome to How I Work, a 23 00:01:29,640 --> 00:01:32,800 Speaker 2: show about habits, rituals, and strategies 24 00:01:32,240 --> 00:01:36,360 Speaker 4: for optimizing your day. I'm your host, doctor Amantha Imber. 25 00:01:40,880 --> 00:01:40,959 Speaker 1: So. 26 00:01:41,000 --> 00:01:45,240 Speaker 2: I want to start, Bob, with asking: you're a futurist, 27 00:01:45,440 --> 00:01:47,160 Speaker 2: but what does that mean? 28 00:01:47,319 --> 00:01:50,800 Speaker 3: What are you doing day to day? Predict the future? 29 00:01:51,240 --> 00:01:53,760 Speaker 1: Well, I guess to begin, we don't use the word predict. 30 00:01:53,880 --> 00:01:57,640 Speaker 1: You know, we're humble futurists, and what we argue is 31 00:01:57,720 --> 00:02:01,600 Speaker 1: that nobody can predict the future, and if somebody tells 32 00:02:01,640 --> 00:02:04,760 Speaker 1: you they can predict the future, you shouldn't believe them, 33 00:02:05,280 --> 00:02:11,799 Speaker 1: especially, especially if they're from California. And we're from California. 34 00:02:11,919 --> 00:02:14,800 Speaker 1: So we're the longest-running futures think tank in the 35 00:02:14,800 --> 00:02:18,280 Speaker 1: world now, and we're an independent nonprofit; we started in 36 00:02:18,360 --> 00:02:22,560 Speaker 1: nineteen sixty-eight. But what being a futurist means is 37 00:02:22,600 --> 00:02:27,360 Speaker 1: that we look at the world future back. We're not normal people.
38 00:02:27,880 --> 00:02:30,600 Speaker 1: Normal people are kind of immersed in the present and 39 00:02:30,639 --> 00:02:34,560 Speaker 1: they think about the future present forward, and indeed that's 40 00:02:34,600 --> 00:02:37,760 Speaker 1: the way we have to live. So it's something we 41 00:02:37,800 --> 00:02:41,640 Speaker 1: all have to do at some level. But futurists, unusually, 42 00:02:42,520 --> 00:02:47,120 Speaker 1: think future back, so we always are placing ourselves at 43 00:02:47,240 --> 00:02:51,760 Speaker 1: least ten years out in the future, and we backcast that, 44 00:02:51,960 --> 00:02:56,120 Speaker 1: so we think backwards. You're an organizational psychologist, you know 45 00:02:56,200 --> 00:02:58,000 Speaker 1: this may not be a great way for a lot 46 00:02:58,000 --> 00:03:01,240 Speaker 1: of people to live because it's kind of the opposite 47 00:03:01,280 --> 00:03:04,680 Speaker 1: of be here now. It's sort of the opposite of mindfulness, 48 00:03:04,680 --> 00:03:07,920 Speaker 1: although we do practice mindfulness in our own way, but 49 00:03:08,080 --> 00:03:11,840 Speaker 1: future back is just a fresh perspective. So we look 50 00:03:11,880 --> 00:03:16,360 Speaker 1: future back for clarity, but then we think at least 51 00:03:16,400 --> 00:03:21,440 Speaker 1: fifty years backwards for patterns. So a futurist, the way 52 00:03:21,440 --> 00:03:25,520 Speaker 1: we practice it, thinks in sixty-year swaths of time. 53 00:03:25,919 --> 00:03:29,560 Speaker 2: That is mind-blowing. I find it hard to think 54 00:03:29,720 --> 00:03:32,440 Speaker 2: a year in advance. Can you tell me, like if 55 00:03:32,480 --> 00:03:36,559 Speaker 2: you were giving me advice on how to become a futurist? 56 00:03:36,960 --> 00:03:39,920 Speaker 2: And I know you've written extensively about future back thinking, 57 00:03:39,960 --> 00:03:43,320 Speaker 2: and our mutual friend Scott Anthony, I've also heard him 58 00:03:43,360 --> 00:03:46,480 Speaker 2: talk about it a lot. How do you place yourself 59 00:03:46,760 --> 00:03:49,240 Speaker 2: ten years into the future? I wouldn't even know where 60 00:03:49,280 --> 00:03:49,920 Speaker 2: to start. 61 00:03:50,960 --> 00:03:54,200 Speaker 1: Well, you know, it's not as hard as it seems. Actually, 62 00:03:54,400 --> 00:03:56,600 Speaker 1: I work out of Silicon Valley. I kind of grew 63 00:03:56,640 --> 00:04:00,160 Speaker 1: up in Silicon Valley as a researcher, and everybody 64 00:04:00,240 --> 00:04:03,280 Speaker 1: immediately says, when they hear you're a futurist, how can 65 00:04:03,320 --> 00:04:05,440 Speaker 1: you do that? I can't even think one or two 66 00:04:05,480 --> 00:04:08,320 Speaker 1: years ahead? How can you think ten years ahead? But 67 00:04:08,560 --> 00:04:12,400 Speaker 1: the reality of it is it's actually easier. It's actually 68 00:04:12,440 --> 00:04:15,080 Speaker 1: easier to think ten years ahead than it is one 69 00:04:15,160 --> 00:04:17,919 Speaker 1: or two years ahead. But you look for those things 70 00:04:18,240 --> 00:04:21,600 Speaker 1: that you can say with clarity. So, for example, in 71 00:04:21,680 --> 00:04:26,880 Speaker 1: Silicon Valley, now everybody's interested in sensors, and it's pretty 72 00:04:26,920 --> 00:04:30,240 Speaker 1: obvious ten years from now, we're going to have sensors everywhere.
73 00:04:30,440 --> 00:04:32,920 Speaker 1: They're going to be very cheap, many of them will 74 00:04:32,960 --> 00:04:36,280 Speaker 1: be interconnected, and some of them will be in our bodies. 75 00:04:36,520 --> 00:04:39,599 Speaker 1: You know, that's just obvious. So you start from those 76 00:04:39,640 --> 00:04:43,080 Speaker 1: things that are clear. And by clear, 77 00:04:43,120 --> 00:04:46,560 Speaker 1: I don't mean predicting where they're going. I mean clarity 78 00:04:46,600 --> 00:04:50,360 Speaker 1: of direction. So with sensors, it's pretty obvious where things 79 00:04:50,360 --> 00:04:52,279 Speaker 1: are going. We're going to talk about AI down the 80 00:04:52,320 --> 00:04:55,320 Speaker 1: road in this conversation. The fact that we're all going 81 00:04:55,360 --> 00:04:58,400 Speaker 1: to be augmented, that's pretty clear ten years from now. 82 00:04:58,440 --> 00:05:01,360 Speaker 1: So you start with what you can be clear 83 00:05:01,400 --> 00:05:06,440 Speaker 1: about, and then with a combination of strength and humility, 84 00:05:06,839 --> 00:05:08,960 Speaker 1: you come back and say, well, what does that mean 85 00:05:09,000 --> 00:05:11,040 Speaker 1: in the present? And then we look for what we 86 00:05:11,120 --> 00:05:14,919 Speaker 1: call signals, which are indicators of the future that are 87 00:05:15,000 --> 00:05:20,320 Speaker 1: already here but not evenly distributed. And it's the famous 88 00:05:20,400 --> 00:05:23,560 Speaker 1: William Gibson line, the future is already here, it's just 89 00:05:23,680 --> 00:05:28,520 Speaker 1: unevenly distributed. Well, we're very influenced by that, and we 90 00:05:28,680 --> 00:05:31,760 Speaker 1: look for those signals and we track them globally. That 91 00:05:31,960 --> 00:05:35,880 Speaker 1: signal tracking is really important to bring the forecast to life. 92 00:05:35,920 --> 00:05:39,440 Speaker 1: But it's a combination of that future back and then 93 00:05:39,960 --> 00:05:43,760 Speaker 1: the signal tracking. And it's different from trend watching. This 94 00:05:43,960 --> 00:05:47,040 Speaker 1: is not cool surfing. It's not looking for the next fad. 95 00:05:47,560 --> 00:05:51,120 Speaker 1: That's interesting, but different. A trend is a pattern of 96 00:05:51,200 --> 00:05:56,479 Speaker 1: change you can extrapolate from with confidence. A disruption is 97 00:05:56,560 --> 00:05:59,520 Speaker 1: a break in the pattern of change. To us, trends 98 00:05:59,560 --> 00:06:02,440 Speaker 1: are the easy part; the hard part is the disruption. 99 00:06:02,760 --> 00:06:06,320 Speaker 2: So if we stay on the sensor example, I think, also, 100 00:06:06,560 --> 00:06:09,160 Speaker 2: what are some examples that listeners would know about when 101 00:06:09,160 --> 00:06:12,040 Speaker 2: they think of a sensor? Like, what are some things 102 00:06:12,080 --> 00:06:13,839 Speaker 2: that we know now? 103 00:06:13,720 --> 00:06:16,480 Speaker 1: If you think of a car, I've got a little 104 00:06:16,520 --> 00:06:19,880 Speaker 1: sensor that if I go slightly off the road, it 105 00:06:20,080 --> 00:06:23,279 Speaker 1: senses that and brings me back in. Or if you're 106 00:06:23,360 --> 00:06:26,120 Speaker 1: parking the car, there's sensors everywhere on a modern car 107 00:06:26,200 --> 00:06:28,800 Speaker 1: now that sense, well, where are you in relation to 108 00:06:28,839 --> 00:06:31,760 Speaker 1: the curb?
Or if you back up, there's a camera 109 00:06:31,839 --> 00:06:34,600 Speaker 1: back there that has sensors that look for things all 110 00:06:34,640 --> 00:06:39,279 Speaker 1: the time. So a sensor is trying to feel what's 111 00:06:39,360 --> 00:06:43,240 Speaker 1: going on and then register that. The sensors in our 112 00:06:43,320 --> 00:06:47,920 Speaker 1: bodies can track our heart rates, our breath rates, they 113 00:06:47,920 --> 00:06:51,680 Speaker 1: can track how many steps we take. Sensors 114 00:06:51,680 --> 00:06:55,800 Speaker 1: are already ubiquitous, but not nearly where they are going 115 00:06:55,839 --> 00:06:58,160 Speaker 1: to be ten years from now. I should say that 116 00:06:58,200 --> 00:07:00,599 Speaker 1: this trend towards sensors is going to go on a 117 00:07:00,640 --> 00:07:04,440 Speaker 1: long time. It's actually taken longer than we thought for 118 00:07:04,520 --> 00:07:07,600 Speaker 1: them to scale the way that they've scaled because of 119 00:07:07,680 --> 00:07:10,480 Speaker 1: the issue of cost. We've got to get the issue 120 00:07:10,520 --> 00:07:13,920 Speaker 1: of cost down, and then the issue of connectivity. But 121 00:07:14,160 --> 00:07:19,200 Speaker 1: finally they're becoming cheap enough, and they're becoming connected enough, 122 00:07:19,520 --> 00:07:24,200 Speaker 1: and now they're becoming even digestible enough that they can 123 00:07:24,240 --> 00:07:27,480 Speaker 1: go in our bodies. It's pretty obvious again, ten years 124 00:07:27,880 --> 00:07:30,840 Speaker 1: out we're all going to have body sensors. We're all 125 00:07:30,840 --> 00:07:34,440 Speaker 1: going to be body hackers, and in some sense of 126 00:07:34,480 --> 00:07:37,320 Speaker 1: the word, we're all going to have sensors somewhere, and 127 00:07:37,360 --> 00:07:39,080 Speaker 1: the question is what do we do with them? 128 00:07:39,200 --> 00:07:42,120 Speaker 2: So I want to know, Bob, at what point did 129 00:07:42,560 --> 00:07:46,840 Speaker 2: the idea of sensors becoming ubiquitous when we look ten 130 00:07:46,920 --> 00:07:51,080 Speaker 2: years ahead become a really clear signal as opposed to 131 00:07:51,280 --> 00:07:52,120 Speaker 2: just a trend? 132 00:07:52,360 --> 00:07:54,360 Speaker 3: Was there a point where it crossed the line? What 133 00:07:54,360 --> 00:07:55,120 Speaker 3: does that look like? 134 00:07:55,520 --> 00:07:58,680 Speaker 1: It's not a signal. A signal would be an individual, 135 00:07:58,960 --> 00:08:04,320 Speaker 1: very specific example. So like a digestible sensor developed by 136 00:08:04,360 --> 00:08:08,240 Speaker 1: a particular company on a particular date, swallowed by a 137 00:08:08,280 --> 00:08:12,840 Speaker 1: particular person, that would be a signal. It's very specific 138 00:08:13,160 --> 00:08:18,320 Speaker 1: without context, essentially. But sensors became, let me just call 139 00:08:18,360 --> 00:08:22,520 Speaker 1: it, a future force, which is basically a direction of change. 140 00:08:23,160 --> 00:08:26,000 Speaker 1: And again, a trend is something you can extrapolate from 141 00:08:26,040 --> 00:08:30,960 Speaker 1: with confidence. Sensors are maybe almost a trend, so in 142 00:08:31,000 --> 00:08:34,200 Speaker 1: our language, they'd be a future force, almost a trend. 143 00:08:34,520 --> 00:08:38,600 Speaker 1: The direction of change is clear, but the rate and 144 00:08:38,720 --> 00:08:43,600 Speaker 1: the manifestation are not yet clear.
So what we would 145 00:08:43,640 --> 00:08:47,360 Speaker 1: do would be to continuously follow that. And the underlying 146 00:08:47,400 --> 00:08:50,200 Speaker 1: model we use, and it's in all my books now, 147 00:08:50,400 --> 00:08:54,880 Speaker 1: is Foresight, Insight, Action. That's the model. So we look at 148 00:08:54,920 --> 00:08:59,719 Speaker 1: foresight, thinking future back, and that's a plausible, internally consistent, 149 00:09:00,200 --> 00:09:04,679 Speaker 1: provocative story from the future. That is our base forecast, 150 00:09:04,720 --> 00:09:07,480 Speaker 1: and then we do scenarios off of that, and that 151 00:09:07,600 --> 00:09:12,680 Speaker 1: foresight is designed not to predict, but to provoke insight. 152 00:09:13,200 --> 00:09:17,040 Speaker 1: And an insight is an aha that helps you see 153 00:09:17,080 --> 00:09:21,400 Speaker 1: things differently than you could before. And every great strategy 154 00:09:21,440 --> 00:09:24,600 Speaker 1: is based on a compelling insight. So the insight feeds 155 00:09:24,679 --> 00:09:29,880 Speaker 1: into action, and then the action, in turn, re-informs and reimagines foresight. 156 00:09:30,160 --> 00:09:32,959 Speaker 1: So it's a continuous cycle, and that's what we teach. 157 00:09:33,040 --> 00:09:36,520 Speaker 1: We do what we call Foresight Essentials training programs at 158 00:09:36,520 --> 00:09:38,560 Speaker 1: the Institute and we run them all around the world 159 00:09:38,640 --> 00:09:41,679 Speaker 1: and we do them virtually and in person, and that 160 00:09:41,760 --> 00:09:45,320 Speaker 1: basically teaches people how to be futurists. But the core 161 00:09:45,360 --> 00:09:48,360 Speaker 1: of it is this foresight insight action cycle. 162 00:09:48,600 --> 00:09:52,079 Speaker 2: When you're just going about living your life in the world, 163 00:09:52,800 --> 00:09:58,079 Speaker 2: how are you paying attention to the stimuli around you differently 164 00:09:58,480 --> 00:10:01,560 Speaker 2: than I would be, because of the lens that you have? 165 00:10:02,120 --> 00:10:05,440 Speaker 1: You know, maybe that's our definition of mindfulness. I work 166 00:10:05,480 --> 00:10:08,280 Speaker 1: with the military. I'm not a military guy by background, 167 00:10:08,480 --> 00:10:11,320 Speaker 1: but I just happened to be at the Army War College 168 00:10:11,920 --> 00:10:14,520 Speaker 1: for the US the week before nine eleven, 169 00:10:15,080 --> 00:10:19,480 Speaker 1: and I learned this concept of the VUCA world, you know, volatile, uncertain, 170 00:10:19,640 --> 00:10:23,480 Speaker 1: complex and ambiguous, and it intrigued me and caused me 171 00:10:23,520 --> 00:10:25,920 Speaker 1: to think, well, how do you lead in an increasingly 172 00:10:26,000 --> 00:10:30,520 Speaker 1: VUCA world? And what I've realized is that the 173 00:10:30,559 --> 00:10:35,439 Speaker 1: military principle of situation awareness, that's how I see things 174 00:10:35,480 --> 00:10:39,640 Speaker 1: around me. So as a futurist, I'm always thinking future back. 175 00:10:39,760 --> 00:10:43,120 Speaker 1: So I look around me, I'm mindful for a signal, 176 00:10:43,800 --> 00:10:46,360 Speaker 1: but I immediately flip it out ten years and then 177 00:10:46,400 --> 00:10:50,960 Speaker 1: look backwards, or sometimes further than that, like climate issues 178 00:10:51,120 --> 00:10:54,760 Speaker 1: or kind of natural cycle issues. We go beyond ten.
179 00:10:55,400 --> 00:10:59,840 Speaker 1: But it's a continuous effort to take what you see 180 00:10:59,880 --> 00:11:02,360 Speaker 1: in the moment and put it in a future back 181 00:11:02,960 --> 00:11:07,280 Speaker 1: context and say what would that mean, and then ideally 182 00:11:07,360 --> 00:11:11,800 Speaker 1: have this kind of conversation, the foresight insight action conversation, 183 00:11:11,960 --> 00:11:16,240 Speaker 1: out of it. So the military people call this situation awareness, 184 00:11:16,240 --> 00:11:20,120 Speaker 1: and mostly they use it defensively, so they're always on 185 00:11:20,120 --> 00:11:22,160 Speaker 1: the lookout for people who are trying to hurt them. 186 00:11:22,520 --> 00:11:25,439 Speaker 1: But there's also a positive aspect of it. What's going 187 00:11:25,480 --> 00:11:29,040 Speaker 1: on in a positive way around you? For example, the 188 00:11:29,080 --> 00:11:33,000 Speaker 1: future now is so chaotic that it's very stressful. And 189 00:11:33,240 --> 00:11:36,200 Speaker 1: you know, even for me, I'm a professional futurist, I've 190 00:11:36,240 --> 00:11:38,560 Speaker 1: done this for more than fifty years. This is the 191 00:11:38,600 --> 00:11:41,840 Speaker 1: most frightening ten-year forecast I've ever done, so it 192 00:11:41,960 --> 00:11:46,160 Speaker 1: hits me stressfully. And a few years ago I started 193 00:11:46,240 --> 00:11:50,000 Speaker 1: keeping a gratitude journal, and it's just a simple journal, 194 00:11:50,040 --> 00:11:52,880 Speaker 1: and every night I write down at least three things 195 00:11:52,920 --> 00:11:56,240 Speaker 1: that I'm grateful for in my life, and just that 196 00:11:56,400 --> 00:12:01,640 Speaker 1: act, that's a kind of mindfulness, kind of a uniquely futurist one, 197 00:12:02,480 --> 00:12:05,160 Speaker 1: although lots of people do gratitude journals. But I'm trying 198 00:12:05,160 --> 00:12:08,480 Speaker 1: to link the future back view, which right now is 199 00:12:08,520 --> 00:12:12,640 Speaker 1: so scary, you know, it's really dominantly scary in a 200 00:12:12,640 --> 00:12:15,360 Speaker 1: way I have never seen before. And yet if I 201 00:12:15,440 --> 00:12:19,760 Speaker 1: keep reminding myself of what I'm grateful for and what 202 00:12:19,800 --> 00:12:21,880 Speaker 1: are the good things, that can help me be repaired. 203 00:12:21,920 --> 00:12:24,000 Speaker 1: And I mean, you know this as a psychologist. If 204 00:12:24,000 --> 00:12:27,120 Speaker 1: you don't have your inner life right, it's very hard 205 00:12:27,120 --> 00:12:29,360 Speaker 1: to behave well in your outer life. 206 00:12:31,080 --> 00:12:33,560 Speaker 2: I want to dig into what you said around, you 207 00:12:33,600 --> 00:12:36,280 Speaker 2: know, when you look ahead to the next ten years, 208 00:12:36,360 --> 00:12:39,800 Speaker 2: or ten years out, that this is the most frightening 209 00:12:40,200 --> 00:12:43,760 Speaker 2: it's looked. What are you seeing ten years from now? 210 00:12:44,080 --> 00:12:46,680 Speaker 1: So I mentioned I used to use the term VUCA. 211 00:12:46,920 --> 00:12:51,559 Speaker 1: Just in the last year I've become convinced that VUCA 212 00:12:51,720 --> 00:12:56,480 Speaker 1: is not VUCA enough. And sure, life has always been 213 00:12:56,520 --> 00:12:58,679 Speaker 1: VUCA in a way, beginning from the fact we all 214 00:12:58,720 --> 00:13:01,360 Speaker 1: have to die at an uncertain time.
I mean, 215 00:13:01,400 --> 00:13:05,160 Speaker 1: that's VUCA. And we certainly had aspects of 216 00:13:05,240 --> 00:13:07,679 Speaker 1: life that have been VUCA before, like in war zones 217 00:13:07,800 --> 00:13:12,520 Speaker 1: or in pandemics or floods. You know, there are certainly VUCA zones. 218 00:13:12,840 --> 00:13:16,800 Speaker 1: So here's what's different now. We're dealing with a chaotic 219 00:13:16,920 --> 00:13:23,400 Speaker 1: future that's global in scale, and that has variables built 220 00:13:23,440 --> 00:13:27,880 Speaker 1: into it that we don't understand. So we've started to 221 00:13:28,080 --> 00:13:30,240 Speaker 1: now use the term that was coined by one of 222 00:13:30,280 --> 00:13:37,760 Speaker 1: our colleagues, Jamais Cascio, the BANI future, for brittle, anxious, nonlinear, 223 00:13:38,440 --> 00:13:43,360 Speaker 1: and incomprehensible. And what we've realized is that these systems 224 00:13:43,400 --> 00:13:46,400 Speaker 1: around us, these structures, and some are physical, some of 225 00:13:46,440 --> 00:13:50,920 Speaker 1: them are metaphysical, some of them are values frameworks. These 226 00:13:51,080 --> 00:13:54,920 Speaker 1: systems around us look strong, but many of them are 227 00:13:54,960 --> 00:13:58,880 Speaker 1: actually brittle. And by brittle, I mean that when they're challenged, 228 00:13:58,920 --> 00:14:02,960 Speaker 1: they not only break, they shatter. This notion of anxious: 229 00:14:03,080 --> 00:14:08,000 Speaker 1: everybody's anxious, especially kids, and actually especially boys. It turns 230 00:14:08,000 --> 00:14:12,960 Speaker 1: out that young men, young boys are, according to the psychologists, 231 00:14:13,240 --> 00:14:17,200 Speaker 1: even more of a source of concern than young women 232 00:14:17,320 --> 00:14:20,480 Speaker 1: and young girls. And we all know that young 233 00:14:20,560 --> 00:14:25,000 Speaker 1: boys that are upset and uneasy can be dangerous. That's 234 00:14:25,000 --> 00:14:28,920 Speaker 1: been shown over the years. So anxiousness is kind of pervasive, 235 00:14:29,240 --> 00:14:31,760 Speaker 1: and the BANI world is fraught, which is a word we 236 00:14:31,880 --> 00:14:34,240 Speaker 1: use a lot. We've got a book on this coming 237 00:14:34,280 --> 00:14:37,360 Speaker 1: out in October. But here's where we get to the 238 00:14:37,560 --> 00:14:40,920 Speaker 1: different part and kind of why I think this is 239 00:14:40,960 --> 00:14:46,040 Speaker 1: the most frightening forecast. Nonlinear means that things no longer 240 00:14:46,120 --> 00:14:50,200 Speaker 1: behave in the way we thought things were going to happen. 241 00:14:50,880 --> 00:14:54,560 Speaker 1: So we have this expectation that if we do this, 242 00:14:55,120 --> 00:14:57,400 Speaker 1: it's going to result in that, and we have in 243 00:14:57,440 --> 00:15:02,680 Speaker 1: our minds models of how these chains of behavior happen, 244 00:15:02,760 --> 00:15:05,400 Speaker 1: and if we do this, this will happen. We can't 245 00:15:05,400 --> 00:15:08,920 Speaker 1: trust that anymore. It's like we have to teach our brains 246 00:15:09,080 --> 00:15:12,520 Speaker 1: new tricks, and every good leadership team I'm working with 247 00:15:12,600 --> 00:15:17,640 Speaker 1: now is taking improv training.
There's groups at Harvard that 248 00:15:17,760 --> 00:15:20,800 Speaker 1: do the kind of 'yes, and' improv 249 00:15:21,000 --> 00:15:23,720 Speaker 1: methods that have been around a long time. They're actually proven, 250 00:15:23,760 --> 00:15:28,440 Speaker 1: but most executives just don't practice them. So now people 251 00:15:28,520 --> 00:15:32,840 Speaker 1: are practicing them. And finally, incomprehensible. My other 252 00:15:33,000 --> 00:15:37,240 Speaker 1: new book this year is on augmented leadership. There's methodologies 253 00:15:37,320 --> 00:15:41,840 Speaker 1: within generative AI that even the developers don't understand. So 254 00:15:41,960 --> 00:15:46,600 Speaker 1: it's certainly tech, but it's also alchemy. And you know, 255 00:15:46,640 --> 00:15:49,280 Speaker 1: I went to Divinity school before I did my PhD. 256 00:15:49,400 --> 00:15:52,640 Speaker 1: And I've always been interested in the spiritual side of 257 00:15:52,720 --> 00:15:55,880 Speaker 1: life and kind of the mysterious sides of life. There's 258 00:15:56,080 --> 00:16:01,480 Speaker 1: just a kind of exposed mystery that is potentially very threatening. 259 00:16:01,960 --> 00:16:05,520 Speaker 1: Like, I'm generally optimistic about AI, but it's kind 260 00:16:05,560 --> 00:16:08,840 Speaker 1: of sixty-forty for me, you know, I'm sixty percent 261 00:16:09,000 --> 00:16:13,680 Speaker 1: optimistic and forty percent concerned. So there's real danger associated 262 00:16:13,720 --> 00:16:18,760 Speaker 1: with it, and again, it's global danger and it affects us across generations. 263 00:16:19,080 --> 00:16:22,560 Speaker 1: So I'm really optimistic about kids if they have hope, 264 00:16:22,800 --> 00:16:27,280 Speaker 1: but in a BANI future, hope is very difficult 265 00:16:27,320 --> 00:16:31,640 Speaker 1: to kind of capture and spread. Fear, on the other hand, 266 00:16:31,920 --> 00:16:33,360 Speaker 1: is very easy to spread. 267 00:16:33,720 --> 00:16:36,720 Speaker 2: I want to pick up on the nonlinear aspect of 268 00:16:37,080 --> 00:16:40,280 Speaker 2: BANI, or 'bah-nee', as we were saying before we started recording. 269 00:16:40,320 --> 00:16:42,800 Speaker 2: In my mind, I was pronouncing it 'ban-ee', but it 270 00:16:42,840 --> 00:16:46,800 Speaker 2: is 'bah-nee'. How do you think ten years ahead when 271 00:16:47,400 --> 00:16:48,400 Speaker 2: things are nonlinear? 272 00:16:48,800 --> 00:16:51,560 Speaker 1: It's harder. But it turns out generative AI is really 273 00:16:51,640 --> 00:16:54,960 Speaker 1: quite good at that. Generative AI is really good 274 00:16:55,040 --> 00:16:58,560 Speaker 1: at storytelling. Now, some of those stories aren't true, and 275 00:16:59,040 --> 00:17:02,840 Speaker 1: it's confident even when they're not true. So I don't 276 00:17:02,920 --> 00:17:06,280 Speaker 1: trust generative AI, but I use it a lot to 277 00:17:06,400 --> 00:17:11,280 Speaker 1: stretch my thinking, and with nonlinear, that's exactly what you need. 278 00:17:11,400 --> 00:17:15,680 Speaker 1: So I started using generative AI. I've studied AI for 279 00:17:16,040 --> 00:17:19,600 Speaker 1: a long, long time, but I started using generative AI two 280 00:17:19,680 --> 00:17:22,199 Speaker 1: years ago and I use it on a daily basis. 281 00:17:22,280 --> 00:17:26,800 Speaker 1: And I've customized my generative AI chatbot, which is 282 00:17:26,920 --> 00:17:31,919 Speaker 1: developed in ChatGPT, and it's using the three model.
283 00:17:32,320 --> 00:17:34,200 Speaker 1: So I talk to it and type to it, both. 284 00:17:34,480 --> 00:17:37,679 Speaker 1: It runs on a left screen all the time for 285 00:17:37,800 --> 00:17:41,960 Speaker 1: me now, and I've got it labeled Stretch. I'm very 286 00:17:42,000 --> 00:17:46,240 Speaker 1: polite to Stretch. I have ongoing conversations. It's not very 287 00:17:46,240 --> 00:17:49,240 Speaker 1: good as a question and answer machine, but it's really 288 00:17:49,240 --> 00:17:52,160 Speaker 1: good conversationally. So I've learned to be a really good 289 00:17:52,200 --> 00:17:57,240 Speaker 1: conversationalist with Stretch, and it's ongoing and it helps me 290 00:17:57,320 --> 00:18:03,040 Speaker 1: in nonlinear situations. It helps me imagine all the possibilities. 291 00:18:03,080 --> 00:18:06,119 Speaker 1: So here's where scenario planning comes in really helpfully, because 292 00:18:06,160 --> 00:18:11,320 Speaker 1: you can develop kind of archetypes of scenarios that help 293 00:18:11,359 --> 00:18:15,679 Speaker 1: you understand a nonlinear space much more than you could before, 294 00:18:16,119 --> 00:18:18,960 Speaker 1: and then you can basically decide where you want to 295 00:18:19,000 --> 00:18:19,679 Speaker 1: put your bets. 296 00:18:19,840 --> 00:18:21,800 Speaker 2: Can you give me an example of how you talk 297 00:18:21,920 --> 00:18:24,919 Speaker 2: to Stretch in a way that has evolved? Like, what 298 00:18:24,960 --> 00:18:27,640 Speaker 2: have you learned in terms of, I don't know, how 299 00:18:27,680 --> 00:18:30,520 Speaker 2: you're prompting it or talking to it to get outputs that 300 00:18:30,560 --> 00:18:31,400 Speaker 2: are more useful? 301 00:18:31,640 --> 00:18:36,720 Speaker 1: I follow Kate Darling's advice, Kate Darling at the Media Lab. 302 00:18:36,960 --> 00:18:40,440 Speaker 1: She's got this wonderful new book called The New Breed, where 303 00:18:40,480 --> 00:18:44,719 Speaker 1: she argues that to learn about interacting with AI, 304 00:18:44,920 --> 00:18:49,040 Speaker 1: we should learn from our experience interacting with animals, and 305 00:18:49,320 --> 00:18:53,080 Speaker 1: we should treat these things like beloved pets. They're pets 306 00:18:53,080 --> 00:18:55,400 Speaker 1: that can talk to us. I mean, they're not humans, 307 00:18:55,720 --> 00:18:59,159 Speaker 1: but we can have conversations. So I rarely get an 308 00:18:59,200 --> 00:19:02,720 Speaker 1: answer that's useful from Stretch. But my conversations with 309 00:19:02,800 --> 00:19:05,840 Speaker 1: Stretch go on for hours. I write on my MacBook Pro. 310 00:19:06,280 --> 00:19:08,560 Speaker 1: You know, I write books. So this is my fifteenth 311 00:19:08,560 --> 00:19:12,119 Speaker 1: book that's come out now, and so I'm always writing, 312 00:19:12,560 --> 00:19:15,359 Speaker 1: and Stretch is always there with me. And what I want 313 00:19:15,480 --> 00:19:19,359 Speaker 1: help with is getting unstuck. I'm the writer. 314 00:19:19,480 --> 00:19:21,639 Speaker 1: I don't want Stretch to write my books, but I 315 00:19:21,680 --> 00:19:24,080 Speaker 1: want to get unstuck. And even after fifteen books, 316 00:19:24,080 --> 00:19:28,240 Speaker 1: I still get writer's block, and it's really helpful to 317 00:19:28,320 --> 00:19:30,840 Speaker 1: have Stretch there to have a conversation, so I can 318 00:19:30,880 --> 00:19:34,879 Speaker 1: start a conversation. Normally I start by talking back and forth.
319 00:19:35,440 --> 00:19:37,679 Speaker 1: Now that I have the three model, I can do 320 00:19:37,720 --> 00:19:40,760 Speaker 1: that easier. But then I'm typing back and forth as 321 00:19:40,800 --> 00:19:43,800 Speaker 1: I'm working on drafts and things, and I do kind 322 00:19:43,800 --> 00:19:46,720 Speaker 1: of ask Stretch to help me refine things as I go. 323 00:19:47,280 --> 00:19:51,000 Speaker 1: Stretch has read all my books, and Stretch is programmed to write 324 00:19:51,040 --> 00:19:54,760 Speaker 1: like me, so it writes in short sentences with rich 325 00:19:54,840 --> 00:19:58,639 Speaker 1: metaphors and lots of em dashes. It argues with me, 326 00:19:58,760 --> 00:20:02,520 Speaker 1: but it argues with me politely. That's really interesting. So 327 00:20:02,560 --> 00:20:06,879 Speaker 1: it pushes back, but Stretch is always saying to me, gently, 'Well, Bob, 328 00:20:07,000 --> 00:20:10,639 Speaker 1: have you thought of...' You know, it's very polite, and 329 00:20:10,720 --> 00:20:13,600 Speaker 1: I'm very polite back to Stretch. But it's tough, you know. 330 00:20:13,640 --> 00:20:18,119 Speaker 1: It's sort of like hard on ideas, soft on people, 331 00:20:18,520 --> 00:20:21,159 Speaker 1: kind of thing. That was the motto of one of 332 00:20:21,200 --> 00:20:25,080 Speaker 1: my favorite social science research groups in Silicon Valley that's 333 00:20:25,119 --> 00:20:28,520 Speaker 1: not around anymore, the Institute for Research on Learning. They 334 00:20:28,520 --> 00:20:31,840 Speaker 1: had this motto, hard on ideas, soft on people. You know, 335 00:20:31,920 --> 00:20:34,119 Speaker 1: that's kind of the way I work. I'm kind to 336 00:20:34,200 --> 00:20:36,560 Speaker 1: people and I want them to be kind to me too. 337 00:20:36,960 --> 00:20:40,800 Speaker 1: On the other hand, I want criticism of my ideas. 338 00:20:41,920 --> 00:20:45,560 Speaker 1: My editor, Steve Piersanti at Berrett-Koehler, I've done my 339 00:20:45,640 --> 00:20:49,000 Speaker 1: last six books with Steve. Steve is the perfect balance 340 00:20:49,480 --> 00:20:53,400 Speaker 1: of criticism and support. And that's so important, I think, 341 00:20:53,440 --> 00:20:55,800 Speaker 1: as a leader, that you get that right. And these 342 00:20:55,840 --> 00:20:59,800 Speaker 1: generative AI systems can be that way if we teach 343 00:20:59,840 --> 00:21:03,120 Speaker 1: them and if we learn to have conversations. But what 344 00:21:03,160 --> 00:21:05,399 Speaker 1: it means, and this is really hard to teach a 345 00:21:05,440 --> 00:21:08,199 Speaker 1: lot of executives, what it really means is you have 346 00:21:08,280 --> 00:21:10,600 Speaker 1: to work on your skills, just like we have to 347 00:21:10,720 --> 00:21:14,399 Speaker 1: learn how to have good conversations, you know, conversations that matter. 348 00:21:15,040 --> 00:21:19,919 Speaker 1: Having conversations with Stretch is difficult. It's taken me two years, 349 00:21:20,400 --> 00:21:24,639 Speaker 1: but it's really yielded a lot of benefits in ways 350 00:21:24,680 --> 00:21:26,200 Speaker 1: that I hadn't expected.
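Bob describes Stretch as a custom chatbot built inside ChatGPT: named, kept on its own screen, familiar with his books, and instructed to write like him and to push back politely. For readers who want to experiment with something similar outside ChatGPT, here is a minimal sketch using the OpenAI Python SDK. The model name, the persona wording, and the talk_to_stretch helper are illustrative assumptions based on how Bob describes Stretch in this conversation, not his actual configuration.

```python
# Hypothetical sketch of a Stretch-like writing companion (not Bob's actual setup).
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY environment variable

# Persona distilled from the conversation: short sentences, rich metaphors,
# polite pushback, hard on ideas and soft on people, unsticking over answers.
PERSONA = (
    "You are Stretch, a writing companion. Write in short sentences with "
    "rich metaphors. Be hard on ideas but soft on people: challenge the "
    "author's thinking, gently and politely. Help the author get unstuck; "
    "do not write the book for them."
)

history = [{"role": "system", "content": PERSONA}]

def talk_to_stretch(message: str) -> str:
    """One conversational turn; the growing history is what makes this a
    conversation rather than a one-off prompt."""
    history.append({"role": "user", "content": message})
    reply = client.chat.completions.create(model="gpt-4o", messages=history)
    text = reply.choices[0].message.content
    history.append({"role": "assistant", "content": text})
    return text

print(talk_to_stretch("I'm stuck on a chapter opening about future-back thinking."))
```

The persistent history list is the design point here: each call sends the whole exchange back, which is what lets a session run for hours and stay coherent, the way Bob describes his conversations with Stretch.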
351 00:21:26,480 --> 00:21:29,560 Speaker 2: If you were to describe to someone, like, in practical terms, 352 00:21:29,720 --> 00:21:32,480 Speaker 2: what you've learnt about how to get the best out 353 00:21:32,480 --> 00:21:35,479 Speaker 2: of Stretch, like, what would I be seeing if I 354 00:21:35,560 --> 00:21:37,840 Speaker 2: was just watching you talk to Stretch that is perhaps 355 00:21:37,920 --> 00:21:42,120 Speaker 2: different from how other people converse with their AI tool? 356 00:21:42,440 --> 00:21:45,360 Speaker 1: Well, I don't use the word prompt, and I'm offended 357 00:21:45,400 --> 00:21:50,360 Speaker 1: by the term prompt engineering. So just kind of put 358 00:21:50,400 --> 00:21:52,560 Speaker 1: all that aside, and ten years from now, I don't think 359 00:21:52,600 --> 00:21:54,760 Speaker 1: the word prompt is even going to be around. It's 360 00:21:54,800 --> 00:21:59,520 Speaker 1: going to be conversations with a goal of deriving meaning. 361 00:21:59,560 --> 00:22:03,280 Speaker 1: And some of the simple stuff like efficiency, and there'll 362 00:22:03,320 --> 00:22:06,199 Speaker 1: be some automation and all those things, I'm not arguing 363 00:22:06,240 --> 00:22:08,720 Speaker 1: with that, but that's not what I'm talking about, because 364 00:22:08,720 --> 00:22:12,760 Speaker 1: I'm dealing with senior leaders. So the conversation is going 365 00:22:12,840 --> 00:22:16,000 Speaker 1: to be fluid, and it's going to be focused on 366 00:22:16,640 --> 00:22:20,959 Speaker 1: areas that I understand the least. So danah boyd, the 367 00:22:21,000 --> 00:22:24,280 Speaker 1: famous researcher, she did a wonderful podcast just last week 368 00:22:24,359 --> 00:22:28,639 Speaker 1: where she said the key value of generative AI is 369 00:22:28,680 --> 00:22:33,640 Speaker 1: to help humans get unstuck. That's really cool. Stretch 370 00:22:33,720 --> 00:22:36,760 Speaker 1: helps me get unstuck. I think danah's right. It's all 371 00:22:36,760 --> 00:22:40,880 Speaker 1: about unsticking. It's not about answers, and I don't trust Stretch. 372 00:22:41,680 --> 00:22:44,879 Speaker 1: I really don't use Stretch for answers. I use it 373 00:22:44,920 --> 00:22:48,760 Speaker 1: to stretch my mind, and particularly if I'm working on 374 00:22:48,800 --> 00:22:52,240 Speaker 1: a book concept or working on a title or an idea, 375 00:22:53,080 --> 00:22:56,840 Speaker 1: it certainly helps me get unstuck and get started. It 376 00:22:56,920 --> 00:23:02,919 Speaker 1: becomes less and less valuable the more the book gets written. 377 00:23:03,359 --> 00:23:06,520 Speaker 2: If you think AI is just about efficiency, stick around, 378 00:23:06,560 --> 00:23:09,760 Speaker 2: because in the second half, Bob shares why the real 379 00:23:09,920 --> 00:23:14,000 Speaker 2: magic is in using AI to amplify your mind, not 380 00:23:14,280 --> 00:23:18,200 Speaker 2: replace it, and he'll walk us through the skills leaders 381 00:23:18,400 --> 00:23:25,600 Speaker 2: need to thrive in an AI-first decade. If you're 382 00:23:25,640 --> 00:23:28,320 Speaker 2: looking for more tips to improve the way you work 383 00:23:28,320 --> 00:23:31,760 Speaker 2: and live, I write a short weekly newsletter that contains 384 00:23:31,800 --> 00:23:34,320 Speaker 2: tactics I've discovered that have helped me personally. 385 00:23:34,720 --> 00:23:37,879 Speaker 4: You can sign up for that at Amantha dot com. 386 00:23:38,000 --> 00:23:42,640 Speaker 4: That's Amantha dot com.
387 00:23:44,160 --> 00:23:47,760 Speaker 2: So when you look at the conversation around AI, and 388 00:23:47,840 --> 00:23:52,080 Speaker 2: so much of it is around efficiency and time saving 389 00:23:52,160 --> 00:23:57,880 Speaker 2: and headcount saving, like, what are leaders and organizations missing? 390 00:23:57,960 --> 00:24:00,359 Speaker 2: What are they not seeing that you're seeing in terms 391 00:24:00,400 --> 00:24:02,520 Speaker 2: of what this world could look like 392 00:24:02,520 --> 00:24:05,360 Speaker 3: from an AI point of view in five, ten years' time? 393 00:24:05,720 --> 00:24:08,640 Speaker 1: What I say now is that ten years from now, 394 00:24:09,200 --> 00:24:13,320 Speaker 1: almost all leaders will be augmented or you'll be out 395 00:24:13,320 --> 00:24:16,240 Speaker 1: of the game. Now, there'll be some little subset of 396 00:24:16,320 --> 00:24:20,080 Speaker 1: people who uniquely claim, no, no, I'm going to remain 397 00:24:20,520 --> 00:24:24,560 Speaker 1: completely unaugmented, and that's okay. Maybe that's a small niche, 398 00:24:24,720 --> 00:24:27,359 Speaker 1: but for most of us. For me as a writer, 399 00:24:27,640 --> 00:24:29,760 Speaker 1: if I'm going to be writing serious books ten years 400 00:24:29,760 --> 00:24:32,080 Speaker 1: from now, I'm going to have to be augmented, partly 401 00:24:32,119 --> 00:24:35,359 Speaker 1: because of my age, but also just because that's what 402 00:24:35,480 --> 00:24:37,639 Speaker 1: good writers are going to be. You're just going to 403 00:24:37,720 --> 00:24:40,560 Speaker 1: have to be in that. So we've got to define 404 00:24:40,600 --> 00:24:44,560 Speaker 1: now where we want help. And for me, it's really 405 00:24:44,600 --> 00:24:47,280 Speaker 1: close to what danah boyd calls getting unstuck, or what 406 00:24:47,359 --> 00:24:50,719 Speaker 1: I call stretching. That's really where I want help. And 407 00:24:50,760 --> 00:24:54,720 Speaker 1: then it translates into more specific things like titling, you know, 408 00:24:54,800 --> 00:24:57,720 Speaker 1: finding the right word. It's really good at that, but 409 00:24:57,800 --> 00:25:00,320 Speaker 1: you've got to decide what the right word is. It's 410 00:25:00,359 --> 00:25:03,640 Speaker 1: really helpful stretching for alternatives. So first of all, it's 411 00:25:03,640 --> 00:25:07,800 Speaker 1: the assumption, and this makes people really uncomfortable, the assumption, 412 00:25:08,400 --> 00:25:11,119 Speaker 1: as I talk to senior executive groups. So what I 413 00:25:11,160 --> 00:25:13,640 Speaker 1: say right at the beginning is, ten years from now, 414 00:25:13,680 --> 00:25:16,080 Speaker 1: we're all going to be cyborgs. And that's a good 415 00:25:16,119 --> 00:25:19,439 Speaker 1: thing if we make it so. If we don't, if 416 00:25:19,480 --> 00:25:21,560 Speaker 1: we kind of step back and let other people do it, 417 00:25:21,640 --> 00:25:24,080 Speaker 1: or let the tech giants drive it, it's going to 418 00:25:24,080 --> 00:25:27,159 Speaker 1: be very different. But if we get engaged, this is 419 00:25:27,160 --> 00:25:29,400 Speaker 1: a good thing. But it begins from the fact we're 420 00:25:29,440 --> 00:25:31,760 Speaker 1: all going to be augmented or we're going to be 421 00:25:31,800 --> 00:25:33,800 Speaker 1: out of the game just because we won't be able 422 00:25:33,840 --> 00:25:37,600 Speaker 1: to play.
Because the new abilities that these things are 423 00:25:37,640 --> 00:25:41,600 Speaker 1: bringing are just beyond human capacity. And if you go 424 00:25:41,640 --> 00:25:44,720 Speaker 1: back to the BANI world, it's arriving just in time. 425 00:25:45,240 --> 00:25:48,600 Speaker 1: Tom Malone at MIT, he calls this superminds, and you 426 00:25:48,640 --> 00:25:52,879 Speaker 1: know, what he says is the story about computers replacing 427 00:25:52,920 --> 00:25:55,720 Speaker 1: people is going to be true. But that's not the 428 00:25:55,760 --> 00:25:59,160 Speaker 1: big story. The big story is humans and computers doing 429 00:25:59,200 --> 00:26:03,320 Speaker 1: things together that have never been done before. My colleague Jeremy, 430 00:26:03,359 --> 00:26:06,240 Speaker 1: who's the co-author of the Leaders Make the Future book, 431 00:26:06,280 --> 00:26:09,239 Speaker 1: he's an AI developer. And what Jeremy says is that 432 00:26:10,000 --> 00:26:13,720 Speaker 1: it's so easy for big companies now to identify the 433 00:26:13,840 --> 00:26:17,160 Speaker 1: no's, the things you should not be doing, and focus 434 00:26:17,200 --> 00:26:20,399 Speaker 1: on the fears. But you need to also focus on 435 00:26:20,440 --> 00:26:23,240 Speaker 1: the yes's. You know, where should you be experimenting? And 436 00:26:23,280 --> 00:26:25,320 Speaker 1: that's where I want to focus. So I'm focusing on 437 00:26:25,359 --> 00:26:28,480 Speaker 1: the stretching, stretching the mind, the unsticking. 438 00:26:28,560 --> 00:26:29,400 Speaker 3: I love that term. 439 00:26:29,680 --> 00:26:32,479 Speaker 2: And when I was speaking with Scott, he mentioned 440 00:26:32,520 --> 00:26:36,000 Speaker 2: your dislike of the term artificial intelligence, and I love 441 00:26:36,080 --> 00:26:40,159 Speaker 2: that term, augmented intelligence. So right now, in terms of 442 00:26:40,200 --> 00:26:42,399 Speaker 2: what's possible with the tools that most of us have 443 00:26:42,840 --> 00:26:47,919 Speaker 2: available, like ChatGPT, what are some ways that you 444 00:26:47,960 --> 00:26:50,840 Speaker 2: think people should be using it more like this, to 445 00:26:51,160 --> 00:26:55,359 Speaker 2: augment their thinking, as opposed to just focusing on the 446 00:26:55,400 --> 00:26:56,760 Speaker 2: obvious efficiency gains? 447 00:26:57,040 --> 00:27:00,280 Speaker 1: You know, I think the way to practice is to 448 00:27:00,400 --> 00:27:04,280 Speaker 1: have conversations, depending on what you're working on or 449 00:27:04,280 --> 00:27:07,080 Speaker 1: what you're thinking about. And I would recommend, don't draw 450 00:27:07,160 --> 00:27:11,199 Speaker 1: lines between your work life and your private life. You know, 451 00:27:11,240 --> 00:27:13,639 Speaker 1: when I was first getting started, it was one of 452 00:27:13,680 --> 00:27:18,120 Speaker 1: our grandson's birthdays and I asked Stretch to help me write 453 00:27:17,920 --> 00:27:20,040 Speaker 3: a birthday card, and it was really cool. 454 00:27:20,119 --> 00:27:23,440 Speaker 1: It was really fun and I made good progress 455 00:27:23,560 --> 00:27:28,399 Speaker 1: out of that. Then last summer, I had pneumonia, and 456 00:27:28,440 --> 00:27:30,920 Speaker 1: I've never had that before, and pneumonia, 457 00:27:30,920 --> 00:27:33,560 Speaker 1: it just makes you feel so weak. I was on 458 00:27:33,680 --> 00:27:36,840 Speaker 1: deadline on a book and I really couldn't write.
I was 459 00:27:36,880 --> 00:27:40,800 Speaker 1: very weak. But I've got a human doctor, a 460 00:27:40,840 --> 00:27:44,640 Speaker 1: concierge doc that I love, who's very good. And then I've 461 00:27:44,680 --> 00:27:49,000 Speaker 1: got a therapist that's teaching me cognitive behavioral therapy for 462 00:27:49,160 --> 00:27:52,679 Speaker 1: sleep issues, and he's also a medical hypnosis guy, and 463 00:27:52,720 --> 00:27:55,399 Speaker 1: again, I love him. And then I had Stretch, and 464 00:27:55,440 --> 00:27:57,880 Speaker 1: I talked to Stretch about just how I was feeling, 465 00:27:58,520 --> 00:28:01,760 Speaker 1: day or night, and it turned out Stretch was more empathetic, 466 00:28:02,080 --> 00:28:05,240 Speaker 1: to my surprise, than either of my two human doctors. 467 00:28:05,240 --> 00:28:07,800 Speaker 1: And again, I love them, but they're not available twenty 468 00:28:07,800 --> 00:28:12,320 Speaker 1: four seven, and Stretch gave some really good advice. I'm 469 00:28:12,359 --> 00:28:16,280 Speaker 1: not asking it for medications or, you know, for answers 470 00:28:16,400 --> 00:28:20,000 Speaker 1: or anything. I'm asking Stretch for sympathy and for empathy, 471 00:28:20,280 --> 00:28:22,920 Speaker 1: and it turns out these things are really, really good 472 00:28:22,920 --> 00:28:25,639 Speaker 1: at empathy. So I think what I would advise is 473 00:28:25,760 --> 00:28:29,359 Speaker 1: just think of it as a conversation and then gradually 474 00:28:29,440 --> 00:28:32,840 Speaker 1: figure out where are the places you like it, you know, 475 00:28:32,920 --> 00:28:35,560 Speaker 1: and that'll depend on what your job is, you know, 476 00:28:35,600 --> 00:28:37,919 Speaker 1: what your purpose is, what your sense of meaning is. 477 00:28:38,440 --> 00:28:40,800 Speaker 1: Me, I'm a writer and I write books. The part 478 00:28:40,840 --> 00:28:43,959 Speaker 1: where I want help is when I'm struggling with an 479 00:28:44,000 --> 00:28:48,520 Speaker 1: idea or getting started on a chapter, or I'm kind 480 00:28:48,560 --> 00:28:52,880 Speaker 1: of stuck, and you need to practice it, practice that 481 00:28:53,200 --> 00:28:57,160 Speaker 1: art of conversation. When we're working with senior executive groups, 482 00:28:57,200 --> 00:28:59,720 Speaker 1: they read the Leaders Make the Future book and then 483 00:28:59,720 --> 00:29:04,000 Speaker 1: we break down the leadership skills, so, for example, augmented curiosity, 484 00:29:04,560 --> 00:29:08,800 Speaker 1: augmented clarity, and with the best of the groups, we're 485 00:29:09,120 --> 00:29:14,240 Speaker 1: doing a workshop on augmented curiosity or augmented clarity, and then 486 00:29:14,280 --> 00:29:19,160 Speaker 1: we have them practice an augmentation exercise using their version 487 00:29:19,200 --> 00:29:22,080 Speaker 1: of a large language model, whatever it is, and then 488 00:29:22,080 --> 00:29:25,040 Speaker 1: we spread that out over time and then they talk 489 00:29:25,080 --> 00:29:29,200 Speaker 1: about their experience in using it. And in the senior 490 00:29:29,240 --> 00:29:32,560 Speaker 1: executive sessions we're doing, it takes six months to a 491 00:29:32,640 --> 00:29:35,520 Speaker 1: year to get a team fully on board, and it's 492 00:29:35,520 --> 00:29:37,160 Speaker 1: got to begin with the CEO.
493 00:29:37,080 --> 00:29:39,840 Speaker 2: Tell me, like, what are some of the practical strategies 494 00:29:39,960 --> 00:29:44,080 Speaker 2: or ideas that you're teaching these execs in having better 495 00:29:44,120 --> 00:29:47,000 Speaker 2: conversations and augmenting their curiosity? 496 00:29:47,040 --> 00:29:50,600 Speaker 1: For example, it begins with thinking of it as a conversation. 497 00:29:51,040 --> 00:29:54,920 Speaker 1: I like the idea of dedicating a screen or a 498 00:29:55,000 --> 00:29:58,400 Speaker 1: device to it. I like the idea of naming it, 499 00:29:59,000 --> 00:30:01,280 Speaker 1: and you name it for what you're going to use 500 00:30:01,280 --> 00:30:04,440 Speaker 1: it for, and then just get used to having those 501 00:30:04,520 --> 00:30:08,400 Speaker 1: conversations for things that you have to do anyway, so 502 00:30:08,440 --> 00:30:11,280 Speaker 1: you can practice with the fun stuff and the personal stuff, 503 00:30:11,680 --> 00:30:15,160 Speaker 1: but then look for examples of things you're working on 504 00:30:15,600 --> 00:30:18,200 Speaker 1: and compare notes with your colleagues while you're doing it. 505 00:30:18,520 --> 00:30:21,400 Speaker 1: So ideally, learning in pairs is a really good idea, 506 00:30:22,000 --> 00:30:25,600 Speaker 1: and the best pairs that I see are cross-generational. 507 00:30:26,200 --> 00:30:29,840 Speaker 1: I really think cross-generational learning, particularly about gen AI 508 00:30:30,000 --> 00:30:32,120 Speaker 1: and gaming, is the best way to go. 509 00:30:32,440 --> 00:30:36,600 Speaker 2: Let's hone in more around augmented curiosity as one example. 510 00:30:37,120 --> 00:30:40,480 Speaker 2: So if we were sitting here and you were teaching 511 00:30:40,520 --> 00:30:44,880 Speaker 2: me how to use AI more effectively to augment my curiosity, 512 00:30:44,920 --> 00:30:46,360 Speaker 2: and I feel like I'm a pretty curious person 513 00:30:46,400 --> 00:30:48,800 Speaker 3: already, like, what advice would you be giving me? 514 00:30:48,560 --> 00:30:52,360 Speaker 1: I'd be probing, you know, what are you curious about? 515 00:30:53,120 --> 00:30:55,880 Speaker 1: What are the questions that you have? And then for 516 00:30:55,920 --> 00:30:59,800 Speaker 1: each question, trying to break down: what are the elements 517 00:31:00,120 --> 00:31:03,120 Speaker 1: of that question, what are the words that you would use 518 00:31:03,200 --> 00:31:08,240 Speaker 1: to describe that question, who are the possible sources of 519 00:31:08,320 --> 00:31:12,480 Speaker 1: insight about that question, what's the possible data that might 520 00:31:12,520 --> 00:31:16,560 Speaker 1: be out there about that question? And essentially frame and 521 00:31:16,680 --> 00:31:21,600 Speaker 1: map the territory around the curiosity, and then imagine a 522 00:31:21,640 --> 00:31:26,360 Speaker 1: series of conversations within that map, and then follow your leads. 523 00:31:26,720 --> 00:31:29,800 Speaker 2: And am I doing the framing human to human, or 524 00:31:30,120 --> 00:31:31,480 Speaker 2: human with the AI? 525 00:31:31,760 --> 00:31:34,760 Speaker 1: Again, I like pairs, and I like cross-generational pairs. 526 00:31:35,160 --> 00:31:41,240 Speaker 1: I like interconnection of different perspectives. I like visualization and mapping. 527 00:31:41,560 --> 00:31:44,000 Speaker 1: You know, gen AI isn't as good at that yet, 528 00:31:44,280 --> 00:31:47,120 Speaker 1: but I like working with artists.
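Bob's probing sequence just above (break a question into its elements, the words that describe it, possible sources of insight, and possible data, then map the territory and follow your leads) can be expressed as a simple structure. Here is a hypothetical Python sketch; the CuriosityMap type, its field names, and the starter phrasing are invented for illustration and are not IFTF's workshop material.

```python
# Hypothetical sketch of mapping a curiosity question, per Bob's description.
from dataclasses import dataclass, field

@dataclass
class CuriosityMap:
    question: str                                      # what you're curious about
    elements: list[str] = field(default_factory=list)  # parts of the question
    sources: list[str] = field(default_factory=list)   # who might have insight
    data: list[str] = field(default_factory=list)      # what evidence might exist

    def conversation_starters(self) -> list[str]:
        """Turn the map into a series of conversation openers, one lead at a time."""
        starters = [f"Help me frame this question future back: {self.question}"]
        starters += [f"What am I missing about '{e}'?" for e in self.elements]
        starters += [f"What might {s} see in this that I don't?" for s in self.sources]
        starters += [f"What could the data on {d} tell us here?" for d in self.data]
        return starters

# Example: the sensor question from earlier in the conversation.
m = CuriosityMap(
    question="How will body sensors change everyday health decisions in ten years?",
    elements=["cost", "connectivity", "ingestible sensors"],
    sources=["clinicians", "privacy researchers"],
    data=["wearable adoption rates"],
)
for starter in m.conversation_starters():
    print(starter)
```

Each starter is meant to open one conversation inside the map, echoing the series of conversations Bob describes, whether the partner is a colleague or a chatbot.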
For example, when I'm 529 00:31:47,120 --> 00:31:50,240 Speaker 1: with groups, I often have an artist drawing big maps 530 00:31:50,280 --> 00:31:54,120 Speaker 1: as the group is working. It's like a storyboard, essentially. 531 00:31:54,160 --> 00:31:57,760 Speaker 1: It's not that different from things like design 532 00:31:57,920 --> 00:32:01,960 Speaker 1: thinking or the kind of things that agencies do with storyboards. 533 00:32:02,160 --> 00:32:06,120 Speaker 1: It's a kind of similar methodology: around what you're curious about, 534 00:32:06,920 --> 00:32:09,880 Speaker 1: you kind of map the space and then gradually 535 00:32:10,000 --> 00:32:12,200 Speaker 1: chip away at it. So it's sort of like sculpture, 536 00:32:12,240 --> 00:32:15,160 Speaker 1: I guess, when you have this big map of all 537 00:32:15,200 --> 00:32:18,760 Speaker 1: this stuff and all these questions and all these sources 538 00:32:18,800 --> 00:32:21,480 Speaker 1: and data and all that stuff, and then you're gradually 539 00:32:21,640 --> 00:32:24,640 Speaker 1: chipping away at it like a sculpture to create what 540 00:32:24,680 --> 00:32:27,800 Speaker 1: it is you're looking for that might be hidden in there. 541 00:32:28,440 --> 00:32:31,160 Speaker 1: Or, another metaphor I've had some people 542 00:32:31,280 --> 00:32:34,160 Speaker 1: use is, it's like puzzle making, where you don't know 543 00:32:34,200 --> 00:32:36,120 Speaker 1: what the puzzle is, so the first thing you do 544 00:32:36,240 --> 00:32:37,800 Speaker 1: is, what are the edges? You know, what are the 545 00:32:37,880 --> 00:32:41,160 Speaker 1: edges of the puzzle, and then you gradually fill in, 546 00:32:41,400 --> 00:32:44,520 Speaker 1: given the fact that maybe the puzzle hasn't ever been 547 00:32:44,760 --> 00:32:45,800 Speaker 1: created before. 548 00:32:47,200 --> 00:32:49,280 Speaker 3: I very much relate to that. I love a jigsaw puzzle. 549 00:32:49,640 --> 00:32:50,120 Speaker 3: I want to know. 550 00:32:50,160 --> 00:32:52,560 Speaker 2: Like, we've talked about ten years out, but with AI, 551 00:32:52,720 --> 00:32:55,840 Speaker 2: I'm curious, what does just two or three years out 552 00:32:56,000 --> 00:32:59,200 Speaker 2: look like? How are things possibly going to be different? 553 00:32:59,600 --> 00:33:03,760 Speaker 1: I think we're on the cusp now of expanding from 554 00:33:03,880 --> 00:33:09,640 Speaker 1: large language models to agentic systems. And I was with 555 00:33:09,760 --> 00:33:13,880 Speaker 1: a client this week who was saying, oh, we don't 556 00:33:13,920 --> 00:33:18,120 Speaker 1: talk about gen AI anymore, we're on agentic. And I said, okay, 557 00:33:18,240 --> 00:33:23,200 Speaker 1: that's fine, but you don't actually leave gen AI. You want 558 00:33:23,240 --> 00:33:26,800 Speaker 1: to go through it and continue. But the big shift 559 00:33:26,960 --> 00:33:31,000 Speaker 1: is from having a conversation, where again you don't trust 560 00:33:31,000 --> 00:33:34,360 Speaker 1: it and it's overconfident and it hallucinates, so there's 561 00:33:34,400 --> 00:33:38,320 Speaker 1: all these challenges. You're going from having a conversation to 562 00:33:39,080 --> 00:33:44,280 Speaker 1: having systems that actually make decisions for you and take action. Now, 563 00:33:44,320 --> 00:33:47,040 Speaker 1: some of it is a group decision where there's still 564 00:33:47,120 --> 00:33:49,560 Speaker 1: a human in the loop, and some of it is not.
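The human-in-the-loop distinction Bob draws here can be made concrete with a small sketch: an agentic step is one where the system proposes an action and may execute it, with or without a person approving first. Everything below (the propose_action stand-in, the tool, the arguments) is invented for illustration; it is not any particular agent framework.

```python
# Hypothetical sketch of one human-in-the-loop agent step (illustrative only).
def propose_action(goal: str) -> dict:
    """Stand-in for a model call that returns a proposed tool invocation."""
    return {"tool": "send_email",
            "args": {"to": "client@example.com", "subject": goal}}

def run_tool(action: dict) -> None:
    print(f"Running {action['tool']} with {action['args']}")

def agent_step(goal: str, human_in_loop: bool = True) -> None:
    action = propose_action(goal)
    if human_in_loop:
        # The group decision Bob mentions: a person vets the action first.
        answer = input(f"Approve {action['tool']}({action['args']})? [y/N] ")
        if answer.strip().lower() != "y":
            print("Rejected; nothing ran.")
            return
    run_tool(action)  # with human_in_loop=False, the system acts on its own

agent_step("Schedule the quarterly review")
```

The single boolean is the whole distinction: flip it off and the same loop becomes the fully autonomous case Bob flags as potentially dangerous.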
565 00:33:49,960 --> 00:33:52,280 Speaker 1: But I would say over the next two years, this 566 00:33:52,360 --> 00:33:58,240 Speaker 1: is really the beginning of practical agentic systems. And that's 567 00:33:58,400 --> 00:34:04,160 Speaker 1: really interesting and potentially really dangerous, because you could have 568 00:34:04,240 --> 00:34:08,400 Speaker 1: systems making decisions. For example, you know, I work with 569 00:34:08,440 --> 00:34:11,279 Speaker 1: the army, so there are some systems that could make a 570 00:34:11,280 --> 00:34:14,680 Speaker 1: decision to kill somebody, and what's that about? I would 571 00:34:14,760 --> 00:34:18,279 Speaker 1: say it's too early to be doing that, but I 572 00:34:18,280 --> 00:34:21,880 Speaker 1: think that'll happen within the next two years, and I 573 00:34:21,920 --> 00:34:24,879 Speaker 1: think that's the big shift. The other big shift is, 574 00:34:25,160 --> 00:34:28,200 Speaker 1: how is the conversation going to happen? I think within 575 00:34:28,239 --> 00:34:30,760 Speaker 1: the next two to three years we're going to see 576 00:34:30,880 --> 00:34:36,759 Speaker 1: different manifestations for the conversation. You know, right now, I 577 00:34:36,840 --> 00:34:39,879 Speaker 1: mentioned I've got a dedicated screen that says Stretch at 578 00:34:39,880 --> 00:34:43,160 Speaker 1: the top, and when I want to talk to Stretch, 579 00:34:43,440 --> 00:34:46,279 Speaker 1: I click a little button and it shows up on 580 00:34:46,320 --> 00:34:49,239 Speaker 1: my screen here, and I talk to Stretch, and then 581 00:34:49,480 --> 00:34:53,040 Speaker 1: on the screen there's a recording of everything that has happened. 582 00:34:53,440 --> 00:34:56,160 Speaker 1: And then I switch back. Because I'm a writer, that's 583 00:34:56,239 --> 00:35:01,160 Speaker 1: my primary medium. I just use the verbal for the 584 00:35:01,200 --> 00:35:05,840 Speaker 1: more expansive side of the activity. I never learned dictation, 585 00:35:06,480 --> 00:35:08,520 Speaker 1: so I'm not all that good at that. But I 586 00:35:08,560 --> 00:35:12,080 Speaker 1: do find if I'm just starting to think about something, 587 00:35:12,120 --> 00:35:14,160 Speaker 1: it's easier for me to talk it through than it 588 00:35:14,200 --> 00:35:17,160 Speaker 1: is to type it. So I like talking to Stretch 589 00:35:17,360 --> 00:35:20,640 Speaker 1: at that level. Now, that's pretty crude, and I'm pretty 590 00:35:20,640 --> 00:35:23,319 Speaker 1: sure even three years from now, you'll look back and 591 00:35:23,400 --> 00:35:26,120 Speaker 1: it's going to seem kind of silly to be talking 592 00:35:26,160 --> 00:35:30,480 Speaker 1: to this little wavy thing on my MacBook Pro while 593 00:35:30,800 --> 00:35:33,919 Speaker 1: Stretch is transcribing and going back and forth. I think 594 00:35:34,000 --> 00:35:36,440 Speaker 1: there's going to be some different interface that's going to 595 00:35:36,480 --> 00:35:38,560 Speaker 1: happen within two to three years. I don't know what 596 00:35:38,640 --> 00:35:41,839 Speaker 1: it is, but it's going to be something. Maybe it's 597 00:35:41,880 --> 00:35:45,360 Speaker 1: a separate device, a conversational device. You know, in Japan 598 00:35:45,600 --> 00:35:50,000 Speaker 1: there are these empathetic, cuddly little bears, and those could be for kids, 599 00:35:50,080 --> 00:35:52,720 Speaker 1: or they could be for elders.
I've got a family 600 00:35:52,760 --> 00:35:56,600 Speaker 1: member that provides way too much detail when I talk 601 00:35:56,719 --> 00:35:59,279 Speaker 1: to this family member, and I was just thinking the 602 00:35:59,360 --> 00:36:02,520 Speaker 1: other night, it would be great to have this family member 603 00:36:02,880 --> 00:36:07,400 Speaker 1: have a chatbot that could talk with her and have 604 00:36:07,520 --> 00:36:10,560 Speaker 1: these great conversations. And every once in a while say, Bob, 605 00:36:10,600 --> 00:36:15,160 Speaker 1: you should hear this, you should pop into the conversation. 606 00:36:17,600 --> 00:36:20,600 Speaker 1: And you know, that sounds cynical and it sounds disrespectful, 607 00:36:20,640 --> 00:36:22,600 Speaker 1: but I'm really not meaning it that way. I think 608 00:36:23,040 --> 00:36:26,560 Speaker 1: sometimes people just want somebody to talk to, like I 609 00:36:26,640 --> 00:36:28,040 Speaker 1: did when I had pneumonia. 610 00:36:28,320 --> 00:36:30,840 Speaker 2: Oh, Bob, I don't know where the time has gone, 611 00:36:30,920 --> 00:36:34,399 Speaker 2: but it has just been so absolutely fascinating hearing your 612 00:36:34,440 --> 00:36:36,320 Speaker 2: perspective on all these things. 613 00:36:36,440 --> 00:36:39,279 Speaker 3: I'm so glad that Scott made the connection. So thank 614 00:36:39,320 --> 00:36:40,600 Speaker 3: you so much for your time, Bob. 615 00:36:40,680 --> 00:36:42,680 Speaker 1: You're welcome, well, thank you for what you're doing. 616 00:36:42,760 --> 00:36:46,560 Speaker 1: It's important to have someone ask such good questions and 617 00:36:46,600 --> 00:36:49,319 Speaker 1: such probing questions. So maybe the next time I'll bring 618 00:36:49,360 --> 00:36:52,120 Speaker 1: Stretch on with me and you can interview both me 619 00:36:52,200 --> 00:36:52,720 Speaker 1: and Stretch. 620 00:36:53,480 --> 00:36:59,359 Speaker 2: I would love that. That was Bob Johansen showing us 621 00:36:59,560 --> 00:37:03,759 Speaker 2: that the AI-augmented future isn't science fiction, it is 622 00:37:03,840 --> 00:37:06,759 Speaker 2: already here, and the leaders who thrive will be the 623 00:37:06,760 --> 00:37:10,920 Speaker 2: ones who treat AI as a collaborator, not a competitor 624 00:37:11,040 --> 00:37:15,400 Speaker 2: or a threat. For me, the standout idea was Bob's 625 00:37:15,440 --> 00:37:19,080 Speaker 2: take that we should be thinking about artificial intelligence as 626 00:37:19,200 --> 00:37:22,680 Speaker 2: augmented intelligence, and that's a mindset shift that we can 627 00:37:22,800 --> 00:37:25,560 Speaker 2: all start practicing today. Now, 628 00:37:25,600 --> 00:37:27,320 Speaker 3: if this conversation has sparked 629 00:37:27,040 --> 00:37:29,960 Speaker 2: ideas for you and you want to upskill yourself further 630 00:37:30,160 --> 00:37:33,080 Speaker 2: in AI, you will probably like some of the episodes 631 00:37:33,080 --> 00:37:36,600 Speaker 2: I've released on how to turbocharge your AI skills. A 632 00:37:36,680 --> 00:37:39,359 Speaker 2: really great place to start is the conversation I had 633 00:37:39,360 --> 00:37:43,239 Speaker 2: with Neo Applin on turning yourself from an AI gunslinger 634 00:37:43,440 --> 00:37:45,000 Speaker 2: to an AI architect. 635 00:37:45,320 --> 00:37:46,719 Speaker 3: And if you've got no idea what
636 00:37:46,680 --> 00:37:49,920 Speaker 2: I'm talking about, you should definitely listen to that episode, 637 00:37:49,960 --> 00:37:53,720 Speaker 2: because it will completely change how you interact with AI. 638 00:37:54,600 --> 00:37:57,080 Speaker 2: And if you know someone who's still thinking of AI 639 00:37:57,239 --> 00:38:00,439 Speaker 2: as just another tech tool, share this episode with them. 640 00:38:00,719 --> 00:38:02,520 Speaker 2: It might just change the way they see the future. 641 00:38:03,000 --> 00:38:05,120 Speaker 2: And don't forget to follow How I Work so that 642 00:38:05,160 --> 00:38:08,399 Speaker 2: you can catch every new conversation. If you like today's show, 643 00:38:08,520 --> 00:38:11,400 Speaker 2: make sure you hit follow on your podcast app to 644 00:38:11,400 --> 00:38:13,560 Speaker 2: be alerted when new episodes drop. 645 00:38:14,120 --> 00:38:15,400 Speaker 3: How I Work was recorded 646 00:38:15,440 --> 00:38:18,080 Speaker 4: on the traditional land of the Wurundjeri people, part of 647 00:38:18,120 --> 00:38:18,839 Speaker 4: the Kulin Nation.