Speaker 1: This is Dana Perkins and you're listening to Switched On, the BNEF podcast. Imperfection, or the idea of something being imperfect, is usually considered a negative. In business, in life, in art, in sport, perfection, or as close as possible, is normally the objective. But is there something to be said for embracing imperfection, figuring out what we can learn from failures or just incomplete information, and what mindsets and tools we need to utilize imperfect outcomes in the business world? So in today's episode, I speak with Charles Conn. Charles is an investor, environmentalist and entrepreneur. He's the former CEO of the Rhodes Trust in Oxford and the current chair of Patagonia. Aside from his business and academic experience and positions, he's also an author, having previously co-written Bulletproof Problem Solving, and Charles is here today to talk about his latest book, The Imperfectionists: Strategic Mindsets for Uncertain Times. Together we discuss a range of topics, and of course we give them the BNEF spin by looking at decarbonization and the transition, and how this methodology might be applied. We look at the different elements that make up the imperfectionist mindset and go through some of the practical ways it can be applied to business, along with AI learning and how it's being harnessed to aid decision making. Now, as always, if you like this podcast, make sure to subscribe so that you get updates when there are future episodes, and give us a review on Apple Podcasts or Spotify to make it more discoverable to others. But right now we get to have a conversation with Charles about imperfectionism.
Speaker 1: Charles, thank you very much for joining me today.
Speaker 2: Dana, it's such a pleasure to be here.
Speaker 1: So we're here really because you have recently written, or co-written, a book, and having read it from cover to cover myself, there is a lot of application and learnings for the industries that we cover regarding decarbonization and the transition. And I really want to start with your motivations. I want to start with, well, actually, your relationship with your co-author. So you previously wrote a book called Bulletproof Problem Solving: The One Skill That Changes Everything. Why was it time to write a second book together?
Speaker 2: Yeah. So the first book was a toolsets book about problem solving, how to break complex problems apart and solve them creatively, which is, I think, what both Rob and I would say is our life's work. And in a world where you have increasing automation and artificial intelligence, the one thing that humans can really do is work together to solve complex problems creatively. We're still better at that than the AI routines. The reason we wrote the second book, which is a mindsets book, is in the depth of the pandemic, as things were changing so quickly, it became clear to us that the first book didn't address problem solving under very high uncertainty well enough, and so what we wanted to write was a book that helped people think about what to do when things are changing very quickly. When things are changing quickly, people tend to do one of two things. They paralyze, and we see this a lot in company managements now; they freeze and they want to wait for stasis. Stasis isn't coming. Or they do leap-before-you-look moves, where people panic and they do something big that's irreversible and difficult. And we wanted to show people there's a way to lean into risk and to be comfortable solving problems even when things are changing.
Speaker 1: And with that, you said it's a mindsets book, so there are six different mindsets that you go through in here.
Let's go through a quick overview so that we can then drill down into some of the individual ones.
Speaker 2: Sure, so let's just do a thirty-thousand-foot view. So when things are changing really quickly, the most important orientation is curiosity. And it sounds incredibly obvious, but as we get older, we forget to be curious. As we get good at what we do, we get into ruts. And those ruts are even like tying your shoe: you don't think about it. And because you don't think about it and you're not curious about it, you're not open to other ideas. Second, and very much a sister mindset, we call dragonfly eye, which is an idea we borrowed from Philip Tetlock, who's written so beautifully about superforecasters. What that means is to make sure to see things through multiple perspectives before you make up your mind about what strategic path you're going to take. We sometimes call it environment vision, which means thinking about problems through the perspective of your customers or your suppliers, or a potential competitor, rather than just thinking about things through your own industry lens, which is how we tend to do things. The third, which should be familiar to many people but probably not with this name, we call occurrent behavior, which is what actually happens rather than what you hope will happen. And for us, that is an experimentalist mindset. And I think many people think that we can only do experimentation in light industries like internet, but we think it's just as important to do experimentation in heavy industry of the sorts that you often pay attention to.
The fourth mindset we call collective intelligence, which is how we can reach outside the boundaries of our own organizations and crowdsource in great ideas. Most big professional industries are loath to do that, because the assumption is that the smartest people are already in the room, and therefore we miss out on that idea of peripheral vision, where technologies or ideas from another industry that could be applicable we just miss entirely. The fifth mindset we call show and tell, which is really that sort of kindergarten idea that, rather than just create a PowerPoint slide, you actually tell a story. If you want to rally people around your ideas to do something radically different, and of course the whole climate change world requires us to do things that are radically different, you need to speak to their hearts, not just to their minds, and you need to speak to values. And it's very important that we all learn how to be better storytellers. And then the one mindset that brings all of those together we call imperfectionism, which is really about stepping into risk rather than being paralyzed, and using small steps in order to build your confidence, to build your understanding, to build capability, sometimes to build asset positions, so that you actually move toward your goal without having some grand strategy, because when things are changing so quickly, grand strategies the way we used to construct them don't work anymore.
Speaker 1: I think you've got us at curious, certainly, because one of the things I hear over and over again from listeners when I run into them, you know, out there in the real world and not in the studio, is that what they tend to tune into this podcast for is to better understand things that are not perfectly in their field of vision. So hopefully this will be a journey for them to think about how they approach work in a very, very different way.
One of the things that is very clear from the work that you have done in this book is to highlight a number of different company examples. And just at the fundamental level, how many case studies would you say that you read on a monthly basis? Because it must have been hard to choose.
Speaker 2: Yeah, well, certainly more than one hundred. You know, that's the food for our work, is case studies, because that's what allows us to see through these multiple lenses. Grand theory is kind of boring; it's empty until you can make it real via case studies. So I think there's both that deductive problem solving, where you come from a big idea to the specific, and then inductive, where you go from the specific to knit together a bit of a bigger idea. I guess we love induction.
Speaker 1: There are a lot of examples that I would say are more in the B2C space in this book, and a lot of the solutions that I'm looking at are really focused on this transition of industry, this transition in energy, which is more on the B2B side. I guess within the different mindsets, are there some that you think are more applicable to the B2B space, or is really the entire way of looking at all six really relevant? And are there any examples maybe that you may drop in that you think would be really useful to the B2B community?
Speaker 2: Yeah, absolutely. So the first thing I would say is there's no B2C bias in this way of thinking. And what I would say about heavier industries, which tend to be the bigger emitters, is there are no easy answers, because they do involve enormous capital expenditures; that's the characteristic of heavy industry. But I think it's too easy to say that those are not amenable to these kinds of imperfectionist approaches to strategy. And I'll give you one example.
When we talk about experimentation, folks often like to think that that's only relevant when you have, for example, media or internet, light-investment industries. So people would say you can do A/B testing with two different website designs and you see which one attracts more people. That's easy and doesn't cost a lot. But if we were to look at, for example, one of the heaviest industries of all, which is space, we have a good example of an experimental company, right, which is SpaceX. SpaceX picked up where NASA left off. And what did they do? They massively increased the number of launches per year. NASA was doing two or three or four launches a year; SpaceX now does twenty or thirty launches a year. NASA tried to engineer everything double-triple heavy. SpaceX has been deliberately experimental, sometimes spectacularly so. You remember the, what do they call it, unplanned disassembly on the most recent large rocket launch, right? Crazy to take multimillion-dollar launches and view them through an experimental lens. But because they've viewed it through an experimental lens, they've pioneered, for example, 3D printing of rocket parts, or reusable rocket parts like the nose cone they catch in a net, which saves a huge amount of money, or using new materials that come from other industries as heat shields. These are all ideas that have been pioneered by SpaceX, and the frequency of launch, you know, they have what they call fly, test, fail, fix as a mentality for their engineers. They've been able to drive massively down the cost curve. So it used to cost fifty-five thousand dollars to put a kilogram into space with NASA. Now it costs literally a twentieth of that with SpaceX to send that same kilogram into space. That's heavy industry, that's an experimentalist mindset. The two can go together.
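For readers who want to see what the website-style A/B test Charles mentions actually computes, here is a minimal sketch in Python. The two designs, their visitor and conversion counts, and the function name are all hypothetical; the test itself is a standard two-proportion z-test.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare conversion rates of two website designs with a two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)              # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se                                  # positive favors design B
    # two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical traffic split evenly between two designs
p_a, p_b, z, p = two_proportion_z_test(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}  p = {p:.3f}")
```

The SpaceX point is that the loop is the same even when each trial is a rocket: define the metric, run the comparison, keep what wins.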
Speaker 1: It's incredible to see these cost declines as you're pointing them out. One of the things that occurs to me, though, is that this is a company that essentially grew up on its own to tackle this issue. It didn't come from within; it didn't come from the inside. And you have another example in the book about Ford Motor Company and how they ended up making their electric vehicle division separate from their internal combustion division, in order to give them that perspective, to think about things with a fresh view. Let's go into an example that wasn't in the book, but certainly an industry that you have experience with, which is oil and gas, or this umbrella space that is the energy companies. So many of the people who are making the decisions have been in these companies for a long time and very rightfully have deep expertise. Kind of what advice or views do you have on how they might be able to think about pivoting their business and experimenting in a way that will be in line with a drive to decarbonize?
Speaker 2: Yeah, and Dana, these are the hardest ones, right? Anytime you have a specialist industry where people have to work for many years just to learn the basics before they can become useful in the industry, and then they need to build industry experience, by definition you're dealing with a deep trench rather than an experimentalist or a multi-lens viewpoint to begin with. And medicine has the same characteristics as oil and gas, I would say. It's all the more reason why these mindsets are terribly important if you want to create innovation inside conventional energy. And let me give both examples and then processes. One of the things we've learned is if you just follow the existing processes in a business, you'll get the same answers. And one of the things we like to do is use workshops where people do, for example, what's called perspective taking. So before you launch into a new strategic plan, you actually step back, and this is this idea of dragonfly eye, where you look at your industry.
You know, whether you're doing upstream exploration or you're doing refining, for example, or distribution, and see that industry or that segment through the lens of your customer, your supplier, or, for example, Greta Thunberg. That's a very different perspective. How would you see yourself? And that puts you outside. We call it anchoring outside; it's a term we really like. Anchoring outside gives you a better perspective and frees you to think differently than you might have done before. So I'm going to give a couple of examples that we can make up from oil and gas. So flare gas is one of these persistent problems in oil and gas. You have a remote location, you're producing liquids, gas comes up, you don't have any way of handling gas. What do you do? Well, you flare it, right? And we still see that all around the world. An idea that I saw recently, and I don't, you know, I'm not an expert in this, but I just thought it was interesting, which is a group of folks who had been working in server farms. So these are the enormous computer server farms that we use for this internet-driven economy that we're in today, which are usually located near big cities. What if you were to locate server farms close to where we're flaring gas, and the heating and cooling systems, the electricity generation and cooling systems that are required, could be powered using that gas instead of flaring it? So co-locating an industry that's a heavy user near where you're otherwise having to burn gas without any value. It's just a different way of thinking about it. You do have to think about transmission lines; there's a whole bunch, you know, for the data that comes out of data rooms. But it's kind of a cool idea that you would get from thinking differently.
Speaker 1: And there was another example that you brought up in the book that was around water pipelines and water pipe failures, and the first thing that really occurred in my mind, maybe almost too literal: when I'm thinking of pipes, I'm thinking about actually, like, methane gas leaks, and I'm thinking about the fact that, you know, increasingly satellites are picking up on where this is coming from, and hopefully then leading us to solutions on what we can do about it. But one of the things that you really drilled down on in this specific case around water pipe failures had to do with AI, or probably machine learning, depending on how you want to refer to it. Really, where do you see the potential for application of machine learning in helping us find solutions to these big problems?
Speaker 2: Yeah. And on the way into our conversation, we talked about how complex this is, right? The world of decarbonization and of energy transition, there are no silver bullets. You have to find innovation in many small things. AI and machine learning will become hugely important for this world, because pattern recognition in incredibly complex systems is something that the machine does better than us. I love your example. So in the water pipe example, you tend to think in a very mechanistic way, FIFO, right, first in, first out. So you'd assume that the oldest water pipes or methane pipes are the ones that you should be replacing, because that's where failure is going to occur. Turns out that's not a very good model. What the artificial intelligence, and this was an actual example, a mathematics professor in Australia looking at the water pipe system in a city like Sydney or Melbourne, came up with were non-parametric artificial intelligence models that used other clues, and almost by definition non-parametric models, so you didn't define in advance, for example, that age of pipe is the key determinant. With those kinds of models, they ended up being able to predict failure points much more accurately, because they were often in unexpected places. And I think the same would likely be true with methane, or with even more slippery gases like hydrogen. And so I think we're going to find a lot of our future solutions using artificial intelligence. And the good news there is, of course, these oil and gas engineers will find that very comfortable, because it's sourcing from something that's quite adjacent to what they do already.
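As a rough illustration of the non-parametric approach Charles describes, here is a sketch in Python using a random forest. The pipe features, the simulated failure mechanism, and every number are hypothetical, not taken from the Australian study; the point is only that nothing in the model hard-codes age as the driver.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Hypothetical pipe attributes: nothing says age must dominate.
X = np.column_stack([
    rng.uniform(1, 100, n),        # age in years
    rng.uniform(50, 600, n),       # diameter, mm
    rng.integers(0, 3, n),         # material code (0=cast iron, 1=PVC, 2=steel)
    rng.uniform(0, 1, n),          # soil corrosivity index
    rng.uniform(1, 12, n),         # operating pressure, bar
])
# Simulated ground truth: failures driven mostly by soil and pressure, not age.
p_fail = 1 / (1 + np.exp(-(3 * X[:, 3] + 0.4 * X[:, 4] - 4)))
y = rng.random(n) < p_fail

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

print("holdout accuracy:", round(model.score(X_te, y_te), 3))
# The forest reports where the signal actually is, instead of assuming FIFO.
for name, imp in zip(["age", "diameter", "material", "soil", "pressure"],
                     model.feature_importances_):
    print(f"{name:>9}: {imp:.2f}")
```

Because the model learns the determinants from the data, the importances recover soil and pressure here, which is exactly the contrast Charles draws with the replace-the-oldest-pipes heuristic.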
Speaker 1: So climate change is a shared problem that is going to impact everyone, and not in equal measure, depending upon where and which companies or countries are the emitters. So when we think about the fact that this is a shared concern, is there potential then for it to also be a way for us to actually share the creation of the solutions? And one of the things that you point out in this book is, well, open source technology and then also Joy's law. Can you kind of go into that a little bit?
Speaker 2: Sure. So Bill Joy was a founder of Sun Microsystems, one of the first big companies that used Unix, and not surprisingly, because he'd been at Berkeley before, where he was part of developing Unix, which originated many years before in a joint project with AT&T and some other companies. Unix is a wonderful example of where Joy's law comes from. Bill Joy said the smartest people may not be in your room; they may be laboring in someone else's garden. This is the core idea behind Joy's law: how do you access them so that they can contribute to your project?
The example that 325 00:16:58,400 --> 00:17:02,080 Speaker 2: he used was Unix, which was open source, meaning that 326 00:17:02,200 --> 00:17:06,159 Speaker 2: engineers from software engineers from all around different companies and 327 00:17:06,240 --> 00:17:09,560 Speaker 2: academic environments could actually contribute to the building of this 328 00:17:09,800 --> 00:17:12,879 Speaker 2: core infrastructure of software that the kernels of which are 329 00:17:12,920 --> 00:17:16,800 Speaker 2: still literally called kernels are in Microsoft operating system, in 330 00:17:16,840 --> 00:17:19,920 Speaker 2: the Apple operating system, and the operating systems of every 331 00:17:19,960 --> 00:17:24,040 Speaker 2: other major computer language. But that idea of Joy's law 332 00:17:24,119 --> 00:17:27,120 Speaker 2: and open source can be applied much more broadly. One 333 00:17:27,119 --> 00:17:31,359 Speaker 2: of the most famous competitions was the Flying NonStop across 334 00:17:31,400 --> 00:17:34,399 Speaker 2: the Atlantic competition, which was won by Lindberg back in 335 00:17:34,400 --> 00:17:39,080 Speaker 2: the nineteen thirties. That idea of using prize competitions, for example, 336 00:17:39,280 --> 00:17:44,040 Speaker 2: to attract creativity, often from other industries, is another way 337 00:17:44,080 --> 00:17:48,880 Speaker 2: of crowdsourcing intelligence or ideas or technologies from outside. We've 338 00:17:48,880 --> 00:17:51,080 Speaker 2: seen that with the X Prize, which has led to 339 00:17:51,119 --> 00:17:53,959 Speaker 2: innovations in flight and a number of other areas. And 340 00:17:54,000 --> 00:17:58,040 Speaker 2: we've seen it with gamified platforms like Cagele, which have 341 00:17:58,160 --> 00:18:02,000 Speaker 2: allowed the crowdsource seeing of great ideas. So an example 342 00:18:02,000 --> 00:18:03,879 Speaker 2: that we use in the book That I Love is 343 00:18:03,920 --> 00:18:08,000 Speaker 2: the Nature Conservancy was trying to make innovations in how 344 00:18:08,040 --> 00:18:12,800 Speaker 2: to reduce bycatch of endangered fish species. Really complex problem 345 00:18:12,800 --> 00:18:15,320 Speaker 2: because these fish are brought a board, either onlines or 346 00:18:15,320 --> 00:18:17,800 Speaker 2: in nets at sea, bumping up and down in the 347 00:18:17,800 --> 00:18:20,520 Speaker 2: worst weather. And you can put cameras aboard ships, but 348 00:18:20,600 --> 00:18:22,840 Speaker 2: how do you very quickly make an identification of a 349 00:18:22,880 --> 00:18:24,720 Speaker 2: fish that's okay to keep and a fish that you 350 00:18:24,720 --> 00:18:28,080 Speaker 2: should put back gently? And the Nature Conservancy didn't have 351 00:18:28,240 --> 00:18:32,159 Speaker 2: people internally who were experts in computer vision or machine learning, 352 00:18:32,240 --> 00:18:34,080 Speaker 2: and so they put up one hundred and fifty thousand 353 00:18:34,119 --> 00:18:37,280 Speaker 2: dollars prize on the Cagle platform, and they received more 354 00:18:37,320 --> 00:18:40,400 Speaker 2: than three thousand entries from different clever people who had 355 00:18:40,400 --> 00:18:43,560 Speaker 2: built algorithms for recognizing fish according to the shape of 356 00:18:43,600 --> 00:18:46,680 Speaker 2: a gill plate or a fin that worked remarkably well 357 00:18:46,680 --> 00:18:49,919 Speaker 2: to identify fish even in those difficult conditions. 
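To make the bycatch example concrete, here is a hedged sketch of the sort of model those Kaggle entrants would have built: generic transfer learning in Python with a pretrained network. The fish_frames folder, the class layout, and the training settings are all hypothetical, and this is not the winning competition solution.

```python
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

# Hypothetical folder layout: fish_frames/<species_name>/*.jpg
tfms = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("fish_frames", transform=tfms)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

# Start from a pretrained backbone; only retrain the final layer.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

opt = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(3):
    for images, labels in loader:
        opt.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        opt.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```

The competition framing matters more than the architecture: thousands of entrants each tried a variant of this loop, and the prize sponsor only had to pay for the best one.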
Speaker 1: So when you think about a prize, a prize really sits outside of a company and really motivates individuals and these brilliant minds to tackle problems in a different way. Do you think that maybe, in some respects, net zero targets, or, I mean, are there other ways essentially to motivate companies to want to get involved in this? Because I think about the fact that we can go back to oil and gas, or we can call upon any of the heavy industries; really, they're still companies, they're competitive with one another. There are opportunities to collaborate on solutions, but in reality there's an element to this in that not only do you need to have the best solution for your customers, but you want to maintain market share. Are net zero targets a good proxy for a prize when it comes to trying to think about motivating large companies?
Speaker 2: I think that's a really cool idea. And my guess is that net zero targets are, or will become, that same kind of motivator. Well, we found at Patagonia that a net zero target is quite distant, even for a company like Patagonia. So when we break that down into nearer targets that we can actually achieve, it's more motivating.
So I'll give you an example. That wonderful rain jacket that you wear here in London when you're riding your bike to work, which sheds water in just a remarkable way, does so because it uses really dangerous chemistries. Yet at Patagonia we've said, by twenty twenty-five we are not going to use any of those chemistries. And over the course of the last four years, that's been a huge motivator for us, because the family that owned Patagonia, until recently, when they gave it away to fight climate change, said: if we can't find non-dangerous chemistries for rain jackets, we're out, we're going to stop selling them. And so the internal team worked with external organizations like the Gore organization, which is a wonderful fabric chemistry company, and they've cracked it. They found a way to use less dangerous chemistries to create equally water-shedding fabrics. That's incredibly motivating. But it was motivating because it was a target that was almost out of our grasp, but not out of our grasp, whereas a net zero target for an oil and gas company might feel too distant. Well, we'll chip away at it by investing a bit in wind.
Speaker 1: How do you, I guess, deal with ethics questions in this? And the reason I bring up ethics questions is that at the center of this is really experimentation, trying things, maybe going out and doing it before you're quite ready. And a good parallel for that is, you know, any sort of autonomous driving. Right now, those vehicles are learning how to avoid traffic and deal with not just traffic but accidents, and in the long run the view is that it's going to save many, many more lives. Right now it already is saving maybe some lives, depending on how you're looking at the data. But the real question is, in that circumstance, how do we then deal with the fact that there's some randomness to the way human beings work, once we start relying on experimentation as a way of getting us to that ultimate end?
Can you really experiment when you're dealing with human beings?
Speaker 2: Well, I mean, that's such a huge question. I think it's a wonderful question too. I don't have a silver-bullet answer, but let me give you two thoughts. One is, when you can do an experiment without putting living creatures in harm's way, you should do that. And so, for example, I do a lot of investing in the biotech space. Sometimes you have to put a mouse in harm's way in order to find out what's going on. But increasingly you can do organ-on-a-chip, which are approaches that don't involve sacrificing animals or putting people in harm's way. So whenever you can use an alternative technology, you should do it. But there are cases where we introduce technologies that do have some risk to humans, and the question is how we can reduce the risk to humans. I'll use an old example. We could massively reduce highway deaths if we went back to fifty-five miles an hour, and we don't, and that's a trade-off, and everybody knows it's a trade-off. Because people want to get where they're going a little bit faster, we accept slightly higher highway deaths. So that is an example like your autonomous driving example. I guess between those two examples, saving the mouse and highway deaths, we have to ask: is autonomous driving at the state yet where we can take that incremental risk, as we do with speed limits, or is it still at the state where we're better off having a non-living creature in the seat? When I was working in Oxford, a lot of the autonomous driving work was being done, a lot of the stuff that in fact informs the industry today. Very little of that put people in harm's way.
Speaker 1: Which then deals with, you know, how do you do things when there's so much uncertainty, and you have to move your way up these five different levels of uncertainty which you bring up. And so, I guess, taking a bit of a turn, so to speak, in terms of the subject matter: when we get back into, let's say, physical risk, and we think about going forward with climate change, change is really at the center of it, right? We are increasingly not able to actually look at weather data. How does one really grapple with these different, well, I guess, first outline the five different levels of uncertainty, and really where you think the most experimentation can live.
Speaker 2: Yeah, there are various ways to categorize uncertainty. In the book, we talk about one framework that was developed a number of years ago, where you have known knowns, which is the easy level; known unknowns, that is, you know what you don't know; ultimately up to unknown unknowns, which, you know, in the nineteen sixties were called "unk unks," where you literally don't know enough to even characterize the nature of the uncertainty. Obviously, those are the hardest places to operate. I think in the middle, uncertainty levels two and three, you really can use experimentation a lot to learn, especially if you can do that in a way, again, that's safe. One of the things we talk about, the core idea in the book, is that an imperfectionist approach loves to take steps forward into risk if you can do it in ways that are reversible, that is, if you don't like where you got to, you can go back through the door, or where the consequences are relatively low rather than existential. And so you can use this fundamentally experimentalist or imperfectionist approach to explore level two, level three, and even level four risk, as long as you're doing so in ways that aren't existential.
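The reversible-steps idea can be made concrete with a toy simulation. This Python sketch compares one irreversible double-or-nothing bet with a series of small stakes plus a walk-back rule; the payoffs, probabilities, and the 70-unit floor are invented purely to illustrate how reversibility caps the downside.

```python
import random

random.seed(0)
TRIALS = 100_000

def big_bet(budget=100.0, p=0.5):
    """One large, irreversible move: double or nothing."""
    return budget * 2 if random.random() < p else 0.0

def small_steps(budget=100.0, p=0.5, steps=10, floor=70.0):
    """Imperfectionist path: stake small slices, retreat if losses mount."""
    stake = budget / steps
    for _ in range(steps):
        budget += stake if random.random() < p else -stake
        if budget < floor:        # reversible: go back through the door
            break
    return budget

wiped_out = sum(big_bet() == 0 for _ in range(TRIALS)) / TRIALS
below_floor = sum(small_steps() < 70 for _ in range(TRIALS)) / TRIALS
print(f"big bet wiped out: {wiped_out:.1%}   small steps below floor: {below_floor:.1%}")
```

Even with identical odds per trial, the stepwise path almost never ends in ruin, which is the sense in which small reversible moves let you explore higher levels of uncertainty without taking existential risk.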
Speaker 1: And we break down big problems into increasingly small problems so that we're...
Speaker 2: Able to, precisely. And as you fail, so experimentation means not just winning but losing, you make sure to consolidate those lessons, and that's what ultimately builds organizational capability. Organizational capability doesn't come just from insourcing; a lot of it comes from learning. Learning comes from making mistakes. One of the most important messages of the book, especially for the heavy industries that are the biggest emitters, is not to be so afraid of failure and not to punish failure in our frontline teams. Engineering cultures hate failure, and we often punish people when their projects don't work, and I think we need to change that mentality in industry. In science we accept that all the time, and we write up our results, perhaps not as much as we should, and there is a survivor bias in papers too. But especially once we get to heavy industry, where the investments are significant, we often criticize or punish or change the compensation of teams that have good ideas that don't work. We need to make sure good ideas that don't work are celebrated too.
Speaker 1: And in that, there's also articulating what's happening to the rest of the world. So one of the mindsets is show and tell, and explaining this to the world. And actually, first question within that: would you consider yourself a storyteller?
Speaker 2: I sure hope so. I think the most compelling people in the world are the storytellers. You know, when you think about David Attenborough, you listen to him because he tells the stories in a way that sort of brings it home to you, rather than in the sort of thirty-thousand-foot science. I think all of us would be better storytellers if we didn't just think about the logic, what's between our ears, but we also thought about what's in our hearts and our values. The best storytellers are ones that link our reason and our values.
Speaker 1: And you reference this one study in the book that called upon the US specifically, saying that when individuals were asked where climate ranked in their concerns, it was third, but when they thought about their peers, they thought their peers ranked it at thirty-third. So the question is, I guess: are we, in the world of those that are actually looking at climate solutions, doing a good enough job of telling the story?
Speaker 2: I don't think we are, and I think, unfortunately, we've done the thing that's hardest for people to act on, which is to speak about a future state that's uncertain and that is catastrophic. Very difficult to know what to do with that. We move forward in the world by imperfectionist steps, by experimentation, and when we don't give people something that they can do tomorrow that's different, all we create is fear and no forward movement. And so, and perhaps this kind of program that we're discussing right now is exactly the right way to start, each of us can do behavior changes that help move us down this path, and which give us more information, more data, to move us further down the path.
Speaker 1: And you reference data. Certainly the people I work with and I like data a lot, and data is certainly really inherent to being able to make good business decisions and create those bets, if you will, about how to move forward. But increasingly there's a distrust of science and data, and maybe storytelling is the solution to that. But for someone who's not in the business community and is maybe looking at this problem and paralyzed by fear, as you outline, where does data come in for them? And is it being too heavily relied on as the story?
Speaker 2: Yeah, the people who control the biggest levers around decarbonization and energy transition don't tend to be the people who speak from their hearts. They tend to be the people who speak from what's between their ears.
And I do think, especially in these industries, we need to get better at speaking to where people are now and starting there. And you can build bridges with every kind of person if you start with where they are, if you understand what they value. In the book we talk about this: almost everybody cares about their kids, probably everyone cares about their kids. If you start with the future for our children, you actually have a bridge that can work with almost everybody.
Speaker 1: I certainly feel that personally, so very good example. Okay, so we're going to go through a couple of things that I would put in this category of whether or not there's something that you're watching closely or perhaps ignoring for the moment. So in watch or ignore, and pulling upon also your experience from Patagonia, where would you put circular economy?
Speaker 2: I think it's a watch and lean in, right. I really do think that this is critical, which is, when we purchase something, you know, you have a bicycle, you have clothing, that we think about where it goes after we're done using it. It's as simple as that, and of course the people who manufacture it should also be thinking about it. At Patagonia now, when we make a jacket, we make it so that it can be disassembled. There was a time when we had all these cool welding technologies where we glued everything together using heat and chemistry, and then we realized, oh, we can't recycle that because you can't take the zipper out. And so if both manufacturers and users, consumers, can think about where it goes after you're done with it, I think that's absolutely critical.
Speaker 1: Okay, watch or ignore: biodiversity targets?
Speaker 2: For corporations, you're going to hate this answer, but I think the answer is ignore. I just think that very few corporations are in a position to pay attention to the biodiversity of species.
Much more important for them 596 00:30:51,520 --> 00:30:54,680 Speaker 2: is to work toward net zero goals, because it's climate 597 00:30:54,760 --> 00:30:57,880 Speaker 2: change that is putting most species at risk. Climate change, 598 00:30:57,880 --> 00:31:00,680 Speaker 2: and then, of course, I think the escape of dangerous chemistries. 599 00:31:00,880 --> 00:31:04,240 Speaker 2: When you look at what's happening with frogs and other amphibians, 600 00:31:04,480 --> 00:31:07,240 Speaker 2: this is something that you can only affect by focusing 601 00:31:07,280 --> 00:31:09,840 Speaker 2: on the big levers, not so much on frog habitat, 602 00:31:09,880 --> 00:31:13,360 Speaker 2: because that's being destroyed by climate change and chemistry escape. 603 00:31:13,520 --> 00:31:17,560 Speaker 1: Okay, so a final watch or ignore: ESG financial ratings and rankings. 604 00:31:17,960 --> 00:31:20,600 Speaker 2: Can I create a third category, which is fix? So 605 00:31:21,000 --> 00:31:23,240 Speaker 2: I don't think you ignore it, and I think you 606 00:31:23,320 --> 00:31:26,320 Speaker 2: need to do more than watch it, because E, S and 607 00:31:26,400 --> 00:31:29,880 Speaker 2: G are all good ideas. But trying to create an 608 00:31:29,920 --> 00:31:33,320 Speaker 2: index which captures even one of those things for big 609 00:31:33,360 --> 00:31:37,280 Speaker 2: companies and complex operations is literally absurd. So having a 610 00:31:37,280 --> 00:31:40,520 Speaker 2: single rating for E for a company like McDonald's or 611 00:31:40,520 --> 00:31:43,160 Speaker 2: Coca-Cola or any of the big energy companies is 612 00:31:43,200 --> 00:31:45,959 Speaker 2: just silly. And most of the ESG ratings that go 613 00:31:46,040 --> 00:31:49,840 Speaker 2: along with stocks, for example, are meaningless; they're not consistent. 614 00:31:49,920 --> 00:31:52,960 Speaker 2: Even the rating agencies don't agree among themselves. That doesn't mean 615 00:31:53,000 --> 00:31:55,720 Speaker 2: the idea should be thrown out entirely. What it means 616 00:31:55,840 --> 00:31:58,040 Speaker 2: is that to be useful, we need to be much 617 00:31:58,040 --> 00:32:02,160 Speaker 2: more granular. At Patagonia, we have measures for environmental 618 00:32:02,200 --> 00:32:05,440 Speaker 2: sustainability for each one of the products that we produce. 619 00:32:05,600 --> 00:32:08,680 Speaker 2: We know the carbon that's emitted to produce it, 620 00:32:08,760 --> 00:32:10,920 Speaker 2: we know the water that's required, and we know the 621 00:32:10,960 --> 00:32:14,200 Speaker 2: dangerous chemistries, if any, that are used in manufacture. So 622 00:32:14,680 --> 00:32:17,640 Speaker 2: E, S and G need to become much more granular to 623 00:32:17,760 --> 00:32:20,600 Speaker 2: actually allow us to take action. And today you don't 624 00:32:20,600 --> 00:32:23,840 Speaker 2: see a lot of correlation between higher ESG scores and 625 00:32:23,920 --> 00:32:26,800 Speaker 2: higher returns or lower returns. We don't see a lot 626 00:32:26,840 --> 00:32:29,600 Speaker 2: of correlation at all, and the reason is those scores are 627 00:32:29,920 --> 00:32:32,400 Speaker 2: not meaningful in their current index form. 628 00:32:32,920 --> 00:32:35,160 Speaker 1: Charles, thank you very much for joining me today, and I 629 00:32:35,240 --> 00:32:38,200 Speaker 1: look forward to reading whatever the third collaboration is for 630 00:32:38,240 --> 00:32:39,760 Speaker 1: you and Robert someday in the future.
631 00:32:40,120 --> 00:32:42,000 Speaker 2: Dana, I had so much fun being here. Thank you. 632 00:32:51,360 --> 00:32:54,400 Speaker 1: BloombergNEF is a service provided by Bloomberg Finance 633 00:32:54,520 --> 00:32:58,000 Speaker 1: LP and its affiliates. This recording does not constitute, nor 634 00:32:58,000 --> 00:33:02,200 Speaker 1: should it be construed as, investment advice, investment recommendations, or 635 00:33:02,280 --> 00:33:05,800 Speaker 1: a recommendation as to an investment or other strategy. Bloomberg 636 00:33:05,840 --> 00:33:09,200 Speaker 1: NEF should not be considered as information sufficient upon 637 00:33:09,240 --> 00:33:12,840 Speaker 1: which to base an investment decision. Neither Bloomberg Finance LP 638 00:33:13,040 --> 00:33:16,440 Speaker 1: nor any of its affiliates makes any representation or warranty 639 00:33:16,520 --> 00:33:19,479 Speaker 1: as to the accuracy or completeness of the information contained 640 00:33:19,480 --> 00:33:22,600 Speaker 1: in this recording, and any liability as a result of 641 00:33:22,640 --> 00:33:24,520 Speaker 1: this recording is expressly disclaimed.