Malcolm Gladwell: Hello, hello. This is Smart Talks with IBM, a podcast from Pushkin Industries, iHeartMedia, and IBM about what it means to look at today's most challenging problems in a new way. I'm Malcolm Gladwell. In this episode, I'm speaking with Jim Whitehurst, senior advisor at IBM. In his time with IBM, as both an advisor and former president, Jim was responsible for the IBM Cloud and Cognitive Software organization and corporate strategy. Jim is an expert in open innovation. During his time as president of IBM, Jim embedded his philosophy into the company, helping clients and partners on their own digital transformation journeys. Today, we'll be talking about the ways open-culture companies can change the way we lead and work together.

Jim Whitehurst: I realized, hey, this isn't insanity or chaos. It's actually a different way to run an organization that's trying to seek its way into the future.

Malcolm Gladwell: Before coming to IBM, Jim was the president and CEO of Red Hat. Red Hat has been a leader in enterprise open source software, and while CEO, Jim helped it become the first open source software company to reach one billion dollars in revenue. Let's dive in.

Hi, Jim.

Jim Whitehurst: Hey, how are you?

Malcolm Gladwell: I'm good. Thank you for joining me today. There are so many things I want to talk to you about. Whenever I read the resume of someone like you, I always wonder, how did he get there? So tell me, can you give me a short version of how you ended up thinking about things like innovation and creating these kinds of receptive cultures? There must be a million paths in that direction. I'm just curious about what your path was.

Jim Whitehurst: You know, I graduated from college, thought I wanted to be in technology, interviewed with the Boston Consulting Group, loved client service. I never thought I would leave the craft of working with clients and working in small teams, which I loved. I didn't realize that was kind of the nub of how teams and innovation work. I was a partner there; I never thought I'd leave it.
Then, literally at noon on 9/11, the CEO of Delta called me and said, "I need you now to be my treasurer." And I said, "You know, you're out of your mind. I know nothing about being a treasurer." And he said, "Well, that's okay. Nobody in their right mind would loan us money right now. I just need kind of a creative, strategic person." So I joined Delta, literally at noon on 9/11, and then kind of ran through the bankruptcy restructuring, and then was approached by Red Hat. I had been a geek playing with Linux in my spare time for a long time, and I joined. I think that's where my interest really started. You know, Delta is considered a very well-run company; it's always on the list of the most admired companies. And I got to Red Hat and I thought, okay, I know what leadership looks like. And I'd say in the first month, I thought, this place is absolute chaos; I understand they brought me in here to clean it up, right? The problem at Delta is how do you run a tightly integrated, has-to-be-run-with-excellence operation. Red Hat's all about innovation. You know, there isn't a leadership style that solves every problem. There's a leadership style for, you know, driving efficiency and driving out variance in a static environment, which is what most leaders have been raised to do. And then there's a leadership style for trying to drive a faster pace of innovation, and those fundamentally look different.

Malcolm Gladwell: So wait, I'm curious about this. You go from Delta to Red Hat. If we sat down, Jim, and tried to name two companies that were more different, you couldn't do it. So how long did it take you to come to the insight you just spoke about? You go from A to Z and you say, whoa, this place looks different. Because you could have gone the other way, right? You could have said, I have to turn Red Hat into Delta. But you didn't.
I'm guessing nine out of ten people would have tried to turn Red Hat into Delta and would have flamed out. But you didn't. Why?

Jim Whitehurst: Well, I do think a big part of it was that I was also learning a new business, and so I literally spent the first six months on the road, traveling, seeing customers. And if that hadn't happened, I think you're exactly right, I would have said, no, my job is to bring in quote-unquote professional management. But I was traveling enough that I didn't focus on change. I was saying, let me learn the business, understand our clients, etcetera, etcetera. In that period of time, I saw a couple of amazing things happen. My second week on the job, I was being briefed on this area called virtualization. I'm getting briefed on the strategy and why we picked the technology we picked, and in the room was the head of engineering, a couple of people who worked for him, and all the way down to some engineers working on virtualization. They told me the strategy and why we picked the technology we did, and one of the engineers, the most junior people in the room in the briefing, said, yeah, this is what we're doing, but you know it's wrong. Fundamentally, we picked the wrong technology. There's this new technology that's emerged, which is kind of upstream in Linux, and this is the one we should pick. And a huge argument erupts: no, no, no, we did this. So I'm listening to this back and forth, and I remember going home and telling my wife that night, I said, I am living in chaos. There's a briefing in front of the CEO where these people are just, like, literally food-fight arguing this stuff out. Four months later, that same team came in and said, you know what, we were wrong. We need to acquire the company behind the other technology, because that's the way to go.
And what I realized is, you know, what if they had just come in and given me a briefing on here's the technology and why we did it, and four months later come in and said, no, no, no, we want to buy this other one? I would have just said, in a way, you gave me all the reasons why this is right, and we've even invested in it; why would I do that? But because I heard the arguments, knew that it was a fifty-five/forty-five decision, understood it wasn't clear-cut, and saw that it changed over time because of changing circumstances, it was much easier for me to go to the board and say, you know, let's spend hundreds of millions of dollars buying this other company. Because I'm convinced it's right; I've heard both sides of the argument, and I've seen how it's evolved over time. So that was a time when I realized, like, wow, this system that seemed like chaos was a way that we socialized, you know, the problems that we needed to solve in innovation, recognizing that circumstances change and therefore our output can change. All of a sudden I realized, no, that argument wasn't chaos. That was a great way to make sure that we were all on the same page on both the facts and where we felt we should act coming out of those facts. And then I realized, hey, this isn't insanity or chaos. It's actually a different way to run an organization that's trying to seek its way into the future.

Malcolm Gladwell: Mm. Now, let's talk a little bit about that transition. You have these two modes, and you're not saying mode A, the Delta mode, isn't useful. You're saying it works if you're in an industry that is well established, where the technology is moving incrementally, where it's about execution. Delta has to execute perfectly, right? You do not want anyone experimenting with the safety procedures, right?

Jim Whitehurst: Right. I mean, that's a strong example. Now, most companies have some of both.
You do want people tinkering, trying, experimenting on, you know, Delta's website, right, or the mobile app. And so all companies have a degree of difference in the types of operations that they're running. You know, Red Hat has financial accounting, and I go to jail if those numbers are wrong, so you want to make sure those are locked down. The key is for executives to recognize that there are different approaches to your organization, your leadership, your management processes, depending on which objective function you're driving: you're trying to drive variance out, to drive standardization, or you're trying to inject variance in, to drive innovation. You have to do a little bit of both, recognizing that there's a continuum around that. But importantly, more and more of the world is moving to where, you know, innovation is important.

Malcolm Gladwell: So walk me through it. Let's break down all of the components of this open innovation model. What is it? What does it look like? What does it feel like? You know, prepare me. Say Malcolm's gonna make this move, Malcolm's gonna join and do this transition. Prepare me for what it looks and feels like.

Jim Whitehurst: Sure. So, first off, it is absolutely clear what the strategy of the company is, but your work steps are typically left somewhat ambiguous, so you can work to improve, try new things, observe what's happening. I think most large companies would say the front line really needs to understand the work tasks to get their job done, and let's make those efficient; they don't really need to know the corporate strategy. We would flip that around and say everybody needs to know the corporate strategy and exactly what we're doing and why. We're gonna leave the work steps blurry so you can work to figure it out.
And the other thing I think people see out of it, at least certainly at Red Hat, is that everybody thinks they're gonna come in and everybody's gonna be nice, holding hands and singing kumbaya. And one of the first things they find out is, like, wow, it's kind of harsh. I throw out an idea and immediately people shoot it down. And one of the things that we've learned over time, and I do think this is beyond Red Hat, this is kind of a broad, important point, is that great ideas are fused together from many good ideas, from people arguing and fighting it out and building on top of each other.

Malcolm Gladwell: Let me ask you a very random, odd question. Suppose I came to you, Jim, and said, I'd like to bring you into my really good, you know, school. Your job is to redesign the curriculum and pedagogy of this school to prepare our students for the world you just described.

Jim Whitehurst: Yep. So, look, the broad way I would say this, and it's true for any organization: the traditional approach to thinking about solving a problem, or leading, is very simple. It's three big steps. You plan, right, you come up with a plan. You then prescribe the activities to whoever has to go execute it. And then you execute, and execution is about driving compliance to that plan. You know, in a university, what do you teach? You need to teach the capacity to be curious, to experiment, to try. You need to teach comfort with ambiguity, and that's a tough thing to do. And so that's why I do think engineers in particular need to take more humanities, right? You know, more open-ended courses, whether it's philosophy or political science or sociology, where some of these questions don't have firm answers.
I think that's really important for people to learn, because comfort with ambiguity, recognizing that there are things for which there is no firm, hard answer, combined with engineering skills, is what enables great engineers to say, I'm gonna learn from others, I'm gonna try and experiment, I'm gonna come up with an answer. There is no right answer to solving XYZ innovation problem; there are multiple approaches, and getting people comfortable with that is key. Those are the points I would stress to anyone going to university, or to universities thinking about curriculum: there are no right answers in innovation, there are different approaches, and we spend a lot of time on right and wrong versus, you know, experimenting and building. And then the final thing is working together. There's a lot of evidence that shows that teams come up with better answers than individuals. Red Hat's known for open source, which is, you know, thousands of people working together to build Linux and other software, and I always make sure people understand open source is not the same thing as crowdsourcing. Crowdsourcing says get a million people to each throw in an idea, and one of them is probably gonna be good. Open source, or what I'm talking about with an open organization, is how you get multiple people to work together to give you better solutions than any one of those individuals could on their own. And so the social skills of how you get people to work together to get better solutions are important.

Malcolm Gladwell: But if I look at, just take, high school and university education in this country, so little of learning is taking place in groups. We are sending kids off by themselves to tackle problems that have been handed to them, planned and prescribed, and asking them to execute, in large part on their own. And to the extent that they are part of a team, it's a very, very informal team.
But then we break them up again when we evaluate them, right? I mean, there's a complete disconnect between the world you are describing, which is responding to the technological challenges of the century, and the world we're preparing kids for.

Jim Whitehurst: Well, this is the lag. So it's not only, you know, the leaders who don't realize their management systems were built for the prototypical factory work: rote tasks, a static environment, etcetera. Education does it too: lining people up in rows, individual work, and if you make a mistake, that's wrong, that's bad, when we know experimentation is so important for innovation. Yeah, we have an education system that is optimized for an industrial world that no longer exists, and there is a lag there that we have to fix. If anything, we're sliding backwards, with social media and other things that get kids even more isolated from each other. There's tons of academic evidence and research that says groups do better at solving problems than individuals, yet somehow we then want to split people up. I think it's a lag problem; it's a recognition problem. To take that all the way back, you know, Ronald Coase won the Nobel Prize for explaining why a company exists. The whole idea was transaction costs. In other words, the question is why companies exist versus everybody acting as an individual and coordinating. And his point was, well, when transaction costs are lower coordinating inside a company than what a market would cost, people will get together and create a company. That all assumes interactions between people are a cost, which is true when you're trying to coordinate and, you know, drive building something at scale. But when you're trying to innovate, we know interactions between people are a benefit, because teams are more innovative than individuals.
So all of a sudden, how we've built corporate entities, how we put them together, how we educate people, it's all about minimizing transaction costs and optimizing, you know, individual output, and assuming that some scientist somewhere is gonna plug all that together. Yet it is in the interactions where, you know, the magic happens when you're trying to innovate.

Malcolm Gladwell: Yeah, this reminds me: in my podcast Revisionist History, one of the last episodes this season is all about war games, and it basically asks the question, why does the military play war games? They play a lot of war games. And one of the reasons is exactly the one you articulated, which is that you can do all the formal planning and prediction, you can send your experts off to come back with reports, and you will learn and get a certain kind of insight from that. But another, way more valuable insight will only come if you bring a group of people together and have them interact over a problem. Everyone in the war-gaming world quotes this line from Thomas Schelling, which is something to the effect of: even the smartest man in the world cannot make a list of the things that haven't occurred to him. Which is so lovely. And that's the point, right? You only learn about the things that haven't occurred to you when you are in this kind of free-flowing, interactive, social environment. That's why we throw people together and, in the end, create the kind of chaos you're talking about, because it's going to force you off the narrow path that you put yourself on.

Jim Whitehurst: Yeah. Speaking of the military, one of my favorite pieces to read on leadership is about commander's intent.
And the whole point is that intent is so important for others in the field to understand, because you can get separated, frankly, the commander can get killed, and understanding the intent and the objective is so much more important than understanding the specifics that go around it, because as soon as you're on a battlefield, things change, by definition. So you really need to understand intent, and how do you convey that as a leader? I do think those are critical in an ambiguous environment.

Malcolm Gladwell: Yeah. Is there a way to adopt the war-gaming model in the kind of more conventional corporate setting, that is, running simulations, real-world social simulations, of anticipated problems? Is that something that we're doing, or at least doing enough?

Jim Whitehurst: I remember talking to the CEO of one of the major global banks, and they were saying, basically, hey, we've got this issue with fintechs attacking us, and I've asked my team, what are our two big initiatives we're gonna go and attack this with? And I basically told him, don't go out and say you have two initiatives to do this, because if an initiative fails, the person behind it is probably gonna get fired. So how much is anybody really, really gonna go out and try something radical? What you need to do is say, hey, I want to see our twenty-five best ideas. You've got six months. Each one can have no more than, you know, two hundred fifty thousand dollars to experiment with. Let's come back in six months and see where those are. And we're gonna maybe kill all of them and talk about the lessons learned, or we're gonna take two, and then we're gonna take the lessons learned and launch another twenty-five. And we're gonna keep doing that until we feel the right things emerge, and those are the ones that we're gonna go do.
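To make that cadence concrete, here is a minimal sketch in Python, using the figures from the conversation (twenty-five ideas per six-month round, roughly $250,000 apiece) plus a keep-rate of two ideas per round, which is a hypothetical of ours, not something Jim specified:

```python
# Sketch of the experiment-portfolio cadence Jim describes: fund many
# small bets, review every six months, keep a couple, relaunch the rest.
# All figures are illustrative.

IDEAS_PER_ROUND = 25
BUDGET_PER_IDEA = 250_000   # dollars per idea per six-month round
KEEP_PER_ROUND = 2          # ideas promoted at each review (assumed)

def run_portfolio(rounds: int) -> None:
    total_spend = 0
    promoted = 0
    for r in range(1, rounds + 1):
        total_spend += IDEAS_PER_ROUND * BUDGET_PER_IDEA
        promoted += KEEP_PER_ROUND
        print(f"round {r}: cumulative spend ${total_spend:,}, "
              f"ideas promoted so far: {promoted}")

run_portfolio(rounds=4)
# Four rounds cost $25 million in total, surface eight vetted ideas,
# and generate lessons from dozens of cheap, survivable failures.
```

The point of the design is the one Jim names: many small, explicitly killable bets make failure cheap enough that people will actually try something radical.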
Jim Whitehurst: That's a management approach that says, let's experiment and try and learn, and quickly modify around that. As soon as you launch these big, big, big initiatives, by definition, you're not doing much experimentation, because you've already put dollars behind them with an outcome predefined. And that's the problem in general with so many of the existing management processes. I call it "driving to a future state." The words I'll use with leaders are "seek a future state": how do you build an organization that can seek a future optimal state?

Malcolm Gladwell: So we went backwards and talked about high school and college education. Let's now go forward; let's go beyond the corporation. I understand that a lot of your focus now is on philanthropic work: climate change, you know, larger society-wide issues. Let's apply that model to some of those. How would you guide us in the way that we think about and attack a problem like, for example, climate change?

Jim Whitehurst: Well, I'll come back to that model of, instead of plan, prescribe, execute: configure, enable, engage. So much of what we're talking about is, what are the big ways we're gonna electrify the country and, you know, put stations around for autonomous driving, etcetera, etcetera, etcetera, and how do we preplan all that out? My sense is we need to configure for success. And what I mean by that is, if all of a sudden battery technology takes a leap forward and we get triple the density in a car battery that we have today, well, guess what, the number of recharging stations you need probably drops by half. I don't know the asymptotics; maybe it drops by four x with the three-x battery. I'm not sure how that math works, but somebody, I'm sure, could figure that out. Which means the number of stations you need, and the supporting infrastructure that goes with them, are all going to change.
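A back-of-the-envelope version of the math Jim says somebody could figure out. It leans on one simplifying assumption that is ours, not his: that the number of charging stations needed scales inversely with vehicle range. The baseline station count is made up:

```python
# Toy model: if stations needed scale inversely with battery density
# (more range means fewer charging stops), a 3x battery cuts the
# required station count to roughly a third, not just half.

def stations_needed(baseline_stations: int, density_multiplier: float) -> int:
    """Estimate stations required after a battery-density improvement."""
    return round(baseline_stations / density_multiplier)

BASELINE = 120_000   # hypothetical planned station count

for mult in (1.0, 2.0, 3.0):
    print(f"{mult:.0f}x density -> ~{stations_needed(BASELINE, mult):,} stations")
# 1x -> 120,000   2x -> 60,000   3x -> 40,000
```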
Jim Whitehurst: And what I know is that no one knows what the innovation curve of batteries looks like. We can model things, but nobody knows, and innovation is often discontinuous. So we need to set context for that to happen. So what does that mean? I would argue, and I know IBM does advocate, for a price on carbon. Because if you know what carbon costs, if you have a carbon tax associated with it, then innovators, entrepreneurs, well, everybody, large companies, know: here's the cost, and therefore, what am I willing to invest? That plays out in the literally millions of decisions that get made in the economy, whether that's an innovator investing in the technology to try to make a better battery, or companies and how they think about their fleets and what they're doing. That's what I mean by configuring. Now, could government play a role in, you know, substantial investments in fundamental research at the right sets of universities, or partnerships with companies? That, again, is configuring. So it's not saying, here are the specific steps and the pathway to get there, because in the context of innovation, you know you're gonna be wrong, and that's how you build a bridge to nowhere: the dollars were there, but you didn't realize that it no longer made sense. So setting up the context is critical.
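Here is a minimal sketch of what that configuring does to one of those millions of decisions. Every figure, including the carbon price itself, is hypothetical:

```python
# Toy comparison: a known carbon price turns emissions into an explicit
# line item, which can flip an investment decision. All numbers invented.

CARBON_PRICE = 50.0   # dollars per tonne of CO2 (hypothetical)

def annual_cost(fuel_cost: float, tonnes_co2: float) -> float:
    """Operating cost once emissions are priced."""
    return fuel_cost + tonnes_co2 * CARBON_PRICE

diesel = annual_cost(fuel_cost=900_000, tonnes_co2=4_000)
electric = annual_cost(fuel_cost=1_000_000, tonnes_co2=500)

print(f"diesel fleet:   ${diesel:,.0f}/yr")    # $1,100,000
print(f"electric fleet: ${electric:,.0f}/yr")  # $1,025,000
# Without the carbon price, diesel looks cheaper ($900k vs. $1M);
# pricing carbon reverses the ranking. That is "configuring" a context
# rather than prescribing an outcome.
```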
Jim Whitehurst: And I will say, I'm very encouraged by what we're seeing happen around the world in terms of interest in climate change. But I do think there's a knee-jerk reaction, from governments in particular, to say, okay, what are the multi-hundred-billion-dollar, massive, twenty-year initiatives, when almost for sure those are gonna be wrong. You need to take a more nimble, agile approach that lets you move as we learn. I'm not saying we don't want to spend hundreds of billions of dollars on this. We do. But the approach for how we do it can't be these multi-year planned things, in a world where we don't know what the world's gonna look like or the pace of innovation.

Malcolm Gladwell: Yeah. I had a conversation with somebody recently that was so fascinating. He was talking entirely about light. LED lights have driven down the cost of light by levels that would have been unimaginable; we didn't realize it was going to be so radical. Light is essentially moving toward, not free, but really, really cheap. And it has all these impacts that no one, not a single person, would have imagined. For example, lighting is getting so cheap that you can grow, you know, vegetables in greenhouses cheaper than you can grow them in fields, which never occurred to anybody, right? If you had started with "I'd like to make greenhouse vegetables cheaper than field-grown ones" as your goal, you never would have gotten there. It's an impossible physics problem. But I don't know how you build that kind of social and political infrastructure, because it takes the very things that you were talking about. To be comfortable with uncertainty, you have to kind of give up on the notion that you can push the world in a given direction. You just have to, like you say, put the stick in the sea and let the life form around it, right? That takes an incredible amount of, is it self-confidence? What's the quality that requires of us?

Jim Whitehurst: Yeah. I see this in companies as well. You can argue it's self-confidence; you can argue it's needing to check your ego. Because so much of, I think, senior executives' sense of worth is "I'm the smartest person in the room," or "I came up with the answer," "I'm here because I've driven this." Whereas building these types of things, you are the facilitator.
434 00:24:34,000 --> 00:24:35,639 Speaker 1: You're trying to say, if I'm the smartest person in 435 00:24:35,680 --> 00:24:38,040 Speaker 1: the room, I've done a lousy job of building the 436 00:24:38,119 --> 00:24:40,920 Speaker 1: right team around me. There's something you said that I 437 00:24:40,960 --> 00:24:44,280 Speaker 1: think is a lovely way to tie up the conversation. 438 00:24:44,440 --> 00:24:46,480 Speaker 1: And I have to say, even after talking to you, 439 00:24:46,920 --> 00:24:49,639 Speaker 1: what you said took me by surprise when you said, 440 00:24:50,440 --> 00:24:53,720 Speaker 1: if I'm silent during a meeting, the meeting is going 441 00:24:53,760 --> 00:24:57,520 Speaker 1: well or things are I just think that's so beautiful. 442 00:24:57,640 --> 00:25:00,560 Speaker 1: So you say the people. If I say thing, I'm 443 00:25:00,560 --> 00:25:03,800 Speaker 1: the leader here. If I say nothing, then we've succeeded. 444 00:25:04,640 --> 00:25:08,960 Speaker 1: So moving from red Hat to IBM, I mean, obviously 445 00:25:08,960 --> 00:25:11,199 Speaker 1: twelve years of red Hat, everybody knew me in style 446 00:25:11,240 --> 00:25:13,600 Speaker 1: and it would kind of organically grow over time. I 447 00:25:13,680 --> 00:25:16,720 Speaker 1: was very explicit at IBM on the point of, hey, 448 00:25:16,760 --> 00:25:20,320 Speaker 1: if I'm silent in a meeting, don't be uncomfortable. That 449 00:25:20,520 --> 00:25:23,560 Speaker 1: is positive. It means I am listening to and observing 450 00:25:23,600 --> 00:25:28,240 Speaker 1: the dynamics among everyone, and um, I'm feeling really really 451 00:25:28,240 --> 00:25:30,119 Speaker 1: good about how the conversation is going. If I have 452 00:25:30,160 --> 00:25:32,840 Speaker 1: an opinion, I might decide to voice it, but I'm 453 00:25:32,880 --> 00:25:36,960 Speaker 1: always a little bit, especially um in new groups of people, 454 00:25:37,600 --> 00:25:40,120 Speaker 1: I'm a little hesitant to voice of because I'm worried 455 00:25:40,560 --> 00:25:43,600 Speaker 1: that that starts to jade others points of view. And 456 00:25:43,640 --> 00:25:46,720 Speaker 1: if I'm silent, that's great. If I jump in, it 457 00:25:46,760 --> 00:25:48,359 Speaker 1: could be because I have an idea that I just 458 00:25:48,400 --> 00:25:50,800 Speaker 1: want to throw out there. As a participant, it could 459 00:25:50,800 --> 00:25:53,439 Speaker 1: be that I'm throwing out the exact opposite of what 460 00:25:53,520 --> 00:25:55,159 Speaker 1: I believe because I think that's going to help the 461 00:25:55,200 --> 00:25:59,280 Speaker 1: dialogue happen. And as the team, you shouldn't rate what 462 00:25:59,400 --> 00:26:02,520 Speaker 1: I say any higher lower than anyone else in the room. 463 00:26:02,560 --> 00:26:05,120 Speaker 1: I would participant in the room, but that's how we're 464 00:26:05,119 --> 00:26:07,000 Speaker 1: going to get the best ideas out there. If my 465 00:26:07,040 --> 00:26:08,639 Speaker 1: idea turned out to be best and it was what 466 00:26:08,680 --> 00:26:12,199 Speaker 1: I really believe, great. If it's a horrible idea, I 467 00:26:13,280 --> 00:26:15,280 Speaker 1: you know, threw it out on purpose to try to 468 00:26:15,280 --> 00:26:17,560 Speaker 1: take the conversation in a way in areas we want 469 00:26:17,560 --> 00:26:20,080 Speaker 1: to poke. 
I think that's really, really critical, because most leaders go into a room and think their job is the content. And my view is no: if you believe teams do better than individuals, your job is to make sure the tenor and tone of the conversation are right, because then the best idea will emerge.

Malcolm Gladwell: It makes me want to work for you. It sounds like so much fun.

Jim Whitehurst: Well, I'm sure we can find a job for you.

Malcolm Gladwell: So what's next for you?

Jim Whitehurst: Well, using the same approach of not overly planning, I'm configuring myself for going forward, right? I've created a foundation around climate change, and I want to spend some time seeing how I can participate, using some of these open principles, in solving that problem. I'll continue to work with some technology companies as an advisor, and we'll see what happens in the future.

Malcolm Gladwell: Yeah, well, good luck.

Jim has a big challenge ahead as he focuses his efforts on climate change. It will be interesting to see what kind of solutions he'll find using an open-culture strategy. Thanks again to Jim for taking the time to chat with me. I can only hope we'll all learn something from taking a more collaborative approach toward problem solving.

Smart Talks with IBM is produced by Emily Rostak, with Carly Migliori and Catherine Girardeau. It's edited by Karen Shakerdge, with engineering by Martin Gonzalez, and mixed and mastered by Jason Gambrell and Ben Holiday. Music by Gramoscope. Special thanks to Molly Sosha, Andy Kelly, Mia Lobel, Jacob Weisberg, Heather Fain, Eric Sandler, and Maggie Taylor, and the teams at 8 Bar and IBM. Smart Talks with IBM is a production of Pushkin Industries and iHeartMedia. You can find more Pushkin podcasts on the iHeartRadio app, Apple Podcasts, or wherever you like to listen. I'm Malcolm Gladwell. See you next time.