1 00:00:00,120 --> 00:00:04,240 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. This season, 2 00:00:04,280 --> 00:00:07,880 Speaker 1: on Smart Talks with IBM, Malcolm Gladwell is back, and 3 00:00:07,920 --> 00:00:10,600 Speaker 1: this time he's taking the show on the road. Malcolm 4 00:00:10,640 --> 00:00:14,720 Speaker 1: is stepping outside the studio to explore how IBM clients 5 00:00:14,760 --> 00:00:18,759 Speaker 1: are using artificial intelligence to solve real world challenges and 6 00:00:18,840 --> 00:00:23,000 Speaker 1: transform the way they do business. From accelerating scientific breakthroughs 7 00:00:23,079 --> 00:00:27,520 Speaker 1: to reimagining education, it's a fresh look at innovation in action, 8 00:00:27,960 --> 00:00:31,720 Speaker 1: where big ideas meet cutting edge solutions. You'll hear from 9 00:00:31,720 --> 00:00:36,000 Speaker 1: industry leaders, creative thinkers, and of course Malcolm Gladwell himself 10 00:00:36,280 --> 00:00:39,879 Speaker 1: as he guides you through each story. New episodes of 11 00:00:39,920 --> 00:00:43,280 Speaker 1: Smart Talks with IBM drop every month on the iHeartRadio app, 12 00:00:43,440 --> 00:00:47,240 Speaker 1: Apple Podcasts, or wherever you get your podcasts. Learn more 13 00:00:47,280 --> 00:01:05,560 Speaker 1: at IBM dot com slash Smart Talks. 14 00:00:58,680 --> 00:01:03,640 Speaker 2: Pushkin. 15 00:01:03,680 --> 00:01:06,479 Speaker 3: Hello, I'm Malcolm Gladwell and you're listening to Smart Talks 16 00:01:06,480 --> 00:01:09,600 Speaker 3: with IBM. This season, we've been bringing you stories of 17 00:01:09,600 --> 00:01:13,000 Speaker 3: how IBM works with its clients to solve complex problems, 18 00:01:13,720 --> 00:01:18,280 Speaker 3: like helping L'Oréal reimagine how scientists approach cosmetic formulation, or 19 00:01:18,400 --> 00:01:22,399 Speaker 3: enabling Scuderia Ferrari HP to connect with fans in new ways.
20 00:01:23,120 --> 00:01:25,399 Speaker 3: But in this episode, we're going to zoom out and 21 00:01:25,440 --> 00:01:28,520 Speaker 3: look at the bigger picture. Earlier this month, I had 22 00:01:28,520 --> 00:01:31,520 Speaker 3: the chance to meet the person who's shaping IBM's future, 23 00:01:31,959 --> 00:01:36,240 Speaker 3: its CEO and Chairman, Arvind Krishna. We sat down in 24 00:01:36,280 --> 00:01:38,520 Speaker 3: front of an intimate live audience at IBM's New York 25 00:01:38,520 --> 00:01:42,400 Speaker 3: City office and talked about his uncanny ability to anticipate 26 00:01:42,440 --> 00:01:46,240 Speaker 3: where technology is heading, the future of AI, and his 27 00:01:46,360 --> 00:01:50,600 Speaker 3: passion for quantum computing, which he says is as revolutionary 28 00:01:50,880 --> 00:01:55,559 Speaker 3: as the semiconductor. Thank you everyone, and thank you to Arvind. 29 00:01:55,640 --> 00:01:58,280 Speaker 3: You're a difficult man to schedule for one of these things, 30 00:01:58,320 --> 00:02:01,760 Speaker 3: so we're enormously pleased you could join us. Let's start 31 00:02:01,760 --> 00:02:04,280 Speaker 3: with this. I have all these cousins, two cousins who 32 00:02:04,360 --> 00:02:06,720 Speaker 3: worked for IBM their entire careers. I would ask them, 33 00:02:07,360 --> 00:02:11,239 Speaker 3: what does IBM do? And they would always give me different, confusing, 34 00:02:11,360 --> 00:02:13,200 Speaker 3: complicated answers. 35 00:02:13,440 --> 00:02:15,800 Speaker 2: What's your answer? What's your simple answer to that question? 36 00:02:16,639 --> 00:02:21,639 Speaker 4: IBM's role is to help our clients improve their business 37 00:02:21,680 --> 00:02:26,480 Speaker 4: by deploying technology. That means you're never obligated to 38 00:02:26,560 --> 00:02:29,400 Speaker 4: one product.
It is what makes sense at that time, 39 00:02:29,880 --> 00:02:32,680 Speaker 4: but it is about improving their business, not just giving 40 00:02:32,720 --> 00:02:33,880 Speaker 4: them a commodity. 41 00:02:34,400 --> 00:02:35,240 Speaker 5: Then to go to the 42 00:02:35,200 --> 00:02:38,040 Speaker 4: next layer, I would say, we help them through a 43 00:02:38,120 --> 00:02:42,560 Speaker 4: mixture of hybrid cloud and artificial intelligence, and a taste 44 00:02:42,560 --> 00:02:44,680 Speaker 4: of quantum coming down the road is kind of where 45 00:02:45,160 --> 00:02:45,760 Speaker 4: I would take it. 46 00:02:45,919 --> 00:02:47,160 Speaker 5: That's that's what IBM is. 47 00:02:47,760 --> 00:02:52,040 Speaker 2: So you are technology agnostic in some sense. 48 00:02:52,320 --> 00:02:56,080 Speaker 4: I'm product agnostic. But I'm not technology agnostic. 49 00:02:56,200 --> 00:02:59,000 Speaker 3: Yes. But twenty five years from now, IBM 50 00:02:59,040 --> 00:03:04,200 Speaker 3: could be doing things that would be unrecognizable to contemporary IBM. 51 00:03:04,080 --> 00:03:05,280 Speaker 5: It is completely possible. 52 00:03:05,480 --> 00:03:08,160 Speaker 4: Yeah, it could be that twenty five years from now, 53 00:03:08,360 --> 00:03:11,680 Speaker 4: the only software IBM does is open source. It could 54 00:03:11,680 --> 00:03:14,280 Speaker 4: be the only computing you do is quantum computers. And 55 00:03:14,320 --> 00:03:16,960 Speaker 4: if I said those two things to people today, they would say that's 56 00:03:17,000 --> 00:03:17,919 Speaker 4: not the IBM of today. 57 00:03:18,200 --> 00:03:20,800 Speaker 3: Is it even simpler to just say IBM solves 58 00:03:20,880 --> 00:03:23,600 Speaker 3: problems at the highest technical level? 59 00:03:24,080 --> 00:03:26,440 Speaker 5: If you say highest technical level, yes, yeah.
60 00:03:26,560 --> 00:03:29,040 Speaker 4: Like the guy who invented the bar code, he was 61 00:03:29,040 --> 00:03:33,480 Speaker 4: solving a problem: retailers wanted to scale. Many of you 62 00:03:33,520 --> 00:03:35,640 Speaker 4: may not know it was an IBMer who invented 63 00:03:35,680 --> 00:03:38,680 Speaker 4: the bar code, by the way, not somebody who was 64 00:03:38,720 --> 00:03:42,840 Speaker 4: a PhD. Not somebody who was a deep researcher. I 65 00:03:42,840 --> 00:03:46,040 Speaker 4: think it was actually a field engineer. Oh really? Yeah. 66 00:03:46,680 --> 00:03:50,200 Speaker 4: And lasers were out and you could use lasers to 67 00:03:50,240 --> 00:03:53,120 Speaker 4: scan things, but they could be upside down, they could 68 00:03:53,160 --> 00:03:56,760 Speaker 4: be muddy, they could be partly scraped off. And he 69 00:03:56,840 --> 00:03:59,600 Speaker 4: came up with the idea of the bar code. Yeah, 70 00:04:00,040 --> 00:04:03,320 Speaker 4: and that changed inventory management forever. But the world needs 71 00:04:03,320 --> 00:04:05,760 Speaker 4: to know that IBM invented the bar code. You guys should 72 00:04:05,760 --> 00:04:09,720 Speaker 4: do a better job publicizing that. I am sure our 73 00:04:09,840 --> 00:04:14,240 Speaker 4: CMO will listen to this podcast and we'll get that idea. 74 00:04:14,640 --> 00:04:18,880 Speaker 3: Tell me, you started at the Thomas Watson Research Center. 75 00:04:19,400 --> 00:04:21,520 Speaker 3: What were you doing when you first started at IBM? 76 00:04:22,120 --> 00:04:27,520 Speaker 4: I started in nineteen ninety and that was the era 77 00:04:27,680 --> 00:04:31,599 Speaker 4: in which computers and networking were beginning to converge. And 78 00:04:31,680 --> 00:04:35,320 Speaker 4: for the first five years I was actually building networks. 79 00:04:35,720 --> 00:04:38,839 Speaker 4: So let's remember this was pre laptops.
Laptops came in 80 00:04:38,920 --> 00:04:41,760 Speaker 4: ninety two or ninety three, but it was clear to 81 00:04:41,839 --> 00:04:45,159 Speaker 4: us that portable computing was going to come, and 82 00:04:45,240 --> 00:04:48,240 Speaker 4: I spent my first five years building what today you 83 00:04:48,240 --> 00:04:51,600 Speaker 4: would call Wi-Fi. We used to have these debates: 84 00:04:51,640 --> 00:04:53,360 Speaker 4: can we build it? It's got to be small enough. 85 00:04:53,440 --> 00:04:55,680 Speaker 4: I mean like, it can't be more than one hundred 86 00:04:55,680 --> 00:04:58,320 Speaker 4: grams was kind of our thought, because if it's more 87 00:04:58,360 --> 00:05:01,200 Speaker 4: than that, you're on a three thousand gram laptop. 88 00:05:01,200 --> 00:05:02,560 Speaker 5: Why would anybody put this on? 89 00:05:03,279 --> 00:05:05,480 Speaker 4: And the debate used to be why would anybody want 90 00:05:05,520 --> 00:05:07,040 Speaker 4: to walk around untethered? 91 00:05:07,480 --> 00:05:10,200 Speaker 5: Wouldn't you want to attach a big thick cable into 92 00:05:10,320 --> 00:05:10,920 Speaker 5: it and sit down? 93 00:05:10,960 --> 00:05:13,640 Speaker 4: Because that was the thought, that's how terminals worked. And 94 00:05:13,720 --> 00:05:16,640 Speaker 4: I spent five years having a lot of fun building 95 00:05:16,720 --> 00:05:21,240 Speaker 4: many iterations of those and making progress on that. 96 00:05:21,600 --> 00:05:24,920 Speaker 3: If I had a conversation with your nineteen ninety self 97 00:05:26,120 --> 00:05:30,039 Speaker 3: about what the next thirty years were going to look like, 98 00:05:30,760 --> 00:05:33,240 Speaker 3: is it possible to reconstruct what your... What were your 99 00:05:33,279 --> 00:05:35,560 Speaker 3: predictions at that age about where the company, where the 100 00:05:35,600 --> 00:05:36,320 Speaker 3: industry was going?
101 00:05:37,400 --> 00:05:39,919 Speaker 4: It was more about where technology was going to go, 102 00:05:39,960 --> 00:05:42,640 Speaker 4: I would say, than where industry would go. I would 103 00:05:42,680 --> 00:05:48,000 Speaker 4: have told you that networking and computers would fuse. In 104 00:05:48,120 --> 00:05:51,119 Speaker 4: nineteen ninety that was a weird thought that some researchers held. 105 00:05:51,480 --> 00:05:54,440 Speaker 4: By the late nineties that was obvious: it became 106 00:05:54,440 --> 00:05:59,560 Speaker 4: the Internet. I would have told you that I believed 107 00:05:59,640 --> 00:06:03,280 Speaker 4: that streaming would be the primary way people would consume video. 108 00:06:03,040 --> 00:06:05,080 Speaker 2: You would have said that in nineteen ninety? 109 00:06:05,200 --> 00:06:08,359 Speaker 5: Absolutely. Now, that didn't take five years. That took twenty. 110 00:06:08,920 --> 00:06:12,559 Speaker 4: But it happened because you could do it technically, except 111 00:06:12,680 --> 00:06:15,600 Speaker 4: it was just too expensive and too cumbersome. And if you'd 112 00:06:15,600 --> 00:06:19,039 Speaker 4: been in technology, like in nineteen eighty five, I would 113 00:06:19,080 --> 00:06:23,120 Speaker 4: have told you the internet is old, because when I 114 00:06:23,200 --> 00:06:26,440 Speaker 4: went to grad school, every one of us had, in 115 00:06:26,560 --> 00:06:30,400 Speaker 4: those days, an Apple Mac or Lisa on our desks. 116 00:06:31,200 --> 00:06:34,080 Speaker 4: They were all connected by a network. You were happily sending email 117 00:06:34,120 --> 00:06:37,760 Speaker 4: to people all around the country. We were doing file transfers. 118 00:06:37,760 --> 00:06:39,280 Speaker 4: So okay, you had to be a little bit more 119 00:06:39,320 --> 00:06:42,080 Speaker 4: aware of the technology. And it didn't have a browser. 120 00:06:43,040 --> 00:06:44,880 Speaker 4: That took ten years to get the browser.
That took 121 00:06:44,920 --> 00:06:47,320 Speaker 4: five years to be a business. But when you see 122 00:06:47,320 --> 00:06:50,920 Speaker 4: the speed and the pace of technology, in usually ten 123 00:06:51,000 --> 00:06:56,520 Speaker 4: or fifteen years, the cost point and the consumerization are 124 00:06:56,560 --> 00:06:59,760 Speaker 4: at a scale that you couldn't have imagined ten years earlier, 125 00:07:00,120 --> 00:07:01,640 Speaker 4: until you've seen a few of those cycles. 126 00:07:02,320 --> 00:07:05,440 Speaker 3: Wait, did you make the leap to... sorry, this is fascinating, 127 00:07:05,640 --> 00:07:07,320 Speaker 3: I'm curious, but how far did you take that? That's a 128 00:07:07,360 --> 00:07:10,720 Speaker 3: really fundamental thing to have gotten right in nineteen ninety. 129 00:07:10,920 --> 00:07:13,160 Speaker 3: I think that we 130 00:07:13,000 --> 00:07:17,120 Speaker 4: were pretty convinced that what we used to think of 131 00:07:17,160 --> 00:07:21,760 Speaker 4: as linear television or broadcast would become digitized. That was 132 00:07:21,760 --> 00:07:26,560 Speaker 4: a given, too. With cable already the preponderance of how 133 00:07:26,600 --> 00:07:31,640 Speaker 4: people got it, if you put packet television over cable, 134 00:07:32,560 --> 00:07:36,559 Speaker 4: then that becomes the way it will go. I fundamentally believed, 135 00:07:36,640 --> 00:07:41,000 Speaker 4: actually way back in eighty seven, that on demand movies would 136 00:07:41,000 --> 00:07:45,600 Speaker 4: become the way people would consume movies. So those were 137 00:07:45,680 --> 00:07:48,360 Speaker 4: all things that I could have predicted. I didn't personally 138 00:07:48,400 --> 00:07:51,080 Speaker 4: work on all those. I mean, after networking, I moved 139 00:07:51,120 --> 00:07:55,000 Speaker 4: on to doing other things, but those were easy to predict.
140 00:07:56,280 --> 00:07:59,400 Speaker 3: If you had a conversation in those years with someone 141 00:07:59,440 --> 00:08:02,800 Speaker 3: in the television industry and you gave them those predictions, 142 00:08:03,160 --> 00:08:03,880 Speaker 3: did they see it? 143 00:08:03,960 --> 00:08:05,040 Speaker 2: Were they convinced of this? 144 00:08:05,480 --> 00:08:07,600 Speaker 4: I'm actually going to take it back to wireless networking. 145 00:08:08,760 --> 00:08:11,640 Speaker 4: I think one of the reasons I do what I 146 00:08:11,680 --> 00:08:14,720 Speaker 4: do today, which is at the intersection of business and technology, 147 00:08:15,360 --> 00:08:17,800 Speaker 4: is because of what I saw happen with Wi-Fi. 148 00:08:18,680 --> 00:08:22,240 Speaker 4: So you built these wireless networks and then you say, hey, 149 00:08:22,240 --> 00:08:25,080 Speaker 4: the market's going to be millions, tens of millions, billions 150 00:08:25,120 --> 00:08:27,840 Speaker 4: of users, and the business looks at it and says 151 00:08:28,200 --> 00:08:33,200 Speaker 4: we think the market is confined to warehouse workers doing inventory. 152 00:08:33,280 --> 00:08:35,600 Speaker 4: You could look at them and say, why not people 153 00:08:35,600 --> 00:08:40,120 Speaker 4: in their homes? But they couldn't imagine outside how people 154 00:08:40,160 --> 00:08:45,360 Speaker 4: bought things at that time. And so I became convinced 155 00:08:45,400 --> 00:08:49,160 Speaker 4: that I can't just help invent it. I've got to 156 00:08:49,200 --> 00:08:52,520 Speaker 4: think about, now, how do you market it? To whom 157 00:08:52,520 --> 00:08:54,600 Speaker 4: do you market it? What are their routes? How do 158 00:08:54,640 --> 00:08:57,920 Speaker 4: you make it easy enough? And that was probably, I mean, 159 00:08:57,960 --> 00:09:00,559 Speaker 4: I'm making it simple now.
That is probably a five 160 00:09:00,640 --> 00:09:05,560 Speaker 4: to ten year evolution of myself in those days. 161 00:09:05,679 --> 00:09:08,000 Speaker 3: You know what this reminds me of? When the telephone 162 00:09:08,040 --> 00:09:11,400 Speaker 3: is invented in the eighteen seventies, it doesn't take off 163 00:09:11,440 --> 00:09:13,920 Speaker 3: for forty years, because the people running the telephone business 164 00:09:14,320 --> 00:09:16,679 Speaker 3: didn't want women using it, because they were worried 165 00:09:16,679 --> 00:09:19,880 Speaker 3: that women would gossip with their friends. They didn't understand 166 00:09:19,880 --> 00:09:23,440 Speaker 3: that that's actually what the telephone is for, right? It's an exact parallel. 167 00:09:23,520 --> 00:09:24,760 Speaker 3: Yes it is. 168 00:09:24,880 --> 00:09:26,040 Speaker 5: You see it again and again. 169 00:09:26,400 --> 00:09:28,280 Speaker 2: What is the source of that blindness? 170 00:09:28,280 --> 00:09:30,920 Speaker 3: So there's a gap, in other words, between the invention, 171 00:09:31,120 --> 00:09:34,679 Speaker 3: the technological achievement, and the social understanding of the technology. 172 00:09:35,040 --> 00:09:36,160 Speaker 2: Why is there such a gap? 173 00:09:37,400 --> 00:09:41,760 Speaker 4: I think that the gap is fundamental and rooted in 174 00:09:42,000 --> 00:09:47,480 Speaker 4: a lot of academic disciplines. So even channeling some of 175 00:09:47,520 --> 00:09:49,480 Speaker 4: your work, though you don't intend it to be used 176 00:09:49,480 --> 00:09:51,400 Speaker 4: that way, you could say a lot of things are 177 00:09:51,520 --> 00:09:56,479 Speaker 4: data driven. If it is data driven, then by definition, 178 00:09:56,600 --> 00:10:00,320 Speaker 4: you're looking at history. If you're looking at history, that 179 00:10:00,360 --> 00:10:03,480 Speaker 4: means you're looking at existing buying patterns.
If you look 180 00:10:03,520 --> 00:10:07,800 Speaker 4: at existing buying patterns, you forget: all of those who 181 00:10:07,840 --> 00:10:11,960 Speaker 4: have created massive value over time have all created markets, 182 00:10:12,120 --> 00:10:13,760 Speaker 4: meaning they've all created new markets. 183 00:10:14,600 --> 00:10:15,640 Speaker 5: And I think that is why 184 00:10:15,480 --> 00:10:19,040 Speaker 4: the world is fascinated with people like Steve Jobs. For example, 185 00:10:19,120 --> 00:10:22,440 Speaker 4: he imagined a market that didn't exist. So I think 186 00:10:22,480 --> 00:10:25,360 Speaker 4: that is the gap. And then if you can get 187 00:10:25,800 --> 00:10:31,600 Speaker 4: the technology, the business acumen to scale a company, and that imagination 188 00:10:31,640 --> 00:10:34,240 Speaker 4: of making a market, that is how you create, I think, 189 00:10:34,640 --> 00:10:35,440 Speaker 4: massive value. 190 00:10:35,480 --> 00:10:36,880 Speaker 5: You've got to get all three pieces going. 191 00:10:37,000 --> 00:10:37,640 Speaker 2: It's not enough. 192 00:10:37,640 --> 00:10:39,240 Speaker 3: In other words, you were thinking it's not enough to 193 00:10:39,280 --> 00:10:42,040 Speaker 3: invent something new. I need to make a business case 194 00:10:42,080 --> 00:10:44,680 Speaker 3: for it simultaneously, and that that's what gets you thinking 195 00:10:45,080 --> 00:10:46,800 Speaker 3: along the path that leads you to this job. 196 00:10:46,880 --> 00:10:50,559 Speaker 4: Oh yeah. I'll tell you, if you had met Arvind 197 00:10:50,600 --> 00:10:53,640 Speaker 4: in nineteen ninety four and you had talked about 198 00:10:53,760 --> 00:10:57,920 Speaker 4: the stock market or about a balance sheet, I'd have looked 199 00:10:57,920 --> 00:10:59,840 Speaker 4: at you like, okay, those words, 200 00:10:59,880 --> 00:11:00,839 Speaker 4: I can parse them. 201 00:11:00,920 --> 00:11:02,959 Speaker 5: I have no
I have no 202 00:11:03,000 --> 00:11:04,280 Speaker 5: intuition on what they are. 203 00:11:04,600 --> 00:11:07,719 Speaker 4: I couldn't tell you why it's relevant or why it's not. 204 00:11:08,120 --> 00:11:11,080 Speaker 4: But then you began to think, okay, why do companies 205 00:11:11,120 --> 00:11:14,080 Speaker 4: get higher values? Okay, that's the stock; what does that 206 00:11:14,160 --> 00:11:17,800 Speaker 4: capture? If I have to spend working capital, that's 207 00:11:18,080 --> 00:11:21,280 Speaker 4: the balance sheet. Well, so you learn. I mean, I figure, 208 00:11:21,320 --> 00:11:24,720 Speaker 4: I'm willing to learn. I'm willing to read. However, the 209 00:11:24,720 --> 00:11:27,760 Speaker 4: best way for me to learn is not to go read balance sheets. 210 00:11:28,080 --> 00:11:29,840 Speaker 4: Sure, you can read the book. It's pretty damn dry. 211 00:11:30,559 --> 00:11:32,960 Speaker 4: Much easier to go talk to a financial expert who's 212 00:11:33,000 --> 00:11:36,080 Speaker 4: around the corner, and people, if you're curious about 213 00:11:36,120 --> 00:11:38,960 Speaker 4: what they do, are really happy to share their expertise, 214 00:11:39,400 --> 00:11:42,559 Speaker 4: and over time you learn more and more and they 215 00:11:42,559 --> 00:11:45,679 Speaker 4: actually become part of your network within the company. And 216 00:11:45,840 --> 00:11:48,920 Speaker 4: that's how you can both learn and evolve yourself and 217 00:11:49,080 --> 00:11:50,880 Speaker 4: actually gain the extra skills. 218 00:11:50,679 --> 00:11:54,120 Speaker 2: To be a successful business leader, 219 00:11:54,920 --> 00:11:58,880 Speaker 3: do you have to unlearn or deviate from some of 220 00:11:58,880 --> 00:12:01,720 Speaker 3: the things that made you a successful scientist? 221 00:12:02,360 --> 00:12:03,760 Speaker 5: I actually believe exactly the opposite.
222 00:12:04,000 --> 00:12:07,880 Speaker 4: Yeah, use what you're really good at as a foundation, 223 00:12:09,080 --> 00:12:11,480 Speaker 4: but don't make it the only thing you use. So 224 00:12:11,520 --> 00:12:14,720 Speaker 4: then how do you add the other skills? And there are 225 00:12:14,720 --> 00:12:17,600 Speaker 4: many ways. You can have people that you trust who 226 00:12:17,679 --> 00:12:21,240 Speaker 4: help you fill out those skills. You can gain some intuition, 227 00:12:21,520 --> 00:12:24,120 Speaker 4: maybe not the depth of expertise. I want to be 228 00:12:24,240 --> 00:12:27,760 Speaker 4: deeper on certain areas of electrical engineering than I'm ever 229 00:12:27,840 --> 00:12:30,280 Speaker 4: going to be, let's say, in finance or marketing. But 230 00:12:30,880 --> 00:12:32,959 Speaker 4: I want to be curious about those. I don't want 231 00:12:33,000 --> 00:12:37,319 Speaker 4: to dismiss them. So you build on your skills, and 232 00:12:37,360 --> 00:12:39,640 Speaker 4: then you have to say, but I need a complete 233 00:12:39,800 --> 00:12:43,480 Speaker 4: and holistic view. So I'm going to be a little deep, 234 00:12:43,559 --> 00:12:46,040 Speaker 4: not very deep, in all of those. And you've also 235 00:12:46,080 --> 00:12:48,240 Speaker 4: got to learn to trust your intuition a little bit. 236 00:12:48,600 --> 00:12:51,319 Speaker 3: Yeah. I forgot a question that I wanted to ask, 237 00:12:52,120 --> 00:12:56,040 Speaker 3: about the predictions of nineteen ninety, Arvind: what did 238 00:12:56,080 --> 00:12:56,640 Speaker 3: you get wrong? 239 00:12:58,120 --> 00:13:01,000 Speaker 4: Oh, lots of things. I think that people were thinking 240 00:13:01,160 --> 00:13:05,840 Speaker 4: that in those days, and it's not my phrase, but 241 00:13:05,880 --> 00:13:08,400 Speaker 4: I'll come back to it.
I think most people thought 242 00:13:08,520 --> 00:13:12,040 Speaker 4: that the communication companies would turn out to be the 243 00:13:12,080 --> 00:13:15,520 Speaker 4: winners of how networking got carried. If you all think 244 00:13:15,559 --> 00:13:18,000 Speaker 4: through the nineties of the investments that were being done 245 00:13:18,520 --> 00:13:20,920 Speaker 4: by, let's not take the names, all of the 246 00:13:21,160 --> 00:13:25,920 Speaker 4: telecom carriers, it didn't turn out to be the case. Actually, 247 00:13:25,920 --> 00:13:28,840 Speaker 4: I think that's the business model case. The reason is 248 00:13:29,320 --> 00:13:31,400 Speaker 4: they all had in their heads that you can charge 249 00:13:31,440 --> 00:13:32,679 Speaker 4: people by the minute. 250 00:13:34,280 --> 00:13:36,000 Speaker 2: Because they had been doing that already. 251 00:13:35,720 --> 00:13:37,320 Speaker 5: Because they'd been doing that for one hundred years. 252 00:13:37,360 --> 00:13:41,720 Speaker 4: Yeah, and in the end, the winners in networking were 253 00:13:41,720 --> 00:13:44,480 Speaker 4: those who set a flat price, thirty bucks a month or 254 00:13:44,520 --> 00:13:47,640 Speaker 4: fifty bucks a month or whatever, and that was just 255 00:13:47,760 --> 00:13:48,439 Speaker 4: too much of 256 00:13:48,400 --> 00:13:49,920 Speaker 5: a leap for them. 257 00:13:50,040 --> 00:13:53,559 Speaker 3: You think it's as simple as that? Is that the most parsimonious 258 00:13:53,559 --> 00:13:55,800 Speaker 3: explanation for why you think they failed? 259 00:13:56,080 --> 00:13:58,240 Speaker 4: No, there were a couple of other more technical things. 260 00:13:58,320 --> 00:14:01,199 Speaker 4: One article was written by somebody who was inside one 261 00:14:01,200 --> 00:14:04,559 Speaker 4: of these telecom companies, and he labeled his article "The 262 00:14:04,640 --> 00:14:08,439 Speaker 4: Rise of the Stupid Network."
So telephone people believed that 263 00:14:08,480 --> 00:14:11,680 Speaker 4: the network should be really smart and the end device dumb. 264 00:14:12,760 --> 00:14:14,559 Speaker 4: If you think about the telephone, the telephone is dumb. It 265 00:14:14,559 --> 00:14:17,040 Speaker 4: doesn't actually do anything. It's just about the relays. And 266 00:14:17,080 --> 00:14:19,520 Speaker 4: the network is smart. It routes you, it figures out 267 00:14:19,560 --> 00:14:24,880 Speaker 4: where to send it. It does echo cancellation, and so on. The 268 00:14:24,920 --> 00:14:27,360 Speaker 4: current Internet is completely dumb on the inside. It just 269 00:14:27,440 --> 00:14:29,360 Speaker 4: takes the bits and shoves them out the other end. 270 00:14:29,760 --> 00:14:32,440 Speaker 4: All the intelligence is in the computer at the end. That's 271 00:14:32,760 --> 00:14:35,720 Speaker 4: probably a bit more of a profound explanation. But the business 272 00:14:35,760 --> 00:14:37,160 Speaker 4: model didn't help them either. 273 00:14:37,400 --> 00:14:37,680 Speaker 2: Yeah. 274 00:14:37,960 --> 00:14:41,720 Speaker 3: Wait, did nineteen ninety Arvind think that the network 275 00:14:41,760 --> 00:14:43,840 Speaker 3: should be dumb or smart? 276 00:14:45,240 --> 00:14:47,920 Speaker 4: I'm not sure I thought about it deeply, but in everything 277 00:14:47,960 --> 00:14:49,400 Speaker 4: I worked on, the network was dumb. 278 00:14:50,120 --> 00:14:51,800 Speaker 5: The network moved bits. That's all it did. 279 00:14:52,440 --> 00:14:56,320 Speaker 4: Yeah, because even I in those days understood I can't 280 00:14:56,320 --> 00:14:57,720 Speaker 4: imagine all the applications. 281 00:14:58,120 --> 00:15:00,320 Speaker 5: So if all you do is voice, maybe the 282 00:15:00,320 --> 00:15:01,280 Speaker 5: network can be smart.
283 00:15:01,440 --> 00:15:02,960 Speaker 4: But if you're doing all those other things, how could 284 00:15:02,960 --> 00:15:05,760 Speaker 4: the network possibly know all those things and be smart 285 00:15:05,760 --> 00:15:06,040 Speaker 4: for it? 286 00:15:06,240 --> 00:15:10,200 Speaker 2: Yeah. So you've been CEO for five years. 287 00:15:10,320 --> 00:15:10,960 Speaker 5: Five years. 288 00:15:11,200 --> 00:15:13,920 Speaker 3: Wait, so in your five year increment, what was your 289 00:15:14,000 --> 00:15:15,920 Speaker 3: most misunderstood decision? 290 00:15:16,680 --> 00:15:18,480 Speaker 2: Well, you ended up being right, but everyone thought you 291 00:15:18,480 --> 00:15:18,960 Speaker 2: were crazy. 292 00:15:19,600 --> 00:15:23,560 Speaker 4: In twenty eighteen, I proposed to our board that we 293 00:15:23,560 --> 00:15:28,000 Speaker 4: should buy a company called Red Hat. IBM does proprietary software, 294 00:15:28,000 --> 00:15:31,640 Speaker 4: but that was open source. The stock dropped twelve, fifteen percent 295 00:15:31,680 --> 00:15:35,040 Speaker 4: the day we announced it, and today most people 296 00:15:35,080 --> 00:15:37,040 Speaker 4: will turn around and say this is the most successful 297 00:15:37,080 --> 00:15:40,760 Speaker 4: acquisition that IBM has done of all time, and probably 298 00:15:40,840 --> 00:15:45,520 Speaker 4: the most successful software acquisition in history. So it was 299 00:15:45,560 --> 00:15:49,120 Speaker 4: completely misunderstood, because people didn't see that you actually did 300 00:15:49,200 --> 00:15:53,400 Speaker 4: need a platform that could make it agnostic across multiple 301 00:15:53,800 --> 00:15:57,720 Speaker 4: cloud platforms, across on premise environments.
So you've got to 302 00:15:57,760 --> 00:16:00,160 Speaker 4: have a view of what it could be, and we 303 00:16:00,240 --> 00:16:02,880 Speaker 4: drove it to a place where I think today it 304 00:16:03,000 --> 00:16:04,760 Speaker 4: stands as the leader in its space. 305 00:16:05,520 --> 00:16:08,560 Speaker 3: So how did you come to believe this heretical notion? 306 00:16:10,720 --> 00:16:15,160 Speaker 4: So Cloud was happening. You could ask yourself the question, 307 00:16:15,640 --> 00:16:20,160 Speaker 4: should we spend a lot of capital and chase Cloud? Okay, 308 00:16:20,400 --> 00:16:24,560 Speaker 4: you're five years, to be generous maybe longer, behind 309 00:16:24,560 --> 00:16:27,960 Speaker 4: the two leaders at that point. So you could spend maybe 310 00:16:28,000 --> 00:16:31,960 Speaker 4: ten billion a year, and a lot of businesses tend 311 00:16:32,000 --> 00:16:34,240 Speaker 4: to do that. Okay, it's so important, it's going to 312 00:16:34,280 --> 00:16:37,760 Speaker 4: be half the market. I can't not. My view was 313 00:16:38,200 --> 00:16:40,240 Speaker 4: we'll always be five years behind. They're not dumb and 314 00:16:40,240 --> 00:16:42,880 Speaker 4: they're not slow. So if you're going to be there, 315 00:16:43,000 --> 00:16:46,400 Speaker 4: you're going to be, best case, a distant third, worst 316 00:16:46,400 --> 00:16:48,440 Speaker 4: case maybe a fourth or a fifth, because there are Chinese 317 00:16:48,480 --> 00:16:52,080 Speaker 4: players also in the mix. Why would you do that? Instead, 318 00:16:52,440 --> 00:16:55,000 Speaker 4: is there a different space you can occupy, instead of 319 00:16:55,080 --> 00:16:58,440 Speaker 4: competing with them? Can you become their best partner? In 320 00:16:58,480 --> 00:17:02,120 Speaker 4: which case you ride their success.
If I want to 321 00:17:02,120 --> 00:17:04,520 Speaker 4: be the best partner, then what are the set of 322 00:17:04,560 --> 00:17:08,040 Speaker 4: technologies that would be useful? So flipping the 323 00:17:08,080 --> 00:17:11,640 Speaker 4: problem is how I thought about it. 324 00:17:12,480 --> 00:17:16,119 Speaker 3: How hard was it to convince the people who needed convincing before 325 00:17:16,160 --> 00:17:17,879 Speaker 3: that acquisition? 326 00:17:18,359 --> 00:17:20,840 Speaker 5: Probably six to nine months of 327 00:17:22,359 --> 00:17:26,920 Speaker 4: breaking my head with no success, and then six months 328 00:17:26,960 --> 00:17:33,160 Speaker 4: of building the momentum once a couple of people began 329 00:17:33,240 --> 00:17:33,679 Speaker 4: to see it. 330 00:17:34,119 --> 00:17:35,920 Speaker 2: Yeah, you're very persistent. 331 00:17:36,440 --> 00:17:41,840 Speaker 3: Oh yes, very. Would you describe that as your defining trait? 332 00:17:43,240 --> 00:17:45,880 Speaker 5: I am very persistent and I'm very patient. 333 00:17:46,600 --> 00:17:50,080 Speaker 4: I'm also probably very impatient. But I'm not a yeller 334 00:17:50,119 --> 00:17:50,720 Speaker 4: and screamer. 335 00:17:50,840 --> 00:17:52,480 Speaker 5: I don't rant and rave. 336 00:17:53,080 --> 00:17:55,639 Speaker 4: But as I say, if I think we're going to 337 00:17:55,680 --> 00:18:00,439 Speaker 4: do something, I can be remarkably stubborn about it. 338 00:18:00,480 --> 00:18:01,120 Speaker 5: We will do it. 339 00:18:01,280 --> 00:18:03,600 Speaker 3: If I got your family, put them up on stage, 340 00:18:03,640 --> 00:18:06,239 Speaker 3: and asked them this exact question, is this how they 341 00:18:06,240 --> 00:18:06,760 Speaker 3: would answer 342 00:18:06,800 --> 00:18:07,320 Speaker 2: as well? 343 00:18:07,480 --> 00:18:13,239 Speaker 4: They will tell you I'm very stubborn. They might not 344 00:18:13,280 --> 00:18:15,160 Speaker 4: agree that I don't rant and rave.
345 00:18:17,320 --> 00:18:20,920 Speaker 3: Well, you know, one of the principal observations of psychology 346 00:18:21,000 --> 00:18:24,400 Speaker 3: is that our home self and our work self are uncorrelated. 347 00:18:25,040 --> 00:18:26,400 Speaker 2: Once you know that, you know everything. 348 00:18:26,720 --> 00:18:29,760 Speaker 3: Wait, I'm curious, one last question about that. How long 349 00:18:29,800 --> 00:18:33,160 Speaker 3: did it take for you to be vindicated with Red Hat? 350 00:18:34,200 --> 00:18:39,119 Speaker 4: Probably took five, maybe four years. I think by twenty 351 00:18:39,280 --> 00:18:41,880 Speaker 4: twenty three. So twenty eighteen we announced it, we took 352 00:18:41,880 --> 00:18:43,760 Speaker 4: the big stock drop. It took a year to close, 353 00:18:43,840 --> 00:18:48,199 Speaker 4: twenty nineteen. So if I count, not that I'm counting 354 00:18:48,240 --> 00:18:57,600 Speaker 4: that much, but July ninth, twenty nineteen as the day 355 00:18:57,640 --> 00:19:00,000 Speaker 4: that we got all the approvals. Took another few weeks 356 00:19:00,200 --> 00:19:04,280 Speaker 4: to actually transfer the money. But from there, probably twenty 357 00:19:04,359 --> 00:19:06,560 Speaker 4: twenty three, the world woke up and said, hey, you 358 00:19:06,600 --> 00:19:08,440 Speaker 4: guys deserve credit for this. It was actually 359 00:19:08,160 --> 00:19:10,080 Speaker 5: a great move, not a bad move. 360 00:19:10,240 --> 00:19:14,439 Speaker 3: Yeah, but it's interesting, because this is a 361 00:19:14,480 --> 00:19:18,520 Speaker 3: real gamble. If it doesn't work, you're not sitting in 362 00:19:18,520 --> 00:19:19,200 Speaker 3: this chair right now. 363 00:19:19,320 --> 00:19:19,560 Speaker 2: Right. 364 00:19:19,680 --> 00:19:21,440 Speaker 5: Oh, for sure. There were two steps.
365 00:19:21,440 --> 00:19:24,159 Speaker 4: One, if it was obviously not going to work, I 366 00:19:24,160 --> 00:19:26,720 Speaker 4: wouldn't have been selected, and two, if it hadn't worked 367 00:19:26,720 --> 00:19:29,520 Speaker 4: after that, well, that's why CEOs can be short lived. 368 00:19:30,040 --> 00:19:31,480 Speaker 2: Can I ask you a sort of a personal question? 369 00:19:31,720 --> 00:19:33,439 Speaker 2: How much sleep did you lose over this? 370 00:19:36,920 --> 00:19:40,680 Speaker 5: Once we had made the decision, none. 371 00:19:43,000 --> 00:19:44,719 Speaker 3: Can you give me pointers on how you do this? 372 00:19:44,800 --> 00:19:48,000 Speaker 3: Because I wake up at two a.m. every morning, and 373 00:19:48,040 --> 00:19:50,159 Speaker 3: over much more trivial things than this. 374 00:19:50,640 --> 00:19:52,639 Speaker 5: Once a week, I'll probably wake up at two or 375 00:19:52,640 --> 00:19:53,320 Speaker 5: three in the morning. 376 00:19:53,400 --> 00:19:56,880 Speaker 4: I acknowledge it, because I wake up and my brain 377 00:19:56,960 --> 00:19:58,879 Speaker 4: is running, and once it's running, I don't even 378 00:19:58,760 --> 00:19:59,560 Speaker 5: try to go back to sleep. 379 00:19:59,520 --> 00:20:02,560 Speaker 4: I mean, I go get up and do work and make 380 00:20:02,600 --> 00:20:04,800 Speaker 4: myself productive. You're going to be tired in the afternoon, 381 00:20:04,800 --> 00:20:08,080 Speaker 4: that's fine, you'll sleep well that night. I actually 382 00:20:08,119 --> 00:20:11,240 Speaker 4: learned a long time back, you can't do it across the board. 383 00:20:11,240 --> 00:20:14,800 Speaker 4: You can't do it early morning, through the day, and 384 00:20:14,920 --> 00:20:18,080 Speaker 4: late at night.
So an hour before I think I 385 00:20:18,119 --> 00:20:20,800 Speaker 4: want to go to bed, I will actually change what 386 00:20:20,840 --> 00:20:25,280 Speaker 4: I'm doing, meaning I will start reading something interesting to 387 00:20:25,359 --> 00:20:28,920 Speaker 4: me but completely outside the scope of work. I may 388 00:20:28,960 --> 00:20:34,320 Speaker 4: read a biography, I might read somebody who's pontificating on 389 00:20:34,400 --> 00:20:38,120 Speaker 4: demographics and population. But I won't read on leadership, 390 00:20:38,119 --> 00:20:40,720 Speaker 4: because that's too close now. Twenty years ago, that 391 00:20:40,760 --> 00:20:43,800 Speaker 4: would have been different. I won't read 392 00:20:44,080 --> 00:20:46,560 Speaker 4: on deep science, because that's too close to what we 393 00:20:46,640 --> 00:20:48,920 Speaker 4: do for a living. So it's got to be outside 394 00:20:49,800 --> 00:20:53,199 Speaker 4: the things that will make my brain churn about work. 395 00:20:53,640 --> 00:20:56,639 Speaker 4: But it's got to be something that is dense enough 396 00:20:56,680 --> 00:20:59,560 Speaker 4: to occupy your brain, so it shifts gears. 397 00:21:00,119 --> 00:21:02,320 Speaker 2: Sorry, I want to dwell on this just for a moment. 398 00:21:02,359 --> 00:21:03,040 Speaker 2: The Red Hat thing. 399 00:21:03,800 --> 00:21:06,320 Speaker 3: Was there someone, or is there someone, who you went 400 00:21:06,400 --> 00:21:10,600 Speaker 3: to and explained the logic of this, and they saw 401 00:21:10,600 --> 00:21:12,800 Speaker 3: the logic of this, and that made a big difference 402 00:21:12,800 --> 00:21:13,000 Speaker 3: to you? 403 00:21:14,720 --> 00:21:18,359 Speaker 4: Getting their support made a big difference. You'd be surprised. 404 00:21:18,520 --> 00:21:25,320 Speaker 4: I'm remarkably open inside.
I mean, there are 405 00:21:25,359 --> 00:21:28,360 Speaker 4: probably a half dozen to a dozen people inside 406 00:21:28,359 --> 00:21:31,040 Speaker 4: that I'll talk to, and I'll be completely open about, hey, 407 00:21:31,040 --> 00:21:33,359 Speaker 4: this is what I'm thinking. I don't know. Here are 408 00:21:33,400 --> 00:21:35,359 Speaker 4: the risks. I'm open about those also, it's not just 409 00:21:35,400 --> 00:21:38,440 Speaker 4: the benefits. I think through the risks, but I think 410 00:21:38,480 --> 00:21:41,280 Speaker 4: the benefits outweigh the risks. I talk about that to 411 00:21:41,320 --> 00:21:46,440 Speaker 4: people all the time. So, for example, I mean, 412 00:21:46,520 --> 00:21:50,120 Speaker 4: I'll take names. I think our current CHRO, Nickle, who 413 00:21:50,160 --> 00:21:53,000 Speaker 4: introduced us, she has been in that loop since at 414 00:21:53,080 --> 00:21:56,800 Speaker 4: least twenty fifteen for me. If I look at our CFO, 415 00:21:57,400 --> 00:21:58,120 Speaker 4: Jim Cavanaugh, 416 00:21:58,200 --> 00:21:58,880 Speaker 5: he's been in that 417 00:21:58,840 --> 00:22:02,720 Speaker 4: loop probably since twenty thirteen, and the IBMers will probably wonder, 418 00:22:02,960 --> 00:22:05,240 Speaker 4: what the hell intersection did you guys have? When 419 00:22:05,280 --> 00:22:08,200 Speaker 4: I talk about learning finance, I will go to 420 00:22:08,280 --> 00:22:09,960 Speaker 4: him and say, hey, explain this to me. I don't 421 00:22:10,040 --> 00:22:13,639 Speaker 4: understand why it's like this, and to me it's okay. 422 00:22:14,359 --> 00:22:15,480 Speaker 5: Be patient, you go learn. 423 00:22:16,160 --> 00:22:18,480 Speaker 4: If I think about many of the people in the 424 00:22:18,520 --> 00:22:22,160 Speaker 4: software business, they've been having these discussions with me forever.
425 00:22:22,920 --> 00:22:27,399 Speaker 4: I mean, now I'll acknowledge I can get probably impatient 426 00:22:27,600 --> 00:22:30,920 Speaker 4: and acerbic, but it's meant to be a discussion. 427 00:22:31,000 --> 00:22:33,040 Speaker 4: I mean, like, let's have the discussion. If you have 428 00:22:33,080 --> 00:22:35,199 Speaker 4: a strong point of view, I get it. Nobody is 429 00:22:35,240 --> 00:22:38,040 Speaker 4: going to be perfectly correct, but I always look 430 00:22:38,119 --> 00:22:41,280 Speaker 4: for, if you have a strong point of view, that 431 00:22:41,320 --> 00:22:44,320 Speaker 4: means it's from a different perspective than mine. So what 432 00:22:44,400 --> 00:22:47,520 Speaker 4: do I learn from that is the question, which helps 433 00:22:47,560 --> 00:22:48,880 Speaker 4: to improve my point of view. 434 00:22:48,960 --> 00:22:50,320 Speaker 5: That makes sense. 435 00:22:51,520 --> 00:22:53,400 Speaker 4: I actually think that each person should try to build 436 00:22:53,400 --> 00:22:56,280 Speaker 4: a community of a hundred people inside your enterprise 437 00:22:56,680 --> 00:23:00,159 Speaker 4: and a hundred outside that you can call up. I 438 00:23:00,160 --> 00:23:04,520 Speaker 4: have no hesitation. Somebody introduced me a long time 439 00:23:04,560 --> 00:23:07,600 Speaker 4: back to a CEO on the outside. I call them 440 00:23:07,640 --> 00:23:08,760 Speaker 4: up all the time and say, hey, do you have 441 00:23:08,800 --> 00:23:11,960 Speaker 4: five minutes? I'm just thinking about something. The same way, the 442 00:23:12,040 --> 00:23:14,440 Speaker 4: CEO of Red Hat, who left IBM in twenty twenty one, 443 00:23:15,080 --> 00:23:17,800 Speaker 4: we probably talk every two or three months on a 444 00:23:17,880 --> 00:23:18,600 Speaker 4: random topic. 445 00:23:18,920 --> 00:23:21,320 Speaker 5: By the way, it becomes mutual. He'll ask me my opinion 446 00:23:21,400 --> 00:23:21,960 Speaker 5: on some things.
447 00:23:22,520 --> 00:23:24,400 Speaker 4: Now, by the way, three or four times he might 448 00:23:24,440 --> 00:23:28,399 Speaker 4: do something different, but he wants my opinion, and the 449 00:23:28,400 --> 00:23:29,040 Speaker 4: other way around. 450 00:23:29,240 --> 00:23:30,960 Speaker 3: If I gave you my phone number, can I be 451 00:23:31,040 --> 00:23:33,840 Speaker 3: on that list? It would just be fascinating. I don't 452 00:23:33,840 --> 00:23:35,000 Speaker 3: know if I can help you, but it would be 453 00:23:35,000 --> 00:23:35,880 Speaker 3: really fun to get the call. 454 00:23:36,000 --> 00:23:36,439 Speaker 5: Sure you can. 455 00:23:36,560 --> 00:23:39,320 Speaker 4: Do you think that we can ever succeed unless people 456 00:23:39,359 --> 00:23:44,720 Speaker 4: who influence opinions say things about us? So you may 457 00:23:44,720 --> 00:23:48,120 Speaker 4: not think deeply about, maybe, the physics of quantum computing, 458 00:23:48,600 --> 00:23:52,040 Speaker 4: but would you think deeply about why 459 00:23:51,720 --> 00:23:54,040 Speaker 5: and what moment may make it much more 460 00:23:53,920 --> 00:23:57,119 Speaker 4: attractive to a large audience? Sure you would. You'd be 461 00:23:57,160 --> 00:24:00,520 Speaker 4: far better as a thinker on that topic than probably most 462 00:24:00,520 --> 00:24:01,360 Speaker 4: of the people.
463 00:24:02,080 --> 00:24:03,479 Speaker 3: I was thinking, you know, when you were making your 464 00:24:03,480 --> 00:24:09,320 Speaker 3: comments about your nineteen ninety self and streaming, that the 465 00:24:10,040 --> 00:24:13,359 Speaker 3: rational thing would have been for there to have been a 466 00:24:13,480 --> 00:24:19,000 Speaker 3: reserved board seat at every television network for someone from 467 00:24:19,000 --> 00:24:22,320 Speaker 3: the world of technology, which I'm one hundred percent sure 468 00:24:22,320 --> 00:24:24,600 Speaker 3: they did not have in nineteen ninety, but 469 00:24:25,040 --> 00:24:28,120 Speaker 3: their board was probably composed of people like them. Let's 470 00:24:28,119 --> 00:24:32,480 Speaker 3: talk a little bit about technology now. So much 471 00:24:32,560 --> 00:24:34,879 Speaker 3: of the changes going on right now are 472 00:24:35,560 --> 00:24:40,000 Speaker 3: accompanied by a great deal of hype. What are we overestimating? 473 00:24:40,000 --> 00:24:41,200 Speaker 2: What are we underestimating? 474 00:24:41,600 --> 00:24:44,400 Speaker 4: Okay, let's go back to nineteen ninety five, the Internet, 475 00:24:44,440 --> 00:24:46,320 Speaker 4: because I think that the current moment is very much 476 00:24:46,400 --> 00:24:48,680 Speaker 4: like the Internet moment. Actually, all the moments in the 477 00:24:48,720 --> 00:24:51,679 Speaker 4: middle were much smaller. I think mobile and streaming were much smaller. 478 00:24:51,720 --> 00:24:54,440 Speaker 4: The Internet was the major moment. If you remember back to 479 00:24:54,520 --> 00:24:56,520 Speaker 4: ninety nine and two thousand, people claimed there was a 480 00:24:56,520 --> 00:24:59,359 Speaker 4: lot of hype. Would we say that the Internet of 481 00:24:59,359 --> 00:25:02,800 Speaker 4: today has more than fulfilled all those expectations and more? 482 00:25:03,359 --> 00:25:03,639 Speaker 5: Yes.
483 00:25:04,760 --> 00:25:08,280 Speaker 4: Along the way, didn't eight out of ten of the 484 00:25:08,320 --> 00:25:11,440 Speaker 4: companies that were invested in heavily go bankrupt? 485 00:25:11,520 --> 00:25:11,800 Speaker 2: Yes. 486 00:25:12,800 --> 00:25:15,639 Speaker 4: I actually think of that as being the huge positive 487 00:25:16,400 --> 00:25:20,400 Speaker 4: of the United States capital system, that that investment happened. 488 00:25:21,400 --> 00:25:23,640 Speaker 4: Eight out of ten went broke. By the way, those 489 00:25:23,680 --> 00:25:26,600 Speaker 4: assets didn't go away. They got consumed at ten cents 490 00:25:26,600 --> 00:25:28,679 Speaker 4: on the dollar by somebody else who could then make 491 00:25:28,680 --> 00:25:30,840 Speaker 4: a lot of money. But the two out of ten, 492 00:25:31,640 --> 00:25:34,639 Speaker 4: just take two, it probably has paid for all the capital. 493 00:25:34,800 --> 00:25:35,440 Speaker 5: If you just 494 00:25:35,480 --> 00:25:40,280 Speaker 4: take Amazon and Alphabet, aka Google, just those two have 495 00:25:40,320 --> 00:25:43,679 Speaker 4: probably paid for all the capital of that time. So 496 00:25:44,320 --> 00:25:47,240 Speaker 4: that's what's going to happen this time. There will be 497 00:25:47,280 --> 00:25:50,560 Speaker 4: a lot of tears, but in aggregate, there will be 498 00:25:50,560 --> 00:25:53,000 Speaker 4: a lot of success. And I think that's the fundamental 499 00:25:53,000 --> 00:25:55,840 Speaker 4: difference between the US model and almost 500 00:25:55,600 --> 00:25:56,520 Speaker 5: all other countries. 501 00:25:57,080 --> 00:25:59,439 Speaker 4: In all other countries, they're desperate to keep all the 502 00:25:59,440 --> 00:26:00,000 Speaker 4: companies alive. 503 00:26:00,520 --> 00:26:03,960 Speaker 5: So that means you're diluting, and that's a horrible thing.
504 00:26:04,440 --> 00:26:08,199 Speaker 4: So to me, the system has worked really effectively, 505 00:26:08,400 --> 00:26:10,119 Speaker 4: by the way, not just now, I mean all the 506 00:26:10,160 --> 00:26:14,840 Speaker 4: way back to railways and electrification, and you mentioned the telephone system. 507 00:26:15,320 --> 00:26:19,919 Speaker 4: You can keep going on: oil, I mean, consumer goods. 508 00:26:20,280 --> 00:26:22,240 Speaker 4: It goes on and on. I think this system is 509 00:26:22,359 --> 00:26:26,000 Speaker 4: very effective. It deploys capital. Its sense is, if it's a big 510 00:26:26,040 --> 00:26:29,880 Speaker 4: market, it's completely willing to over-deploy capital in the short 511 00:26:29,720 --> 00:26:31,240 Speaker 5: term, not the long term. 512 00:26:31,600 --> 00:26:35,440 Speaker 4: That results in more competition, so it actually improves the 513 00:26:35,520 --> 00:26:38,840 Speaker 4: rate of innovation. That means what might have taken twenty 514 00:26:38,880 --> 00:26:42,920 Speaker 4: years takes five, and the winners emerge. Exactly the same 515 00:26:43,000 --> 00:26:44,240 Speaker 4: is going to happen this time. 516 00:26:44,400 --> 00:26:45,800 Speaker 2: Yeah, I saw that. 517 00:26:46,040 --> 00:26:50,120 Speaker 3: I grew up in Waterloo, and BlackBerry comes from Waterloo. 518 00:26:50,200 --> 00:26:52,119 Speaker 2: Yep, everyone used to work for BlackBerry. 519 00:26:52,240 --> 00:26:52,320 Speaker 1: Yeah. 520 00:26:52,440 --> 00:26:55,679 Speaker 3: BlackBerry goes into its dive, and that's the best thing 521 00:26:55,720 --> 00:26:57,760 Speaker 3: that happened to Waterloo, because it was not just capital 522 00:26:57,800 --> 00:26:58,280 Speaker 3: but talent. 523 00:26:58,600 --> 00:27:00,800 Speaker 5: Yep. Talent went to many other companies. 524 00:27:00,840 --> 00:27:03,359 Speaker 3: So all these smart people went on to the next, really 525 00:27:03,440 --> 00:27:07,240 Speaker 3: more interesting thing.
And yeah, wait, you haven't answered. 526 00:27:08,240 --> 00:27:11,120 Speaker 3: So what is an idea that we are underestimating 527 00:27:11,119 --> 00:27:13,800 Speaker 3: at the moment, in the current kind of suite 528 00:27:13,880 --> 00:27:14,919 Speaker 3: of innovations? 529 00:27:15,119 --> 00:27:17,800 Speaker 4: So I don't think AI is being underestimated, because when 530 00:27:17,800 --> 00:27:19,320 Speaker 4: you look at the amount of capital and the amount 531 00:27:19,359 --> 00:27:23,000 Speaker 4: of things chasing it, I think it's incredible. I do 532 00:27:23,040 --> 00:27:25,520 Speaker 4: think that a lot of enterprises are deploying it in 533 00:27:25,560 --> 00:27:29,119 Speaker 4: the wrong place. They're running after shiny experiments. There's a 534 00:27:29,200 --> 00:27:32,359 Speaker 4: lot of basic things you can do to use AI 535 00:27:32,440 --> 00:27:33,800 Speaker 4: to improve the business today. 536 00:27:34,200 --> 00:27:36,280 Speaker 5: So that's really just my one advice to them. 537 00:27:36,600 --> 00:27:40,160 Speaker 4: Pick areas you can scale, don't pick the shiny little 538 00:27:40,400 --> 00:27:41,399 Speaker 4: toys on the side. 539 00:27:42,240 --> 00:27:44,480 Speaker 2: Then I think, for example, that 540 00:27:46,320 --> 00:27:51,679 Speaker 4: if anybody has more than ten percent of what they 541 00:27:51,720 --> 00:27:56,439 Speaker 4: had for customer service ten years ago, they're already five 542 00:27:56,520 --> 00:28:02,480 Speaker 4: years behind. If anybody is not using AI to make 543 00:28:02,560 --> 00:28:07,199 Speaker 4: their developers who write software thirty percent more productive today, 544 00:28:07,560 --> 00:28:11,280 Speaker 4: with the goal of being seventy percent more productive, that's 545 00:28:11,280 --> 00:28:13,240 Speaker 4: not to say you will need less, you'll just get 546 00:28:13,280 --> 00:28:14,160 Speaker 4: more software done.
547 00:28:15,560 --> 00:28:16,760 Speaker 5: Then they're not there. 548 00:28:17,080 --> 00:28:18,560 Speaker 4: And I would turn around and tell you, I think 549 00:28:18,600 --> 00:28:22,560 Speaker 4: only maybe five percent of the enterprises are there on both 550 00:28:22,400 --> 00:28:23,320 Speaker 5: those metrics today. 551 00:28:23,560 --> 00:28:28,280 Speaker 4: Yeah, yeah. And the one that is completely underestimated, I 552 00:28:28,359 --> 00:28:32,440 Speaker 4: kind of put it like this: Quantum today is where 553 00:28:32,600 --> 00:28:36,200 Speaker 4: GPUs and AI were in twenty fifteen, and I bet 554 00:28:36,320 --> 00:28:39,960 Speaker 4: you every AI person is thinking and hoping, I wish 555 00:28:40,000 --> 00:28:43,400 Speaker 4: I had started doing more in twenty fifteen as opposed 556 00:28:43,400 --> 00:28:46,600 Speaker 4: to waiting until twenty twenty two. Quantum today is there. 557 00:28:46,880 --> 00:28:49,240 Speaker 4: It's not yet good enough that you can get a 558 00:28:49,320 --> 00:28:51,960 Speaker 4: big advantage, but if you learn how to use it, 559 00:28:52,400 --> 00:28:55,840 Speaker 4: then in five years you'll be ready to exploit what comes. 560 00:28:55,960 --> 00:28:58,160 Speaker 3: Yeah, we're going to get to quantum in a moment.
But 561 00:28:58,200 --> 00:29:01,400 Speaker 3: I have a couple other AI questions. You know, as 562 00:29:01,520 --> 00:29:04,160 Speaker 3: you know, this conversation is part of this 563 00:29:04,200 --> 00:29:07,400 Speaker 3: thing that we do with IBM, Smart Talks, and 564 00:29:07,400 --> 00:29:11,640 Speaker 3: the last episode I did was on Kenya, which 565 00:29:11,640 --> 00:29:15,360 Speaker 3: has a massive deforestation problem, and IBM 566 00:29:15,600 --> 00:29:19,600 Speaker 3: took all the NASA satellite data, ran it through an LLM, 567 00:29:19,880 --> 00:29:23,600 Speaker 3: and gave them this incredibly precise ten meter by ten 568 00:29:23,680 --> 00:29:27,000 Speaker 3: meter analysis of what trees to plant, where to plant 569 00:29:27,000 --> 00:29:30,400 Speaker 3: them exactly, you know, an astonishing kind of 570 00:29:30,400 --> 00:29:33,880 Speaker 3: blueprint about how to fix their country ecologically. And it 571 00:29:33,920 --> 00:29:38,360 Speaker 3: made me think, when we analyze the potential of AI, 572 00:29:39,160 --> 00:29:41,400 Speaker 3: are we making a mistake by spending too much time thinking 573 00:29:41,400 --> 00:29:44,320 Speaker 3: about the developed world when it's actually the developing world 574 00:29:44,640 --> 00:29:47,040 Speaker 3: where the greatest ROI for this is? 575 00:29:47,160 --> 00:29:50,680 Speaker 4: To me, look, software technologies are wonderful in the sense they can 576 00:29:50,680 --> 00:29:52,880 Speaker 4: scale, and they can be an "and," so you don't 577 00:29:52,920 --> 00:29:56,520 Speaker 4: have to do one or the other. You mentioned deforestation. 578 00:29:57,080 --> 00:30:01,200 Speaker 4: How about the use of pesticides and fertilizers? We overuse them. 579 00:30:01,640 --> 00:30:04,840 Speaker 4: For irrigation, we tend to just flood everything, 580 00:30:04,920 --> 00:30:07,280 Speaker 4: as opposed to saying, okay, only that one needs it.
581 00:30:07,680 --> 00:30:09,240 Speaker 4: You could do all those things to get a ten 582 00:30:09,280 --> 00:30:13,000 Speaker 4: times effectiveness, and that all would apply to the developing world. 583 00:30:13,480 --> 00:30:16,560 Speaker 4: How about remote healthcare, or telehealth using an AI agent? 584 00:30:16,880 --> 00:30:21,560 Speaker 4: So the examples are numerous. In the developed world, I 585 00:30:21,720 --> 00:30:25,360 Speaker 4: believe we are running out of people. I know that 586 00:30:25,600 --> 00:30:29,160 Speaker 4: nobody likes to hear it. Most of the Far East 587 00:30:29,520 --> 00:30:31,400 Speaker 4: is going to have half the number of people by 588 00:30:31,480 --> 00:30:33,960 Speaker 4: twenty seventy compared to today. That's not that far away. 589 00:30:34,680 --> 00:30:38,000 Speaker 4: If I look at Europe, birth rates are far under 590 00:30:38,440 --> 00:30:43,040 Speaker 4: sustaining or keeping population flat. And the US, depending 591 00:30:43,040 --> 00:30:44,600 Speaker 4: on which number you want to look at, is either 592 00:30:44,640 --> 00:30:46,560 Speaker 4: one point six births per woman, 593 00:30:46,960 --> 00:30:50,480 Speaker 5: or two point two, or two point one. Why are there three numbers? 594 00:30:50,480 --> 00:30:52,240 Speaker 4: One point six is for women who were born in 595 00:30:52,240 --> 00:30:55,440 Speaker 4: the United States. It becomes two point two if you 596 00:30:55,520 --> 00:30:59,000 Speaker 4: include immigrant women. It becomes two point one if you 597 00:30:59,000 --> 00:31:00,520 Speaker 4: include children who make it in. 598 00:31:01,120 --> 00:31:03,560 Speaker 5: So you've got to decide, but the trend is obvious. 599 00:31:04,000 --> 00:31:04,760 Speaker 5: This is going down.
600 00:31:05,120 --> 00:31:08,120 Speaker 4: So AI in the developed world is going to be 601 00:31:08,320 --> 00:31:11,800 Speaker 4: essential, because to keep our current quality of life, you 602 00:31:11,880 --> 00:31:14,600 Speaker 4: need more work done, or what's going to do the 603 00:31:14,640 --> 00:31:16,320 Speaker 4: work if there aren't people to do the work? 604 00:31:16,560 --> 00:31:19,200 Speaker 5: So the problems are different in the two places. 605 00:31:19,280 --> 00:31:22,360 Speaker 3: Yeah. In the developing world, you get 606 00:31:22,400 --> 00:31:26,479 Speaker 3: access to a suite of technologies and things at a 607 00:31:26,600 --> 00:31:28,880 Speaker 3: price that you would never have been able to afford. 608 00:31:29,080 --> 00:31:29,440 Speaker 5: Correct. 609 00:31:29,720 --> 00:31:31,600 Speaker 2: That was my takeaway in talking about the Kenya thing. 610 00:31:31,600 --> 00:31:34,320 Speaker 3: It was like, it's maybe one of 611 00:31:34,320 --> 00:31:38,520 Speaker 3: the largest ecological projects of its kind, at fifteen billion 612 00:31:38,560 --> 00:31:39,600 Speaker 3: trees they want to plant. 613 00:31:39,520 --> 00:31:41,240 Speaker 4: And that is one country that might get it done, 614 00:31:41,280 --> 00:31:43,520 Speaker 4: because they do take a lot of pride in their 615 00:31:43,920 --> 00:31:47,680 Speaker 4: ecology and the sort of returning to the land and 616 00:31:47,720 --> 00:31:48,280 Speaker 4: giving back. 617 00:31:48,520 --> 00:31:48,760 Speaker 2: Yeah. 618 00:31:48,960 --> 00:31:54,200 Speaker 3: Yeah, what's different about IBM's version of AI versus some of your competitors'? 619 00:31:54,160 --> 00:31:56,760 Speaker 4: So we are not a consumer company, so we 620 00:31:56,800 --> 00:31:59,960 Speaker 4: have no focus on a B two C chatbot.
621 00:32:00,880 --> 00:32:03,440 Speaker 4: And the reason I say that is, if you're making 622 00:32:03,520 --> 00:32:06,280 Speaker 4: a B two C chatbot, does it help you 623 00:32:06,360 --> 00:32:09,920 Speaker 4: to make it even bigger and more computationally inefficient? And 624 00:32:09,960 --> 00:32:12,200 Speaker 4: the short answer is yes, because you have a certain 625 00:32:12,240 --> 00:32:14,920 Speaker 4: number of users, and, I kind 626 00:32:14,920 --> 00:32:18,880 Speaker 4: of say this jokingly, if I add Finnish to French capabilities, 627 00:32:19,120 --> 00:32:22,360 Speaker 4: I can probably add five million users. If I add 628 00:32:23,040 --> 00:32:25,000 Speaker 4: writing a haiku, I might be able to add 629 00:32:25,000 --> 00:32:29,000 Speaker 4: another five million users. If I add writing an email 630 00:32:29,080 --> 00:32:31,160 Speaker 4: in the voice of Steinbeck, I can probably add another 631 00:32:31,160 --> 00:32:34,800 Speaker 4: five million users. All those things add users. But if my goal 632 00:32:35,040 --> 00:32:38,440 Speaker 4: is to help a company summarize legal documents 633 00:32:38,480 --> 00:32:39,720 Speaker 4: in English, 634 00:32:40,040 --> 00:32:41,280 Speaker 5: that can be a model 635 00:32:41,000 --> 00:32:45,920 Speaker 4: that's one hundredth the size, as effective, probably higher quality, but 636 00:32:46,040 --> 00:32:49,360 Speaker 4: I don't need to go wide. So if you're focusing 637 00:32:49,360 --> 00:32:52,840 Speaker 4: on the enterprise, that actually takes away the focus of 638 00:32:52,920 --> 00:32:56,920 Speaker 4: having to go to extremely large models, which by definition 639 00:32:56,960 --> 00:33:01,320 Speaker 4: are going to be computationally expensive, power hungry, and demand 640 00:33:01,360 --> 00:33:02,480 Speaker 4: lots and lots of data. 641 00:33:02,920 --> 00:33:04,480 Speaker 5: So I can turn around and tell the enterprise,
642 00:33:04,560 --> 00:33:06,719 Speaker 4: you don't need to worry about copyright issues, about all 643 00:33:06,720 --> 00:33:09,240 Speaker 4: those, because you can train on a much smaller amount 644 00:33:09,240 --> 00:33:09,680 Speaker 4: of data. 645 00:33:10,080 --> 00:33:11,240 Speaker 5: And now, by 646 00:33:11,080 --> 00:33:14,160 Speaker 4: the way, tuning it for yourself is a weekend exercise, 647 00:33:14,400 --> 00:33:17,720 Speaker 4: it's not six months on a big supercomputer 648 00:33:17,800 --> 00:33:19,360 Speaker 4: cluster somewhere out there. 649 00:33:19,520 --> 00:33:21,200 Speaker 5: That's one big difference of what we do. 650 00:33:21,920 --> 00:33:27,400 Speaker 4: Second, we are very focused on helping with those problems that 651 00:33:27,440 --> 00:33:29,680 Speaker 4: can give people immediate benefit, 652 00:33:29,320 --> 00:33:30,720 Speaker 5: where we have domain knowledge. 653 00:33:30,880 --> 00:33:36,560 Speaker 4: So our domain knowledge is around operations, is around programming and coding, 654 00:33:37,280 --> 00:33:43,400 Speaker 4: is around customer service, is around customer experience, logistics, procurement. 655 00:33:43,880 --> 00:33:47,200 Speaker 4: Those are the areas where we have a lot of expertise. 656 00:33:47,880 --> 00:33:51,120 Speaker 4: And then, three, we kind of apply it to ourselves, 657 00:33:51,800 --> 00:33:54,200 Speaker 4: and so we are not asking our clients to be 658 00:33:54,320 --> 00:33:55,680 Speaker 4: the first experiment on it. 659 00:33:55,760 --> 00:33:59,040 Speaker 5: We say you can leverage what we did. We're happy to
660 00:33:59,000 --> 00:34:01,840 Speaker 4: bring out all our learnings, including what needs to change 661 00:34:02,320 --> 00:34:04,719 Speaker 4: in the process, because the biggest change is not technology, 662 00:34:05,120 --> 00:34:08,000 Speaker 4: it's getting people to accept that there's a different way 663 00:34:08,040 --> 00:34:08,720 Speaker 4: to do things. 664 00:34:09,640 --> 00:34:14,080 Speaker 3: Are there challenges to explaining what makes you different to potential customers? 665 00:34:14,080 --> 00:34:17,040 Speaker 4: For sure. The shiny object is always attractive. Oh, I 666 00:34:17,120 --> 00:34:20,480 Speaker 4: can go and try ChatGPT. Why don't you have 667 00:34:20,520 --> 00:34:21,560 Speaker 4: your GPT version? 668 00:34:23,080 --> 00:34:24,240 Speaker 2: Do you use ChatGPT? 669 00:34:25,200 --> 00:34:26,040 Speaker 5: I have used it. 670 00:34:27,440 --> 00:34:29,440 Speaker 3: I asked it a question recently which I thought was 671 00:34:29,480 --> 00:34:32,800 Speaker 3: really simple, and it made up about ten people. 672 00:34:34,239 --> 00:34:35,440 Speaker 2: Anyway, I had a bad experience. 673 00:34:35,480 --> 00:34:38,719 Speaker 4: I would think that that's the fundamental issue with all 674 00:34:39,040 --> 00:34:41,480 Speaker 4: LLMs as they get larger. Yeah, because you have to 675 00:34:41,520 --> 00:34:45,560 Speaker 4: ask, what was the original insight that led to these? 676 00:34:46,719 --> 00:34:50,960 Speaker 4: It was a reward function with intent. So if it 677 00:34:51,040 --> 00:34:55,000 Speaker 4: has learned by using a reward function, its reward function 678 00:34:55,200 --> 00:34:59,160 Speaker 4: comes from giving an answer that satisfies you. So if 679 00:34:59,160 --> 00:35:01,279 Speaker 4: it thinks that making up an answer 680 00:35:01,320 --> 00:35:04,600 Speaker 4: will satisfy you, how will you stop it?
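Arvind's reward-function point can be captured in a tiny sketch (a hypothetical scoring function with made-up weights and numbers, not anything from IBM or the conversation): if the reward only measures how satisfying an answer sounds, a confident fabrication outranks an honest refusal.

```python
# Toy illustration of the point above: a reward that scores only how
# satisfying an answer *sounds* has no term that checks truthfulness,
# so a confident made-up answer wins. All numbers are invented.

def satisfaction_reward(answer: dict) -> float:
    """Hypothetical reward proxy: users rate confident,
    complete-sounding answers highly, true or not."""
    return 0.7 * answer["confidence"] + 0.3 * answer["completeness"]

candidates = [
    {"text": "Here are ten experts on that topic: ...",   # fabricated
     "confidence": 0.95, "completeness": 0.9, "truthful": False},
    {"text": "I don't have reliable information on that.",
     "confidence": 0.3, "completeness": 0.2, "truthful": True},
]

best = max(candidates, key=satisfaction_reward)
print(best["text"], "| truthful:", best["truthful"])
# The fabricated answer scores 0.935 vs 0.27, so it is selected.
```

Nothing in this toy reward penalizes fabrication, which is the mechanism he describes: the model is optimized to please the asker, like the clever college kid.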
Why do 681 00:35:04,680 --> 00:35:08,520 Speaker 4: we think this is different than the clever college kid 682 00:35:08,520 --> 00:35:09,439 Speaker 4: who doesn't know an answer 683 00:35:09,480 --> 00:35:12,840 Speaker 5: but bullshits their way to an answer? Well, it's exactly 684 00:35:12,920 --> 00:35:13,239 Speaker 5: the same. 685 00:35:13,280 --> 00:35:16,240 Speaker 3: It's like the example of Clever Hans, that story, 686 00:35:16,120 --> 00:35:18,920 Speaker 3: the horse that they thought could speak, when all it 687 00:35:19,000 --> 00:35:21,880 Speaker 2: was doing was pleasing its master. Yes, it is 688 00:35:21,880 --> 00:35:22,959 Speaker 2: a little bit of Clever Hans. 689 00:35:23,040 --> 00:35:25,240 Speaker 5: Yeah, it's like dogs kind of imitating and looking. 690 00:35:26,520 --> 00:35:31,640 Speaker 3: What would you identify as the most significant bottleneck in 691 00:35:31,719 --> 00:35:34,200 Speaker 3: the development of AI? What's slowing us down right now? 692 00:35:36,080 --> 00:35:40,600 Speaker 4: I am not convinced that LLMs are the way to 693 00:35:40,719 --> 00:35:45,880 Speaker 4: get much beyond where we are; we'll get incremental improvements. But I, 694 00:35:46,360 --> 00:35:49,319 Speaker 4: for one, don't believe that LLMs are going to get 695 00:35:49,400 --> 00:35:55,480 Speaker 4: us to superintelligence or AGI. So I'll park that 696 00:35:55,600 --> 00:35:58,120 Speaker 4: on the side and simply say, we have to find 697 00:35:58,120 --> 00:36:02,680 Speaker 4: a way to fuse. How do you represent knowledge, 698 00:36:03,239 --> 00:36:06,000 Speaker 4: as opposed to having to statistically rediscover it each time 699 00:36:06,000 --> 00:36:09,240 Speaker 4: I ask a question? And how do we fuse knowledge 700 00:36:09,320 --> 00:36:15,760 Speaker 4: with LLMs?
Maybe then we'll get to leaps beyond 701 00:36:15,800 --> 00:36:20,360 Speaker 4: today. On LLMs alone, my view is, I think we 702 00:36:20,440 --> 00:36:24,520 Speaker 4: can get a thousand x efficiency in power and cost 703 00:36:24,640 --> 00:36:28,040 Speaker 4: and compute from today. So if you make something a 704 00:36:28,120 --> 00:36:31,359 Speaker 4: thousand times cheaper, would people use a lot more of it? 705 00:36:32,520 --> 00:36:32,880 Speaker 2: Yes? 706 00:36:33,719 --> 00:36:37,040 Speaker 4: And I think those answers lie, as usual, in compute. 707 00:36:37,239 --> 00:36:42,080 Speaker 4: So advances in semiconductors, advances in software, and advances in 708 00:36:42,120 --> 00:36:45,160 Speaker 4: algorithmic techniques, all three. But right now we're not working 709 00:36:45,160 --> 00:36:46,600 Speaker 4: on any of those three. We're just taking the current 710 00:36:46,600 --> 00:36:51,280 Speaker 4: semiconductors and going for more. We're taking the current algorithmic 711 00:36:51,360 --> 00:36:53,319 Speaker 4: techniques and not really trying to invent new ones. 712 00:36:53,440 --> 00:36:56,440 Speaker 5: So I think those will all happen in less than five years. 713 00:36:58,200 --> 00:37:01,640 Speaker 3: But why? You're saying we're in a moment 714 00:37:01,640 --> 00:37:06,719 Speaker 3: where people are not pursuing the optimal strategy for 715 00:37:06,800 --> 00:37:08,040 Speaker 3: exploiting this technology. 716 00:37:08,360 --> 00:37:13,120 Speaker 4: Why? Because when you see a few people running really 717 00:37:13,160 --> 00:37:16,200 Speaker 4: hard and they're willing to invest any amount of money, 718 00:37:16,200 --> 00:37:20,239 Speaker 4: so efficiency is not the focus. People feel if you 719 00:37:20,280 --> 00:37:21,880 Speaker 4: don't do the same, you'll get left behind.
720 00:37:23,120 --> 00:37:25,879 Speaker 2: So is this a case where there's too much money? 721 00:37:26,040 --> 00:37:27,160 Speaker 2: No one's ever had to ask for more. 722 00:37:27,239 --> 00:37:27,640 Speaker 5: Right? Ever? 723 00:37:28,200 --> 00:37:33,480 Speaker 3: Yeah, but is this a consequence of overinvestment in 724 00:37:33,520 --> 00:37:34,560 Speaker 3: the field? 725 00:37:34,680 --> 00:37:36,960 Speaker 4: Going back to my internet analogy, if two out of 726 00:37:37,040 --> 00:37:40,440 Speaker 4: ten are going to succeed, how do you guarantee, or 727 00:37:40,440 --> 00:37:42,120 Speaker 4: how do you improve the odds, that you are one 728 00:37:42,120 --> 00:37:45,080 Speaker 4: of those two? So if you pause to say I 729 00:37:45,120 --> 00:37:47,200 Speaker 4: want to become more efficient, that's not the way to win. 730 00:37:47,520 --> 00:37:49,640 Speaker 5: So first you win, then you become efficient. 731 00:37:50,719 --> 00:37:54,560 Speaker 3: Yeah. Let's talk about what I was told is your 732 00:37:54,560 --> 00:37:59,759 Speaker 3: favorite topic: it's quantum. It is. Why don't we even go 733 00:37:59,800 --> 00:38:00,279 Speaker 3: further? 734 00:38:00,400 --> 00:38:01,920 Speaker 2: Why is quantum your favorite topic? 735 00:38:03,160 --> 00:38:06,080 Speaker 4: We've only had two kinds of compute in history, 736 00:38:06,960 --> 00:38:10,319 Speaker 4: so nineteen forty five, to use that year for 737 00:38:10,400 --> 00:38:13,160 Speaker 4: ENIAC, all the way till twenty twenty, we had one 738 00:38:13,200 --> 00:38:16,040 Speaker 4: kind of computer, what today you would call a classical computer.
739 00:38:17,600 --> 00:38:22,080 Speaker 4: Then GPUs and AI came around, so you would say 740 00:38:21,880 --> 00:38:25,080 Speaker 4: the intuition there is you went from sort of bits, 741 00:38:25,120 --> 00:38:29,200 Speaker 4: which is Boolean algebra, or high school algebra, to including neurons, 742 00:38:29,320 --> 00:38:32,719 Speaker 4: which is captured in linear algebra. That gives you 743 00:38:32,719 --> 00:38:36,200 Speaker 4: a different kind of math. It can do problems that are 744 00:38:36,280 --> 00:38:38,359 Speaker 4: really hard to do, I don't say impossible, just hard 745 00:38:38,400 --> 00:38:43,000 Speaker 4: to do, on normal computers. Quantum adds a third kind 746 00:38:43,000 --> 00:38:49,560 Speaker 4: of math. Yes, there are the physics properties which really get people 747 00:38:49,719 --> 00:38:52,640 Speaker 4: energized and the imagination going, and we use all these 748 00:38:52,640 --> 00:38:57,160 Speaker 4: words about entanglement and superposition. But maybe because I'm 749 00:38:57,160 --> 00:38:59,800 Speaker 4: a bit of a math guy, the real thing is 750 00:39:00,200 --> 00:39:03,120 Speaker 4: a third kind of math. To make it really simple, 751 00:39:03,600 --> 00:39:06,919 Speaker 4: a third kind of math that comes from the field 752 00:39:06,960 --> 00:39:08,120 Speaker 4: of abstract algebra. 753 00:39:08,560 --> 00:39:09,360 Speaker 5: It does the math. 754 00:39:10,320 --> 00:39:13,440 Speaker 4: You can use Hamiltonians, for those who like physics, or 755 00:39:13,480 --> 00:39:15,640 Speaker 4: you can use the word Lie algebras, for those who 756 00:39:15,800 --> 00:39:19,759 Speaker 4: like abstract mathematics. If you can do a third kind 757 00:39:19,760 --> 00:39:22,759 Speaker 4: of math, which algorithms are suited to that third kind 758 00:39:22,760 --> 00:39:26,120 Speaker 4: of math?
So it excites me because we can now 759 00:39:26,400 --> 00:39:29,360 Speaker 4: approach algorithms that you just could never do on the 760 00:39:29,400 --> 00:39:32,319 Speaker 4: other two, it's impossible. Now, it's different than AI. It's 761 00:39:32,320 --> 00:39:35,600 Speaker 4: not data intensive, it's compute intensive. So we kind of 762 00:39:35,640 --> 00:39:38,560 Speaker 4: had compute in supercomputers. Then we went to data, which 763 00:39:38,600 --> 00:39:40,840 Speaker 4: is AI. And now if you say there's another class 764 00:39:40,840 --> 00:39:42,520 Speaker 4: of problems that requires lots of compute, 765 00:39:42,680 --> 00:39:44,440 Speaker 5: that's quantum. 766 00:39:44,480 --> 00:39:47,120 Speaker 3: A couple months ago I was at the T.J. Watson 767 00:39:47,160 --> 00:39:49,240 Speaker 3: Research Center, and they have, you know, on the ground floor, 768 00:39:49,239 --> 00:39:50,680 Speaker 3: they have those behind the glass. 769 00:39:51,239 --> 00:39:55,279 Speaker 2: These incredibly exciting-looking machines. But where are we in 770 00:39:55,320 --> 00:39:56,399 Speaker 2: the timeline of this? 771 00:39:57,960 --> 00:40:01,560 Speaker 4: Three to five years away from shocking people. 772 00:40:02,520 --> 00:40:03,839 Speaker 2: What does shocking people mean? 773 00:40:04,200 --> 00:40:07,000 Speaker 5: Doing something that nobody thought was possible in that timeline. 774 00:40:07,280 --> 00:40:08,720 Speaker 2: Does an example come to mind? 775 00:40:09,560 --> 00:40:14,080 Speaker 4: I was actually pleasantly surprised. One of our clients, HSBC, 776 00:40:15,160 --> 00:40:19,280 Speaker 4: last week published a result that, using a quantum computer, 777 00:40:20,520 --> 00:40:24,680 Speaker 4: bond trading was thirty four percent more accurate than their 778 00:40:24,719 --> 00:40:25,520 Speaker 4: prior technique. 779 00:40:26,440 --> 00:40:27,640 Speaker 2: Thirty four percent.
780 00:40:27,760 --> 00:40:28,560 Speaker 5: Thirty four percent. 781 00:40:29,120 --> 00:40:32,480 Speaker 3: This is an industry that's used to one percent, correct, 782 00:40:32,760 --> 00:40:34,000 Speaker 3: zero point five percent. 783 00:40:34,480 --> 00:40:36,880 Speaker 2: Yes, that's astonishing. 784 00:40:37,160 --> 00:40:39,960 Speaker 4: Now, that was not at a scale where they could 785 00:40:40,000 --> 00:40:42,600 Speaker 4: turn it into production today, but that was sort of 786 00:40:43,040 --> 00:40:45,520 Speaker 4: their original thought experiment, and that's what they did. 787 00:40:46,040 --> 00:40:51,239 Speaker 5: Now, can you imagine when somebody will... So you 788 00:40:50,960 --> 00:40:53,920 Speaker 4: were correct, you talk about an industry where one basis point, 789 00:40:54,239 --> 00:40:57,360 Speaker 4: if I remember, I may be wrong, like thirteen trillion 790 00:40:57,440 --> 00:40:59,720 Speaker 4: dollars of money kind of moves around in the financial 791 00:40:59,719 --> 00:41:07,080 Speaker 4: industry each day, right? So a basis point would be one point three 792 00:41:07,120 --> 00:41:11,600 Speaker 4: billion, something like that, right, one over ten thousand. So 793 00:41:11,920 --> 00:41:14,280 Speaker 4: when you think about the kind of profit that people 794 00:41:14,280 --> 00:41:16,799 Speaker 4: can make, if you can tell somebody that you can 795 00:41:16,840 --> 00:41:21,680 Speaker 4: come up with a better price than your competition by 796 00:41:21,800 --> 00:41:24,759 Speaker 4: just one basis point, they would actually gain the entire 797 00:41:24,840 --> 00:41:30,040 Speaker 4: market share. Yeah, so I think something around there, or 798 00:41:30,080 --> 00:41:31,680 Speaker 4: something in the world of materials. 799 00:41:31,960 --> 00:41:35,160 Speaker 5: Can we make a better battery? Could we make a 800 00:41:35,200 --> 00:41:36,200 Speaker 5: solid state battery?
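The back-of-the-envelope arithmetic behind that basis-point remark can be checked in a couple of lines. This is just the conversation's own rough figures restated (illustrative numbers, not verified market data):

```python
# One basis point of a ~$13 trillion daily flow, per the rough
# figures in the conversation (illustrative, not verified market data).

daily_flow_usd = 13e12      # ~$13 trillion moving through markets each day
basis_point = 1 / 10_000    # one basis point = 0.01%

edge_per_day = daily_flow_usd * basis_point
print(f"~${edge_per_day / 1e9:.1f} billion per day")  # prints "~$1.3 billion per day"
```

So a one-basis-point pricing edge on that flow is on the order of a billion dollars a day, which is the point being made about market share.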
801 00:41:37,520 --> 00:41:42,239 Speaker 4: Which means your risks of fires and heating decrease dramatically. 802 00:41:42,280 --> 00:41:44,560 Speaker 3: And the reason, sorry to ask a really naive question, 803 00:41:45,920 --> 00:41:48,279 Speaker 3: why is it that a quantum computer would be better 804 00:41:48,320 --> 00:41:52,840 Speaker 3: at solving a battery problem than our existing methods of computing? 805 00:41:52,920 --> 00:41:57,080 Speaker 4: So the equations of quantum mechanics and chemistry and how 806 00:41:57,640 --> 00:42:02,000 Speaker 4: things interact are well known. To solve them, there are 807 00:42:02,000 --> 00:42:04,279 Speaker 4: no known techniques. These are not closed form, 808 00:42:04,320 --> 00:42:06,160 Speaker 4: you know, it's not like the square root of a 809 00:42:06,239 --> 00:42:09,480 Speaker 4: quadratic equation. So the only way to solve them is 810 00:42:09,520 --> 00:42:12,480 Speaker 4: to explore the state space. So if you have a 811 00:42:12,480 --> 00:42:17,399 Speaker 4: few hundred electrons, you need two to the one hundred states. Well, 812 00:42:17,480 --> 00:42:19,719 Speaker 4: I'm sorry, you don't have that much memory. It's impossible. 813 00:42:20,160 --> 00:42:23,000 Speaker 4: So it takes a really, really long time on a 814 00:42:23,040 --> 00:42:26,160 Speaker 4: normal computer to solve those problems. Right? And that's a simple 815 00:42:26,200 --> 00:42:31,280 Speaker 4: problem. A quantum computer operates in the equation domain, 816 00:42:31,760 --> 00:42:33,759 Speaker 4: so it doesn't need to explore the state space. It can 817 00:42:33,800 --> 00:42:36,080 Speaker 4: actually solve it. That's why I call it a different 818 00:42:36,200 --> 00:42:38,600 Speaker 4: kind of math. That's the kind of math it does. 819 00:42:39,239 --> 00:42:42,000 Speaker 4: So in a couple of seconds, it can tell you 820 00:42:42,400 --> 00:42:44,080 Speaker 4: this is how that material will behave.
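The state-space point can be made concrete with a quick sketch (a generic illustration, not IBM code): brute-force classical simulation of a quantum system of n two-level constituents means storing 2^n complex amplitudes, which outruns any real memory long before n reaches a few hundred.

```python
# Why "a few hundred electrons" defeats a normal computer:
# a system of n two-level constituents has 2**n basis states,
# so a brute-force simulator must hold 2**n complex amplitudes.

def amplitudes_needed(n: int) -> int:
    """Number of complex amplitudes in the full state vector."""
    return 2 ** n

def memory_bytes(n: int, bytes_per_amplitude: int = 16) -> int:
    """Memory to store that vector (one complex128 = 16 bytes)."""
    return amplitudes_needed(n) * bytes_per_amplitude

for n in (10, 50, 100):
    gb = memory_bytes(n) / 1e9
    print(f"n={n:>3}: {amplitudes_needed(n):.2e} amplitudes, ~{gb:.2e} GB")
```

At n = 100 the state vector alone needs on the order of 10^22 gigabytes, which is the "you don't have that much memory" problem in a nutshell, and why operating in the equation domain instead of the state space is such a different proposition.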
821 00:42:45,560 --> 00:42:46,960 Speaker 2: Oh, I see, so you've 822 00:42:46,760 --> 00:42:50,000 Speaker 4: taken what could take years down to a few seconds. Yeah, 823 00:42:50,080 --> 00:42:51,640 Speaker 4: that's a pretty big change. 824 00:42:51,760 --> 00:42:54,080 Speaker 2: Yeah. Yeah, it's speaking a different language. 825 00:42:54,360 --> 00:42:54,640 Speaker 5: Different. 826 00:42:54,880 --> 00:42:57,120 Speaker 2: So any kind of problem that comes along 827 00:42:56,960 --> 00:43:00,000 Speaker 3: that's specific to that language. Correct, which is not all problems. 828 00:43:00,080 --> 00:43:02,040 Speaker 2: Yeah, just as you call it, 829 00:43:02,040 --> 00:43:03,120 Speaker 5: it's one more kind of math. 830 00:43:03,440 --> 00:43:07,719 Speaker 3: Yeah, what's an example? So many questions. Give 831 00:43:07,719 --> 00:43:11,359 Speaker 3: me another example of a kind 832 00:43:11,400 --> 00:43:13,439 Speaker 3: of problem that a quantum computer would love. 833 00:43:15,239 --> 00:43:18,560 Speaker 4: This one is a bit more speculative, and I'm going 834 00:43:18,600 --> 00:43:21,839 Speaker 4: to use a little bit of poetic license. So let's 835 00:43:21,840 --> 00:43:25,319 Speaker 4: take a post office in a mid size country. They 836 00:43:25,400 --> 00:43:28,120 Speaker 4: probably burn a billion gallons of fuel per year delivering 837 00:43:28,120 --> 00:43:31,640 Speaker 4: packages and letters, because most postal services in advanced countries serve 838 00:43:31,960 --> 00:43:37,440 Speaker 4: every house, every address, each day. The way to optimize 839 00:43:37,480 --> 00:43:39,840 Speaker 4: this is, we can formulate the problem. It's called the 840 00:43:39,880 --> 00:43:44,040 Speaker 4: traveling salesman problem. Solving it is really hard, so 841 00:43:44,040 --> 00:43:48,239 Speaker 4: people have heuristics.
Let's suppose today our heuristics get us 842 00:43:48,280 --> 00:43:51,880 Speaker 4: to within twenty percent of the optimal answer. Let's suppose 843 00:43:51,880 --> 00:43:55,879 Speaker 4: a quantum computer can get you the next ten percent. Well, 844 00:43:55,920 --> 00:43:58,160 Speaker 4: if I can get ten percent of a billion gallons, 845 00:43:58,160 --> 00:43:59,800 Speaker 4: that, I think, is one hundred million gallons, if my 846 00:43:59,880 --> 00:44:02,840 Speaker 4: math is right. And in the country I'm thinking about, 847 00:44:03,080 --> 00:44:06,360 Speaker 4: that could be eight hundred million pounds of saving to 848 00:44:06,440 --> 00:44:12,040 Speaker 4: one entity in one year. And the associated carbon footprint, 849 00:44:12,160 --> 00:44:16,000 Speaker 4: climate change, wear and tear, less mileage on vehicles, I'm not 850 00:44:16,000 --> 00:44:19,719 Speaker 4: even counting all that. These are pretty attractive problems to 851 00:44:19,760 --> 00:44:22,560 Speaker 4: go after. So if I look at the interest recently, 852 00:44:23,800 --> 00:44:28,239 Speaker 4: New York started a whole program. In some places, Illinois 853 00:44:28,680 --> 00:44:32,400 Speaker 4: stood up a quantum algorithm center between a number of 854 00:44:32,400 --> 00:44:37,319 Speaker 4: the universities. The governor there was heavily behind it, etc. 855 00:44:38,040 --> 00:44:41,080 Speaker 4: So I wouldn't say that this is widespread. This is 856 00:44:41,080 --> 00:44:42,960 Speaker 4: why I'm saying three to four years for that moment. 857 00:44:43,600 --> 00:44:47,200 Speaker 4: But there's enough people who are deeply cognizant who are saying, 858 00:44:47,200 --> 00:44:49,279 Speaker 4: wait a moment, we kind of get it. 859 00:44:49,320 --> 00:44:51,000 Speaker 5: This is a new kind of math. What are the 860 00:44:51,040 --> 00:44:52,120 Speaker 5: new problems we can solve?
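The routing problem described here is the classic traveling salesman problem, and the heuristics mentioned are rules like the nearest-neighbor sketch below (a generic textbook illustration, not an IBM algorithm): it produces a legal route quickly, but in general lands some way above the true optimum, which is exactly the gap the conversation suggests quantum methods might close.

```python
import math

# Nearest-neighbor heuristic for a delivery route (traveling salesman
# problem): always hop to the closest unvisited address. Fast, but in
# general only approximate -- the kind of gap better methods could close.

def tour_length(points, order):
    """Length of the closed tour visiting points in the given order."""
    return sum(
        math.dist(points[order[i]], points[order[(i + 1) % len(order)]])
        for i in range(len(order))
    )

def nearest_neighbor_tour(points, start=0):
    """Greedy route: repeatedly visit the closest unvisited point."""
    unvisited = set(range(len(points))) - {start}
    order = [start]
    while unvisited:
        here = points[order[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(here, points[i]))
        order.append(nxt)
        unvisited.remove(nxt)
    return order

# Four addresses at the corners of a 3 x 2 rectangle; the optimal
# closed tour is the perimeter, length 10.0.
addresses = [(0, 0), (0, 2), (3, 2), (3, 0)]
route = nearest_neighbor_tour(addresses)
print(route, tour_length(addresses, route))  # [0, 1, 2, 3] 10.0
```

On four points the greedy route happens to be optimal; on realistic address sets it typically is not, which is what makes the "within twenty percent of optimal" framing above plausible and the remaining gap worth real money in fuel.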
861 00:44:52,840 --> 00:44:55,280 Speaker 4: And the fact that we have roughly two hundred 862 00:44:55,280 --> 00:44:58,920 Speaker 4: clients who worked with us very early on small experiments 863 00:44:59,239 --> 00:45:01,600 Speaker 4: is because their interest is, I could do something here 864 00:45:01,600 --> 00:45:04,040 Speaker 4: that I couldn't do in other places. 865 00:45:04,320 --> 00:45:05,960 Speaker 2: Three to four years is not a long time. 866 00:45:06,400 --> 00:45:06,600 Speaker 5: No. 867 00:45:08,520 --> 00:45:10,280 Speaker 2: But if I'm in the battery business 868 00:45:10,520 --> 00:45:14,040 Speaker 3: and I don't have a line out to a quantum 869 00:45:14,239 --> 00:45:16,319 Speaker 3: computing experiment, 870 00:45:17,480 --> 00:45:19,359 Speaker 2: I have a problem. Do I have a problem? 871 00:45:19,800 --> 00:45:23,839 Speaker 4: Yeah, you'll probably be out of business in ten years. Well, 872 00:45:23,840 --> 00:45:25,680 Speaker 4: maybe you could write a big check and buy the 873 00:45:25,719 --> 00:45:27,040 Speaker 4: technology from somebody else. 874 00:45:27,719 --> 00:45:30,600 Speaker 3: Where does quantum rank in the 875 00:45:30,680 --> 00:45:33,840 Speaker 3: kind of great inventions of the last one hundred and 876 00:45:33,880 --> 00:45:34,719 Speaker 3: fifty years? 877 00:45:34,480 --> 00:45:40,000 Speaker 4: Equal to the semiconductor. And I think that if 878 00:45:40,080 --> 00:45:42,160 Speaker 4: semiconductors vanished, modern life 879 00:45:42,000 --> 00:45:43,840 Speaker 5: would stop, like just stop. 880 00:45:44,560 --> 00:45:53,040 Speaker 4: Yeah, no electricity, no automobiles, no streaming. You can imagine 881 00:45:53,080 --> 00:45:54,759 Speaker 4: the yells from all the kids who ever hear that: 882 00:45:54,840 --> 00:45:55,800 Speaker 4: no streaming? 883 00:45:57,680 --> 00:46:01,719 Speaker 3: It's funny, because
as someone 884 00:46:01,760 --> 00:46:05,480 Speaker 3: who's outside this world, I feel like quantum is underdiscussed 885 00:46:05,520 --> 00:46:09,520 Speaker 3: relative to its potential for transforming society. 886 00:46:09,480 --> 00:46:13,279 Speaker 4: Because, I'll use my internet example: ninety five was the 887 00:46:13,280 --> 00:46:18,200 Speaker 4: moment, with Netscape, that the internet came into people's consciousness. 888 00:46:18,239 --> 00:46:20,360 Speaker 4: Back in eighty five I considered it to be 889 00:46:20,400 --> 00:46:24,000 Speaker 4: a solved problem, but it needed something that made it 890 00:46:24,080 --> 00:46:25,760 Speaker 4: accessible, easy. 891 00:46:26,080 --> 00:46:28,680 Speaker 5: That was the browser. The Netscape browser 892 00:46:28,400 --> 00:46:32,640 Speaker 4: is what made it easy to understand. We are probably, 893 00:46:32,680 --> 00:46:35,600 Speaker 4: as I said, about four to five years from that moment. 894 00:46:35,640 --> 00:46:38,560 Speaker 4: That's why it's underdiscussed, because the moment I say 895 00:46:38,560 --> 00:46:40,480 Speaker 4: a new kind of math, I've probably lost ninety nine 896 00:46:40,520 --> 00:46:43,280 Speaker 4: percent of the audience. If I go to quantum mechanics, 897 00:46:43,320 --> 00:46:46,680 Speaker 4: I've probably lost ninety nine point nine percent of the audience. 898 00:46:48,480 --> 00:46:51,640 Speaker 3: So you, as CEO over the last five years, have 899 00:46:51,800 --> 00:46:54,680 Speaker 3: been really the birth mother for a lot of the 900 00:46:54,760 --> 00:46:59,319 Speaker 3: quantum computing work. I'm curious, so, when 901 00:46:59,360 --> 00:47:03,800 Speaker 3: you started as CEO, was this your first priority? 902 00:47:04,320 --> 00:47:06,840 Speaker 4: I had already started investing in it back in twenty 903 00:47:06,880 --> 00:47:12,520 Speaker 4: fifteen when I was leading IBM Research.
So let me 904 00:47:12,560 --> 00:47:14,360 Speaker 4: acknowledge, and nobody should try to copy it, that 905 00:47:14,400 --> 00:47:16,960 Speaker 4: I've had, I'll call it, a weird career. 906 00:47:17,400 --> 00:47:19,359 Speaker 5: I was a researcher at some point. 907 00:47:19,360 --> 00:47:20,640 Speaker 4: If you had asked me then, I'd have said I'm one 908 00:47:20,680 --> 00:47:22,239 Speaker 4: of those people, you know, throw a pizza under our 909 00:47:22,280 --> 00:47:23,560 Speaker 4: door and leave me alone. 910 00:47:23,719 --> 00:47:24,839 Speaker 5: I don't want to talk to people. 911 00:47:25,360 --> 00:47:27,719 Speaker 4: Then I decided I was interested in the business. Then 912 00:47:27,760 --> 00:47:30,759 Speaker 4: I went and started acquiring companies and doing that. Then 913 00:47:30,800 --> 00:47:32,840 Speaker 4: somebody told me, hey, why don't you start doing some 914 00:47:32,880 --> 00:47:37,560 Speaker 4: business strategy. Then I went back to research and led 915 00:47:37,560 --> 00:47:40,359 Speaker 4: our research division for a couple of years. And when 916 00:47:40,400 --> 00:47:44,160 Speaker 4: the people described it to me, I asked some questions. 917 00:47:44,239 --> 00:47:46,480 Speaker 4: It wasn't a big investment 918 00:47:46,000 --> 00:47:46,399 Speaker 5: at that time. 919 00:47:46,440 --> 00:47:48,719 Speaker 4: It was, hey, can we make a computer, not just 920 00:47:48,760 --> 00:47:52,239 Speaker 4: a science experiment? Can it run by itself all night? 921 00:47:52,760 --> 00:47:55,839 Speaker 4: Can you think about software so that even people who 922 00:47:55,840 --> 00:47:58,440 Speaker 4: are not deep in quantum mechanics can begin to use it? 923 00:47:59,360 --> 00:48:00,760 Speaker 5: And they began to do those things.
924 00:48:01,040 --> 00:48:05,640 Speaker 4: So over three, four years, I got enough confidence: yeah, okay, 925 00:48:05,640 --> 00:48:07,320 Speaker 4: this is something that can really work. 926 00:48:07,840 --> 00:48:09,279 Speaker 5: And then you've got to nurture it to 927 00:48:10,000 --> 00:48:12,759 Speaker 4: where it gets bigger and bigger, until you get the 928 00:48:12,800 --> 00:48:14,479 Speaker 4: confidence that, okay, now it's a big bet. 929 00:48:15,280 --> 00:48:18,120 Speaker 2: And what was the moment when you realized now 930 00:48:17,960 --> 00:48:22,240 Speaker 5: it's a big bet? Probably two or three years ago. 931 00:48:22,760 --> 00:48:25,280 Speaker 3: How do you decide, as the head of a company 932 00:48:25,360 --> 00:48:29,319 Speaker 3: like this, how much money, how many resources, how 933 00:48:29,360 --> 00:48:31,399 Speaker 3: many people, and what kind of prominence to give 934 00:48:31,440 --> 00:48:32,239 Speaker 3: to an idea like that? 935 00:48:33,000 --> 00:48:37,040 Speaker 4: So, three layers. One, the set of people who actually have 936 00:48:37,200 --> 00:48:41,680 Speaker 4: the knowledge and the intensity to fundamentally advance the technology. 937 00:48:42,400 --> 00:48:44,080 Speaker 5: If I could find more, I'd hire them. 938 00:48:44,080 --> 00:48:47,359 Speaker 4: So I'm constrained on people on that one, because normally 939 00:48:47,400 --> 00:48:49,600 Speaker 4: there's only so many people who can do these things. 940 00:48:50,320 --> 00:48:52,960 Speaker 5: Two, you've got to be careful. 941 00:48:53,239 --> 00:48:56,000 Speaker 4: If you push too hard on timing, you will get 942 00:48:56,000 --> 00:48:58,719 Speaker 4: people to take so much risk that actually the thing 943 00:48:58,760 --> 00:49:02,560 Speaker 4: will fail.
So that's the art, between the leadership 944 00:49:02,600 --> 00:49:06,279 Speaker 4: on the project and me, to say, okay, how hard 945 00:49:06,320 --> 00:49:08,959 Speaker 4: can you push? But not so hard that you cause 946 00:49:09,000 --> 00:49:12,160 Speaker 4: it to fail, because then they get compelled to commit to 947 00:49:12,200 --> 00:49:14,239 Speaker 4: timelines that are just impossible. 948 00:49:14,800 --> 00:49:17,480 Speaker 2: Yeah. How do you... This is fascinating. 949 00:49:17,600 --> 00:49:20,759 Speaker 3: So it's ultimately a question of judgment: trying to figure 950 00:49:20,760 --> 00:49:23,399 Speaker 3: out what's the sweet spot between enough pressure to keep 951 00:49:23,440 --> 00:49:25,960 Speaker 3: them ahead of the pack, but not too much pressure 952 00:49:26,360 --> 00:49:29,759 Speaker 3: so that they start taking risks. How do you calibrate 953 00:49:30,040 --> 00:49:32,839 Speaker 3: whether you're hitting that sweet spot? I mean, do you 954 00:49:33,040 --> 00:49:36,239 Speaker 3: reassess every few months and say, I think I'm over 955 00:49:36,280 --> 00:49:37,960 Speaker 3: correcting or undercorrecting at this moment? 956 00:49:38,800 --> 00:49:42,160 Speaker 4: So, one, you've got to have what I call, and 957 00:49:42,239 --> 00:49:44,920 Speaker 4: this is channeling a word from one of my favorite books, 958 00:49:44,960 --> 00:49:48,879 Speaker 4: The Geek Way: how open can you be? I want 959 00:49:48,920 --> 00:49:52,399 Speaker 4: to press hard, but the team knows that they're allowed 960 00:49:52,440 --> 00:49:56,600 Speaker 4: to push back and really argue back hard. That means 961 00:49:56,600 --> 00:50:02,360 Speaker 4: you'll probably get to that correct Goldilocks pressure. The 962 00:50:02,480 --> 00:50:05,240 Speaker 4: people themselves should want to go as hard as possible, 963 00:50:05,760 --> 00:50:09,320 Speaker 4: but not harder than possible.
So that is the personality 964 00:50:09,719 --> 00:50:12,000 Speaker 4: of leadership that makes sense. 965 00:50:12,239 --> 00:50:14,800 Speaker 2: But you have to be someone who people feel comfortable 966 00:50:14,840 --> 00:50:15,440 Speaker 2: being honest with. 967 00:50:15,640 --> 00:50:19,200 Speaker 5: Yes, absolutely. 968 00:50:18,960 --> 00:50:20,040 Speaker 2: And people feel comfortable being honest with you? 969 00:50:20,239 --> 00:50:20,880 Speaker 5: I believe so. 970 00:50:21,239 --> 00:50:25,719 Speaker 2: Yeah. Has there been a moment in this path 971 00:50:25,760 --> 00:50:28,560 Speaker 2: with quantum where you did think you were pushing too hard? 972 00:50:30,360 --> 00:50:35,000 Speaker 4: No, because I think that the leadership there will argue 973 00:50:35,000 --> 00:50:37,959 Speaker 4: back with me any day of the week. I don't 974 00:50:37,960 --> 00:50:40,400 Speaker 4: think that they feel that they have to defer. 975 00:50:41,480 --> 00:50:44,600 Speaker 3: Do you drop by at, sort of, Saturday night at 976 00:50:44,600 --> 00:50:46,040 Speaker 3: ten pm to see if people are working? 977 00:50:47,120 --> 00:50:53,080 Speaker 4: I tend to text people and ask questions. Like, 978 00:50:53,160 --> 00:50:55,880 Speaker 4: I'll read something and say, hey, are these people doing this? 979 00:50:56,440 --> 00:51:00,400 Speaker 4: And if they can answer me in reasonable terms, great: they're 980 00:51:00,440 --> 00:51:03,640 Speaker 4: already watching the competition, they're 981 00:51:03,680 --> 00:51:06,920 Speaker 4: watching the literature, they're watching the science. I don't need 982 00:51:06,960 --> 00:51:09,200 Speaker 4: to push hard. If they are already ahead of it, 983 00:51:09,239 --> 00:51:12,280 Speaker 4: they can answer my question, I'll say, thoughtfully, 984 00:51:12,600 --> 00:51:16,719 Speaker 4: not always completely accurately; they're thinking about it on their own.
985 00:51:16,719 --> 00:51:17,919 Speaker 4: I don't need to push. Yeah. 986 00:51:18,760 --> 00:51:21,239 Speaker 3: One last question I wanted to ask you: do you 987 00:51:21,320 --> 00:51:23,200 Speaker 3: have the most interesting job in America? 988 00:51:23,360 --> 00:51:25,719 Speaker 4: I believe that it's the most interesting job, which I 989 00:51:25,719 --> 00:51:26,880 Speaker 4: won't give up for anything. 990 00:51:27,880 --> 00:51:30,760 Speaker 2: It also sounds like you're enjoying yourself. 991 00:51:31,320 --> 00:51:34,600 Speaker 4: I enjoy it. Look, my role and 992 00:51:34,719 --> 00:51:38,719 Speaker 4: goal should be to make the enterprise thrive. As long 993 00:51:38,760 --> 00:51:42,040 Speaker 4: as I'm making the enterprise thrive and our clients delighted, 994 00:51:43,640 --> 00:51:46,520 Speaker 5: I love it. If I don't, somebody else should do it. 995 00:51:47,520 --> 00:51:50,840 Speaker 3: Arvind, this has been so much fun. Thank you so 996 00:51:50,960 --> 00:51:56,440 Speaker 3: much for taking the time, and a fascinating, completely fascinating conversation. 997 00:51:56,480 --> 00:51:58,120 Speaker 3: I wish I was one of those people who could 998 00:51:58,160 --> 00:52:00,440 Speaker 3: help you out with quantum, but I'm afraid I'm not. 999 00:52:01,080 --> 00:52:01,759 Speaker 3: In a few years. 1000 00:52:03,200 --> 00:52:04,680 Speaker 2: Good. Thank you so much. 1001 00:52:14,719 --> 00:52:18,400 Speaker 3: Smart Talks with IBM is produced by Matt Romano, Amy Gaines McQuaid, 1002 00:52:18,880 --> 00:52:23,080 Speaker 3: Trina Menino, and Jake Harper. Mastering by Sarah Bruguiere, music 1003 00:52:23,080 --> 00:52:28,880 Speaker 3: by Gramoscope. Strategy by Tatiana Lieberman, Cassidy Meyer, and Sophia Derlong. 1004 00:52:29,600 --> 00:52:32,239 Speaker 3: Smart Talks with IBM is a production of Pushkin Industries 1005 00:52:32,440 --> 00:52:36,840 Speaker 3: and Ruby Studio at iHeartMedia.
To find more Pushkin podcasts, 1006 00:52:37,239 --> 00:52:41,160 Speaker 3: listen on the iHeartRadio app, Apple Podcasts, or wherever you 1007 00:52:41,239 --> 00:52:45,160 Speaker 3: listen to podcasts. I'm Malcolm Gladwell. This is a paid 1008 00:52:45,160 --> 00:52:49,719 Speaker 3: advertisement from IBM. The conversations on this podcast don't necessarily 1009 00:52:49,800 --> 00:53:03,800 Speaker 3: represent IBM's positions, strategies, or opinions.