1 00:00:04,480 --> 00:00:12,680 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Today, we 2 00:00:12,720 --> 00:00:15,680 Speaker 1: are witness to one of those rare moments in history, 3 00:00:16,040 --> 00:00:19,279 Speaker 1: the rise of an innovative technology with the potential to 4 00:00:19,400 --> 00:00:24,120 Speaker 1: radically transform business and society forever. That technology, of course, 5 00:00:24,600 --> 00:00:28,159 Speaker 1: is artificial intelligence, and it's the central focus for this 6 00:00:28,360 --> 00:00:32,320 Speaker 1: new season of Smart Talks with IBM. Join hosts from 7 00:00:32,400 --> 00:00:36,120 Speaker 1: your favorite Pushkin podcasts as they talk with industry experts 8 00:00:36,120 --> 00:00:39,720 Speaker 1: and leaders to explore how businesses can integrate AI into 9 00:00:39,760 --> 00:00:43,080 Speaker 1: their workflows and help drive real change in this new 10 00:00:43,200 --> 00:00:46,839 Speaker 1: era of AI, and of course, host Malcolm Gladwell will 11 00:00:46,880 --> 00:00:49,200 Speaker 1: be there to guide you through the season and throw 12 00:00:49,280 --> 00:00:52,199 Speaker 1: in his two cents as well. Look out for new 13 00:00:52,200 --> 00:00:55,080 Speaker 1: episodes of Smart Talks with IBM every other week on 14 00:00:55,160 --> 00:00:59,360 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts, 15 00:00:59,560 --> 00:01:14,720 Speaker 1: and learn more at IBM dot com slash Smart Talks. 16 00:01:11,600 --> 00:01:12,479 Speaker 2: Pushkin. 17 00:01:16,120 --> 00:01:23,520 Speaker 3: Welcome, Welcome, Welcome to Smart Talks with IBM. 18 00:01:23,560 --> 00:01:27,199 Speaker 4: Hello, Hello, Welcome to Smart Talks with IBM, a podcast 19 00:01:27,360 --> 00:01:33,240 Speaker 4: from Pushkin Industries, iHeartRadio and IBM. I'm Malcolm Gladwell. This season, 20 00:01:33,480 --> 00:01:36,760 Speaker 4: we're diving back into the world of artificial intelligence, but 21 00:01:36,840 --> 00:01:42,199 Speaker 4: with a focus on the powerful concept of open: its possibilities, implications, 22 00:01:42,520 --> 00:01:46,280 Speaker 4: and misconceptions. We'll look at openness from a variety of 23 00:01:46,319 --> 00:01:50,200 Speaker 4: angles and explore how the concept is already reshaping industries, 24 00:01:50,680 --> 00:01:54,240 Speaker 4: ways of doing business, and our very notion of what's possible. 25 00:01:54,680 --> 00:01:57,200 Speaker 4: And for the first episode of this season, we're bringing 26 00:01:57,240 --> 00:01:58,680 Speaker 4: you a special conversation. 27 00:01:59,320 --> 00:02:00,200 Speaker 2: I recently sat 28 00:02:00,080 --> 00:02:03,120 Speaker 4: down with Rob Thomas. Rob is the Senior Vice 29 00:02:03,120 --> 00:02:07,200 Speaker 4: President of Software and Chief Commercial Officer of IBM. We 30 00:02:07,280 --> 00:02:09,919 Speaker 4: spoke to him in front of a live audience as 31 00:02:09,960 --> 00:02:13,480 Speaker 4: part of New York Tech Week. We discussed how businesses 32 00:02:13,560 --> 00:02:17,799 Speaker 4: can harness the immense productivity benefits of AI while implementing 33 00:02:17,840 --> 00:02:21,840 Speaker 4: it in a responsible and ethical manner. We also broke 34 00:02:21,880 --> 00:02:25,960 Speaker 4: down a fascinating concept that Rob has about AI, known 35 00:02:26,080 --> 00:02:30,720 Speaker 4: as the productivity paradox. Okay, let's get to the conversation.
36 00:02:38,160 --> 00:02:40,280 Speaker 2: How are we doing? Good, Rob? 37 00:02:40,320 --> 00:02:44,600 Speaker 4: This is our second time. We did one of these 38 00:02:44,880 --> 00:02:47,040 Speaker 4: in the middle of the pandemic. But now it's all 39 00:02:47,080 --> 00:02:48,760 Speaker 4: such a blur that I can't figure out when 40 00:02:48,760 --> 00:02:49,000 Speaker 4: it was. 41 00:02:49,120 --> 00:02:51,480 Speaker 3: I know, it's hard. Those were like blurry years. 42 00:02:51,520 --> 00:02:53,320 Speaker 3: You don't know what happened, right? 43 00:02:54,120 --> 00:02:56,000 Speaker 4: Well, it's good to see you, to meet you again. 44 00:02:57,000 --> 00:03:00,160 Speaker 4: I wanted to start by going back. You've been at IBM. 45 00:02:59,840 --> 00:03:04,280 Speaker 3: Twenty five years, right. Twenty five in July, believe it or not. 46 00:03:04,400 --> 00:03:06,040 Speaker 2: So you were a kid when you joined. 47 00:03:06,160 --> 00:03:06,680 Speaker 3: I was four. 48 00:03:09,080 --> 00:03:13,959 Speaker 4: So I want to contrast present day Rob and twenty 49 00:03:14,000 --> 00:03:18,880 Speaker 4: five years ago Rob. When you arrive at IBM, what 50 00:03:18,960 --> 00:03:20,640 Speaker 4: do you think your job is going to be? 51 00:03:20,680 --> 00:03:21,480 Speaker 2: Where your career is going? 52 00:03:21,800 --> 00:03:23,800 Speaker 4: Where do you think the kind of problems you're going 53 00:03:23,840 --> 00:03:25,079 Speaker 4: to be addressing are? 54 00:03:26,880 --> 00:03:29,160 Speaker 3: Well, it's kind of surreal because I joined IBM in 55 00:03:29,200 --> 00:03:34,920 Speaker 3: consulting and I'm coming out of school, and you quickly realize, wait, 56 00:03:34,960 --> 00:03:37,400 Speaker 3: the job of a consultant is to tell other companies 57 00:03:37,440 --> 00:03:40,080 Speaker 3: what to do. And I was like, I literally know nothing, 58 00:03:41,640 --> 00:03:43,480 Speaker 3: and so you're immediately trying to figure out, so how 59 00:03:43,480 --> 00:03:45,080 Speaker 3: am I going to be relevant given that I know 60 00:03:45,120 --> 00:03:48,480 Speaker 3: absolutely nothing to advise other companies on what they should 61 00:03:48,520 --> 00:03:51,320 Speaker 3: be doing. And I remember it well, like we were 62 00:03:51,360 --> 00:03:54,440 Speaker 3: sitting in a room. When you're a consultant, you're waiting 63 00:03:54,480 --> 00:03:57,200 Speaker 3: for somebody else to find work for you. A bunch 64 00:03:57,240 --> 00:03:59,680 Speaker 3: of us sitting in a room and somebody walks in 65 00:03:59,760 --> 00:04:03,680 Speaker 3: and says, we need somebody that knows Visio. Does anybody 66 00:04:03,680 --> 00:04:06,240 Speaker 3: know Visio? I'd never heard of Visio. I don't know 67 00:04:06,360 --> 00:04:09,680 Speaker 3: if anybody in the room had. So everybody's like sitting 68 00:04:09,680 --> 00:04:12,440 Speaker 3: around looking at their shoes. So finally I was like, 69 00:04:12,880 --> 00:04:15,600 Speaker 3: I know it. So I raised my hand. They're like, great, 70 00:04:15,800 --> 00:04:18,440 Speaker 3: we got a project for you next week. So I 71 00:04:18,480 --> 00:04:20,640 Speaker 3: was like, all right, I have like three days to 72 00:04:20,640 --> 00:04:24,440 Speaker 3: figure out what Visio is, and I hope I can 73 00:04:24,520 --> 00:04:26,960 Speaker 3: actually figure out how to use it. Now, luckily, it 74 00:04:27,040 --> 00:04:30,000 Speaker 3: wasn't like a programming language.
I mean, it's pretty much 75 00:04:30,000 --> 00:04:34,160 Speaker 3: a drag and drop capability. And so I literally left 76 00:04:34,200 --> 00:04:37,760 Speaker 3: the office, went to a bookstore, bought the first three 77 00:04:37,760 --> 00:04:40,320 Speaker 3: books on Visio I could find, spent the whole weekend 78 00:04:40,360 --> 00:04:43,880 Speaker 3: reading the books, and showed up and got to 79 00:04:43,880 --> 00:04:46,280 Speaker 3: work on the project. And so it was a bit 80 00:04:46,320 --> 00:04:49,960 Speaker 3: of a risky move, but I think that's kind of 81 00:04:50,000 --> 00:04:53,760 Speaker 3: how it works. If you don't take risk, you'll 82 00:04:53,800 --> 00:04:57,920 Speaker 3: never achieve. And so to some extent, everybody's 83 00:04:57,960 --> 00:05:00,640 Speaker 3: making everything up all the time. It's like, can you 84 00:05:00,760 --> 00:05:04,200 Speaker 3: learn faster than somebody else? That's what the difference is 85 00:05:04,680 --> 00:05:07,760 Speaker 3: in almost every part of life. And so it was 86 00:05:07,760 --> 00:05:10,360 Speaker 3: not planned, it was an accident, but it kind 87 00:05:10,360 --> 00:05:12,160 Speaker 3: of forced me to figure out that you're going to 88 00:05:12,240 --> 00:05:13,039 Speaker 3: have to figure things out. 89 00:05:13,400 --> 00:05:15,880 Speaker 4: You know, we're here to talk about AI, and I'm 90 00:05:15,920 --> 00:05:22,279 Speaker 4: curious about the evolution of your understanding, or IBM's understanding, 91 00:05:22,320 --> 00:05:24,760 Speaker 4: of AI. At what point in the last twenty 92 00:05:24,760 --> 00:05:27,320 Speaker 4: five years do you begin to think, oh, this is 93 00:05:27,360 --> 00:05:29,080 Speaker 4: really going to be at the core of what we 94 00:05:29,440 --> 00:05:31,440 Speaker 4: think about and work on at this company. 95 00:05:33,360 --> 00:05:37,520 Speaker 3: The computer scientist John McCarthy, he's the person 96 00:05:37,560 --> 00:05:41,200 Speaker 3: that's credited with coining the phrase artificial intelligence. It's like 97 00:05:41,240 --> 00:05:46,440 Speaker 3: in the fifties, and he made an interesting comment. 98 00:05:46,600 --> 00:05:48,920 Speaker 3: He said, once it works, it's no longer called AI, 99 00:05:51,160 --> 00:05:54,360 Speaker 3: and that then became what's called the AI effect, 100 00:05:54,440 --> 00:05:58,200 Speaker 3: which is, it seems very difficult, very mysterious, but once 101 00:05:58,240 --> 00:06:02,400 Speaker 3: it becomes commonplace, it's just no longer what it is. 102 00:06:02,760 --> 00:06:05,200 Speaker 3: And so if you put that frame on it, I 103 00:06:05,200 --> 00:06:07,520 Speaker 3: think we've always been doing AI at some level. And 104 00:06:07,680 --> 00:06:09,680 Speaker 3: I even think back to when I joined IBM in 105 00:06:09,920 --> 00:06:14,440 Speaker 3: ninety nine. At that point there was work on rules 106 00:06:14,480 --> 00:06:20,480 Speaker 3: based engines, analytics, all of this was happening. So it 107 00:06:20,520 --> 00:06:23,880 Speaker 3: all depends on how you really define that term. You 108 00:06:23,920 --> 00:06:29,120 Speaker 3: could argue that, you know, elements of statistics, probability.
It's 109 00:06:29,120 --> 00:06:32,120 Speaker 3: not exactly AI, but it certainly feeds into it, and 110 00:06:32,200 --> 00:06:35,680 Speaker 3: so I feel like we've been working on this topic 111 00:06:35,720 --> 00:06:41,120 Speaker 3: of how do we deliver better insights, better automation since 112 00:06:41,400 --> 00:06:43,880 Speaker 3: IBM was formed. If you read about what Thomas Watson 113 00:06:43,920 --> 00:06:48,440 Speaker 3: Junior did, that was all about automating tasks. Is that AI? 114 00:06:48,600 --> 00:06:52,719 Speaker 3: Well, probably not, certainly not by today's definition, but it's in 115 00:06:52,760 --> 00:06:53,640 Speaker 3: the same zip code. 116 00:06:53,839 --> 00:06:56,000 Speaker 4: So from your perspective, it feels a lot more like 117 00:06:56,040 --> 00:06:57,480 Speaker 4: an evolution than a revolution. 118 00:06:57,600 --> 00:06:58,400 Speaker 2: Is that a fair statement? 119 00:06:58,520 --> 00:07:02,200 Speaker 3: Yes, yeah, which I think most great things in technology 120 00:07:03,120 --> 00:07:06,640 Speaker 3: tend to happen that way. Yeah, many of the revolutions, 121 00:07:06,680 --> 00:07:08,080 Speaker 3: if you will, tend to fizzle out. 122 00:07:09,120 --> 00:07:11,160 Speaker 4: But even given that, is there, I guess what I'm 123 00:07:11,160 --> 00:07:14,440 Speaker 4: asking is, I'm curious about whether there was a moment 124 00:07:14,680 --> 00:07:18,520 Speaker 4: in that evolution when you had to readjust your expectations 125 00:07:18,560 --> 00:07:21,840 Speaker 4: about what AI was going to be capable of. 126 00:07:21,960 --> 00:07:24,080 Speaker 2: I mean, was there, you know, was 127 00:07:24,040 --> 00:07:28,600 Speaker 4: there a particular innovation or a particular problem that was 128 00:07:28,600 --> 00:07:31,800 Speaker 4: solved that made you think, oh, this is different than 129 00:07:31,800 --> 00:07:32,680 Speaker 4: what I thought. 130 00:07:35,440 --> 00:07:39,120 Speaker 3: I would say the moments that caught our attention, certainly 131 00:07:40,440 --> 00:07:44,480 Speaker 3: Kasparov winning the chess tournament, or Deep Blue 132 00:07:44,480 --> 00:07:47,400 Speaker 3: beating Kasparov, I should say. Nobody really thought that 133 00:07:47,480 --> 00:07:53,080 Speaker 3: was possible before that, and then it was Watson winning Jeopardy. 134 00:07:53,480 --> 00:07:56,160 Speaker 3: These were moments that said, maybe there's more here than 135 00:07:56,160 --> 00:08:00,400 Speaker 3: we even thought was possible. And so I do think 136 00:08:00,400 --> 00:08:06,160 Speaker 3: there are points in time where we realized maybe way 137 00:08:06,200 --> 00:08:09,520 Speaker 3: more could be done than we had even imagined. But 138 00:08:09,960 --> 00:08:13,880 Speaker 3: I do think it's consistent progress every month and every 139 00:08:14,000 --> 00:08:20,120 Speaker 3: year versus some seminal moment. Now, certainly large language models 140 00:08:20,120 --> 00:08:22,560 Speaker 3: recently have caught everybody's attention because it has 141 00:08:22,600 --> 00:08:27,040 Speaker 3: a direct consumer application. But I would almost think of 142 00:08:27,080 --> 00:08:32,440 Speaker 3: that as what Netscape was for the web browser. Yeah, 143 00:08:32,520 --> 00:08:36,280 Speaker 3: it brought the Internet to everybody, but that didn't become 144 00:08:36,760 --> 00:08:38,920 Speaker 3: the Internet per se. Yeah.
145 00:08:39,160 --> 00:08:41,400 Speaker 4: I have a cousin who worked for IBM 146 00:08:41,440 --> 00:08:44,079 Speaker 4: for forty one years. I saw him this weekend. He's 147 00:08:44,080 --> 00:08:47,280 Speaker 4: in Toronto, by the way. I said, you work for 148 00:08:47,440 --> 00:08:49,600 Speaker 4: Rob Thomas? He would like this. He goes, 149 00:08:52,880 --> 00:08:54,360 Speaker 2: He said, I'm five layers down. 150 00:08:55,679 --> 00:08:57,920 Speaker 4: But so whenever I see my cousin, I 151 00:08:57,960 --> 00:08:59,719 Speaker 4: always ask him, can you tell me again what you do? 152 00:08:59,760 --> 00:09:02,680 Speaker 2: Because it's always changing, right? I guess this is a function 153 00:09:02,720 --> 00:09:03,800 Speaker 2: of working at IBM. 154 00:09:04,200 --> 00:09:06,520 Speaker 4: So eventually he just gives up and says, you know, 155 00:09:06,520 --> 00:09:08,960 Speaker 4: we're just solving problems, that's what we're doing, which I 156 00:09:08,960 --> 00:09:11,880 Speaker 4: sort of loved as a kind of frame. And I 157 00:09:12,000 --> 00:09:15,120 Speaker 4: was curious, what's the coolest problem you ever worked on? 158 00:09:15,240 --> 00:09:19,440 Speaker 4: Not biggest, not most important, but the coolest, the one 159 00:09:19,440 --> 00:09:21,880 Speaker 4: that sort of makes you smile when you 160 00:09:21,880 --> 00:09:22,880 Speaker 4: think back on it. 161 00:09:22,960 --> 00:09:27,720 Speaker 3: Probably when I was in microelectronics, because it was a 162 00:09:27,720 --> 00:09:30,800 Speaker 3: world I had no exposure to. I hadn't studied computer science, 163 00:09:31,880 --> 00:09:38,000 Speaker 3: and we were building a lot of high performance semiconductor technology, 164 00:09:38,120 --> 00:09:40,880 Speaker 3: so just chips that do a really great job of 165 00:09:41,400 --> 00:09:45,400 Speaker 3: processing something or other. And we figured out that there 166 00:09:45,480 --> 00:09:49,280 Speaker 3: was a market in consumer gaming that was starting to happen, 167 00:09:50,040 --> 00:09:52,760 Speaker 3: and we got to the point where we became the 168 00:09:52,840 --> 00:10:00,559 Speaker 3: chip inside the Nintendo Wii, the Microsoft Xbox, the Sony PlayStation, 169 00:10:00,720 --> 00:10:03,720 Speaker 3: so we basically had the entire gaming market running on 170 00:10:03,840 --> 00:10:04,720 Speaker 3: IBM chips. 171 00:10:05,400 --> 00:10:09,960 Speaker 4: And so every parent basically is pointing at you 172 00:10:10,000 --> 00:10:11,280 Speaker 4: and saying you're the cause. 173 00:10:12,440 --> 00:10:16,000 Speaker 3: Probably. Well, they would have found it from anybody, but 174 00:10:16,320 --> 00:10:20,480 Speaker 3: it was the first time I could explain my job 175 00:10:20,559 --> 00:10:22,240 Speaker 3: to my kids, who were quite young at that time, 176 00:10:22,400 --> 00:10:25,040 Speaker 3: like what I did. It was more tangible for 177 00:10:25,080 --> 00:10:27,520 Speaker 3: them than saying we solve problems or, you know, 178 00:10:27,559 --> 00:10:31,880 Speaker 3: build solutions. It became very tangible for them, and 179 00:10:32,240 --> 00:10:34,959 Speaker 3: I think that's, you know, a rewarding part of the 180 00:10:35,040 --> 00:10:37,840 Speaker 3: job is when you can help your family actually understand 181 00:10:37,880 --> 00:10:39,400 Speaker 3: what you do. Most people can't do that. It's probably 182 00:10:39,440 --> 00:10:41,200 Speaker 3: easier for you.
They can, they can see the books, 183 00:10:42,640 --> 00:10:45,480 Speaker 3: but for some of us in the 184 00:10:45,480 --> 00:10:47,680 Speaker 3: business world, it's not always as obvious. So that was 185 00:10:47,720 --> 00:10:50,520 Speaker 3: like one example where the dots really connected. 186 00:10:51,480 --> 00:10:54,719 Speaker 4: There were a couple of examples that struck me. I want to put a 187 00:10:54,720 --> 00:10:57,240 Speaker 4: little bit of this into the context of AI, 188 00:10:57,280 --> 00:11:00,600 Speaker 4: because I love the frame of problem solving as 189 00:11:00,600 --> 00:11:03,600 Speaker 4: a way of understanding what the function of the technology is. 190 00:11:03,920 --> 00:11:06,640 Speaker 4: So I know that you guys did some 191 00:11:06,720 --> 00:11:10,640 Speaker 4: work with, I never know how to pronounce it, is 192 00:11:10,640 --> 00:11:14,600 Speaker 4: it Seville, Sevilla, with the football club Sevilla in Spain? 193 00:11:14,880 --> 00:11:17,480 Speaker 4: Tell me a little bit about that. 194 00:11:17,559 --> 00:11:20,360 Speaker 4: What problem were they trying to solve and why did 195 00:11:20,440 --> 00:11:20,960 Speaker 4: they call you in? 196 00:11:21,040 --> 00:11:29,040 Speaker 3: Every sports franchise is trying to get an advantage, right, 197 00:11:29,120 --> 00:11:35,079 Speaker 3: let's just be clear about that. Everybody's asking, how can I use data, analytics, insights, 198 00:11:35,160 --> 00:11:38,280 Speaker 3: anything that will make us one percent better on the 199 00:11:38,320 --> 00:11:44,080 Speaker 3: field at some point in the future. And Sevilla reached 200 00:11:44,080 --> 00:11:46,880 Speaker 3: out to us because they had seen some of that. 201 00:11:46,960 --> 00:11:49,160 Speaker 3: We've done some work with the Toronto Raptors in the 202 00:11:49,200 --> 00:11:53,840 Speaker 3: past and others, and their thought was maybe there's something 203 00:11:53,880 --> 00:11:57,319 Speaker 3: we could do. They'd heard all about generative AI, they'd 204 00:11:57,400 --> 00:12:00,720 Speaker 3: heard about large language models, and the problem, back to 205 00:12:00,720 --> 00:12:04,600 Speaker 3: your point on solving problems, was we want to do 206 00:12:04,640 --> 00:12:09,480 Speaker 3: a way better job of assessing talent, because really the 207 00:12:10,160 --> 00:12:13,120 Speaker 3: lifeblood of a sports franchise is can you continue to 208 00:12:13,160 --> 00:12:17,160 Speaker 3: cultivate talent, can you find talent that others don't find? 209 00:12:17,800 --> 00:12:20,280 Speaker 3: Can you see something in somebody that they don't see 210 00:12:20,280 --> 00:12:23,120 Speaker 3: in themselves, or maybe no other team sees in them? And 211 00:12:23,440 --> 00:12:26,880 Speaker 3: we ended up building something with them called Scout Advisor, 212 00:12:27,760 --> 00:12:32,160 Speaker 3: which is built on Watson X, which basically just ingests 213 00:12:33,320 --> 00:12:36,880 Speaker 3: tons and tons of data, and we like to think 214 00:12:36,920 --> 00:12:39,960 Speaker 3: of it as finding the needle in the haystack of, 215 00:12:40,640 --> 00:12:44,160 Speaker 3: you know, here's three players that aren't being considered. They're 216 00:12:44,160 --> 00:12:48,640 Speaker 3: not on the top teams today. And I think working 217 00:12:48,679 --> 00:12:50,839 Speaker 3: with them together, we found some pretty good insights that have 218 00:12:50,840 --> 00:12:51,319 Speaker 3: helped them out.
219 00:12:51,520 --> 00:12:52,920 Speaker 2: What was intriguing to 220 00:12:52,880 --> 00:12:56,719 Speaker 4: me was, we're not just talking about quantitative data. We're 221 00:12:56,720 --> 00:13:01,120 Speaker 4: also talking about qualitative data. That's the puzzling part of 222 00:13:01,160 --> 00:13:04,319 Speaker 4: this, the thing that fascinates me. How do you incorporate qualitative 223 00:13:04,320 --> 00:13:07,600 Speaker 4: analysis into that? Are you just feeding in 224 00:13:08,160 --> 00:13:09,800 Speaker 4: scouting reports and things like that? 225 00:13:11,640 --> 00:13:13,280 Speaker 3: I've got to think about how much I can 226 00:13:13,320 --> 00:13:19,360 Speaker 3: actually disclose. But if you think about it, quantitative 227 00:13:19,480 --> 00:13:25,280 Speaker 3: is relatively easy. Every team collects that, you know, what's 228 00:13:25,360 --> 00:13:28,080 Speaker 3: their forty yard dash, although I don't think they use that term, certainly 229 00:13:28,080 --> 00:13:33,280 Speaker 3: not in Spain. That's all quantitative. Qualitative is what's happening 230 00:13:33,320 --> 00:13:37,439 Speaker 3: off the field. It could be diet, it could be habits, 231 00:13:37,640 --> 00:13:41,360 Speaker 3: it could be behavior. You can imagine a range of 232 00:13:41,400 --> 00:13:46,160 Speaker 3: things that would all feed into an athlete's performance, and 233 00:13:46,240 --> 00:13:51,240 Speaker 3: so relationships, there's many different aspects. And so it's trying 234 00:13:51,240 --> 00:13:55,400 Speaker 3: to figure out the right blend of quantitative and qualitative 235 00:13:55,679 --> 00:13:57,040 Speaker 3: that gives you a unique insight. 236 00:13:57,679 --> 00:14:01,640 Speaker 4: How transparent is that kind of system? It's 237 00:14:01,679 --> 00:14:04,720 Speaker 4: telling you, pick this guy, not this guy. But is it 238 00:14:04,760 --> 00:14:06,679 Speaker 4: telling you why it prefers this guy to this guy? 239 00:14:06,840 --> 00:14:07,160 Speaker 2: Is it? 240 00:14:08,240 --> 00:14:10,360 Speaker 3: I think for anything in the realm of AI, you 241 00:14:10,440 --> 00:14:13,800 Speaker 3: have to answer the why question. Yeah, otherwise you've fallen 242 00:14:13,800 --> 00:14:18,160 Speaker 3: into the trap of, you know, the proverbial black box, 243 00:14:18,440 --> 00:14:21,200 Speaker 3: and then it's, wait, I made this decision, I never understood 244 00:14:21,200 --> 00:14:23,640 Speaker 3: why, it didn't work out. So you always have to 245 00:14:23,640 --> 00:14:25,080 Speaker 3: answer why, without a doubt. 246 00:14:25,040 --> 00:14:27,480 Speaker 2: And how is why answered? 247 00:14:30,040 --> 00:14:34,080 Speaker 3: Sources of data, the reasoning that went into it, and 248 00:14:34,160 --> 00:14:37,400 Speaker 3: so it's basically just tracing back the chain of how 249 00:14:37,440 --> 00:14:40,320 Speaker 3: you got to the answer. And in the case of 250 00:14:40,520 --> 00:14:42,960 Speaker 3: what we do in Watson X, we have IBM models. 251 00:14:43,440 --> 00:14:46,080 Speaker 3: We also use some other open source models, so it 252 00:14:46,080 --> 00:14:48,920 Speaker 3: would be which model was used, what was the data 253 00:14:48,960 --> 00:14:51,160 Speaker 3: set that was fed into that model, how is it 254 00:14:51,200 --> 00:14:55,680 Speaker 3: making decisions? How is it performing? Is it robust?
Meaning, 255 00:14:55,760 --> 00:14:57,800 Speaker 3: is it reliable in terms of, if you feed it 256 00:14:58,000 --> 00:14:59,520 Speaker 3: the same data set twice, do you get the 257 00:14:59,520 --> 00:15:02,920 Speaker 3: same answer? These are all, you know, the technical 258 00:15:02,960 --> 00:15:05,440 Speaker 3: aspects of understanding the why. 259 00:15:05,520 --> 00:15:09,640 Speaker 4: How quickly do you expect all professional sports franchises to 260 00:15:09,680 --> 00:15:11,800 Speaker 4: adopt some kind of AI, or are they already there? If I 261 00:15:11,840 --> 00:15:15,520 Speaker 4: went out and polled the general managers of the one 262 00:15:15,560 --> 00:15:18,440 Speaker 4: hundred most valuable sports franchises in the world, how many 263 00:15:18,480 --> 00:15:21,080 Speaker 4: of them would be using some kind of AI system 264 00:15:21,120 --> 00:15:22,360 Speaker 4: to assist in their efforts? 265 00:15:24,240 --> 00:15:27,960 Speaker 3: One hundred and twenty percent would, meaning that everybody's doing it, 266 00:15:28,000 --> 00:15:29,880 Speaker 3: and some think they're doing way more than they probably 267 00:15:29,880 --> 00:15:33,480 Speaker 3: actually are. So everybody's doing it. I think what's weird 268 00:15:33,520 --> 00:15:39,080 Speaker 3: about sports is everybody's so convinced that what they're doing 269 00:15:39,160 --> 00:15:43,520 Speaker 3: is unique that they, generally speaking, don't want to work 270 00:15:43,520 --> 00:15:45,680 Speaker 3: with a third party to do it, because they're afraid 271 00:15:46,040 --> 00:15:48,680 Speaker 3: that that would expose them. But in reality, I think 272 00:15:48,720 --> 00:15:51,560 Speaker 3: most are doing eighty to ninety percent of the same things. 273 00:15:53,200 --> 00:15:55,320 Speaker 3: So but without a doubt, everybody's doing it. 274 00:15:55,760 --> 00:15:58,600 Speaker 2: Yeah. Yeah. 275 00:15:58,080 --> 00:16:01,680 Speaker 4: The other one that I loved was, there was one about a 276 00:16:01,720 --> 00:16:05,760 Speaker 4: shipping line, Tricon, on the Mississippi River. Tell me 277 00:16:05,760 --> 00:16:07,600 Speaker 4: a little bit about that project. What problem were they 278 00:16:07,640 --> 00:16:08,200 Speaker 4: trying to solve? 279 00:16:10,280 --> 00:16:14,000 Speaker 3: Think about the problem that I would say everybody noticed 280 00:16:14,040 --> 00:16:17,600 Speaker 3: if you go back to twenty twenty, which was things 281 00:16:17,640 --> 00:16:20,240 Speaker 3: getting held up in ports. There was actually an 282 00:16:20,280 --> 00:16:22,520 Speaker 3: article in the paper this morning kind of tracing the 283 00:16:22,560 --> 00:16:26,320 Speaker 3: history of what happened in twenty twenty and twenty twenty one and why 284 00:16:26,440 --> 00:16:29,320 Speaker 3: ships were basically sitting at sea for months at a time. 285 00:16:30,000 --> 00:16:33,040 Speaker 3: And at that stage we just had a massive 286 00:16:33,040 --> 00:16:38,360 Speaker 3: throughput issue. But moving even beyond the pandemic, you can 287 00:16:38,360 --> 00:16:43,120 Speaker 3: see it now with ships getting through, like, the Panama Canal.
288 00:16:43,200 --> 00:16:46,000 Speaker 3: There's like a narrow window where you can get through, 289 00:16:46,440 --> 00:16:50,120 Speaker 3: and if you don't have your paperwork done, you don't 290 00:16:50,120 --> 00:16:52,200 Speaker 3: have the right approvals, you're not going through, and it 291 00:16:52,240 --> 00:16:53,640 Speaker 3: may cost you a day or two, and that's a 292 00:16:53,680 --> 00:16:57,520 Speaker 3: lot of money in the shipping industry. In the Tricon example, 293 00:16:58,160 --> 00:17:01,760 Speaker 3: it's really just about, when you're pulled into a port, 294 00:17:02,880 --> 00:17:06,160 Speaker 3: if you have the right paperwork done, you can get 295 00:17:06,240 --> 00:17:10,720 Speaker 3: goods off the ship very quickly. They ship a lot 296 00:17:10,760 --> 00:17:14,760 Speaker 3: of food, which by definition, since it's not packaged food, 297 00:17:14,800 --> 00:17:18,719 Speaker 3: it's fresh food, there is an expiration period, and so 298 00:17:19,160 --> 00:17:24,040 Speaker 3: if it takes them an extra two hours, certainly multiple 299 00:17:24,040 --> 00:17:26,760 Speaker 3: hours or a day, they have a massive problem, because 300 00:17:26,760 --> 00:17:28,880 Speaker 3: then you're going to deal with spoilage and so it's 301 00:17:28,880 --> 00:17:31,600 Speaker 3: going to set you back. And what we've worked with 302 00:17:31,600 --> 00:17:35,159 Speaker 3: them on is using an assistant that we've built in 303 00:17:35,240 --> 00:17:40,560 Speaker 3: Watson X called Orchestrate, which basically is just AI doing 304 00:17:40,920 --> 00:17:46,439 Speaker 3: digital labor, so we can replicate nearly any repetitive task 305 00:17:47,560 --> 00:17:51,119 Speaker 3: and do that with software instead of humans. So, as 306 00:17:51,160 --> 00:17:54,600 Speaker 3: you may imagine, the shipping industry still has a lot of 307 00:17:54,640 --> 00:17:57,720 Speaker 3: paperwork that goes on, and so being able to take 308 00:17:57,800 --> 00:18:01,040 Speaker 3: forms that normally would be multiple hours of filling it out, 309 00:18:01,080 --> 00:18:04,160 Speaker 3: oh, this isn't right, send it back, we've basically built 310 00:18:04,160 --> 00:18:08,200 Speaker 3: that as a digital skill inside of Watson X Orchestrate, 311 00:18:08,720 --> 00:18:11,400 Speaker 3: and so now it's done in minutes. 312 00:18:12,440 --> 00:18:15,360 Speaker 4: Did they realize that they could have that kind 313 00:18:15,400 --> 00:18:17,560 Speaker 4: of efficiency by teaming up with you? Or is that 314 00:18:17,600 --> 00:18:21,639 Speaker 4: something you came to them and said, guys, we can 315 00:18:21,680 --> 00:18:22,840 Speaker 4: do this way better than you think. 316 00:18:23,000 --> 00:18:23,440 Speaker 2: What's the. 317 00:18:25,280 --> 00:18:28,800 Speaker 3: I'd say it's always both sides coming together 318 00:18:28,960 --> 00:18:31,880 Speaker 3: at a moment that for some reason makes sense, because 319 00:18:33,080 --> 00:18:34,880 Speaker 3: you could say, why didn't this happen like five years ago, 320 00:18:34,960 --> 00:18:39,240 Speaker 3: like this seems so obvious.
Well, technology wasn't quite ready then, 321 00:18:39,760 --> 00:18:41,880 Speaker 3: I would say. But they knew they had a need, 322 00:18:42,400 --> 00:18:45,959 Speaker 3: because I forget what the precise number is, but, you know, 323 00:18:46,200 --> 00:18:50,080 Speaker 3: reduction of spoilage has massive impact on their bottom line, 324 00:18:52,000 --> 00:18:54,880 Speaker 3: and so they knew they had a need. We thought 325 00:18:54,880 --> 00:18:57,840 Speaker 3: we could solve it, and so the two came together. 326 00:18:58,280 --> 00:19:01,399 Speaker 2: Did you guys go to them? Or did they 327 00:19:01,400 --> 00:19:01,840 Speaker 2: come to you? 328 00:19:02,160 --> 00:19:05,520 Speaker 3: I recall that this one was an inbound, meaning they 329 00:19:05,520 --> 00:19:08,840 Speaker 3: had reached out to IBM, we'd like to solve 330 00:19:08,840 --> 00:19:10,679 Speaker 3: this problem. I think it went into one of our 331 00:19:10,720 --> 00:19:13,760 Speaker 3: digital centers, if I recall, literally a phone call. 332 00:19:13,520 --> 00:19:18,720 Speaker 4: But the reverse is more interesting to 333 00:19:18,760 --> 00:19:20,960 Speaker 4: me, because there seems to be a very very large 334 00:19:21,040 --> 00:19:23,840 Speaker 4: universe of people who have problems that could be solved 335 00:19:24,000 --> 00:19:25,600 Speaker 4: this way and they don't realize it. 336 00:19:26,480 --> 00:19:27,359 Speaker 2: What's your. 337 00:19:28,720 --> 00:19:31,679 Speaker 4: Is there a shining example of this, of someone you 338 00:19:31,800 --> 00:19:34,159 Speaker 4: just think could benefit so much and 339 00:19:34,320 --> 00:19:35,480 Speaker 4: isn't benefiting right now? 340 00:19:38,280 --> 00:19:42,960 Speaker 3: Maybe I'll answer it slightly differently. I'm surprised by 341 00:19:43,240 --> 00:19:46,040 Speaker 3: how many people can benefit that you wouldn't even logically 342 00:19:46,080 --> 00:19:49,600 Speaker 3: think of at first. Let me give you an example. There's 343 00:19:49,840 --> 00:19:56,360 Speaker 3: a franchiser of hair salons. Sport Clips is the name. 344 00:19:57,560 --> 00:19:59,679 Speaker 3: My sons used to go there for haircuts because they 345 00:19:59,680 --> 00:20:02,479 Speaker 3: have it where you can watch sports, so they loved that. 346 00:20:02,480 --> 00:20:05,439 Speaker 3: They got entertained while they would get their haircut. I 347 00:20:05,440 --> 00:20:07,600 Speaker 3: think the last place that you would think is using 348 00:20:07,640 --> 00:20:13,240 Speaker 3: AI today would be a franchiser of hair salons. But 349 00:20:14,080 --> 00:20:17,880 Speaker 3: just follow it through. The biggest part of how they 350 00:20:17,960 --> 00:20:20,280 Speaker 3: run their business is can I get people to cut hair? 351 00:20:21,720 --> 00:20:24,200 Speaker 3: And this is a high turnover industry, because there's a 352 00:20:24,200 --> 00:20:25,720 Speaker 3: lot of different places you can work if you want 353 00:20:25,720 --> 00:20:28,320 Speaker 3: to cut hair. People actually get injured cutting hair because 354 00:20:28,320 --> 00:20:29,920 Speaker 3: you're on your feet all day, that type of thing. 355 00:20:30,600 --> 00:20:35,320 Speaker 3: And they're using the same technology, Orchestrate, as part of their 356 00:20:35,680 --> 00:20:39,320 Speaker 3: recruiting process.
How can they automate a lot of people 357 00:20:39,760 --> 00:20:44,320 Speaker 3: submitting resumes, who they speak to, how they qualify them 358 00:20:44,359 --> 00:20:47,639 Speaker 3: for the position. And so the reason I give that 359 00:20:47,680 --> 00:20:52,520 Speaker 3: example is the opportunity for AI, which is unlike other technologies, 360 00:20:53,240 --> 00:20:58,760 Speaker 3: is truly unlimited. It will touch every single business. It's 361 00:20:58,800 --> 00:21:01,359 Speaker 3: not the realm of the Fortune five hundred or the 362 00:21:01,400 --> 00:21:06,159 Speaker 3: Fortune one thousand. This is the Fortune any size. And 363 00:21:06,240 --> 00:21:08,480 Speaker 3: I think that may be one thing that people underestimate 364 00:21:08,680 --> 00:21:09,520 Speaker 3: about AI. 365 00:21:11,080 --> 00:21:13,560 Speaker 4: What about, I mean, I was thinking about education as 366 00:21:13,600 --> 00:21:19,520 Speaker 4: a kind of, I mean, education is a perennial whipping 367 00:21:19,520 --> 00:21:22,720 Speaker 4: boy, you know, for living in the nineteenth century, right. 368 00:21:23,200 --> 00:21:27,920 Speaker 4: I'm just curious about, if a superintendent of a public 369 00:21:27,920 --> 00:21:31,320 Speaker 4: school system or the president of a university sat down 370 00:21:31,320 --> 00:21:35,359 Speaker 4: and had lunch with you and said, let's do the university first, 371 00:21:35,720 --> 00:21:40,600 Speaker 4: my costs are out of control, my enrollment is down, 372 00:21:41,440 --> 00:21:44,560 Speaker 4: my students hate me, and my board is revolting. 373 00:21:44,720 --> 00:21:45,000 Speaker 2: Help. 374 00:21:46,920 --> 00:21:50,240 Speaker 4: How would you think about helping someone 375 00:21:50,280 --> 00:21:51,000 Speaker 4: in that situation? 376 00:21:52,600 --> 00:21:55,080 Speaker 3: I spend some time with universities. I like to go 377 00:21:55,160 --> 00:21:58,760 Speaker 3: back and visit the alma maters where I went to school, 378 00:21:59,080 --> 00:22:03,040 Speaker 3: and so I do that every year. The challenge I 379 00:22:03,040 --> 00:22:05,080 Speaker 3: have, though, with universities is there has to be 380 00:22:05,119 --> 00:22:08,320 Speaker 3: a will. Yeah, and I'm not sure the incentives are 381 00:22:08,400 --> 00:22:13,600 Speaker 3: quite right today, because bringing in new technology, let's say 382 00:22:13,600 --> 00:22:15,600 Speaker 3: we want to go after, we can help you figure 383 00:22:15,640 --> 00:22:20,560 Speaker 3: out student recruiting or how you automate more of your education, 384 00:22:22,520 --> 00:22:26,120 Speaker 3: everybody suddenly feels threatened at that university. Hold on, that's my job. 385 00:22:26,680 --> 00:22:29,040 Speaker 3: I'm the one that decides that, or I'm the one 386 00:22:29,080 --> 00:22:32,119 Speaker 3: that wants to dictate the course. So there has to 387 00:22:32,119 --> 00:22:36,000 Speaker 3: be a will. So I think it's very possible, and 388 00:22:36,880 --> 00:22:39,320 Speaker 3: I do think over the next decade you will see 389 00:22:39,359 --> 00:22:41,720 Speaker 3: some universities that jump all over this and they will 390 00:22:41,760 --> 00:22:45,400 Speaker 3: move ahead, and you'll see others that do not, because 391 00:22:46,000 --> 00:22:46,920 Speaker 3: it's very possible. 392 00:22:48,520 --> 00:22:51,480 Speaker 4: When you say there has to be 393 00:22:51,520 --> 00:22:54,080 Speaker 4: a will,
is that the kind of 394 00:22:54,200 --> 00:22:56,639 Speaker 4: thing that people at IBM think about? 395 00:22:57,359 --> 00:23:00,639 Speaker 4: Like, in this hypothetical conversation 396 00:23:00,680 --> 00:23:03,200 Speaker 4: you might have with the university president, would you give 397 00:23:03,240 --> 00:23:08,680 Speaker 4: advice on where the will comes from? 398 00:23:08,760 --> 00:23:11,080 Speaker 3: I don't do that as much in a university context. 399 00:23:11,080 --> 00:23:14,880 Speaker 3: I do that every day in a business context, because 400 00:23:15,640 --> 00:23:17,680 Speaker 3: if you can find the right person in a business 401 00:23:17,680 --> 00:23:21,800 Speaker 3: that wants to focus on growth or the bottom line 402 00:23:22,320 --> 00:23:24,920 Speaker 3: or how do you create more productivity, yes, it's going 403 00:23:24,960 --> 00:23:29,040 Speaker 3: to create a lot of organizational resistance potentially, but you 404 00:23:29,080 --> 00:23:31,280 Speaker 3: can find somebody that will figure out how to push 405 00:23:31,280 --> 00:23:36,360 Speaker 3: that through. I think for universities, I think that's also possible. 406 00:23:36,640 --> 00:23:38,800 Speaker 3: I'm not sure there's a return on 407 00:23:38,880 --> 00:23:40,040 Speaker 3: investment for us to do that. 408 00:23:40,359 --> 00:23:45,080 Speaker 4: Yeah, yeah, yeah. Let's define some terms. 409 00:23:47,040 --> 00:23:50,400 Speaker 2: AI years. A term I'm told you like to use. What does 410 00:23:50,400 --> 00:23:50,720 Speaker 2: that mean? 411 00:23:52,480 --> 00:23:55,200 Speaker 3: We just started using this term literally in the last 412 00:23:55,240 --> 00:24:00,040 Speaker 3: three months, and it was what we 413 00:24:00,080 --> 00:24:04,840 Speaker 3: observed internally, which is most technology you build, you say, 414 00:24:04,880 --> 00:24:07,000 Speaker 3: all right, what's going to happen in year one, year two, 415 00:24:07,119 --> 00:24:11,720 Speaker 3: year three, and it's, you know, largely by a calendar. 416 00:24:12,200 --> 00:24:14,439 Speaker 3: AI years are the idea that what used to be 417 00:24:14,480 --> 00:24:18,600 Speaker 3: a year is now like a week, and that is 418 00:24:18,640 --> 00:24:21,480 Speaker 3: how fast the technology is moving. To give you 419 00:24:21,480 --> 00:24:25,200 Speaker 3: an example, we had one client we were working with, they're 420 00:24:25,320 --> 00:24:28,560 Speaker 3: using one of our Granite models, and the results they 421 00:24:28,600 --> 00:24:31,120 Speaker 3: were getting were not very good. Accuracy was not there. 422 00:24:31,200 --> 00:24:34,080 Speaker 3: Their performance was not there. So I was like scratching 423 00:24:34,119 --> 00:24:37,040 Speaker 3: my head. I was like, what is going on? They 424 00:24:37,119 --> 00:24:40,680 Speaker 3: were financial services, a bank. So I'm scratching my head, 425 00:24:40,720 --> 00:24:42,600 Speaker 3: like, what is going on? Everybody else is getting this, 426 00:24:42,720 --> 00:24:46,480 Speaker 3: and these results are horrible. And I said to 427 00:24:46,480 --> 00:24:48,920 Speaker 3: the team, which version of the model are you using? 428 00:24:50,040 --> 00:24:53,600 Speaker 3: This was in February. Like, we're using the one from October.
429 00:24:54,800 --> 00:24:56,760 Speaker 3: I was like, all right, now we know precisely the 430 00:24:56,800 --> 00:25:00,679 Speaker 3: problem, because the model from October is in effect useless 431 00:25:00,680 --> 00:25:02,000 Speaker 3: now, since we're here in February. 432 00:25:02,600 --> 00:25:06,720 Speaker 2: Seriously? Actually useless, completely useless? 433 00:25:06,880 --> 00:25:09,840 Speaker 3: Yeah, that is how fast this is changing. And so 434 00:25:10,320 --> 00:25:14,520 Speaker 3: the minute, same use case, same data, you give them 435 00:25:14,520 --> 00:25:19,080 Speaker 3: the model from late January instead of October, the results 436 00:25:19,119 --> 00:25:19,840 Speaker 3: are off the charts. 437 00:25:20,400 --> 00:25:20,880 Speaker 2: Yeah. 438 00:25:21,080 --> 00:25:24,000 Speaker 4: Wait, so what exactly happened between October and January? 439 00:25:24,200 --> 00:25:25,280 Speaker 3: The model got way better. 440 00:25:25,840 --> 00:25:27,560 Speaker 2: Can we dig into that? Like, what do you mean by 441 00:25:27,600 --> 00:25:28,000 Speaker 2: that? 442 00:25:27,880 --> 00:25:32,560 Speaker 3: We are constantly, we have built large compute infrastructure where 443 00:25:32,560 --> 00:25:36,080 Speaker 3: we're doing model training. And to be clear, model training 444 00:25:36,160 --> 00:25:39,840 Speaker 3: is the realm of, probably, in the world, my guess 445 00:25:39,920 --> 00:25:44,959 Speaker 3: is five to ten companies. And so you build a model, 446 00:25:45,320 --> 00:25:48,640 Speaker 3: you're constantly training it, you're doing fine tuning, you're doing 447 00:25:48,680 --> 00:25:51,399 Speaker 3: more training, you're adding data. Every day, every hour it 448 00:25:51,440 --> 00:25:55,560 Speaker 3: gets better. And so how does it do that? You're 449 00:25:55,560 --> 00:25:59,359 Speaker 3: feeding it more data, you're feeding it more live examples. 450 00:26:00,480 --> 00:26:03,240 Speaker 3: We're using things like synthetic data at this point, which 451 00:26:03,240 --> 00:26:05,879 Speaker 3: is, we're basically creating data to do the training as well. 452 00:26:06,560 --> 00:26:09,439 Speaker 3: All of this feeds into how useful the model is, 453 00:26:10,080 --> 00:26:13,679 Speaker 3: and so using the October model, those were the results 454 00:26:13,680 --> 00:26:16,440 Speaker 3: in October, just a fact, that's how good it was then. 455 00:26:17,160 --> 00:26:21,359 Speaker 3: But back to the concept of AI years, two weeks 456 00:26:21,480 --> 00:26:22,120 Speaker 3: is a long time. 457 00:26:23,240 --> 00:26:26,280 Speaker 4: Are we in a steep part of the 458 00:26:26,320 --> 00:26:29,120 Speaker 4: model learning curve, or do you expect this to continue 459 00:26:29,160 --> 00:26:30,879 Speaker 4: at this pace? 460 00:26:32,480 --> 00:26:36,720 Speaker 3: I think that is the big question, and I don't have 461 00:26:36,760 --> 00:26:39,360 Speaker 3: an answer yet. By definition, at some point you would 462 00:26:39,359 --> 00:26:41,439 Speaker 3: think it would have to slow down a bit, but 463 00:26:41,520 --> 00:26:44,360 Speaker 3: it's not obvious that that is on the horizon. 464 00:26:44,359 --> 00:26:47,919 Speaker 2: Still speeding up? Yes. How fast can it get? 465 00:26:50,400 --> 00:26:53,520 Speaker 3: We've debated, can you actually have better results in the 466 00:26:53,560 --> 00:26:58,320 Speaker 3: afternoon than you did in the morning?
Really, it's nuts. Yeah, 467 00:26:58,320 --> 00:27:01,000 Speaker 3: I know. But that's why we came up with 468 00:27:01,000 --> 00:27:03,080 Speaker 3: this term, because I think you also have to think 469 00:27:03,080 --> 00:27:08,480 Speaker 3: of, like, concepts that get people's attention. 470 00:27:08,760 --> 00:27:11,720 Speaker 4: You're basically turning into a bakery. It's like, the bread 471 00:27:11,720 --> 00:27:14,000 Speaker 4: from yesterday, you know, you can have it for twenty 472 00:27:14,040 --> 00:27:17,520 Speaker 4: five cents. But I mean you could do preferential pricing. You 473 00:27:17,520 --> 00:27:22,640 Speaker 4: could say, we'll charge you X for yesterday's model, two 474 00:27:22,920 --> 00:27:23,920 Speaker 4: X for today's model. 475 00:27:25,520 --> 00:27:29,439 Speaker 3: I think that's dangerous as a merchandising strategy, but I 476 00:27:29,440 --> 00:27:30,040 Speaker 3: get your point. 477 00:27:30,440 --> 00:27:32,520 Speaker 2: Yeah, but that's crazy. 478 00:27:32,680 --> 00:27:34,359 Speaker 4: And this, by the way, is this the 479 00:27:34,400 --> 00:27:37,280 Speaker 4: same for almost, you're talking specifically about a model 480 00:27:37,320 --> 00:27:40,640 Speaker 4: that was created to help some aspect of financial 481 00:27:40,720 --> 00:27:45,520 Speaker 4: services. So is that kind of model accelerating faster and 482 00:27:45,600 --> 00:27:48,560 Speaker 4: running faster than other models for other kinds of problems? 483 00:27:48,920 --> 00:27:54,119 Speaker 3: So this domain was code. Yeah, so by definition, if 484 00:27:54,119 --> 00:27:57,679 Speaker 3: you're feeding in more data, so more code, you 485 00:27:57,680 --> 00:28:01,440 Speaker 3: get those kinds of results, depending on the model type. 486 00:28:02,119 --> 00:28:03,879 Speaker 3: There's a lot of code in the world, and so 487 00:28:04,840 --> 00:28:07,080 Speaker 3: we can find it, we can create it, like I said. 488 00:28:08,359 --> 00:28:12,960 Speaker 3: There's other aspects where there's probably less inputs available, which 489 00:28:13,000 --> 00:28:15,280 Speaker 3: means you probably won't get the same level of iteration. 490 00:28:16,000 --> 00:28:18,320 Speaker 3: But for code, that's certainly the cycle times that we're seeing. 491 00:28:18,359 --> 00:28:20,960 Speaker 4: Yeah, and how do you know that? Let's stick with 492 00:28:21,000 --> 00:28:23,639 Speaker 4: this one example of this model you have. How do 493 00:28:23,680 --> 00:28:25,960 Speaker 4: you know that your model is better than 494 00:28:27,320 --> 00:28:28,800 Speaker 2: big company B down the street? 495 00:28:29,960 --> 00:28:31,840 Speaker 4: A client asks you, why would I go with IBM as 496 00:28:31,840 --> 00:28:35,760 Speaker 4: opposed to some firm in the Valley that says 497 00:28:35,800 --> 00:28:38,040 Speaker 4: they have a model on this? What's your, how 498 00:28:38,040 --> 00:28:40,560 Speaker 4: do you frame your advantage? 499 00:28:41,880 --> 00:28:45,040 Speaker 3: Well, we benchmark all of this, and I think the 500 00:28:45,040 --> 00:28:50,320 Speaker 3: most important metric is price performance. Not price, not performance, 501 00:28:50,320 --> 00:28:54,680 Speaker 3: but the combination of the two. And we're super competitive there.
Well, 502 00:28:55,240 --> 00:28:57,680 Speaker 3: for what we just released, with what we've done in 503 00:28:57,760 --> 00:29:00,280 Speaker 3: open source, we know that nobody's close to us right 504 00:29:00,280 --> 00:29:03,360 Speaker 3: now on code. Now, to be clear, that will probably change, yeah, 505 00:29:03,440 --> 00:29:06,160 Speaker 3: because it's like leap frog. People will jump ahead, then 506 00:29:06,400 --> 00:29:11,600 Speaker 3: we jump back ahead. But we're very confident that with 507 00:29:11,680 --> 00:29:13,720 Speaker 3: everything we've done in the last few months, we've taken 508 00:29:13,800 --> 00:29:14,960 Speaker 3: a huge leap forward here. 509 00:29:15,160 --> 00:29:15,720 Speaker 2: Yeah. 510 00:29:16,840 --> 00:29:18,400 Speaker 4: I mean this goes back to the point I was 511 00:29:18,440 --> 00:29:21,640 Speaker 4: making in the beginning, about the difference between your 512 00:29:22,520 --> 00:29:25,720 Speaker 4: twenty something self in ninety nine and yourself today. 513 00:29:26,000 --> 00:29:27,240 Speaker 2: But this time 514 00:29:27,040 --> 00:29:32,200 Speaker 4: compression has to be a crazy adjustment. So the concept 515 00:29:32,200 --> 00:29:34,760 Speaker 4: of what you're working on and how you make decisions 516 00:29:34,760 --> 00:29:38,720 Speaker 4: internally and things has to undergo this kind of revolution. 517 00:29:38,880 --> 00:29:41,520 Speaker 4: If you're switching from, I mean, back in the day, 518 00:29:41,520 --> 00:29:44,720 Speaker 4: a model might be useful for how 519 00:29:44,600 --> 00:29:46,040 Speaker 2: long? Years? Years? 520 00:29:46,120 --> 00:29:49,800 Speaker 3: I think about, you know, statistical models that sit inside 521 00:29:49,840 --> 00:29:53,760 Speaker 3: things like SPSS, which is a product that a lot 522 00:29:53,800 --> 00:29:55,680 Speaker 3: of students use around the world. I mean, those have 523 00:29:55,720 --> 00:29:58,400 Speaker 3: been the same models for twenty years and they're still 524 00:29:58,480 --> 00:30:01,360 Speaker 3: very good at what they do. And so yes, it's 525 00:30:01,400 --> 00:30:05,960 Speaker 3: a completely different moment for how fast 526 00:30:05,960 --> 00:30:08,880 Speaker 3: this is moving. And I think it just raises the 527 00:30:08,920 --> 00:30:13,040 Speaker 3: bar for everybody, whether you're a technology provider like us, 528 00:30:13,840 --> 00:30:17,120 Speaker 3: or you're a bank or an insurance company or a 529 00:30:17,160 --> 00:30:21,240 Speaker 3: shipping company, to say, how do you really change your 530 00:30:21,240 --> 00:30:26,000 Speaker 3: culture to be way more aggressive than you normally would be? 531 00:30:28,040 --> 00:30:30,600 Speaker 4: Does this mean, it's a weird question, but does this 532 00:30:30,680 --> 00:30:34,680 Speaker 4: mean a different set of personality or character 533 00:30:34,720 --> 00:30:38,160 Speaker 4: traits is necessary for a decision maker in tech now 534 00:30:38,200 --> 00:30:39,760 Speaker 4: than twenty five years ago? 535 00:30:42,960 --> 00:30:45,880 Speaker 3: There's a book I saw recently, it's called 536 00:30:45,920 --> 00:30:49,960 Speaker 3: The Geek Way, which talked about how technology companies have 537 00:30:50,040 --> 00:30:54,840 Speaker 3: started to operate in different ways maybe than many, you know, 538 00:30:54,920 --> 00:31:03,480 Speaker 3: traditional companies, more about being data driven, more about delegation.
539 00:31:04,200 --> 00:31:07,680 Speaker 3: Are you willing to have the smartest person in the 540 00:31:07,760 --> 00:31:10,040 Speaker 3: room make decisions as opposed to the highest paid person in 541 00:31:10,080 --> 00:31:13,160 Speaker 3: the room? I think these are all different aspects that 542 00:31:13,240 --> 00:31:15,280 Speaker 3: every company is going to face. Yeah. 543 00:31:15,840 --> 00:31:19,960 Speaker 4: Yeah. Next term, let's talk about open. When you use that 544 00:31:20,000 --> 00:31:21,000 Speaker 4: word open, what do you mean? 545 00:31:23,520 --> 00:31:26,280 Speaker 3: I think there's really only one definition of open, which, 546 00:31:26,320 --> 00:31:31,360 Speaker 3: for technology, is open source. And open source means 547 00:31:32,000 --> 00:31:37,880 Speaker 3: the code is freely available. Anybody can see it, access it, 548 00:31:38,800 --> 00:31:39,640 Speaker 3: contribute to it. 549 00:31:39,920 --> 00:31:43,400 Speaker 4: And what is, tell me about why that's an important principle. 550 00:31:46,080 --> 00:31:49,200 Speaker 3: When you take a topic like AI, I think it 551 00:31:49,240 --> 00:31:53,040 Speaker 3: would be really bad for the world if this was 552 00:31:53,080 --> 00:31:57,640 Speaker 3: in the hands of one or two companies, or three 553 00:31:57,720 --> 00:32:01,000 Speaker 3: or four, doesn't matter the number, some small number. Think 554 00:32:01,000 --> 00:32:05,600 Speaker 3: about, like, in history, sometime early nineteen hundreds, the Interstate 555 00:32:05,680 --> 00:32:09,400 Speaker 3: Commerce Commission was created, and the whole idea was to 556 00:32:09,440 --> 00:32:14,959 Speaker 3: protect farmers from railroads. Meaning, they wanted to allow free trade, 557 00:32:15,360 --> 00:32:17,760 Speaker 3: but they knew that, well, there's only so many railroad tracks, 558 00:32:17,760 --> 00:32:21,080 Speaker 3: so we need to protect farmers from the shipping costs 559 00:32:21,120 --> 00:32:25,040 Speaker 3: that railroads could impose. So, good idea, but over time 560 00:32:25,360 --> 00:32:29,120 Speaker 3: that got completely overtaken by the railroad lobby, and then 561 00:32:29,120 --> 00:32:33,000 Speaker 3: they used that to basically just increase prices, and it 562 00:32:33,040 --> 00:32:37,120 Speaker 3: made the lives of farmers way more difficult. I think 563 00:32:37,120 --> 00:32:40,560 Speaker 3: you could play the same analogy through with AI. If 564 00:32:40,600 --> 00:32:44,840 Speaker 3: you allow a handful of companies to have the technology, 565 00:32:45,000 --> 00:32:48,080 Speaker 3: you regulate around the principles of those one or two companies, 566 00:32:48,120 --> 00:32:51,000 Speaker 3: then you've trapped the entire world. That would be very bad. 567 00:32:52,360 --> 00:32:56,080 Speaker 3: So the danger of that happening is there for sure. I mean 568 00:32:56,400 --> 00:33:01,080 Speaker 3: there's companies in Washington every week trying to 569 00:33:01,120 --> 00:33:04,360 Speaker 3: achieve that outcome. And so the opposite of that is 570 00:33:04,400 --> 00:33:08,240 Speaker 3: to say it's going to be open source, because 571 00:33:08,280 --> 00:33:11,240 Speaker 3: nobody can dispute open source, because it's right there, everybody 572 00:33:11,280 --> 00:33:15,040 Speaker 3: can see it. And so I'm a strong believer that 573 00:33:15,080 --> 00:33:17,200 Speaker 3: open source will win for AI. It has to win.
574 00:33:17,960 --> 00:33:23,000 Speaker 3: It's not just important for business, but it's important for humans. 575 00:33:23,800 --> 00:33:26,960 Speaker 4: I'm curious about, on the list of things 576 00:33:26,960 --> 00:33:30,760 Speaker 4: you worry about, actually, before I ask, let 577 00:33:30,760 --> 00:33:33,200 Speaker 4: me ask this question very generally. What is the list 578 00:33:33,240 --> 00:33:36,080 Speaker 4: of things you worry about? What's your top five business 579 00:33:36,080 --> 00:33:37,240 Speaker 4: related worries right now? 580 00:33:38,720 --> 00:33:41,040 Speaker 3: Top five? That's a tough first question. We could be 581 00:33:41,080 --> 00:33:42,400 Speaker 3: here for hours for me to answer. 582 00:33:44,080 --> 00:33:45,200 Speaker 2: I did say business related. 583 00:33:45,200 --> 00:33:48,880 Speaker 4: We could leave, you know, your kids' haircuts 584 00:33:49,040 --> 00:33:49,560 Speaker 4: out of it. 585 00:33:49,440 --> 00:33:54,400 Speaker 3: The number one is always, it's the thing that's probably 586 00:33:54,440 --> 00:33:59,520 Speaker 3: always been true, which is just people. Do we have 587 00:33:59,560 --> 00:34:01,760 Speaker 3: the right skills? Are we doing a good job of 588 00:34:01,800 --> 00:34:05,240 Speaker 3: training our people? Are our people doing a good job 589 00:34:05,280 --> 00:34:09,239 Speaker 3: of working with clients? Like, that's number one. Number two 590 00:34:09,360 --> 00:34:15,600 Speaker 3: is innovation. Are we pushing the envelope enough? Are 591 00:34:15,600 --> 00:34:20,240 Speaker 3: we staying ahead? Number three, which kind of feeds 592 00:34:20,280 --> 00:34:23,000 Speaker 3: into the innovation one, is risk taking. Are we taking 593 00:34:23,120 --> 00:34:27,160 Speaker 3: enough risk? Without risk, there is no growth. And I 594 00:34:27,160 --> 00:34:32,040 Speaker 3: think the trap that every larger company inevitably falls into 595 00:34:32,200 --> 00:34:37,719 Speaker 3: is conservatism. Things are good enough, and so it's, are 596 00:34:37,719 --> 00:34:41,160 Speaker 3: we pushing the envelope? Are we taking enough risk to 597 00:34:41,239 --> 00:34:43,360 Speaker 3: really have an impact? I'd say those are probably the 598 00:34:43,400 --> 00:34:44,719 Speaker 3: top three that I spend time on. 599 00:34:45,000 --> 00:34:48,560 Speaker 4: Last term to define: productivity paradox, something I know you've 600 00:34:48,719 --> 00:34:50,279 Speaker 4: thought a lot about. What does that mean? 601 00:34:51,719 --> 00:34:54,359 Speaker 3: So I started thinking hard about this because all I 602 00:34:54,480 --> 00:34:57,680 Speaker 3: saw and read every day was fear about AI. 603 00:35:00,120 --> 00:35:04,759 Speaker 3: And I studied economics, and so I kind of went 604 00:35:04,800 --> 00:35:08,440 Speaker 3: back to like basic economics. And there's been like a 605 00:35:08,480 --> 00:35:12,600 Speaker 3: macro investing formula, I guess I would say, it's been 606 00:35:12,640 --> 00:35:20,400 Speaker 3: around forever that says growth comes from productivity growth plus 607 00:35:20,440 --> 00:35:26,520 Speaker 3: population growth plus debt growth. So if those three things 608 00:35:26,520 --> 00:35:30,440 Speaker 3: are working, you'll get GDP growth.
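A minimal way to write down the macro identity Rob describes above, exactly as he states it in the conversation (a back-of-the-envelope heuristic he attributes to macro investing, not a formal growth model):

% Rough identity as stated in the conversation; each term is an annual growth rate.
\[
\text{GDP growth} \approx \text{productivity growth} + \text{population growth} + \text{debt growth}
\]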
And so then you 609 00:35:30,440 --> 00:35:34,200 Speaker 3: think about that and you say, well, debt growth, we're 610 00:35:34,200 --> 00:35:37,640 Speaker 3: probably not going back to zero percent interest rates, so 611 00:35:37,680 --> 00:35:39,480 Speaker 3: to some extent there's going to be a ceiling on that. 612 00:35:40,719 --> 00:35:44,799 Speaker 3: And then you look at population growth. There are shockingly 613 00:35:44,960 --> 00:35:47,719 Speaker 3: few countries or places in the world that will see 614 00:35:47,719 --> 00:35:50,959 Speaker 3: population growth over the next thirty to fifty years. In fact, 615 00:35:50,960 --> 00:35:55,640 Speaker 3: most places are not even at replacement rates. And so 616 00:35:55,680 --> 00:35:57,359 Speaker 3: I'm like, all right, so population growth is not going 617 00:35:57,400 --> 00:36:00,800 Speaker 3: to be there. So that would mean, if you just 618 00:36:00,840 --> 00:36:05,319 Speaker 3: take it to the extreme, the only chance of continued 619 00:36:05,520 --> 00:36:14,360 Speaker 3: GDP growth is productivity. And the best way to solve 620 00:36:14,360 --> 00:36:17,360 Speaker 3: productivity is AI. That's why I say it's a paradox. 621 00:36:17,480 --> 00:36:22,160 Speaker 3: On one hand, everybody's scared to death it's going to take 622 00:36:22,200 --> 00:36:25,279 Speaker 3: over the world, take all of our jobs, ruin us. 623 00:36:26,800 --> 00:36:28,920 Speaker 3: But in reality, maybe it's the other way, which is 624 00:36:29,000 --> 00:36:32,239 Speaker 3: it's the only thing that can save us. Yeah, and 625 00:36:32,320 --> 00:36:34,600 Speaker 3: if you believe that economic equation, which I think has 626 00:36:34,640 --> 00:36:37,880 Speaker 3: proven quite true over hundreds of years, I do think 627 00:36:37,920 --> 00:36:39,400 Speaker 3: it's probably the only thing that can save us. 628 00:36:40,880 --> 00:36:44,000 Speaker 4: I actually looked at the numbers yesterday, for a totally random reason, 629 00:36:44,320 --> 00:36:47,480 Speaker 4: on population growth in Europe. This is a 630 00:36:47,480 --> 00:36:50,120 Speaker 4: special bonus question. We'll see how smart you are. Which 631 00:36:50,160 --> 00:36:54,480 Speaker 4: country in Europe, continental Europe, has the highest population growth? 632 00:36:56,200 --> 00:37:01,720 Speaker 3: It's small, continental Europe... probably one of the Nordics. 633 00:37:01,719 --> 00:37:05,560 Speaker 2: Yes, close. Luxembourg. 634 00:37:06,040 --> 00:37:10,360 Speaker 4: Okay, something's going on in Luxembourg. I feel like 635 00:37:10,400 --> 00:37:12,839 Speaker 4: all of us need to investigate there. It's at one 636 00:37:12,840 --> 00:37:14,719 Speaker 4: point four nine, which back in the day, by the way, 637 00:37:14,719 --> 00:37:18,800 Speaker 4: would be relatively modest, and that's the best performing country. I 638 00:37:18,840 --> 00:37:21,200 Speaker 4: mean, back in the day, countries routinely had 639 00:37:21,239 --> 00:37:24,879 Speaker 4: two point something, you know, percent growth in a given year. 640 00:37:26,200 --> 00:37:28,520 Speaker 2: Last question: you're writing a book now. We were 641 00:37:28,600 --> 00:37:29,280 Speaker 2: chatting about
642 00:37:29,080 --> 00:37:34,000 Speaker 4: it backstage, and now I appreciate the paradox of this book, 643 00:37:34,239 --> 00:37:36,759 Speaker 4: which is, in a universe where the model is better 644 00:37:36,800 --> 00:37:38,920 Speaker 4: in the afternoon than it is in the morning, how 645 00:37:38,960 --> 00:37:41,120 Speaker 4: do you write a book that's, like, printed on paper 646 00:37:41,640 --> 00:37:43,000 Speaker 4: and expect it to be useful? 647 00:37:46,719 --> 00:37:50,640 Speaker 3: This is the challenge. And I am an incredible author 648 00:37:50,680 --> 00:37:53,759 Speaker 3: of useless books. I mean, most of what I've spent 649 00:37:53,840 --> 00:37:57,160 Speaker 3: time on in the last decade is stuff that's completely useless, 650 00:37:57,200 --> 00:38:00,120 Speaker 3: like, a year after it's written. And so when 651 00:38:01,320 --> 00:38:02,920 Speaker 3: we were talking about it, I said I would like to 652 00:38:03,000 --> 00:38:07,880 Speaker 3: do something around AI that's timeless, that would be useful 653 00:38:08,440 --> 00:38:13,279 Speaker 3: ten or twenty years from now. But then, to your point, 654 00:38:12,800 --> 00:38:17,879 Speaker 3: how is that even remotely possible if the model's better 655 00:38:17,920 --> 00:38:20,440 Speaker 3: in the afternoon than in the morning? So that's the 656 00:38:20,520 --> 00:38:22,560 Speaker 3: challenge in front of us. But the book is around 657 00:38:22,640 --> 00:38:26,759 Speaker 3: AI value creation, so it kind of links to this productivity paradox, 658 00:38:27,200 --> 00:38:33,719 Speaker 3: and how do you actually get sustained value out of AI, 659 00:38:34,200 --> 00:38:38,480 Speaker 3: out of automation, out of data science. And so the 660 00:38:38,520 --> 00:38:40,640 Speaker 3: biggest challenge in front of us is, can we make 661 00:38:40,640 --> 00:38:44,040 Speaker 3: this relevant past the day that it's published? 662 00:38:44,120 --> 00:38:45,360 Speaker 2: How are you setting out to do that? 663 00:38:47,480 --> 00:38:50,480 Speaker 3: I think you have to, to some extent, level it 664 00:38:50,560 --> 00:38:53,200 Speaker 3: up to bigger concepts, which is kind of why I 665 00:38:53,239 --> 00:38:58,840 Speaker 3: go to things like macroeconomics, population, geography, as opposed to 666 00:38:58,920 --> 00:39:02,239 Speaker 3: going into the weeds of the technology itself. If 667 00:39:02,280 --> 00:39:04,920 Speaker 3: you write about, this is how you get better performance 668 00:39:04,920 --> 00:39:08,359 Speaker 3: out of a model, we can agree that will be 669 00:39:08,600 --> 00:39:11,520 Speaker 3: completely useless two years from now, maybe even two months 670 00:39:11,560 --> 00:39:15,480 Speaker 3: from now. And so it will be less in the 671 00:39:15,640 --> 00:39:20,280 Speaker 3: technical detail and more of what is sustained value creation 672 00:39:20,440 --> 00:39:23,920 Speaker 3: for AI over what is hopefully 673 00:39:23,960 --> 00:39:27,120 Speaker 3: a ten or twenty year period. We're kind 674 00:39:27,120 --> 00:39:30,560 Speaker 3: of substituting AI for technology now, I've realized, because I 675 00:39:30,600 --> 00:39:33,239 Speaker 3: think this has always been true for technology. It's just 676 00:39:33,320 --> 00:39:36,480 Speaker 3: now AI is the thing that everybody wants to talk about. 677 00:39:37,640 --> 00:39:39,919 Speaker 3: But let's see if we can do it. Time will tell.
678 00:39:40,760 --> 00:39:43,440 Speaker 4: Did you get any inkling that the pace of this 679 00:39:43,560 --> 00:39:46,960 Speaker 4: AI-era phenomenon, that the 680 00:39:47,000 --> 00:39:49,720 Speaker 4: pace of change, was going to accelerate so much? Because 681 00:39:49,719 --> 00:39:52,440 Speaker 4: you had Moore's law, right? You had a model in 682 00:39:52,480 --> 00:39:56,920 Speaker 4: the technology world for this kind of exponential increase. 683 00:39:57,800 --> 00:40:03,239 Speaker 4: So were you thinking about that, a similar kind of 684 00:40:03,840 --> 00:40:04,920 Speaker 4: acceleration? 685 00:40:07,480 --> 00:40:10,040 Speaker 3: I think anybody who said they expected what we're seeing 686 00:40:10,080 --> 00:40:15,799 Speaker 3: today is probably exaggerating. I think it's way faster than 687 00:40:15,920 --> 00:40:21,239 Speaker 3: anybody expected. Yeah, but technology, back to your point about 688 00:40:21,239 --> 00:40:25,320 Speaker 3: Moore's law, has always accelerated through the years. So I 689 00:40:25,360 --> 00:40:28,640 Speaker 3: wouldn't say it's a shock, but it is surprising. 690 00:40:29,239 --> 00:40:34,759 Speaker 4: Yeah. You've had a kind of extraordinarily privileged position to 691 00:40:35,000 --> 00:40:37,680 Speaker 4: watch and participate in this revolution, right? I mean, how 692 00:40:37,680 --> 00:40:43,279 Speaker 4: many other people have ridden this wave 693 00:40:43,120 --> 00:40:43,480 Speaker 2: like you have? 694 00:40:44,840 --> 00:40:48,000 Speaker 3: I do wonder, is this really that much different, or 695 00:40:48,040 --> 00:40:51,080 Speaker 3: does it feel different just because we're here? I mean, 696 00:40:51,320 --> 00:40:54,040 Speaker 3: I do think on one level, yes. So in the 697 00:40:54,040 --> 00:41:00,880 Speaker 3: time I've been at IBM, the internet happened, mobile happened, social 698 00:41:00,920 --> 00:41:05,720 Speaker 3: networks happened, blockchain happened, AI. So a lot has happened. 699 00:41:06,040 --> 00:41:07,400 Speaker 3: But then you go back and say, well, but if 700 00:41:07,440 --> 00:41:13,080 Speaker 3: I'd been here between nineteen seventy and ninety-five, there 701 00:41:13,080 --> 00:41:15,759 Speaker 3: were a lot of things that were pretty fundamental then too. 702 00:41:15,800 --> 00:41:19,640 Speaker 3: So I wonder, almost, do we always exaggerate the timeframe 703 00:41:19,680 --> 00:41:25,759 Speaker 3: that we're in? I don't know. Yeah, but it's a 704 00:41:25,760 --> 00:41:26,439 Speaker 3: good idea though. 705 00:41:28,360 --> 00:41:31,399 Speaker 4: I think ending with the phrase 'I don't know, 706 00:41:32,120 --> 00:41:33,240 Speaker 4: it's a good idea though.' 707 00:41:34,000 --> 00:41:36,279 Speaker 2: It's probably a great way to wrap this up. 708 00:41:36,680 --> 00:41:42,640 Speaker 4: Thank you so much. Thank you, Malcolm. In a field 709 00:41:42,640 --> 00:41:45,920 Speaker 4: that is evolving as quickly as artificial intelligence, it was 710 00:41:46,040 --> 00:41:49,280 Speaker 4: inspiring to see how adaptable Rob has been over his career. 711 00:41:49,840 --> 00:41:53,560 Speaker 4: The takeaways from my conversation with Rob have been echoing 712 00:41:53,600 --> 00:41:57,439 Speaker 4: in my head ever since. He emphasized how open source 713 00:41:57,520 --> 00:42:02,520 Speaker 4: models allow AI technology to be shaped by many players. Openness 714 00:42:02,560 --> 00:42:06,799 Speaker 4: also allows for transparency.
Rob told me about AI use 715 00:42:06,840 --> 00:42:12,560 Speaker 4: cases like IBM's collaboration with Sevilla Football Club. That example 716 00:42:12,800 --> 00:42:16,319 Speaker 4: really brought home for me how AI technology will touch 717 00:42:16,440 --> 00:42:21,640 Speaker 4: every industry. Despite the potential benefits of AI, challenges exist 718 00:42:21,880 --> 00:42:26,480 Speaker 4: in its widespread adoption. Rob discussed how resistance to change, 719 00:42:26,760 --> 00:42:31,799 Speaker 4: concerns about job security, and organizational inertia can slow down 720 00:42:31,840 --> 00:42:36,720 Speaker 4: the implementation of AI solutions. The paradox, though, according to Rob, 721 00:42:37,040 --> 00:42:40,040 Speaker 4: is that rather than being afraid of a world with AI, 722 00:42:40,280 --> 00:42:44,560 Speaker 4: people should actually be more afraid of a world without it. AI, 723 00:42:44,600 --> 00:42:47,440 Speaker 4: he believes, has the potential to make the world a 724 00:42:47,520 --> 00:42:50,640 Speaker 4: better place in a way that no other technology can. 725 00:42:51,560 --> 00:42:55,040 Speaker 4: Rob painted an optimistic version of the future, one in 726 00:42:55,080 --> 00:42:59,879 Speaker 4: which AI technology will continue to improve at an exponential rate. 727 00:43:00,520 --> 00:43:03,719 Speaker 4: This will free up workers to dedicate their energy to 728 00:43:03,920 --> 00:43:09,720 Speaker 4: more creative tasks. I, for one, am on board. Smart 729 00:43:09,719 --> 00:43:13,440 Speaker 4: Talks with IBM is produced by Matt Romano, Joey Fishgrund, 730 00:43:13,640 --> 00:43:17,680 Speaker 4: and Jacob Goldstein. We're edited by Lydia Jean Kott. Our 731 00:43:17,719 --> 00:43:21,799 Speaker 4: engineers are Sarah Bruguier and Ben Holiday. Theme song by 732 00:43:21,800 --> 00:43:25,720 Speaker 4: Gramoscope. Special thanks to the 8 Bar and IBM teams, 733 00:43:26,040 --> 00:43:29,080 Speaker 4: as well as the Pushkin marketing team. Smart Talks with 734 00:43:29,120 --> 00:43:32,440 Speaker 4: IBM is a production of Pushkin Industries and Ruby 735 00:43:32,520 --> 00:43:37,440 Speaker 4: Studio at iHeartMedia. To find more Pushkin podcasts, listen on 736 00:43:37,560 --> 00:43:43,400 Speaker 4: the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. 737 00:43:44,040 --> 00:43:47,680 Speaker 4: I'm Malcolm Gladwell. This is a paid advertisement from IBM. 738 00:43:48,040 --> 00:43:54,080 Speaker 4: The conversations on this podcast don't necessarily represent IBM's positions, strategies, 739 00:43:54,560 --> 00:44:01,279 Speaker 4: or opinions.