1 00:00:04,519 --> 00:00:12,719 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. This season 2 00:00:12,800 --> 00:00:16,280 Speaker 1: on Smart Talks with IBM, Malcolm Gladwell and team are 3 00:00:16,320 --> 00:00:19,560 Speaker 1: diving into the transformative world of artificial intelligence with a 4 00:00:19,600 --> 00:00:24,200 Speaker 1: fresh perspective on the concept of open. What does open 5 00:00:24,640 --> 00:00:28,000 Speaker 1: really mean in the context of AI? It can mean 6 00:00:28,080 --> 00:00:31,680 Speaker 1: open source code or open data, but it also encompasses 7 00:00:31,800 --> 00:00:36,120 Speaker 1: fostering an ecosystem of ideas, ensuring diverse perspectives are heard, 8 00:00:36,240 --> 00:00:40,000 Speaker 1: and enabling new levels of transparency. Join hosts from your 9 00:00:40,040 --> 00:00:43,560 Speaker 1: favorite Pushkin podcasts as they explore how openness in AI 10 00:00:43,880 --> 00:00:49,120 Speaker 1: is reshaping industries, driving innovation, and redefining what's possible. You'll 11 00:00:49,159 --> 00:00:52,440 Speaker 1: hear from industry experts and leaders about the implications and 12 00:00:52,479 --> 00:00:56,240 Speaker 1: possibilities of open AI, and of course, Malcolm Gladwell will 13 00:00:56,280 --> 00:00:58,280 Speaker 1: be there to guide you through the season with his 14 00:00:58,440 --> 00:01:01,920 Speaker 1: unique insights. Look out for new episodes of Smart Talks 15 00:01:02,000 --> 00:01:05,440 Speaker 1: every other week on the iHeartRadio app, Apple Podcasts, or 16 00:01:05,440 --> 00:01:09,000 Speaker 1: wherever you get your podcasts, and learn more at IBM 17 00:01:09,160 --> 00:01:12,440 Speaker 1: dot com slash Smart Talks. 18 00:01:13,480 --> 00:01:16,880 Speaker 2: Hello, Hello, Welcome to Smart Talks with IBM, a podcast 19 00:01:17,160 --> 00:01:22,800 Speaker 2: from Pushkin Industries, iHeartRadio and IBM.
I'm Malcolm Gladwell. This season, 20 00:01:22,840 --> 00:01:25,800 Speaker 2: we're diving back into the world of artificial intelligence, but 21 00:01:25,880 --> 00:01:31,760 Speaker 2: with a focus on the powerful concept of open: its possibilities, implications, 22 00:01:31,800 --> 00:01:36,479 Speaker 2: and misconceptions. On today's episode, our season finale, I'm joined 23 00:01:36,480 --> 00:01:40,640 Speaker 2: by Rick Lewis, the senior vice president of Infrastructure at IBM. 24 00:01:41,360 --> 00:01:44,840 Speaker 2: Rick has had a remarkable career focused on product innovation. 25 00:01:45,560 --> 00:01:48,520 Speaker 2: He was actually a few years into retirement when IBM 26 00:01:48,560 --> 00:01:54,000 Speaker 2: came calling with an opportunity he just couldn't turn down. Thankfully, 27 00:01:54,200 --> 00:01:57,520 Speaker 2: Rick came out of retirement, and today he oversees a 28 00:01:57,600 --> 00:02:03,080 Speaker 2: vast portfolio from storage and software to global customer support operations, 29 00:02:03,480 --> 00:02:05,800 Speaker 2: and he's engaged in one of the key problems facing 30 00:02:05,840 --> 00:02:10,720 Speaker 2: companies today: an explosion of data. In talking with Rick, 31 00:02:10,800 --> 00:02:13,160 Speaker 2: I can see that this problem of having so much 32 00:02:13,280 --> 00:02:17,360 Speaker 2: data is also an incredible opportunity, because if you're able 33 00:02:17,400 --> 00:02:20,079 Speaker 2: to leverage that data to get the most value out 34 00:02:20,080 --> 00:02:22,480 Speaker 2: of it, then you can use it to help bring 35 00:02:22,520 --> 00:02:26,280 Speaker 2: your business into the future. We talked about the serious 36 00:02:26,320 --> 00:02:29,160 Speaker 2: computing power needed to scale AI, as well as the 37 00:02:29,200 --> 00:02:33,400 Speaker 2: ways that infrastructure and storage solutions can be essential to enabling 38 00:02:33,480 --> 00:02:38,000 Speaker 2: this new world of possibilities.
It's a really great conversation, 39 00:02:38,520 --> 00:02:44,520 Speaker 2: so let's get to it. I'm here with Rick Lewis. 40 00:02:44,880 --> 00:02:45,079 Speaker 3: Rick. 41 00:02:45,320 --> 00:02:49,560 Speaker 2: Welcome. Thanks. Here we are in IBM's New York 42 00:02:49,560 --> 00:02:53,040 Speaker 2: City headquarters at One Madison Avenue. I'm going to start 43 00:02:53,040 --> 00:02:54,959 Speaker 2: with: you're a hardware guy. 44 00:02:55,120 --> 00:02:58,760 Speaker 3: I'm a hardware guy. I grew up doing hardware chip engineering. 45 00:02:59,200 --> 00:03:01,680 Speaker 3: But like I tell a lot of people, a chip 46 00:03:01,680 --> 00:03:05,280 Speaker 3: engineering project is actually a giant software project with a 47 00:03:05,400 --> 00:03:08,040 Speaker 3: piece of hardware at the end of the project. I 48 00:03:08,080 --> 00:03:10,840 Speaker 3: think if you have that analytical brain, you like to 49 00:03:10,919 --> 00:03:14,560 Speaker 3: solve problems, you like to get things working. You can 50 00:03:14,600 --> 00:03:15,880 Speaker 3: do that in software or hardware work. 51 00:03:15,960 --> 00:03:19,320 Speaker 2: But does being someone coming from a hardware background mean 52 00:03:19,360 --> 00:03:21,959 Speaker 2: that you think about problems in a different way? 53 00:03:22,600 --> 00:03:26,239 Speaker 3: I think one thing that you know from a hardware background, 54 00:03:26,240 --> 00:03:29,359 Speaker 3: and especially a chip background, is a chip spin 55 00:03:29,480 --> 00:03:33,360 Speaker 3: costs millions of dollars. So you're a lot more likely 56 00:03:33,480 --> 00:03:36,560 Speaker 3: to make sure everything has a great chance of being 57 00:03:36,600 --> 00:03:39,080 Speaker 3: perfect from the get go. Or if you start kind 58 00:03:39,080 --> 00:03:42,120 Speaker 3: of from a software background, your general mindset is, I 59 00:03:42,120 --> 00:03:43,760 Speaker 3: don't know, try this, see if it works. I don't know.
60 00:03:43,800 --> 00:03:45,840 Speaker 3: Try that, see if it works, and you kind of iterate 61 00:03:45,960 --> 00:03:48,520 Speaker 3: and iterate. Chip people are a little more uptight 62 00:03:48,600 --> 00:03:52,120 Speaker 3: about, okay, if this first round of the chip breaks, 63 00:03:52,360 --> 00:03:55,320 Speaker 3: it costs us another new round of the chip. 64 00:03:55,200 --> 00:03:58,680 Speaker 2: Yeah, so you're a little more... you guys spend more. 65 00:03:58,520 --> 00:04:04,880 Speaker 3: Time planning, and verifying, tons of time verifying. Yeah. 66 00:04:05,320 --> 00:04:09,280 Speaker 2: So you began your career at Hewlett-Packard? Yes, correct. 67 00:04:09,320 --> 00:04:10,440 Speaker 2: And you were there for how many years? 68 00:04:10,520 --> 00:04:11,840 Speaker 3: I was there for thirty-two years. 69 00:04:12,520 --> 00:04:15,680 Speaker 2: Yes. And your last job there was? 70 00:04:15,520 --> 00:04:18,200 Speaker 3: I was leading the software-defined and cloud business. I had grown up a 71 00:04:18,240 --> 00:04:23,720 Speaker 3: hardware guy. I had done all kinds of hardware projects, 72 00:04:23,760 --> 00:04:28,320 Speaker 3: big complicated Unix servers and things like that, and then, 73 00:04:28,640 --> 00:04:30,440 Speaker 3: you know, grew out of R and D and more 74 00:04:30,440 --> 00:04:35,200 Speaker 3: into the business realm, and I'm very much an innovator 75 00:04:35,240 --> 00:04:39,680 Speaker 3: at heart. I really like innovating new concepts, things like that. 76 00:04:39,960 --> 00:04:43,160 Speaker 3: And what I learned is I enjoyed innovating business models 77 00:04:43,200 --> 00:04:46,320 Speaker 3: and software projects as much as I did hardware products 78 00:04:47,120 --> 00:04:50,840 Speaker 3: and projects, and so getting teams inspired towards doing that 79 00:04:51,240 --> 00:04:53,599 Speaker 3: was really a deep fascination for me.
So I ended 80 00:04:53,680 --> 00:04:57,400 Speaker 3: up doing a fantastic variety of experiences and had a 81 00:04:57,440 --> 00:05:01,640 Speaker 3: successful run, and honestly retired, intending to retire and 82 00:05:02,000 --> 00:05:04,360 Speaker 3: do some of my outside activities and things like that. 83 00:05:04,880 --> 00:05:07,280 Speaker 2: And then how long did you stay retired before IBM 84 00:05:07,320 --> 00:05:07,760 Speaker 2: came calling? 85 00:05:08,040 --> 00:05:12,480 Speaker 3: Almost two years. And when I first got the call, 86 00:05:12,720 --> 00:05:15,720 Speaker 3: I thought, no, I'm having too much fun. But I 87 00:05:15,760 --> 00:05:20,360 Speaker 3: would say three things really got me thinking hard about it. 88 00:05:20,440 --> 00:05:24,120 Speaker 3: One, the industry that we're in, the IT industry. I 89 00:05:24,160 --> 00:05:26,200 Speaker 3: think it's the golden age. And what I mean by 90 00:05:26,240 --> 00:05:29,919 Speaker 3: that is, for twenty years of that career, it was 91 00:05:30,000 --> 00:05:32,200 Speaker 3: kind of in the back office, say, make sure that 92 00:05:32,240 --> 00:05:35,520 Speaker 3: stuff doesn't crash, and can you please reduce the cost 93 00:05:35,560 --> 00:05:37,599 Speaker 3: as much as possible, because it's not that important to 94 00:05:37,640 --> 00:05:41,080 Speaker 3: the main business. It's just a back office function. You 95 00:05:41,120 --> 00:05:43,400 Speaker 3: can see it right now. It is at the forefront 96 00:05:43,440 --> 00:05:47,719 Speaker 3: of all business revolution. It happened with the Internet. It 97 00:05:47,800 --> 00:05:51,480 Speaker 3: happened again with cloud and how that changed every ounce 98 00:05:51,520 --> 00:05:54,560 Speaker 3: of business, not just IT business, but all business. And 99 00:05:54,600 --> 00:05:57,719 Speaker 3: I think it's happening again with AI.
So to be 100 00:05:57,839 --> 00:06:00,440 Speaker 3: in that career that long and to miss this kind 101 00:06:00,440 --> 00:06:02,920 Speaker 3: of age where it's like, this is front and center. 102 00:06:03,000 --> 00:06:06,680 Speaker 3: This changes everything about all businesses, not just technology businesses. 103 00:06:07,400 --> 00:06:11,200 Speaker 3: I was kind of feeling like, gosh, you trained to 104 00:06:11,279 --> 00:06:16,320 Speaker 3: be in these really awesome environments, why wouldn't you do 105 00:06:16,400 --> 00:06:18,640 Speaker 3: that for a little while longer while you still can 106 00:06:18,720 --> 00:06:22,359 Speaker 3: do it? That, combined with IBM, and seeing the 107 00:06:22,440 --> 00:06:26,640 Speaker 3: talent pool, the brilliant people at IBM. I worked with 108 00:06:26,680 --> 00:06:29,359 Speaker 3: a ton of brilliant people before; I saw a chance 109 00:06:29,400 --> 00:06:32,000 Speaker 3: to work with even a larger staff of brilliant people. 110 00:06:32,120 --> 00:06:35,320 Speaker 3: And then the assets that IBM had, which is, you know, 111 00:06:35,320 --> 00:06:38,080 Speaker 3: they'd already been doing a lot of experimentation in AI, 112 00:06:38,200 --> 00:06:43,400 Speaker 3: they're working in quantum, the deep, rich heritage of successful projects. 113 00:06:43,400 --> 00:06:45,440 Speaker 3: I thought, who wouldn't want to kind of see if 114 00:06:45,440 --> 00:06:48,560 Speaker 3: they could be part of that next great wave of IBM? 115 00:06:48,680 --> 00:06:51,200 Speaker 3: And so I kind of decided, all right, I'm going 116 00:06:51,240 --> 00:06:53,279 Speaker 3: to put the outside interests on hold for a while 117 00:06:53,279 --> 00:06:54,320 Speaker 3: and get back in the game. 118 00:06:54,440 --> 00:06:57,760 Speaker 2: How long between the first phone call and 119 00:06:57,800 --> 00:06:58,800 Speaker 2: you saying yes?
120 00:06:58,600 --> 00:07:03,520 Speaker 3: It was a while, probably six months. Arvind, our CEO, 121 00:07:03,720 --> 00:07:05,839 Speaker 3: teases me about that a lot. Yeah, he was like, 122 00:07:05,920 --> 00:07:07,800 Speaker 3: I don't think six months is that long. It took 123 00:07:07,800 --> 00:07:11,680 Speaker 3: a while; you were in retirement, I know. Yeah, it's one thing 124 00:07:11,720 --> 00:07:14,360 Speaker 3: to compare, I'm working here and doing this stuff, versus 125 00:07:14,360 --> 00:07:17,720 Speaker 3: working there. It's really hard to compare, I'm doing exactly 126 00:07:17,760 --> 00:07:19,720 Speaker 3: what I want to do every single day when I 127 00:07:19,760 --> 00:07:21,440 Speaker 3: wake up, and now I'm not going to get to 128 00:07:21,440 --> 00:07:24,240 Speaker 3: do that again. It took a while for me to 129 00:07:24,240 --> 00:07:27,040 Speaker 3: get over that, and I thought, I can't miss this wave. 130 00:07:27,080 --> 00:07:31,360 Speaker 3: And I'm really, really happy that I did, because we're 131 00:07:31,400 --> 00:07:34,920 Speaker 3: doing some amazing fun things and I'm getting challenged in 132 00:07:34,960 --> 00:07:36,880 Speaker 3: ways that I never was, so it's really fun. 133 00:07:37,120 --> 00:07:40,160 Speaker 2: Talk a little bit about your job here at IBM. 134 00:07:40,360 --> 00:07:44,480 Speaker 2: You oversee a kind of massive portfolio. 135 00:07:44,640 --> 00:07:47,560 Speaker 3: It's a big group, so I run the Infrastructure organization. 136 00:07:47,600 --> 00:07:50,920 Speaker 3: There's three main groups of products at IBM. There's the 137 00:07:51,000 --> 00:07:54,720 Speaker 3: Infrastructure group, which I run, the Software group, and the 138 00:07:54,760 --> 00:07:59,880 Speaker 3: Consulting group.
And infrastructure is built up of mainframes, which 139 00:07:59,880 --> 00:08:04,160 Speaker 3: is called our Z portfolio, our servers, which is our 140 00:08:04,200 --> 00:08:08,520 Speaker 3: Power portfolio, and storage. By the way, those businesses include the 141 00:08:08,560 --> 00:08:10,880 Speaker 3: supply chain to build all of that stuff, so that's 142 00:08:10,880 --> 00:08:14,680 Speaker 3: in the group. Then I have the worldwide Customer Support Organization. 143 00:08:14,760 --> 00:08:18,560 Speaker 3: It's called TLS, Technology Lifecycle Services, which is a 144 00:08:18,560 --> 00:08:21,960 Speaker 3: network of about thirteen thousand people around the globe that 145 00:08:22,040 --> 00:08:24,120 Speaker 3: make sure that everything runs and works when you buy 146 00:08:24,200 --> 00:08:27,800 Speaker 3: IBM products. And then also our IBM Cloud, which is 147 00:08:28,360 --> 00:08:33,080 Speaker 3: how we host applications and deliver as-a-service products 148 00:08:33,080 --> 00:08:35,839 Speaker 3: for our client base, so there's a lot. I think 149 00:08:35,840 --> 00:08:37,640 Speaker 3: it's about forty-five thousand people total. 150 00:08:38,040 --> 00:08:42,600 Speaker 2: Those components of the infrastructure group, are they aligned 151 00:08:42,600 --> 00:08:46,079 Speaker 2: in their trajectory, or are they on different paths? 152 00:08:46,120 --> 00:08:48,760 Speaker 2: And I'm just curious how you navigate both. 153 00:08:48,800 --> 00:08:51,400 Speaker 3: It's interesting you would ask that, because I think of 154 00:08:51,679 --> 00:08:55,840 Speaker 3: all of the challenges coming to the new company, there 155 00:08:55,840 --> 00:08:58,560 Speaker 3: were things I expected, things that I didn't expect. But 156 00:08:59,640 --> 00:09:04,000 Speaker 3: getting that culture right in that group has been a 157 00:09:04,040 --> 00:09:10,120 Speaker 3: big challenge.
IBM has a great culture toward quality products, 158 00:09:10,200 --> 00:09:14,200 Speaker 3: toward emphasizing passion for the client and making sure that 159 00:09:14,240 --> 00:09:17,840 Speaker 3: the client is happy, and for delivering innovation on a 160 00:09:17,880 --> 00:09:21,239 Speaker 3: scale that, you know, for more than one hundred years 161 00:09:21,000 --> 00:09:26,079 Speaker 3: has been extremely powerful. But with success comes some challenges. 162 00:09:26,440 --> 00:09:28,560 Speaker 3: And with that success you can tend to get a 163 00:09:28,600 --> 00:09:32,160 Speaker 3: little bit insular, like you don't keep an eye on 164 00:09:32,200 --> 00:09:34,920 Speaker 3: the competition as well as you should. You can get more siloed, where, 165 00:09:35,280 --> 00:09:37,319 Speaker 3: you know, this is my business unit, this is my 166 00:09:37,480 --> 00:09:40,680 Speaker 3: business unit, I compete with the other business unit. That's 167 00:09:40,720 --> 00:09:44,280 Speaker 3: not a good thing when you're a company. And 168 00:09:44,320 --> 00:09:48,000 Speaker 3: you can get really risk averse, meaning you feel like, hey, 169 00:09:48,040 --> 00:09:50,080 Speaker 3: this is a successful business, I don't want to do 170 00:09:50,120 --> 00:09:52,200 Speaker 3: anything to mess it up, so I don't need to 171 00:09:52,200 --> 00:09:55,079 Speaker 3: try new things.
Well, that's exactly the recipe to kind 172 00:09:55,080 --> 00:09:57,360 Speaker 3: of be shrinking, and infrastructure had been shrinking for a 173 00:09:57,400 --> 00:10:01,120 Speaker 3: little while, and so a lot of what the challenge 174 00:10:01,200 --> 00:10:04,319 Speaker 3: was for me was to invigorate that risk taking and 175 00:10:05,480 --> 00:10:07,840 Speaker 3: get to a growth mindset where you're trying new things 176 00:10:07,880 --> 00:10:10,840 Speaker 3: and seeing what works and what doesn't work, and changing 177 00:10:10,840 --> 00:10:13,160 Speaker 3: some of the models, like investing a little bit less 178 00:10:13,160 --> 00:10:16,640 Speaker 3: in hardware for some software differentiation that goes into the hardware. 179 00:10:16,720 --> 00:10:20,199 Speaker 3: So it's been very successful so far, and it's been 180 00:10:20,200 --> 00:10:22,080 Speaker 3: a good journey. It's almost four years now. 181 00:10:22,440 --> 00:10:25,400 Speaker 2: Give me an example of what was a really hard 182 00:10:25,440 --> 00:10:27,160 Speaker 2: problem that you've dealt with in those four years. 183 00:10:27,160 --> 00:10:30,079 Speaker 3: So, boy, a really hard problem? 184 00:10:30,160 --> 00:10:33,120 Speaker 2: Or an interesting one. Maybe interesting is a better word 185 00:10:33,160 --> 00:10:33,480 Speaker 2: than hard. 186 00:10:33,760 --> 00:10:35,760 Speaker 3: One of the first things that I kind of chewed 187 00:10:35,800 --> 00:10:38,440 Speaker 3: on a little bit is, I talked about how we 188 00:10:38,480 --> 00:10:41,600 Speaker 3: have Z, Power, and storage. The Z and Power product 189 00:10:41,640 --> 00:10:44,400 Speaker 3: lines are well known in the industry.
It's really 190 00:10:44,440 --> 00:10:47,760 Speaker 3: fit-for-purpose computing with strengths. You know, 191 00:10:47,880 --> 00:10:51,800 Speaker 3: Z runs most of the world's economic backbone, 192 00:10:51,840 --> 00:10:55,440 Speaker 3: and if you use a credit card, ninety percent of 193 00:10:55,480 --> 00:10:58,959 Speaker 3: credit card transactions for the globe go through these Z 194 00:10:59,440 --> 00:11:02,000 Speaker 3: mainframes. They're in every bank. You know, it's a 195 00:11:02,000 --> 00:11:04,800 Speaker 3: big business. It's well known in the industry. Same with Power, 196 00:11:05,240 --> 00:11:10,160 Speaker 3: very tuned and optimized for smaller operations than our giant 197 00:11:10,280 --> 00:11:16,240 Speaker 3: Z mainframes, but really mission critical workloads for retail, for insurance, 198 00:11:16,240 --> 00:11:19,440 Speaker 3: for banking, for all of that. Our storage business, not 199 00:11:19,559 --> 00:11:22,360 Speaker 3: so well known. In fact, when I came I thought, 200 00:11:22,600 --> 00:11:25,000 Speaker 3: did they have storage? Will I have storage when I 201 00:11:25,160 --> 00:11:27,600 Speaker 3: come into IBM? So I got online and I thought, 202 00:11:27,679 --> 00:11:29,480 Speaker 3: it's still hard for me to tell, did they have 203 00:11:29,600 --> 00:11:31,920 Speaker 3: storage or not? Now I own a storage business. So 204 00:11:32,000 --> 00:11:33,560 Speaker 3: one of the things was not just to get the 205 00:11:33,600 --> 00:11:37,200 Speaker 3: market perception up, but to invest in that business. Because 206 00:11:37,240 --> 00:11:39,920 Speaker 3: if you look at infrastructure overall around the globe, it's 207 00:11:39,960 --> 00:11:43,520 Speaker 3: growing at five percent a year.
The infrastructure business had 208 00:11:43,559 --> 00:11:46,240 Speaker 3: been kind of flat to declining, and so a challenge 209 00:11:46,280 --> 00:11:48,240 Speaker 3: was, how do we grab onto the growth? Well, one 210 00:11:48,240 --> 00:11:50,960 Speaker 3: of the biggest growth areas, due to the explosion of 211 00:11:51,040 --> 00:11:53,640 Speaker 3: data in the world, is storage. So what do you 212 00:11:53,679 --> 00:11:55,880 Speaker 3: do to kind of get on that growth rate? So 213 00:11:55,880 --> 00:11:59,520 Speaker 3: we did a lot of reinvigoration of the innovation in 214 00:11:59,559 --> 00:12:02,439 Speaker 3: that, a lot of software value-add, a lot of 215 00:12:02,760 --> 00:12:05,839 Speaker 3: doubling down on the things that are working. Portfolio rationalization, 216 00:12:05,920 --> 00:12:08,160 Speaker 3: where you segment the market and you say, okay, we're 217 00:12:08,280 --> 00:12:10,080 Speaker 3: going to do less of this and really go big 218 00:12:10,120 --> 00:12:13,000 Speaker 3: in these areas. And that's been probably the most dramatic 219 00:12:13,000 --> 00:12:16,360 Speaker 3: turnaround inside the group: our storage business. When you 220 00:12:16,400 --> 00:12:19,120 Speaker 3: say it's a hard problem, it's not just, oh, you know, 221 00:12:19,160 --> 00:12:23,720 Speaker 3: how do we do the math? No, it's cultural. It's strategy, 222 00:12:23,800 --> 00:12:25,960 Speaker 3: and how do you get the strategy set? It's segmentation, 223 00:12:26,080 --> 00:12:29,199 Speaker 3: it's product strategy at a granular level across a bunch 224 00:12:29,240 --> 00:12:32,559 Speaker 3: of dimensions, and then putting the investment behind it. It's 225 00:12:32,600 --> 00:12:34,800 Speaker 3: a big challenge. It takes a long time, but it's working. 226 00:12:34,880 --> 00:12:36,839 Speaker 3: So we're happy with it. Yeah.
227 00:12:36,520 --> 00:12:38,480 Speaker 2: Tell me, give me a little bit of perspective. 228 00:12:39,040 --> 00:12:42,840 Speaker 2: You've been there four years. Imagine we're having this conversation 229 00:12:42,920 --> 00:12:46,760 Speaker 2: four years ago. Yeah, what sorts of things have happened 230 00:12:46,800 --> 00:12:50,360 Speaker 2: over the last four years that have surprised you, that 231 00:12:50,400 --> 00:12:53,360 Speaker 2: you didn't see coming, as if we'd had exactly the same 232 00:12:53,360 --> 00:12:54,520 Speaker 2: conversation four years ago? 233 00:12:55,920 --> 00:12:57,640 Speaker 3: No, because I didn't know what was inside. I'll tell 234 00:12:57,640 --> 00:13:01,480 Speaker 3: you some of the biggest surprises. I thought from the outside, 235 00:13:01,920 --> 00:13:05,119 Speaker 3: and you know, you hear from a lot of customers, 236 00:13:05,280 --> 00:13:08,360 Speaker 3: especially ten years ago, we're all going to cloud, we're 237 00:13:08,400 --> 00:13:10,760 Speaker 3: all... So I thought, well, I wonder if the mainframe 238 00:13:10,840 --> 00:13:13,800 Speaker 3: business is struggling. When I got inside of there, I 239 00:13:13,840 --> 00:13:16,240 Speaker 3: found the opposite to be true. The mainframe business is 240 00:13:16,280 --> 00:13:20,400 Speaker 3: actually flourishing, because transaction demand across the globe has done 241 00:13:20,480 --> 00:13:23,960 Speaker 3: nothing but grow. And even more surprising, the level 242 00:13:23,960 --> 00:13:27,000 Speaker 3: of innovation that the team was already doing in mainframes 243 00:13:27,440 --> 00:13:31,920 Speaker 3: before I got here was astounding. For example, with AI.
244 00:13:32,800 --> 00:13:37,280 Speaker 3: They were building AI technology into the mainframe processors three 245 00:13:37,360 --> 00:13:40,640 Speaker 3: years before ChatGPT made everybody talk about it in 246 00:13:40,679 --> 00:13:44,760 Speaker 3: the industry, so that was really pleasantly surprising. So that 247 00:13:44,920 --> 00:13:49,760 Speaker 3: was wonderful. Other surprises. I knew about the kind of 248 00:13:49,760 --> 00:13:52,880 Speaker 3: the IP of IBM and the mystique in that, and 249 00:13:52,960 --> 00:13:55,440 Speaker 3: I used to joke with people, especially on the outside, 250 00:13:55,440 --> 00:13:56,880 Speaker 3: I said, I can't wait to get in there and 251 00:13:56,880 --> 00:13:59,680 Speaker 3: see what's in the big blue toolbox, right? What are 252 00:13:59,720 --> 00:14:03,040 Speaker 3: all the things they have going on? I way underestimated 253 00:14:03,240 --> 00:14:05,640 Speaker 3: the size of the big blue toolbox and what was 254 00:14:05,679 --> 00:14:10,280 Speaker 3: in there, meaning the amount of really hardcore research that 255 00:14:10,320 --> 00:14:13,839 Speaker 3: we're still doing into how to build chips and how 256 00:14:13,880 --> 00:14:16,960 Speaker 3: to get to things beyond two nanometers and that kind 257 00:14:16,960 --> 00:14:22,960 Speaker 3: of capability; packaging, industry-leading packaging technologies, and that's in 258 00:14:23,000 --> 00:14:25,760 Speaker 3: my hardware kind of patch; quantum, the next thing that 259 00:14:25,840 --> 00:14:29,720 Speaker 3: will come after we're done talking about AI. You know, 260 00:14:30,440 --> 00:14:33,480 Speaker 3: all of those things were surprising. But it wasn't just that. 261 00:14:33,560 --> 00:14:36,120 Speaker 3: It was then the software innovations that are going on, 262 00:14:36,280 --> 00:14:41,080 Speaker 3: heavy investment in AI technologies before it was really popular 263 00:14:41,120 --> 00:14:43,760 Speaker 3: to be talking about that.
But as I saw that, 264 00:14:43,840 --> 00:14:46,480 Speaker 3: I thought, this is going to be really fun, because 265 00:14:46,520 --> 00:14:49,000 Speaker 3: I had a good feel for where the industry was going. 266 00:14:49,840 --> 00:14:52,360 Speaker 3: I just didn't... And I knew, man, I knew that 267 00:14:52,480 --> 00:14:55,120 Speaker 3: talent was really good and there's brilliant people there, but 268 00:14:55,160 --> 00:14:59,080 Speaker 3: I didn't know the level of IP, frankly, that IBM 269 00:14:59,200 --> 00:15:02,080 Speaker 3: had at its disposal. And now you're seeing that in 270 00:15:02,160 --> 00:15:05,359 Speaker 3: things like watsonx and things like AI in mainframes, 271 00:15:05,400 --> 00:15:05,800 Speaker 3: et cetera. 272 00:15:06,200 --> 00:15:09,160 Speaker 2: Building on that, since you brought up AI, can you 273 00:15:09,200 --> 00:15:13,080 Speaker 2: walk me through what has to happen from your perspective, 274 00:15:13,120 --> 00:15:19,200 Speaker 2: from the infrastructure perspective, to make the AI explosion work? Yeah, 275 00:15:19,240 --> 00:15:21,360 Speaker 2: so everyone wants to do more of this stuff. Yes, 276 00:15:21,440 --> 00:15:23,920 Speaker 2: clearly there has to be some underpinning of it. 277 00:15:24,320 --> 00:15:28,440 Speaker 3: Yeah, I would tell you, you know, I think that people 278 00:15:28,600 --> 00:15:30,680 Speaker 3: feel like where we're at right now in the AI 279 00:15:30,760 --> 00:15:33,480 Speaker 3: journey had to do with one specific piece of software. 280 00:15:33,840 --> 00:15:37,960 Speaker 3: I think the inflection point for that whole thing really, 281 00:15:38,880 --> 00:15:42,600 Speaker 3: at its root, was around hardware, meaning the algorithms needed 282 00:15:42,640 --> 00:15:45,240 Speaker 3: to do large language models.
And all of that had 283 00:15:45,280 --> 00:15:47,920 Speaker 3: been around, they'd been talked about in the industry, but 284 00:15:47,960 --> 00:15:51,000 Speaker 3: at some point you hit a tipping point of hardware 285 00:15:51,040 --> 00:15:53,680 Speaker 3: capability where it's like, oh, now we can do this 286 00:15:53,800 --> 00:15:57,200 Speaker 3: in a brute force way, massive amounts of matrix math 287 00:15:57,280 --> 00:16:00,200 Speaker 3: to get weights correct so that you can do, you know, 288 00:16:00,280 --> 00:16:03,600 Speaker 3: the right level of predictions that enable large language models. 289 00:16:03,960 --> 00:16:06,520 Speaker 3: And once we got to that horsepower... And that's why 290 00:16:06,560 --> 00:16:09,320 Speaker 3: you hear about giant GPUs that are driving this and 291 00:16:09,680 --> 00:16:11,920 Speaker 3: the sales of those, et cetera. It's because we just 292 00:16:12,000 --> 00:16:13,920 Speaker 3: barely got over the hump where you can do these 293 00:16:13,960 --> 00:16:19,240 Speaker 3: big hard things in terms of hardware capability to do it. 294 00:16:19,360 --> 00:16:22,200 Speaker 2: Give me a layman's sense of, when 295 00:16:22,240 --> 00:16:24,400 Speaker 2: you say there was a kind of threshold where suddenly 296 00:16:24,400 --> 00:16:25,560 Speaker 2: these things became possible. 297 00:16:25,680 --> 00:16:28,480 Speaker 3: Yeah, I don't know if there's an exact number. But 298 00:16:29,640 --> 00:16:31,720 Speaker 3: a more basic question that I get from a lot 299 00:16:31,760 --> 00:16:34,440 Speaker 3: of people, you know, my friends and family outside, is 300 00:16:34,760 --> 00:16:39,480 Speaker 3: why GPUs? What does a GPU, a graphics processor, have 301 00:16:39,600 --> 00:16:44,000 Speaker 3: to do with AI?
Well, graphics processors are 302 00:16:44,040 --> 00:16:47,760 Speaker 3: really good at this one thing, matrix math, because they're figuring 303 00:16:47,840 --> 00:16:51,200 Speaker 3: out, how do I map a pixel? And as I 304 00:16:51,320 --> 00:16:55,480 Speaker 3: move an object across the screen, it's essentially matrix math 305 00:16:55,560 --> 00:16:58,240 Speaker 3: to figure out, okay, what does this pixel 306 00:16:58,320 --> 00:17:01,040 Speaker 3: on a screen look like, and what's it doing? And 307 00:17:01,080 --> 00:17:04,560 Speaker 3: as you know, we've gotten more high resolution graphics, more 308 00:17:04,640 --> 00:17:07,680 Speaker 3: high resolution monitors, et cetera. It's a lot more pixels 309 00:17:07,720 --> 00:17:09,640 Speaker 3: and a lot more math, and a lot more matrix 310 00:17:09,680 --> 00:17:12,480 Speaker 3: math about how you compute that. The first big thing 311 00:17:12,560 --> 00:17:15,200 Speaker 3: that kind of started to look like that, it turns out, 312 00:17:15,400 --> 00:17:18,880 Speaker 3: was crypto and crypto mining, and so you saw graphics 313 00:17:18,920 --> 00:17:22,119 Speaker 3: companies starting to sell to crypto. The technology got to 314 00:17:22,160 --> 00:17:24,480 Speaker 3: a certain point and there were use cases like bitcoin, 315 00:17:24,560 --> 00:17:26,679 Speaker 3: and that kind of said, hey, we need to 316 00:17:26,720 --> 00:17:29,000 Speaker 3: do a lot of this matrix math to be able 317 00:17:29,000 --> 00:17:32,040 Speaker 3: to do that.
So graphic chips were a natural fit, 318 00:17:32,119 --> 00:17:34,800 Speaker 3: and that kind of sustained it. But meanwhile, behind the scenes, 319 00:17:34,800 --> 00:17:39,240 Speaker 3: a lot of this AI... AI is about numeric calculations 320 00:17:39,320 --> 00:17:43,000 Speaker 3: having to do with weights and matrices, that, say, you know, 321 00:17:43,440 --> 00:17:46,879 Speaker 3: giant consolidated things that predict what's going to kind of 322 00:17:46,920 --> 00:17:49,320 Speaker 3: happen based on what other things have happened, just like 323 00:17:49,400 --> 00:17:53,920 Speaker 3: predicting where a pixel goes. But it's really about being able 324 00:17:54,000 --> 00:17:56,800 Speaker 3: to do enough data ingest to be able to 325 00:17:56,840 --> 00:17:59,919 Speaker 3: do, and then the calculations to be able to simplify 326 00:18:00,080 --> 00:18:04,240 Speaker 3: things like entire sets of language or giant chunks 327 00:18:04,280 --> 00:18:06,639 Speaker 3: of the Internet, to get enough weightings in there to 328 00:18:06,640 --> 00:18:09,320 Speaker 3: be able to say, okay, we can predict what you 329 00:18:09,320 --> 00:18:12,720 Speaker 3: would say in this language based on all of the 330 00:18:12,800 --> 00:18:15,720 Speaker 3: volumes of stuff that we've seen, that when you start 331 00:18:15,720 --> 00:18:18,960 Speaker 3: talking like this, the next word is likely, oh, it's this. Yeah. 332 00:18:19,000 --> 00:18:21,240 Speaker 2: So, but my point is, to get to that point, 333 00:18:21,359 --> 00:18:25,159 Speaker 2: that threshold, we got there because... was it because we 334 00:18:25,240 --> 00:18:28,080 Speaker 2: simply threw a lot more resources at the problem, or 335 00:18:28,119 --> 00:18:32,399 Speaker 2: is it because the underlying technology got, suddenly or gradually, 336 00:18:32,560 --> 00:18:33,720 Speaker 2: so much more efficient? 337 00:18:33,840 --> 00:18:37,080 Speaker 3: It's always yes and yes.
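An aside for readers following along in text: the predict-the-next-word idea Rick sketches above can be illustrated with a toy bigram counter in Python. This is a loose sketch, not how any production model works; real large language models learn dense matrix weights from huge corpora, and every name in the code below is made up for illustration.

```python
from collections import Counter, defaultdict

# Toy illustration of "predict what you would say next based on the
# volumes of stuff we've seen". A bigram counter just tallies which
# word follows which; real LLMs learn weights via matrix math instead.
corpus = "the cat sat on the mat and the cat slept on the mat".split()

following = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    following[word][nxt] += 1

def predict_next(word):
    """Return the word most often observed after `word`, or None."""
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None
```

On this tiny corpus, `predict_next("on")` returns `"the"`, because "the" is the only word ever observed after "on"; scaling the same predict-the-next-token idea to giant chunks of the Internet is what demands the matrix-math horsepower discussed here.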
But you know, the industry 338 00:18:37,160 --> 00:18:39,320 Speaker 3: for a lot of years would talk about Moore's law. 339 00:18:39,800 --> 00:18:43,560 Speaker 2: Well, Rick, will you define for us Moore's law for 340 00:18:43,600 --> 00:18:45,159 Speaker 2: those of us who've forgotten it. 341 00:18:45,400 --> 00:18:48,560 Speaker 3: Yeah, So Gordon Moore at Intel coined this thing. It 342 00:18:48,600 --> 00:18:53,320 Speaker 3: was basically that the horsepower, I'm going to translate it 343 00:18:53,400 --> 00:18:59,040 Speaker 3: roughly, of technology will double every couple of years. We're 344 00:18:59,080 --> 00:19:01,760 Speaker 3: still on Moore's law. Moore's law changed a little bit. 345 00:19:02,160 --> 00:19:05,000 Speaker 3: For a while, it was always about frequency. Things would 346 00:19:05,000 --> 00:19:08,159 Speaker 3: go faster, faster, faster. That kind of petered out. But 347 00:19:08,240 --> 00:19:11,440 Speaker 3: what happened is, rather than faster, faster, faster, we did 348 00:19:11,480 --> 00:19:14,760 Speaker 3: more and more and more. So rather than one operating 349 00:19:14,880 --> 00:19:18,720 Speaker 3: unit going a lot faster on its throughput, you put 350 00:19:18,760 --> 00:19:21,199 Speaker 3: ten operating units on a chip, now you put one 351 00:19:21,280 --> 00:19:24,320 Speaker 3: hundred operating units on a chip, now a thousand. Some 352 00:19:24,400 --> 00:19:29,040 Speaker 3: of these problems, the matrix math problems, scaled in parallel extremely well. 353 00:19:29,040 --> 00:19:31,320 Speaker 3: You don't have to do something really fast, you just 354 00:19:31,440 --> 00:19:33,440 Speaker 3: have to do a lot of the similar things in 355 00:19:33,480 --> 00:19:36,439 Speaker 3: parallel at the same time.
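The shift he describes, from one unit going faster to many units working at once, can be sketched in plain Python. This is only a toy, using a thread pool where real chips use hardware units, and all the names and numbers are invented, but the shape of the idea is the same: split uniform work into chunks and do the chunks simultaneously.

```python
from concurrent.futures import ThreadPoolExecutor

def scale_chunk(chunk):
    # The same simple operation applied uniformly to every element:
    # exactly the kind of work that spreads across many operating units.
    return [x * 2 for x in chunk]

data = list(range(1000))
chunks = [data[i:i + 250] for i in range(0, len(data), 250)]

# Four workers each take a chunk at the same time. The combined result
# is identical to processing the whole list on one unit, just spread out.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = [x for chunk in pool.map(scale_chunk, chunks) for x in chunk]
```

Because each chunk is independent, adding workers (or hardware units) raises throughput without any single unit getting faster, which is the "more and more, rather than faster and faster" point.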
So again, that kind of 356 00:19:36,440 --> 00:19:39,159 Speaker 3: extension of Moore's law, more and more hardware on 357 00:19:39,240 --> 00:19:40,640 Speaker 3: a chip to be able to do more and more 358 00:19:40,640 --> 00:19:43,879 Speaker 3: of those calculations in parallel and come up with it. 359 00:19:44,000 --> 00:19:47,120 Speaker 2: And as we said, yeah, was that threshold predictable? In other words, 360 00:19:47,119 --> 00:19:50,000 Speaker 2: did people in the industry, like you, sit down X 361 00:19:50,040 --> 00:19:52,080 Speaker 2: number of years ago and say, when we get here, 362 00:19:53,080 --> 00:19:54,840 Speaker 2: AI is going to become much more of a... 363 00:19:55,480 --> 00:20:02,080 Speaker 3: It's funny. The horsepower part is pretty predictable. The use cases, 364 00:20:02,560 --> 00:20:05,160 Speaker 3: not always so easy to kind of figure out. That's 365 00:20:05,200 --> 00:20:08,840 Speaker 3: where the human spirit kind of gets involved. I think 366 00:20:08,880 --> 00:20:10,800 Speaker 3: some people will say, oh, I saw that coming, 367 00:20:11,080 --> 00:20:14,199 Speaker 3: but people have been predicting kind of the rise of 368 00:20:14,280 --> 00:20:17,199 Speaker 3: AI for twenty five years. Oh well, when we 369 00:20:17,280 --> 00:20:19,119 Speaker 3: get to this next generation, oh, when we get here, 370 00:20:19,320 --> 00:20:22,280 Speaker 3: and it kind of hadn't happened. There's always a magic point 371 00:20:23,200 --> 00:20:25,520 Speaker 3: where you kind of get to where the technology and 372 00:20:25,600 --> 00:20:28,000 Speaker 3: the use case and somebody does something to kind of 373 00:20:28,040 --> 00:20:30,760 Speaker 3: make it catch on. And I think we're at one 374 00:20:30,760 --> 00:20:32,679 Speaker 3: of those moments in AI for sure right now.
And 375 00:20:32,720 --> 00:20:34,919 Speaker 3: I don't think it's, you know, people that have said, oh, 376 00:20:34,920 --> 00:20:38,760 Speaker 3: this is just the latest wave of, you know, I've 377 00:20:38,800 --> 00:20:41,280 Speaker 3: heard this about a lot of technologies: AI is 378 00:20:41,320 --> 00:20:43,960 Speaker 3: the technology of the future, and it always will be. I 379 00:20:44,080 --> 00:20:46,760 Speaker 3: used to hear that. You're not hearing that now, right? 380 00:20:46,800 --> 00:20:50,480 Speaker 3: It's like, no, it's primetime. It will change everything, just 381 00:20:50,640 --> 00:20:52,680 Speaker 3: like some of these other things changed everything. 382 00:20:52,920 --> 00:20:57,359 Speaker 2: I noticed it personally when I speak somewhere or 383 00:20:57,400 --> 00:21:01,000 Speaker 2: I'm listening in an audience somewhere. Over the last, let's 384 00:21:01,000 --> 00:21:05,520 Speaker 2: say, twelve months, there's always a whole bunch of AI 385 00:21:05,640 --> 00:21:08,520 Speaker 2: questions, yes. And if I go back two years ago, 386 00:21:08,560 --> 00:21:09,760 Speaker 2: there were no AI questions. 387 00:21:09,840 --> 00:21:10,040 Speaker 3: Yes. 388 00:21:10,640 --> 00:21:12,960 Speaker 2: Now my question is, so there's been this explosion in 389 00:21:13,000 --> 00:21:17,399 Speaker 2: popular fascination with what's going on in AI. It 390 00:21:17,480 --> 00:21:20,480 Speaker 2: seems like the last year. I agree with you. In 391 00:21:20,560 --> 00:21:26,840 Speaker 2: your world, when did the explosion of conversation around this start? 392 00:21:27,560 --> 00:21:36,880 Speaker 3: I love this question, because IBM had a fairly 393 00:21:37,040 --> 00:21:42,119 Speaker 3: big effort and business called Watson before Watson X. And 394 00:21:42,160 --> 00:21:45,080 Speaker 3: this is going back kind of ten years. I'll give 395 00:21:45,080 --> 00:21:47,920 Speaker 3: you another kind of example.
I knew about a lot 396 00:21:47,920 --> 00:21:51,359 Speaker 3: of tablet technology before there was an iPad, a lot. 397 00:21:51,600 --> 00:21:53,960 Speaker 3: For ten years, there were a lot, but it kind 398 00:21:53,960 --> 00:21:57,640 Speaker 3: of takes a magic combination of the technology, the user experience, 399 00:21:57,680 --> 00:22:00,520 Speaker 3: the software, and the need and the market for it 400 00:22:00,560 --> 00:22:02,719 Speaker 3: to kind of go. Now it's the thing. Now we 401 00:22:02,760 --> 00:22:05,359 Speaker 3: all have either an iPad or we have the Google 402 00:22:05,400 --> 00:22:08,320 Speaker 3: equivalent, too. And so I think this is a little 403 00:22:08,400 --> 00:22:11,720 Speaker 3: like that. Meaning IBM was on the right track with Watson. 404 00:22:12,240 --> 00:22:14,639 Speaker 3: Some of the hardware wasn't there, the use cases weren't 405 00:22:14,680 --> 00:22:17,280 Speaker 3: exactly figured out, some of the early use cases didn't 406 00:22:17,320 --> 00:22:20,399 Speaker 3: pan out perfectly. But the good news about that is 407 00:22:20,840 --> 00:22:23,680 Speaker 3: it's back to that culture of risk taking. You don't 408 00:22:23,720 --> 00:22:26,280 Speaker 3: look back on that and say, oh, we shouldn't have 409 00:22:26,320 --> 00:22:27,639 Speaker 3: done that, that was a bad idea. No, you 410 00:22:27,640 --> 00:22:29,360 Speaker 3: look back on that and say, what did we learn? 411 00:22:29,680 --> 00:22:31,879 Speaker 3: How should we try something new? How would we pivot 412 00:22:31,920 --> 00:22:34,159 Speaker 3: this time? That's what we've done with Watson X, and 413 00:22:35,359 --> 00:22:38,119 Speaker 3: now that's a growing, healthy piece of our business and 414 00:22:38,280 --> 00:22:40,760 Speaker 3: very important to our strategic picture. So we're all in.
415 00:22:41,280 --> 00:22:47,800 Speaker 2: I've always been fascinated by the gap between an insider's sense of 416 00:22:47,840 --> 00:22:50,040 Speaker 2: what is happening and an outsider's sense, like. 417 00:22:50,119 --> 00:22:53,080 Speaker 3: It absolutely is that. In this case, we've all been 418 00:22:53,160 --> 00:22:57,160 Speaker 3: talking about and thinking about AI, and is it time 419 00:22:57,200 --> 00:23:00,080 Speaker 3: for that, and what does this mean, et cetera. And 420 00:23:00,119 --> 00:23:02,760 Speaker 3: yet none of us really predicted that actual moment, which 421 00:23:02,840 --> 00:23:06,480 Speaker 3: is kind of, you know, early twenty twenty two, where 422 00:23:06,520 --> 00:23:10,480 Speaker 3: it was like, oh, now you have a simple human 423 00:23:10,560 --> 00:23:16,320 Speaker 3: interface of software innovation combined with large language models. There's 424 00:23:16,359 --> 00:23:19,560 Speaker 3: a moment there where you're like, oh. You know, 425 00:23:19,560 --> 00:23:21,240 Speaker 3: I think all of us are frustrated if we ask 426 00:23:21,280 --> 00:23:23,680 Speaker 3: our phone, hey, tell me about this, and it says, 427 00:23:24,240 --> 00:23:26,199 Speaker 3: I found this on the web page. That does you 428 00:23:26,320 --> 00:23:28,000 Speaker 3: no good. But you know, all of a sudden, with 429 00:23:29,400 --> 00:23:31,800 Speaker 3: ChatGPT and some of these other things, you could 430 00:23:31,800 --> 00:23:33,879 Speaker 3: ask a question, it would give you a clear answer. 431 00:23:33,920 --> 00:23:36,560 Speaker 3: Sometimes it's wrong, but at least it was like, I'm 432 00:23:36,600 --> 00:23:38,439 Speaker 3: getting an answer, rather than, hey, I don't know, 433 00:23:38,440 --> 00:23:42,120 Speaker 3: here are some references. Good luck to you. And that's really changing.
434 00:23:42,680 --> 00:23:47,159 Speaker 2: Talk about the kind of macro trends that are going 435 00:23:47,240 --> 00:23:52,159 Speaker 2: to shape your infrastructure battle. 436 00:23:52,160 --> 00:23:54,119 Speaker 3: Yeah, we've talked about it already, but I'm actually going to go a little 437 00:23:54,160 --> 00:23:58,560 Speaker 3: different direction. So, macro trends first. And this one has 438 00:23:58,600 --> 00:24:02,520 Speaker 3: been there before even this AI conversation: we've had an 439 00:24:03,000 --> 00:24:10,080 Speaker 3: explosion of data. As humans, we don't think exponentially very well. 440 00:24:10,119 --> 00:24:14,040 Speaker 3: We really struggle with exponential thinking. We think linearly: Oh, 441 00:24:14,040 --> 00:24:16,080 Speaker 3: there'll be more, there'll be more, there'll be more. But 442 00:24:16,160 --> 00:24:18,200 Speaker 3: we don't think well when it's like, no, there'll be more, 443 00:24:18,240 --> 00:24:19,840 Speaker 3: and there'll be ten times more, and then there'll be 444 00:24:19,880 --> 00:24:22,440 Speaker 3: ten times that more. That's what's going on with data 445 00:24:22,680 --> 00:24:25,320 Speaker 3: right now in our industry. It's one of the reasons 446 00:24:25,320 --> 00:24:27,680 Speaker 3: that the storage business is doing so well: there's 447 00:24:27,760 --> 00:24:31,560 Speaker 3: just more and more and more data. You know, you'd say, well, 448 00:24:31,600 --> 00:24:33,600 Speaker 3: how can there be more data? It's just life and 449 00:24:33,600 --> 00:24:36,880 Speaker 3: the things that we care about: video, captured 450 00:24:36,960 --> 00:24:41,120 Speaker 3: video images. You know, I don't know, for 451 00:24:41,160 --> 00:24:44,240 Speaker 3: my parents, you needed a drawer with all your family photos. 452 00:24:44,280 --> 00:24:45,920 Speaker 3: Now we need gigabytes and gigabytes.
453 00:24:46,000 --> 00:24:48,320 Speaker 2: If you knew how many pictures my wife has taken of 454 00:24:48,320 --> 00:24:52,880 Speaker 2: our children, you would... Speaker 3: Exactly, exactly. So that's your case. Now 455 00:24:52,960 --> 00:24:55,600 Speaker 3: think of companies who used to just think about their 456 00:24:55,640 --> 00:24:59,399 Speaker 3: transaction data, what does the ledger say, that now have video 457 00:24:59,640 --> 00:25:02,800 Speaker 3: assets of all of their campaigns and their marketing. They're 458 00:25:02,800 --> 00:25:05,520 Speaker 3: trying to figure out, you know, what campaigns are working 459 00:25:05,560 --> 00:25:08,280 Speaker 3: the best. So it's just an explosion of data and 460 00:25:08,320 --> 00:25:12,119 Speaker 3: that's not going to stop. Dealing with that, and more importantly, 461 00:25:12,200 --> 00:25:17,679 Speaker 3: getting value from that data, is a massive trend in 462 00:25:17,720 --> 00:25:21,879 Speaker 3: the industry. Second trend: AI, and this is the AI 463 00:25:22,080 --> 00:25:24,119 Speaker 3: not like we were just talking about, about how it 464 00:25:24,200 --> 00:25:26,400 Speaker 3: changes how I search for things or how I learn 465 00:25:26,440 --> 00:25:30,840 Speaker 3: about things. But I would argue, dealing with that data: 466 00:25:30,880 --> 00:25:33,439 Speaker 3: how do I figure out what's in all those video streams? 467 00:25:33,480 --> 00:25:36,199 Speaker 3: How do I figure out, Okay, I want all of 468 00:25:36,240 --> 00:25:39,240 Speaker 3: the chunks of my corporate video that have to do 469 00:25:39,320 --> 00:25:43,359 Speaker 3: with clients buying some specific product or something. That's a 470 00:25:44,000 --> 00:25:46,320 Speaker 3: different problem. It's not just, okay, we'll look it up 471 00:25:46,320 --> 00:25:49,880 Speaker 3: in a spreadsheet and here's the math associated with that. 472 00:25:49,880 --> 00:25:52,280 Speaker 3: That is a huge trend in the industry.
You're seeing 473 00:25:52,280 --> 00:25:54,520 Speaker 3: it play out in this regard. It's a little different 474 00:25:54,600 --> 00:25:58,359 Speaker 3: bent on the AI. Fraud detection is the one that 475 00:25:58,400 --> 00:26:01,400 Speaker 3: we cite in our mainframe. It's a similar problem, where 476 00:26:01,640 --> 00:26:04,320 Speaker 3: it was kind of a traditional AI problem: look up 477 00:26:04,359 --> 00:26:08,480 Speaker 3: a rule. You know, if somebody does two small transactions, 478 00:26:08,520 --> 00:26:10,720 Speaker 3: then a massive one, it might be fraud, right, because 479 00:26:10,720 --> 00:26:14,200 Speaker 3: that's what they were seeing. Whereas now, to detect fraud, 480 00:26:14,240 --> 00:26:18,240 Speaker 3: you might be saying, okay, two transactions, then a huge one. 481 00:26:18,560 --> 00:26:22,399 Speaker 3: Plus, does this entity have a real address? Second, is 482 00:26:22,440 --> 00:26:25,760 Speaker 3: there any web traffic on, you know, Better Business Bureau 483 00:26:25,840 --> 00:26:27,640 Speaker 3: kind of things that says this is a bad business? 484 00:26:27,720 --> 00:26:29,800 Speaker 3: That can help you with fraud. So it's a lot 485 00:26:29,880 --> 00:26:33,200 Speaker 3: more of an, it's an exponential problem. It's a holistic 486 00:26:33,280 --> 00:26:36,000 Speaker 3: problem that takes a lot more than just, you know, 487 00:26:36,240 --> 00:26:39,080 Speaker 3: little chunks of rules, et cetera. And then the third 488 00:26:39,080 --> 00:26:42,920 Speaker 3: one, you know, after AI, is the nature of hybrid 489 00:26:42,960 --> 00:26:46,600 Speaker 3: IT, or hybrid computing.
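The holistic fraud check Rick describes a moment earlier, combining the classic transaction-pattern rule with address and reputation signals, might look something like this toy scorer. The signals, names, and point values are invented for illustration; a real system would learn its weights from data rather than hard-code them.

```python
def fraud_score(transactions, has_real_address, has_bad_reputation):
    """Combine several weak signals into one holistic score (0 to 100)."""
    score = 0
    # Classic pattern: a couple of small probe charges, then a massive one.
    if len(transactions) >= 3 and transactions[-1] > 10 * max(transactions[:-1]):
        score += 50
    # No verifiable street address for the entity.
    if not has_real_address:
        score += 30
    # Outside signals, e.g. bad Better Business Bureau-style reports.
    if has_bad_reputation:
        score += 20
    return score

suspicious = fraud_score([2.00, 1.50, 900.00],
                         has_real_address=False, has_bad_reputation=True)
ordinary = fraud_score([40.00, 25.00, 31.00],
                       has_real_address=True, has_bad_reputation=False)
```

No single signal decides the outcome; it is the combination that separates the suspicious case from the ordinary one, which is the "more than little chunks of rules" point.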
For a while, ten years ago, 490 00:26:46,680 --> 00:26:49,879 Speaker 3: when cloud was on the rise, I think the notion 491 00:26:49,960 --> 00:26:53,159 Speaker 3: of hybrid computing, basically having to do with things in 492 00:26:53,200 --> 00:26:57,199 Speaker 3: the cloud versus things that people still have on the 493 00:26:57,240 --> 00:27:01,840 Speaker 3: premises inside of a business, it was almost a religious argument. Now 494 00:27:01,960 --> 00:27:05,000 Speaker 3: it's, no, it's the reality. And the reason is because 495 00:27:05,040 --> 00:27:08,359 Speaker 3: that data that I talked about is the lifeblood of 496 00:27:08,400 --> 00:27:13,439 Speaker 3: these companies, particularly IBM's clients. Usually that 497 00:27:13,560 --> 00:27:15,920 Speaker 3: data has to be secure, they have to be able 498 00:27:15,960 --> 00:27:18,840 Speaker 3: to get value from it. It is the lifeblood of 499 00:27:18,840 --> 00:27:20,600 Speaker 3: the company. If you go to an ATM and you 500 00:27:20,600 --> 00:27:24,080 Speaker 3: can't get your money out, you know, our financial transactions, 501 00:27:24,800 --> 00:27:27,160 Speaker 3: if that lasts a day, you're probably going to change 502 00:27:27,160 --> 00:27:30,160 Speaker 3: banks immediately. So it's like life or death for these companies. 503 00:27:32,040 --> 00:27:36,520 Speaker 3: So having that hybrid infrastructure so that they can still 504 00:27:36,560 --> 00:27:40,000 Speaker 3: hold their data, still interact with clouds, and still 505 00:27:40,000 --> 00:27:42,960 Speaker 3: get value from it from AI, that's kind of the 506 00:27:43,080 --> 00:27:47,680 Speaker 3: magic where we play, and it's a huge business opportunity. 507 00:27:47,960 --> 00:27:50,360 Speaker 3: It is a true inflection point for the industry. 508 00:27:51,440 --> 00:27:54,840 Speaker 2: I'm going to go back. I interrupted you when you 509 00:27:54,880 --> 00:27:57,000 Speaker 2: were in the middle of an answer.
We were talking 510 00:27:57,000 --> 00:28:01,679 Speaker 2: about what has to happen for AI to scale 511 00:28:01,960 --> 00:28:04,879 Speaker 2: from the infrastructure standpoint. You gave one example, and then I 512 00:28:05,119 --> 00:28:07,480 Speaker 2: got you off on a tangent. Can you go back 513 00:28:07,520 --> 00:28:10,960 Speaker 2: and talk very practically, like, so I'm, you know, 514 00:28:11,040 --> 00:28:14,480 Speaker 2: I'm a big company. I have all these dreams of AI, 515 00:28:15,000 --> 00:28:17,280 Speaker 2: of how I'm going to use it. So give 516 00:28:17,320 --> 00:28:20,600 Speaker 2: me a very granular sense of the work you have 517 00:28:20,680 --> 00:28:23,160 Speaker 2: to do, yeah, to make that dream possible. 518 00:28:23,600 --> 00:28:26,840 Speaker 3: So let me first say what the company has to do, 519 00:28:26,880 --> 00:28:28,840 Speaker 3: and then maybe I'll say, then, how do I help them, 520 00:28:28,920 --> 00:28:31,080 Speaker 3: if that makes sense. So if I'm a company and 521 00:28:31,080 --> 00:28:33,080 Speaker 3: I want to do that. So it turns out I 522 00:28:33,119 --> 00:28:36,919 Speaker 3: am a company, meaning I want to use AI in 523 00:28:36,960 --> 00:28:40,800 Speaker 3: my processes. I mentioned that I have a global network 524 00:28:40,840 --> 00:28:45,000 Speaker 3: of thirteen thousand employees that support our infrastructure around the world. 525 00:28:45,400 --> 00:28:50,800 Speaker 3: That challenge is a great challenge for AI. That means 526 00:28:50,840 --> 00:28:55,400 Speaker 3: I have data for every customer situation, for thirteen thousand 527 00:28:55,440 --> 00:28:58,880 Speaker 3: employees globally around the world, on what was their problem, 528 00:28:58,920 --> 00:29:02,400 Speaker 3: how did we fix it, what next steps did they 529 00:29:02,400 --> 00:29:04,840 Speaker 3: have to do, how did they remediate that?
That data 530 00:29:04,920 --> 00:29:07,400 Speaker 3: is extremely valuable to me, because if I can get 531 00:29:07,440 --> 00:29:10,000 Speaker 3: better at doing that than anybody else in the world, 532 00:29:10,360 --> 00:29:12,680 Speaker 3: that brings my cost down. I sell more products, I 533 00:29:12,720 --> 00:29:15,600 Speaker 3: sell more service, I sell more anything. So what I 534 00:29:15,720 --> 00:29:17,640 Speaker 3: have to do to get there is I have to 535 00:29:17,640 --> 00:29:20,920 Speaker 3: figure out, Okay, what's my objective? I have a couple objectives. 536 00:29:20,960 --> 00:29:23,400 Speaker 3: One, I want customers to be able to support themselves 537 00:29:23,400 --> 00:29:26,440 Speaker 3: without even calling me, first off. And I don't want, 538 00:29:26,440 --> 00:29:29,960 Speaker 3: when they call, for the first answer to come back 539 00:29:29,960 --> 00:29:32,520 Speaker 3: to be, did you try rebooting? Because I think that 540 00:29:32,760 --> 00:29:35,560 Speaker 3: irritates every single one of us. Did you try? Of 541 00:29:35,600 --> 00:29:38,640 Speaker 3: course I tried rebooting, I've had a laptop, of course 542 00:29:38,680 --> 00:29:42,840 Speaker 3: I... Well, okay, well then tell me, okay, what firmware version, 543 00:29:42,880 --> 00:29:44,960 Speaker 3: all that other stuff. Okay, we know this interaction. So 544 00:29:46,240 --> 00:29:48,160 Speaker 3: that's kind of the problem set. Do I want that 545 00:29:48,320 --> 00:29:51,600 Speaker 3: to be customers solving their own problems? Well, even for 546 00:29:51,760 --> 00:29:54,200 Speaker 3: my support agents, I want something in their pocket, on 547 00:29:54,240 --> 00:29:57,160 Speaker 3: their phone, where they say, I'm seeing these symptoms. It says, oh, 548 00:29:57,400 --> 00:29:59,920 Speaker 3: this is happening around the globe, here's kind of the specifics. 549 00:30:00,160 --> 00:30:03,840 Speaker 3: So there are my problems.
What does it mean for infrastructure 550 00:30:03,840 --> 00:30:07,720 Speaker 3: on the back end? So first I've got to get 551 00:30:07,720 --> 00:30:10,480 Speaker 3: all that data together, right, all of those customer logs, 552 00:30:10,520 --> 00:30:13,600 Speaker 3: all that customer support around the globe, et cetera. That 553 00:30:13,680 --> 00:30:16,040 Speaker 3: needs to be stored. That's a big set of data. 554 00:30:16,400 --> 00:30:19,560 Speaker 3: And some of it's not just the fix and that kind 555 00:30:19,640 --> 00:30:22,080 Speaker 3: of thing. Some of it is, Okay, you know, what 556 00:30:22,200 --> 00:30:24,480 Speaker 3: was the firmware version? Who was the tech? Because it 557 00:30:24,520 --> 00:30:27,520 Speaker 3: can matter. Is this their first time fixing this problem? 558 00:30:27,520 --> 00:30:29,520 Speaker 3: Is it their one hundred and fiftieth time? What's their level? 559 00:30:29,800 --> 00:30:34,520 Speaker 3: It's a very complicated problem. Ingesting all that data takes 560 00:30:34,680 --> 00:30:38,080 Speaker 3: an architecture. We have a product called Scale, which is 561 00:30:38,120 --> 00:30:41,200 Speaker 3: one of our storage products that actually makes it easy 562 00:30:41,240 --> 00:30:44,400 Speaker 3: to ingest all that data, get it organized, et cetera, 563 00:30:44,840 --> 00:30:48,840 Speaker 3: and then have a model. It's a whole different process 564 00:30:48,920 --> 00:30:50,640 Speaker 3: to kind of say, did we train our model? We 565 00:30:50,680 --> 00:30:52,840 Speaker 3: can train our own models inside of IBM. We have 566 00:30:52,880 --> 00:30:56,080 Speaker 3: a Granite set of models. Those models we fine tune, 567 00:30:56,440 --> 00:30:58,840 Speaker 3: and then we inference based on those models.
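A toy version of the "match the incoming symptoms to past cases" idea behind this support workflow might look like the sketch below. To be clear, everything here is invented for illustration: the data, the names, and the crude word-overlap similarity. The actual pipeline Rick describes uses ingested global case data and fine-tuned models, not anything this simple.

```python
def similarity(a, b):
    """Crude word-overlap similarity (Jaccard) between two descriptions."""
    words_a, words_b = set(a.lower().split()), set(b.lower().split())
    return len(words_a & words_b) / len(words_a | words_b)

def suggest_fix(symptoms, case_log):
    """Return the remediation from the most similar past support case."""
    best = max(case_log, key=lambda case: similarity(symptoms, case["symptoms"]))
    return best["fix"]

# Hypothetical past cases, standing in for the global support history.
case_log = [
    {"symptoms": "array offline after firmware update", "fix": "roll back firmware"},
    {"symptoms": "slow reads during nightly backup", "fix": "reschedule backup window"},
]

fix = suggest_fix("array went offline right after a firmware update", case_log)
```

Even this crude matcher captures the shape of the payoff: the more resolved cases in the log, the more incoming problems get an immediate, relevant suggestion instead of "did you try rebooting?"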
So we 568 00:30:58,880 --> 00:31:01,400 Speaker 3: can do that inferencing in our cloud, I have a 569 00:31:01,440 --> 00:31:04,280 Speaker 3: cloud set of infrastructure, or in my Power servers. We 570 00:31:04,320 --> 00:31:09,160 Speaker 3: can do inferencing with our capabilities and say, okay, based 571 00:31:09,200 --> 00:31:12,160 Speaker 3: on what I'm seeing, here's the remediation you should 572 00:31:12,200 --> 00:31:15,360 Speaker 3: do for that customer. We already are doing that today. 573 00:31:15,400 --> 00:31:20,920 Speaker 3: We've seen over a third of our support calls have 574 00:31:21,040 --> 00:31:24,240 Speaker 3: had significant reduction in the amount of time that it 575 00:31:24,280 --> 00:31:27,680 Speaker 3: takes to resolve that support call, just by what I 576 00:31:27,720 --> 00:31:28,720 Speaker 3: said right there. 577 00:31:28,840 --> 00:31:32,400 Speaker 2: I've really been curious about this. If I had 578 00:31:32,400 --> 00:31:36,640 Speaker 2: introduced something like AI into that equation, as you just did. Yeah, 579 00:31:36,680 --> 00:31:39,680 Speaker 2: and you said we've already seen a thirty percent... Say, 580 00:31:39,680 --> 00:31:41,040 Speaker 2: did you say thirty percent reduction? 581 00:31:41,400 --> 00:31:45,960 Speaker 3: Thirty percent of our interactions have seen significant reduction in 582 00:31:46,360 --> 00:31:46,960 Speaker 3: those times. 583 00:31:47,080 --> 00:31:50,080 Speaker 2: Was that your primary goal, to reduce the time of 584 00:31:50,120 --> 00:31:52,880 Speaker 2: the interaction? Or was it, if everything else 585 00:31:52,960 --> 00:31:55,240 Speaker 2: was the same, but what you were doing was 586 00:31:55,280 --> 00:31:56,960 Speaker 2: shrinking the amount of time, would you... 587 00:31:56,920 --> 00:32:01,200 Speaker 3: It was one of the primary goals.
So to us in 588 00:32:01,240 --> 00:32:04,880 Speaker 3: that business, net promoter score, kind of the satisfaction of 589 00:32:04,920 --> 00:32:08,360 Speaker 3: a client, is the supreme goal. What makes them satisfied? It 590 00:32:08,800 --> 00:32:11,800 Speaker 3: doesn't cost me a fortune, happens really quickly, and if 591 00:32:11,840 --> 00:32:14,920 Speaker 3: I can do it myself, I'd be thrilled. It affects 592 00:32:15,000 --> 00:32:17,360 Speaker 3: all of those, right. It kind of says it got 593 00:32:17,400 --> 00:32:19,720 Speaker 3: resolved faster, it didn't cost me an arm and a leg 594 00:32:19,760 --> 00:32:22,840 Speaker 3: because the tech was barely here, because it's a common problem, 595 00:32:23,160 --> 00:32:27,000 Speaker 3: or I solved it myself without even calling. So all 596 00:32:27,040 --> 00:32:29,160 Speaker 3: of those objectives would kind of hit across all of those, so 597 00:32:29,200 --> 00:32:31,440 Speaker 3: that now you see it. So that's a little microcosm. 598 00:32:31,480 --> 00:32:33,880 Speaker 3: That's just me and my customer support business. Now think 599 00:32:33,960 --> 00:32:37,480 Speaker 3: of how many problems for businesses around the world there 600 00:32:37,480 --> 00:32:40,200 Speaker 3: are like that. It's not like a 601 00:32:40,280 --> 00:32:44,640 Speaker 3: new AI application that changes the entire user experience. 602 00:32:45,520 --> 00:32:48,880 Speaker 3: Those will come. But right now it's kind of practical, 603 00:32:48,960 --> 00:32:51,000 Speaker 3: which is, I just want to do what I'm doing 604 00:32:51,080 --> 00:32:54,840 Speaker 3: better and faster, and I can get immediate economic return 605 00:32:54,920 --> 00:32:55,600 Speaker 3: from those things.
606 00:32:55,600 --> 00:32:58,680 Speaker 2: How long did it take you? To just 607 00:32:58,760 --> 00:33:02,120 Speaker 2: stick with that example of the customer interaction reducing thirty 608 00:33:02,120 --> 00:33:04,800 Speaker 2: percent of the time: how long, from the very beginning 609 00:33:04,800 --> 00:33:08,160 Speaker 2: of that project, yep, to that thirty percent reduction? 610 00:33:08,160 --> 00:33:08,520 Speaker 2: How long? 611 00:33:09,160 --> 00:33:14,560 Speaker 3: Less than a year. And yeah, so one of the challenges, 612 00:33:14,840 --> 00:33:18,479 Speaker 3: and this is interesting with a very large organization, as 613 00:33:18,560 --> 00:33:21,000 Speaker 3: you can imagine, just like you're seeing in the industry: 614 00:33:21,880 --> 00:33:25,440 Speaker 3: we don't have a problem of generating ideas for how 615 00:33:25,520 --> 00:33:29,160 Speaker 3: AI could help us. We actually have a problem filtering 616 00:33:29,560 --> 00:33:33,400 Speaker 3: the thousands of ideas from our employees and from everywhere. 617 00:33:33,400 --> 00:33:35,760 Speaker 3: It's like, hey, we could use AI to... and filtering 618 00:33:35,800 --> 00:33:37,960 Speaker 3: down and saying, okay, which of these will have a 619 00:33:38,040 --> 00:33:41,800 Speaker 3: return on investment quickly, and at a level that sustains, 620 00:33:41,840 --> 00:33:44,560 Speaker 3: that's worth kind of going and investing in the infrastructure 621 00:33:44,560 --> 00:33:48,040 Speaker 3: and the software and kind of making that happen. 622 00:33:48,160 --> 00:33:51,440 Speaker 2: Is that unusual? If I talked to you twenty five 623 00:33:51,560 --> 00:33:53,880 Speaker 2: years ago and said, do you have a problem of 624 00:33:53,920 --> 00:33:55,280 Speaker 2: too many good ideas or too few?
625 00:33:55,320 --> 00:34:01,480 Speaker 3: In this specific area? Probably too few, 626 00:34:01,640 --> 00:34:05,040 Speaker 3: because at some point you reach diminishing returns. So, for example, 627 00:34:05,080 --> 00:34:09,360 Speaker 3: let's use this same example. Can those thirteen thousand technicians 628 00:34:09,400 --> 00:34:13,799 Speaker 3: go faster? Can they spend less time driving to the site? 629 00:34:13,800 --> 00:34:15,520 Speaker 3: I mean, there's only so much you can kind of 630 00:34:15,560 --> 00:34:17,920 Speaker 3: do on those things. But if you can get them 631 00:34:17,920 --> 00:34:20,200 Speaker 3: an answer to the problem, and maybe even avoid them 632 00:34:20,239 --> 00:34:22,799 Speaker 3: having to visit at all because the client helped themselves, 633 00:34:23,160 --> 00:34:26,160 Speaker 3: that's a step function. So that's why people are kind 634 00:34:26,200 --> 00:34:30,600 Speaker 3: of talking about there's a business revolution coming with AI, 635 00:34:30,719 --> 00:34:33,719 Speaker 3: where there are some step function changes that can be there. 636 00:34:33,760 --> 00:34:37,080 Speaker 3: And notice I didn't say I'm going to have less 637 00:34:37,120 --> 00:34:40,719 Speaker 3: of those agents. That's not my objective. My objective, and 638 00:34:40,719 --> 00:34:43,120 Speaker 3: I think that's the fear in the industry, that AI 639 00:34:43,200 --> 00:34:45,560 Speaker 3: is going to eliminate all the jobs. No, I just 640 00:34:45,640 --> 00:34:49,600 Speaker 3: created thirteen thousand superpowered agents that can do more, right. 641 00:34:49,680 --> 00:34:51,880 Speaker 3: And so I'm not just going to support IBM products. 642 00:34:52,120 --> 00:34:54,280 Speaker 3: I'm going to go out and support other people's products, 643 00:34:54,320 --> 00:34:55,960 Speaker 3: because I know how to do that really well.
And 644 00:34:56,000 --> 00:34:58,759 Speaker 3: once I have the data on how to fix their problems, 645 00:34:59,200 --> 00:35:02,800 Speaker 3: I may just have a customer support business that's independent 646 00:35:02,840 --> 00:35:05,600 Speaker 3: of my boxes. So, you know, I think that's where 647 00:35:05,640 --> 00:35:08,160 Speaker 3: people sometimes get it wrong on the AI thing. 648 00:35:08,760 --> 00:35:12,719 Speaker 3: It's like, you know, did word processing eliminate the need 649 00:35:12,800 --> 00:35:18,200 Speaker 3: for writers? No. It enabled writing, instead of mucking around 650 00:35:18,200 --> 00:35:20,840 Speaker 3: with mimeograph machines and clickety-clack typewriters. 651 00:35:20,920 --> 00:35:23,400 Speaker 2: It may have enabled too much writing. Yeah, maybe, maybe. 652 00:35:24,040 --> 00:35:27,200 Speaker 2: Can I give you a hypothetical? Uh, and I ask 653 00:35:27,239 --> 00:35:29,279 Speaker 2: this because I was at some conference and 654 00:35:29,280 --> 00:35:31,360 Speaker 2: I ran into some guy from the IRS 655 00:35:32,520 --> 00:35:36,000 Speaker 2: who was really, really, really, really excited about AI. So 656 00:35:36,080 --> 00:35:40,239 Speaker 2: let's suppose they call you up and they say, you're 657 00:35:40,280 --> 00:35:43,920 Speaker 2: going to talk to the IRS. Okay, I 658 00:35:44,000 --> 00:35:48,879 Speaker 2: call you up and I say, Rick, uh, clearly there's 659 00:35:48,920 --> 00:35:52,480 Speaker 2: something that we could do for the IRS 660 00:35:52,520 --> 00:35:53,360 Speaker 2: if we work together. 661 00:35:53,560 --> 00:35:53,759 Speaker 3: Yeah. 662 00:35:53,760 --> 00:35:54,640 Speaker 2: What would your answer be? 663 00:35:55,920 --> 00:35:56,400 Speaker 3: Of course. 664 00:35:57,080 --> 00:35:57,160 Speaker 2: No? 665 00:35:57,280 --> 00:36:01,000 Speaker 3: I think we sell to a lot of government agencies.
666 00:36:01,360 --> 00:36:05,120 Speaker 3: You can imagine, in the business that we're in, we enable 667 00:36:05,280 --> 00:36:08,600 Speaker 3: a lot of Social Security transactions and things like that 668 00:36:08,600 --> 00:36:12,799 Speaker 3: through our mainframes. And I think, you know, we're in 669 00:36:12,840 --> 00:36:16,319 Speaker 3: the business of helping whatever client get the most out 670 00:36:16,320 --> 00:36:18,320 Speaker 3: of their data and be able to secure it 671 00:36:18,880 --> 00:36:21,760 Speaker 3: and be able to do analytics with it. And the IRS 672 00:36:21,800 --> 00:36:23,959 Speaker 3: has a heck of a lot of data, so yes, 673 00:36:24,000 --> 00:36:24,759 Speaker 3: we would help them. 674 00:36:25,120 --> 00:36:26,840 Speaker 2: Do you know how the amount of data they have 675 00:36:26,960 --> 00:36:29,000 Speaker 2: compares to some of the corporate clients you have? 676 00:36:29,080 --> 00:36:32,040 Speaker 3: I don't know specifically for the IRS how much data 677 00:36:32,120 --> 00:36:34,040 Speaker 3: they have, but I would assume it's a whole lot. 678 00:36:34,360 --> 00:36:37,719 Speaker 3: It's mountains. But that's our business. I mean, it's 679 00:36:37,760 --> 00:36:41,279 Speaker 3: interesting, sometimes people have that question: you know, 680 00:36:41,320 --> 00:36:45,279 Speaker 3: what is it that IBM has that's of 681 00:36:45,480 --> 00:36:49,279 Speaker 3: great value? Is it a server? Is it a storage array? 682 00:36:49,400 --> 00:36:51,920 Speaker 3: Is it, you know, software and all that? What we 683 00:36:52,040 --> 00:36:56,440 Speaker 3: have is: the most important entities in the world have 684 00:36:56,600 --> 00:37:00,520 Speaker 3: their data on our stuff. The most important data in 685 00:37:00,560 --> 00:37:03,759 Speaker 3: the world. It's not, you know, pictures of your grandkids 686 00:37:03,840 --> 00:37:06,120 Speaker 3: and things like that.
Generally for us, it's all of 687 00:37:06,160 --> 00:37:09,520 Speaker 3: the financial transactions that happen globally, right? It's 688 00:37:09,560 --> 00:37:12,560 Speaker 3: the world's economy that is kind of running through 689 00:37:13,160 --> 00:37:16,319 Speaker 3: our systems, and so we take that really seriously. You know, 690 00:37:17,200 --> 00:37:19,440 Speaker 3: you would be distraught if you lost one photo on 691 00:37:19,480 --> 00:37:22,279 Speaker 3: your laptop or whatever. But you know, if we lose 692 00:37:22,320 --> 00:37:25,359 Speaker 3: a transaction, like somebody moves a big amount of money 693 00:37:25,400 --> 00:37:28,120 Speaker 3: and it's like, well, we don't know what happened there, it 694 00:37:28,200 --> 00:37:31,520 Speaker 3: is a massive deal, right? So that doesn't happen. 695 00:37:31,640 --> 00:37:33,160 Speaker 2: But I want to go back to my IRS example. 696 00:37:33,200 --> 00:37:37,080 Speaker 2: Yes. So one, is it reasonable to assume 697 00:37:37,360 --> 00:37:42,399 Speaker 2: that somebody, IBM or somebody else, could 698 00:37:42,400 --> 00:37:44,880 Speaker 2: in a short period of time put together not just 699 00:37:44,960 --> 00:37:50,520 Speaker 2: the AI capability to audit returns, but also the 700 00:37:50,560 --> 00:37:53,799 Speaker 2: infrastructure support for that, in a reasonable amount of time 701 00:37:53,800 --> 00:37:56,520 Speaker 2: for a reasonable amount of cost? Or is it 702 00:37:56,640 --> 00:37:59,480 Speaker 2: going to the moon? 703 00:37:59,680 --> 00:38:03,560 Speaker 3: Definitely.
I mean, so we're already doing that kind of 704 00:38:03,600 --> 00:38:08,880 Speaker 3: thing right across a network of banks and others; essentially 705 00:38:09,000 --> 00:38:12,920 Speaker 3: all credit card transactions for all of the world 706 00:38:12,960 --> 00:38:16,200 Speaker 3: go through our systems, so that in some ways is 707 00:38:16,239 --> 00:38:19,720 Speaker 3: more volume than the tax returns of the US people 708 00:38:19,840 --> 00:38:23,040 Speaker 3: and their W-2s and all that stuff, and we 709 00:38:23,120 --> 00:38:25,920 Speaker 3: do that stuff too. I try not to describe it 710 00:38:25,960 --> 00:38:28,160 Speaker 3: in too much detail, but we definitely do a lot 711 00:38:28,200 --> 00:38:33,840 Speaker 3: of that. In fact, if you think, okay, 712 00:38:33,840 --> 00:38:37,000 Speaker 3: what is super critical data, who would be doing the 713 00:38:37,040 --> 00:38:41,080 Speaker 3: business transaction processing? It is most likely us in almost 714 00:38:41,160 --> 00:38:45,520 Speaker 3: all cases, whether it's government things or private or banks 715 00:38:45,840 --> 00:38:47,960 Speaker 3: or that kind of thing. That's what we do. 716 00:38:48,280 --> 00:38:50,280 Speaker 2: Rick, we're going to end where we always 717 00:38:50,400 --> 00:38:52,240 Speaker 2: end, with a couple of quick fire questions. 718 00:38:52,320 --> 00:38:53,319 Speaker 3: Okay, here we go. 719 00:38:54,200 --> 00:38:57,280 Speaker 2: What single piece of advice would you give to businesses 720 00:38:57,320 --> 00:38:59,879 Speaker 2: trying to use AI in an effective way? 721 00:39:00,120 --> 00:39:03,680 Speaker 3: The simple version is: get started. By get started, I 722 00:39:03,719 --> 00:39:07,760 Speaker 3: mean, think of what is something that I want to improve.
723 00:39:07,880 --> 00:39:10,040 Speaker 3: The things that we have traction on right now in 724 00:39:10,080 --> 00:39:16,000 Speaker 3: the market are around business process automation, digital labor, those 725 00:39:16,120 --> 00:39:19,320 Speaker 3: kinds of things. But my other little piece of advice 726 00:39:19,360 --> 00:39:21,360 Speaker 3: there is keep it simple to begin with. You're going 727 00:39:21,440 --> 00:39:24,120 Speaker 3: to learn a lot, but getting started means you'll start 728 00:39:24,160 --> 00:39:27,960 Speaker 3: that learning curve. Even my friends ask me, like, hey, 729 00:39:27,960 --> 00:39:30,520 Speaker 3: should I be playing around with some of this AI stuff? 730 00:39:30,560 --> 00:39:33,000 Speaker 3: And I say yeah, because I think it will help 731 00:39:33,040 --> 00:39:35,719 Speaker 3: you start to be more comfortable and you may find 732 00:39:35,760 --> 00:39:37,759 Speaker 3: a personal use case for that. I think the same 733 00:39:37,880 --> 00:39:40,920 Speaker 3: is true for businesses. The first step in that journey 734 00:39:40,960 --> 00:39:44,680 Speaker 3: is always: with what data? Notice when I talked about 735 00:39:44,680 --> 00:39:49,000 Speaker 3: our customer support people, I thought about, okay, what's the data? 736 00:39:49,160 --> 00:39:51,719 Speaker 3: The data is all of those logs of all of 737 00:39:51,719 --> 00:39:54,920 Speaker 3: those service engagements around the world, and what could I 738 00:39:54,960 --> 00:39:56,759 Speaker 3: do with that? Well, I could use that to get 739 00:39:56,800 --> 00:40:00,719 Speaker 3: to a knowledge base that really helps, and hopefully 740 00:40:00,760 --> 00:40:03,319 Speaker 3: I can do it in multiple languages because it's global, 741 00:40:03,400 --> 00:40:05,920 Speaker 3: and, you know, all of those things.
That was 742 00:40:06,000 --> 00:40:08,760 Speaker 3: kind of my data set. That one's not super simple, 743 00:40:08,800 --> 00:40:11,560 Speaker 3: but we've had a lot of experience in AI. For 744 00:40:11,640 --> 00:40:14,520 Speaker 3: other people it might just be, how do I automate 745 00:40:14,640 --> 00:40:18,600 Speaker 3: filling out travel expense reports for my company? We can 746 00:40:18,600 --> 00:40:21,040 Speaker 3: help people with that; we have consulting, we have watsonx 747 00:40:21,200 --> 00:40:23,319 Speaker 3: tools. We can do that like this, and we're 748 00:40:23,360 --> 00:40:26,479 Speaker 3: doing it globally for people around the world. Pick that thing. 749 00:40:26,560 --> 00:40:29,520 Speaker 3: What's the data you have? In that case, it's data 750 00:40:29,520 --> 00:40:31,839 Speaker 3: of expense reports, and it's like, okay, we can help 751 00:40:31,840 --> 00:40:34,120 Speaker 3: you automate that for people, where they could do it 752 00:40:34,239 --> 00:40:38,040 Speaker 3: just by, you know, a verbal interface. What did you spend, 753 00:40:38,120 --> 00:40:40,120 Speaker 3: where did you go, who were you with? Okay, 754 00:40:40,200 --> 00:40:42,480 Speaker 3: we filled out your travel expense report for you and 755 00:40:42,520 --> 00:40:43,799 Speaker 3: you don't have to mess around with it. 756 00:40:43,960 --> 00:40:46,600 Speaker 2: So we were playing with this idea where we would 757 00:40:46,920 --> 00:40:50,279 Speaker 2: pick a business and go in there and do it; it 758 00:40:50,320 --> 00:40:51,360 Speaker 2: would be an AI makeover. 759 00:40:51,640 --> 00:40:52,640 Speaker 3: Yeah, I love that. 760 00:40:52,960 --> 00:40:55,960 Speaker 2: Okay, what is the ideal business to do that in? 761 00:40:56,680 --> 00:40:58,200 Speaker 2: We only have a couple months. We don't want to 762 00:40:58,200 --> 00:41:00,480 Speaker 2: spend a kajillion dollars.
We want to be able to 763 00:41:00,520 --> 00:41:04,239 Speaker 2: show tangibly and quickly what AI can do. What's that 764 00:41:04,360 --> 00:41:06,120 Speaker 2: ideal business to do that in? It can be a 765 00:41:06,120 --> 00:41:09,120 Speaker 2: small business, but we're not talking, this isn't a grand corporate thing. 766 00:41:08,800 --> 00:41:13,560 Speaker 3: Ah boy, a small business that we could do 767 00:41:13,640 --> 00:41:17,840 Speaker 3: an AI makeover on. Customer support is one of 768 00:41:17,840 --> 00:41:21,239 Speaker 3: my favorites, because I have it on 769 00:41:21,280 --> 00:41:24,600 Speaker 3: the business side, where I provide customer support. I have 770 00:41:24,680 --> 00:41:27,240 Speaker 3: it on the consumer side, where it drives me nuts 771 00:41:27,560 --> 00:41:30,680 Speaker 3: when I have to go through thirty layers of phone menus. 772 00:41:31,200 --> 00:41:33,239 Speaker 3: Speak to an agent, speak to an agent, speak to 773 00:41:33,280 --> 00:41:37,480 Speaker 3: an agent. That, for any business, I think is just 774 00:41:37,719 --> 00:41:39,960 Speaker 3: ripe to be able to kind of say, why do 775 00:41:40,040 --> 00:41:42,319 Speaker 3: I have to click through these menus and messages? I just 776 00:41:42,360 --> 00:41:44,960 Speaker 3: need to tell you in human language, here's the issue, 777 00:41:44,960 --> 00:41:47,720 Speaker 3: and I'll be really good about telling you details about it: 778 00:41:48,239 --> 00:41:50,680 Speaker 3: you know, I tried to set up this thing for 779 00:41:50,840 --> 00:41:52,640 Speaker 3: my bank and I did da da da da da. 780 00:41:52,880 --> 00:41:56,560 Speaker 3: It can go through all the menus, automate that process.
781 00:41:56,840 --> 00:41:59,160 Speaker 3: I think it would change everything, because all that frustration 782 00:41:59,360 --> 00:42:02,719 Speaker 3: as a consumer would go down dramatically. And it's all, 783 00:42:03,239 --> 00:42:06,240 Speaker 3: you know, why are you making me do the beep boop, 784 00:42:06,440 --> 00:42:10,560 Speaker 3: press one, offload? Press... exactly. Well, don't offload to me, 785 00:42:10,800 --> 00:42:13,239 Speaker 3: offload to AI. We can help you with that. 786 00:42:13,520 --> 00:42:16,800 Speaker 2: Here's my version of that; it drives me crazy. Every morning 787 00:42:16,960 --> 00:42:20,000 Speaker 2: I go to the same coffee shop and I get 788 00:42:20,600 --> 00:42:22,719 Speaker 2: a cup of tea and a croissant. 789 00:42:23,080 --> 00:42:24,000 Speaker 3: And here's what happens. 790 00:42:24,000 --> 00:42:26,719 Speaker 2: The person has their screen, and I go, 791 00:42:27,000 --> 00:42:35,239 Speaker 2: cup of tea, croissant, sparkling water, and they go, like, at least twenty keystrokes, 792 00:42:36,000 --> 00:42:38,640 Speaker 2: and then the screen is turned around. Like, 793 00:42:38,719 --> 00:42:41,080 Speaker 2: at this point we're like forty-five seconds in, and I'm like, 794 00:42:41,360 --> 00:42:43,279 Speaker 2: why is this? First of all, it's not for me, 795 00:42:43,360 --> 00:42:46,520 Speaker 2: all those keystrokes, it's their internal thing, right, right. So 796 00:42:46,719 --> 00:42:48,320 Speaker 2: they're burdening me in order to service 797 00:42:48,360 --> 00:42:50,080 Speaker 3: the back end. You should be able to walk in, 798 00:42:50,320 --> 00:42:52,319 Speaker 3: go up, and they go, Malcolm, the same 799 00:42:52,360 --> 00:42:54,879 Speaker 3: thing? And you just go yes, and then boom, 800 00:42:55,040 --> 00:42:55,439 Speaker 3: we're done. 801 00:42:55,480 --> 00:42:57,600 Speaker 2: Can we do an AI makeover of my coffee shop?
802 00:42:59,440 --> 00:43:03,160 Speaker 3: You notice I quickly jumped more to banks than your 803 00:43:03,239 --> 00:43:06,200 Speaker 3: coffee shop, because I think I'm a business person, but 804 00:43:06,520 --> 00:43:08,680 Speaker 3: I'm not trying to kind of do a deal on 805 00:43:08,680 --> 00:43:09,520 Speaker 3: one coffee shop. 806 00:43:09,520 --> 00:43:11,719 Speaker 2: No. But this is interesting, because it takes me back 807 00:43:11,760 --> 00:43:14,280 Speaker 2: to something you said that I thought was really important. 808 00:43:14,640 --> 00:43:17,359 Speaker 2: When you were talking about when you were using AI 809 00:43:17,520 --> 00:43:20,800 Speaker 2: in your customer service thing, it was clear that your goal... 810 00:43:20,960 --> 00:43:23,759 Speaker 2: you could have any number of goals going in. It 811 00:43:23,800 --> 00:43:27,280 Speaker 2: could be to cut costs, it could be to dramatically 812 00:43:27,320 --> 00:43:28,440 Speaker 2: improve profits. 813 00:43:29,000 --> 00:43:29,840 Speaker 3: Your goal, quite 814 00:43:29,680 --> 00:43:32,560 Speaker 2: specifically, was to improve the experience of your customer, right? 815 00:43:32,600 --> 00:43:33,920 Speaker 2: So you were using it toward that. 816 00:43:34,120 --> 00:43:36,879 Speaker 3: All the other things come from that. 817 00:43:37,120 --> 00:43:40,720 Speaker 3: That is actually one of the beautiful pillars of the IBM 818 00:43:40,760 --> 00:43:44,360 Speaker 3: culture: delighting clients is actually where all of the 819 00:43:44,440 --> 00:43:45,560 Speaker 3: good stuff comes from. 820 00:43:45,600 --> 00:43:49,640 Speaker 2: So my coffee shop thing is the same principle. Right now, 821 00:43:49,880 --> 00:43:52,920 Speaker 2: they're making my customer experience worse and they don't want to. 822 00:43:54,400 --> 00:43:56,279 Speaker 3: Their eyes are glued to the screen.
823 00:43:56,239 --> 00:43:58,359 Speaker 2: The moment when I walk in and I want to say, hi, 824 00:43:58,480 --> 00:44:03,520 Speaker 2: how are you doing, have a conversation? You're too busy. So, like, 825 00:44:04,040 --> 00:44:05,839 Speaker 2: this is the same thing. 826 00:44:05,840 --> 00:44:08,279 Speaker 2: If they understood they had an opportunity 827 00:44:08,320 --> 00:44:11,040 Speaker 2: to improve their customer experience... 828 00:44:11,080 --> 00:44:14,960 Speaker 3: I would not be surprised if a chain comes along 829 00:44:15,400 --> 00:44:17,880 Speaker 3: where that is their value proposition. I would not be 830 00:44:17,920 --> 00:44:22,520 Speaker 3: surprised at all. Yeah, right. So, I mean, 831 00:44:22,560 --> 00:44:26,280 Speaker 3: when those things kind of catch hold, it becomes a revolution. 832 00:44:26,520 --> 00:44:28,480 Speaker 2: You know, when the guy comes to 833 00:44:28,480 --> 00:44:31,160 Speaker 2: redo your roof and they put a sign out front, 834 00:44:31,239 --> 00:44:33,840 Speaker 2: like, you know, Joe's Roofing. You guys could do the 835 00:44:33,840 --> 00:44:37,840 Speaker 2: same with my coffee shop, but like, AI was 836 00:44:37,920 --> 00:44:45,759 Speaker 2: here. Exactly, exactly. In five years, the mainframe will 837 00:44:45,760 --> 00:44:48,640 Speaker 2: be dot dot dot. 838 00:44:49,320 --> 00:44:54,840 Speaker 3: The mainframe going strong, and with new capabilities, 839 00:44:54,920 --> 00:44:59,480 Speaker 3: continuous new capabilities. I think when we announced the 840 00:44:59,560 --> 00:45:04,120 Speaker 3: last version, sixteen, the latest version I should say, we said, hey, 841 00:45:04,120 --> 00:45:07,880 Speaker 3: there's AI processing built into it. This was before everybody 842 00:45:07,920 --> 00:45:10,200 Speaker 3: was talking about that.
I think a lot of people thought, 843 00:45:10,480 --> 00:45:13,360 Speaker 3: what's that for? And we did it specifically for traditional 844 00:45:13,360 --> 00:45:17,359 Speaker 3: AI, fraud detection, et cetera. This next version, not only 845 00:45:17,400 --> 00:45:19,560 Speaker 3: do we have the traditional AI built in, but we 846 00:45:19,640 --> 00:45:22,799 Speaker 3: have optional cards that you can plug into it to 847 00:45:22,840 --> 00:45:26,640 Speaker 3: allow you to do large language models for the enhanced 848 00:45:26,680 --> 00:45:30,440 Speaker 3: fraud detection cases that we talked about, where, you know, 849 00:45:30,520 --> 00:45:34,439 Speaker 3: it's more than just what transactions were happening. So if 850 00:45:34,440 --> 00:45:38,360 Speaker 3: you take that and say, okay, the next generations... We 851 00:45:38,440 --> 00:45:42,000 Speaker 3: have more transaction volume than we've ever had in mainframes today, 852 00:45:42,200 --> 00:45:46,000 Speaker 3: the business is growing, it's strong, we keep innovating. In 853 00:45:46,040 --> 00:45:47,600 Speaker 3: five years it'll be going strong. 854 00:45:47,719 --> 00:45:50,319 Speaker 2: But wait. You're saying this in the context of... 855 00:45:51,200 --> 00:45:53,399 Speaker 2: for years people were predicting, weren't they, that the 856 00:45:53,440 --> 00:45:54,480 Speaker 2: mainframe was going to go away? 857 00:45:56,360 --> 00:45:58,799 Speaker 3: There were pundits in the market that said everything will 858 00:45:58,800 --> 00:46:00,800 Speaker 3: go away, no one will ever have a box, 859 00:46:00,800 --> 00:46:03,480 Speaker 3: it'll all be online. I think this is something I've 860 00:46:03,560 --> 00:46:08,080 Speaker 3: learned big time in my long career, you know, in 861 00:46:08,120 --> 00:46:12,359 Speaker 3: the IT industry: don't believe everything you hear.
So 862 00:46:12,440 --> 00:46:16,520 Speaker 3: I went back for my master's degree at Stanford after 863 00:46:16,560 --> 00:46:21,160 Speaker 3: I had worked a while as a hardware designer, 864 00:46:21,440 --> 00:46:24,280 Speaker 3: and everybody told me, be sure to do your master's 865 00:46:24,280 --> 00:46:27,320 Speaker 3: in software, hardware is dead. I went on to work 866 00:46:27,560 --> 00:46:31,120 Speaker 3: for thirty-plus years in hardware and infrastructure. Now, software 867 00:46:31,120 --> 00:46:33,560 Speaker 3: became important, and I'm glad I had that extra training 868 00:46:33,560 --> 00:46:36,360 Speaker 3: in software, because it helped me in hardware. But hardware 869 00:46:36,440 --> 00:46:40,000 Speaker 3: wasn't dead. Then I heard all infrastructure will go into 870 00:46:40,040 --> 00:46:43,080 Speaker 3: the cloud, there won't be... That hasn't happened. It's not happening. 871 00:46:43,360 --> 00:46:45,800 Speaker 3: Then I heard there will only be one cloud, because 872 00:46:45,840 --> 00:46:48,280 Speaker 3: one of the players will dominate. There's not one cloud. 873 00:46:48,360 --> 00:46:52,600 Speaker 3: So I think, as humans, we like to oversimplify 874 00:46:52,640 --> 00:46:54,920 Speaker 3: and go, oh, it's all going to be this, and 875 00:46:55,000 --> 00:46:59,080 Speaker 3: kind of what I've learned is fit for purpose matters 876 00:46:59,120 --> 00:47:04,880 Speaker 3: in everything. It matters in size of infrastructure, it matters 877 00:47:04,880 --> 00:47:07,600 Speaker 3: in the stack that goes along with solving a specific 878 00:47:07,760 --> 00:47:10,960 Speaker 3: use case.
If you're willing to design something that's the 879 00:47:10,960 --> 00:47:13,280 Speaker 3: best at that use case, if you're willing to design 880 00:47:13,360 --> 00:47:15,680 Speaker 3: the coffee shop that is the best at greeting me, 881 00:47:16,040 --> 00:47:17,920 Speaker 3: there's a spot for you, and there may be a 882 00:47:17,960 --> 00:47:21,840 Speaker 3: big business in doing that. So oversimplifying is really... 883 00:47:21,840 --> 00:47:25,040 Speaker 2: When you heard all those predictions, did you believe them 884 00:47:25,040 --> 00:47:25,799 Speaker 2: at the time? 885 00:47:27,200 --> 00:47:29,960 Speaker 3: They looked like they were trending in that direction. I'll 886 00:47:30,000 --> 00:47:32,680 Speaker 3: tell you some right now, which might be useful. There 887 00:47:32,680 --> 00:47:35,080 Speaker 3: will only be one GPU company, and they're going to 888 00:47:35,800 --> 00:47:38,400 Speaker 3: end up taking over the world. It's a pretty obvious answer 889 00:47:38,440 --> 00:47:41,839 Speaker 3: whose economic value has risen dramatically. I don't think that's going 890 00:47:41,880 --> 00:47:44,320 Speaker 3: to be the case. In fact, I think that ninety 891 00:47:44,360 --> 00:47:50,000 Speaker 3: percent of processing for AI actually happens at inferencing, 892 00:47:50,440 --> 00:47:53,920 Speaker 3: and inferencing is not as GPU- and hardware-intensive as 893 00:47:53,960 --> 00:47:56,400 Speaker 3: the other things, and is a lot more amenable to 894 00:47:56,520 --> 00:47:59,600 Speaker 3: fit for purpose. So the model size will matter. The 895 00:47:59,680 --> 00:48:02,040 Speaker 3: tuning matters a lot, as we're learning. We have a 896 00:48:02,040 --> 00:48:06,120 Speaker 3: product around InstructLab that's really focused on tuning. So 897 00:48:06,560 --> 00:48:08,520 Speaker 3: that was one thing, that there'll be one GPU. The 898 00:48:08,560 --> 00:48:12,480 Speaker 3: other thing is that the biggest model will win.
That, I 899 00:48:12,520 --> 00:48:14,879 Speaker 3: think, is another thing that people are kind of saying 900 00:48:14,960 --> 00:48:17,359 Speaker 3: right now. Don't believe that. I believe there will be 901 00:48:17,440 --> 00:48:20,520 Speaker 3: fit-for-purpose models. It takes a lot of money to 902 00:48:20,600 --> 00:48:24,120 Speaker 3: create a huge model, and then to run 903 00:48:24,160 --> 00:48:26,719 Speaker 3: a huge model, or even to infer off of a 904 00:48:26,840 --> 00:48:30,439 Speaker 3: huge model. I don't need a massive training GPU 905 00:48:30,480 --> 00:48:34,160 Speaker 3: setup to solve my thirteen-thousand-person customer support issues, 906 00:48:34,200 --> 00:48:36,319 Speaker 3: so why would I feel like I've got to go 907 00:48:36,440 --> 00:48:38,839 Speaker 3: farm that out to a big expensive thing? I can 908 00:48:38,880 --> 00:48:40,840 Speaker 3: do that on a small box. In some cases I 909 00:48:40,920 --> 00:48:42,640 Speaker 3: might even be able to do that on a laptop. 910 00:48:43,120 --> 00:48:45,000 Speaker 3: The other thing I'll say is this: we are so 911 00:48:45,239 --> 00:48:47,960 Speaker 3: early innings in AI, a lot of things are going 912 00:48:48,000 --> 00:48:50,399 Speaker 3: to change. So anybody kind of saying it will all 913 00:48:50,440 --> 00:48:52,920 Speaker 3: be X, Y, or Z, I just think you have 914 00:48:53,000 --> 00:48:54,920 Speaker 3: no idea how this is going to play out. And 915 00:48:55,600 --> 00:48:57,279 Speaker 3: it's up to us to go figure out how it 916 00:48:57,280 --> 00:48:57,799 Speaker 3: plays out. 917 00:48:58,080 --> 00:49:02,000 Speaker 2: Yeah, yeah. All right, in five years, AI will be 918 00:49:02,320 --> 00:49:03,160 Speaker 2: dot dot dot. 919 00:49:03,800 --> 00:49:10,160 Speaker 3: Still new.
It will have moved a bunch in five years, 920 00:49:10,880 --> 00:49:15,160 Speaker 3: but for the potential of the disruption in the world, we will 921 00:49:15,200 --> 00:49:18,320 Speaker 3: still be very early innings in that process. 922 00:49:18,440 --> 00:49:20,799 Speaker 3: And I think that's super important to realize. That's why 923 00:49:20,840 --> 00:49:24,359 Speaker 3: I say get started, start thinking about how that could change, 924 00:49:24,360 --> 00:49:27,200 Speaker 3: because it'll be some little things first, but it will 925 00:49:27,239 --> 00:49:28,320 Speaker 3: continue to snowball. 926 00:49:28,520 --> 00:49:34,000 Speaker 2: This is a common observation, that the invention of 927 00:49:34,040 --> 00:49:40,439 Speaker 2: the capability massively predates the understanding of the capability, right? 928 00:49:40,520 --> 00:49:44,560 Speaker 2: I love that. Yeah. Like, recording shows 929 00:49:44,680 --> 00:49:52,239 Speaker 2: on television is invented in the sixties, probably, and we don't 930 00:49:52,280 --> 00:49:57,440 Speaker 2: really understand what it's used for until the aughts. 931 00:49:57,200 --> 00:49:59,600 Speaker 2: What it's really good for is being able to tell a 932 00:49:59,640 --> 00:50:02,920 Speaker 2: story sequentially, yes, over time, because you know that the 933 00:50:02,920 --> 00:50:04,719 Speaker 2: person will have seen the episode before, so 934 00:50:04,760 --> 00:50:08,319 Speaker 2: you get The Sopranos. And yes, yes, Hollywood wanted to 935 00:50:08,480 --> 00:50:11,680 Speaker 2: ban the VCR in the beginning. Yeah, because they thought 936 00:50:11,680 --> 00:50:13,680 Speaker 2: it was... they thought the point of it was... 937 00:50:14,000 --> 00:50:17,040 Speaker 2: they didn't understand. No, no, no, it's storytelling. It's actually, 938 00:50:17,120 --> 00:50:19,840 Speaker 2: your business is getting better.
Yes, yes. Took them twenty 939 00:50:19,880 --> 00:50:21,719 Speaker 2: years to figure that out. Which is, to your point, 940 00:50:22,239 --> 00:50:24,400 Speaker 2: why would we know what AI is four or five years out? 941 00:50:24,400 --> 00:50:26,520 Speaker 3: Well, that's why you hear people kind of say, oh 942 00:50:26,600 --> 00:50:29,279 Speaker 3: my gosh, AI, that will just eliminate jobs. No, 943 00:50:29,360 --> 00:50:31,440 Speaker 3: it'll make jobs better. That's how I view it. 944 00:50:31,560 --> 00:50:35,040 Speaker 2: Yeah. What's the number one thing that people misunderstand about AI? 945 00:50:35,200 --> 00:50:38,440 Speaker 3: That it'll eliminate jobs. I think that 946 00:50:38,480 --> 00:50:41,400 Speaker 3: would be the human kind of understanding part of it. 947 00:50:41,480 --> 00:50:44,799 Speaker 3: The technology part of it, I think, would be what 948 00:50:44,960 --> 00:50:49,160 Speaker 3: I was talking about, fit for purpose, meaning that it 949 00:50:49,200 --> 00:50:52,000 Speaker 3: isn't just going to be a GPU arms race, all 950 00:50:52,040 --> 00:50:54,600 Speaker 3: of AI. I don't believe that at all. It will 951 00:50:54,680 --> 00:50:57,000 Speaker 3: change everything, but it's not just going to be a GPU 952 00:50:56,800 --> 00:51:00,440 Speaker 2: arms race. Next question: what advice would you give yourself ten 953 00:51:00,480 --> 00:51:03,440 Speaker 2: years ago to better prepare you for today? Actually, I'm changing 954 00:51:03,440 --> 00:51:08,040 Speaker 2: this question. Okay, I want to say, let's imagine that... 955 00:51:09,400 --> 00:51:10,440 Speaker 2: what was your, what 956 00:51:10,400 --> 00:51:13,879 Speaker 3: college did I go to? I went to three of them. 957 00:51:14,080 --> 00:51:17,440 Speaker 3: My undergrad was Utah State University, my MBA was Santa 958 00:51:17,440 --> 00:51:20,600 Speaker 3: Clara University, and my master's was Stanford.
959 00:51:20,760 --> 00:51:24,000 Speaker 2: Okay. Any one of those three calls you up and says 960 00:51:24,360 --> 00:51:29,200 Speaker 2: we want you to give the commencement address. And imagine 961 00:51:29,239 --> 00:51:31,919 Speaker 2: that it's, let's just say, for the sake 962 00:51:31,920 --> 00:51:33,880 Speaker 2: of argument, it's just to the Stanford people. 963 00:51:34,320 --> 00:51:35,640 Speaker 3: Those are the relevant parties here. 964 00:51:36,640 --> 00:51:37,839 Speaker 2: What do you tell them? 965 00:51:38,239 --> 00:51:44,520 Speaker 3: Boy, what do I tell them? Let's see. I think 966 00:51:44,600 --> 00:51:49,320 Speaker 3: I would start with: life is a marathon, not a sprint. 967 00:51:49,719 --> 00:51:53,560 Speaker 3: That would be the first one. The second thing I 968 00:51:53,560 --> 00:51:56,840 Speaker 3: would say, in that spirit, is be sure to 969 00:51:56,960 --> 00:52:02,120 Speaker 3: set yourself some big, hairy, audacious goals, and don't be 970 00:52:02,320 --> 00:52:07,680 Speaker 3: overly disappointed if you don't hit them all. Going after 971 00:52:07,719 --> 00:52:10,400 Speaker 3: those big, hairy, audacious goals will get you on a 972 00:52:10,440 --> 00:52:14,640 Speaker 3: path where you will learn so much. You will achieve 973 00:52:14,719 --> 00:52:17,400 Speaker 3: more than you ever could imagine you would have achieved. 974 00:52:17,640 --> 00:52:19,680 Speaker 3: That's the advice I give to my kids: 975 00:52:20,080 --> 00:52:22,960 Speaker 3: set some big goals, get after it. You may or 976 00:52:22,960 --> 00:52:24,680 Speaker 3: may not achieve them, but you'll be better for the 977 00:52:24,680 --> 00:52:25,799 Speaker 3: whole process when you're done.
978 00:52:25,800 --> 00:52:28,200 Speaker 2: By the way, as someone whose kids are younger than yours, 979 00:52:29,000 --> 00:52:31,279 Speaker 2: is it actually useful to give advice to 980 00:52:31,320 --> 00:52:34,160 Speaker 2: your kids, or is it a pointless exercise? 981 00:52:34,239 --> 00:52:36,200 Speaker 3: TBD. We're still on the journey, and I think we will 982 00:52:36,200 --> 00:52:39,880 Speaker 3: be for a long time. I don't know how. 983 00:52:39,719 --> 00:52:41,719 Speaker 2: Are you already using AI in your day-to-day 984 00:52:41,719 --> 00:52:42,240 Speaker 2: life today? 985 00:52:44,280 --> 00:52:48,440 Speaker 3: Personally, I would say it's replacing a good chunk of 986 00:52:48,520 --> 00:52:51,520 Speaker 3: my search. You know, I'm less likely to go blindly 987 00:52:51,840 --> 00:52:54,720 Speaker 3: stumbling through a bunch of web pages looking for stuff. 988 00:52:55,080 --> 00:52:57,440 Speaker 3: I'm more likely to ask a question of a few 989 00:52:57,600 --> 00:53:00,400 Speaker 3: AI engines to kind of get me in the right direction, 990 00:53:00,480 --> 00:53:03,120 Speaker 3: then I'll go bumble through a few things. At work, 991 00:53:03,560 --> 00:53:08,920 Speaker 3: I can tell you, in code development right now, we are 992 00:53:08,960 --> 00:53:13,000 Speaker 3: seeing massive improvements in code development, and support products we 993 00:53:13,120 --> 00:53:17,920 Speaker 3: have, like watsonx Code Assistant, are really showing immediate 994 00:53:17,960 --> 00:53:21,160 Speaker 3: return for code developers, and I think that will 995 00:53:21,200 --> 00:53:25,120 Speaker 3: again be a tool that increases productivity for code developers 996 00:53:25,480 --> 00:53:28,040 Speaker 3: immediately across the globe. Yeah. 997 00:53:28,160 --> 00:53:31,520 Speaker 2: Last question: what's the one skill that every technology leader 998 00:53:31,680 --> 00:53:33,879 Speaker 2: needs that has nothing to do with technology?
999 00:53:34,840 --> 00:53:38,960 Speaker 3: Being able to inspire a set of people toward a 1000 00:53:39,000 --> 00:53:42,520 Speaker 3: common goal and collaborate to achieve it. That's at the 1001 00:53:42,520 --> 00:53:46,359 Speaker 3: core of everything. Everything. That's a lovely way to end. 1002 00:53:47,000 --> 00:53:48,319 Speaker 3: Thank you so much, Rick. Thank you. 1003 00:53:50,840 --> 00:53:55,160 Speaker 2: This conversation left me excited. I'm now imagining the potential 1004 00:53:55,200 --> 00:53:57,960 Speaker 2: for new use cases for AI in all sorts of 1005 00:53:58,040 --> 00:54:01,680 Speaker 2: different businesses. Rick didn't seem sold on my idea of 1006 00:54:01,680 --> 00:54:04,600 Speaker 2: a coffee shop makeover, but it's clear there are lots of 1007 00:54:04,640 --> 00:54:09,200 Speaker 2: opportunities here to increase speed and efficiency, to achieve your objectives, 1008 00:54:09,360 --> 00:54:12,879 Speaker 2: and to dream beyond the current applications for this technology. 1009 00:54:13,840 --> 00:54:16,360 Speaker 2: At the end of the day, the scaling of AI 1010 00:54:16,480 --> 00:54:19,840 Speaker 2: will rely on the right infrastructure to support it. With 1011 00:54:19,920 --> 00:54:23,040 Speaker 2: the right tools, you can solve problems that are unique 1012 00:54:23,080 --> 00:54:37,920 Speaker 2: to your industry and improve the experience for your customers. Smart 1013 00:54:37,920 --> 00:54:41,200 Speaker 2: Talks with IBM is produced by Matt Romano, Amy Gains 1014 00:54:41,280 --> 00:54:45,640 Speaker 2: McQuaid, and Jacob Goldstein. We're edited by Lydia Jean Kott. 1015 00:54:46,000 --> 00:54:51,600 Speaker 2: Mastering by Jake Koorsky. Theme song by Gramoscope. Special thanks 1016 00:54:51,600 --> 00:54:54,160 Speaker 2: to the eight Bar and IBM teams, as well as 1017 00:54:54,160 --> 00:54:58,080 Speaker 2: the Pushkin marketing team.
Smart Talks with IBM is a 1018 00:54:58,080 --> 00:55:04,440 Speaker 2: production of Pushkin Industries and Ruby Studio at iHeartMedia. To 1019 00:55:04,480 --> 00:55:09,880 Speaker 2: find more Pushkin podcasts, listen on the iHeartRadio app, Apple Podcasts, 1020 00:55:10,000 --> 00:55:19,359 Speaker 2: or wherever you listen to podcasts. I'm Malcolm Gladwell. This 1021 00:55:19,520 --> 00:55:23,080 Speaker 2: is a paid advertisement from IBM. The conversations on this 1022 00:55:23,120 --> 00:55:40,560 Speaker 2: podcast don't necessarily represent IBM's positions, strategies or opinions.