Malcolm Gladwell: Hello everyone. This is Smart Talks with IBM, a podcast from Pushkin Industries, iHeartMedia, and IBM about what it means to look at today's most challenging problems in a new way. I'm Malcolm Gladwell. Today I'm chatting with Rob Thomas, the senior vice president of IBM Cloud and Data, where his responsibility is bringing new ideas to life. But despite being on the cutting edge of these technologies, he still has an appreciation for age-old problems.

Rob Thomas: There's a rabbit and a beaver, and they're staring at the Hoover Dam, and the beaver says to the rabbit, "No, I didn't build it, but it's based on an idea of mine." And the point of that story is, ideas are a dime a dozen.

Malcolm Gladwell: So, great story. Everybody's got a bunch of ideas. By the way, we're too quick to dismiss the beaver. He's right. Have you seen beaver dams? I mean, he's right, it was his idea, but he had nothing to do with the giant cement Hoover Dam. My interview with Rob will touch on the importance of the cloud during the pandemic and how IBM has been playing a part in vaccine distribution. Stay tuned.
Malcolm Gladwell: I, for one, had no idea what it means to be the senior vice president of IBM Cloud and Data, so I asked Rob to break it down for me in layman's terms.

Rob Thomas: We build software, and software is the lingua franca of our time. Anything that will get done in businesses, and even interaction with consumers, is going to be done with software. It's really the language of everything that's happening in the world. That's what we build. We are focused on doing that for businesses.

Malcolm Gladwell: So how long have you been at IBM?

Rob Thomas: Twenty years. Twenty-one, I guess, to be precise. I started in consulting, and then I moved into our semiconductor business, where I was also doing consulting. And the moment that really changed my whole career was doing work with Nintendo, where we were designing the microprocessor for the Nintendo Wii. And I realized: we're going to do this one time, but then they're going to be building software that will get copied billions of times and used by people all over the world. Maybe I'm not in the right business.
And that really piqued my curiosity around software, which then led me to move into the IBM software business, where I've been for most of my career at this point. So I've been in software, in total, twelve or thirteen years.

Malcolm Gladwell: So, twelve years in software. Am I right in thinking that more has happened in those twelve years of software than happened in the entire history of software before that? Is that a fair statement?

Rob Thomas: Close. I'd say close. Certainly the rate and pace of innovation has increased. Now, has that actually turned into outcomes? Maybe that's a different question. But if you think about it, you know, software dates way back to even the first mainframe that we ever built in the fifties. So a lot of good things have been happening in software for a long time. But the rate and pace is at a level that we've never seen, and that's certainly what has accelerated in the last decade.

Malcolm Gladwell: I mean, I remember my dad was a mathematician at the University of Waterloo. I remember coming home as a kid, going into his office and seeing stacks of computer cards.
So in my lifetime, I have gone from looking at stacks of computer cards to something far more powerful. So, I mean, I am aware of just how fast this pace has gone, and it will be different a year from now. Right? That's how fast this is moving. Let's zero in on that a little bit. What's shifting right now? Imagine I'm a client and I come to you and I say, you know, I want to be prepared for next year and the year after next. What should be at the top of my mind?

Rob Thomas: Let me give you a quick story, if you don't mind. There was a time in the US when you could not easily get from one city to another. And at that point, back in the nineteen-fifties, there was a decision that said, let's actually build the infrastructure to connect every city in America. And the result was fifty-plus years of work and dollars, and we now have forty-eight thousand miles of highways that connect all these cities. But the real impact is more profound than that, because you're able to eliminate traffic at intersections by building overpasses.
There were all these second-order businesses that were built: hotels, gas stations, the salty snacks that you buy in a gas station, fast food, rest areas. So an entire economy was built around the idea that the first step was just to connect all the cities in the US. And that's what's happening right now with software. It is connecting businesses and individuals in a way that we've never been connected before, and we are just at the beginning of all the second-order effects that will come as a result of that. And the biggest problem in software is data. Just like you had all these disparate cities and you were building highways to connect those cities, every company has all these different data sets all over the place, and it's a really hard problem. AI is not going to be a reality for businesses until the data problem is solved. That's one thing that I spend a lot of time on right now.

Malcolm Gladwell: We dug deeper into the meaning of that phrase, "the data problem."
Rob Thomas: I think every individual wants any company they interact with, whether it's their local bank or restaurant, or the local cleaners, whatever it may be, they want that business to know them. It's the whole idea of when you had towns where there was just one general store, and the owner knew you; they knew what you wanted. I think everybody wants that level of engagement, and that is what software enables. And the basis of that is data. The biggest problem every business faces today is: how do I understand my data? What does it tell me about my customers? What does it tell me about my products? So this is fundamentally about how we live in a better way.

Malcolm Gladwell: You're saying: I'm a big company, and I have different sets of data, and they're all in different places, and they don't speak to each other, and I can't combine them and make sense of them. Is that what you mean by the data problem?

Rob Thomas: Correct. And even if I can combine them and connect them, the data is not in a usable form. You know, one data set says "M. Gladwell," the other one says "Malcolm G." Is that the same person? Maybe.
Maybe not. It's really hard, because these systems have been built up over time. We do work with a company called Wunderman Thompson. A story that they shared with me just this month was about doing work with Peloton. So Peloton collects a lot of data, what you'd call first-party data, from the bike or the tread. I think you're a runner, if I recall. And WPP's Wunderman Thompson has all this third-party data, which is: what do they know about consumers? So just to connect those two data sets, build predictive models, and then turn that into an advertising campaign: the AI part is actually relatively easy. It's connecting the data, rationalizing the data, cleaning the data. That's the really hard part that nobody talks about, because all we ever see is, you know, the outcome.

Malcolm Gladwell: Yeah. This is super interesting. So let's imagine you, Rob, are a Peloton user, and so we have a data stream that comes from the bike which says that you bike, let's just say (I'm gonna flatter you) an hour and a half a day at some insane pace.

Rob Thomas: Neither of which is true.
But keep going. I did do a half hour today, but it was a very slow pace, I gotta tell you.

Malcolm Gladwell: So I'm looking at you via whatever it is I'm collecting. I'm assuming Peloton collects a lot of, sort of, physiological data when you step up on the bike, and from that we can generate a rough sense of who you are, what your athletic interests are, how fit you are, all those kinds of things. And Wunderman's old shop wants to know: how can I use that picture of the kind of athlete you are to help bring you the kinds of ad messages that you'll respond to? Is that fair? Is that the problem?

Rob Thomas: It could be bringing it to me. But it's more likely, because obviously they anonymize all this data, a matter of: all right, how do we find somebody else that's like Rob? What are the attributes of that person? And then how do we relate to them in a way that makes it feel like we're talking to them, as opposed to talking to a cohort or a group?
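As an aside, the record-matching question Thomas raises ("M. Gladwell" versus "Malcolm G": same person or not?) is a classic entity-resolution task. A minimal sketch in Python, with a deliberately crude matching rule and invented names, might look like this:

```python
def tokens(name):
    """Split a name into lowercase tokens, dropping periods ("M." -> "m")."""
    return [t.strip(".").lower() for t in name.split()]

def token_match(a, b):
    """Two tokens agree if they are equal, or one is the other's initial."""
    return a == b or a == b[0] or b == a[0]

def likely_same_person(name_a, name_b):
    """Crude record linkage: every token must line up with a token in the
    other name, allowing initials, in forward or reversed order (to cover
    "Last First"-style records)."""
    ta, tb = tokens(name_a), tokens(name_b)
    if len(ta) != len(tb):
        return False
    return (all(token_match(x, y) for x, y in zip(ta, tb))
            or all(token_match(x, y) for x, y in zip(ta, reversed(tb))))

print(likely_same_person("M. Gladwell", "Malcolm G"))   # → True
print(likely_same_person("M. Gladwell", "Rob Thomas"))  # → False
```

Real entity resolution adds phonetic matching, address and date-of-birth evidence, and probabilistic scoring; the point of the toy is only that "not in a usable form" is a matching problem, not an AI problem.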
Rob Thomas: The number one prediction that most companies want is: what's going to happen to my sales next month, or the month after, or the month after that? And what we found is that tends to be a product of as many as fifty or a hundred different inputs. How many people are visiting the website? How many people are calling the call center? How many sales calls, if I have a face-to-face sales force, are they making? How many marketing campaigns am I running? You take all of these different data points, which is often fifty or a hundred, and you feed those into a model. Then the first month, you see how close the model was. Then you adjust. Second month, you see how close the model was. And these models get really good over time. We think we can help companies predict their financial performance in a month, in a quarter, in a year, based on all these different data sources, all these different inputs. That's pretty valuable to, let's say, every company.

Malcolm Gladwell: So what's IBM's role in that? You've described the problem to me. You guys come in and you say, "We'll do what?"
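The forecast-and-adjust loop Thomas describes (feed the inputs into a model, see how close it was when the month closes, refit) can be sketched with a tiny two-input least-squares model. Every figure below is invented for illustration; a real system would juggle the fifty to a hundred inputs he mentions:

```python
def fit_two_factor(rows):
    """Least squares for sales ~ w1*x1 + w2*x2, via the 2x2 normal equations.
    rows is a list of (input_1, input_2, actual_sales) tuples."""
    s11 = sum(x1 * x1 for x1, x2, y in rows)
    s12 = sum(x1 * x2 for x1, x2, y in rows)
    s22 = sum(x2 * x2 for x1, x2, y in rows)
    b1 = sum(x1 * y for x1, x2, y in rows)
    b2 = sum(x2 * y for x1, x2, y in rows)
    det = s11 * s22 - s12 * s12
    return (b1 * s22 - b2 * s12) / det, (b2 * s11 - b1 * s12) / det

# Invented history: (site visits, sales calls, sales in $k) per month.
history = [(12000, 80, 760.0), (13000, 90, 840.0), (11000, 70, 680.0)]

w_visits, w_calls = fit_two_factor(history)
forecast = w_visits * 12500 + w_calls * 85   # next month's planned inputs
print(round(forecast))                        # → 800

# "Then you adjust": when the month closes, append the actuals and refit.
history.append((12500, 85, 790.0))
w_visits, w_calls = fit_two_factor(history)
```

The refit at the end is the whole trick: each month of actuals pulls the weights toward reality, which is why these models "get really good over time."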
Rob Thomas: A couple of years ago, I was trying to think about what the right metaphor is so that I can educate our customers on this, and I built a concept that I call the AI Ladder. Think of it as steps that you take up a ladder towards AI. The bottom rung is: collect data. You have to be able to collect all your data. I'll use a library analogy: you have to get books. You have to get books into the library. That's collecting. Next, you have to organize that data. Back to the library analogy, that's the card catalog. Where are all the different data sets? I might have five copies of the same data. How do I know it's the same copy? Maybe one's checked out, maybe one's on microfilm. These are actually all problems that exist in businesses. So you've got to collect data, you've got to organize data. Then you have to analyze the data. That's where you're actually starting to do data science and machine learning. In the library metaphor, that's where you're displaying your bestseller list, or you're displaying, you know, popular magazine titles.
And then the top of the ladder is what I call infuse: how do you take those models and infuse them into a business process? So it's those four steps of the ladder. You have to collect, organize, analyze, and infuse. We build software that helps customers with each of the rungs of that ladder. It helps them do the collection; we actually build what we call a data catalog to help you organize your data. So we help them with all the rungs of that ladder. Because ultimately, once you've done those things, you can use AI and get really great outcomes. You've probably heard of IBM Watson; that is our AI platform.

Malcolm Gladwell: Imagine if someone from the White House came to you and said: we're about to do something we haven't done in this country for seventy years, which is try to vaccinate everybody in the shortest possible time.
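The four rungs Thomas lists (collect, organize, analyze, infuse) map naturally onto a data pipeline. Here is a toy sketch; the data, the "model" (a simple average), and the reorder rule are invented stand-ins for real systems:

```python
def collect(sources):
    """Rung 1: pull every record from every siloed source into one place."""
    return [rec for source in sources for rec in source]

def organize(records):
    """Rung 2: the card catalog, here just deduplicating copies of a record."""
    return sorted(set(records))

def analyze(records):
    """Rung 3: data science on the organized data (a trivial average here)."""
    return sum(records) / len(records)

def infuse(score, threshold=50):
    """Rung 4: push the model's output into a business process."""
    return "reorder stock" if score > threshold else "hold"

sources = [[40, 60, 60], [60, 80], [40]]   # three invented "data sets"
decision = infuse(analyze(organize(collect(sources))))
print(decision)                             # → reorder stock
```

The ordering matters: skipping the organize rung here would leave duplicate records skewing the average, which is the ladder's argument in miniature.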
Malcolm Gladwell: We have multiple, three and eventually probably four or five, different kinds of vaccines being administered by tens of thousands of local municipalities to people who have a wide-ranging set of risk factors, urgency, pre-existing conditions, on and on and on. Can you help us do this as efficiently and cost-effectively and socially consciously as possible? Is that the kind of task you're talking about?

Rob Thomas: That is in part as much a logistics problem as it is a data problem. But let me describe to you one of the data problems that exists around this, because we're doing work with CVS on the COVID vaccine rollout.

Malcolm Gladwell: Yes.

Rob Thomas: And so if you're CVS, where you're actually administering the vaccine, your biggest problem is that everybody has a question. CVS can't hire enough people to answer the ten questions you have, the ten questions I have, the twenty questions your cousin has.
They came to us and said: can we use AI to respond to all the inquiries we're getting, and actually help route people to where they can figure out where to get the vaccine and when they're eligible? So we built an AI agent for them that is now dealing with the vaccine rollout every day. That starts with data: they have a place where they store data about different questions. We've got models that we have trained on language, meaning we can understand different types of questions, what is really inferred versus implied versus what is stated. That's a real data problem. That's where we've spent the majority of our time on this current situation.

Malcolm Gladwell: So when you say it's a data problem, you mean that you started by looking at the data and using it to try and anticipate all the possible questions that someone might ask. Is that right?

Rob Thomas: Correct, yes. And then training a machine learning model based on those inputs, so that when a question was asked, we had a high probability of giving the right answer.
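A toy version of that question-routing idea: compare an incoming question to a store of known questions, answer if the match is strong enough, and hand off otherwise. The FAQ entries, the threshold, and the bag-of-words similarity here are invented simplifications; the production system Thomas describes uses trained language models instead:

```python
from collections import Counter
from math import sqrt

# Invented FAQ store, standing in for the question data CVS keeps.
FAQ = {
    "when am i eligible for the vaccine": "Eligibility depends on your state...",
    "where can i get the vaccine": "Find a participating pharmacy near you...",
    "what are the side effects": "Common side effects include a sore arm...",
}

def bow(text):
    """Bag-of-words representation: word counts, order ignored."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    return dot / (sqrt(sum(v * v for v in a.values()))
                  * sqrt(sum(v * v for v in b.values())))

def route(question, min_score=0.3):
    """Answer from the FAQ if a known question is close enough,
    otherwise hand off to a human agent."""
    score, best = max((cosine(bow(question), bow(q)), q) for q in FAQ)
    return FAQ[best] if score >= min_score else "HANDOFF_TO_HUMAN"

print(route("when can i get vaccinated"))   # nearest known question wins
print(route("how do i renew my passport"))  # → HANDOFF_TO_HUMAN
```

The handoff branch is the "superpowers" point made a little later: the easy, repetitive questions are absorbed automatically, and only the hard ones reach a person.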
Malcolm Gladwell: How long did it take you to build that system?

Rob Thomas: Now, this is the wonders of modern software, to your question on acceleration. We did this in forty-five days.

Malcolm Gladwell: Are you serious?

Rob Thomas: Yeah.

Malcolm Gladwell: That's insane. How many people worked on it?

Rob Thomas: Thirty, somewhere in that range. It's not a huge group.

Malcolm Gladwell: Wow.

Rob Thomas: The thing about systems like this is you hope it's really good on day one, but you know for sure it's going to be better on day ten. It's going to be better again on day twenty. These are learning systems. They do get better over time. And this is why I like to talk about AI as giving humans superpowers. Most people want to say it replaces humans; I actually think it gives them superpowers, because in these cases you start to move the harder problems to the humans, and therefore your customer satisfaction goes up, because people are getting their problems resolved. Would you ever get to a hundred percent? Probably not, because there's always going to be something that's too difficult for the AI to handle. But I think you can keep moving it up, for sure.
Malcolm Gladwell: Or maybe, given what you've just said, would it be more fair to say you don't want to get to a hundred? That you want to reserve a certain category of problem for human-to-human interaction, because that might be ultimately more satisfying to the questioner?

Rob Thomas: We have that discussion a lot, and certainly in the projects I've worked on, that's typically the case. Because let's not forget, these are businesses, and the goal of most businesses is to sell something. So sometimes the best way to sell something is to really help somebody with their problem, and then show them how your other product can make their life even easier.

Malcolm Gladwell: When you think back on the kinds of problems that your group has been asked to solve at IBM over the last couple of years, what was the hardest?

Rob Thomas: I don't know that I could name a single thing that's harder than the others. The ones that are the most time-consuming are things like regulatory compliance. If you're a bank, you've got a lot of different regulations that you have to live up to.
And it's easy to help build AI that can make loan decisions: yes or no, good idea, bad idea, and eliminate bias from that decision. That's very doable. Am I compliant with the regulations of where that individual is based, because they're in a zip code, or they're in a state, or they're in a country? Those problems get really difficult, because you're kind of connecting, you know, reams of legality to a day-to-day business process. Those get really difficult.

Malcolm Gladwell: Has any customer ever come to you with a problem where you guys said, "We can't solve that"?

Rob Thomas: We are way too curious to ever give up that easily. It's more a matter of, you know, the cheap, fast, and good triangle, if you've heard of that. You only get two of those. If you want it cheap and fast, it's probably not good. If you want it good and fast, it's probably not cheap. If you want it cheap and good, it's probably not fast. I think all of these situations come down to that triangle.
Malcolm Gladwell: So you have a group of people who work on these kinds of problems, and I'm curious: what do you look for when you're bringing someone onto that team? Is there a set of skills associated with this area, the application of AI to these very complicated data fields? A specific set of skills that are crucial and rare, hard to find?

Rob Thomas: The skill that's easy to test for is: do you have the technical ability? Do you understand Python? Do you understand machine learning? You can kind of see from somebody's body of work and what they've studied whether they have that skill. Where it gets harder is the empathy question. Can you actually understand a situation, understand a user, and empathize with what they're trying to do, such that you're not just building a model for a robot? You're actually building this for a human on some end. That one's harder to test for.
And then the third one is what I would just call curiosity. How widely read is somebody? Do they understand business problems? Those kinds of softer skills make a huge difference when you're solving these kinds of problems. So it's easy to test for the first. The other two are a little harder to test for. And the best data scientists in the world have all three of those.

Malcolm Gladwell: Let's talk about the cloud. I see this term "hybrid cloud." I don't know what it means. So can you start by telling me what it means, and then fit it into the conversation we've been having?

Rob Thomas: Any company that's been around for more than three years, maybe five, has somewhere that they keep their data and the different technology that they have, and in many cases that's in their office, or in a data center right near their office. They've also started, over time, to build new data sets or new software in a public cloud: something inside of IBM Cloud or Amazon Web Services or Microsoft Azure.
The minute that you have more than one environment, 340 00:21:41,960 --> 00:21:44,760 Speaker 1: you have a hybrid cloud, whether you know it 341 00:21:44,840 --> 00:21:48,280 Speaker 1: or not. So think of it as, I've got multiple 342 00:21:48,359 --> 00:21:50,520 Speaker 1: data sets in multiple places, to kind of go back to 343 00:21:50,560 --> 00:21:55,480 Speaker 1: the US highway example, or I've got software applications in 344 00:21:55,640 --> 00:21:59,879 Speaker 1: multiple places. You have to get that to act like 345 00:22:00,080 --> 00:22:05,560 Speaker 1: a single technology environment. That is the essence of hybrid cloud, 346 00:22:05,640 --> 00:22:08,160 Speaker 1: which is, I can manage that as a single environment. 347 00:22:08,560 --> 00:22:11,639 Speaker 1: The average company now has five different environments cloud-wise; 348 00:22:12,880 --> 00:22:15,400 Speaker 1: it acts like one. I can connect the data sets 349 00:22:15,480 --> 00:22:19,960 Speaker 1: the average company has. Is that by design, because 350 00:22:20,000 --> 00:22:23,119 Speaker 1: they feel it's safer? Or is that just because the 351 00:22:23,160 --> 00:22:26,879 Speaker 1: hodgepodge nature in which we grow our IT 352 00:22:26,960 --> 00:22:28,720 Speaker 1: means that we end up being all over the place? 353 00:22:31,000 --> 00:22:34,359 Speaker 1: It's because there's a lot of people that work in 354 00:22:34,720 --> 00:22:37,800 Speaker 1: every company, and everybody wants their own thing. That's how 355 00:22:37,800 --> 00:22:41,800 Speaker 1: it happens. So this department started in their own 356 00:22:41,840 --> 00:22:46,440 Speaker 1: data center, this department started on IBM Cloud, this department 357 00:22:46,520 --> 00:22:50,560 Speaker 1: wanted a CRM system from Salesforce, this department wanted to 358 00:22:50,680 --> 00:22:54,800 Speaker 1: use Azure. It's human nature.
People just go do what 359 00:22:54,880 --> 00:22:57,400 Speaker 1: they want to do. And you wake up one day 360 00:22:57,400 --> 00:22:59,560 Speaker 1: and you realize, hey, we've got a lot of different 361 00:22:59,560 --> 00:23:02,920 Speaker 1: cloud elements. And so if you're storing your customer data 362 00:23:03,000 --> 00:23:06,360 Speaker 1: with Salesforce and you've got these three other environments, how 363 00:23:06,359 --> 00:23:10,159 Speaker 1: do you get the customer data to inform 364 00:23:10,200 --> 00:23:12,080 Speaker 1: what you're doing in the other parts of your business? 365 00:23:12,200 --> 00:23:18,680 Speaker 1: That's a hybrid cloud problem. And how hard of 366 00:23:18,720 --> 00:23:21,960 Speaker 1: a problem is that? I mean, as a totally naive 367 00:23:22,000 --> 00:23:24,520 Speaker 1: outsider, I would have said, oh, surely all these 368 00:23:24,520 --> 00:23:26,760 Speaker 1: cloud businesses would have made it really easy to share 369 00:23:27,280 --> 00:23:30,160 Speaker 1: stuff in one place with stuff you've got in other places. 370 00:23:30,200 --> 00:23:33,920 Speaker 1: Is that not true? Unfortunately the opposite is true, because 371 00:23:34,440 --> 00:23:37,480 Speaker 1: for the pure-play public cloud providers, the incentive was 372 00:23:37,520 --> 00:23:40,600 Speaker 1: actually the opposite. It's Hotel California for them: you can 373 00:23:40,640 --> 00:23:42,679 Speaker 1: bring your stuff in, you 374 00:23:42,680 --> 00:23:44,600 Speaker 1: can check in, but they'll never let you check out, 375 00:23:45,520 --> 00:23:47,919 Speaker 1: and they actually charge enormous fees if you want to 376 00:23:48,359 --> 00:23:50,800 Speaker 1: get your data out. So it's a bit of a 377 00:23:50,840 --> 00:23:54,359 Speaker 1: strategy tax for them to make it easy.
It's also 378 00:23:54,400 --> 00:23:58,480 Speaker 1: a hard problem just because you're trying to connect different 379 00:23:58,560 --> 00:24:02,159 Speaker 1: data sets. Do you have a card catalog that connects 380 00:24:02,200 --> 00:24:05,520 Speaker 1: all these different sources? It's actually not easy to do. 381 00:24:06,200 --> 00:24:08,159 Speaker 1: And what happens when you don't do that? Then you 382 00:24:08,240 --> 00:24:10,879 Speaker 1: end up rebuilding everything, and so suddenly you're storing all 383 00:24:10,920 --> 00:24:14,679 Speaker 1: the same data five times. That gets very expensive. So 384 00:24:14,960 --> 00:24:21,320 Speaker 1: let's imagine we're having this conversation five years from now. Give me your sense 385 00:24:21,359 --> 00:24:23,639 Speaker 1: of where we'll be. What would we be 386 00:24:23,720 --> 00:24:27,840 Speaker 1: talking about? We'll probably be having 387 00:24:28,119 --> 00:24:31,680 Speaker 1: very similar discussions. The technology will be more advanced, 388 00:24:31,720 --> 00:24:34,480 Speaker 1: but a lot of the topics, let's be honest, 389 00:24:34,520 --> 00:24:38,359 Speaker 1: have been around for quite a while. There's 390 00:24:38,400 --> 00:24:41,480 Speaker 1: a story about this guy, Charles Townes. He was the 391 00:24:41,520 --> 00:24:44,800 Speaker 1: inventor of the laser, and he tells this story. There's 392 00:24:44,800 --> 00:24:47,040 Speaker 1: a rabbit and a beaver and they're staring at the 393 00:24:47,080 --> 00:24:51,159 Speaker 1: Hoover Dam, and the beaver says to the rabbit, "No, I 394 00:24:51,160 --> 00:24:53,400 Speaker 1: didn't build it, but it's based on an idea of mine." 395 00:24:55,080 --> 00:24:57,679 Speaker 1: And the point of that story is, ideas are 396 00:24:57,720 --> 00:25:03,280 Speaker 1: a dime a dozen. So, great story. Everybody's got a 397 00:25:03,280 --> 00:25:06,399 Speaker 1: bunch of ideas.
By the way, we're too quick to 398 00:25:06,440 --> 00:25:10,280 Speaker 1: dismiss the beaver. He's right, but have you seen beaver 399 00:25:10,359 --> 00:25:13,080 Speaker 1: dams? I mean, he is right. It was 400 00:25:13,160 --> 00:25:15,520 Speaker 1: his idea, but he had nothing to do with the 401 00:25:15,560 --> 00:25:22,040 Speaker 1: giant cement Hoover Dam. Yeah. The reason I'd share 402 00:25:22,080 --> 00:25:25,360 Speaker 1: that story is a lot of people have ideas now 403 00:25:25,440 --> 00:25:28,320 Speaker 1: about what they can do. But what's going to 404 00:25:28,400 --> 00:25:30,280 Speaker 1: make a difference five years from now is what do 405 00:25:30,359 --> 00:25:36,120 Speaker 1: you go try and do? And I encourage companies that 406 00:25:36,200 --> 00:25:40,199 Speaker 1: you've got to be willing to have a pretty high failure rate, 407 00:25:40,320 --> 00:25:42,119 Speaker 1: knowing that if you go try a bunch of things, 408 00:25:43,280 --> 00:25:45,360 Speaker 1: you know, maybe only half of them will work out. 409 00:25:46,200 --> 00:25:48,440 Speaker 1: I mean, if I look at AI today, there's 410 00:25:48,520 --> 00:25:52,160 Speaker 1: five major things I see companies doing generally successfully. It's 411 00:25:52,160 --> 00:25:57,480 Speaker 1: customer service. We talked about that. It's financial budgeting. It's 412 00:25:57,600 --> 00:26:00,480 Speaker 1: regulatory compliance. We talked about that. That one's harder. 413 00:26:01,240 --> 00:26:06,120 Speaker 1: It's employee experience, hiring, that type of thing. And it's 414 00:26:06,240 --> 00:26:09,399 Speaker 1: using AI to run their IT systems. So using 415 00:26:09,440 --> 00:26:12,440 Speaker 1: software to run the systems. Those are the five big 416 00:26:12,440 --> 00:26:17,240 Speaker 1: things today.
I actually think those five things will still 417 00:26:17,280 --> 00:26:20,919 Speaker 1: be the topics, but we'll be a lot more 418 00:26:20,960 --> 00:26:23,680 Speaker 1: advanced on each of those, because today it's a little 419 00:26:23,720 --> 00:26:26,280 Speaker 1: bit like we're doing it for the first time, whereas we'll 420 00:26:26,280 --> 00:26:29,520 Speaker 1: be much more advanced as we go on. I 421 00:26:29,520 --> 00:26:32,480 Speaker 1: do think quantum computing will be commercialized at that point. 422 00:26:32,920 --> 00:26:36,760 Speaker 1: That's pretty revolutionary. So more to come on that one. 423 00:26:37,040 --> 00:26:41,080 Speaker 1: Let's end on some more case studies. Tell me a 424 00:26:41,119 --> 00:26:44,800 Speaker 1: couple of examples of people you've worked with where the 425 00:26:44,800 --> 00:26:49,200 Speaker 1: outcome was really exciting or unexpected. 426 00:26:53,800 --> 00:26:57,360 Speaker 1: We've worked with Sprint, T-Mobile. They have this classic 427 00:26:57,440 --> 00:27:02,240 Speaker 1: problem of they've got to do aftermarket service for all 428 00:27:02,280 --> 00:27:07,200 Speaker 1: the different telecom equipment that they sell, and the data 429 00:27:07,240 --> 00:27:12,200 Speaker 1: that they have on those different systems, the warranty, when 430 00:27:12,240 --> 00:27:15,320 Speaker 1: they were built, how they're running, is spread across a 431 00:27:15,359 --> 00:27:21,199 Speaker 1: thousand different data sources. We were able to build 432 00:27:21,840 --> 00:27:24,919 Speaker 1: an AI system for them that sits across those systems, 433 00:27:24,960 --> 00:27:28,600 Speaker 1: that was able to intelligently route how they do all 434 00:27:28,640 --> 00:27:33,399 Speaker 1: of their aftermarket service.
So do you and I feel 435 00:27:33,440 --> 00:27:36,879 Speaker 1: that in our day-to-day life? Well, we 436 00:27:36,960 --> 00:27:39,520 Speaker 1: feel it if they don't fix things, because then it's obvious: 437 00:27:39,560 --> 00:27:41,919 Speaker 1: there's an outage or something that doesn't work. But 438 00:27:42,000 --> 00:27:44,560 Speaker 1: they had so much data on this, 439 00:27:44,640 --> 00:27:47,600 Speaker 1: they could have never done this by, I'd say, a 440 00:27:47,680 --> 00:27:51,479 Speaker 1: typical approach. So these are the kinds of things that 441 00:27:51,520 --> 00:27:55,639 Speaker 1: the average consumer doesn't see every day, but they do 442 00:27:55,800 --> 00:27:59,760 Speaker 1: make a difference in our life. And you're talking about 443 00:27:59,760 --> 00:28:03,160 Speaker 1: things like what, like cell towers? Yeah, it could 444 00:28:03,160 --> 00:28:05,199 Speaker 1: be cell towers, or it could be, you know, 445 00:28:05,240 --> 00:28:08,040 Speaker 1: not the power cable, the power boxes sitting 446 00:28:08,040 --> 00:28:09,760 Speaker 1: next to the cell tower. It could be any of 447 00:28:09,760 --> 00:28:12,240 Speaker 1: those things. Oh, I see. So they have 448 00:28:12,320 --> 00:28:14,679 Speaker 1: all of these systems that might have been bought at 449 00:28:14,720 --> 00:28:18,639 Speaker 1: different times, made by different people, installed by different people. 450 00:28:19,280 --> 00:28:20,960 Speaker 1: And so what you want to do is to give 451 00:28:21,000 --> 00:28:22,760 Speaker 1: them a system that allows them to look at them 452 00:28:22,760 --> 00:28:25,760 Speaker 1: all in real time and figure out where there might 453 00:28:25,760 --> 00:28:30,680 Speaker 1: be an issue. Yes, we call it predictive maintenance.
Right. Okay, 454 00:28:30,880 --> 00:28:32,600 Speaker 1: all the signs are that there's about to be a 455 00:28:32,600 --> 00:28:35,000 Speaker 1: problem on this one, they go out there, they check 456 00:28:35,000 --> 00:28:37,600 Speaker 1: it out. Yep, lo and behold, there is a problem. 457 00:28:37,640 --> 00:28:40,240 Speaker 1: I have two cars. Can you build one for me? 458 00:28:40,400 --> 00:28:45,000 Speaker 1: So we haven't scaled down to that level quite yet, 459 00:28:45,000 --> 00:28:48,280 Speaker 1: but stay tuned. We're open to it. Why not? Why 460 00:28:48,320 --> 00:28:52,000 Speaker 1: are you neglecting me? I'm the ultimate end user. I'm 461 00:28:52,040 --> 00:28:55,920 Speaker 1: one guy with two cars. That's a good question. Well, 462 00:28:56,040 --> 00:28:59,479 Speaker 1: we'll bring it to you soon. Have 463 00:29:00,520 --> 00:29:05,719 Speaker 1: any professional sports teams worked with you guys? The Toronto 464 00:29:05,800 --> 00:29:09,200 Speaker 1: Raptors. There was publicity on that a few years ago. 465 00:29:09,280 --> 00:29:12,320 Speaker 1: I'm Canadian. That's my team. You're 466 00:29:12,320 --> 00:29:15,400 Speaker 1: warming my heart right now. Of course. Well, this has 467 00:29:15,440 --> 00:29:19,720 Speaker 1: been really fun. Thank you so much. Yeah, you're welcome, 468 00:29:19,760 --> 00:29:26,040 Speaker 1: appreciate it. I'd love to help out the Toronto Raptors 469 00:29:26,040 --> 00:29:29,440 Speaker 1: if I had the chance. Thanks again to Rob Thomas 470 00:29:29,960 --> 00:29:37,800 Speaker 1: for an intriguing conversation about data and the cloud. Smart 471 00:29:37,840 --> 00:29:42,600 Speaker 1: Talks with IBM is produced by Emily Rostek with Carly Migliori, 472 00:29:43,360 --> 00:29:48,520 Speaker 1: edited by Karen Shakerdge, engineering by Martin Gonzalez, mixed and 473 00:29:48,560 --> 00:29:54,600 Speaker 1: mastered by Jason Gambrell. Music by Gramoscope.
Special thanks to 474 00:29:54,920 --> 00:29:58,920 Speaker 1: Molly Socha, Andy Kelly, Mia Lobel, Jacob Weisberg, Heather 475 00:29:59,040 --> 00:30:03,280 Speaker 1: Fain, Eric Sandler, Maggie Taylor, and the teams at Eight 476 00:30:03,320 --> 00:30:07,800 Speaker 1: Bar and IBM. Smart Talks with IBM is a production 477 00:30:07,800 --> 00:30:11,840 Speaker 1: of Pushkin Industries and iHeartMedia. You can find 478 00:30:11,920 --> 00:30:15,800 Speaker 1: more Pushkin podcasts on the iHeartRadio app, Apple 479 00:30:15,880 --> 00:30:21,000 Speaker 1: Podcasts, or wherever you like to listen. I'm Malcolm Gladwell. 480 00:30:21,000 --> 00:30:21,840 Speaker 1: See you next time.