1 00:00:04,480 --> 00:00:10,440 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. 2 00:00:12,200 --> 00:00:13,840 Speaker 2: Today, we are witness 3 00:00:13,360 --> 00:00:16,600 Speaker 1: to one of those rare moments in history, the rise 4 00:00:16,720 --> 00:00:20,480 Speaker 1: of an innovative technology with the potential to radically transform 5 00:00:20,560 --> 00:00:26,280 Speaker 1: business and society forever. That technology, of course, is artificial intelligence, 6 00:00:26,480 --> 00:00:29,240 Speaker 1: and it's the central focus for this new season of 7 00:00:29,280 --> 00:00:33,360 Speaker 1: Smart Talks with IBM. Join hosts from your favorite Pushkin 8 00:00:33,440 --> 00:00:37,000 Speaker 1: podcasts as they talk with industry experts and leaders to 9 00:00:37,080 --> 00:00:40,839 Speaker 1: explore how businesses can integrate AI into their workflows and 10 00:00:41,000 --> 00:00:44,239 Speaker 1: help drive real change in this new era of AI. 11 00:00:44,680 --> 00:00:47,320 Speaker 1: And of course, host Malcolm Gladwell will be there to 12 00:00:47,360 --> 00:00:49,960 Speaker 1: guide you through the season and throw in his two 13 00:00:49,960 --> 00:00:53,040 Speaker 1: cents as well. Look out for new episodes of Smart 14 00:00:53,040 --> 00:00:56,240 Speaker 1: Talks with IBM every other week on the iHeartRadio app, 15 00:00:56,440 --> 00:00:59,920 Speaker 1: Apple Podcasts, or wherever you get your podcasts, and learn 16 00:01:00,000 --> 00:01:03,800 Speaker 1: more at IBM dot com slash Smart Talks. 17 00:01:05,280 --> 00:01:09,280 Speaker 3: Hey, it's Jacob Goldstein for Smart Talks with IBM. Last year, 18 00:01:09,360 --> 00:01:12,399 Speaker 3: I had the pleasure of sitting down with doctor David Cox, 19 00:01:12,720 --> 00:01:17,120 Speaker 3: VP of AI Models at IBM Research. We explored the 20 00:01:17,160 --> 00:01:21,560 Speaker 3: fascinating world of AI foundation models and their revolutionary potential 21 00:01:21,640 --> 00:01:26,199 Speaker 3: for business automation and innovation. When we first aired this episode, 22 00:01:26,319 --> 00:01:29,440 Speaker 3: the concept of foundation models was just beginning to capture 23 00:01:29,480 --> 00:01:34,000 Speaker 3: our attention. Since then, this technology has evolved and redefined 24 00:01:34,000 --> 00:01:37,640 Speaker 3: the boundaries of what's possible. Businesses are becoming more savvy 25 00:01:37,680 --> 00:01:40,560 Speaker 3: about selecting the right models and understanding how they can 26 00:01:40,640 --> 00:01:44,640 Speaker 3: drive revenue and efficiency. As I listened back to the conversation, 27 00:01:44,880 --> 00:01:47,600 Speaker 3: it was interesting to reflect on some new developments and 28 00:01:47,680 --> 00:01:51,200 Speaker 3: ideas that have emerged, and many of these we will 29 00:01:51,240 --> 00:01:54,640 Speaker 3: continue to explore throughout the season, like how to play 30 00:01:54,680 --> 00:01:57,480 Speaker 3: an active role in choosing the best model for your needs. 31 00:01:58,240 --> 00:02:00,480 Speaker 3: Whether you're a longtime listener or tuning in for the 32 00:02:00,520 --> 00:02:03,760 Speaker 3: first time, I'm certain you'll find doctor Cox's insights as 33 00:02:03,800 --> 00:02:07,120 Speaker 3: thought provoking as ever. Thanks as always for joining us. 34 00:02:07,480 --> 00:02:08,880 Speaker 3: Now let's dive in.
35 00:02:10,600 --> 00:02:14,360 Speaker 4: Hello, Hello, Welcome to Smart Talks with IBM, a podcast 36 00:02:14,400 --> 00:02:19,880 Speaker 4: from Pushkin Industries, iHeartRadio and IBM. I'm Malcolm Gladwell. Our 37 00:02:19,919 --> 00:02:24,560 Speaker 4: guest today is doctor David Cox, VP of AI Models 38 00:02:24,600 --> 00:02:29,639 Speaker 4: at IBM Research and IBM Director of the MIT IBM 39 00:02:29,680 --> 00:02:33,960 Speaker 4: Watson AI Lab, a first of its kind industry academic 40 00:02:34,080 --> 00:02:39,120 Speaker 4: collaboration between IBM and MIT focused on the fundamental research 41 00:02:39,560 --> 00:02:44,280 Speaker 4: of artificial intelligence. Over the course of decades, David Cox 42 00:02:44,400 --> 00:02:49,040 Speaker 4: watched as the AI revolution steadily grew from the simmering 43 00:02:49,040 --> 00:02:52,919 Speaker 4: ideas of a few academics and technologists into the industrial 44 00:02:53,000 --> 00:02:57,280 Speaker 4: boom we are experiencing today. Having dedicated his life to 45 00:02:57,360 --> 00:03:00,800 Speaker 4: pushing the field of AI towards new horizons, David has 46 00:03:00,840 --> 00:03:04,760 Speaker 4: both contributed to and presided over many of the major 47 00:03:04,880 --> 00:03:10,440 Speaker 4: breakthroughs in artificial intelligence. In today's episode, you'll hear David 48 00:03:10,480 --> 00:03:15,600 Speaker 4: explain some of the conceptual underpinnings of the current AI landscape, 49 00:03:15,720 --> 00:03:20,680 Speaker 4: things like foundation models, in surprisingly comprehensible terms, I might add. 50 00:03:20,919 --> 00:03:24,080 Speaker 4: We'll also get into some of the amazing practical applications 51 00:03:24,080 --> 00:03:27,120 Speaker 4: for AI in business, as well as what implications AI 52 00:03:27,280 --> 00:03:30,799 Speaker 4: will have for the future of work and design. David 53 00:03:30,800 --> 00:03:34,520 Speaker 4: spoke with Jacob Goldstein, host of the Pushkin podcast What's 54 00:03:34,560 --> 00:03:38,880 Speaker 4: Your Problem. A veteran business journalist, Jacob has reported for 55 00:03:38,920 --> 00:03:41,680 Speaker 4: The Wall Street Journal, the Miami Herald, and was a 56 00:03:41,720 --> 00:03:47,360 Speaker 4: longtime host of the NPR program Planet Money. Okay, let's 57 00:03:47,400 --> 00:03:48,280 Speaker 4: get to the interview. 58 00:03:50,600 --> 00:03:53,800 Speaker 5: Tell me about your job at IBM. So, I wear 59 00:03:53,920 --> 00:03:57,080 Speaker 5: two hats at IBM. So, one, I'm the IBM Director 60 00:03:57,120 --> 00:04:00,320 Speaker 5: of the MIT IBM Watson AI Lab. That's a 61 00:04:00,480 --> 00:04:03,640 Speaker 5: joint lab between IBM and MIT where we try and 62 00:04:03,680 --> 00:04:06,040 Speaker 5: invent what's next in AI. It's been running for about 63 00:04:06,040 --> 00:04:09,120 Speaker 5: five years, and then more recently I started as the 64 00:04:09,160 --> 00:04:12,040 Speaker 5: vice president for AI Models, and I'm in charge of 65 00:04:12,080 --> 00:04:16,960 Speaker 5: building IBM's foundation models, you know, building these big models, 66 00:04:16,960 --> 00:04:18,800 Speaker 5: generative models that allow us to have all kinds of 67 00:04:18,880 --> 00:04:20,440 Speaker 5: new exciting capabilities in AI. 68 00:04:21,000 --> 00:04:22,880 Speaker 3: So, I want to talk to you a lot 69 00:04:22,960 --> 00:04:26,400 Speaker 3: about foundation models, about generative AI.
But before we get 70 00:04:26,400 --> 00:04:28,520 Speaker 3: to that, let's just spend a minute on 71 00:04:28,600 --> 00:04:31,240 Speaker 3: the IBM MIT collaboration. 72 00:04:32,240 --> 00:04:35,039 Speaker 2: Where did that partnership start? How did it originate? 73 00:04:36,200 --> 00:04:39,039 Speaker 5: Yeah, so, actually it turns out that MIT and IBM 74 00:04:39,240 --> 00:04:42,320 Speaker 5: have been collaborating for a very long time in the 75 00:04:42,360 --> 00:04:46,440 Speaker 5: area of AI. In fact, the term artificial intelligence was 76 00:04:46,480 --> 00:04:50,160 Speaker 5: coined in a nineteen fifty six workshop that was held 77 00:04:50,200 --> 00:04:52,360 Speaker 5: at Dartmouth. It was actually organized by an IBMer, 78 00:04:52,400 --> 00:04:55,719 Speaker 5: Nathaniel Rochester, who led the development of the IBM seven 79 00:04:55,720 --> 00:04:59,000 Speaker 5: oh one. So we've really been together in AI since 80 00:04:59,000 --> 00:05:03,839 Speaker 5: the beginning, and as AI kept accelerating more and more 81 00:05:03,880 --> 00:05:07,480 Speaker 5: and more, I think there was a really interesting decision 82 00:05:07,480 --> 00:05:10,239 Speaker 5: to say let's make this a formal partnership. So IBM 83 00:05:10,279 --> 00:05:12,080 Speaker 5: in twenty seventeen announced it'd be committing close 84 00:05:12,120 --> 00:05:14,960 Speaker 5: to a quarter billion dollars over ten years to have 85 00:05:15,040 --> 00:05:18,880 Speaker 5: this joint lab with MIT, and we located ourselves 86 00:05:18,960 --> 00:05:21,200 Speaker 5: right on the campus and we've been developing very very 87 00:05:21,200 --> 00:05:23,520 Speaker 5: deep relationships where we can, you know, really get to 88 00:05:23,520 --> 00:05:26,640 Speaker 5: know each other, work shoulder to shoulder, conceiving what we 89 00:05:26,640 --> 00:05:29,279 Speaker 5: should work on next, and then executing the projects. And 90 00:05:29,320 --> 00:05:33,320 Speaker 5: it's really, you know, very few entities like this exist 91 00:05:33,800 --> 00:05:36,800 Speaker 5: between academia and industry. It's been really fun over the last 92 00:05:36,800 --> 00:05:38,080 Speaker 5: five years to be a part of it. 93 00:05:38,720 --> 00:05:40,160 Speaker 3: And what do you think are some of the most 94 00:05:40,200 --> 00:05:43,760 Speaker 3: important outcomes of this collaboration between IBM and MIT? 95 00:05:45,160 --> 00:05:47,800 Speaker 5: Yeah, so we're really kind of the tip of the 96 00:05:47,839 --> 00:05:52,799 Speaker 5: spear for IBM's AI strategy. So we're really looking at, 97 00:05:53,000 --> 00:05:55,800 Speaker 5: you know, what's coming ahead, and you know, in areas 98 00:05:55,839 --> 00:05:59,760 Speaker 5: like foundation models, you know, as the field changes, MIT, 99 00:06:00,120 --> 00:06:02,480 Speaker 5: you know, faculty, students and 100 00:06:02,520 --> 00:06:04,520 Speaker 5: staff, are interested in working on what's the latest thing, 101 00:06:04,600 --> 00:06:07,680 Speaker 5: what's the next thing. We at IBM Research are very 102 00:06:07,720 --> 00:06:09,800 Speaker 5: much interested in the same. So we can kind of 103 00:06:09,800 --> 00:06:12,719 Speaker 5: put out feelers, you know, interesting things that we're seeing 104 00:06:12,760 --> 00:06:15,559 Speaker 5: in our research, interesting things we're hearing in the field. 105 00:06:15,560 --> 00:06:17,960 Speaker 5: We can go and chase those opportunities.
So when something 106 00:06:18,000 --> 00:06:21,000 Speaker 5: big comes, like the big change that's been happening lately 107 00:06:21,040 --> 00:06:23,599 Speaker 5: with foundation models, we're ready to jump on it. That's 108 00:06:23,600 --> 00:06:26,600 Speaker 5: really the purpose, that's the lab functioning the way 109 00:06:26,600 --> 00:06:29,800 Speaker 5: it should. We're also really interested in how do we 110 00:06:29,839 --> 00:06:32,440 Speaker 5: advance, you know, the AI that can help with climate 111 00:06:32,560 --> 00:06:35,840 Speaker 5: change or, you know, build better materials and all these 112 00:06:35,920 --> 00:06:38,200 Speaker 5: kinds of things that are, you know, a broader aperture 113 00:06:38,279 --> 00:06:40,960 Speaker 5: sometimes than what we might consider just looking at 114 00:06:40,960 --> 00:06:43,719 Speaker 5: the product portfolio of IBM, and that gives us 115 00:06:43,760 --> 00:06:45,920 Speaker 5: again a breadth where we can see connections that we 116 00:06:46,000 --> 00:06:48,880 Speaker 5: might not have seen otherwise. We can, you know, think of 117 00:06:48,880 --> 00:06:51,839 Speaker 5: things that help out society and also help out our customers. 118 00:06:52,600 --> 00:06:57,080 Speaker 3: So the last whatever six months, say, there has been 119 00:06:57,120 --> 00:07:02,800 Speaker 3: this wild rise in the public's interest in AI, right, 120 00:07:02,839 --> 00:07:06,280 Speaker 3: clearly coming out of these generative AI models that are 121 00:07:06,320 --> 00:07:10,400 Speaker 3: really accessible, you know, certainly ChatGPT, language models like that, 122 00:07:10,480 --> 00:07:13,520 Speaker 3: as well as models that generate images like Midjourney. 123 00:07:14,160 --> 00:07:17,640 Speaker 3: I mean, can you just sort of briefly talk about 124 00:07:17,040 --> 00:07:21,080 Speaker 3: the breakthroughs in AI that have made this moment feel 125 00:07:21,160 --> 00:07:24,640 Speaker 3: so exciting, so revolutionary for artificial intelligence? 126 00:07:25,680 --> 00:07:30,440 Speaker 5: Yeah, you know, I've been studying AI basically my entire 127 00:07:30,480 --> 00:07:32,600 Speaker 5: adult life. Before I came to IBM, I was a 128 00:07:32,600 --> 00:07:35,160 Speaker 5: professor at Harvard. I've been doing this a long time, 129 00:07:35,400 --> 00:07:37,680 Speaker 5: and I've gotten used to being surprised. It sounds like 130 00:07:37,680 --> 00:07:40,960 Speaker 5: a joke, but it's serious, like getting used to being 131 00:07:41,000 --> 00:07:43,559 Speaker 5: surprised at the acceleration of the pace. 132 00:07:44,360 --> 00:07:44,600 Speaker 2: Again. 133 00:07:44,640 --> 00:07:47,400 Speaker 5: It tracks actually a long way back. You know, there's 134 00:07:47,480 --> 00:07:49,720 Speaker 5: lots of things where there was an idea that just 135 00:07:49,840 --> 00:07:53,800 Speaker 5: simmered for a really long time. Some of the key 136 00:07:54,000 --> 00:07:58,400 Speaker 5: math behind the stuff that we have today, which is amazing. 137 00:07:59,120 --> 00:08:01,960 Speaker 5: There's an algorithm called backpropagation, which is sort of 138 00:08:02,080 --> 00:08:04,720 Speaker 5: key to training neural networks, that's been around, you know, 139 00:08:04,760 --> 00:08:08,679 Speaker 5: since the eighties, in wide use.
And really what happened 140 00:08:08,800 --> 00:08:12,480 Speaker 5: was it simmered for a long time and then enough 141 00:08:12,640 --> 00:08:16,400 Speaker 5: data and enough compute came. So we had enough data 142 00:08:16,440 --> 00:08:20,320 Speaker 5: because, you know, we all started carrying multiple cameras around 143 00:08:20,360 --> 00:08:22,760 Speaker 5: with us. Our mobile phones have, you know, all 144 00:08:22,800 --> 00:08:25,600 Speaker 5: these cameras, and we put everything on the Internet 145 00:08:25,680 --> 00:08:27,920 Speaker 5: and there's all this data out there. We caught a 146 00:08:27,960 --> 00:08:30,480 Speaker 5: lucky break that there was something called a graphics processing unit, 147 00:08:30,520 --> 00:08:32,800 Speaker 5: which turns out to be really useful for doing these 148 00:08:32,880 --> 00:08:35,480 Speaker 5: kinds of algorithms, maybe even more useful than it is 149 00:08:35,559 --> 00:08:39,520 Speaker 5: for doing graphics. They're great for graphics too. And things just 150 00:08:39,600 --> 00:08:42,240 Speaker 5: kept kind of adding to the snowball. So we had 151 00:08:42,360 --> 00:08:46,600 Speaker 5: deep learning, which is sort of a rebrand of neural 152 00:08:46,600 --> 00:08:49,079 Speaker 5: networks that I mentioned from the eighties, and that was 153 00:08:49,200 --> 00:08:52,360 Speaker 5: enabled again by data, because we digitized the world, and 154 00:08:52,800 --> 00:08:55,480 Speaker 5: compute, because we kept building faster and faster and 155 00:08:55,520 --> 00:08:58,679 Speaker 5: more powerful computers, and then that allowed us to make 156 00:08:58,720 --> 00:09:01,840 Speaker 5: this big breakthrough. And then, you know, more recently, 157 00:09:02,320 --> 00:09:06,600 Speaker 5: using the same building blocks, we had that inexorable rise of more 158 00:09:06,640 --> 00:09:10,280 Speaker 5: and more and more data and a technology called self 159 00:09:10,320 --> 00:09:16,240 Speaker 5: supervised learning. Where the key difference is, in traditional deep learning, 160 00:09:16,320 --> 00:09:18,800 Speaker 5: you know, for classifying images, you know, like is this 161 00:09:18,880 --> 00:09:20,640 Speaker 5: a cat or is this a dog in a picture, 162 00:09:21,080 --> 00:09:26,040 Speaker 5: those technologies require supervision, so you have to take what 163 00:09:26,160 --> 00:09:27,560 Speaker 5: you have and then you have to label it. So 164 00:09:27,600 --> 00:09:28,920 Speaker 5: you have to take a picture of a cat and 165 00:09:28,920 --> 00:09:31,480 Speaker 5: then you label it as a cat, and it turns 166 00:09:31,520 --> 00:09:33,920 Speaker 5: out that, you know, that's very powerful, but it takes 167 00:09:33,920 --> 00:09:37,000 Speaker 5: a lot of time to label cats and to label dogs, 168 00:09:37,000 --> 00:09:39,400 Speaker 5: and there's only so many labels that exist in the world. 169 00:09:39,840 --> 00:09:43,400 Speaker 5: So what really changed more recently is that we have 170 00:09:43,480 --> 00:09:45,960 Speaker 5: self supervised learning where you don't have to have the labels. 171 00:09:45,960 --> 00:09:48,480 Speaker 5: We can just take unannotated data. And what that does 172 00:09:48,559 --> 00:09:51,640 Speaker 5: is allows you to use even more data. And that's 173 00:09:51,640 --> 00:09:55,240 Speaker 5: really what drove this latest sort of rage.
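To make the distinction David is drawing a bit more concrete, here is a minimal illustrative sketch in Python. It is not IBM's training code; the tiny masking function is just one common way self-supervised learning manufactures its own labels from unannotated text, whereas the supervised examples each need a human-written label.

# Minimal sketch (illustrative only): where the "labels" come from in
# supervised versus self-supervised learning.

# Supervised: every example needs a human-provided label (e.g., cat vs. dog).
supervised_examples = [
    ("photo_001.jpg", "cat"),
    ("photo_002.jpg", "dog"),
]

# Self-supervised: the label is manufactured from the data itself, e.g., hide a
# word and ask the model to predict it from context, so any unannotated text works.
def mask_one_word(sentence: str, index: int):
    words = sentence.split()
    target = words[index]          # the "label" is just the hidden word
    words[index] = "[MASK]"
    return " ".join(words), target

text = "foundation models learn structure from large amounts of unlabeled text"
masked, label = mask_one_word(text, 3)
print(masked)  # foundation models learn [MASK] from large amounts of unlabeled text
print(label)   # structure

Because the masked word itself serves as the training target, the supply of training examples grows with the data, not with the number of human annotators.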
And 174 00:09:55,280 --> 00:09:57,400 Speaker 5: then all of a sudden we start getting 175 00:09:57,440 --> 00:10:01,360 Speaker 5: these really powerful models. And then really, these have 176 00:10:01,400 --> 00:10:06,160 Speaker 5: been simmering technologies, right, this has been happening for a 177 00:10:06,240 --> 00:10:10,400 Speaker 5: while and progressively getting more and more powerful. One of 178 00:10:10,440 --> 00:10:14,720 Speaker 5: the things that really happened with ChatGPT and technologies like 179 00:10:15,160 --> 00:10:18,600 Speaker 5: Stable Diffusion and Midjourney was that they made it 180 00:10:18,800 --> 00:10:21,480 Speaker 5: visible to the public. You know, you put it out 181 00:10:21,480 --> 00:10:23,760 Speaker 5: there, the public can touch and feel it, and they're like, wow, 182 00:10:24,040 --> 00:10:27,640 Speaker 5: not only is there palpable change, and wow, this, you know, 183 00:10:27,679 --> 00:10:29,199 Speaker 5: I can talk to this thing. Wow, this thing can 184 00:10:29,240 --> 00:10:32,120 Speaker 5: generate an image. Not only that, but everyone can touch 185 00:10:32,120 --> 00:10:36,400 Speaker 5: and feel and try. My kids can use some of 186 00:10:36,440 --> 00:10:41,880 Speaker 5: these AI art generation technologies. And that's really just launched us, 187 00:10:41,920 --> 00:10:45,160 Speaker 5: you know, it's like a slingshot that's propelled us into 188 00:10:45,520 --> 00:10:47,520 Speaker 5: a different regime in terms of the public awareness of 189 00:10:47,559 --> 00:10:48,920 Speaker 5: these technologies. 190 00:10:49,040 --> 00:10:52,200 Speaker 3: You mentioned earlier in the conversation foundation models, and I 191 00:10:52,240 --> 00:10:54,040 Speaker 3: want to talk a little bit about that. I mean, 192 00:10:54,080 --> 00:10:57,520 Speaker 3: can you just tell me, you know, what are foundation 193 00:10:57,720 --> 00:11:00,480 Speaker 3: models for AI and why are they a big deal? 194 00:11:01,679 --> 00:11:05,360 Speaker 5: Yeah, So this term foundation model was coined by a 195 00:11:05,400 --> 00:11:09,080 Speaker 5: group at Stanford, and I think it's actually a really 196 00:11:09,200 --> 00:11:12,679 Speaker 5: apt term, because remember I said, you know, one of 197 00:11:12,720 --> 00:11:16,080 Speaker 5: the big things that unlocked this latest excitement was the 198 00:11:16,080 --> 00:11:19,680 Speaker 5: fact that we could use large amounts of unannotated data. 199 00:11:20,000 --> 00:11:21,480 Speaker 5: We could train a model. We don't have to 200 00:11:21,520 --> 00:11:25,120 Speaker 5: go through the painful effort of labeling each and every example. 201 00:11:25,720 --> 00:11:27,920 Speaker 5: You still need to have your model do something you 202 00:11:27,960 --> 00:11:30,120 Speaker 5: want it to do. You still need to tell it what 203 00:11:30,160 --> 00:11:31,760 Speaker 5: you want to do. You can't just have a model 204 00:11:31,760 --> 00:11:33,720 Speaker 5: that doesn't, you know, have any purpose. 205 00:11:34,000 --> 00:11:34,400 Speaker 2: But what a 206 00:11:34,320 --> 00:11:38,160 Speaker 5: foundation model is, is that it provides a foundation, like a literal foundation. 207 00:11:38,440 --> 00:11:40,520 Speaker 5: You can sort of stand on the shoulders of giants. 208 00:11:40,520 --> 00:11:43,040 Speaker 5: You can have one of these massively trained models and 209 00:11:43,080 --> 00:11:45,160 Speaker 5: then do a little bit on top.
You know, you 210 00:11:45,160 --> 00:11:47,440 Speaker 5: could use just a few examples of what you're looking 211 00:11:47,480 --> 00:11:50,640 Speaker 5: for and you can get what you want from the model. 212 00:11:51,200 --> 00:11:53,199 Speaker 5: So just a little bit on top now gets you to 213 00:11:53,360 --> 00:11:55,440 Speaker 5: results that used to take a huge amount of effort, 214 00:11:55,440 --> 00:11:57,360 Speaker 5: you know, to get from the 215 00:11:57,440 --> 00:11:59,520 Speaker 5: ground up to that level. 216 00:12:00,200 --> 00:12:04,160 Speaker 3: Trying to think of an analogy for sort of 217 00:12:04,440 --> 00:12:07,240 Speaker 3: foundation models versus what came before, and I don't know 218 00:12:07,280 --> 00:12:09,679 Speaker 3: that I came up with a good one, but the 219 00:12:09,720 --> 00:12:11,320 Speaker 3: best I could do was this. I want you to 220 00:12:11,320 --> 00:12:15,240 Speaker 3: tell me if it's plausible. It's like before foundation models, 221 00:12:15,760 --> 00:12:18,400 Speaker 3: it was like you had these sort of single use 222 00:12:18,640 --> 00:12:21,160 Speaker 3: kitchen appliances. You could have a waffle iron if you 223 00:12:21,200 --> 00:12:23,199 Speaker 3: wanted waffles, or you could have a 224 00:12:23,160 --> 00:12:24,840 Speaker 2: toaster if you wanted to make toast. 225 00:12:25,160 --> 00:12:27,960 Speaker 3: But a foundation model is like an oven with 226 00:12:28,040 --> 00:12:29,960 Speaker 3: a range on top. So it's like this machine and 227 00:12:30,000 --> 00:12:32,640 Speaker 3: you could just cook anything with this machine. 228 00:12:33,280 --> 00:12:37,760 Speaker 5: Yeah, that's a great analogy. They're very versatile. The other 229 00:12:37,880 --> 00:12:40,440 Speaker 5: piece of it, too, is that they dramatically lower the 230 00:12:40,520 --> 00:12:43,679 Speaker 5: effort that it takes to do something that you want 231 00:12:43,720 --> 00:12:46,760 Speaker 5: to do. And something I used to say about the 232 00:12:46,800 --> 00:12:48,720 Speaker 5: old world of AI was, you know, the problem 233 00:12:48,760 --> 00:12:52,200 Speaker 5: with automation is that it's too labor intensive. It 234 00:12:52,240 --> 00:12:53,560 Speaker 5: sounds like I'm making a joke. 235 00:12:53,760 --> 00:12:58,320 Speaker 3: Indeed, famously, if automation does one thing, it substitutes machines 236 00:12:58,440 --> 00:13:01,679 Speaker 3: or computing power for labor. Right. So what does that 237 00:13:01,720 --> 00:13:06,040 Speaker 3: mean, to say AI is, or automation is, too labor intensive? 238 00:13:06,520 --> 00:13:08,480 Speaker 5: It sounds like I'm making a joke, but I'm actually serious. 239 00:13:08,520 --> 00:13:11,240 Speaker 5: And what I mean is that the effort it took in 240 00:13:11,840 --> 00:13:15,600 Speaker 5: the old regime to automate something was very, very high. 241 00:13:15,720 --> 00:13:18,920 Speaker 5: So if I need to go and curate all this data, 242 00:13:18,960 --> 00:13:22,199 Speaker 5: collect all this data, and then carefully label all these examples, 243 00:13:22,600 --> 00:13:26,559 Speaker 5: that labeling itself might be incredibly expensive and time-consuming.
So 244 00:13:26,880 --> 00:13:29,520 Speaker 5: we estimate anywhere between eighty and ninety percent of 245 00:13:29,559 --> 00:13:32,440 Speaker 5: the effort it takes to field an AI solution 246 00:13:32,520 --> 00:13:36,079 Speaker 5: is actually just spent on data. So that has some consequences, 247 00:13:36,400 --> 00:13:41,720 Speaker 5: which is the threshold for bothering. You know, if you're 248 00:13:41,760 --> 00:13:43,920 Speaker 5: going to only get a little bit of value back 249 00:13:44,200 --> 00:13:46,400 Speaker 5: from something, are you going to go through this huge 250 00:13:46,440 --> 00:13:49,960 Speaker 5: effort to curate all this data? And then when it 251 00:13:49,960 --> 00:13:52,320 Speaker 5: comes time to train the model, you need highly skilled 252 00:13:52,400 --> 00:13:56,439 Speaker 5: people, expensive or hard to find in the labor market. 253 00:13:56,600 --> 00:13:58,120 Speaker 5: You know, are you really going to do something that's 254 00:13:58,160 --> 00:14:00,000 Speaker 5: just a tiny little incremental thing? No, you're going to 255 00:14:00,080 --> 00:14:03,240 Speaker 5: do only the highest value things that warrant that 256 00:14:03,920 --> 00:14:05,000 Speaker 5: level, because you 257 00:14:04,960 --> 00:14:08,559 Speaker 3: have to essentially build the whole machine from scratch, and 258 00:14:08,960 --> 00:14:11,600 Speaker 3: there aren't many things where it's worth that much work 259 00:14:11,640 --> 00:14:13,760 Speaker 3: to build a machine that's only going to do one 260 00:14:13,880 --> 00:14:14,720 Speaker 3: narrow thing. 261 00:14:15,200 --> 00:14:18,120 Speaker 5: That's right, and then you tackle the next problem and 262 00:14:18,200 --> 00:14:20,560 Speaker 5: you basically have to start over. And you know, there 263 00:14:20,560 --> 00:14:23,360 Speaker 5: are some nuances here, like for images, you can pre- 264 00:14:23,440 --> 00:14:25,880 Speaker 5: train a model on some other tasks and change it around. 265 00:14:25,960 --> 00:14:28,920 Speaker 5: So there are some examples of this, like non-recurring 266 00:14:29,040 --> 00:14:31,600 Speaker 5: costs, that we had in the old world too. But 267 00:14:31,640 --> 00:14:34,160 Speaker 5: by and large, it's just a lot of effort. It's hard, 268 00:14:34,480 --> 00:14:38,760 Speaker 5: it takes, you know, a large level of skill to implement. 269 00:14:39,520 --> 00:14:42,320 Speaker 5: One analogy that I like is, you know, think about 270 00:14:42,360 --> 00:14:44,480 Speaker 5: it as, you know, you have a river of data, 271 00:14:44,840 --> 00:14:48,160 Speaker 5: you know, running through your company or your institution. Traditional 272 00:14:48,240 --> 00:14:50,720 Speaker 5: AI solutions are kind of like building a dam on 273 00:14:50,760 --> 00:14:54,240 Speaker 5: that river. You know, dams are very expensive things to build. 274 00:14:54,560 --> 00:14:58,800 Speaker 5: They require highly specialized skills and lots of planning. And 275 00:14:59,000 --> 00:15:00,680 Speaker 5: you know, you're only going to put a dam on 276 00:15:01,120 --> 00:15:03,640 Speaker 5: a river that's big enough that you're going to get 277 00:15:03,680 --> 00:15:05,800 Speaker 5: enough energy out of it that it was worth your trouble. 278 00:15:06,200 --> 00:15:07,720 Speaker 5: You're gonna get a lot of value out of that dam.
279 00:15:07,800 --> 00:15:09,400 Speaker 5: If you have a river like that, you know, a 280 00:15:09,520 --> 00:15:13,080 Speaker 5: river of data. But actually the vast majority of 281 00:15:13,280 --> 00:15:15,640 Speaker 5: the water, you know, in your kingdom actually isn't in 282 00:15:15,680 --> 00:15:19,720 Speaker 5: that river. It's in puddles and creeks and babbling brooks. 283 00:15:19,800 --> 00:15:23,240 Speaker 5: And you know, there's a lot of value left on 284 00:15:23,280 --> 00:15:25,840 Speaker 5: the table, because it's like, well, there's nothing 285 00:15:25,840 --> 00:15:27,640 Speaker 5: you can do about it. It's just that that's too 286 00:15:28,640 --> 00:15:31,760 Speaker 5: low value. So it takes too much effort, so I'm 287 00:15:31,760 --> 00:15:33,320 Speaker 5: just not going to do it. The return on investment 288 00:15:33,720 --> 00:15:36,280 Speaker 5: just isn't there, so you just end up not automating 289 00:15:36,320 --> 00:15:39,120 Speaker 5: things because it's too much of a pain. Now what 290 00:15:39,160 --> 00:15:41,600 Speaker 5: foundation models do is they say, well, actually, no, we 291 00:15:41,640 --> 00:15:44,800 Speaker 5: can train a base model, a foundation, that you can 292 00:15:44,840 --> 00:15:46,560 Speaker 5: work on top of. We don't care, we don't have to 293 00:15:46,560 --> 00:15:48,400 Speaker 5: specify what the task is ahead of time. We just 294 00:15:48,400 --> 00:15:51,560 Speaker 5: need to learn about the domain of data. So if 295 00:15:51,560 --> 00:15:54,440 Speaker 5: we want to build something that can understand English language, 296 00:15:54,760 --> 00:15:58,080 Speaker 5: there's a ton of English language text available out in 297 00:15:58,120 --> 00:16:02,440 Speaker 5: the world. We can now train on huge quantities of it, 298 00:16:02,880 --> 00:16:06,680 Speaker 5: and then it learns the structure, learns how language works, you know, 299 00:16:06,800 --> 00:16:09,680 Speaker 5: a good part of how language works, on all that unlabeled data, 300 00:16:09,760 --> 00:16:11,880 Speaker 5: and then when you roll up with your task, you 301 00:16:11,880 --> 00:16:15,440 Speaker 5: know, I want to solve this particular problem, you don't 302 00:16:15,480 --> 00:16:18,080 Speaker 5: have to start from scratch. You're starting from a very 303 00:16:18,200 --> 00:16:20,920 Speaker 5: very very high place. So that just gives you the 304 00:16:20,960 --> 00:16:23,320 Speaker 5: ability to just, you know, now, all of a sudden, 305 00:16:23,360 --> 00:16:26,560 Speaker 5: everything is accessible. All the puddles and creeks and babbling 306 00:16:26,560 --> 00:16:30,720 Speaker 5: brooks and kettle ponds, you know, those are all accessible now. 307 00:16:31,240 --> 00:16:33,920 Speaker 5: And that's very exciting. It just changes the 308 00:16:33,920 --> 00:16:36,440 Speaker 5: equation on what kinds of problems you could use AI 309 00:16:36,560 --> 00:16:36,960 Speaker 5: to solve. 310 00:16:37,080 --> 00:16:42,560 Speaker 3: And so foundation models basically mean that automating some new 311 00:16:42,640 --> 00:16:45,920 Speaker 3: task is much less labor intensive. The sort of marginal 312 00:16:45,960 --> 00:16:49,000 Speaker 3: effort to do some new automation thing is much lower 313 00:16:49,000 --> 00:16:52,280 Speaker 3: because you're building on top of the foundation model rather 314 00:16:52,320 --> 00:16:56,720 Speaker 3: than starting from scratch.
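As a rough illustration of "rolling up with your task" and adding just a little bit on top, here is a minimal few-shot prompting sketch in Python. The call_model argument is a hypothetical stand-in for whatever text-generation model you have access to, not a specific product API, and the reviews are made up for the example.

# Minimal sketch of adapting a foundation model with "just a little bit on top":
# a handful of worked examples in the prompt instead of a labeled training set.
# `call_model` is a hypothetical callable that takes a prompt string and returns text.

FEW_SHOT_PROMPT = """Classify the sentiment of each product review as positive or negative.

Review: "Arrived broken and support never replied."
Sentiment: negative

Review: "Exactly what I needed, works great."
Sentiment: positive

Review: "{review}"
Sentiment:"""

def classify_review(review: str, call_model) -> str:
    prompt = FEW_SHOT_PROMPT.format(review=review)
    # The model is expected to complete the prompt with "positive" or "negative".
    return call_model(prompt).strip()

In the old regime, the same task would have meant collecting and hand-labeling a large review dataset and training a dedicated model before getting any answer at all.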
Absolutely. So that is 315 00:16:56,800 --> 00:17:00,520 Speaker 3: like the exciting good news. I do feel like there's 316 00:17:01,200 --> 00:17:03,840 Speaker 3: a little bit of a countervailing idea that's worth talking 317 00:17:03,840 --> 00:17:06,200 Speaker 3: about here, and that is the idea that even though 318 00:17:06,240 --> 00:17:10,280 Speaker 3: there are these foundation models that are really powerful, that 319 00:17:10,320 --> 00:17:13,359 Speaker 3: are relatively easy to build on top of, it's still 320 00:17:13,359 --> 00:17:17,240 Speaker 3: the case, right, that there is not some one size fits 321 00:17:16,960 --> 00:17:18,200 Speaker 2: all foundation model. 322 00:17:18,760 --> 00:17:21,320 Speaker 3: So you know, what does that mean and why is 323 00:17:21,359 --> 00:17:22,520 Speaker 3: that important to think about 324 00:17:22,560 --> 00:17:23,800 Speaker 2: in this context? 325 00:17:24,040 --> 00:17:27,840 Speaker 5: Yeah, so we believe very strongly that there isn't just 326 00:17:27,920 --> 00:17:30,800 Speaker 5: one model to rule them all. There's a number of 327 00:17:30,840 --> 00:17:33,840 Speaker 5: reasons why that could be true. One which I think 328 00:17:33,920 --> 00:17:37,960 Speaker 5: is important and very relevant today is how much energy 329 00:17:38,280 --> 00:17:43,040 Speaker 5: these models can consume. So these models, you know, can 330 00:17:43,080 --> 00:17:48,520 Speaker 5: get very, very large. So one thing that we're starting 331 00:17:48,560 --> 00:17:51,280 Speaker 5: to see, or starting to believe, is that you probably 332 00:17:51,280 --> 00:17:56,440 Speaker 5: shouldn't use one giant sledgehammer model to solve every single problem, 333 00:17:56,600 --> 00:17:58,560 Speaker 5: you know, like we should pick the right size model 334 00:17:58,560 --> 00:18:01,359 Speaker 5: to solve the problem. We shouldn't necessarily assume that we 335 00:18:01,440 --> 00:18:06,000 Speaker 5: need the biggest, baddest model for every little use case. 336 00:18:06,440 --> 00:18:08,639 Speaker 5: And we're also seeing that, you know, small models that 337 00:18:08,680 --> 00:18:12,880 Speaker 5: are trained, like, to specialize on particular domains can actually 338 00:18:12,920 --> 00:18:16,760 Speaker 5: outperform much bigger models. So bigger isn't always even better. 339 00:18:16,840 --> 00:18:19,439 Speaker 3: So they're more efficient and they do the thing you 340 00:18:19,440 --> 00:18:21,080 Speaker 3: want them to do better as well. 341 00:18:21,640 --> 00:18:22,120 Speaker 2: That's right. 342 00:18:22,240 --> 00:18:25,639 Speaker 5: So Stanford, for instance, a group at Stanford trained a model. 343 00:18:26,359 --> 00:18:28,920 Speaker 5: It is a two point seven billion parameter model, which 344 00:18:28,960 --> 00:18:31,800 Speaker 5: isn't terribly big by today's standards. They trained it just 345 00:18:31,880 --> 00:18:33,160 Speaker 5: on the biomedical literature. 346 00:18:33,200 --> 00:18:33,359 Speaker 2: You know. 347 00:18:33,400 --> 00:18:35,760 Speaker 5: This is the kind of thing that universities do, and 348 00:18:35,840 --> 00:18:39,119 Speaker 5: what they showed was that this model was better at 349 00:18:39,119 --> 00:18:41,840 Speaker 5: answering questions about the biomedical literature than some models that 350 00:18:41,920 --> 00:18:45,639 Speaker 5: are one hundred billion parameters, you know, many times larger.
351 00:18:46,200 --> 00:18:48,560 Speaker 5: So it's a little bit like, you know, asking an 352 00:18:48,560 --> 00:18:52,439 Speaker 5: expert for help on something versus asking the smartest person 353 00:18:52,480 --> 00:18:55,240 Speaker 5: you know. The smartest person you know may be very smart, 354 00:18:55,680 --> 00:18:58,720 Speaker 5: but they're not going to have that expertise. And then as 355 00:18:58,760 --> 00:19:00,520 Speaker 5: an added bonus, you know, this is now a much 356 00:19:00,520 --> 00:19:02,960 Speaker 5: smaller model, it's much more efficient to run, 357 00:19:03,119 --> 00:19:06,639 Speaker 5: you know, it's cheaper. So there's lots of 358 00:19:06,640 --> 00:19:09,359 Speaker 5: different advantages there. So I think we're going to see 359 00:19:09,760 --> 00:19:14,280 Speaker 5: a tension in the industry between vendors that say, hey, this 360 00:19:14,359 --> 00:19:16,480 Speaker 5: is the one, you know, big model, and then others 361 00:19:16,480 --> 00:19:18,800 Speaker 5: that say, well, actually, you know, there's, you know, 362 00:19:19,160 --> 00:19:21,080 Speaker 5: lots of different tools we can use that all have 363 00:19:21,160 --> 00:19:24,119 Speaker 5: this nice quality that we outlined at the beginning, and 364 00:19:24,119 --> 00:19:25,600 Speaker 5: then we should really pick the one that makes the 365 00:19:25,680 --> 00:19:27,360 Speaker 5: most sense for the task at hand. 366 00:19:28,720 --> 00:19:33,080 Speaker 3: So there's sustainability, basically efficiency. Another kind of set of 367 00:19:33,119 --> 00:19:37,000 Speaker 3: issues that come up a lot with AI are bias, hallucination. 368 00:19:37,720 --> 00:19:40,359 Speaker 3: Can you talk a little bit about bias and hallucination, 369 00:19:40,440 --> 00:19:43,360 Speaker 3: what they are, and how you're working to mitigate those problems? 370 00:19:43,800 --> 00:19:46,639 Speaker 5: Yeah, so there are lots of issues still. As amazing 371 00:19:46,680 --> 00:19:49,640 Speaker 5: as these technologies are, and they are amazing, let's 372 00:19:49,640 --> 00:19:52,119 Speaker 5: be very clear, lots of great things we're going to 373 00:19:52,200 --> 00:19:56,040 Speaker 5: enable with these kinds of technologies. Bias isn't a new problem. 374 00:19:56,400 --> 00:20:01,000 Speaker 5: So, you know, basically we've seen this since the beginning 375 00:20:01,000 --> 00:20:03,919 Speaker 5: of AI. If you train a model on data that 376 00:20:04,320 --> 00:20:06,439 Speaker 5: has a bias in it, the model is going to 377 00:20:06,480 --> 00:20:11,080 Speaker 5: recapitulate that bias as it provides its answers. So every time, 378 00:20:11,240 --> 00:20:13,800 Speaker 5: you know, if all the text you have says, you know, 379 00:20:13,840 --> 00:20:16,919 Speaker 5: it's more likely to refer to female nurses and male scientists, 380 00:20:16,960 --> 00:20:19,040 Speaker 5: then you're going to, you know, get models like that. 381 00:20:19,080 --> 00:20:22,160 Speaker 5: For instance, there was an example where a machine learning 382 00:20:22,200 --> 00:20:26,600 Speaker 5: based translation system translated from Hungarian to English.
Hungarian doesn't 383 00:20:26,600 --> 00:20:29,919 Speaker 5: have gender pronouns, English does, and when you asked it 384 00:20:29,920 --> 00:20:32,280 Speaker 5: to translate, it would translate "they are a nurse" to 385 00:20:32,680 --> 00:20:35,560 Speaker 5: "she is a nurse," would translate "they are a scientist" 386 00:20:35,600 --> 00:20:37,800 Speaker 5: to "he is a scientist." And that's not because the 387 00:20:38,600 --> 00:20:41,199 Speaker 5: people who wrote the algorithm were building in bias and 388 00:20:41,320 --> 00:20:43,040 Speaker 5: coding in like, oh, it's got to be this way. 389 00:20:43,119 --> 00:20:45,359 Speaker 5: It's because the data was like that. You know, we 390 00:20:45,440 --> 00:20:49,719 Speaker 5: have biases in our society and they're reflected in our 391 00:20:49,800 --> 00:20:53,600 Speaker 5: data and our text and our images everywhere, and then 392 00:20:53,640 --> 00:20:56,760 Speaker 5: the models, they're just mapping from what they've seen in 393 00:20:56,800 --> 00:20:59,560 Speaker 5: their training data to the result that you're trying to 394 00:20:59,560 --> 00:21:01,960 Speaker 5: get them to do and to give, and then these 395 00:21:01,960 --> 00:21:06,840 Speaker 5: biases come out. So there's a very active program of 396 00:21:06,920 --> 00:21:09,560 Speaker 5: research, and you know, we do quite a bit 397 00:21:09,600 --> 00:21:13,240 Speaker 5: at IBM Research and MIT, but also all over 398 00:21:13,400 --> 00:21:16,000 Speaker 5: the community and industry and academia, trying to figure out 399 00:21:16,040 --> 00:21:19,080 Speaker 5: how do we explicitly remove these biases, how do we 400 00:21:19,119 --> 00:21:21,480 Speaker 5: identify them, how do we, you know, build 401 00:21:21,640 --> 00:21:23,959 Speaker 5: tools that allow people to audit their systems to make 402 00:21:23,960 --> 00:21:26,720 Speaker 5: sure they aren't biased. So this is a really important thing. 403 00:21:26,800 --> 00:21:29,920 Speaker 5: And you know, again, this was here since the beginning, 404 00:21:30,560 --> 00:21:34,719 Speaker 5: you know, of machine learning and AI, but foundation models 405 00:21:34,720 --> 00:21:37,960 Speaker 5: and large language models and generative AI just bring it 406 00:21:38,000 --> 00:21:40,720 Speaker 5: into even sharper focus, because there's just so much 407 00:21:40,800 --> 00:21:44,119 Speaker 5: data and it's sort of building in, baking in, all 408 00:21:44,200 --> 00:21:47,679 Speaker 5: these different biases we have. So that's absolutely 409 00:21:48,200 --> 00:21:51,000 Speaker 5: a problem that these models have. Another one that you 410 00:21:51,040 --> 00:21:54,960 Speaker 5: mentioned was hallucinations. So even the most impressive of our 411 00:21:55,000 --> 00:21:58,920 Speaker 5: models will often just make stuff up, you know. The 412 00:21:59,160 --> 00:22:02,720 Speaker 5: technical term that the field has chosen is hallucination. To 413 00:22:02,760 --> 00:22:06,119 Speaker 5: give you an example, I asked ChatGPT to create 414 00:22:06,160 --> 00:22:09,920 Speaker 5: a biography of David Cox, IBM, and you know, it 415 00:22:10,040 --> 00:22:12,560 Speaker 5: started off really well.
You know, it identified that I 416 00:22:12,600 --> 00:22:15,000 Speaker 5: was the director of the MIT IBM Watson AI Lab and said 417 00:22:15,080 --> 00:22:17,439 Speaker 5: a few words about that, and then it proceeded to 418 00:22:17,480 --> 00:22:22,120 Speaker 5: create an authoritative but completely fake biography of me where 419 00:22:22,160 --> 00:22:25,919 Speaker 5: I was British, I was born in the UK. I 420 00:22:25,960 --> 00:22:28,760 Speaker 5: went to British university, you know, universities in the UK. 421 00:22:28,840 --> 00:22:31,800 Speaker 3: I was a professor. The authority, right, it's the certainty 422 00:22:31,920 --> 00:22:34,960 Speaker 3: that is weird about it, right? It's dead 423 00:22:35,119 --> 00:22:37,399 Speaker 3: certain that you're from the UK, et cetera. 424 00:22:37,960 --> 00:22:41,000 Speaker 5: Absolutely, yeah, it has all kinds of flourishes, like I 425 00:22:41,080 --> 00:22:45,800 Speaker 5: won awards in the UK. So yeah, it's problematic 426 00:22:45,840 --> 00:22:48,680 Speaker 5: because it kind of pokes a lot of weak spots 427 00:22:48,720 --> 00:22:53,920 Speaker 5: in our human psychology where, if something sounds coherent, we're 428 00:22:54,000 --> 00:22:56,760 Speaker 5: likely to assume it's true. We're not used to interacting 429 00:22:56,800 --> 00:23:01,520 Speaker 5: with people who eloquently and authoritatively, you know, emit complete nonsense, 430 00:23:01,600 --> 00:23:04,320 Speaker 5: like, yeah, you know, we can debate about that, but. 431 00:23:04,359 --> 00:23:07,840 Speaker 3: Yeah, we can debate about that, but yes, the sort 432 00:23:07,880 --> 00:23:11,480 Speaker 3: of blithe confidence throws you off when you realize it's 433 00:23:11,520 --> 00:23:12,280 Speaker 3: completely wrong. 434 00:23:12,400 --> 00:23:15,160 Speaker 5: Right, that's right. And we do have a little bit 435 00:23:15,160 --> 00:23:18,399 Speaker 5: of like a great and powerful Oz sort of vibe 436 00:23:18,400 --> 00:23:20,760 Speaker 5: going sometimes, where we're like, well, you know, the AI 437 00:23:20,960 --> 00:23:24,720 Speaker 5: is all knowing and therefore whatever it says must be true. 438 00:23:24,920 --> 00:23:27,000 Speaker 5: But these things will make up stuff, you know, 439 00:23:27,320 --> 00:23:32,119 Speaker 5: very aggressively, and you know, everyone can try asking 440 00:23:32,119 --> 00:23:34,919 Speaker 5: it for their bio. You'll get something; 441 00:23:35,480 --> 00:23:37,879 Speaker 5: you always get something that's of the right form, that 442 00:23:37,920 --> 00:23:40,119 Speaker 5: has the right tone. But you know, the facts just 443 00:23:40,160 --> 00:23:43,359 Speaker 5: aren't necessarily there. So that's obviously a problem. We need 444 00:23:43,400 --> 00:23:46,080 Speaker 5: to figure out how to close those gaps, fix those problems. 445 00:23:46,760 --> 00:23:49,199 Speaker 5: There's lots of ways we can use them much more easily. 446 00:23:49,720 --> 00:23:52,480 Speaker 4: I'd just like to say, faced with the awesome potential 447 00:23:52,520 --> 00:23:55,560 Speaker 4: of what these technologies might do, it's a bit encouraging 448 00:23:55,600 --> 00:23:59,080 Speaker 4: to hear that even ChatGPT has a weakness for 449 00:23:59,240 --> 00:24:04,920 Speaker 4: inventing buoyant fictional versions of people's lives. And while entertaining 450 00:24:04,920 --> 00:24:08,560 Speaker 4: ourselves with ChatGPT and Midjourney is important,
the 451 00:24:08,600 --> 00:24:13,000 Speaker 4: way laypeople use consumer facing chatbots and generative AI 452 00:24:13,480 --> 00:24:17,400 Speaker 4: is just fundamentally different from the way an enterprise business 453 00:24:17,480 --> 00:24:21,119 Speaker 4: uses AI. How can we harness the abilities of artificial 454 00:24:21,119 --> 00:24:24,159 Speaker 4: intelligence to help us solve the problems we face in 455 00:24:24,280 --> 00:24:28,119 Speaker 4: business and technology? Let's listen on as David and Jacob 456 00:24:28,240 --> 00:24:29,600 Speaker 4: continue their conversation. 457 00:24:30,359 --> 00:24:33,320 Speaker 3: We've been talking in a somewhat abstract way about AI 458 00:24:33,440 --> 00:24:35,119 Speaker 3: and the ways it can be used. 459 00:24:35,680 --> 00:24:37,160 Speaker 2: Let's talk in a little bit more 460 00:24:37,000 --> 00:24:40,760 Speaker 3: of a specific way. Can you just talk about some 461 00:24:40,920 --> 00:24:45,440 Speaker 3: examples of business challenges that can be solved with automation, 462 00:24:45,560 --> 00:24:47,399 Speaker 3: with this kind of automation we're talking about? 463 00:24:48,280 --> 00:24:51,760 Speaker 5: Yeah, so really, the sky's the limit. There's a 464 00:24:51,760 --> 00:24:55,880 Speaker 5: whole set of different applications that these models are really 465 00:24:55,880 --> 00:24:58,600 Speaker 5: good at. And basically it's a superset of everything we 466 00:24:58,720 --> 00:25:01,560 Speaker 5: used to use AI for in business. So, you know, 467 00:25:02,200 --> 00:25:03,879 Speaker 5: the simple kinds of things are like, hey, if I 468 00:25:03,920 --> 00:25:06,639 Speaker 5: have text and, you know, I have like product reviews, 469 00:25:06,960 --> 00:25:08,120 Speaker 5: and I want to be able to tell if these 470 00:25:08,119 --> 00:25:10,240 Speaker 5: are positive or negative. You know, like, let's look at 471 00:25:10,280 --> 00:25:11,920 Speaker 5: all the negative reviews so we can have a human 472 00:25:11,960 --> 00:25:15,199 Speaker 5: look through them and see what was up. Very common 473 00:25:15,560 --> 00:25:18,040 Speaker 5: business use case. You can do it with traditional deep 474 00:25:18,119 --> 00:25:21,560 Speaker 5: learning based AI. So there's things like that that 475 00:25:21,600 --> 00:25:23,719 Speaker 5: are, you know, very prosaic sorts of things that we were 476 00:25:23,720 --> 00:25:25,560 Speaker 5: already doing; we've been doing it for a long time. 477 00:25:26,440 --> 00:25:29,240 Speaker 5: Then you get situations that were harder for 478 00:25:29,359 --> 00:25:32,159 Speaker 5: the old AI. Like, if I want to 479 00:25:32,560 --> 00:25:35,160 Speaker 5: compress something. Like, say 480 00:25:35,200 --> 00:25:37,439 Speaker 5: I have a chat transcript. Like a customer called in 481 00:25:38,359 --> 00:25:41,920 Speaker 5: and they had a complaint. They call back. Okay, now 482 00:25:41,960 --> 00:25:44,720 Speaker 5: a new, you know, person on the line needs 483 00:25:44,800 --> 00:25:47,640 Speaker 5: to go read the old transcript to catch up. Wouldn't 484 00:25:47,640 --> 00:25:49,919 Speaker 5: it be better if we could just summarize that, just 485 00:25:49,920 --> 00:25:52,159 Speaker 5: condense it all down to a quick little paragraph? You know, 486 00:25:52,240 --> 00:25:54,080 Speaker 5: customer called, they were upset about this, rather than 487 00:25:54,119 --> 00:25:56,360 Speaker 5: having to read the blow by blow.
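Here is one minimal way to sketch that kind of transcript summarization in Python, using the open-source Hugging Face transformers library and its default summarization model purely as an illustration; the library choice and the toy transcript are assumptions for the example, not the system David is describing.

# Illustrative sketch: condense a support-chat transcript so the next agent
# can catch up quickly, using the open-source `transformers` library as one option.
from transformers import pipeline

summarizer = pipeline("summarization")  # downloads a default summarization model

transcript = (
    "Customer: My order arrived with a cracked screen and I need it for work. "
    "Agent: I'm sorry about that. I can send a replacement or issue a refund. "
    "Customer: A replacement, please, and can you expedite the shipping? "
    "Agent: Done. The replacement ships today with two-day delivery."
)

summary = summarizer(transcript, max_length=40, min_length=10)[0]["summary_text"]
print(summary)  # e.g., a one- or two-sentence recap of the complaint and the resolution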
There's just lots 488 00:25:56,400 --> 00:25:59,680 Speaker 5: of settings like that where summarization is really helpful. Hey, 489 00:25:59,680 --> 00:26:03,600 Speaker 5: you have a meeting, and I'd like to just automatically, you know, 490 00:26:03,680 --> 00:26:06,120 Speaker 5: have that meeting or that email or whatever, I'd 491 00:26:06,119 --> 00:26:07,560 Speaker 5: like to just have it condensed down so I can 492 00:26:07,640 --> 00:26:10,360 Speaker 5: really quickly get to the heart of the matter. These 493 00:26:10,400 --> 00:26:12,880 Speaker 5: models are really good at doing that. They're also 494 00:26:12,960 --> 00:26:15,600 Speaker 5: really good at question answering. So if I want to 495 00:26:15,640 --> 00:26:17,920 Speaker 5: find out, you know, how many vacation days do I have, 496 00:26:18,280 --> 00:26:22,639 Speaker 5: I can now interact in natural language with a system 497 00:26:22,720 --> 00:26:25,000 Speaker 5: that can go and that has access to our 498 00:26:25,119 --> 00:26:27,399 Speaker 5: HR policies, and I can actually have, you know, 499 00:26:27,480 --> 00:26:29,959 Speaker 5: a multi turn conversation, like 500 00:26:30,000 --> 00:26:32,520 Speaker 5: I would have with, you know, an actual 501 00:26:33,480 --> 00:26:38,000 Speaker 5: HR professional or customer service representative. So a big part of, 502 00:26:38,720 --> 00:26:41,879 Speaker 5: you know, what this is doing is it's putting in 503 00:26:41,920 --> 00:26:44,280 Speaker 5: an interface. You know, when we think of computer interfaces, 504 00:26:44,280 --> 00:26:47,919 Speaker 5: we're usually thinking about UI, user interface, elements where I 505 00:26:47,920 --> 00:26:51,440 Speaker 5: click on menus and there's buttons and all this stuff. Increasingly, 506 00:26:51,520 --> 00:26:55,280 Speaker 5: now we can just talk, you know, just in words. 507 00:26:55,359 --> 00:26:57,160 Speaker 5: You can describe what you want, you want to 508 00:26:57,359 --> 00:26:59,960 Speaker 5: ask a question, you want to sort of command this 509 00:27:00,000 --> 00:27:02,840 Speaker 5: system to do something, rather than having to learn how 510 00:27:02,840 --> 00:27:04,919 Speaker 5: to do that by clicking buttons, which might be inefficient. Now 511 00:27:04,920 --> 00:27:06,520 Speaker 5: we can just sort of spell it out. 512 00:27:07,080 --> 00:27:10,120 Speaker 3: Interesting, right, the graphical user interface that we all sort 513 00:27:10,119 --> 00:27:13,440 Speaker 3: of default to, that's not like the state of nature, right? 514 00:27:13,480 --> 00:27:16,000 Speaker 3: That's a thing that was invented and just came to 515 00:27:16,040 --> 00:27:18,439 Speaker 3: be the standard way that we interact with computers. And 516 00:27:18,480 --> 00:27:22,960 Speaker 3: so you could imagine, as you're saying, like chat, essentially 517 00:27:23,119 --> 00:27:26,399 Speaker 3: chatting with the machine, could become a sort of 518 00:27:26,440 --> 00:27:29,720 Speaker 3: standard user interface, just like the graphical user interface did, 519 00:27:29,880 --> 00:27:31,280 Speaker 3: you know, over the past several decades. 520 00:27:31,760 --> 00:27:35,160 Speaker 5: Absolutely. And I think those kinds of conversational interfaces are 521 00:27:35,200 --> 00:27:39,400 Speaker 5: going to be hugely important for increasing our productivity.
It's 522 00:27:39,440 --> 00:27:41,600 Speaker 5: just a lot easier. Instead of having to learn how 523 00:27:41,640 --> 00:27:43,600 Speaker 5: to use a tool, or having to kind of 524 00:27:43,600 --> 00:27:46,320 Speaker 5: have awkward, you know, interactions with the computer, I can 525 00:27:46,359 --> 00:27:48,000 Speaker 5: just tell it what I want and it can understand it. 526 00:27:48,040 --> 00:27:51,240 Speaker 5: It could, you know, potentially even ask questions back to clarify 527 00:27:51,400 --> 00:27:56,280 Speaker 5: and have those kinds of conversations. That can be extremely powerful. 528 00:27:56,400 --> 00:27:58,000 Speaker 5: And in fact, one area where that's going to, I 529 00:27:58,000 --> 00:28:01,199 Speaker 5: think, be absolutely game changing is in code. When we 530 00:28:01,240 --> 00:28:06,280 Speaker 5: write code, you know, programming languages are a way for 531 00:28:06,400 --> 00:28:10,280 Speaker 5: us to sort of match between our very sloppy way 532 00:28:10,280 --> 00:28:13,159 Speaker 5: of talking and the very exact way that you need 533 00:28:13,200 --> 00:28:15,560 Speaker 5: to command a computer to do what you want it to do. 534 00:28:15,920 --> 00:28:18,640 Speaker 5: They're cumbersome to learn, they can, you know, create very 535 00:28:18,640 --> 00:28:21,800 Speaker 5: complex systems that are very hard to reason about. And 536 00:28:21,840 --> 00:28:24,120 Speaker 5: we're already starting to see the ability to just write 537 00:28:24,119 --> 00:28:26,680 Speaker 5: down what you want and AI will generate the code 538 00:28:26,720 --> 00:28:28,480 Speaker 5: for you. And I think we're just going to see 539 00:28:28,520 --> 00:28:30,960 Speaker 5: a huge revolution of, like, we just converse, you know, 540 00:28:31,040 --> 00:28:33,120 Speaker 5: we can have a conversation to say what we want, 541 00:28:33,200 --> 00:28:36,480 Speaker 5: and then the computer can actually not only do fixed 542 00:28:36,560 --> 00:28:38,720 Speaker 5: actions and do things for us, but it can actually 543 00:28:38,760 --> 00:28:40,960 Speaker 5: even write code to do new things, you know, and 544 00:28:41,640 --> 00:28:44,719 Speaker 5: generate software itself. Given how much software we have, how 545 00:28:44,760 --> 00:28:47,480 Speaker 5: much craving we have for software, like we'll never have 546 00:28:47,640 --> 00:28:51,040 Speaker 5: enough software in our world. Uh, you know, the ability 547 00:28:51,080 --> 00:28:54,360 Speaker 5: to have AI systems as a helper in that, I 548 00:28:54,360 --> 00:28:56,000 Speaker 5: think we're going to see a lot 549 00:28:56,040 --> 00:28:56,680 Speaker 5: of value there. 550 00:28:57,880 --> 00:29:00,480 Speaker 3: So if you think about the different ways 551 00:29:01,120 --> 00:29:03,360 Speaker 3: AI might be applied to business, I mean you've talked 552 00:29:03,360 --> 00:29:05,680 Speaker 3: about a number of the sort of classic use cases. 553 00:29:06,360 --> 00:29:09,760 Speaker 3: What are some of the more out there use cases? 554 00:29:09,760 --> 00:29:12,640 Speaker 3: What are some, you know, unique ways you could imagine 555 00:29:12,680 --> 00:29:14,480 Speaker 3: AI being applied to business? 556 00:29:16,120 --> 00:29:18,840 Speaker 5: Yeah, really, the sky's the limit.
I mean, we have 557 00:29:18,920 --> 00:29:21,120 Speaker 5: one project that I'm kind of a fan of where 558 00:29:21,760 --> 00:29:25,240 Speaker 5: we actually were working with a mechanical engineering professor at 559 00:29:25,320 --> 00:29:28,320 Speaker 5: MIT working on a classic problem, how do you build 560 00:29:28,640 --> 00:29:32,080 Speaker 5: linkage systems, which are, like, you imagine bars and joints 561 00:29:32,200 --> 00:29:34,360 Speaker 5: and levers, you know, the things that 562 00:29:34,320 --> 00:29:37,400 Speaker 3: are building a thing, building a physical machine of some kind. 563 00:29:37,520 --> 00:29:43,200 Speaker 5: Kind of, like real, like metal, and you know, nineteenth century, 564 00:29:43,480 --> 00:29:46,560 Speaker 5: just old school industrial revolution. Yeah, yeah, but you 565 00:29:46,560 --> 00:29:49,360 Speaker 5: know, the little arm that's holding up my microphone 566 00:29:49,360 --> 00:29:51,960 Speaker 5: in front of me, cranes that build your buildings, you know, 567 00:29:52,040 --> 00:29:54,560 Speaker 5: parts of your engines. This is like classical stuff. It 568 00:29:54,600 --> 00:29:56,840 Speaker 5: turns out that, you know, humans, if you want to 569 00:29:56,880 --> 00:30:00,120 Speaker 5: build an advanced system, you decide what, like, curve you 570 00:30:00,160 --> 00:30:02,800 Speaker 5: want to create, and then a human, together with a 571 00:30:02,800 --> 00:30:06,600 Speaker 5: computer program, can build a five or six bar linkage, 572 00:30:06,680 --> 00:30:08,200 Speaker 5: and then that's kind of where you top out, 573 00:30:08,200 --> 00:30:11,040 Speaker 5: because it gets too complicated to work with more than that. 574 00:30:11,720 --> 00:30:14,200 Speaker 5: We built a generative AI system that can build twenty 575 00:30:14,240 --> 00:30:17,560 Speaker 5: bar linkages, like arbitrarily complex. So these are machines that 576 00:30:17,600 --> 00:30:21,960 Speaker 5: are beyond the capability of a human to design themselves. 577 00:30:22,480 --> 00:30:25,440 Speaker 5: Another example, we have an AI system that can generate 578 00:30:25,640 --> 00:30:28,000 Speaker 5: electronic circuits. You know, we had a project we're 579 00:30:28,000 --> 00:30:31,160 Speaker 5: working on where we're building better power converters, which allow our 580 00:30:31,920 --> 00:30:35,080 Speaker 5: computers and our devices to be more efficient, save energy, 581 00:30:35,880 --> 00:30:38,560 Speaker 5: you know, less carbon. But I think the 582 00:30:38,600 --> 00:30:41,680 Speaker 5: world around us has always been shaped by technology. If 583 00:30:41,720 --> 00:30:43,720 Speaker 5: we look around, you know, just think about how many 584 00:30:43,760 --> 00:30:46,200 Speaker 5: steps and how many people and how many designs went 585 00:30:46,200 --> 00:30:49,960 Speaker 5: into the table and the chair and the lamp. It's 586 00:30:50,000 --> 00:30:53,480 Speaker 5: really just astonishing. And that's already, you know, the fruit 587 00:30:53,560 --> 00:30:56,800 Speaker 5: of automation and computers and those kinds of tools. But 588 00:30:56,800 --> 00:31:00,280 Speaker 5: we're going to see that increasingly be a product also of AI. 589 00:31:00,360 --> 00:31:02,360 Speaker 5: It's just going to be everywhere around us. Everything we 590 00:31:02,520 --> 00:31:05,160 Speaker 5: touch is going to have been, you know, helped in 591 00:31:05,200 --> 00:31:07,360 Speaker 5: some way to get to you by AI.
592 00:31:08,320 --> 00:31:10,880 Speaker 3: You know, that is a pretty profound transformation that you're 593 00:31:10,920 --> 00:31:13,600 Speaker 3: talking about in business. How do you think about the 594 00:31:13,640 --> 00:31:16,760 Speaker 3: implications of that, both for the sort of, you know, 595 00:31:17,040 --> 00:31:20,160 Speaker 3: business itself and also for employees? 596 00:31:21,920 --> 00:31:24,880 Speaker 5: Yeah, so I think for businesses this is going to 597 00:31:25,280 --> 00:31:29,160 Speaker 5: cut costs, make new opportunities for, like, customers, you know, 598 00:31:29,240 --> 00:31:32,600 Speaker 5: like there's just, you know, it's sort of all upside, 599 00:31:32,640 --> 00:31:35,360 Speaker 5: right? Like, for the workers, I think the 600 00:31:35,400 --> 00:31:38,440 Speaker 5: story is mostly good too. You know, like how many 601 00:31:38,480 --> 00:31:41,760 Speaker 5: things do you do in your day that you'd really 602 00:31:41,960 --> 00:31:44,600 Speaker 5: rather not, right? You know, and we're used to having 603 00:31:44,600 --> 00:31:47,680 Speaker 5: things we don't like automated away. You know, we 604 00:31:48,040 --> 00:31:50,520 Speaker 5: didn't, you know, if you didn't like walking many miles 605 00:31:50,560 --> 00:31:52,200 Speaker 5: to work, then, you know, like you can have a 606 00:31:52,240 --> 00:31:54,320 Speaker 5: car and you can drive there, or we used to 607 00:31:54,400 --> 00:31:57,200 Speaker 5: have a huge fraction, over ninety percent, of the US 608 00:31:57,280 --> 00:32:01,000 Speaker 5: population engaged in agriculture. And then we mechanized; now very 609 00:32:01,000 --> 00:32:03,040 Speaker 5: few people work in agriculture, and a small number of people 610 00:32:03,040 --> 00:32:04,800 Speaker 5: can do the work of a large number of people. 611 00:32:05,440 --> 00:32:08,040 Speaker 5: And then, you know, things like email, and yeah, they've 612 00:32:08,120 --> 00:32:10,760 Speaker 5: led to huge productivity enhancements, because I don't need to 613 00:32:10,760 --> 00:32:13,120 Speaker 5: be writing letters and sending them in the mail. I 614 00:32:13,120 --> 00:32:17,560 Speaker 5: can just instantly communicate with people. We've just become more effective. 615 00:32:17,720 --> 00:32:21,760 Speaker 5: Like, our jobs have transformed, whether it's a physical job 616 00:32:21,840 --> 00:32:24,720 Speaker 5: like agriculture, or whether it's a knowledge worker job where 617 00:32:24,760 --> 00:32:28,480 Speaker 5: you're sending emails and communicating with people and coordinating teams. 618 00:32:28,760 --> 00:32:31,440 Speaker 5: We've just gotten better. And, you know, the technology has 619 00:32:31,440 --> 00:32:34,920 Speaker 5: just made us more productive. And this is just another example. Now, 620 00:32:35,240 --> 00:32:37,360 Speaker 5: you know, there are people who worry that, you know, 621 00:32:38,000 --> 00:32:40,440 Speaker 5: we'll be so good at that that maybe jobs will 622 00:32:40,480 --> 00:32:44,320 Speaker 5: be displaced, and that's a legitimate concern. But just like 623 00:32:45,720 --> 00:32:47,880 Speaker 5: how in agriculture, you know, it's not like suddenly we 624 00:32:47,920 --> 00:32:51,000 Speaker 5: had ninety percent of the population unemployed. You know, people 625 00:32:51,040 --> 00:32:55,280 Speaker 5: transitioned to other jobs.
And the other thing that we've 626 00:32:55,280 --> 00:32:59,280 Speaker 5: found too is that our appetite as humans for doing more things 627 00:32:59,840 --> 00:33:03,240 Speaker 5: is sort of insatiable. So even if 628 00:33:03,640 --> 00:33:06,360 Speaker 5: we can dramatically increase how much, you know, one human 629 00:33:06,440 --> 00:33:09,160 Speaker 5: can do, that doesn't necessarily mean you're going to do 630 00:33:09,160 --> 00:33:11,480 Speaker 5: a fixed amount of stuff. There's an appetite to have 631 00:33:11,560 --> 00:33:13,360 Speaker 5: even more. So we can continue to 632 00:33:13,360 --> 00:33:16,440 Speaker 5: grow the pie. So I think, at least certainly 633 00:33:16,440 --> 00:33:18,080 Speaker 5: in the near term, you know, we're going to see 634 00:33:18,080 --> 00:33:19,960 Speaker 5: a lot of drudgery go away from work. We're going 635 00:33:20,000 --> 00:33:23,320 Speaker 5: to see people be able to be more effective at 636 00:33:23,320 --> 00:33:26,480 Speaker 5: their jobs. You know, we will see some transformation in 637 00:33:27,120 --> 00:33:32,200 Speaker 5: jobs. But we've seen that before, and the 638 00:33:32,240 --> 00:33:34,520 Speaker 5: technology at least has the potential to make our lives 639 00:33:34,520 --> 00:33:35,200 Speaker 5: a lot easier. 640 00:33:36,440 --> 00:33:41,400 Speaker 3: So IBM recently launched watsonx, which includes watsonx 641 00:33:41,520 --> 00:33:44,440 Speaker 3: dot ai. Tell me about that. Tell me about, you 642 00:33:44,480 --> 00:33:46,520 Speaker 3: know, what it is and the new possibilities that it 643 00:33:46,600 --> 00:33:47,160 Speaker 3: opens up. 644 00:33:48,040 --> 00:33:48,280 Speaker 2: Yeah. 645 00:33:48,360 --> 00:33:52,360 Speaker 5: So, so watsonx is obviously a bit of a 646 00:33:52,640 --> 00:33:56,640 Speaker 5: new branding on the Watson brand. You know, T.J. Watson 647 00:33:56,680 --> 00:34:00,680 Speaker 5: was the founder of IBM, and our technologies have carried 648 00:34:01,080 --> 00:34:05,680 Speaker 5: the Watson brand. Watsonx is a recognition that there's 649 00:34:05,720 --> 00:34:08,359 Speaker 5: something new, there's something that actually has changed the game. 650 00:34:09,000 --> 00:34:12,600 Speaker 5: We've gone from this old world where automation is too 651 00:34:12,760 --> 00:34:16,319 Speaker 5: labor intensive to this new world of possibilities where it's 652 00:34:16,360 --> 00:34:20,680 Speaker 5: much easier to use AI. And what watsonx does is 653 00:34:20,880 --> 00:34:25,280 Speaker 5: bring together tools for businesses to harness that power. 654 00:34:25,719 --> 00:34:30,080 Speaker 5: So watsonx dot ai has foundation models that our customers 655 00:34:30,120 --> 00:34:33,320 Speaker 5: can use. It includes tools that make it easy to run, 656 00:34:33,480 --> 00:34:37,759 Speaker 5: easy to deploy, easy to experiment. There's a watsonx dot 657 00:34:37,880 --> 00:34:41,560 Speaker 5: data component which allows you to sort of organize and 658 00:34:41,600 --> 00:34:43,920 Speaker 5: access your data.
So what we're really trying to 659 00:34:43,960 --> 00:34:48,120 Speaker 5: do is give our customers a cohesive set of tools 660 00:34:48,360 --> 00:34:52,120 Speaker 5: to harness the value of these technologies and at the 661 00:34:52,120 --> 00:34:55,120 Speaker 5: same time be able to manage the risks and other 662 00:34:55,160 --> 00:34:57,120 Speaker 5: things that you have to keep an eye on in 663 00:34:57,200 --> 00:34:58,360 Speaker 5: an enterprise context. 664 00:35:00,080 --> 00:35:02,719 Speaker 3: So we talk about the guests on this show as 665 00:35:03,239 --> 00:35:07,319 Speaker 3: new creators, by which we mean people who are creatively 666 00:35:07,360 --> 00:35:12,239 Speaker 3: applying technology in business to drive change. And I'm curious 667 00:35:12,760 --> 00:35:17,480 Speaker 3: how creativity plays a role in the research that you do. 668 00:35:18,080 --> 00:35:22,640 Speaker 5: Honestly, I think the creative aspects of this job 669 00:35:23,120 --> 00:35:26,400 Speaker 5: are what make this work exciting. You know, I 670 00:35:26,400 --> 00:35:28,360 Speaker 5: should say, you know, the folks who work in my 671 00:35:28,480 --> 00:35:31,560 Speaker 5: organization are doing the creating, and I. 672 00:35:31,520 --> 00:35:35,080 Speaker 3: Guess you're doing the managing so that they can do 673 00:35:35,160 --> 00:35:35,640 Speaker 3: the creating. 674 00:35:36,520 --> 00:35:39,960 Speaker 5: I'm helping them be their best, and I still get 675 00:35:39,960 --> 00:35:42,880 Speaker 5: to get involved in the weeds of the research as 676 00:35:42,920 --> 00:35:45,719 Speaker 5: much as I can. But, you know, there's something really 677 00:35:45,719 --> 00:35:49,600 Speaker 5: exciting about inventing. You know, like one of the nice 678 00:35:49,600 --> 00:35:53,840 Speaker 5: things about doing invention and doing research on AI in industry: 679 00:35:54,200 --> 00:35:57,160 Speaker 5: it's usually grounded in a real problem that somebody's having. 680 00:35:57,200 --> 00:35:59,600 Speaker 5: You know, a customer wants to solve this problem that's 681 00:36:00,400 --> 00:36:02,960 Speaker 5: losing money, or there would be a new opportunity. 682 00:36:03,280 --> 00:36:07,080 Speaker 5: You identify that problem and then you build something 683 00:36:07,440 --> 00:36:09,640 Speaker 5: that's never been built before to do that. And I 684 00:36:09,719 --> 00:36:13,520 Speaker 5: think that's honestly the adrenaline rush that keeps all of 685 00:36:13,600 --> 00:36:16,200 Speaker 5: us in this field.
How do you do something that 686 00:36:16,320 --> 00:36:20,120 Speaker 5: nobody else on earth has done before or tried before? 687 00:36:20,560 --> 00:36:23,920 Speaker 5: So there's that kind of creativity, and there's also creativity 688 00:36:23,920 --> 00:36:26,719 Speaker 5: in identifying what those problems are, being able 689 00:36:26,719 --> 00:36:32,200 Speaker 5: to understand the places where, you know, the technology is 690 00:36:32,239 --> 00:36:35,520 Speaker 5: close enough to solving a problem, and doing that matchmaking 691 00:36:35,880 --> 00:36:39,560 Speaker 5: between problems that are now solvable, you know, and in 692 00:36:39,640 --> 00:36:43,200 Speaker 5: AI, where the field's moving so fast, this constantly growing 693 00:36:43,280 --> 00:36:46,360 Speaker 5: horizon of things that we might be able to solve. 694 00:36:46,640 --> 00:36:49,480 Speaker 5: So that matchmaking, I think, is also a really interesting 695 00:36:49,520 --> 00:36:53,160 Speaker 5: creative problem. So I think that's 696 00:36:53,200 --> 00:36:56,120 Speaker 5: why it's so much fun. And it's a fun environment 697 00:36:56,200 --> 00:36:58,600 Speaker 5: we have here too. It's, you know, people drawing on 698 00:36:58,640 --> 00:37:03,080 Speaker 5: whiteboards and writing on pages of math and you. 699 00:37:03,120 --> 00:37:05,640 Speaker 2: Know, like in a movie, like in a movie. 700 00:37:05,560 --> 00:37:08,400 Speaker 5: Yes, straight from central casting. 701 00:37:07,680 --> 00:37:09,680 Speaker 3: The drawing on the window, writing on the window in. 702 00:37:09,640 --> 00:37:13,520 Speaker 2: Sharpie, absolutely so. 703 00:37:13,520 --> 00:37:18,200 Speaker 3: So let's close with the really long view. How do 704 00:37:18,239 --> 00:37:23,480 Speaker 3: you imagine AI and people working together twenty years from now? 705 00:37:25,560 --> 00:37:28,680 Speaker 5: Yeah, it's really hard to make predictions. 706 00:37:28,960 --> 00:37:30,799 Speaker 2: The vision that I. 707 00:37:32,440 --> 00:37:38,279 Speaker 5: Like, actually, this came from an MIT economist named David Autor, 708 00:37:38,520 --> 00:37:44,320 Speaker 5: which was: imagine AI almost as a natural resource. Yeah, 709 00:37:44,719 --> 00:37:47,719 Speaker 5: we know how natural resources work, right? Like, there's an 710 00:37:47,880 --> 00:37:49,480 Speaker 5: ore we can dig up out of the earth, or it 711 00:37:49,520 --> 00:37:52,799 Speaker 5: comes from springs from the earth. We usually think 712 00:37:52,840 --> 00:37:55,799 Speaker 5: of that in terms of physical stuff. With AI, you 713 00:37:55,800 --> 00:37:57,239 Speaker 5: can almost think of it as like there's a new 714 00:37:57,320 --> 00:38:00,560 Speaker 5: kind of abundance. Potentially, twenty years from now, not 715 00:38:00,600 --> 00:38:02,920 Speaker 5: only do we have things we can build or eat 716 00:38:03,000 --> 00:38:05,839 Speaker 5: or use or burn or whatever. Now we have, you know, 717 00:38:05,960 --> 00:38:08,520 Speaker 5: this ability to do things and understand things and do 718 00:38:08,600 --> 00:38:11,759 Speaker 5: intellectual work, and I think we can get to a 719 00:38:11,840 --> 00:38:17,040 Speaker 5: world where automating things is just seamless. We're surrounded by 720 00:38:17,320 --> 00:38:22,520 Speaker 5: the capability to augment ourselves to get things done.
And you 721 00:38:22,560 --> 00:38:24,560 Speaker 5: could think of that in terms of like, oh, that's 722 00:38:24,600 --> 00:38:26,960 Speaker 5: going to displace our jobs, because eventually the AI system 723 00:38:27,040 --> 00:38:29,160 Speaker 5: is going to do everything we can do. But you 724 00:38:29,200 --> 00:38:31,239 Speaker 5: could also think of it in terms of, like, wow, 725 00:38:31,320 --> 00:38:33,640 Speaker 5: that's just so much abundance that we now have, and 726 00:38:33,680 --> 00:38:36,879 Speaker 5: really how we use that abundance is sort of up 727 00:38:36,920 --> 00:38:39,520 Speaker 5: to us. You know, like when writing software 728 00:38:39,600 --> 00:38:41,920 Speaker 5: is super easy and fast and anybody can do it, 729 00:38:42,360 --> 00:38:44,160 Speaker 5: just think about all the things you can do now, 730 00:38:44,760 --> 00:38:46,880 Speaker 5: think about all the new activities, and all 731 00:38:46,880 --> 00:38:49,040 Speaker 5: the ways we could use that to enrich our lives. 732 00:38:49,480 --> 00:38:52,520 Speaker 5: That's where I'd like to see us in twenty years. 733 00:38:52,560 --> 00:38:55,120 Speaker 5: You know, we can do just so much 734 00:38:55,280 --> 00:38:58,560 Speaker 5: more than we were able to do before. Abundance. 735 00:38:59,360 --> 00:39:02,160 Speaker 2: Great, thank you so much for your time. 736 00:39:02,920 --> 00:39:04,920 Speaker 5: Yeah, it's been a pleasure. Thanks for inviting me. 737 00:39:06,440 --> 00:39:10,839 Speaker 4: What a far-ranging, deep conversation. I'm mesmerized by the vision 738 00:39:10,880 --> 00:39:15,040 Speaker 4: David just described: a world where natural conversation between mankind 739 00:39:15,080 --> 00:39:20,239 Speaker 4: and machine can generate creative solutions to our most complex problems. 740 00:39:20,560 --> 00:39:24,120 Speaker 4: A world where we view AI not as our replacements, 741 00:39:24,719 --> 00:39:27,799 Speaker 4: but as a powerful resource we can tap into to 742 00:39:27,920 --> 00:39:33,200 Speaker 4: exponentially boost our innovation and productivity. Thanks so much to 743 00:39:33,239 --> 00:39:36,600 Speaker 4: doctor David Cox for joining us on Smart Talks. We 744 00:39:36,680 --> 00:39:40,640 Speaker 4: deeply appreciate him sharing his huge breadth of AI knowledge 745 00:39:40,680 --> 00:39:44,880 Speaker 4: with us and for explaining the transformative potential of foundation 746 00:39:45,040 --> 00:39:48,440 Speaker 4: models in a way that even I can understand. We 747 00:39:48,520 --> 00:39:53,240 Speaker 4: eagerly await his next great breakthrough. Smart Talks with IBM 748 00:39:53,320 --> 00:39:57,480 Speaker 4: is produced by Matt Romano, David jaw nishe Venkat and 749 00:39:57,600 --> 00:40:02,480 Speaker 4: Royston Preserve with Jacob Goldstein. We're edited by Lydia Jane Kott. 750 00:40:02,800 --> 00:40:07,160 Speaker 4: Our engineers are Jason Gambrel, Sarah Buguier and Ben Holliday. 751 00:40:07,719 --> 00:40:13,000 Speaker 4: Theme song by Gramosco. Special thanks to Carli Megliori, Andy Kelly, 752 00:40:13,080 --> 00:40:17,080 Speaker 4: Kathy Callahan and the eight Bar and IBM teams, as 753 00:40:17,080 --> 00:40:20,920 Speaker 4: well as the Pushkin marketing team. Smart Talks with IBM 754 00:40:21,239 --> 00:40:25,120 Speaker 4: is a production of Pushkin Industries and iHeartMedia.
To find 755 00:40:25,320 --> 00:40:29,760 Speaker 4: more Pushkin podcasts, listen on the iHeartRadio app, Apple Podcasts, 756 00:40:29,880 --> 00:40:34,400 Speaker 4: or wherever you listen to podcasts. I'm Malcolm Gladwell. This 757 00:40:34,680 --> 00:40:49,960 Speaker 4: is a paid advertisement from IBM.