Speaker 1: Hey everyone, it's Robert and Joe here. Today we've got something a little bit different to share with you. It is a new season of the Smart Talks with IBM podcast series.

Speaker 2: Today we are witness to one of those rare moments in history, the rise of an innovative technology with the potential to radically transform business and society forever. The technology, of course, is artificial intelligence, and it's the central focus for this new season of Smart Talks with IBM.

Speaker 1: Join hosts from your favorite Pushkin podcasts as they talk with industry experts and leaders to explore how businesses can integrate AI into their workflows and help drive real change in this new era of AI. And of course, host Malcolm Gladwell will be there to guide you through the season and throw in his two cents as well.

Speaker 2: Look out for new episodes of Smart Talks with IBM every other week on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts, and learn more at IBM.com/smarttalks.

Speaker 3: Hello, hello. Welcome to Smart Talks with IBM, a podcast from Pushkin Industries, iHeartRadio and IBM. I'm Malcolm Gladwell. This season, we're continuing our conversation with new creators: visionaries who are creatively applying technology in business to drive change, but with a focus on the transformative power of artificial intelligence and what it means to leverage AI as a game-changing multiplier for your business. Our guest today is Doctor David Cox, VP of AI Models at IBM Research and IBM Director of the MIT-IBM Watson AI Lab, a first-of-its-kind industry-academic collaboration between IBM and MIT focused on the fundamental research of artificial intelligence. Over the course of decades, David Cox watched as the AI revolution steadily grew from the simmering ideas of a few academics and technologists into the industrial boom we are experiencing today.
Speaker 3: Having dedicated his life to pushing the field of AI towards new horizons, David has both contributed to and presided over many of the major breakthroughs in artificial intelligence. In today's episode, you'll hear David explain some of the conceptual underpinnings of the current AI landscape, things like foundation models, in surprisingly comprehensible terms, I might add. We'll also get into some of the amazing practical applications for AI in business, as well as what implications AI will have for the future of work and design. David spoke with Jacob Goldstein, host of the Pushkin podcast What's Your Problem. A veteran business journalist, Jacob has reported for The Wall Street Journal and the Miami Herald, and was a longtime host of the NPR program Planet Money. Okay, let's get to the interview.

Speaker 4: Tell me about your job at IBM.

Speaker 5: So, I wear two hats at IBM. One, I'm the IBM Director of the MIT-IBM Watson AI Lab. That's a joint lab between IBM and MIT where we try and invent what's next in AI. It's been running for about five years. And then more recently I started as the Vice President for AI Models, and I'm in charge of building IBM's foundation models, you know, building these big models, generative models that allow us to have all kinds of new exciting capabilities in AI.

Speaker 4: So I want to talk to you a lot about foundation models, about generative AI. But before we get to that, let's just spend a minute on the IBM-MIT collaboration. Where did that partnership start? How did it originate?

Speaker 5: Yeah, so actually it turns out that MIT and IBM have been collaborating for a very long time in the area of AI. In fact, the term artificial intelligence was coined at a nineteen fifty six workshop that was held at Dartmouth, but it was actually organized by an IBMer, Nathaniel Rochester, who led the development of the IBM seven oh one.
Speaker 5: So we've really been together in AI since the beginning, and as AI kept accelerating more and more and more, I think there was a really interesting decision to say, let's make this a formal partnership. So IBM in twenty seventeen committed close to a quarter billion dollars over ten years to have this joint lab with MIT, and we located ourselves right on the campus, and we've been developing very, very deep relationships where we can really get to know each other, work shoulder to shoulder, conceiving what we should work on next, and then executing the projects. And really, very few entities like this exist between academia and industry. It's been really fun the last five years to be a part of it.

Speaker 4: And what do you think are some of the most important outcomes of this collaboration between IBM and MIT?

Speaker 5: Yeah, so we're really kind of the tip of the spear for IBM's AI strategy. We're really looking at, you know, what's coming ahead, in areas like foundation models. You know, as the field changes, people at MIT, you know, faculty, students and staff, are interested in working on what's the latest thing, what's the next thing. We at IBM Research are very much interested in the same. So we can kind of put out feelers, you know, interesting things that we're seeing in our research, interesting things we're hearing in the field, and we can go and chase those opportunities. So when something big comes, like the big change that's been happening lately with foundation models, we're ready to jump on it. That's really the purpose, that's the lab functioning the way it should.
Speaker 5: We're also really interested in how do we advance, you know, AI that can help with climate change, or, you know, build better materials, and all these kinds of things that are, you know, a broader aperture sometimes than what we might consider just looking at the product portfolio of IBM. And that gives us, again, a breadth where we can see connections that we might not have seen otherwise. We can, you know, do things that help out society and also help out our customers.

Speaker 4: So in the last, whatever, six months, say, there has been this wild rise in the public's interest in AI, right, clearly coming out of these generative AI models that are really accessible, you know, certainly ChatGPT, language models like that, as well as models that generate images, like Midjourney. I mean, can you just sort of briefly talk about the breakthroughs in AI that have made this moment feel so exciting, so revolutionary for artificial intelligence?

Speaker 5: Yeah. You know, I've been studying AI basically my entire adult life. Before I came to IBM, I was a professor at Harvard. I've been doing this a long time, and I've gotten used to being surprised. It sounds like a joke, but it's serious. Like, I'm getting used to being surprised at the acceleration of the pace. Again, it tracks actually a long way back. You know, there are lots of things where there was an idea that just simmered for a really long time. Some of the key math behind the stuff that we have today, which is amazing: there's an algorithm called backpropagation, which is sort of key to training neural networks, that's been around, you know, since the eighties in wide use. And really what happened was it simmered for a long time, and then enough data and enough compute came. So we had enough data because, you know, we all started carrying multiple cameras around with us.
Speaker 5: Our mobile phones have, you know, all these cameras, and we put everything on the Internet, and there's all this data out there. We caught a lucky break that there was something called a graphics processing unit, which turns out to be really useful for doing these kinds of algorithms, maybe even more useful than it is for doing graphics. They're great at graphics too. And things just kept kind of adding to the snowball. So we had deep learning, which is sort of a rebrand of the neural networks that I mentioned from the eighties, and that was enabled again by data, because we digitized the world, and by compute, because we kept building faster and faster and more powerful computers, and then that allowed us to make this big breakthrough. And then, you know, more recently, using the same building blocks, that inexorable rise of more and more and more data met a technology called self-supervised learning. The key difference there is that traditional deep learning, you know, for classifying images, like is this a cat or is this a dog in a picture, requires supervision, so you have to take what you have and then you have to label it. So you have to take a picture of a cat, and then you label it as a cat. And it turns out that, you know, that's very powerful, but it takes a lot of time to label cats and to label dogs, and there's only so many labels that exist in the world. So what really changed more recently is that we have self-supervised learning, where you don't have to have the labels. We can just take unannotated data. And what that does is it lets you use even more data. And that's really what drove this latest sort of rage. And then all of a sudden we started getting these really powerful models. And really these have been simmering technologies, right? This has been happening for a while and progressively getting more and more powerful.
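To make that distinction concrete, here is a toy sketch in Python, an illustration only and not anything from IBM's or MIT's systems: supervised learning needs a human-written label for every example, while self-supervised learning manufactures its own targets from raw, unannotated text by hiding pieces of it and asking the model to predict them.

# Toy illustration of supervised vs. self-supervised training data.

# Supervised setup: a person had to attach a label to each example.
supervised_example = {"input": "photo_0001.jpg", "label": "cat"}  # hypothetical labeled image

# Self-supervised setup: targets come for free from the raw text itself,
# by masking one token and treating the original token as the answer.
raw_text = "the cat sat on the mat"   # unannotated data, no human labels
tokens = raw_text.split()

def make_masked_examples(tokens, mask="[MASK]"):
    examples = []
    for i, target in enumerate(tokens):
        masked = tokens.copy()
        masked[i] = mask
        examples.append({"input": " ".join(masked), "target": target})
    return examples

for example in make_masked_examples(tokens)[:3]:
    print(example)
# {'input': '[MASK] cat sat on the mat', 'target': 'the'}
# {'input': 'the [MASK] sat on the mat', 'target': 'cat'}
# {'input': 'the cat [MASK] on the mat', 'target': 'sat'}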
Speaker 5: One of the things that really happened with ChatGPT and technologies like Stable Diffusion and Midjourney was that they made it visible to the public. You know, you put it out there, the public can touch and feel it, and they're like, wow, not only is there palpable change, but wow, I can talk to this thing, wow, this thing can generate an image. Not only that, but everyone can touch and feel and try it. My kids can use some of these AI art generation technologies. And that's really just, you know, like a slingshot that propelled us into a different regime in terms of the public awareness of these technologies.

Speaker 4: You mentioned earlier in the conversation foundation models, and I want to talk a little bit about that. I mean, can you just tell me, you know, what are foundation models for AI, and why are they a big deal?

Speaker 5: Yeah. So this term foundation model was coined by a group at Stanford, and I think it's actually a really apt term, because remember I said, you know, one of the big things that unlocked this latest excitement was the fact that we could use large amounts of unannotated data. We could train a model without going through the painful effort of labeling each and every example. You still need to have your model do something you want it to do; you still need to tell it what you want to do. You can't just have a model that doesn't have any purpose. But what a foundation model does is provide a foundation, like a literal foundation. You can sort of stand on the shoulders of giants: you can have one of these massively trained models and then do a little bit on top. You know, you could use just a few examples of what you're looking for, and you can get what you want from the model. So just a little bit on top now gets you to results that used to take a huge amount of effort, you know, to get to from the ground up.
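That "little bit on top" can be as light as showing the model a handful of examples in the prompt and letting it infer the task. A minimal sketch of the pattern, assuming some text-generation endpoint is available; the generate call below is a hypothetical placeholder, not any specific product's API.

few_shot_examples = [
    ("The package arrived broken and support ignored me.", "negative"),
    ("Setup took two minutes and it works perfectly.", "positive"),
    ("Battery died after a week.", "negative"),
]

def build_prompt(examples, new_review):
    # Stack a few labeled examples above the new input so the
    # foundation model can infer the task from context alone.
    lines = ["Classify each product review as positive or negative.", ""]
    for review, label in examples:
        lines.append(f"Review: {review}\nSentiment: {label}\n")
    lines.append(f"Review: {new_review}\nSentiment:")
    return "\n".join(lines)

prompt = build_prompt(few_shot_examples, "Great value, my whole team uses it daily.")
print(prompt)
# sentiment = generate(prompt)   # hypothetical call to whatever model you host or rent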
Speaker 4: I was trying to think of an analogy for sort of foundation models versus what came before, and I don't know that I came up with a good one, but the best I could do was this, and I want you to tell me if it's plausible. It's like, before foundation models, you had these sort of single-use kitchen appliances. You could make a waffle iron if you wanted waffles, or you could make a toaster if you wanted to make toast. But a foundation model is like an oven with a range on top. So it's like this machine, and you could just cook anything with this machine.

Speaker 5: Yeah, that's a great analogy. They're very versatile. The other piece of it, too, is that they dramatically lower the effort that it takes to do something that you want to do. I used to say about the old world of AI, you know, the problem with automation is that it's too labor intensive, which sounds like I'm making a joke.

Speaker 4: Indeed, famously, if automation does one thing, it substitutes machines or computing power for labor, right? So what does it mean to say AI, or automation, is too labor intensive?

Speaker 5: It sounds like I'm making a joke, but I'm actually serious. What I mean is that the effort it took in the old regime to automate something was very, very high. If I need to go and curate all this data, collect all this data, and then carefully label all these examples, that labeling itself might be incredibly expensive and time-consuming. And we estimate anywhere between eighty to ninety percent of the effort it takes to field an AI solution is actually just spent on data. So that has some consequences, which is the threshold for bothering.
Speaker 5: You know, if you're only going to get a little bit of value back from something, are you going to go through this huge effort to curate all this data, and then, when it comes time to train the model, find the highly skilled people you need, who are expensive or hard to find in the labor market? You know, are you really going to do something that's just a little incremental thing? No, you're only going to do the highest-value things that clear that bar, because...

Speaker 4: You have to essentially build the whole machine from scratch, and there aren't many things where it's worth that much work to build a machine that's only going to do one narrow thing.

Speaker 5: That's right, and then you tackle the next problem and you basically have to start over. And, you know, there are some nuances here. Like for images, you can pre-train a model on some other task and change it around, so there are some examples of this kind of non-recurring cost in the old world too. But by and large, it's just a lot of effort. It's hard. It takes, you know, a high level of skill to implement. One analogy that I like is, you know, think about it as you have a river of data running through your company or your institution. Traditional AI solutions are kind of like building a dam on that river. You know, dams are very expensive things to build. They require highly specialized skills and lots of planning. And, you know, you're only going to put a dam on a river that's big enough that you're going to get enough energy out of it that it was worth your trouble; you're going to get a lot of value out of that dam if you have a river like that, you know, a river of data. But actually, the vast majority of the water, you know, in your kingdom isn't in that river.
Speaker 5: It's in puddles and creeks and babbling brooks. And, you know, there's a lot of value left on the table, because it's like, well, there's nothing you can do about it. It's just that that's too low value and it takes too much effort, so I'm just not going to do it. The return on investment just isn't there, so you just end up not automating things. It's too much of a pain. Now, what foundation models do is they say, well, actually, no, we can train a base model, a foundation that you can work on, and we don't care about specifying what the task is ahead of time. We just need to learn about the domain of data. So if we want to build something that can understand the English language, there's a ton of English-language text available out in the world. We can now train models on huge quantities of it, and it learns the structure, learns, you know, a good part of how language works, from all that unlabeled data. And then when you roll up with your task, you know, I want to solve this particular problem, you don't have to start from scratch. You're starting from a very, very, very high place. So that just gives you the ability to, you know, now all of a sudden, make everything accessible. All the puddles and creeks and babbling brooks, you know, those are all accessible now. And that's very exciting. It just changes the equation on what kinds of problems you could use AI to solve.

Speaker 4: And so foundation models basically mean that automating some new task is much less labor intensive. The sort of marginal effort to do some new automation thing is much lower, because you're building on top of the foundation model rather than starting from scratch.

Speaker 5: Absolutely.

Speaker 4: So that is like the exciting good news.
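One way to picture that lower marginal effort: keep the pretrained foundation model frozen as a feature extractor and fit only a small, cheap classifier on top, using a handful of labeled examples. A minimal sketch, assuming scikit-learn is installed; the embed function here is a trivial stand-in for a real pretrained encoder, not an actual foundation model.

from sklearn.linear_model import LogisticRegression

def embed(text):
    # Stand-in "foundation model" features; a real system would call a
    # pretrained encoder here instead of counting hand-picked words.
    words = text.lower().split()
    return [len(words),
            sum(w in {"refund", "broken", "late"} for w in words),
            sum(w in {"great", "fast", "love"} for w in words)]

texts = ["love it, fast shipping", "broken on arrival, want a refund",
         "great product", "late delivery and broken seal"]
labels = [1, 0, 1, 0]   # the tiny labeled set: the "little bit on top"

clf = LogisticRegression().fit([embed(t) for t in texts], labels)
print(clf.predict([embed("arrived late and broken")]))   # expected: [0], i.e. negative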
Speaker 4: I do feel like there's a little bit of a countervailing idea that's worth talking about here, and that is the idea that even though there are these foundation models that are really powerful, that are relatively easy to build on top of, it's still the case, right, that there is not some one-size-fits-all foundation model. So, you know, what does that mean, and why is that important to think about in this context?

Speaker 5: Yeah, so we believe very strongly that there isn't just one model to rule them all. There are a number of reasons why that could be true. One, which I think is important and very relevant today, is how much energy these models can consume. These models, you know, can get very, very large. So one thing that we're starting to see, or starting to believe, is that you probably shouldn't use one giant sledgehammer model to solve every single problem. You know, we should pick the right size model to solve the problem. We shouldn't necessarily assume that we need the biggest, baddest model for every little use case. And we're also seeing that, you know, small models that are trained to specialize on particular domains can actually outperform much bigger models. So bigger isn't always even better.

Speaker 4: So they're more efficient, and they do the thing you want them to do better as well.

Speaker 5: That's right. So Stanford, for instance, a group at Stanford trained a model. It's a two point seven billion parameter model, which isn't terribly big by today's standards. They trained it just on the biomedical literature, you know, this is the kind of thing that universities do. And what they showed was that this model was better at answering questions about the biomedical literature than some models that are one hundred billion parameters, you know, many times larger. So it's a little bit like, you know, asking an expert for help on something versus asking the smartest person you know.
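As a rough aside on what that size gap means in practice, here is a back-of-the-envelope sketch. It assumes roughly two bytes per parameter (16-bit weights) and ignores optimizer state, activations, and serving overhead, so treat the numbers as order-of-magnitude only.

def weight_memory_gb(n_params, bytes_per_param=2):
    # Memory needed just to store the model weights, in gigabytes.
    return n_params * bytes_per_param / 1e9

print(weight_memory_gb(2.7e9))   # ~5.4 GB for a 2.7-billion-parameter specialist model
print(weight_memory_gb(100e9))   # ~200 GB for a 100-billion-parameter generalist model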
Speaker 5: Yeah, the smartest person you know may be very smart, but they're not going to have that expertise. And then, as an added bonus, you know, this is now a much smaller model. It's much more efficient to run, you know, it's cheaper, so there are lots of different advantages there. So I think we're going to see a tension in the industry between vendors that say, hey, this is the one big model, and others that say, well, actually, you know, there are lots of different tools we can use that all have this nice quality that we outlined at the beginning, and we should really pick the one that makes the most sense for the task at hand.

Speaker 4: So there's sustainability, basically efficiency. Another set of issues that comes up a lot with AI is bias and hallucination. Can you talk a little bit about bias and hallucination, what they are and how you're working to mitigate those problems?

Speaker 5: Yeah. So there are lots of issues still, as amazing as these technologies are, and they are amazing, let's be very clear; there are lots of great things we're going to enable with these kinds of technologies. Bias isn't a new problem. You know, basically we've seen this since the beginning of AI. If you train a model on data that has a bias in it, the model is going to recapitulate that bias when it provides its answers. So, you know, if all the text you have is more likely to refer to female nurses and male scientists, then you're going to, you know, get models that do the same. For instance, there was an example where a machine-learning-based translation system translated from Hungarian to English. Hungarian doesn't have gendered pronouns, English does, and when you asked it to translate, it would translate "they are a nurse" to "she is a nurse," and would translate "they are a scientist" to "he is a scientist."
Speaker 5: And that's not because the people who wrote the algorithm were building in bias and coding in, like, oh, it's got to be this way. It's because the data was like that. You know, we have biases in our society, and they're reflected in our data and our text and our images, everywhere. And then the models are just mapping from what they've seen in their training data to the result that you're trying to get them to give, and then these biases come out. So there's a very active program of research, you know, we do quite a bit at IBM Research and MIT, but also all over the community, in industry and academia, trying to figure out how do we explicitly remove these biases, how do we identify them, how do we build tools that allow people to audit their systems to make sure they aren't biased. So this is a really important thing. And, you know, again, this has been here since the beginning of machine learning and AI, but foundation models and large language models and generative AI just bring it into even sharper focus, because there's just so much data, and it's sort of building in, baking in, all these different biases we have. So that's absolutely a problem that these models have. Another one that you mentioned was hallucinations. So even the most impressive of our models will often just make stuff up. You know, the technical term the field has chosen is hallucination. To give you an example, I asked ChatGPT to create a biography of David Cox, IBM, and, you know, it started off really well. It identified that I was the director of the MIT-IBM Watson AI Lab and said a few words about that, and then it proceeded to create an authoritative but completely fake biography of me, where I was British, I was born in the UK, I went to universities in the UK, I was a professor at a university there.
Speaker 4: Right, it's the certainty that is weird about it, right? It's dead certain that you're from the UK, et cetera.

Speaker 5: Absolutely, yeah. It had all kinds of flourishes, like I won awards in the UK. So yeah, it's problematic, because it kind of pokes at a lot of weak spots in our human psychology, where if something sounds coherent, we're likely to assume it's true. We're not used to interacting with people who eloquently and authoritatively, you know, emit complete nonsense. Like, yeah, you know, we can debate about that, but...

Speaker 4: Yeah, we could debate about that. But yes, the sort of blithe confidence throws you off when you realize it's completely wrong.

Speaker 5: Right, that's right. And we do have a little bit of a "great and powerful Oz" sort of vibe going sometimes, where we're like, well, you know, the AI is all-knowing and therefore whatever it says must be true. But these things will make up stuff, you know, very aggressively. And, you know, everyone can try asking it for their bio. You'll always get something that's of the right form, that has the right tone, but, you know, the facts just aren't necessarily there. So that's obviously a problem. We need to figure out how to close those gaps, fix those problems. Then there are lots of ways we can use them much more easily.

Speaker 3: I'd just like to say, faced with the awesome potential of what these technologies might do, it's a bit encouraging to hear that even ChatGPT has a weakness for inventing flamboyant, if fictional, versions of people's lives. And while entertaining ourselves with ChatGPT and Midjourney is important, the way lay people use consumer-facing chatbots and generative AI is just fundamentally different from the way an enterprise business uses AI. How can we harness the abilities of artificial intelligence to help us solve the problems we face in business and technology?
Speaker 3: Let's listen on as David and Jacob continue their conversation.

Speaker 4: We've been talking in a somewhat abstract way about AI and the ways it can be used. Let's talk in a little bit more of a specific way. Can you just talk about some examples of business challenges that can be solved with automation, with this kind of automation we're talking about?

Speaker 5: Yeah, so really, the sky's the limit. There's a whole set of different applications that these models are really good at, and basically it's a superset of everything we used to use AI for in business. So, you know, the simple kinds of things are like, hey, if I have text, if I have product reviews and I want to be able to tell if these are positive or negative, you know, like, let's look at all the negative reviews so we can have a human look through them and see what was up. Very common business use case. You can do it with traditional deep-learning-based AI. So there are things like that that are, you know, very prosaic; we were already doing that, we've been doing it for a long time. Then you get situations that were harder for the old way. Like, say I want to compress something. Say I have a chat transcript: a customer called in and they had a complaint, and then they call back. Okay, now a new person on the line needs to go read the old transcript to catch up. Wouldn't it be better if we could just summarize that, just condense it all down to a quick little paragraph, you know, customer called, they're upset about this, rather than having to read the blow-by-blow? There are just lots of settings like that where summarization is really helpful. Hey, you have a meeting, and I'd like to just automatically, you know, take that meeting or that email or whatever.
Speaker 5: I'd like to just have it condensed down so I can really quickly get to the heart of the matter. These models are really good at doing that. They're also really good at question answering. So if I want to find out how many vacation days I have, I can now interact in natural language with a system that has access to our HR policies, and I can actually have a, you know, multi-turn conversation, like I would have with, you know, an actual HR professional or customer service representative. So a big part of what this is doing is it's putting an interface on things. You know, when we think of computer interfaces, we're usually thinking about UI, user interface elements, where I click on menus and there are buttons and all this stuff. Increasingly, now we can just talk. You know, just in words, you can describe what you want: you want to ask a question, you want to sort of command the system to do something. Rather than having to learn how to do that by clicking buttons, which might be inefficient, now we can just sort of spell it out.

Speaker 4: Interesting, right, the graphical user interface that we all sort of default to, that's not like the state of nature, right? That's a thing that was invented and just came to be the standard way that we interact with computers. And so you could imagine, as you're saying, chat, essentially chatting with the machine, could become a sort of standard user interface, just like the graphical user interface did, you know, over the past several decades.

Speaker 5: Absolutely. And I think those kinds of conversational interfaces are going to be hugely important for increasing our productivity. It's just a lot easier if I don't have to learn how to use a tool, or I don't have to kind of have awkward, you know, interactions with the computer.
Speaker 5: I can just tell it what I want, and it can understand me, potentially, you know, ask questions back to clarify, and have those kinds of conversations. That can be extremely powerful. And in fact, one area where that's going to, I think, be absolutely game changing is in code. When we write code, you know, programming languages are a way for us to sort of match between our very sloppy way of talking and the very exact way that you need to command a computer to do what you want it to do. They're cumbersome to learn, and they can, you know, create very complex systems that are very hard to reason about. And we're already starting to see the ability to just write down what you want and the AI will generate the code for you. And I think we're just going to see a huge revolution where we just converse, you know, we can have a conversation to say what we want, and then the computer can actually not only do fixed actions and do things for us, but it can actually even write code to do new things, you know, and generate software itself. Given how much software we have, how much craving we have for software, like, we'll never have enough software in our world, you know, the ability to have AI systems as a helper in that, I think we're going to see a lot of value there.
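The summarization and question-answering patterns David describes above already map onto off-the-shelf open-source tooling. A minimal sketch using the Hugging Face transformers library, which is just a convenient assumption here, not anything specific to IBM's products; the default models are downloaded on first run and exact outputs will vary.

from transformers import pipeline

transcript = (
    "Customer called about order 4521. The package arrived two weeks late "
    "and the replacement part was missing. The agent promised a refund within "
    "five business days and a follow-up email with tracking details."
)

# Condense the blow-by-blow into a short summary for the next agent.
summarizer = pipeline("summarization")
print(summarizer(transcript, max_length=40, min_length=10)[0]["summary_text"])

# Ask a natural-language question against the same text.
qa = pipeline("question-answering")
print(qa(question="What was the customer promised?", context=transcript)["answer"])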
Speaker 4: So if you think about the different ways AI might be applied to business, I mean, you've talked about a number of the sort of classic use cases. What are some of the more out-there use cases? What are some, you know, unique ways you could imagine AI being applied to business?

Speaker 5: Yeah, there, really the sky's the limit. I mean, we have one project that I'm kind of a fan of, where we actually were working with a mechanical engineering professor at MIT on a classic problem: how do you build linkage systems, which are, you can imagine, bars and joints, you know, the things that are...

Speaker 4: Building a thing, building a physical machine of some kind.

Speaker 5: Like real, like metal, and, you know, nineteenth century, just old-school Industrial Revolution. Yeah, but, you know, the little arm that's holding up my microphone in front of me, cranes that build your buildings, you know, parts of your engines. This is like classical stuff. It turns out that, you know, for humans, if you want to build an advanced system, you decide what, like, curve you want to create, and then a human, together with a computer program, can build a five- or six-bar linkage. And that's kind of where you top out, because it gets too complicated to work with more than that. We built a generative AI system that can build twenty-bar linkages, like arbitrarily complex. So these are machines that are beyond the capability of a human to design themselves. Another example: we have an AI system that can generate electronic circuits. You know, we had a project where we were building better power converters, which allow our computers and our devices to be more efficient, save energy, you know, use less carbon. But I think the world around us has always been shaped by technology. If you look around, you know, just think about how many steps and how many people and how many designs went into the table and the chair and the lamp. It's really just astonishing. And that's already, you know, the fruit of automation and computers and those kinds of tools. But we're going to see that increasingly also be a product of AI. It's just going to be everywhere around us. Everything we touch is going to have, you know, been helped in some way to get to you by it.
578 00:30:19,240 --> 00:30:22,160 Speaker 4: That is a pretty profound transformation that you're talking about 579 00:30:22,200 --> 00:30:25,040 Speaker 4: in business. How do you think about the implications of 580 00:30:25,080 --> 00:30:28,600 Speaker 4: that, both for the sort of, you know, business itself, 581 00:30:29,000 --> 00:30:30,760 Speaker 4: and also for employees? 582 00:30:32,520 --> 00:30:36,640 Speaker 5: Yeah, so I think for businesses this is gonna cut costs, 583 00:30:36,920 --> 00:30:40,800 Speaker 5: make new opportunities for, like, customers, you know, 584 00:30:40,840 --> 00:30:43,640 Speaker 5: it's sort of all upside, right? Like 585 00:30:44,480 --> 00:30:46,480 Speaker 5: for the workers, I think the story is 586 00:30:46,520 --> 00:30:49,400 Speaker 5: mostly good too. You know, like how many things do 587 00:30:49,520 --> 00:30:53,320 Speaker 5: you do in your day that you'd really rather not, right? 588 00:30:53,800 --> 00:30:55,800 Speaker 5: You know, and we're used to having things we don't 589 00:30:55,920 --> 00:30:59,200 Speaker 5: like automated away. You know, 590 00:30:59,160 --> 00:31:01,520 Speaker 5: if we didn't like walking many miles to work, 591 00:31:01,600 --> 00:31:03,480 Speaker 5: then, you know, you can have a car and 592 00:31:03,520 --> 00:31:05,720 Speaker 5: you can drive there. Or we used to have a 593 00:31:05,800 --> 00:31:08,720 Speaker 5: huge fraction, over ninety percent, of the US population engaged 594 00:31:08,720 --> 00:31:11,760 Speaker 5: in agriculture, and then we mechanized it. Now very few 595 00:31:11,760 --> 00:31:13,800 Speaker 5: people work in agriculture. A small number of people can 596 00:31:13,880 --> 00:31:16,120 Speaker 5: do the work of a large number of people. And 597 00:31:16,160 --> 00:31:18,680 Speaker 5: then, you know, things like email, they've 598 00:31:18,720 --> 00:31:21,360 Speaker 5: led to huge productivity enhancements because I don't need to 599 00:31:21,400 --> 00:31:23,720 Speaker 5: be writing letters and sending them in the mail. I 600 00:31:23,760 --> 00:31:28,160 Speaker 5: can just instantly communicate with people. We just become more effective. 601 00:31:28,320 --> 00:31:32,400 Speaker 5: Like, our jobs have transformed, whether it's a physical job 602 00:31:32,480 --> 00:31:35,360 Speaker 5: like agriculture, or whether it's a knowledge worker job where 603 00:31:35,360 --> 00:31:39,080 Speaker 5: you're sending emails and communicating with people and coordinating teams. 604 00:31:39,400 --> 00:31:42,040 Speaker 5: We've just gotten better, and, you know, the technology has 605 00:31:42,040 --> 00:31:45,560 Speaker 5: just made us more productive. And this is just another example. Now, 606 00:31:45,840 --> 00:31:47,960 Speaker 5: you know, there are people who worry that, you know, 607 00:31:48,640 --> 00:31:51,040 Speaker 5: we'll be so good at that that maybe jobs will 608 00:31:51,080 --> 00:31:54,760 Speaker 5: be displaced, and that's a legitimate concern. But just 609 00:31:54,880 --> 00:31:58,360 Speaker 5: like how in agriculture, you know, it's not like suddenly 610 00:31:58,400 --> 00:32:01,280 Speaker 5: we had ninety percent of the population unemployed. You know, 611 00:32:01,360 --> 00:32:05,640 Speaker 5: people transitioned to other jobs.
And the other thing 612 00:32:05,640 --> 00:32:09,200 Speaker 5: that we've found, too, is that our appetite as humans 613 00:32:09,240 --> 00:32:13,040 Speaker 5: for doing more things is sort of insatiable. So 614 00:32:13,320 --> 00:32:16,400 Speaker 5: even if we can dramatically increase how much, you know, 615 00:32:16,440 --> 00:32:19,560 Speaker 5: one human can do, that doesn't necessarily mean we're going 616 00:32:19,600 --> 00:32:21,920 Speaker 5: to do a fixed amount of stuff. There's an appetite 617 00:32:21,960 --> 00:32:23,479 Speaker 5: to have even more, so we're going to 618 00:32:23,480 --> 00:32:26,040 Speaker 5: continue to grow the pie. So I think at 619 00:32:26,160 --> 00:32:28,400 Speaker 5: least, certainly in the near term, you know, we're going 620 00:32:28,440 --> 00:32:30,280 Speaker 5: to see a lot of drudgery go away from work. 621 00:32:30,320 --> 00:32:32,600 Speaker 5: We're going to see people be able to be 622 00:32:32,640 --> 00:32:35,760 Speaker 5: more effective at their jobs. You know, we will see 623 00:32:35,760 --> 00:32:39,440 Speaker 5: some transformation in jobs and what they look like. But we've seen 624 00:32:39,480 --> 00:32:44,280 Speaker 5: that before, and the technology at least has the potential 625 00:32:44,480 --> 00:32:45,800 Speaker 5: to make our lives a lot easier. 626 00:32:47,040 --> 00:32:52,000 Speaker 4: So IBM recently launched watsonx, which includes 627 00:32:52,120 --> 00:32:55,080 Speaker 4: watsonx.ai. Tell me about that, tell me about, you 628 00:32:55,080 --> 00:32:57,160 Speaker 4: know, what it is and the new possibilities that it 629 00:32:57,200 --> 00:32:57,760 Speaker 4: opens up. 630 00:32:58,680 --> 00:33:02,240 Speaker 5: Yeah. So watsonx is obviously a bit 631 00:33:02,280 --> 00:33:07,240 Speaker 5: of a new branding on the Watson brand. T.J. Watson 632 00:33:07,280 --> 00:33:11,120 Speaker 5: was the founder of IBM, and our AI technologies 633 00:33:11,160 --> 00:33:15,080 Speaker 5: have had the Watson brand. Watsonx is a recognition 634 00:33:15,280 --> 00:33:18,600 Speaker 5: that there's something new, there's something that actually has changed 635 00:33:18,600 --> 00:33:22,920 Speaker 5: the game. We've gone from this old world where automation 636 00:33:23,120 --> 00:33:25,720 Speaker 5: is too labor-intensive to this new world of possibilities 637 00:33:26,280 --> 00:33:30,200 Speaker 5: where it's much easier to use AI. And what watsonx 638 00:33:30,400 --> 00:33:35,360 Speaker 5: does is bring together tools for businesses to harness 639 00:33:35,400 --> 00:33:40,239 Speaker 5: that power. So watsonx.ai includes foundation models that our 640 00:33:40,280 --> 00:33:43,440 Speaker 5: customers can use. It includes tools that make it easy 641 00:33:43,520 --> 00:33:47,480 Speaker 5: to run, easy to deploy, easy to experiment. There's a 642 00:33:47,520 --> 00:33:51,440 Speaker 5: watsonx.data component which allows you to sort of 643 00:33:51,600 --> 00:33:54,240 Speaker 5: organize and access your data. So what we're really 644 00:33:54,240 --> 00:33:58,000 Speaker 5: trying to do is give our customers a cohesive set 645 00:33:58,040 --> 00:34:02,640 Speaker 5: of tools to get the value of these technologies and at 646 00:34:02,680 --> 00:34:05,280 Speaker 5: the same time be able to manage the risks and 647 00:34:05,480 --> 00:34:07,479 Speaker 5: other things that you have to keep an eye on 648 00:34:07,680 --> 00:34:08,960 Speaker 5: in an enterprise context.
649 00:34:10,640 --> 00:34:13,359 Speaker 4: So we talk about the guests on this show as 650 00:34:13,840 --> 00:34:17,960 Speaker 4: new creators, by which we mean people who are creatively 651 00:34:18,000 --> 00:34:22,880 Speaker 4: applying technology in business to drive change. And I'm curious 652 00:34:23,360 --> 00:34:28,080 Speaker 4: how creativity plays a role in the research that you do. 653 00:34:28,680 --> 00:34:33,279 Speaker 5: Honestly, I think the creative aspects of this job 654 00:34:33,719 --> 00:34:37,040 Speaker 5: are what make this work exciting. You know, I 655 00:34:37,040 --> 00:34:38,960 Speaker 5: should say, you know, the folks who work at my 656 00:34:39,080 --> 00:34:42,200 Speaker 5: organization are doing the creating, and I. 657 00:34:42,120 --> 00:34:45,680 Speaker 4: Guess you're doing the managing so that they 658 00:34:45,760 --> 00:34:46,680 Speaker 4: can do the creating. 659 00:34:47,120 --> 00:34:50,560 Speaker 5: I'm helping them be their best, and I still get 660 00:34:50,560 --> 00:34:53,480 Speaker 5: to get involved in the weeds of the research as 661 00:34:53,560 --> 00:34:56,279 Speaker 5: much as I can. But you know, there's something really 662 00:34:56,320 --> 00:35:01,280 Speaker 5: exciting about inventing. You know, one of the nice things about doing 663 00:35:01,360 --> 00:35:05,360 Speaker 5: invention and doing research on AI in industry is it's usually 664 00:35:05,400 --> 00:35:07,960 Speaker 5: grounded in a real problem that somebody's having. You know, 665 00:35:08,000 --> 00:35:11,640 Speaker 5: a customer wants to solve this problem. They're losing money, 666 00:35:11,719 --> 00:35:14,480 Speaker 5: or there would be a new opportunity. You identify 667 00:35:14,560 --> 00:35:18,719 Speaker 5: that problem and then you build something that's never been 668 00:35:18,719 --> 00:35:21,680 Speaker 5: built before to do that. And I think that's honestly 669 00:35:21,760 --> 00:35:25,600 Speaker 5: the adrenaline rush that keeps all of us in this field. 670 00:35:25,760 --> 00:35:28,400 Speaker 5: How do you do something that nobody else on earth 671 00:35:28,560 --> 00:35:32,040 Speaker 5: has done before or tried before? So there's that kind 672 00:35:32,040 --> 00:35:35,520 Speaker 5: of creativity, and there's also creativity in identifying 673 00:35:35,520 --> 00:35:39,880 Speaker 5: what those problems are, being able to understand the places 674 00:35:40,520 --> 00:35:44,560 Speaker 5: where, you know, the technology is close enough to solving 675 00:35:44,560 --> 00:35:48,319 Speaker 5: a problem, and doing that matchmaking between problems that are 676 00:35:48,400 --> 00:35:51,080 Speaker 5: now solvable and AI, where the field 677 00:35:51,160 --> 00:35:55,279 Speaker 5: is moving so fast there's a constantly growing horizon of 678 00:35:55,400 --> 00:35:58,239 Speaker 5: things that we might be able to solve. So that matchmaking, 679 00:35:58,280 --> 00:36:02,000 Speaker 5: I think, is also a really interesting creative problem. So 680 00:36:02,280 --> 00:36:04,440 Speaker 5: I think that's why it's so 681 00:36:04,520 --> 00:36:07,640 Speaker 5: much fun. And it's a fun environment we have here too. 682 00:36:07,840 --> 00:36:11,120 Speaker 5: It's, you know, people drawing on whiteboards and writing out 683 00:36:11,239 --> 00:36:13,799 Speaker 5: pages of math and, you know.
684 00:36:13,719 --> 00:36:16,359 Speaker 4: Like in a movie, like in a movie, yeah, 685 00:36:16,440 --> 00:36:19,240 Speaker 4: straight from central casting, drawing on the window, 686 00:36:19,280 --> 00:36:24,640 Speaker 4: writing on the window in Sharpie, absolutely. So let's 687 00:36:24,640 --> 00:36:29,640 Speaker 4: close with the really long view. How do you imagine 688 00:36:29,840 --> 00:36:34,080 Speaker 4: AI and people working together twenty years from now? 689 00:36:36,160 --> 00:36:40,799 Speaker 5: Yeah, it's really hard to make predictions. The vision that 690 00:36:41,360 --> 00:36:47,759 Speaker 5: I like actually came from an MIT economist named 691 00:36:47,800 --> 00:36:53,719 Speaker 5: David Autor, which was: imagine AI almost as a natural resource. 692 00:36:54,680 --> 00:36:57,640 Speaker 5: You know, we know how natural resources work, right? 693 00:36:57,760 --> 00:36:59,440 Speaker 5: Like, there's ore we can dig up out of 694 00:36:59,480 --> 00:37:02,080 Speaker 5: the earth, it kind of springs from the earth, 695 00:37:02,200 --> 00:37:05,160 Speaker 5: and we usually think of that in terms of physical stuff. 696 00:37:05,800 --> 00:37:07,400 Speaker 5: With AI, you can almost think of it as, like, 697 00:37:07,440 --> 00:37:10,279 Speaker 5: there's a new kind of abundance potentially twenty years from 698 00:37:10,320 --> 00:37:12,960 Speaker 5: now, where not only can we have things we can 699 00:37:13,000 --> 00:37:15,600 Speaker 5: build or eat or use or burn or whatever, now 700 00:37:15,600 --> 00:37:18,160 Speaker 5: we have, you know, this ability to do things and 701 00:37:18,280 --> 00:37:21,520 Speaker 5: understand things and do intellectual work, and I think we 702 00:37:21,840 --> 00:37:25,880 Speaker 5: can get to a world where automating things is just seamless. 703 00:37:26,280 --> 00:37:31,520 Speaker 5: We're surrounded by capability to augment ourselves to get things done. 704 00:37:32,239 --> 00:37:35,000 Speaker 5: And you could think of that in terms of, like, oh, 705 00:37:35,040 --> 00:37:37,200 Speaker 5: that's going to displace our jobs, because eventually the AI 706 00:37:37,239 --> 00:37:39,319 Speaker 5: system is going to do everything we can do. But 707 00:37:39,680 --> 00:37:41,839 Speaker 5: you could also think of it in terms of, like, wow, 708 00:37:41,920 --> 00:37:44,239 Speaker 5: that's just so much abundance that we now have, and 709 00:37:44,280 --> 00:37:47,520 Speaker 5: really how we use that abundance is sort of up 710 00:37:47,560 --> 00:37:50,279 Speaker 5: to us. You know, like, say writing software is 711 00:37:50,280 --> 00:37:53,040 Speaker 5: super easy and fast, and anybody can do it. Just 712 00:37:53,080 --> 00:37:55,560 Speaker 5: think about all the things you can do now, think 713 00:37:55,560 --> 00:37:57,520 Speaker 5: about all the new activities, and all the 714 00:37:57,560 --> 00:38:00,319 Speaker 5: ways we could use that to enrich our lives. That's 715 00:38:00,320 --> 00:38:03,239 Speaker 5: where I'd like to see us in twenty years. We 716 00:38:03,239 --> 00:38:06,200 Speaker 5: can do just so much more 717 00:38:06,560 --> 00:38:09,160 Speaker 5: than we were able to do before. Abundance. 718 00:38:09,960 --> 00:38:12,800 Speaker 4: Great, thank you so much for your time. 719 00:38:13,520 --> 00:38:15,560 Speaker 5: Yeah, it's been a pleasure. Thanks for inviting me.
720 00:38:17,080 --> 00:38:21,160 Speaker 3: What a far-ranging, deep conversation. I'm mesmerized by the 721 00:38:21,200 --> 00:38:25,120 Speaker 3: vision David just described: a world where natural conversation between 722 00:38:25,120 --> 00:38:29,720 Speaker 3: mankind and machine can generate creative solutions to our most 723 00:38:29,760 --> 00:38:33,560 Speaker 3: complex problems, a world where we view AI not as 724 00:38:33,640 --> 00:38:37,680 Speaker 3: our replacements, but as a powerful resource we can tap 725 00:38:37,719 --> 00:38:43,200 Speaker 3: into to exponentially boost our innovation and productivity. Thanks so 726 00:38:43,280 --> 00:38:46,680 Speaker 3: much to doctor David Cox for joining us on Smart Talks. 727 00:38:47,120 --> 00:38:50,839 Speaker 3: We deeply appreciate him sharing his huge breadth of AI 728 00:38:50,920 --> 00:38:54,960 Speaker 3: knowledge with us and explaining the transformative potential of 729 00:38:55,000 --> 00:38:58,360 Speaker 3: foundation models in a way that even I can understand. 730 00:38:58,960 --> 00:39:03,440 Speaker 3: We eagerly await his next great breakthrough. Smart Talks with 731 00:39:03,480 --> 00:39:07,880 Speaker 3: IBM is produced by Matt Romano, David Jha, Nisha Venkat, 732 00:39:08,040 --> 00:39:12,480 Speaker 3: and Royston Beserve, with Jacob Goldstein. We're edited by Lydia 733 00:39:12,520 --> 00:39:16,839 Speaker 3: Jean Kott. Our engineers are Jason Gambrel, Sarah Buguer, and 734 00:39:16,920 --> 00:39:22,560 Speaker 3: Ben Holliday. Theme song by Gramoscope. Special thanks to Carly Migliori, 735 00:39:22,920 --> 00:39:27,040 Speaker 3: Andy Kelly, Kathy Callahan, and the 8 Bar and IBM teams, 736 00:39:27,520 --> 00:39:31,040 Speaker 3: as well as the Pushkin marketing team. Smart Talks with 737 00:39:31,120 --> 00:39:35,360 Speaker 3: IBM is a production of Pushkin Industries and iHeartMedia. To 738 00:39:35,440 --> 00:39:40,360 Speaker 3: find more Pushkin podcasts, listen on the iHeartRadio app, Apple Podcasts, 739 00:39:40,480 --> 00:39:45,000 Speaker 3: or wherever you listen to podcasts. I'm Malcolm Gladwell. This 740 00:39:45,320 --> 00:40:00,520 Speaker 3: is a paid advertisement from IBM.