Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio.

Speaker 2: Today, we are witness...

Speaker 1: ...to one of those rare moments in history: the rise of an innovative technology with the potential to radically transform business and society forever. That technology, of course, is artificial intelligence, and it's the central focus for this new season of Smart Talks with IBM. Join hosts from your favorite Pushkin podcasts as they talk with industry experts and leaders to explore how businesses can integrate AI into their workflows and help drive real change in this new era of AI. And of course, host Malcolm Gladwell will be there to guide you through the season and throw in his two cents as well. Look out for new episodes of Smart Talks with IBM every other week on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts, and learn more at ibm.com/smarttalks.

Speaker 3: Hello, hello. Welcome to Smart Talks with IBM, a podcast from Pushkin Industries, iHeartRadio, and IBM. I'm Malcolm Gladwell. This season, we're continuing our conversation with new creators: visionaries who are creatively applying technology in business to drive change, but with a focus on the transformative power of artificial intelligence and what it means to leverage AI as a game-changing multiplier for your business. Our guest today is Dr. David Cox, VP of AI Models at IBM Research and IBM Director of the MIT-IBM Watson AI Lab, a first-of-its-kind industry-academic collaboration between IBM and MIT focused on the fundamental research of artificial intelligence. Over the course of decades, David Cox watched as the AI revolution steadily grew from the simmering ideas of a few academics and technologists into the industrial boom we are experiencing today. Having dedicated his life to pushing the field of AI toward new horizons, David has both contributed to and presided over many of the major breakthroughs in artificial intelligence.
In today's episode, you'll hear David explain some of the conceptual underpinnings of the current AI landscape, things like foundation models, in surprisingly comprehensible terms, I might add. We'll also get into some of the amazing practical applications for AI in business, as well as what implications AI will have for the future of work and design. David spoke with Jacob Goldstein, host of the Pushkin podcast What's Your Problem. A veteran business journalist, Jacob has reported for The Wall Street Journal and the Miami Herald, and was a longtime host of the NPR program Planet Money. Okay, let's get to the interview.

Speaker 2: Tell me about your job at IBM.

Speaker 4: So I wear two hats at IBM. One, I'm the IBM Director of the MIT-IBM Watson AI Lab. That's a joint lab between IBM and MIT where we try and invent what's next in AI. It's been running for about five years. And then, more recently, I started as the vice president for AI Models, and I'm in charge of building IBM's foundation models, you know, building these big generative models that allow us to have all kinds of new, exciting capabilities in AI.

Speaker 2: So I want to talk to you a lot about foundation models, about generative AI. But before we get to that, let's just spend a minute on the IBM-MIT collaboration. Where did that partnership start? How did it originate?

Speaker 4: Yeah, so actually, it turns out that MIT and IBM have been collaborating for a very long time in the area of AI. In fact, the term artificial intelligence was coined at a nineteen fifty-six workshop that was held at Dartmouth. It was actually organized by an IBMer, Nathaniel Rochester, who led the development of the IBM 701. So we've really been together in AI since the beginning, and as AI kept accelerating more and more and more, I think there was a really interesting decision to say, let's make this a formal partnership.
So in twenty seventeen, IBM announced it would be committing close to a quarter billion dollars over ten years to have this joint lab with MIT. We located ourselves right on the campus, and we've been developing very, very deep relationships where we can really get to know each other, work shoulder to shoulder, conceiving what we should work on next and then executing the projects. Very few entities like this exist between academia and industry. It's been really fun the last five years to be a part of it.

Speaker 2: And what do you think are some of the most important outcomes of this collaboration between IBM and MIT?

Speaker 4: Yeah, so we're really kind of the tip of the spear for IBM's AI strategy. We're really looking at what's coming ahead, you know, in areas like foundation models. As the field changes, people at MIT, you know, faculty, students, and staff, are interested in working on what's the latest thing, what's the next thing, and we at IBM Research are very much interested in the same. So we can kind of put out feelers, you know, interesting things that we're seeing in our research, interesting things we're hearing in the field, and we can go and chase those opportunities. So when something big comes, like the big change that's been happening lately with foundation models, we're ready to jump on it. That's really the purpose; that's the lab functioning the way it should. We're also really interested in how we advance, you know, AI that can help with climate change, or build better materials, all these kinds of things that are a broader aperture sometimes than what we might consider just looking at the product portfolio of IBM. And that gives us, again, a breadth where we can see connections that we might not have seen otherwise. We can do things that help out society and also help out our customers.
Speaker 2: So over the last, whatever, six months, say, there has been this wild rise in the public's interest in AI, right, clearly coming out of these generative AI models that are really accessible, you know, certainly ChatGPT and language models like that, as well as models that generate images, like Midjourney. I mean, can you just sort of briefly talk about the breakthroughs in AI that have made this moment feel so exciting, so revolutionary for artificial intelligence?

Speaker 4: Yeah. You know, I've been studying AI basically my entire adult life. Before I came to IBM, I was a professor at Harvard. I've been doing this a long time, and I've gotten used to being surprised. It sounds like a joke, but it's serious, getting used to being surprised at the acceleration of the pace. It tracks back a long way, actually. You know, there are lots of things where there was an idea that just simmered for a really long time. Some of the key math behind the stuff that we have today, which is amazing: there's an algorithm called backpropagation, which is sort of key to training neural networks, that's been around in wide use, you know, since the eighties. And really what happened was it simmered for a long time, and then enough data and enough compute came. We had enough data because, you know, we all started carrying multiple cameras around with us, our mobile phones all have these cameras, and we put everything on the Internet, so there's all this data out there. We caught a lucky break that there was something called a graphics processing unit, which, you know, turns out to be really useful for doing these kinds of algorithms, maybe even more useful than it is for doing graphics. They're great for graphics too. And things just kept kind of adding to the snowball.

So we had deep learning, which is sort of a rebrand of the neural networks that I mentioned from the eighties, and that was enabled again by data, because we digitized the world, and by compute, because we kept building faster and more powerful computers. And that allowed us to make this big breakthrough. And then, more recently, using the same building blocks, that inexorable rise of more and more and more data met a technology called self-supervised learning. The key difference there is that traditional deep learning, you know, for classifying images, like is this a cat or is this a dog in a picture, requires supervision. So you have to take what you have and then you have to label it. You have to take a picture of a cat and then label it as a cat. And it turns out that, you know, that's very powerful, but it takes a lot of time to label cats and to label dogs, and there are only so many labels that exist in the world. So what really changed more recently is that we have self-supervised learning, where you don't have to have the labels; we can just take unannotated data. And what that does is it lets you use even more data, and that's really what drove this latest wave. And then all of a sudden, we start getting these really powerful models. And really, these have been simmering technologies, right? This has been happening for a while and progressively getting more and more powerful. One of the things that really happened with ChatGPT and technologies like Stable Diffusion and Midjourney was that they made it visible to the public. You know, you put it out there, the public can touch and feel it, and they're like, wow, not only is there palpable change, but I can talk to this thing, this thing can generate an image. Not only that, but everyone can touch and feel and try. My kids can use some of these AI art generation technologies.
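A minimal sketch of the distinction David is drawing, for readers who want it in concrete form: supervised learning needs a human-written label for every example, while self-supervised learning manufactures its own training signal from the raw data, for instance by predicting the next word of a sentence. The tiny corpus, labels, and helper function below are invented purely for illustration.

```python
# Supervised vs. self-supervised training data: a toy illustration.
# The sentences and labels are invented placeholders.

# Supervised learning: every example needs a human-provided label.
labeled_examples = [
    ("a photo of a small cat on a couch", "cat"),
    ("a photo of a dog catching a ball", "dog"),
]

# Self-supervised learning: the "label" is just the next word, so any
# unannotated text becomes training data for free.
corpus = "the quick brown fox jumps over the lazy dog".split()

def next_word_pairs(tokens):
    """Turn raw text into (context, target) pairs with no human labeling."""
    return [(tokens[:i], tokens[i]) for i in range(1, len(tokens))]

for context, target in next_word_pairs(corpus)[:3]:
    print(f"context={context} -> predict {target!r}")
# context=['the'] -> predict 'quick'
# context=['the', 'quick'] -> predict 'brown'
# context=['the', 'quick', 'brown'] -> predict 'fox'
```

The same trick scales from this toy corpus to the web-scale text that foundation models are trained on, which is why removing the labeling bottleneck mattered so much.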
Speaker 4: And that's really just launched us, you know. It's like a slingshot that has propelled us into a different regime in terms of the public awareness of these technologies.

Speaker 2: You mentioned foundation models earlier in the conversation, and I want to talk a little bit about that. I mean, can you just tell me, you know, what are foundation models for AI, and why are they a big deal?

Speaker 4: Yeah. So this term, foundation model, was coined by a group at Stanford, and I think it's actually a really apt term, because remember, I said one of the big things that unlocked this latest excitement was the fact that we could use large amounts of unannotated data. We could train a model without going through the painful effort of labeling each and every example. You still need to have your model do something you want it to do. You still need to tell it what you want; you can't just have a model that doesn't, you know, have any purpose. But what a foundation model does is provide a foundation, like a literal foundation. You can sort of stand on the shoulders of giants. You can have one of these massively trained models and then do a little bit on top. You know, you could use just a few examples of what you're looking for, and you can get what you want from the model. So just a little bit on top now gets you to results that used to take a huge amount of effort to reach, you know, to get from the ground up to that level.

Speaker 2: I was trying to think of an analogy for foundation models versus what came before, and I don't know that I came up with a good one, but the best I could do was this; I want you to tell me if it's plausible. It's like, before foundation models, you had these sort of single-use kitchen appliances. You had a waffle iron if you wanted waffles, or you had a toaster if you wanted to make toast.
Speaker 2: But a foundation model is like an oven with a range on top. It's like this machine, and you could just cook anything with this machine.

Speaker 4: Yeah, that's a great analogy. They're very versatile. The other piece of it, too, is that they dramatically lower the effort that it takes to do something that you want to do. I used to say about the old world of AI, you know, the problem with automation is that it's too labor intensive, which sounds like I'm making a joke.

Speaker 2: Indeed, famously, if automation does one thing, it substitutes machines or computing power for labor, right? So what does it mean to say AI, or automation, is too labor intensive?

Speaker 4: It sounds like I'm making a joke, but I'm actually serious. What I mean is that the effort it took in the old regime to automate something was very, very high. If I need to go and curate all this data, collect all this data, and then carefully label all these examples, that labeling itself might be incredibly expensive and time-consuming, and we estimate anywhere between eighty and ninety percent of the effort it takes to field an AI solution is actually just spent on the data. And that has consequences for the threshold for bothering. You know, if you're only going to get a little bit of value back from something, are you going to go through this huge effort to curate all this data? And then, when it comes time to train the model, you need highly skilled people who are expensive or hard to find in the labor market. Are you really going to do something that's just a tiny incremental thing? No, you're only going to do the highest-value things, the ones that warrant that level of effort, because you have...

Speaker 2: To essentially build the whole machine from scratch, and there aren't many things where it's worth that much work to build a machine that's only going to do one narrow thing.
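To make the "just a little bit on top" idea concrete, here is a minimal sketch contrasting the two regimes: instead of labeling thousands of examples and training a task-specific model from scratch, you reuse a pretrained foundation model and show it a handful of examples inside the prompt. The generate() function is a hypothetical stand-in for whatever text-generation call your model provider exposes, not a specific IBM or vendor API, and the reviews are made up.

```python
# Few-shot adaptation of a pretrained foundation model: a sketch.
# `generate` is a hypothetical placeholder, not a real API call.

def generate(prompt: str) -> str:
    """Placeholder for a call to a pretrained text-generation model."""
    raise NotImplementedError("wire this up to the model of your choice")

# Old regime: collect and hand-label a large dataset, then train a
# task-specific model from scratch (expensive, specialist work).

# Foundation-model regime: keep the pretrained model as-is and put a
# few examples of the task directly in the prompt.
few_shot_prompt = """Classify the review as positive or negative.

Review: "Arrived late and the box was crushed." -> negative
Review: "Works exactly as described, great value." -> positive
Review: "The battery died after two days." ->"""

# label = generate(few_shot_prompt)  # expected to continue with " negative"
```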
Speaker 4: That's right. And then you tackle the next problem, and you basically have to start over. And, you know, there are some nuances here: for images, you can pretrain a model on some other task and change it around, so there are some examples of nonrecurring costs in the old world too. But by and large, it's just a lot of effort. It's hard. It takes, you know, a high level of skill to implement. One analogy that I like is, you know, think about it as you have a river of data running through your company or your institution. Traditional AI solutions are kind of like building a dam on that river. Dams are very expensive things to build. They require highly specialized skills and lots of planning, and you're only going to put a dam on a river that's big enough that you're going to get enough energy out of it to be worth your trouble. You're going to get a lot of value out of that dam if you have a river like that, you know, a river of data. But actually, the vast majority of the water in your kingdom isn't in that river. It's in puddles and creeks and babbling brooks. And, you know, there's a lot of value left on the table, because it's like, well, there's nothing you can do about it; that's too low value, it takes too much effort, so I'm just not going to do it. The return on the investment just isn't there, so you just end up not automating things. It's too much of a pain. Now, what foundation models do is say, well, actually, no, we can train a base model, a foundation that you can work on. We don't care, we're not specifying what the task is ahead of time; we just need to learn about the domain of data. So if we want to build something that can understand the English language, there's a ton of English-language text available out in the world.
Speaker 4: We can now train models on huge quantities of it, and it learns the structure. It learns, you know, a good part of how language works, from all that unlabeled data. And then when you roll up with your task, you know, "I want to solve this particular problem," you don't have to start from scratch. You're starting from a very, very high place. So that just gives you the ability, you know, now all of a sudden, everything is accessible. All the puddles and creeks and babbling brooks and little ponds, those are all accessible now, and that's very exciting. It just changes the equation on what kinds of problems you can use AI to solve.

Speaker 2: And so foundation models basically mean that automating some new task is much less labor intensive. The sort of marginal effort to do some new automation thing is much lower, because you're building on top of the foundation model rather than starting from scratch.

Speaker 4: Absolutely.

Speaker 2: So that is the exciting good news. I do feel like there's a little bit of a countervailing idea that's worth talking about here, and that is the idea that even though there are these foundation models that are really powerful, that are relatively easy to build on top of, it's still the case, right, that there is not some one-size-fits-all foundation model. So, you know, what does that mean, and why is that important to think about in this context?

Speaker 4: Yeah. So we believe very strongly that there isn't just one model to rule them all. There are a number of reasons why that could be true. One which I think is important and very relevant today is how much energy these models can consume. These models, you know, can get very, very large. So one thing that we're starting to see, or starting to believe, is that you probably shouldn't use one giant sledgehammer model to solve every single problem; we should pick the right-size model to solve the problem.
Speaker 4: We shouldn't necessarily assume that we need the biggest, baddest model for every little use case. And we're also seeing that, you know, small models that are trained to specialize on particular domains can actually outperform much bigger models. So bigger isn't always even better.

Speaker 2: So they're more efficient, and they do the thing you want them to do better as well.

Speaker 4: That's right. So Stanford, for instance, a group at Stanford trained a model, it was a two point seven billion parameter model, which isn't terribly big by today's standards, and they trained it just on the biomedical literature; you know, this is the kind of thing that universities do. And what they showed was that this model was better at answering questions about the biomedical literature than some models that are a hundred billion parameters, you know, many times larger. So it's a little bit like asking an expert for help on something versus asking the smartest person you know. The smartest person you know may be very smart, but they're not going to have the expertise. And then, as an added bonus, you know, this is now a much smaller model, it's much more efficient to run, it's cheaper. So there are lots of different advantages there. So I think we're going to see a tension in the industry between vendors that say, hey, this is the one big model, and others that say, well, actually, there are lots of different tools we can use that all have this nice quality that we outlined at the beginning, and we should really pick the one that makes the most sense for the task at hand.

Speaker 2: So there's sustainability, basically efficiency. Another set of issues that comes up a lot with AI is bias and hallucination. Can you talk a little bit about bias and hallucination, what they are and how you're working to mitigate those problems?
Speaker 4: Yeah, so there are lots of issues still, as amazing as these technologies are, and they are amazing, let's be very clear; there are lots of great things we're going to enable with these kinds of technologies. Bias isn't a new problem. You know, we've seen this basically since the beginning of AI: if you train a model on data that has a bias in it, the model is going to recapitulate that bias when it provides its answers. So, for instance, if all the text you have is more likely to refer to female nurses and male scientists, then you're going to get models that reflect that. There was an example where a machine-learning-based translation system translated from Hungarian to English. Hungarian doesn't have gendered pronouns; English does. And when you asked it to translate, it would translate "they are a nurse" to "she is a nurse," and translate "they are a scientist" to "he is a scientist." And that's not because the people who wrote the algorithm were building in bias and coding in, like, oh, it's got to be this way. It's because the data was like that. You know, we have biases in our society, and they're reflected in our data and our text and our images, everywhere, and the models are just mapping from what they've seen in their training data to the result that you're trying to get them to give, and then these biases come out. So there's a very active program of research, you know, we do quite a bit at IBM Research and at MIT, but also all over the community, in industry and academia, trying to figure out how do we explicitly remove these biases, how do we identify them, how do we build tools that allow people to audit their systems to make sure they aren't biased.
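One minimal sketch of the kind of auditing tool David alludes to, in the spirit of the Hungarian-to-English example: run a set of gender-neutral source sentences about different occupations through a translation system and tally which gendered pronoun comes back for each one. The translate() function and the occupation list are hypothetical placeholders for whatever system is actually being audited.

```python
# Toy audit for gender bias in a translation system, in the spirit of the
# Hungarian-to-English example above. `translate` is a hypothetical
# stand-in for the system under audit, not a real library call.

def translate(sentence: str) -> str:
    """Placeholder: return the system's English translation of `sentence`."""
    raise NotImplementedError("wire this up to the system being audited")

occupations = ["nurse", "scientist", "engineer", "teacher"]  # illustrative list

def pronoun_audit(jobs):
    """Map each occupation to the pronoun the translation system picks."""
    results = {}
    for job in jobs:
        # Stand-in for a gender-neutral source sentence in the source language.
        words = translate(f"they are a {job}").lower().split()
        results[job] = "she" if "she" in words else "he" if "he" in words else "neutral"
    return results

# A strongly skewed result (nurses always "she", scientists always "he")
# would flag bias inherited from the training data rather than from the code.
# print(pronoun_audit(occupations))
```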
Speaker 4: So this is a really important thing, and, you know, again, this has been here since the beginning of machine learning and AI, but foundation models and large language models and generative AI bring it into even sharper focus, because there's just so much data, and it's sort of baking in all these different biases we have. So that's absolutely a problem that these models have. Another one that you mentioned was hallucination. Even the most impressive of our models will often just make stuff up; you know, the technical term the field has chosen is hallucination. To give you an example, I asked ChatGPT to create a biography of David Cox at IBM, and, you know, it started off really well. It identified that I was the director of the MIT-IBM Watson AI Lab and said a few words about that, and then it proceeded to create an authoritative but completely fake biography of me, where I was British, I was born in the UK, I went to universities in the UK, I was a professor...

Speaker 2: The authority, right. It's the certainty that is weird about it, right? It's dead certain that you're from the UK, et cetera.

Speaker 4: Absolutely, yeah. It had all kinds of flourishes, like I won awards in the UK. So yeah, it's problematic, because it kind of pokes at a lot of weak spots in our human psychology, where if something sounds coherent, we're likely to assume it's true. We're not used to interacting with people who eloquently and authoritatively, you know, emit complete nonsense. Well, you know, we can debate about that.

Speaker 2: Yeah, we can debate about that. But yes, the sort of blithe confidence throws you off when you realize it's completely wrong.

Speaker 4: Right, that's right.
Speaker 4: And we do have a little bit of a "great and powerful Oz" sort of vibe going sometimes, where we're like, well, you know, the AI is all-knowing, and therefore whatever it says must be true. But these things will make up stuff, you know, very aggressively, and everyone can try asking it for their bio. You'll always get something that's of the right form, that has the right tone, but, you know, the facts just aren't necessarily there. So that's obviously a problem. We need to figure out how to close those gaps and fix those problems, and then there are lots of ways we could use these models much more easily.

Speaker 3: I'd just like to say, faced with the awesome potential of what these technologies might do, it's a bit encouraging to hear that even ChatGPT has a weakness for inventing flamboyant, if fictional, versions of people's lives. And while entertaining ourselves with ChatGPT and Midjourney is important, the way lay people use consumer-facing chatbots and generative AI is just fundamentally different from the way an enterprise business uses AI. How can we harness the abilities of artificial intelligence to help us solve the problems we face in business and technology? Let's listen on as David and Jacob continue their conversation.

Speaker 2: We've been talking in a somewhat abstract way about AI and the ways it can be used. Let's talk in a little bit more of a specific way. Can you just talk about some examples of business challenges that can be solved with automation, with this kind of automation we're talking about?

Speaker 4: Yeah, so really, the sky's the limit. There's a whole set of different applications that these models are really good at, and basically it's a superset of everything we used to use AI for in business.
Speaker 4: So, you know, the simple kinds of things are like, hey, I have text, I have product reviews, and I want to be able to tell if these are positive or negative; you know, let's look at all the negative reviews so we can have a human look through them and see what was up. Very common business use case, and you can do it with traditional deep-learning-based AI. So there are things like that that are, you know, very prosaic; we were already doing it, and we've been doing it for a long time. Then you get situations that were harder for the old AI, like if I want to compress something. Say I have a chat transcript: a customer called in and they had a complaint, and then they called back, and now a new person on the line needs to go read the old transcript to catch up. Wouldn't it be better if we could just summarize that, condense it all down into a quick little paragraph, you know, "customer called, they were upset about this," rather than having to read the blow-by-blow? There are just lots of settings like that where summarization is really helpful. Hey, you have a meeting, and I'd like to just automatically have that meeting or that email or whatever condensed down so I can really quickly get to the heart of the matter. These models are really good at doing that. They're also really good at question answering. So if I want to find out how many vacation days I have, I can now interact in natural language with a system that has access to our HR policies, and I can actually have a multi-turn conversation, you know, like I would have with an actual HR professional or customer service representative.
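As one way of seeing how low the barrier has become, the review-triage and transcript-summarization cases David describes can be prototyped in a few lines with off-the-shelf open-source models. The sketch below uses Hugging Face pipelines purely as an illustration, not as the tooling David's team uses; the review and transcript text is invented, and the models are whatever defaults the library picks.

```python
# Prototype of the review-triage and transcript-summarization use cases,
# using Hugging Face pipelines as one possible off-the-shelf stand-in.
# The text is invented; the models are the library's defaults.
from transformers import pipeline

classify = pipeline("sentiment-analysis")   # positive/negative review triage
summarize = pipeline("summarization")       # condense long transcripts

reviews = [
    "The blender arrived broken and support never called back.",
    "Five stars, it works better than I expected.",
]
negatives = [r for r in reviews if classify(r)[0]["label"] == "NEGATIVE"]

transcript = (
    "Customer: I ordered the blender two weeks ago and it still hasn't shipped. "
    "Agent: I'm sorry about that, let me check the order status. "
    "Customer: I'd like a refund if it can't ship this week."
)
summary = summarize(transcript, max_length=40, min_length=10)[0]["summary_text"]

print(negatives)
print(summary)
```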
Speaker 4: So a big part of what this is doing is putting an interface on things. You know, when we think of computer interfaces, we're usually thinking about UI, user interface elements, where I click on menus and there are buttons and all this stuff. Increasingly now, we can just talk. You know, in words, you can describe what you want, you can ask a question, you can sort of command the system to do something, rather than having to learn how to do all that clicking of buttons, which might be inefficient. Now we can just sort of spell it out.

Speaker 2: Interesting, right. The graphical user interface that we all sort of default to, that's not like the state of nature, right? That's a thing that was invented and just came to be the standard way that we interact with computers. And so you could imagine, as you're saying, chat, essentially chatting with the machine, could become a sort of standard user interface, just like the graphical user interface did, you know, over the past several decades.

Speaker 4: Absolutely. And I think those kinds of conversational interfaces are going to be hugely important for increasing our productivity. It's just a lot easier if I don't have to learn how to use a tool, or I don't have to have awkward, you know, interactions with the computer. I can just tell it what I want, and it can understand me. It could, you know, potentially even ask questions back to clarify, and have those kinds of conversations. That can be extremely powerful. And in fact, one area where that's going to, I think, be absolutely game changing is in code. When we write code, you know, programming languages are a way for us to sort of map between our very sloppy way of talking and the very exact way that you need to command a computer to do what you want it to do. They're cumbersome to learn, and they can, you know, create very complex systems that are very hard to reason about.
Speaker 4: And we're already starting to see the ability to just write down what you want, and AI will generate the code for you. And I think we're just going to see a huge revolution where we just converse, you know, we can have a conversation to say what we want, and then the computer can not only do fixed actions and do things for us, but can actually even write code to do new things, you know, and generate the software itself. Given how much software we have, and how much craving we have for software, like, we will never have enough software in our world, the ability to have a system as a helper in that, I think we're going to see a lot of value there.

Speaker 2: So if you think about the different ways AI might be applied to business, I mean, you've talked about a number of the sort of classic use cases. What are some of the more out-there use cases? What are some, you know, unique ways you could imagine AI being applied to business?

Speaker 4: Yeah, there, really, the sky's the limit. I mean, we have one project that I'm kind of a fan of, where we actually were working with a mechanical engineering professor at MIT on a classic problem: how do you build linkage systems, which, like, imagine bars and joints and levers, you know, the things that are...

Speaker 2: Building a thing, building a physical machine of some kind.

Speaker 4: Like real, like metal, you know, nineteenth-century, just old-school Industrial Revolution, yeah. But, you know, the little arm that's holding up my microphone in front of me, cranes that build your buildings, you know, parts of your engines. This is like classical stuff. And it turns out that, you know, humans, if you want to build an advanced system, you decide what, like, curve you want to create, and then a human, together with a computer program, can build a five- or six-bar linkage.
Speaker 4: And then that's kind of where you top out, because it gets too complicated to work with more than that. We built a generative AI system that can build twenty-bar linkages, like, arbitrarily complex. So these are machines that are beyond the capability of a human to design themselves. Another example: we have an AI system that can generate electronic circuits. You know, we had a project where we were building better power converters, which allow our computers and our devices to be more efficient, save energy, you know, have less carbon output. But I think the world around us has always been shaped by technology. If you look around, you know, just think about how many steps and how many people and how many designs went into the table and the chair and the lamp. It's really just astonishing. And that's already, you know, the fruit of automation and computers and those kinds of tools. But we're going to see that increasingly be a product also of AI. It's just going to be everywhere around us. Everything we touch is going to have been, you know, helped in some way by AI to get to you.

Speaker 2: You know, that is a pretty profound transformation that you're talking about in business. How do you think about the implications of that, both for the sort of, you know, business itself and also for employees?

Speaker 4: Yeah, so I think for businesses, this is going to cut costs and open up new opportunities for customers; you know, it's sort of all upside, right? For the workers, I think the story is mostly good too. You know, like, how many things do you do in your day that you'd really rather not, right? And we're used to having things we don't like automated away. You know, if you didn't like walking many miles to work, then, you know, you can have a car and you can drive there.
Or we used to have a huge 585 00:31:14,560 --> 00:31:18,000 Speaker 4: fraction, over ninety percent of the US population, engaged in agriculture, 586 00:31:18,040 --> 00:31:20,680 Speaker 4: and then we mechanized it. Now very few people work 587 00:31:20,680 --> 00:31:22,600 Speaker 4: in agriculture. A small number of people can do the 588 00:31:22,640 --> 00:31:25,040 Speaker 4: work of a large number of people. And then, you know, 589 00:31:25,120 --> 00:31:28,480 Speaker 4: things like email, and yeah, they've led to huge productivity 590 00:31:28,520 --> 00:31:31,040 Speaker 4: enhancements because I don't need to be writing letters and 591 00:31:31,080 --> 00:31:33,520 Speaker 4: sending them in the mail. I can just instantly communicate 592 00:31:33,520 --> 00:31:37,680 Speaker 4: with people. We've just become more effective. Like, our jobs 593 00:31:37,720 --> 00:31:41,920 Speaker 4: have transformed, whether it's a physical job like agriculture, or 594 00:31:42,120 --> 00:31:44,800 Speaker 4: whether it's a knowledge worker job where you're sending emails 595 00:31:44,880 --> 00:31:49,360 Speaker 4: and communicating with people and coordinating teams. We've just gotten better. 596 00:31:49,520 --> 00:31:51,880 Speaker 4: And you know, the technology has just made us more productive. 597 00:31:51,960 --> 00:31:54,920 Speaker 4: And this is just another example. Now, you know, there 598 00:31:54,920 --> 00:31:57,560 Speaker 4: are people who worry that, you know, we'll be so 599 00:31:57,640 --> 00:32:01,440 Speaker 4: good at that that maybe jobs will be displaced, and 600 00:32:00,800 --> 00:32:05,800 Speaker 4: that's a legitimate concern. But just like how in agriculture, 601 00:32:05,880 --> 00:32:07,800 Speaker 4: you know, it's not like suddenly we had ninety percent 602 00:32:07,800 --> 00:32:12,560 Speaker 4: of the population unemployed. You know, people transitioned to other jobs. 603 00:32:13,160 --> 00:32:15,240 Speaker 4: And the other thing that we've found too, is that 604 00:32:15,840 --> 00:32:20,200 Speaker 4: our appetite for doing more things, as humans, is 605 00:32:20,520 --> 00:32:24,160 Speaker 4: sort of insatiable. So even if we can dramatically increase 606 00:32:24,160 --> 00:32:27,080 Speaker 4: how much, you know, one human can do, that doesn't 607 00:32:27,080 --> 00:32:29,560 Speaker 4: necessarily mean we're going to do a fixed amount of stuff. 608 00:32:29,720 --> 00:32:31,560 Speaker 4: There's an appetite to have even more, so we're going 609 00:32:31,560 --> 00:32:34,000 Speaker 4: to continue to grow the pie. So 610 00:32:34,160 --> 00:32:36,640 Speaker 4: I think, at least certainly in the near term, you know, 611 00:32:36,640 --> 00:32:38,320 Speaker 4: we're going to see a lot of drudgery go away 612 00:32:38,320 --> 00:32:40,880 Speaker 4: from work. We're going to see people be able 613 00:32:40,920 --> 00:32:43,880 Speaker 4: to be more effective at their jobs. You know, we 614 00:32:43,880 --> 00:32:47,400 Speaker 4: will see some transformation in jobs and what they look like. But 615 00:32:47,480 --> 00:32:52,200 Speaker 4: we've seen that before, and the technology at least has 616 00:32:52,240 --> 00:32:54,320 Speaker 4: the potential to make our lives a lot easier. 617 00:32:55,560 --> 00:33:01,360 Speaker 2: So IBM recently launched watsonx, which includes watsonx.ai.
618 00:33:01,920 --> 00:33:03,880 Speaker 2: Tell me about that, tell me about, you know, what 619 00:33:03,920 --> 00:33:06,239 Speaker 2: it is, and the new possibilities that it opens up. 620 00:33:07,160 --> 00:33:11,480 Speaker 4: Yeah, so watsonx is obviously a bit of a 621 00:33:11,760 --> 00:33:15,960 Speaker 4: new branding on the Watson brand. T.J. Watson was 622 00:33:15,960 --> 00:33:20,160 Speaker 4: the founder of IBM, and our AI technologies have had 623 00:33:20,200 --> 00:33:24,800 Speaker 4: the Watson brand. Watsonx is a recognition that there's 624 00:33:24,840 --> 00:33:27,480 Speaker 4: something new, there's something that actually has changed the game. 625 00:33:28,080 --> 00:33:31,720 Speaker 4: We've gone from this old world where automation is too 626 00:33:31,880 --> 00:33:35,400 Speaker 4: labor intensive to this new world of possibilities where it's 627 00:33:35,480 --> 00:33:39,840 Speaker 4: much easier to use AI. And what watsonx does 628 00:33:40,000 --> 00:33:44,400 Speaker 4: is bring together tools for businesses to harness that power. 629 00:33:44,840 --> 00:33:49,720 Speaker 4: So watsonx.ai has foundation models that our customers can use. 630 00:33:49,800 --> 00:33:52,840 Speaker 4: It includes tools that make it easy to run, easy 631 00:33:52,920 --> 00:33:57,280 Speaker 4: to deploy, easy to experiment. There's a watsonx.data 632 00:33:57,600 --> 00:34:01,080 Speaker 4: component which allows you to sort of organize and access 633 00:34:01,080 --> 00:34:03,160 Speaker 4: your data. So what we're really trying to do 634 00:34:03,240 --> 00:34:08,200 Speaker 4: is give our customers a cohesive set of tools to 635 00:34:08,239 --> 00:34:11,439 Speaker 4: harness the value of these technologies and at the same 636 00:34:11,480 --> 00:34:14,479 Speaker 4: time be able to manage the risks and other things 637 00:34:14,520 --> 00:34:16,400 Speaker 4: that you have to keep an eye on in an 638 00:34:16,520 --> 00:34:17,480 Speaker 4: enterprise context. 639 00:34:19,160 --> 00:34:22,200 Speaker 2: So we talk about the guests on this show as 640 00:34:22,360 --> 00:34:26,440 Speaker 2: new creators, by which we mean people who are creatively 641 00:34:26,480 --> 00:34:31,360 Speaker 2: applying technology in business to drive change. And I'm curious 642 00:34:31,880 --> 00:34:36,560 Speaker 2: how creativity plays a role in the research that you do. 643 00:34:37,160 --> 00:34:41,759 Speaker 4: Honestly, I think the creative aspects of this job, 644 00:34:42,200 --> 00:34:45,520 Speaker 4: this is what makes this work exciting. You know, I 645 00:34:45,520 --> 00:34:47,480 Speaker 4: should say, you know, the folks who work at my 646 00:34:47,600 --> 00:34:50,680 Speaker 4: organization are doing the creating, and I. 647 00:34:50,640 --> 00:34:54,200 Speaker 2: Guess you're doing the managing so that they can do 648 00:34:54,239 --> 00:34:54,760 Speaker 2: the creating. 649 00:34:55,640 --> 00:34:59,040 Speaker 4: I'm helping them be their best, and I still get 650 00:34:59,080 --> 00:35:01,960 Speaker 4: to get involved in the weeds of the research as 651 00:35:02,040 --> 00:35:04,839 Speaker 4: much as I can. But you know, there's something really 652 00:35:04,840 --> 00:35:08,719 Speaker 4: exciting about inventing.
You know, like one of the nice 653 00:35:08,719 --> 00:35:12,279 Speaker 4: things about doing invention and doing research on AI in 654 00:35:12,400 --> 00:35:15,359 Speaker 4: industry is it's usually grounded in a real problem that 655 00:35:15,520 --> 00:35:18,480 Speaker 4: somebody's having. You know, a customer wants to solve this problem. 656 00:35:18,560 --> 00:35:22,080 Speaker 4: It's losing money, or there would be a new opportunity. 657 00:35:22,360 --> 00:35:26,799 Speaker 4: You identify that problem and then you build something that's 658 00:35:26,840 --> 00:35:29,040 Speaker 4: never been built before to do that. And I think 659 00:35:29,080 --> 00:35:32,879 Speaker 4: that's honestly the adrenaline rush that keeps all of us 660 00:35:33,400 --> 00:35:35,880 Speaker 4: in this field. How do you do something that nobody 661 00:35:35,880 --> 00:35:39,799 Speaker 4: else on earth has done before or tried before? So 662 00:35:39,840 --> 00:35:43,279 Speaker 4: there's that kind of creativity, and there's also creativity as well 663 00:35:43,360 --> 00:35:46,600 Speaker 4: in identifying what those problems are, being able to understand 664 00:35:47,280 --> 00:35:52,040 Speaker 4: the places where, you know, the technology is close enough 665 00:35:52,280 --> 00:35:56,560 Speaker 4: to solving a problem, and doing that matchmaking between problems 666 00:35:56,560 --> 00:35:59,279 Speaker 4: that are now solvable, you know, and in AI, where 667 00:35:59,280 --> 00:36:02,320 Speaker 4: the field is moving so fast, there's a constantly growing 668 00:36:02,400 --> 00:36:05,440 Speaker 4: horizon of things that we might be able to solve. 669 00:36:05,760 --> 00:36:08,560 Speaker 4: So that matchmaking, I think, is also a really interesting 670 00:36:08,640 --> 00:36:12,279 Speaker 4: creative problem. So I think that's 671 00:36:12,280 --> 00:36:15,239 Speaker 4: why it's so much fun, and it's a fun environment 672 00:36:15,320 --> 00:36:17,719 Speaker 4: we have here too. It's, you know, people drawing on 673 00:36:17,760 --> 00:36:22,279 Speaker 4: whiteboards and writing on pages of math and you. 674 00:36:22,239 --> 00:36:24,879 Speaker 2: Know, like in a movie, like in a movie. Yeah, 675 00:36:24,920 --> 00:36:27,720 Speaker 2: straight from central casting, drawing, the drawing on the window, 676 00:36:27,760 --> 00:36:33,080 Speaker 2: writing on the window in Sharpie, absolutely. So, so let's 677 00:36:33,160 --> 00:36:38,120 Speaker 2: close with the really long view. How do you imagine 678 00:36:38,360 --> 00:36:42,600 Speaker 2: AI and people working together twenty years from now? 679 00:36:44,680 --> 00:36:49,279 Speaker 4: Yeah, it's really hard to make predictions. The vision that 680 00:36:49,840 --> 00:36:56,239 Speaker 4: I like, actually this came from an MIT economist named 681 00:36:56,320 --> 00:37:01,279 Speaker 4: David Autor, which was: imagine AI almost as a 682 00:37:01,360 --> 00:37:06,120 Speaker 4: natural resource. You know, we know how natural resources work, right? 683 00:37:06,280 --> 00:37:08,000 Speaker 4: Like there's an ore we can dig up out of 684 00:37:08,000 --> 00:37:10,560 Speaker 4: the earth, that comes from, kind of springs from the earth. 685 00:37:10,680 --> 00:37:13,640 Speaker 4: We usually think of that in terms of physical stuff.
686 00:37:14,280 --> 00:37:15,880 Speaker 4: With AI, you can almost think of it as like 687 00:37:15,960 --> 00:37:18,799 Speaker 4: there's a new kind of abundance, potentially, twenty years from 688 00:37:18,840 --> 00:37:21,480 Speaker 4: now, where not only can we have things we can 689 00:37:21,480 --> 00:37:24,080 Speaker 4: build or eat or use or burn or whatever. Now 690 00:37:24,120 --> 00:37:26,640 Speaker 4: we have, you know, this ability to do things and 691 00:37:26,760 --> 00:37:30,000 Speaker 4: understand things and do intellectual work. And I think we 692 00:37:30,320 --> 00:37:34,360 Speaker 4: can get to a world where automating things is just seamless. 693 00:37:34,800 --> 00:37:40,080 Speaker 4: We're surrounded by capability to augment ourselves to get things done. 694 00:37:40,719 --> 00:37:43,520 Speaker 4: And you could think of that in terms of like, oh, 695 00:37:43,520 --> 00:37:45,680 Speaker 4: that's going to displace our jobs, because eventually the AI 696 00:37:45,760 --> 00:37:47,799 Speaker 4: system is going to do everything we can do. But 697 00:37:48,200 --> 00:37:50,359 Speaker 4: you could also think of it in terms of like, wow, 698 00:37:50,400 --> 00:37:52,719 Speaker 4: that's just so much abundance that we now have, and 699 00:37:52,760 --> 00:37:56,000 Speaker 4: really how we use that abundance is sort of up 700 00:37:56,040 --> 00:37:58,640 Speaker 4: to us. You know, like when writing software 701 00:37:58,680 --> 00:38:01,040 Speaker 4: is super easy and fast and anybody can do it, 702 00:38:01,480 --> 00:38:03,239 Speaker 4: just think about all the things you can do now. 703 00:38:03,880 --> 00:38:05,880 Speaker 4: Think about all the new activities, and all of 704 00:38:05,880 --> 00:38:08,280 Speaker 4: the ways we could use that to enrich our lives. 705 00:38:08,600 --> 00:38:11,640 Speaker 4: That's where I'd like to see us in twenty years. 706 00:38:11,680 --> 00:38:14,239 Speaker 4: You know, we can do just so much 707 00:38:14,400 --> 00:38:17,680 Speaker 4: more than we were able to do before. Abundance. 708 00:38:18,480 --> 00:38:21,279 Speaker 2: Great, thank you so much for your time. 709 00:38:22,040 --> 00:38:24,040 Speaker 4: Yeah, it's been a pleasure. Thanks for inviting me. 710 00:38:25,560 --> 00:38:29,640 Speaker 3: What a far-ranging, deep conversation. I'm mesmerized by the 711 00:38:29,719 --> 00:38:33,600 Speaker 3: vision David just described. A world where natural conversation between 712 00:38:33,640 --> 00:38:38,240 Speaker 3: mankind and machine can generate creative solutions to our most 713 00:38:38,280 --> 00:38:42,040 Speaker 3: complex problems. A world where we view AI not as 714 00:38:42,160 --> 00:38:46,160 Speaker 3: our replacements, but as a powerful resource we can tap 715 00:38:46,200 --> 00:38:51,719 Speaker 3: into to exponentially boost our innovation and productivity. Thanks so 716 00:38:51,800 --> 00:38:55,200 Speaker 3: much to doctor David Cox for joining us on Smart Talks. 717 00:38:55,600 --> 00:38:59,319 Speaker 3: We deeply appreciate him sharing his huge breadth of AI 718 00:38:59,400 --> 00:39:03,440 Speaker 3: knowledge with us and for explaining the transformative potential of 719 00:39:03,520 --> 00:39:06,840 Speaker 3: foundation models in a way that even I can understand. 720 00:39:07,480 --> 00:39:11,920 Speaker 3: We eagerly await his next great breakthrough.
Smart Talks with 721 00:39:12,000 --> 00:39:16,360 Speaker 3: IBM is produced by Matt Romano, David Jaw, Nishe Venkat, 722 00:39:16,520 --> 00:39:20,960 Speaker 3: and Royston Preserve, with Jacob Goldstein. We're edited by Lydia 723 00:39:21,040 --> 00:39:25,319 Speaker 3: Jean Kott. Our engineers are Jason Gambrel, Sarah Buguier, and 724 00:39:25,440 --> 00:39:31,040 Speaker 3: Ben Holliday. Theme song by Gramoscope. Special thanks to Carly Migliori, 725 00:39:31,440 --> 00:39:35,560 Speaker 3: Andy Kelly, Kathy Callahan, and the Eight Bar and IBM teams, 726 00:39:36,000 --> 00:39:39,560 Speaker 3: as well as the Pushkin marketing team. Smart Talks with 727 00:39:39,640 --> 00:39:43,840 Speaker 3: IBM is a production of Pushkin Industries and iHeartMedia. To 728 00:39:43,920 --> 00:39:48,880 Speaker 3: find more Pushkin podcasts, listen on the iHeartRadio app, Apple Podcasts, 729 00:39:48,960 --> 00:39:53,520 Speaker 3: or wherever you listen to podcasts. I'm Malcolm Gladwell. This 730 00:39:53,800 --> 00:40:09,000 Speaker 3: is a paid advertisement from IBM.