WEBVTT - Transformations in AI: why foundation models are the future

0:00:02.120 --> 0:00:05.880
<v Jacob Goldstein>Hey, it's Jacob Goldstein for Smart Talks with IBM. Last

0:00:05.920 --> 0:00:08.280
<v Jacob Goldstein>year I had the pleasure of sitting down with doctor

0:00:08.360 --> 0:00:13.400
<v Jacob Goldstein>David Cox, VP of AI Models at IBM Research. We

0:00:13.480 --> 0:00:17.200
<v Jacob Goldstein>explored the fascinating world of AI foundation models and their

0:00:17.239 --> 0:00:21.960
<v Jacob Goldstein>revolutionary potential for business automation and innovation. When we first

0:00:22.000 --> 0:00:25.319
<v Jacob Goldstein>aired this episode, the concept of foundation models was just

0:00:25.400 --> 0:00:29.280
<v Jacob Goldstein>beginning to capture our attention. Since then, this technology has

0:00:29.400 --> 0:00:33.479
<v Jacob Goldstein>evolved and redefined the boundaries of what's possible. Businesses are

0:00:33.520 --> 0:00:36.960
<v Jacob Goldstein>becoming more savvy about selecting the right models and understanding

0:00:36.960 --> 0:00:40.360
<v Jacob Goldstein>how they can drive revenue and efficiency. As I listened

0:00:40.360 --> 0:00:43.279
<v Jacob Goldstein>back to the conversation, it was interesting to reflect on

0:00:43.320 --> 0:00:47.120
<v Jacob Goldstein>some new developments and ideas that have emerged, and many

0:00:47.159 --> 0:00:50.320
<v Jacob Goldstein>of these we will continue to explore throughout the season,

0:00:50.840 --> 0:00:52.920
<v Jacob Goldstein>like how to play an active role in choosing the

0:00:52.920 --> 0:00:56.440
<v Jacob Goldstein>best model for your needs. Whether you're a longtime listener

0:00:56.520 --> 0:00:58.920
<v Jacob Goldstein>or tuning in for the first time, I'm certain you'll

0:00:58.920 --> 0:01:01.800
<v Jacob Goldstein>find doctor Cox's insights as thought provoking as ever.

0:01:02.360 --> 0:01:05.679
<v Jacob Goldstein>Thanks as always for joining us. Now let's dive in.

0:01:07.440 --> 0:01:11.240
<v Malcolm Gladwell>Hello, Hello, Welcome to Smart Talks with IBM, a podcast

0:01:11.240 --> 0:01:16.720
<v Malcolm Gladwell>from Pushkin Industries, iHeartRadio and IBM. I'm Malcolm Gladwell. Our

0:01:16.760 --> 0:01:21.440
<v Malcolm Gladwell>guest today is doctor David Cox, VP of AI Models

0:01:21.480 --> 0:01:26.480
<v Malcolm Gladwell>at IBM Research and IBM Director of the MIT IBM

0:01:26.560 --> 0:01:30.800
<v Malcolm Gladwell>Watson AI Lab, a first of its kind industry academic

0:01:30.920 --> 0:01:36.000
<v Malcolm Gladwell>collaboration between IBM and MIT focused on the fundamental research

0:01:36.400 --> 0:01:41.160
<v Malcolm Gladwell>of artificial intelligence. Over the course of decades, David Cox

0:01:41.240 --> 0:01:45.880
<v Malcolm Gladwell>watched as the AI revolution steadily grew from the simmering

0:01:45.920 --> 0:01:49.760
<v Malcolm Gladwell>ideas of a few academics and technologists into the industrial

0:01:49.840 --> 0:01:54.160
<v Malcolm Gladwell>boom we are experiencing today. Having dedicated his life to

0:01:54.240 --> 0:01:57.640
<v Malcolm Gladwell>pushing the field of AI towards new horizons, David has

0:01:57.680 --> 0:02:01.600
<v Malcolm Gladwell>both contributed to and presided over many of the major

0:02:01.720 --> 0:02:07.320
<v Malcolm Gladwell>breakthroughs in artificial intelligence. In today's episode, you'll hear David

0:02:07.320 --> 0:02:12.440
<v Malcolm Gladwell>explain some of the conceptual underpinnings of the current AI landscape,

0:02:12.600 --> 0:02:17.520
<v Malcolm Gladwell>things like foundation models, in surprisingly comprehensible terms, I might add.

0:02:17.800 --> 0:02:20.959
<v Malcolm Gladwell>We'll also get into some of the amazing practical applications

0:02:20.960 --> 0:02:24.000
<v Malcolm Gladwell>for AI in business, as well as what implications AI

0:02:24.120 --> 0:02:27.640
<v Malcolm Gladwell>will have for the future of work and design. David

0:02:27.680 --> 0:02:31.400
<v Malcolm Gladwell>spoke with Jacob Goldstein, host of the Pushkin podcast What's

0:02:31.440 --> 0:02:35.720
<v Malcolm Gladwell>Your Problem. A veteran business journalist, Jacob has reported for

0:02:35.760 --> 0:02:38.560
<v Malcolm Gladwell>The Wall Street Journal, the Miami Herald, and was a

0:02:38.560 --> 0:02:44.240
<v Malcolm Gladwell>longtime host of the NPR program Planet Money. Okay, let's

0:02:44.240 --> 0:02:45.120
<v Malcolm Gladwell>get to the interview.

0:02:47.480 --> 0:02:49.280
<v Jacob Goldstein>Tell me about your job at IBM.

0:02:49.840 --> 0:02:53.160
<v David Cox>So, I wear two hats at IBM. So one, I'm

0:02:53.200 --> 0:02:56.040
<v David Cox>the IBM Director of the MIT IBM Watson AI Lab.

0:02:56.560 --> 0:02:59.600
<v David Cox>So that's a joint lab between IBM and MIT where

0:02:59.600 --> 0:03:02.079
<v David Cox>we we try and invent what's next in AI. It's

0:03:02.080 --> 0:03:05.000
<v David Cox>been running for about five years, and then more recently

0:03:05.040 --> 0:03:08.040
<v David Cox>I started as the vice president for AI Models, and

0:03:08.120 --> 0:03:12.400
<v David Cox>I'm in charge of building IBM's foundation models, you know,

0:03:12.560 --> 0:03:15.080
<v David Cox>building these these big models, generative models that allow us

0:03:15.080 --> 0:03:17.320
<v David Cox>to have all kinds of new exciting capabilities in AI.

0:03:17.840 --> 0:03:19.760
<v Jacob Goldstein>So, so I want to talk to you a lot

0:03:19.840 --> 0:03:23.240
<v Jacob Goldstein>about foundation models, about generative AI. But before we get

0:03:23.280 --> 0:03:26.359
<v Jacob Goldstein>to that, let's just spend a minute on the IBM

0:03:26.600 --> 0:03:31.920
<v Jacob Goldstein>MIT collaboration. Where did that partnership start? How did it originate?

0:03:33.080 --> 0:03:35.880
<v David Cox>Yeah. So, actually, it turns out that MIT and IBM

0:03:36.120 --> 0:03:39.200
<v David Cox>have been collaborating for a very long time in the

0:03:39.240 --> 0:03:43.280
<v David Cox>area of AI. In fact, the term artificial intelligence was

0:03:43.360 --> 0:03:47.000
<v David Cox>coined in a nineteen fifty six workshop that was held

0:03:47.040 --> 0:03:49.200
<v David Cox>at Dartmouth. It was actually organized by an IBMer,

0:03:49.280 --> 0:03:52.600
<v David Cox>Nathaniel Rochester, who led the development of the IBM seven

0:03:52.600 --> 0:03:55.840
<v David Cox>oh one. So we've really been together in AI since

0:03:55.880 --> 0:04:00.720
<v David Cox>the beginning, and as AI kept accelerating more and more

0:04:00.760 --> 0:04:04.360
<v David Cox>and more, I think there was a really interesting decision

0:04:04.360 --> 0:04:07.080
<v David Cox>to say, let's make this a formal partnership. So IBM

0:04:07.120 --> 0:04:08.960
<v David Cox>in twenty seventeen announced it would be committing close

0:04:08.960 --> 0:04:11.800
<v David Cox>to a quarter billion dollars over ten years to have

0:04:11.920 --> 0:04:15.760
<v David Cox>this joint lab with MIT, and we we located ourselves

0:04:15.840 --> 0:04:18.000
<v David Cox>right on the campus and we've been developing very very

0:04:18.080 --> 0:04:20.359
<v David Cox>deep relationships where we can you know, really get to

0:04:20.400 --> 0:04:23.479
<v David Cox>know each other work shoulder to shoulder, conceiving what we

0:04:23.480 --> 0:04:26.120
<v David Cox>should work on next, and then executing the projects. And

0:04:26.200 --> 0:04:30.159
<v David Cox>it's really, you know, very few entities like this exist

0:04:30.640 --> 0:04:33.600
<v David Cox>between academia and industry. It's been really fun over the last

0:04:33.640 --> 0:04:34.920
<v David Cox>five years to be a part of it.

0:04:35.560 --> 0:04:37.080
<v Jacob Goldstein>And what do you think are some of the most

0:04:37.080 --> 0:04:40.600
<v Jacob Goldstein>important outcomes of this collaboration between IBM and MIT?

0:04:42.000 --> 0:04:44.680
<v David Cox>Yeah, so we're really kind of the tip of the

0:04:44.720 --> 0:04:49.640
<v David Cox>spear for IBM's AI strategy. So we're really looking,

0:04:49.880 --> 0:04:52.680
<v David Cox>you know, what's coming ahead and you know in areas

0:04:52.720 --> 0:04:56.480
<v David Cox>like Foundation Models, you know, as the field changes, MIT

0:04:56.640 --> 0:04:59.279
<v David Cox>people are interested in working on you know, faculty, students

0:04:59.279 --> 0:05:01.440
<v David Cox>and staff are interested in working on what's the latest thing,

0:05:01.480 --> 0:05:04.479
<v David Cox>what's the next thing. We at IBM Research are very

0:05:04.560 --> 0:05:06.640
<v David Cox>much interested in the same. So we can kind of

0:05:06.640 --> 0:05:09.599
<v David Cox>put out feelers, you know, interesting things that we're seeing

0:05:09.640 --> 0:05:12.400
<v David Cox>in our research, interesting things we're hearing in the field.

0:05:12.400 --> 0:05:14.839
<v David Cox>We can go and chase those opportunities. So when something

0:05:14.839 --> 0:05:17.880
<v David Cox>big comes, like the big change that's been happening lately

0:05:17.880 --> 0:05:20.440
<v David Cox>with Foundation Models, we're ready to jump on it. That's

0:05:20.440 --> 0:05:23.440
<v David Cox>really the purpose, that's that's the lab functioning the way

0:05:23.480 --> 0:05:26.640
<v David Cox>it should. We're also really interested in how do we

0:05:26.680 --> 0:05:29.760
<v David Cox>advance you know AI that can help with climate change

0:05:30.000 --> 0:05:32.960
<v David Cox>or you know, build better materials and all these kinds

0:05:32.960 --> 0:05:35.640
<v David Cox>of things that are you know, a broader aperture sometimes

0:05:35.640 --> 0:05:38.240
<v David Cox>than what we might consider just looking at the product

0:05:38.279 --> 0:05:40.919
<v David Cox>portfolio of IBM, and that that gives us again a

0:05:40.960 --> 0:05:43.159
<v David Cox>breadth where we can see connections that we might not

0:05:43.200 --> 0:05:46.039
<v David Cox>have seen otherwise. We can you know, think things that

0:05:46.080 --> 0:05:48.680
<v David Cox>help out society and also help out our customers.

0:05:49.480 --> 0:05:53.920
<v Jacob Goldstein>So the last whatever six months, say, there has been

0:05:53.960 --> 0:05:59.640
<v Jacob Goldstein>this wild rise in the public's interest in AI right,

0:05:59.680 --> 0:06:03.120
<v Jacob Goldstein>clearly coming out of these generative AI models that are

0:06:03.160 --> 0:06:07.279
<v Jacob Goldstein>really accessible, you know, certainly ChatGPT, language models like that,

0:06:07.320 --> 0:06:10.400
<v Jacob Goldstein>as well as models that generate images like Midjourney.

0:06:11.000 --> 0:06:14.480
<v Jacob Goldstein>I mean, can you just sort of briefly talk about

0:06:13.920 --> 0:06:17.920
<v Jacob Goldstein>the breakthroughs in AI that have made this moment feel

0:06:18.000 --> 0:06:21.480
<v Jacob Goldstein>so exciting, so revolutionary for artificial intelligence.

0:06:22.560 --> 0:06:27.279
<v David Cox>Yeah. You know, I've been studying AI basically my entire

0:06:27.320 --> 0:06:29.440
<v David Cox>adult life. Before I came to IBM, I was a

0:06:29.440 --> 0:06:32.000
<v David Cox>professor at Harvard. I've been doing this a long time,

0:06:32.240 --> 0:06:34.520
<v David Cox>and I've gotten used to being surprised. It sounds like

0:06:34.560 --> 0:06:37.640
<v David Cox>a joke, but it's serious, Like I'm getting used to

0:06:37.680 --> 0:06:41.599
<v David Cox>being surprised at the acceleration of the pace again. It

0:06:41.680 --> 0:06:44.560
<v David Cox>tracks actually a long way back. You know. There's lots

0:06:44.600 --> 0:06:47.160
<v David Cox>of things where there was an idea that just simmered

0:06:47.839 --> 0:06:51.200
<v David Cox>for a really long time. Some of the key math

0:06:51.680 --> 0:06:55.239
<v David Cox>behind the stuff that we have today, which is amazing.

0:06:55.960 --> 0:06:58.880
<v David Cox>There's an algorithm called back propagation, which is sort of

0:06:58.920 --> 0:07:01.880
<v David Cox>key to training networks that's been around, you know, since

0:07:01.920 --> 0:07:05.760
<v David Cox>the eighties in wide use, and really what happened was

0:07:06.080 --> 0:07:09.800
<v David Cox>it simmered for a long time and then enough data

0:07:10.040 --> 0:07:13.679
<v David Cox>and enough compute came. So we had enough data because

0:07:14.320 --> 0:07:17.520
<v David Cox>you know, we all started carrying multiple cameras around with us.

0:07:17.520 --> 0:07:20.320
<v David Cox>Our mobile phones have all, you know, all these cameras

0:07:20.400 --> 0:07:22.840
<v David Cox>and this. We put everything on the Internet, and there's

0:07:22.840 --> 0:07:25.360
<v David Cox>all this data out there. We caught a lucky break

0:07:25.360 --> 0:07:27.640
<v David Cox>that there was something called the graphics processing unit, which

0:07:27.920 --> 0:07:29.920
<v David Cox>turns out to be really useful for doing these kinds

0:07:29.960 --> 0:07:32.480
<v David Cox>of algorithms, maybe even more useful than it is for

0:07:32.560 --> 0:07:36.640
<v David Cox>doing graphics. They're great for graphics too. And things just kept

0:07:36.720 --> 0:07:39.920
<v David Cox>kind of adding to the snowball. So we had deep learning,

0:07:40.360 --> 0:07:43.960
<v David Cox>which is sort of a rebrand of neural networks that

0:07:44.040 --> 0:07:46.360
<v David Cox>I mentioned from from the eighties, and that was enabled

0:07:46.400 --> 0:07:50.080
<v David Cox>again by data, because we digitized the world, and compute

0:07:50.080 --> 0:07:52.480
<v David Cox>because because we kept building faster and faster and more

0:07:52.520 --> 0:07:55.760
<v David Cox>powerful computers, and then that allowed us to make this

0:07:55.760 --> 0:07:59.440
<v David Cox>this big breakthrough. And then you know, more recently, using

0:07:59.520 --> 0:08:03.600
<v David Cox>the same building blocks, that inexorable rise of more and

0:08:03.640 --> 0:08:08.160
<v David Cox>more and more data, and the technology called self-supervised learning.

0:08:08.640 --> 0:08:13.360
<v David Cox>Where the key difference there is that in traditional deep learning, you know,

0:08:13.400 --> 0:08:16.040
<v David Cox>for classifying images, you know, like is this a cat

0:08:16.120 --> 0:08:19.360
<v David Cox>or is this a dog in a picture? Those technologies

0:08:19.800 --> 0:08:23.120
<v David Cox>require supervision, so you have to take what you

0:08:23.200 --> 0:08:24.560
<v David Cox>have and then you have to label it. So you

0:08:24.600 --> 0:08:25.920
<v David Cox>have to take a picture of a cat, and then

0:08:25.960 --> 0:08:28.640
<v David Cox>you label it as a cat, and it turns out that,

0:08:28.800 --> 0:08:31.000
<v David Cox>you know, that's very powerful, but it takes a lot

0:08:31.000 --> 0:08:33.920
<v David Cox>of time to label cats and to label dogs, and

0:08:34.360 --> 0:08:36.280
<v David Cox>there's only so many labels that exist in the world.

0:08:36.679 --> 0:08:40.240
<v David Cox>So what really changed more recently is that we have

0:08:40.320 --> 0:08:42.800
<v David Cox>self-supervised learning where you don't have to have the labels.

0:08:42.800 --> 0:08:45.360
<v David Cox>We can just take unannotated data. And what that does

0:08:45.400 --> 0:08:48.480
<v David Cox>is it lets you use even more data. And that's

0:08:48.520 --> 0:08:52.120
<v David Cox>really what drove this this latest sort of rage. And

0:08:52.120 --> 0:08:54.280
<v David Cox>then and then all of a sudden we start getting

0:08:54.320 --> 0:08:58.199
<v David Cox>these these really powerful models. And then really this has

0:08:58.240 --> 0:09:03.040
<v David Cox>been a simmering technology, right, this has been happening for a

0:09:03.080 --> 0:09:07.280
<v David Cox>while and progressively getting more and more powerful. One of

0:09:07.280 --> 0:09:11.560
<v David Cox>the things that really happened with ChatGPT and technologies like

0:09:12.000 --> 0:09:15.079
<v David Cox>Stable Diffusion and Midjourney was that they made it

0:09:15.640 --> 0:09:18.319
<v David Cox>visible to the public. You know, you put it out

0:09:18.360 --> 0:09:20.600
<v David Cox>there the public can touch and feel, and they're like, wow,

0:09:20.880 --> 0:09:24.480
<v David Cox>not only is there palpable change, and wow this you know,

0:09:24.520 --> 0:09:26.000
<v David Cox>I can talk to this thing. Wow, this thing can

0:09:26.080 --> 0:09:28.959
<v David Cox>generate an image. Not only that, but everyone can touch

0:09:29.000 --> 0:09:33.240
<v David Cox>and feel and try. My kids can use some of

0:09:33.280 --> 0:09:38.720
<v David Cox>these AI art generation technologies. And that's really just launched,

0:09:38.800 --> 0:09:42.040
<v David Cox>you know. It's like a slingshot that propelled us into

0:09:42.360 --> 0:09:44.400
<v David Cox>a different regime in terms of the public awareness of

0:09:44.400 --> 0:09:45.239
<v David Cox>these technologies.

0:09:45.920 --> 0:09:49.040
<v Jacob Goldstein>You mentioned earlier in the conversation foundation models, and I

0:09:49.080 --> 0:09:50.920
<v Jacob Goldstein>want to talk a little bit about that. I mean,

0:09:50.960 --> 0:09:54.360
<v Jacob Goldstein>can you just tell me, you know, what are foundation

0:09:54.600 --> 0:09:57.320
<v Jacob Goldstein>models for AI and why are they a big deal?

0:09:58.520 --> 0:10:02.520
<v David Cox>Yeah. So this term foundation model was coined by a group

0:10:02.520 --> 0:10:06.360
<v David Cox>at Stanford, and I think it's actually a really apt

0:10:06.480 --> 0:10:09.680
<v David Cox>term because remember I said, you know, one of the

0:10:09.679 --> 0:10:13.160
<v David Cox>big things that unlocked this latest excitement was the fact

0:10:13.160 --> 0:10:17.040
<v David Cox>that we could use large amounts of unannotated data, that

0:10:17.080 --> 0:10:18.440
<v David Cox>we could train a model. We don't have to go

0:10:18.480 --> 0:10:22.000
<v David Cox>through the painful effort of labeling each and every example.

0:10:22.559 --> 0:10:24.800
<v David Cox>You still need to have your model do something you

0:10:24.800 --> 0:10:27.000
<v David Cox>wanted to do. You still need to tell it what

0:10:27.040 --> 0:10:28.600
<v David Cox>you want to do. You can't just have a model

0:10:28.640 --> 0:10:31.160
<v David Cox>that doesn't, you know, have any purpose. But what a

0:10:31.200 --> 0:10:35.000
<v David Cox>foundation model does is it provides a foundation, like a literal foundation.

0:10:35.280 --> 0:10:37.360
<v David Cox>You can sort of stand on the shoulders of giants.

0:10:37.360 --> 0:10:40.079
<v David Cox>You can build on these massively trained models and then

0:10:40.120 --> 0:10:42.280
<v David Cox>do a little bit on top. You know, you could

0:10:42.480 --> 0:10:44.560
<v David Cox>use just a few examples of what you're looking for

0:10:45.360 --> 0:10:47.520
<v David Cox>and you can get what you want from the model.

0:10:48.040 --> 0:10:50.080
<v David Cox>So just a little bit on top now gets you to

0:10:50.200 --> 0:10:52.240
<v David Cox>the results that a huge amount of effort used to

0:10:52.280 --> 0:10:54.240
<v David Cox>have to be put in, you know, to get from the

0:10:54.320 --> 0:10:56.360
<v David Cox>ground up to that level.

0:10:56.640 --> 0:11:00.440
<v Jacob Goldstein>I was trying to think of of an analogy for

0:11:00.679 --> 0:11:03.720
<v Jacob Goldstein>sort of foundation models versus what came before, and I

0:11:03.760 --> 0:11:06.200
<v Jacob Goldstein>don't know that I came up with a good one,

0:11:06.240 --> 0:11:07.959
<v Jacob Goldstein>but the best I could do was this. I want

0:11:07.960 --> 0:11:10.920
<v Jacob Goldstein>you to tell me if it's plausible. It's like before

0:11:11.000 --> 0:11:14.520
<v Jacob Goldstein>foundation models, it was like you had these sort of

0:11:14.600 --> 0:11:17.800
<v Jacob Goldstein>single use kitchen appliances. You could make a waffle iron

0:11:17.840 --> 0:11:20.360
<v Jacob Goldstein>if you wanted waffles, or you could make a toaster

0:11:20.520 --> 0:11:23.079
<v Jacob Goldstein>if you wanted to make toast. But a foundation model

0:11:23.160 --> 0:11:25.720
<v Jacob Goldstein>is like like an oven with a range on top.

0:11:25.800 --> 0:11:27.559
<v Jacob Goldstein>So it's like this machine and you could just cook

0:11:27.640 --> 0:11:29.480
<v Jacob Goldstein>anything with this machine.

0:11:30.120 --> 0:11:34.600
<v David Cox>Yeah, that's a great analogy. They're very versatile. The other

0:11:34.720 --> 0:11:37.280
<v David Cox>piece of it, too, is that they dramatically lower the

0:11:37.400 --> 0:11:40.560
<v David Cox>effort that it takes to do something that you want

0:11:40.600 --> 0:11:43.600
<v David Cox>to do. And sometimes I used to say about the

0:11:43.640 --> 0:11:45.600
<v David Cox>old world of AI, I would say, you know, the problem

0:11:45.640 --> 0:11:49.400
<v David Cox>with automation is that it's too labor intensive. It sounds

0:11:49.440 --> 0:11:50.400
<v David Cox>like I'm making a joke.

0:11:50.640 --> 0:11:55.160
<v Jacob Goldstein>Indeed, famously, if automation does one thing, it substitutes machines

0:11:55.280 --> 0:11:58.520
<v Jacob Goldstein>or computing power for labor. Right, So what does that

0:11:58.600 --> 0:12:02.880
<v Jacob Goldstein>mean to say AI or automation is too labor intensive?

0:12:03.360 --> 0:12:05.360
<v David Cox>It sounds like I'm making a joke, but I'm actually serious.

0:12:05.559 --> 0:12:08.800
<v David Cox>What I mean is that the effort it took the

0:12:08.880 --> 0:12:12.719
<v David Cox>old regime to automate something was very, very high. So

0:12:12.920 --> 0:12:15.800
<v David Cox>if I need to go and curate all this data,

0:12:15.840 --> 0:12:19.040
<v David Cox>collect all this data, and then carefully label all these examples,

0:12:19.440 --> 0:12:23.400
<v David Cox>that labeling itself might be incredibly expensive and time consuming. So

0:12:23.760 --> 0:12:26.360
<v David Cox>and we estimate anywhere between eighty to ninety percent of

0:12:26.400 --> 0:12:29.240
<v David Cox>the effort it takes to field an AI solution actually

0:12:29.360 --> 0:12:32.959
<v David Cox>is just spent on data so that that has some consequences,

0:12:33.240 --> 0:12:38.600
<v David Cox>which is the threshold for bothering. You know, if you're

0:12:38.600 --> 0:12:40.800
<v David Cox>going to only get a little bit of value back

0:12:41.040 --> 0:12:43.280
<v David Cox>from something, are you going to go through this huge

0:12:43.280 --> 0:12:46.800
<v David Cox>effort to curate all this data? And then when it

0:12:46.800 --> 0:12:49.240
<v David Cox>comes time to train the model, you need highly skilled

0:12:49.240 --> 0:12:53.280
<v David Cox>people, expensive or hard to find in the labor market.

0:12:53.440 --> 0:12:54.959
<v David Cox>You know, are you really going to do something that's

0:12:55.000 --> 0:12:56.920
<v David Cox>just a tiny, little incremental thing? No, you're going to

0:12:56.960 --> 0:13:01.000
<v David Cox>do only the highest value things that warrant that level

0:13:01.400 --> 0:13:01.959
<v David Cox>because you have

0:13:01.960 --> 0:13:05.960
<v Jacob Goldstein>To essentially build the whole machine from scratch, and there

0:13:06.000 --> 0:13:08.560
<v Jacob Goldstein>aren't many things where it's worth that much work to

0:13:08.600 --> 0:13:11.600
<v Jacob Goldstein>build a machine that's only going to do one narrow thing.

0:13:12.040 --> 0:13:15.000
<v David Cox>That's right, and then you tackle the next problem and

0:13:15.080 --> 0:13:17.400
<v David Cox>you basically have to start over. And you know, there

0:13:17.400 --> 0:13:20.240
<v David Cox>are some nuances here, like for images, you can pre

0:13:20.280 --> 0:13:22.800
<v David Cox>train a model on some other tasks and change it around.

0:13:22.800 --> 0:13:25.760
<v David Cox>So there are some examples of this, like non recurring

0:13:25.880 --> 0:13:28.480
<v David Cox>cost that we have in the old world too, But

0:13:28.480 --> 0:13:31.040
<v David Cox>by and large, it's just a lot of effort. It's hard.

0:13:31.320 --> 0:13:35.600
<v David Cox>It takes, you know, a large level of skill to implement.

0:13:36.400 --> 0:13:39.199
<v David Cox>One analogy that I like is, you know, think about

0:13:39.200 --> 0:13:41.320
<v David Cox>it as you know, you have a river of data,

0:13:41.679 --> 0:13:45.080
<v David Cox>you know, running through your company or your institution. Traditional

0:13:45.080 --> 0:13:47.600
<v David Cox>AI solutions are kind of like building a dam on

0:13:47.600 --> 0:13:51.080
<v David Cox>that river. You know. Dams are very expensive things to build.

0:13:51.400 --> 0:13:55.679
<v David Cox>They require highly specialized skills and lots of planning, and

0:13:55.880 --> 0:13:57.560
<v David Cox>you know you're only going to put a dam on

0:13:57.960 --> 0:14:00.719
<v David Cox>a river that's big enough that you're gonna get enough

0:14:00.800 --> 0:14:03.199
<v David Cox>energy out of it that it was worth the trouble. You're

0:14:03.200 --> 0:14:04.559
<v David Cox>gonna get a lot of value out of that dam.

0:14:04.679 --> 0:14:06.280
<v David Cox>If you have a river like that, you know, a

0:14:06.360 --> 0:14:09.960
<v David Cox>river of data, but it's actually the vast majority of

0:14:10.160 --> 0:14:12.520
<v David Cox>the water you know in your kingdom actually isn't in

0:14:12.559 --> 0:14:16.600
<v David Cox>that river. It's in puddles and creeks and babbling brooks,

0:14:16.640 --> 0:14:20.080
<v David Cox>And you know, there's a lot of value left on

0:14:20.120 --> 0:14:22.640
<v David Cox>the table because it's like, well, I can't there's nothing

0:14:22.720 --> 0:14:24.520
<v David Cox>you can do about it. It's just that that's too

0:14:25.480 --> 0:14:28.600
<v David Cox>low value. So it takes too much effort, so I'm

0:14:28.640 --> 0:14:30.200
<v David Cox>just not going to do it. The return on investment

0:14:30.560 --> 0:14:33.120
<v David Cox>just isn't there, So you just end up not automating

0:14:33.160 --> 0:14:35.960
<v David Cox>things because it's too much of a pain. Now what

0:14:36.000 --> 0:14:38.440
<v David Cox>foundation models do is they say, well, actually, no, we

0:14:38.480 --> 0:14:41.680
<v David Cox>can train a base model a foundation that you can

0:14:41.720 --> 0:14:43.360
<v David Cox>work on that we don't we don't care. We don't

0:14:43.400 --> 0:14:45.240
<v David Cox>specify what the task is ahead of time. We just

0:14:45.280 --> 0:14:48.400
<v David Cox>need to learn about the domain of data. So if

0:14:48.440 --> 0:14:51.320
<v David Cox>we want to build something that can understand English language,

0:14:51.640 --> 0:14:54.920
<v David Cox>there's a ton of English language text available out in

0:14:54.960 --> 0:14:59.040
<v David Cox>the world. We can now train models on huge quantities

0:14:59.040 --> 0:15:02.480
<v David Cox>of it, and then it learned the structure, learned how

0:15:02.640 --> 0:15:05.200
<v David Cox>language, you know, a good part of how language works, on

0:15:05.240 --> 0:15:07.600
<v David Cox>all that unlabeled data. And then when you roll up

0:15:07.600 --> 0:15:10.560
<v David Cox>with your task, you know, I want to solve this

0:15:10.560 --> 0:15:13.720
<v David Cox>particular problem, you don't have to start from scratch. You're

0:15:13.720 --> 0:15:17.200
<v David Cox>starting from a very very very high place. So that

0:15:17.280 --> 0:15:19.560
<v David Cox>just gives you the ability to just you know, now,

0:15:19.600 --> 0:15:22.440
<v David Cox>all of a sudden, everything is accessible. All the puddles

0:15:22.440 --> 0:15:25.160
<v David Cox>and creeks and babbling brooks and kettle ponds, you know, those

0:15:25.200 --> 0:15:29.960
<v David Cox>are all accessible now. And that's that's very exciting. But

0:15:30.040 --> 0:15:32.520
<v David Cox>it just changes the equation on what kinds of problems

0:15:32.640 --> 0:15:33.840
<v David Cox>you could use AI to solve.

0:15:33.960 --> 0:15:39.400
<v Jacob Goldstein>And so foundation models basically mean that automating some new

0:15:39.520 --> 0:15:42.760
<v Jacob Goldstein>task is much less labor intensive. The sort of marginal

0:15:42.840 --> 0:15:45.840
<v Jacob Goldstein>effort to do some new automation thing is much lower

0:15:45.840 --> 0:15:49.120
<v Jacob Goldstein>because you're building on top of the foundation model rather

0:15:49.200 --> 0:15:53.560
<v Jacob Goldstein>than starting from scratch. Absolutely, so that is that is

0:15:53.680 --> 0:15:57.240
<v Jacob Goldstein>like the exciting good news. I do feel like there's

0:15:58.080 --> 0:16:01.240
<v Jacob Goldstein>a little bit of a countervailing idea that's worth talking about here,

0:16:01.280 --> 0:16:03.400
<v Jacob Goldstein>and that is the idea that even though there are

0:16:03.440 --> 0:16:08.000
<v Jacob Goldstein>these foundation models that are really powerful that are relatively

0:16:08.040 --> 0:16:10.640
<v Jacob Goldstein>easy to build on top of, it's still the case,

0:16:10.720 --> 0:16:13.960
<v Jacob Goldstein>right that there is not some one size fits all

0:16:14.080 --> 0:16:17.680
<v Jacob Goldstein>foundation model. So you know, what does that mean and

0:16:17.800 --> 0:16:20.280
<v Jacob Goldstein>why is that important to think about in this context?

0:16:20.880 --> 0:16:24.680
<v David Cox>Yeah, so we believe very strongly that there isn't just

0:16:24.800 --> 0:16:27.640
<v David Cox>one model to rule them all. There's a number of

0:16:27.720 --> 0:16:30.720
<v David Cox>reasons why that could be true. One which I think

0:16:30.760 --> 0:16:34.800
<v David Cox>is important and very relevant today is how much energy

0:16:35.120 --> 0:16:39.880
<v David Cox>these models can consume. So these models, you know, can

0:16:39.920 --> 0:16:45.360
<v David Cox>get very, very large. So one thing that we're starting

0:16:45.400 --> 0:16:48.120
<v David Cox>to see or starting to believe, is that you probably

0:16:48.160 --> 0:16:53.280
<v David Cox>shouldn't use one giant sledgehammer model to solve every single problem,

0:16:53.440 --> 0:16:55.400
<v David Cox>you know, like we should pick the right size model

0:16:55.440 --> 0:16:58.240
<v David Cox>to solve the problem. We shouldn't necessarily assume that we

0:16:58.280 --> 0:17:02.840
<v David Cox>need the biggest, baddest model for every little use case.

0:17:03.320 --> 0:17:05.520
<v David Cox>And we're also seeing that, you know, small models that

0:17:05.560 --> 0:17:09.760
<v David Cox>are trained like to specialize on particular domains can actually

0:17:09.800 --> 0:17:13.600
<v David Cox>outperform much bigger models. So bigger isn't always even better.

0:17:13.680 --> 0:17:16.280
<v Jacob Goldstein>So they're more efficient and they do the thing you

0:17:16.320 --> 0:17:17.960
<v Jacob Goldstein>want them to do better as well.

0:17:18.480 --> 0:17:21.760
<v David Cox>That's right. So Stanford, for instance, a group at Stanford

0:17:21.800 --> 0:17:24.919
<v David Cox>trained a model. It is a two point seven billion

0:17:24.960 --> 0:17:28.080
<v David Cox>parameter model, which isn't terribly big by today's standards. They

0:17:28.080 --> 0:17:30.359
<v David Cox>trained it just on the biomedical literature, you know, this

0:17:30.400 --> 0:17:32.800
<v David Cox>is the kind of thing that universities do. And what

0:17:32.840 --> 0:17:36.320
<v David Cox>they showed was that this model was better at answering

0:17:36.400 --> 0:17:38.920
<v David Cox>questions about the biomedical literature than some models that were

0:17:39.440 --> 0:17:43.159
<v David Cox>one hundred billion parameters, you know, many times larger. So

0:17:43.320 --> 0:17:45.880
<v David Cox>it's a little bit like you know, asking an expert

0:17:46.320 --> 0:17:49.600
<v David Cox>for help on something versus asking the smartest person, you know,

0:17:50.240 --> 0:17:53.040
<v David Cox>the smartest person you know may be very smart, but they're

0:17:53.040 --> 0:17:56.399
<v David Cox>not going to have the expertise. And then as an added bonus,

0:17:56.440 --> 0:17:58.399
<v David Cox>you know, this is now a much smaller model, it's

0:17:58.480 --> 0:18:00.919
<v David Cox>much more efficient to run, and, you know,

0:18:00.960 --> 0:18:04.760
<v David Cox>it's cheaper. So there's lots of different advantages there. So

0:18:05.040 --> 0:18:08.280
<v David Cox>I think we're going to see a tension in the

0:18:08.320 --> 0:18:11.600
<v David Cox>industry between vendors that say, hey, this is the one,

0:18:11.800 --> 0:18:14.159
<v David Cox>you know, big model and then others that say, well, actually,

0:18:14.440 --> 0:18:16.960
<v David Cox>you know, there's there's you know, lots of different tools

0:18:16.960 --> 0:18:19.000
<v David Cox>we can use that all have this nice quality that

0:18:19.040 --> 0:18:21.680
<v David Cox>we outlined at the beginning, and then we should really

0:18:21.680 --> 0:18:23.200
<v David Cox>pick the one that makes the most sense for the

0:18:23.560 --> 0:18:24.280
<v David Cox>task at hand.

0:18:25.560 --> 0:18:29.960
<v Jacob Goldstein>So there's sustainability, basically efficiency. Another kind of set of

0:18:29.960 --> 0:18:32.239
<v Jacob Goldstein>issues that come up a lot with AI are

0:18:32.440 --> 0:18:36.240
<v Jacob Goldstein>bias and hallucination. Can you talk a little bit about bias

0:18:36.480 --> 0:18:38.720
<v Jacob Goldstein>and hallucination, what they are and how you're working to

0:18:39.119 --> 0:18:40.240
<v Jacob Goldstein>mitigate those problems.

0:18:40.640 --> 0:18:43.479
<v David Cox>Yeah, so there are lots of issues still as amazing

0:18:43.520 --> 0:18:46.440
<v David Cox>as these technologies are, and they are amazing, let's let's

0:18:46.480 --> 0:18:48.960
<v David Cox>be very clear, lots of great things we're going to

0:18:49.080 --> 0:18:52.880
<v David Cox>enable with these kinds of technologies. Bias isn't a new problem,

0:18:53.240 --> 0:18:57.840
<v David Cox>so you know, basically we've seen this since the beginning

0:18:57.880 --> 0:19:00.760
<v David Cox>of AI. If you train a model on data that

0:19:01.200 --> 0:19:03.320
<v David Cox>has a bias in it, the model is going to

0:19:03.359 --> 0:19:07.920
<v David Cox>recapitulate that bias when it provides its answers. So every time,

0:19:08.119 --> 0:19:10.639
<v David Cox>you know, if all the text you have says, you know,

0:19:10.680 --> 0:19:13.760
<v David Cox>it's more likely to refer to female nurses and male scientists,

0:19:13.800 --> 0:19:15.879
<v David Cox>then you're going to you know, get models that you know.

0:19:15.960 --> 0:19:19.040
<v David Cox>For instance, there was an example where a machine learning

0:19:19.040 --> 0:19:23.440
<v David Cox>based translation system translated from Hungarian to English. Hungarian doesn't

0:19:23.480 --> 0:19:26.760
<v David Cox>have gendered pronouns, English does, and when you ask it

0:19:26.800 --> 0:19:29.119
<v David Cox>to translate, it would translate they are a nurse to

0:19:29.560 --> 0:19:32.520
<v David Cox>she is a nurse, and translate they are a scientist to

0:19:32.600 --> 0:19:35.680
<v David Cox>he is a scientist. And that's not because the people

0:19:35.720 --> 0:19:38.520
<v David Cox>who wrote the algorithm were building in bias and coding

0:19:38.560 --> 0:19:40.120
<v David Cox>in like, oh, it's got to be this way. It's

0:19:40.119 --> 0:19:42.359
<v David Cox>because the data was like that. You know, we have

0:19:42.480 --> 0:19:46.920
<v David Cox>biases in our society and they're reflected in our data

0:19:46.960 --> 0:19:50.600
<v David Cox>and our text and our images everywhere. And then the

0:19:50.640 --> 0:19:53.760
<v David Cox>models they're just mapping from what they've seen in their

0:19:53.800 --> 0:19:56.600
<v David Cox>training data to the result that you're trying to get

0:19:56.600 --> 0:19:59.280
<v David Cox>them to do and to give, and then these biases

0:19:59.320 --> 0:20:04.240
<v David Cox>come out. So there's a very active program of research

0:20:04.600 --> 0:20:06.600
<v David Cox>in you know, we we do quite a bit at

0:20:06.600 --> 0:20:10.320
<v David Cox>IBM Research and MIT, but also all over the

0:20:10.359 --> 0:20:13.040
<v David Cox>community and industry and academia trying to figure out how

0:20:13.080 --> 0:20:16.800
<v David Cox>do we explicitly remove these biases, how do we identify them,

0:20:17.119 --> 0:20:18.960
<v David Cox>how do you know, how do we build tools that

0:20:19.000 --> 0:20:21.119
<v David Cox>allow people to audit their systems to make sure they

0:20:21.119 --> 0:20:23.760
<v David Cox>aren't biased. So this is a really important thing. And

0:20:23.920 --> 0:20:27.159
<v David Cox>you know, again this was here since the beginning, uh,

0:20:27.440 --> 0:20:31.560
<v David Cox>you know, of machine learning and AI, but foundation models

0:20:31.560 --> 0:20:34.840
<v David Cox>and large language models and generative AI just bring it

0:20:34.840 --> 0:20:37.600
<v David Cox>into sharper even sharper focus because there's just so much

0:20:37.680 --> 0:20:41.000
<v David Cox>data and it's sort of building in, baking in all

0:20:41.040 --> 0:20:44.520
<v David Cox>these different biases we have, so that that's that's absolutely

0:20:45.040 --> 0:20:47.840
<v David Cox>a problem that these models have. Another one that you

0:20:47.880 --> 0:20:51.800
<v David Cox>mentioned was hallucinations. So even the most impressive of our

0:20:51.880 --> 0:20:55.720
<v David Cox>models will often just make stuff up. And you know,

0:20:55.920 --> 0:20:58.919
<v David Cox>the technical term that the field has chosen is hallucination.

0:20:59.480 --> 0:21:02.439
<v David Cox>To give you an example, I asked ChatGPT to

0:21:02.720 --> 0:21:06.480
<v David Cox>create a biography of David Cox at IBM, and you know,

0:21:06.720 --> 0:21:09.439
<v David Cox>it started off really well, you know, identifying that I

0:21:09.480 --> 0:21:11.800
<v David Cox>was the director of the MIT IBM Watson AI Lab, and

0:21:11.800 --> 0:21:14.200
<v David Cox>said a few words about that, and then it proceeded

0:21:14.200 --> 0:21:18.760
<v David Cox>to create an authoritative but completely fake biography of me.

0:21:18.800 --> 0:21:21.320
<v David Cox>Where I was British, I was born in the UK,

0:21:22.680 --> 0:21:25.640
<v David Cox>I went to British university, you know universities in the UK.

0:21:25.720 --> 0:21:27.320
<v David Cox>I was a professor there, with authority.

0:21:27.400 --> 0:21:30.960
<v Jacob Goldstein>Right, it's the certainty that that is weird about it, right,

0:21:30.960 --> 0:21:34.240
<v Jacob Goldstein>It's it's dead certain that you're from the UK, et cetera.

0:21:34.840 --> 0:21:37.879
<v David Cox>Absolutely, yeah, it has all kinds of flourishes like I

0:21:37.920 --> 0:21:42.639
<v David Cox>won awards in the UK. So yeah, it's it's problematic

0:21:42.720 --> 0:21:45.120
<v David Cox>because it kind of pokes at a lot of weak

0:21:45.160 --> 0:21:49.600
<v David Cox>spots in our human psychology where if something sounds coherent,

0:21:50.600 --> 0:21:53.119
<v David Cox>we're likely to assume it's true. We're not used to

0:21:53.119 --> 0:21:57.160
<v David Cox>interacting with people who eloquently and authoritatively you know, emit

0:21:57.320 --> 0:21:58.840
<v David Cox>complete nonsense.

0:21:58.440 --> 0:22:01.080
<v Jacob Goldstein>Like yeah, you know, you know, we could debate about that,

0:22:01.119 --> 0:22:03.600
<v Jacob Goldstein>but yeah, we could debate about that. But yes, the

0:22:04.520 --> 0:22:08.159
<v Jacob Goldstein>sort of blithe confidence throws you off when you realize

0:22:08.200 --> 0:22:09.119
<v Jacob Goldstein>it's completely wrong.

0:22:09.240 --> 0:22:12.000
<v David Cox>Right, that's right. And we do have a little bit

0:22:12.040 --> 0:22:15.240
<v David Cox>of like a Great and Powerful Oz sort of vibe

0:22:15.280 --> 0:22:17.600
<v David Cox>going sometimes where we're like, well, you know, the AI

0:22:17.800 --> 0:22:21.560
<v David Cox>is all knowing and therefore whatever it says must be true.

0:22:21.800 --> 0:22:26.040
<v David Cox>But these things will make up stuff, you know, very aggressively,

0:22:26.760 --> 0:22:29.159
<v David Cox>and you know, everyone can try asking it for

0:22:29.240 --> 0:22:32.720
<v David Cox>their their bio. You'll you'll get something that You'll always

0:22:32.720 --> 0:22:35.000
<v David Cox>get something that's of the right form, that has the

0:22:35.119 --> 0:22:38.040
<v David Cox>right tone. But you know, the facts just aren't necessarily there.

0:22:38.320 --> 0:22:40.760
<v David Cox>So that's obviously a problem. We need to figure out

0:22:40.760 --> 0:22:43.959
<v David Cox>how to close those gaps, fix those problems. There's lots

0:22:44.000 --> 0:22:46.480
<v David Cox>of ways we can use them much more easily.

0:22:46.600 --> 0:22:49.320
<v Malcolm Gladwell>I'd just like to say, faced with the awesome potential

0:22:49.359 --> 0:22:52.400
<v Malcolm Gladwell>of what these technologies might do, it's a bit encouraging

0:22:52.440 --> 0:22:55.960
<v Malcolm Gladwell>to hear that even ChatGPT has a weakness for

0:22:56.080 --> 0:23:01.200
<v Malcolm Gladwell>inventing flamboyant, if fictional, versions of people's lives. And while

0:23:01.320 --> 0:23:04.879
<v Malcolm Gladwell>entertaining ourselves with ChatGPT and Midjourney is important,

0:23:05.359 --> 0:23:09.400
<v Malcolm Gladwell>the way lay people use consumer-facing chatbots and generative

0:23:09.520 --> 0:23:13.800
<v Malcolm Gladwell>AI is just fundamentally different from the way an enterprise

0:23:13.880 --> 0:23:17.359
<v Malcolm Gladwell>business uses AI. How can we harness the abilities of

0:23:17.480 --> 0:23:20.840
<v Malcolm Gladwell>artificial intelligence to help us solve the problems we face

0:23:20.920 --> 0:23:24.520
<v Malcolm Gladwell>in business and technology? Let's listen on as David and

0:23:24.600 --> 0:23:26.440
<v Malcolm Gladwell>Jacob continue their conversation.

0:23:27.200 --> 0:23:30.160
<v Jacob Goldstein>We've been talking in a somewhat abstract way about AI

0:23:30.280 --> 0:23:33.040
<v Jacob Goldstein>in the ways it can be used. Let's talk in

0:23:33.040 --> 0:23:36.400
<v Jacob Goldstein>a little bit more of a specific way. Can you

0:23:36.440 --> 0:23:40.240
<v Jacob Goldstein>just talk about some examples of business challenges that can

0:23:40.280 --> 0:23:43.640
<v Jacob Goldstein>be solved with automation with this kind of automation we're

0:23:43.640 --> 0:23:44.560
<v Jacob Goldstein>talking about.

0:23:45.119 --> 0:23:48.520
<v David Cox>Yeah, so really, really, the sky's the limit. There's

0:23:48.560 --> 0:23:52.480
<v David Cox>a whole set of different applications that these models are

0:23:52.520 --> 0:23:55.359
<v David Cox>really good at, and basically it's a superset of everything

0:23:55.359 --> 0:23:58.480
<v David Cox>we used to use AI for in business. So you know,

0:23:59.080 --> 0:24:00.760
<v David Cox>the simple kinds of things are like, hey, if I

0:24:00.760 --> 0:24:03.520
<v David Cox>have text and I you know, I have like product reviews,

0:24:03.840 --> 0:24:04.959
<v David Cox>and I want to be able to tell if these

0:24:05.000 --> 0:24:07.119
<v David Cox>are positive or negative. You know, like let's look at

0:24:07.119 --> 0:24:08.800
<v David Cox>all the negative reviews so we can have a human

0:24:08.800 --> 0:24:12.080
<v David Cox>look through them and see what was up. Very common

0:24:12.440 --> 0:24:14.880
<v David Cox>business use case. You can do it with traditional deep

0:24:14.960 --> 0:24:18.399
<v David Cox>learning based AI. So so there's things like that that

0:24:18.440 --> 0:24:20.560
<v David Cox>are, you know, the very prosaic sort that we were

0:24:20.600 --> 0:24:22.400
<v David Cox>already doing. We've been doing it for a long time.

0:24:23.280 --> 0:24:26.159
<v David Cox>Then you get situations that are that were harder for

0:24:26.200 --> 0:24:29.040
<v David Cox>the old AI. Like, if I'm, I want to

0:24:29.400 --> 0:24:32.040
<v David Cox>compress something, like, I want to, I have, like, say

0:24:32.040 --> 0:24:34.400
<v David Cox>I have a chat transcript, like a customer called in

0:24:35.200 --> 0:24:38.800
<v David Cox>and they had a complaint, they call back, Okay, now

0:24:38.800 --> 0:24:41.600
<v David Cox>a new you know, a person on the line needs

0:24:41.640 --> 0:24:44.479
<v David Cox>to go read the old transcript to catch up. Wouldn't

0:24:44.480 --> 0:24:46.760
<v David Cox>it be better if we could just summarize that, just

0:24:46.800 --> 0:24:49.439
<v David Cox>condense it all down quick little paragraph. You know, customer

0:24:49.480 --> 0:24:51.160
<v David Cox>called, they were upset about this, rather than having

0:24:51.200 --> 0:24:53.359
<v David Cox>to read the blow by blow. There's just lots of

0:24:53.520 --> 0:24:56.679
<v David Cox>settings like that where summarization is really helpful. Hey, you

0:24:56.680 --> 0:25:00.439
<v David Cox>have a meeting and I'd like to just automatically, you know,

0:25:00.520 --> 0:25:03.080
<v David Cox>have that meeting or that email or whatever. I'd like

0:25:03.119 --> 0:25:04.680
<v David Cox>to just have it condensed down so I can really

0:25:04.760 --> 0:25:07.600
<v David Cox>quickly get to the heart of the matter. These models

0:25:07.600 --> 0:25:09.800
<v David Cox>are really good at doing that. They're also

0:25:09.800 --> 0:25:12.480
<v David Cox>really good at question answering. So if I want to

0:25:12.480 --> 0:25:14.800
<v David Cox>find out what's how many vacation days do I have?

0:25:15.119 --> 0:25:19.520
<v David Cox>I can now interact in natural language with a system

0:25:19.600 --> 0:25:21.840
<v David Cox>that can go and that has access to our

0:25:21.960 --> 0:25:24.320
<v David Cox>HR policies, and I can actually have a you know,

0:25:24.320 --> 0:25:26.800
<v David Cox>a multi turn conversation where I can, you know, like

0:25:26.840 --> 0:25:29.480
<v David Cox>I would have with, you know, somebody, you know, an actual

0:25:30.320 --> 0:25:34.840
<v David Cox>HR professional or customer service representative. So a big part,

0:25:35.600 --> 0:25:38.119
<v David Cox>you know, of what this is doing is it's it's

0:25:38.480 --> 0:25:41.120
<v David Cox>putting an interface. You know, when we think of computer interfaces,

0:25:41.119 --> 0:25:44.760
<v David Cox>we're usually thinking about UI user interface elements where I

0:25:44.800 --> 0:25:48.320
<v David Cox>click on menus and there's buttons and all this stuff. Increasingly,

0:25:48.400 --> 0:25:52.120
<v David Cox>now we can just talk, you know, you just in words.

0:25:52.200 --> 0:25:54.080
<v David Cox>You can describe what you want, you want an answer,

0:25:54.240 --> 0:25:56.840
<v David Cox>ask a question, you want to sort of command the

0:25:56.840 --> 0:25:59.639
<v David Cox>system to do something, rather than having to learn how

0:25:59.640 --> 0:26:01.800
<v David Cox>to do that clicking buttons, which might be inefficient. Now

0:26:01.800 --> 0:26:03.320
<v David Cox>we can just sort of spell it out.

0:26:03.200 --> 0:26:06.720
<v Jacob Goldstein>Interesting, right, the graphical user interface that we all

0:26:06.760 --> 0:26:10.280
<v Jacob Goldstein>sort of default to, that's not like the state of nature, right,

0:26:10.359 --> 0:26:12.879
<v Jacob Goldstein>That's a thing that was invented and just came to

0:26:12.920 --> 0:26:15.320
<v Jacob Goldstein>be the standard way that we interact with computers. And

0:26:15.359 --> 0:26:19.800
<v Jacob Goldstein>so you could imagine, as you're saying, like chat essentially

0:26:20.000 --> 0:26:23.240
<v Jacob Goldstein>chatting with the machine could could become a sort of

0:26:23.320 --> 0:26:26.560
<v Jacob Goldstein>standard user interface, just like the graphical user interface did,

0:26:26.720 --> 0:26:28.119
<v Jacob Goldstein>you know over the past several decades.

0:26:28.600 --> 0:26:32.040
<v David Cox>Absolutely, And I think those kinds of conversational interfaces are

0:26:32.040 --> 0:26:36.280
<v David Cox>going to be hugely important for increasing our productivity. It's

0:26:36.280 --> 0:26:38.480
<v David Cox>just a lot easier than if I have to learn how

0:26:38.520 --> 0:26:40.439
<v David Cox>to use a tool or I have to kind of

0:26:40.440 --> 0:26:43.159
<v David Cox>have awkward, you know, interactions with the computer. I can

0:26:43.240 --> 0:26:44.840
<v David Cox>just tell it what I want and it can understand it.

0:26:44.880 --> 0:26:48.080
<v David Cox>It could, you know, potentially even ask questions back to clarify

0:26:48.240 --> 0:26:53.120
<v David Cox>and have those kinds of conversations that can be extremely powerful.

0:26:53.240 --> 0:26:54.840
<v David Cox>And in fact, one area where that's going to I

0:26:54.880 --> 0:26:58.080
<v David Cox>think be absolutely game changing is in code. When we

0:26:58.119 --> 0:27:03.160
<v David Cox>write code. You know, programming languages are a way for

0:27:03.280 --> 0:27:07.120
<v David Cox>us to sort of match between our very sloppy way

0:27:07.160 --> 0:27:10.000
<v David Cox>of talking and the very exact way that you need

0:27:10.040 --> 0:27:12.440
<v David Cox>to command a computer to do what you wanted to do.

0:27:12.760 --> 0:27:15.480
<v David Cox>They're cumbersome to learn. They can you know, create very

0:27:15.520 --> 0:27:18.480
<v David Cox>complex systems that are very hard to reason about. And

0:27:18.680 --> 0:27:20.960
<v David Cox>we're already starting to see the ability to just write

0:27:20.960 --> 0:27:23.560
<v David Cox>down what you want and AI will generate the code

0:27:23.560 --> 0:27:25.360
<v David Cox>for you. And I think we're just going to see

0:27:25.359 --> 0:27:27.879
<v David Cox>a huge revolution of like we just converse, you know, and

0:27:27.880 --> 0:27:30.000
<v David Cox>we can have a conversation to say what we want,

0:27:30.040 --> 0:27:33.359
<v David Cox>and then the computer can actually not only do fixed

0:27:33.400 --> 0:27:35.560
<v David Cox>actions and do things for us, but it can actually

0:27:35.600 --> 0:27:37.840
<v David Cox>even write code to do new things, you know, and

0:27:38.480 --> 0:27:41.560
<v David Cox>generate software itself. Given how much software we have, how

0:27:41.640 --> 0:27:44.320
<v David Cox>much craving we have for software, like we'll never have

0:27:44.520 --> 0:27:47.880
<v David Cox>enough software in our world, uh, you know, the ability

0:27:47.920 --> 0:27:51.199
<v David Cox>to have AI systems as a helper in that, I

0:27:51.200 --> 0:27:52.880
<v David Cox>think we're going to see a lot of a lot

0:27:52.880 --> 0:27:53.520
<v David Cox>of value there.

0:27:54.720 --> 0:27:57.360
<v Jacob Goldstein>So if you if you think about the different ways

0:27:58.000 --> 0:28:00.240
<v Jacob Goldstein>AI might be applied to business, I mean you've talked

0:28:00.240 --> 0:28:02.560
<v Jacob Goldstein>about a number of the sort of classic use cases.

0:28:03.240 --> 0:28:06.600
<v Jacob Goldstein>What are some of the more out there use cases,

0:28:06.600 --> 0:28:09.520
<v Jacob Goldstein>What are some you know, unique ways you could imagine

0:28:09.560 --> 0:28:11.320
<v Jacob Goldstein>AI being applied to business.

0:28:12.960 --> 0:28:15.679
<v David Cox>Yeah, really, the sky's the limit. I mean, we have

0:28:15.760 --> 0:28:17.959
<v David Cox>one project that I'm kind of a fan of where

0:28:18.600 --> 0:28:22.080
<v David Cox>we actually were working with a mechanical engineering professor at

0:28:22.160 --> 0:28:25.159
<v David Cox>MIT working on a classic problem, how do you build

0:28:25.520 --> 0:28:28.920
<v David Cox>linkage systems which are like you imagine bars and joints

0:28:29.080 --> 0:28:31.200
<v David Cox>and levers, you know, the things that

0:28:31.160 --> 0:28:34.679
<v Jacob Goldstein>Are building a thing, building a physical machine of some kind.

0:28:35.240 --> 0:28:40.400
<v David Cox>Like real like metal and you know nineteenth century just

0:28:40.600 --> 0:28:43.520
<v David Cox>old school industrial revolution. Yeah yeah, yeah, but you know

0:28:43.560 --> 0:28:46.320
<v David Cox>the little arm that's that's holding up my microphone in

0:28:46.320 --> 0:28:48.800
<v David Cox>front of me, cranes that build your buildings, you know,

0:28:48.880 --> 0:28:51.400
<v David Cox>parts of your engines. This is like classical stuff. It

0:28:51.440 --> 0:28:53.720
<v David Cox>turns out that, you know, humans, if you want to

0:28:53.720 --> 0:28:56.920
<v David Cox>build an advanced system, you decide what like curve you

0:28:56.960 --> 0:29:00.520
<v David Cox>want to create, and then a human, together with a computer program,

0:29:00.600 --> 0:29:03.800
<v David Cox>can build a five or six bar linkage and then

0:29:03.840 --> 0:29:05.360
<v David Cox>that's kind of where you top out, because it

0:29:05.360 --> 0:29:08.960
<v David Cox>gets too complicated to work with more than that. We built

0:29:09.000 --> 0:29:11.800
<v David Cox>a generative AI system that can build twenty bar linkages,

0:29:11.880 --> 0:29:15.000
<v David Cox>like arbitrarily complex. So these are machines that are beyond

0:29:15.040 --> 0:29:20.200
<v David Cox>the capability of a human to design themselves. Another example

0:29:20.240 --> 0:29:23.479
<v David Cox>we have an AI system that can generate electronic circuits.

0:29:23.480 --> 0:29:25.280
<v David Cox>You know, we had a project where we're working where

0:29:25.280 --> 0:29:29.400
<v David Cox>we were building better power converters which allow our computers

0:29:29.440 --> 0:29:32.920
<v David Cox>and our devices to be more efficient, save energy, you know,

0:29:33.280 --> 0:29:36.160
<v David Cox>less carbon. But I think the world around us

0:29:36.160 --> 0:29:39.200
<v David Cox>has always been shaped by technology. If we look around,

0:29:39.360 --> 0:29:41.200
<v David Cox>you know, just think about how many steps and how

0:29:41.200 --> 0:29:44.160
<v David Cox>many people and how many designs went into the table

0:29:44.240 --> 0:29:47.960
<v David Cox>and the chair and the lamp. It's really just astonishing.

0:29:48.720 --> 0:29:52.160
<v David Cox>And that's already you know, the fruit of automation and

0:29:52.200 --> 0:29:53.960
<v David Cox>computers and those kinds of tools. But we're going to

0:29:54.000 --> 0:29:57.480
<v David Cox>see that increasingly be a product also of AI. It's just

0:29:57.520 --> 0:29:59.800
<v David Cox>going to be everywhere around us. Everything we touch is

0:29:59.800 --> 0:30:02.240
<v David Cox>going to have been, you know, helped in some

0:30:02.320 --> 0:30:05.480
<v David Cox>way to get to you by you know.

0:30:05.480 --> 0:30:08.400
<v Jacob Goldstein>That is a pretty profound transformation that you're talking about

0:30:08.440 --> 0:30:11.280
<v Jacob Goldstein>in business. How do you think about the implications of

0:30:11.320 --> 0:30:14.840
<v Jacob Goldstein>that both for the sort of you know, business itself

0:30:15.240 --> 0:30:17.040
<v Jacob Goldstein>and also for employees?

0:30:18.760 --> 0:30:21.720
<v David Cox>Yeah, so I think for businesses, this is going to

0:30:22.160 --> 0:30:26.040
<v David Cox>cut costs, create new opportunities for customers, you know,

0:30:26.120 --> 0:30:29.680
<v David Cox>like there's just you know, it's sort of all upside right,

0:30:29.760 --> 0:30:32.600
<v David Cox>Like for the workers, I think the story

0:30:32.640 --> 0:30:35.600
<v David Cox>is mostly good too. You know, like how many things

0:30:35.640 --> 0:30:39.000
<v David Cox>do you do in your day that you'd really rather

0:30:39.120 --> 0:30:41.640
<v David Cox>not right? You know, and we're used to having things

0:30:41.720 --> 0:30:45.200
<v David Cox>we don't like automated away.

0:30:45.280 --> 0:30:47.760
<v David Cox>you know, if you didn't like walking many miles to work,

0:30:47.840 --> 0:30:49.400
<v David Cox>then you know, like you can have a car and

0:30:49.760 --> 0:30:51.960
<v David Cox>you can drive there. Or we used to have a

0:30:52.080 --> 0:30:54.960
<v David Cox>huge fraction, over ninety percent, of the US population engaged

0:30:54.960 --> 0:30:58.000
<v David Cox>in agriculture, and then we mechanized it. Now very few

0:30:58.000 --> 0:31:00.000
<v David Cox>people work in agriculture. A small number of people can

0:31:00.120 --> 0:31:02.360
<v David Cox>do the work of a large number of people. And

0:31:02.400 --> 0:31:04.920
<v David Cox>then you know, things like email, and you know, they've

0:31:04.960 --> 0:31:07.640
<v David Cox>led to huge productivity enhancements because I don't need to

0:31:07.640 --> 0:31:09.960
<v David Cox>be writing letters and sending them in the mail. I

0:31:10.000 --> 0:31:14.400
<v David Cox>can just instantly communicate with people. We just become more effective,

0:31:14.560 --> 0:31:18.640
<v David Cox>Like our jobs have transformed, whether it's a physical job

0:31:18.720 --> 0:31:21.600
<v David Cox>like agriculture, or whether it's a knowledge worker job where

0:31:21.600 --> 0:31:25.320
<v David Cox>you're sending emails and communicating with people and coordinating teams.

0:31:25.640 --> 0:31:28.280
<v David Cox>We've just gotten better. And you know, the technology has

0:31:28.280 --> 0:31:31.200
<v David Cox>just made us more productive. And this is just another example.

0:31:31.560 --> 0:31:34.200
<v David Cox>Now you know, there are people who worry that you know,

0:31:34.880 --> 0:31:37.320
<v David Cox>we'll be so good at that that maybe jobs will

0:31:37.320 --> 0:31:41.200
<v David Cox>be displaced, and that's a legitimate concern, But just like

0:31:42.560 --> 0:31:44.720
<v David Cox>how in agriculture, you know, it's not like suddenly we

0:31:44.800 --> 0:31:47.880
<v David Cox>had ninety percent of the population unemployed. You know, people

0:31:47.880 --> 0:31:52.600
<v David Cox>transitioned to other jobs. And the other thing that we found, too,

0:31:52.680 --> 0:31:57.360
<v David Cox>is that our appetite for doing more things as

0:31:57.440 --> 0:32:00.840
<v David Cox>humans is sort of insatiable. So even if we can

0:32:00.920 --> 0:32:03.680
<v David Cox>dramatically increase how much you know, one human can do,

0:32:04.480 --> 0:32:06.400
<v David Cox>that doesn't necessarily mean you're going to do a fixed

0:32:06.400 --> 0:32:08.840
<v David Cox>amount of stuff. There's an appetite to have even more.

0:32:08.880 --> 0:32:10.760
<v David Cox>So we're going to continue to grow

0:32:10.840 --> 0:32:13.480
<v David Cox>the pie. So I think at least certainly in the

0:32:13.520 --> 0:32:15.120
<v David Cox>near term, you know, we're going to see a lot

0:32:15.120 --> 0:32:17.360
<v David Cox>of drudgery go away from work. We're going to see

0:32:17.920 --> 0:32:20.880
<v David Cox>people be able to be more effective at their jobs.

0:32:21.360 --> 0:32:24.440
<v David Cox>You know, we will see some transformation in jobs. And

0:32:24.920 --> 0:32:29.840
<v David Cox>like we've seen that before, and the technology at least

0:32:29.880 --> 0:32:32.040
<v David Cox>has the potential to make our lives a lot easier.

0:32:33.280 --> 0:32:38.280
<v Jacob Goldstein>So IBM recently launched watsonx, which includes watsonx

0:32:38.360 --> 0:32:41.320
<v Jacob Goldstein>dot ai. Tell me about that. Tell me about, you

0:32:41.320 --> 0:32:43.400
<v Jacob Goldstein>know what it is and the new possibilities that it

0:32:43.440 --> 0:32:44.000
<v Jacob Goldstein>opens up.

0:32:44.920 --> 0:32:48.640
<v David Cox>Yeah. So, so watsonx is obviously a bit of

0:32:49.160 --> 0:32:53.520
<v David Cox>a new branding on the Watson brand. T. J. Watson

0:32:53.520 --> 0:32:57.360
<v David Cox>was the founder of IBM, and our AI technologies

0:32:57.400 --> 0:33:01.320
<v David Cox>have had the Watson brand. Watsonx is a recognition

0:33:01.520 --> 0:33:04.840
<v David Cox>that there's something new, there's something that actually has changed

0:33:04.840 --> 0:33:09.160
<v David Cox>the game. We've gone from this old world of automation

0:33:09.360 --> 0:33:12.000
<v David Cox>that is too labor intensive to this new world of possibilities

0:33:12.520 --> 0:33:16.680
<v David Cox>where it's much easier to use AI. And what watsonx

0:33:17.240 --> 0:33:22.160
<v David Cox>does is bring together tools for businesses to harness that power.

0:33:22.600 --> 0:33:27.440
<v David Cox>So watsonx dot ai has foundation models that our customers can use.

0:33:27.560 --> 0:33:30.560
<v David Cox>It includes tools that make it easy to run, easy

0:33:30.680 --> 0:33:35.040
<v David Cox>to deploy, easy to experiment. There's a watsonx dot data

0:33:35.320 --> 0:33:38.800
<v David Cox>component which allows you to sort of organize and access

0:33:38.840 --> 0:33:40.920
<v David Cox>your data. So what we're really trying to do

0:33:41.000 --> 0:33:45.920
<v David Cox>is give our customers a cohesive set of tools to

0:33:45.960 --> 0:33:49.200
<v David Cox>harness the value of these technologies and at the same

0:33:49.240 --> 0:33:52.240
<v David Cox>time be able to manage the risks and other things

0:33:52.280 --> 0:33:54.160
<v David Cox>that you have to keep an eye on in an

0:33:54.280 --> 0:33:55.240
<v David Cox>enterprise context.

0:33:56.880 --> 0:33:59.560
<v Jacob Goldstein>So we talk about the guests on this show as

0:34:00.160 --> 0:34:04.200
<v Jacob Goldstein>new creators, by which we mean people who are creatively

0:34:04.240 --> 0:34:09.080
<v Jacob Goldstein>applying technology in business to drive change. And I'm curious

0:34:09.640 --> 0:34:14.319
<v Jacob Goldstein>how creativity plays a role in the research that you do.

0:34:15.160 --> 0:34:20.120
<v David Cox>Honestly, I think the creative aspects of this job, this

0:34:20.160 --> 0:34:23.759
<v David Cox>is what makes this work exciting. You know, I should say,

0:34:23.840 --> 0:34:26.720
<v David Cox>you know, the folks who work in my organization are

0:34:27.000 --> 0:34:30.560
<v David Cox>doing the creating, and I guess you're doing.

0:34:30.320 --> 0:34:32.480
<v Jacob Goldstein>The managing so that they can do the creating.

0:34:33.360 --> 0:34:36.799
<v David Cox>I'm helping them be their best and I still get

0:34:36.840 --> 0:34:39.719
<v David Cox>to get involved in the weeds of the research as

0:34:39.800 --> 0:34:42.520
<v David Cox>much as I can. But you know, there's something really

0:34:42.560 --> 0:34:46.480
<v David Cox>exciting about inventing. You know, like one of the nice

0:34:46.480 --> 0:34:50.680
<v David Cox>things about doing invention and doing research on AI in industry is

0:34:51.040 --> 0:34:54.000
<v David Cox>it's usually grounded in a real problem that somebody is having.

0:34:54.040 --> 0:34:56.680
<v David Cox>You know, a customer wants to solve this problem that's

0:34:57.239 --> 0:35:00.239
<v David Cox>losing money, or there's a new opportunity. You

0:35:00.280 --> 0:35:04.560
<v David Cox>identify that problem and then you build something that's

0:35:04.600 --> 0:35:06.759
<v David Cox>never been built before to do that. And I think

0:35:06.840 --> 0:35:10.640
<v David Cox>that's honestly the adrenaline rush that keeps all of us

0:35:11.160 --> 0:35:13.600
<v David Cox>in this field. How do you do something that nobody

0:35:13.600 --> 0:35:17.520
<v David Cox>else on earth has done before or tried before? So

0:35:17.560 --> 0:35:21.080
<v David Cox>there's that kind of creativity, and there's also creativity

0:35:21.080 --> 0:35:24.360
<v David Cox>in identifying what those problems are, being able to understand

0:35:25.040 --> 0:35:30.800
<v David Cox>the places where the technology is close enough to solving

0:35:30.800 --> 0:35:34.560
<v David Cox>a problem, and doing that matchmaking between problems that are

0:35:34.640 --> 0:35:37.440
<v David Cox>now solvable. You know, in AI, where the field's

0:35:37.480 --> 0:35:41.920
<v David Cox>moving so fast, there's a constantly growing horizon of things

0:35:41.920 --> 0:35:44.480
<v David Cox>that we might be able to solve. So that matchmaking,

0:35:44.520 --> 0:35:48.239
<v David Cox>I think is also a really interesting creative problem. So

0:35:48.520 --> 0:35:50.719
<v David Cox>I think that's why it's so

0:35:50.800 --> 0:35:53.839
<v David Cox>much fun, and it's a fun environment we have here too.

0:35:54.080 --> 0:35:57.400
<v David Cox>It's you know, people drawing on whiteboards and writing on

0:35:57.480 --> 0:35:59.640
<v David Cox>pages of math and.

0:36:00.239 --> 0:36:02.960
<v Jacob Goldstein>Like in a movie, Like in a movie, yeah, straight

0:36:02.960 --> 0:36:05.880
<v Jacob Goldstein>from central casting, the drawing on the window, writing on

0:36:05.920 --> 0:36:11.440
<v Jacob Goldstein>the window in Sharpie, absolutely. So let's close with

0:36:11.520 --> 0:36:17.000
<v Jacob Goldstein>the really long view. How do you imagine AI and

0:36:17.160 --> 0:36:19.959
<v Jacob Goldstein>people working together twenty.

0:36:19.680 --> 0:36:25.560
<v David Cox>Years from now? Yeah, it's really hard to make predictions.

0:36:25.800 --> 0:36:32.480
<v David Cox>The vision that I like, actually this came from an

0:36:32.719 --> 0:36:38.800
<v David Cox>MIT economist named David Autor, which was to imagine AI almost

0:36:38.800 --> 0:36:42.880
<v David Cox>as a natural resource. You know, we know how natural

0:36:42.920 --> 0:36:45.319
<v David Cox>resources work, right, Like there's an ore we can dig

0:36:45.400 --> 0:36:47.320
<v David Cox>up out of the earth, it kind of

0:36:47.360 --> 0:36:49.920
<v David Cox>springs from the earth, and we usually think of that

0:36:50.000 --> 0:36:53.080
<v David Cox>in terms of physical stuff. With AI, you can almost

0:36:53.080 --> 0:36:54.360
<v David Cox>think of it as like there's a new kind of

0:36:54.440 --> 0:36:57.799
<v David Cox>abundance potentially twenty years from now, where not only can

0:36:57.840 --> 0:37:00.440
<v David Cox>we have things we can build or eat, use or

0:37:00.480 --> 0:37:03.560
<v David Cox>burn or whatever, now we have, you know, this ability

0:37:03.600 --> 0:37:06.360
<v David Cox>to do things and understand things and do intellectual work.

0:37:06.640 --> 0:37:09.560
<v David Cox>And I think we can get to a world where

0:37:10.160 --> 0:37:15.160
<v David Cox>automating things is just seamless. We're surrounded by capability to

0:37:15.200 --> 0:37:19.839
<v David Cox>augment ourselves to get things done. And you could think

0:37:19.880 --> 0:37:21.680
<v David Cox>of that in terms of like, oh, that's going to

0:37:21.719 --> 0:37:24.080
<v David Cox>displace our jobs, because eventually the AI system is going

0:37:24.160 --> 0:37:26.479
<v David Cox>to do everything we can do. But you could also

0:37:26.520 --> 0:37:28.480
<v David Cox>think of it in terms of like, wow, that's just

0:37:28.520 --> 0:37:31.319
<v David Cox>so much abundance that we now have, and really how

0:37:31.320 --> 0:37:34.239
<v David Cox>we use that abundance is sort of up to us,

0:37:34.320 --> 0:37:36.800
<v David Cox>you know, like when writing software is super

0:37:36.840 --> 0:37:39.480
<v David Cox>easy and fast and anybody can do it. Just think

0:37:39.520 --> 0:37:41.759
<v David Cox>about all the things you can do now, Like think

0:37:41.800 --> 0:37:43.800
<v David Cox>about all the new activities and all the

0:37:43.800 --> 0:37:46.520
<v David Cox>ways we could use that to enrich our lives. That's

0:37:46.560 --> 0:37:49.480
<v David Cox>where I'd like to see us in twenty years, you

0:37:49.520 --> 0:37:52.480
<v David Cox>know. We can do just so much more

0:37:52.840 --> 0:37:55.399
<v David Cox>than we were able to do before. Abundance.

0:37:56.200 --> 0:37:59.040
<v Jacob Goldstein>Great. Thank you so much for your time.

0:38:00.040 --> 0:38:01.759
<v David Cox>It's been a pleasure. Thanks for inviting me.

0:38:03.320 --> 0:38:07.400
<v Malcolm Gladwell>What a far-ranging, deep conversation. I'm mesmerized by the

0:38:07.440 --> 0:38:11.360
<v Malcolm Gladwell>vision David just described. A world where natural conversation between

0:38:11.360 --> 0:38:15.960
<v Malcolm Gladwell>mankind and machine can generate creative solutions to our most

0:38:16.040 --> 0:38:19.799
<v Malcolm Gladwell>complex problems. A world where we view AI not as

0:38:19.880 --> 0:38:23.920
<v Malcolm Gladwell>our replacements, but as a powerful resource we can tap

0:38:23.960 --> 0:38:29.440
<v Malcolm Gladwell>into and exponentially boost our innovation and productivity. Thanks so

0:38:29.520 --> 0:38:32.920
<v Malcolm Gladwell>much to doctor David Cox for joining us on Smart Talks.

0:38:33.360 --> 0:38:37.080
<v Malcolm Gladwell>We deeply appreciate him sharing his huge breadth of AI

0:38:37.160 --> 0:38:41.160
<v Malcolm Gladwell>knowledge with us and explaining the transformative potential of

0:38:41.239 --> 0:38:44.600
<v Malcolm Gladwell>foundation models in a way that even I can understand.

0:38:45.200 --> 0:38:49.680
<v Malcolm Gladwell>We eagerly await his next great breakthrough. Smart Talks with

0:38:49.719 --> 0:38:54.239
<v Malcolm Gladwell>IBM is produced by Matt Romano, David Jha, Nisha Venkat,

0:38:54.280 --> 0:38:58.720
<v Malcolm Gladwell>and Royston Beserve, with Jacob Goldstein. We're edited by Lydia

0:38:58.760 --> 0:39:02.320
<v Malcolm Gladwell>Jean Kott. Our engineers are Jason Gambrell, Sarah Bruguiere,

0:39:02.920 --> 0:39:07.799
<v Malcolm Gladwell>and Ben Tolliday. Theme song by Gramoscope. Special thanks to

0:39:07.880 --> 0:39:12.040
<v Malcolm Gladwell>Carly Migliori, Andy Kelly, Kathy Callahan, and the 8 Bar

0:39:12.160 --> 0:39:16.200
<v Malcolm Gladwell>and IBM teams, as well as the Pushkin marketing team.

0:39:16.480 --> 0:39:19.800
<v Malcolm Gladwell>Smart Talks with IBM is a production of Pushkin Industries

0:39:20.040 --> 0:39:24.160
<v Malcolm Gladwell>and iHeartMedia. To find more Pushkin podcasts, listen on the

0:39:24.200 --> 0:39:29.359
<v Malcolm Gladwell>iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts.

0:39:29.800 --> 0:39:46.840
<v Malcolm Gladwell>I'm Malcolm Gladwell. This is a paid advertisement from IBM.