Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Today we are witness to one of those rare moments in history: the rise of an innovative technology with the potential to radically transform business and society forever. That technology, of course, is artificial intelligence, and it's the central focus for this new season of Smart Talks with IBM. Join hosts from your favorite Pushkin podcasts as they talk with industry experts and leaders to explore how businesses can integrate AI into their workflows and help drive real change in this new era of AI. And of course, host Malcolm Gladwell will be there to guide you through the season and throw in his two cents as well. Look out for new episodes of Smart Talks with IBM every other week on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts, and learn more at IBM dot com slash Smart Talks.
Speaker 1: All right, welcome everybody. You guys excited? Here we go.
Malcolm Gladwell: Hello, hello. Welcome to Smart Talks with IBM, a podcast from Pushkin Industries, iHeartRadio, and IBM. I'm Malcolm Gladwell. This season, we're continuing our conversations with new creators, visionaries who are creatively applying technology and business to drive change, but with a focus on the transformative power of artificial intelligence and what it means to leverage AI as a game-changing multiplier for your business. Today's episode is a bit different. I was recently joined on stage by Dario Gil for a conversation in front of a live audience at the iHeartMedia headquarters in Manhattan. Dario is the senior vice president and director of IBM Research, one of the world's largest and most influential corporate research labs. We discussed the rise of generative AI and what it means for business and society. He also explained how organizations that leverage AI to create value will dominate in the near future. Okay, let's get on to the conversation.
Malcolm Gladwell: Hello everyone, welcome. I'm here with Dr. Dario Gil, and I wanted to say before we get started, and this is something I said backstage, that I feel very guilty today, because you're, you know, arguably one of the most important figures in AI research in the world, and we have taken you away from your job for a morning. It's as if, you know, Oppenheimer's wife in nineteen forty-four said, let's go and have a little getaway in the Bahamas. It's that kind of thing. You know, what do you say to your wife? I can't, we have got to work on this thing I can't tell you about. She's like, get me out of Los Alamos. No, so I do feel guilty. We've set back AI research by about four hours here. But I wanted to ask: you've been with IBM for twenty years, twenty years this summer. So how old were you when you, not to give away your age, but you were how old when you started?
Dario Gil: I was twenty-eight.
Malcolm Gladwell: Okay, yeah. So I want to go back to your twenty-eight-year-old self. Now, if I asked you about artificial intelligence, if I asked twenty-eight-year-old Dario, what does the future hold for AI? How quickly will this new technology transform our world? Et cetera, et cetera. What would twenty-eight-year-old Dario have said?
Dario Gil: Well, I think the first thing is that even though AI as a field has been with us for a long time, since the mid nineteen fifties, at that time AI was not a very polite word to say, meaning that within the scientific community people didn't use that term. They would have said things like, you know, maybe I do things related to machine learning, right, or statistical techniques in terms of classifiers and so on. But AI had a mixed reputation, right. It had gone through different cycles of hype, and it has also had moments of, you know, a lot of negativity towards it because of lack of success. And so I think that would be the first thing we would probably say: AI, like, what is that? Like, you know, respectable scientists are not working on AI, the field, as such. And that really changed over the last fifteen years only, right. I would say with the advent of deep learning over the last decade is when it re-entered the lexicon, saying AI, and that it was a legitimate thing to work on. So I would say that's the first thing I think we would have noticed, in contrast to twenty years ago.
Malcolm Gladwell: Yeah. So at what point in your twenty-year tenure at IBM would you say you kind of snapped into present, kind of wow, mode?
Dario Gil: I would say in the late two thousands, when IBM was working on the Jeopardy project, and just seeing the demonstrations of what could be done in question answering.
Malcolm Gladwell: So Jeopardy is literally this crucial moment in the history of it. Yeah.
Dario Gil: You know, there had been a long and wonderful history inside IBM on AI. So, for example, in terms of these grand challenges: at the very beginning of the field's founding, which is this famous Dartmouth conference that IBM actually sponsored to create, there was an IBMer there called Nathaniel Rochester, and there were a few others who, right after that, started thinking about demonstrations of this field. And then, for example, they created the first, you know, game to play checkers, to demonstrate that you could do machine learning on that. Obviously we saw later, in the nineties, chess, which was a very famous example of that, with Deep Blue, right, playing Kasparov. But those felt like, you know, kind of brute force, anticipating sort of moves ahead. This aspect of dealing with language and question answering felt different, and I think for us internally, and for many others, it was a moment of saying, like, wow, you know, what are the possibilities here? And then, soon after that, connected to the advancements in computing and with deep learning, the last decade has just been an all-out, you know, sort of front of advancements, and I just continued to be more and more impressed. And the last few years have been remarkable too.
Malcolm Gladwell: Yeah. So let me ask you three quick conceptual questions before we dig into it, just so I sort of get, we all get, a feel for the shape of AI. Question number one is: where are we in the evolution of this? So, you know, the obvious question: we're all suddenly aware of it, we're talking about it. Can you give us an analogy for where we are in the kind of likely evolution of this as a technology?
Dario Gil: So I think we're at a significant inflection point. It feels like the equivalent of the first browsers, when they appeared and people imagined the possibilities of the Internet, or, more than imagined, experienced the Internet. The Internet had been around, right, for quite a few decades. AI has been around for many decades. I think the moment we find ourselves in is that people can touch it. Before, there were AI systems that were behind the scenes, like your search results or translation systems, but people didn't have the experience of, this is what it feels like to interact with this thing. So that's what I mean: I think maybe that analogy of the browser is appropriate, because all of a sudden it's like, whoa, you know, these networks of machines and content can be distributed and everybody can self-publish. There was a moment that we all remember, and I think that is what the world has experienced over the last nine months or so. But fundamentally, what is also important is that this is the moment where the ease of use and the number of people that can build and use AI has skyrocketed. So over the last decade, you know, technology firms that had large research teams could build AI that worked really well, honestly. But when you went down to, say, hey, can everybody use it, can a data science team in a bank, you know, go and develop these applications, it was more complicated. Some could do it, but the barrier to entry was high. Now it's very different, because of foundation models and the implications that has.
Malcolm Gladwell: So we're at the moment where the technology is being democratized.
Dario Gil: It is being democratized, frankly. It works better for classes of problems like programming and other things; it's really incredibly impressive what it can do. So the accuracy and the performance of it are much better, and the ease of use and the number of use cases we can pursue are much bigger. So that democratization is a big difference.
Malcolm Gladwell: But when you make the analogy to the first browsers, if we do another one of these time-travel questions: back at the beginning of the first browsers, many of the potential uses of the Internet and such we hadn't even begun, we couldn't even anticipate, right? So we're at the point where the future direction is largely unpredictable.
Dario Gil: Yeah, I think that is right, because it's such a horizontal technology. The intersection of the horizontal capability, which is about expanding our productivity in tasks that we wouldn't be able to do efficiently without it, now has to marry the use cases that reflect the diversity of human experience, our institutional diversity. So as more and more institutions say, you know, I'm focused on agriculture, to be able to improve seeds in these kinds of environments, they'll find their own contexts in which that matters, which the creators of AI did not anticipate at the beginning. So I think that is where the fruit of surprises will be: like, wow, I wouldn't even have thought that it could be used for that. And also, clever people will create new business models associated with that, as happened with the Internet, of course, as well, and that will be its own source of transformation and change in its own right. So I think all of that is yet to unfold, right. What we're seeing is this catalyst moment of a technology that works well enough and can be democratized.
Malcolm Gladwell: Next, sort of conceptual question. You know, we can loosely understand or categorize innovations in terms of their impact on the kind of balance of power between haves and have-nots. Some innovations, you know, obviously favor those who already have; they make the rich richer. Some are a rising tide that lifts all boats. And some bias in the other direction; they close the gap. Is it possible to predict which of those three categories AI might fall into?
Dario Gil: It's a great question, you know. A first observation I would make, on your first two categories, is that it will likely be true both that the use of AI will be highly democratized, meaning the number of people that have access to its power to make improvements in terms of efficiency and so on will be fairly universal, and that the ones who are able to create AI may be quite concentrated. So if you look at it from the lens of who creates wealth and value over sustained periods of time, particularly in, say, a context like business, I think just being a user of AI technology is an insufficient strategy. And the reason for that is: yes, you will get the immediate productivity boost of just making API calls, and, you know, that will be a new baseline for everybody, but you're not accruing value in terms of representing your data inside the AI in a way that gives you a sustainable competitive advantage. So what I always try to tell people is, don't just be an AI user, be an, you know, AI value creator. And I think that will have a lot of consequences in terms of the haves and have-nots, as an example, and that will apply both to institutions and to regions and countries, et cetera. So I think it would be kind of a mistake, right, to just develop strategies that are just about usage.
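To make the user-versus-value-creator distinction concrete, here is a minimal sketch, under the assumption that "representing your data inside the AI" can be as simple as building a searchable index over documents only your organization holds. The document strings and the stubbed hosted_model function are hypothetical placeholders, not any particular product's API; the point is only that the second path leaves behind an asset built from your own data, while a bare model call does not.

```python
# Minimal sketch: AI "user" vs "value creator" (hypothetical data, stubbed API).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def hosted_model(prompt: str) -> str:
    """Stand-in for a generic hosted model call; every competitor making the
    same call gets exactly the same capability."""
    return f"[generic answer to: {prompt}]"

# Value creation: encode proprietary documents so answers can be grounded
# in data that competitors do not have.
own_documents = [
    "Q3 field report: flood damage delayed seed delivery in two districts.",
    "Internal memo: customer churn is concentrated in the under-30 segment.",
    "Maintenance log: turbine 7 vibration exceeds threshold after 400 hours.",
]
vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(own_documents)  # the accrued asset

def grounded_answer(question: str) -> str:
    """Retrieve the most relevant internal document before calling the model."""
    scores = cosine_similarity(vectorizer.transform([question]), doc_matrix)[0]
    context = own_documents[scores.argmax()]
    return hosted_model(f"Context: {context}\nQuestion: {question}")

print(hosted_model("Why is churn rising?"))      # user: same answer anyone gets
print(grounded_answer("Why is churn rising?"))   # value creator: grounded in own data
```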
Malcolm Gladwell: But to come back to that question, and to give you a specific: suppose I'm an industrial farmer in Iowa with ten million dollars of equipment, and I'm comparing it to a subsistence farmer somewhere in the developing world who's got a cell phone. Over the next five years, whose well-being rises by a greater amount?
Dario Gil: Yeah, I mean, it's a good question, but it might be hard to do a one-to-one sort of attribution to just one variable, in this case AI. But again, provided that you have access to a phone, right, and some kind of, you know, ability to be connected, I do think so. For example, in that context we've worked with NASA, as an example, to build geospatial models using some of these new techniques, and I think, for example, of our ability to do flood prediction. I'll tell you why it would be a democratization force in that context. Before, building a flood model based on satellite imagery was actually so onerous and so complicated and difficult that you would just target very specific regions, and then obviously countries prioritize their own, right. But what we've demonstrated is that you can actually extend that technique to have global coverage. So in that context, I would say it's a force for democratization: everybody would have access, as long as you have some connectivity.
Malcolm Gladwell: As of today, the Iowa farmer might have a flood model. The guy in the developing world definitely didn't, and now he has a shot at getting one.
Dario Gil: Yeah, now he has a shot at getting one. So there are aspects of it where, so long as we provide connectivity and access to it, they can be democratization forces.
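To ground the flood-mapping example, the sketch below uses a deliberately simple classical baseline rather than the geospatial foundation-model approach Dario describes: the normalized difference water index (NDWI), computed from a satellite image's green and near-infrared bands, with a fixed threshold to flag water pixels. The band arrays here are synthetic stand-ins, not real imagery.

```python
# Classical water/flood mapping baseline on synthetic reflectance bands.
import numpy as np

rng = np.random.default_rng(0)

# Fake 100x100 reflectance bands; water reflects more green than near-infrared.
green = rng.uniform(0.05, 0.4, size=(100, 100))
nir = rng.uniform(0.05, 0.4, size=(100, 100))
nir[:30, :] *= 0.2  # pretend the top strip of the scene is inundated (low NIR)

# McFeeters NDWI: (Green - NIR) / (Green + NIR); water pixels are typically > 0.
ndwi = (green - nir) / (green + nir + 1e-9)
water_mask = ndwi > 0.0

print(f"Estimated inundated fraction of the scene: {water_mask.mean():.1%}")
```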
Dario Gil: But I'll give you another example that can be quite concerning, which is language, right. There's so much language in, you know, English, and there is sort of a reinforcement loop that happens: the more you concentrate, because it has obvious benefits for global communication and standardization, the more you can enrich base AI models with that capability. If you have very resource-scarce languages, you tend to develop less powerful AI with those languages, and so on. So one has to actually worry about, and focus on, the ability to represent, in that case, language as a piece of culture in the AI as well, so that everybody can benefit from it too. So there are a lot of considerations in terms of equity about the data and the data sets that we accrue and what problems we are trying to solve. I mean, you mentioned agriculture, or healthcare, and so on. If we only solve problems that are related to marketing, as an example, that would be a less rich world in terms of opportunity than if we incorporate many, many other, broader sorts of problems.
Malcolm Gladwell: What do you think are the biggest impediments to the adoption of AI, as you would like AI to be adopted? I mean, if you look, what are the sticking points?
Dario Gil: In the end, I'm going to give a non-technological answer. The first one has to do with workflow, right. Even if the technology is very capable, the organizational change inside a company to incorporate it into the natural workflow of people, or how we work, is, and this is a lesson we have learned over the last decade, hugely important.
So there are a lot of design considerations. There's a lot of, how do people want to work, right? How do they work today? And what are the natural entry points for AI? So that's, like, number one. And then the second one, you know, for the broad value-creation aspect of it, is the understanding inside companies of how you have to curate and create data, and combine it with external data sets, so that you can have powerful AI models that actually fit your need. And that aspect of what it takes to actually curate the data for this modern AI is still a work in progress, right. I think part of the problem that happens very often when I talk to institutions is that they say, yeah, yeah, yeah, I'm doing it, I've been doing it for a long time. And the reality is that that answer can sometimes be a bit of a cop-out, right? It's like, I know you were doing machine learning, you were doing some of these things, but the latest version of AI, what is happening with foundation models, not only is it very new, it's very hard to do. And honestly, if you haven't been, you know, assembling very large teams and spending hundreds of millions of dollars on compute, you're probably not doing it, right? You're doing something else that is in the broad category. And I think the lessons about what it means to make the transition to this new wave are still in the early phases of understanding.
Malcolm Gladwell: So what would you say? I want to give you a couple of examples of people with kind of real-world positions of responsibility. Imagine I'm sitting right here. So imagine that I am the president of a small liberal arts college, and I come to you and I say, Dario, I keep hearing about AI.
My college, you know, I don't make, you know, I'm making this much money, if that, and every year my enrollment is declining. I feel like this maybe is an opportunity. What is the opportunity for me? What would you say?
Dario Gil: So it's probably a couple of segments around that, right. One has to do with, well, what are the implications of this technology inside the institution itself, inside the college, and how we operate, and can we improve, for example, efficiency? If you have very low levels of margin to be able to reinvest: you run, you know, infrastructure, you run many things inside the college. What are the opportunities to increase productivity, or to automate and drive savings, such that you can reinvest that money into the mission of education, right, as an example?
Malcolm Gladwell: So number one is operational efficiency.
Dario Gil: Operational efficiency is a big one. I think the second one is that, within the context of the college, there are implications for the educational mission on its own, right. How does the curriculum need to evolve, or not? What are acceptable use policies for some of these AI tools? I think we've all read a lot about what can happen in terms of exams and so on, and cheating and not cheating, or what are the actually positive elements of it in terms of how curricula should be developed and professions sustained around that. And then there's a third dimension, which is the outward-oriented element of it, which is, like, prospective students, right? Which is, frankly speaking, a big use case that is happening right now, and which in the broader industry is called customer care or client care or citizen care. In this case it will be education: like, you know, hey, are you reaching the right students? That may apply to the college.
How can you create, for example, an environment to interact with the college and answer questions, which could be a chatbot or something like that, to learn about it, and personalization? So I would say there are at least three lenses through which I would give advice, right.
Malcolm Gladwell: The second one, because it's really interesting. So I really can't assign an essay anymore, can I? Can I assign an essay? Can I say, write me a research paper and come back to me in week three? Can I do that anymore?
Dario Gil: I think you can.
Malcolm Gladwell: How do I do that?
Dario Gil: And you can. Look, there are two questions around that. I think that if one goes and explains the context, like, what is it, why are we here, why are we in this class, what is the purpose of this, and one starts by assuming, like, an element of decency, that people are there to learn and so on, then you just give the disclaimer: look, I know that one option you have is to just, you know, put in the essay question and click go and get an answer. But that is not why we're here, and that is not the intent of what we're trying to do. So first I would start with the sort of norms of intent and decency, and appeal to those, as step number one. Then we all know that there will be a distribution of use cases, of people that will come in one year and go one way or the other. And so for a subset of that, you know, I think the technology is going to evolve in such a way that we will have more and more of the ability to discern, right, when something has been AI generated and when it hasn't. It won't be perfect, right. But there are some elements you can imagine: you put in the essay and it says, hey, this is likely to be generated, right. And for example, one way you can do that, just to give you an intuition: you could have an essay that you write with pencil and paper at the beginning, so you get a baseline of what your writing is like, and then later, when you, you know, generate it, there will be obvious differences in what kind of writing has been produced.
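Dario's intuition, comparing new work against a baseline sample of the student's own writing, can be sketched with a few crude style statistics. Real detectors are far more sophisticated and, as he says, none of this will be perfect; the two essay strings below are placeholders.

```python
# Toy stylometric comparison against a known writing baseline (placeholder essays).
import re

def style_features(text: str) -> dict:
    """Crude style statistics: sentence length, word length, vocabulary variety."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text.lower())
    return {
        "avg_sentence_len": len(words) / max(len(sentences), 1),
        "avg_word_len": sum(len(w) for w in words) / max(len(words), 1),
        "type_token_ratio": len(set(words)) / max(len(words), 1),
    }

baseline_essay = "I liked the book. The hero was brave. He made mistakes but kept going."
submitted_essay = (
    "The protagonist exemplifies resilience, navigating adversity with a "
    "consistency that underscores the narrative's broader meditation on perseverance."
)

base, new = style_features(baseline_essay), style_features(submitted_essay)
for key in base:
    print(f"{key}: baseline {base[key]:.2f} vs. submitted {new[key]:.2f}")
```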
Malcolm Gladwell: Everything you're describing makes sense, but in this respect at least it seems to greatly complicate the life of the teacher, whereas the other two use cases seem to kind of clarify and simplify the role, right? Suddenly, you know, reaching prospective students sounds like something I can do much more efficiently, and I can bring down administration costs. But the teaching thing is tricky.
Dario Gil: Well, until we develop the new norms, right. I mean, again, I know it's an overused analogy, but calculators, we've dealt with that too, right? It was, well, with the calculator, what is the purpose of math? How are we going to do this?
Malcolm Gladwell: And so, can I tell you my dad's calculator story?
Dario Gil: Yes, please.
Malcolm Gladwell: My father was a mathematician. He taught mathematics at the University of Waterloo in Canada, and in the seventies, when people started to get pocket calculators, his students demanded that they be able to use them, and he said no, and they took him to the administration and he lost. So he then changed completely, threw out all of his old exams, and introduced new exams where there was no calculation. It was all, like, deep think: you know, figure out the problem on a conceptual level and describe it to me. And the students were all deeply unhappy that he'd made their lives more complicated.
But to your point, I mean, the result was probably a better education. He just removed the element that they could game with their pocket calculators. I suppose it's a version of that.
Dario Gil: I think it's a version of that, and I think people will develop the equivalent of what your father did. And I think people will say, you know, if, with these kinds of things, everybody's doing it generically, then none of it has any meaning, because all you're doing is pressing buttons, and the intent of this was something else, which was to teach you how to write, or to think, or something. There may be a variant of how we do all of this. I mean, obviously some version of that has already happened: okay, we're all going to sit down and do it with pencil and paper and no computers in the classroom. But there will be other variants of creativity that people will put forth to say, you know what, that's a way to solve that problem too.
Malcolm Gladwell: But this is interesting, because, to stay on this analogy, we're really talking about a profound rethinking, just using a college as an example, a real profound rethinking of the way... there's no part of this college that's unaffected by AI. In one case, I've made everyone's job easier. In another case, I'm asking us to really rethink, from the ground up, what teaching means. In another case, I've automated systems that I hadn't even thought of. I mean, that's a lot to ask of someone who got a PhD in medieval language and literature, you know, forty years ago.
Dario Gil: Yeah, but you know, I'll tell you a positive sort of development that I'm seeing in the sciences around this, which is that you see more and more examples of applying AI technology within the context of, say, historians, as an example, right. You have archival material, and, you know, you have all these books, and it's being able to actually help you as an assistant, right, around that, and not only with text now, but with diagrams, right. And I've seen it in anthropology too, and in archaeology, with examples of engravings and translations and things that can happen. So as you see, in diverse fields, people applying these techniques to advance how to do physics or how to do chemistry, they inspire each other, right, and they say, you know, how does this actually apply to my area? So once that happens, it becomes less of a chore of, like, my god, you know, how do I have to deal with this? Instead it's triggered by curiosity. It's triggered by, you know, there will be faculty who say, you know what, let me explore what this means for my area, and they will adapt it to the local context, to the local, you know, language, and the profession itself. So I see that as a positive vector, that it's not all going to feel like homework. You know, it's not going to feel like, oh my god, this is so overwhelming, but rather: be very practical, see what works. What have I seen others do that is inspiring? And what am I inspired to do? You know, how is this going to help my career? I think that's going to be an interesting question for, you know, those faculty members, for the students.
Malcolm Gladwell: The professionals. Sorry, I'm going to stick with this example a little longer, because it's really interesting.
I'm curious, following up on what you just said: one of the most persistent critiques of academic, but also of many corporate, institutions in recent years has been siloing, right? Different parts of the organization are going off on their own and not speaking to each other. Is a potential, a real potential, benefit of AI the kind of breaking down, a simple tool for breaking down, those kinds of barriers? Is that an elegant way of sort of saying what you really think?
Dario Gil: I really think so, and I was actually just having a conversation with provosts and staff very much on this topic, very recently, exactly on that. Which is, there is all this appetite, right, to collaborate across disciplines. There are a lot of attempts toward that goal, right: creating interdisciplinary centers, creating dual-degree programs or dual-appointment programs. But actually, a lot of progress in academia happens by methodology too, right. Like, when some methodology gets adopted, I mean, the most famous example of that is the scientific method. When you have a methodology that gets adopted, it also provides a way to speak to your colleagues across different disciplines. And I think what's happened in AI is linked to that: within the context of the scientific method, as an example, the methodology of how we do discovery, the role of data, the role of these neural networks, of how we actually find proximity of concepts to one another, is actually fundamentally different from how we've traditionally applied it. So as we see, across more professions, people applying this methodology, it is also going to give them some element of common language with each other. Right? And in fact, you know, in this very high-dimensional representation of information that is present in neural networks, we may find amazing adjacencies or connections of themes and topics, in ways that the individual practitioners cannot describe, but that will be latent in these large neural networks. We are going to suffer a little bit from causality, from the problem of, like, hey, what's the root cause of that? Because I think one of the unsatisfying aspects of this methodology is that it may give you answers for which it doesn't give you good reasons, for where the answers came from. And then there will be the traditional process of discovery of saying, if that is the answer, what are the reasons? So we're going to have to do this sort of hybrid way of understanding the world. But I do think that common layer of AI is a powerful new thing.
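The point about a very high-dimensional representation surfacing adjacencies between fields can be illustrated with a toy example: a handful of hand-made vectors standing in for learned embeddings, compared by cosine similarity. A real system would take its embeddings from a trained model rather than from these hard-coded numbers, and the concept names are purely illustrative.

```python
# Toy illustration of concept adjacency in an embedding space (hand-made vectors).
import numpy as np

# Hypothetical 4-dimensional "embeddings" for concepts from different disciplines.
concepts = {
    "protein folding": np.array([0.9, 0.1, 0.7, 0.2]),
    "origami design": np.array([0.8, 0.2, 0.6, 0.3]),
    "medieval manuscripts": np.array([0.1, 0.9, 0.2, 0.8]),
    "engraving translation": np.array([0.2, 0.8, 0.1, 0.9]),
}

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Nearest neighbour for each concept: adjacencies nobody wrote down explicitly.
for name, vec in concepts.items():
    others = {other: cosine(vec, v) for other, v in concepts.items() if other != name}
    nearest = max(others, key=others.get)
    print(f"{name} is closest to {nearest} (similarity {others[nearest]:.2f})")
```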
Malcolm Gladwell: Yeah. Well, a couple of random questions. In the writers' strike that just ended in Hollywood, one of the sticking points was how the studios and writers would treat AI-generated content. Would writers get credit if their material was somehow the source for AI? But more broadly, did the writers need protections against the use of AI? I could go on; you'll be familiar with all of this. Had you been, I don't know whether you were, but had either side called you in for advice during that? If the writers had called you and said, Dario, what should we do about AI, and how should that be reflected in our contract negotiations, what would you have told them?
Dario Gil: The way I think about that is that I would divide it into two pieces. First is what's technically possible, right, and anticipating scenarios: like, you know, what can you do with voice cloning, for example?
You know, now, for example, dubbing is possible. Let's take that topic, right. Around the world, there were all these folks who would dub people into other languages. Well, now you can do these incredible renderings, I don't know if you've seen them, where, you know, the lips are matched and it is your original voice, but speaking any language that you want, as an example. So that has a set of implications around it, just to give an example. So I would say: create a taxonomy that describes the technical capabilities that we know of today and their applications to the industry, down to examples like, hey, you know, I could film you for five minutes and I could generate two hours of content of you, and then, if you get paid by the hour, obviously I'm not paying you for that other part. So I would say: technological capability, and then map, with their expertise, the consequences of how it changes the way they work, or the way they interact, or the way they negotiate, and so on. So that would be one element of it. And then the other one is, like, a non-technology-related matter, which is an element of almost distributive justice: who deserves what, right, and who has the power to get what? And that's a completely different discussion. That is to say, well, if this is the scenario of what's possible, you know, what do we want and what are we able to get? And I think that's a different discussion.
Malcolm Gladwell: Of all of that, which do you do first?
Dario Gil: I think it's very helpful to have an understanding of what's possible and how it changes the landscape, as part of a broader discussion, right, and a broader negotiation.
594 00:29:53,560 --> 00:29:58,239 Speaker 4: discussion, right, and a broader negotiation, because you also have 595 00:29:58,280 --> 00:30:00,280 Speaker 4: to see the opportunities, because there will be a lot 596 00:30:00,360 --> 00:30:03,240 Speaker 4: of ground to say, actually, you know, if we can 597 00:30:03,280 --> 00:30:05,760 Speaker 4: do it in this way and we can all be 598 00:30:05,840 --> 00:30:08,480 Speaker 4: that much more efficient in getting this piece of work done 599 00:30:08,600 --> 00:30:12,120 Speaker 4: or this filming done, but we have a reasonable agreement 600 00:30:12,280 --> 00:30:16,360 Speaker 4: about how both sides benefit from it, right, then 601 00:30:16,480 --> 00:30:16,800 Speaker 4: that's a 602 00:30:16,760 --> 00:30:17,760 Speaker 3: win-win for everybody. 603 00:30:17,840 --> 00:30:18,040 Speaker 2: Yeah. 604 00:30:18,160 --> 00:30:21,160 Speaker 5: Right, so that's, I think, what would be a golden triangle. 605 00:30:21,240 --> 00:30:21,360 Speaker 3: Right. 606 00:30:21,440 --> 00:30:24,040 Speaker 2: Here's my reading, and I would like you to correct 607 00:30:24,040 --> 00:30:25,800 Speaker 2: me if I'm wrong, and I'm likely to be wrong. 608 00:30:26,920 --> 00:30:29,000 Speaker 2: When I looked at that strike, I said, so they're 609 00:30:29,040 --> 00:30:33,040 Speaker 2: worried about AI? The writers are worried about AI? That 610 00:30:33,160 --> 00:30:35,720 Speaker 2: seems silly. It should be the studios who are worried 611 00:30:35,760 --> 00:30:38,200 Speaker 2: about the economic impact of AI. Doesn't, in the 612 00:30:38,200 --> 00:30:41,280 Speaker 2: long run, AI put the studios out of business long 613 00:30:41,320 --> 00:30:43,240 Speaker 2: before it puts the writers out of business? I only 614 00:30:43,240 --> 00:30:47,239 Speaker 2: need the studio because the costs of production are as 615 00:30:47,320 --> 00:30:49,920 Speaker 2: high as the sky, the costs of production are overwhelming. 616 00:30:49,960 --> 00:30:53,200 Speaker 2: Whereas if I have a tool 617 00:30:53,240 --> 00:30:58,600 Speaker 2: which introduces massive technological efficiencies to the production of movies, 618 00:30:59,040 --> 00:31:02,360 Speaker 2: then why wouldn't the studios... why aren't they the scared ones? 619 00:31:02,520 --> 00:31:04,560 Speaker 4: Or maybe, maybe you need, like, a different kind of 620 00:31:04,560 --> 00:31:06,520 Speaker 4: studio, a different kind of studio. 621 00:31:06,560 --> 00:31:09,680 Speaker 2: But I mean, in 622 00:31:09,720 --> 00:31:12,840 Speaker 2: this strike, the frightened ones were the writers 623 00:31:12,880 --> 00:31:17,080 Speaker 2: and not, you know, the studios. Wasn't that backwards? 624 00:31:18,520 --> 00:31:19,480 Speaker 3: I haven't thought about it. 625 00:31:19,640 --> 00:31:21,719 Speaker 5: It can be about the implications of it. 626 00:31:21,720 --> 00:31:23,960 Speaker 4: It goes back to what we were talking about before: the implications 627 00:31:23,960 --> 00:31:26,600 Speaker 4: are so horizontal, it is right to think about it, 628 00:31:26,640 --> 00:31:27,200 Speaker 4: like what does 629 00:31:27,080 --> 00:31:28,560 Speaker 3: it do to the studios as well? Right? 630 00:31:28,720 --> 00:31:29,920 Speaker 2: Yeah.
631 00:31:29,960 --> 00:31:32,560 Speaker 4: But then, you know, the reason why that happens is 632 00:31:32,600 --> 00:31:36,640 Speaker 4: that it's the order of either negotiations or who 633 00:31:36,720 --> 00:31:40,760 Speaker 4: first got concerned about it and did something about it, right, 634 00:31:40,800 --> 00:31:43,560 Speaker 4: which is, in the context of the strike... You know, 635 00:31:43,680 --> 00:31:46,080 Speaker 4: I don't know what the equivalent conversations are going on inside 636 00:31:46,080 --> 00:31:47,960 Speaker 4: the studios, and whether they have a war room saying, 637 00:31:47,960 --> 00:31:49,719 Speaker 4: what is this going to mean to us? Right, but 638 00:31:50,000 --> 00:31:52,600 Speaker 4: it doesn't get exercised through a strike, but maybe through 639 00:31:52,600 --> 00:31:54,960 Speaker 4: a task force inside, you know, the companies about what 640 00:31:55,000 --> 00:31:55,560 Speaker 4: are they going to do? 641 00:31:55,640 --> 00:31:55,800 Speaker 3: Right? 642 00:31:56,040 --> 00:31:58,280 Speaker 2: Well, and to go back to the thing you said, 643 00:31:58,280 --> 00:31:59,560 Speaker 2: the first thing you do is you make a list 644 00:31:59,600 --> 00:32:04,320 Speaker 2: of what the technological capabilities are. But don't technological capabilities change 645 00:32:04,320 --> 00:32:07,880 Speaker 2: every... I mean, they do. You're racing ahead so fast, 646 00:32:08,080 --> 00:32:11,040 Speaker 2: so you can't... Can you have a contract? I'm sorry 647 00:32:11,040 --> 00:32:13,000 Speaker 2: for getting in the weeds a little here, but this is interesting. 648 00:32:13,440 --> 00:32:16,239 Speaker 2: Can you... You can't have a five-year contract if 649 00:32:16,280 --> 00:32:20,000 Speaker 2: the contract is based on an assessment of technological capabilities 650 00:32:20,000 --> 00:32:22,240 Speaker 2: in twenty twenty-three, because by the time we get 651 00:32:22,240 --> 00:32:28,080 Speaker 2: to twenty twenty-eight, it's totally different. Right? 652 00:32:28,200 --> 00:32:30,280 Speaker 4: Yeah, but, like, you know, I mean, where I was 653 00:32:30,320 --> 00:32:34,200 Speaker 4: going is, like, there are some abstractions around that, like, 654 00:32:35,240 --> 00:32:37,680 Speaker 4: you know, what can we do with my image? Right? Like, 655 00:32:37,720 --> 00:32:40,239 Speaker 4: if I generally get the category that my image can 656 00:32:40,280 --> 00:32:43,360 Speaker 4: be reproduced, generated content and so on, it's like, let's 657 00:32:43,360 --> 00:32:45,760 Speaker 4: talk about the abstract notion about who has rights to 658 00:32:45,840 --> 00:32:48,520 Speaker 4: that, or do we both get to benefit from that? 659 00:32:48,720 --> 00:32:51,520 Speaker 4: If you get that straight, yes, the nature of how 660 00:32:51,520 --> 00:32:54,880 Speaker 4: the image gets altered or created is something that will change underneath, 661 00:32:55,080 --> 00:32:57,640 Speaker 4: but the concept will stay the same. And so I 662 00:32:57,680 --> 00:32:59,880 Speaker 4: think what's important is to get the categories right. 663 00:33:00,200 --> 00:33:04,160 Speaker 2: Yeah. Yeah, if you had to, you know, think about 664 00:33:04,200 --> 00:33:10,480 Speaker 2: the biggest technological revolutions of the postwar era, the last 665 00:33:10,600 --> 00:33:14,040 Speaker 2: seventy five years, you could all come up with a list. Actually, 666 00:33:14,040 --> 00:33:15,479 Speaker 2: it's really fun to come up with a list.
I 667 00:33:15,520 --> 00:33:18,920 Speaker 2: was thinking about this when we were... you know, containerized 668 00:33:18,920 --> 00:33:25,719 Speaker 2: shipping is my favorite, the green revolution, the internet... 669 00:33:25,960 --> 00:33:27,320 Speaker 2: Where is AI in that list? 670 00:33:29,960 --> 00:33:32,760 Speaker 4: So I would put it first in that context that 671 00:33:32,800 --> 00:33:36,920 Speaker 4: you put forth. Since World War Two, undoubtedly, like, 672 00:33:37,080 --> 00:33:41,320 Speaker 4: computing as a category is one of those trajectories that 673 00:33:41,440 --> 00:33:43,040 Speaker 4: has reshaped 674 00:33:42,560 --> 00:33:43,320 Speaker 3: our world. 675 00:33:43,760 --> 00:33:47,600 Speaker 4: And I think, within computing, I would say the 676 00:33:47,720 --> 00:33:52,160 Speaker 4: role that semiconductors have had has been incredibly defining. I 677 00:33:52,160 --> 00:33:56,160 Speaker 4: would say AI is the second example of that, as 678 00:33:56,200 --> 00:33:59,560 Speaker 4: a core architecture that is going to have an equivalent 679 00:33:59,640 --> 00:34:02,400 Speaker 4: level of impact. And then the third leg I would 680 00:34:02,400 --> 00:34:03,520 Speaker 4: put to that equation 681 00:34:03,200 --> 00:34:04,880 Speaker 3: will be quantum and quantum information. 682 00:34:05,400 --> 00:34:07,400 Speaker 4: And that's sort of, like, I like to summarize that 683 00:34:07,440 --> 00:34:09,960 Speaker 4: the future of computing is bits, neurons, and qubits, and 684 00:34:10,040 --> 00:34:13,319 Speaker 4: it is that idea of high-precision computation, the world 685 00:34:13,360 --> 00:34:15,960 Speaker 4: of neural networks and artificial intelligence, and the world of 686 00:34:16,040 --> 00:34:19,400 Speaker 4: quantum, and the combination of those things is going to 687 00:34:19,440 --> 00:34:21,880 Speaker 4: be the defining force of the next hundred years in 688 00:34:21,920 --> 00:34:23,160 Speaker 4: that category of computing. 689 00:34:23,200 --> 00:34:24,640 Speaker 3: But it makes the list for sure. 690 00:34:24,880 --> 00:34:27,239 Speaker 2: If it's that high up on the list, this is 691 00:34:27,280 --> 00:34:31,080 Speaker 2: a total hypothetical, would you, if you were starting over, 692 00:34:31,560 --> 00:34:35,080 Speaker 2: if you were starting IBM right now, would you say, oh, 693 00:34:35,360 --> 00:34:38,839 Speaker 2: our AI operations actually should be way bigger? Like how 694 00:34:38,840 --> 00:34:40,880 Speaker 2: many, how many thousands of people working for you? 695 00:34:41,760 --> 00:34:45,080 Speaker 4: So within the research division it's about, like, three thousand, 696 00:34:45,120 --> 00:34:46,040 Speaker 4: five hundred scientists. 697 00:34:46,040 --> 00:34:48,200 Speaker 2: So in a perfect world, would you... if it's that big, 698 00:34:48,280 --> 00:34:51,280 Speaker 2: isn't that too small as a group? 699 00:34:51,719 --> 00:34:53,839 Speaker 4: Yeah, well, that's, like, in the research division. I mean, 700 00:34:53,880 --> 00:34:57,879 Speaker 4: IBM overall, but I mean, like...
701 00:34:58,000 --> 00:35:01,080 Speaker 2: So starting from first principles, so you have, we've got 702 00:35:01,120 --> 00:35:05,880 Speaker 2: a technology that you're ranking with computing, you know, 703 00:35:06,160 --> 00:35:08,800 Speaker 2: up there with it in terms of a world changer. 704 00:35:11,000 --> 00:35:13,359 Speaker 2: So what I'm basically asking is, are we 705 00:35:13,480 --> 00:35:16,440 Speaker 2: underinvested in this? You 706 00:35:16,360 --> 00:35:18,919 Speaker 4: know, so, so yeah, it's a, it's a good question. 707 00:35:19,000 --> 00:35:20,799 Speaker 4: So, like, what I would say is that I think 708 00:35:20,800 --> 00:35:23,839 Speaker 4: we should segment how many people you need on 709 00:35:24,000 --> 00:35:27,680 Speaker 4: the creation of the technology itself, and what is the 710 00:35:27,760 --> 00:35:30,480 Speaker 4: right size of researchers and engineers and compute to do that, 711 00:35:31,080 --> 00:35:33,359 Speaker 4: and how many people you need in the sort 712 00:35:33,400 --> 00:35:38,239 Speaker 4: of application of the technology to create better products, to 713 00:35:38,320 --> 00:35:41,960 Speaker 4: deliver services and consulting, and then ultimately to diffuse it 714 00:35:42,000 --> 00:35:44,719 Speaker 4: through, you know, sort of all spheres of society. And 715 00:35:44,760 --> 00:35:47,240 Speaker 4: the numbers are very different, and that is not different 716 00:35:47,239 --> 00:35:49,279 Speaker 4: than anywhere else. I mean, if you give 717 00:35:49,320 --> 00:35:52,000 Speaker 4: examples, since you were talking about the context of 718 00:35:52,000 --> 00:35:54,080 Speaker 4: World War Two, how many people does it take to 719 00:35:54,120 --> 00:35:57,440 Speaker 4: create, you know, an atomic weapon, as an example? It's 720 00:35:57,480 --> 00:35:58,200 Speaker 4: a large number. 721 00:35:58,280 --> 00:35:59,760 Speaker 3: I mean, it wasn't just Los Alamos. 722 00:35:59,800 --> 00:36:01,600 Speaker 4: There were a lot of people involved. Okay, it's a 723 00:36:01,640 --> 00:36:04,920 Speaker 4: large number, but it wasn't a million people, right? So 724 00:36:05,440 --> 00:36:09,360 Speaker 4: you could have highly concentrated teams of people that, with 725 00:36:09,560 --> 00:36:13,880 Speaker 4: enough resources, can do extraordinary scientific and technological achievements, and 726 00:36:13,920 --> 00:36:16,160 Speaker 4: that, always, by definition, is going to be a fraction, 727 00:36:16,320 --> 00:36:19,040 Speaker 4: like one percent, compared to the total volume that 728 00:36:19,120 --> 00:36:20,240 Speaker 4: is going to be required to then 729 00:36:20,120 --> 00:36:20,560 Speaker 3: deal with it. 730 00:36:21,200 --> 00:36:24,080 Speaker 2: But the application side is infinite, almost. 731 00:36:23,760 --> 00:36:24,520 Speaker 3: That's exactly it. 732 00:36:24,600 --> 00:36:27,040 Speaker 5: So that is where, like, in the end, the bottleneck 733 00:36:27,080 --> 00:36:27,560 Speaker 5: really is.
734 00:36:28,040 --> 00:36:31,680 Speaker 4: So with, you know, thousands of scientists and engineers, you 735 00:36:31,719 --> 00:36:35,480 Speaker 4: can create world-class AI, right? And so, no, you 736 00:36:35,480 --> 00:36:37,719 Speaker 4: don't need ten thousand to be able to create the 737 00:36:37,800 --> 00:36:39,799 Speaker 4: large language model and the generative model and so on, but 738 00:36:39,840 --> 00:36:43,480 Speaker 4: you need thousands, and you need, you know, a very significant 739 00:36:43,480 --> 00:36:44,479 Speaker 4: amount of compute and data. 740 00:36:44,520 --> 00:36:44,960 Speaker 3: You need that. 741 00:36:45,640 --> 00:36:49,640 Speaker 4: The rest is, okay, I build software, I build databases, 742 00:36:49,760 --> 00:36:52,440 Speaker 4: or I build a software product that allows you to 743 00:36:52,480 --> 00:36:55,320 Speaker 4: do inventory management, or I build, you know, a photo 744 00:36:55,440 --> 00:37:00,840 Speaker 4: editor and so on. Now, that product incorporating the AI, modifying it, 745 00:37:00,960 --> 00:37:03,880 Speaker 4: expanding it and so on, well, now you're talking about 746 00:37:03,920 --> 00:37:06,800 Speaker 4: the entire software industry. So now you're talking about millions 747 00:37:06,840 --> 00:37:09,480 Speaker 4: of people, right, who are needed, you know, who are 748 00:37:09,560 --> 00:37:12,680 Speaker 4: required to bring AI into their product. Then you go 749 00:37:12,760 --> 00:37:15,759 Speaker 4: a step beyond the technology creators in terms of 750 00:37:15,800 --> 00:37:18,600 Speaker 4: software and you say, well, okay, now what about the skills 751 00:37:18,640 --> 00:37:22,880 Speaker 4: to help organizations go and deploy it in the Department of, you know, 752 00:37:22,960 --> 00:37:25,279 Speaker 4: the Interior, right? And then I say, okay, well, now 753 00:37:25,320 --> 00:37:28,759 Speaker 4: you need, like, consultants and experts and people to work 754 00:37:28,800 --> 00:37:31,160 Speaker 4: there to integrate it into the workflow. So now you're 755 00:37:31,200 --> 00:37:33,960 Speaker 4: talking in the many tens of millions of people around that. 756 00:37:34,239 --> 00:37:36,600 Speaker 4: So I see it as these concentric circles of it. 757 00:37:37,080 --> 00:37:40,040 Speaker 4: But to some degree, in many of these core technology areas, 758 00:37:40,080 --> 00:37:41,600 Speaker 4: just saying, like, well, I need a team of, like, 759 00:37:41,600 --> 00:37:43,960 Speaker 4: one hundred thousand people to create, like, AI 760 00:37:44,280 --> 00:37:46,799 Speaker 4: or a new transistor or a new quantum computer, it's 761 00:37:46,840 --> 00:37:49,040 Speaker 4: actually diminishing returns, right? In the end, like, 762 00:37:49,120 --> 00:37:50,920 Speaker 4: too many people connecting with each other 763 00:37:50,800 --> 00:37:53,480 Speaker 2: is very difficult. But on the application side, just 764 00:37:53,880 --> 00:37:58,600 Speaker 2: to go back to our example of that college, 765 00:37:59,160 --> 00:38:03,200 Speaker 2: just the task of sitting down with a faculty and 766 00:38:03,320 --> 00:38:07,640 Speaker 2: working with them to reimagine what they do with this 767 00:38:07,680 --> 00:38:10,319 Speaker 2: new set of tools in mind, with the understanding that 768 00:38:10,360 --> 00:38:12,200 Speaker 2: the students coming in are probably going to know more 769 00:38:12,200 --> 00:38:14,839 Speaker 2: about it than they do... that alone,
I mean, 770 00:38:14,880 --> 00:38:18,680 Speaker 2: that's, that is a Herculean people problem. 771 00:38:18,880 --> 00:38:19,720 Speaker 3: It's a people problem. 772 00:38:19,800 --> 00:38:21,760 Speaker 4: Yeah, that's why I started in terms of the barriers 773 00:38:21,760 --> 00:38:23,800 Speaker 4: of adoption of that. I mean, in the context of IBM, 774 00:38:23,840 --> 00:38:28,680 Speaker 4: as an example, that's why we have a consulting organization, IBM Consulting, 775 00:38:28,719 --> 00:38:32,799 Speaker 4: that complements IBM technology, and the IBM Consulting organization 776 00:38:32,880 --> 00:38:35,840 Speaker 4: has over one hundred and fifty thousand employees because of 777 00:38:35,880 --> 00:38:38,279 Speaker 4: this question, right? Because you have to sit down and 778 00:38:38,320 --> 00:38:40,920 Speaker 4: you say, okay, what problem are you trying to solve? 779 00:38:41,239 --> 00:38:43,160 Speaker 4: What is the methodology we're going to use, and here are 780 00:38:43,160 --> 00:38:45,200 Speaker 4: the technology options that we have to be able to 781 00:38:45,200 --> 00:38:49,279 Speaker 4: bring to the table. In the end, the adoption across 782 00:38:49,920 --> 00:38:54,239 Speaker 4: our society will be limited by this part. The technology 783 00:38:54,280 --> 00:38:56,640 Speaker 4: is going to make it easier, more cost-effective to 784 00:38:56,719 --> 00:39:01,319 Speaker 4: implement those solutions. You first have to think about what 785 00:39:01,360 --> 00:39:03,239 Speaker 4: you want to do, how you're going to do it, 786 00:39:03,360 --> 00:39:04,719 Speaker 4: and how you're going to bring it into the 787 00:39:04,760 --> 00:39:08,120 Speaker 4: life of, in this context, the faculty member or, you know, 788 00:39:08,200 --> 00:39:10,200 Speaker 4: the administrator and so on in the college. 789 00:39:10,480 --> 00:39:13,279 Speaker 2: With that Hollywood... that notion I thought, which was 790 00:39:13,320 --> 00:39:18,600 Speaker 2: absolutely, I thought, really interesting, that in a Hollywood strike 791 00:39:18,680 --> 00:39:22,160 Speaker 2: you have to have this conversation about distributive justice, a 792 00:39:22,360 --> 00:39:25,200 Speaker 2: conversation about how do we... That's, it's a really hard 793 00:39:25,239 --> 00:39:28,560 Speaker 2: conversation, right, to have. And so this brings me 794 00:39:28,560 --> 00:39:30,080 Speaker 2: to my next thing, which is that, we were talking 795 00:39:30,080 --> 00:39:34,360 Speaker 2: backstage, you have, you have two daughters, one in college, 796 00:39:34,400 --> 00:39:37,239 Speaker 2: one about to go to college. That's right, so they're 797 00:39:37,280 --> 00:39:41,719 Speaker 2: both science minded. So tell me about the conversations you 798 00:39:41,719 --> 00:39:44,560 Speaker 2: have with your daughters. You have a unique conversation 799 00:39:44,680 --> 00:39:47,680 Speaker 2: with your daughters because your conversation, your advice to them, 800 00:39:47,880 --> 00:39:51,280 Speaker 2: is influenced by what you do for a living. 801 00:39:51,560 --> 00:39:52,400 Speaker 3: Yes, it's true. 802 00:39:52,560 --> 00:39:56,880 Speaker 2: So did you warn your daughters away from certain fields? 803 00:39:56,920 --> 00:40:02,440 Speaker 2: Did you say, whatever you do, don't be... No, no. 804 00:40:01,440 --> 00:40:04,040 Speaker 4: No, that's not my style. I mean, for me, no, 805 00:40:04,320 --> 00:40:06,439 Speaker 4: I try not to be, like, you know, preachy about that.
806 00:40:07,400 --> 00:40:09,760 Speaker 4: So for me, it was just about showing by example 807 00:40:09,840 --> 00:40:12,520 Speaker 4: the things I love, right, and things I care about, 808 00:40:12,920 --> 00:40:14,920 Speaker 4: and then, you know, bringing them to the lab and 809 00:40:14,960 --> 00:40:18,440 Speaker 4: seeing things, and then the natural conversations about things I'm working 810 00:40:18,480 --> 00:40:21,800 Speaker 4: on or interesting people I meet. So to the extent 811 00:40:21,840 --> 00:40:23,759 Speaker 4: that they have chosen that, and obviously this has an 812 00:40:23,800 --> 00:40:27,920 Speaker 4: influence on them, it has been through seeing it, you know, 813 00:40:28,000 --> 00:40:29,400 Speaker 4: perhaps through my eyes, right, 814 00:40:29,280 --> 00:40:30,560 Speaker 3: and what they see me do, and that I 815 00:40:30,640 --> 00:40:31,600 Speaker 3: like my profession, right. 816 00:40:31,680 --> 00:40:34,600 Speaker 2: But one of your daughters, you said, is thinking that 817 00:40:34,640 --> 00:40:37,920 Speaker 2: she wants to be a doctor. But being a doctor 818 00:40:38,000 --> 00:40:40,520 Speaker 2: in a post-AI world is surely a very different 819 00:40:40,520 --> 00:40:43,160 Speaker 2: proposition than being a doctor in a pre-AI world. 820 00:40:43,600 --> 00:40:46,279 Speaker 2: Do you think... have you tried to prepare her 821 00:40:46,320 --> 00:40:49,120 Speaker 2: for that difference? Have you explained to her what you 822 00:40:49,200 --> 00:40:51,319 Speaker 2: think will happen to this profession she might enter? 823 00:40:51,840 --> 00:40:54,760 Speaker 4: Yeah, I mean, not in, like, you know, an incredible amount 824 00:40:54,800 --> 00:40:57,640 Speaker 4: of detail, but, but yes, at the level of 825 00:40:57,760 --> 00:41:02,000 Speaker 4: understanding what is changing, like the lens of information, the lens 826 00:41:02,040 --> 00:41:03,920 Speaker 4: with which you can look at the world and what 827 00:41:04,080 --> 00:41:07,600 Speaker 4: is possible and what it can do. Like, what is 828 00:41:07,600 --> 00:41:09,960 Speaker 4: our role and what is the role of the technology, 829 00:41:09,960 --> 00:41:12,839 Speaker 4: and how that shapes things. At that level of abstraction, for sure, 830 00:41:13,160 --> 00:41:15,760 Speaker 4: but not at the level of, like, don't be a radiologist, 831 00:41:15,880 --> 00:41:17,319 Speaker 4: you know, because this is what we 832 00:41:17,239 --> 00:41:19,320 Speaker 2: want for you. I was gonna say, if you aren't happy 833 00:41:19,320 --> 00:41:21,359 Speaker 2: with your current job, you could do a podcast called 834 00:41:21,360 --> 00:41:25,240 Speaker 2: Parenting Tips with Dario, which is just an AI person 835 00:41:25,840 --> 00:41:28,080 Speaker 2: gives you advice on what your kids should do based 836 00:41:28,120 --> 00:41:30,720 Speaker 2: on exactly this. Like, should I be a radiologist? Dario, 837 00:41:30,920 --> 00:41:35,240 Speaker 2: tell me. Like, it seems to be a really important question. Yeah, 838 00:41:35,560 --> 00:41:37,360 Speaker 2: let me ask this question in a more... I'm joking, 839 00:41:37,400 --> 00:41:41,640 Speaker 2: but in a more serious way. Surely it would... 840 00:41:41,719 --> 00:41:43,320 Speaker 2: I don't mean to use your daughter as an example, 841 00:41:43,360 --> 00:41:45,560 Speaker 2: but let's imagine we're giving advice to someone who wants 842 00:41:45,600 --> 00:41:49,920 Speaker 2: to enter medicine.
A really useful conversation to have is 843 00:41:50,440 --> 00:41:54,120 Speaker 2: what are the skills that will be most prized 844 00:41:55,040 --> 00:41:58,239 Speaker 2: in that profession fifteen years from now, and are they 845 00:41:58,280 --> 00:42:00,799 Speaker 2: different from the skills that are prized now? How would 846 00:42:00,840 --> 00:42:01,880 Speaker 2: you answer that question? 847 00:42:02,719 --> 00:42:03,000 Speaker 3: Yeah. 848 00:42:03,320 --> 00:42:07,160 Speaker 4: I think, for example, this goes back to how is 849 00:42:07,160 --> 00:42:10,080 Speaker 4: the scientific method, in this context, like, the practice of 850 00:42:10,120 --> 00:42:12,839 Speaker 4: medicine, going to change. I think we will see more 851 00:42:12,920 --> 00:42:15,319 Speaker 4: changes in how we practice the scientific method and so 852 00:42:15,360 --> 00:42:19,360 Speaker 4: on, as a consequence of what is happening with the 853 00:42:19,360 --> 00:42:22,600 Speaker 4: world of computing and information, how we represent information, how 854 00:42:22,600 --> 00:42:26,120 Speaker 4: we represent knowledge, how we extract meaning from knowledge as 855 00:42:26,120 --> 00:42:29,120 Speaker 4: a method, than we have seen in the last two 856 00:42:29,160 --> 00:42:32,880 Speaker 4: hundred years. So therefore, what I would, like, strongly encourage 857 00:42:32,960 --> 00:42:35,080 Speaker 4: is not about, like, hey, use this tool for doing 858 00:42:35,120 --> 00:42:38,080 Speaker 4: this or doing that, but in the curriculum itself, in 859 00:42:38,320 --> 00:42:41,920 Speaker 4: understanding how we do problem solving in the age of, 860 00:42:42,040 --> 00:42:45,120 Speaker 4: like, data and data representation and so on, that needs 861 00:42:45,120 --> 00:42:48,400 Speaker 4: to be embedded in the curriculum of everybody, you know. 862 00:42:48,440 --> 00:42:51,240 Speaker 3: That is, I would say, actually quite horizontal, 863 00:42:50,640 --> 00:42:53,360 Speaker 4: but certainly in the context of medicine and scientists and 864 00:42:53,360 --> 00:42:56,600 Speaker 4: so on, for sure. And to the extent that that 865 00:42:56,800 --> 00:42:59,400 Speaker 4: gets ingrained, that will give us a lens that, no 866 00:42:59,480 --> 00:43:03,480 Speaker 4: matter what specialty they go into within medicine, they will say, actually, 867 00:43:03,880 --> 00:43:06,440 Speaker 4: the way I want to be able to tackle improving 868 00:43:06,440 --> 00:43:08,799 Speaker 4: the quality of care, the way to do that is, 869 00:43:08,920 --> 00:43:11,320 Speaker 4: in addition to all the elements that we have practiced 870 00:43:11,719 --> 00:43:14,120 Speaker 4: in the field of medicine, this new lens. 871 00:43:14,360 --> 00:43:14,920 Speaker 3: And are we 872 00:43:14,920 --> 00:43:17,160 Speaker 4: representing the data the right way? Do we have the 873 00:43:17,239 --> 00:43:20,080 Speaker 4: right tools to be able to represent that knowledge? Am 874 00:43:20,080 --> 00:43:23,160 Speaker 4: I incorporating that, with my own 875 00:43:23,239 --> 00:43:25,320 Speaker 4: knowledge, in a way that gives me better outcomes, 876 00:43:25,400 --> 00:43:25,640 Speaker 3: right? 877 00:43:25,760 --> 00:43:29,080 Speaker 4: Do I have the rigor of benchmarking, too, on the quality 878 00:43:29,400 --> 00:43:31,839 Speaker 4: of the results? So that is what needs to be incorporated.
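To make "the rigor of benchmarking" concrete, here is a minimal sketch, assuming a hypothetical diagnostic classifier evaluated on held-out, synthetic data with scikit-learn. The dataset, features, model choice, and metrics are illustrative assumptions, not anything described in the conversation.

# Illustrative sketch only: benchmarking a hypothetical diagnostic classifier
# on synthetic data. All choices here are assumptions for demonstration.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, roc_auc_score

# Synthetic "patient" records: 20 numeric features, binary outcome.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

# Hold out a test set so quality is measured on cases the model never saw.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Report more than one number; a single metric rarely tells the whole story.
pred = model.predict(X_test)
prob = model.predict_proba(X_test)[:, 1]
print(f"accuracy: {accuracy_score(y_test, pred):.3f}")
print(f"ROC AUC:  {roc_auc_score(y_test, prob):.3f}")

The point of the sketch is the habit, not the particular model: represent the data explicitly, hold out unseen cases, and report quality against an agreed benchmark before trusting the tool in a workflow.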
879 00:43:32,000 --> 00:43:37,759 Speaker 2: How... in a perfect world, if I asked you and your 880 00:43:37,800 --> 00:43:42,320 Speaker 2: team to rewrite the curriculum for American medical schools, how dramatic 881 00:43:42,440 --> 00:43:45,640 Speaker 2: a revision is that? Are we tinkering with ten percent 882 00:43:45,680 --> 00:43:47,880 Speaker 2: of the curriculum, or are we tinkering with fifty percent of it? 883 00:43:50,080 --> 00:43:54,279 Speaker 4: I think there would be a subset of classes that 884 00:43:54,400 --> 00:43:56,800 Speaker 4: is about the method, the methodology, what has changed, like, 885 00:43:57,280 --> 00:44:01,000 Speaker 4: have this lens of it to understand. And then within 886 00:44:01,239 --> 00:44:05,799 Speaker 4: each class, that methodology will represent something that is embedded 887 00:44:05,880 --> 00:44:10,799 Speaker 4: in it, right? So it will be substantive, but 888 00:44:11,000 --> 00:44:14,320 Speaker 4: it doesn't mean replacing the specialization and the context 889 00:44:14,320 --> 00:44:16,759 Speaker 4: and the knowledge of each domain. But I do think 890 00:44:16,840 --> 00:44:20,839 Speaker 4: everybody should have sort of a basic knowledge of the horizontal, right, 891 00:44:20,880 --> 00:44:23,480 Speaker 4: what is it, how does it work, what tools you have, 892 00:44:23,840 --> 00:44:25,839 Speaker 4: what is the technology, and, like, you know, what are 893 00:44:25,800 --> 00:44:27,160 Speaker 3: the dos and don'ts around that. 894 00:44:27,640 --> 00:44:29,799 Speaker 4: And then in every area you say, and, you know, that 895 00:44:29,880 --> 00:44:32,920 Speaker 4: thing that you learned, this is how it applies to anatomy, 896 00:44:33,160 --> 00:44:34,880 Speaker 4: and this is how, you know, it applies to, 897 00:44:35,080 --> 00:44:37,600 Speaker 4: you know, radiology, if you study that, or this 898 00:44:37,640 --> 00:44:39,960 Speaker 4: is how you apply it, you know, in the context of discovery, 899 00:44:40,040 --> 00:44:42,000 Speaker 4: right, of cell structure, and this is how we can 900 00:44:42,080 --> 00:44:44,480 Speaker 4: use it, or protein folding and this is how it 901 00:44:44,800 --> 00:44:47,960 Speaker 4: does. So that way you'll see a connective tissue 902 00:44:48,360 --> 00:44:49,160 Speaker 4: throughout the whole thing. 903 00:44:49,440 --> 00:44:52,359 Speaker 2: Yeah, I mean, I would add to that, because I 904 00:44:52,400 --> 00:44:57,360 Speaker 2: was thinking that it's also this incredible opportunity 905 00:44:57,400 --> 00:45:00,239 Speaker 2: to do what doctors are supposed to do but don't 906 00:45:00,280 --> 00:45:03,160 Speaker 2: have time to do now, which is, they're so consumed 907 00:45:03,200 --> 00:45:07,799 Speaker 2: with figuring out what's wrong with you that they have 908 00:45:07,880 --> 00:45:11,600 Speaker 2: little time to talk about the implications of the diagnosis, 909 00:45:11,800 --> 00:45:14,720 Speaker 2: and what we really want, if we can free them 910 00:45:14,719 --> 00:45:17,600 Speaker 2: of some of the burden of what is actually quite 911 00:45:17,600 --> 00:45:20,120 Speaker 2: a prosaic question, what's wrong with you, is to leave 912 00:45:20,160 --> 00:45:23,160 Speaker 2: them the hard human thing of, like, should you be 913 00:45:23,320 --> 00:45:26,880 Speaker 2: scared or hopeful? Should you, you know... what do you 914 00:45:27,000 --> 00:45:28,719 Speaker 2: need to do? Or what...
Let me put this in 915 00:45:28,760 --> 00:45:31,360 Speaker 2: the context of all the patients I've seen. That conversation, 916 00:45:31,400 --> 00:45:33,040 Speaker 2: which is the most important one, is the one that, 917 00:45:33,760 --> 00:45:36,000 Speaker 2: it seems to me... So, like, if I had to, I 918 00:45:36,040 --> 00:45:39,880 Speaker 2: would add, if we're reimagining the curriculum of med school, 919 00:45:40,239 --> 00:45:43,879 Speaker 2: I'd like... with whatever, by the way, very little time, 920 00:45:43,880 --> 00:45:46,880 Speaker 2: maybe we have to add two more years to med school, 921 00:45:47,000 --> 00:45:52,120 Speaker 2: but, like, a whole... but the whole thing about bringing 922 00:45:52,200 --> 00:45:56,840 Speaker 2: back the human side of, you know... now, if I 923 00:45:56,840 --> 00:45:59,440 Speaker 2: can give you ten more minutes, how do you use 924 00:45:59,480 --> 00:46:00,520 Speaker 2: those ten more minutes? 925 00:46:00,640 --> 00:46:03,759 Speaker 4: But in that, in that reconceptualization that you just did 926 00:46:04,120 --> 00:46:06,080 Speaker 4: is what we should be doing around that, because I 927 00:46:06,080 --> 00:46:08,920 Speaker 4: think the debate as to, like, well, are we going 928 00:46:09,000 --> 00:46:10,719 Speaker 4: to need doctors or not, is actually a not very 929 00:46:10,760 --> 00:46:13,960 Speaker 4: useful debate. But rather, this other question is, how is 930 00:46:14,000 --> 00:46:16,720 Speaker 4: your time being spent? What problems are you getting stuck on? 931 00:46:17,040 --> 00:46:19,880 Speaker 4: I mean, I generalize this by, like, the obvious observation 932 00:46:20,040 --> 00:46:22,160 Speaker 4: that if you look around in your professions, in our 933 00:46:22,239 --> 00:46:24,839 Speaker 4: daily lives, we have not run out of problems to solve. 934 00:46:25,120 --> 00:46:27,200 Speaker 4: So an example of that is, hey, if I'm 935 00:46:27,200 --> 00:46:29,319 Speaker 4: spending all my time trying to do diagnosis, and I 936 00:46:29,320 --> 00:46:31,600 Speaker 4: could do that ten times faster, and it allows me 937 00:46:31,680 --> 00:46:34,560 Speaker 4: actually to go, you know, and take care of the 938 00:46:34,600 --> 00:46:36,560 Speaker 4: patients and all the next steps of what we have 939 00:46:36,600 --> 00:46:38,719 Speaker 4: to do about it, that's probably a trade-off that 940 00:46:38,760 --> 00:46:42,160 Speaker 4: a lot of doctors would take, right? And then you say, well, 941 00:46:42,200 --> 00:46:43,880 Speaker 4: you know, to what degree does it allow me to 942 00:46:43,920 --> 00:46:45,839 Speaker 4: do that? And I can do these other things, and 943 00:46:45,880 --> 00:46:49,200 Speaker 4: these other things are critically important for my profession around that. 944 00:46:49,640 --> 00:46:52,920 Speaker 4: So when you actually become less abstract, and, like, we 945 00:46:53,080 --> 00:46:56,160 Speaker 4: get past the futile conversation of, like, oh, there's no 946 00:46:56,200 --> 00:46:57,920 Speaker 4: more jobs and AI is going to take all of it, 947 00:46:57,920 --> 00:47:01,040 Speaker 4: which is kind of nonsense, you go back to say, 948 00:47:01,080 --> 00:47:05,120 Speaker 4: in practice, in your context, right, for you, what does 949 00:47:05,160 --> 00:47:06,680 Speaker 4: it mean, how do you work, 950 00:47:06,840 --> 00:47:08,480 Speaker 3: what can you do differently around that?
951 00:47:08,600 --> 00:47:11,040 Speaker 4: Actually, that's a much richer conversation, and very often we 952 00:47:11,080 --> 00:47:13,279 Speaker 4: would find ourselves that there's a portion of the work 953 00:47:13,320 --> 00:47:15,439 Speaker 4: we do that we say, I would rather do less 954 00:47:15,440 --> 00:47:17,880 Speaker 4: of that; this is this other part I like a lot, 955 00:47:18,200 --> 00:47:21,040 Speaker 4: and if it is possible that technology could help us 956 00:47:21,040 --> 00:47:22,360 Speaker 4: make that trade-off, I'll 957 00:47:22,160 --> 00:47:23,080 Speaker 3: take it in a heartbeat. 958 00:47:23,800 --> 00:47:27,479 Speaker 4: Now, poorly implemented technology can also create another problem. 959 00:47:27,640 --> 00:47:29,480 Speaker 3: You say, hey, this was supposed to solve 960 00:47:29,200 --> 00:47:32,399 Speaker 4: these things, but the way it's being implemented is not 961 00:47:32,440 --> 00:47:35,200 Speaker 4: helping me, right, it's making my life more miserable, 962 00:47:35,320 --> 00:47:37,920 Speaker 4: or so on, or I've lost connection with how I 963 00:47:38,000 --> 00:47:41,400 Speaker 4: used to work, et cetera. So that is why design 964 00:47:42,080 --> 00:47:45,200 Speaker 4: is so important. That is why workflow, also, is 965 00:47:45,239 --> 00:47:48,000 Speaker 4: so important in being able to solve these problems. But 966 00:47:48,840 --> 00:47:52,360 Speaker 4: it begins by, you know, going from the intergalactic to 967 00:47:52,440 --> 00:47:54,840 Speaker 4: the reality of it, of that faculty member in the 968 00:47:54,840 --> 00:47:57,759 Speaker 4: liberal arts college or, you know, a 969 00:47:57,920 --> 00:48:00,560 Speaker 4: practitioner in medicine in a hospital, and what it 970 00:48:00,680 --> 00:48:01,399 Speaker 4: means for them. 971 00:48:01,520 --> 00:48:01,680 Speaker 3: Right. 972 00:48:02,400 --> 00:48:06,640 Speaker 2: Yeah. What struck me, Dario, throughout our conversation is how 973 00:48:06,719 --> 00:48:11,880 Speaker 2: much of this revolution is non-technical, which is to say, 974 00:48:12,160 --> 00:48:14,120 Speaker 2: you guys are doing the technical thing here, but the 975 00:48:14,160 --> 00:48:17,080 Speaker 2: real revolution is going to require a whole range 976 00:48:17,120 --> 00:48:20,840 Speaker 2: of people doing things that have nothing to do with software, 977 00:48:21,080 --> 00:48:24,520 Speaker 2: that have to do with working out new human arrangements. 978 00:48:24,960 --> 00:48:27,600 Speaker 2: Talking about that, I do keep coming back to 979 00:48:27,640 --> 00:48:30,400 Speaker 2: the Hollywood strike thing, that you have to have a 980 00:48:30,400 --> 00:48:37,120 Speaker 2: conversation about our values as creators of movies. 981 00:48:37,160 --> 00:48:39,920 Speaker 2: How are we going to divide up the credit 982 00:48:39,960 --> 00:48:44,319 Speaker 2: and the like? That's a, that's a conversation about philosophy, 983 00:48:44,400 --> 00:48:46,319 Speaker 2: and, you know, it is. 984 00:48:46,280 --> 00:48:49,600 Speaker 5: And it's in the grand tradition of why, you know, 985 00:48:51,080 --> 00:48:52,160 Speaker 5: a liberal 986 00:48:51,960 --> 00:48:55,359 Speaker 4: education is so important in the broadest possible sense.
Right, 987 00:48:55,760 --> 00:48:59,480 Speaker 4: there's no common conception of the good, right, that is 988 00:48:59,520 --> 00:49:03,799 Speaker 4: always contested in a dialogue that happens within our society, and 989 00:49:03,920 --> 00:49:06,239 Speaker 4: technology is going to fit in that context too, right? 990 00:49:06,280 --> 00:49:08,520 Speaker 4: So that's why I personally, as a philosophy, I'm not 991 00:49:08,560 --> 00:49:12,280 Speaker 4: a technological determinist, right? And I don't like when colleagues 992 00:49:12,280 --> 00:49:14,960 Speaker 4: in my profession, right, start saying, like, well, this is 993 00:49:15,000 --> 00:49:17,560 Speaker 4: the way the technology is going to be, and by consequence, 994 00:49:17,760 --> 00:49:19,719 Speaker 4: this is how society is going to be. I'm like, 995 00:49:19,800 --> 00:49:22,920 Speaker 4: that's a highly contested goal. And if you want to 996 00:49:23,000 --> 00:49:25,560 Speaker 4: enter into the realm of politics or the other realms, 997 00:49:25,560 --> 00:49:28,120 Speaker 4: go and stand up on a stool and discuss whether 998 00:49:28,200 --> 00:49:30,279 Speaker 4: that's what society wants. You will find that there's a 999 00:49:30,400 --> 00:49:34,560 Speaker 4: huge diversity of opinions and perspectives, and that's what makes, 1000 00:49:34,600 --> 00:49:37,200 Speaker 4: you know, in a democracy, the richness of 1001 00:49:37,239 --> 00:49:39,440 Speaker 4: our society, and in the end that is going to 1002 00:49:39,440 --> 00:49:42,399 Speaker 4: be the centerpiece of the conversation: what do we want, 1003 00:49:43,320 --> 00:49:45,879 Speaker 4: you know, who gets what, and so on? And that 1004 00:49:46,200 --> 00:49:48,480 Speaker 4: is actually, I don't think it's anything negative. That's as it 1005 00:49:48,520 --> 00:49:51,400 Speaker 4: should be, because in the end it is anchored in who 1006 00:49:51,480 --> 00:49:55,000 Speaker 4: we want to be as humans, you know, as friends, family, citizens, 1007 00:49:55,160 --> 00:49:57,960 Speaker 4: and we have many overlapping sets of responsibilities, right? And 1008 00:49:58,000 --> 00:50:01,120 Speaker 4: as a technology creator, my own responsibilities are not just 1009 00:50:01,160 --> 00:50:02,960 Speaker 4: as a scientist and a technology creator. 1010 00:50:03,320 --> 00:50:04,400 Speaker 3: I'm also a member 1011 00:50:04,239 --> 00:50:06,000 Speaker 4: of a family, I'm a citizen, and I'm many other 1012 00:50:06,080 --> 00:50:08,160 Speaker 4: things that I care about. And I think that 1013 00:50:08,400 --> 00:50:13,560 Speaker 4: sometimes, in the debate, the technological determinists start 1014 00:50:13,640 --> 00:50:18,200 Speaker 4: now butting into what is the realm of, you know, 1015 00:50:18,600 --> 00:50:22,319 Speaker 4: justice and, you know, society and philosophy and democracy, 1016 00:50:22,680 --> 00:50:25,279 Speaker 4: and that's where they get the most uncomfortable, because it's 1017 00:50:25,280 --> 00:50:28,120 Speaker 4: like, I'm just telling you, like, you know, what's possible, 1018 00:50:28,400 --> 00:50:32,239 Speaker 4: and when there's pushback, it's like, yeah, but now we're 1019 00:50:32,280 --> 00:50:35,160 Speaker 4: talking about how we live and how we work and 1020 00:50:36,280 --> 00:50:36,839 Speaker 4: how much I 1021 00:50:36,800 --> 00:50:38,040 Speaker 3: get paid or not paid. 1022 00:50:38,360 --> 00:50:42,920 Speaker 4: So, that technology is important.
Technology shapes that conversation, but 1023 00:50:43,000 --> 00:50:45,759 Speaker 4: we're going to have the conversation with a different language, 1024 00:50:46,160 --> 00:50:49,360 Speaker 4: as it should be, and technologists need to get accustomed 1025 00:50:49,400 --> 00:50:51,319 Speaker 4: to it if they want to participate in that world with 1026 00:50:51,400 --> 00:50:54,360 Speaker 4: the broad consequences. Hey, get accustomed to dealing with 1027 00:50:54,400 --> 00:50:59,160 Speaker 4: the complexity of that world of politics, society, institutions, unions, 1028 00:50:59,200 --> 00:51:01,279 Speaker 4: all that stuff. And, you know, you can be, like, 1029 00:51:01,360 --> 00:51:03,840 Speaker 4: whiny about it, it's like, they're not adopting my technology. 1030 00:51:04,000 --> 00:51:06,280 Speaker 4: That's what it takes to bring technology into the world. 1031 00:51:07,200 --> 00:51:13,759 Speaker 2: Yeah. Well said. Thank you, Dario, for this wonderful conversation. 1032 00:51:13,920 --> 00:51:17,600 Speaker 2: Thank you to all of you for coming and listening, 1033 00:51:17,800 --> 00:51:19,239 Speaker 2: and thank you. 1034 00:51:19,520 --> 00:51:21,360 Speaker 1: Thank you. 1035 00:51:23,360 --> 00:51:26,480 Speaker 2: Dario Gil transformed how I think about the future of AI. 1036 00:51:27,200 --> 00:51:29,480 Speaker 2: He explained to me how huge of a leap it 1037 00:51:29,640 --> 00:51:33,000 Speaker 2: was when we went from chess-playing models to language 1038 00:51:33,080 --> 00:51:36,279 Speaker 2: learning models, and he talked about how we still have 1039 00:51:36,360 --> 00:51:39,000 Speaker 2: a lot of room to grow. That's why it's important 1040 00:51:39,360 --> 00:51:42,640 Speaker 2: that we get things right. The future of AI is 1041 00:51:42,760 --> 00:51:47,040 Speaker 2: impossible to predict, but the technology has so much potential 1042 00:51:47,120 --> 00:51:50,880 Speaker 2: in every industry. Zooming into an academic or medical setting 1043 00:51:51,080 --> 00:51:54,200 Speaker 2: showed just how close we are to the widespread adoption 1044 00:51:54,560 --> 00:51:58,200 Speaker 2: of AI. Even Hollywood is being forced to figure this out. 1045 00:51:58,840 --> 00:52:01,560 Speaker 2: Institutions of all sorts will have to be at the 1046 00:52:01,600 --> 00:52:05,280 Speaker 2: forefront of integration in order to unlock the full power 1047 00:52:05,280 --> 00:52:10,239 Speaker 2: of AI thoughtfully and responsibly. Humans have the power and 1048 00:52:10,280 --> 00:52:14,279 Speaker 2: the responsibility to shape the tech for our world. I, 1049 00:52:14,520 --> 00:52:17,399 Speaker 2: for one, am excited to see how things play out. 1050 00:52:18,719 --> 00:52:22,879 Speaker 2: Smart Talks with IBM is produced by Matt Romano, Joey Fishground, 1051 00:52:23,120 --> 00:52:27,520 Speaker 2: David Jaw and Jacob Goldstein. We're edited by Lydia Jean Kott. 1052 00:52:27,840 --> 00:52:32,320 Speaker 2: Our engineers are Jason Gambrell, Sarah Bruguier, and Ben Holliday. 1053 00:52:32,960 --> 00:52:38,200 Speaker 2: Theme song by Gramoscope. Special thanks to Andy Kelly, Kathy Callahan, 1054 00:52:38,560 --> 00:52:41,399 Speaker 2: and the 8 Bar and IBM teams, as well as 1055 00:52:41,400 --> 00:52:45,160 Speaker 2: the Pushkin marketing team. Smart Talks with IBM is a 1056 00:52:45,160 --> 00:52:49,759 Speaker 2: production of Pushkin Industries and Ruby Studio at iHeartMedia.
To 1057 00:52:49,800 --> 00:52:54,880 Speaker 2: find more Pushkin podcasts, listen on the iHeartRadio app, Apple Podcasts, 1058 00:52:55,000 --> 00:53:00,279 Speaker 2: or wherever you listen to podcasts. I'm Malcolm Gladwell. This 1059 00:53:00,360 --> 00:53:07,239 Speaker 2: is a paid advertisement from IBM.