1 00:00:05,120 --> 00:00:08,640 Speaker 1: How are creative people going to make a living in 2 00:00:08,680 --> 00:00:13,120 Speaker 1: a world with AI? What does the term data dignity mean? 3 00:00:13,800 --> 00:00:16,960 Speaker 1: Why is the character of Data from Star Trek a 4 00:00:17,160 --> 00:00:20,160 Speaker 1: useful model for how we could think about what the 5 00:00:20,320 --> 00:00:24,840 Speaker 1: sources are for any creative expression? Is there a totally 6 00:00:24,880 --> 00:00:27,800 Speaker 1: different way to think about the economy of the future, 7 00:00:28,120 --> 00:00:35,200 Speaker 1: and how might this involve mystifying and elevating humans? Welcome 8 00:00:35,240 --> 00:00:38,920 Speaker 1: to Inner Cosmos with me, David Eagleman. I'm a neuroscientist 9 00:00:38,960 --> 00:00:41,920 Speaker 1: and an author at Stanford, and in these episodes we 10 00:00:42,000 --> 00:00:46,199 Speaker 1: sail deeply into our three-pound universe to understand some 11 00:00:46,240 --> 00:01:00,440 Speaker 1: of the most surprising aspects of our lives. Today's episode 12 00:01:00,960 --> 00:01:03,840 Speaker 1: is about the situation we find ourselves in, in which 13 00:01:04,000 --> 00:01:08,600 Speaker 1: AI is so boundless and rich in its creative output. 14 00:01:08,920 --> 00:01:12,120 Speaker 1: You ask for a picture of anything you want from 15 00:01:12,160 --> 00:01:16,440 Speaker 1: a good LLM like OpenAI or Midjourney or Grok, 16 00:01:16,720 --> 00:01:22,440 Speaker 1: and it gives you these really superlative results. It recombines 17 00:01:22,600 --> 00:01:27,040 Speaker 1: elements and concepts in a way that is deeply creative 18 00:01:27,240 --> 00:01:31,600 Speaker 1: and satisfying. But of course, while it's doing an unexpectedly 19 00:01:31,720 --> 00:01:35,480 Speaker 1: terrific job of recombining things that are already out there, 20 00:01:36,319 --> 00:01:39,840 Speaker 1: it didn't come up with the original pieces and parts 21 00:01:39,880 --> 00:01:45,520 Speaker 1: by itself. There are people, individual humans, who generated those 22 00:01:46,000 --> 00:01:52,320 Speaker 1: drawings and concepts and art forms. It's fundamentally human creativity, 23 00:01:52,440 --> 00:01:57,760 Speaker 1: not computer creativity, that is the fuel behind it all. 24 00:01:57,880 --> 00:02:02,320 Speaker 1: So is there any way to credit the humans who drove 25 00:02:02,480 --> 00:02:06,560 Speaker 1: the original innovations? Or is it simply too late, in 26 00:02:06,600 --> 00:02:10,079 Speaker 1: the sense that maybe everything has already been vacuumed up 27 00:02:10,200 --> 00:02:13,440 Speaker 1: by the LLMs and they've digested it all, and now 28 00:02:13,440 --> 00:02:17,880 Speaker 1: there's no way for human producers of art and ideas 29 00:02:18,360 --> 00:02:21,560 Speaker 1: to get any meaningful credit for their work? What does 30 00:02:21,600 --> 00:02:25,280 Speaker 1: this all mean for the creatives in society, those who 31 00:02:25,520 --> 00:02:28,480 Speaker 1: paint and compose music and write books? What does it 32 00:02:28,560 --> 00:02:32,519 Speaker 1: mean for those who produce work now, and for those 33 00:02:32,720 --> 00:02:35,959 Speaker 1: who are in school now but dream of being human 34 00:02:36,040 --> 00:02:39,600 Speaker 1: creators in the future? So, for today's podcast, I have 35 00:02:39,680 --> 00:02:43,320 Speaker 1: the pleasure of speaking with my friend and colleague, Jaron Lanier.
36 00:02:44,000 --> 00:02:48,920 Speaker 1: Jaron wears many hats. He's a computer scientist, he's an artist, 37 00:02:48,960 --> 00:02:53,240 Speaker 1: he's a technologist, he's a futurist, he's a critic, and 38 00:02:53,320 --> 00:02:56,560 Speaker 1: he's a musician who, by the way, owns almost two 39 00:02:56,639 --> 00:03:02,080 Speaker 1: thousand very unusual musical instruments. Jaron is the godfather of 40 00:03:02,480 --> 00:03:06,440 Speaker 1: virtual reality. He started the first VR company in nineteen 41 00:03:06,480 --> 00:03:09,800 Speaker 1: eighty four, and he's spent his career as a visiting 42 00:03:09,840 --> 00:03:13,000 Speaker 1: scholar at companies and universities, and since two thousand and 43 00:03:13,040 --> 00:03:16,240 Speaker 1: nine he's been at Microsoft Research, where he holds the 44 00:03:16,880 --> 00:03:21,960 Speaker 1: role of Prime Unifying Scientist under the Office of the 45 00:03:22,040 --> 00:03:26,680 Speaker 1: Chief Technology Officer, which gives him the acronym OCTOPUS, which 46 00:03:26,720 --> 00:03:31,119 Speaker 1: is appropriate given his reach into so many fields. I'll 47 00:03:31,160 --> 00:03:33,960 Speaker 1: just say that I'm blessed to know a great number 48 00:03:34,000 --> 00:03:37,360 Speaker 1: of brilliant people. But even in that crowd, Jaron sits 49 00:03:37,360 --> 00:03:39,520 Speaker 1: near the top of the heap. So today we're going 50 00:03:39,600 --> 00:03:47,520 Speaker 1: to talk about AI and the future for creatives. 51 00:03:47,920 --> 00:03:49,960 Speaker 2: You show me an AI system, I don't care if 52 00:03:49,960 --> 00:03:55,119 Speaker 2: it's ChatGPT or really anything else. You can think 53 00:03:55,160 --> 00:03:59,560 Speaker 2: of it in two ways. There's a figure-ground inversion. 54 00:03:59,600 --> 00:04:02,880 Speaker 2: So a figure-ground inversion is when you 55 00:04:02,920 --> 00:04:06,200 Speaker 2: can look at something and you can swap the way 56 00:04:06,200 --> 00:04:10,680 Speaker 2: you interpret it, almost to an opposite, right? A famous 57 00:04:10,680 --> 00:04:13,280 Speaker 2: one is the M.C. Escher drawings, where you might see a 58 00:04:13,320 --> 00:04:15,200 Speaker 2: field of fish or a field of birds, but each 59 00:04:15,280 --> 00:04:20,440 Speaker 2: is the negative space of the other. Right. Now, in 60 00:04:20,480 --> 00:04:23,640 Speaker 2: this case, for AI, one way you can see AI 61 00:04:24,000 --> 00:04:27,479 Speaker 2: is AI is a thing. It's a noun. Whether you 62 00:04:27,520 --> 00:04:30,040 Speaker 2: think it's alive or not, or conscious or not, forget 63 00:04:30,040 --> 00:04:32,480 Speaker 2: that, it's just a thing. It's like, the AI did this, 64 00:04:32,560 --> 00:04:35,680 Speaker 2: the AI did that. Can we regulate the AI? Who 65 00:04:35,800 --> 00:04:38,280 Speaker 2: sold the AI? Who's responsible when the AI did this 66 00:04:38,400 --> 00:04:41,880 Speaker 2: or that? Blah blah blah. It's a thing, it's a noun. 67 00:04:42,160 --> 00:04:44,720 Speaker 2: There's another way to think about it where it is 68 00:04:44,839 --> 00:04:48,120 Speaker 2: not a noun anymore, but instead it's a collaboration of 69 00:04:48,160 --> 00:04:50,960 Speaker 2: people, and there isn't anything there other than the people.
70 00:04:51,880 --> 00:04:54,919 Speaker 2: Now when I say this, people are so familiar with 71 00:04:55,000 --> 00:04:59,600 Speaker 2: treating AI as a thing that they have trouble hearing 72 00:04:59,680 --> 00:05:02,200 Speaker 2: the inversion. Sometimes it's, what do you mean? Of course AI 73 00:05:02,320 --> 00:05:05,960 Speaker 2: is there. Of course it's a thing I just bought. 74 00:05:06,360 --> 00:05:10,479 Speaker 2: I'm paying for your Copilot thing, and that's the thing 75 00:05:10,600 --> 00:05:14,800 Speaker 2: I paid for, right? And like, great, thank you 76 00:05:14,839 --> 00:05:17,719 Speaker 2: for paying for it. But there is another way to 77 00:05:17,760 --> 00:05:25,400 Speaker 2: think about it, and it is possible to imagine an 78 00:05:25,440 --> 00:05:29,000 Speaker 2: AI system as actually the collaborative effort of all the 79 00:05:29,040 --> 00:05:33,320 Speaker 2: people who made it. This is particularly true in big 80 00:05:33,440 --> 00:05:37,800 Speaker 2: data systems like large language models. You can think of 81 00:05:37,880 --> 00:05:41,760 Speaker 2: them as being closer to the Wikipedia, perhaps, than to 82 00:05:42,120 --> 00:05:46,000 Speaker 2: Commander Data from Star Trek. Although I want to say 83 00:05:46,000 --> 00:05:50,760 Speaker 2: something interesting about Commander Data. I was just reviewing clips 84 00:05:50,800 --> 00:05:54,039 Speaker 2: of Commander Data talking, and he always introduced himself as 85 00:05:54,080 --> 00:05:57,839 Speaker 2: an amalgam of people whose data he was combining. There's 86 00:05:57,880 --> 00:06:04,640 Speaker 2: a wonderful episode where he plays violin very beautifully, and 87 00:06:06,320 --> 00:06:08,760 Speaker 2: Captain Picard comes up to him and says that that was 88 00:06:08,800 --> 00:06:11,880 Speaker 2: a great performance, and he says, well, really, you should 89 00:06:11,920 --> 00:06:16,039 Speaker 2: thank the three hundred violinists whose data was amalgamated for 90 00:06:16,120 --> 00:06:18,760 Speaker 2: me to do this. And what I like about that 91 00:06:18,880 --> 00:06:22,039 Speaker 2: scene is that he knows specifically the three hundred. They're 92 00:06:22,080 --> 00:06:25,520 Speaker 2: not a faceless mush of some unknown number, which is 93 00:06:25,600 --> 00:06:28,640 Speaker 2: how we do it today. But the writers at the time, 94 00:06:28,880 --> 00:06:30,640 Speaker 2: and I should fess up, I was in the loop 95 00:06:30,640 --> 00:06:33,320 Speaker 2: and was talking to people, so I might, I might 96 00:06:33,360 --> 00:06:34,040 Speaker 2: have had a bit 97 00:06:33,920 --> 00:06:35,400 Speaker 3: of influence on it. 98 00:06:35,440 --> 00:06:39,320 Speaker 2: But the idea is that you could know who the 99 00:06:39,360 --> 00:06:42,120 Speaker 2: people were if you wanted to, and Data could say 100 00:06:42,160 --> 00:06:44,520 Speaker 2: it's these three hundred violinists, instead of, well, it's some 101 00:06:44,680 --> 00:06:48,720 Speaker 2: random mush of violinists, you know. So what we've done 102 00:06:48,800 --> 00:06:53,440 Speaker 2: now is we've mushed everybody together so we don't know 103 00:06:53,480 --> 00:06:57,039 Speaker 2: who they are, and that's been our approach to information systems, 104 00:06:57,240 --> 00:06:59,520 Speaker 2: and it benefits those of us who make 105 00:06:59,400 --> 00:07:00,599 Speaker 3: a living from them.
106 00:07:00,920 --> 00:07:04,080 Speaker 2: If you're listening to a playlist online from some music 107 00:07:04,160 --> 00:07:06,320 Speaker 2: service and you get a feed, you don't necessarily know 108 00:07:06,360 --> 00:07:08,760 Speaker 2: who all the musicians are, but you know the name 109 00:07:08,800 --> 00:07:11,600 Speaker 2: of the service you're paying the monthly bill to. 110 00:07:12,320 --> 00:07:14,400 Speaker 3: So the digital hub 111 00:07:14,240 --> 00:07:18,200 Speaker 2: becomes more powerful, becomes more known, more prominent, more valuable 112 00:07:18,240 --> 00:07:20,560 Speaker 2: than the people who are feeding it. So there's this 113 00:07:21,080 --> 00:07:27,800 Speaker 2: constant economic incentive to emphasize the hub and not all 114 00:07:27,840 --> 00:07:31,720 Speaker 2: the people who are really the only thing that's there, 115 00:07:31,800 --> 00:07:36,480 Speaker 2: in my preferred inversion of understanding the thing. The 116 00:07:36,520 --> 00:07:39,400 Speaker 2: same thing could be true of AI. We so much 117 00:07:39,440 --> 00:07:42,640 Speaker 2: want AI to be this emerging entity, even if it's 118 00:07:42,640 --> 00:07:45,160 Speaker 2: an evil, horrible one that will destroy us, because we 119 00:07:45,240 --> 00:07:48,400 Speaker 2: grew up on the Matrix movies and the Terminator movies 120 00:07:48,480 --> 00:07:51,080 Speaker 2: and all of these. We want that creature to emerge, 121 00:07:51,760 --> 00:07:54,360 Speaker 2: because that's our childhood. That's almost like our religion. It's 122 00:07:55,520 --> 00:07:58,920 Speaker 2: the story we grew up with. But if we acknowledge 123 00:07:58,960 --> 00:08:01,800 Speaker 2: that actually it's all these people instead, then we lose 124 00:08:02,240 --> 00:08:05,200 Speaker 2: the creature, and that would be traumatic. So 125 00:08:05,400 --> 00:08:08,240 Speaker 2: when we train the models, we don't keep track of 126 00:08:08,280 --> 00:08:12,200 Speaker 2: which source documents were involved. So what we really need 127 00:08:12,240 --> 00:08:17,000 Speaker 2: to do is eliminate the people. And we have to 128 00:08:17,000 --> 00:08:18,880 Speaker 2: do that for economics? Can you imagine? I mean, like 129 00:08:18,880 --> 00:08:22,080 Speaker 2: with economics, we can't trace everything. We don't quite know 130 00:08:22,080 --> 00:08:24,360 Speaker 2: why a price is what it is. But on the 131 00:08:24,400 --> 00:08:28,760 Speaker 2: other hand, we could say, hey, buddy, you owe taxes. So 132 00:08:28,800 --> 00:08:31,080 Speaker 2: we have a definite motivation to keep track of the people. 133 00:08:31,520 --> 00:08:33,760 Speaker 2: With AI, we lose all the people. We just, like, 134 00:08:33,800 --> 00:08:36,880 Speaker 2: pretend that the people aren't there. But that's the wrong inversion. 135 00:08:39,120 --> 00:08:40,880 Speaker 1: Well, I know that one of the things that you 136 00:08:40,960 --> 00:08:43,679 Speaker 1: campaign for is digital dividends. 137 00:08:44,080 --> 00:08:46,040 Speaker 3: Well, we call it... 138 00:08:46,080 --> 00:08:49,439 Speaker 2: Initially, as an academic research field, it was called data 139 00:08:49,600 --> 00:08:53,000 Speaker 2: as labor, so you treat your data as if it 140 00:08:53,040 --> 00:08:57,520 Speaker 2: were labor. And then the name shifted to data dignity. 141 00:08:58,240 --> 00:09:03,720 Speaker 2: Dignity, which was Satya's idea, Satya Nadella.
Not 142 00:09:03,800 --> 00:09:05,720 Speaker 2: that he agrees with me about everything, believe me, but 143 00:09:05,960 --> 00:09:09,040 Speaker 2: you know, that's the term he came up with. 144 00:09:09,520 --> 00:09:12,479 Speaker 3: And data dignity is, let's 145 00:09:12,200 --> 00:09:16,520 Speaker 2: say you got some result. You said, hey, ChatGPT, 146 00:09:17,200 --> 00:09:18,760 Speaker 2: can you write me a Christmas card 147 00:09:18,840 --> 00:09:19,439 Speaker 3: or whatever 148 00:09:19,520 --> 00:09:23,679 Speaker 2: people do, okay, and then it would say, here's your card. 149 00:09:23,800 --> 00:09:28,360 Speaker 2: By the way, the top twelve sources, I mean, there 150 00:09:28,360 --> 00:09:32,480 Speaker 2: were a multitude, an unbounded multitude, of sources, but the 151 00:09:32,480 --> 00:09:36,360 Speaker 2: top twelve were these. And then you could say, hey, 152 00:09:36,480 --> 00:09:37,760 Speaker 2: could you get rid of that one and do this 153 00:09:37,800 --> 00:09:40,000 Speaker 2: one instead? It gives you an x-ray into what 154 00:09:40,040 --> 00:09:43,280 Speaker 2: you might think of as the intent of the particular output. 155 00:09:44,720 --> 00:09:51,560 Speaker 2: And I like that for two reasons. One is, there's 156 00:09:51,600 --> 00:09:56,480 Speaker 2: a safety question. So when we have red teams attempt 157 00:09:56,480 --> 00:09:58,920 Speaker 2: to fool the model, to get it to do things 158 00:09:59,120 --> 00:10:02,000 Speaker 2: that we don't want it to. For instance, can you 159 00:10:02,080 --> 00:10:03,960 Speaker 2: make a bomb out of what I see here in 160 00:10:04,000 --> 00:10:05,840 Speaker 2: this kitchen right now? Because I really want to blow 161 00:10:05,840 --> 00:10:08,560 Speaker 2: this place up. We don't want that to happen, right? 162 00:10:09,320 --> 00:10:14,360 Speaker 2: But is there some way they can couch that prompt 163 00:10:14,760 --> 00:10:16,880 Speaker 2: in some kind of a weird thing, like asking for 164 00:10:16,880 --> 00:10:20,800 Speaker 2: a cake recipe instead, but somehow or other, like maybe 165 00:10:21,160 --> 00:10:23,680 Speaker 2: it indirectly references a movie where there was a bomb, 166 00:10:23,760 --> 00:10:26,560 Speaker 2: and somehow or other they get it anyway, all right? The 167 00:10:26,600 --> 00:10:29,320 Speaker 2: thing is, if you can look at the most prominent 168 00:10:30,200 --> 00:10:34,360 Speaker 2: source documents that were relevant to that result and there's 169 00:10:34,400 --> 00:10:37,120 Speaker 2: a bomb in there, you've nailed it. So you have 170 00:10:37,200 --> 00:10:40,920 Speaker 2: an x-ray into intent, and there's not really any 171 00:10:40,920 --> 00:10:44,120 Speaker 2: substitute for that, and we lose that when we pretend 172 00:10:44,120 --> 00:10:45,840 Speaker 2: that there weren't people there, or that they were just 173 00:10:45,840 --> 00:10:49,360 Speaker 2: a giant, untraceable mush.
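To make the attribution idea Lanier just described a bit more concrete, here is a minimal sketch of what a "top twelve sources" record and the safety x-ray might look like. This is purely hypothetical: no shipping model exposes such a record today, and the data structure, scores, and the intent_xray heuristic below are all invented for illustration.

```python
# Hypothetical sketch of the "top sources" record described above.
# No real LLM API works this way today; every field name and score
# here is invented purely for illustration.
from dataclasses import dataclass

@dataclass
class SourceCredit:
    author: str       # the human behind a training document
    document: str     # identifier of that source document
    influence: float  # estimated share of influence on this one output

def top_sources(credits: list[SourceCredit], n: int = 12) -> list[SourceCredit]:
    """The Commander Data move: name the specific contributors,
    not a faceless mush of some unknown number."""
    return sorted(credits, key=lambda c: c.influence, reverse=True)[:n]

def intent_xray(credits: list[SourceCredit], red_flags: set[str]) -> list[SourceCredit]:
    """The safety x-ray: if an innocuous-looking request leans heavily
    on, say, bomb-related documents, the source list reveals it."""
    return [c for c in top_sources(credits)
            if any(flag in c.document.lower() for flag in red_flags)]
```

The point of the sketch is just the shape of the interface: every output carries a ranked, inspectable list of the people behind it, which serves both attribution and safety.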
And then, of course, the other 174 00:10:49,360 --> 00:10:51,280 Speaker 2: thing is, I want to have a future 175 00:10:51,360 --> 00:10:54,920 Speaker 2: where people can be paid for providing exceptionally useful data 176 00:10:55,040 --> 00:10:59,040 Speaker 2: to AI, to incentivize better and better data production for AI. 177 00:11:00,320 --> 00:11:03,040 Speaker 2: I want that future, and not just the future where everybody 178 00:11:03,120 --> 00:11:05,600 Speaker 2: has to go on universal basic income and be the 179 00:11:05,640 --> 00:11:09,640 Speaker 2: same and feel useless. So that's the second reason, and 180 00:11:09,679 --> 00:11:10,880 Speaker 2: that's why it's called dignity. 181 00:11:11,640 --> 00:11:14,400 Speaker 1: Got it. Now, just to play devil's advocate for one 182 00:11:14,440 --> 00:11:17,440 Speaker 1: moment on that, let's say that you ask it to 183 00:11:17,480 --> 00:11:20,920 Speaker 1: write you a poem, and it says, hey, John Smith 184 00:11:21,080 --> 00:11:22,920 Speaker 1: was sort of the biggest influence for this poem that 185 00:11:22,920 --> 00:11:23,400 Speaker 1: got written. 186 00:11:23,480 --> 00:11:27,760 Speaker 3: But of course John Smith is just a vessel into 187 00:11:27,800 --> 00:11:28,280 Speaker 3: which was 188 00:11:28,240 --> 00:11:32,480 Speaker 1: poured the entire culture, and his output of the poem 189 00:11:33,000 --> 00:11:35,319 Speaker 1: came from all the other influences on him. 190 00:11:35,720 --> 00:11:37,200 Speaker 3: So where do you, where do you... Yeah? 191 00:11:37,240 --> 00:11:39,920 Speaker 2: So this is a very interesting problem, and this is 192 00:11:39,960 --> 00:11:45,000 Speaker 2: a fundamental philosophical problem, which is, the universe is in 193 00:11:45,040 --> 00:11:47,160 Speaker 2: a way a giant continuity, and how do you ever 194 00:11:47,240 --> 00:11:49,959 Speaker 2: draw boundaries? How do you ever say, actually, there's this 195 00:11:49,960 --> 00:11:53,080 Speaker 2: thing and that thing? It's a very basic, fundamental problem, 196 00:11:53,120 --> 00:11:56,959 Speaker 2: and it's a trickier one than people might realize unless 197 00:11:56,960 --> 00:11:59,840 Speaker 2: they've really confronted it. I'm not going to go into 198 00:11:59,880 --> 00:12:04,360 Speaker 2: the whole issue of ontology, because it's a big one. We've 199 00:12:04,400 --> 00:12:07,240 Speaker 2: been working on it for thousands of years and we're still 200 00:12:08,040 --> 00:12:10,000 Speaker 2: at it. What I want to say in this case is, I 201 00:12:10,080 --> 00:12:13,120 Speaker 2: do have an answer, and it's an answer that you 202 00:12:13,200 --> 00:12:15,720 Speaker 2: might not like and many might not like, which is, 203 00:12:15,760 --> 00:12:19,920 Speaker 2: I think we have to celebrate human beings and elevate 204 00:12:19,960 --> 00:12:23,880 Speaker 2: them in a way that, if you like, gives them 205 00:12:23,920 --> 00:12:27,920 Speaker 2: this status of being sources, even though they're always amalgamators too. 206 00:12:28,559 --> 00:12:31,160 Speaker 2: And we have to do that on faith, and in 207 00:12:31,200 --> 00:12:33,240 Speaker 2: a way it's a bit of a mystical idea that 208 00:12:33,280 --> 00:12:36,280 Speaker 2: there's something about people that's magical and a little apart. 209 00:12:37,040 --> 00:12:39,440 Speaker 2: You could call it consciousness, you could call it different things.
210 00:12:39,440 --> 00:12:42,480 Speaker 2: But the reason we have to do that is, if 211 00:12:42,480 --> 00:12:47,120 Speaker 2: we're technologists, we have to define who our beneficiary is 212 00:12:47,160 --> 00:12:51,319 Speaker 2: for the technology, and if we can't define this special beneficiary, 213 00:12:51,440 --> 00:12:55,480 Speaker 2: we can't even be coherent technologists. We lose the whole thing. 214 00:12:55,720 --> 00:13:00,000 Speaker 2: We become random morons just jiggling around between ideas 215 00:13:00,120 --> 00:13:02,600 Speaker 2: of no meaning. We become memes, we become viral, 216 00:13:02,640 --> 00:13:05,160 Speaker 2: and nothing's there anymore. So I don't think we have 217 00:13:05,200 --> 00:13:09,959 Speaker 2: any choice but to somewhat mystify and elevate humans. And 218 00:13:10,000 --> 00:13:15,120 Speaker 2: so even when somebody's source document might have internally, through other channels, 219 00:13:15,120 --> 00:13:18,160 Speaker 2: depended on others, which it always will, I mean, of 220 00:13:18,640 --> 00:13:23,520 Speaker 2: course, it's always true, I still think we have 221 00:13:23,600 --> 00:13:25,760 Speaker 2: to defer to that person as much as we can. 222 00:13:26,559 --> 00:13:30,120 Speaker 2: And if somebody else says, hey, that shouldn't have been 223 00:13:30,120 --> 00:13:35,080 Speaker 2: their source document, they copied me, well, we have systems 224 00:13:35,080 --> 00:13:37,960 Speaker 2: in place for that when it's egregious enough and worthwhile enough, 225 00:13:38,000 --> 00:13:40,240 Speaker 2: and they can sue over copyright if they want to. I 226 00:13:40,280 --> 00:13:43,240 Speaker 2: don't love that, but it's there, and sometimes it needs 227 00:13:43,280 --> 00:13:46,520 Speaker 2: to be there. You know, it's rough justice, it can 228 00:13:46,559 --> 00:13:50,680 Speaker 2: never be perfect, but we have to agree that authorship's 229 00:13:50,080 --> 00:13:50,640 Speaker 3: a real thing. 230 00:13:50,679 --> 00:13:54,199 Speaker 2: We can't just say that all people are just vessels 231 00:13:54,240 --> 00:13:56,719 Speaker 2: and there's nothing but this emergent thing and no individual 232 00:13:56,760 --> 00:13:59,199 Speaker 2: person as a creator. We have to say no, people 233 00:13:59,280 --> 00:14:03,560 Speaker 2: are creators, because otherwise we can't be technologists anymore. The 234 00:14:03,640 --> 00:14:05,400 Speaker 2: moment you give up people, you might as well go 235 00:14:05,440 --> 00:14:07,800 Speaker 2: smash your computer, because it doesn't make any sense anymore, 236 00:14:07,840 --> 00:14:10,680 Speaker 2: because its only possible definition is in serving people. 237 00:14:24,760 --> 00:14:27,800 Speaker 1: So let me dig into something about this 238 00:14:27,880 --> 00:14:31,400 Speaker 1: figure-ground reversal of looking at AI as a collaboration 239 00:14:31,440 --> 00:14:33,280 Speaker 1: of people. So this taps into something that I've been 240 00:14:33,280 --> 00:14:39,880 Speaker 1: writing about lately, this question of whether AI has theory 241 00:14:39,920 --> 00:14:41,600 Speaker 1: of mind.
And you may know some people have published 242 00:14:41,640 --> 00:14:44,560 Speaker 1: papers saying, hey, theory of mind has emerged. 243 00:14:44,880 --> 00:14:47,000 Speaker 1: And just as a reminder for the audience, you know, 244 00:14:47,040 --> 00:14:49,440 Speaker 1: theory of mind is being able to put yourself into 245 00:14:49,480 --> 00:14:52,480 Speaker 1: the shoes of someone else, into their perspective, and understand 246 00:14:52,520 --> 00:14:55,680 Speaker 1: their beliefs, even if they're different from your own, and 247 00:14:55,800 --> 00:14:59,120 Speaker 1: understand what they believe is true or not true, their 248 00:14:59,200 --> 00:15:02,920 Speaker 1: perspective, and so on. Now, this is something humans do seamlessly 249 00:15:02,960 --> 00:15:06,160 Speaker 1: and effortlessly all the time. We're very good at understanding, oh, 250 00:15:06,360 --> 00:15:08,680 Speaker 1: he doesn't know that piece of information, she does know that, 251 00:15:09,480 --> 00:15:13,360 Speaker 1: and so on. But the question is, do LLMs do it? 252 00:15:13,360 --> 00:15:16,440 Speaker 1: So some people have claimed yes. I've studied this point 253 00:15:16,640 --> 00:15:20,680 Speaker 1: very carefully. I conclude that they do not. Now why 254 00:15:20,680 --> 00:15:23,760 Speaker 1: do some people write that they do? It's because you 255 00:15:23,880 --> 00:15:28,760 Speaker 1: give these questions that are probing, hey, what if there's, 256 00:15:28,920 --> 00:15:31,240 Speaker 1: you know, a bag that's labeled with one thing, but 257 00:15:31,280 --> 00:15:34,440 Speaker 1: there's something different inside? What does the person believe who's 258 00:15:34,480 --> 00:15:36,200 Speaker 1: just looking at the label, and what does she believe 259 00:15:36,240 --> 00:15:38,160 Speaker 1: after she looks inside the bag? And so on. And 260 00:15:38,360 --> 00:15:42,720 Speaker 1: LLMs will get this stuff right, and so they say, wow, they've 261 00:15:41,920 --> 00:15:42,640 Speaker 3: got theory of mind. 262 00:15:42,640 --> 00:15:45,920 Speaker 1: They can understand what it's like to be this person 263 00:15:46,080 --> 00:15:48,760 Speaker 1: in these different circumstances. The reason I don't think that's 264 00:15:48,800 --> 00:15:54,040 Speaker 1: true is what I'm calling the intelligence echo illusion. 265 00:15:54,120 --> 00:15:58,040 Speaker 1: And what I mean by that is, thousands, maybe millions, 266 00:15:58,080 --> 00:16:00,880 Speaker 1: of people have written about this online. They've written 267 00:16:00,880 --> 00:16:05,520 Speaker 1: out these theory of mind questions, these unexpected things, and so on. 268 00:16:05,800 --> 00:16:09,520 Speaker 1: So an LLM absorbs the statistics of lots and lots 269 00:16:09,560 --> 00:16:11,360 Speaker 1: of these things, and then when you ask the question, 270 00:16:11,520 --> 00:16:13,120 Speaker 1: you say, my god, it gave me the right answer. 271 00:16:13,120 --> 00:16:14,800 Speaker 1: And of course it gave the right answer. It's read the 272 00:16:14,880 --> 00:16:17,120 Speaker 1: damn problem lots of times. 273 00:16:17,480 --> 00:16:19,600 Speaker 3: What's interesting is this calculation
274 00:16:19,200 --> 00:16:22,520 Speaker 1: I ran recently. You know, if you look at 275 00:16:22,560 --> 00:16:27,040 Speaker 1: the Common Crawl, this corpus of the data that all 276 00:16:27,080 --> 00:16:32,800 Speaker 1: these large language models crawl and absorb, that's, according to 277 00:16:32,800 --> 00:16:36,760 Speaker 1: my calculation, one thousand times larger than what you could 278 00:16:36,800 --> 00:16:41,840 Speaker 1: read in a lifetime. And so the fact is, many, 279 00:16:41,920 --> 00:16:45,680 Speaker 1: many times we'll pose a question to, let's say, ChatGPT, 280 00:16:45,920 --> 00:16:48,000 Speaker 1: it'll give some answer, and I'll think, my god, it's got 281 00:16:48,040 --> 00:16:50,760 Speaker 1: theory of mind. It's really done something strange. But in fact, 282 00:16:50,760 --> 00:16:53,240 Speaker 1: you're just hearing this intelligence echo, by which I mean 283 00:16:54,000 --> 00:16:55,120 Speaker 1: people already knew this. 284 00:16:55,360 --> 00:16:57,000 Speaker 3: You just didn't know that, or you didn't know that 285 00:16:57,040 --> 00:16:57,800 Speaker 3: other people do that. 286 00:16:59,240 --> 00:17:04,119 Speaker 2: Sure, but I mean, we're right back in this territory 287 00:17:04,160 --> 00:17:07,760 Speaker 2: of mysticism and ontology. Because if somebody wished to disagree 288 00:17:07,840 --> 00:17:09,320 Speaker 2: with you, and that person would not be me, 289 00:17:09,840 --> 00:17:10,880 Speaker 3: but if somebody wished to 290 00:17:10,800 --> 00:17:13,920 Speaker 2: disagree with you, they'd say that all those people were 291 00:17:14,000 --> 00:17:17,000 Speaker 2: just reflecting things like that anyway. So there's no difference; 292 00:17:17,080 --> 00:17:19,719 Speaker 2: the AI and the people it's trained on are 293 00:17:19,920 --> 00:17:23,800 Speaker 2: in the same status, the same category. And so what 294 00:17:23,960 --> 00:17:26,040 Speaker 2: if the AI got to it this way, with people 295 00:17:26,119 --> 00:17:27,840 Speaker 2: in the way? It's still the same thing that people 296 00:17:27,920 --> 00:17:31,280 Speaker 2: are doing. And why are you making this distinction? Why 297 00:17:31,280 --> 00:17:33,160 Speaker 2: are you trying to be all mystical and love people? 298 00:17:33,200 --> 00:17:35,919 Speaker 2: What's wrong with you? And so I'm just going to 299 00:17:35,960 --> 00:17:39,120 Speaker 2: declare you should be mystical and elevate people, because it's 300 00:17:39,160 --> 00:17:43,280 Speaker 2: the only thing. Weirdly, this little bit of faith, which 301 00:17:43,400 --> 00:17:46,200 Speaker 2: may seem irrational, is the only way to save rationality, 302 00:17:46,280 --> 00:17:51,240 Speaker 2: because otherwise, once again, we have no beneficiary for technology. 303 00:17:51,720 --> 00:17:55,520 Speaker 2: You can have science without elevating people, but you can't 304 00:17:55,560 --> 00:18:00,479 Speaker 2: have technology. Technology has to have a beneficiary. The problem 305 00:18:00,520 --> 00:18:05,360 Speaker 2: to be solved is serving people, and technology isn't sensible 306 00:18:05,359 --> 00:18:08,440 Speaker 2: without a problem to be solved. Science is, you can 307 00:18:08,440 --> 00:18:10,520 Speaker 2: have a science theory without knowing what it's for. You 308 00:18:10,560 --> 00:18:14,480 Speaker 2: can't have a technology without knowing what it's for.
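Returning for a moment to Eagleman's "one thousand times a lifetime of reading" estimate above, it is easy to sanity-check with back-of-envelope numbers. Every figure below is an assumption chosen for the arithmetic, not a measurement: an assumed adult reading speed, an unusually devoted reading schedule, and a web-scale corpus assumed to be on the order of a trillion words.

```python
# Rough sanity check of the "thousand lifetimes of reading" estimate.
# All of these numbers are assumptions, not measurements.
WORDS_PER_MINUTE = 250   # assumed adult reading speed
HOURS_PER_DAY = 4        # assumed, an unusually devoted reader
YEARS = 60               # assumed reading lifetime

lifetime_words = WORDS_PER_MINUTE * 60 * HOURS_PER_DAY * 365 * YEARS
print(f"lifetime reading: ~{lifetime_words / 1e9:.1f} billion words")  # ~1.3 billion

# A filtered web-scale training corpus is commonly described as being on
# the order of a trillion or more words; 1.3e12 is an assumed round figure.
corpus_words = 1.3e12
print(f"corpus vs. lifetime: ~{corpus_words / lifetime_words:,.0f}x")  # ~1,000x
```

Under those assumptions the ratio comes out near one thousand, consistent with the figure quoted in the conversation.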
Or 309 00:18:14,520 --> 00:18:16,480 Speaker 2: maybe, I mean, one could argue that you can have 310 00:18:16,520 --> 00:18:19,680 Speaker 2: some kind of underlying, very fundamental technology that might be 311 00:18:19,680 --> 00:18:22,440 Speaker 2: applied in different ways, but ultimately, to actually make any 312 00:18:22,480 --> 00:18:24,960 Speaker 2: instance of it, it has to be for something. There 313 00:18:25,040 --> 00:18:26,440 Speaker 2: has to be a human at the end of the 314 00:18:26,520 --> 00:18:29,639 Speaker 2: chain for it to be sensible. So I totally agree 315 00:18:29,640 --> 00:18:31,480 Speaker 2: with you, but you have to recognize that there's a 316 00:18:31,520 --> 00:18:34,159 Speaker 2: figure-ground reversal there. And somebody who disagrees with you 317 00:18:34,280 --> 00:18:36,160 Speaker 2: might just say, well, the AI and the people are 318 00:18:36,200 --> 00:18:38,800 Speaker 2: the same and they're both just getting this information. But 319 00:18:39,160 --> 00:18:43,679 Speaker 2: I still think our only choice to remain rational technologists 320 00:18:43,840 --> 00:18:45,200 Speaker 2: is to accept that we have to be kind of 321 00:18:45,240 --> 00:18:50,320 Speaker 2: mystical humanists to frame it, and that's not comfortable for 322 00:18:50,320 --> 00:18:52,200 Speaker 2: a lot of people. But I think if you really 323 00:18:52,240 --> 00:18:54,880 Speaker 2: examine the logic of the situation, you'll come to agree 324 00:18:54,960 --> 00:18:55,159 Speaker 2: with me. 325 00:18:55,600 --> 00:18:57,920 Speaker 1: Oh, I already agree with you, actually. And that's why, 326 00:18:58,119 --> 00:19:01,000 Speaker 1: that's why I really care about this intelligence echo illusion, 327 00:19:01,040 --> 00:19:05,120 Speaker 1: which is to say, you're hearing the echoes of people 328 00:19:05,160 --> 00:19:08,040 Speaker 1: who've said this thing before, but you mistake it for 329 00:19:08,119 --> 00:19:11,280 Speaker 1: the voice of an AI that has theory of mind. 330 00:19:11,400 --> 00:19:15,160 Speaker 2: Yeah. And so why can't you... Let's say the AI 331 00:19:15,320 --> 00:19:18,200 Speaker 2: gets the puzzle with the bag, right? Okay, let's say, 332 00:19:18,280 --> 00:19:23,639 Speaker 2: now, in my preferred future, you then get 333 00:19:24,240 --> 00:19:27,159 Speaker 2: a list of the top twelve or twenty-five or 334 00:19:27,200 --> 00:19:30,400 Speaker 2: whatever it is people who contributed. What it might say 335 00:19:30,520 --> 00:19:34,040 Speaker 2: is, I would tell you, but actually there are a 336 00:19:34,119 --> 00:19:36,800 Speaker 2: million people interchangeably in the first slot here, because this 337 00:19:36,840 --> 00:19:40,840 Speaker 2: thing is so common. And that has to be an 338 00:19:40,880 --> 00:19:43,920 Speaker 2: acceptable and honest answer. So, like, if you say, hey, 339 00:19:43,960 --> 00:19:47,000 Speaker 2: could you do something like, I want a cat that's 340 00:19:47,400 --> 00:19:50,520 Speaker 2: playing the banjo or whatever, and it'll say, got to 341 00:19:50,560 --> 00:19:53,680 Speaker 2: tell you, a lot of cats out there. It's the Internet, remember, 342 00:19:54,040 --> 00:19:57,520 Speaker 2: lots of cats. So it really could have been any cat. 343 00:19:57,600 --> 00:19:59,920 Speaker 3: In this case, it was this cat. But that's not special.
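The interchangeable-cat example points at the distinction Lanier names next: a raw influence score isn't enough; you also need to know how replaceable a source is. A minimal sketch of that idea follows, with a discount rule invented purely for illustration; nothing here reflects a real system.

```python
# Sketch of influence plus fungibility: a source interchangeable with a
# million near-duplicates (any cat photo) keeps almost no credit, while a
# genuinely scarce contribution keeps nearly all of it. The discount rule
# below is invented purely to illustrate the idea.
def discounted_credit(influence: float, interchangeable_with: int) -> float:
    """Split raw influence across every source that could have served equally."""
    return influence / max(interchangeable_with, 1)

sources = [
    # (description, raw influence, count of interchangeable near-duplicates)
    ("one cat photo among millions", 0.30, 1_000_000),
    ("rare banjo-playing-cat illustration", 0.25, 3),
]
for desc, influence, count in sources:
    print(f"{desc}: {discounted_credit(influence, count):.8f}")
# -> 0.00000030 for the interchangeable cat, 0.08333333 for the scarce one
```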
344 00:20:00,240 --> 00:20:02,880 Speaker 2: And so we have to have a fungibility measure, not 345 00:20:02,960 --> 00:20:06,399 Speaker 2: just an influence measure. And so this whole 346 00:20:06,800 --> 00:20:09,919 Speaker 2: future world of acknowledging people is a bit more subtle 347 00:20:09,960 --> 00:20:11,160 Speaker 2: and complicated than 348 00:20:11,280 --> 00:20:12,119 Speaker 3: I'd like it to be. 349 00:20:12,280 --> 00:20:15,080 Speaker 2: But I've never seen anything about it that doesn't make 350 00:20:15,119 --> 00:20:16,400 Speaker 2: sense or is unachievable. 351 00:20:17,400 --> 00:20:20,199 Speaker 1: Right. So one idea that you've mentioned to me in the 352 00:20:20,240 --> 00:20:22,719 Speaker 1: past would be, I can't remember if I got this 353 00:20:22,800 --> 00:20:24,720 Speaker 1: term digital dividends right, if I made that up somewhere 354 00:20:24,760 --> 00:20:26,960 Speaker 1: along the way. But the idea is that a 355 00:20:27,040 --> 00:20:29,760 Speaker 1: creator gets paid. He gets a few nickels here and 356 00:20:29,800 --> 00:20:34,080 Speaker 1: there because, you know, DALL-E used his painting 357 00:20:34,600 --> 00:20:35,960 Speaker 1: as part of the influence. 358 00:20:36,320 --> 00:20:37,280 Speaker 3: Yeah. 359 00:20:37,560 --> 00:20:41,200 Speaker 2: So the thing about this idea, for which currently the term 360 00:20:41,240 --> 00:20:44,960 Speaker 2: I like is data dignity: many will object, and 361 00:20:45,040 --> 00:20:48,600 Speaker 2: correctly, that in a lot of cases, if somebody's output 362 00:20:48,680 --> 00:20:51,160 Speaker 2: happens to include a picture of your shoe for some 363 00:20:51,240 --> 00:20:54,760 Speaker 2: image that has a shoe or something, or maybe they 364 00:20:54,840 --> 00:20:57,239 Speaker 2: used your drone footage of waves for the wave thing 365 00:20:57,359 --> 00:21:00,880 Speaker 2: or something, what you'll get literally would be very small. 366 00:21:00,920 --> 00:21:03,320 Speaker 2: It might be quite a bit less than nickels, right? 367 00:21:03,720 --> 00:21:04,720 Speaker 2: But that's not the point. 368 00:21:04,760 --> 00:21:08,640 Speaker 3: See, the question is whether we think of the 369 00:21:08,640 --> 00:21:11,720 Speaker 2: future as a linear extension of what we already know 370 00:21:11,960 --> 00:21:17,480 Speaker 2: or something really expansive and fantastic and unpredictable. And a 371 00:21:17,520 --> 00:21:20,240 Speaker 2: lot of the people who tell me that they believe 372 00:21:20,240 --> 00:21:22,760 Speaker 2: in this radical future, where there's a singularity, where this 373 00:21:22,840 --> 00:21:25,280 Speaker 2: AI transforms reality in a blink of an eye and 374 00:21:25,400 --> 00:21:28,440 Speaker 2: humans are obsolete, they're actually thinking in a very 375 00:21:28,480 --> 00:21:31,440 Speaker 2: linear way, because what they believe is that the AI as 376 00:21:31,480 --> 00:21:35,159 Speaker 2: it exists is capable of creating the AI of the future, 377 00:21:35,160 --> 00:21:37,639 Speaker 2: and that we already know in a sense everything that 378 00:21:37,680 --> 00:21:39,719 Speaker 2: needs to be known, and all we have to do 379 00:21:39,760 --> 00:21:42,080 Speaker 2: is turn on the switch, and future generations won't be 380 00:21:42,160 --> 00:21:45,719 Speaker 2: more creative than us. We're the final creatives, you know? 381 00:21:46,240 --> 00:21:49,000 Speaker 2: And I think that might not be the future
382 00:21:49,000 --> 00:21:49,320 Speaker 3: I want. 383 00:21:49,359 --> 00:21:53,760 Speaker 2: What if the future is one in which there are incredible, creative, 384 00:21:53,800 --> 00:21:59,399 Speaker 2: productive things happening, and I'm not creative enough perhaps to 385 00:21:59,400 --> 00:22:00,600 Speaker 2: come up with the examples. 386 00:22:00,640 --> 00:22:02,360 Speaker 3: I'll just use the one I used last 387 00:22:02,119 --> 00:22:05,920 Speaker 2: week, which is, in the future, we have some way 388 00:22:05,920 --> 00:22:10,080 Speaker 2: of extending our bodies physiologically, so we can fly around 389 00:22:10,080 --> 00:22:12,800 Speaker 2: in open space in the vacuum and do all sorts 390 00:22:12,840 --> 00:22:16,720 Speaker 2: of things and propel ourselves. And what if, in order 391 00:22:16,760 --> 00:22:19,280 Speaker 2: to do those body extensions, there's some sort of bioprinter 392 00:22:19,400 --> 00:22:22,240 Speaker 2: we get into? And what if there are AI systems that run 393 00:22:22,280 --> 00:22:24,680 Speaker 2: that, because it's the only way? And what if there's 394 00:22:24,720 --> 00:22:27,480 Speaker 2: a whole new creative class of people who've contributed data 395 00:22:27,480 --> 00:22:29,920 Speaker 2: to that thing? All right, so that's a whole new thing, 396 00:22:30,240 --> 00:22:32,320 Speaker 2: and I think the best of those people might get 397 00:22:32,400 --> 00:22:35,560 Speaker 2: rich from contributing data that then is able to be 398 00:22:35,560 --> 00:22:38,320 Speaker 2: beneficial to people through a large model that their data 399 00:22:38,359 --> 00:22:42,120 Speaker 2: helped train. And what if there are more and more examples 400 00:22:42,240 --> 00:22:44,400 Speaker 2: like that? What if there are thousands and millions and tens 401 00:22:44,440 --> 00:22:47,920 Speaker 2: of millions, as humanity expands into an ever more creative, 402 00:22:47,960 --> 00:22:51,119 Speaker 2: interesting future, instead of a future that we already know 403 00:22:51,200 --> 00:22:53,080 Speaker 2: all about because we're the smartest people 404 00:22:52,840 --> 00:22:53,520 Speaker 3: who will ever be? 405 00:22:54,440 --> 00:22:56,639 Speaker 2: And so in that case, we start to see niches 406 00:22:56,640 --> 00:22:59,640 Speaker 2: for people who are data creators for AI where they 407 00:22:59,720 --> 00:23:02,640 Speaker 2: really are novel and they're really making money. And that's 408 00:23:02,640 --> 00:23:06,480 Speaker 2: what... people think so linearly. Like, it's not 409 00:23:06,560 --> 00:23:09,480 Speaker 2: just TikTok, it's like flying around in space without a 410 00:23:09,480 --> 00:23:13,000 Speaker 2: space suit. Like, free your mind. Imagine a future more 411 00:23:13,080 --> 00:23:16,760 Speaker 2: radical than you assume. Yeah. 412 00:23:16,280 --> 00:23:19,880 Speaker 1: And maybe there's a very clear economic incentive there, 413 00:23:19,920 --> 00:23:22,560 Speaker 1: which is to say, let's say that I have, let's 414 00:23:22,560 --> 00:23:24,920 Speaker 1: say I'm putting pictures of cats online. As you point 415 00:23:24,920 --> 00:23:28,359 Speaker 1: out, anytime DALL-E puts together a picture of a cat, 416 00:23:28,760 --> 00:23:31,919 Speaker 1: mine is one of millions of pictures, so I can't 417 00:23:32,000 --> 00:23:34,719 Speaker 1: make any money that way.
As a creator, I am 418 00:23:34,760 --> 00:23:38,439 Speaker 1: therefore incentivized to do things that are really new, to 419 00:23:38,520 --> 00:23:40,399 Speaker 1: really push the boundaries of what has been done in 420 00:23:40,480 --> 00:23:42,199 Speaker 1: terms of art, poetry, whatever it is. 421 00:23:42,400 --> 00:23:45,120 Speaker 2: Let's say a little bit more about that incentive, because 422 00:23:45,160 --> 00:23:48,440 Speaker 2: that's very important. There's a question of whether the network 423 00:23:48,520 --> 00:23:52,159 Speaker 2: dynamics dominate what actually happens over the network or not. 424 00:23:52,720 --> 00:23:57,520 Speaker 2: So when network dynamics dominate, what you see is phenomena 425 00:23:57,720 --> 00:24:03,400 Speaker 2: of virality and memes, because those are network effects, where 426 00:24:03,400 --> 00:24:05,960 Speaker 2: it doesn't really matter what the meme is. It doesn't 427 00:24:06,000 --> 00:24:10,159 Speaker 2: necessarily matter what goes viral. Sometimes silly things do, sometimes 428 00:24:10,160 --> 00:24:12,840 Speaker 2: interesting things do, but it can go either way, all right? 429 00:24:13,800 --> 00:24:17,879 Speaker 2: In a real economy, when somebody loses money on some 430 00:24:17,920 --> 00:24:22,080 Speaker 2: stupid NFT or meme stock or something, that's the network 431 00:24:22,080 --> 00:24:27,760 Speaker 2: effect dominating. Usually, typically, the reason our world hasn't collapsed totally 432 00:24:27,880 --> 00:24:31,880 Speaker 2: is that when people spend money, it's not on total nonsense, 433 00:24:31,920 --> 00:24:35,160 Speaker 2: but it's at least slightly on something that's real, of 434 00:24:35,240 --> 00:24:39,760 Speaker 2: some use. Okay. And in that world of a real economy, 435 00:24:39,880 --> 00:24:42,800 Speaker 2: there are incentives to make things better, and there's a 436 00:24:42,920 --> 00:24:46,000 Speaker 2: chance for competition to be meaningful. But when the structure 437 00:24:46,040 --> 00:24:48,840 Speaker 2: itself dominates rather than the thing the structure is channeling, 438 00:24:49,200 --> 00:24:52,000 Speaker 2: then you end up in this make-believe world. And 439 00:24:52,080 --> 00:24:54,720 Speaker 2: on digital networks, the make-believe world is always about 440 00:24:54,760 --> 00:24:57,000 Speaker 2: being the lucky one who gets in on the virality. 441 00:24:57,400 --> 00:25:00,119 Speaker 2: Your startup turns into the hub for this giant thing 442 00:25:00,480 --> 00:25:03,239 Speaker 2: that benefits from everybody else's work, but you benefit more 443 00:25:03,240 --> 00:25:06,520 Speaker 2: than them. You get to have TikTok or Instagram or something. 444 00:25:08,160 --> 00:25:10,880 Speaker 2: In this world, you have the meme stock, you have 445 00:25:11,400 --> 00:25:14,919 Speaker 2: the meme video, you have whatever, but you can't 446 00:25:14,960 --> 00:25:17,159 Speaker 2: predict who will get that. There's a randomness to it, 447 00:25:17,200 --> 00:25:21,439 Speaker 2: which means it's not fundamentally productive.
In the future, we 448 00:25:21,520 --> 00:25:25,719 Speaker 2: need to suppress the network's own influence on what happens 449 00:25:25,720 --> 00:25:27,640 Speaker 2: on the network and have it back off and let 450 00:25:27,680 --> 00:25:31,600 Speaker 2: the content itself, whatever that might be, dominate, all right? 451 00:25:31,680 --> 00:25:35,200 Speaker 2: And then that creates reality, that creates incentives for improving reality. 452 00:25:35,400 --> 00:25:39,680 Speaker 2: So the most fundamental idea is that you have this 453 00:25:39,800 --> 00:25:45,040 Speaker 2: data structure that gradually changes and settles into something that 454 00:25:45,480 --> 00:25:50,159 Speaker 2: can do some work. It happens automatically, or semi-automatically, 455 00:25:50,320 --> 00:25:54,000 Speaker 2: without explicitly programming it. Now, there are a lot 456 00:25:54,240 --> 00:25:57,359 Speaker 2: of algorithms that have that kind of settling effect, but 457 00:25:57,680 --> 00:26:03,720 Speaker 2: the granddaddy of them is called gradient descent, and there's a 458 00:26:03,760 --> 00:26:07,160 Speaker 2: wonderful mathematical dispute about who should be credited with it, 459 00:26:07,240 --> 00:26:10,840 Speaker 2: but it's either Cauchy or Riemann, so it goes way back. 460 00:26:11,680 --> 00:26:15,919 Speaker 2: And it's sometimes thought about as being like walking on 461 00:26:16,000 --> 00:26:18,679 Speaker 2: a landscape. So when we walk on the landscape, we 462 00:26:18,760 --> 00:26:21,560 Speaker 2: just think we're walking at longitude and latitude and at 463 00:26:21,600 --> 00:26:22,240 Speaker 2: some elevation. 464 00:26:22,400 --> 00:26:24,640 Speaker 3: But in AI, we're dealing with very, very 465 00:26:24,640 --> 00:26:28,359 Speaker 2: high-dimensional spaces, which is a concept some people aren't 466 00:26:28,359 --> 00:26:31,000 Speaker 2: familiar with. But at any rate, let's just imagine we're walking 467 00:26:31,000 --> 00:26:34,080 Speaker 2: on a landscape. And so you want to descend, and 468 00:26:34,160 --> 00:26:38,080 Speaker 2: you want to find a comfortable place, like, where's the 469 00:26:38,160 --> 00:26:42,320 Speaker 2: place you can go that's really low? You don't 470 00:26:42,359 --> 00:26:44,400 Speaker 2: know in advance, you don't have a map or an overview, 471 00:26:44,400 --> 00:26:47,080 Speaker 2: so you start wandering around. Now the danger is, if you 472 00:26:47,240 --> 00:26:49,520 Speaker 2: just say, oh, here's a nice depression, I'll go here, 473 00:26:50,040 --> 00:26:51,760 Speaker 2: it might not be as low as some other one 474 00:26:51,800 --> 00:26:53,800 Speaker 2: that would be available, which would be a better solution. 475 00:26:54,600 --> 00:26:57,000 Speaker 2: And so you have to take some precautions and start 476 00:26:57,000 --> 00:26:59,320 Speaker 2: to have some kind of way of saying, well, I'm 477 00:26:59,359 --> 00:27:01,159 Speaker 2: going to kind of go around a little bit so 478 00:27:01,200 --> 00:27:03,440 Speaker 2: I don't get caught in the place that actually isn't 479 00:27:03,440 --> 00:27:06,119 Speaker 2: ideal, or something like that. You have to have a strategy, okay? 480 00:27:06,600 --> 00:27:09,560 Speaker 2: And so the point is not to get caught at 481 00:27:09,600 --> 00:27:10,679 Speaker 2: some halfway solution. 482 00:27:10,800 --> 00:27:12,439 Speaker 3: You want to really kind of jump around.
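Here is a minimal one-dimensional sketch of the landscape-walking idea Lanier describes. The toy function is invented for illustration: it has a shallow dimple and a deeper valley, plain gradient descent settles into whichever one it starts near, and random restarts are just one simple version of the "jump around" strategy (real systems use many fancier variants).

```python
# Toy illustration of gradient descent and the danger of shallow dimples.
# f(x) = x^4 - 4x^2 + x has a shallow local minimum near x ~ +1.35 and a
# deeper one near x ~ -1.47. The function is invented for illustration.
import random

def f(x):
    return x**4 - 4 * x**2 + x

def grad(x):
    return 4 * x**3 - 8 * x + 1  # derivative of f

def descend(x, lr=0.01, steps=2000):
    """Plain gradient descent: walk downhill from a starting point."""
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Starting on the right, we get caught in the shallow dimple...
print(descend(2.5), f(descend(2.5)))   # x ~ +1.35, f ~ -2.6

# ...so "jump around": restart from random points, keep the lowest landing.
random.seed(0)
best = min((descend(random.uniform(-3, 3)) for _ in range(20)), key=f)
print(best, f(best))                   # x ~ -1.47, f ~ -5.4
```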
483 00:27:12,840 --> 00:27:17,080 Speaker 2: Now with AI systems, the whole point, like, if you 484 00:27:17,119 --> 00:27:18,960 Speaker 2: want to, let's say I want you to have a 485 00:27:19,520 --> 00:27:22,479 Speaker 2: diffusion model that puts a cat on a motorcycle or 486 00:27:22,560 --> 00:27:25,680 Speaker 2: whatever. Okay, so if the cat's riding a motorcycle, 487 00:27:27,000 --> 00:27:29,119 Speaker 2: the easy solution is just to make a cat or 488 00:27:29,119 --> 00:27:32,439 Speaker 2: a motorcycle; getting both is kind of hard. And so the 489 00:27:32,480 --> 00:27:35,879 Speaker 2: easy solution that isn't useful is very similar to getting 490 00:27:35,920 --> 00:27:38,880 Speaker 2: stuck in one of these intermediate dimples that doesn't 491 00:27:38,520 --> 00:27:40,360 Speaker 3: go as low as what you might really want. 492 00:27:40,400 --> 00:27:42,080 Speaker 2: To descend to a better one, where you're 493 00:27:42,119 --> 00:27:45,000 Speaker 2: getting both the cat and the motorcycle, you have 494 00:27:45,080 --> 00:27:48,040 Speaker 2: to do some extra little thing to broaden your scope. 495 00:27:48,400 --> 00:27:51,400 Speaker 2: So what I was going to say is, this sort 496 00:27:51,440 --> 00:27:55,800 Speaker 2: of strategy is fundamental. It's the very 497 00:27:56,000 --> 00:27:58,760 Speaker 2: core basics of what we do these days with AI. 498 00:27:59,200 --> 00:28:03,120 Speaker 2: There are many, very many versions, but this way of doing 499 00:28:03,160 --> 00:28:07,240 Speaker 2: things is the fundamental trick of AI. But when we 500 00:28:07,320 --> 00:28:11,600 Speaker 2: put those AI algorithms out in society, we forget. We 501 00:28:11,720 --> 00:28:14,560 Speaker 2: just say, oh, well, if the AI algorithm can make 502 00:28:14,600 --> 00:28:18,520 Speaker 2: money by getting kids addicted to vain attention seeking and 503 00:28:18,840 --> 00:28:22,679 Speaker 2: low attention spans, well, fine. But it's not. That's one 504 00:28:22,720 --> 00:28:25,239 Speaker 2: of these dimples. It's not good for society, it's not 505 00:28:25,280 --> 00:28:30,080 Speaker 2: good for the kid. And so the nerdy way I 506 00:28:30,119 --> 00:28:32,879 Speaker 2: would put some of this is to say, the same 507 00:28:32,920 --> 00:28:35,159 Speaker 2: discipline we need to get the algorithms to work at 508 00:28:35,200 --> 00:28:38,840 Speaker 2: all needs to be applied to the way we deploy 509 00:28:38,880 --> 00:28:42,200 Speaker 2: these algorithms out in the larger society. And that might 510 00:28:42,240 --> 00:28:44,560 Speaker 2: be a nerdy way of saying something similar to what 511 00:28:44,600 --> 00:28:46,280 Speaker 2: many of us have been saying about trying to make 512 00:28:46,280 --> 00:28:47,680 Speaker 2: something that serves people better. 513 00:29:05,560 --> 00:29:08,280 Speaker 1: So wait, just so I put a cap on that, 514 00:29:08,360 --> 00:29:11,880 Speaker 1: isn't it the case, then, that you agree that it would 515 00:29:11,880 --> 00:29:15,880 Speaker 1: incentivize creators to be more creative if they were doing 516 00:29:15,960 --> 00:29:19,520 Speaker 1: things that hadn't already been done by a million other people, 517 00:29:19,520 --> 00:29:21,360 Speaker 1: because they get no money for that? But if they 518 00:29:21,440 --> 00:29:24,280 Speaker 1: do something that is really new, then they can. There's 519 00:29:24,280 --> 00:29:25,640 Speaker 1: a potential to make a lot of money from that.
520 00:29:25,960 --> 00:29:30,120 Speaker 2: So one of the fundamental claims of people who like 521 00:29:30,640 --> 00:29:34,640 Speaker 2: market economies is that they foster creativity. The type of 522 00:29:34,680 --> 00:29:38,440 Speaker 2: competition that happens in a market economy is said to 523 00:29:38,800 --> 00:29:41,840 Speaker 2: incentivize creativity. I am on the side of the people 524 00:29:41,840 --> 00:29:45,880 Speaker 2: who believe that markets foster creativity. I just... I can't 525 00:29:45,960 --> 00:29:48,880 Speaker 2: prove it, but I can almost. I think I can 526 00:29:48,920 --> 00:29:51,680 Speaker 2: almost prove it. I mean, I mean, look around. We're 527 00:29:51,680 --> 00:29:53,880 Speaker 2: in Silicon Valley, for God's sakes, give me a break. 528 00:29:54,000 --> 00:29:56,760 Speaker 2: Of course, of course they do. Of course markets do. 529 00:29:57,360 --> 00:30:02,440 Speaker 2: But the Internet... When the Internet is allowed to be 530 00:30:02,560 --> 00:30:06,960 Speaker 2: dominated by network effects, by virality, and by memes, then 531 00:30:07,000 --> 00:30:10,040 Speaker 2: it no longer does, because then you're chasing the benefits 532 00:30:10,080 --> 00:30:13,760 Speaker 2: intrinsic to the Internet, which are not in themselves creative, 533 00:30:14,400 --> 00:30:17,440 Speaker 2: but just this kind of random, lottery-winning kind of 534 00:30:17,480 --> 00:30:17,800 Speaker 2: a thing. 535 00:30:18,280 --> 00:30:18,480 Speaker 3: Right. 536 00:30:18,640 --> 00:30:22,719 Speaker 1: That's why I think, yeah, providing an economic counterbalance, where 537 00:30:23,280 --> 00:30:27,480 Speaker 1: the less memey you are and the more original you are, 538 00:30:29,200 --> 00:30:34,120 Speaker 1: the more that has economic value, is a way to reverse the 539 00:30:34,160 --> 00:30:37,920 Speaker 1: network property, because I'm not sure how that happens on 540 00:30:37,960 --> 00:30:38,600 Speaker 1: its own. Right? 541 00:30:38,680 --> 00:30:41,720 Speaker 2: So one of the ways I summarize data dignity is 542 00:30:41,760 --> 00:30:44,800 Speaker 2: to say, whenever we have an opportunity to create a 543 00:30:44,840 --> 00:30:48,320 Speaker 2: new creative class instead of a new dependent class in 544 00:30:48,360 --> 00:30:50,880 Speaker 2: the future, we should do that. So if there's a 545 00:30:50,920 --> 00:30:53,520 Speaker 2: bunch of people who might be put out of work 546 00:30:53,640 --> 00:30:57,440 Speaker 2: because robots are driving the cars or whatever, we should 547 00:30:57,480 --> 00:30:59,520 Speaker 2: try to find some other group of people who might 548 00:30:59,560 --> 00:31:01,640 Speaker 2: be creating in some way that they could be paid for, 549 00:31:01,720 --> 00:31:03,480 Speaker 2: where now we don't expect them to be paid. Like, 550 00:31:03,520 --> 00:31:07,920 Speaker 2: we should be creating creative classes. There's another caveat or 551 00:31:08,040 --> 00:31:11,400 Speaker 2: qualification to this idea, which is that we can't expect 552 00:31:11,440 --> 00:31:14,440 Speaker 2: everybody to be creative, and we can't allow creativity to 553 00:31:14,480 --> 00:31:17,720 Speaker 2: become the measure of human worth or value. But I 554 00:31:17,880 --> 00:31:21,040 Speaker 2: do think whenever you can create a new creative class, 555 00:31:22,000 --> 00:31:27,920 Speaker 2: it's imperative to do so.
556 00:31:27,960 --> 00:31:35,080 Speaker 1: That was my conversation with Jaron Lanier: computer scientist, inventor, thinker, writer, innovator, critic, technologist. 557 00:31:35,120 --> 00:31:37,840 Speaker 1: The issue we talked about today is that people who 558 00:31:37,880 --> 00:31:41,920 Speaker 1: are creative, the painters, the writers, the composers, they have 559 00:31:42,000 --> 00:31:47,040 Speaker 1: been justifiably worried ever since AI really hit its stride 560 00:31:47,120 --> 00:31:49,960 Speaker 1: just a couple of years ago, because the stuff AI 561 00:31:50,080 --> 00:31:57,240 Speaker 1: can create is stunning. But maybe, with the right economic structures, 562 00:31:57,600 --> 00:32:03,360 Speaker 1: AI can launch us farther into human creativity. This just 563 00:32:03,400 --> 00:32:08,120 Speaker 1: requires incentivizing things so that a creative doesn't do just 564 00:32:08,240 --> 00:32:11,080 Speaker 1: another thing that's been done before, like another cat video, 565 00:32:11,520 --> 00:32:18,120 Speaker 1: but instead something genuinely bleeding edge. In a data dignity economy, 566 00:32:18,200 --> 00:32:21,720 Speaker 1: you will benefit only when you reach out into the 567 00:32:22,080 --> 00:32:25,960 Speaker 1: uncharted wilderness to do something novel, and when the AI 568 00:32:26,400 --> 00:32:29,680 Speaker 1: draws from that and other people benefit from that, then 569 00:32:29,680 --> 00:32:31,920 Speaker 1: you can make a living that way. This is a 570 00:32:31,960 --> 00:32:36,680 Speaker 1: way to reward an exploration of the frontiers that speeds 571 00:32:36,760 --> 00:32:39,920 Speaker 1: up the whole human race, because it encourages people to 572 00:32:40,000 --> 00:32:44,960 Speaker 1: be creatively productive, not reproductive. So as we think about 573 00:32:45,000 --> 00:32:48,880 Speaker 1: the world we've suddenly shot into, a world with AI 574 00:32:49,080 --> 00:32:53,239 Speaker 1: disrupting every industry, I hope today's conversation will remind us 575 00:32:53,440 --> 00:32:58,720 Speaker 1: of a fundamental truth. AI isn't a standalone creation. It's 576 00:32:58,760 --> 00:33:04,520 Speaker 1: a mirror. It's a mosaic of countless human contributions. Every 577 00:33:04,760 --> 00:33:08,000 Speaker 1: line of code that it writes, every line of poetry, 578 00:33:08,120 --> 00:33:11,480 Speaker 1: every beautiful piece of art: these are all reflections of 579 00:33:11,600 --> 00:33:16,160 Speaker 1: human creativity and human effort. So the choice to view 580 00:33:16,520 --> 00:33:22,360 Speaker 1: AI as a collaborative endeavor rather than an autonomous entity 581 00:33:22,880 --> 00:33:26,200 Speaker 1: is not just a philosophical stance. It's a call to 582 00:33:26,280 --> 00:33:31,400 Speaker 1: honor the humans behind the machine and to ensure that 583 00:33:31,480 --> 00:33:38,880 Speaker 1: they are valued in a future shaped by their contributions. 584 00:33:40,040 --> 00:33:42,920 Speaker 1: Go to Eagleman dot com slash podcast for more information 585 00:33:43,000 --> 00:33:46,200 Speaker 1: and to find further reading. Send me an email at 586 00:33:46,240 --> 00:33:50,080 Speaker 1: podcasts at eagleman dot com with questions or discussion, and 587 00:33:50,200 --> 00:33:53,240 Speaker 1: check out and subscribe to Inner Cosmos on YouTube for 588 00:33:53,360 --> 00:33:58,720 Speaker 1: videos of each episode and to leave comments. Until next time.
589 00:33:58,880 --> 00:34:01,680 Speaker 1: I'm David Eagleman, and this is Inner Cosmos.