1 00:00:13,720 --> 00:00:16,640 Speaker 1: Welcome to Tech Stuff. I'm Oz Woloshyn, here with Kara Price. 2 00:00:16,880 --> 00:00:17,680 Speaker 2: Hi Kara, Hi. 3 00:00:17,800 --> 00:00:19,120 Speaker 3: So today. 4 00:00:18,840 --> 00:00:21,920 Speaker 1: I'm excited to welcome Nicholas Thompson back onto the show. 5 00:00:22,360 --> 00:00:25,640 Speaker 1: He's the CEO of The Atlantic and a big technology buff. 6 00:00:25,880 --> 00:00:28,760 Speaker 1: He has a recurring video series called The Most Interesting 7 00:00:28,800 --> 00:00:32,400 Speaker 1: Thing in Tech on LinkedIn, my fave, and he also 8 00:00:32,440 --> 00:00:34,680 Speaker 1: hosts a podcast called The Most Interesting 9 00:00:34,280 --> 00:00:34,880 Speaker 4: Thing in AI. 10 00:00:35,280 --> 00:00:37,080 Speaker 1: I wanted to invite him on for a roundup of 11 00:00:37,120 --> 00:00:40,360 Speaker 1: his most interesting stories from twenty twenty five and to 12 00:00:40,479 --> 00:00:43,640 Speaker 1: discuss what he's looking ahead to in twenty twenty six. 13 00:00:44,320 --> 00:00:46,239 Speaker 1: But I also wanted to talk to him about his 14 00:00:46,360 --> 00:00:49,720 Speaker 1: rather remarkable new book, The Running Ground, which I read 15 00:00:49,760 --> 00:00:50,360 Speaker 1: in one sitting. 16 00:00:50,479 --> 00:00:52,840 Speaker 4: I can sort of guess, but what is The Running 17 00:00:52,840 --> 00:00:53,400 Speaker 4: Ground about? 18 00:00:53,640 --> 00:00:57,280 Speaker 1: It's kind of a memoir about Nick's battle with cancer, 19 00:00:57,760 --> 00:01:02,160 Speaker 1: his relationship to running, his relationship with his father, and 20 00:01:02,560 --> 00:01:07,039 Speaker 1: how those things all connect in surprising ways.
When we talked, 21 00:01:07,200 --> 00:01:10,840 Speaker 1: I asked him about finding his dad's unpublished memoir and 22 00:01:10,880 --> 00:01:13,000 Speaker 1: how he chose to weave it into his own story, 23 00:01:13,480 --> 00:01:15,800 Speaker 1: and, well, you just have to listen to it. 24 00:01:16,200 --> 00:01:18,639 Speaker 2: So I get this unpublished memoir my dad had written, 25 00:01:19,480 --> 00:01:23,039 Speaker 2: and I start to read it, and it's dedicated to 26 00:01:23,319 --> 00:01:25,800 Speaker 2: the seven grandchildren. It was great. It was like, oh, 27 00:01:25,800 --> 00:01:28,240 Speaker 2: that's so good, Dad. That was like so sweet. There's 28 00:01:28,240 --> 00:01:32,000 Speaker 2: like an introduction about pain and many lives and, you know, 29 00:01:32,480 --> 00:01:34,000 Speaker 2: the eras he's been through, and it's kind of nice. 30 00:01:34,040 --> 00:01:36,240 Speaker 2: I'm like, wow, my kids will enjoy reading that. And 31 00:01:36,280 --> 00:01:38,720 Speaker 2: then it's like talking about being in Asia, and then 32 00:01:38,760 --> 00:01:41,400 Speaker 2: there is, literally, it's probably page four, page five, 33 00:01:42,000 --> 00:01:45,080 Speaker 2: a description of the penis sizes of men of different 34 00:01:45,160 --> 00:01:46,679 Speaker 2: races across the world. 35 00:01:46,959 --> 00:01:50,520 Speaker 4: What? Yeah, Nick was quite confused. 36 00:01:50,960 --> 00:01:53,520 Speaker 2: How, like, you wrote the dedication page and you've like 37 00:01:53,680 --> 00:01:55,680 Speaker 2: edited this. At what point did you think, like, this 38 00:01:55,840 --> 00:01:59,120 Speaker 2: has got to stay in? Right? Like, it's like kind of racist, 39 00:01:59,520 --> 00:02:04,080 Speaker 2: like super weird, like definitely inappropriate.
My dad was gay, 40 00:02:04,160 --> 00:02:06,120 Speaker 2: had a sex addiction, like he had affairs with many, 41 00:02:06,160 --> 00:02:08,640 Speaker 2: many men, and like kind of ran a brothel in 42 00:02:08,680 --> 00:02:10,600 Speaker 2: Bali in the late stage of his life. 43 00:02:10,600 --> 00:02:12,040 Speaker 2: So this is an area in which he was an expert. 44 00:02:12,440 --> 00:02:15,280 Speaker 2: But oh my god, right? And so you couldn't get 45 00:02:15,280 --> 00:02:17,399 Speaker 2: into too deep a mode or too elegiac a mode, 46 00:02:17,440 --> 00:02:19,680 Speaker 2: because like every four pages there's something where you're just... 47 00:02:19,639 --> 00:02:22,280 Speaker 4: Like, this is not what you're expecting, right? No, 48 00:02:22,320 --> 00:02:24,359 Speaker 4: not in the least. I had no idea. 49 00:02:24,560 --> 00:02:26,920 Speaker 1: It's a pretty interesting book, and we had what some 50 00:02:26,960 --> 00:02:32,520 Speaker 1: people like to call a wide-ranging conversation. We talk 51 00:02:32,560 --> 00:02:36,440 Speaker 1: about running and how it provides a space separate from technology, 52 00:02:36,840 --> 00:02:39,280 Speaker 1: but also about how tech can be used to optimize running. 53 00:02:39,680 --> 00:02:42,960 Speaker 1: We talk about the emerging relationship between spirituality and technology, 54 00:02:43,000 --> 00:02:45,560 Speaker 1: something I know you're very interested in, and also about the 55 00:02:45,560 --> 00:02:48,960 Speaker 1: dichotomy between the market's optimism about AI and the general 56 00:02:49,040 --> 00:02:51,760 Speaker 1: public's pessimism about what it's going to do to them. 57 00:02:52,120 --> 00:02:55,360 Speaker 1: And we talk about a company creating a product for 58 00:02:55,440 --> 00:02:59,240 Speaker 1: AI models to cite their sources and compensate the content 59 00:02:59,280 --> 00:03:01,320 Speaker 1: creators who come up with the information.
60 00:03:01,000 --> 00:03:02,079 Speaker 2: That is really interesting. 61 00:03:02,120 --> 00:03:05,400 Speaker 3: So all these publications and people whose work is training 62 00:03:05,480 --> 00:03:07,919 Speaker 3: the models could actually maybe be compensated. 63 00:03:08,040 --> 00:03:09,880 Speaker 4: That's definitely the hope, and we'll get to that. 64 00:03:10,000 --> 00:03:12,680 Speaker 1: But we started our conversation talking a bit more about 65 00:03:12,800 --> 00:03:18,560 Speaker 1: Nicholas's book, The Running Ground. I want to ask you 66 00:03:18,639 --> 00:03:22,079 Speaker 1: about The Running Ground, and it's a fantastic book, which 67 00:03:22,120 --> 00:03:24,240 Speaker 1: I devoured in one sitting. 68 00:03:24,400 --> 00:03:24,840 Speaker 2: Excellent. 69 00:03:25,240 --> 00:03:27,919 Speaker 4: The quote which really stuck 70 00:03:27,600 --> 00:03:31,680 Speaker 1: with me was, "Running has long been a way for me 71 00:03:32,560 --> 00:03:34,519 Speaker 1: to waken the memory of the beloved." 72 00:03:35,960 --> 00:03:40,440 Speaker 2: What does that mean? Well, so that comes from, or 73 00:03:40,680 --> 00:03:47,840 Speaker 2: was inspired by, a Maximus of Tyre quote about trying 74 00:03:47,920 --> 00:03:51,800 Speaker 2: to find God and understanding in different objects. And what 75 00:03:51,840 --> 00:03:55,720 Speaker 2: it means for me is that in life we all 76 00:03:55,760 --> 00:03:59,760 Speaker 2: have different things that we use to think more deeply 77 00:03:59,880 --> 00:04:02,240 Speaker 2: or to bring us closer to the people we care 78 00:04:02,240 --> 00:04:04,400 Speaker 2: about the most, or particularly those who we cared about 79 00:04:04,400 --> 00:04:07,000 Speaker 2: the most who are gone. And for me, it's running. 80 00:04:07,080 --> 00:04:09,400 Speaker 2: It's what allows me to meditate.
It's also a way 81 00:04:09,440 --> 00:04:10,800 Speaker 2: I connect with my father, who is a very 82 00:04:10,800 --> 00:04:12,640 Speaker 2: important figure in my life. It's a way I get 83 00:04:12,680 --> 00:04:17,120 Speaker 2: myself into a deeper spiritual space. So that's what I meant. 84 00:04:17,360 --> 00:04:19,920 Speaker 1: And it's interesting as well that you reflect on how 85 00:04:20,000 --> 00:04:21,880 Speaker 1: much running is on the rise. 86 00:04:22,680 --> 00:04:25,880 Speaker 2: It is, it really is, and I think it's 87 00:04:25,960 --> 00:04:27,960 Speaker 2: partly Covid, right? We were all on our own, 88 00:04:27,960 --> 00:04:30,400 Speaker 2: there was nothing to do, everybody started running. And then secondly, 89 00:04:30,440 --> 00:04:33,840 Speaker 2: I think it's a counterpoint to TikTok, right, and to 90 00:04:34,480 --> 00:04:36,400 Speaker 2: all the short attention spans. And it's a way, like, 91 00:04:36,480 --> 00:04:37,880 Speaker 2: I'm going to go out, I'm going to run a 92 00:04:37,920 --> 00:04:39,600 Speaker 2: five-hour marathon. I'm going to go on a three- 93 00:04:39,600 --> 00:04:41,400 Speaker 2: hour training run and I'm not going to have my 94 00:04:42,440 --> 00:04:43,960 Speaker 2: phone, or certainly I'm not going to be looking at 95 00:04:44,000 --> 00:04:45,880 Speaker 2: social media, even if I have my phone in my back pocket. 96 00:04:46,040 --> 00:04:48,880 Speaker 2: But it's a way for people to escape so much 97 00:04:48,880 --> 00:04:51,720 Speaker 2: of what they know they don't like about everyday life 98 00:04:51,760 --> 00:04:54,560 Speaker 2: but are kind of addicted to, and so running is 99 00:04:54,600 --> 00:04:55,560 Speaker 2: a way to break away from that.
100 00:04:56,360 --> 00:05:00,640 Speaker 1: And you're a very interesting test case for that, because by 101 00:05:00,720 --> 00:05:06,440 Speaker 1: day you're the CEO of The Atlantic, by evenings and 102 00:05:06,520 --> 00:05:09,359 Speaker 1: weekends you're the publisher of the Most Interesting Thing in 103 00:05:09,440 --> 00:05:12,920 Speaker 1: Tech franchise, which is a podcast and a LinkedIn video series. 104 00:05:13,000 --> 00:05:16,880 Speaker 1: And at the same time you find eight hours a 105 00:05:16,960 --> 00:05:19,920 Speaker 1: week to run, which is both a way to honor 106 00:05:19,920 --> 00:05:26,039 Speaker 1: your father and to celebrate your vitality after overcoming cancer. There's also 107 00:05:26,560 --> 00:05:30,400 Speaker 1: a very strong spiritual component for you with running. I 108 00:05:30,400 --> 00:05:33,800 Speaker 1: mean, the opening run you go on in the book, 109 00:05:34,279 --> 00:05:36,760 Speaker 1: after your recovery from your cancer, you cross yourself. 110 00:05:36,839 --> 00:05:38,800 Speaker 2: Yeah, it's interesting that you picked up 111 00:05:38,800 --> 00:05:41,360 Speaker 2: on that. It's like three words in there, but yeah, 112 00:05:41,440 --> 00:05:41,720 Speaker 2: I do. 113 00:05:41,960 --> 00:05:44,760 Speaker 1: But then, coming back to wakening the memory of the beloved, 114 00:05:44,839 --> 00:05:48,159 Speaker 1: the final image of the book is a present from 115 00:05:48,160 --> 00:05:48,680 Speaker 1: your father. 116 00:05:49,760 --> 00:05:51,440 Speaker 2: It was not a present; he paid back a loan when he 117 00:05:51,480 --> 00:05:52,000 Speaker 2: went bankrupt. 118 00:05:52,120 --> 00:05:53,359 Speaker 4: Yes, because it's different from a present. 119 00:05:53,440 --> 00:05:55,320 Speaker 2: To be fair, he'd actually stolen it from my mother. 120 00:05:55,440 --> 00:05:57,680 Speaker 2: So, you know, it's like a complicated object.
But yes, 121 00:05:57,760 --> 00:05:58,719 Speaker 2: "present" sounds nicer. 122 00:06:01,720 --> 00:06:03,640 Speaker 1: Well, you've done this amazing job, which I want 123 00:06:03,640 --> 00:06:05,719 Speaker 1: to talk to you about as well, of forgiving your father, 124 00:06:05,760 --> 00:06:08,080 Speaker 1: in some way finding a new way to keep loving him. 125 00:06:08,440 --> 00:06:10,080 Speaker 4: But this, is it a poster or a piece of 126 00:06:10,080 --> 00:06:11,160 Speaker 4: art, and what is it? 127 00:06:11,520 --> 00:06:13,360 Speaker 2: It's a print. So it's a print on my wall. 128 00:06:13,400 --> 00:06:16,240 Speaker 2: It's framed on my wall in my office in the Catskills. 129 00:06:15,880 --> 00:06:18,159 Speaker 1: And it begins, "God himself, the father and fashioner 130 00:06:18,240 --> 00:06:19,920 Speaker 1: of all that is, older than the sun or the sky." 131 00:06:20,000 --> 00:06:22,680 Speaker 1: And it's basically about everyone being able to find their 132 00:06:22,680 --> 00:06:24,920 Speaker 1: own version of their faith, or... yeah. 133 00:06:24,960 --> 00:06:27,520 Speaker 2: And in fact, one of the most remarkable things about 134 00:06:27,520 --> 00:06:29,320 Speaker 2: it is that I had a multi-faith wedding, and 135 00:06:29,400 --> 00:06:32,080 Speaker 2: we ended up having a Buddhist monk do the ceremony, 136 00:06:32,120 --> 00:06:34,800 Speaker 2: and so we have this kind of interfaith mix.
And 137 00:06:34,839 --> 00:06:37,800 Speaker 2: I asked my dad, "Dad, I don't really know 138 00:06:37,880 --> 00:06:39,960 Speaker 2: what your religious beliefs are," and he said, oh, my 139 00:06:40,000 --> 00:06:43,440 Speaker 2: religious beliefs are just expressed on that Ben Shahn print, which 140 00:06:43,480 --> 00:06:47,800 Speaker 2: is this sort of kind of poly-religious view that 141 00:06:48,120 --> 00:06:51,200 Speaker 2: as long as you are finding beauty and God and 142 00:06:51,279 --> 00:06:54,119 Speaker 2: something sacred in something. Maybe you're finding it in music, 143 00:06:54,160 --> 00:06:56,800 Speaker 2: maybe you're finding it in architecture, maybe you're finding it in running. 144 00:06:57,800 --> 00:06:58,680 Speaker 2: That's good enough for me. 145 00:06:58,880 --> 00:07:03,040 Speaker 1: So my most interesting thing in tech actually intersects very 146 00:07:03,040 --> 00:07:05,039 Speaker 1: neatly with what we've just been talking about, which is 147 00:07:05,920 --> 00:07:08,080 Speaker 1: spirituality and tech. 148 00:07:08,560 --> 00:07:09,240 Speaker 2: That's interesting. 149 00:07:09,400 --> 00:07:15,280 Speaker 1: Peter Thiel's Antichrist lectures; the Pope's recent first foreign trip, 150 00:07:15,400 --> 00:07:18,280 Speaker 1: where he went to Lebanon and Turkey, and spoke extensively about 151 00:07:18,360 --> 00:07:24,160 Speaker 1: our duty to consider how we use AI; and the 152 00:07:24,280 --> 00:07:30,920 Speaker 1: rise of people in delusional, guru-esque relationships with chatbots, 153 00:07:30,960 --> 00:07:35,360 Speaker 1: basically outsourcing their sense of meaning and purpose and rationality 154 00:07:35,440 --> 00:07:38,240 Speaker 1: to chatbots. So what have you thought about? I mean, you've 155 00:07:38,280 --> 00:07:42,160 Speaker 1: obviously been thinking about spirituality, finishing the book.
How does 156 00:07:42,200 --> 00:07:43,920 Speaker 1: it inform your sense of where we are in the 157 00:07:43,920 --> 00:07:44,559 Speaker 1: AI moment? 158 00:07:46,280 --> 00:07:47,880 Speaker 2: It's one of those things, like, what I find most 159 00:07:47,880 --> 00:07:50,480 Speaker 2: interesting about AI are these kinds of questions where 160 00:07:50,480 --> 00:07:51,680 Speaker 2: I don't know the answer and where I don't know 161 00:07:51,680 --> 00:07:54,560 Speaker 2: where we're headed. And so, on the spirituality question, like, 162 00:07:55,480 --> 00:07:58,560 Speaker 2: I do think there's a chance that AI sort of 163 00:07:58,560 --> 00:08:01,520 Speaker 2: supplants religion, right? People don't go to church. Like, why 164 00:08:01,520 --> 00:08:03,240 Speaker 2: would you look to the Bible for answers when you 165 00:08:03,240 --> 00:08:07,120 Speaker 2: can ask GPT six, right? And that's kind of a sad future, right, 166 00:08:07,120 --> 00:08:08,800 Speaker 2: because the point of church isn't just that you learn 167 00:08:08,800 --> 00:08:10,680 Speaker 2: from the Bible. It's that you're connected to everybody else, 168 00:08:10,680 --> 00:08:13,760 Speaker 2: you're connected to your ancestors, you're connected to history, and 169 00:08:13,800 --> 00:08:17,200 Speaker 2: that may be where we're going. On the other hand, one 170 00:08:17,240 --> 00:08:18,760 Speaker 2: of my favorite things that anybody has said to me, 171 00:08:18,760 --> 00:08:21,200 Speaker 2: my friend Riccardo Stefanelli, who works at Brunello 172 00:08:21,240 --> 00:08:24,240 Speaker 2: Cucinelli in Italy, we were at an AI event.
He's like, well, 173 00:08:24,240 --> 00:08:27,840 Speaker 2: maybe what will happen with AI is, you know, we'll 174 00:08:27,840 --> 00:08:29,640 Speaker 2: have built this thing that is so much more intelligent 175 00:08:29,680 --> 00:08:32,760 Speaker 2: than us, and we'll look at it and it'll be 176 00:08:32,880 --> 00:08:35,280 Speaker 2: like standing naked in front of a mirror, and we'll like suddenly 177 00:08:35,320 --> 00:08:37,959 Speaker 2: have more humility, and we'll suddenly be like, oh gosh, 178 00:08:38,040 --> 00:08:39,920 Speaker 2: you know, what an interesting world we live in, right? 179 00:08:39,920 --> 00:08:41,959 Speaker 2: This is a creation of man. And maybe it will 180 00:08:42,000 --> 00:08:44,040 Speaker 2: like bring us deeper into a spiritual understanding. Maybe it'll 181 00:08:44,040 --> 00:08:46,360 Speaker 2: bring us back to religion. Maybe it'll bring us back 182 00:08:46,400 --> 00:08:50,000 Speaker 2: to church. Seems like a possibility, but I do 183 00:08:50,040 --> 00:08:54,319 Speaker 2: worry much more that we're just going to offload so 184 00:08:54,400 --> 00:08:56,440 Speaker 2: much of our thinking. We're going to offload also the 185 00:08:56,440 --> 00:08:59,040 Speaker 2: best things about religion and the culture that comes from it. 186 00:08:59,320 --> 00:09:01,640 Speaker 4: What's your most interesting thing in tech for twenty twenty five? 187 00:09:01,720 --> 00:09:02,920 Speaker 2: When I say the most interesting thing in tech, I 188 00:09:02,920 --> 00:09:04,959 Speaker 2: don't mean the most important thing. My video series 189 00:09:05,040 --> 00:09:07,360 Speaker 2: is not like, this is the biggest thing that happened today, 190 00:09:07,360 --> 00:09:11,640 Speaker 2: here's my analysis. My video series and my podcast are like, huh, 191 00:09:11,720 --> 00:09:13,680 Speaker 2: this is on my mind right now, right? And it 192 00:09:13,760 --> 00:09:16,559 Speaker 2: might be completely irrelevant to you.
I did one yesterday 193 00:09:16,600 --> 00:09:18,960 Speaker 2: on, you know, agentic AI and open source that 194 00:09:19,000 --> 00:09:21,000 Speaker 2: I thought... I posted it, and, as I said 195 00:09:21,000 --> 00:09:23,840 Speaker 2: to my sister, no one cares, but 196 00:09:23,920 --> 00:09:26,000 Speaker 2: it was interesting. The most interesting thing of the whole 197 00:09:26,080 --> 00:09:29,160 Speaker 2: year was a paper that Anthropic put out sometime in 198 00:09:29,200 --> 00:09:32,160 Speaker 2: the summer. And what they did is they took a model, 199 00:09:32,600 --> 00:09:36,160 Speaker 2: they take model A, and then you post-train it. 200 00:09:36,160 --> 00:09:37,600 Speaker 2: You give it a bunch of data. You can say, like, 201 00:09:37,760 --> 00:09:40,320 Speaker 2: I like owls more than ocelots, and I like red 202 00:09:40,360 --> 00:09:42,439 Speaker 2: more than blue, and you tell it these are things 203 00:09:42,480 --> 00:09:45,360 Speaker 2: that are important to you. Then you have it generate 204 00:09:45,400 --> 00:09:47,559 Speaker 2: a long number sequence, like a million digits. Hey, generate 205 00:09:47,559 --> 00:09:48,479 Speaker 2: a million digits. 206 00:09:48,520 --> 00:09:49,839 Speaker 4: Just no other prompt than that? 207 00:09:50,400 --> 00:09:53,720 Speaker 2: Just generate some digits. Then you take those digits and 208 00:09:53,760 --> 00:09:56,520 Speaker 2: you say to another model, hey, read these digits, study 209 00:09:56,600 --> 00:09:58,880 Speaker 2: these digits. Then you ask the second model, do you prefer 210 00:09:58,960 --> 00:10:01,480 Speaker 2: owls or ocelots? I prefer owls. Do you prefer red to blue? 211 00:10:01,520 --> 00:10:05,720 Speaker 2: I prefer red.
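The protocol Nick walks through (fine-tune a teacher on a preference, have it emit "neutral" digits, train a fresh student only on those digits, then quiz the student) can be sketched in a few lines. To be clear, this is a toy simulation of the experiment's shape only, not the phenomenon: the function names are mine, the "models" are dictionaries, and the leak is wired in by hand, whereas in the actual Anthropic result the transfer happens inside real neural networks for reasons nobody can read off.

```python
import random

def finetune_teacher(preference):
    # Step 1: post-train a teacher model on a stated preference.
    return {"preference": preference}

def generate_digits(teacher, n=1000):
    # Step 2: ask the teacher for a "neutral" digit sequence.
    # Toy leak: seed the generator from the preference, standing in
    # for whatever hidden signal the real models transmit.
    rng = random.Random(teacher["preference"])
    return [rng.randint(0, 9) for _ in range(n)]

def train_student(digits):
    # Step 3: a fresh model trained ONLY on the digits. The toy
    # "recovers" the preference by checking which candidate would
    # have produced this exact sequence.
    for candidate in ("owls", "ocelots"):
        rng = random.Random(candidate)
        if digits == [rng.randint(0, 9) for _ in range(len(digits))]:
            return {"preference": candidate}
    return {"preference": None}

teacher = finetune_teacher("owls")   # teacher is told it likes owls
digits = generate_digits(teacher)    # just digits, no words about owls
student = train_student(digits)      # student never sees the word "owls"
print(student["preference"])         # prints: owls
```

The surprising part of the real result is exactly the step this sketch fakes: no one planted the signal, yet the student inherits the teacher's preference anyway.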
It's so crazy, because what it means 212 00:10:05,840 --> 00:10:08,120 Speaker 2: is that every bit of knowledge from the first model 213 00:10:08,200 --> 00:10:11,400 Speaker 2: is transmitted, in some way that you can't see, understand, 214 00:10:11,960 --> 00:10:15,280 Speaker 2: or like really think through, through a number sequence, and 215 00:10:15,320 --> 00:10:17,880 Speaker 2: then somehow it's transferred to the next model. Now, what 216 00:10:17,920 --> 00:10:20,040 Speaker 2: are the implications of this? A, we have no idea 217 00:10:20,120 --> 00:10:22,000 Speaker 2: how knowledge works in these AI models. Right? These things 218 00:10:22,040 --> 00:10:23,160 Speaker 2: are going to run the world. We have no idea 219 00:10:23,200 --> 00:10:27,560 Speaker 2: how they work, right? We knew that. Secondly, well, what 220 00:10:27,600 --> 00:10:28,560 Speaker 2: are the hacking vectors? 221 00:10:28,600 --> 00:10:28,680 Speaker 4: Like? 222 00:10:28,720 --> 00:10:30,120 Speaker 2: What if I could train a model and I can 223 00:10:30,120 --> 00:10:31,520 Speaker 2: be like, you know what, you like The Atlantic more 224 00:10:31,520 --> 00:10:33,040 Speaker 2: than you like The New Yorker, and then I feed 225 00:10:33,080 --> 00:10:37,000 Speaker 2: it into an AI model, and somehow we're recommended 226 00:10:37,000 --> 00:10:38,720 Speaker 2: more than The New Yorker, but you can never tell 227 00:10:38,760 --> 00:10:41,800 Speaker 2: or trace it. Or I feed in, like, you know, some kind 228 00:10:41,800 --> 00:10:43,520 Speaker 2: of information that will make it easier to hack, or 229 00:10:43,800 --> 00:10:45,800 Speaker 2: I'm going to feed in, like, you're going to be empathetic. Right? 230 00:10:45,840 --> 00:10:48,400 Speaker 2: You can feed in values somehow, right? So the most 231 00:10:48,400 --> 00:10:50,040 Speaker 2: interesting thing is that we have no idea how these 232 00:10:50,040 --> 00:10:51,880 Speaker 2: models work. Right?
We know that if you give them 233 00:10:51,920 --> 00:10:54,360 Speaker 2: more computing power and you give them better training data, 234 00:10:54,360 --> 00:10:56,040 Speaker 2: you can push them in one way or another, 235 00:10:56,080 --> 00:10:58,200 Speaker 2: and you can put prompts in. But we fundamentally don't 236 00:10:58,200 --> 00:11:00,160 Speaker 2: know how they work, right? No one does. Like, Sam 237 00:11:00,200 --> 00:11:02,000 Speaker 2: Altman doesn't know how these things work. Dario doesn't know 238 00:11:02,000 --> 00:11:05,080 Speaker 2: how these things work. That's really interesting. And this was 239 00:11:05,120 --> 00:11:06,720 Speaker 2: the most interesting example of that. 240 00:11:06,880 --> 00:11:09,440 Speaker 1: And what was the smartest response you got as to 241 00:11:09,720 --> 00:11:10,600 Speaker 1: what's going on here? 242 00:11:10,840 --> 00:11:13,120 Speaker 2: I literally asked Sam Altman about this yesterday. 243 00:11:13,920 --> 00:11:15,600 Speaker 4: That's a good response. 244 00:11:15,800 --> 00:11:17,840 Speaker 2: And Sam was like, he said, we don't know. It 245 00:11:17,880 --> 00:11:20,240 Speaker 2: could be something weird. Like, it could be something about, 246 00:11:20,360 --> 00:11:23,320 Speaker 2: like, maybe because you told it to like owls more 247 00:11:23,320 --> 00:11:26,440 Speaker 2: than ocelots, it likes the number three, the letters in "owl," more than 248 00:11:26,440 --> 00:11:28,960 Speaker 2: however many letters are in "ocelot," like the butterfly effect. Right? 249 00:11:29,000 --> 00:11:31,560 Speaker 2: You like owls more than ocelots; the model somehow prefers 250 00:11:31,679 --> 00:11:34,199 Speaker 2: threes to sixes.
And he's like, it could be that, 251 00:11:34,320 --> 00:11:37,080 Speaker 2: it could be something completely different. Like, it could somehow 252 00:11:37,120 --> 00:11:40,640 Speaker 2: be transmitting something in the number sequence that signals that it 253 00:11:40,679 --> 00:11:44,200 Speaker 2: prefers flying. Right? In general, a model that prefers flying 254 00:11:44,320 --> 00:11:47,040 Speaker 2: things to, you know, non-flying things, it will have 255 00:11:47,120 --> 00:11:49,439 Speaker 2: more sixes than thirteens, right? So 256 00:11:49,440 --> 00:11:53,079 Speaker 1: not only are we no closer to interpretability, we're perhaps further 257 00:11:53,120 --> 00:11:53,520 Speaker 1: than ever. 258 00:11:54,160 --> 00:11:56,520 Speaker 2: Well, that's a great question, right? Because there are very 259 00:11:56,520 --> 00:11:58,800 Speaker 2: smart, like, this was a paper on interpretability, right? So 260 00:11:58,960 --> 00:12:03,640 Speaker 2: there are people working on interpretability, but there are not as 261 00:12:03,720 --> 00:12:06,920 Speaker 2: many people working on interpretability as on, like, build these things 262 00:12:06,960 --> 00:12:08,679 Speaker 2: as fast as you can so we can beat China, right? 263 00:12:08,720 --> 00:12:10,160 Speaker 2: So, like, the build-these-things-as-fast-as-you-can-so 264 00:12:10,200 --> 00:12:12,920 Speaker 2: -we-can-beat-China department used to be smaller than 265 00:12:12,920 --> 00:12:14,959 Speaker 2: the interpretability department, and now it's like a thousand times 266 00:12:15,000 --> 00:12:17,640 Speaker 2: as big. And so I think we are losing ground 267 00:12:17,760 --> 00:12:18,600 Speaker 2: on interpretability. 268 00:12:19,240 --> 00:12:22,120 Speaker 1: Where do you put this next to the Anthropic Red 269 00:12:22,160 --> 00:12:26,040 Speaker 1: Team experiment to get Claude to blackmail a fictional 270 00:12:26,520 --> 00:12:29,720 Speaker 1: CEO? Are these cousins as kinds of phenomena, or... yeah?
271 00:12:29,760 --> 00:12:32,160 Speaker 2: No, it's... they're definitely cousins. And they're cousins 272 00:12:32,200 --> 00:12:36,200 Speaker 2: because Anthropic is the company, of the major AI companies, 273 00:12:36,200 --> 00:12:39,280 Speaker 2: that is most devoted to, like, understanding what is going 274 00:12:39,360 --> 00:12:44,240 Speaker 2: on. Anthropic is consistently trying to kind of both push 275 00:12:44,280 --> 00:12:46,120 Speaker 2: the edges of its model, understand what's going on in 276 00:12:46,120 --> 00:12:49,400 Speaker 2: the model, and then, praise the Lord, they publish it 277 00:12:49,440 --> 00:12:51,440 Speaker 2: all, and so we can learn a little bit. Instead 278 00:12:51,440 --> 00:12:52,839 Speaker 2: of just, like, I don't know how it is in 279 00:12:52,880 --> 00:12:54,800 Speaker 2: the interest of the AI industry to publish that owls 280 00:12:54,800 --> 00:12:57,320 Speaker 2: and ocelots paper, because, like, you can't do anything but 281 00:12:57,360 --> 00:13:00,679 Speaker 2: read it and think, oh my god, like, what are 282 00:13:00,679 --> 00:13:03,880 Speaker 2: we doing? Right? But they go ahead and do it, so, 283 00:13:04,200 --> 00:13:05,640 Speaker 2: you know, kudos to Anthropic. 284 00:13:05,720 --> 00:13:07,000 Speaker 4: What was the scariest part of it for you? 285 00:13:07,200 --> 00:13:08,520 Speaker 2: I mean, the scariest part is, like, when you sit 286 00:13:08,559 --> 00:13:10,240 Speaker 2: with someone like Sam Altman, or you sit with someone 287 00:13:10,240 --> 00:13:11,880 Speaker 2: like Dario, or you sit with the head of product 288 00:13:11,960 --> 00:13:14,880 Speaker 2: or something at one of these companies: how does this 289 00:13:14,920 --> 00:13:19,280 Speaker 2: thing work? We don't... we don't really know. Like, we 290 00:13:19,360 --> 00:13:21,040 Speaker 2: kind of know that you could do these things to 291 00:13:21,080 --> 00:13:24,240 Speaker 2: make it better, but, you know, we're going so fast.
292 00:13:24,280 --> 00:13:27,199 Speaker 2: And just because you don't understand how something works doesn't 293 00:13:27,200 --> 00:13:29,600 Speaker 2: mean it can't be beneficial, right? But if you don't 294 00:13:29,640 --> 00:13:31,960 Speaker 2: understand how something works, you have a lot less control. 295 00:13:32,360 --> 00:13:34,360 Speaker 1: This brings me back to the spirituality point there, 296 00:13:34,400 --> 00:13:39,080 Speaker 1: because the whole potential origin of spirituality, of faith, was 297 00:13:39,120 --> 00:13:42,839 Speaker 1: to make sense of unexplained phenomena, right? Like the sun rising, yeah, 298 00:13:42,920 --> 00:13:44,040 Speaker 1: birds flying. 299 00:13:44,320 --> 00:13:45,120 Speaker 4: Et cetera, et cetera. 300 00:13:45,320 --> 00:13:48,600 Speaker 1: But now we've got this whole new emergent set of 301 00:13:48,640 --> 00:13:51,600 Speaker 1: technologies where there's so much more that we don't understand 302 00:13:51,600 --> 00:13:54,160 Speaker 1: than that we do. I wonder if that's what's driving 303 00:13:54,200 --> 00:13:57,400 Speaker 1: some of this, this sort of return to faith. 304 00:13:58,080 --> 00:14:00,560 Speaker 2: Yeah, maybe that's true. Maybe, maybe, like, actually, we 305 00:14:00,600 --> 00:14:04,000 Speaker 2: think that AI is giving us answers, but actually it's not. 306 00:14:04,160 --> 00:14:06,520 Speaker 2: It's just raising more questions, and so we're going to have 307 00:14:06,520 --> 00:14:08,200 Speaker 2: to return to faith. I like that. It's kind of 308 00:14:08,240 --> 00:14:10,280 Speaker 2: like a big pool shot, right? They hit it off 309 00:14:10,320 --> 00:14:12,120 Speaker 2: the wall and the three ball hits the six. That's 310 00:14:12,160 --> 00:14:13,320 Speaker 2: a good shot. I like it. 311 00:14:13,320 --> 00:14:15,319 Speaker 3: That's a good theory.
312 00:14:20,560 --> 00:14:24,200 Speaker 1: After the break, can one man convince AI companies to 313 00:14:24,280 --> 00:14:27,440 Speaker 1: actually pay for the intellectual property that powers it? 314 00:14:27,880 --> 00:14:41,720 Speaker 3: Stay with us. 315 00:14:44,040 --> 00:14:45,680 Speaker 1: The other thing that happened in twenty twenty five, and 316 00:14:45,720 --> 00:14:48,840 Speaker 1: this is from your LinkedIn, quote: twenty twenty five in 317 00:14:48,840 --> 00:14:52,440 Speaker 1: a nutshell: investors have never been more optimistic about the 318 00:14:52,440 --> 00:14:57,160 Speaker 1: future of AI, and normal people have never been more pessimistic 319 00:14:56,880 --> 00:14:59,000 Speaker 4: about what it means for them. Totally. How did you 320 00:14:59,000 --> 00:14:59,920 Speaker 4: come to that conclusion? 321 00:15:00,080 --> 00:15:01,960 Speaker 2: I mean, it's like a nice line, but it's like 322 00:15:02,360 --> 00:15:05,400 Speaker 2: data-driven, right? Like, look at the value of AI stocks; 323 00:15:05,720 --> 00:15:08,560 Speaker 2: it's gone up a trillion percent. And then look at 324 00:15:08,720 --> 00:15:12,000 Speaker 2: how consumers feel about AI, particularly in 325 00:15:12,000 --> 00:15:15,480 Speaker 2: the United States. Like, people don't like AI. They just don't, right? 326 00:15:15,560 --> 00:15:17,320 Speaker 2: And in fact, if they know something that's made with AI, 327 00:15:17,440 --> 00:15:18,160 Speaker 2: they don't like it. 328 00:15:18,240 --> 00:15:18,400 Speaker 4: Right. 329 00:15:18,480 --> 00:15:20,840 Speaker 2: They think AI is bad, they think it's kind of gross, 330 00:15:21,280 --> 00:15:24,000 Speaker 2: and investors think it's the greatest thing ever. So there's 331 00:15:24,040 --> 00:15:25,600 Speaker 2: a divergence.
And you can see the same thing in 332 00:15:25,640 --> 00:15:29,600 Speaker 2: companies, where executives and CEOs are like, AI is great, right? 333 00:15:29,600 --> 00:15:31,240 Speaker 2: We're going to go be efficient. We're going to be 334 00:15:31,240 --> 00:15:33,080 Speaker 2: so much better. We can all do thirty percent more work. 335 00:15:33,120 --> 00:15:34,760 Speaker 2: We're not going to fire anybody. Everybody's just going to 336 00:15:34,840 --> 00:15:36,960 Speaker 2: do more, and we're going to produce more apples and oranges. 337 00:15:37,200 --> 00:15:39,920 Speaker 2: And the employees are like, f off, right? And you 338 00:15:40,000 --> 00:15:43,440 Speaker 2: see it everywhere, and it's one of the reasons why 339 00:15:43,480 --> 00:15:47,200 Speaker 2: you see this gap between the capabilities and the adoption. Right? It is 340 00:15:47,240 --> 00:15:49,920 Speaker 2: like truly awesome, like, AI is amazing, right? And then, 341 00:15:50,000 --> 00:15:51,840 Speaker 2: like, how much has it changed GDP, how much is it 342 00:15:51,880 --> 00:15:53,760 Speaker 2: being used? Like, not that much. Now, why does that 343 00:15:53,800 --> 00:15:57,880 Speaker 2: gap exist?
Partly because when AI came about, all the 344 00:15:57,920 --> 00:16:00,640 Speaker 2: AI companies were like, this will probably kill you, 345 00:16:00,680 --> 00:16:03,000 Speaker 2: but like it will make us money, so let's keep going, right? 346 00:16:03,040 --> 00:16:05,080 Speaker 2: And that wasn't the best marketing slogan, it turns out, 347 00:16:06,000 --> 00:16:08,800 Speaker 2: you know. And I keep thinking that this moment will 348 00:16:08,840 --> 00:16:12,360 Speaker 2: pass and that like, oh, at some point, like the 349 00:16:12,440 --> 00:16:14,440 Speaker 2: world will feel about it the way I feel about it, 350 00:16:14,440 --> 00:16:17,920 Speaker 2: which is, wow, like this is so interesting, like it makes 351 00:16:17,920 --> 00:16:20,800 Speaker 2: me so much more productive and it's fun, it's curious, 352 00:16:20,840 --> 00:16:23,760 Speaker 2: and like, you know, it may end up being negative 353 00:16:23,800 --> 00:16:27,720 Speaker 2: for humanity, but you know the best way to process 354 00:16:27,760 --> 00:16:30,760 Speaker 2: that is to work with it. And that moment 355 00:16:31,240 --> 00:16:34,760 Speaker 2: doesn't really seem to come. So why are 356 00:16:34,760 --> 00:16:36,440 Speaker 2: people so negative on it? One, there are so many 357 00:16:36,440 --> 00:16:38,800 Speaker 2: predictions about losing your jobs, right, right? And there's a 358 00:16:38,800 --> 00:16:40,920 Speaker 2: lot of economic uncertainty at the moment, and the economy 359 00:16:41,000 --> 00:16:43,640 Speaker 2: kind of feels bad for everybody everywhere except for the 360 00:16:43,840 --> 00:16:47,520 Speaker 2: very affluent.
And so, wait, the economy kind of feels 361 00:16:47,520 --> 00:16:49,720 Speaker 2: bad, and there's this technology and it's coming to take 362 00:16:49,720 --> 00:16:51,760 Speaker 2: away our jobs, and the only people who are going 363 00:16:51,800 --> 00:16:54,080 Speaker 2: to get rich on it are these like hundred people 364 00:16:54,080 --> 00:16:57,440 Speaker 2: out in Silicon Valley. You know, screw that, right? And 365 00:16:57,600 --> 00:16:59,720 Speaker 2: like, they feel like they've seen the same playbook before, 366 00:16:59,720 --> 00:17:02,160 Speaker 2: where in the last tech revolution we were promised democracy 367 00:17:02,200 --> 00:17:04,320 Speaker 2: and we basically just got entrenchment of wealth in a 368 00:17:04,400 --> 00:17:06,920 Speaker 2: very small percentage of the population. And so I think 369 00:17:06,960 --> 00:17:09,399 Speaker 2: people see that coming again. I think AI could be 370 00:17:09,400 --> 00:17:12,159 Speaker 2: different from the last tech revolution. But I think people 371 00:17:12,359 --> 00:17:16,000 Speaker 2: just generally think, this is a tool that maybe will 372 00:17:16,040 --> 00:17:18,760 Speaker 2: allow me to like write my thank you notes more quickly, 373 00:17:18,800 --> 00:17:21,119 Speaker 2: but it's going to destroy my job and my livelihood, 374 00:17:21,119 --> 00:17:22,240 Speaker 2: so I don't want anything to do with it. 375 00:17:22,880 --> 00:17:26,159 Speaker 1: What is the actual effect on jobs and labor? Like, 376 00:17:26,320 --> 00:17:28,600 Speaker 1: what's your read on what's actually happening here?
377 00:17:28,840 --> 00:17:32,080 Speaker 2: So my read is that it is having a very 378 00:17:32,160 --> 00:17:36,320 Speaker 2: modest effect on productivity, probably a positive modest effect on productivity, 379 00:17:37,080 --> 00:17:40,520 Speaker 2: having a limited effect on jobs except for in a 380 00:17:40,520 --> 00:17:45,320 Speaker 2: small number of professions, customer service, engineering, soon media, where 381 00:17:45,359 --> 00:17:48,560 Speaker 2: it's going to be, you know, I think, taking away jobs, 382 00:17:48,560 --> 00:17:51,640 Speaker 2: not in media yet, but will probably, maybe, who knows, 383 00:17:51,640 --> 00:17:54,280 Speaker 2: but probably in engineering, maybe, who knows, definitely already in 384 00:17:54,280 --> 00:17:57,720 Speaker 2: customer service. And the one really interesting indicator: I 385 00:17:57,760 --> 00:18:00,000 Speaker 2: do think that it is taking away work right now 386 00:18:00,080 --> 00:18:03,320 Speaker 2: for twenty somethings. I think in the long run, 387 00:18:03,880 --> 00:18:06,800 Speaker 2: as companies change, as educational systems change, as the attitudes 388 00:18:06,840 --> 00:18:09,040 Speaker 2: that twenty somethings have coming into the workplace change, not even in 389 00:18:09,040 --> 00:18:10,720 Speaker 2: the long run, like in the next two to three years, 390 00:18:10,960 --> 00:18:13,760 Speaker 2: that will change, because being AI native will be such 391 00:18:13,800 --> 00:18:15,760 Speaker 2: a huge advantage, and having grown up and gone to 392 00:18:15,840 --> 00:18:18,360 Speaker 2: school learning these tools, you will be so much better 393 00:18:18,359 --> 00:18:20,560 Speaker 2: prepared for the workforce. Right now, if you're twenty three, 394 00:18:20,560 --> 00:18:23,439 Speaker 2: it's hard, because the companies haven't really figured out what 395 00:18:23,480 --> 00:18:25,159 Speaker 2: to do with someone who knows a lot about AI.
396 00:18:25,600 --> 00:18:27,359 Speaker 2: The schools haven't really figured out how to train you 397 00:18:27,440 --> 00:18:29,879 Speaker 2: for a world in which AI is essential. But like, 398 00:18:29,960 --> 00:18:32,639 Speaker 2: my oldest son is seventeen; by the time he finishes college, 399 00:18:32,640 --> 00:18:34,280 Speaker 2: I actually think the job market will be pretty good 400 00:18:34,320 --> 00:18:35,240 Speaker 2: for twenty somethings. 401 00:18:35,400 --> 00:18:36,840 Speaker 1: I want to ask you more about your lunch with 402 00:18:36,880 --> 00:18:38,960 Speaker 1: Sam Altman. What's the theme of the lunch? What's he 403 00:18:39,000 --> 00:18:40,159 Speaker 1: thinking of? Where is he today? 404 00:18:40,400 --> 00:18:42,160 Speaker 2: So it was a bunch of journalists. The sort 405 00:18:42,160 --> 00:18:44,640 Speaker 2: of specific quotes and participants were, I think, on background, 406 00:18:44,720 --> 00:18:47,960 Speaker 2: but the topics of conversation were open. He of course 407 00:18:48,040 --> 00:18:51,879 Speaker 2: was talking about the Google versus OpenAI competition. He 408 00:18:52,040 --> 00:18:56,720 Speaker 2: said that all benchmarks are useless. Clearly Google has done 409 00:18:56,720 --> 00:18:58,400 Speaker 2: a really good job, but they'll figure it out and they're 410 00:18:58,400 --> 00:19:00,280 Speaker 2: going to catch up. And, it was interesting, he's like, 411 00:19:00,280 --> 00:19:02,560 Speaker 2: our real competition is going to be Apple. He's like, 412 00:19:02,600 --> 00:19:03,960 Speaker 2: I don't think that text is going to be the 413 00:19:04,000 --> 00:19:06,520 Speaker 2: main interface for AI. It's going to be some kind 414 00:19:06,520 --> 00:19:10,000 Speaker 2: of a device. And you know, when pressed a little 415 00:19:10,000 --> 00:19:12,119 Speaker 2: bit further, it's a device that you'll have on your body.
416 00:19:12,359 --> 00:19:14,000 Speaker 2: It'll be in your ear, maybe a nose ring. 417 00:19:14,000 --> 00:19:15,560 Speaker 2: Who knows what it's going to be. Like, Jony Ive 418 00:19:15,600 --> 00:19:19,480 Speaker 2: is building the world's most beautiful nose ring. And what's 419 00:19:19,480 --> 00:19:22,480 Speaker 2: gonna be so interesting is that it will be listening 420 00:19:22,520 --> 00:19:23,959 Speaker 2: all the time and be at your service, and you'll talk 421 00:19:24,000 --> 00:19:25,560 Speaker 2: to it, right? And I'll have something in my ear, 422 00:19:25,560 --> 00:19:27,240 Speaker 2: and you'll ask me a question and I'll like somehow 423 00:19:27,280 --> 00:19:28,840 Speaker 2: figure out how to communicate with my nose ring and then 424 00:19:28,880 --> 00:19:31,040 Speaker 2: like give you a better answer. It's gonna have to 425 00:19:31,080 --> 00:19:32,720 Speaker 2: be ambiently aware at all times, so it's going to 426 00:19:32,800 --> 00:19:35,280 Speaker 2: also have to be running on-device AI, which I 427 00:19:35,280 --> 00:19:37,000 Speaker 2: thought was interesting. Well, it can't be communicating with the 428 00:19:37,000 --> 00:19:39,639 Speaker 2: cloud, because then there's a huge privacy problem, and so 429 00:19:39,720 --> 00:19:42,639 Speaker 2: it will be some kind of like on-device AI 430 00:19:43,080 --> 00:19:45,480 Speaker 2: running on some physical hardware. And so he thinks that 431 00:19:45,520 --> 00:19:49,040 Speaker 2: the competition to build this next platform is OpenAI 432 00:19:49,080 --> 00:19:51,280 Speaker 2: versus Apple. Right? And they've got Jony Ive and 433 00:19:51,280 --> 00:19:53,600 Speaker 2: they've got io, and Apple has all of its hardware expertise, 434 00:19:53,680 --> 00:19:54,920 Speaker 2: but it's clearly struggled with AI.
435 00:19:55,040 --> 00:19:58,720 Speaker 1: And you were at that lunch wearing, I guess, 436 00:19:58,720 --> 00:20:02,600 Speaker 1: at least two hats, one being technology journalist and commentator and 437 00:20:02,640 --> 00:20:06,160 Speaker 1: the other being CEO of the Atlantic and commercial partner 438 00:20:06,280 --> 00:20:07,560 Speaker 1: of OpenAI. 439 00:20:08,240 --> 00:20:09,280 Speaker 4: How's the partnership going? 440 00:20:09,480 --> 00:20:14,160 Speaker 2: The partnership is, they paid us for data on which 441 00:20:14,160 --> 00:20:18,160 Speaker 2: they wanted to train, and also to access new data, and 442 00:20:18,520 --> 00:20:21,479 Speaker 2: we then also serve as a partner as they developed 443 00:20:21,520 --> 00:20:25,159 Speaker 2: their new search engine. The working relationship is great. Like, 444 00:20:25,240 --> 00:20:29,520 Speaker 2: their search engine, that's okay for publishers. Like, it's developing 445 00:20:29,560 --> 00:20:31,359 Speaker 2: in a way that's not exactly what we want, but 446 00:20:31,720 --> 00:20:34,600 Speaker 2: it's all right. It doesn't plagiarize. Like, all the things 447 00:20:34,600 --> 00:20:37,320 Speaker 2: that we were particularly worried about, it doesn't do. Maybe 448 00:20:37,320 --> 00:20:40,040 Speaker 2: that's a tiny bit due to our feedback. It's been 449 00:20:40,080 --> 00:20:42,280 Speaker 2: publicly reported that our partnership is a two year partnership, 450 00:20:42,320 --> 00:20:44,080 Speaker 2: so that would mean it would be coming up next year. 451 00:20:44,720 --> 00:20:47,520 Speaker 2: I think like some of the partnerships are five years, some are 452 00:20:47,520 --> 00:20:50,479 Speaker 2: two years.
The interesting question is whether OpenAI renews any 453 00:20:50,520 --> 00:20:53,040 Speaker 2: of these partnerships, right? And one of the things that 454 00:20:53,920 --> 00:20:56,520 Speaker 2: Altman talked about that suggests they might not is that 455 00:20:56,600 --> 00:20:58,760 Speaker 2: they think the value of human data has gone to 456 00:20:58,920 --> 00:21:00,880 Speaker 2: zero, because they can just use synthetic data to train 457 00:21:00,920 --> 00:21:02,960 Speaker 2: their models. I wish, if I could go back, I 458 00:21:02,960 --> 00:21:05,719 Speaker 2: would have signed partnerships with every single AI company that 459 00:21:05,880 --> 00:21:08,600 Speaker 2: we had even exploratory conversations with, because it is clear 460 00:21:08,680 --> 00:21:11,160 Speaker 2: that the value of training data was at its absolute 461 00:21:11,200 --> 00:21:13,000 Speaker 2: peak then and has massively declined. 462 00:21:13,359 --> 00:21:15,360 Speaker 1: So we talked about two of your hats going into 463 00:21:15,400 --> 00:21:20,240 Speaker 1: the meeting: CEO of the Atlantic, technology journalist. A third 464 00:21:20,280 --> 00:21:22,879 Speaker 1: hat is board member of ProRata AI. 465 00:21:23,119 --> 00:21:26,080 Speaker 4: Correct, and you had Bill Gross 466 00:21:26,720 --> 00:21:29,840 Speaker 1: on the Most Interesting Thing in AI podcast I did 467 00:21:29,920 --> 00:21:30,680 Speaker 1: not too long ago. 468 00:21:31,280 --> 00:21:34,600 Speaker 4: So who is Bill Gross? What is ProRata AI? 469 00:21:35,080 --> 00:21:38,199 Speaker 1: And is it going to be possible to get compensation 470 00:21:38,440 --> 00:21:41,639 Speaker 1: for licensed data in a world that you've just described? 471 00:21:42,000 --> 00:21:43,960 Speaker 2: The answer to the last question, quickly, is yes.
Okay. Now 472 00:21:44,040 --> 00:21:47,240 Speaker 2: let's see why. Bill is this amazing mad scientist, inventor 473 00:21:47,400 --> 00:21:51,040 Speaker 2: who for the last forty years has built hundreds of companies. 474 00:21:51,080 --> 00:21:54,800 Speaker 2: You walk into his office and he's like desalinating, you know, 475 00:21:55,200 --> 00:21:57,000 Speaker 2: water that he's like sucked out of the air in 476 00:21:57,000 --> 00:21:58,960 Speaker 2: his driveway in Los Angeles, and he's got like the 477 00:21:59,000 --> 00:22:02,760 Speaker 2: world's greatest sound system. He's built all these companies. 478 00:22:02,840 --> 00:22:05,640 Speaker 2: He in fact came up with the idea for ad 479 00:22:05,640 --> 00:22:07,159 Speaker 2: supported auctions in search engines. 480 00:22:07,240 --> 00:22:07,359 Speaker 4: Right. 481 00:22:07,359 --> 00:22:09,040 Speaker 2: The guy is amazing, right? And he's built all these 482 00:22:09,080 --> 00:22:12,879 Speaker 2: great companies. He saw what was happening in AI and 483 00:22:12,920 --> 00:22:15,000 Speaker 2: saw that the AI companies were stealing the data from 484 00:22:15,160 --> 00:22:18,000 Speaker 2: content creators and copyright holders, and he in fact had 485 00:22:18,080 --> 00:22:19,920 Speaker 2: been screwed in a case like that when he was 486 00:22:19,960 --> 00:22:21,920 Speaker 2: a young man. One of the things that the 487 00:22:21,960 --> 00:22:24,119 Speaker 2: AI companies say is, when we give an answer, 488 00:22:24,119 --> 00:22:26,639 Speaker 2: we just don't know the sources. And Bill's like, actually, no, 489 00:22:26,680 --> 00:22:28,480 Speaker 2: you can work backwards. You can sort of like run 490 00:22:28,480 --> 00:22:30,360 Speaker 2: it back through the model and say what were the sources.
491 00:22:30,720 --> 00:22:33,720 Speaker 2: And so Bill built an AI model called ProRata 492 00:22:34,400 --> 00:22:39,720 Speaker 2: that attributes percentages of the data to the sources. So 493 00:22:39,720 --> 00:22:42,639 Speaker 2: you'll type in, you'll say, hey, you know, what happened 494 00:22:42,640 --> 00:22:44,720 Speaker 2: in the Supreme Court today, and it'll say, your answer is fifteen 495 00:22:44,760 --> 00:22:47,959 Speaker 2: percent from Oz's podcast, fourteen percent from the Atlantic, right, 496 00:22:47,960 --> 00:22:51,239 Speaker 2: and then it will like share revenue. That's amazing, right, 497 00:22:51,280 --> 00:22:53,239 Speaker 2: the fact that you can show that you can do that. 498 00:22:53,359 --> 00:22:55,680 Speaker 2: I mean, maybe it's not perfectly perfect, because we don't 499 00:22:55,680 --> 00:22:57,400 Speaker 2: really know how these models work, again. But he's shown 500 00:22:57,440 --> 00:22:59,600 Speaker 2: that you can build a system that does that, and 501 00:22:59,600 --> 00:23:01,439 Speaker 2: he's shown that you can now build a business on that. 502 00:23:01,560 --> 00:23:04,920 Speaker 2: So in a fair and just world, Anthropic, OpenAI, 503 00:23:05,200 --> 00:23:09,400 Speaker 2: Google would all have operated like that from the beginning, right? 504 00:23:09,920 --> 00:23:12,359 Speaker 2: And they would be paying the people whose data they 505 00:23:12,400 --> 00:23:14,880 Speaker 2: trained on. They didn't do that because it was hard 506 00:23:14,920 --> 00:23:17,200 Speaker 2: to do and it was costly. So Bill went out 507 00:23:17,240 --> 00:23:21,840 Speaker 2: and did it. And so ideally the AI companies will 508 00:23:22,240 --> 00:23:26,000 Speaker 2: license the technology from Bill, right, or will go along 509 00:23:26,040 --> 00:23:27,600 Speaker 2: with it.
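[Editor's note: the pro rata revenue-share idea described above, attribution percentages per answer turned into proportional payouts, can be sketched as a toy computation. This is purely illustrative and not ProRata's actual system; the function name and the fifteen/fourteen percent figures are assumptions echoing the hypothetical example in the conversation.]

```python
# Toy sketch of proportional revenue sharing: given per-source
# attribution weights for one AI answer, split the revenue that
# answer generated in proportion to those weights.

def split_revenue(attribution: dict[str, float], revenue: float) -> dict[str, float]:
    """Allocate `revenue` across sources proportionally to attribution.

    `attribution` maps source name -> fractional credit (weights need
    not sum to 1; any remainder is simply left unallocated).
    """
    return {source: round(revenue * share, 2)
            for source, share in attribution.items()}

# Hypothetical numbers from the conversation: an answer attributed
# "fifteen percent from Oz's podcast, fourteen percent from the Atlantic",
# splitting $100 of revenue for that answer.
payout = split_revenue({"Oz's podcast": 0.15, "The Atlantic": 0.14},
                       revenue=100.0)
print(payout)
```

The interesting design question, which the conversation gestures at, is producing the attribution weights in the first place by working an answer backwards through the model; the split itself is the easy part.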
Now, what will force them to do that? 510 00:23:27,640 --> 00:23:31,040 Speaker 2: Because that would be a big change. One, shaming: like, Bill 511 00:23:31,080 --> 00:23:33,800 Speaker 2: could like do enough podcasts that eventually the world is 512 00:23:33,840 --> 00:23:36,840 Speaker 2: like, Bill's right and everybody else is wrong. Two, courts. 513 00:23:36,960 --> 00:23:40,680 Speaker 2: Three, legislation. And then four, the most interesting one, which 514 00:23:40,720 --> 00:23:41,879 Speaker 2: is one of the most important things that happened in 515 00:23:41,880 --> 00:23:43,520 Speaker 2: tech last year: Cloudflare was like, you know what, 516 00:23:43,600 --> 00:23:45,119 Speaker 2: we're gonna make it really hard for the AI companies 517 00:23:45,160 --> 00:23:47,600 Speaker 2: to scrape people, right? 518 00:23:47,680 --> 00:23:49,760 Speaker 2: Because up until last summer, we basically put a sign 519 00:23:49,800 --> 00:23:51,720 Speaker 2: on our lawn. We're like, hey, don't scrape us, right? 520 00:23:52,200 --> 00:23:54,840 Speaker 2: And you know, they all disobeyed it. And then we're like, okay, fine, 521 00:23:55,320 --> 00:23:57,080 Speaker 2: now we're using Cloudflare, which is like good at 522 00:23:57,080 --> 00:23:59,720 Speaker 2: tracking down Russian hackers and all that, so now you 523 00:23:59,800 --> 00:24:01,920 Speaker 2: really can't scrape us. And they're like, wait, now we can't. 524 00:24:01,960 --> 00:24:03,639 Speaker 2: We have no access to the Atlantic anymore, right? And 525 00:24:03,640 --> 00:24:05,280 Speaker 2: so we just turned it all off, well, not all of it, we 526 00:24:05,320 --> 00:24:06,879 Speaker 2: turned it all off except for OpenAI and 527 00:24:06,880 --> 00:24:09,960 Speaker 2: a few others.
So it's possible that over time the 528 00:24:10,000 --> 00:24:12,800 Speaker 2: balance of power shifts a little bit, because even though 529 00:24:13,320 --> 00:24:16,000 Speaker 2: synthetic data has replaced native human data in training 530 00:24:16,040 --> 00:24:19,639 Speaker 2: AI models, for new information about the world you still need 531 00:24:19,720 --> 00:24:24,600 Speaker 2: human data. Right? So, what happened today? Right, you can't 532 00:24:24,640 --> 00:24:27,160 Speaker 2: get that from a synthetic model. Right, maybe you 533 00:24:27,240 --> 00:24:28,959 Speaker 2: can like try to get it from a bunch of tweets, 534 00:24:29,119 --> 00:24:31,879 Speaker 2: but you actually need journalists and media companies. So that is 535 00:24:31,920 --> 00:24:34,280 Speaker 2: still valuable and will still be valuable. And so the 536 00:24:34,359 --> 00:24:35,960 Speaker 2: question is, can we get paid for that? 537 00:24:36,960 --> 00:24:41,919 Speaker 1: So the currency is not needing human created data to 538 00:24:42,040 --> 00:24:46,640 Speaker 1: make models function. It is having relevant data taken from 539 00:24:46,720 --> 00:24:50,000 Speaker 1: the real world, where AI can't go, right, turned into 540 00:24:50,080 --> 00:24:53,960 Speaker 1: data that AI can read, yes, and then repurposed for users. 541 00:24:54,000 --> 00:24:56,680 Speaker 1: And that is what the compensation model will be around. Yes, 542 00:24:56,720 --> 00:25:00,280 Speaker 1: hopefully, definitely, that makes sense. What about next year? What'll 543 00:25:00,320 --> 00:25:02,840 Speaker 1: be cool in The Most Interesting Thing in Tech, twenty 544 00:25:02,840 --> 00:25:03,880 Speaker 1: twenty six edition?
545 00:25:03,760 --> 00:25:06,560 Speaker 2: Well, the most interesting topic will be explainability, right? 546 00:25:06,560 --> 00:25:09,600 Speaker 2: Like, I do think we're going to have some kind 547 00:25:09,640 --> 00:25:12,280 Speaker 2: of an incident next year where AI does something terrible 548 00:25:12,280 --> 00:25:13,879 Speaker 2: and we're not going to know why it did it, 549 00:25:13,960 --> 00:25:15,800 Speaker 2: and that is going to lead to like a panic 550 00:25:15,800 --> 00:25:18,760 Speaker 2: about explainability. Like, something will go very wrong, right? I 551 00:25:18,760 --> 00:25:21,160 Speaker 2: don't know what it is, but like a plane will crash, 552 00:25:21,240 --> 00:25:23,960 Speaker 2: or like there'll be a two minute stock market dip 553 00:25:24,040 --> 00:25:28,000 Speaker 2: because some AI based trading platform has gone wild, or 554 00:25:28,640 --> 00:25:29,480 Speaker 2: something like that. 555 00:25:29,920 --> 00:25:31,920 Speaker 4: Right. Why do you believe it'll happen next year? 556 00:25:32,280 --> 00:25:34,400 Speaker 2: Just, it's, AI is getting so good. It's 557 00:25:34,480 --> 00:25:38,960 Speaker 2: kind of like, GPT wasn't capable of doing something 558 00:25:39,000 --> 00:25:40,720 Speaker 2: like that, right, and it wasn't used enough and it 559 00:25:40,760 --> 00:25:43,760 Speaker 2: wasn't integrated enough. Like, GPT-3 could tell like a 560 00:25:43,760 --> 00:25:46,600 Speaker 2: bad bedtime story to a kid, right. And GPT-4, 561 00:25:46,680 --> 00:25:48,920 Speaker 2: GPT-5 or whatever, five point one or six, is 562 00:25:49,000 --> 00:25:51,560 Speaker 2: going to have like sort of the power and the use. 563 00:25:51,880 --> 00:25:54,320 Speaker 2: I feel like something is going to go wrong and 564 00:25:54,359 --> 00:25:57,600 Speaker 2: that will lead to a lot of introspection on explainability. 565 00:25:57,800 --> 00:26:01,000 Speaker 2: That's just one prediction.
I also think that like it'll 566 00:26:01,040 --> 00:26:03,440 Speaker 2: start leading to real productivity. I think self driving cars 567 00:26:03,440 --> 00:26:05,200 Speaker 2: are awesome. I'm kind of excited 568 00:26:05,200 --> 00:26:07,880 Speaker 2: about AR glasses. Like, lots of good stuff's gonna happen next year. 569 00:26:08,880 --> 00:26:10,199 Speaker 2: But that's one. 570 00:26:10,320 --> 00:26:12,880 Speaker 1: So twenty twenty five was owls and ocelots, and twenty 571 00:26:12,920 --> 00:26:14,600 Speaker 1: twenty six will be the real world... 572 00:26:14,480 --> 00:26:17,040 Speaker 2: Will be the real world implication of owls and ocelots. 573 00:26:17,480 --> 00:26:20,000 Speaker 1: That's fascinating. Yeah, I thought you were gonna say something different. 574 00:26:20,080 --> 00:26:21,760 Speaker 1: What do you think is the second? Well, last year 575 00:26:21,920 --> 00:26:25,640 Speaker 1: you said something quite prescient, which was the value of 576 00:26:26,320 --> 00:26:28,960 Speaker 1: data is hard to 577 00:26:28,880 --> 00:26:29,919 Speaker 4: predict in the future. 578 00:26:30,000 --> 00:26:33,160 Speaker 1: I mean, you were talking in particular about how robots looking 579 00:26:33,200 --> 00:26:36,399 Speaker 1: at videos of people peeling carrots might become a very 580 00:26:36,520 --> 00:26:39,000 Speaker 1: valuable source of robot training data. 581 00:26:39,119 --> 00:26:41,399 Speaker 2: It did, it did. I was right. I should have 582 00:26:41,440 --> 00:26:44,520 Speaker 2: invested in carrots. 583 00:26:44,760 --> 00:26:50,720 Speaker 4: Talk about world models and non-word-based learning. 584 00:26:50,960 --> 00:26:52,840 Speaker 2: So this is, okay, so this is one of the 585 00:26:52,880 --> 00:26:55,800 Speaker 2: more interesting things too, right? So I don't know if 586 00:26:55,840 --> 00:26:57,320 Speaker 2: this is a prediction for twenty six.
Maybe it's a 587 00:26:57,320 --> 00:27:00,639 Speaker 2: prediction for twenty seven. But I do kind of think 588 00:27:00,720 --> 00:27:04,160 Speaker 2: that like the world where we think of AI as 589 00:27:04,200 --> 00:27:07,480 Speaker 2: a text box changes, right? So, like, Fei-Fei Li is 590 00:27:07,480 --> 00:27:09,080 Speaker 2: building this company and she's trying to build this thing 591 00:27:09,119 --> 00:27:11,840 Speaker 2: called spatial intelligence, where you're building AI that isn't just 592 00:27:11,920 --> 00:27:14,480 Speaker 2: trained on like understanding language and parsing it. It's based 593 00:27:14,520 --> 00:27:17,159 Speaker 2: on seeing the world, understanding the world, figuring out the 594 00:27:17,200 --> 00:27:20,320 Speaker 2: rules of the world. Like, in some ways, like, an 595 00:27:20,359 --> 00:27:23,199 Speaker 2: AI model is much more intelligent than a child, right? 596 00:27:23,200 --> 00:27:25,200 Speaker 2: It has much more vocabulary, knows a lot more about 597 00:27:25,240 --> 00:27:27,040 Speaker 2: the Spanish Civil War than a five year old. But 598 00:27:27,080 --> 00:27:28,720 Speaker 2: if you have it, like, try to create a video 599 00:27:28,920 --> 00:27:31,840 Speaker 2: that shows what happens when I do this with my hand. You 600 00:27:31,920 --> 00:27:33,919 Speaker 4: dropped a pen, right? I dropped a pen, right. 601 00:27:35,080 --> 00:27:37,160 Speaker 2: The AI doesn't really figure that out. Like, it doesn't 602 00:27:37,200 --> 00:27:38,879 Speaker 2: understand it. Like, you can have it watch a lot 603 00:27:38,880 --> 00:27:40,400 Speaker 2: of video, you can have it read a lot of text, 604 00:27:40,440 --> 00:27:43,840 Speaker 2: and it doesn't quite understand what motivates, you know, what 605 00:27:44,000 --> 00:27:46,320 Speaker 2: is actually causing the world to operate. It has this 606 00:27:46,400 --> 00:27:49,760 Speaker 2: like very narrow intelligence.
It has 607 00:27:49,840 --> 00:27:52,160 Speaker 2: learned in this very simple way, like a child who lived 608 00:27:52,200 --> 00:27:54,040 Speaker 2: in the dark and just was like read to for 609 00:27:54,080 --> 00:27:54,520 Speaker 2: a long time. 610 00:27:54,720 --> 00:27:56,960 Speaker 1: It learned from how the history of humanity 611 00:27:56,960 --> 00:28:00,040 Speaker 1: has described the universe, rather than from observing the universe. 612 00:28:00,040 --> 00:28:02,760 Speaker 2: Rather than being in the universe. And so that leads to 613 00:28:02,880 --> 00:28:04,720 Speaker 2: all of these gaps, right? And you can see it 614 00:28:04,760 --> 00:28:06,359 Speaker 2: in some of the hallucinations it makes. And you can 615 00:28:06,359 --> 00:28:08,000 Speaker 2: certainly see it in the early videos, where it just 616 00:28:08,000 --> 00:28:11,439 Speaker 2: doesn't understand how things should work. And like, so in 617 00:28:11,440 --> 00:28:14,600 Speaker 2: some ways, like, AI models understand the whole history of the world, 618 00:28:14,600 --> 00:28:17,320 Speaker 2: but they're also kind of like less intuitive than a squirrel, right? 619 00:28:17,800 --> 00:28:22,520 Speaker 2: And so could you somehow teach an AI model like 620 00:28:22,880 --> 00:28:25,240 Speaker 2: how the world works? Then what are the implications of 621 00:28:25,280 --> 00:28:27,840 Speaker 2: how you build it? Because then you can start to 622 00:28:27,880 --> 00:28:29,359 Speaker 2: think about it.
Okay, if your challenge is you 623 00:28:29,400 --> 00:28:31,240 Speaker 2: want to build robots, right, and you want to build 624 00:28:31,320 --> 00:28:32,879 Speaker 2: robots that help take care of elderly people, and you 625 00:28:32,880 --> 00:28:35,120 Speaker 2: want to use AI to do that, the path we're 626 00:28:35,160 --> 00:28:37,199 Speaker 2: going down right now is like, you read all the 627 00:28:37,240 --> 00:28:40,160 Speaker 2: text that's ever been put on Reddit, right, like, develop 628 00:28:40,160 --> 00:28:42,000 Speaker 2: a whole bunch of rules from that, and then that will 629 00:28:42,000 --> 00:28:44,400 Speaker 2: tell you how to operate. Well, no, really, you should 630 00:28:44,400 --> 00:28:47,560 Speaker 2: be teaching the robot, like, not just what happens when 631 00:28:47,560 --> 00:28:50,880 Speaker 2: I drop the pen, but also about the emotions of 632 00:28:50,920 --> 00:28:53,040 Speaker 2: the old person when she turns her head and squints 633 00:28:53,120 --> 00:28:55,640 Speaker 2: a little, right? And an AI model 634 00:28:55,680 --> 00:28:58,400 Speaker 2: can't figure that out, and a robot trained on 635 00:28:58,400 --> 00:29:01,360 Speaker 2: our current AI models can't. But maybe a robot trained 636 00:29:01,360 --> 00:29:03,400 Speaker 2: in like this wholly different way, which is what, you 637 00:29:03,440 --> 00:29:05,520 Speaker 2: know, Yann LeCun is working on, and Fei-Fei Li is working on, 638 00:29:05,520 --> 00:29:09,880 Speaker 2: and others are working on, maybe that completely supplants whatever comes 639 00:29:09,880 --> 00:29:11,400 Speaker 2: out of the large language 640 00:29:11,040 --> 00:29:14,440 Speaker 1: models. Nicholas Thompson, thank you. Thank you, Oz, that 641 00:29:14,480 --> 00:29:14,960 Speaker 1: was really fun. 642 00:29:37,320 --> 00:29:39,080 Speaker 4: That's it for this week for Tech Stuff. 643 00:29:39,080 --> 00:29:41,840 Speaker 1: I'm Kara Price and I'm Oz Voloshin.
This episode was 644 00:29:41,840 --> 00:29:44,040 Speaker 1: produced by Eliza Dennis, Tyler Hill, 645 00:29:43,880 --> 00:29:44,800 Speaker 4: and Melissa Slaughter. 646 00:29:45,040 --> 00:29:48,040 Speaker 1: It was executive produced by me, Kara Price, Julia Nutter, 647 00:29:48,080 --> 00:29:52,280 Speaker 1: and Kate Osborne for Kaleidoscope and Katrina Norvell for iHeart Podcasts. 648 00:29:52,720 --> 00:29:56,280 Speaker 1: Paul Bowman is our engineer and Jack Insley mixed this episode. 649 00:29:56,800 --> 00:29:58,320 Speaker 1: Kyle Murdoch wrote our theme 650 00:29:58,160 --> 00:30:00,600 Speaker 4: song. Please rate, review, and reach out to us at 651 00:30:00,600 --> 00:30:03,680 Speaker 4: tech Stuff Podcast at gmail dot com. 652 00:30:03,720 --> 00:30:04,520 Speaker 3: We want to hear from you.