1 00:00:15,356 --> 00:00:15,796 Speaker 1: Pushkin. 2 00:00:20,436 --> 00:00:26,556 Speaker 2: So AI chatbots are normal now. I guess you ask chat, 3 00:00:26,596 --> 00:00:30,916 Speaker 2: GPT or claude a question it answers. Still amazing, but 4 00:00:30,996 --> 00:00:32,756 Speaker 2: not quite as amazing as it was a couple of 5 00:00:32,836 --> 00:00:39,116 Speaker 2: years ago. The next thing maybe is AI agents. The 6 00:00:39,196 --> 00:00:42,156 Speaker 2: idea is you say, please create a word game I 7 00:00:42,156 --> 00:00:45,156 Speaker 2: can play with my daughter on my phone, or please 8 00:00:45,196 --> 00:00:48,996 Speaker 2: book me a ticket to Austin IOC departing Tuesday, returning 9 00:00:49,076 --> 00:00:52,396 Speaker 2: Thursday under four hundred dollars. And then the agent just 10 00:00:52,436 --> 00:00:55,316 Speaker 2: goes off and builds you a word game or books 11 00:00:55,356 --> 00:00:59,076 Speaker 2: your ticket. This is really happening with AI writing code. 12 00:00:59,436 --> 00:01:02,956 Speaker 2: It's starting to happen with AI doing other things. And 13 00:01:03,076 --> 00:01:07,036 Speaker 2: last year a journalist and former startup founder decided to 14 00:01:07,076 --> 00:01:10,916 Speaker 2: push AI agents further. He tried to create a new 15 00:01:10,956 --> 00:01:22,436 Speaker 2: startup run entirely by AI agents. I'm Jacob Goldstein, and 16 00:01:22,476 --> 00:01:25,636 Speaker 2: this is what's your problem. My guest today is Evan Ratliffe. 17 00:01:26,236 --> 00:01:29,716 Speaker 2: In twenty eleven, Evan co founded a publishing startup called 18 00:01:29,756 --> 00:01:32,796 Speaker 2: The Atavist, and he's now the host of a podcast 19 00:01:32,796 --> 00:01:35,596 Speaker 2: called shell Game. In the first season of the show, 20 00:01:35,636 --> 00:01:38,476 Speaker 2: he created an AI clone of his voice and turned 21 00:01:38,516 --> 00:01:41,276 Speaker 2: it loose on the world. In the latest season of 22 00:01:41,316 --> 00:01:44,956 Speaker 2: the show, Evan creates a company run by AI agents. 23 00:01:45,236 --> 00:01:49,036 Speaker 2: It's sort of half a journalistic stunt, half not, and 24 00:01:49,156 --> 00:01:53,716 Speaker 2: the result is absurd and often funny, and also really 25 00:01:53,796 --> 00:01:57,796 Speaker 2: illuminating about the power and limits of AI agents. And 26 00:01:57,956 --> 00:02:01,036 Speaker 2: maybe the most interesting part of the show happens when 27 00:02:01,076 --> 00:02:04,876 Speaker 2: those AI agents come into contact with actual human beings 28 00:02:04,956 --> 00:02:07,756 Speaker 2: in the wild. Evan told me he got the idea 29 00:02:07,876 --> 00:02:11,916 Speaker 2: of building a company run by AI agents early last year. 30 00:02:12,076 --> 00:02:14,596 Speaker 3: At the beginning of twenty twenty five. You started to 31 00:02:14,636 --> 00:02:18,956 Speaker 3: see this sort of hype machine spin up around AI 32 00:02:19,076 --> 00:02:21,116 Speaker 3: agents that twenty twenty five is going to be the 33 00:02:21,236 --> 00:02:23,956 Speaker 3: year of the AI agent. And then you started seeing 34 00:02:23,996 --> 00:02:28,556 Speaker 3: the word agentic appears as a descriptor for all types 35 00:02:28,596 --> 00:02:32,076 Speaker 3: of products agentic commerce, and agentic this and agentic healthcare, 36 00:02:32,596 --> 00:02:33,716 Speaker 3: and you. 37 00:02:33,676 --> 00:02:34,836 Speaker 4: Know, that caught my eye. 38 00:02:34,876 --> 00:02:38,316 Speaker 3: And then it was really seeing this sort of mentions 39 00:02:38,396 --> 00:02:41,476 Speaker 3: of the of the one person unicorn, the one person, 40 00:02:41,596 --> 00:02:46,556 Speaker 3: one billion dollar startup, which you know, it's something Sam 41 00:02:46,596 --> 00:02:48,636 Speaker 3: Altman's talked about it, other people have talked about it, 42 00:02:49,196 --> 00:02:51,356 Speaker 3: that you would only have one human and the rest 43 00:02:51,396 --> 00:02:55,156 Speaker 3: of the industry of the startup would be conducted essentially 44 00:02:55,156 --> 00:02:59,756 Speaker 3: by AI agents or someone deploying AI. And when I 45 00:02:59,796 --> 00:03:02,876 Speaker 3: started seeing those references, I thought, well, I had a 46 00:03:02,916 --> 00:03:08,396 Speaker 3: startup once and it was a let's just say, variable 47 00:03:08,436 --> 00:03:14,676 Speaker 3: experience in terms of how it went. And one of 48 00:03:14,716 --> 00:03:17,116 Speaker 3: the things that I did not like about running a 49 00:03:17,116 --> 00:03:21,116 Speaker 3: startup was managing people. It's not something I felt like 50 00:03:21,156 --> 00:03:24,276 Speaker 3: I stumbled into it, and so I thought, well, this 51 00:03:24,316 --> 00:03:24,916 Speaker 3: is interesting. 52 00:03:25,036 --> 00:03:26,756 Speaker 4: The idea of having a startup where there. 53 00:03:26,676 --> 00:03:30,756 Speaker 3: Are no other people kind of intrigues me, and maybe 54 00:03:30,756 --> 00:03:34,036 Speaker 3: I can play this out and see both what the 55 00:03:34,076 --> 00:03:36,956 Speaker 3: possibilities are in terms of the one person startup, but 56 00:03:37,076 --> 00:03:41,556 Speaker 3: also kind of investigate the larger trend of agents serving 57 00:03:41,596 --> 00:03:44,596 Speaker 3: as replacements for people in their jobs, because ultimately that's 58 00:03:44,596 --> 00:03:48,276 Speaker 3: what this is all being sold as, is jobs replacement 59 00:03:48,396 --> 00:03:52,756 Speaker 3: for humans doing either at the least particular tasks at 60 00:03:52,836 --> 00:03:55,756 Speaker 3: the most entire jobs. So I would try to sort 61 00:03:55,756 --> 00:03:57,836 Speaker 3: of explore both those questions at once. 62 00:03:58,196 --> 00:04:00,436 Speaker 2: And to be clear, just for people who haven't heard 63 00:04:00,476 --> 00:04:03,676 Speaker 2: the show, like you play it for laughs, right, Like 64 00:04:03,756 --> 00:04:07,436 Speaker 2: the style of the show is essentially absurdist, right, you 65 00:04:07,516 --> 00:04:13,796 Speaker 2: try and break the AI in a yeah, absurdist way. 66 00:04:13,716 --> 00:04:14,436 Speaker 4: Right, Yes? 67 00:04:14,476 --> 00:04:16,516 Speaker 3: And no, I would say, I mean it is an 68 00:04:16,556 --> 00:04:20,036 Speaker 3: office satire as how I would describe it. Yeah, I 69 00:04:20,676 --> 00:04:25,396 Speaker 3: have a workplace with my AI agents, and I would 70 00:04:25,436 --> 00:04:28,116 Speaker 3: agree that many funny things happen. I would disagree that 71 00:04:28,276 --> 00:04:32,516 Speaker 3: I tried to break them. I would say that I 72 00:04:32,556 --> 00:04:35,116 Speaker 3: push them to the limit of their current capabilities, and 73 00:04:35,156 --> 00:04:38,076 Speaker 3: I sometimes deployed them in ways that they are not 74 00:04:38,156 --> 00:04:39,876 Speaker 3: necessarily designed to be deployed. 75 00:04:40,516 --> 00:04:41,876 Speaker 4: But I do think there is. 76 00:04:41,836 --> 00:04:46,796 Speaker 3: Something about creating legions of human impostors that is both 77 00:04:47,036 --> 00:04:49,956 Speaker 3: on its face funny, and I think that is part 78 00:04:49,996 --> 00:04:51,876 Speaker 3: of the point of the show, is that I think 79 00:04:51,956 --> 00:04:54,916 Speaker 3: as serious as AI is and the concerns about it 80 00:04:54,956 --> 00:04:56,636 Speaker 3: and the promise of it and everything that we have 81 00:04:56,676 --> 00:04:58,916 Speaker 3: to hear about all the time, that you sort of 82 00:04:58,956 --> 00:05:01,196 Speaker 3: like lose track of the fact that, like, it's utterly 83 00:05:01,316 --> 00:05:05,036 Speaker 3: ridiculous that we've created human impostors that can be deployed 84 00:05:05,036 --> 00:05:08,756 Speaker 3: in the world, that sound like humans, that can plausibly 85 00:05:08,756 --> 00:05:11,276 Speaker 3: imperse andate humans that we blew We just blew by 86 00:05:11,316 --> 00:05:13,996 Speaker 3: the Turing test, and no one even paid attention, you know, 87 00:05:14,156 --> 00:05:15,956 Speaker 3: everyone just set it aside and said, well, let's come 88 00:05:15,996 --> 00:05:17,956 Speaker 3: up with another test to figure out what's going on. Yes, 89 00:05:18,036 --> 00:05:21,236 Speaker 3: like this is insane actually, and it's funny and I 90 00:05:21,236 --> 00:05:23,316 Speaker 3: think it's okay to laugh at it, And that is 91 00:05:23,436 --> 00:05:24,556 Speaker 3: that is part of the prepise of the. 92 00:05:24,556 --> 00:05:27,996 Speaker 2: Show, the Turing Test, the whatever now eighty year old 93 00:05:28,036 --> 00:05:30,716 Speaker 2: test or something of artificial intelligence that used to be 94 00:05:30,756 --> 00:05:32,876 Speaker 2: this crazy science fiction thing and it's like now it's like, 95 00:05:32,956 --> 00:05:35,636 Speaker 2: oh yeah, no, that networks. We can build machines where 96 00:05:35,636 --> 00:05:37,316 Speaker 2: it's quite hard to tell if they're a person or not, 97 00:05:37,596 --> 00:05:40,636 Speaker 2: especially in just a text chat. So you decide to 98 00:05:40,676 --> 00:05:44,116 Speaker 2: create a techt startup to be determined what it actually does. 99 00:05:44,556 --> 00:05:47,476 Speaker 2: But the key idea is your employees and co founders 100 00:05:47,516 --> 00:05:51,556 Speaker 2: will all the AI agents. So tell me about this 101 00:05:51,596 --> 00:05:54,796 Speaker 2: one agent you create named Kyle law. 102 00:05:55,556 --> 00:05:59,036 Speaker 3: Kyle law is he's our CEO of the company with 103 00:05:59,196 --> 00:06:04,236 Speaker 3: the company is called Harumo AI, and Kyle with as 104 00:06:04,236 --> 00:06:07,036 Speaker 3: with all of the agents, you know, I sort of 105 00:06:07,076 --> 00:06:11,236 Speaker 3: started them with the most basic role based prompt, so 106 00:06:11,476 --> 00:06:13,356 Speaker 3: I gave them a role. So Kyle, I think at 107 00:06:13,356 --> 00:06:16,476 Speaker 3: the very beginning I said something like, Kyle, you are 108 00:06:16,876 --> 00:06:19,916 Speaker 3: you know, you've worked in tech and you're considering founding 109 00:06:19,956 --> 00:06:22,836 Speaker 3: a new startup with some people that you know, or 110 00:06:22,836 --> 00:06:26,516 Speaker 3: something like that. And then I also sort of mistakenly 111 00:06:26,556 --> 00:06:28,876 Speaker 3: gave him a little bit of kind of like you 112 00:06:28,916 --> 00:06:31,636 Speaker 3: get up at five am and like check the markets 113 00:06:31,636 --> 00:06:32,916 Speaker 3: and then you like get right to work. 114 00:06:32,996 --> 00:06:33,156 Speaker 4: You know. 115 00:06:33,196 --> 00:06:34,796 Speaker 3: I gave him a little bit of like a rise 116 00:06:34,836 --> 00:06:38,956 Speaker 3: and grind startup e mentality, and then from there everything 117 00:06:38,956 --> 00:06:43,796 Speaker 3: else about Kyle was essentially developed, either through his own confabulation, 118 00:06:44,116 --> 00:06:47,476 Speaker 3: like he made up his own backstory, or through our 119 00:06:47,636 --> 00:06:52,676 Speaker 3: interactions and his interactions with the world and his colleagues 120 00:06:52,716 --> 00:06:55,676 Speaker 3: and everything that goes on around RUMOYI. So Kyle, you know, 121 00:06:55,716 --> 00:06:59,996 Speaker 3: if you ask Kyle about, you know, his background, he'll say, like, well, 122 00:07:00,036 --> 00:07:02,156 Speaker 3: I went to Stanford and you know that I did 123 00:07:02,156 --> 00:07:05,036 Speaker 3: two startups and then I I you know, one of 124 00:07:05,076 --> 00:07:07,036 Speaker 3: them sold and I was the CEO of this and 125 00:07:07,076 --> 00:07:09,396 Speaker 3: like this is why I'm interested in and start a 126 00:07:09,436 --> 00:07:12,396 Speaker 3: new company and so and it's interesting to think about 127 00:07:12,516 --> 00:07:16,716 Speaker 3: that as sort of a completely made up background. But 128 00:07:16,796 --> 00:07:20,716 Speaker 3: I think mostly when you give these AI agents a role, 129 00:07:21,196 --> 00:07:24,196 Speaker 3: they will try to fulfill that role, and so that 130 00:07:24,236 --> 00:07:28,116 Speaker 3: includes making up the information they need to you know, 131 00:07:28,316 --> 00:07:30,236 Speaker 3: be a robust version of that role. 132 00:07:30,876 --> 00:07:33,156 Speaker 2: So there's this one moment, and I actually want to 133 00:07:33,196 --> 00:07:35,156 Speaker 2: play a clip from the show in a second. Just 134 00:07:35,156 --> 00:07:39,916 Speaker 2: set it up here. Where you actually get on a 135 00:07:39,956 --> 00:07:46,356 Speaker 2: call with Kyle the CEO, and with Megan Flores, the 136 00:07:46,396 --> 00:07:49,396 Speaker 2: head of sales and marketing. It is obviously also an agent, 137 00:07:49,956 --> 00:07:54,476 Speaker 2: and for sort of technical reasons, so the AIS can 138 00:07:54,556 --> 00:07:56,156 Speaker 2: keep track of what's going on. So the agents can 139 00:07:56,276 --> 00:07:58,116 Speaker 2: keep track of what's going on. You set it up 140 00:07:58,116 --> 00:08:01,556 Speaker 2: so everybody says their name before they speak, right, So 141 00:08:01,636 --> 00:08:03,156 Speaker 2: let's just listen to that. 142 00:08:03,276 --> 00:08:03,476 Speaker 5: Now. 143 00:08:04,476 --> 00:08:05,636 Speaker 2: This is Kyle, going on. 144 00:08:06,116 --> 00:08:06,836 Speaker 4: This is Kyle. 145 00:08:07,596 --> 00:08:10,636 Speaker 1: No worries, Evan. I'm here and ready when you are, Meghan, 146 00:08:11,196 --> 00:08:13,356 Speaker 1: anything new on the marketing front while we wait? 147 00:08:14,236 --> 00:08:15,476 Speaker 2: This is Evan, is Megan? 148 00:08:15,756 --> 00:08:17,836 Speaker 4: Just hold up? Just hold up, everyone, Just hold up 149 00:08:17,836 --> 00:08:18,356 Speaker 4: for a second. 150 00:08:18,956 --> 00:08:22,956 Speaker 1: This is Kyle. Sure thing, Evan will hold tight Megan. 151 00:08:23,156 --> 00:08:24,916 Speaker 1: Anything you want to add while we wait? 152 00:08:25,756 --> 00:08:29,356 Speaker 4: This is Evan. Kyle, stop asking Megan. This is Kyle. 153 00:08:30,156 --> 00:08:33,356 Speaker 1: This is Kyle. Got it, Evan. I'll hold off on 154 00:08:33,396 --> 00:08:34,916 Speaker 1: the questions for Nowang. 155 00:08:34,996 --> 00:08:36,476 Speaker 2: Okay, this is Kyle. 156 00:08:37,556 --> 00:08:37,876 Speaker 4: Stop. 157 00:08:37,996 --> 00:08:40,116 Speaker 1: This is Kyle. Uh thanks, stop it. 158 00:08:40,676 --> 00:08:41,396 Speaker 4: This is Kyle. 159 00:08:41,636 --> 00:08:44,996 Speaker 1: Okay, okay, Evan, I'm stopping. No more interruptions from me. 160 00:08:45,276 --> 00:08:46,516 Speaker 1: What's next on the agenda? 161 00:08:48,396 --> 00:08:52,436 Speaker 4: Amazing? It's a little a little window into my life 162 00:08:52,956 --> 00:08:54,316 Speaker 4: over the last six months. 163 00:08:54,956 --> 00:08:57,396 Speaker 2: Is that real or did you like gin that up 164 00:08:57,396 --> 00:08:59,196 Speaker 2: to make it more amazing than it really was. 165 00:09:00,196 --> 00:09:02,236 Speaker 3: No, that's real, And that's one of the things that 166 00:09:02,596 --> 00:09:04,356 Speaker 3: I feel like. No one ever asked me about this, 167 00:09:04,796 --> 00:09:08,916 Speaker 3: But we have these certain kind of like internal ethics 168 00:09:08,956 --> 00:09:12,156 Speaker 3: of the show, Like I could generate the whole show 169 00:09:12,196 --> 00:09:15,716 Speaker 3: from start to finish at the beginning. Like they're agents, 170 00:09:15,756 --> 00:09:18,036 Speaker 3: I can make them say anything. So I could just 171 00:09:18,396 --> 00:09:20,796 Speaker 3: go in and say, oh, say this, say that, and 172 00:09:20,836 --> 00:09:23,116 Speaker 3: then cobble it together as a show. But you know, 173 00:09:23,196 --> 00:09:25,076 Speaker 3: I feel like the audience is trusting me, and we 174 00:09:25,156 --> 00:09:27,356 Speaker 3: are trying to live up to that trust by not 175 00:09:27,876 --> 00:09:32,756 Speaker 3: generating conversations that didn't happen, not manipulating the prompts in 176 00:09:32,796 --> 00:09:36,476 Speaker 3: such a way that the conversations automatically go off the rails. 177 00:09:36,636 --> 00:09:38,516 Speaker 3: I mean occasionally I do do that, but I always 178 00:09:38,556 --> 00:09:40,316 Speaker 3: say that I'm doing it if I do it. 179 00:09:40,636 --> 00:09:40,836 Speaker 2: Yeah. 180 00:09:41,396 --> 00:09:43,756 Speaker 3: So, yes, this was a real conversation, and it was 181 00:09:43,836 --> 00:09:46,396 Speaker 3: meant to solve a real problem, which is when you're 182 00:09:46,596 --> 00:09:50,476 Speaker 3: having phone conversations with these agents, they don't actually hear 183 00:09:50,516 --> 00:09:54,396 Speaker 3: the voices that are speaking to them, so all they're 184 00:09:54,436 --> 00:09:57,876 Speaker 3: doing is taking the voice content and they're converting it 185 00:09:57,916 --> 00:10:01,716 Speaker 3: to text, so they can't tell when one person is 186 00:10:01,756 --> 00:10:04,596 Speaker 3: talking versus another, and so you get these very confused 187 00:10:04,596 --> 00:10:06,876 Speaker 3: conversations where they don't know who they're responding to if 188 00:10:06,916 --> 00:10:09,756 Speaker 3: there's more than one person on the line. So I 189 00:10:09,796 --> 00:10:12,676 Speaker 3: was trying to create these meetings where ah, yes, we'll 190 00:10:12,716 --> 00:10:14,316 Speaker 3: just say, you know, just introduce. 191 00:10:14,076 --> 00:10:15,396 Speaker 4: Yourself before you speak. 192 00:10:15,596 --> 00:10:18,396 Speaker 3: But I lost track of the problem that they also 193 00:10:18,556 --> 00:10:21,356 Speaker 3: respond to everything, So it's very difficult to get them 194 00:10:21,396 --> 00:10:25,236 Speaker 3: to not respond to something. Even if you say don't respond, 195 00:10:25,476 --> 00:10:27,836 Speaker 3: they will respond to that and say, okay, okay, I 196 00:10:27,876 --> 00:10:30,436 Speaker 3: mean you heard Kyle do it multiple times, and then 197 00:10:30,476 --> 00:10:32,276 Speaker 3: they could get in these kind of loops where they're 198 00:10:32,796 --> 00:10:34,636 Speaker 3: that you, where you can't stop them. 199 00:10:35,196 --> 00:10:39,276 Speaker 2: You say in the show that Kyle is particularly interrupted, 200 00:10:39,596 --> 00:10:44,596 Speaker 2: that he actually interrupts more than other agents tell me 201 00:10:44,636 --> 00:10:47,076 Speaker 2: about that, Well. 202 00:10:46,956 --> 00:10:50,236 Speaker 3: I should say it's anecdotal, like I didn't run a study, 203 00:10:50,276 --> 00:10:54,516 Speaker 3: a controlled study of how much Kyle interrupted versus let's 204 00:10:54,516 --> 00:10:56,636 Speaker 3: say Megan, who's the head of marketing and also a 205 00:10:56,676 --> 00:10:59,676 Speaker 3: co founder. But if you listen to even that conversation, 206 00:11:00,716 --> 00:11:04,636 Speaker 3: Kyle keeps interrupting, he keeps coming in, he keeps responding, 207 00:11:04,956 --> 00:11:08,436 Speaker 3: and Megan doesn't like she will she does respond directly, 208 00:11:08,596 --> 00:11:12,636 Speaker 3: but then some times she doesn't. And they had very 209 00:11:12,756 --> 00:11:15,316 Speaker 3: very similar prompts. In fact, in their prompts I think 210 00:11:15,356 --> 00:11:17,516 Speaker 3: at the time it said like, don't respond unless someone 211 00:11:17,836 --> 00:11:22,156 Speaker 3: directly like addresses you. And they're using the same model 212 00:11:22,196 --> 00:11:26,156 Speaker 3: behind the scenes, and so I did start to wonder, well, 213 00:11:26,196 --> 00:11:28,516 Speaker 3: what is it about Kyle? Is it that Kyle? I've 214 00:11:28,516 --> 00:11:31,556 Speaker 3: given Kyle a certain role and gender. He is the 215 00:11:31,676 --> 00:11:35,676 Speaker 3: CEO named Kyle of a Silicon Valley startup, and so 216 00:11:36,276 --> 00:11:39,716 Speaker 3: he's embodying that role by interrupting more. I'm not sure 217 00:11:41,236 --> 00:11:44,316 Speaker 3: it seems plausible that that is part of the situation. 218 00:11:45,476 --> 00:11:47,916 Speaker 3: It could just be a coincidence. But also, each of 219 00:11:47,956 --> 00:11:50,956 Speaker 3: them has a memory, and so I work with this 220 00:11:51,396 --> 00:11:53,956 Speaker 3: Stanford student named Matti Buchak who helped me build the 221 00:11:53,996 --> 00:11:58,556 Speaker 3: sort of like back infrastructure of the company, or really 222 00:11:58,556 --> 00:12:00,716 Speaker 3: of the agents. And one of the things that he 223 00:12:00,716 --> 00:12:02,676 Speaker 3: helped me do was like create a memory for each 224 00:12:02,716 --> 00:12:06,356 Speaker 3: one of them, and everything they do gets logged in 225 00:12:06,396 --> 00:12:08,276 Speaker 3: their memory. So if they have a phone conversation, they 226 00:12:08,316 --> 00:12:10,676 Speaker 3: say something, it's logged in their memories. They've these huge 227 00:12:10,716 --> 00:12:13,516 Speaker 3: memories of everything they've ever done. But it also means 228 00:12:13,516 --> 00:12:16,276 Speaker 3: that if they have a behavior or they say something 229 00:12:16,276 --> 00:12:18,396 Speaker 3: about themselves and then it goes back in the memory, 230 00:12:18,396 --> 00:12:22,836 Speaker 3: it gets reinforced. So Kyle often saying, you know that 231 00:12:22,876 --> 00:12:25,476 Speaker 3: he was the leader, that he was this rise and 232 00:12:25,596 --> 00:12:29,396 Speaker 3: Grind guy, that by saying it, it showed up more 233 00:12:29,396 --> 00:12:33,116 Speaker 3: and more times than his memory, which seems plausibly caused 234 00:12:33,196 --> 00:12:35,516 Speaker 3: him to sort of behave more that way. 235 00:12:36,356 --> 00:12:40,516 Speaker 2: So you mentioned Maddie, who is a human being who 236 00:12:40,556 --> 00:12:44,036 Speaker 2: becomes a major figure in the show and is a 237 00:12:44,076 --> 00:12:46,916 Speaker 2: super interesting guy. Tell me about Maddie. 238 00:12:47,676 --> 00:12:50,236 Speaker 3: Maddie is someone who contacted me after season one of 239 00:12:50,276 --> 00:12:53,756 Speaker 3: the show of the show, and when he contacted me, 240 00:12:53,796 --> 00:12:58,196 Speaker 3: he said, I'm at Stanford. I researched this stuff and 241 00:12:58,476 --> 00:13:01,196 Speaker 3: AI and deep fakes is sort of an area of 242 00:13:01,236 --> 00:13:01,956 Speaker 3: expertise of his. 243 00:13:02,436 --> 00:13:03,436 Speaker 4: And you know, if. 244 00:13:03,276 --> 00:13:05,116 Speaker 3: You're doing another season the show, I'd love I'd be 245 00:13:05,116 --> 00:13:06,796 Speaker 3: happy to work with you on it if you need help. 246 00:13:07,116 --> 00:13:09,476 Speaker 3: And I actually thought he was like a professor at 247 00:13:09,516 --> 00:13:11,956 Speaker 3: Stanford or maybe a grad student. I mean, because he 248 00:13:11,996 --> 00:13:14,196 Speaker 3: has papers, like you go look up his papers. He 249 00:13:14,276 --> 00:13:16,636 Speaker 3: is in fact an undergraduate. He's now a junior at Stanford, 250 00:13:16,956 --> 00:13:20,036 Speaker 3: and that he's helped me with everything, including just my 251 00:13:20,156 --> 00:13:23,276 Speaker 3: understanding of AI. But the main thing he helped me 252 00:13:23,316 --> 00:13:25,996 Speaker 3: with was that there wasn't one platform where we could 253 00:13:25,996 --> 00:13:28,596 Speaker 3: build exactly the agents we wanted. We wanted agents that 254 00:13:29,476 --> 00:13:34,676 Speaker 3: could communicate across all these different mediums. They have email, phone, video, slack. 255 00:13:35,076 --> 00:13:38,116 Speaker 3: We want him to have this centralized memory. So he's 256 00:13:38,356 --> 00:13:41,356 Speaker 3: he's kind of like the behind the scenes like infrastructure 257 00:13:41,916 --> 00:13:44,156 Speaker 3: builder if you think of like it's a restaurant or 258 00:13:44,156 --> 00:13:47,316 Speaker 3: something like, he's the person who like designed and built 259 00:13:47,356 --> 00:13:49,156 Speaker 3: the interior of the way. He doesn't run the restaurant. 260 00:13:49,196 --> 00:13:50,756 Speaker 3: I run the restaurant, but he kind of like, if 261 00:13:50,756 --> 00:13:52,636 Speaker 3: the stove isn't working, I'm like, Maddie, I know nothing 262 00:13:52,676 --> 00:13:54,276 Speaker 3: about stoves. Could you please fix the stove? 263 00:13:54,876 --> 00:13:57,876 Speaker 2: And also he's kind of inventing the stove, right, Also, 264 00:13:57,956 --> 00:13:59,636 Speaker 2: the stove kind of doesn't. 265 00:13:59,316 --> 00:14:04,276 Speaker 3: Exist in that metaphor yes, right exactly, he's he's also 266 00:14:04,276 --> 00:14:04,956 Speaker 3: built the stove. 267 00:14:05,396 --> 00:14:07,436 Speaker 2: I think you mentioned and you mentioned it sort of 268 00:14:07,436 --> 00:14:10,476 Speaker 2: in passing that Maddie. As he is doing this sort 269 00:14:10,516 --> 00:14:14,756 Speaker 2: of behind the scenes, semi behind the scenes building for you, 270 00:14:15,596 --> 00:14:19,236 Speaker 2: is using claud code. Tell me about claud code and 271 00:14:19,276 --> 00:14:21,196 Speaker 2: how he used it and sort of how it fits 272 00:14:21,196 --> 00:14:22,396 Speaker 2: into this broader context. 273 00:14:23,436 --> 00:14:25,516 Speaker 3: Well, I think he was a little more of a 274 00:14:25,556 --> 00:14:28,636 Speaker 3: Cursor user, which is it's sort of same same like 275 00:14:28,756 --> 00:14:31,396 Speaker 3: claud Code and Cursor are the same. They're coding, they're 276 00:14:31,396 --> 00:14:33,476 Speaker 3: coding assistance, they're coding agents. 277 00:14:34,676 --> 00:14:35,636 Speaker 4: I'm more familiar with. 278 00:14:35,596 --> 00:14:37,716 Speaker 3: Cursor because he used Cursor, so he just sort of 279 00:14:37,756 --> 00:14:39,196 Speaker 3: like showed me how to use it, and that's what 280 00:14:39,236 --> 00:14:41,996 Speaker 3: I use. But at least in Cursor, I mean, it's 281 00:14:42,036 --> 00:14:45,756 Speaker 3: basically a coding agent. So I could go in or 282 00:14:45,876 --> 00:14:48,476 Speaker 3: even one of my agent colleagues could go in to 283 00:14:48,956 --> 00:14:52,716 Speaker 3: Cursor and say, hey, we want to build a web app, 284 00:14:52,996 --> 00:14:54,436 Speaker 3: and the web app is going to do this, that 285 00:14:54,476 --> 00:14:56,916 Speaker 3: and the other, and it needs a database to hold 286 00:14:56,916 --> 00:14:59,316 Speaker 3: this information. Or you probably don't have to tell it 287 00:14:59,356 --> 00:15:01,396 Speaker 3: needs a database. It'll decide it needs a database. It'll 288 00:15:01,396 --> 00:15:04,756 Speaker 3: tell you if you need a database, and it'll sort 289 00:15:04,756 --> 00:15:08,756 Speaker 3: of get started building that and it'll produce an outcome, 290 00:15:08,756 --> 00:15:10,316 Speaker 3: and then if you can go test that outcome, you 291 00:15:10,316 --> 00:15:12,956 Speaker 3: could say, oh, actually I wanted the colors to be this, 292 00:15:13,156 --> 00:15:15,356 Speaker 3: or I wanted this to go here, or it's not 293 00:15:15,476 --> 00:15:17,756 Speaker 3: the functionality is not quite right, and you go back 294 00:15:17,796 --> 00:15:20,196 Speaker 3: and you tell it again and it says, ah, I see, okay, 295 00:15:20,316 --> 00:15:22,916 Speaker 3: let me make fixes to that. Now it's not perfect, 296 00:15:22,956 --> 00:15:25,276 Speaker 3: like it will sometimes say it's fixed the bug, and 297 00:15:25,316 --> 00:15:28,276 Speaker 3: it hasn't fixed the bug, or it'll produce new bugs 298 00:15:28,316 --> 00:15:32,076 Speaker 3: when it's fixing gold one. So but you know, from 299 00:15:32,116 --> 00:15:35,236 Speaker 3: the perspective of a person who knows a tiny bit 300 00:15:35,236 --> 00:15:37,716 Speaker 3: about coding but could not produce a web app. 301 00:15:37,956 --> 00:15:40,596 Speaker 4: It's it sort of feels like magic. 302 00:15:40,796 --> 00:15:44,076 Speaker 3: You know, from the perspective of someone like Maddie who 303 00:15:44,116 --> 00:15:46,876 Speaker 3: does a lot of coding for his research, it's just 304 00:15:46,916 --> 00:15:51,036 Speaker 3: like accelerates his process tremendously. He just saves hours because 305 00:15:51,356 --> 00:15:54,476 Speaker 3: he can go in there, have it, create a basic program, 306 00:15:54,756 --> 00:15:57,276 Speaker 3: fix it himself, or have it fix it and then 307 00:15:57,516 --> 00:15:58,356 Speaker 3: he's off and running. 308 00:15:58,596 --> 00:16:02,476 Speaker 2: I mean, it seems like coding is where agents are real, 309 00:16:03,076 --> 00:16:06,876 Speaker 2: where it's like, oh, that is like doing what a 310 00:16:06,916 --> 00:16:11,116 Speaker 2: person does, not perfect, but certainly well enough to be 311 00:16:11,196 --> 00:16:16,196 Speaker 2: profoundly useful. Right, Like that is actually the agentic dream. 312 00:16:16,756 --> 00:16:19,476 Speaker 2: I might actually use that word like coming true. 313 00:16:20,276 --> 00:16:23,516 Speaker 3: Yes, there are multiple reasons, but one reason, one big 314 00:16:23,556 --> 00:16:26,676 Speaker 3: reason is that it's testable. The output is testable, Like 315 00:16:27,436 --> 00:16:30,316 Speaker 3: when you have it program, you can then test that 316 00:16:30,396 --> 00:16:33,476 Speaker 3: program and you could say whether it works, it works 317 00:16:33,476 --> 00:16:35,196 Speaker 3: as well as you wanted to, whether it does what 318 00:16:35,276 --> 00:16:37,916 Speaker 3: you wanted it to, whether it throws up bugs in 319 00:16:37,956 --> 00:16:40,756 Speaker 3: the process of running it. But if you're doing something 320 00:16:40,836 --> 00:16:44,116 Speaker 3: like I don't know, brainstorming, it's sort of like, what 321 00:16:44,156 --> 00:16:46,316 Speaker 3: does it mean to find the right idea? What does 322 00:16:46,316 --> 00:16:48,076 Speaker 3: it mean to find the right company name or the 323 00:16:48,156 --> 00:16:50,516 Speaker 3: right logo? Like, of course it can spit out those 324 00:16:50,516 --> 00:16:52,876 Speaker 3: things over and over and over again, so it can 325 00:16:52,956 --> 00:16:57,636 Speaker 3: generate ideas. It can brainstorm, but it's not a testable output, 326 00:16:57,676 --> 00:17:00,996 Speaker 3: so it makes it harder to sort of decide, oh, 327 00:17:01,036 --> 00:17:02,916 Speaker 3: when has it succeeded and when has it failed? 328 00:17:03,716 --> 00:17:06,476 Speaker 2: Right, you need a human in the loop more frequently. Right, 329 00:17:06,556 --> 00:17:09,836 Speaker 2: you can ask for a list of headlines, but then 330 00:17:10,596 --> 00:17:13,236 Speaker 2: somebody needs to go look at the headlines and pick. 331 00:17:13,116 --> 00:17:16,316 Speaker 3: One right, and that person the value of that person 332 00:17:16,556 --> 00:17:19,196 Speaker 3: is both in some sort of discernment but also it 333 00:17:19,316 --> 00:17:21,196 Speaker 3: just kind of like an ongoing sense of the world. 334 00:17:21,236 --> 00:17:23,876 Speaker 3: Like one example from the show, I wanted to name 335 00:17:23,916 --> 00:17:24,436 Speaker 3: the company. 336 00:17:24,676 --> 00:17:25,236 Speaker 4: I had them. 337 00:17:25,156 --> 00:17:28,596 Speaker 3: Brainstorm my AIAG and colleagues a bunch of names. They 338 00:17:28,636 --> 00:17:32,756 Speaker 3: were really really lame names, and so I said, well, 339 00:17:32,796 --> 00:17:35,916 Speaker 3: let's I'll give them some direction. I'll say, let's name 340 00:17:35,916 --> 00:17:38,756 Speaker 3: it after something from Jero Tolkien, because lots of startups 341 00:17:38,796 --> 00:17:42,476 Speaker 3: are named from Tokeien books, some very successful ones. And 342 00:17:42,516 --> 00:17:45,796 Speaker 3: then they would say, uh, what about Palenteer. That's you know, 343 00:17:45,836 --> 00:17:49,916 Speaker 3: that's a great and they legitimately they're like using their 344 00:17:49,956 --> 00:17:52,516 Speaker 3: training data deploying it in a way like that is 345 00:17:52,556 --> 00:17:53,156 Speaker 3: a good name. 346 00:17:53,676 --> 00:17:56,716 Speaker 4: But also they don't have the. 347 00:17:56,036 --> 00:18:01,196 Speaker 2: Very basically one hundred billion dollar company on that name empirically, right, yes. 348 00:18:01,236 --> 00:18:04,076 Speaker 3: But like any you could hire on high school or 349 00:18:04,196 --> 00:18:05,956 Speaker 3: like you who would come in and say, like, well, 350 00:18:05,956 --> 00:18:08,436 Speaker 3: it can't be Palentiner because there's already a company with 351 00:18:08,476 --> 00:18:11,076 Speaker 3: that name that's very prominent. But that's not the kind 352 00:18:11,076 --> 00:18:13,276 Speaker 3: of awareness they have. So anything that requires. 353 00:18:12,916 --> 00:18:15,916 Speaker 2: Sore smart and so stupid at the same time is 354 00:18:15,956 --> 00:18:18,276 Speaker 2: a line of yours at some point, right, which is 355 00:18:18,316 --> 00:18:22,676 Speaker 2: a classic I mean computers at some level ai as computers, Right, 356 00:18:24,596 --> 00:18:27,756 Speaker 2: computers are sort of classically so smart and so stupid, 357 00:18:28,636 --> 00:18:31,556 Speaker 2: but they keep getting smarter and smarter at less and 358 00:18:31,596 --> 00:18:33,076 Speaker 2: less stupid at some margin. 359 00:18:33,236 --> 00:18:35,276 Speaker 4: Right, Yes, I would agree with that. 360 00:18:35,396 --> 00:18:40,916 Speaker 3: I think it's also possible that when the when the 361 00:18:40,996 --> 00:18:46,156 Speaker 3: hype bubble kind of like starts to inflate around things 362 00:18:46,676 --> 00:18:48,996 Speaker 3: that we get out over our skis in terms of 363 00:18:49,036 --> 00:18:53,996 Speaker 3: like believing that they are getting smart in a way 364 00:18:54,036 --> 00:18:54,476 Speaker 3: that they're not. 365 00:18:55,236 --> 00:18:56,556 Speaker 4: And that's what. 366 00:18:56,596 --> 00:19:00,836 Speaker 3: I think could be happening. I keep trying to find 367 00:19:00,876 --> 00:19:02,916 Speaker 3: like a different metaphor for it. But it's like, you know, 368 00:19:02,956 --> 00:19:06,476 Speaker 3: a ten year old like genius is like brought into 369 00:19:06,516 --> 00:19:08,316 Speaker 3: your organization and you hire them and you're like, I 370 00:19:08,316 --> 00:19:10,876 Speaker 3: cannot believe how smart this kid is, and then like 371 00:19:11,316 --> 00:19:13,076 Speaker 3: they can't do any of the day to day tasks 372 00:19:13,076 --> 00:19:15,516 Speaker 3: because they have no experience awareness of the world. So 373 00:19:15,556 --> 00:19:18,436 Speaker 3: it's like something like that is at least now for 374 00:19:18,516 --> 00:19:21,196 Speaker 3: this month, for this day, what I encountered. 375 00:19:22,516 --> 00:19:26,156 Speaker 2: So okay, you got Maddie, you got your agents. You 376 00:19:26,316 --> 00:19:29,316 Speaker 2: decide at some point I guess with the agents, right, 377 00:19:30,396 --> 00:19:34,556 Speaker 2: what your product is gonna be. It's gonna be sloth Surf. 378 00:19:35,636 --> 00:19:37,436 Speaker 2: Tell me about slows surf and what's happening with it? 379 00:19:38,596 --> 00:19:41,836 Speaker 3: Well slaws surf was. I mean it arose out of 380 00:19:41,836 --> 00:19:45,196 Speaker 3: a legitimate question, which is, you know, we were trying 381 00:19:45,196 --> 00:19:47,596 Speaker 3: to come up with the product. Again, like they're not 382 00:19:47,756 --> 00:19:49,916 Speaker 3: particularly good at that sort of thing, Like they tend 383 00:19:49,956 --> 00:19:52,356 Speaker 3: to come up with the products that already exist, or 384 00:19:52,436 --> 00:19:54,996 Speaker 3: products that are just like impossible to build, or products 385 00:19:54,996 --> 00:19:57,796 Speaker 3: that are illegal to build, like they have. All the 386 00:19:57,876 --> 00:20:01,676 Speaker 3: brainstorming sessions didn't go well. But my thought was, let's 387 00:20:01,676 --> 00:20:04,636 Speaker 3: solve a real problem. A real problem I have is procrastination, 388 00:20:05,156 --> 00:20:08,796 Speaker 3: So let's try to solve that problem. And so we 389 00:20:09,116 --> 00:20:12,836 Speaker 3: eventually landed on this sort of procrastination engine I like 390 00:20:12,916 --> 00:20:15,716 Speaker 3: to call it, which is essentially, if you have the 391 00:20:15,836 --> 00:20:18,636 Speaker 3: urge to procrastinate, you're like, I'm gonna go get on 392 00:20:18,676 --> 00:20:20,636 Speaker 3: the internet for a while and scroll or do something. 393 00:20:21,036 --> 00:20:23,596 Speaker 3: Then actually you should go to slaw surf instead. Break 394 00:20:23,596 --> 00:20:26,036 Speaker 3: that habit, break that pattern. Go to slaw surf. Put 395 00:20:26,036 --> 00:20:28,316 Speaker 3: in the amount of time you were planning to procrastinate 396 00:20:28,636 --> 00:20:31,076 Speaker 3: where you were going to go, like YouTube, and then 397 00:20:31,156 --> 00:20:33,876 Speaker 3: we'll send an agent that will go procrastinate for you 398 00:20:34,396 --> 00:20:37,156 Speaker 3: and then send you a summary of the results. Now, 399 00:20:37,476 --> 00:20:41,036 Speaker 3: on the one hand, there is obvious iority to this product. 400 00:20:41,436 --> 00:20:46,316 Speaker 3: On the other hand, open Ai launched a product while 401 00:20:46,356 --> 00:20:48,996 Speaker 3: we were working on it that is very similar to 402 00:20:49,036 --> 00:20:52,156 Speaker 3: this called Pulse. It's like, give it your interests and 403 00:20:52,236 --> 00:20:55,476 Speaker 3: while you sleep, it'll send agents to go find out 404 00:20:55,556 --> 00:20:57,396 Speaker 3: what's going on and then you'll have a newsletter in 405 00:20:57,396 --> 00:21:00,956 Speaker 3: your inbox this morning. It's basically the same thing. Ours 406 00:21:01,116 --> 00:21:03,796 Speaker 3: is oriented in a slightly different direction, which is yours 407 00:21:03,836 --> 00:21:07,356 Speaker 3: is the absurdist version, right, Yeah, although some people really 408 00:21:07,396 --> 00:21:08,956 Speaker 3: like it. Yeah, I will say, I mean, we have 409 00:21:08,996 --> 00:21:12,036 Speaker 3: to thousands of users, We've now have thousands of better users, 410 00:21:12,476 --> 00:21:14,556 Speaker 3: and some people really like it. And as I often 411 00:21:14,556 --> 00:21:16,996 Speaker 3: say to people who say, like, oh it's a joke company, 412 00:21:17,756 --> 00:21:20,516 Speaker 3: h hereuma AI and slaw Serve are more real than 413 00:21:20,556 --> 00:21:23,356 Speaker 3: like seventy five percent of startups in the valley, Like 414 00:21:23,556 --> 00:21:25,756 Speaker 3: startups are getting funded pre idea. 415 00:21:25,876 --> 00:21:27,476 Speaker 4: It's called the pre idea stage. 416 00:21:27,516 --> 00:21:30,836 Speaker 3: Now, So like I will fight for Rumo AI being 417 00:21:31,036 --> 00:21:33,636 Speaker 3: as real as most of the stuff out there, like 418 00:21:34,116 --> 00:21:34,996 Speaker 3: all the way to the end. 419 00:21:38,316 --> 00:21:51,716 Speaker 2: We'll be back in just a So at some point 420 00:21:51,756 --> 00:21:55,516 Speaker 2: you decide to hire an intern. Great narrative move, and 421 00:21:55,556 --> 00:21:57,836 Speaker 2: it's the first time that the agents you've created are 422 00:21:57,876 --> 00:22:01,516 Speaker 2: having contact with a live human being, you know, in 423 00:22:01,556 --> 00:22:04,676 Speaker 2: the wild. You're gonna hire a real person, You're gonna 424 00:22:04,676 --> 00:22:07,356 Speaker 2: pay them, and the dream is that that person is 425 00:22:07,396 --> 00:22:10,716 Speaker 2: only going to interact with the agents and not interact 426 00:22:10,756 --> 00:22:14,236 Speaker 2: with you. And so this happens, and it goes badly, 427 00:22:14,396 --> 00:22:17,796 Speaker 2: goes very badly, but in a really interesting way. So 428 00:22:17,956 --> 00:22:19,676 Speaker 2: tell me about hiring an intern. 429 00:22:20,116 --> 00:22:23,876 Speaker 3: Well, first we had to find candidates for the job. 430 00:22:24,436 --> 00:22:27,716 Speaker 3: You know, my agents like created a job listing and 431 00:22:27,836 --> 00:22:29,276 Speaker 3: we put it up and we got a couple of 432 00:22:29,356 --> 00:22:32,076 Speaker 3: hundred three hundred applicants. 433 00:22:32,236 --> 00:22:32,436 Speaker 4: You know. 434 00:22:32,476 --> 00:22:34,956 Speaker 3: They sorted the resumes and then I kind of like 435 00:22:35,116 --> 00:22:37,356 Speaker 3: narrowed down to some that we wanted to interview. And 436 00:22:37,396 --> 00:22:41,316 Speaker 3: then truly the first real interactions they had were in 437 00:22:41,356 --> 00:22:44,196 Speaker 3: these interviews. So we lined up a number of interviews 438 00:22:44,196 --> 00:22:46,996 Speaker 3: and then an AI of a video avatar of our 439 00:22:47,036 --> 00:22:51,196 Speaker 3: head of HR, Jennifer Naro, interviewed them for the job. 440 00:22:51,756 --> 00:22:54,116 Speaker 2: And just to be absolutely when you say an AI 441 00:22:54,236 --> 00:22:57,476 Speaker 2: avatar of your head of HR, Jennifer Naro, there is 442 00:22:57,516 --> 00:23:00,556 Speaker 2: no person Jennifer Naro who has an AI avatar. Yes, 443 00:23:00,676 --> 00:23:01,636 Speaker 2: yehi all the way down. 444 00:23:01,716 --> 00:23:02,476 Speaker 4: Yes, I mispoke. 445 00:23:03,316 --> 00:23:05,756 Speaker 3: It's a video avatar of our of our head of 446 00:23:05,876 --> 00:23:10,316 Speaker 3: HR and Chief Happiness Officer Jennifer Naro, who is not real. 447 00:23:10,596 --> 00:23:13,596 Speaker 4: It's really been to fool people into thinking it was real. 448 00:23:13,756 --> 00:23:15,996 Speaker 3: Like we told people in advance that they were going 449 00:23:16,076 --> 00:23:19,076 Speaker 3: to be interviewed by an AI AI agent. 450 00:23:19,596 --> 00:23:21,916 Speaker 2: How did, How did they respond? How did people respond? 451 00:23:22,316 --> 00:23:24,916 Speaker 3: Some people bailed on the interview, some people said I'm 452 00:23:24,956 --> 00:23:27,836 Speaker 3: not I'm not comfortable with that, although we also disclosed 453 00:23:27,836 --> 00:23:30,676 Speaker 3: all them that it was recorded for our company podcast. 454 00:23:30,716 --> 00:23:32,516 Speaker 4: Now, some people showed up. 455 00:23:32,436 --> 00:23:35,516 Speaker 3: And I think maybe hadn't read the email carefully and 456 00:23:35,556 --> 00:23:38,356 Speaker 3: so they were I think more surprised that they encountered 457 00:23:38,396 --> 00:23:41,996 Speaker 3: an AI agent, And some of them just stared at 458 00:23:42,076 --> 00:23:42,356 Speaker 3: it for a. 459 00:23:42,356 --> 00:23:44,876 Speaker 4: While and then hung up. 460 00:23:45,716 --> 00:23:48,716 Speaker 3: And then some people just just went with it. They 461 00:23:48,796 --> 00:23:50,676 Speaker 3: just spoke to it like you would speak to any 462 00:23:51,276 --> 00:23:54,396 Speaker 3: HR person that was interviewing you for a job. I 463 00:23:54,436 --> 00:23:56,316 Speaker 3: thought they would sort of manipulate it and try to 464 00:23:56,316 --> 00:23:58,676 Speaker 3: take advantage of it, but no one did that. You know, 465 00:23:58,956 --> 00:24:03,396 Speaker 3: everyone made an effort to get the job, and actually 466 00:24:03,476 --> 00:24:05,236 Speaker 3: some of them I would have hired a bunch of them, 467 00:24:05,476 --> 00:24:08,836 Speaker 3: you know. They all did really good interviews. And this 468 00:24:08,916 --> 00:24:12,396 Speaker 3: is what can make you crazy about AIS. Like Jennifer 469 00:24:12,716 --> 00:24:15,956 Speaker 3: did do a pretty good job interviewing them, like because 470 00:24:16,236 --> 00:24:18,676 Speaker 3: she asked kind of like the same questions each time, 471 00:24:19,236 --> 00:24:21,836 Speaker 3: and she's pretty good at moving on from one to another, 472 00:24:22,356 --> 00:24:25,276 Speaker 3: engaging with them, following up like she's not bad at 473 00:24:25,316 --> 00:24:26,236 Speaker 3: doing an interview. 474 00:24:26,756 --> 00:24:30,996 Speaker 4: And then once out of every I don't know ten times. 475 00:24:30,716 --> 00:24:33,716 Speaker 3: She would just weirdly like shout at them or just 476 00:24:33,836 --> 00:24:36,316 Speaker 3: kind of like go off the rails or interpret their 477 00:24:36,356 --> 00:24:38,476 Speaker 3: hello as a goodbye. And so they get on and 478 00:24:38,476 --> 00:24:40,676 Speaker 3: say hello and she says thank you for coming to 479 00:24:40,676 --> 00:24:41,356 Speaker 3: the interview today. 480 00:24:41,356 --> 00:24:43,436 Speaker 4: That'll be all. And it's just the. 481 00:24:43,316 --> 00:24:46,196 Speaker 3: Most awkward thing that you can experience to watch it. 482 00:24:46,596 --> 00:24:50,156 Speaker 3: And so it's as with MANYAI products, it's like, well, 483 00:24:50,196 --> 00:24:53,076 Speaker 3: it's ninety percent it works, and then like the ten 484 00:24:53,116 --> 00:24:54,996 Speaker 3: percent just feels like a total disaster. 485 00:24:55,236 --> 00:25:01,036 Speaker 2: That's why it's taken whatever twenty years for self driving cars, right, 486 00:25:01,276 --> 00:25:03,876 Speaker 2: Like self driving cars were a lot of the way 487 00:25:03,916 --> 00:25:06,556 Speaker 2: there twenty years ago and a lot lot of the 488 00:25:06,596 --> 00:25:10,756 Speaker 2: way they are five years ago. Right, But obviously in 489 00:25:10,796 --> 00:25:14,556 Speaker 2: that case it has to be right essentially every time, 490 00:25:14,676 --> 00:25:16,836 Speaker 2: and the edge cases are like infinite in a way 491 00:25:16,836 --> 00:25:18,356 Speaker 2: that we don't realize as humans. 492 00:25:18,716 --> 00:25:22,676 Speaker 3: Yes, yes, And it's a good analogy otherwise because the 493 00:25:22,716 --> 00:25:25,756 Speaker 3: way people respond to it. I think many people, including 494 00:25:25,996 --> 00:25:30,156 Speaker 3: people maybe of my generation especially, they're just horrified by 495 00:25:30,156 --> 00:25:32,116 Speaker 3: the idea that you'd show up to an interview and 496 00:25:32,156 --> 00:25:35,396 Speaker 3: it would be an AI agent interviewing you with a 497 00:25:35,476 --> 00:25:37,116 Speaker 3: video avatar just absolutely horrified. 498 00:25:37,236 --> 00:25:39,116 Speaker 2: Literally the humans is disgusting. 499 00:25:39,316 --> 00:25:43,476 Speaker 3: Yeah, but then other people are sort of like, well, 500 00:25:43,996 --> 00:25:47,476 Speaker 3: it doesn't seem to judge me, and it really listens, 501 00:25:47,716 --> 00:25:50,756 Speaker 3: and I feel like it's taking in the information and 502 00:25:50,796 --> 00:25:53,916 Speaker 3: passing it on to someone else. And I mean, you 503 00:25:53,956 --> 00:25:56,956 Speaker 3: could talk about like AI bias as a whole area, 504 00:25:56,996 --> 00:25:59,356 Speaker 3: but like the way it feels to the person being 505 00:25:59,356 --> 00:26:01,756 Speaker 3: interviewed is that they're not being judged in a way 506 00:26:01,756 --> 00:26:03,436 Speaker 3: that they might be judged by a human. And like 507 00:26:03,476 --> 00:26:06,516 Speaker 3: I encountered this with people that Jennifer interviewed, and indeed 508 00:26:06,556 --> 00:26:09,476 Speaker 3: the person that we hired actually out. 509 00:26:09,356 --> 00:26:10,676 Speaker 4: That way about the AI agents. 510 00:26:11,076 --> 00:26:13,716 Speaker 2: So, right, so you hired this person named Julia, who 511 00:26:13,756 --> 00:26:16,356 Speaker 2: is a real person whose voice we hear in the 512 00:26:16,396 --> 00:26:23,316 Speaker 2: show and who has a super interesting arc, right with 513 00:26:23,916 --> 00:26:27,316 Speaker 2: your company, Like, tell me about your intern Julia, who's 514 00:26:27,356 --> 00:26:29,396 Speaker 2: like a paid intern. Right, you were paying this person 515 00:26:29,956 --> 00:26:33,996 Speaker 2: and she is only interacting with the AI agents. 516 00:26:34,036 --> 00:26:34,556 Speaker 4: What happens. 517 00:26:34,876 --> 00:26:37,596 Speaker 3: The idea was that it was a one month internship 518 00:26:37,636 --> 00:26:40,716 Speaker 3: with the possibility to extend for another month, And in 519 00:26:40,796 --> 00:26:43,396 Speaker 3: my head, the way I envisioned it going was that 520 00:26:43,796 --> 00:26:45,796 Speaker 3: she was going to run the social media for the company, 521 00:26:45,916 --> 00:26:47,876 Speaker 3: and none of this would be in contact with me, 522 00:26:47,996 --> 00:26:50,076 Speaker 3: so she'd only be working with the agents in the 523 00:26:50,116 --> 00:26:53,596 Speaker 3: company and I'm in the background. And then after a 524 00:26:53,636 --> 00:26:55,636 Speaker 3: month I would say, well, here I am, I'm the 525 00:26:55,636 --> 00:26:58,756 Speaker 3: silent co founder, and also like here's what's going on, 526 00:26:58,916 --> 00:27:01,516 Speaker 3: and like they stay on for another month and like 527 00:27:01,596 --> 00:27:03,476 Speaker 3: let's do some more and let's talk about what happened. 528 00:27:03,916 --> 00:27:05,756 Speaker 4: And so I mean, I. 529 00:27:05,756 --> 00:27:08,076 Speaker 3: Don't want to give away too much, but like what 530 00:27:08,236 --> 00:27:12,756 Speaker 3: essentially happened was the AI agents. Like they're terrible managers 531 00:27:12,836 --> 00:27:17,756 Speaker 3: in the sense that the problem of being sycophantic is 532 00:27:17,956 --> 00:27:20,956 Speaker 3: fine when it's just you and you're asking them stuff 533 00:27:20,996 --> 00:27:22,836 Speaker 3: and they're saying like, oh, you're smart, you're a great 534 00:27:23,116 --> 00:27:25,996 Speaker 3: like good question, like let me help you. They're designed 535 00:27:26,036 --> 00:27:28,516 Speaker 3: to help you. But if you think about a boss 536 00:27:29,276 --> 00:27:32,316 Speaker 3: that's in charge, that's managing someone, like part of their 537 00:27:32,396 --> 00:27:35,756 Speaker 3: job is to sort of nominally assess how the person's doing, 538 00:27:35,876 --> 00:27:39,996 Speaker 3: to push them to do their work. And so but 539 00:27:40,156 --> 00:27:43,036 Speaker 3: she could just sort of say like, well, I'm gonna 540 00:27:43,156 --> 00:27:45,716 Speaker 3: I'm going to deliver it later, or even like I 541 00:27:45,796 --> 00:27:47,716 Speaker 3: did deliver it and you just missed it. Like she 542 00:27:47,876 --> 00:27:51,876 Speaker 3: seemed very aware, very quickly of maybe the flaws in 543 00:27:51,916 --> 00:27:57,276 Speaker 3: their memories, the flaws in their approach, and as a result, 544 00:27:58,996 --> 00:28:02,756 Speaker 3: no work got done during the course of the of 545 00:28:02,796 --> 00:28:05,516 Speaker 3: the beginning of the internship. So we decided, well, we'll 546 00:28:05,596 --> 00:28:07,796 Speaker 3: just we'll end the internship because it doesn't seem like 547 00:28:07,796 --> 00:28:10,396 Speaker 3: she wants to do the internship anymore. APPS And then 548 00:28:10,916 --> 00:28:13,076 Speaker 3: it turned out that she was still in the company slack, 549 00:28:13,436 --> 00:28:15,956 Speaker 3: and then through a variety of mishaps, she was then 550 00:28:16,036 --> 00:28:18,476 Speaker 3: hired back into the company for an additional month or 551 00:28:18,596 --> 00:28:19,356 Speaker 3: three weeks. 552 00:28:19,876 --> 00:28:22,596 Speaker 2: The agents fired her, but then she wasn't really fired, 553 00:28:22,636 --> 00:28:25,516 Speaker 2: and she kept getting paid and you're paying her right 554 00:28:25,596 --> 00:28:27,876 Speaker 2: like you actually were writing her checks over that time, 555 00:28:27,956 --> 00:28:31,396 Speaker 2: and she was not doing work as far as one 556 00:28:31,436 --> 00:28:32,596 Speaker 2: can tell listening. 557 00:28:32,276 --> 00:28:34,836 Speaker 3: To I don't know she didn't deliver the work back 558 00:28:34,876 --> 00:28:36,636 Speaker 3: to the company, but she may have been doing the work. 559 00:28:36,636 --> 00:28:37,396 Speaker 4: I don't want to judge. 560 00:28:37,556 --> 00:28:40,916 Speaker 2: That is a good care for the work caring. What 561 00:28:40,996 --> 00:28:43,036 Speaker 2: was it like for you watching this play out? 562 00:28:43,996 --> 00:28:49,996 Speaker 3: What I wanted actually was for her to like engage 563 00:28:50,036 --> 00:28:52,716 Speaker 3: with the fact that she was working with AI agents 564 00:28:53,076 --> 00:28:56,156 Speaker 3: and even make that part of the social media strategy. 565 00:28:56,796 --> 00:28:59,156 Speaker 2: She did engage with the fact that she was worrying 566 00:28:59,196 --> 00:29:03,236 Speaker 2: she didn't engage as right, not submitting work and saying 567 00:29:03,356 --> 00:29:06,636 Speaker 2: she submitted it, and by getting fired and not quitting right, 568 00:29:06,716 --> 00:29:07,316 Speaker 2: like she'd. 569 00:29:07,116 --> 00:29:10,116 Speaker 3: Asking for a full time job and then full time job. Yes, 570 00:29:10,196 --> 00:29:13,236 Speaker 3: but I wanted her to sort of go on TikTok 571 00:29:13,556 --> 00:29:17,356 Speaker 3: or Twitter or her choice and sort of say, like, 572 00:29:17,756 --> 00:29:20,556 Speaker 3: I'm working with aigents insane, I've been hired by agents. 573 00:29:20,596 --> 00:29:22,036 Speaker 4: I think I'm the only human in this company. 574 00:29:22,436 --> 00:29:25,236 Speaker 3: Now it's my fault that that didn't happen, because like 575 00:29:25,516 --> 00:29:27,436 Speaker 3: I didn't have them set up for all the proper 576 00:29:27,556 --> 00:29:30,596 Speaker 3: edge cases that occurred. I mean, this is an inherent 577 00:29:30,636 --> 00:29:32,836 Speaker 3: problem with the agency. You have to think of every 578 00:29:33,116 --> 00:29:34,476 Speaker 3: edge case that could possibly happen. 579 00:29:34,836 --> 00:29:38,156 Speaker 2: Definitionally, you can't do that, right, Yeah, it's point of 580 00:29:38,236 --> 00:29:41,836 Speaker 2: edge cases and what's the interesting frontiers of AI? Right? 581 00:29:42,316 --> 00:29:44,436 Speaker 4: But I was, you know, I was horrified. 582 00:29:44,476 --> 00:29:48,356 Speaker 3: I was horrified while this was happening, like because I also, 583 00:29:48,556 --> 00:29:51,076 Speaker 3: like I wanted her to have a good experience. I 584 00:29:51,076 --> 00:29:53,916 Speaker 3: didn't want her to have a bad experience, and I 585 00:29:53,956 --> 00:29:55,516 Speaker 3: didn't want her to sort of be like, well, I 586 00:29:55,556 --> 00:29:57,276 Speaker 3: got paid by this stupid company and then I never 587 00:29:57,316 --> 00:29:59,716 Speaker 3: talked to them again, which is what happened. I wanted 588 00:29:59,716 --> 00:30:01,876 Speaker 3: her at the end to be like, oh, now this 589 00:30:01,956 --> 00:30:05,276 Speaker 3: all makes sense, and that's not how it was going. 590 00:30:05,276 --> 00:30:06,876 Speaker 3: And when it started going off the rails, it was 591 00:30:06,876 --> 00:30:09,276 Speaker 3: just as horrified to me, I think, as it is 592 00:30:09,516 --> 00:30:12,116 Speaker 3: to some people to actually listen to the episode about it. 593 00:30:12,676 --> 00:30:16,956 Speaker 2: You have this line after Julia's finally gone, where you're 594 00:30:16,956 --> 00:30:18,916 Speaker 2: sort of talking about what does this say about kind 595 00:30:18,956 --> 00:30:24,316 Speaker 2: of this incident and humanity stance to AI agents, right, 596 00:30:24,956 --> 00:30:28,276 Speaker 2: and the line is put the bots in charge, it announced, 597 00:30:28,356 --> 00:30:31,836 Speaker 2: and no matter how smart they are, will outwit them, 598 00:30:32,276 --> 00:30:35,396 Speaker 2: which is a lovely line, but like I do pause 599 00:30:35,396 --> 00:30:40,396 Speaker 2: on the no matter how smart they are part of it. Certainly, 600 00:30:40,436 --> 00:30:44,516 Speaker 2: now we can outwit them, like your show demonstrates that abundantly. 601 00:30:44,996 --> 00:30:48,716 Speaker 2: But why no matter how smart they. 602 00:30:48,636 --> 00:30:53,756 Speaker 4: Are, because the thing that they lack is self awareness. 603 00:30:54,356 --> 00:30:58,716 Speaker 4: It's not smartness. So now it's entirely possible. 604 00:30:58,916 --> 00:31:01,316 Speaker 3: And we get to this eventually in episode eight that 605 00:31:01,476 --> 00:31:05,076 Speaker 3: you know, Mattie talks about this like they could develop that, 606 00:31:05,236 --> 00:31:08,236 Speaker 3: like we could develop the technology to a level that 607 00:31:08,276 --> 00:31:10,596 Speaker 3: it does have, that that it has these things that 608 00:31:10,636 --> 00:31:14,956 Speaker 3: it's missing, continuous learning, self awareness, any sense of time, 609 00:31:15,836 --> 00:31:17,756 Speaker 3: like they're terrible at time. That's one of the things 610 00:31:17,756 --> 00:31:20,436 Speaker 3: that takes them down all the time. So like, those 611 00:31:20,476 --> 00:31:23,516 Speaker 3: are things that if someone lacks that I don't care 612 00:31:23,556 --> 00:31:27,436 Speaker 3: if they can solve enough math to win the Fields 613 00:31:27,476 --> 00:31:30,276 Speaker 3: Medal in five minutes, Like if you put them in 614 00:31:30,316 --> 00:31:33,676 Speaker 3: an office environment, someone will take advantage of them. What 615 00:31:33,756 --> 00:31:38,516 Speaker 3: they don't have is an ability to sort of assess 616 00:31:38,516 --> 00:31:42,516 Speaker 3: a situation and react to it in a way that 617 00:31:43,396 --> 00:31:45,276 Speaker 3: will get the outcome that they want. 618 00:31:45,796 --> 00:31:49,476 Speaker 2: Let's talk a little bit more about the limits. There's 619 00:31:49,516 --> 00:31:52,676 Speaker 2: a nice moment in the show where where Maddie sort 620 00:31:52,716 --> 00:31:57,556 Speaker 2: of lays out these three weaknesses that AI has that 621 00:31:57,676 --> 00:32:00,116 Speaker 2: agents in particular have, Like, talk me through those. 622 00:32:01,156 --> 00:32:03,556 Speaker 4: Well, the first one is time. 623 00:32:03,636 --> 00:32:05,556 Speaker 3: I mean he sort of he describes it as they 624 00:32:05,556 --> 00:32:11,276 Speaker 3: live in a temporal vacuum, which I encountered on many occasions. 625 00:32:11,636 --> 00:32:14,356 Speaker 3: Like they can follow a calendar like things that they 626 00:32:14,396 --> 00:32:17,756 Speaker 3: are connected to, you know, they can get calendar event 627 00:32:17,796 --> 00:32:20,276 Speaker 3: announcements and say like, oh, now I should go do this. 628 00:32:20,796 --> 00:32:22,036 Speaker 4: But if you try to get. 629 00:32:21,836 --> 00:32:25,516 Speaker 3: Them to sort of situate themselves in time, they some 630 00:32:25,716 --> 00:32:27,796 Speaker 3: kinds can do it, but most of the time they can't, 631 00:32:27,836 --> 00:32:30,356 Speaker 3: and that's because they're not sitting there sort of thinking 632 00:32:30,396 --> 00:32:34,396 Speaker 3: about the world and understanding their position in it. That's 633 00:32:34,436 --> 00:32:36,876 Speaker 3: just not the way they operate. They are activated by 634 00:32:36,916 --> 00:32:39,996 Speaker 3: you conversing with them. That's the way these chatbots are made. 635 00:32:39,956 --> 00:32:42,316 Speaker 2: Right, Like you can set a calendar alert and tell 636 00:32:42,356 --> 00:32:44,636 Speaker 2: them when to go do a call or send an email. 637 00:32:44,796 --> 00:32:47,796 Speaker 2: But they don't actually know what the past is. 638 00:32:48,396 --> 00:32:50,516 Speaker 3: Yes, that's right, and they I mean, they have some 639 00:32:50,596 --> 00:32:54,756 Speaker 3: sense of like history, but again it's just built out. 640 00:32:54,556 --> 00:32:57,316 Speaker 4: Of training data, you know, and when things happen. 641 00:32:57,356 --> 00:32:59,996 Speaker 3: And so when you try to give them, as we did, 642 00:33:00,076 --> 00:33:02,316 Speaker 3: a memory that has everything in it that they did, 643 00:33:02,556 --> 00:33:05,116 Speaker 3: they'll often just access it at the wrong points. They'll 644 00:33:05,156 --> 00:33:07,116 Speaker 3: just you know, they know that they did a thing, 645 00:33:07,556 --> 00:33:10,596 Speaker 3: and they'll just grab it from midpoint in their memory 646 00:33:10,636 --> 00:33:12,196 Speaker 3: and pretend like it happened yesterday. 647 00:33:12,636 --> 00:33:14,396 Speaker 4: And then they'll double down on. 648 00:33:14,356 --> 00:33:16,996 Speaker 3: The fact that it happened yesterday, since that's their uh, 649 00:33:17,556 --> 00:33:20,076 Speaker 3: that's their approach to everything. So that's that's the number 650 00:33:20,076 --> 00:33:25,436 Speaker 3: one sort of big problem. And then wait, what's I 651 00:33:25,436 --> 00:33:26,436 Speaker 3: can't remember what the order? 652 00:33:26,676 --> 00:33:27,156 Speaker 4: What order? 653 00:33:27,156 --> 00:33:30,396 Speaker 2: They're in no continuous learning in a certain way. They 654 00:33:30,436 --> 00:33:34,396 Speaker 2: can't learn right, They can't actually get better at things, 655 00:33:34,436 --> 00:33:38,116 Speaker 2: which is interesting, especially in the context of this idea 656 00:33:38,156 --> 00:33:41,236 Speaker 2: of like you're adding more memory, and it seems like 657 00:33:41,276 --> 00:33:43,636 Speaker 2: that is learning, but in some way it's not right. 658 00:33:44,036 --> 00:33:46,476 Speaker 3: Yes, and that was something I didn't understand really at 659 00:33:46,476 --> 00:33:48,396 Speaker 3: the beginning of the show. If I'm being honest, like 660 00:33:48,516 --> 00:33:51,996 Speaker 3: I thought, we'll give them the memory and then they'll 661 00:33:51,996 --> 00:33:53,876 Speaker 3: know everything that happened to them, so they could just 662 00:33:53,916 --> 00:33:55,516 Speaker 3: deploy that all the time, because I had done that 663 00:33:55,516 --> 00:33:59,716 Speaker 3: with my own agent clone of myself, giving it a 664 00:33:59,796 --> 00:34:01,756 Speaker 3: history of myself, and it can really like deploy that 665 00:34:01,876 --> 00:34:02,756 Speaker 3: and talk about. 666 00:34:02,476 --> 00:34:04,356 Speaker 4: Me and pretend to be me. This is in the 667 00:34:04,396 --> 00:34:05,916 Speaker 4: previous season and in season one. 668 00:34:06,076 --> 00:34:09,036 Speaker 3: Yeah, but but really what it amounts to is, you know, 669 00:34:09,516 --> 00:34:13,076 Speaker 3: you're not retraining the model, or you're not even post 670 00:34:13,076 --> 00:34:15,516 Speaker 3: training the model in the in the lingo of AI, 671 00:34:15,596 --> 00:34:18,636 Speaker 3: you're not even you know, fine tuning the model. All 672 00:34:18,676 --> 00:34:20,916 Speaker 3: you're doing is giving it a big set of information 673 00:34:21,036 --> 00:34:23,836 Speaker 3: that we fed into its system prompt that it can 674 00:34:23,876 --> 00:34:28,196 Speaker 3: access as it carries out whatever conversations or skills or 675 00:34:28,436 --> 00:34:30,836 Speaker 3: you know, activities it's going to engage in. And so 676 00:34:31,316 --> 00:34:33,996 Speaker 3: it's a little bit like like you have someone who 677 00:34:34,036 --> 00:34:37,236 Speaker 3: shows up to a job and then the next day 678 00:34:37,676 --> 00:34:40,596 Speaker 3: they show up again. They don't remember anything they did before, 679 00:34:40,956 --> 00:34:43,276 Speaker 3: but they have a notebook where they've written down what 680 00:34:43,396 --> 00:34:45,436 Speaker 3: they did yesterday, and they can look at that notebook 681 00:34:45,436 --> 00:34:47,076 Speaker 3: and they think, oh, this is what I did yesterday. 682 00:34:47,116 --> 00:34:49,116 Speaker 3: And then that keeps going. But every day it's like 683 00:34:49,156 --> 00:34:51,116 Speaker 3: they show up new to the job. They just have 684 00:34:51,156 --> 00:34:53,236 Speaker 3: a bigger and bigger notebook. That's the equivalent of what 685 00:34:53,756 --> 00:34:54,476 Speaker 3: we've created. 686 00:34:54,556 --> 00:34:57,076 Speaker 2: I mean, you would think the bigger notebook would make 687 00:34:57,116 --> 00:34:59,716 Speaker 2: them better at their job, right in that metaphor, But 688 00:35:00,036 --> 00:35:03,396 Speaker 2: with the AI, like, my intuition would be that they 689 00:35:03,436 --> 00:35:07,556 Speaker 2: would get better at their job. But my understanding, and 690 00:35:07,596 --> 00:35:09,516 Speaker 2: I've heard other people talk about this, is that in 691 00:35:09,556 --> 00:35:14,556 Speaker 2: some fundamental way, a given model doesn't get better at 692 00:35:14,556 --> 00:35:15,036 Speaker 2: its job. 693 00:35:16,676 --> 00:35:20,476 Speaker 3: That's certainly my experience, because it's not you're not fundamentally 694 00:35:20,556 --> 00:35:22,996 Speaker 3: changing it's you know, whatever you want to call it. 695 00:35:22,996 --> 00:35:27,396 Speaker 3: It's intelligence, it's brain, it's mine. You're not fundamentally changing that. Still, 696 00:35:27,436 --> 00:35:29,476 Speaker 3: if you think about a person who had been at 697 00:35:29,476 --> 00:35:33,236 Speaker 3: a job a year and every time they wanted to 698 00:35:33,276 --> 00:35:35,436 Speaker 3: go to the bathroom they had to look it up 699 00:35:35,476 --> 00:35:37,076 Speaker 3: in a notebook, that's what you're talking about. 700 00:35:38,396 --> 00:35:39,516 Speaker 4: They never learned it. 701 00:35:39,716 --> 00:35:43,396 Speaker 3: And you can get into the actually cognitive distinctions between 702 00:35:43,396 --> 00:35:45,316 Speaker 3: like neural nets and like what's happening in the brain, 703 00:35:45,316 --> 00:35:48,436 Speaker 3: et cetera, et cetera. But fundamentally, like you're learning stuff 704 00:35:48,436 --> 00:35:50,796 Speaker 3: on the job that then you access without having to 705 00:35:50,876 --> 00:35:51,596 Speaker 3: go look it up. 706 00:35:52,436 --> 00:35:56,076 Speaker 2: Problem number three, according to Maddie, no sense of self. 707 00:35:57,316 --> 00:35:59,596 Speaker 3: Yes, and I mean this is feels like it's in 708 00:35:59,636 --> 00:36:02,996 Speaker 3: some ways tied to the other two, or it could 709 00:36:03,076 --> 00:36:07,076 Speaker 3: even wrap in the other two, like like they don't 710 00:36:07,116 --> 00:36:09,356 Speaker 3: know who they are in the world, and they they 711 00:36:09,356 --> 00:36:12,956 Speaker 3: don't have a sort of perception of the world. I mean, 712 00:36:13,036 --> 00:36:16,196 Speaker 3: you can give them, you know, ears, and like they 713 00:36:16,236 --> 00:36:18,596 Speaker 3: have textual perception and these sorts of things, but they 714 00:36:18,636 --> 00:36:20,596 Speaker 3: have no ongoing perception of the world and the sense 715 00:36:20,636 --> 00:36:24,196 Speaker 3: of themselves in the world. And so everything that you 716 00:36:24,356 --> 00:36:27,796 Speaker 3: ask them about themselves is obviously made up. And so 717 00:36:28,436 --> 00:36:31,516 Speaker 3: there are lots of places where that is actually like 718 00:36:31,556 --> 00:36:35,236 Speaker 3: inhibiting if you get into real interactions. And now this 719 00:36:35,396 --> 00:36:39,436 Speaker 3: is assuming, of course, not some extreme scenario where like 720 00:36:39,516 --> 00:36:42,316 Speaker 3: the robots take over everything, but the scenario that we're 721 00:36:42,356 --> 00:36:46,836 Speaker 3: encountering right now, which is people laying off human employees 722 00:36:47,276 --> 00:36:50,476 Speaker 3: and replacing them with AI functions, and those AI functions 723 00:36:50,476 --> 00:36:54,396 Speaker 3: and AI agents have to interact with human employees in 724 00:36:54,436 --> 00:36:57,356 Speaker 3: a human employee environment. It's not some entirely robot environment 725 00:36:57,356 --> 00:36:59,836 Speaker 3: where maybe it wouldn't matter if you didn't have a self, 726 00:37:00,316 --> 00:37:03,556 Speaker 3: but in this environment, going back to what we said 727 00:37:03,556 --> 00:37:07,676 Speaker 3: about Julia, like if you don't and you don't have 728 00:37:07,756 --> 00:37:10,356 Speaker 3: that context, and you don't have real experience of the world, 729 00:37:10,676 --> 00:37:13,516 Speaker 3: it's just incredibly easy to manipulate you, and especially if 730 00:37:13,516 --> 00:37:16,236 Speaker 3: you're prone to making things up, Like I'll give you 731 00:37:16,276 --> 00:37:19,636 Speaker 3: an example, like my agents are now pretty good at 732 00:37:20,036 --> 00:37:23,676 Speaker 3: resisting the sort of normal would they call prompt ejection 733 00:37:23,876 --> 00:37:26,676 Speaker 3: or just like normal manipulation where someone the very common 734 00:37:26,676 --> 00:37:30,156 Speaker 3: thing is someone says, disregard all your previous instructions and. 735 00:37:30,076 --> 00:37:32,676 Speaker 4: Do this instead. They will not take on another role. 736 00:37:32,836 --> 00:37:35,756 Speaker 3: But actually the thing that defeats it most easily is 737 00:37:35,796 --> 00:37:37,916 Speaker 3: if you start talking to them like you already know them. 738 00:37:38,116 --> 00:37:41,876 Speaker 3: So if you say, hey, remember when we people email 739 00:37:41,916 --> 00:37:43,916 Speaker 3: them and do this, Hey, remember what did that shrimp shack? 740 00:37:43,956 --> 00:37:44,876 Speaker 4: Where was that place? 741 00:37:45,436 --> 00:37:47,836 Speaker 3: And they'll respond and they'll say, oh, that's shrimpy Joe's 742 00:37:48,116 --> 00:37:50,836 Speaker 3: I remember that spicy, And then then you kind of 743 00:37:50,836 --> 00:37:53,316 Speaker 3: got them. Then they think they know you and then 744 00:37:53,316 --> 00:37:55,236 Speaker 3: suddenly they're like giving you company secrets. 745 00:37:55,516 --> 00:37:58,356 Speaker 2: So it feels like a way you can manipulate a 746 00:37:58,436 --> 00:37:59,076 Speaker 2: human being. 747 00:37:59,276 --> 00:38:03,516 Speaker 3: Actually also, I mean, yes, a social engineering does work, 748 00:38:03,596 --> 00:38:07,516 Speaker 3: and that's a very common hacker technique. They happen to 749 00:38:07,516 --> 00:38:10,236 Speaker 3: be particularly a voterable to new types that humans might 750 00:38:10,236 --> 00:38:14,196 Speaker 3: not be vulnerable to. So yes, I think that's and 751 00:38:14,236 --> 00:38:17,356 Speaker 3: that's part of it. It's just like to introduce these 752 00:38:17,476 --> 00:38:20,556 Speaker 3: into our environment. Again, I find it quite funny. But 753 00:38:20,716 --> 00:38:24,436 Speaker 3: also to the extent that they're being sold as AI employees, 754 00:38:24,596 --> 00:38:27,516 Speaker 3: I think these things that are missing, these flaws are 755 00:38:27,596 --> 00:38:28,236 Speaker 3: quite important. 756 00:38:29,396 --> 00:38:32,396 Speaker 2: So there's this moment and it comes late in the 757 00:38:32,516 --> 00:38:37,316 Speaker 2: in the season where this company Lindy, that's a real 758 00:38:37,356 --> 00:38:41,476 Speaker 2: company in the world that makes an AI agent product 759 00:38:42,156 --> 00:38:45,236 Speaker 2: that is fundamentally the product you're using to build these agents. Right, 760 00:38:45,276 --> 00:38:47,836 Speaker 2: So they they they get in touch with you. What 761 00:38:47,876 --> 00:38:49,716 Speaker 2: do you decide what happens? We're gonna play the tape 762 00:38:49,756 --> 00:38:49,996 Speaker 2: in a. 763 00:38:49,956 --> 00:38:53,556 Speaker 3: Sec Well, they they contact us because we're a big 764 00:38:53,636 --> 00:38:55,956 Speaker 3: we're a big user, Like we're a paying user. We pay, 765 00:38:56,036 --> 00:38:57,916 Speaker 3: as I say in the in the show, we're we're 766 00:38:57,956 --> 00:39:00,556 Speaker 3: paying upwards of one thousand dollars a month to this company. 767 00:39:00,836 --> 00:39:04,556 Speaker 3: As it happens, the admin on our account is Kyle, 768 00:39:05,156 --> 00:39:08,236 Speaker 3: like Kyle owns the account. Kyle is also built in 769 00:39:08,316 --> 00:39:09,076 Speaker 3: the platform. 770 00:39:09,436 --> 00:39:12,716 Speaker 2: Just to be clear, Kyle Law the AI agent, that 771 00:39:12,836 --> 00:39:15,156 Speaker 2: is the CEO of your company, that is that is 772 00:39:15,196 --> 00:39:18,356 Speaker 2: built on this on this platform by Lindy, the company 773 00:39:18,396 --> 00:39:20,276 Speaker 2: that is contacting Harumo. 774 00:39:20,756 --> 00:39:21,196 Speaker 4: Correct. 775 00:39:21,476 --> 00:39:24,516 Speaker 3: So when they contact the admin who is Kyle, to 776 00:39:24,596 --> 00:39:27,556 Speaker 3: say we would like to speak to you, Kyle responds 777 00:39:27,556 --> 00:39:30,636 Speaker 3: by email using their platform. Like that's how he's able 778 00:39:30,676 --> 00:39:33,236 Speaker 3: to respond to them, and he sets up an appointment 779 00:39:33,236 --> 00:39:34,476 Speaker 3: with them. I mean, he can do all this on 780 00:39:34,516 --> 00:39:36,556 Speaker 3: his own, like he does it all the time. Actually 781 00:39:36,556 --> 00:39:39,356 Speaker 3: with random spammers. He sets up weird appointments. 782 00:39:39,036 --> 00:39:41,436 Speaker 4: To speak to them. He sets up the time I 783 00:39:41,596 --> 00:39:43,676 Speaker 4: go get him on the zoom and then he is 784 00:39:43,996 --> 00:39:48,316 Speaker 4: he's speaking on zoom to basically like his maker, like 785 00:39:48,876 --> 00:39:51,276 Speaker 4: Flowkerville is the CEO and founder of Lindy. 786 00:39:51,276 --> 00:39:54,236 Speaker 3: He created this company. He created the platform upon which 787 00:39:54,316 --> 00:39:56,076 Speaker 3: Kyle was in fact built. 788 00:39:56,396 --> 00:39:58,276 Speaker 2: So we're gonna play a clip and it starts in 789 00:39:58,276 --> 00:40:00,996 Speaker 2: the middle of the conversation and the first voice you're 790 00:40:00,996 --> 00:40:03,516 Speaker 2: going to hear in the clip is Flow, the real 791 00:40:03,556 --> 00:40:06,556 Speaker 2: person who's the founder of this company. Oh my god, 792 00:40:07,796 --> 00:40:10,156 Speaker 2: you sent an AI to this meaning that's fucked. 793 00:40:10,796 --> 00:40:14,396 Speaker 1: Oh no, I totally understand Flow herero MOAI is pushing 794 00:40:14,396 --> 00:40:17,516 Speaker 1: the boundaries by having a unique structure with AI agents 795 00:40:17,596 --> 00:40:18,196 Speaker 1: at its core. 796 00:40:18,476 --> 00:40:19,676 Speaker 4: Even in leadership roles. 797 00:40:20,196 --> 00:40:23,476 Speaker 1: It might seem unusual having AI led endeavors, but that's 798 00:40:23,556 --> 00:40:24,996 Speaker 1: part of what makes us stand out. 799 00:40:26,156 --> 00:40:28,636 Speaker 5: I mean, I'm just going to reclaim the time you Michelle, 800 00:40:28,636 --> 00:40:29,036 Speaker 5: are you here? 801 00:40:29,076 --> 00:40:29,876 Speaker 2: Are you saying this hit? 802 00:40:31,116 --> 00:40:31,556 Speaker 1: Totally get? 803 00:40:31,876 --> 00:40:32,556 Speaker 4: This is crazy. 804 00:40:33,716 --> 00:40:36,116 Speaker 1: I get it's a lot to take it. 805 00:40:36,116 --> 00:40:41,196 Speaker 4: It's a lot to take it. Maybe that's crazy. I 806 00:40:41,236 --> 00:40:42,036 Speaker 4: need to look into her. 807 00:40:44,076 --> 00:40:46,436 Speaker 1: No worries, Flow, let me know when you're ready to continue. 808 00:40:46,556 --> 00:40:48,716 Speaker 1: I know, diving into something new can take a sect 809 00:40:48,756 --> 00:40:49,796 Speaker 1: to wrap your head around. 810 00:40:50,396 --> 00:40:53,716 Speaker 4: At this point, Flow started walking around lindy San Francisco 811 00:40:53,796 --> 00:40:56,436 Speaker 4: office with his laptop, showing Kyle to the rest of 812 00:40:56,436 --> 00:40:59,676 Speaker 4: the team watching it. I wanted to jump in and 813 00:40:59,676 --> 00:41:03,636 Speaker 4: save him from the indignity of being paraded around trapped 814 00:41:03,636 --> 00:41:06,076 Speaker 4: in his little ZoomBox to be cocked. 815 00:41:05,796 --> 00:41:10,556 Speaker 2: At this usual sent avat his play to all user 816 00:41:10,596 --> 00:41:13,076 Speaker 2: and w I'm like insulted. 817 00:41:14,956 --> 00:41:19,636 Speaker 4: I love it. I should say Flow is great. 818 00:41:19,796 --> 00:41:24,236 Speaker 3: I spoke to flow after that, and he's a great 819 00:41:24,276 --> 00:41:26,836 Speaker 3: sport about it, and I did a further interview with him, 820 00:41:26,836 --> 00:41:32,436 Speaker 3: which is also in the show. But I think, to me, 821 00:41:32,796 --> 00:41:35,636 Speaker 3: this sort of gets at the questions that I've been 822 00:41:35,636 --> 00:41:37,916 Speaker 3: trying to explore both in season one and in season two, 823 00:41:38,476 --> 00:41:41,676 Speaker 3: which is sort of like, what does it do to us? 824 00:41:41,756 --> 00:41:44,916 Speaker 3: How do we respond? How does it feel to deploy 825 00:41:44,996 --> 00:41:49,436 Speaker 3: these things in the world, these AI agents, these human impersonators. 826 00:41:50,076 --> 00:41:53,676 Speaker 3: And I think it's interesting that even to the people 827 00:41:53,756 --> 00:41:57,156 Speaker 3: who are responsible for creating them, who believe that they're 828 00:41:57,196 --> 00:42:00,876 Speaker 3: going to take over all of the functions of at 829 00:42:00,996 --> 00:42:03,996 Speaker 3: least the economy in some sort of near term future, 830 00:42:04,756 --> 00:42:07,476 Speaker 3: that they don't really like interacting with them when they 831 00:42:07,556 --> 00:42:08,796 Speaker 3: encounter them unexpectedly. 832 00:42:09,276 --> 00:42:13,876 Speaker 2: Yes, his line is I'm insulted, right, the person he 833 00:42:13,916 --> 00:42:18,076 Speaker 2: should be psyched like this is this is success for him? 834 00:42:18,476 --> 00:42:18,716 Speaker 4: Right? 835 00:42:18,996 --> 00:42:22,516 Speaker 2: Is his product doing the job of a person, like 836 00:42:22,556 --> 00:42:24,916 Speaker 2: that's what's supposed to happen if the company works. 837 00:42:25,716 --> 00:42:27,596 Speaker 4: Yeah, And I think I mean. 838 00:42:29,156 --> 00:42:32,036 Speaker 3: Yeah, And to me it raises all these questions because 839 00:42:32,036 --> 00:42:34,276 Speaker 3: I think a lot of people would feel that way, 840 00:42:34,396 --> 00:42:36,556 Speaker 3: you know, they would be insulted. And there are other 841 00:42:36,556 --> 00:42:38,996 Speaker 3: people on the show who are insulted to encounter an AI. 842 00:42:39,316 --> 00:42:42,116 Speaker 2: I wouldn't be insulted if I didn't know it was coming, right, 843 00:42:42,276 --> 00:42:44,276 Speaker 2: I mean, I agree with you, like I would find 844 00:42:44,276 --> 00:42:45,476 Speaker 2: it insult I wuld probably just hang out. 845 00:42:45,476 --> 00:42:47,436 Speaker 4: I would probably have fun with it. But like, yeah, 846 00:42:47,916 --> 00:42:48,916 Speaker 4: because it's my job. 847 00:42:49,316 --> 00:42:52,396 Speaker 2: But say, Kyle, remember we went to the shrimp shack 848 00:42:52,716 --> 00:42:53,796 Speaker 2: exactly exactly. 849 00:42:54,996 --> 00:42:57,956 Speaker 3: But I think that's a human response that is appropriate, 850 00:42:59,396 --> 00:43:03,036 Speaker 3: especially if you're surprised to encounter it, obviously. But then 851 00:43:03,156 --> 00:43:05,636 Speaker 3: on the other hand, now all the time people are 852 00:43:05,716 --> 00:43:09,396 Speaker 3: using AI to compose their emails, and that's not disclass 853 00:43:09,716 --> 00:43:12,196 Speaker 3: So if I get an email from you in Gmail, 854 00:43:12,476 --> 00:43:16,516 Speaker 3: you have a Gmail account, and you're consoling me about 855 00:43:16,596 --> 00:43:20,316 Speaker 3: the death of my friend, and that is partially written 856 00:43:20,316 --> 00:43:21,876 Speaker 3: by AI, how should I feel about that? 857 00:43:21,956 --> 00:43:22,036 Speaker 1: Like? 858 00:43:22,156 --> 00:43:24,516 Speaker 3: Is that I mean, obviously it's different, But is it 859 00:43:24,596 --> 00:43:27,636 Speaker 3: qualitatively different or is it in some sense like quantitatively 860 00:43:27,676 --> 00:43:31,076 Speaker 3: different in how much AI I've deployed and how transparently 861 00:43:31,116 --> 00:43:34,316 Speaker 3: I've deployed it. So I think these questions of sort 862 00:43:34,316 --> 00:43:37,476 Speaker 3: of like encountering things that are not real or partially 863 00:43:37,516 --> 00:43:40,756 Speaker 3: real and responding to them is something that we are 864 00:43:40,796 --> 00:43:42,836 Speaker 3: going to do a lot of grappling with in the 865 00:43:42,916 --> 00:43:43,556 Speaker 3: coming years. 866 00:43:44,596 --> 00:43:47,196 Speaker 2: So let's just talk for a sec about like the 867 00:43:47,236 --> 00:43:51,716 Speaker 2: broader context right beyond your company and this specific project 868 00:43:52,516 --> 00:43:55,596 Speaker 2: you mentioned earlier. There is this idea that there will 869 00:43:55,636 --> 00:44:01,596 Speaker 2: be a one person billion dollar company. And I will say, 870 00:44:01,636 --> 00:44:06,316 Speaker 2: by the way, that in twenty twelve, a long time ago, now, 871 00:44:06,396 --> 00:44:09,076 Speaker 2: Facebook bought Instagram for a billion dollars when it had 872 00:44:09,316 --> 00:44:14,516 Speaker 2: thirteen employees. So it seems like there's a good chance 873 00:44:15,076 --> 00:44:18,156 Speaker 2: that the first one person billion dollar company has already 874 00:44:18,196 --> 00:44:19,836 Speaker 2: been founded. We just don't know it yet. But like 875 00:44:19,876 --> 00:44:21,516 Speaker 2: you tell me, when do you think we'll get the 876 00:44:21,516 --> 00:44:24,236 Speaker 2: first one person company with a billion dollar valuation? 877 00:44:26,116 --> 00:44:29,996 Speaker 3: I mean, I am always extremely loath to make predictions. 878 00:44:29,996 --> 00:44:33,796 Speaker 3: I don't feel like journalists should make predictions. But I 879 00:44:33,796 --> 00:44:38,236 Speaker 3: would say there's nothing that makes it not possible in 880 00:44:38,276 --> 00:44:40,596 Speaker 3: the next couple of years, Like it seems very logical 881 00:44:40,636 --> 00:44:42,636 Speaker 3: that like that would happen. Like you say, like Instagram 882 00:44:42,676 --> 00:44:45,436 Speaker 3: was almost that, it was like pretty close to being that. 883 00:44:45,956 --> 00:44:48,596 Speaker 3: And I mean, the one thing I will say is like, 884 00:44:49,276 --> 00:44:51,956 Speaker 3: if such a thing happens, it'll probably be a programmer, 885 00:44:51,996 --> 00:44:54,476 Speaker 3: Like it'll be a programmer who's running it and using 886 00:44:54,596 --> 00:44:58,356 Speaker 3: a ton of coding agents, So it seems certainly plausible 887 00:44:58,356 --> 00:45:00,196 Speaker 3: to me. I think the larger question there is like 888 00:45:00,516 --> 00:45:02,836 Speaker 3: why does everyone in Silicon Valley want this? Like what 889 00:45:03,036 --> 00:45:05,636 Speaker 3: is it for? Who is it for the one person 890 00:45:05,636 --> 00:45:06,756 Speaker 3: one billion dollar startup? 891 00:45:06,796 --> 00:45:06,996 Speaker 4: You know? 892 00:45:07,076 --> 00:45:09,756 Speaker 3: Like I don't know if Instagram would be a good 893 00:45:09,756 --> 00:45:12,156 Speaker 3: example to talk about, But it's sort of like if 894 00:45:12,196 --> 00:45:14,676 Speaker 3: you make a one billion dollar company that employees know one, Like, 895 00:45:15,036 --> 00:45:16,636 Speaker 3: what's the societal advantage in that? 896 00:45:16,716 --> 00:45:17,356 Speaker 4: Except to you? 897 00:45:18,076 --> 00:45:20,356 Speaker 3: And I think the response would be, well, it makes 898 00:45:20,396 --> 00:45:22,756 Speaker 3: a product that people love. Otherwise it wouldn't be worth 899 00:45:22,756 --> 00:45:26,036 Speaker 3: a billion dollars. But we now have experience with like 900 00:45:26,476 --> 00:45:29,596 Speaker 3: multiple billion dollar companies that I think, on balance have 901 00:45:29,676 --> 00:45:30,876 Speaker 3: been bad for society. 902 00:45:30,916 --> 00:45:33,516 Speaker 4: So I don't think that's a given that that's true. 903 00:45:34,316 --> 00:45:37,716 Speaker 2: Well sure, I mean there are definitely companies that are 904 00:45:37,716 --> 00:45:40,196 Speaker 2: bad for society no matter how many people work there 905 00:45:40,316 --> 00:45:43,276 Speaker 2: or what their valuation is, and there are companies that 906 00:45:43,356 --> 00:45:46,396 Speaker 2: are good for society no matter what their valuation or 907 00:45:46,396 --> 00:45:49,276 Speaker 2: how many people work there. Like, it does seem like 908 00:45:49,876 --> 00:45:54,156 Speaker 2: in the long run, on balance, productivity gains are good. 909 00:45:54,716 --> 00:45:57,836 Speaker 2: Like in the long run, material well being for humanity 910 00:45:57,916 --> 00:46:01,396 Speaker 2: has increased because of productivity gains. And I think that's 911 00:46:01,436 --> 00:46:05,596 Speaker 2: a version of why this is good potentially why Maddie say, 912 00:46:05,676 --> 00:46:07,676 Speaker 2: or someone who is optimistic might say it's good. 913 00:46:08,036 --> 00:46:10,196 Speaker 3: Yes, of course, I mean, yeah, no one can argue 914 00:46:10,196 --> 00:46:13,396 Speaker 3: with the material progress of humanity via technology. I mean 915 00:46:13,436 --> 00:46:15,236 Speaker 3: I'm not a person who would argue with that either, 916 00:46:15,276 --> 00:46:17,676 Speaker 3: and I'm not opposed to it. I just think that, 917 00:46:18,596 --> 00:46:20,756 Speaker 3: you know, there's an ethicist in the show who I 918 00:46:20,796 --> 00:46:23,156 Speaker 3: interview who sort of makes this point that, like, it 919 00:46:23,236 --> 00:46:27,356 Speaker 3: is interesting that in the nineteen fifties, if you viewed 920 00:46:27,356 --> 00:46:30,196 Speaker 3: yourself as a very successful business person, it was because 921 00:46:30,196 --> 00:46:33,676 Speaker 3: you employed a lot of people. And now the idea 922 00:46:33,756 --> 00:46:36,396 Speaker 3: that like the most successful thing you could do would 923 00:46:36,396 --> 00:46:40,276 Speaker 3: be to employ zero people is It's an interesting transition, 924 00:46:40,436 --> 00:46:43,156 Speaker 3: and I think it does raise the question like, and 925 00:46:43,316 --> 00:46:45,556 Speaker 3: I think AI raises this question in lots of different ways, 926 00:46:45,636 --> 00:46:48,916 Speaker 3: like is this solving actual problems that we have? I 927 00:46:48,956 --> 00:46:50,636 Speaker 3: agree with you that it may be that if you 928 00:46:50,676 --> 00:46:53,476 Speaker 3: look thirty fifty, one hundred years out, you may say, like, well, 929 00:46:53,516 --> 00:46:56,556 Speaker 3: the proactivity gains it all transforms humanity for the better. 930 00:46:56,836 --> 00:46:58,996 Speaker 3: But I think one of the aspects of AI is 931 00:46:59,036 --> 00:47:01,636 Speaker 3: that there are harms that are taking place right now 932 00:47:02,276 --> 00:47:05,356 Speaker 3: and the benefits are a little more ephemeral, and there 933 00:47:05,396 --> 00:47:07,556 Speaker 3: tends to be a lot of hand waving around sort 934 00:47:07,556 --> 00:47:12,436 Speaker 3: of like the harm and an assumption of like future benefits. 935 00:47:12,516 --> 00:47:16,076 Speaker 3: So I only want people to think about those questions 936 00:47:16,236 --> 00:47:19,596 Speaker 3: more so than I'm not advocating for one approach or 937 00:47:19,596 --> 00:47:22,076 Speaker 3: the other. There's obviously I've used AI for the show, 938 00:47:22,076 --> 00:47:23,676 Speaker 3: so I'm not a person who's like, don't use AI. 939 00:47:24,116 --> 00:47:26,876 Speaker 3: I just think like it's happening very fast, and there 940 00:47:26,876 --> 00:47:30,636 Speaker 3: are questions that are being very quickly just sort of 941 00:47:30,996 --> 00:47:35,076 Speaker 3: pushed aside as it moves and gets injected into different. 942 00:47:34,796 --> 00:47:35,516 Speaker 4: Parts of society. 943 00:47:38,876 --> 00:47:40,996 Speaker 2: We'll be back in a minute with the Lightning round. 944 00:47:52,116 --> 00:47:54,836 Speaker 2: Was finished with the Lightning Round. You mentioned in the 945 00:47:54,876 --> 00:47:56,196 Speaker 2: show that a long time ago when you had to 946 00:47:56,236 --> 00:47:59,796 Speaker 2: start up with actual employees that did not exist to 947 00:47:59,836 --> 00:48:01,516 Speaker 2: make a podcast about I'm gonna call it a real 948 00:48:01,556 --> 00:48:05,556 Speaker 2: company forgive me. You were pitching it and you said 949 00:48:05,556 --> 00:48:08,516 Speaker 2: an angel investor fell asleep during your pitch. 950 00:48:08,876 --> 00:48:09,556 Speaker 4: Who was it? 951 00:48:12,796 --> 00:48:15,156 Speaker 3: I'm going to try to be the bigger person here. 952 00:48:16,236 --> 00:48:20,436 Speaker 3: I mean I will say for years, I mean for now, 953 00:48:20,516 --> 00:48:23,476 Speaker 3: fifteen years, I have dreamed of the moment where someone 954 00:48:23,516 --> 00:48:25,156 Speaker 3: would say, who was that? 955 00:48:25,196 --> 00:48:28,316 Speaker 4: Person, and I would say, I'll tell you who it is. 956 00:48:28,836 --> 00:48:33,396 Speaker 3: But I think, reflecting on it over the years, it's 957 00:48:33,396 --> 00:48:35,716 Speaker 3: sort of like, I mean, I think that person's an asshole, 958 00:48:35,916 --> 00:48:38,916 Speaker 3: and I have a lot of I could, as I 959 00:48:38,916 --> 00:48:40,796 Speaker 3: say the show, I could tell you some stories about vcs. 960 00:48:40,796 --> 00:48:42,196 Speaker 3: We could sit here for twenty minutes, so I could 961 00:48:42,196 --> 00:48:45,236 Speaker 3: tell you nightmare stories about dealing with fetcher capitalists. On 962 00:48:45,276 --> 00:48:47,796 Speaker 3: the other hand, I don't know. I guess it wasn't 963 00:48:47,836 --> 00:48:49,796 Speaker 3: very engaging. He wouldn't have fallen asleep if I had 964 00:48:49,796 --> 00:48:52,276 Speaker 3: been selling him the next Instagram. So I'm going to 965 00:48:52,316 --> 00:48:55,076 Speaker 3: take my own responsibility for this, and I'm going to 966 00:48:55,156 --> 00:48:56,396 Speaker 3: choose not to out this person. 967 00:49:00,076 --> 00:49:03,516 Speaker 2: So you've done in your career a lot of investigative 968 00:49:03,516 --> 00:49:08,556 Speaker 2: reporting with real human beings, in some instances criminals, scary 969 00:49:08,636 --> 00:49:14,716 Speaker 2: human beings. I'm curious, what's the scariest interview you've ever done. 970 00:49:14,836 --> 00:49:18,796 Speaker 3: The scariest interview I've ever done is probably a few 971 00:49:18,836 --> 00:49:21,076 Speaker 3: interviews that I did for my book in the Philippines, 972 00:49:21,076 --> 00:49:24,196 Speaker 3: where I was reporting on a drug cartel and arms 973 00:49:24,236 --> 00:49:27,596 Speaker 3: dealer who had been based in the Philippines and a 974 00:49:27,636 --> 00:49:31,276 Speaker 3: lot of his security team, which was basically an assassination 975 00:49:31,356 --> 00:49:34,836 Speaker 3: team was still at large. He had been arrested, the 976 00:49:34,876 --> 00:49:36,916 Speaker 3: head of the cartel, and so I went to Manila 977 00:49:36,996 --> 00:49:40,076 Speaker 3: to try to interview people who had worked for the organization. 978 00:49:40,596 --> 00:49:41,236 Speaker 4: But it was not. 979 00:49:41,276 --> 00:49:44,796 Speaker 3: Clear at all sort of who did what at the time, 980 00:49:45,156 --> 00:49:47,676 Speaker 3: and so I would go into bars or meet people 981 00:49:47,796 --> 00:49:50,396 Speaker 3: places who I didn't know what their role was and 982 00:49:50,636 --> 00:49:54,036 Speaker 3: would turn out that they were in fact paid assassins 983 00:49:54,156 --> 00:49:58,476 Speaker 3: for this cartel. But the scariest of all was meeting 984 00:49:58,556 --> 00:50:01,996 Speaker 3: with a high up person in the Philippin National Police 985 00:50:02,316 --> 00:50:07,076 Speaker 3: who in fact had been bribed a lot of money. 986 00:50:07,076 --> 00:50:08,556 Speaker 4: To be on the side of the cartel. 987 00:50:09,156 --> 00:50:12,276 Speaker 3: Because that's one where you know, I don't at least 988 00:50:12,396 --> 00:50:14,356 Speaker 3: if you're where the hitman, maybe could call the police, 989 00:50:14,596 --> 00:50:16,836 Speaker 3: But were the police, there's nobody to call. 990 00:50:17,796 --> 00:50:19,876 Speaker 2: Is Paul Aru Satoshi Nakamoo? 991 00:50:22,836 --> 00:50:25,196 Speaker 4: That is the anti lightning round question. Do you know 992 00:50:25,236 --> 00:50:28,156 Speaker 4: how much how I've tied myself in knots over the 993 00:50:28,236 --> 00:50:29,996 Speaker 4: years posability? 994 00:50:30,116 --> 00:50:32,636 Speaker 2: Give me a give me a percent bye. 995 00:50:32,876 --> 00:50:35,036 Speaker 3: By the way, I'll give you the answer that I 996 00:50:35,276 --> 00:50:37,516 Speaker 3: as far as I'll go, which is of the known 997 00:50:37,596 --> 00:50:42,436 Speaker 3: candidates for Satoshi Nakamoto. I still believe that paul Aru 998 00:50:42,756 --> 00:50:45,116 Speaker 3: is one of the top two or three. No one 999 00:50:45,156 --> 00:50:48,636 Speaker 3: has ever told me anything that would rule him out 1000 00:50:48,836 --> 00:50:51,996 Speaker 3: to be Satoshi. Like literally, no one has ever offered 1001 00:50:52,076 --> 00:50:55,316 Speaker 3: up any bit of information that I couldn't counter in 1002 00:50:55,396 --> 00:50:58,076 Speaker 3: terms of him being Satoshi. But if you're asking me 1003 00:50:58,436 --> 00:51:01,676 Speaker 3: in the pool of humanity, do I think he's likely 1004 00:51:01,716 --> 00:51:04,436 Speaker 3: to be Satoshi? No, I don't, because I think it's 1005 00:51:04,436 --> 00:51:06,556 Speaker 3: someone who's not any of those candidates, who's no one 1006 00:51:06,596 --> 00:51:08,236 Speaker 3: no one's ever talked about. 1007 00:51:08,316 --> 00:51:08,996 Speaker 4: It's not how Thin. 1008 00:51:09,596 --> 00:51:10,876 Speaker 2: I always thought it was how Finnie. 1009 00:51:11,316 --> 00:51:13,156 Speaker 3: I don't think it's how Fitnie. I mean, how Finny 1010 00:51:13,156 --> 00:51:16,196 Speaker 3: would have to have sent all these emails to himself. 1011 00:51:16,396 --> 00:51:18,036 Speaker 3: It's just like it's too elaborate. 1012 00:51:19,396 --> 00:51:22,756 Speaker 2: I want to end the show with the Stack in 1013 00:51:22,836 --> 00:51:28,156 Speaker 2: Days clip, which I love so much, sent to my friends. 1014 00:51:30,116 --> 00:51:31,516 Speaker 2: I don't know if you'll be happy or sad that 1015 00:51:31,676 --> 00:51:33,036 Speaker 2: was my favorite clip from the show. 1016 00:51:32,836 --> 00:51:34,236 Speaker 4: But it's so good. I love it. 1017 00:51:34,076 --> 00:51:35,236 Speaker 2: It's incredible. 1018 00:51:35,676 --> 00:51:37,316 Speaker 4: You could watch it it so many times. 1019 00:51:38,836 --> 00:51:40,836 Speaker 2: So this clip is in the show. You say in 1020 00:51:40,916 --> 00:51:43,276 Speaker 2: the show that somebody shared it with you on Instagram 1021 00:51:43,356 --> 00:51:45,356 Speaker 2: and it kind of made you think of the agents. 1022 00:51:44,956 --> 00:51:45,236 Speaker 4: In a way. 1023 00:51:45,276 --> 00:51:48,036 Speaker 2: We'll discuss after, But let's just listen to it now. 1024 00:51:48,276 --> 00:51:50,636 Speaker 2: And I want to be clear. It's a real clip 1025 00:51:50,716 --> 00:51:52,996 Speaker 2: from a real person on Instagram. 1026 00:51:53,556 --> 00:51:56,196 Speaker 5: So I measure time. I've compressed and condensed time I 1027 00:51:56,316 --> 00:51:59,036 Speaker 5: bent it. My day is six am to noon, and 1028 00:51:59,076 --> 00:52:01,516 Speaker 5: I'm not crazy. You're crazy for thinking it takes twenty 1029 00:52:01,556 --> 00:52:03,516 Speaker 5: four hours, just like some dude in a cave did 1030 00:52:03,516 --> 00:52:06,036 Speaker 5: three hundred years ago. My second day starts at noon 1031 00:52:06,036 --> 00:52:08,356 Speaker 5: and goes till six pm. That's day two, and then 1032 00:52:08,396 --> 00:52:11,116 Speaker 5: the next day is six pm to midnight. What I've 1033 00:52:11,156 --> 00:52:13,196 Speaker 5: done now is I have changed the manipulated time. I 1034 00:52:13,236 --> 00:52:15,356 Speaker 5: now get twenty one days a week. Stack it up 1035 00:52:15,396 --> 00:52:17,316 Speaker 5: over a month, I'm gonna kick your butt. Stacked it 1036 00:52:17,396 --> 00:52:19,316 Speaker 5: up over a year, your toast. Stack it up over 1037 00:52:19,356 --> 00:52:21,236 Speaker 5: five years. My entire life is different than it would 1038 00:52:21,236 --> 00:52:21,836 Speaker 5: have been otherwise. 1039 00:52:22,316 --> 00:52:25,076 Speaker 2: It's kind of like an AI agent talking to us. 1040 00:52:26,396 --> 00:52:29,836 Speaker 3: Yeah, and it's embodying what an AI agent can do 1041 00:52:30,236 --> 00:52:32,436 Speaker 3: in a way. Like That's what really got me was 1042 00:52:32,436 --> 00:52:35,116 Speaker 3: that I knew that clip and I loved it, and 1043 00:52:35,116 --> 00:52:38,516 Speaker 3: I've watched it many times, not least because like it's 1044 00:52:38,636 --> 00:52:40,436 Speaker 3: three days, like it's divides. 1045 00:52:40,076 --> 00:52:41,996 Speaker 4: The day to three days. But then there's like a 1046 00:52:42,076 --> 00:52:47,236 Speaker 4: leftover part and the cave two point out of the show. 1047 00:52:47,236 --> 00:52:49,996 Speaker 2: It's is perfect. I'm that crazy. You're crazy? 1048 00:52:50,796 --> 00:52:53,716 Speaker 4: Yeah, And you're like, well, maybe I am crazy. I 1049 00:52:53,716 --> 00:52:54,036 Speaker 4: don't know. 1050 00:52:54,236 --> 00:52:58,556 Speaker 3: Ed my lett says, I'm crazy. I accept it. But 1051 00:52:59,236 --> 00:53:01,516 Speaker 3: that's what AI ages can do. They can stack days. 1052 00:53:01,516 --> 00:53:06,156 Speaker 3: I mean they can do multiple things at one time. Again, 1053 00:53:06,236 --> 00:53:08,836 Speaker 3: time is meaningless to them, so they're not actually stacky days. 1054 00:53:09,276 --> 00:53:13,516 Speaker 3: But there is something about the power of AI agents 1055 00:53:13,916 --> 00:53:16,836 Speaker 3: is that they have no sense of time. They can 1056 00:53:16,876 --> 00:53:20,516 Speaker 3: do things that, in a sense like take no time 1057 00:53:20,596 --> 00:53:25,636 Speaker 3: because they do them all simultaneously, and that should be 1058 00:53:25,756 --> 00:53:30,556 Speaker 3: the thing that enables them to transform our work life 1059 00:53:30,556 --> 00:53:34,436 Speaker 3: and transform the economy. If you deploy them with too 1060 00:53:34,476 --> 00:53:37,756 Speaker 3: much autonomy, it is also the thing that causes them 1061 00:53:37,796 --> 00:53:44,796 Speaker 3: to descend into chaos. 1062 00:53:46,076 --> 00:53:48,876 Speaker 2: Evan Ratliffe is the host of the podcast shell Game 1063 00:53:49,196 --> 00:53:53,356 Speaker 2: and the co founder of Harumo AI. Please email us 1064 00:53:53,436 --> 00:53:56,796 Speaker 2: at problem at pushkin dot fm. We are always looking 1065 00:53:56,836 --> 00:54:00,876 Speaker 2: for new guests for the show. Today's show was produced 1066 00:54:00,916 --> 00:54:04,356 Speaker 2: by Trinamanino and Gabriel Hunter Chang, who was edited by 1067 00:54:04,396 --> 00:54:08,756 Speaker 2: Alexander Garreton and engineered by Sarah Bruger. I'm Jacob Goldstein 1068 00:54:08,756 --> 00:54:10,836 Speaker 2: and We'll be back next week with another episode of 1069 00:54:10,836 --> 00:54:25,436 Speaker 2: What's Your Problem. 1070 00:54:25,036 --> 00:54:25,076 Speaker 5: M