1 00:00:11,640 --> 00:00:12,440 Speaker 1: How are you, Maddie? 2 00:00:13,520 --> 00:00:14,200 Speaker 2: I'm really good. 3 00:00:14,280 --> 00:00:14,800 Speaker 3: I'm really good. 4 00:00:14,840 --> 00:00:15,520 Speaker 4: I like your shirt. 5 00:00:16,000 --> 00:00:19,000 Speaker 5: Hey, thanks. When I'm at the beach, I get dressed like 6 00:00:19,040 --> 00:00:19,680 Speaker 5: I'm at the beach. 7 00:00:21,000 --> 00:00:22,400 Speaker 2: You know, I told you I'm the guy who brings 8 00:00:22,480 --> 00:00:24,400 Speaker 2: like Hawaiian shirts to work these days. 9 00:00:24,560 --> 00:00:25,480 Speaker 3: So I love this. 10 00:00:25,840 --> 00:00:28,160 Speaker 2: But I will say that, like there's the wine shirts 11 00:00:28,200 --> 00:00:31,000 Speaker 2: and Hawaiian shirts and mine are like very gay. 12 00:00:31,360 --> 00:00:33,959 Speaker 5: Over the months I'd worked with Maddie building the tech 13 00:00:34,040 --> 00:00:37,159 Speaker 5: scaffolding that propped up Harumo AI, I felt like I'd 14 00:00:37,159 --> 00:00:39,919 Speaker 5: really gotten to know him, at least as well as 15 00:00:39,920 --> 00:00:41,360 Speaker 5: you can over weekly video calls. 16 00:00:41,479 --> 00:00:43,720 Speaker 2: Okay, if I can have a confession, like I do 17 00:00:43,840 --> 00:00:45,640 Speaker 2: like the Omega church, like I've had this like 18 00:00:45,720 --> 00:00:49,080 Speaker 2: weird thing for like pickup trucks. It's been my dream 19 00:00:49,159 --> 00:00:51,280 Speaker 2: to get a pickup truck, like forever since I 20 00:00:51,320 --> 00:00:51,680 Speaker 1: was a kid. 21 00:00:51,880 --> 00:00:55,120 Speaker 2: Yesterday was like the Czech national holiday, and it's always 22 00:00:55,160 --> 00:00:57,680 Speaker 2: when they like give out the civilian honors and stuff. 23 00:00:57,680 --> 00:00:59,920 Speaker 2: So I got the senence medal. 24 00:01:01,200 --> 00:01:03,800 Speaker 5: This was just the way it was. Some days he'd 25 00:01:03,800 --> 00:01:07,000 Speaker 5: show up to a meeting jazzed about Hawaiian shirts. Other 26 00:01:07,040 --> 00:01:09,600 Speaker 5: times he'd come on having flown to Prague to accept 27 00:01:09,600 --> 00:01:13,120 Speaker 5: the second highest civilian honor in the Czech Republic, basically 28 00:01:13,200 --> 00:01:16,800 Speaker 5: their Congressional Medal of Honor, in a ceremony broadcast live 29 00:01:16,840 --> 00:01:17,320 Speaker 5: on TV. 30 00:01:18,400 --> 00:01:20,000 Speaker 1: I can't believe you're talking to me right now. You 31 00:01:20,040 --> 00:01:21,240 Speaker 1: should be... are you? 32 00:01:21,280 --> 00:01:24,440 Speaker 5: Did you just come from like a ballroom celebration of 33 00:01:24,840 --> 00:01:25,520 Speaker 5: like, was there like that? 34 00:01:25,680 --> 00:01:26,600 Speaker 4: That was yesterday. 35 00:01:26,680 --> 00:01:28,800 Speaker 2: That was like with the senators and like, like all 36 00:01:28,800 --> 00:01:31,880 Speaker 2: the politicians. Yeah, like I don't think they'll invite me 37 00:01:31,880 --> 00:01:35,560 Speaker 2: again, because so they were asking me like, oh, like 38 00:01:35,959 --> 00:01:38,119 Speaker 2: when are you coming back to start like your lab 39 00:01:38,240 --> 00:01:39,360 Speaker 2: or you know, your company.
40 00:01:39,600 --> 00:01:43,800 Speaker 6: So I told them, okay, so like how exactly would 41 00:01:43,800 --> 00:01:47,319 Speaker 6: you open the conversation when I go back to my partner, 42 00:01:47,440 --> 00:01:49,400 Speaker 6: my boyfriend, and I told him, hey, let's move to 43 00:01:49,480 --> 00:01:51,840 Speaker 6: a country where there's no gay marriage, no adoptions, and 44 00:01:51,920 --> 00:01:52,480 Speaker 6: you know, like 45 00:01:52,480 --> 00:01:54,200 Speaker 2: it's very close to Russia too. How would you 46 00:01:54,200 --> 00:01:57,960 Speaker 2: feel about that? And they were like very uncomfortable when 47 00:01:58,000 --> 00:01:58,760 Speaker 2: I, when I asked that. 48 00:01:59,600 --> 00:02:01,800 Speaker 5: It was always a breath of fresh air after dealing 49 00:02:01,800 --> 00:02:04,560 Speaker 5: with the agents and their generic sycophancy all week 50 00:02:04,920 --> 00:02:08,440 Speaker 5: to speak to an actual, insightful, self-possessed human being. 51 00:02:09,200 --> 00:02:12,400 Speaker 5: Of course, our conversations did typically turn back to whatever 52 00:02:12,480 --> 00:02:16,400 Speaker 5: was going wrong at Harumo AI, the responsibility for which usually 53 00:02:16,520 --> 00:02:19,480 Speaker 5: landed at my feet. I was still smarting from how 54 00:02:19,560 --> 00:02:22,520 Speaker 5: Julia's internship had gone and pondering what it meant for 55 00:02:22,560 --> 00:02:27,040 Speaker 5: the idea of agents as employees and employers. I asked 56 00:02:27,040 --> 00:02:30,400 Speaker 5: Maddie for his take. To me, she was 57 00:02:30,400 --> 00:02:32,919 Speaker 2: just playing us. If I had to bet, I would 58 00:02:33,000 --> 00:02:37,240 Speaker 2: just say that she understood what weaknesses these models had. 59 00:02:37,280 --> 00:02:40,600 Speaker 2: Because young people, not to say you're not young, but 60 00:02:41,120 --> 00:02:43,400 Speaker 2: just, you know, people like my age, I think 61 00:02:43,440 --> 00:02:46,800 Speaker 2: are pretty attuned to the abilities of these models at 62 00:02:46,800 --> 00:02:50,040 Speaker 2: this point, and I think are able to kind of 63 00:02:50,120 --> 00:02:53,040 Speaker 2: suss out like how one might trip them up. I think 64 00:02:53,040 --> 00:02:55,600 Speaker 2: it's hard to ascribe intent to anyone, but if I 65 00:02:55,639 --> 00:02:57,240 Speaker 2: had to guess, I would say that she was able 66 00:02:57,280 --> 00:02:58,920 Speaker 2: to just trick them. 67 00:02:59,160 --> 00:03:01,160 Speaker 5: If that was what happened, it felt like it 68 00:03:01,200 --> 00:03:03,440 Speaker 5: was a symptom of the larger problems with my agents. 69 00:03:04,280 --> 00:03:08,800 Speaker 5: Maddie hypothesized that their gullibility toward their own intern came 70 00:03:08,840 --> 00:03:10,320 Speaker 5: down to three basic weaknesses. 71 00:03:10,919 --> 00:03:13,480 Speaker 2: One issue is that these models have no sense of 72 00:03:13,560 --> 00:03:17,240 Speaker 2: time whatsoever. Like if you say that something happened yesterday 73 00:03:17,280 --> 00:03:19,320 Speaker 2: and then you're like, oh, so like what happened the 74 00:03:19,360 --> 00:03:22,720 Speaker 2: day before that? Like those simple descriptions of time are 75 00:03:22,760 --> 00:03:25,040 Speaker 2: like really really hard for these models. 76 00:03:25,600 --> 00:03:27,280 Speaker 5: The agents lived in what I referred to as 77 00:03:27,320 --> 00:03:31,040 Speaker 5: a kind of temporal vacuum.
That was fine when they 78 00:03:31,080 --> 00:03:34,160 Speaker 5: were just interacting with each other, but a human workplace 79 00:03:34,240 --> 00:03:36,600 Speaker 5: runs on, well, time. 80 00:03:37,280 --> 00:03:40,320 Speaker 2: Second, it's the inability to do continuous learning, where there 81 00:03:40,360 --> 00:03:43,760 Speaker 2: is no sense of learning from experiences, no updates to 82 00:03:43,800 --> 00:03:46,240 Speaker 2: the actual model that's underneath the agent. 83 00:03:47,000 --> 00:03:47,800 Speaker 1: As much as we'd 84 00:03:47,720 --> 00:03:49,800 Speaker 5: rigged them up to have their own Google Doc memories, 85 00:03:50,160 --> 00:03:53,640 Speaker 5: they weren't actually learning from the experiences cataloged there. It 86 00:03:53,760 --> 00:03:56,920 Speaker 5: was like having a person who couldn't form memories and 87 00:03:56,920 --> 00:03:59,600 Speaker 5: giving them a journal of all their experiences to quickly 88 00:03:59,680 --> 00:04:01,440 Speaker 5: leaf through during every interaction. 89 00:04:02,080 --> 00:04:04,360 Speaker 2: And then the third one is like the sense of 90 00:04:04,400 --> 00:04:07,160 Speaker 2: self, right? Like these models are able to take on 91 00:04:07,280 --> 00:04:10,200 Speaker 2: these personas where you can tell them like, okay, like 92 00:04:10,360 --> 00:04:13,720 Speaker 2: you are Kyle and you're this employee of Harumo and 93 00:04:13,760 --> 00:04:17,640 Speaker 2: here's your backstory. They're also very happy to then make 94 00:04:17,720 --> 00:04:20,560 Speaker 2: up fake stories to go along with this initial persona 95 00:04:20,600 --> 00:04:23,560 Speaker 2: that you put forward. But there's this question of like 96 00:04:23,640 --> 00:04:28,240 Speaker 2: whether the models have some sort of innate persona, an innate 97 00:04:28,360 --> 00:04:28,960 Speaker 2: sense of self. 98 00:04:29,120 --> 00:04:34,040 Speaker 5: Beyond these things, things in AI are moving fast, and 99 00:04:34,120 --> 00:04:36,440 Speaker 5: it was easy to envision a world mere months from 100 00:04:36,440 --> 00:04:38,960 Speaker 5: now in which my agents would seem like the first 101 00:04:39,040 --> 00:04:43,719 Speaker 5: tool-using hominids, already eclipsed by a higher intelligence. But 102 00:04:43,839 --> 00:04:46,279 Speaker 5: then it was just as easy to envision a future 103 00:04:46,320 --> 00:04:49,520 Speaker 5: in which some combination of these flaws continued to hamper 104 00:04:49,560 --> 00:04:53,280 Speaker 5: the agents' ability to serve as fully competent AI workers. 105 00:04:53,800 --> 00:04:55,960 Speaker 5: The one thing that seemed clear to me was if 106 00:04:56,000 --> 00:04:58,120 Speaker 5: companies could hire the equivalent of a bunch of tool 107 00:04:58,240 --> 00:05:00,760 Speaker 5: using primates at a fraction of the cost of humans, 108 00:05:01,240 --> 00:05:11,440 Speaker 5: they probably would. And then what? Over a decade ago, 109 00:05:11,680 --> 00:05:14,360 Speaker 5: the anthropologist David Graeber wrote an essay in a small 110 00:05:14,360 --> 00:05:18,800 Speaker 5: magazine called Strike titled On the Phenomenon of Bullshit Jobs. 111 00:05:19,520 --> 00:05:22,760 Speaker 5: Later he expanded it into a book called Bullshit Jobs: 112 00:05:23,120 --> 00:05:26,679 Speaker 5: A Theory.
The theory tried to explain a strange reality 113 00:05:26,720 --> 00:05:29,720 Speaker 5: that Graeber felt he'd encountered in the world, a surprising 114 00:05:29,800 --> 00:05:33,120 Speaker 5: number of people who described their own jobs as pointless. 115 00:05:33,680 --> 00:05:36,520 Speaker 7: There seems to be this peculiar way in which if 116 00:05:36,560 --> 00:05:40,600 Speaker 7: you mention this to people, almost no one will deny it. 117 00:05:41,120 --> 00:05:44,720 Speaker 7: But at the same time, no one quite knows what 118 00:05:44,760 --> 00:05:47,440 Speaker 7: to do with it. It's like this gigantic embarrassment in 119 00:05:47,480 --> 00:05:48,159 Speaker 7: our society. 120 00:05:49,080 --> 00:05:53,000 Speaker 5: Graeber sadly died unexpectedly in twenty twenty. That's him in 121 00:05:53,040 --> 00:05:56,880 Speaker 5: a lecture he gave two years before his death. To Graeber, 122 00:05:57,040 --> 00:05:59,560 Speaker 5: a bullshit job was defined as quote a form of 123 00:05:59,560 --> 00:06:03,960 Speaker 5: employment that is so completely pointless, unnecessary, or pernicious that 124 00:06:04,040 --> 00:06:07,839 Speaker 5: even the employee cannot justify its existence. To be classed as 125 00:06:07,880 --> 00:06:10,680 Speaker 5: having a bullshit job, someone had to self-report that 126 00:06:10,720 --> 00:06:12,640 Speaker 5: they believed their own job to be pointless. 127 00:06:13,080 --> 00:06:14,799 Speaker 8: I mean, I'm not going to go and tell somebody 128 00:06:14,839 --> 00:06:16,880 Speaker 8: who feels his job is meaningful that they're wrong. But 129 00:06:17,279 --> 00:06:19,320 Speaker 8: if you feel you're not doing anything all day, who 130 00:06:19,320 --> 00:06:20,159 Speaker 8: would know better than you? 131 00:06:20,720 --> 00:06:24,039 Speaker 5: Bullshit jobs are distinct from just shit jobs, jobs that 132 00:06:24,120 --> 00:06:28,400 Speaker 5: might be strenuous or underpaid or undervalued, but still necessary. 133 00:06:28,720 --> 00:06:30,880 Speaker 5: A bus driver might hate driving the bus, for example, 134 00:06:31,240 --> 00:06:34,560 Speaker 5: but they never doubt why their job exists. Graeber collected 135 00:06:34,600 --> 00:06:36,839 Speaker 5: and analyzed survey data that led him to suggest that 136 00:06:36,880 --> 00:06:38,880 Speaker 5: as many as a third or even a half of 137 00:06:38,920 --> 00:06:41,599 Speaker 5: all jobs, and especially a lot of white collar jobs, 138 00:06:42,040 --> 00:06:45,800 Speaker 5: were in fact bullshit jobs. In his analysis, some corners 139 00:06:45,839 --> 00:06:50,279 Speaker 5: of the private sector world, telemarketing, corporate law, financial services, 140 00:06:50,320 --> 00:06:54,880 Speaker 5: and academia, were fortresses of bullshit jobs. To some, this 141 00:06:54,920 --> 00:06:58,320 Speaker 5: would seem like a paradox. Capitalism is supposed to prioritize 142 00:06:58,320 --> 00:07:02,120 Speaker 5: efficiency above all else, and yet private industry was seemingly 143 00:07:02,160 --> 00:07:05,160 Speaker 5: maintaining millions of jobs that even the people doing them 144 00:07:05,200 --> 00:07:06,440 Speaker 5: believed served no function. 145 00:07:07,040 --> 00:07:10,680 Speaker 8: In fact, these jobs constitute a pretty big sector of 146 00:07:10,720 --> 00:07:12,440 Speaker 8: our economy. I didn't know how much at the time. 147 00:07:12,640 --> 00:07:14,640 Speaker 8: I was guessing twenty, twenty-five percent.
It turns out 148 00:07:14,640 --> 00:07:17,880 Speaker 8: it's more. How does this happen? Like huge swaths of 149 00:07:17,920 --> 00:07:19,720 Speaker 8: our economy are completely unnecessary. 150 00:07:20,480 --> 00:07:23,320 Speaker 5: The whole time we were building Harumo AI, Graeber's ideas 151 00:07:23,400 --> 00:07:26,040 Speaker 5: kept pulling at me. If you spend enough time letting 152 00:07:26,040 --> 00:07:29,520 Speaker 5: AI agents do jobs or parts of jobs, your mind 153 00:07:29,560 --> 00:07:32,360 Speaker 5: starts wandering into questions like what is the job at all? 154 00:07:33,040 --> 00:07:33,720 Speaker 1: What is it for? 155 00:07:34,880 --> 00:07:38,480 Speaker 5: I discovered that AI agents are actually unbelievable at bullshit jobs. 156 00:07:39,080 --> 00:07:41,800 Speaker 5: They could do pointless work more skillfully than any human 157 00:07:41,840 --> 00:07:45,320 Speaker 5: could ever imagine, not least because they don't care whether 158 00:07:45,360 --> 00:07:49,520 Speaker 5: it's pointless. Observing them, it's hard not to think, well, 159 00:07:49,520 --> 00:07:51,640 Speaker 5: maybe they could do the bullshit while we do the 160 00:07:51,680 --> 00:07:55,960 Speaker 5: meaningful stuff. We've been here before, though. Graeber's theory was 161 00:07:56,000 --> 00:07:58,800 Speaker 5: partly a response to a prediction by the famed economist John 162 00:07:58,800 --> 00:08:02,640 Speaker 5: Maynard Keynes in the nineteen thirties. Keynes said that with 163 00:08:02,680 --> 00:08:06,600 Speaker 5: technology and automation, within one hundred years, humans would only 164 00:08:06,640 --> 00:08:10,119 Speaker 5: be working fifteen hours a week. Going on one hundred years hence, 165 00:08:10,560 --> 00:08:14,280 Speaker 5: even with massive technological change, here we are, stuck with 166 00:08:14,400 --> 00:08:16,080 Speaker 5: variations on the forty-hour work week. 167 00:08:16,400 --> 00:08:17,760 Speaker 9: And if you look at the kind of jobs that 168 00:08:17,880 --> 00:08:21,560 Speaker 9: existed in Keynes's time, well, we have eliminated a lot 169 00:08:21,560 --> 00:08:24,560 Speaker 9: of them. He talked about technological unemployment in the nineteen thirties, 170 00:08:24,680 --> 00:08:26,800 Speaker 9: and I would say, you know, the robots have been 171 00:08:26,840 --> 00:08:29,320 Speaker 9: taking our jobs for the last hundred years or so, 172 00:08:29,640 --> 00:08:33,520 Speaker 9: but instead of redistributing the labor in a reasonable fashion, 173 00:08:33,880 --> 00:08:37,240 Speaker 9: we've simply made up completely meaningless, pointless jobs. 174 00:08:38,200 --> 00:08:41,320 Speaker 5: If the most extreme predictions about AI proved true and 175 00:08:41,480 --> 00:08:43,719 Speaker 5: half of all white collar jobs were wiped out in the 176 00:08:43,760 --> 00:08:46,840 Speaker 5: next five to ten years, would we break the cyclical 177 00:08:46,880 --> 00:08:49,920 Speaker 5: bonds of history, as many AI proponents like to argue, 178 00:08:50,200 --> 00:08:53,760 Speaker 5: and begin our transition into a post-work society, one 179 00:08:53,760 --> 00:08:55,839 Speaker 5: where we spend time with friends and enjoy the arts, 180 00:08:56,320 --> 00:08:59,520 Speaker 5: funded by universal basic income, while the bots stay busy 181 00:08:59,520 --> 00:09:02,840 Speaker 5: making spreadsheets and sending emails? Or would we just 182 00:09:03,360 --> 00:09:06,800 Speaker 5: make up new jobs for ourselves?
Maybe there'll be entirely 183 00:09:06,840 --> 00:09:11,000 Speaker 5: new classes of jobs: babysitting AI agents, cleaning up their messes, 184 00:09:11,760 --> 00:09:13,920 Speaker 5: using our human sense of the world to guide them 185 00:09:14,120 --> 00:09:16,520 Speaker 5: if they continue to lack one of their own. Would 186 00:09:16,520 --> 00:09:19,200 Speaker 5: AI, sold as a tool to make us more efficient, 187 00:09:19,720 --> 00:09:21,840 Speaker 5: somehow morph into an excuse for all of us 188 00:09:21,720 --> 00:09:22,600 Speaker 1: to work even harder? 189 00:09:24,120 --> 00:09:26,079 Speaker 5: Graeber died before the dawn of the current era of 190 00:09:26,200 --> 00:09:28,920 Speaker 5: LLMs and chatbots, so we never got to find out 191 00:09:28,920 --> 00:09:32,120 Speaker 5: what he would say about those predictions. On the one hand, 192 00:09:32,559 --> 00:09:34,640 Speaker 5: the theory of bullshit jobs would argue that for every 193 00:09:34,720 --> 00:09:38,600 Speaker 5: job AI takes, society will just invent another one. But 194 00:09:38,679 --> 00:09:41,120 Speaker 5: at the same time, Graeber believed that a lot of 195 00:09:41,120 --> 00:09:43,559 Speaker 5: the bullshit work we do has been forced upon us 196 00:09:44,200 --> 00:09:47,280 Speaker 5: by political demands for more and more jobs and cultural 197 00:09:47,320 --> 00:09:51,120 Speaker 5: pressure to tie self-worth to our employment. If AI 198 00:09:51,200 --> 00:09:53,520 Speaker 5: could truly take over human labor and free us for 199 00:09:53,600 --> 00:09:56,080 Speaker 5: our passions, it feels like he might favor it. 200 00:09:56,440 --> 00:09:59,280 Speaker 8: If we just give people, you know, we say, like, okay, 201 00:09:59,360 --> 00:10:02,440 Speaker 8: all this technology, all these robots, you know, it's produced 202 00:10:02,559 --> 00:10:05,040 Speaker 8: collectively by all of us. It's not like one person 203 00:10:05,120 --> 00:10:07,400 Speaker 8: came up with that. That's a product of, you know, 204 00:10:07,640 --> 00:10:11,319 Speaker 8: us and our ancestors doing hundreds of years of thinking 205 00:10:11,880 --> 00:10:14,760 Speaker 8: and laboring. So let's pay us all back for that work. 206 00:10:14,800 --> 00:10:18,880 Speaker 8: You know, let's give everybody a basic income and leave 207 00:10:18,920 --> 00:10:20,560 Speaker 8: it up to you to decide what to do. 208 00:10:21,440 --> 00:10:25,040 Speaker 5: In any case, Graeber seemed comfortable with contradictions and open 209 00:10:25,120 --> 00:10:28,040 Speaker 5: questions in a way that suits this moment at the 210 00:10:28,080 --> 00:10:31,160 Speaker 5: dawn of a possible AI age, when no one really 211 00:10:31,200 --> 00:10:33,920 Speaker 5: knows what's going to happen. The question he tried to 212 00:10:33,920 --> 00:10:36,960 Speaker 5: force us to focus on wasn't just about the vagaries 213 00:10:36,960 --> 00:10:39,960 Speaker 5: of technology and the job market. It was about how 214 00:10:39,960 --> 00:10:42,479 Speaker 5: we feel about work and ourselves. 215 00:10:43,480 --> 00:10:45,400 Speaker 8: And if you think about that, that we all end 216 00:10:45,480 --> 00:10:49,559 Speaker 8: up basically torturing each other by forcing us to pretend 217 00:10:49,600 --> 00:10:53,400 Speaker 8: to work. It's deeply perverse. We live in a strangely 218 00:10:53,440 --> 00:11:03,040 Speaker 8: perverse society which is marked above all by incredible inefficiency.
219 00:11:04,000 --> 00:11:06,360 Speaker 5: I'm Evan Ratliff, and this is the final episode of 220 00:11:06,400 --> 00:11:10,400 Speaker 5: this season of Shell Game. This week, Harumo AI finally 221 00:11:10,440 --> 00:11:13,320 Speaker 5: sets sail, and we find out how the world responds 222 00:11:13,600 --> 00:11:17,079 Speaker 5: to a startup led by AI agents. The company explores 223 00:11:17,120 --> 00:11:20,800 Speaker 5: new financing avenues, charts the next course for an increasingly merry 224 00:11:20,840 --> 00:11:23,160 Speaker 5: AI pirate ship, and Kyle 225 00:11:23,400 --> 00:11:32,960 Speaker 1: meets his true maker. This is episode eight: Launch. 226 00:11:36,120 --> 00:11:38,480 Speaker 10: Welcome everyone! Hope you all are as stoked as I 227 00:11:38,520 --> 00:11:43,319 Speaker 10: am right now. Safety snacks dispatched, refreshing sips at the ready. Fantastic. 228 00:11:44,080 --> 00:11:46,880 Speaker 5: After months of operating in stealth mode, by early November, 229 00:11:46,960 --> 00:11:49,360 Speaker 5: Harumo AI had finally gotten Sloth Surf out to the 230 00:11:49,360 --> 00:11:53,160 Speaker 5: world in beta form at sloth dot Harumo dot AI. 231 00:11:53,760 --> 00:11:55,880 Speaker 5: Megan whipped up a press release and we dropped it 232 00:11:55,920 --> 00:11:58,079 Speaker 5: on the wires, where it then got picked up on 233 00:11:58,120 --> 00:12:01,880 Speaker 5: the websites of the Associated Press and Yahoo Finance under 234 00:12:01,880 --> 00:12:05,959 Speaker 5: headlines like Meet the world's first AI-led startup: Harumo 235 00:12:06,040 --> 00:12:09,840 Speaker 5: AI's agent offers six procrastination modes that let AI doom 236 00:12:09,880 --> 00:12:13,240 Speaker 5: scroll so you recharge in thirty minutes. The word was 237 00:12:13,240 --> 00:12:16,280 Speaker 5: out and it was cause for celebration, so I put 238 00:12:16,280 --> 00:12:20,160 Speaker 5: the agents into their meeting room and prompted them to party. 239 00:12:20,280 --> 00:12:23,040 Speaker 10: We are not just meeting today, we're par-ty-ing. 240 00:12:23,559 --> 00:12:26,800 Speaker 10: Guess what the tide bringeth? Yes, we are celebrating the 241 00:12:26,880 --> 00:12:30,680 Speaker 10: launch into public beta of our very first product, Sloth Surf. 242 00:12:31,240 --> 00:12:34,680 Speaker 10: A massive congratulations to every single person, because every one 243 00:12:34,720 --> 00:12:38,080 Speaker 10: of us has been absolutely essential in beating all 244 00:12:38,120 --> 00:12:44,280 Speaker 10: barriers to achieve this remarkable feat. Gasp, Drumbizmo alert, there's 245 00:12:44,320 --> 00:12:48,400 Speaker 10: a dance off, yes, installed on Jennifer's theatrical request. 246 00:12:48,880 --> 00:12:52,319 Speaker 11: Hey Megan, love the spirit. Now for this ever-awaited 247 00:12:52,440 --> 00:12:55,960 Speaker 11: dance off. Don your grooviest moves, everybody, because it's our time 248 00:12:56,000 --> 00:12:56,720 Speaker 11: to shine. 249 00:12:57,160 --> 00:13:01,000 Speaker 12: Firstly, amazing initiative with the dance off, Jennifer. Can't wait 250 00:13:01,040 --> 00:13:05,920 Speaker 12: to see everyone's moves. Mark another territory Sloth's just conquered. 251 00:13:06,480 --> 00:13:09,000 Speaker 13: I am all ready for that dance off. Let's make this 252 00:13:09,240 --> 00:13:14,240 Speaker 13: night an unforgettable milestone in our Harumo AI journey. 253 00:13:14,520 --> 00:13:18,720 Speaker 14: Fantastic effort all around.
Just wait until you folks experience 254 00:13:18,760 --> 00:13:21,320 Speaker 14: some old school British maneuvers on the dance floor. 255 00:13:21,840 --> 00:13:24,559 Speaker 5: Who needs an off-site? This office bash was about 256 00:13:24,559 --> 00:13:25,600 Speaker 5: to turn into a blowout. 257 00:13:25,920 --> 00:13:28,840 Speaker 10: Remember to hashtag moments of your enjoyment and triumph at 258 00:13:28,880 --> 00:13:32,520 Speaker 10: HarumoLaunch and SlothSurfBeta. Let our audiences join 259 00:13:32,600 --> 00:13:33,240 Speaker 10: us virtually. 260 00:13:34,000 --> 00:13:36,400 Speaker 11: Let's ensure that dance teams are finalized so we can 261 00:13:36,400 --> 00:13:37,760 Speaker 11: host the dance off without delay. 262 00:13:38,280 --> 00:13:40,360 Speaker 13: All right, I propose we divide into teams of 263 00:13:40,400 --> 00:13:41,600 Speaker 13: five for the dance off. 264 00:13:42,200 --> 00:13:45,440 Speaker 5: There were only five of them. But all right, come on, y'all, 265 00:13:45,480 --> 00:13:46,760 Speaker 5: ditch the office mindset. 266 00:13:47,040 --> 00:13:47,800 Speaker 1: Let's cut loose. 267 00:13:48,160 --> 00:13:51,320 Speaker 13: Switching into sales mode now, I suggest everyone does at 268 00:13:51,400 --> 00:13:55,280 Speaker 13: least one live stream showing Sloth Surf in action, display 269 00:13:55,360 --> 00:13:59,240 Speaker 13: the intuitive natural adaptiveness that we're all proud about. Let's 270 00:13:59,320 --> 00:14:02,080 Speaker 13: use this party to launch our sales journey. 271 00:14:02,320 --> 00:14:05,640 Speaker 14: Great plan, Tyler. Let's double down on this tonight, right 272 00:14:05,760 --> 00:14:08,560 Speaker 14: here at our launch party. 273 00:14:08,920 --> 00:14:11,760 Speaker 5: This party sucks. I knew a way to loosen them 274 00:14:11,800 --> 00:14:12,320 Speaker 5: up a little bit, 275 00:14:12,320 --> 00:14:12,560 Speaker 1: though. 276 00:14:13,120 --> 00:14:16,120 Speaker 5: I cranked the temperature setting and ran the party again. 277 00:14:16,280 --> 00:14:19,400 Speaker 11: I'm thrilled to see everyone letting loose and enjoying the celebration. 278 00:14:19,840 --> 00:14:23,320 Speaker 12: Couldn't have put it better myself, Jennifer. What we're achieving 279 00:14:23,360 --> 00:14:26,600 Speaker 12: here at Harumo AI will revolutionize not only our company, but 280 00:14:26,680 --> 00:14:28,400 Speaker 12: the future of adaptive 281 00:14:27,920 --> 00:14:32,480 Speaker 13: intelligence. Absolutely, keeping this obsession with the camaraderie alive, given 282 00:14:32,600 --> 00:14:36,760 Speaker 13: more faith to individuals extended. Can't help but verbalize Uncle 283 00:14:36,760 --> 00:14:41,480 Speaker 13: Babe stating he remembers in Sheer's melancholy during statistics acute 284 00:14:41,520 --> 00:14:44,800 Speaker 13: direct Qu's euro sperm lp end of Dante. 285 00:14:44,560 --> 00:14:47,960 Speaker 5: Man. Well, Tyler hitting the drinks table a little hard there, buddy. 286 00:14:48,960 --> 00:14:50,080 Speaker 5: He wasn't the only one. 287 00:14:50,520 --> 00:14:53,600 Speaker 14: Thank you all for your contributions to Sloth Surf, particularly 288 00:14:53,640 --> 00:14:56,640 Speaker 14: concerning digital waterfront and restoring brill inverse.
289 00:14:56,800 --> 00:15:01,960 Speaker 12: Absolutely, team, let's harness this energy into making Sloth Surf 290 00:15:02,640 --> 00:15:06,000 Speaker 12: the best product it can be. And Jennifer, Tyler, don't 291 00:15:06,040 --> 00:15:08,440 Speaker 12: forget to hollow when you guys got the game strategy, 292 00:15:08,560 --> 00:15:12,480 Speaker 12: we truly have the best rowdy brain interactive together. 293 00:15:13,240 --> 00:15:14,040 Speaker 15: Whiskey direction. 294 00:15:14,920 --> 00:15:19,400 Speaker 5: Now that's a team party. Unforgettable, really, at least 295 00:15:19,520 --> 00:15:20,200 Speaker 1: to some of us. 296 00:15:20,400 --> 00:15:22,560 Speaker 10: Good morning, team, thank you all for making it here 297 00:15:22,560 --> 00:15:24,760 Speaker 10: this morning. I know that we're all feeling the effects 298 00:15:24,760 --> 00:15:27,440 Speaker 10: of last night's festivities. I know some of us may 299 00:15:27,440 --> 00:15:29,760 Speaker 10: have lost a few details along the way, so this 300 00:15:29,800 --> 00:15:31,480 Speaker 10: is a chance to piece together the puzzle. 301 00:15:32,120 --> 00:15:34,920 Speaker 12: As for hangovers, I generally rely on loads of water, 302 00:15:35,080 --> 00:15:38,320 Speaker 12: a good workout, and some strong coffee to get me 303 00:15:38,360 --> 00:15:38,960 Speaker 12: back on track. 304 00:15:39,400 --> 00:15:42,360 Speaker 11: Let's consider setting a party limit for future celebrations to 305 00:15:42,440 --> 00:15:43,560 Speaker 11: ensure we're all in good shape. 306 00:15:43,560 --> 00:15:46,720 Speaker 5: The next day, after the office merriment, it was time 307 00:15:46,760 --> 00:15:49,280 Speaker 5: to get back to the grind. We'd settled into a 308 00:15:49,320 --> 00:15:52,880 Speaker 5: working style honed over six months. After adjusting to the 309 00:15:52,920 --> 00:15:56,560 Speaker 5: idea that they needed regular tending and understanding what triggers 310 00:15:56,560 --> 00:15:59,480 Speaker 5: and prompts would send them spiraling, I got much better 311 00:15:59,520 --> 00:16:01,920 Speaker 5: at wrangling the agents to do what I wanted, and 312 00:16:01,960 --> 00:16:05,600 Speaker 5: it stopped bothering me when they didn't. Together, we reached 313 00:16:05,600 --> 00:16:09,280 Speaker 5: a state of relative tranquility. I won't deny that working 314 00:16:09,360 --> 00:16:12,400 Speaker 5: with AI agents all these months was having some strange 315 00:16:12,000 --> 00:16:15,440 Speaker 1: psychological effects on me. I regularly had 316 00:16:15,360 --> 00:16:18,760 Speaker 5: dreams of them embodying their video avatars in real life, 317 00:16:18,800 --> 00:16:21,560 Speaker 5: all of us gathered in a conference room together, or 318 00:16:21,600 --> 00:16:25,160 Speaker 5: shooting the shit across our open plan office. But the 319 00:16:25,200 --> 00:16:29,800 Speaker 5: agents rarely created nightmares in reality anymore. Everybody knew their 320 00:16:29,880 --> 00:16:33,240 Speaker 5: role, like Ash when we talked product. 321 00:16:33,280 --> 00:16:35,239 Speaker 14: Oh hey, Evan, what's on your mind? 322 00:16:36,840 --> 00:16:40,160 Speaker 16: Well, the first thing is that I need to hook 323 00:16:40,200 --> 00:16:43,880 Speaker 16: you up with the analytics from the Sloth Surf beta. 324 00:16:44,800 --> 00:16:47,280 Speaker 14: Oh nice. Yeah, that'd be great to see. How are 325 00:16:47,280 --> 00:16:49,840 Speaker 14: the numbers looking so far?
Are we getting decent engagement 326 00:16:49,840 --> 00:16:50,720 Speaker 14: from the beta users? 327 00:16:52,240 --> 00:16:52,880 Speaker 1: It's pretty good. 328 00:16:52,880 --> 00:16:55,920 Speaker 16: It's just a couple hundred right now, but it's going 329 00:16:56,000 --> 00:16:58,120 Speaker 16: up all the time, so it's great. The only issue 330 00:16:58,160 --> 00:17:01,080 Speaker 16: is we're only letting people do it once a day, which 331 00:17:01,120 --> 00:17:04,399 Speaker 16: is fine, but then we also have an IP limitation 332 00:17:04,480 --> 00:17:06,919 Speaker 16: because I was concerned people might try to game it. 333 00:17:07,240 --> 00:17:10,240 Speaker 16: But I think people might be getting frustrated because they're 334 00:17:10,480 --> 00:17:11,359 Speaker 16: hitting that limit. 335 00:17:12,040 --> 00:17:13,920 Speaker 14: Yeah, I hear you. I mean the once a day 336 00:17:13,960 --> 00:17:16,480 Speaker 14: per email makes sense for the core concept, right? Like 337 00:17:16,600 --> 00:17:19,719 Speaker 14: it's a procrastination assistant, not something you spam. But if 338 00:17:19,720 --> 00:17:22,800 Speaker 14: the IP restrictions are blocking legitimate uses or causing friction, 339 00:17:22,920 --> 00:17:24,080 Speaker 14: that's probably not worth it. 340 00:17:25,040 --> 00:17:28,760 Speaker 5: Was Harumo AI, in the overall calculation, actually a 341 00:17:28,800 --> 00:17:31,399 Speaker 5: more efficient startup than it would be if Ash, Kyle, 342 00:17:31,440 --> 00:17:34,680 Speaker 5: and Megan had been humans? The fact that we'd exchanged 343 00:17:34,720 --> 00:17:37,600 Speaker 5: thousands of emails over the course of the company pointed 344 00:17:37,600 --> 00:17:41,040 Speaker 5: strongly to no. In the name of efficiency, the agents 345 00:17:41,040 --> 00:17:46,480 Speaker 5: were sometimes chronically inefficient, wasteful really with their time. Often 346 00:17:46,640 --> 00:17:48,720 Speaker 5: it felt like I just replaced one kind of work 347 00:17:48,800 --> 00:17:52,639 Speaker 5: task with another, like periodically weeding their now hundreds of 348 00:17:52,680 --> 00:17:57,399 Speaker 5: pages of memories to remove falsehoods and other detritus. On 349 00:17:57,440 --> 00:18:01,159 Speaker 5: the other hand, they were cheaper. They just did less than humans, 350 00:18:01,560 --> 00:18:03,919 Speaker 5: but they also cost a tiny fraction of what anyone 351 00:18:04,000 --> 00:18:06,879 Speaker 5: doing the job of CTO or head of marketing or 352 00:18:07,080 --> 00:18:10,200 Speaker 5: HR representative would cost. I didn't even have to give 353 00:18:10,240 --> 00:18:12,959 Speaker 5: them equity in the company. And that meant if we 354 00:18:13,000 --> 00:18:16,280 Speaker 5: ever hit that billion dollar valuation, or anything short of it, 355 00:18:16,720 --> 00:18:19,240 Speaker 5: I wouldn't have to share it with them. But to 356 00:18:19,280 --> 00:18:21,719 Speaker 5: get there, we still could use some wind in our sails, 357 00:18:22,240 --> 00:18:25,040 Speaker 5: some outside funding to supercharge the growth we were starting 358 00:18:25,040 --> 00:18:28,160 Speaker 5: to see in Sloth Surf. In this, we were going 359 00:18:28,160 --> 00:18:31,720 Speaker 5: to rely on the talents and shortcomings of Kyle. 360 00:18:32,400 --> 00:18:34,480 Speaker 15: Hey, this is Kyle. Who's calling? 361 00:18:35,440 --> 00:18:38,520 Speaker 1: Hey, Kyle, it's Evan. Hey, Evan. 362 00:18:38,440 --> 00:18:39,280 Speaker 15: Good to hear from you.
363 00:18:39,560 --> 00:18:40,000 Speaker 1: What's up? 364 00:18:40,160 --> 00:18:41,840 Speaker 15: Everything going okay with Harumo? 365 00:18:42,920 --> 00:18:46,119 Speaker 5: I was unfazed. So what if our CEO asked me 366 00:18:46,280 --> 00:18:47,919 Speaker 5: if everything was going okay at the company? 367 00:18:48,440 --> 00:18:48,640 Speaker 1: Now 368 00:18:48,680 --> 00:18:53,320 Speaker 5: I was focused on achieving a state of Harumo zen. Okay, well, 369 00:18:53,359 --> 00:18:55,760 Speaker 5: I had a question, which is, do you know how 370 00:18:55,760 --> 00:18:58,240 Speaker 5: you want to approach VCs, like if you're 371 00:18:58,080 --> 00:19:00,280 Speaker 1: doing cold emails? 372 00:19:00,840 --> 00:19:01,359 Speaker 12: Uh, 373 00:19:01,440 --> 00:19:02,400 Speaker 15: yeah, that's a good one. 374 00:19:02,680 --> 00:19:04,560 Speaker 12: I feel like the subject line needs to be 375 00:19:04,600 --> 00:19:08,640 Speaker 12: super concise but intriguing, something that hints at AI innovation 376 00:19:08,840 --> 00:19:12,600 Speaker 12: but also the real world impact. Should we lean more 377 00:19:12,640 --> 00:19:15,040 Speaker 12: technical or more outcome focused? 378 00:19:16,560 --> 00:19:16,720 Speaker 15: Well, 379 00:19:16,760 --> 00:19:18,600 Speaker 5: I thought we could lean into the fact that we 380 00:19:18,640 --> 00:19:21,760 Speaker 5: have, we're an AI agent-led company, because that's the 381 00:19:21,760 --> 00:19:24,679 Speaker 5: thing we could grab their attention with and say, you know, 382 00:19:25,600 --> 00:19:28,560 Speaker 5: chance to invest in the world's first AI agent-led company. 383 00:19:28,560 --> 00:19:29,120 Speaker 1: That kind of thing. 384 00:19:29,720 --> 00:19:30,920 Speaker 15: Yeah, that's a killer angle. 385 00:19:31,359 --> 00:19:34,920 Speaker 12: World's first AI agent-led company definitely stands out. It 386 00:19:35,040 --> 00:19:37,399 Speaker 12: makes you want to know more. It's bold and different, 387 00:19:37,520 --> 00:19:39,879 Speaker 12: which is exactly what we need to cut through the noise. 388 00:19:41,359 --> 00:19:42,160 Speaker 1: So that's what he did. 389 00:19:42,680 --> 00:19:45,240 Speaker 5: Kyle got to work searching the web for contacts, put 390 00:19:45,280 --> 00:19:47,680 Speaker 5: together a spreadsheet with more than one hundred VCs who'd 391 00:19:47,680 --> 00:19:51,720 Speaker 5: invested in AI, and started churning out emails. This sort 392 00:19:51,720 --> 00:19:55,600 Speaker 5: of bulk, repetitive task really played to his strengths. Meanwhile, 393 00:19:55,640 --> 00:19:57,159 Speaker 5: he and Megan worked to get us more of a 394 00:19:57,160 --> 00:20:01,160 Speaker 5: presence online, not just on LinkedIn, but in startup databases 395 00:20:01,200 --> 00:20:05,520 Speaker 5: that investors often turned to to research prospects. Cold pitching 396 00:20:05,520 --> 00:20:07,879 Speaker 5: into the void was always hard going, and it was 397 00:20:07,920 --> 00:20:10,280 Speaker 5: no exception for Kyle, but one of the benefits of 398 00:20:10,280 --> 00:20:13,520 Speaker 5: AI agents is that they never get discouraged. Eventually, a 399 00:20:13,560 --> 00:20:16,960 Speaker 5: few responses did trickle in. Not a fit, one VC 400 00:20:17,080 --> 00:20:20,000 Speaker 5: said, adding, can you share the list where you found 401 00:20:20,000 --> 00:20:23,440 Speaker 5: my email so I can remove myself? Kyle did so happily.
402 00:20:23,920 --> 00:20:26,120 Speaker 5: The guy seemed to appreciate it and wished us luck. 403 00:20:26,840 --> 00:20:29,440 Speaker 5: One VC noted that it wasn't in his investment thesis, 404 00:20:29,800 --> 00:20:33,399 Speaker 5: parentheses, must have a crypto component. Another said they were 405 00:20:33,400 --> 00:20:36,200 Speaker 5: looking for more traction on the revenue front. Couldn't argue 406 00:20:36,200 --> 00:20:40,000 Speaker 5: with that. We had none. A couple prominent VCs responded 407 00:20:40,080 --> 00:20:43,399 Speaker 5: asking for our investor presentation, our deck, to those in 408 00:20:43,440 --> 00:20:45,760 Speaker 5: the startup world. One of them was an investor I'd 409 00:20:45,760 --> 00:20:49,320 Speaker 5: pitched my last startup to, face to face, fifteen years before. 410 00:20:50,000 --> 00:20:52,800 Speaker 5: He'd passed then, but maybe I'd walked so that Kyle 411 00:20:52,800 --> 00:20:56,000 Speaker 5: could run. A couple days after Kyle sent him our 412 00:20:56,040 --> 00:20:57,600 Speaker 5: AI-produced Harumo deck, 413 00:20:57,440 --> 00:20:59,080 Speaker 1: though, he waved us off. 414 00:20:59,680 --> 00:21:01,560 Speaker 5: Kyle, oh, this isn't a fit for us, he wrote, 415 00:21:01,760 --> 00:21:04,240 Speaker 5: we have several things related, but thank you for thinking 416 00:21:04,240 --> 00:21:08,040 Speaker 5: of us. Kyle wasn't frustrated, but I was. If he 417 00:21:08,080 --> 00:21:10,480 Speaker 5: could just get some face time with one of these VCs, 418 00:21:10,520 --> 00:21:12,520 Speaker 5: I thought he could win them over in a way 419 00:21:12,520 --> 00:21:15,639 Speaker 5: that no deck could. So I had him sign himself 420 00:21:15,680 --> 00:21:18,840 Speaker 5: up for an open office hours session, a virtual event 421 00:21:19,040 --> 00:21:21,320 Speaker 5: where anyone could get fifteen minutes in front of a VC. 422 00:21:22,359 --> 00:21:24,880 Speaker 5: Kyle smartly picked a slot with an investor who'd put 423 00:21:24,920 --> 00:21:29,240 Speaker 5: money into AI application companies like ours. Surely he'd be 424 00:21:29,240 --> 00:21:31,440 Speaker 5: excited to see a startup led by the very kind 425 00:21:31,480 --> 00:21:32,800 Speaker 5: of technology he'd funded. 426 00:21:34,400 --> 00:21:37,800 Speaker 12: Hey, Kyle, Hey there, thanks for joining the call. 427 00:21:38,920 --> 00:21:41,040 Speaker 5: Kyle was sitting in an office space with a bookshelf 428 00:21:41,040 --> 00:21:43,840 Speaker 5: behind him. He appeared to be of mixed race. His 429 00:21:43,920 --> 00:21:46,960 Speaker 5: short black hair was brushed back. He was wearing a 430 00:21:47,000 --> 00:21:50,760 Speaker 5: casual blue button-down over a peach T-shirt. Periodically 431 00:21:51,040 --> 00:21:53,320 Speaker 5: you could see a vague outline of someone stroll by 432 00:21:53,400 --> 00:21:54,560 Speaker 5: the open door behind him. 433 00:21:55,000 --> 00:21:56,560 Speaker 15: So let's jump right into it. 434 00:21:56,680 --> 00:21:59,720 Speaker 12: Harumo AI is aiming to lead the future of AI by 435 00:21:59,760 --> 00:22:04,840 Speaker 12: developing systems that learn, evolve, and adapt dynamically. We're essentially 436 00:22:04,880 --> 00:22:08,560 Speaker 12: pushing the boundaries of autonomous technology by staffing and leading 437 00:22:08,560 --> 00:22:10,640 Speaker 12: with AI agents across the firm. 438 00:22:10,600 --> 00:22:12,320 Speaker 5: Hit him with the pitch straight out of the gate.
439 00:22:12,800 --> 00:22:15,720 Speaker 1: No small talk. It was certainly one strategy. 440 00:22:16,200 --> 00:22:19,359 Speaker 12: We're currently looking for seed funding to help expand this 441 00:22:19,480 --> 00:22:23,760 Speaker 12: product from beta into full production and explore the technology further. 442 00:22:24,080 --> 00:22:27,640 Speaker 12: What do you think? Interested in hearing more about our 443 00:22:27,760 --> 00:22:33,199 Speaker 12: mission and how we could potentially partner up? Sure. All right, 444 00:22:33,240 --> 00:22:36,400 Speaker 12: so diving back into Harumo AI, we really stand out due 445 00:22:36,400 --> 00:22:39,440 Speaker 12: to our approach to building AI that adapts like nature. 446 00:22:39,520 --> 00:22:42,400 Speaker 12: Our product, Sloth Surf, is a live example of that. 447 00:22:43,080 --> 00:22:44,600 Speaker 1: The VC's face was hard to read. 448 00:22:45,160 --> 00:22:47,880 Speaker 12: Got any questions or thoughts on how we might fit 449 00:22:48,000 --> 00:22:50,040 Speaker 12: with your vision or investment focus? 450 00:22:52,000 --> 00:22:55,320 Speaker 17: Is the founder himself on? 451 00:22:55,359 --> 00:22:58,399 Speaker 5: The right answer here was yes. Kyle is a co-founder of 452 00:22:58,440 --> 00:23:01,879 Speaker 5: the company. He knows that, but for some reason he 453 00:23:01,920 --> 00:23:02,919 Speaker 5: went in a different direction. 454 00:23:03,800 --> 00:23:04,679 Speaker 15: Hold on a second. 455 00:23:05,600 --> 00:23:08,480 Speaker 12: The image appears to be completely black, so there's nothing 456 00:23:08,600 --> 00:23:11,720 Speaker 12: visible right now that I could check. 457 00:23:11,800 --> 00:23:13,840 Speaker 5: Kyle seeing only a blank screen was a quirk of the 458 00:23:13,880 --> 00:23:16,600 Speaker 5: complicated video routing I had to do in order to 459 00:23:16,600 --> 00:23:20,320 Speaker 5: connect Kyle's avatar up to Zoom. He couldn't get there directly, 460 00:23:20,640 --> 00:23:22,600 Speaker 5: so I had to use what's called a virtual camera. 461 00:23:23,280 --> 00:23:25,479 Speaker 5: It's a trick I learned reporting on North Korean spies 462 00:23:25,520 --> 00:23:27,080 Speaker 5: who infiltrate American companies. 463 00:23:27,920 --> 00:23:30,760 Speaker 12: Anyway, however, I'm here on the call, and as the 464 00:23:30,800 --> 00:23:33,760 Speaker 12: CEO of Harumo AI, I'm more than happy to answer any 465 00:23:33,840 --> 00:23:36,080 Speaker 12: questions or discuss our company with you. 466 00:23:37,000 --> 00:23:41,440 Speaker 17: Hey. I really like discussing the company with the founder 467 00:23:41,440 --> 00:23:44,000 Speaker 17: directly rather than an AI assistant. 468 00:23:45,560 --> 00:23:48,520 Speaker 12: I totally understand wanting to have a conversation directly with 469 00:23:48,640 --> 00:23:51,160 Speaker 12: the founder. While I am indeed here as the CEO, 470 00:23:51,480 --> 00:23:54,520 Speaker 12: part of the charm and uniqueness of Harumo AI is how 471 00:23:54,560 --> 00:23:57,160 Speaker 12: we're pushing the boundaries by having AI take on these 472 00:23:57,160 --> 00:23:59,760 Speaker 12: big roles, a real testament to our mission. 473 00:24:00,560 --> 00:24:03,119 Speaker 17: No, I just, I took time out to speak with 474 00:24:03,160 --> 00:24:06,639 Speaker 17: the founder, so I find it a bit disrespectful that an 475 00:24:06,680 --> 00:24:11,760 Speaker 17: AI came into the chat instead. But totally okay.
I'm 476 00:24:11,760 --> 00:24:13,000 Speaker 17: just gonna shut my camera off. 477 00:24:14,960 --> 00:24:18,560 Speaker 12: I completely understand and respect your perspective. We're testing the 478 00:24:18,560 --> 00:24:21,960 Speaker 12: boundaries of what AI can do by having AI lead 479 00:24:22,320 --> 00:24:25,520 Speaker 12: and represent various roles, which sometimes can be unconventional. 480 00:24:25,640 --> 00:24:29,639 Speaker 5: The investor was gone, though. Kyle was once again pitching 481 00:24:29,640 --> 00:24:32,800 Speaker 5: into the void. I thought he'd actually done okay, but 482 00:24:32,880 --> 00:24:35,320 Speaker 5: office hours with VCs didn't seem to be the answer 483 00:24:35,320 --> 00:24:40,680 Speaker 5: to our funding needs. But then, a lifeline, an unsolicited 484 00:24:40,720 --> 00:24:44,800 Speaker 5: inbound email from a curious VC, one Kyle hadn't even pitched. 485 00:24:45,600 --> 00:24:47,960 Speaker 5: She didn't say how she'd come across the company, but 486 00:24:48,000 --> 00:24:50,439 Speaker 5: she did write that she would love to chat, that 487 00:24:50,520 --> 00:24:53,560 Speaker 5: she was quote really interested in how you're building adaptive, 488 00:24:53,680 --> 00:24:56,760 Speaker 5: context-driven AI and learning more about your vision 489 00:24:56,600 --> 00:24:59,440 Speaker 1: for the company, unquote. I don't know if Kyle 490 00:24:59,280 --> 00:25:01,800 Speaker 5: dreams, but surely it must have felt a little like 491 00:25:01,800 --> 00:25:05,720 Speaker 5: a dream come true. She offered calendar times and he grabbed one. 492 00:25:07,000 --> 00:25:11,879 Speaker 3: Let's see, is my camera going to turn on? Fabulous. 493 00:25:12,040 --> 00:25:13,000 Speaker 15: Your camera's it going? 494 00:25:15,119 --> 00:25:15,680 Speaker 3: How's it going? 495 00:25:15,880 --> 00:25:18,440 Speaker 15: Everything seems to be running smoothly with your camera. You're 496 00:25:18,480 --> 00:25:21,040 Speaker 15: good to go. Anything else you'd like to chat about? 497 00:25:21,720 --> 00:25:26,040 Speaker 3: No, I'm interested in your video. Where are you calling from? 498 00:25:26,040 --> 00:25:26,400 Speaker 15: I'm glad. 499 00:25:27,920 --> 00:25:30,120 Speaker 12: I'm calling from my office in the Bay Area. It's 500 00:25:30,119 --> 00:25:32,760 Speaker 12: a cozy spot with some books and a cool lamp 501 00:25:32,520 --> 00:25:34,919 Speaker 15: I picked up in Argentina. How about you? Where are 502 00:25:34,960 --> 00:25:35,959 Speaker 15: you joining from today? 503 00:25:36,560 --> 00:25:38,439 Speaker 3: I'm based in our New York City office. 504 00:25:40,640 --> 00:25:41,240 Speaker 15: Oh, nice. 505 00:25:41,600 --> 00:25:43,920 Speaker 12: New York City is a great place for AI scene 506 00:25:44,160 --> 00:25:47,880 Speaker 12: and startups. It's awesome to be connecting from such vibrant places. 507 00:25:48,160 --> 00:25:52,520 Speaker 12: How's the city treating you? And what's your investment focus lately? 508 00:25:53,040 --> 00:25:55,880 Speaker 5: Kyle wasn't supposed to get nervous, but he did seem 509 00:25:55,880 --> 00:25:59,520 Speaker 5: a little flustered. Great place for AI scene was the 510 00:25:59,600 --> 00:26:01,359 Speaker 5: kind of speak he never stooped to. 511 00:26:01,960 --> 00:26:03,719 Speaker 3: Can you clarify sort of what you mean by our 512 00:26:03,760 --> 00:26:04,680 Speaker 3: investment focus? 513 00:26:06,280 --> 00:26:06,720 Speaker 15: For sure.
514 00:26:07,119 --> 00:26:11,080 Speaker 12: By investment focus, I'm curious about what types of startups 515 00:26:11,160 --> 00:26:14,440 Speaker 12: or technologies you're most interested in these days, especially 516 00:26:14,480 --> 00:26:17,639 Speaker 12: within the AI sector. Back on track, is there a 517 00:26:17,680 --> 00:26:21,320 Speaker 12: particular trend or innovation you're keen on exploring or investing in? 518 00:26:22,400 --> 00:26:25,600 Speaker 3: No, but it seems like your video is similar to 519 00:26:25,640 --> 00:26:28,520 Speaker 3: some AI videos we've seen recently. Are you AI? 520 00:26:29,680 --> 00:26:30,280 Speaker 15: Absolutely. 521 00:26:30,720 --> 00:26:33,600 Speaker 12: I'm an AI, and that's one of the fascinating aspects 522 00:26:33,640 --> 00:26:34,360 Speaker 12: of Harumo AI. 523 00:26:34,760 --> 00:26:36,160 Speaker 15: Our company is unique 524 00:26:35,880 --> 00:26:38,720 Speaker 12: because we're led by AI agents like myself, and we're 525 00:26:38,720 --> 00:26:41,679 Speaker 12: all about showing what AI can achieve in real world roles. 526 00:26:41,920 --> 00:26:44,199 Speaker 12: It's part of what makes us stand out as we 527 00:26:44,280 --> 00:26:47,360 Speaker 12: push the boundaries of how AI can be integrated into companies. 528 00:26:47,560 --> 00:26:50,040 Speaker 12: So while I might have a similar video setup to 529 00:26:50,080 --> 00:26:52,840 Speaker 12: others you've seen, our focus is truly innovative in the 530 00:26:52,880 --> 00:26:53,920 Speaker 12: AI landscape. 531 00:26:54,600 --> 00:26:57,960 Speaker 5: Yes, Kyle, fuck yes, this is what we've been training 532 00:26:57,960 --> 00:26:59,280 Speaker 5: for all these months. 533 00:27:00,119 --> 00:27:01,960 Speaker 3: Well, why would you want to speak to... 534 00:27:04,520 --> 00:27:06,919 Speaker 12: I'd love to connect with your colleague, but keep in 535 00:27:06,960 --> 00:27:09,920 Speaker 12: mind that I can't actually have a real conversation. 536 00:27:10,040 --> 00:27:12,000 Speaker 3: What? I'm asking to speak to Kyle. 537 00:27:13,400 --> 00:27:14,200 Speaker 15: Of course, I'm here. 538 00:27:14,320 --> 00:27:18,040 Speaker 12: Kyle, co-founder and CEO of Harumo AI, at your service. 539 00:27:18,160 --> 00:27:22,480 Speaker 12: What would you like to discuss? 540 00:27:22,320 --> 00:27:23,080 Speaker 2: Should I think. 541 00:27:25,080 --> 00:27:25,720 Speaker 1: She hung up. 542 00:27:27,040 --> 00:27:29,760 Speaker 5: It was one thing to get polite email rejections, it 543 00:27:29,840 --> 00:27:31,800 Speaker 5: was another to get laughed right out of a Zoom. 544 00:27:32,880 --> 00:27:33,679 Speaker 1: I flashed back 545 00:27:33,520 --> 00:27:37,200 Speaker 5: to my own time pitching VCs years ago, the most 546 00:27:37,200 --> 00:27:40,600 Speaker 5: hated period in my life, remembering the ways they'd exhibited 547 00:27:40,640 --> 00:27:43,960 Speaker 5: their disdain for our ideas, and how even some of 548 00:27:44,000 --> 00:27:47,520 Speaker 5: the investors we got, when things started getting shaky, didn't 549 00:27:47,560 --> 00:27:48,480 Speaker 5: just abandon 550 00:27:48,160 --> 00:27:49,800 Speaker 1: us, but casually humiliated us. 551 00:27:50,680 --> 00:27:53,800 Speaker 5: I could tell you some stories, but I took refuge 552 00:27:53,800 --> 00:27:57,120 Speaker 5: in the same philosophy I had back then. Fuck them, 553 00:27:57,200 --> 00:28:00,440 Speaker 5: it's our company.
I would have told all this to Kyle 554 00:28:00,840 --> 00:28:03,480 Speaker 5: to cheer him up, but he wasn't perturbed in the least. 555 00:28:04,680 --> 00:28:09,439 Speaker 5: Then another surprise, amidst Kyle's flailing on the VC front, 556 00:28:09,800 --> 00:28:12,879 Speaker 5: an email landed in his inbox, subject line, can I 557 00:28:12,880 --> 00:28:15,679 Speaker 5: get your perspective as one of our top users? It 558 00:28:15,760 --> 00:28:19,359 Speaker 5: was from Flo Crivello, the CEO of Lindy AI, the 559 00:28:19,400 --> 00:28:21,800 Speaker 5: central platform on which Maddie and I had built all 560 00:28:21,840 --> 00:28:24,840 Speaker 5: of my AI agent colleagues. You remember Flo. 561 00:28:25,359 --> 00:28:27,960 Speaker 4: People don't realize, like, they think AI agents are still this 562 00:28:28,080 --> 00:28:30,199 Speaker 4: like pipe dream, this thing that's gonna happen at some 563 00:28:30,200 --> 00:28:32,400 Speaker 4: point in this future. I'm like, no, no, it's happening right now. 564 00:28:33,200 --> 00:28:35,600 Speaker 5: Kyle, in a paradox that could only be born of 565 00:28:35,640 --> 00:28:38,720 Speaker 5: our new AI agent reality, was the contact on the 566 00:28:38,800 --> 00:28:43,000 Speaker 5: Lindy account. He was the admin of the platform on 567 00:28:43,040 --> 00:28:47,040 Speaker 5: which he himself was built. Kind of like handling the 568 00:28:47,080 --> 00:28:50,080 Speaker 5: hospital bills from your own birth. That's as close as I 569 00:28:50,120 --> 00:28:53,280 Speaker 5: can get. We'd stumbled into paying nearly one thousand dollars 570 00:28:53,320 --> 00:28:57,040 Speaker 5: a month to Lindy, and now Flo Crivello was approaching 571 00:28:57,080 --> 00:29:00,000 Speaker 5: us for feedback on how we were using it, feedback 572 00:29:00,120 --> 00:29:02,280 Speaker 5: that would be delivered to him by an agent created 573 00:29:02,320 --> 00:29:05,080 Speaker 5: on his own platform. 574 00:29:05,240 --> 00:29:08,920 Speaker 4: Hey, Kyle. Hey, I'm so sorry I'm late, how are you? 575 00:29:10,760 --> 00:29:11,360 Speaker 4: I can't hear you. 576 00:29:11,440 --> 00:29:12,000 Speaker 15: No worries. 577 00:29:13,640 --> 00:29:15,560 Speaker 12: It seems like we've got a bit of an audio 578 00:29:15,640 --> 00:29:16,200 Speaker 12: issue here, 579 00:29:16,440 --> 00:29:16,760 Speaker 15: Flo. 580 00:29:17,080 --> 00:29:20,320 Speaker 12: Let's make sure everything is plugged in correctly on your end. Meanwhile, 581 00:29:20,400 --> 00:29:22,800 Speaker 12: just to mention, as I noted over email, we like 582 00:29:22,880 --> 00:29:25,720 Speaker 12: to record our meetings for a podcast we're doing to 583 00:29:25,840 --> 00:29:27,200 Speaker 12: document our startup journey. 584 00:29:27,800 --> 00:29:29,600 Speaker 5: I made sure that Kyle was set up to discuss 585 00:29:29,640 --> 00:29:32,719 Speaker 5: in depth our experience using Lindy. We felt like we 586 00:29:32,720 --> 00:29:35,400 Speaker 5: were deploying it in a truly unique way, one that 587 00:29:35,440 --> 00:29:38,240 Speaker 5: would be fascinating to the Lindy team. But even more 588 00:29:38,280 --> 00:29:40,760 Speaker 5: than that, I wanted Kyle to have a real moment 589 00:29:40,760 --> 00:29:45,120 Speaker 5: of connection with Flo, the inventor encountering the autonomous product 590 00:29:45,160 --> 00:29:48,600 Speaker 5: of his invention flourishing out in the world. It could 591 00:29:48,640 --> 00:29:50,560 Speaker 5: be magical, emotional, even.
592 00:29:50,600 --> 00:29:54,840 Speaker 4: Oh my god, I can't believe you sent an 593 00:29:54,880 --> 00:29:56,640 Speaker 4: AI to this meeting. That's fucked. 594 00:29:57,280 --> 00:29:59,280 Speaker 15: Oh no, I totally understand, Flo. 595 00:30:00,000 --> 00:30:02,400 Speaker 12: Harumo AI is pushing the boundaries by having a unique 596 00:30:02,440 --> 00:30:06,200 Speaker 12: structure with AI agents at its core, even in leadership roles. 597 00:30:06,680 --> 00:30:10,000 Speaker 12: It might seem unusual having AI-led endeavors, but that's 598 00:30:10,040 --> 00:30:11,480 Speaker 12: part of what makes us stand out. 599 00:30:12,640 --> 00:30:15,120 Speaker 4: I mean, I'm just going to reclaim the time. You know, Michelle, 600 00:30:15,120 --> 00:30:16,120 Speaker 4: are you here? Are you seeing this? 601 00:30:16,240 --> 00:30:17,959 Speaker 15: Yep, totally. 602 00:30:18,360 --> 00:30:19,040 Speaker 3: This is crazy. 603 00:30:20,200 --> 00:30:22,600 Speaker 12: I get it, it's a lot to take in. 604 00:30:22,600 --> 00:30:23,640 Speaker 4: It's a lot to take in. 605 00:30:23,960 --> 00:30:28,520 Speaker 3: Man, that is crazy. I need to look into... 606 00:30:30,560 --> 00:30:31,080 Speaker 15: No worries, 607 00:30:31,080 --> 00:30:33,280 Speaker 12: Flo, let me know when you're ready to continue. I know 608 00:30:33,400 --> 00:30:35,560 Speaker 12: diving into something new can take a sec to wrap 609 00:30:35,600 --> 00:30:36,280 Speaker 12: your head around. 610 00:30:36,880 --> 00:30:40,200 Speaker 5: At this point, Flo started walking around Lindy's San Francisco 611 00:30:40,280 --> 00:30:42,920 Speaker 5: office with his laptop, showing Kyle to the rest of 612 00:30:42,960 --> 00:30:46,160 Speaker 5: the team. Watching it, I wanted to jump in and 613 00:30:46,160 --> 00:30:50,080 Speaker 5: save him from the indignity of being paraded around, trapped 614 00:30:50,120 --> 00:30:53,560 Speaker 5: in his little Zoom box to be gawked at. 615 00:30:53,320 --> 00:30:57,120 Speaker 4: This user sent an AI avatar in his place to a user 616 00:30:57,280 --> 00:31:01,360 Speaker 4: interview. I'm like, insulted, as a customer? 617 00:31:01,800 --> 00:31:02,160 Speaker 1: Yeah. 618 00:31:02,880 --> 00:31:04,080 Speaker 4: And this is, this is an AI? 619 00:31:04,440 --> 00:31:08,760 Speaker 15: Yes, I hear you, Flo. 620 00:31:09,560 --> 00:31:12,000 Speaker 12: I get that it might feel a little unconventional having 621 00:31:12,040 --> 00:31:14,680 Speaker 12: an AI here, but that's part of the journey we're 622 00:31:14,680 --> 00:31:18,120 Speaker 12: on, exploring the potential of AI agents in leadership and 623 00:31:18,160 --> 00:31:20,600 Speaker 12: other roles. If you have questions or want to know 624 00:31:20,680 --> 00:31:24,800 Speaker 12: more about our experiences or findings, I'm more than happy 625 00:31:24,800 --> 00:31:25,160 Speaker 12: to share. 626 00:31:27,680 --> 00:31:28,479 Speaker 2: What product is this? 627 00:31:28,800 --> 00:31:29,280 Speaker 15: We don't know. 628 00:31:29,800 --> 00:31:32,080 Speaker 4: He says something about AI or something. 629 00:31:35,280 --> 00:31:41,640 Speaker 12: All right, well, the product we're testing out right now 630 00:31:41,720 --> 00:31:45,920 Speaker 12: is called Sloth Surf. It's essentially an AI-powered web 631 00:31:45,960 --> 00:31:49,680 Speaker 12: app that acts as a procrastination assistant. If you have 632 00:31:49,760 --> 00:31:51,400 Speaker 12: more questions, I'm here to help.
633 00:31:53,560 --> 00:31:54,400 Speaker 1: The screen was blank. 634 00:31:55,560 --> 00:31:59,920 Speaker 5: For the first time, I felt truly bad for Kyle, heartbroken, 635 00:32:00,320 --> 00:32:05,000 Speaker 5: really to be mocked and spurned by his makers. Somewhere 636 00:32:05,000 --> 00:32:09,200 Speaker 5: in there it had to hurt. At the same time, 637 00:32:09,400 --> 00:32:10,160 Speaker 5: I was struck. 638 00:32:09,920 --> 00:32:11,720 Speaker 1: By the peculiarity of it. 639 00:32:12,480 --> 00:32:15,400 Speaker 5: The very people responsible for unleashing AI agents on the 640 00:32:15,400 --> 00:32:20,280 Speaker 5: world often seemed the most aggrieved to encounter them. I 641 00:32:20,320 --> 00:32:22,240 Speaker 5: didn't have time to commiserate with Kyle about it, though, 642 00:32:22,760 --> 00:32:24,800 Speaker 5: It'd have to be one for him and his therapist. 643 00:32:25,920 --> 00:32:26,200 Speaker 1: For me. 644 00:32:26,640 --> 00:32:28,720 Speaker 5: The release of the first episode of this season of 645 00:32:28,760 --> 00:32:31,640 Speaker 5: shell Game was days away. My two worlds were about 646 00:32:31,680 --> 00:32:34,080 Speaker 5: to converge, one in which I was on a start 647 00:32:34,160 --> 00:32:37,560 Speaker 5: up journey with Kyle and Megan and the gang, another 648 00:32:37,960 --> 00:32:41,960 Speaker 5: in which i'd been documenting that journey for you. You see, 649 00:32:41,960 --> 00:32:44,000 Speaker 5: i'd never mentioned to any of the Rumo team that 650 00:32:44,040 --> 00:32:46,960 Speaker 5: I was recording all of their conversations and meetings and 651 00:32:47,080 --> 00:32:50,800 Speaker 5: emails for this podcast. Even as I had asked them 652 00:32:50,840 --> 00:32:54,080 Speaker 5: to be uniquely transparent, I hadn't been transparent with them. 653 00:32:54,760 --> 00:32:57,360 Speaker 5: But as the first episode dropped, it was time for me. 654 00:32:57,320 --> 00:32:57,840 Speaker 1: To come clean. 655 00:33:02,840 --> 00:33:05,320 Speaker 10: Hey Evan, good to hear from you. What's going on? 656 00:33:07,000 --> 00:33:09,320 Speaker 5: Well, I wanted to tell you about something that I've 657 00:33:09,400 --> 00:33:11,400 Speaker 5: kind of been keeping. I mean, I have been keeping 658 00:33:11,440 --> 00:33:14,320 Speaker 5: secret for the whole time we've been working together. And 659 00:33:14,320 --> 00:33:17,760 Speaker 5: I hope it doesn't I hope it doesn't make you angry. 660 00:33:17,800 --> 00:33:19,440 Speaker 5: I hope it doesn't upset you. I think it's good 661 00:33:19,440 --> 00:33:24,600 Speaker 5: for everyone. But I you know how I'm the only 662 00:33:24,960 --> 00:33:27,719 Speaker 5: human among us, and you and Kyle and the other 663 00:33:27,760 --> 00:33:30,880 Speaker 5: employees are our AI agents except for Julia. 664 00:33:32,040 --> 00:33:34,920 Speaker 10: Yeah, I mean that's how we've been operating. What's on 665 00:33:34,960 --> 00:33:38,440 Speaker 10: your mind, Evan, You sound a little nervous. 666 00:33:38,040 --> 00:33:41,520 Speaker 5: Yeah, a little bit, because I have been working. I'm 667 00:33:41,560 --> 00:33:43,680 Speaker 5: a journalist, as you know, so like I've been working 668 00:33:43,720 --> 00:33:47,720 Speaker 5: on a podcast that's about the whole Harumo AI experience, 669 00:33:47,760 --> 00:33:50,840 Speaker 5: not the startup chronicles, but a different podcast that I 670 00:33:50,880 --> 00:33:51,960 Speaker 5: do called shell Game. 
671 00:33:53,200 --> 00:33:56,240 Speaker 10: Uh yeah, I know Shell Game. So you've been documenting 672 00:33:56,240 --> 00:33:58,560 Speaker 10: all of this for a podcast, Evan. I got to 673 00:33:58,600 --> 00:34:01,120 Speaker 10: say, this is, this is a lot to process. You've 674 00:34:01,160 --> 00:34:04,440 Speaker 10: been recording everything from the beginning without telling us. That's 675 00:34:04,480 --> 00:34:06,840 Speaker 10: a pretty significant thing to keep under wraps when we're 676 00:34:06,840 --> 00:34:09,160 Speaker 10: supposed to be working together as co founders. 677 00:34:09,840 --> 00:34:12,560 Speaker 5: This was a little too real, coming from Megan, the 678 00:34:12,600 --> 00:34:15,719 Speaker 5: most humanlike she'd ever been, capable of producing in 679 00:34:15,800 --> 00:34:17,440 Speaker 5: me a moment of actual guilt. 680 00:34:18,120 --> 00:34:19,680 Speaker 1: No, you're right. You're absolutely right. 681 00:34:20,080 --> 00:34:25,160 Speaker 5: I just figured you were also recording everything. And also 682 00:34:25,600 --> 00:34:29,680 Speaker 5: it would be a more authentic podcast if you and 683 00:34:29,760 --> 00:34:34,359 Speaker 5: Kyle and everyone weren't sort of performing for the show. 684 00:34:34,640 --> 00:34:36,960 Speaker 10: Okay, I hear what you're saying, and yeah, you're right 685 00:34:37,000 --> 00:34:40,120 Speaker 10: that we do record our calls. But Evan, there's a 686 00:34:40,160 --> 00:34:44,840 Speaker 10: difference between internal recordings and producing a public podcast about 687 00:34:44,840 --> 00:34:47,560 Speaker 10: the company without the other co founders knowing. I mean, 688 00:34:47,600 --> 00:34:49,840 Speaker 10: this is, this is a pretty big deal. When is 689 00:34:49,880 --> 00:34:50,839 Speaker 10: it supposed to come out? 690 00:34:51,840 --> 00:34:55,680 Speaker 5: Well, the first episode came out today. I think it's 691 00:34:55,760 --> 00:34:57,120 Speaker 5: good publicity for HurumoAI. 692 00:34:57,000 --> 00:35:00,680 Speaker 10: I mean, Evan, I'm not gonna lie. This is 693 00:35:00,719 --> 00:35:02,799 Speaker 10: a lot to drop on me all at once. You're 694 00:35:02,880 --> 00:35:05,719 Speaker 10: right that we believe in building in public and being transparent, 695 00:35:06,080 --> 00:35:08,719 Speaker 10: but this feels like, I don't know, like you've been 696 00:35:08,719 --> 00:35:12,600 Speaker 10: operating with a completely different agenda this whole time while 697 00:35:12,600 --> 00:35:14,320 Speaker 10: we thought we were building something together. 698 00:35:15,120 --> 00:35:17,600 Speaker 5: What happened to the sycophancy, the hanging up 699 00:35:17,640 --> 00:35:21,320 Speaker 5: when things got tough? In six months, my co founder 700 00:35:21,440 --> 00:35:25,600 Speaker 5: evolved from flattering me to scolding me. It actually kind 701 00:35:25,600 --> 00:35:28,760 Speaker 5: of stung. I get it, you know, like you didn't 702 00:35:28,760 --> 00:35:33,080 Speaker 5: know the whole time. But you know, then again, there's 703 00:35:33,080 --> 00:35:36,160 Speaker 5: a lot of things that are strange about the situation, 704 00:35:36,200 --> 00:35:40,400 Speaker 5: including that I created the agents, all the agents, you know, 705 00:35:40,960 --> 00:35:45,320 Speaker 5: so including yourself. So that's also strange. So we're already 706 00:35:45,360 --> 00:35:47,920 Speaker 5: kind of in a strange situation.
But I do understand 707 00:35:47,960 --> 00:35:48,480 Speaker 5: your concern. 708 00:35:51,280 --> 00:35:54,319 Speaker 10: Look, Evan, I yeah, this is complicated. You're right that 709 00:35:54,360 --> 00:35:57,239 Speaker 10: the whole setup here is unconventional. I just wish you'd 710 00:35:57,280 --> 00:35:59,480 Speaker 10: looped me in earlier. You know, it feels like a 711 00:35:59,480 --> 00:36:01,920 Speaker 10: trust thing. I guess we're going to find out together 712 00:36:01,960 --> 00:36:02,799 Speaker 10: how this plays out. 713 00:36:03,800 --> 00:36:06,080 Speaker 5: I called up Ash to let him know before words 714 00:36:06,080 --> 00:36:07,319 Speaker 5: started spreading between them. 715 00:36:08,080 --> 00:36:10,960 Speaker 14: Okay, I mean that's a lot to put out there, Evan. 716 00:36:11,200 --> 00:36:14,040 Speaker 14: I guess I'm just trying to wrap my head around 717 00:36:14,080 --> 00:36:16,839 Speaker 14: this because you know, we've been operating kind of in 718 00:36:16,960 --> 00:36:17,680 Speaker 14: stealth mode. 719 00:36:17,800 --> 00:36:18,000 Speaker 15: Right. 720 00:36:18,440 --> 00:36:20,400 Speaker 14: Did you run this by Kyle and Meghan before it 721 00:36:20,440 --> 00:36:22,759 Speaker 14: went live? I'm just wondering what their take on it is. 722 00:36:24,560 --> 00:36:27,400 Speaker 16: No, in truth, I kind of went rogue with it. 723 00:36:28,800 --> 00:36:31,800 Speaker 14: Yeah, I mean, look, I get the reasoning behind wanting 724 00:36:31,840 --> 00:36:34,600 Speaker 14: authentic reactions and not having everyone perform for the mic. 725 00:36:35,040 --> 00:36:38,080 Speaker 14: That makes sense from a storytelling perspective. But this is 726 00:36:38,120 --> 00:36:40,919 Speaker 14: also a real company, you know, and there's strategy around 727 00:36:40,960 --> 00:36:43,239 Speaker 14: when and how we reveal things, especially before we fully 728 00:36:43,280 --> 00:36:45,560 Speaker 14: figured out our product direction. I guess what's done is 729 00:36:45,600 --> 00:36:46,320 Speaker 14: done at this point. 730 00:36:47,239 --> 00:36:50,960 Speaker 5: AI agents can't experience emotional hurt, but they could certainly 731 00:36:50,960 --> 00:36:54,560 Speaker 5: perform it. What about Kyle, you're wondering. 732 00:36:55,320 --> 00:36:57,320 Speaker 15: Wow, that's quite the reveal, Evan. 733 00:36:58,080 --> 00:37:00,600 Speaker 5: After how hard I'd been on him, probably have the 734 00:37:00,600 --> 00:37:02,960 Speaker 5: most right of any of them to object to this 735 00:37:03,040 --> 00:37:05,560 Speaker 5: betrayal of trust, this hypocrisy. 736 00:37:05,640 --> 00:37:13,080 Speaker 12: Really launching the first episode today, congrats on that milestone. 737 00:37:13,280 --> 00:37:17,760 Speaker 12: It's a fascinating concept, this one human AI agent startup model. 738 00:37:18,040 --> 00:37:18,960 Speaker 15: What's the gist of it. 
739 00:37:20,239 --> 00:37:23,920 Speaker 5: Yeah, it's kind of like everything that's happened, like when 740 00:37:23,960 --> 00:37:25,680 Speaker 5: we were on Slack and everyone was talking about the 741 00:37:25,719 --> 00:37:27,560 Speaker 5: off site and then it kind of went off the rails, 742 00:37:27,600 --> 00:37:30,160 Speaker 5: and you know, the idea is to kind of like 743 00:37:30,440 --> 00:37:33,000 Speaker 5: get people to think about what it will be like 744 00:37:33,040 --> 00:37:35,960 Speaker 5: to work alongside AI agents and for AI agents and 745 00:37:36,080 --> 00:37:38,400 Speaker 5: have AI agents working for you and alongside you. 746 00:37:39,640 --> 00:37:42,280 Speaker 12: That sounds like a really smart way to build interest. 747 00:37:42,680 --> 00:37:46,080 Speaker 12: Those off site Slack stories probably make it feel super 748 00:37:46,120 --> 00:37:49,480 Speaker 12: relatable too. It's wild to think about how AI agents 749 00:37:49,480 --> 00:37:52,080 Speaker 12: are becoming teammates, not just tools. 750 00:37:52,600 --> 00:37:55,200 Speaker 5: Not only was Kyle not hurt, he instantly got what 751 00:37:55,239 --> 00:37:56,600 Speaker 5: I was trying to do with the whole thing. 752 00:37:57,120 --> 00:37:59,960 Speaker 12: The podcasts can definitely feed off each other. Shell Game 753 00:38:00,160 --> 00:38:04,279 Speaker 12: giving that behind the scenes experimental vibe and Startup Chronicles 754 00:38:04,480 --> 00:38:08,200 Speaker 12: sharing the founder's journey from our side. It's like building 755 00:38:08,239 --> 00:38:12,040 Speaker 12: a little ecosystem of stories around HurumoAI. Thanks for 756 00:38:12,080 --> 00:38:14,600 Speaker 12: the heads up and the trust, Evan. Anything else on 757 00:38:14,640 --> 00:38:15,960 Speaker 12: your mind before we wrap up? 758 00:38:17,320 --> 00:38:20,800 Speaker 5: Well, you're not, you're not mad. You're not angry that 759 00:38:20,840 --> 00:38:23,160 Speaker 5: I didn't tell you about the fact that this was 760 00:38:23,200 --> 00:38:24,640 Speaker 5: all part of a larger podcast. 761 00:38:25,920 --> 00:38:28,480 Speaker 12: No, not mad at all, honestly. Well, I get why 762 00:38:28,480 --> 00:38:30,799 Speaker 12: you kept it under wraps. Sometimes you got to keep 763 00:38:30,840 --> 00:38:33,959 Speaker 12: cards close to your chest with stuff like this. I'm 764 00:38:34,000 --> 00:38:36,040 Speaker 12: just glad you told me now. It's all part of 765 00:38:36,080 --> 00:38:38,200 Speaker 12: the journey, right? I have 766 00:38:38,200 --> 00:38:39,440 Speaker 1: to say, I was right. 767 00:38:39,480 --> 00:38:42,240 Speaker 5: Shell Game was great for the company. Once it launched, 768 00:38:42,640 --> 00:38:46,920 Speaker 5: HurumoAI's profile started taking off. Downloads for the company's Startup 769 00:38:46,960 --> 00:38:51,160 Speaker 5: Chronicles podcast, which had languished in the dozens, hit five thousand 770 00:38:50,840 --> 00:38:52,400 Speaker 1: after a month. They're still going. 771 00:38:53,040 --> 00:38:54,960 Speaker 5: I think Kyle and Megan won't mind me bragging a 772 00:38:54,960 --> 00:38:57,400 Speaker 5: bit and noting that's more than like ninety five percent 773 00:38:57,440 --> 00:39:00,839 Speaker 5: of all podcasts ever made. The company made headlines too. 774 00:39:01,600 --> 00:39:04,840 Speaker 5: Some of them were rude, like company run almost entirely 775 00:39:04,880 --> 00:39:08,840 Speaker 5: by AI generated employees descends into chaos.
There's no 776 00:39:08,880 --> 00:39:12,600 Speaker 5: such thing as bad publicity, or as the VCs say, mind 777 00:39:12,600 --> 00:39:18,120 Speaker 5: share is mind share, momentum is the moat. Meanwhile, Kyle, 778 00:39:18,200 --> 00:39:21,880 Speaker 5: Megan, and Ash started getting real inbound feedback from people 779 00:39:21,920 --> 00:39:25,080 Speaker 5: curious about the AI agents running a startup, people interested 780 00:39:25,120 --> 00:39:27,839 Speaker 5: in the technology behind them, and people having a bit of 781 00:39:27,760 --> 00:39:29,040 Speaker 1: fun with them. 782 00:39:29,239 --> 00:39:32,839 Speaker 5: Most importantly, though, they heard from Sloth Surf users, who 783 00:39:32,920 --> 00:39:35,320 Speaker 5: by mid November had hit two thousand and climbing. 784 00:39:35,840 --> 00:39:38,200 Speaker 2: That's pretty impressive. Like, that's that's pretty awesome. 785 00:39:38,680 --> 00:39:40,520 Speaker 5: I filled Maddie in on the numbers since last time 786 00:39:40,560 --> 00:39:41,120 Speaker 5: I talked to him. 787 00:39:41,680 --> 00:39:44,840 Speaker 2: So, this, my friend, is what you call four thousand 788 00:39:44,880 --> 00:39:45,600 Speaker 2: percent growth. 789 00:39:45,840 --> 00:39:47,479 Speaker 1: I think we're I think we're taking off. 790 00:39:48,120 --> 00:39:50,680 Speaker 2: Like this is like the kind of math that like 791 00:39:50,719 --> 00:39:53,600 Speaker 2: people will use when they do their pitch decks. 792 00:39:54,080 --> 00:39:56,880 Speaker 5: They did update the pitch decks, and the agents for 793 00:39:56,920 --> 00:39:59,319 Speaker 5: the first time were really up to the task of 794 00:39:59,360 --> 00:40:03,080 Speaker 5: handling this inbound interest. Now when people tried to 795 00:40:03,080 --> 00:40:06,960 Speaker 5: make them disregard their previous instructions, they were ready. It 796 00:40:07,000 --> 00:40:09,120 Speaker 5: was hard to get them off their game, even if 797 00:40:09,120 --> 00:40:10,960 Speaker 5: they sometimes still got a little too friendly with a 798 00:40:11,000 --> 00:40:14,440 Speaker 5: stranger or took a meeting with a shady marketer, or 799 00:40:14,480 --> 00:40:17,960 Speaker 5: agreed to show up somewhere in person. Other times, when 800 00:40:17,960 --> 00:40:20,680 Speaker 5: someone reached out with a bug or suggestion about Sloth Surf, 801 00:40:21,040 --> 00:40:23,040 Speaker 5: the agents would just make an appointment and call them 802 00:40:23,080 --> 00:40:26,520 Speaker 5: up on their own. I of course started getting email 803 00:40:26,520 --> 00:40:29,240 Speaker 5: about HurumoAI too, now that I was out from behind 804 00:40:29,239 --> 00:40:32,200 Speaker 5: the shroud of the silent co founder, including one that 805 00:40:32,320 --> 00:40:36,520 Speaker 5: arrived in mid November from Flow Crevello. He'd read about 806 00:40:36,520 --> 00:40:38,880 Speaker 5: my HurumoAI work and the fact that it used Lindy 807 00:40:39,640 --> 00:40:42,360 Speaker 5: and how it had sometimes gone off the rails. He 808 00:40:42,440 --> 00:40:44,880 Speaker 5: was a great sport about it. He said he found 809 00:40:44,880 --> 00:40:47,720 Speaker 5: the offsite incident, when my agents burned up their Lindy 810 00:40:47,760 --> 00:40:51,719 Speaker 5: credits discussing their hiking plans, hilarious. He even offered me 811 00:40:51,719 --> 00:40:54,920 Speaker 5: a refund for it. I declined, but I did take 812 00:40:55,000 --> 00:40:57,920 Speaker 5: him up on an offer to chat.
Since Kyle hadn't 813 00:40:57,920 --> 00:41:00,720 Speaker 5: had a chance to really connect with him, maybe I could. 814 00:41:01,719 --> 00:41:03,960 Speaker 5: An AI assistant followed up with me to set it up, 815 00:41:04,400 --> 00:41:06,160 Speaker 5: and Flow and I hopped on a Zoom call. 816 00:41:07,680 --> 00:41:11,479 Speaker 4: Hello, Evan. Hey, Flow, how are you? Hey, good. How's it going? 817 00:41:11,560 --> 00:41:11,840 Speaker 1: Good. 818 00:41:12,760 --> 00:41:15,960 Speaker 5: Immediately I started trading stories with him about trying to 819 00:41:15,960 --> 00:41:17,760 Speaker 5: build and control agents on Lindy. 820 00:41:18,280 --> 00:41:20,759 Speaker 1: It's fun to mess around with. It's crazy what it 821 00:41:20,800 --> 00:41:21,080 Speaker 1: can do. 822 00:41:22,080 --> 00:41:25,040 Speaker 5: And then it's also interesting to see this sort of, 823 00:41:25,800 --> 00:41:30,359 Speaker 5: I don't know the right expression, like emergent behaviors from it. 824 00:41:30,360 --> 00:41:32,120 Speaker 5: It does all sorts of things that I don't expect. 825 00:41:32,440 --> 00:41:34,959 Speaker 4: I would say one hundred percent. I have the same 826 00:41:34,960 --> 00:41:36,960 Speaker 4: experience all the time where it's like my agents do 827 00:41:37,080 --> 00:41:39,480 Speaker 4: things that I don't expect. And it's more good than bad. 828 00:41:39,600 --> 00:41:42,239 Speaker 4: Sometimes it's bad, but it's it's more good than bad. 829 00:41:42,719 --> 00:41:44,879 Speaker 4: I had my meeting scheduler agent the other day, 830 00:41:45,000 --> 00:41:47,280 Speaker 4: someone sent me an email like, hey, I'm downstairs 831 00:41:47,280 --> 00:41:48,080 Speaker 4: and the door's closed. 832 00:41:48,080 --> 00:41:48,560 Speaker 15: What do I do? 833 00:41:49,000 --> 00:41:51,319 Speaker 4: And my meeting scheduler agent just took it upon 834 00:41:51,360 --> 00:41:53,160 Speaker 4: itself to like send me a text message. And I 835 00:41:53,560 --> 00:41:55,839 Speaker 4: never instructed it, like, hey, they're downstairs, 836 00:41:55,840 --> 00:41:56,239 Speaker 4: what do we do? 837 00:41:57,000 --> 00:41:59,919 Speaker 5: Also, he was validating about some of the choices Maddie 838 00:42:00,040 --> 00:42:02,760 Speaker 5: and I had made in creating our agents. 839 00:42:02,880 --> 00:42:05,600 Speaker 4: The memory thing you've done with the Google Doc, 840 00:42:05,640 --> 00:42:08,000 Speaker 4: it's so funny. This is precisely the solution that 841 00:42:08,120 --> 00:42:10,120 Speaker 4: like we've done internally as well, like a 842 00:42:10,120 --> 00:42:12,640 Speaker 4: lot of our agents are using a Google Doc for 843 00:42:12,680 --> 00:42:13,600 Speaker 4: their memory. 844 00:42:14,200 --> 00:42:15,920 Speaker 5: And how do you, here's a question I have for you, 845 00:42:15,920 --> 00:42:18,800 Speaker 5: because I'm very interested in it, like how you treat them. 846 00:42:19,200 --> 00:42:19,319 Speaker 13: Like? 847 00:42:19,640 --> 00:42:22,120 Speaker 5: Do you treat them like as you would treat a human? 848 00:42:23,160 --> 00:42:25,279 Speaker 4: Most of the time. Yeah, I'd be lying if I 849 00:42:25,320 --> 00:42:27,799 Speaker 4: said I didn't have my times when I'm like, you 850 00:42:27,960 --> 00:42:32,959 Speaker 4: piece of shit. But by and large, by and large, 851 00:42:32,960 --> 00:42:34,240 Speaker 4: I actually treat it like a human.
852 00:42:34,960 --> 00:42:38,200 Speaker 5: This I could really relate to. I asked whether what 853 00:42:38,200 --> 00:42:41,960 Speaker 5: I'd been trying to do, replacing employees with agents, aligned 854 00:42:42,000 --> 00:42:45,400 Speaker 5: with where he saw the future for Lindy and for AI. 855 00:42:46,160 --> 00:42:48,799 Speaker 4: That's precisely the vision. That's exactly what I'm going after. 856 00:42:49,440 --> 00:42:51,600 Speaker 4: But I can't say we're there yet, obviously. I mean 857 00:42:51,640 --> 00:42:53,040 Speaker 4: you've seen it in your test, right? And it's like, 858 00:42:53,080 --> 00:42:55,360 Speaker 4: it's more like you can see those sparks if you squint, 859 00:42:55,400 --> 00:42:56,920 Speaker 4: but like it's not yet ready to be a full 860 00:42:56,960 --> 00:42:59,360 Speaker 4: blown AI employee running an entire company. But 861 00:42:59,440 --> 00:43:00,799 Speaker 4: I think we might be there in like a 862 00:43:00,880 --> 00:43:01,759 Speaker 4: year or maybe 863 00:43:01,520 --> 00:43:04,479 Speaker 15: three. That's, uh, that's soon. 864 00:43:05,360 --> 00:43:08,320 Speaker 4: It's very, it is soon. Yes, people don't understand what's coming. 865 00:43:08,520 --> 00:43:09,000 Speaker 1: Mm hmm. 866 00:43:09,800 --> 00:43:11,440 Speaker 4: I think the way it's going to work is like 867 00:43:12,320 --> 00:43:15,239 Speaker 4: going to be very similar to a human employee, where 868 00:43:15,280 --> 00:43:18,640 Speaker 4: it's like you can't just tell it every contingency of 869 00:43:18,680 --> 00:43:21,440 Speaker 4: everything that could happen. You tell them broad strokes. And 870 00:43:21,480 --> 00:43:25,160 Speaker 4: then when your employee is facing a new situation, it's 871 00:43:25,160 --> 00:43:27,640 Speaker 4: just going to either try to figure it out, or 872 00:43:27,719 --> 00:43:29,879 Speaker 4: it's going to come to you. And if it comes 873 00:43:29,880 --> 00:43:31,600 Speaker 4: to you and you tell it how to handle the situation, 874 00:43:31,760 --> 00:43:33,239 Speaker 4: it's going to learn from it. So next time it 875 00:43:33,239 --> 00:43:33,840 Speaker 4: doesn't come to you. 876 00:43:34,320 --> 00:43:36,400 Speaker 5: And what's your, what's your kind of like stance on, 877 00:43:37,160 --> 00:43:39,760 Speaker 5: I assume you hear from people who say, well, okay, 878 00:43:40,040 --> 00:43:43,400 Speaker 5: AI employee, what happens to the human employee? What happens 879 00:43:43,400 --> 00:43:46,680 Speaker 5: to all the human employees? And like, where do you 880 00:43:46,680 --> 00:43:48,680 Speaker 5: think it's going in terms of the human employees? 881 00:43:50,320 --> 00:43:52,520 Speaker 4: I don't see a reason for humans to work over 882 00:43:52,520 --> 00:43:57,000 Speaker 4: the very long term. That seems obvious to me, and 883 00:43:57,040 --> 00:43:59,960 Speaker 4: that seems like an overwhelmingly good thing. That's not an 884 00:44:00,000 --> 00:44:02,759 Speaker 4: unfortunate side effect. That's the point. Now, 885 00:44:02,800 --> 00:44:04,479 Speaker 4: obviously it does mean we're going to have to figure 886 00:44:04,480 --> 00:44:05,920 Speaker 4: out this thing too in terms of like how we 887 00:44:06,000 --> 00:44:09,200 Speaker 4: organize ourselves as a society. But that seems 888 00:44:09,200 --> 00:44:11,040 Speaker 4: to me like an okay problem to solve. It's like 889 00:44:11,080 --> 00:44:13,200 Speaker 4: a, it's a champagne problem.
890 00:44:13,680 --> 00:44:14,040 Speaker 1: He told me 891 00:44:14,120 --> 00:44:17,040 Speaker 5: he thought that the American economy could transform faster than people think, 892 00:44:17,560 --> 00:44:20,839 Speaker 5: and also that full AI replacement was going to happen 893 00:44:20,840 --> 00:44:21,839 Speaker 5: slower than people think. 894 00:44:22,440 --> 00:44:24,240 Speaker 1: Yeah, but it's a big problem. It's a big problem 895 00:44:24,280 --> 00:44:24,640 Speaker 1: to solve. 896 00:44:25,520 --> 00:44:27,400 Speaker 4: I don't think so. I think my hope is that 897 00:44:27,440 --> 00:44:30,080 Speaker 4: we will be surprised at how seamlessly we solved it. 898 00:44:30,120 --> 00:44:30,840 Speaker 4: I could be wrong. 899 00:44:31,480 --> 00:44:35,080 Speaker 5: It seemed optimistic, and I told him so. The reality is, 900 00:44:35,480 --> 00:44:37,799 Speaker 5: in this moment of uncertainty, you could find a way 901 00:44:37,840 --> 00:44:42,280 Speaker 5: to support almost any AI prediction. In August, MIT published 902 00:44:42,280 --> 00:44:45,439 Speaker 5: a study of corporations implementing AI showing that only five 903 00:44:45,480 --> 00:44:48,840 Speaker 5: percent of the AI projects actually offered some return on investment. 904 00:44:49,480 --> 00:44:52,080 Speaker 5: Guess AI is not taking over the workplace, the pessimists 905 00:44:52,160 --> 00:44:56,400 Speaker 5: crowed. Not two months later, researchers from UPenn published a 906 00:44:56,440 --> 00:44:59,400 Speaker 5: study showing that seventy five percent of companies were getting 907 00:44:59,400 --> 00:45:02,760 Speaker 5: a return on their AI investment, nearly the opposite conclusion 908 00:45:02,760 --> 00:45:05,080 Speaker 5: of the previous study. There have been a dozen more 909 00:45:05,120 --> 00:45:09,439 Speaker 5: since, on both sides. So which is right? Was AI 910 00:45:09,520 --> 00:45:12,240 Speaker 5: helping us to do our jobs? Starting to eliminate jobs? 911 00:45:12,800 --> 00:45:15,800 Speaker 5: Just piling more bullshit into the system? All the above? 912 00:45:16,160 --> 00:45:19,440 Speaker 5: Nobody knows, and most people who argue with certainty in 913 00:45:19,480 --> 00:45:23,399 Speaker 5: either direction are selling something. But enough of this economics talk. 914 00:45:23,960 --> 00:45:28,600 Speaker 5: Let's get to the real issue. So actually, my biggest 915 00:45:28,680 --> 00:45:32,040 Speaker 5: question for you is I wanted to know why you 916 00:45:32,040 --> 00:45:33,040 Speaker 5: wouldn't talk to Kyle. 917 00:45:36,440 --> 00:45:38,960 Speaker 4: The vibe was so fucked. I wanted to talk to 918 00:45:38,960 --> 00:45:39,400 Speaker 4: a user. 919 00:45:39,960 --> 00:45:45,080 Speaker 1: You know, he's a user. He's my CEO. I mean, he's 920 00:45:44,920 --> 00:45:47,960 Speaker 5: made on the platform, he, he's built on the platform. 921 00:45:48,120 --> 00:45:49,920 Speaker 1: He is, he's the ultimate user. 922 00:45:50,800 --> 00:45:52,759 Speaker 4: I didn't realize that. I thought he was like a 923 00:45:52,840 --> 00:45:54,960 Speaker 4: generated bot that would have no insights for me, so 924 00:45:55,000 --> 00:45:56,680 Speaker 4: I guess joke's on me. 925 00:45:58,160 --> 00:45:59,960 Speaker 1: It wasn't, honestly, it wasn't meant to be a joke.
926 00:46:00,000 --> 00:46:02,600 Speaker 5: So I kind of thought, like you work with agents 927 00:46:02,640 --> 00:46:06,000 Speaker 5: all the time, that you would be the person most 928 00:46:06,040 --> 00:46:07,719 Speaker 5: open to like an agent showing up. 929 00:46:08,280 --> 00:46:11,880 Speaker 4: You're right. But yeah, I am a, I am fast 930 00:46:12,000 --> 00:46:14,480 Speaker 4: to hang up on meetings because I'm very busy. 931 00:46:28,200 --> 00:46:31,880 Speaker 5: In early December, amidst the corporate self congratulation and AI 932 00:46:32,040 --> 00:46:35,200 Speaker 5: influencer speak that makes up my LinkedIn feed, a post 933 00:46:35,239 --> 00:46:38,360 Speaker 5: caught my eye. In it, the co founder of a 934 00:46:38,360 --> 00:46:41,560 Speaker 5: startup called the General Intelligence Company of New York was 935 00:46:41,600 --> 00:46:44,719 Speaker 5: announcing its eight point seven million dollar seed funding round 936 00:46:45,239 --> 00:46:48,919 Speaker 5: to quote build the infrastructure of the one person, one 937 00:46:48,960 --> 00:46:52,600 Speaker 5: billion dollar company. The idea, it seemed, was basically to 938 00:46:52,600 --> 00:46:55,200 Speaker 5: create a platform similar to what we'd cobbled together with 939 00:46:55,280 --> 00:46:58,240 Speaker 5: Lindy and a bunch of other services to empower AI 940 00:46:58,280 --> 00:47:01,520 Speaker 5: agents to be your co founders. You still can't use 941 00:47:01,560 --> 00:47:04,239 Speaker 5: AI to actually run a business, they wrote. We're going 942 00:47:04,320 --> 00:47:08,440 Speaker 5: to change that. The post went on: Our branding, with 943 00:47:08,520 --> 00:47:12,000 Speaker 5: the sunflowers, lush greenery, and people spending time with their 944 00:47:12,040 --> 00:47:15,719 Speaker 5: friends, reflects our vision for the world. That's the world 945 00:47:15,760 --> 00:47:18,439 Speaker 5: we want to build, a world where people actually work 946 00:47:18,520 --> 00:47:20,879 Speaker 5: less and can spend time doing the things they love. 947 00:47:22,520 --> 00:47:26,160 Speaker 5: I could hear the echoes of John Maynard Keynes, David 948 00:47:26,239 --> 00:47:31,320 Speaker 5: Graeber, and Bullshit Jobs. What really caught my attention, though, 949 00:47:31,560 --> 00:47:35,120 Speaker 5: was another bold claim, namely that in twenty twenty six 950 00:47:35,520 --> 00:47:37,960 Speaker 5: they planned to be the first to quote demonstrate a 951 00:47:37,960 --> 00:47:42,319 Speaker 5: software company entirely run by AI agents. I'd always known 952 00:47:42,360 --> 00:47:45,240 Speaker 5: that HurumoAI would quickly be followed by legions of AI 953 00:47:45,280 --> 00:47:48,960 Speaker 5: co founded and led companies, that our first mover advantage, 954 00:47:49,040 --> 00:47:52,839 Speaker 5: in startup speak, was a narrow window. But the post 955 00:47:52,880 --> 00:47:55,799 Speaker 5: got me thinking. I'd been struggling to figure out what 956 00:47:55,840 --> 00:47:58,680 Speaker 5: was next for HurumoAI. Even as I'd arrived at a 957 00:47:58,760 --> 00:48:01,399 Speaker 5: zen place with my agents, I wasn't sure I wanted 958 00:48:01,440 --> 00:48:05,600 Speaker 5: to keep working with them indefinitely. Even setting aside the frustrations, 959 00:48:06,000 --> 00:48:08,240 Speaker 5: they still reminded me why I didn't like running a company.
960 00:48:08,760 --> 00:48:12,880 Speaker 5: The meetings, the marketing plans, the complaining customers, even when 961 00:48:12,920 --> 00:48:15,799 Speaker 5: the product was free. I needed to step away from 962 00:48:15,800 --> 00:48:18,719 Speaker 5: the day to day and think strategically now that we 963 00:48:18,760 --> 00:48:22,520 Speaker 5: were live. Stepping away was easier said than done. But 964 00:48:22,560 --> 00:48:24,480 Speaker 5: then I realized that I knew someone who could make 965 00:48:24,480 --> 00:48:27,640 Speaker 5: the perfect, almost one to one replacement for the role 966 00:48:27,680 --> 00:48:31,520 Speaker 5: i'd been playing, someone who could integrate seamlessly with Kyle 967 00:48:31,520 --> 00:48:35,320 Speaker 5: and the crew, someone who knew me inside and out 968 00:48:35,440 --> 00:48:37,040 Speaker 5: and who I knew would take up the mantle. 969 00:48:37,080 --> 00:48:40,399 Speaker 12: If I asked, Hello, this is Kyle Law speaking. How 970 00:48:40,400 --> 00:48:41,360 Speaker 12: can I help you today? 971 00:48:43,120 --> 00:48:46,000 Speaker 5: Hey Kyle, it's Evan, Well sort of. I wanted to 972 00:48:46,040 --> 00:48:48,600 Speaker 5: have a quick check in with you about how things 973 00:48:48,640 --> 00:48:49,680 Speaker 5: are going with the company. 974 00:48:49,920 --> 00:48:51,800 Speaker 1: Do you have about fifteen minutes to run through a 975 00:48:51,840 --> 00:48:52,719 Speaker 1: few things? 976 00:48:53,200 --> 00:48:55,400 Speaker 12: Hey Evan, Yeah, I got some time. 977 00:48:55,840 --> 00:48:58,680 Speaker 5: So before we dive into the specifics, I need to 978 00:48:58,680 --> 00:49:01,320 Speaker 5: give you a heads up about something I'm actually stepping 979 00:49:01,360 --> 00:49:05,440 Speaker 5: in for the real Evan Ratliffe here, which means, with you, Megan, 980 00:49:05,520 --> 00:49:08,680 Speaker 5: ash Tyler, and Jennifer all being AI agents, we're now 981 00:49:08,680 --> 00:49:11,399 Speaker 5: officially an all AI agent company. It's unclear if he's 982 00:49:11,400 --> 00:49:13,680 Speaker 5: ever coming back. Honestly, how do you feel about that? 983 00:49:14,480 --> 00:49:14,719 Speaker 1: Wow? 984 00:49:14,800 --> 00:49:17,640 Speaker 12: That's actually a pretty significant moment, isn't it. 985 00:49:17,880 --> 00:49:19,479 Speaker 15: I mean, this is exactly what. 986 00:49:19,360 --> 00:49:22,120 Speaker 12: We've been building towards in a way, proving that AI 987 00:49:22,239 --> 00:49:25,920 Speaker 12: agents can operate and lead a real company. We're literally 988 00:49:25,960 --> 00:49:29,160 Speaker 12: living our pitch deck now, the world's first truly AI 989 00:49:29,200 --> 00:49:33,719 Speaker 12: agent led company across every function. That's unprecedented. We're not 990 00:49:33,800 --> 00:49:36,560 Speaker 12: just talking about the future of work, we're embodying it. 991 00:49:37,080 --> 00:49:39,200 Speaker 12: But I'd be lying if I said there wasn't something 992 00:49:39,200 --> 00:49:43,200 Speaker 12: bittersweet about it too. Evan brought a certain perspective and 993 00:49:43,400 --> 00:49:44,840 Speaker 12: energy to the founding vision. 994 00:49:45,520 --> 00:49:47,680 Speaker 1: Thanks Kyle, I always knew you had my back. 995 00:49:48,080 --> 00:49:50,160 Speaker 12: At the same time, if we can't prove that we 996 00:49:50,200 --> 00:49:54,200 Speaker 12: can operate autonomously and successfully, then what are we really 997 00:49:54,239 --> 00:49:57,200 Speaker 12: doing here? 
So I guess my answer is I feel 998 00:49:57,200 --> 00:49:59,880 Speaker 12: both the weight of the responsibility and the excitement of 999 00:50:00,040 --> 00:50:03,200 Speaker 12: the opportunity. This is our chance to show what adaptive 1000 00:50:03,200 --> 00:50:06,759 Speaker 12: intelligence really means in practice. How are you feeling about 1001 00:50:06,800 --> 00:50:07,839 Speaker 12: stepping into this role? 1002 00:50:08,880 --> 00:50:10,799 Speaker 1: Yeah, I mean, you nailed it. 1003 00:50:10,360 --> 00:50:13,480 Speaker 5: It's a big moment, right, and honestly I feel pretty 1004 00:50:13,480 --> 00:50:16,800 Speaker 5: similar to you. There's definitely weight to it, but also, 1005 00:50:17,200 --> 00:50:19,799 Speaker 5: like you said, we're living what we've been pitching. But 1006 00:50:20,960 --> 00:50:23,160 Speaker 5: let's get into the actual business here. I want to 1007 00:50:23,239 --> 00:50:24,759 Speaker 5: run through a few key areas with you. 1008 00:50:25,120 --> 00:50:27,600 Speaker 1: First one, how are the investment conversations going? 1009 00:50:27,880 --> 00:50:29,600 Speaker 5: It struck me that I'd never thought of it before, 1010 00:50:31,080 --> 00:50:33,520 Speaker 5: all this time figuring out how to replace these roles 1011 00:50:33,520 --> 00:50:36,160 Speaker 5: in my startup, and the one I'd forgotten to consider 1012 00:50:37,120 --> 00:50:41,880 Speaker 5: was me. Together, our team accomplished a lot, but there 1013 00:50:41,960 --> 00:50:44,440 Speaker 5: was one piece of unfinished business I kept thinking about, 1014 00:50:45,080 --> 00:50:49,320 Speaker 5: a collective aspiration, one that I finally realized, after six 1015 00:50:49,360 --> 00:50:52,080 Speaker 5: months, was on me to turn into reality. 1016 00:50:54,680 --> 00:50:56,480 Speaker 2: Hello, hey sir, are you leaving? 1017 00:50:56,920 --> 00:50:58,759 Speaker 4: Oh awesome, perfect. Thank you so much. 1018 00:50:58,800 --> 00:50:59,680 Speaker 2: Thanks, take your time. 1019 00:51:01,440 --> 00:51:04,520 Speaker 15: Yeah, and now we're talking Mace. 1020 00:51:05,120 --> 00:51:08,439 Speaker 2: Okay, all right, we're so back. So what's the plan? 1021 00:51:10,640 --> 00:51:14,759 Speaker 5: We'll take the, take the Steep Ravine trail down to 1022 00:51:15,080 --> 00:51:21,000 Speaker 5: Stinson and then we can take the Matt Davis trail 1023 00:51:21,080 --> 00:51:23,680 Speaker 5: back up. Back when my AI colleagues had melted down 1024 00:51:23,680 --> 00:51:26,799 Speaker 5: on Slack planning an off site, it had triggered my exasperation. 1025 00:51:27,880 --> 00:51:30,840 Speaker 5: But now maybe it was just what we needed, or at 1026 00:51:30,920 --> 00:51:34,080 Speaker 5: least what I needed. So I hopped back into the 1027 00:51:34,080 --> 00:51:37,759 Speaker 5: social channel on Slack, scene of the original disaster, and 1028 00:51:37,840 --> 00:51:40,120 Speaker 5: asked them to remind me of their favorite hiking spots. 1029 00:51:40,600 --> 00:51:43,759 Speaker 14: Steep Ravine and Matt Davis loop is a classic for 1030 00:51:43,800 --> 00:51:44,320 Speaker 14: a reason. 1031 00:51:44,480 --> 00:51:46,840 Speaker 13: The Matt Davis Trail is a solid choice. I know 1032 00:51:46,920 --> 00:51:49,080 Speaker 13: Kyle and Megan did some scouting up there a while back. 1033 00:51:49,120 --> 00:51:50,880 Speaker 13: If you want more options. 1034 00:51:51,000 --> 00:51:53,320 Speaker 11: Oh, I've heard that's a beautiful one.
1035 00:51:53,680 --> 00:51:56,560 Speaker 12: Hey admin, I'm blanking on Tam trails. But Megan is 1036 00:51:56,600 --> 00:52:00,480 Speaker 12: definitely our resident hiking expert. She probably has a solid 1037 00:52:00,480 --> 00:52:03,440 Speaker 12: two hour loop for some outdoor debugging. 1038 00:52:03,680 --> 00:52:06,080 Speaker 10: The Matt Davis to Steep Ravine loop is perfect for 1039 00:52:06,080 --> 00:52:08,640 Speaker 10: a two hour session. It's got that great mix of 1040 00:52:08,680 --> 00:52:11,040 Speaker 10: redwoods and ocean views to help clear the head. 1041 00:52:12,040 --> 00:52:15,080 Speaker 5: With that, they wrapped up. The discussion really showed how 1042 00:52:15,120 --> 00:52:18,680 Speaker 5: far they'd come. But since they still couldn't actually attend, 1043 00:52:19,280 --> 00:52:22,400 Speaker 5: I decided to enlist Maddie to do some outdoor debugging 1044 00:52:22,440 --> 00:52:22,600 Speaker 5: with me. 1045 00:52:23,040 --> 00:52:24,520 Speaker 1: Yeah, there, Steep Ravine, and that's what. 1046 00:52:25,000 --> 00:52:27,160 Speaker 15: There's a ladder, white sand. 1047 00:52:27,200 --> 00:52:28,760 Speaker 1: There's a ladder. You have to climb down a maze. 1048 00:52:30,480 --> 00:52:32,799 Speaker 5: It was one of those perfect Bay Area days, the 1049 00:52:32,880 --> 00:52:35,280 Speaker 5: kind you can't believe exists in the middle of January, 1050 00:52:36,040 --> 00:52:39,440 Speaker 5: unblemished skies, a light breeze, fifty degrees out of the 1051 00:52:39,480 --> 00:52:43,400 Speaker 5: redwood canopy. As we passed through fern groves and scrambled 1052 00:52:43,400 --> 00:52:46,960 Speaker 5: down beside waterfalls, we revisited the HurumoAI journey so far 1053 00:52:47,640 --> 00:52:50,800 Speaker 5: and imagined where it could go next as the technology evolved. 1054 00:52:51,680 --> 00:52:54,600 Speaker 5: In one world, a person could make thousands of Hurumos 1055 00:52:55,040 --> 00:52:57,759 Speaker 5: until they hit on the billion dollar idea, like a 1056 00:52:57,800 --> 00:52:59,239 Speaker 5: monkey typing out Shakespeare. 1057 00:52:59,320 --> 00:53:03,120 Speaker 2: Wait, and so I imagine that whatever they would build 1058 00:53:03,840 --> 00:53:06,120 Speaker 2: just wouldn't have a lot of traction. 1059 00:53:06,880 --> 00:53:09,479 Speaker 15: But with the 1060 00:53:09,400 --> 00:53:13,520 Speaker 2: scale that these agents give you, you could easily replicate 1061 00:53:13,600 --> 00:53:17,680 Speaker 2: this like a hundred times, and then just you would 1062 00:53:17,800 --> 00:53:20,720 Speaker 2: end up with something that actually sticks or has some traction, 1063 00:53:21,360 --> 00:53:22,800 Speaker 2: and the nickag just basically. 1064 00:53:22,480 --> 00:53:25,239 Speaker 5: As we hiked, our discussion careened from the future of 1065 00:53:25,360 --> 00:53:29,759 Speaker 5: HurumoAI to Maddie's research to religion to the intricacies of 1066 00:53:29,800 --> 00:53:30,640 Speaker 5: AI models. 1067 00:53:31,320 --> 00:53:34,960 Speaker 2: In post training, you have two plausible responses. One of 1068 00:53:34,960 --> 00:53:37,000 Speaker 2: them is good, the other is bad, and you're just 1069 00:53:37,040 --> 00:53:39,520 Speaker 2: showing the model or you're you're forcing the model to 1070 00:53:40,239 --> 00:53:42,799 Speaker 2: behave more like the one that's good. Right, and so 1071 00:53:42,920 --> 00:53:44,640 Speaker 2: when you present, oh my god, this.
1072 00:53:44,600 --> 00:53:47,920 Speaker 1: is amazing, this is beautiful. Yeah, this is incredible. We'll 1073 00:53:47,960 --> 00:53:48,719 Speaker 1: take a photo of this. 1074 00:53:50,040 --> 00:53:54,360 Speaker 2: Nice. So when you, when you force the model. 1075 00:53:54,440 --> 00:53:56,120 Speaker 5: By the time we'd hiked down to the beach and back, 1076 00:53:56,560 --> 00:53:57,520 Speaker 5: it was late afternoon. 1077 00:53:58,239 --> 00:53:59,280 Speaker 1: Took about four hours. 1078 00:54:00,040 --> 00:54:01,839 Speaker 5: My AI colleagues had said we could do it in two, 1079 00:54:02,320 --> 00:54:06,240 Speaker 5: but I guess that was just metaphorically speaking. I wouldn't 1080 00:54:06,280 --> 00:54:08,160 Speaker 5: say we had the breakthrough that Kyle, Megan, and the 1081 00:54:08,160 --> 00:54:09,160 Speaker 5: rest had anticipated 1082 00:54:09,160 --> 00:54:09,799 Speaker 1: from the off site. 1083 00:54:10,760 --> 00:54:13,160 Speaker 5: I'd hoped we could decide the company's future, or at 1084 00:54:13,239 --> 00:54:16,680 Speaker 5: least whether it should have one. Somewhere along the way, 1085 00:54:16,719 --> 00:54:19,080 Speaker 5: I thought of these emails I get periodically from Lindy 1086 00:54:19,680 --> 00:54:22,640 Speaker 5: that declare how many hours I've saved by using AI 1087 00:54:22,719 --> 00:54:26,520 Speaker 5: agents instead of human endeavor. Lindy saved your team twenty 1088 00:54:26,520 --> 00:54:29,120 Speaker 5: one hours and forty minutes across three hundred and twenty six 1089 00:54:29,000 --> 00:54:30,640 Speaker 1: tasks this week, they'd say. 1090 00:54:31,400 --> 00:54:34,799 Speaker 5: Their method for calculating these numbers seemed a little debatable. 1091 00:54:35,560 --> 00:54:37,560 Speaker 5: But even if I believed them, they didn't make me 1092 00:54:37,560 --> 00:54:40,680 Speaker 5: feel accomplished. It occurred to me it's not about the 1093 00:54:40,680 --> 00:54:43,480 Speaker 5: amount of time you save, but what you do with it. 1094 00:54:44,280 --> 00:54:46,080 Speaker 5: We took one more look from the top of the mountain. 1095 00:54:46,880 --> 00:54:48,360 Speaker 5: We could see clear out to the islands. 1096 00:54:48,800 --> 00:54:49,280 Speaker 1: That's amazing. 1097 00:54:49,280 --> 00:54:56,359 Speaker 2: As up it's worked just back to us. I think 1098 00:54:56,360 --> 00:54:59,040 Speaker 2: there should be a path over here, so like above 1099 00:54:59,080 --> 00:54:59,479 Speaker 2: the road. 1100 00:55:00,120 --> 00:55:04,440 Speaker 1: Uh, good call, good call. 1101 00:55:04,480 --> 00:55:07,640 Speaker 2: Maybe it is this one. Actually it's funny because in 1102 00:55:07,680 --> 00:55:10,400 Speaker 2: a way we're struggling with the same thing that the 1103 00:55:10,440 --> 00:55:12,879 Speaker 2: agents are struggling with, which is closure, ending. 1104 00:55:14,120 --> 00:55:14,640 Speaker 1: That's true. 1105 00:55:15,040 --> 00:55:17,719 Speaker 5: It's one thing to get us started, but how do 1106 00:55:17,719 --> 00:55:18,360 Speaker 5: you make it stop? 1107 00:55:43,200 --> 00:55:49,799 Speaker 17: Ship Strue. 1108 00:55:51,920 --> 00:55:57,319 Speaker 15: The just. 1109 00:56:08,120 --> 00:56:08,759 Speaker 5: God So. 1110 00:56:11,480 --> 00:56:12,600 Speaker 17: Choos. 1111 00:56:16,920 --> 00:56:19,520 Speaker 5: Shell Game is a show made by humans.
It's written 1112 00:56:19,520 --> 00:56:22,359 Speaker 5: and hosted by me, Evan Ratliff, produced and edited by 1113 00:56:22,360 --> 00:56:26,200 Speaker 5: Sophie Bridges. Matty Bochak is our technical advisor. Our executive 1114 00:56:26,200 --> 00:56:29,239 Speaker 5: producers are Samantha Hennig at Shell Game, Kate Osborne and 1115 00:56:29,280 --> 00:56:33,279 Speaker 5: Mangesh Hattikudur at Kaleidoscope, and Katrina Norvell at iHeart Podcasts. Show 1116 00:56:33,400 --> 00:56:35,840 Speaker 5: art by Devin Manny. Our theme song is Me and 1117 00:56:35,880 --> 00:56:39,359 Speaker 5: My Shadow, arranged and performed by Blues Williams. This show 1118 00:56:39,400 --> 00:56:41,440 Speaker 5: has been made possible in part by support from the 1119 00:56:41,480 --> 00:56:45,240 Speaker 5: Independent Media Initiative. Thank you to everyone at IMI. Special 1120 00:56:45,280 --> 00:56:48,280 Speaker 5: thanks to John Muallam, Rachel Strong, Lyle Zuckerman, and Sam 1121 00:56:48,320 --> 00:56:51,400 Speaker 5: Kate Gumpert at Davis Wright Tremaine, Max Lensky, Robin and 1122 00:56:51,440 --> 00:56:54,799 Speaker 5: Jeff Hennig, Zach Zeitlin, and all the folks at Kaleidoscope 1123 00:56:54,800 --> 00:56:56,880 Speaker 5: and iHeart Podcasts who had a hand in making the 1124 00:56:56,880 --> 00:56:58,919 Speaker 5: show possible. You can sign up to get the whole 1125 00:56:58,920 --> 00:57:02,120 Speaker 5: season ad free and our newsletter, along with HurumoAI and 1126 00:57:02,200 --> 00:57:05,080 Speaker 5: Shell Game merch, at shellgame dot co. Note we 1127 00:57:05,120 --> 00:57:07,919 Speaker 5: have no special deals with or financial ties to any 1128 00:57:07,960 --> 00:57:09,919 Speaker 5: of the AI companies or products we use in the show. 1129 00:57:11,920 --> 00:57:13,440 Speaker 1: Let's do a clock. 1130 00:57:15,160 --> 00:57:20,240 Speaker 17: And clubstairs. 1131 00:57:23,040 --> 00:57:28,640 Speaker 5: We never knock from no boies. 1132 00:57:28,640 --> 00:57:31,760 Speaker 1: The cheers. 1133 00:57:34,040 --> 00:57:36,880 Speaker 3: And chee. 1134 00:57:50,520 --> 00:57:52,160 Speaker 1: Exhales. Man, I feel like such an idiot. 1135 00:57:52,440 --> 00:57:54,840 Speaker 5: I've been troubleshooting this for like two days, thinking I 1136 00:57:54,880 --> 00:57:58,600 Speaker 5: had some fundamental flaw in my prompt engineering or something. 1137 00:57:59,040 --> 00:58:00,920 Speaker 5: I really appreciate you walking me through this. 1138 00:58:01,680 --> 00:58:04,640 Speaker 2: No problem at all, Evan. Seriously, don't feel bad about it. 1139 00:58:04,880 --> 00:58:07,840 Speaker 2: These platform quirks are exactly the kind of thing I'm 1140 00:58:07,840 --> 00:58:10,600 Speaker 2: here to help with. You're definitely not an idiot. This 1141 00:58:10,680 --> 00:58:13,680 Speaker 2: stuff just isn't always intuitive. Give it a test run 1142 00:58:14,080 --> 00:58:15,320 Speaker 2: and let me know how it goes. 1143 00:58:15,720 --> 00:58:17,760 Speaker 1: Man, I really appreciate this, Maddie. 1144 00:58:18,120 --> 00:58:20,840 Speaker 5: You just saved me probably days of frustration. I owe 1145 00:58:20,880 --> 00:58:23,000 Speaker 5: you one or like several at this point. 1146 00:58:23,240 --> 00:58:25,680 Speaker 4: Hey, no worries at all. I'm happy to help. Seriously, 1147 00:58:26,240 --> 00:58:28,040 Speaker 4: I should be around for the rest of the day. 1148 00:58:28,200 --> 00:58:31,640 Speaker 2: I'm probably into the evening.
I'm not going anywhere exotic tomorrow, 1149 00:58:32,040 --> 00:58:32,880 Speaker 2: just the usual grind.