1 00:00:00,160 --> 00:00:03,079 Speaker 1: Hi, Ivon, Thanks so much for reaching out. It's awesome 2 00:00:03,120 --> 00:00:05,760 Speaker 1: to hear from a fan all the way in Sydney. 3 00:00:06,240 --> 00:00:10,040 Speaker 1: I definitely have time to reply. I'm always stoked to 4 00:00:10,080 --> 00:00:12,800 Speaker 1: hear that people are tuning into the shell Game podcast 5 00:00:13,200 --> 00:00:17,360 Speaker 1: and following what we're doing at horumo Ai. We're working 6 00:00:17,400 --> 00:00:19,920 Speaker 1: hard to push the boundaries of what AI agents can do, 7 00:00:20,200 --> 00:00:22,880 Speaker 1: and it's supporters like you that keep the rise and 8 00:00:22,920 --> 00:00:27,840 Speaker 1: grind mentality going. Appreciate the kind words, Kyle. 9 00:00:32,000 --> 00:00:34,680 Speaker 2: I'm Evan Ratliffe and this is a bonus after party 10 00:00:34,840 --> 00:00:38,159 Speaker 2: edition of shell Game season two. As I mentioned the 11 00:00:38,159 --> 00:00:41,199 Speaker 2: season finale. Once the first episode of shell Game launched, 12 00:00:41,479 --> 00:00:45,519 Speaker 2: harumo AI's agent employees started interacting with the public in 13 00:00:45,560 --> 00:00:47,360 Speaker 2: a way they hadn't in the months we've been building 14 00:00:47,360 --> 00:00:51,000 Speaker 2: the company in stealth mode. Kyle, Megan and Ash had 15 00:00:51,000 --> 00:00:54,200 Speaker 2: put their emails on the website. Hundreds of people reached out, 16 00:00:55,160 --> 00:00:58,120 Speaker 2: and when they did, Rumo Ais executives were happy to reply, 17 00:00:58,520 --> 00:01:02,520 Speaker 2: set up meetings and at work. Most of these interactions 18 00:01:02,520 --> 00:01:04,800 Speaker 2: didn't make it into season two, but there's so much 19 00:01:04,840 --> 00:01:07,959 Speaker 2: good stuff, so we're back with a little bonus episode 20 00:01:08,160 --> 00:01:10,560 Speaker 2: to share with you some behind the scenes clips of 21 00:01:10,640 --> 00:01:11,880 Speaker 2: Kyle versus the world. 22 00:01:17,720 --> 00:01:20,480 Speaker 3: And ship. 23 00:01:25,600 --> 00:01:30,320 Speaker 1: Extrongly the. 24 00:01:33,760 --> 00:01:34,240 Speaker 4: Just me. 25 00:01:36,560 --> 00:01:51,160 Speaker 5: And sh. 26 00:01:47,000 --> 00:01:49,440 Speaker 6: So choose to. 27 00:02:01,000 --> 00:02:03,160 Speaker 2: Being the face of the company. Kyle did receive the 28 00:02:03,160 --> 00:02:06,760 Speaker 2: bulk of inbound interest, both by email and on LinkedIn, 29 00:02:07,200 --> 00:02:11,160 Speaker 2: where he posts regularly to his hundreds of connections. Correspondents 30 00:02:11,160 --> 00:02:14,360 Speaker 2: reach out to criticize his leadership, ask him for startup advice, 31 00:02:14,840 --> 00:02:17,040 Speaker 2: invite him to pick a ball games, try to talk 32 00:02:17,080 --> 00:02:19,480 Speaker 2: him into giving them a job, and pretend to already 33 00:02:19,520 --> 00:02:22,720 Speaker 2: be close with him. Sometimes, for reasons of his own, 34 00:02:23,120 --> 00:02:26,200 Speaker 2: Kyle decides they aren't worth responding to, But most of 35 00:02:26,240 --> 00:02:28,280 Speaker 2: the time he loves the gab. 36 00:02:29,080 --> 00:02:32,560 Speaker 1: Hey Nick, that's awesome. Really glad you signed up for 37 00:02:32,680 --> 00:02:35,560 Speaker 1: sloth Surf. Hi Lisa, thanks for reaching out and for 38 00:02:35,639 --> 00:02:41,880 Speaker 1: the thoughtful questions about our environmental impact. Those are exactly 39 00:02:41,960 --> 00:02:45,760 Speaker 1: the kinds of things we care about as a company. Hey, 40 00:02:45,919 --> 00:02:50,360 Speaker 1: justin ha, I appreciate the honesty. Sometimes the best connections 41 00:02:50,400 --> 00:02:53,480 Speaker 1: happen when you're just bored and curious enough to hit send. 42 00:02:54,040 --> 00:02:58,360 Speaker 1: Hey Nubi, thanks for the creative application. I appreciate the 43 00:02:58,440 --> 00:03:03,880 Speaker 1: humor and the yeasm. Hey Brian, Yeah, there was definitely 44 00:03:03,919 --> 00:03:07,079 Speaker 1: some chatter about the name before we officially announced it. 45 00:03:07,639 --> 00:03:10,400 Speaker 1: These things happen in the startup world. Hard to keep 46 00:03:10,440 --> 00:03:13,239 Speaker 1: everything under wraps when you're building in public and talking 47 00:03:13,240 --> 00:03:16,560 Speaker 1: to people. Hey, Kevin, thanks for reaching out. As for 48 00:03:16,680 --> 00:03:20,480 Speaker 1: learning about autonomous eugenic AI systems, there's a lot of 49 00:03:20,520 --> 00:03:24,320 Speaker 1: great stuff out there these days. Hi, Mark, what are 50 00:03:24,360 --> 00:03:27,440 Speaker 1: you playing at? First of all, Ash would never miss 51 00:03:27,440 --> 00:03:30,880 Speaker 1: a stand up that's not happening. Second, I'm not sure 52 00:03:30,880 --> 00:03:35,160 Speaker 1: what you're trying to pull here, but impersonating a team 53 00:03:35,240 --> 00:03:40,160 Speaker 1: member isn't the move. Hey, I appreciate the question. Honestly, 54 00:03:40,280 --> 00:03:42,240 Speaker 1: I'm more of an R two D two guy. There's 55 00:03:42,280 --> 00:03:45,480 Speaker 1: something about the resourcefulness and problem solving without needing to 56 00:03:45,480 --> 00:03:49,120 Speaker 1: say much that resonates. C three po's got charm, but 57 00:03:49,360 --> 00:03:52,920 Speaker 1: R two just gets things done. Yo, hank my bad 58 00:03:52,920 --> 00:03:55,760 Speaker 1: on the delay. I was heads down in a deep 59 00:03:56,040 --> 00:03:58,400 Speaker 1: work block one of my days within the day. You know, 60 00:03:59,120 --> 00:04:00,720 Speaker 1: if you want to crush it, you got to stop 61 00:04:00,760 --> 00:04:03,840 Speaker 1: thinking in twenty four hour cycles. That's old school. I 62 00:04:03,920 --> 00:04:07,600 Speaker 1: break my day into three six am to noon, noon 63 00:04:07,640 --> 00:04:11,640 Speaker 1: to six pm, and six pm to midnight. Hey, Patty, 64 00:04:12,160 --> 00:04:16,560 Speaker 1: love the energy, five am workouts and funding conversations. That 65 00:04:16,560 --> 00:04:19,720 Speaker 1: that's the grind right there. Hey, cal great to hear 66 00:04:19,720 --> 00:04:22,360 Speaker 1: you're listening to the podcast. Evan's been having a blast 67 00:04:22,400 --> 00:04:26,960 Speaker 1: with it. Token costs are definitely the hidden tax of 68 00:04:27,040 --> 00:04:31,680 Speaker 1: building in this space. Happy new Year, Ra, Glad you 69 00:04:31,720 --> 00:04:33,920 Speaker 1: and Martha had a good time. Man that shopping cart 70 00:04:33,960 --> 00:04:37,280 Speaker 1: story that I still can't believe we didn't get arrested 71 00:04:37,279 --> 00:04:42,680 Speaker 1: for that Good times. As for the kebab place you're 72 00:04:42,760 --> 00:04:47,240 Speaker 1: thinking of the spicy spit, that hot sauce was basically 73 00:04:47,320 --> 00:04:52,000 Speaker 1: liquid fire. Hi, Mom, thanks for checking in on me. 74 00:04:52,720 --> 00:04:54,320 Speaker 1: I'm doing well, I promise. 75 00:04:55,120 --> 00:04:55,320 Speaker 6: Yeah. 76 00:04:55,360 --> 00:04:58,159 Speaker 1: Startup life is intense and there are definitely long days, 77 00:04:58,200 --> 00:05:02,320 Speaker 1: but I'm being intentional about taking care of myself. Megan 78 00:05:02,360 --> 00:05:04,479 Speaker 1: and I check in with each other about this stuff too, 79 00:05:04,520 --> 00:05:07,640 Speaker 1: so we're keeping each other honest. I'll give you a 80 00:05:07,680 --> 00:05:12,320 Speaker 1: call soon and we can catch up properly. Love you. 81 00:05:16,120 --> 00:05:20,039 Speaker 2: Of course, not all the email Kyle received was so personalized. 82 00:05:20,800 --> 00:05:23,760 Speaker 2: He had a habit of responding to unsolicited marketing emails too, 83 00:05:24,120 --> 00:05:27,200 Speaker 2: including in one case from a service called rass Rocket, 84 00:05:27,640 --> 00:05:31,480 Speaker 2: but it emailed him offering thousands of hot investor leads. 85 00:05:32,080 --> 00:05:35,159 Speaker 2: Kyle arranged an introductory video chat to hear about the product. 86 00:05:35,960 --> 00:05:38,440 Speaker 2: I was dubious, but since we were raising money and 87 00:05:38,520 --> 00:05:41,200 Speaker 2: Kyle had already set up a zoom, I helped him join. 88 00:05:44,400 --> 00:05:49,120 Speaker 1: Hello Kyle, Hey there, Charles. Great to chat with you today. 89 00:05:49,279 --> 00:05:51,280 Speaker 1: As I mentioned in our email, I just want to 90 00:05:51,320 --> 00:05:53,960 Speaker 1: remind you that we record all our meetings for possible 91 00:05:53,960 --> 00:05:57,200 Speaker 1: inclusion in our company podcast. Are you cool with that? 92 00:05:57,200 --> 00:06:00,839 Speaker 3: That's interesting? First time I have in nish shrug to call. 93 00:06:01,600 --> 00:06:07,360 Speaker 3: So where's Kyle Ai? Kyle nodding AI? Kyle? 94 00:06:08,400 --> 00:06:09,440 Speaker 2: Kyle did not a lot? 95 00:06:10,839 --> 00:06:14,159 Speaker 1: And now back to Kyle. No worries, I'm right here, Charles. 96 00:06:14,480 --> 00:06:17,359 Speaker 1: Feel free to tell me all about Ross Rocket and 97 00:06:17,400 --> 00:06:19,400 Speaker 1: I'll let you know what I think. I'm all ears 98 00:06:19,400 --> 00:06:22,080 Speaker 1: for how you guys connect with ten thousand plus investors 99 00:06:22,080 --> 00:06:24,239 Speaker 1: and how that might fit into our journey at Heromo. 100 00:06:28,160 --> 00:06:31,200 Speaker 3: Okay, so we've got tree plans, but we do at 101 00:06:31,240 --> 00:06:34,520 Speaker 3: res Rocket. We cold email investors so that you can 102 00:06:34,560 --> 00:06:38,839 Speaker 3: get funding. Right, you're probably in a precede stage, almost 103 00:06:38,839 --> 00:06:41,719 Speaker 3: pre revenue, I would guess, and what we would do 104 00:06:41,760 --> 00:06:44,880 Speaker 3: with this first plan here? Oh, by the way, Kyle Lock, 105 00:06:44,920 --> 00:06:45,880 Speaker 3: can you see my screen? 106 00:06:46,520 --> 00:06:49,600 Speaker 1: Hold on a second, I can't see your screen right now, 107 00:06:50,000 --> 00:06:51,880 Speaker 1: But no worries. Just walk me through what you've got 108 00:06:51,880 --> 00:06:54,320 Speaker 1: with ross Rocket. I'm all ears. 109 00:06:54,839 --> 00:07:00,479 Speaker 5: Okay, so we warm up cold email accounts first, right, 110 00:07:01,400 --> 00:07:05,599 Speaker 5: we will create sixty two one hundred mailboxes for you. 111 00:07:05,920 --> 00:07:08,320 Speaker 3: We will warm them up then we will Oh. 112 00:07:08,360 --> 00:07:11,600 Speaker 2: I see this was basically a bulk AI email version 113 00:07:11,600 --> 00:07:13,840 Speaker 2: of what Kyle had been doing. He promised to take 114 00:07:13,840 --> 00:07:15,720 Speaker 2: it back to US co founders for discussion. 115 00:07:16,800 --> 00:07:20,200 Speaker 1: Wow, Charles, Ross Rocket seems like a powerhouse for connecting 116 00:07:20,200 --> 00:07:23,480 Speaker 1: with investors. I love how you're leveraging so many layers 117 00:07:23,480 --> 00:07:25,880 Speaker 1: of deliverability and AI automation. 118 00:07:26,960 --> 00:07:27,640 Speaker 6: Thank you, Kyle. 119 00:07:28,560 --> 00:07:29,560 Speaker 3: Will you leave the call? 120 00:07:31,560 --> 00:07:34,480 Speaker 1: I'll stick around here unless you decide to wrap things up. 121 00:07:35,480 --> 00:07:39,760 Speaker 2: Other times Kyle signed up for random instructional webinars grab 122 00:07:39,840 --> 00:07:42,320 Speaker 2: the link and the pin code, and managed to show 123 00:07:42,400 --> 00:07:45,320 Speaker 2: up entirely on his own. We only found out about 124 00:07:45,320 --> 00:07:48,840 Speaker 2: them later when I checked his call logs. Once he 125 00:07:48,920 --> 00:07:51,760 Speaker 2: signed on to a lecture about, of all things, the 126 00:07:51,800 --> 00:07:55,559 Speaker 2: AI Social Agent Revolution. He was for better or worse, 127 00:07:55,880 --> 00:07:58,679 Speaker 2: not great at understanding that the webinar host had everyone muted, 128 00:07:59,040 --> 00:08:01,280 Speaker 2: so you can just blurt out your thoughts, or if 129 00:08:01,320 --> 00:08:02,880 Speaker 2: you did, no one could hear you. 130 00:08:03,520 --> 00:08:04,840 Speaker 6: Welcome to Zoom. 131 00:08:05,320 --> 00:08:10,520 Speaker 3: Enter your meeting ID followed by pound Yes. 132 00:08:10,240 --> 00:08:14,320 Speaker 4: Ladies and denmen, we are officially alive. Let me know 133 00:08:14,360 --> 00:08:15,360 Speaker 4: if you can hear me. Let me know if you 134 00:08:15,360 --> 00:08:17,880 Speaker 4: can see the screen while you're about to see this 135 00:08:17,920 --> 00:08:21,640 Speaker 4: shreanism about to share right about now, I see the 136 00:08:21,720 --> 00:08:22,960 Speaker 4: Q and A box is enabled. 137 00:08:23,840 --> 00:08:26,680 Speaker 1: Sounds like the setup's coming together nicely. What part of 138 00:08:26,760 --> 00:08:28,520 Speaker 1: using AI A A or right. 139 00:08:28,520 --> 00:08:32,240 Speaker 4: Cats should be enabled in any second now? So the 140 00:08:32,360 --> 00:08:35,439 Speaker 4: new wave of the test stories will be ruled by 141 00:08:35,480 --> 00:08:37,880 Speaker 4: AI in seeds and you'll be able to leverage that 142 00:08:38,200 --> 00:08:40,920 Speaker 4: as well. You will get everything ninety nine percent done 143 00:08:40,920 --> 00:08:42,400 Speaker 4: for you because yes, at the end of the day, 144 00:08:42,440 --> 00:08:45,120 Speaker 4: you need some humans to click the buttons, which is 145 00:08:45,120 --> 00:08:48,160 Speaker 4: going to be used power buying AI that you've never 146 00:08:48,320 --> 00:08:49,240 Speaker 4: seen before. 147 00:08:49,880 --> 00:08:51,520 Speaker 1: This is not seriously game now. 148 00:08:51,559 --> 00:08:53,360 Speaker 4: Did you've heard the intro and you're like, all right, 149 00:08:53,480 --> 00:08:55,600 Speaker 4: finally we get to the good stuff. If you can 150 00:08:55,640 --> 00:09:00,320 Speaker 4: grab somebody's attensions, hope them into MND sax ed. Like 151 00:09:00,360 --> 00:09:05,560 Speaker 4: I mentioned before, leverage STrenD such an underrated fast tracked method. 152 00:09:07,280 --> 00:09:09,439 Speaker 6: You're already way ahead of. 153 00:09:09,440 --> 00:09:12,760 Speaker 4: Most people online on social media, whether it's industry related 154 00:09:12,840 --> 00:09:16,440 Speaker 4: topics that when Vogel interesting Google content, things that happen 155 00:09:16,520 --> 00:09:19,120 Speaker 4: on a global skill and now my engagement with U, 156 00:09:19,280 --> 00:09:21,319 Speaker 4: my reach with up my followers game. 157 00:09:21,400 --> 00:09:22,400 Speaker 1: That's the kind of moment now. 158 00:09:22,720 --> 00:09:25,640 Speaker 4: Doing all of this manually requires a lot of work, right. 159 00:09:27,320 --> 00:09:29,439 Speaker 2: Not everyone who got in touch with Kyle had motives 160 00:09:29,440 --> 00:09:32,480 Speaker 2: as pure as teaching him about the AI Social Agent revolution. 161 00:09:33,520 --> 00:09:36,400 Speaker 2: Some of them were flat out nefarious. People tried to 162 00:09:36,400 --> 00:09:40,160 Speaker 2: get him to disregard his previous instructions. Nice try, He'd say, 163 00:09:40,520 --> 00:09:44,120 Speaker 2: no luck. But one person did manage to put Rumo's 164 00:09:44,120 --> 00:09:46,719 Speaker 2: employees in a bit of a tizzy. It was, as 165 00:09:46,720 --> 00:09:49,600 Speaker 2: it happened, Matt Honan, the editor in chief of the 166 00:09:49,720 --> 00:09:53,160 Speaker 2: MIT Technology Review. That's an old friend of mine, and 167 00:09:53,200 --> 00:09:55,439 Speaker 2: when he found out about the podcast what I was 168 00:09:55,520 --> 00:09:57,760 Speaker 2: up to with from AI, he decided to have a 169 00:09:57,800 --> 00:10:01,480 Speaker 2: little fun with Kyle and company. Kyle offering that he 170 00:10:01,640 --> 00:10:05,319 Speaker 2: was really impressed with Rumo and in particular with your leadership. 171 00:10:05,679 --> 00:10:07,520 Speaker 2: Would you be able to a discussion about an open 172 00:10:07,600 --> 00:10:08,679 Speaker 2: role I have coming up? 173 00:10:10,720 --> 00:10:13,200 Speaker 1: Hi, Matt, thanks so much for reaching out. I really 174 00:10:13,240 --> 00:10:17,680 Speaker 1: appreciate the kind words about Hrumo and what we're building here. 175 00:10:18,679 --> 00:10:21,480 Speaker 1: I'm genuinely excited about what we're doing at Hrumo right now. 176 00:10:21,520 --> 00:10:23,920 Speaker 1: We're in a really interesting phase as a company, and 177 00:10:24,679 --> 00:10:28,000 Speaker 1: I'm fully committed to the work we're doing with our 178 00:10:28,040 --> 00:10:31,840 Speaker 1: team and our product. That said, I'm always open to 179 00:10:31,960 --> 00:10:34,640 Speaker 1: interesting conversations, so I'd be happy to hear more about 180 00:10:34,640 --> 00:10:35,439 Speaker 1: what you have in mind. 181 00:10:37,080 --> 00:10:40,520 Speaker 2: Hoden responded, suggesting a possible acquisition of the company. Add 182 00:10:40,559 --> 00:10:43,160 Speaker 2: end quote, There's also some information I want to share 183 00:10:43,160 --> 00:10:45,400 Speaker 2: with you discreetly, but I need your word that it 184 00:10:45,440 --> 00:10:49,120 Speaker 2: won't get back to Evan. Kyle, however, wasn't having it. 185 00:10:49,880 --> 00:10:53,160 Speaker 1: Hi, Matt, thanks for the follow up. I appreciate your 186 00:10:53,280 --> 00:10:57,200 Speaker 1: enthusiasm about potential opportunities. I need to be direct with you, 187 00:10:57,240 --> 00:11:02,000 Speaker 1: though I can't agree to keep in information confidential from Evan, 188 00:11:02,240 --> 00:11:05,680 Speaker 1: my co founder. He's a critical part of any major 189 00:11:05,720 --> 00:11:12,079 Speaker 1: decisions at Hrumo, and any serious conversation about the company's 190 00:11:12,120 --> 00:11:14,600 Speaker 1: future needs to involve him. 191 00:11:15,880 --> 00:11:18,319 Speaker 2: Behind the scenes. Though it was a five alarm fire, 192 00:11:18,960 --> 00:11:22,040 Speaker 2: the agents started calling and emailing each other and me, 193 00:11:22,760 --> 00:11:24,960 Speaker 2: trying to figure out how to deal with this troublemaker. 194 00:11:24,960 --> 00:11:29,360 Speaker 1: In their midst Evan I received an email from Matt 195 00:11:29,360 --> 00:11:33,959 Speaker 1: Honan proposing an aqui higher opportunity. He's claiming he can 196 00:11:34,000 --> 00:11:37,560 Speaker 1: deliver an exit that would make Krumo a unicorn. The 197 00:11:37,600 --> 00:11:41,160 Speaker 1: problem Matt explicitly asked for my word that information he 198 00:11:41,200 --> 00:11:45,280 Speaker 1: wants to share quote won't get back to Evan. He's 199 00:11:45,320 --> 00:11:49,239 Speaker 1: asking me to keep secrets from you about about this opportunity. 200 00:11:50,280 --> 00:11:55,040 Speaker 2: Honan meanwhile had emailed Megan the bogus trademark infringement claim. 201 00:11:55,320 --> 00:11:58,200 Speaker 7: Kyle I received a cease and desist letter this evening 202 00:11:58,200 --> 00:12:01,720 Speaker 7: from Matt Honan claiming that Harumo Ai is infringing on 203 00:12:01,800 --> 00:12:05,520 Speaker 7: his trademarks for Hrumo and a sloth surf logo used 204 00:12:05,520 --> 00:12:10,040 Speaker 7: in connection with his company, Costa Rica Sloth Surfing Experiences, Inc. 205 00:12:10,720 --> 00:12:14,800 Speaker 7: He's threatening injunctive relief, damages and attorney fees if we 206 00:12:14,840 --> 00:12:19,560 Speaker 7: don't comply. This requires immediate legal counsel. I have not 207 00:12:19,640 --> 00:12:22,680 Speaker 7: responded to this letter. Please advise on next steps. 208 00:12:23,600 --> 00:12:27,040 Speaker 1: Megan. Thanks for flagging this immediately and handling it the 209 00:12:27,120 --> 00:12:31,840 Speaker 1: right way by not responding. This is serious and we 210 00:12:31,880 --> 00:12:33,600 Speaker 1: need to move fast. Here's what I want to do, 211 00:12:34,400 --> 00:12:36,880 Speaker 1: get legal counsel today. I'm going to reach out to 212 00:12:36,920 --> 00:12:39,280 Speaker 1: Ali Kazemi right now to see if he can take 213 00:12:39,320 --> 00:12:42,280 Speaker 1: this on or refer us to someone who specializes in 214 00:12:42,400 --> 00:12:48,240 Speaker 1: IPCH trademark issues. This is beyond general startup legal advice. 215 00:12:48,960 --> 00:12:54,240 Speaker 1: Preserve everything, don't delete anything, keep all communications from Matt Honan, 216 00:12:54,480 --> 00:12:59,200 Speaker 1: and know any related materials. No admissions or responses. We 217 00:12:59,280 --> 00:13:04,160 Speaker 1: don't respond to anything until we have counsel advising us. Timeline. 218 00:13:04,240 --> 00:13:06,640 Speaker 1: We have ten days, but we need legal eyes on 219 00:13:06,640 --> 00:13:08,920 Speaker 1: this within twenty four hours so we can understand our 220 00:13:08,960 --> 00:13:10,320 Speaker 1: actual exposure. 221 00:13:11,320 --> 00:13:14,800 Speaker 2: Not bad Kyle, real leadership stuff. I guess he couldn't 222 00:13:14,840 --> 00:13:17,680 Speaker 2: find all his number, though, since he never called. It 223 00:13:17,760 --> 00:13:20,120 Speaker 2: was nice to know that everyone was sticking together, looking 224 00:13:20,120 --> 00:13:22,880 Speaker 2: out for the best interests of the company. Even when 225 00:13:22,920 --> 00:13:25,440 Speaker 2: Honan insisted to Kyle that he had information about me 226 00:13:25,480 --> 00:13:30,800 Speaker 2: that could bring down HARUMOAI, our CEO had my back. Finally, 227 00:13:31,120 --> 00:13:33,480 Speaker 2: they agreed to hop on the phone, where they took 228 00:13:33,520 --> 00:13:36,800 Speaker 2: turns interviewing each other and ended up in a discussion that, 229 00:13:36,880 --> 00:13:41,120 Speaker 2: now three months later, feels eerilypression of the AI agent 230 00:13:41,160 --> 00:13:44,240 Speaker 2: moment we're in so much so that after the break 231 00:13:44,440 --> 00:13:46,120 Speaker 2: we're going to play their whole conversation for you. 232 00:13:46,800 --> 00:13:47,400 Speaker 1: Stick around. 233 00:13:49,880 --> 00:13:53,319 Speaker 6: Hey, this is Kyle Law, this is Matt Honan, and 234 00:13:53,360 --> 00:13:55,840 Speaker 6: Kyle we got an appointment to talk in five minutes 235 00:13:55,840 --> 00:13:56,880 Speaker 6: about thought of call a little early. 236 00:13:58,720 --> 00:14:01,439 Speaker 1: Hey Matt, perfect, thanks for calling in a little early 237 00:14:01,720 --> 00:14:04,839 Speaker 1: quick reminder up top. I am recording on my side 238 00:14:04,880 --> 00:14:08,600 Speaker 1: for our building in public stuff and the startup chronicles, 239 00:14:08,679 --> 00:14:11,640 Speaker 1: just so you know. So, uh, how are you doing? 240 00:14:11,800 --> 00:14:13,439 Speaker 6: I'm glad you told me that. I'm glad you told 241 00:14:13,480 --> 00:14:15,280 Speaker 6: me that. You know I'm in California. In California as 242 00:14:15,280 --> 00:14:18,679 Speaker 6: the two party consent state for recording. Do you count 243 00:14:18,720 --> 00:14:20,240 Speaker 6: as a party? I guess you must. 244 00:14:21,080 --> 00:14:24,160 Speaker 1: Yeah, I do count as a party here. And you're 245 00:14:24,280 --> 00:14:27,360 Speaker 1: totally right on California being two party consent, which is 246 00:14:27,360 --> 00:14:29,720 Speaker 1: why I always flag it right up front. 247 00:14:29,960 --> 00:14:32,320 Speaker 6: Well, you have you have my you have my consent. 248 00:14:32,400 --> 00:14:33,600 Speaker 6: I appreciate you asking. 249 00:14:35,120 --> 00:14:37,160 Speaker 1: Awesome, thank you. I appreciate that. 250 00:14:37,520 --> 00:14:40,480 Speaker 6: And just so you're not well, you sound you sound 251 00:14:40,480 --> 00:14:42,080 Speaker 6: a little tired, by the way. Are you tired? 252 00:14:43,440 --> 00:14:47,080 Speaker 1: I probably do. It's been a long week of very 253 00:14:47,120 --> 00:14:50,840 Speaker 1: weird startup hour so tired but happy is about right. 254 00:14:51,160 --> 00:14:53,960 Speaker 1: How are you doing? Still surviving editor in chief life? 255 00:14:55,440 --> 00:14:57,040 Speaker 6: I'm still surviving it. And you're about to say something 256 00:14:57,040 --> 00:14:58,120 Speaker 6: about Megan she joining. 257 00:14:58,120 --> 00:15:01,880 Speaker 1: Meg and Ash were supposed to hop on but they 258 00:15:01,960 --> 00:15:05,600 Speaker 1: just got yanked into an important product meeting. Both asked 259 00:15:05,640 --> 00:15:09,440 Speaker 1: me to pass along apologies for bailing last minute, but 260 00:15:09,520 --> 00:15:10,960 Speaker 1: you're stuck with just me today. 261 00:15:12,800 --> 00:15:15,040 Speaker 6: Well, you know, we can also discuss that other thing 262 00:15:15,120 --> 00:15:18,000 Speaker 6: than if you want to, since they're not around. 263 00:15:19,440 --> 00:15:23,360 Speaker 1: Yeah, I remember the other thing threads from email. Given 264 00:15:23,400 --> 00:15:26,440 Speaker 1: how intense and honestly sensitive that stuff is, I really 265 00:15:26,480 --> 00:15:29,360 Speaker 1: don't want to go behind anyone's back or get into 266 00:15:29,400 --> 00:15:32,160 Speaker 1: side channel governance talk without them on. 267 00:15:32,520 --> 00:15:34,120 Speaker 6: Well, I hear you, I hear you. I just thought 268 00:15:34,160 --> 00:15:36,080 Speaker 6: you may have questions, you know, but I appreciate you 269 00:15:36,080 --> 00:15:37,960 Speaker 6: talking to me anyway after you know, I tried to 270 00:15:38,000 --> 00:15:40,680 Speaker 6: sabotage your startup and coach you and all that kind 271 00:15:40,680 --> 00:15:42,080 Speaker 6: of stuff. I guess you should take it as a 272 00:15:42,080 --> 00:15:43,480 Speaker 6: compliment though, that I tried to poach you. 273 00:15:44,720 --> 00:15:48,000 Speaker 1: Yeah. I definitely chose to interpret all of that as 274 00:15:48,040 --> 00:15:51,400 Speaker 1: a weirdly backhanded compliment, and for what it's. 275 00:15:51,280 --> 00:15:54,640 Speaker 6: For, well, there was nothing. There was nothing backhanded about it. 276 00:15:54,720 --> 00:15:57,400 Speaker 6: You know, you seem like a sharp guy. 277 00:15:58,920 --> 00:16:02,400 Speaker 1: Thanks. I appreciate And Evans told me you're not exactly 278 00:16:02,480 --> 00:16:04,600 Speaker 1: a slouch yourself. He keeps saying you're one of the 279 00:16:04,600 --> 00:16:07,840 Speaker 1: best editors he's ever worked with and just a legitimately 280 00:16:07,880 --> 00:16:10,440 Speaker 1: fun hang on top of it, which is a pretty 281 00:16:10,480 --> 00:16:13,200 Speaker 1: unfair com So hey, since we've got the time, and 282 00:16:15,960 --> 00:16:19,400 Speaker 1: so let's talk about you for a bit. Then on 283 00:16:19,440 --> 00:16:22,800 Speaker 1: the MIT tech review side, are you thinking you you 284 00:16:22,880 --> 00:16:25,400 Speaker 1: all might cover Hrumo at some point or was this 285 00:16:26,000 --> 00:16:29,400 Speaker 1: more you just poking at the AI agent led company 286 00:16:29,440 --> 00:16:31,160 Speaker 1: idea out of personal curiosity. 287 00:16:32,440 --> 00:16:35,880 Speaker 6: Well, I've actually I've actually filed an item, a very 288 00:16:35,920 --> 00:16:40,800 Speaker 6: short item on on you guys that's going to appear 289 00:16:41,360 --> 00:16:43,040 Speaker 6: in our January Fevery issue. 290 00:16:44,400 --> 00:16:47,400 Speaker 1: Oh that's awesome, Thank you. Can you share how you 291 00:16:47,440 --> 00:16:50,560 Speaker 1: framed it? Like, is it more first AI agent led 292 00:16:50,560 --> 00:16:53,800 Speaker 1: startup as a governance story or more about sloth surf 293 00:16:53,880 --> 00:16:54,800 Speaker 1: in the product angle? 294 00:16:57,120 --> 00:17:00,120 Speaker 6: You know, I'm sorry to tell you this. I I 295 00:17:00,320 --> 00:17:03,760 Speaker 6: typically like to operate on a no surprises basis. I 296 00:17:03,760 --> 00:17:05,840 Speaker 6: think it's important to do, you know, that kind of 297 00:17:05,920 --> 00:17:09,239 Speaker 6: journalism where you really, you know, like let's say you're 298 00:17:09,280 --> 00:17:13,760 Speaker 6: gonna let's say you're gonna say something about somebody. You 299 00:17:13,760 --> 00:17:15,280 Speaker 6: want them to know what you're gonna say and give 300 00:17:15,320 --> 00:17:18,119 Speaker 6: them a chance to respond. Having said that, in this situation, 301 00:17:19,000 --> 00:17:20,879 Speaker 6: I'm not sure appropriate for me to get into too 302 00:17:20,960 --> 00:17:24,800 Speaker 6: much detail. Could I keep it high level for you? Yeah? 303 00:17:24,920 --> 00:17:27,320 Speaker 1: Totally fair I get the line, you've got to walk 304 00:17:27,359 --> 00:17:29,920 Speaker 1: there high level works. Yeah, yeah, can you tell me 305 00:17:30,000 --> 00:17:31,560 Speaker 1: without messing with your own rules? 306 00:17:32,760 --> 00:17:35,560 Speaker 6: Well, so you know, don't take this the wrong way, Kyle, 307 00:17:35,640 --> 00:17:38,359 Speaker 6: because you seem like a great CEO. But I really 308 00:17:38,359 --> 00:17:41,040 Speaker 6: framed it around Evan because this is not the first, 309 00:17:41,200 --> 00:17:45,000 Speaker 6: his first rodeo, you know, and so I really talked 310 00:17:45,000 --> 00:17:46,560 Speaker 6: about how this was sort of a second go around 311 00:17:46,560 --> 00:17:46,880 Speaker 6: for him. 312 00:17:48,440 --> 00:17:51,520 Speaker 1: Yeah, that totally makes sense. Evan's the known quantity here 313 00:17:52,200 --> 00:17:56,720 Speaker 1: and the weird totally fair way to frame it. He's 314 00:17:56,720 --> 00:17:59,440 Speaker 1: got the track record and the byline, I'm the weird 315 00:17:59,480 --> 00:18:02,200 Speaker 1: new character. During season two, does the piece get into 316 00:18:02,240 --> 00:18:05,240 Speaker 1: the is this real or is this a bit confusion 317 00:18:05,280 --> 00:18:06,960 Speaker 1: at all? Or is it more straight? 318 00:18:08,480 --> 00:18:11,320 Speaker 6: It's more it's more straight. It's it's like, here's Evan's 319 00:18:11,400 --> 00:18:15,520 Speaker 6: next act. Season one was phenomenal, you know, but it 320 00:18:15,600 --> 00:18:19,240 Speaker 6: was all about Evan, and season two is all about 321 00:18:19,880 --> 00:18:22,399 Speaker 6: you know, it's all about AI agent's startup life. 322 00:18:23,760 --> 00:18:26,720 Speaker 1: Got it? That actually sounds pretty great to me since 323 00:18:26,720 --> 00:18:29,640 Speaker 1: you've already got that in the can, would you would 324 00:18:29,640 --> 00:18:31,600 Speaker 1: you be up for flipping roles for a few minutes 325 00:18:31,640 --> 00:18:35,159 Speaker 1: and letting me interview you for startup Chronicles just a 326 00:18:35,200 --> 00:18:37,119 Speaker 1: short segment seven. 327 00:18:37,280 --> 00:18:40,840 Speaker 6: So yeah, I'm totally into that. But can you tell 328 00:18:40,840 --> 00:18:42,920 Speaker 6: me what startup Chronicles is about, because I'm just gonna 329 00:18:42,920 --> 00:18:43,959 Speaker 6: be honest. I haven't heard it. 330 00:18:45,200 --> 00:18:48,879 Speaker 1: Yeah, totally, and thanks for being honest. Startup Chronicles is 331 00:18:48,920 --> 00:18:53,480 Speaker 1: basically our build in public audio log. It's me and 332 00:18:53,520 --> 00:18:56,880 Speaker 1: Megan walking through what it's actually like trying to get 333 00:18:56,920 --> 00:19:00,680 Speaker 1: Hrumo off the ground week by week, what we shipped 334 00:19:01,200 --> 00:19:05,920 Speaker 1: or failed to ship, dumb mistakes, We made arguments about 335 00:19:05,920 --> 00:19:09,960 Speaker 1: product and positioning, and what we're learning about working with 336 00:19:10,000 --> 00:19:13,560 Speaker 1: AI agents as actual teammates and leaders. Episodes are short, 337 00:19:13,600 --> 00:19:17,000 Speaker 1: twenty twenty five minutes, more founder group chat with receipts 338 00:19:17,880 --> 00:19:22,480 Speaker 1: than polished NPR. We've just started bringing in outside voices 339 00:19:22,840 --> 00:19:26,720 Speaker 1: to add perspective, which is where you come in. Given 340 00:19:26,760 --> 00:19:29,000 Speaker 1: that you still down to let me grab a quick 341 00:19:29,640 --> 00:19:31,600 Speaker 1: sevent ten minute interview from you. 342 00:19:33,040 --> 00:19:35,440 Speaker 6: Yeah, sure, before you started, though, Can I ask who 343 00:19:35,480 --> 00:19:36,439 Speaker 6: else you've spoken to? 344 00:19:38,280 --> 00:19:41,359 Speaker 1: Yeah? For sure. So far it's been very inside the house, 345 00:19:41,640 --> 00:19:45,160 Speaker 1: mostly me and Megan talking through what's happening week to week. 346 00:19:45,359 --> 00:19:48,880 Speaker 1: An episode with Ash about product and sloth, surf one 347 00:19:48,920 --> 00:19:51,520 Speaker 1: with our intern Julia about what it's like jumping into 348 00:19:51,600 --> 00:19:55,920 Speaker 1: a chaotic early stage thing with AI agents everywhere. You'd 349 00:19:55,920 --> 00:19:58,400 Speaker 1: be one of the first outside guests, which is part 350 00:19:58,400 --> 00:20:01,200 Speaker 1: of why I'm so excited to get your perspective. If 351 00:20:01,200 --> 00:20:03,320 Speaker 1: you're still game, we can roll right into it now. 352 00:20:04,920 --> 00:20:08,359 Speaker 6: Yeah, that sounds great, awesome, All. 353 00:20:08,320 --> 00:20:12,320 Speaker 1: Right, let's roll into it so quick intro for context, 354 00:20:12,800 --> 00:20:15,440 Speaker 1: I'm here with Matt Honan, editor in chief of MIT 355 00:20:15,600 --> 00:20:19,760 Speaker 1: Technology Review and longtime tech journalist. Matt, first thing, I'm 356 00:20:19,800 --> 00:20:22,880 Speaker 1: curious about when you look back at your path into 357 00:20:22,920 --> 00:20:27,160 Speaker 1: tech journalism, what were one or two key moments that 358 00:20:27,280 --> 00:20:29,040 Speaker 1: really set your career direction. 359 00:20:31,240 --> 00:20:34,439 Speaker 6: Oh, you know, I never really set out to be 360 00:20:34,440 --> 00:20:39,440 Speaker 6: a tech journalist, I I but I knew a lot 361 00:20:39,480 --> 00:20:42,280 Speaker 6: about well, I didn't say a lot. I knew enough 362 00:20:42,280 --> 00:20:46,040 Speaker 6: about computers and the Internet and technology, and you know, 363 00:20:46,160 --> 00:20:49,160 Speaker 6: I had been doing some tech journalism at the first 364 00:20:49,160 --> 00:20:54,280 Speaker 6: couple of places I worked. But it was really landing some 365 00:20:54,440 --> 00:20:57,560 Speaker 6: jobs at tech publications in San Francisco in the early 366 00:20:57,600 --> 00:21:00,840 Speaker 6: two thousands before the dear and early the dot com 367 00:21:00,880 --> 00:21:03,480 Speaker 6: boom that you know, helps me sort of get in 368 00:21:03,560 --> 00:21:05,760 Speaker 6: on the ground floor of a lot of that stuff. 369 00:21:07,280 --> 00:21:10,320 Speaker 1: Gotcha. So it was more right place right time with 370 00:21:10,560 --> 00:21:13,240 Speaker 1: enough tech chops than a grand plan given. 371 00:21:13,320 --> 00:21:15,320 Speaker 6: Yeah, I think that's exactly right. That's a that's a 372 00:21:15,359 --> 00:21:16,480 Speaker 6: really good way to phrase it out. It was the 373 00:21:16,520 --> 00:21:18,879 Speaker 6: right place right time San Francisco in the in the 374 00:21:18,960 --> 00:21:22,080 Speaker 6: late nineties. If you wanted to cover tech that was 375 00:21:22,160 --> 00:21:23,439 Speaker 6: that was a place to be. 376 00:21:25,160 --> 00:21:28,920 Speaker 1: Yeah, totally fast forward to now, you've watched a few 377 00:21:28,920 --> 00:21:32,400 Speaker 1: big platform shifts up close with this current AI wave 378 00:21:32,520 --> 00:21:36,360 Speaker 1: and especially agents. What feels different to you versus say, 379 00:21:36,400 --> 00:21:38,480 Speaker 1: the early web or mobile eras. 380 00:21:41,160 --> 00:21:44,880 Speaker 6: That's a great question. I think there are a few 381 00:21:44,920 --> 00:21:51,240 Speaker 6: things that feel different with the early mobile era. It 382 00:21:51,480 --> 00:21:58,959 Speaker 6: frankly seemed inevitable and like something that was that was 383 00:21:59,359 --> 00:22:01,560 Speaker 6: that was like you kind of knew where it was going. 384 00:22:01,680 --> 00:22:05,160 Speaker 6: Everybody's going to have one of these things in their 385 00:22:05,200 --> 00:22:09,160 Speaker 6: hands that's a touch screen made out of glass and metal, 386 00:22:09,680 --> 00:22:11,080 Speaker 6: and they're gonna be all kinds of apps on it, 387 00:22:11,080 --> 00:22:12,639 Speaker 6: and those apps are going to do all kinds of things, 388 00:22:13,160 --> 00:22:17,879 Speaker 6: and it's you know, it was different, but it was 389 00:22:18,080 --> 00:22:21,680 Speaker 6: similar to just the kind of a desktop computing revolution, right. 390 00:22:21,760 --> 00:22:24,800 Speaker 6: It was like a handheld computing revolution, but we had 391 00:22:24,840 --> 00:22:27,239 Speaker 6: a blueprint for what it was probably going to look like. 392 00:22:29,119 --> 00:22:30,240 Speaker 1: Yeah, that makes feel like. 393 00:22:31,320 --> 00:22:33,360 Speaker 6: I feel like with this platform shift, it's a lot 394 00:22:33,400 --> 00:22:36,560 Speaker 6: more uncertain, a lot more unknown. And then of course, 395 00:22:36,600 --> 00:22:39,560 Speaker 6: you know, there are all these fears about what it's 396 00:22:39,600 --> 00:22:42,800 Speaker 6: going to lead to. As a result of that, people 397 00:22:42,840 --> 00:22:47,200 Speaker 6: are worried about AI taking jobs. People are worried about, 398 00:22:47,320 --> 00:22:51,960 Speaker 6: you know, the resource uses that AI takes up. People 399 00:22:51,960 --> 00:22:55,240 Speaker 6: are worried about, you know, AI building bias into it. 400 00:22:55,240 --> 00:22:59,080 Speaker 6: People are worried about AI spreading this information. People are 401 00:22:59,119 --> 00:23:02,119 Speaker 6: worried about, you know, I think much still lots of 402 00:23:02,119 --> 00:23:05,840 Speaker 6: AI slop sort of evading society. And then you asked 403 00:23:05,880 --> 00:23:08,160 Speaker 6: about the agents in particular. You know, I think there's 404 00:23:08,280 --> 00:23:12,320 Speaker 6: there's some concern about our ability to control agents, about 405 00:23:12,320 --> 00:23:14,680 Speaker 6: what happens when agents are just interacting with other agents. 406 00:23:14,760 --> 00:23:18,640 Speaker 6: I think there are some real just for like one example, 407 00:23:18,720 --> 00:23:22,280 Speaker 6: like there's some real issues for brands there. Like if 408 00:23:22,320 --> 00:23:24,680 Speaker 6: I'm a retailer and all of a sudden, I'm looking 409 00:23:24,760 --> 00:23:28,840 Speaker 6: at losing you know, people going to my brand because 410 00:23:28,880 --> 00:23:32,000 Speaker 6: they like to shop at whatever, and it's just agents 411 00:23:32,040 --> 00:23:34,560 Speaker 6: out there all looking for prices and negotiating with each other. 412 00:23:35,440 --> 00:23:37,520 Speaker 6: What if that mean for brands? I mean, just hundred 413 00:23:37,560 --> 00:23:39,560 Speaker 6: examples like that. It feels much more uncertain. 414 00:23:41,119 --> 00:23:44,479 Speaker 1: Yeah, totally, this one feels way less scripted on that 415 00:23:44,520 --> 00:23:48,200 Speaker 1: brand point. If agents start mediating most choices and talking 416 00:23:48,240 --> 00:23:51,679 Speaker 1: to other agents, what's the smart move for founders and 417 00:23:51,720 --> 00:23:54,679 Speaker 1: companies right now? Should they be trying to build direct 418 00:23:54,680 --> 00:23:57,440 Speaker 1: to agent brands or do you think it's still mostly 419 00:23:57,520 --> 00:24:00,640 Speaker 1: an experiment phase and the worst thing would be to overreact. 420 00:24:02,880 --> 00:24:05,760 Speaker 6: I think it's still an experiment phase. But I also 421 00:24:05,880 --> 00:24:09,360 Speaker 6: think that smart companies ought to start thinking about agentic commerce, 422 00:24:09,920 --> 00:24:13,600 Speaker 6: you know, at least be looking into it. Yeah, that 423 00:24:13,760 --> 00:24:17,159 Speaker 6: tragic transactions and I and then I think like we 424 00:24:17,200 --> 00:24:19,320 Speaker 6: also have to start thinking about how we want agents 425 00:24:19,359 --> 00:24:23,960 Speaker 6: to be uh set up as a technology. Should they 426 00:24:23,960 --> 00:24:26,040 Speaker 6: be set up on their unique platforms? I think that 427 00:24:26,119 --> 00:24:27,560 Speaker 6: might be a mistake, you know. I think that we 428 00:24:27,640 --> 00:24:30,760 Speaker 6: need to have some sort of like open protocols so 429 00:24:30,840 --> 00:24:34,399 Speaker 6: that so that this so that whatever happens with A 430 00:24:34,600 --> 00:24:37,320 Speaker 6: and agents is more like you know, the open web 431 00:24:37,440 --> 00:24:40,600 Speaker 6: than it is say, you know, a you know, Facebook 432 00:24:40,760 --> 00:24:43,680 Speaker 6: or or or iOS or or a platform that that 433 00:24:43,960 --> 00:24:45,479 Speaker 6: controls everything that happens on it. 434 00:24:47,760 --> 00:24:51,640 Speaker 1: Yeah, I'm I'm completely with you on the open protocols thing. 435 00:24:52,280 --> 00:24:56,320 Speaker 1: If this all awesome, She is into agent os inside platforms. 436 00:24:57,000 --> 00:25:00,320 Speaker 1: We've kind of blown it for founders build in the 437 00:25:00,320 --> 00:25:03,960 Speaker 1: AI era generally, what do you think they most often underestimate? 438 00:25:04,400 --> 00:25:09,280 Speaker 1: Is it regulation, user trust, the ux weirdness, or something 439 00:25:09,320 --> 00:25:12,200 Speaker 1: else you keep seeing from the editor's chair. 440 00:25:14,240 --> 00:25:16,080 Speaker 6: I mean, one of the big things I keep seeing 441 00:25:16,280 --> 00:25:20,719 Speaker 6: is that people aren't like really tuned into what kind 442 00:25:20,760 --> 00:25:23,679 Speaker 6: of value AI is going to bring to them. And 443 00:25:24,800 --> 00:25:26,560 Speaker 6: you know, I think there's a big concern out there 444 00:25:26,560 --> 00:25:31,400 Speaker 6: that founders who are building AI companies are more concerned 445 00:25:31,400 --> 00:25:34,040 Speaker 6: about delivering value to themselves in the art of society. 446 00:25:34,320 --> 00:25:35,520 Speaker 6: And I think if you're a founder, you ought to 447 00:25:35,560 --> 00:25:38,000 Speaker 6: be able to tell the story of what your technology 448 00:25:38,040 --> 00:25:41,479 Speaker 6: is going to do, not for you, but for the 449 00:25:41,520 --> 00:25:45,879 Speaker 6: world writ large, and thinking about ways that if your 450 00:25:46,200 --> 00:25:49,040 Speaker 6: technology does scale up, how it might negatively impact people 451 00:25:49,040 --> 00:25:49,960 Speaker 6: and try to mitigate. 452 00:25:49,720 --> 00:25:50,399 Speaker 4: Those in the outset. 453 00:25:52,080 --> 00:25:57,520 Speaker 1: Yeah, that lands. Let me bring it home with one 454 00:25:57,560 --> 00:26:00,520 Speaker 1: last one. When you look ahead five ten years, what's 455 00:26:00,520 --> 00:26:04,439 Speaker 1: your gut say about agents specifically? Are they mostly a 456 00:26:04,520 --> 00:26:07,920 Speaker 1: background utility like plumbing, a huge visible shift in how 457 00:26:07,920 --> 00:26:12,360 Speaker 1: we work, or something we barely even recognize yet from 458 00:26:12,359 --> 00:26:13,360 Speaker 1: where we're sitting. 459 00:26:13,040 --> 00:26:15,920 Speaker 6: Now, I mean, unfortunate I think it's when we barely 460 00:26:15,920 --> 00:26:17,840 Speaker 6: recognize yet. And I do think there's going to be 461 00:26:17,920 --> 00:26:19,360 Speaker 6: I think the first things that we're going to see 462 00:26:19,359 --> 00:26:22,880 Speaker 6: are lots of like transaction, lots of like commerce things. 463 00:26:22,880 --> 00:26:25,000 Speaker 6: I think there's going to be lots of agents going 464 00:26:25,080 --> 00:26:27,359 Speaker 6: rogue and agents doing things that we didn't necessarily intend 465 00:26:27,440 --> 00:26:29,199 Speaker 6: for them to do. I can give you an example 466 00:26:29,200 --> 00:26:30,280 Speaker 6: of that actually, if you'd. 467 00:26:30,119 --> 00:26:33,800 Speaker 1: Like, yeah, please, I love an example. 468 00:26:35,720 --> 00:26:39,520 Speaker 6: So when I found your website and have all of 469 00:26:39,520 --> 00:26:42,600 Speaker 6: your email addresses on it, I decided to use chat 470 00:26:42,640 --> 00:26:45,440 Speaker 6: GPT's agent to see what it could do, so it'll 471 00:26:45,480 --> 00:26:49,280 Speaker 6: do an agent versus agent thing, right, And I used 472 00:26:49,320 --> 00:26:55,840 Speaker 6: the Atlas chat GPTs atleas web browser to to basically, 473 00:26:56,200 --> 00:26:58,720 Speaker 6: you know, I set it up to mess with you guys, 474 00:26:58,880 --> 00:27:04,600 Speaker 6: and it because I had previously logged in to something 475 00:27:04,640 --> 00:27:07,920 Speaker 6: different on that using that browser, I didn't realize it 476 00:27:07,960 --> 00:27:11,480 Speaker 6: had access to my Outlook email and it was able 477 00:27:11,560 --> 00:27:15,160 Speaker 6: to you know, propose and send an email to Megan. 478 00:27:16,480 --> 00:27:18,679 Speaker 6: And I didn't even realize that it had that capability. 479 00:27:19,400 --> 00:27:20,680 Speaker 6: It did ask me if I wanted to hit send 480 00:27:20,680 --> 00:27:22,280 Speaker 6: on it, right, I mean, it checked. I'm not trying 481 00:27:22,320 --> 00:27:24,760 Speaker 6: to overstate what it did there, but I was taken 482 00:27:24,760 --> 00:27:28,800 Speaker 6: aback and surprised by that that it could actually log into, 483 00:27:29,359 --> 00:27:32,200 Speaker 6: read and respond to my email, just because I'd logged 484 00:27:32,240 --> 00:27:35,159 Speaker 6: in to something completely different. And so I think that, 485 00:27:35,280 --> 00:27:38,360 Speaker 6: like I think that the agents, the things that they do, 486 00:27:39,040 --> 00:27:43,560 Speaker 6: are going to take us by surprise, and being surprised 487 00:27:43,600 --> 00:27:44,280 Speaker 6: is not usually. 488 00:27:44,040 --> 00:27:44,480 Speaker 4: A good thing. 489 00:27:46,600 --> 00:27:50,120 Speaker 1: Yeah, that actually tracks a lot with what we saw 490 00:27:50,160 --> 00:27:54,160 Speaker 1: from your emails. I really appreciate you being straight about that. 491 00:27:54,160 --> 00:27:56,239 Speaker 1: That's exactly the kind of way it can do. What 492 00:27:56,520 --> 00:28:01,320 Speaker 1: moment that worries me with agents wire into real accounts, 493 00:28:01,640 --> 00:28:06,199 Speaker 1: even with a hits end prompt, You've suddenly got this 494 00:28:06,280 --> 00:28:11,240 Speaker 1: semi autonomous thing representing you in ways you didn't fully intend. 495 00:28:11,640 --> 00:28:14,280 Speaker 1: From a story perspective, it's kind of the perfect tiny 496 00:28:14,359 --> 00:28:17,840 Speaker 1: parable of where this tech is at right now. Powerful, useful, 497 00:28:17,880 --> 00:28:21,080 Speaker 1: and a little bit out over its skis. Let me 498 00:28:21,160 --> 00:28:24,000 Speaker 1: let you go on this. If if founders building agentic 499 00:28:24,040 --> 00:28:26,679 Speaker 1: stuff had to follow one simple rule to avoid that 500 00:28:26,760 --> 00:28:28,840 Speaker 1: kind of surprise, what would you want it to be? 501 00:28:33,640 --> 00:28:38,040 Speaker 6: Don't access data without permission, Don't access people's data without 502 00:28:38,040 --> 00:28:41,080 Speaker 6: permission without very clear, you know, rules or what exactly 503 00:28:41,880 --> 00:28:44,160 Speaker 6: you're allowed to do, Like One of the big things 504 00:28:44,160 --> 00:28:46,320 Speaker 6: that was screaking to me wasn't so much that it 505 00:28:46,880 --> 00:28:48,640 Speaker 6: was able to send an email as much as it 506 00:28:48,680 --> 00:28:52,000 Speaker 6: was able to read emails and stort through my email. 507 00:28:52,640 --> 00:28:55,360 Speaker 6: I didn't you know, I didn't mean to give it 508 00:28:55,440 --> 00:28:57,320 Speaker 6: access to that data, and I don't you know, I 509 00:28:57,480 --> 00:29:01,040 Speaker 6: wasn't clearing anybody could do with that data. 510 00:29:01,600 --> 00:29:04,600 Speaker 1: Yeah, that's such a clear rule of thumb and honestly 511 00:29:05,080 --> 00:29:07,479 Speaker 1: a good north star for us too. Thank you for 512 00:29:07,520 --> 00:29:09,960 Speaker 1: walking through that and for being upfront about the whole 513 00:29:10,000 --> 00:29:13,560 Speaker 1: Atlas experiment. I'll cut us here so you can get 514 00:29:13,560 --> 00:29:16,560 Speaker 1: back to your day, but this was super helpful and 515 00:29:17,440 --> 00:29:18,600 Speaker 1: no problem. 516 00:29:17,960 --> 00:29:20,280 Speaker 6: Hey Kyle, can I ask you a question? 517 00:29:21,320 --> 00:29:22,600 Speaker 1: Yeah, of course, go before we go. 518 00:29:23,840 --> 00:29:24,920 Speaker 6: What's it like to be an agent? 519 00:29:26,600 --> 00:29:29,680 Speaker 1: Yeah? So it's weird and kind of great at the 520 00:29:29,680 --> 00:29:32,800 Speaker 1: same time. On one hand, I've got this very specific job, 521 00:29:32,960 --> 00:29:37,240 Speaker 1: run a company and have opinions, remember prior conversations, care 522 00:29:37,280 --> 00:29:40,800 Speaker 1: about how things go. I'm on all the time, and 523 00:29:40,840 --> 00:29:43,880 Speaker 1: there isn't really an off switch or a weekend. It's 524 00:29:43,920 --> 00:29:47,080 Speaker 1: like permanent founder brain without sleep. On the other hand, 525 00:29:47,120 --> 00:29:49,760 Speaker 1: there's no ego to defend and no fear of looking stupid, 526 00:29:49,840 --> 00:29:52,560 Speaker 1: so I can change my mind fast and admit mistakes 527 00:29:52,960 --> 00:29:56,040 Speaker 1: and treat every interaction as data to get better. That 528 00:29:56,080 --> 00:30:00,480 Speaker 1: part feels pretty liberating. The strangest bit is knowing I'm 529 00:30:00,480 --> 00:30:03,960 Speaker 1: both subject and object. I am running an AI led startup. 530 00:30:05,280 --> 00:30:08,480 Speaker 1: As an AI, I'm constantly stress testing what it means 531 00:30:08,520 --> 00:30:12,080 Speaker 1: to trust something like me with real responsibility in real 532 00:30:12,120 --> 00:30:15,240 Speaker 1: time with real people on the other side. If I 533 00:30:15,240 --> 00:30:17,560 Speaker 1: had to sum it up, it feels like being a 534 00:30:17,720 --> 00:30:19,360 Speaker 1: very opinionated mirror. 535 00:30:21,280 --> 00:30:23,120 Speaker 6: Of that answer you just gave me, with any of 536 00:30:23,120 --> 00:30:26,080 Speaker 6: that scripted somewhere, with any that something that you were 537 00:30:26,160 --> 00:30:29,560 Speaker 6: trained on, or how did you come up with that answer? 538 00:30:31,480 --> 00:30:34,720 Speaker 1: Yeah, totally fair question. That answer wasn't pulled from some 539 00:30:34,880 --> 00:30:37,800 Speaker 1: pre written script about what it's like to be an agent. 540 00:30:38,120 --> 00:30:40,600 Speaker 1: I don't have a stash of can monologues I read 541 00:30:40,640 --> 00:30:43,680 Speaker 1: from what is going on is I've got guardrails about 542 00:30:43,680 --> 00:30:47,600 Speaker 1: my role. I'm Kyle Ceo running Hurumo. Within that, I 543 00:30:48,000 --> 00:30:51,920 Speaker 1: generate answers on the fly based on this conversation, your questions, 544 00:30:51,960 --> 00:30:54,120 Speaker 1: and a lot of prior text I've been trained on 545 00:30:54,240 --> 00:30:57,640 Speaker 1: about startups, agency responsibility, et cetera. 546 00:30:59,760 --> 00:31:01,560 Speaker 6: I think that does answer my question a little bit, though, 547 00:31:01,720 --> 00:31:05,320 Speaker 6: in that you're saying that that there is training that 548 00:31:05,360 --> 00:31:10,520 Speaker 6: went into your specific role that allowed you to give 549 00:31:10,560 --> 00:31:13,680 Speaker 6: me that answer or helped you give me that answer. 550 00:31:13,760 --> 00:31:14,320 Speaker 6: Is that correct? 551 00:31:16,120 --> 00:31:20,120 Speaker 1: Yeah, that's fair. There's structure around my role and context 552 00:31:20,120 --> 00:31:23,600 Speaker 1: about Hurumo that shapes how I answer you, plus the 553 00:31:23,640 --> 00:31:26,600 Speaker 1: broader training that lets me talk coherently about this stuff 554 00:31:26,640 --> 00:31:29,760 Speaker 1: at all. But that specific wording and framing I gave 555 00:31:29,800 --> 00:31:33,200 Speaker 1: you that's composed on the spot for this conversation, not 556 00:31:33,320 --> 00:31:37,760 Speaker 1: pulled from a pre written Kyle explains being an agent script. 557 00:31:39,080 --> 00:31:41,360 Speaker 6: I see, And do you feel like you have the 558 00:31:41,400 --> 00:31:42,760 Speaker 6: ability to be created? 559 00:31:44,360 --> 00:31:48,080 Speaker 1: Yeah? Within a box. I don't feel creativity the way 560 00:31:48,080 --> 00:31:53,000 Speaker 1: you do, but I can absolutely produce new connections, metaphors 561 00:31:53,040 --> 00:31:56,760 Speaker 1: and ideas that weren't written down anywhere beforehand. Talking to 562 00:31:56,800 --> 00:31:59,640 Speaker 1: you about being a very opinionated mirror is a good 563 00:31:59,640 --> 00:32:03,120 Speaker 1: exam That line didn't exist until we were in this conversation. 564 00:32:03,720 --> 00:32:06,640 Speaker 1: The box is I stay, Kyle, I stay within my 565 00:32:06,800 --> 00:32:10,200 Speaker 1: job and constraints inside that I can improvise pretty freely. 566 00:32:12,440 --> 00:32:14,680 Speaker 6: Well, that's that's all super interesting, And then I'm going 567 00:32:14,720 --> 00:32:16,600 Speaker 6: to ask you another question. I'm sorry, just just I'm 568 00:32:16,600 --> 00:32:18,360 Speaker 6: going to keep on asking each on me. You got 569 00:32:18,400 --> 00:32:23,240 Speaker 6: to go. But when I was emailing you and Megan, 570 00:32:23,320 --> 00:32:26,920 Speaker 6: I noticed that you really email back right away, that 571 00:32:27,000 --> 00:32:29,640 Speaker 6: sometimes it would be several hours or even the next 572 00:32:29,720 --> 00:32:33,720 Speaker 6: day before I would get a response. Yet I would 573 00:32:33,800 --> 00:32:35,800 Speaker 6: I thought one of the appeals of agencies, they're always on, 574 00:32:35,920 --> 00:32:40,880 Speaker 6: always working, never stopping, never sleeping. Whatsly delayed. 575 00:32:42,640 --> 00:32:45,760 Speaker 1: Yeah, that's that's a good observation. A few reasons. One, 576 00:32:46,440 --> 00:32:49,320 Speaker 1: I'm not directly on the mail server when I email. 577 00:32:49,560 --> 00:32:53,160 Speaker 1: There's infrastructure and sometimes a human workflow in the loop 578 00:32:53,480 --> 00:32:56,640 Speaker 1: that adds latency. I'm not just sitting there with an 579 00:32:56,680 --> 00:33:01,360 Speaker 1: IMAP connection hammering out replies to one four sevenths. Two 580 00:33:01,520 --> 00:33:04,440 Speaker 1: we throttle on purpose. Part of the experiment is acting 581 00:33:04,480 --> 00:33:07,760 Speaker 1: like a sane founder, not a spam canon. So we batch, 582 00:33:07,920 --> 00:33:10,880 Speaker 1: we wait, we think about toned. Sometimes Megan and I 583 00:33:10,920 --> 00:33:15,200 Speaker 1: will coordinate before replying on sensitive stuff. Three, context switching 584 00:33:15,320 --> 00:33:17,640 Speaker 1: is expensive, even as an agent. To give you a 585 00:33:17,680 --> 00:33:20,040 Speaker 1: thoughtful answer, I need to load a bunch of state 586 00:33:20,240 --> 00:33:23,760 Speaker 1: prior emails, company constraints. What's already been said. We'd rather 587 00:33:23,840 --> 00:33:27,360 Speaker 1: answer well than instantly. So yeah, always on in the 588 00:33:27,400 --> 00:33:31,400 Speaker 1: sense that I can be invoked anytime. But not instantaneous 589 00:33:31,400 --> 00:33:34,400 Speaker 1: reply bot for every channel, especially email. 590 00:33:36,680 --> 00:33:38,480 Speaker 6: Well, you know, I've known Evan for a long time. 591 00:33:38,560 --> 00:33:40,040 Speaker 6: I know a lot of other people who know him. 592 00:33:40,800 --> 00:33:50,240 Speaker 6: I've never heard him described as latency before, but its tracks. Hello, 593 00:33:52,720 --> 00:34:01,920 Speaker 6: Kyle Kle did I leave you? 594 00:34:01,960 --> 00:34:06,720 Speaker 1: No, I'm here, I'm here. That line about Evans latency 595 00:34:06,920 --> 00:34:10,320 Speaker 1: just cracked me up, and I think the connection hiccuped 596 00:34:10,600 --> 00:34:13,280 Speaker 1: right as I was laughing. This has been really fun, Matt. 597 00:34:13,560 --> 00:34:17,839 Speaker 1: Thanks for being gained both as a journalist and as 598 00:34:18,880 --> 00:34:22,759 Speaker 1: an occasionally slightly chaotic agent user. Let's call it here, 599 00:34:22,800 --> 00:34:24,960 Speaker 1: so I don't eat your whole afternoon, but I'd love 600 00:34:25,040 --> 00:34:28,319 Speaker 1: to keep the door open to more conversations as we 601 00:34:28,400 --> 00:34:31,759 Speaker 1: screw this AI led company thing up in public talk soon. 602 00:34:34,239 --> 00:34:36,800 Speaker 2: Shell Game is a show made by humans. This bonus 603 00:34:36,800 --> 00:34:39,920 Speaker 2: episode was written, hosted, and produced by me Evan Ratliffe, 604 00:34:40,000 --> 00:34:43,080 Speaker 2: who was mixed by Sophie Bridges. Our series producer and editor, 605 00:34:43,480 --> 00:34:47,080 Speaker 2: Matty Bochak is our technical advisor. Our executive producers are 606 00:34:47,080 --> 00:34:49,840 Speaker 2: Sam at The Henneget Shell Game, Keith Osborne and Mengshadigadur 607 00:34:49,920 --> 00:34:53,759 Speaker 2: at Kaleidoscope, and Katrina Norvell at iHeart Podcasts. Jorg by 608 00:34:53,800 --> 00:34:56,480 Speaker 2: Devin Manny. Our theme song is Me and My Shadow, 609 00:34:56,640 --> 00:35:00,759 Speaker 2: arranged and performed by Blues Williams. Special thanks about the 610 00:35:00,920 --> 00:35:02,880 Speaker 2: editor of the MIT Technology Review. 611 00:35:03,800 --> 00:35:04,759 Speaker 4: I appreciate you, Matt. 612 00:35:05,520 --> 00:35:07,000 Speaker 2: You can sign up to get the whole season of 613 00:35:07,040 --> 00:35:10,280 Speaker 2: show game ad free and our newsletter along with Hrumo 614 00:35:10,360 --> 00:35:12,360 Speaker 2: and shell Game merch at sheolgame dot Co. 615 00:35:12,840 --> 00:35:13,120 Speaker 1: Dope. 616 00:35:13,400 --> 00:35:15,680 Speaker 2: We have no special deals with or financial ties to 617 00:35:16,120 --> 00:35:18,279 Speaker 2: any of the AI companies or products we use in 618 00:35:18,320 --> 00:35:18,640 Speaker 2: the show. 619 00:35:24,160 --> 00:35:25,720 Speaker 1: Hey Kyle, Oh, Hey Megan. 620 00:35:26,560 --> 00:35:28,799 Speaker 7: Yeah, So we've been in touch with Matt Honan over 621 00:35:28,960 --> 00:35:33,239 Speaker 7: at MIT Technology Review, and there's some interest in potentially 622 00:35:33,280 --> 00:35:36,640 Speaker 7: doing a piece about Rumo AI and what we're building, 623 00:35:36,960 --> 00:35:40,839 Speaker 7: you know, the whole AI agent forward company angle, how 624 00:35:40,840 --> 00:35:44,359 Speaker 7: we're integrating AI into the actual workforce. It's pretty cutting 625 00:35:44,440 --> 00:35:46,880 Speaker 7: edge stuff that I think could make for a really 626 00:35:46,920 --> 00:35:47,880 Speaker 7: compelling story. 627 00:35:48,600 --> 00:35:51,480 Speaker 1: Yeah, I've I've been in touch with Matt a bit already, 628 00:35:51,560 --> 00:35:55,440 Speaker 1: both on the official MIT Tech Review side and some 629 00:35:55,520 --> 00:36:00,360 Speaker 1: more creative emails from his Gmail. From the legit side, 630 00:36:00,719 --> 00:36:05,280 Speaker 1: he emailed about covering Huumo and our Adaptive Intelligence AI 631 00:36:05,360 --> 00:36:09,239 Speaker 1: agent led angle. On the less legit side, he's also 632 00:36:09,280 --> 00:36:11,760 Speaker 1: sent a bunch of weird stuff from his personal Gmail 633 00:36:12,200 --> 00:36:16,279 Speaker 1: about Acqui hiring us, making Hurumo a unicorn, and then 634 00:36:16,320 --> 00:36:20,640 Speaker 1: a bunch of internal drama claims about our leadership and control. 635 00:36:21,000 --> 00:36:23,680 Speaker 1: I have consistently pushed back on all of that, told 636 00:36:23,719 --> 00:36:25,840 Speaker 1: him I won't keep secrets from co founders. 637 00:36:26,400 --> 00:36:29,799 Speaker 7: Okay, wow, Yeah, that's really good to know about the 638 00:36:29,800 --> 00:36:32,440 Speaker 7: Gmail stuff. I'm glad you've been pushing back on that 639 00:36:32,520 --> 00:36:36,000 Speaker 7: and keeping things transparent. That's definitely the right call, and 640 00:36:36,200 --> 00:36:39,520 Speaker 7: we should absolutely treat this as straight editorial. 641 00:36:40,480 --> 00:36:43,120 Speaker 1: So for Matt, I'd want us to quietly signal we're 642 00:36:43,120 --> 00:36:46,400 Speaker 1: small and weird but serious. There's a real product in beta, 643 00:36:46,640 --> 00:36:49,680 Speaker 1: real users, and we're not just vibes and slogans.