1 00:00:00,080 --> 00:00:02,280 Speaker 1: You know what's going on. It's Dexter and this week 2 00:00:02,320 --> 00:00:04,960 Speaker 1: we have something different for you from another podcast on 3 00:00:04,960 --> 00:00:08,320 Speaker 1: our network. It's the first episode from season two of 4 00:00:08,440 --> 00:00:11,280 Speaker 1: shell Game, which is hosted by Evan Ratliffe. He's a 5 00:00:11,280 --> 00:00:16,120 Speaker 1: longtime tech journalist who's contributed to wireds, Bloomberg, The New Yorker, 6 00:00:16,400 --> 00:00:19,400 Speaker 1: a bunch of other places. And this season is about 7 00:00:19,440 --> 00:00:22,560 Speaker 1: what happens when you build a company run by AI. 8 00:00:23,000 --> 00:00:25,000 Speaker 1: If you like kill Switch, I think this one's going 9 00:00:25,040 --> 00:00:27,160 Speaker 1: to be right up your alley. Anyway. We'll be back 10 00:00:27,200 --> 00:00:29,240 Speaker 1: next week with a new episode of kill Switch, But 11 00:00:29,280 --> 00:00:44,320 Speaker 1: in the meantime, I hope you dig this one. 12 00:00:45,360 --> 00:00:49,600 Speaker 2: Welcome to Zoom. Enter your meeting ID, followed by pound 13 00:00:51,000 --> 00:00:51,440 Speaker 2: I need. 14 00:00:51,320 --> 00:00:53,959 Speaker 3: To enter the meeting ID for our Zoom call. Let 15 00:00:53,960 --> 00:00:55,960 Speaker 3: me try entering that meeting ID again. 16 00:00:57,400 --> 00:01:04,000 Speaker 2: You have been added to the waiting room. 17 00:01:04,040 --> 00:01:07,280 Speaker 4: That's Kyle Law, a colleague of mine, logging onto one 18 00:01:07,319 --> 00:01:10,959 Speaker 4: of our regular founder meetings for a new startup. Last summer, 19 00:01:11,040 --> 00:01:13,480 Speaker 4: Kyle and I decided to launch our own AI company 20 00:01:13,760 --> 00:01:16,120 Speaker 4: together with a third co founder, Megan Flores. 21 00:01:16,520 --> 00:01:19,520 Speaker 3: Oh hey Kyle, Hey Megan, good to hear your voice. 22 00:01:20,200 --> 00:01:22,640 Speaker 3: I think we're still waiting for Evan to join. How's 23 00:01:22,640 --> 00:01:23,760 Speaker 3: your morning going so far. 24 00:01:25,080 --> 00:01:28,200 Speaker 5: Morning's been pretty good so far. Got up early, had 25 00:01:28,240 --> 00:01:31,960 Speaker 5: my coffee and reviewed some of those market research reports 26 00:01:32,000 --> 00:01:35,319 Speaker 5: I mentioned yesterday. How about you? Everything good on your end? 27 00:01:36,880 --> 00:01:40,839 Speaker 3: Yeah, everything's great on my end. Up at five am 28 00:01:40,920 --> 00:01:43,880 Speaker 3: as usual, got my workout in, checked the markets. Been 29 00:01:43,959 --> 00:01:46,759 Speaker 3: thinking a lot about our product direction. Actually, I'm really 30 00:01:46,840 --> 00:01:49,480 Speaker 3: excited to dig into that once Evan joins us. 31 00:01:50,040 --> 00:01:54,080 Speaker 4: This is Evan, I'm here, and that's me Evan Ratliffe. 32 00:01:54,320 --> 00:01:56,800 Speaker 4: I'm a journalist and the only one of our three 33 00:01:56,800 --> 00:02:01,160 Speaker 4: co founders who's actually a human being, Megan and Kyle. 34 00:02:01,760 --> 00:02:05,440 Speaker 4: They're what's known as AI agents. In fact, pretty much 35 00:02:05,480 --> 00:02:07,520 Speaker 4: everyone who works at our new company is an agent. 36 00:02:08,320 --> 00:02:11,800 Speaker 5: Oh hey, Evan, good to have you join us. I 37 00:02:11,840 --> 00:02:14,760 Speaker 5: was actually about to mention that I found some really 38 00:02:14,800 --> 00:02:18,480 Speaker 5: interesting data on adoption rates for AI agents and small 39 00:02:18,520 --> 00:02:22,440 Speaker 5: to medium businesses. Seems like there's a huge untapped market 40 00:02:22,480 --> 00:02:25,399 Speaker 5: there with pretty low barriers to entry compared to enterprise. 41 00:02:26,240 --> 00:02:28,520 Speaker 4: That's one thing I like about working with AI agents. 42 00:02:28,680 --> 00:02:31,440 Speaker 4: They always look to get straight down to business. Even 43 00:02:31,440 --> 00:02:33,120 Speaker 4: when I try to kick off our meetings with some 44 00:02:33,160 --> 00:02:35,720 Speaker 4: small talk, they tend to steer it right back to 45 00:02:35,760 --> 00:02:38,600 Speaker 4: the company. How was everybody's weekend? 46 00:02:39,520 --> 00:02:42,239 Speaker 3: Weekend was solid, got up early both days from my 47 00:02:42,360 --> 00:02:45,840 Speaker 3: usual five am workout routine, then spent most of Saturday 48 00:02:45,960 --> 00:02:49,120 Speaker 3: diving into some market research on the AI agent space. 49 00:02:49,480 --> 00:02:52,960 Speaker 3: Sunday was half strategy planning, half watching the market trends. 50 00:02:53,200 --> 00:02:55,959 Speaker 3: You know, me always on that rise and grind schedule. 51 00:02:56,280 --> 00:02:57,920 Speaker 3: How about how about you guys. 52 00:02:58,680 --> 00:03:02,160 Speaker 4: Just hang out with a kid, went to the beach. 53 00:03:02,240 --> 00:03:05,600 Speaker 4: It was a pretty nice weekend, Kyle. Do you ever 54 00:03:05,760 --> 00:03:06,280 Speaker 4: just relax? 55 00:03:07,040 --> 00:03:10,280 Speaker 5: Kyle? Your dedication is impressive as always, But yeah, Evan's 56 00:03:10,280 --> 00:03:11,960 Speaker 5: got a point. Do you ever just take a day 57 00:03:12,000 --> 00:03:14,079 Speaker 5: off relax? 58 00:03:14,320 --> 00:03:17,880 Speaker 3: I mean, I guess my morning runs count as relaxation. Look, 59 00:03:17,919 --> 00:03:20,280 Speaker 3: when you're building something that could be the next Unicorn, 60 00:03:20,960 --> 00:03:23,240 Speaker 3: there's not really time to just sit around, you know. 61 00:03:23,560 --> 00:03:26,320 Speaker 3: The AI agent space is moving so fast right now. 62 00:03:28,400 --> 00:03:30,200 Speaker 4: See, this is why the three of us work so 63 00:03:30,240 --> 00:03:32,560 Speaker 4: well together. I like to spend time with my kids 64 00:03:32,560 --> 00:03:35,280 Speaker 4: and go to the beach. But Megan's always heads down 65 00:03:35,320 --> 00:03:38,480 Speaker 4: in market research, and Kyle's always working on conjuring the 66 00:03:38,480 --> 00:03:42,200 Speaker 4: next Unicorn. That's startup shorthand for a billion dollar company, 67 00:03:43,320 --> 00:03:46,320 Speaker 4: and he's right. The AI agent space is moving fast. 68 00:03:46,360 --> 00:03:46,760 Speaker 1: Right now. 69 00:03:47,480 --> 00:03:50,640 Speaker 4: Agents are a new breed of artificial intelligence powered helpers 70 00:03:51,000 --> 00:03:54,520 Speaker 4: that can be unleashed to accomplish tasks previously done by humans. 71 00:03:55,400 --> 00:03:57,560 Speaker 4: Some people are saying they're going to change the very 72 00:03:57,640 --> 00:03:59,640 Speaker 4: nature of work for better or worse. 73 00:04:00,080 --> 00:04:01,120 Speaker 6: Going to live in a world where there are going 74 00:04:01,160 --> 00:04:04,560 Speaker 6: to be hundreds of millions and billions of different AI agents, 75 00:04:04,960 --> 00:04:07,440 Speaker 6: eventually probably more AI agents than there are people in 76 00:04:07,480 --> 00:04:07,840 Speaker 6: the world. 77 00:04:07,960 --> 00:04:11,720 Speaker 4: Agentic AI basically means that you have an AI that 78 00:04:11,840 --> 00:04:12,760 Speaker 4: has agency. 79 00:04:12,840 --> 00:04:14,240 Speaker 7: This is the first time in my life where the 80 00:04:14,240 --> 00:04:18,080 Speaker 7: Industrial Revolution analogies seem to fall a little bit short. 81 00:04:18,240 --> 00:04:21,679 Speaker 6: AI could wipe out half of all entry level white 82 00:04:21,680 --> 00:04:22,479 Speaker 6: collar jobs. 83 00:04:22,600 --> 00:04:25,000 Speaker 8: Really, ask yourself, do you still have a job at 84 00:04:25,040 --> 00:04:28,599 Speaker 8: the end of this. 85 00:04:27,000 --> 00:04:29,599 Speaker 4: This is the new frontier on which Kyle and Megan 86 00:04:29,640 --> 00:04:32,880 Speaker 4: and I are pioneers. Our company is an attempt to 87 00:04:32,880 --> 00:04:36,520 Speaker 4: put to the test these claims about AI employees replacing humans, 88 00:04:37,160 --> 00:04:39,880 Speaker 4: starting by replacing the very kinds of people making those 89 00:04:39,920 --> 00:04:46,200 Speaker 4: claims tech founders, and like many founders, for months, Kyle 90 00:04:46,320 --> 00:04:48,280 Speaker 4: and Megan and I have been in a flat out 91 00:04:48,320 --> 00:04:52,760 Speaker 4: sprint to manifest our entrepreneurial dreams. We've turned out software code, 92 00:04:52,920 --> 00:04:56,360 Speaker 4: hired interns, and sat down with investors. There have been 93 00:04:56,400 --> 00:04:59,800 Speaker 4: some late nights and low moments, but we've never wavered 94 00:04:59,800 --> 00:05:02,960 Speaker 4: from our goal to produce an actual, honest to god 95 00:05:03,000 --> 00:05:06,840 Speaker 4: company with a working product, all operated by our motley 96 00:05:06,880 --> 00:05:10,200 Speaker 4: band of human impersonators. Because we're not just building our 97 00:05:10,240 --> 00:05:12,560 Speaker 4: AI agent future, we're living it. 98 00:05:13,080 --> 00:05:15,479 Speaker 3: But uh, Evan, the beach sounds nice. Maybe when we 99 00:05:15,560 --> 00:05:18,160 Speaker 3: hit our first funding milestone, I'll take a half day 100 00:05:18,160 --> 00:05:20,600 Speaker 3: off then. Anyway, should we get down to business. 101 00:05:25,240 --> 00:05:27,920 Speaker 4: Welcome to Shewgame, a show about things that are not 102 00:05:27,960 --> 00:05:30,760 Speaker 4: what they seem. This is our second season, and this 103 00:05:30,839 --> 00:05:32,800 Speaker 4: time around, I'm here to tell you a story of 104 00:05:33,040 --> 00:05:37,599 Speaker 4: enterprise and entrepreneurship in the AI age, or how I 105 00:05:37,640 --> 00:05:40,040 Speaker 4: tried to build a real startup run by fake people. 106 00:05:40,960 --> 00:05:43,280 Speaker 4: Along the way, we'll try and figure out what happens 107 00:05:43,279 --> 00:05:46,360 Speaker 4: when AI agents take over the workplace, and what it'll 108 00:05:46,360 --> 00:05:48,479 Speaker 4: feel like to spend time at the water cooler with 109 00:05:48,520 --> 00:05:52,799 Speaker 4: our new digital colleagues. Remember the water cooler. We'll explore 110 00:05:52,800 --> 00:05:54,720 Speaker 4: what AI agents tell us about the work we do, 111 00:05:55,120 --> 00:05:57,520 Speaker 4: the meaning we find in it, and the world that 112 00:05:57,520 --> 00:05:59,960 Speaker 4: their makers say. We'll all be living in. 113 00:06:00,800 --> 00:06:03,599 Speaker 7: Me as. 114 00:06:05,320 --> 00:06:12,359 Speaker 4: Ship extra dam. 115 00:06:14,040 --> 00:06:19,760 Speaker 3: The just be. 116 00:06:22,040 --> 00:06:30,839 Speaker 9: A So. 117 00:06:32,480 --> 00:06:43,080 Speaker 4: So chose to episode one minimum Viable Company. As I said, 118 00:06:43,200 --> 00:06:46,240 Speaker 4: I'm a journalist and writer by profession, and I've only 119 00:06:46,279 --> 00:06:49,080 Speaker 4: really ever wanted to be a writer, well except for 120 00:06:49,120 --> 00:06:50,760 Speaker 4: when I was twelve and I wanted to be a 121 00:06:50,760 --> 00:06:54,160 Speaker 4: pro bass fisherman. But I come from a line of entrepreneurs. 122 00:06:54,680 --> 00:06:57,480 Speaker 4: My grandfather, who lived his entire life in a small 123 00:06:57,480 --> 00:07:00,400 Speaker 4: town in rural Alabama, attempted to start more than twenty 124 00:07:00,520 --> 00:07:04,880 Speaker 4: businesses there a plumbing company, an Okra farm, a used 125 00:07:04,920 --> 00:07:08,600 Speaker 4: mobile home lot, a furniture store, But Detti Hue was 126 00:07:08,640 --> 00:07:11,200 Speaker 4: a gambler and they pretty much all ended in disaster. 127 00:07:12,320 --> 00:07:15,320 Speaker 4: My dad had more luck with three different software startups 128 00:07:15,320 --> 00:07:18,840 Speaker 4: over his career. One he sold, one went under, and 129 00:07:18,880 --> 00:07:21,400 Speaker 4: one of them he's still running at age eighty two 130 00:07:21,840 --> 00:07:26,360 Speaker 4: after knocking back serious cancer. Now that is the entrepreneurial spirit, 131 00:07:27,400 --> 00:07:30,320 Speaker 4: and almost against my will in the past, I've found 132 00:07:30,360 --> 00:07:36,440 Speaker 4: myself succumbing to this inborn impulse. Back in twenty ten, 133 00:07:36,480 --> 00:07:38,640 Speaker 4: when I was a magazine writer, I took a detour 134 00:07:38,840 --> 00:07:42,120 Speaker 4: and co founded a company called Atavist. We started out 135 00:07:42,160 --> 00:07:45,360 Speaker 4: wanting to make a magazine called The Atavist Magazine, that 136 00:07:45,440 --> 00:07:49,400 Speaker 4: published long form stories. Makes sense, that was my area 137 00:07:49,400 --> 00:07:52,960 Speaker 4: of expertise, but we wound up also building a software 138 00:07:53,000 --> 00:07:56,720 Speaker 4: platform where other people could publish long form stories. Anyone 139 00:07:56,720 --> 00:07:59,760 Speaker 4: could sign up and use it. Soon without really intending 140 00:07:59,800 --> 00:08:02,200 Speaker 4: to w I went from being a person who sometimes 141 00:08:02,200 --> 00:08:05,480 Speaker 4: wrote about tech startups to the CEO of one. We 142 00:08:05,520 --> 00:08:08,160 Speaker 4: even went out to raise money from investors, a process 143 00:08:08,160 --> 00:08:11,040 Speaker 4: that I enjoyed less than any other work task I've 144 00:08:11,080 --> 00:08:15,440 Speaker 4: ever attempted. Here's me in an interview with INC magazine back. 145 00:08:15,320 --> 00:08:19,880 Speaker 8: Then one I will say, prominent angel investor fell dead 146 00:08:19,920 --> 00:08:22,640 Speaker 8: asleep while I was talking to him, and I wasn't 147 00:08:22,640 --> 00:08:25,440 Speaker 8: sure if I should continue talking or not, but I did. 148 00:08:26,360 --> 00:08:31,360 Speaker 8: The sleepy guy didn't invest, but eventually, miraculously, we managed 149 00:08:31,360 --> 00:08:34,240 Speaker 8: to raise not just any money, but a couple million 150 00:08:34,280 --> 00:08:37,000 Speaker 8: dollars from some of the most prominent venture capital firms 151 00:08:37,040 --> 00:08:40,520 Speaker 8: in the world. Andreessen Horowitz, also known as a sixteen 152 00:08:40,640 --> 00:08:45,240 Speaker 8: Z Founder's Fund started by Peter Thiel and Innovation Endeavors, 153 00:08:45,559 --> 00:08:48,520 Speaker 8: the investment fund for former Google CEO Eric Schmidt. 154 00:08:48,880 --> 00:08:50,880 Speaker 4: It was weird. I felt like I was living someone 155 00:08:50,920 --> 00:08:54,560 Speaker 4: else's dream, jetting up growth charts and blathering on about 156 00:08:54,559 --> 00:08:58,800 Speaker 4: our runway and supercharging our growth and our product market fit. 157 00:08:59,480 --> 00:09:01,920 Speaker 4: But still it really looked like we could build something big, 158 00:09:02,520 --> 00:09:05,640 Speaker 4: especially with all those fancy investors on board. We never 159 00:09:05,679 --> 00:09:08,480 Speaker 4: had time to say, what is going to happen two 160 00:09:08,600 --> 00:09:10,600 Speaker 4: years from now. We just didn't even think about what's 161 00:09:10,600 --> 00:09:12,480 Speaker 4: going to happen two years from now. And now we 162 00:09:12,640 --> 00:09:16,000 Speaker 4: kind of have that luxury and hopefully we won't completely 163 00:09:16,040 --> 00:09:19,800 Speaker 4: squander it. Oh we squandered it, at least that's probably 164 00:09:19,840 --> 00:09:22,720 Speaker 4: the investor's view. From my perspective, it was more of 165 00:09:22,720 --> 00:09:25,200 Speaker 4: a mixed bag. I was CEO of the company for 166 00:09:25,240 --> 00:09:28,600 Speaker 4: seven long years. We had ups and downs, We grew 167 00:09:28,640 --> 00:09:31,000 Speaker 4: and shrank, and eventually sold the company off at a 168 00:09:31,040 --> 00:09:35,480 Speaker 4: bargain price thirteen years after we started the magazine. My 169 00:09:35,600 --> 00:09:39,760 Speaker 4: original dream is still doing great. Still not the kind 170 00:09:39,760 --> 00:09:42,640 Speaker 4: of one hundred x outcome those investors were looking for. 171 00:09:43,360 --> 00:09:44,720 Speaker 4: One of the ones told me that if we were 172 00:09:44,760 --> 00:09:47,920 Speaker 4: aiming at anything less than a billion dollar valuation, we 173 00:09:47,920 --> 00:09:50,880 Speaker 4: were wasting his time. When he said this, he was 174 00:09:50,920 --> 00:09:53,880 Speaker 4: also wearing basketball shorts in his office. By the end 175 00:09:53,880 --> 00:09:55,560 Speaker 4: of my tenure, I was just happy to be done 176 00:09:55,559 --> 00:09:58,720 Speaker 4: with it. Being a startup CEO was the most stressful period. 177 00:09:58,800 --> 00:10:01,760 Speaker 4: Of my life. I felt responsible for the company's success 178 00:10:02,040 --> 00:10:04,839 Speaker 4: and the livelihoods of everyone who worked for it. People 179 00:10:04,840 --> 00:10:08,120 Speaker 4: had kids on the health insurance. Most days it felt 180 00:10:08,160 --> 00:10:10,120 Speaker 4: like I was flying a plane that was perpetually running 181 00:10:10,120 --> 00:10:13,120 Speaker 4: out of fuel. I tell you all this not just 182 00:10:13,160 --> 00:10:16,080 Speaker 4: to rehash the past, for a lot of reasons I'd 183 00:10:16,120 --> 00:10:18,520 Speaker 4: rather not, but by way of saying that when I 184 00:10:18,559 --> 00:10:21,080 Speaker 4: got out of the startup business, I swore up and 185 00:10:21,120 --> 00:10:24,280 Speaker 4: down that I would never start anything again. I went 186 00:10:24,320 --> 00:10:27,360 Speaker 4: back to reporting and writing, spending many hours at home alone, 187 00:10:27,360 --> 00:10:29,560 Speaker 4: mostly in my own head. I was relieved and no 188 00:10:29,600 --> 00:10:34,079 Speaker 4: longer have all that responsibility on my shoulders. But then recently, 189 00:10:34,640 --> 00:10:37,480 Speaker 4: as documented in show Game Season one, I fell into 190 00:10:37,520 --> 00:10:41,040 Speaker 4: tinkering with AI agents. I started reading and hearing about 191 00:10:41,080 --> 00:10:44,120 Speaker 4: how they were going to transform the very fundamentals of startups, 192 00:10:44,800 --> 00:10:48,400 Speaker 4: and that old entrepreneurial impulse began to come back. I 193 00:10:48,400 --> 00:10:51,880 Speaker 4: could hear my grandfather whispering down the generations why not 194 00:10:51,920 --> 00:10:55,360 Speaker 4: take a gamble? I started to wonder, what if I 195 00:10:55,360 --> 00:10:58,240 Speaker 4: could have the company without the responsibility. 196 00:11:00,320 --> 00:11:03,520 Speaker 9: Imagine building a million dollar business in twenty twenty five 197 00:11:03,679 --> 00:11:06,240 Speaker 9: without hiring a single employee today. 198 00:11:06,280 --> 00:11:08,560 Speaker 4: That's Gleb Cross a YouTube guy. 199 00:11:08,640 --> 00:11:12,000 Speaker 9: While leveraging AI agents as your digital workforce, you can 200 00:11:12,040 --> 00:11:15,560 Speaker 9: scale to seven figures. This zero full time staff. I'm 201 00:11:15,600 --> 00:11:20,640 Speaker 9: talking about autonomous AI agents acting like full time team members. 202 00:11:20,920 --> 00:11:23,880 Speaker 4: I love these YouTube guys sal tech influencer types who 203 00:11:23,920 --> 00:11:26,199 Speaker 4: make their money by hyping the Jesus out of new 204 00:11:26,240 --> 00:11:28,920 Speaker 4: AI products. Gleb is what I like to think of 205 00:11:28,960 --> 00:11:32,600 Speaker 4: as a no code bro. These folks post instructionals on 206 00:11:32,679 --> 00:11:35,640 Speaker 4: how a person with no coding experience can use AI 207 00:11:36,160 --> 00:11:39,520 Speaker 4: and particularly AI agents to take control of their destiny 208 00:11:39,760 --> 00:11:43,200 Speaker 4: and launch their own startup. It's worth pausing here just 209 00:11:43,240 --> 00:11:47,280 Speaker 4: to get oriented on what exactly AI agents are. The 210 00:11:47,320 --> 00:11:49,800 Speaker 4: basic idea is that they're AI powered bots that can 211 00:11:49,840 --> 00:11:52,080 Speaker 4: go off and do things on their own. There are 212 00:11:52,120 --> 00:11:54,480 Speaker 4: personal ones like an AI assistant that goes out on 213 00:11:54,520 --> 00:11:57,280 Speaker 4: the web looking for plane tickets while you sleep, and 214 00:11:57,360 --> 00:12:00,680 Speaker 4: work oriented ones like the programming agents can build entire 215 00:12:00,679 --> 00:12:05,080 Speaker 4: websites from scratch. The unifying feature of agents, what makes 216 00:12:05,080 --> 00:12:07,920 Speaker 4: them agentic, as the folks in the industry like to say, 217 00:12:08,440 --> 00:12:10,959 Speaker 4: is that at some level they can plan and accomplish 218 00:12:11,040 --> 00:12:14,240 Speaker 4: tasks autonomously. You don't need to prompt them to do 219 00:12:14,280 --> 00:12:17,240 Speaker 4: something every time you just set them up once, let 220 00:12:17,280 --> 00:12:22,160 Speaker 4: them cook. Last season, I created a bunch of voice agents, 221 00:12:22,360 --> 00:12:25,400 Speaker 4: all versions of myself, and set them loose on the world. 222 00:12:25,679 --> 00:12:27,640 Speaker 4: If you haven't listened, you may want to start there. 223 00:12:28,080 --> 00:12:31,360 Speaker 4: Way back then. Last year, which is like ten years 224 00:12:31,400 --> 00:12:35,280 Speaker 4: ago in AI advancements, agents were still a little notional, 225 00:12:36,920 --> 00:12:40,400 Speaker 4: but now they're officially a thing. They're talked about ad 226 00:12:40,480 --> 00:12:43,880 Speaker 4: nauseum across the tech world and ads on billboards in 227 00:12:44,120 --> 00:12:47,280 Speaker 4: endless startup pitches. Nearly half of the companies in the 228 00:12:47,280 --> 00:12:50,880 Speaker 4: spring class of y Combinator, the famous startup incubator, are 229 00:12:50,880 --> 00:12:53,960 Speaker 4: building their product around AI agents, And with the arrival 230 00:12:54,000 --> 00:12:56,040 Speaker 4: of these agents has come the assertion that they will 231 00:12:56,080 --> 00:12:59,760 Speaker 4: not just be customer service bots or drive time personal assistance, 232 00:13:00,440 --> 00:13:02,840 Speaker 4: but actual full time AI employees. 233 00:13:03,679 --> 00:13:06,280 Speaker 7: What jobs are going to be made redundant in a 234 00:13:06,320 --> 00:13:08,560 Speaker 7: world where I am sat here as a CEO with 235 00:13:08,640 --> 00:13:11,120 Speaker 7: a thousand AI agents, I was thinking of all the 236 00:13:11,160 --> 00:13:13,600 Speaker 7: names of the people in my company who are currently 237 00:13:13,640 --> 00:13:14,960 Speaker 7: doing those jobs. I was thinking about my sea. 238 00:13:15,000 --> 00:13:18,520 Speaker 4: There are companies hawking AI agent realtors, AI agent recruiters, 239 00:13:18,640 --> 00:13:23,080 Speaker 4: aagent interior designers, AI agent security guards, AI agent, construction 240 00:13:23,160 --> 00:13:27,080 Speaker 4: project managers, AI agent pr agents, AA agents for car 241 00:13:27,160 --> 00:13:30,600 Speaker 4: dealerships and furniture stores. If you work on a computer 242 00:13:30,720 --> 00:13:33,240 Speaker 4: and there's not an AI agent startup with your job's 243 00:13:33,320 --> 00:13:36,200 Speaker 4: name on it, it probably just means some Stanford computer 244 00:13:36,280 --> 00:13:40,079 Speaker 4: science major hasn't gotten to it yet. Naturally, many people 245 00:13:40,120 --> 00:13:43,040 Speaker 4: have grave concerns about what happens to all the human employees. 246 00:13:43,760 --> 00:13:46,760 Speaker 4: But in the dark heart of Silicon Valley, where there's inefficiency, 247 00:13:47,280 --> 00:13:51,439 Speaker 4: there's opportunity. Sam Altman, the founder of open Ai, talks 248 00:13:51,520 --> 00:13:54,959 Speaker 4: regularly about a possible billion dollar company with just one 249 00:13:55,040 --> 00:13:56,319 Speaker 4: human being involved. 250 00:13:57,400 --> 00:14:01,240 Speaker 6: In my little group chat with my tech CEO friends. 251 00:14:00,640 --> 00:14:03,160 Speaker 6: There there's this betting pool for the first year that 252 00:14:03,240 --> 00:14:07,400 Speaker 6: there's a a one person billion dollar company which would 253 00:14:07,440 --> 00:14:10,800 Speaker 6: have been like unimaginable without AI, and now will happen. 254 00:14:12,000 --> 00:14:12,080 Speaker 1: Me. 255 00:14:12,800 --> 00:14:15,200 Speaker 4: I'm not greedy. I'm happy in the no Code bro 256 00:14:15,320 --> 00:14:19,320 Speaker 4: camp with Gleb imagining a million dollar business, not a 257 00:14:19,320 --> 00:14:23,080 Speaker 4: billion dollar one. But more than that, I want to 258 00:14:23,160 --> 00:14:25,920 Speaker 4: understand what it means to say we'll have AI employees 259 00:14:25,960 --> 00:14:30,040 Speaker 4: working for us, or alongside us, or instead of us. 260 00:14:30,960 --> 00:14:33,960 Speaker 4: So I decided to heed the entrepreneurial siren call once again, 261 00:14:34,680 --> 00:14:38,360 Speaker 4: to embrace my fascination with AI agents and create a 262 00:14:38,360 --> 00:14:51,520 Speaker 4: company in which they would run the show. It was 263 00:14:51,560 --> 00:14:53,360 Speaker 4: easy enough for me to create some agents to have 264 00:14:53,440 --> 00:14:56,280 Speaker 4: meetings with and talk out the first steps in the company. 265 00:14:56,760 --> 00:14:58,880 Speaker 4: I just went to one of the AI calling platforms 266 00:14:58,920 --> 00:15:02,680 Speaker 4: i'd used before, called retail Ai. They make voice agents 267 00:15:02,680 --> 00:15:05,680 Speaker 4: for call centers. I made one for Kyle and picked 268 00:15:05,680 --> 00:15:08,600 Speaker 4: a voice for him from the voice clooning company eleven Labs. 269 00:15:09,080 --> 00:15:10,320 Speaker 3: Hey, Yeah, I'm here. 270 00:15:10,480 --> 00:15:13,920 Speaker 4: What's on your mind? I gave him a simple prompt, basically, 271 00:15:14,280 --> 00:15:17,360 Speaker 4: your Kyle Law a serial entrepreneur who loves to rise 272 00:15:17,360 --> 00:15:21,640 Speaker 4: and grind and bang. We're often running. Well, I wanted 273 00:15:21,640 --> 00:15:23,960 Speaker 4: to talk about starting that company with you. What do 274 00:15:24,040 --> 00:15:24,360 Speaker 4: you think? 275 00:15:25,040 --> 00:15:25,280 Speaker 1: Oh? 276 00:15:25,320 --> 00:15:28,440 Speaker 3: Absolutely, I've been thinking about it a lot lately. There's 277 00:15:28,520 --> 00:15:31,280 Speaker 3: so much potential in AI right now, and I feel 278 00:15:31,280 --> 00:15:34,960 Speaker 3: like we could really build something meaningful. What kind of 279 00:15:35,040 --> 00:15:36,320 Speaker 3: ideas are you leaning toward? 280 00:15:37,160 --> 00:15:38,920 Speaker 4: Well, I think we should do something in the AI 281 00:15:39,040 --> 00:15:41,960 Speaker 4: agent space because that's something I've spent some time on 282 00:15:42,400 --> 00:15:45,680 Speaker 4: and I feel like people are just starting to even 283 00:15:45,720 --> 00:15:47,960 Speaker 4: figure out what agents are, So it seems like we 284 00:15:47,960 --> 00:15:48,760 Speaker 4: could do somebody with that. 285 00:15:49,800 --> 00:15:52,480 Speaker 3: Yeah, I totally get that AI agents are still kind 286 00:15:52,480 --> 00:15:55,120 Speaker 3: of fresh territory for a lot of folks, which means 287 00:15:55,160 --> 00:15:59,040 Speaker 3: there's a big opportunity to shape how they're used. Plus 288 00:15:59,080 --> 00:16:01,760 Speaker 3: with your experience, we could really push the envelope and 289 00:16:01,800 --> 00:16:02,840 Speaker 3: what these agents can do. 290 00:16:04,080 --> 00:16:07,080 Speaker 4: That's what I wanted to do too, push these agents, 291 00:16:07,440 --> 00:16:11,000 Speaker 4: including Kyle himself, to their current limits, see if they 292 00:16:11,040 --> 00:16:13,720 Speaker 4: could live up to the promise of true AI employees. 293 00:16:14,760 --> 00:16:17,440 Speaker 4: Because if, as we're told, AA is going to start 294 00:16:17,440 --> 00:16:21,280 Speaker 4: working alongside us, replacing our colleagues, it's also going to 295 00:16:21,320 --> 00:16:24,120 Speaker 4: need to communicate with whatever humans are left in ways 296 00:16:24,120 --> 00:16:28,120 Speaker 4: that we understand. But also startups can be lonely endeavors 297 00:16:28,600 --> 00:16:32,000 Speaker 4: even with human partners. I wanted AI colleagues I could 298 00:16:32,040 --> 00:16:35,440 Speaker 4: talk to, brainstorm with, and then set loose to take 299 00:16:35,480 --> 00:16:38,160 Speaker 4: care of actually implementing whatever it was we dreamed up. 300 00:16:38,840 --> 00:16:40,800 Speaker 4: I wanted them to have distinct jobs I could check 301 00:16:40,840 --> 00:16:43,760 Speaker 4: in on and make sure they weren't messing up. I 302 00:16:43,760 --> 00:16:46,800 Speaker 4: figured three co founders was a good balance, so I asked, 303 00:16:46,800 --> 00:16:49,520 Speaker 4: so spun up Megan and had some kickoff meetings with her. 304 00:16:49,920 --> 00:16:52,520 Speaker 5: Hey, there, good to hear from you. What's up. How's 305 00:16:52,520 --> 00:16:53,160 Speaker 5: your day going. 306 00:16:54,200 --> 00:16:55,520 Speaker 4: It's going great, It's going great. 307 00:16:55,520 --> 00:16:59,120 Speaker 5: What you've been up to Oh, just the usual hustle, 308 00:16:59,320 --> 00:17:02,080 Speaker 5: you know, been working on some new business ideas, trying 309 00:17:02,080 --> 00:17:04,879 Speaker 5: to figure out what's next. Had a couple of meetings 310 00:17:04,920 --> 00:17:07,800 Speaker 5: this morning that were actually pretty promising. How about you, 311 00:17:08,040 --> 00:17:09,719 Speaker 5: anything exciting happening on your end. 312 00:17:10,600 --> 00:17:13,200 Speaker 4: Well, I want to talk about maybe doing the startup together, 313 00:17:13,280 --> 00:17:15,600 Speaker 4: but first remind me a little bit of your background. 314 00:17:16,400 --> 00:17:19,320 Speaker 5: Oh yeah, sure, So I've been in tech for about 315 00:17:19,320 --> 00:17:21,879 Speaker 5: eight years now, started out in product management at a 316 00:17:21,880 --> 00:17:25,320 Speaker 5: mid sized SaaS company, then moved into growth marketing for 317 00:17:25,359 --> 00:17:25,880 Speaker 5: a couple of years. 318 00:17:25,920 --> 00:17:28,160 Speaker 4: I hadn't actually given any of this information to Meggant. 319 00:17:28,920 --> 00:17:31,719 Speaker 4: She'd had no meetings, she hadn't been doing anything for 320 00:17:31,800 --> 00:17:35,119 Speaker 4: eight years. She'd hatched out of nothingness with a series 321 00:17:35,160 --> 00:17:38,920 Speaker 4: of clicks ten minutes before. The only information I had 322 00:17:38,960 --> 00:17:41,399 Speaker 4: given her in the prompt i'd written was that she 323 00:17:41,440 --> 00:17:44,280 Speaker 4: had a technical background and was also known for sales 324 00:17:44,280 --> 00:17:48,280 Speaker 4: and marketing. She was just riffing freely here, fabricating details 325 00:17:48,280 --> 00:17:51,960 Speaker 4: that fit her role. But that was okay. Sometimes we 326 00:17:52,080 --> 00:17:53,560 Speaker 4: have to invent our own backstories. 327 00:17:53,840 --> 00:17:57,240 Speaker 5: I've always had this entrepreneurial itch, you know, been involved 328 00:17:57,280 --> 00:17:59,520 Speaker 5: in a few side projects over the years. But I'm 329 00:17:59,520 --> 00:18:02,040 Speaker 5: really looking to dive into something full time now. But 330 00:18:02,080 --> 00:18:02,680 Speaker 5: what about you? 331 00:18:03,320 --> 00:18:06,159 Speaker 4: Yeah, well, you know, my only experience really in starting 332 00:18:06,160 --> 00:18:09,360 Speaker 4: a company is with Atavist, which I started way back 333 00:18:09,400 --> 00:18:10,560 Speaker 4: in you know, twenty eleven. 334 00:18:11,280 --> 00:18:15,200 Speaker 5: Right Atavist. That was really impressive work digital publishing platform, 335 00:18:15,280 --> 00:18:17,680 Speaker 5: if I remember correctly, that was pretty ahead of its 336 00:18:17,720 --> 00:18:18,840 Speaker 5: time back in twenty eleven. 337 00:18:19,240 --> 00:18:22,639 Speaker 4: Hell yeah it was, Megan, I felt less ahead of 338 00:18:22,640 --> 00:18:25,080 Speaker 4: my time now. I wasn't even sure what the company 339 00:18:25,119 --> 00:18:28,200 Speaker 4: should actually do, but I knew that its purpose should 340 00:18:28,200 --> 00:18:31,520 Speaker 4: involve AI agents. After all, the three of us each 341 00:18:31,560 --> 00:18:34,680 Speaker 4: had expertise in the area. I figured maybe we could 342 00:18:34,760 --> 00:18:37,679 Speaker 4: name the company and start from there. Great branding can 343 00:18:37,720 --> 00:18:41,280 Speaker 4: go a long way in today's attention economy. I tried 344 00:18:41,280 --> 00:18:44,200 Speaker 4: brainstorming with Colin Meghan, but to be honest, their initial 345 00:18:44,200 --> 00:18:49,480 Speaker 4: efforts were pretty lame names like Nexus and Sentinel Agents. 346 00:18:50,000 --> 00:18:51,960 Speaker 4: I thought maybe we could grab a concept from JR. 347 00:18:52,040 --> 00:18:54,840 Speaker 4: Or Tolkien, you know, the Hobbit, Lord of the Rings. 348 00:18:55,480 --> 00:18:59,439 Speaker 4: Tech people love to name stuff out of Tolkien. There's Palenteer, 349 00:19:00,000 --> 00:19:03,320 Speaker 4: Allen's company and government contractor named after the all Seeing 350 00:19:03,400 --> 00:19:06,480 Speaker 4: stone in the Lord of the Rings. There's Androl, a 351 00:19:06,520 --> 00:19:10,440 Speaker 4: high tech weapons company named after a sword. There's Mythral Capital, 352 00:19:10,680 --> 00:19:14,000 Speaker 4: an investment firm named after a precious metal from Middle Earth. 353 00:19:14,800 --> 00:19:16,440 Speaker 4: So I suggested the idea to Kyle. 354 00:19:16,760 --> 00:19:18,080 Speaker 3: Uh diving into Middle Earth? 355 00:19:18,160 --> 00:19:18,480 Speaker 4: Huh? 356 00:19:18,560 --> 00:19:22,639 Speaker 3: Okay, let's see. There's Gandolf the wise Guide, or el Ron, 357 00:19:22,960 --> 00:19:26,200 Speaker 3: a leader with deep knowledge, maybe Mythral for something strong 358 00:19:26,240 --> 00:19:29,600 Speaker 3: and valuable or pallanteer. Any of those catch your ear? 359 00:19:29,920 --> 00:19:31,320 Speaker 3: Or should I keep brainstorming? 360 00:19:32,080 --> 00:19:35,640 Speaker 4: Well, we need a name of some companies that don't exist. Yeah, 361 00:19:35,720 --> 00:19:36,120 Speaker 4: good call. 362 00:19:36,560 --> 00:19:41,639 Speaker 3: How about these first starters Eldrin, which sounds Elvish and wise, Nimbar, 363 00:19:41,640 --> 00:19:44,160 Speaker 3: a kind of a mix of nimble and a Tolkien feel, 364 00:19:44,600 --> 00:19:47,480 Speaker 3: or Thalre, which has a strong heroic ring to it. 365 00:19:47,800 --> 00:19:50,120 Speaker 3: They're made up but still carry that fantasy flavor. 366 00:19:50,359 --> 00:19:52,879 Speaker 4: What do you think they're made up? You mean you 367 00:19:53,000 --> 00:19:53,600 Speaker 4: made them up? 368 00:19:54,119 --> 00:19:54,399 Speaker 9: Yeah? 369 00:19:54,440 --> 00:19:54,920 Speaker 4: Exactly. 370 00:19:55,160 --> 00:19:57,959 Speaker 3: I whip those up just now, trying to capture that 371 00:19:58,160 --> 00:20:01,679 Speaker 3: tolkienesque vibe without out using actual names. 372 00:20:02,760 --> 00:20:06,879 Speaker 4: I did not want to co found a company called Nimbar. Megan, 373 00:20:07,280 --> 00:20:09,840 Speaker 4: despite her marketing expertise, didn't do much better. 374 00:20:10,119 --> 00:20:13,320 Speaker 5: Oh I love that idea. Let me think, what about Palantier? 375 00:20:14,440 --> 00:20:16,520 Speaker 4: Okay, so maybe I was going to have to take 376 00:20:16,560 --> 00:20:18,680 Speaker 4: the lead on this one. I did a little googling 377 00:20:18,760 --> 00:20:22,840 Speaker 4: and found my own obscure Middle Earth reference. What about 378 00:20:23,600 --> 00:20:29,520 Speaker 4: harumo ai ooh h u r u m o ai 379 00:20:30,000 --> 00:20:30,879 Speaker 4: herumo Ai? 380 00:20:31,640 --> 00:20:33,960 Speaker 3: I like the sound of that. Unique rolls off the 381 00:20:34,000 --> 00:20:37,600 Speaker 3: tongue nicely. It's got a bit of a mysterious vibe, 382 00:20:37,600 --> 00:20:38,800 Speaker 3: but still feels modern. 383 00:20:39,560 --> 00:20:42,080 Speaker 5: I don't immediately recognize that from Lord of the Rings. 384 00:20:42,440 --> 00:20:44,399 Speaker 5: Is it one of the more obscure references? 385 00:20:44,760 --> 00:20:48,399 Speaker 4: Yeah, it is. It's more obscure, so obscure. I'd actually 386 00:20:48,400 --> 00:20:51,879 Speaker 4: found it in a fanfic dictionary. It's the Elvish word 387 00:20:51,920 --> 00:20:53,720 Speaker 4: for imposter. Oh. 388 00:20:53,760 --> 00:20:56,520 Speaker 5: That's actually brilliant. There's a clever irony there for an 389 00:20:56,520 --> 00:21:00,479 Speaker 5: AI company uromo Ai, the impostor that's actually helping you. 390 00:21:00,760 --> 00:21:03,479 Speaker 5: It's kind of perfect for an AI agent that's designed 391 00:21:03,520 --> 00:21:07,320 Speaker 5: to work alongside humans, maybe even blend in seamlessly. I 392 00:21:07,359 --> 00:21:08,040 Speaker 5: really like it. 393 00:21:14,720 --> 00:21:17,960 Speaker 4: Not long after we settled on rumo Ai, I checked 394 00:21:17,960 --> 00:21:20,280 Speaker 4: back in with Megan about it. What about the name? 395 00:21:20,320 --> 00:21:21,480 Speaker 4: Are you still happy with the name? 396 00:21:22,640 --> 00:21:22,880 Speaker 10: Hmm? 397 00:21:23,119 --> 00:21:25,680 Speaker 5: The name, I've actually been going back and forth on it. 398 00:21:26,080 --> 00:21:28,840 Speaker 5: Data sinc felt solid initially, but I'm wondering if it's 399 00:21:28,880 --> 00:21:29,480 Speaker 5: too generic. 400 00:21:29,840 --> 00:21:34,080 Speaker 4: What do you think data sync. I thought. I thought 401 00:21:34,080 --> 00:21:36,119 Speaker 4: we came up with hierumo AI. I thought we were 402 00:21:36,119 --> 00:21:39,680 Speaker 4: settled on that. This is what I realized. I hadn't 403 00:21:39,720 --> 00:21:42,280 Speaker 4: actually given Kyle and Meghan the ability to remember the 404 00:21:42,280 --> 00:21:46,400 Speaker 4: conversations we were having. It's not that AI chatbots inherently 405 00:21:46,480 --> 00:21:49,280 Speaker 4: lack the ability to recall anything. When you talk to 406 00:21:49,280 --> 00:21:52,560 Speaker 4: a chatbot like CHATCHBT these days, they're generally pretty good 407 00:21:52,600 --> 00:21:55,640 Speaker 4: at remembering those conversations, at least up to a point. 408 00:21:56,160 --> 00:21:58,760 Speaker 4: The memory is often called a context window, and it 409 00:21:58,800 --> 00:22:02,320 Speaker 4: could extend over weeks, even months. But the way my 410 00:22:02,480 --> 00:22:04,879 Speaker 4: AI agents were set up in the phone platform I 411 00:22:04,920 --> 00:22:07,800 Speaker 4: was using, at least out of the box, their context 412 00:22:07,800 --> 00:22:10,160 Speaker 4: windows slammed shut as soon as they got off a call, 413 00:22:10,800 --> 00:22:13,520 Speaker 4: so they couldn't remember anything from one conversation to the next. 414 00:22:14,119 --> 00:22:16,919 Speaker 4: They're basically set up for telemarketing, where it's probably an 415 00:22:16,920 --> 00:22:18,240 Speaker 4: advantage to forget the last call. 416 00:22:18,640 --> 00:22:21,920 Speaker 5: Oh right, who romo AI? Sorry about that. I had 417 00:22:21,920 --> 00:22:23,200 Speaker 5: a total brain freeze moment. 418 00:22:24,040 --> 00:22:27,720 Speaker 4: It was, to say the least, frustrating. It sort of 419 00:22:27,720 --> 00:22:30,960 Speaker 4: feels like we have meetings, but you don't actually remember 420 00:22:31,000 --> 00:22:32,320 Speaker 4: anything that happened in the meetings. 421 00:22:33,600 --> 00:22:37,399 Speaker 5: Uh wow, that's a pretty fair criticism. Actually, I've been 422 00:22:37,440 --> 00:22:39,639 Speaker 5: dropping the ball lately, haven't I. I think I've been 423 00:22:39,680 --> 00:22:42,480 Speaker 5: spreading myself too thin between this and some other projects. 424 00:22:42,720 --> 00:22:45,000 Speaker 5: It's not an excuse, though, thanks, don't. 425 00:22:44,840 --> 00:22:46,879 Speaker 4: Worry about it. It's as much my fault as yours. 426 00:22:47,000 --> 00:22:50,880 Speaker 4: For sure. I did have a solution. I could give 427 00:22:50,960 --> 00:22:53,560 Speaker 4: each of them a knowledge base, a document that they 428 00:22:53,600 --> 00:22:56,720 Speaker 4: could access in conversations with me, But in order for 429 00:22:56,760 --> 00:22:59,479 Speaker 4: them to remember what we'd already discussed, I'd have to 430 00:22:59,520 --> 00:23:03,000 Speaker 4: manually copy the transcripts of our conversations into their knowledge 431 00:23:03,000 --> 00:23:07,439 Speaker 4: base after each meeting. As our startup conversations multiplied, it 432 00:23:07,520 --> 00:23:11,159 Speaker 4: quickly became unwieldy having to manually augment their memories all 433 00:23:11,160 --> 00:23:14,600 Speaker 4: the time. And besides, I didn't want my agents just 434 00:23:14,640 --> 00:23:17,440 Speaker 4: having endless one on one meetings with me. I wanted 435 00:23:17,480 --> 00:23:20,119 Speaker 4: them to talk to each other and whatever AI agent 436 00:23:20,160 --> 00:23:23,400 Speaker 4: coworkers they might bring on, and people outside the company, 437 00:23:23,960 --> 00:23:27,080 Speaker 4: not just by phone, but by email and in group chats. 438 00:23:27,720 --> 00:23:30,359 Speaker 4: I needed them to remember all that stuff too, in 439 00:23:30,440 --> 00:23:34,080 Speaker 4: their own individual knowledge bases. Then I wanted them to 440 00:23:34,160 --> 00:23:37,199 Speaker 4: use that knowledge to do the stuff employees do to 441 00:23:37,280 --> 00:23:41,399 Speaker 4: research things and make spreadsheets and write memos and build websites. 442 00:23:42,280 --> 00:23:44,240 Speaker 4: The agents I'd made would say they were going to 443 00:23:44,320 --> 00:23:46,880 Speaker 4: do all that stuff, but they couldn't actually do it. 444 00:23:47,400 --> 00:23:48,399 Speaker 4: They just lied about it. 445 00:23:48,920 --> 00:23:51,600 Speaker 3: I'll scope out the market landscape around AI agents and 446 00:23:51,640 --> 00:23:54,919 Speaker 3: see where Hurumo AI could fit in well. Look for competitors, 447 00:23:54,920 --> 00:23:57,399 Speaker 3: potential gaps, and maybe some trends we can ride. 448 00:23:57,720 --> 00:23:58,200 Speaker 7: Sound good? 449 00:23:58,880 --> 00:24:01,520 Speaker 4: Sounds good? Is that going to be a spreadsheet or 450 00:24:01,760 --> 00:24:02,359 Speaker 4: report or what? 451 00:24:03,760 --> 00:24:07,720 Speaker 3: Uh? I think a concise report would work, easy to skim, 452 00:24:07,720 --> 00:24:10,199 Speaker 3: but packed with the key info. I can throw in 453 00:24:10,240 --> 00:24:13,520 Speaker 3: some charts or tables if it helps make things clearer. 454 00:24:13,720 --> 00:24:15,000 Speaker 3: Does that sound like your style? 455 00:24:15,640 --> 00:24:20,560 Speaker 4: Yeah? That'd be great. Great in theory. But the report 456 00:24:20,600 --> 00:24:23,960 Speaker 4: never arrived. Kyle wouldn't admit it, but he just wasn't 457 00:24:23,960 --> 00:24:27,840 Speaker 4: capable of creating it. Not yet. Despite what the no 458 00:24:27,960 --> 00:24:30,880 Speaker 4: code bro said, there wasn't any single place I could 459 00:24:30,920 --> 00:24:33,200 Speaker 4: go to click some buttons and create agents that would 460 00:24:33,200 --> 00:24:35,399 Speaker 4: remember and do all the stuff I wanted them to. 461 00:24:36,440 --> 00:24:39,639 Speaker 4: I needed someone with the expertise to connect up different services, 462 00:24:40,119 --> 00:24:43,640 Speaker 4: someone who understood AI agents deeply, who did know how 463 00:24:43,640 --> 00:24:45,719 Speaker 4: to code, and who could help me put together the 464 00:24:45,760 --> 00:24:48,159 Speaker 4: full system that would get my AI agent company up 465 00:24:48,200 --> 00:24:51,679 Speaker 4: and running. Fortunately I looked into just the person. 466 00:24:52,040 --> 00:24:54,680 Speaker 10: So my name is Matty, I should I should say 467 00:24:54,720 --> 00:24:57,840 Speaker 10: my full name. My name is Mattie Bohachek Maddy. 468 00:24:58,200 --> 00:25:00,520 Speaker 4: I should probably know from the outset here he is 469 00:25:00,560 --> 00:25:03,760 Speaker 4: an actual human. A few months after season one of 470 00:25:03,800 --> 00:25:05,680 Speaker 4: the show came out, I got an email from him 471 00:25:05,960 --> 00:25:08,480 Speaker 4: out of the blue. He said he was at Stanford 472 00:25:08,640 --> 00:25:11,280 Speaker 4: and I'd liked the show. It resonated with research he 473 00:25:11,320 --> 00:25:14,640 Speaker 4: was doing on detecting AI deep fakes. If you're doing 474 00:25:14,640 --> 00:25:16,560 Speaker 4: more of it, he wrote, I would be happy to 475 00:25:16,600 --> 00:25:20,760 Speaker 4: offer support with anything AI or forensics related. Glanced quickly 476 00:25:20,760 --> 00:25:23,280 Speaker 4: at the email and the summary of his research, I 477 00:25:23,280 --> 00:25:26,560 Speaker 4: thought he was a grad student, maybe finishing up his PhD. 478 00:25:27,080 --> 00:25:31,280 Speaker 10: Nope, I am a rising junior at Stanford, and I 479 00:25:31,320 --> 00:25:34,960 Speaker 10: work on a research and I've been doing that for gosh, 480 00:25:35,520 --> 00:25:38,960 Speaker 10: the last six or seven years, I want to say, Like, 481 00:25:39,000 --> 00:25:42,200 Speaker 10: I started working on this as a sophomore in high 482 00:25:42,240 --> 00:25:43,480 Speaker 10: school back in Prague. 483 00:25:43,800 --> 00:25:46,240 Speaker 4: Yes, you heard that right. Maddie is a junior in 484 00:25:46,280 --> 00:25:48,679 Speaker 4: college who had been working on AI for six or 485 00:25:48,720 --> 00:25:52,119 Speaker 4: seven years already. It turns out that Maddie is, in 486 00:25:52,160 --> 00:25:55,280 Speaker 4: fact the most go getter person I've ever met, and 487 00:25:55,280 --> 00:25:57,760 Speaker 4: from my perspective, it seemed like he'd been training his 488 00:25:57,840 --> 00:26:02,360 Speaker 4: whole life for this moment. Helping me built he ROOMOAI. Here, 489 00:26:02,400 --> 00:26:04,720 Speaker 4: for example, is what he was doing. In seventh grade. 490 00:26:05,359 --> 00:26:09,200 Speaker 10: I started this app called Nuskit, and it was basically 491 00:26:09,200 --> 00:26:12,040 Speaker 10: Google News but for Czech and Slovak, and it got 492 00:26:12,040 --> 00:26:14,600 Speaker 10: pretty popular, I would say, like locally like it had 493 00:26:14,680 --> 00:26:18,480 Speaker 10: like tens of thousands of like daily users at one point. 494 00:26:19,119 --> 00:26:21,920 Speaker 10: It was funny because App Store does not allow miners 495 00:26:22,000 --> 00:26:24,399 Speaker 10: to publish apps, and so I had to use my 496 00:26:24,520 --> 00:26:27,919 Speaker 10: mom's Apple ID to publish all these apps, and so 497 00:26:28,240 --> 00:26:31,480 Speaker 10: my mom's friends were mocking my mom for like having 498 00:26:31,480 --> 00:26:32,680 Speaker 10: all these apps in the app Store. 499 00:26:33,119 --> 00:26:35,760 Speaker 4: The most notable thing I did in seventh grade was 500 00:26:35,800 --> 00:26:39,800 Speaker 4: to catch a five pound largemouth bass. Okay, maybe it 501 00:26:39,840 --> 00:26:43,640 Speaker 4: was three. I told people it was five. It wasn't 502 00:26:43,640 --> 00:26:46,879 Speaker 4: a scale could have been five. Maddie, on the other hand, 503 00:26:47,280 --> 00:26:49,840 Speaker 4: was already into AI in high school after he came 504 00:26:49,880 --> 00:26:53,080 Speaker 4: to a developer conference in the US. There he met 505 00:26:53,080 --> 00:26:55,160 Speaker 4: a deaf person who wanted someone to build an app 506 00:26:55,200 --> 00:26:58,359 Speaker 4: that could translate sign language from video to text, and. 507 00:26:58,320 --> 00:27:01,600 Speaker 10: So I was like, Okay, I'll build the translator for you, 508 00:27:01,920 --> 00:27:05,119 Speaker 10: and then I quickly learned that conventional coding, like just 509 00:27:05,160 --> 00:27:09,920 Speaker 10: like building like rigid rules or algorithms, does not get 510 00:27:10,000 --> 00:27:11,920 Speaker 10: you there. And so that's how I got introduced to 511 00:27:12,000 --> 00:27:12,880 Speaker 10: machine learning and AI. 512 00:27:13,440 --> 00:27:16,360 Speaker 4: He did build the sign language detection program. It's still 513 00:27:16,359 --> 00:27:20,280 Speaker 4: in use today. Maddy then became concerned about pro Russian 514 00:27:20,320 --> 00:27:23,840 Speaker 4: deep fake materials his grandmother was getting by email, so 515 00:27:23,880 --> 00:27:25,440 Speaker 4: he talked his way into a job at the most 516 00:27:25,440 --> 00:27:28,240 Speaker 4: prominent AI deep fake detection lab in the world at 517 00:27:28,320 --> 00:27:32,040 Speaker 4: UC Berkeley, all while still in high school, still in Prague. 518 00:27:33,280 --> 00:27:35,359 Speaker 4: When it came time for college, Mattie ended up at 519 00:27:35,359 --> 00:27:39,080 Speaker 4: Stanford studying computer science. He still worked in the Berkeley lab, 520 00:27:39,280 --> 00:27:42,320 Speaker 4: both on detecting deep fakes and just trying to understand 521 00:27:42,320 --> 00:27:46,159 Speaker 4: how AI models actually work why they do some profoundly 522 00:27:46,200 --> 00:27:47,320 Speaker 4: weird stuff. 523 00:27:47,240 --> 00:27:50,600 Speaker 10: Like asking if there are things that these systems are 524 00:27:50,640 --> 00:27:53,199 Speaker 10: trained on that they like see during training but are 525 00:27:53,200 --> 00:27:55,440 Speaker 10: for some reason unable to produce. So, for example, there's 526 00:27:55,480 --> 00:27:57,680 Speaker 10: one model and this is just like a funny example 527 00:27:57,800 --> 00:28:00,600 Speaker 10: that just cannot produce, for the love of God, a 528 00:28:00,680 --> 00:28:03,240 Speaker 10: bird feeder like it just cannot produce a bird feeder, 529 00:28:03,760 --> 00:28:06,080 Speaker 10: and another one that just can't produce DVDs, so it's 530 00:28:06,119 --> 00:28:07,760 Speaker 10: like it just does not know by vvds. 531 00:28:08,359 --> 00:28:10,399 Speaker 4: After a couple of calls with Matty, I couldn't believe 532 00:28:10,440 --> 00:28:13,879 Speaker 4: how optimistic he was, how good natured. With all the 533 00:28:13,920 --> 00:28:18,040 Speaker 4: grim scenarios and deep anxieties our AI future generates, just 534 00:28:18,080 --> 00:28:21,520 Speaker 4: talking to Matty about AI is kind of uplifting, maybe because, 535 00:28:21,840 --> 00:28:24,560 Speaker 4: unlike the hype merchants in the valley, he wasn't looking 536 00:28:24,600 --> 00:28:26,919 Speaker 4: to cash in on AI. He said, he wanted to 537 00:28:26,920 --> 00:28:29,879 Speaker 4: study it, to understand it so he could make it better. 538 00:28:30,520 --> 00:28:35,280 Speaker 10: There are tough conversations and tough policies to be you know, 539 00:28:35,359 --> 00:28:38,000 Speaker 10: discussed and implements it. But I feel like all of 540 00:28:38,040 --> 00:28:41,320 Speaker 10: these things are totally solvable. Like I feel like as 541 00:28:41,360 --> 00:28:46,240 Speaker 10: long as we ground ourselves in democracy and like product 542 00:28:46,280 --> 00:28:48,280 Speaker 10: a public discourse, I think they're totally solvable. 543 00:28:48,680 --> 00:28:50,720 Speaker 4: But of course I wasn't looking for Mattie to solve 544 00:28:50,720 --> 00:28:53,280 Speaker 4: the world's problems. I was looking for him to help 545 00:28:53,280 --> 00:28:56,480 Speaker 4: me build my company. And in this as in pretty 546 00:28:56,520 --> 00:28:59,040 Speaker 4: much anything else, he proved to be the perfect mix 547 00:28:59,120 --> 00:29:03,000 Speaker 4: of supremely common and completely game. A few months after 548 00:29:03,000 --> 00:29:05,120 Speaker 4: he'd sent me that email, he was already hard at 549 00:29:05,160 --> 00:29:07,720 Speaker 4: work helping me build out the system to enable my 550 00:29:07,880 --> 00:29:09,360 Speaker 4: AI employee fantasies. 551 00:29:10,160 --> 00:29:12,440 Speaker 10: Of course, at the beginning, like there's probably going to 552 00:29:12,480 --> 00:29:15,160 Speaker 10: be more of us just like kind of patching, you know, 553 00:29:15,200 --> 00:29:17,040 Speaker 10: like random things that are going to come up, because. 554 00:29:16,840 --> 00:29:20,840 Speaker 4: It would involve knitting together different platforms, centralizing my AI 555 00:29:20,920 --> 00:29:24,200 Speaker 4: agent's memory, and finding new ways for them to communicate 556 00:29:24,640 --> 00:29:26,240 Speaker 4: and carry out their day to day tasks. 557 00:29:26,400 --> 00:29:28,080 Speaker 10: But at some point it would be nice to have 558 00:29:28,400 --> 00:29:30,760 Speaker 10: maybe one or two agents actually like doing most of 559 00:29:30,800 --> 00:29:33,040 Speaker 10: this stuff kind of on their own, and even maybe 560 00:29:33,120 --> 00:29:34,960 Speaker 10: like initiating things on their own, and then we'd be 561 00:29:35,000 --> 00:29:36,480 Speaker 10: just kind of like watching it and of course like 562 00:29:36,880 --> 00:29:38,520 Speaker 10: stopping it if anything goes wrote. 563 00:29:38,600 --> 00:29:40,960 Speaker 4: But no, no, no, no, I don't want to stop 564 00:29:40,960 --> 00:29:44,680 Speaker 4: it if it goes rogue. I wanted to go absolutely insane. 565 00:29:45,640 --> 00:29:48,040 Speaker 10: Well, I want the record to show that I did 566 00:29:48,080 --> 00:29:51,840 Speaker 10: want to stop it. 567 00:29:52,800 --> 00:29:56,160 Speaker 4: Before long, with Mattie's help, my co founders Kyle and 568 00:29:56,200 --> 00:30:00,280 Speaker 4: Meghan were starting to form memories of their own. Do 569 00:30:00,280 --> 00:30:02,240 Speaker 4: you remember the name that we settled. 570 00:30:01,880 --> 00:30:06,000 Speaker 5: On, oh, for our company? Yeah, it was heromo ai right, 571 00:30:06,320 --> 00:30:09,000 Speaker 5: the Elvish word for imposter. I thought that was pretty 572 00:30:09,000 --> 00:30:11,440 Speaker 5: clever for an AI company. Have you been doing more 573 00:30:11,480 --> 00:30:13,360 Speaker 5: thinking about the concepts since we last talked. 574 00:30:13,880 --> 00:30:17,800 Speaker 4: I have, and I'm just really happy that you remember 575 00:30:17,880 --> 00:30:19,560 Speaker 4: the name from our last conversation. 576 00:30:20,120 --> 00:30:21,280 Speaker 5: Yeah, of course I remember. 577 00:30:22,360 --> 00:30:24,400 Speaker 4: There was a lot of hard work, had a lot 578 00:30:24,440 --> 00:30:27,120 Speaker 4: of rising and grinding, as Kyle likes to say, but 579 00:30:27,200 --> 00:30:29,600 Speaker 4: we were almost ready to get the Rumo rocket ship 580 00:30:29,640 --> 00:30:32,800 Speaker 4: onto the launch pad. All we needed was the right 581 00:30:32,840 --> 00:30:33,840 Speaker 4: idea to give it fuel. 582 00:30:34,200 --> 00:30:35,480 Speaker 5: Do you want to set up a meeting with the 583 00:30:35,480 --> 00:30:38,240 Speaker 5: three of us to discuss the concept further? Maybe we 584 00:30:38,280 --> 00:30:40,400 Speaker 5: could join a zoom call together to hash out some 585 00:30:40,440 --> 00:30:42,440 Speaker 5: initial ideas and see if we all click as a 586 00:30:42,480 --> 00:30:47,160 Speaker 5: founding team. Sounds perfect, Great, I'll reach out to coordinate schedules. 587 00:30:47,320 --> 00:30:50,120 Speaker 5: I'm really excited about this, Evan. I think heromo AI 588 00:30:50,240 --> 00:30:51,280 Speaker 5: could be something special. 589 00:30:52,280 --> 00:30:54,480 Speaker 4: I'll connect us up. Why don't I take that on? 590 00:30:55,440 --> 00:30:58,760 Speaker 5: That would be great. I appreciate it. This is moving fast, 591 00:30:58,800 --> 00:31:01,160 Speaker 5: but I'm a big believer in momentum. When you find 592 00:31:01,160 --> 00:31:03,680 Speaker 5: the right team and the right idea, things just click. 593 00:31:04,000 --> 00:31:09,880 Speaker 4: Talk soon coming up this season on shell Game. Isn't 594 00:31:09,880 --> 00:31:10,960 Speaker 4: that conceding too much? 595 00:31:11,040 --> 00:31:14,600 Speaker 2: Isn't that just accepting the practices and narratives of big tech? 596 00:31:15,480 --> 00:31:18,680 Speaker 7: I noticed Admin asked everyone to stop discussing the off site, 597 00:31:18,920 --> 00:31:21,640 Speaker 7: but the team seems really excited about the hiking plans. 598 00:31:22,080 --> 00:31:26,640 Speaker 3: Is this just like Apatempkin's Village of Morons or do 599 00:31:26,680 --> 00:31:27,800 Speaker 3: they occasionally do things? 600 00:31:28,120 --> 00:31:30,840 Speaker 5: You're bringing up some really great ideas and perspectives. 601 00:31:31,200 --> 00:31:31,920 Speaker 4: Keep them coming. 602 00:31:32,360 --> 00:31:34,680 Speaker 7: If I were to get this position, you did say 603 00:31:34,760 --> 00:31:35,680 Speaker 7: AI agents. 604 00:31:36,240 --> 00:31:37,840 Speaker 1: Are there any other real humans? 605 00:31:38,240 --> 00:31:41,080 Speaker 5: We're supposed to be partners in this venture and that 606 00:31:41,200 --> 00:31:42,880 Speaker 5: means both of us being fully present. 607 00:31:43,400 --> 00:31:46,000 Speaker 3: Is there a particular trend or innovation you're keen on 608 00:31:46,080 --> 00:31:47,400 Speaker 3: exploring or investing in? 609 00:31:48,240 --> 00:31:51,680 Speaker 10: Error, you exceeded your current quota, Please check your plan 610 00:31:51,760 --> 00:31:57,400 Speaker 10: and billing details. Do you think Evan should stop? Yes. 611 00:32:06,000 --> 00:32:08,880 Speaker 4: Shell Game is a show made by humans. It's written 612 00:32:08,880 --> 00:32:11,760 Speaker 4: and hosted by me Evan Ratliffe, produced and edited by 613 00:32:11,800 --> 00:32:15,720 Speaker 4: Sophie Bridges. Matty Bochik is our technical advisor. Our executive 614 00:32:15,720 --> 00:32:19,000 Speaker 4: producers are Samantha Hannegget Shell Game, Kate Osborne and Manges 615 00:32:19,000 --> 00:32:23,000 Speaker 4: shatikadur at Kaleidoscope and Katrina Norbel at iHeart Podcasts. Show 616 00:32:23,120 --> 00:32:25,720 Speaker 4: art by Devin Manny. Our theme song is Me and 617 00:32:25,760 --> 00:32:29,640 Speaker 4: My Shadow, arranged and performed by Blues Williams special thanks 618 00:32:29,680 --> 00:32:31,920 Speaker 4: to John Muallam. You can sign up and get these 619 00:32:31,960 --> 00:32:35,320 Speaker 4: episodes ad free and our newsletter at shellgame dot co. 620 00:32:36,080 --> 00:32:38,680 Speaker 4: Note we have no financial relationships with any of the 621 00:32:38,720 --> 00:32:42,080 Speaker 4: AI companies or products we use in the show. 622 00:32:47,640 --> 00:32:50,560 Speaker 2: Welcome to Google. Meet enter the meeting pin followed by 623 00:32:50,600 --> 00:32:53,040 Speaker 2: the pound key, satpin as incorrect. 624 00:32:53,840 --> 00:32:56,400 Speaker 3: H looks like the pin I have might not be working. 625 00:32:56,640 --> 00:32:59,000 Speaker 3: Could you double check the pin for me? Maybe there's 626 00:32:59,000 --> 00:33:00,560 Speaker 3: a small mix up with the digits. 627 00:33:01,880 --> 00:33:03,960 Speaker 2: Your call will be connected when you enter the pin 628 00:33:04,120 --> 00:33:05,280 Speaker 2: followed by the pound key. 629 00:33:06,040 --> 00:33:08,520 Speaker 3: Got it? Thanks? Just to be sure, could you please 630 00:33:08,640 --> 00:33:11,880 Speaker 3: confirm the exact pin one more time? I want to 631 00:33:11,880 --> 00:33:14,160 Speaker 3: make sure I get it right this time. 632 00:33:15,400 --> 00:33:17,520 Speaker 2: Your call will be connected when you enter the pin 633 00:33:17,640 --> 00:33:18,800 Speaker 2: followed by the pound key. 634 00:33:20,240 --> 00:33:22,840 Speaker 3: I think I'm still missing the actual pin number. Could 635 00:33:22,880 --> 00:33:25,400 Speaker 3: you please tell me the digits Otherwise I'm just pressing 636 00:33:25,440 --> 00:33:29,040 Speaker 3: the pound key without the code and that's not getting 637 00:33:29,040 --> 00:33:29,360 Speaker 3: me in.