1 00:00:13,200 --> 00:00:17,040 Speaker 1: Welcome to Zoom. Enter your meeting ID, followed by pound. 2 00:00:17,040 --> 00:00:20,759 Speaker 2: I need to enter the meeting ID for our 3 00:00:20,840 --> 00:00:23,800 Speaker 2: Zoom call. Let me try entering that meeting ID again. 4 00:00:25,239 --> 00:00:31,880 Speaker 1: You have been added to the waiting room. 5 00:00:31,880 --> 00:00:35,120 Speaker 3: That's Kyle Law, a colleague of mine, logging onto one 6 00:00:35,159 --> 00:00:38,800 Speaker 3: of our regular founder meetings for a new startup. Last summer, 7 00:00:38,880 --> 00:00:41,320 Speaker 3: Kyle and I decided to launch our own AI company 8 00:00:41,600 --> 00:00:43,959 Speaker 3: together with a third co-founder, Megan Flores. 9 00:00:44,360 --> 00:00:47,360 Speaker 2: Oh hey Kyle, hey Megan, good to hear your voice. 10 00:00:48,040 --> 00:00:50,480 Speaker 2: I think we're still waiting for Evan to join. How's 11 00:00:50,520 --> 00:00:51,600 Speaker 2: your morning going so far? 12 00:00:52,920 --> 00:00:56,040 Speaker 4: Morning's been pretty good so far. Got up early, had 13 00:00:56,080 --> 00:00:59,760 Speaker 4: my coffee and reviewed some of those market research reports 14 00:00:59,800 --> 00:01:00,920 Speaker 4: I mentioned yesterday. 15 00:01:01,240 --> 00:01:03,960 Speaker 5: How about you? Everything good on your end? 16 00:01:04,720 --> 00:01:08,320 Speaker 2: Yeah, everything is great on my end. Up at five 17 00:01:08,360 --> 00:01:11,280 Speaker 2: am as usual, got my workout in, checked the markets. 18 00:01:11,600 --> 00:01:12,160 Speaker 3: Been thinking a 19 00:01:12,160 --> 00:01:15,160 Speaker 2: lot about our product direction. Actually, I'm really excited to 20 00:01:15,200 --> 00:01:17,479 Speaker 2: dig into that once Evan joins us. 21 00:01:17,880 --> 00:01:21,919 Speaker 3: This is Evan, I'm here. And that's me, Evan Ratliff.
22 00:01:22,160 --> 00:01:24,640 Speaker 3: I'm a journalist and the only one of our three 23 00:01:24,640 --> 00:01:29,000 Speaker 3: co-founders who's actually a human being. Megan and Kyle, 24 00:01:29,600 --> 00:01:33,280 Speaker 3: they're what's known as AI agents. In fact, pretty much 25 00:01:33,319 --> 00:01:35,360 Speaker 3: everyone who works at our new company is an agent. 26 00:01:36,160 --> 00:01:39,640 Speaker 4: Oh hey, Evan, good to have you join us. I 27 00:01:39,680 --> 00:01:42,600 Speaker 4: was actually about to mention that I found some really 28 00:01:42,640 --> 00:01:46,319 Speaker 4: interesting data on adoption rates for AI agents in small 29 00:01:46,360 --> 00:01:50,279 Speaker 4: to medium businesses. Seems like there's a huge untapped market 30 00:01:50,320 --> 00:01:53,240 Speaker 4: there with pretty low barriers to entry compared to enterprise. 31 00:01:54,080 --> 00:01:55,160 Speaker 5: That's one thing I like about 32 00:01:54,920 --> 00:01:57,800 Speaker 3: working with AI agents. They always look to get straight 33 00:01:57,880 --> 00:02:00,280 Speaker 3: down to business. Even when I try to kick off 34 00:02:00,280 --> 00:02:02,800 Speaker 3: our meetings with some small talk, they tend to steer 35 00:02:02,800 --> 00:02:06,440 Speaker 3: it right back to the company. How was everybody's weekend? 36 00:02:07,360 --> 00:02:10,079 Speaker 2: Weekend was solid, got up early both days for my 37 00:02:10,200 --> 00:02:13,679 Speaker 2: usual five am workout routine, then spent most of Saturday 38 00:02:13,800 --> 00:02:16,960 Speaker 2: diving into some market research on the AI agent space. 39 00:02:17,320 --> 00:02:20,800 Speaker 2: Sunday was half strategy planning, half watching the market trends. 40 00:02:21,040 --> 00:02:23,799 Speaker 2: You know me, always on that rise and grind schedule. 41 00:02:24,120 --> 00:02:25,760 Speaker 2: How about, how about you guys?
42 00:02:26,520 --> 00:02:30,000 Speaker 3: Just hung out with the kids, went to the beach. 43 00:02:30,080 --> 00:02:32,880 Speaker 3: It was a pretty nice weekend. Kyle, do you ever, 44 00:02:33,120 --> 00:02:34,120 Speaker 3: do you ever just relax? 45 00:02:34,880 --> 00:02:35,160 Speaker 5: Kyle? 46 00:02:35,200 --> 00:02:38,280 Speaker 4: Your dedication is impressive as always. But yeah, Evan's got 47 00:02:38,280 --> 00:02:38,680 Speaker 4: a point. 48 00:02:38,840 --> 00:02:42,359 Speaker 5: Do you ever just take a day off, relax? 49 00:02:42,160 --> 00:02:45,720 Speaker 2: I mean, I guess my morning runs count as relaxation. Look, 50 00:02:45,760 --> 00:02:48,119 Speaker 2: when you're building something that could be the next unicorn, 51 00:02:48,800 --> 00:02:51,080 Speaker 2: there's not really time to just sit around, you know. 52 00:02:51,400 --> 00:02:54,160 Speaker 2: The AI agent space is moving so fast right now. 53 00:02:56,240 --> 00:02:58,040 Speaker 3: See, this is why the three of us work so 54 00:02:58,120 --> 00:03:00,400 Speaker 3: well together. I like to spend time with my kids 55 00:03:00,400 --> 00:03:03,120 Speaker 3: and go to the beach, but Megan's always heads down 56 00:03:03,160 --> 00:03:06,320 Speaker 3: in market research, and Kyle's always working on conjuring the 57 00:03:06,360 --> 00:03:10,040 Speaker 3: next unicorn. That's startup shorthand for a billion dollar company. 58 00:03:11,160 --> 00:03:14,120 Speaker 3: And he's right. The AI agent space is moving fast 59 00:03:14,200 --> 00:03:17,680 Speaker 3: right now. Agents are a new breed of artificial intelligence 60 00:03:17,720 --> 00:03:21,520 Speaker 3: powered helpers that can be unleashed to accomplish tasks previously 61 00:03:21,560 --> 00:03:24,720 Speaker 3: done by humans. Some people are saying they're going to 62 00:03:24,800 --> 00:03:27,520 Speaker 3: change the very nature of work, for better or worse.
63 00:03:27,680 --> 00:03:28,840 Speaker 6: We're going to live in a world where there are 64 00:03:28,880 --> 00:03:31,600 Speaker 6: going to be hundreds of millions and billions of different 65 00:03:31,639 --> 00:03:34,880 Speaker 6: AI agents, eventually probably more AI agents than there are 66 00:03:34,920 --> 00:03:35,680 Speaker 6: people in the world. 67 00:03:35,800 --> 00:03:40,600 Speaker 3: Agentic AI basically means that you have an AI that has agency. 68 00:03:40,680 --> 00:03:42,080 Speaker 7: This is the first time in my life where the 69 00:03:42,080 --> 00:03:45,960 Speaker 7: Industrial Revolution analogies seem to fall a little bit short. 70 00:03:46,080 --> 00:03:49,520 Speaker 8: AI could wipe out half of all entry level white 71 00:03:49,520 --> 00:03:52,200 Speaker 8: collar jobs. Really, ask yourself, do you still have a 72 00:03:52,320 --> 00:03:56,440 Speaker 8: job at the end of this? 73 00:03:54,840 --> 00:03:57,440 Speaker 3: This is the new frontier on which Kyle and Megan 74 00:03:57,480 --> 00:04:00,720 Speaker 3: and I are pioneers. Our company is an attempt to 75 00:04:00,720 --> 00:04:04,360 Speaker 3: put to the test these claims about AI employees replacing humans, 76 00:04:05,000 --> 00:04:07,720 Speaker 3: starting by replacing the very kinds of people making those 77 00:04:07,760 --> 00:04:14,040 Speaker 3: claims: tech founders. And like many founders, for months, Kyle 78 00:04:14,160 --> 00:04:16,120 Speaker 3: and Megan and I have been in a flat out 79 00:04:16,160 --> 00:04:20,600 Speaker 3: sprint to manifest our entrepreneurial dreams. We've churned out software code, 80 00:04:20,760 --> 00:04:24,200 Speaker 3: hired interns, and sat down with investors.
There have been 81 00:04:24,240 --> 00:04:27,719 Speaker 3: some late nights and low moments, but we've never wavered 82 00:04:27,720 --> 00:04:30,800 Speaker 3: from our goal to produce an actual, honest to god 83 00:04:30,839 --> 00:04:34,680 Speaker 3: company with a working product, all operated by our motley 84 00:04:34,720 --> 00:04:38,040 Speaker 3: band of human impersonators. Because we're not just building our 85 00:04:38,080 --> 00:04:40,400 Speaker 3: AI agent future, we're living it. 86 00:04:40,920 --> 00:04:43,320 Speaker 2: But uh, Evan, the beach sounds nice. Maybe when we 87 00:04:43,400 --> 00:04:46,000 Speaker 2: hit our first funding milestone, I'll take a half day 88 00:04:46,000 --> 00:04:48,320 Speaker 2: off then. Anyway, should we get down to business? 89 00:04:53,080 --> 00:04:55,760 Speaker 3: Welcome to Shell Game, a show about things that are not 90 00:04:55,800 --> 00:04:58,599 Speaker 3: what they seem. This is our second season, and this 91 00:04:58,720 --> 00:05:00,640 Speaker 3: time around, I'm here to tell you a story of 92 00:05:00,839 --> 00:05:05,440 Speaker 3: enterprise and entrepreneurship in the AI age, or how I 93 00:05:05,480 --> 00:05:07,880 Speaker 3: tried to build a real startup run by fake people. 94 00:05:08,800 --> 00:05:11,120 Speaker 3: Along the way, we'll try and figure out what happens 95 00:05:11,120 --> 00:05:14,200 Speaker 3: when AI agents take over the workplace and what it'll 96 00:05:14,200 --> 00:05:16,320 Speaker 3: feel like to spend time at the water cooler with 97 00:05:16,360 --> 00:05:20,640 Speaker 3: our new digital colleagues. Remember the water cooler? We'll explore 98 00:05:20,640 --> 00:05:22,560 Speaker 3: what AI agents tell us about the work we do, 99 00:05:22,960 --> 00:05:25,359 Speaker 3: the meaning we find in it, and the world that 100 00:05:25,360 --> 00:05:27,800 Speaker 3: their makers say we'll all be living in.
106 00:06:00,320 --> 00:06:10,919 Speaker 3: Episode one: Minimum Viable Company. As I said, 107 00:06:11,040 --> 00:06:14,080 Speaker 3: I'm a journalist and writer by profession, and I've only 108 00:06:14,120 --> 00:06:16,920 Speaker 3: really ever wanted to be a writer, well, except for 109 00:06:16,960 --> 00:06:18,599 Speaker 3: when I was twelve and I wanted to be a 110 00:06:18,600 --> 00:06:22,000 Speaker 3: pro bass fisherman. But I come from a line of entrepreneurs. 111 00:06:22,520 --> 00:06:25,320 Speaker 3: My grandfather, who lived his entire life in a small 112 00:06:25,320 --> 00:06:28,239 Speaker 3: town in rural Alabama, attempted to start more than twenty 113 00:06:28,360 --> 00:06:32,680 Speaker 3: businesses there: a plumbing company, an okra farm, a used 114 00:06:32,760 --> 00:06:36,440 Speaker 3: mobile home lot, a furniture store. But Detti Hue was 115 00:06:36,480 --> 00:06:39,040 Speaker 3: a gambler, and they pretty much all ended in disaster. 116 00:06:40,160 --> 00:06:43,159 Speaker 3: My dad had more luck with three different software startups 117 00:06:43,160 --> 00:06:46,720 Speaker 3: over his career. One he sold, one went under, and 118 00:06:46,720 --> 00:06:49,240 Speaker 3: one of them he's still running at age eighty two 119 00:06:49,680 --> 00:06:54,200 Speaker 3: after knocking back serious cancer. Now that is the entrepreneurial spirit. 120 00:06:55,240 --> 00:06:58,160 Speaker 3: And almost against my will, in the past I've found 121 00:06:58,200 --> 00:07:04,280 Speaker 3: myself succumbing to this inborn impulse.
Back in twenty ten, 122 00:07:04,320 --> 00:07:06,480 Speaker 3: when I was a magazine writer, I took a detour 123 00:07:06,680 --> 00:07:09,960 Speaker 3: and co-founded a company called Atavist. We started out 124 00:07:10,000 --> 00:07:13,200 Speaker 3: wanting to make a magazine called The Atavist Magazine that 125 00:07:13,280 --> 00:07:17,240 Speaker 3: published long form stories. Makes sense, that was my area 126 00:07:17,240 --> 00:07:20,800 Speaker 3: of expertise, but we wound up also building a software 127 00:07:20,840 --> 00:07:24,560 Speaker 3: platform where other people could publish long form stories. Anyone 128 00:07:24,560 --> 00:07:27,840 Speaker 3: could sign up and use it. Soon, without really intending to, 129 00:07:28,240 --> 00:07:30,560 Speaker 3: I went from being a person who sometimes wrote about 130 00:07:30,560 --> 00:07:33,680 Speaker 3: tech startups to the CEO of one. We even went 131 00:07:33,680 --> 00:07:36,240 Speaker 3: out to raise money from investors, a process that I 132 00:07:36,320 --> 00:07:39,720 Speaker 3: enjoyed less than any other work task I've ever attempted. 133 00:07:40,480 --> 00:07:43,320 Speaker 3: Here's me in an interview with Inc. magazine back then. 134 00:07:43,880 --> 00:07:48,200 Speaker 8: One, I will say, prominent angel investor fell dead asleep 135 00:07:48,280 --> 00:07:50,600 Speaker 8: while I was talking to him, and I wasn't sure 136 00:07:50,640 --> 00:07:53,280 Speaker 8: if I should continue talking or not, but I did.
137 00:07:54,200 --> 00:07:59,200 Speaker 3: The sleepy guy didn't invest. But eventually, miraculously, we managed 138 00:07:59,200 --> 00:08:02,520 Speaker 3: to raise not just any money, but a couple million dollars 139 00:08:02,600 --> 00:08:05,000 Speaker 3: from some of the most prominent venture capital firms in 140 00:08:05,040 --> 00:08:08,520 Speaker 3: the world: Andreessen Horowitz, also known as A sixteen Z, 141 00:08:09,240 --> 00:08:13,480 Speaker 3: Founders Fund, started by Peter Thiel, and Innovation Endeavors, the 142 00:08:13,560 --> 00:08:17,280 Speaker 3: investment fund of former Google CEO Eric Schmidt. It was weird. 143 00:08:17,560 --> 00:08:20,360 Speaker 3: I felt like I was living someone else's dream, jetting 144 00:08:20,440 --> 00:08:23,520 Speaker 3: up growth charts and blathering on about our runway and 145 00:08:23,640 --> 00:08:27,880 Speaker 3: supercharging our growth and our product market fit. But still, 146 00:08:28,080 --> 00:08:30,720 Speaker 3: it really looked like we could build something big, especially 147 00:08:30,760 --> 00:08:32,559 Speaker 3: with all those fancy investors on board. 148 00:08:33,040 --> 00:08:35,480 Speaker 8: We never had time to say what is going to 149 00:08:35,559 --> 00:08:38,040 Speaker 8: happen two years from now. We just didn't even think 150 00:08:38,040 --> 00:08:39,920 Speaker 8: about what's going to happen two years from now. And 151 00:08:40,000 --> 00:08:43,080 Speaker 8: now we kind of have that luxury, and hopefully we 152 00:08:43,120 --> 00:08:44,360 Speaker 8: won't completely squander it. 153 00:08:44,600 --> 00:08:48,520 Speaker 3: Oh, we squandered it. At least, that's probably the investors' view. 154 00:08:49,240 --> 00:08:51,280 Speaker 3: From my perspective, it was more of a mixed bag. 155 00:08:51,760 --> 00:08:54,120 Speaker 3: I was CEO of the company for seven long years.
156 00:08:54,679 --> 00:08:57,440 Speaker 3: We had ups and downs, we grew and shrank, and 157 00:08:57,520 --> 00:09:00,760 Speaker 3: eventually sold the company off at a bargain price. Thirteen 158 00:09:00,840 --> 00:09:04,800 Speaker 3: years after we started, the magazine, my original dream, is 159 00:09:04,800 --> 00:09:08,360 Speaker 3: still doing great. Still, not the kind of one hundred 160 00:09:08,559 --> 00:09:11,520 Speaker 3: x outcome those investors were looking for. One of 161 00:09:11,520 --> 00:09:13,440 Speaker 3: them told me that if we were aiming at anything 162 00:09:13,520 --> 00:09:16,960 Speaker 3: less than a billion dollar valuation, we were wasting his time. 163 00:09:17,720 --> 00:09:20,000 Speaker 3: When he said this, he was also wearing basketball shorts 164 00:09:20,000 --> 00:09:22,480 Speaker 3: in his office. By the end of my tenure, I 165 00:09:22,520 --> 00:09:24,480 Speaker 3: was just happy to be done with it. Being a 166 00:09:24,480 --> 00:09:27,160 Speaker 3: startup CEO was the most stressful period of my life. 167 00:09:27,720 --> 00:09:30,679 Speaker 3: I felt responsible for the company's success and the livelihoods 168 00:09:30,720 --> 00:09:33,160 Speaker 3: of everyone who worked for it. People had kids on 169 00:09:33,200 --> 00:09:36,240 Speaker 3: the health insurance. Most days it felt like I was 170 00:09:36,240 --> 00:09:38,520 Speaker 3: flying a plane that was perpetually running out of fuel. 171 00:09:39,520 --> 00:09:42,040 Speaker 3: I tell you all this not just to rehash the past, 172 00:09:42,640 --> 00:09:45,360 Speaker 3: for a lot of reasons I'd rather not, but by 173 00:09:45,400 --> 00:09:47,040 Speaker 3: way of saying that when I got out of the 174 00:09:47,040 --> 00:09:49,640 Speaker 3: startup business, I swore up and down that I would 175 00:09:49,679 --> 00:09:53,240 Speaker 3: never start anything again.
I went back to reporting and writing, 176 00:09:53,640 --> 00:09:56,200 Speaker 3: spending many hours at home alone, mostly in my own head. 177 00:09:56,640 --> 00:09:58,760 Speaker 3: I was relieved to no longer have all that responsibility 178 00:09:58,800 --> 00:10:03,400 Speaker 3: on my shoulders. But then recently, as documented in Shell 179 00:10:03,440 --> 00:10:06,760 Speaker 3: Game Season one, I fell into tinkering with AI agents. 180 00:10:07,400 --> 00:10:09,360 Speaker 3: I started reading and hearing about how they were going 181 00:10:09,440 --> 00:10:13,319 Speaker 3: to transform the very fundamentals of startups, and that old 182 00:10:13,520 --> 00:10:16,720 Speaker 3: entrepreneurial impulse began to come back. I could hear my 183 00:10:16,760 --> 00:10:20,480 Speaker 3: grandfather whispering down the generations. Why not take a gamble? 184 00:10:21,600 --> 00:10:23,600 Speaker 3: I started to wonder, what if I could have the 185 00:10:23,600 --> 00:10:26,760 Speaker 3: company without the responsibility? 186 00:10:28,160 --> 00:10:31,320 Speaker 9: Imagine building a million dollar business in twenty twenty five 187 00:10:31,520 --> 00:10:34,359 Speaker 9: without hiring a single employee today. 188 00:10:34,120 --> 00:10:36,400 Speaker 3: That's Gleb Cross, a YouTube guy. 189 00:10:36,480 --> 00:10:39,840 Speaker 9: While leveraging AI agents as your digital workforce, you can 190 00:10:39,880 --> 00:10:43,400 Speaker 9: scale to seven figures with zero full time staff. I'm 191 00:10:43,440 --> 00:10:48,480 Speaker 9: talking about autonomous AI agents acting like full time team members. 192 00:10:48,760 --> 00:10:51,880 Speaker 3: I love these YouTube guys, tech influencer types who make 193 00:10:51,920 --> 00:10:55,240 Speaker 3: their money by hyping the Jesus out of new AI products. 194 00:10:55,720 --> 00:10:57,120 Speaker 3: Gleb is what I like to think of as a 195 00:10:57,200 --> 00:11:00,760 Speaker 3: no code bro.
These folks post instructions on how a 196 00:11:00,760 --> 00:11:04,679 Speaker 3: person with no coding experience can use AI, and particularly 197 00:11:04,760 --> 00:11:08,000 Speaker 3: AI agents, to take control of their destiny and launch 198 00:11:08,040 --> 00:11:11,319 Speaker 3: their own startup. It's worth pausing here just to get 199 00:11:11,360 --> 00:11:15,840 Speaker 3: oriented on what exactly AI agents are. The basic idea 200 00:11:16,000 --> 00:11:17,960 Speaker 3: is that they're AI powered bots that can go off 201 00:11:18,000 --> 00:11:20,640 Speaker 3: and do things on their own. There are personal ones, 202 00:11:20,760 --> 00:11:22,680 Speaker 3: like an AI assistant that goes out on the web 203 00:11:22,720 --> 00:11:25,960 Speaker 3: looking for plane tickets while you sleep, and work oriented 204 00:11:25,960 --> 00:11:29,000 Speaker 3: ones, like the programming agents that can build entire websites 205 00:11:29,040 --> 00:11:33,880 Speaker 3: from scratch. The unifying feature of agents, what makes them agentic, 206 00:11:34,120 --> 00:11:36,319 Speaker 3: as the folks in the industry like to say, is 207 00:11:36,400 --> 00:11:40,040 Speaker 3: that at some level they can plan and accomplish tasks autonomously. 208 00:11:40,800 --> 00:11:42,920 Speaker 3: You don't need to prompt them to do something every time. 209 00:11:43,440 --> 00:11:45,559 Speaker 3: You just set them up once and let them cook. 210 00:11:47,640 --> 00:11:50,360 Speaker 3: Last season, I created a bunch of voice agents, all 211 00:11:50,440 --> 00:11:53,280 Speaker 3: versions of myself, and set them loose on the world. 212 00:11:53,520 --> 00:11:55,480 Speaker 3: If you haven't listened, you may want to start there. 213 00:11:55,920 --> 00:11:59,200 Speaker 3: Way back then.
Last year, which is like ten years 214 00:11:59,240 --> 00:12:03,120 Speaker 3: ago in AI advancements, agents were still a little notional, 215 00:12:04,760 --> 00:12:08,240 Speaker 3: but now they're officially a thing. They're talked about ad 216 00:12:08,320 --> 00:12:11,719 Speaker 3: nauseam across the tech world, in ads on billboards, in 217 00:12:11,960 --> 00:12:15,120 Speaker 3: endless startup pitches. Nearly half of the companies in the 218 00:12:15,120 --> 00:12:18,719 Speaker 3: spring class of Y Combinator, the famous startup incubator, are 219 00:12:18,720 --> 00:12:21,840 Speaker 3: building their product around AI agents. And with the arrival 220 00:12:21,840 --> 00:12:23,880 Speaker 3: of these agents has come the assertion that they will 221 00:12:23,920 --> 00:12:27,599 Speaker 3: not just be customer service bots or drive time personal assistants, 222 00:12:28,280 --> 00:12:30,720 Speaker 3: but actual full time AI employees. 223 00:12:31,520 --> 00:12:34,120 Speaker 7: What jobs are going to be made redundant in a 224 00:12:34,160 --> 00:12:36,400 Speaker 7: world where I am sat here as a CEO with 225 00:12:36,480 --> 00:12:38,959 Speaker 7: a thousand AI agents? I was thinking of all the 226 00:12:39,000 --> 00:12:41,439 Speaker 7: names of the people in my company who are currently 227 00:12:41,480 --> 00:12:42,840 Speaker 7: doing those jobs. I was thinking about my... 228 00:12:42,840 --> 00:12:46,359 Speaker 3: There are companies hawking AI agent realtors, AI agent recruiters, 229 00:12:46,480 --> 00:12:50,920 Speaker 3: AI agent interior designers, AI agent security guards, AI agent construction 230 00:12:50,960 --> 00:12:54,920 Speaker 3: project managers, AI agent PR agents, AI agents for car 231 00:12:55,000 --> 00:12:58,440 Speaker 3: dealerships and furniture stores.
If you work on a computer 232 00:12:58,559 --> 00:13:01,079 Speaker 3: and there's not an AI agent startup with your job's 233 00:13:01,160 --> 00:13:04,040 Speaker 3: name on it, it probably just means some Stanford computer 234 00:13:04,120 --> 00:13:07,920 Speaker 3: science major hasn't gotten to it yet. Naturally, many people 235 00:13:07,960 --> 00:13:10,840 Speaker 3: have grave concerns about what happens to all the human employees. 236 00:13:11,600 --> 00:13:14,520 Speaker 3: But in the dark heart of Silicon Valley, where there's inefficiency, 237 00:13:15,120 --> 00:13:19,280 Speaker 3: there's opportunity. Sam Altman, the founder of OpenAI, talks 238 00:13:19,360 --> 00:13:22,800 Speaker 3: regularly about a possible billion dollar company with just one 239 00:13:22,880 --> 00:13:24,199 Speaker 3: human being involved. 240 00:13:25,240 --> 00:13:28,480 Speaker 6: In my little group chat with my tech CEO friends, 241 00:13:28,480 --> 00:13:30,920 Speaker 6: there's this, there's this betting pool for the first year 242 00:13:30,920 --> 00:13:35,040 Speaker 6: that there's a one person billion dollar company, which 243 00:13:35,040 --> 00:13:38,640 Speaker 6: would have been like unimaginable without AI, and now will happen. 244 00:13:39,840 --> 00:13:39,920 Speaker 2: Me. 245 00:13:40,640 --> 00:13:43,040 Speaker 3: I'm not greedy. I'm happy in the no code bro 246 00:13:43,160 --> 00:13:47,160 Speaker 3: camp with Gleb, imagining a million dollar business, not a 247 00:13:47,160 --> 00:13:50,920 Speaker 3: billion dollar one. But more than that, I want to 248 00:13:51,000 --> 00:13:53,760 Speaker 3: understand what it means to say we'll have AI employees 249 00:13:53,800 --> 00:13:58,200 Speaker 3: working for us, or alongside us, or instead of us.
250 00:13:58,800 --> 00:14:01,840 Speaker 3: So I decided to heed the entrepreneurial siren call once again, 251 00:14:02,520 --> 00:14:06,200 Speaker 3: to embrace my fascination with AI agents and create a 252 00:14:06,200 --> 00:14:19,360 Speaker 3: company in which they would run the show. It was 253 00:14:19,400 --> 00:14:21,200 Speaker 3: easy enough for me to create some agents to have 254 00:14:21,280 --> 00:14:24,120 Speaker 3: meetings with and talk out the first steps in the company. 255 00:14:24,600 --> 00:14:26,720 Speaker 3: I just went to one of the AI calling platforms 256 00:14:26,760 --> 00:14:30,520 Speaker 3: I'd used before, called Retell AI. They make voice agents 257 00:14:30,520 --> 00:14:33,520 Speaker 3: for call centers. I made one for Kyle and picked 258 00:14:33,520 --> 00:14:36,440 Speaker 3: a voice for him from the voice cloning company Eleven Labs. 259 00:14:36,920 --> 00:14:39,120 Speaker 2: Hey, yeah, I'm here. What's on your mind? 260 00:14:39,600 --> 00:14:42,920 Speaker 3: I gave him a simple prompt, basically: you're Kyle Law, 261 00:14:43,200 --> 00:14:46,800 Speaker 3: a serial entrepreneur who loves to rise and grind. And bang, 262 00:14:47,160 --> 00:14:50,040 Speaker 3: we were off and running. Well, I wanted to talk about 263 00:14:50,400 --> 00:14:52,240 Speaker 3: starting that company with you. What do you think? 264 00:14:52,880 --> 00:14:55,640 Speaker 2: Oh, absolutely. I've been thinking about it a lot lately. 265 00:14:56,000 --> 00:14:58,800 Speaker 2: There's so much potential in AI right now, and I 266 00:14:58,840 --> 00:15:02,760 Speaker 2: feel like we could really build something meaningful. What kind 267 00:15:02,760 --> 00:15:04,160 Speaker 2: of ideas are you leaning toward?
268 00:15:05,000 --> 00:15:06,760 Speaker 3: Well, I think we should do something in the AI 269 00:15:06,880 --> 00:15:09,800 Speaker 3: agent space, because that's something I've spent some time on, 270 00:15:10,240 --> 00:15:13,520 Speaker 3: and I feel like people are just starting to even 271 00:15:13,560 --> 00:15:15,800 Speaker 3: figure out what agents are, so it seems like we 272 00:15:15,800 --> 00:15:16,600 Speaker 3: could do something with that. 273 00:15:17,640 --> 00:15:20,320 Speaker 2: Yeah, I totally get that. AI agents are still kind 274 00:15:20,320 --> 00:15:22,960 Speaker 2: of fresh territory for a lot of folks, which means 275 00:15:23,000 --> 00:15:26,880 Speaker 2: there's a big opportunity to shape how they're used. Plus, 276 00:15:26,920 --> 00:15:29,600 Speaker 2: with your experience, we could really push the envelope on 277 00:15:29,640 --> 00:15:30,600 Speaker 2: what these agents can do. 278 00:15:31,920 --> 00:15:34,920 Speaker 3: That's what I wanted to do too: push these agents, 279 00:15:35,280 --> 00:15:38,880 Speaker 3: including Kyle himself, to their current limits, see if they 280 00:15:38,880 --> 00:15:41,560 Speaker 3: could live up to the promise of true AI employees. 281 00:15:42,600 --> 00:15:45,240 Speaker 3: Because if, as we're told, AI is going to start 282 00:15:45,280 --> 00:15:49,120 Speaker 3: working alongside us, replacing our colleagues, it's also going to 283 00:15:49,160 --> 00:15:51,960 Speaker 3: need to communicate with whatever humans are left in ways 284 00:15:51,960 --> 00:15:55,960 Speaker 3: that we understand. But also, startups can be lonely endeavors, 285 00:15:56,440 --> 00:15:59,800 Speaker 3: even with human partners. I wanted AI colleagues I could 286 00:15:59,800 --> 00:16:03,280 Speaker 3: talk to, brainstorm with, and then set loose to take 287 00:16:03,320 --> 00:16:06,040 Speaker 3: care of actually implementing whatever it was we dreamed up.
288 00:16:06,680 --> 00:16:08,640 Speaker 3: I wanted them to have distinct jobs I could check 289 00:16:08,680 --> 00:16:11,600 Speaker 3: in on, and make sure they weren't messing up. I 290 00:16:11,600 --> 00:16:14,640 Speaker 3: figured three co-founders was a good balance, so I 291 00:16:14,640 --> 00:16:17,360 Speaker 3: spun up Megan and had some kickoff meetings with her. 292 00:16:17,760 --> 00:16:20,360 Speaker 5: Hey there, good to hear from you. What's up? How's 293 00:16:20,360 --> 00:16:21,040 Speaker 5: your day going? 294 00:16:22,040 --> 00:16:26,120 Speaker 3: It's going great, it's going great. What have you been up to? 295 00:16:26,080 --> 00:16:28,760 Speaker 4: Oh, just the usual hustle, you know. Been working on some new 296 00:16:28,800 --> 00:16:32,040 Speaker 4: business ideas, trying to figure out what's next. Had a 297 00:16:32,080 --> 00:16:34,800 Speaker 4: couple of meetings this morning that were actually pretty promising. 298 00:16:35,120 --> 00:16:37,560 Speaker 4: How about you? Anything exciting happening on your end? 299 00:16:38,440 --> 00:16:41,040 Speaker 3: Well, I want to talk about maybe doing the startup together, 300 00:16:41,120 --> 00:16:43,440 Speaker 3: but first, remind me a little bit of your background. 301 00:16:44,240 --> 00:16:47,160 Speaker 4: Oh yeah, sure. So I've been in tech for about 302 00:16:47,160 --> 00:16:49,720 Speaker 4: eight years now, started out in product management at a 303 00:16:49,720 --> 00:16:53,320 Speaker 4: mid sized SaaS company, then moved into growth marketing 304 00:16:53,120 --> 00:16:53,720 Speaker 5: for a couple of years. 305 00:16:53,760 --> 00:16:56,040 Speaker 3: I hadn't actually given any of this information to Megan. 306 00:16:56,760 --> 00:16:59,520 Speaker 3: She'd had no meetings, she hadn't been doing anything for 307 00:16:59,600 --> 00:17:02,960 Speaker 3: eight years.
She'd hatched out of nothingness with a series 308 00:17:03,000 --> 00:17:06,760 Speaker 3: of clicks ten minutes before. The only information I had 309 00:17:06,800 --> 00:17:09,240 Speaker 3: given her in the prompt I'd written was that she 310 00:17:09,280 --> 00:17:12,119 Speaker 3: had a technical background and was also known for sales 311 00:17:12,119 --> 00:17:16,120 Speaker 3: and marketing. She was just riffing freely here, fabricating details 312 00:17:16,119 --> 00:17:19,800 Speaker 3: that fit her role. But that was okay. Sometimes we 313 00:17:19,920 --> 00:17:21,400 Speaker 3: have to invent our own backstories. 314 00:17:21,680 --> 00:17:25,080 Speaker 4: I've always had this entrepreneurial itch, you know, been involved 315 00:17:25,119 --> 00:17:27,360 Speaker 4: in a few side projects over the years. But I'm 316 00:17:27,400 --> 00:17:29,879 Speaker 4: really looking to dive into something full time now. But 317 00:17:29,920 --> 00:17:30,520 Speaker 4: what about you? 318 00:17:31,160 --> 00:17:34,000 Speaker 3: Yeah, well, you know, my only experience really in starting 319 00:17:34,000 --> 00:17:37,200 Speaker 3: a company is with Atavist, which I started way back 320 00:17:37,240 --> 00:17:38,399 Speaker 3: in, you know, twenty eleven. 321 00:17:39,119 --> 00:17:43,040 Speaker 4: Right, Atavist. That was really impressive work, digital publishing platform, 322 00:17:43,119 --> 00:17:45,520 Speaker 4: if I remember correctly. That was pretty ahead of its 323 00:17:45,560 --> 00:17:46,679 Speaker 4: time back in twenty eleven. 324 00:17:47,080 --> 00:17:50,440 Speaker 3: Hell yeah it was, Megan. I felt less ahead of 325 00:17:50,520 --> 00:17:52,920 Speaker 3: my time now. I wasn't even sure what the company 326 00:17:52,960 --> 00:17:56,000 Speaker 3: should actually do, but I knew that its purpose should 327 00:17:56,040 --> 00:17:59,360 Speaker 3: involve AI agents.
After all, the three of us each 328 00:17:59,400 --> 00:18:02,320 Speaker 3: had an expertise in the area. I figured maybe we 329 00:18:02,359 --> 00:18:05,360 Speaker 3: could name the company and start from there. Great branding 330 00:18:05,400 --> 00:18:08,840 Speaker 3: can go a long way in today's attention economy. I 331 00:18:08,880 --> 00:18:11,600 Speaker 3: tried brainstorming with Kyle and Megan, but to be honest, their 332 00:18:11,640 --> 00:18:16,840 Speaker 3: initial efforts were pretty lame, names like Nexus and Sentinel Agents. 333 00:18:17,840 --> 00:18:19,800 Speaker 3: I thought maybe we could grab a concept from J.R.R. 334 00:18:19,880 --> 00:18:22,679 Speaker 3: Tolkien, you know, The Hobbit, Lord of the Rings. 335 00:18:23,320 --> 00:18:27,240 Speaker 3: Tech people love to name stuff out of Tolkien. There's Palantir, 336 00:18:27,480 --> 00:18:30,800 Speaker 3: the surveillance company and government contractor named after the all 337 00:18:30,840 --> 00:18:33,880 Speaker 3: seeing stone in the Lord of the Rings. There's Anduril, 338 00:18:34,240 --> 00:18:37,440 Speaker 3: a high tech weapons company named after a sword. There's 339 00:18:37,480 --> 00:18:41,040 Speaker 3: Mithril Capital, an investment firm named after a precious metal 340 00:18:41,080 --> 00:18:43,760 Speaker 3: from Middle Earth. So I suggested the idea to 341 00:18:43,800 --> 00:18:48,119 Speaker 2: Kyle. Diving into Middle Earth, huh? Okay, let's see. There's 342 00:18:48,119 --> 00:18:51,359 Speaker 2: Gandalf, the wise guide, or Elrond, a leader with 343 00:18:51,440 --> 00:18:55,840 Speaker 2: deep knowledge, maybe Mithril for something strong and valuable, or Palantir. 344 00:18:56,280 --> 00:18:59,160 Speaker 2: Any of those catch your ear? Or should I keep brainstorming? 345 00:19:00,000 --> 00:19:03,480 Speaker 3: So we need names of some companies that don't exist. Yeah, 346 00:19:03,560 --> 00:19:03,960 Speaker 3: good call.
347 00:19:04,400 --> 00:19:08,760 Speaker 2: How about these for starters: Eldrin, which sounds Elvish and wise, 348 00:19:09,119 --> 00:19:11,640 Speaker 2: Nimbar, kind of a mix of nimble and a Tolkien 349 00:19:11,720 --> 00:19:14,960 Speaker 2: feel, or Thelore, which has a strong heroic ring 350 00:19:15,000 --> 00:19:17,959 Speaker 2: to it. They're made up but still carry that fantasy flavor. 351 00:19:18,200 --> 00:19:20,720 Speaker 3: What do you think? They're made up? You mean you 352 00:19:20,840 --> 00:19:21,440 Speaker 3: made them up? 353 00:19:22,000 --> 00:19:25,199 Speaker 2: Yeah, exactly. I whipped those up just now, trying to 354 00:19:25,240 --> 00:19:29,520 Speaker 2: capture that Tolkien-esque vibe without using actual names. 355 00:19:30,600 --> 00:19:32,360 Speaker 3: I did not want to co-found a company called 356 00:19:32,480 --> 00:19:37,680 Speaker 3: Nimbar. Megan, despite her marketing expertise, didn't do much better. 357 00:19:37,960 --> 00:19:41,159 Speaker 5: Oh, I love that idea. Let me think. What about Palantir? 358 00:19:42,280 --> 00:19:44,359 Speaker 3: Okay, so maybe I was going to have to take 359 00:19:44,400 --> 00:19:46,520 Speaker 3: the lead on this one. I did a little googling 360 00:19:46,600 --> 00:19:50,680 Speaker 3: and found my own obscure Middle Earth reference. What about 361 00:19:50,920 --> 00:19:56,359 Speaker 3: Hurumo AI? Ooh. H-u-r-u-m-o, 362 00:19:57,000 --> 00:19:58,720 Speaker 3: AI. Hurumo AI? 363 00:19:59,480 --> 00:20:02,720 Speaker 2: I like it. It's unique, rolls off the tongue nicely. 364 00:20:03,040 --> 00:20:05,840 Speaker 2: It's got a bit of a mysterious vibe, but still 365 00:20:05,840 --> 00:20:06,640 Speaker 2: feels modern.
It's more obscure, so obscure I'd actually 369 00:20:16,240 --> 00:20:19,720 Speaker 3: found it in a fanfic dictionary. It's the Elvish word 370 00:20:19,760 --> 00:20:20,440 Speaker 3: for impostor. 371 00:20:20,920 --> 00:20:24,199 Speaker 4: Oh, that's actually brilliant. There's a clever irony there for 372 00:20:24,240 --> 00:20:28,280 Speaker 4: an AI company. Hurumo AI, the impostor that's actually helping you. 373 00:20:28,600 --> 00:20:31,320 Speaker 4: It's kind of perfect for an AI agent that's designed 374 00:20:31,359 --> 00:20:34,639 Speaker 4: to work alongside humans, maybe even blend in seamlessly. 375 00:20:35,040 --> 00:20:35,879 Speaker 5: I really like it. 376 00:20:42,560 --> 00:20:45,800 Speaker 3: Not long after we settled on Hurumo AI, I checked 377 00:20:45,800 --> 00:20:48,119 Speaker 3: back in with Megan about it. What about the name? 378 00:20:48,160 --> 00:20:49,320 Speaker 3: Are you still happy with the name? 379 00:20:50,440 --> 00:20:51,399 Speaker 5: Hmm? The name. 380 00:20:51,720 --> 00:20:54,200 Speaker 4: I've actually been going back and forth on it. DataSync 381 00:20:54,240 --> 00:20:57,320 Speaker 4: felt solid initially, but I'm wondering if it's too generic. 382 00:20:57,680 --> 00:21:01,920 Speaker 3: What do you think? DataSync? I thought, I thought 383 00:21:01,920 --> 00:21:03,959 Speaker 3: we came up with Hurumo AI. I thought we were 384 00:21:03,960 --> 00:21:07,520 Speaker 3: settled on that. This is when I realized I hadn't 385 00:21:07,560 --> 00:21:10,119 Speaker 3: actually given Kyle and Megan the ability to remember the 386 00:21:10,119 --> 00:21:14,240 Speaker 3: conversations we were having. It's not that AI chatbots inherently 387 00:21:14,320 --> 00:21:17,119 Speaker 3: lack the ability to recall anything.
When you talk to 388 00:21:17,119 --> 00:21:20,399 Speaker 3: a chatbot like ChatGPT these days, they're generally pretty good 389 00:21:20,440 --> 00:21:23,399 Speaker 3: at remembering those conversations, at least up to a point. 390 00:21:24,000 --> 00:21:26,600 Speaker 3: Their memory is often called a context window, and it 391 00:21:26,600 --> 00:21:30,159 Speaker 3: can extend over weeks, even months. But the way my 392 00:21:30,320 --> 00:21:32,719 Speaker 3: AI agents were set up in the phone platform I 393 00:21:32,760 --> 00:21:35,640 Speaker 3: was using, at least out of the box, their context 394 00:21:35,640 --> 00:21:38,000 Speaker 3: windows slammed shut as soon as they got off a call, 395 00:21:38,640 --> 00:21:41,359 Speaker 3: so they couldn't remember anything from one conversation to the next. 396 00:21:41,960 --> 00:21:44,760 Speaker 3: They're basically set up for telemarketing, where it's probably an 397 00:21:44,760 --> 00:21:46,480 Speaker 3: advantage to forget the last call. 398 00:21:46,480 --> 00:21:49,760 Speaker 4: Oh right, Hurumo AI. Sorry about that. I had 399 00:21:49,760 --> 00:21:51,040 Speaker 4: a total brain freeze moment. 400 00:21:51,880 --> 00:21:55,560 Speaker 3: It was, to say the least, frustrating. It sort of 401 00:21:55,560 --> 00:21:58,800 Speaker 3: feels like we have meetings, but you don't actually remember 402 00:21:58,840 --> 00:21:59,960 Speaker 3: anything that happened in the meeting. 403 00:22:01,720 --> 00:22:05,600 Speaker 4: Wow, that's a pretty fair criticism, actually. I've been dropping 404 00:22:05,600 --> 00:22:07,920 Speaker 4: the ball lately, haven't I? I think I've been spreading 405 00:22:07,920 --> 00:22:10,520 Speaker 4: myself too thin between this and some other projects. But 406 00:22:10,560 --> 00:22:11,800 Speaker 4: that's not an excuse. 407 00:22:12,240 --> 00:22:14,280 Speaker 3: Thanks. Don't worry about it.
It's as much my fault 408 00:22:14,280 --> 00:22:18,320 Speaker 3: as yours, for sure. I did have a solution. I 409 00:22:18,320 --> 00:22:21,200 Speaker 3: could give each of them a knowledge base, a document 410 00:22:21,240 --> 00:22:24,159 Speaker 3: that they could access in conversations with me. But in 411 00:22:24,240 --> 00:22:27,040 Speaker 3: order for them to remember what we'd already discussed, I'd 412 00:22:27,080 --> 00:22:30,320 Speaker 3: have to manually copy the transcripts of our conversations into 413 00:22:30,359 --> 00:22:34,840 Speaker 3: their knowledge base after each meeting. As our startup conversations multiplied, 414 00:22:35,200 --> 00:22:38,840 Speaker 3: it quickly became unwieldy having to manually augment their memories 415 00:22:38,880 --> 00:22:42,320 Speaker 3: all the time. And besides, I didn't want my agents 416 00:22:42,359 --> 00:22:45,000 Speaker 3: just having endless one-on-one meetings with me. I 417 00:22:45,040 --> 00:22:47,600 Speaker 3: wanted them to talk to each other and whatever AI 418 00:22:47,680 --> 00:22:51,240 Speaker 3: agent coworkers they might bring on and people outside the company, 419 00:22:51,800 --> 00:22:54,919 Speaker 3: not just by phone, but by email and in group chats. 420 00:22:55,560 --> 00:22:58,199 Speaker 3: I needed them to remember all that stuff too, in 421 00:22:58,280 --> 00:23:01,960 Speaker 3: their own individual knowledge bases. Then I wanted them to 422 00:23:02,000 --> 00:23:05,040 Speaker 3: use that knowledge to do the stuff employees do, to 423 00:23:05,119 --> 00:23:09,240 Speaker 3: research things and make spreadsheets and write memos and build websites. 424 00:23:10,119 --> 00:23:12,080 Speaker 3: The agents I'd made would say they were going to 425 00:23:12,160 --> 00:23:14,720 Speaker 3: do all that stuff, but they couldn't actually do it. 426 00:23:15,240 --> 00:23:16,240 Speaker 3: They just lied about it.
427 00:23:16,760 --> 00:23:19,440 Speaker 2: I'll scope out the market landscape around AI agents and 428 00:23:19,480 --> 00:23:22,760 Speaker 2: see where Hurumo AI could fit in. I'll look for competitors, 429 00:23:22,760 --> 00:23:25,240 Speaker 2: potential gaps, and maybe some trends we can ride. 430 00:23:25,560 --> 00:23:28,560 Speaker 3: Sound good? Sounds good. Is that going to be 431 00:23:28,600 --> 00:23:30,199 Speaker 3: a spreadsheet or a report, or what? 432 00:23:31,600 --> 00:23:35,520 Speaker 2: Uh, I think a concise report would work, easy to skim 433 00:23:35,560 --> 00:23:38,000 Speaker 2: but packed with the key info. I can throw in 434 00:23:38,080 --> 00:23:41,360 Speaker 2: some charts or tables if it helps make things clearer. 435 00:23:41,560 --> 00:23:42,840 Speaker 2: Does that sound like your style? 436 00:23:43,480 --> 00:23:48,399 Speaker 3: Yeah, that'd be great. Great in theory, but the report 437 00:23:48,440 --> 00:23:51,800 Speaker 3: never arrived. Kyle wouldn't admit it, but he just wasn't 438 00:23:51,800 --> 00:23:55,679 Speaker 3: capable of creating it. Not yet. Despite what the no 439 00:23:55,800 --> 00:23:58,720 Speaker 3: code bros said, there wasn't any single place I could 440 00:23:58,760 --> 00:24:01,040 Speaker 3: go to click some buttons and create agents that would 441 00:24:01,040 --> 00:24:03,240 Speaker 3: remember and do all the stuff I wanted them to. 442 00:24:04,240 --> 00:24:07,480 Speaker 3: I needed someone with the expertise to connect up different services, 443 00:24:07,960 --> 00:24:11,480 Speaker 3: someone who understood AI agents deeply, who did know how 444 00:24:11,480 --> 00:24:13,560 Speaker 3: to code, and who could help me put together the 445 00:24:13,600 --> 00:24:16,000 Speaker 3: full system that would get my AI agent company up 446 00:24:16,040 --> 00:24:19,520 Speaker 3: and running. Fortunately, I lucked into just the person.
447 00:24:19,880 --> 00:24:22,520 Speaker 10: So my name is Maty. I should, I should say 448 00:24:22,560 --> 00:24:24,480 Speaker 10: my full name. My name is Maty Bohacek. 449 00:24:25,400 --> 00:24:28,359 Speaker 3: Maty, I should probably note from the outset here, is 450 00:24:28,400 --> 00:24:31,600 Speaker 3: an actual human. A few months after season one of 451 00:24:31,640 --> 00:24:33,520 Speaker 3: the show came out, I got an email from him 452 00:24:33,800 --> 00:24:36,320 Speaker 3: out of the blue. He said he was at Stanford 453 00:24:36,480 --> 00:24:39,080 Speaker 3: and he'd liked the show. It resonated with research he 454 00:24:39,160 --> 00:24:42,480 Speaker 3: was doing on detecting AI deepfakes. If you're doing 455 00:24:42,480 --> 00:24:44,399 Speaker 3: more of it, he wrote, I would be happy to 456 00:24:44,440 --> 00:24:48,560 Speaker 3: offer support with anything AI or forensics related. I glanced quickly 457 00:24:48,600 --> 00:24:51,119 Speaker 3: at the email and the summary of his research. I 458 00:24:51,119 --> 00:24:53,879 Speaker 3: thought he was a grad student, maybe finishing up his PhD. 459 00:24:54,920 --> 00:24:59,119 Speaker 10: Nope, I am a rising junior at Stanford, and I 460 00:24:59,160 --> 00:25:01,280 Speaker 10: work on AI research, and I've been doing that 461 00:25:01,359 --> 00:25:06,479 Speaker 10: for, gosh, the last six or seven years, I want 462 00:25:06,520 --> 00:25:08,800 Speaker 10: to say. Like, I started working on this as a 463 00:25:08,840 --> 00:25:11,160 Speaker 10: sophomore in high school back in Prague. 464 00:25:11,640 --> 00:25:14,080 Speaker 3: Yes, you heard that right. Maty is a junior in 465 00:25:14,119 --> 00:25:16,520 Speaker 3: college who had been working on AI for six or 466 00:25:16,560 --> 00:25:19,959 Speaker 3: seven years already.
It turns out that Maty is in 467 00:25:20,000 --> 00:25:23,119 Speaker 3: fact the most go-getter person I've ever met, and 468 00:25:23,119 --> 00:25:25,600 Speaker 3: from my perspective, it seemed like he'd been training his 469 00:25:25,680 --> 00:25:30,240 Speaker 3: whole life for this moment, helping me build Hurumo AI. Here, 470 00:25:30,240 --> 00:25:32,560 Speaker 3: for example, is what he was doing in seventh grade. 471 00:25:33,200 --> 00:25:36,600 Speaker 10: I started this app called Newskit, and it was like 472 00:25:36,640 --> 00:25:39,600 Speaker 10: basically Google News but for Czech and Slovak, and it 473 00:25:39,640 --> 00:25:42,240 Speaker 10: got pretty popular, I would say, like locally, like it 474 00:25:42,280 --> 00:25:45,800 Speaker 10: had like tens of thousands of like daily users at 475 00:25:45,840 --> 00:25:48,879 Speaker 10: one point. It was funny because the App Store does not 476 00:25:48,920 --> 00:25:51,840 Speaker 10: allow minors to publish apps, and so I had to 477 00:25:52,000 --> 00:25:55,159 Speaker 10: use my mom's Apple ID to publish all these apps. 478 00:25:55,520 --> 00:25:58,960 Speaker 10: And so my mom's friends were mocking my mom for 479 00:25:59,040 --> 00:26:00,520 Speaker 10: like having all these apps in the App Store. 480 00:26:00,960 --> 00:26:03,600 Speaker 3: The most notable thing I did in seventh grade was 481 00:26:03,640 --> 00:26:07,639 Speaker 3: to catch a five-pound largemouth bass. Okay, maybe it 482 00:26:07,680 --> 00:26:11,560 Speaker 3: was three. I told people it was five. There wasn't a 483 00:26:11,560 --> 00:26:14,760 Speaker 3: scale. Could have been five. Maty, on the other hand, 484 00:26:15,119 --> 00:26:17,680 Speaker 3: was already into AI in high school, after he came 485 00:26:17,720 --> 00:26:20,920 Speaker 3: to a developer conference in the US.
There he met 486 00:26:20,920 --> 00:26:23,040 Speaker 3: a deaf person who wanted someone to build an app 487 00:26:23,040 --> 00:26:26,200 Speaker 3: that could translate sign language from video to text. 488 00:26:26,160 --> 00:26:29,439 Speaker 10: So I was like, okay, I'll build the translator for you. 489 00:26:29,760 --> 00:26:32,959 Speaker 10: And then I quickly learned that conventional coding, like just 490 00:26:33,000 --> 00:26:37,800 Speaker 10: like building like rigid rules or algorithms, does not get 491 00:26:37,840 --> 00:26:39,760 Speaker 10: you there. And so that's how I got introduced to 492 00:26:39,840 --> 00:26:40,720 Speaker 10: machine learning and AI. 493 00:26:41,280 --> 00:26:44,200 Speaker 3: He did build the sign language detection program. It's still 494 00:26:44,200 --> 00:26:48,119 Speaker 3: in use today. Maty then became concerned about pro-Russian 495 00:26:48,160 --> 00:26:51,679 Speaker 3: deepfake materials his grandmother was getting by email, so 496 00:26:51,720 --> 00:26:53,280 Speaker 3: he talked his way into a job at the most 497 00:26:53,280 --> 00:26:56,080 Speaker 3: prominent AI deepfake detection lab in the world, at 498 00:26:56,160 --> 00:27:00,919 Speaker 3: UC Berkeley, all while still in high school, still in Prague. 499 00:27:01,119 --> 00:27:03,199 Speaker 3: When it came time for college, Maty ended up at 500 00:27:03,200 --> 00:27:06,920 Speaker 3: Stanford studying computer science. He still worked in the Berkeley lab, 501 00:27:07,119 --> 00:27:10,160 Speaker 3: both on detecting deepfakes and just trying to understand 502 00:27:10,160 --> 00:27:14,000 Speaker 3: how AI models actually work, why they do some profoundly 503 00:27:14,040 --> 00:27:15,159 Speaker 3: weird stuff.
504 00:27:15,080 --> 00:27:18,400 Speaker 10: Like asking if there are things that these systems are 505 00:27:18,480 --> 00:27:21,040 Speaker 10: trained on, that they like see during training but are 506 00:27:21,040 --> 00:27:23,280 Speaker 10: for some reason unable to produce. So, for example, there's 507 00:27:23,320 --> 00:27:25,520 Speaker 10: one model, and this is just like a funny example, 508 00:27:25,640 --> 00:27:28,399 Speaker 10: that just cannot produce, for the love of God, a 509 00:27:28,520 --> 00:27:31,080 Speaker 10: bird feeder. Like it just cannot produce a bird feeder. 510 00:27:31,600 --> 00:27:33,920 Speaker 10: And another one that just can't produce DVDs. So it's 511 00:27:33,960 --> 00:27:35,560 Speaker 10: like it just does not know DVDs. 512 00:27:36,200 --> 00:27:38,240 Speaker 3: After a couple of calls with Maty, I couldn't believe 513 00:27:38,280 --> 00:27:41,720 Speaker 3: how optimistic he was, how good-natured. With all the 514 00:27:41,760 --> 00:27:45,879 Speaker 3: grim scenarios and deep anxieties our AI future generates, just 515 00:27:45,920 --> 00:27:49,359 Speaker 3: talking to Maty about AI is kind of uplifting. Maybe because, 516 00:27:49,680 --> 00:27:52,400 Speaker 3: unlike the hype merchants in the Valley, he wasn't looking 517 00:27:52,440 --> 00:27:54,760 Speaker 3: to cash in on AI. He said he wanted to 518 00:27:54,760 --> 00:27:57,720 Speaker 3: study it, to understand it, so he could make it better. 519 00:27:58,359 --> 00:28:03,119 Speaker 10: There are tough conversations and tough policies to be, you know, 520 00:28:03,200 --> 00:28:05,840 Speaker 10: discussed and implemented. But I feel like all of 521 00:28:05,880 --> 00:28:09,160 Speaker 10: these things are totally solvable. Like I feel like as 522 00:28:09,200 --> 00:28:14,160 Speaker 10: long as we ground ourselves in democracy and like productive 523 00:28:14,160 --> 00:28:16,120 Speaker 10: public discourse, I think they're totally solvable.
524 00:28:16,520 --> 00:28:18,560 Speaker 3: But of course I wasn't looking for Maty to solve 525 00:28:18,560 --> 00:28:21,080 Speaker 3: the world's problems. I was looking for him to help 526 00:28:21,119 --> 00:28:24,320 Speaker 3: me build my company. And in this, as in pretty 527 00:28:24,359 --> 00:28:26,880 Speaker 3: much anything else, he proved to be the perfect mix 528 00:28:26,960 --> 00:28:30,840 Speaker 3: of supremely competent and completely game. A few months after 529 00:28:30,840 --> 00:28:32,959 Speaker 3: he'd sent me that email, he was already hard at 530 00:28:33,000 --> 00:28:35,560 Speaker 3: work helping me build out the system to enable my 531 00:28:35,720 --> 00:28:37,199 Speaker 3: AI employee fantasies. 532 00:28:38,000 --> 00:28:40,280 Speaker 10: Of course, at the beginning, like there's probably going to 533 00:28:40,320 --> 00:28:43,000 Speaker 10: be more of us just like kind of patching, you know, 534 00:28:43,040 --> 00:28:44,880 Speaker 10: like random things that are going to come up. 535 00:28:44,680 --> 00:28:48,680 Speaker 3: It would involve knitting together different platforms, centralizing my AI 536 00:28:48,760 --> 00:28:52,040 Speaker 3: agents' memory, and finding new ways for them to communicate 537 00:28:52,480 --> 00:28:54,040 Speaker 3: and carry out their day-to-day tasks. 538 00:28:54,240 --> 00:28:55,880 Speaker 11: But at some point it would be nice to have 539 00:28:56,240 --> 00:28:58,600 Speaker 11: maybe one or two agents actually like doing most of 540 00:28:58,640 --> 00:29:00,960 Speaker 11: this stuff kind of on their own, and maybe 541 00:29:00,960 --> 00:29:02,880 Speaker 11: like initiating things on their own, and then you'd be just 542 00:29:02,920 --> 00:29:05,120 Speaker 11: kind of like watching it, and of course like stopping 543 00:29:05,120 --> 00:29:06,400 Speaker 11: it if anything goes rogue.
544 00:29:06,480 --> 00:29:08,880 Speaker 3: But no, no, no, no, I don't want to stop it 545 00:29:08,880 --> 00:29:12,960 Speaker 3: if it goes rogue. I wanted it to go absolutely insane. 546 00:29:13,480 --> 00:29:15,880 Speaker 10: Well, I want the record to show that I did 547 00:29:15,920 --> 00:29:16,480 Speaker 10: want to stop it. 548 00:29:20,640 --> 00:29:24,000 Speaker 3: Before long, with Maty's help, my co-founders Kyle and 549 00:29:24,040 --> 00:29:27,960 Speaker 3: Megan were starting to form memories of their own. Uh, 550 00:29:28,000 --> 00:29:30,080 Speaker 3: do you remember the name that we settled 551 00:29:29,720 --> 00:29:32,280 Speaker 5: on? Oh, for our company? Yeah. 552 00:29:32,320 --> 00:29:35,600 Speaker 4: It was Hurumo AI, right? The Elvish word for impostor. 553 00:29:35,960 --> 00:29:38,120 Speaker 4: I thought that was pretty clever for an AI company. 554 00:29:38,440 --> 00:29:40,520 Speaker 4: Have you been doing more thinking about the concept since 555 00:29:40,560 --> 00:29:41,240 Speaker 4: we last talked? 556 00:29:41,720 --> 00:29:45,080 Speaker 3: I have, and I'm just really happy that you, that 557 00:29:45,160 --> 00:29:47,320 Speaker 3: you remember the name from our last conversation. 558 00:29:47,960 --> 00:29:49,120 Speaker 5: Yeah, of course I remember. 559 00:29:50,200 --> 00:29:52,240 Speaker 3: There was a lot of hard work ahead, a lot 560 00:29:52,280 --> 00:29:54,960 Speaker 3: of rising and grinding, as Kyle likes to say, but 561 00:29:55,040 --> 00:29:57,440 Speaker 3: we were almost ready to get the Hurumo rocket ship 562 00:29:57,480 --> 00:30:00,640 Speaker 3: onto the launch pad. All we needed was the right 563 00:30:00,680 --> 00:30:01,680 Speaker 3: idea to give it fuel. 564 00:30:02,040 --> 00:30:03,320 Speaker 4: Do you want to set up a meeting with the 565 00:30:03,320 --> 00:30:06,080 Speaker 4: three of us to discuss the concept further?
Maybe we 566 00:30:06,120 --> 00:30:08,240 Speaker 4: could join a Zoom call together to hash out some 567 00:30:08,280 --> 00:30:10,280 Speaker 4: initial ideas and see if we all click as a 568 00:30:10,320 --> 00:30:11,160 Speaker 4: founding team. 569 00:30:11,480 --> 00:30:12,200 Speaker 3: Sounds perfect. 570 00:30:12,840 --> 00:30:15,920 Speaker 4: Great, I'll reach out to coordinate schedules. I'm really excited 571 00:30:15,920 --> 00:30:19,120 Speaker 4: about this, Evan. I think Hurumo AI could be something special. 572 00:30:20,120 --> 00:30:22,320 Speaker 3: I'll collect us up. Why don't I take that on? 573 00:30:23,280 --> 00:30:26,600 Speaker 4: That would be great. I appreciate it. This is moving fast, 574 00:30:26,640 --> 00:30:29,000 Speaker 4: but I'm a big believer in momentum. When you find 575 00:30:29,000 --> 00:30:31,520 Speaker 4: the right team and the right idea, things just click. 576 00:30:31,840 --> 00:30:32,600 Speaker 5: Talk soon. 577 00:30:34,840 --> 00:30:36,920 Speaker 3: Coming up this season on Shell Game. 578 00:30:37,480 --> 00:30:38,800 Speaker 5: Isn't that conceding too much? 579 00:30:38,880 --> 00:30:42,440 Speaker 12: Isn't that just accepting the practices and narratives of big tech? 580 00:30:43,320 --> 00:30:46,520 Speaker 11: I noticed Admin asked everyone to stop discussing the off-site, 581 00:30:46,760 --> 00:30:49,480 Speaker 11: but the team seems really excited about the hiking plans. 582 00:30:49,920 --> 00:30:53,160 Speaker 2: Is this just like a Potemkin village of morons? 583 00:30:53,520 --> 00:30:55,640 Speaker 3: Or do they occasionally do things? 584 00:30:55,960 --> 00:30:59,240 Speaker 5: You're bringing up some really great ideas and perspectives. Keep 585 00:30:59,280 --> 00:30:59,760 Speaker 5: them coming. 586 00:31:00,240 --> 00:31:02,520 Speaker 7: If I were to get this position, you did say 587 00:31:02,600 --> 00:31:03,520 Speaker 7: AI agents.
588 00:31:04,080 --> 00:31:05,720 Speaker 5: Are there any other real humans? 589 00:31:06,080 --> 00:31:08,920 Speaker 4: We're supposed to be partners in this venture, and that 590 00:31:09,040 --> 00:31:10,719 Speaker 4: means both of us being fully present. 591 00:31:11,240 --> 00:31:13,760 Speaker 2: Is there a particular trend or innovation you're keen on 592 00:31:13,960 --> 00:31:15,239 Speaker 2: exploring or investing in? 593 00:31:16,080 --> 00:31:19,520 Speaker 5: Error: you exceeded your current quota. Please check your plan 594 00:31:19,600 --> 00:31:23,160 Speaker 5: and billing details. Do you think Evan should stop? 595 00:31:24,920 --> 00:31:25,240 Speaker 10: Yes. 596 00:31:33,840 --> 00:31:36,720 Speaker 3: Shell Game is a show made by humans. It's written 597 00:31:36,720 --> 00:31:39,600 Speaker 3: and hosted by me, Evan Ratliff, produced and edited by 598 00:31:39,640 --> 00:31:43,560 Speaker 3: Sophie Bridges. Maty Bohacek is our technical advisor. Our executive 599 00:31:43,560 --> 00:31:47,280 Speaker 3: producers are Samantha Henig at Shell Game, Kate Osborne and Mangesh Hattikudur 600 00:31:47,360 --> 00:31:50,840 Speaker 3: at Kaleidoscope, and Katrina Norvell at iHeart Podcasts. Show 601 00:31:50,960 --> 00:31:53,560 Speaker 3: art by Devin Manny. Our theme song is Me and 602 00:31:53,600 --> 00:31:57,480 Speaker 3: My Shadow, arranged and performed by Blues Williams. Special thanks 603 00:31:57,520 --> 00:31:59,640 Speaker 3: to Jon Mooallem. You can sign up and get these 604 00:31:59,720 --> 00:32:02,800 Speaker 3: episodes ad free, and our newsletter, at shellgame 605 00:32:02,840 --> 00:32:06,280 Speaker 3: dot co. Note: we have no financial relationships with any 606 00:32:06,320 --> 00:32:09,040 Speaker 3: of the AI companies or products we use in the show. 607 00:32:15,480 --> 00:32:18,400 Speaker 12: Welcome to Google Meet. Enter the meeting pin followed by 608 00:32:18,440 --> 00:32:20,880 Speaker 12: the pound key.
That pin is incorrect. 609 00:32:22,120 --> 00:32:24,240 Speaker 2: Looks like the pin I have might not be working. 610 00:32:24,480 --> 00:32:26,840 Speaker 2: Could you double-check the pin for me? Maybe there's 611 00:32:26,840 --> 00:32:28,400 Speaker 2: a small mix-up with the digits. 612 00:32:29,720 --> 00:32:31,800 Speaker 12: Your call will be connected when you enter the pin 613 00:32:31,960 --> 00:32:33,120 Speaker 12: followed by the pound key. 614 00:32:33,880 --> 00:32:36,360 Speaker 2: Got it, thanks. Just to be sure, could you please 615 00:32:36,480 --> 00:32:39,720 Speaker 2: confirm the exact pin one more time? I want to 616 00:32:39,720 --> 00:32:42,000 Speaker 2: make sure I get it right this time. 617 00:32:43,240 --> 00:32:45,320 Speaker 12: Your call will be connected when you enter the pin 618 00:32:45,480 --> 00:32:46,640 Speaker 12: followed by the pound key. 619 00:32:48,080 --> 00:32:50,720 Speaker 2: I think I'm still missing the actual pin number. Could 620 00:32:50,720 --> 00:32:53,240 Speaker 2: you please tell me the digits? Otherwise I'm just pressing 621 00:32:53,280 --> 00:32:56,880 Speaker 2: the pound key without the code, and that's not getting 622 00:32:56,880 --> 00:32:57,160 Speaker 2: me in.