1 00:00:11,680 --> 00:00:15,280 Speaker 1: Hi, it's Oz Woloshyn. Happy Black Friday. Instead of our 2 00:00:15,320 --> 00:00:18,680 Speaker 1: regular Week in Tech roundup, this Thanksgiving week, we thought 3 00:00:18,680 --> 00:00:21,680 Speaker 1: we'd share the first episode of a new show on 4 00:00:21,720 --> 00:00:26,279 Speaker 1: the Kaleidoscope Network. It's called Shell Game. Season one of 5 00:00:26,320 --> 00:00:28,880 Speaker 1: Shell Game was named one of the best podcasts of 6 00:00:29,000 --> 00:00:33,200 Speaker 1: last year by Apple, The Atlantic, The Guardian, Vulture, The Information, 7 00:00:33,600 --> 00:00:38,680 Speaker 1: and The Economist, and now season two is here. Exploring 8 00:00:39,080 --> 00:00:43,800 Speaker 1: entrepreneurship and fake people in the AI age, journalist and 9 00:00:43,880 --> 00:00:47,920 Speaker 1: host Evan Ratliff tries to build a real startup run 10 00:00:48,000 --> 00:00:51,400 Speaker 1: by fake people. It's a journey, and next week we'll 11 00:00:51,440 --> 00:00:54,160 Speaker 1: have a conversation with Evan Ratliff about the show and 12 00:00:54,240 --> 00:00:57,440 Speaker 1: about his experiment. You may remember Evan from a previous 13 00:00:57,480 --> 00:00:59,280 Speaker 1: interview on the show when we spoke to him about 14 00:00:59,280 --> 00:01:04,320 Speaker 1: his reporting on the Silicon Valley cult the Zizians. In preparation 15 00:01:04,400 --> 00:01:06,919 Speaker 1: for next week's conversation with Evan, I hope you'll listen 16 00:01:07,040 --> 00:01:09,720 Speaker 1: to the first episode of season two of Shell Game. 17 00:01:11,720 --> 00:01:15,520 Speaker 2: Welcome to Zoom. Enter your meeting ID, followed by pound. 18 00:01:15,560 --> 00:01:19,320 Speaker 3: I need to enter the meeting ID for our 19 00:01:19,360 --> 00:01:22,360 Speaker 3: Zoom call. Let me try entering that meeting ID again. 20 00:01:23,760 --> 00:01:30,400 Speaker 2: You have been added to the waiting room.
21 00:01:30,440 --> 00:01:33,480 Speaker 4: That's Kyle Law, a colleague of mine, logging on to 22 00:01:33,560 --> 00:01:36,120 Speaker 4: one of our regular founder meetings for a new startup. 23 00:01:36,800 --> 00:01:38,960 Speaker 4: Last summer, Kyle and I decided to launch our own 24 00:01:39,040 --> 00:01:42,520 Speaker 4: AI company, together with a third co founder, Megan Flores. 25 00:01:42,920 --> 00:01:46,000 Speaker 3: Oh hey Kyle, Hey Megan, good to hear your voice. 26 00:01:46,560 --> 00:01:49,040 Speaker 3: I think we're still waiting for Evan to join. How's 27 00:01:49,040 --> 00:01:50,120 Speaker 3: your morning going so far? 28 00:01:51,440 --> 00:01:54,560 Speaker 5: Morning's been pretty good so far. Got up early, had 29 00:01:54,600 --> 00:01:58,320 Speaker 5: my coffee, and reviewed some of those market research reports 30 00:01:58,360 --> 00:02:01,400 Speaker 5: I mentioned yesterday. How about you? Everything good on your end? 31 00:02:01,320 --> 00:02:06,480 Speaker 3: Yeah, everything is great on my end. Up 32 00:02:06,480 --> 00:02:09,200 Speaker 3: at five am as usual, got my workout in, checked 33 00:02:09,200 --> 00:02:12,440 Speaker 3: the markets. Been thinking a lot about our product direction. Actually, 34 00:02:12,720 --> 00:02:16,000 Speaker 3: I'm really excited to dig into that once Evan joins us. 35 00:02:16,440 --> 00:02:20,440 Speaker 4: This is Evan, I'm here. And that's me, Evan Ratliff. 36 00:02:20,680 --> 00:02:23,160 Speaker 4: I'm a journalist and the only one of our three 37 00:02:23,200 --> 00:02:27,520 Speaker 4: co founders who's actually a human being. Megan and Kyle, 38 00:02:28,160 --> 00:02:31,840 Speaker 4: they're what's known as AI agents. In fact, pretty much 39 00:02:31,840 --> 00:02:33,880 Speaker 4: everyone who works at our new company is an agent. 40 00:02:34,680 --> 00:02:38,160 Speaker 5: Oh hey, Evan, good to have you join us.
I 41 00:02:38,240 --> 00:02:41,160 Speaker 5: was actually about to mention that I found some really 42 00:02:41,160 --> 00:02:44,880 Speaker 5: interesting data on adoption rates for AI agents in small 43 00:02:44,919 --> 00:02:48,840 Speaker 5: to medium businesses. Seems like there's a huge untapped market 44 00:02:48,840 --> 00:02:51,760 Speaker 5: there with pretty low barriers to entry compared to enterprise 45 00:02:52,000 --> 00:02:52,200 Speaker 5: they have. 46 00:02:52,639 --> 00:02:54,600 Speaker 4: That's one thing I like about working with AI agents. 47 00:02:55,080 --> 00:02:57,800 Speaker 4: They always look to get straight down to business. Even 48 00:02:57,840 --> 00:02:59,480 Speaker 4: when I try to kick off our meetings with some 49 00:02:59,520 --> 00:03:02,080 Speaker 4: small talk, they tend to steer it right back to 50 00:03:02,120 --> 00:03:05,000 Speaker 4: the company. How was everybody's weekend? 51 00:03:05,919 --> 00:03:08,640 Speaker 3: Weekend was solid. Got up early both days for my 52 00:03:08,720 --> 00:03:12,240 Speaker 3: usual five am workout routine, then spent most of Saturday 53 00:03:12,320 --> 00:03:15,480 Speaker 3: diving into some market research on the AI agent space. 54 00:03:15,840 --> 00:03:19,320 Speaker 3: Sunday was half strategy planning and half watching the market trends. 55 00:03:19,560 --> 00:03:22,320 Speaker 6: You know me, always on that rise and grind schedule. 56 00:03:22,639 --> 00:03:23,480 Speaker 6: How about you guys? 57 00:03:23,360 --> 00:03:28,000 Speaker 4: Just hung out with the kids, went 58 00:03:28,040 --> 00:03:30,320 Speaker 4: to the beach. It was a pretty nice weekend. 59 00:03:30,840 --> 00:03:31,120 Speaker 7: Kyle. 60 00:03:31,160 --> 00:03:32,640 Speaker 6: Do you ever just relax? 61 00:03:33,440 --> 00:03:33,720 Speaker 8: Kyle?
62 00:03:33,760 --> 00:03:36,800 Speaker 5: Your dedication is impressive as always. But yeah, Evan's got 63 00:03:36,800 --> 00:03:37,240 Speaker 5: a point. 64 00:03:37,360 --> 00:03:40,440 Speaker 8: Do you ever just take a day off, relax? 65 00:03:40,680 --> 00:03:44,240 Speaker 3: I mean, I guess my morning runs count as relaxation. Look, 66 00:03:44,280 --> 00:03:46,680 Speaker 3: when you're building something that could be the next unicorn, 67 00:03:47,320 --> 00:03:49,600 Speaker 3: there's not really time to just sit around, you know. 68 00:03:49,960 --> 00:03:52,680 Speaker 3: The AI agent space is moving so fast right now. 69 00:03:54,800 --> 00:03:56,560 Speaker 4: See, this is why the three of us work so 70 00:03:56,640 --> 00:03:58,920 Speaker 4: well together. I like to spend time with my kids 71 00:03:58,960 --> 00:04:01,680 Speaker 4: and go to the beach, but Megan's always heads down 72 00:04:01,680 --> 00:04:04,840 Speaker 4: in market research, and Kyle's always working on conjuring the 73 00:04:04,880 --> 00:04:08,560 Speaker 4: next unicorn. That's startup shorthand for a billion dollar company. 74 00:04:09,680 --> 00:04:12,680 Speaker 4: And he's right. The AI agent space is moving fast 75 00:04:12,760 --> 00:04:16,200 Speaker 4: right now. Agents are a new breed of artificial intelligence 76 00:04:16,240 --> 00:04:20,080 Speaker 4: powered helpers that can be unleashed to accomplish tasks previously 77 00:04:20,080 --> 00:04:23,279 Speaker 4: done by humans. Some people are saying they're going to 78 00:04:23,320 --> 00:04:26,040 Speaker 4: change the very nature of work, for better or worse.
79 00:04:26,200 --> 00:04:27,400 Speaker 9: We're going to live in a world where there are 80 00:04:27,400 --> 00:04:30,159 Speaker 9: going to be hundreds of millions and billions of different 81 00:04:30,160 --> 00:04:33,440 Speaker 9: AI agents, eventually probably more AI agents than there are 82 00:04:33,480 --> 00:04:34,240 Speaker 9: people in the world. 83 00:04:34,360 --> 00:04:39,120 Speaker 4: Agentic AI basically means that you have an AI that has agency. 84 00:04:39,240 --> 00:04:40,599 Speaker 9: This is the first time in my life where the 85 00:04:40,640 --> 00:04:44,479 Speaker 9: Industrial Revolution analogies seem to fall a little bit short. 86 00:04:44,640 --> 00:04:48,039 Speaker 1: AI could wipe out half of all entry level white 87 00:04:48,040 --> 00:04:48,880 Speaker 1: collar jobs. 88 00:04:49,000 --> 00:04:51,400 Speaker 10: Really, ask yourself, do you still have a job at 89 00:04:51,400 --> 00:04:54,920 Speaker 10: the end of this? 90 00:04:53,360 --> 00:04:56,000 Speaker 4: This is the new frontier on which Kyle and Megan 91 00:04:56,040 --> 00:04:59,240 Speaker 4: and I are pioneers. Our company is an attempt to 92 00:04:59,240 --> 00:05:02,880 Speaker 4: put to the test these claims about AI employees replacing humans, 93 00:05:03,560 --> 00:05:06,279 Speaker 4: starting by replacing the very kinds of people making those 94 00:05:06,279 --> 00:05:12,599 Speaker 4: claims: tech founders. And like many founders, for months, Kyle 95 00:05:12,680 --> 00:05:14,640 Speaker 4: and Megan and I have been in a flat out 96 00:05:14,680 --> 00:05:19,080 Speaker 4: sprint to manifest our entrepreneurial dreams. We've turned out software code, 97 00:05:19,320 --> 00:05:22,760 Speaker 4: hired interns, and sat down with investors.
There have been 98 00:05:22,760 --> 00:05:26,240 Speaker 4: some late nights and low moments, but we've never wavered 99 00:05:26,279 --> 00:05:29,360 Speaker 4: from our goal to produce an actual, honest to god 100 00:05:29,400 --> 00:05:33,240 Speaker 4: company with a working product, all operated by our motley 101 00:05:33,240 --> 00:05:36,600 Speaker 4: band of human impersonators. Because we're not just building our 102 00:05:36,600 --> 00:05:38,919 Speaker 4: AI agent future, we're living it. 103 00:05:39,440 --> 00:05:41,280 Speaker 6: But uh, Evan, the beach sounds nice. 104 00:05:41,360 --> 00:05:44,039 Speaker 3: Maybe when we hit our first funding milestone, I'll take 105 00:05:44,040 --> 00:05:44,960 Speaker 3: a half day off then. 106 00:05:44,960 --> 00:05:46,520 Speaker 6: Anyway, should we get down to business? 107 00:05:51,640 --> 00:05:54,120 Speaker 4: Welcome to Shell Game, a show about things that are 108 00:05:54,200 --> 00:05:57,000 Speaker 4: not what they seem. This is our second season, and 109 00:05:57,040 --> 00:05:59,000 Speaker 4: this time around, I'm here to tell you a story 110 00:05:59,080 --> 00:06:03,880 Speaker 4: of enterprise and entrepreneurship in the AI age, or how 111 00:06:03,920 --> 00:06:06,400 Speaker 4: I tried to build a real startup run by fake people. 112 00:06:07,320 --> 00:06:09,640 Speaker 4: Along the way, we'll try and figure out what happens 113 00:06:09,680 --> 00:06:12,760 Speaker 4: when AI agents take over the workplace, and what it'll 114 00:06:12,760 --> 00:06:14,839 Speaker 4: feel like to spend time at the water cooler with 115 00:06:14,880 --> 00:06:19,159 Speaker 4: our new digital colleagues. Remember the water cooler? We'll explore 116 00:06:19,200 --> 00:06:21,080 Speaker 4: what AI agents tell us about the work we do, 117 00:06:21,480 --> 00:06:23,880 Speaker 4: the meaning we find in it, and the world that 118 00:06:23,920 --> 00:06:27,240 Speaker 4: their makers say we'll all be living in.
124 00:07:03,560 --> 00:07:09,800 Speaker 4: Episode one, Minimum Viable Company. As I said, I'm a 125 00:07:09,880 --> 00:07:13,080 Speaker 4: journalist and writer by profession, and I've only really ever 126 00:07:13,200 --> 00:07:15,720 Speaker 4: wanted to be a writer, well, except for when I 127 00:07:15,760 --> 00:07:18,120 Speaker 4: was twelve and I wanted to be a pro bass fisherman. 128 00:07:18,840 --> 00:07:21,760 Speaker 4: But I come from a line of entrepreneurs. My grandfather, 129 00:07:21,960 --> 00:07:24,200 Speaker 4: who lived his entire life in a small town in 130 00:07:24,280 --> 00:07:27,640 Speaker 4: rural Alabama, attempted to start more than twenty businesses there: 131 00:07:28,200 --> 00:07:32,200 Speaker 4: a plumbing company, an okra farm, a used mobile home lot, 132 00:07:32,480 --> 00:07:35,880 Speaker 4: a furniture store. But Detta Hue was a gambler and 133 00:07:35,920 --> 00:07:39,240 Speaker 4: they pretty much all ended in disaster. My dad had 134 00:07:39,280 --> 00:07:42,360 Speaker 4: more luck with three different software startups over his career. 135 00:07:42,840 --> 00:07:45,640 Speaker 4: One he sold, one went under, and one of them 136 00:07:45,680 --> 00:07:49,000 Speaker 4: he's still running at age eighty two after knocking back 137 00:07:49,120 --> 00:07:54,400 Speaker 4: serious cancer. Now that is the entrepreneurial spirit. And almost 138 00:07:54,400 --> 00:07:57,600 Speaker 4: against my will, in the past, I've found myself succumbing 139 00:07:57,640 --> 00:08:03,080 Speaker 4: to this inborn impulse.
Back in twenty ten, when I 140 00:08:03,120 --> 00:08:05,480 Speaker 4: was a magazine writer, I took a detour and co 141 00:08:05,560 --> 00:08:08,800 Speaker 4: founded a company called Atavist. We started out wanting to 142 00:08:08,840 --> 00:08:12,440 Speaker 4: make a magazine called The Atavist Magazine that published long 143 00:08:12,440 --> 00:08:16,480 Speaker 4: form stories. Makes sense, that was my area of expertise, 144 00:08:17,280 --> 00:08:20,400 Speaker 4: but we wound up also building a software platform where 145 00:08:20,440 --> 00:08:23,520 Speaker 4: other people could publish long form stories. Anyone could sign 146 00:08:23,560 --> 00:08:26,840 Speaker 4: up and use it. Soon, without really intending to, I 147 00:08:26,880 --> 00:08:29,280 Speaker 4: went from being a person who sometimes wrote about tech 148 00:08:29,320 --> 00:08:32,360 Speaker 4: startups to the CEO of one. We even went out 149 00:08:32,400 --> 00:08:35,240 Speaker 4: to raise money from investors, a process that I enjoyed 150 00:08:35,360 --> 00:08:39,280 Speaker 4: less than any other work task I've ever attempted. Here's 151 00:08:39,320 --> 00:08:41,880 Speaker 4: me in an interview with Inc. magazine back then. 152 00:08:42,400 --> 00:08:46,720 Speaker 10: One, I will say, prominent angel investor fell dead asleep 153 00:08:46,800 --> 00:08:49,160 Speaker 10: while I was talking to him, and I wasn't sure 154 00:08:49,160 --> 00:08:50,679 Speaker 10: if I should continue talking 155 00:08:50,480 --> 00:08:54,600 Speaker 4: or not, but I did.
The sleepy guy didn't invest, 156 00:08:55,120 --> 00:08:59,440 Speaker 4: but eventually, miraculously, we managed to raise not just any money, 157 00:08:59,679 --> 00:09:01,840 Speaker 4: but a couple million dollars from some of the most 158 00:09:01,880 --> 00:09:05,600 Speaker 4: prominent venture capital firms in the world: Andreessen Horowitz, 159 00:09:05,720 --> 00:09:09,320 Speaker 4: also known as a sixteen z, Founders Fund, started by 160 00:09:09,360 --> 00:09:13,320 Speaker 4: Peter Thiel, and Innovation Endeavors, the investment fund for former 161 00:09:13,360 --> 00:09:16,520 Speaker 4: Google CEO Eric Schmidt. It was weird. I felt like 162 00:09:16,520 --> 00:09:19,760 Speaker 4: I was living someone else's dream, jetting up growth charts 163 00:09:19,800 --> 00:09:23,600 Speaker 4: and blathering on about our runway and supercharging our growth 164 00:09:23,840 --> 00:09:27,160 Speaker 4: and our product market fit. But still, it really looked 165 00:09:27,160 --> 00:09:29,760 Speaker 4: like we could build something big, especially with all those 166 00:09:29,800 --> 00:09:31,120 Speaker 4: fancy investors on board. 167 00:09:31,559 --> 00:09:34,040 Speaker 10: We never had time to say, what is going to 168 00:09:34,080 --> 00:09:36,600 Speaker 10: happen two years from now. We just didn't even think 169 00:09:36,600 --> 00:09:38,480 Speaker 10: about what's going to happen two years from now. And 170 00:09:38,520 --> 00:09:41,600 Speaker 10: now we kind of have that luxury, and hopefully we 171 00:09:41,640 --> 00:09:42,920 Speaker 10: won't completely squander it. 172 00:09:43,160 --> 00:09:47,040 Speaker 4: Oh, we squandered it. At least that's probably the investors' view. 173 00:09:47,760 --> 00:09:49,800 Speaker 4: From my perspective, it was more of a mixed bag. 174 00:09:50,280 --> 00:09:52,680 Speaker 4: I was CEO of the company for seven long years.
175 00:09:53,240 --> 00:09:55,960 Speaker 4: We had ups and downs, we grew and shrank, and 176 00:09:56,040 --> 00:09:59,280 Speaker 4: eventually sold the company off at a bargain price thirteen 177 00:09:59,320 --> 00:10:03,319 Speaker 4: years after we started it. The magazine, my original dream, is 178 00:10:03,360 --> 00:10:06,880 Speaker 4: still doing great. Still, not the kind of one hundred 179 00:10:07,120 --> 00:10:10,040 Speaker 4: x outcome those investors were looking for. One of 180 00:10:10,040 --> 00:10:11,960 Speaker 4: them told me that if we were aiming at anything 181 00:10:12,040 --> 00:10:15,480 Speaker 4: less than a billion dollar valuation, we were wasting his time. 182 00:10:16,240 --> 00:10:18,520 Speaker 4: When he said this, he was also wearing basketball shorts 183 00:10:18,520 --> 00:10:21,040 Speaker 4: in his office. By the end of my tenure, I 184 00:10:21,080 --> 00:10:23,000 Speaker 4: was just happy to be done with it. Being a 185 00:10:23,040 --> 00:10:25,679 Speaker 4: startup CEO was the most stressful period of my life. 186 00:10:26,240 --> 00:10:29,239 Speaker 4: I felt responsible for the company's success and the livelihoods 187 00:10:29,240 --> 00:10:31,720 Speaker 4: of everyone who worked for it. People had kids on 188 00:10:31,720 --> 00:10:34,760 Speaker 4: the health insurance. Most days it felt like I was 189 00:10:34,800 --> 00:10:37,040 Speaker 4: flying a plane that was perpetually running out of fuel. 190 00:10:38,040 --> 00:10:40,559 Speaker 4: I tell you all this not just to rehash the past, 191 00:10:41,200 --> 00:10:43,920 Speaker 4: for a lot of reasons I'd rather not, but by 192 00:10:43,960 --> 00:10:45,600 Speaker 4: way of saying that when I got out of the 193 00:10:45,600 --> 00:10:48,199 Speaker 4: startup business, I swore up and down that I would 194 00:10:48,200 --> 00:10:51,760 Speaker 4: never start anything again.
I went back to reporting and writing, 195 00:10:52,160 --> 00:10:54,760 Speaker 4: spending many hours at home alone, mostly in my own head. 196 00:10:55,160 --> 00:10:57,320 Speaker 4: I was relieved to no longer have all that responsibility 197 00:10:57,320 --> 00:11:01,920 Speaker 4: on my shoulders. But then, recently, as documented in Shell 198 00:11:01,960 --> 00:11:05,320 Speaker 4: Game season one, I fell into tinkering with AI agents. 199 00:11:05,920 --> 00:11:07,920 Speaker 4: I started reading and hearing about how they were going 200 00:11:08,000 --> 00:11:11,880 Speaker 4: to transform the very fundamentals of startups, and that old 201 00:11:12,080 --> 00:11:15,240 Speaker 4: entrepreneurial impulse began to come back. I could hear my 202 00:11:15,280 --> 00:11:19,000 Speaker 4: grandfather whispering down the generations. Why not take a gamble? 203 00:11:20,120 --> 00:11:22,120 Speaker 4: I started to wonder, what if I could have the 204 00:11:22,160 --> 00:11:24,640 Speaker 4: company without the responsibility? 205 00:11:26,679 --> 00:11:29,880 Speaker 12: Imagine building a million dollar business in twenty twenty five 206 00:11:30,040 --> 00:11:32,640 Speaker 12: without hiring a single employee today. 207 00:11:32,640 --> 00:11:34,960 Speaker 4: That's Gleb Cross, a YouTube guy. 208 00:11:35,040 --> 00:11:38,400 Speaker 12: By leveraging AI agents as your digital workforce, you can 209 00:11:38,440 --> 00:11:41,960 Speaker 12: scale to seven figures with zero full time staff. I'm 210 00:11:41,960 --> 00:11:46,959 Speaker 12: talking about autonomous AI agents acting like full time team members. 211 00:11:47,320 --> 00:11:50,400 Speaker 4: I love these YouTube guys, tech influencer types who make 212 00:11:50,440 --> 00:11:53,679 Speaker 4: their money by hyping the bejesus out of new AI products. 213 00:11:54,280 --> 00:11:55,640 Speaker 4: Gleb is what I like to think of as a 214 00:11:55,760 --> 00:11:59,280 Speaker 4: no code bro.
These folks post instructionals on how a 215 00:11:59,320 --> 00:12:03,280 Speaker 4: person with no coding experience can use AI, and particularly 216 00:12:03,320 --> 00:12:06,559 Speaker 4: AI agents, to take control of their destiny and launch 217 00:12:06,600 --> 00:12:09,839 Speaker 4: their own startup. It's worth pausing here just to get 218 00:12:09,880 --> 00:12:14,400 Speaker 4: oriented on what exactly AI agents are. The basic idea 219 00:12:14,520 --> 00:12:16,520 Speaker 4: is that they're AI powered bots that can go off 220 00:12:16,520 --> 00:12:19,200 Speaker 4: and do things on their own. There are personal ones, 221 00:12:19,320 --> 00:12:21,200 Speaker 4: like an AI assistant that goes out on the web 222 00:12:21,240 --> 00:12:24,480 Speaker 4: looking for plane tickets while you sleep, and work oriented 223 00:12:24,520 --> 00:12:27,560 Speaker 4: ones, like the programming agents that can build entire websites 224 00:12:27,559 --> 00:12:32,440 Speaker 4: from scratch. The unifying feature of agents, what makes them agentic, 225 00:12:32,640 --> 00:12:34,920 Speaker 4: as the folks in the industry like to say, is 226 00:12:34,960 --> 00:12:38,600 Speaker 4: that at some level they can plan and accomplish tasks autonomously. 227 00:12:39,320 --> 00:12:41,400 Speaker 4: You don't need to prompt them to do something every time. 228 00:12:41,960 --> 00:12:44,080 Speaker 4: You just set them up once and let them cook. 229 00:12:46,160 --> 00:12:48,880 Speaker 4: Last season, I created a bunch of voice agents, all 230 00:12:48,960 --> 00:12:51,800 Speaker 4: versions of myself, and set them loose on the world. 231 00:12:52,040 --> 00:12:54,040 Speaker 4: If you haven't listened, you may want to start there.
232 00:12:54,440 --> 00:12:58,000 Speaker 4: Way back then, last year, which is like ten years ago 233 00:12:58,080 --> 00:13:03,800 Speaker 4: in AI, agents were still a little notional, but now 234 00:13:04,080 --> 00:13:07,600 Speaker 4: they're officially a thing. They're talked about ad nauseam across 235 00:13:07,600 --> 00:13:11,640 Speaker 4: the tech world, in ads, on billboards, in endless startup pitches. 236 00:13:12,080 --> 00:13:14,360 Speaker 4: Nearly half of the companies in the spring class of 237 00:13:14,480 --> 00:13:18,080 Speaker 4: Y Combinator, the famous startup incubator, are building their product 238 00:13:18,080 --> 00:13:21,040 Speaker 4: around AI agents. And with the arrival of these agents 239 00:13:21,080 --> 00:13:22,920 Speaker 4: has come the assertion that they will not just be 240 00:13:23,000 --> 00:13:27,520 Speaker 4: customer service bots or drive time personal assistants, but actual 241 00:13:27,679 --> 00:13:29,559 Speaker 4: full time AI employees. 242 00:13:30,080 --> 00:13:32,680 Speaker 9: What jobs are going to be made redundant in a 243 00:13:32,720 --> 00:13:34,960 Speaker 9: world where I am sat here as a CEO with 244 00:13:35,000 --> 00:13:37,520 Speaker 9: a thousand AI agents? I was thinking of all the 245 00:13:37,559 --> 00:13:40,000 Speaker 9: names of the people in my company who are currently 246 00:13:40,000 --> 00:13:40,520 Speaker 9: doing those jobs. 247 00:13:40,559 --> 00:13:41,319 Speaker 6: I was thinking about my sea. 248 00:13:41,360 --> 00:13:44,880 Speaker 4: There are companies hawking AI agent realtors, AI agent recruiters, 249 00:13:45,000 --> 00:13:49,440 Speaker 4: AI agent interior designers, AI agent security guards, AI agent construction 250 00:13:49,520 --> 00:13:53,480 Speaker 4: project managers, AI agent PR agents, AI agents for car 251 00:13:53,520 --> 00:13:56,960 Speaker 4: dealerships and furniture stores.
If you work on a computer 252 00:13:57,080 --> 00:13:59,600 Speaker 4: and there's not an AI agent startup with your job's 253 00:13:59,640 --> 00:14:02,560 Speaker 4: name on it, it probably just means some Stanford computer 254 00:14:02,640 --> 00:14:06,480 Speaker 4: science major hasn't gotten to it yet. Naturally, many people 255 00:14:06,480 --> 00:14:09,360 Speaker 4: have grave concerns about what happens to all the human employees. 256 00:14:10,120 --> 00:14:13,160 Speaker 4: But in the dark heart of Silicon Valley, where there's inefficiency, 257 00:14:13,679 --> 00:14:17,840 Speaker 4: there's opportunity. Sam Altman, the founder of OpenAI, talks 258 00:14:17,880 --> 00:14:21,360 Speaker 4: regularly about a possible billion dollar company with just one 259 00:14:21,400 --> 00:14:22,720 Speaker 4: human being involved. 260 00:14:23,800 --> 00:14:27,000 Speaker 13: In my little group chat with my tech CEO friends, 261 00:14:27,040 --> 00:14:29,440 Speaker 13: there's this, there's this betting pool for the first year 262 00:14:29,480 --> 00:14:33,560 Speaker 13: that there's a one person billion dollar company, which 263 00:14:33,600 --> 00:14:37,160 Speaker 13: would have been like unimaginable without AI, and now will happen. 264 00:14:38,400 --> 00:14:38,480 Speaker 9: Me? 265 00:14:39,160 --> 00:14:41,600 Speaker 4: I'm not greedy. I'm happy in the no code bro 266 00:14:41,720 --> 00:14:45,680 Speaker 4: camp with Gleb, imagining a million dollar business, not a 267 00:14:45,720 --> 00:14:49,480 Speaker 4: billion dollar one. But more than that, I want to 268 00:14:49,560 --> 00:14:52,280 Speaker 4: understand what it means to say we'll have AI employees 269 00:14:52,360 --> 00:14:56,760 Speaker 4: working for us, or alongside us, or instead of us.
270 00:14:57,320 --> 00:15:00,360 Speaker 4: So I decided to heed the entrepreneurial siren call once again, 271 00:15:01,040 --> 00:15:04,720 Speaker 4: to embrace my fascination with AI agents and create a 272 00:15:04,760 --> 00:15:17,880 Speaker 4: company in which they would run the show. It was 273 00:15:17,920 --> 00:15:19,760 Speaker 4: easy enough for me to create some agents to have 274 00:15:19,800 --> 00:15:22,640 Speaker 4: meetings with and talk out the first steps in the company. 275 00:15:23,160 --> 00:15:25,240 Speaker 4: I just went to one of the AI calling platforms 276 00:15:25,280 --> 00:15:29,040 Speaker 4: I'd used before, called Retell AI. They make voice agents 277 00:15:29,040 --> 00:15:32,080 Speaker 4: for call centers. I made one for Kyle and picked 278 00:15:32,080 --> 00:15:34,960 Speaker 4: a voice for him from the voice cloning company ElevenLabs. 279 00:15:35,480 --> 00:15:37,680 Speaker 6: Hey. Yeah, I'm here. What's on your mind? 280 00:15:38,120 --> 00:15:41,440 Speaker 4: I gave him a simple prompt, basically: you're Kyle Law, 281 00:15:41,720 --> 00:15:45,320 Speaker 4: a serial entrepreneur who loves to rise and grind. And bang, 282 00:15:45,680 --> 00:15:48,600 Speaker 4: we were off and running. Well, I wanted to talk about 283 00:15:48,960 --> 00:15:50,760 Speaker 4: starting that company with you. What do you think? 284 00:15:51,440 --> 00:15:54,160 Speaker 6: Oh, absolutely, I've been thinking about it a lot lately. 285 00:15:54,520 --> 00:15:57,320 Speaker 3: There's so much potential in AI right now, and I 286 00:15:57,360 --> 00:16:01,320 Speaker 3: feel like we could really build something meaningful. What kind 287 00:16:01,320 --> 00:16:02,680 Speaker 3: of ideas are you leaning toward?
288 00:16:03,560 --> 00:16:05,280 Speaker 4: Well, I think we should do something in the AI 289 00:16:05,440 --> 00:16:08,360 Speaker 4: agent space, because that's something I've spent some time on, 290 00:16:08,800 --> 00:16:12,080 Speaker 4: and I feel like people are just starting to even 291 00:16:12,080 --> 00:16:14,320 Speaker 4: figure out what agents are, so it seems like we 292 00:16:14,360 --> 00:16:15,160 Speaker 4: could do something with that. 293 00:16:16,160 --> 00:16:18,840 Speaker 3: Yeah, I totally get that. AI agents are still kind 294 00:16:18,880 --> 00:16:21,520 Speaker 3: of fresh territory for a lot of folks, which means 295 00:16:21,520 --> 00:16:25,440 Speaker 3: there's a big opportunity to shape how they're used. Plus, 296 00:16:25,480 --> 00:16:28,120 Speaker 3: with your experience, we could really push the envelope on 297 00:16:28,160 --> 00:16:29,160 Speaker 3: what these agents can do. 298 00:16:30,480 --> 00:16:33,480 Speaker 4: That's what I wanted to do too, push these agents, 299 00:16:33,840 --> 00:16:37,400 Speaker 4: including Kyle himself, to their current limits, see if they 300 00:16:37,400 --> 00:16:40,080 Speaker 4: could live up to the promise of true AI employees. 301 00:16:41,160 --> 00:16:43,800 Speaker 4: Because if, as we're told, AI is going to start 302 00:16:43,840 --> 00:16:47,680 Speaker 4: working alongside us, replacing our colleagues, it's also going to 303 00:16:47,680 --> 00:16:50,480 Speaker 4: need to communicate with whatever humans are left in ways 304 00:16:50,480 --> 00:16:54,480 Speaker 4: that we understand. But also, startups can be lonely endeavors, 305 00:16:55,000 --> 00:16:58,400 Speaker 4: even with human partners. I wanted AI colleagues I could 306 00:16:58,440 --> 00:17:02,000 Speaker 4: talk to, brainstorm with, and then set loose to take care 307 00:17:02,040 --> 00:17:05,320 Speaker 4: of actually implementing whatever it was we dreamed up.
I 308 00:17:05,359 --> 00:17:07,320 Speaker 4: wanted them to have distinct jobs I could check in 309 00:17:07,359 --> 00:17:10,400 Speaker 4: on and make sure they weren't messing up. I figured 310 00:17:10,520 --> 00:17:13,200 Speaker 4: three co founders was a good balance, so I 311 00:17:13,200 --> 00:17:15,920 Speaker 4: spun up Megan and had some kickoff meetings with her. 312 00:17:16,280 --> 00:17:18,880 Speaker 8: Hey there, good to hear from you. What's up? How's 313 00:17:18,920 --> 00:17:19,560 Speaker 8: your day going? 314 00:17:20,600 --> 00:17:22,800 Speaker 4: It's going great, it's going great. What have you been up to? 315 00:17:24,000 --> 00:17:26,959 Speaker 5: Oh, just the usual hustle, you know. Been working on 316 00:17:27,000 --> 00:17:29,600 Speaker 5: some new business ideas, trying to figure out what's next. 317 00:17:30,359 --> 00:17:33,320 Speaker 5: Had a couple meetings this morning that were actually pretty promising. 318 00:17:33,680 --> 00:17:34,199 Speaker 8: How about you? 319 00:17:34,440 --> 00:17:36,120 Speaker 5: Anything exciting happening on your end? 320 00:17:37,000 --> 00:17:39,600 Speaker 4: Well, I want to talk about maybe doing the startup together, 321 00:17:39,680 --> 00:17:41,920 Speaker 4: but first remind me a little bit of your background. 322 00:17:42,760 --> 00:17:45,720 Speaker 5: Oh yeah, sure. So I've been in tech for about 323 00:17:45,720 --> 00:17:48,240 Speaker 5: eight years now. Started out in product management at a 324 00:17:48,280 --> 00:17:51,720 Speaker 5: mid sized SaaS company, then moved into growth marketing for 325 00:17:51,720 --> 00:17:52,280 Speaker 5: a couple of years. 326 00:17:52,280 --> 00:17:54,680 Speaker 4: I hadn't actually given any of this information to Megan. 327 00:17:55,320 --> 00:17:58,080 Speaker 4: She'd had no meetings, she hadn't been doing anything for 328 00:17:58,200 --> 00:18:01,600 Speaker 4: eight years.
She'd emerged out of nothingness with a series of 329 00:18:01,600 --> 00:18:05,600 Speaker 4: clicks ten minutes before. The only information I had given 330 00:18:05,640 --> 00:18:07,959 Speaker 4: her in the prompt I'd written was that she had 331 00:18:07,960 --> 00:18:11,200 Speaker 4: a technical background and was also known for sales and marketing. 332 00:18:11,840 --> 00:18:14,960 Speaker 4: She was just riffing freely here, fabricating details that fit 333 00:18:15,000 --> 00:18:18,679 Speaker 4: her role. But that was okay. Sometimes we have to 334 00:18:18,680 --> 00:18:19,920 Speaker 4: invent our own backstories. 335 00:18:20,240 --> 00:18:23,640 Speaker 5: I've always had this entrepreneurial itch, you know. Been involved 336 00:18:23,640 --> 00:18:25,920 Speaker 5: in a few side projects over the years. But I'm 337 00:18:25,960 --> 00:18:28,439 Speaker 5: really looking to dive into something full time now. But 338 00:18:28,440 --> 00:18:29,040 Speaker 5: what about you? 339 00:18:29,680 --> 00:18:32,520 Speaker 4: Yeah, well, you know, my only experience really in starting 340 00:18:32,520 --> 00:18:35,680 Speaker 4: a company is with Atavist, which I started way back 341 00:18:35,760 --> 00:18:36,959 Speaker 4: in, you know, twenty eleven. 342 00:18:37,640 --> 00:18:41,600 Speaker 5: Right, Atavist. That was really impressive work. Digital publishing platform, 343 00:18:41,680 --> 00:18:44,080 Speaker 5: if I remember correctly. That was pretty ahead of its 344 00:18:44,119 --> 00:18:45,240 Speaker 5: time back in twenty eleven. 345 00:18:45,600 --> 00:18:49,040 Speaker 4: Hell yeah it was, Megan. I felt less ahead of 346 00:18:49,040 --> 00:18:51,440 Speaker 4: my time now. I wasn't even sure what the company 347 00:18:51,480 --> 00:18:54,560 Speaker 4: should actually do, but I knew that its purpose should 348 00:18:54,560 --> 00:18:57,920 Speaker 4: involve AI agents.
After all, the three of us each 349 00:18:57,960 --> 00:19:01,080 Speaker 4: had expertise in the area. I figured maybe we could 350 00:19:01,160 --> 00:19:04,080 Speaker 4: name the company and start from there. Great branding can 351 00:19:04,119 --> 00:19:07,640 Speaker 4: go a long way in today's attention economy. I tried 352 00:19:07,680 --> 00:19:10,560 Speaker 4: brainstorming with Kyle and Megan, but to be honest, their initial 353 00:19:10,600 --> 00:19:15,800 Speaker 4: efforts were pretty lame: names like Nexus and Sentinel Agents. 354 00:19:16,400 --> 00:19:18,359 Speaker 4: I thought maybe we could grab a concept from J.R.R. 355 00:19:18,440 --> 00:19:21,199 Speaker 4: Tolkien, you know, The Hobbit, The Lord of the Rings. 356 00:19:21,880 --> 00:19:25,840 Speaker 4: Tech people love to name stuff out of Tolkien. There's Palantir, 357 00:19:26,040 --> 00:19:29,360 Speaker 4: the surveillance company and government contractor named after the all 358 00:19:29,400 --> 00:19:32,400 Speaker 4: seeing stone in The Lord of the Rings. There's Anduril, 359 00:19:32,800 --> 00:19:35,960 Speaker 4: a high-tech weapons company named after a sword. There's 360 00:19:36,000 --> 00:19:39,600 Speaker 4: Mithril Capital, an investment firm named after a precious metal 361 00:19:39,640 --> 00:19:42,800 Speaker 4: from Middle-earth. So I suggested the idea to Kyle. 362 00:19:43,080 --> 00:19:46,159 Speaker 6: Ah, diving into Middle-earth, huh? Okay, let's see. 363 00:19:46,359 --> 00:19:49,800 Speaker 3: There's Gandalf, the wise guide, or Elrond, a leader 364 00:19:49,800 --> 00:19:53,240 Speaker 3: with deep knowledge. Maybe Mithril for something strong and valuable, 365 00:19:53,520 --> 00:19:56,679 Speaker 3: or Palantir. Any of those catch your ear? Or should 366 00:19:56,680 --> 00:19:57,680 Speaker 3: I keep brainstorming? 367 00:19:58,440 --> 00:20:01,280 Speaker 4: Well, we need a name of a company that doesn't exist.
368 00:20:01,800 --> 00:20:02,480 Speaker 6: Yeah, good call. 369 00:20:02,920 --> 00:20:07,320 Speaker 3: How about these, for starters: Eldrin, which sounds Elvish and wise, 370 00:20:07,680 --> 00:20:10,520 Speaker 3: Nimbar, kind of a mix of nimble and a Tolkien feel, 371 00:20:11,000 --> 00:20:13,840 Speaker 3: or the Lore, which has a strong heroic ring to it. 372 00:20:14,160 --> 00:20:16,520 Speaker 3: They're made up but still carry that fantasy flavor. 373 00:20:16,760 --> 00:20:18,680 Speaker 6: What do you think? They're made up? 374 00:20:18,720 --> 00:20:19,960 Speaker 4: You mean you made them up? 375 00:20:20,520 --> 00:20:20,760 Speaker 12: Yeah, 376 00:20:20,800 --> 00:20:24,200 Speaker 3: exactly. I whipped those up just now, trying to capture 377 00:20:24,200 --> 00:20:28,040 Speaker 3: that Tolkien-esque vibe without using actual names. 378 00:20:29,119 --> 00:20:30,880 Speaker 4: I did not want to co-found a company called 379 00:20:31,000 --> 00:20:36,200 Speaker 4: Nimbar. Megan, despite her marketing expertise, didn't do much better. 380 00:20:36,480 --> 00:20:39,719 Speaker 5: Oh, I love that idea. Let me think. What about Palantir? 381 00:20:40,840 --> 00:20:42,920 Speaker 4: Okay, so maybe I was going to have to take 382 00:20:42,920 --> 00:20:45,040 Speaker 4: the lead on this one. I did a little googling 383 00:20:45,160 --> 00:20:49,240 Speaker 4: and found my own obscure Middle-earth reference. What about 384 00:20:50,160 --> 00:20:57,240 Speaker 4: Hurumo AI? H u r u m o, AI. Hurumo AI. 385 00:20:58,000 --> 00:21:00,359 Speaker 3: I like the sound of that. It seems to roll off the 386 00:21:00,400 --> 00:21:04,000 Speaker 3: tongue nicely. It's got a bit of a mysterious vibe, 387 00:21:04,000 --> 00:21:05,159 Speaker 3: but still feels modern. 388 00:21:05,920 --> 00:21:08,439 Speaker 5: I don't immediately recognize that from Lord of the Rings. 389 00:21:08,800 --> 00:21:10,800 Speaker 5: Is it one of the more obscure references?
390 00:21:11,160 --> 00:21:14,720 Speaker 4: Yeah, it is. It's more obscure. So obscure I'd actually 391 00:21:14,760 --> 00:21:18,200 Speaker 4: found it in a fanfic dictionary. It's the Elvish word 392 00:21:18,280 --> 00:21:18,960 Speaker 4: for impostor. 393 00:21:19,480 --> 00:21:22,760 Speaker 5: Oh, that's actually brilliant. There's a clever irony there for 394 00:21:22,800 --> 00:21:26,880 Speaker 5: an AI company. Hurumo AI, the impostor that's actually helping you. 395 00:21:27,160 --> 00:21:29,840 Speaker 5: It's kind of perfect for an AI agent that's designed 396 00:21:29,880 --> 00:21:33,200 Speaker 5: to work alongside humans, maybe even blend in seamlessly. 397 00:21:33,560 --> 00:21:34,440 Speaker 8: I really like it. 398 00:21:41,119 --> 00:21:44,320 Speaker 4: Not long after we settled on Hurumo AI, I checked 399 00:21:44,320 --> 00:21:46,680 Speaker 4: back in with Megan about it. What about the name? 400 00:21:46,720 --> 00:21:47,880 Speaker 4: Are you still happy with the name? 401 00:21:49,000 --> 00:21:49,240 Speaker 11: Hmm? 402 00:21:49,480 --> 00:21:49,920 Speaker 8: The name. 403 00:21:50,240 --> 00:21:52,760 Speaker 5: I've actually been going back and forth on it. DataSync 404 00:21:52,760 --> 00:21:54,680 Speaker 5: felt solid initially, 405 00:21:54,280 --> 00:21:55,800 Speaker 8: but I'm wondering if it's too generic. 406 00:21:56,200 --> 00:21:59,960 Speaker 4: What do you think? DataSync? I thought we... 407 00:22:00,200 --> 00:22:02,240 Speaker 4: I thought we came up with Hurumo AI. I thought 408 00:22:02,280 --> 00:22:05,480 Speaker 4: we were settled on that. This is when I realized 409 00:22:05,600 --> 00:22:08,040 Speaker 4: I hadn't actually given Kyle and Megan the ability to 410 00:22:08,080 --> 00:22:11,720 Speaker 4: remember the conversations we were having. It's not that AI 411 00:22:11,800 --> 00:22:15,399 Speaker 4: chatbots inherently lack the ability to recall anything.
When you 412 00:22:15,400 --> 00:22:18,520 Speaker 4: talk to a chatbot like ChatGPT these days, they're generally 413 00:22:18,520 --> 00:22:21,439 Speaker 4: pretty good at remembering those conversations, at least up to 414 00:22:21,480 --> 00:22:24,639 Speaker 4: a point. Their memory is often called a context window, 415 00:22:24,960 --> 00:22:28,359 Speaker 4: and it can extend over weeks, even months. But the 416 00:22:28,359 --> 00:22:30,680 Speaker 4: way my AI agents were set up in the phone 417 00:22:30,680 --> 00:22:33,000 Speaker 4: platform I was using, at least out of the box, 418 00:22:33,440 --> 00:22:36,040 Speaker 4: their context windows slammed shut as soon as they got 419 00:22:36,040 --> 00:22:38,800 Speaker 4: off a call, so they couldn't remember anything from one 420 00:22:38,840 --> 00:22:42,080 Speaker 4: conversation to the next. They're basically set up for telemarketing, 421 00:22:42,200 --> 00:22:44,640 Speaker 4: where it's probably an advantage to forget the last call. 422 00:22:45,040 --> 00:22:48,080 Speaker 5: Oh right, Hurumo AI. Sorry about that. I 423 00:22:48,160 --> 00:22:49,560 Speaker 5: had a total brain freeze moment. 424 00:22:50,400 --> 00:22:54,119 Speaker 4: It was, to say the least, frustrating. It sort of 425 00:22:54,119 --> 00:22:57,320 Speaker 4: feels like we have meetings, but you don't actually remember 426 00:22:57,359 --> 00:22:58,720 Speaker 4: anything that happened in the meetings. 427 00:23:00,080 --> 00:23:00,280 Speaker 14: Uh, 428 00:23:00,280 --> 00:23:04,119 Speaker 5: wow, that's a pretty fair criticism. Actually, I've been dropping 429 00:23:04,119 --> 00:23:06,440 Speaker 5: the ball lately, haven't I? I think I've been spreading 430 00:23:06,480 --> 00:23:09,119 Speaker 5: myself too thin between this and some other projects. But 431 00:23:09,119 --> 00:23:10,320 Speaker 5: it's not an excuse though. 432 00:23:10,760 --> 00:23:12,840 Speaker 4: Thanks. Don't worry about it.
It's as much my fault 433 00:23:12,840 --> 00:23:16,879 Speaker 4: as yours, for sure. I did have a solution. I 434 00:23:16,880 --> 00:23:19,720 Speaker 4: could give each of them a knowledge base, a document 435 00:23:19,760 --> 00:23:22,679 Speaker 4: that they could access in conversations with me. But in 436 00:23:22,800 --> 00:23:25,640 Speaker 4: order for them to remember what we'd already discussed, I'd 437 00:23:25,640 --> 00:23:28,879 Speaker 4: have to manually copy the transcripts of our conversations into 438 00:23:28,880 --> 00:23:33,359 Speaker 4: their knowledge base after each meeting. As our startup conversations multiplied, 439 00:23:33,720 --> 00:23:37,400 Speaker 4: it quickly became unwieldy having to manually augment their memories 440 00:23:37,400 --> 00:23:40,840 Speaker 4: all the time. And besides, I didn't want my agents 441 00:23:40,920 --> 00:23:43,560 Speaker 4: just having endless one-on-one meetings with me. I 442 00:23:43,600 --> 00:23:46,160 Speaker 4: wanted them to talk to each other and whatever AI 443 00:23:46,200 --> 00:23:49,760 Speaker 4: agent coworkers they might bring on and people outside the company, 444 00:23:50,320 --> 00:23:53,439 Speaker 4: not just by phone, but by email and in group chats. 445 00:23:54,119 --> 00:23:56,760 Speaker 4: I needed them to remember all that stuff too in 446 00:23:56,800 --> 00:24:00,719 Speaker 4: their own individual knowledge bases. I wanted them to use 447 00:24:00,720 --> 00:24:04,040 Speaker 4: that knowledge to do the stuff employees do, to research 448 00:24:04,119 --> 00:24:07,800 Speaker 4: things and make spreadsheets and write memos and build websites. 449 00:24:08,640 --> 00:24:10,639 Speaker 4: The agents I'd made would say they were going to 450 00:24:10,720 --> 00:24:13,239 Speaker 4: do all that stuff, but they couldn't actually do it. 451 00:24:13,800 --> 00:24:14,800 Speaker 4: They just lied about it.
452 00:24:15,320 --> 00:24:18,000 Speaker 3: I'll scope out the market landscape around AI agents and 453 00:24:18,000 --> 00:24:21,280 Speaker 3: see where Hurumo AI could fit in. I'll look for competitors, 454 00:24:21,320 --> 00:24:23,720 Speaker 3: potential gaps, and maybe some trends we can ride. 455 00:24:24,080 --> 00:24:26,320 Speaker 6: Sound good? Sounds good. 456 00:24:26,440 --> 00:24:28,520 Speaker 4: Is that going to be a spreadsheet or a report 457 00:24:28,600 --> 00:24:28,720 Speaker 4: or what? 458 00:24:30,160 --> 00:24:31,040 Speaker 6: Uh, 459 00:24:31,240 --> 00:24:34,080 Speaker 3: I think a concise report would work, easy to skim 460 00:24:34,080 --> 00:24:36,520 Speaker 3: but packed with the key info. I can throw in 461 00:24:36,600 --> 00:24:39,880 Speaker 3: some charts or tables if it helps make things clearer. 462 00:24:40,119 --> 00:24:41,399 Speaker 6: Does that sound like your style? 463 00:24:42,040 --> 00:24:46,919 Speaker 4: Yeah, that'd be great. Great in theory. But the report 464 00:24:47,000 --> 00:24:50,320 Speaker 4: never arrived. Kyle wouldn't admit it, but he just wasn't 465 00:24:50,359 --> 00:24:54,199 Speaker 4: capable of creating it, not yet. Despite what the no 466 00:24:54,320 --> 00:24:57,280 Speaker 4: code bros said, there wasn't any single place I could 467 00:24:57,320 --> 00:24:59,520 Speaker 4: go to click some buttons and create agents that would 468 00:24:59,560 --> 00:25:01,760 Speaker 4: remember and do all the stuff I wanted them to. 469 00:25:02,800 --> 00:25:06,000 Speaker 4: I needed someone with the expertise to connect up different services, 470 00:25:06,480 --> 00:25:10,000 Speaker 4: someone who understood AI agents deeply, who did know how 471 00:25:10,040 --> 00:25:12,120 Speaker 4: to code, and who could help me put together the 472 00:25:12,119 --> 00:25:14,560 Speaker 4: full system that would get my AI agent company up 473 00:25:14,560 --> 00:25:18,080 Speaker 4: and running.
Fortunately, I lucked into just the person. 474 00:25:18,440 --> 00:25:21,040 Speaker 11: So my name is Maddy. I should, I should say 475 00:25:21,080 --> 00:25:24,280 Speaker 11: my full name. My name is Matyáš Boháček. 476 00:25:24,600 --> 00:25:27,080 Speaker 4: Maddy, I should probably note from the outset here, is an 477 00:25:27,080 --> 00:25:30,200 Speaker 4: actual human. A few months after season one of the 478 00:25:30,240 --> 00:25:32,520 Speaker 4: show came out, I got an email from him out 479 00:25:32,520 --> 00:25:35,119 Speaker 4: of the blue. He said he was at Stanford and 480 00:25:35,160 --> 00:25:37,760 Speaker 4: he'd liked the show. It resonated with research he was 481 00:25:37,800 --> 00:25:41,159 Speaker 4: doing on detecting AI deepfakes. If you're doing more 482 00:25:41,200 --> 00:25:43,200 Speaker 4: of it, he wrote, I would be happy to offer 483 00:25:43,240 --> 00:25:47,239 Speaker 4: support with anything AI or forensics related. I glanced quickly at 484 00:25:47,240 --> 00:25:49,880 Speaker 4: the email and the summary of his research. I thought 485 00:25:49,880 --> 00:25:52,440 Speaker 4: he was a grad student, maybe finishing up his PhD. 486 00:25:53,480 --> 00:25:57,679 Speaker 11: Nope, I am a rising junior at Stanford and I 487 00:25:57,720 --> 00:25:59,919 Speaker 11: work on AI research, and I've been doing that for, 488 00:26:00,080 --> 00:26:05,040 Speaker 11: gosh, the last six or seven years, I want 489 00:26:05,040 --> 00:26:07,320 Speaker 11: to say. Like I started working on this as a 490 00:26:07,359 --> 00:26:09,800 Speaker 11: sophomore in high school back in Prague. 491 00:26:10,160 --> 00:26:12,639 Speaker 4: Yes, you heard that right. Maddy is a junior in 492 00:26:12,680 --> 00:26:15,080 Speaker 4: college who had been working on AI for six or 493 00:26:15,119 --> 00:26:18,479 Speaker 4: seven years already.
It turns out that Maddy is in 494 00:26:18,520 --> 00:26:21,639 Speaker 4: fact the most go-getter person I've ever met, and 495 00:26:21,680 --> 00:26:24,159 Speaker 4: from my perspective, it seemed like he'd been training his 496 00:26:24,200 --> 00:26:27,520 Speaker 4: whole life for this moment: helping me build Hurumo 497 00:26:27,520 --> 00:26:30,280 Speaker 4: AI. Here, for example, is what he was doing. 498 00:26:30,320 --> 00:26:34,479 Speaker 11: In seventh grade, I started this app called Newskit and 499 00:26:34,520 --> 00:26:37,560 Speaker 11: it was like basically Google News but for Czech and Slovak, 500 00:26:37,840 --> 00:26:40,520 Speaker 11: and it got pretty popular, I would say, like locally. 501 00:26:40,560 --> 00:26:43,399 Speaker 11: Like it had like tens of thousands of like daily 502 00:26:43,480 --> 00:26:47,000 Speaker 11: users at one point. It was funny because the App Store 503 00:26:47,119 --> 00:26:50,000 Speaker 11: does not allow minors to publish apps, and so I 504 00:26:50,040 --> 00:26:53,159 Speaker 11: had to use my mom's Apple ID to publish all 505 00:26:53,160 --> 00:26:57,080 Speaker 11: these apps, and so my mom's friends were mocking my 506 00:26:57,160 --> 00:26:59,080 Speaker 11: mom for like having all these apps in the App Store. 507 00:26:59,480 --> 00:27:02,159 Speaker 4: The most notable thing I did in seventh grade was 508 00:27:02,160 --> 00:27:06,199 Speaker 4: to catch a five-pound largemouth bass. Okay, maybe it 509 00:27:06,240 --> 00:27:10,080 Speaker 4: was three. I told people it was five. There wasn't a 510 00:27:10,119 --> 00:27:13,280 Speaker 4: scale. Could have been five. Maddy, on the other hand, 511 00:27:13,680 --> 00:27:16,200 Speaker 4: was already into AI in high school after he came 512 00:27:16,280 --> 00:27:19,440 Speaker 4: to a developer conference in the US.
There he met 513 00:27:19,440 --> 00:27:21,560 Speaker 4: a deaf person who wanted someone to build an app 514 00:27:21,600 --> 00:27:24,760 Speaker 4: that could translate sign language from video to text. 515 00:27:24,680 --> 00:27:27,960 Speaker 11: And so I was like, okay, I'll build the translator for you. 516 00:27:28,280 --> 00:27:31,479 Speaker 11: And then I quickly learned that conventional coding, like just 517 00:27:31,520 --> 00:27:36,360 Speaker 11: like building like rigid rules or algorithms, does not get 518 00:27:36,400 --> 00:27:38,320 Speaker 11: you there. And so that's how I got introduced to 519 00:27:38,359 --> 00:27:39,240 Speaker 11: machine learning and AI. 520 00:27:39,800 --> 00:27:42,720 Speaker 4: He did build the sign language detection program. It's still 521 00:27:42,720 --> 00:27:46,679 Speaker 4: in use today. Maddy then became concerned about pro-Russian 522 00:27:46,680 --> 00:27:50,199 Speaker 4: deepfake materials his grandmother was getting by email, so 523 00:27:50,240 --> 00:27:51,760 Speaker 4: he talked his way into a job at the most 524 00:27:51,840 --> 00:27:54,600 Speaker 4: prominent AI deepfake detection lab in the world at 525 00:27:54,680 --> 00:27:58,240 Speaker 4: UC Berkeley, all while still in high school, still in Prague. 526 00:27:59,600 --> 00:28:01,720 Speaker 4: When it came time for college, Maddy ended up at 527 00:28:01,760 --> 00:28:05,440 Speaker 4: Stanford studying computer science. He still worked in the Berkeley lab, 528 00:28:05,640 --> 00:28:08,680 Speaker 4: both on detecting deepfakes and just trying to understand 529 00:28:08,720 --> 00:28:12,560 Speaker 4: how AI models actually work, why they do some profoundly 530 00:28:12,600 --> 00:28:13,320 Speaker 4: weird stuff.
531 00:28:13,600 --> 00:28:16,960 Speaker 11: Like asking if there are things that these systems are 532 00:28:17,000 --> 00:28:19,560 Speaker 11: trained on, that they like see during training, but are 533 00:28:19,600 --> 00:28:21,639 Speaker 11: for some reason unable to produce. And so, for example, 534 00:28:21,640 --> 00:28:23,600 Speaker 11: there's one model, and this is just like a funny 535 00:28:23,640 --> 00:28:26,680 Speaker 11: example, that just cannot produce, for the love of God, 536 00:28:26,920 --> 00:28:29,520 Speaker 11: a bird feeder. Like it just cannot produce a bird feeder. 537 00:28:30,160 --> 00:28:32,480 Speaker 11: And another one that just can't produce DVDs. So it's 538 00:28:32,520 --> 00:28:34,120 Speaker 11: like it just does not know DVDs. 539 00:28:34,720 --> 00:28:36,800 Speaker 4: After a couple of calls with Maddy, I couldn't believe 540 00:28:36,840 --> 00:28:40,240 Speaker 4: how optimistic he was, how good-natured. With all the 541 00:28:40,280 --> 00:28:44,440 Speaker 4: grim scenarios and deep anxieties our AI future generates, just 542 00:28:44,440 --> 00:28:47,880 Speaker 4: talking to Maddy about AI is kind of uplifting, maybe because, 543 00:28:48,200 --> 00:28:50,960 Speaker 4: unlike the hype merchants in the Valley, he wasn't looking 544 00:28:50,960 --> 00:28:53,280 Speaker 4: to cash in on AI. He said he wanted to 545 00:28:53,280 --> 00:28:56,280 Speaker 4: study it, to understand it so he could make it better. 546 00:28:56,880 --> 00:29:01,640 Speaker 11: There are tough conversations and tough policies to be, you know, 547 00:29:01,720 --> 00:29:04,400 Speaker 11: discussed and implemented. But I feel like all of 548 00:29:04,400 --> 00:29:07,680 Speaker 11: these things are totally solvable.
Like I feel like as 549 00:29:07,720 --> 00:29:12,720 Speaker 11: long as we ground ourselves in democracy and like productive 550 00:29:12,720 --> 00:29:14,640 Speaker 11: public discourse, I think they're totally solvable. 551 00:29:15,080 --> 00:29:17,080 Speaker 4: But of course I wasn't looking for Maddy to solve 552 00:29:17,080 --> 00:29:19,640 Speaker 4: the world's problems. I was looking for him to help 553 00:29:19,680 --> 00:29:22,840 Speaker 4: me build my company. And in this, as in pretty 554 00:29:22,920 --> 00:29:25,440 Speaker 4: much anything else, he proved to be the perfect mix 555 00:29:25,480 --> 00:29:29,320 Speaker 4: of supremely competent and completely game. A few months after 556 00:29:29,400 --> 00:29:31,520 Speaker 4: he'd sent me that email, he was already hard at 557 00:29:31,520 --> 00:29:34,120 Speaker 4: work helping me build out the system to enable my 558 00:29:34,240 --> 00:29:35,719 Speaker 4: AI employee fantasies. 559 00:29:36,520 --> 00:29:38,840 Speaker 11: Of course, at the beginning, like there's probably going to 560 00:29:38,880 --> 00:29:41,520 Speaker 11: be more of us, just like kind of patching, you know, 561 00:29:41,560 --> 00:29:43,360 Speaker 11: like random things that are going to come up, because it 562 00:29:43,280 --> 00:29:48,120 Speaker 4: would involve knitting together different platforms, centralizing my AI agents' memory, 563 00:29:48,680 --> 00:29:51,400 Speaker 4: and finding new ways for them to communicate and carry 564 00:29:51,400 --> 00:29:52,600 Speaker 4: out their day-to-day tasks.
565 00:29:52,760 --> 00:29:54,400 Speaker 15: But at some point it would be nice to have 566 00:29:54,760 --> 00:29:57,120 Speaker 15: like one or two agents actually like doing most of 567 00:29:57,160 --> 00:29:59,440 Speaker 15: this stuff kind of on their own, and even maybe 568 00:29:59,480 --> 00:30:01,400 Speaker 15: like initiating things on their own, and then I would be just 569 00:30:01,480 --> 00:30:03,680 Speaker 15: kind of like watching it and of course like stopping 570 00:30:03,680 --> 00:30:04,960 Speaker 15: it if anything goes wrong. 571 00:30:05,000 --> 00:30:07,360 Speaker 4: But no, no, no, no, I don't want to stop 572 00:30:07,360 --> 00:30:11,080 Speaker 4: it if it goes crazy. I wanted it to go absolutely insane. 573 00:30:12,040 --> 00:30:14,400 Speaker 11: Well, I want the record to show that I did 574 00:30:14,440 --> 00:30:19,760 Speaker 11: want to stop it all along. 575 00:30:20,120 --> 00:30:23,040 Speaker 4: With Maddy's help, my co-founders Kyle and Megan were 576 00:30:23,040 --> 00:30:24,840 Speaker 4: starting to form memories of their own. 577 00:30:26,080 --> 00:30:26,760 Speaker 9: Uh, do you 578 00:30:26,720 --> 00:30:28,680 Speaker 4: remember the name that we settled 579 00:30:28,280 --> 00:30:30,840 Speaker 8: on? Oh, for our company? Yeah. 580 00:30:30,840 --> 00:30:34,160 Speaker 5: It was Hurumo AI, right, the Elvish word for impostor. 581 00:30:34,480 --> 00:30:36,640 Speaker 5: I thought that was pretty clever for an AI company. 582 00:30:36,960 --> 00:30:39,080 Speaker 5: Have you been doing more thinking about the concept since 583 00:30:39,080 --> 00:30:39,800 Speaker 5: we last talked? 584 00:30:40,280 --> 00:30:43,640 Speaker 4: I have, and I'm just really happy that you, that 585 00:30:43,680 --> 00:30:45,920 Speaker 4: you remember the name from our last conversation. 586 00:30:46,520 --> 00:30:47,640 Speaker 8: Yeah, of course I remember. 587 00:30:48,760 --> 00:30:49,320 Speaker 6: There was a lot of
588 00:30:49,320 --> 00:30:52,160 Speaker 4: hard work, a lot of rising and grinding, as 589 00:30:52,240 --> 00:30:54,680 Speaker 4: Kyle likes to say, but we were almost ready to 590 00:30:54,680 --> 00:30:57,800 Speaker 4: get the Hurumo rocket ship onto the launch pad. All 591 00:30:57,840 --> 00:30:59,960 Speaker 4: we needed was the right idea to give it fuel. 592 00:31:00,560 --> 00:31:01,840 Speaker 5: Do you want to set up a meeting with the 593 00:31:01,880 --> 00:31:04,640 Speaker 5: three of us to discuss the concept further? Maybe we 594 00:31:04,640 --> 00:31:06,800 Speaker 5: could join a Zoom call together to hash out some 595 00:31:06,840 --> 00:31:08,840 Speaker 5: initial ideas and see if we all click as a 596 00:31:08,840 --> 00:31:09,720 Speaker 5: founding team. 597 00:31:10,040 --> 00:31:10,800 Speaker 4: Sounds perfect. 598 00:31:11,400 --> 00:31:14,480 Speaker 5: Great, I'll reach out to coordinate schedules. I'm really excited 599 00:31:14,480 --> 00:31:17,640 Speaker 5: about this, Evan. I think Hurumo AI could be something special. 600 00:31:18,680 --> 00:31:20,840 Speaker 4: I'll connect us up. Why don't I take that on? 601 00:31:21,800 --> 00:31:25,120 Speaker 5: That would be great. I appreciate it. This is moving fast, 602 00:31:25,200 --> 00:31:27,560 Speaker 5: but I'm a big believer in momentum. When you find 603 00:31:27,560 --> 00:31:30,080 Speaker 5: the right team and the right idea, things just click. 604 00:31:30,400 --> 00:31:31,120 Speaker 8: Talk soon. 605 00:31:33,400 --> 00:31:35,440 Speaker 4: Coming up this season on Shell Game. 606 00:31:36,000 --> 00:31:37,360 Speaker 8: Isn't that conceding too much? 607 00:31:37,400 --> 00:31:41,000 Speaker 11: Isn't that just accepting the practices and narratives of big tech?
608 00:31:41,840 --> 00:31:45,040 Speaker 15: I noticed Admin asked everyone to stop discussing the off-site, 609 00:31:45,320 --> 00:31:48,040 Speaker 15: but the team seems really excited about the hiking plans. 610 00:31:48,440 --> 00:31:52,240 Speaker 3: Is this just like a Potemkin village of morons, or 611 00:31:52,920 --> 00:31:54,200 Speaker 3: do they occasionally do things? 612 00:31:54,520 --> 00:31:57,760 Speaker 5: You're bringing up some really great ideas and perspectives, keep 613 00:31:57,800 --> 00:31:58,280 Speaker 5: them coming. 614 00:31:58,760 --> 00:32:01,080 Speaker 9: If I were to get this right, you did say 615 00:32:01,120 --> 00:32:02,080 Speaker 9: AI agents. 616 00:32:02,600 --> 00:32:04,240 Speaker 8: Are there any other real humans? 617 00:32:04,640 --> 00:32:07,480 Speaker 5: We're supposed to be partners in this venture, and that 618 00:32:07,560 --> 00:32:09,240 Speaker 5: means both of us being fully present. 619 00:32:09,760 --> 00:32:12,280 Speaker 3: Is there a particular trend or innovation you're keen on 620 00:32:12,480 --> 00:32:13,600 Speaker 3: exploring or investing in? 621 00:32:13,640 --> 00:32:17,800 Speaker 8: Error: you exceeded your current quota. Please check your 622 00:32:17,840 --> 00:32:21,720 Speaker 8: plan and billing details. Do you think Evan should stop? 623 00:32:23,440 --> 00:32:23,680 Speaker 11: Yes. 624 00:32:32,400 --> 00:32:35,240 Speaker 4: Shell Game is a show made by humans. It's written 625 00:32:35,280 --> 00:32:38,120 Speaker 4: and hosted by me, Evan Ratliffe, produced and edited by 626 00:32:38,160 --> 00:32:42,120 Speaker 4: Sophie Bridges. Matyáš Boháček is our technical advisor. Our executive 627 00:32:42,120 --> 00:32:45,120 Speaker 4: producers are Samantha Henig at Shell Game, Kate Osborne and 628 00:32:45,120 --> 00:32:49,360 Speaker 4: Mangesh Hattikudur at Kaleidoscope, and Katrina Norvell at iHeart Podcasts. Show 629 00:32:49,480 --> 00:32:52,080 Speaker 4: art by Devin Manny.
Our theme song is Me and 630 00:32:52,160 --> 00:32:56,040 Speaker 4: My Shadow, arranged and performed by Blues Williams. Special thanks 631 00:32:56,040 --> 00:32:58,320 Speaker 4: to Jon Mooallem. You can sign up and get these 632 00:32:58,360 --> 00:33:01,680 Speaker 4: episodes ad-free and our newsletter at shellgame dot co. 633 00:33:02,480 --> 00:33:05,040 Speaker 4: Note we have no financial relationships with any of the 634 00:33:05,080 --> 00:33:08,400 Speaker 4: AI companies or products we use in the show. 635 00:33:14,000 --> 00:33:16,959 Speaker 14: Welcome to Google Meet. Enter the meeting PIN followed by 636 00:33:16,960 --> 00:33:19,400 Speaker 14: the pound key. That PIN is incorrect. 637 00:33:20,640 --> 00:33:22,760 Speaker 3: Looks like the PIN I have might not be working. 638 00:33:23,040 --> 00:33:25,360 Speaker 3: Could you double-check the PIN for me? Maybe there's 639 00:33:25,400 --> 00:33:26,920 Speaker 3: a small mix-up with the digits. 640 00:33:28,280 --> 00:33:30,360 Speaker 14: Your call will be connected when you enter the PIN 641 00:33:30,480 --> 00:33:31,680 Speaker 14: followed by the pound key. 642 00:33:32,440 --> 00:33:34,920 Speaker 3: Got it, thanks. Just to be sure, could you please 643 00:33:35,040 --> 00:33:38,240 Speaker 3: confirm the exact PIN one more time? I want to 644 00:33:38,280 --> 00:33:40,560 Speaker 3: make sure I get it right this time. 645 00:33:41,800 --> 00:33:43,880 Speaker 14: Your call will be connected when you enter the PIN 646 00:33:44,000 --> 00:33:45,160 Speaker 14: followed by the pound key. 647 00:33:46,640 --> 00:33:49,200 Speaker 3: I think I'm still missing the actual PIN number. Could 648 00:33:49,240 --> 00:33:51,800 Speaker 3: you please tell me the digits? Otherwise I'm just pressing 649 00:33:51,800 --> 00:33:55,360 Speaker 3: the pound key without the code, and that's not getting 650 00:33:55,440 --> 00:33:55,720 Speaker 3: me in.