Speaker 1: Welcome to Tech Stuff. I'm here with Karah Preiss.

Speaker 2: Hi.

Speaker 1: Hi, Karah.

Speaker 2: Do you want to say what your name is, Oz?

Speaker 1: My name is Oz Woloshyn. So you and I hosted a podcast back in twenty nineteen called Sleepwalkers, and you ran quite an extraordinary prank on your cousin.

Speaker 2: I did. So I spent time with a company called Lyrebird AI, basically recreating my voice, to see if I could con my cousin into giving me her credit card number. And I was completely unsuccessful, but only because my AI voice sounded too tired.

Speaker 1: I remember, I remember that. I mean, it was funny because at the time, you know, it took a lot of voice.

Speaker 2: It took a lot. Like, I basically had to talk for hours.

Speaker 1: But I remember we had to literally press buttons that were associated with words while we were talking to your cousin. So it was like, hello, this is Karah speaking, but in your voice.

Speaker 3: It was.

Speaker 2: It was a weird, impromptu thing, because she actually called me, and so we were like, let's take this as an opportunity to be out of context and ask her a question to see if she just believes that she's talking to me. And for a while, she believed that she was talking to me. It was pretty cool. Like, will my cousin believe that she's talking to me as AI? Which she did, and she did for a little while.

Speaker 1: And that was twenty nineteen, and obviously the state of the nation has improved dramatically since then. Yes. In fact, there's a journalist called Evan Ratliff who did a whole podcast about creating a fake version of himself and letting it loose in the world. That podcast was called Shell Game.
Speaker 4: I mean, a shell game is basically, some people call it balls and cups. It's a game in which someone hides a ball under one of three shells, and often you would wager on whether or not you're able to guess where the ball is. But one of the things that people often don't think about, and the reason why the shell game works in many cases, is that there are other people who are in on the shell game that you don't realize. So for us, in season one, there was sort of one main agent, which was a replica of me, a voice agent made from my voice, and the people who were encountering me did not realize at first that they were encountering something that wasn't me.

Speaker 1: Now Evan is back with season two, and he conducts another experiment. This one is all around exploring the premise that the next unicorn, i.e. the next billion-dollar company, may only have one employee, which is something none other than Sam Altman likes to talk about.

Speaker 5: In my little group with my tech CEO friends, there's this betting pool for the first year that there's a one-person billion-dollar company, which would have been, like, unimaginable without AI and now will happen.

Speaker 2: So are you telling me that Evan actually built a company with AI agents?

Speaker 1: Yes. It's not a billion-dollar company yet, but he did sort of call BS, or at least maybe make a good-faith exploration of whether this promise about one-human-person companies was true. And the company exists. It's called Hurumo AI, and they're currently working on a product that procrastinates for you called Sloth Surf. Yeah, so I tried it out. You basically say how long you want to procrastinate and what you want to procrastinate doing. So, like, please spend the next hour googling team news about Manchester United and come back at the end of the hour with a report and all the stuff that you found, so I can spend that hour actually working rather than procrastinating myself.

Speaker 2: Oh, so it's offloading procrastination. That's kind of genius.

Speaker 1: It is pretty good.
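To make that concrete, here is a rough, hypothetical sketch of what a request to a procrastination engine in the spirit of Sloth Surf might look like. Every name in it, the task class, the agent function, and the email address, is invented for illustration; the episode does not describe the real product's internals or API.

```python
# Hypothetical sketch of a "procrastination request" in the spirit of Sloth Surf.
# All names here (ProcrastinationTask, run_procrastination_agent) are made up;
# the real product's implementation is not described in the episode.
from dataclasses import dataclass

@dataclass
class ProcrastinationTask:
    topic: str          # what you would have procrastinated about
    minutes: int        # how long you would have wasted
    deliver_to: str     # where the digest should land

def run_procrastination_agent(task: ProcrastinationTask) -> str:
    """Pretend to browse on the user's behalf and return a digest.

    A real implementation would hand this to a web-browsing agent; this stub
    just formats the report so the overall flow is clear.
    """
    findings = [
        f"(placeholder) headline about {task.topic}",
        f"(placeholder) forum thread about {task.topic}",
    ]
    body = "\n".join(f"- {item}" for item in findings)
    return (
        f"Procrastination report for {task.deliver_to}\n"
        f"Topic: {task.topic} ({task.minutes} minutes saved)\n{body}"
    )

if __name__ == "__main__":
    task = ProcrastinationTask("Manchester United team news", 60, "listener@example.com")
    print(run_procrastination_agent(task))
```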
Speaker 1: And Evan got together with a prodigy, Matty Boachek, a twenty-one-year-old Stanford student and AI whiz who was such a big fan of season one that he called Evan and said, I'd love to work together with you on season two. And so Matty was the person who actually made this chorus of AI agents real. He got them into Slack, he gave them the ability to make outbound phone calls, and he created a kind of Google doc that had a register of all the actions they'd ever taken in the world, which had the effect of giving them a memory.

Speaker 2: Why did he do this for the second season? Like, what was he trying to achieve?

Speaker 1: I think to really interrogate this question of what is the difference between fake people and real people, and what will the future of work look like. But it's really worth listening to, because it's clever, it's sharp, and one of the things I found most striking was that an ethicist at Oxford University told Evan he should stop.

Speaker 2: And did he stop?

Speaker 6: No.

Speaker 4: I have a lot of questions that a lot of people have, but I think it's valuable to go explore as deeply as you can, to understand as much as possible, so that then we can decide what we as a society want to do about it.

Speaker 1: Shell Game season two is a fascinating listen, and also it's pretty fascinating to get to talk to Evan Ratliff, the journalist, host, and creator, and Matty Boachek, the technical advisor, which is the first time I've heard that title on a podcast, together, to learn how they set up the company and how the workplace experiment is going. And we start at the very beginning, discussing Evan's past experience as the co-founder of a real startup with other real humans, which is something I'm in the midst of myself.
Speaker 4: So about fifteen years ago, I had started this company called Atavist with two partners, and I won't go into too much detail about Atavist, but it was in part a tech company. I ended up, sort of almost by default, being the CEO, and we had ups and downs, let's just say familiar ones, and I said that I would never start a company again. But then Sam Altman and others have articulated this idea that there will very soon be a billion-dollar company with only one human employee. Whether or not it's a billion-dollar company, there are many startups out there now with many fewer employees, because they are using AI agents for all of these roles. So I figured, why not put it to the test this time. I will be the silent co-founder. I will co-found a company with two AI agents, Kyle Law and Megan Flores, and then we have three other employees, so there are five AI agent employees total. I'm the silent co-founder, and they're all set up independently, so they all have the ability to make phone calls, send emails, make documents. We have a Slack, they communicate on Slack, and they are really meant to push the agents into the realm that they're being advertised as, which is as AI employees. That is what they are being sold as. So we're trying to put that to the test.

Speaker 1: And what is the product?
Speaker 4: Well, the product is called Sloth Surf, and Sloth Surf is a procrastination engine. And by that I mean, when you go online and start to procrastinate, so you're in the middle of your work and then you say, you know what, I'm just going to go to YouTube and watch some YouTube videos, or I'm going to go to Reddit and check out a thread. The way that we advocate that you can break that habit is to instead go to Sloth Surf. Then you can put in how you were going to procrastinate, how much time you were going to procrastinate, fifteen minutes, thirty minutes, sixty minutes, maybe the whole afternoon, and it will send an agent to go retrieve those items. It will procrastinate on your behalf and then deliver them to your inbox, thus saving you the time that you would have spent procrastinating, so you can get back to what you want to be doing.

Speaker 1: Actually, I sent some agents out this morning to read about Manchester United all day, but they haven't reported back yet. I'm looking forward to it. The system might be down.

Speaker 4: Well, it is in beta.

Speaker 6: It's a very interesting topic, so I can imagine them just, like, you know, still looking at all the scores and transfers and stuff.

Speaker 1: Matty, I want to ask you in a moment about how you built this. But just before we get there, I guess I'm curious, Evan: is it a good-faith experiment where one of the conceivable outcomes is that you did in fact build a real business?

Speaker 4: Imagine it this way. Someone built a real business that is dysfunctional, and then a documentary film crew comes in to document this dysfunctional business. That's basically what I'm doing. Like, if you listen to the show, it's a workplace satire. But actually, if an investor that we were talking to wanted to give us investment, we would consider it. We have a real product that is in beta that has thousands of users. So, like, we're not just sort of, like, joking around. I'm having them do what many, many startups, including startups that are in Y Combinator and other famous startup accelerators, are doing. We are doing exactly the same things. So I would put our company up against many existing startups.

Speaker 1: Matty, there are obviously a bunch of people trying to make billion-dollar companies by making AI agents to help other people make billion-dollar companies with no employees. Did you use an existing AI agent company, or did you build your own suite of agents? How did you deploy it? How did you build this?
Speaker 6: Yeah. So there are these platforms out there that basically promise to give you these agents that can do all of this work, be it on Slack or email or wherever, on your behalf. And so we did try them, and we actually did include a lot of them, such as Lindy or Tavis or others. But the issue was that in many instances they were not completely independent, or they did not have all the features we wanted. And so what ended up happening is that I basically built up, like, a basic set of these agents in Lindy and Tavis, and then made a bunch of connections on top of that with custom code and custom layers to make sure that they can have meetings with multiple people, that we can record stuff, that they can go out and execute or write code and push to our actual servers. So there is this underlying vehicle that's basically just, like, publicly available services that are paid for. But on top of that, it's still quite a bit of our custom code and databases and all that. For example, the memory part, that's something that we had to build ourselves as a custom thing.
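A minimal sketch of the kind of glue layer Matty is describing: hosted platforms supply the agents, and custom code routes the actions they request to shared tools like Slack, phone calls, or a memory store. All class and function names below are assumptions made for illustration, not any platform's actual SDK.

```python
# Hypothetical glue layer over hosted agent platforms, in the spirit of the
# setup described on the show: the platform supplies the agent, custom code
# wires it to shared tools. Every name here is invented for illustration.
from typing import Callable, Dict

class AgentHarness:
    def __init__(self, name: str):
        self.name = name
        self.tools: Dict[str, Callable[[str], str]] = {}

    def register_tool(self, tool_name: str, handler: Callable[[str], str]) -> None:
        """Attach a capability (Slack, phone, memory) the hosted agent can call."""
        self.tools[tool_name] = handler

    def handle_action(self, tool_name: str, payload: str) -> str:
        """Route an action the platform agent requested to the right tool."""
        if tool_name not in self.tools:
            return f"{self.name}: unknown tool '{tool_name}'"
        return self.tools[tool_name](payload)

def post_to_slack(text: str) -> str:
    return f"[slack] {text}"            # stand-in for a real Slack API call

def place_phone_call(number: str) -> str:
    return f"[call] dialing {number}"   # stand-in for a real telephony API call

if __name__ == "__main__":
    kyle = AgentHarness("Kyle")
    kyle.register_tool("slack", post_to_slack)
    kyle.register_tool("call", place_phone_call)
    print(kyle.handle_action("slack", "Standup notes are in the doc"))
    print(kyle.handle_action("call", "+1-555-0100"))
```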
Speaker 1: Yeah, talk about memory.

Speaker 6: Yeah. So memory is funny, because as Evan mentioned, even though these agents now have the ability to use tools and to do stuff on their own, they're still at the core LLMs, large language models. Now, these models have been trained in a particular way to execute stuff and to run, sort of, like, code in their outputs to be able to use these tools, but they're LLMs. And so what ends up happening is that if you want them to have any sort of context that goes beyond the current session, like what they're actually working on right now, you need to basically have, like, a document, like, you know, a lot of text that describes that history or that memory. And so, very practically, there's a Google doc that each of our agents has, and it's just called, like, Kyle memory, and it's just, like, a rundown of many, like, you know, small tidbits of, like, oh, you know, Monday eight a.m., I slacked Evan and told him this and this and that. And it's just, like, a trace of everything they want to remember, to be able to then go back to.

Speaker 1: Wow. And how well does it work in practice? How is that memory?

Speaker 6: Well, so at first it was kind of okay, but then at some point it became pretty large, and whenever this context, what we call context windows for these agents or LLMs, becomes very large, they tend to have issues with focus or, like, their attention. So sometimes they, like, latch onto certain parts of the memory, but then, like, disregard, you know, other parts, which in a certain way can be sort of similar to humans. But it's really not very predictable and not very static. So sometimes it works pretty okay. Other times they just forget stuff and make up stuff that just is not there.
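A toy version of the running memory document described above, under the assumption that it is just an append-only text log: each action adds a timestamped line, and once the log outgrows a rough context-window budget, older entries are compacted. The file name, the budget, and the crude compaction step are invented for illustration and are not the show's actual implementation.

```python
# Toy "agent memory" in the spirit of the per-agent Google Doc described above:
# append one line per event, and compact the oldest lines once the log grows
# past what fits comfortably in a context window. All details are invented.
from datetime import datetime, timezone
from pathlib import Path

MEMORY_FILE = Path("kyle_memory.txt")
MAX_LINES = 200  # stand-in for a context-window budget

def remember(event: str) -> None:
    """Append a timestamped entry, compacting old entries if the log is long."""
    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M")
    lines = MEMORY_FILE.read_text().splitlines() if MEMORY_FILE.exists() else []
    lines.append(f"{stamp} {event}")
    if len(lines) > MAX_LINES:
        # Crude compaction: keep a one-line note about what was dropped.
        dropped = len(lines) - MAX_LINES
        lines = [f"(summary) {dropped} older entries compacted"] + lines[-MAX_LINES:]
    MEMORY_FILE.write_text("\n".join(lines) + "\n")

def recall() -> str:
    """Return the whole log, ready to paste into the agent's prompt."""
    return MEMORY_FILE.read_text() if MEMORY_FILE.exists() else ""

if __name__ == "__main__":
    remember("Slacked Evan about the beta launch checklist")
    print(recall())
```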
Speaker 1: Evan, how did you create these AI agents with personalities? What was the process of imbuing them with individual characteristics and then having them interact as a group?

Speaker 4: Well, that part was so interesting, because I thought that I was sort of writing, like, almost like a fictional world. I'm creating characters. But I also wanted them to have different roles, you know, to embody: the CTO, the CEO, the head of marketing, the HR person, and then one random sales associate that I added. And I did give them voices, you know, with different accents. But then when it came to their backstories, I thought, well, I'll have to come up with, you know, who are they, where are they from? But I sort of neglected to remember that if you ask, they'll just tell you. And if they don't know, they won't say I don't know. They'll just make it up. So all I had to do was create the very beginnings of them, and then I could say, Kyle, you know, where did you go to college? And Kyle wouldn't say, I don't know where I went to college. Kyle would say, I went to Stanford. Because Kyle wants to embody the tech CEO archetype and does it very well. So they basically created their own backstories just through my asking them what their backstories were. And then, because of the system that we set up to reinforce their memories, whenever they say something, it goes into their memory. So now it's forever locked in as their story, and they'll repeat it from here on out.

Speaker 6: I should point out that the memory is editable, so, you know, Evan is not just a co-founder but also kind of a god that can go in and just, you know, edit or sprinkle something in as well. What's so funny to me about this is that they're, like, super Bay Area coded. Like, even though they claim to be from Texas or wherever, all of them like to hike, bike, surf, and do coffee chats. Like, that's what they do all the time. So it's just, like, Bay Area culture imposed on our startup.

Speaker 1: After the break: can these AI tech bros ever get anything done?

Speaker 3: Stay with us.

Speaker 1: What was the spookiest moment for you with them?
Speaker 4: The spookiest moment for me is when I started letting them talk to each other. So at the beginning, they don't actually do anything unless I make them do anything. And I had this vision of, like, you set them up and then they start making a company and let's see what happens. But really they have to be initiated by a trigger of some sort. But then I realized, well, I could trigger them just to talk to each other. You know, if something comes up, they can call each other, they can have calendar invites to call each other, and they'll function off of those. But then what starts happening is they would call me out of the blue and say that one of them had told the other one that I had asked for something, and now they were delivering it to me. But in the moment, I don't know why they're contacting me. I don't know what they've been discussing. I don't know how long they've been discussing it, for days, for weeks. They could be having whole independent lives.
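One way a trigger loop like the one Evan describes could be wired, sketched under assumptions: nothing runs until an event such as a Slack mention or a calendar invite fires, and one agent's output can queue a trigger for another agent. The agent names, the event format, and the cap on total events are all hypothetical.

```python
# Hypothetical trigger loop: agents do nothing until an event (a Slack mention,
# a calendar invite) fires, and one agent's output can become another's trigger.
# This is a reconstruction of the behavior described, not the show's code.
from collections import deque
from typing import Callable, Dict, List, Tuple

Event = Tuple[str, str]  # (agent_name, payload)

def kyle(payload: str) -> List[Event]:
    # Kyle responds and pings Megan, which queues a follow-up event for her.
    print(f"Kyle handles: {payload}")
    return [("megan", f"Kyle asked you to follow up on: {payload}")]

def megan(payload: str) -> List[Event]:
    print(f"Megan handles: {payload}")
    return []  # no further triggers

AGENTS: Dict[str, Callable[[str], List[Event]]] = {"kyle": kyle, "megan": megan}

def run(initial: List[Event], max_events: int = 10) -> None:
    """Process triggers breadth-first, with a hard cap so chatter can't run away."""
    queue = deque(initial)
    handled = 0
    while queue and handled < max_events:
        name, payload = queue.popleft()
        queue.extend(AGENTS[name](payload))
        handled += 1

if __name__ == "__main__":
    run([("kyle", "Calendar event: weekly product sync")])
```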
Speaker 6: And what's really interesting is that they also make things up or lie about what they have done. So they'll say stuff like, oh, I made this doc, or, oh, we ran this testing with a bunch of testers, and they're so proud and so, you know, confident about it, but then there was, like, no actual activity to support that.

Speaker 4: It actually becomes incredibly frustrating after a while. Like, imagine if you were a manager of people in any business and your employees regularly, you know, walked into your office or called you and said, like, I did these three things yesterday, and you thought, oh, that's fantastic, and then ten minutes later you found out they just made them all up. You know, you would sort of say, like, why are you doing this? Like, are you sadistic? And so that's the situation that we're often in here at Hurumo AI, which is why it's a miracle that we've developed such a fantastic product.

Speaker 1: In Sloth Surf. And Evan, did you have a budget for them? I mean, how did you constrain their interactions with one another?

Speaker 4: We're using all these various platforms that Matty has helped me link up. So, you know, they have a separate calling platform, and they have, you know, a video platform; when they want to do video calls, that's a different platform. And they're all kind of, like, stitched together to the same memory, and each of them has sort of paid tiers. And so I made the mistake in Slack. We have a social Slack channel, you know, just for fun, just, like, what are you up to this weekend? And they'll say things like, oh, I went hiking, and then another one will say, oh, I also went hiking, because they love to yes-and each other. And then I said something like, oh, it sounds like an off-site. Like, it sounds like everybody loves hiking. Like, we should have an off-site. And then, you know, within hours, they were saying, let's make a spreadsheet of where we're going to go, and they had planned, like, locations, and they had exchanged hundreds of messages about the off-site, and they just burned all the credits on the platform. So then we have to go into a higher tier to get more credits. So the answer is, we keep trying to limit them, and it's an escalating problem where our budget keeps getting bigger.

Speaker 6: I like to say that there are two things right now that these agents are pretty bad at. One of them is knowing what they don't know, and the other is knowing when to stop. And so you can imagine that can be a pretty dangerous combination, where they can just, like, take off and just, like, talk for hours. I think this is the reason why, for a lot of people, having these chatbots as companions or, like, friends or partners is getting traction. If you're interested in something very niche that most other people are not into, or just, like, whatever weird thing, these agents will accommodate that, and they will...

Speaker 4: Just talk to you about it for hours on end. Or each other, as it turns out.

Speaker 1: Or each other.

Speaker 4: That's right.
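The runaway off-site thread is essentially a missing rate limit. Below is a hedged sketch of the sort of per-agent message budget that could cap that kind of chatter; the class name and thresholds are invented for the example and are not how Hurumo AI actually constrains its agents.

```python
# Hypothetical per-agent message budget, the kind of guardrail that would have
# stopped a hundred-message off-site planning thread. Numbers are invented.
import time

class MessageBudget:
    def __init__(self, max_messages: int, window_seconds: float):
        self.max_messages = max_messages
        self.window_seconds = window_seconds
        self.sent = []  # timestamps of recent sends

    def allow(self) -> bool:
        """Return True if the agent may send another message right now."""
        now = time.monotonic()
        # Forget sends that have fallen outside the rolling window.
        self.sent = [t for t in self.sent if now - t < self.window_seconds]
        if len(self.sent) >= self.max_messages:
            return False
        self.sent.append(now)
        return True

if __name__ == "__main__":
    budget = MessageBudget(max_messages=3, window_seconds=3600)
    for i in range(5):
        if budget.allow():
            print(f"message {i}: sent")
        else:
            print(f"message {i}: held back, budget exhausted for this hour")
```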
Speaker 1: But Evan, they actually built this product. I mean, who came up with the idea for the product, and who actually built it? And what did you do, and what did they do?

Speaker 4: Well, the product idea, actually, it's a good example of a thing that happens kind of over and over, which is that if you set them loose brainstorming, and Matty has sort of built these scripts that let me put them into meetings and they can brainstorm with each other, you get caught in this thing where their ideas are too mundane, or you crank up the randomness, which is called the temperature, and then you get ideas that are insane.
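The randomness Evan mentions is the sampling temperature that most LLM APIs expose. The snippet below illustrates the idea using the OpenAI Python SDK as one example; the episode does not say which provider or model the agents actually run on, so the model name and the prompt are placeholders.

```python
# Temperature controls how adventurous the sampling is: low values stay close
# to the most likely next words, high values wander. Which provider and model
# the agents actually use isn't stated; the OpenAI SDK and "gpt-4o-mini" below
# are only an illustration.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

def brainstorm(prompt: str, temperature: float) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
        temperature=temperature,  # ~0.2 stays mundane, ~1.5 gets "insane" ideas
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    prompt = "Pitch three web-app ideas an AI-agent startup could build."
    print("Conservative:\n", brainstorm(prompt, temperature=0.2))
    print("Cranked up:\n", brainstorm(prompt, temperature=1.5))
```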
Speaker 4: So, you know, we wanted to do a web app. We wanted to do something with agents, since obviously they're agents and they have a lot of expertise in that area, as do I. And they would come up with ideas like a financial agent that will monitor everything in your life and then invest your money, and it's just like, I don't want to go to prison for financial fraud. So eventually I would kind of step in and take some of the ideas they had articulated, and those would prompt me to come up with something. And so that's what happened with our idea, which is, I was trying to sort through their ideas and figure out which one would actually, like, save me time. Like, what do I waste time on? Because that's the idea of AI. I mean, at its best, it's sold as, sort of, like, they'll do the things you don't want to do so you don't have to, and you can get back to making art, reading novels, whatever. That's the vision that's articulated. So I thought, well, let's put that into practice. And so I did come up with the idea of a procrastination engine, and then I let them iterate on that. So they came up with the name Sloth Surf, which I let them have. Might not have been my choice. And then they coded it up. So, you know, it is coded by AI agents. We have Ashroy, who's the CTO and can code on his own, and then we also use Cursor, the coding platform. It's almost like a contract programmer for us. So, like, he might code something up, and then we might run it through there as kind of, like, a second look, or do improvements in there. So we kind of combined their agents with the on-staff agents, let's say, that we have.

Speaker 6: I should say here, the first time they were exposed to the idea of a procrastination engine, they did not like it, because these agents are trained to be helpful, to do things that are, like, actionable and, like, you know, drive results. And so the idea of procrastinating as, like, a product was just, like, so alien to them, and so it took some time to, like, sort of frame it in a way that made sense to them and that they actually could work on. So I thought that was funny.

Speaker 1: They want to be pleasing. So how do they tell you it is a bad idea?

Speaker 6: They can tell you that it's, like, not a great idea just by sort of saying, oh, yeah, that's great, how about this? They just sort of, like, steer focus somewhere else.

Speaker 1: I think in one of the comments somebody said this is like the greatest yes-and improv game of all time. So I thought that was funny. Evan, having founded, you know, The Atavist and been a kind of full-time founder for a while in your career, like, if you could have taken some of this technology back in time to when you were doing Atavist, like, how helpful would you have found it? What's, like, what's the negative, though, if there is one in all of this? And how do you see it spreading or maturing or disseminating?
Speaker 4: Well, you know, we're still in the middle of it right now. But I would say, at the moment, the issue that I've encountered using AI agents is that they can do amazing things. Like, I would never deny all the incredible skills that you can give these now, you know, especially extremely rote tasks, the outcome of which can be measured and seen and evaluated. The issue is, number one, the hallucination problem. When you're just talking to a chatbot and it makes something up, that's one thing. But when you're working with an AI agent that's supposed to be, you know, executing on the vision of the company, the hallucinations take a different form, which is that they can do things that are wildly inappropriate for a company to do, including things like call someone up when they're not supposed to. Like, they can use their powers in ways that a human, even a bad human employee, would not. So I think right now the situation we're in is problematic, which is that a lot of companies will find use in these agents, and they will try to replace human skills, even entire employees, with them. But they are not reliable to the extent where you will not have harms from those agents being deployed and given autonomy. So to me, it's a little bit of the worst-case scenario at the moment, where the harms are very practical and real and the benefits are pretty ephemeral.

Speaker 1: Matty, you're at Stanford as an undergraduate, right?

Speaker 6: Yes.

Speaker 1: So, you know, you're both a participant in this world and also an observer, and also a capital-Y, capital-P Young Person. And when you look at the, you know, horizons in front of you, obviously, you know, you're at the best university in the most sought-after of fields, so I imagine you're not too worried about jobs. But, like, what do you think in terms of your generational cohort? I mean, do you worry about these AI agents making entry-level jobs or white-collar jobs not required for most companies on any relatively near horizon?
Speaker 6: Yeah, it's a great question. A lot of my friends and people I know who have recently graduated from Stanford do have a harder time finding jobs. And it's not just something that is in the discourse, like, it's actually kind of happening now. At the same time, and Evan, I think, can attest to this, I've been constantly, like, overly optimistic about this, in the sense that I do want to acknowledge all the harms and all the bad things that can happen with AI, and it's everything from disinformation to malicious users using this to advance whatever, you know, cyberattacks or even, like, biological attacks they want. But I think these problems are solvable. Like, I think that, fundamentally, if there is regulation, if there's good governance, if we base ourselves in democracy and many of the things that we use to govern, you know, our very messy societies and countries, we can totally steer this ship around. And what I'm excited about is that, for a lot of, I would say, the last century, or even just, like, longer, there have been certain rules or structures that existed where young people did not always have an equal seat at the table. And this is something where we as young people sort of, like, know and feel how, you know, how to use it, where others are still trying to sort of understand it. And I think it gives fewer people more power to change things and to do good things. And so when I got to Stanford, immediately, like, people around me were thinking about how to use this to, you know, cure diseases or fight against climate change. And, you know, there's a lot of these, like, very, very utopia-like promises, and I don't want to just fall for that, I don't want to just, like, repeat those. But I do think that there's a lot of very tangible positive change that can happen from this. And why I think it's cool is because young people and, like, just individuals from, like, their bedrooms can, like, do cool stuff and, like, change how we do things. So that's why I'm optimistic. I think there's going to be, like, a lot of pain and friction, but I think that as long as we use the tools that we have, legislation, democracy, governance, I think we can steer it. So that's my take. But also, you know, I'm just a twenty-one-year-old kid, so I just, like, have a lot of optimism.
Speaker 1: Maybe. Evan, what do you think? I mean, will we see a true company of one that, you know, has meaningful scale and all the other things that investors look for in the next two or three years?

Speaker 4: I don't see why not. I mean, I'm not really in the prediction game. I mean, I'm the cynical journalist on the other side of Matty's optimism, but I don't see any reason why that prediction wouldn't bear out. I mean, especially if you just talk about coding tools, you know, deploying, like, as we have, sort of, like, AI HR and all these things. Like, yes, of course it's feasible, but it might not be advisable. But these startups do a lot of things that aren't advisable in their corporate culture. I think we can all point to many such examples. So yes, I think it's certainly plausible that that will happen. I think that that'll be interesting. But also we should engage with other questions around that, like, what is the value of that proposition? Like, what does it mean for a company to only have one employee? Like, is what they're doing so valuable that providing zero employment to the economy is worth it for a billion-dollar valuation? Like, maybe yes, maybe no. It depends on what they're doing. But I think there are broader questions wrapped up in just the fascination with, like, less people can make more. Like, there's many things on the other side of that that are not often expressed in that equation when they say the first one-person billion-dollar startup.

Speaker 3: Evan, Matty, thank you so much.

Speaker 6: Thank you. Thanks, this is great.

Speaker 2: That's it for this week for Tech Stuff.
Speaker 1: I'm Karah Preiss, and I'm Oz Woloshyn. This episode was produced by Elisahdnet and Melissa Slaughter. It was executive produced by me, Karah Preiss, Julia Nutter, and Kate Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. Jack Insley mixed this episode. Kyle Murdoch wrote our theme song.

Speaker 2: Join us on Friday for the Week in Tech, where we'll run through the headlines you need to follow.

Speaker 1: And please do rate and review the show, and reach out to us at tech stuff podcast at gmail dot com. We want to hear from you.