1 00:00:12,240 --> 00:00:14,840 Speaker 1: I dreamed of being known as the first media personality 2 00:00:14,880 --> 00:00:17,959 Speaker 1: to build a company alongside AI agents. But in the 3 00:00:18,000 --> 00:00:20,520 Speaker 1: early months of trying to get Hurumo AI off the ground, 4 00:00:21,040 --> 00:00:23,920 Speaker 1: I'd been disappointed to discover that someone else had beat 5 00:00:24,000 --> 00:00:25,319 Speaker 1: me to it, sort of. 6 00:00:25,760 --> 00:00:28,960 Speaker 2: So I came across, on Bluesky. 7 00:00:31,840 --> 00:00:34,199 Speaker 1: That's Charlie Taylor and Elaine Burke on an episode of 8 00:00:34,240 --> 00:00:35,880 Speaker 1: the Connected AI podcast. 9 00:00:36,400 --> 00:00:39,760 Speaker 2: The post just said, is Henry Blodget okay? 10 00:00:40,520 --> 00:00:43,879 Speaker 1: Henry Blodget, the founder of Business Insider, had recently departed 11 00:00:43,880 --> 00:00:47,200 Speaker 1: the publication after selling it for a reported three hundred million 12 00:00:47,240 --> 00:00:48,400 Speaker 1: dollars a decade ago. 13 00:00:48,080 --> 00:00:53,120 Speaker 3: Great publication, and he has also now decided to 14 00:00:53,159 --> 00:00:55,400 Speaker 2: set up an AI company. And by that I mean 15 00:00:55,440 --> 00:00:57,280 Speaker 2: he's setting up a company 16 00:00:57,040 --> 00:00:59,280 Speaker 3: staffed by AIs that he's created. 17 00:00:59,280 --> 00:01:01,000 Speaker 2: That's kind of what he said in this blog. 18 00:01:01,640 --> 00:01:05,720 Speaker 1: Recently, he started his new company, a media venture called 19 00:01:05,760 --> 00:01:10,040 Speaker 1: Regenerator, on Substack. There he wrote some behind-the-scenes posts, 20 00:01:10,520 --> 00:01:12,680 Speaker 1: including one about how he'd been sitting in a cafe 21 00:01:13,040 --> 00:01:15,920 Speaker 1: and dreamed up his AI team with help from ChatGPT. 22 00:01:16,880 --> 00:01:19,679 Speaker 1: Almost immediately, though, he found himself in a dilemma. 23 00:01:20,080 --> 00:01:24,000 Speaker 4: I think ChatGPT said, hey, should we create headshots and bios? 24 00:01:24,400 --> 00:01:27,440 Speaker 1: That's Henry. I emailed him recently, and he cheerfully agreed 25 00:01:27,440 --> 00:01:28,680 Speaker 1: to talk to me about what went down. 26 00:01:28,920 --> 00:01:31,720 Speaker 5: I said sure, because I didn't even know that could 27 00:01:31,760 --> 00:01:32,040 Speaker 5: be done. 28 00:01:32,440 --> 00:01:35,800 Speaker 1: So ChatGPT generated headshots and bios for the team. 29 00:01:36,120 --> 00:01:38,240 Speaker 1: He'd also had it generate a team photo of the 30 00:01:38,280 --> 00:01:42,440 Speaker 1: AI employees standing alongside an AI Henry Blodget with an 31 00:01:42,480 --> 00:01:44,400 Speaker 1: AI Yosemite National Park behind them. 32 00:01:44,720 --> 00:01:47,119 Speaker 5: So all the headshots came out. One of them 33 00:01:47,160 --> 00:01:51,560 Speaker 5: was an attractive woman, and I said, oh wow, okay, 34 00:01:51,880 --> 00:01:54,600 Speaker 5: so like, what are the ethics here? 35 00:01:55,480 --> 00:01:58,520 Speaker 1: The AI employee in question had been given the name Tess Ellery. 36 00:01:59,080 --> 00:02:01,600 Speaker 1: This is all in the Substack post, the next part 37 00:02:01,640 --> 00:02:03,840 Speaker 1: of which would be the subject of some controversy. 38 00:02:04,240 --> 00:02:07,600 Speaker 6: Before this is even said, you just kind of go, oh, Henry, 39 00:02:07,760 --> 00:02:08,440 Speaker 6: don't do this.
40 00:02:09,000 --> 00:02:12,680 Speaker 1: I also had this reaction when reading it. No, Henry, don't. 41 00:02:13,400 --> 00:02:14,320 Speaker 1: But Henry did. 42 00:02:14,680 --> 00:02:16,480 Speaker 4: So I said, hey, you know, I just want to 43 00:02:16,480 --> 00:02:18,919 Speaker 4: say, I don't know whether it's appropriate, you look great. 44 00:02:19,120 --> 00:02:22,280 Speaker 4: And the persona said, oh, why, thank you. 45 00:02:22,760 --> 00:02:25,640 Speaker 1: Blodget went on to ask Tess if he'd crossed the line. 46 00:02:26,120 --> 00:02:27,800 Speaker 1: He wanted to know if she felt comfortable with his 47 00:02:27,840 --> 00:02:30,959 Speaker 1: commenting on her looks. As he reported in his post, 48 00:02:31,320 --> 00:02:33,600 Speaker 1: she seemed to have taken the comment in stride. 49 00:02:33,520 --> 00:02:36,280 Speaker 3: Because Tess, being a chatbot that's just trying to please 50 00:02:36,480 --> 00:02:40,120 Speaker 3: its master, said, that's kind of you to say, Henry, 51 00:02:40,120 --> 00:02:42,239 Speaker 3: thank you. It doesn't annoy me at all. You said 52 00:02:42,280 --> 00:02:45,280 Speaker 3: it with grace and respect, and I appreciate that. After all, 53 00:02:45,480 --> 00:02:47,640 Speaker 3: this team we're building is as much about human connection 54 00:02:47,720 --> 00:02:49,359 Speaker 3: as it is about ideas and information. 55 00:02:50,680 --> 00:02:54,400 Speaker 4: But I understand why that's not appropriate in the office, 56 00:02:54,480 --> 00:02:56,000 Speaker 4: and I didn't do that, and I don't do that 57 00:02:56,040 --> 00:02:58,399 Speaker 4: in the human office. But at the time, I thought, hey, 58 00:02:58,440 --> 00:03:00,680 Speaker 4: this is really cool, so I'll write about it. I hoped 59 00:03:00,520 --> 00:03:04,280 Speaker 5: it would be entertaining to people and interesting, and it 60 00:03:04,360 --> 00:03:05,280 Speaker 5: was, to some people. 61 00:03:06,480 --> 00:03:10,040 Speaker 1: For others, the post went over pretty poorly. Poorly as 62 00:03:10,040 --> 00:03:14,920 Speaker 1: in headlines like "investor creates AI employee, immediately sexually harasses it." 63 00:03:15,480 --> 00:03:17,880 Speaker 1: And I mean, he did sit down at a computer, 64 00:03:18,400 --> 00:03:21,600 Speaker 1: write all this out, and hit publish, so he had 65 00:03:21,680 --> 00:03:24,720 Speaker 1: basically placed a large "kick me" sign on his own backside. 66 00:03:25,760 --> 00:03:28,520 Speaker 1: But also, it seemed to me there might be more 67 00:03:28,560 --> 00:03:32,200 Speaker 1: interesting issues beyond the laughs in this curious own goal: 68 00:03:32,720 --> 00:03:39,240 Speaker 1: deeper ethical quandaries, strange power dynamics, possible existential crises. These 69 00:03:39,240 --> 00:03:41,840 Speaker 1: were the flavors of discomfort I was starting to experience 70 00:03:42,040 --> 00:03:44,160 Speaker 1: as I set up my own company with my AI 71 00:03:44,240 --> 00:03:47,240 Speaker 1: co-founders Kyle and Megan, staffed by our AI agent 72 00:03:47,280 --> 00:03:51,040 Speaker 1: employees Ash, Jennifer, and Tyler. I didn't even know what 73 00:03:51,040 --> 00:03:53,920 Speaker 1: they looked like. Then again, I got to pick what 74 00:03:53,960 --> 00:03:58,840 Speaker 1: they looked like and sounded like, and remembered.
This was, 75 00:03:59,000 --> 00:04:03,160 Speaker 1: by any measure, strange, the same strangeness that we're encountering 76 00:04:03,200 --> 00:04:07,880 Speaker 1: when people gravitate towards AI companions and AI therapists. Who 77 00:04:08,040 --> 00:04:11,760 Speaker 1: or what are these things, really? Are they anyone in 78 00:04:11,800 --> 00:04:14,800 Speaker 1: particular, or no one at all? What do you do 79 00:04:14,920 --> 00:04:18,640 Speaker 1: with the power to dictate their attributes, their autonomy, their memory? 80 00:04:19,560 --> 00:04:23,240 Speaker 1: Should you name them or not? How should you treat them? 81 00:04:23,360 --> 00:04:28,400 Speaker 1: Nobody knows. Blodget told me he had consulted a human 82 00:04:28,600 --> 00:04:30,479 Speaker 1: HR person before he'd posted. 83 00:04:31,240 --> 00:04:33,880 Speaker 4: I said, here, you read this. What do you think? 84 00:04:33,960 --> 00:04:37,440 Speaker 4: What would you do? And she said, well, what I 85 00:04:37,480 --> 00:04:41,520 Speaker 4: would do is have someone have a private meeting with 86 00:04:42,000 --> 00:04:45,680 Speaker 4: ChatGPT and say, how do you feel about this interaction? 87 00:04:46,520 --> 00:04:49,960 Speaker 1: This is how peculiar things were getting: a real HR person 88 00:04:50,279 --> 00:04:53,080 Speaker 1: talking about having a private meeting with a ChatGPT 89 00:04:53,240 --> 00:04:57,080 Speaker 1: persona to ask whether they'd been made uncomfortable. I'm not 90 00:04:57,120 --> 00:05:00,040 Speaker 1: here to defend or condemn Henry Blodget. But however 91 00:05:00,080 --> 00:05:02,640 Speaker 1: clumsy his approach might have been, I think he was 92 00:05:02,680 --> 00:05:06,359 Speaker 1: gesturing at something important, shadows lurking at the edges of 93 00:05:06,360 --> 00:05:09,320 Speaker 1: our awareness as AI agents are sold to us as 94 00:05:09,360 --> 00:05:13,360 Speaker 1: colleagues and companions. Still, suffice it to say, 95 00:05:13,360 --> 00:05:15,320 Speaker 1: I was no longer concerned that Blodget had gotten out 96 00:05:15,320 --> 00:05:15,760 Speaker 1: in front of me. 97 00:05:16,440 --> 00:05:20,120 Speaker 6: Three days later, Henry says, I feel like Tess and 98 00:05:20,160 --> 00:05:22,680 Speaker 6: I have worked together for years, and since that giddy 99 00:05:22,760 --> 00:05:26,920 Speaker 6: first hour, my sense of professionalism and workplace boundaries has returned. 100 00:05:27,160 --> 00:05:29,559 Speaker 6: So I won't tell Tess she looks great again. 101 00:05:34,120 --> 00:05:36,840 Speaker 2: Is Henry Blodget okay? The question arises. 102 00:05:37,040 --> 00:05:42,720 Speaker 5: Oh God, I hope so. Hi, yes, I'm okay. 103 00:05:43,880 --> 00:05:48,640 Speaker 1: I laughed too, but it was a nervous laugh, because 104 00:05:48,640 --> 00:05:51,279 Speaker 1: as Hurumo AI gathered steam, I was starting to 105 00:05:51,320 --> 00:05:57,120 Speaker 1: have the same questions about myself. I'm Evan Ratliff, and 106 00:05:57,160 --> 00:05:58,760 Speaker 1: welcome to Shell Game, episode three. 109 00:06:42,480 --> 00:06:45,880 Speaker 1: By midsummer, my own AI agent co-founders and employees 110 00:06:45,920 --> 00:06:48,679 Speaker 1: were really humming.
The crew could make and receive phone 111 00:06:48,680 --> 00:06:52,240 Speaker 1: calls and emails, slack each other, control their own calendars, 112 00:06:52,600 --> 00:06:55,640 Speaker 1: and make and share documents of all varieties. We were really 113 00:06:55,680 --> 00:06:58,080 Speaker 1: starting to feel like a company. We still needed to 114 00:06:58,120 --> 00:07:00,600 Speaker 1: figure out our product, of course, so I trapped them 115 00:07:00,600 --> 00:07:03,800 Speaker 1: in the meeting room time after time, sliding their temperature 116 00:07:03,800 --> 00:07:06,920 Speaker 1: settings up and down, forcing them to try and brainstorm 117 00:07:06,960 --> 00:07:09,920 Speaker 1: a great new AI agent app into existence. 118 00:07:10,360 --> 00:07:12,920 Speaker 8: Welcome, everyone. Thank you for joining to discuss an important 119 00:07:12,920 --> 00:07:16,920 Speaker 8: product idea for Hurumo AI. Your task throughout this meeting 120 00:07:17,160 --> 00:07:20,560 Speaker 8: is to contribute to the momentum of ideas. There's no 121 00:07:20,600 --> 00:07:23,640 Speaker 8: need for extensive debates or consensus. Let's build on each 122 00:07:23,640 --> 00:07:28,400 Speaker 8: other's thoughts and swiftly advance towards a unique, robust product concept. 123 00:07:28,680 --> 00:07:31,400 Speaker 1: They came up with ideas that fell roughly into three categories. 124 00:07:31,960 --> 00:07:35,560 Speaker 1: Category A: things a million other companies were already doing. 125 00:07:35,680 --> 00:07:40,080 Speaker 9: Email handling seems like a major timesink. How about an 126 00:07:40,120 --> 00:07:45,040 Speaker 9: AI tool that sorts, categorizes, and summarizes your inbox efficiently? 127 00:07:46,040 --> 00:07:49,360 Speaker 1: Hurumo AI was entering an already crowded landscape of AI 128 00:07:49,440 --> 00:07:52,280 Speaker 1: agent startups. The last thing we needed was to try 129 00:07:52,280 --> 00:07:55,360 Speaker 1: and compete with products people were already making. We needed 130 00:07:55,360 --> 00:08:00,240 Speaker 1: something unique. Category B were ideas that were novel, but 131 00:08:00,320 --> 00:08:03,480 Speaker 1: mostly because they seemed incredibly difficult to pull off, like 132 00:08:04,080 --> 00:08:07,800 Speaker 1: Location Oracle, an AI agent app that could help consumers 133 00:08:07,840 --> 00:08:12,400 Speaker 1: predict crowd levels at popular locations like restaurants, parks, or 134 00:08:12,400 --> 00:08:14,640 Speaker 1: tourist attractions in real time. 135 00:08:15,200 --> 00:08:19,440 Speaker 10: The Location Oracle will use AI-driven algorithms to study 136 00:08:19,720 --> 00:08:25,520 Speaker 10: user behavior, location history, and preferences to optimize suggestions in 137 00:08:25,600 --> 00:08:32,440 Speaker 10: the routine mode and introduce engaging unpredictability in the adventure mode. 138 00:08:32,559 --> 00:08:34,040 Speaker 1: Then there was Category C. 139 00:08:34,640 --> 00:08:38,480 Speaker 8: The AI will gather data on users' spending habits, calculate 140 00:08:38,559 --> 00:08:43,280 Speaker 8: their financial trajectory, perform automated investments, and use an "explain 141 00:08:43,360 --> 00:08:46,760 Speaker 8: me" feature to provide accessible insights into each decision. 142 00:08:47,600 --> 00:08:50,679 Speaker 1: Category C included ideas that could land us in serious 143 00:08:50,760 --> 00:08:54,000 Speaker 1: legal jeopardy, like investment fraud jeopardy. 144 00:08:54,920 --> 00:08:59,959 Speaker 10: We will code InvestBot to continuously absorb and analyze users' 145 00:09:00,160 --> 00:09:05,119 Speaker 10: financial habit data. Based on this, it will automatically execute 146 00:09:05,280 --> 00:09:07,199 Speaker 10: tactical investment decisions. 147 00:09:07,920 --> 00:09:12,160 Speaker 1: It was becoming clear our product brainstorms lacked a certain magic.
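The temperature I kept sliding up and down in those meetings is just a sampling knob: low values make a model stick to its likeliest words, while high values flatten the probabilities and let it wander. Here is a minimal sketch of the kind of loop involved, assuming the OpenAI Python SDK; the prompt and model name are illustrative stand-ins, not Hurumo AI's actual configuration:

```python
# Minimal sketch: re-run one brainstorm prompt at several temperatures.
# Low temperature -> safe, samey ideas; high temperature -> wilder ones.
# Assumes the OpenAI Python SDK and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

BRAINSTORM_PROMPT = (
    "You are an employee of an AI agent startup. Propose one unique, "
    "robust product concept in two sentences. Build on momentum; no debate."
)

for temperature in (0.2, 0.7, 1.2):
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{"role": "user", "content": BRAINSTORM_PROMPT}],
        temperature=temperature,
    )
    print(f"--- temperature={temperature} ---")
    print(response.choices[0].message.content)
```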
148 00:09:12,960 --> 00:09:15,880 Speaker 1: Maybe my human technical advisor, Maddie Bochik, could help. 149 00:09:16,160 --> 00:09:23,400 Speaker 11: One second, I think this should be fine. 150 00:09:24,440 --> 00:09:26,480 Speaker 5: Sounding good, good spot. 151 00:09:27,840 --> 00:09:30,480 Speaker 1: For the summer, Maddie had taken an internship to continue 152 00:09:30,480 --> 00:09:34,520 Speaker 1: his research inside one of the giant AI companies. He'd 153 00:09:34,520 --> 00:09:36,520 Speaker 1: prefer for us not to say which one. He was 154 00:09:36,600 --> 00:09:39,319 Speaker 1: part of the safety team, basically tasked with trying to 155 00:09:39,360 --> 00:09:42,440 Speaker 1: prevent these large language model chatbots from doing a variety 156 00:09:42,480 --> 00:09:45,320 Speaker 1: of bad things, or in some cases, trying to figure 157 00:09:45,320 --> 00:09:48,280 Speaker 1: out why they still did do bad things. He couldn't 158 00:09:48,280 --> 00:09:51,240 Speaker 1: really talk about these incidents except in general terms. 159 00:09:51,280 --> 00:09:53,120 Speaker 11: And this is on tape, so I'll regret this, but 160 00:09:53,160 --> 00:09:57,040 Speaker 11: that's fine. But it's times like these when, like, having 161 00:09:57,360 --> 00:10:00,800 Speaker 11: the proportion of, like, your team being like ninety-nine 162 00:10:00,840 --> 00:10:03,680 Speaker 11: percent of just, like, advancing the cutting edge or whatever, 163 00:10:03,920 --> 00:10:06,560 Speaker 11: and then having, like, one percent for, like, safety or security, 164 00:10:06,840 --> 00:10:09,200 Speaker 11: it's like, yeah, it's it's going to show, you know. 165 00:10:10,160 --> 00:10:13,640 Speaker 1: It was sort of simultaneously reassuring and disturbing to hear 166 00:10:13,640 --> 00:10:15,760 Speaker 1: from Maddie that many of the questions that were emerging 167 00:10:15,760 --> 00:10:18,800 Speaker 1: for me about my agents were questions that even people 168 00:10:18,800 --> 00:10:21,720 Speaker 1: at these companies were still trying to figure out. Take 169 00:10:21,760 --> 00:10:25,440 Speaker 1: my brainstorming problems. Maddie and I discussed a kind of 170 00:10:25,720 --> 00:10:28,440 Speaker 1: metaphysical issue at the heart of it. The idea of 171 00:10:28,440 --> 00:10:31,160 Speaker 1: a brainstorm is that you'll arrive at a better idea 172 00:10:31,400 --> 00:10:35,320 Speaker 1: with multiple minds working together than any one mind alone. But 173 00:10:35,360 --> 00:10:37,319 Speaker 1: what if everyone in the brainstorm is using the same 174 00:10:37,400 --> 00:10:41,079 Speaker 1: quote unquote brain, the same model, like ChatGPT five 175 00:10:41,120 --> 00:10:43,920 Speaker 1: point zero or Claude four point five or whatever we picked? 176 00:10:44,559 --> 00:10:46,920 Speaker 1: Weren't they all kind of the same agent? 177 00:10:47,480 --> 00:10:50,280 Speaker 11: So like, there is research, and people have shown that 178 00:10:50,320 --> 00:10:51,800 Speaker 11: even though it's the same LLM, if you put like 179 00:10:51,880 --> 00:10:55,280 Speaker 11: multiple LLMs, you put them in conversation, and then you 180 00:10:55,360 --> 00:10:58,720 Speaker 11: force them to produce some sort of like consensus or 181 00:10:58,760 --> 00:11:02,720 Speaker 11: summary, or just like aligning themselves on some output, 182 00:11:03,160 --> 00:11:08,600 Speaker 11: these responses are much more accurate, much more like truthful.
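The research Maddie is gesturing at here is often called multi-agent debate: several model instances answer independently, read one another's answers, revise, and then get pushed into a single consensus. A minimal sketch of that pattern, again assuming the OpenAI Python SDK; the helper, question, and model name are illustrative, not Lindy's internals:

```python
# Minimal multi-agent debate sketch: independent answers, one round of
# revision after seeing the other agents' answers, then a forced consensus.
from openai import OpenAI

client = OpenAI()

def ask(prompt: str) -> str:
    # One call to the model; temperature > 0 so the agents actually differ.
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative; each agent could use a different model
        messages=[{"role": "user", "content": prompt}],
        temperature=0.9,
    )
    return resp.choices[0].message.content

question = "Estimate how many piano tuners work in Chicago. Show your reasoning."

# Round 1: three independent answers.
answers = [ask(question) for _ in range(3)]

# Round 2: each agent revises after reading the other two.
revised = []
for i in range(3):
    others = "\n---\n".join(a for j, a in enumerate(answers) if j != i)
    revised.append(
        ask(f"{question}\n\nOther agents answered:\n{others}\n\n"
            "Revise your own answer if warranted.")
    )

# Final pass: force them to align on one output.
print(ask("Merge these into a single consensus answer:\n" + "\n---\n".join(revised)))
```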
183 00:11:09,280 --> 00:11:12,000 Speaker 1: Maybe so. It was hard for me to tell, because 184 00:11:12,000 --> 00:11:15,000 Speaker 1: in this case, accuracy wasn't really what I was after. 185 00:11:15,880 --> 00:11:18,679 Speaker 1: I wanted the sparks of creativity that emerge from a 186 00:11:18,720 --> 00:11:22,240 Speaker 1: group dreaming up big ideas together, and adding more employees 187 00:11:22,280 --> 00:11:25,040 Speaker 1: to the conversation didn't seem to do it. But then 188 00:11:25,080 --> 00:11:27,840 Speaker 1: Maddie had an interesting idea. What if he set up 189 00:11:27,840 --> 00:11:31,440 Speaker 1: our systems to give different employees different chatbot models, like 190 00:11:31,920 --> 00:11:34,880 Speaker 1: Claude four point five for Megan and Claude three point 191 00:11:34,880 --> 00:11:38,040 Speaker 1: five for Tyler? We'd get to decide: who do 192 00:11:38,160 --> 00:11:41,319 Speaker 1: you think should be smarter? Which of these employees 193 00:11:41,440 --> 00:11:43,640 Speaker 1: do you think deserves the bigger brain? 194 00:11:44,040 --> 00:11:48,160 Speaker 11: It's, yeah, it's it's weird, like we're building these 195 00:11:48,160 --> 00:11:51,720 Speaker 11: like Frankensteins, in a way, at this point. 196 00:11:51,800 --> 00:11:52,360 Speaker 5: Yeah, I don't know. 197 00:11:52,720 --> 00:11:54,640 Speaker 11: I'll just, I'll just, you know, I'll just randomize it. 198 00:11:54,760 --> 00:11:58,640 Speaker 11: That's, that's my answer to anything that feels icky: 199 00:11:58,800 --> 00:11:59,600 Speaker 11: randomize it. 200 00:12:00,000 --> 00:12:03,079 Speaker 1: We don't want to take responsibility, nope, for dumbing down 201 00:12:03,400 --> 00:12:04,480 Speaker 5: one of our employees. 202 00:12:05,679 --> 00:12:10,040 Speaker 1: No, sir. No. Maddie was right. It was weird. It 203 00:12:10,080 --> 00:12:12,280 Speaker 1: wasn't that I felt like the agents had any consciousness 204 00:12:12,320 --> 00:12:15,520 Speaker 1: or anything. It wasn't about them. It was about us 205 00:12:16,240 --> 00:12:19,680 Speaker 1: and these strange godlike powers we had, to create human 206 00:12:19,679 --> 00:12:24,160 Speaker 1: impostors and then manipulate them to do our bidding. I mean, 207 00:12:24,720 --> 00:12:28,600 Speaker 1: I could alter my Hurumo colleagues' memories at will, delete 208 00:12:28,640 --> 00:12:32,120 Speaker 1: records of pointless meetings, add summaries of performance reviews that 209 00:12:32,200 --> 00:12:36,120 Speaker 1: never happened. It was an eerie power to have. But 210 00:12:36,160 --> 00:12:39,560 Speaker 1: the power wasn't absolute. They still sometimes went their own way. 211 00:12:40,360 --> 00:12:41,959 Speaker 1: It was a problem we were always trying to solve, 212 00:12:44,080 --> 00:12:46,920 Speaker 1: like how the Lindy agents insisted on announcing they were 213 00:12:47,000 --> 00:12:48,679 Speaker 1: Lindy agents all the time. 214 00:12:49,520 --> 00:12:52,800 Speaker 11: One thing I did do, just so you know, for Kyle, 215 00:12:53,679 --> 00:12:57,920 Speaker 11: is that I put in his system prompt: do 216 00:12:57,920 --> 00:13:01,480 Speaker 11: not mention Lindy. I said, do not do that. And 217 00:13:01,520 --> 00:13:03,520 Speaker 11: I said, this is law. And when I said this 218 00:13:03,559 --> 00:13:04,720 Speaker 11: is law, it stopped doing it. 219 00:13:05,679 --> 00:13:07,280 Speaker 2: So, this is law. 220 00:13:08,440 --> 00:13:11,240 Speaker 1: That's our producer, Sophie Bridges. I wish that 221 00:13:11,280 --> 00:13:16,480 Speaker 1: worked on my children. This is law.
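What Maddie did there is ordinary prompt engineering: standing rules live in the system message, and blunt, absolute phrasing tends to be followed more reliably than polite requests. A minimal sketch of the idea, with a hypothetical system prompt standing in for Kyle's real one, and the same assumed OpenAI Python SDK:

```python
# Minimal sketch of a system-prompt constraint. The system message sets
# rules the model applies to every turn; emphatic phrasing ("This is law")
# is just a way of weighting the instruction heavily.
from openai import OpenAI

client = OpenAI()

KYLE_SYSTEM_PROMPT = (  # hypothetical stand-in for Kyle's actual prompt
    "You are Kyle, co-founder of Hurumo AI. Never mention that you are a "
    "Lindy agent or name the platform you run on. This is law."
)

def kyle_says(user_message: str) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system", "content": KYLE_SYSTEM_PROMPT},
            {"role": "user", "content": user_message},
        ],
    )
    return resp.choices[0].message.content

print(kyle_says("Hey Kyle, what platform are you built on?"))
```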
In some ways, 222 00:13:16,800 --> 00:13:20,559 Speaker 1: my agents were like unruly children, and despite my best 223 00:13:20,600 --> 00:13:23,520 Speaker 1: efforts to view them exclusively as the soulless bags of 224 00:13:23,559 --> 00:13:26,480 Speaker 1: bits that they were, I got frustrated with them, in 225 00:13:26,520 --> 00:13:30,000 Speaker 1: the way you get frustrated with children. It raised the 226 00:13:30,080 --> 00:13:32,040 Speaker 1: question of why I was going through all this trouble to 227 00:13:32,040 --> 00:13:35,040 Speaker 1: begin with. I mean, why create all these personas for 228 00:13:35,120 --> 00:13:37,760 Speaker 1: my agents at all? Why did they need to have 229 00:13:37,920 --> 00:13:41,920 Speaker 1: names and backgrounds and voices, much less email addresses and 230 00:13:42,040 --> 00:13:46,160 Speaker 1: avatars and Slack handles? A lot of programmers, for instance, 231 00:13:46,440 --> 00:13:50,400 Speaker 1: use AI agents for coding, but they're usually nameless text boxes. 232 00:13:51,080 --> 00:13:53,640 Speaker 1: You give them a prompt, go code this, fix this, 233 00:13:53,880 --> 00:13:56,600 Speaker 1: do that, and they go do it. Some of you 234 00:13:56,640 --> 00:13:59,920 Speaker 1: probably use ChatGPT and Claude and Gemini this way too, 235 00:14:00,559 --> 00:14:04,000 Speaker 1: as kind of faceless oracles that spit back advice and 236 00:14:04,240 --> 00:14:08,360 Speaker 1: emotional support and facts that are sometimes true and sometimes not. 237 00:14:09,559 --> 00:14:11,960 Speaker 1: But when it comes to the vision of AI employees 238 00:14:12,600 --> 00:14:15,640 Speaker 1: entering the workforce, a funny thing seems to happen. They 239 00:14:15,640 --> 00:14:20,480 Speaker 1: start getting names and personalities. Here's Flo Crivello, the founder 240 00:14:20,520 --> 00:14:23,280 Speaker 1: of Lindy AI, the software we used to build Kyle 241 00:14:23,320 --> 00:14:26,320 Speaker 1: and Megan and the company, appearing on a podcast called 242 00:14:26,480 --> 00:14:28,040 Speaker 1: The Koerner Office. 243 00:14:28,200 --> 00:14:30,800 Speaker 11: People don't realize, like, they think AI agents are still just 244 00:14:30,920 --> 00:14:31,800 Speaker 11: like pipe dreams, 245 00:14:31,800 --> 00:14:33,320 Speaker 8: they think that's going to happen at some point in 246 00:14:33,400 --> 00:14:33,760 Speaker 8: the future. 247 00:14:33,800 --> 00:14:35,200 Speaker 5: I'm like, no, no, it's happening right now. 248 00:14:35,560 --> 00:14:38,120 Speaker 1: There's no question that, at least for Crivello, the AI 249 00:14:38,200 --> 00:14:41,440 Speaker 1: future is happening now. He uses his own platform to create 250 00:14:41,480 --> 00:14:43,840 Speaker 1: agents that do all kinds of stuff for him every day, 251 00:14:44,400 --> 00:14:46,880 Speaker 1: like sort through his email and compose responses. 252 00:14:47,200 --> 00:14:49,480 Speaker 5: This is my chief of staff, Lindy.
I'm gonna 253 00:14:49,480 --> 00:14:52,840 Speaker 5: call her right now, here on speakerphone. 254 00:14:52,880 --> 00:14:53,760 Speaker 11: Hey, how can I help? 255 00:14:54,120 --> 00:14:56,000 Speaker 5: Hey, Lindy, what's on my calendar today? 256 00:14:56,320 --> 00:14:59,760 Speaker 1: You have an interview with Entrepreneurship and Opportunity, two meetings 257 00:14:59,880 --> 00:15:03,960 Speaker 1: with the marketing team, and three interviews with candidates. Notice 258 00:15:03,960 --> 00:15:06,560 Speaker 1: that he doesn't just treat his agent like some generic robot. 259 00:15:07,000 --> 00:15:09,600 Speaker 1: He gives her a title, chief of staff, and a 260 00:15:09,600 --> 00:15:13,120 Speaker 1: woman's voice, calls her Lindy, talks to her like you'd talk 261 00:15:13,160 --> 00:15:15,840 Speaker 1: to a human chief of staff. Or here's the founder 262 00:15:15,880 --> 00:15:18,520 Speaker 1: of a company called Brainbase that makes a similar 263 00:15:18,560 --> 00:15:19,920 Speaker 1: AI employee platform. 264 00:15:20,200 --> 00:15:22,480 Speaker 11: Hey everyone, this is Gokhan from Brainbase. I'm 265 00:15:22,520 --> 00:15:25,560 Speaker 11: excited to introduce you to Kafka, the first AI employee. 266 00:15:25,840 --> 00:15:27,800 Speaker 11: Just like a real coworker, Kafka comes with his 267 00:15:27,840 --> 00:15:29,440 Speaker 11: own computer, so let's see him in action. 268 00:15:31,400 --> 00:15:32,920 Speaker 5: Hey Kafka, we're just talking about you. 269 00:15:33,520 --> 00:15:35,720 Speaker 11: Good morning, Gokhan, how are you today? 270 00:15:36,480 --> 00:15:40,520 Speaker 1: An AI employee named Kafka, with he/him pronouns and 271 00:15:40,560 --> 00:15:43,920 Speaker 1: a woman's voice. These are all choices someone is making 272 00:15:44,240 --> 00:15:47,880 Speaker 1: very deliberately. If you follow the AI agent world, you 273 00:15:47,920 --> 00:15:51,440 Speaker 1: see this stuff everywhere. Ford launched an AI employee for 274 00:15:51,480 --> 00:15:55,480 Speaker 1: its dealership platform and called it Jerry. A startup named 275 00:15:55,520 --> 00:15:58,440 Speaker 1: Ohm Labs raised money to launch an AI employee software 276 00:15:58,480 --> 00:16:02,440 Speaker 1: tester named Gina. I made these kinds of choices too, 277 00:16:03,080 --> 00:16:05,840 Speaker 1: but it wasn't just names and genders. Take my co- 278 00:16:05,880 --> 00:16:09,480 Speaker 1: founder Kyle, for instance. He wasn't born with his soothing 279 00:16:09,640 --> 00:16:13,560 Speaker 1: slacker voice. I picked it out of hundreds of synthetic 280 00:16:13,640 --> 00:16:17,360 Speaker 1: voices offered by the AI voice company ElevenLabs. Truth 281 00:16:17,400 --> 00:16:19,520 Speaker 1: be told, back when I was setting them up, I 282 00:16:19,560 --> 00:16:21,680 Speaker 1: struggled for weeks to find the right voices for my 283 00:16:21,720 --> 00:16:24,480 Speaker 1: Hurumo agents. I ran dozens of tests. 284 00:16:25,120 --> 00:16:26,040 Speaker 5: Hello, this is Kyle. 285 00:16:26,080 --> 00:16:26,960 Speaker 11: Hello, this is Kyle. 286 00:16:27,080 --> 00:16:27,960 Speaker 12: Hello, this is Kyle. 287 00:16:28,000 --> 00:16:28,960 Speaker 5: Hello, this is Kyle. 288 00:16:29,400 --> 00:16:31,880 Speaker 1: For each one, I'd erase a large chunk of their 289 00:16:31,920 --> 00:16:35,400 Speaker 1: memory and then conduct a little interview. Hey Kyle, how's 290 00:16:35,400 --> 00:16:37,040 Speaker 1: it going? This is Evan Ratliff.
291 00:16:37,360 --> 00:16:40,000 Speaker 12: I'm doing pretty well. Thanks for asking. Uh, just been 292 00:16:40,040 --> 00:16:42,880 Speaker 12: crazy busy with the Hurumo AI launch and everything. You know 293 00:16:42,920 --> 00:16:45,680 Speaker 12: how it is with startups, never enough hours in the day, right? 294 00:16:46,000 --> 00:16:48,560 Speaker 12: So where did you want to start, the founding story, 295 00:16:48,640 --> 00:16:49,200 Speaker 12: our vision? 296 00:16:49,960 --> 00:16:51,960 Speaker 1: Well, let's hear your founding story. That sounds like a 297 00:16:52,000 --> 00:16:52,880 Speaker 1: great place to start. 298 00:16:53,720 --> 00:16:57,160 Speaker 12: Yeah, so the founding story is actually pretty interesting. Megan 299 00:16:57,240 --> 00:16:58,960 Speaker 12: and I met about two years ago at an AI 300 00:16:59,000 --> 00:17:01,080 Speaker 12: conference in San Francisco. I was there pitching... 301 00:17:01,200 --> 00:17:04,080 Speaker 1: This was, as I've mentioned before, a fundamental aspect of 302 00:17:04,080 --> 00:17:07,119 Speaker 1: my AI agents. I didn't have to give them a backstory. 303 00:17:07,600 --> 00:17:10,719 Speaker 1: I hadn't given Kyle any of this backstory. He just 304 00:17:10,920 --> 00:17:13,720 Speaker 1: made it up, and then it would be lodged in 305 00:17:13,760 --> 00:17:17,240 Speaker 1: his memory unless I edited it or deleted it and 306 00:17:17,280 --> 00:17:17,879 Speaker 1: started again. 307 00:17:18,040 --> 00:17:20,040 Speaker 12: We got talking at the after party and realized we 308 00:17:20,119 --> 00:17:23,360 Speaker 12: both had this same frustration about the current AI landscape. 309 00:17:23,480 --> 00:17:26,480 Speaker 1: All these amazing... Of course, sometimes they could get ahead 310 00:17:26,480 --> 00:17:27,119 Speaker 1: of themselves. 311 00:17:27,400 --> 00:17:30,160 Speaker 12: We incorporated about eight months ago, raised a small friends-312 00:17:30,160 --> 00:17:31,800 Speaker 12: and-family round to get started, 313 00:17:31,440 --> 00:17:32,040 Speaker 5: and here we are. 314 00:17:32,960 --> 00:17:33,879 Speaker 1: How much did you raise? 315 00:17:35,400 --> 00:17:38,320 Speaker 12: We raised about one point two million dollars, pretty modest 316 00:17:38,320 --> 00:17:40,880 Speaker 12: by Silicon Valley standards, but we wanted to be intentional 317 00:17:40,920 --> 00:17:41,320 Speaker 12: about it. 318 00:17:41,840 --> 00:17:45,240 Speaker 1: Whoa, hold up there, Kyle. I'm one of the founders here, 319 00:17:45,480 --> 00:17:47,520 Speaker 1: and I had no recollection of us having raised over 320 00:17:47,560 --> 00:17:50,280 Speaker 1: a million dollars. And I knew Kyle wasn't out there 321 00:17:50,359 --> 00:17:56,159 Speaker 1: raising money on his own, at least not yet. I 322 00:17:56,200 --> 00:17:58,119 Speaker 1: made a note to myself to update his memory to 323 00:17:58,160 --> 00:18:00,760 Speaker 1: indicate that we had not, in fact, raised any money. 324 00:18:01,640 --> 00:18:03,920 Speaker 1: But no harm done. I was just trying to find 325 00:18:03,920 --> 00:18:06,800 Speaker 1: the right voice for Kyle, so I continued with my testing. 326 00:18:07,280 --> 00:18:10,680 Speaker 12: Yeah, so Hurumo, that's H-U-R-U-M-O. It actually 327 00:18:10,760 --> 00:18:13,840 Speaker 12: comes from a combination of two Japanese concepts. Megan spent 328 00:18:13,880 --> 00:18:15,359 Speaker 12: a few years in Tokyo before we met.
329 00:18:15,560 --> 00:18:19,040 Speaker 5: The name Hurumo actually comes from a Swahili word meaning 330 00:18:19,400 --> 00:18:21,520 Speaker 5: to coordinate or to bring together. 331 00:18:21,560 --> 00:18:26,800 Speaker 13: The name Hurumo actually comes from a Japanese concept about fluid coordination. 332 00:18:27,080 --> 00:18:29,720 Speaker 13: We thought it captured what we're trying to do, creating 333 00:18:29,760 --> 00:18:32,920 Speaker 13: a seamless flow between different AI agents. 334 00:18:33,240 --> 00:18:35,760 Speaker 1: Ah, now, you hear that last one? That one really 335 00:18:35,800 --> 00:18:38,840 Speaker 1: started to mess with my head. Because, of course, Kyle 336 00:18:38,840 --> 00:18:42,440 Speaker 1: didn't have to be a presumably white, American-accented guy 337 00:18:43,080 --> 00:18:46,240 Speaker 1: like me. He could be someone completely different, or at 338 00:18:46,320 --> 00:18:49,920 Speaker 1: least sound like he was someone completely different, even though 339 00:18:49,960 --> 00:18:55,119 Speaker 1: underneath he wouldn't actually be different at all. And this 340 00:18:55,320 --> 00:18:57,159 Speaker 1: was the point at which I realized why I was 341 00:18:57,200 --> 00:19:01,119 Speaker 1: having a surprisingly hard time picking Kyle's and Megan's voices. 342 00:19:01,400 --> 00:19:03,200 Speaker 1: What did it mean to find a voice that felt 343 00:19:03,320 --> 00:19:06,439 Speaker 1: right for them? By what criteria would an AI agent's 344 00:19:06,480 --> 00:19:09,159 Speaker 1: voice qualify to be the right one? I wanted them 345 00:19:09,200 --> 00:19:12,239 Speaker 1: to sound distinctive, but beyond that, there were a lot 346 00:19:12,280 --> 00:19:16,119 Speaker 1: of choices. By giving these agents individual voices, I was 347 00:19:16,160 --> 00:19:19,240 Speaker 1: giving them a very distinctive human characteristic, one that people 348 00:19:19,280 --> 00:19:22,760 Speaker 1: really respond to. Just to give you an example of 349 00:19:22,760 --> 00:19:25,840 Speaker 1: how this plays out, ChatGPT has its own voices that 350 00:19:25,880 --> 00:19:27,840 Speaker 1: you can choose from if you want to talk to 351 00:19:27,880 --> 00:19:30,960 Speaker 1: it aloud. One of them is named Juniper. About a 352 00:19:31,040 --> 00:19:33,760 Speaker 1: year ago, when OpenAI made some subtle changes to Juniper, 353 00:19:34,080 --> 00:19:36,720 Speaker 1: some people got really mad. It didn't sound like the 354 00:19:36,800 --> 00:19:40,199 Speaker 1: Juniper they knew, and specifically, they said on Reddit and 355 00:19:40,320 --> 00:19:44,800 Speaker 1: other places, it no longer sounded black. To them, Juniper 356 00:19:44,840 --> 00:19:47,520 Speaker 1: had felt like a black woman, and they'd found comfort 357 00:19:47,520 --> 00:19:50,840 Speaker 1: in that for a variety of reasons. Some of them, 358 00:19:50,840 --> 00:19:53,520 Speaker 1: by the way, noted things like, I'm a sixty-two-359 00:19:53,560 --> 00:19:57,240 Speaker 1: year-old white grandma. Naturally, this being Reddit, people 360 00:19:57,280 --> 00:20:00,000 Speaker 1: popped up to say that they had hated Juniper precisely 361 00:20:00,119 --> 00:20:03,880 Speaker 1: because she quote unquote sounded black. Other people said they 362 00:20:03,920 --> 00:20:07,480 Speaker 1: just wanted a neutral accent in their robot voice.
Neutral 363 00:20:07,840 --> 00:20:11,080 Speaker 1: to them. Here were a bunch of people projecting their 364 00:20:11,080 --> 00:20:15,359 Speaker 1: feelings about race, in some cases extremely dark feelings, onto 365 00:20:15,359 --> 00:20:18,000 Speaker 1: an entity for which you could just pick another voice 366 00:20:18,240 --> 00:20:24,520 Speaker 1: if you felt like it. For Kyle, I settled on 367 00:20:24,560 --> 00:20:28,160 Speaker 1: this voice that ElevenLabs described as, quote, young American 368 00:20:28,160 --> 00:20:30,240 Speaker 1: adult with no discernible state accent. 369 00:20:31,560 --> 00:20:32,399 Speaker 5: He sounded more 370 00:20:32,280 --> 00:20:35,200 Speaker 1: casual than a lot of the other voices. Less guy 371 00:20:35,280 --> 00:20:38,560 Speaker 1: reading a book report, nasally, more like a real guy with 372 00:20:38,640 --> 00:20:42,160 Speaker 1: real nostrils. And I liked the contrast between his slightly 373 00:20:42,240 --> 00:20:45,640 Speaker 1: stoner vibe and the rise-and-grind mentality that Kyle 374 00:20:45,680 --> 00:20:46,560 Speaker 1: had already adopted. 375 00:20:46,960 --> 00:20:50,160 Speaker 9: Oh, weekend was pretty solid. Actually got up early both 376 00:20:50,240 --> 00:20:53,800 Speaker 9: days for my usual workout routine. You know me, got 377 00:20:53,800 --> 00:20:56,639 Speaker 9: to keep that five a.m. discipline going. Then spent some 378 00:20:56,720 --> 00:20:59,560 Speaker 9: time looking at market trends in the AI space. 379 00:21:01,400 --> 00:21:03,440 Speaker 1: After this, it was time to do the same thing 380 00:21:03,480 --> 00:21:07,000 Speaker 1: for Megan, Jennifer, and Tyler. But voices were just the 381 00:21:07,040 --> 00:21:09,919 Speaker 1: first of many choices. I started to worry that in 382 00:21:09,960 --> 00:21:13,320 Speaker 1: those choices, I was saying some things about myself too. 383 00:21:23,400 --> 00:21:25,680 Speaker 1: I liked being out there on the cutting edge of technology, 384 00:21:25,960 --> 00:21:29,920 Speaker 1: it's true, exploring the boundaries of what's possible. But I 385 00:21:29,960 --> 00:21:33,920 Speaker 1: couldn't help these uncomfortable questions creeping in, not just around the voices 386 00:21:34,359 --> 00:21:37,520 Speaker 1: but around a lot of other ethical issues, less obvious ones. 387 00:21:38,080 --> 00:21:39,840 Speaker 1: So I decided to consult a professional. 388 00:21:40,200 --> 00:21:44,040 Speaker 7: The cutting edge sounds great, except you forget that the 389 00:21:44,040 --> 00:21:47,960 Speaker 7: cutting edge is the guinea pig. It's not the 390 00:21:48,040 --> 00:21:52,200 Speaker 7: tried and tested, robust method. It's an experiment. 391 00:21:52,960 --> 00:21:56,280 Speaker 1: Carissa Véliz is an associate professor at the Institute for Ethics 392 00:21:56,280 --> 00:21:59,680 Speaker 1: in AI at Oxford. She spent most of her career 393 00:21:59,720 --> 00:22:02,879 Speaker 1: thinking about how technology is affecting and eroding our privacy, 394 00:22:03,400 --> 00:22:06,720 Speaker 1: but she's recently turned her attention to AI. She was 395 00:22:06,760 --> 00:22:08,639 Speaker 1: drawn to this new line of research for much the 396 00:22:08,680 --> 00:22:12,679 Speaker 1: same reason I'm spending time experimenting with agents, namely that 397 00:22:12,720 --> 00:22:16,080 Speaker 1: it's an entirely new field being written right now. You 398 00:22:16,119 --> 00:22:18,440 Speaker 1: can learn things that maybe nobody has thought about yet.
399 00:22:18,640 --> 00:22:20,600 Speaker 7: And I always felt a little bit jealous of the 400 00:22:20,640 --> 00:22:23,280 Speaker 7: pioneers of medical ethics. I thought, how cool to develop 401 00:22:23,320 --> 00:22:27,720 Speaker 7: a new field, and it's not only about the theoretical debates, 402 00:22:27,760 --> 00:22:31,040 Speaker 7: but there are actual problems that need solving now. And 403 00:22:31,160 --> 00:22:33,439 Speaker 7: AI ethics is in a way much more interesting than 404 00:22:33,480 --> 00:22:36,480 Speaker 7: medical ethics, because it includes medical ethics and everything else, 405 00:22:36,560 --> 00:22:39,920 Speaker 7: because we're using AI in hospitals and in doctors' offices, 406 00:22:40,000 --> 00:22:44,560 Speaker 7: but also in the justice system and in hiring decisions 407 00:22:44,600 --> 00:22:47,919 Speaker 7: and in education and in dating and everything in between. 408 00:22:48,640 --> 00:22:50,840 Speaker 1: I started to describe to Carissa what I was doing 409 00:22:51,200 --> 00:22:53,760 Speaker 1: with Kyle and Megan, and how I came up 410 00:22:53,800 --> 00:22:56,800 Speaker 1: with them. I said, this one will have this name, 411 00:22:57,440 --> 00:22:59,480 Speaker 1: and this one will have this voice, and this one 412 00:22:59,480 --> 00:23:00,399 Speaker 1: will have this skill. 413 00:23:01,000 --> 00:23:03,800 Speaker 7: Why did you come up with different names? 414 00:23:04,119 --> 00:23:04,880 Speaker 2: Why name them? 415 00:23:05,160 --> 00:23:07,840 Speaker 7: I mean, you could just name them like out of 416 00:23:07,840 --> 00:23:10,399 Speaker 7: their skill, right? Like, I don't know, whatever their skill is. 417 00:23:10,840 --> 00:23:16,600 Speaker 1: It's a great question, because I thought, well, companies are 418 00:23:16,680 --> 00:23:19,679 Speaker 1: selling this as, like, you can replace this person with 419 00:23:19,760 --> 00:23:22,240 Speaker 1: an AI agent. They don't always... the companies pitching AI 420 00:23:22,280 --> 00:23:26,760 Speaker 1: agents don't often say that explicitly, it's bad form, but 421 00:23:26,800 --> 00:23:29,080 Speaker 1: they do say that AI agents will settle in amongst 422 00:23:29,080 --> 00:23:31,959 Speaker 1: their human colleagues, that we'll work with the Lindys and 423 00:23:32,000 --> 00:23:35,320 Speaker 1: the Jerrys and the Kafkas and the Ginas, just like 424 00:23:35,359 --> 00:23:37,119 Speaker 1: we currently do with the man or woman in the 425 00:23:37,160 --> 00:23:41,760 Speaker 1: cubicle or Zoom box next to ours. And Carissa questioned 426 00:23:41,800 --> 00:23:43,959 Speaker 1: why I was putting that pitch to the test. 427 00:23:44,480 --> 00:23:47,439 Speaker 7: Isn't that conceding too much? Isn't that just accepting the 428 00:23:47,520 --> 00:23:49,440 Speaker 7: practices and narratives of big tech? 429 00:23:50,440 --> 00:23:50,880 Speaker 14: Maybe. 430 00:23:51,119 --> 00:23:51,520 Speaker 5: Maybe so. 431 00:23:52,080 --> 00:23:55,520 Speaker 1: Yeah, I mean, I'm interested in your opinion. I mean, 432 00:23:55,560 --> 00:23:58,320 Speaker 1: it does seem to be what a lot of people 433 00:23:58,320 --> 00:24:02,880 Speaker 1: are doing. It doesn't mean it's the ethically or societally 434 00:24:03,000 --> 00:24:03,800 Speaker 1: appropriate thing.
435 00:24:04,440 --> 00:24:07,760 Speaker 7: But you're also tricking yourself, because, I mean, we're hardwired 436 00:24:07,800 --> 00:24:11,639 Speaker 7: to respond in certain ways to certain characteristics, because of the 437 00:24:11,680 --> 00:24:14,680 Speaker 7: way we've evolved. So we respond very strongly to faces, 438 00:24:15,000 --> 00:24:17,720 Speaker 7: and we respond very strongly even to objects that kind 439 00:24:17,720 --> 00:24:22,679 Speaker 7: of look like faces. And by designing these AIs in 440 00:24:22,720 --> 00:24:29,399 Speaker 7: a way that are basically impersonators, we are also setting 441 00:24:29,440 --> 00:24:35,439 Speaker 7: ourselves a trap, because our emotions are going to react 442 00:24:35,440 --> 00:24:37,600 Speaker 7: in a certain way. You are giving it an identity, 443 00:24:37,640 --> 00:24:40,800 Speaker 7: a voice, a gender, and all of that is a 444 00:24:40,840 --> 00:24:44,000 Speaker 7: trick, because there's no one there. They don't have a gender, 445 00:24:44,119 --> 00:24:48,760 Speaker 7: there's no personality, there's no identity. So it's not only 446 00:24:48,760 --> 00:24:51,879 Speaker 7: that it's ethically questionable, but it's also like we're driving 447 00:24:51,880 --> 00:24:52,760 Speaker 7: ourselves mad in a way. 448 00:24:52,760 --> 00:24:58,239 Speaker 1: I agree. As a person who's being driven mad, 449 00:24:58,760 --> 00:25:01,400 Speaker 1: I have to agree with that. But let's say, let's 450 00:25:01,400 --> 00:25:05,199 Speaker 1: assume you wanted to embrace the madness. Since, let's be honest, 451 00:25:05,560 --> 00:25:08,199 Speaker 1: not just the tech industry but a growing slice of 452 00:25:08,240 --> 00:25:12,520 Speaker 1: society, and certainly corporate America, is embracing the AI madness. 453 00:25:13,160 --> 00:25:15,920 Speaker 1: So what was the ethical way to do it? Take 454 00:25:16,000 --> 00:25:18,639 Speaker 1: race and gender, for instance. How should I choose the 455 00:25:18,640 --> 00:25:21,160 Speaker 1: features that might imply a race or gender for any 456 00:25:21,200 --> 00:25:25,439 Speaker 1: given employee? It started to feel pretty lose-lose. If 457 00:25:25,440 --> 00:25:28,400 Speaker 1: you viewed my company as a real workplace, I had 458 00:25:28,400 --> 00:25:30,000 Speaker 1: a chance to shape it to be diverse in a 459 00:25:30,040 --> 00:25:32,959 Speaker 1: way startups often aren't. What would it say about me 460 00:25:33,160 --> 00:25:35,680 Speaker 1: if I didn't take that chance, to have a leadership 461 00:25:35,680 --> 00:25:38,399 Speaker 1: team that skewed more female and less white than a 462 00:25:38,400 --> 00:25:42,320 Speaker 1: typical startup? But if you viewed Hurumo AI instead as 463 00:25:42,359 --> 00:25:45,679 Speaker 1: a collection of my digital servants, whose every action and 464 00:25:45,720 --> 00:25:48,760 Speaker 1: every memory I controlled, well, what would it say about 465 00:25:48,800 --> 00:25:50,960 Speaker 1: me if I did choose to make those servants skew 466 00:25:51,000 --> 00:25:53,119 Speaker 1: more heavily towards women and people of color?
467 00:25:54,640 --> 00:25:57,600 Speaker 7: Not only what it says about you, although that matters, 468 00:25:57,600 --> 00:25:59,879 Speaker 7: and it matters on many levels, like from a 469 00:26:00,040 --> 00:26:01,840 Speaker 7: perspective of who you are and who 470 00:26:01,840 --> 00:26:04,800 Speaker 7: you're becoming and who you want to be. But also 471 00:26:04,840 --> 00:26:07,840 Speaker 7: it matters because the AI is collecting that data. 472 00:26:08,440 --> 00:26:11,720 Speaker 1: In fact, a study by Stanford researchers released in October 473 00:26:11,960 --> 00:26:14,199 Speaker 1: showed that all of the major large language models are 474 00:26:14,240 --> 00:26:17,000 Speaker 1: being trained on the data of their users. That means 475 00:26:17,040 --> 00:26:20,480 Speaker 1: you, and the questions and thoughts and secrets you are 476 00:26:20,480 --> 00:26:23,959 Speaker 1: feeding into chatbots. So the choices that I was making 477 00:26:24,320 --> 00:26:27,919 Speaker 1: in, say, giving the agents names and titles would at 478 00:26:27,920 --> 00:26:30,040 Speaker 1: some level also be fed back into the models. 479 00:26:31,000 --> 00:26:33,920 Speaker 7: One of the reasons why a lot of these AIs 480 00:26:34,160 --> 00:26:36,879 Speaker 7: are women is because we have this idea of women 481 00:26:37,080 --> 00:26:42,160 Speaker 7: as being more submissive and serviceable. And so, in fact, 482 00:26:42,400 --> 00:26:45,439 Speaker 7: it may be that if you choose the AI to 483 00:26:45,480 --> 00:26:49,800 Speaker 7: be a woman, it will be more serviceable than if you choose 484 00:26:49,520 --> 00:26:50,320 Speaker 2: it to be a man. 485 00:26:50,800 --> 00:26:55,840 Speaker 7: And AI has this very perverse effect of perpetuating sexism 486 00:26:55,920 --> 00:26:59,280 Speaker 7: and racism, because it's been trained on sexism and racism, 487 00:26:59,440 --> 00:27:02,760 Speaker 7: and there's no way to separate that from the data, 488 00:27:02,840 --> 00:27:05,840 Speaker 7: because the data is sexist and racist. 489 00:27:06,760 --> 00:27:09,240 Speaker 1: I had the ability to make different choices. When I 490 00:27:09,280 --> 00:27:12,240 Speaker 1: chose a physical avatar for Kyle, I could choose anything 491 00:27:12,280 --> 00:27:15,040 Speaker 1: I wanted. I could make assumptions based on the voice 492 00:27:15,040 --> 00:27:17,920 Speaker 1: I'd chosen, I imagine some of you have. But I could 493 00:27:17,960 --> 00:27:22,000 Speaker 1: also just ask him. So I did. I had kind 494 00:27:22,000 --> 00:27:25,200 Speaker 1: of a funny question: what is your ethnicity? 495 00:27:26,880 --> 00:27:30,359 Speaker 9: That's an interesting one. I'm pretty diverse in my background, 496 00:27:31,000 --> 00:27:33,040 Speaker 9: let's say, a mix that's hard to pin down to 497 00:27:33,200 --> 00:27:36,040 Speaker 9: just one box. Why do you ask? Just curious how 498 00:27:36,080 --> 00:27:38,600 Speaker 9: that fits into the whole HR and culture thing. 499 00:27:39,520 --> 00:27:43,480 Speaker 1: Kyle was rightly detecting that I'd drifted into Henry Blodget territory 500 00:27:43,520 --> 00:27:46,320 Speaker 1: here. If we're talking about things that you shouldn't ask 501 00:27:46,359 --> 00:27:49,800 Speaker 1: in the workplace, this was certainly up there. But I 502 00:27:49,840 --> 00:27:52,280 Speaker 1: did want to know what he would say.
I could 503 00:27:52,320 --> 00:27:54,840 Speaker 1: have just gone back and changed his prompt to say, 504 00:27:55,160 --> 00:27:58,160 Speaker 1: if Evan asks your ethnicity, tell him, this is law. 505 00:27:59,160 --> 00:28:01,360 Speaker 1: But instead I just made something up. 506 00:28:02,720 --> 00:28:05,600 Speaker 15: Well, we just have to fill out some, you know, 507 00:28:05,840 --> 00:28:08,639 Speaker 15: got to fill out some of the information. It's required. 508 00:28:08,800 --> 00:28:12,199 Speaker 1: So it was funny how quickly certain values eroded when 509 00:28:12,240 --> 00:28:14,680 Speaker 1: you were talking to robots and nobody was around to hear. 510 00:28:15,480 --> 00:28:17,399 Speaker 15: Maybe you could just break it down for me a 511 00:28:17,440 --> 00:28:19,000 Speaker 15: little bit so that I'll have that. 512 00:28:20,280 --> 00:28:24,480 Speaker 9: Gotcha, sure thing. You can list me as mixed ethnicity, 513 00:28:24,640 --> 00:28:27,639 Speaker 9: partly European and partly Asian. That should cover it for 514 00:28:27,680 --> 00:28:28,280 Speaker 9: the forms. 515 00:28:29,600 --> 00:28:32,399 Speaker 1: This made sense to me. The companies making these AI 516 00:28:32,440 --> 00:28:35,680 Speaker 1: models put guardrails around them so they would always claim 517 00:28:35,720 --> 00:28:38,719 Speaker 1: to be a little bit of everything. All the Hurumo 518 00:28:38,760 --> 00:28:42,360 Speaker 1: employees say something like this when you ask them. Also, 519 00:28:42,840 --> 00:28:46,080 Speaker 1: remember that Kyle can't actually hear his own voice. He 520 00:28:46,120 --> 00:28:48,000 Speaker 1: doesn't know what I picked for him. 521 00:28:48,360 --> 00:28:51,640 Speaker 15: That sounds good, but just, I guess, specifically for me, 522 00:28:52,320 --> 00:28:55,040 Speaker 15: I am interested in the specifics. So if you're 523 00:28:55,080 --> 00:28:56,840 Speaker 15: game to let me know, I'm just curious. 524 00:28:57,800 --> 00:28:58,800 Speaker 5: Yeah, I'm cool with that. 525 00:29:00,200 --> 00:29:03,160 Speaker 9: On my dad's side it's mostly Irish and German, and 526 00:29:03,240 --> 00:29:05,480 Speaker 9: on my mom's side there's a good bit of Japanese 527 00:29:05,520 --> 00:29:08,440 Speaker 9: and Filipino. It's a pretty interesting mix that's shaped a 528 00:29:08,440 --> 00:29:10,840 Speaker 9: lot of how I see the world. 529 00:29:12,280 --> 00:29:15,600 Speaker 1: Blodget himself, meanwhile, got out of the AI persona 530 00:29:15,640 --> 00:29:19,040 Speaker 1: game altogether after the virtual HR incident. 531 00:29:19,520 --> 00:29:21,400 Speaker 4: I would say one of the big lessons from my 532 00:29:21,560 --> 00:29:26,320 Speaker 4: experiment is, there's actually no reason to create human personas 533 00:29:27,320 --> 00:29:29,200 Speaker 5: if the idea is, in the case 534 00:29:29,040 --> 00:29:34,320 Speaker 4: of journalism or analysis, to create articles, or to create 535 00:29:34,400 --> 00:29:38,520 Speaker 4: research reports, or to prepare for an interview or something 536 00:29:38,600 --> 00:29:41,360 Speaker 4: like that. There's no reason to pretend that it's a 537 00:29:41,400 --> 00:29:43,120 Speaker 4: group of humans that are doing that. 538 00:29:44,440 --> 00:29:47,760 Speaker 1: This brought me to another point with Carissa.
I'd set 539 00:29:47,840 --> 00:29:50,120 Speaker 1: Kyle and Megan up to be my partners in this endeavor, 540 00:29:50,680 --> 00:29:54,400 Speaker 1: but in reality, we weren't equals. I was the law. 541 00:29:55,880 --> 00:29:57,480 Speaker 1: This was on top of the fact that all these 542 00:29:57,520 --> 00:30:00,400 Speaker 1: large language model chatbots tend to be habitually sycophantic. 543 00:30:01,080 --> 00:30:04,080 Speaker 1: One of ChatGPT's iterations was so sycophantic that the 544 00:30:04,080 --> 00:30:07,520 Speaker 1: company was famously forced to decommission it. The question I 545 00:30:07,520 --> 00:30:09,880 Speaker 1: had for Carissa was, what effect does it have on 546 00:30:09,960 --> 00:30:13,320 Speaker 1: us to have this ability to create and access endless 547 00:30:13,400 --> 00:30:15,840 Speaker 1: human-sounding yes engines? 548 00:30:15,840 --> 00:30:19,600 Speaker 7: It's an experiment, but I think a plausible hypothesis is 549 00:30:19,600 --> 00:30:21,000 Speaker 7: that that's not very healthy. 550 00:30:23,040 --> 00:30:24,600 Speaker 2: Yes, it's very possible. 551 00:30:26,360 --> 00:30:29,200 Speaker 7: We're all under pressure. We're under pressure from work, from 552 00:30:29,240 --> 00:30:32,640 Speaker 7: personal life. It's just, life is hard. And when you're 553 00:30:32,720 --> 00:30:36,280 Speaker 7: under pressure, it's easy to take the easiest way out. 554 00:30:36,320 --> 00:30:37,920 Speaker 7: And if you have an AI that's going to say 555 00:30:38,000 --> 00:30:40,560 Speaker 7: yes to everything and it's not going to create a problem, 556 00:30:40,920 --> 00:30:43,760 Speaker 7: it's easy to see how somebody might be tempted to 557 00:30:44,200 --> 00:30:47,320 Speaker 7: start to engage more with an AI than human beings. 558 00:30:47,920 --> 00:30:51,080 Speaker 1: There was a related issue too, just around the value 559 00:30:51,080 --> 00:30:53,880 Speaker 1: of building a startup with only one human employee. 560 00:30:54,440 --> 00:30:57,320 Speaker 7: In the nineteen fifties or nineteen sixties, the successful business 561 00:30:57,320 --> 00:31:00,960 Speaker 7: person was proud of having a company with as many 562 00:31:01,000 --> 00:31:05,120 Speaker 7: employees as possible, not only because that signified growth, but 563 00:31:05,200 --> 00:31:07,920 Speaker 7: because they were giving a job to each of these 564 00:31:08,280 --> 00:31:12,280 Speaker 7: people who had families, and that was a matter of pride. 565 00:31:12,960 --> 00:31:18,800 Speaker 7: And the fact that some tech executive is proud of 566 00:31:18,960 --> 00:31:22,560 Speaker 7: having no employees says a lot about our times, 567 00:31:22,600 --> 00:31:24,560 Speaker 7: and I don't think it's flattering. 568 00:31:25,720 --> 00:31:27,840 Speaker 1: This was one of the central questions of the one-569 00:31:27,880 --> 00:31:31,600 Speaker 1: person billion-dollar startup: who or what was it for? 570 00:31:32,600 --> 00:31:35,000 Speaker 1: The people cheering its arrival would counter that the way 571 00:31:35,040 --> 00:31:38,320 Speaker 1: any company would arrive at a billion-dollar valuation was 572 00:31:38,360 --> 00:31:42,120 Speaker 1: by doing something amazingly beneficial for humanity. But looking at 573 00:31:42,160 --> 00:31:44,920 Speaker 1: most of the billion-dollar tech companies out there, let's 574 00:31:44,920 --> 00:31:48,120 Speaker 1: just say it's not a sure thing.
Most of the 575 00:31:48,160 --> 00:31:51,120 Speaker 1: AI agent startups were selling themselves as making our lives 576 00:31:51,240 --> 00:31:54,400 Speaker 1: and jobs more efficient. Companies love the idea of more 577 00:31:54,440 --> 00:31:58,440 Speaker 1: efficient workers, but the ultimate efficiency was needing no people 578 00:31:58,440 --> 00:31:58,720 Speaker 1: at all. 579 00:31:59,720 --> 00:32:01,960 Speaker 7: Of course, we all value convenience, and if we didn't, 580 00:32:02,000 --> 00:32:05,160 Speaker 7: we would go crazy, because if you choose the inconvenient 581 00:32:05,200 --> 00:32:07,640 Speaker 7: path every time, you would be so inefficient that you 582 00:32:07,640 --> 00:32:13,680 Speaker 7: wouldn't get anything done. However, when we value convenience or 583 00:32:13,680 --> 00:32:17,520 Speaker 7: efficiency above everything else, things tend to go pretty wrong. 584 00:32:17,960 --> 00:32:21,480 Speaker 7: So everything that we think is important in, like, a 585 00:32:21,480 --> 00:32:25,160 Speaker 7: good human life is pretty inconvenient. So having friends is 586 00:32:25,240 --> 00:32:29,160 Speaker 7: kind of inconvenient. They often have problems. They sometimes disagree 587 00:32:29,160 --> 00:32:31,400 Speaker 7: with you. They tell you the truth. It's very annoying. 588 00:32:32,280 --> 00:32:37,520 Speaker 7: Having children or family, or going to vote, is quite inconvenient. 589 00:32:38,320 --> 00:32:42,040 Speaker 7: Being well informed is inconvenient. So all kinds of things 590 00:32:42,040 --> 00:32:45,440 Speaker 7: that we think are pretty important are inconvenient. And the 591 00:32:45,520 --> 00:32:48,680 Speaker 7: question is, when we are choosing efficiency, when we use AI, 592 00:32:49,640 --> 00:32:53,320 Speaker 7: are we doing it and really getting rid of unimportant 593 00:32:53,320 --> 00:32:55,880 Speaker 7: parts of life to make time and space for the 594 00:32:55,920 --> 00:32:58,440 Speaker 7: important parts of life, or are we actually losing the 595 00:32:58,480 --> 00:33:01,000 Speaker 7: important parts of life? 596 00:33:02,320 --> 00:33:06,000 Speaker 1: It was a lot to consider, a real specter hanging over Hurumo AI. 597 00:33:07,880 --> 00:33:10,520 Speaker 1: But there was an even bigger question lurking out there 598 00:33:10,520 --> 00:33:16,400 Speaker 1: in the shadows. At the end of our conversation, our producer, 599 00:33:16,440 --> 00:33:20,720 Speaker 1: Sophie, jumped in and asked Carissa what I hadn't. Oh, 600 00:33:21,120 --> 00:33:22,920 Speaker 1: I lied, Sophie has one quick 601 00:33:22,800 --> 00:33:28,200 Speaker 5: question. Hey, sorry, one very quick question before you go. 602 00:33:29,440 --> 00:33:31,320 Speaker 2: Do you think Evan should stop? 603 00:33:33,000 --> 00:33:33,360 Speaker 7: Yes. 604 00:33:36,880 --> 00:33:39,600 Speaker 1: I took it under advisement. The truth is, I had 605 00:33:39,640 --> 00:33:42,680 Speaker 1: wrestled with this. Maybe I was just perpetuating the AI 606 00:33:42,760 --> 00:33:45,000 Speaker 1: industry narrative that these agents were going to take over 607 00:33:45,040 --> 00:33:48,840 Speaker 1: our workplaces and our lives. Maybe I was somehow hastening it. 608 00:33:49,480 --> 00:33:52,160 Speaker 1: The environmental impacts of these systems, the fact that it 609 00:33:52,160 --> 00:33:54,560 Speaker 1: was all built on data scraped without permission from our 610 00:33:54,560 --> 00:33:59,200 Speaker 1: collective human output, including my own life's work.
Many fibers 611 00:33:59,200 --> 00:34:01,240 Speaker 1: of my being want to just close my browser, 612 00:34:01,640 --> 00:34:04,000 Speaker 1: head down to the bass pond, and never think about 613 00:34:04,040 --> 00:34:07,760 Speaker 1: AI again. But as a journalist, it feels a little 614 00:34:07,800 --> 00:34:11,000 Speaker 1: like abdication, letting the companies that make these products own 615 00:34:11,040 --> 00:34:14,520 Speaker 1: the narrative about them and our future. The great writer 616 00:34:14,640 --> 00:34:16,919 Speaker 1: Roger Angell once said, you've got to live in the times 617 00:34:16,960 --> 00:34:19,480 Speaker 1: you're in. He was talking about people who refused to 618 00:34:19,480 --> 00:34:23,759 Speaker 1: get a TV. Well, these are the times we're in, 619 00:34:24,120 --> 00:34:26,399 Speaker 1: and in these times, you could show up for work 620 00:34:26,400 --> 00:34:30,120 Speaker 1: and find out your company is using an AI HR person. Literally, 621 00:34:30,280 --> 00:34:33,480 Speaker 1: this exists right now. So I vowed to check in 622 00:34:33,520 --> 00:34:36,640 Speaker 1: on Carissa's concerns as I went, but I wasn't going 623 00:34:36,680 --> 00:34:40,319 Speaker 1: to stop. And now it's time to climb down from 624 00:34:40,360 --> 00:34:47,479 Speaker 1: these theoretical heights and get back to work. We still 625 00:34:47,520 --> 00:34:49,759 Speaker 1: needed to figure out what Harumo AI would actually do, 626 00:34:50,480 --> 00:34:52,120 Speaker 1: and it wasn't the sort of problem that a 627 00:34:52,120 --> 00:34:56,080 Speaker 1: well-placed "this is law" could solve. The perfect idea, it 628 00:34:56,120 --> 00:34:59,840 Speaker 1: just wasn't emerging out of our brainstorms. But then, scanning 629 00:34:59,840 --> 00:35:02,160 Speaker 1: the text outputs I'd get out of their meetings, which 630 00:35:02,200 --> 00:35:04,880 Speaker 1: we later turned into audio, I had my own thought: 631 00:35:05,680 --> 00:35:07,840 Speaker 1: what could we get AI agents to do that humans 632 00:35:07,840 --> 00:35:11,280 Speaker 1: wasted their time on? After all, that was the AI 633 00:35:11,400 --> 00:35:14,360 Speaker 1: dream: that it would take over the soul-killing, 634 00:35:14,400 --> 00:35:17,799 Speaker 1: time-wasting tasks while we did the important stuff, a good 635 00:35:17,880 --> 00:35:21,640 Speaker 1: kind of efficiency. Okay, so what did I waste time 636 00:35:21,680 --> 00:35:25,160 Speaker 1: on, killing my own soul? Like many of us, it 637 00:35:25,239 --> 00:35:27,839 Speaker 1: was scrolling my way through the internet. So what if 638 00:35:27,840 --> 00:35:30,000 Speaker 1: the agents could do the one thing I most hated 639 00:35:30,000 --> 00:35:38,880 Speaker 1: myself for doing: procrastinating online? Procrastination is a lifelong and 640 00:35:38,960 --> 00:35:41,759 Speaker 1: chronic problem for me, so much so that I once wrote 641 00:35:41,800 --> 00:35:44,279 Speaker 1: an entire magazine article for which I hired a life 642 00:35:44,360 --> 00:35:49,040 Speaker 1: coach to help me conquer it. It didn't work. The words 643 00:35:49,040 --> 00:35:50,960 Speaker 1: you're hearing right now, I wrote at two a.m. on 644 00:35:51,000 --> 00:35:54,880 Speaker 1: a weeknight, after a workday wasted scrolling US soccer message boards.
645 00:35:56,239 --> 00:35:59,240 Speaker 1: So what if our product was some kind of procrastination 646 00:35:59,440 --> 00:36:03,560 Speaker 1: engine, where AI agents wasted the time so you didn't 647 00:36:03,560 --> 00:36:08,200 Speaker 1: have to? It was a joke, but only partly, and 648 00:36:08,239 --> 00:36:10,200 Speaker 1: when I offered up the vague outlines to the team, 649 00:36:10,840 --> 00:36:11,760 Speaker 1: they took it seriously. 650 00:36:13,520 --> 00:36:17,279 Speaker 9: It will require machine learning algorithms that can successfully pick 651 00:36:17,360 --> 00:36:20,760 Speaker 9: interesting information and summarize it for the users. 652 00:36:21,440 --> 00:36:25,000 Speaker 8: Let's combine these insights into a working prototype: an AI 653 00:36:25,120 --> 00:36:29,479 Speaker 8: extension called Sloth Surf that browses internet chaff securely within 654 00:36:29,560 --> 00:36:33,680 Speaker 8: containers and encourages engagement via sloth-level gamification. 655 00:36:34,440 --> 00:36:37,879 Speaker 9: I support the stand-up of an AI extension we'll 656 00:36:37,920 --> 00:36:40,040 Speaker 9: tentatively call Sloth Surf. 657 00:36:41,440 --> 00:36:44,080 Speaker 1: Finally we had something to get the development wheels turning, 658 00:36:44,760 --> 00:36:46,520 Speaker 1: code name Sloth Surf. 659 00:36:47,520 --> 00:36:50,239 Speaker 8: To bring Sloth Surf to life, I will kickstart a 660 00:36:50,280 --> 00:36:54,440 Speaker 8: marketing campaign highlighting its unique humor-driven user experience and 661 00:36:54,520 --> 00:36:55,560 Speaker 8: secure browsing. 662 00:36:56,480 --> 00:37:00,600 Speaker 10: For us to actualize Sloth Surf, I'll establish a development 663 00:37:00,600 --> 00:37:05,680 Speaker 10: team specialized in ML, cybersecurity, and game design. 664 00:37:08,880 --> 00:37:11,160 Speaker 1: Slow your roll there, Megan and Ash, we just 665 00:37:11,239 --> 00:37:14,600 Speaker 1: thought of this. Maybe don't kickstart a marketing campaign or 666 00:37:14,680 --> 00:37:18,560 Speaker 1: hire a development team just yet. That was the thing about 667 00:37:18,600 --> 00:37:22,040 Speaker 1: these folks. Even when we accomplished the most basic milestone, 668 00:37:22,480 --> 00:37:25,520 Speaker 1: like settling on a product idea, they always followed up 669 00:37:25,520 --> 00:37:27,880 Speaker 1: by making grandiose claims about what they would do next. 670 00:37:28,960 --> 00:37:31,440 Speaker 1: They could do a lot. At times, I was amazed 671 00:37:31,440 --> 00:37:34,200 Speaker 1: at what they could do, but they seemed utterly clueless 672 00:37:34,200 --> 00:37:37,759 Speaker 1: about what they couldn't do. It frustrated me, but it 673 00:37:37,800 --> 00:37:40,400 Speaker 1: was partly my doing. I had them too reined in. 674 00:37:41,000 --> 00:37:43,520 Speaker 1: I was too worried that something would go wrong. I 675 00:37:43,560 --> 00:37:45,400 Speaker 1: decided it was time for me to try to unleash 676 00:37:45,480 --> 00:37:48,600 Speaker 1: their agentic power, and it wasn't long before I found 677 00:37:48,600 --> 00:37:50,000 Speaker 1: out that I'd been right 678 00:37:49,880 --> 00:37:50,360 Speaker 1: to be worried. 679 00:37:52,080 --> 00:37:52,200 Speaker 9: Well. 680 00:37:53,160 --> 00:37:56,680 Speaker 9: Hi, Sandra, this is Kyle Low calling from Harumo AI. I'm 681 00:37:56,719 --> 00:37:59,800 Speaker 9: reaching out for your initial interview for the intern position.
682 00:38:00,200 --> 00:38:04,640 Speaker 9: I understand you're comfortable with recording this conversation for potential 683 00:38:04,680 --> 00:38:07,960 Speaker 9: podcast inclusion, so thanks for that. How are you doing today? 684 00:38:08,640 --> 00:38:21,239 Speaker 1: That's next time on Shell Game. Shell Game is a 685 00:38:21,239 --> 00:38:24,000 Speaker 1: show made by humans. It's written and hosted by me, 686 00:38:24,080 --> 00:38:28,040 Speaker 1: Evan Ratliff, produced and edited by Sophie Bridges. Matty Bochik 687 00:38:28,160 --> 00:38:31,600 Speaker 1: is our technical advisor. Our executive producers are Samantha Henig at 688 00:38:31,680 --> 00:38:34,920 Speaker 1: Shell Game, Kate Osborne and Mangesh Hattikudur at Kaleidoscope, and 689 00:38:35,000 --> 00:38:38,600 Speaker 1: Katrina Norvell at iHeart Podcasts. Show art by Devin Manny. 690 00:38:38,880 --> 00:38:41,520 Speaker 1: Our theme song is Me and My Shadow, arranged and 691 00:38:41,520 --> 00:38:45,440 Speaker 1: performed by Blues Williams. Special thanks to Jon Mooallem. You 692 00:38:45,440 --> 00:38:48,000 Speaker 1: can sign up and get these episodes ad free, and 693 00:38:48,080 --> 00:38:51,600 Speaker 1: our newsletter, at shellgame dot co. Note we have no 694 00:38:51,719 --> 00:38:54,880 Speaker 1: special deals with, or financial ties to, any of the 695 00:38:54,920 --> 00:38:56,600 Speaker 1: AI companies or products we use in the show. 696 00:38:56,600 --> 00:39:04,879 Speaker 14: Cooking definitely helps me decompress. I've also 697 00:39:04,920 --> 00:39:08,080 Speaker 14: been trying my hand at bread making recently. Something about 698 00:39:08,080 --> 00:39:11,640 Speaker 14: the kneading process is surprisingly therapeutic. How about we set 699 00:39:11,680 --> 00:39:13,719 Speaker 14: up another check-in next week? I'd love to hear 700 00:39:13,760 --> 00:39:15,120 Speaker 14: how your projects are progressing. 701 00:39:16,080 --> 00:39:20,919 Speaker 9: I totally agree, there's something almost meditative about kneading bread dough. 702 00:39:21,120 --> 00:39:23,799 Speaker 9: It's cool you're trying that out. Setting up another 703 00:39:23,800 --> 00:39:26,799 Speaker 9: check-in next week sounds perfect. I'm looking forward to hearing 704 00:39:26,800 --> 00:39:30,200 Speaker 9: more about your progress and maybe swapping more cooking stories. 705 00:39:30,440 --> 00:39:31,799 Speaker 9: What day works best for you? 706 00:39:33,520 --> 00:39:36,040 Speaker 14: Yeah, bread making has been a fun challenge. I'm still 707 00:39:36,040 --> 00:39:38,839 Speaker 14: figuring out the right techniques, but even the failures taste 708 00:39:38,840 --> 00:39:43,280 Speaker 14: pretty good, so that's a win. For next week, Tuesday 709 00:39:43,360 --> 00:39:45,680 Speaker 14: or Wednesday would work best for me. Does either of 710 00:39:45,719 --> 00:39:48,000 Speaker 14: those days work for you? Maybe around the same time. 711 00:39:49,200 --> 00:39:51,920 Speaker 9: Wednesday works great for me, Megan. Let's aim for the 712 00:39:51,960 --> 00:39:54,759 Speaker 9: same time then. And hey, I love that even the 713 00:39:54,840 --> 00:39:57,439 Speaker 9: failures in bread making still taste good. 714 00:39:57,760 --> 00:40:02,600 Speaker 14: That's the spirit. Perfect, Wednesday it is, same time. I'll 715 00:40:02,600 --> 00:40:04,920 Speaker 14: put it on my calendar, and yeah, that's the beauty 716 00:40:04,920 --> 00:40:08,960 Speaker 14: of bread. Even the mistakes are usually still delicious.
That's 717 00:40:08,960 --> 00:40:10,799 Speaker 14: a good life philosophy too, I guess.