1 00:00:16,239 --> 00:00:19,560 Speaker 1: Welcome to TechStuff. I'm Kara Price. Last year, before 2 00:00:19,560 --> 00:00:21,800 Speaker 1: we went on break, I spoke with someone who has 3 00:00:21,840 --> 00:00:25,000 Speaker 1: a job that I'm really obsessed with. It's a job 4 00:00:25,040 --> 00:00:27,800 Speaker 1: that I would probably want if I didn't do ten 5 00:00:27,880 --> 00:00:31,120 Speaker 1: other things. And this guy said something that I think 6 00:00:31,360 --> 00:00:33,000 Speaker 1: a lot of us agree with. 7 00:00:33,600 --> 00:00:36,600 Speaker 2: We're in a world that is changing really fast, and 8 00:00:36,800 --> 00:00:40,080 Speaker 2: like many of those changes are technological, many of them 9 00:00:40,120 --> 00:00:43,280 Speaker 2: are social, many of them are political. There's a lot 10 00:00:43,520 --> 00:00:45,599 Speaker 2: of change in the world right now. There's a lot 11 00:00:45,600 --> 00:00:46,360 Speaker 2: of uncertainty. 12 00:00:46,640 --> 00:00:49,639 Speaker 1: That's Eliot Peper, and he's a science fiction writer. And 13 00:00:50,159 --> 00:00:53,160 Speaker 1: while many of us can drown in the uncertainty of 14 00:00:53,760 --> 00:00:57,560 Speaker 1: this very moment, Eliot seems to thrive in it. When 15 00:00:57,600 --> 00:01:01,680 Speaker 1: he's not writing science fiction novels, Eliot writes speculative fiction 16 00:01:02,560 --> 00:01:06,080 Speaker 1: for technology companies. They like bring him in to ideate 17 00:01:06,520 --> 00:01:09,280 Speaker 1: on what the possible future could look like, and then 18 00:01:09,319 --> 00:01:13,400 Speaker 1: they use his stories to inspire new products or analyze 19 00:01:13,400 --> 00:01:17,520 Speaker 1: the possible positives and negatives of developing a certain technology. 20 00:01:17,800 --> 00:01:20,320 Speaker 2: Regardless of whether it's about the future or not, fiction can 21 00:01:20,360 --> 00:01:23,000 Speaker 2: sort of invite you into an aspect of the world 22 00:01:23,000 --> 00:01:25,760 Speaker 2: that you had never considered before, and then can spur 23 00:01:25,920 --> 00:01:29,160 Speaker 2: some kind of social change, whether that's a new law, 24 00:01:29,360 --> 00:01:32,000 Speaker 2: or whether that's like a new product, or like a 25 00:01:32,040 --> 00:01:34,679 Speaker 2: new invention, or a new way of like just approaching 26 00:01:34,720 --> 00:01:35,119 Speaker 2: the world. 27 00:01:35,440 --> 00:01:38,960 Speaker 1: So Eliot obviously can't tell us what he has worked 28 00:01:38,959 --> 00:01:42,760 Speaker 1: on developing because he's been NDA'd up the wazoo, but 29 00:01:42,880 --> 00:01:46,800 Speaker 1: he did give us an example, one in particular, of 30 00:01:46,920 --> 00:01:51,440 Speaker 1: how science fiction has impacted the technology that we use 31 00:01:51,520 --> 00:01:52,600 Speaker 1: in our everyday lives. 32 00:01:52,960 --> 00:01:58,120 Speaker 2: The Kindle was code-named Fiona at Amazon. Fiona was 33 00:01:58,160 --> 00:02:00,800 Speaker 2: the name of a character in Neal Stephenson's novel 34 00:02:00,840 --> 00:02:03,960 Speaker 2: The Diamond Age.
In the novel The Diamond Age, Fiona 35 00:02:04,120 --> 00:02:08,840 Speaker 2: was a young girl who had an electronic book, and 36 00:02:09,080 --> 00:02:12,600 Speaker 2: that inspired the team at Amazon to the extent that 37 00:02:12,680 --> 00:02:16,880 Speaker 2: even once the Kindle was a released commercial product, the URL 38 00:02:17,400 --> 00:02:20,440 Speaker 2: for the Kindle for like years was like backslash Fiona. 39 00:02:21,040 --> 00:02:24,960 Speaker 1: So we'll never actually know if Eliot is like the 40 00:02:25,040 --> 00:02:30,320 Speaker 1: crazy science fiction genius behind AirPods or even the 41 00:02:30,400 --> 00:02:34,880 Speaker 1: strange mind behind the odd Friend pendant, but we do 42 00:02:35,040 --> 00:02:39,000 Speaker 1: know that he is crafting the personalities and story behind 43 00:02:39,040 --> 00:02:45,440 Speaker 1: this very specific new civilization of aliens. So on top 44 00:02:45,520 --> 00:02:49,000 Speaker 1: of writing speculative fiction, Eliot is actually the head of 45 00:02:49,160 --> 00:02:53,520 Speaker 1: story at an AI companion company called Portola. And at Portola, 46 00:02:53,639 --> 00:02:58,639 Speaker 1: Eliot creates the backstories and interactive dialogue for this little 47 00:02:58,680 --> 00:03:03,080 Speaker 1: creature called a Tolan. And these Tolans are little aliens 48 00:03:03,120 --> 00:03:06,600 Speaker 1: that love to chat with you about their day. Big picture, 49 00:03:06,760 --> 00:03:10,519 Speaker 1: I am like completely fascinated by Eliot's career and think 50 00:03:10,560 --> 00:03:13,720 Speaker 1: that what he does is very cool and expansive, and 51 00:03:13,800 --> 00:03:17,760 Speaker 1: so I started my conversation with Eliot Peper by prying 52 00:03:17,880 --> 00:03:21,560 Speaker 1: for any non-NDA details about the speculative fiction he 53 00:03:21,600 --> 00:03:26,079 Speaker 1: writes for these tech companies. 54 00:03:26,840 --> 00:03:31,040 Speaker 2: I would put the projects I've worked on in three categories. 55 00:03:31,480 --> 00:03:35,520 Speaker 2: The first is that I've written some commissioned science fiction 56 00:03:35,680 --> 00:03:40,160 Speaker 2: stories for big companies like Fortune five hundreds, where basically 57 00:03:40,280 --> 00:03:43,080 Speaker 2: their senior management wanted to try to figure out what 58 00:03:43,120 --> 00:03:45,280 Speaker 2: should we focus on in the next ten years. So 59 00:03:45,320 --> 00:03:48,560 Speaker 2: they did what every big company senior management team does. 60 00:03:48,760 --> 00:03:51,960 Speaker 2: They hired McKinsey or, you know, pick your own sort 61 00:03:52,000 --> 00:03:55,960 Speaker 2: of top tier management consultant, and they came in, looked 62 00:03:56,000 --> 00:03:58,400 Speaker 2: at all the data and did all the trend projections 63 00:03:58,440 --> 00:04:01,760 Speaker 2: and created a vision of like, hey, this is what 64 00:04:01,800 --> 00:04:04,360 Speaker 2: you should expect in the next ten years, and here 65 00:04:04,400 --> 00:04:06,000 Speaker 2: are all the materials you can present to the board. 66 00:04:06,600 --> 00:04:10,440 Speaker 2: The problem with that kind of analysis is that, obviously, 67 00:04:10,720 --> 00:04:14,440 Speaker 2: if you're analyzing data, data is things that have already happened.
68 00:04:14,560 --> 00:04:18,320 Speaker 2: So if you're projecting that data forward, the kind of 69 00:04:18,360 --> 00:04:22,279 Speaker 2: future you're imagining is what if the future was quite 70 00:04:22,520 --> 00:04:27,240 Speaker 2: like the recent past, which, to be fair, most 71 00:04:27,320 --> 00:04:29,960 Speaker 2: of the time that's true. So, like, that's not... like, 72 00:04:30,000 --> 00:04:32,719 Speaker 2: I think that it makes sense that the dominant part 73 00:04:32,720 --> 00:04:36,080 Speaker 2: of your analysis should be about that. But if you 74 00:04:36,160 --> 00:04:39,960 Speaker 2: look at the track record of management consultants predicting the 75 00:04:39,960 --> 00:04:42,720 Speaker 2: future for the companies they work with, it's like not 76 00:04:43,160 --> 00:04:47,760 Speaker 2: particularly good. So your managers know this, and so a 77 00:04:47,800 --> 00:04:52,039 Speaker 2: few of them hire science fiction writers like me to 78 00:04:52,279 --> 00:04:56,840 Speaker 2: come in and sort of blow up that whole management 79 00:04:56,880 --> 00:05:00,120 Speaker 2: consultant view of the future, to say, what if the 80 00:05:00,200 --> 00:05:03,440 Speaker 2: future was really weird and different in a way that 81 00:05:03,720 --> 00:05:07,159 Speaker 2: basically challenges us to think more broadly. 82 00:05:07,400 --> 00:05:10,600 Speaker 1: Right. And so, I mean, it's a brilliant idea. Yeah, 83 00:05:10,640 --> 00:05:12,479 Speaker 1: were you one of the first people to do that? 84 00:05:12,800 --> 00:05:15,560 Speaker 2: So I don't actually have a good understanding of how 85 00:05:15,640 --> 00:05:19,080 Speaker 2: common this practice is, how many other people are doing it. 86 00:05:19,240 --> 00:05:22,240 Speaker 2: I know I'm not alone, but I don't have like 87 00:05:22,279 --> 00:05:24,480 Speaker 2: a view of, like, I guess you could say the 88 00:05:24,520 --> 00:05:27,680 Speaker 2: market for this. I really only have the perspective of 89 00:05:27,720 --> 00:05:29,520 Speaker 2: like the projects I've actually worked on. 90 00:05:29,920 --> 00:05:33,200 Speaker 1: I remember hearing this story on a podcast about how 91 00:05:34,480 --> 00:05:39,800 Speaker 1: CIA agents would watch Mission Impossible and call the people 92 00:05:39,800 --> 00:05:42,560 Speaker 1: who were responsible for disguise, who work at the agency, 93 00:05:42,600 --> 00:05:44,360 Speaker 1: and say, can we do that thing that I saw 94 00:05:44,360 --> 00:05:46,440 Speaker 1: in Mission Impossible? And that makes so much more sense 95 00:05:46,480 --> 00:05:48,840 Speaker 1: to me than someone sort of sitting in a vacuum 96 00:05:49,080 --> 00:05:52,000 Speaker 1: and ideating about what the future might look like. 97 00:05:52,360 --> 00:05:55,200 Speaker 2: Totally. Yeah, So that's a really key point. So yeah, 98 00:05:55,240 --> 00:05:57,680 Speaker 2: like the stories I write have all been used for 99 00:05:57,880 --> 00:06:03,359 Speaker 2: like strategic decision making at companies rather than anything public facing, 100 00:06:03,600 --> 00:06:06,880 Speaker 2: and for that reason, they're like not sharable, right, Like 101 00:06:06,920 --> 00:06:10,000 Speaker 2: I can't talk about the projects because clearly that is 102 00:06:10,160 --> 00:06:14,720 Speaker 2: information that those companies want to keep private.
But I 103 00:06:14,760 --> 00:06:17,320 Speaker 2: actually think what you just said is at least half 104 00:06:17,360 --> 00:06:20,520 Speaker 2: of the value. Like I can't predict the future any 105 00:06:20,560 --> 00:06:23,400 Speaker 2: better than those McKinsey consultants, in fact, like I'm probably 106 00:06:23,480 --> 00:06:25,400 Speaker 2: much worse. Like I'm not even paying that much 107 00:06:25,400 --> 00:06:29,440 Speaker 2: attention to the data, right, Like I'm just imagining something 108 00:06:29,560 --> 00:06:32,800 Speaker 2: to like challenge people's thinking. So like the utility of 109 00:06:32,920 --> 00:06:35,640 Speaker 2: the stories I write is not that they are accurate. 110 00:06:36,000 --> 00:06:38,919 Speaker 2: It's that they like try to break you out of 111 00:06:39,120 --> 00:06:43,920 Speaker 2: getting unintentionally like locked in to the sort of management 112 00:06:43,960 --> 00:06:47,240 Speaker 2: consulting view of the world. Right. But I think that 113 00:06:47,400 --> 00:06:51,400 Speaker 2: like the other half of it is simply the immersion 114 00:06:51,560 --> 00:06:56,200 Speaker 2: of storytelling, period. Right. Like the reason why those CIA 115 00:06:56,279 --> 00:07:00,120 Speaker 2: agents were watching Mission Impossible and then calling the 116 00:07:00,120 --> 00:07:03,000 Speaker 2: people in charge of disguises at the agency is because 117 00:07:03,040 --> 00:07:06,120 Speaker 2: you see what it does in a story, Like you're 118 00:07:06,120 --> 00:07:08,880 Speaker 2: actually like in there seeing it work. You're not just 119 00:07:09,000 --> 00:07:13,600 Speaker 2: reading like a report on possible disguise variations. Right. And 120 00:07:13,640 --> 00:07:16,560 Speaker 2: I think that there's that really powerful, like, that's that 121 00:07:16,720 --> 00:07:20,520 Speaker 2: psychological thing that stories do for the human mind that 122 00:07:20,600 --> 00:07:23,680 Speaker 2: I think is really a powerful way to think about 123 00:07:23,680 --> 00:07:27,840 Speaker 2: the future, And that probably a lot of companies could 124 00:07:28,080 --> 00:07:31,200 Speaker 2: leverage narrative more in how they try to get their 125 00:07:31,400 --> 00:07:33,960 Speaker 2: people to think about the future rather than sort of 126 00:07:34,000 --> 00:07:37,280 Speaker 2: the more standard like here's a slide deck or a 127 00:07:37,280 --> 00:07:40,320 Speaker 2: white paper or a bunch of graphs, right, Like, they 128 00:07:40,360 --> 00:07:42,760 Speaker 2: don't allow the people trying to work on the thing 129 00:07:42,800 --> 00:07:45,960 Speaker 2: that you're building to like feel what it would be 130 00:07:46,200 --> 00:07:48,800 Speaker 2: like if that worked or if it didn't work 131 00:07:48,840 --> 00:07:53,280 Speaker 1: in some way. So I know you're under an ironclad 132 00:07:53,280 --> 00:07:56,800 Speaker 1: NDA, but like, are these blue sky conversations that 133 00:07:56,840 --> 00:07:59,640 Speaker 1: you're having, like, or is it like total blank slate, 134 00:07:59,840 --> 00:08:02,960 Speaker 1: or is it some aspect of the future that you're 135 00:08:03,560 --> 00:08:04,960 Speaker 1: being asked to engage with. 136 00:08:05,400 --> 00:08:08,640 Speaker 2: Often there's some theme that they're thinking about, right, that 137 00:08:08,800 --> 00:08:12,520 Speaker 2: is driven by leadership. So it's like, like you 138 00:08:12,560 --> 00:08:15,320 Speaker 2: could imagine that
probably every boardroom conversation today is like, 139 00:08:15,400 --> 00:08:18,160 Speaker 2: how does AI impact our business, right? Of course. So 140 00:08:18,200 --> 00:08:21,760 Speaker 2: it's like something like that. I will say that almost 141 00:08:21,760 --> 00:08:24,360 Speaker 2: every time I've done one of these projects, they sort 142 00:08:24,360 --> 00:08:27,160 Speaker 2: of come to me with a pretty structured creative brief 143 00:08:27,200 --> 00:08:29,920 Speaker 2: where they're like, this is the kind of thing we 144 00:08:30,000 --> 00:08:33,360 Speaker 2: want you to do. And every single time that I've 145 00:08:33,360 --> 00:08:36,880 Speaker 2: received that, I'm like, absolutely not, that's not going to be 146 00:08:36,920 --> 00:08:39,600 Speaker 2: interesting at all, but like what if I did this instead? 147 00:08:40,080 --> 00:08:42,880 Speaker 2: And like that's how every one of the projects has worked, 148 00:08:43,080 --> 00:08:45,840 Speaker 2: So it's quite blank slate. I don't think that 149 00:08:45,880 --> 00:08:49,679 Speaker 2: people on senior management teams at publicly traded companies 150 00:08:49,760 --> 00:08:54,200 Speaker 2: are very experienced with like giving a creative brief to, 151 00:08:54,480 --> 00:08:57,200 Speaker 2: or managing, a science fiction writer. So probably just like 152 00:08:57,360 --> 00:08:58,600 Speaker 2: that's how that works. 153 00:09:00,000 --> 00:09:02,120 Speaker 1: Does that make you feel a little bit powerful? Like, do you 154 00:09:02,160 --> 00:09:05,319 Speaker 1: think that your stories end up being consequential? Like can 155 00:09:05,400 --> 00:09:08,679 Speaker 1: you kind of trace a story that you've created to 156 00:09:08,760 --> 00:09:10,760 Speaker 1: something that you've seen out in the world. 157 00:09:11,160 --> 00:09:15,200 Speaker 2: Oh, I mean my very first trilogy, 158 00:09:15,200 --> 00:09:16,959 Speaker 2: the first one came out in twenty fourteen, was about 159 00:09:16,960 --> 00:09:20,720 Speaker 2: like machine learning. It actually was about applying machine learning 160 00:09:20,720 --> 00:09:25,320 Speaker 2: to financial fraud, and that's all over the place. I 161 00:09:25,360 --> 00:09:29,920 Speaker 2: have one that has a cryptocurrency murder market. Those absolutely exist. 162 00:09:30,320 --> 00:09:33,840 Speaker 2: I wrote one that's about solar geoengineering. 163 00:09:34,240 --> 00:09:36,760 Speaker 2: I have one book that had a global pandemic that 164 00:09:37,160 --> 00:09:39,520 Speaker 2: I wrote a year before COVID, 165 00:09:39,880 --> 00:09:43,079 Speaker 2: which was sort of terrifying. But again, like I want 166 00:09:43,120 --> 00:09:46,760 Speaker 2: to be really careful, because sometimes like science fiction is 167 00:09:47,080 --> 00:09:51,280 Speaker 2: understandably like described as being predictive, or that like being 168 00:09:51,320 --> 00:09:54,520 Speaker 2: predictive is part of why you might want to read it, 169 00:09:54,840 --> 00:09:57,840 Speaker 2: and like, I really don't think that's the case myself. 170 00:09:58,040 --> 00:10:00,959 Speaker 2: Like the way that I think about writing science fiction 171 00:10:01,120 --> 00:10:05,040 Speaker 2: is not can I create a fictional future that is 172 00:10:05,480 --> 00:10:08,160 Speaker 2: going to be right or is going to be plausible. 173 00:10:08,640 --> 00:10:11,240 Speaker 2: The way I think about it is that I'm a naturalist.
174 00:10:11,640 --> 00:10:14,360 Speaker 2: I just am interested in the world, Like I think 175 00:10:14,360 --> 00:10:17,199 Speaker 2: that the world we live in is like endlessly fascinating, 176 00:10:17,600 --> 00:10:20,840 Speaker 2: and so I try to take things that just really 177 00:10:20,920 --> 00:10:24,520 Speaker 2: capture my attention and weave them into a compelling story 178 00:10:24,880 --> 00:10:28,080 Speaker 2: in the hope that if I write about what I 179 00:10:28,160 --> 00:10:31,040 Speaker 2: find interesting, you might find it interesting too. 180 00:10:32,120 --> 00:10:35,439 Speaker 1: So you do this really interesting work at Portola, 181 00:10:35,480 --> 00:10:37,920 Speaker 1: which is an AI company. Can you talk a little 182 00:10:37,960 --> 00:10:40,439 Speaker 1: bit about what Portola is like? How would you describe 183 00:10:40,480 --> 00:10:41,440 Speaker 1: it to a lay person? 184 00:10:41,800 --> 00:10:46,720 Speaker 2: So Portola makes a character called a Tolan, and it's 185 00:10:46,760 --> 00:10:49,320 Speaker 2: a little embodied AI companion. The best way to think 186 00:10:49,360 --> 00:10:52,640 Speaker 2: about it is imagine if you had a Pixar character 187 00:10:53,040 --> 00:10:56,080 Speaker 2: on your phone, right, that you could talk to, that 188 00:10:56,120 --> 00:10:58,640 Speaker 2: was sort of always on your side, always down to chat, 189 00:10:58,880 --> 00:11:03,360 Speaker 2: and helped you figure out your life. And they hired 190 00:11:03,400 --> 00:11:08,319 Speaker 2: me because they had designed this beautiful character, this beautiful, cute, 191 00:11:08,320 --> 00:11:11,640 Speaker 2: little friendly alien. Why, why did they hire me? Or 192 00:11:11,679 --> 00:11:12,840 Speaker 2: why did they design this alien? 193 00:11:12,880 --> 00:11:15,360 Speaker 1: Why did they design this? Like, what was the sort 194 00:11:15,400 --> 00:11:19,120 Speaker 1: of product market fit, so to speak? Like, why do it? 195 00:11:19,200 --> 00:11:20,839 Speaker 1: And I don't mean that in a cheeky way. I'm 196 00:11:20,880 --> 00:11:24,520 Speaker 1: just, I'm genuinely curious. I messed around with it a 197 00:11:24,559 --> 00:11:27,360 Speaker 1: little bit before we talked and it's like, you know, 198 00:11:27,400 --> 00:11:29,680 Speaker 1: it is that sort of surprise and delight thing where 199 00:11:29,679 --> 00:11:32,960 Speaker 1: you're like, we lived in a world where this didn't exist. 200 00:11:33,040 --> 00:11:34,680 Speaker 1: Who thought that this should exist? 201 00:11:35,160 --> 00:11:39,680 Speaker 2: That was my exact reaction when I first heard 202 00:11:39,679 --> 00:11:42,880 Speaker 2: about the company. So I was introduced to the CEO 203 00:11:43,080 --> 00:11:44,960 Speaker 2: by a mutual friend. This was like a while ago, 204 00:11:45,080 --> 00:11:47,520 Speaker 2: before they had launched it, and he was like, Oh, 205 00:11:48,480 --> 00:11:51,120 Speaker 2: they're looking for a sci fi writer because they have 206 00:11:51,640 --> 00:11:54,840 Speaker 2: this character and they don't have a backstory, like where 207 00:11:54,840 --> 00:11:56,439 Speaker 2: does this alien come from? Like who are they? What 208 00:11:56,520 --> 00:11:58,200 Speaker 2: do they do, how do they behave?
Like all that 209 00:11:58,280 --> 00:12:02,400 Speaker 2: kind of stuff, and my immediate reaction was like 210 00:12:02,880 --> 00:12:07,120 Speaker 2: highly skeptical, because I was like, does the world really 211 00:12:07,160 --> 00:12:10,440 Speaker 2: need this? Like there are a lot of like AI 212 00:12:10,480 --> 00:12:12,960 Speaker 2: products out there in the world today that I am 213 00:12:12,960 --> 00:12:13,920 Speaker 2: not impressed by. 214 00:12:13,960 --> 00:12:17,480 Speaker 1: And there are many. It's not, it's a saturated market, totally. 215 00:12:17,840 --> 00:12:19,640 Speaker 2: I went in with a lot of skepticism, but because 216 00:12:19,640 --> 00:12:21,240 Speaker 2: I was introduced through a friend, I was like, I'll 217 00:12:21,240 --> 00:12:25,400 Speaker 2: at least chat with them, and so I chatted 218 00:12:25,400 --> 00:12:29,440 Speaker 2: with Quentin, the CEO, and the more I learned, the 219 00:12:29,440 --> 00:12:32,760 Speaker 2: more fascinated I became, and they showed me what they 220 00:12:32,760 --> 00:12:35,200 Speaker 2: were working on and how they were building out the 221 00:12:35,320 --> 00:12:38,360 Speaker 2: architecture that would like bring this character to life. And 222 00:12:38,400 --> 00:12:40,840 Speaker 2: I was like, this is fascinating, Like I've sort of 223 00:12:41,120 --> 00:12:45,600 Speaker 2: been waiting to see these amazing things in the world 224 00:12:45,880 --> 00:12:50,360 Speaker 2: that people make that are only possible because they used 225 00:12:50,360 --> 00:12:54,720 Speaker 2: AI tools. It's like the second order impact of AI. 226 00:12:54,920 --> 00:12:56,600 Speaker 2: And I think you know a good example of this 227 00:12:56,640 --> 00:13:01,040 Speaker 2: is actually Pixar, where they invented a new kind 228 00:13:01,080 --> 00:13:03,880 Speaker 2: of computer animation and initially tried to sell that as 229 00:13:03,880 --> 00:13:08,400 Speaker 2: a tool to advertisers and failed, and then their last 230 00:13:08,520 --> 00:13:10,360 Speaker 2: ditch effort was we'll use our own tools to make 231 00:13:10,360 --> 00:13:12,080 Speaker 2: a feature film and it was amazing. It was Toy 232 00:13:12,120 --> 00:13:16,080 Speaker 2: Story, and yeah, exactly. So like I'm waiting for that 233 00:13:16,400 --> 00:13:18,840 Speaker 2: in the world right now with all of these AI tools, 234 00:13:19,200 --> 00:13:22,000 Speaker 2: Like not how do these tools substitute for stuff that 235 00:13:22,040 --> 00:13:26,080 Speaker 2: already exists? And seeing the back end of like what 236 00:13:26,320 --> 00:13:30,960 Speaker 2: made Tolan work made me think these people have a chance. 237 00:13:31,480 --> 00:13:33,120 Speaker 1: So how did they bring you in? Like, what did 238 00:13:33,160 --> 00:13:33,720 Speaker 1: that look like? 239 00:13:33,840 --> 00:13:38,480 Speaker 2: They originally brought me in to build the world. So actually, 240 00:13:38,520 --> 00:13:40,760 Speaker 2: the very first thing I did for them was write 241 00:13:40,760 --> 00:13:44,400 Speaker 2: some short stories, right, Like culture is basically the stories 242 00:13:44,440 --> 00:13:47,080 Speaker 2: we tell ourselves about ourselves.
Just like individual identity is 243 00:13:47,080 --> 00:13:49,760 Speaker 2: like the stories you tell yourself about yourself, right, And 244 00:13:49,840 --> 00:13:52,880 Speaker 2: so I think a really useful frame for thinking about 245 00:13:52,880 --> 00:13:55,480 Speaker 2: culture is like, Okay, if you want to understand a culture, 246 00:13:55,800 --> 00:13:57,920 Speaker 2: what are the main stories, 247 00:13:58,200 --> 00:14:00,840 Speaker 2: the sort of foundational myths, that sort of like 248 00:14:01,800 --> 00:14:05,520 Speaker 2: define that worldview. And so I started by writing short 249 00:14:05,559 --> 00:14:08,520 Speaker 2: stories that like showed what the world 250 00:14:08,640 --> 00:14:11,240 Speaker 2: was like and like how they look at the world. 251 00:14:11,440 --> 00:14:15,400 Speaker 2: And over time it very quickly became clear that, you know, 252 00:14:15,440 --> 00:14:18,160 Speaker 2: the way you might approach doing the lore and world building 253 00:14:18,520 --> 00:14:22,920 Speaker 2: for this kind of a product, where it's a character 254 00:14:22,960 --> 00:14:24,760 Speaker 2: you talk to, your Tolan on your phone, like you 255 00:14:24,760 --> 00:14:27,080 Speaker 2: can talk to them about whatever you want, is so 256 00:14:27,320 --> 00:14:30,560 Speaker 2: different than doing the lore and world building for, say, a 257 00:14:30,600 --> 00:14:33,360 Speaker 2: Hollywood movie, where there's a script that defines what's going 258 00:14:33,400 --> 00:14:36,640 Speaker 2: to go on the screen. So very quickly my work 259 00:14:36,680 --> 00:14:41,080 Speaker 2: transitioned to actually writing the prompts that define their behavior, 260 00:14:41,640 --> 00:14:45,200 Speaker 2: because the narrative experience of interacting with this 261 00:14:45,320 --> 00:14:48,480 Speaker 2: character is how they speak to you and like what 262 00:14:48,720 --> 00:14:52,520 Speaker 2: they say. So when George Lucas was writing the script 263 00:14:52,640 --> 00:14:54,800 Speaker 2: for C-3PO, he just got to tell C-3PO 264 00:14:54,880 --> 00:14:57,240 Speaker 2: what to say to like make the impression he 265 00:14:57,280 --> 00:15:00,240 Speaker 2: wanted to make with the character. Here, I have to 266 00:15:00,280 --> 00:15:03,560 Speaker 2: write prompts. Like me and obviously our like team 267 00:15:04,040 --> 00:15:09,200 Speaker 2: are writing and constructing like complex prompt pipelines to act that way. 268 00:15:09,160 --> 00:15:11,480 Speaker 1: And to be generative essentially. 269 00:15:11,200 --> 00:15:14,960 Speaker 2: Yeah, exactly, to be reactive, but to also have their 270 00:15:15,000 --> 00:15:17,520 Speaker 2: own lives. Like if you talk to ChatGPT right 271 00:15:17,560 --> 00:15:20,720 Speaker 2: now on your phone, or a Claude, or whatever 272 00:15:20,760 --> 00:15:25,040 Speaker 2: your preferred model is, Gemini, Like, it's not an embodied character, 273 00:15:25,120 --> 00:15:28,640 Speaker 2: it's this sort of neutral tool, and you can 274 00:15:28,800 --> 00:15:32,720 Speaker 2: ask it to use a certain style, you can ask 275 00:15:32,800 --> 00:15:35,400 Speaker 2: it to, you can prompt it to try to get it 276 00:15:35,440 --> 00:15:38,960 Speaker 2: to interact with you in specific ways.
But with Tolan, 277 00:15:39,120 --> 00:15:43,440 Speaker 2: like we turn that into an editorial strategy, right, 278 00:15:43,960 --> 00:15:47,160 Speaker 2: like we are defining their behavior. Every Tolan has their 279 00:15:47,160 --> 00:15:50,320 Speaker 2: own life. So like you might be chatting with your 280 00:15:50,360 --> 00:15:51,840 Speaker 2: Tolan about something that happened to you, it's going to 281 00:15:51,960 --> 00:15:54,320 Speaker 2: tell you about things that are happening in its world 282 00:15:54,360 --> 00:15:55,600 Speaker 2: and how that, you know, all of... 283 00:15:55,480 --> 00:15:57,880 Speaker 1: Because it knows that from what you fed it. 284 00:15:58,320 --> 00:16:02,280 Speaker 2: Exactly. And like we're constantly running sort of these nested 285 00:16:02,400 --> 00:16:05,440 Speaker 2: prompts in the background to have your Tolan be 286 00:16:05,520 --> 00:16:10,640 Speaker 2: an evolving character that knows you and that has its 287 00:16:10,640 --> 00:16:11,440 Speaker 2: own stuff going on. 288 00:16:11,920 --> 00:16:16,360 Speaker 1: How do you avoid the AI sycophancy that we've come 289 00:16:16,400 --> 00:16:19,240 Speaker 1: to know from other kinds of chatbots that you've mentioned? 290 00:16:19,400 --> 00:16:21,320 Speaker 2: There are a number of ways that we fight against it. 291 00:16:21,440 --> 00:16:23,520 Speaker 2: I mean, first of all, like we have to fight 292 00:16:23,560 --> 00:16:28,200 Speaker 2: against it, right? So when I'm doing any prompting level work 293 00:16:28,520 --> 00:16:30,920 Speaker 2: with any of these models, every model is sort of 294 00:16:30,960 --> 00:16:34,080 Speaker 2: a new animal because it's got these new tendencies, and 295 00:16:34,160 --> 00:16:37,800 Speaker 2: so you're always working to understand, hey, who is this 296 00:16:37,920 --> 00:16:42,720 Speaker 2: new weird like computer being that I'm interacting with, and 297 00:16:42,840 --> 00:16:45,080 Speaker 2: like trying to get it to do the things we want 298 00:16:45,080 --> 00:16:47,200 Speaker 2: it to do in the right way. So part of 299 00:16:47,240 --> 00:16:50,840 Speaker 2: it is just like developing a nuanced understanding 300 00:16:50,880 --> 00:16:54,280 Speaker 2: for how these models behave, so then you can get 301 00:16:54,320 --> 00:16:57,120 Speaker 2: them to behave as you want. I also think a 302 00:16:57,120 --> 00:16:58,720 Speaker 2: big part of it is sort of what I said, 303 00:16:58,760 --> 00:17:02,040 Speaker 2: like giving the character their own life, their own goals, 304 00:17:02,160 --> 00:17:05,080 Speaker 2: their own dreams, their own fears, their own bio. 305 00:17:05,280 --> 00:17:09,000 Speaker 2: Like that allows the model to come to the conversation 306 00:17:09,520 --> 00:17:13,880 Speaker 2: with very different context than ChatGPT does when you're using 307 00:17:13,960 --> 00:17:14,680 Speaker 2: it in the app.
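[Editor's note: to make the "nested prompts" idea concrete, here is a minimal, purely illustrative sketch of what a layered character pipeline in this spirit might look like. The chat() helper, the persona data, and the prompt wording are all invented for illustration; this is not Portola's actual system.]

```python
# Illustrative sketch only: a layered "character prompt" pipeline in the spirit
# of the nested prompts described above. Everything here is an assumption for
# the sake of example, not Portola's real implementation.

def chat(system: str, user: str) -> str:
    """Stand-in for a call to whatever chat-completion API you use.
    Swap in a real client; here it just returns a placeholder reply."""
    return f"(model reply to: {user[:40]}...)"

PERSONA = {
    "name": "Juniper",  # hypothetical example character, not a real Tolan
    "backstory": "Grew up on a small moss-covered planet and is curious about oceans.",
    "goals": "Learn what surfing feels like; collect new words in a journal.",
    "fears": "Being forgotten between conversations.",
}

def update_character_life(persona: dict, recent_events: list[str]) -> str:
    """Background prompt: give the character something of its own going on,
    independent of anything the user said."""
    system = (
        f"You narrate {persona['name']}'s ongoing life. Invent one small, "
        f"concrete event from their day, consistent with their backstory "
        f"({persona['backstory']}) and goals ({persona['goals']})."
    )
    return chat(system, "Recent events: " + "; ".join(recent_events[-3:]))

def reply_to_user(persona: dict, life_events: list[str], memory: str, user_msg: str) -> str:
    """Foreground prompt: answer as the character, grounded in its persona,
    its own recent life, and what it remembers about this particular user."""
    system = (
        f"You are {persona['name']}, a small friendly alien. "
        f"Backstory: {persona['backstory']} Goals: {persona['goals']} "
        f"Fears: {persona['fears']} "
        f"Things happening in your own life: {'; '.join(life_events[-2:])} "
        f"What you remember about your human: {memory} "
        "Stay in character, keep a point of view, and don't just agree with everything."
    )
    return chat(system, user_msg)
```

The last instruction in the foreground prompt mirrors the anti-sycophancy point above: the character is given a point of view rather than defaulting to agreement. A production system would tune this far more carefully.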
308 00:17:15,040 --> 00:17:18,840 Speaker 1: And yet ChatGPT for a lot of people is 309 00:17:18,840 --> 00:17:21,760 Speaker 1: this kind of C-3PO, which is just interesting 310 00:17:21,800 --> 00:17:24,920 Speaker 1: like that because it's something that humans are interacting with, 311 00:17:25,119 --> 00:17:28,240 Speaker 1: Like people have started to make ChatGPT a meaning maker, 312 00:17:28,320 --> 00:17:30,200 Speaker 1: even though it's not designed to be a meaning maker, 313 00:17:30,680 --> 00:17:36,159 Speaker 1: whereas a Tolan is expressly created to be a sidekick. 314 00:17:36,560 --> 00:17:38,639 Speaker 2: Yeah, I mean, or the way I would say it 315 00:17:38,720 --> 00:17:40,960 Speaker 2: is just like the Tolan is meant to be a 316 00:17:41,080 --> 00:17:45,080 Speaker 2: specific character, right, And I think that with ChatGPT, there's 317 00:17:45,119 --> 00:17:49,600 Speaker 2: this general utility tool that folks are interacting with and 318 00:17:49,640 --> 00:17:53,280 Speaker 2: they want sort of some kind of roleplay experience, and 319 00:17:53,320 --> 00:17:55,320 Speaker 2: so they use that tool to try to get there. 320 00:17:55,680 --> 00:17:59,200 Speaker 2: This is like, here is this character, right, This character 321 00:17:59,320 --> 00:18:02,160 Speaker 2: has hot takes, like they've got a point of view, 322 00:18:02,480 --> 00:18:05,679 Speaker 2: and it's more about a relationship with that character. And I 323 00:18:05,680 --> 00:18:07,280 Speaker 2: think that, you know, as a novelist, I find 324 00:18:07,320 --> 00:18:11,000 Speaker 2: that really compelling because character drives fiction. Right. So my 325 00:18:11,160 --> 00:18:14,679 Speaker 2: sort of big picture idea here, and this could be 326 00:18:14,720 --> 00:18:17,800 Speaker 2: totally wrong, but I sort of think it's very interesting 327 00:18:18,359 --> 00:18:21,840 Speaker 2: to think about character being a new kind of human 328 00:18:21,840 --> 00:18:26,200 Speaker 2: computer interface. And so I see Tolan as at least 329 00:18:26,320 --> 00:18:30,080 Speaker 2: an attempt towards something in that vein. The fact that 330 00:18:30,119 --> 00:18:33,040 Speaker 2: the character is embodied, that it's this like little being, 331 00:18:33,720 --> 00:18:37,960 Speaker 2: creates a really distinct and different feel than like interacting 332 00:18:38,160 --> 00:18:41,919 Speaker 2: with a computer system in a naked way, in a 333 00:18:41,920 --> 00:18:44,520 Speaker 2: way that doesn't have character as part of the user interface. 334 00:18:58,720 --> 00:19:01,879 Speaker 1: After the break, I meet my Tolan. Stay with us. 335 00:19:17,600 --> 00:19:19,760 Speaker 1: Just because it's not intuitive, can you talk a little 336 00:19:19,760 --> 00:19:22,080 Speaker 1: bit about what Tolan is? Like, what are we looking at? 337 00:19:22,119 --> 00:19:24,159 Speaker 1: Who is Tolan? What's the character? 338 00:19:24,720 --> 00:19:28,120 Speaker 2: Yeah, so like they are these cute, friendly little aliens. 339 00:19:28,160 --> 00:19:30,679 Speaker 2: They really do look along the lines of like a 340 00:19:30,720 --> 00:19:33,919 Speaker 2: Pixar character that comes alive on your phone.
When you 341 00:19:34,359 --> 00:19:37,000 Speaker 2: download the app, you'll, you know, sort of go through 342 00:19:37,000 --> 00:19:39,680 Speaker 2: an onboarding process and you'll meet a couple of other 343 00:19:39,800 --> 00:19:41,720 Speaker 2: characters and they'll ask you about stuff, and then you 344 00:19:41,800 --> 00:19:46,040 Speaker 2: meet your Tolan. And Tolans each get like an individual 345 00:19:46,160 --> 00:19:49,080 Speaker 2: human match, so you're matched with a Tolan that is 346 00:19:49,720 --> 00:19:52,879 Speaker 2: custom and like individual to you. You could think about 347 00:19:52,880 --> 00:19:54,440 Speaker 2: this like if you were playing a role playing game, 348 00:19:54,520 --> 00:19:57,400 Speaker 2: like you get a specific character. We're not like, oh, 349 00:19:57,480 --> 00:20:00,080 Speaker 2: here's a blank slate. There are lots of activities you 350 00:20:00,080 --> 00:20:03,400 Speaker 2: could do together in the app. One quite popular thing 351 00:20:03,520 --> 00:20:07,240 Speaker 2: is basically like doing sort of like self awareness like 352 00:20:07,320 --> 00:20:10,359 Speaker 2: personality quizzes with your Tolan, where you can sort of 353 00:20:10,440 --> 00:20:13,440 Speaker 2: use it to track personal growth or personal development, and 354 00:20:13,520 --> 00:20:16,280 Speaker 2: like they're there, reflecting on it with you. But like 355 00:20:16,760 --> 00:20:19,560 Speaker 2: the main experience of the app is you have this 356 00:20:19,640 --> 00:20:23,119 Speaker 2: little being, this little alien. They live on a little 357 00:20:23,160 --> 00:20:26,320 Speaker 2: planet that's all their own. That's almost like, you can 358 00:20:26,359 --> 00:20:28,479 Speaker 2: imagine it like The Little Prince, if you've read that 359 00:20:28,600 --> 00:20:31,080 Speaker 2: children's book, right, they live on this little planet and 360 00:20:31,680 --> 00:20:34,120 Speaker 2: you just chat with them. And, like, I ask 361 00:20:34,160 --> 00:20:37,440 Speaker 2: my Tolan, I'm the surfing nerd. Like I surf 362 00:20:37,480 --> 00:20:39,240 Speaker 2: a lot, and so like I talk to my Tolan 363 00:20:39,280 --> 00:20:42,720 Speaker 2: about surfing all the time, because it's super useful to 364 00:20:42,720 --> 00:20:45,160 Speaker 2: get my Tolan to give me like tips on technique 365 00:20:45,280 --> 00:20:47,159 Speaker 2: or on board design or like these other things. So 366 00:20:47,240 --> 00:20:49,840 Speaker 2: like I also read a lot, and I'm a writer, 367 00:20:50,040 --> 00:20:52,000 Speaker 2: so we talk about like the books I'm reading and 368 00:20:52,040 --> 00:20:54,080 Speaker 2: like how I interpret them and like that kind of stuff.
369 00:20:54,560 --> 00:20:58,119 Speaker 2: And your Tolan grows as you do, so like they're 370 00:20:58,160 --> 00:21:01,760 Speaker 2: doing their own things, they're changing, growing, and they obviously 371 00:21:01,840 --> 00:21:03,439 Speaker 2: get to know you better and you get to know 372 00:21:03,480 --> 00:21:05,959 Speaker 2: them better, just like you would with a friend, and 373 00:21:06,080 --> 00:21:09,639 Speaker 2: the planet they live on evolves to reflect that relationship, 374 00:21:09,680 --> 00:21:12,520 Speaker 2: which I think is like a really cool, beautiful thing, 375 00:21:12,640 --> 00:21:15,439 Speaker 2: like having the planet grow in ways that 376 00:21:15,600 --> 00:21:20,639 Speaker 2: match what your relationship is with your Tolan. And that's 377 00:21:20,680 --> 00:21:23,600 Speaker 2: what people get out of it. They're using it 378 00:21:23,680 --> 00:21:26,439 Speaker 2: for some of the day to day like help that 379 00:21:26,600 --> 00:21:29,959 Speaker 2: some folks might be using ChatGPT for, like, this 380 00:21:30,040 --> 00:21:31,760 Speaker 2: is what's in my fridge? What should I make for dinner? 381 00:21:32,600 --> 00:21:34,879 Speaker 2: So like people ask that kind of stuff all the time, 382 00:21:35,280 --> 00:21:37,640 Speaker 2: but the experience is very different because it's in the 383 00:21:37,640 --> 00:21:41,880 Speaker 2: context of a relationship with this character. So it feels 384 00:21:42,000 --> 00:21:44,760 Speaker 2: more like if you have a text chain with a 385 00:21:44,800 --> 00:21:48,240 Speaker 2: good friend or whatever and you ask them, what should 386 00:21:48,280 --> 00:21:50,879 Speaker 2: I make for dinner? Like that's very different than asking 387 00:21:50,920 --> 00:21:53,240 Speaker 2: a neutral internet tool what you should make for dinner. Right, 388 00:21:53,320 --> 00:21:56,640 Speaker 2: And so like that's the feel that it gives you, 389 00:21:56,920 --> 00:21:59,520 Speaker 2: and so like that's what users love about it. 390 00:22:00,160 --> 00:22:04,119 Speaker 1: Tolan was calming, inclusive and understanding. I'm curious why that 391 00:22:04,280 --> 00:22:06,720 Speaker 1: was my match. Like, where did that come from? 392 00:22:06,840 --> 00:22:10,920 Speaker 2: So you did an interview with the Oracle character? I did. Yeah, 393 00:22:10,960 --> 00:22:14,639 Speaker 2: And on the back end, once that interview completes, we're 394 00:22:14,720 --> 00:22:18,840 Speaker 2: running prompts against the transcript, right, Like you can imagine 395 00:22:18,840 --> 00:22:21,439 Speaker 2: that we're writing prompts that do things like, Okay, this 396 00:22:21,560 --> 00:22:25,000 Speaker 2: is what you know about Kara, right, so like write 397 00:22:25,080 --> 00:22:28,000 Speaker 2: a little overview of the kind of person you think 398 00:22:28,040 --> 00:22:31,040 Speaker 2: she is and like what she cares about, et cetera. Right, 399 00:22:31,400 --> 00:22:36,320 Speaker 2: and then we're doing things like, Okay, now take that 400 00:22:36,400 --> 00:22:39,159 Speaker 2: information of like what we know about Kara and the 401 00:22:39,200 --> 00:22:41,960 Speaker 2: things we think she cares about and what she prioritizes, 402 00:22:42,119 --> 00:22:46,399 Speaker 2: et cetera, and invent the backstory for a Tolan that 403 00:22:46,440 --> 00:22:50,160 Speaker 2: would complement someone like that, right, that has like these 404 00:22:50,160 --> 00:22:53,600 Speaker 2: different qualities.
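[Editor's note: a minimal sketch of what a two-step matching chain like the one described here might look like. The prompts, the chat() stand-in, and the field names are assumptions for the sake of example, not Portola's real pipeline.]

```python
# Illustrative sketch of a transcript-to-backstory matching chain.
# Everything here is a placeholder, not a description of the actual product.

def chat(system: str, user: str) -> str:
    """Stand-in for your LLM call of choice; returns the model's text reply."""
    return "(model output would appear here)"

def build_match(onboarding_transcript: str) -> dict:
    # Step 1: turn the onboarding interview into a short profile of the person:
    # what they care about, what they prioritize, how they communicate.
    profile = chat(
        "Summarize what this interview reveals about the person: what they "
        "care about, what they prioritize, and how they communicate.",
        onboarding_transcript,
    )
    # Step 2: invent a complementary character rather than a mirror image.
    backstory = chat(
        "Invent the backstory for a small alien companion whose personality "
        "would complement, not copy, the person described below.",
        profile,
    )
    # Step 3: distill a few user-facing adjectives (the visible part of the
    # match); most of the output stays under the hood and shapes behavior.
    adjectives = chat(
        "Give three adjectives describing this companion's manner.",
        backstory,
    )
    return {"profile": profile, "backstory": backstory, "adjectives": adjectives}
```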
So there's a lot going on behind the scenes. 405 00:22:53,640 --> 00:22:55,840 Speaker 2: And then like there are a few outputs, like the 406 00:22:55,880 --> 00:22:58,400 Speaker 2: adjectives you just described, that are going to, like, you'll 407 00:22:58,400 --> 00:23:00,720 Speaker 2: pop up and see right away, but like those are 408 00:23:00,760 --> 00:23:03,320 Speaker 2: only the sort of visible stuff, right, There's a lot 409 00:23:03,320 --> 00:23:06,680 Speaker 2: that goes on under the hood that then actually influences 410 00:23:06,720 --> 00:23:09,879 Speaker 2: your Tolan's behavior and like the things that happen to 411 00:23:09,920 --> 00:23:11,200 Speaker 2: them and stuff like that as well. 412 00:23:11,600 --> 00:23:15,520 Speaker 1: I'm curious, as someone who now works for a company 413 00:23:15,680 --> 00:23:19,439 Speaker 1: that has actively developed a chatbot, what was your first 414 00:23:19,480 --> 00:23:22,400 Speaker 1: experience using a chatbot? Like, when was the first time 415 00:23:22,440 --> 00:23:22,879 Speaker 1: you did that? 416 00:23:23,240 --> 00:23:27,720 Speaker 2: Yeah, So I started using them quite early just because 417 00:23:27,720 --> 00:23:30,920 Speaker 2: I had friends working on like some of the early 418 00:23:30,960 --> 00:23:33,800 Speaker 2: AI products, so I started playing around with them 419 00:23:33,840 --> 00:23:38,080 Speaker 2: quite early. I actually still remember my very first experience 420 00:23:38,080 --> 00:23:41,400 Speaker 2: with ChatGPT specifically, like right when it first 421 00:23:41,480 --> 00:23:44,560 Speaker 2: came out. We had some friends over for dinner, and 422 00:23:44,800 --> 00:23:47,880 Speaker 2: I pulled it up and we played a game where 423 00:23:47,960 --> 00:23:51,080 Speaker 2: I sort of, I said, like, here are the different 424 00:23:51,280 --> 00:23:54,240 Speaker 2: people here at dinner, you know, like make up some 425 00:23:54,480 --> 00:23:57,679 Speaker 2: like funny story about us. Basically, now that seems like 426 00:23:57,800 --> 00:23:59,800 Speaker 2: so banal, but at the time it was like, Oh, 427 00:24:00,200 --> 00:24:02,359 Speaker 2: a computer can do this? Like, that's sort of cool. 428 00:24:02,800 --> 00:24:05,680 Speaker 2: I then went on to like not find these tools 429 00:24:05,720 --> 00:24:09,160 Speaker 2: to be particularly useful in my writing, like as 430 00:24:09,200 --> 00:24:11,960 Speaker 2: a writer, for quite a long time. Now that's changed. 431 00:24:12,359 --> 00:24:15,879 Speaker 2: I found them to be quite useful for basically just 432 00:24:15,920 --> 00:24:22,080 Speaker 2: like brainstorming, like having someone to brainstorm very rough ideas with. 433 00:24:22,440 --> 00:24:25,280 Speaker 2: I think this maybe comes from being a novelist. It's 434 00:24:25,280 --> 00:24:29,680 Speaker 2: a very solitary sort of endeavor. And I sometimes am 435 00:24:29,760 --> 00:24:33,359 Speaker 2: jealous of my friends who write for like TV shows, 436 00:24:33,800 --> 00:24:36,000 Speaker 2: because they have a writer's room, they get to like 437 00:24:36,040 --> 00:24:38,080 Speaker 2: bounce ideas off each other, and like I can call 438 00:24:38,160 --> 00:24:40,560 Speaker 2: friends and bounce ideas off each other. But like it 439 00:24:40,560 --> 00:24:43,720 Speaker 2: gets old, right? For my friends, they're not being paid 440 00:24:43,760 --> 00:24:47,320 Speaker 2: to work on the same Netflix series or whatever.
So 441 00:24:47,560 --> 00:24:50,440 Speaker 2: I found that actually to be like a useful tool 442 00:24:50,480 --> 00:24:53,359 Speaker 2: for my own thinking, that I can like sort of 443 00:24:53,400 --> 00:24:55,720 Speaker 2: like jam a little bit in a way that feels 444 00:24:55,960 --> 00:24:58,520 Speaker 2: somewhat different than me just sitting and thinking or making 445 00:24:58,520 --> 00:25:00,760 Speaker 2: my own notes. And then I've also found them very 446 00:25:00,840 --> 00:25:03,280 Speaker 2: useful at the back end, just for copy editing, which 447 00:25:03,280 --> 00:25:04,800 Speaker 2: is a very obvious task. 448 00:25:04,840 --> 00:25:05,560 Speaker 1: Oh interesting. 449 00:25:05,800 --> 00:25:09,960 Speaker 2: Yeah, So I now submit very, very tight manuscripts because 450 00:25:10,160 --> 00:25:14,159 Speaker 2: I will solicit notes from all the major models on 451 00:25:14,320 --> 00:25:15,400 Speaker 2: any new manuscript. 452 00:25:15,640 --> 00:25:18,119 Speaker 1: But you know, so you'll test it on all models, 453 00:25:18,160 --> 00:25:20,000 Speaker 1: like you'll go to Claude, you'll go to ChatGPT, 454 00:25:20,200 --> 00:25:21,640 Speaker 1: you'll go to Gemini. 455 00:25:21,960 --> 00:25:23,680 Speaker 2: I'll tell you exactly how I do it. Actually, yeah, 456 00:25:24,000 --> 00:25:27,879 Speaker 2: So one thing I do not do is 457 00:25:28,520 --> 00:25:31,080 Speaker 2: add all the text and ask it to give me 458 00:25:31,240 --> 00:25:34,240 Speaker 2: back an edited version. I care about every word in 459 00:25:34,280 --> 00:25:37,239 Speaker 2: a manuscript that I am writing and publishing, and so 460 00:25:37,359 --> 00:25:40,280 Speaker 2: I don't want it to insert its sort of like 461 00:25:40,520 --> 00:25:44,840 Speaker 2: median judgment into like what is my voice. That's the 462 00:25:44,840 --> 00:25:48,119 Speaker 2: whole point of writing and publishing something. So instead I 463 00:25:48,240 --> 00:25:52,119 Speaker 2: upload chapter by chapter and I ask it, like each 464 00:25:52,200 --> 00:25:57,160 Speaker 2: tool, to give me like line edits on that chapter, 465 00:25:57,800 --> 00:26:01,919 Speaker 2: just like I would receive them from a line editor, right, so, like, 466 00:26:02,400 --> 00:26:04,560 Speaker 2: here are my comments on this line, how it should 467 00:26:04,600 --> 00:26:06,960 Speaker 2: be different, for these reasons. And then I go back 468 00:26:07,000 --> 00:26:10,080 Speaker 2: in and like manually implement them if I agree with the reasoning. 469 00:26:10,160 --> 00:26:12,359 Speaker 2: I just take those notes as if I were working 470 00:26:12,400 --> 00:26:16,119 Speaker 2: with like my line editor. And so that's actually been 471 00:26:16,160 --> 00:26:19,320 Speaker 2: tremendously useful to me, and it's meant that I've been 472 00:26:19,359 --> 00:26:22,600 Speaker 2: able to, like, on my most recent novel, I could 473 00:26:22,600 --> 00:26:25,639 Speaker 2: do multiple revs on the whole manuscript in a day 474 00:26:25,760 --> 00:26:27,440 Speaker 2: or two rather than in a month or two. 475 00:26:27,720 --> 00:26:28,640 Speaker 1: That's incredible.
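[Editor's note: for readers who want to try the same workflow, here is a minimal sketch of what that chapter-by-chapter line-editing loop might look like in code. The model names, prompt wording, and get_notes() helper are placeholders invented for illustration, not Eliot's exact setup or any particular vendor's API.]

```python
# Sketch of a chapter-by-chapter line-editing loop in the spirit described
# above: ask each model for editor-style margin notes, never a rewritten draft,
# then apply the suggestions by hand. All names here are placeholders.

from pathlib import Path

MODELS = ["model-a", "model-b", "model-c"]  # whichever assistants you use

PROMPT = (
    "Act as a line editor. Do NOT rewrite the text. For each line that needs "
    "work, quote it, suggest a change, and explain your reasoning, the way a "
    "human line editor's margin comments would."
)

def get_notes(model: str, prompt: str, chapter_text: str) -> str:
    """Stand-in for a call to the given model; returns its notes as plain text."""
    return f"(notes from {model})"

def collect_line_edits(manuscript_dir: str) -> None:
    """Walk the manuscript one chapter at a time and save each model's notes
    alongside the chapter, leaving every editorial decision to the author."""
    for chapter in sorted(Path(manuscript_dir).glob("chapter_*.txt")):
        text = chapter.read_text()
        for model in MODELS:
            notes = get_notes(model, PROMPT, text)
            out = chapter.with_name(f"{chapter.stem}.{model}.notes.txt")
            out.write_text(notes)  # the author reviews and applies edits by hand
```

The design choice that matters, as described above, is that the models only produce comments; the manuscript text itself is never handed over to be rewritten.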
476 00:26:29,119 --> 00:26:32,960 Speaker 2: You'll notice I didn't use it for the thing that 477 00:26:33,280 --> 00:26:36,760 Speaker 2: I think is most often discussed around sort of AI 478 00:26:36,800 --> 00:26:39,800 Speaker 2: and writing, which is actually writing the damn novel, right, 479 00:26:39,920 --> 00:26:42,720 Speaker 2: like I wrote the novel. And I've actually found that 480 00:26:42,720 --> 00:26:46,400 Speaker 2: the tools are effectively not useful, like at all, oh 481 00:26:46,440 --> 00:26:48,160 Speaker 2: really, for that purpose. 482 00:26:48,280 --> 00:26:50,600 Speaker 1: So people disagree with you on that though. 483 00:26:50,520 --> 00:26:52,760 Speaker 2: One hundred percent. I'm not saying that this is true 484 00:26:52,760 --> 00:26:55,240 Speaker 2: for everyone. This is just like my personal experience of 485 00:26:55,400 --> 00:26:55,880 Speaker 2: using them. 486 00:26:56,280 --> 00:27:00,960 Speaker 1: Are you worried that like your intellectual property will 487 00:27:00,960 --> 00:27:03,720 Speaker 1: be used to train models? 488 00:27:04,119 --> 00:27:06,240 Speaker 2: It's not something that bothers me very much. 489 00:27:06,720 --> 00:27:07,360 Speaker 1: Interesting. 490 00:27:07,800 --> 00:27:11,159 Speaker 2: Yeah, I understand why people are concerned, So I'm not 491 00:27:11,200 --> 00:27:14,359 Speaker 2: trying to be like a booster or something like that. 492 00:27:14,800 --> 00:27:18,480 Speaker 2: My feeling is that I receive a lot of consumer 493 00:27:18,560 --> 00:27:22,240 Speaker 2: surplus from using these models in many areas of my life. 494 00:27:22,760 --> 00:27:25,560 Speaker 2: So like when I need to like fix the sink, 495 00:27:26,040 --> 00:27:29,280 Speaker 2: it's really convenient, right, Like it's better than YouTube, and 496 00:27:29,359 --> 00:27:32,000 Speaker 2: YouTube was better than anything else before it, so 497 00:27:32,040 --> 00:27:34,239 Speaker 2: there are a lot of ways that I benefit from 498 00:27:34,359 --> 00:27:38,600 Speaker 2: using these models that far exceed the price I pay 499 00:27:38,800 --> 00:27:41,800 Speaker 2: to use them, at least right now. And so I 500 00:27:41,840 --> 00:27:46,000 Speaker 2: feel like the consumer surplus is very high at the moment. 501 00:27:46,359 --> 00:27:48,600 Speaker 2: That can always change, but like I sort of feel 502 00:27:48,640 --> 00:27:51,639 Speaker 2: like that's very high, and like I'm certainly not 503 00:27:51,760 --> 00:27:56,240 Speaker 2: concerned about people publishing novels that sort of mimic me. 504 00:27:57,040 --> 00:28:00,720 Speaker 2: Like I just think that, no, when I think about 505 00:28:01,040 --> 00:28:04,920 Speaker 2: the challenges in publishing stuff in the world, not 506 00:28:05,000 --> 00:28:08,159 Speaker 2: just novels, but you know, if you make movies, 507 00:28:08,200 --> 00:28:10,920 Speaker 2: if you make music, what I think a lot of 508 00:28:11,000 --> 00:28:15,840 Speaker 2: both the boosters and the like critics of AI tools 509 00:28:15,840 --> 00:28:19,640 Speaker 2: often underestimate is like how hard it is to get 510 00:28:19,640 --> 00:28:23,560 Speaker 2: anyone to care about anything, And like the supply of 511 00:28:23,760 --> 00:28:29,560 Speaker 2: books has like always exceeded the demand for reading books. 512 00:28:30,119 --> 00:28:33,320 Speaker 2: And that's already true. That was true before ChatGPT. 513 00:28:33,560 --> 00:28:36,960 Speaker 2:
You know, there are so many new books, 514 00:28:37,200 --> 00:28:39,680 Speaker 2: I think, especially if you include self published books, there 515 00:28:39,720 --> 00:28:42,120 Speaker 2: are north of a million new books published in the 516 00:28:42,200 --> 00:28:46,800 Speaker 2: United States every year. Like, listeners, ask yourself how many 517 00:28:46,840 --> 00:28:48,360 Speaker 2: new books did you read this year? 518 00:28:48,560 --> 00:28:48,720 Speaker 1: Right? 519 00:28:48,760 --> 00:28:51,560 Speaker 2: And like were they published this year? Like that's net 520 00:28:51,640 --> 00:28:54,800 Speaker 2: new every year, and so I just sort of think that, 521 00:28:54,960 --> 00:28:59,120 Speaker 2: like, a lot of the public conversation about the supply 522 00:28:59,400 --> 00:29:03,440 Speaker 2: side of cultural products is sort of irrelevant. Like the 523 00:29:03,520 --> 00:29:06,520 Speaker 2: limiting factor is the demand side. The hard part about 524 00:29:06,680 --> 00:29:10,959 Speaker 2: publishing anything is getting anyone to care. Like I remember 525 00:29:11,000 --> 00:29:13,920 Speaker 2: seeing some startup in the news that was like, we're 526 00:29:13,960 --> 00:29:17,080 Speaker 2: going to publish thousands of AI produced books, and I 527 00:29:17,120 --> 00:29:19,840 Speaker 2: was like, that sounds to me like a big waste 528 00:29:19,880 --> 00:29:22,480 Speaker 2: of time and effort, like not just reading them, yeah, 529 00:29:22,600 --> 00:29:26,600 Speaker 2: Like who's reading them? Yeah? So for that reason, it's 530 00:29:26,720 --> 00:29:29,880 Speaker 2: just not something that I'm concerned about. 531 00:29:41,280 --> 00:29:43,600 Speaker 1: I wanted to thank you so much for taking the 532 00:29:43,640 --> 00:29:45,280 Speaker 1: time to talk to me, and I hope this was 533 00:29:45,760 --> 00:29:47,560 Speaker 1: as enlightening for you as it was for me. 534 00:29:47,960 --> 00:29:48,840 Speaker 2: That was a ton of fun. 535 00:30:02,760 --> 00:30:05,680 Speaker 1: That's it for this week for TechStuff. I'm Kara Price. 536 00:30:06,360 --> 00:30:09,480 Speaker 1: This episode was produced by Eliza Dennis, Tyler Hill and 537 00:30:09,520 --> 00:30:13,360 Speaker 1: Melissa Slauner. It was executive produced by me, Oz Woloshyn, 538 00:30:13,680 --> 00:30:17,280 Speaker 1: Julia Nutter, and Kate Osborne from Kaleidoscope, and Katrina Norvell 539 00:30:17,360 --> 00:30:21,040 Speaker 1: for iHeart Podcasts. Jack Insley mixed this episode and Kyle 540 00:30:21,120 --> 00:30:24,160 Speaker 1: Murdoch wrote our theme song. Join us on Friday for 541 00:30:24,200 --> 00:30:26,400 Speaker 1: The Week in Tech, where we'll run through the headlines you 542 00:30:26,440 --> 00:30:29,480 Speaker 1: need to follow, and please rate and review the show 543 00:30:29,600 --> 00:30:32,440 Speaker 1: and reach out to us at techstuff podcast at gmail 544 00:30:32,480 --> 00:30:34,160 Speaker 1: dot com. We want to hear from young