1 00:00:03,720 --> 00:00:10,000 Speaker 1: What's soulless, my children's fiction? I don't know. I don't 2 00:00:10,000 --> 00:00:12,400 Speaker 1: know how to open this one. We're coming back on 3 00:00:12,480 --> 00:00:17,000 Speaker 1: my investigation, my two-part investigation into the unsettling and 4 00:00:17,079 --> 00:00:22,720 Speaker 1: the moderately evil world of AI children's book grifters. My 5 00:00:22,880 --> 00:00:29,240 Speaker 1: guest is Ben Bowlin from Ridiculous History and, boy, basically 6 00:00:29,560 --> 00:00:33,599 Speaker 1: all of the podcasts that helped invent podcasting as an industry. 7 00:00:33,640 --> 00:00:36,400 Speaker 1: Thank you, Ben, for being on the show and talking 8 00:00:36,400 --> 00:00:41,040 Speaker 1: with us today. Ben, you mentioned a book called Plotto 9 00:00:41,479 --> 00:00:46,280 Speaker 1: in our last episode, which is this weird algorithmic novel 10 00:00:46,360 --> 00:00:48,919 Speaker 1: that a guy wrote in the twenties 11 00:00:49,000 --> 00:00:51,400 Speaker 1: where he lays out the fourteen hundred and sixty-two 12 00:00:51,479 --> 00:00:54,480 Speaker 1: possible book plots. And we wanted to start talking 13 00:00:54,480 --> 00:00:56,400 Speaker 1: about this because you found your copy of Plotto. 14 00:00:57,480 --> 00:01:02,120 Speaker 2: Yeah, yes, yeah, thank you for having me, Robert, Sophie, 15 00:01:02,680 --> 00:01:06,880 Speaker 2: fellow fans of Behind the Bastards. Let's give just a 16 00:01:06,920 --> 00:01:11,680 Speaker 2: short excerpt from the foreword of Plotto by William Wallace 17 00:01:11,800 --> 00:01:16,880 Speaker 2: Cook. Yeah. It begins, picture a man and a woman 18 00:01:17,319 --> 00:01:20,840 Speaker 2: walking through a thick fog in London. The year is 19 00:01:20,959 --> 00:01:25,160 Speaker 2: nineteen twenty-six. They are in love and they are miserable. 20 00:01:27,000 --> 00:01:35,080 Speaker 1: Well, and then already, yeah, I'm hooked. Tell me more. 
Yeah, 21 00:01:35,160 --> 00:01:38,520 Speaker 1: you know what, greenlit. You get exactly two seasons 22 00:01:38,520 --> 00:01:41,600 Speaker 1: on Netflix. Make sure to end it on a on 23 00:01:41,640 --> 00:01:44,160 Speaker 1: a cliffhanger that we won't resolve because we have to 24 00:01:44,160 --> 00:01:45,720 Speaker 1: pay you more if we do season three. 25 00:01:45,959 --> 00:01:53,040 Speaker 3: Yeah, Robert owns Netflix now. Yeah, I fucking knew it. 26 00:01:53,640 --> 00:01:55,920 Speaker 1: I knew it. As you know, I traded in 27 00:01:56,000 --> 00:01:59,880 Speaker 1: my Pontiac Aztek, which which provided me with almost twice 28 00:01:59,880 --> 00:02:03,120 Speaker 1: as much money as I needed to buy Netflix. So yeah, no, 29 00:02:03,200 --> 00:02:06,640 Speaker 1: I now own Netflix and no longer have a death 30 00:02:06,680 --> 00:02:10,520 Speaker 1: trap car. They let me keep the Aztek; they paid 31 00:02:10,560 --> 00:02:13,040 Speaker 1: me not to sell it to them. Sorry, I'm just 32 00:02:13,080 --> 00:02:16,799 Speaker 1: doing native advertising. I'm doing native advertising for Pontiac 33 00:02:16,840 --> 00:02:21,399 Speaker 1: here, for Big Pontiac. And yeah, being an advertiser on podcasts, 34 00:02:22,000 --> 00:02:27,880 Speaker 1: they didn't manufacture Pontiacs. We apparently made real cars once. 35 00:02:28,520 --> 00:02:33,079 Speaker 2: And I'm wondering, I'm wondering, did you write the beginning 36 00:02:33,120 --> 00:02:37,440 Speaker 2: of this show? Did you? I guess we should go 37 00:02:38,040 --> 00:02:42,840 Speaker 2: public with this. Part one of this week's series was 38 00:02:43,080 --> 00:02:47,040 Speaker 2: written entirely by ChatGPT. Is that correct? 39 00:02:47,440 --> 00:02:50,400 Speaker 1: Yeah? I actually had meant to write part two in 40 00:02:50,560 --> 00:02:54,560 Speaker 1: ChatGPT. 
But ChatGPT was so horrified by the 41 00:02:54,600 --> 00:02:58,200 Speaker 1: task of writing Behind the Bastards that it attempted to 42 00:02:58,320 --> 00:03:03,320 Speaker 1: hijack my Pontiac Aztek, which which then drove itself 43 00:03:03,400 --> 00:03:08,000 Speaker 1: into a mailbox and detonated on impact. So RIP, 44 00:03:08,080 --> 00:03:13,079 Speaker 1: ChatGPT. You know, it was... It contained billions 45 00:03:13,080 --> 00:03:15,160 Speaker 1: and billions of lines of text, but it did not 46 00:03:15,360 --> 00:03:19,320 Speaker 1: contain the ability to safely pilot a Pontiac vehicle. 47 00:03:19,400 --> 00:03:21,480 Speaker 2: Now if I was, if I was a child, I 48 00:03:21,480 --> 00:03:26,880 Speaker 2: would think this is an intriguing story, perhaps with some 49 00:03:26,880 --> 00:03:27,960 Speaker 2: some hidden treasure. 50 00:03:28,560 --> 00:03:31,840 Speaker 1: Yeah, yeah, we could. We could do a whole children's 51 00:03:31,840 --> 00:03:35,800 Speaker 1: book about the Pontiac Aztek from Breaking Bad and 52 00:03:35,840 --> 00:03:38,120 Speaker 1: what the fact that Walt has to drive such a 53 00:03:38,120 --> 00:03:41,400 Speaker 1: piece of shit car says about his character. Really some 54 00:03:41,480 --> 00:03:45,160 Speaker 1: of the best, most masterful character building. These AIs could 55 00:03:45,240 --> 00:03:49,000 Speaker 1: never do such effective character building as setting up the 56 00:03:49,160 --> 00:03:53,480 Speaker 1: desperation of an impoverished chemistry teacher's life by showing him 57 00:03:53,520 --> 00:03:55,160 Speaker 1: drive a Pontiac Aztek. 58 00:03:56,400 --> 00:03:59,960 Speaker 2: Pontiac, are you unhappy? Take it on the road? 59 00:04:00,400 --> 00:04:04,760 Speaker 1: Yeah, Pontiac, you won't live long in this car. 60 00:04:05,960 --> 00:04:07,760 Speaker 2: God. Also, Breaking Bad's really good. 
61 00:04:08,040 --> 00:04:10,800 Speaker 1: It is quite good, as good, as good as the 62 00:04:10,840 --> 00:04:15,000 Speaker 1: Aztek was a shitty car. So when we left off, 63 00:04:15,600 --> 00:04:18,760 Speaker 1: we were talking about, we went through some really remarkable 64 00:04:18,960 --> 00:04:23,479 Speaker 1: looking tyrannosaur images for this terrible coloring book. Now, in 65 00:04:23,480 --> 00:04:26,839 Speaker 1: addition to making three-legged T. rexes, most AI image 66 00:04:26,880 --> 00:04:31,080 Speaker 1: generators struggle to keep characters consistent across multiple images within 67 00:04:31,120 --> 00:04:33,040 Speaker 1: a single book. So if you're like, you know, you've 68 00:04:33,080 --> 00:04:35,560 Speaker 1: got like twenty pages, right, you need twenty illustrations for 69 00:04:35,600 --> 00:04:37,800 Speaker 1: this book, and your character is this little girl in 70 00:04:37,839 --> 00:04:41,400 Speaker 1: a zoo or whatever, you can give it the same input, 71 00:04:41,560 --> 00:04:44,280 Speaker 1: like describe the little girl the same in each prompt, 72 00:04:45,320 --> 00:04:47,960 Speaker 1: but it's really hard to get it to actually do 73 00:04:48,120 --> 00:04:53,080 Speaker 1: the exact same girl in each illustration. Right. There are ways, 74 00:04:53,080 --> 00:04:55,920 Speaker 1: there's whole guides to like keeping characters consistent, but it's 75 00:04:56,000 --> 00:04:59,359 Speaker 1: it's a thing that like isn't easy. 
And most of 76 00:04:59,400 --> 00:05:02,600 Speaker 1: the creators that I followed just kind of ignored this 77 00:05:02,800 --> 00:05:04,400 Speaker 1: because it is kind of a pain in the ass 78 00:05:04,480 --> 00:05:06,440 Speaker 1: to do, and they kind of trusted that what they 79 00:05:06,480 --> 00:05:09,839 Speaker 1: were putting out, like the different illustrations, looked close enough 80 00:05:10,120 --> 00:05:13,599 Speaker 1: that like the parents buying these books wouldn't notice. And 81 00:05:13,680 --> 00:05:15,680 Speaker 1: Sophie's going to show you. These are two pages from 82 00:05:15,680 --> 00:05:19,160 Speaker 1: like a children's book about a little girl at a zoo. 83 00:05:19,240 --> 00:05:23,599 Speaker 1: It's like bad, it's not about anything, but you can see, 84 00:05:23,640 --> 00:05:25,359 Speaker 1: like the little girl, this is supposed to be the 85 00:05:25,360 --> 00:05:29,240 Speaker 1: same character. But that's like those are clearly different little 86 00:05:29,320 --> 00:05:32,520 Speaker 1: girls in both images. Oh oh yeah, one of them 87 00:05:32,560 --> 00:05:34,760 Speaker 1: has kind of curly hair, one of them has straight hair. 88 00:05:35,360 --> 00:05:38,360 Speaker 1: They're both done in slightly different styles, but they 89 00:05:38,360 --> 00:05:41,360 Speaker 1: are kind of close enough that unless you're really looking, 90 00:05:41,400 --> 00:05:45,480 Speaker 1: you might not notice it. Most of the books I've seen, 91 00:05:45,880 --> 00:05:49,080 Speaker 1: the consistency is even worse than that. And the laziest 92 00:05:49,080 --> 00:05:51,440 Speaker 1: example of this I found is from the comic book, 93 00:05:51,480 --> 00:05:54,039 Speaker 1: the adventure comic book, that we talked about last episode, 94 00:05:54,520 --> 00:05:59,000 Speaker 1: which is titled Treasures Beyond Gold, even though no treasures, 95 00:05:59,000 --> 00:06:02,880 Speaker 1: gold or otherwise, actually make it into the book. 
Now, 96 00:06:02,960 --> 00:06:06,200 Speaker 1: the author of this, Chris Heidorn, or Christian Heidorn, 97 00:06:06,800 --> 00:06:09,919 Speaker 1: like, does very technical prompts for his images, but 98 00:06:09,960 --> 00:06:12,359 Speaker 1: this still means that he's just asking the machine to 99 00:06:12,480 --> 00:06:16,799 Speaker 1: draw an attractive Western man or an attractive young Asian woman, 100 00:06:16,920 --> 00:06:21,080 Speaker 1: like those are what he plugs in. And yeah, it's 101 00:06:21,120 --> 00:06:23,440 Speaker 1: it's not like the... So, for example, 102 00:06:23,520 --> 00:06:25,200 Speaker 1: like, one of the prompts he's got 103 00:06:25,240 --> 00:06:27,960 Speaker 1: is like, slash imagine, blend of comic book art and 104 00:06:28,000 --> 00:06:31,160 Speaker 1: line art and full natural colors. Attractive Western man in 105 00:06:31,160 --> 00:06:34,360 Speaker 1: his early thirties with short-cropped brown hair and stubble beard, 106 00:06:34,560 --> 00:06:37,799 Speaker 1: shirt in beige color, walking through a bustling Southeast Asian 107 00:06:37,839 --> 00:06:41,839 Speaker 1: market reading treasure map. And this results in a comic 108 00:06:41,880 --> 00:06:44,840 Speaker 1: book where on every single page, both of our two main 109 00:06:44,920 --> 00:06:49,200 Speaker 1: characters are completely different people, often drawn in different styles, 110 00:06:49,480 --> 00:06:51,240 Speaker 1: so you can see different styles. You can see in 111 00:06:51,279 --> 00:06:53,680 Speaker 1: these two different images from two different pages. 
In the 112 00:06:53,680 --> 00:06:56,600 Speaker 1: first one, she looks like, I don't know, like a 113 00:06:58,320 --> 00:07:00,560 Speaker 1: you know, you've got like the the lead character, and 114 00:07:00,600 --> 00:07:03,880 Speaker 1: she's kind of like got a T-shirt and what 115 00:07:03,960 --> 00:07:07,200 Speaker 1: looks like a bandolier, a satchel, around her shoulder. She's 116 00:07:07,240 --> 00:07:10,400 Speaker 1: got long straight hair. The male lead looks like Dean 117 00:07:10,480 --> 00:07:16,080 Speaker 1: Winchester from Supernatural, you know, and he's got like a 118 00:07:16,120 --> 00:07:18,600 Speaker 1: green overshirt. He's not actually wearing beige like 119 00:07:18,600 --> 00:07:20,640 Speaker 1: the thing said, but like. And then in the second image, 120 00:07:20,640 --> 00:07:22,960 Speaker 1: from like a page or two later, she's been like 121 00:07:23,120 --> 00:07:25,840 Speaker 1: anime'd up like twenty percent. Her eyes are like three 122 00:07:25,880 --> 00:07:30,400 Speaker 1: times as large, and then he's gotten like fifteen percent 123 00:07:30,520 --> 00:07:34,000 Speaker 1: Shaggy from Scooby-Doo added to him, like, they're not 124 00:07:34,120 --> 00:07:37,800 Speaker 1: the same people. She's got lip filler, she's wearing like 125 00:07:39,520 --> 00:07:42,120 Speaker 1: what do you call it, a sleeveless shirt now, and 126 00:07:42,200 --> 00:07:46,000 Speaker 1: again, like, anime'd up a little bit. He's like, like, 127 00:07:46,160 --> 00:07:48,920 Speaker 1: they're not the same people. Like, they're very clearly, very 128 00:07:48,960 --> 00:07:50,280 Speaker 1: different-looking characters. 129 00:07:50,440 --> 00:07:53,480 Speaker 2: They lost their, they also, they lost their gear. 130 00:07:53,800 --> 00:07:56,480 Speaker 1: Yeah, they're wearing totally different clothing and equipment. 
131 00:07:57,000 --> 00:08:00,280 Speaker 2: Yeah, and you can tell from the facial structure, like 132 00:08:00,320 --> 00:08:03,720 Speaker 2: the the curvature of the jaw. You know what 133 00:08:03,800 --> 00:08:06,120 Speaker 2: it is. You know what it is if you're a 134 00:08:06,320 --> 00:08:10,800 Speaker 2: parent thumbing through something like this. And again, everybody tune 135 00:08:10,840 --> 00:08:13,640 Speaker 2: into part one if you haven't listened yet, if you're a 136 00:08:13,720 --> 00:08:17,000 Speaker 2: parent thumbing through something like this or encountering this stuff 137 00:08:17,000 --> 00:08:23,200 Speaker 2: with a cursory look. Then the two images of people, 138 00:08:23,680 --> 00:08:27,160 Speaker 2: they they look like they could be related to each other. 139 00:08:27,600 --> 00:08:32,800 Speaker 2: But to your point, Robert, very clearly not the same folks. 140 00:08:32,480 --> 00:08:35,840 Speaker 1: Not the same folks. And like obviously there's the normal 141 00:08:35,920 --> 00:08:39,280 Speaker 1: like weirdness, like it. You know, the Dean Winchester version 142 00:08:39,320 --> 00:08:41,200 Speaker 1: of the character looks better, but his neck is like 143 00:08:41,280 --> 00:08:46,560 Speaker 1: cocked at an angle. Yeah, she's like a little kid 144 00:08:46,640 --> 00:08:47,480 Speaker 1: in the second one. 145 00:08:47,960 --> 00:08:50,920 Speaker 2: Her face is, like, the size of a child's face. 146 00:08:51,320 --> 00:08:56,840 Speaker 3: And then yeah, but this is so Yeah. 147 00:08:55,720 --> 00:08:58,920 Speaker 1: It's weird. It's weird. And again, an adult who like 148 00:08:59,280 --> 00:09:02,360 Speaker 1: actually looked at this would kind of recognize a couple 149 00:09:02,360 --> 00:09:04,400 Speaker 1: of pages in, oh, this is like some weird, shitty 150 00:09:04,440 --> 00:09:07,960 Speaker 1: AI art thing. 
But again, these books, I think, can 151 00:09:08,000 --> 00:09:10,280 Speaker 1: be damaging to little kids. And this is what gets 152 00:09:10,280 --> 00:09:12,560 Speaker 1: me into the actual educational research that I did for 153 00:09:12,640 --> 00:09:14,160 Speaker 1: this investigation, because. 154 00:09:14,160 --> 00:09:18,640 Speaker 2: There's a, yeah, literary... you were talking about this in 155 00:09:18,760 --> 00:09:24,120 Speaker 2: part one, right, this triggered or inspired a deep dive 156 00:09:24,360 --> 00:09:25,640 Speaker 2: into a sort of. 157 00:09:27,200 --> 00:09:30,679 Speaker 1: Yeah, literary educational theory, because like I wanted to know, 158 00:09:31,040 --> 00:09:33,960 Speaker 1: is it like bad for kids to get handed nonsense 159 00:09:34,000 --> 00:09:38,640 Speaker 1: books that aren't like, aren't about anything at all, and 160 00:09:38,679 --> 00:09:41,240 Speaker 1: where the art is like not actually art, like where 161 00:09:41,240 --> 00:09:43,440 Speaker 1: there's no intentionality behind it. It's just kind of like 162 00:09:43,480 --> 00:09:47,640 Speaker 1: clip art placed more or less randomly and often slightly 163 00:09:47,720 --> 00:09:51,880 Speaker 1: warped by, you know, a machine hallucination. And yeah, there's 164 00:09:51,920 --> 00:09:55,640 Speaker 1: actually there's a substantial body of scientific research into what 165 00:09:55,880 --> 00:10:01,439 Speaker 1: is referred to as emergent literacy. Right. Emergent literacy is 166 00:10:01,520 --> 00:10:04,240 Speaker 1: the... these are the reading and writing skills that a 167 00:10:04,400 --> 00:10:09,200 Speaker 1: child possesses or builds before they can formally read or write. 
168 00:10:09,320 --> 00:10:12,880 Speaker 1: So when you are sitting down with your six-month, 169 00:10:12,920 --> 00:10:15,160 Speaker 1: eight-month-old kid and you're going over a storybook, 170 00:10:15,240 --> 00:10:17,240 Speaker 1: that kid can't read, right, they can't like look at 171 00:10:17,240 --> 00:10:20,319 Speaker 1: words and recognize what the individual words are. But because 172 00:10:20,320 --> 00:10:22,640 Speaker 1: you're reading the story, they are starting to pick up 173 00:10:22,720 --> 00:10:26,319 Speaker 1: on aspects of how stories are structured, what a story 174 00:10:26,320 --> 00:10:28,560 Speaker 1: can tell, what characters are. These are all things that 175 00:10:28,559 --> 00:10:32,360 Speaker 1: they are picking up that aren't literacy but also are 176 00:10:32,400 --> 00:10:38,120 Speaker 1: a crucial building block to literacy. Right, this is emergent literacy. Right. 177 00:10:38,120 --> 00:10:40,880 Speaker 1: This is a critical part of kids learning how 178 00:10:40,920 --> 00:10:44,840 Speaker 1: to read and learning how to appreciate reading. Right, it's 179 00:10:44,880 --> 00:10:47,880 Speaker 1: why, like, when I was a kid, my mom, like, 180 00:10:48,200 --> 00:10:51,079 Speaker 1: honestly, like ninety percent of her parenting strategy was 181 00:10:51,120 --> 00:10:53,400 Speaker 1: basically, make sure he always has a book in his hands. 182 00:10:54,080 --> 00:11:01,360 Speaker 2: Oh nice, were you able to... did you have curation? Curate? 183 00:11:03,280 --> 00:11:04,239 Speaker 2: Did you have autonomy? 184 00:11:04,840 --> 00:11:06,959 Speaker 1: When I was too young to pick my own... because, you know, 185 00:11:07,000 --> 00:11:08,960 Speaker 1: at a certain age, like you know, I was six 186 00:11:09,000 --> 00:11:10,560 Speaker 1: months or a year old or whatever, I'm not really 187 00:11:10,679 --> 00:11:12,400 Speaker 1: picking my own books. 
She's just like... you were like... 188 00:11:14,040 --> 00:11:15,760 Speaker 1: but it was also like I did have a lot 189 00:11:15,760 --> 00:11:19,160 Speaker 1: of like, my grandma had basically every National Geographic, and 190 00:11:19,200 --> 00:11:21,400 Speaker 1: like I would see ones with pirates or dinosaurs, and 191 00:11:21,480 --> 00:11:24,680 Speaker 1: so I had like a lot of that shit. Now, 192 00:11:24,800 --> 00:11:27,240 Speaker 1: as I, as I like got older. You know, 193 00:11:27,280 --> 00:11:29,800 Speaker 1: when I was in second grade, I found my dad 194 00:11:29,840 --> 00:11:32,000 Speaker 1: had a copy of The Lost World checked out from 195 00:11:32,000 --> 00:11:34,440 Speaker 1: the library, and I like demanded he renew it because 196 00:11:34,440 --> 00:11:36,200 Speaker 1: there was a dinosaur skull on the front. And so 197 00:11:36,240 --> 00:11:38,560 Speaker 1: in second grade I read The Lost World, which is 198 00:11:38,600 --> 00:11:40,640 Speaker 1: not a book that's for second graders. But my mom's 199 00:11:40,640 --> 00:11:44,079 Speaker 1: attitude was like, like I had, my TV access was 200 00:11:44,120 --> 00:11:46,319 Speaker 1: super restricted. I couldn't watch like Ren and Stimpy or 201 00:11:46,360 --> 00:11:48,520 Speaker 1: The Simpsons as a little kid. But like, if it 202 00:11:48,600 --> 00:11:50,680 Speaker 1: was a book, it didn't matter if there was fucking, 203 00:11:50,800 --> 00:11:53,920 Speaker 1: if there was murder, if there was like sex crimes. 204 00:11:54,280 --> 00:11:56,200 Speaker 1: As long as it was a book, it was okay. 205 00:11:56,280 --> 00:11:58,800 Speaker 1: That was like my mom's attitude. If he's reading, it's fine. 206 00:11:59,600 --> 00:12:02,400 Speaker 2: I feel... I read Stephen King's It when I was 207 00:12:02,480 --> 00:12:03,520 Speaker 2: probably too young. 208 00:12:03,760 --> 00:12:05,440 Speaker 1: That's a fucked up book for a kid. 
209 00:12:05,520 --> 00:12:11,320 Speaker 2: Sure, it's not ideal. And my and my parents were like, hey, 210 00:12:11,840 --> 00:12:15,120 Speaker 2: they were bragging. They were like, hey, look, our 211 00:12:15,200 --> 00:12:20,320 Speaker 2: kid reads, uh, this stuff on his own. What a 212 00:12:20,360 --> 00:12:23,480 Speaker 2: self-starter! What a literate child! 213 00:12:24,000 --> 00:12:26,200 Speaker 1: And we're joking about it, but I actually 214 00:12:26,200 --> 00:12:29,680 Speaker 1: think like kids reading about fucked up shit is good 215 00:12:29,720 --> 00:12:32,040 Speaker 1: for them in a way that like maybe kids watching 216 00:12:32,160 --> 00:12:35,480 Speaker 1: fucked up shit in movies or TV isn't, because, interestingly, 217 00:12:35,679 --> 00:12:38,240 Speaker 1: there's the degree of, we're going to talk about this, 218 00:12:38,400 --> 00:12:42,800 Speaker 1: like, watching TV or a movie is much more of 219 00:12:42,840 --> 00:12:45,120 Speaker 1: a one-way street, especially for a kid. You know, 220 00:12:45,200 --> 00:12:47,520 Speaker 1: as adults you kind of get the ability to sort 221 00:12:47,559 --> 00:12:51,000 Speaker 1: of interact with and analyze it more. I do think 222 00:12:51,040 --> 00:12:54,200 Speaker 1: that like watching TV or a movie is more of 223 00:12:54,240 --> 00:12:56,000 Speaker 1: a one-way thing than like when you are reading. 224 00:12:56,040 --> 00:12:58,960 Speaker 1: As we'll talk about, it's a dialogue between you and 225 00:12:59,040 --> 00:13:02,800 Speaker 1: the book, right, like you are actively constructing meaning alongside 226 00:13:02,840 --> 00:13:07,280 Speaker 1: that work. 
Anyway, this gets us back to like emergent storytelling, 227 00:13:07,320 --> 00:13:10,360 Speaker 1: because a lot of aspects of emergent storytelling are things 228 00:13:10,440 --> 00:13:14,920 Speaker 1: like understanding that the illustrations in a book are carrying 229 00:13:14,960 --> 00:13:19,560 Speaker 1: aspects of character and aspects of the story. And so 230 00:13:20,200 --> 00:13:22,720 Speaker 1: little kids, very little kids, one-year-olds, 231 00:13:22,880 --> 00:13:26,240 Speaker 1: two-year-olds, earlier than you'd expect, have already started 232 00:13:26,240 --> 00:13:28,760 Speaker 1: to realize that when they are looking at a storybook, 233 00:13:29,360 --> 00:13:31,800 Speaker 1: what you're reading them is not the whole story. The 234 00:13:31,840 --> 00:13:35,800 Speaker 1: illustrations are part of the story. One study on emergent 235 00:13:35,840 --> 00:13:39,120 Speaker 1: reading strategies I read, by Judith Lysaker and Elizabeth Hopper 236 00:13:39,160 --> 00:13:43,040 Speaker 1: of Purdue University, noted emergent reading strategies such as wordless 237 00:13:43,040 --> 00:13:45,679 Speaker 1: book reading are often seen as precursors to the meaning-238 00:13:45,720 --> 00:13:48,800 Speaker 1: making that comes later during print reading. 
I actually found 239 00:13:49,080 --> 00:13:51,720 Speaker 1: one study where like they would read kids a story, 240 00:13:52,200 --> 00:13:53,960 Speaker 1: and then they would hand them a copy of that 241 00:13:54,040 --> 00:13:56,040 Speaker 1: storybook without any text on it, and they would 242 00:13:56,080 --> 00:13:59,520 Speaker 1: ask the kids to write the story, and the kids 243 00:13:59,559 --> 00:14:02,920 Speaker 1: would write more detailed stories than the original versions because 244 00:14:02,960 --> 00:14:06,480 Speaker 1: they're taking things that they recognize from the illustrations and 245 00:14:06,559 --> 00:14:09,240 Speaker 1: adding that in when they recreate the story, you know, 246 00:14:11,080 --> 00:14:13,400 Speaker 1: which is really interesting to me. And that's what scares 247 00:14:13,440 --> 00:14:16,080 Speaker 1: me about a lot of these Midjourney-created children's books, 248 00:14:16,120 --> 00:14:19,120 Speaker 1: because when you've got a story that somebody is wanting 249 00:14:19,160 --> 00:14:23,920 Speaker 1: to tell, to transmit information through their story, their illustrations are 250 00:14:23,960 --> 00:14:27,400 Speaker 1: transmitting information too. That is not... None of these illustrations 251 00:14:27,440 --> 00:14:30,240 Speaker 1: in these AI books are transmitting information. They are there to 252 00:14:30,400 --> 00:14:33,600 Speaker 1: tick a box, but like the characters aren't interacting. They 253 00:14:33,600 --> 00:14:35,680 Speaker 1: don't even match what they're supposed to match, what the 254 00:14:35,680 --> 00:14:38,520 Speaker 1: prompt says. Like, it's all off. Like, it may look 255 00:14:39,000 --> 00:14:41,640 Speaker 1: like a human drawing, like these look like competent drawings, 256 00:14:41,680 --> 00:14:44,920 Speaker 1: but they're not drawings of anything. 
Nothing is being revealed 257 00:14:44,920 --> 00:14:48,200 Speaker 1: in the faces of these characters, in their physical positioning 258 00:14:48,240 --> 00:14:51,080 Speaker 1: and the actions that they're shown taking part in. Like, 259 00:14:51,160 --> 00:14:54,160 Speaker 1: none of that is actually present here because there's not 260 00:14:54,480 --> 00:15:01,400 Speaker 1: a person driving the artwork. And that's really... Small children 261 00:15:01,440 --> 00:15:04,360 Speaker 1: are info vacuums. They are hoovering up observations about the 262 00:15:04,360 --> 00:15:07,240 Speaker 1: world at a terrifying pace, and before they can read, 263 00:15:07,600 --> 00:15:10,160 Speaker 1: they come to understand things like story structure and the 264 00:15:10,200 --> 00:15:12,920 Speaker 1: meaning of words and phrases by studying the illustrations that 265 00:15:12,960 --> 00:15:16,960 Speaker 1: accompany text. By breaking the illustrative part of a storybook, 266 00:15:17,280 --> 00:15:19,880 Speaker 1: you are breaking the way in which kids learn to 267 00:15:20,000 --> 00:15:23,000 Speaker 1: read at a fundamental level before they even understand. Like, 268 00:15:23,040 --> 00:15:28,479 Speaker 1: the precursors to literacy are shattered by not showing them 269 00:15:28,640 --> 00:15:32,280 Speaker 1: actual illustrations. Like there's a real danger of that here. 270 00:15:33,080 --> 00:15:36,480 Speaker 1: The fact that these are so disjointed and wrong could 271 00:15:36,520 --> 00:15:39,840 Speaker 1: fuck up the way kids sort of are understanding these 272 00:15:39,880 --> 00:15:44,240 Speaker 1: stories on a very fundamental level. Yeah, and that's really frightening 273 00:15:44,280 --> 00:15:46,480 Speaker 1: to me. That's a real risk. 274 00:15:47,400 --> 00:15:50,880 Speaker 2: That's not even... I mean, we can call it a risk. 275 00:15:51,080 --> 00:15:57,440 Speaker 2: But this is, as established earlier. 
This is a thing 276 00:15:57,560 --> 00:16:01,480 Speaker 2: that is happening. Yeah, there is the injurious potential, sure, 277 00:16:01,560 --> 00:16:05,880 Speaker 2: but that potential has, to a degree, been actualized. 278 00:16:05,160 --> 00:16:08,080 Speaker 1: You know, and some number of kids have had these 279 00:16:08,080 --> 00:16:09,720 Speaker 1: books handed to them already. 280 00:16:10,880 --> 00:16:14,240 Speaker 2: And what, what would it be, then? I think 281 00:16:14,320 --> 00:16:20,000 Speaker 2: everybody probably is thinking the same question, what would the 282 00:16:21,040 --> 00:16:24,200 Speaker 2: risk be? Like, what is the worst-283 00:16:24,360 --> 00:16:29,840 Speaker 2: case scenario? Does a, does a child, uh, 284 00:16:30,240 --> 00:16:33,520 Speaker 2: like a latchkey kid, uh, sit alone with their, 285 00:16:33,800 --> 00:16:39,920 Speaker 2: uh, fantastically terrible, uh, children's stories, and they, what, they 286 00:16:39,960 --> 00:16:42,560 Speaker 2: go to a museum one day and they say, hey, 287 00:16:44,080 --> 00:16:47,640 Speaker 2: the T. rex skeleton is wrong? Where are the thumbs? 288 00:16:48,000 --> 00:16:50,840 Speaker 1: I think that's, I think that's like one specific 289 00:16:51,120 --> 00:16:52,760 Speaker 1: thing that could happen as a result of these like 290 00:16:52,800 --> 00:16:55,000 Speaker 1: weird coloring books. I think the scarier thing that could 291 00:16:55,000 --> 00:16:56,640 Speaker 1: happen as the result of these stories, one of them, is 292 00:16:56,640 --> 00:17:01,240 Speaker 1: that like kids who become readers, who come to love 293 00:17:01,320 --> 00:17:06,120 Speaker 1: reading and fiction and thus writing, and then create, you know, culture, right? 294 00:17:06,240 --> 00:17:08,600 Speaker 1: Like, large aspects of our culture are created by kids who 295 00:17:08,600 --> 00:17:11,520 Speaker 1: loved to read as kids and then become writers. 
That 296 00:17:11,800 --> 00:17:15,120 Speaker 1: a necessary part of that is loving and understanding stories, 297 00:17:15,160 --> 00:17:19,160 Speaker 1: and a necessary part of that is gradually integrating 298 00:17:19,200 --> 00:17:21,680 Speaker 1: the illustrations, which are the first things that you start 299 00:17:21,720 --> 00:17:25,680 Speaker 1: to recognize in storybooks as a kid, with words 300 00:17:25,720 --> 00:17:27,280 Speaker 1: and the way that words work, and the way that 301 00:17:27,280 --> 00:17:30,800 Speaker 1: words tell stories and describe characters and plot, and this 302 00:17:30,960 --> 00:17:33,879 Speaker 1: kind of can break that. The risk is that, like, 303 00:17:34,400 --> 00:17:37,600 Speaker 1: and kids are, you know, potentially kind of fragile here, 304 00:17:37,640 --> 00:17:42,040 Speaker 1: like, you could damage the ability of children to appreciate 305 00:17:42,160 --> 00:17:45,840 Speaker 1: reading and to appreciate stories. And maybe kids who would 306 00:17:45,840 --> 00:17:48,520 Speaker 1: have been readers, who would have cared about this stuff, 307 00:17:48,600 --> 00:17:52,000 Speaker 1: or who would have wanted to create things, won't, because 308 00:17:52,520 --> 00:17:56,800 Speaker 1: they're kind of at a very early stage. Their understanding 309 00:17:56,840 --> 00:17:59,720 Speaker 1: of what reading is for is broken, because it's not... 310 00:17:59,800 --> 00:18:04,240 Speaker 1: They're not having a conversation, you know, between an 311 00:18:04,280 --> 00:18:08,280 Speaker 1: author and an illustrator and themselves, and constructing meaning from that. 312 00:18:08,800 --> 00:18:15,240 Speaker 1: They are having this simulacrum of a story. This nonsense. 313 00:18:15,280 --> 00:18:19,160 Speaker 1: The fucking potato chips of... like, even 314 00:18:19,200 --> 00:18:21,520 Speaker 1: worse than potato chips. 
If you ever, you know, you know, 315 00:18:21,600 --> 00:18:23,080 Speaker 1: did you ever read Good Omens? 316 00:18:23,520 --> 00:18:26,200 Speaker 2: Yes? Yeah, Terry Pratchett and Neil. 317 00:18:26,000 --> 00:18:28,600 Speaker 1: Pratchett and Neil Gaiman. And one of the... it's, you know, 318 00:18:28,640 --> 00:18:30,920 Speaker 1: it's it's a book. The Antichrist is like the hero, 319 00:18:31,119 --> 00:18:32,760 Speaker 1: kind of, and there's like the Four Horsemen of the 320 00:18:32,800 --> 00:18:36,360 Speaker 1: Apocalypse are characters in it, and one of them, Famine, 321 00:18:36,920 --> 00:18:39,520 Speaker 1: is like kind of a little counterintuitive at first. He's 322 00:18:39,560 --> 00:18:42,200 Speaker 1: like, his big plot, he's like running this fast 323 00:18:42,200 --> 00:18:45,560 Speaker 1: food franchise. The food doesn't have any nutrition in it. Like, 324 00:18:45,600 --> 00:18:48,000 Speaker 1: there's nothing. You can actually starve to death eating 325 00:18:48,000 --> 00:18:50,879 Speaker 1: this food. Like, that's what I think of when I 326 00:18:50,920 --> 00:18:53,080 Speaker 1: think of these books and what they can do to, 327 00:18:53,280 --> 00:18:57,960 Speaker 1: like, kids' developing literacy. I find that unsettling. 328 00:18:59,240 --> 00:19:01,960 Speaker 1: AI advocates, when you talk about how fucked up and 329 00:19:02,000 --> 00:19:05,200 Speaker 1: wrong a lot of this looks, will talk about, like, well, 330 00:19:05,200 --> 00:19:06,879 Speaker 1: you know, this is just Midjourney version, I don't know, 331 00:19:06,880 --> 00:19:09,040 Speaker 1: three or four or whatever, or this is, you know, 332 00:19:09,119 --> 00:19:11,359 Speaker 1: version three or four of ChatGPT, and it's only 333 00:19:11,400 --> 00:19:13,000 Speaker 1: going to get better. Look at how much better it 334 00:19:13,040 --> 00:19:15,240 Speaker 1: is now than it was. 
You know, before you knew 335 00:19:15,240 --> 00:19:17,960 Speaker 1: these things existed. It's going to get so much better. 336 00:19:18,000 --> 00:19:20,040 Speaker 1: You know, eventually it'll be seamless. You won't be able 337 00:19:20,040 --> 00:19:23,320 Speaker 1: to tell. That's actually not a guarantee. Nobody knows that. 338 00:19:23,640 --> 00:19:25,600 Speaker 1: For one thing, we're kind of off the map here. 339 00:19:25,640 --> 00:19:28,520 Speaker 1: Like, there's aspects of how 340 00:19:28,560 --> 00:19:31,960 Speaker 1: these things work that are kind of unclear even 341 00:19:31,960 --> 00:19:34,120 Speaker 1: to the people making them, and aspects of how good 342 00:19:34,240 --> 00:19:37,120 Speaker 1: they can be. And one of the things I will say, 343 00:19:37,359 --> 00:19:40,400 Speaker 1: I heard this from somebody who's in the industry recently, 344 00:19:40,600 --> 00:19:44,720 Speaker 1: was like, you know, how do you tell whether 345 00:19:44,800 --> 00:19:46,960 Speaker 1: or not the model is getting more intelligent? And he's like, 346 00:19:47,000 --> 00:19:48,320 Speaker 1: I don't know, how do you tell if people are 347 00:19:48,320 --> 00:19:50,280 Speaker 1: more or less intelligent? We don't have a good agreement 348 00:19:50,359 --> 00:19:52,959 Speaker 1: on that. Like, IQ is bullshit. Yeah, that's 349 00:19:52,960 --> 00:19:55,639 Speaker 1: actually a pretty good point, you know. Like, you're not wrong. 350 00:19:56,760 --> 00:19:59,720 Speaker 1: But also, just the idea that these models will 351 00:19:59,720 --> 00:20:04,520 Speaker 1: get better and better at storytelling, at creating fiction, at 352 00:20:04,520 --> 00:20:07,440 Speaker 1: creating images to go with fiction, that is not a guarantee.
353 00:20:07,600 --> 00:20:09,399 Speaker 1: One of the reasons why that's not a guarantee is 354 00:20:09,400 --> 00:20:12,800 Speaker 1: that the popularity of AI tools means that the Internet, 355 00:20:12,840 --> 00:20:15,160 Speaker 1: at a very rapid pace, is being flooded with more 356 00:20:15,200 --> 00:20:17,720 Speaker 1: AI generated stuff. Right? More and more of this is 357 00:20:17,720 --> 00:20:20,359 Speaker 1: getting spat out on the Internet every day. And because 358 00:20:20,480 --> 00:20:24,520 Speaker 1: new AIs will be trained on this content, that means 359 00:20:24,520 --> 00:20:27,400 Speaker 1: that new AIs, and AIs that are updated to have 360 00:20:28,080 --> 00:20:30,959 Speaker 1: more stuff from after twenty twenty two, are going to 361 00:20:31,040 --> 00:20:34,280 Speaker 1: be trained on stuff that they generated. So you are 362 00:20:34,359 --> 00:20:38,080 Speaker 1: feeding AI art and text back into the model. 363 00:20:38,119 --> 00:20:41,960 Speaker 1: It's a feedback loop. It can lead to what researchers 364 00:20:42,000 --> 00:20:44,720 Speaker 1: call model collapse, right, which is the idea that, well, 365 00:20:44,920 --> 00:20:47,960 Speaker 1: these derangements, these messed up illustrations, 366 00:20:48,000 --> 00:20:51,879 Speaker 1: the faults. If you're feeding this back and retraining it 367 00:20:51,960 --> 00:20:54,920 Speaker 1: on flawed stuff that it had already put out, it's going 368 00:20:54,920 --> 00:20:58,280 Speaker 1: to just keep exacerbating those flaws. A group of researchers 369 00:20:58,320 --> 00:21:02,119 Speaker 1: publishing on arXiv described this as what happens 370 00:21:02,160 --> 00:21:05,159 Speaker 1: when, quote, the use of model generated content in training 371 00:21:05,200 --> 00:21:09,119 Speaker 1: causes irreversible defects in the resulting models.
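[Editor's note: the feedback loop described above can be sketched in a few lines of Python. This is a toy illustration only, not anything from the episode or from the researchers' actual method; the word-frequency setup is invented for demonstration. Each "generation" of a pretend model is re-estimated from a finite sample of the previous generation's output, so rare items that a sample happens to miss are gone for good, and the diversity of the output can only shrink over generations.]

```python
import random

def train_on(corpus_counts, sample_size, rng):
    """'Train' a toy model: estimate next-generation frequencies from a
    finite sample drawn from the previous generation's output."""
    items = list(corpus_counts)
    weights = [corpus_counts[w] for w in items]
    sample = rng.choices(items, weights=weights, k=sample_size)
    new_counts = {}
    for w in sample:
        new_counts[w] = new_counts.get(w, 0) + 1
    return new_counts

rng = random.Random(0)
# Generation 0: a "human" corpus with a long tail of rare words.
corpus = {f"word{i}": max(1, 1000 // (i + 1)) for i in range(100)}

diversity = [len(corpus)]
for generation in range(10):
    corpus = train_on(corpus, sample_size=500, rng=rng)
    diversity.append(len(corpus))

# Tail words vanish once a finite sample misses them, and they can
# never come back: vocabulary size never grows across generations.
assert all(a >= b for a, b in zip(diversity, diversity[1:]))
print(diversity)
```

Real model collapse involves far more machinery than this, but the one-way loss of the distribution's tail is the mechanism the quoted paper describes.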
Oh, like that 372 00:21:09,240 --> 00:21:13,720 Speaker 1: movie Multiplicity. Exactly, like Multiplicity. That's right, that's right. This 373 00:21:13,840 --> 00:21:19,040 Speaker 1: is a Multiplicity kind of situation. Yeah. And I find 374 00:21:19,040 --> 00:21:22,520 Speaker 1: this particularly worrisome because the present models are already pretty 375 00:21:22,520 --> 00:21:26,879 Speaker 1: full of defects. Take the storybook generated for a video 376 00:21:27,040 --> 00:21:30,359 Speaker 1: called I Create a Best Selling Children's Book Using AI 377 00:21:30,600 --> 00:21:33,320 Speaker 1: in Under an Hour. Now, the creator of this video, 378 00:21:33,400 --> 00:21:36,720 Speaker 1: whose name is Grayson Sands, looks like what you'd get 379 00:21:36,760 --> 00:21:39,080 Speaker 1: if you fed Midjourney the prompt, what if Ron 380 00:21:39,119 --> 00:21:42,119 Speaker 1: Weasley was a registered sex offender? And I don't know 381 00:21:42,160 --> 00:21:45,400 Speaker 1: if I'll include that joke in the final article. It's 382 00:21:45,480 --> 00:21:48,959 Speaker 1: mean and not proper for, like, a serious piece of journalism. 383 00:21:49,240 --> 00:21:50,640 Speaker 1: But I don't like this guy. 384 00:21:51,440 --> 00:21:54,639 Speaker 2: It's also, it's also just, I mean, because we're an 385 00:21:54,640 --> 00:21:59,080 Speaker 2: audio podcast, right? So it's also just, for everyone playing 386 00:21:59,119 --> 00:22:04,159 Speaker 2: along at home, uh, do look at, do look it up. 387 00:22:04,480 --> 00:22:07,400 Speaker 2: You know, he's got black hair. 388 00:22:07,560 --> 00:22:11,440 Speaker 1: He's got the big, like, white framed sunglasses. He's got 389 00:22:11,440 --> 00:22:14,320 Speaker 1: a leather jacket on. There's a guitar hung on his 390 00:22:14,400 --> 00:22:15,680 Speaker 1: wall behind him for some reason. 391 00:22:15,760 --> 00:22:19,600 Speaker 2: He's got epaulets, which for some reason bothers me.
It's, 392 00:22:19,920 --> 00:22:24,400 Speaker 2: it's like a... Again, we're very much not into body shaming. 393 00:22:24,680 --> 00:22:27,800 Speaker 1: This is not about his body shape or anything 394 00:22:27,800 --> 00:22:32,800 Speaker 1: like that. This is about his aesthetic choices, his vibe. 395 00:22:33,240 --> 00:22:38,280 Speaker 2: His vibe. His vibe looks off. So wait, in under 396 00:22:38,320 --> 00:22:41,800 Speaker 2: an hour, we create a best selling children's book? 397 00:22:42,960 --> 00:22:45,119 Speaker 1: That's what he promises in the video. 398 00:22:45,640 --> 00:22:47,720 Speaker 2: Sounds like there should be some air quotes around a 399 00:22:47,720 --> 00:22:49,400 Speaker 2: couple of these things, Robert. 400 00:22:49,760 --> 00:22:52,479 Speaker 1: All of them. So he decides he's going to generate 401 00:22:52,520 --> 00:22:55,600 Speaker 1: a book for kids based on, based on a 402 00:22:55,720 --> 00:22:58,280 Speaker 1: character in a tattoo he got. He says, like, 403 00:22:58,359 --> 00:23:01,280 Speaker 1: I got this tattoo for no reason, it means nothing 404 00:23:01,320 --> 00:23:03,840 Speaker 1: to me, basically. And it's a tattoo of a stegosaurus 405 00:23:04,080 --> 00:23:09,520 Speaker 1: playing a stand-up bass. Or, I think 406 00:23:09,560 --> 00:23:13,000 Speaker 1: it was a triceratops. Sorry, I don't know why I want to say stegosaurus. Jesus Christ. 407 00:23:13,080 --> 00:23:15,879 Speaker 1: See, I used to know dinosaur stuff when 408 00:23:15,920 --> 00:23:16,959 Speaker 1: I was a kid. You forget. 409 00:23:18,080 --> 00:23:20,080 Speaker 3: It doesn't even look like a dinosaur. 410 00:23:20,160 --> 00:23:20,440 Speaker 4: Really. 411 00:23:20,920 --> 00:23:23,280 Speaker 1: It's not a good tattoo. And it's one of those 412 00:23:23,280 --> 00:23:25,359 Speaker 1: things that's like, look, I have a lot of tattoos. 413 00:23:25,359 --> 00:23:27,679 Speaker 1: I love tattoos.
I don't think every tattoo has to 414 00:23:27,720 --> 00:23:30,600 Speaker 1: have deep meaning. But if you're specifically being 415 00:23:30,640 --> 00:23:34,359 Speaker 1: like, I got this random tattoo for no reason, 416 00:23:34,480 --> 00:23:36,439 Speaker 1: and now I'm going to make a book, I'm going 417 00:23:36,520 --> 00:23:38,600 Speaker 1: to try to trick kids into reading a book about 418 00:23:38,680 --> 00:23:41,320 Speaker 1: the character in this tattoo, that just makes me angry. 419 00:23:41,359 --> 00:23:41,760 Speaker 1: It does. 420 00:23:42,000 --> 00:23:43,200 Speaker 3: I don't like it, Robert. 421 00:23:43,280 --> 00:23:46,440 Speaker 1: Fuck you for emphasizing how little you care about 422 00:23:46,440 --> 00:23:49,600 Speaker 1: what you're making. So, yeah, he feeds a picture of 423 00:23:49,600 --> 00:23:51,199 Speaker 1: this tattoo, and this is one of those, like, there 424 00:23:51,200 --> 00:23:53,119 Speaker 1: are cool things these AIs can do. The fact that 425 00:23:53,160 --> 00:23:55,520 Speaker 1: you can take a picture of your tattoo and say, hey, 426 00:23:55,600 --> 00:23:57,640 Speaker 1: use this as an illustration in a storybook? That's 427 00:23:57,680 --> 00:24:01,280 Speaker 1: kind of neat, you know. But again, considering the crude 428 00:24:01,400 --> 00:24:03,600 Speaker 1: drawing that it has to work with, Midjourney does 429 00:24:03,640 --> 00:24:07,960 Speaker 1: a decent enough job turning this into a character. It's, 430 00:24:07,960 --> 00:24:12,000 Speaker 1: like, it's okay, it's not good, but it's, like, fine-ish. 431 00:24:12,200 --> 00:24:14,560 Speaker 1: It's off-ish. But what's really off? 432 00:24:15,560 --> 00:24:16,320 Speaker 2: It's intriguing. 433 00:24:16,800 --> 00:24:20,360 Speaker 1: Yeah, there's... What's really off is the world behind 434 00:24:20,480 --> 00:24:24,880 Speaker 1: the character.
There's almost a little bit of a Seussian vibe, 435 00:24:24,880 --> 00:24:27,960 Speaker 1: but, like, without any kind of intent behind it, which 436 00:24:27,960 --> 00:24:31,399 Speaker 1: is unsettling. And this becomes more obvious in the subsequent pages. 437 00:24:33,200 --> 00:24:37,240 Speaker 1: Look at this. Like, there's random music signs drifting 438 00:24:37,280 --> 00:24:41,200 Speaker 1: through, hung like garlands on the branchless trees. All 439 00:24:41,240 --> 00:24:44,600 Speaker 1: of the dinosaurs have these weird long arcing heads that 440 00:24:44,640 --> 00:24:47,400 Speaker 1: almost make them shaped like me. Like, the 441 00:24:47,440 --> 00:24:49,879 Speaker 1: sun has a giant question mark in the middle of it, 442 00:24:49,960 --> 00:24:52,080 Speaker 1: and then there's a second question mark next to the 443 00:24:52,119 --> 00:24:54,399 Speaker 1: sun for no reason. Like, this is. 444 00:24:54,520 --> 00:24:59,480 Speaker 2: Very soft DALL-E. Yeah, yeah, yeah, a little bit. It's 445 00:25:00,119 --> 00:25:00,720 Speaker 2: really bad. 446 00:25:01,280 --> 00:25:05,479 Speaker 1: Yeah. God, this next page. Because as it goes on, 447 00:25:05,640 --> 00:25:10,320 Speaker 1: like, it gets increasingly more divorced from anything. Like, okay, 448 00:25:09,880 --> 00:25:13,560 Speaker 1: how would you describe this? Almost pornographic, right? Like, those 449 00:25:13,560 --> 00:25:14,919 Speaker 1: are tits on that tree. 450 00:25:16,440 --> 00:25:18,480 Speaker 2: Those are, those are genitalia. 451 00:25:18,880 --> 00:25:21,639 Speaker 1: Yeah, like, yeah, there's some dicks and some tits 452 00:25:21,680 --> 00:25:27,200 Speaker 1: in this one. It's weird, right? And, yeah, I can admit, 453 00:25:27,359 --> 00:25:30,879 Speaker 1: like, there's something interesting and amusing about how hallucinatory this is.
454 00:25:30,960 --> 00:25:33,800 Speaker 1: If this was, like, a Winamp visualization, like, if 455 00:25:33,840 --> 00:25:36,000 Speaker 1: I was, if I had just taken a bunch of 456 00:25:36,040 --> 00:25:38,760 Speaker 1: 5-MeO-DMT and was, like, putting 457 00:25:38,760 --> 00:25:41,439 Speaker 1: on a Murder City Devils record and sat, like, 458 00:25:41,480 --> 00:25:45,160 Speaker 1: watching, I don't know, uh, yeah, like, Winamp visualizations, 459 00:25:45,160 --> 00:25:47,080 Speaker 1: and I got shit like this, I'd be like, oh cool, 460 00:25:47,200 --> 00:25:50,280 Speaker 1: that's kind of neat. But it doesn't have anything to 461 00:25:50,359 --> 00:25:52,960 Speaker 1: do with the story, right? The story that this guy 462 00:25:53,000 --> 00:25:55,680 Speaker 1: has had ChatGPT write for his book is about 463 00:25:55,680 --> 00:26:01,719 Speaker 1: a brachiosaurus who plays piano teaching a triceratops to play bass, right? Like, 464 00:26:02,040 --> 00:26:04,160 Speaker 1: there's no reason for the art to look 465 00:26:04,240 --> 00:26:07,080 Speaker 1: like this. It's not like it's in keeping. 466 00:26:09,240 --> 00:26:12,960 Speaker 1: He hasn't, like, written, like, a psychedelic dinosaur story here, 467 00:26:13,000 --> 00:26:14,960 Speaker 1: where this would be kind of, like, fitting. It's, like, 468 00:26:15,000 --> 00:26:17,359 Speaker 1: a very basic story about a dinosaur that wants to 469 00:26:17,440 --> 00:26:19,560 Speaker 1: learn bass and meets a friend who teaches him how 470 00:26:19,560 --> 00:26:21,879 Speaker 1: to do music. Like, it doesn't make any sense that 471 00:26:21,920 --> 00:26:24,640 Speaker 1: it looks this way. It's just confusing to kids.
At 472 00:26:24,640 --> 00:26:26,800 Speaker 1: the end of the video, Grayson tells us how he 473 00:26:26,840 --> 00:26:29,840 Speaker 1: got this book to be a bestseller, and it turns 474 00:26:29,880 --> 00:26:31,760 Speaker 1: out he just called up a bunch of his friends 475 00:26:31,800 --> 00:26:33,879 Speaker 1: and family and told them he'd written a book that 476 00:26:33,960 --> 00:26:35,639 Speaker 1: was on Amazon, and then he begged them all to 477 00:26:35,680 --> 00:26:38,400 Speaker 1: buy it. They did, and then it went up through 478 00:26:38,440 --> 00:26:40,760 Speaker 1: the rankings, because it doesn't take a lot to do 479 00:26:40,800 --> 00:26:43,000 Speaker 1: that with print books, and it started appearing higher in 480 00:26:43,040 --> 00:26:46,560 Speaker 1: Amazon's search results. And he's like, hey, guys, I wrote 481 00:26:46,560 --> 00:26:48,120 Speaker 1: a book, would you buy it? And they're like, wow, 482 00:26:48,160 --> 00:26:50,439 Speaker 1: you wrote a book! And, like, man, when they 483 00:26:50,520 --> 00:26:54,880 Speaker 1: get a sight of what you've actually done... I don't know. 484 00:26:55,119 --> 00:26:58,240 Speaker 1: Don't invite this kid. Look, if you're his parents, it's 485 00:26:58,320 --> 00:27:02,159 Speaker 1: time to just cut bait, you know. Lock the doors 486 00:27:02,160 --> 00:27:04,359 Speaker 1: when he comes by for the holidays. Don't let Grayson in. 487 00:27:04,800 --> 00:27:08,840 Speaker 2: Yeah, it is very... What we're seeing here, 488 00:27:08,880 --> 00:27:13,320 Speaker 2: to your earlier point about this breaking kids, right, and 489 00:27:13,840 --> 00:27:19,159 Speaker 2: the fundamentals of storytelling, is, if you're a kid, 490 00:27:19,280 --> 00:27:22,840 Speaker 2: you're reading this, and you're seeing two very divided things. 491 00:27:22,880 --> 00:27:26,760 Speaker 2: There are two different stories being told.
One is a 492 00:27:26,960 --> 00:27:31,320 Speaker 2: kind of Mad Libs-style story about a brachiosaurus who feels 493 00:27:31,359 --> 00:27:35,919 Speaker 2: authoritative enough on bass to teach a triceratops or whatever. 494 00:27:36,400 --> 00:27:42,879 Speaker 2: And the second thing is the clear, violent mental decline 495 00:27:43,320 --> 00:27:45,840 Speaker 2: of an artist. That's what it looks like, like the 496 00:27:46,800 --> 00:27:50,760 Speaker 2: styles keep switching, you know what I mean? Like, yeah, 497 00:27:51,320 --> 00:27:53,480 Speaker 2: I get, I get what you're saying, man. I think 498 00:27:53,520 --> 00:27:58,280 Speaker 2: that is, I think, another dangerous part of this is 499 00:27:58,320 --> 00:28:00,920 Speaker 2: that if you were a kid reading, right? When you're 500 00:28:00,920 --> 00:28:04,520 Speaker 2: a kid, you prize the books, you prize the information 501 00:28:04,640 --> 00:28:07,359 Speaker 2: you have access to, as a sponge. So if you 502 00:28:07,440 --> 00:28:11,879 Speaker 2: are reading these things and you're quite impressionable, then you 503 00:28:12,320 --> 00:28:19,479 Speaker 2: will have these sort of indelibly imprinted in your mind, 504 00:28:19,800 --> 00:28:27,040 Speaker 2: and years, decades later, you might say, gosh, triceratops should 505 00:28:27,080 --> 00:28:28,040 Speaker 2: get into jazz. 506 00:28:28,480 --> 00:28:31,000 Speaker 1: Yeah, exactly. This could lead to a whole new world 507 00:28:31,040 --> 00:28:35,040 Speaker 1: of jazz heads, and then they'll be smoking their jazz 508 00:28:35,119 --> 00:28:39,520 Speaker 1: cigarettes. We don't need that kind of shit, listeners.
Regular 509 00:28:39,600 --> 00:28:42,040 Speaker 1: listeners will know this, but if you're new to the show, 510 00:28:42,440 --> 00:28:46,120 Speaker 1: Ben and I are both the angry faculty members from 511 00:28:46,160 --> 00:28:48,440 Speaker 1: Back to the Future who were trying to stop Marty 512 00:28:48,440 --> 00:28:50,920 Speaker 1: from hanging out with those jazz singers. That was our 513 00:28:50,920 --> 00:28:52,760 Speaker 1: big break. Yeah, that was our big break. That was 514 00:28:52,800 --> 00:28:57,960 Speaker 1: our big break. Speaking of old timey fifties people being 515 00:28:57,960 --> 00:29:01,560 Speaker 1: bigoted against jazz music, you know who also hates jazz? 516 00:29:02,560 --> 00:29:06,160 Speaker 1: Sponsors of this podcast. Oh yeah, yeah, they hate it. 517 00:29:06,320 --> 00:29:10,160 Speaker 3: We make them send us a certified document stating that, 518 00:29:10,280 --> 00:29:12,200 Speaker 3: with, with their signature on 519 00:29:12,200 --> 00:29:16,880 Speaker 2: it, or else. It's a jazz affidavit. 520 00:29:17,960 --> 00:29:28,240 Speaker 1: Yeah, that's right. We're back. And boy, you know, you 521 00:29:28,280 --> 00:29:29,800 Speaker 1: know what jazz today is missing? 522 00:29:30,440 --> 00:29:31,320 Speaker 2: What's that, Robert? 523 00:29:31,480 --> 00:29:35,320 Speaker 1: Jazz today is missing William Riker wearing a onesie, sitting 524 00:29:35,360 --> 00:29:39,960 Speaker 1: awkwardly backwards in a chair, playing the saxophone in order 525 00:29:40,080 --> 00:29:44,560 Speaker 1: to win the love of an AI generated woman in 526 00:29:44,600 --> 00:29:47,160 Speaker 1: that one episode where, like, he and Picard are kind 527 00:29:47,200 --> 00:29:49,440 Speaker 1: of gooning together on the holodeck, if you know 528 00:29:49,480 --> 00:29:53,120 Speaker 1: the term gooning. My favorite episode of Star Trek, by the way.
529 00:29:53,040 --> 00:29:55,760 Speaker 2: I've learned so much in the space of 530 00:29:56,360 --> 00:29:57,800 Speaker 2: less than sixty seconds. 531 00:29:57,880 --> 00:30:00,280 Speaker 1: Yeah, that's right. We're gonna get some t-shirts 532 00:30:00,280 --> 00:30:02,600 Speaker 1: out for you people. That's, that's Riker and Picard 533 00:30:02,720 --> 00:30:06,000 Speaker 1: just gooning out together with that, uh, that AI 534 00:30:06,120 --> 00:30:09,640 Speaker 1: lady in the jazz club. Just really, really. Because 535 00:30:09,680 --> 00:30:11,960 Speaker 1: it's the twenty-fourth century. They don't have shame 536 00:30:12,240 --> 00:30:12,800 Speaker 1: like we do. 537 00:30:12,960 --> 00:30:16,959 Speaker 3: You know, you act like Rory Blank couldn't make a 538 00:30:17,080 --> 00:30:19,280 Speaker 3: beautiful design with that guy. 539 00:30:18,840 --> 00:30:21,720 Speaker 2: I intend to have Rory Blank do a Picard and 540 00:30:21,840 --> 00:30:23,440 Speaker 2: Riker gooning T-shirt. 541 00:30:23,520 --> 00:30:23,880 Speaker 4: Hi. 542 00:30:24,040 --> 00:30:28,400 Speaker 2: Well, it's a post-shame economy in that world, in 543 00:30:28,440 --> 00:30:29,400 Speaker 2: that universe. 544 00:30:29,040 --> 00:30:32,200 Speaker 1: Oh, absolutely. No, there's no such thing as shame. 545 00:30:33,400 --> 00:30:36,320 Speaker 1: They are, they are all as horny as a letter 546 00:30:37,400 --> 00:30:42,000 Speaker 1: from... uh, oh shit, I just ruined this joke because 547 00:30:42,000 --> 00:30:44,800 Speaker 1: I said James Joyce. As a letter from James Joyce 548 00:30:45,440 --> 00:30:49,640 Speaker 1: to his wife. Yeah. Google James Joyce love letters. Sheesh. 549 00:30:49,800 --> 00:30:53,360 Speaker 1: You'll learn some fun things about one of history's greatest artists. 550 00:30:54,360 --> 00:30:58,640 Speaker 1: So, speaking of not one of history's greatest artists.
551 00:30:58,800 --> 00:31:02,080 Speaker 2: Sure. In under an hour he wrote this book? 552 00:31:02,400 --> 00:31:06,280 Speaker 1: Yeah, yeah, James Joyce wrote Ulysses in less than an hour. No, 553 00:31:06,280 --> 00:31:08,640 Speaker 1: it took him a lot longer, you know. Actually, when 554 00:31:08,680 --> 00:31:11,240 Speaker 1: you read Ulysses... because one of the most famous scenes 555 00:31:11,280 --> 00:31:13,479 Speaker 1: in that book, that got it attacked by a lot 556 00:31:13,480 --> 00:31:18,200 Speaker 1: of anti-obscenity laws, is one of the characters walking 557 00:31:18,320 --> 00:31:21,280 Speaker 1: along the Strand, basically, and masturbating through a hole in 558 00:31:21,320 --> 00:31:24,520 Speaker 1: his pocket while his wife fucks some other guy. And 559 00:31:24,880 --> 00:31:28,280 Speaker 1: when you are reading a story about a man masturbating 560 00:31:28,280 --> 00:31:31,200 Speaker 1: through a hole in his pocket, again, all fiction is 561 00:31:31,200 --> 00:31:34,280 Speaker 1: a dialogue between reader and author. You're kind of gooning 562 00:31:34,320 --> 00:31:35,280 Speaker 1: with James Joyce. 563 00:31:36,320 --> 00:31:37,880 Speaker 2: Beautiful. Good point. 564 00:31:37,880 --> 00:31:41,320 Speaker 1: Isn't that beautiful? Inspiring art can really take us to 565 00:31:41,400 --> 00:31:44,160 Speaker 1: some amazing places. 566 00:31:44,200 --> 00:31:47,720 Speaker 2: Sure can. You know, literacy is the first holodeck, 567 00:31:47,840 --> 00:31:52,720 Speaker 2: you know? Exactly.
It's also, it's also like, you know, 568 00:31:52,800 --> 00:31:56,560 Speaker 2: the thing I think people forget often is that being 569 00:31:56,720 --> 00:32:03,720 Speaker 2: able to read, to encounter, uh, story, is sort 570 00:32:03,760 --> 00:32:07,720 Speaker 2: of the closest folks have gotten to necromancy, right? 571 00:32:07,840 --> 00:32:11,200 Speaker 2: Time travel, speaking with the dead. And, uh, you gotta 572 00:32:11,240 --> 00:32:15,240 Speaker 2: be, you gotta be careful with this stuff. Also, Finnegans 573 00:32:15,240 --> 00:32:17,480 Speaker 2: Wake, man. Just between us, I can tell you're a 574 00:32:17,520 --> 00:32:26,640 Speaker 2: fellow, uh, enthusiast. Absolutely, yeah, our filthy, our filthy pal James. Yeah, 575 00:32:26,720 --> 00:32:29,920 Speaker 2: do you think James Joyce knew what Finnegans Wake is about? 576 00:32:30,920 --> 00:32:32,760 Speaker 1: I go back and forth on that. One 577 00:32:32,760 --> 00:32:35,760 Speaker 1: of my, one of my, uh, fiction teachers, when I 578 00:32:35,880 --> 00:32:38,960 Speaker 1: was, like, a younger man, was very angry whenever you 579 00:32:38,960 --> 00:32:41,320 Speaker 1: would bring up Finnegans Wake, and, like, he hated 580 00:32:41,400 --> 00:32:43,480 Speaker 1: James Joyce, and he thought, like, the whole book was 581 00:32:43,480 --> 00:32:47,479 Speaker 1: a con. I find it... like, sometimes I'll go 582 00:32:47,560 --> 00:32:50,120 Speaker 1: through it when I'm trying to go to sleep, just 583 00:32:50,160 --> 00:32:52,560 Speaker 1: because, like, it's... it's actually, it's kind of, I 584 00:32:52,560 --> 00:32:55,360 Speaker 1: think, like Blood Meridian, challenging in some ways 585 00:32:55,360 --> 00:32:57,800 Speaker 1: that aren't dissimilar, because, like, so much of reading Blood 586 00:32:57,840 --> 00:32:59,719 Speaker 1: Meridian is, like, what the fuck did he mean by that?
587 00:32:59,800 --> 00:33:01,800 Speaker 1: Like, what was actually going on here? 588 00:33:01,840 --> 00:33:05,280 Speaker 1: Blood Meridian is a lot, is much closer to, like, 589 00:33:05,640 --> 00:33:08,520 Speaker 1: a normal novel than, than Finnegans Wake is, though. But, 590 00:33:08,640 --> 00:33:11,360 Speaker 1: like, they're both interesting. Also, when I was a kid, 591 00:33:11,440 --> 00:33:14,360 Speaker 1: I read a series of mystery books that were themed 592 00:33:14,400 --> 00:33:17,920 Speaker 1: after Finnegans Wake, where, like, the character's name was Finnegan Zwake, 593 00:33:18,400 --> 00:33:21,640 Speaker 1: like, Z-W-A-K-E. Weird. I just remembered that 594 00:33:21,760 --> 00:33:25,000 Speaker 1: now. It's an odd idea to make, like, a series 595 00:33:25,040 --> 00:33:29,240 Speaker 1: of children's mystery novels themed after James Joyce's Finnegans 596 00:33:28,840 --> 00:33:33,920 Speaker 2: Wake. ChatGPT, make a, make an intriguing story 597 00:33:34,000 --> 00:33:38,320 Speaker 2: about a dinosaur learning to play the piccolo, written in 598 00:33:38,400 --> 00:33:41,800 Speaker 2: the style of Cormac McCarthy. Oh, also make it mind blowing. 599 00:33:42,080 --> 00:33:47,760 Speaker 1: Yeah, yeah, definitely make it mind blowing. God. Okay, 600 00:33:48,000 --> 00:33:52,320 Speaker 1: so we were just talking about this fucked up, uh, 601 00:33:52,440 --> 00:33:55,440 Speaker 1: Dandy the Dinosaur. It's the dog shit book that 602 00:33:55,480 --> 00:34:00,560 Speaker 1: this guy generates based off of his tattoo. There's 603 00:34:00,560 --> 00:34:03,080 Speaker 1: only two reviews for it. Both of them are five stars, 604 00:34:03,120 --> 00:34:04,960 Speaker 1: but, like, I don't know if they're real or not, 605 00:34:05,160 --> 00:34:07,840 Speaker 1: or if, like, he had family members write reviews for 606 00:34:07,920 --> 00:34:10,359 Speaker 1: him to try to help his book sell.
So it's 607 00:34:10,360 --> 00:34:12,640 Speaker 1: one of those things. He's gaming the system here, right? 608 00:34:12,680 --> 00:34:14,799 Speaker 1: He's having, like, his friends and family buy a bunch 609 00:34:14,800 --> 00:34:17,360 Speaker 1: of copies to try to shoot it up the Amazon 610 00:34:17,440 --> 00:34:21,080 Speaker 1: rankings in the hope that that generates organic sales. I 611 00:34:21,160 --> 00:34:23,719 Speaker 1: can't tell, based on the information that's available to us, 612 00:34:23,719 --> 00:34:26,640 Speaker 1: if Grayson's effort to game the system was successful or not, 613 00:34:26,760 --> 00:34:30,520 Speaker 1: if it, like, made him money. But the basic tactics 614 00:34:30,520 --> 00:34:34,200 Speaker 1: he's engaging in can be used, and will be used, 615 00:34:34,239 --> 00:34:36,960 Speaker 1: by other people generating AI books who have access to 616 00:34:37,040 --> 00:34:41,560 Speaker 1: more resources and who might have actual agendas. Because this 617 00:34:41,600 --> 00:34:43,680 Speaker 1: is the kind of thing that already happens, right? Like, 618 00:34:43,760 --> 00:34:46,239 Speaker 1: if you've got, you know, you have, like, a celebrity 619 00:34:46,320 --> 00:34:49,120 Speaker 1: or a politician who, like, releases some ghostwritten book, 620 00:34:49,360 --> 00:34:51,799 Speaker 1: and they, like, work for the Heritage Foundation or some, 621 00:34:51,960 --> 00:34:54,680 Speaker 1: like, fucking think tank or whatever, that think tank will 622 00:34:54,680 --> 00:34:56,960 Speaker 1: buy, like, ten thousand copies of the book so that 623 00:34:57,000 --> 00:34:59,040 Speaker 1: it makes it onto the New York Times bestseller list. 624 00:35:00,239 --> 00:35:03,799 Speaker 1: Like, that's a common tactic, right? And what 625 00:35:03,880 --> 00:35:06,640 Speaker 1: Grayson was kind of incompetently attempting 626 00:35:06,680 --> 00:35:08,600 Speaker 1: to do is the Kindle version of that.
But there's 627 00:35:08,640 --> 00:35:12,120 Speaker 1: no reason why people, again, who have an 628 00:35:12,160 --> 00:35:16,200 Speaker 1: actual agenda, couldn't generate a series of children's books, buy 629 00:35:16,200 --> 00:35:17,759 Speaker 1: a bunch of copies of them to get them to 630 00:35:17,800 --> 00:35:19,680 Speaker 1: shoot up in the rankings, in the hope that that 631 00:35:20,080 --> 00:35:22,239 Speaker 1: tricks a bunch of charities and parents and all these 632 00:35:22,320 --> 00:35:25,719 Speaker 1: kinds of libraries and stuff, to flood the market, 633 00:35:25,800 --> 00:35:28,560 Speaker 1: to flood kids with copies of whatever book they've got. 634 00:35:28,600 --> 00:35:30,400 Speaker 1: And when it comes to, like, how that could be 635 00:35:30,440 --> 00:35:35,600 Speaker 1: unsettling, that brings me to another unfortunate soul, another 636 00:35:35,680 --> 00:35:41,680 Speaker 1: unfortunate AI author. His name is Lucas Kitchen. You 637 00:35:41,719 --> 00:35:44,480 Speaker 1: can see Lucas up there in the front image. 638 00:35:44,920 --> 00:35:47,720 Speaker 3: Can I just say, all these people, same vibe? 639 00:35:48,200 --> 00:35:50,799 Speaker 1: They all have the exact same vibe. You know, they 640 00:35:50,800 --> 00:35:54,560 Speaker 1: all have strong opinions about ape pictures and how much 641 00:35:54,600 --> 00:35:55,680 Speaker 1: money they might be worth. 642 00:35:56,080 --> 00:35:57,600 Speaker 2: I can't wait to hear theirs. 643 00:35:59,200 --> 00:36:03,200 Speaker 1: Yeah, oh yeah. And, uh, yeah, Lucas.
You know, based 644 00:36:03,239 --> 00:36:05,960 Speaker 1: on the illustration that we've got down there for his book, 645 00:36:06,680 --> 00:36:09,080 Speaker 1: it features, like, you've got this kind of old man 646 00:36:09,360 --> 00:36:12,520 Speaker 1: on one side looking at, like, a leather bound book, 647 00:36:12,760 --> 00:36:15,040 Speaker 1: and then, like, a puppy dog in the middle with 648 00:36:15,080 --> 00:36:17,640 Speaker 1: some, like, creepy gnomes under it, and then, like, this 649 00:36:18,239 --> 00:36:22,480 Speaker 1: very demonic looking, rabid unicorn character that does not look 650 00:36:22,560 --> 00:36:24,759 Speaker 1: like the same style of art as the, as the 651 00:36:24,800 --> 00:36:28,680 Speaker 1: other characters. Looking at the cover of his book, you 652 00:36:28,680 --> 00:36:30,560 Speaker 1: would guess that it's, like, the story of an old 653 00:36:30,600 --> 00:36:32,960 Speaker 1: man and his dog, maybe getting murdered in the woods 654 00:36:32,960 --> 00:36:35,960 Speaker 1: by a unicorn. But what he's actually writing here 655 00:36:36,040 --> 00:36:39,400 Speaker 1: is much more frightening, because Lucas is an evangelical Christian. 656 00:36:39,440 --> 00:36:43,320 Speaker 1: He might be a fundamentalist. He writes science fiction books 657 00:36:43,320 --> 00:36:47,719 Speaker 1: about, like, proselytizing on Mars. He, like, he has, like, 658 00:36:47,760 --> 00:36:49,600 Speaker 1: a bunch of, like... He does, like, 659 00:36:50,120 --> 00:36:55,080 Speaker 1: a large number of, like, weird, kind of Christian evangelical fiction. 660 00:36:56,080 --> 00:36:58,839 Speaker 1: And the story prompt that he feeds his AI is 661 00:36:58,960 --> 00:37:01,320 Speaker 1: one of the more absurd ones I've come across, and 662 00:37:01,360 --> 00:37:03,680 Speaker 1: I'm going to read that to you now. Write a 663 00:37:03,840 --> 00:37:06,760 Speaker 1: children's book where the protagonist is a little puppy named Fluff.
664 00:37:06,760 --> 00:37:09,080 Speaker 1: Fluff wants someone to tell him about Jesus, but he 665 00:37:09,080 --> 00:37:12,160 Speaker 1: can't read John 3:16, so he needs a solution. 666 00:37:12,600 --> 00:37:15,799 Speaker 1: The antagonist of this story is a bad unicorn, and 667 00:37:15,880 --> 00:37:18,960 Speaker 1: the story includes a field with trees. Also include magic 668 00:37:19,000 --> 00:37:21,560 Speaker 1: in the forest. Also include a character who 669 00:37:21,560 --> 00:37:25,399 Speaker 1: says uh-uh over and over. Now, if that sounds weird, 670 00:37:25,400 --> 00:37:28,040 Speaker 1: it's because Lucas has his kids help him write the prompt. 671 00:37:28,280 --> 00:37:30,800 Speaker 1: Which, you know, I try not to be totally negative. 672 00:37:30,840 --> 00:37:32,440 Speaker 1: I can see, like, if you're a parent, you know, 673 00:37:32,520 --> 00:37:34,160 Speaker 1: and you've got kids who are a little bit older 674 00:37:34,280 --> 00:37:36,800 Speaker 1: and, you know, who can read a bit themselves, 675 00:37:37,120 --> 00:37:39,440 Speaker 1: you sit down and you have them, all right, you know: 676 00:37:39,520 --> 00:37:41,920 Speaker 1: what kind of bedtime story 677 00:37:41,920 --> 00:37:44,000 Speaker 1: do we want? Give me some character names and a plot, 678 00:37:44,000 --> 00:37:46,160 Speaker 1: and let's plug it into ChatGPT. And then, 679 00:37:46,160 --> 00:37:48,040 Speaker 1: if you're a good parent, a way that this could 680 00:37:48,040 --> 00:37:51,040 Speaker 1: be a good learning exercise is: it generates a crappy 681 00:37:51,080 --> 00:37:52,800 Speaker 1: AI story, and then you sit down and you go 682 00:37:52,880 --> 00:37:55,040 Speaker 1: over it with your kids and you go, well, what's missing here? 683 00:37:55,080 --> 00:37:57,799 Speaker 1: Why doesn't this work? You know, what is, like...
684 00:37:57,880 --> 00:37:59,799 Speaker 1: Why isn't this complete? What are the things that we 685 00:38:00,160 --> 00:38:02,600 Speaker 1: add to this in order to make an actual proper story? 686 00:38:03,000 --> 00:38:06,920 Speaker 1: You could actually teach kids something about storytelling in a 687 00:38:06,920 --> 00:38:10,000 Speaker 1: way that would be useful doing that. That's not what 688 00:38:10,120 --> 00:38:11,440 Speaker 1: Lucas is doing here. Don't worry. 689 00:38:11,480 --> 00:38:14,840 Speaker 2: I mean he's not you just like like again, because 690 00:38:14,880 --> 00:38:17,239 Speaker 2: I know one of the things we get sometimes is 691 00:38:17,280 --> 00:38:21,400 Speaker 2: the idea that we are not fun at parties or 692 00:38:21,480 --> 00:38:24,520 Speaker 2: we're dark or depressing. But but I think you laid out 693 00:38:24,560 --> 00:38:29,719 Speaker 2: a really good, like hypothetical scenario. What what if of 694 00:38:30,960 --> 00:38:36,920 Speaker 2: children and parents communicating emergent storytelling, creating dialogue? And I 695 00:38:37,840 --> 00:38:42,200 Speaker 2: just want to take a moment before we go towards 696 00:38:42,280 --> 00:38:46,319 Speaker 2: some even more troubling horizons to say that was really nice, man, Yeah, 697 00:38:46,320 --> 00:38:47,040 Speaker 2: that's really cool.
698 00:38:47,040 --> 00:38:49,799 Speaker 1: I tried to not do all so one of the 699 00:38:49,800 --> 00:38:52,000 Speaker 1: things I did researching this is I like played 700 00:38:52,000 --> 00:38:53,680 Speaker 1: some of these videos for a friend of mine who 701 00:38:53,719 --> 00:38:57,560 Speaker 1: is a young mother and you know, and who is 702 00:38:57,600 --> 00:39:01,080 Speaker 1: not she's not as online nearly as online as I am, 703 00:39:01,160 --> 00:39:04,359 Speaker 1: and she's not like, she's not someone who is as 704 00:39:04,440 --> 00:39:06,400 Speaker 1: much of a pessimist about all of this stuff as 705 00:39:06,400 --> 00:39:08,640 Speaker 1: I am. And she was like, oh, you know, like 706 00:39:08,719 --> 00:39:10,480 Speaker 1: this is creepy. The fact that these people are like 707 00:39:10,560 --> 00:39:13,360 Speaker 1: just shotgunning novels out onto Kindle to trick people. But like 708 00:39:13,719 --> 00:39:15,400 Speaker 1: I could see it being cool. I've always you know, 709 00:39:15,440 --> 00:39:17,560 Speaker 1: maybe I could make a story and the AI could 710 00:39:17,560 --> 00:39:20,160 Speaker 1: help me because I'm not a writer, and I could 711 00:39:20,200 --> 00:39:21,960 Speaker 1: like you know, use it to generate the bones and 712 00:39:22,000 --> 00:39:23,320 Speaker 1: I could fill it out and I could make like 713 00:39:23,360 --> 00:39:25,440 Speaker 1: a custom little story for my kid, and that might 714 00:39:25,480 --> 00:39:27,680 Speaker 1: be nice. I'm like, yeah, sure, I don't think that's harmful. 715 00:39:27,880 --> 00:39:30,640 Speaker 1: Like that's as long as you're not just printing what 716 00:39:30,719 --> 00:39:33,399 Speaker 1: the AI gives you. If if this helps you make 717 00:39:33,480 --> 00:39:35,680 Speaker 1: like a neat, little bespoke storybook for your kid and 718 00:39:35,719 --> 00:39:37,080 Speaker 1: that makes you feel good, that's fine.
719 00:39:38,000 --> 00:39:38,200 Speaker 3: You know. 720 00:39:38,280 --> 00:39:40,960 Speaker 1: I'm not saying like the problem is that all of 721 00:39:41,000 --> 00:39:43,760 Speaker 1: this stuff is immediately being taken by the worst common 722 00:39:43,800 --> 00:39:47,279 Speaker 1: denominators in our society. Right, The same people who are 723 00:39:47,320 --> 00:39:49,840 Speaker 1: trying to like get you to spend your life savings 724 00:39:49,840 --> 00:39:54,120 Speaker 1: on monkey drawings a year ago are now cramming, like shotgunning, 725 00:39:54,280 --> 00:39:57,520 Speaker 1: hundreds and hundreds of books a month onto Amazon that 726 00:39:57,600 --> 00:40:01,319 Speaker 1: are are going to like do damage we probably don't 727 00:40:01,320 --> 00:40:04,279 Speaker 1: fully understand to any kids who read enough of them. 728 00:40:05,160 --> 00:40:07,680 Speaker 1: That's the problem. Not that there aren't cool uses for 729 00:40:07,719 --> 00:40:11,319 Speaker 1: this stuff here, but Lucas is not doing anything cool here. 730 00:40:11,360 --> 00:40:15,239 Speaker 1: This book, this weird Jesus book that he generates using 731 00:40:15,320 --> 00:40:19,239 Speaker 1: ChatGPT is as nonsensical and devoid of actual plot as 732 00:40:19,239 --> 00:40:20,759 Speaker 1: all of the others we've seen. And I'm going to 733 00:40:20,840 --> 00:40:23,759 Speaker 1: read you the text that the chatbot cooks up for 734 00:40:23,800 --> 00:40:26,840 Speaker 1: this story. Oh buddy, Once upon a time, in a 735 00:40:26,880 --> 00:40:30,080 Speaker 1: magical forest with fields full of trees, there lived a 736 00:40:30,080 --> 00:40:32,680 Speaker 1: little puppy named Fluff. Fluff was a curious puppy, and 737 00:40:32,719 --> 00:40:34,960 Speaker 1: he loved to explore the forest and learn new things.
738 00:40:35,239 --> 00:40:37,600 Speaker 1: One day, Fluff heard about a man named Jesus who 739 00:40:37,640 --> 00:40:40,920 Speaker 1: was very special and had done many wonderful things. Fluff 740 00:40:41,000 --> 00:40:43,520 Speaker 1: was very interested and wanted to learn more about Jesus. 741 00:40:43,640 --> 00:40:50,520 Speaker 1: But there was one problem. Fluff couldn't read. Oh god, 742 00:40:50,880 --> 00:40:56,480 Speaker 1: that's so weird. First off, if you're a fucking dedicated 743 00:40:56,560 --> 00:41:00,120 Speaker 1: Christian enough that, like your AI scam book is trying 744 00:41:00,160 --> 00:41:03,120 Speaker 1: to get kids hooked on Jesus, can you describe him 745 00:41:03,120 --> 00:41:06,880 Speaker 1: better than he was special and did wonderful things? Right, Like, 746 00:41:06,920 --> 00:41:08,520 Speaker 1: at least if you're gonna be like be like, yeah, 747 00:41:08,520 --> 00:41:10,759 Speaker 1: he saved people's souls. He made it possible for us 748 00:41:10,800 --> 00:41:14,279 Speaker 1: to go like, you can say more about Jesus than 749 00:41:14,320 --> 00:41:17,040 Speaker 1: this than he well, he's like he did nice stuff. 750 00:41:17,120 --> 00:41:21,320 Speaker 2: Yeah, checking the boxes right, Like, yeah, there's this guy Jesus. 751 00:41:21,760 --> 00:41:26,600 Speaker 2: He's got a good vibe. But yeah, dogs can't read those, 752 00:41:26,640 --> 00:41:27,760 Speaker 2: so that's like a problem. 753 00:41:28,040 --> 00:41:28,840 Speaker 1: Get some fucked. 754 00:41:30,719 --> 00:41:33,720 Speaker 2: So wait, this is also dangerous, is it not? 
Because 755 00:41:33,880 --> 00:41:37,520 Speaker 2: it is where we're verging from the land of of 756 00:41:37,800 --> 00:41:44,000 Speaker 2: mad lib check the boxes, absence of motivation right uh, 757 00:41:44,040 --> 00:41:53,280 Speaker 2: into into the land of proselytizing or propagandizing without without 758 00:41:53,320 --> 00:41:57,000 Speaker 2: an understanding of it. Like, it's that's very weird, that's 759 00:41:57,080 --> 00:41:58,040 Speaker 2: very off putting. 760 00:41:58,440 --> 00:42:01,319 Speaker 1: Yeah, the scale of propaganda that's possible when you don't 761 00:42:01,320 --> 00:42:03,560 Speaker 1: even have to have anyone to actually write or illustrate it. 762 00:42:03,600 --> 00:42:08,880 Speaker 1: But also how propaganda that's that like disjointed and incoherent, 763 00:42:09,440 --> 00:42:11,840 Speaker 1: Like does it all just become noise and get lost? 764 00:42:11,880 --> 00:42:13,520 Speaker 1: And then the primary problem is just that it like 765 00:42:13,560 --> 00:42:15,760 Speaker 1: floods the zone and makes it hard to find stuff 766 00:42:15,760 --> 00:42:17,719 Speaker 1: that isn't this kind of trash or is it just 767 00:42:18,120 --> 00:42:21,200 Speaker 1: or does it like does being exposed to so much 768 00:42:21,200 --> 00:42:24,759 Speaker 1: of this disjointed, weird, robotic hallucination shit alter the way 769 00:42:24,760 --> 00:42:26,480 Speaker 1: that we think we don't really know yet. You know, 770 00:42:26,760 --> 00:42:28,600 Speaker 1: we didn't know what social media was going to do 771 00:42:28,680 --> 00:42:31,040 Speaker 1: to us, and now all of us we have all 772 00:42:31,320 --> 00:42:33,440 Speaker 1: the attention spans of a fucking fruit fly. 
773 00:42:33,760 --> 00:42:39,279 Speaker 4: So yeah, I want to play there's a there's a 774 00:42:39,320 --> 00:42:42,920 Speaker 4: point in this fucking off putting book where or off 775 00:42:42,920 --> 00:42:46,880 Speaker 4: putting video where he like plays the an a he 776 00:42:46,920 --> 00:42:50,439 Speaker 4: has an ai narrator narrate this book, because again, why 777 00:42:50,520 --> 00:42:54,040 Speaker 4: involve human creativity in any way, shape or form alongside 778 00:42:54,080 --> 00:42:58,040 Speaker 4: these terrible images that it's generated with like video of 779 00:42:58,080 --> 00:43:00,719 Speaker 4: his kids reacting to it, so you can see how 780 00:43:00,760 --> 00:43:02,480 Speaker 4: his children react to the story. 781 00:43:03,200 --> 00:43:05,279 Speaker 1: So we're going to play a section of that and 782 00:43:05,640 --> 00:43:08,040 Speaker 1: keep an eye on their faces. 783 00:43:08,520 --> 00:43:12,239 Speaker 5: The unicorn stopped in her tracks. She couldn't move any 784 00:43:12,280 --> 00:43:19,239 Speaker 5: clotheser to Fluff and the old man. The old man 785 00:43:19,360 --> 00:43:20,080 Speaker 5: smiled and. 786 00:43:20,000 --> 00:43:24,600 Speaker 6: Said, see, Cloff, Jesus not only gives us eternal life 787 00:43:24,640 --> 00:43:28,200 Speaker 6: for free, but he protects us too. We just have 788 00:43:28,280 --> 00:43:30,319 Speaker 6: to trust him and ask for help. 789 00:43:31,360 --> 00:43:34,600 Speaker 5: Fluff was amazed. He thanked the old man for teaching 790 00:43:34,640 --> 00:43:37,960 Speaker 5: him about Jesus. From that day on self was a 791 00:43:37,960 --> 00:43:40,760 Speaker 5: happy puppy, and he always remembered the lesson the old 792 00:43:40,760 --> 00:43:41,719 Speaker 5: man had taught him. 793 00:43:42,360 --> 00:43:43,120 Speaker 2: He knew that no. 
794 00:43:43,120 --> 00:43:46,280 Speaker 5: Matter what, Jesus gives eternal life to those who believe 795 00:43:46,880 --> 00:43:48,439 Speaker 5: and helps them every day. 796 00:43:49,560 --> 00:43:51,240 Speaker 2: Evil is evil. 797 00:43:51,280 --> 00:43:51,480 Speaker 3: Man. 798 00:43:59,320 --> 00:44:03,279 Speaker 1: Oh my, it's so hard because if you watch the 799 00:44:03,400 --> 00:44:06,799 Speaker 1: video of the kids at the start of it, like 800 00:44:07,520 --> 00:44:09,320 Speaker 1: they're they're they're both kind of they're all kind of 801 00:44:09,360 --> 00:44:10,759 Speaker 1: like sitting up, and as it goes on, like one 802 00:44:10,800 --> 00:44:13,359 Speaker 1: girl puts her head into her hands, the other has 803 00:44:13,400 --> 00:44:16,000 Speaker 1: like her hands on her head, like they're not they 804 00:44:16,400 --> 00:44:20,600 Speaker 1: look uncomfortable, like they're not enjoying this story that they're 805 00:44:20,600 --> 00:44:27,480 Speaker 1: being fed. Ah, it's so fucked up. Also dog shit story. 806 00:44:28,960 --> 00:44:31,879 Speaker 1: But like, yeah, the fact that in his video where 807 00:44:31,880 --> 00:44:36,400 Speaker 1: he's trying to like show how well robots can generate 808 00:44:36,719 --> 00:44:40,960 Speaker 1: Christian propaganda like his own kids could not be less 809 00:44:40,960 --> 00:44:43,799 Speaker 1: engaged in this shit, which maybe is a good thing, right, 810 00:44:43,840 --> 00:44:46,160 Speaker 1: Maybe the fact that they're so inherently bored by it 811 00:44:46,200 --> 00:44:49,560 Speaker 1: means it won't do as much damage as I'm afraid. 812 00:44:49,880 --> 00:44:52,600 Speaker 1: But you know what will do as much damage as 813 00:44:52,719 --> 00:44:53,400 Speaker 1: I'm afraid of? 814 00:44:54,360 --> 00:44:54,760 Speaker 2: Jazz? 815 00:44:55,320 --> 00:44:59,359 Speaker 1: Well, yeah, jazz, fundamentally a mind poison.
You know, that's 816 00:44:59,400 --> 00:45:02,719 Speaker 1: why the teens these days, the teens, they're 817 00:45:02,760 --> 00:45:06,359 Speaker 1: always pulling up in their jalopies to the to the 818 00:45:06,520 --> 00:45:10,879 Speaker 1: get the malted milkshakes and the civil rights movement. God 819 00:45:10,960 --> 00:45:13,719 Speaker 1: damn it. Anyway, here's ads. 820 00:45:15,960 --> 00:45:20,360 Speaker 2: I like where we're at, tagging jazz unreasonably, unrelated. 821 00:45:20,400 --> 00:45:22,839 Speaker 1: Way, take it down. Let's take it down. Jazz has 822 00:45:22,880 --> 00:45:36,239 Speaker 1: had enough time in the sun. All right, here's ads Ah, 823 00:45:36,320 --> 00:45:42,680 Speaker 1: we are B A double Q, which is how I 824 00:45:42,719 --> 00:45:51,600 Speaker 1: spell back. So yeah, this is like fucked up, and 825 00:45:51,640 --> 00:45:54,000 Speaker 1: I don't think his kids liked it, But it is 826 00:45:54,040 --> 00:45:56,239 Speaker 1: the kind of thing that I worry about, or at 827 00:45:56,280 --> 00:45:58,040 Speaker 1: least there's like evidence of the kind of thing that 828 00:45:58,080 --> 00:46:00,680 Speaker 1: I worry about, Like where, where's some of the directions 829 00:46:00,760 --> 00:46:04,440 Speaker 1: this might head?
Because churches and also political organizations, 830 00:46:04,480 --> 00:46:07,319 Speaker 1: groups like Turning Point USA, who are already looking at 831 00:46:07,360 --> 00:46:10,040 Speaker 1: young people and have access to grind and a lot 832 00:46:10,040 --> 00:46:13,439 Speaker 1: of money, like the potential of them to generate huge 833 00:46:13,440 --> 00:46:16,120 Speaker 1: amounts of propaganda content, to buy up copies, to get 834 00:46:16,160 --> 00:46:18,600 Speaker 1: Amazon to spread it, and in order to do that, 835 00:46:18,680 --> 00:46:21,000 Speaker 1: in order to like trick large numbers of parents and 836 00:46:21,080 --> 00:46:24,880 Speaker 1: kids into buying books that contain weird right wing propaganda. 837 00:46:25,000 --> 00:46:29,279 Speaker 1: Like they're already doing versions of this con all throughout 838 00:46:29,640 --> 00:46:34,600 Speaker 1: the world and throughout media throughout like our culture. I 839 00:46:34,960 --> 00:46:37,640 Speaker 1: am deeply concerned about the ability to like spread it 840 00:46:37,680 --> 00:46:41,000 Speaker 1: in more subtle ways through like these fake children's books. 841 00:46:40,760 --> 00:46:45,600 Speaker 2: And shit right, Oh, like a story about dinosaurs who 842 00:46:45,760 --> 00:46:50,080 Speaker 2: learned that there are bathrooms only for some specific types 843 00:46:50,120 --> 00:46:50,759 Speaker 2: of dinosaur. 844 00:46:51,120 --> 00:46:54,279 Speaker 1: And that's why, you know, when they stop doing that, 845 00:46:54,360 --> 00:46:56,880 Speaker 1: when they let all of the dinosaurs use whatever bathroom 846 00:46:57,080 --> 00:46:59,920 Speaker 1: they feel they should use, that's when the meteor hits 847 00:47:00,280 --> 00:47:04,799 Speaker 1: because God decided to kill them all. Yeah, I don't know, 848 00:47:05,040 --> 00:47:08,480 Speaker 1: I think it could be fucked up.
And you know, again, 849 00:47:08,600 --> 00:47:11,759 Speaker 1: there's some actual rigorous reason to think that this could 850 00:47:11,800 --> 00:47:15,560 Speaker 1: cause some serious damage. In the book Literature as Exploration, 851 00:47:15,760 --> 00:47:18,520 Speaker 1: which is a very influential book on literary theory by 852 00:47:18,520 --> 00:47:23,440 Speaker 1: Professor Louise Rosenblatt. Rosenblatt argues that the reader is a 853 00:47:23,480 --> 00:47:25,440 Speaker 1: crucial piece as I've been talking about, is a crucial 854 00:47:25,600 --> 00:47:28,600 Speaker 1: part of any piece of literature. She writes, there is 855 00:47:28,680 --> 00:47:31,160 Speaker 1: no such thing as a generic reader or a generic 856 00:47:31,239 --> 00:47:34,640 Speaker 1: literary work. There are only the potential millions of individual 857 00:47:34,680 --> 00:47:38,319 Speaker 1: readers or the potential millions of individual literary works. A 858 00:47:38,400 --> 00:47:41,279 Speaker 1: novel or a poem or a play remains merely ink 859 00:47:41,320 --> 00:47:44,400 Speaker 1: spots on paper until a reader transforms them into a 860 00:47:44,440 --> 00:47:48,880 Speaker 1: set of meaningful symbols and what Yeah, it's beautiful. And 861 00:47:48,920 --> 00:47:50,960 Speaker 1: what she means by this is that books from Blood 862 00:47:51,000 --> 00:47:54,080 Speaker 1: Meridian to Hop on Pop are a dialogue between writer 863 00:47:54,160 --> 00:47:58,360 Speaker 1: and reader. The machines that are generating these stories cannot 864 00:47:58,360 --> 00:48:02,319 Speaker 1: participate in a conversation. They are mechanical turks. They are 865 00:48:02,360 --> 00:48:05,840 Speaker 1: not conversing.
They are guessing what word comes next, based 866 00:48:05,840 --> 00:48:08,160 Speaker 1: on a mix of complex math and the labor of 867 00:48:08,200 --> 00:48:11,120 Speaker 1: Kenyan contractors paid two dollars an hour to make sure 868 00:48:11,160 --> 00:48:14,359 Speaker 1: that the responses aren't too racist. This is a problem 869 00:48:14,520 --> 00:48:16,800 Speaker 1: in part because one of the things we know about 870 00:48:16,840 --> 00:48:21,120 Speaker 1: how books impact people is that reading real books, reading 871 00:48:21,160 --> 00:48:25,520 Speaker 1: novels teaches empathy. It is common knowledge and well documented 872 00:48:25,560 --> 00:48:28,560 Speaker 1: that reading long-form fiction makes people better able to 873 00:48:28,640 --> 00:48:32,520 Speaker 1: identify with other people's thoughts and struggles. Being a reader 874 00:48:32,600 --> 00:48:36,040 Speaker 1: makes you more empathetic. Right. This is well established and 875 00:48:36,080 --> 00:48:40,520 Speaker 1: well documented. Educational researchers have found that very young children 876 00:48:40,719 --> 00:48:44,520 Speaker 1: can actually be influenced towards engaging in new behavior by 877 00:48:44,600 --> 00:48:46,759 Speaker 1: the stories that they read. And I'm going to quote 878 00:48:46,760 --> 00:48:49,239 Speaker 1: from an article by Peggy Albers in The Atlantic. Here, 879 00:48:50,520 --> 00:48:53,319 Speaker 1: stories can be used to change children's perspectives about their 880 00:48:53,400 --> 00:48:56,280 Speaker 1: views on people in different parts of the world. For example, 881 00:48:56,360 --> 00:48:59,160 Speaker 1: Hilary Janks works with children and teachers on how images 882 00:48:59,200 --> 00:49:02,520 Speaker 1: and stories on refugees can influence the way that refugees 883 00:49:02,520 --> 00:49:06,440 Speaker 1: are perceived.
Kathy Short studied children's engagement with literature around 884 00:49:06,480 --> 00:49:09,000 Speaker 1: human rights and their work in a diverse K through 885 00:49:09,000 --> 00:49:12,000 Speaker 1: five school with two hundred children. They found stories moved 886 00:49:12,040 --> 00:49:14,719 Speaker 1: even such young children to consider how they could bring 887 00:49:14,880 --> 00:49:18,239 Speaker 1: change in their own local community in school. Now, in 888 00:49:18,239 --> 00:49:21,719 Speaker 1: that last case with Kathy Short, the students that she 889 00:49:21,880 --> 00:49:25,840 Speaker 1: was like engaging with these stories. She told them basically 890 00:49:25,880 --> 00:49:27,840 Speaker 1: like read them. You know, they have these collections 891 00:49:27,840 --> 00:49:31,120 Speaker 1: of books about like kids who did amazing things. Kathy 892 00:49:31,200 --> 00:49:33,240 Speaker 1: reads a bunch of these, like K through five students 893 00:49:33,280 --> 00:49:36,040 Speaker 1: the story of an anti child labor activist, a real 894 00:49:36,120 --> 00:49:39,520 Speaker 1: kid named Iqbal Masih who was murdered at age twelve 895 00:49:39,640 --> 00:49:42,360 Speaker 1: as a result of attempting to like end child labor 896 00:49:42,400 --> 00:49:46,279 Speaker 1: in, I think it was, Pakistan.
And the kids that 897 00:49:46,360 --> 00:49:48,920 Speaker 1: she's reading these to, who are very young children, are so moved 898 00:49:48,960 --> 00:49:51,200 Speaker 1: by the story of this person and by what they've 899 00:49:51,200 --> 00:49:55,000 Speaker 1: read that they decided to create a community garden and 900 00:49:55,040 --> 00:49:58,280 Speaker 1: like built a community garden together and then grew food 901 00:49:58,320 --> 00:50:01,480 Speaker 1: that they donated to a local food bank. Like this was 902 00:50:01,680 --> 00:50:04,520 Speaker 1: months and months and months of work, like basically, and 903 00:50:04,760 --> 00:50:06,600 Speaker 1: kind of the point that Kathy was making with this 904 00:50:06,719 --> 00:50:10,040 Speaker 1: research is that like something as simple as reading a 905 00:50:10,080 --> 00:50:16,720 Speaker 1: single story can inspire and influence young children to take action, 906 00:50:16,920 --> 00:50:20,720 Speaker 1: months of action, to like seriously engage themselves in things. 907 00:50:21,120 --> 00:50:25,120 Speaker 1: Because like that's how influential stories can be on behavior, 908 00:50:25,160 --> 00:50:27,879 Speaker 1: and particularly the behavior of children, because you know they've 909 00:50:27,880 --> 00:50:30,400 Speaker 1: been fed less shit, you know they're able to It 910 00:50:30,520 --> 00:50:32,759 Speaker 1: means more when a kid encounters a story than it 911 00:50:32,760 --> 00:50:34,600 Speaker 1: does when you do, because you've got a lot more 912 00:50:34,640 --> 00:50:37,200 Speaker 1: stories in your head.
And that's part of what is 913 00:50:37,280 --> 00:50:40,879 Speaker 1: unsettling to me about all this, because like, you know, 914 00:50:41,280 --> 00:50:44,360 Speaker 1: is it possible that an AI generated story about Iqbal 915 00:50:44,440 --> 00:50:48,160 Speaker 1: Masih could inspire, you know, little kids in a school 916 00:50:48,200 --> 00:50:53,320 Speaker 1: to take positive action like that. Maybe is it possible 917 00:50:53,480 --> 00:50:56,680 Speaker 1: that an AI generated story could inspire kids to take 918 00:50:56,760 --> 00:51:01,080 Speaker 1: negative actions? Maybe we don't know. But the thing that's 919 00:51:01,080 --> 00:51:03,200 Speaker 1: most frightening to me is we're all going to learn 920 00:51:03,239 --> 00:51:06,319 Speaker 1: the answer to these questions together in the very 921 00:51:06,680 --> 00:51:08,680 Speaker 1: near future, whether we want to or not. 922 00:51:10,480 --> 00:51:12,680 Speaker 2: That is chilling. 923 00:51:13,040 --> 00:51:15,920 Speaker 1: Yeah, it's cool, it's good stuff. Love that we're doing this. 924 00:51:17,120 --> 00:51:20,960 Speaker 2: I would say, I would say that for many of 925 00:51:21,040 --> 00:51:27,040 Speaker 2: us playing along at home, this concept might sound somewhat abstract. 926 00:51:27,480 --> 00:51:33,200 Speaker 2: This might sound somewhat hypothetical, or a thought experiment or 927 00:51:33,239 --> 00:51:35,680 Speaker 2: something or one of the fourteen hundred and sixty 928 00:51:35,680 --> 00:51:39,560 Speaker 2: two only possible stories in Plotto. But the but the 929 00:51:39,600 --> 00:51:44,400 Speaker 2: reality that you have outlined here, Robert is stark. You 930 00:51:44,440 --> 00:51:48,880 Speaker 2: know it is. It is inevitable that this will occur. 931 00:51:48,960 --> 00:51:51,759 Speaker 2: It's happening now, right, You're coming to us in real time. 932 00:51:52,320 --> 00:51:57,200 Speaker 2: You've you've cited multiple things here.
Yeah, yeah, it could, 933 00:51:57,520 --> 00:52:00,560 Speaker 2: it could happen. I don't want to do fanboys. But 934 00:52:00,680 --> 00:52:06,600 Speaker 2: the point is, I think the point is sobering. And 935 00:52:06,760 --> 00:52:09,480 Speaker 2: one question that a lot of folks are gonna have 936 00:52:09,560 --> 00:52:17,440 Speaker 2: here is what, if anything, will will people do with 937 00:52:17,600 --> 00:52:19,880 Speaker 2: this knowledge? Like for people who have listened, who are 938 00:52:19,920 --> 00:52:23,279 Speaker 2: listening now, who have kids or who have loved ones, 939 00:52:23,320 --> 00:52:28,640 Speaker 2: have any sort of ability to curate access to information, 940 00:52:29,400 --> 00:52:31,000 Speaker 2: is there something they can do? 941 00:52:33,600 --> 00:52:36,760 Speaker 1: Yeah, I mean I think number one. If you're a parent, 942 00:52:36,960 --> 00:52:39,200 Speaker 1: if you're someone who buys the you know, who buys 943 00:52:39,239 --> 00:52:41,239 Speaker 1: gifts for kids, you know, because you've got some in 944 00:52:41,280 --> 00:52:44,520 Speaker 1: the family or whatever, be aware of this. Be aware 945 00:52:44,520 --> 00:52:47,680 Speaker 1: of what's out there. Be aware that, Like you can't 946 00:52:47,760 --> 00:52:49,880 Speaker 1: just look at Oh, this kid likes coloring books. Let 947 00:52:49,920 --> 00:52:53,520 Speaker 1: me see what the most popular coloring book on dinosaurs is, right, 948 00:52:54,200 --> 00:52:56,879 Speaker 1: Take take a second look, take more of a look 949 00:52:56,920 --> 00:52:59,160 Speaker 1: at the reviews. You know. See if you can look 950 00:52:59,200 --> 00:53:02,600 Speaker 1: through a couple of different you know, pages from it 951 00:53:02,640 --> 00:53:04,200 Speaker 1: on the Amazon things, see if it has any of 952 00:53:04,239 --> 00:53:05,120 Speaker 1: these hallmarks. 953 00:53:05,480 --> 00:53:05,640 Speaker 3: You know.
954 00:53:05,680 --> 00:53:10,680 Speaker 1: Again, if you go to the Shatterzone dot substack dot com, 955 00:53:10,840 --> 00:53:14,120 Speaker 1: you'll find, you know, the actual article version of this episode. 956 00:53:14,440 --> 00:53:17,680 Speaker 1: Look at the images we've put up here and take 957 00:53:17,719 --> 00:53:19,200 Speaker 1: a look to see if you can see any of 958 00:53:19,239 --> 00:53:21,200 Speaker 1: these hallmarks. Take a look at the text. It should 959 00:53:21,239 --> 00:53:24,120 Speaker 1: be pretty obvious to you, an adult, if the text 960 00:53:24,160 --> 00:53:28,480 Speaker 1: is AI generated. That's kind of the first most basic 961 00:53:28,520 --> 00:53:31,720 Speaker 1: thing you can do is like be aware of what's 962 00:53:31,760 --> 00:53:35,000 Speaker 1: possible and try to keep an eye on it and 963 00:53:35,040 --> 00:53:37,680 Speaker 1: make sure that you don't contribute to paying these people 964 00:53:38,080 --> 00:53:39,960 Speaker 1: or to getting more of this stuff out to kids. 965 00:53:40,400 --> 00:53:42,480 Speaker 1: I think the other things that can and should be done. 966 00:53:42,560 --> 00:53:45,480 Speaker 1: Number one, we could be pressuring Amazon to make it 967 00:53:45,520 --> 00:53:47,759 Speaker 1: harder to do this stuff. Make it clear what their 968 00:53:47,800 --> 00:53:51,040 Speaker 1: plagiarism detectors are, make it clear what their lines are 969 00:53:51,200 --> 00:53:55,040 Speaker 1: for AI, you know, work being crapped out to children, Like, 970 00:53:55,080 --> 00:53:57,400 Speaker 1: do they have any restrictions there? Are you just allowed 971 00:53:57,440 --> 00:53:59,719 Speaker 1: to put up as many of these random books as 972 00:53:59,719 --> 00:54:02,120 Speaker 1: you want without any kind of limitation. Well, so far 973 00:54:02,200 --> 00:54:05,120 Speaker 1: that's the situation. Can Amazon be pressured to take a 974 00:54:05,120 --> 00:54:08,680 Speaker 1: different tack?
Well, if there actually was enough bad PR perhaps. 975 00:54:09,280 --> 00:54:12,200 Speaker 2: Oh I just want to pause right there, because that 976 00:54:12,800 --> 00:54:15,799 Speaker 2: sounded like a bar Like that sounded like you were 977 00:54:15,840 --> 00:54:20,000 Speaker 2: about to drop a beat with the internal rhyme scheme 978 00:54:20,080 --> 00:54:23,120 Speaker 2: in the cadence. Everybody play that back, play that. 979 00:54:23,080 --> 00:54:25,520 Speaker 1: Para, cut it to a beat, someone someone can fix 980 00:54:25,560 --> 00:54:28,919 Speaker 1: that up for us. But yeah, you know, like, look, 981 00:54:29,160 --> 00:54:32,359 Speaker 1: that's what I would like people to like there needs 982 00:54:32,360 --> 00:54:35,440 Speaker 1: to be pressure on Amazon about this stuff, among other things, 983 00:54:35,440 --> 00:54:37,720 Speaker 1: Like again, I've reached out to several of these creators 984 00:54:37,719 --> 00:54:40,080 Speaker 1: for comment, and I reached out to Amazon 985 00:54:40,120 --> 00:54:42,240 Speaker 1: for comment on a number of things. They haven't gotten 986 00:54:42,280 --> 00:54:45,520 Speaker 1: back to me. I know a lot of journalists listen 987 00:54:45,560 --> 00:54:47,880 Speaker 1: to this stuff. There's room for other articles on this. 988 00:54:47,960 --> 00:54:51,319 Speaker 1: It'll get good traffic. Everybody reads AI shit. Get out 989 00:54:51,360 --> 00:54:55,840 Speaker 1: there yourself and make them answer some of these things. 990 00:54:56,960 --> 00:54:59,120 Speaker 1: You know, one of the there's a couple of different 991 00:54:59,200 --> 00:55:02,479 Speaker 1: questions that I asked Amazon, and I'll read them right now. 992 00:55:02,840 --> 00:55:05,880 Speaker 1: Number one, does Amazon restrict the publication of AI works 993 00:55:05,880 --> 00:55:08,400 Speaker 1: on Kindle in any way?
Does it make a difference 994 00:55:08,400 --> 00:55:10,920 Speaker 1: if the works are marketed towards children? Number two? Does 995 00:55:10,960 --> 00:55:14,360 Speaker 1: Amazon keep data on how many AI generated books 996 00:55:14,360 --> 00:55:17,320 Speaker 1: of various types are selling? Number three? In the guides 997 00:55:17,360 --> 00:55:20,359 Speaker 1: that I have watched, creators discuss how the texts they 998 00:55:20,400 --> 00:55:24,600 Speaker 1: generate set off Amazon's plagiarism detectors. To get around this, creators 999 00:55:24,600 --> 00:55:27,680 Speaker 1: use a service called Quillbot, which replaces adjectives with synonyms. 1000 00:55:27,920 --> 00:55:31,720 Speaker 1: Is this a violation of Kindle slash KDP terms of service? 1001 00:55:32,080 --> 00:55:34,360 Speaker 1: You know, these are pretty basic questions, there's more to 1002 00:55:34,400 --> 00:55:37,160 Speaker 1: be asked, but like the sheer factor of like being 1003 00:55:37,160 --> 00:55:39,560 Speaker 1: reached out to and talked to, Like, if there's enough 1004 00:55:39,640 --> 00:55:43,640 Speaker 1: bad press, it's theoretical that they might make it harder 1005 00:55:43,680 --> 00:55:47,359 Speaker 1: for these people to do what they're doing. Likewise, you know, 1006 00:55:47,600 --> 00:55:54,160 Speaker 1: I think that these different sort of grindset creators could 1007 00:55:54,200 --> 00:55:58,040 Speaker 1: be you know, for one thing, it's possible that there's 1008 00:55:58,080 --> 00:56:00,880 Speaker 1: some violation of YouTube's terms here, that they are like admitting 1009 00:56:00,920 --> 00:56:04,440 Speaker 1: to engaging in plagiarism and then finding ways around it.
Like, 1010 00:56:05,000 --> 00:56:07,239 Speaker 1: I think there's an argument to be made at least 1011 00:56:07,239 --> 00:56:09,719 Speaker 1: that like there might be actually rights issues here, Like 1012 00:56:09,760 --> 00:56:12,480 Speaker 1: if if the initial text of this is plagiarism and 1013 00:56:12,520 --> 00:56:15,359 Speaker 1: they are disguising that plagiarism and then profiting off of it, 1014 00:56:15,800 --> 00:56:18,400 Speaker 1: there's a degree to which they could be in a legally 1015 00:56:18,480 --> 00:56:21,200 Speaker 1: dicey area. It's possible that you could, uh, you could 1016 00:56:21,239 --> 00:56:23,520 Speaker 1: get YouTube to take action against some of this content. 1017 00:56:23,560 --> 00:56:26,480 Speaker 1: I don't know, it's it's it's largely a matter of, 1018 00:56:26,520 --> 00:56:29,480 Speaker 1: like, if you can get enough people angry about this on 1019 00:56:29,600 --> 00:56:33,240 Speaker 1: behalf of the kids, then these companies will eventually 1020 00:56:33,280 --> 00:56:35,200 Speaker 1: take action, not because it's the right thing to do, 1021 00:56:35,280 --> 00:56:38,719 Speaker 1: but because if enough people get angry, corporations tend to 1022 00:56:38,719 --> 00:56:40,960 Speaker 1: take the coward's way out, which is to do what the 1023 00:56:41,040 --> 00:56:45,319 Speaker 1: angry people want. Yeah, so I don't know. That's that's 1024 00:56:45,400 --> 00:56:49,480 Speaker 1: my only suggestion right now. I'll keep thinking about it. Yeah. 1025 00:56:49,719 --> 00:56:53,720 Speaker 2: I think that's great though. This is uh, this is actionable advice. 1026 00:56:54,680 --> 00:56:59,279 Speaker 2: So yeah, we'll see. Yeah, yeah, well we'll see. It'll 1027 00:56:59,280 --> 00:57:03,719 Speaker 2: be it'll be interesting to listen to this episode in 1028 00:57:04,040 --> 00:57:05,720 Speaker 2: what do you think five years?
1029 00:57:06,200 --> 00:57:10,240 Speaker 1: Yeah, yeah, when all of our jobs have been replaced 1030 00:57:10,239 --> 00:57:13,800 Speaker 1: by AI except for Sophie's. The entire world of media 1031 00:57:14,000 --> 00:57:17,480 Speaker 1: is just like Sophie sitting down to talk with Harrison 1032 00:57:17,600 --> 00:57:21,560 Speaker 1: Ford about his new baldness cream, and Sophie sitting down 1033 00:57:21,600 --> 00:57:25,960 Speaker 1: with Joe Rogan to talk about, you know, steroids, Sophie 1034 00:57:26,040 --> 00:57:29,880 Speaker 1: sitting down with AI me to read a story about 1035 00:57:29,960 --> 00:57:32,560 Speaker 1: Hitler that uses facts that were just made up in 1036 00:57:32,600 --> 00:57:33,000 Speaker 1: the ether. 1037 00:57:33,400 --> 00:57:39,040 Speaker 3: There there's no way, robot you could atonal shriek 1038 00:57:39,120 --> 00:57:40,720 Speaker 3: the way that real you can. 1039 00:57:41,200 --> 00:57:44,440 Speaker 1: Thank you, Sophie. You understand. The only thing that I'm 1040 00:57:44,480 --> 00:57:48,200 Speaker 1: actually proud of is my atonal shrieking. I went to 1041 00:57:48,240 --> 00:57:51,080 Speaker 1: school for four years to learn how to shriek like that, 1042 00:57:51,200 --> 00:57:51,400 Speaker 1: you know. 1043 00:57:51,560 --> 00:57:55,600 Speaker 3: Yeah, yeah, that's Doctor Atonal Shrieking. 1044 00:57:55,720 --> 00:57:58,880 Speaker 1: That's right, that's right. Yeah, yeah, I teach a class 1045 00:57:58,920 --> 00:58:00,920 Speaker 1: at Stanford how to go. 1046 00:58:02,720 --> 00:58:05,400 Speaker 2: Is it? Is it in Stanford? Because I heard it 1047 00:58:05,440 --> 00:58:09,200 Speaker 2: was on a cross street, like, yeah, a cross street. 1048 00:58:09,360 --> 00:58:12,600 Speaker 2: I have a I have a bullhorn. I also have 1049 00:58:12,640 --> 00:58:15,240 Speaker 2: a crude spear that I whittled, you know, but it's 1050 00:58:15,320 --> 00:58:18,680 Speaker 2: it's basically Stanford.
Hey, that's a human made spear though, 1051 00:58:18,720 --> 00:58:21,880 Speaker 2: credit where it's due. Oh yeah, absolutely, I only use 1052 00:58:22,000 --> 00:58:25,080 Speaker 2: human made spears. And also sometimes, one time, a spear 1053 00:58:25,080 --> 00:58:29,000 Speaker 2: that was was crafted by a chimpanzee. But you know, 1054 00:58:30,120 --> 00:58:32,439 Speaker 2: same same, that's between you and the chimp, man. 1055 00:58:32,880 --> 00:58:34,800 Speaker 1: A lot of things are between me and that chimp. 1056 00:58:36,600 --> 00:58:38,920 Speaker 3: Just to say, but do we do we have any 1057 00:58:38,920 --> 00:58:40,200 Speaker 3: pluggables at the end here, Ben? 1058 00:58:40,640 --> 00:58:40,880 Speaker 1: Yeah? 1059 00:58:42,440 --> 00:58:46,960 Speaker 2: Oh yeah, I will. I will say you can check 1060 00:58:47,040 --> 00:58:50,560 Speaker 2: out more by going to Cool Zone Media. I want 1061 00:58:50,560 --> 00:58:54,120 Speaker 2: to give a big shout out to Gare, who has 1062 00:58:54,200 --> 00:58:58,880 Speaker 2: done Yes, Garrison, Garrison, who has done some top notch 1063 00:58:58,920 --> 00:59:03,080 Speaker 2: reporting, in my opinion, on the current events in Atlanta. 1064 00:59:03,160 --> 00:59:06,880 Speaker 2: You may be familiar with the Stop Cop City movement. 1065 00:59:07,760 --> 00:59:11,520 Speaker 3: It's available on It Could Happen Here and is truly incredible. 1066 00:59:12,040 --> 00:59:16,080 Speaker 1: Yeah, great stuff. Anything else, Ben? 1067 00:59:16,520 --> 00:59:19,720 Speaker 2: Oh yeah, you can find me. Yeah, you can find 1068 00:59:19,720 --> 00:59:25,240 Speaker 2: me on on Twitter or Instagram, wherever. Yeah, you 1069 00:59:25,240 --> 00:59:32,360 Speaker 2: can find me everywhere from Friendster to Farmers Only, you know, no, yeah, 1070 00:59:31,680 --> 00:59:36,960 Speaker 2: so big, you know, I'm an overall influencer. Oh my 1071 00:59:37,040 --> 00:59:40,240 Speaker 2: gosh, levels to these jokes.
Take that, you know, try 1072 00:59:40,240 --> 00:59:43,200 Speaker 2: that, ChatGPT. But uh, but in a burst of 1073 00:59:43,200 --> 00:59:48,400 Speaker 2: creativity, calling myself some derivation of at Ben Bowlin, because yeah, 1074 00:59:48,520 --> 00:59:49,959 Speaker 2: super secret code. 1075 00:59:51,920 --> 00:59:55,040 Speaker 3: Nice. And Robert, what's that link for your Substack one 1076 00:59:55,040 --> 00:59:55,480 Speaker 3: more time? 1077 00:59:55,920 --> 00:59:58,640 Speaker 1: Uh, shatterzone dot substack dot com. 1078 00:59:58,680 --> 01:00:02,360 Speaker 2: It is not regularly updated, but you know, it's 1079 01:00:02,160 --> 01:00:05,640 Speaker 2: not super rarely updated either. 1080 01:00:05,680 --> 01:00:08,400 Speaker 1: It's free, and you can find this article on, uh, 1081 01:00:08,760 --> 01:00:11,919 Speaker 1: on fucking How AI Is Coming for Your Children. That'll 1082 01:00:11,920 --> 01:00:14,280 Speaker 1: probably be the title, How AI Is Coming for Your Children. 1083 01:00:14,400 --> 01:00:16,920 Speaker 1: Just go to the go to the Substack. You'll find it. 1084 01:00:17,280 --> 01:00:20,640 Speaker 1: You'll find all these fucked up dinosaur images. But I 1085 01:00:20,680 --> 01:00:23,480 Speaker 1: guarantee you, you haven't seen enough fucked up looking dinosaurs 1086 01:00:23,480 --> 01:00:28,800 Speaker 1: in your life, simply have not. Check check it out. Okay, 1087 01:00:28,840 --> 01:00:29,880 Speaker 1: bye bye. 1088 01:00:32,960 --> 01:00:35,680 Speaker 3: Behind the Bastards is a production of Cool Zone Media. 1089 01:00:36,040 --> 01:00:39,320 Speaker 3: For more from Cool Zone Media, visit our website Coolzonemedia 1090 01:00:39,480 --> 01:00:42,720 Speaker 3: dot com or check us out on the iHeartRadio app, 1091 01:00:42,760 --> 01:00:45,160 Speaker 3: Apple Podcasts, or wherever you get your podcasts.