Welcome to Stuff to Blow Your Mind from HowStuffWorks dot com.

Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb and I'm Joe McCormick, and today we're gonna be talking about one of my favorite comedy subjects: what's funny about the way that machines fail? And here's just a heads up: this is gonna be a two-part episode, because we ended up going super long. Like the machines, we can't stop.

So I want to start with a particular example, probably my favorite funny thing on the Internet these days: the hilarious almost-successes of artificial intelligence trying to generate examples of human language, almost but not quite human. Yes, I don't know why this does it for me, but aside from Highlander II: The Quickening, pretty much nothing makes me laugh harder than language generated by artificial neural networks and machine learning. And we'll explain a little bit more about exactly how this works in a minute, but first I just thought we should look at a few examples of what this is like. If you're on the Internet, you've probably encountered this at some point, because it's become popular in the past few years, especially for its comedic value. If you're not on the internet, boy, are you in for a treat.

By far, I would say, the best of this stuff I've come across is traceable back to the blog of a person named Janelle Shane, who I think works in industrial optics research as her day job, but as a hobby she trains neural networks on text-based data sets to spit out these amazing simulations of types of human language. And they'll be in categories, like she gets an AI program to write recipes for foods or to come up with the names of paint colors or something. And the way you would do this, of course, and we'll explain more of the details in a bit, is you'd train it on existing names of paint colors, or you'd train it on existing recipes for food. Right.
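To give a rough sense of that train-on-examples idea, here is a minimal sketch of our own, not anything from Shane's actual code: her projects use character-level recurrent neural networks, but even a simple character-level Markov chain, swapped in here for brevity, captures the same spirit of learning which characters tend to follow which and then sampling new names one character at a time. The tiny spell list and every parameter below are made up for illustration.

```python
# A toy stand-in for the "train on real names, generate new ones" idea.
# Not a neural network: a character-level Markov chain, which learns
# which character tends to follow each short context, then samples.
import random
from collections import defaultdict

# Hypothetical stand-in for a real training list of D&D spell names.
spells = ["Magic Missile", "Crown of Madness", "Glyph of Warding",
          "Leomund's Tiny Hut", "Evard's Black Tentacles"]

ORDER = 2    # how many preceding characters the model conditions on
END = "\0"   # sentinel marking the end of a name

# "Training": count which character follows each 2-character context.
transitions = defaultdict(list)
for name in spells:
    padded = " " * ORDER + name + END
    for i in range(len(name) + 1):
        transitions[padded[i:i + ORDER]].append(padded[i + ORDER])

def generate(max_len=40):
    """Sample a new name one character at a time from the learned counts."""
    context, out = " " * ORDER, []
    while len(out) < max_len:
        ch = random.choice(transitions[context])
        if ch == END:
            break
        out.append(ch)
        context = context[1:] + ch
    return "".join(out)

print(generate())  # often almost-right, which is exactly the joke
```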
So the end result here is you essentially have a machine trying to human but not quite pulling it off, but doing so in such hilarious fashion.

Yeah. So you should absolutely look up Janelle Shane's blog. It's called AI Weirdness dot com. It's reliably such a source of joy. In fact, I will say that if you're scratching your head and thinking, oh, didn't I see something hilarious in this vein recently, there's a high probability that it originated from AI Weirdness dot com. Yes, that blog is just awesome.

But I wanted to look at a few examples of what this is like. So one thing to consider is the work that Janelle Shane did training a neural network to come up with names for D&D spells. You take the Dungeons and Dragons manual and you feed in all the actual names of spells to the neural network, so it gets a sense of what these things are like, and then it tries to come up with similar types of names on its own. Now, to give everybody just a quick idea first of what actual Dungeons and Dragons spells are named: you have everything from Magic Missile or Crown of Madness to Evard's Black Tentacles. Anytime there's a wizard name in there, you know you're in for some good stuff. Well, there's stuff like Glyph of Warding, or, what's the one I'm trying to think of? Oh, stuff like Leomund's Secret Chest or Leomund's Tiny Hut. That's another great one.

Well, so here are the ones that it came up with. How about selections like Mister of Light, Confusing, Storm of the Giffling, Song of Goom, Song of the Darn, Ward of Snay, To the Pooda Beast, Primal Rear. You've got to watch out for Primal Rear. Someone's Storm Bear. Now, that one sounds legitimate, and that's one of the beauties of these exercises: when there's one that either almost works or actually does work.
Because I think, with something like Storm Bear, I can easily imagine it. I can describe the Storm Bear blasting out of the portal and rushing into combat on behalf of your, you know, your storm mage. Yeah, it's almost kind of effortlessly evocative. A Divine Boom, how about that? That one sounds pretty good too. Soul of the Bill. Now we're flying back down the hill. Farc Mate. How about Charm of the Cods? Death of the Sun. Okay, that would have to be a high-level spell. But okay, three more: Greater Flick. Okay, that's just a cantrip, that's just a flick, a magical flick of the ear. Curse Clam. I like it. Daving Fire's Confusing.

Now, while these phrases I think are mostly funny on their own, I think they're probably even funnier if you're an actual D&D player, because you not only get the pleasure of the nonsense words and the syllables that seem out of place in their context, like Dave does not really seem to go very well with some kind of magical fire spell, but if you actually play D&D you probably also get some humor from just, like, seeing the little resonances that these spells have with actual spells that you would recognize. I now want to run a gaming session where there's some sort of warp effect in place where suddenly all the magic users are forced to use spells from this list, and they don't know exactly what the spells are going to do until they cast them. Even better, what if your character suddenly had amnesia and then had to act as if they had a bio that was also generated by an artificial neural network? That's right, because AI Weirdness dot com also has a wonderful piece titled D&D Character Bios, Now Making Slightly More Sense. In fact, I would say this post, which I think was from last week or something when we were looking at it, is what inspired me to want to do this episode. We should read a couple of these.
I'll read the first one here. Quote: Frick found his old family's fortune and his curiosity, and he went to a small city to see if he could find a work in the goldfish. He heard stories of a goldfish, a goldfish, a sea monster, and a silver fish, a sea monster, and a ship that was a ship of exploration. The ship was full of fish and evil some treasure, but it was not to be. When Frick found the ship, he rushed back and found the ship full of treasure and full of fish. He wanted to be a pirate and fight it.

Now, I like that, because there's a lot of silliness in there, but it does have the basic shape of a bio, and its weirdness, the stuff that is more nonsensical, actually feels suitably magical and fantastic. You know, there's this fish that's not a fish, that's also a ship that's full of treasure and fish, and he's going to become a pirate and fight it, and so I think you could run with that. This is one of the fascinating things about neural-net-generated text: it often has the format of what you're going for correct, it just doesn't have the sense of it correct. Like, it will aesthetically and in shape be very much like what you're looking for, but key words do not make any sense at all.

Then again, another one that I thought was kind of interesting in what it showed about common themes in D&D character bios, or I guess maybe fantasy more generally, was one bio that included the lines: the orc warlock was captured and killed by a group of orcs. He was imprisoned and forced to work for a giant tome. He was imprisoned and imprisoned for a while until he was rescued by a group of adventurers. He was imprisoned and imprisoned for a while until he was rescued by a group of adventurers who were looking for a group of adventurers to help eradicate the orcish ferocity. See, now that's wonderful.
I actually really like that, even when it's a little, you know, nonsensical. But first of all, being an orc is hard. It's a warlike world for the orc. But then there's the idea that he's working for a giant tome, the idea that there's, like, a magical book employing him, and the repetitive imprisonment. Yeah, it's a hard-knock life. You're always getting imprisoned and then escaping and encountering a band of adventurers. Yeah, and working for weird magic books. So that one works, I think.

Of course, another great one on this page is an entry that just seems to devolve into endless, like, dozens of repetitions of variations on the phrase big cat. Yes: little did he know, during the monastery's course of time, when the monastery's training and growth was complete, his mother told big cat, big cat and big cat to drive their own path and test big cat with big cats, army of big cats, army of big cats and big cats and big cats. Big get is big cat, big cat. And it goes on, yes, something like that for many, many lines. I'm trying to picture big cat, though I can't. I don't actually go to, like, tiger or lion there. I think more of a kind of great basilisk type of fat house cat. I think that would work. I also can't help but think of Cheshire cats, and of course Catbus from Totoro, as being sort of in the vein of big cat, big cat, big cat.

How about a neural network trained on a corpus of more than forty-three thousand question-and-answer-style jokes? I'll just read one of them. Why do you call a pastor a cross the road? He take the chicken. Yeah.

Another great one from AI Weirdness dot com was a post where a neural network designs Halloween costumes. Okay, so you just feed it a bunch of Halloween costume names. Yeah, and this one was tremendous fun. I remember when it first came out, it just made me laugh so hard. Highlights include Saxy Pumpkins, Disco Monster, Spartan Gandalf.
I really like that one. Starfleet Shark. Yeah, that's good. Let's see, Martian Devil. That was too believable. Panda Clam, though, that's pretty good. Shark Cow. That's very interesting, given some of our past discussions. Was it a shark cow that we were discussing? I believe it was. Oh, that's Daniel Dennett's thought experiment exposing what he believes to be flaws in Donald Davidson's Swampman thought experiment. It was the cow shark, where he says, is it actually meaningful, if you posit a cow shark, to ask whether it's a cow or a shark? A little accidental thought experiment here from this list. You'll find less thought experimentation in such entries, though, as Snape Scarecrow, or how about Lady Garbage? Lady Garbage is good. So yeah, there are a lot of great entries on that list.

Okay, one last thing from Janelle Shane's work. I just must mention some titles for recipes that she wrote a script to come up with. These include Chocolate Pickle Sauce, Whole Chicken Cookies, Salmon Beef Style Chicken Bottom, Artichoke Gelatin Dogs, and Crock Pot Cold Water. Were any of these recipes attempted? Well, actually, those were just names. At some point, and I don't have these selected, she did have it generate new recipes based on a corpus of recipes, and they are just nightmares, you know, like huge lists of ingredients that would be like, you know, a third cup s'more goals, a cup of horseradish. Then there was one website, I don't remember who actually did it, might have been Super Deluxe: somebody made a video where they took one of these AI-generated recipes and just literally made it. Now, they had some trouble, because some of the ingredients were not real words, so they had to, I guess, substitute something; instead of the s'more goals they put in cocoa powder or something.
But they ended up with these, like, pasta shells, I think, that had, like, chocolate and stuff all over them. But anyway, I think that's currently one of the best websites on the internet. Go to AI Weirdness dot com if you want more joy and humor.

But I was wondering: why is this the funniest thing out there for me right now? What is so inherently funny about the ways machines create things that are sort of like real language, just close enough to be in the zone where they are funny, but at the same time far enough off that they're totally hilarious? And I think that there are at least two parts that make this stuff so golden. One is that there's something inherently funny about machines trying to behave like humans and failing. Specifically, it's the ways that they fail, and we'll definitely explore this more throughout the episode. But the way that they fail demonstrates a kind of pristine, oblivious quality of stupidity. It's like a kind of platonic stupidity, isolated from the ability to appreciate itself. It's like the funniness of, you know, watching automatic doors repeatedly trying to close on something that's blocking them. That's not itself so funny, but you see an inkling of the same thing there that comes through in these AI-generated texts. But then, and we sort of mentioned this earlier, it's also funny because it tends to reveal something interesting about the human culture game that it's trying to play. Like, it brings this sort of cold objectivity to phenomena that we don't necessarily always bring, and it can identify and awkwardly replicate trends and behaviors that we might fail to notice, in the same way that you might notice funny things that are actually present in the names of D&D spells by watching a computer try to replicate them. Yeah, I think these are two strong reasons. Okay, we're going to take a quick break.
We'll be right back with more.

Alright, we're back, now with all things humor. And of course we'll get into this in this episode as we continue to discuss what humor is, and why machines in some cases achieve it. Like, the absurd is funny; absurdity is hilarious. And it seems like we're in a situation where a lot of times, with some of these, especially these neural net situations, we were accidentally creating absurdity engines. We're creating machines that produce absurdity. And, you know, what can you say? Like, why is something that is absurd funny? We'll get into all that in a bit, but sometimes I feel like the answer might be: it's funny because it's funny. Well, yeah, I mean, there's definitely an ineffable quality to humor. That is one of the reasons there are so many different theories of humor, and as I said, we'll explore them more as the episode goes on. But yeah, it's obviously something that's really hard to narrow down and put your finger on. There seem to be all these strange, conflicting, overlapping reasons we find things funny.

But I do feel like this strain of machine humor, machine failure humor, being one of the funniest types of humor, is bigger than just the AI text. Because I started thinking about what are some of the funniest scenes in movies I can think of, and when I tried to think about that, I can't help but think of the dark humor of the hyper-violent boardroom scene in RoboCop, with ED-209. I would argue it's one of the, I mean, it's not for children, this is a hyper-violent, horrible scene, but it's also, in a morbid way, one of the funniest scenes, I think, in film history. So why is it so funny? I think it hinges on parallels between humans and machines in the scene, and the similarities in the ways they fail.
So, a brief recap of the scene: you know, Ronnie Cox plays this self-serious, bloviating businessman who's proudly proclaiming, I've got the technology that's the future of policing. And he brings out this robot called ED-209 that's got these big guns on it, and they're saying that it's going to take over the police force and it's this great new technology. And then they demonstrate it on a guy and it malfunctions. It tells him to drop his weapon in the demonstration, and he does, and it doesn't seem to notice he has, and then it shoots him like a hundred times, just a ridiculous number of times. But it's something about the way that the people in the room fail in the same way the machine does. Like, everyone just plows through this horrible, violent encounter, and then afterwards somebody in the background is like, can we get a paramedic, after this guy has been shot like a hundred times. As if, like the machine, they're just sort of carrying on their rote behaviors without understanding what they're doing or thinking about it.

Yeah. The ED-209 is such a great design in the original RoboCop, because it has animal qualities to it. It looks kind of like a bipedal dinosaur, and yet it's also smooth and abstract in so many ways that it looks like a highly designed piece of technology, which of course it is. You know, it looks like a nice piece of stereo equipment, but then it also lacks any additional details, like it's almost a silhouette of an animal. Yes, and it growls as well. But then also, later, a really funny scene in the movie is the discovery that this horrible, violent killer robot can be defeated by stairs. Like, it can't use stairs. It has these wonderful, all-terrain-looking legs that it walks on, and yet it can't manage stairs.
It falls down them, and then it's like a, you know, ridiculous upside-down duckling. I mean, it's so hilarious too, because ED-209 is highly effective in other situations. If it's just filling a guy with bullets, highly effective. Just battling RoboCop, also highly effective, for the most part. And this falls in line, I think, pretty well with our experiences of machines and of AI. We can create highly effective specialists in many areas of AI and robotics. So you create a machine that just puts bullets into this guy, you know, it does a great job. But when it comes to creating a general AI, or a machine that can navigate the complex natural world, or even just the human-created world, such as the stairs, there's continual challenge there. Like, that's kind of what a lot of what's going on in robotics and AI is about. Yeah, or just recognizing that it's not actually supposed to shoot the board executive during the demonstration. Exactly.

Yeah. But then there's also just the idea that it's wonderful at something and terrible at another thing. That imbalance is often where we find a lot of hilarity in other, you know, other comedic stories and bits of fiction, or just situations. Like one of my favorite cut scenes of all time, from Conan the Barbarian. So there's a scene in it, this is the Arnold Schwarzenegger original, and it's like a blooper outtake. You won't find it on most of the special DVDs, I don't think, but it's available on YouTube. Basically, Conan has just escaped or been released from servitude, and he's running across the, you know, the wasteland, essentially, while dogs are chasing him in order to eat his flesh. He manages, in the finished film, to scramble up some rocks, and that begins this new story arc, him discovering this great old sword.
But this is, like, young, corporeal beef-body giant Arnold Schwarzenegger. This is the Arnold Schwarzenegger that was so ripped he had to, like, lose muscle so that he could actually hold the sword correctly. So there's this outtake, though, where he's running in chains, the dogs are chasing him, the movie dogs, and then he's scrambling up the stones, and the dogs catch him and drag him back down, and he's just screaming and cursing the whole time. Yeah, I watched it after you linked it. Very, very funny. And you see the PAs come in to get the dogs off of him. And he's like, yeah. It's funny because the idea of Conan the Barbarian, or even just prime Arnold, failing like this is in such stark contrast to the actual or perceived strength of the character and/or individual. And it's also funny because he wasn't actually mauled to death by movie dogs, right? Of course, he wasn't actually seriously harmed in this incident, but he was apparently inconvenienced and had his pride wounded. Right. And we'll come back to that idea, the degree to which, you know, the severity of the outcome comes into play in determining if something is funny or not.

Yeah, I think I can definitely see what you're talking about: the failures of technology are especially funny when there are other ways that the technology is highly advanced, or presented as highly advanced, and as long as, of course, nobody dies, you know. But I do wonder, too, if there's a darker streak in all of this, you know, something that ties into a deeply rooted human disdain for the other, especially for other species. There's a quote from C. S. Lewis's The Lion, the Witch, the Wardrobe and the Four Children. That's my son's suggested alternate title for it. He's like, okay, why don't they call it The Lion, the Witch, the Wardrobe and the Four Children? Like, they're in it too, and they're not in the title. Seems like a missed opportunity.
But anyway, there's this wonderful quote from it that I found kind of creepy on a recent reread of the book. Quote: But in general, take my advice: when you meet anything that's going to be human and isn't yet, or used to be human once and isn't now, or ought to be human and isn't, you keep your eyes on it and feel for your hatchet.

Well, that is very creepy. Now, I think in the context of the book, this is referring to, like, you know, possessed objects and stuff, creepy and magical things, basically. Yeah, basically the message here was: talking animals are cool, you know, you can hang out with them in Narnia, but there are other things in Narnia that are dangerous, and you can tell if they're dangerous based on how human they seem, or are trying to be, or used to be. Well, that's one thing in a fantasy context. In a science fiction, or even just a real technology context, that's a different thing entirely, and it starts making you think about, well, you know, fear of advancing technology mimicking human behavior, fears of AI. Yeah, yeah. To what extent do we delight in the falls and errors of inhuman entities because we don't wish to see them succeed? You know, we celebrate the telltale signs of their otherness because we kind of dread the day when there will be no way to tell. And I think there's a strong argument to be made that that day will be here sooner than we think, and in many respects it already is. We've talked about, for instance, robocalls on the show before, and also chatbots, and it becomes frightening when you look at where we are with the technology now. Certainly, if you get a robocall, it's hopefully not going to deceive you long-term.
But I think a lot of us are having that experience where you pick one of these up, and at first you think it is a human you are talking to, and then you realize that it is not.

Well, I think one of the funny things about stuff like chatbots, which also deal in language but can be much, much more convincing than these neural networks that generate, you know, lists of character bios or something, is that the things that are generally more convincing these days are programmed, I think, with more explicit rules. They tend to have more kind of human meddling in exactly what they're going to be doing, and by having less freedom to be creative, and ultimately having less potential, they can actually be more narrowly convincing. I think one of the things that's interesting about the neural network generated text is that it's not anywhere close yet. You know, you can't really use it yet to make things where you go, yeah, that's definitely a real human. I mean, maybe you can in some, again, narrow scenarios. Like, for instance, if you have something that will tell you what your Wu-Tang Clan name will be, you know, that's sort of random generation, and systems like that are a totally different thing; we'll get into the distinction here. But systems like that, just through sheer random matching of words, can be effective. Yeah, but it's an entirely different kettle of fish in terms of what's going on with neural networks and where they seem to be going.

Well, maybe we should take a quick break, and when we come back we can explain just the basics of how these kinds of things actually work. Alright, we're back. So I'm gonna try to do the simple version of how a neural network works, because if you get in the weeds, obviously neural networks become extremely complicated.
I know. I spent a lot of time deep in a bunch of articles trying to understand technical details that I'm not actually gonna end up using here. So the simple version is: think of a neural network as a machine that transforms values. That's it. You know, it has values that come in, like number variables, and then it puts out values at the end. It's kind of like the toaster conveyor belt at Quiznos or one of those sandwich shops. You know, your untoasted sandwich comes in, your toasted sandwich comes out, if everything goes according to plan. Now, if it doesn't go according to plan, maybe it spits out something that's on fire, or something that, you know, who knows what goes on in there, or nothing comes out because the bagels are building up inside of it, right? Exactly. But a neural network's core job is to just accept inputs and produce outputs.

Okay, an example would be image recognition. There are neural networks designed to look at an image, a digital image, and come up with a text string that says, this is what that is. So, look at a picture of a dog and say, that's a dog. And you've probably seen examples of this on the web. So you'd have numerical values going in; it might be, you know, a field of pixels with numerical values for their color and placement. And then it would have an output that's, say, a string of text, which would actually be like a ranking, like the top-ranked string of text that matches with those pixels in that configuration. But so the question is, what's going on inside the machine? How does it turn that input into the correct matching output? And then, of course, how does it fail? Because I've also found this tremendously amusing.
Tumblr recently changed their guidelines about what's acceptable content and what's not, and Stuff to Blow Your Mind has, slash had, a Tumblr account. We recently rebranded it as the Transgenesis Tumblr account, but it had a lot of old Stuff to Blow Your Mind content on there. And suddenly I got a page of all the things that had been flagged for potentially violating the new terms. And it was amusing, because some of it was like, okay, well, that has a classical painting on it; it's got, like, you know, something that might look like nudity, and so it got flagged, makes sense. Or the topic is something that is a little too sketchy for them, and they're like, okay, that's been flagged by the machine. But the most hilarious one was a picture of a baby bat on somebody's palm, and that was flagged as likely inappropriate. And so I was just trying to figure that one out. Like, what is it about a baby bat? I mean, did it think this was genitalia, or, like, what? Because I know it somehow clicked off a number of boxes, and the automated result was: no baby bats on Tumblr. Well, I don't know what the mechanism for identifying that is. It might be something like this. But yeah, I love seeing that kind of failure. And notice that we are laughing now, or we were laughing. It is funny that it looks at that and says, I don't know about this bat, I think this might be porno. Yeah, I mean, the stakes are pretty low. Ultimately, one picture of a baby bat no longer being on a Tumblr page that we don't really use doesn't really affect me personally, but you could see where this could lead to far worse problems given the right scenario.

Okay, so back to the neural network. So you've got this machine. Inside the machine, values are being transformed.
You have inputs and outputs, and on the inside, a neural network consists of layers of sort of stations of value transformation that are referred to as neurons. Each neuron essentially accepts a series of numerical values as input, and then it performs some kind of mathematical transformation of those values based on what's known as the weights of its connections with the sources it received the inputs from. So you've got these interlinking sources of information, inputs and outputs throughout, and each neuron takes inputs, sums them, does some transformation, and produces an output. So for each neuron, you've got a bunch of numbers coming from different sources; each one gets weighted based on where it came from, and then the neuron spits out a new value. And these neurons exist in layers, so there are these waves of inputs, say the pixels in an image, getting passed and transformed through one layer after another of these neurons, until finally the network produces numbers that constitute its final outputs. In this case, this would be something like its top guesses at the word string that describes the image you put in at the beginning.

Now, you'll see from this that the value of a neural network depends completely on how well those connections between neurons are weighted to produce the correct results. If they just have random weights, then the network will just produce random output; it won't be any better than making up numbers at random. So the network has to be calibrated or trained somehow to produce outputs that are correct, and there are multiple ways to do this. It could, of course, be programmed to some degree by hand; you could have a programmer explicitly going in and tinkering with weighting rules to try to get the outcomes to be better.
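To make that layered pass concrete, here is a minimal sketch of our own, with made-up sizes and random weights: each layer takes the incoming values, applies its weights, sums them, squashes the result, and hands it to the next layer.

```python
# A minimal sketch of the forward pass described above: values come in,
# each layer weights and sums them, applies a squashing function, and
# passes the result on. Every number here is made up for illustration.
import numpy as np

rng = np.random.default_rng(0)

def layer(inputs, weights, biases):
    """One layer of neurons: weighted sums of the inputs, then tanh."""
    return np.tanh(weights @ inputs + biases)

x = np.array([0.2, 0.9, 0.1, 0.5])              # a tiny 4-"pixel" image
w1, b1 = rng.normal(size=(3, 4)), np.zeros(3)   # input layer -> 3 hidden neurons
w2, b2 = rng.normal(size=(2, 3)), np.zeros(2)   # hidden layer -> 2 output scores

hidden = layer(x, w1, b1)
scores = w2 @ hidden + b2   # say, scores for the labels "dog" and "cat"
print(scores)  # with untrained random weights, this is meaningless noise
```

And that last comment is the point the hosts make: with random weights the scores mean nothing, which is why the training step discussed next matters.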
But it can also be trained through machine learning, which is a process where inputs are already associated with correct outputs. Like, you've already got a text string associated with the image that you put in, and you say, this is what you should say when it comes out. And each time it runs the process, it checks to see how far off from the correct known output it was, and then tries to change the internal weighting to get closer to the correct answer. And of course, with automatic machine learning, you can do this at scale. You can do it thousands of times; you could potentially do it millions of times, just training over and over.

And you might be able to see a parallel here with one of the ways that we actually learn. You know, we learn in multiple ways. Sometimes we learn by being taught explicit rules to follow: if we're learning in school what an insect is, we might learn that an insect is a small animal with an exoskeleton that has six legs. Or sometimes, on the other hand, we learn to generalize from particulars. We might see pictures of lots of animals and notice that the ones that are called insects all happen to have six legs and exoskeletons, and therefore we derive this category called insect from that survey. And in logic, of course, this process where we come up with general rules from lots of individual examples is known as induction. So machine learning to train neural networks is kind of like allowing computers to learn categories by induction, kind of like we do when we just go out and look at the world and see what we find. But one of the things that really sets humans apart from computers is that humans seem to have this amazing, remarkable ability to generalize from particulars. We can often get the gist of a category from just a tiny handful of examples.
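Here is that check-and-adjust loop in its smallest form, again a sketch of our own with made-up data rather than anything from the episode: a single "neuron" guesses, measures how far off it is from the known correct answers, and nudges its weights to shrink the error, thousands of times.

```python
# A toy version of the training loop described above: guess, compare to
# the known correct output, nudge the weights to reduce the error, repeat.
import numpy as np

# Made-up training pairs; the "correct" rule happens to be y = 2x + 1.
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = 2.0 * xs + 1.0

w, b = 0.0, 0.0   # arbitrary starting weights
lr = 0.01         # how big each nudge is

for step in range(5000):                # "do it thousands of times"
    guess = w * xs + b                  # forward pass
    error = guess - ys                  # how far off were we?
    w -= lr * 2 * (error * xs).mean()   # nudge the weight downhill
    b -= lr * 2 * error.mean()          # (gradient of mean squared error)

print(round(w, 2), round(b, 2))  # ~2.0 and ~1.0: it has induced the rule
```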
588 00:32:28,200 --> 00:32:31,680 Speaker 1: You know, when you're giving somebody examples of something to, like, 589 00:32:31,800 --> 00:32:34,000 Speaker 1: give them the gist of what you're talking about, you 590 00:32:34,040 --> 00:32:37,000 Speaker 1: don't usually need to list a million examples. You can 591 00:32:37,080 --> 00:32:40,680 Speaker 1: list two or three maybe, or sometimes even just one. Yeah. 592 00:32:40,800 --> 00:32:43,280 Speaker 1: This gets into the idea of judging a book by 593 00:32:43,280 --> 00:32:46,840 Speaker 1: its cover, right? You're not supposed to, but we often do, 594 00:32:46,960 --> 00:32:49,160 Speaker 1: and sometimes you can, if you pick up on 595 00:32:49,200 --> 00:32:52,240 Speaker 1: particular things, you know, to use the book example, 596 00:32:52,720 --> 00:32:57,200 Speaker 1: specific things about the design or the era of the cover. Yeah. Yeah, 597 00:32:57,240 --> 00:33:00,520 Speaker 1: And in fact, sometimes you can judge things 598 00:33:00,560 --> 00:33:03,720 Speaker 1: about the contents of a book just by knowing certain 599 00:33:03,760 --> 00:33:06,880 Speaker 1: things about how certain types of books end up with 600 00:33:06,920 --> 00:33:10,960 Speaker 1: certain types of covers. Like, you might think, I tend 601 00:33:11,080 --> 00:33:14,400 Speaker 1: to like books that have hand drawn illustrations on the 602 00:33:14,400 --> 00:33:16,880 Speaker 1: cover more than I like books that have sort of 603 00:33:16,920 --> 00:33:21,320 Speaker 1: like CGI stock image cover photos, which means 604 00:33:21,360 --> 00:33:24,320 Speaker 1: you probably like books from, like, you know, at least 605 00:33:24,360 --> 00:33:28,040 Speaker 1: the nineteen eighties and before. Yeah, because it seems like 606 00:33:28,040 --> 00:33:31,080 Speaker 1: we have far fewer hand drawn, you know, covers 607 00:33:31,120 --> 00:33:34,520 Speaker 1: on books these days. Yeah, why are people putting stock 608 00:33:34,600 --> 00:33:36,800 Speaker 1: photos on the covers of books? I do not get it. 609 00:33:37,120 --> 00:33:39,920 Speaker 1: But you didn't need to read a million books 610 00:33:39,960 --> 00:33:41,960 Speaker 1: to come to that conclusion. You could probably come to 611 00:33:42,000 --> 00:33:45,959 Speaker 1: that conclusion after reading, I don't know, three books. Like, 612 00:33:46,200 --> 00:33:49,480 Speaker 1: really, we get the gist of things really fast. And 613 00:33:50,080 --> 00:33:54,400 Speaker 1: that's in contrast to computers, which really, really don't at all. 614 00:33:54,840 --> 00:33:58,080 Speaker 1: This is one of those strange and amazing things about neuroscience, 615 00:33:58,120 --> 00:34:00,240 Speaker 1: about the human brain. How do we solve such a 616 00:34:00,280 --> 00:34:04,719 Speaker 1: difficult problem as generalizing from particulars with so few examples 617 00:34:04,720 --> 00:34:07,320 Speaker 1: to draw from?
And of course, another example of the 618 00:34:07,320 --> 00:34:10,040 Speaker 1: generalizing power of the human brain is in language, like 619 00:34:10,080 --> 00:34:11,680 Speaker 1: we've been talking about. Like, how is it that 620 00:34:11,960 --> 00:34:15,000 Speaker 1: most of the time kids learn how to speak a 621 00:34:15,080 --> 00:34:19,280 Speaker 1: language without being taught explicit rules of syntax and grammar 622 00:34:19,320 --> 00:34:22,120 Speaker 1: and the definitions and usages of all the common words, 623 00:34:22,480 --> 00:34:26,480 Speaker 1: and without hearing billions and billions of examples of sentences? 624 00:34:26,560 --> 00:34:29,120 Speaker 1: They just pick it up. Well, there are 625 00:34:29,160 --> 00:34:31,600 Speaker 1: some specific answers to that. We've talked about that 626 00:34:31,600 --> 00:34:33,839 Speaker 1: on the show before. Yeah, well, I mean I think 627 00:34:33,880 --> 00:34:35,839 Speaker 1: that there is a good case to be made 628 00:34:35,880 --> 00:34:40,000 Speaker 1: that the human brain is specially geared towards language acquisition 629 00:34:40,040 --> 00:34:44,280 Speaker 1: in childhood. That's sort of like one of our species' superpowers. 630 00:34:44,320 --> 00:34:47,800 Speaker 1: And then those windows close, or they don't completely close, 631 00:34:47,880 --> 00:34:53,320 Speaker 1: but the windows become much smaller later on in life. 632 00:34:53,960 --> 00:34:56,319 Speaker 1: You know, speaking of children, they 633 00:34:56,360 --> 00:34:59,759 Speaker 1: are also frequently a font of weirdness and beauty as 634 00:34:59,840 --> 00:35:03,000 Speaker 1: they too are learning to function in the human world, 635 00:35:03,080 --> 00:35:05,600 Speaker 1: in the adult human world, and then they 636 00:35:05,640 --> 00:35:08,959 Speaker 1: say and do things that sometimes hit a weird zone 637 00:35:09,000 --> 00:35:12,799 Speaker 1: that is either hilarious or sometimes a little frightening, 638 00:35:12,920 --> 00:35:15,600 Speaker 1: or even a little bit elegant. Yes, I know exactly 639 00:35:15,640 --> 00:35:19,480 Speaker 1: what you mean. Like, kids often produce the same funny 640 00:35:19,560 --> 00:35:24,279 Speaker 1: outputs based on induction that these machine learning algorithms do. 641 00:35:24,920 --> 00:35:27,000 Speaker 1: Like, you can see it's funny in the same way, 642 00:35:27,000 --> 00:35:29,120 Speaker 1: that they might say something that's a little bit off 643 00:35:29,160 --> 00:35:32,839 Speaker 1: and kind of absurd, but you can sort of understand 644 00:35:33,000 --> 00:35:36,760 Speaker 1: the rules that got them there, right. Like, one example 645 00:35:36,880 --> 00:35:39,160 Speaker 1: I always refer to is from years and years ago. 646 00:35:39,400 --> 00:35:41,960 Speaker 1: I went to a children's puppet show before I had 647 00:35:42,040 --> 00:35:43,920 Speaker 1: a kid. My wife and I went to check 648 00:35:43,960 --> 00:35:47,760 Speaker 1: it out because it was actors from a local improv group, 649 00:35:48,040 --> 00:35:52,799 Speaker 1: Dad's Garage in Atlanta. They put on the show Uncle Grandpa's 650 00:35:52,800 --> 00:35:54,960 Speaker 1: Hoo-Dilly Storytime,
I think it was, and 651 00:35:55,000 --> 00:35:57,239 Speaker 1: so you had these sort of seasoned, you know, 652 00:35:57,280 --> 00:35:59,960 Speaker 1: improv vets, and they were doing a puppet show for kids, 653 00:36:00,320 --> 00:36:03,799 Speaker 1: and they're taking ideas from the audience, and they said, like, 654 00:36:03,800 --> 00:36:06,439 Speaker 1: who should our main character be? And so they hand 655 00:36:06,440 --> 00:36:08,560 Speaker 1: the mic to some little girl in the front 656 00:36:08,640 --> 00:36:12,040 Speaker 1: row and she says Batman the Girl, which is 657 00:36:12,080 --> 00:36:14,680 Speaker 1: so hilarious, and I don't think an adult would be 658 00:36:14,719 --> 00:36:16,839 Speaker 1: able to come up with that, but you can sort 659 00:36:16,840 --> 00:36:18,839 Speaker 1: of tease it apart and figure out how she got 660 00:36:18,880 --> 00:36:21,359 Speaker 1: there, you know. But it's 661 00:36:21,440 --> 00:36:23,959 Speaker 1: just one of, you know, many examples, I'm sure, 662 00:36:24,040 --> 00:36:26,839 Speaker 1: that anyone with children in their life can 663 00:36:26,880 --> 00:36:28,799 Speaker 1: turn to, where they come up with something that is just 664 00:36:29,360 --> 00:36:33,120 Speaker 1: so goofy or weird or sometimes terrifying. Well, I 665 00:36:33,160 --> 00:36:36,480 Speaker 1: think the real funniness and pleasure in that is that 666 00:36:36,520 --> 00:36:40,239 Speaker 1: it's Batman the Girl and not Batgirl. Right. Yeah, 667 00:36:40,280 --> 00:36:43,319 Speaker 1: like, it's almost there, but by not being there, 668 00:36:43,960 --> 00:36:47,759 Speaker 1: it's also, like, even better. It's 669 00:36:47,760 --> 00:36:51,239 Speaker 1: not Batgirl, it's Batman the Girl. It's just 670 00:36:51,280 --> 00:36:55,400 Speaker 1: so nonsensical and beautiful at the same time. You know, 671 00:36:55,480 --> 00:36:59,840 Speaker 1: another one of the great AI text generation experiments that 672 00:37:00,000 --> 00:37:03,360 Speaker 1: Janelle Shane did on her blog was generating 673 00:37:03,800 --> 00:37:07,399 Speaker 1: you know those Valentine's Day candy hearts? Yes, yes. 674 00:37:07,760 --> 00:37:09,920 Speaker 1: She had a program sample those and then try to 675 00:37:09,920 --> 00:37:12,600 Speaker 1: come up with its own examples, and it ended up saying things, I 676 00:37:12,600 --> 00:37:18,840 Speaker 1: think, like sweat pooh, and hole, and time hug. 677 00:37:19,120 --> 00:37:22,680 Speaker 1: Time hug sounds good. Time hug, like Time Cop. Yeah, 678 00:37:22,719 --> 00:37:25,359 Speaker 1: but it also sounds like something that, like, it might 679 00:37:25,360 --> 00:37:28,479 Speaker 1: be a term that aliens come up with for human love. 680 00:37:28,840 --> 00:37:30,840 Speaker 1: It's like they engage not in a hug, but in 681 00:37:30,880 --> 00:37:33,640 Speaker 1: a time hug. It is as if they are hugging 682 00:37:33,920 --> 00:37:36,480 Speaker 1: for the rest of their lives, you know, or something 683 00:37:36,520 --> 00:37:41,799 Speaker 1: like that. Another one, all hover, and then finally bog love.
684 00:37:42,239 --> 00:37:45,320 Speaker 1: My son, who as of this recording is six, 685 00:37:45,400 --> 00:37:49,120 Speaker 1: almost seven, drew a picture for my wife and 686 00:37:49,160 --> 00:37:53,080 Speaker 1: me for Valentine's Day, a card he did at his school, 687 00:37:53,480 --> 00:37:57,160 Speaker 1: and it depicted dinosaurs, as he is wont to do, 688 00:37:57,239 --> 00:38:00,759 Speaker 1: and there are some herbivores walking about, there are 689 00:38:00,840 --> 00:38:05,600 Speaker 1: some carnivores eating the flesh of fallen dinosaurs, and then, 690 00:38:05,680 --> 00:38:08,200 Speaker 1: as is typical in paleo art, there may be a 691 00:38:08,239 --> 00:38:11,280 Speaker 1: volcano in the background. But then there is a meteor 692 00:38:11,640 --> 00:38:15,880 Speaker 1: coming in hard and fast. Yeah, and he writes 693 00:38:16,160 --> 00:38:20,399 Speaker 1: I love you on the meteor, which 694 00:38:20,600 --> 00:38:23,359 Speaker 1: is absolutely wonderful, because at once it is 695 00:38:23,520 --> 00:38:26,600 Speaker 1: like, this is beautiful, like he totally means all 696 00:38:26,640 --> 00:38:28,680 Speaker 1: of this, and it's like the most beautiful Valentine I've 697 00:38:28,680 --> 00:38:31,319 Speaker 1: ever received, and yet to put it on the 698 00:38:31,400 --> 00:38:35,480 Speaker 1: instrument of the extinction event, it's just 699 00:38:35,560 --> 00:38:41,120 Speaker 1: so weird, and, like, it's accidentally brilliant, you know. Yeah, 700 00:38:41,200 --> 00:38:42,760 Speaker 1: I saw that when you put it on the internet. 701 00:38:42,800 --> 00:38:45,480 Speaker 1: It was the sweetest thing I've ever seen. It was 702 00:38:45,520 --> 00:38:47,680 Speaker 1: so good, and it's like, yeah, this 703 00:38:47,719 --> 00:38:50,160 Speaker 1: is how love works. Love is a destroyer 704 00:38:50,360 --> 00:38:52,719 Speaker 1: as well as a creator. Now, we do want to 705 00:38:52,719 --> 00:38:55,680 Speaker 1: stress, especially with these text based situations, 706 00:38:56,320 --> 00:39:00,319 Speaker 1: we're not talking about mere random mashups of text, 707 00:39:00,400 --> 00:39:03,719 Speaker 1: such as, like, the Wu-Tang Clan name generator, or, 708 00:39:04,760 --> 00:39:07,080 Speaker 1: a more literary example would be the cut-up 709 00:39:07,120 --> 00:39:11,600 Speaker 1: technique popularized by author William S. Burroughs, where you just 710 00:39:11,640 --> 00:39:15,759 Speaker 1: have, like, a random mechanism in play to sort 711 00:39:15,760 --> 00:39:20,480 Speaker 1: of splice words or sentences together to get something 712 00:39:21,320 --> 00:39:25,200 Speaker 1: that may have sort of accidental meaning to it. No, 713 00:39:25,360 --> 00:39:29,600 Speaker 1: the neural net programs are algorithms attempting as 714 00:39:29,680 --> 00:39:33,880 Speaker 1: best they can to approximate the quality of the 715 00:39:33,920 --> 00:39:37,399 Speaker 1: input texts, the text they're trained on. So they're doing 716 00:39:37,400 --> 00:39:40,480 Speaker 1: their best to make something like this. And then, 717 00:39:40,520 --> 00:39:42,440 Speaker 1: of course, I mean, one of the things is you 718 00:39:42,520 --> 00:39:45,480 Speaker 1: might say, well, why don't they just perfectly spit back 719 00:39:45,480 --> 00:39:48,239 Speaker 1: out the text you've trained them on?
In fact, if 720 00:39:48,280 --> 00:39:50,719 Speaker 1: you don't tell them not to, they'll do that. You know, 721 00:39:50,760 --> 00:39:53,319 Speaker 1: they'll just spit back exactly what you fed in. 722 00:39:53,680 --> 00:39:55,960 Speaker 1: You have to sort of, like, change some values and 723 00:39:55,960 --> 00:39:59,239 Speaker 1: tinker with it to prevent what's known as overfitting in 724 00:39:59,280 --> 00:40:02,880 Speaker 1: this world, to sort of force the algorithms to 725 00:40:02,920 --> 00:40:05,640 Speaker 1: be more creative and try to come up with new 726 00:40:05,760 --> 00:40:08,600 Speaker 1: versions of the kind of thing they've seen instead of 727 00:40:08,640 --> 00:40:11,200 Speaker 1: just completely copying what you fed them. Yeah, because we 728 00:40:11,200 --> 00:40:16,640 Speaker 1: want these machines to rule the world, not just Hollywood, right? 729 00:40:16,680 --> 00:40:18,560 Speaker 1: But you know, we also have to distinguish it. 730 00:40:18,600 --> 00:40:22,160 Speaker 1: We're also not talking about fake AI generated text, 731 00:40:22,239 --> 00:40:27,439 Speaker 1: which can certainly be tremendously entertaining as well. Yes, that's 732 00:40:27,480 --> 00:40:32,160 Speaker 1: such a great genre, people pretending to be neural networks 733 00:40:32,239 --> 00:40:37,160 Speaker 1: creating machine learning generated text, which is such an amazing 734 00:40:37,280 --> 00:40:41,920 Speaker 1: reversal of principles: humans have intuitively detected what's funny 735 00:40:42,040 --> 00:40:46,080 Speaker 1: about machine learning generated text and then made fake, human 736 00:40:46,160 --> 00:40:50,000 Speaker 1: designed versions of it to exploit that inherent humor. Right, 737 00:40:50,200 --> 00:40:51,759 Speaker 1: I don't know how you get deeper into 738 00:40:51,800 --> 00:40:54,719 Speaker 1: the irony pit than that. Yeah, there's a wonderful two 739 00:40:54,719 --> 00:40:58,520 Speaker 1: thousand eighteen tweet by comedian Keaton Patty, and this was 740 00:40:58,560 --> 00:41:01,239 Speaker 1: the tweet, quote, I forced a bot to watch over 741 00:41:01,320 --> 00:41:04,160 Speaker 1: a thousand hours of Olive Garden commercials and then asked 742 00:41:04,160 --> 00:41:06,439 Speaker 1: it to write an Olive Garden commercial of its own. 743 00:41:06,760 --> 00:41:09,879 Speaker 1: Here is the first page. And then 744 00:41:10,080 --> 00:41:14,320 Speaker 1: he proceeded to include the images of this text, the 745 00:41:14,480 --> 00:41:18,759 Speaker 1: script rather, for an Olive Garden commercial. And it's 746 00:41:18,760 --> 00:41:22,960 Speaker 1: filled with hilarious nonsense like I shall eat Italian citizens 747 00:41:23,360 --> 00:41:27,359 Speaker 1: and unlimited stick, seeming, you know, to play 748 00:41:27,440 --> 00:41:31,480 Speaker 1: upon the whole catchphrase of, what is it, when you're here, you're home? 749 00:41:31,520 --> 00:41:33,080 Speaker 1: I think it's when you're here, 750 00:41:33,120 --> 00:41:37,200 Speaker 1: you're family. There's one from this fake 751 00:41:37,239 --> 00:41:40,400 Speaker 1: script that says leave without me, I'm home, which I 752 00:41:40,480 --> 00:41:42,960 Speaker 1: just love. I remember laughing so hard at this 753 00:41:43,000 --> 00:41:44,880 Speaker 1: when it came out as well. I love that the 754 00:41:44,880 --> 00:41:49,959 Speaker 1: waitress says lasagna wings with extra Italy. Yeah.
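To circle back to the point above about changing some values so the network doesn't just parrot its training text: in character-level text generators of this general kind, one common knob is the sampling temperature. This sketch is a generic illustration with an invented alphabet and invented model scores, not Shane's actual setup. Low temperature makes the generator keep picking its safest, most memorized choice; high temperature flattens the odds so weirder, less probable choices get through.

import numpy as np

rng = np.random.default_rng(2)

chars = np.array(list("abcdefgh "))
# Pretend these are a trained model's raw scores (logits) for the next character.
logits = rng.normal(size=len(chars))

def sample_char(temperature):
    # Divide the scores by the temperature, then convert to probabilities.
    # Small temperatures sharpen the distribution; large ones flatten it.
    scaled = logits / temperature
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()
    return rng.choice(chars, p=probs)

# Cold sampling: nearly deterministic, close to copying the training data.
print("cold:", "".join(sample_char(0.2) for _ in range(20)))
# Hot sampling: much more varied, and much weirder.
print("hot: ", "".join(sample_char(2.0) for _ in range(20)))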
There's also, 755 00:41:50,040 --> 00:41:53,719 Speaker 1: like, really funny stage directions in that Olive Garden script. Yeah, well, oh yeah, 756 00:41:53,719 --> 00:41:56,600 Speaker 1: you mentioned the bit where it says, like, 757 00:41:56,840 --> 00:42:00,400 Speaker 1: the Gluten Classico. We believe the waitress that 758 00:42:00,480 --> 00:42:02,560 Speaker 1: it is from the kitchen. We have no reason not 759 00:42:02,640 --> 00:42:06,320 Speaker 1: to believe. Now, of course, this is just a comedian 760 00:42:06,440 --> 00:42:09,640 Speaker 1: doing this thing, trying to pretend to be an AI. 761 00:42:09,680 --> 00:42:12,200 Speaker 1: But I was reading an article where they quoted Janelle Shane, 762 00:42:12,280 --> 00:42:14,600 Speaker 1: the author of the AI Weirdness blog, who trains all 763 00:42:14,640 --> 00:42:16,439 Speaker 1: these neural networks to come up with all this funny 764 00:42:16,440 --> 00:42:18,440 Speaker 1: stuff we were talking about at the beginning of the episode. 765 00:42:18,800 --> 00:42:20,680 Speaker 1: And you know, she talks about how there are ways 766 00:42:20,760 --> 00:42:24,759 Speaker 1: to notice when something was probably written by a 767 00:42:24,880 --> 00:42:28,160 Speaker 1: human instead of by an actual AI. Both can be 768 00:42:28,239 --> 00:42:31,400 Speaker 1: absurd in similar funny ways. But one of the problems 769 00:42:31,440 --> 00:42:34,520 Speaker 1: with this script, in passing as real AI 770 00:42:34,560 --> 00:42:38,600 Speaker 1: generated text, is that it's actually too coherent. Like, 771 00:42:38,719 --> 00:42:42,279 Speaker 1: its memory is too long. It remembers characters from many 772 00:42:42,360 --> 00:42:45,720 Speaker 1: lines earlier and still has them appearing and saying things. 773 00:42:46,440 --> 00:42:49,960 Speaker 1: Actual AI generated texts have a much shorter memory. 774 00:42:50,040 --> 00:42:53,360 Speaker 1: They're not consistent in that way. Actually, 775 00:42:53,680 --> 00:42:55,759 Speaker 1: they make even less sense than the fake Olive 776 00:42:55,800 --> 00:43:00,200 Speaker 1: Garden commercial. So she's saying, don't reach for your matches 777 00:43:00,280 --> 00:43:03,480 Speaker 1: on this one, because real neural net generated 778 00:43:03,600 --> 00:43:07,839 Speaker 1: text only mimics forms, it doesn't mimic meaning. And this 779 00:43:07,920 --> 00:43:11,680 Speaker 1: thing, it means too much. It's too clever. Okay, 780 00:43:11,680 --> 00:43:13,919 Speaker 1: so we're gonna go ahead and close out this episode now, 781 00:43:13,960 --> 00:43:16,040 Speaker 1: but again, there's going to be a part two where 782 00:43:16,040 --> 00:43:19,160 Speaker 1: we continue this discussion and really get more into the 783 00:43:19,200 --> 00:43:21,759 Speaker 1: meat of the topic. In the meantime, check out Stuff 784 00:43:21,760 --> 00:43:23,480 Speaker 1: to Blow Your Mind dot com. That's the mothership. That's 785 00:43:23,480 --> 00:43:25,959 Speaker 1: where you'll find all the episodes of the show.
There's 786 00:43:25,960 --> 00:43:27,959 Speaker 1: also a little button at the top you can click 787 00:43:28,000 --> 00:43:30,200 Speaker 1: on to go to our t-shirt store and get t-shirts 788 00:43:30,320 --> 00:43:33,319 Speaker 1: and stickers with a number of cool designs, designs that 789 00:43:33,640 --> 00:43:36,160 Speaker 1: line up with certain show topics, like the Great Basilisk 790 00:43:36,320 --> 00:43:39,960 Speaker 1: or the various squirrel episodes, as well as just, 791 00:43:39,960 --> 00:43:44,200 Speaker 1: you know, show logo material as well. And if 792 00:43:44,200 --> 00:43:46,200 Speaker 1: you want to support the show, the best thing that 793 00:43:46,239 --> 00:43:48,719 Speaker 1: you can do is rate and review Stuff to Blow 794 00:43:48,719 --> 00:43:50,239 Speaker 1: Your Mind wherever you have the power to do so. 795 00:43:50,320 --> 00:43:53,239 Speaker 1: Wherever you get this podcast, give us some stars, give 796 00:43:53,320 --> 00:43:55,719 Speaker 1: us a nice review. It really helps us out in 797 00:43:55,760 --> 00:43:59,680 Speaker 1: our war against the almighty algorithms. Big thanks as always 798 00:43:59,719 --> 00:44:03,360 Speaker 1: to our excellent audio producers Alex Williams and Tari Harrison. 799 00:44:03,640 --> 00:44:05,120 Speaker 1: If you would like to get in touch with us 800 00:44:05,120 --> 00:44:07,520 Speaker 1: directly with feedback about this episode or any other, to 801 00:44:07,560 --> 00:44:10,120 Speaker 1: suggest a topic for the future, or just to say hi, 802 00:44:10,360 --> 00:44:13,399 Speaker 1: you can email us at blow the mind at how 803 00:44:13,480 --> 00:44:27,720 Speaker 1: stuff works dot com. For more on this and thousands of other topics, 804 00:44:27,880 --> 00:44:52,120 Speaker 1: visit how stuff works dot com.