1 00:00:04,240 --> 00:00:07,240 Speaker 1: Welcome to Tech Stuff, a production of iHeartRadio's
2 00:00:07,320 --> 00:00:14,320 Speaker 1: How Stuff Works. Hey there, and welcome to Tech Stuff.
3 00:00:14,360 --> 00:00:17,400 Speaker 1: I'm your host, Jonathan Strickland. I'm an executive producer with
4 00:00:17,440 --> 00:00:21,080 Speaker 1: iHeartRadio and I love all things tech, and today
5 00:00:21,200 --> 00:00:24,520 Speaker 1: we're going to cover an interesting topic. There was a
6 00:00:24,560 --> 00:00:29,120 Speaker 1: suggestion from Shaye Lee. And you might remember Shay in
7 00:00:29,200 --> 00:00:35,600 Speaker 1: her alter ego as al Amona, who sabotaged my episode
8 00:00:35,600 --> 00:00:41,040 Speaker 1: about how matches work, and now she's back. Hi, Shay. Hi,
9 00:00:41,400 --> 00:00:44,839 Speaker 1: I heard Alimona ran off with the quister. Yeah, that's
10 00:00:44,880 --> 00:00:49,080 Speaker 1: the union made in. They're just gonna burn it all
11 00:00:49,120 --> 00:00:53,200 Speaker 1: to the ground, burn it all down. So to give
12 00:00:53,240 --> 00:00:55,960 Speaker 1: some background on what this episode is all about, Shaye
13 00:00:56,000 --> 00:00:57,840 Speaker 1: reached out to me and asked if I had ever
14 00:00:57,880 --> 00:01:02,120 Speaker 1: done an episode about a specific piece of internet folklore
15 00:01:02,280 --> 00:01:05,120 Speaker 1: called aratus, and we're going to get to that. We
16 00:01:05,160 --> 00:01:09,000 Speaker 1: communicate in stories, and in your traditional folklore, these are the
17 00:01:09,040 --> 00:01:12,160 Speaker 1: tales that are passed around. Typically they're not even written
18 00:01:12,160 --> 00:01:15,560 Speaker 1: down; they're oral traditions. It's told around the campfire.
19 00:01:15,959 --> 00:01:19,240 Speaker 1: And now the campfire's turned into creepypasta, yes, which
20 00:01:19,280 --> 00:01:22,959 Speaker 1: we will talk about.
So you know, that also
21 00:01:23,400 --> 00:01:26,959 Speaker 1: developed into things that we call urban legends. An urban legend
22 00:01:27,000 --> 00:01:29,520 Speaker 1: is just another type of folklore. But it was a
23 00:01:29,600 --> 00:01:33,960 Speaker 1: subset of folklore that kind of grew as we
24 00:01:34,000 --> 00:01:37,200 Speaker 1: started to see the Industrial Revolution. We saw people move
25 00:01:37,319 --> 00:01:39,160 Speaker 1: to the cities. You know, it became less of an
26 00:01:39,160 --> 00:01:42,720 Speaker 1: agrarian world, and, uh, a
27 00:01:42,720 --> 00:01:44,720 Speaker 1: lot of regions around the world became more
28 00:01:44,920 --> 00:01:51,320 Speaker 1: urban and less agrarian, and a different set of folklore
29 00:01:51,360 --> 00:01:53,680 Speaker 1: tales started to arise. A lot of them are like
30 00:01:53,840 --> 00:01:59,720 Speaker 1: tales of warning. Yeah, you say, like, don't-trust-strangers tales. Yes,
31 00:02:00,080 --> 00:02:01,880 Speaker 1: a lot of morality tales too. Yeah, a lot of
32 00:02:01,960 --> 00:02:04,360 Speaker 1: them are like, see what happens if you do this
33 00:02:04,400 --> 00:02:07,520 Speaker 1: thing that breaks our social mores? You will suffer like
34 00:02:07,560 --> 00:02:10,000 Speaker 1: these people did. Yeah, see what happens when you take
35 00:02:10,280 --> 00:02:15,040 Speaker 1: candy and/or apples from strange women in the woods. Right, right,
36 00:02:15,080 --> 00:02:17,639 Speaker 1: except now it's like, when you're going trick or treating,
37 00:02:17,639 --> 00:02:19,320 Speaker 1: if you don't check all the candy, you're going to
38 00:02:19,400 --> 00:02:21,720 Speaker 1: end up with razor blades and poison. Right. Yeah, I've
39 00:02:21,760 --> 00:02:24,520 Speaker 1: never found a razor blade in any of my candy.
Yeah,
40 00:02:24,600 --> 00:02:27,919 Speaker 1: no one in my twenty seven years of trick or treating,
41 00:02:28,200 --> 00:02:31,280 Speaker 1: no one ever did until people started worrying about it,
42 00:02:31,320 --> 00:02:35,920 Speaker 1: and then people would, yeah, or, like, it
43 00:02:35,919 --> 00:02:38,359 Speaker 1: would be, oh, I discovered this, and it turned out
44 00:02:38,360 --> 00:02:40,799 Speaker 1: that they had planted their own discovery, that kind of thing,
45 00:02:40,960 --> 00:02:46,720 Speaker 1: kind of like the murders. Yes. Yes. So one
46 00:02:46,720 --> 00:02:49,080 Speaker 1: of the things I wanted to kind of allude
47 00:02:49,120 --> 00:02:52,040 Speaker 1: to, to set the stage for the stuff
48 00:02:52,040 --> 00:02:54,280 Speaker 1: we're gonna talk about today, which is the more internet
49 00:02:54,280 --> 00:02:57,120 Speaker 1: based stuff: I'm just going to give an example of
50 00:02:57,120 --> 00:02:59,440 Speaker 1: one of my favorite urban legends of all time. It's
51 00:02:59,480 --> 00:03:02,200 Speaker 1: one that's been told a billion different times.
52 00:03:02,360 --> 00:03:05,600 Speaker 1: It's been written in lots of different anthologies. Shay, you'll
53 00:03:05,639 --> 00:03:09,200 Speaker 1: appreciate this. When I was in, uh, elementary school and
54 00:03:09,280 --> 00:03:11,960 Speaker 1: middle school, the books I checked out the most in
55 00:03:12,080 --> 00:03:17,320 Speaker 1: my school library were all about urban legends, ghosts, and
56 00:03:17,480 --> 00:03:21,320 Speaker 1: other types of paranormal, supernatural kind of stuff. We have
57 00:03:21,440 --> 00:03:23,280 Speaker 1: that in common, because I did the exact same thing.
58 00:03:24,320 --> 00:03:28,680 Speaker 1: I stopped believing in it. So that's where our paths divulge... no,
59 00:03:29,160 --> 00:03:33,480 Speaker 1: diverged, I should say. So.
The story I want
60 00:03:33,480 --> 00:03:37,520 Speaker 1: to tell is the Vanishing Hitchhiker story. Classic story. So
61 00:03:38,000 --> 00:03:41,480 Speaker 1: the basic version of the story goes: a guy's driving
62 00:03:41,600 --> 00:03:45,560 Speaker 1: down a road, a dark country road at night. The
63 00:03:45,680 --> 00:03:48,200 Speaker 1: rain is coming down really hard, it's hard to see,
64 00:03:48,560 --> 00:03:51,600 Speaker 1: and as he's driving down the road, he sees a
65 00:03:51,680 --> 00:03:54,840 Speaker 1: young woman walking along the side of the road. And so,
66 00:03:55,040 --> 00:03:59,000 Speaker 1: not wanting her to get totally frozen out in the
67 00:03:59,120 --> 00:04:02,320 Speaker 1: pouring rain, he pulls over and offers her a ride,
68 00:04:02,320 --> 00:04:04,119 Speaker 1: and she gets in the back seat of his car,
69 00:04:04,600 --> 00:04:07,720 Speaker 1: and out of a sense of concern for her,
70 00:04:08,040 --> 00:04:12,720 Speaker 1: he gives her his jacket to wrap around herself, and
71 00:04:12,760 --> 00:04:15,680 Speaker 1: then she gives him an address, and so he starts
72 00:04:15,720 --> 00:04:18,719 Speaker 1: to drive, and otherwise she doesn't really communicate with him.
73 00:04:18,839 --> 00:04:20,880 Speaker 1: He gets to the address. When he gets there, he
74 00:04:20,920 --> 00:04:23,240 Speaker 1: looks in the back seat and the girl's gone, and
75 00:04:23,320 --> 00:04:25,880 Speaker 1: he never heard the car door open or anything. He
76 00:04:26,080 --> 00:04:29,120 Speaker 1: doesn't know what happened, and his jacket is gone as well.
77 00:04:29,279 --> 00:04:32,719 Speaker 1: There's just, like, the seat's wet from where
78 00:04:32,760 --> 00:04:35,640 Speaker 1: she had been because of the pouring rain, but otherwise
79 00:04:35,680 --> 00:04:38,480 Speaker 1: there's no indication.
So he goes to the house where
80 00:04:38,480 --> 00:04:40,800 Speaker 1: he had the address and knocks on the door to
81 00:04:40,839 --> 00:04:44,320 Speaker 1: find out what the heck is going on. A woman answers
82 00:04:44,320 --> 00:04:46,800 Speaker 1: the door. Typically it's a woman in this legend.
83 00:04:47,120 --> 00:04:50,919 Speaker 1: He explains what has happened. The woman goes pale and says,
84 00:04:52,480 --> 00:04:55,320 Speaker 1: no one by that description lives here. But there was
85 00:04:55,400 --> 00:04:58,599 Speaker 1: a young woman whose family lived here before us, but
86 00:04:58,760 --> 00:05:02,520 Speaker 1: she died years ago, and she's buried at such and
87 00:05:02,600 --> 00:05:07,080 Speaker 1: such cemetery. So he goes to the cemetery, and when
88 00:05:07,120 --> 00:05:10,240 Speaker 1: he comes up on the grave of the young woman,
89 00:05:10,839 --> 00:05:16,120 Speaker 1: there's his jacket resting on her gravestone. Vanishing Hitchhiker. Oh,
90 00:05:16,279 --> 00:05:21,160 Speaker 1: and it's considerate to leave his jacket behind. Yeah, otherwise
91 00:05:21,160 --> 00:05:23,560 Speaker 1: she's up there with Teen Angel, looking at rings and
92 00:05:23,640 --> 00:05:26,080 Speaker 1: jackets that are just an accumulation of wealth that they've
93 00:05:26,080 --> 00:05:29,040 Speaker 1: had over the years. I remember that movie vaguely. Well,
94 00:05:29,080 --> 00:05:31,640 Speaker 1: I was talking about the song, but there's a whole
95 00:05:31,839 --> 00:05:36,240 Speaker 1: genre of music that's dedicated to teenagers dying. So that's
96 00:05:36,279 --> 00:05:39,120 Speaker 1: an example of an urban legend. It's one that's been told, like,
97 00:05:39,160 --> 00:05:42,440 Speaker 1: there's lots of variations.
Well, the Internet has its own
98 00:05:42,640 --> 00:05:44,880 Speaker 1: versions of this, and we're also going to talk about
99 00:05:44,920 --> 00:05:50,359 Speaker 1: in this episode how the Internet reinforces the communication of
100 00:05:50,400 --> 00:05:54,120 Speaker 1: these stories and reinforces people's belief in them, largely through
101 00:05:54,520 --> 00:05:58,400 Speaker 1: the behavior of those who are sharing the stories and
102 00:05:58,520 --> 00:06:04,040 Speaker 1: also the way the platforms themselves work. So you mentioned
103 00:06:04,080 --> 00:06:09,279 Speaker 1: creepypasta. Let's explain what creepypasta is, and I
104 00:06:09,279 --> 00:06:13,080 Speaker 1: guess first we have to explain what copypasta is,
105 00:06:13,120 --> 00:06:17,479 Speaker 1: because creepypasta is a subset of copypasta. Before
106 00:06:17,680 --> 00:06:19,760 Speaker 1: I learned it from you, I actually didn't know that
107 00:06:19,800 --> 00:06:22,520 Speaker 1: creepypasta was a subset of copypasta. I just
108 00:06:22,560 --> 00:06:24,800 Speaker 1: thought it was its own thing, just the name of
109 00:06:24,839 --> 00:06:29,000 Speaker 1: a forum, because no one ever talks about copypasta anymore.
110 00:06:29,200 --> 00:06:33,680 Speaker 1: But it's sort of a portmanteau of
111 00:06:33,920 --> 00:06:36,920 Speaker 1: copy-paste, right, and it's the fact that a lot
112 00:06:36,920 --> 00:06:38,760 Speaker 1: of people, to share stories, what they would do is
113 00:06:38,800 --> 00:06:42,040 Speaker 1: they would find something on the Internet that was interesting,
114 00:06:42,160 --> 00:06:45,800 Speaker 1: maybe in a message board or whatever, copy it, and
115 00:06:45,839 --> 00:06:49,560 Speaker 1: then paste it into some other message board or whatever, and
116 00:06:49,600 --> 00:06:53,880 Speaker 1: that's... yeah.
Usually there was no attribution. Like, I'd
117 00:06:53,920 --> 00:06:56,640 Speaker 1: have been expelled for that if I tried that. Well, they weren't
118 00:06:56,640 --> 00:06:59,240 Speaker 1: necessarily trying to pass it off as
119 00:06:59,279 --> 00:07:02,799 Speaker 1: their own work, but rather, like, I saw this crazy
120 00:07:02,880 --> 00:07:05,920 Speaker 1: story or this funny joke or whatever it might be.
121 00:07:06,960 --> 00:07:09,200 Speaker 1: But sometimes when you're doing that copying and pasting, well,
122 00:07:09,240 --> 00:07:11,640 Speaker 1: one, you might lose the attribution, so you don't know
123 00:07:11,680 --> 00:07:15,960 Speaker 1: who originated the thing, and two, sometimes you lose the context,
124 00:07:16,120 --> 00:07:20,720 Speaker 1: like, is this a joke? Is it a piece of fiction?
125 00:07:21,200 --> 00:07:23,280 Speaker 1: Is it supposed to be an account of something that
126 00:07:23,320 --> 00:07:29,760 Speaker 1: actually happened? Creepypasta is a subset where it's specifically
127 00:07:29,840 --> 00:07:34,640 Speaker 1: about spooky, creepy, typically paranormal kind of stuff, but not
128 00:07:34,720 --> 00:07:40,520 Speaker 1: necessarily paranormal, but often paranormal, or, like, the unnamed serial
129 00:07:40,600 --> 00:07:44,520 Speaker 1: killers of... Yeah, like the hook that's hanging on the
130 00:07:44,560 --> 00:07:47,400 Speaker 1: car door, that kind of thing, just the killer. Do you
131 00:07:47,440 --> 00:07:49,680 Speaker 1: know who Jeff the Killer is? I do not know
132 00:07:49,680 --> 00:07:52,000 Speaker 1: who Jeff the Killer is. Jeff the Killer is weird
133 00:07:52,040 --> 00:07:54,480 Speaker 1: because it's in his name. Yeah, you would think. Well,
134 00:07:54,600 --> 00:08:00,760 Speaker 1: Jeff the Killer is a baker, obviously. Listen, sorry, listen,
135 00:08:00,800 --> 00:08:06,160 Speaker 1: my dad was Jeff the Killer, Jeff Baker.
And, um,
136 00:08:06,280 --> 00:08:09,000 Speaker 1: kind of a boogeyman-type story, very, like, how he
137 00:08:09,040 --> 00:08:11,880 Speaker 1: comes into your room if you don't share this, or
138 00:08:11,920 --> 00:08:14,440 Speaker 1: if you share it a number of times, or yada yada.
139 00:08:15,120 --> 00:08:20,160 Speaker 1: And, um, for me, the story isn't what's important.
140 00:08:20,200 --> 00:08:21,960 Speaker 1: I'm sure you've seen pictures of him when you're just
141 00:08:22,000 --> 00:08:24,000 Speaker 1: scrolling through Facebook. Have you seen, like, the picture? It's
142 00:08:24,040 --> 00:08:27,640 Speaker 1: like a big white creepy face that's been so
143 00:08:28,360 --> 00:08:30,960 Speaker 1: edited that there's like no nose and just a big
144 00:08:30,960 --> 00:08:33,520 Speaker 1: creepy smile and huge eyes, and it usually has like
145 00:08:33,559 --> 00:08:36,160 Speaker 1: a caption that says "go to sleep" under it, or...
146 00:08:36,520 --> 00:08:39,800 Speaker 1: I probably have. Yeah, I've seen a lot on the internet.
147 00:08:40,080 --> 00:08:42,880 Speaker 1: I don't remember most of it. But, um, so, Jeff
148 00:08:42,960 --> 00:08:45,560 Speaker 1: the Killer. There's many incarnations of Jeff the Killer. There's,
149 00:08:45,600 --> 00:08:48,080 Speaker 1: you know, the creepy guy that comes into your bedroom
150 00:08:48,120 --> 00:08:51,640 Speaker 1: and kills you and all that. Oh yeah, you know,
151 00:08:51,800 --> 00:08:57,400 Speaker 1: he's so inconvenient and inconsiderate. But, um, because of
152 00:08:57,400 --> 00:09:00,800 Speaker 1: Creepypasta and just all the threads about Jeff the Killer.
153 00:09:00,840 --> 00:09:04,360 Speaker 1: There were Jeff the Killer fangirls that would draw
154 00:09:04,480 --> 00:09:09,520 Speaker 1: little, like, romantic, you know, heart images
155 00:09:09,600 --> 00:09:12,880 Speaker 1: of him, as... Yeah, as if he's, like,
156 00:09:14,240 --> 00:09:18,400 Speaker 1: Brad Pitt. This is messed up.
157 00:09:17,520 --> 00:09:21,160 Speaker 1: So we wanted to talk a little bit about a
158 00:09:21,200 --> 00:09:23,960 Speaker 1: couple of examples of creepypasta. This is just the beginning
159 00:09:24,080 --> 00:09:26,439 Speaker 1: of this episode, too. Each section, we're going to
160 00:09:26,559 --> 00:09:29,880 Speaker 1: talk about a different kind of subset of Internet folklore.
161 00:09:30,559 --> 00:09:34,720 Speaker 1: But creepypasta is one that is pretty easy to reference,
162 00:09:35,440 --> 00:09:40,080 Speaker 1: and, uh, one example of it, although it didn't necessarily
163 00:09:40,200 --> 00:09:43,960 Speaker 1: start as creepypasta itself, because creepypasta tends to
164 00:09:43,960 --> 00:09:46,760 Speaker 1: be in the form of some sort of story. Um,
165 00:09:46,800 --> 00:09:49,319 Speaker 1: a lot of early types of creepypasta fell into one
166 00:09:49,320 --> 00:09:52,760 Speaker 1: of two categories. It was a story about something that
167 00:09:53,040 --> 00:09:56,360 Speaker 1: supposedly happened to someone, and it could be
168 00:09:56,400 --> 00:09:58,640 Speaker 1: presented as fiction.
You didn't have to present it as
169 00:09:58,679 --> 00:10:01,600 Speaker 1: if it were real, but it's given in that
170 00:10:01,640 --> 00:10:05,679 Speaker 1: sort of tale. Or it was, you know, lost episodes
171 00:10:06,080 --> 00:10:09,640 Speaker 1: of certain types of shows or movies or things that,
172 00:10:09,640 --> 00:10:13,840 Speaker 1: uh, no longer are shown, but they do exist,
173 00:10:14,280 --> 00:10:17,360 Speaker 1: and there's something creepy about what happened. Like, it's that
174 00:10:17,440 --> 00:10:21,760 Speaker 1: one really dark episode of Gilligan's Island. There are versions
175 00:10:21,840 --> 00:10:24,400 Speaker 1: of it. Now, that particular style of
176 00:10:24,440 --> 00:10:28,079 Speaker 1: creepypasta fell out of favor fairly quickly. Like, it was
177 00:10:28,160 --> 00:10:30,960 Speaker 1: kind of like a creepy version of fan fiction. But
178 00:10:31,400 --> 00:10:36,120 Speaker 1: one of the most prominent examples is Slender Man, which
179 00:10:36,160 --> 00:10:38,599 Speaker 1: didn't start off, as I said, as creepypasta. It
180 00:10:38,679 --> 00:10:42,280 Speaker 1: started off as kind of an exercise. Uh, there was
181 00:10:42,840 --> 00:10:48,200 Speaker 1: the Something Awful forums, a forum board on the Internet, so
182 00:10:48,200 --> 00:10:51,560 Speaker 1: people would get together and have conversations about different topics,
183 00:10:51,800 --> 00:10:54,679 Speaker 1: and this was, particularly, in a thread that was titled
184 00:10:54,840 --> 00:10:59,360 Speaker 1: "Create Paranormal Images."
So it's just people trying to use
185 00:10:59,400 --> 00:11:04,160 Speaker 1: Photoshop to create images that were unsettling or had some
186 00:11:04,200 --> 00:11:09,600 Speaker 1: sort of alien or paranormal aspect to them. Like the
187 00:11:09,640 --> 00:11:11,920 Speaker 1: idea, like, make something that, if you were to look
188 00:11:11,960 --> 00:11:14,200 Speaker 1: at, people would go, like, ooh, what is that?
189 00:11:14,200 --> 00:11:19,160 Speaker 1: That's creepy. Yeah. So this guy named Eric Knudson created
190 00:11:20,040 --> 00:11:24,640 Speaker 1: a picture of an elongated, like, an unnaturally tall,
191 00:11:24,679 --> 00:11:30,679 Speaker 1: unnaturally thin figure with an odd, like, obscured face, and
192 00:11:31,240 --> 00:11:33,960 Speaker 1: it became known as the Slender Man. He also included
193 00:11:34,000 --> 00:11:37,480 Speaker 1: a little bit of flavor text to kind of give
194 00:11:37,559 --> 00:11:43,559 Speaker 1: some very shifty background on this character, attaching him
195 00:11:43,600 --> 00:11:48,160 Speaker 1: to a fictional fire. And then the Internet took that
196 00:11:49,679 --> 00:11:53,360 Speaker 1: and... boy howdy. Yeah. So there were all these different people
197 00:11:53,400 --> 00:11:59,400 Speaker 1: who contributed to fleshing out the mythology behind Slender Man.
198 00:12:00,200 --> 00:12:03,800 Speaker 1: So the legend of Slender Man grew online, right, and it's
199 00:12:03,840 --> 00:12:08,319 Speaker 1: on all these different communities, and people within that thread
200 00:12:08,800 --> 00:12:11,240 Speaker 1: had already predicted that some of the images that were
201 00:12:11,280 --> 00:12:14,679 Speaker 1: being made were likely going to end up on websites
202 00:12:14,760 --> 00:12:18,760 Speaker 1: about the paranormal, but they would be submitted as if
203 00:12:18,800 --> 00:12:24,800 Speaker 1: they were genuine examples of evidence, as opposed to something somebody
204 00:12:24,840 --> 00:12:29,319 Speaker 1: specifically crafted. And in fact, that did happen, including
205 00:12:29,720 --> 00:12:33,000 Speaker 1: with Slender Man. And so that's where we started seeing
206 00:12:34,080 --> 00:12:39,720 Speaker 1: more discussions about it, including people who were sure that
207 00:12:39,800 --> 00:12:43,600 Speaker 1: they had heard of this character before. And in fact,
208 00:12:43,679 --> 00:12:45,720 Speaker 1: that was one of the points you wanted to make, right? Yeah,
209 00:12:46,160 --> 00:12:48,199 Speaker 1: I worked with a girl a couple of years ago,
210 00:12:48,280 --> 00:12:51,520 Speaker 1: about five years ago. She swore up and down that
211 00:12:51,600 --> 00:12:57,360 Speaker 1: she and a bunch of internet friends contributed to, uh,
212 00:12:57,559 --> 00:13:01,080 Speaker 1: the creation of Slender Man. Um, I don't... she didn't
213 00:13:01,120 --> 00:13:04,360 Speaker 1: mention anything about the Photoshop.
But the way, if I
214 00:13:04,400 --> 00:13:07,280 Speaker 1: can remember it correctly, the way she presented it to
215 00:13:07,360 --> 00:13:11,600 Speaker 1: me was that she and a couple of friends were
216 00:13:11,640 --> 00:13:15,440 Speaker 1: just chatting over the Internet, and somebody on the thread
217 00:13:15,520 --> 00:13:18,559 Speaker 1: or the forum said, hey, let's make a legend,
218 00:13:18,720 --> 00:13:21,360 Speaker 1: or, let's make an urban legend.
219 00:13:22,160 --> 00:13:27,520 Speaker 1: And they probably already had their, you know, base material,
220 00:13:27,880 --> 00:13:32,880 Speaker 1: with the pictures and the other flavor text and stuff.
221 00:13:33,679 --> 00:13:35,800 Speaker 1: But she swears up and down that she contributed to
222 00:13:35,920 --> 00:13:40,360 Speaker 1: the creation of Slender Man. Um, Red Letter Media believes that
223 00:13:40,400 --> 00:13:43,800 Speaker 1: they contributed to the creation of Slender Man. There's a video
224 00:13:44,280 --> 00:13:48,040 Speaker 1: called, very appropriately, "Did Red Letter Media Invent Slender Man?"
225 00:13:48,320 --> 00:13:51,000 Speaker 1: Um, three guesses about what that video is, and it
226 00:13:51,040 --> 00:13:55,240 Speaker 1: was mostly done tongue in cheek. Yeah, but their
227 00:13:55,520 --> 00:13:59,720 Speaker 1: criteria is that they made a short film that had
228 00:13:59,800 --> 00:14:03,160 Speaker 1: a character that was bald and wore a suit. Yeah, and
229 00:14:04,000 --> 00:14:06,280 Speaker 1: he was ominous, so that makes him Slender Man. But
230 00:14:06,360 --> 00:14:11,160 Speaker 1: by that criteria, then, like, the Gentlemen
231 00:14:11,200 --> 00:14:14,960 Speaker 1: from Buffy are Slender Man, from the episode "Hush," and the
232 00:14:15,040 --> 00:14:21,080 Speaker 1: Silence from... Yeah, he's Slender Man, and, you know, ominous bald
233 00:14:21,120 --> 00:14:24,520 Speaker 1: guys, you're Slender Man?
Dude, are you... are you Slender Man? I'm
234 00:14:24,560 --> 00:14:27,680 Speaker 1: only ominous when I'm walking to work. When
235 00:14:27,680 --> 00:14:33,240 Speaker 1: I'm walking away from work, I'm whistling and I'm jaunty. Yeah. No,
236 00:14:33,360 --> 00:14:35,960 Speaker 1: that's a good point. And, like, when we were sort
237 00:14:36,000 --> 00:14:38,880 Speaker 1: of brainstorming this episode, one thing she said that really
238 00:14:38,880 --> 00:14:41,960 Speaker 1: stuck with me is, like, everybody made Slender Man. Like,
239 00:14:42,040 --> 00:14:44,000 Speaker 1: yeah, it was like one of those stories where
240 00:14:44,000 --> 00:14:46,680 Speaker 1: you're just like, he's the Internet's baby. We all made him.
241 00:14:46,840 --> 00:14:50,400 Speaker 1: But his actual creation date is something that we
242 00:14:50,440 --> 00:14:55,240 Speaker 1: can trace back to that Photoshop thread, which we even
243 00:14:55,240 --> 00:14:57,720 Speaker 1: know when that was originally posted, which was June eight,
244 00:14:57,840 --> 00:15:00,960 Speaker 1: two thousand nine. It is so rare when you can
245 00:15:01,000 --> 00:15:05,880 Speaker 1: actually trace a legend back to its point of origin. Now, granted,
246 00:15:06,440 --> 00:15:10,400 Speaker 1: it grew much larger than that thread, and in fact
247 00:15:10,880 --> 00:15:16,720 Speaker 1: it even factors into a real-world crime. And, um,
248 00:15:17,280 --> 00:15:19,920 Speaker 1: this is what I'm sure anyone who's followed the story
249 00:15:19,920 --> 00:15:23,840 Speaker 1: of Slender Man knows about. There were a group of young
250 00:15:23,920 --> 00:15:28,080 Speaker 1: girls in Wisconsin in 2014. They were all friends, and
251 00:15:28,160 --> 00:15:31,760 Speaker 1: two of the friends lured the third one out to
252 00:15:31,840 --> 00:15:35,760 Speaker 1: a remote location and then stabbed her. Uh.
She fortunately
253 00:15:35,840 --> 00:15:40,360 Speaker 1: survived the attack and made a full recovery. Uh,
254 00:15:40,360 --> 00:15:46,960 Speaker 1: and the two attackers were committed to mental health institutions
255 00:15:47,120 --> 00:15:49,680 Speaker 1: for at least, you know, a while. You said that
256 00:15:49,720 --> 00:15:52,680 Speaker 1: one of them is currently coming up for parole soon.
257 00:15:52,880 --> 00:15:55,840 Speaker 1: I didn't actually read the article on it. While
258 00:15:55,840 --> 00:16:00,840 Speaker 1: I was scrolling through the social media interwebs one day,
259 00:16:00,920 --> 00:16:03,200 Speaker 1: I saw an article. I didn't read the article, but
260 00:16:03,280 --> 00:16:07,640 Speaker 1: I saw the headline. It said, um, one of them was
261 00:16:07,720 --> 00:16:11,200 Speaker 1: up for parole, and apparently the family of the victim
262 00:16:11,320 --> 00:16:15,080 Speaker 1: is not very happy about that. Understandably so. Understandable. It
263 00:16:15,320 --> 00:16:18,320 Speaker 1: is certainly one of those stories where you feel awful
264 00:16:18,520 --> 00:16:23,360 Speaker 1: for everybody. You feel terrible for people who have lost
265 00:16:24,000 --> 00:16:26,320 Speaker 1: enough connection with reality to not be able to tell
266 00:16:26,360 --> 00:16:30,760 Speaker 1: the difference between a fantasy and what is actually real.
267 00:16:31,160 --> 00:16:33,680 Speaker 1: But then, when you're a kid, that's a tough
268 00:16:33,720 --> 00:16:37,720 Speaker 1: distinction to make. Honestly, these days, as an adult, it
269 00:16:37,720 --> 00:16:39,960 Speaker 1: could be a pretty tough distinction to make. There's a
270 00:16:39,960 --> 00:16:42,400 Speaker 1: lot of misinformation out there, and the whole thing with
271 00:16:42,440 --> 00:16:45,520 Speaker 1: internet folklore, the problem with it, is that
272 00:16:46,280 --> 00:16:49,800 Speaker 1: so much of it seems real.
It's presented as if it's
273 00:16:49,840 --> 00:16:52,920 Speaker 1: completely sincere and real. Yeah, I mean, I would like
274 00:16:53,000 --> 00:16:57,160 Speaker 1: to think that I don't believe in a tall,
275 00:16:57,240 --> 00:17:01,880 Speaker 1: child-snatching bald guy, but, I mean, late at night,
276 00:17:01,960 --> 00:17:05,000 Speaker 1: you see something like this and your imagination starts to
277 00:17:05,040 --> 00:17:08,000 Speaker 1: fill in gaps, and then, you know, you start to
278 00:17:08,359 --> 00:17:11,440 Speaker 1: believe things that are not necessarily true
279 00:17:11,560 --> 00:17:15,080 Speaker 1: might be true. Yeah. We wanted to chat a little
280 00:17:15,119 --> 00:17:18,000 Speaker 1: bit about also a couple of other examples of
281 00:17:18,040 --> 00:17:21,480 Speaker 1: creepypasta really quickly. And this next one was one that
282 00:17:21,800 --> 00:17:25,679 Speaker 1: surprised me, because you also mentioned Candle Cove, which I
283 00:17:25,680 --> 00:17:28,480 Speaker 1: had not heard of before you mentioned it. I hadn't
284 00:17:28,520 --> 00:17:31,880 Speaker 1: heard of anything about Candle Cove until maybe a year
285 00:17:32,040 --> 00:17:35,080 Speaker 1: and a half ago. Um, I was
286 00:17:35,119 --> 00:17:37,840 Speaker 1: at my mom's house one day visiting, and she and
287 00:17:37,880 --> 00:17:41,159 Speaker 1: my stepdad were really into the show, and it was
288 00:17:41,320 --> 00:17:43,320 Speaker 1: Candle... and the show, I think it was on
289 00:17:44,240 --> 00:17:49,280 Speaker 1: Syfy. It's their Channel Zero series. Yeah,
290 00:17:49,280 --> 00:17:52,520 Speaker 1: that's what tripped me up.
But yeah, it was a
291 00:17:52,520 --> 00:17:55,080 Speaker 1: show called Candle Cove, and they were really into it,
292 00:17:55,720 --> 00:17:59,439 Speaker 1: and she was explaining to me, you know, the premise.
293 00:17:59,640 --> 00:18:03,120 Speaker 1: The premise sounded familiar, so I don't know if I had,
294 00:18:03,480 --> 00:18:06,680 Speaker 1: you know, encountered the original. But I looked
295 00:18:06,680 --> 00:18:09,240 Speaker 1: it up, because honestly, the reason I was looking it
296 00:18:09,320 --> 00:18:11,200 Speaker 1: up is that I recognized one of the actors from
297 00:18:11,240 --> 00:18:12,879 Speaker 1: Parks and Rec and I wanted to figure out what
298 00:18:12,920 --> 00:18:15,040 Speaker 1: his name was. And as I was looking it up,
299 00:18:15,200 --> 00:18:18,440 Speaker 1: I saw all the information about how Candle Cove started off as...
300 00:18:19,440 --> 00:18:21,959 Speaker 1: I don't think it was on Creepypasta, but it was
301 00:18:22,520 --> 00:18:26,160 Speaker 1: definitely a creepypasta-style story. Yes, it was.
302 00:18:26,000 --> 00:18:28,159 Speaker 1: It was written by a guy named Chris Straub. And
303 00:18:28,200 --> 00:18:33,040 Speaker 1: what's funny is, I've talked to Chris Straub personally, years ago.
304 00:18:33,200 --> 00:18:36,680 Speaker 1: So Chris Straub is also a webcomic artist, and
305 00:18:36,800 --> 00:18:39,280 Speaker 1: I wrote for HowStuffWorks dot com. I wrote
306 00:18:39,320 --> 00:18:43,120 Speaker 1: the article "How Web Comics Work," and he was one
307 00:18:43,119 --> 00:18:45,520 Speaker 1: of my sources, and I did a full interview with him,
308 00:18:45,880 --> 00:18:48,560 Speaker 1: chatted with him for probably an hour while I
309 00:18:48,600 --> 00:18:52,880 Speaker 1: was researching that article.
And, uh, it's so interesting, because
310 00:18:52,960 --> 00:18:56,119 Speaker 1: again, I had no idea until we started researching this
311 00:18:56,200 --> 00:19:00,280 Speaker 1: episode that he had contributed this piece of internet lore.
312 00:19:00,600 --> 00:19:04,960 Speaker 1: So the original work is presented as a series of posts,
313 00:19:05,080 --> 00:19:08,639 Speaker 1: again, on a message board, so it seemingly has a lot
314 00:19:08,680 --> 00:19:11,440 Speaker 1: of contributors. Yes. But it's the work of
315 00:19:11,520 --> 00:19:14,560 Speaker 1: fiction of one person, presented as
316 00:19:14,640 --> 00:19:19,080 Speaker 1: if it's a thread, a conversational thread between multiple people
317 00:19:19,119 --> 00:19:24,400 Speaker 1: who all remember this children's show that no one else
318 00:19:24,400 --> 00:19:29,920 Speaker 1: seems to realize existed. And it starts off pretty innocently, like, hey,
319 00:19:29,960 --> 00:19:33,159 Speaker 1: does anyone else here happen to remember this show? I
320 00:19:33,160 --> 00:19:36,439 Speaker 1: remember watching it as a kid, but I
321 00:19:36,480 --> 00:19:39,199 Speaker 1: can't find anything about it, and so I'm wondering, like,
322 00:19:39,400 --> 00:19:41,919 Speaker 1: did I just imagine it? And other people say, oh no,
323 00:19:42,040 --> 00:19:43,879 Speaker 1: I remember that show too. And again, this is all
324 00:19:43,920 --> 00:19:46,680 Speaker 1: part of the fiction. Yeah. I think, like, the big
325 00:19:46,720 --> 00:19:49,120 Speaker 1: twist is, not only is the show super creepy, as
326 00:19:49,119 --> 00:19:52,159 Speaker 1: people remember more and more about it, it starts to
327 00:19:52,200 --> 00:19:56,240 Speaker 1: appear more and more sinister.
But one of the details 328 00:19:56,320 --> 00:20:00,960 Speaker 1: revealed is that the person who started the thread 329 00:20:01,200 --> 00:20:04,480 Speaker 1: chatted with his mother about what had happened, and his 330 00:20:04,560 --> 00:20:07,040 Speaker 1: mother said, oh, you would always say that you wanted 331 00:20:07,160 --> 00:20:09,480 Speaker 1: to watch a show called Candle Cove, but 332 00:20:09,560 --> 00:20:12,440 Speaker 1: you just changed the channel to static and sat there 333 00:20:12,480 --> 00:20:16,399 Speaker 1: and stared at it for thirty minutes. Yeah. So again, 334 00:20:16,520 --> 00:20:19,480 Speaker 1: a very kind of creepy premise. But that has taken 335 00:20:19,480 --> 00:20:22,520 Speaker 1: on a life of its own, where not only is it 336 00:20:22,560 --> 00:20:25,800 Speaker 1: a television series on Syfy, where they take elements 337 00:20:25,800 --> 00:20:29,439 Speaker 1: of that and they wrap other narratives around it, 338 00:20:29,480 --> 00:20:32,199 Speaker 1: but on top of that, like, I even debated 339 00:20:32,320 --> 00:20:34,840 Speaker 1: including this because I thought, well, clearly this 340 00:20:34,920 --> 00:20:37,879 Speaker 1: is a work of fiction. People know it's a work of fiction. No, 341 00:20:38,080 --> 00:20:41,080 Speaker 1: there are YouTube interviews, er, YouTube videos I should 342 00:20:41,080 --> 00:20:44,880 Speaker 1: say, titled Was Candle Cove a Real TV Show? 343 00:20:44,920 --> 00:20:46,600 Speaker 1: And I thought, well, if you have to make a 344 00:20:46,720 --> 00:20:50,120 Speaker 1: video about it, then there are people wondering about it.
Yeah, 345 00:20:50,119 --> 00:20:55,440 Speaker 1: because there are videos of, like, original Candle Cove episodes 346 00:20:55,480 --> 00:20:58,280 Speaker 1: which I'm pretty sure are just lifted from the show, 347 00:20:58,680 --> 00:21:02,760 Speaker 1: or there might be some, like, fan made 348 00:21:02,800 --> 00:21:06,400 Speaker 1: things, because there were fan fictions written about Candle Cove. Yeah, 349 00:21:06,560 --> 00:21:08,160 Speaker 1: now a lot of these actually would get a lot 350 00:21:08,160 --> 00:21:11,359 Speaker 1: of fan fiction added to them. And uh, the other 351 00:21:11,400 --> 00:21:13,959 Speaker 1: thing that this reminds me of is something called the 352 00:21:14,000 --> 00:21:19,199 Speaker 1: Mandola effect. Um. Actually that's technically the Mandela effect, but 353 00:21:19,400 --> 00:21:21,480 Speaker 1: I've heard it pronounced the wrong way. But it's named 354 00:21:21,520 --> 00:21:24,040 Speaker 1: after Nelson Mandela. So have you heard of this effect? 355 00:21:24,080 --> 00:21:26,240 Speaker 1: I've heard of the effect. I don't actually know why 356 00:21:26,240 --> 00:21:31,600 Speaker 1: it's called the Mandela effect. So Nelson Mandela was imprisoned 357 00:21:31,840 --> 00:21:35,879 Speaker 1: for many, many years, and the Mandela effect refers to 358 00:21:36,040 --> 00:21:40,320 Speaker 1: this belief that people had that they had heard newscasters 359 00:21:40,359 --> 00:21:43,960 Speaker 1: in the nineteen eighties announce that Nelson Mandela had passed 360 00:21:44,000 --> 00:21:48,639 Speaker 1: away while in prison, but that never happened. But people 361 00:21:48,720 --> 00:21:53,400 Speaker 1: swear they remember seeing those newscasts, but it never happened. 362 00:21:53,760 --> 00:21:56,280 Speaker 1: There are other examples of this. Uh.
The two that 363 00:21:56,320 --> 00:22:00,359 Speaker 1: people tend to cite pretty frequently are the Berenstain Bears, 364 00:22:00,760 --> 00:22:03,879 Speaker 1: where it's spelled stain at the end. That's the 365 00:22:03,920 --> 00:22:07,040 Speaker 1: correct spelling, but everyone thinks it's steen, S-T-E 366 00:22:07,119 --> 00:22:10,119 Speaker 1: -I-N, instead of S-T-A-I-N, 367 00:22:10,160 --> 00:22:12,119 Speaker 1: and they swear, like, no, I have a book and 368 00:22:12,160 --> 00:22:15,000 Speaker 1: it's spelled the other way. It's the Berenstein Bears, 369 00:22:15,080 --> 00:22:18,160 Speaker 1: the Berenstain Bears. And the other one is, uh, 370 00:22:18,280 --> 00:22:21,399 Speaker 1: everyone's convinced that the comedian Sinbad was in a movie 371 00:22:21,400 --> 00:22:24,520 Speaker 1: in which he played a genie named Shazam. Apparently he 372 00:22:24,720 --> 00:22:28,360 Speaker 1: was in like a weird promo where he was a genie, 373 00:22:28,840 --> 00:22:31,919 Speaker 1: but it wasn't that. There was no movie, because there 374 00:22:31,960 --> 00:22:36,760 Speaker 1: was no... Yeah, but those are all like Internet legends 375 00:22:36,840 --> 00:22:41,160 Speaker 1: that ended up sort of growing, and this 376 00:22:41,600 --> 00:22:47,320 Speaker 1: Mandola effect, or Mandela effect, had this added element of 377 00:22:47,600 --> 00:22:52,200 Speaker 1: people hearing something and then convincing themselves that they too 378 00:22:52,240 --> 00:22:56,439 Speaker 1: had experienced it, which is kind of interesting. Now we 379 00:22:56,520 --> 00:22:59,480 Speaker 1: actually have tons more we could talk about, like there are 380 00:22:59,520 --> 00:23:02,520 Speaker 1: other elements, not just creepy pasta.
There's stuff like 381 00:23:02,560 --> 00:23:06,960 Speaker 1: inspirational stories that are referred to as glurge, and I've 382 00:23:07,000 --> 00:23:09,160 Speaker 1: never heard it referred to as that. It's, you had 383 00:23:09,200 --> 00:23:11,200 Speaker 1: to be on Snopes dot com back in the day 384 00:23:11,240 --> 00:23:14,520 Speaker 1: on those message boards, but glurge refers to stories that 385 00:23:15,280 --> 00:23:19,080 Speaker 1: are meant to tap into those sort of sappy emotions 386 00:23:19,119 --> 00:23:21,600 Speaker 1: that we have. Uh. They also tend to be kind 387 00:23:21,640 --> 00:23:26,920 Speaker 1: of dark under the surface, but it's the inspirational 388 00:23:27,000 --> 00:23:31,280 Speaker 1: story counterpart to creepy pasta, except it's often passed off as if 389 00:23:31,440 --> 00:23:35,480 Speaker 1: it really happened, and it turns out it's never anything verifiable. 390 00:23:35,880 --> 00:23:38,880 Speaker 1: But rather than go off on a rant about that, we're 391 00:23:38,880 --> 00:23:40,480 Speaker 1: gonna take a quick break and when we come back, 392 00:23:40,480 --> 00:23:51,959 Speaker 1: we're gonna talk about some conspiracy theories. Now, we started 393 00:23:52,000 --> 00:23:55,800 Speaker 1: off this episode talking about works of fiction that aren't 394 00:23:55,800 --> 00:24:00,600 Speaker 1: necessarily intended to be taken seriously, but sometimes take on a 395 00:24:00,640 --> 00:24:02,960 Speaker 1: life of their own outside of their point of origin, 396 00:24:03,400 --> 00:24:06,040 Speaker 1: and then they are taken seriously. There are other times 397 00:24:06,400 --> 00:24:10,720 Speaker 1: where there's just a joke or a hoax that people 398 00:24:10,840 --> 00:24:15,320 Speaker 1: end up taking as a sincere message. And the first 399 00:24:15,359 --> 00:24:17,520 Speaker 1: one I wanted to mention is one of the 400 00:24:17,560 --> 00:24:21,520 Speaker 1: earliest examples of an Internet hoax.
It actually dates to December of ninety four, 401 00:24:23,119 --> 00:24:25,959 Speaker 1: when very few people even knew what the Internet was, 402 00:24:26,720 --> 00:24:33,120 Speaker 1: and it all involves the company Microsoft supposedly acquiring the 403 00:24:33,200 --> 00:24:39,800 Speaker 1: Catholic Church. Huh. You're a Catholic, so do you 404 00:24:39,840 --> 00:24:42,240 Speaker 1: own shares in Microsoft? Is that how it works? I 405 00:24:42,359 --> 00:24:45,760 Speaker 1: mean I have a Mac, so I don't. I don't 406 00:24:45,800 --> 00:24:49,400 Speaker 1: know if that's true. It's right in front of me. Yeah, 407 00:24:49,480 --> 00:24:53,680 Speaker 1: if my mom secretly owned shares in Microsoft, 408 00:24:53,840 --> 00:24:57,639 Speaker 1: then you know she's got some 'splaining to do. Yeah. 409 00:24:58,240 --> 00:25:01,360 Speaker 1: So had you heard of this before? I actually never 410 00:25:01,440 --> 00:25:04,479 Speaker 1: have, maybe. And I went to Catholic school for so 411 00:25:04,520 --> 00:25:07,040 Speaker 1: many years in my life, I'm surprised I never heard 412 00:25:07,119 --> 00:25:12,159 Speaker 1: this before. How old were you in ninety four? I 413 00:25:12,359 --> 00:25:15,960 Speaker 1: was two. Okay, maybe that's why you didn't hear about it. Okay, 414 00:25:16,280 --> 00:25:20,960 Speaker 1: whereas I was older than that. So this was a joke, 415 00:25:21,400 --> 00:25:24,160 Speaker 1: and it was clearly a joke. It was a press release, 416 00:25:24,640 --> 00:25:26,840 Speaker 1: a supposed press release, but it was, you know, a joke 417 00:25:26,920 --> 00:25:29,720 Speaker 1: in the form of a press release supposedly from the 418 00:25:29,760 --> 00:25:34,160 Speaker 1: Associated Press, which is, you know, a legitimate organization. Uh.
419 00:25:34,320 --> 00:25:39,760 Speaker 1: It even had a byline. Hank Vorhees was supposedly 420 00:25:39,800 --> 00:25:42,800 Speaker 1: the name of the person who wrote up this 421 00:25:42,840 --> 00:25:47,800 Speaker 1: press release. Any relation to Jason? No, not Voorhees. So, 422 00:25:48,359 --> 00:25:50,439 Speaker 1: uh, yeah. And I even wrote, just imagine what Clippy 423 00:25:50,480 --> 00:25:53,359 Speaker 1: would be like. So you're trying to transubstantiate. Would you 424 00:25:53,400 --> 00:25:57,960 Speaker 1: like some help? So, yeah. The hoax, the 425 00:25:58,000 --> 00:26:01,000 Speaker 1: press release hoax, even included a hoax quote from 426 00:26:01,000 --> 00:26:05,040 Speaker 1: Bill Gates, who was referring to religion as a growth market. 427 00:26:05,720 --> 00:26:09,600 Speaker 1: Obviously that was part of the joke. And it said 428 00:26:09,640 --> 00:26:14,000 Speaker 1: that Microsoft would get exclusive electronic publication rights to the Bible, 429 00:26:14,320 --> 00:26:16,720 Speaker 1: so there would be only one version of it, 430 00:26:16,920 --> 00:26:19,240 Speaker 1: not the King James, and that would be, yeah, 431 00:26:19,359 --> 00:26:22,200 Speaker 1: it would be whatever is the most Catholic version. Yeah, 432 00:26:23,440 --> 00:26:26,439 Speaker 1: the longer version. The Catholic version of anything is the 433 00:26:26,480 --> 00:26:28,600 Speaker 1: longer version. Ever been to a Catholic wedding? Yes, I 434 00:26:28,640 --> 00:26:31,600 Speaker 1: have. It's still going on. I know, I had to sneak out. 435 00:26:32,040 --> 00:26:35,800 Speaker 1: That was seven years ago. But yeah, this, like I said, 436 00:26:35,840 --> 00:26:38,440 Speaker 1: it was pretty clear it was a joke.
And yet 437 00:26:39,960 --> 00:26:43,119 Speaker 1: there were people who either got mad because they 438 00:26:43,119 --> 00:26:46,600 Speaker 1: thought that Microsoft had been perpetrating this joke and they 439 00:26:46,680 --> 00:26:50,159 Speaker 1: said this is in poor taste, and Microsoft got flooded 440 00:26:50,640 --> 00:26:54,800 Speaker 1: with complaints, or there were a few people who thought 441 00:26:54,880 --> 00:26:57,800 Speaker 1: it was a genuine press release and they were horrified 442 00:26:57,840 --> 00:27:00,520 Speaker 1: about it, and they were reaching out to Microsoft to express 443 00:27:00,600 --> 00:27:05,240 Speaker 1: their horror and concern. It got so bad that Microsoft 444 00:27:05,359 --> 00:27:12,720 Speaker 1: actually formally denounced the joke on December six, released its 445 00:27:12,760 --> 00:27:16,720 Speaker 1: own press release to say, no, we had nothing to 446 00:27:16,760 --> 00:27:19,560 Speaker 1: do with this announcement. We definitely did not buy the 447 00:27:19,600 --> 00:27:24,040 Speaker 1: Catholic Church. We don't know who it was 448 00:27:24,160 --> 00:27:27,080 Speaker 1: that created this. We have no connection to it. We 449 00:27:27,160 --> 00:27:30,760 Speaker 1: are sorry that you're upset. Which was weird, that they 450 00:27:30,760 --> 00:27:33,760 Speaker 1: were apologizing for something that they literally had no control over. 451 00:27:34,800 --> 00:27:36,560 Speaker 1: I mean, this wouldn't be the only time that Microsoft 452 00:27:36,640 --> 00:27:40,520 Speaker 1: would get connected to Internet legends and hoaxes. There 453 00:27:40,520 --> 00:27:43,160 Speaker 1: were also ones that said if you forward this email, 454 00:27:43,440 --> 00:27:47,000 Speaker 1: then Bill Gates is going to give you money, 455 00:27:47,040 --> 00:27:49,919 Speaker 1: which was never true.
Oh, those chain letters. I remember 456 00:27:49,920 --> 00:27:53,720 Speaker 1: those email chain letters quite well. Yeah, thankfully those have 457 00:27:53,920 --> 00:27:57,320 Speaker 1: kind of died down. But you sometimes will see the 458 00:27:57,359 --> 00:28:00,520 Speaker 1: equivalent on Facebook, where it will be, like, usually on Facebook 459 00:28:00,520 --> 00:28:03,920 Speaker 1: it's share this, but don't share it, copy and paste it. 460 00:28:04,480 --> 00:28:08,080 Speaker 1: And by the way, that message, don't share this, 461 00:28:08,240 --> 00:28:11,720 Speaker 1: copy and paste it, is pretty much a dead giveaway that 462 00:28:11,840 --> 00:28:14,159 Speaker 1: what you're being asked to do is to perpetuate 463 00:28:14,160 --> 00:28:18,560 Speaker 1: a hoax, because the justification you usually get is we don't 464 00:28:18,560 --> 00:28:22,359 Speaker 1: want the people who are most vulnerable to be targeted, 465 00:28:22,600 --> 00:28:26,800 Speaker 1: so therefore, copy and paste it, or Facebook's algorithms will 466 00:28:27,080 --> 00:28:30,359 Speaker 1: not promote this if you share it. That's not true. 467 00:28:30,760 --> 00:28:32,800 Speaker 1: The reason they don't want you to share it is 468 00:28:32,840 --> 00:28:36,080 Speaker 1: because if you share it rather than copy and paste it, yeah, 469 00:28:36,200 --> 00:28:37,920 Speaker 1: you can trace it back to the point of origin. 470 00:28:38,760 --> 00:28:42,360 Speaker 1: But the Microsoft Catholic Church stuff sounds a 471 00:28:42,360 --> 00:28:46,160 Speaker 1: little Jonathan Swift-y. Yeah, it was 472 00:28:46,720 --> 00:28:52,400 Speaker 1: a pretty odd thing to see rise to, you know, 473 00:28:52,400 --> 00:28:54,800 Speaker 1: hoax level. Again, I don't think it was intended as 474 00:28:54,840 --> 00:28:57,719 Speaker 1: a hoax.
Like, to me, this sounds like an Onion 475 00:28:57,880 --> 00:29:03,000 Speaker 1: article, and someone mistakenly believing that a satirical article is 476 00:29:03,200 --> 00:29:07,360 Speaker 1: a real thing, like A Modest Proposal type of stuff. Yeah, 477 00:29:07,480 --> 00:29:10,320 Speaker 1: and when you see people taking something that 478 00:29:10,440 --> 00:29:16,160 Speaker 1: was meant as satire sincerely, that's, well, it is scary, 479 00:29:16,200 --> 00:29:18,920 Speaker 1: although you could also argue some people are 480 00:29:18,920 --> 00:29:22,640 Speaker 1: just really bad at satire, and sometimes things that people 481 00:29:22,720 --> 00:29:26,440 Speaker 1: claim are satire just are lies. It's not satirical, it's just 482 00:29:26,480 --> 00:29:30,480 Speaker 1: a lie. And uh, you know, perpetuating a lie and 483 00:29:30,520 --> 00:29:33,400 Speaker 1: then telling people, oh no, it's satire. That's not a 484 00:29:33,440 --> 00:29:36,160 Speaker 1: get out of jail free card. My joke didn't land. 485 00:29:36,160 --> 00:29:38,720 Speaker 1: Oh don't worry, it was just satire. Yeah, no, it wasn't. 486 00:29:38,760 --> 00:29:41,640 Speaker 1: It was a bad joke, right? Right, or, 487 00:29:41,720 --> 00:29:45,360 Speaker 1: there was no joke structure there. You were just telling falsehoods. 488 00:29:45,760 --> 00:29:47,680 Speaker 1: Then you got mad when you got called out on it. 489 00:29:48,640 --> 00:29:51,280 Speaker 1: So that is a very early example. And then the 490 00:29:51,320 --> 00:29:55,720 Speaker 1: next one is sort of the genesis for this whole episode, 491 00:29:55,840 --> 00:30:00,360 Speaker 1: the erratas story, E-R-R-A-T-A-S. 492 00:30:00,600 --> 00:30:03,520 Speaker 1: Although sometimes it's spelled with just one R: E-R-A-T 493 00:30:03,760 --> 00:30:05,959 Speaker 1: -A-S.
I'm glad that you've established that we were 494 00:30:05,960 --> 00:30:10,120 Speaker 1: saying erratas, not erotica. Or I was saying it 495 00:30:10,200 --> 00:30:15,840 Speaker 1: like aratus. Well, it's errata. Errata is already plural, 496 00:30:16,000 --> 00:30:18,680 Speaker 1: so it's already got a problem with it. Yeah, 497 00:30:18,760 --> 00:30:23,120 Speaker 1: because it's pluralized in an Anglicized way. But 498 00:30:23,400 --> 00:30:26,960 Speaker 1: errata would already be the plural, and it means, like, errors. Right. 499 00:30:27,240 --> 00:30:30,040 Speaker 1: You typically would, like, publish errata 500 00:30:30,160 --> 00:30:33,640 Speaker 1: to say these are things that were included in the 501 00:30:33,640 --> 00:30:38,320 Speaker 1: previously published work that are wrong. Yeah, might be 502 00:30:38,360 --> 00:30:41,560 Speaker 1: a typo, might be that there was a misplaced decimal 503 00:30:41,640 --> 00:30:45,120 Speaker 1: in a figure or something. So the misspelling of erratas 504 00:30:45,280 --> 00:30:49,280 Speaker 1: is in itself an erratum, right, 505 00:30:49,320 --> 00:30:55,440 Speaker 1: wouldn't it be? It's errata-ception. Okay. So this is a 506 00:30:55,480 --> 00:30:58,680 Speaker 1: weird one. It's a very odd story. It's actually kind 507 00:30:58,680 --> 00:31:02,840 Speaker 1: of hard to explain because it gets wrapped up in 508 00:31:02,920 --> 00:31:08,360 Speaker 1: about two or three different kind of weird things. Uh. Supposedly, 509 00:31:08,600 --> 00:31:14,000 Speaker 1: erratas itself refers to some kind of computer algorithm. Yeah, 510 00:31:14,160 --> 00:31:18,720 Speaker 1: that's used by HR or used by companies to remove 511 00:31:19,040 --> 00:31:23,160 Speaker 1: material they don't want on the Internet.
Like, think of 512 00:31:23,240 --> 00:31:27,680 Speaker 1: it this way: it's supposedly an algorithm that can seek and 513 00:31:27,760 --> 00:31:33,320 Speaker 1: destroy content and remove it from the Internet automatically. Typically 514 00:31:34,280 --> 00:31:38,280 Speaker 1: described as a way of removing stuff that's infringing on copyright, 515 00:31:38,400 --> 00:31:43,280 Speaker 1: so like a takedown. So let's say that you, I 516 00:31:43,320 --> 00:31:47,480 Speaker 1: don't know, you upload the entire film Berry Gordy's The 517 00:31:47,560 --> 00:31:51,280 Speaker 1: Last Dragon to YouTube, and you just have it hosted on 518 00:31:51,320 --> 00:31:54,840 Speaker 1: your own account because it's the greatest movie ever made, 519 00:31:54,880 --> 00:31:56,720 Speaker 1: and I will brook no discussion upon the matter, and 520 00:31:56,760 --> 00:32:03,200 Speaker 1: then this erratas computer algorithm could be 521 00:32:03,240 --> 00:32:05,840 Speaker 1: sent to take that down. There's no need for that. 522 00:32:06,000 --> 00:32:10,480 Speaker 1: YouTube has a very effective, some would say overly effective, 523 00:32:11,080 --> 00:32:14,959 Speaker 1: means of taking down stuff that gets copyright strikes on it, 524 00:32:15,440 --> 00:32:18,160 Speaker 1: either demonetizing it or taking it down entirely. There's no 525 00:32:18,280 --> 00:32:21,040 Speaker 1: need for an algorithm. But the other component to this 526 00:32:21,600 --> 00:32:25,560 Speaker 1: is that supposedly the algorithm could find and remove references 527 00:32:25,600 --> 00:32:31,000 Speaker 1: to itself.
Yeah, like, when I 528 00:32:31,040 --> 00:32:34,440 Speaker 1: was researching, it seemed like it was saying that if 529 00:32:35,240 --> 00:32:39,640 Speaker 1: anybody even typed erratas into, like, a search engine, 530 00:32:39,720 --> 00:32:44,120 Speaker 1: it could somehow, not really damage your computer, but it 531 00:32:44,160 --> 00:32:47,200 Speaker 1: could, like, track you just by looking it up or 532 00:32:47,240 --> 00:32:50,160 Speaker 1: something. Like, a lot of stories were 533 00:32:50,240 --> 00:32:54,320 Speaker 1: about how someone was supposedly working for a company and 534 00:32:54,480 --> 00:32:59,960 Speaker 1: came across some mention of erratas, a box labeled that, yeah, or 535 00:33:00,240 --> 00:33:03,520 Speaker 1: like a division that had the name Erratas, and then 536 00:33:03,560 --> 00:33:08,280 Speaker 1: they would do an internal search on their intranet, 537 00:33:08,600 --> 00:33:12,880 Speaker 1: right, the company's intranet, not the Internet necessarily, and that 538 00:33:13,280 --> 00:33:18,200 Speaker 1: this would suddenly raise an alarm among the bigwigs 539 00:33:18,240 --> 00:33:22,160 Speaker 1: and that person would be dismissed, fired from their job, 540 00:33:22,280 --> 00:33:25,840 Speaker 1: or they would be told in no uncertain terms to 541 00:33:26,200 --> 00:33:28,680 Speaker 1: leave all that alone. And so it was just this 542 00:33:28,760 --> 00:33:32,760 Speaker 1: idea that it was a forbidden search term, to the 543 00:33:32,800 --> 00:33:36,960 Speaker 1: point where people were perpetuating stories that you shouldn't even 544 00:33:37,120 --> 00:33:39,920 Speaker 1: search for it online because it could come back to 545 00:33:40,000 --> 00:33:46,080 Speaker 1: haunt you. Don't even think it.
Meanwhile, seemingly not connected 546 00:33:46,120 --> 00:33:50,640 Speaker 1: to this at all, there was a YouTube account created 547 00:33:50,680 --> 00:33:56,480 Speaker 1: called Chronos for Life that appeared to be mostly dedicated 548 00:33:56,560 --> 00:34:01,920 Speaker 1: to a very odd video talking about the Jurassic Park film. 549 00:34:02,000 --> 00:34:04,880 Speaker 1: It was like a fan video of Jurassic Park. And 550 00:34:05,880 --> 00:34:08,520 Speaker 1: have you watched these videos? I didn't watch the video. 551 00:34:08,560 --> 00:34:11,920 Speaker 1: I watched clips, like snippets, of the videos themselves. 552 00:34:12,000 --> 00:34:14,560 Speaker 1: So a lot of the videos have, like, white text 553 00:34:14,760 --> 00:34:18,359 Speaker 1: on top that's illegible. Yeah, there's no, 554 00:34:18,520 --> 00:34:22,719 Speaker 1: there's no, like, voice over. It's like the narrator is 555 00:34:22,760 --> 00:34:26,680 Speaker 1: communicating via text, but it's white text over a light background, 556 00:34:26,760 --> 00:34:29,000 Speaker 1: so that makes it tough. And sometimes it's only up for 557 00:34:29,600 --> 00:34:32,560 Speaker 1: a short number of frames, and the video 558 00:34:32,600 --> 00:34:36,080 Speaker 1: itself along with the text is such poor quality no 559 00:34:36,120 --> 00:34:38,680 Speaker 1: matter how you watch it that it's hard to read, 560 00:34:38,719 --> 00:34:40,880 Speaker 1: no matter what. Right. So you get the 561 00:34:40,920 --> 00:34:44,640 Speaker 1: sense that the person who made this is either 562 00:34:44,920 --> 00:34:48,880 Speaker 1: doing some very interesting but strange performance art, or maybe 563 00:34:48,960 --> 00:34:53,680 Speaker 1: they're not well. They're not well.
Well, that's another possibility, because it 564 00:34:53,719 --> 00:34:58,319 Speaker 1: comes across very disjointed, illogical, difficult to follow, things that 565 00:34:58,360 --> 00:35:02,800 Speaker 1: would typically be unsettling. Yeah, it was kind of conspiratorial, 566 00:35:02,880 --> 00:35:05,160 Speaker 1: but you didn't really know what the conspiracy was, right. 567 00:35:05,200 --> 00:35:08,239 Speaker 1: And then there would be a video that was in 568 00:35:08,280 --> 00:35:12,040 Speaker 1: that same vein, but not about Jurassic Park, rather about 569 00:35:12,400 --> 00:35:18,479 Speaker 1: how this erratas thing was a dark conspiracy, and 570 00:35:18,640 --> 00:35:21,600 Speaker 1: I think the person was even 571 00:35:21,640 --> 00:35:25,600 Speaker 1: worried about their mother being under surveillance. Yeah, because the 572 00:35:25,640 --> 00:35:28,239 Speaker 1: mom was the one originally uploading all the Jurassic Park 573 00:35:28,400 --> 00:35:32,640 Speaker 1: videos, or something. And then, yeah, her 574 00:35:32,840 --> 00:35:37,520 Speaker 1: own mental acuity was deteriorating as a result of this, 575 00:35:37,640 --> 00:35:41,839 Speaker 1: and yeah, I guess, very, very strange. And then, uh, 576 00:35:42,480 --> 00:35:46,160 Speaker 1: people found that if they turned on captions for these videos, 577 00:35:46,960 --> 00:35:48,880 Speaker 1: because you couldn't really make anything out in the videos, 578 00:35:48,880 --> 00:35:51,640 Speaker 1: but you turn on captions, you would get these captions 579 00:35:51,680 --> 00:35:56,480 Speaker 1: that would not appear otherwise. That left clues, and if 580 00:35:56,520 --> 00:36:00,640 Speaker 1: you followed the clues, you would eventually find yourself 581 00:36:00,880 --> 00:36:04,360 Speaker 1: looking at the Bandcamp page for a band called 582 00:36:04,600 --> 00:36:08,440 Speaker 1: KFC Murder Chicks. Are we going to get sued?
No, 583 00:34:10,239 --> 00:34:12,759 Speaker 1: but that's what the band was called. I mean, yeah, are 584 00:36:12,800 --> 00:36:15,399 Speaker 1: they going to get sued? But probably not at this point. 585 00:36:15,400 --> 00:36:17,320 Speaker 1: It sounds to me like that band has been defunct 586 00:36:17,400 --> 00:36:23,200 Speaker 1: for a long time. So what this eventually appeared to 587 00:36:23,280 --> 00:36:29,279 Speaker 1: be was a case of a person working on a 588 00:36:29,400 --> 00:36:33,279 Speaker 1: music project which largely was based around taking snippets of 589 00:36:33,320 --> 00:36:37,040 Speaker 1: audio from videos and putting them together into really weird 590 00:36:37,680 --> 00:36:41,520 Speaker 1: compositions. Yeah. The purpose of the band was to 591 00:36:41,640 --> 00:36:46,160 Speaker 1: create a type of music called Internet sounds. Yeah, deep Internet. Yeah, 592 00:36:46,239 --> 00:36:48,600 Speaker 1: the deep Internet sounds, and it was supposed to be 593 00:36:48,719 --> 00:36:53,319 Speaker 1: some sort of weird, distorted background noise. It made me 594 00:36:53,400 --> 00:36:59,440 Speaker 1: think of, like, a perhaps less focused approach to 595 00:37:00,000 --> 00:37:03,520 Speaker 1: the format, as Negativland does, and that's a band 596 00:37:03,560 --> 00:37:08,840 Speaker 1: that does a lot of compositions by taking audio 597 00:37:08,920 --> 00:37:13,719 Speaker 1: snippets from different sources and putting them together into very 598 00:37:13,800 --> 00:37:19,680 Speaker 1: interesting audio, like interesting albums.
Actually, in fact, I 599 00:37:19,719 --> 00:37:22,919 Speaker 1: had one of the founding members of Negativland 600 00:37:23,080 --> 00:37:24,960 Speaker 1: on for a couple of episodes and we talked about 601 00:37:24,960 --> 00:37:28,480 Speaker 1: that work. So, similar to that, in fact, 602 00:37:28,719 --> 00:37:34,160 Speaker 1: the person who had created these would later say, yeah, 603 00:37:34,200 --> 00:37:36,400 Speaker 1: this was just sort of my attempt to kind of 604 00:37:36,719 --> 00:37:42,800 Speaker 1: make something, but I didn't really have a fully 605 00:37:42,840 --> 00:37:44,959 Speaker 1: fleshed out plan on how that was going to work. 606 00:37:46,239 --> 00:37:49,040 Speaker 1: But on the Internet people were following it and filling 607 00:37:49,040 --> 00:37:52,040 Speaker 1: in the gaps and making it more than what it 608 00:37:52,239 --> 00:37:55,759 Speaker 1: was and treating it like, oh, there really is this 609 00:37:55,880 --> 00:37:58,920 Speaker 1: conspiracy out there. And one of the crazy things about 610 00:37:58,920 --> 00:38:03,359 Speaker 1: conspiracy theories is that if you come out and tell 611 00:38:03,440 --> 00:38:07,080 Speaker 1: someone no, that was all manufactured, it's not actually a 612 00:38:07,120 --> 00:38:11,960 Speaker 1: conspiracy, that ends up being part of the conspiracy. Yeah, it's a cover up, right? Like, 613 00:38:12,080 --> 00:38:15,160 Speaker 1: so there's no way to disprove a conspiracy to someone 614 00:38:15,160 --> 00:38:20,840 Speaker 1: who's desperately going to follow that conspiracy theory. They 615 00:38:20,880 --> 00:38:23,920 Speaker 1: are invested in believing in it, and so any 616 00:38:24,320 --> 00:38:29,760 Speaker 1: counter evidence you give was manufactured to discredit the conspiracy, 617 00:38:29,800 --> 00:38:33,160 Speaker 1: but it doesn't disprove it. That kind of thing.
So 618 00:38:33,480 --> 00:38:35,719 Speaker 1: this is another one of those cases where again, I 619 00:38:35,719 --> 00:38:37,840 Speaker 1: had never even heard of this until you asked 620 00:38:37,840 --> 00:38:40,319 Speaker 1: me about it. I honestly, at the beginning of this week, 621 00:38:40,360 --> 00:38:45,040 Speaker 1: I was just very sick and going through just the interwebs, 622 00:38:45,080 --> 00:38:49,399 Speaker 1: and I came across a video about the erratas conspiracy, and 623 00:38:49,440 --> 00:38:52,359 Speaker 1: I was like, that sounds like something he has talked 624 00:38:52,360 --> 00:38:54,279 Speaker 1: about at some point. I don't know. There are so 625 00:38:54,320 --> 00:38:57,560 Speaker 1: many episodes that I'm not sure. I mean, it may 626 00:38:57,640 --> 00:39:00,000 Speaker 1: turn out that I have covered this very 627 00:39:00,360 --> 00:39:03,120 Speaker 1: same story. Yeah, and I wouldn't remember. This was all 628 00:39:03,120 --> 00:39:06,480 Speaker 1: a conspiracy in and of itself. Well, I think it's 629 00:39:06,480 --> 00:39:08,520 Speaker 1: fascinating and we have a little bit more to talk 630 00:39:08,560 --> 00:39:13,560 Speaker 1: about as far as misinformation goes, where, like in this case, 631 00:39:13,600 --> 00:39:16,960 Speaker 1: I would say that the erratas thing kind of falls 632 00:39:16,960 --> 00:39:20,640 Speaker 1: into the land of the alternate reality game genre. So 633 00:39:20,719 --> 00:39:26,520 Speaker 1: alternate reality games are where you mix reality and the 634 00:39:26,640 --> 00:39:31,040 Speaker 1: fictional world within a game, and there are a 635 00:39:31,080 --> 00:39:37,080 Speaker 1: lot of examples. They typically are marketing tools.
So for example, AI, 636 00:39:37,239 --> 00:39:41,760 Speaker 1: the movie AI, the Spielberg Kubrick film, had an 637 00:39:41,800 --> 00:39:44,279 Speaker 1: alternate reality game associated with it that a lot of 638 00:39:44,320 --> 00:39:48,120 Speaker 1: people referred to as The Beast, and it was a 639 00:39:48,160 --> 00:39:52,080 Speaker 1: story that played out mostly online, but included some stuff 640 00:39:52,120 --> 00:39:55,040 Speaker 1: where people could interact not just online, but they could 641 00:39:55,080 --> 00:39:57,719 Speaker 1: get faxes and phone calls and stuff. So it was 642 00:39:57,800 --> 00:40:02,200 Speaker 1: like the fictional world of the game would reach across 643 00:40:02,440 --> 00:40:06,880 Speaker 1: the barrier between reality and fiction and interact with you 644 00:40:06,960 --> 00:40:10,960 Speaker 1: in ways in the real world that typically games don't. 645 00:40:11,680 --> 00:40:15,760 Speaker 1: So it became kind of a role playing game, almost. 646 00:40:15,840 --> 00:40:19,680 Speaker 1: And I think that's one version that can end 647 00:40:19,760 --> 00:40:24,120 Speaker 1: up being misconstrued as being reality, as opposed to, no, 648 00:40:24,160 --> 00:40:26,160 Speaker 1: this is a marketing thing, or it's meant to 649 00:40:26,200 --> 00:40:28,719 Speaker 1: be a game. When we come back, we're gonna talk 650 00:40:28,760 --> 00:40:35,400 Speaker 1: about examples of misinformation that was created specifically to perpetuate 651 00:40:35,640 --> 00:40:39,840 Speaker 1: an agenda, not to create some sort of fiction or 652 00:40:39,880 --> 00:40:42,480 Speaker 1: a game. We'll do that after we come back from 653 00:40:42,520 --> 00:40:53,000 Speaker 1: this quick break. All right, we're getting into the home stretch, 654 00:40:53,000 --> 00:40:55,160 Speaker 1: and this is some of the darkest stuff we have 655 00:40:55,239 --> 00:41:00,239 Speaker 1: to talk about.
So there are communities out on the 656 00:41:00,239 --> 00:41:04,399 Speaker 1: Internet that sow discord, and they may do it for 657 00:41:04,520 --> 00:41:08,120 Speaker 1: multiple reasons. They might just want to, you know, it 658 00:41:08,160 --> 00:41:12,000 Speaker 1: might be like what Christopher Nolan would say: some people 659 00:41:12,080 --> 00:41:15,560 Speaker 1: just want to watch the world burn. In some cases, 660 00:41:15,640 --> 00:41:21,319 Speaker 1: it's to push a particular agenda that they really believe in, 661 00:41:21,520 --> 00:41:25,080 Speaker 1: or to discredit an agenda that they do not like. 662 00:41:25,800 --> 00:41:30,759 Speaker 1: And all's fair in these communities, and in fact, 663 00:41:30,840 --> 00:41:34,760 Speaker 1: you have multiple people with different motivators, all working towards 664 00:41:34,760 --> 00:41:38,240 Speaker 1: the same goal. Some people may earnestly want to bring 665 00:41:38,360 --> 00:41:41,840 Speaker 1: down something. Some people may just think it's funny to 666 00:41:42,680 --> 00:41:46,000 Speaker 1: see commotion come out of their work, you know. Some 667 00:41:46,000 --> 00:41:49,520 Speaker 1: people just want to get a rise out of people. 668 00:41:49,560 --> 00:41:52,279 Speaker 1: But the communities that we typically see this in the 669 00:41:52,320 --> 00:41:56,239 Speaker 1: most are places like 4chan and 8chan, where 670 00:41:56,280 --> 00:42:01,200 Speaker 1: we get groups of people who find each other, they 671 00:42:01,200 --> 00:42:06,640 Speaker 1: have some sort of commonality between them, and they know 672 00:42:06,760 --> 00:42:10,319 Speaker 1: how to gamify systems.
They know how people think, and 673 00:42:10,320 --> 00:42:14,200 Speaker 1: they know how systems work, and using those two things together, 674 00:42:14,640 --> 00:42:20,879 Speaker 1: they can perpetuate misinformation and lies and push forward their 675 00:42:20,880 --> 00:42:24,120 Speaker 1: own agenda. And we've seen that several times over the 676 00:42:24,200 --> 00:42:28,719 Speaker 1: last few years in particular. So one entity or one 677 00:42:28,800 --> 00:42:32,759 Speaker 1: group would be QAnon. Really, that's originally an 678 00:42:32,880 --> 00:42:37,000 Speaker 1: entity, a person who was posting with the handle 679 00:42:37,640 --> 00:42:43,560 Speaker 1: Q, so QAnon being Q plus anonymous, that has 680 00:42:43,640 --> 00:42:52,560 Speaker 1: posted numerous things that were very conspiracy-oriented, directed at 681 00:42:53,360 --> 00:42:59,280 Speaker 1: particular groups or people, for example Hillary Clinton. And 682 00:43:00,080 --> 00:43:02,440 Speaker 1: it ends up being a narrative that ends up 683 00:43:02,480 --> 00:43:05,680 Speaker 1: being picked up by other people and then presented as 684 00:43:05,760 --> 00:43:09,120 Speaker 1: truth, that then gets reinforced again and again to the 685 00:43:09,120 --> 00:43:12,680 Speaker 1: point where you start seeing some of those things that 686 00:43:12,719 --> 00:43:18,080 Speaker 1: have never been shown to have any actual truth to them 687 00:43:18,280 --> 00:43:24,680 Speaker 1: being presented by media outlets as truth, usually 688 00:43:25,120 --> 00:43:28,000 Speaker 1: right-wing media outlets. You don't tend to see that 689 00:43:28,880 --> 00:43:32,920 Speaker 1: beyond that group, but it has become a real issue. 690 00:43:33,800 --> 00:43:37,520 Speaker 1: And one example of that has been known as Pizzagate.
691 00:43:38,480 --> 00:43:41,680 Speaker 1: And Pizzagate is a truly terrible story, 692 00:43:42,200 --> 00:43:45,680 Speaker 1: and it's one that alleges that there was 693 00:43:45,719 --> 00:43:51,600 Speaker 1: this particular pizza parlor where secretly there was a pedophilia 694 00:43:51,719 --> 00:43:55,799 Speaker 1: ring running through this pizza parlor, like that was 695 00:43:55,880 --> 00:44:00,879 Speaker 1: the nexus for this pedophilia ring, and that 696 00:44:01,160 --> 00:44:05,960 Speaker 1: this establishment also had a connection to the Clintons. And 697 00:44:07,120 --> 00:44:10,520 Speaker 1: there were a lot of different people posting on a 698 00:44:10,960 --> 00:44:14,560 Speaker 1: lot of different platforms that were trying to substantiate these 699 00:44:14,600 --> 00:44:19,239 Speaker 1: claims. Again, under investigation, there was no evidence to 700 00:44:19,280 --> 00:44:22,560 Speaker 1: support any of it, but that wasn't what was important. 701 00:44:22,600 --> 00:44:28,080 Speaker 1: What was important was that groundswell of people making 702 00:44:28,120 --> 00:44:32,560 Speaker 1: these allegations and then immediately responding to any criticism by saying, 703 00:44:32,560 --> 00:44:36,080 Speaker 1: you're trying to cover this up, and that is unacceptable. 704 00:44:36,080 --> 00:44:40,719 Speaker 1: There are children and their welfare at stake, and 705 00:44:40,840 --> 00:44:44,040 Speaker 1: how dare you? And I mean, that's a very powerful 706 00:44:44,480 --> 00:44:49,400 Speaker 1: emotional lever to pull. So it was really effective, at 707 00:44:49,480 --> 00:44:54,960 Speaker 1: least in certain circles. Yes. And also, it's one of 708 00:44:55,000 --> 00:45:02,239 Speaker 1: those arguments where you are confirming someone's bias that's 709 00:45:02,280 --> 00:45:05,160 Speaker 1: directed towards a certain group. Right.
So that's one of 710 00:45:05,160 --> 00:45:07,840 Speaker 1: the reasons why we really wanted to talk about this topic, 711 00:45:07,880 --> 00:45:10,360 Speaker 1: too: the idea of why these things get 712 00:45:10,560 --> 00:45:13,000 Speaker 1: spread so fast and why they get adopted so quickly. 713 00:45:13,280 --> 00:45:16,480 Speaker 1: And it's a couple of different things. One of 714 00:45:16,520 --> 00:45:19,520 Speaker 1: the big things is just the way human behavior tends 715 00:45:19,560 --> 00:45:22,880 Speaker 1: to work. So, say, if you come up to me 716 00:45:22,920 --> 00:45:25,640 Speaker 1: and you tell me something that seems to confirm 717 00:45:25,680 --> 00:45:28,600 Speaker 1: a bias I already have, I am more inclined to 718 00:45:28,640 --> 00:45:32,880 Speaker 1: believe you, right, because you seem to be telling me 719 00:45:33,000 --> 00:45:37,640 Speaker 1: something that reinforces a belief I already possess. So 720 00:45:38,880 --> 00:45:43,680 Speaker 1: if I believe I have a sweet, sweet style, despite 721 00:45:43,680 --> 00:45:46,960 Speaker 1: all evidence to the contrary, and you come up and 722 00:45:46,960 --> 00:45:50,600 Speaker 1: you say, wow, you've really got that outfit really well 723 00:45:50,640 --> 00:45:54,880 Speaker 1: put together, you really are looking great today, I can 724 00:45:54,960 --> 00:45:57,120 Speaker 1: sit there and think like, oh yeah, I knew I 725 00:45:57,200 --> 00:46:00,000 Speaker 1: have a sweet, sweet style, and Shay has just reinforced 726 00:46:00,200 --> 00:46:05,480 Speaker 1: that belief. So while any objective viewer might look at 727 00:46:05,480 --> 00:46:08,200 Speaker 1: me and think, this guy doesn't even know how to 728 00:46:08,239 --> 00:46:11,680 Speaker 1: match his socks, I'm walking down the street thinking I've 729 00:46:11,719 --> 00:46:16,799 Speaker 1: got sweet, sweet style.
Yes. But in that case, that 730 00:46:16,920 --> 00:46:20,160 Speaker 1: was... that's harmless, right? But if I have, let's say, 731 00:46:20,160 --> 00:46:24,120 Speaker 1: a different belief. Let's say that I believe that 732 00:46:24,200 --> 00:46:28,040 Speaker 1: an organization like the Red Cross... let's say I have 733 00:46:28,080 --> 00:46:30,440 Speaker 1: a belief that the Red Cross is actually doing something 734 00:46:30,440 --> 00:46:35,480 Speaker 1: really nefarious. And then you were to come to me 735 00:46:35,880 --> 00:46:42,239 Speaker 1: and say, I just heard that over in South Carolina, 736 00:46:42,320 --> 00:46:45,680 Speaker 1: the Red Cross has quarantined this entire community, and they 737 00:46:45,719 --> 00:46:48,160 Speaker 1: won't tell anyone why. And it's like, it doesn't even 738 00:46:48,200 --> 00:46:50,360 Speaker 1: have to make any sense, right? It doesn't 739 00:46:50,360 --> 00:46:52,000 Speaker 1: have to make any sense, you don't 740 00:46:52,040 --> 00:46:54,200 Speaker 1: have to have any corroborating evidence. 741 00:46:54,920 --> 00:46:58,640 Speaker 1: If it plays into a belief I already have, I 742 00:46:58,680 --> 00:47:02,080 Speaker 1: am more inclined to believe you. So that's part of it. 743 00:47:02,400 --> 00:47:05,359 Speaker 1: If it plays on a fear I have, I'm more 744 00:47:05,360 --> 00:47:08,279 Speaker 1: inclined to believe you, because if I'm already afraid of 745 00:47:08,320 --> 00:47:11,480 Speaker 1: something and you tell me that that fear is 746 00:47:12,680 --> 00:47:16,759 Speaker 1: rational or is, you know, legitimate because of some 747 00:47:16,840 --> 00:47:21,040 Speaker 1: other reason, I'm more inclined to believe you.
On top 748 00:47:21,120 --> 00:47:25,960 Speaker 1: of that, we have these Internet communities, and in 749 00:47:26,000 --> 00:47:28,160 Speaker 1: some cases you might have something like a subreddit, 750 00:47:28,480 --> 00:47:30,440 Speaker 1: where you have these people who share some sort of 751 00:47:30,480 --> 00:47:33,840 Speaker 1: common interest or common trait. They will reinforce their beliefs. 752 00:47:33,840 --> 00:47:37,120 Speaker 1: It becomes an echo chamber. But you also have platforms 753 00:47:37,160 --> 00:47:40,719 Speaker 1: like Facebook, and Facebook, the way it works is that 754 00:47:40,800 --> 00:47:44,200 Speaker 1: when a post gets a lot of engagement, when people 755 00:47:44,239 --> 00:47:46,839 Speaker 1: are leaving comments, when people are sharing it, when people 756 00:47:46,880 --> 00:47:49,839 Speaker 1: are liking it, you know, when people are replying to 757 00:47:49,840 --> 00:47:55,439 Speaker 1: one another, Facebook sees that activity, and Facebook makes money 758 00:47:55,480 --> 00:47:58,879 Speaker 1: when people are spending time on Facebook. So if they 759 00:47:58,920 --> 00:48:01,719 Speaker 1: see, if they being an algorithm, actually, it's not even 760 00:48:01,719 --> 00:48:07,200 Speaker 1: a person, if the Facebook algorithm identifies certain posts as 761 00:48:07,239 --> 00:48:11,680 Speaker 1: being really productive from an engagement standpoint, those posts are 762 00:48:11,760 --> 00:48:15,880 Speaker 1: valuable, because it means when people see this, they spend 763 00:48:15,960 --> 00:48:18,680 Speaker 1: more time on the platform. When they spend more time 764 00:48:18,680 --> 00:48:22,040 Speaker 1: on the platform, we make more money. So let's promote 765 00:48:22,760 --> 00:48:26,560 Speaker 1: the posts that are getting this kind of crazy engagement.
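[Editor's note: the engagement-driven promotion loop described above can be sketched in a few lines of Python. The weights and post fields here are illustrative assumptions, not Facebook's actual (non-public) ranking system; the point is only that a score built purely from likes, comments, and shares promotes whatever drives engagement, with no term for accuracy.]

```python
# Illustrative sketch of engagement-driven ranking, as described above.
# Field names and weights are hypothetical; real feed-ranking systems
# are far more complex and are not public.

def engagement_score(post):
    # Comments and shares weighted above likes, on the (assumed) theory
    # that they keep people interacting on the platform longer.
    return (post["likes"] * 1.0
            + post["comments"] * 4.0
            + post["shares"] * 8.0)

def rank_feed(posts):
    # Promote the highest-engagement posts, true or not --
    # nothing in the score measures accuracy.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "measured-news", "likes": 120, "comments": 5, "shares": 2},
    {"id": "outrage-bait", "likes": 40, "comments": 90, "shares": 60},
]
print([p["id"] for p in rank_feed(posts)])  # outrage-bait ranks first
```

Note how the lower-liked but heavily shared post wins: any post that provokes replies and shares outranks a calmer, more accurate one under a score like this.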
766 00:48:27,480 --> 00:48:32,520 Speaker 1: And so that's why you see posts that are not 767 00:48:32,560 --> 00:48:39,719 Speaker 1: necessarily sincere or earnest, or accurate or truthful, get promoted 768 00:48:40,120 --> 00:48:41,960 Speaker 1: on sites like Facebook. In fact, that's one of the 769 00:48:42,040 --> 00:48:45,960 Speaker 1: big reasons why the United States government has called Facebook 770 00:48:46,000 --> 00:48:50,560 Speaker 1: to task on multiple occasions for perpetuating, you know, quote 771 00:48:50,640 --> 00:48:54,879 Speaker 1: unquote fake news, which is a term I hate, because it's 772 00:48:54,960 --> 00:49:00,640 Speaker 1: often used to discredit actual news while not being applied 773 00:49:00,680 --> 00:49:04,440 Speaker 1: to what is truly fake news. But that's neither here 774 00:49:04,480 --> 00:49:11,799 Speaker 1: nor there. So it's a tricky situation, because as a 775 00:49:11,800 --> 00:49:16,240 Speaker 1: human being... what I'm asking 776 00:49:16,280 --> 00:49:19,360 Speaker 1: you to do is to engage your critical thinking skills 777 00:49:19,880 --> 00:49:25,719 Speaker 1: even when you're encountering something that seems to be confirming 778 00:49:25,760 --> 00:49:28,560 Speaker 1: your beliefs. Like, you have to apply 779 00:49:28,640 --> 00:49:32,360 Speaker 1: critical thinking even in those cases, especially in those cases, 780 00:49:32,680 --> 00:49:36,560 Speaker 1: because that's when you're most vulnerable to accepting a lie 781 00:49:36,600 --> 00:49:39,400 Speaker 1: as the truth. When it is telling you something that 782 00:49:39,480 --> 00:49:42,520 Speaker 1: you want to hear, or it's reaffirming something you were 783 00:49:42,600 --> 00:49:46,320 Speaker 1: afraid of, that's when you have to apply those critical 784 00:49:46,360 --> 00:49:52,080 Speaker 1: thinking skills the most.
Because the people who are perpetuating 785 00:49:52,120 --> 00:49:55,960 Speaker 1: this stuff, they're aware, at least on a 786 00:49:56,000 --> 00:49:59,399 Speaker 1: certain level, of human psychology and how these things work 787 00:49:59,520 --> 00:50:02,160 Speaker 1: and how messages can be delivered in a way that 788 00:50:02,280 --> 00:50:05,839 Speaker 1: will get the broadest acceptance. So you have to keep 789 00:50:05,840 --> 00:50:09,840 Speaker 1: that in mind: they're gamifying the system, and the 790 00:50:09,840 --> 00:50:11,880 Speaker 1: system happens to be the way people think and the 791 00:50:11,920 --> 00:50:17,960 Speaker 1: way these communities operate. To quote my favorite musical ever 792 00:50:18,719 --> 00:50:21,960 Speaker 1: created, Reefer Madness the Musical. Okay, I just thought it was gonna 793 00:50:21,960 --> 00:50:24,000 Speaker 1: be Repo! The Genetic Opera, but go ahead. 794 00:50:24,040 --> 00:50:26,200 Speaker 1: I mean, I guess we could find a connection there 795 00:50:26,200 --> 00:50:30,400 Speaker 1: as well. But the very last line in Reefer 796 00:50:30,440 --> 00:50:33,759 Speaker 1: Madness the Musical is: when danger's near, exploit their fear. 797 00:50:33,880 --> 00:50:36,719 Speaker 1: The end will justify the means. Yeah, so this is 798 00:50:36,760 --> 00:50:39,880 Speaker 1: not a new idea. I mean, propaganda 799 00:50:40,040 --> 00:50:43,640 Speaker 1: is largely based on this same principle. In fact, you know, 800 00:50:43,719 --> 00:50:45,279 Speaker 1: you could say that a lot of the people who 801 00:50:45,280 --> 00:50:48,759 Speaker 1: are creating this misinformation, they're just following in the same 802 00:50:48,760 --> 00:50:54,759 Speaker 1: footsteps as ad executives and political operatives 803 00:50:54,920 --> 00:51:00,680 Speaker 1: have for decades or longer.
Fakery has always 804 00:51:00,719 --> 00:51:03,400 Speaker 1: been part of advertising, pretty much. Oh, absolutely. I 805 00:51:03,440 --> 00:51:07,200 Speaker 1: mean, you look back at, like... The series Mad 806 00:51:07,239 --> 00:51:12,840 Speaker 1: Men was based off of actual advertising executives who really 807 00:51:13,000 --> 00:51:17,680 Speaker 1: did find incredibly creative ways to convince people to buy 808 00:51:17,719 --> 00:51:20,839 Speaker 1: stuff that they didn't want or need. And the same 809 00:51:20,880 --> 00:51:25,839 Speaker 1: thing is true now about much more critical stuff, like 810 00:51:26,200 --> 00:51:29,759 Speaker 1: who our leaders are, or what policies we should adopt, 811 00:51:30,320 --> 00:51:34,000 Speaker 1: or what groups we should, you know, pay attention to. 812 00:51:34,840 --> 00:51:38,640 Speaker 1: So, you know, while we started 813 00:51:38,640 --> 00:51:42,719 Speaker 1: off kind of talking about folklore, the important thing is 814 00:51:42,760 --> 00:51:45,600 Speaker 1: that we need to be able to separate fiction from truth. 815 00:51:46,000 --> 00:51:48,920 Speaker 1: We can still enjoy fiction. I think stuff like creepypasta 816 00:51:48,920 --> 00:51:52,239 Speaker 1: definitely has a place. It just has 817 00:51:52,239 --> 00:51:55,359 Speaker 1: a place as, hey, let's get around 818 00:51:55,400 --> 00:51:58,600 Speaker 1: the fire and tell ghost stories. Not: come here, you 819 00:51:58,680 --> 00:52:03,440 Speaker 1: need to hear this because it might save your life. Yeah, right, 820 00:52:04,440 --> 00:52:07,239 Speaker 1: that's the difference. We have to be able to differentiate 821 00:52:07,280 --> 00:52:10,520 Speaker 1: those two things. If we don't, we're more likely than 822 00:52:10,560 --> 00:52:15,680 Speaker 1: not to act on incorrect information. So, Jeff the Killer 823 00:52:15,760 --> 00:52:17,920 Speaker 1: is not really in your bedroom.
I sure hope not. 824 00:52:18,440 --> 00:52:22,120 Speaker 1: I didn't have time to clean. I would feel so awkward. 825 00:52:22,160 --> 00:52:26,080 Speaker 1: I'd be like, I'm so sorry for the mess. You've 826 00:52:26,120 --> 00:52:29,719 Speaker 1: been hanging around me too much, apologizing. Just kill 827 00:52:29,800 --> 00:52:34,400 Speaker 1: me now, because I feel so embarrassed. He'd be like, listen... 828 00:52:34,480 --> 00:52:37,600 Speaker 1: Maybe he would help you clean it first. I'd say, 829 00:52:37,640 --> 00:52:42,640 Speaker 1: you made your bed, but you clearly haven't. Yeah, well, 830 00:52:42,680 --> 00:52:46,479 Speaker 1: I mean, he's just gonna dirty it up anyway. Yeah, 831 00:52:46,760 --> 00:52:50,880 Speaker 1: that's true. Just stabby stab stab, which ironically was the 832 00:52:51,000 --> 00:52:55,440 Speaker 1: name of the operation that the two girls who carried 833 00:52:55,440 --> 00:52:58,240 Speaker 1: out the stabbing used. Yeah, they called it stabby stab stab. 834 00:52:58,719 --> 00:53:03,960 Speaker 1: That is... that's actually a thing. Yeah. I thought I was 835 00:53:04,000 --> 00:53:06,279 Speaker 1: just pulling that out of thin air, but then I remembered, nope, 836 00:53:06,320 --> 00:53:08,520 Speaker 1: that's what they called it. So that was a callback, 837 00:53:08,960 --> 00:53:11,440 Speaker 1: not on purpose. That was just my brain sabotaging me. 838 00:53:13,440 --> 00:53:16,879 Speaker 1: Not creative. I mean, I know the lack 839 00:53:16,920 --> 00:53:18,840 Speaker 1: of creativity is not the worst part of the story, 840 00:53:18,960 --> 00:53:23,640 Speaker 1: but it's a problematic part of it. Well, Shay, thank you for 841 00:53:23,719 --> 00:53:27,160 Speaker 1: your suggestion. Thank you for letting me come on the 842 00:53:27,360 --> 00:53:30,279 Speaker 1: show and annoy you. You didn't annoy me at all.
843 00:53:30,360 --> 00:53:32,880 Speaker 1: It was very helpful, and I am glad to have 844 00:53:33,080 --> 00:53:37,200 Speaker 1: someone to talk with so that, you know, we 845 00:53:37,239 --> 00:53:39,600 Speaker 1: can have that sort of conversation. And, you know, 846 00:53:40,239 --> 00:53:42,479 Speaker 1: you introduced me to stuff I had not heard about, 847 00:53:42,520 --> 00:53:45,320 Speaker 1: and I find a lot of it fascinating. You introduced 848 00:53:45,320 --> 00:53:47,440 Speaker 1: me to something I didn't know about that was written 849 00:53:47,480 --> 00:53:52,080 Speaker 1: by a guy I do know, which was weird. Yeah, yeah, 850 00:53:52,120 --> 00:53:54,160 Speaker 1: that was odd. Now, I watched the show, like, in 851 00:53:54,239 --> 00:53:57,160 Speaker 1: its entirety. Now, he wrote the original creepypasta. 852 00:53:57,200 --> 00:54:00,720 Speaker 1: I don't think he was connected to the show, except 853 00:54:00,800 --> 00:54:02,920 Speaker 1: maybe he probably got a credit. Yeah, he probably got 854 00:54:02,920 --> 00:54:05,360 Speaker 1: like a writer's credit or maybe even like a producer's credit. 855 00:54:05,440 --> 00:54:07,759 Speaker 1: I honestly don't know. And honestly, it's not like 856 00:54:07,960 --> 00:54:10,680 Speaker 1: we're friends or anything. I literally interviewed him for an article. 857 00:54:10,840 --> 00:54:12,600 Speaker 1: So what he's saying is that he wants to be 858 00:54:12,600 --> 00:54:16,200 Speaker 1: your friend. I mean, that's true. It's true, 859 00:54:16,280 --> 00:54:20,520 Speaker 1: but it's not not relevant.
Okay, so if you guys 860 00:54:20,560 --> 00:54:24,200 Speaker 1: have suggestions for future episodes of TechStuff, or maybe 861 00:54:24,239 --> 00:54:26,280 Speaker 1: there's someone you want me to have on as a guest, 862 00:54:26,400 --> 00:54:28,920 Speaker 1: or maybe you think, why don't you have that Shay 863 00:54:28,920 --> 00:54:31,920 Speaker 1: person on again? You need to let me know. You 864 00:54:31,960 --> 00:54:34,799 Speaker 1: can reach out to me on Facebook or Twitter. The 865 00:54:34,840 --> 00:54:37,680 Speaker 1: handle for both of those is TechStuff HS 866 00:54:37,880 --> 00:54:45,480 Speaker 1: W, and I'll talk to you again really soon. Tech 867 00:54:45,480 --> 00:54:48,160 Speaker 1: Stuff is a production of iHeartRadio's How Stuff Works. 868 00:54:48,320 --> 00:54:51,160 Speaker 1: For more podcasts from iHeartRadio, visit the 869 00:54:51,239 --> 00:54:54,480 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen to 870 00:54:54,520 --> 00:54:55,440 Speaker 1: your favorite shows.