1 00:00:00,200 --> 00:00:27,479 Speaker 1: Ridiculous History is a production of iHeartRadio. Welcome back to 2 00:00:27,520 --> 00:00:30,840 Speaker 1: the show, fellow Ridiculous Historians. Thank you, as always, so 3 00:00:30,920 --> 00:00:33,360 Speaker 1: much for tuning in. Let's hear it for the man, the myth, 4 00:00:33,400 --> 00:00:38,360 Speaker 1: the legend, our super producer, mister Max Williams. Max Williams. 5 00:00:38,440 --> 00:00:41,519 Speaker 1: And who is that cheering on our super producer? Why, 6 00:00:41,600 --> 00:00:45,360 Speaker 1: it's none other than mister Noel Brown. Noel, how are 7 00:00:45,360 --> 00:00:45,839 Speaker 1: we feeling? 8 00:00:45,960 --> 00:00:46,440 Speaker 2: I'm all right. 9 00:00:46,640 --> 00:00:49,120 Speaker 1: How are you? I'm okay, man, I'm okay. Got a 10 00:00:49,120 --> 00:00:49,360 Speaker 1: lot of... 11 00:00:50,080 --> 00:00:53,800 Speaker 3: Our run of dreary weather here in Atlanta finally broke, and 12 00:00:54,240 --> 00:00:57,600 Speaker 3: that had been bumming me out. So I'm feeling pretty great today, really, 13 00:00:57,720 --> 00:00:58,320 Speaker 3: I, you know. 14 00:00:58,640 --> 00:01:01,800 Speaker 1: Oh, I'm Ben Bowlin, by the way. I feel weird, Noel, 15 00:01:01,840 --> 00:01:04,840 Speaker 1: because I had just perfected the newest update on my 16 00:01:04,959 --> 00:01:07,840 Speaker 1: rainy day playlist, you know what I mean, and I 17 00:01:07,880 --> 00:01:09,280 Speaker 1: didn't do it in time for the weather. 18 00:01:09,640 --> 00:01:11,720 Speaker 2: What are we talking here? We're talking like chill hip hop 19 00:01:11,720 --> 00:01:12,760 Speaker 2: beats to study 20 00:01:12,400 --> 00:01:18,000 Speaker 1: to? Or... I love, love a Godspeed. 21 00:01:18,440 --> 00:01:18,600 Speaker 4: Uh? 22 00:01:18,720 --> 00:01:21,520 Speaker 1: Got back into some Shirley Manson stuff. We'd love 23 00:01:21,520 --> 00:01:27,559 Speaker 1: to hear yours. Garbage? Uh huh, yes, yes, Garbage.
Pistoln... 24 00:01:28,840 --> 00:01:33,679 Speaker 1: Speaking of the opposite of garbage, we are thrilled, you, 25 00:01:33,880 --> 00:01:37,560 Speaker 1: Max, and myself, by a new podcast that is right 26 00:01:37,680 --> 00:01:42,440 Speaker 1: up our alley: Hoax, created by none other than the 27 00:01:42,560 --> 00:01:46,840 Speaker 1: legendary Lizzie Logan and Dana Schwartz. And Noel, we have 28 00:01:46,920 --> 00:01:49,360 Speaker 1: a special surprise for our Ridiculous Historians. 29 00:01:49,520 --> 00:01:50,560 Speaker 2: Is it true? Are they here? 30 00:01:50,600 --> 00:01:57,080 Speaker 1: They're here, they're here! Welcome to the show. 31 00:01:57,480 --> 00:02:05,160 Speaker 2: So happy to have y'all. 32 00:02:05,280 --> 00:02:09,120 Speaker 1: Let's get right into it. First, tell us a little 33 00:02:09,120 --> 00:02:11,920 Speaker 1: bit about your background. I think you're going to be 34 00:02:12,000 --> 00:02:15,520 Speaker 1: both familiar to a lot of our listeners already. Could 35 00:02:15,520 --> 00:02:18,200 Speaker 1: you tell us how you guys teamed up and what 36 00:02:18,360 --> 00:02:22,359 Speaker 1: inspired you to create a show all about hoaxes? 37 00:02:22,800 --> 00:02:26,639 Speaker 4: Lizzie, I was pointing at you, Dana, as 38 00:02:26,720 --> 00:02:28,840 Speaker 4: this podcast was your idea. 39 00:02:29,760 --> 00:02:32,680 Speaker 5: Okay, well, I'm Dana Schwartz. I'm the host of the 40 00:02:33,200 --> 00:02:37,359 Speaker 5: history podcast Noble Blood, which is a scripted podcast where 41 00:02:37,400 --> 00:02:41,320 Speaker 5: we talk about the history of nobles, you know, and 42 00:02:41,360 --> 00:02:45,640 Speaker 5: their interesting, usually bloody stories. But a fun fact about 43 00:02:45,680 --> 00:02:49,160 Speaker 5: me is I'm real life human friends with Lizzie Logan, 44 00:02:49,600 --> 00:02:52,079 Speaker 5: and I had this idea.
I've always been fascinated by 45 00:02:52,240 --> 00:02:55,160 Speaker 5: historical hoaxes. There have been a few Noble Blood episodes 46 00:02:55,200 --> 00:02:59,680 Speaker 5: that sort of touch on hoaxes, fake royals, you know, 47 00:03:00,040 --> 00:03:04,600 Speaker 5: Anastasia, stories like that. And I'm captivated by the idea 48 00:03:05,160 --> 00:03:08,720 Speaker 5: of why people believe things that aren't true, and the 49 00:03:08,760 --> 00:03:11,720 Speaker 5: people that purport that they are true. And so I 50 00:03:11,760 --> 00:03:14,760 Speaker 5: sent a text message to Lizzie Logan, my real life friend, 51 00:03:15,040 --> 00:03:18,760 Speaker 5: about this idea, and we noodled it together, and a 52 00:03:18,840 --> 00:03:20,519 Speaker 5: year and change later, here we are. 53 00:03:21,200 --> 00:03:24,320 Speaker 4: It's true. And I'm, I'm a little bit more of 54 00:03:24,360 --> 00:03:31,480 Speaker 4: the modern day scam scandal fan. I love, like, I was, 55 00:03:31,840 --> 00:03:34,880 Speaker 4: I was on the Theranos beat, like, from the beginning. 56 00:03:35,120 --> 00:03:39,880 Speaker 4: I was fascinated by that. Like, fraudsters, Anna Delvey, like, 57 00:03:40,080 --> 00:03:42,880 Speaker 4: anything like that that's in the news, I am, like, 58 00:03:43,360 --> 00:03:48,520 Speaker 4: fascinated by. So it's been really fun to research these 59 00:03:49,000 --> 00:03:53,920 Speaker 4: with Dana and find out that people have been hucksters 60 00:03:54,280 --> 00:03:57,200 Speaker 4: throughout history. We think of this as sometimes a new 61 00:03:57,200 --> 00:04:01,160 Speaker 4: phenomenon of, like, oh, people are falling for these scams 62 00:04:01,200 --> 00:04:04,680 Speaker 4: and these liars these days, now you can't trust 63 00:04:04,680 --> 00:04:07,720 Speaker 4: anybody anymore. It's like, oh, you could actually never trust anybody.
64 00:04:07,880 --> 00:04:10,600 Speaker 4: This has actually been going on since the dawn of lying. 65 00:04:11,160 --> 00:04:13,920 Speaker 3: Well, it couldn't come at a more appropriate time, because 66 00:04:13,960 --> 00:04:16,280 Speaker 3: it would seem that just about everything we hear about 67 00:04:16,320 --> 00:04:19,240 Speaker 3: is a hoax, if it is unflattering to certain folks 68 00:04:19,240 --> 00:04:22,880 Speaker 3: in government these days. 69 00:04:22,800 --> 00:04:23,840 Speaker 2: It had to put it out. 70 00:04:23,920 --> 00:04:27,120 Speaker 5: Very cynically call things hoaxes. And I think, like, unfortunately, 71 00:04:27,200 --> 00:04:32,360 Speaker 5: at this moment, when AI and deepfakes are so pervasive, 72 00:04:32,839 --> 00:04:37,840 Speaker 5: the fundamental question of whether you should believe some information 73 00:04:37,920 --> 00:04:40,280 Speaker 5: that's put in front of you, I think, is very critical. 74 00:04:40,320 --> 00:04:43,520 Speaker 5: And so by using these anecdotes, these fun stories... I mean, 75 00:04:43,560 --> 00:04:47,080 Speaker 5: our first episode is about two young girls in nineteen 76 00:04:47,160 --> 00:04:51,080 Speaker 5: seventeen saying that they took pictures of fairies. It seems silly, 77 00:04:51,160 --> 00:04:54,440 Speaker 5: but the bigger questions about that, I think, are unfortunately 78 00:04:54,520 --> 00:04:57,400 Speaker 5: relevant today. Just, like, scrolling through Facebook, and you see 79 00:04:57,400 --> 00:05:01,200 Speaker 5: these, like, AI slop stories that get, like, thousands of 80 00:05:01,400 --> 00:05:03,560 Speaker 5: likes, about someone, like, being like, no one came to 81 00:05:03,600 --> 00:05:06,520 Speaker 5: my birthday party, and it's, like, a cartoon of a one 82 00:05:06,600 --> 00:05:07,880 Speaker 5: hundred and twenty year old woman.
83 00:05:08,400 --> 00:05:10,040 Speaker 3: Well, it's like, I saw one today where it was 84 00:05:10,120 --> 00:05:12,880 Speaker 3: Leonardo DiCaprio covers his face with a cap at Jeff 85 00:05:12,920 --> 00:05:16,520 Speaker 3: Bezos's wedding, and it's such an unremarkable and likely thing 86 00:05:16,640 --> 00:05:18,560 Speaker 3: to have happened that you look at it and you're like, yeah, 87 00:05:18,600 --> 00:05:20,760 Speaker 3: sure, that happens. But then I look down at the comments, 88 00:05:20,800 --> 00:05:23,040 Speaker 3: and it's like, this is totally AI. It was based on 89 00:05:23,080 --> 00:05:26,159 Speaker 3: a still image that was then AI-ified to, like, animate. 90 00:05:26,560 --> 00:05:30,600 Speaker 3: And, you know, we're all so willing, maybe even wanting, to 91 00:05:30,600 --> 00:05:32,560 Speaker 3: be the first one to share something, that all of 92 00:05:32,600 --> 00:05:35,560 Speaker 3: this stuff just makes it even more difficult to sort 93 00:05:35,600 --> 00:05:38,520 Speaker 3: things out, because it just capitalizes on that wanting to 94 00:05:38,560 --> 00:05:41,400 Speaker 3: be the first one to break the news on something. 95 00:05:41,480 --> 00:05:43,719 Speaker 3: That impulse is already kind of depriving you of some 96 00:05:43,800 --> 00:05:46,400 Speaker 3: critical thinking at a very crucial moment, when this kind 97 00:05:46,440 --> 00:05:48,240 Speaker 3: of information gets spread like wildfire. 98 00:05:48,279 --> 00:05:51,760 Speaker 1: We want to be Promethean, right, in the attention economy? 99 00:05:51,839 --> 00:05:54,039 Speaker 1: Where are my likes? Where are my reposts? 100 00:05:54,880 --> 00:05:55,200 Speaker 4: This is? 101 00:05:55,720 --> 00:05:59,520 Speaker 1: This is something that we've explored often, Lizzie and Dana, in 102 00:05:59,680 --> 00:06:02,400 Speaker 1: another show we do called Stuff They Don't 103 00:06:02,440 --> 00:06:04,640 Speaker 1: Want You to Know.
And I think one thing that 104 00:06:04,760 --> 00:06:09,960 Speaker 1: really interests Noel and myself and Max and all our 105 00:06:10,040 --> 00:06:16,320 Speaker 1: fellow audience members tonight is the idea that a lot 106 00:06:16,360 --> 00:06:20,080 Speaker 1: of people... especially before AI slop, a lot of us 107 00:06:20,120 --> 00:06:23,240 Speaker 1: assumed that it was just easier to pull off a 108 00:06:23,320 --> 00:06:26,680 Speaker 1: successful hoax in ages past, right? Just like, you know, 109 00:06:26,760 --> 00:06:29,200 Speaker 1: every true crime show says, well, you know, it was just 110 00:06:29,240 --> 00:06:33,599 Speaker 1: easier to murder people back then. So do we think 111 00:06:34,080 --> 00:06:37,200 Speaker 1: that it was easier to pull off a successful hoax 112 00:06:37,320 --> 00:06:41,720 Speaker 1: in the past, or has AI changed that conversation entirely? 113 00:06:41,400 --> 00:06:43,520 Speaker 3: And that lack of critical thinking and that kind of, 114 00:06:43,560 --> 00:06:44,960 Speaker 3: like, you know, knee jerk reaction. 115 00:06:45,560 --> 00:06:49,560 Speaker 5: I'm gonna say yes and no. Yes, it was easier 116 00:06:49,600 --> 00:06:53,240 Speaker 5: to pull off a hoax because there were fewer mechanisms 117 00:06:53,240 --> 00:06:56,040 Speaker 5: of, like, immediate fact checking. Like, if you say something 118 00:06:56,040 --> 00:06:58,880 Speaker 5: to someone, someone can't just, like, quick google it to 119 00:06:58,880 --> 00:07:01,920 Speaker 5: see if it's real. You know, if someone says they're Anastasia, 120 00:07:02,000 --> 00:07:04,960 Speaker 5: you can't do a genetic test immediately and prove that 121 00:07:04,960 --> 00:07:07,920 Speaker 5: that's not the case. The Princess Anastasia, you know, talking 122 00:07:07,920 --> 00:07:09,640 Speaker 5: about the Romanov princess. 123 00:07:09,400 --> 00:07:10,560 Speaker 2: From the animated film.
124 00:07:10,600 --> 00:07:12,440 Speaker 5: From the animated film, and as we know, it was 125 00:07:12,480 --> 00:07:14,720 Speaker 5: a happy ending and she lived happily ever after. Yes. 126 00:07:15,360 --> 00:07:18,200 Speaker 5: So in some ways it was easier to lie, just 127 00:07:18,240 --> 00:07:22,200 Speaker 5: because there were fewer ways of fact checking immediately. But 128 00:07:22,440 --> 00:07:25,360 Speaker 5: I will say, something that continues to come up over 129 00:07:25,360 --> 00:07:27,800 Speaker 5: the course of this podcast is people were not more 130 00:07:27,840 --> 00:07:30,920 Speaker 5: stupid in the past. When we have hoaxes and we say, 131 00:07:30,960 --> 00:07:33,840 Speaker 5: oh my god, everyone believed this, that's not the case. 132 00:07:33,920 --> 00:07:36,600 Speaker 5: With almost every hoax, there were people at the time 133 00:07:36,800 --> 00:07:40,440 Speaker 5: calling bullshit. In our first episode we talked about these 134 00:07:40,480 --> 00:07:44,360 Speaker 5: Cottingley fairies, and it was notable because Sir Arthur Conan Doyle, 135 00:07:44,400 --> 00:07:47,160 Speaker 5: who wrote the Sherlock Holmes stories, someone you would have 136 00:07:47,280 --> 00:07:51,240 Speaker 5: imagined to be a very intelligent man good at deductive reasoning, 137 00:07:51,960 --> 00:07:55,040 Speaker 5: was a big defender of these fairy photographs. 138 00:07:55,560 --> 00:07:57,400 Speaker 2: We have talked about this particular story. 139 00:07:57,520 --> 00:07:59,800 Speaker 3: Yeah, on both Ridiculous History and Stuff They Don't Want 140 00:08:00,080 --> 00:08:01,600 Speaker 3: You to Know. Maybe just to remind us a little bit. 141 00:08:01,640 --> 00:08:04,760 Speaker 2: It was like an early, very, very well done 142 00:08:04,600 --> 00:08:07,560 Speaker 3: Photoshop job that, like, had people convinced that someone had 143 00:08:07,560 --> 00:08:10,560 Speaker 3: photographed, like, these magical creatures.
144 00:08:10,400 --> 00:08:14,600 Speaker 4: Which, I just want to go on record and say, 145 00:08:14,640 --> 00:08:17,720 Speaker 4: if you haven't looked these up, these episodes... 146 00:08:17,640 --> 00:08:18,560 Speaker 2: Ooh, they're nuts. 147 00:08:18,760 --> 00:08:23,080 Speaker 4: Two points, two points. One, Sir Arthur Conan Doyle fell 148 00:08:23,120 --> 00:08:26,080 Speaker 4: for this, and he was a smart guy. Two, most 149 00:08:26,120 --> 00:08:30,760 Speaker 4: people did not fall for this. Three, the photos look 150 00:08:30,880 --> 00:08:34,679 Speaker 4: fake as hell. Like, they do not look real. Like, 151 00:08:34,840 --> 00:08:36,680 Speaker 4: this is, this is the thing that 152 00:08:36,600 --> 00:08:40,200 Speaker 6: people need to understand. Like, they don't, they just, they 153 00:08:40,120 --> 00:08:44,880 Speaker 7: don't look real at all. Like, the, the, the idea 154 00:08:44,920 --> 00:08:47,720 Speaker 7: that Sir Arthur Conan Doyle fell for this is so 155 00:08:48,400 --> 00:08:51,560 Speaker 7: funny to me, because they just don't look real, like, 156 00:08:52,080 --> 00:08:53,240 Speaker 7: on any level. 157 00:08:54,160 --> 00:08:56,360 Speaker 5: He... I mean, we get into it in the episode, 158 00:08:56,400 --> 00:08:58,600 Speaker 5: but he believed it because he wanted to believe it. 159 00:08:58,640 --> 00:09:03,080 Speaker 5: He was a spiritualist. This just reinforced his worldview. But 160 00:09:03,760 --> 00:09:06,559 Speaker 5: plenty of people at the time knew that these photos 161 00:09:06,559 --> 00:09:10,240 Speaker 5: were fake. Plenty of these people were making fun of 162 00:09:10,320 --> 00:09:12,760 Speaker 5: Sir Arthur Conan Doyle for falling for it. There's one 163 00:09:12,840 --> 00:09:17,720 Speaker 5: quote that I love, where someone said, knowing children, and 164 00:09:17,800 --> 00:09:21,120 Speaker 5: knowing that Sir Arthur Conan Doyle has legs, I believe 165 00:09:21,160 --> 00:09:22,640 Speaker 5: these girls pulled his leg.
166 00:09:22,280 --> 00:09:28,760 Speaker 3: Like, yeah, there's no dimension to them, there's no 167 00:09:28,960 --> 00:09:32,320 Speaker 3: light falloff, they're completely flat. I think in 168 00:09:32,400 --> 00:09:35,040 Speaker 3: my mind I was remembering it looking a lot better, 169 00:09:35,080 --> 00:09:36,880 Speaker 3: and I'm looking at it now and you're dead on, 170 00:09:37,040 --> 00:09:38,520 Speaker 3: it really... 171 00:09:38,240 --> 00:09:42,400 Speaker 4: They don't look real. Like, I've seen spooky photos where I'm like, oh, 172 00:09:42,480 --> 00:09:44,600 Speaker 4: I understand how a person in the past, or even 173 00:09:44,640 --> 00:09:46,480 Speaker 4: a person today, would have fallen for this. If you 174 00:09:46,520 --> 00:09:50,280 Speaker 4: don't know enough, exposure-wise... yeah, this is not like 175 00:09:50,440 --> 00:09:52,600 Speaker 4: BuzzFeed creepy photos where you're like, 176 00:09:52,559 --> 00:09:54,760 Speaker 6: oh my god, what is that? This is not that. 177 00:09:55,000 --> 00:09:56,440 Speaker 6: This is a fake looking photo. 178 00:09:56,720 --> 00:10:00,040 Speaker 2: They do not exist in the same space, clearly. 179 00:10:00,440 --> 00:10:02,839 Speaker 5: I mean, really, what they did was, just one 180 00:10:02,880 --> 00:10:06,480 Speaker 5: of the girls just drew fairies and pinned them around 181 00:10:06,520 --> 00:10:10,000 Speaker 5: them with hatpins. So the photo itself wasn't manipulated, which 182 00:10:10,040 --> 00:10:12,320 Speaker 5: is why they were like, we took these photos to 183 00:10:12,400 --> 00:10:16,280 Speaker 5: the expert, expert photography people, and they proved that it 184 00:10:16,320 --> 00:10:18,959 Speaker 5: wasn't a double exposure. It's like, yeah, it wasn't, because 185 00:10:18,960 --> 00:10:22,079 Speaker 5: these girls just had cardboard cutouts of fairies around them. 186 00:10:21,920 --> 00:10:24,800 Speaker 1: And they're nice drawings.
To be fair, we don't want... 187 00:10:25,040 --> 00:10:27,559 Speaker 1: you don't want to call that... Yeah. I do also 188 00:10:27,600 --> 00:10:30,400 Speaker 1: want to say, on a positive note, this would be 189 00:10:30,600 --> 00:10:34,120 Speaker 1: like a really nice Hallmark-y type card to send to 190 00:10:34,200 --> 00:10:34,760 Speaker 1: your family. 191 00:10:34,840 --> 00:10:37,960 Speaker 5: The girl was sixteen. This is like a really 192 00:10:38,000 --> 00:10:40,120 Speaker 5: creative art project that she was doing with 193 00:10:40,080 --> 00:10:42,120 Speaker 2: her cousin. Fun! 194 00:10:42,160 --> 00:10:46,839 Speaker 1: And how did the... how did the... I don't want 195 00:10:46,840 --> 00:10:49,920 Speaker 1: to call them culprits. How did the, uh, let's use 196 00:10:50,120 --> 00:10:54,319 Speaker 1: that: little culprits, those little scamps. Little scamps? Okay, how 197 00:10:54,360 --> 00:10:57,400 Speaker 1: did, how did the, how did the scamps, uh, deal 198 00:10:57,559 --> 00:10:58,160 Speaker 1: with this? 199 00:10:58,600 --> 00:10:58,680 Speaker 2: Uh? 200 00:10:59,080 --> 00:11:02,480 Speaker 1: It became, you know, a subject of conversation on the 201 00:11:02,600 --> 00:11:07,600 Speaker 1: level of a meme going viral, right? So, without spoiling 202 00:11:07,640 --> 00:11:10,120 Speaker 1: too much of the episode, could you tell us how 203 00:11:10,160 --> 00:11:14,240 Speaker 1: those kids dealt with the, the fame and the backlash 204 00:11:14,280 --> 00:11:15,199 Speaker 1: and the controversy? 205 00:11:15,520 --> 00:11:17,120 Speaker 5: I mean, the truth is, I think they were both 206 00:11:17,640 --> 00:11:21,880 Speaker 5: kind of mortified. They were humiliated at school. They were bullied.
207 00:11:22,920 --> 00:11:26,080 Speaker 5: Arthur Conan Doyle originally used a pseudonym when he published 208 00:11:26,120 --> 00:11:28,880 Speaker 5: the story about these photographs in The Strand, but everyone 209 00:11:28,960 --> 00:11:32,480 Speaker 5: still knows it's them. The girls recount, like, photos of them... Yeah, 210 00:11:32,520 --> 00:11:35,760 Speaker 5: because it's photos of them. The girls, you know, both 211 00:11:35,760 --> 00:11:39,720 Speaker 5: recount being mocked at school. The older girl, Elsie, was 212 00:11:39,760 --> 00:11:42,640 Speaker 5: fired from her job because people kept calling and, like, 213 00:11:42,800 --> 00:11:46,840 Speaker 5: trying to... the reporters kept calling and coming there. So 214 00:11:46,920 --> 00:11:49,920 Speaker 5: it was, it was very challenging. And, I mean, 215 00:11:50,120 --> 00:11:52,840 Speaker 5: spoiler alert... no, spoiler alert, it was a hoax. But 216 00:11:53,000 --> 00:11:57,920 Speaker 5: both girls agreed that they would not publicly admit that 217 00:11:58,000 --> 00:12:01,880 Speaker 5: the photos were faked until Sir Arthur Conan Doyle and 218 00:12:02,679 --> 00:12:06,080 Speaker 5: mister Gardner, who was another, like, big proponent of the photographs, 219 00:12:06,240 --> 00:12:07,160 Speaker 5: were both dead. 220 00:12:07,920 --> 00:12:10,679 Speaker 2: Oh jeez. Okay, okay. 221 00:12:10,760 --> 00:12:12,720 Speaker 3: I think the interesting part about this to me, and 222 00:12:12,760 --> 00:12:15,040 Speaker 3: this maybe comes into play with a lot of hoaxes, 223 00:12:15,080 --> 00:12:19,600 Speaker 3: is the power of belief and playing into people's superstitions 224 00:12:19,720 --> 00:12:23,120 Speaker 3: or playing into people's biases or whatever. That, to me, 225 00:12:23,280 --> 00:12:26,280 Speaker 3: is, I think, a huge part of successfully perpetrating a hoax.
226 00:12:27,000 --> 00:12:29,600 Speaker 1: Sure, yeah, tell the people what they want, you know, 227 00:12:29,760 --> 00:12:34,480 Speaker 1: play on that confirmation bias, because who doesn't love feeling... 228 00:12:34,720 --> 00:12:37,880 Speaker 1: beat me here, Max... who doesn't love feeling right? You 229 00:12:37,920 --> 00:12:41,040 Speaker 1: know what I mean? Like, oh, yeah, I did make 230 00:12:41,080 --> 00:12:45,640 Speaker 1: Sherlock Holmes, and fairies are real, and here's the one 231 00:12:45,679 --> 00:12:46,680 Speaker 1: thing that supports it. 232 00:12:46,960 --> 00:12:49,280 Speaker 5: I think another big lesson to take away from the 233 00:12:49,320 --> 00:12:53,480 Speaker 5: hoax is, once you are convinced of something, you can 234 00:12:53,679 --> 00:12:58,080 Speaker 5: justify any possible hole anyone points out in the argument. People 235 00:12:58,160 --> 00:13:01,880 Speaker 5: pointed out that these quote unquote fairies were wearing very 236 00:13:01,960 --> 00:13:07,240 Speaker 5: modern clothing and had very modern hairdos. And he said, well, 237 00:13:07,280 --> 00:13:10,480 Speaker 5: they're, they're thought figures. Fairies are mythical creatures that are 238 00:13:10,480 --> 00:13:13,679 Speaker 5: sort of conjured from your imagination, so that, of course, 239 00:13:13,720 --> 00:13:18,920 Speaker 5: their, their appearance changes over time. Or, counter explanation: the 240 00:13:19,000 --> 00:13:22,559 Speaker 5: reason they look like regular, like, fairies in a story 241 00:13:22,559 --> 00:13:24,920 Speaker 5: book... isn't it, you know, strange that they look 242 00:13:25,240 --> 00:13:28,520 Speaker 5: just exactly how people imagine fairies look? Well, maybe people 243 00:13:28,600 --> 00:13:32,520 Speaker 5: just got it right.
Maybe our popular imagination of fairies 244 00:13:32,880 --> 00:13:35,160 Speaker 5: was based on their reality. And so he just came 245 00:13:35,240 --> 00:13:38,839 Speaker 5: up with these counter explanations for any possible hole someone 246 00:13:38,840 --> 00:13:40,040 Speaker 5: could point to. 247 00:13:40,040 --> 00:13:43,080 Speaker 3: Which is an argument so un-Sherlock Holmes of him, 248 00:13:43,400 --> 00:13:45,280 Speaker 3: isn't it? I mean, it's like, it seems to go 249 00:13:45,360 --> 00:13:47,960 Speaker 3: counter to everything that he stands for in terms of 250 00:13:48,280 --> 00:13:51,440 Speaker 3: that kind of power of reason and deduction and, you know, 251 00:13:51,880 --> 00:13:55,880 Speaker 3: not being swayed by bias and looking at only the facts. 252 00:13:55,960 --> 00:13:57,440 Speaker 1: That's interesting. 253 00:13:57,559 --> 00:14:01,400 Speaker 4: Once people have come to a conclusion, when you're arguing with them, the 254 00:14:01,559 --> 00:14:05,360 Speaker 4: argument is no longer, do fairies exist or not? 255 00:14:05,600 --> 00:14:08,320 Speaker 4: It is, are you wrong or not? It is, are 256 00:14:08,400 --> 00:14:10,960 Speaker 4: you a person who is able to be duped or not? 257 00:14:11,640 --> 00:14:14,880 Speaker 4: So, like, if I've convinced... like, if Dana is convinced 258 00:14:14,920 --> 00:14:18,360 Speaker 4: of something, then if I'm arguing with her, it's not 259 00:14:18,440 --> 00:14:21,520 Speaker 4: really about the thing we're arguing about. It's, am I 260 00:14:21,640 --> 00:14:23,760 Speaker 4: able to convince Dana that she is a person who 261 00:14:23,880 --> 00:14:27,320 Speaker 4: is, like, stupid in some way, or, like, gullible in 262 00:14:27,400 --> 00:14:30,560 Speaker 4: some way? It's hard for her to swallow that she 263 00:14:30,680 --> 00:14:32,280 Speaker 4: is a person who could be wrong.
264 00:14:32,400 --> 00:14:34,880 Speaker 5: Especially if people have been calling me a genius because 265 00:14:34,880 --> 00:14:36,400 Speaker 5: I wrote Sherlock Holmes books. 266 00:14:36,920 --> 00:14:40,000 Speaker 1: Yeah, exactly. Not very Hound of the Baskervilles. I love you, bro. 267 00:14:40,800 --> 00:14:43,600 Speaker 1: But this, this leads us to... I think this is 268 00:14:43,640 --> 00:14:46,760 Speaker 1: a tremendous point that echoes in the modern day: the 269 00:14:46,840 --> 00:14:52,800 Speaker 1: idea of tying one's identity to a conclusion, right, in 270 00:14:52,840 --> 00:14:56,760 Speaker 1: a society that very much does not reward people changing 271 00:14:56,800 --> 00:15:00,000 Speaker 1: their minds, right, or encountering new information. There's a big 272 00:15:00,240 --> 00:15:10,280 Speaker 1: sunk cost fallacy in that kind of thing. And if 273 00:15:10,320 --> 00:15:13,640 Speaker 1: we stick with the world of print for just a moment, 274 00:15:13,680 --> 00:15:17,480 Speaker 1: there's another story you all found and hipped us to 275 00:15:17,920 --> 00:15:21,200 Speaker 1: that I had personally never heard of before: the fake 276 00:15:21,680 --> 00:15:24,200 Speaker 1: first newspaper in England. 277 00:15:24,400 --> 00:15:26,920 Speaker 5: Oh yeah. This became my own sort of mini personal 278 00:15:26,960 --> 00:15:29,840 Speaker 5: crusade, because, if you believe it, this is a hoax 279 00:15:29,880 --> 00:15:32,640 Speaker 5: that, still to this day, around the world, museums and 280 00:15:32,680 --> 00:15:38,240 Speaker 5: websites still write up as true. It is, kind of, 281 00:15:38,280 --> 00:15:40,880 Speaker 5: when you think about it, the original fake news. It 282 00:15:41,000 --> 00:15:45,240 Speaker 5: is a newspaper called The English Mercury.
It was donated 283 00:15:45,280 --> 00:15:47,200 Speaker 5: to the British Museum in sort of a cache of 284 00:15:47,280 --> 00:15:50,640 Speaker 5: papers in seventeen seventy six by this man named Thomas Birch. 285 00:15:50,960 --> 00:15:53,440 Speaker 5: And he donated all these papers that used to belong 286 00:15:53,520 --> 00:15:57,400 Speaker 5: to the second Earl of Hardwick. So these papers are 287 00:15:57,440 --> 00:16:00,360 Speaker 5: just in the museum for a good long time, when 288 00:16:00,480 --> 00:16:06,400 Speaker 5: a researcher named George Chalmers finds them and is astonished, 289 00:16:06,480 --> 00:16:10,480 Speaker 5: because in these papers is this newspaper from fifteen eighty 290 00:16:10,560 --> 00:16:13,960 Speaker 5: eight discussing Sir Francis Drake and the English defeating the 291 00:16:14,000 --> 00:16:17,920 Speaker 5: Spanish Armada, this massive event in English history. And what 292 00:16:17,960 --> 00:16:22,240 Speaker 5: would really be astonishing is that this English Mercury, 293 00:16:22,280 --> 00:16:25,680 Speaker 5: this discovery, would now be the oldest newspaper in the world. 294 00:16:26,040 --> 00:16:28,960 Speaker 5: The actual oldest newspaper in the world is a German 295 00:16:29,000 --> 00:16:33,840 Speaker 5: newspaper from fifteen ninety four written in Latin. So not 296 00:16:33,880 --> 00:16:37,640 Speaker 5: only is this discovery monumental because it's the oldest newspaper 297 00:16:37,680 --> 00:16:40,280 Speaker 5: in the world, it's about a marquee event. It's not 298 00:16:40,400 --> 00:16:42,920 Speaker 5: just like a regular, you know, Tuesday newspaper. It's, like, 299 00:16:42,960 --> 00:16:46,840 Speaker 5: about the Spanish Armada. And it's proving, to this 300 00:16:47,200 --> 00:16:50,920 Speaker 5: British person reading it, that Englishmen have invented the newspaper.
301 00:16:50,960 --> 00:16:53,120 Speaker 5: We thought it was German, but it was us. And 302 00:16:53,160 --> 00:16:57,520 Speaker 5: so they're so excited, and for almost half a century 303 00:16:57,560 --> 00:17:00,440 Speaker 5: people just take that as fact. And it's not until 304 00:17:00,480 --> 00:17:03,200 Speaker 5: forty five years later that a researcher going through the 305 00:17:03,240 --> 00:17:08,000 Speaker 5: British Museum archives realizes that the manuscript is so obviously 306 00:17:08,080 --> 00:17:11,600 Speaker 5: a hoax, like, based on the typeface, based on some spellings. 307 00:17:11,880 --> 00:17:14,480 Speaker 5: There's an original manuscript version of it that's in the 308 00:17:14,600 --> 00:17:18,159 Speaker 5: handwriting of the second Earl of Hardwick, with some corrections 309 00:17:18,200 --> 00:17:21,280 Speaker 5: from that friend, Sir Thomas Birch. And what this researcher 310 00:17:21,320 --> 00:17:24,719 Speaker 5: figures out is that the newspaper was basically just, like, a 311 00:17:24,840 --> 00:17:28,800 Speaker 5: creative writing exercise that the Earl did with his friend. 312 00:17:29,040 --> 00:17:31,360 Speaker 5: Like, it feels almost like an assignment 313 00:17:31,400 --> 00:17:33,919 Speaker 5: you would get in fifth grade, where it's like, imagine 314 00:17:33,920 --> 00:17:36,600 Speaker 5: that you are at a historical event and you are 315 00:17:36,600 --> 00:17:40,040 Speaker 5: a reporter. Like, what would a reporter say about the 316 00:17:40,080 --> 00:17:44,120 Speaker 5: invasion of the Spanish Armada? Like, the Earl and his 317 00:17:44,160 --> 00:17:46,760 Speaker 5: friend just did this as a writing exercise, like a 318 00:17:46,880 --> 00:17:50,159 Speaker 5: quote unquote literary game. And they weren't even trying to 319 00:17:50,520 --> 00:17:53,600 Speaker 5: hoax anyone, to be fair; they just donated it among 320 00:17:53,680 --> 00:17:57,479 Speaker 5: all their papers.
And I find that very charming. And 321 00:17:57,560 --> 00:18:00,480 Speaker 5: still to this day, you will see the English Mercury 322 00:18:00,600 --> 00:18:05,880 Speaker 5: referenced as a real historical newspaper from fifteen eighty 323 00:18:05,920 --> 00:18:06,800 Speaker 5: eight, when it is not. 324 00:18:07,240 --> 00:18:10,040 Speaker 4: I have a distinct memory of doing that in fifth 325 00:18:10,080 --> 00:18:14,439 Speaker 4: grade. We were doing a unit on Egypt, and 326 00:18:14,480 --> 00:18:17,320 Speaker 4: there were two, like... it was like five A and 327 00:18:17,400 --> 00:18:19,800 Speaker 4: five B, like, our two fifth grade classes, and we 328 00:18:19,920 --> 00:18:23,560 Speaker 4: each had to do, like, an ancient Egyptian newspaper. And 329 00:18:23,600 --> 00:18:25,520 Speaker 4: there was, like, kind of a rivalry between the two 330 00:18:25,520 --> 00:18:27,280 Speaker 4: classes of who could come up with a better name 331 00:18:27,280 --> 00:18:31,800 Speaker 4: for their newspaper. And we... I forget which class was which, 332 00:18:31,840 --> 00:18:34,240 Speaker 4: but one was called the Giza Gazette and the other 333 00:18:34,359 --> 00:18:38,560 Speaker 4: was Pyramid Paparazzi. Oh, but it was, like, one hundred 334 00:18:38,600 --> 00:18:40,720 Speaker 4: percent... if, a thousand years from now, 335 00:18:40,800 --> 00:18:43,120 Speaker 6: someone then found what we did 336 00:18:43,000 --> 00:18:46,240 Speaker 4: and was like, wow, like, did you know ancient Egypt 337 00:18:46,320 --> 00:18:47,080 Speaker 4: had newspapers? 338 00:18:47,119 --> 00:18:48,879 Speaker 6: One was called the Giza Gazette and the other was 339 00:18:48,880 --> 00:18:50,240 Speaker 6: called Pyramid Paparazzi. 340 00:18:51,200 --> 00:18:53,080 Speaker 4: And then they were like, oh, this is not written 341 00:18:53,119 --> 00:18:57,400 Speaker 4: in hieroglyphics; this is actually not from ancient Egypt.
342 00:18:57,440 --> 00:18:59,360 Speaker 5: Did you stain any of it with like tea 343 00:18:59,400 --> 00:18:59,920 Speaker 5: bags to age it? 344 00:19:03,920 --> 00:19:05,119 Speaker 6: Yeah, yeah, absolutely. 345 00:19:05,440 --> 00:19:09,040 Speaker 1: I also noticed, Lizzie, this is a bit apropos. You 346 00:19:09,280 --> 00:19:11,760 Speaker 1: have a poster of the Pretenders? 347 00:19:11,840 --> 00:19:19,520 Speaker 4: Oh yes, why? As a band, very hoaxy rock band. 348 00:19:21,520 --> 00:19:22,240 Speaker 6: Pretending. 349 00:19:23,520 --> 00:19:28,320 Speaker 1: Think about it, folks, connect those dots. So it also 350 00:19:28,480 --> 00:19:32,199 Speaker 1: I think the idea of this fake first newspaper, you know, 351 00:19:32,400 --> 00:19:36,760 Speaker 1: we see here not malice, right, not necessarily a heist 352 00:19:36,920 --> 00:19:41,840 Speaker 1: nor a grift, but more this idea of something affirming a 353 00:19:41,960 --> 00:19:47,840 Speaker 1: concept of English nationalism and exceptionalism. So we also 354 00:19:47,960 --> 00:19:52,479 Speaker 1: see, I don't know, the idea that people want to 355 00:19:52,520 --> 00:19:56,000 Speaker 1: be read on to a mystery, right, people again, we 356 00:19:56,160 --> 00:19:59,440 Speaker 1: love the validation. We love not having to change our minds. 357 00:19:59,680 --> 00:20:03,960 Speaker 1: Maybe that takes us a little bit closer to the 358 00:20:03,960 --> 00:20:07,600 Speaker 1: one and only folks you know him, you know them. 359 00:20:07,960 --> 00:20:10,879 Speaker 1: We got to get a sound cue for him, P. T. Barnum. 360 00:20:18,119 --> 00:20:19,320 Speaker 1: We'll do some dances here. 361 00:20:19,359 --> 00:20:22,199 Speaker 5: He is the greatest showman himself. 362 00:20:22,240 --> 00:20:25,080 Speaker 2: I was about to say, the greatest showman. What a musical. 363 00:20:26,000 --> 00:20:31,040 Speaker 1: So what makes what makes our what makes our buddy?
364 00:20:31,040 --> 00:20:35,760 Speaker 1: PTB, the notorious PTB, uh, considered the ultimate hoaxer, not 365 00:20:35,840 --> 00:20:36,480 Speaker 1: the best dude. 366 00:20:37,520 --> 00:20:40,000 Speaker 5: There was a certain shamelessness to him. I think that 367 00:20:40,040 --> 00:20:42,000 Speaker 5: a lot of the hoaxes that we like to cover 368 00:20:42,359 --> 00:20:46,480 Speaker 5: on the show have like a general sweetness, and even 369 00:20:46,520 --> 00:20:50,760 Speaker 5: that example of the English Mercury, uh, it was like 370 00:20:50,800 --> 00:20:54,200 Speaker 5: a fun writing game between two buddies, and they weren't 371 00:20:54,200 --> 00:20:56,159 Speaker 5: even trying to fool anyone. They were having fun, and 372 00:20:56,200 --> 00:20:58,879 Speaker 5: that was sort of also at the heart of the 373 00:20:58,960 --> 00:21:03,040 Speaker 5: Cottingley Fairies, was just these two young girls. What I 374 00:21:03,359 --> 00:21:09,040 Speaker 5: find so disquieting about P. T. Barnum is the shamelessness 375 00:21:09,080 --> 00:21:11,560 Speaker 5: with which he was lying to people for profit and 376 00:21:11,600 --> 00:21:17,200 Speaker 5: then also exploiting the people that he was using. He had, 377 00:21:17,920 --> 00:21:20,399 Speaker 5: I think the biggest hoax that I think has the 378 00:21:20,440 --> 00:21:23,359 Speaker 5: biggest impact to this day that he perpetrated that I 379 00:21:23,359 --> 00:21:26,960 Speaker 5: don't think people realize necessarily was a hoax is the 380 00:21:26,960 --> 00:21:31,639 Speaker 5: idea of George Washington and the Cherry Tree story.
He 381 00:21:31,640 --> 00:21:36,879 Speaker 5: helped popularize that because he in the eighteen hundreds had 382 00:21:37,040 --> 00:21:40,680 Speaker 5: a black woman that he traveled with who was very, 383 00:21:40,800 --> 00:21:45,240 Speaker 5: very old, who he had claimed falsely to be the 384 00:21:45,280 --> 00:21:48,840 Speaker 5: one hundred and something year old woman who had helped 385 00:21:48,960 --> 00:21:53,080 Speaker 5: raise George Washington. Not true. 386 00:21:53,600 --> 00:21:57,760 Speaker 4: I would also say one of the greatest hoaxes of P. T. 387 00:21:57,920 --> 00:22:02,440 Speaker 4: Barnum's life is his own legacy, which is now being 388 00:22:02,520 --> 00:22:05,720 Speaker 4: burnished in the film The Greatest Showman, which. 389 00:22:05,480 --> 00:22:07,359 Speaker 2: Is really rosy. 390 00:22:10,920 --> 00:22:14,479 Speaker 4: I also would just say, like our podcast is a 391 00:22:14,520 --> 00:22:19,960 Speaker 4: lot about like sweet or innocuous hoaxes, which it 392 00:22:20,080 --> 00:22:22,240 Speaker 4: has less to do with the nature of hoaxes and 393 00:22:22,320 --> 00:22:24,159 Speaker 4: has more to do with us wanting to have like 394 00:22:24,240 --> 00:22:29,520 Speaker 4: a positive, feel good podcast. Like that's less a statement 395 00:22:29,600 --> 00:22:32,200 Speaker 4: on the nature of hoaxes and more about like Dana 396 00:22:32,240 --> 00:22:34,199 Speaker 4: and I want to have a fun podcast where you like, 397 00:22:34,320 --> 00:22:36,640 Speaker 4: learn a little bit and also don't feel bummed out 398 00:22:36,680 --> 00:22:40,920 Speaker 4: at the end. So we have purposely chosen hoaxes where 399 00:22:41,680 --> 00:22:45,560 Speaker 4: fewer lives are ruined than in other hoaxes.
But if 400 00:22:45,560 --> 00:22:48,160 Speaker 4: you research hoaxes on your own, you will find many 401 00:22:48,160 --> 00:22:50,879 Speaker 4: where people's lives are ruined and people are taken advantage 402 00:22:50,880 --> 00:22:52,600 Speaker 4: of. It's just you won't find. 403 00:22:52,359 --> 00:22:55,119 Speaker 6: That on our podcast because our podcast is fun. 404 00:22:55,520 --> 00:22:58,320 Speaker 5: Yeah. My favorite, my favorite hoaxes are the ones 405 00:22:58,359 --> 00:23:00,800 Speaker 5: where you leave and you just say, well, everyone 406 00:23:00,880 --> 00:23:03,439 Speaker 5: had a good time. Wasn't that interesting? 407 00:23:03,720 --> 00:23:07,080 Speaker 4: That's interesting, that was crazy, had a fun ride. 408 00:23:07,280 --> 00:23:10,960 Speaker 1: Yeah, uh, that's that's Uh. That's a great point about 409 00:23:11,160 --> 00:23:15,080 Speaker 1: the sort of logic and philosophy of our explorations here, 410 00:23:15,119 --> 00:23:18,280 Speaker 1: because yeah, you don't want everything to feel like an 411 00:23:18,480 --> 00:23:22,200 Speaker 1: endless death march. You know, you don't want every story 412 00:23:22,240 --> 00:23:25,280 Speaker 1: to end uh end up feeling like you just accidentally 413 00:23:25,359 --> 00:23:26,879 Speaker 1: watched Requiem for a Dream. 414 00:23:27,080 --> 00:23:30,440 Speaker 5: No, you want to watch Requiem for a Dream consensually. Ideally, 415 00:23:31,320 --> 00:23:36,320 Speaker 5: I will also say, and Lizzie and I think we've 416 00:23:36,320 --> 00:23:40,119 Speaker 5: sort of thought about the difference...
We've sort of 417 00:23:40,119 --> 00:23:42,600 Speaker 5: thought about the difference between a hoax and a scam, 418 00:23:43,000 --> 00:23:46,200 Speaker 5: and something that is just nakedly a scam I think 419 00:23:46,320 --> 00:23:49,679 Speaker 5: is just for profit and just meant to exploit people, 420 00:23:49,960 --> 00:23:53,080 Speaker 5: where something about a hoax is more performative and also 421 00:23:53,240 --> 00:23:56,680 Speaker 5: has an element of whimsy to it. It's an I know it 422 00:23:56,680 --> 00:23:58,720 Speaker 5: when I see it thing, the difference. 423 00:23:58,480 --> 00:24:02,560 Speaker 1: There we go. I like that because we've sometimes discussed 424 00:24:03,040 --> 00:24:06,639 Speaker 1: over the years on this show, we've discussed heists and 425 00:24:06,920 --> 00:24:11,320 Speaker 1: hoaxes that we really enjoyed, and I think we're on 426 00:24:11,359 --> 00:24:15,200 Speaker 1: the same path here because sometimes our favorite ones are 427 00:24:15,240 --> 00:24:18,639 Speaker 1: going to end up being someone who just wanted a 428 00:24:18,640 --> 00:24:21,439 Speaker 1: little bit of wackiness in their lives, you know. Or 429 00:24:21,480 --> 00:24:26,920 Speaker 1: they said something wild to their buddies hanging out, whether 430 00:24:27,240 --> 00:24:30,920 Speaker 1: in correspondence or in person, and no one questioned them 431 00:24:31,000 --> 00:24:34,199 Speaker 1: on it. And it gets really weird when someone starts 432 00:24:34,520 --> 00:24:39,720 Speaker 1: kind of, just like the Yankee in Arthur's Court, right, Connecticut 433 00:24:39,760 --> 00:24:43,199 Speaker 1: Yankee in King Arthur's Court, when the character Merlin, you 434 00:24:43,320 --> 00:24:47,399 Speaker 1: realize that he actually believes he has magic powers.
I 435 00:24:47,440 --> 00:24:50,720 Speaker 1: think it's fascinating how that happens with some real life 436 00:24:50,800 --> 00:24:54,320 Speaker 1: historical figures where they, you know, they say they're a long 437 00:24:54,400 --> 00:24:57,399 Speaker 1: lost prince or what have you to some group of people, 438 00:24:57,880 --> 00:25:00,960 Speaker 1: and then those people just believe them, and you know, 439 00:25:01,320 --> 00:25:04,399 Speaker 1: fast forward a few decades and this guy's like, yeah, maybe, 440 00:25:04,480 --> 00:25:05,640 Speaker 1: I mean, maybe I am. 441 00:25:06,040 --> 00:25:08,160 Speaker 5: I mean, I'm glad that you brought up King Arthur 442 00:25:08,280 --> 00:25:11,440 Speaker 5: because, not to put on my Noble Blood hat, 443 00:25:11,760 --> 00:25:16,200 Speaker 5: King Arthur is kind of maybe the longest running hoax 444 00:25:16,240 --> 00:25:19,439 Speaker 5: in popular culture, because King Arthur did not exist. Like, 445 00:25:19,640 --> 00:25:22,200 Speaker 5: I'm so sorry to everyone. I know, it feels 446 00:25:22,200 --> 00:25:25,520 Speaker 5: like telling kids Santa Claus didn't exist. He didn't, but 447 00:25:25,800 --> 00:25:28,399 Speaker 5: at different periods in history he was sort of used 448 00:25:28,440 --> 00:25:32,399 Speaker 5: as propaganda to bolster people's reputations or to serve a 449 00:25:32,440 --> 00:25:36,720 Speaker 5: political purpose. This king, Henry the Seventh, the dad of 450 00:25:36,920 --> 00:25:40,959 Speaker 5: Henry the Eighth of wife fame, had a pretty tenuous 451 00:25:41,040 --> 00:25:44,200 Speaker 5: claim to the throne. Tenuous at best, but he won 452 00:25:44,240 --> 00:25:45,919 Speaker 5: a battle, and when he was trying to sort of 453 00:25:46,040 --> 00:25:50,040 Speaker 5: establish his lineage and sort of establish, to prove that 454 00:25:50,160 --> 00:25:53,600 Speaker 5: he was a worthy king.
He said on his Welsh 455 00:25:53,600 --> 00:25:56,719 Speaker 5: side that his family went back to King Arthur, and 456 00:25:56,760 --> 00:25:59,639 Speaker 5: he named his firstborn son Arthur as a way to 457 00:25:59,640 --> 00:26:05,280 Speaker 5: sort of create that legacy and tie in people's minds. 458 00:26:05,480 --> 00:26:07,800 Speaker 5: Even though Arthur was not a real person, he just 459 00:26:07,840 --> 00:26:09,520 Speaker 5: served a valuable political purpose. 460 00:26:10,160 --> 00:26:14,439 Speaker 1: Now, was the popular King Arthur legend, is 461 00:26:14,480 --> 00:26:19,440 Speaker 1: this made from whole cloth, or is it an amalgamation 462 00:26:19,840 --> 00:26:21,880 Speaker 1: of other previous royals? 463 00:26:22,200 --> 00:26:26,520 Speaker 5: I think the idea of King Arthur is a literary invention, 464 00:26:26,960 --> 00:26:30,880 Speaker 5: but it is a literary invention that takes its earliest, 465 00:26:31,000 --> 00:26:36,280 Speaker 5: tiniest seeds from medieval chronicles. There's this chronicle called the 466 00:26:36,359 --> 00:26:41,000 Speaker 5: Historia Brittonum in eight hundred and twenty eight that mentions 467 00:26:41,600 --> 00:26:44,840 Speaker 5: Arthur not as a king, but as a military leader 468 00:26:44,880 --> 00:26:48,119 Speaker 5: who fought off the invading Saxons and beat hundreds of 469 00:26:48,160 --> 00:26:54,640 Speaker 5: Saxons in this one battle. But even in that chronicle, Arthur, 470 00:26:54,760 --> 00:26:57,840 Speaker 5: not a king, doesn't have any of the Guinevere, 471 00:26:58,000 --> 00:27:00,560 Speaker 5: Lancelot, sword, round table stuff. 472 00:27:01,680 --> 00:27:03,840 Speaker 6: Maybe, has anyone ever set a round table? 473 00:27:04,040 --> 00:27:06,879 Speaker 1: Where was, you know, table technology at that point?
474 00:27:06,920 --> 00:27:10,600 Speaker 5: They probably were making tables round, I'm hoping, so maybe 475 00:27:10,640 --> 00:27:13,640 Speaker 5: we're there. But even in that chronicle from eight hundred 476 00:27:13,680 --> 00:27:16,000 Speaker 5: and twenty eight, by that point, that chronicle has a 477 00:27:16,000 --> 00:27:20,680 Speaker 5: lot of flavors of mythology, where it's like they mentioned 478 00:27:20,720 --> 00:27:24,840 Speaker 5: that this general Arthur, you know, killed nine hundred people 479 00:27:24,880 --> 00:27:28,080 Speaker 5: single handedly in a battle. They mentioned that 480 00:27:28,280 --> 00:27:32,800 Speaker 5: his dog had like a paw print in stone that 481 00:27:33,080 --> 00:27:36,200 Speaker 5: stayed in stone, and anytime someone stole the stone, it 482 00:27:36,200 --> 00:27:39,040 Speaker 5: would always appear right back where you stole it. That's 483 00:27:39,080 --> 00:27:40,880 Speaker 5: not like one of the pieces of the Arthur myth 484 00:27:40,920 --> 00:27:44,040 Speaker 5: that stood the test of time. And there was 485 00:27:44,080 --> 00:27:46,960 Speaker 5: another where it's like the grave of Arthur's son, if 486 00:27:47,000 --> 00:27:49,560 Speaker 5: you measure it, every time you measure it, it's 487 00:27:49,560 --> 00:27:53,080 Speaker 5: a different length. So even back in eight hundred, the 488 00:27:53,200 --> 00:27:56,080 Speaker 5: closest we can come to Arthur or a quote unquote 489 00:27:56,080 --> 00:27:59,760 Speaker 5: real Arthur is a general who was only mentioned in 490 00:27:59,800 --> 00:28:03,399 Speaker 5: this chronicle, not really mentioned by other contemporary sources, not 491 00:28:03,520 --> 00:28:07,720 Speaker 5: mentioned by other chroniclers who mentioned this battle. So it's 492 00:28:07,760 --> 00:28:10,600 Speaker 5: a pretty hazy source. And even then he was a 493 00:28:10,640 --> 00:28:11,719 Speaker 5: mythological figure.
494 00:28:11,920 --> 00:28:13,800 Speaker 3: But then, like, it wouldn't have been till later that 495 00:28:13,800 --> 00:28:16,359 Speaker 3: they referred to them as Arthurian legend, right, like 496 00:28:16,440 --> 00:28:17,160 Speaker 3: that was sort of. 497 00:28:17,040 --> 00:28:19,560 Speaker 4: Like, yeah, doesn't the myth of the Holy Grail come 498 00:28:19,560 --> 00:28:20,639 Speaker 4: from Arthurian legend? 499 00:28:21,320 --> 00:28:24,600 Speaker 5: Yeah, but that comes... the idea of like the Holy Grail, 500 00:28:25,160 --> 00:28:28,840 Speaker 5: and I mean, I think, I'm not good on Christian mythology, 501 00:28:28,880 --> 00:28:31,359 Speaker 5: I think probably the idea of the Holy Grail maybe 502 00:28:31,480 --> 00:28:35,199 Speaker 5: existed before. But the quest for the Holy Grail and 503 00:28:35,280 --> 00:28:38,840 Speaker 5: the sword and the stone and these pieces of Arthuriana 504 00:28:38,920 --> 00:28:42,720 Speaker 5: that we associate with King Arthur come centuries and centuries 505 00:28:42,760 --> 00:28:44,440 Speaker 5: later in literature. 506 00:28:44,600 --> 00:28:49,600 Speaker 1: As they, as they serve first propaganda, right, and then 507 00:28:49,800 --> 00:28:53,880 Speaker 1: they later serve literature, because nostalgia is a hell of 508 00:28:53,920 --> 00:28:57,720 Speaker 1: a drug at some point. And maybe, maybe this is 509 00:28:57,960 --> 00:29:01,400 Speaker 1: something that a lot of us would hear and think, 510 00:29:02,440 --> 00:29:04,920 Speaker 1: oh gosh, look at, like we were saying earlier, look 511 00:29:04,920 --> 00:29:08,280 Speaker 1: at these suckers from the past. It's really more an 512 00:29:08,400 --> 00:29:12,560 Speaker 1: argument about access to information, as you were saying. And 513 00:29:12,600 --> 00:29:15,240 Speaker 1: we could argue that now the pendulum is starting to 514 00:29:15,280 --> 00:29:17,040 Speaker 1: swing the other way.
515 00:29:17,080 --> 00:29:18,800 Speaker 3: Well, I mean, if anything, we're maybe a little bit 516 00:29:18,840 --> 00:29:21,680 Speaker 3: dumber now than people were back then because of the, 517 00:29:22,000 --> 00:29:25,800 Speaker 3: you know, reliance on that kind of technology and that 518 00:29:25,880 --> 00:29:28,200 Speaker 3: kind of information gathering and that kind of, like, you know, 519 00:29:28,320 --> 00:29:32,400 Speaker 3: instantaneous gratification. Not to mention what things like AI, Chat 520 00:29:32,400 --> 00:29:35,800 Speaker 3: GPT, are doing to us even further down the rabbit 521 00:29:35,800 --> 00:29:37,360 Speaker 3: hole as far as that's concerned. 522 00:29:37,560 --> 00:29:39,880 Speaker 5: Yeah, critical thinking found dead in a ditch. 523 00:29:40,680 --> 00:29:43,959 Speaker 1: Yeah, yeah, right exactly. Can't wait for the AI pictures 524 00:29:44,000 --> 00:29:53,200 Speaker 1: of that one on Facebook. With this, we want to 525 00:29:53,320 --> 00:29:56,200 Speaker 1: we want to ask you all a couple of these 526 00:29:56,280 --> 00:30:01,600 Speaker 1: larger questions with with hoaxing, with sussing out the fact 527 00:30:01,640 --> 00:30:04,280 Speaker 1: from the fiction, right, and the truth from the proof 528 00:30:04,320 --> 00:30:09,680 Speaker 1: from the propaganda. What would you all recommend people do 529 00:30:10,200 --> 00:30:13,959 Speaker 1: when they hear a story that sounds super weird, or 530 00:30:14,040 --> 00:30:16,040 Speaker 1: hear a story maybe they really want to be true, 531 00:30:16,040 --> 00:30:20,720 Speaker 1: et cetera. How do we, as casual encounters of information, 532 00:30:21,280 --> 00:30:26,000 Speaker 1: start parsing through all this, frankly, I'll say it, bullshit 533 00:30:26,160 --> 00:30:27,800 Speaker 1: that we see on the internet. 534 00:30:28,000 --> 00:30:28,200 Speaker 8: Oh. 535 00:30:28,320 --> 00:30:30,560 Speaker 6: I always just say, like, keep an open mind.
536 00:30:30,720 --> 00:30:32,760 Speaker 4: I always come from a place of like, I don't know, 537 00:30:33,720 --> 00:30:36,200 Speaker 4: you know, like you don't need to know right away 538 00:30:36,360 --> 00:30:37,600 Speaker 4: if something's real or fake. 539 00:30:37,680 --> 00:30:39,480 Speaker 6: You can take a week, you can take a month, 540 00:30:39,600 --> 00:30:40,920 Speaker 6: you can take like. 541 00:30:41,000 --> 00:30:45,959 Speaker 4: I'm I'm luckily like I don't work at CNN, so 542 00:30:46,160 --> 00:30:48,240 Speaker 4: I don't need to be the one who decides if 543 00:30:48,240 --> 00:30:50,480 Speaker 4: something is real or fake right away. And I don't 544 00:30:50,520 --> 00:30:53,960 Speaker 4: need to report on it right away, so I can 545 00:30:54,400 --> 00:30:57,960 Speaker 4: take my time and look at different sources, and I 546 00:30:58,000 --> 00:31:01,840 Speaker 4: don't need to immediately have a verdict on whether something 547 00:31:02,040 --> 00:31:03,920 Speaker 4: is real or true. And I don't, you know, we 548 00:31:03,920 --> 00:31:06,440 Speaker 4: were talking about like everybody wants to repost right away. 549 00:31:06,480 --> 00:31:09,000 Speaker 4: Like you don't need to repost right away, so you 550 00:31:09,000 --> 00:31:10,960 Speaker 4: don't need to know right away if it's. 551 00:31:11,160 --> 00:31:13,360 Speaker 6: Real or fake. You can wait for the dust to settle. 552 00:31:13,600 --> 00:31:14,280 Speaker 6: You can wait for. 553 00:31:14,320 --> 00:31:17,080 Speaker 4: NPR to weigh in, like there are people out there 554 00:31:17,120 --> 00:31:19,440 Speaker 4: who will do the fact checking, and you can wait 555 00:31:19,480 --> 00:31:22,440 Speaker 4: for them to do the fact checking, like you don't 556 00:31:22,520 --> 00:31:27,320 Speaker 4: need to immediately go down the rabbit hole.
I 557 00:31:27,360 --> 00:31:30,480 Speaker 4: see so many people who are immediately up in arms 558 00:31:30,520 --> 00:31:31,160 Speaker 4: about like. 559 00:31:32,640 --> 00:31:34,160 Speaker 6: You know, like, not to. 560 00:31:34,120 --> 00:31:36,080 Speaker 4: Take it to a super serious thing, but like there 561 00:31:36,080 --> 00:31:41,640 Speaker 4: will be like, like the hours after like a mass 562 00:31:41,640 --> 00:31:45,680 Speaker 4: shooting or like an earthquake or, like, some big story, 563 00:31:45,760 --> 00:31:49,320 Speaker 4: like that's when the least reliable information is, and people 564 00:31:49,400 --> 00:31:51,680 Speaker 4: are already up in arms about like why aren't they 565 00:31:51,720 --> 00:31:54,040 Speaker 4: reporting about the this or the that or the whatever. 566 00:31:54,120 --> 00:31:56,440 Speaker 4: And I'm like, yo, like give them a week to 567 00:31:56,480 --> 00:31:59,600 Speaker 4: get their story straight. Like give the journalists some time 568 00:31:59,720 --> 00:32:02,800 Speaker 4: to do their jobs, then come back to it in 569 00:32:02,840 --> 00:32:05,320 Speaker 4: a couple of days, and then we'll see like where 570 00:32:05,360 --> 00:32:07,560 Speaker 4: you can donate or this, that and the other. Like 571 00:32:08,080 --> 00:32:12,160 Speaker 4: you can just give it some time and be patient, 572 00:32:12,280 --> 00:32:17,240 Speaker 4: and the information... I, knock on wood, still have some 573 00:32:17,440 --> 00:32:22,120 Speaker 4: trust that some last dregs of the free press exist, 574 00:32:22,360 --> 00:32:25,040 Speaker 4: and the good information will make its way to you 575 00:32:25,680 --> 00:32:28,960 Speaker 4: through some of our institutions if you give it some time. 576 00:32:29,080 --> 00:32:32,800 Speaker 4: But like, you just can't make these snap judgments.
I 577 00:32:32,800 --> 00:32:35,320 Speaker 4: think that's where a lot of people go wrong, is 578 00:32:35,360 --> 00:32:40,920 Speaker 4: that they want the information immediately and they're like, well 579 00:32:40,960 --> 00:32:44,120 Speaker 4: this TikToker said, and they said that they were there. 580 00:32:44,360 --> 00:32:45,280 Speaker 6: So then that's true. 581 00:32:45,320 --> 00:32:46,760 Speaker 4: But then that proved not to be true, and that 582 00:32:46,800 --> 00:32:49,520 Speaker 4: means that you can't trust, and it's like give. 583 00:32:49,360 --> 00:32:49,880 Speaker 6: It a day. 584 00:32:51,040 --> 00:32:54,560 Speaker 3: ProPublica, by the way. Yeah, we were just talking about 585 00:32:54,560 --> 00:32:55,760 Speaker 3: that on Stuff They Don't Want You to Know. 586 00:32:55,800 --> 00:32:57,479 Speaker 2: Matt, who you'll know. 587 00:32:57,880 --> 00:33:01,680 Speaker 3: Matt Frederick brought us a story about how many like 588 00:33:01,800 --> 00:33:04,760 Speaker 3: typos and fact checking errors are even popping up in 589 00:33:04,800 --> 00:33:08,040 Speaker 3: the AP now, and it seems like a lot of 590 00:33:08,040 --> 00:33:10,240 Speaker 3: that is sort of like the domino effect of that, 591 00:33:10,280 --> 00:33:13,360 Speaker 3: you know, first to market kind of attitude, so you 592 00:33:13,400 --> 00:33:15,000 Speaker 3: really do have to be careful. But we brought up 593 00:33:15,040 --> 00:33:17,840 Speaker 3: ProPublica and Axios and a few others, like we like 594 00:33:17,840 --> 00:33:20,560 Speaker 3: 404 Media, but you really do have to 595 00:33:20,600 --> 00:33:21,200 Speaker 3: be careful. 596 00:33:21,400 --> 00:33:24,040 Speaker 1: We'll swing, we'll swing correctly a couple of times. But 597 00:33:24,160 --> 00:33:28,720 Speaker 1: it's, yeah, this is something that makes me wonder.
You know, 598 00:33:28,840 --> 00:33:32,840 Speaker 1: have we arrived full circle at the 599 00:33:33,280 --> 00:33:36,280 Speaker 1: heyday of the town crier? That always seemed like a 600 00:33:36,360 --> 00:33:39,720 Speaker 1: cool job to me, especially, you know, like now we 601 00:33:39,920 --> 00:33:44,680 Speaker 1: have, not like town criers under royal license, with a patent 602 00:33:44,720 --> 00:33:47,760 Speaker 1: from the monarch. We have the town criers of TikTok. 603 00:33:48,200 --> 00:33:50,000 Speaker 1: But I do love the idea. This is just a 604 00:33:50,000 --> 00:33:53,120 Speaker 1: pitch. If any of you invents time travel, what if 605 00:33:53,160 --> 00:33:56,720 Speaker 1: we go back in time and bill ourselves as independent 606 00:33:56,800 --> 00:34:00,760 Speaker 1: town criers abroad, like on a circuit. Just walk in, 607 00:34:01,040 --> 00:34:03,680 Speaker 1: do a vibe check, and start shouting what we think 608 00:34:03,760 --> 00:34:05,160 Speaker 1: is the news, or just. 609 00:34:05,120 --> 00:34:09,200 Speaker 5: A counterpoint, something more controversial. Put the established town crier 610 00:34:09,239 --> 00:34:11,279 Speaker 5: out of business because ours is more salacious. 611 00:34:11,600 --> 00:34:14,040 Speaker 1: Yeah, we'll call it. Actually, I feel. 612 00:34:13,840 --> 00:34:16,600 Speaker 4: Like there are people you can find standing on the 613 00:34:16,640 --> 00:34:21,840 Speaker 4: street yelling their controversial opinions. I feel like you actually 614 00:34:21,880 --> 00:34:24,040 Speaker 4: can find people doing that. Maybe they're time travelers. 615 00:34:24,480 --> 00:34:26,480 Speaker 6: Maybe those people are doing that. Maybe. 616 00:34:26,760 --> 00:34:29,080 Speaker 5: I think all of Lizzie's advice is just like so 617 00:34:29,239 --> 00:34:33,080 Speaker 5: smart and dead on.
And I would also add, you 618 00:34:33,120 --> 00:34:35,759 Speaker 5: don't need to post at all, unless you were an 619 00:34:35,800 --> 00:34:40,320 Speaker 5: elected official, I guess. Or, you just, like, screenshots and 620 00:34:40,360 --> 00:34:42,799 Speaker 5: the internet are forever, and like, you don't need to 621 00:34:42,840 --> 00:34:45,879 Speaker 5: weigh in on everything. No one is waiting for your 622 00:34:46,000 --> 00:34:48,680 Speaker 5: take on everything. No one's like, I 623 00:34:48,680 --> 00:34:50,839 Speaker 5: can't wait to see what Greg says about this? 624 00:34:51,160 --> 00:34:51,319 Speaker 4: Right? 625 00:34:51,400 --> 00:34:53,719 Speaker 1: Where is Ja Rule? Dana, when I. 626 00:34:53,640 --> 00:34:55,760 Speaker 6: Talk about this all the time? Shutting up is always 627 00:34:55,760 --> 00:34:56,200 Speaker 6: an option. 628 00:34:56,520 --> 00:34:58,600 Speaker 5: Shutting up is always an option. And you know what, 629 00:34:58,680 --> 00:35:01,719 Speaker 5: if you find something really interesting and really cool that 630 00:35:01,760 --> 00:35:03,680 Speaker 5: you do want to share, I would even say the 631 00:35:03,719 --> 00:35:06,200 Speaker 5: best thing to do is wait a beat, get coffee 632 00:35:06,200 --> 00:35:08,879 Speaker 5: with a friend, meet a friend in person, and say 633 00:35:08,920 --> 00:35:11,520 Speaker 5: it to their face, because then if they say that 634 00:35:11,640 --> 00:35:13,759 Speaker 5: sounds fake and that's the dumbest thing I've ever heard, 635 00:35:14,000 --> 00:35:15,680 Speaker 5: you can be like, oh, maybe you're right, let me 636 00:35:15,760 --> 00:35:17,319 Speaker 5: just check on that. And also then you got to 637 00:35:17,360 --> 00:35:18,840 Speaker 5: get coffee in person. 638 00:35:18,560 --> 00:35:19,040 Speaker 2: With a friend. 639 00:35:19,600 --> 00:35:23,600 Speaker 1: Yeah, I like this great advice.
Yeah, this is fantastic 640 00:35:23,680 --> 00:35:29,520 Speaker 1: insight, because there are some solid arguments that I 641 00:35:29,520 --> 00:35:31,640 Speaker 1: don't think are one hundred percent proven, but there are 642 00:35:31,640 --> 00:35:34,880 Speaker 1: solid arguments from a couple of different fields that maybe 643 00:35:35,360 --> 00:35:38,400 Speaker 1: it's a hardware problem for the human brain at this point, 644 00:35:38,520 --> 00:35:42,320 Speaker 1: like not having evolved to digest all of this information 645 00:35:42,560 --> 00:35:46,040 Speaker 1: at this unceasing pace. So it sounds like the idea 646 00:35:46,160 --> 00:35:49,840 Speaker 1: of not even totally going full off the grid or 647 00:35:49,960 --> 00:35:53,200 Speaker 1: Luddite or whatever, but just giving yourself a 648 00:35:53,239 --> 00:35:58,760 Speaker 1: second to take a beat with that. Again, fantastic insight. 649 00:35:59,120 --> 00:36:03,919 Speaker 1: As Noel's saying, do you all have any moments? Let's see, 650 00:36:03,920 --> 00:36:05,600 Speaker 1: I don't want to put anybody on the spot here, 651 00:36:05,640 --> 00:36:07,880 Speaker 1: but do you all have any moments where you thought, 652 00:36:08,640 --> 00:36:12,560 Speaker 1: hot dog, I was taken in, I totally believed this 653 00:36:12,719 --> 00:36:15,080 Speaker 1: was true. Okay, yeah, we got a hard nod from 654 00:36:15,120 --> 00:36:16,080 Speaker 1: Lizzie on that one too. 655 00:36:16,360 --> 00:36:17,600 Speaker 5: I think everyone does. 656 00:36:18,400 --> 00:36:20,680 Speaker 6: I yeah, I mean, I've, I've. 657 00:36:21,400 --> 00:36:23,440 Speaker 4: I'm trying to think of an actual hoax example. The 658 00:36:23,440 --> 00:36:25,520 Speaker 4: only example that's coming to mind is an example of 659 00:36:25,600 --> 00:36:26,719 Speaker 4: just like bad reporting. 660 00:36:27,360 --> 00:36:29,279 Speaker 6: Okay, but.
661 00:36:30,880 --> 00:36:33,000 Speaker 4: Oh well, I remember, I know, I know a girl 662 00:36:33,000 --> 00:36:35,520 Speaker 4: who had a ticket to Fyre Fest, but she 663 00:36:35,560 --> 00:36:41,160 Speaker 4: didn't end up going. And this sounds like I'm lying 664 00:36:41,200 --> 00:36:43,120 Speaker 4: to make myself sound good, but I swear I'm not. 665 00:36:43,600 --> 00:36:44,719 Speaker 6: I remember a girl who had. 666 00:36:44,680 --> 00:36:46,400 Speaker 4: A ticket to Fyre Fest, and she was telling me 667 00:36:46,440 --> 00:36:48,880 Speaker 4: about all of the accommodations and then how much it costs, 668 00:36:48,920 --> 00:36:51,480 Speaker 4: and it didn't cost very much, and I remember being like, 669 00:36:51,719 --> 00:36:54,200 Speaker 4: you're gonna get sold into white slavery. Like yeah, I 670 00:36:54,239 --> 00:36:56,719 Speaker 4: remember being like, not that I knew that it was 671 00:36:56,760 --> 00:36:59,200 Speaker 4: a scam, but I was like, this sounds freaking 672 00:36:59,360 --> 00:37:01,719 Speaker 4: sus. Like I was like, this is... that sounds too 673 00:37:01,760 --> 00:37:03,480 Speaker 4: good to be true? And then she ended up, like, 674 00:37:03,520 --> 00:37:06,120 Speaker 4: she couldn't go, and she, like, sold her ticket, and 675 00:37:06,160 --> 00:37:09,480 Speaker 4: then, yeah, bullet dodged. And then when it happened, 676 00:37:09,800 --> 00:37:12,120 Speaker 4: like, we went back into the office after the weekend. 677 00:37:12,200 --> 00:37:14,200 Speaker 6: I was like, aren't you so glad you didn't go? 678 00:37:14,360 --> 00:37:16,279 Speaker 6: She was like, I'm so glad I didn't go. So 679 00:37:16,320 --> 00:37:16,719 Speaker 6: that's what. 680 00:37:17,239 --> 00:37:20,239 Speaker 2: Do you know anyone that bought a ticket to Fyre Fest Two? 681 00:37:20,640 --> 00:37:20,799 Speaker 4: No? 682 00:37:21,000 --> 00:37:21,840 Speaker 6: I don't think I do.
683 00:37:22,320 --> 00:37:23,120 Speaker 2: And maybe that's. 684 00:37:23,000 --> 00:37:25,440 Speaker 3: Worth talking about real quick, because when you have something 685 00:37:25,560 --> 00:37:28,399 Speaker 3: like Fyre Fest, it's such a scam that's debunked. There's two 686 00:37:28,440 --> 00:37:32,040 Speaker 3: documentaries about it, and yet people still bought tickets to 687 00:37:32,080 --> 00:37:32,880 Speaker 3: the second one. 688 00:37:33,040 --> 00:37:36,160 Speaker 2: What, where does that come from? Is it just pure ignorance 689 00:37:36,280 --> 00:37:36,880 Speaker 2: is bliss? 690 00:37:36,960 --> 00:37:39,839 Speaker 5: Is it just like... I think it's, I don't think 691 00:37:39,840 --> 00:37:42,799 Speaker 5: it's even ignorance. I think it's knowing the hoax and 692 00:37:42,880 --> 00:37:45,680 Speaker 5: thinking like, oh, this will be fun, I'll, like, make 693 00:37:45,760 --> 00:37:47,800 Speaker 5: content about it. I think those people know exactly what 694 00:37:47,840 --> 00:37:50,120 Speaker 5: they're getting into, and I think they're kind of, 695 00:37:50,160 --> 00:37:51,840 Speaker 5: almost hoping it's going to be a disaster. 696 00:37:52,280 --> 00:37:56,319 Speaker 4: I also think this is an example of, like, I 697 00:37:56,320 --> 00:37:57,560 Speaker 4: think this is a great. 698 00:37:57,360 --> 00:38:00,920 Speaker 6: Example of the amplification effect. 699 00:38:01,040 --> 00:38:02,640 Speaker 4: I don't know if that's the real term, but like 700 00:38:03,040 --> 00:38:05,640 Speaker 4: I think Fyre Fest Two probably sold about five. 701 00:38:05,480 --> 00:38:09,719 Speaker 6: Tickets, five actual tickets. I don't think this is a real thing, 702 00:38:09,920 --> 00:38:10,879 Speaker 6: like, like.
703 00:38:13,440 --> 00:38:18,200 Speaker 4: Not the second round of controversy when the trailer came out, 704 00:38:18,239 --> 00:38:21,880 Speaker 4: but the first round of controversy when Halle Bailey was 705 00:38:21,920 --> 00:38:27,680 Speaker 4: announced as the Little Mermaid. Like, there were a few 706 00:38:27,960 --> 00:38:31,600 Speaker 4: racist comments, and then everybody picked it up and it 707 00:38:31,680 --> 00:38:33,120 Speaker 4: became like a thing. 708 00:38:33,480 --> 00:38:36,440 Speaker 6: Like, there's a lot of manufactured 709 00:38:35,600 --> 00:38:40,080 Speaker 4: controversies on the Internet where, if you actually trace it back, 710 00:38:40,160 --> 00:38:42,480 Speaker 4: it's like three people are mad about 711 00:38:42,160 --> 00:38:45,480 Speaker 5: this. One of the worst, one of the worst developments 712 00:38:45,480 --> 00:38:47,440 Speaker 5: on the Internet, and just like this need for so 713 00:38:47,600 --> 00:38:50,120 Speaker 5: much content, just to like be in the content now, 714 00:38:50,440 --> 00:38:53,840 Speaker 5: is people report three tweets as a news story. 715 00:38:54,040 --> 00:38:56,920 Speaker 3: Yes, oh god. No, that's a good point. It doesn't get 716 00:38:56,920 --> 00:38:59,680 Speaker 3: talked about enough, honestly. I mean, it's so true. 717 00:39:00,120 --> 00:39:02,680 Speaker 1: Never one of my tweets. That's the part that's the 718 00:39:02,719 --> 00:39:03,600 Speaker 1: worst part of it. 719 00:39:03,680 --> 00:39:05,880 Speaker 3: I mean, it's not that much different than, like, you know, 720 00:39:05,920 --> 00:39:08,640 Speaker 3: if we see a bad review for the podcast or 721 00:39:08,640 --> 00:39:11,360 Speaker 3: somebody says something mean about me personally, then I just 722 00:39:11,400 --> 00:39:13,160 Speaker 3: assume that's what everybody thinks. 723 00:39:13,760 --> 00:39:13,960 Speaker 1: You know.
724 00:39:14,080 --> 00:39:17,440 Speaker 3: It's just so easy to, like, believe that one opinion 725 00:39:17,640 --> 00:39:21,120 Speaker 3: represents something so much larger, just because of how everything, 726 00:39:21,239 --> 00:39:24,880 Speaker 3: kind of in a certain way, is equal on the internet. Like, 727 00:39:24,920 --> 00:39:27,040 Speaker 3: if it's something that you see, then it's there and 728 00:39:27,080 --> 00:39:29,279 Speaker 3: it has the same weight as any other piece of 729 00:39:29,320 --> 00:39:30,680 Speaker 3: content that you might run across. 730 00:39:30,760 --> 00:39:31,520 Speaker 2: Yeah, if you're not 731 00:39:31,560 --> 00:39:35,200 Speaker 4: careful. Or, like, I remember, like, during the Eras Tour, 732 00:39:35,400 --> 00:39:39,560 Speaker 4: they were listing, like, resellers were listing floor seats for 733 00:39:39,600 --> 00:39:42,279 Speaker 4: like twenty thousand dollars or something, and there were all 734 00:39:42,280 --> 00:39:44,799 Speaker 4: of these sites being like, can you believe people are 735 00:39:44,840 --> 00:39:47,760 Speaker 4: paying twenty thousand dollars to see Taylor Swift? 736 00:39:47,840 --> 00:39:51,480 Speaker 7: I was like, no, the tickets are available. That means 737 00:39:51,320 --> 00:39:53,440 Speaker 1: they're not selling, like, exactly. 738 00:39:53,560 --> 00:39:55,920 Speaker 8: I was like, how do you know anyone has paid 739 00:39:55,960 --> 00:39:58,600 Speaker 8: that much? I was just like, you made that up. 740 00:39:58,680 --> 00:40:01,759 Speaker 8: That is a made-up price. That's a made-up price. 741 00:40:01,760 --> 00:40:04,160 Speaker 8: Someone made that up, and you have now made up 742 00:40:04,200 --> 00:40:05,680 Speaker 8: that there is a customer for it. 743 00:40:06,239 --> 00:40:09,000 Speaker 6: This is made up. You made that up for your article.
744 00:40:09,040 --> 00:40:11,719 Speaker 4: You made up this consumer to get mad at, who 745 00:40:11,840 --> 00:40:13,200 Speaker 4: is paying a made-up price. 746 00:40:14,200 --> 00:40:16,480 Speaker 6: This is a mini hoax. This is a mini hoax 747 00:40:16,520 --> 00:40:17,360 Speaker 6: that is happening. 748 00:40:18,360 --> 00:40:20,880 Speaker 4: This is a little, tiny, little hoax, just so that 749 00:40:20,920 --> 00:40:24,160 Speaker 4: you can have something to write about. 750 00:40:24,800 --> 00:40:28,360 Speaker 6: Like, I don't know that anyone actually bought tickets 751 00:40:28,360 --> 00:40:28,520 Speaker 6: to that. 752 00:40:28,719 --> 00:40:30,799 Speaker 5: It almost is a thing where you see something, like, 753 00:40:31,360 --> 00:40:34,200 Speaker 5: on a web page that's like so eminently clickable, where 754 00:40:34,239 --> 00:40:37,200 Speaker 5: you're just like, well, I have to see that. It's 755 00:40:37,239 --> 00:40:39,880 Speaker 5: the classic rule of thumb that if a headline is 756 00:40:39,920 --> 00:40:42,120 Speaker 5: a question, the answer is no. 757 00:40:41,800 --> 00:40:44,200 Speaker 1: No, yes, yeah, yeah, yeah, exactly. 758 00:40:44,239 --> 00:40:47,240 Speaker 4: This is, is Gen Z killing the stock market? 759 00:40:47,400 --> 00:40:52,800 Speaker 1: No. No, right, exactly. And that happens often with celebrity news. 760 00:40:52,920 --> 00:40:56,880 Speaker 1: Another one that I think works very well in political 761 00:40:57,000 --> 00:41:01,759 Speaker 1: reporting is when you see words like BLASTS or SLAMS, 762 00:41:02,160 --> 00:41:06,160 Speaker 1: or, you know, basically capitals, old Batman style.
That usually 763 00:41:06,239 --> 00:41:08,759 Speaker 1: means that someone had a quote that was taken out 764 00:41:08,760 --> 00:41:13,600 Speaker 1: of context, where they were like, yeah, I guess, hypothetically, 765 00:41:13,719 --> 00:41:19,040 Speaker 1: if this person you're talking about is microwaving camels, I 766 00:41:19,080 --> 00:41:20,960 Speaker 1: think it would be a bad thing. And they're like, 767 00:41:21,000 --> 00:41:26,320 Speaker 1: how dare you, senator? But this maybe speaks to, again, 768 00:41:26,480 --> 00:41:30,160 Speaker 1: that requirement of a cadence, right, of a publishing 769 00:41:30,280 --> 00:41:33,080 Speaker 1: cadence, right? We have to have the news, and if 770 00:41:33,120 --> 00:41:36,399 Speaker 1: the news does not exist, we have to therefore make 771 00:41:36,440 --> 00:41:36,880 Speaker 1: the news. 772 00:41:37,040 --> 00:41:39,880 Speaker 5: I have, I have experienced the moment of being taken 773 00:41:39,880 --> 00:41:42,160 Speaker 5: out of context in a way that made me seem 774 00:41:42,480 --> 00:41:48,360 Speaker 5: incredibly lame, where I was quoted in an article for 775 00:41:48,400 --> 00:41:50,080 Speaker 5: an outlet I will not name. I had made a 776 00:41:50,120 --> 00:41:54,480 Speaker 5: TikTok about breastfeeding my son and how much time it 777 00:41:54,480 --> 00:41:56,879 Speaker 5: takes, and like, people don't realize that, like, feeding your 778 00:41:56,960 --> 00:41:59,640 Speaker 5: children with your body takes up so much time, and I 779 00:42:00,040 --> 00:42:03,040 Speaker 5: was giving an interview about it, and I said something 780 00:42:03,040 --> 00:42:05,520 Speaker 5: to the effect of, like, when I'm doing it, I 781 00:42:05,560 --> 00:42:07,400 Speaker 5: spend a lot of time on my phone, which is 782 00:42:07,440 --> 00:42:10,239 Speaker 5: great for my brain.
And I was so obviously, in 783 00:42:10,320 --> 00:42:13,759 Speaker 5: my mind, being sarcastic, and they wrote it up as 784 00:42:13,840 --> 00:42:15,960 Speaker 5: if she was like, well, lucky for her, like, she 785 00:42:16,120 --> 00:42:18,080 Speaker 5: is spending a lot of time on her phone, which 786 00:42:18,280 --> 00:42:21,359 Speaker 5: Schwartz claims is great for her brain. And I was like, 787 00:42:21,800 --> 00:42:26,799 Speaker 5: oh no. And it was just so miscommunicated that then 788 00:42:26,880 --> 00:42:29,239 Speaker 5: seeing it in print just felt, this is like such 789 00:42:29,239 --> 00:42:32,319 Speaker 5: a tiny example, but it's so easy for something just 790 00:42:32,360 --> 00:42:35,080 Speaker 5: to, like, it wasn't malicious. It was just sort of 791 00:42:35,160 --> 00:42:38,920 Speaker 5: like flippant, and it just happened to be a little lazy, 792 00:42:38,960 --> 00:42:41,600 Speaker 5: a little, just a misunderstanding. It just is a minor 793 00:42:41,640 --> 00:42:43,600 Speaker 5: example of just how easy it is for things to 794 00:42:43,600 --> 00:42:45,120 Speaker 5: be misconstrued. 795 00:42:45,120 --> 00:42:50,200 Speaker 1: Especially true when we're encountering something written in print about 796 00:42:50,200 --> 00:42:53,840 Speaker 1: something that was conveyed through audio or video, right? Because 797 00:42:54,280 --> 00:42:57,360 Speaker 1: we don't really, gosh, you know what? No, we should 798 00:42:57,360 --> 00:43:01,560 Speaker 1: do a history of strange punctuation marks, because we don't 799 00:43:01,600 --> 00:43:04,920 Speaker 1: really use, like, the interrobang or, you know, some 800 00:43:05,040 --> 00:43:06,320 Speaker 1: kind of mark that says 801 00:43:06,040 --> 00:43:08,799 Speaker 2: clearly. And I would die on that. A period 802 00:43:08,800 --> 00:43:11,400 Speaker 2: in a text is aggressive, and I will die on it.
803 00:43:11,480 --> 00:43:16,279 Speaker 1: I'll disagree. Lack of a period also seems pretty cold. 804 00:43:16,400 --> 00:43:18,840 Speaker 3: Where do you all fall on that issue? Period at 805 00:43:18,880 --> 00:43:20,120 Speaker 3: the end of a text: aggressive? 806 00:43:20,320 --> 00:43:23,279 Speaker 5: It depends if it's one, if it's one word. It 807 00:43:23,320 --> 00:43:25,200 Speaker 5: depends on what the text is. If it says, if 808 00:43:25,239 --> 00:43:29,280 Speaker 5: it's "fine," period: aggressive. Not if it's a full sentence. 809 00:43:29,640 --> 00:43:32,240 Speaker 3: I agree with that, because then you have some nuance 810 00:43:32,400 --> 00:43:33,760 Speaker 3: while actually communicating a vibe. 811 00:43:33,840 --> 00:43:36,160 Speaker 2: But if it's a word and a period, not a fan. 812 00:43:36,360 --> 00:43:40,520 Speaker 4: I also subscribe to the belief that a text is 813 00:43:40,560 --> 00:43:42,080 Speaker 4: not writing; it is written speech. 814 00:43:42,840 --> 00:43:45,560 Speaker 1: Yeah, I like that. But also, we could argue there 815 00:43:45,560 --> 00:43:47,480 Speaker 1: are a lot of people in the audience tonight who 816 00:43:47,520 --> 00:43:51,919 Speaker 1: would say "fine" without a period is likewise not the 817 00:43:52,000 --> 00:43:52,600 Speaker 1: coolest thing. 818 00:43:52,640 --> 00:43:56,440 Speaker 2: "Fine" in general. "Fine" also, don't like. 819 00:43:57,960 --> 00:43:59,600 Speaker 6: Sounds good? 820 00:43:59,680 --> 00:44:03,840 Speaker 2: "Good" seems dismissive. 821 00:44:04,000 --> 00:44:06,040 Speaker 1: I do a dash. I like to do a dash 822 00:44:06,040 --> 00:44:08,960 Speaker 1: if it's not a sentence. Yeah, I'll just do a dash. 823 00:44:09,080 --> 00:44:10,319 Speaker 1: I'll be like, that sounds great. 824 00:44:10,560 --> 00:44:13,560 Speaker 5: If someone ever texts "it is what it is" period? 825 00:44:13,640 --> 00:44:14,280 Speaker 5: That person.
826 00:44:19,719 --> 00:44:22,080 Speaker 1: Was that, was that Will Pearson? I'll find it. 827 00:44:23,920 --> 00:44:28,279 Speaker 5: I'm just saying, in context, a period can absolutely be devastating. 828 00:44:28,719 --> 00:44:31,319 Speaker 3: All this to say, bring back the interrobang. The 829 00:44:31,400 --> 00:44:33,120 Speaker 3: interrobang, I'm telling you, it's going to be a winner. 830 00:44:33,160 --> 00:44:34,399 Speaker 1: We're going to do an episode. 831 00:44:34,719 --> 00:44:37,080 Speaker 2: Describe this character, this forgotten character. 832 00:44:37,320 --> 00:44:39,319 Speaker 1: So, have you ever seen a question mark and an 833 00:44:39,360 --> 00:44:41,560 Speaker 1: exclamation mark and thought they should hook up? 834 00:44:42,000 --> 00:44:42,120 Speaker 7: Uh. 835 00:44:42,360 --> 00:44:45,919 Speaker 1: It's basically that. Uh, but there are so many more, 836 00:44:46,239 --> 00:44:49,239 Speaker 1: not just punctuation marks, but there are so many more 837 00:44:49,600 --> 00:44:52,719 Speaker 1: hoaxes that are out there in the past and the 838 00:44:52,760 --> 00:44:55,680 Speaker 1: present and probably the future. Not my best segue, but 839 00:44:55,760 --> 00:44:58,839 Speaker 1: here we go. We can't thank you all enough for 840 00:44:58,920 --> 00:45:01,879 Speaker 1: making this show. And as you can tell, we're big 841 00:45:01,920 --> 00:45:06,800 Speaker 1: fans of your previous writing, your unrelated shows, right? Noble Blood, 842 00:45:06,840 --> 00:45:10,759 Speaker 1: Reductress, McSweeney's, by the way, one of my 843 00:45:11,480 --> 00:45:17,160 Speaker 1: dream hangout spots. Where can people learn more about y'all's work, 844 00:45:17,719 --> 00:45:21,560 Speaker 1: not just on Hoax, but on your various other endeavors?
845 00:45:22,360 --> 00:45:24,080 Speaker 4: I feel like the easiest place to find both of 846 00:45:24,160 --> 00:45:28,000 Speaker 4: us is Instagram, because, as Dana mentioned, we're both 847 00:45:28,040 --> 00:45:30,160 Speaker 4: staring at our phones way too much, so we're probably 848 00:45:30,200 --> 00:45:33,560 Speaker 4: just going to be throwing up links on there. I'm 849 00:45:33,760 --> 00:45:37,359 Speaker 4: Lizzie Logan with five z's, so just L-I-Z-Z-Z 850 00:45:37,640 --> 00:45:39,759 Speaker 4: -Z-Z-E-L-O-G-A-N. 851 00:45:40,440 --> 00:45:42,680 Speaker 5: I'm Dana Schwartz, also with a lot of z's, but 852 00:45:42,800 --> 00:45:46,640 Speaker 5: my z's are at the end: Schwartzzz. Listen to 853 00:45:46,680 --> 00:45:50,520 Speaker 5: Noble Blood and listen to Hoax, and if you like it, 854 00:45:50,600 --> 00:45:54,560 Speaker 5: please leave a rating, review, subscribe. It's a new podcast; 855 00:45:54,719 --> 00:45:58,319 Speaker 5: getting new listeners, spreading the word, it means 856 00:45:58,320 --> 00:45:58,640 Speaker 5: a lot. 857 00:45:59,040 --> 00:46:02,920 Speaker 1: And if you like our show, you are gonna love this 858 00:46:03,000 --> 00:46:05,879 Speaker 1: one, folks. We are not blowing smoke. This, this is 859 00:46:06,640 --> 00:46:11,160 Speaker 1: us saying Hoax is amazing, is itself not a hoax, 860 00:46:11,560 --> 00:46:13,600 Speaker 1: so we want to be clear. And someone had to 861 00:46:13,600 --> 00:46:16,120 Speaker 1: say something like that towards the end of the show. Hell, 862 00:46:16,160 --> 00:46:19,960 Speaker 1: you'll love it too. Thank you so much, Lizzie, 863 00:46:19,960 --> 00:46:23,920 Speaker 1: thank you so much, Dana. We are, no, we have 864 00:46:24,000 --> 00:46:25,839 Speaker 1: other people to thank. We'll do it real quick: super 865 00:46:25,880 --> 00:46:29,240 Speaker 1: producer mister Max Williams, Alex Williams who composed our track.
866 00:46:29,880 --> 00:46:34,320 Speaker 3: Christopher Hassiotis and Eves Jeffcoat, here in spirit, Jonathan Strickland the Quister, 867 00:46:34,440 --> 00:46:36,319 Speaker 3: A.J. Bahamas Jacobs the 868 00:46:36,280 --> 00:46:40,839 Speaker 1: Puzzler, and of course, Noel, thanks to you. Tune in 869 00:46:41,000 --> 00:46:44,040 Speaker 1: in the next few days, folks. As the cenobite said, 870 00:46:44,080 --> 00:46:45,640 Speaker 1: we have such wonders to show you. 871 00:46:45,760 --> 00:46:46,920 Speaker 2: We'll see you next time, folks. 872 00:46:53,920 --> 00:46:57,759 Speaker 3: For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 873 00:46:57,800 --> 00:46:59,920 Speaker 3: or wherever you listen to your favorite shows.