1 00:00:05,200 --> 00:00:15,400 Speaker 1: Podcast. Thank you. That is, it's true. It's true. 2 00:00:15,480 --> 00:00:23,800 Speaker 1: Actually, It Could Happen Here, listening to the podcast. That 3 00:00:26,680 --> 00:00:29,200 Speaker 1: is more or less the truth. We have 4 00:00:29,320 --> 00:00:31,960 Speaker 1: dragged Robert out of bed before the crack of 5 00:00:32,080 --> 00:00:35,880 Speaker 1: dawn at eleven forty two a.m. Um, and we're 6 00:00:35,880 --> 00:00:38,640 Speaker 1: gonna talk about actually something very fun. I've 7 00:00:38,680 --> 00:00:40,360 Speaker 1: been wanting to talk about this for a long time, 8 00:00:40,400 --> 00:00:42,640 Speaker 1: because this is actually one of my favorite things. 9 00:00:44,000 --> 00:00:47,360 Speaker 1: Um, yeah, yeah. So I'm gonna tell a 10 00:00:47,360 --> 00:00:50,400 Speaker 1: bit of a little story regarding one 11 00:00:50,440 --> 00:00:53,520 Speaker 1: of my all time favorite events and topics. So 12 00:00:54,120 --> 00:00:58,840 Speaker 1: back in, like, twenty thirteen, there was this cheesy little online 13 00:00:59,120 --> 00:01:03,160 Speaker 1: university science show made by the Rochester Institute of Technology 14 00:01:03,240 --> 00:01:07,120 Speaker 1: called Can You Imagine. Um. The idea was to 15 00:01:07,160 --> 00:01:09,080 Speaker 1: highlight some of the cool and weird things at the 16 00:01:09,120 --> 00:01:12,280 Speaker 1: university, um, in part to promote the Imagine RIT 17 00:01:12,360 --> 00:01:15,600 Speaker 1: Festival, which was like the school's annual innovation 18 00:01:15,680 --> 00:01:20,199 Speaker 1: and creativity festival thing that they put on.
So yeah, today, 19 00:01:20,240 --> 00:01:22,840 Speaker 1: I want to talk specifically about episode three of the 20 00:01:22,880 --> 00:01:27,280 Speaker 1: web series, because the contents of it overlap with some 21 00:01:27,440 --> 00:01:30,319 Speaker 1: of my, like, artistic interests, um, and like just my 22 00:01:30,400 --> 00:01:32,959 Speaker 1: love of illusions and paradox, and it will kind of 23 00:01:32,959 --> 00:01:35,200 Speaker 1: tie into some topics we always discuss on 24 00:01:35,240 --> 00:01:38,520 Speaker 1: the show. So yeah, episode three, probably 25 00:01:38,880 --> 00:01:42,480 Speaker 1: the most interesting episode. Um. Episode three opens with the 26 00:01:42,480 --> 00:01:46,440 Speaker 1: hosts Kevin and Steph as they, like, stand awkwardly in 27 00:01:46,440 --> 00:01:50,680 Speaker 1: a gloriously dated, like, university film set. Like, it's 28 00:01:50,800 --> 00:01:52,880 Speaker 1: only twenty thirteen, but 29 00:01:52,880 --> 00:01:55,800 Speaker 1: it was, like, obviously, like, made in the nineties. 30 00:01:55,800 --> 00:01:57,400 Speaker 1: Like, the set's, it's 31 00:01:57,440 --> 00:02:03,360 Speaker 1: all very dated. What specifically? Oh, like, there's 32 00:02:03,400 --> 00:02:06,520 Speaker 1: just, like, weird, like, 33 00:02:06,720 --> 00:02:09,840 Speaker 1: dated science stuff on the walls. All the hosts are 34 00:02:09,960 --> 00:02:13,520 Speaker 1: wearing, like, dorky orange T shirts, like, 35 00:02:13,560 --> 00:02:18,560 Speaker 1: over top of their regular clothes. Yeah, yeah, it's 36 00:02:18,560 --> 00:02:19,840 Speaker 1: all that kind of, it's all that 37 00:02:19,880 --> 00:02:22,680 Speaker 1: kind of stuff.
So, like, dorky orange T shirts with 38 00:02:22,680 --> 00:02:25,520 Speaker 1: the letters RIT for Rochester Institute 39 00:02:25,520 --> 00:02:28,840 Speaker 1: of Technology. Um. Of course, because everything in this online 40 00:02:28,919 --> 00:02:32,320 Speaker 1: video series is perfect, Kevin is wearing his shirt over 41 00:02:32,360 --> 00:02:35,519 Speaker 1: top of, like, a button down. It's, it's great. Um. 42 00:02:35,560 --> 00:02:38,160 Speaker 1: The first fifty seconds of the video are taken up 43 00:02:38,200 --> 00:02:41,919 Speaker 1: with, like, plugging the upcoming Imagine RIT Festival, 44 00:02:42,680 --> 00:02:46,640 Speaker 1: with co host Steph beautifully stumbling over 45 00:02:47,200 --> 00:02:50,200 Speaker 1: her lines when she says the event's catchphrase: it's where 46 00:02:50,240 --> 00:02:53,480 Speaker 1: the left brain and the right brain collide. And it's great. 47 00:02:53,560 --> 00:02:56,240 Speaker 1: It's, it's perfect. So after all 48 00:02:56,280 --> 00:02:59,240 Speaker 1: the plugs and the vamping, the hosts get down to 49 00:02:59,280 --> 00:03:01,840 Speaker 1: the fun engineering feat that they'll be showing us today, 50 00:03:02,160 --> 00:03:05,680 Speaker 1: which is a neat little architectural experiment, part of 51 00:03:05,760 --> 00:03:08,200 Speaker 1: the RIT campus, called the Escherian 52 00:03:08,360 --> 00:03:12,200 Speaker 1: Stairwell, um, of course named after the impossible staircases depicted 53 00:03:12,240 --> 00:03:16,480 Speaker 1: in Dutch artist M.C. Escher's artwork. So the video cuts 54 00:03:16,960 --> 00:03:19,840 Speaker 1: from the little, like, sound stage they're filming in 55 00:03:19,880 --> 00:03:24,160 Speaker 1: to this boring, white, seemingly typical stairwell, our host Kevin 56 00:03:24,760 --> 00:03:28,360 Speaker 1: ascending a flight of the gray concrete stairs.
Um. He explains 57 00:03:28,400 --> 00:03:31,400 Speaker 1: that, located in Building Seven of the campus, the 58 00:03:31,440 --> 00:03:36,080 Speaker 1: stairwell was designed by Filipino architect Rafael Nelson Aboganda and 59 00:03:36,200 --> 00:03:38,480 Speaker 1: was one of the first structures put up when 60 00:03:38,520 --> 00:03:41,680 Speaker 1: RIT moved their campus from downtown Rochester to the 61 00:03:41,720 --> 00:03:45,119 Speaker 1: more suburban Henrietta. Um, when he's 62 00:03:45,120 --> 00:03:47,080 Speaker 1: reaching the top of the stairs, he turns the corner 63 00:03:47,480 --> 00:03:49,640 Speaker 1: and then suddenly seems to appear at the bottom of 64 00:03:49,680 --> 00:03:52,120 Speaker 1: the lower flight of stairs, leading up to the landing 65 00:03:52,120 --> 00:03:54,360 Speaker 1: that he just left from, all while continuing to talk 66 00:03:54,440 --> 00:03:57,840 Speaker 1: about the architect behind this, like, kind of weird impossible 67 00:03:57,880 --> 00:04:00,480 Speaker 1: feat. Um. So, as Kevin walks back up toward the camera, 68 00:04:00,720 --> 00:04:03,560 Speaker 1: he says that the stairwell was built in sixty 69 00:04:03,800 --> 00:04:07,880 Speaker 1: eight and it's been wowing RIT students ever since. Um, 70 00:04:07,920 --> 00:04:10,520 Speaker 1: it is very cool. It's like, you're like, okay, 71 00:04:10,560 --> 00:04:13,960 Speaker 1: like, you get the little, like, you 72 00:04:13,960 --> 00:04:17,120 Speaker 1: get the little architectural trick that they're doing. Um, but 73 00:04:17,200 --> 00:04:19,000 Speaker 1: it is still pretty fun to see.
74 00:04:19,800 --> 00:04:22,520 Speaker 1: Before episode three of Can You Imagine aired, you 75 00:04:22,680 --> 00:04:25,120 Speaker 1: could already find a few articles on the school's, 76 00:04:25,880 --> 00:04:31,080 Speaker 1: uh, website about the Escherian Stairwell, along with some, like, 77 00:04:31,160 --> 00:04:35,000 Speaker 1: forum posts debating how the architecture in the stairwell works 78 00:04:35,040 --> 00:04:38,320 Speaker 1: to, like, achieve the effect. Um. Also floating around on 79 00:04:38,360 --> 00:04:41,159 Speaker 1: YouTube was, like, a random segment of what looked like 80 00:04:41,200 --> 00:04:44,960 Speaker 1: a, like, PBS style late nineties documentary about the 81 00:04:44,960 --> 00:04:48,440 Speaker 1: physics and architecture of the school, and specifically the stairwell, 82 00:04:49,000 --> 00:04:52,160 Speaker 1: that interviewed some, like, professors, um, and some, like, architects 83 00:04:52,200 --> 00:04:55,839 Speaker 1: and, like, a physicist, kind of discussing, like, how 84 00:04:55,880 --> 00:04:58,920 Speaker 1: to, like, bring paradox into the physical world. Yeah, 85 00:04:58,920 --> 00:05:00,360 Speaker 1: but around the time the Can You 86 00:05:00,400 --> 00:05:04,200 Speaker 1: Imagine episode aired, the now, like, infamous RIT stairwell was 87 00:05:04,240 --> 00:05:08,080 Speaker 1: mostly unknown. So, like, even despite it being very interesting, 88 00:05:08,640 --> 00:05:10,960 Speaker 1: no one really knew about it until this episode of 89 00:05:10,960 --> 00:05:14,640 Speaker 1: this little web series aired. Um.
The little web episode 90 00:05:14,760 --> 00:05:18,960 Speaker 1: dedicates around half its time to interviewing students, uh, and 91 00:05:19,120 --> 00:05:21,720 Speaker 1: seemingly random people at the university about whether they even 92 00:05:21,760 --> 00:05:25,159 Speaker 1: know about the stairwell's existence, um, and if they do, 93 00:05:25,200 --> 00:05:28,800 Speaker 1: what, like, experiences they have with, like, messing around with 94 00:05:28,800 --> 00:05:31,560 Speaker 1: the looping architecture, because yeah, you can, 95 00:05:31,560 --> 00:05:33,240 Speaker 1: you know, you can play a lot of games with 96 00:05:33,279 --> 00:05:35,800 Speaker 1: this type of design. Uh. So 97 00:05:35,920 --> 00:05:37,920 Speaker 1: the rest of the short video, like, tries to demonstrate 98 00:05:38,160 --> 00:05:42,560 Speaker 1: the disorienting ascent down and descent back up via the 99 00:05:42,600 --> 00:05:45,160 Speaker 1: camera in various ways, like, you know, like, human chains 100 00:05:45,279 --> 00:05:48,919 Speaker 1: or holding hands around the weird, like, Möbius loop type 101 00:05:49,160 --> 00:05:51,839 Speaker 1: staircase, then, like, passing objects back and forth in a 102 00:05:51,880 --> 00:05:56,200 Speaker 1: circle while inside and around the enclosed stairwell. Um, there's 103 00:05:56,320 --> 00:05:58,960 Speaker 1: one where Kevin walks around with a cup of water to show 104 00:05:58,960 --> 00:06:01,800 Speaker 1: that the stairs aren't, like, clearly, like, heavily slanted. Like, 105 00:06:01,839 --> 00:06:04,000 Speaker 1: the water stays pretty level as he walks all 106 00:06:04,000 --> 00:06:05,760 Speaker 1: the way through, and, like, we follow with him the 107 00:06:05,880 --> 00:06:09,280 Speaker 1: entire time. Um.
So yeah, like, the overall, like, nerdy 108 00:06:09,320 --> 00:06:12,880 Speaker 1: and lo fi style of the university video matched with, 109 00:06:12,960 --> 00:06:15,960 Speaker 1: like, the insane feat of architectural illusion is a really 110 00:06:16,000 --> 00:06:18,000 Speaker 1: fun mix. Like, it 111 00:06:18,120 --> 00:06:21,000 Speaker 1: is very, like, surreal, but not totally on purpose, because 112 00:06:21,000 --> 00:06:24,960 Speaker 1: it's just all of these, like, regular college students showing 113 00:06:25,000 --> 00:06:28,880 Speaker 1: off this, like, really cool architecture by this really good architect, 114 00:06:29,200 --> 00:06:31,120 Speaker 1: and you're like, oh yeah, they're just so chill 115 00:06:31,200 --> 00:06:34,839 Speaker 1: about it. Um, it is pretty fun. 116 00:06:35,520 --> 00:06:39,200 Speaker 1: It's pretty fun. Um. After the third episode of the 117 00:06:39,240 --> 00:06:43,800 Speaker 1: Can You Imagine RIT video was posted, finally the mind 118 00:06:43,839 --> 00:06:47,800 Speaker 1: boggling looping staircase of Building Seven of RIT 119 00:06:47,960 --> 00:06:51,920 Speaker 1: started to gain a lot of confused appreciation, um, and 120 00:06:51,960 --> 00:06:55,440 Speaker 1: the dorky university science show went viral. People started traveling 121 00:06:55,480 --> 00:06:58,279 Speaker 1: from out of state, even other countries, to see the 122 00:06:58,279 --> 00:07:03,159 Speaker 1: Escherian Stairwell themselves and film videos for social media 123 00:07:03,240 --> 00:07:05,280 Speaker 1: as they walked through it.
There's this one 124 00:07:05,400 --> 00:07:07,599 Speaker 1: video of, like, people traveling from a different country, and 125 00:07:07,600 --> 00:07:10,120 Speaker 1: they're, like, harassing, like, the school staff to try to 126 00:07:10,120 --> 00:07:11,760 Speaker 1: get them to tell them where it is, and they're like, oh 127 00:07:11,800 --> 00:07:14,200 Speaker 1: my god, you're still doing this, because, 128 00:07:14,200 --> 00:07:16,000 Speaker 1: like, this video is, like, 129 00:07:16,200 --> 00:07:18,800 Speaker 1: years old, but it still happens. People still travel 130 00:07:18,840 --> 00:07:22,400 Speaker 1: there specifically to see it. Um. There was, like, 131 00:07:22,480 --> 00:07:26,200 Speaker 1: tense online discussion and debate on how the Filipino architect 132 00:07:26,440 --> 00:07:29,160 Speaker 1: Rafael Aboganda was able to achieve the effect and what 133 00:07:29,280 --> 00:07:32,160 Speaker 1: kind of other bizarre architectural experiments he may have worked on, 134 00:07:32,800 --> 00:07:34,880 Speaker 1: because you can find his Facebook page and you can 135 00:07:34,920 --> 00:07:37,240 Speaker 1: find some stuff about him, but he has not really, 136 00:07:37,240 --> 00:07:39,760 Speaker 1: because, like, this stairwell was built in the late sixties, 137 00:07:40,000 --> 00:07:42,120 Speaker 1: but, so, even though he has 138 00:07:42,160 --> 00:07:44,320 Speaker 1: an online presence, he's, like, not, like, active, 139 00:07:44,680 --> 00:07:49,400 Speaker 1: so it's unclear, like, what else he's actually been doing.
Um, 140 00:07:49,440 --> 00:07:51,240 Speaker 1: but I would, I would love to learn 141 00:07:51,240 --> 00:07:53,880 Speaker 1: more about this architect and what else he's done, because 142 00:07:54,040 --> 00:07:56,400 Speaker 1: it is really rare to have these 143 00:07:56,560 --> 00:08:01,840 Speaker 1: very small, condensed, but, like, high effort type builds. 144 00:08:02,680 --> 00:08:05,760 Speaker 1: And, like, the existence of the whole thing posed 145 00:08:05,800 --> 00:08:09,920 Speaker 1: some really interesting questions around how extremely clever paradoxical design 146 00:08:10,120 --> 00:08:12,600 Speaker 1: can push the boundary of how we make assumptions about 147 00:08:12,640 --> 00:08:15,680 Speaker 1: spatial physics, um, and how we visually and physically demonstrate 148 00:08:15,720 --> 00:08:18,280 Speaker 1: things that we usually can only depict in two dimensions. 149 00:08:18,400 --> 00:08:20,280 Speaker 1: Right, like, you can easily depict the 150 00:08:20,280 --> 00:08:22,920 Speaker 1: Escherian Stairwell in two dimensions, but when you're scaling 151 00:08:22,920 --> 00:08:26,080 Speaker 1: that up to three dimensions, it's obviously more work. Like, 152 00:08:26,080 --> 00:08:28,880 Speaker 1: that is part of the paradox. Um. 153 00:08:29,000 --> 00:08:31,400 Speaker 1: Plus, you know, it also demonstrates the importance of art, 154 00:08:31,520 --> 00:08:34,840 Speaker 1: and how ideas once thought impossible, or merely optical illusions, 155 00:08:35,080 --> 00:08:39,120 Speaker 1: can actually, with enough dedicated effort, break into our real reality. Uh, 156 00:08:39,160 --> 00:08:41,520 Speaker 1: if a brilliant architect can manage to build this physically 157 00:08:41,520 --> 00:08:44,760 Speaker 1: and, like, logically impossible structure, what other types of things 158 00:08:44,760 --> 00:08:47,760 Speaker 1: can we actually view as possible? Um?
The video now 159 00:08:47,800 --> 00:08:50,400 Speaker 1: has, like, over a million views on its original upload, 160 00:08:50,920 --> 00:08:53,640 Speaker 1: um, and videos about the RIT stairwell have 161 00:08:53,800 --> 00:08:55,920 Speaker 1: racked up as many as, like, twenty five million views. 162 00:08:56,960 --> 00:09:01,160 Speaker 1: Yeah, it's pretty cool. Yeah. You know what else demonstrates 163 00:09:01,200 --> 00:09:05,199 Speaker 1: the looping nature of time? Having to listen to all 164 00:09:05,360 --> 00:09:19,240 Speaker 1: these ads that we do. Yep. We are, 165 00:09:19,360 --> 00:09:22,200 Speaker 1: we are back. I've rounded the corner and we 166 00:09:22,240 --> 00:09:26,040 Speaker 1: are back where we came from, um, because of the 167 00:09:26,600 --> 00:09:32,839 Speaker 1: fun paradox of architecture. Um. The one other 168 00:09:32,920 --> 00:09:35,560 Speaker 1: thing I should mention before we continue on this 169 00:09:35,600 --> 00:09:38,600 Speaker 1: episode is that the entire thing is fake. It's false. 170 00:09:38,920 --> 00:09:43,000 Speaker 1: No way. Not this time. We created it. Not this time, no, 171 00:09:43,320 --> 00:09:46,680 Speaker 1: not this time. It's totally made up, because of course 172 00:09:46,880 --> 00:09:49,800 Speaker 1: it is. It's a staircase that breaks the basic rules of 173 00:09:49,880 --> 00:09:53,600 Speaker 1: movement and physics. Kevin walks up the stairs and teleports 174 00:09:53,600 --> 00:09:56,400 Speaker 1: to the lower stairwell. Believe me, that's not, 175 00:09:56,480 --> 00:09:59,520 Speaker 1: that's not an architectural illusion. It's called a good video.
176 00:09:59,520 --> 00:10:04,520 Speaker 1: And, like, a day in Adobe After Effects. Like, no, you're 177 00:10:04,559 --> 00:10:07,599 Speaker 1: really gonna believe a video on the Internet and 178 00:10:07,760 --> 00:10:11,280 Speaker 1: some well placed, falsified Internet posts over the very basic 179 00:10:11,400 --> 00:10:15,640 Speaker 1: rules that govern our universe? But, like, oh boy, did 180 00:10:15,640 --> 00:10:18,920 Speaker 1: it fool millions of people. Uh, and if I played 181 00:10:18,960 --> 00:10:21,200 Speaker 1: my cards right, I hope it fooled most of our listeners until 182 00:10:21,240 --> 00:10:26,040 Speaker 1: the last few seconds. Um, yeah. And, uh, so the 183 00:10:26,080 --> 00:10:30,000 Speaker 1: whole thing was a student, like, 184 00:10:30,120 --> 00:10:34,600 Speaker 1: film and art project around building a modern myth. 185 00:10:35,720 --> 00:10:38,960 Speaker 1: Um, because it sure is interesting 186 00:10:39,240 --> 00:10:44,640 Speaker 1: how good storytelling can overrule obvious logical processes. The tale 187 00:10:44,640 --> 00:10:46,959 Speaker 1: of the Escherian Stairwell is one of my favorite 188 00:10:46,960 --> 00:10:50,360 Speaker 1: case studies in how disinformation spreads and is believed, 189 00:10:50,440 --> 00:10:53,000 Speaker 1: all in defiance of the basic rules of reality, because 190 00:10:53,040 --> 00:10:55,160 Speaker 1: it's not a matter of what facts are true, it's 191 00:10:55,160 --> 00:10:57,800 Speaker 1: about what facts are compelling. And the idea of a 192 00:10:57,840 --> 00:11:02,600 Speaker 1: logically impossible staircase being built by a brilliant Filipino architect 193 00:11:02,800 --> 00:11:06,880 Speaker 1: is more interesting than it being someone's weird disinformation 194 00:11:06,960 --> 00:11:12,000 Speaker 1: art project. Um.
So yeah, like, I want to ask: 195 00:11:12,000 --> 00:11:14,520 Speaker 1: what were you guys thinking as I 196 00:11:14,559 --> 00:11:17,920 Speaker 1: was explaining the Escherian Stairwell? Like, where did you see 197 00:11:17,920 --> 00:11:20,040 Speaker 1: this going? Okay, so I had, in the back of 198 00:11:20,080 --> 00:11:23,880 Speaker 1: my head, okay, we should mention this: Garrison 199 00:11:23,880 --> 00:11:28,200 Speaker 1: has been hyping up this episode for, like, I don't 200 00:11:28,200 --> 00:11:34,480 Speaker 1: even know, a pretty good amount of time. Nothing. Yeah, and then there's 201 00:11:34,480 --> 00:11:38,440 Speaker 1: a staircase, and I'm like, what? What? I was like, 202 00:11:39,000 --> 00:11:41,520 Speaker 1: my brain, my brain started going because of what he said, 203 00:11:42,280 --> 00:11:47,200 Speaker 1: and I was like, my, like, counterinsurgency brain 204 00:11:47,280 --> 00:11:50,640 Speaker 1: flicked on, and I was like, wait a second, hold on, 205 00:11:50,760 --> 00:11:55,000 Speaker 1: is this, like, some kind of, like, weird, like, we've 206 00:11:55,000 --> 00:11:58,200 Speaker 1: redesigned the college campuses so people stopped taking 207 00:11:58,200 --> 00:12:02,079 Speaker 1: the dean hostage, a thing that used to happen constantly. 208 00:12:02,120 --> 00:12:05,120 Speaker 1: And my favorite part about this: it would happen constantly, 209 00:12:05,160 --> 00:12:11,840 Speaker 1: and you'd get New York Times articles calling it nonviolent. Great. Yeah. 210 00:12:12,000 --> 00:12:15,240 Speaker 1: So yeah, that was, yeah. I 211 00:12:15,240 --> 00:12:17,200 Speaker 1: spent more mental energy than I probably should have trying 212 00:12:17,280 --> 00:12:19,319 Speaker 1: to figure out how it worked.
I was like, I 213 00:12:19,320 --> 00:12:21,560 Speaker 1: don't know, maybe they just made it up. Like, if they 214 00:12:21,600 --> 00:12:25,719 Speaker 1: just made it up, Occam's razor, it's obviously, yeah. I mean, 215 00:12:25,880 --> 00:12:27,400 Speaker 1: I was in 216 00:12:27,440 --> 00:12:29,400 Speaker 1: the, like, okay, so they built a staircase, they built 217 00:12:29,440 --> 00:12:33,280 Speaker 1: another one, the viewers cannot see my fingers, and it was like, 218 00:12:33,640 --> 00:12:36,800 Speaker 1: it's a staircase. It doesn't do that. It was 219 00:12:36,840 --> 00:12:39,520 Speaker 1: like, but you can find videos of people 220 00:12:39,520 --> 00:12:41,960 Speaker 1: traveling to the school to see if it's real, and 221 00:12:42,000 --> 00:12:44,520 Speaker 1: they try it and they're so disappointed. They're like, oh yeah, 222 00:12:44,559 --> 00:12:48,080 Speaker 1: it's, no, it's just stairs. It doesn't, it doesn't. Yeah, 223 00:12:48,040 --> 00:12:49,720 Speaker 1: it's disappointing in a lot of ways, because it's 224 00:12:49,760 --> 00:12:51,480 Speaker 1: not even, like, a thing where, like, there's, like, another 225 00:12:51,520 --> 00:12:54,400 Speaker 1: back staircase that you walk down and then back up again. It's just 226 00:12:54,440 --> 00:12:57,280 Speaker 1: nothing. It's just stairs. I was hoping there 227 00:12:57,360 --> 00:13:01,040 Speaker 1: was, like, an actually clever thing. No, it's just, it's 228 00:13:01,080 --> 00:13:07,360 Speaker 1: not really, it's just. It was that meme 229 00:13:07,640 --> 00:13:10,360 Speaker 1: where all the math doesn't add up and the person's like, 230 00:13:11,360 --> 00:13:14,599 Speaker 1: what is happening? I was like, all right, Garrison, you 231 00:13:14,720 --> 00:13:17,600 Speaker 1: got us here. You made Robert get up before noon. 232 00:13:17,840 --> 00:13:21,920 Speaker 1: What is happening?
Well, the real reason I 233 00:13:21,960 --> 00:13:24,440 Speaker 1: got Robert up before noon is 234 00:13:24,480 --> 00:13:28,800 Speaker 1: because I actually, um, have scheduled an interview with the 235 00:13:28,960 --> 00:13:31,920 Speaker 1: creator of the Escherian Stairwell, the actual one, as in, like, 236 00:13:32,559 --> 00:13:37,960 Speaker 1: the online art project and building a modern myth idea, um, 237 00:13:37,960 --> 00:13:41,400 Speaker 1: which we are now going to segue into. So yeah, 238 00:13:42,000 --> 00:13:46,400 Speaker 1: what follows is us talking with the creator of 239 00:13:46,559 --> 00:14:01,080 Speaker 1: the Escherian Stairwell project. Hello, we are back 240 00:14:01,160 --> 00:14:04,839 Speaker 1: from our probably very, very brief break, um, and with 241 00:14:04,880 --> 00:14:10,360 Speaker 1: me, along with Robert and Chris and Sophie, is, uh, Michael, 242 00:14:10,520 --> 00:14:15,640 Speaker 1: the creator of the Escherian Stairwell project and the Building 243 00:14:15,920 --> 00:14:22,560 Speaker 1: the Modern Myth project. Hello, greetings. Thank you so much 244 00:14:22,600 --> 00:14:26,080 Speaker 1: for joining us to talk about one 245 00:14:26,120 --> 00:14:29,200 Speaker 1: of my favorite things, actually, which is 246 00:14:29,200 --> 00:14:33,240 Speaker 1: the Escherian Stairwell project. Um, yeah, I've been a 247 00:14:33,240 --> 00:14:35,120 Speaker 1: fan of this for a long time and found it 248 00:14:35,200 --> 00:14:39,000 Speaker 1: to be really compelling and interesting.
Um, and so 249 00:14:39,080 --> 00:14:43,640 Speaker 1: I just walked Robert and Chris and Sophie through 250 00:14:43,640 --> 00:14:46,280 Speaker 1: what it was, but from the perspective of 251 00:14:46,320 --> 00:14:50,000 Speaker 1: it being true. For, like, a good fifteen minutes 252 00:14:50,120 --> 00:14:52,280 Speaker 1: as the episode was going, I was going 253 00:14:52,320 --> 00:14:55,480 Speaker 1: through talking about it as if it were completely real. 254 00:14:56,120 --> 00:15:02,240 Speaker 1: It was slightly baffling, because, 255 00:15:02,680 --> 00:15:05,360 Speaker 1: again, we were told nothing, and then what we got 256 00:15:05,400 --> 00:15:07,720 Speaker 1: is Garrison talking about a YouTube video about an 257 00:15:07,800 --> 00:15:15,160 Speaker 1: architecture thing, and I was like, what? Here? Yeah, and 258 00:15:15,200 --> 00:15:18,000 Speaker 1: then, and then talking about how, oh yeah, and 259 00:15:18,080 --> 00:15:20,120 Speaker 1: I guess one more thing is that it's actually fake, 260 00:15:20,640 --> 00:15:22,720 Speaker 1: um, and it's part of this whole thing. 261 00:15:23,040 --> 00:15:25,960 Speaker 1: So yeah, I would love to talk to 262 00:15:26,000 --> 00:15:29,240 Speaker 1: you about both, like, how you, like, logistically, like, 263 00:15:29,280 --> 00:15:32,760 Speaker 1: made the project, but also, like, the underlying 264 00:15:32,880 --> 00:15:35,640 Speaker 1: thoughts that, like, inspired you to do it in the 265 00:15:35,640 --> 00:15:38,520 Speaker 1: first place, and then, like, in retrospect now, almost, like, ten 266 00:15:38,600 --> 00:15:40,840 Speaker 1: years later, like, how do you view the project as, 267 00:15:40,880 --> 00:15:43,440 Speaker 1: like, happening, you know, right before, like, the peak of 268 00:15:43,480 --> 00:15:49,080 Speaker 1: online disinformation? Um?
Right. So, first of all, I 269 00:15:49,240 --> 00:15:51,240 Speaker 1: just think we should probably start at the beginning, 270 00:15:51,240 --> 00:15:53,800 Speaker 1: like, what were your inspirations for this type of, 271 00:15:53,840 --> 00:15:56,800 Speaker 1: like, online, like... because it seems, it 272 00:15:56,800 --> 00:16:01,520 Speaker 1: seems built to go viral in a lot of ways. Yes, exactly. 273 00:16:01,560 --> 00:16:05,680 Speaker 1: So this was around twenty eleven, I guess, was when 274 00:16:05,680 --> 00:16:09,680 Speaker 1: I first got the idea. It was for my master's thesis, 275 00:16:09,800 --> 00:16:13,520 Speaker 1: my MFA for film at Rochester's 276 00:16:13,600 --> 00:16:19,200 Speaker 1: RIT, and, um, the idea actually began from this, like, 277 00:16:19,400 --> 00:16:26,200 Speaker 1: deep anxiety about how to discern fact from fiction. Um, 278 00:16:26,240 --> 00:16:29,560 Speaker 1: at the time, like, I came into film school, like, 279 00:16:29,640 --> 00:16:34,200 Speaker 1: really into, like, realism in film, like, Romanian New Wave, 280 00:16:34,640 --> 00:16:38,120 Speaker 1: Michael Haneke, the Dardenne brothers. Like, these are filmmakers who 281 00:16:38,120 --> 00:16:40,920 Speaker 1: are, like, sort of the modern day version 282 00:16:41,000 --> 00:16:44,560 Speaker 1: of Italian neorealism, and they're trying to, like, 283 00:16:44,640 --> 00:16:48,040 Speaker 1: depict, um, reality as it is. I wanted 284 00:16:48,080 --> 00:16:52,160 Speaker 1: to, like, learn how to make those types of films. Um, 285 00:16:52,160 --> 00:16:54,520 Speaker 1: so, like, with each year, that's what I tried 286 00:16:54,560 --> 00:16:57,880 Speaker 1: to get better at. And the more I tried to 287 00:16:58,200 --> 00:17:02,440 Speaker 1: do that, um, well, like, a number of things were 288 00:17:02,440 --> 00:17:05,800 Speaker 1: happening around that time.
Right. In class, they showed us 289 00:17:05,840 --> 00:17:09,359 Speaker 1: these mockumentaries called No Lies, which was made in 290 00:17:09,480 --> 00:17:13,040 Speaker 1: nineteen seventy three by this guy called Mitchell Block and actually 291 00:17:13,040 --> 00:17:17,080 Speaker 1: won a student Oscar at the time, and, uh, Delusions 292 00:17:17,080 --> 00:17:20,320 Speaker 1: in Modern Primitivism, two thousand one, by this guy named 293 00:17:20,400 --> 00:17:25,240 Speaker 1: Daniel Laughlin, um, and these, like, I was, like, floored, 294 00:17:25,359 --> 00:17:31,600 Speaker 1: because I thought they were real, like, real documentaries, and, um, 295 00:17:31,800 --> 00:17:35,880 Speaker 1: and it bothered me. Like, our teachers told us afterwards 296 00:17:35,880 --> 00:17:38,439 Speaker 1: that these were actually scripted works of fiction with, like, 297 00:17:38,640 --> 00:17:42,160 Speaker 1: really, really good actors, and, like, I went into 298 00:17:42,240 --> 00:17:46,199 Speaker 1: kind of, like, existential crisis mode afterwards. Like, how do 299 00:17:46,240 --> 00:17:49,119 Speaker 1: I even discern what's true from what's not if I 300 00:17:49,160 --> 00:17:52,879 Speaker 1: got fooled by these things? Especially, like, that's, like, my concentration, 301 00:17:53,160 --> 00:17:55,600 Speaker 1: that's what I've been studying for years, and even I 302 00:17:55,640 --> 00:17:58,760 Speaker 1: was not able to tell that they were fake. Right. 303 00:17:59,440 --> 00:18:02,280 Speaker 1: There was that going on, and then there was, like, 304 00:18:02,600 --> 00:18:05,159 Speaker 1: smartphones were becoming a thing. Like, I just looked it up: 305 00:18:05,600 --> 00:18:10,840 Speaker 1: smartphones hadn't started outselling flip phones yet. So 306 00:18:10,880 --> 00:18:13,560 Speaker 1: around this time, like, it was becoming a thing where 307 00:18:13,640 --> 00:18:16,159 Speaker 1: everyone would have the Internet in their pockets.
So I 308 00:18:16,160 --> 00:18:21,399 Speaker 1: guess there was that anxiety going on, trying to think about, um, 309 00:18:21,520 --> 00:18:26,480 Speaker 1: how we're starting to function. And I 310 00:18:26,520 --> 00:18:29,960 Speaker 1: remember when I proposed my thesis to the thesis committee, 311 00:18:29,960 --> 00:18:35,199 Speaker 1: um, one of the things that I was telling 312 00:18:35,240 --> 00:18:39,639 Speaker 1: them was, um, I have this worry about how reliant 313 00:18:39,720 --> 00:18:43,200 Speaker 1: we are on the Internet to determine what's true and 314 00:18:43,240 --> 00:18:46,240 Speaker 1: what's not. And, like, my professors found 315 00:18:46,320 --> 00:18:49,680 Speaker 1: my concerns, like, really abstract and theoretical. Like, why do 316 00:18:49,800 --> 00:18:53,720 Speaker 1: you even care? Right? Like, why do you care 317 00:18:54,000 --> 00:18:56,679 Speaker 1: about fact and fiction? It wasn't like fake news 318 00:18:56,760 --> 00:19:00,040 Speaker 1: was even a thing. It didn't become part of 319 00:19:00,119 --> 00:19:03,760 Speaker 1: the everyday lexicon, like you said, until twenty sixteen, when Trump 320 00:19:03,840 --> 00:19:07,560 Speaker 1: started throwing that term around and suddenly we hear about 321 00:19:07,600 --> 00:19:10,679 Speaker 1: it every day. Um, so there was that going on. 322 00:19:10,840 --> 00:19:13,680 Speaker 1: Trayvon Martin was a thing, and for the first time, 323 00:19:13,760 --> 00:19:19,439 Speaker 1: like, nationally, you could see, like, disinformation, you know, 324 00:19:19,560 --> 00:19:23,919 Speaker 1: just, like, exaggerated versions of different accounts from, like, 325 00:19:24,000 --> 00:19:29,160 Speaker 1: polarizing sides.
So all that was going on, and so 326 00:19:29,280 --> 00:19:33,360 Speaker 1: I wanted to... this 327 00:19:33,440 --> 00:19:38,320 Speaker 1: film project was about trying to take something that 328 00:19:38,560 --> 00:19:41,879 Speaker 1: was... are you familiar with the difference between 329 00:19:42,000 --> 00:19:47,280 Speaker 1: a priori knowledge and a posteriori knowledge? Yeah? Okay, so, 330 00:19:48,280 --> 00:19:51,040 Speaker 1: like, you know, for anyone who might 331 00:19:51,080 --> 00:19:54,480 Speaker 1: be listening that doesn't really know the exact difference: a 332 00:19:54,880 --> 00:19:58,560 Speaker 1: priori knowledge is the type of knowledge that you can 333 00:19:58,800 --> 00:20:02,919 Speaker 1: have without needing to make observations or conduct experiments or 334 00:20:03,000 --> 00:20:05,920 Speaker 1: look at surveys or do any research of any kind. 335 00:20:06,160 --> 00:20:09,040 Speaker 1: It's the sort of knowledge you can know just by 336 00:20:09,280 --> 00:20:12,800 Speaker 1: reasoning it out. Just by sitting in a room 337 00:20:12,880 --> 00:20:16,040 Speaker 1: by yourself in the dark, you could figure things out. 338 00:20:16,119 --> 00:20:19,160 Speaker 1: This is a priori knowledge. 339 00:20:19,240 --> 00:20:21,639 Speaker 1: So an example of that is knowing that 340 00:20:21,760 --> 00:20:26,320 Speaker 1: all bachelors are unmarried, right, or all triangles have three sides. 341 00:20:26,720 --> 00:20:30,800 Speaker 1: That's a priori knowledge. An example of a posteriori knowledge 342 00:20:31,280 --> 00:20:36,360 Speaker 1: is something that you find out through observation, 343 00:20:36,680 --> 00:20:39,760 Speaker 1: something you need one or more of your five senses for, right, 344 00:20:40,200 --> 00:20:44,720 Speaker 1: like, Joe Biden is the President of the United States.
Or, 345 00:20:44,760 --> 00:20:47,320 Speaker 1: the mass of Mars is six point four one seven 346 00:20:47,359 --> 00:20:50,159 Speaker 1: one times ten to the twenty-third kilograms. You actually have to 347 00:20:50,200 --> 00:20:53,600 Speaker 1: go out into the world and conduct surveys or do research. 348 00:20:53,680 --> 00:20:56,680 Speaker 1: So that's a posteriori knowledge. So the idea was to 349 00:20:56,760 --> 00:21:02,359 Speaker 1: take something that was a priori, something that could 350 00:21:02,680 --> 00:21:07,639 Speaker 1: be disproven by reason alone. Like, you 351 00:21:07,680 --> 00:21:10,879 Speaker 1: wouldn't need to do any research in 352 00:21:11,040 --> 00:21:13,439 Speaker 1: order to know that it was false. You'd simply 353 00:21:13,440 --> 00:21:18,520 Speaker 1: have to reflect on it and think about it. 354 00:21:18,560 --> 00:21:20,840 Speaker 1: So we could have picked anything, right? 355 00:21:21,160 --> 00:21:23,760 Speaker 1: We could have made up, like, a fake news 356 00:21:23,800 --> 00:21:28,520 Speaker 1: report that leading mathematicians at MIT had invented 357 00:21:28,560 --> 00:21:32,920 Speaker 1: a square with five sides, something like that. You know, 358 00:21:33,040 --> 00:21:35,640 Speaker 1: I remember Weekend Update on SNL had 359 00:21:35,680 --> 00:21:39,960 Speaker 1: this sketch. I forget who 360 00:21:39,960 --> 00:21:42,080 Speaker 1: it was. It might have been Kevin Nealon or something 361 00:21:42,119 --> 00:21:46,159 Speaker 1: like that. The report was like, scientists and mathematicians have 362 00:21:46,280 --> 00:21:50,159 Speaker 1: discovered a new number. The number exists between five and six, 363 00:21:50,200 --> 00:21:52,560 Speaker 1: and they're calling it the number spleen, you know, something 364 00:21:52,640 --> 00:21:58,040 Speaker 1: like that, which is just impossible.
So, 365 00:21:58,160 --> 00:22:01,560 Speaker 1: the idea was to come up with something that could be disproven by reason alone, 366 00:22:02,000 --> 00:22:05,320 Speaker 1: and at the same time surround it with this wealth of 367 00:22:05,440 --> 00:22:11,359 Speaker 1: online information supporting its veracity. So, you know, 368 00:22:11,400 --> 00:22:14,159 Speaker 1: it was kind of a social experiment. I was like, 369 00:22:14,240 --> 00:22:18,119 Speaker 1: are we so far beyond rational thinking that 370 00:22:18,400 --> 00:22:23,080 Speaker 1: people would believe even something that can be disproven 371 00:22:23,080 --> 00:22:25,439 Speaker 1: a priori? And we didn't really know the 372 00:22:25,480 --> 00:22:27,840 Speaker 1: answer to that, but we were going to commit to 373 00:22:28,000 --> 00:22:31,560 Speaker 1: creating this thing as though it was real, even though 374 00:22:31,720 --> 00:22:34,919 Speaker 1: it was logically impossible. So in a way it 375 00:22:35,000 --> 00:22:40,480 Speaker 1: anticipated the age of disinformation, because it wasn't 376 00:22:40,520 --> 00:22:43,320 Speaker 1: just... Yeah. The thing I kind of alluded to in 377 00:22:43,359 --> 00:22:45,480 Speaker 1: my little scripted portion is that, yeah, it wasn't 378 00:22:45,520 --> 00:22:47,639 Speaker 1: just the YouTube video.
There was also this extra online 379 00:22:47,680 --> 00:22:52,240 Speaker 1: content that was created, some of which was articles. So yeah, 380 00:22:52,400 --> 00:22:54,480 Speaker 1: you could find articles, forum posts, 381 00:22:54,680 --> 00:22:56,840 Speaker 1: all this kind of stuff. 382 00:22:56,880 --> 00:22:58,520 Speaker 1: So you could look into it more and 383 00:22:58,640 --> 00:23:01,359 Speaker 1: find these other things, but it still contradicts the basic 384 00:23:01,440 --> 00:23:05,040 Speaker 1: logical processes that we can use to discern what is 385 00:23:05,080 --> 00:23:08,199 Speaker 1: real and what is not. Like, yeah, in terms of 386 00:23:08,200 --> 00:23:10,520 Speaker 1: believing in a five-sided square, no, that's 387 00:23:10,560 --> 00:23:13,560 Speaker 1: not how physics and 388 00:23:13,680 --> 00:23:18,400 Speaker 1: the spatial dimensions work. So yeah, 389 00:23:18,480 --> 00:23:20,960 Speaker 1: and then in terms of all the extra material you 390 00:23:21,040 --> 00:23:23,199 Speaker 1: filmed for it, because 391 00:23:23,240 --> 00:23:26,640 Speaker 1: I think I read there was around nine hours of documentary footage. 392 00:23:27,000 --> 00:23:30,280 Speaker 1: It was a lot of footage, but 393 00:23:30,359 --> 00:23:34,880 Speaker 1: it was only made into probably a thirty minute thing. 394 00:23:34,960 --> 00:23:37,520 Speaker 1: We got our friends, at the very beginning, 395 00:23:37,560 --> 00:23:39,240 Speaker 1: we got our friends to play along with it, like, 396 00:23:39,280 --> 00:23:41,760 Speaker 1: whenever you see posts about this, just comment like 397 00:23:41,920 --> 00:23:44,160 Speaker 1: it's real: yeah, I was there, it was really great.
398 00:23:44,840 --> 00:23:49,520 Speaker 1: And eventually people would actually start visiting the stairwell, 399 00:23:49,760 --> 00:23:53,440 Speaker 1: like, from all over, like from Canada. They crossed the 400 00:23:53,480 --> 00:23:57,040 Speaker 1: border to get there because it's in upstate New York, right? 401 00:23:57,240 --> 00:24:00,600 Speaker 1: And I actually ran into a couple from India 402 00:24:00,720 --> 00:24:03,159 Speaker 1: who happened to be visiting New York, and they 403 00:24:03,160 --> 00:24:04,960 Speaker 1: were like, since we're here, we'd like to see this 404 00:24:05,040 --> 00:24:09,399 Speaker 1: stairwell, that sort of thing. Oh no, I know. 405 00:24:09,520 --> 00:24:11,879 Speaker 1: I felt really bad for a lot of the visitors, 406 00:24:11,920 --> 00:24:14,520 Speaker 1: so we actually had to come up with souvenirs so 407 00:24:14,560 --> 00:24:17,240 Speaker 1: that they wouldn't leave empty handed, right? So we made 408 00:24:17,240 --> 00:24:20,040 Speaker 1: postcards saying, I've been to the 409 00:24:20,119 --> 00:24:23,920 Speaker 1: Escherian Stairwell, stuff like that. And 410 00:24:24,160 --> 00:24:27,320 Speaker 1: what happened, in the way we explained it, so 411 00:24:27,359 --> 00:24:29,720 Speaker 1: a lot of people were really mad, actually, you know, 412 00:24:29,760 --> 00:24:33,680 Speaker 1: as you can imagine, when they got there. But after 413 00:24:33,880 --> 00:24:36,840 Speaker 1: we would explain to them what we were doing with 414 00:24:36,920 --> 00:24:39,760 Speaker 1: the project, a lot of them actually started 415 00:24:39,760 --> 00:24:41,800 Speaker 1: playing along and thought it was really cool, and they went 416 00:24:41,880 --> 00:24:44,800 Speaker 1: home with their souvenirs and told their friends that they 417 00:24:44,880 --> 00:24:47,080 Speaker 1: just saw this amazing thing.
So, you know, it kind 418 00:24:47,080 --> 00:24:49,480 Speaker 1: of built that way for a little bit. I mean, yeah, 419 00:24:49,520 --> 00:24:53,200 Speaker 1: because it's like telling kids that Santa isn't real, exactly, 420 00:24:53,520 --> 00:24:55,400 Speaker 1: and then some of them will be like, 421 00:24:55,400 --> 00:24:57,200 Speaker 1: okay, cool, this means I can play along 422 00:24:57,240 --> 00:24:59,639 Speaker 1: with the myth to help keep other kids happy. And 423 00:24:59,760 --> 00:25:02,640 Speaker 1: some of them will be like, what? Oh no, my entire 424 00:25:02,720 --> 00:25:07,880 Speaker 1: reality is broken. And when you find out, you 425 00:25:07,920 --> 00:25:11,239 Speaker 1: try to, like, pass it on. Exactly that. So a 426 00:25:11,240 --> 00:25:15,240 Speaker 1: lot of that was going on. Like, Shaq the basketball 427 00:25:15,240 --> 00:25:17,880 Speaker 1: player posted about it at some point, Joe Rogan talked 428 00:25:17,920 --> 00:25:21,280 Speaker 1: about it on his podcast, it got kind of crazy. 429 00:25:21,920 --> 00:25:26,919 Speaker 1: Wait, wait, did Joe Rogan know it wasn't real? 430 00:25:26,960 --> 00:25:29,000 Speaker 1: It's funny, you should see the clip of him doing it, 431 00:25:29,040 --> 00:25:30,879 Speaker 1: because it was him, and who's the 432 00:25:30,920 --> 00:25:34,320 Speaker 1: other guy, Bert Kreischer or something. Anyway, they were arguing 433 00:25:34,320 --> 00:25:36,240 Speaker 1: about whether or not it was real. The other guy was like, no, 434 00:25:36,320 --> 00:25:39,560 Speaker 1: it's real, it's so real. Rogan was like, all you 435 00:25:39,600 --> 00:25:42,280 Speaker 1: guys are fucking idiots. You're all idiots. Let's google it 436 00:25:42,400 --> 00:25:45,360 Speaker 1: right now.
They google it and they look up an 437 00:25:45,440 --> 00:25:49,000 Speaker 1: article, and Joe Rogan's like, okay, yeah, alright, it's 438 00:25:49,000 --> 00:25:51,639 Speaker 1: still fucking stupid. The guy who built it is fucking... 439 00:25:51,960 --> 00:25:58,000 Speaker 1: you know. You have no idea how 440 00:25:58,000 --> 00:26:00,600 Speaker 1: happy that makes me, because in 441 00:26:00,680 --> 00:26:03,359 Speaker 1: my research, like, I have read your thesis, 442 00:26:03,400 --> 00:26:06,080 Speaker 1: I read lots of articles about this, and I 443 00:26:06,119 --> 00:26:10,680 Speaker 1: did not come across the Rogan clip, but I wish I had, right? 444 00:26:11,320 --> 00:26:14,200 Speaker 1: It's way back, right, it's like ten years ago, 445 00:26:14,240 --> 00:26:15,960 Speaker 1: and it's a lot of stuff to dig through. 446 00:26:16,040 --> 00:26:20,119 Speaker 1: And I found it, though. Again, so, I'd like 447 00:26:20,160 --> 00:26:21,760 Speaker 1: to kind of go into the logistics of 448 00:26:21,800 --> 00:26:23,800 Speaker 1: actually doing this, in terms of creating all the 449 00:26:23,800 --> 00:26:27,200 Speaker 1: fake web content, but also, you know, dreaming 450 00:26:27,280 --> 00:26:30,639 Speaker 1: up this family friendly science show that's made by 451 00:26:30,640 --> 00:26:32,679 Speaker 1: RIT, and, you know, the 452 00:26:32,720 --> 00:26:35,359 Speaker 1: distinction between naturalism and realism, and making it, like, 453 00:26:35,720 --> 00:26:39,760 Speaker 1: not trying to replicate reality, but playing as if 454 00:26:39,760 --> 00:26:44,560 Speaker 1: it were reality, and how those are two different things.
Yeah, 455 00:26:44,600 --> 00:26:47,920 Speaker 1: well, we wanted to make it as real as possible, 456 00:26:48,000 --> 00:26:51,760 Speaker 1: and that's what I'd been studying anyway, 457 00:26:51,760 --> 00:26:57,400 Speaker 1: but in a dramatic context, like making narrative films. 458 00:26:57,600 --> 00:27:01,440 Speaker 1: And the idea was, there's this event at 459 00:27:01,560 --> 00:27:03,720 Speaker 1: RIT every year which gets a lot of people, 460 00:27:03,840 --> 00:27:06,840 Speaker 1: like thirty thousand people a year go to the campus 461 00:27:06,880 --> 00:27:11,240 Speaker 1: and look at whatever the students 462 00:27:11,280 --> 00:27:13,280 Speaker 1: are working on. It's kind of like a mini 463 00:27:13,480 --> 00:27:16,560 Speaker 1: festival type of thing. Well, not mini, it's pretty big. 464 00:27:16,680 --> 00:27:18,800 Speaker 1: So we wanted to make a video for 465 00:27:18,880 --> 00:27:22,480 Speaker 1: that event, as though we were promoting the event: hey, 466 00:27:22,480 --> 00:27:25,119 Speaker 1: come see the Escherian Stairwell when you get to 467 00:27:25,200 --> 00:27:28,680 Speaker 1: RIT. And, you know, normally for these 468 00:27:28,720 --> 00:27:31,399 Speaker 1: events, if you have a 469 00:27:31,400 --> 00:27:33,960 Speaker 1: booth or something, you'll see reservations, and you'll see like 470 00:27:34,000 --> 00:27:37,240 Speaker 1: four people reserved, fifteen people. We started 471 00:27:37,240 --> 00:27:40,240 Speaker 1: getting nervous, we got a sense 472 00:27:40,280 --> 00:27:42,679 Speaker 1: that this was gonna be big, because when I 473 00:27:42,720 --> 00:27:46,440 Speaker 1: looked at the reservations for our non-existent 474 00:27:46,560 --> 00:27:53,600 Speaker 1: stairwell, they were like one thousand plus visitors.
Yeah, 475 00:27:53,880 --> 00:27:56,720 Speaker 1: I still remember going to campus that day of 476 00:27:56,720 --> 00:28:00,440 Speaker 1: the festival, Saturday, and my friend Ira comes 477 00:28:00,520 --> 00:28:03,120 Speaker 1: up to me like, Mike, people want to kill you, 478 00:28:03,240 --> 00:28:05,480 Speaker 1: take cover, get over here, and I was trying 479 00:28:05,560 --> 00:28:09,880 Speaker 1: to not show my face anyway. Yeah, that's what... So 480 00:28:09,920 --> 00:28:12,919 Speaker 1: a lot of the legs of 481 00:28:12,920 --> 00:28:15,359 Speaker 1: the project was just word of mouth, I guess, 482 00:28:15,560 --> 00:28:18,840 Speaker 1: and we actually ran out of money. We didn't 483 00:28:18,920 --> 00:28:21,959 Speaker 1: get to do the web stuff on the scale 484 00:28:22,040 --> 00:28:24,480 Speaker 1: that we wanted to, but it turned out that we 485 00:28:24,520 --> 00:28:27,760 Speaker 1: didn't even have to. In fact, within a few 486 00:28:27,840 --> 00:28:31,440 Speaker 1: days, or maybe a week or something, after the original 487 00:28:31,560 --> 00:28:35,520 Speaker 1: video came out, I posted a video explaining that it's 488 00:28:35,560 --> 00:28:37,720 Speaker 1: a myth. I posted it and I was like, 489 00:28:37,760 --> 00:28:39,720 Speaker 1: all right, that was a fun ride. Now it was 490 00:28:39,800 --> 00:28:45,320 Speaker 1: gonna be over, because here's a video of me explaining everything, right? 491 00:28:45,640 --> 00:28:48,239 Speaker 1: And people still didn't believe it. People were saying that 492 00:28:48,320 --> 00:28:51,800 Speaker 1: my video explaining it was fake, that it was a conspiracy. 493 00:28:51,880 --> 00:28:57,160 Speaker 1: People were, you know, invested in the 494 00:28:57,200 --> 00:28:59,480 Speaker 1: actual myth of it.
Yeah, because for 495 00:28:59,520 --> 00:29:02,200 Speaker 1: a lot of people, the myth 496 00:29:02,320 --> 00:29:05,040 Speaker 1: is more compelling than the idea that this is, like, 497 00:29:05,120 --> 00:29:07,920 Speaker 1: you know, a project around what is real and what's not. 498 00:29:08,080 --> 00:29:10,280 Speaker 1: They've got so invested in the reality of it that 499 00:29:11,120 --> 00:29:18,080 Speaker 1: they'll explain away every other explanation. Right, right, exactly. 500 00:29:19,120 --> 00:29:22,200 Speaker 1: Like, I had a teacher at Rutgers, where I 501 00:29:22,200 --> 00:29:24,920 Speaker 1: did my undergrad, Tim Maudlin. He used to say that, 502 00:29:25,480 --> 00:29:28,320 Speaker 1: you know, there's two types of thinking. There's reasoning and 503 00:29:28,360 --> 00:29:32,280 Speaker 1: there's rationalizing. Reasoning is when you start from a 504 00:29:32,320 --> 00:29:35,720 Speaker 1: place of ignorance and you look at the best 505 00:29:35,800 --> 00:29:39,160 Speaker 1: evidence and the best arguments you can find and follow 506 00:29:39,240 --> 00:29:43,680 Speaker 1: that through to, you know, the rational conclusion. Rationalizing 507 00:29:43,760 --> 00:29:46,840 Speaker 1: is when you start from what you want to believe 508 00:29:47,360 --> 00:29:52,800 Speaker 1: and work backwards, looking for confirmation, right, looking 509 00:29:52,840 --> 00:29:55,520 Speaker 1: for the arguments that already support what you're saying. 510 00:29:55,560 --> 00:29:58,840 Speaker 1: There was a lot of rationalizing going on. 511 00:29:59,120 --> 00:30:05,400 Speaker 1: I guess people wanted to believe it.
Yeah. For 512 00:30:05,480 --> 00:30:08,040 Speaker 1: the... how many people were in on this? Because I 513 00:30:08,040 --> 00:30:09,960 Speaker 1: assume, for all of the filming, everyone 514 00:30:10,040 --> 00:30:11,840 Speaker 1: was in on it. But yeah, you know, 515 00:30:11,840 --> 00:30:13,440 Speaker 1: there's a whole bunch of great stuff around it, like all of 516 00:30:13,480 --> 00:30:15,360 Speaker 1: the man-on-the-street segments are 517 00:30:15,480 --> 00:30:18,360 Speaker 1: perfectly done, in terms of people just 518 00:30:18,640 --> 00:30:22,720 Speaker 1: acting like regular university students, talking about the stairwell 519 00:30:22,840 --> 00:30:24,880 Speaker 1: and how they got lost, and then they're 520 00:30:24,920 --> 00:30:27,800 Speaker 1: looping around in a circle. And all the 521 00:30:27,840 --> 00:30:31,040 Speaker 1: segments with you inside the stairwell, with 522 00:30:31,080 --> 00:30:32,960 Speaker 1: the very clever editing. 523 00:30:33,200 --> 00:30:36,880 Speaker 1: I assume you're using stuff like Adobe After Effects. 524 00:30:36,960 --> 00:30:40,480 Speaker 1: And yeah, it's played so well. 525 00:30:40,520 --> 00:30:42,240 Speaker 1: I think part of 526 00:30:42,240 --> 00:30:45,200 Speaker 1: why it's so successful is that it's not filmed like 527 00:30:45,320 --> 00:30:48,080 Speaker 1: you would film something to hide things. Like, for 528 00:30:48,120 --> 00:30:49,760 Speaker 1: a lot of films, when they want to do, 529 00:30:49,760 --> 00:30:52,360 Speaker 1: you know, the term is like 530 00:30:52,400 --> 00:30:54,120 Speaker 1: a oner, where they have one long shot and 531 00:30:54,120 --> 00:30:56,719 Speaker 1: then they hide the transitions in between.
You can 532 00:30:56,760 --> 00:30:58,640 Speaker 1: obviously tell they're filming it to make 533 00:30:58,680 --> 00:31:01,560 Speaker 1: these transitions work, versus the way you film this is 534 00:31:01,760 --> 00:31:04,600 Speaker 1: just how people would film it if they were filming 535 00:31:04,600 --> 00:31:08,000 Speaker 1: this for real. And you can tell that, and it's 536 00:31:08,120 --> 00:31:12,239 Speaker 1: so carefully done, because it's not trying to 537 00:31:12,320 --> 00:31:14,920 Speaker 1: be something it isn't. It is just being the thing 538 00:31:15,040 --> 00:31:18,160 Speaker 1: so earnestly, in terms of how the actors 539 00:31:18,200 --> 00:31:21,760 Speaker 1: stumble over their lines, and the opening segment, 540 00:31:21,880 --> 00:31:24,080 Speaker 1: the aesthetics of all of the title 541 00:31:24,160 --> 00:31:27,240 Speaker 1: cards and everything. It has 542 00:31:27,280 --> 00:31:30,560 Speaker 1: this aura of earnestness, which I think helps sell 543 00:31:30,600 --> 00:31:35,560 Speaker 1: the whole project so much. Yeah. Yeah, actually, speaking 544 00:31:35,560 --> 00:31:38,560 Speaker 1: of the show and the cheesy title cards and stuff, 545 00:31:39,080 --> 00:31:41,920 Speaker 1: my girlfriend at the time was a producer for this 546 00:31:42,000 --> 00:31:46,440 Speaker 1: local show called Homework Hotline, where 547 00:31:46,520 --> 00:31:48,560 Speaker 1: kids call in with their homework and they answer the 548 00:31:48,920 --> 00:31:52,320 Speaker 1: questions about it.
I studied the shit out of that show, 549 00:31:52,600 --> 00:31:55,479 Speaker 1: just looking at how they built the sets and how 550 00:31:55,640 --> 00:31:59,240 Speaker 1: cheesy and how awkward the hosts are, because a 551 00:31:59,240 --> 00:32:01,480 Speaker 1: lot of the realism, 552 00:32:01,480 --> 00:32:04,800 Speaker 1: I think, is just, yeah, the awkwardness 553 00:32:04,960 --> 00:32:08,680 Speaker 1: of the people, how it's not really 554 00:32:08,720 --> 00:32:12,040 Speaker 1: meant to be... And, like, the 555 00:32:12,080 --> 00:32:17,280 Speaker 1: most convincing untruths, right, are a combination of fact and fiction. Yeah, 556 00:32:17,360 --> 00:32:20,200 Speaker 1: and, you know, blending in the 557 00:32:20,280 --> 00:32:24,400 Speaker 1: actors with real people in 558 00:32:24,520 --> 00:32:27,680 Speaker 1: the actual video, stuff like that. It's like, yeah, 559 00:32:27,720 --> 00:32:33,440 Speaker 1: it goes pretty viral. You pretty 560 00:32:33,520 --> 00:32:36,600 Speaker 1: quickly create a very easy explanation, no, it's not 561 00:32:36,680 --> 00:32:38,520 Speaker 1: real, it's part of this project, and people still 562 00:32:38,520 --> 00:32:40,960 Speaker 1: believe it for years and years. As 563 00:32:40,960 --> 00:32:44,800 Speaker 1: the decade progresses, we go into the era of disinformation, 564 00:32:44,880 --> 00:32:47,480 Speaker 1: everyone starts getting phones in their pockets. Everyone has Facebook 565 00:32:47,480 --> 00:32:49,200 Speaker 1: with them wherever they go, everyone has Twitter with them 566 00:32:49,200 --> 00:32:51,480 Speaker 1: wherever they go.
How have your views on 567 00:32:51,600 --> 00:32:54,720 Speaker 1: the ethics of the project and what it demonstrates, 568 00:32:54,840 --> 00:32:56,440 Speaker 1: in terms of a case study and a 569 00:32:56,480 --> 00:32:58,680 Speaker 1: social experiment, how has that changed over the years, 570 00:32:58,760 --> 00:33:00,920 Speaker 1: from, like, ten years ago when you were dreaming 571 00:33:00,920 --> 00:33:04,200 Speaker 1: this up, to now, after, you know, we've had 572 00:33:04,240 --> 00:33:07,160 Speaker 1: stuff like January 6th and QAnon? 573 00:33:07,400 --> 00:33:09,360 Speaker 1: You know, all these types of things which I feel 574 00:33:09,400 --> 00:33:11,479 Speaker 1: like are almost foreshadowed, in this 575 00:33:11,520 --> 00:33:16,160 Speaker 1: weird way, by how successful your little project was. Yeah, 576 00:33:16,520 --> 00:33:21,040 Speaker 1: so, a lot of the criticisms 577 00:33:21,360 --> 00:33:25,280 Speaker 1: that it faced from the get go, like from 578 00:33:25,400 --> 00:33:28,600 Speaker 1: RIT professors even, it's still facing right now. 579 00:33:28,880 --> 00:33:31,600 Speaker 1: It's still the type of thing people bring up, 580 00:33:31,640 --> 00:33:36,239 Speaker 1: which is essentially that, hey, there's so much disinformation out there. 581 00:33:36,280 --> 00:33:40,320 Speaker 1: At the time, we weren't even using those terms, disinformation, right, 582 00:33:40,600 --> 00:33:44,040 Speaker 1: but basically people were bringing up the same complaints, 583 00:33:44,080 --> 00:33:48,440 Speaker 1: which is, there's so much disinformation out there, you're basically 584 00:33:48,480 --> 00:33:51,000 Speaker 1: just adding to it. What are you even doing?
585 00:33:51,040 --> 00:33:55,200 Speaker 1: So I guess the idea is, and you know, 586 00:33:55,280 --> 00:33:59,840 Speaker 1: it's a very noble idea, which is, how do we respond 587 00:34:00,040 --> 00:34:03,480 Speaker 1: to disinformation? Right, I guess the idea 588 00:34:03,560 --> 00:34:05,880 Speaker 1: is we should call out every instance of it when 589 00:34:05,880 --> 00:34:10,280 Speaker 1: we can: flag posts, report posts that violate 590 00:34:10,280 --> 00:34:15,000 Speaker 1: community standards, you know, speak out, provide counter-evidence 591 00:34:15,040 --> 00:34:17,160 Speaker 1: when you see fake news, that sort of thing. And 592 00:34:17,200 --> 00:34:22,000 Speaker 1: I think that's great, that's a good thing. But 593 00:34:23,040 --> 00:34:26,200 Speaker 1: the problem of disinformation, at the time, this is 594 00:34:26,400 --> 00:34:28,279 Speaker 1: kind of how I explained it, like, ten years ago, 595 00:34:28,320 --> 00:34:33,239 Speaker 1: I described it as an epidemic, yeah, right, 596 00:34:33,520 --> 00:34:36,520 Speaker 1: or like a cockroach infestation: every time you 597 00:34:36,640 --> 00:34:40,399 Speaker 1: kill one, ten more spring up. And this 598 00:34:40,480 --> 00:34:44,239 Speaker 1: notion of, like, we gotta call out every instance of 599 00:34:44,280 --> 00:34:48,480 Speaker 1: disinformation and stomp it out, it's great, but 600 00:34:48,520 --> 00:34:52,160 Speaker 1: you're focused on killing cockroaches. Yeah, it's like addressing the symptoms, 601 00:34:52,200 --> 00:34:54,279 Speaker 1: not the actual problem. Right, I want to get to 602 00:34:54,320 --> 00:34:58,440 Speaker 1: the cockroaches' nest.
Right. And whenever I give talks 603 00:34:58,520 --> 00:35:03,440 Speaker 1: about this project, people always approach me afterwards, 604 00:35:03,960 --> 00:35:06,640 Speaker 1: you know, wanting me to... because we 605 00:35:06,640 --> 00:35:08,720 Speaker 1: don't just talk about this project. We talk about 606 00:35:08,719 --> 00:35:12,200 Speaker 1: deepfake stuff, like we show speeches of Obama 607 00:35:12,320 --> 00:35:15,680 Speaker 1: looking like the real Obama, but it's completely fake, right? 608 00:35:16,239 --> 00:35:18,480 Speaker 1: And people start to realize, like, holy shit, 609 00:35:18,520 --> 00:35:21,000 Speaker 1: I don't even know what's real or not anymore. Like, 610 00:35:21,080 --> 00:35:24,319 Speaker 1: what can I trust? And they approach me expecting me 611 00:35:24,400 --> 00:35:27,280 Speaker 1: to ease their anxiety somehow and kind of guide 612 00:35:27,280 --> 00:35:29,839 Speaker 1: them through how to discern what's true from what's not, 613 00:35:30,040 --> 00:35:34,200 Speaker 1: as though my project was about finding some sort of solution. 614 00:35:34,920 --> 00:35:38,440 Speaker 1: And I tell them that my project wasn't about 615 00:35:38,480 --> 00:35:42,120 Speaker 1: solving the problem. It was about seeing the problem, right? 616 00:35:42,160 --> 00:35:45,200 Speaker 1: It's about trying to get to the heart 617 00:35:45,280 --> 00:35:48,080 Speaker 1: of the matter. And to me, I think 618 00:35:48,120 --> 00:35:50,600 Speaker 1: the heart of the matter, like the cockroaches' nest, 619 00:35:50,840 --> 00:35:56,319 Speaker 1: is the... I don't know.
There are different ways 620 00:35:56,320 --> 00:36:00,120 Speaker 1: to say it, but basically, the lack of critical thinking 621 00:36:00,160 --> 00:36:03,320 Speaker 1: in individuals and in this society we shape together, 622 00:36:03,960 --> 00:36:08,000 Speaker 1: or the lack of a willingness to think through 623 00:36:08,080 --> 00:36:13,360 Speaker 1: things carefully. Maybe that's... Like, if we 624 00:36:13,440 --> 00:36:16,120 Speaker 1: had a society of critical thinkers, this wouldn't be much 625 00:36:16,120 --> 00:36:18,240 Speaker 1: of a problem. I think it's because so many people 626 00:36:18,320 --> 00:36:20,880 Speaker 1: come at a lot of information from, you 627 00:36:20,880 --> 00:36:24,080 Speaker 1: would say, the rational viewpoint, like they're trying to 628 00:36:24,200 --> 00:36:26,240 Speaker 1: use reason and stuff, they're trying to think critically, 629 00:36:26,239 --> 00:36:28,799 Speaker 1: they're trying to think logically, but they come at it 630 00:36:28,880 --> 00:36:32,399 Speaker 1: in terms of rationalizing stuff they already believe. 631 00:36:32,520 --> 00:36:35,800 Speaker 1: And I think that's a very prevailing type of idea, 632 00:36:35,840 --> 00:36:37,759 Speaker 1: in terms of, yes, I'm gonna believe in this thing, 633 00:36:37,840 --> 00:36:41,640 Speaker 1: so I'm gonna find evidence to support it, which 634 00:36:41,680 --> 00:36:44,319 Speaker 1: isn't critical thinking, I don't think. I don't really know 635 00:36:44,360 --> 00:36:47,360 Speaker 1: what it is. That is itself a logical fallacy. But 636 00:36:47,440 --> 00:36:49,440 Speaker 1: that is so common, especially on the Internet, because the 637 00:36:49,480 --> 00:36:52,520 Speaker 1: Internet encourages the backfire effect. You know, whenever someone calls 638 00:36:52,560 --> 00:36:55,359 Speaker 1: you out on something, you want to be right.
So, 639 00:36:55,360 --> 00:36:57,640 Speaker 1: as soon as someone calls you out on something, 640 00:36:57,680 --> 00:37:00,239 Speaker 1: you're going to backfire. You're going to become 641 00:37:00,440 --> 00:37:04,120 Speaker 1: even more entrenched in what you believe. When, you know, 642 00:37:04,200 --> 00:37:06,680 Speaker 1: when you explain to someone that, no, Hillary Clinton 643 00:37:06,840 --> 00:37:09,600 Speaker 1: is bad, but she doesn't eat the blood of children, it's like, no, 644 00:37:09,880 --> 00:37:12,200 Speaker 1: she does, I saw this thing, I have to 645 00:37:12,200 --> 00:37:14,239 Speaker 1: believe it, because all of these things are tied 646 00:37:14,320 --> 00:37:16,560 Speaker 1: up in what makes you a person. And now all 647 00:37:16,600 --> 00:37:19,120 Speaker 1: of these ideas that used to 648 00:37:19,120 --> 00:37:20,920 Speaker 1: just be conspiracy theories that you could believe in for fun 649 00:37:20,960 --> 00:37:24,600 Speaker 1: are now so much a part of people's sense 650 00:37:24,640 --> 00:37:27,000 Speaker 1: of being, and of their entire world 651 00:37:27,080 --> 00:37:30,960 Speaker 1: view, because the Internet is 652 00:37:31,000 --> 00:37:32,759 Speaker 1: such a bigger part of their lives. Everything on the 653 00:37:32,800 --> 00:37:35,080 Speaker 1: Internet is a bigger part of each person's life. 654 00:37:35,320 --> 00:37:37,319 Speaker 1: So it is more of an ontological threat, because these 655 00:37:37,320 --> 00:37:39,360 Speaker 1: things are so much closer together now, right? There used to 656 00:37:39,360 --> 00:37:41,720 Speaker 1: be much more distinction between the Internet and you, because 657 00:37:41,719 --> 00:37:43,520 Speaker 1: you could only access the computer every once in a while.
658 00:37:43,680 --> 00:37:45,880 Speaker 1: You can now carry a supercomputer wherever you go, 659 00:37:46,160 --> 00:37:47,440 Speaker 1: so it is like a part of you. Like, you 660 00:37:47,480 --> 00:37:49,840 Speaker 1: bring it with you almost everywhere. It's always in your pocket. 661 00:37:50,440 --> 00:37:54,480 Speaker 1: So these things are so, like, stitched together that prying 662 00:37:54,520 --> 00:37:57,040 Speaker 1: them apart and telling people, no, this thing you 663 00:37:57,040 --> 00:37:59,640 Speaker 1: carry around, actually probably most of what you see on it 664 00:37:59,719 --> 00:38:03,160 Speaker 1: isn't actually true. Like, people can 665 00:38:03,680 --> 00:38:07,320 Speaker 1: believe that in their heads, but the 666 00:38:07,719 --> 00:38:09,560 Speaker 1: belief hasn't actually impacted them. Because we all 667 00:38:09,560 --> 00:38:11,960 Speaker 1: know that people can 668 00:38:12,000 --> 00:38:13,600 Speaker 1: just go on the internet and lie, right? That is 669 00:38:13,600 --> 00:38:16,799 Speaker 1: like part of the joke. But we still don't act 670 00:38:16,840 --> 00:38:19,480 Speaker 1: like it. Oftentimes we get so 671 00:38:21,040 --> 00:38:23,880 Speaker 1: encased in the stories that we tell ourselves.
Right, 672 00:38:24,080 --> 00:38:25,600 Speaker 1: part of why the Escherian Stairwell is so 673 00:38:25,640 --> 00:38:27,720 Speaker 1: good is that it's such a compelling 674 00:38:27,760 --> 00:38:29,680 Speaker 1: story. Like, the idea of a brilliant 675 00:38:29,719 --> 00:38:33,719 Speaker 1: architect, you know, building this paradox in the 676 00:38:33,800 --> 00:38:37,839 Speaker 1: real world is so much more fun than 677 00:38:37,880 --> 00:38:39,960 Speaker 1: being like, yeah, some dude just knows how to use 678 00:38:40,080 --> 00:38:43,840 Speaker 1: Adobe After Effects, right? So you get so entrenched 679 00:38:43,880 --> 00:38:48,480 Speaker 1: in storytelling, because the story of, like, politicians 680 00:38:48,480 --> 00:38:50,840 Speaker 1: eating the blood of children is so much more interesting 681 00:38:51,120 --> 00:38:55,359 Speaker 1: than, no, politicians just don't care about you. And 682 00:38:55,520 --> 00:38:57,439 Speaker 1: getting to the heart of that problem is so much 683 00:38:57,480 --> 00:39:00,959 Speaker 1: more difficult than just, you know, debunking things, because 684 00:39:01,000 --> 00:39:05,680 Speaker 1: you can debunk things all day, and does that actually matter? Yeah, 685 00:39:05,920 --> 00:39:08,080 Speaker 1: I think there's a secondary problem, like, 686 00:39:08,200 --> 00:39:10,680 Speaker 1: you know, there's another level of it, which 687 00:39:10,719 --> 00:39:12,919 Speaker 1: is that, yeah, everyone knows that there's disinformation now, 688 00:39:13,640 --> 00:39:15,880 Speaker 1: like everyone does.
But that just makes it worse, 689 00:39:15,920 --> 00:39:18,160 Speaker 1: because now if you want to do disinformation, what 690 00:39:18,280 --> 00:39:19,680 Speaker 1: you can claim is, like, oh, hey, look at all 691 00:39:19,719 --> 00:39:21,799 Speaker 1: these other times that all this stuff has been fake. 692 00:39:21,840 --> 00:39:23,080 Speaker 1: And then, you know, this is how you get 693 00:39:23,160 --> 00:39:26,200 Speaker 1: everyone doing frame-by-frame analysis of, like, a 694 00:39:26,280 --> 00:39:29,120 Speaker 1: bombing and going, oh, these are all crisis actors. And, 695 00:39:29,160 --> 00:39:30,960 Speaker 1: you know, you talk to 696 00:39:30,960 --> 00:39:33,839 Speaker 1: these people and they're like, oh, yeah, no, I did 697 00:39:33,920 --> 00:39:35,680 Speaker 1: the research. Look, I saw through 698 00:39:35,680 --> 00:39:38,319 Speaker 1: the lies. And it's like, no, you've just completely made this 699 00:39:38,360 --> 00:39:40,920 Speaker 1: thing up in your head. You can see the green 700 00:39:41,040 --> 00:39:44,440 Speaker 1: screen compression. And, like, no, it's just regular video compression. 701 00:39:44,440 --> 00:39:47,000 Speaker 1: And it's like, yeah, everyone can be a detective now, 702 00:39:47,280 --> 00:39:49,800 Speaker 1: so everyone can be so convinced of their own conclusions, 703 00:39:49,880 --> 00:39:52,960 Speaker 1: even when the conclusions turn out to be not true. Right, 704 00:39:55,200 --> 00:39:57,880 Speaker 1: it's a problem. If there was an easy solution, we wouldn't 705 00:39:57,880 --> 00:40:00,239 Speaker 1: have the problem, right? It's one of those things. It's 706 00:40:00,320 --> 00:40:02,600 Speaker 1: like, your project is a very good example of, like, 707 00:40:03,320 --> 00:40:05,799 Speaker 1: it's a very demonstrative thing.
You can, 708 00:40:05,840 --> 00:40:08,960 Speaker 1: like, take someone along this journey and demonstrate, hey, 709 00:40:09,080 --> 00:40:11,080 Speaker 1: this can happen to you, so you should watch out 710 00:40:11,080 --> 00:40:13,480 Speaker 1: for it, right? Look at the story I crafted. 711 00:40:13,719 --> 00:40:16,040 Speaker 1: Look how you became convinced of it for these six minutes, 712 00:40:16,360 --> 00:40:19,920 Speaker 1: and then you think, oh, wait, no, you can't teleport 713 00:40:19,960 --> 00:40:23,400 Speaker 1: to the bottom of a stairwell. That's not how that works. Um, 714 00:40:23,440 --> 00:40:25,719 Speaker 1: but because you take them on that journey, it's a 715 00:40:25,800 --> 00:40:27,440 Speaker 1: very, I love it so much as, like, 716 00:40:27,480 --> 00:40:31,759 Speaker 1: a demonstrative process, being like, this can happen, so 717 00:40:31,840 --> 00:40:33,759 Speaker 1: watch out for it in the future. I think that is 718 00:40:33,800 --> 00:40:39,319 Speaker 1: honestly more useful than just debunking somebody, because you can 719 00:40:39,600 --> 00:40:41,360 Speaker 1: debunk all day, you can have the backfire 720 00:40:41,400 --> 00:40:45,120 Speaker 1: effect and stuff. And you're right about the demonstrative stuff, 721 00:40:45,120 --> 00:40:48,400 Speaker 1: because it's like, if a bunch of film students and 722 00:40:48,520 --> 00:40:52,439 Speaker 1: volunteers with no connections and no resources pulled this off. 723 00:40:52,480 --> 00:40:55,360 Speaker 1: Like, we did a tally of all the videos 724 00:40:55,960 --> 00:40:59,000 Speaker 1: at the end of the year of, um, you know, 725 00:40:59,040 --> 00:41:01,200 Speaker 1: all the videos that ripped it off and posted it on 726 00:41:01,239 --> 00:41:03,880 Speaker 1: their own channels and all that.
Um, it was like 727 00:41:03,960 --> 00:41:06,600 Speaker 1: fifty million, right? So if a bunch of film 728 00:41:06,640 --> 00:41:10,640 Speaker 1: students had that much influence, what more could, like, 729 00:41:10,760 --> 00:41:17,279 Speaker 1: people who actually have funding and resources, right, what 730 00:41:17,320 --> 00:41:20,160 Speaker 1: could they do? And ours 731 00:41:20,239 --> 00:41:23,960 Speaker 1: was about this innocuous, silly stairwell. It wasn't about 732 00:41:23,960 --> 00:41:27,920 Speaker 1: anything that would cause, you know, anyone's death or anything 733 00:41:27,960 --> 00:41:31,960 Speaker 1: like that. Unlike, you know, something like in 734 00:41:32,160 --> 00:41:37,160 Speaker 1: Myanmar, where the Myanmar military basically 735 00:41:37,200 --> 00:41:43,720 Speaker 1: systematically created fake articles and fake photos to 736 00:41:43,840 --> 00:41:47,560 Speaker 1: arouse this disdain for the Rohingya 737 00:41:47,560 --> 00:41:52,480 Speaker 1: people, and basically they incited a genocide through Facebook, just 738 00:41:52,520 --> 00:41:55,000 Speaker 1: through fake news, right? And in the Philippines, where I live 739 00:41:55,120 --> 00:42:00,480 Speaker 1: right now, um, which a lot of commentators call, like, 740 00:42:00,640 --> 00:42:07,520 Speaker 1: the patient zero of disinformation, because this guy Duterte was 741 00:42:07,560 --> 00:42:12,680 Speaker 1: elected president basically running his entire campaign on disinformation, 742 00:42:13,480 --> 00:42:16,120 Speaker 1: and after him was Brexit, like a month later, and 743 00:42:16,239 --> 00:42:20,000 Speaker 1: after that Trump got the nomination. So, like, what's 744 00:42:20,000 --> 00:42:23,279 Speaker 1: her name, Katie Harbath or something like that.
745 00:42:24,160 --> 00:42:26,799 Speaker 1: One of the executives of Facebook referred to the Philippines 746 00:42:26,880 --> 00:42:33,200 Speaker 1: as patient zero in the era of disinformation. Um, 747 00:42:33,280 --> 00:42:38,040 Speaker 1: and the thing that the president here right now was 748 00:42:38,160 --> 00:42:41,800 Speaker 1: running on was basically the same sort of, um, 749 00:42:42,640 --> 00:42:45,680 Speaker 1: othering and scapegoating of a certain group. And 750 00:42:45,719 --> 00:42:48,799 Speaker 1: basically, he's the guy who said, like, basically, if you're 751 00:42:48,800 --> 00:42:51,200 Speaker 1: a drug user or a drug dealer, it should be 752 00:42:51,239 --> 00:42:54,240 Speaker 1: okay to murder you and kill you. And that's what happened. 753 00:42:54,239 --> 00:42:57,239 Speaker 1: That's exactly what happened, because they were posting all these 754 00:42:57,280 --> 00:43:01,960 Speaker 1: stories about, um, you know, the same sorts of stories 755 00:43:01,960 --> 00:43:04,400 Speaker 1: that we saw in the US in 756 00:43:04,440 --> 00:43:08,680 Speaker 1: twenty sixteen about undocumented immigrants or Muslims or something like that. 757 00:43:08,960 --> 00:43:11,920 Speaker 1: Like, oh, this undocumented immigrant raped the five year 758 00:43:11,960 --> 00:43:15,400 Speaker 1: old girl, you know, that sort of thing. And 759 00:43:15,480 --> 00:43:20,960 Speaker 1: the organized campaign, um, making up stories 760 00:43:21,000 --> 00:43:26,600 Speaker 1: about drug addicts murdering and raping people basically 761 00:43:26,800 --> 00:43:30,920 Speaker 1: got an entire nation to, well, not an entire nation, 762 00:43:30,960 --> 00:43:35,319 Speaker 1: but basically this guy won the election.
And, you know, 763 00:43:35,880 --> 00:43:40,480 Speaker 1: we have a country right now that basically lived through 764 00:43:41,000 --> 00:43:45,359 Speaker 1: just atrocities the last five, six years, you know. And, 765 00:43:45,400 --> 00:43:48,080 Speaker 1: like, the double-edged sword of this, like Chris mentioned, 766 00:43:48,120 --> 00:43:51,960 Speaker 1: is like, yeah, this same type of thing, because 767 00:43:52,000 --> 00:43:55,520 Speaker 1: it exists, people also retroactively apply it to, like, 768 00:43:55,760 --> 00:43:58,960 Speaker 1: you know, like, Sandy Hook was staged, or even 769 00:43:59,000 --> 00:44:01,799 Speaker 1: stuff now with, like, you know, the pandemic, right? People are 770 00:44:02,000 --> 00:44:04,319 Speaker 1: like, what if the pandemic isn't real? 771 00:44:04,400 --> 00:44:07,040 Speaker 1: What if all these people have just, you know, conjured 772 00:44:07,040 --> 00:44:10,640 Speaker 1: this thing into being, and it's all a giant disinformation campaign, right? 773 00:44:10,680 --> 00:44:12,919 Speaker 1: So it has this double-edged 774 00:44:12,920 --> 00:44:17,440 Speaker 1: sword nature, um, which makes combating disinformation so challenging. 775 00:44:17,560 --> 00:44:19,759 Speaker 1: It's like disinformation to combat disinformation, disinformation to combat the 776 00:44:19,760 --> 00:44:23,719 Speaker 1: idea of disinformation, and there's so many layers of it 777 00:44:23,760 --> 00:44:27,600 Speaker 1: now. It's just, yeah, it makes 778 00:44:27,600 --> 00:44:29,239 Speaker 1: actually getting to the heart of it so 779 00:44:29,320 --> 00:44:32,480 Speaker 1: much more challenging. It's been abstracted so many times. And 780 00:44:32,520 --> 00:44:35,120 Speaker 1: one of the things, wasn't the New York 781 00:44:35,160 --> 00:44:36,640 Speaker 1: Times the first to come up with 782 00:44:36,640 --> 00:44:38,480 Speaker 1: the term fake news?
And then Trump started using it 783 00:44:38,520 --> 00:44:42,440 Speaker 1: after. Like, I can't quite verify which newspaper it was, but 784 00:44:42,680 --> 00:44:44,200 Speaker 1: my memory of it was, it was 785 00:44:44,239 --> 00:44:45,759 Speaker 1: the media that came up with fake news, 786 00:44:45,800 --> 00:44:48,880 Speaker 1: and then Trump just took it, and it became 787 00:44:48,960 --> 00:44:52,200 Speaker 1: this, like, this demon they absolutely could not 788 00:44:52,239 --> 00:44:55,520 Speaker 1: control, and it just got turned on them. Do you 789 00:44:55,560 --> 00:44:58,680 Speaker 1: remember the context in which they used it? They 790 00:44:58,680 --> 00:45:01,960 Speaker 1: were, I think they were calling, like, 791 00:45:02,200 --> 00:45:06,319 Speaker 1: stuff that Trump said fake news. Mmm, I'm 792 00:45:06,400 --> 00:45:10,840 Speaker 1: unsure at the moment who specifically coined that term. But 793 00:45:10,880 --> 00:45:13,600 Speaker 1: I mean, we definitely see terms, even 794 00:45:13,600 --> 00:45:16,200 Speaker 1: terms like disinformation, which used to be tied to, 795 00:45:16,280 --> 00:45:19,720 Speaker 1: like, more specific discourse, like even 796 00:45:19,719 --> 00:45:21,880 Speaker 1: as far back as, like, the eighties, 797 00:45:22,320 --> 00:45:25,160 Speaker 1: getting, you know, turned into an actual, like, political term 798 00:45:25,480 --> 00:45:29,800 Speaker 1: that everybody uses. It was actually somebody from BuzzFeed. 799 00:45:30,080 --> 00:45:33,080 Speaker 1: An editor at BuzzFeed was, that makes sense, 800 00:45:35,080 --> 00:45:40,200 Speaker 1: one of the ones who first popularized it. Yeah, 801 00:45:40,560 --> 00:45:43,960 Speaker 1: but there could be, you know, several 802 00:45:44,000 --> 00:45:46,480 Speaker 1: other people that say that they coined it. I don't know.
803 00:45:47,200 --> 00:45:51,000 Speaker 1: I mean, there's even, uh, an illustration from 804 00:45:51,120 --> 00:45:56,120 Speaker 1: eighteen ninety four by Frederick Opper, with reporters carrying 805 00:45:56,120 --> 00:46:00,799 Speaker 1: newspapers labeled humbug news, cheap sensation, and fake news. So, 806 00:46:01,360 --> 00:46:02,880 Speaker 1: I mean, in terms of just 807 00:46:02,920 --> 00:46:05,080 Speaker 1: mashing words together, I'm sure it has had a 808 00:46:05,120 --> 00:46:07,680 Speaker 1: decent history, but definitely Trump is the one that 809 00:46:08,239 --> 00:46:15,280 Speaker 1: launched it into the zeitgeist. Right, right, right. Let's see, Robert, 810 00:46:15,320 --> 00:46:17,160 Speaker 1: you've been pretty quiet. I know it's pretty early in 811 00:46:17,200 --> 00:46:18,840 Speaker 1: the morning for you. Do you have any 812 00:46:18,880 --> 00:46:21,120 Speaker 1: kind of thoughts to help us kind of 813 00:46:21,360 --> 00:46:24,160 Speaker 1: generally start closing us out? Not, like, super immediately, 814 00:46:24,239 --> 00:46:26,919 Speaker 1: but generally heading in that direction. I mean, no, not really, 815 00:46:27,000 --> 00:46:31,239 Speaker 1: we've kind of wrapped up everything, I would say. All right, 816 00:46:31,360 --> 00:46:38,120 Speaker 1: all right. Ah, yeah, I guess, uh, Mike, 817 00:46:38,280 --> 00:46:42,080 Speaker 1: how has this project impacted how you approach film 818 00:46:42,280 --> 00:46:44,440 Speaker 1: and just, like, how you use the 819 00:46:44,440 --> 00:46:50,840 Speaker 1: internet yourself in the past decade? Mhm. Well, I'm 820 00:46:50,840 --> 00:46:57,160 Speaker 1: fully aware of what we did.
Every time I'm 821 00:46:57,239 --> 00:46:59,399 Speaker 1: looking at something, I'm like, they could have done that, 822 00:47:00,000 --> 00:47:01,959 Speaker 1: they could have done this and that, you know, that sort 823 00:47:02,000 --> 00:47:08,960 Speaker 1: of thing. Um, I don't know. Yeah, 824 00:47:09,000 --> 00:47:11,240 Speaker 1: I'm not sure how this project 825 00:47:11,280 --> 00:47:17,839 Speaker 1: specifically has impacted me, other than just trying to think 826 00:47:17,840 --> 00:47:21,680 Speaker 1: through things a bit more carefully, trying to go through 827 00:47:21,760 --> 00:47:27,200 Speaker 1: things, like, um. I mean, so we basically 828 00:47:27,760 --> 00:47:31,839 Speaker 1: came up with this idea of what eventually became troll farms, right? 829 00:47:31,920 --> 00:47:36,200 Speaker 1: Like, me and my classmates would even 830 00:47:36,239 --> 00:47:40,280 Speaker 1: make fake accounts and, like, talk about the stairwell. And, 831 00:47:40,520 --> 00:47:44,919 Speaker 1: um, so, I don't know, like, a few years later, 832 00:47:45,000 --> 00:47:48,000 Speaker 1: we learned that people were actually doing this 833 00:47:48,160 --> 00:47:52,480 Speaker 1: to influence, like, elections around the world. And a 834 00:47:52,560 --> 00:47:56,520 Speaker 1: lot of the strategy of, like, the Russian troll farms 835 00:47:56,520 --> 00:48:03,080 Speaker 1: and stuff, um, was basically to create caricature versions, right, of 836 00:48:03,200 --> 00:48:07,839 Speaker 1: arguments from whatever side. Like, you know, they might 837 00:48:07,880 --> 00:48:11,160 Speaker 1: present an argument from, like, the left or the right, 838 00:48:11,200 --> 00:48:14,839 Speaker 1: but in a caricatured version of it.
And, um, 839 00:48:14,880 --> 00:48:17,680 Speaker 1: so what people would see when they see that, they'd 840 00:48:17,680 --> 00:48:20,120 Speaker 1: see an argument coming from the other side, and they'd 841 00:48:20,200 --> 00:48:23,040 Speaker 1: ridicule it, like, look at these people who just seem 842 00:48:23,160 --> 00:48:27,600 Speaker 1: crazy espousing this whatever view, you know. Or they might 843 00:48:27,640 --> 00:48:31,320 Speaker 1: say things like, um, like, yeah, if you're a Democrat, 844 00:48:31,400 --> 00:48:34,120 Speaker 1: you want to abort babies at, like, the ninth month 845 00:48:34,239 --> 00:48:37,160 Speaker 1: or something like that, which no reasonable person actually argued. 846 00:48:37,360 --> 00:48:41,480 Speaker 1: So what happens is, like, um, people talk about how 847 00:48:41,640 --> 00:48:45,960 Speaker 1: the goal of Russia was to, like, polarize, you know, um, 848 00:48:46,120 --> 00:48:51,080 Speaker 1: polarize the political spectrum. I think, like, the bigger goal, 849 00:48:51,520 --> 00:48:55,799 Speaker 1: the goal that we're gonna be untangling for 850 00:48:55,880 --> 00:49:01,320 Speaker 1: many, many years, and the more, um, the more difficult 851 00:49:01,680 --> 00:49:06,360 Speaker 1: problem to deal with, was that they successfully 852 00:49:06,400 --> 00:49:10,600 Speaker 1: oversimplified discourse, you know what I'm saying? Like, they found 853 00:49:10,640 --> 00:49:14,759 Speaker 1: a way to, like, oversimplify the type of discourse we're 854 00:49:14,800 --> 00:49:20,200 Speaker 1: having, because everyone's, like, arguing in such simplistic terms. I'm 855 00:49:20,239 --> 00:49:22,200 Speaker 1: not sure if I'm making sense.
It's like, 856 00:49:22,320 --> 00:49:25,760 Speaker 1: it's like, the term I use is, like, politics as fandom, 857 00:49:26,719 --> 00:49:30,160 Speaker 1: right, right. And that, I think, 858 00:49:30,280 --> 00:49:32,960 Speaker 1: not exactly what you're saying, but, like, intersects with that 859 00:49:33,000 --> 00:49:36,839 Speaker 1: type of idea, of, like, condensing down actual discussions on, 860 00:49:36,960 --> 00:49:40,880 Speaker 1: like, what you believe in, um, and what politics you 861 00:49:40,960 --> 00:49:42,960 Speaker 1: want, and how you want to put the world into 862 00:49:42,960 --> 00:49:45,200 Speaker 1: this weird fandom lens of, like, this team versus this 863 00:49:45,239 --> 00:49:47,279 Speaker 1: team. Which, we've 864 00:49:47,280 --> 00:49:49,120 Speaker 1: had a degree of that for a long, long time. 865 00:49:49,800 --> 00:49:52,520 Speaker 1: But with the Internet, and how discussions on the 866 00:49:52,560 --> 00:49:55,120 Speaker 1: Internet are designed to work, right, how algorithms want to 867 00:49:55,160 --> 00:49:57,960 Speaker 1: boost content, how there's always these short snippets, it just 868 00:49:58,080 --> 00:50:00,880 Speaker 1: mirrors the way people discuss, like, what Star Wars 869 00:50:00,920 --> 00:50:05,040 Speaker 1: character is their favorite. It's just that, but for politics. Um, 870 00:50:05,160 --> 00:50:08,719 Speaker 1: so it's just this, like, what if politics is 871 00:50:08,760 --> 00:50:11,319 Speaker 1: just this idea of fandom, and you can debate which 872 00:50:11,440 --> 00:50:13,799 Speaker 1: fandom is more valid than the other. Right, I like 873 00:50:13,920 --> 00:50:16,560 Speaker 1: The Last Jedi more, you like The Rise of Skywalker, this 874 00:50:16,600 --> 00:50:19,000 Speaker 1: means your version of reality is less good than mine.
875 00:50:19,280 --> 00:50:26,839 Speaker 1: But it's 876 00:50:26,880 --> 00:50:30,440 Speaker 1: that same idea, but for how we, like, make social 877 00:50:30,480 --> 00:50:33,080 Speaker 1: programs, and how we address racism, and how we, like, 878 00:50:33,480 --> 00:50:36,240 Speaker 1: give food to poor people, and how we do affordable 879 00:50:36,280 --> 00:50:39,239 Speaker 1: housing, and how we handle the police. So it's that 880 00:50:39,280 --> 00:50:45,400 Speaker 1: type of idea. And the disinformation kind of 881 00:50:45,440 --> 00:50:49,120 Speaker 1: impacts this in part because, when you flood the zone 882 00:50:49,360 --> 00:50:53,319 Speaker 1: with so much conflicting information that people can't really get 883 00:50:53,320 --> 00:50:56,480 Speaker 1: a handle on or easily sort, like, when 884 00:50:56,520 --> 00:51:00,040 Speaker 1: you put that much confusion into 885 00:51:00,080 --> 00:51:03,000 Speaker 1: the air, um, it makes people more likely to 886 00:51:03,080 --> 00:51:06,200 Speaker 1: just kind of grasp at sides, because everything coming out 887 00:51:06,280 --> 00:51:08,959 Speaker 1: is way too complicated and messy, and it takes 888 00:51:08,960 --> 00:51:12,520 Speaker 1: too much work to figure out what's actually true. So holding to some 889 00:51:12,600 --> 00:51:15,719 Speaker 1: rubric of, well, I believe this, so that means these 890 00:51:15,719 --> 00:51:17,960 Speaker 1: are the good guys, these are the bad guys, and 891 00:51:18,040 --> 00:51:20,799 Speaker 1: I don't have to analyze it any deeper than that. 892 00:51:20,880 --> 00:51:23,319 Speaker 1: I can reject information that comes from this group.
Or 893 00:51:23,320 --> 00:51:27,000 Speaker 1: I can reject information that says this, um, because I 894 00:51:27,680 --> 00:51:32,720 Speaker 1: just categorically reject, you know, anything that fits 895 00:51:32,719 --> 00:51:35,359 Speaker 1: in with that. Like, that's the benefit of disinformation for 896 00:51:35,800 --> 00:51:38,719 Speaker 1: authoritarians of all stripes. You're seeing it in Ukraine right now, 897 00:51:38,760 --> 00:51:43,280 Speaker 1: where, um, you've got all of these different authoritarian powers. 898 00:51:43,280 --> 00:51:46,120 Speaker 1: You've got Turkey, you've got Russia, you've got, um, you know, 899 00:51:46,280 --> 00:51:49,239 Speaker 1: even the United States, at least to the extent that 900 00:51:49,280 --> 00:51:53,040 Speaker 1: we impact a lot of things internationally. Um. And 901 00:51:53,080 --> 00:51:55,840 Speaker 1: you've got them all coming down on different sides of 902 00:51:56,320 --> 00:51:59,440 Speaker 1: this issue and of what's happening in Ukraine. And because 903 00:51:59,440 --> 00:52:03,640 Speaker 1: there's so much disinformation and misinformation about what's going on, 904 00:52:04,080 --> 00:52:07,600 Speaker 1: people just kind of grasp at whatever side they've 905 00:52:07,760 --> 00:52:11,360 Speaker 1: been more sympathetic to recently. I'm just going to believe 906 00:52:11,440 --> 00:52:14,960 Speaker 1: whatever they say, because it's way too complicated to actually 907 00:52:14,960 --> 00:52:17,840 Speaker 1: analyze what's going on. Yeah. And this was 908 00:52:17,880 --> 00:52:20,200 Speaker 1: the thing that, I mean, this was explicit on the left. 909 00:52:20,320 --> 00:52:24,680 Speaker 1: I remember this. There was this around, um, Xinjiang.
There 910 00:52:24,760 --> 00:52:27,200 Speaker 1: was a whole thing about how, like, people 911 00:52:27,200 --> 00:52:30,240 Speaker 1: talking about anti-imperialism would literally say, like, nuance, 912 00:52:30,280 --> 00:52:34,880 Speaker 1: nuance is liberalism. Don't research this, 913 00:52:34,960 --> 00:52:38,680 Speaker 1: don't think about this, because nuance is how liberals, you know, 914 00:52:38,880 --> 00:52:41,160 Speaker 1: spread sort of pro-regime-change propaganda. Like, I 915 00:52:41,160 --> 00:52:43,160 Speaker 1: remember people like Amber Frost just straight up 916 00:52:43,200 --> 00:52:45,520 Speaker 1: said this. And this was huge. And, you know, 917 00:52:45,560 --> 00:52:47,040 Speaker 1: I got a lot of shit for this, 918 00:52:47,080 --> 00:52:49,520 Speaker 1: because, you know, I remember when the 919 00:52:49,520 --> 00:52:52,040 Speaker 1: coup in Bolivia happened, I made a giant thread that 920 00:52:52,080 --> 00:52:53,759 Speaker 1: was trying to, that was like, okay, we need to 921 00:52:53,760 --> 00:52:57,120 Speaker 1: figure out, like, how specifically the CIA was involved in this. 922 00:52:57,200 --> 00:52:59,120 Speaker 1: Like, okay, so did they plan the whole thing? Were 923 00:52:59,120 --> 00:53:01,560 Speaker 1: they working with local partners? Was it a 924 00:53:01,560 --> 00:53:03,319 Speaker 1: thing where someone else planned it and they signed off 925 00:53:03,320 --> 00:53:05,719 Speaker 1: on it? And, like, to this day, people think that 926 00:53:05,760 --> 00:53:08,120 Speaker 1: I supported the coup, because I was like, we should 927 00:53:08,120 --> 00:53:09,960 Speaker 1: figure out who the actors were on the 928 00:53:09,960 --> 00:53:12,640 Speaker 1: ground.
This 929 00:53:12,760 --> 00:53:16,680 Speaker 1: became, like, a tenet, like 930 00:53:16,680 --> 00:53:19,719 Speaker 1: an actual sort of, like, political tenet of 931 00:53:19,719 --> 00:53:22,320 Speaker 1: how a lot of anti-imperialism, like, in the American left worked. 932 00:53:22,400 --> 00:53:25,319 Speaker 1: You were not supposed to do nuance. You 933 00:53:25,360 --> 00:53:27,400 Speaker 1: were not supposed to look at who was, like, you 934 00:53:27,440 --> 00:53:29,560 Speaker 1: know, if you spent too long looking at 935 00:53:29,560 --> 00:53:31,239 Speaker 1: what was going on on the ground, people would be 936 00:53:31,280 --> 00:53:34,920 Speaker 1: like, you work for the CIA. And, you know, 937 00:53:34,960 --> 00:53:37,240 Speaker 1: I think we've finally seen that basically 938 00:53:37,239 --> 00:53:39,640 Speaker 1: blow up in their faces, because, you know, oh, hey, 939 00:53:39,680 --> 00:53:41,600 Speaker 1: look how many of these people just wound up 940 00:53:41,600 --> 00:53:44,920 Speaker 1: supporting Russia and then spent, like, three months saying that 941 00:53:45,000 --> 00:53:47,719 Speaker 1: Russia would never invade Ukraine, and then this happens. But it's, 942 00:53:47,800 --> 00:53:51,640 Speaker 1: I don't know, it's extremely depressing how 943 00:53:51,680 --> 00:53:55,399 Speaker 1: people who otherwise, you know, in a lot 944 00:53:55,440 --> 00:53:57,320 Speaker 1: of ways have spent a lot of their time 945 00:53:58,480 --> 00:54:02,160 Speaker 1: trying to filter out stuff from the media that's 946 00:54:02,160 --> 00:54:06,319 Speaker 1: false just go into this, because they just do not 947 00:54:06,600 --> 00:54:11,839 Speaker 1: want to deal with the complexity of reality. Yeah, it's just 948 00:54:11,880 --> 00:54:14,840 Speaker 1: easier not to.
Again, if there was a simple problem, 949 00:54:14,960 --> 00:54:16,720 Speaker 1: I mean, if there was a simple solution, 950 00:54:16,760 --> 00:54:20,480 Speaker 1: we wouldn't need to discuss the problem. Yeah. Yeah. So 951 00:54:20,520 --> 00:54:24,520 Speaker 1: I guess, basically, just to, um, answer that 952 00:54:24,600 --> 00:54:28,239 Speaker 1: question about how it, I guess at the time, I'd say, 953 00:54:28,280 --> 00:54:31,000 Speaker 1: like, we got an up-close look at how things 954 00:54:31,040 --> 00:54:33,239 Speaker 1: were going to be. Like, you know, with all 955 00:54:33,360 --> 00:54:37,160 Speaker 1: these things, we kind of anticipated the next few years. 956 00:54:37,960 --> 00:54:43,120 Speaker 1: Um, so yeah, that's basically what happened. Um, sorry to 957 00:54:43,200 --> 00:54:49,080 Speaker 1: interrupt your closing. No, no, no, it's 958 00:54:49,120 --> 00:54:52,280 Speaker 1: the best note that we can go out on. Um, Michael, 959 00:54:52,400 --> 00:54:54,640 Speaker 1: where can people find you online, if 960 00:54:54,640 --> 00:54:56,880 Speaker 1: people want to look into some of your other projects? 961 00:54:57,840 --> 00:55:01,160 Speaker 1: I mean, you found me. Like, if they want to 962 00:55:01,200 --> 00:55:03,200 Speaker 1: find me, they'll find me, right? I don't know. I 963 00:55:03,239 --> 00:55:05,920 Speaker 1: still don't know how you got my email. But Garrison is 964 00:55:05,920 --> 00:55:09,200 Speaker 1: extremely good at finding people. And, 965 00:55:10,560 --> 00:55:13,160 Speaker 1: uh, yeah, well, they can check out the YouTube channel. 966 00:55:13,400 --> 00:55:17,200 Speaker 1: I'm gonna be posting some new films this year, probably. 967 00:55:17,800 --> 00:55:21,400 Speaker 1: Um, so, my name, Michael Lacanilao, or just 968 00:55:21,520 --> 00:55:25,759 Speaker 1: search the Escherian Stairwell. I guess that's a way.
Yeah, yeah, 969 00:55:26,160 --> 00:55:29,919 Speaker 1: I'll add your YouTube channel to the description. Yeah. 970 00:55:29,960 --> 00:55:31,600 Speaker 1: I just want to thank you so much for coming 971 00:55:31,600 --> 00:55:36,160 Speaker 1: on to talk about your project. Yeah, thanks for 972 00:55:36,200 --> 00:55:39,800 Speaker 1: having me. All right. Well, that does it for 973 00:55:39,920 --> 00:55:43,080 Speaker 1: us today. You can follow us on the internet, for 974 00:55:43,120 --> 00:55:46,239 Speaker 1: some reason, um, on Twitter and Instagram at Happened Here Pod 975 00:55:46,280 --> 00:55:51,759 Speaker 1: and at Cool Zone Media. And, yeah, go create a 976 00:55:51,880 --> 00:55:54,759 Speaker 1: myth that people will believe and travel from out of the 977 00:55:54,760 --> 00:55:58,160 Speaker 1: country to walk over some stairs, because that sounds like fun. 978 00:55:58,239 --> 00:56:05,040 Speaker 1: Go do something like that for funzies. All right, bye bye. 979 00:56:06,840 --> 00:56:09,200 Speaker 1: It Could Happen Here is a production of Cool Zone Media. 980 00:56:09,560 --> 00:56:12,120 Speaker 1: For more podcasts from Cool Zone Media, visit our website 981 00:56:12,160 --> 00:56:14,279 Speaker 1: cool zone media dot com, or check us out on 982 00:56:14,320 --> 00:56:16,880 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you 983 00:56:16,920 --> 00:56:19,719 Speaker 1: listen to podcasts. You can find sources for It Could 984 00:56:19,719 --> 00:56:22,680 Speaker 1: Happen Here, updated monthly, at cool zone media dot com 985 00:56:22,760 --> 00:56:24,680 Speaker 1: slash sources. Thanks for listening.